
Create project structure #9

Merged 18 commits on Feb 22, 2020
6 changes: 6 additions & 0 deletions .gitignore
@@ -127,3 +127,9 @@ dmypy.json

# Pyre type checker
.pyre/

# Visual Studio Code
**.vscode

# macOS
**.DS_Store
53 changes: 26 additions & 27 deletions README.md
@@ -7,43 +7,42 @@

## Table of contents

-- __Week 1__:
-  - [ ] Introduction and Word Vectors.
-  - [ ] Gensim word vectors example.
-  - [ ] Word Vectors 2 and Word Senses.
-  - [ ] Python review session.
+- __Week 1__:
+  - [ ] [Introduction and Word Vectors](Week%2001/Introduction%20and%20Word%20Vectors/README.md)
+  - [ ] [Gensim word vectors example](Week%2001/Gensim%20word%20vectors/Gensim%20word%20vector%20visualization.ipynb)
+  - [ ] [Word Vectors 2 and Word Senses](Week%2001/Word%20Vectors%202%20and%20Word%20Senses/README.md)
+  - [ ] [Python review session](Week%2001/Python%20review%20session/README.md)
 - __Week 2__:
-  - [ ] Word Window Classification, Neural Networks, and Pytorch.
-  - [ ] Matrix Calculus and Backpropagation.
+  - [ ] [Word Window Classification, Neural Networks, and Pytorch](Week%2002/Word%20Window%20Classification,%20Neural%20Networks,%20and%20Pytorch/README.md)
+  - [ ] [Matrix Calculus and Backpropagation](Week%2002/Matrix%20Calculus%20and%20Backpropagation/README.md)
 - __Week 3__:
-  - [ ] Linguistic Structure: Dependency Parsing.
-  - [ ] The probability of a sentence? Recurrent Neural Networks and Language Models
+  - [ ] [Linguistic Structure: Dependency Parsing](Week%2003/Linguistic%20Structure:%20Dependency%20Parsing/README.md)
+  - [ ] [The probability of a sentence? Recurrent Neural Networks and Language Models](Week%2003/Recurrent%20Neural%20Networks%20and%20Language%20Models/README.md)
 - __Week 4__:
-  - [ ] Vanishing Gradients and Fancy RNNs.
-  - [ ] Machine Translation, Seq2seq and Attention.
+  - [ ] [Vanishing Gradients and Fancy RNNs](Week%2004/Vanishing%20Gradients%20and%20Fancy%20RNNs/README.md)
+  - [ ] [Machine Translation, Seq2seq and Attention](Week%2004/Machine%20Translation,%20Seq2seq%20and%20Attention/README.md)
 - __Week 5__:
-  - [ ] Practical Tips for Final Projects.
-  - [ ] Question Answering and the Default Final Project.
+  - [ ] [Practical Tips for Final Projects](Week%2005/Practical%20Tips%20for%20Final%20Projects/README.md)
+  - [ ] [Question Answering and the Default Final Project](Week%2005/Question%20Answering%20and%20the%20Default%20Final%20Project/README.md)
 - __Week 6__:
-  - [ ] ConvNets for NLP.
-  - [ ] Information from parts of words (Subword Models) and Transformer architectures.
+  - [ ] [ConvNets for NLP](Week%2006/ConvNets%20for%20NLP/README.md)
+  - [ ] [Information from parts of words (Subword Models) and Transformer architectures](Week%2006/Subword%20Models%20and%20Transformer%20architectures/README.md)
 - __Week 7__:
-  - [ ] Contextual Word Representation: BERT.
-  - [ ] Modeling contexts of use: Contextual Representations and Pretraining.
+  - [ ] [Contextual Word Representation: BERT](Week%2007/Contextual%20Word%20Representation:%20BERT/README.md)
+  - [ ] [Modeling contexts of use: Contextual Representations and Pretraining](Week%2007/Modeling%20contexts%20of%20use:%20Contextual%20Representations%20and%20Pretraining/README.md)
 - __Week 8__:
-  - [ ] Natural Language Generation.
-  - [ ] Reference in Language and Coreference Resolution.
+  - [ ] [Natural Language Generation](Week%2008/Natural%20Language%20Generation/README.md)
+  - [ ] [Reference in Language and Coreference Resolution](Week%2008/Reference%20in%20Language%20and%20Coreference%20Resolution/README.md)
 - __Week 9__:
-  - [ ] Fairness and Inclusion in AI.
-  - [ ] Constituency Parsing and Tree Recursive Neural Networks.
+  - [ ] [Fairness and Inclusion in AI](Week%2009/Fairness%20and%20Inclusion%20in%20AI/README.md)
+  - [ ] [Constituency Parsing and Tree Recursive Neural Networks](Week%2009/Constituency%20Parsing%20and%20Tree%20Recursive%20Neural%20Networks/README.md)
 - __Week 10__:
-  - [ ] Recent Advances in Low Resource Machine Translation.
-  - [ ] Future of NLP + Deep Learning.
+  - [ ] [Recent Advances in Low Resource Machine Translation](Week%2010/Recent%20Advances%20in%20Low%20Resource%20Machine%20Translation/README.md)
+  - [ ] [Future of NLP + Deep Learning](Week%2010/Future%20of%20NLP%20+%20Deep%20Learning/README.md)
 - __Week 11__:
-  - [ ] Final project poster session.
+  - [ ] [Final project poster session](Week%2011/Final%20project/README.md)

## Contributors:

-- [Viet-Tien](https://github.com/tiena2cva)
-- [honghanhh](https://github.com/honghanhh)
+- 🐔 [Viet-Tien](https://github.com/tiena2cva)
+- 🐮 [honghanhh](https://github.com/honghanhh)
28 changes: 28 additions & 0 deletions Week 01/Assignment 1/README.txt
@@ -0,0 +1,28 @@
Welcome to CS224N!

We'll be using Python throughout the course. If you've got a good Python setup already, great! But make sure that it is at least Python version 3.5. If not, the easiest option is to make sure you have at least 3GB free on your computer, then head over to https://www.anaconda.com/download/ and install the Python 3 version of Anaconda. It works on any operating system.
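As a quick sanity check, the 3.5 minimum can be verified from Python itself. This is a minimal sketch, not part of the course materials; the `version_ok` helper is hypothetical:

```python
import sys

# Minimum interpreter version required by the course materials.
MIN_VERSION = (3, 5)

def version_ok(version_info=None, minimum=MIN_VERSION):
    """Return True if `version_info` (default: this interpreter) is at least `minimum`."""
    if version_info is None:
        version_info = sys.version_info
    # Compare only (major, minor); micro releases don't matter for the check.
    return tuple(version_info[:2]) >= minimum

if __name__ == "__main__":
    print("Python version OK:", version_ok())
```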

After you have installed conda, close any open terminals you might have. Then open a new terminal and run the following command:

# 1. Create an environment with dependencies specified in env.yml:

conda env create -f env.yml

# 2. Activate the new environment:

conda activate cs224n

# 3. Inside the new environment, install the IPython kernel so we can use this environment in Jupyter notebooks:

python -m ipykernel install --user --name cs224n


# 4. Homework 1 (only) is a Jupyter Notebook. With the above done you should be able to get underway by typing:

jupyter notebook exploring_word_vectors.ipynb

# 5. To make sure we are using the right environment, go to the toolbar of exploring_word_vectors.ipynb, click Kernel -> Change kernel, and select cs224n from the drop-down menu.

# To deactivate an active environment, use

conda deactivate
14 changes: 14 additions & 0 deletions Week 01/Assignment 1/env.yml
@@ -0,0 +1,14 @@
name: cs224n
channels:
  - defaults
  - anaconda
dependencies:
  - jupyter
  - matplotlib
  - numpy
  - python=3.7
  - ipykernel
  - scikit-learn
  - nltk
  - gensim
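After creating the environment, a small sketch like the following can confirm the pinned packages are importable. The `ENV_MODULES` list and `missing_modules` helper are assumptions for illustration; note that the conda package `scikit-learn` is imported as `sklearn`:

```python
import importlib.util

# Import names for the packages pinned in env.yml (the conda name differs
# in one case: scikit-learn is imported as sklearn).
ENV_MODULES = ["jupyter", "matplotlib", "numpy", "ipykernel", "sklearn", "nltk", "gensim"]

def missing_modules(modules):
    """Return the subset of `modules` the import system cannot find."""
    return [name for name in modules if importlib.util.find_spec(name) is None]

if __name__ == "__main__":
    missing = missing_modules(ENV_MODULES)
    print("All packages present" if not missing else "Missing: %s" % missing)
```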
