---
title: '689 Project Ideas'
disqus: hackmd
---
689 Project Ideas
===
## Table of Contents
[TOC]
## Lottery Ticket Hypothesis - Behaviour Analysis
### How the lottery ticket hypothesis:
1. differs across optimization algorithms, e.g. SGD, Adam, and RMSProp
2. differs across datasets for the same task, e.g. classification on MNIST vs. Fashion-MNIST
3. differs across tasks in the same domain, e.g. classification vs. regression
4. differs across domains, e.g. vision vs. NLP vs. RL tasks
5. is influenced by the loss surface
6. works for generative vs. discriminative models
7. impacts different transfer learning strategies (important!)
8. impacts online learning
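
All of these questions can be run on top of the same iterative magnitude pruning (IMP) loop used to find winning tickets, varying the optimizer, dataset, or task. Below is a minimal sketch of that loop, assuming PyTorch; the model, data loader, hyperparameters, and helper names (`train`, `prune_by_magnitude`, `find_winning_ticket`) are illustrative placeholders, not code from any of the papers listed here.

```python
# Minimal IMP-with-rewinding sketch (assumed setup, not from the papers below).
import copy
import torch
import torch.nn as nn

def train(model, loader, optimizer, mask, steps):
    """One training run; gradients of pruned weights are zeroed so they stay at zero."""
    loss_fn = nn.CrossEntropyLoss()
    batches = iter(loader)
    for _ in range(steps):
        try:
            x, y = next(batches)
        except StopIteration:
            batches = iter(loader)
            x, y = next(batches)
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()
        for name, p in model.named_parameters():
            if name in mask and p.grad is not None:
                p.grad.mul_(mask[name])          # keep pruned weights frozen
        optimizer.step()

def prune_by_magnitude(model, mask, fraction):
    """Prune the smallest-magnitude surviving weights in each masked layer."""
    for name, p in model.named_parameters():
        if name not in mask:
            continue
        alive = p.detach().abs()[mask[name].bool()]
        if alive.numel() == 0:
            continue
        threshold = torch.quantile(alive, fraction)
        mask[name] = mask[name] * (p.detach().abs() > threshold).float()
    return mask

def find_winning_ticket(model, loader, make_optimizer, rounds=5, fraction=0.2, steps=1000):
    """Repeat: rewind to the stored initialization, train, prune the smallest weights."""
    init_state = copy.deepcopy(model.state_dict())   # theta_0 (or an early-training rewind point)
    mask = {n: torch.ones_like(p) for n, p in model.named_parameters() if p.dim() > 1}
    for _ in range(rounds):
        model.load_state_dict(init_state)            # rewind weights
        with torch.no_grad():
            for n, p in model.named_parameters():
                if n in mask:
                    p.mul_(mask[n])                  # apply the current mask
        train(model, loader, make_optimizer(model), mask, steps)
        mask = prune_by_magnitude(model, mask, fraction)
    model.load_state_dict(init_state)                # final ticket = mask + original init
    with torch.no_grad():
        for n, p in model.named_parameters():
            if n in mask:
                p.mul_(mask[n])
    return model, mask
```

Here `make_optimizer` is the knob for question 1, e.g. `lambda m: torch.optim.SGD(m.parameters(), lr=0.1)` vs. `torch.optim.Adam` or `torch.optim.RMSprop`; swapping the data loader and loss covers questions 2 to 4.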
## Literature Review
1. The Lottery Ticket Hypothesis (main paper) - https://arxiv.org/abs/1803.03635
2. Stabilizing the Lottery Ticket Hypothesis - https://arxiv.org/abs/1903.01611
3. Deconstructing Lottery Tickets: Zeros, Signs, and the Supermask - https://arxiv.org/abs/1905.01067v4
4. The Early Phase of Neural Network Training - https://openreview.net/forum?id=Hkl1iRNFwS
5. One Ticket to Win Them All: Generalizing Lottery Ticket Initializations Across Datasets and Optimizers - https://arxiv.org/abs/1906.02773
6. Playing the Lottery with Rewards and Multiple Languages: Lottery Tickets in RL and NLP - https://openreview.net/forum?id=S1xnXRVFwH
7. Rigging the Lottery: Making All Tickets Winners - https://arxiv.org/abs/1911.11134v2
8. SqueezeBERT: What Can Computer Vision Teach NLP About Efficient Neural Networks? (**nothing to do with lottery**) - https://arxiv.org/abs/2006.11316v1
9. Winning Lottery Tickets in Deep Generative Models - https://arxiv.org/abs/2010.02350
10. The Lottery Ticket Hypothesis for Pre-trained BERT Networks - https://arxiv.org/pdf/2007.12223.pdf
11. Transfer learning survey - https://arxiv.org/pdf/1911.02685.pdf
12. GLUE paper - https://arxiv.org/pdf/1804.07461.pdf
## NLP
1. The Lottery Ticket Hypothesis for Pre-trained BERT Networks - https://arxiv.org/pdf/2007.12223.pdf
2. When BERT Plays the Lottery, All Tickets Are Winning - https://arxiv.org/pdf/2005.00561.pdf
3. Lottery tickets in RL and NLP - https://openreview.net/attachment?id=S1xnXRVFwH&name=original_pdf
4. Multi-Task Deep Neural Networks for Natural Language Understanding - https://arxiv.org/pdf/1901.11504.pdf
5. Transfer learning in NLP (thesis) - https://www2.eecs.berkeley.edu/Pubs/TechRpts/2020/EECS-2020-98.pdf
## Transfer Learning
### NLP
1. SST-2 >> IMDB (sentiment analysis | single-sentence classification)
2. QQP >> MRPC (paraphrase detection | sentence-pair classification)
3. MNLI >> RTE (natural language inference)
<!-- 3. Comparing Rewinding and Fine-tuning in Neural Network Pruning - https://arxiv.org/abs/2003.02389 -->
<!-- https://arxiv.org/pdf/1912.05671v4.pdf -->
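
A minimal data-loading sketch for these source >> target pairs, assuming the HuggingFace `datasets` library (the library choice and the `load_pair` helper are our own assumptions, not something the papers above prescribe). The intended experiment is to find a ticket on the source task and reuse its mask and initialization when training on the target task.

```python
# Sketch of loading the source >> target dataset pairs listed above.
from datasets import load_dataset

TRANSFER_PAIRS = {
    "sentiment":  (("glue", "sst2"), ("imdb",)),        # single-sentence classification
    "paraphrase": (("glue", "qqp"),  ("glue", "mrpc")),  # sentence-pair classification
    "nli":        (("glue", "mnli"), ("glue", "rte")),   # natural language inference
}

def load_pair(task):
    """Return (source, target) DatasetDicts for one transfer setting."""
    src_args, tgt_args = TRANSFER_PAIRS[task]
    return load_dataset(*src_args), load_dataset(*tgt_args)

source, target = load_pair("sentiment")
print(source["train"][0])   # SST-2 example: sentence text plus a binary label
```

The GLUE config names (`sst2`, `qqp`, `mrpc`, `mnli`, `rte`) match the pairs above; `imdb` is a standalone dataset rather than a GLUE task.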