# Capstone Meeting #10 -- 7/5/2020
## Agenda
* Updates from last week's todos
* Reviewing phase plans from last meeting:
* [Reference from last meeting](https://github.com/relientm96/capstone2020/blob/master/meeting-logs/GroupLogs/log-09.md)
* In-Depth Phase 1 Discussion
* To-Do's for next week
## Last Week ToDo's
* [Yick] Set up and run different models using LSTM
* [Team] Set up and run different models (in terms of generalization, accuracy and ease of use)
* [Team] Team research
* [Team] Finding out different models to use (not urgent)
* Liaise with Auslan communities
* Matthew --> Unimelb
* Yick --> Auslan / Other Unis
* Start Collecting Datasets for Phase 1
* Web scraping using libraries
* Manually downloading for personal tests
* Approach lecturers or experts for help (if needed)
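For the "manually downloading for personal tests" item, a minimal helper sketch (the actual dataset URLs are still to be decided by the team; `data/` is a placeholder directory name):

```python
# Sketch: fetch one dataset file for personal testing, assuming a plain
# HTTP(S)/file URL. Skips files that were already downloaded.
import os
import urllib.request

def download_file(url, dest_dir="data"):
    """Download one dataset file into dest_dir, reusing it if already present."""
    os.makedirs(dest_dir, exist_ok=True)
    filename = os.path.basename(url)
    dest = os.path.join(dest_dir, filename)
    if not os.path.exists(dest):
        urllib.request.urlretrieve(url, dest)
    return dest
```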
## Updates from Last Week
* Leave it to Yick to liaise with communities.
* Yick has a list of computer vision papers
## Updates (continued)
* [Team] Run other frameworks (e.g. Colab/Keras) to implement the models
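As a starting point for the Keras experiments, a minimal sketch of the LSTM classifier idea: sequences of OpenPose keypoints in, one fingerspelling letter out. All shapes here are placeholder assumptions (30 frames × 50 keypoint coordinates, 26 classes), not decided values.

```python
# Minimal Keras LSTM sketch for classifying keypoint sequences.
from tensorflow.keras import layers, models

def build_model(frames=30, features=50, classes=26):
    model = models.Sequential([
        layers.Input(shape=(frames, features)),   # one keypoint vector per frame
        layers.LSTM(64),                          # summarise the whole sequence
        layers.Dense(classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

The same skeleton lets us swap the `LSTM` layer for GRU/CNN variants when comparing the models listed below.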
## Phase 1 Discussion
* Which words should we focus on?
    * Let's start with the alphabet (fingerspelling)
* How do we collect the above data?
* [Auslan Corpus](https://elar.soas.ac.uk/Collection/MPI55247)
* [UCI Machine Learning Repository](http://archive.ics.uci.edu/ml/datasets/Australian+Sign+Language+signs)
* Models (big rabbit holes)
* HMM
* Dynamic Time Warping
* CNN
* RNN + CNN
* LSTM
    * Attention-based encoder-decoder (Transformer)
* Small rabbit hole:
    * Transfer learning
* Implementing above models and testing on our datasets.
* How do we generate more data from current data?
* Adding noise to our current datasets for OpenPose
* Reminder: not necessarily sign language; could be gesture/action recognition
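The "add noise to generate more data" idea above can be sketched as jittering OpenPose keypoint coordinates with small Gaussian noise, so one recording yields several slightly different training samples. The noise scale here is a guess we would need to tune:

```python
# Sketch: generate extra training samples by adding Gaussian noise
# to a (frames, coords) array of OpenPose keypoints.
import numpy as np

def jitter(keypoints, copies=5, scale=0.01, seed=None):
    """Return `copies` noisy variants of a keypoint array."""
    rng = np.random.default_rng(seed)
    return [keypoints + rng.normal(0.0, scale, size=keypoints.shape)
            for _ in range(copies)]
```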
## ToDo for Next Few Weeks
* [Tsz Kiu] Setting up data uploading protocol (Saturday)
* [Team] Scrape whatever you can (alphabet + fingerspelling)
* you could record yourself!
* [rabbit hole]
* Annotating data
* [others]
* [Matt] Figure out implementation side of system for models
* [Yick] Liaising with Auslan communities (by Wednesday, 13 May 2020)