# EAI Final Project Proposal

- [11/8 Proposal Report](https://1drv.ms/p/s!AitIPL4ojI9ohxy560VZenV6bUkZ?e=nyJevx)

---

- [whisper-medium](https://huggingface.co/openai/whisper-medium)
- [mT5-multilingual-XLSum](https://huggingface.co/csebuetnlp/mT5_multilingual_XLSum)
- [finetuned-bart-for-conversation-summary](https://huggingface.co/kabita-choudhary/finetuned-bart-for-conversation-summary)

## Scenario - Task:

- LLM
  - Model:
    - [mT5-multilingual-XLSum](https://huggingface.co/csebuetnlp/mT5_multilingual_XLSum)
  - Task:
    - Conversation to summary
    - TODO: verify the quality of Chinese summary inference (see the inference sketch below)

## Compression:

- Profile tools:
  - NSYS: NVIDIA Nsight Systems (see the NVTX profiling sketch below)
  - NCU: NVIDIA Nsight Compute
- Methods:
  - Knowledge distillation (fine-tuning)
  - Pruning
    - [Token pruning](https://arxiv.org/abs/2012.09852)
  - Quantization
    - PTQ (post-training quantization; see the PTQ sketch below)
    - QAT (quantization-aware training)
  - Runtime [early exit](https://arxiv.org/abs/2006.11979)

## Scenario - Deployment:

- Inference:
  - Server
    - A100
- Web
  - text/image
- LINE bot (see the webhook sketch below)
  - text (conversation)

## Reference:

- [Hugging Face models](https://huggingface.co/models)
- [T5 model paper: Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/abs/1910.10683)
- [XL-Sum: Large-Scale Multilingual Abstractive Summarization for 44 Languages](https://arxiv.org/abs/2106.13822)
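
## Example sketches

For the conversation-to-summary task and the Chinese-summary TODO above, a minimal inference sketch using the Hugging Face `transformers` seq2seq API. The sample conversation text and the generation parameters are placeholders/assumptions, not tuned values:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "csebuetnlp/mT5_multilingual_XLSum"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name).eval()

# Placeholder input: replace with a real (e.g. Chinese) conversation transcript.
text = "A: 今天的會議幾點開始? B: 下午兩點，在三樓會議室。"

inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(
    inputs["input_ids"],
    max_length=84,           # assumed summary length budget
    num_beams=4,             # beam search; assumption, not a tuned setting
    no_repeat_ngram_size=2,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```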
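
To make the NSYS timeline easier to read, the generate call can be wrapped in NVTX ranges. This is a sketch that assumes a CUDA-capable machine and that the script is launched under `nsys profile`:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "csebuetnlp/mT5_multilingual_XLSum"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name).to("cuda").eval()

inputs = tokenizer("placeholder conversation text", return_tensors="pt").to("cuda")

# NVTX ranges show up as named regions in the Nsight Systems timeline.
torch.cuda.nvtx.range_push("mt5_generate")
with torch.no_grad():
    out = model.generate(inputs["input_ids"], max_length=84)
torch.cuda.nvtx.range_pop()
torch.cuda.synchronize()
print(tokenizer.decode(out[0], skip_special_tokens=True))
```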
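
As a first PTQ baseline (before more involved schemes), PyTorch dynamic INT8 quantization of the Linear layers is a common starting point. This is a sketch, runs on the CPU backend, and its effect on Chinese summary quality would still need to be measured:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "csebuetnlp/mT5_multilingual_XLSum"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name).eval()

# Post-training dynamic quantization: nn.Linear weights are stored as INT8,
# activations are quantized on the fly at inference time (CPU backend).
quantized_model = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

# Sanity check that the quantized model still generates a summary.
inputs = tokenizer("placeholder conversation text", return_tensors="pt")
out = quantized_model.generate(inputs["input_ids"], max_length=84)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```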
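
For the LINE bot deployment path, a sketch of the webhook side, assuming Flask and line-bot-sdk v2. `summarize()` is a hypothetical helper that would wrap the (possibly compressed) summarizer, and the token/secret values are placeholders:

```python
from flask import Flask, request, abort
from linebot import LineBotApi, WebhookHandler
from linebot.exceptions import InvalidSignatureError
from linebot.models import MessageEvent, TextMessage, TextSendMessage

app = Flask(__name__)
line_bot_api = LineBotApi("CHANNEL_ACCESS_TOKEN")  # placeholder credential
handler = WebhookHandler("CHANNEL_SECRET")         # placeholder credential

def summarize(text: str) -> str:
    # Hypothetical helper: call the mT5 summarizer (see inference sketch above).
    return text[:50]

@app.route("/callback", methods=["POST"])
def callback():
    # Verify the request signature, then dispatch to the registered handlers.
    signature = request.headers["X-Line-Signature"]
    body = request.get_data(as_text=True)
    try:
        handler.handle(body, signature)
    except InvalidSignatureError:
        abort(400)
    return "OK"

@handler.add(MessageEvent, message=TextMessage)
def handle_text(event):
    # Reply to the pasted conversation with its summary.
    line_bot_api.reply_message(
        event.reply_token, TextSendMessage(text=summarize(event.message.text))
    )

if __name__ == "__main__":
    app.run(port=8000)
```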