| model | rougeL_f | bert_f | tok_f1 | uni_prec | bi_prec |
|-------|----------|--------|--------|----------|---------|
| [MedSwin-DaRE-TIES-KD-0.7](https://huggingface.co/MedSwin/MedSwin-DaRE-TIES-KD-0.7) | 0.17 | 0.8501 | 0.22 | 0.47 | 0.14 |
| [MedSwin-NuSL-KD-0.7](https://huggingface.co/MedSwin/MedSwin-NuSL-KD-0.7) | 0.17 | 0.8500 | 0.21 | 0.47 | 0.11 |
| [MedSwin-DaRE-Linear-KD-0.7](https://huggingface.co/MedSwin/MedSwin-DaRE-Linear-KD-0.7) | 0.17 | 0.8494 | 0.22 | 0.45 | 0.14 |
| [MedSwin-TIES-KD-0.7](https://huggingface.co/MedSwin/MedSwin-TIES-KD-0.7) | 0.16 | 0.8494 | 0.18 | 0.51 | 0.20 |
| MedSwin-TA-KD-w_0.7 | 0.18 | 0.8493 | 0.23 | 0.45 | 0.14 |
| [MedSwin-TA-SFT-0.7](https://huggingface.co/MedSwin/MedSwin-TA-SFT-0.7) | 0.16 | 0.8491 | 0.18 | 0.51 | 0.20 |
| MedSwin-TIES-KD-0.6-0.6 | 0.16 | 0.8491 | 0.18 | 0.51 | 0.20 |
| [MedSwin-Dare_Ties-KD-0.75-0.7](https://huggingface.co/MedSwin/MedSwin-Dare_Ties-KD-0.75-0.7) | 0.18 | 0.8488 | 0.23 | 0.44 | 0.13 |
| MedSwin-TA-SFT-w_0.7 | 0.16 | 0.8487 | 0.18 | 0.51 | 0.20 |
| MedSwin-DaRE-Linear-SFT-0.7-0.55 | 0.15 | 0.8485 | 0.18 | 0.51 | 0.20 |
| MedSwin-DaRE-Linear-SFT-0.7 | 0.15 | 0.8482 | 0.18 | 0.51 | 0.20 |
| MedSwin-DaRE-TIES-SFT-0.7-0.7 | 0.15 | 0.8480 | 0.17 | 0.51 | 0.20 |
| MedSwin-NuSLERP-SFT-0.7 | 0.15 | 0.8478 | 0.17 | 0.52 | 0.21 |
| MedGemma-27b-Text-IT | 0.19 | 0.8465 | 0.27 | 0.40 | 0.09 |
| MedSwin-TIES-SFT-0.5-0.6 | 0.16 | 0.8456 | 0.20 | 0.48 | 0.17 |
| [MedSwin-7B-KD](https://huggingface.co/MedSwin/MedSwin-7B-LD) | 0.18 | 0.8441 | 0.26 | 0.46 | 0.12 |
| [MedSwin-7B-SFT](https://huggingface.co/MedSwin/MedSwin-7B-SFT) | 0.15 | 0.8396 | 0.18 | 0.52 | 0.21 |
| [MedSwin-KD-SFT-PubMed-l](https://huggingface.co/MedSwin/MedSwin-KD-SFT-PubMed-l) | 0.17 | 0.8341 | 0.24 | 0.44 | 0.11 |
| [MedSwin-KD-SFT-PubMed-map](https://huggingface.co/MedSwin/MedSwin-KD-SFT-PubMed-map) | 0.17 | 0.8297 | 0.22 | 0.47 | 0.12 |
| MedAlpaca-7B | 0.16 | 0.8196 | 0.18 | 0.50 | 0.20 |

> Highlighted (linked) models are featured on MedSwin's Hugging Face org.

---

> Abbreviations:
> - KD: Knowledge Distillation
> - SFT: Supervised Fine-Tuning
>
> Merge methods:
> - TA: Task Arithmetic
> - SLERP: Spherical Linear Interpolation
> - NuSL / NuSLERP: Normalized Spherical Linear Interpolation
> - DaRE: Drop And REscale
>
> Metrics (see the sketch below):
> - rougeL_f: ROUGE-L F1, based on the longest common subsequence (LCS)
> - bert_f: BERTScore F1 (semantic similarity, RoBERTa-based)
> - tok_f1: token-level precision/recall F1
> - uni_prec: unigram precision
> - bi_prec: bigram precision
>
> Specs:
> - IT: Instruct (instruction-tuned)
> - l: labelled
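For reference, the sketch below shows one way the simpler overlap metrics (`tok_f1`, `uni_prec`, `bi_prec`) can be computed. This is an illustrative assumption, not the MedSwin evaluation code: it uses plain whitespace tokenization and made-up example strings, and in practice `rougeL_f` and `bert_f` are usually obtained from dedicated packages such as `rouge-score` and `bert-score`.

```python
# Illustrative sketch only (assumed whitespace tokenization, hypothetical strings);
# the actual MedSwin evaluation pipeline may tokenize and aggregate differently.
from collections import Counter


def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]


def ngram_precision(pred, ref, n):
    """Clipped n-gram precision: fraction of prediction n-grams found in the reference."""
    pred_counts, ref_counts = Counter(ngrams(pred, n)), Counter(ngrams(ref, n))
    overlap = sum(min(c, ref_counts[g]) for g, c in pred_counts.items())
    return overlap / max(sum(pred_counts.values()), 1)


def token_f1(pred, ref):
    """Token-level F1: harmonic mean of token precision and recall."""
    common = Counter(pred) & Counter(ref)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision, recall = num_same / len(pred), num_same / len(ref)
    return 2 * precision * recall / (precision + recall)


# Hypothetical prediction/reference pair, whitespace-tokenized.
pred = "metformin is the recommended first line therapy".split()
ref = "first line therapy for type 2 diabetes is usually metformin".split()

print(f"tok_f1   = {token_f1(pred, ref):.2f}")            # token-level F1
print(f"uni_prec = {ngram_precision(pred, ref, 1):.2f}")  # unigram precision
print(f"bi_prec  = {ngram_precision(pred, ref, 2):.2f}")  # bigram precision
```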