# Anata VRMs v1.0
[toc]
---
## Summary / Action Items
- **Decision:** The NFT metadata is fine for now; no need to change anything yet. Reserve `vrm_url` in the NFT metadata for URLs to the web-optimized models in the future
- Oncyber, Hyperfy, and Dverso are currently integrated with that metadata standard for seamless importing
- **Decision:** Host the newest updated models with ARKit blendshapes via a centralized storage solution, for now
- More testing needs to be done with the new models
- It's ~50 GB of data for the male and female VRMs combined, which is about $800 in Arweave tokens, making it a hefty decision to lock in
We're confident enough in what we have to send Pas any avatar to use in game dev. This time also doubles as an opportunity to advocate for wider adoption of open standards, using the optimized VRM files we produced as a case study while debugging integration issues.
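For context on the storage decision, the implied rate works out like this. A back-of-envelope sketch only: the ~$16/GB figure is derived from the estimate above (~$800 for ~50 GB), not an official Arweave quote, and actual pricing moves with token price and network fees.

```python!
# Back-of-envelope permanent-storage cost estimate.
# Rate is implied by the figures above, not quoted from Arweave.
DATA_GB = 50
USD_PER_GB = 800 / DATA_GB  # ~$16/GB implied by the estimate above

def storage_cost_usd(gigabytes: float, usd_per_gb: float = USD_PER_GB) -> float:
    """Estimate one-time storage cost in USD at the implied rate."""
    return gigabytes * usd_per_gb

print(f"{DATA_GB} GB -> ${storage_cost_usd(DATA_GB):,.0f}")
```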
---
## Platform Guidelines
**Limitations**
- Oncyber upload limit: 8 MB
- Hyperfy ideal max: 15 MB
- Nifty Island triangle count range: 20k - 40k max

**Impact**
- If the file size is >8 MB you cannot upload to Oncyber
- If the triangle count is >40k you cannot use the avatar in Nifty Island
- If Hyperfy rates the avatar Heavy or Unoptimized, it will appear like this:
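The limits above are easy to check in a script before uploading. A minimal sketch (thresholds are taken from the lists above; the `platform_report` helper is ours, not a platform API, and platform rules may change):

```python!
# Check one model's stats against the platform limits listed above.
ONCYBER_MAX_MB = 8
HYPERFY_IDEAL_MAX_MB = 15
NIFTY_ISLAND_MAX_TRIS = 40_000

def platform_report(size_mb: float, triangles: int) -> dict:
    """Return a pass/fail flag per platform for one VRM/GLB."""
    return {
        "oncyber": size_mb <= ONCYBER_MAX_MB,
        "hyperfy_ideal": size_mb <= HYPERFY_IDEAL_MAX_MB,
        "nifty_island": triangles <= NIFTY_ISLAND_MAX_TRIS,
    }

# The largest web-optimized female model (26.12 MB, 209,479 tris): fails all three
print(platform_report(26.12, 209_479))
# A post-optimization model (7.5 MB, 35,000 tris): passes all three
print(platform_report(7.5, 35_000))
```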

---
## Testing
The quickest and most effective optimization deals with textures. It can reduce file size dramatically, as seen here: **26 MB -> 2 MB**

However, if the model has too many bones, as in this example, it will still be given a poor rating. Hyperfy loads the VRMs regardless.

I tested one of the biggest models in the web-optimized batch and was able to optimize it from 27 MB -> 7.5 MB, confirming that the majority of the file size is contained in the texture files. This one required 2x image-resolution downscaling, but the difference was hardly noticeable; the anime aesthetic is very forgiving, though.

I did some more quick-and-dirty optimization tests using glTF Compressor to apply KTX2 textures, plus the new [Needle Cloud](https://cloud.needle.tools/) tool for optimizing VRM files. The sample I chose was one of the biggest files from the web-optimized batch; if it works on that, it will work on anything. I was able to get under 8 MB by downsizing the textures to 2K, but Needle Cloud was able to one-shot optimize the VRM flawlessly with results 2x better.
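As a back-of-envelope check on why textures dominate: an uncompressed RGBA8 texture costs width × height × 4 bytes, so halving resolution cuts the pixel payload 4x. A quick sketch (raw GPU-memory sizes only; on-disk PNG/JPEG and KTX2 are smaller, but scale with the same pixel count):

```python!
# Why texture resolution dominates VRM file size:
# an uncompressed RGBA8 texture is width * height * 4 bytes.
def rgba8_megabytes(width: int, height: int) -> float:
    return width * height * 4 / (1024 ** 2)

for res in (4096, 2048, 1024):
    print(f"{res}x{res}: {rgba8_megabytes(res, res):.0f} MB uncompressed")

# Each halving of resolution (e.g. 4K -> 2K) cuts the texture payload 4x,
# consistent with the 2x downscale driving the 27 MB -> 7.5 MB result above.
```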

The Needle Tools team has unlocked the code for VRM optimization with the power of meshopt and KTX2. I also believe they're the first to ever get VRM + meshopt working, kudos!

It's around 50 cents per optimization using Needle Cloud, and they are open to bulk deals. I think it's worth it once we're satisfied with the current models.

[Dverso.io](https://dverso.io) had no issues with the VRMs I uploaded; even the 27 MB ones worked fine, as did the KTX2-optimized ones. Their VRM support is really good: they even have a vtuber mode where you can use the face blendshapes in the web-based platform. It's like social vtubing.

[Oncyber](https://oncyber.io) only allows avatar uploads up to 8 MB. Many avatars in the version with fewer blendshapes fit, but many are slightly over that limit.

Oncyber V2 just came out, and they modified the uploader, which now forces uploads through some kind of processing pipeline. Our quickest and most effective optimization measures are broken in this version, and I'm currently talking with the team to fix it.

---
## Comparing averages
| Date | Gender | File Size (MB) | Triangle Count |
|----------|--------|----------------|----------------|
| 7-15-24 | Female | 12.58 | 67,069 |
| 6-21-24 | Male | 5.09 | 34,497 |
| 8-5-24* | Female | 28.78 | 67,069 |
| 8-3-24* | Male | 22.00 | 34,673 |
\*High End Vtuber Exports (all 52 ARKit blendshapes; high-end vtubing capable)

---
## Optimized for Web
Fewer blendshapes, more optimized for web
| Date | Gender | Metric | Average | Max | Min |
|----------|--------|----------------|-----------|-----------|----------|
| 7-15-24 | Female | File size (MB) | 12.58 | 26.12 | 7.07 |
| 7-15-24 | Female | Triangle count | 67,069.54 | 209,479.00| 19,941.00|
| 6-21-24 | Male | File size (MB) | 5.09 | 11.45 | 3.12 |
| 6-21-24 | Male | Triangle count | 34,497.55 | 111,940.00| 19,799.00|
<details>
<summary>Full per-batch stats</summary>

### 7-15-24 Female
File size:
- Average: 12.58 MB
- Max: 26.12 MB
- Min: 7.07 MB
Triangle count:
- Average: 67069.54
- Min: 19941.00
- Max: 209479.00
### 6-21-24 Male
File size:
- Average: 5.09 MB
- Max: 11.45 MB
- Min: 3.12 MB
Triangle count:
- Average: 34497.55
- Min: 19799.00
- Max: 111940.00
</details>
---
## High End Vtuber Exports
All 52 ARKit blendshapes, high-end vtubing capable
| Date | Gender | Metric | Average | Max | Min |
|---------|--------|----------------|-----------|-----------|----------|
| 8-5-24 | Female | File size (MB) | 28.78 | 42.31 | 23.27 |
| 8-5-24 | Female | Triangle count | 67,069.54 | 209,479.00| 19,941.00|
| 8-3-24 | Male | File size (MB) | 22.00 | 36.00 | 14.00 |
| 8-3-24 | Male | Triangle count | 34,673.25 | 112,119.00| 19,979.00|
<details>
<summary>Full per-batch stats</summary>

### 8-5-24 Female
File size:
- Average: 28.78 MB
- Max: 42.31 MB
- Min: 23.27 MB
Triangle count:
- Average: 67069.54
- Min: 19941.00
- Max: 209479.00
### 8-3-24 Male
File size:
- Average: 22 MB
- Max: 36 MB
- Min: 14 MB
Triangle count:
- Average: 34673.25
- Min: 19979.00
- Max: 112119.00
</details>
---
## Scripts
### Get Stats Script
```bash!
# Collect triangle count and human-readable file size for every .glb
# in the current directory into glb_stats.csv.
echo "filename,triangles,size" > glb_stats.csv
for file in *.glb; do
  echo -n "$file,"
  # gltf-pipeline's "Rendered primitives" stat reports the triangle count;
  # strip thousands separators and the trailing newline.
  gltf-pipeline -i "$file" --stats 2>&1 \
    | grep "Rendered primitives" \
    | awk '{print $5}' \
    | tr -d ',\n'
  echo -n ","
  du -h "$file" | cut -f1
done >> glb_stats.csv
```
### Analyze CSV Script
```python!
import sys
import csv
import statistics
from typing import List, Tuple


def parse_size(size: str) -> float:
    """Convert a du -h size string (e.g. '7.5M') to a float in MB."""
    units = {'K': 1 / 1024, 'M': 1, 'G': 1024}
    number = float(size[:-1])
    unit = size[-1]
    return number * units[unit]


def analyze_csv(filename: str) -> Tuple[List[float], List[int]]:
    """Read the stats CSV and return lists of sizes (MB) and triangle counts."""
    sizes = []
    triangles = []
    with open(filename, 'r') as f:
        reader = csv.DictReader(f)
        for row in reader:
            sizes.append(parse_size(row['size']))
            triangles.append(int(row['triangles']))
    return sizes, triangles


def print_stats(data: List[float], name: str) -> None:
    """Print min, max, and average of the given data."""
    print(f"{name} statistics:")
    print(f"  Min: {min(data):.2f}")
    print(f"  Max: {max(data):.2f}")
    print(f"  Average: {statistics.mean(data):.2f}")


def main(filename: str) -> None:
    sizes, triangles = analyze_csv(filename)
    print_stats(sizes, "File size (MB)")
    print()
    print_stats(triangles, "Triangle count")


if __name__ == "__main__":
    if len(sys.argv) != 2:
        print("Usage: python glb_stats_analyzer.py <csv_filename>")
        sys.exit(1)
    main(sys.argv[1])
```
### Comparing averages script
```python!
import plotly.graph_objects as go
from plotly.subplots import make_subplots

# Data (* = High End Vtuber Exports)
dates = ['7-15-24 F', '6-21-24 M', '8-5-24 F*', '8-3-24 M*']
file_sizes = [12.58, 5.09, 28.78, 22.00]
triangle_counts = [67069.54, 34497.55, 67069.54, 34673.25]

# Create stacked subplots: file size on top, triangle count below
fig = make_subplots(rows=2, cols=1,
                    subplot_titles=("File Size Comparison",
                                    "Triangle Count Comparison"))

# File size bars
fig.add_trace(
    go.Bar(x=dates, y=file_sizes, name="File Size",
           marker_color='rgb(55, 83, 109)'),
    row=1, col=1
)

# Triangle count bars
fig.add_trace(
    go.Bar(x=dates, y=triangle_counts, name="Triangle Count",
           marker_color='rgb(26, 118, 255)'),
    row=2, col=1
)

# Overall layout
fig.update_layout(
    title_text="Vtuber Model Comparison",
    height=800,
    width=800,
    showlegend=False
)

# Axis labels
fig.update_xaxes(title_text="Date and Gender", row=1, col=1)
fig.update_xaxes(title_text="Date and Gender", row=2, col=1)
fig.update_yaxes(title_text="File Size (MB)", row=1, col=1)
fig.update_yaxes(title_text="Triangle Count", row=2, col=1)

# Footnote for the High End Vtuber Exports
fig.add_annotation(
    x=1, y=-0.1,
    xref='paper', yref='paper',
    text="* High End Vtuber Exports",
    showarrow=False,
    font=dict(size=10),
    align='right',
    xanchor='right',
    yanchor='auto'
)

# Show the plot and save it as an HTML file
fig.show()
fig.write_html("vtuber_stats_comparison_separate.html")
```