# Should we wear T-shirts?
### A call for discussion
---
## Should we wear T-shirts?
T-shirts are used by the department to express the time it takes to test a package. They serve various purposes, from helping newcomers (or even experienced testers) decide which unknown package to test next, to evaluating the time we spend on certain packages or save through their automation.
---
## Maybe we shouldn't wear T-shirts?
#### (Not in real life!)
The question is - are T-shirts actually helpful? Or are they misleading for testers and inaccurate for measuring progress and making predictions?
---
## First off, what are T-shirts?
#### T-Shirts come in 5 flavors:
- Extra Small: 0.5 days to test
- Small: 1.5 days to test
- Medium: 2.5 days to test
- Large: 5 days to test
- Extra Large: 10 days to test
---
#### ...and 3 colors:
- <span style="color:green">green</span>: Low complexity
- <span style="color:yellow">yellow</span>: Medium complexity
- <span style="color:red">red</span>: Big complexity
---
## Let's test how accurate they are!
Below is a list of some packages, their T-shirts, and the actual median number of days it takes us to test them:
- php5:
- - shirt: 2.5 days
- - actual testing time: 5.86 days
- php7:
- - shirt: 2.5 days
- - actual testing time: 3.23 days
---
- kernel-default:
- - shirt: 2.5 days
- - actual testing time: 1.69 days
- samba:
- - shirt: 2.5 days
- - actual testing time: 2.5 days
- xen:
- - shirt: 5 days
- - actual testing time: 3.82 days
## ...seems kinda accurate, right?
---
## ...but it doesn't take time into account!
- The time we take to test updates changes over time.
- For example, **php5** may have a median of 5.86 days if we look at all the data - but in 2016 it took 6.87 days to test, while in 2020 only 2.03 days!
- Likewise, **kernel-default** may have an overall median of 1.69 days - but in 2016 it took 2.82 days, while in 2020 only 0.22 days! (A sketch of how such per-year medians can be computed follows.)
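A minimal sketch of how such per-year medians could be computed, assuming hypothetical `(package, year, days_to_test)` records - the numbers below are made up, real values would come from our update tracking:

```python
from collections import defaultdict
from statistics import median

# Hypothetical per-update records: (package, year the update was finished,
# days it took to test). Real values would come from the update tracker.
updates = [
    ("php5", 2016, 6.9), ("php5", 2016, 7.0),
    ("php5", 2020, 2.0), ("php5", 2020, 2.1),
    ("kernel-default", 2016, 2.8), ("kernel-default", 2020, 0.2),
]

# Group testing times per (package, year) and take the median of each group.
per_year = defaultdict(list)
for package, year, days in updates:
    per_year[(package, year)].append(days)

for (package, year), days in sorted(per_year.items()):
    print(f"{package} {year}: median {median(days):.2f} days")
```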
---
## And now the ultimate test:
Let's see how many days we supposedly spent testing in 2019 according to the T-shirt sizes, versus how many days we actually spent.
- In 2019, if we sum the T-shirts of the updates we processed, we supposedly spent: **7,938 days**
- The number of days we actually spent testing: **10,315.8 days!**
We are selling ourselves short by using T-shirts to calculate how many days we worked or saved!
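A minimal sketch of that comparison, assuming each processed update carries its T-shirt and the days actually spent testing it - the records and field names are made up:

```python
# Nominal days per T-shirt size (from the earlier slides).
SHIRT_DAYS = {"Small": 1.5, "Medium": 2.5, "Large": 5.0}

# Hypothetical 2019 update records.
updates_2019 = [
    {"shirt": "Medium", "actual_days": 3.2},
    {"shirt": "Small", "actual_days": 0.4},
    {"shirt": "Large", "actual_days": 9.8},
]

estimated = sum(SHIRT_DAYS[u["shirt"]] for u in updates_2019)
actual = sum(u["actual_days"] for u in updates_2019)
print(f"T-shirt estimate: {estimated} days, actually spent: {actual} days")
```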
---
Digging deeper, we find even more inaccuracies:
- For that period (2019), according to the T-shirts, testing an update took a median of 1.5 days and an average of 1.3 days
- The actual data show a median of 0.9 days and an average of 2.6 days
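A tiny sketch of why the median and the average can diverge like this - a few very slow updates pull the average up while the median barely moves (the numbers are made up):

```python
from statistics import mean, median

# Made-up per-update testing times (in days) with a long tail.
days_per_update = [0.3, 0.5, 0.9, 1.0, 1.4, 6.0, 8.1]
print(f"median:  {median(days_per_update):.1f} days")  # 1.0 - robust to outliers
print(f"average: {mean(days_per_update):.1f} days")    # ~2.6 - pulled up by the tail
```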
---
## So, should T-shirts become obsolete?
There are three ways we could go:
- T-shirts get completely removed
- - pros: No inaccurate expectations and metrics anymore
- - cons: No way to quickly see and evaluate an update
---
- T-shirts stay, but only as a generic way to quickly evaluate an update; the days associated with them become obsolete and the actual days are used in their place
- - pros: The way to quickly evaluate packages remains, but no more false evaluations and expectations
- - cons: ?
- T-shirts are automatically and continuously assigned based on the real data (see the sketch after this list)
- - pros: eliminates any kind of estimation; everything is accurate and correct
- - cons: ?
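A minimal sketch of what the third option could look like - picking the shirt whose nominal days are closest to a package's recent median testing time; `assign_shirt` is a hypothetical helper, not an agreed-upon policy:

```python
from statistics import median

# Nominal days per T-shirt size (from the earlier slides).
SHIRT_DAYS = {"Extra Small": 0.5, "Small": 1.5, "Medium": 2.5,
              "Large": 5.0, "Extra Large": 10.0}

def assign_shirt(recent_testing_days):
    """Pick the size whose nominal days are closest to the recent median."""
    observed = median(recent_testing_days)
    return min(SHIRT_DAYS, key=lambda size: abs(SHIRT_DAYS[size] - observed))

print(assign_shirt([6.9, 5.8, 2.0]))  # -> "Large"
```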
---
## Discuss!
{"metaMigratedAt":"2023-06-15T09:20:28.763Z","metaMigratedFrom":"Content","title":"Should we wear T-shirts?","breaks":true,"contributors":"[{\"id\":\"11704596-8d44-441c-a2f9-88ccc8cc2473\",\"add\":3857,\"del\":549}]"}