# Consensual Data Sharing
Recently in Singapore, a major debate has arisen around the use of data collected from the mandatory COVID contact-tracing tokens issued by the government. The "TraceTogether" (TT) token tracks proximity data between individuals. While the token was initially publicly promised to be used only for COVID purposes, it was recently used in a police investigation and court case for a murder.
This left many Singaporeans feeling rather spooked. Previously TT's data was used only for COVID; now it is used for "serious" criminal cases. What next? It is unnerving not to know whether the government will decide to use this data for other reasons not previously disclosed, the fear being that by then it would be too late: they would already have full access to our data and, in some ways, a great deal of control over us.
While we could go into the discussion of privacy rights versus pragmatism in society, what intrigued me more was the shift in the way data is managed with the TT initiative. Traditionally, most data models are centralized with the party managing them: governments keep data about your identity, your tax payments, your home address, and so on, on their servers. To make the TT initiative palatable to the general public, the Singaporean government used a more user-centric, and arguably consensual, data model, one that technologically requires access to the token itself to retrieve its graph-like data.
Each TT token has a unique ID which it shares via Bluetooth with other tokens/applications in close vicinity, logging that these individuals have been close to one another. This data is logged locally on the token itself and thus can only be accessed with the token in hand. The unique-ID-to-citizen mapping is kept by the government, and when it needs to contact trace, it requests the token and resolves the mappings on its side. Note that in this way, without the physical token (yours or others'), the government cannot process the data. Whereas in traditional models the parties holding vast amounts of data can run aggregate analysis on it, this is not the case with the TT initiative: it is technologically impossible to gain insights without the physical tokens themselves.
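To make the split of knowledge concrete, here is a minimal sketch of this design in Python. All names (`Token`, `registry`, `trace`, the sample IDs) are hypothetical illustrations, not the actual TraceTogether implementation: the token holds only a proximity log of opaque IDs, the government holds only the ID-to-citizen mapping, and contact tracing requires both pieces.

```python
from dataclasses import dataclass, field

@dataclass
class Token:
    """Illustrative model of a TraceTogether-style token (names are hypothetical)."""
    uid: str                                  # opaque ID; its meaning is known only to the government
    log: list = field(default_factory=list)   # proximity log stored locally on the device

    def encounter(self, other: "Token", timestamp: int) -> None:
        # Both tokens record the other's uid, as in a Bluetooth exchange
        self.log.append((other.uid, timestamp))
        other.log.append((self.uid, timestamp))

# The government holds only the uid -> citizen mapping, never the logs themselves.
registry = {"uid-001": "Alice", "uid-002": "Bob"}

alice = Token("uid-001")
bob = Token("uid-002")
alice.encounter(bob, timestamp=1000)

def trace(token: Token, registry: dict) -> list:
    # Tracing is only possible with BOTH the physical token's log AND the registry
    return [(registry.get(uid, "unknown"), t) for uid, t in token.log]

print(trace(alice, registry))  # [('Bob', 1000)]
```

Without `alice`'s physical token, the registry alone reveals nothing about who met whom; without the registry, the log is just opaque IDs.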
This paradigm shift of requiring consent to analyse data reduces the risk of our personal information being used for undisclosed purposes, while still letting us leverage the power of data, be it for health, criminal, economic, or other use cases. I'm not sure whether the individuals in the police investigation consented to access to their tokens, but assuming they did, they were able to prove to the police exactly whom they had been close to without permanently exposing whom they are in contact with throughout their whole life. On top of not having to grant standing access to my data, these models let us restrict what data we share. Just as one might not mind sharing their profile information on Google but would definitely mind sharing their Google Drive, these local data models can enforce similar or even stricter boundaries. If I'm sharing information relevant to the murder investigation, I might share only whom I've been with in the past week, for example.
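The "share only the past week" idea above can be sketched as a simple filter over the local log before anything leaves the device. This is an illustrative assumption about how such selective disclosure could work, not a description of the real token's firmware:

```python
def share_window(log, now, window_seconds):
    """Disclose only the encounters that fall within the last `window_seconds`.

    `log` is a list of (uid, timestamp) pairs kept locally on the token;
    everything older than the window stays private on the device.
    """
    return [(uid, t) for uid, t in log if now - t <= window_seconds]

ONE_WEEK = 7 * 24 * 3600

# Hypothetical local log: one old encounter, one recent one
log = [("uid-002", 100), ("uid-003", 1_000_000)]
recent = share_window(log, now=1_100_000, window_seconds=ONE_WEEK)
# Only the uid-003 entry falls inside the one-week window
```

The design choice here is that filtering happens on the token's side, so the requesting party never sees the full history, only the slice the owner agrees to share.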
If we can restrict access, understand on an individual level what our data is used for, and have guarantees of that, it becomes easier to trust the usage of our data, even for sensitive data like medical records, insurance schemes, and others. It also allows our collection of data to be more accurate and transparent, because we know how it is used. Local privacy control, or decentralization of data, is not an entirely new trend, but it is always exciting to see more impactful use cases out there. Due to the fiasco