# dark-crystal-diaries-0-cory-doctorow
File: Dark Crystal Diaries 0
https://keybase.pub/danielsan/the-local-gossip/the-local-gossip-NeB4q4Hy-8-dark-crystal-diaries-0-cory-doctorow/dark-crystal-diaries-0-cory-doctorow.mp3
[scuttlebutt](https://scuttlebutt.nz) cypherlink: %YQqmTXmHM5g4sIDOMFs6ZRtr/qhcy1csarsfK2caqvs=.sha256
Transcribed by: https://typeology.co.uk/
Errors: Please ping dan at dh@blockades.org or @dan_mi_sun on twitter if you come across transcription errors. We're aiming for **eventual** consistency.
---
Dan: Hello everyone out there in cypherspace. This is Dan Hassan and I am with Cory Doctorow. This is the first in a series called Dark Crystal Diaries, where we’ll be speaking to friends, peers and advisors connected to the Dark Crystal project. Cory; I’m not gonna use the precious time that we have to introduce you. I think the typical audience for this are all gonna know who you are [inaudible 0:31] link out to a recent show that you did with Jamie King on Steal This Show, which I think is probably a good primer for this.
Cory: Yep.
Dan: So the shorthand of how we’ve come to be on this call is that Cory and I have a mutual friend, Emily James, who has connected us both. So Cory, at first when I reached out to you, I think the way that I framed the project was kind of based in the history of cryptocurrencies, and you’re pretty well documented as being a sceptic of kind of a lot of the blockchain hype, and I think I mis-pitched Dark Crystal to begin with, but it’s really good that I did I think, because we’ve come through to the other side to something where you have agreed to kind of consider working with us a bit more. So I’m just gonna read something from Walkaway, which – the reason being is it’s right at the kernel, the seed of why I was interested in speaking with you, and that is from Kindle page five ten: "There’s plenty of crypto weenies trying to figure this out, using shared secrets to split the key into say, ten pieces such that any five can be used to unlock the file." Do you remember writing that?
Cory: Oh yeah. Sure. Yeah.
Dan: And so I’m super interested in the conversations that would have led kind of into that arc within Walkaway coming out. Was that stuff that was of interest to you for your own reading or do you know, as you put it, a number of crypto weenies?
Cory: Well a little of both and really you know, the thing that got me thinking about how you would intentionally share your data with trusted parties after you were no longer able to control it, like after you were dead, was that a very dear friend of mine and a technologist who I grew up with; a guy named Eric Stuart who went by [inaudible 02:48] died of a freak brain aneurysm when he was in his early forties. He just went to bed one night and never woke up. And you know, lucky stroke for all of us, he had left his computers on with his encrypted discs all mounted and no screen lock, and I went over and plugged a terabyte hard drive in and captured all of his data and re-encrypted it and stuck it in an Amazon Glacier locker and pre-paid for ten years of storage for his parents until they could figure out what to do with his data. And you know, around about the same time I think it was my friend Charlie Stross mentioned that by such and such a year – twenty-twenty or twenty-thirty – some pretty soon year, the majority of internet users would be dead, right, the majority of people who’d ever used the internet would be dead and their data footprints would be all over the internet. You know, also around that same time my friend Aaron Swartz hanged himself after he’d been hounded by US federal prosecutors for downloading scientific articles from MIT’s network, and he had a very prolific data footprint that his friends struggled to figure out what to do with. And there was also, around that time, a cryptographer who was driving back from a crypto conference – I don’t know his name because I only saw the presentation by his friend the next year, but he and his wife were in a car wreck that killed him and, being a cryptographer, all of his data was really, really well encrypted and he’d never figured out any kind of succession plan. So all of those things got me thinking and you know, I have a data will that explains what I would like done with my data and also how to access all my data and you know, there’s a master password needed to get at that, and that master password has been split into two pieces that I hand wrote on slips of paper and I gave half of it to a lawyer in San Francisco, who is actually sitting about fifty yards from where I am now, and the other half to a lawyer in London, on the theory that it would be much harder to compel disclosure from lawyers in two different jurisdictions than it would be from one, along with instructions for when I’d like that stuff turned over to my wife or someone else if I were incapacitated or dead or whatever. And thinking about all of that and how thorny that problem is and how potentially compromising it is to trust third parties with access to all of your data while you’re alive and how potentially terrible it is for your loved ones not to be able to access your data after you’re dead, and trying to figure out how to strike a balance between those two tensions – that was something that I really have been thinking about a lot.
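[Editor's note: the "ten pieces such that any five can be used" scheme quoted from Walkaway above, and the split-master-password arrangement Cory describes here, are forms of threshold secret sharing, usually done with Shamir's scheme. Below is a minimal Python sketch for readers following along; the function names and the choice of prime are our own assumptions, it is illustrative only and is not the Dark Crystal implementation.]

```python
# A minimal sketch of Shamir's Secret Sharing: split a key into n shares such
# that any k of them can reconstruct it. Illustrative only -- not the Dark
# Crystal code; real systems should use an audited library and add share
# integrity checks.
import secrets

PRIME = 2**127 - 1  # a Mersenne prime larger than any secret we encode below

def split(secret: int, n: int, k: int):
    """Split `secret` into n shares, any k of which can reconstruct it."""
    # Random polynomial of degree k-1 with the secret as the constant term.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for power, c in enumerate(coeffs):
            y = (y + c * pow(x, power, PRIME)) % PRIME
        shares.append((x, y))
    return shares

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

if __name__ == "__main__":
    key = secrets.randbelow(PRIME)      # stand-in for a master key
    shares = split(key, n=10, k=5)      # ten custodians, any five suffice
    assert recover(shares[:5]) == key   # five shares recover the key
    assert recover(shares[2:7]) == key  # any five will do
```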
Dan: When I asked that – I kind of – that quote that I started off with was enough to kind of spur around ten of us, at times more, on this year or two year journey, kind of delving deeper into that problem. So it doesn’t surprise me to hear that – the story that you’ve just related is so rich – I’m really sorry about the tragedy-studded nature of it, but I think that kind of, what you’ve highlighted is something that for any person who maintains a relationship with the internet, whether they think about it or not, has kind of come up in some shape or form. And so at the heart of Dark Crystal was the question of – although it was slightly orthogonal, it was: when we say P2P, peer-to-peer, who are the peers that we’re considering? Because although blockchain is one subset, it’s one of the ones which has been getting a lot of hype and air-time, typically the composition of people is not super diverse, and so the beginning of the project was figuring out what it would mean to expand who gets to be a peer in these peer-to-peer systems, and we should shout out to Jaya Klara Brekke, who in the MoneyLab three postcards from the future was kind of – she is also one of our advisors actually, I should get her on to say this stuff in her own words. That kind of figuring out who gets to be a peer in peer-to-peer systems, when you’re focussing on that custody problem, really quickly gets into the realm of how do you securely pass data across transformations – so be it death or incapacitation or if you’re crossing a border and your stuff gets seized or other such – which has been a problem that probably we could spend a whole number of years on. And one of the things that really inspired me about the way that you wrote about Shamir's Secret Sharing was that it was so closely mapped to the relationships of the characters in the story. And what I really liked about the novel, or that theme in the story, was that although it was a deeply technological question, the technology kind of needed to get out of the way and figure out how the humans would do it in any case, and I know that’s something – that feature of Walkaway, this notion of kind of, human relationships amplified by technology, is the theme that’s really inspired a lot of the Scuttlebutt folx. Have you – I know that you’ve written about Scuttlebutt before, have you like tried it out, have you looked at it -
Cory: No.
Dan: - has anyone around you shown it or…
Cory: No. I don’t have anyone to Scuttlebutt with. This is the problem with social technologies.
Dan: Yes, OK. So there’s a current, emergent way to try and understand those kind of people-centred technologies as being less – less about trying to make people do things in a computer way and more about thinking how we can make computers mimic the way people do things. So one of the problems that we had at the beginning of what was the genesis of the Dark Crystal project was that to get people access to these systems you needed to teach people password managers, and I don’t know if you know this, but statistically speaking I think about between seventeen and eighteen percent of Bitcoins that have ever been – or ever will be – generated look like they’ve kind of been lost. And by that I mean you would expect some small fraction of larger holdings to have been moved or sold off during peaks of changes in price, and there’s like whole tranches of coins which just haven’t moved in a super long time, so the speculation is they’ve been lost. And so I kind of extrapolate from that and go: well, if you can’t pay people to learn how to use password managers securely then it’s highly unlikely that people less incentivised will be able to. And so if at the core of managing our data more securely it’s gonna rely on password managers then we’re kind of screwed. Although this isn’t me saying don’t use them; I think they’re super important. But the thing that I really like about the arc within Walkaway is the sense that – I do think people are able to problem-solve things like: OK, I have my apartment, who would I leave my key with in case I lose my wallet and my keys? And that’s kind of a much more human thing which I think most people can do to some degree. So essentially we’re thinking about: who would I trust, how much, and for how long?
So in your travels, with the experiences that you’ve had that made you think about that stuff, has anyone else come and spoken to you since then about that kind of theme within the book, or have you kind of learnt about any other projects in that realm that have kind of added some wrinkles to the tale?
Cory: Well the thing that comes to mind is a presentation I recently saw at the Swiss Cyber Storm Conference, where the Googler who’s in charge of their password recovery and anti-phishing talked about how the system relies on a whole bunch of heuristics that are not the things your stupid bank asks you when they’re like: “We’re looking at your credit report, can you tell me how much you spent last week on Amazon?” as a way of validating you, but instead a bunch of stuff like: “Which of these six people do you know?” or you know: “Which of these four cafes did you go to the last time you were in Berlin?” Which, you know, creepy that they know all that shit, but at the same time it does look a lot more like how we might authenticate a person. And, you know, in some ways these are all just shibboleths, right, like things that can only be pronounced by the trusted parties and that the untrusted parties can’t pronounce. And you know, I think that oftentimes shibboleths can lead us astray, that we often assume that they’re harder to forge than they turn out to be. I mean think of all those green activists in the UK who got fooled by, you know, coppers who grew out dreadlocks and learnt to speak like anti-third-runway types and then, you know, went on to impregnate some of them. But, you know, as an adjunct to that, or maybe as something that is computationally managed and has maybe some calculable complexity or, you know, that can be thought of against a large data set – and you can say, well, what is the proportion of people in this, you know, huge leaked data set that we have who would have been made secure by the deployment of given shibboleths – then maybe we can do at-scale, automated versions of this stuff.
Dan: And that’s something that, in my research around this stuff, is like: what does already exist? And I think in 2000 people love to rag on Facebook, a lot of the time with good reason, but I often wonder what it’s like in the belly of the beast when you’ve got these wicked smart people kind of thinking about this stuff. And in 2011 – I’m gonna screw up the names because it’s not that much in my brain – but something like Trusted Contacts, where essentially it’s the same thing where someone’s lost their phone, access to their email, and they can’t remember their password and they don’t want to provide identification to some unknown person within Facebook, then you can, in your settings, set up trusted contacts where, as long as you can remember one of them when you’ve forgotten your password, then those three to five people that you’ve identified get sent a code and then you’re meant to be able to ring them out of band – so on the phone or whatever. And what’s interesting is, since learning about that, I’ve tried to find people who have used it – within these centralised contexts, what I’ve found is that it’s not often nowadays that people kind of lose their phone and email and aren’t willing to show ID. However in these peer-to-peer systems there is no Facebook, there’s no person that you can go back to – so we’re kind of left with not many of the centralised options. OK -
Cory: OK. So I wonder, you know, as a security measure, how hard it would be if you had a big data set of trusted – of trusted third parties or trusted, you know, fallback people, how hard it would be to figure out who those people were and maybe suborn them. And you know, it’s this – cause we have these weird threat models on the internet that are things like ransom threat – you know, like the half-smart ransom threat model, which is -
Dan: Yes.
Cory: You know, the dumb ransom threat model is: I’ll just hijack anything I can and ask for ransom, right. That’s how you get like idiots hijacking NHS hospitals and asking for three hundred dollars to un-hijack them. You know. But the half-smart one is: I have like an opportunistic attack where I can look at a huge data set of leaked accounts or, you know, some other big, leaked, breached data set, and then I can sort it by some field that will tell me who’s worth fucking. And then I can go to those people and do the leg work necessary to figure out who their trusted third parties are and I can phish those people. And I wonder if like that wouldn’t create a bunch of really chewy, complicated security problems. I mean, one of the things that’s in Walkaway is that they’re sceptical of Shamir's secrets, in part because complexity is the enemy of good security, and you know, having this kind of ever expanding cloud – like a geometrically expanding cloud of I trust you and these nine other people, any five of them, and they all have their own list of ten with any five – and, you know, figuring out like is there some six of them you could roll up and like suborn and then get access to a whole ton of stuff. Like it’s a very complicated and difficult question and I think it’s fun to do thought experiments with, but before you ever ask someone to entrust something to it, it’s the kind of thing that you really want red teams to look at. Cause I know that, you know, in the aggregate our social behaviour is a lot more deterministic than we think it is, particularly if you only care about one or two sigmas. If you just want to compromise – if you say that within any four million compromisable people there’ll be two hundred that are really worth compromising, which seems to me like kind of a rule of thumb, probably like a pretty conservative estimate, then ask yourself whether you could get two hundred account thefts out of a breach set of four million, which is a small breach set, by doing them – you know, essentially by doing the equivalent of checking if anyone’s password is ‘password’, right, whether anyone’s like fallback is their mum, their wife, their dad and their boss. You know, then I worry that it creates a very compromisable environment.
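[Editor's note: a rough back-of-envelope sketch of the worry Cory describes above: how likely a 5-of-10 quorum of custodians is to fall if each one can be phished or suborned individually. The function name and the per-custodian probabilities are invented for illustration; in reality custodian choices are correlated (mum, partner, boss), which makes the picture worse than this independent model suggests.]

```python
# Probability that an attacker rolls up a k-of-n quorum of custodians,
# assuming each custodian is compromised independently with probability p.
from math import comb

def quorum_compromise_probability(n: int, k: int, p: float) -> float:
    """P(at least k of n custodians compromised), custodians independent."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

if __name__ == "__main__":
    # Made-up per-custodian phishing probabilities, purely for illustration.
    for p in (0.05, 0.20, 0.50):
        print(f"p={p:.2f}  chance a 5-of-10 quorum falls: "
              f"{quorum_compromise_probability(10, 5, p):.4%}")
```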
Dan: One of the other themes which I’ve just heard as well is kind of: do we know what we’re getting ourselves in for? So it’s this question of what consent means in the field of the new and unknown. So in version one of Dark Crystal – once we knew what the problem was that we were experimenting with – the first version was the kind of non-ideal, hacky, get-it-as-close-to-something-as-possible version. The side effect of that is that at the moment you can essentially implicate people – others in the project would say, ideally speak to people before sending it off to them – but essentially you can send people secrets without garnering consent from them, which is problematic. And so this brings up a whole kind of ethical question of, yeah, do people know what it is that they’re getting themselves in for. A related project that I’ve come by which is connected to this – I’ve forgotten which uni it came out of. [Inaudible 19:30] Erin and Ian Goldberg I think. I forget which university, but I saw it pop up via Open Privacy – Sarah Jamie Lewis – have they ever popped up on your radar?
Cory: No, I’ve never heard of it.
Dan: They’re – they’re rad. I want to get to know them more. But anyway, so this side project was called Shatter Secrets, which was coming at it from the angle of if you’re trying to move across borders where there’s a probability of your devices being taken from you and you’re trying to bring documents with you, it’s essentially this notion of – it’s called Shatter Secrets I think [inaudible 20:11] but essentially you split those secrets out to people across the border and then reassemble them at the other end. And [inaudible 20:21] within our project is like hey that’s super neat but if you have documents that are worth the trouble of identifying who those six people are, that’s not great news.
Cory: Yeah.
Dan: So I guess these are questions similar to what happened kind of with Snowden: he would initially have reached out to Glenn Greenwald, then connected out to Micah Lee to see if he could help get [inaudible 20:48] up and running with GPG. The thing is, he couldn’t say: “Oh by the way, if you accept this, you’re gonna implicate yourself in something really big.”
Cory: Yeah.
Dan: And so it’s this thing of, at a smaller scale, how do we build this in a consentful way so that people have time to think this stuff through -
Cory: I think there’s another inverse of that threat model that is worth thinking about. So I, as a thought experiment, once proposed that you could create like ten pass phrases for your encrypted disc, using just a strong password generator. So they’d just be random hundred and twenty-eight character strings and you would obviously not know any of those strings, but you’d have your regular disc-unlocking password that’s just a long password. And the thing that I was trying to accomplish was: you want to go somewhere with your computer and you trust that when you’re not at a border there’s some rule of law, but you also think that when you’re passing through a border the rule of law is suspended. Which, you know, is a pretty good description of a lot of places, where the rule of law doesn’t apply even when you’re past the border, and I think we don’t always know where those places are – I mean that guy who just got hauled into parliament and made to log into his Dropbox account to give up some Facebook documents is an interesting example of how what you might think of as your rights outside of the border are not your rights. But, you know, stipulate for the sake of argument that you could at least call a lawyer and argue about whether or not you have to hand over your password once you’re clear of the border, but not while you’re in the border. And you want to get some work done, right, and so you want to bring your laptop and use it on the plane, you want to use it on the other side. And so you have these ten passwords and you get in a cab to go to the airport and the first password in the list is your memorable password. You type it in and you work all the way to the airport and then you delete the password and now you can’t log into your laptop. You go through customs and you call your room mate and you say: “Tell me what the first encrypted password is.” and you type it in and change it to your memorable password, get on your plane to, say, Singapore on your way to Australia, and get off the plane at Singapore, delete your memorable password again. Now you can’t log into your laptop. Go through customs, buy a smoothie, sit down in the departure lounge and call your friend and say: “Give me password number two.” and then you enter password number two and then you work again on the plane, and then when you’re deplaning in Sydney, you delete your password, you go through customs, and when you get out of customs on the other side, you call your friend and you say unlock it. You do it all in reverse on the way home. And so in theory this works really well because your friend is outside the coercive force of the state, and as long as you’re right about where the state can coerce you, as long as it’s only at borders, then you’re in really good shape. And I mentioned this to a friend, actually the lawyer who’s sitting a hundred yards from me who has my password in case I drop dead, and she said: “Oh yeah, they’ll just arrest you and call your friend and they’ll say if you ever want to see Cory again, you’d better give me his password.” And so this is another way that people can be coerced, right? They can be coerced not just by being like put to risk but by being made to decide whether or not you can be a free man, right, and that’s an incredible conundrum to plunge someone else into non-consensually or even consensually. It’s a very tough thing.
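[Editor's note: a minimal sketch of the mechanical half of Cory's thought experiment: generating the numbered list of long, random, never-memorised unlock phrases that the friend back home keeps. The names and parameters here are our own assumptions for illustration; actually installing each phrase as the disk password is done with whatever disk-encryption tooling you use, not by this script.]

```python
# Generate a numbered list of throwaway disk-unlock phrases. Only the friend
# outside the relevant jurisdiction keeps this list; before each border you
# set your disk password to the next entry and destroy your own copy, and once
# clear of the border your friend reads it back over the phone.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits

def travel_passphrases(count: int = 10, length: int = 128):
    """Return `count` random passphrases, each `length` characters long."""
    return [
        (i + 1, "".join(secrets.choice(ALPHABET) for _ in range(length)))
        for i in range(count)
    ]

if __name__ == "__main__":
    for number, phrase in travel_passphrases():
        # This printed list is the copy your friend keeps; you keep none of it.
        print(f"password #{number}: {phrase}")
```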
Dan: Yeah, that’s – I’m feeling a lot of things in relation to that. I guess – So one of the things I’m feeling is one; do we know what we’re – So in your simplified example it’s one person. So let’s keep it at one, although it could be ten or whatever -
Cory: Yeah.
Dan: - making it more complex. So the thing is, as you’re passing through the border, you as an individual – they’d say what’s the password and you’re like: aha, I don’t know, I can’t actually tell you. And they’re like: well, you wouldn’t just be carrying around this hunk of machinery as like a paperweight, so how is it – So in that scenario we’re assuming it would get to a point where you reveal that there would be someone who could help get that through. And so what’s happening in that instance is you’ve moved from being an individual to being kind of part of a small group of people, and what we’re essentially – what I’m hearing you worry about is: are people ready to essentially be responsible to each other even in gnarly situations?
Cory: Aha.
Dan: And so, to loop back round to kind of the beginning of the Dark Crystal project. I guess one of the gambits was – it feels like at the heart of a lot of the more libertarian end of the spectrum of cryptocurrencies is this notion that at the end point of society there is this kind of rational, informed, logical individual, and that the price of entry into these systems is that that rational, logical individual is able to keep data secure, and yes it’s a hard problem, but if someone is kind of an activated human then they will be able to figure that out. Now I’m not in that camp of people; I’m more in the camp of people to whom the end point of society isn’t individuals, it’s individuals in relation to other people, making up kind of groups of variously strongly-knitted social fabric. And so a part of me feels that probably in terms of technology and data and these types of things, that sounds like – although a thorny problem, maybe one that it’s good for us humans to be thinking about in the times coming ahead; things that make us think about how technology can strengthen and amplify our social fabric and kind of get us to level up in that kind of Walkaway sense.
Cory: Yeah.
Dan: It was really – yeah, it was really the gambit at the heart of Dark Crystal, which was that it’s possible to reconfigure the sum of the component parts within cryptocurrencies in a way that’s more social, that kind of doesn’t amplify alienation in the way I find cryptocurrencies can tend towards – rather, a way of using technology to strengthen social fabric, and part of that is knowing how to weather storms together.
Cory: Aha.
Dan: Cause when I think of Aaron Swartz, like his footprint isn’t just digital, it’s also the relationships which have carried his memory and kind of mission forward as well. He wasn’t just this brilliant – he wasn’t just a brilliant individual, though he was that. To kind of hear where I’m coming from – so basically what I’m saying, TLDR, is: yes, I agree with your analysis that that is a problem. But I think it’s probably the right type of problem; it’s in the right neighbourhood, in the right direction for us to be figuring out together.
Cory: So let me see if I can make the point that I’m trying to make crisper here. There is a class of counter measures that networks enable, where you relieve someone of the risk of coercion by taking the thing that the coercive force wants and moving it out of their hands, moving it out of the jurisdiction of the person who’s trying to coerce them. And that is a powerful and useful tactic but it has its own counter measures. So a good example of this: we have a client here at the Electronic Frontier Foundation who in the court records is called Mr Kidane and he’s an Ethiopian national, he’s a dissident journalist who lives in exile in Washington DC. And the Ethiopian government bought a zero-day hacking tool to break into his Skype, from a now disgraced and collapsed Italian company called Hacking Team. And they broke into his Skype in Washington DC from Addis Ababa and they mined his list of contacts in Addis Ababa; the people who were giving him the material he needed to publish anonymously sourced, devastating reports on corruption in the government, and they rounded up all his friends in Addis Ababa and tortured them. And so the technology that giveth the power for someone who is out of harm's way to be a proxy or fiduciary for someone who is at risk is also the power for the person at risk to be held hostage to the person who is playing the fiduciary. And there’s two models for how you ally with your fiduciary, right; one is through a kind of legal duty and the rule of law, so you have lawyers that you trust and those lawyers – or a fiduciary of some other kind – are bound by a code of professional conduct and maybe even given special legal powers, like the power to resist certain orders compelling them to show evidence, like attorney-client privilege. And so you trust them because you trust that the rule of law is intact, and that has a weakness in that the rule of law is not always intact and the rule of law is a lot more contingent than we’d like it to be, and particularly contingent on the whims of powerful people, and that is a trend that is accelerating – in part, I think we should note, because cryptocurrency is letting oligarchs launder money. And then the other model for this is that you have someone you love and trust, who you think would never betray you, and you’re not relying on the rule of law, you’re relying on these human factors. But those human factors are the very human factors that then get turned against you when someone kidnaps your wife or husband or kid or whatever and says tell me what their password is or they’re never getting out of jail. And so they both have their weaknesses, like – well, I think like all security measures, they have to be deployed against a threat model that correctly assesses what risk you are going to be put to and by whom. And so if you use fiduciaries where the rule of law is weak then you will be exposed, and if you use loved ones where the rule of law is strong but your counterparty understands that, you know, your friend back in a safe place has no rights to stop them from torturing you, then you’re also exposed. And so assessing the threat is really important to understanding the counter measure.
Dan: So what’s loud and clear is – this is what you meant earlier by red team analysis. So essentially threat modelling, working out specifically what the different constraints are and then working, with those in mind, with the understanding that there’s no such thing as perfect security and…
Cory: Well it’s not that there’s no such thing as perfect security, it’s that security is only ever secure as against an attack right? You can have perfect flood security but it won’t stop your house from burning down and so –
Dan: True.
Cory: And security’s always relative, you know. Like bank robbers are more secure when bank vaults are less secure. So there’s – It is always relative to a person and always relative to the threat that person fears and it’s just – It just doesn’t – Like there isn’t a shadow on Plato’s wall that says ‘security’ over it; there’s only security in context.
Dan: Very true. Cory; as I suspected would happen, my brain’s fully saturated from having bounced this off your brain, which was in part the hope. I know you have loads of calls lined up after this, so I’m kind of – I’m good. Do you have any questions?
Cory: No. This was a really interesting discussion.
Dan: About Dark Crystal?
Cory: You know I – I mean I guess what I would like to say is I don’t mean for any of this to be discouraging. You know, oftentimes when we start with an idea it can seem like there are some really big problems that can turn out never to materialise or that have solutions that are already in place. But like part of that idea, part of the process of ideating a new security system, is also to think about how it can be hacked, and oftentimes the way it can be hacked is the way you make it stronger, right; it’s where you discover new and better ways of making the security robust against different kinds of attacks. So yeah, I’m interested in seeing what you guys come up with. I mean I think like maybe taxonomising threat models would be a useful next step, and just having some like user stories that are based on real things. I mean Mr Kidane, Ed Snowden; there are a bunch of people out there who have these high-risk environments, and also like the opportunistic attacks; the kind of ransomware dumb-dumbs – and what they might do against this kind of thing too.
Dan: That’s all super. Thank you. So we’ve got a residency coming up with the Simply Secure folk in Berlin in a couple of weeks’ time, where I think we’re gonna be running through a bunch of this stuff. So I’m gonna listen back to this, make some notes and then speak to some more people and then go from – go from there.
Cory: Yeah, well good luck with it. It’s been really nice chatting.
Dan: Yes. Hope you have a good day and we can fit some coffee into next week.
Cory: Yeah, that’d be nice. Alright.
Dan: Thanks for making time -
Cory: OK. Nice talking to you. Bye.
[Recording ends 35:42]