You can also watch it on YouTube.
Below is the edited essay version of this video.
Why Notorious is a great example of a culture of data security
One of the best examples of how to think about organizational data security, access control, defense in depth, how you handle security leaks, and the like, doesn’t come from computer science, organisational theory, or case studies, but from spy media – specifically Hitchcock spy movies.
The idea that the Germans must not discover any information or secret in the UK that might be of use to them during the Battle of Britain was ingrained not just in the popular culture of the time, but in the culture generally.
Not just military and top-secret information, but information about shift changes in a factory, or how production was going on a manufacturing line. All of that was valuable information to an adversary.
This meant that most of the people who watched Hitchcock’s World War II movies, or the movies made immediately after, would have been intimately familiar with the idea of containment – of making sure that only the people who needed to know were allowed to know, because loose lips sink ships. The more people who know, the more likely it is the adversary will discover it.
When you look at a movie like Notorious, made in 1946 in the immediate aftermath of World War II, you have to remember that the audience at the time knew what it took to make sure an enemy didn’t find out information that threatened the security of the nation.
So if you aren’t familiar with the movie, you should absolutely watch it, as Notorious is one of the best examples of a Hitchcock thriller.
You’ve got Cary Grant and Ingrid Bergman as the leads. Cary Grant is the American spy, Ingrid Bergman is the daughter of a notorious Nazi traitor (hence the title), and he convinces her to infiltrate, by any means necessary, a group of Nazis who managed to flee after the war, because the Americans know that the Nazis are planning something.
Because Nazis are always planning something.
And it was vital for the national security of the US and other countries to discover what they were planning and prevent it.
That, right at the start, is a good first example of operational security: the Nazi group that Ingrid Bergman is supposed to infiltrate makes sure that whatever vital strategic data they’re protecting is only accessible from within the organization.
They’re not keeping it in a mailbox or safety deposit box in London or somewhere where the US can access it. They make sure it’s stored away from where adversarial authorities can reach it.
Similarly, if you’re worried about the US, you don’t store your data on a US-based server or with a US-based company.
This is not a new concern.
The US has a history of compromising the data security of their allies
From the time I got my first job at a software company, there’s been this concern about the safety or security of any data you store on a US server. That comes down to George Bush, the Patriot Act, and the laws that followed, which effectively mean you can trust that if the US authorities are interested in any given piece of data stored with a US-based company or on a US-based server, they will be able to access it.
That’s what the law is for.
What many warned about at the time was that we can’t always trust the US government to respect whatever safeguards it might have in place – that this was proto-fascism, or a first step towards the mechanisms of fascism, even if you didn’t have an actual fascist movement behind it.
That’s kind of what’s happened now.
It also demonstrates one key difference between the security concerns of a European company or organization today versus 20 years ago.
When I was working at a software security company here in Iceland, they hosted all of their data in an Icelandic data center, specifically because they were worried about the Patriot Act.
What’s new is that movements have converts
One of the things that they didn’t have to worry about at the time were zealots or converts. You wouldn’t have an Icelander or a European suddenly convert to “Bushism” and think that the ideals of following George Bush would trump their own current national interests, or the interests of the company they’re working for, or the colleagues they’re around.
But today there is a decent chance that if somebody has authoritarian leanings or authoritarian ideas, or has sympathies to fascist ideologies, they might see the rise of Trumpism as something that they can relate to and follow.
There’s a very real possibility that a security leak would not necessarily come from accidentally hiring a North Korean agent or hiring an undercover agent or infiltration such as the Nazis were facing in Notorious, but from somebody who has been working for your organization who suddenly believes that the cause of Trumpism is more important than the organization they work for, or the country they belong to, or their colleagues that work around them.
The infiltration isn’t just the unfamiliar, or the new, or an employee or manager getting hacked. It is potentially converts and zealotry.
That’s an aspect of this that wasn’t the case 20 years ago, or even 10 years ago.
You didn’t really have to worry about somebody in your organization suddenly converting to Putinism or becoming a Chinese sympathizer overnight. There has nonetheless always been a contingent in tech with fascist and authoritarian leanings.
I remember conversations in companies where I used to work, where you’d ask a developer what they were reading, or talk to them about politics, and you’d end up thinking to yourself “that’s kind of fascist”.
You do actually need to worry about somebody inside your organization turning, especially if you’re in tech.
It’s not just the converts. As I mentioned earlier, people have accidentally hired North Korean agents, and executives’ phones or email accounts have been hacked, meaning they became unwitting infiltrators.
It happened to Microsoft.
It can happen to you.
Access control
So the first step, if you follow the playbook of the Nazis in the movie Notorious, is access control. Even though Ingrid Bergman managed to infiltrate herself into the household of the Nazi played by Claude Rains, she didn’t have access to everything – to all of the secrets of the organization.
Similarly, just because somebody works for you, or is a manager, or is allowed in your office, that doesn’t mean they should have access to every database.
Access to any given data store in your company should only be open to those who need to access it.
Even a developer who says, “I need access to the database to develop software that works with it”.
No, they don’t need to work with the production database.
They need to work with something that looks and works like the production database.
They should never ever touch actual customer or client data, because you don’t know what they’re gonna do with it. They could even accidentally leak it, which is honestly just as bad, because when it comes to regulatory compliance and your duty and obligation to report on security issues, regulators don’t care that much how it happened.
Unintentional or not, you still need to report leaks, so they still expose you to liability and issues.
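The principle here boils down to deny-by-default with explicit, narrow grants. Here’s a minimal sketch in Python – the role and dataset names are made-up examples, not anything from a real system:

```python
# Least-privilege sketch: nobody gets access unless explicitly granted.
# Role and dataset names below are hypothetical examples.

GRANTS = {
    "billing-service": {"invoices"},
    "support-staff": {"tickets"},
    # Developers get a scrubbed staging snapshot, never production data.
    "developer": {"staging-snapshot"},
}

def can_access(role: str, dataset: str) -> bool:
    """Deny by default: access requires an explicit grant."""
    return dataset in GRANTS.get(role, set())

print(can_access("billing-service", "invoices"))    # True
print(can_access("developer", "customer-records"))  # False
```

The important part isn’t the data structure, it’s the default: an unknown role, or an unlisted dataset, gets nothing.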
Be selective about what you store
You also need to make sure that you don’t store what you don’t need.
Much of this goes unsaid in the movie, but the Nazis make sure that they don’t name too many names. They don’t store too much detail because they know that anything that is stored can get leaked. The same thing applies to us when we’re developing software or structuring an organization.
The data that you don’t store doesn’t get leaked, doesn’t get stolen, doesn’t get confiscated or accessed through a warrant by an American agency that has just criminalized a section of its population and wants to use you to find people to arrest.
If you don’t store that data, you’re not gonna be a target.
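One way to make “don’t store what you don’t need” concrete in code is an allow-list of fields, applied before anything reaches durable storage. A minimal sketch, with hypothetical field names:

```python
# Data-minimisation sketch: strip everything not on the allow-list
# before a record is persisted. Field names are hypothetical.

ALLOWED_FIELDS = {"order_id", "item", "quantity"}

def minimise(record: dict) -> dict:
    """Keep only the fields the order system actually needs."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

order = {
    "order_id": 42,
    "item": "book",
    "quantity": 1,
    "birth_date": "1980-01-01",  # never needed to fulfil an order
    "browsing_history": [],      # definitely never needed
}

print(minimise(order))  # {'order_id': 42, 'item': 'book', 'quantity': 1}
```

Doing this at the write path, rather than trusting every caller, means a new field added upstream doesn’t silently become stored (and leakable) data.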
Containment
Another item from the Nazi saboteur ring playbook is compartmentalisation: they organized into cells where the data that was contained within each cell wasn’t collected at a central point. As each spy cell was discovered, they didn’t know enough to harm the rest of the organization.
In terms of business strategy, that means data should be stored where it can be acted upon.
The more data you control centrally, the further that data is stored from where it can be acted on, which is a bad idea in the first place.
Centrally storing all of your data also makes you a bigger target.
But the closer you store the data to where it’s acted upon, the tighter the feedback loop is for that part of the organization, and the faster decisions can be made.
It’s not just a security issue to keep information at the edges rather than in the center.
It’s a functional organizational issue – a question of maneuverability.
Storing information at the edges makes an organization more limber and more capable of action at short notice, often even before central management notices the issue.
One example of this is bookstore chains. An issue that plagued bookstore chains in the UK, for example, was that in many of them, decisions about book stock and orders for any given bookstore were made centrally.
They collected all of their purchase data into one big pool and then some hoity-toity manager decided what any given branch would order.
(Borders UK had additional issues, such as an inventory system that required the use of custom bar codes instead of the pre-printed ones, but this central management of stock, if I remember correctly, was also a factor in Waterstones when it was owned by HMV.)
The people on the ground have a much better sense of what works for their given store. Just because your bookstore is at a train station, that doesn’t mean it should order the same books as another bookstore at another train station.
It depends on what sort of customers come in on a regular basis and what neighbourhoods or employers are in proximity to the station.
One of the first things that James Daunt, who now runs Waterstones, did once he took on management of the chain was to make sure that most inventory decisions for a bookstore were made in that store.
That meant that they would be able to focus on their actual clientele, and could respond to changes more quickly.
From the perspective of your average business organization, the logical next step after that is that non-essential data – all the data that doesn’t have to be held centrally – is stored only at the edges, in the branches or offices where it’s needed.
You’re not gonna be able to apply this idea to every single piece of data you store, but it applies to more than you think.
The software equivalent
If you’re making software, the equivalent would be to store things in the client, whether it’s in the browser, or in the app, or in the user’s file system.
Anything that the end user stores and controls locally is something that they can act upon quicker and more reliably, and it makes you as an organisation less of a target.
It’s much harder for an attacker to go through all of the edge locations and gather the data one by one.
Anything that slows them down gives you time.
Anything that gives you time gives you more opportunity to notice and react, makes you more able to do the right thing and protect your own interest.
Because this is about pure pragmatism.
There are liability issues for a European company if an outside entity accesses private data. It doesn’t matter if it’s the US government or some fascist-leaning employee who leaks whatever to whichever ultra-right-wing website is in vogue at the time.
Make it “shreddable”
If you have to store data, make sure you can actually delete it.
Or, technically, you don’t need to delete it completely, because that can be difficult when you’re using distributed storage.
But if you encrypt a data item and store the encryption key separately for each user, then deleting the key means you’ve effectively shredded the entire data set for that user.
Making sure that the data is impossible to access is almost as good as actually deleting it, because it can come down to effectively the same thing.
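This pattern is often called crypto-shredding. The sketch below uses a toy XOR keystream built from SHA-256 so it stays self-contained – a real system would use an authenticated cipher such as AES-GCM – but the shape is the same: one key per user, kept separate from the data, and deleting the key shreds the data:

```python
# Crypto-shredding sketch. The keystream cipher is a TOY stand-in;
# use a real authenticated cipher (e.g. AES-GCM) in production.
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256-derived keystream (same op encrypts and decrypts)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

# One key per user, held in a key store separate from the data store.
user_keys = {"alice": secrets.token_bytes(32)}

# Only ciphertext ever reaches durable (possibly distributed) storage.
stored = keystream_xor(user_keys["alice"], b"alice's records")

# "Shredding" alice: delete one small key, and the ciphertext,
# wherever it has been replicated, becomes unreadable noise.
del user_keys["alice"]
```

The win is operational: deleting one 32-byte key is easy and instant, while hunting down every replica of the data itself often isn’t.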
The weakness of Nazis
But the final lesson is what caused the downfall of the Nazi organization in the movie Notorious.
One of the recurring themes for authoritarians, both in media and in real life, is the need to command and control – the rule through fear and discipline, and the treatment of people as if they were cogs in a machine.
It doesn’t matter whether it’s Star Wars or Notorious, the standard tactic for authoritarians is to punish those who show weakness.
This is surprisingly common in modern businesses, despite numerous studies showing that the approach is self-destructive.
When a substantial percentage of a company can expect to get fired every year, and at the same time you have performance reviews and management reports, anybody who shows any kind of weakness or mistake or flaw can expect to be at risk of losing their job.
This happens in Notorious, albeit in a more extreme way. The Nazis kill anybody who shows any weakness.
That meant that when the Claude Rains character started to suspect the Ingrid Bergman character, his wife who had infiltrated his organization, he didn’t report it.
Even though that’s what you should do whenever you suspect your organization or your business has been infiltrated – whether it’s because you accidentally clicked a link in an email and your computer started to behave weirdly, or because the new employee is doing weird things – you’re supposed to report it, because that’s the only way an organization can prevent bad things from happening.
But if you punish people who report, even indirectly, that means that everybody’s first instinct will be the same as that of the Claude Rains character in Notorious: he prevented the infiltration from being discovered and tried to handle it himself.
Instead of immediately shutting down the security risk – which would have given the Nazi saboteurs the opportunity to salvage their operations and would basically have ensured their victory – he ensured their doom because he was afraid.
His fear of discovery, of being purged for having made an understandable human error, meant that the good side won.
This is what authoritarians never understand, never grasp: this iron-fist control over the organization, this rule through fear and the weeding out of weakness, actually makes an organization weak.
Robust organizations understand that people make mistakes, that people vary and that it’s the system of the organization that is the actual decider of the overall productivity or effectiveness of the organization.
The organization that understands that people make mistakes works to make sure it’s a safe enough environment for them to let you know that something has gone wrong.
That’s the organization that lasts, that tolerates crises and survives them, even thrives on change.
The authoritarian organizations that weed out weakness create weakness in themselves because you’re always going to need people and people will always make human errors.
You need to create an environment that benefits from human variation, from people noticing the odd and the strange and the weird.
You want people to be human and let you know when they see something.
But that’s not what authoritarians want or managers with an authoritarian bent.
That’s why they lose in the long run.
That’s why the rest of us will win eventually.