The Perils of the Security Mindset Taken Too Far

post by whpearson · 2017-12-13T22:34:08.173Z · LW · GW · 6 comments

Epistemic status: A few initial thoughts.

For your project to be secure, no one should know of its existence. Better still, your project should show all the outward signs of being one project while actually being another.

To be secure, your inner workings should be opaque, so that you become less predictable. But this also means people trust you less. In Newcomb's problem, one of the common strategies people come up with to trick Omega is to use quantum sources of randomness to become less predictable. The common counter to this is that Omega only fills both boxes in the case that it can predict you.
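
To make that counter concrete, here is a toy payoff sketch (the dollar amounts are the standard ones from the thought experiment; the function and parameter names are hypothetical, purely for illustration):

```python
# Toy sketch of the standard counterargument: Omega fills the opaque box
# only when it can predict you, so becoming unpredictable forfeits the $1M.
def opaque_box_contents(predictable: bool, predicted_to_one_box: bool) -> int:
    if predictable and predicted_to_one_box:
        return 1_000_000
    return 0  # unpredictable agents get treated like two-boxers

TRANSPARENT_BOX = 1_000

# A quantum-randomising agent is unpredictable, so the best it can do is:
payout = opaque_box_contents(predictable=False, predicted_to_one_box=True)
print(payout + TRANSPARENT_BOX)  # -> 1000, not 1001000
```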

If you are opaque, no one knows who you associate with. Even people who trust you might not trust that you avoid associating with people they distrust.

If you are opaque, your security model is not known. Even if people trust you not to leak things on purpose, they cannot be confident you will not leak them accidentally.

You trust people less than is optimal: your decisions about whom to trust produce false negatives.

You degrade the rationality of other people. At least two things limit how rational people can be: a lack of brain power and a lack of information. Hiding information means people can be a lot less rational or effective. This cost is borne by everyone but you, so you might not be accounting for it.

Hiding your strategic views hides your flaws. No one can tell whether you are being too paranoid, because you hide your threat model.

Brain power is hoovered up trying to model other people modelling you, to make sure that you don't tip your hand.

If you can possibly avoid taking the security mindset this far, do so. Do all you can in the freedom of openness. Secrecy can also be a mindkiller.

6 comments


comment by Dagon · 2017-12-13T23:33:27.485Z · LW(p) · GW(p)

Secrecy != security. You're far more secure by being transparent, open, and immune to attack. The best way for your project to be secure is to have a solid business idea and excellent implementors, with no secrecy at all - tell everyone and recruit the best to your side.

The best way to "beat" Omega is to be so wealthy that you only care about the box contents for the game's amusement potential.

I think this post is based on a misunderstanding of "security mindset". When I've heard it used, it usually assumes a lack of secrecy and focuses on finding provable defenses against large classes of attack.

Replies from: whpearson
comment by whpearson · 2017-12-14T19:58:09.814Z · LW(p) · GW(p)

You always need a modicum of secrecy to be secure (private keys, passwords, etc.). Secrecy can also help security a lot. For example, whoever Satoshi Nakamoto is helped their security a lot by using a pseudonym and covering their traces pretty well (if they are an individual), so they don't have to worry about being kidnapped and forced to hand over their bitcoin.
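
As a minimal sketch of that "modicum of secrecy" (using the third-party Python cryptography package, assuming it is installed): the entire signing scheme can be public knowledge, and the private key is the only thing that has to stay secret.

```python
# Minimal sketch of Kerckhoffs's principle: the signing algorithm is
# completely public; the only secret is the private key itself.
from cryptography.hazmat.primitives.asymmetric import ed25519

private_key = ed25519.Ed25519PrivateKey.generate()  # the one thing kept secret
public_key = private_key.public_key()               # safe to publish widely

message = b"the design of this system can be fully open"
signature = private_key.sign(message)

# Anyone holding only the public key can verify; this raises
# InvalidSignature if the message or signature has been tampered with.
public_key.verify(signature, message)
```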

Security people often think about secrecy when they want attempted attacks to be visible (because you can't protect against zero-days). For example, you might not want the rules of your web application firewall to be known, so that people can't set up duplicate infrastructure and quietly probe it for holes.
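
A toy sketch of that idea (the rules and names here are hypothetical, purely for illustration): if the patterns below were public, an attacker could refine payloads against an offline copy; keeping them secret forces every probe to hit the live system, where it can be logged.

```python
import re

# Hypothetical, deliberately undisclosed WAF rules. If these patterns
# leaked, an attacker could tune payloads against an offline copy instead
# of generating visible, loggable probes against the live firewall.
SECRET_RULES = [
    re.compile(r"(?i)union\s+select"),  # crude SQL-injection signature
    re.compile(r"(?i)<script\b"),       # crude XSS signature
]

def is_blocked(request_body: str) -> bool:
    """Return True if the request matches any secret rule."""
    return any(rule.search(request_body) for rule in SECRET_RULES)

# Each failed probe against the live system leaves a trace:
if is_blocked("id=1 UNION SELECT password FROM users"):
    print("blocked and logged - the attacker's probe is visible")
```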

Secrecy becomes worse when you start worrying about securing yourself against insider threats and so on.

The security mindset is:

the security mindset involves thinking about how things can be made to fail. It involves thinking like an attacker, an adversary or a criminal.

If you want to really use it, you cannot stop at the computer, for adversaries and attackers do not. You have to look at personnel and their backgrounds. Can you trust Y with information X (be it encryption keys or source code), for whatever you are trying to do?

Replies from: ChristianKl
comment by ChristianKl · 2017-12-16T14:27:12.635Z · LW(p) · GW(p)

How do you know that Satoshi Nakamoto is secure? For all we know there's a good chance that he's dead.

It's not that easy for the NSA to make a famous hacker disappear, but on the other hand there's no pushback when they kidnap someone anonymous like Satoshi Nakamoto.

comment by ChristianKl · 2017-12-16T14:53:18.134Z · LW(p) · GW(p)

To me the post feels very speculative and is far removed from the practical concerns that come with the secrecy mindset.

Planning on running a big project without anyone knowing is a bad plan. There's a reason why open-source code is valued in the security community.

Replies from: whpearson
comment by whpearson · 2017-12-16T23:09:06.436Z · LW(p) · GW(p)

Like I said, most of the examples were of taking the mindset too far. Sometimes it is appropriate to go as far as I described; it depends on the stakes and the actors at play. For example, to protect the secret that Enigma had been broken, the Allies did the following:

To disguise the source of the intelligence for the Allied attacks on Axis supply ships bound for North Africa, "spotter" submarines and aircraft were sent to search for Axis ships. These searchers or their radio transmissions were observed by the Axis forces, who concluded their ships were being found by conventional reconnaissance. They suspected that there were some 400 Allied submarines in the Mediterranean and a huge fleet of reconnaissance aircraft on Malta. In fact, there were only 25 submarines and at times as few as three aircraft.[24]
This procedure also helped conceal the intelligence source from Allied personnel, who might give away the secret by careless talk, or under interrogation if captured. Along with the search mission that would find the Axis ships, two or three additional search missions would be sent out to other areas, so that crews would not begin to wonder why a single mission found the Axis ships every time.
Other deceptive means were used. On one occasion, a convoy of five ships sailed from Naples to North Africa with essential supplies at a critical moment in the North African fighting. There was no time to have the ships properly spotted beforehand. The decision to attack solely on Ultra intelligence went directly to Churchill. The ships were all sunk by an attack "out of the blue", arousing German suspicions of a security breach. To distract the Germans from the idea of a signals breach (such as Ultra), the Allies sent a radio message to a fictitious spy in Naples, congratulating him for this success. According to some sources the Germans decrypted this message and believed it.[76]
In the Battle of the Atlantic, the precautions were taken to the extreme. In most cases where the Allies knew from intercepts the location of a U-boat in mid-Atlantic, the U-boat was not attacked immediately, until a "cover story" could be arranged. For example, a search plane might be "fortunate enough" to sight the U-boat, thus explaining the Allied attack.

For your second point: this post wasn't arguing that a random person should try to run projects secretly, just that if you are trying to hide the progress or contents of your project from highly motivated state-level actors, only partially hiding it seems dumb.

Replies from: ChristianKl
comment by ChristianKl · 2017-12-17T14:29:55.460Z · LW(p) · GW(p)
only partially hiding your project seems dumb.

That sounds to me like you don't know anyone with a security mindset who has an interest in hiding certain parts of their projects from state-level actors, and that this is a completely theoretical exercise for you.