# What is Cryptographically Possible

post by paulfchristiano · 2010-12-24T04:58:37.892Z · LW · GW · Legacy · 19 comments

Modern computational cryptography probes what is possible in our universe, and I think the results of this exploration may be interesting to people who wouldn't normally be exposed to it.

All of our algorithms will have a "security parameter" *k*. Our goal is to make it so that an honest participant in the algorithm needs to spend only about *k* time for each operation, while anyone trying to break the scheme needs to use a super-polynomial amount of time in *k*. These definitions are specifically engineered so that they don't really depend on the type of computer being used.

When I say that a participant has a function in mind to evaluate, we are going to imagine that this function is described by its code in some programming language. It doesn't matter which language if you are willing to accept constant factor slowdowns; you can translate from any reasonable programming language into any other.

Now onto the results, stated very imprecisely and given roughly in increasing order of surprisingness. I have chosen a really random selection based on my own interests, so don't think this is an exhaustive list.

**One-Way Function (OWF):** A function f is one-way if given x it is easy to compute f(x), but given f(x) it is hard to find either x or any other x' such that f(x') = f(x). For example, if I give you randomly chosen k-bit integers x, y, it is easy to compute their product x*y. But if I give you only their product x*y, it is hard to recover x and y (or another pair of k-bit integers with the same product). We have many other explicit candidates for one-way functions, some of which are believed to be secure against quantum adversaries. Note that basically every other result here implies OWF, and OWF implies P != NP (so for the foreseeable future we are going to have to make assumptions to do computational cryptography).
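The forward direction of the candidate above is a single multiplication. A minimal sketch (my illustration, not from the post; as discussed in the comments, inversion is only believed hard when x and y are prime):

```python
import random

def candidate_owf(k, rng):
    """Multiply two random k-bit integers: computing the product is easy,
    while recovering a k-bit factorization is believed hard (for primes)."""
    x = rng.getrandbits(k) | (1 << (k - 1)) | 1  # force exactly k bits, odd
    y = rng.getrandbits(k) | (1 << (k - 1)) | 1
    return x, y, x * y

x, y, n = candidate_owf(512, random.Random(0))
assert n == x * y              # forward direction: one multiplication
assert n.bit_length() >= 1023  # inverting means factoring a ~1024-bit number
```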

**Pseudorandom Generator (PRG):** Suppose I want to run a randomized algorithm that requires 1000000 bits of randomness, but I only have 1000 bits of randomness. A pseudorandom generator allows me to turn my 1000 bits of randomness into 1000000 bits of pseudorandomness, and guarantees that any efficient randomized algorithm works just as often with a pseudorandom input as with a truly random input. More formally, a pseudorandom generator takes k random bits to k+1 pseudorandom bits in such a way that it is very difficult to distinguish its output from random. A PRG exists iff a OWF exists.
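A heuristic sketch of the stretching step (my illustration, not a provably secure construction: it simply treats SHA-256 in counter mode as if its output were random):

```python
import hashlib

def stretch(seed: bytes, nbits: int) -> str:
    """Heuristic PRG sketch: expand a short seed into nbits pseudorandom
    bits by hashing seed || counter. Illustrative only."""
    out = []
    counter = 0
    while 8 * 32 * len(out) < nbits:
        block = hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        out.append("".join(f"{b:08b}" for b in block))
        counter += 1
    return "".join(out)[:nbits]

bits = stretch(b"\x01" * 125, 1_000_000)  # 1000 seed bits -> 1,000,000 bits
assert len(bits) == 1_000_000 and set(bits) <= {"0", "1"}
```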

**Private Key Cryptography:** Suppose that Alice and Bob share a secret of length k, and would like to send a message so that an eavesdropper who doesn't know the secret can't understand it. If they want to send a message of length at most k, they can use a one-time pad. Private key encryption allows them to send a much longer message in a way that is indecipherable to someone who doesn't know the secret. Private key cryptography is possible if a OWF exists.
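One way to see how a PRG enables this: stretch the short shared key into a long keystream and XOR it with the message, as in a stream cipher. A sketch (my illustration, using a heuristic hash-based keystream, not a proven construction):

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Heuristically expand a short shared key into n pseudorandom bytes."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR the data with the keystream; the same call encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = b"short shared secret"
msg = b"a message far longer than the shared key " * 50
ct = xor_cipher(key, msg)
assert ct != msg
assert xor_cipher(key, ct) == msg  # decryption recovers the message
```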

**Pseudorandom Function Family (PRF):** A pseudorandom function family is a small family of functions (one for each k-bit string) such that a black box for a randomly chosen function from the family looks exactly like a black box which chooses a random output independently for each input. A PRF exists iff a OWF exists.
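In practice, keyed hash constructions like HMAC are the standard heuristic stand-in for a PRF (my example, not from the post): the key selects a function from the family, and each input maps to an output that looks independently random.

```python
import hashlib
import hmac

def prf(key: bytes, x: bytes) -> bytes:
    """One function per key; HMAC-SHA256 as a heuristic PRF sketch."""
    return hmac.new(key, x, hashlib.sha256).digest()

k1, k2 = b"key one.........", b"key two........."
assert prf(k1, b"input") == prf(k1, b"input")  # deterministic for a fixed key
assert prf(k1, b"input") != prf(k2, b"input")  # different keys look independent
assert prf(k1, b"input") != prf(k1, b"other")  # different inputs look independent
```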

**Bit Commitment:** If Alice and Bob meet in person, then Alice can put a message into an envelope and leave this envelope in plain view. Bob can't see the message, but if at some later time the envelope is opened Bob can guarantee that Alice wasn't able to change what was in the envelope. Bit commitment allows Alice and Bob to do the same thing when they only share a communication channel. So for example, Alice could take a proof of the Riemann Hypothesis and commit to it. If someone else were to later give a proof of the Riemann Hypothesis, she could "open" her commitment and reveal that she had a proof first. If she doesn't ever choose to open her commitment, then no one ever learns anything about her proof. Bit commitment is possible if a OWF exists.
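A minimal hash-based commitment sketch (my illustration, a standard construction not specified in the post; hiding and binding here rest on heuristic properties of SHA-256):

```python
import hashlib
import secrets

def commit(message: bytes):
    """Publish H(nonce || message); keep (nonce, message) secret to open later."""
    nonce = secrets.token_bytes(32)
    c = hashlib.sha256(nonce + message).hexdigest()
    return c, (nonce, message)

def open_commitment(c: str, nonce: bytes, message: bytes) -> bool:
    """Anyone can check that the opened message matches the commitment."""
    return hashlib.sha256(nonce + message).hexdigest() == c

c, opening = commit(b"my proof of the Riemann Hypothesis")
assert open_commitment(c, *opening)                       # honest opening verifies
assert not open_commitment(c, opening[0], b"a fake proof")  # changed message fails
```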

**Public Key Cryptography:** Just like private key cryptography, but now Alice and Bob share no secret. They want to communicate in such a way that they both learn a secret which no eavesdropper can efficiently recover. Public key cryptography is not known to be possible if a OWF exists. We have a number of candidate public key cryptography schemes, some of which are secure against quantum adversaries. RSA is used in practice for this functionality.

**Zero-Knowledge Proofs (ZKP):** Suppose I know the solution to some hard problem, whose answer you can verify. I would like to convince you that I really do have a solution, but without giving you any other knowledge about the solution. To do this I can't just give you a static proof; we need to talk for a while. We say that an interactive proof is zero-knowledge if the person verifying the proof can guess how the conversation will go before having it (if he can effectively sample from the distribution over possible transcripts of the conversation), but it will only be possible for the prover to keep up his end of the conversation if he really has a solution. ZKPs exist for NP problems if OWFs exist.
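A classic concrete example (my illustration, not from the post) is a toy version of the Fiat-Shamir identification protocol: the prover knows a square root s of a public value v mod n, and each round she survives a random challenge that a cheater could only answer half the time.

```python
import random

# Toy square-root identification. The prover knows s with v = s^2 mod n.
n = 3233 * 7919          # in practice n would be a large RSA modulus
s = 123456 % n           # the prover's secret
v = (s * s) % n          # the public value

def round_of_proof(rng) -> bool:
    r = rng.randrange(1, n)
    x = (r * r) % n               # prover's commitment
    b = rng.randrange(2)          # verifier's random challenge bit
    y = (r * pow(s, b, n)) % n    # prover's response: r or r*s
    return (y * y) % n == (x * pow(v, b, n)) % n  # verifier's check

rng = random.Random(42)
assert all(round_of_proof(rng) for _ in range(50))  # honest prover always passes
```

A cheating prover passes each round with probability only 1/2, so many rounds make cheating exponentially unlikely; meanwhile accepted transcripts can be sampled without knowing s, which is exactly the zero-knowledge property described above.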

**Non-Interactive Zero-Knowledge Proofs (NIZKP):** Naively, zero-knowledge requires interaction. But we can make it non-interactive if we make use of a common random beacon. So for example, suppose I am going to prove to you that I have a proof of the RH. In order to prove it to you, I say "Consider the sequence of solar flares that were visible last night. Using that as our random data, here is a verification that I really have a proof of the RH." Now we say that a protocol is zero-knowledge if the verifier can construct a non-interactive proof for apparently random strings of their own choice, but the prover can construct a proof for truly random strings if and only if he really has a solution. We have some candidate schemes for NIZKPs, but I know of none which are secure against quantum adversaries.
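One standard heuristic for removing interaction (my illustration, not from the post) is to derive the verifier's challenge bits from a hash of the prover's own commitments, playing the role of the random beacon. A sketch using a toy square-root identification protocol:

```python
import hashlib
import random

# Prover knows s with v = s^2 mod n; challenges come from hashing commitments.
n = 3233 * 7919
s = 123456 % n
v = (s * s) % n
ROUNDS = 64

def challenge_bits(xs):
    """Derive ROUNDS challenge bits from a hash of the commitments."""
    h = hashlib.sha256(repr(xs).encode()).digest()
    return [(h[i // 8] >> (i % 8)) & 1 for i in range(ROUNDS)]

def prove(rng):
    rs = [rng.randrange(1, n) for _ in range(ROUNDS)]
    xs = [(r * r) % n for r in rs]            # commitments, all sent at once
    bs = challenge_bits(xs)                   # "beacon" replaces the verifier
    ys = [(r * pow(s, b, n)) % n for r, b in zip(rs, bs)]
    return xs, ys                             # a single non-interactive message

def check(xs, ys):
    bs = challenge_bits(xs)
    return all((y * y) % n == (x * pow(v, b, n)) % n
               for x, y, b in zip(xs, ys, bs))

assert check(*prove(random.Random(0)))
```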

**Digital Signatures:** Modern law uses signatures extensively. This use is predicated on the assumption that I can obtain Alice's signature of a document if and only if Alice has signed it. I very much doubt this is true of real signatures; a digital signature scheme makes the same guarantee: there is no efficient way to compute Alice's signature of any document without having someone with Alice's private key sign it. Digital signature schemes exist which are secure against classical adversaries if claw-free permutation pairs exist. I don't know the status of digital signatures against quantum adversaries (they exist in the random oracle model, which is generally interpreted as meaning they exist in practice). RSA can also be used for this function in practice.
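Since the post mentions RSA, here is a textbook RSA signing sketch (illustrative only: real schemes add padding such as PSS, and these toy primes offer no security):

```python
import hashlib

p, q = 1000003, 1000033          # toy primes; real keys are far larger
n, phi = p * q, (p - 1) * (q - 1)
e = 65537                        # public exponent
d = pow(e, -1, phi)              # private exponent (Python 3.8+ modular inverse)

def sign(message: bytes) -> int:
    """Only the private-key holder can produce this value."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, sig: int) -> bool:
    """Anyone can check the signature with the public key (n, e)."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(sig, e, n) == digest

sig = sign(b"I, Alice, agree to the contract.")
assert verify(b"I, Alice, agree to the contract.", sig)
assert not verify(b"I, Alice, agree to a different contract.", sig)
```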

**Homomorphic Public-Key Encryption**: Just like public key cryptography, but now given an encryption of a message M I can efficiently compute the encryption of f(M) for any function f which I can compute efficiently on unencrypted inputs. There is a candidate scheme secure against quantum adversaries, but under pretty non-standard assumptions. There are no practical implementations of this scheme, although people who care about practicality are actively working on it.

**Secure Function Evaluation:** Suppose Alice and Bob have their own inputs A and B, and a function f(A, B) they would like to compute. They can do it easily by sharing A, B; in fact they can do it without sharing any information about their private inputs except what is necessarily revealed by f(A, B). If Alice cheats at any point, then the computational effort Bob needs to exert to learn f(A, B) is at most twice the computational effort Alice needs to exert to learn anything at all about B (this is a weaker guarantee than usual in cryptography, but not that bad). There are no practical implementations of this functionality except for very simple functions.

And some random things I find interesting but which are generally considered less important:

**Computationally Secure Arguments:** Suppose that I am trying to prove an assertion to you. You only have a polynomial amount of time. I personally have an exponential amount of time, and I happen to also have a proof which is exponentially long. Conventional wisdom is that there is no possible way for me to prove the statement to you (you don't have time for me to tell you the whole proof; it's really extraordinarily long). However, suppose you know that I only have an exponential amount of time in terms of the message size. There is a proof protocol which isn't perfectly secure, but such that faking a proof requires a super-exponential amount of time (in the random oracle model, which may or may not correspond to a realistic cryptographic assumption in this case).

**Run-Once Programs:** If I give you a program, you can copy it as much as you want and run it as often as you want. But suppose I have a special sort of hardware, which holds two messages but only gives one to the user (once the user asks for either message 0 or message 1, the hardware gives the specified message and then permanently destroys the other message). A run-once version of a program uses such hardware, and can be run once and only once. Run-once programs exist if public key encryption exists.

## 19 comments

Comments sorted by top scores.

## comment by ata · 2010-12-24T02:57:17.422Z · LW(p) · GW(p)

I remember the first time I read *Applied Cryptography* and learned that there were all kinds of simple algorithms and protocols for making numbers do things that I would previously have dismissed as impossible magic. I wonder how much cryptography at that level of surprisingness still has yet to be discovered.

## comment by JoshuaZ · 2010-12-24T15:15:38.899Z · LW(p) · GW(p)

I worry that these summaries, while roughly correct, are vague enough that someone who isn't already familiar with a lot of the results will end up interpreting the statements in ways that are incorrect. There may be an illusion of transparency issue making many of these statements seem to you like they uniquely determine the correct interpretations.

For example you wrote:

For example, if I give you randomly chosen x, y > 1, it is easy to compute their product x*y. But if I give you their product x*y, it is hard to recover x and y.

This isn't quite true. The most nitpicky issue is that in order for this to even make sense, x and y need to be integers. But even given that, it isn't in general possible to recover x and y at all from x*y. The only circumstance where this makes sense is when x and y are prime (otherwise x*y is simply not enough information). If one doesn't restrict to x and y being prime and just asks, given x*y, for a non-trivial factorization, this is easy for most distributions of x and y. (I don't know if this is easy in the sense of polynomial time, but it certainly has close to polynomial time bounds with probability 1 and in practice is generally doable even for very large integers. A random integer n must have a prime factor that is at most about n^(1/log log n); this follows since ω(n), the number of distinct prime factors of n, has both average and normal order log log n. In fact one can, using slightly more sophisticated methods, get better results for how small the smallest prime factor must be.)

Similarly, in your entry about private key cryptography you write:

Suppose that Alice and Bob share a secret of length k, and would like to send a message so that an eavesdropper who doesn't know the secret can't understand it. If they want to send a message of length at most k, they can use a one time pad. Private key encryption allows them to send a much longer message in a way that is indecipherable to someone who doesn't know the secret. Private key cryptography is possible if a OWF exists.

This is misleading in that one-time pads are provably secure independent of whether or not one way functions exist. The issue here, as I understand it, is that private key *exchange* exists if one way function exists. That is, if one way functions exist, we can have the key exchange protocol be completely eavesdropped and still have a shared secret key at the end of the process.

## ↑ comment by paulfchristiano · 2010-12-24T18:45:12.283Z · LW(p) · GW(p)

You are correct that I do need to restrict to integers (which seems a forgivable omission) and that I really do need to make x and y k-bit integers, not just > 1 (which is less forgivable).

The statement about private key encryption is basically correct though. A one-time pad allows you to send a message which is no longer than the pad. But if you want to send a much longer message, as I said, or to send many messages, you can't use a one-time pad. You need to do something different, such as inflate your shared random secret to a much longer pseudorandom secret.

In general, there are probably many slight untruths here. I didn't think that was too horrible since most people's understanding is very bad (and they are pretty slight untruths), but I don't really know.

## ↑ comment by jsalvatier · 2010-12-25T17:39:24.525Z · LW(p) · GW(p)

Good on you for knowing your inferential distances.

## comment by Daniel_Burfoot · 2010-12-25T17:14:17.942Z · LW(p) · GW(p)

Great post, thanks - this is exactly what I read LW to discover.

The term "quantum adversary" came up several times, but doesn't seem to be defined in the main text.

## ↑ comment by paulfchristiano · 2010-12-27T20:12:56.522Z · LW(p) · GW(p)

I simply mean an adversary who uses a quantum computer. Such an adversary can solve harder problems, so they can sometimes break the assumptions underlying a cryptosystem even if it is secure against classical computers.

There is a more interesting problem which arises against quantum adversaries, however. For example, consider a pseudorandom function family. An efficient classical protocol can obviously only query the function at a polynomial number of points: each query takes time. But a quantum protocol can query the function in superposition at exponentially many points at once and then do something with the resulting quantum state. This is really a fundamental difference which is poorly understood.

## comment by CronoDAS · 2010-12-25T17:58:13.869Z · LW(p) · GW(p)

Alice and Bob share a communications channel, but nothing else. Alice and Bob want to play poker, but don't trust each other not to cheat. (Indeed, they will cheat if they can.) Can they use that communications channel to play a fair game of poker?

## ↑ comment by **[deleted]** · 2010-12-26T17:54:57.726Z · LW(p) · GW(p)

The first step is that Alice and Bob randomly pick permutations F and G of 1..52. The deck they'll use will be shuffled by the composition G*F; this means that neither can cheat by picking something less than random -- since it will be composed with the other permutation and randomized anyway -- and neither has any information on the resulting shuffle. Alice and Bob send commitments of their permutations to each other, but keep the permutations secret for now.

If Bob needs to draw the Nth card, he lets Alice know and Alice reveals F(N) to him. Then Bob applies G to this result and gets G(F(N)) -- this lets him know which card is the Nth card. Note that Alice gains no information about the Nth card through this protocol.

Alice's drawing scheme is more complicated. She uses a public-key homomorphic encryption scheme, and sends the key to Bob. When she needs to draw the Nth card, she sends the encryption of F(N) to him. Bob encrypts G, which allows him to compute the encryption of G(F(N)), and sends this back. Alice deciphers this in order to find G(F(N)), without gaining any additional information.

Alice and Bob could have cheated throughout this process. Since by assumption Bob does not know Alice's cards, Alice could have claimed they were whatever she wanted. However, they can check whether cheating occurred after the end of the game by revealing their permutations, which they bit-committed to and cannot later change. Once Alice and Bob both know F and G, they can compute G(F(N)) for any N and verify that all cards drawn were correct.
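A simplified sketch of this protocol (my illustration; it omits the homomorphic-encryption step for Alice's draws and models only Bob's draws plus the end-of-game audit, with hash-based commitments standing in for bit commitment):

```python
import hashlib
import random
import secrets

def commit(perm):
    """Commit to a permutation by hashing it with a random nonce."""
    nonce = secrets.token_bytes(16)
    return hashlib.sha256(nonce + bytes(perm)).hexdigest(), nonce

deck = list(range(52))
F = random.Random(1).sample(deck, 52)   # Alice's secret permutation
G = random.Random(2).sample(deck, 52)   # Bob's secret permutation
cF, nF = commit(F)                      # commitments exchanged before play
cG, nG = commit(G)

# Bob draws the Nth card: Alice reveals F(N); Bob applies his own G.
N = 7
card = G[F[N]]        # Bob learns the card; Alice learns nothing about it

# After the game, both permutations are opened and every draw is audited.
assert hashlib.sha256(nF + bytes(F)).hexdigest() == cF
assert hashlib.sha256(nG + bytes(G)).hexdigest() == cG
assert card == G[F[N]]
```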

## ↑ comment by CronoDAS · 2010-12-26T21:36:50.199Z · LW(p) · GW(p)

Thanks for answering. Yes, that looks like it would work. How hard would it be to actually implement something like this?

## ↑ comment by paulfchristiano · 2010-12-27T20:08:29.167Z · LW(p) · GW(p)

I have played a secure implementation of Poker; I don't remember the link, but someone has written an API for doing this. For my final project in AP computer science in high school, I implemented a card game in a secure way under some non-standard cryptographic assumption.

In fact any number of players can play any game involving cards which can be turned face down, shuffled arbitrarily, revealed to a subset of the players, have only some properties revealed, etc. This involves much more complicated technology, but once you can do secure function evaluation, it shouldn't be too surprising.

## ↑ comment by CronoDAS · 2010-12-28T05:07:53.606Z · LW(p) · GW(p)

The unofficial Magic: the Gathering program "Apprentice" was notoriously vulnerable to cheating. I asked because I was wondering how practical a "cheat-proof" implementation that didn't require a trusted third party server would be.

## ↑ comment by paulfchristiano · 2010-12-28T06:06:05.075Z · LW(p) · GW(p)

I remember dealing with that. A completely cheat-proof implementation would be more technically sophisticated than the rest of Apprentice put together, but you could eliminate deck stacking pretty easily.

## ↑ comment by **[deleted]** · 2010-12-26T22:32:22.182Z · LW(p) · GW(p)

Overall, I'd estimate it would be no harder than carefully implementing any well-known cryptographic protocol.

There are two implementation details I left out above: the bit commitment to the permutation, and the homomorphic encryption. Commitment is easy: it can be done using RSA (you first encrypt the permutation with your public key, padding with extra noise to avoid known-plaintext attacks, and send that; later you reveal the plaintext to open the commitment).

Homomorphic encryption is more complicated, because the algorithms are newer. Something like this should be sufficient, but the security requirements are weird: breaking the encryption is as hard as the approximate GCD problem, which I've never heard of outside the context of this paper. I imagine in the next few years people will be working on finding similar schemes but with more standard security assumptions.

To actually do computations on the homomorphic encryption, we need to describe the operation of the permutation -- that is, given parameters N, G(1), ..., G(52) all in the range 1...52, find G(N) -- as a logic circuit. Fortunately, this isn't too hard. We then write the logic circuit as a polynomial (with AND being multiplication and XOR being addition, so that true and false correspond to 1 and 0 mod 2).
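The gate-to-polynomial encoding can be demonstrated directly (my illustration): over the integers mod 2, AND is multiplication and XOR is addition, and any other gate is built from those two.

```python
# Boolean gates as polynomials over GF(2): a scheme homomorphic for
# addition and multiplication mod 2 can therefore evaluate any circuit.
def AND(a, b): return (a * b) % 2
def XOR(a, b): return (a + b) % 2
def OR(a, b):  return XOR(XOR(a, b), AND(a, b))   # a + b + a*b mod 2

for a in (0, 1):
    for b in (0, 1):
        assert AND(a, b) == (a & b)
        assert XOR(a, b) == (a ^ b)
        assert OR(a, b) == (a | b)
```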

In this case, if the inputs and output are encoded as 6-bit integers, we can write down the circuit as 6 polynomials of degree 7, and all you need is the algorithm on the first page of the paper I linked to (there's a disclaimer about that algorithm being homomorphic for low-degree polynomials only; I'm pretty sure that for any security parameter high enough that you can't brute-force the encryption, 7 counts as "low"). Just plug the encrypted inputs into the polynomial and send the result back to Alice.

I am leaving out a few details here -- for securely making this algorithm public-key, we need to include a bunch of "encryptions of 0" and add them in at various points -- but this is not relevant to the difficulty of the problem.

## comment by shokwave · 2010-12-25T13:57:10.096Z · LW(p) · GW(p)

In some places you say "X is possible iff one-way functions" and in others you say "Y is possible if one-way functions". I presume the first statement means "OWF is necessary and sufficient for X" and the second means "OWF is sufficient but not necessary for Y". Just wanted to double-check that. Bit-commitment, zero-knowledge proofs, and run-once programs are all very interesting ideas that are somewhat novel to me.

## ↑ comment by paulfchristiano · 2010-12-27T20:03:29.045Z · LW(p) · GW(p)

That is what I meant. I wasn't too careful with these implications; where I didn't know the results, I just added the second "f" if it was obvious to a cryptographer.

## comment by Sniffnoy · 2010-12-24T05:22:48.111Z · LW(p) · GW(p)

Can you explain secure function evaluation some more? What is meant by Alice "cheating"?

## ↑ comment by paulfchristiano · 2010-12-24T18:39:06.686Z · LW(p) · GW(p)

Alice and Bob have a function they want to evaluate. For example, maybe they want to determine the maximum of their two numbers, without having the other player reveal what their number is.

To do this, they start a conversation. There are "rules of engagement" they have agreed upon, but at any point Alice may stop adhering to them. For example, she may just stop participating in the protocol. We would like to say that if Alice does this, no one gets to learn the maximum of the two numbers, but it's not quite true (and if you think hard, it can't be quite true). Instead what happens is that, depending on when Alice stops cooperating, it will take her a certain amount of computational effort to learn the maximum of the two numbers (and she will never be able to learn anything more). Bob can also learn the maximum of the two numbers, but it will take him just a little bit more computational effort.