Let us be slightly more precise about the problem we have to solve: the parties, or players, that participate are called P_1, ..., P_n. Each player P_i holds a secret input x_i, and the players agree on some function f that takes n inputs. Their goal is to compute y = f(x_1, ..., x_n) while making sure that the following two conditions are satisfied:
- Correctness: the correct value of y is computed.
- Privacy: y is the only new information that is released.
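To make the target concrete, here is a minimal Python sketch of the ideal scenario the two conditions describe: a hypothetical trusted party collects every player's secret input, evaluates the agreed function f, and releases only y. Secure computation aims to give the same guarantees without such a trusted party. The function and the concrete values below are illustrative, not taken from the text.

```python
# Ideal-world sketch: a hypothetical trusted party receives the secret
# inputs x_1, ..., x_n, computes y = f(x_1, ..., x_n), and releases only y.

def trusted_party(f, inputs):
    """Collect all secret inputs and release only the output y."""
    y = f(*inputs)
    return y  # correctness: y is the right value; privacy: only y leaks

# Example: three players jointly compute the sum of their secret inputs.
secret_inputs = [12, 7, 23]
y = trusted_party(lambda *xs: sum(xs), secret_inputs)
# y == 42; each player learns nothing about the others' inputs beyond y
```

Of course, in the real problem no such trusted party exists; the protocols discussed later must emulate it among the players themselves.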
Regarding the latter property, note that since the purpose of the whole exercise is that we want to learn y, the best we can hope for in terms of privacy is that nothing but y is leaked. In the cryptographic literature this has simply become known as privacy. To avoid confusion, notice that this notion of privacy is different from the one put forward in, for instance, the GDPR, where privacy is about protecting personally identifiable information of the individual. If the function f does data mining and as part of the intended result leaks sensitive information about an individual, but the protocol guarantees that only the intended result leaks, then two things are clear:
- Since the information of an individual is leaked via the intended output of the computation, this individual clearly suffered a loss of privacy. So the protocol is not GDPR-private.
- Cryptographically speaking, we would still say that the protocol has privacy, as it only leaks the intended outcome, and this is the definition of being cryptographically private.
This is an unfortunate overloading of terms in the age of the GDPR. Since privacy of the individual clearly overrules the nomenclature of any particular technical field, it must be GDPR-privacy that can claim the rightful use of the term privacy. Nonetheless, I will in the rest of this description use privacy in the cryptographic sense of the word, and I ask the reader to keep this in mind. So from now on, and until you leave this web page, privacy means: the protocol only leaks the intended output. I will in later texts address privacy in the sense of the individual, when I discuss how to combine SMPC and differential privacy.
Computing f such that privacy and correctness are achieved is referred to as computing f securely. Note also that one may consider a more general case where each player gets her own private output. We focus on a single, public output for simplicity. To see that this is without loss of generality, notice that each party might input to the function her own secret key, and the public output could then contain an encryption of each party's private output under her secret key.
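The reduction from private outputs to a single public output can be sketched as follows. This is an illustrative toy, assuming a one-time-pad encryption over 32-bit integers; the function names and the choice of private outputs (each player's distance to the maximum input) are mine, not from the text.

```python
# Sketch: each player P_i additionally inputs a fresh random key k_i, and the
# public output contains P_i's private output y_i one-time-padded under k_i,
# so only P_i can recover it.
import secrets

def f_with_private_outputs(*xs):
    # Illustrative private outputs: each player's distance to the maximum.
    m = max(xs)
    return [m - x for x in xs]

def f_public(inputs, keys):
    """Single public-output function: encrypt each private output under
    the owning player's key (XOR one-time pad)."""
    private_outputs = f_with_private_outputs(*inputs)
    return [y_i ^ k_i for y_i, k_i in zip(private_outputs, keys)]

inputs = [12, 7, 23]
keys = [secrets.randbits(32) for _ in inputs]   # each P_i keeps k_i secret
public_output = f_public(inputs, keys)          # everyone sees this list
recovered = [c ^ k for c, k in zip(public_output, keys)]  # P_i decrypts slot i
# recovered == [11, 16, 0]
```

To outsiders (and to the other players) each ciphertext is uniformly random, so the public output reveals nothing about the private outputs beyond what the players already learn from their own slots.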
As one example of how this connects to the scenarios from the previous section, one may think of x_i as a number, namely P_i's bid in an auction, and of f(x_1, ..., x_n) = (y_1, y_2) where y_1 = max(x_1, ..., x_n) and y_2 is an index i with x_i = y_1, i.e., f outputs the highest bid and the identity of the corresponding bidder. If we do not want the winner to pay her own bid, but the bid of the second-highest bidder, we simply change y_1 to be this value, which is again a well-defined function of the inputs. This would give us a function implementing a so-called second-price auction.
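The two auction functions can be written out directly; the function names below are mine, and ties are broken by taking the lowest winning index.

```python
# First-price auction: output the highest bid and the winning bidder's index.
def f_first_price(*bids):
    winning_bid = max(bids)
    winner = bids.index(winning_bid)  # identity of the corresponding bidder
    return winning_bid, winner

# Second-price auction: same winner, but the price is the second-highest bid,
# which is again a well-defined function of the inputs.
def f_second_price(*bids):
    _, winner = f_first_price(*bids)
    price = sorted(bids)[-2]
    return price, winner

# Example: players P_0, P_1, P_2 bid 12, 30, 25.
assert f_first_price(12, 30, 25) == (30, 1)   # P_1 wins and would pay 30
assert f_second_price(12, 30, 25) == (25, 1)  # P_1 wins but pays only 25
```

The point is that "second-price" changes only the function f being computed; the secure-computation problem itself stays exactly the same.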
In the following texts, we give a first intuition on how one might compute a function f securely without relying on trusted parties. This requires that we specify a protocol, i.e., a set of instructions that the players are supposed to follow to obtain the desired result. For simplicity, we will assume for now that players always follow the protocol. We will later address the case where some parties may deviate from the protocol, either to get more information than they are supposed to or to cause the result to be incorrect. We will also assume that any pair of players can communicate securely, i.e., it is possible for P_i to send a message m to P_j such that no third party sees m, and P_j knows that m came from P_i. Setting up such an authenticated and private channel can be done using other parts of cryptography that I will not discuss here.