
Cryptographic Algorithms and Protocols, Part 2: Key Exchange Protocols


The DL, DH, and DDH problems are random self-reducible. This is immediate for the DL problem: given any instance y = g^x,
(i) transform y to y' = y g^r with r ∈R Z_n, (ii) solve the random instance y' for x' = log_g y', and (iii) extract the solution as x = log_g y = x' - r mod n.
Note that ord(y) | n.
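The three steps above can be sketched in code. The following is a toy demonstration, using the order-11 subgroup of Z_23^* generated by g = 2 (parameters chosen here purely for illustration), with a brute-force discrete-log routine standing in for an "average-case" solver:

```python
# Random self-reduction for discrete log in a toy group:
# order-11 subgroup of Z_23^*, generated by g = 2.
import random

p, n, g = 23, 11, 2          # 2 has order 11 modulo 23

def dlog(y):
    """Brute-force discrete log; stands in for an average-case solver."""
    acc = 1
    for x in range(n):
        if acc == y:
            return x
        acc = acc * g % p
    raise ValueError("y is not in <g>")

def dlog_via_random_instance(y):
    r = random.randrange(n)          # (i) randomize: y' = y * g^r,
    y_rand = y * pow(g, r, p) % p    #     so y' is uniform in <g>
    x_rand = dlog(y_rand)            # (ii) solve the random instance
    return (x_rand - r) % n          # (iii) extract x = x' - r mod n

y = pow(g, 7, p)
assert dlog_via_random_instance(y) == 7
```

Note that the instance y' handed to the solver is uniformly distributed in <g> regardless of y, which is exactly what makes the worst-case-to-average-case argument below go through.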
2016/1/8
2
Mathematical Preliminaries
Probability


Throughout, we will use basic notions from probability theory, such as sample spaces, events, probability distributions, and random variables. We will only be concerned with discrete random variables. Specifically, we use the following notion of statistical distance. Definition: the statistical distance between discrete random variables X and Y is Δ(X, Y) = (1/2) Σ_a |Pr[X = a] - Pr[Y = a]|, where a ranges over the union of the supports of X and Y.
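The definition translates directly into code. A minimal sketch, representing distributions as dictionaries from outcomes to probabilities (an illustrative encoding, not from the text):

```python
# Statistical distance Delta(X, Y) = 1/2 * sum_a |Pr[X=a] - Pr[Y=a]|
# between two discrete distributions given as outcome -> probability dicts.

def stat_dist(P, Q):
    support = set(P) | set(Q)        # union of the two supports
    return 0.5 * sum(abs(P.get(a, 0.0) - Q.get(a, 0.0)) for a in support)

uniform = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}
biased  = {0: 0.40, 1: 0.30, 2: 0.20, 3: 0.10}

assert stat_dist(uniform, uniform) == 0.0
# 0.5 * (0.15 + 0.05 + 0.05 + 0.15) = 0.2
assert abs(stat_dist(uniform, biased) - 0.2) < 1e-9
```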
Chapter 2.
Key Exchange Protocols
Mathematical Preliminaries
Groups




Throughout, Gn denotes a cyclic group of finite order n, written multiplicatively. Usually, but not necessarily, the group order n is prime. Recall that any group of prime order is cyclic, and that any finite cyclic group is abelian. Also, Gn = <g>, where g is any generator (i.e., an element of order n) of Gn; thus, the elements of Gn are enumerated as 1, g, g^2, g^3, . . . , g^(n-1). Often, we write G as a shorthand for Gn. The discrete log of an element y ∈ G is defined as the least nonnegative integer x satisfying y = g^x. We write x = log_g y. For y ∈ G, we let ord(y) denote its order.
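These definitions can be made concrete on a toy group. The sketch below (illustrative parameters, again the order-11 subgroup of Z_23^*) enumerates the elements 1, g, g^2, . . . and computes discrete logs and element orders exactly as defined above:

```python
# Enumerate a toy cyclic group Gn = <g>, realized as the order-11
# subgroup of Z_23^*, and compute discrete logs and element orders.

p, g = 23, 2
elements, acc = [], 1
while True:                 # enumerate 1, g, g^2, ... until we cycle
    elements.append(acc)
    acc = acc * g % p
    if acc == 1:
        break
n = len(elements)           # the group order

def log_g(y):
    """Least nonnegative x with g^x = y."""
    return elements.index(y)

def order(y):
    """ord(y): least k > 0 with y^k = 1."""
    k, acc = 1, y
    while acc != 1:
        acc = acc * y % p
        k += 1
    return k

assert n == 11
assert log_g(pow(g, 5, p)) == 5
assert n % order(pow(g, 3, p)) == 0   # ord(y) divides n (Lagrange)
```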
Probabilistic Polynomial Time (p.p.t.) Algorithm
A (deterministic) algorithm is a well-defined computational procedure that takes a variable input and halts with an output. A probabilistic algorithm is a computational procedure that may fail to produce an output, but whose probability of failure can be made adequately small. The size of the input is the total number of bits needed to represent the input in ordinary binary notation using an appropriate encoding scheme. A polynomial-time algorithm is an algorithm whose worst-case running time is of the form O(n^k), where n is the input size and k is a constant. A probabilistic polynomial-time (p.p.t.) algorithm is defined similarly.
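As an illustration of a probabilistic algorithm with controllable failure probability (an example of our own, not from the text): finding a quadratic non-residue modulo an odd prime by random sampling. Each trial succeeds with probability 1/2, so after k independent trials the failure probability is at most 2^-k, which can be made adequately small by choosing k.

```python
# Probabilistic algorithm: find a quadratic non-residue mod an odd prime p.
# Each random trial succeeds with probability 1/2, so k trials fail
# with probability at most 2^-k.
import random

def find_non_residue(p, k=40):
    for _ in range(k):
        a = random.randrange(2, p)
        # Euler's criterion: a is a non-residue iff a^((p-1)/2) = -1 mod p
        if pow(a, (p - 1) // 2, p) == p - 1:
            return a
    return None   # failure; happens with probability <= 2**-k

p = 1009                      # a small prime, for illustration
a = find_non_residue(p)
assert a is not None
assert pow(a, (p - 1) // 2, p) == p - 1
```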
Polynomially Indistinguishable Distributions
Here, "hard" means that any probabilistic polynomial-time (p.p.t.) algorithm only succeeds with negligible probability in computing the correct output.
Evidently, these assumptions satisfy: DDH ⇒ DH ⇒ DL; that is, the DDH assumption is the strongest and the DL assumption the weakest. Therefore it is preferable if a protocol can be proved secure under just the DL assumption. It turns out, however, that in many cases security can only be proved under the DH assumption, or even only under the DDH assumption.
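One direction of these implications can be sketched concretely: any solver for the DH problem immediately yields a DDH distinguisher, by computing g^xy and comparing it with the third element. In the toy code below (illustrative parameters; the "DH solver" is a brute-force stand-in over the order-11 subgroup of Z_23^*):

```python
# A DH solver yields a DDH distinguisher: compute g^xy and compare.
import random

p, n, g = 23, 11, 2   # order-11 subgroup of Z_23^*

def dh_solver(gx, gy):
    """Toy DH solver: recover x by brute force, then output g^(xy)."""
    x = next(i for i in range(n) if pow(g, i, p) == gx)
    return pow(gy, x, p)

def ddh_distinguisher(gx, gy, gz):
    return gz == dh_solver(gx, gy)   # True iff z = xy mod n

x, y = random.randrange(n), random.randrange(n)
assert ddh_distinguisher(pow(g, x, p), pow(g, y, p), pow(g, x * y % n, p))
```

Hence if DH is easy, DDH is easy too; contrapositively, the DDH assumption implies the DH assumption. The implication DH ⇒ DL follows by the same pattern, since a DL solver yields a DH solver.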
Random Self-Reducibility
Definition 1.13 A problem is called random self-reducible if any instance I of the problem can be solved by:


(i) transforming instance I into one or more (uncorrelated) uniformly random instances I', (ii) solving the instances I', (iii) extracting the answer for I from the answers for I'. Steps (i) and (iii) are required to run in polynomial time.

Mathematical Preliminaries
The statistical distance is a metric in the following sense: for all random variables X, Y, Z, (i) Δ(X, Y) ≥ 0, with equality if and only if X and Y are identically distributed, (ii) Δ(X, Y) = Δ(Y, X), and (iii) Δ(X, Z) ≤ Δ(X, Y) + Δ(Y, Z).
Discrete Log and Diffie-Hellman Assumptions
The following three assumptions are commonly used in cryptographic schemes based on a discrete log setting. Assumption 2.1 The Discrete Logarithm (DL) assumption for group G states that it is hard to compute x given generator g and random group element g^x. Assumption 2.2 The (Computational) Diffie-Hellman (DH) assumption for group G states that it is hard to compute g^xy given generator g and random group elements g^x, g^y. Assumption 2.3 The Decision Diffie-Hellman (DDH) assumption for group G states that it is hard to decide whether z = xy mod n given generator g and random group elements g^x, g^y, g^z.
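These assumptions connect directly to the chapter topic. A minimal Diffie-Hellman key exchange over a toy group (parameters illustrative, far too small for real use): the shared key g^xy is exactly the value that, under the DH assumption, an eavesdropper who sees only g^x and g^y cannot compute.

```python
# Minimal Diffie-Hellman key exchange over the order-11 subgroup of Z_23^*.
import random

p, n, g = 23, 11, 2

x = random.randrange(1, n)   # Alice's secret exponent
y = random.randrange(1, n)   # Bob's secret exponent
A = pow(g, x, p)             # Alice sends g^x
B = pow(g, y, p)             # Bob sends g^y

key_alice = pow(B, x, p)     # (g^y)^x = g^(xy)
key_bob   = pow(A, y, p)     # (g^x)^y = g^(xy)
assert key_alice == key_bob  # both parties derive the same key g^(xy)
```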
Random Self-Reducibility
For cryptographic purposes, it is a good sign if a presumably hard problem is random self-reducible. In that case it is ruled out that the problem, although hard in the worst case, is actually easy on average. To see why, consider a random self-reducible problem and suppose that it is easy on average. Then there cannot be any hard instances at all, since any such instance can be solved by solving an associated random instance, due to random self-reducibility. Proposition: Any random self-reducible problem that is hard in the worst case is also hard on average.