POSEIDON’s goal was to improve security mechanisms and their management. Several types of mechanisms were addressed within POSEIDON, and the purpose of TYREX is to push further some of the research directions concerning one of them, namely Fully Homomorphic Encryption.
Context
Cryptology is the fundamental science of information security. It has many applications in everyday life, for example securing electronic communication when using a credit card, internet banking, online payment, or a wireless network, and it is essential for the privacy, integrity and authenticity of data. Contemporary cryptography constructs and uses cryptographic primitives relying on two classes of presumably hard problems: integer factorisation and the discrete logarithm problem. But since Shor [Shor97] showed in 1994 that a quantum algorithm solves both of these problems efficiently, quantum computers have been an important concern in cryptography. Even though no one is able to build such a computer yet, many well-funded research projects are focusing on doing so, and it is important to find alternative cryptographic designs.
Lattice-based cryptography is one possible alternative: it is a branch of cryptography exploiting the presumed hardness of lattice problems. Its main advantages are its simplicity, efficiency, and apparent security against quantum computers. But perhaps the most appealing aspect is that lattice-based cryptographic protocols often enjoy very strong security proofs based on the hardness of worst-case problems. Moreover, lattice-based cryptography allows the construction of a wide range of primitives, from basic ones (for example: encryption schemes [HPS98,Reg05,GPV08,…], signatures [GGH97,GPV08,…], hash functions [Ajtai96,GGH96,…]) to the most advanced ones (group signatures, broadcast encryption, fully homomorphic encryption [Gen09,BV11,BGV12,Bra12], …).
The security of cryptographic primitives is proven from the hardness of such problems. Typically, the possible attacks against a primitive are modelled by a problem to solve: if an attacker is able to solve this problem, then it is able to break the security of the primitive; conversely, if one can show that this problem cannot be solved in a reasonable time, then the primitive is secure. For each sort of primitive, there exist different levels of security modelled by different problems. To guarantee security, we show that the problem modelling the attack is at least as hard to solve as a hard problem, typically a variant of the Shortest Vector Problem (SVP). This is called a reduction. Note that there also exist lattice-based constructions whose security does not rely on reductions from hard lattice problems. For example, this is the case of the NTRU encryption scheme [HPS98] (the security of one of its variants has recently been proven by [SS13]). In this case, the security is conjectured, based on the study of all the best known attacks (which themselves consist in solving a variant of SVP).
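For reference, the worst-case problems mentioned here and below are stated over a lattice as follows (standard definitions, where λ_i denotes the i-th successive minimum):

\[
\mathcal{L}(\mathbf{B}) = \{ \mathbf{B}\mathbf{x} \;:\; \mathbf{x} \in \mathbb{Z}^{n} \}, \qquad \mathbf{B} \in \mathbb{R}^{m \times n} \ \text{with linearly independent columns.}
\]
\[
\mathrm{SVP}_{\gamma}: \ \text{given } \mathbf{B}, \ \text{find } \mathbf{v} \in \mathcal{L}(\mathbf{B}) \setminus \{\mathbf{0}\} \ \text{such that} \ \|\mathbf{v}\| \le \gamma \cdot \lambda_{1}(\mathcal{L}(\mathbf{B})).
\]
\[
\mathrm{SIVP}_{\gamma}: \ \text{given } \mathbf{B}, \ \text{find linearly independent } \mathbf{v}_{1}, \ldots, \mathbf{v}_{n} \in \mathcal{L}(\mathbf{B}) \ \text{with} \ \max_{i} \|\mathbf{v}_{i}\| \le \gamma \cdot \lambda_{n}(\mathcal{L}(\mathbf{B})).
\]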
To construct these security reductions, two main problems serve as the foundation of numerous lattice-based cryptographic protocols. The first one, introduced by Ajtai in 1996 [Ajtai96], is the Small Integer Solution problem (SIS); it is used for example to construct signature schemes [GPV08,CHKP10,Boyen10] and hash functions [Ajtai96]. The second one, introduced by Regev in 2005 [Reg05], is the Learning With Errors problem (LWE); it is used to construct encryption schemes and variants with advanced functionalities [Reg05,GPV08,BV11,GVW13]. For all those schemes, the security is proven by providing a reduction from SIS or LWE to the problem of mounting a successful attack. The hardness of those two problems is thus essential in lattice-based cryptography. Cryptographic protocols relying on SIS or LWE therefore enjoy the property of being provably as secure as a worst-case problem which is strongly suspected of being extremely hard. On the other hand, the cryptographic applications of SIS and LWE are inherently inefficient due to the size of the associated keys.
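Concretely, the two average-case problems can be stated as follows (standard formulations; the parameters n, m, q, β and the error distribution χ depend on the application):

\[
\mathrm{SIS}_{n,m,q,\beta}: \ \text{given uniform } \mathbf{A} \in \mathbb{Z}_q^{n \times m}, \ \text{find } \mathbf{z} \in \mathbb{Z}^{m} \setminus \{\mathbf{0}\} \ \text{such that} \ \mathbf{A}\mathbf{z} = \mathbf{0} \bmod q \ \text{and} \ \|\mathbf{z}\| \le \beta.
\]
\[
\mathrm{LWE}_{n,m,q,\chi}: \ \text{given } (\mathbf{A}, \, \mathbf{b} = \mathbf{A}\mathbf{s} + \mathbf{e} \bmod q) \ \text{with } \mathbf{A} \in \mathbb{Z}_q^{m \times n} \ \text{uniform and } \mathbf{e} \leftarrow \chi^{m}, \ \text{recover } \mathbf{s} \in \mathbb{Z}_q^{n} \ \text{(search), or distinguish } \mathbf{b} \ \text{from uniform (decision).}
\]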
To circumvent this inherent inefficiency, Micciancio [Mic02,Mic07] initiated an approach that consists in changing the SIS and LWE problems into variants involving structured matrices, thus allowing for more compact keys and more efficient algorithms. The problem considered by Micciancio is now commonly referred to as the Small Integer Solution problem over Rings, or R-SIS. A similar adaptation, called R-LWE, was introduced by Lyubashevsky et al. [LPR11]. Like SIS and LWE, these problems admit reductions from worst-case lattice problems: they are shown to be at least as hard as SIVP restricted to ideal lattices [LM06,PR06,LPR11], the lattices corresponding to ideals of the ring of integers of a number field matching the specific matrix structure.
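As a toy illustration of where the compactness comes from, here is a Python sketch (the helper names and parameters are ours, and it assumes the common choice R = Z_q[x]/(x^n + 1) with n a power of two): multiplication by a fixed ring element corresponds to a structured, nega-cyclic matrix, so a single vector of n coefficients stands for what would be a full n x n block of the matrix A in plain SIS/LWE.

# Minimal sketch: multiplication in R = Z_q[x]/(x^n + 1) is a structured
# (nega-cyclic) matrix, which is the source of the key-size savings in
# R-SIS/R-LWE. Helper names and parameters are illustrative.
import numpy as np

def negacyclic_matrix(a, q):
    """n x n matrix of multiplication by a(x) in Z_q[x]/(x^n + 1)."""
    n = len(a)
    M = np.zeros((n, n), dtype=np.int64)
    for j in range(n):        # column j = coefficients of a(x) * x^j
        for i in range(n):
            k = i - j
            if k >= 0:
                M[i, j] = a[k] % q
            else:
                M[i, j] = (-a[k + n]) % q  # x^n = -1 wraps with a sign flip
    return M

def ring_mul(a, b, q):
    """a(x) * b(x) mod (x^n + 1, q), computed via the structured matrix."""
    return negacyclic_matrix(a, q) @ np.array(b) % q

# Example with n = 4, q = 17.
q, a, b = 17, [1, 2, 3, 4], [5, 0, 1, 0]
print(ring_mul(a, b, q))  # -> [ 2  6 16  5]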
The aim of this collaboration was to study in detail the hardness of lattice problems on ideal lattices. Our goal was first to give a concrete approach to setting the parameters of constructions using cyclotomic rings, and second to study whether non-cyclotomic rings could be a better option for constructions: how to build a secure scheme with them, how to implement it, and which parameters should be used to guarantee security.
People
The project was funded from May 2017 to November 2019.
The permanent researchers involved in the project were:
- Adeline Roux-Langlois (CNRS, IRISA)
- Caroline Fontaine (CNRS, Lab-STICC 2017-2018 then LSV 2018-2019)
The following post-docs were funded by the project:
- Mohamed Sabt – 10/15/2017 to 08/31/2018 (IRISA; assistant professor at Université Rennes 1 since 09/01/2018)
- Yang Yu – 04/01/2019 to 09/30/2019 (IRISA)
Results
We obtained several results during the project, in particular:
- In a first work, we proposed an implementation of two constructions based on cyclotomic rings, with a concrete evaluation of the parameters needed for their security.
Lattice-based signature and Identity-Based Encryption (IBE) are well-known cryptographic schemes, and having schemes that are both efficient and provably secure in the standard model is still a challenging task, in light of the current NIST post-quantum competition. We address this problem in this paper by mixing the standard IBE scheme à la ABB (EUROCRYPT 2010), based on Ring-SIS/LWE assumptions, with the efficient trapdoor of Peikert and Micciancio (EUROCRYPT 2012), and we provide an efficient implementation. Our IBE scheme is more efficient than the IBE scheme of Ducas, Lyubashevsky and Prest based on the NTRU assumption, and it relies on more standard assumptions. We also describe and implement the underlying signature scheme, which is provably secure in the standard model and efficient.
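For context, this trapdoor construction can be sketched as follows (a standard statement of the unstructured version; the paper works with its ring analogue):

\[
\mathbf{g}^{T} = (1, 2, 4, \ldots, 2^{k-1}), \quad k = \lceil \log_2 q \rceil, \qquad
\mathbf{G} = \mathbf{I}_{n} \otimes \mathbf{g}^{T} \in \mathbb{Z}_q^{n \times nk},
\]
\[
\mathbf{A} = \left[\, \bar{\mathbf{A}} \;\middle|\; \mathbf{G} - \bar{\mathbf{A}}\mathbf{R} \,\right], \qquad
\bar{\mathbf{A}} \in \mathbb{Z}_q^{n \times \bar{m}} \ \text{uniform}, \quad
\mathbf{R} \in \mathbb{Z}^{\bar{m} \times nk} \ \text{with small entries.}
\]

The small matrix R is the trapdoor: knowing it makes it possible to sample short Gaussian preimages for A, which is the operation at the heart of both signing and IBE key extraction.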
- In a second work, we studied another type of structured variant for lattice-based constructions, called middle-product problems.
At CRYPTO 2017, Roşca et al. introduced a new variant of the Learning With Errors (LWE) problem, called the Middle-Product LWE (MP-LWE). The hardness of this new assumption is based on the hardness of the Polynomial LWE (P-LWE) problem parameterised by a set of polynomials, making it more secure against the possible weakness of a single defining polynomial. As a cryptographic application, they also provide an encryption scheme based on the MP-LWE problem. In this paper, we propose a deterministic variant of their encryption scheme, which does not need Gaussian sampling and is thus simpler than the original one. Still, it has the same quasi-optimal asymptotic key and ciphertext sizes. The main ingredient for this purpose is the Learning With Rounding (LWR) problem, which has already been used to derandomise LWE-type encryption. The hardness of our scheme is based on a new assumption called Middle-Product Computational Learning With Rounding, an adaptation of the computational LWR problem over rings introduced by Chen et al. at ASIACRYPT 2018. We prove that this new assumption is as hard as the decisional version of MP-LWE and thus benefits from worst-case to average-case hardness guarantees.
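For concreteness, here is a small Python sketch of the middle-product operation itself (following the definition of Roşca et al.: keep the middle d coefficients of the full polynomial product; the helper name and toy parameters are ours):

# Middle product of two polynomials: the middle d coefficients of a(x)*b(x),
# with deg a < d_a, deg b < d_b and d_a + d_b - 1 = d + 2k.
import numpy as np

def middle_product(a, b, d, q):
    """Middle d coefficients of a(x) * b(x), reduced mod q."""
    full = np.convolve(a, b)      # full product: d_a + d_b - 1 coefficients
    k = (len(full) - d) // 2      # number of coefficients dropped on each side
    assert len(full) == d + 2 * k, "d_a + d_b - 1 - d must be even"
    return full[k : k + d] % q

# Example: d_a = d_b = 4, d = 3, hence k = 2.
q = 17
print(middle_product([1, 2, 3, 4], [5, 6, 7, 8], 3, q))  # -> [ 0  9 10]

An MP-LWE sample is then, roughly, a pair (a, a ⊙_d s + e) for a fixed secret polynomial s, where ⊙_d denotes this operation.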
- In a third work, we analysed the information leakage about input data processed in the encrypted domain with state-of-the-art functional encryption schemes.
Classification algorithms and tools become more and more powerful and pervasive. Yet, for some use cases, it is necessary to be able to protect data privacy while benefiting from the functionalities they provide. Among the tools that may be used to ensure such privacy, we focus in this paper on functional encryption. These relatively new cryptographic primitives enable the evaluation of functions over encrypted inputs, outputting cleartext results. Theoretically, this property makes them well-suited to classification over encrypted data: their design enables one to run the classification algorithm over encrypted inputs (i.e. without knowing the inputs) while only getting the resulting input classes in the clear. In this paper, we study the security and privacy issues of classifiers using today's practical functional encryption schemes. We provide an analysis of the information leakage about the input data that are processed in the encrypted domain with state-of-the-art functional encryption schemes. This study, based on experiments run on two datasets (MNIST and Census Income), shows that neural networks are able to partially recover information that should have been kept secret. Hence, great care should be taken when using the currently available functional encryption schemes to build (seemingly) privacy-preserving classification services. It should be emphasised that this work does not attack the cryptographic security of functional encryption schemes; it rather warns the community that they should be used with caution for some use cases, and that the current state of the art may lead to operational weaknesses that could be mitigated in the future, once more powerful functional encryption schemes are available.
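As a deliberately simplified illustration of the principle at stake (a hypothetical Python sketch, not the paper's neural-network experiment): many practical functional encryption schemes today reveal inner products of the input in the clear, and once enough such values are available, the supposedly hidden input is determined by plain linear algebra.

# Hypothetical toy model of FE output leakage (not the paper's experiment).
# Inner-product FE reveals <x, w_i> in clear for each issued function key
# w_i; with enough keys, the encrypted input x is fully determined.
import numpy as np

rng = np.random.default_rng(0)
n = 8                                      # dimension of the private input
x = rng.integers(0, 256, size=n)           # the input meant to stay secret
W = rng.integers(-5, 6, size=(2 * n, n))   # function keys issued to the service

leaked = W @ x                             # cleartext outputs of FE evaluations

# Least-squares inversion of the leaked values recovers x exactly.
x_rec, *_ = np.linalg.lstsq(W.astype(float), leaked.astype(float), rcond=None)
print(np.allclose(np.round(x_rec), x))     # True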
- In a fourth work, we studied side-channel attacks against Falcon, a NIST submission based on NTRU lattices.
In this paper, we initiate the study of side-channel leakage in hash-and-sign lattice-based signatures, with particular emphasis on the two efficient implementations of the original GPV lattice-trapdoor paradigm for signatures, namely NIST second-round candidate FALCON and its simpler predecessor DLP. Both of these schemes implement the GPV signature scheme over NTRU lattices, achieving great speed-ups over the general lattice case. Our results are mainly threefold. First, we identify a specific source of side-channel leakage in most implementations of those schemes. Signing in lattice-based hash-and-sign schemes involves sampling a lattice point according to a Gaussian distribution. This reduces to sampling several one-dimensional discrete Gaussian distributions with standard deviations determined by the Gram–Schmidt norms of the secret lattice basis. Our observation is that those norms often leak through timing side-channels in the implementation of the one-dimensional Gaussian samplers. Second, we elucidate the link between this leakage and the secret key, by showing that the entire secret key can be efficiently reconstructed solely from those Gram–Schmidt norms. The result makes heavy use of the algebraic structure of the corresponding schemes, which work over a power-of-two cyclotomic field. To establish it, we propose efficient algorithms of independent interest which, given the leading principal minors of the matrix associated to a totally positive field element (in the power basis for DLP and the bit-reversed order basis for FALCON), recover the element up to conjugation. In the case of those schemes, that element is f·f̄ + g·ḡ, where (f, g) is the NTRU-style secret key. We then show that this element combined with the verification key suffices to recover the entire secret efficiently. Third, we concretely demonstrate the side-channel attack against DLP. The challenge is that timing information only provides an approximation of the Gram–Schmidt norms (with an accuracy increasing with the number of traces), and our algebraic recovery technique needs to be combined with a pruned tree search in order to apply it to approximate values. Experimentally, we show that around 2^35 DLP traces are enough to reconstruct the entire key with good probability. Carrying out a similar experiment against FALCON is left as an open problem, however, since the recursive nature of our bit-reversed order recovery algorithm does not accommodate approximate inputs easily. Nevertheless, our results do underscore the importance of constant-time implementations, particularly for schemes using Gaussian sampling.
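To make the central objects concrete, the following Python sketch (toy dimension n = 8, power basis as in DLP; the helper names are ours) checks numerically that the Gram matrix of the rotations of (g, -f) is the matrix associated to f·f̄ + g·ḡ, and that the Gram–Schmidt squared norms leaked by the timing side-channel are exactly the ratios of its successive leading principal minors.

# Numerical sketch of the key objects of the attack, in the power-of-two
# cyclotomic ring Z[x]/(x^n + 1). Toy parameters; real schemes use n >= 256.
import numpy as np

n = 8

def negacyclic(a):
    """Matrix whose row i holds the coefficients of x^i * a(x) mod (x^n + 1)."""
    M = np.zeros((n, n))
    row = np.array(a, dtype=float)
    for i in range(n):
        M[i] = row
        row = np.roll(row, 1)
        row[0] = -row[0]          # x^n = -1
    return M

def conj(a):
    """Coefficients of a(x^{-1}) mod (x^n + 1), the conjugate of a."""
    c = np.zeros(n)
    c[0] = a[0]
    c[1:] = -np.asarray(a, dtype=float)[:0:-1]
    return c

rng = np.random.default_rng(1)
f = rng.integers(-3, 4, size=n)
g = rng.integers(-3, 4, size=n)

# First n rows of the NTRU secret basis: the rotations of (g, -f).
B = np.hstack([negacyclic(g), -negacyclic(f)])
G = B @ B.T

# (1) The Gram matrix is the matrix associated to f*fbar + g*gbar.
h = negacyclic(f).T @ conj(f) + negacyclic(g).T @ conj(g)
print(np.allclose(G, negacyclic(h)))        # True

# (2) The leaked Gram-Schmidt squared norms are the ratios of the successive
# leading principal minors of G -- the quantity the recovery algorithm inverts.
minors = [np.linalg.det(G[:i, :i]) for i in range(1, n + 1)]
gs_sq = [minors[0]] + [minors[i] / minors[i - 1] for i in range(1, n)]
_, R = np.linalg.qr(B.T)                    # Gram-Schmidt via QR
print(np.allclose(gs_sq, np.diag(R) ** 2))  # True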
Publications
2018
- Practical Implementation of Ring-SIS/LWE Based Signature and IBE, Pauline Bert, Pierre-Alain Fouque, Adeline Roux-Langlois, Mohamed Sabt, PQCrypto 2018: 271-291.
2019
- Middle-Product Learning with Rounding Problem and Its Applications, Shi Bai, Katharina Boudgoust, Dipayan Das, Adeline Roux-Langlois, Weiqiang Wen, Zhenfei Zhang, ASIACRYPT 2019: 55-81.
2020
- Illuminating the Dark or how to recover what should not be seen in FE-based classifiers, Sergiu Carpov (CEA LIST), Caroline Fontaine (CNRS, LSV), Damien Ligier (Wallix), Renaud Sirdey (CEA LIST), PoPETS 2020.
- Key Recovery from Gram-Schmidt Norm Leakage in Hash-and-Sign Signatures over NTRU Lattices, Pierre-Alain Fouque, Paul Kirchner, Mehdi Tibouchi, Alexandre Wallet, Yang Yu, Eurocrypt 2020.