
POSTQUANTUM CRYPTOGRAPHY, PART 1

Cryptography Standards in Quantum Time: New Wine in an Old Wineskin?

Lidong Chen | NIST

The history of cryptography standards is reviewed, with a view to planning for the challenges, uncertainties,
and strategies that the standardization of postquantum cryptography will entail.

On 15 December 2016, NIST announced a call for proposals for quantum-resistant public-key cryptographic algorithms.1 The call, whose deadline is 30 November 2017, covers all public-key cryptographic primitives currently standardized by NIST: public-key encryption, key agreement, and digital signature schemes.

It feels like just yesterday that the cryptographic community's IEEE P1363,2 and Accredited Standards Committee X9 (Financial Industry Standards) working groups were developing public-key cryptography standards. Thanks to such standardization, public-key cryptography schemes are deployed in every Internet router, computer, tablet, and cell phone to enable secure communications and applications. Now that public-key cryptography schemes such as the Diffie-Hellman key agreement3 and RSA digital signatures4 are indispensable parts of our digitized life, the recent progress made on quantum computers compels us to look for quantum-resistant counterparts.

Research on quantum-resistant public-key cryptography, also called postquantum cryptography (PQC), has been fast paced and very productive in recent years. Many schemes proposed in the literature appear to be good candidates for the next generation of public-key cryptography standards. Just as RSA is based on the difficulty of factoring large composite integers, PQC schemes are based on problems that are predicted to remain hard even with quantum computers, such as solving the shortest vector problem (SVP) in a lattice, solving systems of multivariate quadratic equations over finite fields (MQ), and decoding problems in an error-correcting code.

In August 2015, NSA announced its plan to transition to quantum-resistant cryptographic algorithms in the not-too-distant future. Standardizing PQC schemes is the first step in this transition. Considering the nearly three decades of experience applying public-key cryptography and today's mature deployment environment, will plugging PQC into existing applications be as easy as replacing a light bulb? In this article, I discuss the challenges and opportunities involved in developing and deploying PQC standards.

History Doesn't Always Repeat Itself

When public-key cryptography was invented in the 1970s, people were fascinated by its use of number theory and finite fields to resolve key distribution problems. For thousands of years, enabling encryption had demanded a secret channel to distribute keys. Public-key cryptography allows communicating parties to establish a shared secret key without a secret channel.
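The key-establishment idea can be sketched in a few lines. This is a toy illustration using the textbook parameters p = 23 and g = 5, which offer no real security; deployed Diffie-Hellman uses groups of 2,048 bits or more.

```python
import secrets

# Toy Diffie-Hellman key agreement. The group (p = 23, g = 5) is the
# classic textbook example and is far too small for real security.
p, g = 23, 5

a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

A = pow(g, a, p)                   # Alice -> Bob: public value g^a mod p
B = pow(g, b, p)                   # Bob -> Alice: public value g^b mod p

k_alice = pow(B, a, p)             # Alice computes (g^b)^a mod p
k_bob = pow(A, b, p)               # Bob computes (g^a)^b mod p

assert k_alice == k_bob            # both now hold g^(ab) mod p
```

Only the public values A and B cross the wire; the shared secret g^(ab) is never transmitted, which is exactly the property that removed the need for a secret key-distribution channel.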

1540-7993/17/$33.00 © 2017 IEEE Copublished by the IEEE Computer and Reliability Societies July/August 2017 51

It also enables digital signatures for public authentication and authorization.

As the 1990s began, revolutionary advances in computing technology and digital communications provided commercial opportunities for public-key cryptography deployment. RSA Laboratories developed and published the first de facto standards: the Public-Key Cryptography Standards (PKCS) series.5 In particular, PKCS#1 provides the basic definitions of and recommendations for implementing RSA public-key cryptosystems. In 1994, IEEE approved the P1363 project to develop a public-key cryptography standard. Around the same time, X9, a standards organization for financial services, established working group X9F1 to develop public-key cryptography standards. The standards developed by IEEE P1363 and X9F1 focus on general-use algorithm specifications. The Internet Engineering Task Force (IETF) was probably the first organization to standardize public-key cryptography for real applications, that is, Internet protocols. Internet Key Exchange (IKE)6 and TLS7 are two protocols in which public-key cryptography is used for mutual authentication and key establishment.

In standardization's early days, the goal was to make use of public-key cryptography in the emerging network for communication and commerce. Security notions and proofs weren't as well developed as they are today. The ideas underlying the RSA and Diffie-Hellman schemes can be explained to people with a high school mathematics background. The relationship between the hardness of integer factorization and RSA, and between the hardness of the discrete logarithm problem (DLP) and Diffie-Hellman, are intuitive enough to be widely understood.

Early research focused on the computing complexity of factorization and discrete logarithm computation. Theory focused on reduction proofs and the existence of (trapdoor) one-way functions, pseudorandom functions, and so on.

At that time, many details about securely implementing public-key cryptography weren't understood. For example, PKCS#1's padding scheme has several versions, with some of the padding methods having security flaws. That is, the hardness of factorization can't guarantee the security of the RSA scheme in practice unless every detail is handled properly. RSA Optimal Asymmetric Encryption Padding (RSA-OAEP) was proposed as a provably secure method for RSA encryption at the 1994 Workshop on the Theory and Application of Cryptographic Techniques.8 RSA-OAEP introduced not only a new way to randomize plaintext messages to hide every bit of the plaintext but also the concept of nonmalleable security against adaptively chosen ciphertext attacks (NM-CCA2, also known as IND-CCA2) under the random oracle model.

In the past two decades, more security notions have been established and used to prove security for a given cryptosystem. The rich theory of provable security provides confidence in new cryptographic systems. However, determining how much to weigh provable security when selecting algorithms for standardization remains a challenge. For a given cryptographic scheme, should we adopt a provably secure but less efficient version, or the version that seems secure but doesn't have a security proof? As security theories advance, this decision might become harder.

We must also remember that efficiency in any particular computing environment has historically been a critical factor for adoption. In other words, a small advantage in performance might differentiate one algorithm from another. For example, being able to select small public-key sizes to speed up encryption and signature verification for RSA algorithms was considered a remarkable advantage. In the 1990s, great effort was made to improve performance. Open source implementations weren't available.

On the other hand, attackers were also limited by computing capacity. For IKE, Oakley Group 1 used a prime modulus of less than 800 bits in the Diffie-Hellman key agreement, which is very weak considering today's discrete logarithm algorithms and computing power. Equally small integers were also used as RSA moduli.

Today, computing power has increased tremendously. Although efficiency remains important, for many of today's implementation platforms, resource demands for implementing cryptography aren't major showstoppers. Furthermore, recently proposed PQC algorithms, such as lattice-based, coding-based, and multivariate cryptosystems, appear efficient enough to be plugged in to environments in which public-key cryptography is now implemented. Therefore, processing efficiency might not be the major competing factor differentiating algorithms. But for very constrained devices and bandwidth-limited networks, key size, signature size, and ciphertext expansion might become barriers for applications.
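One way to see why the padding details discussed above matter: "textbook" RSA, with no padding at all, is deterministic, so an eavesdropper can recognize repeated plaintexts and test guesses offline. The sketch below uses the classic toy modulus n = 3233 to make the point; randomized padding such as RSA-OAEP defeats this by mixing fresh randomness into every encryption.

```python
# Toy "textbook" RSA with the classic small example parameters.
# It illustrates why raw RSA needs randomized padding (as in RSA-OAEP):
# without padding, encryption is deterministic and leaks plaintext equality.
p, q = 61, 53
n = p * q                 # 3233; real moduli are 2,048 bits or more
e = 17
d = 2753                  # satisfies e * d = 1 (mod (p-1)(q-1))

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

m = 65
c1, c2 = encrypt(m), encrypt(m)
assert c1 == c2           # deterministic: repeated plaintexts are detectable
assert decrypt(c1) == m   # the trapdoor itself works correctly
```

The hardness of factoring n is untouched here; what fails is the surrounding detail, which is exactly the gap between a hard problem and a secure scheme.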



Uncertainties

It's taken a quarter of a century for us, as a community, to understand the currently deployed public-key cryptographic systems. On the journey to deploying them, we've learned to avoid security pitfalls step by step. We know there's no easy path, and NIST's announcement to commence standardization of PQC has highlighted several uncertainties. The first uncertainty is whether now is the time to standardize new cryptographic schemes. We've not yet seen quantum computers that can crack RSA and Diffie-Hellman cryptographic schemes. Although promising progress has been made on quantum computers, we can't estimate a time frame for their development with any certainty. If quantum computers don't appear along a predictable timeline, when should we make up our minds to move toward quantum-resistant cryptosystems?

The second uncertainty concerns classical security for newly emerging PQC algorithms. In addition to the cost of replacing deployed cryptosystems with new schemes, are there risks involved in deploying such new cryptographic systems? Indeed, many proposed schemes have been broken in the past decade due to classical security flaws. Considering how long it's taken us to understand the security of the cryptosystems currently in use, it seems risky to move quickly to any of the new schemes.

The third uncertainty is probably the most worrisome. We know much less about the properties of quantum computers than classical computers. Could new quantum algorithms be discovered that lead to attacks on algorithms thought to be resistant to quantum attacks? Likewise, the performance characteristics of future quantum computers (cost, speed, memory size, and so on) are not well understood. This uncertainty has resulted in differing opinions regarding the appropriate quantum security strength levels for setting parameters and key sizes.

Shakespeare's plays often focus on the impossibility of certainty. Doesn't this also seem to be the case for cryptography's history? We believed factorization and discrete logarithm were hard, until quantum computers and quantum algorithms emerged to shake our beliefs. As we move toward PQC standardization, the first step will be to understand and work with the uncertainties.

There's much uncertainty about when quantum computers will be available at scale. What's certain is that developing and deploying new cryptographic standards will take years. Considering that some data must remain confidential for many years, we must ensure that quantum-resistant cryptographic algorithms are in place ahead of time to guarantee backward secrecy. If, as the experts predict, there's "a one-in-seven chance that some fundamental public-key crypto will be broken by quantum by 2026, and a one-in-two chance of the same by 2031,"9 then we don't have much time to complete the standardization and deployment process. For cryptography, a one-in-seven chance of being broken is indeed non-negligible. Taking cautious action to start the process is the only option for dealing with the uncertainty.

Classical security certainly will be the first consideration for quantum-resistant cryptosystems. Whereas some possible candidates are fairly new, others were proposed years ago and have been shown to be secure. For example, the code-based McEliece encryption algorithm was proposed in the 1970s,10 while NTRUencrypt was proposed in the 1990s.11 Uncertainty about classical security arises for the newer versions of these algorithms that improve performance or key sizes. The fact that some new schemes have been broken quickly actually proves that our community's cryptanalysis abilities regarding classical security have grown strong, so we'll likely be able to identify security flaws effectively. Furthermore, an open, transparent process will allow cryptographers and the community to thoroughly analyze and assess the security of newly proposed algorithms.

Indeed, our understanding of quantum security is far less comprehensive than our understanding of classical security. However, over the past few years, we've made significant progress. The standardization process will certainly promote research on quantum algorithms and quantum security. As further progress is made in quantum computing, we'll learn more about quantum security.

In dealing with these uncertainties, NIST proposes to use five equivalent security classes to select parameters and keys for each proposed algorithm.12 The general assumption is that there are no known quantum attacks on the proposed scheme (for example, Shor's attack on factorization), with the exception of generic quantum speedup (for example, Grover's quadratic speedup on Advanced Encryption Standard [AES] key search). The five security classes reflect not only classical security strength but also the effectiveness of quantum speedups at the same classical security level. For example, if a scheme can provide 128-bit classical security and there's no quantum attack other than classical attacks with generic quantum speedups, such as Grover's, then the scheme should be able to provide 64-bit quantum security. However, if quantum speedups aren't as effective as Grover's key search on AES-128, then the scheme should be able to provide a quantum security level greater than 64 bits. Note that breaking 64-bit quantum security could be significantly more difficult and expensive than breaking 64-bit classical security. Precisely estimating the quantum security of PQC algorithms will require extensive collaboration between classical and quantum researchers.


Nevertheless, all of these uncertainties urge us to start, because they'll take time to understand and resolve. It's going to be a long journey.

Postquantum Cryptography Drop-In Replacements

Today, public-key cryptography is used everywhere. Introducing quantum-resistant counterparts involves a transition stage. Finding PQC algorithms that can be used as drop-in replacements will make the transition less disruptive. The question is, can we find them?

As I discussed, processing complexity might not be a barrier anymore, because most emerging PQC algorithms are pretty efficient in terms of processing time. However, we must prepare to deal with new challenges. One example is stateful hash-based signatures.13 Hash-based signatures were first introduced in the 1970s in the Lamport one-time signature scheme. A major disadvantage of Lamport one-time signatures is their large public and private keys. To sign a message M whose hash h(M) is a k-bit string in {0, 1}^k, 2k hash values have to be saved. In 1979, Ralph Merkle proposed using a hash tree to reduce the public-key size.14 Stateful hash-based signatures are essentially Merkle signatures. Compared to other PQC categories, the security of hash-based signatures is better understood. However, each private key can be used only once. Thus, the task of managing private keys, also called state management, becomes a major challenge for large-scale applications of hash-based signatures.

To overcome this state management challenge, stateless hash-based signatures were introduced.15 However, these have a much larger signature size. For bandwidth-limited applications, signature transmission might require segmenting the data into multiple messages in the existing protocols. Some postquantum signature schemes, such as the family of schemes based on multivariate cryptography (for example, Quartz,16 Rainbow,17 and their variants), offer signature sizes compatible with standardized signature schemes such as the Elliptic Curve Digital Signature Algorithm (ECDSA). But the public- and private-key sizes can be hundreds of times larger. As a result, a given quantum-resistant signature scheme could work as a replacement in one application but not in another; there's no one-for-all drop-in replacement.

The Diffie-Hellman key agreement is a beautiful public-key cryptography scheme for many reasons. When ephemeral keys are used, it provides perfect forward secrecy, meaning the compromise of long-term keys doesn't compromise past session keys. This has become a very desirable property. As specified by IETF, in TLS version 1.2 and earlier versions, three key establishment schemes have been supported: RSA key transport, static-ephemeral Diffie-Hellman, and ephemeral-ephemeral Diffie-Hellman. In the newest version, TLS 1.3, ephemeral-ephemeral Diffie-Hellman is the only supported key establishment scheme.

A Diffie-Hellman quantum-resistant counterpart tops the wish list, and researchers are pursuing this direction. Oded Regev's learning with errors (LWE) problem has turned out to be a promising basis for constructing Diffie-Hellman-like key agreement schemes.18 Jintai Ding and his colleagues built the first such scheme in 2012.19 Currently, more than one Diffie-Hellman-like quantum-resistant key establishment scheme has been proposed and even prototyped. Although their properties differ, they're generally very close to the Diffie-Hellman key agreement.

One of these schemes is named New Hope.20 And the proposed scheme is indeed as the name claims: a new hope for a drop-in Diffie-Hellman replacement. Performance is quite reasonable. The difference is that the operations aren't symmetric for the two parties. Not only do the operations differ, but the responder needs to generate a message based on the initiator's public value. The scheme could possibly fail even if both parties correctly select random values and conduct operations. This might not be a major concern for key establishment, but it can hardly be considered a drop-in replacement for Diffie-Hellman.

Another "new hope" for key agreement is a family of recently proposed schemes based on isogenies between elliptic curves. The hardness of finding isogenies between supersingular elliptic curves was first introduced as the basis for cryptosystems more than 10 years ago by Denis Charles and his colleagues.21 One version is called Supersingular Isogeny Diffie-Hellman (SIDH) to emphasize the resemblance to Diffie-Hellman key agreement. Operationally, it's more symmetric for the two parties. For those looking for drop-in replacements, SIDH looks much closer to Diffie-Hellman key agreement than other postquantum key agreement schemes. Performance-wise, it's significantly slower and more costly than Elliptic Curve Diffie-Hellman (ECDH).22 However, if performance cost is an issue for some applications or processors, then SIDH might not be a suitable replacement for Diffie-Hellman key agreement.
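The Lamport construction mentioned earlier is simple enough to sketch. The signer keeps two random secrets per bit of the message hash and publishes their hash values (the 2k stored hash values noted above); signing reveals one secret per bit, which is why a key pair must never sign two messages. A minimal sketch using SHA-256 (k = 256):

```python
import hashlib
import secrets

K = 256  # bits in the message hash h(M)

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # Two secrets per hash bit: sk[i][0] covers bit value 0, sk[i][1] covers 1.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(K)]
    pk = [(H(s0), H(s1)) for s0, s1 in sk]   # the 2k public hash values
    return sk, pk

def bits(msg: bytes):
    digest = int.from_bytes(H(msg), "big")
    return [(digest >> i) & 1 for i in range(K)]

def sign(sk, msg: bytes):
    # Reveal one secret per hash bit; this consumes the key pair.
    return [sk[i][b] for i, b in enumerate(bits(msg))]

def verify(pk, msg: bytes, sig):
    return all(H(s) == pk[i][b] for i, (s, b) in enumerate(zip(sig, bits(msg))))

sk, pk = keygen()
sig = sign(sk, b"message M")
assert verify(pk, b"message M", sig)
assert not verify(pk, b"another message", sig)
```

Signing a second message with the same key would reveal secrets for both bit values at some positions, letting a forger mix and match; Merkle's hash tree authenticates many such one-time public keys under a single root, which is why the resulting schemes are stateful.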

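The behavior described for New Hope above (asymmetric operations and a small chance of failure even when both parties behave correctly) can be illustrated with a deliberately simplified toy. The sketch below replaces the polynomial rings of Ding's scheme and New Hope with a single integer modulo q and omits the reconciliation step entirely; every parameter here is illustrative, not taken from any actual specification.

```python
import random

# Toy, scalar LWE-style "approximate agreement." Each party publishes a
# noisy product a*s + e; multiplying by its own small secret gives both
# parties values that are close but not identical, and each extracts the
# high bit. When the value lands near a rounding boundary, the extracted
# bits can disagree: the small failure probability discussed in the text.
# Real schemes send reconciliation data to make such failures negligible.
q = 12289                                  # New Hope's modulus, for flavor
random.seed(1)

def small():                               # narrow "error" distribution (toy)
    return random.randint(-4, 4)

def keybit(k):                             # 1 if k mod q lies in the middle half
    return 1 if q // 4 <= k % q < 3 * q // 4 else 0

def run_exchange():
    a = random.randrange(q)                # shared public parameter
    s_a, s_b, e_a, e_b = small(), small(), small(), small()
    p_a = (a * s_a + e_a) % q              # Alice -> Bob
    p_b = (a * s_b + e_b) % q              # Bob -> Alice
    k_a = (s_a * p_b) % q                  # = a*s_a*s_b + s_a*e_b (mod q)
    k_b = (s_b * p_a) % q                  # = a*s_a*s_b + s_b*e_a (mod q)
    return keybit(k_a), keybit(k_b)

trials = 10_000
agree = sum(x == y for x, y in (run_exchange() for _ in range(trials)))
assert agree >= trials * 0.95              # the parties almost always agree ...

# ... but a few units of noise flip the bit near a rounding boundary:
assert keybit(q // 4 - 2) != keybit(q // 4 + 2)
```

The two shared values differ by s_a*e_b - s_b*e_a, which is small but nonzero; classic Diffie-Hellman has no analogue of this gap, which is one reason these schemes are not literal drop-in replacements.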


Furthermore, whether a new PQC standard can be used as a drop-in replacement might depend not only on similarities in the key size, signature size, and format but also on whether the security implementation technologies we developed in the past can ensure PQC security. Protocols and applications that introduce new mechanisms might be required to guarantee secure implementation of the new schemes. That is, the existing protocols or applications might not provide sufficient countermeasures to deal with new issues.

Secure Implementation Issues for New Algorithms

While developing and deploying the first generation of public-key cryptography standards, we learned a lot about dealing with secure implementation issues. The experience we gained will help prepare us mentally to face the issues of PQC. But we'll need new techniques, examples of which I discuss here.

Public-Key Validation

In discrete logarithm-based cryptosystems such as Diffie-Hellman key agreement, public-key validation is needed to ensure that the public key is in the right subgroup, because a small subgroup attack can force the established secret value into a small group that's vulnerable to exhaustive search. However, public-key validation is not straightforward for all the new PQC algorithms. Some methods have been introduced to conduct indirect public-key validation. Some alternative indirect validation methods might require one party to reveal a function value of its secret key, jeopardizing security. Other suggested methods might be very costly and impractical.

Public-Key Reuse

In a Diffie-Hellman key agreement scheme, a public key can be ephemeral or static. Even for ephemeral Diffie-Hellman key agreement, some existing protocols allow the ephemeral key to be used in more than one execution. However, for some quantum-resistant key agreement schemes, if a public key is reused, it becomes compromised. The reuse might be intentional (by an attacker) or careless (by a legitimate party using a bad key generation function). If key agreement with such a protocol is deployed, it must include mechanisms to prevent or reduce the security risk brought about by reusing public keys.

Decryption Failure

In a public-key encryption scheme such as RSA, if the involved parties follow the rules to select keys and parameters and conduct the operations properly, then the plaintext will be obtained through the decryption operation. Similarly, in a Diffie-Hellman key agreement scheme, if each party executes as specified, they'll obtain the same secret value in the end. However, in some of the newly emerging public-key encryption schemes, decryption or key agreement isn't always correct. That is, even if every parameter and key is selected per the specification and each operation is executed properly, it's still possible that the plaintext won't be obtained through a decryption or that the two parties won't share the same secret value after executing the scheme. Such decryption failure happens with a relatively small probability but might introduce security flaws.

Auxiliary Functions

Public-key cryptography schemes usually entail using certain auxiliary functions. For instance, hash functions are used as an auxiliary function for digital signatures. Some PQC schemes will need new auxiliary functions for secure implementation.

In the past, we've depended on the notion that a value can be selected uniformly at random from a properly sized set. A robust and secure random number generator is critical for the secure implementation of many cryptosystems in use today. We've concentrated on ensuring correct implementations of random number generators to output uniformly distributed elements in a given set. Now, some of the new PQC algorithms require certain values to be selected according to a specific nonuniform distribution. For example, in R-LWE-based schemes like New Hope, the "error" value must be selected according to a Gaussian distribution. Simulating these required distributions requires the introduction of new auxiliary functions. Because the implementation's security relies on properly selected values with the required distribution, the simulation function is critical.

Some of the security implementation issues raised by PQC might not be new. For example, countermeasures for side-channel attacks have been implemented for the cryptosystems currently in use. But we might need new methods and techniques to protect a given new algorithm from side-channel attacks. Also, implementing the countermeasures to deal with security issues might increase processing and communication complexity for a given scheme. Therefore, understanding the tradeoffs is critical to making the right decisions.

The Road Ahead

For PQC standardization, we might not get what we wish for. Nevertheless, we have reasons to be optimistic about the road ahead.

First, cryptographic research has advanced tremendously in the past 25 years. The security notions and


proofs that have been developed will certainly help us better understand the security of new schemes. The research community has already demonstrated a strong ability to conduct effective cryptanalysis on schemes to be deployed.

Second, the applications community has matured. Public-key cryptography has been implemented in many communication protocols and digital devices. The applications community has gained extensive experience deploying new cryptographic algorithms.

Furthermore, open source implementations are available for most of the cryptographic algorithms in use. These open source implementations have promoted a collective effort to ensure best practice.

Finally, advanced computing and communication technologies can accommodate cryptographic functions that are more demanding of processing and communication resources.

To overcome the fact that no exact drop-in replacements have been proposed for currently deployed cryptosystems, future standards could specify multiple algorithms for each cryptographic primitive according to the requirements of different applications, especially to deal with nonideal characteristics such as large signature size or large keys. These algorithms could be selected from different categories and based on different hard problems. The reason for doing so is that signature or key size might not be a problem for some applications but a showstopper for others. In this way, the standards could allow different applications to deploy different algorithms. On the other hand, existing protocols might need to be modified to handle larger signatures or key sizes, for example, through segmentation of messages. For new applications, implementations must keep the demands of PQC in mind and allow the new schemes to adapt to them. PQC requirements might shape future application standards.

Secure implementation issues can be addressed through different approaches. For instance, efforts have been made to reduce the probability of decryption failure on schemes like NTRUencrypt by justifying parameters and keys. Mechanisms could also be added at the protocol level to limit security flaws.

For PQC standardization, we will need a new wineskin to hold the new wine. Plugging quantum-resistant cryptosystems into existing applications will be a new experience for both cryptographers and practitioners. But the valuable experience we've accumulated in the past 25 years of working on first-generation public-key cryptography standards will help us deal with the new issues and challenges.

Acknowledgments
The opinions expressed in this article are the author's and don't necessarily represent the views of NIST.

References
1. "Submission Requirements and Evaluation Criteria for the Post-Quantum Cryptography Standardization Process," NIST, 15 Dec. 2016; csrc.nist.gov/groups/ST/post-quantum-crypto/documents/call-for-proposals-final-dec-2016.pdf.
2. "Standard Specifications for Public-Key Cryptography," IEEE P1363, 10 Oct. 2008; grouper.ieee.org/groups/1363.
3. W. Diffie and M. Hellman, "New Directions in Cryptography," IEEE Trans. Information Theory, vol. 22, no. 6, 1976, pp. 644–654.
4. R.L. Rivest, A. Shamir, and L. Adleman, "A Method for Obtaining Digital Signatures and Public-Key Cryptosystems," Comm. ACM, vol. 21, no. 2, 1978, pp. 120–126.
5. "PKCS#1 v2.2: RSA Cryptography Standard," RSA Lab., 27 Oct. 2012; www.emc.com/collateral/white-papers/h11300-pkcs-1v2-2-rsa-cryptography-standard-wp.pdf.
6. C. Kaufman et al., "Internet Key Exchange Protocol Version 2 (IKEv2)," IETF RFC 7296, Oct. 2014.
7. T. Dierks and E. Rescorla, "The Transport Layer Security (TLS) Protocol Version 1.2," IETF RFC 5246, Aug. 2008.
8. M. Bellare and P. Rogaway, "Optimal Asymmetric Encryption—How to Encrypt with RSA," Proc. Workshop Theory and Application of Cryptographic Techniques (EUROCRYPT 94), LNCS 950, Springer, 1995.
9. M. Mosca, "Cybersecurity in an Era with Quantum Computers: Will We Be Ready?," report 2015/1075, IACR Cryptology ePrint Archive, 2015; eprint.iacr.org/2015/1075.
10. R.J. McEliece, "A Public-Key Cryptosystem Based on Algebraic Coding Theory," DSN Progress Report, vol. 44, 1978, pp. 114–116.
11. J. Hoffstein, J. Pipher, and J.H. Silverman, "NTRU: A Ring-Based Public Key Cryptosystem," Proc. 3rd Int'l Symp. Algorithmic Number Theory (ANTS 98), 1998, pp. 267–288.
12. "Submission Requirements and Evaluation Criteria for the Post-Quantum Cryptography Standardization Process," NIST, 15 Dec. 2016; csrc.nist.gov/groups/ST/post-quantum-crypto/documents/call-for-proposals-final-dec-2016.pdf.
13. J. Buchmann, E. Dahmen, and A. Hülsing, "XMSS—A Practical Forward Secure Signature Scheme Based on Minimal Security Assumptions," Proc. 4th Int'l Conf. Post-Quantum Cryptography (PQCrypto 11), 2011, pp. 117–129.
14. R. Merkle, "Secrecy, Authentication and Public Key Systems—A Certified Digital Signature," PhD dissertation, Dept. of Electrical Eng., Stanford Univ., 1979.



15. D. Bernstein et al., "SPHINCS: Practical Stateless Hash-Based Signatures," Proc. Ann. Int'l Conf. Theory and Applications of Cryptographic Techniques (EUROCRYPT 15), LNCS 9056, Springer, 2015; doi:10.1007/978-3-662-46800-5_15.
16. J. Patarin, N. Courtois, and L. Goubin, "Quartz, 128-Bit Long Digital Signatures," Proc. Conf. Topics in Cryptology: The Cryptographer's Track at RSA (CT-RSA 01), 2001, pp. 282–297.
17. J. Ding and D. Schmidt, "Rainbow, A New Multivariable Polynomial Signature Scheme," Proc. Third Int'l Conf. Applied Cryptography and Network Security (ACNS 05), 2005, pp. 164–175.
18. O. Regev, "On Lattices, Learning with Errors, Random Linear Codes, and Cryptography," Proc. Ann. ACM Symp. Theory of Computing (STOC 05), 2005, pp. 84–93.
19. J. Ding, X. Xie, and X. Lin, "A Simple Provably Secure Key Exchange Scheme Based on the Learning with Errors Problem," report 2012/688, IACR Cryptology ePrint Archive, 2012; eprint.iacr.org/2012/688.
20. E. Alkim et al., "Post-Quantum Key Exchange—A New Hope," Proc. USENIX Security Symp. (USENIX Security 16), 2016, pp. 327–343.
21. D.X. Charles, K.E. Lauter, and E.Z. Goren, "Cryptographic Hash Functions from Expander Graphs," J. Cryptology, vol. 22, no. 1, 2009, pp. 93–113.
22. C. Costello, P. Longa, and M. Naehrig, "Efficient Algorithms for Supersingular Isogeny Diffie-Hellman," Proc. Ann. Cryptology Conf. (CRYPTO 16), LNCS 9814, Springer, 2016; doi:10.1007/978-3-662-53018-4_21.
Lidong Chen is a mathematician and leader of NIST's Cryptographic Technology Group. Her research interests include cryptographic protocols, zero-knowledge proofs, special-featured digital signatures, security protocol design, network security, and security for wireless and mobility applications. She has also actively contributed to various cryptography and security standards. Chen received a PhD in applied mathematics from Aarhus University. Contact her at lily.chen@nist.gov.
