Published in The SciTech Lawyer, Volume 18, Number 3, Spring 2022. © 2022 American Bar Association. Reproduced with permission. All rights reserved. This information or any portion thereof may not be copied or disseminated in any form or by any means or stored in an electronic database or retrieval system without the express written consent of the American Bar Association.

 

By Hoyt L. Kesterson II

In the late 1990s, there was an explosion of digital signature activities. The international standards organizations, the International Organization for Standardization (ISO) and the International Telecommunication Union (ITU), had published a new edition of the standard specifying a digital signature certificate and the mechanisms for managing it. The Internet Engineering Task Force was specifying how that standard would support the internet. In 1996 the American Bar Association published the Digital Signature Guidelines: Legal Infrastructure for Certification Authorities and Secure Electronic Commerce. Its intent was to specify how a digitally signed electronic document would be legally equivalent to an original paper document physically signed in ink.

Many states were revising their statutes to allow such documents, and some states intended to allow electronic documents to be digitally notarized. I never expected to see the term “asymmetric encryption” in legislation, but there it was in the Arizona draft notary act in the late 1990s or early 2000s. Because I lived in Arizona, I was providing what help I could. There was one loud negative reaction from a local attorney. When I asked for his reason, he responded, “Quantum computing will make it for naught.”

A quarter of a century later, it is still not for naught, but we hear a lot about quantum computers and how they will break the current encryption methods on which we depend to secure our communications, stored information, and digitally signed documents. To understand this threat, we need to understand how our current methods work. I am sorry to say there is some math.

ENCRYPTION 101 (AND A LITTLE MATH)

In the movie A Christmas Story, Ralphie uses his Little Orphan Annie Secret Circle Decoder Pin to read a secret message announced over the radio number by number and accompanied by settings for the pin. This method of secret writing, aka encryption, is character substitution. The method is at least as old as Julius Caesar, who shifted each letter in his missives by three characters. The Caesar Cipher algorithm is simple, but there is no variability; identical text strings would encrypt to identical ciphertext. “Annie” introduced variability with a device where each setting would produce a different result for an identical input. We call this variable a key. The sender of an encrypted message must use the same key to encrypt the message as the reader will use to decrypt it.
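
To make the idea concrete, here is a minimal sketch in Python of a keyed shift cipher; the shift amount plays the role of the key, and the function names are illustrative rather than taken from any library.

```python
import string

ALPHABET = string.ascii_uppercase  # A-Z

def shift_encrypt(plaintext: str, key: int) -> str:
    """Replace each letter with the letter `key` positions later, wrapping around at Z."""
    result = []
    for ch in plaintext.upper():
        if ch in ALPHABET:
            result.append(ALPHABET[(ALPHABET.index(ch) + key) % 26])
        else:
            result.append(ch)  # leave spaces and punctuation alone
    return "".join(result)

def shift_decrypt(ciphertext: str, key: int) -> str:
    """Reverse the shift; the reader must use the same key the sender used."""
    return shift_encrypt(ciphertext, -key)

print(shift_encrypt("ATTACK AT DAWN", 3))  # DWWDFN DW GDZQ (Caesar's fixed shift of three)
print(shift_decrypt("DWWDFN DW GDZQ", 3))  # ATTACK AT DAWN
```

Changing the key changes the ciphertext, which is the variability the decoder pin added; the scheme is still trivial to break, but the role of the key is the same one it plays in modern ciphers.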

This is called symmetric encryption. The encryption algorithm must produce messages that resist analysis, and the key must be distributed securely among the parties who encrypt and decrypt the messages. Techniques have evolved beyond simple substitution, where counting how often each character occurs can lead to cracking the code. Character transposition, where subsequent encryptions of the same letter would result in different ciphertext, evolved into arithmetic manipulation of the values that represent characters digitally in a computer.

The National Bureau of Standards (renamed in 1988 the National Institute of Standards and Technology (NIST)) standardized the Data Encryption Standard (DES), followed by triple-DES, in which the input value is encrypted with key a, decrypted with key b, and then re-encrypted with either key a or a third key c; triple-DES is still widely used by financial institutions and ATMs.

In 1997, NIST announced an open competition for the replacement of DES. Fifteen candidates from all over the world were evaluated not only by NIST but by any person or organization who cared to examine the proposed algorithms. This process was in sharp contrast to the closed evaluation and selection of DES in 1977. The fifteen were reduced to five for the final round of evaluation, and in 2001 Rijndael, designed by two Belgian cryptographers, was published as the Advanced Encryption Standard (AES). Note that security is not provided by the secrecy of the algorithm; security is provided by the rigor of the algorithm, and secrecy is provided by the key.
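
To show what that looks like in practice, here is a minimal sketch using the third-party Python package cryptography (an assumption of this example, not something mandated by the standard); whoever holds the key can both encrypt and decrypt.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # AES in an authenticated mode

key = AESGCM.generate_key(bit_length=256)  # the shared secret key
nonce = os.urandom(12)                     # a unique value per message, sent along with the ciphertext

ciphertext = AESGCM(key).encrypt(nonce, b"Meet at the usual place.", None)

# Anyone holding the same key (and nonce) can decrypt, and could equally have created the message.
plaintext = AESGCM(key).decrypt(nonce, ciphertext, None)
assert plaintext == b"Meet at the usual place."
```

The sketch also previews the two problems discussed next: the key must somehow be shared securely, and because either party could have produced the ciphertext, the message alone does not prove who wrote it.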

Just as for previous symmetric encryption algorithms, it is that key that still causes two problems. If Bob wants to send an encrypted message that can only be read by Alice, Bob must share the secret key with Alice. The problem of sharing that key has many solutions and just as many attacks.

The second problem is that because the key is shared for communication, one cannot determine whether an encrypted message was written by Bob or Alice; source authentication is not possible and, therefore, symmetric encryption cannot be used for authentication or signature unless augmented with additional techniques.


In 1976, influenced by Ralph Merkle, Whitfield Diffie and Martin Hellman published New Directions in Cryptography, in which they described asymmetric encryption, a class of encryption algorithms that uses one key of a pair to encrypt and the other key to decrypt. Two cryptographers at GCHQ, an NSA-like agency in the United Kingdom, had developed the same idea seven years before, but they could not publish it.

 

A Tale of Two Boxes

One can imagine a lockable chest to compare how symmetric and asymmetric encryption work. Assume the chest has one lock. If the holder of the only key locks something in that chest, then the owner is the only person who can retrieve it. If the owner wants to permit access to the contents of the chest to other people, then the owner gives a copy of the key to each person who is permitted access. The keys must be protected to ensure unauthorized people don’t obtain a key and thus gain unauthorized access. This provides confidentiality. Because any person with a key can put something in the chest, one cannot determine who placed the contents in the chest; there is no source authentication. This is symmetric encryption with a secret key for storage or a shared secret key for communication.

Assume the chest has two locks, each with a different key. Assume that if the chest is locked with one key, it can only be unlocked with the other. The owner of the box keeps one of the pair of keys private; it is never shared. The owner declares the other key to be public and gives a copy of that key to anyone who wants it. That public key could be copied and distributed by anyone. There are two scenarios.

In the first scenario, one of the holders of the public key puts something in the box and locks it. Because it can now only be unlocked with the associated private key, no other holders of a copy of the public key can open it; only the holder of the private key can unlock it. This provides confidentiality between the person who locked the box with a copy of the public key and the owner of the box, the person with the only copy of the private key. However, that owner cannot tell which copy of the public key was used, so source authentication is not provided.

In the second scenario, the owner puts something in the box and locks it with the private key. Because it can be unlocked with any of the copies of the public key, there is no confidentiality. However, anyone who can unlock the box with their copy of the public key knows that it could have only been locked with the corresponding private key. The source of the content in the box is proven to be from the holder of the private key, i.e., the owner of the box. This intrinsic ability to provide source authentication is the underlying method for a digital signature when supported by a method such as an X.509 Public-key Certificate to securely associate the public key to an entity.
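
The two scenarios can also be sketched in code. The example below is a minimal illustration that assumes the third-party Python cryptography package and the RSA algorithm discussed later in this article; it is a sketch, not a production protocol.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# The owner of the "box" generates a key pair and publishes only the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Scenario 1 (confidentiality): anyone can lock a message with the public key,
# but only the private-key holder can unlock it.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
locked = public_key.encrypt(b"for the owner's eyes only", oaep)
unlocked = private_key.decrypt(locked, oaep)

# Scenario 2 (source authentication): only the owner can "lock" with the private key;
# anyone holding the public key can confirm that the owner did so.
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(b"I, the owner, wrote this.", pss, hashes.SHA256())
public_key.verify(signature, b"I, the owner, wrote this.", pss, hashes.SHA256())  # raises an exception if forged
```

Scenario 2 is the basis of the digital signature described in the next section, where what is actually signed is a hash of the document rather than the document itself.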

 

Digital Signature and Hashing

There are two reasons why one does not encrypt the whole document to digitally sign it. The major reason is that asymmetric encryption is much slower than symmetric encryption. The second is that one may want to read the document without having to decrypt it. To leave the document in the clear, one encrypts a representation of it. That representation has names like message digest and fingerprint, but it’s most often referred to as the hash.

Merriam-Webster Unabridged defines hash as “to chop to pieces specifically: to prepare for use in a meat and vegetable dish by cutting into small pieces” and “to make a confused muddle of.” That should evoke an image of digital data after hashing—chopped up and unrecognizable. The hash result must be small to reduce the effort of encrypting it. When used for digital signatures, the hash method must resist manipulation that would allow different messages to produce the same hash result. A cryptographic hash function will convert data of arbitrary length to a short fixed-length string of bits. A cryptographic hash function must meet two criteria:

• One cannot create two sets of data that would hash to the same result. This is called collision resistance. If an attacker could create two versions of a document that would hash to the same result, then one document could replace the one that had been digitally signed.
• Given a hash value, one cannot create data that would generate it. This is called pre-image resistance. If an attacker could modify the content of a digitally signed message such that the hash result was unchanged, then one could not trust that content.

From 2008 to 2012, NIST again ran an open competition to develop a secure hash standard, resulting in the Secure Hash Algorithm 3 (SHA-3) to complement SHA-2, published in 2001. The use of SHA-1, published in 1995, is deprecated for digital signatures. The two current algorithms produce hash results in sizes ranging from 224 bits to 512 bits (64 bytes).
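
Python’s standard hashlib module implements both SHA-2 and SHA-3 and can illustrate the fixed-size result and the sensitivity to change; the dollar amounts below are, of course, invented for the example.

```python
import hashlib

document = b"I agree to pay $1,000 on June 1."
tampered = b"I agree to pay $9,000 on June 1."

print(hashlib.sha256(document).hexdigest())    # 256-bit (32-byte) digest of the original
print(hashlib.sha256(tampered).hexdigest())    # one changed character yields a completely different digest
print(hashlib.sha3_512(document).hexdigest())  # SHA-3 variant producing a 512-bit result
```

A signature computed over the first digest would fail to verify against the tampered text, because the recomputed hash would no longer match.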

An originator digitally signs data, e.g., a document, by computing a SHA hash of the data and then encrypting that hash with the originator’s private key. Anyone can verify the integrity and the source of the data by computing a hash of the received data and comparing that hash to the received hash decrypted by the originator’s public key.
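
Continuing the earlier RSA sketch (the same assumed cryptography package and an illustrative key pair), the hash-then-sign flow described above looks roughly like this:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa, utils

signer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
document = b"The party of the first part agrees..."

# Originator: hash the document, then sign that hash with the private key.
hasher = hashes.Hash(hashes.SHA256())
hasher.update(document)
digest = hasher.finalize()
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)
signature = signer_key.sign(digest, pss, utils.Prehashed(hashes.SHA256()))

# Recipient: recompute the hash of the received document and check it against the signature
# using the originator's public key; verify() raises an exception on any mismatch.
verifier = hashes.Hash(hashes.SHA256())
verifier.update(document)
signer_key.public_key().verify(signature, verifier.finalize(), pss, utils.Prehashed(hashes.SHA256()))
```

If even one byte of the document were altered, the recomputed hash would differ and verification would fail.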

 

Key Transfer and Key Agreement

Using symmetric encryption to protect the confidentiality of data when it is transmitted requires that both parties agree on the key, producing a shared secret. One way to do this is with asymmetric encryption: each party uses the RSA public key of the other party to encrypt the symmetric key information, and each party uses its own private key to decrypt the key material sent by the other party. This is called key transfer.

Key agreement instead has each of the parties construct the symmetric key based on information shared between them. The Diffie-Hellman Key Exchange algorithm is an early example. Elliptic Curve Diffie-Hellman (ECDH) uses key sizes much smaller than RSA’s.
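
A minimal sketch of key agreement, again assuming the third-party cryptography package: each side combines its own private key with the other side’s public key, both arrive at the same shared secret without ever transmitting it, and a key-derivation function turns that secret into a symmetric key.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates its own elliptic-curve key pair (NIST curve P-256 here).
alice_private = ec.generate_private_key(ec.SECP256R1())
bob_private = ec.generate_private_key(ec.SECP256R1())

# Only the public keys are exchanged; each side combines the peer's public key with its own private key.
alice_shared = alice_private.exchange(ec.ECDH(), bob_private.public_key())
bob_shared = bob_private.exchange(ec.ECDH(), alice_private.public_key())
assert alice_shared == bob_shared  # both computed the same secret

# Derive a 256-bit symmetric key from the shared secret for use with AES.
symmetric_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                     info=b"illustrative ECDH example").derive(alice_shared)
```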

 

The Math

The Diffie-Hellman paper posited that there are many mathematical problems that are very difficult to solve unless one is provided a bit of additional information. People immediately set out to determine which math problems could be used as the basis for an asymmetric encryption algorithm. Ron Rivest, Adi Shamir, and Leonard Adleman based their approach on the difficulty of factoring large numbers. They published their RSA algorithm in 1977.

Factoring a number is determining what values can be multiplied to produce that number. Obviously, 1 is a factor of any number. What about 20? Its factors are 1, 2, 4, 5, 10, and 20. What about 19? Its only factors are 1 and 19; it is a prime number, a number that can be divided evenly only by one and itself.

It’s easy to multiply two prime numbers; for example, 907 times 773 equals 701,111. But it’s difficult to determine those two factors if all you have is the product. If you know one of the factors, it’s easy to determine the other using simple division. Problems that are easy to compute in one direction but almost impossible to reverse are called trapdoor functions. Although you could use a calculator to solve this example, the modulus of a 2048-bit RSA key is 617 decimal digits long, and factoring a number of that size is intractable for classical computers.
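
A small Python illustration of that asymmetry, using the toy primes from the example above: multiplication is a single step, while recovering the factors by trial division means testing candidate divisors one after another, and the work explodes as the numbers grow.

```python
p, q = 907, 773
n = p * q          # the easy direction: one multiplication
print(n)           # 701111

def factor_by_trial_division(n: int) -> tuple[int, int]:
    """The hard direction: try every candidate divisor until one works."""
    candidate = 2
    while candidate * candidate <= n:
        if n % candidate == 0:
            return candidate, n // candidate
        candidate += 1
    return 1, n  # n is prime

print(factor_by_trial_division(701111))  # (773, 907)
print(len(str(2 ** 2048)))               # 617, the digit count of a 2048-bit modulus
```

Real factoring algorithms are far cleverer than trial division, but even the best known classical methods cannot touch a well-chosen 2048-bit modulus.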

Factoring very large numbers is extremely difficult. RSA created the RSA challenge in which it offered large cash prizes to anyone factoring a set of values computed by multiplying two large primes. A hundred-digit number, 330 bits, was successfully factored in 1991; a 250-digit number, 829 bits, was factored in 2020. Much of this improvement can be attributed to an increase in computing power. The recommended minimum RSA key size has been increased as computers became more powerful.

NIST publishes a standard, SP 800-131A, Transitioning the Use of Cryptographic Algorithms and Key Lengths. Its most recent revision, published in 2019, recommends an RSA minimum key size of 2048 bits as adequate until 2030. This increase of key size for RSA has made the use of Elliptic Curve Cryptography (ECC) more attractive due to its significantly reduced key size.

The math problem supporting ECC is more complex, so we won’t take a deep dive here. Essentially, an elliptic curve is the set of points generated by an equation of the form y² = x³ + ax + b. The difficulty is computing discrete logarithms in elliptic curve groups. The tools used to help determine factors have no equivalent here.

ECC’s real advantage is that its key size is much smaller than RSA’s. NIST SP 800-131A recommends a minimum ECC key size of 224 bits (28 bytes). This reduction, and the attendant reduction in processing cost, makes ECC an attractive choice for environments with restricted resources, such as Internet of Things devices.

Steady and incremental improvement in computing power and algorithm efficiency is anticipated and is used to determine the key-size lifetimes published in NIST’s SP 800-131A. Quantum computing could disrupt that dependence on steady, incremental progress, but there are those who believe that a mathematical breakthrough in the factoring of large numbers is more likely to occur before an effective quantum computer is created.

WHAT IS QUANTUM COMPUTING?

It is acceptable not to have an entirely clear idea of what a quantum computer is or how it works. What is important is understanding the “effect” of what a quantum computer can do. If you do want to understand how it works, Quantum Computation and Quantum Information by Michael Nielsen and Isaac Chuang provides an in-depth, math-based description. More approachable explanations can be found with an internet search for “quantum computing.”

For a quantum computer to work, its components need to be cooled to almost absolute zero. At that temperature, the quantum aspects of those components can be manipulated, leading to two properties important for computing—superposition and entanglement.

As most of us know, a classical computer functions using “bits” that are either off or on to represent zero or one. Quantum computers have “qubits”; they follow the laws of quantum mechanics, where physics works differently at the atomic and subatomic levels. A particle of light, i.e., a photon, can be anywhere. This state is called superposition; its location collapses to a single point when it is observed. Qubits use this state to represent all the possible values until they collapse to one value. For example, if qubits are used to represent all the paths in a maze, they will collapse to one value to represent the path through the maze.

One should avoid the temptation to restate this in classical terms by treating the solution of the maze as equivalent to using parallel processing to examine all paths. Each parallel process working the problem will produce a result; one then still must examine the results to select the solution. Using superposition, the qubits’ representation of all possible solutions will collapse to one result, e.g., the path through the maze.

Entanglement is the property that allows multiple qubits to be set to the same state even if the qubits are not in contact with each other. The power of an aggregation of qubits grows exponentially, as a power of two: two qubits can represent four possible values, three qubits eight, and so on.
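
The growth in the number of representable states is ordinary arithmetic; the short Python loop below only illustrates the counting claim and does not simulate qubits.

```python
# Number of basis states that n qubits can represent simultaneously: 2 ** n
for n in (1, 2, 10, 50):
    print(f"{n} qubit(s) -> {2 ** n:,} possible values")

# 1 qubit(s) -> 2
# 2 qubit(s) -> 4
# 10 qubit(s) -> 1,024
# 50 qubit(s) -> 1,125,899,906,842,624
```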

 

Shor's Algorithm

In 1994 Peter Shor developed a quantum algorithm that would quickly determine the prime numbers that were multiplied to create the large product used to create public and private keys. He also developed a way to compute discrete logarithms, including those in elliptic curve groups.

In 2001 IBM used a seven-qubit quantum computer to apply Shor’s algorithm to the number fifteen, factoring it into three and five. Although that’s a small number, the result demonstrated that the algorithm works.
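
The quantum computer’s contribution in Shor’s algorithm is finding the “period” of a chosen number modulo the value being factored; the rest is classical arithmetic. The sketch below works through the fifteen example in plain Python, with a brute-force search standing in for the quantum period-finding step.

```python
from math import gcd

N = 15  # the number to factor
a = 7   # a chosen number with no factor in common with N

# Quantum step (simulated by brute force here): find the period r where a**r leaves remainder 1 modulo N.
r = next(r for r in range(1, N) if pow(a, r, N) == 1)
print(r)                 # 4

# Classical post-processing: for an even period, gcd(a**(r//2) - 1, N) and gcd(a**(r//2) + 1, N) reveal the factors.
half = pow(a, r // 2)    # 7**2 == 49
print(gcd(half - 1, N))  # gcd(48, 15) == 3
print(gcd(half + 1, N))  # gcd(50, 15) == 5
```

For a 2048-bit modulus the period-finding step is hopeless for a classical machine, which is exactly the work a large, stable quantum computer would take over.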

A paper published in 2017 by Microsoft researchers estimated that it would take four thousand qubits to break RSA and twenty-five hundred to break elliptic curve cryptography. The paper discusses perfect qubits that would remain stable for the length of the computation. Such qubits don’t exist; current quantum computers are quite noisy. There are several proposals for error correction, but most require many additional qubits. Some pessimistic estimates put that number in the millions.

 

Grover's Algorithm

Lov Grover created an algorithm in 1996 that offers a quadratic speedup for a variety of search problems. It is claimed that using this algorithm would cut the effective strength of AES encryption in half; a 128-bit key would then offer only about 64 bits of security against a quantum attacker, which is why the usual recommendation is to double symmetric key sizes.
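
The arithmetic behind that claim is simple, though real-world estimates involve more factors than this rough illustration: Grover’s algorithm searches N possibilities in roughly the square root of N steps.

```python
import math

keyspace = 2 ** 128                  # possible AES-128 keys
classical_tries = keyspace           # classical brute force examines them one by one
grover_tries = math.isqrt(keyspace)  # roughly sqrt(N), which is 2 ** 64 quantum iterations

print(math.log2(classical_tries))       # 128.0
print(math.log2(grover_tries))          # 64.0, the "strength cut in half"
print(math.log2(math.isqrt(2 ** 256)))  # 128.0, so a 256-bit key still leaves 2**128 steps of work
```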

 

The NIST Competition 

Yet again NIST is conducting an open competition, this time to select one or more quantum-resistant encryption algorithms as a national standard. Two problems will remain after that selection: implementations will have to incorporate the necessary algorithms, and they will have to handle much larger key sizes. Both problems may be particularly difficult for endpoint devices. There are still devices, such as ATMs, with hardware implementations of DES, an algorithm standardized in 1977. Its 56-bit key is too weak, and NIST has deprecated even the stronger triple-DES, disallowing its use for encryption after 2023.

All cryptographic implementations, including those in hardware, should have the ability to be upgraded to support new algorithms, even after deployment. This is often called “cryptographic agility.”

KEEP CALM AND CARRY ON

Attorneys should become aware of and monitor the quantum computing effort. Labs all over the world are building quantum computers with increasing numbers of qubits; however, the press releases rarely speak of the fragility of the quantum states or of the presence of effective error correction. Keep calm until you see announcements of stable quantum states or effective error correction.

When such announcements occur, there are two areas of concern: maintaining the integrity of signed documents, especially those with long lives, and securing communication. Because there have been several bumps in the minimum acceptable key size, there should already be a plan in place to protect documents signed with key sizes that are no longer acceptable. Typically, that is done by digitally signing collections of documents that were signed with a key now considered too weak. Whatever approach is taken, it should also handle the move to post-quantum digital signature algorithms. Securing communication requires that post-quantum key establishment algorithms be deployed.

NIST is doing its part by evaluating, selecting, and issuing standards for post-quantum cryptographic algorithms. Attorneys with clients acquiring products containing encryption algorithms should advise them to select products with cryptographic agility. Those products should have the ability to support quantum-resistant encryption algorithms and the ability to double the key size for their symmetric algorithms.

Attorneys with clients who produce products containing encryption algorithms should advise them that, if their products do not support cryptographic agility, they could be found negligent for not being able to resist attacks using quantum computing in the future.

Hoyt L. Kesterson II is a senior security and risk architect with Avertium with forty years of experience in information security. For twenty-one years he chaired the international standards group that created the X.509 public-key certificate, a fundamental component in digital signature and securing web transactions. He is a testifying expert. He is a PCI QSA and holds the CISSP and CISA certifications. He is the co-chair and founding member of the Science & Technology Law Section’s Information Security Committee.

 

 
