New Quantum-Safe Cryptographic Standards: Future-Proofing Financial Security in the Quantum Age

September 19, 2024

The National Institute of Standards and Technology (NIST) has released three new cryptographic standards, ML-KEM, ML-DSA, and SLH-DSA, designed to protect sensitive data against the emerging threat of quantum computing. These standards are crucial for the financial services sector, which is particularly vulnerable to the risks posed by quantum technology.

The quantum computing threat: A looming crisis

Quantum computers are on the horizon, poised to revolutionize computing with their ability to solve certain classes of problems, such as integer factorization, exponentially faster than classical computers. This technological leap, however, comes with significant security implications. Current public-key cryptographic methods, which safeguard everything from financial transactions to customer data, could be rendered obsolete by quantum algorithms, most notably Shor's algorithm, that are capable of breaking these traditional encryption and signature schemes.

One of the most pressing concerns is the “harvest-now, decrypt-later” threat. Malicious actors may already be intercepting and storing encrypted data today, with the intent to decrypt it later once quantum computers become sufficiently powerful. This means that sensitive financial data, thought to be secure now, could be exposed in the future when quantum technology matures.

Future-proofing financial security

NIST’s newly released standards are the result of extensive research and development aimed at countering the quantum threat. These standards are designed to withstand the capabilities of quantum computers, providing a robust defense against future decryption attempts.

  • ML-KEM (Module-Lattice-Based Key-Encapsulation Mechanism, FIPS 203): A general-purpose key-establishment standard suitable for securing data in transit across various applications (a minimal usage sketch follows this list).
  • ML-DSA (Module-Lattice-Based Digital Signature Algorithm, FIPS 204): A lattice-based algorithm intended as the primary general-purpose digital signature standard.
  • SLH-DSA (Stateless Hash-Based Digital Signature Algorithm, FIPS 205): A stateless hash-based digital signature scheme, intended primarily as a backup to ML-DSA.
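For illustration, the sketch below shows the encapsulate/decapsulate flow that ML-KEM standardizes, using the open-source liboqs-python bindings (the `oqs` module). The mechanism name "ML-KEM-768" and the exact method names are assumptions that depend on the installed liboqs version; this is a minimal sketch, not a production key-exchange implementation.

```python
# Minimal ML-KEM key-establishment sketch using liboqs-python (the "oqs" module).
# Assumes a liboqs build that exposes the "ML-KEM-768" mechanism name; older
# releases may expose it as "Kyber768" instead.
import oqs

MECH = "ML-KEM-768"  # assumed mechanism name

# Receiver: generate a key pair and publish the public key.
with oqs.KeyEncapsulation(MECH) as receiver:
    public_key = receiver.generate_keypair()

    # Sender: encapsulate against the receiver's public key, producing a
    # ciphertext to transmit plus a locally held shared secret.
    with oqs.KeyEncapsulation(MECH) as sender:
        ciphertext, sender_secret = sender.encap_secret(public_key)

    # Receiver: decapsulate the ciphertext to recover the same shared secret.
    receiver_secret = receiver.decap_secret(ciphertext)

assert sender_secret == receiver_secret  # both sides now hold the same secret
```

In practice, the shared secret would then be fed into a key-derivation function to key a symmetric cipher such as AES-GCM for the bulk data.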

The complexity of implementing quantum-safe standards

While the importance of adopting these new standards cannot be overstated, the complexity of their implementation must also be acknowledged. Transitioning to quantum-safe algorithms involves more than simply updating software; it requires a deep understanding of cryptographic principles and of the potential pitfalls of implementation, and it will almost certainly require new crypto-agile architectures.
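As one way to picture crypto-agility, the sketch below keeps signature algorithms behind a small interface so that an ML-DSA implementation could later be dropped in alongside, or in place of, a classical scheme without touching calling code. The `Signer` interface, the registry, and the Ed25519 placeholder are illustrative assumptions, not a prescribed design.

```python
# A minimal crypto-agility sketch: callers depend on a small Signer interface,
# so the concrete algorithm can be swapped (e.g., for ML-DSA) via configuration.
from abc import ABC, abstractmethod
from cryptography.hazmat.primitives.asymmetric import ed25519


class Signer(ABC):
    @abstractmethod
    def sign(self, message: bytes) -> bytes: ...

    @abstractmethod
    def verify(self, message: bytes, signature: bytes) -> bool: ...


class Ed25519Signer(Signer):
    """Classical placeholder; a quantum-safe ML-DSA signer would plug in here."""

    def __init__(self) -> None:
        self._key = ed25519.Ed25519PrivateKey.generate()

    def sign(self, message: bytes) -> bytes:
        return self._key.sign(message)

    def verify(self, message: bytes, signature: bytes) -> bool:
        try:
            self._key.public_key().verify(signature, message)
            return True
        except Exception:
            return False


# A registry lets the algorithm be chosen by name from configuration,
# which is the essence of a crypto-agile design.
SIGNERS = {"ed25519": Ed25519Signer}  # e.g., register "ml-dsa-65" when available

signer = SIGNERS["ed25519"]()
sig = signer.sign(b"payment instruction")
assert signer.verify(b"payment instruction", sig)
```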

One critical component in this transition is the use of cryptographically secure pseudorandom number generators (CSPRNGs). CSPRNGs are essential for generating keys and nonces that are unpredictable to an attacker and, therefore, secure. With the introduction of these new algorithms, inadequate random number generation can very quickly lead to vulnerabilities, undermining the strength of even the most advanced cryptographic algorithms. Ensuring that CSPRNGs are correctly implemented is a foundational step in securing cryptographic systems against both current and future threats.
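In Python, for example, key material should come from an OS-backed CSPRNG such as the `secrets` module or `os.urandom`, never from the general-purpose `random` module, whose Mersenne Twister output becomes predictable once enough of it has been observed. A brief sketch:

```python
# Generating key material: use an OS-backed CSPRNG, never a statistical PRNG.
import secrets
import random

# Correct: 256 bits of key material and a nonce from the OS CSPRNG.
symmetric_key = secrets.token_bytes(32)
nonce = secrets.token_bytes(12)

# Wrong for cryptography: random.getrandbits() uses the Mersenne Twister,
# whose internal state (and thus all future output) can be reconstructed
# from observed outputs. Shown here only as a counter-example.
weak_key = random.getrandbits(256).to_bytes(32, "big")
```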

Moreover, careful implementation of these quantum-safe algorithms is crucial to avoid side-channel attacks. Side-channel attacks exploit physical or logical data leaks during cryptographic processing, such as timing information or power consumption, to gain unauthorized access to the encrypted data or cryptographic key material. Proper implementation of the new standards must account for these risks by employing best practices in algorithm deployment, hardware security, and system architecture.
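One small but representative example of side-channel hygiene is comparing secrets in constant time. A naive comparison can return as soon as it finds a mismatch, so its running time leaks how many leading bytes matched; Python's `hmac.compare_digest` is designed to avoid that timing leak:

```python
# Comparing an authentication tag: timing-safe vs. naive comparison.
import hmac

def verify_tag_naive(expected: bytes, received: bytes) -> bool:
    # Leaky: '==' can stop at the first differing byte, so response time
    # reveals how much of the tag an attacker has guessed correctly.
    return expected == received

def verify_tag_constant_time(expected: bytes, received: bytes) -> bool:
    # Constant-time comparison intended for secrets such as MACs and tags.
    return hmac.compare_digest(expected, received)
```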

Performance considerations and challenges in high-performance environments

While the newly introduced quantum-safe cryptographic standards offer robust security against quantum threats, they come with certain trade-offs, particularly in terms of performance. Compared with traditional algorithms such as RSA and elliptic-curve cryptography, the new standards generally involve larger keys, ciphertexts, and signatures, and in some cases higher computational cost, which can reduce throughput. This is especially true in high-performance environments where encryption, decryption, and signing must be performed rapidly, such as real-time financial transactions or high-frequency trading systems.
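A simple way to quantify this on a given platform is to microbenchmark the operations that matter for the workload, for example signing throughput. The sketch below times ML-DSA signing via the liboqs-python bindings against RSA-2048 signing from the `cryptography` package; the mechanism name "ML-DSA-65" is an assumption that depends on the installed liboqs version, and the numbers are only meaningful on your own hardware.

```python
# Rough signing micro-benchmark: ML-DSA (via liboqs-python) vs. RSA-2048.
import time
import oqs
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

MESSAGE = b"x" * 256
ITERATIONS = 200

def benchmark(label, sign_once):
    start = time.perf_counter()
    for _ in range(ITERATIONS):
        sign_once()
    elapsed = time.perf_counter() - start
    print(f"{label}: {ITERATIONS / elapsed:.0f} signatures/second")

# ML-DSA via liboqs-python; "ML-DSA-65" is an assumed mechanism name and may
# appear as "Dilithium3" in older liboqs releases.
with oqs.Signature("ML-DSA-65") as ml_dsa:
    ml_dsa.generate_keypair()
    benchmark("ML-DSA-65", lambda: ml_dsa.sign(MESSAGE))

# RSA-2048 with PSS padding for comparison.
rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
benchmark(
    "RSA-2048",
    lambda: rsa_key.sign(
        MESSAGE,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    ),
)
```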

To mitigate the performance impact, hardware acceleration may be necessary. Specialized hardware, such as field-programmable gate arrays (FPGAs) or dedicated cryptographic processors, can be employed to offload the computational burden and maintain the required performance levels. However, this introduces additional complexity, especially in virtualized environments where such hardware is not typically available or easily integrated.

In virtualized or cloud-based infrastructures, where scalability and flexibility are paramount, the introduction of quantum-safe algorithms may necessitate significant re-architecting of systems. The reliance on hardware acceleration in such environments could undermine the inherent benefits of virtualization, such as resource pooling and dynamic provisioning. Consequently, organizations may need to rethink their infrastructure design to balance security with performance, potentially leading to increased costs and complexity.

Additionally, the increased computational demands of ML-KEM, ML-DSA, and SLH-DSA could affect latency-sensitive applications, necessitating further optimization and tuning of systems to ensure that service-level agreements (SLAs) are met without compromising security.

The urgency of early implementation

The timeline for the practical deployment of quantum computers remains uncertain, but the potential risks they pose are immediate. The financial services sector must act now to integrate these quantum-safe standards to protect against future threats. Failure to do so could lead to catastrophic breaches, resulting in severe financial and reputational damage.

The “harvest-now, decrypt-later” threat underscores the urgency: data compromised today can be exploited in the future, making it imperative that financial institutions in particular transition to quantum-safe encryption as quickly as possible. The adoption of these standards is not merely a strategic advantage but a necessity to ensure long-term data security.

Preparing for these challenges

To address these challenges, organizations must not only adopt the new cryptographic standards but also invest in the necessary infrastructure upgrades and optimizations. This might include deploying hardware acceleration in data centers, re-architecting virtualized environments to better accommodate these new algorithms, and conducting thorough performance testing to identify and mitigate potential bottlenecks.

Challenges of implementing quantum-safe standards in mainframe and legacy systems

One of the most significant challenges facing financial institutions is the implementation of the new quantum-safe cryptographic standards in mainframes and other legacy systems. These systems, which are often critical to the core operations of financial services, were not designed with quantum-safe cryptography in mind and may lack the capacity for straightforward upgrades.

Mainframes, in particular, are known for their reliability, scalability, and ability to process large volumes of transactions. However, they often run on proprietary or outdated software and hardware architectures that may not be easily compatible with the computational demands of the new cryptographic algorithms. Implementing ML-KEM, ML-DSA, and SLH-DSA in such environments could require extensive modifications to existing systems, which can be both costly and time-consuming, even when scarce mainframe resources are available.

Furthermore, some legacy systems might not support the necessary hardware acceleration required to maintain performance when using these more resource-intensive algorithms. This could lead to a significant degradation in system performance, which is particularly problematic in environments where transaction speed and efficiency are paramount.

In cases where direct upgrades are not feasible, financial institutions may need to consider alternative approaches, such as:

  • Middleware solutions: Deploying middleware that can interface between legacy systems and newer cryptographic standards, ensuring secure communication without requiring a complete overhaul of existing infrastructure (see the sketch after this list).
  • System segmentation: Isolating and segmenting critical legacy systems that cannot be upgraded, while introducing quantum-safe encryption in other parts of the infrastructure to mitigate overall risk.
  • Gradual migration: Planning a phased migration to newer systems or platforms that are designed to support quantum-safe algorithms, thereby reducing reliance on legacy infrastructure over time.
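As a rough illustration of the middleware option above, the sketch below wraps a legacy payload in a quantum-safe envelope: an ML-KEM shared secret (again via the liboqs-python bindings, with an assumed "ML-KEM-768" mechanism name) is run through HKDF to derive an AES-256-GCM key that encrypts the payload before it leaves the legacy boundary. The function and field names are hypothetical, and a real deployment would also need authentication, key management, and error handling.

```python
# Hypothetical middleware sketch: wrap a legacy payload in a quantum-safe envelope.
import os
import oqs
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

MECH = "ML-KEM-768"  # assumed liboqs mechanism name

def wrap_for_transport(legacy_payload: bytes, recipient_public_key: bytes) -> dict:
    """Encrypt a legacy payload under a key established with ML-KEM."""
    with oqs.KeyEncapsulation(MECH) as kem:
        kem_ciphertext, shared_secret = kem.encap_secret(recipient_public_key)

    # Derive a fixed-length AES-256-GCM key from the KEM shared secret.
    aes_key = HKDF(
        algorithm=hashes.SHA256(),
        length=32,
        salt=None,
        info=b"legacy-envelope-v1",  # hypothetical context label
    ).derive(shared_secret)

    nonce = os.urandom(12)
    sealed = AESGCM(aes_key).encrypt(nonce, legacy_payload, None)

    # The envelope travels in place of the original plaintext message.
    return {"kem_ciphertext": kem_ciphertext, "nonce": nonce, "payload": sealed}
```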

Preparing for legacy system challenges

Addressing the challenge of implementing quantum-safe standards in mainframe and legacy environments requires a strategic approach. Financial institutions must carefully assess the capabilities of their existing infrastructure and explore viable paths for integration. This might involve working closely with vendors to develop custom solutions or investing in the modernization of critical systems to ensure they are future-proof.

The views and opinions in these articles are solely of the authors and do not necessarily reflect those of Bespoke Business Development. They are offered to stimulate thought and discussion and not as legal, financial, accounting, tax or other professional advice or counsel.