\documentclass[12pt,a4paper]{article}
\usepackage[margin=1in]{geometry}
\usepackage[T1]{fontenc}
\usepackage[utf8]{inputenc}
\usepackage{graphicx}
\usepackage{float}
\usepackage{amsmath}
\usepackage{booktabs}
\usepackage{tabularx}
\usepackage{xurl}
\usepackage[hidelinks]{hyperref}
\usepackage{setspace}
\usepackage{parskip}
\setstretch{1.15}
\setlength{\parindent}{0pt}
\emergencystretch=2em
\title{To What Extent Have Existing Encryption Methods Become Obsolete in the Presence of Quantum Key Distribution?}
\author{Talen Mudaly}
\date{}
\begin{document}
\maketitle
\begin{abstract}
As quantum computing power continues to accelerate, the risk of its use in malicious brute-force attacks rises concurrently, creating an increased imperative for novel, robust methods of protecting sensitive data. Quantum Key Distribution (QKD) offers a potential solution. As a quantum-resistant method of key exchange, it encodes key material in measurements of photons and uses a classical communication channel to agree which measurement results to discard, so that any eavesdropping disturbs the quantum states and is detectable (Asif, 2021). Because QKD depends on this secure classical channel, it cannot replace existing classical encryption outright. However, the rise of Post-Quantum Cryptography (PQC) offers a means of superseding existing methods: PQC aims to replace classical algorithms directly, and it significantly strengthens QKD when the two are used in conjunction. QKD provides the stronger form of quantum resistance, while PQC provides somewhat weaker resistance but runs on classical computers, meaning it can be deployed on current technology (Quantropi, n.d.). The study finds that although PQC alone cannot provide a completely secure alternative, when used alongside QKD it can create one.
\end{abstract}
\section*{List of Abbreviations}
\begin{table}[H]
\centering
\begin{tabularx}{\textwidth}{@{}l l@{}}
\toprule
Full Term & Abbreviation \\
\midrule
Quantum Key Distribution & QKD \\
Post-Quantum Cryptography & PQC \\
Continuous Variable Quantum Key Distribution & CV-QKD \\
Discrete Variable Quantum Key Distribution & DV-QKD \\
Rivest--Shamir--Adleman & RSA \\
Elliptic Curve Cryptography & ECC \\
\bottomrule
\end{tabularx}
\end{table}
\section{Introduction}
The amount of data stored globally has increased exponentially since the dawn of the digital age. The International Data Corporation (IDC) estimates that global data storage will reach 175 zettabytes in 2025 (Reinsel et al., 2017), whilst Cybersecurity Ventures' 2025 prediction places the total in excess of 200 zettabytes (Morgan, 2024). To quantify such an astronomical figure, it is equivalent to 40 trillion years of digitally stored video, and could fill 1.5 trillion standard 128GB iPhones. As the volume of this data continues its rapid ascent, so too does its sensitivity, encompassing increasingly personal, financial, and strategic information belonging to individuals and large businesses alike (Embroker, 2025). Consequently, this growing treasure trove of data attracts a parallel surge in malicious actors seeking to exploit vulnerabilities for illicit gain, making the threat of cyberattacks an ever-present and escalating danger (SentinelOne, 2024).
\subsection{How Data Is Currently Kept Secure}
Encryption stands as a central pillar of technological security, a line of defence that protects data stored on servers and personal computers alike. Some of the most commonly used encryption algorithms are the Advanced Encryption Standard (AES), Blowfish and Elliptic Curve Cryptography (ECC) (Ahmad and Naeem, 2022). These algorithms form the foundation of current cybersecurity, depending on mathematical problems that are computationally prohibitive for computers to solve. The current standard for encryption recommended by the National Institute of Standards and Technology (NIST) is AES (Rivera, 2024). Although recommended by the US government, it was created through an international cryptographic competition and has been used across the world since it was standardised in 2001 (National Inventors Hall of Fame, n.d.).
However, as companies release more advanced quantum computers, such as Microsoft's Majorana 1 in February 2025 or Google's Willow chip in December 2024, the threat of cracking the global standard of encryption and rendering current encryption algorithms vulnerable is becoming a more likely reality (SecurityWeek, 2025). This is because quantum computing promises processing power that would make current algorithms susceptible to brute-force cyberattacks, in which a computer tries each possible private key until it finds the correct one. Quantum computing remains in its infancy, yet once the principles of quantum chip manufacturing are mastered and become commercially viable (Canoera, 2025), computing on a subatomic scale will deliver far more computing power, potentially rendering traditional encryption methods obsolete and leaving sensitive data at risk.
\subsection{Quantum Key Distribution: A Possible Solution?}
Quantum Key Distribution (QKD) presents a fascinating possible solution to the obsolescence traditional encryption methods may face, offering both a potential replacement and a stimulus for their evolution (Bedington et al., 2017). By drawing on different mathematical and physical principles, QKD offers a way to protect personal information against quantum computers that might otherwise access encrypted data. Whilst QKD may offer this crucial solution, it is imperative to ensure it has the capacity to withstand the potential threats posed by the continued advancement of quantum computing (Aquina et al., 2025).
Whilst the mathematics underpinning QKD has been researched extensively, comparatively little research has been conducted to assess its long-term viability, compatibility, and practical implementation in addressing the impending threat of quantum computers. Currently, research identifies the ways QKD can be practically applied, but little emphasis has been placed on its consolidation within the cybersecurity infrastructure (Aquina et al., 2025). The need for thorough investigation of the resilience and real-world compatibility of QKD against evolving quantum computing power and existing defence infrastructures is a research gap this study seeks to fill.
\section{Rationale}
In order to ascertain the suitability of QKD to mitigate the burgeoning threat of quantum computers, it is first important to consider the current standards of encryption in 2025 and how they may become vulnerable to quantum computing breakthroughs. Alongside Blowfish and ECC, AES is relied upon for the majority of current websites' encryption, data servers and virtual private networks (VPNs) (NIST, 2023). The impending danger that quantum computing poses to these systems has yet to be addressed adequately.
There is already a plethora of research into optimising quantum computer algorithms theoretically, but not enough research addressing a practical pivot to quantum security using existing computers and frameworks (Alagic, Moseley, and Winick, 2025). Recent studies have focused on the power and possibilities of quantum error correction (QEC), which is essential for optimising the computing power of quantum computers and for manufacturing them (Wu and Zhong, 2023). QEC is crucial for mitigating errors that occur due to the fragile nature of quantum states, therefore enabling more reliable quantum computing (Caune et al., 2024).
The focus of this study addresses this research gap, considering how QKD may offer a solution that aids cybersecurity by taking advantage of quantum computing. This approach considers the strength of current encryption methods against classical and quantum cyberattacks, reviewing how viable current encryption methods will remain in the presence of QKD. An important factor is that QKD must have another encryption channel working alongside it. As a result, Post-Quantum Cryptography, another vital method in the defence against quantum computer attacks, will need to be explored in order to draw valid conclusions on whether current encryption methods will become obsolete or merely integrated alongside QKD (ETSI, n.d.).
However, some believe PQC is a competitor or alternative to QKD, even though both use classical encryption to an extent, while others argue they both have unique use cases and are better for specific scenarios (Quantropi, 2023; Aquina et al., 2025). To fully consider the potential of QKD and PQC, an examination of the current state of encryption on classical computers must be conducted. This examination will include reviewing the mathematical foundations of these algorithms and what makes them unbreakable for classical computers. The implications of quantum computing are not yet clear; therefore, it is important to account for quantum computing's possibilities but also its shortcomings due to its infancy.
\section{Background}
\subsection{How Data Is Transmitted}
To transfer data between devices on the same network, the data is split into equally sized packets. Each packet carries a piece of information called metadata, specifying both the origin and the destination of the data along with other vital information such as the data type (Cloudflare, 2025). This is fundamental to a process named packet switching, in which packets travel independently, potentially along different routes, and are reassembled into the original data only at their destination (Lenovo, n.d.). The Internet Protocol (IP) is fundamental to this process, as it governs the routing of these packets across networks to their intended recipient. While other protocol layers also contribute to safe and fast data transfer, such as TCP (Shuler, 2002), the core function of IP is to provide the addressing framework for internet communication.
\subsection{The Role of Cybersecurity in Secure Communication}
Encryption is integral to secure data transmission. Although users may not be aware of its operation, encryption is routinely employed whenever data is accessed or transferred. A prevalent encryption framework for secure web communication is Transport Layer Security (TLS), which is integrated into HyperText Transfer Protocol Secure (HTTPS) requests. The initial iteration of web communication, HyperText Transfer Protocol (HTTP), lacked inherent encryption, rendering data transmissions vulnerable as they were sent in plaintext (EFF, 2021). The subsequent adoption of HTTPS, incorporating TLS and supporting cipher suites such as AES (Cloudflare, n.d.), introduced robust encryption mechanisms based on a hybrid of asymmetric and symmetric cryptographic techniques, representing a significant advancement in online security. This development alone within the internet's infrastructure illustrates its ability to adapt in the realm of security.
\subsection{The Economic Reliance}
Cybersecurity is the cornerstone of all online transactions. A testament to this is that the UK has its own dedicated cybersecurity department, the National Cyber Security Centre (NCSC). The aim of cybersecurity is to ensure that living and working online is safe. This is especially important in a world that tends towards work being done from home and online. According to the Office for National Statistics (ONS), between September 2022 and January 2023, 16\% of working adults exclusively worked from home, while 28\% engaged in hybrid working (ONS, 2023). This means that a large proportion of UK workers already rely on data transfer over secure communication services to work, with 99\% of all businesses with at least 10 employees storing some form of digital data (DSIT, 2024). Many businesses understand the risks that poor cybersecurity imposes, with 43\% of businesses reporting some form of cyberattack in the last 12 months according to the Cyber Security Breaches Survey (DSIT and Home Office, 2025).
\section{Assessing Traditional Encryption Methods}
\subsection{The Current Industry Standard}
The most common cryptographic methods in industry can be grouped into three different types: symmetric encryption, asymmetric encryption, and hash functions.
\subsubsection{Hash Functions}
Hash functions are the building blocks of modern cryptography (IBM Quantum Learning, n.d.; NIST, n.d.). Their strength lies in collision resistance, preimage resistance, and second preimage resistance (NIST, n.d.), which is why they are commonly used for the generation and verification of digital signatures (NIST, n.d.). Collision resistance means that, for a hash function $H(x)$, it is computationally infeasible to find two different inputs that produce the same hash (NIST, n.d.). This is imperative to the security of hash functions, as it prevents nefarious individuals from fabricating transactions, an incredibly important feature in cases such as banking or cryptocurrency ledgers.
In contemporary cryptographic applications, hash functions play a crucial role in ensuring data integrity and security (IBM Quantum Learning, n.d.). Among the most widely adopted algorithms are the SHA-2 family, particularly SHA-256 and SHA-512, which provide robust collision resistance and are integral to blockchain technologies and digital signature verification methods such as fingerprinting (NIST, n.d.). SHA-3, the latest iteration in the Secure Hash Algorithm series, offers an alternative with distinct design principles, enhancing cryptographic diversity (NIST, n.d.; StudySmarter, n.d.). Additionally, bcrypt, a key-derivation function, is specifically engineered for password hashing, employing iterative processes to increase computational cost and mitigate brute-force attacks (Auth0, 2020). These algorithms are fundamental in various applications, from verifying file authenticity to securing authentication processes, reflecting their significance in maintaining the reliability and confidentiality of digital information (Xiphera, n.d.).
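As a brief illustration, the following Python sketch uses the standard hashlib module to show two of the properties described above: hashing is deterministic, while a one-character change in the input produces an unrelated digest (the message strings are hypothetical):

\begin{verbatim}
# A minimal sketch of hashing with SHA-256, using Python's
# standard hashlib module (illustrative, not a full protocol).
import hashlib

digest1 = hashlib.sha256(b"transfer 100 GBP to Alice").hexdigest()
digest2 = hashlib.sha256(b"transfer 100 GBP to Alicf").hexdigest()

print(digest1)  # deterministic: same input always gives same hash
print(digest2)  # a one-character change yields an unrelated digest

# Preimage resistance: given digest1 alone, recovering the
# original input should be computationally infeasible.
\end{verbatim}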
\subsubsection{Symmetric Encryption}
\begin{figure}[H]
\centering
\includegraphics[width=0.72\textwidth]{images/symmetric_encryption.png}
\caption{Symmetric encryption.}
\label{fig:symmetric-encryption}
\end{figure}
Different to hash functions, symmetric encryption involves a single key being used both to encrypt and to decrypt data. Whilst the simplicity and efficiency of this encryption method were once considered strengths, it has become more vulnerable as processing power has increased and the sophistication of cyberattacks has developed (Badman and Kosinski, 2024). While the algorithms themselves are generally secure, the vulnerability lies in the key being discovered by unwanted parties: because the same key encrypts and decrypts, an attacker who manages to steal or guess it can both read intercepted traffic and forge fake encrypted documents and communications.
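As a brief illustration of the single-key model (a minimal sketch assuming the third-party Python \texttt{cryptography} package, whose Fernet recipe uses AES internally), the same key object both encrypts and decrypts:

\begin{verbatim}
# A minimal sketch of symmetric encryption, assuming the
# third-party Python 'cryptography' package is installed.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # the single shared secret
cipher = Fernet(key)

token = cipher.encrypt(b"meeting at noon")
print(cipher.decrypt(token))     # b'meeting at noon'

# Anyone who obtains 'key' can decrypt past traffic AND
# forge new messages that look authentic.
\end{verbatim}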
\subsubsection{Asymmetric Encryption}
\begin{figure}[H]
\centering
\includegraphics[width=0.8\textwidth]{images/assymetric_encryption.png}
\caption{Asymmetric encryption.}
\label{fig:asymmetric-encryption}
\end{figure}
Asymmetric encryption's most notable difference to symmetric encryption is that each agent uses a public key and a private key, instead of just a shared private key. The public key is used for encrypting data and is openly available, whereas the private key is used for decrypting the data sent using the related public key and is kept secret by its owner. Both approaches have specific use cases; however, the general consensus is that asymmetric encryption is more secure, as no private key ever needs to be shared, only a public one, although it is generally slower (Daniel, 2023).
A specific use case of this is using Google's search engine. When a query is made, the data is encrypted using Google's public key. This data is then sent to Google's servers, where it is decrypted using the private key and the relevant information is retrieved (Comodo SSL Store, 2024). Each encryption method has different algorithms that implement it in industry.
In industry, asymmetric and symmetric encryption algorithms are generally used together in different parts of the data transfer (Badman and Kosinski, 2024). Whilst asymmetric encryption would be used for the signing as it is considered more secure, the transfer of data is generally symmetrically encrypted as it is faster and can handle more data at once (Badman and Kosinski, 2024).
\subsection{Mathematical Methods Behind Classical Algorithms}
The mathematical methods behind classical encryption represent a revolutionary shift from earlier cryptographic techniques towards abstract, seemingly unrelated mathematical problems that are computationally infeasible to reverse (Koblitz, 2010). They are also ever-changing, as many variants and new versions of each algorithm are released. These algorithms rely on abstract mathematical problems that classical computers find near impossible to solve, such as factoring a number into large primes or computing a discrete logarithm (Muller-Quade and Hulsing, 2015). Understanding the mathematics behind these algorithms is key to grasping how they ensure data security and privacy. The following is a methodical analysis of the principles used by RSA and ECC, the most commonly used asymmetric encryption algorithms (Badman and Kosinski, 2024).
\subsubsection{Asymmetric Encryption Mathematical Method}
\begin{figure}[H]
\centering
\includegraphics[width=0.92\textwidth]{images/image_transfer.png}
\caption{Data transfer through packet switching.}
\label{fig:image-transfer}
\end{figure}
To begin, the data is first converted into packets (Figure~\ref{fig:image-transfer}), allowing it to be encrypted and decrypted easily as computable numbers, then reassembled. Classical asymmetric encryption, such as RSA, employs mathematical principles that create a significant difference in difficulty between encrypting information and decrypting it without a specific secret key. This difference in difficulty is the foundation of its security.
\subsubsection{Modular Arithmetic}
The core of RSA encryption lies in the mathematical concept of modular arithmetic. Consider a system of modular arithmetic, where numbers, upon reaching a specific value, known as the modulus, wrap around to the beginning. For instance, in modulo 12, the number 15 is congruent to 3, denoted as $15 \equiv 3 \pmod{12}$ (Khan Academy, n.d.). Similarly, 27 is also congruent to 3 modulo 12, written as $27 \equiv 3 \pmod{12}$. The modulo operation essentially finds the remainder after division by the modulus. In the examples provided, multiples of 12 are subtracted from the original number until a non-negative remainder less than 12 is obtained. This cycling is what modular arithmetic represents. In RSA, the computer performs its mathematical operations within a cycle determined by the modulus $n$, which is made extremely large (Chamberlain and Korevaar, 2011).
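This wrap-around behaviour can be reproduced directly in Python, whose built-in \texttt{pow} function also performs the efficient modular exponentiation used in RSA:

\begin{verbatim}
# Modular arithmetic in Python: '%' gives the remainder.
print(15 % 12)      # 3, i.e. 15 = 3 (mod 12)
print(27 % 12)      # 3, i.e. 27 = 3 (mod 12)

# Modular exponentiation, the core RSA operation, is built in:
# pow(base, exponent, modulus) computes base**exponent % modulus
# efficiently even for very large numbers.
print(pow(7, 128, 12))
\end{verbatim}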
\paragraph{Typical RSA Encryption/Decryption Cycle.}
\textbf{Generating the keys}
\begin{itemize}
\item Two very large prime numbers are chosen, $p$ and $q$. These two prime numbers are multiplied to get $n$, which is part of the public key.
\item Another number $e$, the encryption exponent, is also chosen and becomes part of the public key. The encryption exponent $e$ has a specific mathematical relationship to $p$ and $q$ where $e$ has no common factors with $(p - 1)(q - 1)$.
\item A private key $d$, the decryption exponent, is calculated using $p$, $q$ and $e$. This decryption exponent must be kept private and is used to decrypt encrypted data.
\end{itemize}
\textbf{Encryption}
\begin{itemize}
\item To encrypt data, the public key $(n, e)$ must be used. The operation performed is modular exponentiation.
\item Modular exponentiation can be thought of as taking the data, raising it to the power of $e$, then finding the remainder after dividing by $n$.
\item The result of the scrambled data is called the ciphertext.
\end{itemize}
\textbf{Decryption}
\begin{itemize}
\item The decryption process requires the receiver to use the private key $(d, n)$.
\item The prime numbers $p$ and $q$ are never included in the private key itself: if $d$ is ever compromised, a new decryption exponent can be calculated from the same $p$ and $q$ by choosing a different $e$, changing both the public and private keys.
\item With the private key, another modular exponentiation is used with $d$.
\item Raising the ciphertext to the power of $d$ modulo $n$ reverses the encryption process and reveals the original message.
\end{itemize}
This mathematical method ensures that the decryption operation exactly reverses the encryption process, while remaining a one-way process for anyone holding only the public key. The security of RSA relies on an asymmetry: computing $p \times q = n$ is easy even for extremely large numbers, but factoring the large number $n$ back into its specific prime factors $p$ and $q$ is computationally infeasible. Because of this, even if someone knows $n$ and $e$, it is computationally infeasible to deduce $p$, $q$, or the private key $d$ (Crawford, 2023).
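The full cycle can be condensed into a short Python sketch using deliberately tiny primes drawn from a common textbook example (real keys use primes hundreds of digits long; \texttt{pow(e, -1, phi)} requires Python 3.8 or later):

\begin{verbatim}
# A toy RSA cycle with deliberately tiny primes; the numbers
# follow a standard textbook example and are illustrative only.

p, q = 61, 53                 # the two secret primes
n = p * q                     # 3233, part of the public key
phi = (p - 1) * (q - 1)       # 3120

e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # 2753, the private exponent

message = 65                  # a packet of data, as a number < n
ciphertext = pow(message, e, n)     # encrypt with the public key
recovered = pow(ciphertext, d, n)   # decrypt with the private key

print(ciphertext, recovered)  # 2790 65
\end{verbatim}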
The mathematical principles behind RSA are relatively simple considering its use in serious commercial applications. For example, online banking over TLS relied on RSA static key exchange until 2018, the year TLS 1.3 was released (Lee et al., 2021). Initially, adoption of the new standard was slow, with only 15\% of systems adopting TLS 1.3 after 264 days, roughly nine months. However, this rate increased rapidly, leading to a decline in the use of RSA key exchange and a rise in ECC-based methods (Learmonth et al., 2021).
\subsubsection{The Mathematical Methods Behind ECC}
While the security of RSA is significantly linked to the factorisation of $n = pq$, Elliptic Curve Cryptography (ECC) operates on a different mathematical foundation and does not have a single analogous defining equation. Instead, it relies on non-singular elliptic curves in the form
\[
y^2 = x^3 + ax + b
\]
where
\[
4a^3 + 27b^2 \neq 0
\]
to ensure that the elliptic curve is non-singular. This means that if the curve is written in the form $F(x, y) = 0$, it is said to be singular if there is a point on the curve at which both partial derivatives of $F$ are zero (O'Maley, 2005; Weisstein, 2025).
\begin{figure}[H]
\centering
\includegraphics[width=0.88\textwidth]{images/eliptic curve.png}
\caption{Examples of singular and non-singular elliptic curves.}
\label{fig:elliptic-curves}
\end{figure}
Figure~\ref{fig:elliptic-curves} displays examples of elliptic curves, both singular and non-singular. An important feature of every elliptic curve is its symmetry about the $x$-axis (Knutson, 2018). ECC works by adding two points on a non-singular elliptic curve. This is possible because of a unique feature of elliptic curves: their points, together with a point at infinity acting as the identity, form a mathematical group. This means that any two points added together result in a point which is also on the curve.
The operation of adding two points on the curve is defined as follows:
\begin{itemize}
\item To add two points $P$ and $Q$, a line is drawn through $P$ and $Q$.
\item This line will generally intersect the curve at exactly one other point.
\item The reflection of this intersection point across the $x$-axis gives the sum $P + Q$.
\end{itemize}
Another important concept in ECC is point doubling, which refers to adding a point to itself: a tangent line to the curve is drawn at that point, and its intersection with the curve, reflected across the $x$-axis, gives the result $2P$. ECC uses a public base point $P$ on the curve together with a private key $k$, a scalar; the corresponding public value is the point $kP$. Due to the group structure, computing $kP$ from $k$ and $P$ is not very computationally intensive, even over fields which produce very large groups, yet recovering $k$ from $P$ and $kP$, the elliptic curve discrete logarithm problem, is infeasible (Rickard, 2022).
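Under these rules, the group operation can be sketched in a few lines of Python over a deliberately tiny prime field (the curve parameters and points below are illustrative toy values, not a standardised curve):

\begin{verbatim}
# Point addition on y^2 = x^3 + ax + b over a prime field F_p,
# using the textbook affine formulas. Toy values for illustration.

p = 97                      # small prime modulus (toy value)
a, b = 2, 3                 # curve parameters

def add(P, Q):
    if P is None: return Q  # None is the point at infinity
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None         # P + (-P) = point at infinity
    if P == Q:              # point doubling: tangent slope
        s = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:                   # chord slope through P and Q
        s = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (s * s - x1 - x2) % p
    y3 = (s * (x1 - x3) - y1) % p   # reflect across the x-axis
    return (x3, y3)

P = (3, 6)          # on the curve: 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97)
print(add(P, P))    # 2P by point doubling: (80, 10)
\end{verbatim}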
The determining factor of how secure an ECC algorithm is lies in the finite field over which the curve is defined. The most common elliptic curves are defined over either prime fields ($F_p$, where $p$ is a large prime number) or binary fields ($F_{2^m}$, where $m$ is a positive integer). When an elliptic curve is defined over a prime field, the total number of points on the curve will typically be close to $p$, though it can differ slightly due to the structure of the curve and its equation.
The number of points $n$ on the curve is related to $p$ through Hasse's theorem (Soeten, 2019), which bounds how far $n$ can deviate from $p + 1$:
\[
\left| n - (p + 1) \right| \leq 2\sqrt{p}
\]
This means the number of points on the elliptic curve, $n$, will be close to $p$, but it can vary slightly based on the curve's specific parameters. The larger $p$ is, the larger $n$ will be, leading to a high order for the curve and therefore increased computational difficulty (Liew and Kamarulhaili, 2011). This makes predicting the exact points used on an elliptic curve nearly impossible for computers due to the magnitude of the groups. For example, a curve with an order of 256 bits leaves $2^{256}$ possible private keys; even with fast computational processing, an exhaustive search would take an amount of time too large to be expressed meaningfully in ordinary units (Paar and Pelzl, 2010).
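For a field as small as the toy curve used earlier, Hasse's bound can be verified directly by brute-force counting, an approach that is, of course, infeasible at cryptographic sizes:

\begin{verbatim}
# Brute-force point count on the toy curve y^2 = x^3 + 2x + 3
# over F_97, checked against Hasse's bound |n - (p+1)| <= 2*sqrt(p).
import math

p, a, b = 97, 2, 3
n = 1                                  # count the point at infinity
for x in range(p):
    rhs = (x**3 + a*x + b) % p
    for y in range(p):
        if (y * y) % p == rhs:
            n += 1                     # (x, y) lies on the curve

# The deviation |n - (p+1)| must not exceed 2*sqrt(97) ~ 19.7.
print(n, abs(n - (p + 1)) <= 2 * math.sqrt(p))
\end{verbatim}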
\section{The Dawn of Quantum Computing}
\begin{figure}[H]
\centering
\includegraphics[width=0.9\textwidth]{images/qubit.png}
\caption{Classical bits contrasted with quantum qubits.}
\label{fig:qubit}
\end{figure}
Quantum computing has completely changed the status quo of theoretical computational power (Brooks, 2025). Encryption algorithms previously seen as computationally infeasible to break on classical computers, such as ECC, rely on problems that quantum machines could solve immensely faster, in polynomial time, using algorithms such as Shor's algorithm (Mavroedis et al., 2018). Subsequently, classical encryption algorithms may be rendered obsolete. Whilst current quantum computers do not yet have the capability to run these algorithms at scale, existing encryption algorithms are only several breakthroughs away from practical cyberattack (Mavroedis et al., 2018).
Classical computers rely on bits, which can take the state of either 1 or 0; quantum computers operate using qubits, which can take the state 1, 0, or a superposition of both. Quantum computing relies on core principles that already make some computations faster and more efficient than classical computing, and at larger scale could produce a gigantic leap in capability (Rehbein and Hock, 2023). Qubits rely on three principles to operate differently from conventional bits: superposition, entanglement, and interference. These principles allow values to be obtained by measuring quantum phenomena that behave consistently at a subatomic level, providing the true, incomputable randomness on which QKD depends.
\subsection{Superposition}
When an electron is in superposition, its different states can be thought of as separate outcomes, each with a particular probability of being observed. An electron might be said to be in a superposition of two different velocities, or in two places at once (Caltech, n.d.). A way to visualise superposition is the famous Schrodinger's cat thought experiment (NASA, 2025). Schrodinger devised the experiment in 1935 to exemplify the concept: a cat is placed in a sealed box with a radioactive substance, a Geiger counter, and a poison that is released if a decay is detected. The apparatus is arranged so that there is effectively a 50\% chance the substance has decayed, and hence a 50\% chance the hypothetical cat has died (Matwelli, 2024; Schrodinger, 1935). Until observed, the cat is in a superposition of dead and alive; it leaves the superposition and takes on one definite state only when an observer opens the box and observes the cat as either dead or alive.
\subsection{Entanglement}
One of the most counterintuitive phenomena of quantum theory is quantum entanglement, the idea that particles of the same origin, once connected, stay connected (NASA, 2025). This idea famously appeared to contradict Einstein's special relativity, under which nothing can travel faster than the speed of light (NASA, 2025). However, modern physics experiments have shown that transferring usable data via entanglement still requires classical communication, which is limited to the speed of light; ongoing experiments such as quantum teleportation therefore still cannot exceed it (Aliro, 2025). The phenomenon continues to be explored today with high-energy entanglement experiments designed to close loopholes. What countless experiments have proven is that measuring the spin of one electron instantaneously determines the spin of its entangled partner, no matter the distance (ATLAS Collaboration, 2024).
\subsection{Interference}
Interference is not unique to quantum mechanics. In classical physics, interference amplifies a wave when two peaks overlap (constructive interference), and it can cancel a wave's amplitude completely when out-of-phase waves overlap and a peak meets a trough (destructive interference). Quantum computing takes advantage of the wave-like nature of quantum particles in the same way. Although deceptively simple, interference underpins one of the most vital quantum algorithms, Grover's algorithm (Grover, 1996). In practical applications, constructive interference is used to amplify the probability of the correct outcome, while destructive interference suppresses the probability of incorrect outcomes. The same idea is applied in the Quantum Fourier Transform (QFT), which allows many mathematical functions to be calculated more efficiently on quantum computers (Microsoft Learn, 2025). Grover's algorithm reduces the computational work required to search an unsorted list, a very common programming problem, from roughly $N$ steps to roughly $\sqrt{N}$, although the final running time still depends on the complexity of the application (Microsoft Quantum Team, 2025).
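The scale of this quadratic speed-up can be illustrated with a back-of-the-envelope calculation (a sketch that ignores constant factors and circuit depth):

\begin{verbatim}
# Grover's algorithm searches an unsorted space of N items in
# roughly (pi/4) * sqrt(N) iterations, versus ~N/2 classical
# guesses on average. A comparison for a 128-bit key space:
import math

N = 2 ** 128                        # size of the search space
classical = N // 2                  # expected classical guesses
grover = round((math.pi / 4) * math.isqrt(N))

print(f"classical ~ 2^{classical.bit_length() - 1}")   # ~ 2^127
print(f"grover    ~ 2^{grover.bit_length() - 1}")      # ~ 2^63
\end{verbatim}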
\subsection{Quantum Computing Principles Applied in QKD}
As already explored, classical algorithms such as RSA and ECC rely on specific, computationally intensive mathematical problems to form keys that are secure against brute-force attacks. QKD, by contrast, relies on quantum variables arising from entanglement, interference and superposition, essentially exploiting the obscurity of measurement of quantum particles at a subatomic level (Gisin et al., 2002). The symbiosis between these principles and QKD is detailed later in this investigation.
\subsection{The Bloch Sphere}
\begin{figure}[H]
\centering
\includegraphics[width=0.45\textwidth]{images/bloch_sphere.png}
\caption{The Bloch sphere.}
\label{fig:bloch-sphere}
\end{figure}
Figure~\ref{fig:bloch-sphere} is a visual representation of a qubit named the Bloch sphere, after physicist Felix Bloch (University of St Andrews, n.d.). It is a geometrical representation of a qubit's state space. The surface of the sphere represents the possible pure states, including superpositions, while the interior represents mixed states, which can arise through combinations of entanglement and superposition. Compare this single qubit to a single bit of information that can only take the values 1 or 0. If quantum computing matures to the standard of today's classical computers, it would leave certain encryption algorithms vulnerable to being brute-forced in polynomial time. As a result, many agencies and governments are examining alternative encryption algorithms, especially for defence against quantum computers (NIST, 2024; NCSC, 2025). The importance of this geometric representation is the vast space of possible states it reveals, which is precisely what QKD exploits to make brute-force attacks unlikely.
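Because the surface of the sphere is parameterised by two angles, a pure qubit state can be written as a two-component complex vector. The following minimal sketch (assuming NumPy is installed, with arbitrarily chosen angles) evaluates the measurement probabilities given by the Born rule:

\begin{verbatim}
# A pure qubit state as a point on the Bloch sphere:
# |psi> = cos(theta/2)|0> + e^(i*phi) sin(theta/2)|1>
import numpy as np

theta, phi = np.pi / 3, np.pi / 4   # illustrative angles
psi = np.array([np.cos(theta / 2),
                np.exp(1j * phi) * np.sin(theta / 2)])

probs = np.abs(psi) ** 2   # Born rule: measurement probabilities
print(probs, probs.sum())  # [0.75 0.25] 1.0
\end{verbatim}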
\section{Quantum Encryption Methods}
When quantum computers reach full scalability, the two leading encryption methods used will be PQC and quantum cryptography.
\subsection{Quantum Cryptography}
Quantum cryptography, also known as quantum encryption, refers to various cybersecurity methods for encrypting and transmitting secure data based on the naturally occurring and immutable laws of quantum mechanics (Schneider and Smalley, 2023).
\subsection{Quantum Key Distribution}
There are two different types of QKD protocol: discrete variable (DV-QKD) and continuous variable (CV-QKD). DV-QKD works on discrete variables, such as whether or not a photon is detected on a sensor, while CV-QKD works with continuous variables, such as the intensity of the light, responding to its different levels. The first QKD protocol, BB84, was created in 1984, but its security was not rigorously proven until years after the algorithm was published (Singh et al., 2014).
Traditionally, encryption works as either asymmetric or symmetric key cryptography, with hash functions used more commonly for ledgers and blockchains than for secure communications. QKD does not carry the secure communication by itself. To transfer data securely using QKD, the two users must also share a symmetric-key channel. The advancement lies in how the key is generated: using quantum physics instead of the multiplication of large primes, modular logarithms, or elliptic curves.
In 1984, Bennett and Brassard sought to create an algorithm that worked on quantum principles and would provide secure communications between two parties. This is known as the Alice, Bob and Eve model. The premise poses Alice and Bob both having access to the quantum communications channel, which is private, and Eve as a third party attempting to eavesdrop on the secure channel.
This encryption relies on Heisenberg's uncertainty principle (Heisenberg, 1927), which states that, for a conjugate pair of values such as energy/time or momentum/position, the more precisely one value is measured, the more uncertain the other becomes. For example, if the momentum of a photon is measured with high precision, the position of the photon becomes correspondingly uncertain. This is expressed through a mathematical relationship:
\[
\Delta p \Delta x \geq \frac{h}{4\pi}, \qquad
\Delta t \Delta E \geq \frac{h}{4\pi}
\]
(Woods and Baumgartner, n.d.), where $p$ denotes momentum, $x$ denotes position or displacement, $t$ represents time and $E$ denotes energy.
Heisenberg's principle shows how measurement of a quantum system causes disruption. However, this is not the only principle that QKD relies on. The no-cloning theorem (Wootters and Zurek, 1982) is also vital to QKD. The theorem establishes that, given a source emitting two entangled photons, their superposition cannot be copied onto another photon: the attempt would either entangle the extra photon or collapse the state. Moreover, because measuring a superposition alters it, if another entity measures the superposition of a specific photon, it collapses to a pure state, instantaneously making the other entangled photon a pure state as well. This is how Bob and Alice would know that Eve tried to infiltrate their secure communication. In practice, QKD could notify users of any infiltration of their communication, as the quantum particles would be disturbed by measurement from another party, a capability current methods lack.
The first QKD protocol, BB84, is a DV-QKD protocol, because it encodes information using single photons of light in discrete quantum states; specifically, it relies on the polarisation of light (Kumar and Garhwal, 2021). In the earlier example, where Alice sends data to Bob, she prepares, rather than measures, each photon in either a rectilinear or a diagonal basis (Asif, 2021). To encode the value 1 in the rectilinear basis, she might prepare a vertical polarisation, with a horizontal polarisation encoding 0; in the diagonal basis, value 1 could be $+45^\circ$ and value 0 could be $-45^\circ$ (Asif, 2021; Pillai and Polimetla, 2024). These corresponding values are discrete polarisations of light, which is what makes this DV-QKD (Asif, 2021).
In practice, the values sent over photons become the secret key for the classical communication channel (Davies, 2024). Bob and Alice then communicate which basis they used for each photon, a step called distilling the secret key. Each value is randomly prepared or measured in either basis, so Alice and Bob each hold a results table. Once they eliminate the data measured in mismatched bases, the result is a secure key generated through quantum mechanics (Asif, 2021). Eve may also measure the photons in transit, but she cannot know which values Alice and Bob agreed to discard over the classical channel, and her measurements disturb the very states she intercepts.
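The distillation (sifting) step is simple enough to simulate; the following Python sketch models an idealised, noise-free BB84 exchange with no eavesdropper (error estimation and privacy amplification, which real systems require, are omitted):

\begin{verbatim}
# A toy simulation of BB84 sifting, using only the standard library.
import random

n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]  # + rectilinear
bob_bases   = [random.choice("+x") for _ in range(n)]  # x diagonal

# If Bob measures in Alice's basis he reads her bit exactly;
# otherwise his outcome is random (a 50/50 collapse).
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: over the classical channel they compare BASES (never
# bits) and keep only the positions where the bases matched.
key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)
       if ab == bb]
print(key)          # the distilled shared secret key
\end{verbatim}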
This is the largest caveat of QKD: although the key generation itself is genuinely random, relying on quantum physics, it is not self-sufficient, since the classical channel over which Bob and Alice distil the key must itself be secured using another classical method (Asif, 2021; Wang et al., 2021). In the future, when quantum computers approach the practical capability of classical computers, these classical channels may have to use PQC to ensure a quantum adversary cannot mount a cyberattack. The strength of QKD lies in providing an alternative way of generating the key for classical communication (Pillai and Polimetla, 2024); currently, keys are generated through prime factorisation in RSA or elliptic curves in ECC.
Another type of QKD protocol is CV-QKD. These protocols are more recent and heavily studied due to their higher efficiency and compatibility. In particular, the Gaussian Modulation Coherent State (GMCS) protocol appears to have the most promise, as it can be implemented more easily in practice (Laudenbach et al., 2018; Weerasinghe et al., 2023). To use GMCS, Alice uses a laser producing coherent light and modulates its amplitude and phase, drawing the modulation values from a Gaussian distribution, which has been found both efficient and robust against eavesdropping (Weerasinghe et al., 2023). Each pulse's wave pattern is displaced by small, smoothly varying random amounts, and these continuous displacements encode the key. Because the encoded values are continuous, the classical communication between Alice and Bob becomes a mathematical reconciliation to distil the key, rather than simply discarding mismatched results. CV-QKD protocols are also superior to DV-QKD protocols in that they integrate more seamlessly into the optical fibre infrastructure already used for classical communication (Weerasinghe et al., 2023; Laudenbach et al., 2018). In contrast with current encryption algorithms such as RSA or ECC, this allows significantly more data to be protected at a faster rate.
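While a faithful CV-QKD simulation must model the optical channel, the heart of the modulation step is Gaussian sampling; a heavily simplified sketch (assuming NumPy, with illustrative parameter values):

\begin{verbatim}
# A heavily simplified sketch of Gaussian modulation in CV-QKD:
# Alice draws small displacements for the two quadratures
# (amplitude-like and phase-like) of each coherent pulse from a
# Gaussian distribution. Channel loss, detector noise, and the
# reconciliation step itself are all omitted.
import numpy as np

rng = np.random.default_rng()
n_pulses, variance = 8, 4.0

x = rng.normal(0.0, np.sqrt(variance), n_pulses)  # one quadrature
p = rng.normal(0.0, np.sqrt(variance), n_pulses)  # the other

# Bob measures noisy versions of these continuous values; the key
# is then distilled by reconciliation rather than by discarding.
print(np.round(x, 2), np.round(p, 2))
\end{verbatim}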
\subsection{Disadvantages of QKD}
Nevertheless, QKD has its own weaknesses. The most notable issue affecting practical application is that it is limited to relatively short distances, as errors accumulate with transmission length (Kish et al., 2024). Currently, QKD protocols require light to travel through fibre-optic cable to transfer the secret keys, leaving the signal prone to noise and attenuation; transferring keys further would require quantum repeaters or trusted nodes while still maintaining security (PostQuantum, 2023). Cost is also an issue, along with the fact that conventional encryption methods still need to be used regardless.
DV-QKD requires precise sources that emit single photons and precise receivers; any imperfections in these make the protocol less secure (Laudenbach et al., 2018). Even though CV-QKD uses more common light sources and receivers, current technology limits its accuracy when measuring the amplitude and phase of light. There are also issues with key length: classical encryption methods currently support longer keys, and distillation dramatically shortens the raw key (Kovacevic and Svelto, 2015). However, CV-QKD shows more promise in delivering longer keys, since it uses waves of light, which can store more data than single photons (Kish et al., 2024).
\subsection{The E91 Protocol}
The E91 protocol, similar to BB84, is a DV-QKD protocol. However, they differ greatly in how they take advantage of quantum principles to stay secure, especially those concerning entanglement. BB84's security comes from the fact that any eavesdropping attempt to measure these individual photons would inevitably disturb their state, a consequence of the Heisenberg Uncertainty Principle, and that these unknown states cannot be perfectly copied due to the No-Cloning Theorem. On the other hand, E91 uses pairs of entangled photons as its starting point (Ekert, 1991). Here, the security is not about disturbing individual photons but about disrupting the special connection between the entangled pair.
If an eavesdropper tries to measure one of the entangled photons, the entanglement correlations are destroyed; Alice and Bob can detect this by testing the statistics of their measurements against the correlations predicted by Bell's theorem (Bell, 1964). So, while BB84 uses the properties of individual quantum states, E91's security is deeply rooted in the unique and interconnected nature of entangled particles, potentially offering more secure methods in some scenarios (Gisin et al., 2002). Existing methods offer far less protection, specifically against brute-force attacks, because a quantum-generated key can take a vastly greater number of possible values.
\section{Post-Quantum Cryptography}
Another form of defence against the looming danger of quantum computers able to solve discrete logarithms and factorisations faster than ever before is Post-Quantum Cryptography (PQC). The goal of post-quantum cryptography, also called quantum-resistant cryptography, is to develop cryptographic systems that are secure against both quantum and classical computers, and that can interoperate with existing communications protocols and networks (NIST, 2017). PQC provides security against future quantum computing power by relying on other mathematical problems that remain computationally intensive even for quantum computers.
In 2016, NIST launched a competition to find the best PQC protocols, including those most practical to apply to international security; this process concluded in August 2024 (NIST, 2024). The field is important to the development and growth of QKD, as QKD currently works symbiotically with classical encryption. Analysis of these emerging algorithms is necessary because they may render current encryption methods obsolete by replacing traditional algorithms with quantum-resistant ones. Creating an effective PQC algorithm is vital to allowing QKD to thrive, given its requirement for a secure classical channel. NIST chose four final PQC algorithms for further study: BIKE, Classic McEliece, HQC, and SIKE (NIST, 2025).
\subsection{Emerging PQC Algorithms}
\textbf{BIKE.} BIKE works using Bit Flipping Key Encapsulation. Its security comes from intentional errors being added into data when it is transmitted securely. However, further research has identified potential weaknesses in the BIKE system, suggesting it might be more vulnerable to certain types of attack than initially believed and is ultimately a flawed solution (Nosouhi et al., 2023).
\textbf{Classic McEliece.} As the name suggests, this is a well-studied and older idea in PQC, first introduced in 1978 (McEliece, 1978). Similar to BIKE, it scrambles and unscrambles data. At its core, this protocol uses error correction so effectively that decryption still works even when the ciphertext is received with errors. This is incredibly useful in practical applications and in CV-QKD, which is prone to errors from inaccuracies in light detection. However, its downside is that its keys are far larger, resulting in longer initiation times for the classical channel, which could consequently make the QKD channel very slow to access as new keys are generated.
\textbf{SIKE.} Different to the other protocols, SIKE stays secure using supersingular isogenies of elliptic curves (Jao, 2022). Similarly to ECC, it takes advantage of elliptic curves and their group properties. The difference lies in which part of the elliptic-curve structure is used: SIKE keeps the isogeny path between two supersingular elliptic curves secret for use in a key. However, researchers in 2022 found ways to compute the isogeny from the public key (Robert, 2023), which was devastating to its security. Nevertheless, this does not mean the end for isogeny-based encryption, as the break was due to how SIKE's public key was generated, revealing too much information about the elliptic curve (NIST, 2025). This is promising for QKD, as it implies more optimal and robust algorithms can be developed from the same principle; developments here may enable fully quantum-secure communication using QKD.
\textbf{HQC.} HQC was the winner and is being adapted to become the post-quantum standard algorithm for key transfer (NIST, 2025). This is an advancement towards QKD becoming completely quantum-secure, as HQC could provide the secure classical channel for QKD even where an eavesdropper has access to quantum computing power. NIST judged HQC and BIKE to be very similar, as both are code-based cryptography, scrambling data and intentionally including random errors; HQC was deemed the superior of the two due to its higher bandwidth and significantly faster key generation and encode/decode times (NIST, 2025), allowing a more secure channel because the key would be significantly more difficult to brute-force.
\section{Conclusion}
Quantum encryption methods such as QKD have continued to evolve since their introduction in 1984, growing in sophistication so that data can be transmitted in greater quantities and integrated more easily into existing frameworks. However, even with CV-QKD protocols, existing encryption methods are approaching replacement rather than obsolescence, and are unlikely to become obsolete until PQC, which is in its infancy, is fully implemented (NIST, 2025).
Secure communication using QKD still requires a secure classical channel while keys are created; a quantum-resistant channel is therefore the most effective solution. This means existing encryption methods remain reasonable only because of the latency between PQC's inception and its full implementation. Continuing to rely on current encryption methods would be ill-advised; they have remained practical to this point only because of the tailor-made infrastructure currently available (Singh, Gupta, and Singh, 2014). Although QKD will indirectly cause existing methods to become obsolete once optimised and implemented, the introduction of PQC has been the largest factor necessitating a switch away from existing encryption algorithms. As seen with the PQC protocols above, QKD has instead been a stimulus for the evolution of existing encryption methods.
Quantum computers bring the danger of brute-force attacks at a scale that current methods cannot handle (Seiler, 2024). QKD uses Heisenberg's uncertainty principle to generate a key from measurements of quantum states. To turn these measurements into a usable key, the two parties must communicate over a secure, independent, classical channel. Current encryption methods remain viable for this, which still makes them useful. However, advancements in PQC provide a superior alternative with a vastly smaller probability of becoming susceptible to attack, unlike existing encryption methods. As more high-risk data becomes protected using QKD, PQC will become the replacement, making existing encryption obsolete while aiding QKD. In conclusion, the presence of QKD has indirectly caused existing methods to become outdated by reducing reliance on classical cryptography, but PQC has had the greater impact because it can also run on classical computers.
\clearpage
\section*{Bibliography}
\begingroup
\small
\input{references.tex}
\endgroup
\clearpage
\section*{Image Sources}
\begingroup
\small
\input{image_sources.tex}
\endgroup
\end{document}