Quantum Computing for Classical Developers

Introduction¶
Quantum computing is one of the most interesting and increasingly popular topics of the last few years. Continuous research breakthroughs, such as the work on macroscopic quantum tunnelling in electrical circuits by John Clarke, Michel Devoret, and John Martinis (awarded the 2025 Nobel Prize in Physics), keep pushing the boundaries of what we believe possible and of what we know about the quantum world.
Nowadays, quantum computers are no longer something only scientists can use: many of us can access one via a cloud interface or even buy one online. The field is blooming, and software developers should start learning more about it today.
Besides real quantum hardware, the Internet is full of “simulators” that let us play with quantum principles on classical machines. In this article, we’ll take a look at what simulators are and see how classical software developers can play around with qubits today.
What’s quantum computing?¶
The recent hype for quantum computing is mostly due to its implications for modern cryptography. Many believe that quantum computers will be powerful enough to crack RSA-2048 by the early 2030s. Other predictions exist, but almost everybody agrees that within the next decade, cryptographic systems must be secure against attackers with access to large-scale, cryptographically relevant quantum computers.
Furthermore, scientists believe that quantum computing may help simulate the properties of new materials (e.g., superconductors) before building and testing them. Similarly, drug simulations could be tailored to an individual patient’s organism. New, interesting applications are being explored as you read this, and only the future will tell which of them really show quantum supremacy (the point at which a quantum computer can perform a specific task that is practically impossible for even the most powerful classical supercomputers to solve in a reasonable timeframe).
Still, quantum computers are not a new idea: they date back to the 1980s. In 1982, Richard Feynman, one of the most brilliant physicists of all time, showed that quantum experiments cannot be simulated efficiently on a classical computer. The reason lies in the probabilistic nature of quantum mechanics, but he was also among the first to suggest that quantum systems might be capable of computations that are not feasible on a classical computer.
He was right.
Similarly, there are plenty of problems that are very difficult to solve on a classical computer but much easier on a quantum one. Factoring a large number into prime factors (the hard problem some cryptographic algorithms are based on) is one of them; more precisely, a quantum computer efficiently finds the period of a function, a problem strongly related to factoring.
However, quantum computers will not solve every problem we have. The fields where we believe we’ll observe quantum supremacy are just a few nowadays and, for many of them, we’re not even sure. Therefore, classical and quantum computers will co-exist in the future. In other words, we’ll still have classical applications (written in Python, Java, and so on) leveraging quantum algorithms.
Qubits¶
Many resources on the Internet explain what qubits are. However, it’s useful to refresh the concept in informal, high-level terms.
Classical bits are discrete entities that can be either 0 or 1. Similarly, qubits can be in one of two basis states, |0> or |1>. However, they can also exist in a combination of both, known as a superposition. When a qubit is in superposition, we only know the probability of finding it in |0> or |1> once we measure it.
You can imagine measuring a qubit as reading its value.

Figure 1. Vector representation of a qubit
Figure 1 shows the vector representation of a qubit. α and β are called probability amplitudes, and are complex numbers (for reasons related to quantum mechanics that are out of the scope of this article). The square of their modulus is the probability of measuring either |0> or |1>.
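As a quick numeric illustration (plain Python, with arbitrary example amplitudes, not taken from any specific library), here is how probability amplitudes map to measurement probabilities:

```python
# Hypothetical qubit state: |psi> = alpha|0> + beta|1>
alpha = complex(0.6, 0.0)   # amplitude of |0>
beta = complex(0.0, 0.8)    # amplitude of |1> (amplitudes can be complex)

p0 = abs(alpha) ** 2        # probability of measuring |0>
p1 = abs(beta) ** 2         # probability of measuring |1>

print(p0, p1)               # roughly 0.36 and 0.64
print(p0 + p1)              # roughly 1.0: probabilities must sum to one
```

The squared moduli always sum to one for a valid (normalised) qubit state.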
What matters to us is that qubits are generated by and manipulated with physical phenomena. Several possibilities exist, and different companies adopt different types of qubits (e.g. IBM’s computers are powered by superconducting qubits, whereas Microsoft is investigating an approach based on Majorana fermions).
Different types of qubits are subject to noise in different ways. You can think of noise as anything that degrades the stability of a qubit. Since qubits are realised with such small physical systems, they are affected by the surrounding environment: temperature, electromagnetic interference, other qubits in the system, and so forth.
Quantum gates¶
Similar to classical bits, quantum bits are manipulated via operations called gates. There are many gates out there, from the ones that have a classical counterpart (e.g. the quantum Pauli-X gate is the same as the NOT classical gate), to others that don’t.
Figure 2 shows a comparison between the NOT classical gate and the quantum Pauli-X.

Figure 2. Classical NOT and quantum Pauli-X gates.
Classical gates are generally described with truth tables, mapping the values of the input bits to the result. Quantum gates, by contrast, are modelled as matrices, and applying a gate to one (or more) qubit(s) means multiplying the qubit’s state vector by the gate matrix.
An example of a gate without a classical equivalent is the Hadamard gate, often represented with the H symbol, which puts a single qubit into superposition. Applying H to a qubit in one of the basis states (|0> or |1>) therefore yields a qubit with a 50% chance of being read (aka “measured”) as |0> and a 50% chance as |1>.
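To make the matrix picture concrete, here is a minimal sketch in plain Python (no quantum library; `apply` is a hypothetical helper) that multiplies the Pauli-X and Hadamard matrices by the |0> state vector:

```python
import math

# Basis state |0> as a two-element amplitude vector [alpha, beta]
ket0 = [1.0, 0.0]

def apply(gate, state):
    # Applying a gate = multiplying the 2x2 gate matrix by the state vector
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

X = [[0.0, 1.0],
     [1.0, 0.0]]            # Pauli-X: the quantum NOT

h = 1.0 / math.sqrt(2)
H = [[h, h],
     [h, -h]]               # Hadamard: creates an equal superposition

print(apply(X, ket0))       # [0.0, 1.0], i.e. |1>
plus = apply(H, ket0)       # (|0> + |1>) / sqrt(2)
print([a * a for a in plus])  # both probabilities are roughly 0.5
```

Real simulators perform exactly this kind of linear algebra, just on much larger state vectors and with complex amplitudes.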
Why should this interest classical software engineers?¶
If you’re thinking that this is very low-level, you’re right. Classical software developers likely won’t have to write new quantum algorithms (which are very complex), much as many of us don’t really write classical algorithms as part of our day-to-day work today.
Still, we might get exposed to them in various ways. Some of us might just invoke “functions” which will, under the hood, run quantum algorithms. Others, on the contrary, might have to implement quantum algorithms.
In any case, experimenting with qubits is a lot of fun. It kind of reminds me of when, in the early days of my education, I would download a simulator of the Assembly language and play with bits and registers. Knowing what happens at the quantum level, even if only intuitively, might turn out to be useful in the future.
Quantum computing simulators¶
Quantum computing simulators are programming languages or libraries that use classical computers to mimic the behaviour of a quantum computer. They serve many purposes:
- Education: they let us experiment with qubits and start our journey into the quantum computing world.
- Algorithm development and testing: also at a research level, we can use simulators to try algorithms out before running them on real quantum hardware.
In this article, we’ll take a look at four different simulators for quantum systems:
- Q#, Microsoft-provided quantum programming language for hybrid quantum/classical computing;
- Cirq, a Google-provided, Python-based quantum simulator aimed at Noisy Intermediate-Scale Quantum (NISQ) computing;
- Qiskit, IBM-provided, Python-based quantum software suite for research and algorithm optimisation;
- Strange, JVM-based simulator by Johan Vos for rapid prototyping.
The rest of the article will show code to implement a hybrid quantum-classical algorithm and will not explain in-depth how the simulators work.
Running Example¶
We’ll compare the simulators by looking at the solution to the following problem:
Generate a truly random number in [0, max)
That is, generate a non-negative random number lower than the input max.
This problem is currently unsolvable on classical computers, where random numbers are actually “pseudo-random”. In fact, when we generate a random number on classical computers, what happens is that we get a number taken from a sequence of numbers generated from a value called a seed. Two sequences generated using the same seed will yield the very same numbers, in the same order.
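The seed behaviour is easy to verify with Python’s standard library:

```python
import random

rng_a = random.Random(42)   # two generators initialised with the same seed...
rng_b = random.Random(42)

seq_a = [rng_a.randint(0, 9) for _ in range(5)]
seq_b = [rng_b.randint(0, 9) for _ in range(5)]

print(seq_a == seq_b)       # True: identical seeds yield identical sequences
```

This is why such numbers are called pseudo-random: given the seed, the whole sequence is deterministic.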
The algorithm to solve it on a hybrid quantum-classical application is as follows:
- Determine how many bits are needed to represent max: nbits = floor(ln(max) / ln(2)) + 1. For instance, if max = 10, then we’ll need 4 bits.
- Generate a binary string with length = nbits. This is the quantum part. The resulting bitstring is a number in base 2.
- Turn the number into base 10 and compare it to max, returning it if and only if it’s lower than max. Otherwise, repeat step #2.
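The three steps can be sketched in plain Python. Here `random_bit` is a hypothetical pseudo-random stand-in for the quantum part (step #2), which each simulator below implements for real:

```python
import math
import random

def random_bit():
    # Stand-in for the quantum step: a Hadamard gate plus a measurement
    return random.randint(0, 1)

def generate_number(max_value):
    # Step 1: number of bits needed to represent max_value
    nbits = math.floor(math.log(max_value) / math.log(2)) + 1
    while True:
        # Step 2: compose an nbits-long binary string, one random bit at a time
        number = 0
        for _ in range(nbits):
            number = (number << 1) | random_bit()
        # Step 3: accept the base-10 value only if it is lower than max_value
        if number < max_value:
            return number

print(generate_number(10))  # a number in [0, 10)
```

Note the rejection loop: with 4 bits we can generate values from 0 to 15, so results of 10 or above are simply discarded and regenerated.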
The quantum circuit (aka the quantum program) for our solution (i.e. Step #2) is as in Figure 3:

Figure 3. The quantum circuit to generate a random bit.
In particular, to generate a random bit, we need a single qubit and a single Hadamard gate (to put the qubit into superposition). After that, we measure the qubit to obtain either 0 (with 50% probability) or 1 (with 50% probability). To be more precise, we have a 50% chance of measuring either |0> or |1>, the basis states of the qubit.
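The expected distribution can be approximated classically with a coin-flip simulation (a sketch only; it uses a pseudo-random generator, not quantum measurements):

```python
import random
from collections import Counter

# 100 simulated measurements of a qubit in equal superposition:
# each one yields 0 or 1 with 50% probability
counts = Counter(random.randint(0, 1) for _ in range(100))
print(counts)  # roughly 50 zeros and 50 ones
```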
The probability histogram for 100 runs of the Q# implementation (see below) is as in Figure 4:

Figure 4. Histogram for 100 runs of the circuit shown in Figure 3. In this case, the number of measured zeros was almost the same as that of measured ones.
As a side note, the number generation will not be truly random when we simulate our quantum code on a classical computer. This is because the |0> and |1> basis states (which then lead to a 0 or 1 classical bit, respectively) are “measured” with pseudo-random techniques. However, running the code on real quantum hardware will get us truly random measurements.
Strange¶
Strange is an educational JVM-based library to experiment with quantum algorithms. For developers with a Java background, writing code in Strange will feel very natural, as everything is modelled as an object. The library also allows for an imperative style when composing the steps to run a quantum program.
Strange exposes two different “sub-libraries”. The first one is a high-level collection of functions implementing common quantum algorithms. As a matter of fact, this could be the level of interaction with quantum programs for most software developers in the future. We invoke a method of a class, and it returns a value without even leaking how that value was obtained. On the contrary, the low-level library lets us manipulate qubits and implement our own algorithms.
Strange’s API is straightforward but deliberately simplified. It abstracts over some complex aspects of quantum computing (e.g. noise) and is therefore not really suitable for research at the moment. However, it’s a very valuable tool to start getting our hands dirty.
The quantum code to generate a random bit on Strange is as follows (we’re using Scala, not Java, in the following examples):
import org.redfx.strange.{Program, Step}
import org.redfx.strange.gate.Hadamard
import org.redfx.strange.local.SimpleQuantumExecutionEnvironment

class QuantumNumberGenerator:
  private val simulator = SimpleQuantumExecutionEnvironment()
  private val program = Program(1, Step(Hadamard(0)))

  private def quantumRandomBitGeneration(): Int =
    val qubits = simulator.runProgram(program).getQubits
    qubits(0).measure()
Strange code to generate a random bit with a quantum algorithm.
The QuantumNumberGenerator class defines a simulator (SimpleQuantumExecutionEnvironment is a local simulator) and a program. The latter is, in Strange, a sequence of steps. To generate a random classical bit, we need a single qubit (that’s the meaning of the first argument to the Program constructor) and a single step (with a Hadamard gate applied to the only qubit in the system, Step(Hadamard(0))).
The quantumRandomBitGeneration() function runs the quantum program and gets the measurement of the first (and only) qubit.
We can take care of the rest of the algorithm with Scala code:
import scala.annotation.tailrec

class QuantumNumberGenerator:
  // code shown before

  private def generateNumberWithNBits(n: Int): Long =
    (1 to n).foldLeft(0L) { case (acc, _) =>
      val bit = quantumRandomBitGeneration()
      // shift the accumulator left by 1 position and add the new random bit
      // to the least significant position
      (acc << 1) | bit
    }

  @tailrec
  private def generateNumberRec(max: Long, nBits: Int): Long =
    val generated = generateNumberWithNBits(nBits)
    println(s"Generated: ${generated.toBinaryString} -> $generated")
    if generated < max then generated
    else generateNumberRec(max, nBits)

  def generateNumber(max: Long): Long =
    val nBits = (math.floor(math.log(max) / math.log(2)) + 1).toInt
    println(s"Generating a number with $nBits bits but lower than $max")
    generateNumberRec(max, nBits)
QuantumNumberGenerator.scala
generateNumberWithNBits() calls quantumRandomBitGeneration() as many times as needed to collect nBits measurements and then uses a fold with bitwise Scala operators to compose the bit string ((acc << 1) | bit). For example:
If n = 4, the function might generate:
- Iteration 1: bit = 1 → acc = 1 (base 2) = 1
- Iteration 2: bit = 0 → acc = 10 (base 2) = 2
- Iteration 3: bit = 1 → acc = 101 (base 2) = 5
- Iteration 4: bit = 0 → acc = 1010 (base 2) = 10
The rest of the code consists of Scala functions that repeatedly invoke generateNumberWithNBits() until the generated number is lower than max. The @tailrec annotation shows we can even mix recursion with quantum simulation, achieving a true hybrid application.
Qiskit¶
Qiskit is likely the most comprehensive software suite for quantum computing. It defines several simulators as well as tools for algorithm optimisation and noise simulation. In particular, it lets us simulate our code using noise models extracted from real IBM-managed quantum computers. This is of paramount importance for researchers, as they get to see how the code behaves in (almost) real-world conditions.
Furthermore, we can run Qiskit code on IBM’s real quantum hardware.
The code for our algorithm is less verbose than its counterpart in Strange, and we can implement the majority of it in a single function:
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

simulator = AerSimulator()

def quantum_random_number_generator(nbits):
    qc = QuantumCircuit(1, 1)  # One qubit and one classical bit
    qc.h(0)
    qc.measure(0, 0)  # Measure the qubit and store the result in the classical bit
    job = simulator.run(qc, shots=nbits, memory=True)
    simulation_result = job.result()
    measurements = simulation_result.get_memory()
    bits = ''.join([str(m) for m in measurements])
    number = int(bits, 2)
    print("Generated random number: ", bits, " -> ", number)
    return number
Qiskit code to generate a random number.
The example uses the Aer IBM simulator and needs 1 qubit and 1 classical bit (qc = QuantumCircuit(1, 1)).
The code applies the Hadamard gate to the only qubit in the system (qc.h(0)) and then measures it, copying the result to the only classical bit (qc.measure(0, 0)).
The program runs the simulator as many times as needed to represent max (i.e. nbits times). The memory=True flag ensures the qubit measurements are all kept in memory (as we need them to generate the binary string representing the result in base 2). It then uses classical Python code to compose the bit string and to turn it into an integer.
As a side note, with the Aer simulator, we could model noise as well, thus making our algorithm closer to real quantum hardware.
The rest of the algorithm is easily implemented with Python code:
import math

# quantum_random_number_generator as before

def generate_number(max):
    nbits = math.floor(math.log(max) / math.log(2)) + 1
    while True:
        number = quantum_random_number_generator(nbits)
        if number < max:
            return number
Random number generation in Python with Qiskit.
Cirq¶
Cirq is another advanced quantum simulator. Similar to Qiskit, it lets us model noise in a very precise way, simulating it in different parts of our quantum circuits.
Furthermore, Cirq supports fine-grained control over the definition and interaction of the qubits, allowing for deep optimisation of quantum devices. For instance, it forbids some ways of composing qubits that would not be possible due to physical constraints.
Therefore, Cirq is also a valuable tool for research purposes.
Lastly, we can run the code on real quantum hardware, even if not on Google’s (at the time of writing).
The Cirq code to solve the problem is very similar to the Qiskit one:
import cirq

simulator = cirq.Simulator()

def quantum_random_number_generator(nbits):
    qubit = cirq.LineQubit(0)
    circuit = cirq.Circuit(cirq.H(qubit), cirq.measure(qubit))
    simulation_result = simulator.run(circuit, repetitions=nbits)
    measurements = simulation_result.measurements["q(0)"]
    bits = ''.join([str(m) for m in measurements.flatten()])
    number = int(bits, 2)
    print("Generated random number: ", bits, " -> ", number)
    return number
Random number generation with Cirq
The implementation needs 1 qubit (created here as a LineQubit, but a NamedQubit would also be fine).
The circuit is composed of the Hadamard gate as a first step (cirq.H(qubit)), and then of a measurement (cirq.measure(qubit)).
The program runs the simulator as many times as needed to represent max (i.e. nbits times). It then accesses the measurement results of the qubit (simulation_result.measurements["q(0)"]) and uses classical Python code to compose the bit string and to turn it into an integer.
Again, the rest of the algorithm is easily implemented with Python code, and it is identical to the Qiskit case.
Q#¶
Q# is a new programming language developed by Microsoft to write quantum programs. The syntax was inspired by that of C# and F#, and it’s crafted to simplify the definition of qubits and their composition.
Q# is also specifically designed for hybrid quantum-classical applications. At the time of writing, we can run Q# code using a VS Code extension or by invoking it from a Python application. We sort of took this interoperability for granted when talking about the other simulators, but, for a dedicated programming language, having the ability to run it from Python is very nice.
Noise-wise, Q# is sort of between Strange and Cirq/Qiskit. It supports configurable noise models, but they’re not as advanced as Qiskit’s or as fine-grained as Cirq’s.
Lastly, we can run Q# programs on Azure quantum hardware.
The Q# code is much simpler compared to the other implementations, confirming that a dedicated programming language enables us to use a more intuitive and less verbose syntax:
operation GenerateRandomBit() : Result {
    use q = Qubit();
    H(q);
    let result = M(q);
    Reset(q);
    return result;
}
Random bit generation with Q#.
Operations in Q# are basically functions we can run on a target quantum processor.
The GenerateRandomBit() function allocates a qubit (use q = Qubit()), applies a Hadamard gate (H(q)) and then a measurement operator (M(q)) on it, returning the result after releasing the qubit itself.
Resetting the qubit is necessary: in Q#, qubits must be in the |0> state when they’re released, to avoid errors on the quantum hardware.
We can invoke that function either from a Python program or from another Q# operation:
// GenerateRandomBit as before
operation GenerateRandomNumberInRange(max : Int) : Int {
    mutable bits = [];
    let nBits = BitSizeI(max);
    for idxBit in 1..nBits {
        set bits += [GenerateRandomBit()];
    }
    let sample = ResultArrayAsInt(bits);
    Message($"Generated sample: {sample}");
    return sample >= max ? GenerateRandomNumberInRange(max) | sample;
}
Random number generation in Q#.
The GenerateRandomNumberInRange() function allocates an array and fills it with random bits generated by GenerateRandomBit(). Notice that here we’re using functions from Q#’s Math and Convert libraries, namely BitSizeI and ResultArrayAsInt. This shows that Q# provides built-in utilities to work with bits and to convert bit strings into decimal numbers.
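As a rough Python analogue of those two utilities (the names below are hypothetical; note also that Q#’s ResultArrayAsInt interprets its input in little-endian order, while this sketch reads the most significant bit first):

```python
def bit_size(n):
    # Bits needed to represent n, analogous to Q#'s BitSizeI
    return n.bit_length()

def result_array_as_int(bits):
    # Fold a list of 0/1 measurements into an integer, MSB first
    value = 0
    for b in bits:
        value = (value << 1) | b
    return value

print(bit_size(10))                       # 4
print(result_array_as_int([1, 0, 1, 0]))  # 10
```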
Conclusion¶
This article briefly introduced quantum simulators for classical software engineers.
Even if these topics seem very abstract and very theoretical, they’re also becoming a reality. Many companies are leveraging quantum principles to generate random numbers (Quantum Random Number Generation) and to share secret keys in a physically-secured way (Quantum Key Distribution). Many more applications are still possible and will likely emerge in the coming years.
Software developers and engineers can start experimenting and learning today, at different levels. First, we can try out simple algorithms on local simulators, building a solid foundation. Then, we can move to cloud simulators or real quantum hardware.
Hopefully, after reading this, you’re also convinced that you don’t need a Ph.D to try quantum computing! 😉
You can find the code on GitHub and a presentation on the topic on YouTube.
Contact us¶
We would love to hear from you!
© 2025, Kynetics Inc., Santa Clara, California. Enjoy the Art of Coding™ and Update Factory™ are registered trademarks.