The Quantum Computing Revolution
A primer on quantum computing, the Cisco / Code BGP acquisition, b2b marketplaces, the story of a $580m exit, jobs, events, and more
Friends, we’re kicking off Season 4 of Startup Pirate with a primer on quantum computing, plus the usual roundup of Greek Tech funding rounds, jobs, new products, and events. Here’s what we covered in past issues:
The Quantum Computing Revolution
Is quantum computing hype or almost here? How far are we from having a well-functioning quantum computer? What are the benefits compared to today’s supercomputers? These are some of the questions we’re going to explore today with Georgios Korpas, a Senior Research Scientist at HSBC Labs and Ph.D. in Theoretical Physics who has spent many years researching and advancing the field of quantum computing. Let’s jump right in.
Giorgo, it’s great to have you here. What we’re going to discuss is perhaps one of the most exciting breakthroughs in technology today, so really excited about this.
GK: Thanks for having me, Alex! Very happy to chat with you.
There’s a lot of chatter nowadays, possibly magnified due to the AI craze, about how much faster classical computers can process information to help humans solve the most complex problems of our times. Are we close to hitting the theoretical limits of Moore’s Law, and what would be the implications if that’s true?
GK: Gordon Moore observed in 1965, while he was still at Fairchild Semiconductor, that the number of transistors that could fit in a given area was doubling every year, a rate he later revised to every two years. Indeed, we went from a transistor the size of a light bulb in 1954 to 134 billion transistors in the recent Apple M2 Ultra processor. The practical implications of Moore’s Law have been at the core of computer performance improvements over the years. Still, they are not the only parameter at play.
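To make the doubling concrete, here’s a quick back-of-the-envelope sketch in Python. The starting point — the Intel 4004’s roughly 2,300 transistors in 1971 — is a figure added for illustration, not one from the interview:

```python
# Back-of-the-envelope check of Moore's Law: one doubling of the
# transistor count every two years. Start from the Intel 4004
# (~2,300 transistors, 1971) and project to 2023, the era of
# Apple's M2 Ultra (~134 billion transistors).
start_year, start_count = 1971, 2_300   # Intel 4004 (illustrative)
target_year = 2023                      # Apple M2 Ultra era

doublings = (target_year - start_year) / 2   # one doubling per two years
projected = start_count * 2 ** doublings

print(f"{doublings:.0f} doublings -> ~{projected:,.0f} transistors")
```

The projection lands at roughly 154 billion transistors — the same order of magnitude as the M2 Ultra’s 134 billion, which is about as much as a two-line extrapolation can promise.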
There are different ways to improve computer performance, as the chip is not the whole story. It’s about both hardware and software. In fact, manufacturers continue to build faster computers, to some extent, by developing more efficient algorithms that run on the chip, better control of the electronics, etc. Yet, it’s true that fitting more transistors into microprocessors has been happening at a slower pace for quite some time. There are physical limitations on how small we can make a transistor — the size of a silicon atom is about 0.2 nanometers; hence, it would be impossible to create a silicon transistor smaller than that. This could pose a significant problem in the future: consider that today, several manufacturers are working on 3nm processes, while some of the fastest commercial chips, such as Apple’s, are built on 5nm. However, I don’t foresee the physical limitations of chips being an issue for small-scale computers and the things we do in our everyday lives anytime soon.
When most people hear about quantum computers, they think of fast supercomputers. How fair of an argument is that?
GK: Scientists have proven that for certain types of problems, quantum computers can provide solutions exponentially faster than classical computers. For instance, a classical computer would need a few hundred million years to break a standard encryption key, such as the ones used in cybersecurity to protect our communications. By contrast, a quantum computer could do that in a couple of hours. A quantum computer can also give a quadratic speed-up in simulations of molecule interactions, helping us discover drugs faster and more efficiently. So, yes, quantum computers can offer computation improvements of orders of magnitude. Nevertheless, it’s unknown, or yet unproven to be more precise, whether they can reliably offer better performance on all types of complex problems compared to today’s supercomputers, since the difficulty of solving many problems boils down to the nature of the computations necessary, e.g. optimisation or simulation, not only handling vast amounts of data.
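The textbook example of a quadratic speed-up is Grover’s algorithm for unstructured search, which needs on the order of √N quantum queries where a classical scan needs on the order of N. A minimal sketch of the query counts — the (π/4)·√N iteration figure is the standard Grover result, not a claim made in the interview:

```python
import math

# Illustrative query counts for searching N unsorted items:
# a classical scan checks up to N items, while Grover's algorithm
# needs roughly (pi/4) * sqrt(N) applications of its quantum
# operator - the quadratic speed-up.
def classical_queries(n: int) -> int:
    return n  # worst case: check every item

def grover_queries(n: int) -> int:
    return math.ceil(math.pi / 4 * math.sqrt(n))

for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"N={n:>13,}  classical~{classical_queries(n):>13,}  "
          f"grover~{grover_queries(n):>6,}")
```

For a billion items the gap is about a billion checks versus roughly 25,000 quantum queries — which is why "quadratic" matters at scale.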
What makes quantum computers different from classical computers like the ones we’re using today?
GK: A quantum computer is a fundamentally different machine in the way it performs computations. The computers we use today operate with classical bits, 0s and 1s. At a physical level, these are represented by a voltage drop. High voltage means 1. Low voltage means 0. Very simple physics in some sense, and to achieve this, we rely on semiconductor devices called transistors to amplify or switch electronic signals and electrical power. Not to get into too much detail here, but the numbers, texts, images, sounds, and all other kinds of data are transformed into two states to perform calculations: high and low voltage. This structure limits how fast even supercomputers with thousands of classical CPUs and GPUs can solve particular problems.
Unlike classical computers, quantum computers are not based on a binary version of the world, 0s and 1s. Instead, a qubit, which is the basic unit of information, can represent a 0, a 1, or any proportion of 0 and 1 in a superposition of both states, with a certain probability of being a 0 and a certain probability of being a 1 — not unlike a coin spinning through the air before it lands in your hand. By harnessing the laws of quantum physics, quantum computers have the potential to process exponentially more data compared to classical computers.
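The spinning-coin analogy can be sketched in a few lines of Python: a qubit as a pair of amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. This is a toy model for intuition only, not how a real quantum device is programmed:

```python
import math
import random

# Toy model of a single qubit: two complex amplitudes (alpha, beta)
# normalised so that |alpha|^2 + |beta|^2 = 1. Measurement collapses
# the superposition to 0 or 1 with those probabilities - the
# "spinning coin" landing in your hand.
class Qubit:
    def __init__(self, alpha: complex, beta: complex):
        norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
        self.alpha, self.beta = alpha / norm, beta / norm

    def probabilities(self) -> tuple[float, float]:
        return abs(self.alpha) ** 2, abs(self.beta) ** 2

    def measure(self) -> int:
        p0, _ = self.probabilities()
        return 0 if random.random() < p0 else 1

# An equal superposition: a 50/50 chance of reading 0 or 1.
q = Qubit(1, 1)
print(q.probabilities())
```

A classical bit is the special case where one amplitude is 1 and the other is 0; the interesting physics lives in everything in between.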
What does a quantum computer look like?
GK: Where classical computers use familiar silicon-based chips, quantum computers can be made from photons (using the polarisation of light), real atoms (spinning neutral atoms that live in a vacuum), artificial atoms (electric components cooled down to near absolute zero), trapped ions, etc. It’s unclear which one will be the dominant modality of the future, and perhaps we will rely on different modalities for different applications. It’s like the early days of electronics, when we used both silicon and germanium, but silicon proved more efficient.
If you Google quantum computer, most of the results showcase dilution refrigerators — fridges that cool down a chip at the very bottom of the machinery. It’s a stack of rings and cables, where each ring brings an order-of-magnitude temperature drop. And at the very bottom of the machine is a tiny chip that contains the qubits, the artificial atoms — from a few qubits all the way to 433 qubits for the latest IBM machine, or a few thousand (5-6K) for quantum annealers such as D-Wave’s. We need to block any external radiation that could disrupt the computations (initialise qubits, perform controlled qubit interactions, measure resulting quantum states, etc.), so we put the refrigerators inside modern Faraday cages. The setting of a quantum computer looks more like a physics lab, with a huge refrigerator in the middle and big bottles of helium, the liquid used for cooling.
What are the roadblocks awaiting breakthroughs, and what are some of the latest developments in the industry?
GK: Although we have quantum computers, we still don’t have a well-functioning one. The main roadblock is noise, which results in errors in computation. Unlike classical computers, quantum computers are far more sensitive to electromagnetic interference from the environment, such as WiFi signals, cosmic rays, etc. Any type of radiation can contaminate the computation (specifically the coherence of the qubits), leading to decoherence and random results (noise). This is why we isolate them from electromagnetic signals. Even then, there might be errors due to the probabilistic way the computer operates.
No matter how powerful your computer is, failure is inevitable if it outputs errors that cannot be corrected. In the very early days of classical computing, there were a lot of errors too, and the output of a computation was often complete nonsense before engineers introduced error-correcting schemes. Similarly, we now have to find ways to reduce the errors quantum computers generate. Our community is working on error mitigation, a post-processing method that helps us understand the origins of errors and correct them after the computation. I believe error correction and mitigation are the most critical work the community is currently addressing.
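As a classical analogue of those error-correcting schemes, here is a toy three-bit repetition code in Python: each logical bit is stored three times, noise flips each physical copy with some probability, and a majority vote recovers the original unless two or more copies flip. Real quantum error correction is far more involved; the numbers below are illustrative only:

```python
import random

# Toy classical error correction: a 3-bit repetition code.
# Encode a bit three times, flip each copy with probability p
# (the "noise"), then decode by majority vote. A single flip is
# corrected; a logical error needs two or more flips.
def encode(bit: int) -> list[int]:
    return [bit, bit, bit]

def noisy(bits: list[int], p: float) -> list[int]:
    return [b ^ (random.random() < p) for b in bits]

def decode(bits: list[int]) -> int:
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
p = 0.1          # each physical bit flips 10% of the time
trials = 10_000
errors = sum(decode(noisy(encode(0), p)) != 0 for _ in range(trials))
print(f"logical error rate: {errors / trials:.3f}")
```

With p = 0.1, the logical error rate drops to roughly 3p² ≈ 0.028 — the redundancy buys reliability, which is exactly the bargain quantum error-correcting codes try to strike with many physical qubits per logical qubit.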
This is a million-dollar question, but how far are we from having well-functioning quantum computers?
GK: Today, we are in an era called NISQ, the noisy intermediate-scale quantum era. This means we have to follow very sophisticated ways to reduce the noise produced, potentially by combining quantum machines with classical computers that do a lot of the heavy lifting. It’s still unclear whether today’s machines can produce something that a fast supercomputer cannot. Are we close to having an operating quantum computer that can do valuable things? I don’t think so. From a technology standpoint, we’re not there yet, and much of it boils down to hardware. But my estimation is that this will change by 2030.
However, this doesn’t mean quantum computers are not useful today. In fact, there’s a field called hybrid computing, or quantum-inspired computing, wherein you use a classical supercomputer (not a MacBook, something much more powerful) but construct the algorithms in such a way that the classical computer behaves as if it were quantum. This is a significant proof of concept for our industry: with this reconfiguration, we can solve specific problems at a satisfactory level, potentially running computations faster than, or at least as well as, purely classical approaches, using the same algorithms that will eventually run on quantum computers.
I believe we will see a future where both quantum and classical computers are used together or interchangeably based on the use case. You can envision all kinds of synergies between them in the future. Indeed, it’s a fascinating time to enter the field and help push the industry forward, bringing skills from physics to engineering and product, design, marketing, and more. Well, you certainly don’t need a PhD to enter quantum computing!
Giorgo, thank you so much for taking the time. It was great to talk to you!
GK: Appreciate it, Alex!
Check out job openings HERE from startups hiring in Greece, abroad, and remotely.
Cisco acquired Code BGP. A team of Internet infrastructure experts based in Greece and backed by Marathon joined the global leader in networking equipment and software (link) — I previously wrote about how the team set out to build a better Internet here.
Endor Labs raised $70m from Lightspeed, Coatue, and others to help developers manage and secure their open source dependencies. (link)
thymia, a mental health startup building gamified tools, raised a $2.7m Seed. (link)
WINGS ICT Solutions, a company that develops 5G, Big Data & AI products, raised funding from 5G Ventures. (link)
Second-hand shopping marketplace Vendora raised funding. (link)
Digital mortgage platform Stavvy acquired Brace. (link)
SPOROS Impact Ventures, the first circular economy fund in Greece, launched with a size of €30.5m. (link)
Agapitos Diakogiannis, co-founder & CEO of Seafair, on lessons learned building a b2b marketplace in maritime. (link)
Generative AI disrupting the financial industry with Jason Manolopoulos, Partner at L-Stone Capital. (link)
Why become a Product Manager by Joseph Alvertis, VP Product at TileDB. (link)
We should be paying more attention to quantum computing by Yiannis Varelas, founder of Monday Capital. (link)
Roza Tapini and Yiota Tzavara from Skroutz discuss how company values evolved as the company grew throughout the years. (link)
The future of online advertising with Matina Thomaidou, VP Data Science of Dataseat. (link)
Dimitris Georgakopoulos, Partner of Zeno Capital, on his journey building a startup that got acquired for $580m in 2020. (link)
Backward Working Documents, what are they, and how to use them by Manos Kyriakakis, Head of Product & Growth at Simpler. (link)
Working with LLMs in the backend by Stelios Gerogiannakis. (link)
You can find a list of 90+ Greek Tech communities HERE. Most of them are pretty active and span topics from engineering to product, AI, design, blockchain, and more.
“Bootcamps: front-end & back-end development” by WE LEAD
“Biomedicine, Bioinformatics & Biotechnology Forum” on Sep 15-17
“Disrupt Athens” by DisruptHR on Sep 28