Quantum computing advancements are set to revolutionise artificial intelligence as early as 2035. This breakthrough will significantly enhance AI’s ability to process vast datasets, optimise machine learning algorithms, and solve problems currently infeasible for classical systems.
By crunching the data and using the principles of Moore’s Law**, we can predict the scale of change but not the manner. This is open for discussion.
In this week’s Insight post, Jez gives his thoughts on AI and Quantum Computing. The video can be watched below, and a transcript follows.
** Moore’s Law is the observation that the number of transistors on a microchip doubles approximately every two years, leading to exponential growth in computing power while reducing relative costs.
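As a rough illustration of how that doubling compounds, here is a minimal sketch; the starting year and transistor count are hypothetical placeholders, not figures from the post.

```python
# Minimal sketch of Moore's Law compounding: doubling every two years.
# The baseline year and count below are hypothetical, for illustration only.
start_year, start_count = 2000, 50_000_000

for year in range(start_year, 2021, 4):
    doublings = (year - start_year) / 2          # one doubling per two years
    count = start_count * 2 ** doublings
    print(f"{year}: ~{count:,.0f} transistors")
```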
Transcript
I’m going to give you some thoughts on Moore’s law, AI and the implications of quantum computing.
Moore’s law states that the number of transistors on microchips doubles every two years. If you note the scale on the Y axis, it doubles at each step – in this case it’s just showing the gaps between five and ten thousand, and so on.
The line looks fairly linear, but only because it’s plotted on a logarithmic scale – in fact the growth is exponential, of course. The latest data here is from 2020.
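A quick way to see why exponential growth plots as a straight line on a logarithmic axis: if a value doubles every two years, its logarithm goes up by the same fixed amount each step. A minimal sketch with illustrative values, not the chart’s data:

```python
import math

# A value doubling every two years: value(t) = v0 * 2**(t / 2).
# On a log axis the points fall on a straight line, because log10(value)
# increases by a constant log10(2) ~= 0.30 per two-year step.
v0 = 1.0
for t in range(0, 21, 2):
    value = v0 * 2 ** (t / 2)
    print(f"year {t:2d}:  value = {value:8.1f}   log10 = {math.log10(value):.2f}")
```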
AI is following a similar-ish but much steeper curve. The difference between these two charts is the scale: each gap here is a factor of one hundred, whereas on the previous one it was a doubling. This is measured in FLOPS, by the way – floating point operations per second – a measure of computing power. The actual increase was roughly 1.4x a year, so again it’s exponential, increasing by 1.4x a year.
That is, until we get to 2010, which this epoch.ai site is calling the deep learning era. This is where things really take off: the increase runs at a rate of 4x per year – the likes of ChatGPT and so on.
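To put the two rates quoted above side by side: at 1.4x a year, compute roughly doubles every two years (1.4² ≈ 2), while at 4x a year it grows about a thousandfold every five years (4⁵ = 1024). A small sketch of that arithmetic:

```python
# Comparing the two annual growth rates mentioned above: 1.4x vs 4x per year.
def total_growth(rate_per_year: float, years: int) -> float:
    """Overall multiplication of compute after `years` at a fixed annual rate."""
    return rate_per_year ** years

for years in (2, 5, 10):
    print(f"after {years:2d} years:  "
          f"1.4x/yr -> x{total_growth(1.4, years):>12,.1f}   "
          f"4x/yr -> x{total_growth(4.0, years):>12,.0f}")
```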
So… how long do we expect this to continue? At the moment, current thinking says 2035. And why is that? Well, at that point we may well have cracked quantum computing. And then what happens to this rate of increase – this exponential rate of increase?
This is probably going to be a dramatic change, a dramatic impact on what we do, when we have that much computing power and can run these kinds of systems on, what, a hundred times the computing capability of the current machinery? What happens then? How useful would that be? What are the implications of that?
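For a sense of scale, and purely as back-of-the-envelope arithmetic on the figures already quoted (not a prediction): if the roughly 4x-per-year deep-learning-era rate held, a hundredfold jump in compute would correspond to only a few years of that trend, since log 100 / log 4 ≈ 3.3.

```python
import math

# Years of growth at a fixed annual rate needed to reach a target multiple of
# today's compute. The rates and the 100x target are the figures quoted above.
def years_to_multiple(target: float, rate_per_year: float) -> float:
    return math.log(target) / math.log(rate_per_year)

print(f"100x at 4x/year:   {years_to_multiple(100, 4.0):.1f} years")   # ~3.3
print(f"100x at 1.4x/year: {years_to_multiple(100, 1.4):.1f} years")   # ~13.7
```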
I don’t think any of us can possibly know, but it’s coming. Give me your thoughts below. Thank you very much.