There’s been a lot of buzz around quantum computing over the last year or so, and there seems little doubt that it will provide the next major step forward in computing power, but it’s still largely theoretical – you can’t buy a quantum computer today. So, what does it really mean… and why should we care?
Today’s computers are binary. The transistors (tiny switches) that are contained in microchips are either off (0) or on (1) – just like a light switch. Quantum computing is based on entirely new principles. And quantum mechanics is difficult to understand – it’s counterintuitive – it’s weird. So let’s look at some of the basic concepts:
Superposition
Superposition is a concept whereby, instead of a state being on or off, it’s on and off. At the same time. And it’s everything in the middle as well. Think of it as a scale from 0 to 1 and all the numbers in-between.
Qubit
A quantum bit (qubit) uses superposition so that, instead of working through a problem’s possibilities sequentially, we can compute across many of them in parallel.
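To make that slightly less abstract, here’s a minimal sketch – simulated with NumPy, not a real quantum runtime – of a single qubit as a two-element state vector: a Hadamard gate puts |0⟩ into an equal superposition, and the squared amplitudes give the measurement probabilities.

```python
import numpy as np

# Minimal sketch: a qubit modelled as a 2-element state vector (illustration only).
ket0 = np.array([1.0, 0.0])                   # the |0> state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ ket0                              # equal superposition of |0> and |1>
print(state)                                  # [0.707..., 0.707...]
print(np.abs(state) ** 2)                     # [0.5, 0.5] – a 50/50 chance of measuring 0 or 1
```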
Particles like electrons have a charge and a spin, so they point in a certain direction. Noise from other electrons makes them wobble, so the information in one leaks into others, which makes long calculations difficult. This is one of the reasons that quantum computers run at low temperatures.
Greek dancers hold their neighbour so that they move as one. One approach in quantum computing is to do the same with electrons so that only those at the end have freedom of motion – a concept called electron fractionalisation. This creates a robust building block for a qubit, one that is more like Lego (locking together) than a house of cards (loosely stacked).
Different teams of researchers are using different approaches to solve error correction problems, so not everyone’s qubits are equal! One approach is to use topological qubits for reliable computation, storage and scaling. Just like Inca quipus (a system of knots and braids used to encode information so it couldn’t be washed away, unlike chalk marks), topological qubits can braid information and create patterns in code.
Exponential scaling
Once the error correction issue is solved, then scaling is where the massive power of quantum computing can be unleashed.
A 4 bit classical computer has 16 configurations of 0s and 1s but can only exist in one of these states at any time. A quantum register of 4 qubits can be in all 16 states at the same time and compute on all of them at the same time!
Every n interacting qubits can handle 2^n bits of information in parallel, so:
10 qubits = 1,024 classical bits (about a kilobit)
20 qubits ≈ 1 megabit
30 qubits ≈ 1 gigabit
40 qubits ≈ 1 terabit
etc.
This means that the computational power of a quantum computer is potentially huge.
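As a quick back-of-the-envelope illustration of that growth (a sketch only – raw state count is not the same as usable computing power):

```python
# Minimal sketch: how the number of simultaneous basis states grows with qubit count (2^n).
for n in (4, 10, 20, 30, 40):
    print(f"{n} qubits -> {2 ** n:,} basis states")
```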
What sort of problems need quantum computing?
We won’t be using quantum computers for general personal computing any time soon – Moore’s Law is doing just fine there – but there are a number of areas where quantum computing is better suited than classical computing approaches.
We can potentially use the massive quantum computing power to solve problems like:
Cryptography (a quantum computer could break RSA with 2048-bit keys – the algorithm that underpins much of today’s online commerce – in around 100 seconds, so we need new, quantum-resistant models).
Quantum chemistry and materials science (nitrogen fixation, carbon capture, etc.).
Machine learning (faster training of models – quantum computing as a “co-processor” for AI).
Other intractable problems that are currently constrained by available supercomputing capacity (improved medicines, etc.).
A universal programmable quantum computer
Microsoft is trying to create a universal programmable quantum computer – the whole stack – and they’re pretty advanced already. The developments include:
A global team of physicists, mathematicians, cryogenics specialists, programmers and computer scientists.
Technology: Developmental work around physics, materials, devices and controls required to make a quantum computer, together with a runtime that executes a quantum algorithm while maintaining the state of the machine, operating the control system in a parallel real-time environment, and communicating from the device to the outside world.
Just over a week ago, risual held its bi-annual summit at the risual HQ in Stafford – the whole company back in the office for a day of learning with a new format: a mini-conference called risual:NXT.
I was given the task of running the technical track – with 6 speakers presenting on a variety of topics covering all of our technical practices: Cloud Infrastructure; Dynamics; Data Platform; Unified Intelligent Communications and Messaging; Business Productivity; and DevOps – but I was also privileged to be asked to present a keynote session on technology trends. Unfortunately, my 35-40 minutes of content had to be squeezed into 22 minutes… so this blog post summarises some of the points I wanted to get across but really didn’t have time for.
Organisations need to transform their cloud operations because that’s where the benefits are – embrace the productivity tools in Office 365 (no longer just cloud versions of Exchange/Lync/SharePoint but a full collaboration stack) and look to build new solutions around advanced workloads in Azure. Microsoft is way ahead in the PaaS space – machine learning (ML), advanced analytics, the Internet of Things (IoT) – there are so many scenarios for exploiting cloud services that simply wouldn’t be possible on-premises without massive investment.
And for those who still think they can compete with the scale that Microsoft (Amazon and Google) operate at, this video might provide some food for thought…
(and for a similar video from a security perspective…)
2. Data: the fuel of the future
I hate referring to data as “the new oil”. Oil is a finite resource. Data is anything but finite! It is a fuel though…
Data is what provides an economic advantage – there are businesses without data and those with. Data is the business currency of the future. Think about it: Facebook and Google are entirely based on data that’s freely given up by users (remember, if you’re not paying for a service – you are the service). Amazon wouldn’t be where it is without data.
So, thinking about what we do with that data: the 1st wave of the Internet was about connecting computers, the 2nd was about connecting people, and the 3rd is about connecting devices.
Despite what you might read, IoT is not about connected kettles/fridges. It’s not even really about home automation with smart lightbulbs, thermostats and door locks. It’s about gathering information from billions of sensors out there. Then we take that data, use it to make intelligent decisions, and apply them in the real world. Artificial intelligence and machine learning feed on data – they are yin and yang to each other. We use data to train algorithms, then we use the algorithms to process more data.
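Here’s a minimal sketch of that loop, using made-up sensor readings and scikit-learn purely for illustration:

```python
# Minimal sketch of "data trains the model, the model processes more data".
# The sensor readings and labels below are invented for illustration.
from sklearn.linear_model import LogisticRegression

historical_readings = [[18.0, 40], [19.5, 42], [30.1, 80], [31.4, 85]]  # temperature, humidity
labels = [0, 0, 1, 1]                                                   # 0 = normal, 1 = alert

model = LogisticRegression().fit(historical_readings, labels)           # learn from past data

new_readings = [[20.2, 45], [33.0, 90]]                                 # fresh sensor data streaming in
print(model.predict(new_readings))                                      # decisions applied back in the real world
```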
The Microsoft Data Platform is about analytics and data driving a new wave of insights and opening up possibilities for new ways of working.
James Watt’s 18th Century steam engine led to an industrial revolution. The intelligent cloud is today’s version – moving us to the intelligence revolution.
3 Blockchain
Bitcoin is just one implementation of something known as the blockchain – in this case, as a digital currency.
But Blockchain is not just for monetary transactions – it’s more than that. It can be used for anything transactional. Blockchain is about a distributed ledger. Effectively, it allows parties to trust one another without knowing each other. The ledger is a record of every transaction, signed and tamper-proof.
The magic of blockchain is that as the chain gets longer, each new block reinforces the ones before it – effectively, the more the chain is used, the harder it becomes to tamper with. That gives very strong integrity guarantees.
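To make the tamper-evidence idea concrete, here’s a minimal sketch of a hash-chained ledger – an illustration of the core concept only, not any particular blockchain implementation:

```python
import hashlib
import json

# Minimal sketch: a hash-chained ledger. Each block commits to the previous
# block's hash, so altering any earlier entry invalidates every link that follows.

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transaction: str) -> None:
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"tx": transaction, "prev_hash": previous})

ledger = []
append_block(ledger, "Alice pays Bob 5")
append_block(ledger, "Bob pays Carol 2")

# Tampering with the first block breaks the link stored in the second.
ledger[0]["tx"] = "Alice pays Bob 500"
print(block_hash(ledger[0]) == ledger[1]["prev_hash"])  # False – the tampering is detectable
```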
Blockchain is seen as strategic by Microsoft (whose blockchain middleware work is codenamed “Bletchley”) and by the UK government. It’s early days, but expect to see it applied wherever people care about integrity and data resilience – databases, and anything transactional, can be signed and anchored to a blockchain.
(BTW, Bletchley is a town in Buckinghamshire that’s now absorbed into Milton Keynes. Bletchley Park was the primary location of the UK Government’s wartime code-cracking efforts that are said to have shortened WW2 by around 2 years. Not a bad name for a cryptographic technology, hey?)
4 Into the third dimension
So, we’ve had the ability to “print” in three dimensions for a while, but now 3D is going further: we’re taking physical worlds into the virtual world and augmenting them with information.
To make use of this we need to be able to scan and render 3D images, then move them into a virtual world. 3D is built into the next Windows 10 release (the Fall Creators Update, due on 17 October 2017). This will bring Paint 3D, a 3D gallery, and View 3D for our phones – so we can scan any object and import it into a virtual world. Given the adoption rates of new Windows 10 releases, that puts 3D in the hands of millions of PC users.
This Christmas will see lots of consumer headsets on the market, and mixed reality will really take off after that. Microsoft is way ahead on the plumbing – all while we weren’t looking. They held their HoloLens product back to be big in business (so that it wasn’t a solution looking for a problem). Now it can be applied to field-worker scenarios and to visualising things before they are built.
To give an example: recently, I had a builder quote for a loft extension at home. He described how the stairs would work and sketched a room layout – but what if I could have visualised it in a headset? Then imagine picking the paint, sofas, furniture, wallpaper and more.
The video below shows how Ford and Microsoft have worked together to use mixed reality to shorten and improve product development:
5 The new dawn of artificial intelligence
All of the legends of AI are set by sci-fi (Metropolis, 2001: A Space Odyssey, Terminator). But AI is not about killing us all! Humans vs. machines? IBM’s Deep Blue beating people at chess, Watson winning at Jeopardy!, then Google’s AlphaGo taking on Go. Heading into the economy and displacing jobs. Automation of business processes and economic activity. Mass unemployment?
Let’s take a more optimistic view! It’s not about sentient/thinking machines or giving human rights to machines. That stuff is interesting but we don’t know where consciousness comes from!
AI is a toolbox of high-value tools and techniques. We can apply these to problems and appreciate the fundamental shift from programming machines to machines that learn.
AI is not about programming logical steps – we can’t do that when we’re recognising images, speech, etc. Instead, our inspiration is biology: neural networks, and using maths to train complex layers of them, led to deep learning.
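As a minimal sketch of what “layers of maths” means in practice (the weights below are arbitrary placeholders – in real deep learning they are learned from data):

```python
import numpy as np

# Minimal sketch: a neural network is layered weighted sums passed through
# non-linearities. The weights here are arbitrary; training would learn them.
sigmoid = lambda z: 1 / (1 + np.exp(-z))

x = np.array([0.2, 0.9, 0.4])                          # an input (e.g. pixel intensities)
W1 = np.array([[0.5, -0.3], [0.8, 0.1], [-0.6, 0.7]])  # input -> hidden layer weights
W2 = np.array([[1.2], [-0.9]])                         # hidden -> output weights

hidden = sigmoid(x @ W1)                               # first layer of "neurons"
output = sigmoid(hidden @ W2)                          # final score between 0 and 1
print(output)
```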
Image recognition was “magic” a few years ago but now it’s part of everyday life. Nvidia’s share price has grown massively on the back of GPU demand for deep learning and autonomous vehicles. And Microsoft is democratising AI (in its own applications – with an intelligent cloud, intelligent agents and bots).
So, about those bots…
A bot is a web app with a conversational user interface. We use them because natural language processing (NLP) and AI are here today – and because messaging apps rule the world. With bots, we can use human language as a new user interface; bots are the new apps – our digital assistants.
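Stripped to its essentials, a bot is a loop that maps what the user says to an intent and replies conversationally. Here’s a minimal sketch – the intents and phrases are made up, and there’s no real NLP; a production bot would use a proper language-understanding service:

```python
# Minimal sketch (no real NLP): map an utterance to an intent with simple phrase matching.
# Intent names and phrases are illustrative only.
INTENTS = {
    "greeting": ["hello", "hi", "good morning"],
    "opening_hours": ["when are you open", "opening hours"],
}

def match_intent(utterance: str) -> str:
    text = utterance.lower()
    for intent, phrases in INTENTS.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return "fallback"

print(match_intent("Hi there!"))                 # greeting
print(match_intent("When are you open today?"))  # opening_hours
```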
We can employ bots in several scenarios today – including customer service and productivity – and this video is just one example, with Microsoft Cortana built into a consumer product:
The device is similar to Amazon’s popular Echo smart speaker, and a skills kit is used to teach Cortana about an app – you ask the skill by name to do something (“Ask <skill name> to…”). The beauty of Cortana is that it’s cross-platform, so the skill can show up wherever Cortana does. More recently, Amazon and Microsoft have announced Cortana-Alexa integration (meanwhile, Siri continues to frustrate…)
AI is about augmentation, not replacement. It’s true that bots may replace humans for many jobs – but new jobs will emerge. And it’s already here. It’s mainstream. We use recommendations for playlists, music, etc. We’re recognising people, emotions, etc. in images. We already use AI every day…
6 From silicon to cells
Every cell has a “programme” – DNA. And researchers have found that they can write code in DNA and control proteins/chemical processes. They can compile code to DNA and execute, creating molecular circuits. Literally programming biology.
The benefits of DNA are that it’s very dense and it lasts for thousands of years so can always be read. And we’re just storing 0s and 1s – that’s much simpler than what DNA stores in nature.
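As a minimal sketch of the “just storing 0s and 1s” point, here’s one simple, illustrative way to map binary data onto DNA bases – real DNA storage schemes add error correction and avoid problematic sequences, which this ignores:

```python
# Minimal sketch (illustration only): encode binary data as DNA bases,
# mapping two bits per nucleotide (00=A, 01=C, 10=G, 11=T).
BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}

def bytes_to_dna(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

print(bytes_to_dna(b"Hi"))  # 'CAGACGGC' – 8 bases encode 2 bytes
```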
With massive data storage… the next step is faster computing – that’s where Quantum computing comes in.
I’m a geek and this one is tough to understand… so here’s another video:
https://youtu.be/doNNClTTYwE
Quantum computing is starting to gain momentum. Dominated by maths (quantum mechanics), it requires thinking in equations, not translating into physical things in your head. It has concepts like superposition (multiple states at the same time) and entanglement. Instead of gates being turned on/off it’s about controlling particles with nanotechnology.
Where a classical computer works through its 0s and 1s one clock cycle at a time, one quantum bit (a qubit) holds multiple states at the same time. It can be used to solve difficult problems (the RSA 2048 challenge problem would take a billion years on a supercomputer but just 100 seconds on a 250-qubit quantum computer). This can be applied to encryption and security, health and pharma, energy, biotech, the environment, materials and engineering, AI and ML.
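And here’s a minimal sketch of entanglement, simulated with NumPy rather than real qubits: starting from |00⟩, a Hadamard on the first qubit followed by a CNOT leaves only the outcomes 00 and 11 possible, so measuring one qubit fixes the other.

```python
import numpy as np

# Minimal sketch: entangling two qubits into a Bell state (illustration only).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.kron(H, I) @ np.array([1.0, 0, 0, 0])  # Hadamard on qubit 1 of |00>
state = CNOT @ state                              # entangle the pair
print(np.abs(state) ** 2)                         # [0.5, 0, 0, 0.5] – only 00 and 11 are ever measured
```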
There’s a race for quantum computing hardware taking place and China sees this as a massively strategic direction. Meanwhile, the UK is already an academic centre of excellence – now looking to bring quantum computing to market. We’ll have usable devices in 2-3 years (where “usable” means that they won’t be cracking encryption, but will have initial applications in chemistry and biology).
Amazon, Google and Microsoft each invest over $12bn p.a. on R&D. As demonstrated in the video above, their datacentres are not something that many organisations can afford to build but they will drive down the cost of computing. That drives down the cost for the rest of us to rent cloud services, which means more data, more AI – and the cycle continues.
I’ve shared 7 “technology bets” (and there are others that I haven’t covered, like the use of graphene) – my list is very much influenced by my work with Microsoft technologies and services. We can’t always predict the future, but all of these are real… the only bet is how big they will be. Some are mainstream, some are up and coming – and some will literally change the world.
Credit: Thanks to Rob Fraser at Microsoft for the initial inspiration – and to Alun Rogers (@AlunRogers) for helping place some of these themes into context.