Google makes breakthrough in quantum computing, much bigger than AI
Google has recently claimed a breakthrough in an important sub-field called quantum error correction, which would make quantum computing more accurate.
- In traditional computers, information is stored in bits, each representing either 0 or 1.
- In contrast, quantum computers use quantum bits, or qubits, which can encode information as 0, 1, or both at the same time.
- However, qubits are so sensitive that even stray light can throw off their calculations, and this problem grows worse as quantum computers scale up.
- To close this error gap, Google claims to have made a breakthrough in quantum error correction technology.
- Quantum error correction protects information by encoding it across multiple physical qubits to form "logical qubits", rather than relying on any single physical qubit.
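The idea of spreading one logical value across several physical units so that a single fault can be detected and undone can be illustrated with a classical repetition code. This is only a loose analogy, since real quantum error correction (such as the surface codes Google uses) cannot simply copy qubit states; all function names below are hypothetical.

```python
def encode(bit):
    # Encode one "logical" bit into three "physical" bits (repetition code).
    return [bit, bit, bit]

def apply_noise(bits, flip_index):
    # Flip a single physical bit to simulate an error (e.g. from stray light).
    noisy = bits[:]
    noisy[flip_index] ^= 1
    return noisy

def decode(bits):
    # Majority vote recovers the logical bit despite one physical-bit error.
    return 1 if sum(bits) >= 2 else 0

physical = encode(1)                   # [1, 1, 1]
corrupted = apply_noise(physical, 0)   # [0, 1, 1] -- one bit flipped
recovered = decode(corrupted)          # 1 -- the logical bit survives
print(physical, corrupted, recovered)
```

A single flipped physical bit no longer corrupts the logical bit; the quantum versions achieve an analogous protection with far more elaborate machinery.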
- Quantum computing focuses on developing computer technology based on the principles of quantum theory, which explains the nature and behaviour of energy and matter at the quantum (atomic and sub-atomic) level.
It uses two key principles of quantum physics: superposition and entanglement.
- Superposition: This means that each qubit can represent both 1 and 0 at the same time.
- Entanglement: This means that qubits in superposition can be correlated with each other, i.e. the state of one (whether it is a 1 or a 0) can depend on the state of the other.
Source – Business Standard