Saturday, May 30, 2020

Speeding up learning in machine learning by Rice University


A couple of research results from Rice University computer scientists can speed up training in machine learning algorithms. One is SLIDE (Sub-LInear Deep learning Engine), claimed to be the first algorithm to train deep neural nets faster on CPUs than on GPUs. The other is MACH (Merged Average Classifiers via Hashing), where training times are about 7-10 times faster and memory footprints are 2-4 times smaller.
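The common idea behind both systems is hashing: instead of computing every neuron's activation, locality-sensitive hashing retrieves only the few neurons likely to fire for a given input. As a toy sketch of that idea (a generic SimHash scheme, not Rice's actual implementation; all names and sizes here are made up for illustration):

```python
import random

random.seed(0)

DIM, NEURONS, BITS = 8, 100, 4

# Random hyperplanes defining a SimHash: a vector's signature records
# which side of each hyperplane it falls on.
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(BITS)]

def signature(vec):
    return tuple(sum(p * v for p, v in zip(plane, vec)) >= 0
                 for plane in planes)

# Index each neuron's weight vector into a hash bucket (done once, up front).
weights = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(NEURONS)]
buckets = {}
for i, w in enumerate(weights):
    buckets.setdefault(signature(w), set()).add(i)

def active_neurons(x):
    # Only neurons whose weights hash to the same bucket as the input are
    # evaluated; the rest are skipped, giving sub-linear work per sample.
    return buckets.get(signature(x), set())

x = [random.gauss(0, 1) for _ in range(DIM)]
print(len(active_neurons(x)), "of", NEURONS, "neurons evaluated")
```

Vectors with a high dot product tend to land in the same bucket, so the retrieved neurons approximate the largest activations at a fraction of the cost of a full forward pass.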

TensorFlow Quantum - open source library for Quantum machine learning


In March 2020, Google released TensorFlow Quantum (TFQ), an open-source library for rapid prototyping of quantum ML models. TFQ provides the tools necessary for bringing the quantum computing and machine learning research communities together to control and model natural or artificial quantum systems, e.g. Noisy Intermediate Scale Quantum (NISQ) processors. Under the hood, TFQ integrates Cirq, an open-source framework for programming NISQ computers, with TensorFlow. NISQ devices have roughly 50-100 qubits and high-fidelity quantum gates; quantum computers of this scale may be able to perform tasks that surpass the capabilities of today's classical digital computers, but noise in the quantum gates limits the size of the quantum circuits that can be executed reliably.
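Frameworks like Cirq describe circuits of gates acting on vectors of complex amplitudes. As a toy illustration of that state-vector model (plain Python, not the Cirq or TFQ API), here is a single qubit put into equal superposition by a Hadamard gate:

```python
import math

# A single-qubit state is two complex amplitudes: [amp(|0>), amp(|1>)].
def apply_gate(gate, state):
    # Multiply the 2x2 gate matrix by the state vector.
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

state = [1, 0]                 # start in |0>
state = apply_gate(H, state)   # equal superposition
probs = [abs(a) ** 2 for a in state]
print(probs)  # each measurement outcome has probability ~0.5
```

A real circuit composes many such gates across many qubits, and the state vector grows as 2^n, which is exactly why ~50 qubits is where classical simulation starts to break down.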

A note on NISQ devices: NISQ devices will be useful tools for exploring many-body quantum physics, and may have other useful applications, but the 100-qubit quantum computer will not change the world right away; we should regard it as a significant step toward the more powerful quantum technologies of the future. Quantum technologists should continue to strive for more accurate quantum gates and, eventually, fully fault-tolerant quantum computing.

How a quantum computer actually works


A good article explaining how quantum computing works. It gives an overview of why quantum computers offer a potential advantage only for certain kinds of problems: generally ones where it is easy to check an answer once you have it, but incredibly difficult to find one in the first place.
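Integer factoring is the classic example of that asymmetry: verifying a factorization is one multiplication, while finding the factors classically means searching a huge space (quantum algorithms such as Shor's target exactly this gap). A small sketch:

```python
# Checking a proposed factorization is cheap: one multiplication.
def check(n, p, q):
    return p * q == n and p > 1 and q > 1

# Finding the factors by naive trial division: cost grows rapidly with
# the size of n (exponential in its number of digits).
def find_factor(n):
    for p in range(2, int(n ** 0.5) + 1):
        if n % p == 0:
            return p, n // p
    return None

n = 3 * 331  # = 993, a toy example; real cryptographic n has hundreds of digits
print(check(n, 3, 331))   # cheap to verify
print(find_factor(n))     # expensive to find in general
```

For a 993-sized number trial division is instant, but for the numbers used in cryptography the "find" direction is infeasible classically, while the "check" direction stays trivial.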