Technology Spotlight

 

The Future of Computing: Bits + Neurons + Qubits

Dr. Dario Gil, Director of Research, IBM

At the International Solid-State Circuits Conference (ISSCC) in San Francisco, California, USA in February, Dr. Dario Gil of IBM gave an overview of how IBM sees the future of computing. He projects parallel advances in three technologies: conventional processors (“bits”), neural networks for AI (“neurons”), and quantum processors (“qubits”). Rather than any one of these technologies becoming dominant, he predicts major performance advances in all three, with heterogeneous systems that combine two or more of them addressing critical problems across a computing environment that integrates cloud and edge computing. Near-term applications of quantum computing may be in quantum simulation for materials development, while longer-term advances in AI may come from combining qubits with bits and neurons.

Access the video of Dr. Gil’s presentation.

Access a companion article in the ISSCC 2020 Proceedings.

A preprint of this article is also available.

Several other plenary talks from ISSCC 2020 are available.

 

The Electronics Resurgence Initiative: Innovating the 4th Wave of Electronics Development
Dr. Mark Rosker, Director, Microsystems Technology Office, DARPA

At the DARPA Electronics Resurgence Initiative (ERI) Summit 2019 in Detroit, Michigan, USA, Dr. Rosker presented an overview of semiconductor technology development and of the role that the US government has played, and will continue to play, in coordinating and supporting it. A video of his talk is available.

Access the slides from his presentation (PDF, 3 MB).

His key point is that although the exponential improvement described by Moore’s Law is sometimes presented as a single development process spanning 50 years, it is more properly a sequence of several waves of development, each showing initial growth and later saturation. Each wave has been associated with a set of innovations in materials, devices, circuits, and architectures. The DARPA Electronics Resurgence Initiative is now promoting the 4th wave of semiconductor development, which includes innovations such as 3D heterogeneous integration, optimized AI chips, and designing for cybersecurity.
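As a rough, illustrative sketch of this point (not taken from the talk; the logistic model and all parameters below are invented for the example), summing several S-curves, each representing one wave that grows and then saturates, produces an envelope that can resemble a single long exponential:

    import numpy as np

    # Illustrative model only: each technology "wave" is a logistic S-curve
    # (slow start, rapid growth, saturation); their sum, on a log scale,
    # approximates the smooth exponential usually drawn for Moore's Law.
    def logistic(t, midpoint, height, rate=0.5):
        return height / (1.0 + np.exp(-rate * (t - midpoint)))

    years = np.arange(1970, 2041)
    t = years - 1970

    # Hypothetical waves roughly 15 years apart, each contributing a few
    # orders of magnitude; the numbers are made up for illustration.
    log_capability = sum(logistic(t, m, 5.0) for m in (10, 25, 40, 55))

    for y in (1980, 1995, 2010, 2025, 2040):
        print(f"{y}: ~10^{log_capability[y - 1970]:.1f} (arbitrary units)")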

The Summit program is available and includes links to videos and slides for many of the other keynote presentations.

 

IRDS™ 2019 Highlights and Future Directions
Dr. Paolo Gargini, Chairman of IRDS™

IEEE Rebooting Computing Week was held in San Mateo, California, on 4-8 November 2019, and included a Workshop on the International Roadmap for Devices and Systems (IRDS™), the Industry Summit on the Future of Computing, the International Conference on Rebooting Computing (ICRC), and the Workshop on Quantum Computing. Videos of many of the presentations are available on IEEE.tv.

Dr. Gargini presented a brief overview of the past, present, and future of semiconductor roadmaps and IRDS™. Access the video on IEEE.tv.

While traditional 2D scaling is saturating, Dr. Gargini identified 3D power scaling, together with new circuit architectures, as the path forward for the period 2025-2040.

The most recent Roadmap is available on the IRDS™ website.

The new edition of the Roadmap is expected to be online by April.

 

Highlights from the Industry Summit on the Future of Computing
Bruce Kraemer, IEEE Industry Summit Chairman

The IEEE Industry Summit on the Future of Computing was held in San Mateo, California, on 4 November 2019, and consisted of a series of invited talks and panel presentations by leaders in the field. Slides for many of the presentations are linked from the Summit Program, and videos are available on IEEE.tv, together with other presentations from IEEE Rebooting Computing Week.

Summit Chairman Bruce Kraemer presented a brief overview of the invited talks on quantum computing, AI, and memory-centric computing, as well as a panel on startups; the video is available on IEEE.tv.

Across all of these approaches to future computing, the technologies at this preliminary stage are remarkably diverse. For example, speakers on quantum computing presented superconducting, optical, and semiconductor-based solutions. Performance benchmarks that will allow these alternative technologies to be compared are still being developed. Despite concerns from some that Moore’s Law is ending, there was agreement that this is an exciting time for the computer industry.

 

AI Systems in a Hybrid World
Dr. Cindy Goldberg, Program Director, IBM Research AI Hardware Center

The IEEE Industry Summit on the Future of Computing was held in San Mateo, California, on 4 November 2019. Many of the invited presentations are available through IEEE.tv. One of the invited speakers was Dr. Cindy Goldberg of IBM Research, who presented an overview of IBM’s programs for developing improved hardware for deep learning with neural networks. While IBM also has a major program in quantum computing, the company believes that future quantum and AI systems will address very different types of problems with very different types of hardware.

One of the key problems with present AI technology based on digital CMOS is the large power consumption of both training and inference. While approximate-computing schemes decrease power consumption, new algorithms and architectures may be needed to maintain accuracy. Further improvement in performance per watt may require analog AI cores with improved memory materials, which IBM is also developing with the goal of a new Moore’s Law of improving performance for AI systems over the next decade.
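As a minimal, hypothetical illustration of one approximate-computing technique (uniform 8-bit weight quantization; this example is not IBM’s method, and all values are invented), the sketch below trades a small loss of numerical precision for a fourfold reduction in weight memory, which on suitable hardware also lowers energy per operation:

    import numpy as np

    # Quantize float32 weights to int8 with a single per-tensor scale factor.
    def quantize_int8(w):
        scale = np.max(np.abs(w)) / 127.0        # largest magnitude maps to 127
        q = np.round(w / scale).astype(np.int8)  # 4x smaller than float32
        return q, scale

    def dequantize(q, scale):
        return q.astype(np.float32) * scale

    rng = np.random.default_rng(0)
    w = rng.standard_normal((256, 256)).astype(np.float32)
    q, scale = quantize_int8(w)

    print("bytes (float32):", w.nbytes)
    print("bytes (int8):   ", q.nbytes)
    print("max abs error:  ", np.max(np.abs(w - dequantize(q, scale))))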

Looking to the future, IBM envisions hybrid cloud computing, comprising combinations of bits, neurons, and qubits, i.e., classical digital, classical analog, and quantum computing.

The entire 25-minute talk is available on IEEE.tv.