Feature Articles

 

Intel’s New Path to Quantum Computing

In this article in IEEE Spectrum, Jim Clarke, Intel's Director of Quantum Hardware, discusses Intel's two technological approaches to quantum computing hardware. Despite all of the hype and promises, quantum computing is still an immature technology, and the ultimate approach for practical systems is still to be determined.

One approach uses superconducting quantum bits, or qubits, designed to operate at temperatures as low as 0.01 K. This is similar to an approach being pursued by Google and D-Wave Systems, among others. A 49-qubit system (code-named Tangle Lake) has been packaged and tested, and is shown in the photograph as the small object with gold connectors.

The other approach is based on Si quantum dots, where the qubits are essentially single-electron transistors and the information is encoded in the spin of the electron. These are compatible with CMOS processing, and full wafers of chips with up to 26 qubits (shown in the photograph) have been fabricated and tested. These chips still require cryogenic cooling, but may operate at somewhat higher temperatures than the superconducting approach, up to about 1 K. They may also be more compatible with integrated semiconductor control circuitry.

Intel also has a free online simulator for small quantum systems.
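For readers curious about what such a simulator actually computes, the minimal sketch below builds a two-qubit state vector in plain NumPy, applies standard Hadamard and CNOT gates, and prints the measurement probabilities of the resulting entangled Bell state. It is a generic textbook illustration, not Intel's simulator or its interface.

```python
# Minimal two-qubit state-vector sketch using only NumPy. This is a
# generic illustration of how small quantum simulators work, not
# Intel's simulator; the gates are standard textbook choices.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I = np.eye(2)                                 # single-qubit identity

# CNOT gate: control = qubit 0, target = qubit 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>: a length-4 complex amplitude vector.
state = np.zeros(4, dtype=complex)
state[0] = 1.0

# Apply H to qubit 0, then CNOT, giving the entangled Bell state
# (|00> + |11>) / sqrt(2).
state = np.kron(H, I) @ state
state = CNOT @ state

# Probabilities of measuring |00>, |01>, |10>, |11>.
print(np.round(np.abs(state) ** 2, 3))  # -> [0.5 0. 0. 0.5]
```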

Another recent article in Semiconductor Engineering provides an overview of quantum computing R&D, including contributions from IBM, Google, Microsoft, LETI, and D-Wave Systems, as well as Intel.

 

Roadmapping Cryogenic Electronics and Quantum Information Processing

The IEEE International Roadmap for Devices and Systems (IRDS) has just released its 2017 roadmap, which focuses primarily on extensions of conventional electronic technologies but also covers newly developing technologies in its chapter on Beyond CMOS (PDF, 3 MB).

One group of new technologies is forming its own International Focus Group on Cryogenic Electronics and Quantum Information Processing. Committee members Scott Holmes and Erik DeBenedictis have summarized the case for roadmapping these technologies here (PDF, 378 KB).

These technologies include superconducting electronic circuits, which require cryogenic temperatures below 100 K, and often below 10 K. In addition, cryogenic semiconducting circuits have been developed for certain specialized applications. Both are relatively mature technologies, with integration demonstrated at the intermediate scale, but not yet at the very large scale that would be needed for direct competition with room-temperature CMOS.

A distinct set of technologies is associated with the new field of quantum information processing, which may require temperatures below 1 K for proper operation. Some of these are also based on superconducting devices, but operate in the regime of ultra-low power dissipation necessary for quantum operation.

For both superconducting and quantum circuits, the industry and the market are still small, but are likely to grow rapidly in the next 20 years. Roadmapping and standards are particularly important if these technologies are to achieve the ambitious goals that have been projected.

 

Artificial Intelligence and Machine Learning Applied to Cybersecurity: New IEEE Trend Paper Based on RC-Sponsored Confluence Summit

Cybersecurity is a critical issue in information technology throughout the world. IEEE has identified Artificial Intelligence and Machine Learning (AI/ML) as key technologies that will impact cybersecurity in both positive and negative ways. The IEEE Rebooting Computing Initiative, together with the IEEE Industry Engagement Committee, sponsored a Confluence Summit last October in Philadelphia, and 19 distinguished experts were charged with developing a Trend Paper on this topic. The Trend Paper has just been issued, and is available online here. Comments on the report are welcome.

The co-chairs of this Confluence Summit and Report were Dejan Milojicic and Barry Shoop. Dr. Milojicic is a Distinguished Technologist at Hewlett Packard Labs, past president of the IEEE Computer Society, and chair of the IEEE Industry Engagement Committee. Dr. Shoop is a professor and head of the Department of Electrical Engineering and Computer Science at the U.S. Military Academy, West Point, and served as 2016 IEEE president.

In addition to addressing issues in Hardware, Software, and Data, the report also discusses legal issues, human factors, and implementation, in the context of industry, academia, government, standards bodies, and the general public. Key recommendations include the following:

  • The future needs of cybersecurity will require advances in technology, legal and human factors, and mathematically verified trust.
  • Coordinated business efforts will be required to establish market-accepted products, certified by established regulatory authorities.
  • AI/ML-fueled cybersecurity must be based on standardized and audited operations.
  • Regulators will need to protect research and operations and establish internationally recognized cooperative organizations.
  • Data, models, and fault warehouses will be essential for tracking progress and documenting threats, defenses, and solutions.

A brief video of Dr. Milojicic discussing the Confluence Summit is also available here. IEEE Spectrum also has an article featuring this Trend Paper here.

 

Quantum Computers Strive to Break Out of the Lab

A new feature article in IEEE Spectrum reviews the past, present, and future of quantum computing, which has received much attention in the last year. The main conclusion is that while small quantum computing circuits made of fewer than 100 quantum bits (“qubits”) have been demonstrated, their practical near-term utility is severely limited, and this is likely to remain the case for at least the next few years. In the near future, these quantum systems may be used to model other small quantum systems, such as small clusters of atoms and molecules.

A key problem is that these quantum systems are extremely sensitive to thermal and electrical noise, and will require a very large overhead of quantum error correction circuits, which are themselves composed of noise-sensitive qubits. Furthermore, most experts in the field view the eventual larger quantum computing systems as special purpose accelerators to be used together with classical computers, rather than as general-purpose replacements for classical computers.
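To convey the scale of that overhead, the short calculation below uses the commonly quoted estimate that a surface-code logical qubit at code distance d requires about d² data qubits plus d² − 1 measurement ancillas. The code distances shown are illustrative assumptions, not figures from the article.

```python
# Illustrative quantum error-correction overhead: a surface-code
# logical qubit at distance d uses roughly d**2 data qubits plus
# d**2 - 1 ancillas. The distances below are arbitrary examples.
def physical_qubits(d):
    """Approximate physical qubits per surface-code logical qubit."""
    return 2 * d**2 - 1

for d in (5, 11, 17):
    print(f"distance {d:2d}: ~{physical_qubits(d)} physical qubits per logical qubit")
# -> 49, 241, and 577 physical qubits, respectively
```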

The article also presents some examples of current quantum computing circuits, using superconducting and trapped-ion technologies. These are being developed by such computing giants as IBM, Google, Microsoft, and Intel, as well as by smaller companies such as IonQ, Rigetti, and D-Wave, and by university and government laboratory teams.

For further details, see the article in IEEE Spectrum here.

 

Transistor Options Beyond 3 nm

Fabrication of next-generation transistor devices is becoming more challenging, and several technologies are being explored to maintain performance improvements at device nodes beyond 3 nm into the next decade. Some of these are variants of present advanced CMOS devices, including, for example, gate-all-around (GAA) FETs, in which the gate wraps entirely around nanowire Si channels. Other variants include complementary FETs (CFETs) and negative-capacitance FETs (NC-FETs). Some of these may incorporate novel materials, such as ferroelectric gate layers based on hafnium oxide. All of this will also require innovations in lithography and interconnects. These technologies are projected for about 2025, but are unlikely to displace the larger nodes for many applications.

For further details, please see the article in Semiconductor Engineering here.

 

Beyond CMOS Computing: The Interconnect Challenge Workshop held in Annapolis, Maryland, Nov. 29, 2017

This workshop addressed the fact that data transfer between logic and memory has increasingly become the major bottleneck in computing speed and energy consumption. Ways to deal with this problem include alternative architectures and computing paradigms (neuromorphic, approximate, 3D, analog, quantum) and alternative interconnect technologies (optical, superconducting, graphene). The agenda is available here, and the slide presentations for many of the talks are available.
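As a rough illustration of why data movement dominates, the back-of-the-envelope calculation below compares commonly cited order-of-magnitude energy figures for a floating-point operation and an off-chip DRAM access. The specific numbers are assumptions for this sketch, not results presented at the workshop.

```python
# Back-of-the-envelope energy comparison (illustrative order-of-
# magnitude figures, assumed for this sketch, not workshop data).
flop_energy_pj = 20.0     # one 64-bit floating-point operation, in pJ
dram_access_pj = 1300.0   # fetching one 64-bit word from off-chip DRAM

ratio = dram_access_pj / flop_energy_pj
print(f"One DRAM operand fetch costs ~{ratio:.0f}x a 64-bit FLOP")
# -> roughly 65x: memory traffic, not arithmetic, sets the energy budget
```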

The keynote address was given by Irene Qualters, Director of the Division of Advanced Cyberinfrastructure at the US National Science Foundation. Her presentation is available here (PDF, 3 MB). She emphasized that the interconnect challenge is at the heart of modern computing, with no single solution likely. A variety of complementary research and development approaches will be needed throughout the computing stack, requiring contributions and coordination among government, industry, and academia.

 

EDA Challenges Machine Learning

An article in Semiconductor Engineering describes the growing importance of Machine Learning (ML) in Electronic Design Automation (EDA). Optimization of circuit layout has long been automated for many cases, but other tasks still require extensive effort by teams of experienced design engineers. Can machine learning capture the expertise of these engineers? Part of the difficulty is that the large training datasets used for other machine learning tasks are typically not available for complex custom circuit design. An alternative may be iterative reinforcement learning, in which the ML system and the engineers work together to train the EDA system toward accurate, efficient, and verifiable designs, as sketched below. Such improved automation tools will be necessary to accelerate development of the next generation of heterogeneous chips for computing systems and the Internet of Things.
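As a loose illustration of the reinforcement-learning idea, the sketch below trains a tabular epsilon-greedy agent on a toy placement problem. All names (PlacementEnv, score_layout) and the problem itself are hypothetical stand-ins; a real EDA flow would involve vastly larger design spaces plus verification steps.

```python
# Toy reinforcement-learning loop for a stand-in 'placement' task.
# Everything here is hypothetical; real EDA problems are far larger.
import random

class PlacementEnv:
    """Toy placement problem: pick the offset that minimizes a
    quadratic 'wirelength' cost around a hidden optimum."""
    def __init__(self, optimum=7):
        self.optimum = optimum

    def score_layout(self, action):
        # Higher reward for placements closer to the hidden optimum.
        return -(action - self.optimum) ** 2

def train(env, actions=range(16), episodes=500, eps=0.2, lr=0.1):
    q = {a: 0.0 for a in actions}  # tabular value estimate per action
    for _ in range(episodes):
        # Epsilon-greedy: mostly exploit the best-known action,
        # occasionally explore. In the human-in-the-loop setting,
        # an engineer's suggestion could be injected here.
        if random.random() < eps:
            a = random.choice(list(actions))
        else:
            a = max(q, key=q.get)
        reward = env.score_layout(a)
        q[a] += lr * (reward - q[a])  # incremental value update
    return max(q, key=q.get)

best = train(PlacementEnv())
print("best placement found:", best)  # converges to 7
```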