- Neural Network Processor for Machine Learning - Intel announces new commercial AI chips for training and inference.
- Quantum Information Edge - New consortium on quantum computing led by US National Laboratories, with universities and industry.
- Update of National Strategic Computing Initiative - New Report from US Office of Science and Technology Policy addresses changes from original 2016 plan, including new emphasis on cybersecurity.
- Quantum Volume Explained - Metric promoted by IBM for comparing prototype quantum computing systems.
- Quantum Supremacy? - Google’s quantum tech milestone excites scientists and spurs rivals.
- Heterogeneous Integration Roadmap - Industry consortium SEMI announces new 15-year roadmap for electronic packaging and systems.
- Low-energy alternative to Bitcoin? - New financial algorithms promise security comparable to blockchains at a fraction of the energy cost.
- Carbon Nanotube Microprocessor - 16-bit programmable processor based on a 3D integrated circuit comprising roughly 15,000 transistors.
- Moore’s Law for Circuit Density is Continuing - Smaller circuits and 3D integration may enable continued growth in density for another 30 years.
- Artificial Intelligence Roadmap Released by Computing Community Consortium - Projects key areas of collaborative research over the next 20 years across academia, industry, and government.
- Secure Computer Architectures - Special Issue of IEEE Micro Magazine
- Wafer-Scale Fabrication of 3D Carbon Nanotube Chips - Integrated with RRAM memory arrays
- Processing in Memory to Reduce Power - Non-von-Neumann architecture could be key to low-power machine learning.
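The Quantum Volume metric mentioned above can be sketched numerically. In IBM's formulation, log2(QV) is the largest n such that model circuits of width n and depth n pass a heavy-output benchmark; equivalently, max over n of min(n, d(n)), where d(n) is the deepest passing depth at width n. The minimal sketch below assumes hypothetical, made-up benchmark results:

```python
def log2_quantum_volume(max_depth_for_width):
    """Given a dict mapping circuit width n to the largest depth d at
    which the device still passes the heavy-output test, return
    log2(QV) = max over n of min(n, d(n))."""
    return max(min(n, d) for n, d in max_depth_for_width.items())

# Hypothetical measured results: width -> deepest passing depth.
results = {2: 5, 3: 4, 4: 3, 5: 2}

log2_qv = log2_quantum_volume(results)  # width 3 supports depth 4, so 3
qv = 2 ** log2_qv                       # QV = 8
```

The min/max structure captures why the metric rewards balanced machines: adding qubits does not raise QV unless circuit depth (i.e., gate fidelity) keeps pace.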
Grand Challenge: Applying Artificial Intelligence and Machine Learning to Cybersecurity
Access the article by Kirk Bresniker, Ada Gavrilovska, James Holt, Dejan Milojicic, and Trung Tran in IEEE Xplore.
Providing future cybersecurity will require integration of AI/ML throughout the network worldwide. Initiating a series of Grand Challenges on this topic will help the community achieve this goal.
The December issue of Computer has a set of open-access feature articles on Technology Predictions. One of these is by Bresniker et al., on how AI/ML can help to address the pervasive and growing problem of cyberattacks. This follows a set of earlier workshops and a 2018 report (PDF, 1 MB) on a similar topic by some of the same authors.
The authors argue that, given the massive scale of the problem, its continuous evolution, and the need for rapid responses, it can only be handled by a system of ubiquitous AI agents capable of machine learning. However, these autonomous AI agents must quickly incorporate the insights of the best human cyber analysts, many of whom work privately on non-public data sets. The authors propose that an annual Grand Challenge, with prizes as motivation, can help to bring about the necessary collaboration and competition to achieve this goal. Given how critical the problem is to business and government, the challenge should be initiated as soon as possible.
AI Systems in a Hybrid World
Dr. Cindy Goldberg, Program Director, IBM Research AI Hardware Center
The IEEE Industry Summit on the Future of Computing was held in San Mateo, California, on 4 November 2019. Many of the invited presentations are available through IEEE.tv. One of the invited speakers was Dr. Cindy Goldberg of IBM Research, who presented an overview of IBM’s programs in developing improved hardware for deep learning on neural networks. While IBM also has a major program in quantum computing, IBM believes that future quantum and AI systems will address very different types of problems with very different types of hardware.
One of the key problems with present AI technology based on digital CMOS is that power consumption for both training and inference is quite large. While approximate-computing schemes decrease power consumption, new algorithms and architectures may be needed to maintain performance. Further improvement in performance/watt may require analog AI cores with improved memory materials, which IBM is also developing to achieve a new Moore's Law of improved performance for AI systems in the next decade.
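One common approximate-computing technique for cutting inference energy is reduced-precision arithmetic. The sketch below (an illustration, not IBM's method; the weight values are made up) quantizes float32 weights to int8 and shows that the round-trip error stays bounded by half the quantization step:

```python
import numpy as np

def quantize_int8(w):
    """Uniformly quantize float32 weights to int8.
    Returns the quantized values and the scale factor."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Map int8 values back to approximate float32 weights."""
    return q.astype(np.float32) * scale

# Made-up example weights.
w = np.array([0.5, -1.2, 0.03, 0.9], dtype=np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
max_err = np.abs(w - w_hat).max()  # bounded by s / 2
```

Storing and multiplying 8-bit integers instead of 32-bit floats shrinks memory traffic and arithmetic energy per operation, which is why quantization is a standard first step before the analog approaches the talk describes.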
Looking to the future, IBM envisions hybrid cloud computing, comprising combinations of bits, neurons, and qubits, i.e., classical digital, classical analog, and quantum computing.
The entire 25-minute talk is available on IEEE.tv.
- Rebooting Computing Video Overview
- IEEE Future Directions
- Computing in Science and Engineering on the End of Moore's Law
- IEEE Journal of Exploratory Solid-State Computational Devices and Circuits (JXCDC)
- Arch2030 Workshop Report
- Workshop on Neuromorphic Computing
- Workshop on Beyond CMOS Technology
- Update on National Strategic Computing Initiative (NSCI)
- RC White Paper on Nanocomputers
- IEEE Computer Magazine on Rebooting Computing
- RC-ITRS Report on the Foundation of the New Computer Industry Beyond 2020