What's New
- Memory Chips that Compute Will Accelerate AI - Samsung could double the performance of neural nets with processing-in-memory.
- Real-Time Evolution and Deployment of Neuromorphic Computing at the Edge - Neuromorphic systems can provide real-time edge control systems with very low power dissipation.
- Frontier Supercomputer to Usher in Exascale Computing - Oak Ridge National Lab may be the first to reach 10^18 operations per second.
- The Femtojoule Promise of Analog AI - Analog neural networks based on non-volatile memory arrays promise dramatic reductions in power for many AI applications.
- Supercomputers Flex Their AI Muscles - AI benchmarks recently run on the latest supercomputers worldwide show exascale performance.
- Explainable AI and ML: Guest Editors' Introduction, M. Raunak and R. Kuhn, IEEE Computer, October 2021 - Overview of a special issue on how machine learning and training may be better understood; several of the articles are open access.
- The Great AI Reckoning: Special Report, IEEE Spectrum, October 2021 - This special report includes eight featured articles on the past, present, and future of artificial intelligence.
- Next-Gen Chips Will Be Powered From Below - Redesigning the power lines in state-of-the-art silicon chips will increase energy efficiency, permitting Moore's Law to continue a bit longer.
- Cerebras' Tech Trains “Brain-Scale” AIs - A dedicated AI training system built around a wafer-scale chip redesigns the memory architecture to increase the scale and speed of machine learning.
- Can Software Performance Engineering Save Us From the End of Moore's Law? - Although custom optimization of software for modern platforms can be difficult, the resulting performance improvements can be substantial.
Feature Article
Conscious Machines for Autonomous Agents and Cybersecurity
"Conscious Machines for Autonomous Agents and Cybersecurity," by A. Kadin. Can something resembling biological consciousness be implemented using neuromorphic computers? This article suggests that temporal pattern recognition on near-term neural networks could generate a dynamic model of a self acting in an environment, which resembles aspects of consciousness.
Advances in Machine Learning and Deep Neural Networks
The Proceedings of the IEEE has a special issue (May 2021) on Machine Learning (ML). The Guest Editors (R. Chellappa, S. Theodoridis, A. van Schaik) have presented an overview of the articles in this special issue.
ML has made great strides in the past decade and is now one of the major applications of computing. In most cases, this has been achieved using deep neural networks, where the interconnections between the neurons are obtained automatically by extensive training using large quantities of real data. This is extremely compute-intensive, and may be limiting the future evolution of ML.
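The training loop described above, in which a network's weights are obtained automatically from data rather than programmed by hand, can be illustrated with a toy sketch. This is a minimal single-neuron example in plain Python (not any system covered in the special issue); at deep-network scale, exactly this kind of repeated gradient update is what makes ML so compute-intensive.

```python
import math
import random

# A single sigmoid "neuron" learns the AND function by gradient descent.
# The weights start random and are adjusted automatically from data --
# the essence of training a neural network.

random.seed(0)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # AND truth table
w = [random.uniform(-1, 1) for _ in range(2)]
b = 0.0
lr = 1.0  # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

for epoch in range(5000):
    for (x1, x2), target in data:
        y = sigmoid(w[0] * x1 + w[1] * x2 + b)
        grad = (y - target) * y * (1 - y)  # d(error)/d(pre-activation), chain rule
        w[0] -= lr * grad * x1
        w[1] -= lr * grad * x2
        b -= lr * grad

predictions = [round(sigmoid(w[0] * x1 + w[1] * x2 + b)) for (x1, x2), _ in data]
print(predictions)
```

After training, the rounded outputs reproduce the AND truth table; scaling this loop to billions of weights and examples is what drives the compute demands the editors highlight.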
The editors have selected 14 articles on a wide range of research approaches to improve the performance of ML systems. These cover theory, applications, and hardware implementations. Topics include causal inference, anomaly detection, neuromorphic chips, and application to medical imaging. The Table of Contents for the entire issue is available here.
Technology Spotlight
Efficient Computer Vision for Embedded Systems
In the April 2022 issue of Computer magazine, an expert panel discusses "Efficient Computer Vision for Embedded Systems," focusing on the Low-Power Computer Vision Competition, which is a part of the TFRC.
The Hard Tech Revolutionizing Computing: A Guided Journey of IBM Research
Dr. Dario Gil, Director of IBM Research
IBM Research has one of the world's largest programs in R&D of advanced computing. In May, its Director, Dr. Dario Gil, led an online presentation featuring interviews with leading IBM researchers across several areas of computing research; it is now available as a YouTube video.
This video is segmented into several parts:
1) Opening remarks
2) Overview of Advanced Semiconductor Fabrication
3) Quantum Computing
4) AI Language Processing Applied to Coding
5) The Hybrid Cloud
Regarding semiconductor fabrication, Dr. Gil and colleagues described the recent 7 nm and 5 nm processes, leading up to the newly developed 2 nm process. They also featured IBM System Z servers for data centers and cloud computing.
Quantum computing at IBM is based on a developing technology of superconducting qubits, cooled to an ultralow temperature of about 15 millikelvin. These quantum systems are interfaced with and controlled by classical computers, which in turn can be accessed remotely by researchers around the world. Such hybrid classical/quantum computing systems promise tremendous performance enhancements in future computing.
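The classical side of such a hybrid system does the bookkeeping for quantum states. As a toy illustration (a plain-Python sketch, not IBM's actual Qiskit stack or hardware), here is a two-qubit statevector simulation of the standard Bell-state circuit, the kind of calculation classical machines use to specify and verify quantum circuits:

```python
import math

# Toy statevector simulation of a 2-qubit Bell circuit.
# Amplitudes are indexed by the basis state |q1 q0> read as an integer 0..3.

state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def apply_h(state, qubit):
    """Hadamard gate on one qubit: mixes each amplitude pair differing in that bit."""
    s = state[:]
    r = 1 / math.sqrt(2)
    for i in range(4):
        if not (i >> qubit) & 1:          # i has the qubit bit = 0
            j = i | (1 << qubit)          # partner state with the bit = 1
            s[i] = r * (state[i] + state[j])
            s[j] = r * (state[i] - state[j])
    return s

def apply_cnot(state, control, target):
    """CNOT gate: swap amplitudes along the target bit wherever control = 1."""
    s = state[:]
    for i in range(4):
        if (i >> control) & 1:
            s[i] = state[i ^ (1 << target)]
    return s

state = apply_h(state, 0)        # put qubit 0 in superposition
state = apply_cnot(state, 0, 1)  # entangle: (|00> + |11>) / sqrt(2)
probs = [round(a * a, 3) for a in state]
print(probs)  # equal probability of measuring |00> or |11>, none for |01>, |10>
```

Real superconducting-qubit systems scale this idea up enormously, with the classical controller also handling pulse scheduling and error mitigation rather than simple list arithmetic.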
The use of AI for natural language translation is well known. IBM has a research project examining legacy computer code, of which millions of lines exist. Similar language translation techniques are being used to translate, categorize, and reorganize this code, enabling it to be updated without massive programmer involvement. Similar techniques are also being applied to chemistry databases, which are full of non-text symbols and diagrams.
Finally, IBM researchers are working on developing universal high-level techniques to interact with the “hybrid cloud” as if it were a single infinite computer.
Useful Links
- 2020 CCC Workshop on Physics and Engineering Issues in Reversible/Adiabatic Classical Computing
- Rebooting Computing Video Overview
- IEEE Future Directions
- IEEE Future Directions Blog
- Computing in Science and Engineering on the End of Moore's Law
- IEEE Journal of Exploratory Solid-State Computational Devices and Circuits (JXCDC)
- Arch2030 Workshop Report (PDF, 948 KB)
- Workshop on Neuromorphic Computing
- Workshop on Beyond CMOS Technology
- Update on National Strategic Computing Initiative (NSCI)
- RC White Paper on Nanocomputers
- IEEE Computer Magazine on Rebooting Computing