What's New

Feature Article

Grand Challenge: Applying Artificial Intelligence and Machine Learning to Cybersecurity

Access the article by Kirk Bresniker, Ada Gavrilovska, James Holt, Dejan Milojicic, and Trung Tran in IEEE Xplore.

Providing future cybersecurity will require integration of AI/ML throughout the network worldwide. Initiating a series of Grand Challenges on this topic will help the community achieve this goal.

The December issue of Computer includes a set of open-access feature articles on Technology Predictions. One of these, by Bresniker et al., examines how AI/ML can help address the pervasive and growing problem of cyberattacks. It follows a series of earlier workshops and a 2018 report on a similar topic by some of the same authors.

The authors argue that the problem's massive scale, its continuous evolution, and the need for rapid responses mean it can only be handled by a system of ubiquitous AI agents capable of machine learning. However, these autonomous AI agents must quickly incorporate the insights of the best human cyber analysts, many of whom work privately on non-public data sets. The authors propose that an annual Grand Challenge, with prizes as motivation, can foster the collaboration and competition needed to achieve this goal. Given how critical the problem is to business and government, such a challenge should be initiated as soon as possible.

Technology Spotlight

AI Systems in a Hybrid World
Dr. Cindy Goldberg, Program Director, IBM Research AI Hardware Center

The IEEE Industry Summit on the Future of Computing was held in San Mateo, California, on 4 November 2019, and many of the invited presentations are available through IEEE.tv. One of the invited speakers was Dr. Cindy Goldberg of IBM Research, who presented an overview of IBM's programs to develop improved hardware for deep learning with neural networks. While IBM also has a major program in quantum computing, the company believes that future quantum and AI systems will address very different types of problems with very different types of hardware.

One of the key problems with present AI technology based on digital CMOS is that power consumption for both training and inference is quite large. Approximate-computing schemes reduce power consumption, but new algorithms and architectures may be needed to maintain performance at reduced precision. Further improvement in performance per watt may require analog AI cores with improved memory materials, which IBM is also developing in pursuit of a new Moore's Law of improving performance for AI systems over the next decade.
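To make the approximate-computing idea concrete, here is a minimal illustrative sketch (not IBM's actual design): storing weights as small integers instead of 32-bit floats cuts memory traffic and arithmetic cost, usually at only a small accuracy penalty. All numbers below are made up for illustration.

```python
# Illustrative sketch of approximate computing via reduced precision:
# quantize weights to the int8 range, then compare a low-precision dot
# product against the full-precision one.

weights = [0.12, -0.07, 0.31, -0.25, 0.05, -0.18]

# Symmetric linear quantization to the int8 range [-127, 127]
scale = max(abs(w) for w in weights) / 127.0
quantized = [round(w / scale) for w in weights]   # small integers
approx = [q * scale for q in quantized]           # dequantized values

x = [1.0, -2.0, 0.5, 1.5, -0.5, 2.0]
exact = sum(w * xi for w, xi in zip(weights, x))
cheap = sum(a * xi for a, xi in zip(approx, x))

# The low-precision result tracks the full-precision one closely
print(abs(exact - cheap))
```

The per-weight quantization error is at most half the scale step, so the approximate dot product stays close to the exact one while the weights themselves need only a quarter of the storage.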

Looking to the future, IBM envisions hybrid cloud computing, comprising combinations of bits, neurons, and qubits, i.e., classical digital, classical analog, and quantum computing.

The entire 25-minute talk is available on IEEE.tv.