Jason Banta, vice president of AMD, believes that integrating AI into computers will make them more personal, more secure, and better able to understand what users want.
Banta predicts wider adoption of AI-enabled laptops in 2024, with a “major inflection point” starting in 2025. The biggest challenge will be scaling models down so they can run efficiently on laptop hardware.
In general, Banta expects a shift from cloud-based AI applications to smaller models that run directly on the computer in real time and can even be trained locally.
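The “scaling down” Banta describes typically relies on techniques such as weight quantization, which stores model parameters in fewer bits. As a minimal illustrative sketch (not AMD’s actual method), here is symmetric int8 quantization in NumPy, shrinking float32 weights to a quarter of their size at the cost of a small rounding error:

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor int8 quantization: map floats to [-127, 127]."""
    scale = np.abs(weights).max() / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 representation."""
    return q.astype(np.float32) * scale

# A toy weight matrix: int8 storage is 4x smaller than float32
w = np.random.randn(256, 256).astype(np.float32)
q, scale = quantize_int8(w)
error = np.abs(w - dequantize(q, scale)).max()
print(q.nbytes, w.nbytes)  # 65536 vs 262144 bytes
```

Real on-device inference stacks add per-channel scales, calibration, and quantization-aware training on top of this basic idea, but the memory-saving mechanism is the same.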
Whereas previously we had to learn how to use computers, in the future computers with artificial intelligence will be able to anticipate our needs. This will enable new audio and video experiences and make it easier to generate creative content, a process that can be challenging on a traditional computer.
Jason Banta, AMD vice president
Microsoft is reportedly planning its first “AI PCs” for next year, but these will use Intel and Qualcomm chips rather than AMD’s.
AMD enters the AI chip race
In early December, AMD unveiled its MI300 series of AI chips. The Instinct MI300X GPU and the MI300A APU, which combines CPU and GPU cores in one package, are designed to outperform Nvidia’s GPUs in AI training and inference. Both chips are already shipping to OEM partners.
AMD is trying to compete with Nvidia by undercutting it on price and by investing in its open ROCm software stack. Major tech companies including Meta, Microsoft, and OpenAI have expressed interest in the new chips.
However, Nvidia is not backing down: it has announced the H200 GPU and a new Grace Hopper superchip, both due in Q2 2024. Nvidia also remains far ahead in software for machine-learning workloads running on its hardware, thanks to its mature CUDA ecosystem.