NEW YORK: (Web Desk) – Google has unveiled its next-generation Tensor Processing Units (TPUs) designed to accelerate artificial intelligence training and power emerging AI “agents,” marking a major step in the global race for advanced computing infrastructure.
The announcement came during Google’s annual cloud computing conference in Las Vegas, where the company showcased the eighth generation of its custom AI chips.
According to CEO Sundar Pichai, the new architecture reflects the growing demands of the “era of AI agents,” which require significantly more advanced infrastructure to handle complex workloads. In a blog post, he said the new TPUs are designed to support the next wave of AI development.
The updated system features a dual-chip approach: one TPU is optimized for training large language models, while the other focuses on “inference,” the process that enables AI systems to make real-time decisions and perform tasks autonomously.
These AI agents—digital assistants capable of independently completing computing tasks—are increasingly seen as the next frontier in artificial intelligence.
The chips were developed in partnership with semiconductor company Broadcom and will be made available later this year, according to Google Cloud executive Thomas Kurian.
The move comes amid intensifying competition in the AI hardware market. Earlier this year, Nvidia announced its upcoming Vera Rubin chips, while Amazon introduced its latest Trainium processors.
Despite developing in-house chips, major tech firms including Google, Amazon, and Microsoft continue to rely on Nvidia GPUs for a significant portion of their AI infrastructure.