Curated by Dillip Chowdary • Jan 28, 2026
Microsoft has officially revealed Maia 200, its latest custom-designed AI chip optimized for inference workloads. Built to power the next generation of AI models including GPT-5.2, the chip reportedly offers a 30% improvement in performance-per-dollar compared to existing market solutions like Amazon's Trainium 3.
The Maia 200 will be deployed across Microsoft's global data centers to support Microsoft 365 Copilot, Azure OpenAI Service, and internal research projects, further reducing reliance on external hardware vendors.
Read more on Azure Blog →

OpenAI has introduced Prism, an AI-native workspace designed specifically for the scientific community. Built on the Crixet architecture and powered by GPT-5.2, Prism aims to streamline technical research, data analysis, and scientific writing.
Prism provides researchers with tools for automated literature review, hypothesis generation, and complex data visualization, all within a collaborative, secure environment.
Read more on OpenAI News →

Atlassian reported a major efficiency boost in its engineering workflows: its internal AI tool Rovo Dev has cut pull request (PR) cycle times by 45%. Rovo Dev assists developers by providing automated code reviews, suggesting fixes, and checking compliance with coding standards before a human reviewer sees the code.
NVIDIA has announced a $2 billion investment in CoreWeave as part of a joint venture to scale AI infrastructure. The goal is to build over 5 gigawatts of AI-optimized data center capacity by 2030, securing a steady supply of compute for the surging demand for large language models.
The partnership solidifies CoreWeave's position as a primary AI cloud provider and gives NVIDIA's latest Blackwell and upcoming Rubin architectures a ready-made home.
View Partnership Details →