Microsoft has announced a breakthrough in AI chip cooling using microfluidics, claiming it could achieve three times better cooling than current methods. The development comes at a time when the energy consumed by AI is a significant concern because of its contribution to greenhouse gas emissions.
- Microsoft claims a breakthrough in chip cooling.
- The new system is based on microfluidics.
- The technique can cut the silicon temperature rise by up to 65 percent.
- The coolant flows closer to the heat source.
- Microsoft says the approach could cool chips up to three times better than cold plates.
- The channel design mimics vein patterns found in nature.
Microsoft’s new cooling system lets coolant flow closer to the heat source, moving through thread-like channels etched into the back of the chip. The design aims to improve cooling efficiency over traditional cold plates, which are limited by the layers of material separating them from the heat source.
The company states that the technique can reduce the maximum silicon temperature rise inside a GPU by up to 65 percent, depending on the workload and chip type. That extra thermal headroom could enable overclocking without risking damage to the chip and allow servers to be packed more closely together, reducing latency.
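For a sense of scale, the 65 percent figure applies to the temperature rise (how far the silicon climbs above the coolant), not the absolute chip temperature. A minimal worked example, assuming a hypothetical 40 °C rise under a conventional cold plate (an illustrative baseline, not a figure from Microsoft):

$$
\Delta T_{\text{microfluidic}} = (1 - 0.65)\,\Delta T_{\text{cold plate}} = 0.35 \times 40\,^{\circ}\mathrm{C} = 14\,^{\circ}\mathrm{C}
$$

Under that assumption, the same chip would run roughly 26 °C cooler at peak load, which is the kind of margin that makes the overclocking and denser server placement mentioned above plausible.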
“If you’re still relying heavily on traditional cold plate technology [in five years], you’re stuck.”
— Sashi Majety, Microsoft Program Manager
Originally reported by www.engadget.com as “Microsoft claims a ‘breakthrough’ in AI chip cooling” on 2025-09-23 23:31:00.