Imagine a supercomputer the size of a campus: that is what Microsoft is constructing in Mount Pleasant, Wisconsin. Called Fairwater, this AI datacenter is billed as the most powerful ever built, designed specifically for training large AI models, pushing the envelope in performance, and scaling cloud AI infrastructure. Below is what's known, what's new, and what remains uncertain, plus what it means for the tech world.

What Microsoft Has Announced
These are the publicly confirmed details about Fairwater:
| Feature | Key Info |
|---|---|
| Location & Size | Mount Pleasant, Wisconsin; 315 acres; three buildings totaling ~1.2 million square feet. |
| Timing | Expected to go live early 2026. |
| Compute Hardware | “Hundreds of thousands” of Nvidia GB200 and GB300 GPUs, connected with high-speed interconnects to support massive throughput/low latency. |
| Cooling & Water Use | Closed-loop water cooling system with “zero water waste.” Outside air cooling will be used only on the hottest days. |
| Infrastructure & Build Scale | Deep foundations (46.6 miles of piles), over 120 miles of medium-voltage underground cable, and extensive mechanical piping systems. |
| Performance Claim | Microsoft says Fairwater will deliver performance 10× that of today’s fastest supercomputers. |
| Additional Centers | Fairwater is part of a larger initiative. Another large data center of similar scale is planned nearby with an extra $4 billion investment. |
Extra Details & Context
Technical Details & Scale of Hardware
- The GPU racks are designed to handle terabytes per second of bandwidth.
- Each rack may contain dozens of GPUs in dense, high-bandwidth configurations, crucial for training massive AI models.
- Networking infrastructure is designed to minimize latency; even the two-story layout reduces cable lengths for faster data transfer.
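To see why shorter cable runs matter for latency, the sketch below estimates one-way signal propagation delay through fiber. The cable lengths are hypothetical illustrations, not Fairwater specifications, and the signal speed is assumed to be roughly two-thirds the speed of light in a vacuum:

```python
# Estimate one-way signal propagation delay over a fiber cable run.
# Assumption: signals travel through fiber at ~2/3 the vacuum speed of light.
C_VACUUM_M_PER_S = 299_792_458
FIBER_SPEED_FRACTION = 2 / 3

def propagation_delay_ns(cable_length_m: float) -> float:
    """Return the one-way propagation delay in nanoseconds."""
    speed = C_VACUUM_M_PER_S * FIBER_SPEED_FRACTION
    return cable_length_m / speed * 1e9

# Hypothetical comparison: a 100 m run versus a 30 m run made possible
# by stacking racks on two floors instead of spreading them out.
for length_m in (100, 30):
    print(f"{length_m} m: {propagation_delay_ns(length_m):.0f} ns")
# → 100 m: 500 ns
# → 30 m: 150 ns
```

Nanoseconds sound negligible, but training runs shuttle data between GPUs billions of times, so small per-hop savings compound across the whole job.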
Sustainability, Local Resources & Environmental Impact
- Fairwater emphasizes sustainability with closed-loop water systems, limiting stress on local resources.
- The site sits on land once intended for a large-scale industrial project that never materialized, allowing Microsoft to reuse prepared infrastructure.
Economic & Local Impacts
- The project is creating thousands of construction jobs and will provide long-term employment once operational.
- Local governments are heavily involved in negotiations around power, water, and infrastructure demands.
Strategic and Competitive Context
- Fairwater is a centerpiece of Microsoft’s broader $80 billion push into AI infrastructure.
- Competitors like Google and Amazon are racing to build similar super-scale facilities, making Fairwater both a technical and symbolic advantage for Microsoft.
What’s Still Unclear
| Open Question | Why It Matters |
|---|---|
| Exact Count of GPUs and Compute Capacity | “Hundreds of thousands” is vague; precise figures impact cost, power, and deployment schedules. |
| Power Consumption & Efficiency Metrics | Megawatt draw and power usage effectiveness (PUE) determine operating costs and grid impact. |
| AI Model Types Supported | Whether it hosts large language models, multimodal systems, or a mix shapes hardware utilization. |
| Reliability & Redundancy | Safeguards against hardware failures or cooling breakdowns determine uptime guarantees. |
| Supply Chain Risks | Global shortages in GPUs and networking gear could cause delays. |
| Community & Environmental Pushback | Local resistance could arise over water, energy, or environmental concerns. |
| Deployment Timeline | The pace at which phases scale from initial activation to full capacity determines when the promised performance actually arrives. |
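Power usage effectiveness (PUE), mentioned above, is total facility power divided by power delivered to IT equipment; a value near 1.0 means nearly all energy goes to compute rather than cooling and overhead. A minimal sketch with hypothetical figures (not Microsoft disclosures):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Hypothetical numbers for illustration only: a facility drawing
# 120 MW total, of which 100 MW reaches the IT equipment.
print(round(pue(120_000, 100_000), 2))  # → 1.2
```

Until Microsoft publishes a figure like this, Fairwater's actual efficiency remains an open question.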
Why Fairwater Matters
- AI at Scale: The datacenter will make it possible to train models larger and faster than ever before.
- Speed Advantage: Low-latency infrastructure could improve Microsoft’s AI services across the globe.
- Competitive Edge: It positions Microsoft strongly against AWS and Google in the AI cloud race.
- Environmental Stakes: The sustainability claims are bold, but only time will tell if they hold up.
- Regional Impact: Creates jobs, boosts local economy, and reshapes Wisconsin’s role in U.S. tech.
Frequently Asked Questions (FAQs)
| Question | Answer |
|---|---|
| 1. Why is Microsoft building Fairwater? | To handle the explosive compute demands of AI, scale faster, and maintain its competitive lead in cloud AI. |
| 2. What makes Fairwater different from normal data centers? | It’s purpose-built for AI: dense GPU racks, extreme interconnect speeds, advanced water cooling, and ultra-low latency design. |
| 3. How sustainable is it? | Microsoft claims “zero water waste” via closed-loop cooling, with air cooling used in limited conditions. True environmental impact remains to be seen. |
| 4. When will it go live? | Initial operations are expected in early 2026. |
| 5. What kind of AI will run here? | Likely large-scale training of advanced models (language, multimodal, enterprise AI) plus global inference services. |
| 6. How much is Microsoft investing? | About $3.3 billion for Fairwater, plus another $4 billion for a second facility. Overall, $80 billion is committed to global AI infrastructure. |
| 7. What are the risks? | Technical complexity, hardware supply shortages, cooling reliability, power costs, and community/environmental concerns. |
| 8. How will this affect competitors? | It raises the bar. Google, Amazon, and others will need to accelerate their own next-gen AI infrastructure projects. |
| 9. What does this mean for everyday users? | Faster, more powerful AI applications in Microsoft products and services — from Office to Azure to Copilot — powered by datacenter muscle behind the scenes. |
Conclusion
Fairwater isn’t just another data center — it’s a symbol of the AI arms race. With unprecedented scale, cutting-edge hardware, and bold sustainability claims, Microsoft is betting big on infrastructure as the foundation for future AI breakthroughs. Whether it sets new benchmarks or sparks new controversies, one thing is clear: Fairwater will shape the future of cloud AI.

Source: Microsoft


