The global collaboration expands to Asia-Pacific, enabling Indian organizations to meet compliance and low-latency inference requirements
MUMBAI, India: Groq, a global pioneer in AI inference, has expanded its global AI infrastructure footprint to the Asia-Pacific region through its deployment in Equinix’s International Business Exchange™ (IBX®) data center in Sydney, Australia. Following launches in the US and EMEA, the collaboration brings Groq’s fast, low-cost and scalable LPU Inference Engine closer to organizations and the public sector in India and the wider Asia-Pacific region.
Under this partnership, Groq and Equinix will establish one of the largest high-speed AI inference infrastructure sites in Australia with a 4.5MW Groq facility in Sydney, offering up to 5x faster and lower-cost compute than traditional GPUs and hyperscaler clouds. Leveraging Equinix Fabric®, a software-defined interconnection service, organizations in Asia-Pacific will benefit from secure, low-latency, high-speed interconnectivity, ensuring seamless access to GroqCloud™ for production AI workloads with full control and compliance.
Demand for and adoption of AI-driven solutions continue to grow in Asia-Pacific. According to IDC’s latest Worldwide AI and Generative AI Spending Guide, AI and Generative AI (GenAI) investments in the region are projected to reach US$110 billion by 2028, growing at a compound annual growth rate (CAGR) of 24.0% from 2023 to 2028.
By combining Groq’s cutting-edge AI inference technology with Equinix’s global infrastructure and vendor-neutral connectivity solutions, organizations can efficiently scale their AI workloads while maintaining cost-effectiveness and speed.
Supporting Quotes:
Cliff Obrecht, Co-Founder and COO of Canva, said: “We’re entering a new era where technology has the potential to massively accelerate human creativity. With Australia’s growing strength in AI and compute infrastructure, we’re looking forward to continuing to empower more than 260 million people to bring their ideas to life in entirely new ways.”
Jonathan Ross, CEO and Founder of Groq, said: “The world doesn’t have enough compute for everyone to build AI. That’s why Groq and Equinix are expanding access, starting in Australia.”
Scott Albin, General Manager, APAC, Groq, said: “Asia-Pacific is a key growth market for Groq, with over half of our global developers already using GroqCloud based here. Our deployment in Equinix’s Sydney data centre is our first step to bring high-performance, cost-efficient AI inference closer to the region, enabling secure, low-latency access while supporting data sovereignty and privacy. Together, we’re building the infrastructure to drive AI innovation and growth in Asia-Pacific.”
Cyrus Adaggra, President, Asia-Pacific, Equinix, said: “Groq is a pioneer in AI inference, and we’re delighted they’re rapidly scaling their high-performance infrastructure globally through Equinix. Our unique ecosystems and wide global footprint continue to serve as a connectivity gateway to their customers and enable efficient enterprise AI workflows at scale.”
Key Highlights:
- AI performance at scale – Groq’s LPU Inference Engine, optimized by a custom-built compiler, achieves unmatched throughput and latency for inference on next-gen open-source LLMs.
- Purpose-built for AI inference – The LPU delivers instant speed, unparalleled affordability, and energy efficiency at scale. Fundamentally different from the GPU, the LPU was designed for AI inference and language.
- Equinix Fabric – Groq leverages Equinix’s on-demand, software-defined interconnection service that securely connects businesses to clouds, partners, and digital ecosystems across Equinix’s 270+ facilities in 37 countries.
- Scalable innovation – The partnership demonstrates how Equinix enables next-generation AI companies to scale globally without building their own infrastructure footprint. Groq technology is accessible to anyone via GroqCloud, while enterprises and partners can choose between cloud and on-premises AI compute centre deployments.