News

Groq's chips provide remarkably fast text output for large language models (LLMs), at a far lower cost than Nvidia GPUs.
Groq claims its chip, dubbed a "language processing unit" (or LPU), is faster than and one-tenth the cost of the conventional graphics processing units commonly used in training AI models.
It's a problem most startups would love to have. While many AI companies have focused on training large language models, Groq seeks to make them run as fast as possible using chips it developed.
The data centre will be up and running by the end of this year, and could later expand to include a total of 200,000 language processing units, Ross said. Groq has partnered with Aramco Digital.
MOUNTAIN VIEW, Calif., April 5, 2025 /PRNewswire/ -- Groq, the pioneer in AI inference, has launched Meta's Llama 4 Scout and Maverick models, now live on GroqCloud™.
Saudi Arabia is positioning itself to take on huge volumes of the grunt work required by the artificial intelligence sector.