News

… providing remarkably fast text output for large language models (LLMs), at a far lower cost than Nvidia GPUs. But Groq has ...
Saudi Arabia’s sovereign wealth fund-backed artificial intelligence company HUMAIN has selected US chipmaker and Nvidia ...
It's a problem most startups would love to have. While many AI companies have focused on training large language models, Groq seeks to make them run as fast as possible using chips it developed ...
Groq has been named an official inference provider for HUMAIN, a newly launched AI company headquartered in Saudi Arabia and designed to operate across the full AI value chain.
Similarly, Groq’s Language Processing Unit (LPU) chips deliver speeds of up to 625 tokens per second. Jonathan Ross, Groq’s CEO, emphasized that their solution is “vertically integrated for ...
Nvidia Corp. (NASDAQ:NVDA) rival Groq announced on Tuesday that it will ... This operating system allows users to talk to the ...
It has yet to clarify its relationship with Groq, a smaller chip company developing dedicated inference hardware as well as its own GroqCloud. But on the new HUMAIN website, the company claims that it ...
Saudi Arabia's HUMAIN AI initiative partners with NVIDIA and Groq for "AI factories of the future," using a dual chip ...
April 29, 2025: Groq, a leader in AI inference, today announced its partnership with Meta to deliver fast inference for the official Llama API, giving developers the fastest ...
MOUNTAIN VIEW, Calif., May 15, 2025: Groq, the pioneer in AI inference, today announced a major global expansion, accelerating its rise as an emerging hyperscaler. With significant ...