
Latest news with #Groq

Bell Unveils Ambitious AI Infrastructure Network Across Canada

Arabian Post

03-06-2025

  • Business
  • Arabian Post

Bell Unveils Ambitious AI Infrastructure Network Across Canada

Bell Canada has launched a major initiative to bolster the country's artificial intelligence capabilities through the establishment of a nationwide network of high-performance, hydroelectric-powered data centres. Dubbed Bell AI Fabric, the project aims to provide 500 megawatts of AI computing capacity, marking a significant step in enhancing Canada's sovereign AI infrastructure.

The first facility, a 7-megawatt data centre in Kamloops, British Columbia, is set to commence operations this month. This centre is being developed in partnership with Groq, an AI inference provider and chip designer. A second 7-megawatt facility in Merritt, British Columbia, is scheduled to open by the end of the year. Additionally, two larger 26-megawatt data centres are planned for Kamloops, with the first expected to open in 2026 at Thompson Rivers University and the second in 2027. Two more data centres, with a combined capacity exceeding 400 megawatts, are in advanced planning stages.

Groq's Language Processing Units will power these centres, offering faster inference performance at lower cost than existing market alternatives. The technology is particularly suited to large language models, improving the efficiency of AI workloads.

The initiative also includes an academic partnership with Thompson Rivers University. The data centre at the university will support AI training and inference, giving students and faculty access to cutting-edge computing capabilities. The facility will also be integrated into the district energy system, repurposing waste heat to supply energy to the university's buildings.

Nvidia rival Groq makes a bold prediction on what's next for AI

Yahoo

02-06-2025

  • Business
  • Yahoo

Nvidia rival Groq makes a bold prediction on what's next for AI

Listen and subscribe to Opening Bid on Apple Podcasts, Spotify, Amazon Music, YouTube, or wherever you find your favorite podcasts.

The next big breakthrough in AI could be one that channels the spirit of famed scientist Albert Einstein. "The problem is that large language models [LLMs] make mistakes. And they're always going to make mistakes, but they're going to make fewer and fewer. At some point, they'll make so few mistakes you can use it in medicine, you can use it in law," Groq founder and CEO Jonathan Ross said on Yahoo Finance's Opening Bid podcast.

Ross said the ability of AI models to invent is part of the next wave, and it's likely to arrive within his lifetime. Currently, LLMs tend to pick the most probable answer to a question, which precludes them from discovering something new. "And if it's the most obvious, it's not going to be good writing. It's not going to be good science. It's not going to invent something for you. It's not going to invent a drug that isn't known. That's what we're working on next. Invention is next," Ross explained.

While at Google (GOOG, GOOGL), Ross designed the custom chips the tech giant would go on to use for training its AI models. Today, Ross has a ground-floor view into how LLMs are likely to evolve in the years ahead. Groq makes what it calls language processing units (LPUs). These LPUs are designed to run large language models faster and more efficiently than Nvidia's (NVDA) GPUs, which target training LLMs. By making models run faster and more efficiently, Groq's chips can help LLMs on the road to becoming inventive, rather than merely reasoning as ChatGPT and other chatbots do today.

Groq's last capital raise came in August 2024, when it raised $640 million from backers including BlackRock (BLK) and Cisco (CSCO). The company's valuation at the time stood at $2.8 billion, a fraction of Nvidia's more than $3 trillion market cap. It currently clocks in at $3.5 billion, according to Yahoo Finance's private markets data.

Three times each week, Yahoo Finance Executive Editor Brian Sozzi fields insight-filled conversations and chats with the biggest names in business and markets on Opening Bid. You can find more episodes on our video hub or watch on your preferred streaming service. Brian Sozzi is Yahoo Finance's Executive Editor. Follow Sozzi on X @BrianSozzi, Instagram, and LinkedIn. Tips on stories? Email

