By MICHAEL LIEDTKE and MATT O'BRIEN (AP Technology Writers)

AMD is showing increasing confidence that its MI300 lineup can win over some of the biggest names in technology, potentially diverting billions in spending toward the company. Customers using the processors will include Microsoft Corp. and Oracle Corp.

The new AMD chip has more than 150 billion transistors and 2.4 times as much memory as Nvidia's H100, the current market leader. It also has 1.6 times as much memory bandwidth, further boosting performance, AMD said. AMD Chief Executive Lisa Su said the new chip is equal to Nvidia's H100 in its ability to train AI software and much better at inference - the process of running that software once it's ready for real-world use.

The chips are based on the type of semiconductors called graphics processing units, or GPUs, which have typically been used by video gamers to get the most realistic experience. Their ability to perform a certain type of calculation rapidly, by doing many computations simultaneously, has made them the go-to choice for training AI software.

AMD sees an opening: Large language models - used by AI chatbots such as OpenAI's ChatGPT - need a huge amount of computer memory, and that's where the chipmaker believes it has an advantage.

Surging demand for Nvidia chips by data center operators helped propel that company's shares this year, sending its market value past $1.1 trillion. The big question is how long Nvidia will essentially have the accelerator market to itself. Nvidia shares dropped 2.3% to $455.03 in New York on Wednesday, a sign investors see the new chip as a threat. Still, AMD shares didn't see a commensurate increase. On a day when tech stocks were generally down, the shares fell 1.3% to $116.82.

While the company expressed confidence in its product's performance, Su said it won't just be a competition between two companies. Many others will vie for market share too. At the same time, Nvidia is developing its own next-generation chips. The H100 will be succeeded by the H200 in the first half of next year, giving access to a new high-speed type of memory that should match at least some of what AMD is offering. And then Nvidia is expected to come out with a whole new architecture for the processor later in the year.

AMD's prediction that AI processors will grow into a $400 billion market underscores the boundless optimism in the artificial intelligence industry. As recently as August, AMD had offered a more modest forecast of $150 billion over the same period. That compares with $597 billion for the entire chip industry in 2022, according to IDC. AMD has said that its own revenue from accelerators will top $2 billion in 2024, with analysts estimating that the chipmaker's total sales will reach about $26.5 billion. But it will take the company a while to grab a large piece of that market.

Building AI systems that rival human intelligence - considered the holy grail of computing - is now within reach, Su said in an interview. But deployment of the technology is still only just beginning, and it will take time to assess the impact on productivity and other aspects of the economy, she said.
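The point about GPUs doing many computations simultaneously can be sketched with a toy example. This is an illustrative data-parallel computation in Python with NumPy, not anything from AMD's or Nvidia's actual software stacks: one bulk array operation stands in for the thousands of parallel lanes a real accelerator would use.

```python
import numpy as np

# A "GPU-style" workload: apply the same arithmetic to a million values.
# On accelerator hardware, each element would be handled by one of many
# parallel lanes; NumPy merely mimics the idea by issuing the operation
# over the whole array at once instead of looping in Python.
x = np.arange(1_000_000, dtype=np.float32)

# Sequential version: one value at a time, as a single CPU thread might.
slow = np.empty_like(x)
for i in range(x.size):
    slow[i] = x[i] * 2.0 + 1.0

# Data-parallel version: one operation over the entire array.
fast = x * 2.0 + 1.0

assert np.allclose(slow, fast)  # identical results, bulk-parallel form
```

The same idea, applied to the huge matrix multiplications inside neural networks, is why GPUs became the default hardware for AI training.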
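AMD's memory argument can be made concrete with back-of-the-envelope arithmetic. The model size and precision below are hypothetical illustration figures, not numbers from the article: a model's weights alone occupy roughly (parameter count) x (bytes per parameter), so large language models quickly outgrow a single accelerator's memory.

```python
# Rough memory footprint for storing model weights only.
# Hypothetical figures: a 70-billion-parameter model held in
# 16-bit (2-byte) floating-point precision.
params = 70e9          # hypothetical parameter count
bytes_per_param = 2    # 16-bit precision

weights_gb = params * bytes_per_param / 1e9
print(f"{weights_gb:.0f} GB just for the weights")  # prints "140 GB just for the weights"
```

Activations, optimizer state, and the key-value cache add to this, which is why per-chip memory capacity and bandwidth are the dimensions on which AMD says its chip beats the H100.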