I'm genuinely curious. I was just reading a bit about Larrabee, which was planned for release around 2010 but whose consumer cards were canceled due to underwhelming performance and delays. Couldn't they have kept the project alive with just enough R&D to keep improving it on both the hardware (architecture) and software side over the next few years? It's not like they were anywhere near bankruptcy; they had plenty of money plus solid CPUs, both shipping and on the roadmap.

It makes me wonder what the dedicated GPU market would look like today if Intel hadn't given up on developing their consumer cards almost a decade ago. Six years is a lot of time to create something decent and to start slowly carving out a chunk of the dedicated-GPU market share. I say six years because 2016 seemed like a really good moment to grab attention with solid hardware: basically the only competition back then was in the mid-range, 1060 vs. RX 480, and Nvidia had a free hand to price anything above the 1060 however they wanted. From 2016 onwards we started seeing unusual behavior in the GPU market, with AMD unable to compete in the high end and falling behind on RTG's tight R&D budget, and Nvidia essentially monopolizing the high end. So here we are today, with people heavily arguing over whether the 2080 or the Radeon VII offers better price-to-performance, while both deliver more or less 1080 Ti-like performance for a 1080 Ti-like price two years later, which is just sad. On the other hand, it's now possible to snatch a used non-reference RX 580 8 GB for the price of a new 1050 Ti, or a used 1070 for the price of a new RX 590 (sometimes even cheaper than the 590). There are even used 1070 Tis priced almost the same as used 1070s, and used non-reference 1080s priced about where the higher-end non-reference 580 8 GB cards sat right after their release back in 2017.

No, I'm not kidding; at least that's the situation with used GPUs here in Poland. But I digress. I'd be interested to hear from people who have been following Intel's moves closely for a while: what was the whole deal with giving up on dedicated GPUs? Not enough money to keep the project alive? Strange management decisions? Did Intel think back then that the PC gaming market wouldn't grow so much? Did they back off because focusing on CPUs in the consumer and server segments was profitable enough for them? Or was there some other reason?