
Intel Will Enter GPU Market By 2020

Intel Will Enter GPU Market By 2020 - ExtremeTech

"When Intel hired Raja Koduri away from AMD and announced it was working on discrete GPU solutions, it still wasn’t clear exactly when the company would enter the market. Intel CEO Brian Krzanich has told analysts that the company intends to enter the market in 2020 — a slightly faster time frame than Nvidia or AMD may have planned on, and one with potential ramifications for both companies.

It was never likely that Intel would leap into the graphics space overnight. The typical rule of thumb is that creating a new CPU architecture from scratch can take 4-5 years, and while graphics cards are considered an easier lift in that regard, 2-3 years is not uncommon. While Intel had obviously planned to launch itself into the GPU market before it hired Koduri, it would have wanted to bring him on at the beginning of that process. With a late-2017 hire, a 2020 launch date is aggressive — but within the realm of possibility.

As for why Intel is moving into this space now, it’s not because of gaming — at least, not primarily. Intel has been perfectly happy to evolve its own on-board GPUs to a point somewhere between “acceptable” and “barely useful” for gaming, depending on the CPU in question. Intel hasn’t felt much of a need to push the overall status quo, particularly given that gaming is something of a boutique market. But the last few years have made it clear that gaming isn’t just a semi-niche market space that Intel can also sell into thanks to its high-end CPUs: It’s the jumping-off point for building processors that are also excellent at machine learning and AI workloads. And because these are the major markets growing by leaps and bounds, it’s critical for Intel to establish itself in these spaces."

"We’re seeing what you might call an emerging continuum in these products. On the one hand, there are going to be dedicated, custom hardware solutions built by companies like Google and its TPU family, but also primarily intended for its own internal use. Nvidia and AMD have the closest thing to “general purpose” AI and ML hardware that you can find today, and while the Xeon may be quite competitive for inference tests, it’s not a good fit for actually training neural networks or working on AI/ML. And while Intel isn’t speaking yet about how it intends to market these chips in the gaming market, it is saying it explicitly intends to bring them to the data center.

But data centers could prove to be an easier lift for Intel, in some ways, than the gaming space. Gaming GPUs can be a very challenging market, with the need to build credibility with gamers and developers, and to create driver software and robust consumer experiences from scratch. Intel doesn’t have much experience competing in this kind of space, whereas Nvidia and AMD have both been doing it for decades. Raja Koduri, of course, does have that experience — but there’s no substitute for the time it takes to convince developers to write code targeting your GPU hardware, or to give people experience with it. But while targeting data centers and machine learning markets also leaves Intel playing catch-up, these spaces are much newer, the players are less entrenched, and the market itself is erudite enough to support extensive software customization and optimization. It may make more sense for Intel to enter the GPU market now that it’s being fused with the AI and ML space, not less. And we’d bet that’s part of what has piqued the company’s interest.

Will this attempt to enter the market play better than Larrabee? Most likely, yes. Will it put Intel on the competitive footing it wants to occupy against Nvidia and AMD? We’ll need a little more crystal ball polish before we tackle that one."

Intel’s Raja Koduri, VP of the Core and Visual Computing Group.


I'm still surprised Intel didn't buy nVidia when AMD bought ATI all those years ago.


Yeah...that turned out to be a big mistake....and at that time, Intel was still making their own motherboards.


I think that if ATI had been more competitive with nVidia at the time (ATI was just releasing the X100 series, which was not competitive against the nVidia 6000 series; neither were the X1000, HD 2000, and HD 3000 series), Intel would have seen them as more of a threat worth countering. But AMD was also forward-thinking with its Fusion initiative (APUs), which Intel did not see as a credible threat (and honestly, they haven't been until these latest models), so Intel was happy to simply license nVidia technology.

What's going to be interesting is to see exactly how much of a "gamer" card these Intel models turn out to be. Personally, I'm thinking they're going to be more of a prosumer device focused on professional tasks that can also play games, like the Vega FE. It's much easier to focus on a few programs with strict standards than it is on games, not to mention much more profitable. This would also line up very well if Intel is looking to lock AMD out of Apple completely, since gaming on Apple is very limited; and since Apple is deprecating OpenCL and OpenGL, as you pointed out, Intel will be able to focus its resources even more if this is indeed the goal.