The PC gaming industry was worth an estimated $33.4B in 2018, accounting for roughly 25 percent of all gaming spending. Consoles account for slightly more of the market, at 28 percent, while mobile gaming's estimated $63.2B represents 47 percent. But if a new report from Jon Peddie Research is correct, up to 20 million PC gamers could decamp for consoles over the next three years, mostly from the lower end of the market.
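A quick sanity check on those numbers (my arithmetic, not the report's): each segment implies roughly the same total market size, so the figures hang together.

\[
\frac{\$33.4\text{B}}{0.25} \approx \$133.6\text{B}, \qquad \frac{\$63.2\text{B}}{0.47} \approx \$134.5\text{B}
\]

Both imply a total gaming market of roughly $134B, which would put consoles' 28 percent at about $37.5B.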
The argument appears to be this: As Moore's law has slowed and streaming services have ramped, the gaps between consoles and PCs have shrunk. At the same time, a plethora of new services, including projects like Google Stadia, are offering more and more ways for consumers to engage with content. These services are primarily TV-centric, but with 4K HDR content ramping on both consoles and PCs, the image quality gaps are smaller than they've ever been. Meanwhile, the slowing rate of improvement in technology and the efficiency gap between PCs and consoles (games tend to be better-optimized on console due to the much smaller number of supported system configurations) have made it easier than ever for the PC market to bleed customers.
JPR writes:
The majority will come from the low-end (under $1000 full build cost), but because of improvements in TV displays and console semiconductors, as well as console exclusive titles, the ranks of mid-range and high-end PC gamer populations are also affected.
Is this likely? Maybe. But I'm not sure it matters much in practice. At the same time, it's going to matter for certain kinds of games, as I'll get to below.
One of the amusing things about the PC-versus-console debate in the Year of Our Lord 2019 is that PCs and consoles (not to mention the Xbox and PlayStation) have never been more similar to one another. This was not always the case. Compare, for example, the capabilities of the Sega Saturn and the Sony PlayStation, and you’ll find that the two had fundamentally different hardware, with the Saturn excelling at 2D graphics and the PlayStation performing far better with 3D polygons.
If you go back farther in time, the differences get even larger. As someone who started gaming on the PC in 1987, I remember when my gaming platform of choice was literally incapable of playing the same sorts of games that one saw on the NES. Space Quest, King's Quest, Wizardry, The Bard's Tale, and Chuck Yeager's Advanced Flight Trainer are very different games from Super Mario Bros., Mega Man, Metroid, or even The Legend of Zelda.
Games like Space Quest III were fabulous — and very, very different from anything on the consoles of the day.
Even if you compare game series that existed on both PC and console, like Ultima, they were so different as to constitute entirely different games. There was virtually no overlap between the two platforms in actual gameplay or content.
Today, the overlaps are substantial. Even the specific differences that constitute major reasons why I prefer the PC, like modding support, have become more common on consoles than they used to be. This is not to say that PC gamers are eager to leap to consoles, or vice-versa, but that we've seen a substantial blurring, over the past 30 years, of what exactly distinguishes PC gaming from console gaming. So yes: Objectively, it's possible that we'll see a major shift from PCs over to streaming devices and televisions, particularly at the lower end of the market.
But overemphasizing this feels like it would miss the larger point. Increasingly, gaming is something you can access on multiple devices sequentially, as you move through life. Microsoft will stream games to an Xbox from a PC, or vice-versa. Game streaming, whether from a service or across a local network, is an increasingly common feature. It's not entirely clear to me that this represents a shift away from PCs specifically so much as a new trend in content consumption.
And of course, even if this trend proves true, there are games that are never going to move away from the PC. The experience of gaming on a PC remains tethered to the mouse and keyboard, and while plenty of games translate well between KBM and controller, many don't. There's a reason why series like Civilization don't really head for consoles. Players of these titles don't have much to worry about.
Ultimately, I'm not convinced that PC players are going to dump PCs and shift over to TV-focused gaming. But even if they do, I'm also not convinced it would represent a true market transformation so much as a willingness to game on multiple kinds of devices. And frankly, the more PCs and consoles continue to evolve towards each other, the less this distinction matters in the first place. From a hardware perspective, the Xbox One and PS4 are both PCs with custom SoCs, particularly in the Xbox One's case, given that it runs a variant of Windows.
In fact, that might be the funniest thing about the way we slice up the console market versus the PC space. If gaming consoles were being invented today, we wouldn’t talk about them as separate, specialized hardware with their own history and heritage, but as cost-optimized, mass-market, PC-based game systems with custom software and a few bits of specialized hardware. From that perspective, shifts in the gaming market towards specialized streaming services that use PC server hardware to deliver PC games to your home would scarcely be shifts at all.
Report: 20 Million PC Gamers Could Switch to Consoles by 2022 - ExtremeTech
I get the points made in this article. I have, however, seen this claim before, with pretty much every console generation I can remember. I would point out that it is possible that streaming could also increase the number of PCs used to play games. I do this now in my own home, and there's no reason a cloud service isn't the same idea. I use an older PC in my basement and stream to it from my more powerful gaming rig, making that older machine a still-relevant PC gaming computer. This approach even lets you use PCs that were never gaming computers to begin with.
Eventually, mainstream computers will become fast enough (they aren't that far off now), and an OS will become friendly enough (that part is a ways away), that households will just have a central "mainframe" computer which handles all the rendering and sends it, wirelessly or wired, to displays such as TVs or handhelds like the Shield. Nothing new in that concept; remote rendering is already a thing, especially in businesses, which use it to pair a single powerful central machine with inexpensive remote systems. Consoles will still be around for some time, but will function much more like computer systems than as entertainment hubs, mainly serving houses as a place to connect their smart home devices and such, as they will be much more affordable than mainframes.
The major problem is the telecoms, which hold a deathgrip in this country, like most others, with severe data caps and speeds. Heck, Cox is offering a gamer "fast lane" for an additional $15 per month: https://www.tomshardware.com/news/cox-elite-gamer-internet-fast-lane,39176.html. That sort of thing is going to cut down on adoption for quite a number of people who would otherwise carry around a thin, tablet-like device that privately and securely connects to a home mainframe or a remote server to stream the latest AAA game.
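To put the data-cap problem in numbers (my math, based on Google's published recommendation of roughly 35 Mbps for 4K Stadia streaming):

\[
35\ \text{Mbps} \approx 4.4\ \text{MB/s} \approx 15.75\ \text{GB/hour}, \qquad \frac{1{,}000\ \text{GB cap}}{15.75\ \text{GB/hour}} \approx 63\ \text{hours/month}
\]

A typical 1TB cap buys barely two hours of 4K game streaming a day, before the household has watched any Netflix at all.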
Plus, who knows what the PS5 and next Xbox will look like in 2022. Possibly 8-core processors clocked at 2GHz or greater, plus a decent graphics chip...
My R5 2400G blows away the Xbox and PS consoles, and my video card is a universe more powerful.
For me personally, the issue hasn't really been the hardware in the box, but more the supporting technology. PCs and PC users tend to be early adopters of new technology, and that drives a technology disparity between PCs and the consoles operating in the living room.
Take, for example, the early-to-mid 1980s. At that time, I primarily gamed in the living room, with my Commodore 64/128 hooked up to a CRT TV via an RF adapter. The Commodore 128 had a cartridge port for games and Atari-compatible joystick ports. No need to buy a separate console.
As technology progressed, computer monitors began to support higher resolutions, and games began to take advantage of them. TVs, however, remained at 480i, as that was all that was required for their primary use: watching TV over a tuner. So to play a game at high res, you needed a monitor at the desk.
TVs would eventually catch up, but just as HD flat screens became affordable, computer monitors again took another leap, adding features like adaptive sync and higher-bandwidth connections (DisplayPort) to support higher frame rates at higher resolutions. So again, to game in the living room, you had to sacrifice newer technologies available to you at the desk.
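A rough bandwidth calculation (mine, ignoring blanking and encoding overhead) shows why the connector mattered:

\[
3840 \times 2160\ \text{pixels} \times 24\ \text{bits} \times 120\ \text{Hz} \approx 23.9\ \text{Gbps}
\]

That comfortably exceeds HDMI 2.0's roughly 14.4 Gbps of effective throughput but fits within DisplayPort 1.4's roughly 25.9 Gbps, which is why high-refresh 4K monitors led with DisplayPort while living room TVs had to wait for HDMI 2.1.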
That is starting to change with the BFGD displays, but more importantly with displays like the Alienware 55" OLED monitor. A pure living-room-sized monitor with no smart TV nonsense built in would really allow no-compromises gaming in the living room. Input devices like the Razer Turret for Xbox One are also a good first step toward evening out the input-device gap between desktop and living room. We may finally be at a point where living room gaming performs just like desktop gaming without compromises. And it only took 35 years to get back here.