Computer hardware, and consumer PC hardware in particular, is plagued by a particular breed of parasite: the corporate shill. Whether it's because they enjoy Intel's fat paychecks, or because they need to rationalise spending hundreds on Core i5s with 4 threads on the umpteenth 14nm+++ refresh, they will argue against all reason that you should not buy Ryzen, because fuck you. This is sheer quackery and it deserves to be mocked without mercy. In 2019, AMD CPUs are the objectively superior purchase at every desktop price point, from dirt poor to filthy rich, from normie to enthusiast, for productivity and for gaming. Intel apologia works by: a) undervaluing multi-core performance, b) blowing single-thread gains out of proportion, and c) ignoring externalities. This leads to comical results such as a 4-core, 4-thread Core i3-8350K "outperforming" an 8-core, 16-thread Ryzen 7 2700X, even though the Ryzen is both significantly faster overall and cheaper.
Single-thread performance gains have been slowing for at least a decade. This is down to power and thermal limits: you cannot keep raising clock speeds without melting the CPU, because power draw grows faster than linearly with frequency and voltage. The primary way the industry has kept up with Moore's Law is by increasing core counts, even at the expense of peak clocks; several smaller, slower cores are more power-efficient than one big, fast one, precisely because the cost of chasing higher clocks is not linear.
Software must keep up with the changing hardware landscape, and software that won't be updated will be replaced by software that is. The next generation of consoles will have 8 cores and 16 threads, based on AMD's Zen 2 architecture, so that is what developers will target as the lowest common denominator. Your 4-core, 4-thread 14nm part from 2014 will not be able to keep up. Even enthusiast gamers aren't upgrading every 2 years any more: you buy into a platform hoping to stay there for 5, 6, however many years. Quad cores, or God forbid dual cores, will become dead weight next year; it is simply irresponsible to recommend them as a purchase in 2019. Our workloads will only become more dependent on multi-core performance going forward.
Computer games are famously difficult to parallelise, and can hit diminishing returns even at 4 cores. However, we've been seeing a few key developments in parallelisation's favour.
The future of graphics is raytracing/pathtracing, which is a parallel workload by design. Each ray being cast is independent of the others, so rays can be traced entirely in parallel. Doubling your execution units roughly doubles the number of rays you can compute in a given amount of time.
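A toy sketch of why this scales (the `shade` function is a stand-in of my own invention for real ray-scene intersection, and thread workers are used only for brevity; real tracers run on GPUs or wide SIMD):

```python
# Each ray/pixel is a pure function of its own coordinates, so the image
# can be split across any number of workers with zero coordination.
# 'shade' is a hypothetical stand-in for actual ray tracing work.
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 64, 64

def shade(pixel):
    x, y = pixel
    # Colour depends only on this ray; no other ray's result is needed.
    return (x * 255 // (WIDTH - 1), y * 255 // (HEIGHT - 1), 128)

def render(workers):
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(shade, pixels))

# Any worker count produces the identical image: rays never interact.
assert render(1) == render(8)
```

Because no ray waits on another, throughput grows with execution units until memory bandwidth, not coordination, becomes the ceiling.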
The future of audio is also pathtracing, for more realistic 3D sound. Sound is waves too, and those waves reflect off and are absorbed by surfaces; to deliver a life-like experience, games must simulate how audio propagates through their surroundings.
Games feature ever more AI actors that can act independently, and therefore in parallel. There are bottlenecks where AI must coordinate its actions, but in many situations, such as NPCs wandering around your favourite Bethesda game, each actor can make its decisions on its own and players won't notice.
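The same idea in miniature (the `Npc` type and `decide` rule are invented for illustration, not any real engine's API; real engines batch this per frame through a job system): each actor reads an immutable snapshot of the world and decides alone, so the work can be farmed out to any core.

```python
# Toy sketch: each NPC reads an immutable world snapshot and decides its
# next action independently, so decisions parallelise trivially.
# 'Npc' and 'decide' are hypothetical, for illustration only.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass(frozen=True)
class Npc:
    name: str
    x: int

def decide(npc, world_time):
    # Depends only on this NPC's own state plus a read-only snapshot.
    return (npc.name, "walk_east" if (npc.x + world_time) % 2 == 0 else "idle")

npcs = [Npc("guard", 0), Npc("merchant", 1), Npc("beggar", 2)]
with ThreadPoolExecutor() as pool:
    decisions = list(pool.map(lambda n: decide(n, world_time=10), npcs))
print(decisions)
```

The freeze on the shared snapshot is the whole trick: no locks are needed because nobody writes during the decision phase, and coordination happens only when the results are applied.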
Machine learning is becoming more prevalent, from the graphics pipeline, to dynamic generation of geometry, materials, and other assets, to physics calculations. The first machine-learning-driven video game is already playable and viral; machine learning workloads are massively parallel by design.
Video games are increasingly providing an ecosystem of networking, text chat, voice chat, footage recording, and so on, all of which run in separate threads, independent of the core game-engine loop. When you are playing a game, especially a competitive online game, you aren't running just a graphics engine any more, but several multiplexed applications.
Unless you've taken conscious steps to airgap your gaming machine, you are running more than just your game. First and foremost are all your operating system's services, which may decide it's time for an update or an anti-virus scan right as you're gaming. Most PC gamers use some sort of client or launcher for their games, such as Steam or Battle.net or Origin or whatever. Discord, Skype, Teamspeak and company remain popular. Maybe you're watching a YouTube video or a streamer on the side; maybe you're listening to music; maybe you're recording clips of sick frags to upload later; maybe you're rendering said sick-frag compilation.
All of the above are separate, independent applications running on threads that your operating system parallelises for you, for free, using its scheduler. Many are Electron "apps", so they are effectively Chromium browsers, each spawning several threads for the Node.js runtime, for V8, and for rendering the DOM. By the time you're done allocating threads for everything, how much horsepower do you really have left for your game? Benchmarks do not tell the whole story; real-world usage on a PC involves a lot of multitasking, even if all you do is play League of Legends for a living.
Shills like to pretend that gamers are stuck in their mother's basement 24/7 and do nothing but queue up games of Fortnite until they die of dehydration. For the life of me, I've never seen anyone use their computer like that, and it should insult you that Intel shills patronise you this way. If you've so much as tried to work on an essay with more than a couple of tabs open for research or reference, you will benefit from more cores. Your browser will be snappier, and so will your word processor, which is likely the fat cow Office 365 to begin with. Do you listen to music while writing the essay? That takes a thread of its own to decode and mix.
If you do programming, you will like having more cores. If you do any kind of art whatsoever, from music production to video editing to drawing to 3D sculpting, you will like having more cores. If you do any kind of simulation for electronics, biology, physics, or engineering, you will like having more cores. If you are dealing with lots of data for statistical purposes, you will like having more cores. If you want to transcode your 10-bit animu encodes down to 8-bit so that your smart TV can play them, you will like having more cores.
Actually, let me put it this way: if your use of your computer is in any way more complicated than sitting on your arse all day watching Netflix, or doing mindless data entry for the public sector, more cores will make you more productive. You'll be able to do more in less time, reducing the friction between what's in your head and what your hardware can deliver.
And even if you have a love affair with Netflix, five years in the future, when its highest bitrate content is available in AV1, you will need as many cores as you can get your hands on to decode it.
You will not notice a 5% performance lead. Performance differences start being perceptible at around 15%, give or take. Between current Intel and current AMD CPUs, we're talking differences of not even 5%; we're talking 3%, 2%, sometimes under 1%. These aren't "leads"; they are identical performance, well within the margin of error, and might as well be noise. The extra couple of frames you gain here and there are not perceptible to the human eye. Do not trust recordings or still frames: when you're gaming, you're focused on the game, not on whether your frame rate dips to 59 FPS for 3 frames before climbing back up. You will not be reviewing your gameplay footage over and over for minor frame drops; you will be trying to stay immersed in your experience.
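The arithmetic is easy to check yourself (the baseline refresh rate and the 3% figure below are illustrative assumptions, not benchmark results): at high frame rates, a small FPS lead shrinks to a fraction of a millisecond of frame time.

```python
# How much frame time does a small FPS "lead" actually buy?
# All numbers here are illustrative assumptions.
def frame_time_ms(fps):
    """Time spent rendering one frame, in milliseconds."""
    return 1000.0 / fps

base_fps = 144              # assumed high-refresh baseline
lead_fps = base_fps * 1.03  # a generous 3% "win"

delta = frame_time_ms(base_fps) - frame_time_ms(lead_fps)
print(f"{delta:.3f} ms saved per frame")  # ~0.202 ms per frame
```

A fifth of a millisecond per frame is orders of magnitude below anything a human can perceive, which is the entire point.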
Higher clocks mean more energy, and older process nodes mean more energy still. More energy means more heat, and more heat requires more cooling. To extract all the performance you can from an Intel CPU, you need to overclock it and pair it with a beefy cooling solution; at the highest end, we're talking genuinely exotic water cooling. That's hundreds of dollars of investment, not to mention the time, all to boast that you're 5% faster than the competition while running at twice or thrice the competition's wattage. Unless you have solar panels on your roof, themselves a $10,000 investment, you should care about your electricity bill. What a fucking joke. With Ryzen CPUs you get a perfectly good cooler in the box and a much more efficient process node. The money you save over the years will be enough to fund a new processor down the line.
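A back-of-the-envelope sketch of the running cost (every figure below is an assumption; swap in your own wattage delta, hours, and local electricity price):

```python
# Back-of-the-envelope: extra wattage under load -> yearly running cost.
# All numbers are illustrative assumptions, not measurements.
extra_watts = 100     # assumed extra draw of an overclocked, hot-running chip
hours_per_day = 4     # assumed daily hours under load
price_per_kwh = 0.15  # assumed electricity price in USD; varies by region

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost:.2f}/year")
```

Add the price of the aftermarket cooler the boxed chip doesn't include, multiply by the years you keep the platform, and the gap stops being pocket change.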
Since 2018, every few months Intel suffers yet another hardware security vulnerability, which operating systems have to patch around at a cost in performance. Single-thread performance has already taken multiple hits, and they all add up. Multi-thread is even worse, as Intel's Hyper-Threading is fundamentally broken and insecure. It has got to the point where, if you care about security, one Intel core equals one AMD thread. There is no indication that the onslaught of Intel vulnerabilities will stop any time soon, and no indication that Intel is taking the security community seriously by getting ahead of the curve; all we get are software patches for the same shitty 14nm Skylake clones.
Unless you benchmark current machines with all security mitigations enabled, no Intel vs. AMD comparison is accurate; AMD has far less to mitigate, so Ryzen processors keep far more of their advertised performance. Security is important, and arguing that mitigations should be disabled is dangerous. I know all of you purchase things from your computers. I know you're watching porn that is nobody's business but yours. I know you keep important records on your computers. Your privacy is your human right, and it should not be sacrificed for a few extra frames in Call of Duty so that you can feel better about Intel's incompetence.
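If you run Linux, you can audit your own machine: the kernel reports each known CPU vulnerability and its mitigation status through sysfs. A minimal sketch (output varies with your CPU and kernel version, and the directory simply doesn't exist on other platforms):

```python
# Read the kernel's own report of CPU vulnerabilities and mitigations.
# This sysfs interface exists on Linux kernels from ~4.15 onwards.
from pathlib import Path

VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

def audit(vuln_dir=VULN_DIR):
    """Return (vulnerability, status) pairs as reported by the kernel."""
    if not vuln_dir.is_dir():
        return []  # non-Linux system or old kernel: nothing reported
    return [(p.name, p.read_text().strip()) for p in sorted(vuln_dir.iterdir())]

for name, status in audit():
    print(f"{name}: {status}")
```

On an affected Intel box you'll see entries like `mds` and `spectre_v2` with their mitigation (and cost) spelled out; that list is what the "mitigations off" benchmarks quietly ignore.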
Intel is selling you processors that are more expensive than AMD’s, that have half the performance, that are fundamentally insecure, that consume more energy, and that have fewer features. If you value your human rights, you should buy Ryzen. If you value your wallet, you should buy Ryzen. If you value the environment, you should buy Ryzen. If you value performance, you should buy Ryzen. If you value convenience, you should buy Ryzen.
If you are dirt poor, you should buy the Athlon 3000G or the Ryzen 3 2200G. If you are rich, you should buy the Ryzen 9 3950X. If you are maximising bang for your buck, you should buy the Ryzen 7 2700. If you are looking for an APU, wait for CES 2020 and AMD's hot new announcements. If you are a data centre, buy the Epyc 7702P. Top to bottom, at every price point, for every occasion, for every user, for every workload: just buy Ryzen. The only justifiable Intel purchase is the one you pay $0 for.
And if you are UserBenchmark.com, you should delete your entire site.