2018 was the year we finally stopped seeing blatant regressions in smartphone battery life, partially because miniaturisation has advanced to the point that further thinness would make the devices finger-bleedingly difficult to hold; partially because the thinness craze has faded; partially because the DPI craze has settled at about 1080p. As a result, battery life actually improved. For example, the Mate 20 Pro is comfortably a two-day phone even for the heaviest of users, and close to a four-day phone for more moderate ones.
Flagships continue removing the headphone jack, and the USB Type-C audio ecosystem remains a disappointment. Although the Type-C specification supports analogue audio output, nearly every Type-C headphone ships with its own DAC and amplifier, usually worse than the smartphone's, and cross-device support is iffy. This is transparent price-gouging, whatever anyone claims to the contrary.
Flagships also continued the super-fragile, super-slippery glass sandwich meme, now empowered by wireless charging. Wireless charging remains a waste of perfectly useful electricity, at least for batteries as large as a smartphone's. A handful of devices perfectly suited to it have appeared, mainly Apple peripherals: the AirPods, keyboards, mice, touchpads, and of course the Pencil.
Apple doubled down on their notch, and the iSheep over in Android land followed suit like the mindless NPCs they are. A lot of you might think the Google Pixel had the worst notch, because it hung too low. You'd be wrong. Sharp is far and away the winner of the design atrocity competition, because nothing says privilege like literally cutting off parts of the most expensive, expertly calibrated, high-pixel-density display. In a strange turn of events, the OnePlus 6T did it best; in fact, it's the only remotely appealing notch, ever.
Xiaomi continues its domination on the low end, though their offerings were slightly less value-for-money than they used to be. They also continued violating the GPL and engaged in other scummy behaviour around bootloader unlocks, though community pushback reined that in a little. Apple continues its domination on the high end, where it belongs: the Instagram crowd.
Screen sizes kept inflating, and officially no compact phones remain.
GNU/Linux on smartphones via postmarketOS has made some progress for select devices, which is impressive for such a young project but probably not what you're looking for. There are assorted collaborations lined up for the future, so there is actual movement on that front.
Google has all but given up on the tablet market. Android has been an utter failure there, and there's no coming back. Their ChromeOS-based late-2018 release is rushed, pointless, and user-unfriendly. The gap is filled by extremely large smartphones: the Mi Maxes, Galaxy Notes, and iPhone Pluses of this world. These have every indication of becoming the new normal, but I doubt we'll ever see a phone larger than 7”.
The best tablet is hands down the iPad Pro, which is also the closest iOS has come to competing with desktop computers. Depending on your workflow, the iPad Pro can replace a desktop outright. It's cheaper than the (utterly pointless) Macbook Air, and makes for a better Netflix, Facebook, Instagram, and sketching machine. Its keyboard is also better than the butterfly abomination. iOS' limitations still exist, especially around file management, but if you're a bourgeois normie you've probably already bought into the NSA-spyware cloud meme, so you'll be fine.
Windows 10 tablets remain a Microsoft Surface affair, with only slight improvements year over year. The hardware is impressive, held back by Windows 10's awful design decisions. Nevertheless, if you live in Windows and need the extra mobility, there's no outdoing a Surface, though it's probably not what you're looking for if all you want is a Netflix, Facebook, and Instagram machine. Linux compatibility is okay, and a Surface is the closest you can get to watching Coalgirls encodes on proper x86 hardware in a handheld device, if you can get GNOME and KDE to cooperate.
Apple Watch is the only watch worth buying and everything else either closed shop or disappointed.
There was an interesting development, particularly among cheap smartwatches: if you're a dumb-phone kind of guy, you can unironically do your calls and texting through a SIM-equipped watch. I think that's kind of cool, if gratuitous. Still, the lack of device independence and of a killer application makes wearables an uninteresting category overall.
If you had the money for them, 2018 was peak laptop across the board. Shit-tier normie laptops remain sad and uninspiring, and for some reason people still ship 1366x768 displays; stop, please stop. That resolution should have been dead five years ago, and it has no reason to exist now. Even sub-$100 smartphones come with more capable displays, and the degree to which the low-end laptop market lags behind is inexcusable. The same goes for dual-core Intel chips: the only reason to buy a dual core in 2018 is if the laptop costs under $200.
Apple refreshed the Macbook Air with a dual-core Intel chip for $1300. I don't think even Apple themselves could give a compelling reason why the Macbook Air should exist. They should at least have done something interesting with the hardware: an ARM CPU, a touch screen, anything, really. The Macbook Pros went from professional to “prosumer” with the removal of the function row and escape key, the new and deteriorated butterfly keyboard, and of course fewer ports. Overall, Apple is telling you to buy an iPad Pro or switch platforms; and really, why don't you?
If you're looking for thin-and-lights costing north of $1000, dare I say the Windows manufacturers are doing it better than Apple, and in 2018 they officially went above and beyond the Macbooks, save perhaps for the trackpad, though even that is arguable, as some find the Macbooks' enormous trackpads difficult to use. Even Razer's offerings are a better deal, though Lenovo, HP, and Dell were the most impressive overall in my opinion.
Quad-core CPUs became the standard thanks to AMD finally competing in the space, effectively doubling most people's performance. SSDs became much cheaper, and every laptop worth anything comes with one, though that is mostly to cope with Windows 10: updates, daemons, and spyware make it very heavy on disk I/O, and magnetic spinny disks in particular grind to a halt.
Due to miniaturisation and thermal improvements, it became almost trivial to cram a decent GPU into a laptop form factor. As before, if you've got $1000 to spend and really need the portability of a laptop, you can definitely get a beefy gaming laptop that will run all the Fortnite your heart desires; in fact, for a normie getting into the space, that was the best idea for most of 2018, with the cryptocurrency ponzi schemes inflating desktop GPU prices.
You're living in it. Ask any website what the OS distribution of its visitors is, and you'll notice mobile platforms dominate. It's official: smartphones and tablets are the normies' primary computing devices, and desktops exist only in the corners where smartphones cannot yet compete, such as desktop video games or legacy productivity applications. The year of the Linux desktop came with Android 4.0 Ice Cream Sandwich, and Windows is going the way of the dinosaur.
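If you want to check the claim against your own server logs, the bucketing is trivial; a minimal sketch in Python (the user-agent strings and categories here are illustrative, not an exhaustive parser):

```python
from collections import Counter

def classify_os(user_agent: str) -> str:
    """Rough OS bucketing from a User-Agent string.
    Order matters: Android UAs also contain 'Linux',
    and iOS UAs contain 'Mac OS'."""
    ua = user_agent.lower()
    if "android" in ua:
        return "Android"
    if "iphone" in ua or "ipad" in ua:
        return "iOS"
    if "windows" in ua:
        return "Windows"
    if "mac os" in ua:
        return "macOS"
    if "linux" in ua:
        return "Linux"
    return "Other"

# Made-up sample log entries, just to show the shape of the result.
logs = [
    "Mozilla/5.0 (Linux; Android 9; Mate 20 Pro) ...",
    "Mozilla/5.0 (iPhone; CPU iPhone OS 12_1 like Mac OS X) ...",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
]
print(Counter(classify_os(ua) for ua in logs))
```

Run that over a real access log and the mobile buckets dwarf the desktop ones.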
Some of the edge cases are being smoothed over by applications getting ported to mobile platforms, notably full Photoshop coming to the iPad. Others are more difficult to solve, such as the expansion and file management problems, which is where cloud platforms step in with their offerings. Smartphones are becoming thin clients to the Big Brother cloud world because of their own limitations. This is saddening and irreversible. The mobile/desktop rift will be the Windows/UNIX rift of the 2020s, though on the plus side it should be easier to bridge than the 90s dystopia.
We spent most of 2018 in the lawless, godless hellscape of bitcoin mining, with ever greater fools hogging all the quality gaming hardware so they could divine their magic numbers and get rich quick. Once that blew up, a bunch of them went bankrupt; natural selection at work, my dudes. /r/bitcoin has a million subscribers, so there is plenty more selecting left to do. I LOVE IT.
NVidia announced its new 2000-series GPUs, even though “1100 series” would have made more sense, but don't let basic arithmetic get in the way of a good marketing scam. They also swapped the GTX branding for RTX, because NVidia wants to focus on real-time raytracing now. This is a welcome change, as our current mesh-plus-shader rendering frameworks are hacks atop hacks atop hacks, bloating our protocols and files alike; although in my opinion NVidia doesn't take the architectural differences far enough, and should have been more radical by going all-in on path tracing instead.
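Part of the appeal is architectural simplicity: the core primitive of a raytracer is a ray-object intersection test rather than a pipeline of rasterisation hacks. A minimal sketch in Python, purely illustrative (real renderers run billions of these per frame on dedicated hardware):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance t along the ray to the nearest sphere intersection,
    or None on a miss. `direction` must be a unit vector; we solve
    |o + t*d - c|^2 = r^2, a quadratic in t with a == 1."""
    ox = origin[0] - center[0]
    oy = origin[1] - center[1]
    oz = origin[2] - center[2]
    b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two roots
    return t if t > 0 else None

# A ray fired down the z-axis hits a unit sphere centred at z=5 at t=4.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))
```

Everything else in a raytracer (shadows, reflections, global illumination) is recursive applications of that one test, which is exactly what the RTX hardware accelerates.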
Unfortunately, RTX proceeded to flop, with no titles adopting the new technology months after release, and the one (1) that eventually did exhibiting underwhelming performance and visual improvements. Dollar for dollar, the 2000 series are worse performers than the 1000 series, and as you can imagine this pissed a lot of fanboys off. This means we now have a bunch of idiots declaring raytracing a dead technology, not understanding that the CGI in their shitty superhero flicks has been raytraced for years. I guess if you stan NVidia, not knowing anything about graphics technology is to be expected.
As you might have expected, this was all a blatant cash grab to sell more 1000-series graphics cards, of which NVidia apparently has a glut they had hoped to sell during the cryptocurrency craze. AMD, for their part, sat on their arse the entire year. Whatever praise they may deserve for their processors, it was clear that in 2018 they all but gave up on desktop graphics, and there are many indications 2019 will be more of the same, because AMD simply cannot compete in graphics any longer; the CPU division is what's keeping them afloat.
Due to Intel's overall difficulties with the 10nm process, their graphics remained stuck at UHD 620 and delivered nothing of note. In fact, the 8000-series processors represent a graphical regression compared to the 7000 series, because until very late into 2018 you couldn't buy any of the former with Iris graphics. When they eventually shipped in NUCs, they were meh at best; to be expected of a company facing massive yield problems.
Intel has been disintegrating since Skylake, and in 2018 this got so bad that it gave not just AMD but also ARM the opportunity to compete. In 2018, the current year, the most current of years, you can actually go out and buy a laptop with a Snapdragon SoC running Windows 10; not Windows RT or a cut down version of Windows. It isn’t much more than a novelty, but the fact that you’re able to says a lot about the state of the market.
The 14nm process has been optimised and refined to an astonishing degree, and of course Intel remains the king of single-threaded workloads, but there are only so many iterative upgrades you can offer before the target market loses all interest in upgrading. Under pressure from AMD, Intel were forced to offer processors with more cores, quad core becoming the new standard, though it should have become so a couple of years ago.
The greatest winner was hands down AMD with their Ryzen series of processors. The first generation was already impressive, but the Zen+ processors were truly something else. As a result, with the exception of the most high-end ($2000+) builds whose only purpose is gaming, Ryzen are the only processors worth paying attention to; the value offering is simply that good. And even then, if you want to use your desktop as more than a glorified console and would like to do some work on it, Ryzen remains the king of productivity.
On ARM, Apple continues its performance dominance, offering more powerful and efficient processors and graphics than anything Qualcomm or Mediatek can hope to muster. Qualcomm is becoming ARM's Intel, though thankfully that market is easier to break into, so not all hope is lost yet. For the most part the disappointments aren't very visible, because we're talking about smartphones, after all. In fact, the only way Apple can show off its superiority is by flirting with ARM on the desktop; the iPad Pro already outperforms Apple's entry-level Macbooks.
Rejoice, for the memory price fixing scandal has (mostly) been resolved. Memory prices are lower in 2018 than they were in 2017, after a criminal investigation by the Chinese government and smartphone sales dropping somewhat. ECC memory is cheaper in some cases than non-ECC memory, because you can actually use it now; unlike Intel, AMD’s Zen doesn’t segregate ECC to the server market.
This didn't stop manufacturers from shipping most 2018 computers with 4GB of RAM; the savings, as usual, aren't passed down to you. 4GB ought to be enough for most people, but we live in the era of shitware Electron, so every application must use 3GB of RAM even if all it's doing is displaying photos that are themselves barely a megabyte apiece. Don't you just love 2018?
We live in a magical era, because for the first time in, well, ever, hard disk drives are more cost-efficient than Blu-Ray discs. Granted, optical media retains some advantages for archival (magnetic spinny disks are notoriously fragile), but if all you want is to store tons of shit, hard disk drives are officially your best bet. A few years ago the economics ran the other way, enough so that Facebook built custom data-centre hardware to manage BDs. I believe optical media still has promise in theory, but in practice the normies are migrating to tablets and smartphones, so unless there are radical changes in form factor, I think we've seen the end of the optical medium. RIP.
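The comparison is just dollars divided by terabytes; a quick sketch, with prices that are rough assumptions rather than quotes:

```python
def cost_per_tb(price_usd: float, capacity_tb: float) -> float:
    """Dollars per terabyte of raw capacity."""
    return price_usd / capacity_tb

# Illustrative late-2018-ish street prices (assumptions, not quotes):
hdd = cost_per_tb(90.0, 4.0)          # ~$90 for a 4 TB drive
bd = cost_per_tb(25.0, 25 * 0.025)    # ~$25 for a 25-pack of 25 GB BD-Rs
print(f"HDD: ${hdd:.2f}/TB, BD-R: ${bd:.2f}/TB")
```

Under those assumptions the hard drive comes out well under half the per-terabyte cost of the discs, and that's before counting the time spent burning them.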
Windows 10 continues to get worse with every month that goes by. The updates remain broken despite promises to the contrary. Recently Microsoft admitted that regular users are automatically enrolled into a beta-testing programme simply for hitting the “check for updates” button, which explains a lot about the catastrophic October update. Ads, bugs, spyware, and kindergarten-tier design mistakes keep creeping in; Windows 10 is unsustainable as an operating system, and many users have fled to macOS or Linux as a result. Microsoft cares little, as Windows transitions into a far more lucrative software-as-a-service pricing model.
macOS continues its iOS-ification and the gutting of its Human Interface Guidelines. Apple won't admit it, but macOS is very much the second-class citizen of the Apple world; I guarantee Apple would love nothing more than everyone switching to iPads en masse. It lacks the polish it used to have, but it's far more manageable than Windows, so people tolerate it. Through the Hackintosh movement, hardware compatibility has improved, and you can get around many of Apple's questionable pricing decisions that way.
Linux made a great leap forward for gaming thanks to Valve's Proton fork of Wine. For most gamers, Linux is now a viable gaming platform, and it has picked up significant traction owing mainly to Microsoft alienating its userbase. Ubuntu dropped Unity and adopted GNOME, thereby improving upon GNOME's shit-tier design decisions with an infusion of IQ above room temperature. Open-source hardware support improved massively, save for NVidia, which remains a pain point. No big deal; that one's on NVidia anyway.
EdgeHTML, Edge's rendering engine (itself a fork of Internet Explorer's Trident), is dead. This makes Firefox's Gecko the only major HTML engine that isn't WebKit/Blink. Google dominates web browsers, and if all goes smoothly for them, Firefox is on a deathwatch, after which Google will have a complete and utter monopoly over web standards by controlling every browser on every device in the world. It's their game to lose, and the collective welfare of humanity is the underdog.
Twitter, Tumblr, Reddit, Facebook, YouTube, Patreon, Steam, and almost every other major service was exceedingly censorious, on a steady pace to out-book-burn each other. With Tumblr’s porn ban, we’ll likely see the largest data loss (in bytes) in internet history. Combined with Patreon, the cultural effects will run deep and scar artistic output for a decade or more. YouTube flirting with banning anyone to the right of Mao is saddening. The sadistic destruction of a generation’s art inspires nothing but the darkest nihilism and no organisation should ever wield that kind of influence over our culture.
Video games lose more of themselves to the microtransaction cancer. Everyone with a brand name wants in on the gambling-scam lootbox market that mobile devices and normies have enabled, and rest assured that by the time the 2010s are over, few if any AAA releases will remain actually enjoyable, standalone experiences. Early adopters are beta testers for greedy capitalists, with every release's first few months offering roughly the quality control of a Windows 10 patch, i.e. minus infinity. The only company with an ounce of self-respect and engineering talent left is Nintendo. Everyone else, without exception, should never have touched a keyboard.
The Netflix streaming utopia is coming to an end, with Disney, YouTube, and perhaps other companies deciding to get in on the streaming meme. Streaming will become the new TV, except worse, because you'll have to buy a separate subscription for each individual service; have fun with your culture balkanised across different corporate networks, the way you've always wanted it to be, I'm sure. The Pirate Bay will always be there to welcome you, BTW.
The biggest winner of 2018 was undoubtedly Pawoo, and Mastodon as a whole. While the mainline instance is notoriously censorious, Pawoo and other Japanese-language instances arose out of Twitter's hostility towards Japanese illustrators, whose communities migrated there in droves. This means 2018 actually ended with more viable alternatives to the incumbent networks than it started with; a first in at least a decade, I'm sure.
Privacy and security scandals sprout again and again without end, with the NPCs growing increasingly indifferent to their basic human rights being signed away to Silicon Valley technofascists, and more educated users burning out trying to scrape out a space for themselves. The web is increasingly centralised, bloated, and user-hostile. Through Electron it invades the desktop as well, like venture-capital-backed necrotising fasciitis out to ruin everything that once made software beautiful, until we breathe frameworks and shit trackers. There is no hope. There is no salvation. We live in V for Vendetta, and the role of the dictator is played by software developers.
Here's hoping 2019 proves me wrong about every word of it. Amen.