This is a topic that I tackled early on Second Opinion, but that episode was over a year ago, and things have changed both in the industry and for me personally. If you would like to compare my thoughts then to the following write-up, click here.
I still remember my introduction to the medium of video games, though some details are a bit hazy. Nearly twenty years does that to memories of early development. I was about three or four years old when my dad hooked up his old Sega Genesis (model 2, if you were curious) to our hulking 32 inch television in the family room downstairs. I instantly gravitated towards one game: Ms. Pac-Man. The hours I would sink into playing that game alone (along with the few I would spend trying to understand the original Mortal Kombat) flew by on the rare occasions that Dad would hook up the old Genesis. In fact, I still have the console and all the games and joystick controllers to this day.
Then came the biggest phenomenon among the youth culture in the late-1990’s: Pokémon. While Pokémon Red and Green Versions were released in Japan in February 1996, the franchise wouldn’t make waves internationally until 1998 when a massive multi-media campaign was launched for the franchise. The anime, the Trading Card Game, the video games (Pokémon Red and Blue Versions, internationally, based on the updated Blue Version of the game released in Japan in 1997), the manga, children’s books—Pokémon had its logo on pretty much anything that it could be slapped on.
Funny enough, I remember my introduction to Pokémon much more clearly than the day I first played the Genesis. I was only three years old. It was raining outside. Dad was at work, so it was just me and my mom at home. We were in the living room that afternoon, and I was watching Kids’ WB’s afternoon programming block when the first ever episode of the Pokémon anime came on.
I was enthralled, along with the rest of my generation. Every weekday thereafter, I'd watch the newest episode, and I kept watching for years. For my fourth birthday, my great-grandfather gave me money to buy a Game Boy Color and my parents bought me a used copy of Pokémon Red Version. Yes, like most others my age, I had a hard-on for Charizard. But at that point in my life I couldn't yet read, and so I merely hit "Continue" on the main menu screen and wandered about in what I would later find out was the empty field beside the museum in Pewter City, unsure of what to do.
Later, as I started kindergarten and began to learn to read, I figured out how to start a new game, and I caught a plethora of Pidgey and Rattata before finding the Save function. Believing that this option would "save the world" in the dramatic sense, I pressed it, convinced I had…well, saved the world. I later learned that it meant "save your progress", and from there it was a grand adventure.
I don’t think I ever actually beat Pokémon Red, but the first game in the series I did beat was its counterpart, Pokémon Blue Version. Like my introduction to the franchise, I remember this moment vividly: my family and I were at BJ’s Warehouse for groceries, and I had brought my tried and true Game Boy Color. For the past several days, I had been training my Charizard to the highest level I had ever gotten a Pokémon: somewhere between 89 and 92. I remember also having an Articuno on my team when I finally beat the Elite Four and my rival, whom, uncreative kid that I was, I had named Gary. It was a curb-stomp battle, but it was not only the first Pokémon game I had ever beaten—it was the first video game I had beaten, period.
It was a joyous moment for me, as evidenced by how clear certain details remain despite it having been well over fifteen years ago. I followed up on this by finding out that the mysterious cave outside of Cerulean City, the Unknown Dungeon, was now open, so I braved its depths, caught powerful Pokémon, and faced the true final boss of the game: Mewtwo. By this point, I had already seen the first movie numerous times on VHS (I still have my copy today!), but this was so far beyond my expectations. So of course, being the simple child I was, I didn’t save scum the encounter in an effort to catch it (I had used my Master Ball on either the aforementioned Articuno or on Zapdos); I simply had Charizard defeat it. Not my proudest moment, admittedly.
The point is, Pokémon cemented my love of video games and for years to come I would play the newest (main series) games as soon as I could after they came out.
Around 2002, after seeing a cousin play 007: Nightfire on his PlayStation 2 (and playing a few rounds), I asked for a PS2 for Christmas with The Simpsons: Road Rage and “that game such-and-such had at his house”. And that’s what I got that year for Christmas. Though, because I never saw the cover for Nightfire, I had no idea that was one of the games I wanted. My parents didn’t know how it got there, either, so we took it to Toys “R” Us to exchange it for something else (Yu-Gi-Oh: Forbidden Memories, if I remember correctly).
And so for years, I discovered a great deal of games on the PlayStation 2 that I grew to love, including my all-time favorite series: Jak & Daxter. I can’t even begin to count how many times I rented Jak II from Blockbuster, not knowing how to read the minimap to get to missions and just pissing off the Krimzon Guard. Other notable favorites include games and series such as Tony Hawk’s Underground 2, Dragon Ball Z: Budokai, Freekstyle, Spider-Man 2, The Simpsons: Hit & Run, Star Wars: Battlefront, and Kingdom Hearts.
But after moving to Florida, the friends I had made on my street introduced me to something I had never experienced before: the Xbox, and along with it, Halo 2. It was shortly after the game had launched, and the older brother of one of my friends had picked up the Limited Edition. We spent hours playing against his brother, 3v1, on a 12 inch tube TV in a cramped bedroom, and while we never did beat him (though we came close a few times), we always had an absolute blast.
During a sleepover one night, I played one of the campaign missions through to completion on my own: Metropolis. I played the first level of the original Halo. I fell in love with Halo.
From the tail end of elementary school through my first year of junior high, I was reading the original trilogy of Halo novels: The Fall of Reach by Eric Nylund; The Flood by William C. Dietz; and First Strike by Eric Nylund. The number of times I’ve read The Fall of Reach alone is incalculable. Nylund truly was a fantastic choice to write for this universe.
I got an Xbox 360 for Christmas in 2007, and a few days later I bought Halo 3 with some money I was gifted by relatives. I spent an entire weekend playing the campaign, but it wouldn’t be until I got Xbox Live in August 2008 that I’d be playing it full-time. In those days, I had to bring my console into the office/spare bedroom, where we kept the router, and use the short Ethernet cable that came bundled with the console to connect to the internet. There was no TV in the room, so I used a portable DVD player with a 6 inch screen and composite video input to play. I even had friends play with me at my house, meaning that there would be two or three people staring at this tiny screen to play against people from all over the world.
It was around this time that I got into Red vs. Blue, a Halo machinima series produced by Rooster Teeth. Years later, at RTX 2014, I would meet Burnie Burns, one of the founders of Rooster Teeth and creator of RvB, and tell him in person that he was one of the people who inspired me to make machinima and to be a filmmaker.
But something has happened in the past decade or so in the realm of video games. Something sinister that has been slowly pushing me away. Or rather, it’s an amalgamation of several problems that have been turning me away from the medium for a while now. And I’m not going to pretend these issues popped up overnight and that I made an immediate, conscious decision to start moving away from video games, because it all happened gradually, over time.
The video game industry has become predatory.
I can hear you now: “You’re just going to complain about Electronic Arts, microtransactions, and lootboxes, aren’t you? Way to get on the bandwagon of hate!”
Well, you’re not entirely wrong. I will be addressing all of those elements among others in this diatribe, but I’ll try to do so in a more constructive way than simply complaining. Maybe you’ve heard all this before from such figures as TotalBiscuit, Jim Sterling, or YongYea (all three of whom I recommend watching), but as far as I recall, none of them has covered the total scope of the matter in a single video. If any of them have, I apologize for my ignorance, but for the sake of maintaining some semblance of integrity I will be conveying my thoughts on the matter without searching through their back catalogs of videos, so as not to accidentally regurgitate their words and present them as my own.
As with any issue, we must look back to the beginning…
(or: How I Learned to Stop Worrying and Love the Add-Ons)
Downloadable content, or DLC, is nothing new to the realm of video games anymore. The concept is simple to grasp: you’ve bought your game, a complete game, at full retail price. Now, for a small sum of money, you can add just a bit more content to the game that was created by the developer. Maybe it’s an extra dungeon, or a couple new bosses.
Or maybe it’s cosmetic-only armor for your horse.
On April 3, 2006, Bethesda released the decorative horse armor DLC for The Elder Scrolls IV: Oblivion for $2.50 on the Xbox Live Marketplace. It was almost instantly mocked for being overpriced and ultimately without value, as Oblivion was a single-player game, meaning nobody but the player would be able to appreciate the armor, and the armor itself had no actual function in-game other than to look pretty.
Years after the fact, Pete Hines, the vice president of public relations and marketing at Bethesda, addressed the backlash in a 2012 interview with GamesIndustry International, in which he said:
“The reaction to Horse Armor wasn’t just about price. It was more of a lesson: when you’re going to ask somebody to pay X, do they feel like they’re getting Y in exchange? If they don’t feel like they’re getting their money’s worth, they’re going to bitch.”
It’s important to note, however, that horse armor was certainly not the first instance of post-launch DLC for a console game. During the previous console generation, multiple DLC packs were available for purchase for Halo 2, Star Wars: Battlefront II, Battlefield 2: Modern Combat, and many others on Xbox Live, and massive expansions had been around for PC games (particularly MMORPGs) for quite a while. Hell, the Sega Dreamcast technically had a rudimentary form of DLC, and that was a console that originally came out in 1998!
A New Generation in Full Swing
After the horse armor fiasco, the seventh generation of consoles seemed to finally come into its own in 2007. Several monolithic hits, both critically and commercially, saw release that year: BioShock, Halo 3, Team Fortress 2, Super Paper Mario, Command & Conquer 3: Tiberium Wars, Portal, Heavenly Sword, Metroid Prime 3: Corruption, Skate, Call of Duty 4: Modern Warfare, Rock Band, Super Mario Galaxy, and so many more were finally ushering in the Xbox 360, PlayStation 3, and Wii as the then-standard of console gaming.
And several of those games had paid DLC that generally was released without any sort of backlash. Halo 3 had several map packs which each gave players three new multiplayer arenas at $9.99 per pack (though those prices were cut down later). Rock Band (on the PS3 and 360, anyway) allowed players to purchase playable songs for $1.99 each from a catalog of hundreds. Call of Duty 4 had a single map pack that contained four maps for $9.99. The PS3 version of BioShock had an exclusive Challenge Rooms DLC for $9.99.
I feel that the lack of outrage over these paid DLC packs can be attributed to what Pete Hines said in 2012: players felt that the money they were paying for this extra content was fair and that the content was worth the financial investment. $9.99 for three maps in Halo 3 may seem somewhat ludicrous to some, but with the innumerable ways to play on those maps across so many game modes and matchmaking playlists—not to mention the variety of the maps themselves—that $9.99 seems more than fair.
Even Bethesda learned from their misstep with horse armor. Oblivion saw a couple major expansions during its time alongside several smaller additions. The Knights of the Nine expansion was priced at $9.99, and Shivering Isles, a considerably larger expansion, was set at $19.99. Fans of the game hold these two paid expansions in high regard, suggesting that they felt the price was justified.
Bethesda moved forward with their newfound knowledge of how to approach DLC and expansions with 2008’s Fallout 3. Each of the game’s five expansions was sold for $9.99, and much like with Oblivion‘s two major expansions, fans seemed satisfied with the price for the content.
Despite a global economic recession in 2008, gaming looked to be, for the most part, moving in the right direction heading into the 2010’s…
The Rise of Freemium
But something else was released in 2007 that would come to change the world as we know it: Apple’s iPhone. While it wasn’t the first smartphone on the market, the iPhone revolutionized the concept and made it much more user-friendly and accessible. Over the next several years and as new generations of the iPhone came onto the market, a new type of video game began to rise in ubiquity on mobile devices and smartphones: the free-to-play, or “freemium”, game.
The concept is simple: the user would download a game at no cost. It is a fully functional game, but there are restrictions in place to make certain, and often integral, parts of the game a chore to play. This may include wait times for constructing buildings or crafting tools, limiting the amount of in-game currency the player could acquire in a given period of time, or even restricting access to certain parts of the game entirely.
In order to circumvent this frustration, the player is able to spend real world money to bypass these restrictions. $2 can speed up or skip a six-hour crafting period. $5 can unlock access to better units. $1 nets you a random selection of crafting materials. Various increments of real world money can get you certain amounts of standard or premium in-game currency.
By releasing games for free and structuring them around these microtransactions, publishers began to see profits rise astronomically compared to releasing a console game for $60 with maybe a few DLC packs here and there for $10 each. So, of course, the question on the minds of publishers was whether they could have their cake and eat it, too.
Double-Dipping, Part 1: Gatekeeping
Electronic Arts began to lead the pack with this new plethora of extra monetization tactics in the console game space. The first was Project Ten Dollar in late 2009, which served two purposes: combating second-hand game sales through retailers such as GameStop, and further monetizing their games at the expense of consumers who did not buy them brand new.
Simply put, starting in 2010, a code was included in EA’s games that granted the player additional content when it was redeemed through Xbox Live or the PlayStation Network. Maybe it was some extra missions, a new outfit or two, an additional companion; overall minor things, but still part of the game. However, if the player purchased the game used, and the code had already been redeemed, they would have to spend $10 to access that content that originally came with the game.
This evolved into the Online Pass, where the code included with the game was no longer just a few “extra” bits and bobs but a requirement for accessing certain standard features in the game, most notably online play. Just as with Project Ten Dollar, those who purchased used copies of games whose Online Pass codes had already been redeemed would have to pay an additional $10 in order to access the whole game.
EA wasn’t the only publisher to implement the Online Pass, however; Ubisoft and Sony quickly adopted the practice as well. Through the tail end of the seventh generation of game consoles, this anti-consumer practice remained prevalent despite mounting outcry and criticism from both consumers and the press.
The backlash reached its peak in 2012 when EA’s Kingdoms of Amalur: Reckoning, a single-player game, included an Online Pass that locked several missions behind a paywall if the game was purchased used. The common defense of the practice at this point was that the content provided through these codes was “not required to complete the game.”
It wasn’t until 2013 that the Online Pass would meet its end, with Electronic Arts, Ubisoft, and Sony each announcing the end of their implementations. Officially, for EA and Ubisoft, this was due to the outcry from consumers; others feel that the prospective digital rights management (DRM) features of the upcoming Xbox One and PlayStation 4 rendered the passes redundant. This was explicitly the case for Sony, though not because of DRM (which they considered the Online Pass to be a form of), but because the PS4 would require a PlayStation Plus subscription to play games online.
No DRM features were officially announced for the PS4, and it would not be until months after the reveal of the Xbox One that Microsoft would backtrack on their draconian plans for DRM for their upcoming eighth generation console.
But while all of this was unfolding, there were other forms of further monetization being implemented in AAA games…
Double-Dipping, Part 2: Season Passes
DLC was still very much in vogue by the start of the 2010’s, despite its rocky start with Bethesda’s horse armor, thanks to exceptional expansions in games such as Oblivion, Fallout 3, and Grand Theft Auto IV. The latter in particular proved that DLC was, in theory, a positive addition to the gaming landscape. Both expansions for GTA IV—The Lost and Damned and The Ballad of Gay Tony—cost $19.99 each and not only added new weapons, game modes, characters, and features, but also contained entire campaigns unto themselves which rivaled the length of many AAA games. In a sense, you were paying for a new game.
In 2011, Rockstar Games, the publisher of the Grand Theft Auto series, released L.A. Noire, a detective game set in the 1940s. It featured what was called the “Rockstar Pass”, a $10 (later $12) access pass that allowed players to pre-purchase all of the game’s upcoming DLC (which would cost $20 in total) at a reduced rate. Also in 2011, Mortal Kombat featured its own Season Pass, which allowed players to pre-purchase additional fighters at a reduced rate.
In theory, the Season Pass is a double-edged sword. On one hand, players can purchase all of a game’s future content ahead of time at a reduced price, and the developer collects more revenue up front to subsidize the costs of developing that additional content. On the other hand, developers and publishers don’t necessarily have to disclose exactly what those future DLC packs will be, so a player who buys the Season Pass to get all future content may end up with content they never would have purchased individually, potentially spending more money than they would have otherwise.
In practice, it’s slightly more deceptive and even exploitative. Many games, beginning only a year after the concept became reality, have offered Season Passes only to release DLC that is not covered by these supposedly all-encompassing, one-time purchases. Halo 4, Dragon Ball Xenoverse 2, Destiny, Borderlands 2, Evolve, and more have had extra content not covered by the cost of their Season Passes.
Despite these negative aspects, Season Passes are still prevalent today. Some of them even rival the price of the base game, as in the cases of Star Wars: Battlefront (2015), Fallout 4, Battlefield 1, and Batman: Arkham Knight (originally; its price has since dropped from $39.99 to $19.99).
But of course, there’s the big one that’s been highly publicized and discussed these past few months ad nauseam…
Double-Dipping, Part 3: Microtransactions & Lootboxes
While this particular aspect of the AAA game industry has seen rampant criticism in the wake of EA’s Star Wars: Battlefront II (2017), this has been an issue that has been in development for quite some time.
Let’s go back to those smartphone games we discussed earlier in this (admittedly lengthy) article. They were making money hand over fist through microtransactions, and the profits made from them were starting to dwarf those of traditional AAA games. Big publishers implemented Online Passes, and the backlash put a stop to them. Season Passes worked, and so they stuck around, but they still only guaranteed a finite amount of revenue from any single consumer.
So they needed a way to continually profit off of the same consumers with as little effort as possible. Thus, the practices typically associated with freemium games on mobile devices began to be integrated into major AAA releases.
This seemed to begin, at least in a highly noticeable way, with EA’s Mass Effect 3 in 2012. The game featured a cooperative multiplayer mode where up to four players fight against ten waves of enemies in a battle for survival. As they play, they accrue credits which are then used to purchase varying tiers of supply boxes with different chances of different rarities of weapons, supplies, characters, and gear, all randomized.
However, the option was also available to simply purchase these supply crates with real world money, just like in a freemium game.
I don’t recall much controversy surrounding this, if any, as I had not stepped into the Mass Effect series until a couple of years later and, if there were outcries about this, they likely were drowned out in the vitriolic response to the controversial ending of the game’s story.
In the following years, other forms of microtransactions would appear in major releases, such as weapon and armor packs in games like Dead Space 3 and 2013’s reboot of Tomb Raider. Oftentimes, such items were attainable simply by playing the game normally; the microtransaction merely worked as a shortcut.
A few years earlier, meanwhile, Valve’s Team Fortress 2, which had gone free-to-play, began utilizing lootboxes that were easily acquired in-game but required keys purchased with real world money to open. Given that TF2 had adopted a free-to-play model, this practice did not draw much attention, either.
Then in 2013, Grand Theft Auto V was released to massive critical acclaim and astronomical sales. A month after its launch, its multiplayer component, Grand Theft Auto Online, became available to play. GTA Online‘s early days were marred with issues concerning player progression: job payouts were diminutive, the prices of goods were high, and the service launched with a plethora of options to purchase in-game currency with real money—the Shark Cards.
Players despised Shark Cards because one could simply pay real money to buy a high-end apartment, the best car, and a tank right from the start with no penalty, while those who played the game to earn their homes and vehicles would feel like they had wasted their time.
Despite this, GTA Online remains astonishingly popular and lively thanks to a regular output of free content updates from Rockstar. These updates, however, have not been free from criticism. Long-time players point out that many of the new vehicles, offices, garages, warehouses, and more cost an exorbitant amount of in-game currency, and that much of the new content is locked behind purchasing these new properties, which is a large investment in and of itself. That is before spending even more money on top of that to start the new missions, or losing all of your progress (and money) when players in freemode (where many missions are required to be played) destroy your shipments. All of this encourages the purchase of Shark Cards in order to avoid such frustration and tedium.
In short, many feel that Grand Theft Auto Online has become so centered around microtransactions that purchasing them is almost required in order to enjoy or even play the game.
Yet the game is still wildly successful. The promised single-player DLC for the game never surfaced and is looking like it never will, as the encouragement of buying Shark Cards has been deemed more profitable.
Since then, major AAA games have been moving ever closer to that freemium model. Halo 5: Guardians‘ Warzone mode allows players to spend money on crates containing weapons and vehicles they can spawn with during a match. NBA 2K18 becomes a ridiculous slog of a grind to rank up unless the player pays more money. The FIFA and Madden titles have entire game modes centered around microtransactions that earn EA over $800 million annually. And, of course, there was Star Wars: Battlefront II (2017), which limited the amount of points you could earn in-game and had preposterously high credit requirements for purchasing heroes, amounting to thousands of hours of game time or over $2,100 to unlock it all before DICE did some tweaking.
Then, of course, there are the lootboxes: required for progression and available for purchase in many cases. The dam really began to break when 2017’s Middle-Earth: Shadow of War implemented them despite being a single-player game.
Warner Bros. put microtransaction lootboxes in a single-player game.
This and the situation with Star Wars: Battlefront II (2017) have begun to rock the boat dramatically on the consumer front. Players have begun to grow frustrated with the constant nickel-and-dime “games as a service” approach to development, myself included. Call me a hipster all you want, but I saw this coming for a while and did my damnedest to steer clear of these sorts of games. You see, even if you don’t buy microtransactions, the fact that you bought a game which has them is victory enough for the publisher. Even if you and a thousand others don’t buy into them, the “whales” (a real industry term for players who pour hundreds or even thousands of dollars into a game via microtransactions) will more than make up for that.
These whales are the primary target of these predatory practices. Some of them may have more money than sense; others have serious mental health problems and need help. And these microtransactions, specifically random lootboxes, are designed to prey on them.
“But what about cosmetics,” I hear you ask, desperately, “if they don’t affect gameplay, then it’s okay, right?”
No, it’s still not okay. If you can pay real money for a chance at something you want, even if it ultimately has no bearing on how you play the game, it’s a form of gambling. Gambling can be addictive, and it can ruin (and has ruined) people’s lives. I’m glad some countries and states are seriously considering classifying this as gambling in the legal sense.
But while the ideology of “games as a service” certainly grinds my gears, it is by no means my only gripe with the current state of gaming…
Samey, Uninspired, Broken
In my eyes, the mainstream, AAA gaming landscape has become more and more stagnant in the past decade. Too many shooters, open-worlds, zombies (though thankfully this one has been dying off), and RPG-lite games. Too much focus on gritty realism. So many games have just been blending together to me in the past several years.
Yearly sequels in franchises certainly don’t help. The last Call of Duty and Assassin’s Creed games I owned were Black Ops (2010) and Assassin’s Creed III (2012), the latter of which I never even finished. I’ve played some more recent installments and watched gameplay of the latest releases, but it looks like not much has changed in the years since I bought into these series.
“What about indie games,” you may be asking. “Why don’t you try looking there?”
To be honest, most indie games, while I’m sure they’re absolutely fantastic and deserve the praise they get, just don’t really gel with me. I tried the demo for Undertale some time ago and the game’s visuals physically hurt my eyes, even though I can play retro games (i.e. NES, SNES, Genesis) with no problems. The Binding of Isaac, and the roguelike genre in general, is something I find more repetitive and tedious than enjoyable. I generally avoid games in early access on principle, so a good many of those games are no-goes for me, as well.
The reliance on post-launch patches also leaves me wary. The general consensus nowadays seems to be that a game will be a buggy mess when it comes out, and that after six months or so it will be playable. It’s become difficult to commit to a $60 purchase when there is a very real possibility that the developer did not have time to finish the game, but the publisher forced it out the door to meet a deadline, leaving the developer to fix it later through patches or, at the discretion of the publisher, paid DLC that reintegrates missing content back into the game.
The over-reliance on DLC and Season Passes wears on me, as well. The base game may leave certain areas open in clear anticipation of future paid content, or the story may cut off some extraneous branches for the sake of having the player fork over some extra cash to reach the leaves. The prices of DLC have risen, too, it seems. Call of Duty: Modern Warfare Remastered, itself originally a bonus for buying a special edition of Call of Duty: Infinite Warfare, has a DLC pack containing the maps from the sole DLC release of the original game, and it costs $14.99. That’s fifty percent more than the original DLC sold for.
Between anti-consumer, predatory business practices on the part of publishers; stagnating creativity among the medium’s largest names, developers, and franchises; and corporate greed taking precedence over consumers, I truly believe that the video game industry is overdue for another crash, just like in 1983 (though the circumstances differ wildly).
All Is Not Lost
While this article has mostly been an exploration and condemnation of how the video game industry operates today, all based on the premise that I have begun to drift away from the medium I love, I do, in fact, still love video games. I don’t play as much or as often as I used to (too busy with work, class, Reel Life, and now this webpage), but I do still play.
I gave the AAA crowd a bit of a scathing lashing, but that doesn’t mean there haven’t been any major releases in the past few years that I have loved. 2016’s DOOM, despite Bethesda’s focus on marketing its multiplayer and not sending out review copies, is probably one of the best shooters I’ve played in quite a long time, and I’m enjoying Panic Button’s Nintendo Switch port of the game, which is nothing short of miraculous. (I also reviewed it for The Backlog, watchable here.)
Grand Theft Auto V, despite the microtransactions, is still a fun game to play with friends when you want to just mess around.
The Legend of Zelda: Breath of the Wild, while it has its faults, is still a game I enjoy exploring even after finishing its story. The Kingdom of Hyrule is just so vast that it really does feel like an adventure.
While I have grown tired of genres due to their over-saturation, Titanfall 2 had a surprisingly gripping campaign and a fun (if frustrating, on PC) multiplayer.
And, of course, Persona 5 went from a purchase I made out of curiosity and admiration for its soundtrack to getting me into a series I had never even considered before and becoming one of my favorite games of all time, thanks to its gripping story, addictive gameplay, and well-defined characters.
I still keep an eye out for games that may interest me, even if I’m not exactly easy to please these days. There are still quality games that buck stale trends every once in a while, and I’m certainly looking forward to Kingdom Hearts III, Metroid Prime 4, Bayonetta 3, and Persona 3: Dancing Moon Night and Persona 5: Dancing Star Night in the near future.
Yes, I know the Persona dancing games are ridiculous. No, I don’t care.