By John Messer

We’ve all seen them. They’re too pervasive to miss. No free-to-play game would be complete without them: microtransactions. They’re growing in popularity, as the digital frontier makes implementation all the more manageable. Are they a nuisance or an improvement? Are companies just greedy, or is this simply realistic industry growth? Most importantly, are microtransactions here to stay?

Microtransactions are a practice in which in-game items are purchased for a few dollars of real currency. These items are often cosmetic, but other times, they are not. A derogatory nickname has cropped up for game-affecting in-game purchases: “Pay-To-Win”. This alludes to how unbalanced the game is for anyone not toting the purchased game modifier.

For a long time, microtransactions were not an issue. At first, they were only used in free-to-play games, wherein no purchases are made in order to download and play the game. Developers of such titles accrue revenue from in-game advertisements and microtransactions.

Often, these in-game purchases reduced the time needed to accomplish things. For example, in the popular mobile farming game “Hay Day”, it might normally take 12 real-world hours to grow a crop, but spend 50 cents and it’ll be done instantly.

Cosmetic features are also frequent forms of microtransactions. In third-person co-op shooter “Warframe”, developed and published by Digital Extremes, playing the game is free but microtransactions exist to customize your character’s appearance and play style.

Regardless of the forms microtransactions have taken, one element was nearly ubiquitous for many years. Anything purchasable via microtransactions could often be earned by simply playing the game enough. The crop in the farming game would eventually grow, and most options in Warframe would eventually unlock. On top of that, anything that you couldn’t eventually earn was often only cosmetic, meaning it did not affect gameplay.

What changed? As many ruffled gamers might describe it, big-budget developers got dollar symbols in their eyes, looking like the Wolf in that old Tex Avery short. This is obviously a biased narrative, but what we know for sure is that microtransactions started popping up in games that already had full $60 price tags. That’s where the modern discussion sits: big-budget games getting full-priced releases but containing microtransactions.

Now, whenever a free-to-play game has microtransactions, no one bats an eye. This is because those games draw most of their needed revenue from those transactions.

On the other hand, the general consensus on retail-priced games is that the cost of making the game, plus some additional profit, is entirely covered by the $60 spent buying it. Indeed, this has been the case for the vast majority of gaming’s history. When modern games with full retail pricing then include microtransactions, it seems like a naked attempt to draw in extra profit for very little risk and cost. For example, that custom weapon skin in “Battlefield 1” does not cost EA all that much to draw up, and those “Team Fortress 2” hats are pumped out for pennies.

Microtransactions have a variety of negative effects. Chief among them, for some consumers, is the added cost of getting all the content available in some games. Some feel that content is purposefully left out and later added post-launch just to earn a few more dollars. This leads into a plethora of concerns about games being designed with microtransactions in mind, and fears of games built around MMO-like conditions of effort and reward. (“Massively multiplayer online” games often operate on subscriptions, so they give just enough reward to keep players hooked for months.)

In many ways, these fears are already somewhat realized. For many years, “Call of Duty” has had paywalls for cosmetic features, and these have only increased in intensity with each new yearly installment. Most yearly sports and racing games are also infested with tiny payoffs for content or better player statistics.

Compare Bungie’s “Destiny” to its older title “Halo”. Gameplay is ostensibly similar: both are first-person alien sci-fi shooters with quiet protagonists and deep lore. “Destiny”, however, takes a wildly different approach, borrowing MMO elements like a leveling system, fixed-level areas, scaled enemies, simplistic boss fights and, let’s not forget, a huge influx of microtransactions that tie directly into gameplay. The result is a game with much looser design. Gameplay is boring, repetitive and less thoughtful. The story suffers, becoming stale and static to accommodate the game style. “Halo”, by comparison, is streamlined and focused. Everything in it obviously had thought put into it, and effort was made refining every component. In a way, it simply feels more complete.

Microtransactions aren’t all bad, according to many publishers. Many claim that they extend the longevity of multiplayer experiences, offering an alternative to subscriptions that covers upkeep and additional free content years past launch. While this reasoning certainly makes sense, there are few cases in which it has actually played out. None really come to mind beyond “Grand Theft Auto 5”, and even that game is starting to show its age and may not represent current trends in the gaming landscape.

Are microtransactions a bane on gaming, responsible for poor design and gated content? I certainly don’t appreciate them. I either ignore the cosmetic ones or avoid playing games that have performance-enhancing downloadable content.

Nonetheless, it’s hard to determine if they are well and truly a bad thing. Is a gut reaction always the safest bet? Do microtransactions harm or help? It’s hard to say for sure, but one thing is certain: microtransactions are in, and they aren’t going to disappear overnight.
