Game Pile: Mirror’s Edge (and Platform Capitalism)


Mirror’s Edge is a 2008 parkour game released by EA, and a solid example of what we can generally agree was a ‘troubled development’: some really good ideas were executed really well, and a lot of the rest of what was going on in the game got cut up.

Central to the setting of Mirror’s Edge is a dystopian future: a clean world of gleaming whiteness, where the rooftops don’t have bird spikes on them, and where that cleanliness comes at the cost of your privacy. You play as Faith, a runner, who ostensibly works beneath the notice of the police, moving messages and goods from place to place.

As Faith, you’re a woman who sees the world differently, who sees routes that other people don’t, doing work that protects people’s data in exchange for money.

But before we go on, we’re going to need to talk about Capital.

There’s a guideline in academia that you don’t bring up Marx unless you’re willing to go all in on Marx, because it’s a contentious topic with a lot of reading around it. This is definitely the CliffsNotes version, a Wikipedia summary.

Capital is a term that’s got a lot of meanings, and a lot of reasons to interrogate those meanings.

Capital, as invoked by renowned beard-haver Karl Marx, is money that’s used to purchase goods or services only to resell them. It’s money generated not by things being made or done or consumed, but by the circulation of money itself. These days you might hear that kind of capital referred to as ‘financial’ capital.

It is money that does nothing but make money. It is spending money to buy money. It is the nihilistic extrusion of a parasite.

And these days, capital ain’t even capital no more.

In our 21st century era, we have a new, dangerous idea being used to inform people’s decisions. It’s not that money is capital; it’s that data is capital. If data is capital, you can see that expressed in how people value access to data.

There’s one obvious way to test this: look at the value people are placing on data they can buy. There’s the obvious existing global empire of Facebook, which monetises the data of a billion people. Similarly, you can look at the academic anti-plagiarism service Turnitin, which grows in value as it accumulates student submissions, building a corpus of data to compare against future student work.

Perhaps the best example, though, is LinkedIn, which was acquired by Microsoft for a terrifyingly large amount of money. In 2016, Microsoft forked out 26 billion dollars to acquire LinkedIn wholesale, explicitly citing the value of LinkedIn’s data as the primary reason for the acquisition.

When Marx wrote about capital, it was an era of industrial machinery, of gears and looms, and he could not have conceived of things like software. What we’re dealing with now is the idea of platform capitalism – where a platform gives you control of, and therefore the ability to extract value out of, data.

There’s a structure around you everywhere you go that wants to turn your actions into data, so it can turn that data into capital. The way they get that is via control of the platforms you interact with.

And this is where it gets weird.

See, algorithms want data, but they want data they can meaningfully infer value from. There’s an old joke – so old it’s in a Dilbert strip, and gosh, isn’t Scott Adams a dinghole these days – about how 40% of all sick days are taken on Monday and Friday, and how a foolish boss might treat that as a big deal, when Monday and Friday are two of the five working days: exactly 40% of the week. It’s not enough to collect data in general; it has to be data you can convince people you can monetise – browsing habits, spending habits, movement habits, the things people are doing to live their lives and when those lives intersect with spending money.
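The arithmetic of the joke is worth spelling out: if sick days were spread perfectly evenly across a five-day week, any two days would account for exactly 40% of them. A minimal sketch, with invented uniform numbers:

```python
# The Dilbert punchline as arithmetic: with sick days spread evenly
# across a five-day work week, any two days account for 40% of them,
# so "40% on Monday and Friday" is no anomaly at all.
# The counts below are invented, perfectly uniform, for illustration.

weekdays = ["Mon", "Tue", "Wed", "Thu", "Fri"]
sick_days = {day: 100 for day in weekdays}

total = sum(sick_days.values())
mon_fri = sick_days["Mon"] + sick_days["Fri"]
share = mon_fri / total

print(f"Monday + Friday share: {share:.0%}")  # 40%
```

The boss’s ‘insight’ is just the base rate wearing a suit.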

There’s always data to be gathered. There’s what people choose and how they prefer to be treated. There’s the things people respond to, and the things they say they respond to. Market research is filled with stories of people engaging in behaviour that flies in the face of what they report.

And people will pay for data! Not just businesses buying car tracking statistics, or Facebook selling your data as advertising vectors – people will buy data they don’t have any use for! Look at League of Legends and World of Warcraft, where people will pay for addons that give them data even if they can’t meaningfully act on it – a sort of data pornography. And in that space, the roaming eye of the platform turns to games.

Still, there’s a problem.

Data is, ostensibly, only useful as long as it’s natural. There’s a whole history in computing of systems put in place to curtail unnatural behaviours, which were then bested by even more elaborate unnatural behaviours – like how solving CAPTCHAs is a job.

Games are, essentially, a kind of algorithm: you give them inputs in order to get a specific result. Normally, you can treat a player as someone engaging with the game to achieve some end – usually what we call ‘the end’ of the game, or in other, more repetitive games, ‘the next level.’ But the nature of the game is still ultimately something that asks you to give it input, within a particular range, and gives you an output you want.

What makes games powerful is in part that games intuitively encourage you to keep playing, and teach you how to keep playing them. This is what gets called conveyance. It can be very powerful when you’re teaching someone something, and it’s part of how games work to convince us of things. The feedback loops of games mean that as we play, we get better at playing, and find the ways we like to play more rewarding.

This is where the problem kicks in: the game doesn’t tell you what it wants, but players deduce what it wants as part of learning how to play well. That deduction is part of the process of entanglement that pulls us in to keep playing. And once a player knows what a game is asking for, they start to give the game that data, instead of the ‘natural’ data about what they really want. This is why so many Steam users are born on January the first – a piece of junk data that the interface has nonetheless captured.
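You can see the January-the-first effect in miniature with a simulation. This is a sketch with entirely made-up numbers, assuming some fraction of users click through a date picker left on its default: real birthdays land roughly uniformly across the year, so one date holding a huge share of users is a signature of form-skipping, not demographics.

```python
from collections import Counter
from datetime import date
import random

random.seed(0)

def fake_user_birthday(skips_form: bool) -> tuple[int, int]:
    """Invented model: form-skippers keep the default January 1;
    everyone else gets a roughly uniform day of the year."""
    if skips_form:
        return (1, 1)  # the interface's default date
    day_of_year = random.randint(1, 365)
    d = date.fromordinal(date(2001, 1, 1).toordinal() + day_of_year - 1)
    return (d.month, d.day)

# Assume 30% of users just click through the date picker.
birthdays = [fake_user_birthday(random.random() < 0.3) for _ in range(10_000)]
counts = Counter(birthdays)

jan1_share = counts[(1, 1)] / len(birthdays)
print(f"Jan 1 share: {jan1_share:.1%}  (uniform would be ~{1/365:.1%})")
```

A genuine population would put about 0.3% of users on any given date; a spike two orders of magnitude above that is the junk data announcing itself.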

What this means is that we’re back to where we were: gamification is bullshit. It’s another form of advertising that may be able to achieve some sinister ends, but because all of its experiments are non-reproducible and ethically dubious, what we’re presented with is a big, clueless, headless automaton trying very hard to use information it doesn’t quite understand on populations it can’t predict.

The fear, ostensibly, is that Big Data can use your data to manipulate people, and that your games are going to be used to push people into doing things they don’t want to do – which isn’t really how media effects work. The reality is that Big Data can reinforce the worst impulses in people while it heedlessly stumbles around, trying to prove it’s definitely been worth all the money people have poured into a system that literally exists to convince you it’s worth lots of money.

And here we come, back to Mirror’s Edge.

Mirror’s Edge is a game where the urge to excel, to chase what the game defines as a victory condition, creates an incentive to stop playing with the systems in the game and instead focus on finding a way to drop as many of them as possible. Eventually the boundaries are found, eventually there’s a clip through a fence, and eventually, in the name of speed, the game stops being about playing and becomes about executing.

Mirror’s Edge is a game I love. It’s got ideas about people subverting systems, about how the world looks when your priorities are different, about how environmental catastrophe demands infrastructural change, about betrayal and family and conspiracy, about the willingness to do violence and all of this is filtered through a game that’s more jagged edges and broken pieces than a whole, cohesive product.

This article is derived, in part, from Rowan Tulloch & Craig Johnson’s talk at DIGRAA Australia, The Player as Data: The Hidden Algorithms of Dystopia, presented in February 2019. The initial conception and description of platform capitalism, and the challenge of designing games to capture natural data, are theirs.

As I listened to the talk, though, I kept wondering how many of these algorithmic games are broken by player behaviour. How many fake birthdays they get, how many lying answers to puzzles, and how much data speedrunners must generate compared to normal players.

At this point, I’m probably one of the players who’s played The Swindle the most, ever. If you took the average across that game’s players, many of whom probably played for a few hours and never finished it, it’s entirely possible that I personally have deformed the average success or failure rates.
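The arithmetic behind that suspicion is simple enough to sketch. All the numbers here are invented; ‘heists completed’ stands in for any per-player statistic in a game like The Swindle:

```python
# Sketch of how one obsessive player deforms a per-player average.
# Ten hypothetical casual players who dabbled and moved on, plus one
# player with hundreds of runs. All figures are made up.

casual_players = [3, 5, 2, 4, 6, 3, 5, 4, 2, 6]
one_obsessive = 400

mean_without = sum(casual_players) / len(casual_players)
mean_with = (sum(casual_players) + one_obsessive) / (len(casual_players) + 1)

print(f"mean without the outlier: {mean_without:.1f}")  # 4.0
print(f"mean with the outlier:    {mean_with:.1f}")     # 40.0
```

One player out of eleven shifts the mean by a factor of ten – which is why analytics teams tend to reach for medians or trimmed means, and why ‘natural’ aggregate data is less natural than it looks.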

Games are still fascinatingly complicated things to study, and part of what makes them complicated is the way they change while you’re studying them.