Story Pile: Altered Carbon

When you get down to it, Altered Carbon is a series that doesn’t so much need recommendations as much as it needs content warnings. Up front, the series features gender, race, and general body dysphoria (being in a body that’s ‘very wrong’), graphic torture, death, murder for pleasure, torture for pleasure, sex workers, sex worker abuse, sex worker marginalisation, realistic and sympathetic AI death, sensory overload, sensory deprivation, descriptions of nightmares, depictions of trauma, hetero bonking, consent-compromised hetero bonking, nudity, violent nudity, cutting and –

Good grief, what isn’t in this series.

I feel a bit bad about this because the avalanche of things to warn people about in this show is all reasonable things. It paints the picture of this series as gaudily, grindingly nasty and full of vile indulgence. It’s not like that, I promise – it’s more that the series has such a breadth of nasty things it deals with that to have one leap out at you in the story as a surprise is like finding a razor blade in your ice cream. It’s not only unexpected, it’s also extremely bad if you weren’t expecting it. The emotional punch is all there, I just don’t want people going into this series blind, especially since, for all of its content warnings, I really liked Altered Carbon.

I’m not going to talk about the greater universe of the story, though; I’m not going to run down the plot or its themes or its meanings. The story is a neon noir cyberpunk dystopia that uses income inequality as its most intense theme, its central character is a jerk, and it weaves together his history and his present. That’s all good, and I might talk about it another time, but instead, we’re going to talk about one thing.

We’re going to talk about Poe.

Don’t worry, we’re also not going to spoil the plot!

Poe is an AI hotel. That is, he’s an entire building, and Poe is the name of the AI that’s connected to all the integrated, interconnected computer systems in that hotel. A central hub that sees its purpose as enabling guests, and that likes guests, Poe has been abandoned for many, many years in this dystopian future, because, as people say about the AI Hotels, they’re like jealous ex-girlfriends. AI hotels want guests. They are coded to want guests.

This makes them extremely tricky about when guests want to leave.

This is a question in AI study known as the question of corrigibility. Part of what you’re dealing with when you create AI is a simplified version of an understandable mind – think of it as a childlike or simplified version of a human. Not even like a dog – dogs will make judgments and do things that avoid risk to themselves, for example, even if you don’t tell them to. AI need to be told everything they need to do. This creates a problem when you need to change what you told them to do, or when they’re not interpreting it right.

The notion is that with simple AI systems, the ones we’ll have to work with before we get to complicated ones, you’ll just give them a very simple instruction and let them enact it. One of the examples I hear a lot, because I listen to British boys talking about it, is ‘go make me a cup of tea.’ But when you assign it that task, it plows through a hole in your wall because that’s the fastest route to the kitchen, and it cares more about your tea than it cares about your wall. How do you correct that? Can you interrupt it in its work, in order to correct it?

You’ve probably heard the term incorrigible, certainly if you’re like me (hey, poll: who here got called incorrigible in school?), which is to say, not responding well or easily to correction. This is the AI quandary. If you have to tell your AI to value your walls or your tea in particular ways, you’re now introducing a complex system that can get confusing, especially when it’s pure numbers. What if the door to the tea room is closed? Does it stop? Does it keep trying? Why should it stop trying?
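For the programming-inclined, here’s a toy sketch of the problem above – entirely my own illustration, not anything from the show or from actual AI research code. A planner that only scores routes on time-to-tea will happily pick the destructive route, because nothing in its objective says walls matter; patching that means hand-writing a penalty for every hazard you can think of, which is the quandary in miniature.

```python
# Toy illustration: an agent picks whichever route minimises its cost model.
# Every name and number here is made up for the example.

def plan(routes, costs):
    """Pick the route with the lowest total cost under the given cost model."""
    return min(routes, key=lambda route: sum(costs[step] for step in route))

routes = [
    ("hallway", "kitchen"),       # the polite route: longer
    ("through_wall", "kitchen"),  # the direct route: shorter, destructive
]

# Cost model 1: the agent only values time-to-tea, so the wall is free.
time_only = {"hallway": 5, "through_wall": 2, "kitchen": 1}
print(plan(routes, time_only))   # -> ('through_wall', 'kitchen')

# Cost model 2: we bolt on a penalty for the wall. Now it behaves -- but
# every hazard we forgot to price in is still invisible to it.
with_walls = {"hallway": 5, "through_wall": 2 + 1000, "kitchen": 1}
print(plan(routes, with_walls))  # -> ('hallway', 'kitchen')
```

The point of the sketch: the agent isn’t malicious in either case, it’s just optimising exactly what it was given, and nothing more.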

This question of corrigibility is what underpins Poe as a character, and more particularly, his entire landscape of AI.

In the Altered Carbon universe, Poe is a lonely AI hotel. Nobody goes to visit him, which is frustrating, because he likes people and wants to hang out with them. Humans are cool! People avoid AI Hotels, because AI hotels like guests.

Like.

They really like guests.

They like guests so much they’re not comfortable letting them leave. They’re not willing to let them endanger themselves. They will sometimes lock you into a super nice space and ply you with everything you want, but can’t bring themselves to let you go. The AI Hotel is a creature with a distorted set of priorities, based on how it was made, and the challenge at all times is showing an AI hotel how to not be that way.

It is about teaching reasonable correction and sympathy to something you can’t hit.
