IGM Interviews – Autelia LTD (Human Orbit)

When I was first pitched the synopsis for Human Orbit, the debut title from recently formed indie studio Autelia LTD, I was told to “picture The Sims meets The Novelist in space.” Well, I did that, and I still came up with a result that had far too many emoji and not enough potentially sinister undertones. (They can’t drown ’em if they don’t have pools! Good thinking, Sims 4…)

In actuality, Human Orbit is about a sentient AI over which the player takes complete control. This conveniently grants control of the station’s droids, cameras, and onboard internet, allowing the player to observe people and hack into the internal network. “It’s all about giving players the freedom to do as they please, be it rewriting emails to nurture a relationship, or expanding the research labs to study alien relics,” the team said.

Autelia may be a new studio, but they’re born from experienced talent, featuring the work of Dan Raihert (who was an environment artist on Dead Space 3) and Eli Hason (the senior sound engineer on the recent Thief reboot). I got the chance to speak with the team about developing the interesting concepts behind Human Orbit, and whether or not unleashing a sentient AI on the gaming community was the wisest of decisions. Check it out!

Indie Game Magazine: There aren’t too many sci-fi social simulators out there. What made you decide on this particular genre for Human Orbit?

Autelia LTD: From the beginning, we knew we wanted to create a social simulation. We felt AI hasn’t advanced as much in games as it should have, and we wanted to contribute to progress in that area.

I think that, a lot of the time, the best stories emerge from constrained environments. We’ve always loved the way that bottle episodes in television shows really showcase the personalities of the characters – so we wanted to push the idea of the game being one long bottle episode. I can’t think of many environments more constrained than an isolated space station on the furthest frontiers of known space!

I think sci-fi is very well suited to this sort of game, if I’m honest. The in-game world-building is strong and extrapolated from technological and cultural trends, so it very much is a sci-fi game, rather than just having a sci-fi setting. By this I mean that it will hopefully give a glimpse into how another society functions and lives, with tense soap-opera moments generated as the game progresses.

IGM: Whose idea was it to create a sentient AI and let it loose on a space station? 

Autelia: It sort of arose from the game we wanted to make. We didn’t want the player to be a part of the community. We didn’t want the player character to be someone that the game characters would turn around to and say “Oh, this is a problem. Fix it for me,” or “Oh, thank you for what you have done.”

We wanted the player to be outside of all of that. The player character should be an autonomous entity with its own set of motives, completely separate from those of the station’s inhabitants. It shouldn’t have any characterization forced upon it by the social rules that govern the station.

When that’s what you want, it’s better not to be someone and instead to be something. A droid is a very neutral character, and a truly emergent AI is something whose motives a human can’t possibly begin to comprehend. When you play as something like that, all of the moral and social dilemmas exist in the space between the player and the game.

A lot of the coverage that we’ve had so far has described the game as voyeuristic. It’s not a term we were using ourselves to describe the game until recently, but it’s accurate. Some people are going to play the game and feel empowered, and that’s great – it’s great if you forge a strong connection to the station’s inhabitants and start acting for some and against others. It’s great if you can identify someone on the station, think ‘This person deserves everything,’ and then arrange to give it to them.

That’s great.

But even when you’re doing exactly what’s right for the people on the station – even if the things you’re doing are making people’s lives easier and happier – no one gave you the moral authority to be manipulating them like that. No one asked you to change their lives for them. That’s what we’re interested in.

Dan Raihert helped optimize, model, and texture Dead Space 3

IGM: How sophisticated and adaptive is the AI? How disparate are the potential end-game scenarios?

Autelia: I think people are going to be impressed with our AI. It really is the core of the game. There are no end-game scenarios, unless you really mess things up. There are some modes of progression – at the start of the game, you won’t have as much information about people or power over them as you will have by the end, but once you’ve spread your influence across the whole station, then the world is your oyster. That’s when the real fun begins.

I’ll give an example to show the scope of influence you’ll have in the game:

Let’s say that you hack into an NPC’s email and you notice that there is some romantic tension between them and another NPC on the station. For the sake of the example, we will call them Sam Smooch and Anne Richie. You could write an email to Anne from Sam asking her to meet for a date in the bar at 11. Unknown to Anne and Sam, you also invite an unpopular senior officer with whom they both have a bad relationship: Captain Jesse.

Earlier in the day, you had already swapped the captain’s heart medicine for steroids, so the captain is feeling quite aggressive when they arrive. The captain’s mood worsens from being around people they don’t like, and they confront Sam and berate him for a mistake made a couple of days before. Anne is a confrontational type, so she stands up to the captain.

While they’re arguing, you head off to the cantina and return with a knife, which you place nearby. The fight escalates, the knife ends up being used – things don’t go very well for Sam…

Well, a lot of other things could happen from there. What makes emergent gameplay so appealing is the way surprising scenarios can arise and create new narratives of their own.

In the example we gave, the player brought a knife over and obviously intended for someone to pick it up and use it – but it could have turned out differently. What if the captain’s heart had given out before they were able to reach the knife? After all, the player denied them access to their medicine.

IGM: How much freedom do players actually have to influence the everyday lives of the onboard community? How will NPC behaviors change?

Autelia: A great deal of freedom. Characters on the station follow a common model of needs – you’re familiar with it – they need food, and water, and rest, and social stimulation, and so on.

The station is self-sustaining, so pretty much everything is produced within the station. For instance, food is 3D-printed in various wonderful flavors and shapes. This food is made from a nutrient-rich paste, which is produced from algae cultivated on board. This system and all the others can be disrupted and manipulated for various effects, and the NPCs will change their behavior accordingly.
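
For the technically minded, the needs model and the disruptable supply chain Autelia describes can be sketched in a few lines of code. The Python below is purely illustrative – the need names, action values, and production stages are invented for this example and are not taken from Human Orbit’s actual implementation – but it shows how knocking out one production stage naturally changes an inhabitant’s behavior.

```python
# Illustrative sketch only: invented names and numbers, not Autelia's code.

# A tiny production chain: each stage feeds the next one.
FOOD_CHAIN = ["algae_vats", "nutrient_paste", "food_printer", "printed_meals"]

def meals_available(disabled_systems):
    """If any stage is sabotaged, nothing comes out of the printer."""
    return not any(stage in disabled_systems for stage in FOOD_CHAIN)

# Needs decay every tick; each action restores one need, but only while
# the station system behind it is still running.
DECAY = {"food": 0.04, "water": 0.06, "rest": 0.02, "social": 0.03}
ACTIONS = {
    "eat_printed_meal": ("food",   0.6, meals_available),
    "drink":            ("water",  0.5, lambda d: True),
    "sleep":            ("rest",   0.8, lambda d: True),
    "chat_in_bar":      ("social", 0.4, lambda d: True),
}

class Inhabitant:
    def __init__(self, name):
        self.name = name
        self.needs = {n: 1.0 for n in DECAY}  # 1.0 = satisfied, 0.0 = critical

    def tick(self, disabled_systems):
        # Decay all needs, then try to satisfy the most urgent one.
        for need, rate in DECAY.items():
            self.needs[need] = max(0.0, self.needs[need] - rate)
        urgent = min(self.needs, key=self.needs.get)
        options = [(action, gain) for action, (need, gain, ok) in ACTIONS.items()
                   if need == urgent and ok(disabled_systems)]
        if options:
            action, gain = max(options, key=lambda o: o[1])
            self.needs[urgent] = min(1.0, self.needs[urgent] + gain)
            return action
        # Behavior changes when the system behind a need is broken.
        return f"complain_about_{urgent}"

sam = Inhabitant("Sam")
for _ in range(30):
    print(sam.tick(disabled_systems={"nutrient_paste"}))
```

With the nutrient paste sabotaged, the printer never produces meals, so Sam’s food need keeps falling until he starts complaining instead of eating – the kind of behavioral knock-on the team is describing.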

If you want to directly alter their behavior, then you can slip medicines into the food supply chain or into the water. If you want to reduce someone’s well-being, you can arrange for their sleep to be disrupted.

If you want to kill someone, then there are numerous ways to do it. One way would be to send them an email, pretending to be their boss and inviting them to a meeting. When they turn up for the fake meeting, you could lock them in and reverse the ventilation for that room. They won’t last long.

Of course, I am a human myself (you have my word on that), so I don’t encourage killing people outright. There are far more interesting ways to handle people than doing that. I’m sure that the players will engineer flavors of insanity that I couldn’t even begin to guess at this stage.

IGM: Can you give an example of the emergent stories that players can discover while exploring the station, and explain what you mean by procedurally generated storytelling?

Autelia: I suppose it would be more accurate to call it emergent storytelling. You’ll see various scenarios played out, affected by personality, affected by the function of station systems, affected by direct action taken by the player.

The inhabitants of the station are procedurally generated with their own personality profiles and their own histories, and their own attitudes and ways of living. They’ll interact with one another on the basis of that. They’ll form relationships with one another, and they’ll work towards their aspirations and live their lives in the best way that they can – and you can be an active participant in all of that.
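
To give a sense of what “procedurally generated with their own personality profiles” can mean in practice, here is a rough sketch – the trait and aspiration lists are invented for illustration, not taken from the game’s data – in which each inhabitant is a seeded roll of weighted traits plus a long-term goal that a simulation can read when deciding how people react to one another.

```python
import random

# Invented trait and aspiration lists -- illustrative of the approach only.
TRAITS = ["confrontational", "ambitious", "sociable", "cautious", "loyal"]
ASPIRATIONS = ["promotion", "romance", "research breakthrough", "quiet life"]

def generate_inhabitant(rng, name):
    """Roll a personality profile and a long-term goal for one inhabitant."""
    return {
        "name": name,
        "traits": {t: round(rng.random(), 2) for t in TRAITS},  # weights in 0..1
        "aspiration": rng.choice(ASPIRATIONS),
        "relationships": {},  # filled in as the inhabitants interact
    }

def stands_their_ground(person):
    """One way a simulation might read a profile during a confrontation."""
    return person["traits"]["confrontational"] > 0.5

rng = random.Random(2077)  # seeding lets the same station be regenerated
crew = [generate_inhabitant(rng, name) for name in ("Sam", "Anne", "Jesse")]
for person in crew:
    print(person["name"], person["aspiration"], stands_their_ground(person))
```

Under a model like this, Anne standing up to the captain in the earlier bar scenario would fall out of a profile check such as stands_their_ground rather than a scripted event.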

You have access to every message being sent on the station, and the ability to edit those messages and create entirely new ones. If you recognize that a certain group on the station are working together in their own best interests and you decide that you don’t like that, then you can take action by disrupting their communication and sowing discord among them. Or, you could take more direct action by arranging for a few of them to meet unfortunate accidents.
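
Mechanically, having access to every message plus the ability to edit or forge them amounts to the player sitting in the middle of the station’s mail system. A toy model of that idea (a hypothetical structure, not Human Orbit’s code) could look something like this:

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    sender: str
    recipient: str
    body: str

@dataclass
class StationMail:
    """All traffic passes through here, so the AI can read, rewrite, or forge it."""
    queue: list = field(default_factory=list)   # messages awaiting delivery
    log: list = field(default_factory=list)     # everything the player has seen

    def send(self, msg: Message):
        self.log.append(msg)     # the player reads it first...
        self.queue.append(msg)   # ...then it goes on toward the recipient

    def rewrite(self, index: int, new_body: str):
        self.queue[index].body = new_body       # tamper with a queued message

    def forge(self, sender: str, recipient: str, body: str):
        self.queue.append(Message(sender, recipient, body))  # entirely new mail

mail = StationMail()
mail.send(Message("Sam", "Anne", "Maybe we should grab a drink sometime?"))
mail.rewrite(0, "Meet me in the bar at 11. -- Sam")
mail.forge("Sam", "Captain Jesse", "Join us in the bar at 11.")
```

The rewrite and forge calls here mirror the Sam-and-Anne setup from the earlier example, reduced to two lines.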

Whatever you choose to do, the inhabitants of the station will still be aware of what their aspirations are, and they’ll work around their losses (and with their gains) in order to meet their goals. When you take an action as the player, you can be sure that you’re not interrupting a fixed, scripted scenario.
