Picking up the Pieces, by Cole Chittim, is an experimental narrative game about mental health. It’s a sort of low-poly walking sim in which you experience a cross-section of the narrator’s life, literally picking up the pieces of it.

The game grapples with feelings born of external pressures, like unsupportive friends or news stories about people more successful than you. Or feeling like you’re stuck in a messy room, or trapped in the dungeon of your own mind.

I thought I understood my brain better than this.
But, now, I feel like I never did.

Picking up the Pieces carries a poignant message to consider for these last few days of 2019. Maybe it’s time to step back and rethink the negative feedback loops we’re stuck in, face our fears directly, and focus on what we can control in our lives.

This narrative game takes about ten minutes to play, but you’ll be thinking about its message for longer than that.

Picking up the Pieces is available for Windows and macOS on itch.io.

The Missing Quests Season 1 is Complete

The Missing Quests was a season of sharing small indie games, written by Alex Guichet.
Stay tuned for new writing projects, or a potential next season of TMQ.
A therapy session in progress in Eliza, a game that explores AI-driven digital therapy.

Eliza is a visual novel by Zachtronics, the developer behind Opus Magnum, Exapunks, and other Zachlike puzzle games.

The visual novel explores the potential impact, ethics, and effectiveness of an AI-driven digital therapy program, Eliza. You take control of Evelyn, a newly hired proxy for the Eliza program.

AI doesn’t make the world go round

Eliza sessions are delivered by Proxies: contract workers who serve as a therapeutic human intermediary in the two-way conversation between the AI and the client. An Eliza proxy wears AR glasses that provide a real-time stream of patient vitals, sentiment analysis of the patient’s words, and scripted guidance on what to say to the patient.

Sessions are held by human proxies because Skanda, the company behind Eliza, believes that interacting directly with a computer feels impersonal and unnatural. In theory, a real-time conversation feels more natural and less contrived; it challenges a patient to open up more, so Eliza can listen.

However, patients are acutely aware that they’re still interacting with an AI. Sometimes they’ll even break the “fourth wall” and say they wish they could have an actual human exchange with the person delivering the Eliza-based therapy.

Evelyn talking with Erlend, in the Eliza server rooms.

Can a chatbot provide effective mental healthcare?

The game exists in a world affected by a mental health crisis, not unlike today. (I’m personally affected by this, too; as someone going through therapy, this game feels close to home.) Many people in-game are seeking treatment, and Eliza seems to be the option for most, not because it’s good, but because it’s cheap.

You see, therapy is a process of listening, understanding, and challenging a patient to change and improve. At this, Eliza’s treatment lands somewhere between ineffective and adept, depending on the particular patient. For example, Eliza will happily listen to a long-winded essay from a patient, yet its follow-up response is banal and unempathetic: “it seems that something is really troubling you.” It fails to dig deeper into the pain or add anything contextual to the conversation; it’s just an AI’s lexical notation of what the patient is exhibiting.

The biggest miss for Eliza is in the “solution” department, which is a glorified ad for more Skanda services and partnerships. After a patient has poured their heart out during their appointment, Eliza gives cold, scripted recommendations for an app and a pill. This highlights a big issue with a service like this: if therapy is cheap and delivered as a service, is the company really invested in addressing your root issues? Or will they just do the absolute minimum to keep you happy and keep you as a customer?

At a cafe, discussing Eliza's role in the future.

What if you get to change things?

But what’s your role in all this, as a proxy? Well, the chapters take their twists and turns. It turns out that not only was Evelyn a former Skanda employee, she was also a principal engineer on the Eliza project.

And that’s what makes this game compelling. It sets up an AI-driven, near-future dystopian world and then sets you up to explore what that means. How does your role change if you’re the one who made Eliza? What would you do differently? What would you do to change the future?

Each chapter introduces some sort of plot shift that changes your sense of Eliza’s overall effectiveness, your alignment and empathy with Evelyn, and your sentiments toward the other characters involved, like the Skanda administration and other ex-employees.

And here’s where the game really hits close to home for me. If you’re on the inside of a large company and can see something going wrong, and you want to try to change that momentum, what really happens if you speak up? Sure, you know internally that speaking up is the right thing to do, but more likely you’ll just be seen as a troublemaker and ultimately pushed out. Your in-game relationships with Nora (an ex-Skanda engineer turned artist) and Erlend (the new engineering lead of the Eliza project) make you grapple with that feeling from differing perspectives.

The obligatory Zachtronics solitaire game in Eliza.

An effective new entry from Zachtronics

Yeah, it’s dramatically different from anything you’d expect from a Zachtronics game, but that honestly makes it worth playing. They approach the subjects of engineering, AI influence, and ethics from an informed perspective. There’s no handwaving or namedropping of just the right keywords to make it “seem” competent. It’s apparent that actual programmers and engineers wrote and influenced this game, and that they know their audience will be technically minded.

It’s different from other visual novels I’ve played, too. It’s fully voice acted. It’s got a Zachtronics-style UI, suited to the story they’re telling. With this, it manages to feel like a visual novel of their own, not like they just decided to do something new and built a game on top of Ren’Py or Twine.

And, overall, Eliza is effective at exploring the philosophical and ethical issues of our world today. It doesn’t try to give you the answers or arrive at any novel conclusion; it wants to make you think. I bet you’ll come out of the game feeling totally different than I did, and the game gives you the freedom to experience that.

At the very least, you still get to play another solitaire game from Zachtronics.

The flowery bird boy you befriend and help heal, in Compassion.

Compassion is a narrative experience game by Ivan Papiol, about getting help when you're in pain.

You encounter an injured bird in need of help. When you interact with the bird, the thoughts that appear on-screen reflect gloomier feelings about how you deal with pain. Poking the bird with a stick explores the ways you can pull away when someone tries to be there and care for you. Offering the bird some flowers dives into the validation and, well, the compassion you can feel when you open up and let others help. If you think about it, that bird is almost like yourself.

This game carries a bit of extra meaning for me. If you check the archive of this site, things slowed down in July. I had a hard time dealing with some things going on in my life, and figuring out how to navigate the road ahead hasn't been easy.

You can't erase pain, but it's easier when you're taking my hand.

Caring for my flowery bird boy was somehow the highlight of my day. It's games and experiences like this that reenergize me. I'm gonna come back to this game when it feels like the sticks and rocks are clouding my judgment.