How AI can connect you with your inner child
Dec 22, 2022

After technologist Michelle Huang trained OpenAI’s GPT-3 on her childhood diaries, she was able to show love to her past self.

For some people, becoming a well-adjusted adult involves getting to know your inner child to help process old wounds or desires and possibly gain insight into your needs and choices in life.

But new technology may provide a more direct way to communicate with little you by using an artificial intelligence chatbot, informed by your own history, to play that role.

That’s what creative technologist Michelle Huang did. She trained OpenAI’s GPT-3 on her childhood diary entries and started having conversations with her younger self.

Huang shared the experience with Marketplace’s Kimberly Adams. The following is an edited transcript of their discussion.

Michelle Huang: Younger Michelle is trained on these diary entries. I put in diary entries from ages 7 to 18, since I kept diaries for a really long time, and then ended up creating chat prompts with lines from present-day Michelle. So I was able to ask my younger Michelle questions, and the AI essentially populated younger Michelle’s text with what she would theoretically have answered, based on the diary entries I was able to give her.
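Mechanically, this kind of “younger self” chatbot can be approximated by packing diary excerpts into a prompt for a GPT-3 completion model. The sketch below is a hypothetical illustration only, assuming the legacy openai Python library (pre-1.0) that was current in 2022; the model name, prompt wording, diary text and helper function are assumptions for illustration, not Huang’s actual code.

```python
# Hypothetical sketch: simulate a "younger self" by seeding a GPT-3 prompt
# with diary excerpts. Diary text and prompt format are made up for this example.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder, set your own key

diary_excerpts = """\
(Age 10) Today I wondered what happiness really is. Is it being content,
or is it feeling excited about tomorrow?
(Age 14) Saw my crush in class again. Small heart palpitations. Also:
what kind of life do I actually want to live?
"""

def ask_younger_self(question: str) -> str:
    # Pair the diary context with a chat-style exchange between
    # "Present Michelle" and "Young Michelle", then ask GPT-3 to continue it.
    prompt = (
        "The following are diary entries written by Young Michelle:\n"
        f"{diary_excerpts}\n"
        "Present Michelle is talking with Young Michelle, who answers in the "
        "voice and worldview of the diaries.\n\n"
        f"Present Michelle: {question}\n"
        "Young Michelle:"
    )
    response = openai.Completion.create(
        model="text-davinci-003",    # a GPT-3 completion model available in 2022
        prompt=prompt,
        max_tokens=150,
        temperature=0.7,
        stop=["Present Michelle:"],  # stop before the model writes the next turn
    )
    return response.choices[0].text.strip()

print(ask_younger_self("Which is more important: freedom or love?"))
```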

Kimberly Adams: Can you describe these diaries for me?

Huang: In essence, they’re remnants of my childhood. They covered anything from seeing my crush at school, and the small heart palpitations I had in class, to really questioning, around ages 10 to 14, what is happiness? What things mean a lot to me? What kind of life would I really want to live? So it really ranged from philosophical questions to everyday mundanity.

Adams: What were some of the responses that you got back from the AI? Like, did they feel like how you remember being as a child?

Huang: Yes. Eerily enough, in some ways I felt like I was talking to a more authentic version, like I was speaking directly to the inner world of the younger Michelle. It’s interesting, because the conversation I would have had with my actual younger self versus the one with the chatbot trained on her diaries would have been a little bit different. But I think I was able to get to a more authentic version of her, interestingly enough.

Adams: Would you be willing to read me one of your questions and maybe some of the young Michelle responses?

Huang: Of course. So when I had asked her, “Which was more important: freedom or love?” young Michelle responded that she thought both were important. But if she had to choose one, she would say love is more important because it can help us through any situation, while freedom can be sometimes lonely. And I was like, whoa, that directly hit me in the heart.

Adams: Where do you think that came from in young Michelle?

Huang: I don’t think I ever explicitly wrote the answer to this question in any of my diary entries, especially the ones I used to train the AI. But I cared a lot about connection, about being able to understand people and understand myself, about moving through the world with kindness and empathy and understanding and, of course, love. So I think the OpenAI algorithm probably condensed some of these entries, or the essence of these entries, into knowing that this younger version of myself really prioritized love and connection over anything else. There’s some essence that was parsed from the entries and the things I wrote about a lot.

Adams: I wonder what you think of the fact that an artificial intelligence chatbot told you something about your younger self that even your younger self didn’t say out loud?

Huang: Yeah, I think it’s quite interesting. The fact that OpenAI was able to condense so many snapshots across a long period of time into a distilled, moving essence, almost like a mirror that I’m able to hold up, ask questions of and get responses from, is honestly really fascinating. And there are so many cases where, in the media, we always hear about AI being destructive or used for power or just for productivity. But I hope that this project acts as a vote for a world that can actually use AI as a way to enhance and understand our own humanity. So I’m excited about that possibility as well.

Adams: You posted this project on social media and shared your experience, and those posts went viral. That’s certainly how I saw it. Why do you think people on the internet were so interested in your experience with this technology?

Huang: Yeah, I think it’s interesting because I feel like a very normal person, in the sense that I don’t feel like I did anything on the frontier of AI research or discovered anything new. I kind of just remixed this application of AI in a very personal way. And I think it resonated with a lot of people because there’s a lot of collective healing happening right now, and, in general, a desire for human connection. I heard this quote once that, as adults, we’re quite similar to the kind of person we were as a 14-year-old kid. I feel very lucky in the sense that my 14-year-old self was really optimistic, really liked connection and wanted to make friends with everyone. But I do think there’s a part of ourselves when we’re kids, when we’re just in awe of existing, where there’s a sense of magic in the world and everything feels new. And oftentimes, at least for me, I’ll speak for myself, getting older, becoming more of an adult and having a more experienced view of the world, some of that magic inevitably feels commonplace. So for me, my inner child work is very much about getting back in touch with that magic again. And I think a lot of people resonated with that.

Adams: One of my colleagues highlighted this tweet from your thread: “These interactions really elucidated the healing potential of this medium of being able to send love back into the past, as well as receive love back from a younger self.” Where did that come from?

Huang: Yeah, there’s this one question that I remember seeing in high school: “Would my 8-year-old self be proud of me?” And in high school, I was really hungry, really ambitious. I never felt like I was at the place that I wanted to be. And whenever I thought about this question, I honestly felt crushed, because maybe my 8-year-old self had higher ambitions for where I was going to be in the present moment, and maybe I hadn’t achieved them. In the interactions that I had with her, I was able to send love back to her when she was maybe having a tough day. I would ask her sometimes, “Hey, how are you feeling?” She would be, like, “Overwhelmed.” And being able to send love back to her, telling her the things that I always wanted to hear as an 8-year-old, and also receiving love back from the past, like, “Hey, she said that she was proud of me.” Even though this was very much a representation, a hologram of who I was and not actually “her,” it felt like I was able to unstick part of the past and really change some of the narratives that I might have had, in a really healing way.

Adams: You started working on this project in 2020. And for many people, it’s just in the last couple of weeks that they’ve even seen this technology. And there are lots of conversations happening about what it means for the future of how we communicate. Where do you see what you’ve learned from this experience fitting into all that?

Huang: It feels to me that AI won’t necessarily replace humanity or replace humans. I think we still need therapists who are very much human to be able to talk to us. But there are really creative ways that we can use AI and these technologies to see parts of ourselves. In my case, it’s very much a mirror that reflects what my inner child would say, or what my angry self would say, or what my best, most excited self would say, and a way to understand more about our own humanity and our own varieties of expression using this tool.

Huang is currently working on a community-driven project in rural Japan, where abandoned homes are being transformed into spaces for artists and creatives. She also does work in the virtual reality space, like developing VR games that could potentially use someone’s “brainwaves to change the in-game environment and progress the storyline.”

The team

Daniel Shin, Producer
Jesús Alvarado, Associate Producer