AI promises it can know one’s mental state, but that comes with a lot of data tracking
May 11, 2023

Emotion recognition technology requires collecting biomarkers to analyze mental health. Law professor Tiffany Li says we need special privacy protections for this “specially sensitive data.”

Sure, technology that supposedly reads human emotion has been on the scene for a while, along with concerns about its use. But now it looks like Apple may be getting in on the game. The tech titan is reportedly developing artificial intelligence-powered mood tracking for Apple Watches.

Marketplace’s Meghan McCarty Carino spoke with Dr. Daniel Kraft, a physician-scientist and founder of Digital.Health. He says wearable emotion recognition devices could achieve something that’s been difficult to provide in mental health care: real-time response.

The following is an edited transcript of their conversation.

Daniel Kraft: The standard method for clinically measuring something like depression or stress and anxiety is to ask someone to fill out a survey, which takes them out of their current emotional state and isn’t the most engaging or reliable way to get a true picture of someone’s internal state or experience over time. I think the potential now is to collect those things sort of seamlessly and start to layer them up from our wearables, our voice, our environment, our movement, and we start to get a better picture and a predictability — I like to call it “predictalytics” — of where someone might be with their overall health. And if we can start to optimize them, we can make a big dent on both public health and individual health going forward.

Meghan McCarty Carino: Yeah, I’ve seen the idea that this could include something like a virtual coach. I mean, is that something consumers want?

Kraft: I think health care, in particular in this new world of digital health, needs to be sort of hyperpersonalized. We all might respond differently to coaches, and then with this new world of generative AI, we’ll be able to create, in the near future, your personalized health coach that matches your emotional state or psychological state or medical needs, or what gets you motivated to do your workout or eat healthy foods. So I see it as a pretty exciting era in the next decade where we can connect all these dots to potentially really impact mental health, emotional health and big-picture health care. The challenge is most clinicians aren’t aware of what’s out there.

McCarty Carino: Can you give me some examples of how this might work? Like, what could a virtual coach, enabled with emotion recognition technology in a wearable, do?

Kraft: Well, imagine you wake up in the morning and you look in the mirror in your kitchen or your bathroom, and it’s been tracking you — whether you like it or not, our digital exhaust is collectible in many forms — and it might, in the right form and tone that matches your age, your culture, your personality, educational level, your incentive — some people want badges, some people want points, some people want dollars — sort of get into your head space and remind you to eat that healthy breakfast and that your workout is coming up and that “Oh, your stress level looks a little higher based on your voice biomarkers from the day before,” and give you some coaching and maybe interactions and nudges through the day or the week that can start to get things back on track for you. It might also show you, not just you of today in the mirror, it might show you future you if you continue to smoke and communicate that to you in a way that really matches your, again, personalized psychology because we’re all built a bit differently. A lot of these tools today have the same user interface. I think we’re moving into an era of precision digital health, where our avatars, our coaches, will start to really match and meet us where we are.

McCarty Carino: As this kind of technology becomes deployed more often and more companies are sort of jumping into it, where do you see this going?

Kraft: Well, I almost imagine a world [where] we’re entering this age of digital therapeutics — there are thousands of health care applications out there. So I think the future will be this precision, digital health, digital mental health, where your clinician, or your ChatGPT-enabled clinician, will start to go, “Hey Daniel, I see you’ve been having some challenges lately. Here’s a platform that might help.” The challenge is getting them paid for, regulated, proven and into the workflow of clinicians and individuals so that they can use them and they don’t end up with 50 different apps on their phone and 50 different wearables. I think we’re going to see a lot of consolidation where “One Ring rules them all” — and, of course, there are rings that can track health and sleep and activity — to bring things together.

Emotion recognition technology, as we heard Daniel Kraft talk about, requires tracking and recording a lot of biometric data from users. So what are the privacy implications?

McCarty Carino asked Tiffany Li, an assistant professor of law at the University of New Hampshire.

Tiffany Li: The first problem with collecting emotion data is that this is very sensitive data. And the fact that companies can collect all this data on how we’re feeling at any given point in the day is really scary. So that’s why we need a lot of special protections for this specially sensitive data.

McCarty Carino: And do we have any of those special protections in place?

Li: We have some privacy laws out there and some privacy regulations. California, for example, has one of the strongest privacy laws right now in the U.S. And Illinois has one of the strongest biometric privacy laws, so that’s privacy for data coming from or about the body. You’ve got things like a right to ask for your data to be deleted or a right to even request knowledge about what data has been collected about you. But we don’t have a federal privacy law.

McCarty Carino: What are some ways that the law could provide guardrails here? Are there examples, say, in Europe for this?

Li: So the European Union does have stronger laws on privacy and related technologies. The General Data Protection Regulation, the GDPR, is probably one of the strongest privacy laws in the world right now. And, in fact, California law and a few other state privacy laws were based, at least in part, on the GDPR. And additionally, Europe has an upcoming AI act, a law just regulating artificial intelligence.


The team

Daisy Palacios Senior Producer
Daniel Shin Producer
Jesús Alvarado Associate Producer