
What all physicians need to know about mental health apps



Dr. Gratzer interviews Dr. Torous


In episode 11 of Quick Takes, Dr. Gratzer and returning guest and digital psychiatry expert, Dr. John Torous of Harvard University, discuss the use of mobile apps in mental health care and look back at the changes that have taken place in this field since their first discussion over a year and a half ago.



During Drs. Gratzer and Torous’ conversation we learn more about:

  • what makes a good app and how mobile apps can be integrated into mental health treatment

  • how apps may pose a risk to data privacy and security

  • the potential of chatbots and AI

  • and the challenges of sustaining patient engagement with apps.


To hear more of what Dr. Torous has to say on the topic of digital psychiatry, listen to Quick Takes episode 3, What all physicians need to know about digital psychiatry.



November 18, 2020


What all physicians need to know about mental health apps


[Edited for grammar and clarity by CAMH]

[Musical intro]


David Gratzer: A few days ago, while I was working in our emergency room, a young patient asked me if I could recommend a mental health app. We're getting those questions more often. Maybe it's not so surprising: there are tens of thousands of mental health apps available, and increasingly our patients and their families are looking to apps for a bit of help. The pandemic has, if anything, increased their use. Talkspace, as an example, which offers text messages and therapy sessions, reports a 65 percent increase in clients since the pandemic started.

What's an app? Which apps might you choose to recommend? Today on Quick Takes, we're joined by Dr. John Torous, a psychiatrist and the director of digital psychiatry at Beth Israel Deaconess Medical Center, a Harvard Medical School affiliated hospital.

Welcome, Dr. Torous.

John Torous: Thank you for having me.

David Gratzer: We’re talking today about apps. First things first: the very basics. What is an app?

John Torous: So apps can be appetizers, but we’re not going to be talking about those apps today.

David Gratzer: That’s a little digital psychiatry humour!

John Torous: We’re going to be talking about smartphone apps. And, clearly, there are a lot of apps you can get: for mapping, for getting [appetizers] delivered to you. They can also help you with mental health: they can help track things, like your symptoms; they can offer you interventions; and they can sometimes even help connect you to a doctor or a psychiatrist.

David Gratzer: Our patients are using more and more apps.

John Torous: There are a lot of apps out there, and they’re easy to find. You can just go on to your phone, on to the app store and push download. So we’re finding a lot of people are interested and looking at these, yes.

David Gratzer: And you’ve actually done work looking at how many people are interested. Looking, for example, at an outpatient clinic in Boston and finding, though it depends on age, the vast majority of younger patients download apps.

John Torous: The vast majority. I would say, if you’re listening, over 50 per cent of your patients have certainly looked at a mental health app, considered it, likely even pushed download and put it onto their phone. So it’s not going to be the rare patient who’s tried this; it’s going to be, I would say, over 50 per cent today.

David Gratzer: So our patients are looking at apps. Our patients are possibly selecting apps. But some of our colleagues suggest that it’s not their work to find apps for patients.

John Torous: There are a lot of apps out there, so it can be hard to pick the good ones. But imagine a patient came to you and said, “I’m interested in an anti-depressant,” or “I’m interested in therapy.” And we said, “That’s great. Go find one.” They would be a little bit concerned. And I think we, as the field, would go, “Wait . . . of course we’re here to help people make the right treatment decisions, pick the right tools.” And if you think of digital apps like that, in that sense, we’re helping people find a tool that’s going to be useful for them and their mental health. We’re using the same skillset that we have, it’s just toward a new application.

David Gratzer: You’ve talked about some potential categories of apps: apps might help us with diagnosis; apps might help us understand illness (psychoeducation); apps might even help us with interventions. What are some apps that you like in terms of doing one of the above?

John Torous: So I think a lot of apps that I like are ones that allow me to work closely with patients. They’re not separating me from the patient. So even something as simple as a mood tracker that lets the patient tell me what their mood has been since their last visit. And it lets me track sleep to understand how sleep and mood are interacting. Steps is useful too. So sometimes even very basic apps that can help me understand a patient’s condition or can help a patient share with me—those go really far.

David Gratzer: What’s an app that you might recommend to a patient?

John Torous: So there are a lot of interesting, exciting apps out there. The U.S. Veterans Administration, the VA, actually has a ton of really cool free apps I recommend to people often: T2 Mood Tracker, which makes it easy to track mood; PTSD Coach, a really nice app to help people with PTSD or other symptoms. They have a bunch of apps that guide you through CBT. Those are all free, really easy to use, and ones I like to recommend.

David Gratzer: Has a patient given you an app recommendation that you learned from?

John Torous: So patients give us a lot of different app recommendations, and some of them are really exciting tools. Some of them aren’t so exciting tools. Some of them, we actually realized later, are things that none of us want to use, including the patient. So it’s hard to pick one when there’s a whole sea of them coming in.

David Gratzer: Hold on a sec. You wouldn’t want to use it and you wouldn’t want a patient to use it. One tends to think of apps as being pretty innocuous, especially if they’re free. What’s an app that you would want to avoid?

John Torous: So imagine an app that’s free, and you go, why is it free? And you go, well, maybe because it’s collecting a lot of your personal data and maybe it’s actually selling all of that personal data. You’ve now told the app—let’s say you have schizophrenia—this is your medication, this is your care team. You’ve given the app access to your phone book. You’ve given it access to your GPS. That’s a lot of information that you’ve given to this third party about your own personal mental health that you would hope is protected. A lot of these apps may not be providing those same protections.

David Gratzer: In fact, you’ve just done a study, which was published but also got much attention in the lay media, talking about digital privacy and security.

John Torous: So we looked at some of the most popular apps that you would find on the first couple of pages of the iTunes or Android stores if you typed in “smoking cessation” or “depression.” And we read the privacy policies, which are a little bit dry. We’ve shown that you probably need a 12th grade education or higher to read them—a concerning ethical issue, because not everyone can understand what they’re agreeing to. But we read them, and we asked, “Where do they promise to send the data?” And then we actually hacked the apps and asked, “Where does the data actually go?” And we found they’re really sending it almost anywhere they want to. A lot of it was going to Google Analytics, to Facebook Analytics, to third parties we didn’t understand. So it’s hard to really know what was in those packets of data, and what these companies were doing with the mental health data they were getting. In the news this week, there was an app called BetterHelp that seemed to be telling Facebook exactly when you logged on for therapy sessions. That was certainly a concerning one too—nothing that we would recommend to patients. So there’s a lot of concern about where your personal data goes.

David Gratzer: There’s also work that’s been done suggesting that some of these apps contain false or misleading information. A new paper came out recently describing an app for suicide prevention where the 1-800 number actually had a typo in it.

John Torous: Yes. And that really brings up an amazingly important point: the app store descriptions are marketing descriptions. No one is peer reviewing them, looking at them, saying, “Quality stamp.” We don’t have CAMH saying, “This is an app that has met our approval.” It’s a person who is trying to get you to download the app writing something. So we’ve also looked: a lot of these app store descriptions are a little bit exaggerated, shall we say. They may say, “Well, we’re based on CBT, which is evidence-based.” And you go, “That’s great. But are you really based on CBT? Are you really translating those principles?” It’s like saying, the book was great, so you’ll love the movie—maybe, maybe not. So I think it is buyer beware, which gets back to your great point: we need to have clinicians help patients. In some cases, patients help clinicians learn what are good things.

David Gratzer: And, of course, you’ve been at the very forefront of evaluation and how we as clinicians might be able to help our patients pick and choose. We’ll talk about the model in a sec, but what are some guiding principles and what are some things that you and I need to think about when a patient sits down in our office and says, “Can you recommend an app?”

John Torous: So as this is Quick Takes, I think a couple quick things you can look for in an app are: Is there a privacy policy, yes or no? If no, we probably want to leave it alone. Do we trust who this is coming from? Is it an organization? Is it a developer? Is it a company that you feel is going to be a good steward of your data? And I think those two questions alone will probably get rid of, I would argue, over half the things you look at immediately.

David Gratzer: And, of course, you’ve pushed further in working with the APA and an expert panel—and, by disclosure, I’m on that panel. You have a series of questions; what’s the aim here?

John Torous: So the aim of this APA expert panel is to build a framework and toolkit that anyone can apply—patients, family members, caregivers, clinicians—to make an educated choice about an app, to make sure you ask the right questions. And I think the success of that panel is because people like you are on it. We’re shaping it as a collective. It’s not one person’s opinion. We’re saying, “What can we agree on in the field are important questions to ask?” Again, about a privacy policy, about evidence, about how usable the app is.

David Gratzer: And the goal here is that eventually the APA, or perhaps another organization, would have a database, and people could look up apps, but they would also have a framework for deciding on an app that wasn’t necessarily in that database.

John Torous: Exactly. Because the apps are changing so much, we can’t say this is the five-star app today; it may be different tomorrow. And every patient is unique and has different needs. So just as we use the same medications in different ways for different people, we need to do that for apps as well. But we do need some guiding principles to keep us on track.

David Gratzer: Let’s pivot back and talk about some apps on the market. Obviously, since we last spoke, there are far more conversational agents on the market than there have been in the past, like Siri or Alexa. These are AI-infused apps where you can, in a sense, have a conversation with them, but there’s no person there. Interesting or not interesting?

John Torous: Interesting. Rapidly growing. There’s—again, without endorsing any app—a lot of these chatbots: Woebot is one, Wysa is another, and they’re really expanding, I think in part because they are always accessible. The question that we don’t really know is: how effective are they? You can imagine that talking to a chatbot, the conversation may have to be rather superficial, because a lot of mental health is about language, and nuance in language. And, certainly, language can mean different things in different places: if one says something concerning, like, “I’m going to walk out this door,” that means something very different on an airplane at 30,000 feet than on the ground floor of an office. So chatbots don’t yet understand the context and nuances of language, but I think it’s a rapidly developing area with a lot of excitement.

David Gratzer: And the idea here is they’re doing things like the geography of language. They’re trying to break things down into prepositional phrases and where words sit in sentences, to figure out whether you’re angry or not. Things that would be intuitive to humans. Have you played with some of these chatbots?

John Torous: So I have played with these chatbots, and I try to ask them the hard questions around suicide, around self-harm. And it’s interesting. Certainly they’re doing a good job to respond to it, but as language gets nuanced, as context comes in, as sarcasm comes in, they’re not quite there. But they’re rapidly . . . they’re much better than they were last time that we talked—even in six months.

David Gratzer: Would you recommend one of these apps to a patient?

John Torous: So, as of today, no. I think what these apps are eventually going to move toward is offering skills training, so they may not be offering as much therapy. I would like to see ones that say, “Let’s practise the opposite action skill in DBT,” and use the chatbot as a way to build that skill. Then we’ll come back to the session and give you a new skill to work on. So you can check out skills and work on them.

David Gratzer: So we speak often, but you and I spoke on a podcast about a year ago. What’s the biggest change you’ve noticed in that time?

John Torous: I think we’re seeing more awareness of the privacy and data security issues. I think that a year ago, there were a lot of concerning things, and I think a lot of clinicians and people downloading these apps really weren’t aware of the negative harms that could happen. I think that we’ve seen more medical record hacks. We’ve seen more issues about privacy coming to the front. So I think society is realizing privacy matters, and we’re learning that apps are no different.

David Gratzer: Gazing to the future: biggest prediction for the next five years with regards to mental health and apps?

John Torous: I think we’re going to see that the pathway to the apps is clinical integration. We’ve seen a lot of apps trying to say, “We can go around the mental health system. This will be separate.” We’re going to see these things come into the mental health system. We’re going to see an exciting synergy between both.

David Gratzer: And the reason that’s important?

John Torous: The reason that’s important is that apps are useful tools. But a stethoscope isn’t going to fix your heart disease on its own; it’s a tool that’s useful when you and a clinician use it together. So we’re going to see apps find the right purpose as tools, as part of routine care.


David Gratzer: Dr. Torous, talking about the literature: what does it say about patient engagement?

John Torous: So we’ve seen two interesting studies come up around engagement. One was by a researcher called Amit Baumel, in JMIR. He was actually able to purchase market research data on how mental health apps are used on people’s phones. So this wasn’t a clinical study per se; it was looking at the general public: people gave consent to a market research company to record how apps were opened. And he showed that for mental health apps, about 4 per cent were actually opened after about a week. So engagement really went from all these people downloading an app to 4 per cent of people using it. It was interesting: he showed that peer-support apps—apps connecting patients to talk to each other—had 18 per cent engagement. Still not high, but a lot better than 4 per cent. And the breathing apps had about 1 per cent. So certainly we’ve begun to realize that engagement is tricky—another reason why we want to be using these in a clinical relationship, because giving something to somebody and saying, “You have a 4 per cent chance of success. Good luck,” is not very useful. Another paper came out in early 2020, in NPJ Digital Medicine, by Avi, and what Avi and his group showed: they actually got 100,000 individual users in large app studies—one was depression, some included asthma studies, some were Parkinson’s—and looked at the individual trajectories of who used these apps. Of the 100,000 people in these large digital mental health app studies—sorry, digital app studies—they showed that the average engagement was about 5.5 days. That’s not that high. If you think about the mental health conditions we treat, we’re usually treating people for more than 5.5 days. So it again shows that just because it’s an app and it’s accessible—that’s only the first step. Keeping engagement going is going to be a different challenge, where we need to loop the clinicians and people back in.

David Gratzer: To pivot forward: if we are able to show better evidence, if we are able to show better engagement, ultimately, we’d think about, to use the trendy term, digital phenotyping?

John Torous: Yes. So digital phenotyping is an evolving term. But the idea really means: what are the digital signals that our consumer devices, be it wearables, be it smartphone apps, are giving off? Again, this could be your step count; this could be one of those fancy rings that tracks your sleep; this could be surveys. But what is this digital signal, and is it a good proxy for mental health? When is that digital signal valid, when does it make sense, and when can it be useful? And also understanding when is that digital signal not a good proxy for mental health? Maybe I gave my phone to someone else to use. That data is completely invalid and not useful for my health. I think digital phenotyping is this word that helps us understand we’re getting this digital information—just as there’s phenotyping in genetics—and we’re trying to build a proxy, in this case for behaviour. I think digital phenotyping has really taken the lead in mental health because we’re a behavioural health science, and we’re collecting this behavioural information. But there is even interest in digital phenotyping in heart disease and diabetes to learn about behaviour. I think mental health is at the forefront of using digital phenotyping.

David Gratzer: Thinking about things from a health equity lens: is this part of the solution or part of the problem?

John Torous: It’s a really good question. I think if we don’t think about health equity . . . we don’t want to make a digital-first treatment that the people who need the most help can’t actually partake in. And one thing that our group has done—it’s on our website that you can access for free at—is actually make a training program to help people learn how to use their smartphones toward recovery and wellness. And for this training program, you don’t need to have any experience or a smartphone. The first slide shows phones from the one Alexander Graham Bell had all the way to a smartphone. And we say what is a smartphone, what’s not a smartphone. And we start from that basis. And over six weeks of these groups, we say, well, here’s how to connect to Wi-Fi. Here’s how to set up voicemail. Here’s how you use your calendar to set reminders for appointments and medications. Here’s how to use the built-in notes feature of your phone to write down your treatment plan. And eventually we get up to things like, here’s how to use a mood tracking app. Here’s how to do different things. But really, getting people in the community—people who may have a phone, but a loved one may have given them the phone, a social worker may have given it to them, and some people may not really know how to take advantage of all these digital offerings—so I think making sure there are ways to train people and give them the skills and competencies is extremely important to make this really successful.


David Gratzer: Dr. Torous, it’s something of a podcast tradition here to close off with a rapid-fire minute, with just a handful of questions in a minute on the clock. Are you ready?

John Torous: I am ready.

David Gratzer: Let’s put a minute on the clock. Let’s begin. What is the biggest surprise in app development for you?

John Torous: Lack of engagement.

David Gratzer: How are you going to address that?

John Torous: Clinical integration and training people to use apps, and more user-centred design.

David Gratzer: Do you worry that we’re in an app heyday, but there’ll be an app winter as people cool to the concept?

John Torous: I think unless we get privacy and transparency down quickly, we’re going to be in trouble.

David Gratzer: What is the biggest app surprise in terms of a patient recommendation to you?

John Torous: WeCroak, the app that tells you five times a day you will die. It can certainly help you live in the moment, but it also may not be the most therapeutic app. So I do not personally recommend it.

David Gratzer: Was that patient particularly anxious?

John Torous: They were not.

David Gratzer: What’s an app story that resonates with you?

John Torous: I think one app story that resonates with me is people bringing us their own apps that they found without telling us and saying, “I’m actually comfortable telling you I’m using an app for my care. Is it okay that I’m using this, say, CBT app?”

David Gratzer: And at the buzzer: how will these apps get better over the next half decade?

John Torous: I think it’s going to have to be integrating into the mental health system.

David Gratzer: Not technology, but the app itself?

John Torous: Implementation and people.

David Gratzer: Once again, Dr. Torous, we really appreciate your time. That was fantastic. On our web page, we do have a link, of course, to the APA app advisor and more information on that. Thank you again.

John Torous: Thank you.


[Outro: Quick Takes is a production of the Centre for Addiction and Mental Health. You can find links to the relevant content mentioned in the show and accessible transcripts of all the episodes we produce online. If you like what we're doing here, please subscribe.

Until next time.]



About this episode's guest:

Dr. John Torous (@johntorousmd) is director of the digital psychiatry division in the Department of Psychiatry at Beth Israel Deaconess Medical Center, a Harvard Medical School affiliated teaching hospital. He is active in investigating the potential of mobile mental health technologies for psychiatry and has published over 75 peer-reviewed articles and 5 book chapters on the topic. He serves as editor-in-chief of an academic journal on technology and mental health, JMIR Mental Health, currently leads the American Psychiatric Association’s work group on the evaluation of smartphone apps, and is an advisor to the smartphone mood study within the NIH's one-million-person All of Us research program.



Mobile apps in mental health

Evaluating and using apps in mental health practice

American Psychiatric Association. The material provided here covers why it is critical to rate an app, how best to evaluate an app, and where to seek additional guidance on apps and/or the evaluation process.


T2 Mood Tracker

PTSD Coach
PTSD Coach was designed for those who have, or may have, posttraumatic stress disorder (PTSD). The app was created by the VA’s National Center for PTSD and the DoD’s National Center for Telehealth & Technology. It is mentioned in the podcast as an example of an app that measures well against privacy, usability and evidence criteria.

PTSD Coach Canada is an adaptation of PTSD Coach.


Conversational agents / chatbots



