
Aria – AI Wellbeing Companion

Frequently Asked Questions

1. About Aria

1.1 What is Aria?

Aria is an AI wellbeing companion that offers confidential, evidence-informed support for everyday mental health challenges such as stress, low mood and anxiety. It uses state-of-the-art AI models trained on psychological research and Cognitive Behavioural Therapy (CBT) techniques, and is available 24/7 via a mobile app.

Aria is designed to help you:

  • Make sense of how you’re feeling

  • Learn practical coping tools and CBT-style exercises

  • Reflect on patterns over time

  • Decide when it might help to reach out for further support


1.2 Who is Aria for?

Aria is designed for adult users (such as NHS staff and other employees, depending on your organisation’s licence) who want quick, convenient support with their mental wellbeing.

It is not aimed at children or teenagers, and it is not a replacement for specialist mental health services.


1.3 Is Aria a therapist or a clinician?

No. Aria is not a therapist, doctor, psychologist or crisis service.

Instead, Aria is a digital self-help tool that:

  • Uses CBT-based ideas and coping strategies

  • Helps you reflect on your thoughts, feelings and behaviours

  • Can signpost and encourage you to contact human services (for example, Health Assured or local NHS services) when that would be more appropriate than continuing with the app alone

Any diagnosis, change in medication, or major treatment decision should always be made by a qualified health professional.


1.4 How is Aria different from generic AI chatbots?

You may have seen people using general-purpose AI tools for mental health support. At the same time, researchers and clinicians have raised concerns that unregulated chatbots can offer unsafe, biased or overly simplistic responses, and should not be treated as therapists.

Aria is different in several ways:

  • Purpose-built for wellbeing rather than general chat

  • Informed by psychological research and CBT rather than random internet content

  • Clinically moderated design: Aria is built to complement, not replace, existing support pathways and to encourage appropriate escalation to human care.

 

2. Safety, evidence and quality

2.1 Is Aria safe to use?

There is growing public discussion about the safety of AI in mental health, including worries about harmful advice, missed risk, or AI “saying the wrong thing”.

Aria’s safety approach includes:

  • Clear boundaries: Aria is framed as a wellbeing companion, not a therapist or emergency service.

  • Evidence-informed techniques: Aria’s underlying approach is based on CBT-style exercises that have been used widely in mental health care.

  • Encouraging real-world help: Aria actively guides users towards appropriate services (e.g. Health Assured or local NHS support) and can help you prepare for appointments or referrals, instead of trying to manage everything itself.

Like any tool, Aria has limitations (see below). It is safest when used alongside, not instead of, human support.


2.2 Is there any evidence that tools like Aria work?

Early studies on the technology Aria is based on have shown promising results:

  • In a controlled study of a very similar AI wellbeing companion, regular use was associated with reduced self-reported depression (by up to 28%), anxiety (31%) and negative affect (15%), and increased positive mood.

  • Aria’s own early user data (from internal evaluations) show that around 85% of users report feeling better after their first conversation, and around 40% find that Aria alone is enough to resolve a particular issue, while others use it as a stepping stone to further support.

These are encouraging signs, but Aria is not presented as a cure or a substitute for clinical treatment. Larger, long-term studies are still important, and the wider NHS is actively evaluating AI tools in mental health settings.


2.3 Can Aria get things wrong or misunderstand me?

Yes. Like all AI systems, Aria can misinterpret what you say, oversimplify, or give suggestions that don’t land well for you personally.

To reduce risk:

  • Aria has been designed to avoid giving diagnoses, medication advice or specific treatment instructions.

  • Aria focuses on supportive listening, reflection and self-help tools, not on telling you what to do with your health.

  • You are encouraged to treat Aria’s suggestions as options, not instructions, and to speak to a clinician if you’re unsure.

If something Aria says feels unhelpful or inaccurate, you can simply ignore it, correct it, or bring it to a human professional for discussion.

 

2.4 What happens if I’m in crisis, feeling unsafe, or thinking about self-harm?

Aria is not an emergency or crisis service and cannot keep you physically safe.

If you are in immediate danger or feel unable to keep yourself safe, you should:

  • Call 999

  • Contact your local Crisis Team (if you have their number)

  • Call NHS 111 and choose the mental health option (England)

  • Use established crisis services such as Samaritans (116 123)

Aria’s role is to encourage and support you in reaching out to appropriate crisis or urgent care services, and to help you think through the next steps before and after those conversations – not to replace them.

 

2.5 Will Aria replace therapists, counsellors or other mental health staff?

No. Experts are very clear that AI should not replace human mental health care, particularly for more complex or high-risk situations.

Aria is designed to:

  • Provide additional, immediate support when people can’t easily access a human

  • Help people notice difficulties early and seek help sooner

  • Take some pressure off staff by offering day-to-day coping tools, not by replacing clinicians

Think of Aria as part of a wider support ecosystem, not a replacement for it.

 

3. Privacy, confidentiality and data

3.1 Who can see what I say to Aria?

Aria is designed to be confidential and secure. The only place your conversation transcript is stored alongside any personally identifiable information (like your name or email) is on your own device.

That means:

  • Your employer does not see your individual conversations

  • Line managers and HR do not have access to your chat history

  • Aria is intended to feel like a private space where you can speak freely


3.2 Do I have to use my real name?

You will typically sign up using a work email to verify that you are eligible for access, but you do not have to use your legal name as the name you chat under inside the app.

You can treat Aria as a pseudonymous, private space, within the usual boundaries of safety and law.


3.3 What data does Aria store and why?

To function, Aria may process:

  • The text you type during conversations

  • Information about how and when you use the app

  • Basic account details (such as the fact that you’re entitled to access via your organisation)

This data is used to:

  • Keep your conversation going in a coherent way

  • Improve the quality of the app over time

Public discussions about AI in mental health often highlight concerns around privacy, data misuse and surveillance. Aria’s approach is to minimise identifiable data, keep transcripts linked to identity on your device, and avoid sharing your personal conversations with employers or third parties.

For full details, you should check the Aria Privacy Policy in the app or on the Medstars website.


4. When and how to use Aria

4.1 When is Aria a good option?

Aria can be particularly helpful when you:

  • Feel stressed, worried, low or overwhelmed and want a safe place to talk it through

  • Want to try CBT-style exercises (e.g. reframing thoughts, grounding techniques, breathing exercises)

  • Are awake out of hours (for example, night shifts or insomnia) and can’t access other support

  • Are considering whether to talk to a GP, counsellor or helpline and want help preparing

Research suggests many adults and young people are already turning to AI chatbots as a convenient, anonymous first step, especially when traditional support feels hard to reach.


4.2 When should I talk to a human instead?

You should seek human support if:

  • Your mood or anxiety is severely affecting your ability to cope

  • You are having frequent thoughts of self-harm or suicide

  • You have complex mental health difficulties or a history of severe illness

  • You need a diagnosis, medication review or fit note

  • You feel that Aria is not understanding your situation, or its responses are making you feel worse

Aria is designed to encourage – not discourage – contact with human services. If you’re unsure, it’s safer to speak to a GP, mental health professional or trusted helpline.


4.3 Can Aria diagnose me or tell me what treatment I need?

No. Aria cannot diagnose mental health conditions and cannot decide your treatment plan.

What it can do is:

  • Help you reflect on your symptoms and patterns

  • Provide self-help tools often used alongside therapy

  • Help you prepare to talk to your GP, counsellor or occupational health

  • Support you in sticking with coping strategies between appointments

Any diagnosis, prescription or change in treatment must come from a qualified health professional.


4.4 What if I’m already receiving mental health care?

You can use Aria as a complement to your existing care:

  • As a space to journal and reflect between sessions

  • To practise techniques (e.g. breathing, grounding, cognitive restructuring) you’ve learned in therapy

  • To help you plan what to discuss at your next appointment

Aria is not a replacement for your therapist, psychiatrist or care team, and it won’t interfere with your clinical records. If you’re unsure whether to use it, you can always discuss it with your clinician.


5. Fairness, bias and accessibility

5.1 Will Aria understand my culture, identity or background?

One of the concerns often raised about AI in mental health is the risk of bias and limited cultural understanding – for example, downplaying discrimination, or misreading the significance of certain experiences.

Aria aims to:

  • Use language that is respectful and inclusive

  • Treat your experiences as valid, rather than dismissing them

  • Recognise that context (e.g. racism, stigma, family norms, faith) matters for wellbeing

However, AI systems can still reflect gaps or biases in the data they were trained on. If Aria ever responds in a way that feels invalidating or biased, it’s important to:

  • Trust your own lived experience

  • Raise it with a trusted person or service if you feel able

  • Use Aria as one tool among many, not the sole judge of your situation


5.2 Is Aria accessible for people with disabilities?

Some people may find AI chat helpful because:

  • It’s text-based, which can suit people who are deaf or hard of hearing, or who prefer writing

  • It can feel safer for people with social anxiety or those who find face-to-face conversations difficult

At the same time, people with:

  • Visual impairments,

  • Dexterity difficulties, or

  • Cognitive or learning differences

may need additional accessibility features, such as screen-reader compatibility, clear language and the option to take things slowly.

Accessibility is an ongoing area of improvement. Feedback from disabled users is vital to help shape future updates.


6. Practicalities

6.1 How do I access Aria?

You can download the Aria app from the Apple App Store or Google Play Store (where available) and sign up using your work email if your organisation provides access.

Once verified, you can start chatting with Aria immediately.


6.2 Does Aria cost me anything to use?

If your organisation has partnered with Medstars, access to Aria is free at the point of use for eligible staff. There is no in-app purchase required to talk to Aria.


6.3 When is Aria available?

Aria is available 24 hours a day, 7 days a week.

This is particularly helpful for night-shift workers, people with irregular schedules, or anyone who struggles most outside normal office hours.


6.4 What if I have technical problems with the app?

If you experience technical issues (for example, difficulties logging in, crashes or performance problems), you can:

  • Check your internet connection and app store for updates

  • Restart the app or your phone

  • Contact the support or helpdesk details provided by your organisation or within the app itself
