
AI Health Dialogue

When AI monitors your health, both the patient and the clinician need to understand what it's saying — and why.

The Current Problem

AI health monitoring platforms collect rich patient data and generate clinical insights — but the explanations stop at the dashboard. Patients receive scores without meaning. Clinicians receive AI recommendations without reasoning. The result is a consultation where neither party fully understands the information between them.

Patients receive scores, not meaning

“Your pain score is 34” tells a patient nothing about what to do next

Clinicians trust the output, not the reasoning

AI recommendations arrive without explanation, making them hard to discuss with patients or to challenge when they're wrong

Patient-reported data gets lost in aggregation

Quality-of-life questionnaires are flattened into numbers before the clinician ever sees the nuance

No audit trail for AI-influenced decisions

When AI shapes a diagnosis or treatment path, neither patient nor clinician has a record of what the AI concluded and why

Our Design Approach

We've designed a bidirectional explainability layer for health AI: interfaces that translate AI findings into language patients can act on, and that surface patient-reported experience in ways clinicians can trust and use. Transparency flows both ways.

Plain-language interpretation

AI findings translated into what they mean for daily life, with suggested questions to ask the clinician

Visible reasoning

Each AI recommendation shows the data points that drove it, so clinicians can validate, override, and explain

Narrative surfacing

AI identifies the most significant patient-reported signals and presents them as clinical talking points before the appointment

EU AI Act Article 13 compliance

Full transparency logs, human override records, and patient-accessible decision summaries
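The transparency metadata described above — the driving data points, a clinician-facing rationale, and a patient-accessible summary — could be captured in a single decision record. This is a minimal illustrative sketch only; all names and fields here are assumptions, not our production schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataPoint:
    """One input signal that contributed to an AI recommendation."""
    source: str       # e.g. "PROM questionnaire", "wearable" (hypothetical labels)
    description: str  # plain-language description of the signal
    value: str

@dataclass
class DecisionRecord:
    """An AI recommendation plus its transparency metadata:
    the data points that drove it, a clinician-facing rationale,
    and a plain-language summary the patient can read."""
    recommendation: str
    rationale: str        # shown to the clinician ("visible reasoning")
    patient_summary: str  # shown to the patient ("plain-language interpretation")
    inputs: list[DataPoint] = field(default_factory=list)
    created_at: str = ""

    def __post_init__(self):
        # Timestamp every record so it can serve as an audit-trail entry.
        if not self.created_at:
            self.created_at = datetime.now(timezone.utc).isoformat()

# Example record mirroring the pain-pattern scenario in the mockup below.
record = DecisionRecord(
    recommendation="Review current pain-management plan",
    rationale="Pain scores rose on post-activity days over the last 4 weeks",
    patient_summary="Your pain tends to increase on days after activity.",
    inputs=[DataPoint("PROM questionnaire", "Daily pain score", "34/100")],
)
```

Keeping both the rationale and the patient summary on the same record is what makes the explainability bidirectional: one artefact serves the clinician, the patient, and the audit log.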

Interface Design Proposals

These mockups show how explainability can be designed into both sides of the health AI consultation. Developed in the context of EU AI Act Article 13 transparency requirements and GDPR health-data obligations.

MyHealth
Before Your Appointment

Your Health Summary

Based on your last 4 weeks of responses, your pain levels have increased on days after activity. Your clinician will want to discuss this.

What this means for you

  • Your body may need more recovery time between physical activities — your clinician can help adjust your routine
  • The pattern is consistent enough to discuss whether your current pain management is working as well as it could

Questions you might want to ask:

Tap to add to your appointment notes

This summary was generated by AI. Your clinician will review it. You can ask for a human explanation at any time.

Key Design Principles:

  • AI conclusions are always explained, never just asserted
  • Patient language and clinical language are treated as equally valid inputs
  • Every AI recommendation includes a “why this was flagged” disclosure
  • Patients can request human review of any AI-generated finding (GDPR Article 22)
  • Clinicians can override, annotate, and correct AI outputs with full audit logging
  • Appointment prep is shared — both parties enter the consultation with the same information
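To show what "override, annotate, and correct with full audit logging" could look like in practice, here is a minimal sketch of an override event log. The function and field names are hypothetical, chosen only to illustrate the principle of an append-only, timestamped record.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class OverrideEvent:
    """One audit-log entry for a clinician acting on an AI finding."""
    finding_id: str
    clinician_id: str
    action: str   # e.g. "override", "annotate", or "correct"
    note: str     # the clinician's stated reason, kept for the audit trail
    timestamp: str

def log_override(finding_id: str, clinician_id: str,
                 action: str, note: str, log: list) -> OverrideEvent:
    """Append a timestamped event; the log is never edited in place."""
    event = OverrideEvent(finding_id, clinician_id, action, note,
                          datetime.now(timezone.utc).isoformat())
    log.append(event)
    return event

audit_log: list[OverrideEvent] = []
log_override("finding-102", "clinician-77", "override",
             "Pain increase explained by a recent physiotherapy change",
             audit_log)

# Serialise for a patient-accessible decision summary.
print(json.dumps(asdict(audit_log[0]), indent=2))
```

Because every override carries a reason and a timestamp, both parties can later reconstruct what the AI concluded, who disagreed, and why.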

These are simplified previews. Full interactive prototypes available in our complete package.

Research Interview Framework

A bidirectional research approach designed to capture both patient comprehension and clinician confidence in AI-mediated health consultations.

Bidirectional explainability requires research with both sides of the consultation:

Patients with Chronic Conditions

  • Patients using digital monitoring (cardiology, orthopaedics, oncology)
  • Mix of health literacy levels and digital confidence
  • Include patients who have received AI-generated health summaries
  • Recruit 15-20 participants across at least 2 clinical areas

Clinicians Using PROMs Platforms

  • GPs, consultants, and nurse specialists reviewing AI outputs
  • Mix of AI-positive and AI-sceptical clinicians
  • Those using Patient-Reported Outcome Measures (PROMs) in practice
  • Recruit 10-12 across primary and secondary care

Healthcare IT Leads

  • Hospital teams implementing AI diagnostics or monitoring
  • Compliance officers familiar with EU AI Act and GDPR health obligations
  • Chief Clinical Information Officers (CCIOs)
  • Recruit 5-8 from NHS trusts or equivalent institutions

Ready to make health AI legible for everyone in the room?

Our complete package includes bidirectional explainability frameworks, EU AI Act Article 13 compliance templates, and patient communication design patterns for health AI platforms.