Clinical safety has always mattered in the NHS, but with AI now stepping directly into patient pathways, it has become the deciding factor between genuine progress and avoidable risk.
As AI-enabled tools enter mainstream care pathways, the expectations around safe design, transparent governance and measurable patient benefit are higher than ever. For health systems under intense operational pressure, the question is no longer whether to adopt digital tools but how to ensure they are safe, clinically robust and aligned with established national standards.
To unpack what clinical safety actually means in practice, we spoke with Sharon Price, EBO’s new Clinical Pre-Sales Manager.
With a career spanning acute, community and mental health services and experience as a Clinical Safety Officer and Digital Clinical Lead within the NHS, Sharon bridges the worlds of frontline practice, clinical safety and digital innovation.
Below she shares practical, candid insights about what safety looks like for AI and digital care pathways, and what NHS teams should insist on before letting any AI system touch a patient pathway.
1. Could you briefly tell us about your clinical background and your role as Clinical Safety Officer within the NHS?
My career began on the frontline: emergency care, ambulance services, community nursing and later mental health. Working across those environments showed me very quickly where safe care breaks down: disconnected systems, manual workarounds, and processes that simply can’t keep up with clinical demand.
When I moved into Digital Nursing, I became closely involved with Clinical Safety teams, reviewing risks, writing safety cases and supporting staff to use technology confidently. Completing the NHS Clinical Safety Officer training helped bring all of that practical experience together. In short: keeping patients safe while helping staff work smarter, not harder.
2. What motivated you to transition from the NHS into health tech, and specifically into your role at EBO?
I loved working in the NHS, but I kept encountering the same systemic issues, not because people weren’t working hard, but because the tools and processes weren’t fit for purpose. I realised I could make more impact by tackling those problems at scale.
EBO stood out because it addresses one of the biggest sources of risk and frustration: communication. Whether it’s contacting patients, gathering information, or managing waiting lists, poor communication creates clinical risk. In Clinical Pre-Sales, I can bring my frontline perspective directly into product design so our solutions are safe, practical, and genuinely helpful for both staff and patients.
3. AI is often seen within the NHS as risky or unsafe. From your perspective, what are the most common misunderstandings clinicians have about AI and automation?
The biggest misunderstandings I see are:
1) “AI will make clinical decisions for me.”
Most NHS AI today supports decision-making; it doesn’t replace it.
2) “It’s a black box.”
Good AI is transparent, rule-based, and full of guardrails. If you can’t see how it works, you shouldn’t use it.
3) “AI will deskill people.”
When done well, it actually removes low-value admin so clinicians can do more of the work only humans can do: listening, analysing, caring.
4. When reviewing digital solutions, what should Clinical Safety Officers be specifically looking for to assess clinical risk and robustness?
If you’re reviewing a new system, AI or otherwise, these four things matter most:
1) Clear scope and boundaries — what the system can do, what it must not do, when it should escalate and when it should stop.
2) Evidence of safe design — a robust hazard log, a clear safety case, and testing that reflects messy real-world use, not just polished demo journeys.
3) Post-go-live governance — how incidents, safety concerns and performance will be monitored, managed and improved over time.
4) Change control — how the supplier manages updates, versioning and modifications to logic or models.
If any of these are missing, the product isn’t ready for clinical use.
5. Many companies treat clinical safety as a compliance step rather than a foundational requirement. What needs to change in the industry mindset?
Too many companies treat clinical safety as paperwork needed to pass procurement. That’s not enough. Safety has to be part of the product’s DNA.
The industry needs to move from “tick the box” to “own the risk.”
That means involving clinicians early, being honest about limitations, and designing products with the realities of NHS services in mind, not waiting until the end to write a safety case.
6. What role do standards such as DCB0129 and DCB0160 play in governing digital and AI systems, and do you feel they are currently applied well in commercial settings?
DCB0129 (for suppliers) and DCB0160 (for NHS organisations) set out the minimum safety standards for digital systems. They’re essential, especially now AI is entering the mainstream, because they create a shared language around risk.
Are they applied well?
Sometimes, but not always consistently. Many suppliers still see them as optional, or don’t realise that AI tools count as clinical systems. We need much better education, clearer expectations in procurement, and stronger partnership working between vendors and providers.
7. What does effective partnership between suppliers and NHS clinical safety teams look like?
It looks like genuine collaboration. The best partnerships involve:
✓ engaging early
✓ discussing risks openly
✓ co-designing pathways
✓ sharing ownership of safety
✓ maintaining transparency around incidents and changes
The most successful relationships feel like joint problem-solving, not blame-shifting.
8. How does EBO support its customers in ensuring safe deployment and ongoing clinical oversight?
At EBO, clinical safety starts long before deployment. We co-design pathways with clinicians, design guardrails and escalation rules, and provide the clinical safety documentation NHS teams need.
After deployment, we continue monitoring, reviewing changes and supporting updates. Every conversation flow is version-controlled, risk-assessed and reviewed before release. We also schedule regular check-ins to ensure pathways remain safe, relevant, and aligned to evolving service needs.
9. If you could give one piece of advice to NHS CSOs or digital clinical leaders navigating AI deployment today, what would it be?
Don’t get pulled into the hype, whether overly positive or overly negative. Focus on clinical intent, clear risk boundaries and real-world workflow. If an AI solution genuinely reduces risk, reduces workload and enhances care, it’s worth exploring. If it doesn’t deliver those three things, it’s not the right tool.
For NHS teams navigating the evolving landscape of AI and digital automation, clarity and rigour are essential. The organisations that succeed will be those that embrace clinical safety as a strategic enabler.
To support NHS leaders, CSOs, and digital teams in evaluating digital tools safely and effectively, we’ve created a concise guide covering DCB0129, DCB0160, DTAC, DPIAs and clinical-safety best practice.
👉 Download the Healthcare Guide: A Practical Framework for Safe AI Deployment.
