Difficult conversations: Privacy vs home sensors

The world of care has changed. Sensors monitor people in their own homes, care services are trialling the use of robots and AI to plug workforce gaps, and technology can either improve choice and control or erode personal autonomy. This webinar included a discussion of the difficult conversations we need to have about privacy and sensors.


Date of webinar: 11 March 2026

Overview

This webinar looked at:

  • No “right or wrong”, only the right questions. Rather than debating sides, the discussion highlighted uncertainties, fears, and practical realities. Some people are highly privacy-conscious and don’t want to feel surveilled in their own home, while others value the security of knowing they have remote support.
  • Lived-experience perspectives.
  • Tech and risk perspectives.
  • A technologist or digital care expert explaining when monitoring can be avoided, what alternatives to CCTV exist, and what “least intrusive” tech looks like.
  • Ethical considerations.

Speakers include:

  • Richard Keyse, Chief Executive of 2iC-Care
  • Amy Lewis, Managing Director of Just Checking
  • Lynsey Way, Clinical Director of Active Prospects
  • Mark De Bernhardt Lane, Senior Advisor for Partners in Care & Health (PCH) – LGA/ADASS
  • Adam Chamberlain, Strategy Director and General Counsel of Orchid Group
  • A frontline care worker representing the Care Workers Charity
  • A representative from Think Local Act Personal (TLAP)


As care technology becomes more common, difficult questions are emerging about how to use it responsibly. This webinar brought together a diverse panel, including technology providers, care professionals, and a family carer, to explore how home sensors and monitoring tools can support people while still respecting their privacy, dignity, and right to choose. 


Key themes 


Real choice includes the option to refuse
People should be able to say “no” to monitoring without losing access to care. To make informed decisions, they need clear explanations, time to consider options, and opportunities to try technologies in practice. 

Care must be personalised
There is no one-size-fits-all approach. Decisions should reflect each person’s needs, preferences, risks, and communication style, and be reviewed as circumstances change. 

Privacy and dignity depend on perspective
Technology and human care can both feel intrusive in different ways. The least intrusive option depends on the individual and should always be proportionate to the level of risk. 

Data use matters as much as data collection
It’s not just what data is collected, but who can access it. Panellists stressed avoiding unnecessary data collection and ensuring access is carefully tailored. 

Technology should support, not replace, relationships
Monitoring can improve safety and provide early warnings, but care must not become “data-rich and relationship-poor.” Any efficiency gains should enhance, not reduce, human support. 

Potential benefits
Used well, technology can increase independence, reduce restrictions, support safer hospital discharges, and identify early signs of health issues before they escalate. 

Concerns about trust and surveillance
Care workers may feel monitored themselves, while families often seek reassurance about safety. Balancing these perspectives is key to maintaining trust. 

Consent and capacity are ongoing processes
Consent should be informed, clearly communicated, and regularly revisited. Where someone lacks capacity, decisions must be made in their best interests, while still paying attention to signs of discomfort or refusal. 

Wearables add new challenges
Wearable devices can support wellbeing but often involve sensitive data like location tracking, requiring careful ethical and legal consideration. 

Cybersecurity and governance
Providers and commissioners need to ask better questions about data protection and security. While risks can’t be eliminated, stronger processes can build confidence. 


Practical takeaways 

  • Start with the person’s goals, such as independence and quality of life, and use the minimum level of monitoring needed 
  • Make choices clear and concrete with plain language, demonstrations, and regular reviews 
  • Be explicit about what is monitored, why, who can access it, and how it will be used 
  • Ensure technology strengthens, rather than replaces, human relationships 
  • Put robust data protection and cybersecurity measures in place 

Conclusion 

Home monitoring technologies offer real opportunities to improve care, but they also raise important ethical and practical challenges. The key is to keep the person at the centre—ensuring genuine choice, respecting privacy, and using technology to enhance, not diminish, human care. 
