Consent: what does ‘agreeing to tech’ really mean?

This webinar explored how to ensure the use of care technology is based on genuine consent, not pressure. The discussion emphasised that consent must be ongoing, informed, and revisited over time. Speakers also warned against “consent by duress”, where people feel they must accept technology to stay safe or receive support. Conversations should be open and transparent, with people able to understand both benefits and limits.

Date of webinar: 17 March 2026

Difficult Conversations: what does ‘agreeing to tech’ really mean?

Introduction
This panel discussion explores the complex issue of consent in the use of technology within health and social care. With increasing digitisation, including AI, digital care records, and sensor-based monitoring, there is growing tension between improving care and respecting individuals’ rights. The panel, made up of professionals and people with lived experience, examines how consent should be obtained, understood, and maintained, particularly when capacity varies and technology becomes embedded in care systems. 

Key themes 

A central theme is that consent is not a one-time event but an ongoing, dynamic process. Panellists emphasise the importance of continuous dialogue, where individuals are regularly informed and asked about their preferences as circumstances, technologies, and understanding evolve. 

Another major issue is informed consent. For consent to be valid, individuals must clearly understand what data is collected, how it is used, and who can access it. However, the complexity of AI and data systems makes this difficult. Panellists stress the need for transparent communication, focusing less on technical details and more on real-life implications for the person. 

The discussion also highlights diversity in needs and abilities. There is no “one-size-fits-all” approach: some individuals are comfortable with technology, while others may struggle due to disability, lack of digital skills, or personal preference. Accessible formats (e.g., Easy Read, verbal explanations, paper alternatives) are essential to ensure inclusion. 

Capacity and fluctuating decision-making ability are critical considerations. Consent may need to be revisited frequently, and advance planning (e.g., stating preferences early) can help guide decisions if capacity is later lost. However, past decisions should not override present wishes, and ongoing validation is necessary. 

The panel also discusses ethical concerns around data use and surveillance. Technologies like sensors, wearables, and AI systems can blur the boundary between support and intrusion. There are risks of coercion, misuse of data, or prioritising organisational efficiency over individual dignity. Consent must therefore include an understanding of the trade-offs between safety, privacy, and autonomy. 

A further tension exists between efficiency and choice. While digital systems can streamline care, they may reduce alternatives (e.g., paper records), making consent feel forced rather than voluntary. True consent requires genuine options, including non-digital pathways. 

Finally, the panel strongly favours opt-in consent over opt-out, as it ensures active, informed participation rather than passive acceptance. 

Practical Takeaways 

  • Treat consent as an ongoing conversation, not a one-off decision. 
  • Use clear, accessible formats tailored to the individual (e.g., Easy Read, verbal, paper). 
  • Focus on real-life impacts, not technical explanations, when describing technology. 
  • Always provide genuine alternatives, including non-digital options. 
  • Regularly review and revisit consent, especially where capacity may change. 
  • Prefer opt-in models to ensure active, informed agreement. 
  • Be transparent about data use, sharing, and risks. 
  • Involve individuals early through co-design and shared decision-making. 
  • Balance safety with dignity and privacy—avoid unnecessary surveillance. 
  • Respect that saying “no” is a valid and important choice. 

Conclusion
The discussion concludes that meaningful consent in digital care depends on person-centred, transparent, and ongoing engagement. Technology should enhance care, not impose it. Without genuine choice, accessibility, and trust, consent risks becoming superficial or coerced. The guiding principle remains: “nothing about us without us”.