
Why We Need to Stop Talking About “What AI Will Do”
Technological advances have sparked new debates about AI replacing human professionals. Reporting on this often misrepresents AI - what it is, and how people interact with it in real life.
Someone recently sent me an episode of the New York Times' Hard Fork podcast and asked me what I thought about it. It featured a study that found ChatGPT “defeated doctors” at diagnosing diseases. The framing made me a bit angry. Headlines like this grab attention, but they are misleading.
It’s Not AI vs Humans - It’s Humans Using Tools
“Is AI better at diagnosing conditions than doctors?” - that’s like asking “is a hammer better at putting nails in walls than human hands?”. A hammer won’t do anything on its own. A human without tools would struggle. The result of a human using a hammer might be excellent, but it depends on their skill in using the tool.
The study wasn't testing ‘AI versus doctors’. It was comparing three different scenarios:
- Doctors using conventional resources (like medical reference websites)
- Doctors using both the above and ChatGPT, but without prompt training
- The output of ChatGPT on its own, given a standardised prompt developed by the researchers
All of these scenarios are examples of humans using tools. There are many more scenarios that weren't tested or reported on. The third scenario performed best, but this doesn't mean AI is “defeating” doctors. It means we need to be careful about how we integrate such tools into medical practice.
Real-World Examples: Building Tools for Humans
At HelpFirst, we’re developing AI tools to support caseworkers with admin tasks. We are humans, building tools that other humans will use in their work with people in vulnerable situations. It involves developers, caseworkers, supervisors and service users. Each has their own set of motivations, biases, needs and experiences. We have a responsibility to make ourselves aware of these as much as possible - without that, we can't design useful and ethical systems.
My last role was at the Cooperative AI Foundation, a research funder. Their mission is to make AI systems collaborative, not competitive (think training them on games like Diplomacy rather than chess). I do believe this is valuable research. But we should remember that this is still about humans creating tools. Humans will use the tools, impacting other humans. These collaborative tools might be used to tackle joint problems - like food supply and housing. They might also be used to defraud people faster, and wage more destructive wars.
What This Means for Public Services
So what does this mean for healthcare and other caring services? Improving, say, diagnostic accuracy is a service design challenge. Perhaps clinical and AI experts will co-design self-service diagnostic tools. These could provide a ‘first pass’ diagnosis and signpost to a specialist service. Maybe GPs will go through training programmes on how to use AI diagnostic tools. These could reduce referral times and streamline care pathways. I predict a combination of these and more.
But even with excellent AI tools, we’ll still need human-to-human care. Patients need to be able to trust their care providers. They need to get tailored support to adhere to their treatment plans. They need to feel like they matter.
Every healthcare system needs to improve patient care while managing resources. The solution doesn’t lie in the question of “humans versus AI”. It’s in finding the best combination of human expertise, relationships and technology.
We Need To Talk About Human Choices
As we continue developing AI systems, we need to stop talking about AI as if it acts on its own. Language matters, because it can change our perception, even if only by a small amount every day. Conversations about “what AI will do” often anthropomorphise and demonise AI. This whips up fear - for example, about job losses. It also absolves those who design and deploy the tools of responsibility.
We need to keep talking about the human decisions that shape these tools: Who’s building them? Who’s using them? Who are they used with or on? What training and support are users getting? What are the power dynamics at play?
Remember: AI doesn’t do anything. Humans do things, sometimes with AI tools. Humans make choices in development, implementation, access, upskilling and more. Those choices are what matter.
So please, the next time you hear someone talking about “what AI will do”, ask instead: “what are the humans doing?”
This article was originally published on Digital Leaders.
The Priority Services Register (PSR) is a key tool that energy suppliers use to fulfil their responsibilities to vulnerable customers. But there isn't just one register. Every energy company has its own PSR, and the application forms vary unpredictably from supplier to supplier.
As part of our CivTech Challenge, we’ve been researching best practice across the industry. We were left with lots of questions:
- Why is 'restricted hand movement' a vulnerability that almost all suppliers assess?
- Why are archaic phrases like ‘bedridden’ used?
- Why does only one supplier check if their vulnerable customers use ‘electric showering’?
Alas, we weren't able to fully answer these questions. But here’s a visual guide to various PSR forms, so you can get an overview of the landscape.
Overview
We accessed PSR application forms for Ovo Energy, British Gas, SSE [1], Octopus, EDF, Shell and Utilita. For many other suppliers, access is restricted.
The first observation is that the application forms are extremely varied:
- We’ve grouped questions into categories to make things a bit easier to read; however, the forms themselves come in very different structures. Some offer all their options in one long list, some separate them into smaller sections. Some only show certain sections once a customer has selected a particular option (e.g. selecting ‘sight loss’ brings up extra questions on the Shell application).
- Several vulnerabilities are mentioned by only one supplier. Only one asks about autism, and only one asks about breathing difficulties. The following options each showed up just once: ‘female presence preferred’, ‘longer time to answer the door’ and ‘bedridden’. (See the sketch after this list for one way to make this kind of comparison systematically.)
- For sensory needs: ‘blind’ and ‘partially sighted’ are separate options in all the forms. 'Hearing impairment' and 'deaf' are combined in half the forms, and one form does not ask about hearing impairment at all. Combining them may be contentious, as hearing impairment and being deaf are very different conditions.
- Some suppliers include options for accessible information provision (e.g. braille, large print letters) in the same form. Others link to an additional form, or do not reference it at all.
- When temporary conditions are mentioned, only some suppliers allow the customer to select a date when they believe the condition will no longer apply.
- Most of the forms are multiple choice, limiting responses to whatever the supplier chooses to ask about. Occasionally a supplier (e.g. EDF) gives the customer a larger free-text space to describe their conditions, equipment and needs in more detail.
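One way to make this kind of cross-supplier comparison concrete is to treat each form as a set of option labels and count how many suppliers offer each option. The sketch below is a minimal illustration of that idea: the supplier-to-option mapping is a small, incomplete subset drawn from options mentioned in this post (not our full dataset), and the variable names are our own.

```python
from collections import Counter

# Illustrative subset only: a handful of options mentioned in this post,
# NOT the full set of questions from each supplier's PSR form.
forms = {
    "EDF": {"blind", "partially sighted", "restricted hand movement",
            "breathing difficulties", "heart/lung machine", "ventilator"},
    "SSE": {"blind", "partially sighted", "restricted hand movement",
            "autism", "dyslexia", "anxiety/depression", "hard of hearing", "deaf"},
    "Octopus": {"blind", "partially sighted", "restricted hand movement",
                "unable to communicate in English"},
}

# Count how many suppliers offer each option.
counts = Counter(option for options in forms.values() for option in options)

# Options every supplier asks about, and options only one supplier asks about.
asked_by_all = sorted(opt for opt, n in counts.items() if n == len(forms))
asked_by_one = sorted(opt for opt, n in counts.items() if n == 1)

print("Asked by all suppliers: ", asked_by_all)
print("Asked by only one supplier:", asked_by_one)
```

Run over the full questionnaires, the same counting is enough to surface the patterns discussed in the rest of this post, such as an option appearing on almost every form while another appears on only one.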
Next, we dive deeper into the application forms.
Medical Conditions [2]
EDF’s application form has the highest number of options related to medical conditions (20 in total) with British Gas and Utilita featuring the lowest (13). EDF also features options which cover multiple medical conditions (e.g. 'breathing difficulties', 'disability benefits') more frequently than other suppliers. SSE has the highest number of options for learning and mental health related conditions (including 'dyslexia', 'autism', 'learning difficulties' and 'anxiety/depression').
There is some overlap between options, which could be confusing. For example, SSE lists ‘developmental condition’ and ‘autism’ as separate options, even though the latter is a type of the former. Another example is the ‘mental ill health’ and ‘anxiety/depression’ options, again found in the SSE form. It is not clear whether customers should tick both or only the more specific option.
All organisations feature options to indicate older age, but they specify a variety of lower thresholds, including 60+, 65+, 'pensionable age' and 'pensioner'. British Gas have two separate options relating to older age ('pensionable age (65 and over)' and 'age 75 and over').
There is some degree of consistency across organisations. This consistency appears where specific conditions are mentioned in the Ofgem guidance: for instance, 'restricted hand movement' appears in all but one form, despite being a very specific need.
Language Used
The language used across suppliers is very inconsistent. SSE uses ‘hard of hearing’ and ‘deaf’ to describe hearing loss-related needs, while other suppliers employ terms such as ‘hearing impairment’ or ‘hearing impaired’.
Some options have multiple potential meanings: ‘carer’ could refer to the respondent either needing a carer or being a carer for someone else.
All suppliers ask about speech and language difficulties and broader language barriers. However, there is no shared way of asking whether a customer speaks English. Variations include 'unable to communicate in English', 'language barrier' and 'foreign language speaker'.
‘Unable to communicate in English’ (used by Octopus and Ovo) is somewhat ambiguous. Customers might take it to mean having a different first language or having a speech condition. The support required is quite different: an interpreter or multilingual materials would help with the former, while the latter calls for a different kind of communication support.
Medical Equipment [3]
Options Offered
British Gas do not offer any specific options for types of medical equipment: they solely offer the generic category ‘mains powered electric medical equipment’. All other organisations surveyed have more specific options. These are broadly consistent across suppliers, though a few appear only rarely (e.g. ‘wheelchair’, ‘MDE electric showering’).
Most organisations (bar British Gas and EDF) also ask about reliance on water.
Language Used
It is unclear what is meant by the ‘life support’ option used by EDF. Often the phrase ‘life support machine’ refers to a ventilator, but EDF also have a separate option for ‘heart and lung ventilators’. It could mean life support as a condition or set of needs, but that seems too broad for the PSR.
Temporary Changes
In a rare show of unanimity, all suppliers offer the same options for temporary changes.
[Infographic: temporary change options, including a householder aged under 18.]
Other Questions
Passwords
All suppliers offer the option of setting up a password or PIN. This is usually so a technician can state the password as an additional security measure on home visits. Two suppliers required a 6-letter password, one an 8-letter password and one a 10-letter password. A final supplier did not specify a length. An unfortunate side effect of this variation is that if an individual were to move supplier, they may need to change their password and remember a new one. (Note: not shown in an infographic.)
Life Scenarios
Departing from the multiple choice standard, Shell veer into first-person statements. In the ‘Nominee Scheme’ section of their form, they feature an additional tick box option: ‘I can be easily confused and worried by communications from my energy supplier’. When asking about meter support they offer: ‘I have a prepayment meter and no-one in my household is able to safely read it or top it up’.
Accessibility Information [4]
Organisations vary in whether they include accessibility questions on their PSR form. Ovo offers seven different options for receiving accessible information, while Shell offers a single broad checkbox.
Conclusions
Suppliers diverge considerably in what information they collect on their customers to register them for Priority Services support.
On our travels we encountered the aspiration to create a more standardised or universal PSR. Initiatives like the Vulnerability Registration Service and Experian’s Support Hub aim in this direction. In the future we are keen to explore the user experience of these services and how they aid vulnerable customers.
In the meantime, we hope this analysis will prove useful if you are looking to improve the experience of vulnerable energy customers. Any questions or comments, contact harriet@helpfirst.ai. We’d love to hear from you!
Footnotes
1. SSE was acquired by Ovo Energy in 2020. The migration hadn't been completed when we started this research, and SSE were still registering people to their own PSR. We've included them in this analysis because their approach was interesting, with many questions about mental health and developmental conditions.
2. Some questions have been condensed in the infographics. Numbers referenced in the discussion refer to the full set of options as they appear on the questionnaires; the infographics show a condensed version for brevity and ease of visualisation. The full original data is available on request.
3. 'Heart/lung machine & ventilator' is the most common formulation of the question about this equipment. EDF, however, separates it into two options: 'heart/lung machine' and 'ventilator'.
4. Octopus and British Gas do not ask about accessibility, and Shell only offers a general ‘accessible information’ tick box if the customer has earlier indicated a visual impairment. This does not mean these suppliers do not record accessibility information elsewhere; where the questions did not appear on their forms, we were unable to verify what (or whether) they ask about accessible information.