"If the user can’t use it, it doesn’t work": this phrase, from Susan Dray, was originally addressed to system developers. It presupposes a good understanding of who the intended users are and what their capabilities are. But the same principle applies in sales and procurement.
In hospital (and similar) contexts, this means that procurement processes need to take account of who the intended users of any new technology are. E.g., who are the intended users of new wireless integrated glucometers, or of new infusion pumps that need to have drug libraries installed and maintained... and that must also be used during routine clinical care? What training will those users need? How will the new devices fit into (or disrupt) their workflow? Etc. If any of the intended users can’t use it, then the technology doesn’t work.
I have just encountered an analogous situation with some friends. These friends are managing multiple clinical conditions (including Alzheimer’s, depression, the after-effects of a mini-stroke, and type II diabetes) but are nevertheless living life to the full and coping admirably. Recently, however, they were sold a sophisticated “Agility 3” alarm system, comprising a box on the wall with multiple buttons and alerts, a wearable “personal attack alarm”, and two handheld controllers (as well as PIR sensors, a smoke alarm and more). They were persuaded that this would address all their personal safety and home security needs.

I don’t know whether the salesperson referred directly or obliquely to any potential physical vulnerability. In fact, their main vulnerability is that they no longer have the mental capacity to assess the salesperson’s claims, let alone the capacity to use any technology more sophisticated than an on/off switch. If the user can’t use it, it doesn’t work. By this definition, this alarm system doesn’t work.

Caveat emptor, of course; but selling a product that is meant to protect people, when its net effect is to further expose their vulnerability, is crass mis-selling. How ironic!