April 19, 2018 (Rapid Growth Media) – It’s been over a century since hacking off a limb was considered the best cure for a deep cut, and even less time since frontal lobotomies were routinely prescribed for the melancholy.
The dark ages of medicine are seemingly behind us. Sterilization revolutionized medicine during the Civil War, as did refrigeration decades later. Leaps forward in prosthetics, anesthesia, and new ways of looking into bodies without actually opening them up have further advanced modern healthcare into what many know it as today.
Innovation in this field has long been tied to the latest technologies, the most impressive devices, the fastest-acting medicines, and the advantages they bring to helping patients recover. But this definition covers only a fraction of the “patient experience.”
“Human-centered” is an adjective lately used to describe product and systems design techniques that rely heavily on understanding an individual’s needs before devising a solution. The process involves interviews with users, small incremental changes, and a constant stream of feedback guiding future development. It’s not far removed from the intake procedure at a physician’s office: interviews are conducted, doctors and nurses document each patient’s “story,” and they take action based on inferences from that data. With a practice so closely tied to “user feedback,” healthcare would seem the most natural application of the human-centered approach.
Obviously, that’s not always the case, as evidenced by questions of whether healing people is even a sustainable business model, and by opioid-addicted communities being cultivated at great human expense. In contrast to these ultra-capitalistic outliers, human-centered care does make a difference.