A bigger problem than getting healthcare professionals to accept technology, according to an Institute of Medicine report released last year, is that there's been too much emphasis on the computers themselves and not enough on streamlining workflows and providing software that helps doctors make better medical decisions.
Researchers who visited leading healthcare facilities found that "IT applications appear designed largely to automate tasks or business processes. They are often designed in ways that simply mimic existing paper-based forms and provide little support for the cognitive tasks of clinicians or the workflow of the people who must actually use the system."
During the visits, researchers "repeatedly observed healthcare IT focused on individual transactions (e.g., medication x is given to the patient at 9:42 p.m., laboratory result y is returned to the physician, and so on) and virtually no attention being paid to helping the clinician understand how the voluminous data collected could relate to the overall healthcare status of any individual patient," the institute's report said.
There's been a lot of hype suggesting that the benefits of healthcare IT will arrive quickly and automatically, Gabriel says. "Simply implementing computer systems won't dramatically improve [healthcare] quality overnight," he says. "Very careful system design and configuration, along with a lot of thoughtful human process improvement, are necessary in order to make the technology truly helpful."
Pratt is a Computerworld contributing writer in Waltham, Mass. You can contact her at firstname.lastname@example.org. Computerworld's Mitch Betts contributed to this article.