
  1. Hardware and Software Computing Infrastructure. This dimension of the model focuses solely on the hardware and software required to run the applications. The most visible part of this dimension is the computer, including the monitor, printer, and other data display devices, along with the keyboard, mouse, and other data entry devices used to access clinical applications and medical or imaging devices. This dimension also includes the centralized (network-attached) data storage devices and all of the networking equipment required to allow applications or devices to retrieve and store patient data. Also included in this dimension is software at both the operating system and application levels. Finally, this dimension of the model subsumes all the machines, devices, and software required to keep the computing infrastructure functioning, such as the high-capacity air conditioning system, the batteries that form the uninterruptible power supply (UPS) that provides short-term electrical power in the event of an electrical failure, and the diesel-powered backup generators that supply power during longer outages.

    In short, this dimension is purely technical; it is composed only of the physical devices and the software required to keep these devices running. A key aspect of this dimension is that, for the most part, the user is not aware that most of this infrastructure exists until it fails [ 33 ]. For example, in 2002 the Beth Israel Deaconess Medical Center in Boston experienced a four-day computer outage caused by aging computer equipment coupled with outdated network-management software that had been designed for a much less complex network. Furthermore, their network diagnostic tools were ineffective because they could only be used when the network was functioning [ 34 ].

  2. Clinical Content. This dimension includes everything on the data-information-knowledge continuum that is stored in the system (i.e., structured and unstructured textual or numeric data and images that are either captured directly from imaging devices or scanned from paper-based sources) [ 35 ]. Some clinical content elements are used to configure the software itself; examples include the controlled vocabulary items that are selected from a list while ordering a medication or a diagnostic test, and the logic required to generate an alert for certain types of medication interactions. Other elements describe clinical aspects of the patient’s condition (e.g., laboratory test results, discharge summaries, or radiographic images). Still other clinical content, such as demographic data and patient location, is used to manage administrative aspects of a patient’s care. These data can be entered (or created), read, modified, or deleted by authorized users and stored either on the local computer or on a network. Certain elements of the clinical content, such as the content that informs clinical decision support (CDS) interventions, must be managed on a regular basis [ 36 ].
  3. Human Computer Interface. An interface enables unrelated entities to interact with the system and includes the aspects of the system that users can see, touch, or hear. The hardware and software “operationalize” the user interface; provided these are functioning as designed, any problems with using the system are likely due to human-computer interaction (HCI) issues. The HCI is guided by a user interaction model created by the software designer and developer [ 37 ]. During early pilot testing of the application in the target clinical environment, both the user’s workflow and the interface are likely to need revision. This process of iterative refinement, wherein both the user and the user interface may need to change, must culminate in a human-computer interaction model that matches the user’s modified clinical workflow. For example, if a clinician wants to change the dose of a medication, the software requires the clinician to discontinue the old order and enter a new one, but the user interface should hide this complexity. This dimension also includes the ergonomic aspects of the interface [ 38 ]. For example, if users are forced to operate a computer mouse while standing, they may have difficulty controlling the pointer on the screen because they are moving the mouse with the large muscles of the shoulder rather than the smaller muscles of the forearm. Finally, the lack of a feature or function within the interface represents a problem with both the interface and the software or hardware that implements it.
  4. People. This dimension represents the humans (e.g., software developers, system configuration and training personnel, clinicians, and patients) involved in all aspects of the design, development, implementation, and use of HIT. It also includes the ways in which these systems help users think and the ways they make users feel [ 39 ]. Although user training is clearly an important component of the user portion of the model, it may not by itself overcome all user-related problems. Many “user” problems actually result from poor system design or from errors in system development or configuration. In addition to the users of these systems, this dimension includes the people who design, develop, implement, and evaluate them; these people must have the knowledge, skills, and training required to develop applications that are safe, effective, and easy to use. This is the first aspect of the model that lies purely on the social end of the socio-technical spectrum.

    In most cases, users will be clinicians or employees of the health system. However, with recent advances in patient-centered care and development of personal health record systems and "home monitoring" devices, patients are increasingly becoming important users of HIT. Patients and/or their caregivers may not possess the knowledge or skills to manage new health information technologies, and this is of specific concern as more care shifts to the patient’s home [ 40 ].

  5. Workflow and Communication. This is the first portion of the model that acknowledges that people often need to work cohesively with others in the health care system to accomplish patient care, a collaboration that requires significant two-way communication. The workflow dimension accounts for the steps needed to ensure that each patient receives the care they need at the time they need it. Often, the clinical information system does not initially match the actual “clinical” workflow. In that case, either the workflow must be modified to adapt to the HIT, or the HIT system must be changed to match the various workflows identified.
  6. Internal Organizational Policies, Procedures, and Culture. The organization’s internal structures, policies, and procedures affect every other dimension in our model. For example, the organization’s leadership allocates the capital budgets that enable the purchase of hardware and software, and internal policies influence whether and how offsite data backups are accomplished. The organizational leaders and committees who write and implement IT policies and procedures are responsible for overseeing all aspects of HIT system procurement, implementation, use, monitoring, and evaluation. A key aspect of any HIT project is to ensure that the software accurately represents and enforces, if applicable, organizational policies and procedures. Likewise, it is also necessary to ensure that the actual clinical workflow involved with operating these systems is consistent with policies and procedures. Finally, internal rules and regulations are often created in response to the external rules and regulations that form the basis of the next dimension of the model.
  7. External Rules, Regulations, and Pressures. This dimension accounts for the external forces that facilitate or place constraints on the design, development, implementation, use, and evaluation of HIT in the clinical setting. For example, the recent passage of the American Recovery and Reinvestment Act (ARRA) of 2009, which includes the Health Information Technology for Economic and Clinical Health (HITECH) Act, makes available over $20 billion for health care practitioners who become “meaningful users” of health IT. Thus, ARRA introduces the single largest financial incentive ever to facilitate electronic health record (EHR) implementation. Meanwhile, a host of federal, state, and local regulations govern the use of HIT. Examples include the 1996 Health Insurance Portability and Accountability Act (HIPAA), recent changes to the Stark Laws, and restrictions on secondary use of clinical data. Finally, there are three recent national developments that have the potential to affect the entire health care delivery system in the context of HIT. These include: 1) the initiative to develop the data and information exchange capacity to create a national health information network [ 41 ]; 2) the initiative to enable patients to access copies of their clinical data via personal health records [ 42 ]; and 3) clinical and IT workforce shortages [ 43 ].
  8. System Measurement and Monitoring. This dimension has largely been unaccounted for in previous models. We posit that the effects of HIT must be measured and monitored on a regular basis. An effective system measurement and monitoring program must address four key issues related to HIT features and functions [ 44 ]. First is the issue of availability: the extent to which features and functions are available and ready for use. Measures of system availability include response times and the percent uptime of the system. A second measurement objective is to determine how the various features and functions are being used by clinicians; one such measure is the rate at which clinicians override CDS warnings and alerts (a brief computational sketch of these first two types of measures appears below). Third, the effectiveness of the system on health care delivery and patient health should be monitored to ensure that anticipated outcomes are achieved. For example, the mean HbA1c value for all diabetic patients in a practice may be measured before and after implementation of a system with advanced CDS features. Finally, in addition to measuring the expected outcomes of HIT implementation, it is also vital to identify and document unintended consequences that manifest themselves following use of these systems [ 45 ]. For instance, it may be worthwhile to track practitioner efficiency before and after implementation of a new clinical charting application [ 46 ]. In addition to measuring the use and effectiveness of HIT at the local level, we must develop methods to measure and monitor these systems, and to assess the quality of care resulting from their use, at the state, regional, or even national level [ 47 , 48 ].
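To make the first two types of measures concrete, the following minimal Python sketch shows how percent system uptime and a CDS alert override rate might be computed from simple event records. The data structures, field names, and toy values are illustrative assumptions only; they do not represent any particular EHR's logging format.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Outage:
    start: datetime
    end: datetime

@dataclass
class CDSAlert:
    alert_id: str
    overridden: bool  # True if the clinician dismissed the warning without acting on it

def percent_uptime(outages: list[Outage], period_start: datetime, period_end: datetime) -> float:
    """Percentage of the reporting period during which the system was available."""
    total_seconds = (period_end - period_start).total_seconds()
    downtime_seconds = sum((o.end - o.start).total_seconds() for o in outages)
    return 100.0 * (total_seconds - downtime_seconds) / total_seconds

def override_rate(alerts: list[CDSAlert]) -> float:
    """Proportion of CDS alerts that clinicians overrode."""
    return sum(a.overridden for a in alerts) / len(alerts) if alerts else 0.0

# Toy values for illustration only
period_start, period_end = datetime(2010, 1, 1), datetime(2010, 2, 1)
outages = [Outage(datetime(2010, 1, 10, 2, 0), datetime(2010, 1, 10, 3, 30))]
alerts = [CDSAlert("a1", True), CDSAlert("a2", False), CDSAlert("a3", True)]

print(f"Percent uptime: {percent_uptime(outages, period_start, period_end):.2f}%")
print(f"CDS override rate: {override_rate(alerts):.0%}")
```

In practice, such measures would be derived from the system's audit logs and alert-firing records and trended over time rather than computed over a single snapshot.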

Relationships and Interactions between our Model’s Components

Our research and experience have led us, and others, to conclude that HIT-enabled healthcare systems are best treated as complex adaptive systems [ 49 ]. The most important consequence of this conclusion is that hierarchical decomposition (i.e., breaking a complex system, process, or device down into its components, studying them, and then integrating the results in an attempt to understand how the complete system functions) cannot be used to study HIT [ 50 ]. As illustrated by the evaluation of centrally stored electronic summaries in the UK, complex interdependencies between the various socio-technical dimensions are to be expected; our HIT model (had it existed at the time) might have predicted some of them and allowed the implementers to address them prior to go-live rather than during the evaluation stages of the project. Therefore, one should not view or use our model as a set of independent components that can be studied in isolation and then synthesized to develop a realistic picture of how HIT is used within the complex adaptive healthcare system. Rather, the key to our model is how the eight dimensions interact and depend on one another. They must be studied as multiple, interacting components with non-linear, emergent, dynamic behavior (i.e., small changes in one aspect of the system sometimes lead to small changes in other parts of the system, but at other times to large changes) that often appears random or chaotic. This is typical of complex adaptive systems, and our model reflects these interactions.

For example, a computer-based provider order entry (CPOE) system that works successfully on an adult, surgical nursing unit within a hospital may not work at all in the nearby pediatric unit for any number of potential reasons, including: 1) hardware/software (e.g., fewer computers, older computers, poor wireless reception, poor placement); 2) content (e.g., no weight- or age-based dosing, no customized order sets or documentation templates); 3) user interface (e.g., older workforce that has trouble seeing the small font on the screen); or 4) personnel (e.g., no clinical champion within the medical staff). However, each of these dimensions has a potential relationship with one or more of the other dimensions. For instance, computers may have been few or old because of organizational limitations, customized order sets may have been missing because clinician-users could not agree on how best to construct them, and there may have been no clinical champion because the organization did not provide any incentive for the additional time this role would entail. Other reasons could include problems with the user interface and with the communication and workflow related to how nurses process new medication orders using the EHR and record administration of medications. These issues, in turn, may have been due to organizational policies and procedures. For example, the unit governance committee may have decided not to approve a request for mobile computers, with the result that nurses spent more time away from patients and therefore had a slower workflow related to processing new orders. The preceding example illustrates the interaction of six dimensions of our model: hardware/software, clinical content, user interface, people, workflow, and organizational policies. Additionally, some form of monitoring could have detected these issues. In summary, our model provides HIT researchers with several new avenues of thinking about key technology components and how these dimensions can be accounted for in future research.

The New HIT Model in Action in Real-World Settings

The following sections illustrate how we have used the socio-technical model of safe and effective HIT use within our research. In an attempt to describe how the model can be applied across the breadth of HIT research and development, and to provide examples of different systems and interventions that can be analyzed within this new paradigm, we highlight key elements of our model in the context of several recent projects.

HIT Design and Development

The design and development of CDS interventions within clinicians’ workflow presents several challenges. We conducted several qualitative studies to gain insight into the 8 dimensions of our model during the development of a CDS tool within a CPOE application. This CDS intervention was designed to alert clinicians whenever they attempted to order a medication that was contraindicated in elderly patients or one that had known serious interactions with warfarin. We used several methods, including focus groups, usability testing, and educational sessions with clinician users [ 51 ], to identify issues related to hardware/software, content, interface, people, measurement, workflow/communication, and internal policies and procedures. These efforts helped us, for example, to understand the need to meet with the organization’s Pharmacy and Therapeutics (P&T) committee (i.e., internal policy) to convince them to modify the medication formulary. They also helped us work with the information technology professional (i.e., people) who was responsible for maintaining the textual content of the alerts (i.e., the font size and the contents and order of the messages) so that it fit within the constraints of the alert notification window (i.e., user interface), which eliminated the need to train clinicians to use the horizontal scrolling capability. This is just one simple example of how use of the 8-dimensional model paid huge dividends during the development and implementation stages of this highly successful project [ 52 , 53 ].
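As a rough sketch of the kind of alert logic this intervention embodied, the following Python fragment checks a proposed order against two small content tables: drugs considered potentially inappropriate in elderly patients and drugs with serious warfarin interactions. The specific drug names, the age threshold of 65, and the message wording are assumptions for illustration only; they are not the actual rules, formulary content, or code used in the project.

```python
# Hypothetical CDS content tables; a production system would draw these from a
# maintained drug-knowledge base rather than hard-coded sets.
CONTRAINDICATED_IN_ELDERLY = {"diphenhydramine", "amitriptyline", "diazepam"}
SERIOUS_WARFARIN_INTERACTIONS = {"fluconazole", "amiodarone", "trimethoprim-sulfamethoxazole"}
ELDERLY_AGE_THRESHOLD = 65  # assumed cutoff for "elderly" in this sketch

def check_order(patient_age: int, active_meds: set[str], ordered_drug: str) -> list[str]:
    """Return the alert messages triggered by a proposed medication order."""
    drug = ordered_drug.lower()
    alerts = []
    if patient_age >= ELDERLY_AGE_THRESHOLD and drug in CONTRAINDICATED_IN_ELDERLY:
        alerts.append(f"{ordered_drug}: potentially inappropriate in patients aged "
                      f"{ELDERLY_AGE_THRESHOLD} or older; consider an alternative.")
    if "warfarin" in active_meds and drug in SERIOUS_WARFARIN_INTERACTIONS:
        alerts.append(f"{ordered_drug}: serious interaction with warfarin; "
                      "monitor INR closely or choose another agent.")
    return alerts

# Example: a 78-year-old patient on warfarin is prescribed fluconazole
for message in check_order(78, {"warfarin"}, "Fluconazole"):
    print(message)
```

Even in a rule this simple, several socio-technical dimensions are visible: the content tables must be governed and maintained (clinical content, people, internal policy), and the message text must fit within the alert notification window (user interface).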

HIT Implementation

In a recent article we described lessons that could be learned from CPOE implementation at another site [ 54 ]. One of the most important conclusions from this implementation was that problems could, and often do, occur in all 8 dimensions of the model (see Table 1 ) [ 55 ].

Table 1

Illustration of how the 8 dimensions of our socio-technical model have been used to analyze different HIT-related interventions, and how, for every dimension, other dimensions might also need to be addressed

For each dimension of the socio-technical model, the first entry below summarizes lessons learned from the implementation of computer-based provider order entry (CPOE), and the second summarizes issues identified in the follow-up of alerts related to abnormal diagnostic imaging results.

Hardware and Software
CPOE implementation: The majority of computer terminals were linked to the hospital computer system via a wireless network, and communication bandwidth was often exceeded during peak operational periods, which created additional delays between each click of the computer mouse.
Imaging alert follow-up: Alerts should be retracted when the patient dies, when the radiologist calls, or when the patient is admitted before the alert is acknowledged. However, this can be done only through a centralized organizational policy.

Clinical Content
CPOE implementation: No ICU-specific order sets were available at the time of CPOE implementation; the hurried implementation timeline established by the organization’s leaders prohibited their development.
Imaging alert follow-up: Interventions to reduce alert overload and improve the signal-to-noise ratio should be explored, and unnecessary alerts should be minimized. However, people (physicians) may not agree on which alerts are essential and which are not [ 58 ].

Human Computer Interface
CPOE implementation: Entering orders often required an average of 10 clicks of the computer mouse per order, which translated to 1 to 2 minutes to enter a single order. Organizational leaders eventually hired additional clinicians to “work the CPOE system” while others cared for the patients.
Imaging alert follow-up: Unacknowledged alerts must stay active on the EMR screen for longer periods, perhaps even indefinitely, and should require the provider’s signature and statement of action before they are allowed to drop off the screen. However, providers might not want to spend additional time stating their actions; who will make this decision?

People
CPOE implementation: Leaders at all levels of the institution made implementation decisions (regarding hardware placement, software configuration, content development, user interface design, etc.) that placed patient care in jeopardy.
Imaging alert follow-up: Many clinicians did not know how to use many of the EMR’s advanced features that greatly facilitated the processing of alerts, so training should be revamped. However, providers are given only 4 hours of training time by the institution.

Workflow and Communication
CPOE implementation: The rapid implementation timeline did not allow time for clinicians to adapt to their new routines and responsibilities. In addition, poor hardware and software design and configuration decisions complicated the workflow issues.
Imaging alert follow-up: Communicating alerts to 2 recipients, which occurred when tests were ordered by a healthcare practitioner other than the patient’s regular PCP, significantly increased the odds that the alert would not be read and would not receive timely follow-up action. No policy was available stating who is responsible for follow-up. Additionally, back-up notification is required by the institution to improve follow-up of critical test results, a Joint Commission goal.

Organizational Policies and Procedures
CPOE implementation: Order entry was not allowed until after the patient had physically arrived at the hospital and been fully registered in the clinical information system.
Imaging alert follow-up: Every institution must develop and publicize a policy regarding who is responsible (the PCP or the ordering provider, who may be a consultant) for taking action on abnormal results. Such a policy also meets external Joint Commission requirements.

External Rules, Regulations, and Pressures
CPOE implementation: Following the IOM’s report “To Err is Human: Building a Safer Health System” and subsequent congressional hearings, the issue of patient safety has risen to a position of highest priority among health care organizations.
Imaging alert follow-up: Poor reimbursement and heavy patient workloads put productivity pressure on providers. The nature of high-risk transitions between health care practitioners, settings, and systems of care makes timely and effective electronic communication particularly challenging.

System Measurement and Monitoring
CPOE implementation: Monitoring identified a significant increase in patient mortality following CPOE implementation.
Imaging alert follow-up: An audit and performance feedback system should be established to give providers regular information on timely follow-up of their patients’ test results. However, providers may not want feedback, or the institution may not have the personnel required to provide it.

HIT Use

Safe and effective use of an EHR-based notification system involves many factors that are addressed by almost all dimensions of our model [ 56 , 57 ]. This CDS system generates automated asynchronous “alerts” to notify clinicians of important clinical findings. We examined communication outcomes of over 2500 such alerts that were specifically related to abnormal test results. We found that 18.1% of abnormal lab alerts and 10.2% of abnormal imaging alerts were never acknowledged (i.e., were unread by the receiving provider). Additionally, 7–8% of these alerts lacked timely follow-up, which was unrelated to acknowledgment of the alert.
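The proportions above are straightforward ratios over the set of transmitted alerts. The sketch below shows one way acknowledgment and timely follow-up outcomes might be tallied from per-alert records; the record fields are hypothetical, and the values in the toy example are not the study data.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class AlertRecord:
    alert_id: str
    acknowledged: bool             # receiving provider opened (read) the alert
    followup_date: Optional[date]  # date of documented follow-up action, if any
    due_date: date                 # latest date still considered "timely" follow-up

def communication_outcomes(alerts: list[AlertRecord]) -> dict[str, float]:
    """Summarize acknowledgment and timely follow-up rates for a set of alerts."""
    n = len(alerts)
    unacknowledged = sum(not a.acknowledged for a in alerts)
    lacking_timely_followup = sum(
        a.followup_date is None or a.followup_date > a.due_date for a in alerts
    )
    return {
        "percent_unacknowledged": 100.0 * unacknowledged / n,
        "percent_lacking_timely_followup": 100.0 * lacking_timely_followup / n,
    }

# Toy example (not study data): one unread alert, two lacking timely follow-up
records = [
    AlertRecord("a1", True, date(2009, 1, 5), date(2009, 1, 10)),
    AlertRecord("a2", False, None, date(2009, 1, 12)),
    AlertRecord("a3", True, date(2009, 2, 1), date(2009, 1, 15)),
]
print(communication_outcomes(records))
```

Note that the two measures are computed independently, mirroring the finding that lack of timely follow-up was not explained by whether the alert had been read.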

Despite a notification system that ensured transmission of results, it was concerning that abnormal test results did not always receive timely follow-up, even when acknowledged. This study revealed complex interactions between users, the user interface, software, content, workflow/communication, and organizational policies related to who was responsible for abnormal test follow-up. Our findings thus highlighted the multiple dimensions of our model that need to be addressed to improve the safety of EHR-based notification systems and perhaps other forms of CDS (see Table 1 ) [ 59 , 60 , 61 , 62 ]. We are now applying the socio-technical model to study barriers, facilitators, and interventions for safe and effective test result notification through EHRs.

HIT Evaluation

Our model recently provided us with guidance in HIT evaluation, reminding us that however technologically sophisticated we make our patient care processes, we must also carefully monitor their impact, effectiveness, and unintended consequences. We recently evaluated why, despite implementation of an automated notification system to enhance communication of fecal occult blood test (FOBT) results, providers did not take follow-up actions in almost 40% of cases [ 63 ]. Again, our findings highlighted multiple dimensions corresponding to our socio-technical model. For instance, we found that clinician non-response to automated notifications was related to a software configuration error that prevented transmission of a subset of test results. We also found that, had the institution used different workflows related to test performance, or had its organizational procedures for computerized order entry of FOBTs been different, the problem might not have occurred. Thus, we found our multi-dimensional approach, which accounted for interactions, to be useful for comprehensive evaluation of HIT after implementation.

Conclusions

The 8 dimensions of the safe and effective HIT use model introduced in this manuscript establish a new paradigm for the study of HIT. We have successfully applied this model to study several HIT interventions at different levels of design, development, implementation, use and evaluation. We anticipate that additional study of the 8 dimensions and their complex interactions will yield further refinements to this model and, ultimately, improvements in the quality and safety of the HIT applications that translate to better health and welfare for our patients.

Acknowledgments

We thank Donna Espadas and Adol Esquivel, MD, PhD for their help creating the graphical depiction of the model. We also thank the two reviewers of this paper for their constructive criticism. This research was supported in part by the National Library of Medicine R01- LM006942 (DFS), NIH K23 career development award (K23CA125585) to HS, the VA National Center of Patient Safety (DFS, HS), Agency for Health Care Research and Quality (R18 HS17820) to HS and in part by the Houston VA HSR&D Center of Excellence (HFP90-020) (HS). These sources had no role in the preparation, review, or approval of the manuscript. We also thank Andrea Bradford, PhD for editorial assistance.

Footnotes

The views expressed in this article are those of the authors and do not necessarily represent the views of the Department of Veterans Affairs or the National Institutes of Health.

No conflicts of interest

License for Publication:

The Corresponding Author has the right to grant on behalf of all authors, and does grant on behalf of all authors, an exclusive license (or non-exclusive for government employees) on a worldwide basis to the BMJ Publishing Group Ltd to permit this article (if accepted) to be published in QSHC and any other BMJPGL products, to sublicense such use, and to exploit all subsidiary rights, as set out in our license.

Competing Interest: None declared.

References

1. Beuscart-Zéphir MC, Aarts J, Elkin P. Human factors engineering for healthcare IT clinical applications. Int J Med Inform. 2010 Feb 16; [ PubMed ] [ Google Scholar ]
2. Holden RJ, Karsh B. A theoretical model of health information technology usage behaviour with implications for patient safety. Behaviour & Information Technology. 2009; 28 :21–38. [ Google Scholar ]
3. Rogers EM. Diffusion of Innovations. 5th Edition. Free Press; 2003. p. 512. [ Google Scholar ]
4. Ash J. Organizational factors that influence information technology diffusion in academic health sciences centers. J Am Med Inform Assoc. 1997 Mar–Apr; 4 (2):102–111. [ PMC free article ] [ PubMed ] [ Google Scholar ]
5. Gosling AS, Westbrook JI, Braithwaite J. Clinical team functioning and IT innovation: a study of the diffusion of a point-of-care online evidence system. J Am Med Inform Assoc. 2003 May–Jun; 10 (3):244–251. [ PMC free article ] [ PubMed ] [ Google Scholar ]
6. Venkatesh V, Morris MG, Davis FD, Davis GB. “User Acceptance of Information Technology: Toward a Unified View,” MIS Quarterly. 2003; 27 :425–478. [ Google Scholar ]
7. Holden RJ, Karsh BT. The technology acceptance model: its past and its future in health care. J Biomed Inform. 2010 Feb; 43 (1):159–172. [ PMC free article ] [ PubMed ] [ Google Scholar ]
8. Duyck P, Pynoo B, Devolder P, Voet T, Adang L, Vercruysse J. User acceptance of a picture archiving and communication system. Applying the unified theory of acceptance and use of technology in a radiological setting. Methods Inf Med. 2008; 47 (2):149–156. [ PubMed ] [ Google Scholar ]
9. Kijsanayotin B, Pannarunothai S, Speedie SM. Factors influencing health information technology adoption in Thailand's community health centers: applying the UTAUT model. Int J Med Inform. 2009 Jun; 78 (6):404–416. [ PubMed ] [ Google Scholar ]
10. Hutchins E. Cognition in the Wild. Cambridge, MA: MIT Press; 1996. p. 401. [ Google Scholar ]
11. Hazlehurst B, McMullen C, Gorman P, Sittig D. How the ICU follows orders: care delivery as a complex activity system. AMIA Annu Symp Proc. 2003:284–288. [ PMC free article ] [ PubMed ] [ Google Scholar ]
12. Cohen T, Blatter B, Almeida C, Shortliffe E, Patel V. A cognitive blueprint of collaboration in context: distributed cognition in the psychiatric emergency department. Artif Intell Med. 2006 Jun; 37 (2):73–83. [ PubMed ] [ Google Scholar ]
13. Hazlehurst B, McMullen CK, Gorman PN. Distributed cognition in the heart room: how situation awareness arises from coordinated communications during cardiac surgery. J Biomed Inform. 2007 Oct; 40 (5):539–551. [ PubMed ] [ Google Scholar ]
14. Patel VL, Zhang J, Yoskowitz NA, Green R, Sayan OR. Translational cognition for decision support in critical care environments: a review. J Biomed Inform. 2008 Jun; 41 (3):413–431. [ PMC free article ] [ PubMed ] [ Google Scholar ]
15. Reason J. Human error: models and management. BMJ. 2000 Mar 18; 320 (7237):768–770. [ PMC free article ] [ PubMed ] [ Google Scholar ]
16. van der Sijs H, Aarts J, Vulto A, Berg M. Overriding of drug safety alerts in computerized physician order entry. J Am Med Inform Assoc. 2006 Mar–Apr; 13 (2):138–147. [ PMC free article ] [ PubMed ] [ Google Scholar ]
17. Lederman RM, Parkes C. Systems failure in hospitals--using Reason's model to predict problems in a prescribing information system. J Med Syst. 2005 Feb; 29 (1):33–43. [ PubMed ] [ Google Scholar ]
18. Norman D. The Psychology of Everyday Things. New York: Basic Books; 1988. [ Google Scholar ]
19. Malhotra S, Jordan D, Shortliffe E, Patel VL. Workflow modeling in critical care: piecing together your own puzzle. J Biomed Inform. 2007 Apr; 40 (2):81–92. [ PubMed ] [ Google Scholar ]
20. Sheehan B, Kaufman D, Stetson P, Currie LM. Cognitive analysis of decision support for antibiotic prescribing at the point of ordering in a neonatal intensive care unit. AMIA Annu Symp Proc. 2009 Nov 14; 2009 :584–588. [ PMC free article ] [ PubMed ] [ Google Scholar ]
21. Henriksen K, Kaye R, Morisseau D. Industrial ergonomic factors in the radiation oncology therapy environment. In: Nielsen R, Jorgensen K, editors. Advances in industrial ergonomics and safety V. Washington, DC: Taylor and Francis; 1993. pp. 325–335. [ Google Scholar ]
22. Vincent C, Taylor-Adams S, Stanhope N. Framework for analysing risk and safety in clinical medicine. BMJ. 1998 Apr 11; 316 (7138):1154–1157. [ PMC free article ] [ PubMed ] [ Google Scholar ]
23. Carayon P, Schoofs Hundt A, Karsh BT, Gurses AP, Alvarado CJ, Smith M, Flatley Brennan P. Work system design for patient safety: the SEIPS model. Qual Saf Health Care. 2006 Dec; 15 Suppl 1:i50–i58. [ PMC free article ] [ PubMed ] [ Google Scholar ]
24. Harrison MI, Koppel R, Bar-Lev S. Unintended consequences of information technologies in health care--an interactive sociotechnical analysis. J Am Med Inform Assoc. 2007 Sep–Oct; 14 (5):542–549. [ PMC free article ] [ PubMed ] [ Google Scholar ]
25. Rector AL. Clinical terminology: why is it so hard? Methods Inf Med. 1999 Dec; 38 (4–5):239–252. [ PubMed ] [ Google Scholar ]
26. Rosenbloom ST, Miller RA, Johnson KB, Elkin PL, Brown SH. Interface terminologies: facilitating direct entry of clinical data into electronic health record systems. J Am Med Inform Assoc. 2006 May–Jun; 13 (3):277–288. [ PMC free article ] [ PubMed ] [ Google Scholar ]
27. Wright A, Sittig DF, Ash JS, Bates DW, Fraser G, Maviglia SM, McMullen C, Nicol WP, Pang JE, Starmer J, Middleton B. Governance for Clinical Decision Support: Case Studies and Best Practices of Exemplary Institutions. J Amer Med Inform Assoc. 2010 (under review) [ PMC free article ] [ PubMed ] [ Google Scholar ]
28. Sittig DF, Campbell EM, Guappone KP, Dykstra RH, Ash JS. Recommendations for Monitoring and Evaluation of In-Patient Computer-based Provider Order Entry Systems: Results of a Delphi Survey. Proc. Amer Med Informatics Assoc Fall Symposium. 2007:671–675. [ PMC free article ] [ PubMed ] [ Google Scholar ]
29. Sittig DF, Simonaitis L, Carpenter JD, Allen GO, Doebbeling BN, Sirajuddin AM, Ash SJ, Middleton B. The state of the art in clinical knowledge management: An inventory of tools and techniques. Int J Med Inform. 2010 Jan; 79 (1):44–57. [ PMC free article ] [ PubMed ] [ Google Scholar ]
30. Hripcsak G. Monitoring the monitor: automated statistical tracking of a clinical event monitor. Comput Biomed Res. 1993 Oct; 26 (5):449–466. [ PubMed ] [ Google Scholar ]
31. Rasmussen J. Risk management in a dynamic society: a modelling problem. Safety Science. 1997; 27 (2):183–213. [ Google Scholar ]
32. Greenhalgh T, Stramer K, Bratan T, Byrne E, Russell J, Potts HW. Adoption and non-adoption of a shared electronic summary record in England: a mixed-method case study. BMJ. 2010 Jun 16; 340 :c3111. [ PubMed ] [ Google Scholar ]
33. Leveson NG, Turner CS. An Investigation of the Therac-25 Accidents. IEEE Computer. 1993; 26 (7):18–41. Updated version available at: http://sunnyday.mit.edu/papers/therac.pdf . [ Google Scholar ]
34. Kilbridge P. Computer crash--lessons from a system failure. N Engl J Med. 2003 Mar 6; 348 (10):881–882. [ PubMed ] [ Google Scholar ]
35. Bernstam EV, Smith JW, Johnson TR. What is biomedical informatics? J Biomed Inform. 2010 Feb; 43 (1):104–110. [ PMC free article ] [ PubMed ] [ Google Scholar ]
36. Sittig DF, Wright A, Simonaitis L, Carpenter JD, Allen GO, Doebbeling BN, Sirajuddin AM, Ash JS, Middleton B. The state of the art in clinical knowledge management: an inventory of tools and techniques. Int J Med Inform. 2010 Jan; 79 (1):44–57. [ PMC free article ] [ PubMed ] [ Google Scholar ]
37. Shneiderman B, Plaisant C, Cohen M, Jacobs S. Designing the User Interface: Strategies for Effective Human-Computer Interaction. 5th ed. Pearson Education; 2009. p. 672. [ Google Scholar ]
38. Svanæs D, Alsos OA, Dahl Y. Usability testing of mobile ICT for clinical settings: Methodological and practical challenges. Int J Med Inform. 2008 Sep 10; [ PubMed ] [ Google Scholar ]
39. Sittig DF, Krall M, Kaalaas-Sittig J, Ash JS. Emotional aspects of computer-based provider order entry: a qualitative study. J Am Med Inform Assoc. 2005 Sep–Oct; 12 (5):561–567. [ PMC free article ] [ PubMed ] [ Google Scholar ]
40. Henriksen K, Joseph A, Zayas-Caban T. The Human Factors of Home Health Care: A Conceptual Model for Examining Safety and Quality Concerns. J Patient Safety. 2009 December; 5 (4) [ PubMed ] [ Google Scholar ]
41. American Recovery and Reinvestment Act of 2009, State Grants to Promote Health Information Technology Planning and Implementation Projects. Available at: https://www.grantsolutions.gov/gs/preaward/previewPublicAnnouncement.do?id=10534 .
42. Sittig DF. Personal health records on the internet: a snapshot of the pioneers at the end of the 20th Century. Int J Med Inform. 2002 Apr; 65 (1):1–6. [ PubMed ] [ Google Scholar ]
43. Detmer DE, Munger BS, Lehmann CU. Medical Informatics Board Certification: History, Current Status, and Predicted Impact on the Medical Informatics Workforce. Applied Clinical Informatics. 2010; 1 (1):11–18. Available: http://www.schattauer.de/nc/en/magazine/subject-areas/journals-a-z/applied-clinical-informatics/issue/special/manuscript/12624/download.html . [ PMC free article ] [ PubMed ] [ Google Scholar ]
44. Leonard KJ, Sittig DF. Improving information technology adoption and implementation through the identification of appropriate benefits: creating IMPROVE-IT. J Med Internet Res. 2007 May 4; 9 (2):e9. [ PMC free article ] [ PubMed ] [ Google Scholar ]
45. Ash JS, Berg M, Coiera E. Some unintended consequences of information technology in health care: the nature of patient care information system-related errors. J Am Med Inform Assoc. 2004 Mar–Apr; 11 (2):104–112. [ PMC free article ] [ PubMed ] [ Google Scholar ]
46. Bradshaw KE, Sittig DF, Gardner RM, Pryor TA, Budd M. Computer-based data entry for nurses in the ICU. MD Comput. 1989 Sep–Oct; 6 (5):274–280. [ PubMed ] [ Google Scholar ]
47. Sittig DF, Shiffman RN, Leonard K, Friedman C, Rudolph B, Hripcsak G, Adams LL, Kleinman LC, Kaushal R. A draft framework for measuring progress towards the development of a National Health Information Infrastructure. BMC Med Inform Decis Mak. 2005 Jun 13; 5 :14. [ PMC free article ] [ PubMed ] [ Google Scholar ]
48. Sittig DF, Classen DC. Safe electronic health record use requires a comprehensive monitoring and evaluation framework. JAMA. 2010 Feb 3; 303 (5):450–451. [ PMC free article ] [ PubMed ] [ Google Scholar ]
49. Begun JW, Zimmerman B, Dooley K. Health Care Organizations as Complex Adaptive Systems. In: Mick SM, Wyttenbach M, editors. Advances in Health Care Organization Theory. San Francisco: Jossey-Bass; 2003. pp. 253–288. [ Google Scholar ]
50. Rouse WB. Health Care as a Complex Adaptive System: Implications for Design and Management. The Bridge. 2008 Spring:17–25. [ Google Scholar ]
51. Feldstein A, Simon SR, Schneider J, Krall M, Laferriere D, Smith DH, Sittig DF, Soumerai SB. How to design computerized alerts to safe prescribing practices. Jt Comm J Qual Saf. 2004 Nov; 30 (11):602–613. [ PubMed ] [ Google Scholar ]
52. Feldstein AC, Smith DH, Perrin N, Yang X, Simon SR, Krall M, Sittig DF, Ditmer D, Platt R, Soumerai SB. Reducing warfarin medication interactions: an interrupted time series evaluation. Arch Intern Med. 2006 May 8; 166 (9):1009–1015. [ PubMed ] [ Google Scholar ]
53. Smith DH, Perrin N, Feldstein A, Yang X, Kuang D, Simon SR, Sittig DF, Platt R, Soumerai SB. The impact of prescribing safety alerts for elderly persons in an electronic medical record: an interrupted time series evaluation. Arch Intern Med. 2006 May 22; 166 (10):1098–1104. [ PubMed ] [ Google Scholar ]
54. Sittig DF, Ash JS, Zhang J, Osheroff JA, Shabot MM. Lessons from "Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system". Pediatrics. 2006 Aug; 118 (2):797–801. [ PubMed ] [ Google Scholar ]
55. Sittig DF, Ash JS. Clinical information systems: Overcoming adverse consequences. Sudbury, MA: Jones and Bartlett; 2010. [ Google Scholar ]
56. Singh H, Thomas EJ, Mani S, Sittig D, Arora H, Espadas D, Khan MM, Petersen LA. Timely follow-up of abnormal diagnostic imaging test results in an outpatient setting: are electronic medical records achieving their potential? Arch Intern Med. 2009 Sep 28; 169 (17):1578–1586. [ PMC free article ] [ PubMed ] [ Google Scholar ]
57. Singh H, Thomas EJ, Sittig DF, Wilson L, Espadas D, Khan MM, Petersen LA. Notification of Abnormal Laboratory Test Results in an Electronic Medical Record: Do Any Safety Concerns Remain? Am J Med. 2010 Mar; 123 (3):238–244. [ PMC free article ] [ PubMed ] [ Google Scholar ]
58. van der Sijs H, Aarts J, van Gelder T, Berg M, Vulto A. Turning off frequently overridden drug alerts: limited opportunities for doing it safely. J Am Med Inform Assoc. 2008 Jul–Aug; 15 (4):439–448. [ PMC free article ] [ PubMed ] [ Google Scholar ]
59. Hysong SJ, Sawhney MK, Wilson L, Sittig DF, Esquivel A, Watford M, Davis T, Espadas D, Singh H. Improving outpatient safety through effective electronic communication: A study protocol. Implement Sci. 2009 Sep 25; 4 (1):62. PMID: 19781075. [ PMC free article ] [ PubMed ] [ Google Scholar ]
60. Hysong SJ, Sawhney MK, Wilson L, Sittig DF, Espadas D, Davis TL, Singh H. Provider Management Strategies of Abnormal Test Result Alerts: A Cognitive Task Analysis. J Am Med Inform Assoc. 2010; 17 :71–77. PMID: 20064805. [ PMC free article ] [ PubMed ] [ Google Scholar ]
61. Singh H, Wilson L, Reis B, Sawhney MK, Espadas D, Sittig DF. Ten Strategies to Improve Management of Abnormal Test Result Alerts in the Electronic Health Record. Journal of Patient Safety. 2010 Jun; 6 (2):121–123. In press. [ PMC free article ] [ PubMed ] [ Google Scholar ]
62. Singh H, Vij M. Eight Recommendations for Policies for Communication of Abnormal Test Results. Joint Commission Journal on Quality and Patient Safety. 2010 In press. [ PubMed ] [ Google Scholar ]
63. Singh H, Wilson L, Petersen LA, Sawhney MK, Reis B, Espadas D, Sittig DF. Improving Follow-up of Abnormal Cancer Screens using Electronic Health Records: Trust but Verify Test Result Communication. BMC Med Inform Decis Mak. 2009 Dec 9; 9 :49. [ PMC free article ] [ PubMed ] [ Google Scholar ]