Canadian Journal of Nursing Informatics

Published 20 December 2019, Volume 14, No. 4 (2019).


Informatics in Nursing Education: What Do We Do Next?


by Andrea Knox, RN, BSN, CON(c)
Senior Practice Leader – Kelowna
BC Cancer Professional Practice Nursing


Abstract

Information and communication technologies (ICTs) continue to grow and affect how health care professionals work. The utilization of ICTs in nursing practice is associated with increased efficiency, safety, and quality in the delivery of patient care (Canadian Nursing Informatics Association-Canadian Nurses Association, 2017). Nursing informatics has emerged to support the profession's involvement in designing, implementing and evaluating the adoption of ICTs, necessitating the integration of informatics content into nursing education curricula (Canadian Association of Schools of Nursing, 2015; Canadian Nurses Association, 2006; Canadian Nurses Association, 2015; Canadian Nursing Informatics Association-Canadian Nurses Association, 2017). The purpose of this paper was to explore the current literature and identify existing resources related to the development and assessment of nursing informatics competencies. A review of the current literature was undertaken to answer the question: What evidence-based resources are available to nurse educators to assess the development of nursing informatics competencies? The review revealed that nurse educators have a variety of validated competency frameworks and assessment tools at their fingertips to support the development of these skills in students. This article presents the findings of the evidence review and provides recommendations on competency assessment tools for use by nursing educators in the Canadian context.

Keywords:

Nursing, education, informatics, competencies, competency assessment

Introduction

Technology is everywhere, touching almost every part of our personal and professional lives. Society has moved from a world dominated by paper and face-to-face interactions to an online presence and way of communicating that is in a constant state of evolution. The progressive nature of technology in healthcare has seen widespread adoption of a variety of information and communication technologies to improve the efficiency, safety and quality of patient care (Canadian Nursing Informatics Association-Canadian Nurses Association, 2017). Information and communication technologies (ICTs) encompass a wide variety of components such as computers, smart phones, intranet platforms, decision support tools, email and electronic health records (Canadian Association of Schools of Nursing, 2015; Canadian Nurses Association, 2006). These technologies continue to grow and affect how health care professionals work, from the simple functions of email communication to fully integrated closed-loop medication management systems with bar-code administration (Adu, 2017). Nurses are on the front line of this technological wave and are constantly being challenged to learn and integrate ICTs into practice (Canadian Association of Schools of Nursing, 2015; Canadian Nurses Association, 2006; Canadian Nurses Association, 2015; Canadian Nursing Informatics Association-Canadian Nurses Association, 2017).

Background

It has been reported that the utilization of ICTs in nursing practice is associated with increased efficiency, safety, and quality in the delivery of patient care (Canadian Nursing Informatics Association-Canadian Nurses Association, 2017), making it an important aspect of nursing practice to support. Nurses are also increasingly being called upon to participate in the design, implementation and evaluation of new technologies in their organizations (Canadian Association of Schools of Nursing, 2015; Canadian Nurses Association, 2006; Canadian Nurses Association, 2015; Canadian Nursing Informatics Association-Canadian Nurses Association, 2017). The Clinical & Systems Transformation (CST) project, a partnership between three health authorities in British Columbia working towards implementing a new electronic health record, is an example of a systems-level project that requires nursing input at all phases (Clinical & Systems Transformation, 2018). As a result of projects like CST, the field of nursing informatics has emerged to support the profession's involvement in designing, implementing and evaluating the adoption of ICTs (Canadian Association of Schools of Nursing, 2015; Canadian Nursing Informatics Association-Canadian Nurses Association, 2017). Nursing schools have been charged with integrating nursing informatics content into curricula to support skill development in the utilization and adoption of ICTs in clinical practice (Canadian Association of Schools of Nursing, 2015; Canadian Nurses Association, 2006; Canadian Nurses Association, 2015; Canadian Nursing Informatics Association-Canadian Nurses Association, 2017). The purpose of this paper is to explore the current literature and identify existing resources related to the development and assessment of nursing informatics competencies.

Research Question

With the emergence of nursing informatics as a specialty area of nursing practice (Canadian Nursing Informatics Association-Canadian Nurses Association, 2017) and the trend to incorporate informatics content into nursing curricula across the country, the identification and assessment of nursing informatics competencies also requires attention (Canadian Association of Schools of Nursing, 2015; Canadian Nurses Association, 2006; Canadian Nurses Association, 2015; Canadian Nursing Informatics Association-Canadian Nurses Association, 2017). The 2006 Canadian Nurses Association (CNA) E-Nursing Strategy identified that “two-thirds of the schools have a curriculum vision or design that includes nursing informatics competencies but that do not have explicit outcomes” (p. 22). This statement by CNA implies an informal approach to integrating informatics into undergraduate curricula, without clearly identifying specific informatics objectives or competency measurement criteria. In 2015, the Canadian Association of Schools of Nursing (CASN) released entry-to-practice competencies for nursing informatics with the goal of promoting “the development of a culture within nursing education in Canada that embraces the integration of nursing informatics in curricula and professional practice” (p. 1). It is notable, however, that nine years elapsed between the release of the CNA E-Nursing Strategy and the CASN entry-to-practice competencies, with no mention in either document of strategies to assess the development of nursing informatics competencies against the recommendations proposed (Canadian Association of Schools of Nursing, 2015; Canadian Nurses Association, 2006). To address this gap, a review of the current literature was undertaken to answer the question: What evidence-based resources are available to nurse educators to assess the development of nursing informatics competencies?

Methods

Once the research question was created, the MEDLINE, CINAHL and PubMed databases were searched for relevant literature under the concepts of nursing education and nursing informatics competencies. Inclusion criteria for the literature search were limited in all three databases to full-text, peer-reviewed articles, available in English, published between 2014 and 2018. A brief, high-level manual search of the literature on nursing education and informatics in Canada was also completed using Google Scholar to locate any additional relevant articles. It should be acknowledged that the limits placed on the literature search, such as English-only and full-text, may have inadvertently excluded some studies of relevance to this review; however, the time and cost associated with translating non-English articles were beyond the scope of this evidence review. The exclusion of literature published prior to 2014 may likewise have removed earlier research relevant to the question posed, although the time frame selected is appropriate for an evidence review. To mitigate possible limitations of these inclusion and exclusion criteria, a mix of quantitative, qualitative and mixed-method study designs was included to ensure a comprehensive capture of the current evidence on nursing informatics competencies in nursing education.
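
As an illustration (the exact search strings used were not recorded in this paper, so the terms below are assumptions), a query of the general form used for this review might be:

("nursing informatics" OR "informatics competenc*") AND ("nursing education" OR "nursing student*")

with database limits applied for English language, full text, peer review, and publication between 2014 and 2018.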

Results

From the database search, 28 articles were identified with a further 10 from the manual literature search. A final total of 32 articles were identified for preliminary screening after compilation of the database and manual searches with removal of six duplicates (Figure 1). Initial screening by title and abstract review resulted in the exclusion of 17 articles. A full-text review was then completed to screen the remaining 15 articles for studies that focused on nursing informatics competencies in nursing education. This resulted in the exclusion of an additional two articles that entailed process descriptions for completing a nursing curriculum revision. A PRISMA diagram was created (Figure 1) to delineate the identification, screening, eligibility and inclusion process.
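
For clarity, the counts in Figure 1 reconcile as follows: 28 database records plus 10 manual records gives 38 identified; removing the six duplicates leaves 32 for preliminary screening; excluding 17 on title and abstract leaves 15 for full-text review; and excluding two more yields the final 13 included studies.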

Figure 1: PRISMA Diagram
Adapted from: Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & The PRISMA Group (2009). Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. PLoS Medicine, 6(6), e1000097. doi:10.1371/journal.pmed.1000097

The final group consisted of 13 studies that used quantitative, qualitative or mixed-method research designs. Data were then extracted from the eligible studies and organized into a summary table (Table 1). The data extracted included study author(s), year of publication, location, purpose, research design, sample, instruments and methodology, findings and limitations.

Quality Assessment

After full-text review and initial data extraction, each study was evaluated for quality using either the COnsolidated criteria for REporting Qualitative research (COREQ; Tong, Sainsbury, & Craig, 2007), the Assessing Quantitative Studies checklist (Pesut & Taylor, 2018), or the overview of mixed-methods research designs provided by Andrew (2009). Quality ranged from low to moderate across the studies, with an overall average of moderate for the grouping. All the studies were published in peer-reviewed journals, with limitations identified by the authors. One common limitation noted was a small sample size, which limits the generalizability of the findings reported.

Quantitative Studies

Five of the quantitative studies had a cross-sectional design, while the sixth used a comparative case study approach with pre- and post-test data collection (Abdrbo, 2015; Bryant, Whitehead, & Kleier, 2016; Kleib & Nagle, 2018a; Kleib & Nagle, 2018b; Kleib & Nagle, 2018c; Oh, Kim, Kim, & Vasuki, 2017). Overall, the group of six was placed at the lower end of the hierarchy of evidence outlined by Evans (2003). Each study was reviewed for quality using the Assessing Quantitative Studies checklist (Pesut & Taylor, 2018) as the guiding framework for reflection. Across the quantitative group there was a mix of openly acknowledged and inferred potential biases, including recruitment, selection, and performance bias (Abdrbo, 2015; Bryant, et al., 2016; Kleib & Nagle, 2018a; Kleib & Nagle, 2018b; Kleib & Nagle, 2018c; Oh, et al., 2017). These biases stemmed largely from the convenience sampling inherent in the recruitment strategies employed, which could have influenced participant responses given the subjective nature of the data collected. Overall, the findings on the competency measurement tools assessed were consistent across all the quantitative studies reviewed. However, due to the wide variety of measurement tools assessed and the differences in data collection, analysis and reporting methods, a direct metric-to-metric comparison of the competency tools was not possible across this range of studies.

Qualitative Studies

Two qualitative cohort studies were reviewed using the COREQ checklist (Tong et al., 2007) as a guide. Neither study explicitly outlined criteria for meeting credibility, dependability, confirmability and transferability in qualitative research (Morse, 2015), nor did they identify an underlying theoretical framework (Achampong, 2017; Choi, Park, & Lee, 2016). A moderate level of reflexivity was present in the study by Choi, Park and Lee (2016), while it was notably absent in the study by Achampong (2017). Potential issues with recruitment and performance bias were identified in both studies, given the purposive recruitment process and participants' awareness that their informatics competencies were being evaluated (Achampong, 2017; Choi, et al., 2016). Implications of these biases included potential influence on participant responses, the subjective nature of the data collection, and the limited generalizability of the results due to the recruitment strategies and sample sizes of the studies.

Mixed Method Studies

The remaining five mixed-method studies were also reviewed for quality. Each study used a cross-sectional approach within a sequential explanatory design (Andrew, 2009) that employed an electronic survey, workshops and case study review for data collection and analysis (Egbert, et al., 2016; Hübner, et al., 2018; Hübner, et al., 2016; Nagle, et al., 2014; Ronquillo, et al., 2017). Potential for selection bias was present in all studies due to the use of convenience sampling strategies (Egbert, et al., 2016; Hübner, et al., 2018; Hübner, et al., 2016; Nagle, et al., 2014; Ronquillo, Topaz, Pruinelli, Peltonen, & Nibber, 2017). The sample for Ronquillo et al. (2017) came from a variety of countries, which required translation of both the survey and the data collected; this introduced potential interpretation bias, as translated responses may have lost meaning that could have influenced the thematic analysis (Ronquillo, et al., 2017). None of the mixed-methods studies reviewed identified an overarching paradigm or theoretical underpinning (Egbert, et al., 2016; Hübner, et al., 2018; Hübner, et al., 2016; Nagle, et al., 2014; Ronquillo, et al., 2017). As a result, it is difficult to fully determine reliability and validity in the studies reviewed, as rigour in mixed-methods research is determined by identifying methodological congruency with the paradigmatic underpinnings (Andrew, 2009). It is noted, however, that each of the five studies exhibited congruency between the data collection and analysis methods in the absence of an explicit theoretical stance (Egbert, et al., 2016; Hübner, et al., 2018; Hübner, et al., 2016; Nagle, et al., 2014; Ronquillo, et al., 2017). The two studies by Hübner et al. demonstrated the greatest congruency overall, indicating a moderate level of reliability and validity in their findings (Hübner, et al., 2018; Hübner, et al., 2016).

Study Characteristics

In reviewing the general study characteristics, it was noted that the main purpose for all studies was to identify the efficacy and validity of resources for measuring informatics competencies in nursing (Achampong, 2017; Abdrbo, 2015; Bryant, et al., 2016; Choi, et al., 2016; Egbert, et al., 2016; Hübner, et al., 2018; Hübner, et al., 2016; Kleib & Nagle, 2018a; Kleib & Nagle, 2018b; Kleib & Nagle, 2018c; Nagle, et al., 2014; Oh, et al., 2017; Ronquillo, et al., 2017). Population demographics focused on nurses across a range of work experience from a variety of undergraduate and clinical settings. Four of the thirteen studies were identified as having an international focus, including research team members and study participants from multiple countries (Egbert, et al., 2016; Hübner, et al., 2018; Hübner, et al., 2016; Ronquillo, et al., 2017). Five additional studies took place in a variety of countries outside of Canada (Achampong, 2017; Abdrbo, 2015; Bryant, et al., 2016; Choi, et al., 2016; Oh, et al., 2017), with the remaining four focused exclusively on measuring informatics competencies in the Canadian context (Kleib & Nagle, 2018a; Kleib & Nagle, 2018b; Kleib & Nagle, 2018c; Nagle, et al., 2014).

Instruments and Sampling

Sample sizes in the quantitative group ranged from 43 to 2844 participants, with a mean of 1296 (Abdrbo, 2015; Bryant, et al., 2016; Kleib & Nagle, 2018a; Kleib & Nagle, 2018b; Kleib & Nagle, 2018c; Oh, et al., 2017). A smaller range of 18 to 59 participants, with a mean sample size of 26, was present in the qualitative grouping (Achampong, 2017; Choi, et al., 2016), and the mixed-method study samples ranged from 43 to 272 with a mean of 121 (Egbert, et al., 2016; Hübner, et al., 2018; Hübner, et al., 2016; Nagle, et al., 2014; Ronquillo, et al., 2017). All the studies in the quantitative group utilized surveys in the data collection process (Abdrbo, 2015; Bryant, et al., 2016; Kleib & Nagle, 2018a; Kleib & Nagle, 2018b; Kleib & Nagle, 2018c; Oh, et al., 2017). One quantitative study used a previously validated tool, the Self-Assessment of Nursing Informatics Competencies (SANICS) tool (Abdrbo, 2015, p. 510), to assess informatics competencies in relation to patient safety competencies in nursing students. Four studies sought to validate new competency assessment tools, such as the Canadian Nurse Informatics Competency Assessment Scale (C-NICAS) (Kleib & Nagle, 2018a, p. 351; Kleib & Nagle, 2018b, p. 407; Kleib & Nagle, 2018c, p. 360) and the Knowledge, Skills, and Attitudes towards Nursing Informatics (KSANI) scale (Bryant, et al., 2016, p. 1). Oh, et al. (2017) used Kirkpatrick's evaluation model to assess the self-perceived informatics competencies of nursing students.

Three of the mixed-method studies sought to validate an international framework of core informatics competencies for nurses, the Technology Informatics Guiding Education Reform (TIGER) framework (Hübner, et al., 2018, p. e30; Hübner, et al., 2016, p. 656; Ronquillo, et al., 2017, p. 130), while the other two studies focused on developing informatics competency frameworks at the national level (Egbert, et al., 2016; Nagle, et al., 2014). The variation in data collection instruments across the quantitative and mixed-method studies made an in-depth comparative analysis impossible within the time constraints of this evidence review. The two qualitative studies and three of the mixed-method studies utilized semi-structured focus groups for data collection with thematic content analysis (Achampong, 2017; Choi, et al., 2016; Hübner, et al., 2018; Hübner, et al., 2016; Nagle, et al., 2014). The two mixed-method studies by Egbert et al. and Ronquillo et al. employed an iterative data collection and analysis process in conjunction with the survey to guide the development of recommendations on informatics competencies (Egbert, et al., 2016; Ronquillo, et al., 2017).

Findings

After reviewing the literature, two main themes were identified related to nursing informatics competencies and education: “Nursing Informatics Competency Framework” and “Competency Assessment”. Two sub-themes, development and validation, were identified under the competency framework theme. Under the competency assessment theme, four approaches were identified: semi-structured interviews, international survey tools, Canadian survey tools, and the Kirkpatrick evaluation model.

Nursing Informatics Competency Framework – Development

A theme concerning the development of nursing informatics competency frameworks was noted in the literature reviewed (Egbert, et al., 2016; Nagle, et al., 2014). The studies by Egbert et al. (2016) and Nagle et al. (2014) employed a three-stage iterative mixed methodology to develop nursing informatics competency frameworks in Europe and Canada, respectively. The approach taken by Egbert et al. included eliciting recommended competencies from nursing informatics experts, completing a literature review to confirm and build on the expert recommendations, and then validating the list of competencies generated using a two-stage online survey in combination with focus group workshops. The focus groups identified two additional domains to be added to the initial list, while the validation surveys had high response rates of 72.5% and 67.5% (Egbert, et al., 2016).

Nagle et al. (2014) employed a similar approach, with the difference that the initial CASN entry-to-practice competency list was generated from the literature rather than expert opinion. Once the initial list was created, a focus group workshop was held with experts to build consensus and uncover key recommendations for next steps in the competency development process (Nagle, et al., 2014). This was followed by revisions and a second review by stakeholders, using an online survey, to refine the competency list (Nagle, et al., 2014). Results revealed that 83.3% of those surveyed agreed with the inclusion of the competencies in the framework (Nagle, et al., 2014). In both cases, the nursing informatics competency frameworks were developed based on the current literature and expert recommendations, with iterative review and validation through an online survey (Egbert, et al., 2016; Nagle, et al., 2014).

Nursing Informatics Competency Framework – Validation

The second theme related to nursing informatics competency frameworks centered on the validation of existing frameworks, as presented in the three remaining mixed-methods studies (Hübner, et al., 2018; Hübner, et al., 2016; Ronquillo, et al., 2017). The two studies by Hübner et al. (2018; 2016) were subsets of the larger Technology Informatics Guiding Education Reform (TIGER) Competency Synthesis Project. The aim of both studies included in this evidence review was to empirically define and validate a globally accepted framework of core health informatics competencies. One study focused on the 24 recommended core competencies defined for five major nursing roles (Hübner, et al., 2018), while the other expanded to include the development and validation of eight exemplar case studies, selected from each participating country to purposefully mirror different professional cultures (Hübner, et al., 2016).

Ronquillo et al. (2017) utilized an internationally focused electronic questionnaire based on the nursing informatics literature to assess current trends. Three recommendations specific to informatics education were identified: informatics content should span all levels of nursing education; competency requirements for advanced nursing informatics should be developed; and nursing informatics competencies for faculty need to be developed to support the facilitation of informatics content in nursing curricula (Ronquillo, et al., 2017). Ronquillo et al. (2017) identified that the results of their survey align with the recommendations from the TIGER Competency Synthesis Project (Hübner, et al., 2018; Hübner, et al., 2016). All three studies utilized surveys to validate the informatics competency frameworks, with Ronquillo et al. (2017) highlighting the alignment of their work with previously validated frameworks (Hübner, et al., 2018; Hübner, et al., 2016). Given the rigorous approaches employed across the studies, it is argued that the nursing informatics competency frameworks included in this evidence review demonstrate validity. As such, these frameworks can be used by educators to inform the integration of validated informatics competencies into undergraduate curricula (Egbert, et al., 2016; Nagle, et al., 2014).

Competency Assessment – Semi-Structured Interviews

The second main theme identified related to nursing informatics competency assessment tools and techniques. Two qualitative studies employed semi-structured interviews as the primary tool to assess participants' perceptions of their informatics competencies. Achampong's (2017) study on the perspectives of educators found that informatics-specific content was introduced too early in the curriculum and that educators' skill in teaching informatics was lacking, limiting the program's ability to properly support competency development in students. Choi, et al. (2016) elicited three major themes related to building informatics competencies in students: an Electronic Medical Record (EMR) can be used as an academic learning tool; an academic EMR provides students opportunities to build skills with essential functions; and incorporation of an academic EMR supports desired nursing informatics outcomes such as enhanced critical thinking and informatics competencies.

While the findings of both studies appear to identify strengths and weaknesses related to informatics competencies using the semi-structured interview method, neither shared the questions used to guide the interviews nor indicated the supporting resources used to develop them (Achampong, 2017; Choi, et al., 2016). This makes it impossible to ascertain whether the questions were based on any known nursing informatics competency recommendations, such as those developed by CASN (Canadian Association of Schools of Nursing, 2015). Semi-structured interviews could be employed by educators to assess nursing informatics competencies; however, the research examined for this review does not identify or demonstrate the existence of any evidence-based tools to support such an approach at this time.

Competency Assessment – International Survey Tools

The six quantitative studies reviewed sought to further validate an approach to measuring informatics competency (Abdrbo, 2015; Bryant, et al., 2016; Kleib & Nagle, 2018a; Kleib & Nagle, 2018b; Kleib & Nagle, 2018c; Oh, et al., 2017). In the study by Abdrbo (2015), the Self-Assessment of Nursing Informatics Competencies (SANICS) tool was used in conjunction with the Patient Safety Competencies Self-Evaluation (PSCAE) tool to assess the relationship between nursing informatics competencies and patient safety competencies. The SANICS tool had previously been found to have internal consistency and reliability, and its use by Abdrbo (2015) demonstrated continued reliability with a Cronbach's alpha of 0.9. Overall, Abdrbo (2015) demonstrated a correlation between informatics competencies and patient safety competencies, as those who took the informatics course demonstrated higher patient safety knowledge and skill than those who did not.
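
For readers less familiar with the statistic, Cronbach's alpha for a scale of k items is

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} s_i^2}{s_X^2}\right)

where s_i^2 is the variance of the i-th item and s_X^2 is the variance of respondents' total scores. Values approaching 1, such as the 0.9 reported by Abdrbo (2015), indicate that the scale's items covary strongly and thus exhibit high internal consistency.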

The Knowledge, Skills, and Attitudes towards Nursing Informatics (KSANI) tool is a 24-item competency assessment tool developed and tested with attendees at the Florida Nursing Students Association conference (Bryant, et al., 2016). While it could be argued that the sample was too selective to allow for generalizability of the results, the findings indicated the tool demonstrated high internal consistency and reliability across four factors: educational opportunity to apply informatics, knowledge of informatics, informatics skills confidence, and attitude toward informatics.

Competency Assessment – Canadian Survey Tools

The three studies by Kleib and Nagle (2018a, b, c) reported on various aspects of the Canadian Nurse Informatics Competency Assessment Scale (C-NICAS), which was developed based on the CASN competencies. The first study identified priority areas for informatics education for practising Alberta nurses and found that self-perceived informatics competencies rated slightly above competent (Kleib & Nagle, 2018a). This study supported the face and content validity of the C-NICAS instrument, with a Cronbach's alpha of .926. The second study by Kleib and Nagle (2018b) employed a variety of descriptive statistical and multiple regression analysis techniques to further assess C-NICAS responses. Results of the analysis revealed that overall scores varied with age, educational qualifications, work experience and work setting. The last study by Kleib and Nagle (2018c) focused on examining the factor structure, internal reliability and consistency of the C-NICAS. Kleib and Nagle (2018c) assessed the sample size and strength of C-NICAS items to determine suitability and factorability, including statistical psychometric analysis and calculation of internal reliability. The component analysis revealed a four-component structure for the 21-item C-NICAS, with high Cronbach's alphas across the subscales of foundational skills, knowledge and information management, professional and regulatory accountability, and ICT use (Kleib & Nagle, 2018c). Overall, the findings of the third study by Kleib and Nagle provided preliminary evidence of the reliability of the C-NICAS as well as the construct validity of the CASN entry-to-practice informatics competencies (Canadian Association of Schools of Nursing, 2015).
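
As a rough sketch of how a factor structure like the one reported for the C-NICAS can be examined (the code below is illustrative only: the data are simulated, not the C-NICAS responses, and the exact analysis pipeline used by Kleib and Nagle (2018c) is not published in their article), a principal component analysis over standardized item scores might proceed as follows:

import numpy as np
from sklearn.decomposition import PCA

# Simulated stand-in data: 200 hypothetical respondents answering
# 21 Likert items (scored 1-5), matching the 21-item C-NICAS length.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(200, 21)).astype(float)

# Standardize each item so the PCA is effectively run on the correlation
# matrix, which makes the Kaiser "eigenvalue > 1" retention rule meaningful.
z = (responses - responses.mean(axis=0)) / responses.std(axis=0, ddof=1)

pca = PCA().fit(z)
eigenvalues = pca.explained_variance_
print("components with eigenvalue > 1:", int((eigenvalues > 1).sum()))
print("variance explained by first four components:",
      round(float(pca.explained_variance_ratio_[:4].sum()), 3))

With genuinely correlated responses, a small number of dominant components would emerge, analogous to the four-component structure Kleib and Nagle (2018c) report; with the random data above, the retained count is simply sampling noise.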

Competency Assessment – Kirkpatrick Evaluation Model

The Oh, et al. (2017) study used the Kirkpatrick evaluation model in a two-phased approach to evaluate the effects of “flipped learning of a nursing informatics course” (Oh, et al., 2017, p. 477). The results demonstrated a statistically significant difference in self-perceived nursing informatics competencies across domains of knowledge, skill application confidence and achievement attitudes. However, the Kirkpatrick evaluation model requires three years for participants to progress through each level (Oh, et al., 2017). This would necessitate embedding it across the trajectory of undergraduate nursing curricula to ensure all evaluation components are captured. The time-extended format of this model may limit uptake and utilization with nursing faculty given the availability of simpler informatics competency tools.

Discussion

This evidence review focused on examining the literature to determine what evidence-based resources are available to nurse educators to assess the development of nursing informatics competencies. A variety of resources were identified, including the use of semi-structured interviews, informatics competency frameworks, informatics competency assessment tools and the Kirkpatrick evaluation model (Achampong, 2017; Abdrbo, 2015; Bryant, et al., 2016; Choi, et al., 2016; Egbert, et al., 2016; Hübner, et al., 2018; Hübner, et al., 2016; Kleib & Nagle, 2018a; Kleib & Nagle, 2018b; Kleib & Nagle, 2018c; Nagle, et al., 2014; Oh, et al., 2017; Ronquillo, et al., 2017). The semi-structured interview approaches employed by Achampong (2017) and Choi, et al. (2016) did not demonstrate the use of evidence-based tools in the development of their interview questions and would not be recommended as a nursing informatics assessment strategy at this time. Likewise, the practicality of the Kirkpatrick evaluation model, given its timeframe and required curriculum integration, makes it an unlikely choice (Oh, et al., 2017).

The informatics competency frameworks presented all followed an iterative development and validation process that blended the current literature with expert recommendations (Egbert, et al., 2016; Hübner, et al., 2018; Hübner, et al., 2016; Nagle, et al., 2014; Ronquillo, et al., 2017). While all the frameworks reviewed demonstrated validity and reliability, only the CASN entry-to-practice competencies were specific to the Canadian context (Nagle, et al., 2014). The SANICS, KSANI and C-NICAS tools all demonstrated high internal consistency and reliability, making them evidence-based options for educators looking to determine nursing informatics skill and knowledge in students (Abdrbo, 2015; Bryant, Whitehead, & Kleier, 2016; Kleib & Nagle, 2018c). However, the linkage between the CASN entry-to-practice competencies (Canadian Association of Schools of Nursing, 2015) and the C-NICAS, combined with the supporting evidence for both (Kleib & Nagle, 2018c), makes these two resources highly relevant and applicable to the Canadian nursing education context. It is recommended that Canadian nursing faculty utilize the CASN framework to inform curriculum revisions inclusive of informatics content, and use the C-NICAS to assess students' informatics competency development (Kleib & Nagle, 2018c; Nagle, et al., 2014).

Conclusion

The current literature on informatics competencies in nursing clearly demonstrates the importance of this unique set of knowledge and skills in today's rapidly changing healthcare environment. This evidence review has demonstrated that nurse educators have a variety of validated competency frameworks and assessment tools at their fingertips to support developing these skills in students. The challenge will be to implement and integrate these evidence-based recommendations to advance practice at the same rate as the technology we strive to understand. The future is here; how do you know your students will be ready for it?

References

Abdrbo, A. (2015). Nursing Informatics Competencies Among Nursing Students and Their Relationship to Patient Safety Competencies. Computers, Informatics, Nursing, 33(11), 509-514.

Achampong, E. K. (2017). Assessing the current curriculum of the nursing and midwifery informatics course at all nursing and midwifery institutions in Ghana. Journal of Medical Education and Curricular Development, 1(4), e1-4.

Adu, E. (2017). Organizational complexity and hospitals’ adoption of electronic medical records for closed-loop medication therapy management (Order No. 10282443). Available from ProQuest Dissertations & Theses Global. (1907551305).

Andrew, S. (2009). Mixed methods research for nursing and the health sciences. Blackwell Pub.

Bryant, L. E., Whitehead, D. K., & Kleier, J. A. (2016). Development and testing of an instrument to measure informatics knowledge, skills, and attitudes among undergraduate nursing students. Online Journal of Nursing Informatics, 20(2), e1-12.

Canadian Association of Schools of Nursing (2015). Nursing Informatics Entry-to-Practice Competencies for Registered Nurses. Ottawa: CASN. 

Canadian Nurses Association (2006). E-Nursing Strategy for Canada. Ottawa: CNA.

Canadian Nurses Association (2015). Position Statement: Nursing Information and Knowledge Management. Ottawa: CNA.

Canadian Nursing Informatics Association-Canadian Nurses Association (2017). Joint Position Statement: Nursing Informatics. Ottawa: CNA.

Choi, M., Park, J., & Lee, H. (2016). Assessment of the Need to Integrate Academic Electronic Medical Records into the Undergraduate Clinical Practicum. Computers, Informatics, Nursing, 34(6), 259-265.

Clinical & Systems Transformation (2018). The CST Project. Retrieved from http://cstproject.ca/

Egbert, N., Thye, J., Schulte, G., Liebe, J., Hackl, W. O., Ammenwerth, E., & Hubner, U. (2016). An iterative methodology for developing national recommendations for nursing informatics curricula. Studies in Health Technology & Informatics, 228, 660-664. doi:10.3233/978-1-61499-678-1-660

Evans, D. (2003). Hierarchy of evidence: a framework for ranking evidence evaluating healthcare interventions. Journal of Clinical Nursing, 12(1), 77-84. https://doi.org/10.1046/j.1365-2702.2003.00662.x

Hübner, U., Shaw, T., Thye, J., Egbert, N., Marin, H. F., Chang, P., O’Connor, S., Day, K., Honey, M., Blake, R., Hovenga, E., Skiba, D., & Ball, M. J. (2018). Technology Informatics Guiding Education Reform – TIGER. Methods of Information in Medicine, 57(S1), e30-e42. doi:10.3414/ME17-01-0155

Hübner, U., Shaw, T., Thye, J., Egbert, N., Marin, H. F., & Ball, M. (2016). Towards an international framework for recommendations of core competencies in nursing and inter-professional informatics: The TIGER competency synthesis project. Studies in Health Technology & Informatics, 228, 655-659. doi:10.3233/978-1-61499-678-1-655

Kleib, M., & Nagle, L. (2018a). Development of the Canadian Nurse Informatics Competency Assessment Scale and Evaluation of Alberta’s Registered Nurses’ Self-Perceived Informatics Competencies. Computers, Informatics, Nursing, 36(7), 350-358.

Kleib, M., & Nagle, L. (2018b). Factors Associated with Canadian Nurses’ Informatics Competency. Computers, Informatics, Nursing, 36(8), 406-414.

Kleib, M., & Nagle, L. (2018c). Psychometric Properties of the Canadian Nurse Informatics Competency Assessment Scale. Computers, Informatics, Nursing, 36(7), 359-365.

Morse, J. M. (2015). Critical Analysis of Strategies for Determining Rigor in Qualitative Inquiry. Qualitative Health Research, 25(9), 1212–1222. https://doi.org/10.1177/1049732315588501

Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & The PRISMA Group. (2009). Preferred Reporting Items for Systematic Reviews and Meta-Analyses: The PRISMA Statement. PLoS Medicine, 6(6), e1000097. doi:10.1371/journal.pmed.1000097

Nagle, L. M., Crosby, K., Frisch, N., Borycki, E., Donelle, L., Hannah, K., . . . Shaben, T. (2014). Developing entry-to-practice nursing informatics competencies for registered nurses. Studies in Health Technology & Informatics, 201, 356-363. doi:10.3233/978-1-61499-415-2-356

Oh, J., Kim, S. J., Kim, S., & Vasuki, R. (2017). Evaluation of the effects of flipped learning of a nursing informatics course. Journal of Nursing Education, 56(8), 477-483. doi:10.3928/01484834-20170712-06

Pesut, B., & Taylor, D. (2018). Assessing Quality of Quantitative Studies for Nursing 504.

Ronquillo, C., Topaz, M., Pruinelli, L., Peltonen, L., & Nibber, R. (2017). Competency recommendations for advancing nursing informatics in the next decade: International survey results. NI 2016, Switzerland. Studies in Health Technology & Informatics, 232, 119-129. doi:10.3233/978-1-61499-738-2-119

Schardt, C., Adams, M., Owens, T., Keitz, S. & Fontelo, P. (2007). Utilization of the PICO framework to Improve searching PubMed for clinical questions. BMC Medical Informatics and Decision Making, 7(16), e1-6. 

Tong, A., Sainsbury, P., & Craig, J. (2007). Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. International Journal for Quality in Health Care, 19(6), 349–357.

Biography

Andrea Knox, RN, BSN, CON(c)

Andrea is a nationally certified oncology nurse and currently holds the position of Senior Practice Leader for Nursing at BC Cancer – Kelowna. Andrea has a strong interest in the field of nursing informatics and is passionate about supporting the professional practice of oncology nurses across BC. 
