Abstract
Behavioral health professionals are charged with providing effective outpatient services while addressing patient and public safety, yet training in empirically informed violence risk assessment strategies remains inaccessible. We developed and evaluated an online distance learning (ODL) course on clinical risk assessment targeting frontline providers and trainees in the United States. The ODL consisted of three modules: confidentiality, duty to third parties, and clinical assessment of violence risk. We evaluated response characteristics and reach across disciplines, as well as training satisfaction, change in knowledge, self-perceived competence, and self-reported impact on practice at six-week follow-up among 221 learners. Self-perceptions of competence and knowledge in the focal areas increased immediately after completing the training; self-perceived competence increased again by a significant margin at six-week follow-up. Participants reported a moderate-high positive impact of the training on practice.
The assessment and management of violence risk in clinical practice is a core competency for frontline providers in routine clinical settings.1 Murrie and Kelley2 called on clinicians to have at least a basic literacy in the key factors and processes underlying a violence risk assessment. This call may be more pertinent than ever, as concerns mount over the escalating rates of interpersonal violence observed during the COVID-19 pandemic3,4 as well as inconsistent access to behavioral health care for individuals with preexisting serious mental health conditions.5 Brief, targeted, and accessible trainings on approaches to clinical risk assessment that account for empirically supported and culturally sensitive risk and protective factors are needed. Pan-jurisdictional trainings in confidentiality and duty to third parties are important tools for meeting the risk management demands placed on behavioral health professionals working in routine care settings.
In the absence of training in empirically supported strategies for assessing risk in routine care settings, most providers rely on clinical judgment, in which interview information and relevant history are gathered and processed by the clinician to inform a clinical impression of violence risk potential. By their nature, clinical judgment approaches to violence risk assessment (VRA) are idiosyncratic and susceptible to implicit biases, as risk judgments may be formed without consideration of base rates or empirically supported risk and protective factors. Furthermore, when concerns about violence potential arise, clinicians are often unsure of the laws and regulations concerning confidentiality and obligations to third parties.6
Currently, clinicians lack access to expert-delivered training in empirically supported clinical approaches to VRA, and many behavioral health professionals report gaps in knowledge and skills in this core area.6 Challenges related to scalability include jurisdictional differences in laws relating to reporting requirements and responsibilities to third parties, organizational differences related to risk communication and documentation, accessibility of training by qualified professionals, and a dearth of experienced expert clinicians to provide such training. Remote web-based training is acceptable and feasible and shows promise in scaling clinical training to working professionals.7,8 This article details an open-enrollment online distance learning (ODL) approach to increasing working behavioral health professionals’ fund of knowledge and foundational competencies in the limits of confidentiality; duty to warn and duty to protect standards; and violence risk assessment, management, and documentation. We evaluate reach among different disciplines as well as training satisfaction and change in knowledge, self-perceived competence, and self-reported impact on practice at six-week follow-up.
Method
Course Description
The three-hour ODL course was sponsored by the Northwest Mental Health Technology Transfer Center (MHTTC), which is funded by a cooperative agreement with the Substance Abuse and Mental Health Services Administration. The training was developed with two objectives in mind. First, the training was intended to enhance knowledge among a range of behavioral health and allied professionals, paraprofessionals, and students who lack specialized forensic behavioral health training or education. Training content was developed to ensure that principles and practices were applicable irrespective of clinical role. Clinical and legal constructs were defined on first use, and efforts were made to minimize psycholegal jargon. Second, the ODL course was intended to have a broad reach. Content was framed to be applicable across jurisdictions within the United States. Training delivery aimed to redress common barriers to accessing professional development. Toward that end, the course was free, self-paced, web-accessible, Section 508 compliant,9 and included interactive components.
The ODL consists of three distinct modules. Each module contains a 40-minute didactic, a 10-minute case illustration, and a three-item quiz. A summative practical exercise serves as the course capstone. Module 1 addresses legal and ethical concerns pertaining to confidentiality and privacy in a therapeutic relationship. Its learning objectives were to develop an understanding of confidentiality, privilege, and privacy; identify common exceptions to confidentiality under state and federal law; and identify appropriate steps to take when receiving a subpoena. Module 2 concerns duty to warn and duty to protect standards. Its learning objectives were to define duties to warn or protect, characterize variations in those duties, describe how to identify local requirements, and identify appropriate responses to verbal threats. The third module, on violence risk assessment, is intended to help behavioral health professionals identify the key components of a high-quality clinical VRA, list the relevant risk factor domains, learn strategies for gathering relevant data for a risk formulation, and become familiar with a stratified clinical response to managing risk in a routine care setting.
The training was hosted on a virtual learning platform called HealtheKnowledge (www.healtheknowledge.com). Marketing and promotion were facilitated by the MHTTC Network through email campaigns to behavioral health administrators and clinicians as well as by word of mouth. After signing up for the ODL, learners were prompted to complete the precourse assessment. Upon completion of the capstone exercise, learners completed a knowledge quiz and postcourse survey (see below). A six-week follow-up survey was sent via email along with a five-dollar honorarium.
Data collection was ongoing for the first 48 months of the course. We were interested in learning which disciplines elected to complete the course and in assessing associations between professional characteristics and the constructs of knowledge, mastery, and satisfaction. We were also interested in change in knowledge and perceived competence from pre to posttraining as well as the extent to which learners perceived the training to affect their role performance.
We did not collect identifiable private information from learners. As this evaluation sought to describe the characteristics of individuals who chose to complete the course, report on learning analytics, and explore potential predictors of training outcomes, it did not meet criteria as a research study and IRB approval was not required.
Measures
Demographics
Race, ethnicity, gender, education, and profession were assessed through a funder-required survey that aligns with the Government Performance and Results Act (GPRA).10 In addition, participants were asked to report the number of years they had worked as a behavioral health professional and the number of VRA trainings they had completed prior to the current event.
Training Quality
Indicators of perceived training quality were drawn from the GPRA survey. Participants were asked to indicate on a five-point Likert-type index the degree to which they were satisfied with the training, felt that they received a professional benefit from the training, and would use the training content (1 = lowest rating and 5 = highest rating). Participants also responded to a single yes/no question that asked if they would recommend the training to a colleague.
Knowledge Quiz
Participants completed three quizzes that were administered at two timepoints: pretest and immediately following the training. Each quiz included three multiple-choice questions. The first focused on confidentiality, the second focused on duty to warn, and the third focused on risk assessment. Scores for each quiz were computed as the number of correct responses (0 to 3), and composite quiz scores were calculated using the number of correct items across all three quizzes (0 to 9). A quiz change score was computed by subtracting the composite pretest quiz score from the composite posttest quiz score for each participant.
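For readers who wish to reproduce this scoring, a minimal sketch in Python follows. The DataFrame layout and column names (e.g., pre_conf_1 through post_risk_3) are hypothetical, as the learning platform's export format is not described here; the scoring logic mirrors the rules above.

```python
# Sketch of the quiz scoring described above (hypothetical column names;
# each item column is coded 1 for a correct response, 0 otherwise).
import pandas as pd

def score_quizzes(df: pd.DataFrame) -> pd.DataFrame:
    """Compute per-module scores (0-3), composite scores (0-9), and the change score."""
    modules = ["conf", "duty", "risk"]  # confidentiality, duty to warn, risk assessment
    for phase in ("pre", "post"):
        for m in modules:
            item_cols = [f"{phase}_{m}_{i}" for i in range(1, 4)]
            df[f"{phase}_{m}_score"] = df[item_cols].sum(axis=1)
        df[f"{phase}_composite"] = df[[f"{phase}_{m}_score" for m in modules]].sum(axis=1)
    # Change score: posttest composite minus pretest composite
    df["quiz_change"] = df["post_composite"] - df["pre_composite"]
    return df
```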
Impact of Training
The impact of training was assessed immediately following the training using the Impact of Training and Technical Assistance (IOTTA) survey.11,12 Select items from the IOTTA were repeated at a six-week follow-up assessment. The following constructs from the IOTTA were used in the current analysis:
Training characteristics: At postevent, participants were asked to rate the degree to which they perceived that the training topic was important, the degree to which training content represented a change from current practice, and the degree to which the training level was appropriate. Response categories ranged from 0 to 10, with higher scores indicating higher levels of each construct.
Participant competence: At postevent, participants were asked to rate their existing level of competence on training content (before completing the training) on an index ranging from 0 for “complete beginner” to 10 for “fully expert.” A second item using the same response categories was used to assess posttraining competence. Finally, this item was included at follow-up to assess the degree to which gains were maintained from postevent to six weeks following training participation.
Impact on practice: Training impact at postevent was assessed using a 10-item index. The index began with the stem, “How will what you learned/gained from today’s training impact…?” Example categories included “How you understand issues related to privacy” and “How you interact with clients who pose a potential risk of harm to others.” Response categories ranged from 1 for “large negative impact” to 7 for “large positive impact.” Scores on the individual items were averaged to form a single composite score (α = .98).
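A minimal sketch of the composite scoring and the coefficient alpha estimate reported above (α = .98) is shown below; the ten item columns (iotta_1 through iotta_10, rated 1–7) are hypothetical names used for illustration only.

```python
# Sketch of the 10-item composite and internal-consistency estimate.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Coefficient alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def score_impact(df: pd.DataFrame) -> pd.Series:
    items = df[[f"iotta_{i}" for i in range(1, 11)]]   # ten 1-7 impact ratings
    alpha = cronbach_alpha(items)                       # reported above as .98
    print(f"alpha = {alpha:.2f}")
    return items.mean(axis=1)                           # composite = mean of the ten items
```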
Plan of Analysis
Participant demographics were summarized with simple percentages, and we calculated mean scores of participant perceptions of the training event. Paired samples t-tests were conducted to examine changes in quiz scores from pre to posttest. Separate analyses were run to compare pretest and posttest scores for each training topic (confidentiality, duty to warn, and risk assessment) and composite quiz scores. We hypothesized that knowledge scores would increase from pre to posttraining across the three respective content areas and the composite score. Given the directional nature of these hypotheses, we assessed statistical significance using one-tailed tests with a criterion P value of .05 for each analysis.
Paired samples t-tests were conducted to examine changes in self-reported topic competence. Separate analyses were run to examine changes from pre-event to postevent and from pre-event to follow-up. We hypothesized that self-reported competence would increase from pre to post and would be sustained at follow-up. As with the preceding analysis, we assessed statistical significance with one-tailed tests and a criterion P value of .05. Practice outcomes as measured by the IOTTA survey were summarized by calculating mean scores for each survey item and for the composite 10-item index.
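As an illustration of the paired, one-tailed comparisons described in the two preceding paragraphs, a short SciPy sketch follows; the column names are hypothetical, and pairs with missing values are dropped listwise, as in a standard paired design.

```python
# Sketch of a paired samples t-test with a one-tailed (directional) hypothesis.
# Requires SciPy >= 1.6 for the 'alternative' argument.
import pandas as pd
from scipy import stats

def paired_one_tailed(df: pd.DataFrame, pre_col: str, post_col: str):
    """Test the directional hypothesis that scores increase from pre to post."""
    pairs = df[[pre_col, post_col]].dropna()
    # alternative='less' tests whether mean(pre - post) < 0, i.e., post > pre
    result = stats.ttest_rel(pairs[pre_col], pairs[post_col], alternative="less")
    return result.statistic, result.pvalue, len(pairs) - 1  # t, one-tailed P, df

# e.g., paired_one_tailed(df, "pre_composite", "post_composite")
#       paired_one_tailed(df, "competence_pre", "competence_followup")
```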
Ordinary least squares (OLS) regression models were created to examine relations among hypothesized predictors and postevent quiz scores, quiz change scores, postevent competence, and postevent practice outcomes. In separate models, each of these outcomes was regressed on profession and years of experience. Dummy codes were created for profession (student, paraprofessional, and professional as the reference group). In addition, we tested three exploratory models in which we regressed each of the outcome variables on perceived importance of the training, change from current practice, training level, number of previous VRA trainings, and pretest competence as each of these variables has been found to be associated with training outcomes in an analysis of other MHTTC trainings.13
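The following sketch shows how such models might be specified with statsmodels formulas, assuming a tidy analysis file; all variable names are hypothetical, and the dummy coding treats professionals as the reference group, as described above.

```python
# Sketch of the primary and exploratory OLS models described above.
import pandas as pd
import statsmodels.formula.api as smf

def fit_models(df: pd.DataFrame):
    # Primary model: profession dummies (professional as the reference group)
    # plus years of experience predicting the postevent composite quiz score.
    primary = smf.ols(
        "post_composite ~ C(profession, Treatment(reference='professional'))"
        " + years_experience",
        data=df,
    ).fit()

    # Exploratory model: training perceptions, prior VRA trainings, and
    # pretest competence predicting the same outcome.
    exploratory = smf.ols(
        "post_composite ~ importance + practice_change + training_level"
        " + prior_vra_trainings + pre_competence",
        data=df,
    ).fit()
    return primary, exploratory
```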
Results
Participant Characteristics
Attendance data through January 2022 indicate that 2,234 learners enrolled in the course and 1,427 completed all learning activities. Of those who completed the course, 1,096 learners responded to the postevent survey for a response rate of 76.8 percent. Of those who completed the postevent survey, only 221 provided follow-up responses at six weeks postevent, for a follow-up response rate of 20.2 percent.
Demographic data indicate that participants most commonly identified as female (84.2%), non-Hispanic (83.5%), and White (69.6%), with Black or African American representing the next largest racial category (23.1%). Most participants’ highest degree was a Bachelor’s degree (39.4%) or a Master’s degree (21.1%) in a counseling profession. Participants represented a variety of professional backgrounds, with students (35.9%), counselors (19.6%), and social workers (17.4%) being the largest professional categories represented. The majority had less than a year of experience in a behavioral health field (54.3%), and more than half had never completed a VRA training before participating in the current event (54.3%) (see Table 1).
Table 1. Demographic Data of Violence Risk Assessment (VRA) Online Distance Learners
Perceptions of Event
Ratings of training quality indicate that learners were satisfied with the quality of the training (M = 4.21, SD = .86), viewed it as having a professional benefit (M = 4.24, SD = .79), and reported that the material presented would be useful in practice (M = 4.22, SD = .80). In addition, 94.2 percent of learners would recommend the training to a colleague.
Outcomes
Knowledge
Paired samples t-tests indicate statistically significant increases in knowledge quiz scores from pretest to posttest for each module: confidentiality (pretest M = 1.33, SD = .86; posttest M = 1.60, SD = .89; t(739) = −7.94, P < .001); duty to warn (pretest M = 1.56, SD = .93; posttest M = 1.78, SD = .95; t(735) = −6.47, P < .001); and risk assessment (pretest M = 1.41, SD = .77; posttest M = 1.78, SD = .71; t(23) = −11.32, P < .001). Composite quiz scores (all three modules combined) also increased from pretest to posttest (pretest M = 4.29, SD = 1.61; posttest M = 5.17, SD = 1.88; t(655) = −13.03, P < .001).
Self-Reported Competence
Learners reported levels of pre-event competence that were statistically significantly lower than both postevent competence (pre-event M = 4.39, SD = 2.61; postevent M = 5.69, SD = 2.15; t(1031) = −22.88, P < .001) and follow-up competence (pre-event M = 4.38, SD = 2.50; follow-up M = 7.16, SD = 1.73; t(220) = −15.58, P < .001).
Practice Outcomes
Average responses to the practice outcome questions in the IOTTA survey ranged from a low of 5.69 for “how you collaborate with your colleagues” to a high of 5.81 for “how you understand issues related to privacy, obligations to third parties, and violence risk assessment and management.” The mean score for the total index was 5.77 (see Table 2). These findings suggest that learners reported a moderate-high positive impact of the VRA ODL across these clinical practice domains.
Table 2. Mean Scores on Violence Risk Assessment Training Outcomes
Predictors of Participant Outcomes
Separate OLS regression models were created to identify predictors of each of the above outcomes: postevent composite knowledge quiz scores, quiz change scores, postevent competence, and postevent practice outcomes (see Table 3). Results from the first model indicate that years of experience, but not profession, was positively associated with quiz scores. In the second model, quiz change scores were unrelated to the independent variables. In the third model, postevent competence was positively associated with years of experience. In addition, mental health professionals had significantly higher postevent competence scores compared with paraprofessionals. In the fourth model, neither profession nor years as a professional was associated with postevent practice outcomes.
Table 3. Results from Ordinary Least Squares Regression Analysis
The exploratory regression models revealed additional predictors of learner outcomes (see Table 4). In the first model, postevent knowledge quiz scores were positively associated with perceived importance of the training and perceptions that the level of material covered during the training was appropriate, and negatively associated with ratings of the degree to which the material presented in the training challenged current practices. In the second model, changes in quiz scores were unrelated to the independent variables. In the third model, posttraining competence was positively associated with perceived importance of the training, perceptions of the appropriateness of the training level, the number of previous VRA trainings completed, and pretest competence. In the final model, postevent practice outcomes were positively associated with perceived importance of the training and perceptions of the appropriateness of the training level, and negatively associated with the number of VRA trainings previously completed.
Table 4. Results from Exploratory Ordinary Least Squares Regression Analysis
Discussion
Knowledge and competence in the areas of confidentiality, duties to third parties, and violence risk assessment remain essential to frontline clinical practice. Online modules have the potential to reach broad and geographically disparate audiences who might not otherwise have access to VRA and related trainings. Although psychologists are the discipline most likely to conduct structured violence risk assessments,14 aspects of VRA, confidentiality, and duty to third parties are relevant to all behavioral health professionals.1
The VRA ODL appeared to do well in reaching its primary target learners. More than half of the learners who responded to our surveys had less than a year of experience in a behavioral health field and more than half had never completed a VRA training before participating in the current learning activity. Individuals who accessed the VRA ODL were demographically comparable with the national behavioral health clinical workforce.15
Overall, learners rated the quality of this online training approach favorably, with satisfaction ratings indicating that the training is professionally beneficial and clinically useful. Approximately 94 percent indicated that they intended to recommend the training to a colleague. Learners' perceptions of their competence in the focal areas increased immediately after completing the training. Interestingly, their self-perceived competence increased again by a significant margin at six-week follow-up. It is possible that learners’ self-perceptions aligned with objective indicators of competence in aspects of patient-provider confidentiality, obligations to third parties, risk assessment, and risk management. Alternatively, their perceived competence may have increased as they compared their new knowledge to colleagues who did not participate in the training. Some literature suggests, however, that self-assessments of competence are negatively correlated with observer-rated competence.16 Therefore, assessing the effect of the training on the learner’s behavior is an important target for subsequent research.
Learners reported a moderate-high positive impact of the training on practice. As expected, evaluation metrics suggest that the ODL course enhanced participants’ knowledge in each of the three subject areas immediately posttraining. Indeed, both quiz scores and ratings of competence increased to a statistically significant degree between pretest and posttest. Furthermore, results of the primary regression models suggest that mental health practitioners with more years of experience in the field attained higher posttraining quiz scores and rated their competencies higher at posttraining compared with those with less experience. Similarly, mental health professionals had higher competence scores compared with paraprofessionals. Changes in quiz scores were not related to years of experience or profession, however. Such findings suggest that, although professional experience is associated with higher levels of knowledge and competence overall, an ODL training modality in VRA may yield comparable relative gains from pre- to posttraining regardless of professional experience. Similarly, results from the exploratory regression models indicate that professionals who reported previous VRA training tended to achieve higher levels of posttraining competence but reported less impact on practice compared with those who were training-naive. Intuitively, this finding may be attributable to those with more training experience already having made relevant practice changes prior to participating in the VRA ODL, suggesting that the ODL modality holds appeal as a refresher course to enhance confidence, knowledge retention, and rehearsal. Further, participants who viewed the training topic as important and the level as appropriate had higher quiz scores and self-reported competence compared with those who rated the training lower on those characteristics.13 Interestingly, although perceptions of importance and training level were related to overall postevent knowledge and competence, they were not related to changes in quiz scores over time, which suggests that those who viewed the topic as important and the training level as appropriate experienced the same relative knowledge gains as those who scored lower on those variables. Finally, those who reported that the training material challenged their current practices obtained lower quiz scores than those who reported less of a challenge. Such findings suggest that less experienced professionals may need additional support or training to master concepts that are new to them.
Limitations
This training evaluation has several limitations. First, only 20 percent of learners who completed the ODL responded to both the posttraining and follow-up surveys. The attrition in responses at the six-week follow-up may have resulted in a particularly motivated or otherwise nonrepresentative group of respondents. That said, our six-week follow-up response rate is comparable with other ODL courses hosted on the HealtheKnowledge web-based platform (Gotham H, personal communication, May 2022). As this was a naturalistic evaluation, we are unable to determine the context for participants’ engaging in the training (recommendation of a supervisor or training program, personal interest, or some other motivator), which could affect participant perception of importance and impact.
There are important limitations to relying on self-report to assess the impact of training. Although the authors took steps to minimize these limitations, such as anonymized and confidential data collection and the use of validated self-report training measures, future research should employ more objective indicators of continued knowledge improvement and practice change. Because of the naturalistic design, we do not know which participants had opportunities to put new learning into practice, the nature of real-world applications, or the competence with which these practices were implemented. Moreover, we have no comparison group or benchmark against which to interpret our observed outcomes. Although the scores on the Impact of Training and Technical Assistance scale appear moderate-high at face value, future research is needed to contextualize the value to the practitioner, setting, and evaluees. Finally, although we attempted to assess the durability of training effects, our follow-up period was brief and the response rate was low.
Future Directions
The behavioral health workforce is under tremendous strain, yet the need for empirically supported violence risk assessments and management practices is greater than ever in the pandemic era. E-training is accessible, scalable, and palatable to practitioners and presents a feasible means of enhancing clinical practices. Our VRA ODL represents a low-cost and scalable training modality that was satisfactory to student, novice, and veteran clinician-learners. In addition, the training was associated with an increase in objective assessments of learning as well as a subjective sense of enhanced competence. Learners reported moderate-high levels of perceived clinical impact at six-week follow-up. Given the minimal investment of time, money, and human resources, our findings reflect a promising avenue for enhancing clinical knowledge and skill in these crucial practice areas. VRA ODL shows promise as a complement to preservice training, as a means to redress the paucity of VRA instruction among existing training programs, or as a means of enhancing job-relevant knowledge among practitioners in the field.
Footnotes
Disclosures of financial or other potential conflicts of interest: None.
© 2023 American Academy of Psychiatry and the Law