Introduction

Background
The costs and burdens of schizophrenia spectrum psychosis and bipolar disorder are substantial and often neglected in research [,]. Digital mental health interventions (DMHIs) can help people with psychosis or bipolar disorder monitor, manage, and improve their symptoms and health [-]. Although there is promising evidence of feasibility, acceptability, and efficacy [-], and widespread expectations that DMHIs will become core to health care [], user uptake and engagement are highly variable [], and integrated clinical use is low []. The challenges of implementing DMHIs in health care are well established; attempts to implement well-evidenced DMHIs for common mental health problems in the United States and United Kingdom health services have frequently failed due to lack of patient and provider engagement and difficulty integrating DMHIs into clinical care []. For example, while randomized controlled trials (RCTs) of computerized cognitive behavior therapy (cCBT) for depression have reported relatively high engagement (47% of participants completing all cCBT sessions []), pragmatic trials delivering cCBT in real-world settings report much lower engagement (16%-18% of participants completing all sessions; most participants completing 1 session only) []. Implementing DMHIs for people with psychosis or bipolar disorder, within secondary care mental health services (eg, community mental health teams), presents additional challenges (eg, high staff caseloads and reactive care in response to increased risk or crisis), which are examined in this review.
Implementation science acknowledges that intervention effectiveness cannot guarantee uptake in services; uptake depends largely on contextual barriers and facilitators []. Examining such contextual factors is therefore crucial to improving uptake in real-world settings, allowing tailoring of interventions to maximize engagement, and informing development and testing of implementation strategies. Barriers and facilitators of engagement with DMHIs for common mental health problems (eg, depression or anxiety) were examined relatively recently [], but the last systematic review of factors affecting DMHI implementation in samples with psychosis or bipolar disorder was 5 years ago []. This review provides a timely update on this rapidly developing literature by examining barriers and facilitators of patient and staff engagement with DMHIs for this group.
Using a best-fit framework synthesis approach [,], the Consolidated Framework for Implementation Research (CFIR) [] guided this analysis, enabling consideration of user engagement barriers and facilitators in the context of the complex, multilevel systems within and surrounding health care []. In supplementary analyses, we also examined safety reporting within reviewed studies to explore whether perceived harms may be additional barriers to DMHI uptake and to further explore recent findings on safety reporting quality [,].
Objective
In summary, this review examines evidence from qualitative and quantitative studies regarding the barriers and facilitators of patient and staff engagement with DMHIs for psychosis or bipolar disorder. By synthesizing this information using CFIR, this review presents important findings about how to maximize engagement with such DMHIs, and ideal conditions for DMHI implementation in secondary mental health services.
Methods

Design
This systematic review follows PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses; see for PRISMA checklist) guidance [], with eligibility criteria developed according to the PICOS (participants, interventions, comparators, outcomes, study design) framework. The review was registered with PROSPERO (main review: CRD42021282871; safety review: CRD42022306123).
Search Strategy and Selection Criteria
We conducted a PRISMA-compliant search of 7 databases using search terms relating to psychosis and bipolar disorder, DMHIs, and barriers, facilitators, or implementation (see search terms and database list in ). Searches were restricted to peer-reviewed articles reporting on studies with human participants, published in English between January 2010 and October 2021. By restricting the search in this way, we were able to capture the period between the first studies of digital mental health tools being conducted (consistent with previous reviews [,]) and the COVID-19 pandemic. The search end date (1.5 years after the worldwide COVID-19 restrictions peaked []) allows for the lag from data collection to publication. The search retrieved very few papers reporting data gathered during or after the pandemic, suggesting this was a suitable watershed date. A future systematic review will examine studies from October 2021 onward.
Author EE combined, deduplicated, and screened the titles and abstracts of the search results; author AP independently screened a random selection (10%). The full texts of the remaining articles were independently screened against PICOS inclusion and exclusion criteria by authors EE and AP. Studies included and excluded by each researcher were compared, disagreements were resolved by consensus, and final reasons for all exclusions were recorded. Full inclusion and exclusion criteria are provided in . In summary, the review included all published studies reporting qualitative or quantitative data on hypothetical or actual barriers or facilitators of patient or staff engagement with a DMHI aiming to monitor or improve the mental or physical health of adults with a bipolar or schizophrenia spectrum diagnosis using a digital method (eg, smartphone app, SMS text messaging, website, or wearable).
Data Extraction
A data extraction form was used to extract relevant data: study metadata, design, DMHI details, and sample characteristics. Quantitative findings relating to barriers and facilitators of user engagement were extracted and imported into NVivo software (Lumivero) [], as were the full results and discussion sections of qualitative studies. A second researcher checked all the extracted data. Study quality was assessed using the Mixed Methods Appraisal Tool []. In addition, from the main paper text and supplements of studies that tested the actual use of a DMHI, we extracted data on how adverse events (AEs) were monitored, classified (relatedness or severity), analyzed, reported, and discussed, and whether reporting followed the CONSORT (Consolidated Standards of Reporting Trials) harms checklist []. Given that the reporting of formal AE data is typically poor for DMHIs [,], we also extracted information about incidents that might be classifiable as AEs but were not described as such by the authors.
Data Analysis
Study characteristics, quality checks, and AE information were tabulated and summarized descriptively. A best-fit framework synthesis approach [,] was used to synthesize qualitative and quantitative data from all included studies to address the following research question: What are the barriers and facilitators of service user or staff engagement with DMHIs for people with psychosis or bipolar disorder? Generally, within a best-fit framework approach, relevant published frameworks, models, or theories are systematically identified, and key elements of these are integrated into an a priori coding framework for use in the review []. Primary studies are then deductively coded into the a priori coding framework, with data not fitting within the framework coded inductively. In this review, we were able to use a single existing implementation framework, the CFIR, as the a priori coding framework [,], followed by inductive coding. CFIR was developed using a comprehensive systematic review of the implementation science literature [,]. It is a meta-theoretical framework incorporating constructs from a wide range of existing implementation theories into a single comprehensive framework and is used by researchers and practitioners to identify barriers and facilitators of successful implementation and tailor their strategies accordingly. Thus, it is an appropriate choice of framework to guide the systematic evaluation of potential barriers and facilitators of user engagement with DMHIs in this review. CFIR has 5 major domains as provided in .
Textbox 1. Major domains of Consolidated Framework for Implementation Research.

Innovation characteristics: features of the innovation or intervention being implemented, including its complexity, adaptability, design, and evidence base. For example, a smartphone app for monitoring symptoms of psychosis.

Outer setting: the external context in which the organization is situated, including economic, political, legal, and social factors that may influence implementation. For example, a particular country or state.

Inner setting: features of the organization in which the intervention is being implemented, including its culture, structure, resources, and readiness for change. For example, a secondary care mental health service.

Individuals: characteristics of individuals involved in the implementation process, such as their knowledge, beliefs, and motivation. For example, patients with psychosis or bipolar disorder, and mental health staff.

Implementation process: the implementation process itself, including planning, engaging, executing, and reflecting.
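The deductive-then-inductive coding logic described above can be illustrated with a minimal sketch. This is purely illustrative (the review itself performed coding in NVivo); the findings, code labels, and data structures below are hypothetical examples, not the review's actual data:

```python
# Illustrative sketch of best-fit framework synthesis coding:
# findings are first coded deductively against the 5 a priori CFIR
# domains; findings that do not fit are set aside for inductive coding.
# All data here are hypothetical examples.
from collections import Counter

CFIR_DOMAINS = {
    "Innovation characteristics",
    "Outer setting",
    "Inner setting",
    "Individuals",
    "Implementation process",
}

# (finding, assigned code) pairs -- hypothetical
coded_findings = [
    ("Lengthy login reduced engagement", "Innovation characteristics"),
    ("Staff lacked time to review app data", "Inner setting"),
    ("Patients valued anytime access", "Innovation characteristics"),
    ("Use declined over time", "Pattern of use"),  # does not fit CFIR
]

# Deductive tally per CFIR domain
deductive = Counter(code for _, code in coded_findings if code in CFIR_DOMAINS)

# Residual findings held for inductive coding
inductive = [finding for finding, code in coded_findings
             if code not in CFIR_DOMAINS]

print(deductive["Innovation characteristics"])  # 2
print(inductive)  # ['Use declined over time']
```

The same tally-by-domain structure underlies the paper counts reported per CFIR construct in the Results tables.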
Results

Search Results
Our search retrieved 7617 records after deduplication (); 503 underwent full-text review, of which 175 were included based on inclusion and exclusion criteria. A list of the excluded studies and their reasons for exclusion is provided in Table S1 in . Studies that did not contain data on barriers or facilitators of DMHIs were excluded (psychosis samples n=36; bipolar samples n=57). However, to supplement findings regarding the potential harms of DMHIs, we examined AE reporting in the 36 studies of psychosis samples.
Figure 1. PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) flow diagram.

Description of Included Studies
We included 175 papers reporting 150 studies (11,446 participants). Tables S2-S5 in outline the study design, sample characteristics, and quality checks. Most DMHIs focused on general symptom management, relapse prevention, or similar. Some focused on a specific symptom (eg, cognitive difficulties and paranoia), and a minority on medication adherence, lifestyle, physical activity, or smoking cessation. Common technology types were mobile apps (77/150, 51.3% studies), websites (42/150, 28% studies), SMS text messaging (13/150, 8.7% studies), and wearables (15/150, 10% studies). Slightly more DMHIs were blended (ie, delivered with human support, 49/150, 32.7% studies) or stand-alone (24/150, 16% studies) than minimally blended (23/150, 15.3% studies) or included setup or technical input only (20/150, 13.3% studies). The median DMHI use was 8 (IQR 4-19) weeks. Study types included single-arm interventional studies (47/150, 31.3%), RCTs (25/150, 16.7%), qualitative studies (20/150, 13.3%), cross-sectional surveys (14/150, 9.3%), observational studies (7/150, 4.7%), intervention development studies (13/150, 8.7%), usability testing studies (11/150, 7.3%), and case studies (7/150, 4.7%). Most studies examined actual DMHI use (106/150, 70.7%), though some were hypothetical (ie, asking participants about potential DMHI use, 44/150, 29.3%), and some tested in-session usability rather than use in daily life (9/150, 6%). Samples included people with schizophrenia spectrum disorders (98/150, 65.3% studies), bipolar disorder (62/150, 41.3% studies), and clinicians (26/150, 17.3% studies). Participants were recruited from a range of settings, including inpatient services (21/150, 14% studies), community services (115/150, 76.7% studies), and online adverts (16/150, 10.7% studies). 
The Mixed Methods Appraisal Tool subsections (scored out of 5; Table S4 in ) indicated moderate quality for the qualitative (mean 4.1, SD 1.57) and quantitative (mean 4.0, SD 1.0) study elements, and lower quality for mixed methods elements (mean 3.1, SD 1.9). The CONSORT harms checklist showed AE reporting to be mostly poor or missing (Table S5 in ); thus, it is difficult to draw informative conclusions regarding the harms of DMHIs.
Findings Not Fitting Within CFIR Framework
Although most relevant findings were coded into CFIR, a few could not be coded: patterns of DMHI use over time, associations between demographic variables and DMHI engagement, and safety reporting findings.
Patterns of Use
Sixteen studies reported reduced DMHI use over time, 7 reported stable use, and none reported an increase.
Demographics
Demographic findings are outlined in Table S6 in . Only 28% (11/39) of studies examining age found significant effects: 6 found higher engagement in older people, 4 in younger people, and 1 found mixed effects. Conversely, all staff participants in qualitative studies hypothesized higher DMHI engagement among younger people [-]. Only 28% (7/25) of studies examining gender or sex reported significant effects; most (n=5) found higher DMHI engagement among women. Only 20% (4/20) of studies examining education reported significant effects; all 4 found higher education associated with higher engagement. Only 18% (2/11) of studies reported significant effects of race, ethnicity, or immigration: Ben-Zeev et al [] reported significantly higher engagement among White participants than African American participants (and among White participants than Hispanic participants for some, but not all, engagement indices), while Bonet et al [] reported significantly higher agreement to use a DMHI among non-native participants than native Spanish participants. Of studies examining the effects of IQ, 67% (2/3) showed significant effects, with higher IQ associated with higher DMHI engagement [,]. Reading level (0/1), employment (0/5), and marital status (0/2) showed no significant effects.
Safety Reporting
describes safety information extracted from 168 papers (139 studies) reporting actual DMHI use. Only 86 papers (76 studies) reported AE-related content; 23 reported AE frequency or percentage (median number of AEs reported per study 2, IQR 0.8-39.2). AEs included hospitalizations, psychological distress, suicidal ideation, physical health-related events, and death. Relatedness of AEs to the intervention was only reported in 11 papers, 8 of which deemed some events related.
CFIR Coding
provide detailed findings regarding the coding of reviewed studies into the 5 domains of the CFIR framework: Innovation (), Outer setting (), Inner setting (), Individual characteristics (), and Implementation process (). Findings were most frequently related to Innovation (; n=149 papers) and Individual (; n=137) domains of CFIR, with aspects of the Outer setting (; n=9), Inner setting (; n=20), and Implementation process (; n=5) domains rarely described. shows the number of papers coded into each CFIR domain per year of publication. Studies published between 2010 and 2015 were coded almost exclusively against the Innovation or Individual CFIR domains. Occasional studies were then coded against the Inner setting domain from 2016 onward, and against the Inner setting, Outer setting, and Implementation process domains from 2018 onward.
Prominent subcodes within the Innovation domain were Relative advantage (99/175, 56.6% papers), Complexity (56/175, 32% papers), and Design (128/175, 73.1% papers). As many papers commented on DMHI design, details of inductive subcodes are provided separately in . These included design features to increase DMHI engagement, customization, repetition vs variety, type of device used, user interface design, passive sensing, data security and privacy, and technical difficulties. Need (48/175, 27.4% papers), Capability (88/175, 50.3% papers), Opportunity (48/175, 27.4% papers), and Motivation (73/175, 41.7% papers) from the Individuals domain were all widely used, although almost exclusively in relation to innovation recipients rather than other roles. Details of how prominent findings from the CFIR-coded data related to the 5 crosscutting themes are provided in the Crosscutting Themes section.
Table 1. Consolidated Framework for Implementation Research (CFIR) Innovation codes. For each construct, counts are given as papers, n (%); data type (qualitative/quantitative/mixed); study typea (hypothetical/usability/actual); and sample (patient/staff/both).

Source — 14 (8); data type 11/0/3; study type 7/1/8; sample 10/0/4. Participants hypothesized that clear endorsement by a known institution (eg, health service or university) would increase engagement. One study [] noted that online information from pharmaceutical companies was less trusted than other sources. Some preferred endorsement by individual staff because “organizations have hidden agendas” []. Perceived credibility of teams developing or testing the DMHIb impacted engagement [-]. Teams that included people with lived experience of psychosis or bipolar disorder were especially credible. However, one study [] highlighted that co-design is not a panacea; despite being co-designed, the study innovation (MindFrame) “was neither applicable nor appealing to all.”

Evidence base — 4 (2); data type 1/1/2; study type 4/0/0; sample 3/1/0. A patient perspective piece [] suggested that health care professionals discussing the evidence base for specific DMHIs with patients would likely increase engagement. Two surveys [,] collected mixed methods data on whether information on the evidence base of bipolar apps influenced users’ app choice. Qualitatively, “participants seldom reported evidence of efficacy or scientific quality to justify app choice,” but in quantitative responses, 93% [] and 90% of participants [] rated scientific quality as important. Health care professionals [] cited a lack of knowledge regarding which apps were evidence based as the foremost barrier to discussing apps with patients with bipolar disorder.

Relative advantage (connection with peers and normalization) — 43 (25); data type 31/2/10; study type 18/0/29; sample 31/3/9. Qualitative data highlighted that DMHIs can facilitate peer connection and normalize psychosis or bipolar disorder experiences. Peer contact, both as a formal part of the intervention (peer facilitators or video resources) and via a bespoke or mainstream social media platform, provided “the opportunity to learn from others managing the same condition, reducing self-stigma through normalization of shared experiences” []. Peer support appeared to influence participants’ actual level of engagement with DMHIs [], or participants hypothesized that it would [,,]. Participants valued encouragement and accountability from others on the platform [-] and the opportunity to socialize with peers [-]. Even DMHIs without specific peer content were normalizing, as their presence implied others with similar needs [-], and they used a similar format to everyday things (“...everyone uses an app these days... It’s normal now,” [,]). However, some were anxious that others would ask what the device or app was for [,,,], or preferred apps for “the general consumer, so I do not feel stigmatized as a patient for using it” [].

Relative advantage (availability and autonomy) — 65 (37); data type 42/8/15; study type 29/2/38; sample 43/7/15. Reviewed studies highlighted DMHIs’ availability (anytime, anywhere) as a key advantage and reported that DMHIs increased participants’ autonomy. Availability was particularly beneficial for people constrained by working hours [], childcare [], geography [,,], reversed sleep cycles [], or during recovery from psychosis [,,]. Patients contrasted DMHI availability with that of their care team [], and valued DMHIs’ ecological validity [] and being able to visit, or revisit, content at their own pace [-]. Quantitative data [] indicated similar app engagement off- and on-hours; ie, DMHIs are used outside clinic hours when available. Choosing when and where to use DMHIs enhanced feelings of autonomy, control, or empowerment [,], encouraging individuals to play an active role in managing their mental health (“They actually feel like they have part ownership in the process,” []) and decreasing reliance on health professionals (“it gives you a bit of freedom to say ‘hold on a second, I don’t have to wait for my CPNc’” []). Accessing information without a clinician’s direct involvement was particularly empowering, with patients frequently describing DMHIs as a friend or therapist. Patients and staff noted that internet-based mental health information ranges in accuracy, so providing “curated knowledge from a bunch of professionals rather than just what Google tells you” [] was a significant advantage. Participants from 10 studies described a DMHI in human terms, including as a “CPN in your pocket” [], “therapist in your phone,” “personal smoking trainer” [], “life-coach” [], “friend” [,,,,], “buddy” [], “close companion” [], or “constant companion” []. Staff [,,] and patients [,,,,,,,,] across 12 studies highlighted that a DMHI’s anonymity can be a substantial advantage. For example, some DMHIs provided an opportunity to vent without fears of a subsequent medication increase [], to “scream on the keyboard” at any time [], or to disclose sensitive or potentially disturbing information in a nonjudgmental, anonymous place instead of directly to a person [,,,,,]. A minority of participants viewed some automated DMHI functions as potentially detracting from autonomy; for example, automatic transfer of data from the app to services [,,], and medication reminders []. Availability and autonomy contributed to perceptions of DMHIs as widening the reach of services. Symptom-monitoring or cognitive behavior therapy apps helped users feel safe with a more hands-off clinical approach (“it did... make me feel safe... I knew he was keeping an eye on those graphs,” []) while allowing prompt intervention if needed. More trivially, technology provided a convenient way to keep in touch with clinicians (eg, SMS text messaging, email, and instant messaging).

Relative advantage (self-awareness and memory) — 61 (35); data type 34/9/18; study type 17/0/45; sample 45/1/15. DMHIs’ availability helped patients notice patterns in their experiences (“to see what’s making me happy, what’s doing my head in” []), remember those experiences, and talk to others about them. Participants said DMHIs gave a useful structure for therapeutic conversations [,-], promoting shared understanding, strengthening therapeutic relationships, helping patients feel “heard” [] or “connected” [,,], and facilitating help-seeking. One study highlighted a self-monitoring app’s value in translating individuals’ subjective experiences into an objective form, enabling others to access their inner experiences and take them more seriously (“If you were to answer the questions and go to the doctor and say, ‘look... you can see clearly there’s a change, and these are my experiences,’ that would be substantial evidence for the doctor to then sit up and take note” []). In some cases, this process was formalized as part of an ecological momentary intervention [,]. One study drew a specific link between self-awareness and user engagement (“EMAd self-monitoring and EMA-derived feedback were seen as helpful for improving awareness, highlighting a willingness to engage with more intense monitoring for clinical purposes” []). As well as sharing symptom data with clinicians, patients often shared it with family, facilitating communication, understanding, and support. However, 2 surveys reported that only a minority of patients [,] and staff [] thought symptom information should be shared with family, and a qualitative study cautioned that users should always retain control over who had data access [].

Relative advantage (should enhance but not replace human care) — 16 (9); data type 13/0/3; study type 11/0/7; sample 9/4/3. Qualitative studies outlined ways that DMHIs did not present a relative advantage over human care. Neither staff [] nor patients [] liked “fake empathy” offered by a machine, considering it degrading. Many saw DMHIs as a cost-cutting exercise [,,,,], which might feel “dismissive” to patients, “as though they are not worthy of a clinician or therapist” [], leading them to feel “fobbed-off” []. Some worried that using DMHIs blended within face-to-face sessions might render clinicians less “present” in the interaction [,,] because they are “increasingly focused on a computer or on data rather than on the suffering human seated in front of them” []. Patients considered DMHIs a “poor substitute for seeing a person that knows you” [], lacking personal contact [,], opportunities for “being listened to” [], tone of voice, body language [], and emotional reassurance [], and unable to address everyday practical problems (eg, housing difficulties) []. Staff worried that DMHIs lacked nonverbal communication [], context [,], and personalization [,]; left patients “alone to deal with any issues that surface” []; removed a key source of socialization and warmth, potentially reinforcing social avoidance []; and that DMHI-generated data would be difficult to interpret without staff members’ clinical experience [].

Adaptability — 0 papers; data type, study type, and sample not applicablee. No studies provided information on whether the ability to adapt a DMHI to a specific service affected engagement. Personalization of DMHIs for individual users is coded under Innovation design.

Trialability — 2 (1); data type 1/1/0; study type 2/0/0; sample 1/1/0. An expert consensus study [] and a patient perspective piece [] highlighted that the availability of a free trial version of a DMHI would increase the likelihood that staff and patients would use it.

Complexity — 55 (31); data type 31/10/14; study type 16/7/37; sample 41/2/12. Technology-related complexity, stemming from poor user experience design, increased user burden and “mental fatigue” [], prompting disengagement [,,,]. Lengthy login processes reduced engagement [,,,,], as did difficult-to-navigate websites or apps (“I wanted to go and do it, but then I just couldn’t navigate it. And I was just like, ‘Oh, I can’t be bothered’,” []). Qualitative and quantitative data indicated that shallow website and app structures, and pages with <200 words, were most usable for patients. Unwieldy, multistep systems (eg, SMS text messaging–based symptom monitoring) were perceived as “awkward, laborious, and burdensome” [] and associated with significantly lower engagement than a more streamlined native smartphone app []. One study noted that tailoring DMHI content increased complexity []. Patients suggested that training and support may mitigate the effects of technology-related complexity []. Individual capability (eg, cognitive ability and digital literacy) [,] also moderated the extent to which complexity hindered engagement and the amount of training that was needed to mitigate this effect [,]. Some studies noted that practical complexity, such as remembering to carry, charge, or use an additional device [,-], hindered engagement. Although most issues of complexity were related to the technology, rather than the intervention more broadly, staff in several studies [,,,,] found the information they received from symptom-monitoring apps complex to interpret, making it difficult to know how to respond clinically. Patients expressed concerns that some apps and websites had overwhelming amounts of information or features [,,]. A patient perspective piece highlighted the fallacy of the “killer app,” preferring apps with a clear aim that do not try to do too much at once [].

Design — 128 (73); data type 56/43/29; study type 38/5/93; sample 101/7/20. As most reviewed studies commented on aspects of DMHI design, details are provided separately in .

Cost — 9 (5); data type 1/6/2; study type 4/0/6; sample 7/0/2. Most participants (85%) in a large international survey of people with bipolar disorder (n=919) preferred free or low-cost apps [], and qualitative survey data suggested DMHIs for bipolar disorder should be “free or inexpensive so accessible to all or many” []. Five studies [-] listed financial costs associated with delivering a particular DMHI (eg, web hosting, licensing fees, texting fees, technical support, and staff time to deliver) but did not examine the effect on engagement.
aSome papers fall into more than one category, so categories may add to more than the overall total N.
bDMHI: digital mental health intervention.
cCPN: community psychiatric nurse.
dEMA: ecological momentary assessment.
eNot applicable.
Table 2. The Consolidated Framework for Implementation Research (CFIR) Outer setting codes. For each construct, counts are given as papers, n (%); data type (qualitative/quantitative/mixed); study typea (hypothetical/usability/actual); and sample (patient/staff/both).

Critical incidents — 1 (1); data type 1/0/0; study type 1/0/0; sample 0/0/1. Patients and clinicians noted their use of technology had increased since the COVID-19 pandemic [].

Local attitudes — 0 papers; data type, study type, and sample not applicableb.

Local conditions — 3 (2); data type 1/1/1; study type 2/0/3; sample 1/2/0. US clinicians [] said local technological conditions were a major barrier to DMHIc implementation. A lack of regional infrastructure (mobile phone signal) prevented app use by clinicians and patients. Many participants had government-sponsored smartphones, incapable of running the study app. Technological infrastructure is not a universal barrier, though: a survey of EISd clinicians across 30 US states indicated that 93% had high-speed internet available [], and a study testing an app in a Canadian city indicated 100% mobile phone coverage [].

Partnerships — 0 papers; data type, study type, and sample not applicableb.

Policies and laws — 3 (2); data type 3/0/0; study type 3/0/0; sample 0/1/2. Clinicians from Indian health services highlighted the need to understand regulations and abilities governing app-generated data before using apps with clients []; US clinicians in the same study did not raise such concerns. Study authors attributed the difference to recently introduced telehealth policies in India. Clinicians from Australian [] and United Kingdom [] mental health services noted the lack of guidelines and policies as a barrier to DMHI engagement.

Financing — 3 (2); data type 1/0/2; study type 2/0/2; sample 2/0/1. Empirical results did not directly report on financing in relation to DMHI engagement or implementation, but study authors noted that a sustainable business model for maintaining evidence-based DMHIs is often absent. One study noted that costs were offset by the billable nature of the service [].

External pressure — 0 papers; data type, study type, and sample not applicableb.
aSome papers fall into more than one category, so categories may add to more than the overall total N.
bNot applicable.
cDMHI: digital mental health intervention.
dEIS: early intervention service.
Table 3. The Consolidated Framework for Implementation Research (CFIR) Inner setting codes. Each construct entry ends with bracketed counts in the order: papers, n (%); data type (qualitative/quantitative/mixed); study type^a (hypothetical/usability/actual); sample (patient/staff/both). Constructs with no relevant findings are marked [0; —; —; —].

Structural characteristics (IT): poor IT infrastructure, including aging computers [], poor bandwidth [,], and no wireless connection in hospitals [], hindered engagement. Staff in one study expressed low confidence that their organization could implement DMHIs^b because of its historical failures to deliver past IT projects []. [4 (2); 3/0/1; 2/0/2; 1/2/1]

Structural characteristics (physical infrastructure): a lack of quiet space in the clinic hindered implementation []. [1 (1); 0/0/1; 0/0/1; 0/0/1]

Structural characteristics (work infrastructure): work infrastructure affected staff DMHI engagement in one study []. Local mental health teams had been relocated and merged shortly before the DMHI was introduced; the study authors hypothesized that staff hesitated to engage fully with the DMHI for fear that it would increase their workload further. [1 (1); 1/0/0; 0/0/1; 0/0/1]

Relational connections: —^c [0; —; —; —]

Communications: — [0; —; —; —]

Culture (human equality-centeredness): none. [0; —; —; —]

Culture (recipient-centeredness): staff felt it was important to offer patients a choice to use a DMHI: “It’s also nice to think, to be able to give someone an option that you’re not forcing down their throat” []. Patients in this study appreciated this attitude, although the impact on DMHI engagement was not directly investigated. [1 (1); 1/0/0; 0/0/1; 0/0/1]

Culture (deliverer-centeredness): none. [0; —; —; —]

Culture (learning-centeredness): none. [0; —; —; —]

Tension for change: — [0; —; —; —]

Compatibility: the lack of compatibility between DMHIs and existing workflows was a key barrier to engagement and implementation. Clinicians believed DMHIs would create extra work, a major barrier in overstretched services [,,,,]. Some DMHIs duplicated existing work [,]: “reporting symptoms via the app system, instead of saving time, added another workflow to manage.” Participants often questioned who should be responsible for managing and responding to real-time symptom data [,,,,], particularly out of hours [,,,,], as doing so did not fit well with their current ways of working. Several studies emphasized that compatibility with existing electronic medical record systems would be important [,,]. Clinicians in one study [] strongly suggested that patients should bring symptom-monitoring data to appointments, rather than uploading data in real time. Although this arguably undermines a key advantage of DMHIs (real-time data), the suggestion highlights how challenging staff find the prospect of a “constant stream of information” [], and the perceived additional responsibility that comes with that information. [17 (10); 11/3/3; 9/0/9; 2/5/10]

Relative priority: — [0; —; —; —]

Incentive systems: “Reimbursement by payers for time spent training patients and family members about DMHI and for time spent using or reviewing data from the DMHI” was a strong potential facilitator of HCP^d engagement with DMHIs []. Similarly, the “difficulty having the DMHI approved by insurance” was considered a substantial barrier to implementation []. [1 (1); 0/1/0; 1/0/0; 0/1/0]

Mission alignment: although staff appeared to consider some aspects of DMHIs aligned with their mission (see Relative advantage in ), 5 studies highlighted ways in which implementing DMHIs was not aligned with the inner setting’s central mission. Staff felt funding was “channeled into technological advancement when perhaps it would be better channeled into staffing and training” []. Given limited time, clinicians considered other on-mission tasks more important than using DMHIs, eg, emergent crises [], the client’s own priorities [,], and simply having “more pressing things to deal with” []. Staff in one study felt “a huge burden of responsibility toward protecting their clients from harm and take this responsibility incredibly seriously” []; they perceived some aspects of DMHIs to present additional risks, directly conflicting with this perceived mission. Other staff were cautious of DMHIs disrupting the trust they had built with clients (“I’m not prepared to do anything that could damage that relationship... you work so hard to get that trust” []); hence, they would always prescreen DMHI content before showing it to a client. [5 (3); 4/0/1; 3/0/2; 0/3/2]

Available resources (materials and equipment): in response to digital exclusion challenges, it has been suggested that health services could provide devices to facilitate DMHI access. None of the reviewed studies directly investigated the impact of this, but staff in one study “felt the NHS should not provide the required technology because of concerns that tablets and mobile phones may get lost, sold, or damaged” []. However, across 15 reviewed studies that reported device loss rates, only 112 (11.1%) of 1011 provided devices were lost, broken, stolen, or damaged. Staff [] and patients [] also mentioned the need for devices for staff themselves, to facilitate blended DMHIs; eg, for SMS text messaging–based DMHIs [], or as a backup if the data connection is lost during video-based aspects of a blended DMHI []. [20 (11); 7/6/7; 5/0/15; 14/2/4]

Available resources (funding): none. [0; —; —; —]

Available resources (space): none. [0; —; —; —]

Access to knowledge and information: providing staff with training in basic digital skills [,], in how to evaluate DMHI credibility and quality [,], and in how to use specific DMHIs [,,] was hypothesized to facilitate engagement. One participant said, “There’s no point in [a patient] being a whizz on that computer and smartphone and me not having a clue cos I wouldn’t be able to support adequately, there would be... ongoing training needs” []. Even staff who were digitally confident felt they would need more training to assist others [] and lacked knowledge about which DMHIs are credible [,]. As well as clear training or instructions on how to support patients with specific DMHIs [,], staff also valued the opportunity to try the DMHI themselves as part of this training []. [5 (3); 2/3/0; 4/0/2; 0/4/1]
^a Some papers fall into more than one category, so categories may add to more than the overall total N.
^b DMHI: digital mental health intervention.
^c Not applicable.
^d HCP: health care professional.
Table 4. The Consolidated Framework for Implementation Research (CFIR) Individual codes. Each construct entry ends with bracketed counts in the order: papers, n (%); data type (qualitative/quantitative/mixed); study type^a (hypothetical/usability/actual); sample (patient/staff/both).

Need (perceived benefit): users need to perceive a DMHI’s^b benefit to try it (“People do not do things just because their doctors tell them to. They need to see the benefit of what they do” []) and to continue using it (“If people think an app is helping them, they will continue using it” []). Clinicians noted that quickly finding benefits helped sustain use: “People tend to download the app and then it decreases in use if people don’t see a quick benefit” []. [34 (19); 16/10/8; 10/0/25; 24/4/6]

Need (stage of illness): qualitative data highlighted that needs may differ depending on patients’ stage of illness or recovery and that DMHI content (eg, psychoeducation) and functions (eg, symptom tracking) are often best suited to the needs of people with recent onset illness or ongoing symptoms. Quantitative data implied effects on engagement: people with recent onset psychosis were more likely to use an app than those with established psychosis (037), and those with longer psychosis duration rated digital monitoring as less important to managing their illness than more recently diagnosed people. Several studies reported DMHI discontinuations due to improved mental health [,,]. Some adapted their DMHI use to meet their own needs []. [27 (15); 14/8/5; 7/0/21; 21/1/5]

Capability (physical capability): 4 studies mentioned that medication side effects (eg, fine motor tremors and blurry vision) interfered with participants’ ability to use a DMHI [,,,]. [4 (2); 2/0/2; 1/0/3; 3/0/1]

Capability (psychological capability): Table S7 in outlines data from studies quantitatively examining the effects of specific symptoms, functioning, and diagnosis on DMHI engagement. Around a third of quantitative analyses found a significant effect of symptoms on engagement, with higher baseline positive, negative, or cognitive symptoms associated with lower DMHI engagement. Results for functioning and depression were more mixed. Quantitative studies did not examine the impact of relapse on DMHI engagement, but 11 qualitative studies noted decreased DMHI use during relapse due to thought disorder (“I wouldn’t have been able to make sense of the topics” []), using a DMHI feeling “too confrontational” [], avoiding certain symptom-related questions [], difficulty staying awake during app training [], or lack of energy or motivation to engage [] when unwell. Nevertheless, relapsing participants were typically able to continue using a DMHI once symptom exacerbation had resolved [,], and one study noted that, for some, stopping the app might be a sign of imminent relapse []. A few participants avoided DMHIs due to interactions between their symptoms and technology, such as delusions or paranoia about technology [,,,], adverse effects of technology in mania [,], or social anxiety during DMHI social media use []. Some variables relating to psychological capability were examined in very few studies: neither baseline biological rhythm disturbance [] nor baseline medication adherence [,,] predicted DMHI engagement, although those who chose to use a DMHI were less likely to be using injectable medication []; better premorbid adjustment was associated with higher DMHI engagement []; and cannabis use predicted lower DMHI engagement []. [76 (43); 26/33/17; 16/0/61; 64/4/8]

Capability (digital literacy): lack of digital literacy was a barrier for some, and expert consensus participants considered self-efficacy beliefs about using devices highly likely to promote DMHI engagement []. Some clinicians reported lacking the technological capability to deliver a DMHI [,,]. Three studies quantitatively examined the effects of patients’ digital literacy on DMHI engagement: two reported no significant effect [,], while in one study participants with prior smartphone experience engaged significantly more in a digital than in a nondigital intervention, whereas no such difference was observed for people without previous smartphone experience. [33 (19); 14/10/9; 9/3/22; 24/6/3]

Opportunity (space): participants reported a lack of a suitable environment as a barrier. Busy homes, hospitals, public libraries, or homelessness presented challenges of noise, repeated interruptions, and lack of privacy. For blended DMHIs, the in-person location was sometimes a barrier [,,]. [12 (7); 7/2/3; 1/0/11; 9/1/2]

Opportunity (time): lack of time due to work [,], physical illness [], preoccupation with delusions [], or other commitments was a barrier to engagement for some patient participants. Conversely, being able to fit the DMHI within their daily or weekly routine was a facilitator, and inpatient participants enjoyed using a DMHI as “something to do” []. [22 (13); 7/10/5; 1/0/21; 20/0/2]

Opportunity (technology): lack of access to technology was a barrier. This included not owning a phone, tablet, or computer [,,,], or having a device with limited functionality [], insufficient storage [], insufficient data allowance [,], poor internet connection [], or frequently changed phone numbers [,]. Staff highlighted a key difficulty with using technology for health care: “phones are... a luxury item... if people aren’t working because of health needs... they are being denied healthcare because they can’t afford a phone” []. [30 (17); 15/12/3; 16/0/15; 19/6/5]

Motivation (general): some were motivated by financial or material factors (eg, vouchers or free study devices). Others described intrinsic motivations: commitment to participating [], wanting to provide accurate data [], agreement with DMHI tasks or goals [], less “controlled motivations for treatment” [], and interest in technology []. One study noted that people who tended to be more engaged with services were also more engaged with the DMHI []. [12 (7); 7/3/2; 4/1/9; 9/2/1]

Motivation (support): across 45 qualitative studies, the following types of human support were mentioned: training in basic digital skills, managing technical difficulties, support navigating the DMHI, ongoing coaching to sustain use, discussion of psychoeducational materials, and interpreting or responding to changes in real-time symptom data. Four randomized controlled trials quantitatively compared unsupported vs supported DMHI engagement. Two reported significantly higher engagement in a peer-supported group than in an unsupported group (P<.05 [,]; P=.005 []). One reported a significant interaction effect such that reviewing symptoms had a significant positive effect on app adherence only for younger women (P=.001), with no significant overall effect []. Finally, a pilot randomized controlled trial found moderate effect sizes in favor of the group allocated to support from a clinical helper (not powered to detect significant differences []). Regarding social support more generally, quantitative results were mixed: one study reported lower social support among more actively engaged participants [], with a second reporting the opposite pattern []. [48 (27); 24/13/11; 13/1/39; 36/4/8]

Motivation (fear of relapse; barrier): anticipated or actual anxiety about DMHI use was sometimes a barrier. Staff [] and patients [,] hypothesized that using a DMHI may increase awareness of symptoms, heightening users’ anxiety about deterioration and potentially making symptoms worse. There was some evidence to indicate that this belief may hinder engagement. Quantitatively, users reporting higher fear of relapse at baseline showed lower engagement with a symptom-monitoring app over the following 6 months []. Elsewhere, one participant reported: “Being notified of all the changes sometimes made me anxious. It made me wonder if the illness was maybe about to get out of control” [], and a minority of participants in other studies reported increased rumination [,,]. This could be mitigated to some extent by support: “It really helped to have her sit there and talk things through with me... and just to have her to bring me back down, or bring me back into reality” []. Others were hesitant to (accurately) report symptoms via a DMHI for fear that it may lead to more restrictive treatment, such as hospital admission or increased medication [,,,]. [26 (15); 11/5/10; 8/0/19; 20/3/3]

Motivation (DMHI too confronting; barrier): some participants found DMHIs to be confronting: being reminded about their diagnosis, or noticing how unwell they were, was disheartening and often prompted disengagement. A striking number of quotations related to this, for example: “My best days are when I typically forget that I have any kind of illness... to be very heavily reminded about the gravity of it all can be a bit unpleasant” []; “It started me comparing how I am now to how I used to be... and its not good. That’s why I canceled it. I found it too personal in the end” []; “I wasn’t ready to accept the illness... I wasn’t willing to change my life according to the program” []. In one study [], half the patients felt negatively influenced by frequent confrontations with their symptoms. In another study [], only a minority (4%) found the DMHI confronting, and others reported that it helped them recognize their progress. One study quantitatively examined whether recovery style predicted DMHI engagement, finding no significant effect []. [11 (6); 7/3/1; 2/0/9; 8/0/3]
^a Some papers fall into more than one category, so categories may add to more than the overall total N.
^b DMHI: digital mental health intervention.
Table 5. The Consolidated Framework for Implementation Research (CFIR) Implementation process codes. Each construct entry ends with bracketed counts in the order: papers, n (%); data type (qualitative/quantitative/mixed); study type^a (hypothetical/usability/actual); sample (patient/staff/both).