
Psychometric properties of the Dutch version of the Evidence-Based Practice Attitude Scale (EBPAS)

Abstract

Background

The Evidence-Based Practice Attitude Scale (EBPAS) was developed in the United States to assess attitudes of mental health and welfare professionals toward evidence-based interventions. Although the EBPAS has been translated into different languages and is being used in several countries, all research on the psychometric properties of the EBPAS within youth care has been carried out in the United States. The purpose of this study was to investigate the psychometric properties of the Dutch version of the EBPAS.

Methods

After translation into Dutch, the Dutch version of the EBPAS was examined in a diverse sample of 270 youth care professionals working in five institutions in the Netherlands. We examined the factor structure with both exploratory and confirmatory factor analyses, and we assessed internal consistency reliability. We also conducted multiple linear regression analyses to examine the association of EBPAS scores with professionals’ characteristics. It was hypothesized that responses to the EBPAS items could be explained by one general factor plus four specific factors, that good to excellent internal consistency reliability would be found, and that EBPAS scores would vary by age, sex, and educational level.

Results

The exploratory factor analysis suggested a four-factor solution according to the hypothesized dimensions: Requirements, Appeal, Openness, and Divergence. Cronbach’s alphas ranged from 0.67 to 0.89, and the overall scale alpha was 0.72. The confirmatory factor analyses confirmed the factor structure and suggested that the lower order EBPAS factors are indicators of a higher order construct. However, Divergence was not significantly correlated with any of the other subscales or with the total score. The confirmatory bifactor analysis confirmed that variance was explained both by a general attitude towards evidence-based interventions and by four specific factors. The regression analyses showed an association between EBPAS scores and youth care professionals’ age, sex, and educational level.

Conclusions

The present study provides strong support for a structure with a general factor plus four specific factors and internal consistency reliability of the Dutch version of the EBPAS in a diverse sample of youth care professionals. Hence, the factor structure and reliability of the original version of the EBPAS seem generalizable to the Dutch version of the EBPAS.


Background

The dissemination and implementation of evidence-based practice (EBP) to improve the quality of care and outcomes for clients and their families is a critical concern worldwide [1]. EBP is the integration of the best available research with clinical expertise in the context of client characteristics, culture, and preferences [2,3]. The purpose of EBP is to promote effective psychological practice and enhance public health by applying empirically supported principles of psychological assessment, case formulation, therapeutic relationship, and intervention. EBP also holds the promise to increase cost-effectiveness [4]. EBP may not only comprise the use of efficacious interventions, but may also include innovations such as data monitoring systems, alerts to target prescribing practices, and routine outcome monitoring (ROM) with feedback to clinicians [5-7].

Multiple factors at community, organizational, and individual levels influence the dissemination and implementation of EBP in real-world mental healthcare settings [8-10]. There is increasing evidence that the values and beliefs of professionals play an important role in the degree to which innovations are initiated and incorporated into common practice [11,12]. On the one hand, attitudes of professionals toward EBP can be a precursor to the decision of whether or not to try a new practice [13,14], and if professionals do decide to try a new practice, the affective or emotional component of attitudes can impact decision processes regarding the actual implementation and use of the innovation [12]. On the other hand, behaviour can influence attitudes [15]. Engaging in a behaviour, such as continued use of an evidence-based intervention until familiarity is developed, using a data monitoring system to track specific indicators of change, or attending collaborative meetings with one’s peers, may change attitudes and beliefs about EBP [5].

In order to tailor implementation efforts to the needs and characteristics of professionals in youth care institutions, we have to consider professionals’ attitudes toward adopting EBP [16,17]. Aarons [18] developed the Evidence-Based Practice Attitude Scale (EBPAS) to assess professionals’ attitudes toward adopting EBP in mental health and welfare settings. The EBPAS asks about professionals’ feelings toward using evidence-based interventions, which are defined as new types of therapy or treatments and interventions with specific guidelines or components that are outlined in a manual or that are to be followed in a predetermined way. There are two other instruments to assess the attitudes of professionals toward adopting EBP. However, one is a non-validated survey questionnaire that was not designed to assess change over time [19]. The other was developed to measure professionals’ views about and implementation of the five steps of the EBP process [20] as defined by Sackett et al. [3]. Because most of our professionals are not trained in these five steps, we chose an instrument that assesses a general attitude toward using evidence-based interventions.

The EBPAS consists of 15 items measured on a 5-point Likert scale ranging from 0 (Not at all) to 4 (To a very great extent). The items of the EBPAS are organized into four dimensions. The Appeal subscale (four items) assesses the extent to which the professional would adopt an evidence-based intervention if it was intuitively appealing, could be used correctly, or was being used by colleagues who were happy with it. The Requirements subscale (three items) assesses the extent to which the professional would adopt an evidence-based intervention if it was required by the supervisor, agency, or state. The Openness subscale (four items) assesses the extent to which the professional is generally open to trying new interventions and would be willing to try or use more structured or manualized interventions. The Divergence subscale (four items) assesses the extent to which the professional perceives evidence-based interventions as not clinically useful and less important than clinical experience. The EBPAS total score is computed by first reverse scoring the Divergence scale item scores and then computing the overall mean [17]. The EBPAS total score represents one’s global attitude toward adoption of evidence-based interventions. The higher the score, the more positive the attitude toward evidence-based interventions.
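
To make this scoring rule concrete, the following is a minimal sketch in Python with pandas, assuming one row per respondent; the item column names (req_1, app_1, and so on) are hypothetical stand-ins, and the real item-to-subscale mapping should be taken from the EBPAS itself.

```python
import pandas as pd

# Hypothetical item-to-subscale mapping; consult the EBPAS for the actual
# item numbering. All items are scored 0 (Not at all) to 4 (To a very
# great extent).
SUBSCALES = {
    "requirements": ["req_1", "req_2", "req_3"],
    "appeal": ["app_1", "app_2", "app_3", "app_4"],
    "openness": ["open_1", "open_2", "open_3", "open_4"],
    "divergence": ["div_1", "div_2", "div_3", "div_4"],
}

def score_ebpas(responses: pd.DataFrame) -> pd.DataFrame:
    """Return subscale means and the total score for each respondent."""
    scores = pd.DataFrame(index=responses.index)
    for name, items in SUBSCALES.items():
        scores[name] = responses[items].mean(axis=1)
    # Reverse-score Divergence (0-4 scale, so x -> 4 - x) before averaging
    # all 15 items into the total score.
    items_for_total = responses.copy()
    div_items = SUBSCALES["divergence"]
    items_for_total[div_items] = 4 - items_for_total[div_items]
    all_items = sum(SUBSCALES.values(), [])
    scores["total"] = items_for_total[all_items].mean(axis=1)
    return scores
```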

Previous studies [16-18] confirmed the four-factor structure of the EBPAS in samples from the United States. These studies also suggested adequate internal consistency reliability for the EBPAS total score (Cronbach’s alpha ranging from 0.77 to 0.79) and good internal consistency reliability for the subscale scores (Cronbach’s alpha ranging from 0.74 to 0.93) in three samples. Only the Divergence subscale had a somewhat lower reliability estimate (Cronbach’s alpha ranging from 0.59 to 0.66 across studies). Construct validity was supported, in part, by finding acceptable model-data fit for confirmatory factor analysis (CFA) models in United States samples as well as a Greek sample [16-18,21,22]. Acceptable fit indices were found for both a first-order structure (in which the individual items loaded on four factors) and a higher order structure (in which the four first-order factors were indicators of a more global higher order construct). Construct validity was also supported by the association of EBPAS scores with mental health clinic structures and policies [18], culture and climate [23], and leadership [24]. Evidence of content validity was obtained by asking an expert panel to rate the relevance of each item of the EBPAS for each proposed construct [16]. Content validity was supported because every item was on average rated as at least moderately relevant, important, and representative of the factor it was purported to assess. Criterion validity was supported by studies showing that EBPAS scores predict adoption and use of evidence-based interventions [14,25,26]. To date, two studies have examined changes over time of EBPAS scores, with both reporting little variation over time [27,28].

Furthermore, several studies have examined EBPAS scores in relation to individual differences between professionals (e.g. education, level of experience, age, discipline, and sex) and organizational characteristics (e.g. structures and policies, climate and culture, and leadership). Higher educational attainment was associated with a lower likelihood of adopting evidence-based interventions if required, greater willingness to adopt given the appeal of evidence-based interventions, and more general openness to evidence-based interventions [16,24,29,30]. As with educational attainment, years of experience was associated with lower Requirements scores, but also with lower Openness and EBPAS total scores [16,29]. Years of experience was related to higher Divergence scores for autism early intervention professionals [29] and to lower Divergence scores for mental health professionals [16]. Contrasting educational attainment with years of experience thus revealed different patterns, suggesting a more restrained openness to adopting an evidence-based intervention among those with higher educational attainment and lower enthusiasm for an evidence-based intervention among those with more on-the-job experience [16]. Results concerning differences by age, discipline, and sex were inconsistent. Some studies found that older professionals had higher Requirements [16] and Openness scores [23], while other studies found that younger professionals had higher EBPAS total scores [21,31]. However, older professionals also had higher Divergence scores [23]. The relationship between age and attitude toward evidence-based interventions is possibly affected by job tenure. Concerning discipline, one study did not find any difference in EBPAS scores [18], but another study in a United States nationally representative sample found that professionals trained in social work had higher Openness and EBPAS total scores than professionals trained in psychology [16]. Sex differences were absent in two studies [18,21], but in two other studies women had higher Appeal, Requirements, and EBPAS total scores [16,24].

Although the EBPAS has been translated into different languages and is currently being used in several countries (e.g. in Iran, Israel, Japan, Korea, Norway, Romania, and Sweden), a Dutch version of the EBPAS was not yet available [32]. Further, all research on the psychometric properties of the EBPAS within youth care has been carried out in the United States. It is imperative to test the EBPAS in other countries to facilitate cross-cultural comparisons. Even though there are similarities between youth care in the Netherlands and the United States, such as the types of disorders treated, substantial differences exist regarding organizational structures, financial barriers, types of services, training and background of professionals, and client attitudes [33].

The purpose of this study was to investigate the psychometric properties of the Dutch version of the EBPAS. We first translated the EBPAS into Dutch and evaluated the forward and back-translation. Second, we examined the factor structure and internal consistency reliability of the Dutch version of the EBPAS in a diverse sample of youth care professionals, replicating the exploratory factor analysis and confirmatory factor analyses of Aarons et al. [16] and Aarons [18]. In addition, we conducted a confirmatory bifactor analysis to evaluate the plausibility of subscales, to determine the extent to which the EBPAS total score reflects a single factor, and to evaluate the feasibility of applying a unidimensional model structure to a measure with heterogeneous indicators. Third, we examined the association of EBPAS scores with age, sex, and educational level of the professionals. Based on the literature [32], it was hypothesized that (1) responses to the EBPAS items could be explained by one general factor (attitude toward evidence-based interventions) plus four specific factors (Requirements, Appeal, Openness, and Divergence), (2) good to excellent internal consistency reliability would be found, (3) EBPAS scores would vary by age (with older professionals scoring higher on the Requirements, Openness, and Divergence subscales and lower on the EBPAS total score), (4) EBPAS scores would vary by sex (with women scoring higher on the Appeal and Requirements subscales and the EBPAS total score), and (5) EBPAS scores would vary by educational level (with professionals with university education scoring higher on the Appeal and Openness subscales and lower on the Requirements subscale).

Methods

Setting

The present study took place within the Academic Center Youth Nijmegen (ACYN), a multidisciplinary collaboration between Radboud University and multiple youth care institutions in the south-east of the Netherlands. One of the main aims of ACYN is to improve the care for youth by making this care more evidence based. Among other things, ACYN stimulates the youth care institutions to use more evidence-based interventions and innovations such as ROM with feedback to clinicians. The coordinators of seven youth care institutions were contacted and the study was described to them in detail. Permission was sought to survey the professionals of the collaborating departments. Five institutions agreed to participate: one large mental healthcare institution and four institutions for child welfare.

Procedure

With permission of the original author (GAA), a Dutch version of the EBPAS was constructed. First, a forward translation was conducted. The EBPAS was translated into Dutch by the first and last author of this article. The authors emphasized conceptual equivalence of the questionnaire rather than literal translation. Then, a back-translation was made by a bilingual, native English-speaking translator. The first and last author of this article compared the back-translation with the original version and discussed the differences. Subsequently, another bilingual, native English-speaking translator made a literal Dutch translation of the original version, compared the first Dutch translation with the second Dutch translation, and compared the differences with the original version of the EBPAS. It was concluded that no significant differences appeared during the translation process. Only small adjustments were suggested, which were incorporated by the first and last author of this article. The final Dutch version of the EBPAS can be obtained from the first author.

For each participating institution an appointment was made to discuss how and when the EBPAS would be assessed. The first two institutions and one department of the third institution preferred a pen and paper survey of the EBPAS. The coordinators of these institutions and this department scheduled the survey sessions within regular team meetings. The surveys were administered to the whole group of professionals at these meetings. After the meetings, the professionals returned the survey in an envelope, which was sent to the principal investigator. The other department of the third institution and the fourth and fifth institutions preferred a web-based survey of the EBPAS. The professionals of this department and these institutions received a personalized survey invitation email with a web-link to the electronic questionnaire system NetQuestionnaires version 6.5 [34]. Professionals who did not respond within a week received a reminder email. After the surveys were completed, the answers were automatically saved in the electronic questionnaire system and the data were exported to an SPSS file. Data collection was conducted from November 2011 through November 2012.

After the data collection was finished, the coordinators of the institutions received an overview of participating professionals. Subsequently, the coordinators collected information about the age, sex, and education of the professionals through the institutions’ electronic personnel files. If demographic information was missing, the coordinators asked the professionals directly to provide the information. Due to technical issues with retrieving information from the electronic personnel files, difficulties reaching professionals at specific locations, and the absence of some professionals after they had filled out the EBPAS, it was not possible to obtain demographic information for all professionals.

Participants

A total of 276 youth care professionals completed the survey. The results of six professionals were excluded: one endorsed two answers for most items, one completed the survey twice, one was not a clinician, and three filled out 0 (Not at all) for all items. Filling out Not at all for every item (straight lining) produced inconsistent results, such as “I do not know better than academic researchers how to care for my clients” combined with “I am not willing to use new and different types of therapy/interventions developed by researchers”. Thus, the final sample consisted of 270 professionals.
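
As an illustration of the straight-lining screen, and assuming the hypothetical column layout of the earlier scoring sketch, respondents whose answers show no variation across items can be flagged like this:

```python
import pandas as pd

# Toy data: the first two respondents vary their answers; the last
# respondent is a straight-liner who answered 0 (Not at all) throughout.
df = pd.DataFrame({
    "req_1": [3, 2, 0],
    "app_1": [4, 1, 0],
    "open_1": [2, 3, 0],
    "div_1": [1, 0, 0],
})

# A row with only one unique value across all items is a straight-liner.
straight_liners = df.nunique(axis=1) == 1
df_clean = df.loc[~straight_liners]  # the last respondent is dropped
```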

Of the respondents, 140 (51.9%) worked in a mental healthcare institution and 130 (48.1%) worked in a child welfare institution. Data on age, sex, and educational level were missing for 85 (31.5%), 30 (11.1%), and 44 (16.3%) respondents, respectively. The mean age of the remaining respondents was 43.15 years (SD = 11.03; range, 23–63) and 71.3% (n = 171) were female. The respondents’ primary disciplines were education (n = 78; 34.5%), psychology (n = 66; 29.2%), nursing (n = 33; 14.6%), social work (n = 12; 5.3%), medicine (psychiatrists and physicians; n = 12; 5.3%), non-verbal therapies (e.g. psychomotor therapy; n = 10; 4.4%), teaching (n = 8; 3.5%), and other (n = 7; 3.1%). Because of the relatively small number of professionals within each discipline, we created groups based on level of education. This resulted in two groups: higher vocational education (education, nursing, social work, non-verbal therapies, and teaching; n = 125; 55.3%) and university education (education, psychology, and medicine; n = 94; 41.6%). The ‘other’ category was excluded from the analyses (n = 7; 3.1%).

Analyses

SPSS statistical software version 20 [35] was used for the assessment of internal consistency reliability and the exploratory factor analysis. Mplus software version 6 [36] was used for the confirmatory factor analyses, confirmatory bifactor analysis, and multiple linear regression analyses. For one respondent (0.4%), data for seven questions were missing. These missing values were imputed in SPSS through expectation maximization, which makes use of all available data in estimating parameters. We used the imputed dataset for our analyses. Because we used two different data collection methods, we also tested in SPSS for differences in mean total score between the pen and paper group and the web-based group. Since we found no differences, we merged the two groups for the analyses.
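
The study used SPSS’s expectation-maximization routine. As a rough, non-equivalent analogue in Python, an iterative conditional imputer can fill the missing answers from the observed ones; this is a sketch on simulated data, not a reproduction of the SPSS procedure.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Simulated 270 x 15 response matrix with seven answers missing for one
# respondent, mirroring the situation described above.
rng = np.random.default_rng(0)
X = rng.integers(0, 5, size=(270, 15)).astype(float)
X[0, :7] = np.nan

# Each feature with missing values is iteratively modeled as a function
# of the other features; the predictions replace the missing entries.
X_imputed = IterativeImputer(random_state=0).fit_transform(X)
```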

We examined descriptive statistics (means and standard deviations) at the item level, for the EBPAS subscales, and for the EBPAS total score. We also examined item-total correlations. Internal consistency reliability was examined using Cronbach’s alpha.
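
For reference, both statistics can be computed directly from the item score matrix. A small NumPy sketch follows; the corrected item-total variant, which correlates each item with the sum of the remaining items, is one common choice and is an assumption here.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlate each item with the sum of the remaining items."""
    items = np.asarray(items, dtype=float)
    total = items.sum(axis=1)
    return np.array([
        np.corrcoef(items[:, j], total - items[:, j])[0, 1]
        for j in range(items.shape[1])
    ])
```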

To assess the factor structure of the Dutch version of the EBPAS, we first replicated the analyses by Aarons et al. [16] and Aarons [18]. Two separate factor analytic procedures were conducted. The sample was divided by randomly selecting approximately 50% of cases and assigning cases to either an exploratory (n = 127) or confirmatory (n = 143) analysis group. An exploratory factor analysis (EFA) was conducted on one half of the sample using Principal Axis Factoring in order to partition systematic and error variance in the solution [37,38]. Promax oblique rotation was used, allowing for factor intercorrelations [37]. Items were retained on a factor if they loaded at least 0.32 on the primary factor and less than 0.32 on all other factors [39]. Item-total correlations and scale reliabilities were also used to assess scale structure. Confirmatory factor analyses (CFA) were conducted on the other half of the sample to test the factor structure derived in the EFA. Because professionals were nested within institutions, models were adjusted for the nested data structure using the TYPE = COMPLEX procedure within Mplus (cf. [40]).
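
A sketch of the EFA step using the third-party factor_analyzer package, which supports principal axis factoring with promax rotation; the data here are simulated stand-ins, and the 0.32 retention rule is applied to the rotated pattern matrix.

```python
import numpy as np
from factor_analyzer import FactorAnalyzer

# Simulated stand-in for the 127 x 15 EFA half-sample.
rng = np.random.default_rng(1)
X = rng.integers(0, 5, size=(127, 15)).astype(float)

efa = FactorAnalyzer(n_factors=4, method="principal", rotation="promax")
efa.fit(X)
loadings = efa.loadings_  # 15 x 4 rotated pattern matrix

# Retain an item if exactly one loading reaches |0.32|: at least 0.32 on
# its primary factor and below 0.32 on all other factors.
retained = (np.abs(loadings) >= 0.32).sum(axis=1) == 1
```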

In addition, we conducted a confirmatory bifactor analysis with the following features [41,42]: (1) each item had a non-zero loading on both the general factor and the specific factor that it was designed to measure, but zero loadings on the other specific factors, (2) the specific factors were uncorrelated with each other and with the general factor, and (3) all error terms associated with the items were uncorrelated. The model was identified by fixing the variance of each latent factor to 1. Because the observed variables are measured on a 5-point Likert scale, Robust Maximum Likelihood estimation was used.
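
Written out, these three constraints plus the identification constraint amount to the following measurement model, where item i belongs to specific factor k(i):

```latex
% Bifactor measurement model; item i loads on the general factor G and on
% its own specific factor S_{k(i)} only:
x_i = \lambda_{G,i}\, G + \lambda_{S,i}\, S_{k(i)} + \varepsilon_i ,
\qquad i = 1, \dots, 15,
% with uncorrelated factors and errors, and unit factor variances:
\operatorname{Cov}(G, S_k) = 0, \qquad
\operatorname{Cov}(S_k, S_l) = 0 \ (k \neq l), \qquad
\operatorname{Cov}(\varepsilon_i, \varepsilon_j) = 0 \ (i \neq j), \qquad
\operatorname{Var}(G) = \operatorname{Var}(S_k) = 1 .
```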

Commonly accepted rules of thumb for fit indices in confirmatory factor analyses include a comparative fit index (CFI) and Tucker–Lewis Index (TLI) value near 0.95 or greater, a root mean square error of approximation (RMSEA) value near 0.06 or less, and a standardized root mean square residual (SRMR) near 0.08 or less [43].
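
These rules of thumb are easy to encode. A small helper follows; treating “near” a cutoff as meeting it is a simplification of Hu and Bentler’s guidance.

```python
def acceptable_fit(cfi: float, tli: float, rmsea: float, srmr: float) -> bool:
    """Hu & Bentler (1999) rules of thumb: CFI and TLI near 0.95 or
    greater, RMSEA near 0.06 or less, SRMR near 0.08 or less."""
    return cfi >= 0.95 and tli >= 0.95 and rmsea <= 0.06 and srmr <= 0.08

# Example with the fit reported below for the second-order CFA:
print(acceptable_fit(cfi=0.96, tli=0.95, rmsea=0.05, srmr=0.05))  # True
```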

After identifying a measurement model with acceptable fit, we examined the association of the Dutch EBPAS subscale and total scores with characteristics of the youth care professionals (i.e. age, sex, and educational level). Rather than replicating the multiple-stage analytic approach of Aarons [18], we replicated the regression analyses of Aarons et al. [16], because we had already adjusted for nested data with the TYPE = COMPLEX procedure within Mplus. We used two-tailed tests.
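
As an illustration of this final step, a sketch in Python using statsmodels: ordinary least squares with cluster-robust standard errors by institution approximates the adjustment for nesting (the study itself used maximum likelihood with robust standard errors in Mplus). The variable names and data are invented for the example.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Toy frame standing in for the professional-level data.
rng = np.random.default_rng(2)
n = 270
df = pd.DataFrame({
    "divergence": rng.normal(1.2, 0.6, n),
    "age": rng.integers(23, 64, n),
    "sex": rng.choice(["female", "male"], n),
    "education": rng.choice(["vocational", "university"], n),
    "institution": rng.integers(0, 5, n),  # five participating institutions
})

# Regress a subscale score on professional characteristics, clustering
# the standard errors by institution to account for nesting.
model = smf.ols("divergence ~ age + C(sex) + C(education)", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["institution"]})
print(result.summary())
```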

Results

Exploratory factor analysis (EFA)

An EFA was conducted using data from the randomly selected half of the sample (n = 127). The EFA suggested a four-factor solution in accordance with simple structure criteria, scale reliabilities, and parallel analysis. The EFA model accounted for 61% of the variance in the data. Table 1 shows overall means and standard deviations, item-total correlations, eigenvalues, internal consistency reliabilities, and item loadings for each of the scales. Cronbach’s alphas ranged from 0.67 to 0.89, showing acceptable to good values for the different subscales. The overall scale alpha was 0.72. The factors represented four subscales of attitudes toward adoption of evidence-based interventions in keeping with the hypothesized dimensions: Requirements (three items; α = 0.89), Appeal (four items; α = 0.75), Openness (four items; α = 0.77), and Divergence (four items; α = 0.67). Item analyses showed that the reliability coefficient of no subscale would improve significantly by removing any of its items.

Table 1 EBPAS subscale and item means, standard deviations, item-total correlations, eigenvalues, Cronbach’s alpha, and exploratory factor analysis loadings

First-order confirmatory factor analysis (CFA)

A first-order CFA was conducted using data from the other randomly selected half of the sample (n = 143), specifying the factor structure identified in the EFA. CFA items were constrained to load only on the primary factor indicated in the EFA, thus providing a highly stringent test of the factor structure. As in the EFA, factor intercorrelations were allowed. CFA factor loadings confirmed the EFA-based a priori factor structure and the model demonstrated acceptable fit (χ2(84) = 179.17, CFI = 0.90, TLI = 0.87, RMSEA = 0.09, SRMR = 0.08), further supporting the original EBPAS factor structure. Factor loadings ranged from 0.40 to 0.99 and all were statistically significant. Factor intercorrelations ranged from −0.23 to 0.51. The Openness subscale had a weak positive correlation with the Requirements subscale (r = 0.33, P < 0.01) and a moderate positive correlation with the Appeal subscale (r = 0.51, P < 0.01). The Divergence subscale had no significant correlations with the Requirements (r = 0.04, P > 0.05), Appeal (r = −0.05, P > 0.05), or Openness (r = −0.23, P > 0.05) subscales. The results of the first-order CFA are shown in Figure 1.

Figure 1. Confirmatory factor analysis model of the EBPAS. n = 143, χ2(84) = 179.17, CFI = 0.90, TLI = 0.87, RMSEA = 0.09, SRMR = 0.08; *P < 0.05, **P < 0.01; all factor loadings are significant at P < 0.01.

Second-order confirmatory factor analysis (CFA)

In replication of Aarons et al. [16], a second-order CFA was conducted to test whether all lower order EBPAS factors are indicators of a higher order construct that might be regarded as general attitude toward adoption of evidence-based interventions. With the data from the randomly selected half of the sample that was used for the first-order CFA (n = 143), the model could not be identified. Because we assumed that this subgroup was too small for a second-order CFA, we also conducted the second-order CFA on the total sample of professionals (n = 270). In the total sample, the model could be identified, but the latent variable covariance matrix was not positive definite. This indicated a correlation greater than or equal to one between two variables: item 9 (intuitively appealing) and item 10 (makes sense). After allowing the residuals of these two items to correlate, we found a good fit (χ2(85) = 146.48, CFI = 0.96, TLI = 0.95, RMSEA = 0.05, SRMR = 0.05), and our results exactly matched those of Aarons et al. [16]. Factor loadings ranged from 0.45 to 0.99. All factor loadings were statistically significant, except for Divergence on Attitude (P = 0.21). The results of the second-order CFA are shown in Figure 2.

Figure 2. Second-order confirmatory factor analysis model of the EBPAS. n = 270, χ2(85) = 146.48, CFI = 0.96, TLI = 0.95, RMSEA = 0.05, SRMR = 0.05; all factor loadings are significant at P < 0.01, except for Divergence on Attitude (in italics; P = 0.21). Estimation of correlated residuals between two Appeal subscale items is indicated by a double-headed arrow.

Confirmatory bifactor analysis

The bifactor model simultaneously assessed the specific factors Requirements, Appeal, Openness, and Divergence as well as the general factor Attitude toward evidence-based interventions shared by those specific factors. The bifactor model demonstrated a good fit (χ2(75) = 107.37, CFI = 0.97, TLI = 0.96, RMSEA = 0.04, SRMR = 0.04). The majority of the factor loadings were statistically significant, except for items 3 and 6 on Attitude (P = 0.38 and P = 0.83) and items 14 and 15 on Appeal (P = 0.16 and P = 0.07). Most factor loadings (10 out of 15) were weaker for the general factor than for the grouping factors, and some loadings were negative. According to Reise et al. [44], if items primarily reflect the general factor and have low loadings on the grouping factors, subscales make little sense; when items have substantial loadings on both the general factor and the grouping factors, however, subscales do make sense. Therefore, the findings indicate that variance is explained both by a general factor (Attitude toward evidence-based interventions) and by specific factors above and beyond the general factor (Requirements, Appeal, Openness, and Divergence). This further supports the original EBPAS factor structure. The results of the confirmatory bifactor analysis are shown in Figure 3.

Figure 3. Confirmatory bifactor analysis model of the EBPAS. n = 270, χ2(75) = 107.37, CFI = 0.97, TLI = 0.96, RMSEA = 0.04, SRMR = 0.04; all factor loadings are significant at P < 0.05, except for items ‘Clinical experience important’ and ‘Know better than researchers’ on the general factor (in italics; P = 0.83 and P = 0.38), and items ‘Colleagues like it’ and ‘Enough training’ on Appeal (in italics; P = 0.16 and P = 0.07).

Regression analyses

Regression models using maximum likelihood estimation with robust standard errors were used to assess the association of EBPAS scores with characteristics of youth care professionals. For these analyses, in order to enable a good comparison with the study of Aarons et al. [16], the whole sample was used in a first-order factor model. A second-order factor model could not be estimated because the maximum number of iterations was exceeded. The results are shown in Table 2. For the Requirements subscale, willingness to adopt evidence-based interventions given the requirements to do so decreased with higher levels of education: controlling for age and sex, professionals with university education scored significantly lower than professionals with higher vocational education. For the Appeal subscale, no associations with characteristics of youth care professionals were found. For the Openness subscale, greater openness to new practices was associated with higher levels of education: controlling for age and sex, professionals with university education scored significantly higher than professionals with higher vocational education. Finally, for the Divergence subscale, greater perceived divergence between evidence-based interventions and current practice was associated with being older, being male, and lower levels of education. Holding the other variables in the model constant, older professionals scored significantly higher than younger professionals, males scored significantly higher than females, and professionals with higher vocational education scored significantly higher than professionals with university education.

Table 2 Association of characteristics of youth care professionals with EBPAS scores

Discussion

The current study contributes to the development and cross-validation of the EBPAS by examining the psychometric properties of the Dutch version of the EBPAS. In addition, the present research is the first to assess a bifactor solution of attitude toward evidence-based interventions. The results provide strong support for a structure with a general factor plus four specific factors of the Dutch version of the EBPAS in a diverse sample of youth care professionals from both a mental healthcare institution and institutions for child welfare. The general factor is attitude toward evidence-based interventions, and the specific factors involve willingness to adopt evidence-based interventions given their intuitive appeal, willingness to adopt evidence-based interventions if required, general openness toward evidence-based interventions, and perceived divergence of usual practice with evidence-based interventions. The EBPAS total score and subscale scores demonstrated acceptable to good internal consistency reliability, and the mean loading of items on their predicted factors was substantial to good. These findings are highly congruent with studies of the original version of the EBPAS conducted in the United States [32].

Generally consistent with previous findings, Openness subscale scores were positively correlated with Appeal and Requirements subscale scores. Professionals who are more open toward adoption of evidence-based interventions may also endorse positive attitudes toward the intuitive appeal of these interventions and the requirements to adopt them. Contrary to previous findings, we did not find a statistically significant correlation between the extent to which professionals would adopt evidence-based interventions if they were intuitively appealing and if they were required. The Divergence subscale was not significantly correlated with any of the other subscales. Therefore, the extent to which professionals perceive evidence-based interventions as not clinically useful does not appear to be meaningfully associated with openness to evidence-based interventions, the appeal of these interventions, or willingness to adopt these interventions if required. Moreover, we did not find a statistically significant correlation between the Divergence subscale and the general Attitude toward adoption of evidence-based interventions. While the three distinct dimensions Requirements, Appeal, and Openness possibly address a common attitude toward evidence-based interventions, Divergence seems to be a separate construct in the Dutch sample.

Possible explanations for the finding that the Divergence factor loading was not statistically significant in the higher order model might lie in cultural, political, and/or time differences between youth care institutions in the United States and the Netherlands. A possible cultural difference is that Dutch professionals may view research as separate from using evidence-based interventions [45,46]. There are three items in the EBPAS that explicitly use the term “research/researchers”: two items of the Divergence subscale (“Research-based treatments/interventions are not clinically useful” and “I know better than academic researchers how to care for my clients”) and one item of the Openness subscale (“I am willing to use new and different types of therapy/interventions developed by researchers”). The item scores of the Dutch sample appear to be somewhat higher on the first two items and somewhat lower on the last item compared to scores found in the United States samples. Dutch professionals may feel more discrepancy between their clinical views and scientific results, or may perceive research as confining their ability to make independent decisions about therapy, while still being open to evidence-based interventions, finding these interventions appealing, and being willing to adopt them if required. A possible time difference is that evidence-based interventions may be more widely disseminated and implemented in the United States than in the Netherlands. The main body of research about attitudes toward evidence-based interventions (beginning in 2004) originates from the United States; exposure to the concept and meaning of evidence-based interventions may therefore be more common across professional disciplines (e.g. social work, psychology, marriage and family therapy, nursing) in the United States, where the field has been developing for a longer period than in the Netherlands.

The results also demonstrated associations between EBPAS scores and characteristics of youth care professionals. Professionals with university education scored higher on Openness and lower on both Requirements and Divergence than professionals with higher vocational education. Professionals with university education may have received more education related to evidence-based interventions, in which the drawbacks of overreliance upon unsystematic clinical observations and experience alone are likely to have been raised. It may also be that professionals with university education have had more training and applied practice in evidence-based interventions. Professionals whose education includes and emphasizes exposure to and training in research may be more likely to understand, value, and utilize research [47]. Such professionals may report more openness and less divergence. However, professionals with university education also seem less willing to adopt evidence-based interventions if required. It may be that professionals with additional education and practice are more autonomous and assertive in making independent decisions about utilizing evidence-based interventions [16], and therefore less willing to accept external pressure than their colleagues with lower education.

Furthermore, males and older professionals reported more perceived divergence than females and younger professionals. The sex difference is in contrast to studies with the original version of the EBPAS, in which this effect was not found [32]. This suggests that additional research is needed to explore when, where, and why sex differences in attitudes toward evidence-based interventions might operate. The age difference is in accordance with one earlier study [23]. It is conceivable that younger professionals, like professionals with a university degree, are more familiar with evidence-based interventions because they have had more exposure to these interventions during their education than older professionals [48]. Being older may also mean attaching more importance to one’s own clinical experience and less to research-based interventions.

Some limitations should be noted. First, the sample was not large and thus statistical power was limited. Further, data on age were missing for almost one-third of the respondents and data about level of experience were not available, limiting the direct comparisons that could be made with previous studies. Additionally, only one mental healthcare organization was included; although this was a large one, caution should be exercised when generalizing to other mental healthcare institutions. Furthermore, while our sample consisted of professionals of a mental healthcare institution and institutions for child welfare, our findings may not generalize to professionals in individual practice or to other sectors. Finally, because only information about respondents was available, it was not possible to compare respondents with non-respondents to examine potential non-response bias.

This study has added to the knowledge about the EBPAS in general and for the Netherlands specifically. To our knowledge, this is the first study that used the EBPAS in the Netherlands. Additional research is needed to further establish the factor structure of the Dutch version of the EBPAS. Both the first-order structure and the higher order structure require more consideration and investigation. Additionally, it is not clear whether the factor structure would vary for child mental health versus child welfare professionals. Future research should examine factor structure differences by primary discipline, level of experience, and sector. In addition, the construct and criterion validity of the Dutch version of the EBPAS have to be further established. More research is needed to confirm associations between EBPAS scores and characteristics of youth care professionals, structures and policies of youth care institutions, culture and climate, and leadership. Furthermore, the sensitivity to change of the Dutch version of the EBPAS should be investigated in order to examine how attitudes towards evidence-based interventions might influence behaviour, how behaviour might influence change in attitudes, and whether attitudes predict adoption, implementation, and sustainment of evidence-based interventions. Finally, for implementation of evidence-based interventions within youth care, it will be useful to examine factors that might moderate the relationship between attitudes and behaviour. Such behaviours are not limited to the use of efficacious treatment models, but can also include the use of innovations such as data monitoring systems, alerts to target prescribing practices, and ROM. Although the referents of the EBPAS are interventions that have research support and may be manualized and structured, attitudes towards these interventions can also reflect attitudes toward other clinical innovations. Using information obtained with the Dutch version of the EBPAS may ultimately enable better tailoring of implementation efforts to the readiness of youth care professionals to adopt evidence-based initiatives.

Conclusions

The present study provides strong support for the original four-factor structure and internal consistency reliability of the Dutch version of the EBPAS in a diverse sample of youth care professionals. This supports the utility of the EBPAS in varied countries and settings. Because we focused on real-world professionals, the study also provides support for ecological validity.

The study suggests four directions for future research. First, the factor structure of the Dutch version of the EBPAS needs to be further established. Second, additional research is needed to further establish the construct and criterion validity of the Dutch version of the EBPAS. Third, the sensitivity to change of the Dutch version of the EBPAS should be investigated. Finally, it is recommended to examine factors that might moderate the relationship between attitudes and behaviour of youth care professionals.

Abbreviations

ACYN: Academic Center Youth Nijmegen

CFA: Confirmatory factor analysis

CFI: Comparative fit index

EBP: Evidence-based practice

EBPAS: Evidence-Based Practice Attitude Scale

EFA: Exploratory factor analysis

RMSEA: Root mean square error of approximation

ROM: Routine outcome monitoring

SRMR: Standardized root mean square residual

TLI: Tucker–Lewis Index

References

1. Aarons GA, Cafri G, Lugo L, Sawitzky A. Expanding the domains of attitudes towards evidence-based practice: the evidence based practice attitude scale-50. Admin Pol Ment Health. 2012;39(5):331–40. doi:10.1007/s10488-010-0302-3.
2. APA Presidential Task Force on Evidence-Based Practice. Evidence-based practice in psychology. Am Psychol. 2006;61(4):271–85. doi:10.1037/0003-066X.61.4.271.
3. Sackett DL, Rosenberg WMC, Gray JAM, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn’t. Br Med J. 1996;312(7023):71–2.
4. Barnoski R. Outcome evaluation of Washington State’s research-based programs for juvenile offenders. Olympia, WA: Washington State Institute for Public Policy; 2004.
5. Aarons GA, Glisson C, Green PD, Hoagwood K, Kelleher KJ, Landsverk JA, et al. The organizational social context of mental health services and clinician attitudes toward evidence-based practice: a United States national study. Implement Sci. 2012;7:56. doi:10.1186/1748-5908-7-56.
6. Bickman L. A measurement feedback system (MFS) is necessary to improve mental health outcomes. J Am Acad Child Adolesc Psychiatry. 2008;47(10):1114–9.
7. Bickman L, Kelley SD, Breda C, de Andrade AR, Riemer M. Effects of routine feedback to clinicians on mental health outcomes of youths: results of a randomized trial. Psychiat Serv. 2011;62(12):1423–9.
8. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629.
9. Glisson C, Schoenwald SK. The ARC organizational and community intervention strategy for implementing evidence-based children’s mental health treatments. Ment Health Serv Res. 2005;7(4):243–59. doi:10.1007/s11020-005-7456-1.
10. Raghavan R, Bright CL, Shadoin AL. Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implement Sci. 2008;3:26. doi:10.1186/1748-5908-3-26.
11. Damanpour F. Organizational innovation: a meta-analysis of effects of determinants and moderators. Acad Manage J. 1991;34(3):555–90.
12. Frambach RT, Schillewaert N. Organizational innovation adoption: a multi-level framework of determinants and opportunities for future research. J Bus Res. 2002;55(2):163–76.
13. Fishbein M, Hennessy M, Yzer M, Douglas J. Can we explain why some people do and some people do not act on their intentions? Psychol Health Med. 2003;8(1):3–18. doi:10.1080/1354850021000059223.
14. Henggeler SW, Chapman JE, Rowland MD, Halliday-Boykins CA, Randall J, Shackelford J, et al. Statewide adoption and initial implementation of contingency management for substance-abusing adolescents. J Consult Clin Psychol. 2008;76(4):556–67. doi:10.1037/0022-006X.76.4.556.
15. Olson JM, Stone J. The influence of behavior on attitudes. In: Albarracín D, Johnson BT, Zanna MP, editors. The handbook of attitudes. Mahwah, NJ: Lawrence Erlbaum Associates Publishers; 2005. p. 223–71.
16. Aarons GA, Glisson C, Hoagwood K, Kelleher K, Landsverk J, Cafri G. Psychometric properties and U.S. National norms of the Evidence-Based Practice Attitude Scale (EBPAS). Psychol Assess. 2010;22(2):356–65. doi:10.1037/a0019188.
17. Aarons GA, McDonald EJ, Sheehan AK, Walrath-Greene CM. Confirmatory factor analysis of the Evidence-Based Practice Attitude Scale in a geographically diverse sample of community mental health providers. Admin Pol Ment Health. 2007;34(5):465–9. doi:10.1007/s10488-007-0127-x.
18. Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the Evidence-Based Practice Attitude Scale (EBPAS). Ment Health Serv Res. 2004;6(2):61–74.
19. McColl A, Smith H, White P, Field J. General practitioners’ perceptions of the route to evidence based medicine: a questionnaire survey. Br Med J. 1998;316(7128):361–5.
20. Rubin A, Parrish DE. Validation of the Evidence-Based Practice Process Assessment Scale. Res Soc Work Pract. 2011;21(1):106–18. doi:10.1177/1049731509347851.
21. Melas CD, Zampetakis LA, Dimopoulou A, Moustakis V. Evaluating the properties of the Evidence-Based Practice Attitude Scale (EBPAS) in health care. Psychol Assess. 2012;24(4):867–76. doi:10.1037/a0027445.
22. Wolf DAPS, Dulmus CN, Maguin E, Fava N. Refining the evidence-based practice attitude scale: an alternative confirmatory factor analysis. Soc Work Res. 2014;38(1):47–58. doi:10.1093/swr/svu006.
23. Aarons GA, Sawitzky AC. Organizational culture and climate and mental health provider attitudes toward evidence-based practice. Psychol Serv. 2006;3(1):61–72. doi:10.1037/1541-1559.3.1.61.
24. Aarons GA. Transformational and transactional leadership: association with attitudes toward evidence-based practice. Psychiat Serv. 2006;57(8):1162–9.
25. Smith BD, Manfredo IT. Frontline counselors in organizational contexts: a study of treatment practices in community settings. J Subst Abuse Treat. 2011;41(2):124–36. doi:10.1016/j.jsat.2011.03.002.
26. Beidas RS, Marcus S, Aarons GA, Hoagwood KE, Schoenwald S, Evans AC, et al. Individual and organizational predictors of community therapists’ use of evidence-based practices in a large public mental health system. JAMA Psychiatry. 2015. Ahead of print.
27. Lopez MA, Osterberg LD, Jensen-Doss A, Rae WA. Effects of workshop training for providers under mandated use of an evidence-based practice. Admin Pol Ment Health. 2011;38(4):301–12. doi:10.1007/s10488-010-0326-8.
28. Gioia D. Using an organizational change model to qualitatively understand practitioner adoption of evidence-based practice in community mental health. Best Pract Ment Health. 2007;3(1):1–15.
29. Stahmer AC, Aarons G. Attitudes toward adoption of evidence-based practices: a comparison of autism early intervention providers and children’s mental health providers. Psychol Serv. 2009;6(3):223–34. doi:10.1037/a0010738.
30. Nakamura BJ, Higa-McMillan CK, Okamura KH, Shimabukuro S. Knowledge of and attitudes towards evidence-based practices in community child mental health practitioners. Admin Pol Ment Health. 2011;38(4):287–300. doi:10.1007/s10488-011-0351-2.
31. Gray MJ, Elhai JD, Schmidt LO. Trauma professionals’ attitudes toward and utilization of evidence-based practices. Behav Modif. 2007;31(6):732–48. doi:10.1177/0145445507302877.
32. Aarons GA, Green A, Miller E. Researching readiness for implementation of evidence-based practice: a comprehensive review of the Evidence-Based Practice Attitude Scale (EBPAS). In: Kelly B, Perkins D, editors. Cambridge handbook of implementation science for psychology in education: how to promote evidence based practice. Cambridge: Cambridge University Press; 2012.
33. Sareen J, Jagdeo A, Cox B, Clara I, ten Have M, Belik S-L, et al. Perceived barriers to mental health service utilization in the United States, Ontario, and the Netherlands. Psychiat Serv. 2007;58(3):357–64.
34. NetQuestionnaires Nederland BV. NetQuestionnaires, Version 6.5. Amsterdam: NETQ Insights B.V.; 2002.
35. IBM Corp. IBM SPSS Statistics for Windows, Version 20.0. Armonk, NY: IBM Corp; 2011.
36. Muthén LK, Muthén BO. Mplus User’s Guide. 7th ed. Los Angeles, CA: Muthén & Muthén; 1998–2012.
37. Fabrigar LR, Wegener DT, MacCallum RC, Strahan EJ. Evaluating the use of exploratory factor analysis in psychological research. Psychol Methods. 1999;4(3):272–99.
38. Nunnally J, Bernstein I. Psychometric theory. 3rd ed. New York: McGraw-Hill; 1994.
39. Tabachnick BG, Fidell LS. Using multivariate statistics. 5th ed. Boston: Allyn and Bacon; 2007.
40. Kuntsche E, Jordan MD. Adolescent alcohol and cannabis use in relation to peer and school factors. Results of multilevel analyses. Drug Alcohol Depend. 2006;84(2):167–74. doi:10.1016/j.drugalcdep.2006.01.014.
41. Holzinger KJ, Swineford F. The bi-factor method. Psychometrika. 1937;2(1):41–54. doi:10.1007/BF02287965.
42. Chen FF, Hayes A, Carver CS, Laurenceau JP, Zhang Z. Modeling general and specific variance in multifaceted constructs: a comparison of the bifactor model to other approaches. J Pers. 2012;80(1):219–51. doi:10.1111/j.1467-6494.2011.00739.x.
43. Hu L-t, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Struct Equ Modeling. 1999;6(1):1–55. doi:10.1080/10705519909540118.
44. Reise SP, Moore TM, Haviland MG. Bifactor models and rotations: exploring the extent to which multidimensional data yield univocal scale scores. J Pers Assess. 2010;92:544–59. doi:10.1080/00223891.2010.496477.
45. Lauteslager M. Wetenschap, akkoord, maar welke wetenschap? Het EBP-debat dat maar geen debat wil worden. Tijdschr Psychother. 2011;37(4):276–93.
46. Lauteslager M. Wetenschap, akkoord, maar welke wetenschap? Psychotherapie als bouwpakket of trukendoos. Tijdschr Psychother. 2011;37(5):350–66.
47. Stewart RE, Chambless DL. Does psychotherapy research inform treatment decisions in private practice? J Clin Psychol. 2007;63(3):267–81. doi:10.1002/jclp.20347.
48. van der Zwet RJM, Beneken genaamd Kolmer DM, Schalk R. Social workers’ orientation toward the evidence-based practice process: a Dutch survey. Res Soc Work Pract. 2014. doi:10.1177/1049731514540340.


Acknowledgements

This study was funded in part by a grant from ZonMw, the Dutch organization for Health Research and Development (Grant Number 157031002). The authors would like to thank the youth care institutions and professionals that participated in the study and made this work possible.

Author information


Corresponding author

Correspondence to Maartje A. M. S. van Sonsbeek.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

MAMSS, GJMH, JWV and BGT conceptualized the specific research questions and analytic approach for this manuscript. MAMSS and BGT were responsible for the translation of the EBPAS into Dutch. Analyses were conducted by MAMSS and MK. MAMSS wrote the first draft of the manuscript, and all authors edited and revised portions of the manuscript. All authors read and approved the final manuscript. The Dutch version of the EBPAS is available from the first author.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.



Cite this article

van Sonsbeek, M.A.M.S., Hutschemaekers, G.J.M., Veerman, J.W. et al. Psychometric properties of the Dutch version of the Evidence-Based Practice Attitude Scale (EBPAS). Health Res Policy Sys 13, 69 (2015). https://doi.org/10.1186/s12961-015-0058-z

