Purpose. The job analysis described in this report was conducted by the National Certification Board for Diabetes Educators (NCBDE) in order to 1) provide a basis for documenting the continuing validity of the Certification Examination for Diabetes Educators, 2) define new areas that should be assessed in future certification examinations, and 3) ensure that the content of certification examinations is job related.

Methods. The study involved developing a diabetes educator job task list and survey, distributing 1,100 surveys, and analyzing survey responses from a multidisciplinary and geographically representative sample of certified diabetes educators.

Results. Three hundred thirty-nine surveys were suitable for analysis, with relevant demographic subgroups adequately represented. Based on survey data, an examination matrix and a detailed content outline were constructed that will be used by NCBDE to assemble future test forms.

Conclusions. Certification examination specifications were developed that relate directly to the important activities diabetes educators perform. Future forms of the certification examination will continue to be matched to job-related, criterion-referenced test specifications and will have strong evidence of content validity. Future forms of the exam will contain 200 items at specified cognitive levels, with a representative sampling of tasks within three core areas from the detailed content outline.

This study of the role of certified diabetes educators (CDEs) was conducted in 2004 for the National Certification Board for Diabetes Educators (NCBDE) by its testing agency, Applied Measurement Professionals. The purpose of this study was to describe the CDE's job with enough detail to 1) provide a valid basis for a national, state-of-the-art, professional, job-related certification examination for diabetes educators, and 2) define areas that should be assessed in future certification examinations.

The previous NCBDE job analysis was completed in 2000. The NCBDE board of directors wanted to determine whether there had been changes in the core job responsibilities of diabetes educators within the past 4 years that needed to be incorporated into the structure of the exam. Connecting an assessment of knowledge and skill to current job responsibilities is an essential objective of the certification examination. As a valid measure, it needs to mirror what CDEs are actually doing and reflect not only mastery of knowledge, but also application of that knowledge to everyday practice.

The NCBDE selected a Job Analysis Advisory Committee to ensure that expert judgment was available to the board and its testing agency at every stage of the project. The committee consisted of experienced practitioners, all CDEs from a variety of professional disciplines and geographic areas, who were thoroughly familiar with the skills and activities of a diabetes educator.

It was the committee's responsibility to 1) provide information about the core knowledge and skills that define the role of a CDE, 2) collaborate with the testing agency to design and deliver the survey instrument, 3) review the final form of the survey for completeness and relevance to the profession, 4) interpret the survey results, and 5) create the final detailed examination specifications.

The committee developed a comprehensive list of tasks performed by diabetes educators based on job descriptions, standards and scopes of practice, the current certification examination content outline, and other relevant job-related materials. The final survey consisted of 125 tasks presented in content order and subsequently divided into three core areas: assessment, intervention, and program development and administration. Survey respondents had an opportunity to suggest additional tasks.

A rating scale was selected by the committee to use with the survey. This scale was based on a similar scale used in previous national job analysis studies by NCBDE and other professional organizations. It was designed to determine the importance of each task on the survey to the performance of the diabetes educator. The survey specifically asked respondents to rate on a Likert-type scale the importance of each task to safe and effective performance as a diabetes educator. Such information was necessary to demonstrate that the examination measures significant aspects of the job and covers appropriate content.

The committee also selected background questions designed to gather information about the characteristics of respondents and their patients. A pilot job analysis survey was then distributed to committee members and colleagues for review. The purpose of the pilot study was to determine if any important tasks were missing from the survey, if the directions were clear, and if the rating scale was easy to use and understand. Comments from the pilot study participants were reviewed, and modifications were made to the survey before distribution.

In an effort to obtain information from respondents who represented CDEs throughout the United States, 1,100 surveys were mailed to CDEs randomly selected from NCBDE's database.

Return Rate and Sample Size

Of the 1,100 surveys mailed, 339 usable surveys were returned, for an initial response rate of 30.82%. Sixteen additional surveys were returned but were not usable (i.e., received past the extended deadline or undeliverable). Excluding these 16 from the total mailed yields a corrected response rate of 31.27%, nearly identical to that of NCBDE's 2000 job analysis study. Results based on this sample were stable and judged sufficient for the job analysis.
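For reference, the two rates follow directly from the counts above:

\[
\frac{339}{1100} \approx 30.82\%, \qquad \frac{339}{1100 - 16} = \frac{339}{1084} \approx 31.27\%
\]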

To evaluate the stability of the ratings, an approximate standard error was computed for the rating scale as \(SE \approx 1/\sqrt{n} = 1/\sqrt{339} \approx 0.054\), where \(n\) is the total sample size. This small standard error indicates that the ratings were relatively stable and reflective of the population of CDEs.

Task and Respondent Rating Reliability Estimates

To determine the extent to which tasks were consistently rated within each survey section, a statistic known as coefficient-α was used.1 Coefficient-α is an estimate of the amount of error reflected by the scores associated with the instrument; higher values (i.e., ≥0.90) reflect smaller amounts of error. To determine the extent to which respondents were consistent in rating inventory tasks, a statistic known as the intraclass correlation was used.2 Separate reliability estimates were calculated for each content area and are displayed in Table 1. Because a maximum reliability coefficient is represented by a value of 1.00 and the total reliability estimates for the whole task list were 0.93 (intraclass) and 0.98 (α), the respondents' task ratings were considered statistically reliable. Based on these data, it is likely that a different sample from the same population would have produced similar task ratings.
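To make the coefficient-α computation concrete, the following minimal sketch applies the standard formula to a respondents × tasks rating matrix. It is illustrative only: the study's actual analysis was performed by Applied Measurement Professionals, and the simulated `ratings` matrix below is a stand-in for the real survey data.

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Coefficient-alpha for a (respondents x tasks) matrix of importance ratings."""
    k = ratings.shape[1]                          # number of tasks in the section
    item_vars = ratings.var(axis=0, ddof=1)       # variance of each task's ratings
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated stand-in data: 339 respondents rating 125 tasks on a 1-4 scale.
# Random, independent ratings give alpha near 0; consistent real-world ratings
# produce high values such as the 0.98 reported for the full task list.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 5, size=(339, 125)).astype(float)
print(round(cronbach_alpha(ratings), 3))
```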

Table 1.

Task and Respondent Rating Reliability Estimates

After respondents completed the survey, they indicated how well they thought the task list covered the diabetes educator's job. Ninety-eight percent of respondents thought the survey completely or adequately described the diabetes educator's job.

Demographic Analyses

Background information was collected from respondents about themselves and their patients. These demographic data helped describe the sample. Not all respondents answered every question. Demographic information gathered included:

  • Registered nurses and registered dietitians represented nearly three-fourths (72%) of the respondents.

  • The largest proportion of respondents (43%) were primarily employed in outpatient hospital work settings.

  • The regional distribution of respondents was comparable to the actual regional distribution of CDEs, and the largest number of respondents practice in the Great Lakes Region (20%).

  • The majority of respondents (68%) practice in urban or suburban communities.

  • Respondents had a mean of 13.36 years (SD 7.3) of experience as a diabetes educator.

  • The largest proportion of respondents (40%) hold a bachelor's degree.

  • Ninety-three percent of respondents plan to recertify, and 78% of those planning to recertify have decided to use the new continuing education option rather than take the examination.

  • Ninety-one percent of respondents identified themselves as female.

  • Eighty-four percent of those responding identified themselves as Caucasian.

Figure 1 shows respondents' reasons for obtaining the CDE credential, and Figure 2 shows respondents' perceived benefits of having the CDE credential. Personal satisfaction was the most popular answer to both questions.

Figure 1.

For what reason did you obtain your CDE?

Figure 2.

What effect did obtaining certification have on you or your career?

The survey asked CDEs for information about the characteristics of the patients they educate. As shown in Table 2, respondents estimated that they spend most of their professional time (61.4%) educating patients. Table 3 demonstrates that most of their patients are adults with type 2 diabetes (83.8%), and Table 4 shows that most patients are taking oral agents (51.7%). The majority of educators in this sample indicated no involvement with type 2 diabetes in adolescents or children and surprisingly little contact with patients using pump therapy (1.3%). Their report that 68.8% of patients treated were >40 years of age reflects the predominantly type 2 diabetic population. Educators also noted that the largest percentage of patients were Caucasian (63.7%), followed by African-American (11%) and Hispanic (7.2%). The majority of educators estimated that most communication with patients was face to face (81.6%) rather than by telephone, and only a small percentage (0.4%) had contact by e-mail.

Table 2.

Estimate the percentage of your professional time that you spend in the following roles.

Table 3.

Estimate the percentage of your patient population with each of the following.

Table 4.

Estimate the percentage of your patient population that uses each of the following treatment regimens.

Each of the 125 diabetes educator tasks was rated by the respondents for importance to safe and effective job performance using the following scale: Not Performed, Not Important, Somewhat Important, Quite Important, and Extremely Important. To determine which of the 125 tasks were most significant and most frequently performed, descriptive data were calculated by Applied Measurement Professionals for each task and reviewed by the Job Analysis Advisory Committee.

Because this exam is national in scope, it was critical that the test specifications reflect the responsibilities of diabetes educators across the United States who might be eligible to take the examination. Therefore, the committee adopted four decision rules to identify tasks eligible for assessment. Essentially, these rules kept only tasks that respondents rated as important and frequently performed across geographic regions and across varying years of experience in diabetes education. Specifically, a task was retained only if it was 1) rated at least Quite Important overall, 2) rated at least Quite Important by at least 7 of the 11 regional subgroups, 3) rated at least Quite Important by all of the years-of-experience subgroups, and 4) performed by at least 75% of the respondents.
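Expressed as a filter, the four rules amount to the sketch below. This is purely illustrative, assuming importance ratings coded 1 (Not Important) through 4 (Extremely Important) so that Quite Important corresponds to 3, and using hypothetical per-task summary statistics.

```python
from dataclasses import dataclass

QUITE_IMPORTANT = 3.0  # assumed coding: 1 = Not Important ... 4 = Extremely Important

@dataclass
class TaskStats:
    overall_mean: float            # mean importance rating across all respondents
    region_means: list[float]      # mean rating in each of the 11 regional subgroups
    experience_means: list[float]  # mean rating in each years-of-experience subgroup
    pct_performing: float          # fraction of respondents who perform the task

def eligible(task: TaskStats) -> bool:
    """Apply the committee's four decision rules to one task."""
    return (
        task.overall_mean >= QUITE_IMPORTANT                            # rule 1
        and sum(m >= QUITE_IMPORTANT for m in task.region_means) >= 7   # rule 2
        and all(m >= QUITE_IMPORTANT for m in task.experience_means)    # rule 3
        and task.pct_performing >= 0.75                                 # rule 4
    )
```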

In addition to information from respondents, the Job Analysis Advisory Committee and the NCBDE board of directors took into consideration their own diabetes education experience in determining the 125 tasks included in the final content outline. Consequently, the NCBDE board of directors concluded that the job analysis survey data adequately defined the CDE role on a national basis. Moreover, the job analysis data were judged to be sufficient for the purpose of defining the future structure and content of the NCBDE credentialing examination.

Next, the Job Analysis Advisory Committee determined the number of examination items for each of the three core areas of practice listed in Table 1. The goal was to distribute items in accordance with observed working patterns across the three core areas. In deciding on 200 total items (including 25 new questions that do not contribute to the final score but are studied for future use), the committee considered the respondents' mean percentages within each core area, the respondents' ratings about what percentage of the examination should be devoted to each area, and the number and importance of the tasks in each area.
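As one way to picture distributing items in accordance with observed working patterns, a proportional allocation with largest-remainder rounding could look like the sketch below. The weights are placeholders rather than the survey's actual percentages, and 175 is the 200 total items minus the 25 unscored pretest questions; the committee's real allocation also weighed task importance and respondent preferences, not proportions alone.

```python
def allocate(total_items: int, weights: dict[str, float]) -> dict[str, int]:
    """Largest-remainder apportionment of items across core areas."""
    raw = {area: total_items * w for area, w in weights.items()}
    counts = {area: int(v) for area, v in raw.items()}
    leftover = total_items - sum(counts.values())
    # Hand remaining items to the areas with the largest fractional remainders
    for area in sorted(raw, key=lambda a: raw[a] - counts[a], reverse=True)[:leftover]:
        counts[area] += 1
    return counts

# Placeholder weights; the committee's actual proportions came from the survey data.
weights = {"Assessment": 0.35, "Intervention": 0.45,
           "Program Development and Administration": 0.20}
print(allocate(175, weights))  # {'Assessment': 61, 'Intervention': 79, 'Program Development and Administration': 35}
```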

After the number of examination items was determined, the final step involved defining the cognitive complexity of the examination content. A complexity scale was designed to determine at what cognitive level individual tasks were performed. The information provided a basis for matching test item complexity to job complexity. The complexity scale was based on Bloom's Taxonomy of Educational Objectives,3 and the ratings were described as 1) Recall, requiring only the identification, recall, or recognition of isolated information such as facts, principles, or procedures; 2) Application, requiring comprehension, interpretation, or manipulation of limited concepts or data in which the outcome is situationally dependent but not overly complex (e.g., application of knowledge that varies based on patient characteristics and environment); and 3) Analysis/Evaluation, requiring the integration or synthesis of a variety of concepts or elements to solve a specific problem situation (e.g., evaluating complex problems with many situational variables).

The Job Analysis Advisory Committee discussed each task, assigned appropriate cognitive ratings, averaged ratings for each of the three major content categories, and distributed items appropriately across cognitive levels. Table 5 presents the final certification examination specifications. These specifications and the detailed content outline will be used by the test developers, item writers, and item reviewers to build future forms of the certification examination beginning with testing scheduled for 2006.

Table 5.

Test Specifications

The job analysis described in this article was undertaken to provide evidence supporting content-valid inferences from examination scores. The study was conducted to determine and comprehensively describe the CDE's job, to evaluate this description through the ratings of job experts, and to define areas that should be assessed in this examination.

The NCBDE Job Analysis Advisory Committee prepared a comprehensive list of activities describing the job, and a representative sample of CDEs completed the survey. The results of the survey task ratings were used to develop new test specifications directly related to the important activities that CDEs perform. These test specifications outline the content domain and distribution of items across the three critical content areas. The specifications will guide test development and provide content-related evidence that examination scores relate to the job. Because each test form will be developed to match these job-related test specifications, valid content-related inferences can be drawn about candidates' abilities to perform a CDE's job. In summary, the NCBDE examination content is the result of a carefully detailed process that truly reflects what educators are doing in practice.

John Zrebiec, MSW, CDE, is the associate director of mental health services at the Joslin Diabetes Center and a lecturer in psychiatry at Harvard Medical School in Boston, Mass. He served as chair of the NCBDE Job Analysis Advisory Committee and is a past chair of NCBDE.

NCBDE appreciated the support of Steven Nettles, EdD, and Jaime Walla, MSEd, from Applied Measurement Professionals. NCBDE is grateful to the members of the Job Analysis Advisory Committee for their guidance, expertise, and devotion to this complex project. The committee included: Kathy Berkowitz, APRN, BC, FNP, CDE; Karen A. Chalmers, MS, RD, LD, CDE; Stephen C. Clement, MD, CDE; Ramona K. Corson, PharmD, CDE; Patti Duprey, RN, MSN, ARNP, CDE; Alison Evert, RD, CDE; Cindy J. Halstenson, RD, CDE; Carolé R. Mensing, RN, MA, CDE; Sandra R. Muchnick, MEd, CDE; Stephen W. Ponder, MD, CDE; Lupe A. Rupert, RN, CDE; and John Zrebiec, MSW, CDE, Chair.

1. Guilford JP: Fundamental Statistics in Psychology and Education. New York, McGraw-Hill, 1978.

2. Hopkins KD, Stanley JC, Hopkins BR: Educational and Psychological Measurement and Evaluation. 7th ed. Englewood Cliffs, N.J., Prentice Hall, 1990.

3. Bloom B (Ed.): Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. New York, David McKay Company, 1956.