5-Year Summary Report
Commission on Accreditation

American Psychological Association
Education Directorate
Office of Program Consultation and Accreditation
www.apa.org/ed/accreditation
[email protected]
Tel.: 202.336.5979
Fax: 202.336.5978
5-Year Summary Report
Commission on Accreditation

Office of Program Consultation and Accreditation
Education Directorate
American Psychological Association
www.apa.org/ed/accreditation
Copyright © 2011 by the American Psychological Association. This material may be reproduced and distributed without permission for educational or research purposes provided that acknowledgment is given to the American Psychological Association. This material may not be translated or commercially reproduced without prior permission in writing from the publisher. For permission, contact APA, Rights and Permissions, 750 First Street, NE, Washington, DC 20002-4242.
APA Editorial and Design Services
Deborah Farrell, Editor
David Spears, Designer
Contents

1 Introduction
2 APA Office of Program Consultation and Accreditation Staff
3 Overview of Accredited Programs and Program Review Statistics
3   Accredited Programs by Program Type
4   Program Review Statistics
7 Program Annual Report Summary: 2006–2010
8   Doctoral Programs
16   Internship Programs
20   Postdoctoral Programs
27 From a Committee to a Commission
27   CoA Membership Profile: 2007–2011
28   CoA Chairs: 2007–2011
30   Timeline
32   Background
36   Higher Education Accreditation Update
39 CoA Policy and Procedural Changes: 2007–2011
39   Policy Changes
41   Procedural Changes
42 Appendix: CoA Membership Lists: 2007–2011
Ensuring the quality of education and training of students/trainees is one of the ways we as a health care and mental health profession can best retain the trust of the public and of our colleagues in other professions, as well as assure our continued growth and development.
—The APA Office of Program Consultation and Accreditation
Introduction
This report is a 5-year compendium of accreditation actions, in terms of program review and policy initiatives, taken by the APA Committee/Commission on Accreditation (CoA). The compendium includes data on the landscape of professional psychology programs accredited by the APA CoA at the doctoral, internship, and postdoctoral levels. This report provides graphs that summarize the past 5 years in a descriptive manner. To help you, as reader and consumer, better understand the changes of the past 5 years, we have provided the data for these graphs on our website (http://www.apa.org/ed/accreditation/about/research/index.aspx) for your own analysis of the changes occurring in our field and our future.
The past 5 years have been very turbulent for both the broader higher education community and for accreditation at the institutional and programmatic levels. The federal government has proposed and passed a number of regulations, including the Higher Education Opportunity Act of 2008 and the Integrity Regulations of 2011. Many in the higher education community have expressed concern that these regulations, as a group, appear to be moving accreditors toward becoming extensions of the U.S. Department of Education rather than remaining a form of shared higher education governance for peer review of educational quality. The APA CoA, as an accrediting body, works to ensure the quality of its accredited programs through its own process of self-study and self-regulation. Critical in that effort is providing you, the public, with the information and data in this report.
Thus, the past 5 years have included some important policy initiatives in response to issues within the discipline and the higher education landscape. One of the major internal changes was the shift in the structure of the CoA itself from a 21-member Committee to a 32-member Commission. The shift to a Commission is the beginning of a new phase. Given the importance of self-regulation, we encourage you to assist us in defining the future of our accreditation system. In that regard, the purpose of this special report is twofold:
• To provide summary data on all accredited programs as collected through their annual reports of 2006–2010, highlighting changes and trends over this 5-year period.
• To bring relevant publics up to speed on the changes within the APA CoA's structure, policies, and activities since its 2006 report.
We hope that this information is useful to you.
—Susan Zlotlow, PhD
APA Office of Program Consultation and Accreditation
2 Commission on Accreditation: 5-Year Summary Report
APA Office of Program Consultation and Accreditation Staff*
Susan Zlotlow, PhD Director, Office of Program Consultation and Accreditation and Associate Executive Director, Education Directorate
Gregory Greenwood, PhD Associate Director, Research
Tia Scales Associate Director, Governance and Administration
Elizabeth Horrocks Assistant Director, Operations
Erinn Monteiro, MPH Assistant Director, Policy Analysis and Communications
Jonathan Cole, MA Manager, Research
Erin Wilson Manager, Administration
Daniel Michalski, MPA Manager, Site Visits
Jennifer McCrindle, MPA Manager, Preliminary Review
Lauren Williams Manager, Program Agenda
Tasha Haywood Administrative Assistant
Lois Jones Research Assistant
Ashima Kapur Site Visit Coordinator
Tanika Wiggins Administrative Assistant
Office of Program Consultation and Accreditation staff:
Bottom row, left to right: Erin Wilson, Tia Scales, Susan Zlotlow, and Tanika Wiggins.
Middle row, left to right: Jon Cole, Jen McCrindle, Tasha Haywood, and Erinn Monteiro.
Top row, left to right: Ashima Kapur, Daniel Michalski, Lauren Williams, Lois Jones, Greg Greenwood, and Betsy Horrocks.
*As of September 2011.
Overview of Accredited Programs and Program Review Statistics: 2006–2010
Number(a) of Accredited Programs by Program Type: 2006–2010

Level / Type 2006 2007 2008 2009 2010
Doctoral 372 373 374 372 373
  Clinical PhD 171 170 173 173 173
  Clinical PsyD 60 62 62 62 62
  Counseling 72 72 70 69 69
  School 58 58 59 60 61
  Combined 11 11 10 8 8
Internships 471 474 469 471 477
Postdoctoral 46 49 48 49 59
  Clinical child psychology 3 4 4 4 5
  Clinical health psychology 5 6 6 6 8
  Clinical neuropsychology 11 12 12 13 15
  Clinical psychology 26 26 25 25 30
  Rehabilitation psychology 1 1 1 1 1
Total 889 896 891 892 909

(a) Reflects program counts at the conclusion of each calendar year.
4 Commission on Accreditation: 5-Year Summary Report
Program Review Statistics

Accreditation Decisions: 2006–2010
The CoA conducts program review three times per year, with 50–100 programs typically reviewed at each meeting. This graph represents the overall breakdown of final decisions made during 2006–2010.

[Bar graph, by year (2006–2010): number of programs granted initial or continued accreditation; number of programs receiving any adverse action(a); number of programs voluntarily withdrawing from accreditation. Y-axis: 0–180 programs.]

(a) Adverse actions, as defined in Section 4.2 of the Accreditation Operating Procedures, include "accredited, on probation," "revocation of accreditation," "denial of accreditation," and "denial of a site visit."
Programs Granted Initial and Continued Accreditation: 2006–2010
Of the programs that were granted initial or continued accreditation during 2006–2010, the proportions of each level (doctoral, internship, postdoctoral) receiving accreditation terms of 3, 5, and 7 years are shown here.

[Six pie charts showing the proportion of 3-, 5-, and 7-year accreditation terms: Doctoral (initial and continued accreditation), Internship (initial and continued accreditation), and Postdoctoral (initial and continued accreditation).]
Program Annual Report Summary: 2006–2010
Figures 1–18 present descriptive data based on the doctoral, internship, and postdoctoral residency programs that submitted reports to the Annual Report Online between 2006 and 2010. The purpose of this report is to provide select summary data in graph form to highlight the trends over this 5-year period. Simple descriptive statistics form the basis of the graphs that follow; however, data tables with detailed statistics (e.g., sample size, mean, median, standard deviation, and minimum–maximum) are available on the Office of Program Consultation and Accreditation (OPCA) website (http://www.apa.org/ed/accreditation/about/research/index.aspx).
Since 1998, the annual data submitted by programs each year have been used by the CoA to monitor program adherence to quality assurance standards during those years the program is not engaged in periodic review. The OPCA is providing you—accredited programs and various accreditation publics—with a 5-year overview of some of the data collected since 2006.

For doctoral programs, each graph presents a select metric by program and degree type: clinical PsyD, clinical PhD, counseling, school, and combined programs. The last three groups (counseling, school, and combined) include both PhD and PsyD degrees.

For internship programs, the current report presents data overall; however, the online data tables break out each metric by program setting type, such as university counseling center, community mental health center, Veterans Affairs medical center, state or county hospital, and so on.

For postdoctoral residency programs, graphs report each metric by the type of practice area: clinical child psychology, clinical health psychology, clinical neuropsychology, and clinical psychology. Because only one program has been accredited as a rehabilitation psychology residency program, and because only one clinical child program submitted annual data before 2010, both of these programs are included in the traditional clinical psychology category instead. Beginning in 2010, clinical child programs constituted a separate category.
Select metrics presented in the report include (as applicable at the doctoral, internship, and postdoctoral levels):
• Number of programs
• Number of students
• Percentage of programs and students
• Percentage of female faculty and female students
• Percentage of non-White faculty and non-White students
• Percentage of admission offers
• Median time-to-degree completion
• Percentage of attrition
• Median annual stipend for full-time interns/residents

In 2012, the annual data will not only be used by the CoA in its annual review but will also be disseminated publicly on the OPCA website each year in aggregate form. As usual, specific requests for additional research need to be submitted to and approved by the CoA, and the aggregate results of such analyses will be made available to the public through the Internet (see Implementing Regulation E 1-3(a): Use of Data and Research Personnel Resources).
Doctoral Programs
Figure 1. Number of Accredited Doctoral Programs by Program Type
The proportion of programs submitting annual data each year ranged from 98% to 100% of all accredited doctoral programs, and the number of programs by program type remained fairly constant overall during the 5-year period. By 2010, there were 172 clinical PhD programs, 61 clinical PsyD programs, 69 counseling programs (PhD and PsyD), 59 school programs (PhD and PsyD), and 8 combined programs (PhD and PsyD).
[Bar graph: Number of Doctoral Programs (0–200) by Doctoral Program Type (Clinical PhD, Clinical PsyD, Counseling, School, Combined), 2006–2010.]
Figure 2. Number of Doctoral Students by Program Type
The number of students in accredited doctoral programs rose for some types of programs but declined for others between 2006 and 2010. For example, by 2010 the count of doctoral students in clinical PsyD and school programs increased by 14% and 5%, respectively. The number of students in accredited doctoral programs decreased as follows: for students in clinical PhD programs, a 4% decrease; for students in counseling psychology programs (PhD and PsyD), a 3% decrease; and for students in combined programs (PhD and PsyD), a 22% decrease. Please note that the total number of students in combined programs in that period decreased from 690 in 2006 to 541 in 2010.
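The 22% decline in combined-program enrollment quoted above can be reproduced from the two endpoint counts given in the text; a minimal sketch (variable names are illustrative):

```python
# Combined-program enrollment, as quoted in the report text.
students_2006 = 690
students_2010 = 541

# Percentage change from 2006 to 2010 (negative = decline).
pct_change = (students_2010 - students_2006) / students_2006 * 100
print(f"{pct_change:.0f}%")  # -22%, matching the roughly 22% decrease in the text
```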
[Bar graph: Number of Doctoral Students (0–12,000) by Doctoral Program Type, 2006–2010.]
Figure 3. Percentage of Doctoral Programs and Students by Program Type
The horizontal bars on the left represent the percentage of accredited doctoral programs by program type between 2006 and 2010, and the bars on the right represent the percentage of doctoral students by program type during the same years. Holding steady across the 5-year time frame, clinical PhD programs accounted for 46–47% of all accredited doctoral programs, and school psychology programs for 16%. The mean percentage of clinical PsyD programs increased from 15% of all doctoral programs in 2006 to almost 17% in 2010. There was an approximately 1% decrease found for counseling and combined programs during the same time period.

In terms of the proportion of doctoral students from 2006 through 2010 in various program types, students from counseling and school programs composed roughly 11% and 8%, respectively, of all students across the years. Clinical PhD and combined programs reported a lower percentage of students by 2010 (39% in 2006 to 37% in 2010 for clinical; 3% in 2006 to 2% in 2010 for combined), while clinical PsyD programs reported an increased percentage of all doctoral students in that time frame (38% in 2006 to 42% in 2010).
[Paired horizontal bar graph: Percentage of Doctoral Programs (left) and Percentage of Doctoral Students (right), 0%–50%, by Doctoral Program Type, 2006–2010.]
Figure 4. Median Percentage of Female Faculty Members and Female Students by Program Type
The horizontal bars on the left represent the percentage of female doctoral faculty members by program type between 2006 and 2010, and the bars on the right represent the percentage of female doctoral students by program type during the same period. Female status is based on self-report by doctoral faculty and doctoral students. Faculty includes both core and other faculty classifications.

All types of accredited doctoral programs experienced slight increases (or remained steady) in the proportion of female faculty during the 5-year period. The most notable increase occurred in combined programs. The median percentage of female faculty rose from 48% (120 females/233 total faculty) in 2006 to 57% (93 females/161 total faculty) in 2010. Overall, clinical PhD programs had the lowest proportion of female faculty members, ranging from 44% in 2006 to 45% in 2010. The median percentage of female faculty members hovered near (or at) the 50% mark for clinical PsyD, counseling, and school programs.

Accredited doctoral programs also reported slight increases in the proportion of female doctoral students during the 5-year period. Whereas female doctoral students accounted for approximately 83% of all doctoral students in school programs from 2006 to 2010, the proportion of female doctoral students in counseling programs was around 73%. Female doctoral students accounted for 77–78% of doctoral students in clinical PhD and clinical PsyD programs and for 79–82% of students in combined programs between 2006 and 2010.
[Paired horizontal bar graph: Median Percentage of Female Doctoral Faculty Members (left) and Median Percentage of Female Doctoral Students (right), 0%–90%, by Doctoral Program Type, 2006–2010.]
Figure 5. Median Percentage of Non-White Faculty Members and Non-White Students by Program Type
The horizontal bars on the left represent the percentage of non-White doctoral faculty members by program type between 2006 and 2010, and the bars on the right represent the percentage of non-White doctoral students during the same time period. Non-White status is self-reported by doctoral faculty and doctoral students, and faculty includes core and other faculty classifications.

Doctoral programs reported slight gains in the median percentage of non-White faculty members by 2010. Overall, counseling programs have the highest proportion (at 20–21% non-White doctoral faculty), while clinical PhD programs typically have the lowest (at approximately 10%). In 2010, the median percentage of non-White faculty was 16% for clinical PsyD programs, 14% for school programs, and 8% for combined programs.

Counseling programs also had the highest proportion of non-White doctoral students across all 5 years, hovering between 34% and 35% each year, while school programs had the lowest, ranging from 18% to 19%. Despite a decrease in the overall number of combined programs, combined programs reported an increase in non-White doctoral students, from 26% in 2006 to almost 29% in 2010. The median percentage of non-White students hovered at 22% for clinical PhD programs and for clinical PsyD programs.
[Paired horizontal bar graph: Median Percentage of Non-White Doctoral Faculty Members (left) and Median Percentage of Non-White Doctoral Students (right), 0%–40%, by Doctoral Program Type, 2006–2010.]
Figure 6. Median Percentage of Admission Offers by Program Type
The median percentage of applicants offered admission by doctoral programs also varied by type of program. Overall, clinical PsyD programs have the highest proportion of doctoral admission offers each year, ranging from 42% in 2006 to 39% in 2010, while clinical PhD programs have the lowest, at 6–7% during the 5-year period. School psychology programs reported an increase in admission offers (from 26% in 2006 to 30% in 2010). The median percentage of admission offers hovered at 14–15% for combined programs (2008 was an exception) and decreased from 16% in 2006 to 11% in 2010 for counseling programs.

To place these findings in context, we measured the mean number of total students and the mean number of total applications per program type each year (the data table can be found online at http://www.apa.org/ed/accreditation/about/research/index.aspx). The average size of the student body for clinical PhD programs decreased slightly, from approximately 59 in 2006 to 56 in 2010, but the mean number of total applications rose from 170 (2006) to 186 (2010). The mean number of students in clinical PsyD programs increased from 174 in 2006 to 182 in 2010, as did the mean number of applications received (173 in 2006 to 184 in 2010). The mean number of counseling psychology students hovered near 40 each year, while the mean number of applications increased from 76 (2006) to 90 (2010). School programs had an average of 35–37 students per year, with an average of 41 and 36 total applications in 2006 and 2010, respectively. Just as the number of accredited combined programs declined from 10 in 2006 to 8 in 2010, the average number of students in combined programs decreased slightly, from 69 in 2006 to 67 in 2010; the mean total number of applications decreased as well, from 141 in 2006 to 106 in 2010.
[Bar graph: Median Percentage of Admission Offers (0%–45%) by Doctoral Program Type, 2006–2010.]
Figure 7. Median Time-to-Degree Completion(a) by Program Type
Median time-to-degree was calculated for all doctoral students (with prior bachelor's or master's degrees) admitted into programs. Median time-to-degree held steady every year at approximately 6 years for clinical PhD and 5 years for clinical PsyD programs. Despite some slight variations across the years, median time-to-degree completion was approximately 5.6–5.7 years for counseling and 5.4–5.7 years for school programs. Combined programs reported a slight decrease in the median time-to-degree, from 5.6 years in 2006 to 5.2 years in 2010.
[Bar graph: Median Number of Years (0–7) by Doctoral Program Type, 2006–2010.]

(a) All admitted students with bachelor's or master's degrees at entry.
Figure 8. Median Percentage of Annual Attrition by Program Type
Attrition, or leaving the program before successfully graduating, was calculated across all active students each year. By 2010, median attrition rates had decreased for all types of doctoral programs. Median percentage decreases were as follows: from 2.6% in 2006 to 2.1% in 2010 for clinical PhD programs; from 3.5% in 2006 to 3.4% in 2010 for clinical PsyD programs; from 2.9% to 2.5% for counseling programs; from 3.9% to 3.5% for school programs; and from 2.9% to 1.7% for combined programs.
[Bar graph: Median Percentage of Students Who Left Before Graduating (0%–5%) by Doctoral Program Type, 2006–2010.]
Internship Programs
Figure 9. Number of Accredited Internship Programs and Interns
The horizontal bars on the left represent the number of accredited internship programs submitting annual data each year, and the bars on the right represent the number of interns in these accredited programs. The number of accredited internship programs providing annual data was 454 in 2006 and 2007 (96% of all accredited internships each year), 465 in 2008 (99%), 459 in 2009 (97%), and 463 in 2010 (97%). The number of interns in these programs hovered around a total of 2,350 in 2006 and 2007 and rose in 2008 and 2009 to 2,445 and 2,526, respectively. In 2010, there were 2,534 interns in accredited programs providing annual data.

Of note, even though the number of accredited internships providing annual data dropped by 6 in 2009, the total number of interns in accredited programs rose that year. As a check, we examined the mean number of interns each year and found the mean to be 5.26 in 2008, 5.50 in 2009, and 5.47 in 2010. Therefore, the smaller number of internship programs with data in 2009 corresponded with a slightly higher mean number of interns per accredited site that year.
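The per-site mean described above follows directly from the program and intern counts quoted in this section; a minimal sketch of the check (the counts are the ones given in the text):

```python
# Internship programs submitting annual data, and interns at those
# programs, as quoted in this section of the report.
programs = {2008: 465, 2009: 459, 2010: 463}
interns = {2008: 2445, 2009: 2526, 2010: 2534}

# Mean number of interns per reporting internship program, by year.
for year in sorted(programs):
    mean = interns[year] / programs[year]
    print(year, round(mean, 2))  # 2008: 5.26, 2009: 5.5, 2010: 5.47
```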
[Paired horizontal bar graph: Number of Internship Programs (left, 0–500) and Number of Interns (right, 0–2,700), 2006–2010.]
Figure 10. Median Percentage of Female Supervisors and Female Interns
The horizontal bars on the left represent the percentage of female internship supervisors between 2006 and 2010, and the bars on the right represent the percentage of female interns during the same period. Female status was based on self-report by internship supervisors and interns; supervisor includes core and other training classifications. The median percentage of female supervisors increased slightly from 55% in 2006 to 60% in 2010. The median percentage of female interns was 75% from 2006 through 2008 and rose to 80% in 2010.
[Paired horizontal bar graph: Median Percentage of Female Internship Supervisors (left) and Median Percentage of Female Interns (right), 0%–90%, 2006–2010.]
Figure 11. Median Percentage of Non-White Supervisors and Non-White Interns
The horizontal bars on the left represent the percentage of non-White internship supervisors between 2006 and 2010, and the bars on the right represent the percentage of non-White interns during the same period. Non-White status is self-reported by internship supervisors and interns. The median percentage of non-White internship supervisors increased very slightly, from almost 15% in 2006 to almost 16% in 2010. The median percentage of non-White interns was 24% in 2006, 21% in 2007, and 25% in 2008–2010.
[Paired horizontal bar graph: Median Percentage of Non-White Internship Supervisors (left) and Median Percentage of Non-White Interns (right), 0%–30%, 2006–2010.]
Figure 12. Median Annual Stipend for Full-Time Interns
The median annual stipend increased from $19,983 in 2006 to $23,600 in 2010—an almost 18% increase during the 5-year period. One consideration in the rise of the average stipend amount for predoctoral internship training is the influence of external factors such as the Fair Labor Standards Act, which, effective July 24, 2009, requires employers to pay covered, nonexempt employees a minimum wage.
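The roughly 18% figure follows from the two stipend values quoted above; a quick arithmetic check (values taken from the text):

```python
# Median annual stipend endpoints, as quoted in the report text.
stipend_2006 = 19_983
stipend_2010 = 23_600

# Percentage increase over the 5-year period.
pct_increase = (stipend_2010 - stipend_2006) / stipend_2006 * 100
print(f"{pct_increase:.1f}%")  # 18.1%
```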
[Line graph: Median Annual Stipend ($18,000–$24,000), 2006–2010.]
Postdoctoral Programs
Figure 13. Number of Accredited Postdoctoral Programs by Practice Area
The number of accredited postdoctoral residency programs that completed the annual report each year is shown here. The proportion of accredited programs with annual data was 76% in 2006 (35/46), 78% in 2007 (38/49), 85% in 2008 (41/48), 84% in 2009 (41/49), and 75% in 2010 (44/59).

Three important caveats are noted:
• Postdoctoral residency programs that were accredited for the first time were not required to submit annual data that year.
• Some specialty programs, such as clinical child psychology, were originally part of preexisting accredited programs (e.g., clinical psychology), and annual data from these specialty programs were not collected separately until 2009 and 2010. Therefore, we were unable to report on clinical child programs until we had data from at least two such programs in 2010.
• Because there is only one accredited rehabilitation psychology program, annual data from this program are reported in the clinical psychology category instead.

Clinical psychology programs accounted for well over half of all accredited postdoctoral residency programs during the 5-year period: 69% (24/35) in 2006 to 59% (26/44) in 2010. The number of clinical neuropsychology programs increased slightly, from 7 (21%) in 2006 to 10 (23%) in 2010. Clinical health psychology programs increased from 4 (9%) in 2006 to 6 (14%) in 2010. Two accredited clinical child psychology programs submitted annual data in 2010, allowing us to report data in a separate category for that year.
[Bar graph: Number of Postdoctoral Residency Programs (0–30) by Practice Area (Clinical Child Psychology, Clinical Health Psychology, Clinical Neuropsychology, Clinical Psychology), 2006–2010.]
Figure 14. Number of Postdoctoral Residents by Practice Area
The number of postdoctoral residents per year is presented here. The bulk of residents were in clinical psychology residency programs, rising from a total of 126 (78% of all residents) in 2006 to a peak of 183 (79% of all residents) in 2009, and falling to 157 (67% of all residents) in 2010. Although the actual number of residents in clinical neuropsychology rose from 23 in 2006 to 32 in 2010, the proportion of total residents in these accredited programs shifted slightly, from 14.2% to 13.6%. The number of residents in clinical health psychology programs rose from 13 (8% of all residents) in 2006 to 23 (10% of all residents) in 2010.
[Bar graph: Number of Postdoctoral Residents (0–200) by Practice Area, 2006–2010.]
Figure 15. Percentage of Postdoctoral Programs and Residents by Practice Area
The horizontal bars on the left represent the percentage of accredited postdoctoral residency programs by practice area from 2006 to 2010, and the ones on the right represent the percentage of postdoctoral residents by practice area in the same years. Even though the percentage of traditional clinical psychology postdoctoral programs and the associated percentage of residents have declined during this 5-year time frame (from 69% to 59% for programs, and from 78% to 67% for residents), they still account for the largest proportion of programs and residents at this level of training. The percentage of clinical neuropsychology programs increased slightly (from 20% to almost 23%); however, there was little change in the proportion of residents training in this practice area. Small gains in the percentages of clinical health psychology programs and its residents were measured as well. Data from two clinical child programs (4.5% of all programs) were reported in 2010 (with 10% of all postdoctoral residents that year).
[Paired horizontal bar graph: Percentage of Postdoctoral Residency Programs (left) and Percentage of Postdoctoral Residents (right), 0%–80%, by Practice Area, 2006–2010.]
Figure 16. Median Percentage of Female Postdoctoral Supervisors and Female Residents by Practice Area
The horizontal bars on the left represent the percentage of female postdoctoral supervisors by practice area between 2006 and 2010, and the bars on the right represent the percentage of female residents during the same period. Female status is self-reported by postdoctoral supervisors and postdoctoral residents, and the data on supervisors includes core and other training classifications.

Median percentages are based on the total number of supervisors in each practice area. Because of the small sample size of total supervisors in some practice areas, there is greater variability across the years. The median percentage of female supervisors increased in all practice areas. From 2006 to 2010, the proportion of female postdoctoral supervisors rose from 51% to 56% in clinical psychology programs, 50% to 55% in clinical neuropsychology programs, and 35% to 59% in clinical health programs. In 2010, women made up 59% of all supervisors in clinical child programs.

In 2007 the percentage of female residents in clinical psychology programs reached a high of almost 91%; since 2007, approximately three fourths of all residents in this practice area have been female. There was an increase in the median percentage of female residents in clinical neuropsychology programs, from 50% (N = 23) in 2006 to nearly 71% (N = 32) in 2010. Although the median percentage of female residents in clinical health programs was 68% in 2006 (N = 13), it was 100% by 2010 (N = 23). The small number of total residents in this practice area likely accounts for these variable findings, however. In 2010, women made up almost 80% of all residents in clinical child programs.
[Paired horizontal bar graph: Median Percentage of Female Postdoctoral Supervisors (left) and Median Percentage of Female Postdoctoral Residents (right), 0%–100%, by Practice Area, 2006–2010.]
24 Commission on Accreditation: 5-Year Summary Report
Figure 17. Median Percentage of Non-White Postdoctoral Supervisors and Non-White Residents by Practice Area
The horizontal bars on the left represent the percentage of non-White postdoctoral supervisors by practice area between 2006 and 2010, and the bars on the right represent the percentage of non-White residents during the same period. Non-White status is self-reported by postdoctoral residency supervisors and postdoctoral residents. The designation of supervisor includes core and other training classifications.
The median percentage of non-White postdoctoral supervisors in clinical psychology programs rose from 14% in 2006 to 17% in 2010. From 2006 to 2010, the median percentage of non-White supervisors rose from 8% to 12% (total no. of supervisors in 2010 = 39) in clinical neuropsychology programs and from 9% to 15% (total no. of supervisors in 2010 = 64) in clinical health psychology programs. Such variability is likely due to the small sample size of supervisors each year. In 2010, non-White supervisors accounted for almost 21% of all supervisors in clinical child programs.
Non-White status was also variable among postdoctoral residents in all practice areas due to small sample sizes. The median percentage of non-White residents in clinical psychology programs varied: almost 29% in 2006, 33% in 2007, 17% in 2008, and 21% in 2010 (total residents in these programs ranged from 126 to 157 during this period). The median percentage of non-White residents in clinical neuropsychology programs was zero until 2010, when it reached approximately 13% (total no. of clinical neuropsychology residents = 32). The proportion of non-White residents in clinical health psychology programs ranged from a low median percentage of 6% in 2006 (total no. of residents = 13) to 22% in 2007 and 2008. In 2010, the percentage was 13% (total no. of residents = 23). Almost 40% of clinical child residents (N = 23) identified as non-White in 2010, the highest recorded during any year between 2006 and 2010.
[Figure 17: paired horizontal bar charts showing, for each postdoctoral residency practice area (Clinical Child Psychology, Clinical Health Psychology, Clinical Neuropsychology, Clinical Psychology), the median percentage of non-White postdoctoral supervisors (left, 0–50%) and non-White postdoctoral residents (right, 0–50%) for each year 2006–2010.]
Program Annual Report Summary: 2006–2010
Figure 18. Median Annual Stipend for Full-Time Postdoctoral Residents by Practice Area
The median annual stipend for full-time postdoctoral residents by area is presented here for 2006–2010. The median annual stipend for clinical psychology programs ranged from $40,000 in 2006 to $42,528 in 2010. Median annual stipends increased 21% for clinical neuropsychology programs, from $33,255 in 2006 to $40,534 in 2010. Overall, stipends were the highest for clinical health programs, ranging from $42,171 in 2006 to $43,692 in 2010. However, the median annual stipend for clinical child programs was the lowest, at $38,376 in 2010.
The relative stability of stipends for clinical psychology programs could be related to the high proportion of military- or Veterans Affairs (VA)-based settings among these programs. Military- or VA-based settings, where stipends are largely fixed, accounted for 65% of all clinical psychology programs and 40% of all clinical neuropsychology programs by 2010.
[Figure 18: bar chart of median annual stipend (y-axis, $0–$45,000) for full-time postdoctoral residents in each year 2006–2010, grouped by postdoctoral residency practice area (x-axis: Clinical Child Psychology, Clinical Health Psychology, Clinical Neuropsychology, Clinical Psychology).]
As we move into 2012, in what will be the Commission's fifth year, the CoA is committed to continuing initiatives to improve communication with its publics and to ensuring quality and excellence—not only in terms of education and training in professional psychology, but of itself as a recognized accrediting body.

—Richard Seime, PhD (2011)
From a Committee to a Commission: What’s Been Happening in Psychology Accreditation Since 2007
CoA Membership Profile: 2007–2011

Year | Members (N) | Women (%) | Racial/Ethnic Minorities (%) | U.S. States Represented (N) | APA Divisions Represented (%) | Board Certified by ABPP (%)
2007 | 21 | 38 | 10 | 15 | 48 | 48
2008 | 32 | 38 | 9 | 17 | 54 | 38
2009 | 32 | 44 | 13 | 15 | 54 | 28
2010 | 32 | 50 | 16 | 19 | 56 | 34
2011* | 31 | 48 | 13 | 21 | 43 | 35

*One CoA seat was vacant from May through the end of the year.
Note. ABPP = American Board of Professional Psychology.
CoA Chairs: 2007–2011
2007: Jim Lichtenberg, PhD
“The year 2007 was a year of firsts, lasts, and any number of changes for the APA Committee on Accreditation. Importantly, it served as the last year of the 21-person Committee, as we prepared to transition to a new 32-member Commission in 2008. It also included the first Accreditation Assembly and marked the first time the number of accredited programs grew to almost 900. It was an important year for policy initiatives and procedural innovations for the CoA and the Office.”

2008: Jeffrey Baker, PhD
“The year 2008 was the landmark year for the new 32-person Commission on Accreditation, including the first time in the CoA’s history that new members (19) outnumbered continuing members (13). Much time and effort was spent implementing the Commission’s new composition, structure, and culture and developing quality assurance processes and mechanisms to carry the CoA forward in future years.”

2009: Nancy Elman, PhD
“The Commission’s newly developed work group structure for addressing policy issues came to fruition in 2009, an incredibly active year for discussion and action on many new and revised Implementing Regulations related to program review. CoA members had a voice in each of these issues, and input from the public was considered. The year helped clarify important expectations for CoA’s processes in terms of refining aspects of the G&P [Guidelines & Principles for Accreditation] and making policy decisions.”
2010–2011: Richard Seime, PhD
“In 2010, the CoA continued to be active in creating and making changes to accreditation policy, highlighted by two significant Implementing Regulations on distance education and telesupervision issues that were the culmination of more than a year’s worth of discussion and research. Increased attention was provided to policy issues affecting the postdoctoral sector, due in large part to the rapid growth in the number of initial applications for residency programs. The year also marked the birth of the written and distributable CoA Updates that provide our many publics with information on CoA’s actions and activities—an important step forward in promoting both consistency of information and transparency.”

“The past year (2011) has included a number of additional advancements in communication between the CoA and its communities of interest, including expanded efforts in training for site visitors and programs. The year has also been marked by the process of our own external review, requiring a careful assessment of the policies and procedures that CoA currently has in place and making revisions as needed. As we move into 2012, in what will be the Commission’s fifth year, the CoA is committed to continuing initiatives to improve communication with its publics and to ensuring quality and excellence—not only in terms of education and training in professional psychology, but of itself as a recognized accrediting body.”
Timeline
Abbreviations

AOP: Accreditation Operating Procedures
CCTC: Council of Chairs of Training Councils
CHEA: Council for Higher Education Accreditation
CoA: Committee/Commission on Accreditation
CPA: Canadian Psychological Association
HEOA: Higher Education Opportunity Act
IR: Implementing Regulation
MOU: Memorandum of Understanding
NACIQI: National Advisory Committee on Institutional Quality and Integrity
PRC: Program Review Consultant
USDE: U.S. Department of Education
2005
June: Inter-Organizational Summit on Structure of Accrediting Body for Professional Psychology (Snowbird, UT)

2006
Mid-year (2006) to the end of 2007: PRCs assist with program review; structure and plan for new 32-member Commission developed

2007
January: IR C-20 effective for all doctoral programs; First Accreditation Assembly (St. Pete Beach, FL)
July: Revisions to IR C-2 clarify doctoral program residency requirements
September: APA and CPA revise MOU with phase-out plan for concurrent accreditation
October: Last meeting as a 21-person Committee

2008
January: IR C-20 includes licensure rates for program graduates
February: First meeting as a 32-person Commission
May: Second Accreditation Assembly (Minneapolis, MN)
July: Decisions on applicant programs made public (AOP 8.0)
2009
May: Third Accreditation Assembly (San Diego, CA)
June: Accreditation staff submit first petition to USDE for renewal of federal recognition
July: New federal regulations effective for CoA and other recognized accrediting agencies due to the Higher Education Opportunity Act of 2008

2010
February: First CoA Policy & Procedure Update provided in printable and distributable format*; CoA sessions and forums held at CCTC Joint Conference
July: Decision options for program appeals altered per federal regulations (AOP 5.5); two major IRs re distance education issues; IR C-11(c) on postdoctoral residency program transitions
September: IR C-20 includes titling and location of information
December: First meeting of new 18-person NACIQI for USDE

2011
January: Accreditation staff submit second petition to USDE
February & April: USDE staff member observes CoA policy and program review meetings
April: CoA approves a strategic, multiyear communications plan
May: Accreditation staff submit eligibility portion of recognition process to CHEA
June: NACIQI reviews CoA for continued recognition
July: Revised provisions re program annual monitoring (IRs D.4-7(a),(b),(c)); IR C-30 on outcome data for internship and postdoctoral programs; revisions to IR C-16 on broad and general education in doctoral programs
July–September: Proposed changes to AOP available for public comment
Late spring–fall: APA negotiates contract to move toward fully electronic accreditation system

*All CoA Updates from 2010 and 2011 are available at http://www.apa.org/ed/accreditation/newsletter/index.aspx.
Background
Snowbird and the Resulting Changes to the CoA Structure
In June 2005, members of the APA Committee on Accreditation chaired the Inter-Organizational Summit on the Structure of the Accrediting Body for Professional Psychology—nicknamed “Snowbird” because of the location of the meeting in Snowbird, Utah. This summit brought together representatives of the multiple communities of psychology with an interest in accreditation to outline a plan for the future structure and composition of the CoA. The purpose of the plan was to address issues related to representation, inclusion of multiple voices, communication with the field, and accreditation workload. The final proposal enlarged the accrediting body from 21 to 32 members to allow for greater representation of the diverse perspectives within the field and renamed it the Commission on Accreditation. The proposal was subsequently reviewed and approved by the CoA, the APA Board of Educational Affairs, the APA Board of Directors, and the APA Council of Representatives in August 2006.
After this change was approved, members of the CoA worked actively with the constituent groups responsible for nominating members to new seats on the Commission to develop criteria and processes for doing so. New members were formally appointed through the APA governance process in December 2007. Throughout 2006 and 2007, a subcommittee of CoA members and Accreditation Office staff worked on developing the operating structure for the new Commission and creating effective training methods to ensure consistency. An extensive training and orientation session for all 32 Commission members was planned for February 2008 in conjunction with the CoA’s annual policy meeting.
Program Review Consultants
The year 2007 was the last year in which the 21-member Committee on Accreditation was in place. An important part of the transition to a Commission included the use of program review consultants (PRCs) in the review process, in preparation for the change in the structure of the CoA. This pilot project, which took place over the second half of 2006 and throughout 2007, employed non-CoA member readers in a supporting role in the CoA’s review of programs. The PRCs reviewed program materials and provided their review panels with detailed analyses regarding programs’ compliance with the Guidelines & Principles for Accreditation (G&P). The PRCs did not make any recommendations regarding accreditation. This pilot project allowed the CoA and office staff to prepare for the increased number of CoA members beginning in 2008 and for CoA to dedicate more time to its policy agenda as program review became more efficient.
The CoA and APA staff express appreciation for the work of the 7 PRCs (left to right): Edward Sheridan, PhD; Gerald Stone, PhD (in memoriam); Frank Collins, PhD (in memoriam); Philip Farber, PhD; Joseph Kobos, PhD; Donna Horn, PhD; and Cathy Mavrolas, PhD.
Development and Progression of Implementing Regulation C-20
• In the interest of transparency and to help prospective students make better informed decisions, beginning on January 1, 2007, all doctoral programs were required to make their education and training outcomes available to the public, including placement on the program’s website if it had one. The required variables included students’ time to degree completion, student attrition, program costs, and internship placement rates.
• As of January 1, 2008, licensure rates for program graduates were required to be included in the public information.
• In August 2010, CoA updated IR C-20 with new provisions regarding the required location and titling of the C-20 information. Effective September 2010, programs are required to label the information “Student Admissions, Outcomes, and Other Data,” and it must be located no more than one click away from the program’s website landing page. The revisions to the IR included clarification on how to calculate and present the required information, as well as the provision that all data must be updated by October 1 of each year.
• All programs are required to include the current URL to their C-20 data within the Annual Report Online each year. Accreditation staff will review program websites each year to ensure that information is presented properly and is up-to-date.
Phasing Out of Canadian Accreditation
Shortly after the Accreditation Panel of the Canadian Psychological Association (CPA) first began accrediting programs in 1984, it signed its first Memorandum of Understanding (MOU) with the Committee on Accreditation (CoA) of APA (see the History of APA and CPA Concurrent Accreditation timeline). The two parties entered into this agreement so that the CoA, which had been accrediting programs since 1948 and Canadian programs since 1970, could provide the CPA with its experience in accreditation. Further, at the time, most of the training faculty in Canadian programs were educated and trained in the United States and had strong allegiances to the accreditation review conducted by APA. Since then, the CPA and APA–CoA have been engaged in a system of concurrent accreditation, with programs in Canada having the option of being accredited by both bodies and being reviewed at the same time.
In the years since the initial MOU was signed, much changed both in the landscape of psychology in the two countries and within the independent review committees. There are significant differences between the two countries in terms of their higher educational systems, their health care systems, the philosophical views concerning diversity as expressed in Canadian law, and the format of the information provided by programs to the review committees. Significantly, after the CPA changed its standards for accreditation in 2002, the APA–CoA had difficulty in reviewing some aspects of its own standards that are not consistent with Canadian law and culture.
The APA–CoA studied this issue for several years, conducting multiple surveys and discussions within the field of professional psychology. In 2005, the CoA sent for public comment a proposal to phase out its accreditation of programs in Canada, citing the CPA’s maturity to accredit programs in Canada on its own; differences in the educational systems, health care systems, and laws surrounding diversity of the two countries; and the administrative time and financial costs of maintaining accreditation with two bodies. While this proposal brought forth a variety of opinions, the CoA voted to send the proposal on to APA governance.
In February 2007, the APA Council of Representatives (COR) adopted the proposal and charged the CoA with developing a new MOU to outline a timeline for phasing out accreditation of programs in Canada by the APA–CoA. The CoA was challenged to develop an agreement that would be acceptable to both parties and considerate of students who recently entered Canadian programs while believing the programs were APA-accredited. By late 2007, a multiyear phaseout plan was in place, and the following were announced to the public and to state licensing boards/credentialing agencies:
• The APA–CoA will no longer accredit programs in Canada effective September 15, 2015.
• As of January 1, 2008, the CoA will no longer accept initial accreditation applications from programs located in Canada.
In addition, the CoA adopted Implementing Regulation C-21, which stated that it would no longer accredit programs outside the United States and its territories. The phaseout of the currently accredited Canadian programs would be the only exception, and the CoA will wait until such time as the APA sets a policy on international quality assurance before considering any further changes.
History of APA and CPA Concurrent Accreditation

1948: First APA-accredited doctoral programs
1970: APA accredits first program in Canada
1984: CPA begins accrediting programs
1987: APA–CPA MOU on concurrent accreditation
1996: APA revision of G&P
2002: CPA Accreditation Standards revision
2006: APA surveys activities in Canada
2007: APA CoR resolution to cease accreditation in Canada
2008: No new applicant programs from Canada accepted
2015: End date for accreditation of programs in Canada
Changes to the CoA Meeting Structure
For the portion of its 4-day meetings not dedicated to program review, the 21-person Committee on Accreditation generally discussed policy issues as a group. There were subcommittees for research issues and for reviewing complaints, but those groups typically met on their own time, and not every Committee member belonged to one of those groups. In response to its larger membership size, beginning in 2008 the CoA felt it was important to designate work groups and policy groups to advance its many different activities and initiatives. In this structure, every Commission member is assigned to one work group and one policy group (assignments may change year to year). Each group receives dedicated time during the CoA meetings to meet, and work is frequently done between meetings as well. On the last day of each CoA meeting, the work and policy groups may bring issues forward to the full CoA for discussion and action.
Accreditation Assembly: Past and Future
In 2007, 2008, and 2009, the CoA hosted an Accreditation Assembly to engage with its various constituency groups and provide training, a recommendation that came out of the Snowbird meeting in 2005. Each Assembly was attended by approximately 150–200 individuals, including CoA members and presenters. Surveys after each Assembly indicated that those who attended appreciated the training offered, the programming, and the opportunity to interact with CoA members on the future of accreditation in psychology. The CoA did not plan a stand-alone Assembly in 2010 because of the opportunity afforded by the Council of Chairs of Training Councils (CCTC) Joint Conference to meet the same goals.
In evaluating the costs associated with hosting a separate Assembly, the CoA agreed to use an alternative format in 2011 and 2012 by enhancing its presence at meetings of training councils and other relevant groups. In addition to site visitor and self-study training, representatives from CoA and the Accreditation Office would provide updates on CoA policies and procedures, engage in discussions, and receive input from interested parties. Accreditation sessions were also added to the APA convention program. In 2011, CoA hosted a special session on current issues in higher education affecting accreditation, and a CoA Open Forum is planned for the 2012 APA convention. During 2012, in assessing the future of the Accreditation Assembly, the CoA will evaluate the costs and benefits associated with these different formats.
Distance Education Policies
At its July 2010 meeting, CoA adopted two new IRs related to key issues in distance education after extensive discussion, opportunity for public comment, and many revisions:
• IR C-27—Distance and Electronically Mediated Education in Doctoral Programs is the product of multiple years of deliberation during which the CoA reviewed the research and literature on distance delivery and considered the practices of other health profession accreditors in this area, all within the context of G&P Domain A.3 on academic residency and related IR C-2. The resulting policy states that accredited doctoral programs cannot deliver education and training substantially or completely by distance education because of the importance of face-to-face, in-person interaction in fulfilling many aspects of the G&P (e.g., student socialization, faculty role modeling). IR C-27 was developed in recognition of current and emerging practices in higher education and in the service of ensuring the quality of education and training in professional psychology. The IR represents the CoA’s best professional judgment at the time it was adopted on the evolving area of distance education in professional psychology, an area that CoA expects to revisit as more evidence and experience become available.
• IR C-28—Telesupervision was developed in order to (a) require programs to report the extent to which they may be using telesupervision within their training programs and (b) ensure quality education and training if programs use this supervision modality. The resulting policy provides guidelines and limits on the amount of telesupervision that can be used in accredited doctoral, internship, and postdoctoral programs. Given the limited literature base regarding the use of telesupervision in psychology training at each level, IR C-28 represents CoA’s best professional judgment at the time it was adopted on the evolving area of telesupervision, consistent with its mandate to protect the public and maintain program quality. As the literature base expands, CoA will consider revisiting the guidelines and limits imposed by this IR.

CoA held an open forum at the 2008 Assembly in Minneapolis.
Improvements in CoA Communications
In early 2011, the newly formed Communication work group created a strategic, multiyear plan that was approved by the CoA at its April 2011 meeting. The major goals of the initial plan are to increase
• transparency of CoA decision making to its publics,
• engagement/participation of the public in the accreditation process, and
• fidelity and consistency of CoA decision making.
Updates on progress on this plan were provided to the public throughout 2011. Some accomplishments thus far include improved access to CoA information on the APA website, improved navigation of the Implementing Regulations, enhanced content for the CoA Updates produced after each meeting, wider distribution of CoA Updates and other information, and closer examination of written communications for clarity. Plans are in place for continued progress on these and other communication initiatives through 2012.
Higher Education Accreditation Update
While reviewing and recognizing professional psychology programs for adherence to the Guidelines & Principles for Accreditation, the APA’s Commission on Accreditation (CoA) is itself reviewed by both the Council for Higher Education Accreditation (CHEA) and the U.S. Secretary of Education through the U.S. Department of Education (USDE). The APA CoA is recognized by both as the accrediting body for professional psychology. In addition, the CoA participates as a member of the Association of Specialized and Professional Accreditors.
The recognition review process for both CHEA and USDE is similar to the process by which the CoA reviews psychology programs for APA accreditation. Both processes include a comprehensive self-study, one or more on-site visits, and opportunity for third-party comment.
CHEA Update
The APA CoA has been recognized by CHEA since 2002. The CoA submitted an interim report in 2007 and will undergo its next full recognition review in 2012. The CHEA recognition review occurs in two parts: eligibility and recognition. All agencies, even those already recognized by CHEA, must participate in both portions each time they come up for review. In May 2011, Accreditation staff submitted CoA’s eligibility materials to CHEA and learned in September 2011 that the CHEA Board of Directors had approved the CoA to move forward in the recognition process.

U.S. Department of Education Update: CoA’s Review for Federal Recognition

The APA CoA was last reviewed for federal recognition by the USDE’s National Advisory Committee on Institutional Quality and Integrity (NACIQI) in 2004. At that time, NACIQI determined that the CoA was in full compliance with the recognition criteria and recommended continued recognition for the full 5-year period. As such, CoA was scheduled for review again in 2009.

With the 2008 reauthorization of the Higher Education Opportunity Act, many agency recognition reviews were put on hold while the NACIQI was disbanded and then reconstituted. The new, 18-person NACIQI, which is now charged with making recommendations to the senior department official of the USDE (currently Eduardo Ochoa), held its first meeting in December 2010. Because the CoA was originally scheduled for review in 2009, the process was drawn out over a full 2-year period. Accreditation staff were required to submit CoA’s petition (self-study) both in June 2009 under the old regulations and again in January 2011 under the new regulations. Notably, the USDE’s expectations regarding providing documentation to support evidence of compliance changed drastically between the two petitions. CoA also hosted on-site visits from USDE staff members after both the first and second petitions.

On June 8, 2011, the CoA was formally reviewed for continued recognition by the NACIQI. The final decision by Mr. Ochoa—reflecting consideration of both the USDE staff member’s analysis and NACIQI’s recommendations—was that the CoA’s recognition be continued while requiring a compliance report on seven issues in 2012, essentially deferring the decision.

Of the issues cited by USDE/NACIQI, most were minor and simply require additional documentation or clarifications to Accreditation Office policies. Two of the issues, however, have to do with the amount of time that programs on “accredited, on probation” status are permitted to remedy areas of noncompliance identified by the CoA. In its review, USDE/NACIQI determined that the CoA’s current process—which involves programs undergoing another full review (self-study, site visit, and review by CoA) after being placed on probation—allows too much time to elapse and is inconsistent with the current interpretation of the timelines enforced by the USDE’s Criteria for Recognition. Because the CoA is committed to maintaining recognition by USDE, achieving compliance with this criterion requires significant changes to Sections 4.2–4.4 of the Accreditation Operating Procedures (AOP).

The CoA needed to move quickly in order to propose, collect public comment on, finalize, and implement the required changes. At its July 2011 meeting, CoA approved the proposed AOP changes for public comment. After reviewing the public comments and making any additional changes, CoA will send forward the proposed revisions to APA governance for approval.
CoA Policy and Procedural Changes: 2007–2011
Policy Changes
Implementing Regulations Related to the Guidelines & Principles for Accreditation (G&P)
2007
C-2: IR revised to include the definition of residency requirement and “equivalent thereof” for doctoral programs.
C-20: New IR provides guidelines for doctoral programs in disclosing their education and training outcomes to the public.
C-21: New IR clarifies the phase-out timeline for accreditation of programs in Canada and CoA’s policy of accrediting programs outside the U.S.

2008
C-20: IR revised to add licensure to education and training outcomes for doctoral programs.

2009
C-15(b): IR revised to clarify the definition and amounts of supervision required in internship/postdoctoral training programs.
C-22: New IR clarifies the interpretation of Domain D.1 of the G&P regarding the recruitment and retention of diverse individuals.
C-23: New IR clarifies the interpretation of Domain D.2 of the G&P regarding training in diversity.
C-24: New IR creates a broader discussion of the use of the terms empirically supported procedures and empirically supported treatments within the G&P.
C-25: New IR outlines requirements for programs using any type of distance education methodologies regarding procedures for the positive identification of students engaging in those methods (required by federal regulations).
IR Description
2010
C-6(b)New IR provides guidance to accredited programs as to how their accreditation status and CoA’s contact information should be presented within programs’ public materials.
C-11(a) IR revised to replace the term integrated practice programs with multiple practice programs refer-ring to postdoctoral residency programs focusing on more than one traditional or specialty area.
C-11(c)New IR provides guidance to postdoctoral residency programs that may be looking to add or transition from traditional to specialty practice areas.
C-20
IR revised to include new provisions regarding the required location and titling of doctoral programs’ information on education and training outcomes on their websites; also provides clarification on calculating and presenting all information and requirements that new information must be updated by October 1 of each year.
C-22(a) New IR clarifies how the CoA reviews programs in religiously affiliated institutions that invoke the use of “Footnote 4” in the G&P regarding the recruitment and retention of diverse individuals.
C-26 New IR provides guidelines for practicum training in doctoral programs within the context of Domain B.4 of the G&P.
C-27 New IR clarifies how distance and electronically mediated education within doctoral programs is viewed by CoA within the context of the G&P.
C-28 New IR requires programs to report the extent to which they may be using telesupervision within their training programs and clarifies the amount acceptable within different levels of programs.
C-29 New IR clarifies the type of information required from internship/postdoctoral programs regarding didactic activities offered as part of the training curriculum.
2011a
C-19 IR revised to add a section on stipend sufficiency to provide clarification and guidance to internship programs.
C-11(d) New IR provides clarification on differentiating a track, rotation, or area of emphasis from a separate specialty practice postdoctoral residency program.
C-16 IR revised after several periods of public comment to include how CoA defines several content areas of Domain B.3(a) and (b) for doctoral programs, CoA’s interpretations of broad and general training both across and within the required areas, expectations for graduate-level training, and expectations for faculty qualifications to deliver content in these areas.
C-30 New IR explains why (for internship and postdoctoral programs) outcome data are a critical component of the accreditation review and provides guidance on the types and specificity of outcome data that CoA needs in order to make an accreditation decision.
a Through the July 2011 CoA meeting.
Procedural Changes
Accreditation Operating Procedures (AOP) and Related Implementing Regulations
Policy Purpose
2007
AOP 6.1.2 Extends the time frame for students, interns, and residents to file a complaint against an accredited program until 18 months after they leave the program.
AOP 4.2 Adds “denial of a site visit” for applicant programs to the list of possible CoA decision options.
IR D.3-3(b) New IR outlines procedures for providing site visitors with copies of the program’s response to their report once CoA has made a final decision.
IR D.8-1(a) New IR outlines the time frames for providing the public with notice of any adverse decisions as well as accredited programs’ voluntary withdrawals from accreditation.
2008
AOP 8 Clarifies that the outcomes of programs applying for initial accreditation will be made public.
2009
IR D.8-2 IR revised to include denial of a site visit as an adverse action (consistent with AOP 4.2).
IR D.8-4 IR revised to include denial of a site visit as an adverse action (consistent with AOP 4.2).
2010
AOP 5.5 Alters the decision options of the appeals panel to include the ability to amend or reverse the CoA decision (in accordance with federal regulations).
AOP 5.6 Provides the process for review of an adverse decision made by CoA if it is based solely on financial deficiencies (in accordance with federal regulations).
IR D.5-1 IR revised to include the new decision options of the appeals panel (consistent with AOP 5.5).
IR D.5-3 New IR outlines the composition of the appeals panel pool, including the new federal requirement that all appeals panels include a public member.
2011a
IR D.4-7(a) New IR (previously D.4-8) provides the rationale and procedures for using annual reports in the reaffirmation of programs’ accredited status.
IR D.4-7(b) New IR (previously D.4-7) provides the definitions and thresholds of student achievement outcomes for doctoral programs.
IR D.4-7(c) New IR (previously partially included in D.4-8) provides the rationale and procedures for using any requested narrative reports in the reaffirmation of programs’ accredited status.
IR D.8-2 IR was revised in Section 2 (“Publicly-Available Information”) to reflect that the OPCA staff information is available to the public, consistent with the USDE criteria.
IR E.1-1IR was revised to update the USDE criterion referenced in the policy and to reflect the office’s actual practices with regard to retaining program records.
a Through the July 2011 CoA meeting.
Note. IR = implementing regulations; OPCA = APA Office of Program Consultation and Accreditation; USDE = U.S. Department of Education.
Appendix
CoA Membership Lists: 2007–2011
2007 Committee on Accreditation
CHAIR James Lichtenberg, PhD (1/02–12/07)
ASSOCIATE CHAIR Jeffrey Baker, PhD (1/05–12/07)
Patricia Alexander, PhD (1/07–1/09)
W. Edward Craighead, PhD (1/07–12/09)
Thomas DiLorenzo, PhD (1/07–12/09)
Kelly Ducheny, PsyD (1/07–12/09)
Nancy Elman, PhD (1/06–12/08)
Edward Gaughan, PhD (1/02–12/07)
Martin Heesacker, PhD (1/06–12/08)
Betty Horton, DNSc (1/04–12/09)*
Joyce Illfelder-Kaye, PhD (1/07–12/09)
Robert Knight, PhD (1/07–12/09)
Edmund Nightingale, PhD (3/06–12/07)
Karen Schilling, PhD (1/05–12/08)
Norma Simon, EdD (1/04–12/09)
Milton Strauss, PhD (1/05–12/07)
William Strein, DEd (1/07–12/09)
David Werner, PhD (1/02–12/07)*
La Pearl Logan Winfrey, PhD (1/05–12/07)
Eric VandeVoorde (1/07–12/07)**
Jeffrey Younggren, PhD (1/02–12/07)
*Public member. **Student member.
2008 Commission on Accreditation
CHAIR Jeffrey Baker, PhD (1/05–12/10)
ASSOCIATE CHAIR, PROGRAM REVIEW Thomas DiLorenzo, PhD (1/07–12/09)
ASSOCIATE CHAIR, QUALITY ASSURANCE Nancy Elman, PhD (1/06–12/08)
Patricia Alexander, PhD (1/07–1/09)
Howard Berenbaum, PhD (1/08–12/09)
W. Edward Craighead, PhD (1/07–12/09)
Raymond Crossman, PhD (1/08–12/08)
Kelly Ducheny, PsyD (1/07–12/09)
Rodney Goodyear, PhD (1/08–12/10)
Martin Heesacker, PhD (1/06–12/08)
David Hess, PhD (1/08–12/10)
Donna Horn, PhD (1/08–12/10)
Betty Horton, DNSc (1/04–12/09)*
Joyce Illfelder-Kaye, PhD (1/07–12/09)
Elizabeth Klonoff, PhD (1/08–12/10)
Linda Knauss, PhD (1/08–12/10)
Robert Knight, PhD (1/07–12/09)
David McIntosh, PhD (1/08–12/08)
Kathie Nichols, PhD (1/08–12/08)
Carlton Parks Jr., PhD (1/08–12/10)
Ruperto Perez, PhD (1/08–12/10)
Roger Peterson, PhD (1/08–12/09)
Deborah Richardson, PhD (1/08–12/08)
Brad Roper, PhD (1/08–12/09)
Richard Seime, PhD (1/08–12/09)
Kenneth Sher, PhD (1/08–12/10)
Rick Short, PhD (1/08–12/10)
Wayne Siegel, PhD (1/08–12/08)
Norma Simon, EdD (1/04–12/09)
William Strein, DEd (1/07–12/09)
Eric VandeVoorde (1/07–12/08)**
Judith Watkins, EdD (1/08–12/10)*
*Public member. **Student member.
2009 Commission on Accreditation
CHAIR Nancy Elman, PhD (1/06–12/11)
ASSOCIATE CHAIR, PROGRAM REVIEW Robert Knight, PhD (1/07–12/09)
ASSOCIATE CHAIR, QUALITY ASSURANCE Roger Peterson, PhD (1/08–12/09)
Patricia Alexander, PhD (1/07–12/09)
Jeffrey Baker, PhD (1/05–12/10)
Howard Berenbaum, PhD (1/08–12/10)
W. Edward Craighead, PhD (1/07–12/09)
Raymond Crossman, PhD (1/08–12/11)
Charme Davidson, PhD (1/09–12/09)
Thomas DiLorenzo, PhD (1/07–12/09)
Wallace Dixon Jr., PhD (1/09–12/11)
Kelly Ducheny, PsyD (1/07–12/09)
Rodney Goodyear, PhD (1/08–12/10)
M. Kathy Hamilton, PhD (1/09–12/10)
David Hess, PhD (1/08–12/10)
Donna Horn, PhD (1/08–12/10)
Betty Horton, DNSc (1/04–12/09)*
Joyce Illfelder-Kaye, PhD (1/07–12/09)
M. Marlyne Kilbey, PhD (1/09–12/11)
Elizabeth Klonoff, PhD (1/08–12/10)
Linda Knauss, PhD (1/08–12/10)
Nicole Manns, MA (1/09–12/09)**
Carlton Parks Jr., PhD (1/08–12/10)
Ruperto Perez, PhD (1/08–12/10)
Deborah Richardson, PhD (1/08–12/11)
Brad Roper, PhD (1/08–12/09)
Richard Seime, PhD (1/08–12/09)
Kenneth Sher, PhD (1/08–12/10)
Rick Short, PhD (1/08–12/10)
Wayne Siegel, PhD (1/08–12/11)
William Strein, DEd (1/07–12/09)
Judith Watkins, EdD (1/08–12/10)*
*Public member. **Student member.
2010 Commission on Accreditation
CHAIR Richard Seime, PhD (1/08–12/12)
ASSOCIATE CHAIR, PROGRAM REVIEW Joyce Illfelder-Kaye, PhD (1/07–12/12)
ASSOCIATE CHAIR, QUALITY ASSURANCE Elizabeth Klonoff, PhD (1/08–12/10)
Patricia Alexander, PhD (1/07–12/12)
Jeffrey Baker, PhD (1/05–12/10)
Laura Barbanel, PhD (1/10–12/10)
Debora Bell, PhD (1/10–12/12)
Raymond Crossman, PhD (1/08–12/11)
Charme Davidson, PhD (1/09–12/12)
Wallace Dixon Jr., PhD (1/09–12/11)
Changming Duan, PhD (1/10–12/12)
Nancy Elman, PhD (1/06–12/11)
Victoria Follette, PhD (1/10–12/12)
Paul Gaston, PhD (1/10–4/10) / Mona Mitnick, JD (4/10–12/10)*
Rodney Goodyear, PhD (1/08–12/10)
David Hess, PhD (1/08–2/10) / Kim Dixon, PhD (2/10–12/10)
Donna Horn, PhD (1/08–12/10)
M. Marlyne Kilbey, PhD (1/09–12/11)
Linda Knauss, PhD (1/08–12/10)
Keren Lehavot, MA (1/10–12/10)**
David McIntosh, PhD (1/10–12/12)
Carlton Parks Jr., PhD (1/08–12/10)
Carl Paternite, PhD (1/10–12/12)
Ruperto Perez, PhD (1/08–12/10)
Roger Peterson, PhD (1/10–12/12)
Deborah Richardson, PhD (1/08–4/10) / Mark Ashcraft, PhD (4/10–12/11)
Brad Roper, PhD (1/08–12/12)
Rick Short, PhD (1/08–12/10)
Wayne Siegel, PhD (1/08–12/11)
Milton Strauss, PhD (1/10–12/10)
Judith Watkins, EdD (1/08–12/10)*
Richard Zinbarg, PhD (1/10–12/10)
*Public member. **Student member.
2011 Commission on Accreditation
CHAIR Richard Seime, PhD (1/08–12/12)
ASSOCIATE CHAIR, PROGRAM REVIEW Joyce Illfelder-Kaye, PhD (1/07–12/12)
ASSOCIATE CHAIR, QUALITY ASSURANCE Elizabeth Klonoff, PhD (1/08–12/10)
Patricia Alexander, PhD (1/07–12/12)
Mark Ashcraft, PhD (4/10–12/11)
Debora Bell, PhD (1/10–12/12)
Kathleen Bieschke, PhD (1/11–12/13)
Michael Boroughs, MA (1/11–12/11)**
Lawrence Cohen, PhD (1/11–12/13)
Raymond Crossman, PhD (1/08–12/11)
Charme Davidson, PhD (1/09–12/12)
Kim Dixon, PhD (2/10–12/13)
Wallace Dixon Jr., PhD (1/09–12/11)
Changming Duan, PhD (1/10–12/12)
Nancy Elman, PhD (1/06–12/11)
Ana Faraci, PhD (1/11–12/13)
Victoria Follette, PhD (1/10–12/12)
Mary Anne Hanner, PhD (1/11–12/13)*
H. Garland Hershey Jr., DDS (1/11–12/13)*
Stephen Holliday, PhD (1/11–12/13)
M. Marlyne Kilbey, PhD (1/09–12/11)
Linda Knauss, PhD (1/08–12/10)
David McIntosh, PhD (1/10–12/12)
Richard Milich, PhD (1/11–5/11) / Vacant (5/11–12/11)
Carl Paternite, PhD (1/10–12/12)
Roger Peterson, PhD (1/10–12/12)
Brad Roper, PhD (1/08–12/12)
Nancy Ruddy, PhD (1/11–12/13)
Lawrence Schoenfeld, PhD (1/11–12/13)
Wayne Siegel, PhD (1/08–12/11)
Gary Stoner, PhD (1/10–12/12)
Luis Vazquez, PhD (1/11–12/13)
*Public member. **Student member.