American Behavioral Scientist

Beyond the Bridge: Actionable Research Informing the Development of a Comprehensive Intervention Strategy
Edward P. St. John, Johanna C. Massé, Amy S. Fisher, Karen Moronski-Chapman, and Malisa Lee

American Behavioral Scientist 2014, 58: 1051; originally published online December 30, 2013 (OnlineFirst Version of Record); Version of Record June 10, 2014.
DOI: 10.1177/0002764213515233
Published by SAGE Publications: http://www.sagepublications.com
The online version of this article can be found at: http://abs.sagepub.com/content/58/8/1051
Downloaded from abs.sagepub.com at Johns Hopkins University on August 8, 2014.



American Behavioral Scientist
2014, Vol. 58(8) 1051–1070
© 2013 SAGE Publications
Reprints and permissions: sagepub.com/journalsPermissions.nav
DOI: 10.1177/0002764213515233
abs.sagepub.com

Article

Beyond the Bridge: Actionable Research Informing the Development of a Comprehensive Intervention Strategy

Edward P. St. John1, Johanna C. Massé1, Amy S. Fisher1, Karen Moronski-Chapman1, and Malisa Lee2

Abstract
Using an actionable research approach, researchers and practitioners can collaborate in research to inform advocacy for and development of interventions that seek to reduce inequality in educational outcomes. This process was used at an elite public university to examine the limitations of a summer bridge program and inform the development of an intervention model that provided comprehensive academic and social support for underrepresented students enrolled in engineering and, after a pilot test, other programs at Midwest University (MU). This article discusses the process leading to the Engineering Academy bridge program, summarizes the research on the pilot test, and describes how administrators used the results to advocate for the continuation of the Engineering Academy. Currently, support from the National Science Foundation and MU's general fund is being used to adapt the intervention for biology, math, and other fields in MU's College of Liberal Arts.

Keywords
bridge program, engineering education, student support, mixed methods

Bridge programs are frequently conceptualized as a remedy for inadequate academic preparation of underrepresented and/or low-income students enrolling in higher education, an approach that has been used in recent decades in research universities attempting to improve diversity in science, technology, engineering, and math (STEM) fields.

1University of Michigan, Ann Arbor, MI, USA
2University of Nebraska–Omaha, Omaha, NE, USA

Corresponding Author:
Edward P. St. John, University of Michigan, School of Education, Room 2002A, Ann Arbor, MI 48109, USA.
Email: [email protected]



Although bridge programs may marginally adjust for differences in academic preparation, there are also social, cultural, and economic barriers to retention of underrepresented students (St. John, Hu, & Fisher, 2011). We argue that a fundamental rethinking of the relationship between public universities and their state K–12 systems is needed. Students may be top achievers in their high schools, but if their schools are below par, these same students may be stigmatized as low achievers when they enter college.

Given the substantial challenges facing research universities as they develop new strategies to recruit and retain low-income students, we argue that researchers and change advocates within universities should work in partnerships using institutional data and qualitative research to inform the development of new, more comprehensive strategies for promoting diversity and academic success, an approach tested in other recent studies (St. John, 2013; St. John & Musoba, 2011).

This article presents the formative evaluation for the pilot test of the Engineering Academy (EA), a comprehensive intervention that sought to overcome limitations of the summer bridge program previously used in the College of Engineering (COE) at Midwest University (MU). We provide background on the pilot test, describe methods, summarize results, and conclude with a discussion of the utility of the findings.

Background

Drawing on a study of MU students conducted by the senior author, a planning team discovered that the extant bridge program had not reduced the persistence gap; in fact, students who had participated in the bridge program earned lower grades as freshmen and were less likely to persist in engineering. After examining a range of alternate approaches, MU decided to pilot test a comprehensive intervention. The new program was modeled in part on a version of the University of Maryland–Baltimore County's (UMBC) Meyerhoff Program for underrepresented students in STEM fields (Hrabowski, Maton, Greene, & Greif, 2002; Hrabowski, Maton, & Greif, 1998). The Meyerhoff Program was created in 1988 and has long recruited high-achieving students of color into STEM programs using a philosophy of engaging education and building strengths. It has been a successful program and is a model being adapted by several research universities across the nation. Midwest University adapted this model to focus on middle-ability students entering the university, rather than target high-achieving underrepresented students as UMBC had. In the 1st year, the MU pilot EA focused on admitted in-state residents, assuming that as a public university its first obligation was to the state.

Program Theory of Change

The EA aims to provide holistic support services to promote academic and personal success among a cohort of STEM students. Figure 1 visually represents the program theory behind the EA. Although the EA was initially designed to integrate learning across STEM disciplines, in its inaugural year the incoming students were all based in the COE.

The new, redesigned bridge program included accelerated math linked to basic concepts in physics and student support, along with mentoring that continued into the 1st year (Component 1). During the 1st year, students lived in an EA living-learning community that emphasized family and provided mentoring and academic support (Component 2); career support, including internships and research experience (Component 3); support services that provided additional opportunities for engagement in the university and community as well as counseling and academic support (Components 1 and 3); and supplemental financial support beyond the basic aid package provided by MU (Component 4). These additional program features moved beyond the prior bridge concept, by both redesigning the summer programs and adding features, which created a system of support that acknowledged the cultures and achievements of students.

The underlying theory of change recognized the need to transform the culture from one of stigma to one of merit and acceleration. The COE formed a team of professionals from student affairs, minority programs, women in science, and other existing offices to guide the development of the program. The research group was actively engaged in developing the theory of academic capital formation (ACF; St. John et al., 2011) during this period of actionable research support. The researchers and EA leadership discussed the roles of key constructs from social, human, and cultural capital that were the starting point for theorizing ACF. They focused on developing mechanisms that could transform a culture of reproduction to one of uplift, by providing academic and social support that built on student strengths.

Simultaneously, researchers were engaged in building on strengths (Bowman & St. John, 2011) using strengths-based indicators (including those developed in admissions review at MU) as a basis for designing academic and social support. The graduate student researchers studied these concepts in courses and actively used them in their conversations with the EA leadership team. Some members of the EA team participated in workshops on using noncognitive and strengths-based indicators in admissions and student support at MU. Thus, in addition to features emulating the Meyerhoff Program, EA leadership developed its own theory of change.

Figure 1. Program theory of the Engineering Academy.
Component 1. Summer bridge program: a rigorous introduction to the academic climate and expectations of a first-year STEM student.
Component 2. Residential program: creates a "family" in the sense of a close-knit peer support network.
Component 3. Structured support services: academic, peer, and industry advising; professional development activities; a guaranteed career exploration experience between freshman and sophomore years.
Component 4. Financial incentive: students receive a stipend of $3,000, part of which is distributed at the end of the summer bridge program, with the balance payable at the end of the first fall semester.
Outcome: incoming underrepresented STEM students maximize academic, personal, and professional success and thus persist to completion of an undergraduate degree in a STEM major.
Note. STEM = science, technology, engineering, and math.

The team of student researchers worked directly with the EA leadership as partners in change, consistent with St. John's (2013) theory of actionable research supporting and informing reforms to improve fairness and social justice within institutional systems that are resistant to change. This was a bottom-up innovation, enabled by the recommendation of a summer planning group of senior administrators but without a budget allocation. Actionable research was used to inform early advocacy for the program. The COE leadership and research teams had open conversations about organizational barriers and about how objective research could be developed to support and inform development of the new program, create opportunities for ongoing funding, and inform planning groups in the COE and university-level administration through targeted mini research reports. The program would be funded through refocusing existing internal resources and external grants.

The Pilot Test

For the pilot year (academic year 2008–2009), the EA leadership and research teams reviewed the applications of a group of in-state residents who had already been admitted to the COE. A subgroup of admitted applicants with achievement indicators suggesting they could benefit from the intervention (e.g., inadequate high school preparation, including whether students took advanced math courses) was selected for secondary review. Within this subgroup, selection criteria for the EA included background, prior education, and noncognitive factors routinely noted in the MU admission process. Some students were invited into the new pilot program based in the COE, whereas others enrolled in MU's general summer bridge program. Students opting into the EA participated in a 6-week engineering summer program, which provided a noncredit course to prepare them for collegiate-level work in math (precalculus) and physics. Students in the EA pilot program also had access to support services, community meetings, and mentoring, along with internship programs after their freshman year. One additional element of the cohort structure of the EA was "check-ins," in which program staff and faculty met with EA participants regularly to see how they were coping with the transition to college, academically and personally.

Most students invited to participate in the EA pilot accepted, resulting in 50 freshmen in the pilot test of the redesigned bridge. (Some chose other options because of other commitments, so there is some student self-selection bias inherent in the evaluation design.) Of this group, 47 students stayed after the first week. One student dropped out because the program did not provide the time he wanted for other activities on campus, and two dropped out for personal reasons. Students who left the EA could rejoin the program in the fall, as neither the EA nor the summer bridge was mandatory.

An aim of the selection process for the EA was to reduce the achievement gap in retention of minority students in STEM. Table 1 presents the racial composition of EA students compared to 1st-year students in the COE as a whole. The majority of EA students were from underrepresented racial/ethnic groups.

The Actionable Research Approach to Formative Evaluation

The actionable approach focuses specifically on using research to inform and support institutional change aimed at reducing inequality (St. John, 2013). Researchers work in partnership with change advocates engaged in planning for, testing, and developing interventions to improve academic access and success for underrepresented students (e.g., Musoba, 2006; St. John & Musoba, 2011). The methods used are similar to those of traditional research in K–12 and higher education, but the focus is on informing change and theories of change rather than on building functional knowledge about students and institutions.

Mixed Methods Research

This mixed methods evaluation of the EA pilot test consisted of qualitative and quantitative analyses. Using multiple methods offers the opportunity to uncover discrete factors that can lead to the desired program outcome of maximizing the academic and personal success of participants. The specific rationale for this study included triangulation and complementarity of results. Johnson and Onwuegbuzie (2004) defined triangulation as "seeking convergence and corroboration of results from different methods and designs studying the same phenomenon" and complementarity as "seeking elaboration, enhancement, illustration, and clarification of results from one method with results from the other method" (p. 22). The qualitative and quantitative components were performed concurrently.

Quantitative methods. The analyses for the quantitative portion of this report were derived from all 47 current EA students. A series of regression analyses for the 2002 engineering cohort (ordinary least squares regressions for grades and logistic regressions for retention) were used for base estimates of grades and retention (see Appendix A for a graphical presentation of the process and Appendix B for results of the regressions).

Table 1. Racial/Ethnic Composition of Engineering Academy (EA) Compared to College of Engineering (COE).

Group                  EA^a    COE 1st-Year Students, Fall 2008
African American       59.57    4.84
Hispanic               12.77    2.42
Native American         0        0.33
Asian                   8.51    16.36
Other/not indicated     4.26    14.02
White                  14.89    62.02

Note. All numbers are percentages. One student did not have final grades and could not be included in subsequent analyses. Without her, the percentages changed to African American, 58.7%; Hispanic, 13.04%; Asian, 8.7%; Other, 4.35%; and White, 15.22%.
^a This includes all participants in the EA.
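As a quick consistency check on Table 1, the per-group counts can be recovered from the reported percentages. The counts below are inferred, not stated in the article: they sum to 47, and dropping the one student without final grades reproduces the revised percentages in the table note. A minimal sketch:

```python
# Per-group EA counts inferred from Table 1's percentages (47 participants);
# the article reports only percentages, so these counts are a reconstruction.
counts = {
    "African American": 28,    # 28/47 = 59.57%
    "Hispanic": 6,             # 6/47  = 12.77%
    "Native American": 0,
    "Asian": 4,                # 4/47  = 8.51%
    "Other/not indicated": 2,  # 2/47  = 4.26%
    "White": 7,                # 7/47  = 14.89%
}
total = sum(counts.values())
pcts = {group: round(100 * n / total, 2) for group, n in counts.items()}

# The note's revised figures imply the excluded student was African American:
# 27/46 rounds to 58.7%, matching "African American, 58.7%" in the note.
revised_aa = round(100 * (counts["African American"] - 1) / (total - 1), 1)
```

Each rounded percentage matches the corresponding Table 1 entry, which supports the inferred counts.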

This simulation generated a better comparison group than would have been possible using either an older cohort or another group of students in the same entering cohort to create a matched comparison. Student background and prior academic achievement were used to predict 1st-year outcomes. Since students were invited based on a second stage selection process using admission files, and invited students had the option of whether to enroll, it was impossible to create a random comparison group; any attempt to match would have had an embedded bias. Thus, the simulation comparison group provided a matched group that estimated the achievement and persistence outcomes that could be expected had the same students not chosen to enroll in this intervention.
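The two-stage logic described above can be sketched as follows. This is a hypothetical, single-predictor illustration (the actual models used multiple background covariates and are reported in Appendix B): fit outcome models on the earlier cohort, then apply the fitted coefficients to EA students' entry characteristics to simulate the outcomes expected without the intervention.

```python
import math

def fit_ols(xs, ys):
    """Simple one-predictor OLS fit; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

# Hypothetical 2002-cohort records: (ACT composite, 1st-term GPA).
cohort_2002 = [(24, 2.4), (26, 2.7), (28, 3.0), (30, 3.2), (32, 3.5)]
intercept, slope = fit_ols([x for x, _ in cohort_2002],
                           [y for _, y in cohort_2002])

# Stage 1: counterfactual GPA for an EA student (here, ACT 27), i.e., the
# grade expected had the student not enrolled in the intervention.
expected_gpa = intercept + slope * 27

def p_retain(gpa, b0=-2.0, b_gpa=1.5):
    """Stage 2: logistic retention model. The coefficients here are made up;
    the real ones come from the 2002-cohort logistic regressions."""
    return 1 / (1 + math.exp(-(b0 + b_gpa * gpa)))
```

Averaging predictions from expected grades over all students gives the "simulated nonparticipation" values; substituting actual fall grades into the retention model gives the "simulated participation" values reported in Table 2.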

Qualitative methods. For the qualitative analyses, graduate student researchers collected baseline focus group data from 20 EA students in summer 2008 and follow-up data from 15 EA students in fall 2008 and 11 students in winter 2009, with two focus groups in each session. All of the students were 18 years old at the time of the focus groups and fully consented to participate in the evaluation. The questions focused on the ways students built an understanding of the features of the pilot program. Our analysis focused on themes that emerged relating to students' engagement in the summer and 1st-year components of the EA. All names presented in this article are pseudonyms.

Protections of subjects. Because this study is an administrative evaluation, it is eligible for an exemption from MU’s institutional review board. Nonetheless, all necessary steps were taken to secure the data and student information used in this evaluation.

Limitations

The results presented in this report are preliminary and based on initial focus groups, which is consistent with the longitudinal design of the evaluation. At the time the analyses were completed, 1st-year persistence data were not yet available from which to infer the effect of the EA on reducing the retention gap that had been identified at the outset. Our research team used the data on the summer program, first-term grades, and focus group interviews to document the effect of the program.

The quantitative analyses presented in this article do not rise to the level of causal inference for several reasons. First, we were limited by the data available. To use the first-term grades in our prediction models, we used data from the 2002 cohort as the basis for comparison. The data were provided by the Office of Financial Aid and the Office of the Registrar and represent all of the students enrolled in the COE.

Second, we used first-semester grades to predict persistence to degree completion. Ideally, we would use grades from the students' fourth semester to predict the probability of persisting from the 2nd year to the 4th year. Our use of the students' first-semester data is consistent with actionable research that focuses on informing the change process. In this case, we were conducting an ongoing formative evaluation with the intent of informing the EA leadership team, which was refining an intervention being developed based in part on the research we provided. This is a significant departure from summative evaluations, which provide no mechanisms for addressing unexpected outcomes that may arise during the change process as it unfolds in practice.

Third, we did not have access to information on high school courses; the estimates in this article would be improved had this information been available. Finally, this report presents predictions using coefficients and precise data points. We did not present simulations that adjusted for confidence intervals associated with the model parameters.

Even with these limitations, the results derived from these analyses offer a solid first step toward using data to analyze predicted outcomes of interventions designed to narrow the achievement gap between underrepresented and White students. Furthermore, they allow for an evaluation that presents benefits and costs from both a fiduciary perspective and a values standpoint.

Findings

This evaluation summary includes estimates of the extent to which fall grades were higher than otherwise would be expected given the performance of previous cohorts and the value added in terms of gains in retention expected as a result of the improvement in grades. We also provide a summary of interviews with students, illustrating how students made sense of their experiences in the comprehensive EA program.

Effect of Grades

During the fall term after the bridge program, EA students took courses in the standard engineering curriculum. The fall term grade point average (GPA) for the group, 3.221, was higher than was typical of underrepresented students in past years in the COE. Table 2 summarizes the analyses of value added in fall term grades, along with the estimated effect on retention (analyses presented in Appendix C). The findings were as follows:

• The predicted average GPA for the EA students (based on race/ethnicity, income, residency status, and ACT/SAT score) was 2.881. The actual average of 3.221 was 0.34 grade points higher, representing a substantial value added.

• Based on background and expected GPA (the fall GPA predicted from entry characteristics), the estimated retention rate within the COE from year 1 to year 2 would be 93.94%, compared to an estimated 94.79% for the same students with their actual fall grades. This suggests an increase of 0.39 students retained from year 1 to year 2, or approximately one-third of a full-time equivalent (FTE).

• Using expected grades, the rate of predicted persistence from year 2 to year 4 was 80.45% in the COE (and 83.56% in the university, inclusive of major changes), compared to an estimated rate of 84.43% in the COE (87.73% at MU) based on actual grades of EA participants. This is an approximate 2 FTE gain per year (1.92 for the university and 1.83 for the COE). The estimate is conservative because many engineering students enroll for 5 years of study.
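The FTE figures above follow from applying each rate difference to the 46 EA students with final grades; a minimal sketch of that arithmetic (rates taken from Table 2):

```python
# Expected additional students retained = (rate with EA - rate without EA) x N,
# with N = 46 EA students with final grades (see Table 2).
N = 46

def fte_gain(rate_without, rate_with, n=N):
    """Convert a percentage-point retention gain into additional students (FTE)."""
    return (rate_with - rate_without) / 100 * n

year1_to_2_coe = fte_gain(93.94, 94.79)  # ~0.39 FTE
year2_to_4_any = fte_gain(83.56, 87.73)  # ~1.92 FTE
year2_to_4_coe = fte_gain(80.45, 84.43)  # ~1.83 FTE
```

These values reproduce the "Additional 2008 EA Students Retained" column of Table 2, which clarifies that the 0.39 figure in the text is a count of students (about one-third of an FTE), not a rate.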


The Voices of Participants

A major benefit from the improved grades will likely be a slight reduction in the retention differential for underrepresented compared to majority students (see Appendix B for descriptive and regression tables). All 47 of the initial EA participants were retained between the first and second semesters. Our analysis of interviews focused on the ways that students engaged in the social and academic support as they navigated the system, and how students used these strengths in their 1st year of study in the engineering program.

Program interventions supporting capital formation. The students attributed their first-term academic success largely to the preparation they received during the summer academy. For example, Vanna, a White woman, recognized that her summer math course was helpful because "my [high] school, like, did not teach me math. So I would have been way behind here." Instead, her familiarity with the math concepts introduced during the EA program provided the foundation she needed to be successful in her fall math course.

The students were generally happy with their first-semester grades, even as they acknowledged that there was room for improvement. Larry, a low-income White male, said, “I was really happy. I did . . . actually better than I thought I was going to do. So, I was very pleased.” He credited his success to EA’s math supplemental instruction and the study skills he developed during the summer program:

During the summer they told us, you know, this is how it was going to be. They gave us the homework load equivalent to what we generally get in the fall. So it's like, I knew that I had to hit the ground my feet running. That helped me prepare. So right from the start, I was able to hit the books [pause]. That helped a lot.

Table 2. Simulated Value Added of Participation in Engineering Academy (EA).

Measure                                                       Simulated Nonparticipation in EA   Simulated Participation in EA   Value Added   Additional 2008 EA Students Retained (N = 46)
Average 1st-term GPA                                          2.881                              3.221^a                         0.340         —
Predicted persistence, 1st to 2nd year (any department)^b     97.28%                             97.93%                          0.65%         0.30
Predicted persistence, 1st to 2nd year (engineering only)^c   93.94%                             94.79%                          0.85%         0.39
Predicted persistence, 2nd to 4th year (any department)^b     83.56%                             87.73%                          4.17%         1.92
Predicted persistence, 2nd to 4th year (engineering only)^c   80.45%                             84.43%                          3.98%         1.83

Note. ^a This is the actual average 1st-term grade point average (GPA) for 2008 EA students (N = 46). ^b This includes students who have graduated or are still enrolled in any Midwest University college (including the College of Engineering). ^c This includes students who have either graduated or are still enrolled in the College of Engineering.

Heath, a Hispanic male, however, reported during a focus group that he still had some difficulties with math:

I think it [first-term grades] was pretty good too except for math. Umm, I didn’t do so well on the first exam so that kinda started me off low and I did better on the following exams. So, that helped me keep up. But I think the fact that I didn’t do well on the first one ‘cause I was still learning how to study for math brought my grade down lower than I wanted.

Later in the focus group, Heath described how he realized that cramming for exams at the last minute does not work in college and that he studies better when he has the opportunity to "bounce ideas off" other students in study groups. His comments suggest that the group-based approaches to teaching math may, in fact, result in higher math grades in subsequent semesters.

These comments illustrate how the strength of the cohort groups and the bonds they developed enabled them to overcome barriers related to inequality in preparation. The comments also relate directly to the concept of college readiness. The student responses demonstrate actualization of change that provided academic and social support to overcome barriers created by the high school educational system. These findings reinforce the ideas of using academic and social support as mechanisms for building academic capital, an integral part of our emerging theory of change.

Academic success in engineering courses. In fact, the supplemental instruction and other forms of group study seemed to be key in promoting the EA students’ academic achievement, although group composition affected the quality of the experience (e.g., groups where all members did not pull their weight equally tended to result in conflict and ill feelings). Dacian agreed that working in groups was important:

I was kinda happy that I took Engin 100 first semester because I’ve worked in a group, you know, the whole semester. Our project was designed in a way that put solar panels on the roof. And our group worked really well together. We all stayed in [dorm name] again and it’s kinda helping me out a lot this semester in Engin 101. Even though we don’t, there’s no group assignments, but like a lot of us work together to get the projects done.

Another student added,

Yeah, and it’s like everyone will bring different ideas to the table so you will see things in a different light because you may not think the same way as someone else. And it’s a good thing to instill in us now because when we get into the job world and everything, it’s not going to be individual work. We’re working with a lot of different people that you don’t know you may not like but you have to deal with it and you have to move on.


The students alluded to an unspoken agreement that when there are EA students in a class and the students can choose their own groups, the EA students will work together. This may be a double-edged sword. On one hand, the working relationships that were forged over the summer are maintained into the school year. Conversely, by clustering in what may be perceived as "exclusive" groups, the EA students minimize the possibility of sharing robust dialogue with other engineering students.

These comments illustrate how students' social bonding enhanced their sense of identity, a shared understanding of uplift that countered any stigma associated with remedial or bridge programs. The continuity of support from professors, the administrative leadership team, and other mentors aided this reconstruction of self-concept, one that was also linked to honoring their prior achievements as strengths, rather than implicitly treating them as deficiencies because they had to take courses in the summer and fall that might otherwise be labeled remedial.

From the perspective of the research team, the students’ voices illustrate the importance of socialization processes. The socialization that students gained in the bridge program and in support programs during the 1st year of college helped them navigate the first 2 years of college. The tasks facing the COE as we concluded this phase of the research were to (a) identify challenges to persistence during the last 2 years of college and (b) conduct further research to inform advocacy for continuing the pilot program.

Research Informing Reform

The early evidence on the COE’s EA suggested that the intervention model would improve (a) the academic achievement of underrepresented students in the COE, (b) retention rates of underrepresented students, reducing the retention gap, and (c) the yield of high-achieving minority students from the COE, further improving employer interest in visiting the COE for interviews. During the following 2 years, administrators used this early research to advocate for continuing the EA through reports to leadership groups in the COE and MU administration. We summarize subsequent developments at MU as an illustration of the utility of the research and conclude with brief reflections on the implications of the actionable research approach for higher education as a field.

Implications

The EA pilot test illustrates some of the ways that campuses can adapt bridge programs to reduce inequality. Our findings show how new approaches to introducing engineering students to fundamental math concepts resulted in improved grades during the first 2 years of college, possibly one of the reasons for the empirical finding that bridge programs improved retention (e.g., Attewell & Douglas, 2013). Yet, our focus groups indicated that social engagement during the first 2 years of college was also an important mechanism, consistent with other findings (e.g., Strayhorn, Tillman-Kelly, & Dorime-Williams, 2013). Actionable research undertaken with the intent of informing reforms within institutions seems consonant with more traditional approaches to research. We think it is important to consider how research is and can be used in advocacy for equity, as well as in efforts to document the effects of different types of interventions like bridge programs.


St. John et al. 1061

There is a need for concerted efforts to learn about the uses of research in advocacy on social justice issues in higher education. Examining the uses and abuses of research and research funding through political maneuvering may be as important as research on the effects of programs. These experiences reinforce the idea that researchers need to partner with change agents within institutions on reforms based on actionable research (St. John, 2013). Both research and advocacy remain necessary elements of reforms that seek to reduce inequalities in educational opportunities. We need to better understand how to engage in these processes while maintaining the integrity of research used to inform reform.

Conclusion

Although summer bridge programs provide a mechanism for reducing inequality, more efforts are required beyond providing supplemental instruction during the summer before college. However, this action experiment with multi-pronged intervention and research components illustrates the value of taking a comprehensive approach to effecting change within universities.

First, the MU case illustrates that the bridge program’s design, testing, and refinement are appropriately viewed as one part of a set of comprehensive remedies to disparities in degree completion rates. Senior administrators in a 2006 planning session were concerned about the gap in retention between majority and minority students. St. John provided data analyses to the senior planning group that identified the limited success of bridge programs as a problem that could be addressed. The follow-up planning process involved collaboration between program managers (i.e., women’s and multicultural program offices) and the research team to develop a strategy to link the bridge program with support services.

Second, post-hoc analysis of changes that resulted from the research partnership illustrates the political nature of intervention processes. Budgetary constraints require that data be used to support advocacy of new programs within universities. This formative evaluation provided simulations to illustrate how the revised summer program had altered the trajectory through the university, providing evidence for continuation of the program. But to continue the program, it proved necessary to alter the selection process to provide a mechanism to recruit high-achieving students from out of state, rather than only state residents, the original target group in the interventions. Although this type of adaptation helps universities in their pursuit of prestige, it undermines the social contract between public research universities and their states. We recognize that by engaging in actionable research, our responsibility is constrained to reporting on the research rather than becoming advocates for one type of strategy or another.

Third, our findings were congruent with the organizers’ theory of change, which focused on transforming deficiency by building on strengths. The intervention focused on state residents who were high achievers within their schools but did not have the math preparation needed for MU’s engineering courses. While the summer program provided supplemental instruction in math, students also received coaching, mentoring, and other support services, along with a living-learning community that built a stronger culture of peer self-help and collaboration.


Appendix A

Step 1: Predict GPA. First-term GPA is modeled as a function of gender, ethnicity, parent income, residency, and ACT score, using an out-of-sample prediction from the general student population.

Step 2: Multiply the coefficients for the predictors in Step 1 by the actual values of each variable for each individual program participant, yielding a predicted first-term GPA for each participant.

Step 3: Compare each individual’s predicted GPA to their actual GPA (predicted GPA >, =, or < actual GPA). If actual GPA > predicted GPA, then the program adds value to the GPA.

The probability of persisting* is then modeled from gender, ethnicity, parent income, residency, ACT score, and first-term GPA, again as an out-of-sample prediction from the general population. Multiplying these coefficients by the actual values for program participants, using the predicted GPA from Step 2, simulates nonparticipation in the program (the predicted probability of persisting if not participating); using participants’ actual first-term GPA instead simulates participation (the predicted probability of persisting if participating). The difference between the two is the difference in probability of persisting attributable to the program.

Step 4: Calculate the potential increased revenue resulting from program participation: (predicted probability of persisting with program participation – predicted probability of persisting without program participation) × FTE revenue generated by 1 student per year of added enrollment.

*Note: Different persistence outcomes may be used, for example, first-to-second year; second-to-fourth year; persist or not; or graduate, dropout, still enrolled in home college, still enrolled at the university but transferred to another college, to name a few choices.

Figure A1. Process chart for methods used in quantitative analysis.
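The four steps of Figure A1 can be sketched in code. This is a minimal illustration, not the authors’ implementation: the persistence intercept (–1.5), the student profile, the actual GPA of 3.2, and the FTE revenue figure are all hypothetical assumptions; the GPA coefficients echo Table B2, and the odds ratio on first-term GPA echoes Table B5 (odds ratios enter a logit as their logarithms).

```python
import math

# Hypothetical coefficients for illustration; see Tables B2 and B5 for the
# estimates actually reported in the article.
GPA_COEFS = {"male": 0.132, "african_american": -0.462,
             "low_act": -0.323, "high_act": 0.302}
GPA_INTERCEPT = 3.065

PERSIST_COEFS = {"male": math.log(0.321), "gpa": math.log(3.492)}
PERSIST_INTERCEPT = -1.5  # assumption: not reported in the article

def predicted_gpa(student):
    """Steps 1-2: apply general-population coefficients to a participant."""
    return GPA_INTERCEPT + sum(
        coef * student.get(name, 0) for name, coef in GPA_COEFS.items())

def persist_prob(student, gpa):
    """Logistic prediction of persisting, given a first-term GPA."""
    z = PERSIST_INTERCEPT + PERSIST_COEFS["gpa"] * gpa
    z += PERSIST_COEFS["male"] * student.get("male", 0)
    return 1.0 / (1.0 + math.exp(-z))

# Step 3: simulated nonparticipation uses the predicted GPA; simulated
# participation uses the (here hypothetical) actual first-term GPA.
student = {"male": 1, "african_american": 1, "low_act": 1}
gpa_without = predicted_gpa(student)   # 3.065 + 0.132 - 0.462 - 0.323 = 2.412
gpa_with = 3.2                         # hypothetical actual first-term GPA

p_without = persist_prob(student, gpa_without)
p_with = persist_prob(student, gpa_with)

# Step 4: translate the persistence gap into added enrollment revenue.
FTE_REVENUE = 12_000                   # hypothetical revenue per student-year
added_revenue = (p_with - p_without) * FTE_REVENUE
```

Because the actual GPA exceeds the predicted GPA, the simulated probability of persisting rises, and Step 4 converts that gap into the revenue argument used in the budget advocacy described above.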


Appendix B

Regression Tables

Table B1. Descriptive Statistics of Variables in the Predictive Model of Grade Point Average.

Category          Variable                              N     %
Total             N                                     980
Gender            Male                                  684   69.8
                  Female                                296   30.2
Ethnicity         Ethnicity missing                     57    5.8
                  White                                 585   59.7
                  African American                      76    7.8
                  Hispanic                              62    6.3
                  Asian                                 185   18.9
                  Native American                       6     0.6
                  Other ethnicity                       9     0.9
Parent income     Missing 2003 parent income            319   32.6
                  Low income (< $60,000)                168   17.1
                  Medium income ($60,000 to $99,999)    195   19.9
                  High income (≥ $100,000)              298   30.4
Residency status  Residency status missing              85    8.7
                  State resident                        223   22.8
                  Non-state resident                    672   68.6
ACT score         ACT score missing                     0     0.0
                  Low (13–23)                           66    6.7
                  Medium (24–30)                        627   64.0
                  High (31–36)                          287   29.3

Table B2. Ordinary Least Squares Regression of Grade Point Average for First Term, Fall 2002 Term Grades (n = 980).

Category       Variable                   Unstd. Coef.  Sig.
Gender         Male                        0.132        ****
Ethnicity      Missing                    –0.183        **
               African American           –0.462        ****
               Hispanic                   –0.309        ****
               Asian American             –0.098        *
               Native American             0.035
               Other                      –0.117
Parent income  Low income (< $60,000)      0.024
               High income (≥ $100,000)    0.081
               Missing                     0.100        *

(continued)


Table B2. (continued)

Category          Variable              Unstd. Coef.  Sig.
Residency status  Missing                0.248        ***
                  Non-state resident    –0.044
ACT score         Low (13–23)           –0.323        ****
                  High (31–36)           0.302        ****
Constant                                 3.065        ****
Adjusted R2                              0.191        ****
Change in R2                             0.068
Model F (df)                            17.512 (14)   ****

Note. Unstd. Coef. = unstandardized coefficient; Sig. = significance; df = degrees of freedom.
*p < .10. **p < .05. ***p < .01. ****p < .001.

Table B3. Descriptive Statistics of Variables in Predictive Model of Persistence to 2nd Year (Any Department), by Persistence Status.

                                                        Persisters     Nonpersisters
Category          Variable                              N     %        N     %
Total             N                                     958            22
Gender            Male                                  665   69.4     19    86.4
                  Female                                293   30.6     3     13.6
Ethnicity         Ethnicity missing                     57    5.9      0     0.0
                  White                                 575   60.0     10    45.5
                  African American                      71    7.4      5     22.7
                  Hispanic                              58    6.1      4     18.2
                  Asian                                 183   19.1     2     9.1
                  Native American                       5     0.5      1     4.5
                  Other ethnicity                       9     0.9      0     0.0
Parent income     Missing 2003 parent income            315   32.9     4     18.2
                  Low income (< $60,000)                158   16.5     10    45.5
                  Medium income ($60,000 to $99,999)    191   19.9     4     18.2
                  High income (≥ $100,000)              294   30.7     4     18.2
Residency status  Residency status missing              85    8.9      0     0.0
                  State resident                        215   22.4     8     36.4
                  Non-state resident                    658   68.7     14    63.6
ACT score         ACT score missing                     0     0.0      0     0.0
                  Low (13–23)                           61    6.4      5     22.7
                  Medium (24–30)                        614   64.1     13    59.1
                  High (31–36)                          283   29.5     4     18.2
GPA fall 2002     A                                     228   23.8     1     4.5
                  B                                     574   59.9     11    50.0

(continued)


Table B4. Descriptive Statistics of Variables in Predictive Model of Persistence From 2nd to 4th Years (Any Department), by Persistence Status.

                                                        Graduated     Still Enrolled  Transferred   Dropout
Category          Variable                              N     %       N     %         N     %       N     %
Gender            Male                                  374   66.7    207   73.9      35    83.3    30    73.2
                  Female                                187   33.3    73    26.1      7     16.7    11    26.8
Ethnicity         Ethnicity missing                     43    7.7     8     2.9       1     2.4     3     7.3
                  White                                 340   60.6    171   61.1      32    76.2    17    41.5
                  African American                      24    4.3     25    8.9       2     4.8     8     19.5
                  Hispanic                              20    3.6     23    8.2       2     4.8     6     14.6
                  Asian                                 124   22.1    50    17.9      5     11.9    7     17.1
                  Native American                       4     0.7     2     0.7       0     0.0     0     0.0
                  Other ethnicity                       6     1.1     1     0.4       0     0.0     0     0.0
Parent income     Missing 2003 parent income            209   37.3    66    23.6      9     21.4    16    39.0
                  Low income (< $60,000)                71    12.7    52    18.6      15    35.7    12    29.3
                  Medium income ($60,000 to $99,999)    96    17.1    79    28.2      7     16.7    5     12.2
                  High income (≥ $100,000)              185   33.0    83    29.6      11    26.2    8     19.5
Residency status  Residency status missing              70    12.5    12    4.3       1     2.4     4     9.8
                  State resident                        128   22.8    66    23.6      12    28.6    4     9.8
                  Non-state resident                    363   64.7    202   72.1      29    69.0    33    80.5
ACT score         ACT score missing                     0     0.0     0     0.0       0     0.0     0     0.0
                  Low (13–23)                           30    5.3     25    8.9       4     9.5     4     9.8
                  Medium (24–30)                        340   60.6    198   70.7      28    66.7    30    73.2
                  High (31–36)                          191   34.0    57    20.4      10    23.8    7     17.1
GPA fall 2002     A                                     185   33.0    36    12.9      8     19.0    2     4.9
                  B                                     331   59.0    177   63.2      28    66.7    24    58.5
                  C                                     42    7.5     59    21.1      4     9.5     12    29.3
                  D and lower                           3     0.5     8     2.9       2     4.8     3     7.3
                  Mean GPA                              561   3.41    280   3.07      42    3.18    41    2.82

Note. GPA = grade point average.

Table B3. (continued)

                               Persisters     Nonpersisters
Category       Variable        N     %        N     %
GPA fall 2002  C               132   13.8     7     31.8
               D and lower     24    2.5      3     13.6
               Mean GPA        958   3.23     22    2.46

Note. GPA = grade point average.


Table B5. Logistic Regression of Factors Influencing Persistence (N = 980).

Columns report odds ratios (with significance) for four outcomes: (1) persistence between 1st and 2nd years (any department)a; (2) persistence between 1st and 2nd years (engineering only)b; (3) persistence from 2nd to 4th years (any department)a; (4) persistence from 2nd to 4th years (engineering only)b.

Category         Variable                   (1)              (2)          (3)              (4)
Gender           Male                       0.321*           0.986        0.615*           0.509***
Race             Missing                    c                2.470        1.817            1.504
                 African American           0.861            0.962        0.453**          0.754
                 Hispanic                   0.858            0.900        0.493            0.780
                 Asian                      2.061            0.788        1.288            1.613
                 Native American            0.096**          0.265**      0.380            0.843
                 Other                      c                0.146        c                0.909
Parent income    Low income (< $60,000)     0.352            0.656        0.547            0.434***
                 High income (≥ $100,000)   1.418            1.161        1.075            1.036
                 Missing income             1.036            0.948        0.542            0.628
State residency  Missing                    c                1.558        1.768            2.320
                 Non-resident               0.699            0.806        1.409            1.149
ACT scores       Low ACT (13–24)            0.682            0.867        1.219            0.792
                 High ACT (31–36)           0.795            0.842        0.903            1.017
GPA              First term                 3.492****        1.677**      3.514****        2.684****
Model χ2 (df)                               45.437 (15)****  17.659 (15)  87.995 (15)****  83.640 (15)****
Nagelkerke pseudo R2                        0.234            0.050        0.199            0.154

Note. Sig. = significance; GPA = grade point average; df = degrees of freedom.
a. This includes students who have graduated or are still enrolled in any Midwest University college (including College of Engineering).
b. This includes students who have graduated or are still enrolled in the College of Engineering.
c. Odds ratios are not included because their values are high due to the number of cases in the model (see descriptive tables).
*p < .10. **p < .05. ***p < .01. ****p < .001.
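Table B5 reports odds ratios rather than probabilities. A small sketch can make that interpretation concrete: the 3.492 odds ratio on first-term GPA means a one-point GPA increase multiplies the odds of 1st-to-2nd-year persistence by 3.492. The 90% baseline probability below is an assumed example, not a figure from the article.

```python
# Convert a multiplicative odds-ratio effect into a shift in probability.
def shift_probability(p_base, odds_ratio):
    """Apply an odds ratio to a baseline probability and return the new one."""
    odds = p_base / (1.0 - p_base) * odds_ratio
    return odds / (1.0 + odds)

# Assumed 90% baseline persister; odds ratio 3.492 from Table B5, column (1).
p_after = shift_probability(0.90, 3.492)
```

For a student already at a 90% chance of persisting, the same odds ratio moves the predicted probability to roughly 97%, which is why first-term GPA dominates the persistence models.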


Appendix C

Tables Showing Value Added for Underrepresented Minority Students in the Engineering Academy

Table C1. Simulated Value Added of Participation in Engineering Academy (EA) for Underrepresented Minority (URM) Students.

                                          Simulated         Simulated               Additional 2008 URM
                                          Nonparticipation  Participation  Value    EA Students Retained
                                          in EA             in EA          Added    (N = 33)
Average 1st-term GPA                      2.777             3.219a         0.443    —
Predicted persistence between 1st and
  2nd years (any department)b             96.61%            97.57%         0.96%    0.32
Predicted persistence between 1st and
  2nd years (engineering only)c           93.51%            94.66%         1.15%    0.38
Predicted persistence from 2nd to
  4th years (any department)b             79.31%            85.13%         5.82%    1.92
Predicted persistence from 2nd to
  4th years (engineering only)c           76.77%            82.20%         5.43%    1.79

a. This is the actual average 1st-term grade point average (GPA) for 2008 URM EA students (N = 33).
b. This includes students who have graduated or are still enrolled in any Midwest University college (including College of Engineering).
c. This includes students who have graduated or are still enrolled in the College of Engineering.

Table C2. Simulated Grade Point Average (GPA) and Persistence of Underrepresented Minority Students Not Participating in Engineering Academy.

                                          From Predicted    From Actual    Value
                                          Performance       Performance    Added
Average 1st-term GPA                      2.861             2.833a         –0.028
Predicted persistence between 1st and
  2nd years (any department)b             95.04%            93.13%         –1.91%
Predicted persistence between 1st and
  2nd years (engineering only)c           92.03%            91.49%         –0.54%
Predicted persistence from 2nd to
  4th years (any department)b             83.01%            80.03%         –2.98%
Predicted persistence from 2nd to
  4th years (engineering only)c           79.21%            76.92%         –2.29%

a. This is the actual average 1st-term GPA for 2008 underrepresented minority students not in the Engineering Academy (N = 56).
b. This includes students who have graduated or are still enrolled in any Midwest University college (including College of Engineering).
c. This includes students who have graduated or are still enrolled in the College of Engineering.


Table C3. Simulated Relative Value Added of Participation in Engineering Academy (EA) for Underrepresented Minority (URM) Students.

                                     Participation in EA                 No Participation in EA
                                     Simulated   Simulated               From         From
                                     Nonpartic.  Partic.    Value        Predicted    Actual       Value     Relative
                                     in EA       in EA      Added        Performance  Performance  Added     Value Addeda
Average 1st-term GPA                 2.777       3.219b     0.443        2.861        2.833c       –0.028    0.471
Predicted persistence between 1st
  and 2nd years (any department)d    96.61%      97.57%     0.96%        95.04%       93.13%       –1.91%    2.88%
Predicted persistence between 1st
  and 2nd years (engineering only)e  93.51%      94.66%     1.15%        92.03%       91.49%       –0.54%    1.69%
Predicted persistence from 2nd
  to 4th years (any department)d     79.31%      85.13%     5.82%        83.01%       80.03%       –2.98%    8.81%
Predicted persistence from 2nd
  to 4th years (engineering only)e   76.77%      82.20%     5.43%        79.21%       76.92%       –2.29%    7.72%

a. This is the relative effect of EA on URM students, calculated by subtracting the value added of no participation from the value added of participation.
b. This is the actual average 1st-term grade point average (GPA) for 2008 URM EA students (N = 33).
c. This is the actual average 1st-term GPA for 2008 URM students not in the EA (N = 56).
d. This includes students who have graduated or are still enrolled in any Midwest University college (including College of Engineering).
e. This includes students who have graduated or are still enrolled in the College of Engineering.
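The "relative value added" in Table C3 is simple arithmetic over the Table C1 and C2 entries: the value added under participation minus the value added under simulated nonparticipation. A sketch, using three of the reported rows (small discrepancies against Table C3 reflect rounding in the published figures):

```python
# Each pair is (baseline, observed): for participants, (simulated
# nonparticipation, simulated participation); for nonparticipants,
# (from predicted performance, from actual performance).
participation = {
    "gpa": (2.777, 3.219),
    "persist_1_2_any": (96.61, 97.57),
    "persist_2_4_any": (79.31, 85.13),
}
nonparticipation = {
    "gpa": (2.861, 2.833),
    "persist_1_2_any": (95.04, 93.13),
    "persist_2_4_any": (83.01, 80.03),
}

def value_added(pair):
    baseline, observed = pair
    return observed - baseline

# Relative value added = VA(participation) - VA(nonparticipation).
relative = {k: value_added(participation[k]) - value_added(nonparticipation[k])
            for k in participation}
# From these rounded inputs: gpa ~0.470, persist_1_2_any ~2.87,
# persist_2_4_any ~8.80 (Table C3 reports 0.471, 2.88%, and 8.81%).
```

The nonparticipant comparison matters because its value added is negative: without the program, similar students underperform their predictions, which is what makes the relative effect larger than the participants' raw gain.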


Acknowledgments

We thank the Office of the Registrar for providing data for this study and the College of Engineering and our partner university for providing support for the research summarized in this article. The analyses and interpretations are the authors’ and do not represent official policies or positions of the partner university, hereafter referred to as Midwest University.

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Funding from Midwest University College of Education supported graduate assistant positions for a couple of the authors. The “human subjects” requirements prevented us from identifying the name of the university. We gratefully acknowledge this financial support.

References

Attewell, P., & Douglas, D. (2013, November). Recent evidence on summer bridge programs and degree completion. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.

Bowman, P. J., & St. John, E. P. (2011). Diversity, merit, and higher education: Toward a comprehensive agenda for the twenty-first century (Readings on Equal Education, Vol. 25). New York: AMS Press.

Hrabowski, F. A., Maton, K. I., Greene, M. L., & Greif, G. L. (2002). Overcoming the odds: Raising academically successful African American women. Oxford, UK: Oxford University Press.

Hrabowski, F. A., Maton, K. I., & Greif, G. L. (1998). Overcoming the odds: Raising academically successful African American males. Oxford, UK: Oxford University Press.

Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14–26.

Musoba, G. D. (2006). Using evaluation to close the inquiry loop. In E. P. St. John & M. Wilkerson (Eds.), Reframing persistence research to support academic success (New Directions for Institutional Research, Vol. 130, pp. 77–94). San Francisco, CA: Jossey-Bass.

St. John, E. P. (2013). Research, actionable knowledge, and social change: Reclaiming social responsibility through research partnerships. Sterling, VA: Stylus.

St. John, E. P., Hu, S., & Fisher, A. S. (2011). Breaking through the access barrier: Academic capital formation informing public policy. New York: Routledge.

St. John, E. P., & Musoba, G. D. (2011). Pathways to academic success: Expanding opportunity for underrepresented students. New York: Routledge.

Strayhorn, T. L., Tillman-Kelly, D. L., & Dorime-Williams, M. (2013, November). Parsing “what works . . . for whom” in summer bridge programs: A multisite analysis of underrepresented minorities. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.


Author Biographies

Edward P. St. John is the Algo D. Henderson Collegiate Professor of Education at the University of Michigan’s Center for the Study of Higher and Postsecondary Education. His recent book, Research, Actionable Knowledge, and Social Change, launched the Actionable Research and Social Justice in Education and Society Series by Stylus Press.

Johanna C. Massé is pursuing a joint PhD in sociology and higher education at the University of Michigan. Her primary research interests include qualitative methodologies; social stratification and mobility; and undergraduate social justice education.

Amy S. Fisher recently defended her dissertation, “When Did the Public Become the New Private? Grappling With Access to Postsecondary Education for Low-Income Students.” Her research interests, guided by a social justice framework, lie in postsecondary access and persistence for low-income and underrepresented students.

Karen Moronski-Chapman is pursuing a PhD in higher education at the University of Michigan and is a program assistant for Title III and Center for Excellence in Teaching & Learning at Daemen College. Her research interests include the effect of academic preparation and financial aid on college access and persistence as well as persistence of underrepresented students in STEM fields.

Malisa Lee is pursuing a PhD in higher education at the University of Michigan and is the assistant vice chancellor of enrollment management at the University of Nebraska–Omaha. Her primary research interests include college preparation; postsecondary access and equity; enrollment management; and persistence.
