

Page 1: Effectiveness - EvaluATE

Evaluation Terminology . . . . 2

Real Questions, Real Answers . . . . 2

. . . . 3

. . . . 3

Upcoming Events . . . . 4

Conduit Editor: Emma Perk

Steven Budd, Independent Researcher

Having made a career of community college administration, first as a grant writer and later as a college president, I know well the power of grants in advancing a college’s mission. Somewhere in the early 1990s, the NSF was one of the first grantmakers in higher education to recognize the role of community colleges in STEM undergraduate education. Ever since, two-year faculty have strived to enter the NSF world with varied success.

Unlike much of the grant funding from federal sources, success in winning NSF grants is predicated on innovation and advancing knowledge, which stands in stark contrast to a history of colleges making the case for support based on institutional need. Colleges that are repeatedly successful in winning NSF grants are those that demonstrate their strengths and their ability to deliver what the grantor wants. I contend that NSF grants will increasingly go to new or “first-time” institutions once they recognize and embrace their capacity for innovation and knowledge advancement. With success in winning grants comes the responsibility to document achievements through effective evaluation.

I am encouraged by what I perceive as a stepped-up discussion among grant writers, project PIs, and program officers about evaluation and its importance. As a grant writer/developer, my main concern was to show that the activities I proposed were actually accomplished and that the anticipated courses, curricula, or other project deliverables had been implemented. Longer-term outcomes pertaining to student achievement were generally considered to be beyond a project’s scope. However, student outcomes have now become the measure for attracting public funding, and the emphasis on outcomes will only increase in this era of performance-based budgeting.

When I was a new president of an institution that had never benefitted from grant funding, I had the pleasure of rolling up my sleeves and joining the faculty in writing a proposal to the Advanced Technological Education (ATE) program. College presidents think long and hard about performance measures like graduation rates, length of enrollment until completion, and the gainful employment of graduates, yet such measures may seem distant to faculty who must focus on getting more students to pass their courses. The question arises as to how to reconcile equally important interests in outcomes—at the course and program levels for faculty and the institutional level for the president. While I was not convinced that student outcomes were beyond the scope of the grant, the faculty and I agreed that our ATE evaluation ought to be a step in a larger process.

Most evaluators would agree that longitudinal studies of student outcomes cannot fall within the typical three-year grant period. By the same token, I think the new emphasis on logic models that demonstrate the progression from project inputs and activities through short-, mid-, and long-term outcomes allows grant developers to better tailor evaluation designs to the funded work, as well as extend project planning beyond the immediate funding period. The notion of “stackable credentials” so popular with the college completion agenda should now be part of our thinking about grant development. For example, we might look to develop proposals for ATE Targeted Research that build upon more limited project evaluation results. Or perhaps the converse is the way to go: Let’s plan our ATE projects with a mind toward long-term results, supported by evaluation and research designs that ultimately get us the data we need to “make the case” for our colleges as innovators and advancers of knowledge.

Data Management Plan

Guest Columnist: Elaine Johnson

EvaluATE has funds to assist up to ten evaluators to travel to and attend the ATE PI conference!

We are pleased to announce that we will reimburse up to ten internal or external ATE evaluators for transportation, lodging, and registration expenses to enable them to attend the 2014 ATE PI conference ($1,500 maximum).

All current ATE evaluators are invited to apply by completing the application form at http://evalu-ate.org/events/workshop_2014/. Additional information about requirements for funding recipients is included in this form. The submission deadline is August 22, 2014. Notification of awards will be made by August 29.

EvaluATE is offering our August webinar—Your ATE Proposal: Got Evaluation?—on two different days and times. See page 4 for event details.

Page 2

Effectiveness

The ATE program solicitation calls for the evaluation of project effectiveness. Effectiveness, as defined by the Oxford English Dictionary, is “the degree to which something is successful in producing a desired result.” Therefore, ATE evaluations should determine the extent to which projects achieved their intended results, demonstrating how the project’s activities led to observed outcomes.

To claim effectiveness requires establishing causal links between a project’s activities and observed outcomes. To establish causation, three criteria must be met: temporal precedence, covariation, and no plausible alternative explanations (see bit.ly/trochim). For example, if you claim that your project led to increased enrollment of women in engineering technology, you need to provide evidence that (1) the trend did not start until after the project was initiated, (2) individuals or campuses not involved in the project did not experience the same changes or that the degree of change varied with the degree of involvement; and (3) nothing else going on in the project’s environment could have produced the observed increase in the number of women enrolled.
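The enrollment example can be sketched as a simple before/after comparison against an uninvolved comparison group. The sketch below is purely illustrative: the years, enrollment counts, and project start date are all invented. It speaks only to the first two criteria; ruling out alternative explanations (criterion 3) is a matter of argument, not arithmetic.

```python
# Hypothetical check of the first two causation criteria, using made-up
# fall-term enrollments of women in engineering technology.
# "project" = campuses involved in the project; "comparison" = campuses not involved.

project = {2010: 40, 2011: 41, 2012: 48, 2013: 57}
comparison = {2010: 38, 2011: 39, 2012: 40, 2013: 41}

PROJECT_START = 2012  # assumed start year, for illustration only


def change(series, start):
    """Average enrollment from `start` onward minus average enrollment before it."""
    before = [v for y, v in series.items() if y < start]
    after = [v for y, v in series.items() if y >= start]
    return sum(after) / len(after) - sum(before) / len(before)


# Criterion 1 (temporal precedence): the upward shift appears only after
# the project began.
project_change = change(project, PROJECT_START)

# Criterion 2 (covariation): uninvolved campuses did not change to the
# same degree.
comparison_change = change(comparison, PROJECT_START)

# The difference between the two changes is the portion of the gain that
# the comparison group cannot account for -- attributable to the project
# only if criterion 3 is argued convincingly elsewhere.
excess_change = project_change - comparison_change
print(f"Project campuses change:    {project_change:+.1f}")
print(f"Comparison campuses change: {comparison_change:+.1f}")
print(f"Excess change:              {excess_change:+.1f}")
```

This is essentially a difference-in-differences-style comparison; a real evaluation would use more years of data and test whether the gap is larger than chance variation.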

While effectiveness is important, there is more to evaluation than measuring it. Other considerations include relevance, efficiency, impact, and sustainability (project evaluation criteria developed by the Organisation for Economic Co-operation and Development; to learn more, see bit.ly/oecd-dac).

The evaluation requirements and expectations expressed in the new ATE program solicitation are generally consistent with those that were in the prior version. However, there are two important changes that relate specifically to ATE centers:

First, the solicitation states that proposals for center renewals “may submit up to five pages on Results of Prior Support in the supplemental documents section and refer the reader to that section in the Project Description section.” The requirement that all proposals must begin with a subsection titled Results of Prior Support has not changed. What is new is the option—for Centers only—of describing results of prior support in a supplementary document, allowing proposers to devote more of their 15-page project descriptions to what they intend to do, rather than what they have accomplished in the past. Whether embedded in the project description or appended as a supplementary document, this section should identify the prior grant’s outcomes and impacts, supported with evidence from the evaluation. Reviewers will be looking for strong evidence that NSF made a good investment in the center and that a renewal grant is warranted given the center’s track record.

Second, the new solicitation calls for national center proposals to include evaluation plans that describe how impacts on institutions, faculty, students, and industry will be assessed. This is a more specific expectation for the evaluation than in the previous solicitation, which called for evaluations to provide evidence of impacts relating to a center’s disciplinary focus. Thus, proposals for national centers should describe the intended impacts at each of these levels (institutions, faculty, students, industry), and the evaluation plan should explain what data will be used to determine the quality and magnitude of those impacts.

Although not directly related to evaluation, other notable changes in the 2014 solicitation include the following:

‒ there is a new track for ATE projects called “ATE Coordination Networks”

‒ the Targeted Research track has been expanded

‒ Resource Centers have been renamed Support Centers

‒ all grantees are required to work with ATE Central to archive materials developed with grant funds to ensure they remain available to the public after funding ends

The archiving requirement relates directly to the data management plans that are required with all NSF proposals. To learn more about DMPs and how to develop yours, check out the article on page 3 of this newsletter.

Also, note the submission deadline is earlier this year—October 9! To learn more about developing an evaluation plan to include in your ATE proposal, join our webinars on August 20 and 26 (see page 4).

Check out the new solicitation at www.nsf.gov/ate.

Page 3

Elaine Johnson is the PI and Executive Director of Bio-Link, the Next Generation National ATE Center for Biotechnology and Life Sciences at City College of San Francisco. She has served in that role since the founding of Bio-Link in 1998. She is a mentor for Mentor-Connect and has been a co-PI on the Synergy ATE project.

Jason Burkhardt

Where are the hidden opportunities to positively influence proposal reviewers? Surprisingly, this is often the Results from Prior Support section. Many proposers do not go beyond simply recounting what they did in prior grants. They miss the chance to “wow” the reader with impact examples, such as Nano-Link’s Nano-Infusion Project that has resulted in the integration and inclusion of nanoscale modules into multiple grade levels of K-14 across the nation. Teachers are empowered with tools to effectively teach nanoscale concepts as evidenced by their survey feedback. New leaders are emerging and enthusiasm for science can be seen on the videos available on the website. Because of NSF funding, additional synergistic projects allowed for scaling activities and growing a national presence.

Any PI who has received NSF support in the past five years must include a summary of the results (up to five pages) and how those results support the current proposal. Because pages in this subsection count toward the 15-page total, many people worry that they are using too much space to describe what has been done. These pages, however, can give the proposal punch and energy through metrics, outcomes, and stories. This is the time to quote the evaluator’s comments and tie the results to the evaluation plan. The external view provides valuable decision-making information to reviewers. This discussion of prior support helps reviewers evaluate the proposal, allows them to make informed comments, and provides evidence that the new activities will add value.

According to the NSF Grant Proposal Guide, updated in 2013, the subsection must include: award number, amount, and period of support; title of the project; a summary of results described under the separate headings of Intellectual Merit and Broader Impacts; publications acknowledging NSF support; evidence of research products and their availability; and the relation of the completed work to the proposed work.

The bottom line is that the beginning of the project description sets the stage for the entire proposal. Data and examples that demonstrate intellectual merit and broader impacts clearly define what has been done, leaving room for a clear description of the new directions that will require funding.

NSF requires that ALL proposals include a data management plan (DMP); FastLane will not accept submissions without one. The DMP must detail “how you will conform to NSF policy on the dissemination and sharing of research results.” The term “research results” basically means any information collected or produced as a result of your program. Therefore, the DMP must detail what data you will collect and how you will collect, maintain, report, and disseminate those data, as well as other resources generated by your grant. While NSF does outline requirements for what should be included in a DMP (bit.ly/dmp-ehr), they do not tell you how to write one. There are a handful of resources that can help you write a DMP.

The University of Wisconsin Research Data Services Unit has a webpage that provides several links to resources (bit.ly/UW-dmp), and the University of Michigan features extensive guidance, including templates and worksheets (bit.ly/um-dmp). The University of Minnesota also offers several resources for DMP development (bit.ly/umn-dmp).

One other tool that can be helpful is the DMP Tool, available at DMPTool.org. You fill out the plan as you go through the tool, and you can save plans as well. The tool provides extensive guidance on DMP development, with instructions for each part of the plan, guidance on how to fill out the sections, and helpful links. ATE Central includes guidance, resources, and an example plan in its handbook, available at atecentral.net/handbook, and also provides archive services for resources produced by ATE projects and centers (which supports sustainability). A new requirement in the 2014 ATE program solicitation is that grantees “must provide copies of [their] resources to ATE Central for archiving purposes.”

If you can demonstrate that you followed the data management plan for a prior grant, and also that you have provided access to the information and resources that your project or center has generated, then you can even use this information in your Results of Prior Support section for your next proposal.

Page 4

Dr. Lori Wingate, PI

Dr. Arlen Gullickson, Co-PI

The Evaluation Center
Western Michigan University
1903 West Michigan Avenue
Kalamazoo, MI 49008-5237
P: 269-387-5895

E: [email protected]

EvaluATE is operated by the Western Michigan University Evaluation Center and funded by the National Science Foundation under grant number 1204683. The views expressed in this newsletter are those of the authors and do not necessarily reflect those of NSF.

Hosted by our friends at MATEC Networks (matecnetworks.org)


To make the most use of a logic model, it must be connected to an evaluation plan that plots out what data you need to collect at what times to provide the most benefit for your project. In this webinar, we’ll demonstrate this process using examples from ATE projects. This webinar is the second in a series by the FAS4ATE project, which is connecting PIs, evaluators, and international experts to design evaluative activities to effectively serve both day-to-day management and end-of-project needs for the ATE community. This project will culminate in a one-day workshop at the Omni Shoreham in Washington, DC on October 21 (prior to the ATE PI Conference). If you are interested in applying for the workshop go to: http://evalu-ate.org/events/

Your ATE Proposal: Got Evaluation?

A strong evaluation plan that is well integrated into your grant proposal will strengthen your submission and maybe even give you a competitive edge. In this webinar, experienced ATE community members will provide insights on ways to enhance your proposal and avoid common pitfalls with regard to evaluation. We’ll walk through the Evaluation Planning Checklist for ATE Proposals, which provides detailed guidance on how to address evaluation throughout a proposal—from the project summary to the budget justification. We’ll share examples of how to incorporate results from previous evaluations in the Results of Prior Support section, present a coherent evaluation plan linked to project activities and goals, and budget for an external evaluation (among other things). We’ll have plenty of time for questions and discussion with our knowledgeable panel.
