2018 DSM Portfolio Evaluation Report

June 14, 2019

Prepared for:
NIPSCO
801 East 86th Avenue
Merrillville, IN 46410


Page 1: 2018 DSM Portfolio Evaluation Report - NIPSCO



Prepared by:

The Cadmus Group LLC

With subcontractors: Illume Advising and Integral Analytics


Table of Contents

Table of Contents ................................................................................................................................ i

List of Acronyms and Abbreviations .................................................................................................... v

Executive Summary ............................................................................................................................ 1

Portfolio Performance and Insights ........................................................................................................ 1

Savings Achievements ............................................................................................................................ 4

Summary of Recommendations ........................................................................................................... 13

Program Offerings ............................................................................................................................ 22

Evaluation Objectives and Methodology ........................................................................................... 24

Research Questions .............................................................................................................................. 24

Impact Evaluation Approach ................................................................................................................ 25

Process Evaluation Approach ............................................................................................................... 26

Research Activities ............................................................................................................................... 26

NTG Methods ....................................................................................................................................... 31

Residential HVAC Rebates Program .................................................................................................. 34

Program Performance .......................................................................................................................... 34

Research Questions .............................................................................................................................. 35

Impact Evaluation ................................................................................................................................. 35

Process Evaluation ................................................................................................................................ 49

Conclusions and Recommendations .................................................................................................... 62

Residential Lighting Program ............................................................................................................ 64

Program Performance .......................................................................................................................... 64

Research Questions .............................................................................................................................. 65

Impact Evaluation ................................................................................................................................. 65

Process Evaluation ................................................................................................................................ 69

Conclusions and Recommendations .................................................................................................... 71

Residential Home Energy Analysis Program ...................................................................................... 73

Program Performance .......................................................................................................................... 73

Research Questions .............................................................................................................................. 74

Impact Evaluation ................................................................................................................................. 74

Process Evaluation ................................................................................................................................ 86


Conclusions and Recommendations .................................................................................................. 100

Residential Appliance Recycling Program ........................................................................................ 104

Program Performance ........................................................................................................................ 104

Research Questions ............................................................................................................................ 105

Impact Evaluation ............................................................................................................................... 105

Process Evaluation .............................................................................................................................. 118

Conclusions and Recommendations .................................................................................................. 127

Residential School Education Program ............................................................................................ 129

Program Performance ........................................................................................................................ 129

Research Questions ............................................................................................................................ 130

Impact Evaluation ............................................................................................................................... 131

Process Evaluation .............................................................................................................................. 142

Conclusions and Recommendations .................................................................................................. 148

Residential Multi-Family Direct Install Program .............................................................................. 150

Program Performance ........................................................................................................................ 150

Research Questions ............................................................................................................................ 151

Impact Evaluation ............................................................................................................................... 151

Process Evaluation .............................................................................................................................. 159

Conclusions and Recommendations .................................................................................................. 161

Residential Behavioral Program ...................................................................................................... 162

Program Performance ........................................................................................................................ 163

Research Questions ............................................................................................................................ 163

Impact Evaluation ............................................................................................................................... 164

Savings Analysis Methodology ........................................................................................................... 173

Process Evaluation .............................................................................................................................. 176

Conclusions and Recommendations .................................................................................................. 186

Residential Income Qualified Weatherization Program ................................................................... 189

Program Performance ........................................................................................................................ 189

Research Questions ............................................................................................................................ 190

Impact Evaluation ............................................................................................................................... 191

Process Evaluation .............................................................................................................................. 204

Conclusions and Recommendations .................................................................................................. 218


C&I Prescriptive Program ............................................................................................................... 222

Program Performance ........................................................................................................................ 222

Research Questions ............................................................................................................................ 223

Impact Evaluation ............................................................................................................................... 223

Process Evaluation .............................................................................................................................. 236

Conclusions and Recommendations .................................................................................................. 246

C&I Custom Incentive Program ....................................................................................................... 249

Program Performance ........................................................................................................................ 249

Research Questions ............................................................................................................................ 250

Impact Evaluation ............................................................................................................................... 251

Process Evaluation .............................................................................................................................. 265

Conclusions and Recommendations .................................................................................................. 281

C&I New Construction Incentive Program ....................................................................................... 284

Program Performance ........................................................................................................................ 284

Research Questions ............................................................................................................................ 285

Impact Evaluation ............................................................................................................................... 286

Process Evaluation .............................................................................................................................. 298

Conclusions and Recommendations .................................................................................................. 307

C&I Small Business Direct Install Program ....................................................................................... 310

Program Performance ........................................................................................................................ 310

Research Questions ............................................................................................................................ 311

Impact Evaluation ............................................................................................................................... 311

Process Evaluation .............................................................................................................................. 324

Conclusions and Recommendations .................................................................................................. 334

C&I Retro-Commissioning Program ................................................................................................. 337

Program Performance ........................................................................................................................ 337

Research Questions ............................................................................................................................ 338

Impact Evaluation ............................................................................................................................... 338

Process Evaluation .............................................................................................................................. 343

Conclusions and Recommendations .................................................................................................. 346


Appendix A. Evaluation Definitions................................................................................................. 348

Appendix B. Self-Report Net-to-Gross Evaluation Methodology ...................................................... 349

Appendix C. Residential Lighting Program Assumptions and Algorithms .......................................... 368

Appendix D. Residential Home Energy Analysis Program Assumptions and Algorithms .................... 374

Appendix E. Residential Home Energy Analysis Program Billing Analysis Methodology .................... 385

Appendix F. Residential School Education Program Ex Post Measure Savings Calculation Methodologies .............................................................................................................. 390

Appendix G. Residential Behavioral Program Regression Analysis and Uplift Analysis ...................... 397

Appendix H. Multi-Family Direct Install Program ............................................................................. 401

Appendix I. Residential Income Qualified Weatherization Program Assumptions and Algorithms ..... 408

Appendix J. Cost-Effectiveness ....................................................................................................... 422


List of Acronyms and Abbreviations

Acronym/Abbreviation   Definition

ACFM Actual cubic feet per minute of compressed air

ARCA Appliance Recycling Centers of America

C&I Commercial and Industrial

CAC Central air conditioner

CBCP Center beam candle power

CDD Cooling degree days

CF Coincidence factor

CFM Cubic feet per minute

CHA report Comprehensive home assessment report

COP Coefficient of performance

DHW Domestic hot water

DOE U.S. Department of Energy

DP&L Dayton Power and Light

DSM Demand-side management

EFLH Effective full-load hours

EISA Energy Independence and Security Act

EM&V Evaluation, measurement, and verification

HDD Heating degree day

HEA program Home Energy Analysis program

HEW Home energy worksheet

HOU Hours of use

IQW program Income Qualified Weatherization program

ISR In-service rates

M&V Measurement and verification

MFDI program Multi-Family Direct Install program

NPV Net present value

NTG Net-to-gross

PCT Participant cost test

PPS Probability proportional to size

QA/QC Quality assurance and quality control

RCx program Retro-Commissioning program

RIM Ratepayer impact measure test

ROI Return on investment

SBDI program Small Business Direct Install program

TMY3 Typical meteorological year

TRC Total resource cost test

TRM Technical Reference Manual

UCT Utility cost test

UMP Uniform Methods Project

VFD Variable frequency drive

WHF Waste heat factor


Executive Summary

NIPSCO’s demand-side management (DSM) portfolio contains eight residential programs and five

commercial and industrial (C&I) programs that serve its customer base. This executive summary includes

key findings from the evaluation team’s1 assessment, including ex post gross and net savings impacts,

cost-effectiveness, program operations, performance, and opportunities for improvement. Overall, the

portfolio achieved 172,345,982 kWh ex post gross electric energy savings, 23,844 kW ex post gross peak

demand reduction, and 5,123,584 therms ex post gross natural gas energy savings. NIPSCO achieved its

overall electric energy savings goal, but fell short of its overall peak demand reduction and natural gas

ex post savings goals in 2018. The residential portfolio met electric energy, peak demand reduction, and

natural gas energy ex post savings goals, whereas the C&I sector met only its electric energy ex post

savings goal.

Portfolio Performance and Insights

Thousands of residential and C&I customers participated in NIPSCO’s DSM programs in 2018 through

expanded offerings to targeted customer groups. To evaluate program impacts and performance, the

evaluation team interviewed utility program and implementation staff, surveyed customers and trade

allies, and conducted on-site verifications. The evaluation team also measured and verified savings for

each program. A summary of savings impacts, spending, and key accomplishments for each program

follows. Figure 1 and Figure 2 show the overall performance and key findings for the residential and C&I

sectors.

1 The evaluation team includes Cadmus (lead firm), Illume Advising, and Integral Analytics.


Figure 1. 2018 Residential Sector Performance and Insights


Figure 2. 2018 C&I Sector Performance and Insights


Savings Achievements

The following section details program- and portfolio-level savings achievements relative to planning

goals, the savings achievements at each step of the impact evaluation, the contribution of each program

to portfolio savings, and a summary of recommendations for each program.

Portfolio Results

Table 1 and Table 2 show 2018 gross planning goals for electric and natural gas savings, and each

program’s performance in achieving those goals. Because NIPSCO’s program savings goals are gross,

goal achievement can be assessed at different stages of the evaluation (audited, verified, ex post gross,

ex post net). The tables show goal achievement in terms of ex post gross savings.

Table 1. 2018 Portfolio Electric Goal Achievement

Program | Gross Electric Savings Goal (kWh) | Ex Post Gross Electric Savings (kWh) | Share of Electric Goal Achieved (%)a | Gross Peak Demand Reduction Goal (kW) | Ex Post Gross Peak Demand Reduction (kW) | Share of Peak Demand Goal Achieved (%)a

Residential
HVAC Rebates | 1,657,129 | 1,622,426 | 98% | 942 | 1,088 | 116%
Lighting | 26,171,797 | 35,884,266 | 137% | 6,553 | 4,884 | 75%
Home Energy Analysis | 1,160,772 | 424,231 | 37% | 674 | 201 | 30%
Appliance Recycling | 1,864,894 | 2,854,738 | 153% | 274 | 419 | 153%
School Education | 2,946,360 | 2,193,543 | 74% | 400 | 266 | 67%
Multi-Family Direct Install | 921,264 | 993,817 | 108% | 206 | 111 | 54%
Behavioral | 24,516,272 | 31,976,257 | 130% | N/A | 3,650 | N/A
Income Qualified Weatherization | 1,845,908 | 843,410 | 46% | 285 | 270 | 95%
Total Residential | 61,084,397 | 76,792,687 | 126% | 9,333 | 10,890 | 117%

C&I
Prescriptive | 41,558,911 | 45,255,008 | 109% | 13,697 | 6,352 | 46%
Custom | 29,267,725 | 27,019,275 | 92% | 5,716 | 3,359 | 59%
New Construction | 12,084,164 | 16,787,567 | 139% | 2,469 | 2,341 | 95%
Small Business Direct Install | 6,076,864 | 6,177,763 | 102% | 2,542 | 866 | 34%
Retro-Commissioning | 2,088,805 | 313,681 | 15% | 406 | 37 | 9%
Total C&I | 91,076,468 | 95,553,294 | 105% | 24,830 | 12,955 | 52%

Total 2018 Portfolio | 152,160,865 | 172,345,982 | 113% | 34,164 | 23,844 | 70%

Source: 2018 DSM Scorecard, January 2018 to December 2018.
Note: Totals may not sum exactly due to rounding.
a Program goals are gross; savings achievements are ex post gross.
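Goal achievement in Table 1 and Table 2 is simply ex post gross savings divided by the gross planning goal, rounded to a whole percent. A minimal sketch in Python (the function name is illustrative; the values are copied from the Lighting and HVAC Rebates rows of Table 1):

```python
def share_of_goal_achieved(ex_post_gross, goal):
    """Share of goal achieved (%) = ex post gross savings / gross goal,
    rounded to a whole percent as reported in Tables 1 and 2."""
    return round(100 * ex_post_gross / goal)

# Lighting electric energy (kWh), Table 1
assert share_of_goal_achieved(35_884_266, 26_171_797) == 137
# HVAC Rebates electric energy (kWh), Table 1
assert share_of_goal_achieved(1_622_426, 1_657_129) == 98
```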


Table 2. 2018 Portfolio Natural Gas Goal Achievement

Program | Gross Natural Gas Savings Goal (therms) | Ex Post Gross Natural Gas Savings (therms) | Share of Natural Gas Goal Achieved (%)a

Residential
HVAC Rebates | 2,078,492 | 2,176,218 | 105%
Home Energy Analysis | 196,763 | 44,704 | 23%
School Education | 202,304 | 157,106 | 78%
Multi-Family Direct Install | 175,387 | 39,608 | 23%
Behavioral | 751,885 | 1,169,960 | 156%
Income Qualified Weatherization | 276,792 | 201,550 | 73%
Total Residential | 3,681,624 | 3,789,146 | 103%

C&I
Prescriptive | 364,789 | 288,206 | 79%
Custom | 1,265,347 | 536,686 | 42%
New Construction | 294,687 | 360,639 | 122%
Small Business Direct Install | 365,074 | 148,905 | 41%
Retro-Commissioning | 38,745 | 0 | 0%
Total C&I | 2,328,642 | 1,334,437 | 57%

Total 2018 Portfolio | 6,010,266 | 5,123,584 | 85%

Source: 2018 DSM Scorecard, January 2018 to December 2018.
Note: Totals may not sum exactly due to rounding.
a Program goals are gross; savings achievements are ex post gross.

Table 3 through Table 5 show the electric energy, peak demand reduction, and natural gas energy

savings achieved by each program in the 2018 NIPSCO portfolio.


Table 3. 2018 Portfolio Electric Energy Savings
(Ex ante, audited, and verified columns are reported savings; ex post gross through ex post net are evaluated savings.)

Program | Ex Ante (kWh) | Audited (kWh) | Verified (kWh) | Ex Post Gross (kWh) | Realization Rate (%) | Freeridership (%) | Spillover (%) | NTG Ratio (%) | Ex Post Net (kWh)

Residential
HVAC Rebates | 1,413,559 | 1,413,559 | 1,413,559 | 1,622,426 | 115% | 35% | 0% | 65% | 1,054,577
Lightinga | 24,148,317 | 24,148,317 | 22,447,462 | 35,884,266 | 149% | N/A | N/A | 52% | 18,659,818
Home Energy Analysisa | 450,871 | 451,104 | 417,545 | 424,231 | 94% | 23% | 0% | 77% | 325,147
Appliance Recyclinga | 2,369,904 | 2,369,904 | 2,369,904 | 2,854,738 | 120% | 35% | 0% | 65% | 1,845,713
School Education | 2,947,908 | 2,948,137 | 1,585,000 | 2,193,543 | 74% | 11% | 6% | 95% | 2,083,866
Multi-Family Direct Install | 937,088 | 937,088 | 937,088 | 993,817 | 106% | N/A | N/A | 100% | 993,817
Behavioral | 31,504,104 | 31,504,104 | 32,126,813 | 31,976,257 | 101% | N/A | N/A | 100% | 31,976,257
Income Qualified Weatherization | 1,304,976 | 1,303,717 | 1,248,092 | 843,410 | 65% | 0% | 0% | 100% | 843,410
Total Residential | 65,076,727 | 65,075,928 | 62,545,463 | 76,792,687 | 118% | N/A | N/A | N/A | 57,782,604

C&I
Prescriptive | 42,608,649 | 42,608,649 | 42,608,649 | 45,255,008 | 106% | 12% | 1% | 89% | 40,276,957
Custom | 27,644,407 | 27,647,724 | 29,297,639 | 27,019,275 | 98% | 21% | 0% | 79% | 21,345,228
New Construction | 15,439,881 | 15,441,219 | 15,441,219 | 16,787,567 | 109% | 48% | 0% | 52% | 8,729,535
Small Business Direct Install | 5,730,613 | 5,730,613 | 5,730,613 | 6,177,763 | 108% | 4% | 1% | 97% | 5,992,430
Retro-Commissioning | 328,052 | 328,052 | 328,052 | 313,681 | 96% | 0% | 0% | 100% | 313,681
Total C&I | 91,751,602 | 91,756,258 | 93,406,172 | 95,553,294 | 104% | N/A | N/A | N/A | 76,657,830

Total 2018 Portfolio | 156,828,329 | 156,832,185 | 155,951,635 | 172,345,982 | 110% | N/A | N/A | N/A | 134,440,435


Table 4. 2018 Portfolio Peak Demand Savings
(Ex ante, audited, and verified columns are reported reductions; ex post gross through ex post net are evaluated reductions.)

Program | Ex Ante (kW) | Audited (kW) | Verified (kW) | Ex Post Gross (kW) | Realization Rate (%)a | Freeridership (%) | Spillover (%) | NTG Ratio (%) | Ex Post Net (kW)

Residential
HVAC Rebates | 643 | 643 | 643 | 1,088 | 169% | 35% | 0% | 65% | 707
Lightingb | 3,308 | 3,308 | 3,075 | 4,884 | 148% | N/A | N/A | 52% | 2,540
Home Energy Analysisc | 248 | 248 | 220 | 201 | 81% | 13% | 0% | 87% | 175
Appliance Recyclingc | 348 | 349 | 349 | 419 | 120% | 35% | 0% | 65% | 271
School Education | 400 | 471 | 188 | 266 | 67% | 11% | 6% | 95% | 253
Multi-Family Direct Install | 176 | 176 | 176 | 111 | 63% | N/A | N/A | 100% | 111
Behavioral | N/A | N/A | N/A | 3,650 | N/A | N/A | N/A | 100% | 3,650
Income Qualified Weatherization | 152 | 152 | 137 | 270 | 177% | 0% | 0% | 100% | 270
Total Residential | 5,275 | 5,346 | 4,787 | 10,890 | 206% | N/A | N/A | N/A | 7,977

C&I
Prescriptive | 7,728 | 7,728 | 7,728 | 6,352 | 82% | 12% | 1% | 89% | 5,653
Custom | 3,465 | 3,465 | 3,672 | 3,359 | 97% | 21% | 0% | 79% | 2,654
New Construction | 2,575 | 2,606 | 2,606 | 2,341 | 91% | 48% | 0% | 52% | 1,217
Small Business Direct Install | 2,002 | 2,002 | 2,002 | 866 | 43% | 4% | 1% | 97% | 840
Retro-Commissioning | 0 | 0 | 0 | 37 | N/A | 0% | 0% | 100% | 37
Total C&I | 15,770 | 15,801 | 16,008 | 12,955 | 82% | N/A | N/A | N/A | 10,401

Total 2018 Portfolio | 21,045 | 21,147 | 20,796 | 23,844 | 113% | N/A | N/A | N/A | 18,378

Note: Totals may not sum exactly due to rounding.
a Realization rate is ex post gross/ex ante.
b The Lighting NTG ratio is based on a method that produces an overall NTG ratio but not individual freeridership or spillover values.
c Freeridership, spillover, and NTG percentages are averages of the measure-level values applied to the HEA and Appliance Recycling programs.
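Per notes a through c, the realization rate compares ex post gross savings with ex ante savings, and ex post net savings apply the NTG ratio (1 minus freeridership plus spillover). A small Python check against the HVAC Rebates and Prescriptive rows of Table 4 (function names are illustrative; whole-number rounding is assumed to match the report's convention):

```python
def realization_rate(ex_post_gross, ex_ante):
    """Realization rate (%) = ex post gross / ex ante (note a)."""
    return round(100 * ex_post_gross / ex_ante)

def ex_post_net(ex_post_gross, freeridership, spillover):
    """Ex post net = ex post gross * NTG, where NTG = 1 - freeridership + spillover."""
    ntg = 1 - freeridership + spillover
    return round(ex_post_gross * ntg)

# HVAC Rebates peak demand (kW), Table 4: NTG = 1 - 0.35 + 0.00 = 65%
assert realization_rate(1_088, 643) == 169
assert ex_post_net(1_088, 0.35, 0.00) == 707
# Prescriptive peak demand (kW), Table 4: NTG = 1 - 0.12 + 0.01 = 89%
assert ex_post_net(6_352, 0.12, 0.01) == 5_653
```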


Table 5. 2018 Portfolio Natural Gas Energy Savings
(Ex ante, audited, and verified columns are reported savings; ex post gross through ex post net are evaluated savings.)

Program | Ex Ante (therms) | Audited (therms) | Verified (therms) | Ex Post Gross (therms) | Realization Rate (%)a | Freeridership (%) | Spillover (%) | NTG Ratio (%) | Ex Post Net (therms)

Residential
HVAC Rebates | 1,896,351 | 1,896,351 | 1,896,351 | 2,176,218 | 115% | 35% | 0% | 65% | 1,414,542
Home Energy Analysisb | 83,323 | 83,370 | 72,825 | 44,704 | 54% | 8% | N/A | 92% | 41,217
School Education | 202,409 | 204,394 | 77,389 | 157,106 | 78% | 11% | 6% | 95% | 149,251
Multi-Family Direct Install | 79,283 | 79,283 | 79,283 | 39,608 | 50% | N/A | N/A | 100% | 39,608
Behavioral | 1,110,584 | 1,110,584 | 1,176,246 | 1,169,960 | 105% | N/A | N/A | 100% | 1,169,960
Income Qualified Weatherization | 217,691 | 217,646 | 191,283 | 201,550 | 93% | 0% | N/A | 100% | 201,550
Total Residential | 3,589,640 | 3,591,627 | 3,493,376 | 3,789,146 | 106% | N/A | N/A | N/A | 3,016,128

C&I
Prescriptive | 363,958 | 363,237 | 363,237 | 288,206 | 79% | 12% | 1% | 89% | 256,504
Custom | 554,281 | 552,358 | 585,320 | 536,686 | 97% | 21% | 0% | 79% | 423,982
New Construction | 366,300 | 367,051 | 367,051 | 360,639 | 98% | 48% | 0% | 52% | 187,532
Small Business Direct Install | 174,519 | 174,646 | 174,646 | 148,905 | 85% | 4% | 1% | 97% | 144,438
Retro-Commissioning | 0 | 0 | 0 | 0 | N/A | 0% | 0% | 100% | 0
Total C&I | 1,459,058 | 1,457,293 | 1,490,255 | 1,334,437 | 91% | N/A | N/A | N/A | 1,012,456

Total 2018 Portfolio | 5,048,698 | 5,048,920 | 4,983,632 | 5,123,584 | 101% | N/A | N/A | N/A | 4,028,584

Note: Totals may not sum exactly due to rounding.
a Realization rate is ex post gross/ex ante.
b Freeridership, spillover, and NTG percentages are averages of the measure-level values applied to the HEA program.


Program Contribution to Portfolio Savings

Figure 3 and Figure 4 illustrate each program’s contribution to total ex post gross portfolio savings for electric energy and peak demand, respectively. The Lighting program contributed the largest share of

electric energy savings to the Residential portfolio, with 47% of total electric energy (kilowatt-hour)

savings. The Behavioral program accounted for the next largest share (42%). The Lighting program also

accounted for the largest share of peak demand reduction (kilowatts) for the Residential portfolio,

contributing 45% of total peak demand reduction. The Behavioral program contributed 34% and the

HVAC Rebates program contributed 10%.

In the C&I sector, the Prescriptive program contributed the largest share of electric energy savings, with

47% of the total C&I portfolio electric energy (kilowatt-hour) savings. The Custom program contributed

28% and the New Construction program contributed 18%. The Prescriptive, Custom, and New

Construction programs contributed the largest shares of peak demand reduction (kilowatts) to the C&I

portfolio, accounting for 49%, 26%, and 18% of peak demand reduction, respectively.

Figure 3. Program Contribution to Portfolio Electric Energy Savings (kWh) by Ex Post Gross

Figure 4. Program Contribution to Portfolio Peak Demand Reduction (kW) by Ex Post Gross


Figure 5 illustrates each program’s contribution to total ex post gross natural gas portfolio energy

savings. The HVAC Rebates program accounted for the largest share of Residential natural gas energy

(therm) savings, with 57% of the Residential portfolio savings. The Behavioral program was the second

largest contributor to the Residential portfolio’s natural gas savings total (31%). The Custom program

contributed 40% of the natural gas energy savings for the C&I sector, the most of any of the C&I

programs. The New Construction program accounted for 27%, and the Prescriptive program accounted

for 22% of the C&I sector’s natural gas savings.
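The contribution shares quoted above are each program’s ex post gross savings divided by the sector total, taken from Table 3 and Table 5. A minimal sketch in Python (the function name is illustrative):

```python
def contribution_pct(program_savings, sector_total):
    """Program share of sector ex post gross savings, rounded to a whole percent."""
    return round(100 * program_savings / sector_total)

# Residential electric (kWh), Table 3: Lighting's 47% share
assert contribution_pct(35_884_266, 76_792_687) == 47
# C&I natural gas (therms), Table 5: Custom's 40% share
assert contribution_pct(536_686, 1_334_437) == 40
```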

Figure 5. Program Contribution to Portfolio Natural Gas Energy Savings (therms) by Ex Post Gross

Budget and Cost-Effectiveness

As shown in Table 6 and Table 7, NIPSCO spent 89% of its electric budget and 75% of its natural gas

budget for the 2018 portfolio.


Table 6. 2018 Electric Portfolio Budget and Spending

Program | Budget ($) | Actual Spend ($) | Budget Spent (%)

Residential
HVAC Rebates | $500,164 | $341,842 | 68%
Lighting | $4,119,284 | $3,611,711 | 88%
Home Energy Analysis | $521,661 | $208,900 | 40%
Appliance Recycling | $467,608 | $578,609 | 124%
School Education | $505,192 | $492,427 | 97%
Multi-Family Direct Install | $279,129 | $373,882 | 134%
Behavioral | $1,828,988 | $1,782,342 | 97%
Income Qualified Weatherization | $859,761 | $579,632 | 67%
Total Residential | $9,081,787 | $7,969,345 | 88%

C&I
Prescriptive | $4,126,847 | $3,899,345 | 94%
Custom | $3,677,007 | $2,949,615 | 80%
New Construction | $1,353,864 | $1,555,177 | 115%
Small Business Direct Install | $945,242 | $879,322 | 93%
Retro-Commissioning | $258,406 | $23,834 | 9%
Total C&I | $10,361,366 | $9,307,294 | 90%

Total 2018 Portfolio | $19,443,152 | $17,276,639 | 89%

Source: 2018 DSM Scorecard.
Note: Totals may not sum exactly due to rounding.

Table 7. 2018 Natural Gas Portfolio Budget and Spending

Program | Budget ($) | Actual Spend ($) | Budget Spent (%)

Residential
HVAC Rebates | $2,539,083 | $2,233,956 | 88%
Home Energy Analysis | $678,249 | $293,558 | 43%
School Education | $544,281 | $532,932 | 98%
Multi-Family Direct Install | $374,754 | $154,883 | 41%
Behavioral | $192,544 | $189,063 | 98%
Income Qualified Weatherization | $1,354,340 | $1,264,785 | 93%
Total Residential | $5,683,250 | $4,669,177 | 82%

C&I
Prescriptive | $377,151 | $358,855 | 95%
Custom | $1,373,354 | $561,814 | 41%
New Construction | $319,841 | $379,741 | 119%
Small Business Direct Install | $412,006 | $195,033 | 47%
Retro-Commissioning | $42,053 | ($605)a | (1%)
Total C&I | $2,524,404 | $1,494,838 | 59%

Total 2018 Portfolio | $8,207,654 | $6,164,015 | 75%

Source: 2018 DSM Scorecard.
Note: Totals may not sum exactly due to rounding.
a Cancelled and delayed projects resulted in a negative spending balance for 2018.


Table 8 and Table 9 show the results of the cost-effectiveness analysis. Energy efficiency cost-

effectiveness is measured by comparing the monetized energy efficiency benefits of an investment with

the costs. The evaluation team used four cost-effectiveness tests as a part of this analysis: the utility

cost test (UCT, or program administrator cost test), the total resource cost test (TRC), the ratepayer

impact measure test (RIM), and the participant cost test (PCT). The evaluation team used option-based

assumptions for the cost-effectiveness analysis. The inputs used and a description of each test can be

found in Appendix J. Cost-Effectiveness.
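The four tests compare the same monetized benefits against costs from different perspectives. As a rough illustration of the standard test definitions (the variable names and inputs below are illustrative, not taken from the report; the actual inputs are documented in Appendix J):

```python
# Sketch of the four standard cost-effectiveness tests. All inputs are
# assumed to be present-valued dollars; names are illustrative.

def cost_effectiveness(avoided_supply_cost, program_admin_cost, incentives,
                       participant_cost, bill_reductions):
    """Return (UCT, TRC, RIM, PCT) benefit-cost ratios.

    UCT: utility benefits vs. utility costs (administration plus incentives).
    TRC: utility benefits vs. total costs (administration plus participant
         costs; incentives are a transfer and cancel out).
    RIM: utility benefits vs. utility costs plus lost revenues.
    PCT: participant benefits (bill reductions plus incentives) vs.
         participant costs.
    """
    uct = avoided_supply_cost / (program_admin_cost + incentives)
    trc = avoided_supply_cost / (program_admin_cost + participant_cost)
    rim = avoided_supply_cost / (program_admin_cost + incentives + bill_reductions)
    # PCT cannot be calculated for programs without incremental participant costs.
    pct = ((bill_reductions + incentives) / participant_cost
           if participant_cost else None)
    return uct, trc, rim, pct
```

A ratio above 1.0 indicates that benefits exceed costs under that test's perspective.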

Table 8. Electric Program Cost-Effectiveness Results

Program UCT TRC RIM PCTa

Residential

HVAC Rebates 5.02 1.37 0.99 0.99

Lighting 4.34 1.81 0.56 2.73

Home Energy Analysis 2.40 2.40 0.72 N/A

Appliance Recycling 2.95 2.51 0.54 8.68

School Education 2.45 2.45 0.49 N/A

Multi-Family Direct Install 2.69 2.69 0.51 N/A

Behavioral 1.20 1.20 0.39 N/A

Income Qualified Weatherization 2.02 2.02 0.61 N/A

Total Residential 3.15 1.79 0.55 3.20

C&I

Prescriptive 8.66 1.28 0.47 1.88

Custom 5.21 3.97 0.43 9.01

New Construction 4.70 4.70 0.45 16.58

Small Business Direct Install 5.71 3.09 0.46 8.89

Retro-Commissioning 10.62 1.93 0.46 3.03

Total Commercial & Industrial 6.63 1.84 0.46 3.28

Total 2018 Electric Portfolio 5.02 1.82 0.48 3.26

a For programs without incremental participant costs, the PCT score cannot be calculated.


Table 9. Natural Gas Program Cost-Effectiveness Results

Program UCT TRC RIM PCTa

Residential

HVAC Rebates 5.54 2.31 0.84 2.13

Home Energy Analysis 1.42 1.42 0.58 N/A

School Education 1.18 1.18 0.54 N/A

Multi-Family Direct Install 2.81 2.81 0.73 N/A

Behavioral 3.70 3.70 0.76 N/A

Income Qualified Weatherization 1.75 1.75 0.63 N/A

Total Residential 3.59 2.15 0.77 2.64

C&I

Prescriptive 6.13 2.19 0.85 1.85

Custom 5.72 3.61 0.84 3.72

New Construction 4.23 3.19 0.80 4.24

Small Business Direct Install 6.55 2.80 0.86 3.12

Retro-Commissioning N/A N/A N/A N/A

Total Commercial & Industrial 5.55 2.90 0.84 3.06

Total 2018 Natural Gas Portfolio 4.07 2.36 0.79 2.76

a For programs without incremental participant costs, the PCT score cannot be calculated.

Summary of Recommendations

Based on the 2018 evaluation findings, the evaluation team proposes a number of recommendations

intended to improve program uptake, processes, and performance within NIPSCO’s DSM portfolio. A

summary of these recommendations follows.

Residential HVAC Rebates

• Review the incentive levels to ensure they are comparable to other similar programs (other

HVAC programs for similarly sized utilities) and consider increasing the incentive amounts if the

program would still be cost-effective.

• Apply the application process improvements to processing the rebate checks—i.e., increase

communication around payment status. This can help customers stay engaged and know when

to expect their checks.

• Conduct survey efforts focused on participants receiving two thermostats, designed to

determine their usage patterns for the second thermostat—what portion are given away, what

portion of those are given away in NIPSCO territory, and what type of system is being controlled

by installed secondary thermostats.

• If low savings for secondary smart thermostats becomes a concern for overall HVAC Rebates

program cost-effectiveness and it does not otherwise affect overall program satisfaction,

rebates for second thermostats should be curtailed.


• Update ex ante savings to generally reflect ex post findings. This will reduce disparities each

program year. In particular, consider doing this for air conditioner and heat pump measures, as

well as for demand savings for smart thermostats, which were sharply reduced this year.

Residential Lighting

• Provide an increased focus and deeper discounts on specialty and reflector LED lamps, and LED

fixtures, as these have the highest per-unit realization rates and are not anticipated to be

affected by Energy Independence and Security Act (EISA) 2020 rulemaking to the same extent as

general service lamps.

• Use the UMP recommended lumens binning approach, combined with Indiana TRM values for

HOU, WHF, and CF to generate ex-ante savings for each lamp in the program, ensuring that the

program gets fuller credit for higher wattage, specialty, and reflector LEDs and realization rates

are closer to 100%.
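The lumens-binning approach maps each lamp's rated light output to a baseline wattage, then applies hours of use (HOU), a waste-heat factor (WHF), and a coincidence factor (CF). A minimal sketch, assuming illustrative bin boundaries and baseline wattages (the actual values would come from the UMP and the Indiana TRM):

```python
# Illustrative lumen bins: (min_lumens, max_lumens, baseline_watts).
# These figures are assumptions for the sketch, not UMP or TRM values.
LUMEN_BINS = [
    (310, 749, 29),
    (750, 1049, 43),
    (1050, 1489, 53),
    (1490, 2600, 72),
]

def _baseline_watts(lumens):
    """Look up the baseline wattage for a lamp's lumen output."""
    for lo, hi, base_w in LUMEN_BINS:
        if lo <= lumens <= hi:
            return base_w
    raise ValueError("lumens outside binned range")

def lamp_savings_kwh(lumens, led_watts, hou, whf):
    """Annual kWh: (baseline W - LED W) x annual hours of use x waste-heat factor."""
    return (_baseline_watts(lumens) - led_watts) * hou * whf / 1000.0

def lamp_demand_kw(lumens, led_watts, cf, whf):
    """Coincident peak demand reduction in kW."""
    return (_baseline_watts(lumens) - led_watts) * cf * whf / 1000.0
```

Because higher-wattage, specialty, and reflector LEDs displace higher baseline wattages, a per-lamp bin lookup credits them more fully than a single average deemed value.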

Residential Home Energy Analysis

• Use the ex post gross natural gas savings values from this report as new ex ante values to reduce

uncertainty in current savings estimates if the program is unable to conduct blower-door tests

and record the heating capacity and efficiency for duct sealing.

• Consider having the evaluation team conduct a billing analysis in the next evaluation with a

larger population if program participation increases.

• Track metrics related to the scheduling process, such as length of time from scheduling to

assessment and the percentage of customers who call to schedule but never complete an

assessment. Use the tracked metrics to identify areas for improvement and set incremental

goals towards progress.

• Determine reasons for possible prioritization of Income Qualified Weatherization (IQW) over

Home Energy Analysis (HEA) by trade allies and take steps to ensure participating trade allies do

not prioritize one program over the other.

• Recruit additional trade allies to ensure there are enough trade allies to adequately serve

customers of both programs.

• Have energy assessors encourage their participants to accompany them during the walk-through

audit to provide an opportunity for the energy assessor to communicate what they will install

and the reasons for what they cannot install.

• Move the listing of installed measures to a more prominent location near the beginning of the

comprehensive home assessment (CHA) report. Consider also including a list of measures that

were not installed with the reasons why if possible, within the reporting platform.

• Consider including a list of installed measures using check-boxes and fill-ins for quantity when

applicable, so as not to be overly burdensome for the assessor or customer in the work

authorization form that participants sign after the assessment. This could provide an alternative

to using self-report for calculating ISRs.


• Revisit 2017 recommendation to establish additional data capture activities such as having

energy assessors take pictures of post-installation conditions while on-site. This documentation

could provide another method for calculating ISRs for some measures such as duct sealing and

pipe wrap, potentially resulting in higher ISRs than self-report for measures participants do not

see and/or interact with on a daily basis.

• In addition to the leave-behind brochures, include references to other NIPSCO energy efficiency

programs and rebates, including the $500 instant discount for attic insulation, within the

CHA report to help increase awareness of other program offerings. Having this information

available when participants review their recommendations may serve as reminder of these

programs.

• Include easy and immediate access to the CHA report (print and digital) and ability to add

custom content, such as related NIPSCO rebates and program information as top features to

look for in a new reporting platform.

• Coordinate with program implementation staff and trade allies to ensure program processes for

printing reports, installing measures and recording installation, and promoting other NIPSCO

programs are followed.

• Consider having the evaluation team conduct ride-alongs as part of the 2019 program

evaluation to better assess the extent to which program procedures are followed and

opportunities for program improvement.

Residential Appliance Recycling

• Identify and retain the program elements key to customer satisfaction as the Appliance

Recycling program is transitioned from Appliance Recycling Centers of America (ARCA) to

Recleim in 2019. Maintaining key successful program elements (such as marketing, program

materials, and customer communication processes) will help build consistency and save on

program operation costs during the transition period.

• Set goals for the new contractor on key areas of customer satisfaction and track the new

contractor’s progress. Goals may include the following: (1) call customers within half an hour of

the pickup, (2) do not leave behind any mess for the customer during removal, (3) return all

phone calls within 24 hours for scheduling or rescheduling a pickup, and (4) make certain the

new contractor has sufficient trucks and drivers to ensure timely pickups.

• In future program years, adjust per-unit annual kilowatt-hour savings to reflect the 2018

evaluation results (1,009 kWh for refrigerators and 704 kWh for freezers). Continue to update

these savings as frequently as possible within the constraints of the planning and evaluation

cycles.

Residential School Education

• Document the reported scorecard savings methodology and sourcing to ensure it matches the

base ex ante savings calculation methodologies, including rounding discrepancies.

• Update domestic water heater fuel saturation rates using evaluated results.


• Use the known wattage of the LED night-light (0.5 watts) in calculating savings for the measure.

• Use the known gallon-per-minute efficiency value for faucet aerators and low-flow

showerheads.

• Calculate demand reductions and natural gas energy savings resulting from the furnace filter

whistle measure using the methodology outlined in the 2018 evaluation.

• Update all ISRs for each measure to match evaluated ISRs.

• Discontinue weighting savings values between various methodologies.

• Fully document all savings equations and input sources.

• Update person-per-home and showers/bathroom faucets-per-home input estimates to match

the higher values determined through the evaluation surveys.
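The kit measures above follow simple engineering algorithms once the known inputs (0.5 W night-light wattage, rated gallon-per-minute flows) are used. A hedged sketch, where the baseline wattage, usage hours, flow rates, and water-heating constants are illustrative assumptions rather than evaluated values:

```python
# Sketch of two School Education kit-measure calculations. The known
# efficient-case values (0.5 W LED, rated GPM) come from the report's
# recommendations; all other inputs are illustrative assumptions.

def night_light_kwh(baseline_watts=4.0, led_watts=0.5, hours_per_year=3650):
    """Annual kWh from replacing an assumed 4 W night-light with a 0.5 W LED."""
    return (baseline_watts - led_watts) * hours_per_year / 1000.0

def aerator_therms(base_gpm=2.2, eff_gpm=1.5, minutes_per_day=4.0,
                   delta_t_f=55.0, water_heater_eff=0.60):
    """Annual therms from a low-flow faucet aerator on a gas water heater.

    8.33 Btu raises one gallon of water 1 degree F; 100,000 Btu per therm.
    """
    gallons_saved = (base_gpm - eff_gpm) * minutes_per_day * 365
    return gallons_saved * 8.33 * delta_t_f / (water_heater_eff * 100_000)
```

Showerhead savings follow the same flow-reduction algorithm with shower-specific minutes of use, and the domestic water heater fuel saturation rate determines how savings split between electric and natural gas.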

Residential Multi-Family Direct Install

• The program is well positioned for the new program cycle, with an extensive pipeline

developed. In 2019, less effort will be needed to recruit property managers and more effort will

be needed to engage with interested property managers, organize 12- to 24-month installation

plans with those sites, and manage new and existing accounts.

• Adjust the natural gas savings goal in 2019 using consumption data acquired in 2018.

• Record all values necessary for accurate savings calculations and verification directly in the

program tracking database, including the following for all program customers:

▪ Heating system type and efficiency

▪ Heating system capacity (Btu/h)

▪ Cooling system type and efficiency

▪ Cooling system capacity (Btu/h)

▪ Existing bulb type and wattage (for lighting measures)

• Document the sources, assumptions, and (where possible) algorithms and key inputs included in

the derivation of each deemed savings value for all program measures.

• Consider the fuel type and heating and cooling systems in measure descriptions and savings

calculations. Use the ex post gross per-measure savings values provided in this report, which

incorporate these factors, in program planning for measures included in this evaluation.

Residential Behavioral

• If the goal is solely to achieve evaluable savings, terminate the HERs for waves whose group

sizes consistently yield statistically insignificant results, then start new waves. However, this

may dissatisfy customers who like the reports but would no longer receive them.

• Several approaches could be used to minimize the discontinuity in treatment while continuing

to send reports to wave 2 and 3 customers. For instance, the treatment group could be

incorporated into the eligible participants in a new wave and randomly assigned to either

treatment (high likelihood) or control (lower likelihood). Then prior HER participation would be


randomized and checked for equivalency. This would lead to slightly more muted early energy

savings than expected from a typical new HER wave, as control group customers would persist in

energy saving behaviors for a few years but long-run yearly savings would be the same. Ideally,

such a change should be developed in consultation with the evaluation team.

• HERs are a logical place to educate customers about energy efficiency program opportunities,

but customers may need to see the messaging multiple times or at the right time (such as when

they are thinking about new equipment) for it to be effective. Consider testing different

approaches to messaging such as: 1) different frequency and timing of cross-program

marketing; and 2) dedicating more space in the printed report to more detailed tips, such as

those available online, which may mean less space for messaging about other programs.

• To help the tips stay useful to participants and address the concerns of some participants,

regularly review the Energy Saving Tips to ensure they remain relevant and current with changing

technologies and customer preferences (such as the use of smart devices).

• Communicate to customers clearly that going to the web portal and updating their home

information details will help improve the accuracy of their Similar Homes Comparison. Part of the

clear communications could be using the term “accurate” or “accuracy” in the eHER and

embedding a hyperlink in the term that redirects customers to the web portal.

• Work with the program implementer and Oracle Utilities Opower to improve the ease of access to

the web portal, either by enabling single sign-on access to the web portal or by embedding

tracking cookies in eHERs that customers can use to automatically sign in to the web portal.
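The re-randomization approach described above (folding a prior treatment group into a new wave's eligible pool, then checking equivalence on prior participation) can be sketched as follows; the data shape, treatment share, and equivalence check are illustrative assumptions:

```python
# Sketch of re-randomizing legacy HER customers into a new wave and
# checking that prior participation is balanced across the new groups.
# Field names, treatment share, and seed are illustrative assumptions.
import random

def assign_new_wave(customers, treat_share=0.8, seed=2019):
    """Randomly split eligible customers into treatment and control groups."""
    rng = random.Random(seed)
    treatment, control = [], []
    for cust in customers:
        (treatment if rng.random() < treat_share else control).append(cust)
    return treatment, control

def prior_her_share(group):
    """Share of a group previously treated with HERs; compare across the
    new treatment and control groups to confirm equivalence."""
    return sum(1 for c in group if c["prior_her"]) / len(group)
```

Because randomization ignores prior HER status, both new groups should contain comparable shares of previously treated customers, which is why early savings are muted while control-group customers' behaviors persist.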

Residential Income Qualified Weatherization

• Use the provided ex post gross savings values, particularly for air and duct sealing, attic

insulation, and refrigerator measures, which more accurately represent participant

characteristics, for planning to most closely align ex ante and ex post gross savings and achieve

realization rates closer to 100%.

• Break out the existing attic insulation measure into three categories based on existing R-value

(R-0, R-1 to R-7, R-8 to R-11) to more accurately capture savings based on existing conditions

and guard against discrepancies in ex ante and ex post gross savings due to actual home

characteristics not falling into the average savings category.

• Establish data validation procedures in the program tracking database to require that the

heating and cooling fuels and system types are entered and that the measure selected aligns

with those housing characteristics where they are necessary for accurate savings calculations.

• Enhance efforts to educate energy assessors on the importance of accurately recording

measures installed, providing the CHA report, and verbally informing customers of the measures

installed so that customers can accurately verify the installation of those measures.

• Revisit 2017 recommendation to establish additional data capture activities such as having

energy assessors take pictures of post-installation conditions while on-site. This documentation


could provide another method for calculating ISRs for some measures such as duct sealing, air

sealing, and pipe wrap, potentially resulting in higher ISRs than self-report for measures

participants do not see and/or interact with on a daily basis.

• Consider including a list of installed measures using check-boxes and fill-ins for quantity when

applicable so as not to be overly burdensome for the assessor or customer in the work

authorization form that participants sign after the assessment. This could provide an alternative

to using self-report for calculating ISRs.

• Include easy and immediate access to the CHA report (print and digital) and the ability to add

custom content such as related NIPSCO rebates and program information as top features to look

for in a new reporting platform.

• Coordinate with program implementation staff and trade allies to ensure program processes for

printing reports, installing measures and recording installation, and promoting other NIPSCO

programs are followed.

• Provide energy assessors with a branded leave-behind postcard they can use to check off which

measures were installed. This postcard could also include the URL for NIPSCO’s energy efficiency

program website.
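The recommended three-bin structure for the attic insulation measure could be implemented as a simple lookup on existing R-value; the per-square-foot therm values below are placeholders, not evaluated savings:

```python
# Sketch of binning attic insulation savings by existing R-value, per the
# recommendation above. Savings-per-square-foot figures are placeholders.
ATTIC_BINS = [
    # (min_R, max_R, label, therms saved per sq ft per year)
    (0, 0, "R-0", 0.60),
    (1, 7, "R-1 to R-7", 0.35),
    (8, 11, "R-8 to R-11", 0.20),
]

def attic_savings_therms(existing_r, area_sqft):
    """Deemed annual therm savings for attic insulation, binned by the
    home's existing insulation level."""
    for lo, hi, _label, per_sqft in ATTIC_BINS:
        if lo <= existing_r <= hi:
            return per_sqft * area_sqft
    raise ValueError("existing R-value above program eligibility (R-11)")
```

Binning on existing conditions guards against the discrepancies that arise when a single average value is applied to homes that fall at either extreme of the range.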

C&I Prescriptive

• Promote project phasing—breaking a large-scale project into manageable phases—as a way to

alleviate participant cost barriers.

• Continue outreach with hydronic contractors to increase savings from steam traps and

insulation, boilers, and boiler tune-ups.

• Develop targeted emails, followed by telephone calls and meetings with select contractors, and

track the outreach effort outcomes.

• Have the implementer apply a coincidence factor (CF) of zero to all exterior lighting measures,

as outlined in the 2015 Indiana TRM (v2.2).

• Reference the 2019 Illinois TRM (v7.0) for better estimates of deemed savings by building type.

Currently, the ex ante deemed savings align with the Illinois TRM (v7.0) for low pressure steam

systems with recirculation. However, for systems without recirculation, use the building-specific

deemed therm-per-foot values for either Chicago (closest weather station to NIPSCO territory)

or a statewide average. Extrapolate these values to different diameter pipes to provide new

deemed savings for the three program measures offered.

• Reference the 2019 Illinois TRM (v7.0) for better estimates of deemed savings for steam trap

replacements.
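Extending deemed therm-per-foot values across pipe diameters, as recommended above, might look like the following sketch; the tabulated values are placeholders, not Illinois TRM (v7.0) figures:

```python
# Sketch of interpolating/extrapolating deemed pipe-insulation savings by
# diameter. Tabulated therms-per-foot values are placeholder assumptions.
DEEMED_THERMS_PER_FT = {1.0: 1.8, 2.0: 3.1, 3.0: 4.2}  # diameter (in) -> therms/ft

def pipe_insulation_therms(diameter_in, length_ft):
    """Linearly interpolate between tabulated diameters (or extrapolate
    from the nearest end segment), then scale by insulated length."""
    diams = sorted(DEEMED_THERMS_PER_FT)
    if diameter_in <= diams[0]:
        lo, hi = diams[0], diams[1]
    elif diameter_in >= diams[-1]:
        lo, hi = diams[-2], diams[-1]
    else:
        lo = max(d for d in diams if d <= diameter_in)
        hi = min(d for d in diams if d >= diameter_in)
        if lo == hi:  # exact tabulated diameter
            return DEEMED_THERMS_PER_FT[lo] * length_ft
    slope = (DEEMED_THERMS_PER_FT[hi] - DEEMED_THERMS_PER_FT[lo]) / (hi - lo)
    per_ft = DEEMED_THERMS_PER_FT[lo] + slope * (diameter_in - lo)
    return per_ft * length_ft
```

The same lookup-and-scale structure applies whether the tabulated values are building-specific Chicago figures or a statewide average.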


C&I Custom

• Develop a process for responding to customers in a timely, personalized manner. As a part of

this process, follow up within 24 hours of initial contact with any customer planning a project or

attempting to complete an application.

• Establish a process to proactively follow up with customers who have not recently contacted

program staff. Provide a courtesy email or phone call every few months to help build or

maintain relationships with businesses that have indicated interest in a project.

• In addition to hosting kickoff meetings and overview webinars for trade allies, offer thorough

program process trainings on a quarterly basis. Target those who are familiar with the program

but may be seeking a deeper understanding of the program nuances. Rotate training topics to

cover the span of trade ally specialties—lighting, refrigeration, compressed air, HVAC, and

hydronic applications—and use actual project examples. Include content that encourages trade

allies to seek out and upsell equipment that is eligible for program incentives. These sessions

will better inform trade allies about the program and its offerings, so they can confidently

discuss it with their customers—ultimately identifying more program-eligible projects and

increasing the program’s economic value to trade allies.

• Have the implementer follow the savings calculations in the Indiana TRM (v2.2) for all measures

where appropriate. The implementer should also ensure that the CFs are clearly laid out and

that the calculations are presented in an easy-to-follow format.

• Ensure that the CF=0 for all exterior lighting measures (unless they are 24/7 fixtures, where the

CF=1 regardless of interior or exterior location).

• For direct-fired heating measures that encompass efficiency, destratification, and setback

savings, use the building load model located in the file, LM Building Usage & Setback+Destrat

Savings Calc.xlsx, or similar spreadsheet calculator. Avoid read-only PDFs.

• Include a general narrative of project savings and nuances of assumptions in the project

application. For example, include reasons for ignoring destratification or setback savings for

natural gas HVAC measures when similar measures tend to include them.

• Calculate central HVAC equipment measure savings using the Indiana TRM (v2.2). Custom

calculations may be necessary in cases where custom controls or technology is integrated into

the measure.

• Ensure the most up-to-date savings calculation source is uploaded to the implementer’s

database, LM Captures, at the same time as the final application workbook. This will ensure the

correct supporting documentation is present with the claimed savings. Employ documentation

similar to the spreadsheet file referenced in a previous bullet point (dealing with direct-fired

heating measures), which lays out all assumptions, inputs, and formulae used in the calculation.

• If energy models are used, include the actual model file in LM Captures or all applicable inputs

required to build a similar model.


C&I New Construction

• Have the implementer follow the savings calculations in the Indiana TRM (v2.2) for all measures

where appropriate. The implementer should also ensure that the CFs are clearly laid out and

that the calculations are presented in an easy-to-follow format.

• Ensure that the CF=0 for all exterior lighting measures (unless they are 24/7 fixtures, where the

CF=1 regardless of interior or exterior location).

• To align with the other C&I programs, define measure unit quantity as the following:

▪ A fixture for lighting measures

▪ Equipment capacity for non-lighting equipment

▪ Linear feet for pipe insulation

▪ Square feet for envelope measures

• Because integrated measures like process or controls measures are more difficult to define

quantities for, set those quantities to one. Other definition protocols may be appropriate, but

the implementer should ensure they are used consistently across like measures.

• For direct-fired heating measures that encompass efficiency, destratification, and setback

savings, use the building load model located in the file, LM Building Usage & Setback+Destrat

Savings Calc.xlsx, or similar spreadsheet calculator. Avoid read-only PDFs.

• Include a general narrative of project savings and nuances of assumptions in the project

application. For example, include reasons for ignoring destratification or setback savings for

natural gas HVAC measures when similar measures tend to include them.

• Calculate central HVAC equipment measure savings using the Indiana TRM (v2.2). Custom

calculations may be necessary in cases where custom controls or technology is integrated into

the measure.

• Ensure the most up-to-date savings calculation source is uploaded to the implementer’s

database, LM Captures, at the same time as the final application workbook. This will ensure the

correct supporting documentation is present with the claimed savings. Employ documentation

similar to the spreadsheet file referenced in a previous bullet point (dealing with direct-fired

heating measures), which lays out all assumptions, inputs, and formulae used in the calculation.

• Ensure all C&I New Construction baseline assumptions align with the current federal minimum

efficiency standards and ASHRAE 90.1 energy code standards.

• Continue to diversify the trade allies involved in the program to ensure a mix of projects and

that savings goals are met in future years.

• Perform ongoing interviews with trade allies to gain insight into current baseline practices for

direct-fired heating measures.

• Re-assess the appropriateness of the non-setback unit heater baseline at the end of the next

three-year cycle (2021).


C&I Small Business Direct Install

• Continue outreach with hydronic contractors to increase savings from steam systems.

• Develop targeted emails, followed by telephone calls and meetings with select contractors, and

track outreach outcomes.

• Promote project phasing—breaking a large-scale project into manageable phases—as a way to

alleviate participant cost barriers.

• Have the implementer apply a CF of zero to all exterior lighting measures, as outlined in the

2015 Indiana TRM (v2.2).

• Target interior lighting opportunities (or 24/7 operating conditions) to help meet peak

coincidence demand reduction targets.

• Reference the 2019 Illinois TRM (v7.0) for better estimates of deemed savings by building type.

Currently, the ex ante deemed savings align with the Illinois TRM (v7.0) for low pressure steam

systems with recirculation. However, for systems without recirculation, use the building-specific

deemed therm-per-foot values for either Chicago (closest weather station to NIPSCO territory)

or a statewide average. Extrapolate these values to different diameter pipes to provide new

deemed savings for the three program measures offered.

• Target a more diverse population of building types for natural gas measures that utilize deemed

savings for ex ante impacts. Deemed savings generally describe a composite of end-use

operating profiles, and if a program serves only a predominant building type, ex ante

estimates become skewed and unrepresentative of the program population.

• Reference the 2019 Illinois TRM (v7.0) for better estimates of deemed savings for steam trap

replacements.

C&I Retro-Commissioning

• Ensure that peak coincident demand reduction is properly claimed. This is the portion of

demand reduction that occurs during the summer peak period as outlined in the Indiana

TRM (v2.2).

• Partner with out-of-network commissioning providers and engineering firms that service

customers in Indiana to promote the program. Expand outreach from Lockheed Martin Energy’s

trade ally network to out-of-network firms to help spread the information across various service

providers and reach a larger audience. Participants in all other C&I programs said they heard

about their respective program through trade allies and word of mouth. It is likely the Retro-

Commissioning (RCx) program can also benefit from more word-of-mouth referrals.


Program Offerings

NIPSCO’s DSM portfolio consists of 13 customer programs distributed across the Residential and C&I

sectors. NIPSCO administers these programs with the support of a third-party implementer, Lockheed

Martin Energy. The 2018 program year marked the third year of a three-year program cycle. A brief

description of each program’s offering follows:

• Through the Residential HVAC Rebates program, NIPSCO provides incentives to natural gas and

electric residential customers to purchase energy-efficient heating and cooling products. The

program includes energy-efficient measures such as smart thermostats, furnaces, air

conditioners, boilers, and heat pumps.

• Through the Residential Lighting program, NIPSCO provides upstream discounts on LED lamps

and LED lighting fixtures. NIPSCO works with retailers and manufacturers to offer reduced prices

at the point of sale.

• Through the Residential Home Energy Analysis (HEA) program, NIPSCO provides no-cost, in-

home energy assessments to residential customers. During an assessment, an energy assessor

analyzes the efficiency of the heating and cooling systems and insulation levels in the home and

installs energy-saving lighting and water conservation measures. The assessment concludes with

the assessor providing a report of findings and energy-saving recommendations. The primary

focus of the program is to educate customers about energy efficiency in their homes.

• Through the Residential Appliance Recycling program, NIPSCO provides removal and recycling

services to electric customers who reduce energy consumption through recycling unneeded

refrigerators and freezers. Annually, participants may recycle up to two working secondary

refrigerators or freezers, sized 10 to 30 cubic feet, by scheduling a pickup of the units.

• Through the Residential School Education program, NIPSCO works with fifth-grade teachers to

educate students about energy efficiency and how they can make an impact at school and

home. Participating teachers receive classroom curriculum and take-home efficiency kits to

distribute to their students.

• Through the Residential Multi-Family Direct Install (MFDI) program, property owners and

managers of multi-family housing can receive a no-cost property walk-through for residential

units and common spaces, and can receive in-unit energy efficiency measures at no cost as well.

The walk-through results in a report with recommendations for energy-efficient upgrades.

During a follow-up visit, a program-approved contractor will install some or all of the suggested

energy-efficient measures in the residential units.

• Through the Residential Behavioral program, selected customers receive paper and/or

electronic home energy reports that educate them on their energy consumption patterns.

Participants receive a targeted, individualized report that is intended to motivate them to

engage in energy-saving behaviors. The report shows the participant’s monthly energy use and

compares this use to similarly sized homes nearby, and it also provides semi-customized energy-

saving tips. Participants may opt-in to an online portal.


• Through the Residential Income Qualified Weatherization (IQW) program, NIPSCO provides no-

cost, in-home energy assessments to income-qualified residential customers. Program

participants receive a home assessment, where an energy assessor first analyzes the efficiency

of heating and cooling systems and insulation levels in the home. Depending on opportunities in

the home, the assessor then installs energy-saving lighting and water-conservation measures, as

well as duct sealing and air sealing to qualifying homes during the assessment. Homes with

refrigerators 10 years old or older are also eligible to receive a new, ENERGY STAR®-rated

refrigerator, and those with attic insulation levels below R-11 may qualify for attic insulation.

Both of these items are installed after the initial assessment. The assessor also provides a report

of findings and energy-saving recommendations.

• Through the C&I Prescriptive program, NIPSCO provides prescriptive rebates to facilities for the

installation of energy efficiency equipment and system improvements. The program offers

rebates for lighting, pumps and drives, heating, cooling, and refrigeration equipment.

• Through the C&I Custom program, NIPSCO focuses on energy savings opportunities unique to

the commercial participant’s application or process. The program requires individual

engineering analyses to determine savings. This program offers customers incentives based on

the calculated savings for energy savings opportunities outside the traditional rebate program.

• Through the C&I New Construction program, NIPSCO focuses on opportunities for energy

savings in new construction and renovation projects. NIPSCO offers incentives to encourage

building owners, designers and architects to exceed standard building practice. Projects may

also qualify for either prescriptive or custom incentives.

• Through the C&I Small Business Direct Install (SBDI) program, trade allies provide small

business participants energy assessments and higher incentives for specific measures than those

offered through the standard C&I Prescriptive program. The program offers an array of

incentives for refrigeration, lighting, HVAC, and other natural gas–saving measures typically

used in small business operations.

• Through the C&I Retro-Commissioning (RCx) program, NIPSCO offers customers the

opportunity to determine the energy performance of their facilities and optimize the efficiency

of existing operating systems. The program provides incentives to customers for completing a

retro-commissioning project.

Page 31: 2018 DSM Portfolio Evaluation Report - NIPSCO

24

Evaluation Objectives and Methodology

The evaluation team employs consistent methods across programs and from prior evaluation years

whenever possible. The evaluation process can be broken into three key areas of research, which are

summarized below:

• Impact Evaluation. The evaluation team verifies measure installation, calculates evaluated (or

gross) savings, and measures freeridership and spillover to produce net savings impacts. This

research includes conducting engineering desk reviews of project savings calculations,

completing site visits to observe project conditions and measure savings performance, and

surveying participants to understand program influence.

• Process Evaluation. The evaluation team investigates program processes, participation barriers,

and the program experiences of customers and trade allies. This research uses telephone and

online surveys with program actors (trade allies, participants, and other supporting actors), and

interviews with program and implementation staff to better understand program performance.

This research gives stakeholders insight into the aspects of success or potential improvement for

each program, and provides context for impact findings.

• Cost-Effectiveness. The evaluation team conducts a cost-effectiveness analysis (a form of

economic analysis) to compare the relative costs and benefits from NIPSCO’s investment in each

program. In the energy efficiency industry, cost-effectiveness metrics serve as an indicator of

the economic attractiveness of any energy efficiency investment or practice, as compared to the

costs of energy produced and delivered in the absence of such investments.

Research Questions

The evaluation team examined a common set of research questions to guide the evaluation. Impact

activities for most programs included an assessment of these research areas:

• Data quality review

• Installation rates or ISRs

• Measure verification

• Freeridership

• Spillover

• Program cost-effectiveness

Process activities for most programs included an assessment of these research areas:

• Follow up on 2017 evaluation recommendations

• Program design, delivery, and administration

• Communication and coordination between NIPSCO and its implementers

• Marketing strategies

• Program processes (including application processes)


• Drivers of participation and barriers to participation

• Quality control processes

• Future program plans

Impact Evaluation Approach

To determine portfolio impacts, the evaluation team completed the following activities:

• Compared tracking data, program documents, and scorecard data for alignment and accuracy

• Reviewed savings values, calculations, assumptions, and sources

• Collected ISR data for program measures

• Calculated ex post gross savings values for programs and the portfolio

• Estimated freeridership and spillover behavior from participant surveys, site visits, and

secondary sources

• Calculated ex post net savings values for programs and the portfolio

The team employed statistical and engineering-based analysis techniques to achieve these results,

adjusting program-reported gross savings (ex ante) using the information gathered through database

and document reviews, engineering reviews of tracking data and project work papers, Indiana

TRM (v2.2) deemed savings calculation reviews, on-site verification and metering, and regression

analysis.

The evaluation team’s presentation of analysis results follows a progression, with each savings type

corresponding to a specific step in the evaluation process.

The evaluation team defined these key savings terms as follows for the impact evaluation:

• Reported ex ante savings: Annual gross savings for the evaluation period, as reported by NIPSCO

in the 2018 DSM Scorecard.

• Audited savings: Annual gross savings after alignment or reconciliation with the program

tracking data.

• Verified savings: Annual gross savings after alignment with the program tracking data (i.e.,

Audited savings), and adjustments related to ISRs.

• Evaluated ex post savings: Annual gross savings with all previous adjustments (i.e., Verified

savings), and adjusted to include the best available inputs and methodology available at the

time of the evaluation.

• Realization rate (percentage): the percentage of savings the program realized, calculated using

the following equation:

Realization Rate = Ex Post Gross Savings ÷ Ex Ante Gross Savings

• Evaluated net savings: Evaluated ex post savings, adjusted for attribution (i.e., freeridership and

spillover).
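As a sketch of how these definitions chain together, the following Python walks the 2018 HVAC Rebates natural gas figures (therms/yr) reported later in this document through each step; the code is illustrative only, not part of the evaluation itself.

```python
# Illustrative walk-through of the savings progression, using the 2018
# HVAC Rebates natural gas figures (therms/yr) reported in Table 14.
ex_ante = 1_896_351        # reported by NIPSCO in the 2018 DSM Scorecard
audited = 1_896_351        # after reconciliation with program tracking data
isr = 1.00                 # in-service rate applied to audited savings
verified = audited * isr
ex_post_gross = 2_176_218  # after engineering adjustments
freeridership, spillover = 0.35, 0.00  # attribution estimates (Table 15)

realization_rate = ex_post_gross / ex_ante  # ex post gross / ex ante gross
ntg = 1.0 - freeridership + spillover
ex_post_net = ex_post_gross * ntg

print(f"Realization rate: {realization_rate:.0%}")   # 115%
print(f"Ex post net: {ex_post_net:,.0f} therms/yr")  # 1,414,542
```

The printed values reproduce the realization rate and ex post net savings reported for the HVAC Rebates program later in this report.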


Appendix A. Evaluation Definitions further outlines each of the evaluation steps.

Process Evaluation Approach

For the process evaluation, the evaluation team conducted interviews with program and

implementation staff to document how each program worked, identify and understand the important

influences on the program’s operations, and gain insight into factors influencing the program’s

performance. For some programs, the evaluation team also conducted surveys and interviews with

program participants and participating trade allies to understand their perspectives and experiences

with a given program.

Research Activities

The evaluation team designed the 2018 research activities to emphasize programs that were new in

2018 or underwent program design changes in 2018. Our comprehensive approach focused on

investigating a variety of process and impact questions for each program. Table 10 and Table 11 show

the impact and process research activities conducted, followed by a description of each activity and how

it was applied.

Table 10. 2018 Impact Evaluation Activities

Program | Database Review | Engineering Analysis | On-Site Verification | NTG Estimation | ISRs | Billing Analysis

Residential

HVAC Rebates ✓ ✓ ✓

Lighting ✓ ✓

Home Energy Analysis ✓ ✓ ✓ ✓

Appliance Recycling ✓ ✓ ✓ ✓

School Education ✓ ✓ ✓ ✓

Multi-Family Direct Install ✓ ✓ ✓

Behavioral ✓ ✓

Income Qualified Weatherization ✓ ✓

C&I

Prescriptive ✓ ✓ ✓ ✓ ✓

Custom ✓ ✓ ✓ ✓ ✓

New Construction ✓ ✓ ✓ ✓ ✓

Small Business Direct Install ✓ ✓ ✓ ✓ ✓

Retro-Commissioning ✓ ✓ ✓ ✓


Table 11. 2018 Process Evaluation Activities

Program | Staff Interviews | Materials Review | Participant Surveys and Interviews | Trade Ally Surveys and Interviews

Residential

HVAC Rebates ✓ ✓ ✓

Lighting ✓ ✓

Home Energy Analysis ✓ ✓ ✓

Appliance Recycling ✓ ✓ ✓

School Education ✓ ✓ ✓

Multi-Family Direct Install ✓ ✓

Behavioral ✓ ✓ ✓

Income Qualified Weatherization ✓ ✓ ✓

C&I

Prescriptive ✓ ✓ ✓

Custom ✓ ✓ ✓ ✓

New Construction ✓ ✓ ✓ ✓

Small Business Direct Install ✓ ✓ ✓

Retro-Commissioning ✓ ✓ ✓ ✓

Database and Document Review

The evaluation team reviewed NIPSCO's program tracking databases, scorecards, and other

documentation to assess the quality of information and to identify potential anomalous entries, outliers,

duplicates, and missing values. This included reviewing all data fields recommended in the Indiana

TRM (v2.2), along with those necessary to calculate deemed savings. The evaluation team conducted a

database and document review for all programs, including these specific activities:

• Verified that all customer and vendor information needed to conduct primary research was

available and complete

• Confirmed that all measure-specific data included the necessary details in the proper formats to

enable impact evaluation

• Confirmed that all program costs and other tracking information required to calculate impacts

and assess resource allocation were available and complete

• Assessed new marketing, outreach materials, and other related activities

For measures not included in the Indiana TRM (v2.2), the evaluation team reviewed project

documentation (e.g., audit reports and savings calculation work papers) from a sample of energy

efficiency project sites. The evaluation team closely reviewed the calculation procedures and savings

estimate documentation. The evaluation team also verified the appropriateness of NIPSCO’s analyses

for calculating savings as well as the assumptions used for participating facilities’ structural attributes

and operational characteristics.
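A minimal sketch of the duplicate and missing-value checks described above, using hypothetical tracking-data records and field names (the record structure is illustrative, not NIPSCO's actual database schema):

```python
from collections import Counter

# Hypothetical tracking-data records; field names are illustrative only.
records = [
    {"rebate_id": "R-001", "measure": "Furnace w/ ECM", "afue": 0.96},
    {"rebate_id": "R-002", "measure": "Smart Thermostat", "afue": None},
    {"rebate_id": "R-001", "measure": "Furnace w/ ECM", "afue": 0.96},
]

# Flag rebate IDs that appear more than once (potential duplicate entries).
dupes = [rid for rid, n in Counter(r["rebate_id"] for r in records).items()
         if n > 1]

# Flag records missing an input required to calculate deemed savings.
missing = [r["rebate_id"] for r in records if r["afue"] is None]

print(dupes)    # ['R-001']
print(missing)  # ['R-002']
```

Flagged records would then be reviewed against project documentation before savings are recalculated.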


Engineering Review and Deemed Savings Analysis

The evaluation team compared the program tracking data with the Indiana TRM (v2.2) savings

algorithms to verify the inputs and correct use of the algorithms. As the first step of this review, the

evaluation team identified all assumed inputs for each algorithm (e.g., average water main temperature

by weather zone) and installation-specific values (e.g., R-values, change in cubic feet per minute,

wattage, thickness of tank wrap). Next, the evaluation team confirmed that the program tracking data

contained all installation-specific inputs required to successfully calculate savings for each measure. The

evaluation team also identified missing required inputs and worked with NIPSCO to determine whether

these inputs existed or whether the implementer made other assumptions regarding these inputs. After

confirming the data inputs, the evaluation team replicated measure-level savings and produced a

realization rate, which is the ratio of evaluated ex post gross savings to claimed (ex ante) savings.

Verification and Metering Site Visits

C&I Prescriptive, Custom, New Construction, Small Business Direct Install, Retro-Commissioning

For the C&I programs, the evaluation team focused on-site activities on verifying and measuring

program measures installed in C&I buildings. The evaluation team performed short-term metering at a

sample of facilities to determine usage profiles associated with large or complex measures.

The evaluation team also gathered data for the impact analyses during the site visits. The total number

of site visits is outlined in the Sampling section below (Table 13). The evaluation team developed a

stratified random sample for on-site measurement and verification (M&V), targeting ±10% precision at

90% confidence (90/10) for each C&I program. The site visits, along with the engineering desk reviews,

helped the evaluation team to meet the 90/10 precision targets.

NIPSCO provided contact information for project decision-makers and implementation contractors, and

the evaluation team contacted customers at selected sites to schedule visits in advance. The evaluation

team conducted these three primary tasks during the M&V site visits:

1. Verified that all measures were installed correctly and functioning properly, and confirmed the

operational characteristics of the installed equipment such as temperature, set-points, and

annual operating hours

2. Collected physical data such as cooling capacity or horsepower, and analyzed the energy savings

realized from the installed improvements and measures

3. Interviewed facility personnel to obtain any additional information about the installed system to

supplement data from other sources


Regression Analysis

Appliance Recycling, Behavioral, Home Energy Analysis

For the Appliance Recycling program, the evaluation team followed the U.S. Department of Energy’s

(DOE’s) Uniform Methods Project (UMP) evaluation protocol for refrigerator recycling2 and used a

multivariate regression model to estimate the gross unit energy consumption for refrigerators and

freezers recycled though the program. The evaluation team estimated model coefficients using an

aggregated in situ metering dataset, composed of over 591 appliances metered as part of five

evaluations conducted in California, Wisconsin, and Michigan between 2009 and 2013.3 Collectively,

these evaluations offered a wide distribution of appliance ages, sizes, configurations, usage scenarios

(primary or secondary), and climate conditions. The diversity of the dataset provided an effective

secondary data source for estimating energy savings since this study did not include metering data

specific to Indiana. The evaluation team used program tracking data, provided by NIPSCO, in its

regression modeling. The evaluation team multiplied the coefficient set, created through the model, by

attributes specific to NIPSCO’s tracking data to predict energy consumption for each unit recycled

through the program.
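The final step, multiplying the estimated coefficients by each unit's tracking-data attributes, can be sketched as follows. The coefficient values and attribute names here are hypothetical stand-ins, not the actual UMP regression model estimated from the metering dataset.

```python
# Hypothetical coefficients; NOT the actual UMP model, just the shape of
# the coefficient-times-attribute calculation described above.
coeffs = {"intercept": 700.0, "per_year_age": 15.0,
          "per_cuft_size": 20.0, "primary_use": -120.0}

def predict_uec(unit):
    """Predict annual kWh consumption for one recycled unit."""
    return (coeffs["intercept"]
            + coeffs["per_year_age"] * unit["age_years"]
            + coeffs["per_cuft_size"] * unit["size_cuft"]
            + coeffs["primary_use"] * unit["is_primary"])

units = [  # attributes drawn from (hypothetical) program tracking data
    {"age_years": 18, "size_cuft": 20, "is_primary": 0},
    {"age_years": 12, "size_cuft": 16, "is_primary": 1},
]
gross_kwh = sum(predict_uec(u) for u in units)
print(gross_kwh)  # 2450.0
```

Summing the per-unit predictions over all recycled units yields the program's gross unit energy consumption.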

For the Behavioral and HEA programs, the evaluation team performed consumption data regression

analysis (or billing analysis). These industry-standard techniques use interval energy consumption data

(typically from monthly energy bills) to forecast participant consumption in the absence of the program

and to calculate energy savings as the difference between modeled and actual consumption.
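A minimal sketch of this counterfactual approach, assuming monthly usage driven by heating degree days (the data and single-variable model form are illustrative and far simpler than the evaluation team's actual specification):

```python
import numpy as np

# Fit pre-period usage against heating degree days (HDD), forecast the
# post-period counterfactual, and count savings as forecast minus actual.
pre_hdd = np.array([100.0, 300, 600, 800, 500, 200])
pre_usage = 50 + 2.0 * pre_hdd  # pre-period bills (therms), exactly linear here

X = np.column_stack([np.ones_like(pre_hdd), pre_hdd])
beta, *_ = np.linalg.lstsq(X, pre_usage, rcond=None)  # [intercept, slope]

post_hdd = np.array([150.0, 400, 700, 600, 250])
post_actual = (50 + 2.0 * post_hdd) - 20  # program saves 20 therms/month

counterfactual = beta[0] + beta[1] * post_hdd
savings = float(np.sum(counterfactual - post_actual))
print(round(savings, 3))  # 100.0 across the five post-period months
```

In practice, such models also control for weather normalization, seasonality, and (for the Behavioral program) a randomized control group.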

Staff Interviews

The evaluation team interviewed NIPSCO program managers and implementation staff to understand

how each program was designed and delivered, what worked well, and what could be improved. The

interviews covered wide-ranging topics such as program design and administration, communication and

data tracking processes, marketing strategies, trade ally and participant interactions, and challenges and

successes.

Participant and Trade Ally Surveys and Interviews

To support the impact and process evaluations, the evaluation team conducted surveys. The evaluation

team designed the survey to collect data about market awareness of NIPSCO’s energy-saving programs,

product installation rates, customer behavior and equipment use, participant satisfaction with program

components, and barriers to participation. The surveys informed process and impact research questions,

such as freeridership and spillover.

2 U.S. Department of Energy. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings

for Specific Measures. April 2013. https://energy.gov/sites/prod/files/2013/11/f5/53827-7.pdf

3 Cadmus conducted these evaluations for Southern California Edison, Pacific Gas & Electric, San Diego Gas &

Electric, DTE Energy, Focus on Energy, and Consumers Energy.


Participants and trade allies did not receive an incentive for completing the survey or interview, except

for the Behavioral program. In past evaluations, the evaluation team found that behavioral programs

tend to have the lowest response rates, likely because the program design gives customers

informational feedback rather than a rebate check. The evaluation team has also observed that without

an incentive, surveys from a large population can sometimes elicit extreme responses (very negative or

very positive). Therefore, the evaluation team provided each Behavioral participant respondent who

qualified and completed the survey with a $10 online gift card to boost the response rate and reduce

response bias.

Sampling

The evaluation team used a sampling approach to develop sample frames for participant,

nonparticipant, and trade ally surveys, and to determine the number of site visits needed for field work.

In each case, the evaluation team designed the survey samples to achieve ±10% precision at 90%

confidence for NTG metrics (for select programs), accounting for the total number of respondents. Table

12 shows the population and sample sizes, as well as the number of completes for surveys.

Table 12. 2018 Survey Sampling and Number of Completes by Program

Program | Respondent Group | Population (Count of Unique Participants) | Included in Sample Frame | Target Completes | Achieved Completes
Residential
HVAC Rebates | Participant | 9,660 | 9,033 | 67 | 67
Home Energy Analysis | Participant | 840 | 678 | 68 | 67
Appliance Recycling | Participant | 2,852 | 1,916 | 140 | 140
School Education | Parent | 11,606 | 834 | 67 | 67
Behavioral | Participant | 368,724 | 175,009 | 136 | 136
Income Qualified Weatherization | Participant | 1,131 | 704 | 68 | 69
C&I¹
Prescriptive | Participant | 722 | 487 | 68 | 68
Custom | Participant | 163 | 159 | 66 | 40
Custom | Trade Ally | 104 | 82 | 10 | 16
New Construction | Participant | 46 | 37 | 28 | 9
New Construction | Trade Ally | 32 | 28 | 5 | 5
Small Business Direct Install | Participant | 343 | 310 | 67 | 53
Retro-Commissioning | Participant | 1 | 1 | 1 | 1
Retro-Commissioning | Trade Ally | 1 | 1 | 1 | 1
¹ C&I participant population based on count of unique participants.

The evaluation team employed a slightly different method in choosing samples for the impact evaluation

sample. For each C&I program, the evaluation team selected samples using a systematic approach to

probability proportional to size (PPS) sampling. The evaluation team evaluated the measures in the

samples to calculate ISRs and ex post realization rates for each program. The PPS method is particularly

useful when estimating population totals based on sampled measures. It weighted population measures

by ex ante energy savings, and in this case, since the populations included both natural gas and electric

measures, the evaluation team converted all energy savings into units of MMBtu to assign comparable


weights to each measure. Weighting the population resulted in project measures with higher savings

having a higher probability of being selected into the evaluation sample. PPS sampling helps realization

rate precision since ex post gross energy savings are correlated with reported savings, and the approach

increases the proportion of savings in the sample. Table 13 shows the impact evaluation sample for the

C&I programs.

Table 13. 2018 Impact Evaluation Sample

Program | Population (Measures)ᵃ | Target Completes | Achieved Completes
C&I Prescriptive | 4,055 | 16 Electric, 59 Natural Gas | 26 Electric, 64 Natural Gas
C&I Custom | 579 | 40 Electric, 33 Natural Gas | 41 Electric, 35 Natural Gas
C&I New Construction | 155 | 36 Electric, 24 Natural Gas | 37 Electric, 24 Natural Gas
C&I Small Business Direct Install | 1,182 | 34 Electric, 15 Natural Gas | 39 Electric, 24 Natural Gas
C&I Retro-Commissioning | 1 | 1 Electric, 1 Natural Gas | 1 Electric, 0 Natural Gas
ᵃ Population is based on queried data extracts from the implementer's database, LM Captures, and may vary slightly with NIPSCO's scorecard data source.
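The weighting and systematic selection steps described above can be sketched as follows, using the standard unit conversions (1 kWh = 3,412 Btu = 0.003412 MMBtu; 1 therm = 100,000 Btu = 0.1 MMBtu); the five measure savings values are hypothetical.

```python
import random

KWH_TO_MMBTU = 0.003412  # 1 kWh = 3,412 Btu
THERM_TO_MMBTU = 0.1     # 1 therm = 100,000 Btu

def mmbtu(kwh=0.0, therms=0.0):
    """Convert electric and gas savings to a common MMBtu weight."""
    return kwh * KWH_TO_MMBTU + therms * THERM_TO_MMBTU

def systematic_pps(weights, n, rng=None):
    """Systematic PPS draw: selection probability proportional to weight."""
    rng = rng or random.Random(42)  # fixed seed for a reproducible sketch
    step = sum(weights) / n
    start = rng.uniform(0, step)
    picks, cum, i = [], 0.0, 0
    for target in (start + k * step for k in range(n)):
        while cum + weights[i] <= target:  # advance to the measure whose
            cum += weights[i]              # cumulative span holds this target
            i += 1
        picks.append(i)
    return picks

# Hypothetical ex ante savings for five measures, weighted in MMBtu.
weights = [mmbtu(kwh=50_000), mmbtu(therms=8_000), mmbtu(kwh=2_000),
           mmbtu(therms=300), mmbtu(kwh=400_000)]
sample = systematic_pps(weights, n=2)  # the largest measure is always drawn
```

Because the last measure's weight exceeds the sampling interval, it is selected with certainty, illustrating how high-savings projects dominate the evaluation sample.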

NTG Methods

An NTG ratio is made of two components: freeridership and spillover. Freeridership is the percentage of

savings that would have occurred in the absence of the program because participants would have

behaved the same (purchasing the same measures) without the influence of the program. Spillover

occurs when customers purchase energy-efficient measures or adopt energy-efficient building practices

without participating in a utility-sponsored program. The evaluation team used the following equation

to calculate NTG for each program:

Program NTG Ratio = 100% − Freeridership + Spillover

The evaluation team employed the standard self-report method for all programs where the evaluation

team estimated NTG. However, the evaluation team combined two methods—the standard self-report

method and the intention/influence method—to calculate NTG for the HVAC Rebates program, and all

C&I programs (same methods as the 2017 and 2016 evaluations). The evaluation team computed a

savings-weighted average of the NTG derived from each method to apply the overall NTG for each

program. Only one of the six total C&I RCx program participants (representing seven measures)

completed the program participant survey, and the evaluation team applied the NTG ratio estimated for

the participant to their ex post gross evaluated program savings.

Other programs (Appliance Recycling, HEA, and School Education) called for an approach that first

estimated measure-level freeridership as an input to measure-level impact calculations and then rolled

up the measure-level results to a program NTG ratio.

Self-Report Method

To determine a freeridership score, the evaluation team relied on self-report participant surveys, in

which the evaluation team asked participants a series of questions about what their actions would have

been in the absence of the program. The evaluation team used each unique set of responses to calculate


a freeridership score for that individual. The evaluation team then aggregated the scores and

determined a total freeridership score for each program. To facilitate comparisons over program years,

the evaluation team used NTG question batteries consistent with those used in the 2017 and 2016

evaluations.

Spillover is measured by asking participants who purchased a particular measure if, as a result of the

program, they decided to install another energy-efficient measure or undertake some other activity to

improve energy efficiency. The evaluation team assessed spillover through self-report surveys, in which

interviewers read a list of energy-efficient products to respondents and asked if they had installed any of

the products in their home or business since participating in the program. If respondents said they had

made energy-efficient improvements or purchased products, interviewers asked how influential the

program was on their purchasing decisions.

The evaluation team estimated spillover savings for measures which participants said the program was

very influential in their decision by using specific information about participants, determined through

the evaluation, and using the Indiana TRM (v2.2) as a baseline reference. The sum of the estimated

spillover savings, divided by savings achieved through the program for each relevant measure, yielded

spillover savings as a percentage of total savings, which the evaluation team then extrapolated to the

population of program participants.
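The spillover roll-up described above reduces to a simple ratio, combined with freeridership in the program NTG equation. A sketch with hypothetical respondent data follows (the 35% freeridership figure is borrowed from the HVAC Rebates results reported later):

```python
# Hypothetical survey respondents: program savings and the spillover
# savings attributed to the program's influence (kWh/yr).
respondent_program_kwh = [1_200.0, 800.0, 2_000.0]
respondent_spillover_kwh = [0.0, 100.0, 100.0]  # "very influential" only

# Program spillover percentage: respondents' spillover savings divided by
# the gross savings those same respondents achieved through the program.
spillover_pct = sum(respondent_spillover_kwh) / sum(respondent_program_kwh)

freeridership = 0.35  # HVAC Rebates estimate (Table 15), for context
ntg = 1.0 - freeridership + spillover_pct  # NTG = 100% - FR + SO
print(spillover_pct, round(ntg, 2))  # 0.05 0.7
```

The resulting spillover percentage is then extrapolated to the full participant population before the NTG ratio is applied to ex post gross savings.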

Intention/Influence Method for Self-Reports

For the intention/influence method, the evaluation team assessed freeridership in two steps. Although

the questions were similar to those used in the self-report method, the intention/influence questions

explored the participant’s intention and the program’s influence in more detail. The evaluation team first

scored these two parts of the survey separately, then combined them with equal weight to determine

one freeridership score for each survey respondent. Spillover under this method focused on the

program’s influence on a participant’s decision to invest in additional energy-efficient measures.

The evaluation team derived the participants’ intention freeridership score by translating their

responses into a matrix value and applying a consistent, rules-based calculation to obtain the final

freeridership score.

The evaluation team used the following process for determining the intention freeridership score:

• Customers were categorized as 0% freeriders if they were not aware of a program (i.e., efficient)

measure and had no plans to install that measure prior to hearing about the program.

Customers also were categorized as 0% freeriders if they knew about the program but had no

plans to install an efficient, program-promoted measure.

• Customers were categorized as 100% freeriders if they would have installed the measure in the

program’s absence or if they had already installed the measure before learning about the

program.

• Customers received a partial freeridership score if they planned to install the measure and the

program altered their decision. This effect may have included the installation’s timing, the


number of measures installed, or the efficiency levels of measures installed. For customers who

were highly likely to install a measure, and for whom the program had less effect on their

decisions, the evaluation team assigned a higher intention freeridership score.

The evaluation team assessed influence freeridership by asking participants how important various

program elements were in their purchase decision-making process. The maximum rating of any program

factor determined a participant’s influence freeridership score (0% to 100% score range using a 1 to 4

scale).

The evaluation team calculated the arithmetic mean of the intention and influence freeridership

components to estimate total freeridership for programs.

Total Freeridership = (Intention FR Score + Influence FR Score) ÷ 2

The influence and intention scores contribute equally to the total freeridership score. The higher the

total freeridership score, the greater the deduction of savings from the gross savings estimates.
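A sketch of how the two components might combine. The linear mapping from the 1-to-4 influence scale to a 0%-100% score is an assumption for illustration; the actual rules-based matrix is documented in Appendix B.

```python
# ASSUMPTION: the 1-4 influence scale maps linearly to a 0%-100%
# freeridership score (rating 4 = program very important, so freeridership
# is lowest). The evaluation's actual scoring matrix is rules-based.
def influence_freeridership(max_rating):
    return (4 - max_rating) / 3.0  # 4 -> 0.0, 1 -> 1.0

def total_freeridership(intention_fr, max_rating):
    """Equal-weight mean of the intention and influence components."""
    return (intention_fr + influence_freeridership(max_rating)) / 2.0

# A participant who fully planned the purchase anyway (intention FR = 1.0)
# but rated a program factor as very important (max rating of 4):
print(total_freeridership(intention_fr=1.0, max_rating=4))  # 0.5
```

Equal weighting means a participant who planned the purchase but was strongly influenced by the program lands at 50% freeridership rather than 100%.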

As detailed in Appendix B. Self-Report Net-to-Gross Evaluation Methodology, the evaluation team

estimated spillover measure savings using specific information about participants, determined through

the evaluation, and using the Indiana TRM (v2.2) as a baseline reference. The evaluation team estimated

the percentage of program spillover by dividing the sum of additional spillover savings (as reported by

survey respondents) by the total gross savings achieved by the respondents.

Using the calculated freeridership and spillover values, the evaluation team applied the overall NTG ratio

to the ex post gross savings to identify the ex post net savings.

Deemed Savings Method

The evaluation team applied a deemed NTG ratio in two types of situations. First, the evaluation team

applied an NTG of 100% for programs targeting low-income customers. Low-income programs tend to

focus on direct installation of measures and are based on the hypothesis that the customer would not

have installed the energy-efficient product without the assistance of the program. For the Income

Qualified Weatherization program, the evaluation team applied an NTG of 100%. Second, performing an

NTG analysis for the Residential Multifamily Direct Install Program was beyond the scope of work, as this

program began mid-year and participant surveys were not undertaken. As it is a direct-install program,

with no cost to participants, the evaluation team applied a 100% NTG value for the 2018 program

year.


Residential HVAC Rebates Program

In 2018, NIPSCO offered the HVAC Rebates program, marketed to natural gas and electric customers as

the Energy Efficiency Rebate program. Through the program, NIPSCO encouraged customers to install

energy-efficient equipment and, ultimately, to reduce energy consumption. The program was available

to all residential customers with an active NIPSCO account.

Program rebates ranged from $45 to $250, covering a range of HVAC equipment from smart

thermostats to boilers and furnaces. Rebate levels varied by equipment efficiency level and measure

type.

Program Performance

Table 14 presents a savings summary for the 2018 program, including program savings goals. In terms of

ex post gross savings, the program achieved 98% of its electric energy savings, 116% of its peak demand

reduction, and 105% of its natural gas energy savings goals.

Table 14. 2018 HVAC Rebates Program Savings Summary

Metric | Gross Savings Goal | Ex Ante | Audited | Verified | Ex Post Gross | Ex Post Net | Gross Goal Achievement
Electric Energy Savings (kWh/yr) | 1,657,129 | 1,413,559 | 1,413,559 | 1,413,559 | 1,622,426 | 1,054,577 | 98%
Peak Demand Reduction (kW) | 942 | 643 | 643 | 643 | 1,088 | 707 | 116%
Natural Gas Energy Savings (therms/yr) | 2,078,492 | 1,896,351 | 1,896,351 | 1,896,351 | 2,176,218 | 1,414,542 | 105%

Ex post gross energy savings were 15% higher than ex ante savings. This difference was driven largely by

air conditioner measures: the ex ante savings used a single deemed value for each unit, which appears

to have assumed the 15 SEER efficiency called out in the measure name, while installed units were

generally rated above 15 SEER. Ex post gross demand reduction was 69% higher than ex ante

savings, also driven by high air conditioner SEER but tempered by a decrease in demand reduction for

smart thermostats. Ex post gross therms savings were 15% higher than ex ante savings, driven by

furnace measures and probable higher actual AFUE values than ex ante calculations assumed.

Table 15 presents the 2018 program realization rates and NTG percentage.

Table 15. 2018 HVAC Rebates Program Adjustment Factors

Metric | Realization Rateᵃ | Freeridership | Spillover | NTGᵇ
Electric Energy Savings (kWh/yr) | 115% | 35% | 0% | 65%
Peak Demand Reduction (kW) | 169% | 35% | 0% | 65%
Natural Gas Energy Savings (therms/yr) | 115% | 35% | 0% | 65%
ᵃ Realization rate is defined as ex post gross savings divided by ex ante savings.
ᵇ NTG is defined as ex post net savings divided by ex post gross savings.


Table 16 presents the 2018 program budget and expenditures. The program spent 68% of its electric

budget and 88% of its natural gas budget.

Table 16. 2018 HVAC Rebates Program Expenditures

Fuel | Program Budget | Program Expenditures | Budget Spent (%)
Electric | $500,164 | $341,842 | 68%
Natural Gas | $2,539,083 | $2,233,956 | 88%

Research Questions

The evaluation team conducted qualitative and quantitative research activities to answer several key

research questions for the HVAC Rebates program:

• What affected customer decision making?

• How effective were program marketing efforts in driving participation?

• How satisfied were customers with the program?

• How satisfied were customers with their contractor (if relevant)?

• How familiar were participants with other NIPSCO energy efficiency programs?

• Have customers been maximizing benefits from installed measures?

• What were the program freeridership and spillover estimates?

Impact Evaluation

In 2018, NIPSCO claimed HVAC Rebates program savings for six measure types:

• Furnaces with ECMs (electronically commutated motors)

• Air conditioners

• Smart and Wi-Fi thermostats

• Air source heat pumps (ASHPs)

• Boilers

• Furnaces4

4 These were furnaces that did not have ECMs, and furnaces with ECMs installed for natural gas–only

participants.


For all measure types, the evaluation team compared its engineering calculations to NIPSCO’s ex ante

savings, basing its savings methodologies and inputs for each measure on several sources:

• Standard engineering practices

• 2015 Indiana TRM (v2.2)5

• NIPSCO’s program tracking database

• Indiana Residential Baseline Study6

In 2018, participants received 12,505 units7 through the HVAC Rebates program. Table 17 summarizes

audited savings for the six measure categories. Furnaces with ECMs accounted for about two-thirds of

program audited electric energy savings, with ASHPs making up less than 1% of audited electric energy

savings. Audited demand reduction derives almost exclusively, at 99.8%, from air conditioners and smart

thermostats. Furnaces (both categories) and thermostats make up the vast majority of audited natural

gas savings, with boiler measures comprising only 0.7%.

It should be noted that the tracking data contained two general categories of furnace measures: Furnace

and Furnace with ECM. Generally, Furnace measures consisted of a mix of furnace models with and

without ECMs. However, all units with ECMs from the “Furnace” category were for participants who

were not electric customers. Therefore, these units did not claim or receive electric energy savings or

peak demand reduction.

Table 17. 2018 HVAC Rebates Program Audited Savings Summary by Measure Category

Measure Category | Audited Electric Energy Savings (kWh/yr) | Share | Audited Peak Demand Reduction (kW) | Share | Audited Natural Gas Energy Savings (Therms/yr) | Share
Furnaces w/ ECMs | 954,500 | 67.5% | - | 0.0% | 399,809 | 21.1%
Air Conditioners | 138,474 | 9.8% | 258.0 | 40.2% | - | 0.0%
Smart Thermostats | 312,475 | 22.1% | 382.8 | 59.6% | 523,434 | 27.6%
ASHPs | 8,110 | 0.6% | 1.8 | 0.3% | - | 0.0%
Furnaces | - | 0.0% | - | 0.0% | 960,729 | 50.7%
Boilers | - | 0.0% | - | 0.0% | 12,379 | 0.7%
Total | 1,413,559 | 100% | 642.6 | 100% | 1,896,351 | 100%

Note: Totals may not sum properly due to rounding.

5 Cadmus. Indiana Technical Reference Manual Version 2.2. July 28, 2015.

6 Cadmus. 2016. 2015 Demand Side Management Programs Evaluation Report. Prepared for NIPSCO.

7 This count reflects the number of physical units delivered, with individual dual-fuel units (such as a furnace

with ECM coupled with a smart thermostat) counting as one unit delivered. Therefore, these values do not

align with scorecard values, which counted dual-fuel units as two units.


Audited and Verified Savings

The following sections describe how the evaluation team developed audited and verified quantities.

Audited Quantity

To develop an audited measure quantity, the evaluation team performed two activities:

• Checked for duplicate entries in the HVAC Rebates program database. The evaluation team found no duplicate entries; however, many residences had multiple furnaces or air conditioners installed through the program, with a separate serial number recorded for each piece of equipment in the database. Internet research indicated that many of these residences were large.

• Reviewed individual measure parameters (such as efficiencies and R-values) to ensure the

correct placement of all measures in the appropriate measure efficiency categories.

Based on this review, the evaluation team found that audited quantities equaled ex ante quantities.

Verified Quantity

The evaluation team applied installation rates to the audited measure quantity to arrive at the verified

quantity. The evaluation team did not ask installation verification questions in the 2018 participant

survey, but did gather this data in the 2015 participant survey. That 2015 survey confirmed that

participants received all measures and that all measures remained installed. As a result, the ISR is 100%

for all program measures, as shown in Table 18.

Table 18. 2018 HVAC Rebates Program In-Service Rates

Measure Category   | In-Service Rate
Furnaces with ECMs | 100%
Air Conditioners   | 100%
Smart Thermostats  | 100%
ASHPs              | 100%
Furnaces           | 100%
Boilers            | 100%

Source: 2015 Participant Survey

Table 19 shows audited and verified quantities for the six measure categories.


Table 19. 2018 HVAC Rebates Program Audited and Verified Quantities

Measure Category  | Unit of Measure | Audited Quantity (Units)a | Installation Rate | Verified Quantity (Units)
Furnaces w/ ECMs  | Furnace         | 2,300  | 100% | 2,300
Air Conditioners  | Air Conditioner | 794    | 100% | 794
Smart Thermostats | Thermostat      | 3,819  | 100% | 3,819
ASHPs             | Heat Pump       | 7      | 100% | 7
Furnaces          | Furnace         | 5,527  | 100% | 5,527
Boilers           | Boiler          | 58     | 100% | 58
Total             | -               | 12,505 | 100% | 12,505

Note: Totals may not sum due to rounding.
a These quantities reflect physical unit counts and therefore do not match the scorecard, which counts each fuel type for dual-fuel measures.

Ex Post Gross Savings

The evaluation team referred to the Indiana TRM (v2.2) to calculate ex post electric and natural gas

energy savings and demand reduction for the program measures. Where data was not available in the

Indiana TRM (v2.2), the evaluation team employed data from the 2012 Residential Indiana Baseline

Study. In addition, the evaluation team employed measure characteristics provided in the database for

variables such as capacities, efficiencies, HVAC equipment type and model, and project location. Finally,

the evaluation team acquired latitude and longitude for each customer’s address in the database, then

matched each of these to the closest city from the Indiana TRM (v2.2) to assign heating and cooling

hours.

A description follows of the evaluation methodologies used and the results for each of the six measure

categories.

Furnaces with ECMs

Per the Indiana TRM (v2.2), the evaluation team used the following natural gas savings algorithm for

furnaces with ECMs:

Δtherms = CAP × EFLH_H × (AFUE_EE / AFUE_BASE − 1) × 0.00001

Where:

CAP = Capacity of the furnace in Btu/h
EFLH_H = Effective full-load heating hours
AFUE_EE = Efficiency of the installed furnace
AFUE_BASE = Efficiency of the baseline furnace
0.00001 = Factor to convert from Btu to therms (1 therm = 100,000 Btu)

The evaluation team obtained CAP and AFUE_EE from the ex ante data, EFLH_H from the Indiana TRM (v2.2) based on location, and assigned an AFUE_BASE of 80% per the Indiana TRM (v2.2). The ex ante data


assigned deemed savings of 173.83 therms to each unit, a less detailed approach than the evaluation team's. The overall natural gas realization rate for this measure category is 119%; the discrepancy reflects differences between the actual capacities and AFUEs used by the evaluation team and the average values assumed in the ex ante data.
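The algorithm above can be sketched as follows; the 80,000 Btu/h capacity, 96% AFUE, and 1,100 heating hours are illustrative assumptions, not actual program values.

```python
# Sketch of the Indiana TRM (v2.2) natural gas savings algorithm for
# furnaces with ECMs. All inputs below are illustrative assumptions.
def furnace_therm_savings(cap_btuh, eflh_h, afue_ee, afue_base=0.80):
    # delta_therms = CAP x EFLH_H x (AFUE_EE / AFUE_BASE - 1) x 0.00001
    return cap_btuh * eflh_h * (afue_ee / afue_base - 1) * 0.00001

# Example: 80,000 Btu/h furnace at 96% AFUE with 1,100 assumed heating hours.
savings = furnace_therm_savings(80_000, 1_100, 0.96)
print(round(savings, 1))  # 176.0
```

With these placeholder inputs the result lands near the 173.83 therm deemed value, which illustrates why actual capacities and AFUEs move the realization rate away from 100%.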

Electric savings for this measure category come from installing the ECMs. Note that this measure

category includes only furnaces with ECMs, and that furnaces without ECMs are a separate category

discussed below. The Indiana TRM (v2.2), following the 2015 Wisconsin TRM,8 assigns a deemed electric

energy savings value of 415 kWh, which is a composite of savings from motor consumption reduction in

heating mode and circulation mode and of the overall reduction in energy consumed while in cooling

mode. The cooling mode energy reduction comes from a slight increase in SEER for HVAC systems with

air conditioners installed as a result of the ECM installation: it incorporates an estimate of the number of

households that have a central air conditioner installed (92.5%). The ex ante data assigns a deemed

savings value of 415 kWh (Indiana TRM v2.2) for all units, and the energy realization rate for this

measure category is 100%.

The Indiana TRM (v2.2) does not claim any summer peak demand reduction for ECMs. However, the

evaluation team did credit savings for summer peak demand reduction since reduced consumption

during summer months will reduce fan use. The evaluation team followed the Indiana TRM (v1.0) in

applying a deemed demand reduction of 0.073 kW, coupled with the 0.88 cooling CF of the Indiana

TRM (v2.2). The evaluation team also incorporated the percentage of households that have a central air

conditioner installed (92.5%), according to the 2015 Wisconsin TRM. The final deemed ex post demand

reduction for ECMs is 0.059 kW. The ex ante data assigns no demand reduction for ECMs, so the overall demand realization rate for this measure category is undefined.
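The ex post ECM demand value described above is the product of the three cited factors:

```python
# Reproducing the ex post ECM demand reduction: deemed 0.073 kW
# (Indiana TRM v1.0) x 0.88 cooling CF (Indiana TRM v2.2)
# x 92.5% central AC saturation (2015 Wisconsin TRM).
deemed_kw = 0.073
coincidence_factor = 0.88
ac_saturation = 0.925

ex_post_kw = deemed_kw * coincidence_factor * ac_saturation
print(round(ex_post_kw, 3))  # 0.059
```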

Air Conditioners

The evaluation team used the following equation from the Indiana TRM (v2.2) to calculate energy

savings from the SEER upgrade for air conditioners:

ΔkWh = (CAP / 1,000) × EFLH_C × (0.23 / SEER_CODE + 0.77 / SEER_STOCK − 1 / SEER_EE) + ΔkWh_HEAT + ΔkWh_CIRC

Where:

CAP = Total cooling capacity in Btu/h
EFLH_C = Effective full-load cooling hours
SEER_CODE = Baseline SEER value for time-of-sale replacements
SEER_STOCK = Baseline SEER value for early replacements
SEER_EE = Installed SEER value

ΔkWh_HEAT = Electric energy savings from an ECM serving a furnace in heating mode
ΔkWh_CIRC = Electric energy savings from an ECM serving a furnace in circulation mode

8 Cadmus. October 22, 2015. Wisconsin Focus on Energy Technical Reference Manual. https://focusonenergy.com/sites/default/files/Wisconsin%20Focus%20on%20Energy%20Technical%20Reference%20Manual%20October%202015.pdf

The evaluation team obtained CAP and SEER_EE from the ex ante data, and EFLH_C from the Indiana TRM (v2.2) based on project location. The 2018 participant survey determined that 23% of participants replaced broken units and 77% replaced working units. Based on these percentages and following the Indiana TRM (v2.2) practices for time-of-sale and early-replacement air conditioners, the evaluation team produced a weighted baseline SEER that blends federal code (SEER_CODE = 13.0) for broken-unit replacements and building stock findings (SEER_STOCK = 11.15) for working-unit replacements.

The last two terms in the above equation, ΔkWh_HEAT and ΔkWh_CIRC, represent the savings from installing an ECM in heating mode and circulation mode, respectively. The evaluation team assumed that these measures included the installation of an ECM along with the air conditioner. However, as previously mentioned, the energy savings value of 415 kWh for installing an ECM already includes the savings from the increase in SEER produced by installing that ECM. Therefore, only the ECM energy savings from heating mode and circulation mode should be added to the energy savings from the SEER increase for this measure. These two components sum to 345 kWh, according to the 2015 Wisconsin TRM (ΔkWh_HEAT = 134 kWh and ΔkWh_CIRC = 211 kWh).

The ex ante data shows deemed savings values for all air conditioners of 174 kWh; however, the average

ex post unit energy savings is 565 kWh, so the energy realization rate for this measure category is 324%.

This difference likely comes from the evaluation team using actual installed values for CAP and SEEREE.

Note that in 2017, air conditioner ex ante unit savings ranged from 174 kWh to 848 kWh, with an

average ex ante unit energy savings of 418 kWh.
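The air conditioner energy algorithm, including the blended baseline and the 134 + 211 kWh ECM terms cited above, can be sketched as follows; the capacity, SEER, and EFLH_C inputs are illustrative assumptions rather than program data.

```python
# Sketch of the Indiana TRM (v2.2) air conditioner energy savings algorithm
# with the 23% broken-unit / 77% working-unit blended baseline. The capacity,
# installed SEER, and EFLH_C below are illustrative assumptions.
def ac_kwh_savings(cap_btuh, eflh_c, seer_ee,
                   seer_code=13.0, seer_stock=11.15,
                   ecm_heat_kwh=134, ecm_circ_kwh=211):
    seer_term = 0.23 / seer_code + 0.77 / seer_stock - 1 / seer_ee
    return cap_btuh / 1_000 * eflh_c * seer_term + ecm_heat_kwh + ecm_circ_kwh

# Example: 30,000 Btu/h unit at SEER 16 with 600 assumed cooling hours.
print(round(ac_kwh_savings(30_000, 600, 16.0)))  # 782
```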

Per the Indiana TRM (v2.2), the evaluation team used the following algorithm to calculate demand

reduction:

ΔkW = (CAP / 1,000) × (0.23 / EER_CODE + 0.77 / EER_STOCK − 1 / EER_EE) × CF

Where:

CAP = Total cooling capacity in Btu/h
EER_CODE = Baseline EER value for time-of-sale replacements
EER_STOCK = Baseline EER value for early replacements
EER_EE = Installed efficiency
CF = Coincidence factor

The evaluation team obtained CAP from the program data. Per the Indiana TRM (v2.2), the evaluation team assigned a CF value of 0.88 for all units. EER_EE was not provided for any units in the program data, so per the Indiana TRM (v2.2) the evaluation team used EER_EE = 0.9 × SEER_EE. As with baseline SEER values, the evaluation team also applied a weighting algorithm for baseline EER, based on the


fraction of sites reporting replacement of broken or working units. The realization rate for demand reduction in this measure category is 241%. Its variance from 100% stems largely from the evaluation team using a baseline SEER that mostly reflected the SEER_STOCK value of 11.15, and likely also from the use of actual CAP values and from calculating EER_EE from SEER_EE instead of using deemed savings values.
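The demand algorithm can be sketched the same way. Note one assumption beyond the report: the document states EER_EE = 0.9 × SEER_EE for the installed unit, and this sketch applies the same 0.9 conversion to the baseline SEER values as well, which the report does not specify.

```python
# Sketch of the air conditioner demand reduction algorithm. Deriving the
# baseline EERs as 0.9 x SEER is an assumption here; the report only states
# that conversion for the installed unit. Capacity and SEER are illustrative.
def ac_kw_reduction(cap_btuh, seer_ee, seer_code=13.0, seer_stock=11.15, cf=0.88):
    eer_code, eer_stock, eer_ee = 0.9 * seer_code, 0.9 * seer_stock, 0.9 * seer_ee
    eer_term = 0.23 / eer_code + 0.77 / eer_stock - 1 / eer_ee
    return cap_btuh / 1_000 * eer_term * cf

print(round(ac_kw_reduction(30_000, 16.0), 2))  # 0.71
```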

Smart Thermostats

Several evaluated savings cases exist within this measure category, each identified by the measure name; Table 20 shows the delivered unit population splits.

Table 20. HVAC Configurations for Thermostat Measures

Measure Name-Defined Configuration       | Count of Unitsa | Ex Ante kWh | Ex Ante kW | Ex Ante Therms
Natural gas heat with air conditioner    | 1,870           | 157.2       | 0.202      | 138
Natural gas heat with no air conditioner | 1,923           | 0.0         | 0          | 138
Heat pump / electric                     | 26              | 718.0       | 0.202      | 0

a These quantities reflect physical unit counts, and therefore do not match the scorecard, which counted both fuel types for dual-fuel measures.

The Indiana TRM (v2.2) does not denote a difference between smart and smart Wi-Fi thermostats, and

the evaluation team could find no examples of separate savings factors for these two types of units.

Therefore, the evaluation team followed the Indiana TRM (v2.2) and grouped the smart and smart Wi-Fi

thermostats together for this evaluation.

To determine energy savings, the evaluation team used the following equations.

For natural gas heating with air conditioning, and for air conditioning alone:

ΔkWh = CAP_C / (SEER × 1,000) × EFLH_C × ESF_C

For heat pump systems:

ΔkWh = (CAP_C / (SEER × 1,000) × EFLH_C × ESF_C) + (CAP_H / (COP × 3,412) × EFLH_H × ESF_H)

Where:

CAP_C = System cooling capacity
SEER = System SEER
EFLH_C = Effective full-load cooling hours
ESF_C = Savings factor for cooling
CAP_H = System heating capacity
COP = Heating system coefficient of performance
3,412 = Conversion from Btu to kWh (3,412 Btu = 1 kWh)


EFLH_H = Effective full-load heating hours
ESF_H = Savings factor for heating

For thermostats serving natural gas heating systems without air conditioning, no electric energy savings

are produced from the Indiana TRM (v2.2) calculations. Table 21 lists the values and sources for these

variables.

Table 21. Variables, Values, and Sources Used for Thermostat Energy Savings Calculations

Variable | Value | Source
CAP_C | Actual when possible, or 28,994 Btu/h for air conditioners and 35,901 Btu/h for heat pumps | Program data; 2012 Residential Indiana Baseline Study; average of HVAC program heat pumps (2017 and 2018)
SEER | 11.15 (Btu/h)/W | 2012 Residential Indiana Baseline Study
EFLH_C | Varies by location | Indiana TRM (v2.2)
ESF_C | 0.139 for smart thermostats | Indiana TRM (v2.2) (based on 2015 Cadmus study)a
CAP_H | Actual when possible, or 74,309 Btu/h for natural gas or electric furnaces and 35,901 Btu/h for heat pumps | Program data; 2012 Residential Indiana Baseline Study; average of HVAC program heat pumps (2017 and 2018)
COP | 2.26 for heat pumps; 1.0 for electric furnace | Indiana TRM (v2.2); engineering calculation
EFLH_H | Varies by location | Indiana TRM (v2.2)
ESF_H | 0.125 for smart thermostats | Indiana TRM (v2.2) (based on 2015 Cadmus study)a

a Cadmus. January 29, 2015. Evaluation of the 2013–2014 Programmable and Smart Thermostat Program. http://www.cadmusgroup.com/wp-content/uploads/2015/06/Cadmus_Vectren_Nest_Report_Jan2015.pdf?submissionGuid=7cbc76e9-41bf-459a-94f5-2b13f74c4e52
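The two thermostat electric savings equations, populated with the Table 21 defaults, can be sketched as follows; the EFLH values are illustrative placeholders, since actual values vary by location.

```python
# Sketch of the smart thermostat electric savings equations using the
# Table 21 default capacities, SEER, COP, and savings factors. The EFLH
# values below are illustrative assumptions, not TRM city values.
def tstat_kwh_gas_heat_with_ac(cap_c=28_994, seer=11.15, eflh_c=600, esf_c=0.139):
    """Natural gas heat with air conditioner: cooling-side savings only."""
    return cap_c / (seer * 1_000) * eflh_c * esf_c

def tstat_kwh_heat_pump(cap_c=35_901, cap_h=35_901, seer=11.15, cop=2.26,
                        eflh_c=600, eflh_h=1_100, esf_c=0.139, esf_h=0.125):
    """Heat pump: cooling-side plus heating-side savings."""
    cooling = cap_c / (seer * 1_000) * eflh_c * esf_c
    heating = cap_h / (cop * 3_412) * eflh_h * esf_h
    return cooling + heating

print(round(tstat_kwh_gas_heat_with_ac(), 1))
print(round(tstat_kwh_heat_pump(), 1))
```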

Ex post energy savings vary widely across different HVAC system configurations:

• Sites with an air conditioner and natural gas furnace yielded energy savings ranging from

49 kWh to 160 kWh

• Sites with a heat pump produce energy savings ranging from 512 kWh to 962 kWh

In addition, the evaluation team adjusted savings for sites that received two thermostats. The 3,819

program thermostats were delivered to 3,597 sites, with 222 thermostats (5.8%) being second

thermostats delivered to a given site.

A recent evaluation for Dayton Power and Light9 (DP&L) also gave special consideration to sites with two

thermostats. From the DP&L study survey results, 46.4% of second thermostats were installed in the

same home as the first thermostat, while 42.9% were given away and 10.7% were not in use. The DP&L

study analysis assumed that 50% of thermostats installed in the same home as the first were controlling a second HVAC system equivalent to the first HVAC system (and therefore generating equal savings) and that the other 50% controlled the same system (no additional savings). The DP&L study analysis also assumed that 70% of thermostats that were given away are given to someone else in the utility's territory (and therefore generate savings equal to the first). The final effective ISR for second thermostats in the DP&L study was then (46.4% × 50%) + (42.9% × 70%) = 53.2%. This effective ISR was applied to second thermostats in the tracking data for the NIPSCO HVAC Rebates program, slightly reducing overall savings.

9 Cadmus. May 7, 2018. 2017 Evaluation, Measurement, and Verification Final Report. Residential Smart Thermostat Rebate Pilot section. p. 86 (PDF p. 206). http://dis.puc.state.oh.us/DocumentRecord.aspx?DocID=b651378a-b806-4b86-8f99-aafcf47ae218
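The DP&L-derived effective in-service rate for second thermostats is straightforward arithmetic:

```python
# Worked computation of the effective ISR for second thermostats,
# using the DP&L survey shares cited above.
installed_same_home = 0.464    # second thermostats installed in the same home
given_away = 0.429             # second thermostats given away
controls_second_system = 0.50  # of same-home units, share on a second system
stays_in_territory = 0.70      # of given-away units, share assumed in-territory

effective_isr = (installed_same_home * controls_second_system
                 + given_away * stays_in_territory)
print(round(effective_isr, 3))  # 0.532
```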

The Indiana TRM (v2.2) does not provide guidance on claiming demand reduction for these thermostat measures. Currently, thermostat savings in most TRMs and evaluations are derived from analysis of billing data, which generally cannot produce values for demand reduction. However, some demand reduction from smart thermostats likely does exist, and it is accommodated in the Illinois TRM (v7.0).10 That TRM calculates savings using standard methods for deriving baseline peak load, then applies a smart thermostat ESF and half the CF normally used for cooling. The evaluation team used that same approach:

ΔkW = CAP_C / (EER × 1,000) × (CF / 2) × ESF_C

Here, the standard cooling CF of 0.88 is used, but divided by 2. The resultant savings per first thermostat were 0.177 kW for sites with air conditioners and 0.224 kW for sites with heat pumps. Ex ante unit savings for each thermostat were 0.202 kW, and the overall realization rate for demand was 85%.
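The 0.177 kW figure for air conditioner sites can be reproduced from the Table 21 defaults, using EER = 0.9 × SEER per the Indiana TRM (v2.2):

```python
# Reproducing the thermostat demand calculation for an air conditioner site:
# CAP_C = 28,994 Btu/h, EER = 0.9 x SEER (SEER 11.15), CF = 0.88 halved per
# the Illinois TRM (v7.0) approach, ESF_C = 0.139.
cap_c = 28_994
eer = 0.9 * 11.15
cf = 0.88
esf_c = 0.139

delta_kw = cap_c / (eer * 1_000) * (cf / 2) * esf_c
print(round(delta_kw, 3))  # 0.177
```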

Evaluated natural gas savings for thermostats follow a single equation:

Δtherms = CAP_H / 100,000 × EFLH_H × ESF_H

All variables here follow the same sources as those in Table 21. Ex post therm savings varied widely, between 83 therms and 254 therms. These values are generally slightly higher than the deemed ex ante savings, so the natural gas energy realization rate for thermostats was 115%.
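The natural gas equation can be sketched with the Table 21 default furnace capacity; the EFLH_H value here is an illustrative placeholder, since actual hours vary by location.

```python
# Sketch of the thermostat natural gas savings equation. CAP_H and ESF_H are
# the Table 21 defaults; EFLH_H is an illustrative assumption.
def tstat_therm_savings(cap_h=74_309, eflh_h=1_500, esf_h=0.125):
    return cap_h / 100_000 * eflh_h * esf_h

print(round(tstat_therm_savings(), 1))
```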

Air-Source Heat Pumps

The evaluation team used the following algorithm from the Indiana TRM (v2.2) to calculate the total

electric energy savings:

ΔkWh = (CAP_C / 1,000) × EFLH_C × (1 / SEER_BASE − 1 / SEER_EE) + (CAP_H / 1,000) × EFLH_H × (1 / HSPF_BASE − 1 / HSPF_EE) + ΔkWh_CIRC

10 Illinois Energy Efficiency Stakeholder Advisory Group. September 28, 2018. 2019 Illinois Statewide Technical Reference Manual for Energy Efficiency. Version 7.0. Volume 3: Residential Measures. http://ilsagfiles.org/SAG_files/Technical_Reference_Manual/Version_7/Final_9-28-18/IL-TRM_Effective_010119_v7.0_Vol_3_Res_092818_Final.pdf


Where:

CAP_C = Total cooling capacity
EFLH_C = Effective full-load cooling hours
SEER_BASE = Baseline SEER
SEER_EE = Efficient SEER
CAP_H = Total heating capacity
EFLH_H = Effective full-load heating hours
HSPF_BASE = Baseline heating seasonal performance factor
HSPF_EE = Efficient heating seasonal performance factor
ΔkWh_CIRC = Circulation mode energy savings from an ECM installation

Because the program data described these measures as heat pumps with ECMs, the evaluation team added a ΔkWh_CIRC of 211 kWh to the energy savings captured from the SEER and HSPF upgrade.

The evaluation team used CAP_C and CAP_H values from the program tracking database and from model look-ups in the AHRI equipment database. The evaluation team also found SEER_EE and HSPF_EE in the tracking database and used EFLH values from the Indiana TRM (v2.2), based on project location. The evaluation team assumed SEER_BASE and HSPF_BASE to be 13.0 and 7.7, respectively.

Evaluated savings varied from 885 kWh to 1,596 kWh, averaging 1,187 kWh. The ex ante savings used a

deemed value of 1,159 kWh, and the realization rate for electric energy savings was 102%. Some

variance between ex ante and ex post savings was likely caused by the evaluation team’s use of actual

values for CAP, SEEREE, and HSPFEE. It is also possible that the submitted values for each measure only

incorporated heating or cooling savings but not both.
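The ASHP energy algorithm can be sketched as follows; the baselines (SEER 13.0, HSPF 7.7) and the 211 kWh circulation term come from the text above, while the capacities, installed efficiencies, and EFLH values are illustrative assumptions.

```python
# Sketch of the ASHP electric energy savings algorithm. SEER_BASE, HSPF_BASE,
# and the ECM circulation term follow the report; other inputs are
# illustrative assumptions, not program data.
def ashp_kwh_savings(cap_c, eflh_c, seer_ee, cap_h, eflh_h, hspf_ee,
                     seer_base=13.0, hspf_base=7.7, ecm_circ_kwh=211):
    cooling = cap_c / 1_000 * eflh_c * (1 / seer_base - 1 / seer_ee)
    heating = cap_h / 1_000 * eflh_h * (1 / hspf_base - 1 / hspf_ee)
    return cooling + heating + ecm_circ_kwh

# Example: 36,000 Btu/h unit, SEER 15 / HSPF 8.5, assumed 600 cooling
# and 1,100 heating hours.
print(round(ashp_kwh_savings(36_000, 600, 15.0, 36_000, 1_100, 8.5)))  # 917
```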

The evaluation team used the following algorithm to calculate demand reduction:

ΔkW = (CAP_C / 1,000) × (1 / EER_BASE − 1 / EER_EE) × CF

The evaluation team assumed an EER_BASE of 11.0 per the Indiana TRM (v2.2), used a CF of 0.88, and calculated EER_EE as 0.9 × SEER_EE, with SEER_EE coming from the program data. Evaluated demand reduction ranged from 0.41 kW to 0.91 kW, averaging 0.66 kW. Ex ante savings were a deemed value of 0.25 kW, and the peak demand realization rate for this measure category was 263%.
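The ASHP demand algorithm can be sketched the same way; EER_BASE = 11.0, CF = 0.88, and EER_EE = 0.9 × SEER_EE follow the text above, while the capacity and SEER are illustrative.

```python
# Sketch of the ASHP demand reduction algorithm. Baseline EER, CF, and the
# EER_EE = 0.9 x SEER_EE conversion follow the report; capacity and SEER
# are illustrative assumptions.
def ashp_kw_reduction(cap_c, seer_ee, eer_base=11.0, cf=0.88):
    eer_ee = 0.9 * seer_ee
    return cap_c / 1_000 * (1 / eer_base - 1 / eer_ee) * cf

print(round(ashp_kw_reduction(36_000, 15.0), 2))  # 0.53
```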

Furnaces

The program tracking data contained 2,986 furnace units that were not specified as with or without an

ECM. The evaluation team referenced the AHRI database using the AHRI reference numbers listed for

most units, and found that 52% of these units had an ECM. However, these furnace models all

corresponded to natural gas–only customers, and had no claimed or evaluated energy savings or peak

demand reduction.


The evaluation team applied the same natural gas energy savings methodology as for furnaces with

ECMs, which produced an average evaluated natural gas energy savings value of 196 therms and a

realization rate of 113% for these measures.

Boilers

There were 58 boiler measures delivered through the program in 2018. Savings for these measures followed the same algorithm used for Furnace measures. The resulting realization rate is 141% for this measure category, largely because the evaluation team used actual AFUE values to calculate savings.

Ex Post Gross Summary

Table 22 presents ex ante and ex post savings by measure category. As previously discussed, many of the deviations from expected (ex ante) values were driven by the ex ante data generally employing deemed savings values, while the ex post evaluation used actual values for equipment sizing and efficiency, and varied EFLH by measure location.

Table 22. 2018 HVAC Rebates Program Ex Ante and Ex Post Gross Savings Values

Measure Category  | Ex Ante kWh/yr | Ex Post Gross kWh/yr | RR   | Ex Ante kW | Ex Post Gross kW | RR   | Ex Ante therms/yr | Ex Post Gross therms/yr | RR
Furnaces w/ ECMs  | 954,500        | 954,500              | 100% | -          | 137              | -    | 399,809           | 475,491                 | 119%
Air Conditioners  | 138,474        | 448,517              | 324% | 258        | 623              | 241% | -                 | -                       | -
Smart Thermostats | 312,475        | 211,100              | 68%  | 383        | 324              | 85%  | 523,434           | 601,738                 | 115%
ASHPs             | 8,110          | 8,308                | 102% | 2          | 5                | 263% | -                 | -                       | -
Furnaces          | -              | -                    | -    | -          | -                | -    | 960,729           | 1,081,532               | 113%
Boilers           | -              | -                    | -    | -          | -                | -    | 12,379            | 17,457                  | 141%
Total             | 1,413,559      | 1,622,426            | 115% | 643        | 1,088            | 169% | 1,896,351         | 2,176,218               | 115%

Notes: Totals may not sum due to rounding. Ex ante values are presented at the measure level and represent audited values, since the scorecard provides only savings totals. RR stands for realization rate.

Ex Post Net Savings

The evaluation team calculated freeridership and participant spillover using the methods described in

Appendix B. Self-Report Net-to-Gross Evaluation Methodology and the survey data collected from 2018

participants. Table 23 shows the percentage of HVAC program freeridership and participant spillover,

and the resulting NTG percentage.

Table 23. 2018 HVAC Rebates Program Net-to-Gross Results

Responses (n) | Freeridershipa | Participant Spillover | NTG
67            | 35%            | 0%                    | 65%

a This score is an average weighted by survey sample ex post gross program MMBtu savings.


Freeridership

To determine freeridership, the evaluation team asked 67 participants a set of questions focused on

whether they would have installed equipment to the same level of efficiency, at the same time, and in

the same amount in the absence of the HVAC Rebates program.

Based on survey feedback, the evaluation team calculated an overall freeridership score of 35% for the

program (Table 24).

Table 24. 2018 HVAC Rebates Program Overall Freeridership Results

Program Category     | Responses (n) | Freeridership (%)a
HVAC Rebates Program | 67            | 35%

a This score is an average weighted by survey sample ex post gross program MMBtu savings.

By combining a previously used intention methodology with a new influence methodology, the evaluation team produced a freeridership score for the program by averaging the savings-weighted intention and influence freeridership scores. Refer to Appendix B. Self-Report Net-to-Gross Evaluation Methodology for further details on the intention and influence questions and scoring methodologies.

Intention Freeridership

The evaluation team estimated an intention freeridership score for each participant based on their responses to the intention-focused freeridership questions. Across the 67 survey respondents, the average intention freeridership score, weighted by ex post gross MMBtu savings, was 52%.

Figure 6 shows the distribution of the individual intention freeridership scores.


Figure 6. 2018 HVAC Rebates Program Distribution of Intention Freeridership Scores

Source: Participant Survey. Questions: D1 to D6 and D8 are used to estimate an intention freeridership

score. See Table 239 in the Appendix B. Self-Report Net-to-Gross Evaluation Methodology for the full text of

the questions, response options, and scoring treatments used to estimate intention freeridership scores.

See Table 240 for the unique Residential HVAC program participant response combinations resulting from

intention freeridership questions, along with intention freeridership scores assigned to each combination,

and the number of responses for each combination.

Influence Freeridership

The evaluation team assessed influence freeridership by asking participants how important various

program elements were in their purchase decision-making process. Table 25 shows program elements

that participants rated for importance, along with a count and average rating for each element.

Table 25. 2018 HVAC Rebates Program Influence Freeridership Responses

Influence Rating         | Influence Score | Information about the Program from a Contractor | Rebates for Equipment | Energy Efficiency Information from NIPSCO | Participation in a NIPSCO Efficiency Program
1 (not at all important) | 100%            | 13  | 12  | 14  | 15
2                        | 75%             | 3   | 4   | 6   | 7
3                        | 25%             | 14  | 19  | 21  | 14
4 (very important)       | 0%              | 31  | 32  | 20  | 7
Not applicable           | 50%             | 6   | 0   | 6   | 24
Average Rating           | -               | 3.0 | 3.1 | 2.8 | 2.3

The evaluation team determined each respondent’s influence freeridership rate for each measure

category using the maximum rating provided for any element listed in Table 25. As shown in Table 26,

the respondents’ maximum influence ratings ranged from 1 (not at all important) to 4 (very important).

A maximum score of 1 means the customer ranked all elements from the table as not at all important,


while a maximum score of 4 means the customer ranked at least one element as very important. Counts

refer to the number of maximum influence responses for each element influence score response option.

Table 26. 2018 HVAC Rebates Program Influence Freeridership Score

Maximum Influence Rating | Influence Score | Count | Total Survey Sample Ex Post MMBtu Savings | Influence Score MMBtu Savings
1 (not important)        | 100%            | 10    | 131                                       | 131
2                        | 75%             | 1     | 16                                        | 12
3                        | 25%             | 10    | 97                                        | 24
4 (very important)       | 0%              | 46    | 687                                       | 0

Average Maximum Influence Rating (Simple Average): 3.4
Average Influence Score (Weighted by Ex Post Savings): 18%

The average influence score of 18% for the 2018 HVAC program is weighted by respondents’ ex post

MMBtu program savings.

Total Freeridership

The evaluation team calculated the arithmetic mean of the intention and influence freeridership

components to estimate the final freeridership score for the HVAC program of 35%.

Final Freeridership (35%) = [Intention FR Score (52%) + Influence FR Score (18%)] / 2

A higher freeridership score translates to more savings being deducted from the gross savings estimates.

Table 27 lists the intention, influence, and freeridership scores for the HVAC program.

Table 27. 2018 HVAC Rebates Program Freeridership Score

Responses (n) | Intention Score | Influence Score | Freeridership Score
67            | 52%             | 18%             | 35%
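The program-level scores follow directly from the arithmetic described above; the NTG line assumes NTG = 1 − freeridership + participant spillover, consistent with the 65% reported in Table 23.

```python
# Worked arithmetic for the program-level freeridership and NTG scores.
# NTG = 1 - freeridership + spillover is assumed here, matching the
# reported 35% / 0% / 65% combination.
intention_fr = 0.52
influence_fr = 0.18
spillover = 0.00

freeridership = (intention_fr + influence_fr) / 2
ntg = 1 - freeridership + spillover
print(round(freeridership, 2), round(ntg, 2))  # 0.35 0.65
```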

Participant Spillover

As detailed in Appendix B. Self-Report Net-to-Gross Evaluation Methodology, the evaluation team

estimated participant spillover11 measure savings using specific information about participants

determined through the evaluation and using the Indiana TRM (v2.2) as a baseline reference. The

evaluation team estimated the percentage of program participant spillover by dividing the sum of

additional spillover savings (as reported by survey respondents) by the total gross savings achieved by

those respondents. The participant spillover estimate for the HVAC program is 0.06%, rounded to 0%, as

shown in Table 28.

11 Non-participant spillover evaluation activities were not conducted for the 2018 program year.


Table 28. 2018 HVAC Rebates Program Participant Spillover

Spillover Savings (MMBtu) | Participant Program Savings (MMBtu) | Participant Spillover
0.6                       | 930.3                               | 0%
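A quick check of the spillover percentage: respondents' spillover savings divided by their total gross program savings.

```python
# Worked computation of the participant spillover percentage reported above.
spillover_mmbtu = 0.6
participant_program_mmbtu = 930.3

spillover_pct = spillover_mmbtu / participant_program_mmbtu * 100
print(round(spillover_pct, 2))  # 0.06
```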

Overall, the program was very important to two participants’ decisions to install additional high-

efficiency equipment for which they did not receive a rebate from NIPSCO. Table 29 shows this

additional participant spillover measure and the total resulting energy savings.

Table 29. 2018 HVAC Rebates Program Participant Spillover Measures, Quantity, and Savings

Spillover Measures       | Quantity | Total Energy Savings (MMBtu)
ENERGY STAR Refrigerator | 2        | 0.6

NTG

Table 30 shows the percentage of HVAC Rebates program freeridership, participant spillover, and NTG.

Table 30. 2018 HVAC Rebates Program Net-to-Gross Results

Responses (n) | Freeridershipa | Participant Spillover | NTGa
67            | 35%            | 0%                    | 65%

a This score is an average weighted by survey sample ex post gross program MMBtu savings.

Evaluated Net Savings Adjustments

Using the calculated freeridership and participant spillover values, the evaluation team applied the overall NTG to the ex post gross savings to calculate the ex post net savings. Table 31 shows ex post net savings by fuel type for the 2018 HVAC program.

Table 31. 2018 HVAC Rebates Program Ex Post Net Savings

Fuel                                   | Ex Antea Gross Savings | Ex Post Gross Savings | NTG | Ex Post Net Savings
Electric Energy Savings (kWh/yr)       | 1,413,559              | 1,622,426             | 65% | 1,054,577
Peak Demand Reduction (kW)             | 643                    | 1,088                 | 65% | 707
Natural Gas Energy Savings (therms/yr) | 1,896,351              | 2,176,218             | 65% | 1,414,542
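Applying the 65% NTG to the ex post gross values reproduces the net savings in Table 31 (using the rounded table values):

```python
# Applying the 65% NTG to ex post gross savings, using the rounded
# values reported in Table 31.
ntg = 0.65
gross = {"kWh/yr": 1_622_426, "kW": 1_088, "therms/yr": 2_176_218}

net = {fuel: round(value * ntg) for fuel, value in gross.items()}
print(net)  # {'kWh/yr': 1054577, 'kW': 707, 'therms/yr': 1414542}
```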

Process Evaluation

The evaluation team interviewed NIPSCO program staff to gain insights into program implementation and associated challenges and successes. The evaluation team also surveyed 67 program participants who submitted a rebate form in 2018 to collect feedback on customer awareness, participation experiences, and satisfaction with the program. The process evaluation results are provided below.

Program Design and Delivery

As in previous years, 2018 participants either installed measures through the contractor of their choice or through self-installation. The program included an instant discount option for contractors who


applied for rebates on their customers’ behalf. Customers and contractors were able to fill out the

application form on paper or through an online form.

While NIPSCO does not have a contractor network, and does not promote any individual contractors,

Lockheed Martin Energy (the program implementer) has its own network of contractors it is able to

leverage. If contractors did not pursue the instant discount option, participants had to fill out and

submit rebate forms within 60 days of installation. Every project was subject to an on-site visual

inspection, with program staff randomly inspecting 10% of installations.

Once submitted, the implementer reviewed the application form for completeness and accuracy. If accepted, the implementer processed the form and sent a rebate check within eight weeks. The program implementer reported 455 discontinued applications in 2018, 94 of which were for non-qualifying equipment and 49 of which were for non-customers. The remaining discontinued projects were due to customers failing inspection, not responding to an inspection request, installing used equipment, or not successfully following up on project questions from Lockheed Martin Energy.

Changes from 2017 Design

For the 2018 program year, NIPSCO offered a range of rebated measures to its customers to provide

value and to drive them toward more energy-efficient equipment, while remaining within the program

budget. As shown in Table 32, NIPSCO made one change to the program measures in 2018 by removing

the HVAC filter whistle, which had been added in 2017, due to low customer interest. All other measures

and incentive levels remained consistent with 2017 offerings.

Table 32. 2016–2018 HVAC Rebates Program Measures and Rebates

Measure | Qualifying Efficiency Level | 2016 Incentive | 2017 Incentive | 2018 Incentive
Air Conditioner with ECM | 15+ SEER | $350.00 | $125.00 | $125.00
Heat Pump with ECM | 14.5 SEER (a) | $350.00 | $175.00 | $175.00
Heat Pump with ECM | 15+ SEER | $400.00 | N/A | N/A
Geothermal Heat Pump | N/A | $500.00 | N/A | N/A
Natural Gas Furnace | 95% AFUE with ECM | $350.00 | $250.00 | $250.00
Natural Gas Furnace | 95% AFUE without ECM | $300.00 | $200.00 | $200.00
Natural Gas Furnace | 98% AFUE with ECM | $400.00 | N/A | N/A
Natural Gas Furnace | 98% AFUE without ECM | $350.00 | N/A | N/A
Natural Gas Boiler | 90% AFUE | $200.00 | $200.00 | $200.00
Natural Gas Boiler | 92% AFUE | $300.00 | $200.00 | $200.00
Smart Thermostat | N/A | $50.00 | $45.00 | $45.00
HVAC Filter Whistle | N/A | N/A | $5.00 | N/A

(a) Changed from 14 SEER for the 2017 program year.

In addition to the one measure change, NIPSCO made program improvements to help customers

complete the application process. Lockheed Martin Energy implemented customer notifications for any

incomplete applications. NIPSCO produced a video to walk customers through how to complete an

application.


In 2018, the program increased to 12,505 verified rebated measure units, from 10,916 verified measure units in 2017. As shown in Figure 7, 2018 had greater participation in furnace, smart

thermostat, and furnace with ECM measures compared to 2017, and less participation in air conditioner

measures.

Figure 7. 2018 and 2017 Participation by Measure Type (Units)

2017 Recommendation Status

The evaluation team followed up with NIPSCO regarding the status of the 2017 evaluation

recommendations. Table 33 lists the 2017 HVAC program evaluation recommendations and NIPSCO’s

progress toward addressing those recommendations to date.

Table 33. Status of 2017 HVAC Rebates Program Evaluation Recommendations

Recommendation: Prioritize making improvements to rebate processing times by setting incremental goals and frequently tracking performance as NIPSCO and the program implementer proceed with current implementation plans, to ensure timely processing. Incremental goals and frequent tracking will help ease the introduction of new processes (such as the online application system) while identifying difficulties along the way.
Status: Completed in 2018. In 2018, Lockheed Martin Energy set a rebate processing time of eight weeks. NIPSCO specifically requested this due to inconsistent and sometimes lengthy processing times in previous years.

Recommendation: To increase participation in the instant discount offering, consider creating educational materials and training on the instant discount option and providing these to Lockheed Martin Energy’s contractor network members, educating them about the instant discount model and increasing the number of contractors offering the instant discount. The application paperwork required with offering instant discounts and the wait of up to eight weeks for reimbursement may present barriers to contractors considering the instant discount model. Consider offering these contractors a trial period to test the instant discount model.
Status: No Further Action. According to program staff, Lockheed Martin Energy has sent out several trade ally newsletters that specifically discussed the instant discount. Based on NIPSCO staff’s review of applications, the few large contractors who have sufficient office staff use the instant discount. There is no plan to push the instant discount on smaller contractors.

Recommendation: To increase participation in the instant discount offering, shorten and simplify steps in the online application system for contractors and customers, facilitating faster rebate processing times. While the above recommendation about persuading more contractors to use the instant discount model will help lessen the number of applications to process on the back end, some contractors will not switch, and some customers will prefer to conduct self-installations. For such contractors and customers, shorten the time required to complete the application by simplifying the process through the use of auto-fill and drop-down boxes where possible.
Status: Completed in 2018. NIPSCO produced a video to walk customers through completing the application and made use of auto-fill and drop-down option selection wherever possible in the online application. While this was done to help reduce confusion and provide support, it did not affect the steps involved. Whether online or on paper, the processing time was about the same, and applicants still needed all the same paperwork (such as an invoice, application, and AHRI certificate).

Recommendation: Ensure that the correct capacity for air conditioners is recorded to increase the accuracy of the savings calculation. Air conditioner capacities should align with those listed in the AHRI equipment database.
Status: Completed in 2018. The evaluation team examined the data and found that the SEER matched the value listed in the AHRI database for every case with a listed AHRI Certified Reference Number.

Recommendation: Include (wherever applicable) the heating and cooling capacity, SEER, and COP of the equipment being controlled by the thermostat.
Status: No Further Action. NIPSCO said that thermostat applications only asked for the manufacturer and model number of the replaced thermostat and whether the unit was working (Y/N). Lockheed Martin Energy confirmed that customers who self-install thermostats generally do not know this information, and that requiring it could hinder participation.

Recommendation: Consider more explicit measure naming conventions while specifying the type of customer account where a measure is installed (electric only, natural gas only, or combo). The current measure descriptions used for furnaces have some ambiguity, which can lead to savings claim discrepancies. Furnace models with ECMs should be delivered as part of the Furnace with ECM measure group to ensure accurate savings claims, and that group could be further divided to reflect installations for different types of customer accounts.
Status: Completed in 2018. LM Captures now contains a field denoting whether the customer is natural gas, electric, or natural gas and electric.

Participant Feedback

This section presents participant survey findings.


Energy Efficiency Awareness and Marketing

In 2018, the HVAC Rebates program was marketed through a variety of channels: bill inserts, the

Appliance Recycling program participant leave-behind materials, the NIPSCO website, and social media

(Twitter and Facebook). While NIPSCO does not have a contractor network, the program has access to

Lockheed Martin Energy’s contractor network, and while these contractors are not trained on how to

sell the program, they have access to marketing collateral such as the program trifold and information

on how to apply.

Participant respondents most frequently heard about the program through a contractor (41%), which

was also the most common source of program awareness in 2017 (49%). In 2018, other common

sources of program awareness included retailers and vendors (23%) and via word of mouth (18%). As

shown in Figure 8, only 2% of 2018 respondents learned about the program through social media.

Figure 8. How Participants Learned about the HVAC Rebates Program

Source: Participant Survey. Question: “How did you learn about NIPSCO’s Energy Efficiency Rebate

program?” (Multiple response allowed)

Over half of respondents (60%) were aware of other NIPSCO energy efficiency programs when provided

with a list including prompts on what those programs cover. However, of those who said they had heard

of other programs, 70% could not name an individual program when asked (Figure 9). Of those who

were able to name a program, respondents most frequently named this program (the HVAC Rebates

program; 15%) and the Home Energy Assessment program (10%).


Figure 9. Participant Awareness of Other NIPSCO Programs

Source: Participant Survey. Questions: “Besides the Energy Efficiency Rebate program, which you

participated in, are you aware that NIPSCO offers other energy efficiency programs?” and “What energy

efficiency programs are you aware of?” (Multiple response allowed)

Participation Drivers

Respondents reported that their primary motivation for participating in the HVAC Rebates program was

to get the rebate (46%). The other most common responses were to replace old but still working HVAC

equipment (18%) and to save money on utility bills (18%). As shown in Figure 10, the top three

motivations were the same in 2018 and 2017.

Figure 10. Motivation to Participate in the HVAC Rebates Program

Source: Participant Survey. Question: “Why did you decide to participate in the NIPSCO Energy Efficiency

Rebate program?” (Multiple response allowed)

The evaluation team determined whether HVAC Rebates program participants replaced a working unit

or a broken unit. As shown in Figure 11, 75% of respondents replaced a working unit in 2018. This

proportion was statistically similar to that in 2017 (79%).


Figure 11. Replacement of Working Unit versus Broken Unit

Source: Participant Survey. Question: “Was the measure purchased replacing a working unit or a broken

unit?”

When asked about the amount of information they had when making decisions about energy efficiency

upgrades, most respondents said they had some information (53%). One-third of respondents said they

had a lot of information. Together, the percentage of customers reporting at least some information on

energy efficiency was 86% in 2018, compared to 83% in 2017 and 65% in 2016.12 In 2018, only 6% of

respondents said they had no information at all when deciding to install new equipment.

Figure 12. Amount of Energy Efficiency Information Participants had for Decision Making

Source: Participant Survey. Question: “How much information about energy efficiency or expected energy

savings would you say you had when you decided to install the new [measure]?”

Perceived Impact on Home Comfort and Energy Bills

Since installing the program measures, 84% of respondents said their home was more comfortable

(consistent with 82% in 2017). As shown in Figure 13, 36% of respondents reported seeing lower

monthly energy bills; a significant decrease from 57% in 2017.13

12 The change from 2017 to 2018 was not statistically significant. The change from 2016 to 2018 was statistically

significant at p≤0.01 (99% confidence).

13 The change from 2017 to 2018 was statistically significant at p≤0.05 (95% confidence).
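Year-over-year comparisons like those flagged in these footnotes are commonly tested with a two-proportion z-test. The sketch below is illustrative only; both sample sizes are hypothetical, since the report does not show the per-question respondent counts behind each comparison.

```python
# Illustrative two-proportion z-test (pooled variance), of the kind used to
# compare survey-year proportions. Sample sizes below are hypothetical.
from math import erf, sqrt

def two_prop_ztest(x1, n1, x2, n2):
    """Two-sided z-test for equality of two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                    # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))   # pooled standard error
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p
    return z, p_value

# e.g. 36% of 67 respondents vs. 57% of a hypothetical 70 respondents
z, p = two_prop_ztest(24, 67, 40, 70)
print(p < 0.05)  # True: significant at the 95% confidence level
```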


Figure 13. Perceived Impact on Home Comfort and Energy Bills Since Installation of Measure

Source: Participant Survey. Question: “Do you find that your home is more comfortable since the

installation of the measure?” and “Is your NIPSCO energy bill lower since you installed the measure?”

The perceived impact on energy bills varied by measure type (Figure 14). Among those who installed an

air conditioner, 48% of respondents reported lower energy bills. Just over one-third (35%) of smart

thermostat respondents and less than one-third (29%) of furnace respondents reported lower energy

bills, with the majority of respondents for both measures stating there was no difference. No boiler

respondents could tell a difference in energy bills in 2018 (there were no boiler respondents in 2017).

Figure 14. Perceived Impact on Energy Bills by Installed Measure

Source: Participant Survey. Question: “Is your NIPSCO energy bill lower since you installed the measure?”

Responses also varied by measure type regarding whether respondents perceived an increase in home

comfort, ranging from 33% to 92% of customers reporting greater home comfort. As shown in Figure 15,

respondents who installed a furnace most commonly reported increased home comfort (92%), followed

by those who installed a smart thermostat (88%). Boiler respondents were least likely to report

improved home comfort (33%), but this is likely due to the small sample size for that measure type.


Figure 15. Perceived Impact on Home Comfort by Installed Measure

Source: Participant Survey. Question: “Do you find that your home is more comfortable since the installation of the measure?”

The survey asked participants who installed a smart thermostat about any settings they might have

adjusted since installation, to determine whether customers are maximizing the benefits of their

installed equipment. Of 16 thermostat respondents, 50% adjusted temperature settings, 31% adjusted

schedule settings, 25% adjusted modes, and 6% adjusted fan settings. Forty-four percent reported

adjusting all the thermostat settings on their new unit, up from 20% in 2017. Six percent reported not

adjusting any settings.

Satisfaction with Overall Program and Program Aspects

Respondents reported high satisfaction with the overall program in 2018. As shown in Figure 16, 92% of

respondents said they were very satisfied or somewhat satisfied with the program. Though this was

slightly lower than in 2017 (99%), the decrease was not statistically significant.

Figure 16. Overall Satisfaction with HVAC Rebates Program

Source: Participant Survey. Question: “How satisfied are you with the NIPSCO Energy Efficiency Rebate

program overall?”


Respondents who were less than very satisfied most often said they felt the rebates had taken too long

to arrive (seven respondents), that the rebate amount was low (five respondents), and that they had

either installed or wanted to install additional measures that were not eligible for a rebate through the

program (four respondents).

Respondents also rated their satisfaction with various program aspects. Based on the percentage who

were very satisfied, respondents were most satisfied with the measures installed (91%). As shown in

Figure 17, respondents were the least satisfied with the rebate amount (57%). Significantly more 2018

respondents than 2017 respondents were very satisfied with the application process.14 The percentage

of customers who were very satisfied also increased regarding the time it took to receive the rebate,

from 45% in 2017 to 62% in 2018; this was a notable but not statistically significant improvement.

Figure 17. Satisfaction with Program Aspects

Source: Participant Survey. Multiple Questions: “How satisfied were you with the…”

Of the 67 surveyed participants, 82% had a contractor install the measures and 18% installed the

measures themselves. Of respondents who used a contractor, the majority (87%) reported being very

satisfied with their contractor, while 5% were somewhat satisfied (Figure 18). In 2017, 90% were very

satisfied with their contractor.15

14 This change from 2017 to 2018 was statistically significant at p≤0.05 (95% confidence). Other changes in

satisfaction between 2017 and 2018 were not statistically significant.

15 The change in those who were very satisfied with their contractor is not statistically significant.


Figure 18. Satisfaction with Contractors

Source: Participant Survey. Question: “How satisfied were you with the contractor who installed your

measure?”

Instant Discount Option

The evaluation team investigated how often contractors offered customers the instant discount option,

and whether customers took advantage of this option. The majority of respondents who had a

contractor install their measure (88%) reported that they were not offered the instant discount and

received their rebate in the mail (Figure 19). Of the respondents who were offered the instant discount,

83% accepted the discount while 17% opted to get the mailed rebate check.

Figure 19. Usage of the Instant Discount Option

Source: Participant Survey. Question: “Did your contractor offer you a discount on your new equipment or

did you receive a rebate in the mail?”

Suggestions for Improvement

When asked if they would like any additional equipment rebated, 41% of respondents (n=59) said yes.

As shown in Figure 20, of those who said yes, 54% requested rebates for water heaters, 42% wanted

rebates for major home appliances (such as dishwashers, stoves, washers, dryers, and refrigerators), and

13% wanted rebates for new windows.


Figure 20. Other Equipment Customers Would Like Rebated

Source: Participant Survey. Question: “Are there any other equipment you would like to see included in the

Energy Efficiency Rebate program?”

The evaluation team also asked respondents for program improvement suggestions. Of the 20 who gave

suggestions, respondents most commonly recommended increasing marketing (40%), speeding up the

time to receive the rebate (25%), and providing higher incentives (20%), as shown in Figure 21. Direct

customer responses included:

• “Promote the program more so more people are aware of it.”

• “Rebates should arrive sooner.”

• “Increase the amount given.”

• “When I was filling the application, knowing how to fill out the form and where to put the

answers was sometimes confusing.”

• “Add more products.”

Figure 21. Suggestions for Program Improvement

Source: Participant Survey. Question: “Do you have any suggestions for improving NIPSCO’s Energy

Efficiency Rebate program?”

Satisfaction with NIPSCO

As shown in Figure 22, the majority of HVAC Rebates program respondents (92%) were very satisfied or somewhat satisfied with NIPSCO as their energy service provider. In 2018, 58% of respondents were very satisfied and 34% somewhat satisfied with NIPSCO, which were statistically similar to 2017 (65% very satisfied and 26% somewhat satisfied). Only 1% of participants expressed dissatisfaction in 2018.

Figure 22. Satisfaction with NIPSCO

Source: Participant Survey. Question: “How satisfied are you with NIPSCO overall as your utility service

provider?”

Participant Survey Demographics

As part of the participant survey, the evaluation team collected responses on the participants’ home

ownership status, type of residence, and number of people in the home (Table 34).

Table 34. HVAC Rebates Program Respondent Demographics

Demographics | Percentage (n=67)
Home Ownership
  Rent | 0%
  Own | 100%
Type of Residence
  Single-family detached | 94%
  Multi-family apartment or condo (four or more units) | 3%
  Mobile or manufactured home | 3%
Number of People in Home
  One | 14%
  Two | 44%
  Three | 14%
  Four | 18%
  Five | 6%
  More than five | 5%


Conclusions and Recommendations

Conclusion 1: The HVAC Rebates program achieved high customer satisfaction in 2018, especially with its measure offerings.

In 2018, 92% of participant respondents were very satisfied or somewhat satisfied with the HVAC Rebates program, and 91%

were very satisfied with the measures they had installed. However, only 57% of customer respondents

were very satisfied with the incentive amount; this was the lowest percentage of very satisfied responses

for all program aspects investigated in the 2018 evaluation. This result is similar to the 53% in 2017 and

aligns with the incentive levels remaining unchanged between the two program years. To improve the

program, customers recommended increasing the rebate amount (20% of all suggestions). Several

respondents who reported low satisfaction with the program noted the rebate amount as a reason.

Recommendation:

• Review the incentive levels to ensure they are comparable to other similar programs and

consider adjusting the incentive amounts if the program would still be cost-effective.

Conclusion 2: Improvements made to the rebate processing time and application process boosted

customer satisfaction in these two areas.

After dealing with the backlog of application forms in early 2017, Lockheed Martin Energy implemented processes and hired additional staff throughout 2018 to keep to the agreed-upon eight-week deadline for processing applications and sending rebates. This achievement was reflected in the

notable increase in customer satisfaction with the time it took to receive the rebate (62% very satisfied

in 2018, up from 45% in 2017). Despite this improvement, 25% of respondents suggested improving

rebate processing times.

There was also a significant increase in customer satisfaction with the application process (69% very

satisfied in 2018, up from 51% in 2017). The significant increase can be explained by the improvements

NIPSCO and Lockheed Martin Energy made in the application process such as automation around

communication regarding incomplete applications.

Recommendation:

• Apply the improvements made to the application process (e.g., increased communication about payment status) to rebate processing, so customers stay engaged and know when to expect their checks.

Conclusion 3: Secondary smart thermostats provide only a fraction of the savings of primary ones.

Results from other programs indicate that many secondary smart thermostats are either not in use or are given away to other households. In addition, it is reasonable to expect that many of those that are kept control the same system as the first and do not provide additional savings. However, the portion of given-away

thermostats that remain in NIPSCO territory is unknown. The portion of installed second thermostats

controlling the same system as the first is also not known. These issues affected savings from 5.8% of

delivered thermostats in 2018.


Recommendations:

• Conduct survey efforts focused on participants receiving two thermostats, designed to

determine their usage patterns for the second thermostat—what portion are given away, what

portion of those are given away in NIPSCO territory, and what type of system is being controlled

by installed secondary thermostats.

• If low savings for secondary smart thermostats become a concern for overall HVAC Rebates program cost-effectiveness, and curtailing the rebate would not otherwise affect overall program satisfaction, rebates for second thermostats should be curtailed.

Conclusion 4: Ex ante savings are often very different from ex post savings, and also often have not

been updated for multiple program years.

Many measures have ex ante savings that are far less than ex post savings. This is especially true for air

conditioners and air source heat pumps. The baseline and efficient assumptions driving ex ante savings

may be outdated and out of line with those driving ex post savings, which use actual installed

information whenever possible. In addition, furnaces with ECMs do not have ex ante demand savings

even though there have been ex post demand savings for these measures for the past few program

years.

Recommendation:

• Update ex ante savings to generally reflect ex post findings. This will reduce disparities each

program year. In particular, consider doing this for air conditioner and heat pump measures, as

well as for demand savings for smart thermostats, which were sharply reduced this year.


Residential Lighting Program

Through the Lighting program, NIPSCO seeks to reduce electric energy consumption and peak demand

through increased awareness and adoption of energy-efficient lighting technologies. By partnering with

retailers and manufacturers, NIPSCO provides program customers with instant discounts on efficient

lighting purchases that meet standards set forth by the DOE ENERGY STAR program. The Lighting

program promotes customer awareness and purchase of program-discounted products through a range

of marketing and outreach strategies, such as point-of-purchase marketing and promotional materials,

website advertising, and in-store lighting events. NIPSCO also provides program training to store staff at

participating retailers.

In 2018, NIPSCO offered program discounts on standard and specialty LEDs and LED fixtures across a

wide range of applications, package sizes, and wattages. Participating retailers varied and included big-

box stores, do-it-yourself stores, club stores, and discount stores.

Lockheed Martin Energy and CLEAResult (formerly Ecova) administered the program in 2018 and were

responsible for maintaining manufacturer and retailer relationships, providing point-of-purchase

materials and in-store training, conducting in-store promotional events, and overseeing data tracking,

reporting, and invoicing processes.

Program Performance

Throughout 2018, the Lighting program discounted 1,292,186 light bulbs and fixtures, with counts based

on the evaluation team’s audit step, matching the scorecard. Table 35 presents a savings summary for

the program, including program gross savings goals. In terms of ex post gross savings, the program

achieved 137% of the electric energy savings goal and 75% of the peak demand reduction goal.

Table 35. 2018 Lighting Program Savings Summary

Metric | Goal | Ex Ante | Audited | Verified | Ex Post Gross | Ex Post Net | Gross Goal Achievement
Electric Energy Savings (kWh/yr) | 26,171,797 | 24,148,317 | 24,148,317 | 22,447,462 | 35,884,266 | 18,659,818 | 137%
Peak Demand Reduction (kW) | 6,553 | 3,308 | 3,308 | 3,075 | 4,884 | 2,540 | 75%

Table 36 presents the program realization rates and NTG percentage. The evaluation team used the

same NTG as was identified for the 2017 program year.

Table 36. 2018 Lighting Program Adjustment Factors

Metric | Realization Rate (a) | NTG (b)
Electric Energy Savings (kWh/yr) | 149% | 52%
Peak Demand Reduction (kW) | 148% | 52%

(a) The realization rate is defined as ex post gross savings divided by ex ante savings.
(b) The NTG is defined as ex post net savings divided by ex post gross savings.
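These adjustment factors follow directly from the savings columns reported above; a minimal sketch using the 2018 electric energy figures:

```python
# Reproduce the Table 36 adjustment factors and the Table 35 goal
# achievement from the reported 2018 electric energy figures.
goal_kwh = 26_171_797
ex_ante_kwh = 24_148_317
ex_post_gross_kwh = 35_884_266
ex_post_net_kwh = 18_659_818

realization_rate = ex_post_gross_kwh / ex_ante_kwh  # ex post gross / ex ante
ntg = ex_post_net_kwh / ex_post_gross_kwh           # ex post net / ex post gross
goal_achievement = ex_post_gross_kwh / goal_kwh     # ex post gross / goal

print(f"{realization_rate:.0%} {ntg:.0%} {goal_achievement:.0%}")  # 149% 52% 137%
```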


Table 37 presents the 2018 program budget and expenditures. The program spent 88% of its electric

budget.

Table 37. 2018 Lighting Program Expenditures

Fuel | Program Budget | Program Expenditures | Budget Spent (%)
Electric | $4,119,284 | $3,611,711 | 88%

Research Questions

The evaluation team conducted quantitative analyses and interviews with program staff to answer three

research questions:

• How is the program performing relative to its goals?

• What opportunities exist for program improvements?

• What efforts are in place to drive customer participation and awareness?

Impact Evaluation

The Lighting program impact evaluation consisted of several activities:

• Quantitative analysis of the program tracking databases

• Engineering analysis of tracked energy savings and demand reduction

• Application of installation rates established by Opinion Dynamics in 2015.16

• Application of a NTG percentage established in 2017 through demand elasticity modeling to

2018 program sales data

Audited and Verified Savings

To audit energy savings and demand reduction, the evaluation team reviewed the program tracking

database and checked savings estimates and calculations against the Indiana TRM (v2.2) to confirm

accurate application of the assumptions. Following the review, the evaluation team recalculated

program energy savings and demand reduction to account for errors, omissions, and inconsistencies

identified in the program tracking data.

To confirm consistency in the tracking data, the evaluation team audited bulb quantities by comparing

bulb descriptions, numbers of packs, and numbers of units provided in the tracking database. The

evaluation team also validated bulb quantities through an analysis of rebate and buy-down dollar

amounts, and found that the data was accurate, complete, and comprehensive and did not require any

modifications.

16 Opinion Dynamics. July 15, 2015. 2014 Energizing Indiana: Residential Programs Market Effects Study.

Prepared by the Indiana Statewide Core program evaluation team.


The evaluation team thoroughly investigated energy savings and demand reduction assumptions, and

corrected errors or omissions. Throughout this investigation, the evaluation team did not identify any

significant errors that required adjustments to ex ante claimed savings.

The evaluation team estimated ISRs using results from Opinion Dynamics 2015.17 To determine

carryover savings from delayed installation of program lamps, the evaluation team used the UMP-

recommended “Discount Future Savings” method (National Renewable Energy Laboratory/UMP Chapter

21, 2015), which indicated that most bulbs placed in storage (up to 97%) were installed within four years

(including the initial program year), with 24% of bulbs left over from Year 1 installed in Year 2, 24% in

Year 3, and so on. However, given the baseline lighting changes anticipated under EISA 2007, all standard LEDs are expected to effectively function as baseline lamps. Therefore, the evaluation team applied an adjusted lifetime ISR of 92% to standard LEDs and a 96% ISR to specialty and reflector lamps, thus accounting for carryover savings and resulting in a weighted lamp ISR of 93%. LED fixtures retained a 100% ISR, in keeping with prior evaluation years.

Table 38 lists the applied ISRs by product type. Table 39 presents the audited and verified quantities

used to determine per-measure deemed and ex post savings.

Table 38. Lighting Program In-Service Rates

Measure | 2018 In-Service Rate | 2017 In-Service Rate
LED Fixtures | 100% | 100%
LEDs (Standard and Specialty) | 93% | 86%

Source: Opinion Dynamics 2015.

Table 39. Lighting Program Audited and Verified Quantities

Measure | Audited Quantity (a) | In-Service Rate | Verified Quantity
LED – Fixture | 11,799 | 100% | 11,799
LED – General Service | 981,020 | 92% | 902,538
LED – Reflector | 171,617 | 96% | 164,752
LED – Retrofit Kit | 29,412 | 96% | 28,236
LED – Specialty | 98,338 | 96% | 94,404
Total | 1,292,186 | 93% | 1,201,729

(a) Values presented at a measure level represent audited values, since the scorecard provides only measure totals.
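The verified quantities follow mechanically from the audited quantities and the in-service rates; a short sketch reproducing the Table 39 totals:

```python
# Apply the Table 38/39 in-service rates to the audited quantities to
# reproduce the verified quantities and the weighted lamp ISR.
audited = {
    "LED - Fixture":         (11_799, 1.00),
    "LED - General Service": (981_020, 0.92),
    "LED - Reflector":       (171_617, 0.96),
    "LED - Retrofit Kit":    (29_412, 0.96),
    "LED - Specialty":       (98_338, 0.96),
}
verified = {m: round(qty * isr) for m, (qty, isr) in audited.items()}

total_audited = sum(qty for qty, _ in audited.values())
total_verified = sum(verified.values())
weighted_isr = total_verified / total_audited

print(total_verified)         # 1201729
print(f"{weighted_isr:.0%}")  # 93%
```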

Ex Post Gross Savings

The evaluation team determined the program’s ex post gross energy savings and demand

reduction through an engineering analysis. Similar to the ex ante calculations, algorithms included HOU,

17 The evaluation team applied first-year installation rates, derived through the 2014 Market Effects Study

(Opinion Dynamics 2015)—the most current research available from Indiana. More recent studies in Maryland

(86%, 2016) and New Hampshire (87%, 2016) have similar first year LED ISRs. ISRs for LEDs typically range

between 74% (Wyoming, 2016) and 97% (New Hampshire, 2016).


interactive effects, and CFs (for demand reduction) from the Indiana TRM (v2.2), as well as the baseline watts approach prescribed in the most recent version of the UMP. The evaluation team used a range of data sources to ensure it used the most recent and accurate savings

assumptions. Appendix C. Residential Lighting Program Assumptions and Algorithms contains the

detailed equations the evaluation team used to calculate 2018 energy savings and demand reduction for

the program and provides a summary table of savings assumptions, their sources, and how they

compare to the ex ante assumptions.
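The algorithm shape described above is the usual TRM-style one: delta watts times hours of use and interactive effects for energy, and delta watts times a coincidence factor for demand. The sketch below uses purely illustrative input values, not the actual Indiana TRM (v2.2) or Appendix C assumptions:

```python
# Generic TRM-style lighting savings algorithm. All input values below are
# illustrative placeholders, NOT the Indiana TRM (v2.2)/Appendix C assumptions.
def led_savings(base_watts, led_watts, hou_per_day, ief_energy, cf, ief_demand):
    """Annual kWh savings and peak kW reduction for one lamp."""
    delta_kw = (base_watts - led_watts) / 1000
    kwh = delta_kw * hou_per_day * 365 * ief_energy  # energy, with interactive effects
    kw = delta_kw * cf * ief_demand                  # demand, with coincidence factor
    return kwh, kw

# e.g. a 43 W halogen baseline replaced by a 9 W LED
kwh, kw = led_savings(43, 9, 2.8, 1.06, 0.11, 1.11)
```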

Table 40 includes ex post gross energy savings and demand reduction by product type, along with the ex

ante savings assumptions.

Table 40. Lighting Program Ex Ante and Ex Post Gross Per-Measure Savings Values

Measure | Ex Ante (a) Deemed Savings (kWh) | Ex Ante (a) Deemed Savings (kW) | Ex Post Gross (kWh) | Ex Post Gross (kW)
LED - Fixture | 40.3 | 0.006 | 47.3 | 0.006
LED - General Service | 19.2 | 0.003 | 29.4 | 0.004
LED - Reflector | 19.0 | 0.003 | 29.2 | 0.004
LED - Retrofit Kit | 23.3 | 0.003 | 33.3 | 0.005
LED - Specialty | 8.9 | 0.001 | 32.3 | 0.004
Weighted Average | 18.7 | 0.003 | 29.9 | 0.004
a Values presented at a measure level represent audited values, since the scorecard provides only savings totals.

The overall realization rate for the program is 149% for energy savings and 148% for demand reduction

(Table 41 and Table 42). The primary driver of higher realization rates for energy savings and demand

reduction is higher baseline wattage assumptions combined with including more LED specialty lamps in

the program (which have substantially higher per-unit savings than other lamp types).

Table 41. Lighting Program Ex Ante and Ex Post Gross Electric Energy Savings

Measure | Ex Ante Deemed Savings (kWh/unit) | Audited (kWh) | Verified (kWh) | Gross Savings Realization Rate | Ex Post Gross (kWh/unit) | Ex Post Gross (kWh)
LED - Fixture | 40.3 | 475,856 | 475,856 | 117% | 47.3 | 558,440
LED - General Service | 19.2 | 18,848,896 | 17,340,984 | 141% | 29.4 | 26,525,172
LED - Reflector | 19.0 | 3,259,958 | 3,129,560 | 148% | 29.2 | 4,809,870
LED - Retrofit Kit | 23.3 | 685,632 | 658,207 | 137% | 33.3 | 940,865
LED - Specialty | 8.9 | 877,975 | 842,856 | 347% | 32.3 | 3,049,919
Total | -- | 24,148,317 | 22,447,462 | 149% | -- | 35,884,266
Note: Totals may not sum properly due to rounding.


Table 42. Lighting Program Ex Ante and Ex Post Gross Peak Demand Reduction

Measure | Ex Ante Deemed Reduction (kW/unit) | Audited (kW) | Verified (kW) | Gross Reduction Realization Rate | Ex Post Gross (kW/unit) | Ex Post Gross (kW)
LED - Fixture | 0.006 | 65.2 | 65.2 | 117% | 0.006 | 76.1
LED - General Service | 0.003 | 2,582.0 | 2,375.4 | 140% | 0.004 | 3,610.5
LED - Reflector | 0.003 | 446.6 | 428.7 | 147% | 0.004 | 654.7
LED - Retrofit Kit | 0.003 | 93.9 | 90.2 | 136% | 0.005 | 128.1
LED - Specialty | 0.001 | 120.3 | 115.5 | 345% | 0.004 | 415.1
Total | -- | 3,307.9 | 3,075.0 | 148% | -- | 4,884.4
Note: Totals may not sum properly due to rounding.

Ex Post Net Savings
The evaluation team applied a program NTG ratio to ex post gross energy savings to calculate net savings. Generally, an NTG ratio comprises freeridership and spillover estimates. However, the evaluation team developed the NTG ratio for this program using a net-of-freeridership approach known as demand elasticity modeling, which does not produce discrete freeridership and spillover estimates. Demand elasticity modeling relies on the same working theory as the program design: the assumption that changes in price and promotion will affect lamp sales. It creates a counterfactual scenario in which the program does not exist, to identify what measure sales would have been absent the program. This approach generally returns results similar to other methods, has the benefit of theoretical appropriateness, and is used to examine similar programs across the country.
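In outline, a demand elasticity model fits the relationship between price and sales and then predicts counterfactual sales at the undiscounted price. The sketch below is a simplified, single-variable illustration with synthetic data, not the evaluation team's actual model (which is documented in Appendix C):

```python
import numpy as np

def fit_price_elasticity(prices, sales):
    """Fit a constant-elasticity (log-log) demand model: ln(sales) = a + b*ln(price).

    The slope b is the price elasticity of demand.
    """
    b, a = np.polyfit(np.log(prices), np.log(sales), 1)
    return a, b

def net_to_gross(a, b, retail_price, program_sales):
    """Estimate NTG as the share of program sales beyond the counterfactual.

    The counterfactual is predicted sales at the undiscounted retail price.
    """
    counterfactual = np.exp(a + b * np.log(retail_price))
    return 1 - counterfactual / program_sales

# Hypothetical observations of LED price points and sales volumes
prices = np.array([1.0, 1.5, 2.0, 3.0])
sales = 1000.0 * prices ** -1.5          # synthetic data with elasticity -1.5

a, b = fit_price_elasticity(prices, sales)
ntg = net_to_gross(a, b, retail_price=4.0, program_sales=1000.0 * 2.0 ** -1.5)
print(f"elasticity = {b:.2f}, NTG = {ntg:.2f}")  # elasticity = -1.50, NTG = 0.65
```

In practice the model also includes promotion and seasonality terms, and the counterfactual is built from observed undiscounted prices rather than a single assumed retail price.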

The evaluation team applied an NTG ratio of 52% to the 2018 program ex post gross savings. This percentage is based on primary research conducted in NIPSCO territory in 2017. A more detailed discussion of the methodology behind that result is available in Appendix C. Residential Lighting Program Assumptions and Algorithms. Ex post net savings are summarized in Table 43 and Table 44. The evaluation team applied the program-level NTG to the ex post gross energy savings and demand reduction to arrive at the ex post net values.

Table 43. Ex Post Gross and Ex Post Net Savings by Lighting Program Measure

Program Measure | Ex Post Gross (kWh) | Ex Post Gross (kW) | NTG | Ex Post Net (kWh) | Ex Post Net (kW)
LED – Fixture | 558,440 | 76.1 | 52% | 290,389 | 39.6
LED – General Service | 26,525,172 | 3,610.5 | 52% | 13,793,089 | 1,877.4
LED – Reflector | 4,809,870 | 654.7 | 52% | 2,501,133 | 340.4
LED – Retrofit Kit | 940,865 | 128.1 | 52% | 489,250 | 66.6
LED – Specialty | 3,049,919 | 415.1 | 52% | 1,585,958 | 215.9
Total | 35,884,266 | 4,884.4 | 52% | 18,659,818 | 2,539.9
Note: Totals may not sum properly due to rounding.


Table 44. Lighting Program Ex Post Net Savings

Savings Type | Ex Ante Gross Savings | Ex Post Gross Savings | NTG | Net Savings
Electric (kWh) | 24,148,317 | 35,884,266 | 52% | 18,659,818
Demand (kW) | 3,308 | 4,884 | 52% | 2,540
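The net savings calculation itself is a direct multiplication; as a sketch, the energy totals in Table 43 can be reproduced from the per-measure gross values and the 52% NTG:

```python
# Ex post gross energy savings by measure (kWh), from Table 43
gross_kwh = {
    "LED - Fixture": 558_440,
    "LED - General Service": 26_525_172,
    "LED - Reflector": 4_809_870,
    "LED - Retrofit Kit": 940_865,
    "LED - Specialty": 3_049_919,
}
NTG = 0.52  # program-level net-to-gross ratio

# Ex post net savings = ex post gross savings x NTG, measure by measure
net_kwh = {measure: gross * NTG for measure, gross in gross_kwh.items()}
total_net = sum(net_kwh.values())
print(f"Total ex post net savings: {total_net:,.0f} kWh")  # 18,659,818 kWh
```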

Process Evaluation
The evaluation team interviewed NIPSCO program staff to gain insights into program implementation and associated challenges and successes.

Program Design and Delivery
Through the Lighting program, NIPSCO provides an upstream incentive structure for energy-efficient lighting technologies. Through partnerships with manufacturers and area retailers, discounts are automatically applied as price markdowns on customer purchases of efficient lighting products. The discount amount depends on the retailer and the specific product, and discounts can be adjusted throughout the program year.

In 2018, NIPSCO offered discounted energy-efficient lighting through 14 participating retailers at 101 locations throughout its service territory. At each location, the implementer placed in-store signage and point-of-purchase signage, such as stickers in the product aisle, to advertise the price markdown and attribute it to NIPSCO.

NIPSCO’s website, in-store training, and educational lighting events also promoted the program; these

channels helped to inform customers about appropriate bulb types, lumens, and wattages to fit their

lighting needs. Lockheed Martin Energy administered the training and promotional lighting events. To

ensure the program functioned as intended, a field coordinator from Lockheed Martin Energy

performed regular visits to participating retailers, provided the training and promotional events, and

ensured that the retail location offered the correct price markdowns and adequate bulb placements.

Changes from 2017 Design

The 2018 program budget was larger than in 2017, increasing from $3,468,110 to $4,119,284. This

budget allowed NIPSCO to continue discounting bulbs throughout the year with no major adjustments

needed in 2018, whereas in 2016 product discounts and offerings had to be adjusted to sustain the

program budget throughout the full program year. Other program design elements also remained

unchanged from the 2017 program year.

Program Product Mix

Based on the evaluation team’s program audit, NIPSCO discounted 1,292,186 light bulbs and fixtures

through the Lighting program in 2018, up from 1,109,371 in 2017 and 475,002 bulbs in 2016 (only

99,217 of which were LED bulbs). Compared to the 2017 program year, the proportion of standard bulb-

type LEDs sold in 2018 increased by 19 percentage points, while the proportion of specialty bulbs sold

decreased by 13 percentage points, as shown in Figure 23. The proportion of reflector bulbs also decreased in 2018, from 20% to 13%. Retrofit kits and fixture bulbs remained the least frequently sold

LED bulb types, cumulatively accounting for just over 5% of total bulb sales.

Figure 23. Lighting Program LED Audited Product Mix by Bulb Type

Source: Program Tracking Data

Note: Totals may not sum properly due to rounding.

Program Retailer Mix
In 2018, 14 unique retailers participated in the program; Walgreens joined the program and The Salvation Army left it. The majority of sales (61%) occurred through four retailers, compared to 75% of sales through these same retailers in 2017. Table 45 shows the number of bulbs sold by each retailer.


Table 45. Lighting Program Audited Bulb Sales by Retailer

Retailer | 2018 Percentage of Total (n=1,292,186) | 2017 Percentage of Total (n=1,109,371)
The Home Depot | 18% | 20%
Menards | 16% | 14%
Costco | 15% | 26%
Walmart | 11% | 15%
Independents | 8% | 9%
Habitat ReStore | 8% | 2%
Meijer | 6% | 4%
Sam's Club | 5% | 2%
Dollar Tree | 4% | 3%
Walgreens | 2% | n/a
Target | 2% | 2%
Lowe's | 1% | 1%
Goodwill | 1% | 2%
Batteries Plus | <1% | 1%
Total | 100% | 100%
Note: Totals may not sum properly due to rounding.

2017 Recommendation Status

In addition to the research objectives set forth for the 2018 evaluation, the evaluation team followed up on the status of a recommendation made in the 2017 evaluation. Table 46 lists the 2017 Lighting program evaluation recommendation and NIPSCO's progress toward addressing it to date.

Table 46. Status of 2017 Lighting Program Evaluation Recommendation

Summary of 2017 Recommendation | Status
To fully account for program activity, keep a detailed accounting of all program marketing and promotional events. This could increase the accuracy of any future demand elasticity model evaluations of program net impacts. The program implementer should consider storing these data in such a manner that they could be easily provided to the evaluation team for inclusion in future evaluation years. | No Further Action. While the implementer provided additional details in 2018 tracking data, enabling easier evaluation, not all proposed fields were added. However, without a demand elasticity model analysis in 2018, those additional fields were not required and were not requested by the evaluation team.

Conclusions and Recommendations
Conclusion 1: The 2018 Lighting program saw similar bulb sales to the 2017 program year.
Overall, the program functioned well, exceeding its gross energy savings goal for the year but not meeting its demand reduction goal. Neither the implementer nor NIPSCO reported any issues in 2018. The program continues to offer attractive price markdowns for customers.


The Lighting program discounted an audited total of 1,292,186 light bulbs in 2018, compared to

1,109,371 bulbs in 2017 and 475,002 bulbs in 2016. The overall 2018 program budget increased by 19%

to $4,119,284, from $3,468,110 in 2017. The number of participating retailers remained the same

between years, with one retailer joining the program and one retailer leaving the program.

Conclusion 2: The 2018 Lighting program yielded higher evaluated savings realization rates than in 2017. The higher realization rates are attributed to an updated ISR that better accounts for carry-over savings, the UMP approach to calculating per-unit savings ("delta watts"), and the inclusion of more high-savings lamp types, especially specialty and reflector LEDs.

While the program realized 137% of its gross energy savings goal, it achieved only 75% of its demand reduction goal. By comparison, the program's ex ante realization rates (149% for ex post gross energy savings and 148% for ex post demand reduction) improved over 2016 (94% for energy savings and 108% for demand reduction) and 2017 (121% for both energy and demand). This was due to two factors. First, the ISR was updated so that the program received at least partial credit for bulbs that were incentivized in previous years but not installed until the current year, which aligns with other evaluation approaches used in Indiana. Second, the method for determining per-unit evaluated savings used the UMP-recommended lumens bin matching process, which produced more accurate lamp-level results and better accounted for changes in measure mix.

Recommendations:

• Provide an increased focus on, and deeper discounts for, specialty and reflector LED lamps and LED fixtures, as these have the highest per-unit realization rates and are not anticipated to be affected by the EISA 2020 rulemaking to the same extent as general service lamps.

• Use the UMP-recommended lumens binning approach, combined with Indiana TRM (v2.2) values for HOU, WHF, and CF, to generate ex ante savings for each lamp in the program, ensuring that the program gets fuller credit for higher-wattage, specialty, and reflector LEDs and that realization rates are closer to 100%.
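As an illustration of the lumens binning approach, the sketch below maps a lamp's lumen output to the corresponding EISA general-service halogen baseline wattage and computes delta-watts savings. The EISA lumen bins are standard, but the HOU and WHF values shown are placeholders, not the Indiana TRM (v2.2) figures:

```python
# EISA general-service baseline wattages by lumen bin (post-2012 halogen standard)
EISA_BINS = [
    (1490, 2600, 72),   # (min lumens, max lumens, baseline watts)
    (1050, 1489, 53),
    (750, 1049, 43),
    (310, 749, 29),
]

def baseline_watts(lumens):
    """Look up the EISA baseline wattage for a general-service lamp's lumen output."""
    for lo, hi, watts in EISA_BINS:
        if lo <= lumens <= hi:
            return watts
    raise ValueError(f"{lumens} lumens is outside the general-service bins")

def annual_kwh_savings(lumens, led_watts, hou=1000, whf=1.06):
    """Delta-watts savings: (baseline W - LED W) / 1000 x hours of use x waste heat factor.

    hou and whf are illustrative placeholders for the TRM values.
    """
    delta_w = baseline_watts(lumens) - led_watts
    return delta_w / 1000 * hou * whf

# Example: an 800-lumen, 9 W LED is compared against a 43 W halogen baseline
print(round(annual_kwh_savings(800, 9), 2))
```

Binning by lumens rather than assuming a single baseline wattage is what gives higher-output specialty and reflector lamps fuller credit.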


Residential Home Energy Analysis Program
The HEA program provides no-cost, in-home energy assessments to residential customers. During an

assessment, an energy auditor analyzes the efficiency of the heating and cooling systems and insulation

levels in the home and installs energy-saving lighting and water conservation measures. The assessment

concludes with the auditor providing a report of findings and energy-saving recommendations. The

HEA program is designed to help participants improve the efficiency, safety, and comfort of their homes,

as well as deliver an immediate reduction in electricity and/or natural gas consumption and promote

additional efficiency measures through other NIPSCO programs.

Program Performance
The HEA program fell short of its 2018 gross savings goals, achieving 37% of its electric energy (kilowatt-hours per year) savings goal, 30% of its peak demand reduction (kilowatts) goal, and 23% of its natural gas energy (therms per year) savings goal. The program implementer, Lockheed Martin Energy, reported that the program got off to a strong start in 2018 but then saw a marked decline in the second half of the year. It attributed this decline to trade allies prioritizing work for the IQW program, a more limited mix of measures compared to IQW, and not having enough trade allies working with the program. Table 47 summarizes savings for the program, including program savings goals.

Table 47. 2018 HEA Program Savings Summary

Metric | Gross Savings Goal | Ex Ante | Audited | Verified | Ex Post Gross | Ex Post Net | Gross Goal Achievement
Electric Energy Savings (kWh/yr) | 1,160,772 | 450,871 | 451,104 | 417,545 | 424,231 | 325,147 | 37%
Peak Demand Reduction (kW) | 674 | 248 | 248 | 220 | 201 | 175 | 30%
Natural Gas Energy Savings (therms/yr) | 196,763 | 83,323 | 83,370 | 72,825 | 44,704 | 41,217 | 23%

Table 48 outlines the ex post gross and NTG adjustment factors.

Table 48. 2018 HEA Program Adjustment Factors

Metric | Realization Rate (%) (a) | Freeridership | Spillover | NTG (%) (b)
Electric Energy Savings (kWh/yr) | 94% | 23% | N/A | 77%
Peak Demand Reduction (kW) | 81% | 13% | N/A | 87%
Natural Gas Energy Savings (therms/yr) | 54% | 8% | N/A | 92%
a Realization rate is defined as ex post gross savings divided by ex ante savings.
b NTG is defined as ex post net savings divided by ex post gross savings.
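The adjustment-factor definitions in the table footnotes can be checked directly against the Table 47 values; a quick sketch:

```python
def realization_rate(ex_post_gross, ex_ante):
    """Realization rate = ex post gross savings / ex ante savings."""
    return ex_post_gross / ex_ante

def ntg_ratio(ex_post_net, ex_post_gross):
    """NTG = ex post net savings / ex post gross savings."""
    return ex_post_net / ex_post_gross

# (ex ante, ex post gross, ex post net) from Table 47
metrics = {
    "Electric (kWh/yr)": (450_871, 424_231, 325_147),
    "Demand (kW)": (248, 201, 175),
    "Gas (therms/yr)": (83_323, 44_704, 41_217),
}
for name, (ex_ante, gross, net) in metrics.items():
    print(f"{name}: RR = {realization_rate(gross, ex_ante):.0%}, "
          f"NTG = {ntg_ratio(net, gross):.0%}")
```

Running this reproduces the 94%/81%/54% realization rates and 77%/87%/92% NTG ratios shown in Table 48.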

Table 49 lists the 2018 program budget and expenditures by fuel type. The program spent 40% of its

electric budget and 43% of its natural gas budget, which aligns with its program ex ante savings goal

attainment.


Table 49. 2018 HEA Program Expenditures

Fuel | Program Budget | Program Expenditures | Budget Spent (%)
Electric | $521,661 | $208,900 | 40%
Natural Gas | $678,249 | $293,558 | 43%

Research Questions
The evaluation team conducted qualitative and quantitative research activities to answer the following key research questions for the program:

• What was the customer experience with the program, from sign-up through completion?

• How did customers become aware of the program?

• What were customer motivations for participation?

• How satisfied were customers with the program, including the participation process,

interactions with the program implementer, and satisfaction with each piece of equipment

received?

• How useful were the written reports that customers received after the assessment?

• Are customer homes more comfortable after the assessment?

• Are utility bills lower after the assessment?

• What do participants recommend for program improvement?

• During the assessment, were customers given additional energy savings tips that they have put

into practice? If yes, what have they done?

• As a result of the assessment, did customers install any other measures for which they did not

receive a utility rebate? If so, what were they?

• What are the program’s verified measure installations?

• To what extent did freeridership affect the program?

Impact Evaluation
This section details each step of the impact evaluation and its associated electric energy savings, peak demand reduction, and natural gas savings.

Audited and Verified Savings
To determine the audited quantity and savings, the evaluation team first reviewed the program tracking database for duplicates or other data quality issues. The evaluation team then looked for any discrepancies between the program tracking data and the program scorecard and found none that required adjustments to savings or quantities. The evaluation team did find a few instances where the description of a home's heating or cooling system, included as part of the measure name, did not align with what was stored separately in the heating system or AC type fields. Per guidance received in 2017 from the program implementer, the evaluation team treated the description included in the measure name as the correct type when discrepancies existed. This issue did not affect savings or quantities.

The evaluation team also compared application materials and participant survey responses to the program tracking database and verified measure eligibility. There was one area of difference between the ex ante scorecard quantity and the audited quantity: respondents verified more units through the participant survey than the program reported, indicating untracked installations affecting four measures (LEDs, bathroom faucet aerators, kitchen faucet aerators, and low-flow showerheads). While this resulted in only minor increases in the number of units installed, it may indicate a larger problem and additional untracked savings that the program might claim.

The evaluation team established ISRs for all measures through the 2018 participant survey. Consistent with the method established in the 2015 evaluation and used in the 2016 and 2017 evaluations, the evaluation team used the percentage of customers who recalled receiving an assessment report as the ISR for the energy assessment recommendations measure.

The evaluation sample and survey approach resulted in relatively few survey completes in many measure categories. To increase confidence in the applied percentage, the evaluation team calculated a weighted average of the ISRs established through the 2018 and 2017 participant surveys, as the same program trade allies installed the same program measures in both years. Table 50 lists the ISRs for all program-installed measures.

Table 50. 2018 HEA Program In-Service Rates

Measure | 2018 In-Service Rate | 2018 Sample Size (n) | 2017 In-Service Rate | 2017 Sample Size (n) | Final 2018 In-Service Rate | Total Sample Size (n)
LED | 92.2% | 52 | 95.8% | 68 | 94.2% | 120
Bath Aerator | 93.9% | 24 | 87.8% | 30 | 90.5% | 54
Kitchen Aerator | 90.0% | 20 | 85.2% | 27 | 87.2% | 47
Showerhead | 87.5% | 28 | 86.2% | 29 | 86.8% | 57
Pipe Wrap | 88.9% | 36 | 90.9% | 33 | 89.9% | 69
Filter Whistle (a) | - | 0 | 86.7% | 15 | 86.7% | 15
Duct Sealing | 86.4% | 44 | 88.2% | 51 | 87.4% | 95
Attic Insulation | 100.0% | 4 | 100.0% | 1 | 100.0% | 5
Assessment Recommendations | 85.1% | 67 | 73.5% | 68 | 79.3% | 135
a The program installed a total of 22 filter whistles in 2018, and none were represented in survey responses.
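The final 2018 ISR in Table 50 is a sample-size-weighted average of the two annual survey estimates; a minimal sketch:

```python
def weighted_isr(rate_2018, n_2018, rate_2017, n_2017):
    """Sample-size-weighted average of two survey-based in-service rates."""
    return (rate_2018 * n_2018 + rate_2017 * n_2017) / (n_2018 + n_2017)

# LED row of Table 50: 92.2% (n=52) in 2018 and 95.8% (n=68) in 2017
led_isr = weighted_isr(0.922, 52, 0.958, 68)
print(f"{led_isr:.1%}")  # 94.2%, matching the Final 2018 In-Service Rate column
```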

The ISRs fell below 100% for two reasons: (1) respondents reported removing items after they were installed, and/or (2) respondents reported that a measure reported by the program was not installed. A variety of factors can affect ISRs, such as those documented below. It is important to note that these calculations rely on self-reported surveys. While the evaluation team used these surveys as a basis, it recognizes that self-report errors could have affected the accuracy of the ISRs.


Respondents reported the following reasons for measure-specific removals:

• LEDs: The program reported installing a total of 994 LEDs in the homes of survey respondents. Several respondents reported removing LEDs provided through the program because the bulbs stopped working (n=8, 23 bulbs), were not bright enough (n=1, 3 bulbs), or had an unappealing color (n=1, 1 bulb), or because the respondent broke a bulb (n=1, 1 bulb).

• Kitchen aerators: One respondent removed the kitchen aerator to use a water filter.

• Low-flow showerheads: One respondent removed the showerhead because they did not like it

and one respondent removed a showerhead because they did not like the water pressure.

• Other measures: The ISRs for pipe wrap and duct sealing are lower than 100% due to survey

respondents saying they did not receive that measure or service.

Table 51 summarizes the audited quantity, applied installation rate, and resulting verified quantity per measure. To calculate the verified measure quantity, the evaluation team multiplied the audited measure quantity by the installation rate.

Table 51. 2018 HEA Program Audited and Verified Quantities

Measure | Unit of Measure | Audited Quantity | ISR | Verified Quantity
LED (9 watt) – Dual Fuel | Lamp | 11,440 | 94.2% | 10,781
LED (9 watt) – Electric | Lamp | 765 | 94.2% | 721
Bathroom Aerator – Electric | Aerator | 27 | 90.5% | 24
Bathroom Aerator – Natural Gas | Aerator | 562 | 90.5% | 509
Kitchen Aerator – Electric | Aerator | 15 | 87.2% | 13
Kitchen Aerator – Natural Gas | Aerator | 255 | 87.2% | 222
Low-Flow Showerhead – Electric | Showerhead | 16 | 86.8% | 14
Low-Flow Showerhead – Natural Gas | Showerhead | 346 | 86.8% | 300
Pipe Wrap – Electric | Per foot | 157 | 89.9% | 141
Pipe Wrap – Natural Gas | Per foot | 4,067 | 89.9% | 3,655
HVAC Filter Whistle | Filter whistle | 22 | 86.7% | 19
Duct Sealing – Dual Fuel | Participant | 453 | 87.4% | 396
Duct Sealing – Electric Cooling | Participant | 4 | 87.4% | 3
Duct Sealing – Electric | Participant | 16 | 87.4% | 14
Duct Sealing – Natural Gas | Participant | 165 | 87.4% | 144
Attic Insulation – Dual Fuel | Participant | 13 | 100.0% | 13
Attic Insulation – Natural Gas | Participant | 3 | 100.0% | 3
Audit Recommendations – Dual Fuel (a) | Participant | 630 | 79.3% | 499
Audit Recommendations – Electric (a) | Participant | 48 | 79.3% | 38
Audit Recommendations – Natural Gas (a) | Participant | 162 | 79.3% | 128
Total | -- | 19,166 | N/A | 17,639
Note: Totals may not sum properly due to rounding.
a These quantities reflect physical unit counts and therefore do not match scorecard counts, which count each fuel type for dual-fuel measures.
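Verification then scales each audited count by its measure ISR; a sketch using the dual-fuel LED row, applying the unrounded weighted ISR (94.24% rather than the displayed 94.2%):

```python
def verified_quantity(audited, isr):
    """Verified quantity = audited quantity x in-service rate, rounded to whole units."""
    return round(audited * isr)

# Dual-fuel LEDs: 11,440 audited units at the unrounded weighted LED ISR
print(verified_quantity(11_440, 0.9424))  # 10781, matching Table 51
```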


Ex Post Gross Savings
The evaluation team used two methods to determine ex post gross savings for the HEA program: billing analysis and engineering review. The evaluation team used billing analysis to estimate the ex post gross natural gas savings from duct sealing. This methodology is appropriate, and potentially more accurate than engineering analysis, because duct sealing is a weather-sensitive measure with a reasonable sample size and claimed savings approximately equal to or greater than 10% of annual energy usage. For the remaining measures, and for the energy savings and demand reduction associated with duct sealing, the evaluation team used an engineering review.

Billing Analysis

The evaluation team used billing analysis to estimate ex post gross natural gas savings from duct sealing installed through the HEA program. The evaluation team pursued billing analysis for HEA because it offers an additional level of rigor over engineering review, which the team had used to evaluate HEA in previous program years due to insufficient sample sizes and/or billing data. The billing analysis methodology is described in greater detail in Appendix E. Residential Home Energy Analysis Program Billing Analysis Methodology.

As shown in Table 52, the billing analysis yielded site-level savings of 56.43 therms per year, compared to ex ante savings of 113.86 therms per year. At the 90% confidence level, the results show an error band of ±20.8%, which means the savings could be as high as roughly 68 therms or as low as roughly 45 therms. The difference between ex post gross and ex ante per-measure savings was partially due to both ISRs and previously suggested adjustments to claimed savings accounted for in the 2017 ex post gross per-measure savings recommendations. However, the billing analysis results suggest that other factors are also reducing savings. These may include customer behavior, relatively efficient baseline conditions of HVAC equipment, installation quality, or interactive effects. Additional research would be needed to draw conclusions about the factors driving this result.
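The confidence bounds follow directly from the point estimate and the relative error band; a quick arithmetic sketch:

```python
def confidence_bounds(point_estimate, relative_precision):
    """Bounds of a symmetric confidence interval given a relative error band."""
    return (point_estimate * (1 - relative_precision),
            point_estimate * (1 + relative_precision))

# Duct sealing billing analysis: 56.43 therms/yr with a 20.8% error band at 90% confidence
low, high = confidence_bounds(56.43, 0.208)
print(f"{low:.1f} to {high:.1f} therms at 90% confidence")
```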

Table 52. Ex Post Gross Natural Gas Savings Comparison for HEA Duct Sealing

Measure | Sample Size | Ex Ante Per-Measure (therms) | 2017 Ex Post Gross Per-Measure (therms) | Two-Year Weighted In-Service Rate (a) | 2017 Ex Post Gross Per-Measure after ISR | 2018 Ex Post Gross Per-Measure (therms)
Prescriptive Duct Sealing – Natural Gas | 426 | 113.9 | 92.7 | 87.4% | 81.0 | 56.4 ± 20.8%
Duct Sealing Package – Natural Gas and Electric | 426 | 113.9 | 94.1 | 87.4% | 82.3 | 56.4 ± 20.8%
Note: The sample size and the 2018 ex post gross value apply to the two duct sealing measures combined.
a As shown in Table 50.

Engineering Review

The evaluation team referred to the Indiana TRM (v2.2) for variable assumptions to calculate ex post gross electric energy savings, demand reduction, and natural gas savings. Where data were unavailable in the Indiana TRM (v2.2), the evaluation team used data from the Illinois TRM (v6.0), the 2016 Pennsylvania TRM, or other secondary sources. The evaluation team revised assumptions for savings estimates applicable to the NIPSCO service territory, as needed. Appendix D. Residential Home Energy Analysis Program Assumptions and Algorithms contains more details on the specific algorithms, variable assumptions, and references for all program measure ex post gross calculations.

Through the engineering review, the evaluation team uncovered some significant differences between ex ante and ex post gross savings. These differences were primarily driven by the following three overarching factors:

• The evaluation team calculated ex post gross savings for most measures using the Indiana TRM (v2.2). The planning and reporting assumptions NIPSCO used to calculate ex ante savings referenced the Indiana TRM (v2.2) and the 2015 evaluation, measurement, and verification (EM&V) results, and sometimes included an average of the savings values from each source.

• The evaluation team used specific characteristics of installed measures provided within the tracking data or program application materials, such as pre- and post-installation R-values, square footage, and project location, in ex post gross savings. Ex ante savings were calculated using savings values from past studies. Calculations using actual participant data were invariably different from deemed values.

• The evaluation team also used geolocation for each customer address in the database, then

matched each address with the closest city from the Indiana TRM (v2.2)—for example, South

Bend and Fort Wayne—to more precisely account for variations in climate for measures

including LED bulbs, faucet aerators, low-flow showerheads, duct sealing, and attic insulation.

Sources for the ex post gross savings calculation for each measure are provided in Appendix D.

Residential Home Energy Analysis Program Assumptions and Algorithms.
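The nearest-city matching step described in the last bullet can be sketched as a great-circle distance lookup; the city coordinates below are approximate and for illustration only:

```python
from math import asin, cos, radians, sin, sqrt

# Approximate coordinates for Indiana TRM reference cities (illustrative values)
TRM_CITIES = {
    "South Bend": (41.68, -86.25),
    "Fort Wayne": (41.08, -85.14),
    "Indianapolis": (39.77, -86.16),
    "Evansville": (37.97, -87.57),
}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def closest_trm_city(lat, lon):
    """Match a geocoded customer address to the nearest TRM reference city."""
    return min(TRM_CITIES, key=lambda c: haversine_km(lat, lon, *TRM_CITIES[c]))

# A customer near Merrillville maps to South Bend weather assumptions
print(closest_trm_city(41.48, -87.33))  # South Bend
```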

Table 53 highlights notable differences between ex ante and ex post gross estimates.

Table 53. 2018 HEA Program Notable Differences Between Ex Ante and Ex Post Gross

Measure | Ex Ante Sources and Assumptions | Ex Post Gross Sources and Assumptions | Primary Reasons for Differences
LEDs | Ex ante savings are an average of savings values from these sources: (1) HEA EM&V; (2) Indiana TRM (v2.2), with a 60-watt base; and (3) Indiana TRM (v2.2), with a 30.69-watt base. | Indiana TRM (v2.2) and information in program tracking data. Baseline wattage value per the Indiana TRM (v2.2) and WHFs averaged across customer location, per customer type. | kWh and kW savings align with ex ante, but the therm penalty differs substantially. Without the actual calculations, the reason for the difference in therm savings could not be determined.
Low-Flow Showerhead | Roughly 90/10 split between Indiana TRM (v2.2) and 2015 EM&V values; assumed unknown housing type for people per household, faucets per household, and 1.74 gpm-eff. | Indiana TRM (v2.2) and participant information in program tracking data; assumed single-family housing; cold water inlet temperature averaged across customer location and customer type; actual rating for gpm-eff of 1.5. | Water temperatures, gpm assumptions, people per household, and showerheads per household.
Pipe Wrap | Average of Indiana TRM (v2.2) and 2015 EM&V savings values for natural gas and electric water heaters. | Indiana TRM (v2.2); entering water temperature averaged across customer location. | Entering water temperature and other unknown assumptions.
Filter Whistle | 2016 Pennsylvania TRM, Pittsburgh deemed savings, ISR = 0.474. | 2016 Pennsylvania TRM, with Indiana TRM (v2.0) full load heating and cooling hours, and ISR = 1. | ISR and full load heating and cooling hours.
Duct Sealing | 2015 EM&V. | Indiana TRM (v2.2) and 2016 Pennsylvania TRM referenced for equipment capacity values in Btu/h, as well as SEER; full load heating and cooling hours averaged across customer location, per customer type. Natural gas savings: used billing analysis to determine ex post gross. | Energy savings and demand reduction: equipment capacity ratings and full load hours. Natural gas: ex ante savings may be overstated for the same reasons the billing analysis found much lower therm savings.
Attic Insulation | Indiana TRM (v2.2) and 2015 EM&V; assumed 1,000 sq ft, South Bend. | Indiana TRM (v2.2) and customer applications' pre-R and square footage; R-value interpolation and kWh/ksf, kW/ksf, and MMBtu/ksf averaged across customer location, per customer type. | Customer type, customer location, and square footage.

Ex Post Gross Savings

Table 54 shows the ex ante deemed savings and ex post gross per-measure savings for 2018

HEA program measures. The ex ante and ex post gross units differed for the attic insulation measures,

with ex ante savings claimed per participant and ex post gross savings claimed per square foot of

installed insulation. This difference limits the comparability of the ex ante and ex post gross per-measure

insulation savings.

Table 54. 2018 HEA Program Ex Ante and Ex Post Gross Per-Measure Savings Values

Measure | Units | Ex Ante (a) kWh | Ex Ante (a) kW | Ex Ante (a) therms | Ex Post Gross kWh | Ex Post Gross kW | Ex Post Gross therms
LED (9 watt) – Dual Fuel | Lamp | 29.0 | 0.005 | (0.05) | 28.52 | 0.004 | (0.58) (d)
LED (9 watt) – Electric | Lamp | 29.0 | 0.005 | - | 28.52 | 0.004 | -
Bath Aerator – Electric | Aerator | 41.6 | 0.011 | - | 35.39 | 0.003 | -
Bath Aerator – Natural Gas | Aerator | - | - | 1.85 | - | - | 1.50
Kitchen Aerator – Electric | Aerator | 182.0 | 0.008 | - | 182.12 | 0.008 | -
Kitchen Aerator – Natural Gas | Aerator | - | - | 8.01 | - | - | 7.97
Low-Flow Showerhead – Electric | Showerhead | 271.5 | 0.013 | - | 350.23 | 0.017 | -
Low-Flow Showerhead – Natural Gas | Showerhead | - | - | 11.94 | - | - | 15.23
Pipe Wrap – Electric | Per foot | 18.6 | 0.002 | - | 22.25 | 0.003 | -
Pipe Wrap – Natural Gas | Per foot | - | - | 0.82 | - | - | 0.99
HVAC Filter Whistle | Filter whistle | 58.0 | 0.023 | - | 139.35 | 0.049 | -
Duct Sealing – Dual Fuel | Participant | 120.7 | 0.374 | 113.86 | 119.52 | 0.354 | 56.43
Duct Sealing – Electric Cooling | Participant | 120.7 | 0.374 | - | 116.18 | 0.354 | -
Duct Sealing – Electric | Participant | 829.4 | 0.374 | - | 1,374.92 | 0.354 | -
Duct Sealing – Natural Gas | Participant | - | - | 113.86 | - | - | 56.43
Attic Insulation – Dual Fuel | Participant/Square foot (b) | 130.0 | 0.054 | 53.60 | 0.17 | 0.000 (c) | 0.15
Attic Insulation – Natural Gas | Participant/Square foot (b) | - | - | 53.60 | 0.06 | - | 0.13
Audit Recommendations – Dual Fuel | Participant | 21.6 | 0.012 | 2.74 | 21.60 | 0.012 | 2.74
Audit Recommendations – Electric | Participant | 21.6 | 0.012 | - | 21.60 | 0.012 | -
Audit Recommendations – Natural Gas | Participant | - | - | 2.74 | - | - | 2.74
Note: Totals may not sum properly due to rounding.
a Values presented at a measure level represent audited values, since the scorecard provides only savings totals.
b Attic insulation savings are per participant for ex ante and per square foot of insulation for ex post gross.
c Ex post gross kilowatts for dual fuel attic insulation is 0.0000816674153592073.
d The therm penalty results from an increased heating load when incandescent bulbs are replaced with LEDs.

Realization Rates

The next three tables (Table 55 through Table 57) show the program’s ex ante reported savings, verified

savings, and ex post gross savings. The program achieved electric energy, demand reduction, and natural

gas energy realization rates of 94%, 81%, and 54%, respectively.

With a realization rate of 94%, the ex post gross electric energy savings were similar to ex ante savings at the program level. The main driver of the electric energy realization rate was the ISRs applied during the verification step. Higher ex post gross electric energy (kilowatt-hour) savings values for showerheads, pipe wrap, filter whistles, attic insulation, and duct sealing in homes with electric heat helped offset the difference between ex post gross and ex ante savings.

With a realization rate of 81%, the ex post gross demand savings were somewhat lower than ex ante savings at the program level. This difference is driven in roughly equal parts by the reduction in the verified number of units across all measures (particularly dual fuel duct sealing) and by the lower per-unit demand reduction for the LEDs and dual fuel duct sealing.

The realization rate of 54% for natural gas savings is primarily attributable to ex post gross savings for duct sealing in homes with natural gas heat that were substantially lower than the ex ante savings. Additionally, the evaluation team calculated a higher therm penalty for the LED bulbs and included natural gas energy savings for the filter whistle measure, which was not assigned ex ante natural gas energy (therm) savings.
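A realization rate is simply the ratio of program-level ex post gross savings to ex ante reported savings. A minimal sketch (illustrative only, not the evaluation team's code), using the program totals reported in Tables 55 through 57:

```python
# Realization rate = ex post gross savings / ex ante reported savings.
# Program-level totals from Tables 55-57 of this report.
EX_ANTE = {"kWh": 450_872, "kW": 247.81, "therms": 83_323}
EX_POST_GROSS = {"kWh": 424_231, "kW": 200.87, "therms": 44_704}

def realization_rate(ex_post: float, ex_ante: float) -> float:
    """Return ex post gross savings as a fraction of ex ante savings."""
    return ex_post / ex_ante

for metric, ex_ante in EX_ANTE.items():
    rate = realization_rate(EX_POST_GROSS[metric], ex_ante)
    print(f"{metric}: {rate:.1%}")  # kWh: 94.1%, kW: 81.1%, therms: 53.7%
```

The printed ratios reproduce the 94%, 81%, and 54% program-level rates cited above.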


Table 55. 2018 HEA Program Ex Ante and Ex Post Gross Electric Energy Savings

Measure | Ex Ante(a) Deemed (kWh/yr/unit) | Ex Ante(a) (kWh/yr) | Audited Gross (kWh/yr) | Verified Gross (kWh/yr) | Ex Post Gross Per-Measure (kWh/yr/unit) | Ex Post Gross (kWh/yr)
LED (9 watt) - Dual Fuel | 29.0 | 331,528 | 331,760 | 312,651 | 28.5 | 307,489
LED (9 watt) - Electric | 29.0 | 22,185 | 22,185 | 20,907 | 28.5 | 20,562
Bath Aerator - Electric | 41.6 | 1,123 | 1,123 | 1,017 | 34.4 | 841
Bath Aerator - Natural Gas | - | - | - | - | - | -
Kitchen Aerator - Electric | 182.0 | 2,730 | 2,730 | 2,382 | 182.1 | 2,383
Kitchen Aerator - Natural Gas | - | - | - | - | - | -
Low-Flow Showerhead - Electric | 271.5 | 4,344 | 4,344 | 3,772 | 350.2 | 4,866
Low-Flow Showerhead - Natural Gas | - | - | - | - | - | -
Pipe Wrap - Electric | 18.6 | 2,920 | 2,920 | 2,624 | 22.3 | 3,139
Pipe Wrap - Natural Gas | - | - | - | - | - | -
HVAC Filter Whistle | 58.0 | 1,276 | 1,276 | 1,106 | 139.4 | 2,657
Duct Sealing - Dual Fuel | 120.7 | 54,677 | 54,677 | 47,771 | 119.5 | 47,306
Duct Sealing - Electric Cooling | 120.7 | 483 | 483 | 422 | 116.2 | 406
Duct Sealing - Electric | 829.4 | 13,270 | 13,270 | 11,594 | 1,374.9 | 19,220
Duct Sealing - Natural Gas | - | - | - | - | - | -
Attic Insulation - Dual Fuel | 130.0 | 1,690 | 1,690 | 1,690 | 0.2 | 3,752
Attic Insulation - Natural Gas | - | - | - | - | - | -
Audit Recommendations - Dual Fuel | 21.6 | 13,608 | 13,608 | 10,787 | 21.6 | 10,787
Audit Recommendations - Electric | 21.6 | 1,037 | 1,037 | 822 | 21.6 | 822
Audit Recommendations - Natural Gas | - | - | - | - | - | -
Total Savings | N/A | 450,872 | 451,104 | 417,545 | N/A | 424,231
Total Program Realization Rate: 94.1%

Note: Totals may not sum properly due to rounding.
(a) Values presented at a measure level represent audited values since the scorecard provides only savings totals.


Table 56. 2018 HEA Program Ex Ante and Ex Post Gross Peak Demand Reduction

Measure | Ex Ante(a) Deemed (kW/unit) | Ex Ante(a) (kW) | Audited Gross (kW) | Verified Gross (kW) | Ex Post Gross Per-Measure (kW/unit) | Ex Post Gross (kW)
LED (9 watt) - Dual Fuel | 0.005 | 57.16 | 57.20 | 53.91 | 0.004 | 41.85
LED (9 watt) - Electric | - | 3.88 | 3.83 | 3.60 | 0.004 | 2.80
Bath Aerator - Electric | 0.011 | 0.30 | 0.30 | 0.27 | 0.003 | 0.08
Bath Aerator - Natural Gas | - | - | - | - | - | -
Kitchen Aerator - Electric | 0.008 | 0.12 | 0.12 | 0.10 | 0.008 | 0.11
Kitchen Aerator - Natural Gas | - | - | - | - | - | -
Low-Flow Showerhead - Electric | 0.013 | 0.20 | 0.21 | 0.18 | 0.017 | 0.24
Low-Flow Showerhead - Natural Gas | - | - | - | - | - | -
Pipe Wrap - Electric | 0.002 | 0.32 | 0.31 | 0.28 | 0.003 | 0.36
Pipe Wrap - Natural Gas | - | - | - | - | - | -
HVAC Filter Whistle | 0.023 | 0.51 | 0.51 | 0.44 | 0.049 | 0.93
Duct Sealing - Dual Fuel | 0.374 | 169.33 | 169.42 | 148.02 | 0.354 | 140.12
Duct Sealing - Electric Cooling | 0.374 | 1.49 | 1.50 | 1.31 | 0.354 | 1.24
Duct Sealing - Electric | 0.374 | 5.75 | 5.75 | 5.02 | 0.354 | 4.95
Duct Sealing - Natural Gas | - | - | - | - | - | -
Attic Insulation - Dual Fuel | 0.054 | 0.70 | 0.70 | 0.70 | 0.000(b) | 1.75
Attic Insulation - Natural Gas | - | - | - | - | - | -
Audit Recommendations - Dual Fuel | 0.012 | 7.49 | 7.56 | 5.99 | 0.012 | 5.99
Audit Recommendations - Electric | 0.012 | 0.57 | 0.58 | 0.46 | 0.012 | 0.46
Audit Recommendations - Natural Gas | - | - | - | - | - | -
Total Savings | N/A | 247.81 | 247.97 | 220.29 | N/A | 200.87
Total Program Realization Rate: 81.1%

Note: Totals may not sum properly due to rounding.
(a) Values presented at a measure level represent audited values since the scorecard provides only savings totals.
(b) Ex post gross per-measure kW for dual fuel attic insulation is 0.0000816674153592073.


Table 57. 2018 HEA Program Ex Ante and Ex Post Gross Natural Gas Energy Savings

Measure | Ex Ante(a) Deemed (therms/yr/unit) | Ex Ante(a) (therms/yr) | Audited Gross (therms/yr) | Verified Gross (therms/yr) | Ex Post Gross Per-Measure (therms/yr/unit) | Ex Post Gross (therms/yr)
LED (9 watt) | (0.05)(b) | (572) | (572) | (539) | (0.58) | (6,282)
LED (9 watt) - Electric | - | - | - | - | - | -
Bath Aerator - Electric | - | - | - | - | - | -
Bath Aerator - Natural Gas | 1.85 | 1,032 | 1,040 | 941 | 1.50 | 762
Kitchen Aerator - Electric | - | - | - | - | - | -
Kitchen Aerator - Natural Gas | 8.01 | 2,027 | 2,043 | 1,782 | 7.97 | 1,773
Low-Flow Showerhead - Electric | - | - | - | - | - | -
Low-Flow Showerhead - Natural Gas | 11.94 | 4,107 | 4,131 | 3,588 | 15.23 | 4,578
Pipe Wrap - Electric | - | - | - | - | - | -
Pipe Wrap - Natural Gas | 0.82 | 3,335 | 3,335 | 2,997 | 0.99 | 3,625
HVAC Filter Whistle | - | - | - | - | - | -
Duct Sealing - Dual Fuel | 113.86 | 51,579 | 51,579 | 45,064 | 56.43 | 25,563
Duct Sealing - Electric Cooling | - | - | - | - | - | -
Duct Sealing - Electric | - | - | - | - | - | -
Duct Sealing - Natural Gas | 113.86 | 18,787 | 18,787 | 16,414 | 56.43 | 9,311
Attic Insulation - Dual Fuel | 53.60 | 697 | 697 | 697 | 0.15 | 3,286
Attic Insulation - Natural Gas | 53.60 | 161 | 161 | 161 | 0.13 | 369
Audit Recommendations - Dual Fuel | 2.74 | 1,726 | 1,726 | 1,368 | 2.74 | 1,368
Audit Recommendations - Electric | - | - | - | - | - | -
Audit Recommendations - Natural Gas | 2.74 | 444 | 444 | 352 | 2.74 | 352
Total Savings | N/A | 83,323 | 83,370 | 72,825 | N/A | 44,704
Total Program Realization Rate: 53.7%

Note: Totals may not sum properly due to rounding.
(a) Values presented at a measure level represent audited values since the scorecard provides only savings totals.
(b) The therm penalty results from an increased heating load when incandescent bulbs are replaced with LEDs.


Ex Post Net Savings

The evaluation team based the NTG ratio on self-reported responses to participant survey questions.

This survey battery asked what customers would have done in the absence of the program and the

influence the program had on their decision to install the energy-efficient measures.18 For all measures,

the questions addressed the likelihood that participants would have changed their equipment to energy-

efficient equipment in the absence of the program and the timing associated with this change. For LEDs,

the evaluation team also considered the presence of LEDs already in the home. Because the program

measures were direct-installed, the influence of the program on a customer’s decision to install the

measures does not apply as it does in other rebate programs.

Participant spillover calculates the savings that result from purchases and actions taken outside of the

program due to program influence. Because the evaluation team captured savings for energy-saving

behavior and/or subsequent installation of energy-efficient equipment associated with the energy

assessment recommendations measure, calculating participant spillover would be redundant to those

savings. Therefore, spillover is not included in the NTG ratio for the HEA program (consistent with 2015,

2016, and 2017 methods).

To calculate the NTG ratio, the evaluation team used responses from 2018 participant survey questions

asking if participants would have installed each measure without the program and, if so, when they

would have done so. Most surveyed participants reported that they were not planning to make energy

efficiency upgrades, while those planning upgrades said they would not have done so right away. For

most measures, the evaluation team calculated a weighted average of the two previous evaluation years (2017 and 2018) to arrive at the final NTG ratio used in the impact evaluation; pooling the two survey samples increases confidence in the applied ratio. There were two exceptions. The attic insulation ratio was based solely on 2018 participants because the one 2017 participant reported they would have installed insulation without the program, and the filter whistle ratio relied on 2017 responses because the measure had very few installations in 2018 and was not represented in the 2018 survey sample.

As in the 2015, 2016, and 2017 evaluations, the evaluation team used an NTG ratio of 100% for the

assessment recommendations measure because participants would not have received the

recommendations if they had not participated in the program. Table 58 shows the NTG ratios by

measure.

18 Refer to Appendix B. Self-Report Net-to-Gross Evaluation Methodology for more information on using the self-

report method to calculate NTG.


Table 58. HEA Program Net-to-Gross Ratios by Measure

Measure | 2018 NTG (n) | 2017 NTG (n) | Final 2018 NTG (n)
LED | 72% (50) | 73% (59) | 72% (109)
Duct sealing | 96% (41) | 87% (45) | 91% (86)
Filter whistle | -(a) | 92% (12) | 92% (12)
Pipe wrap | 93% (34) | 71% (32) | 82% (66)
Bathroom aerator | 96% (27) | 93% (28) | 94% (55)
Kitchen aerator | 88% (21) | 81% (26) | 84% (47)
Showerhead | 91% (30) | 88% (23) | 89% (53)
Attic Insulation | 83% (3) | 0% (1) | 83% (4)

(a) The program installed a total of 22 filter whistles in 2018 and none were represented in survey responses.
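The final ratios pool the 2017 and 2018 survey waves, weighting each year's ratio by its number of responses. A minimal sketch of that weighting (illustrative only, not the evaluation team's code), using the duct sealing values from Table 58:

```python
# Pool per-year NTG ratios, weighting each by its survey response count (n).
def pooled_ntg(waves: dict[int, tuple[float, int]]) -> float:
    """waves maps year -> (NTG ratio, number of responses)."""
    total_n = sum(n for _, n in waves.values())
    return sum(ntg * n for ntg, n in waves.values()) / total_n

# Duct sealing, from Table 58: 96% (n=41) in 2018, 87% (n=45) in 2017.
duct_sealing = {2018: (0.96, 41), 2017: (0.87, 45)}
print(f"{pooled_ntg(duct_sealing):.0%}")  # → 91%
```

The pooled value matches the 91% final duct sealing NTG in the table; small differences for other measures can arise from rounding of the displayed per-year ratios.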

Table 59 presents the resulting net electric savings, demand reduction, and natural gas savings.

Table 59. 2018 HEA Program Ex Post Net Savings

Program Measure | Ex Post Gross kWh | Ex Post Gross kW | Ex Post Gross therms | NTG | Ex Post Net kWh | Ex Post Net kW | Ex Post Net therms
LED (9 watt) | 307,489 | 41.9 | (6,282) | 72% | 222,499 | 30.3 | (4,546)(a)
LED (9 watt) - Electric | 20,562 | 2.8 | - | 72% | 14,879 | 2.0 | -
Bath Aerator - Electric | 841 | 0.1 | - | 94% | 793 | 0.1 | -
Bath Aerator - Natural Gas | - | - | 762 | 94% | - | - | 719
Kitchen Aerator - Electric | 2,383 | 0.1 | - | 84% | 2,003 | 0.1 | -
Kitchen Aerator - Natural Gas | - | - | 1,773 | 84% | - | - | 1,490
Low-Flow Showerhead - Electric | 4,866 | 0.2 | - | 89% | 4,355 | 0.2 | -
Low-Flow Showerhead - Natural Gas | - | - | 4,578 | 89% | - | - | 4,096
Pipe Wrap - Electric | 3,139 | 0.4 | - | 82% | 2,579 | 0.3 | -
Pipe Wrap - Natural Gas | - | - | 3,625 | 82% | - | - | 2,978
HVAC Filter Whistle | 2,657 | 0.9 | - | 92% | 2,436 | 0.9 | -
Duct Sealing - Dual Fuel | 47,306 | 140.1 | 25,563 | 91% | 43,020 | 127.4 | 23,247
Duct Sealing - Electric Cooling | 406 | 1.2 | - | 91% | 369 | 1.1 | -
Duct Sealing - Electric | 19,220 | 5.0 | - | 91% | 17,479 | 4.5 | -
Duct Sealing - Natural Gas | - | - | 9,311 | 91% | - | - | 8,467
Attic Insulation - Dual Fuel | 3,752 | 1.8 | 3,286 | 83% | 3,127 | 1.5 | 2,738
Attic Insulation - Natural Gas | - | - | 369 | 83% | - | - | 307
Audit Recommendations - Dual Fuel | 10,787 | 6.0 | 1,368 | 100% | 10,787 | 6.0 | 1,368
Audit Recommendations - Electric | 822 | 0.5 | - | 100% | 822 | 0.5 | -
Audit Recommendations - Natural Gas | - | - | 352 | 100% | - | - | 352
Program Total | 424,231 | 201 | 44,704 | - | 325,147 | 174.8 | 41,217

Note: Totals may not sum properly due to rounding.
(a) The therm penalty results from an increased heating load when incandescent bulbs are replaced with LEDs.

Table 60 shows the NTG results for each fuel. After calculating freeridership at the fuel level, the evaluation team applied savings-weighted freeridership, based on MMBtu, to develop a program-level NTG of 88% for 2018 (Table 61).

Table 60. HEA Program Net-to-Gross Results by Fuel Type

Savings Type | Ex Ante(a) Gross Savings | Ex Post Gross Savings | NTG Ratio (%) | Ex Post Net Savings
Electric Energy Savings (kWh/yr) | 450,871 | 424,231 | 77% | 325,147
Peak Demand Reduction (kW) | 248 | 201 | 87% | 175
Natural Gas Energy Savings (therms/yr) | 83,323 | 44,704 | 92% | 41,217

(a) Values presented at a measure level represent audited values since the scorecard provides only savings totals.

Table 61. HEA Program-Level NTG

Responses (n) | Freeridership (%) | Spillover (%) | NTG Ratio (%)
134 | 12% | N/A | 88%
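The MMBtu weighting can be sketched as follows (illustrative only, not the evaluation team's code), using the fuel-level NTG ratios and ex post gross savings from Table 60 and standard site-energy conversions (1 kWh = 0.003412 MMBtu; 1 therm = 0.1 MMBtu):

```python
# Weight fuel-level NTG ratios by ex post gross energy savings in MMBtu.
KWH_TO_MMBTU = 0.003412
THERM_TO_MMBTU = 0.1

electric_mmbtu = 424_231 * KWH_TO_MMBTU   # ex post gross kWh (Table 60)
gas_mmbtu = 44_704 * THERM_TO_MMBTU       # ex post gross therms (Table 60)

program_ntg = (0.77 * electric_mmbtu + 0.92 * gas_mmbtu) / (
    electric_mmbtu + gas_mmbtu
)
print(f"{program_ntg:.0%}")  # → 88%, matching Table 61
```

Because the natural gas savings carry far more MMBtu than the electric savings, the program-level NTG lands near the 92% gas ratio rather than midway between the two fuels.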

Process Evaluation

This section describes the evaluation team’s process evaluation findings derived from conducting

interviews, reviewing program materials, and surveying program participants.

The evaluation team conducted interviews with NIPSCO and Lockheed Martin Energy program staff to

gain insight into program design and operations. The evaluation team also reviewed program

materials—including the NIPSCO website, Comprehensive Home Assessment (CHA) report, and program

marketing materials—to understand the customer-facing information provided by the program.

The evaluation team surveyed 67 program participants to inform the process evaluation research issues,

including source of program awareness, reasons for participation, and satisfaction with HEA and its

offerings.


Program Design and Delivery

Through the HEA program, NIPSCO provides walk-through energy assessments and direct installations of

energy efficiency measures to single-family homeowners or renters (with landlord approval). The

HEA program is designed to help participants improve the efficiency, safety, and comfort of their homes,

as well as deliver an immediate reduction in electricity and/or natural gas consumption and promote

additional efficiency work through other NIPSCO programs. Program staff reported that the primary goals

of the program are to increase customer satisfaction and save customers money on their energy bills.

Lockheed Martin Energy, the program implementer, is responsible for the program design and

management, contractor payment processing, quality assurance and quality control, and contractor

support to facilitate the quality installation of energy-efficient measures.

Lockheed Martin Energy recruits and manages a network of trade allies (program-approved energy

assessors and contractors) to perform the energy assessments and install insulation for the program.

They also provide technical training to the trade allies to ensure work quality and customer service meet

program standards. Lockheed Martin Energy, to date, has provided this training to trade allies

individually after they sign up to participate in the program.

Lockheed Martin Energy and NIPSCO collaborate to promote the program, marketing the program via

direct mail, email, radio, TV, and social media campaigns. Trade allies may also do their own marketing

for the program.

Interested customers can enroll in the HEA program by calling the NIPSCO Residential Energy Efficiency

program hotline or signing up through the website. Participating trade allies are also encouraged to

discuss the program and sign up customers for an assessment while performing other work for

customers.

Energy assessors begin the assessment by walking through the home and recording key housing

characteristics such as square footage, heating and cooling systems, appliance type, number of windows

and doors, and insulation levels. Depending on the conditions and current equipment in the home, the

energy assessor installs any or all of the following measures during the assessment:

• LED bulbs (up to 22)

• Kitchen faucet aerator

• Bathroom faucet aerators (up to 2)

• Low-flow showerheads (up to 2)

• Pipe wrap (up to 10 feet)

• Furnace filter whistle

• Duct sealing

Qualifying program participants can also receive an instant discount of up to $500 on attic insulation

installed through a program contractor. To receive the instant discount, an existing insulation level of


less than R-11 must be improved to R-38 or greater, and participants must call a program participating

insulation contractor and schedule the installation directly.

After the assessment, the energy assessor discusses the findings and recommendations with the participant and provides a CHA report. The CHA report includes detailed information about the home’s

existing conditions, current energy use, heating and cooling loads, as well as a prioritized (low, medium

and high) listing of recommendations specific to the home, with estimated costs and payback periods.

The CHA report also includes a list of zero-cost and low-cost solutions to help lower energy costs, with a

link to NIPSCO’s website. While the recommendations in the assessment report may include measures eligible for incentives through other NIPSCO programs, the report does not provide any information about those incentives.

Changes from 2017 Design

While Lockheed Martin Energy made no changes from the 2017 program design for 2018, it made the

following improvements to program processes:

• Shortened the CHA report significantly from approximately 40 pages to 21 pages for 2018 and is

currently evaluating other platforms for producing an assessment report that may offer

enhanced reporting capabilities.

• Developed a tri-fold for the Energy Efficiency Rebate program that is left behind with program

participants to increase awareness of other NIPSCO programs.

• Implemented procedures to improve the scheduling process, such as sending weekly email

notifications to trade allies and follow-up procedures for hard-to-reach customers. However, the

NIPSCO program manager reported receiving several phone calls during 2018 from customers

who tried to schedule assessments online but were not contacted by the program implementer

to complete scheduling.

2017 Evaluation Recommendation Status

The evaluation team followed up with NIPSCO on the status of the 2017 evaluation recommendations.

Table 62 lists the 2017 HEA evaluation recommendations and progress toward those recommendations

to date.


Table 62. Status of 2017 HEA Program Evaluation Recommendations

Recommendation: Record all values necessary for accurate savings calculations and verification directly in the program tracking database, including the following for all program customers:
• Customer type (combo, electric, or natural gas)
• Number of above-ground stories
• Heating system type and efficiency
• Heating capacity (Btu/h)
• Cooling system type and efficiency
• Cooling system capacity
• (For insulation measures) square footage of installed insulation, pre-install R-value, and post-install R-value
• (For lighting measures) existing bulb type and wattage
Status: Completed in 2019. Customer type, heating system type, and cooling system type are currently reported in the application and LM Captures. Insulation and lighting details are currently recorded in the application. For 2019, these parameters will be entered into the tracking system. Heating and cooling efficiency and system capacity values are drawn from the Indiana TRM (v2.2) and evaluation results to calculate the deemed savings for each measure. These values have been evaluated and produce reasonable deemed savings; therefore, the implementer will not record these variables at the project level.

Recommendation: Consider using blower-door tests to measure air infiltration before and after performing air and duct sealing for more accurate savings calculations, using the modified blower-door subtraction method.
Status: No Further Action. Blower-door testing is performed for air sealing but not duct sealing.

Recommendation: Document the sources, assumptions, and (where possible) algorithms and key inputs included in the derivation of each deemed savings value for all program measures.
Status: Completed in 2019. Sources and assumptions are recorded in the program design and will include 2019 revisions.

Recommendation: Consider the fuel type and heating and cooling systems in measure descriptions and savings calculations. Use the ex post gross per-measure savings values provided in this report, which incorporate these factors, in program planning for measures included in this evaluation.
Status: Completed in 2019. Some measure descriptions and savings have been updated to follow this recommendation. All will be reviewed and updated for 2019.

Recommendation: Educate energy assessors on the importance of accurately recording measures installed, providing the CHA report, and verbally informing customers of the measures installed so that customers can accurately verify the installation of those measures.
Status: Completed in 2019. Lockheed Martin Energy reported that the HEA and IQW programs require the trade ally to complete and generate a CHA, review results with the customer, and leave a copy of the CHA with the customer to reference.

Recommendation: Consider establishing additional data capture activities to document measure installation conditions, such as having energy assessors take pictures of post-installation conditions while on-site.
Status: Declined. While trade allies are not required to provide post-installation pictures, Lockheed Martin Energy inspects 10% of all projects to verify installation and takes pictures for reference where necessary.

Recommendation: Since the duct sealing's natural gas savings appeared high, move to a semi-deemed approach to improve the accuracy of savings estimates by leveraging available customer data and the distribution efficiency lookup tables in the Indiana TRM (v2.2).
Status: In Progress. Lockheed Martin Energy will consider evaluation ex post savings values and additional Indiana TRM (v2.2) methods for the 2019 program design.

Recommendation: Consider including adder values that augment the distribution efficiency lookup values to reduce uncertainty in savings estimates for duct sealing projects occurring in semi-conditioned spaces. Both the Ohio and Illinois TRMs use this method.
Status: In Progress. Lockheed Martin Energy will consider evaluation ex post savings values and additional Indiana TRM (v2.2) methods for the 2019 program design.


Recommendation: Coordinate with the program implementation staff and trade allies to identify and optimize key processes that might reduce the wait time for executing assessments. Processes to consider for review and optimization include the scheduling method, flow of communication between parties, staffing, and in-home assessment procedures.
Status: In Progress. Trade allies are receiving weekly email notifications regarding scheduling. Lockheed Martin Energy implemented follow-up procedures for customers that the trade ally was unable to reach to schedule a project.

Recommendation: Develop standardized training modules to ensure that all trade allies receive consistent training on program policies, procedures, and NIPSCO program marketing. Include policies and procedures for situations that may tempt an assessor to leave direct-install measures behind for participants to install.
Status: Under Consideration by Cadmus in 2018. Lockheed Martin Energy provided consistent email notifications regarding issues and program changes throughout the program year and held a trade ally webinar to address program concerns and updates; however, it did not develop training modules.

Recommendation: Consider using energy advisors to provide additional guidance and support to participants after the assessment. Energy advisors could deliver a personalized, printed report to participants who choose to connect with them. This will also help maximize the benefits of framing the discussion to match participants’ motivations.
Status: Under Consideration by Cadmus in 2018. Trade allies are required to provide and review the completed CHA with customers. Additional improvements or suggestions are noted and, in some cases, customers are directed to additional resources or assistance, where relevant, but the program does not have resources for energy advisors.

Recommendation: Provide additional training to assessors on how to print and email the report while at the participant’s home. This will increase the likelihood that participants will refer to the report and recall that they received it.
Status: Completed in 2018. Lockheed Martin Energy reminded and educated trade allies throughout the year in person and via email on properly providing the CHA to customers.

Recommendation: Redesign the CHA report so that it is shorter and more focused. This will make the report easier for the assessor to print at the home and easier for the participant to read.
Status: Completed in 2018. Report length was shortened significantly for 2018. Lockheed Martin Energy is in the process of evaluating other CHA platforms to implement for the 2019 program year.

Recommendation: Request a participant signature verifying that the assessor has discussed the findings and recommendations of the assessment and has provided a CHA report.
Status: Completed in 2018. The customer is required to sign a work authorization form upon project completion. Lockheed Martin Energy is in the process of evaluating other CHA platforms for the 2019 program year that may have signature acknowledgment capability.

Recommendation: To raise program awareness, develop a leave-behind folder in conjunction with IQW to hold literature about NIPSCO’s other programs, other energy-related resources, and the printed copy of the CHA report. Having a folder of materials to review may help facilitate the conversation about assessment results, which is identified as a best practice. This conversation should be framed in terms of how much money the direct-install measures may save the participant per month and what type of impact additional actions can make in the short term. Additionally, the folder of materials may aid in recall of receiving the report.
Status: Completed in 2018. Lockheed Martin Energy developed and distributed additional marketing collateral for both the HEA and IQW programs, as well as an HVAC trifold. Additionally, NIPSCO approved yard signs and door hangers for trade allies to use at project sites. Targeted marketing campaigns took place via direct mail, bill inserts, and social media to improve awareness. The CHA outlines potential savings and recommendations.


Participant Feedback

The evaluation team surveyed 67 customers who participated in the program. The following sections

describe the results related to source of awareness, reasons for participation, experience with the in-

home assessment, satisfaction with the program, and program impacts on customers.

Energy Efficiency Awareness and Marketing

Respondents most commonly reported bill inserts (33%), word of mouth (19%), and direct mail (16%) as

the way they learned about the program (Figure 24). These three were also the top ways that

participants learned of the program in 2017.

Figure 24. How Participants Learned about the HEA Program

Source: Participant survey. Question: “How did you learn about NIPSCO’s Home Energy Assessment

program?” (Multiple answers allowed)

Thirty-six percent of respondents were aware that NIPSCO offers other energy efficiency programs

besides the HEA program (n=67). Of those aware, respondents most frequently heard of the Appliance

Recycling program (12%) and the Energy Efficiency Rebate program (12%), as shown in Figure 25.

Several respondents (10%) stated that they were aware that NIPSCO offers other energy efficiency

programs but could not name a program. Four respondents, or 6%, were aware of the IQW program.

As noted above, the energy assessor has access to marketing collateral, including a brochure for the

Energy Efficiency Rebate program, to distribute to participants after the assessment. However, only

about a third of survey respondents were aware of any NIPSCO programs, and only 12% were aware of

the Energy Efficiency Rebate program. This is not a significant increase from 2017 when 28% of

participants were aware that NIPSCO offered other energy efficiency programs and 10% were aware of

the Energy Efficiency Rebate program.


Figure 25. Awareness of Other NIPSCO Programs

Source: Participant survey. Question: “What energy efficiency programs are you aware of?”

(Multiple answers allowed)

Participation Drivers

When asked why they decided to participate in the program, respondents most frequently said they

participated to save money on utility bills (39%) or to save energy (21%). The assessment report was also

a large motivator, with nearly 20% of respondents citing this as the reason they participated. Fewer than

15% of respondents said that they participated because they wanted the free items (Figure 26).

Figure 26. Reasons for Participating in Program

Source: Participant survey. Question: “Why did you decide to participate in NIPSCO’s Home Energy

Assessment program?” (Multiple answers allowed)

Comprehensive Home Assessment Report

After conducting the walk-through of the home and installing the energy-efficient items, the energy

assessors are required to discuss their findings and recommendations with the participant and provide a

printed CHA report detailing those findings and additional energy-saving recommendations. Ninety-

three percent of respondents recalled the assessor discussing their findings and recommendations with

them, which is a significant increase from the previous evaluation when 79% of respondents said the


assessor discussed the findings and recommendations of the assessment with them. Eighty-five percent

recalled receiving a report following their assessment, which is trending upward from the previous

evaluation when 74% recalled receiving the report (but the difference is not statistically significant).

Figure 27 shows how participants were presented their findings and recommendations.

Figure 27. Presentation of Findings and Recommendations

Note: Boxed value indicates difference between 2017 and 2018 is significant at p≤0.10 (90% confidence).

Source: Participant survey. Questions “Did the energy assessor discuss the assessment findings and

recommendations with you?” and “Did the energy assessor provide you with a report with assessment

findings and energy-savings recommendations for your home?”

Respondents (n=57) most frequently received the CHA report after the assessment through email (44%)

and mail (42%). Only 14% of respondents received a printed report during the assessment. Most

respondents who received a report or a verbal discussion of findings and recommendations found the

information provided to them through the report and discussions with the energy assessor very useful

(48%) or somewhat useful (36%). Figure 28 provides additional detail on how useful respondents found the information provided.

Figure 28. Usefulness of Information Provided in HEA Program Assessment

Source: Participant survey. Question “How useful was the information provided to you?”


Fourteen respondents provided suggestions for improving the information provided verbally or in the

report. Most often, respondents suggested improvements to the level of detail and types of

recommendations provided. Four respondents wanted clearer instruction on how to make the

recommended improvements. Three respondents wanted more diagnostic testing results with bigger

savings potential. Two respondents wanted recommendations that accurately reflect their home’s

situation. As one respondent stated, “Actual ways to fix the problem instead of just saying what was

wrong.”

Four respondents suggested that getting the CHA report would be helpful, stating that they never

received a report after the assessor explained their findings and recommendations verbally.

For those respondents who recalled receiving it, the CHA report effectively encouraged them to take

some of the recommended actions. Just under 60% of respondents who received a CHA report made all

(14%), most (3%), or some (42%) of the recommended improvements. Twenty-three percent of

respondents who received the CHA report did not make any of the recommended improvements, most

commonly saying it was because the recommendations were too expensive or not relevant. Figure 29

details all responses.

Figure 29. Follow-Through on Recommended Improvements

Source: Participant survey. Question: “After participating in NIPSCO’s HEA program, would you say you have

made all, most, some or none of the energy-saving recommendations made in the assessment report

you received?”

Direct-Install Measures

Relatively few respondents had water-saving measures installed in their homes at the time of the

assessment. Forty percent of the respondents (n=67) indicated that they had showerheads installed,

34% had bathroom aerators installed, and only 31% had kitchen aerators installed. Respondents who did

not have the various water-saving measures installed cited two primary reasons: already having the

items and not being offered the items. Notably, large numbers of respondents were unsure why the


measures were not installed. Table 63 provides additional detail on the reasons for not installing water-

saving measures.

Table 63. Reasons for Not Installing Water-Saving Measures

Reason                               Showerheads (n=39)   Bathroom Faucet Aerators (n=42)   Kitchen Faucet Aerators (n=46)
Already had/didn't need one          41%                  29%                               26%
Wasn't offered one                   31%                  21%                               26%
Don't know                           10%                  38%                               26%
Didn't fit on fixture                15%                  17%                               26%
Didn't match current fixture color   3%                   0%                                0%
Other                                5%                   10%                               2%

Source: Participant survey. Question: "Why didn't you have [equipment] installed?" (Multiple answers allowed)

Respondents were generally satisfied with the measures installed during the assessment. Based on the

percentage of very satisfied ratings, respondents were most satisfied with the bathroom aerators (91%)

and duct sealing measures (84%). None of the measures received any very dissatisfied ratings.

Figure 30. Satisfaction with Direct-Install Measures

Source: Participant survey. Question: “How satisfied are you with the [equipment]?”

Perceived Effect on Home Comfort and Energy Bills

Thirty-four percent of respondents reported increased comfort since the assessment, and 31% reported

lower energy bills since the assessment (Figure 31). All respondents had their assessments completed at

least two months prior to the participant survey.


Figure 31. Changes Noticed Since Assessment

Source: Participant survey. Questions: “Do you find that your home is more comfortable since your

assessment?” and, “Is your NIPSCO energy bill lower since your assessment?”

Satisfaction with In-Home Energy Assessment

Ninety-two percent of respondents were satisfied with the in-home energy assessment overall. While

the percentage of respondents who were very satisfied (70%) appears to be trending higher after a

downward trend in 2017 (59%), this difference is not statistically significant. The percentage of

respondents who indicated they were either very satisfied or somewhat satisfied with the in-home

energy assessment overall is similar to that of 2017 and 2016 respondents. Figure 32 shows the

satisfaction ratings respondents gave to the in-home energy assessment overall.

Figure 32. Satisfaction with Overall In-Home Energy Assessment

Source: Participant survey. Question: “How satisfied were you with the in-home energy assessment overall?”

Most respondents were also satisfied with the individual components of the assessment: amount of

time between calling to schedule and the assessment, amount of time it took to complete the

assessment, professionalism of the energy assessor, quality of work performed, and the written report.

All aspects of the assessment received very satisfied or somewhat satisfied ratings from 91% or more of


the respondents. As found in 2017, the time between calling to schedule and the assessment, as well as the assessment report, received the lowest percentages of very satisfied ratings. Figure 33 details the

satisfaction ratings for each component of the assessment.

Figure 33. Satisfaction with In-Home Energy Assessment Components

Source: Participant survey. Question: “How satisfied were you with each of the following?”

Satisfaction with Program

Most respondents (88%) were satisfied with the HEA program overall, with nearly three-quarters

reporting they were very satisfied. The percentage of very satisfied respondents in 2018 (71%) increased

significantly from 2017 (55%). This pattern was similar to participant satisfaction ratings of the

assessment itself, reflecting that customers relate the two closely.

While the percentage of respondents who were very satisfied with the HEA program increased

significantly from last year, there was also a significant increase in the percentage of respondents who

were somewhat dissatisfied with the HEA program compared to last year (Figure 34).


Figure 34. Satisfaction with the HEA Program Overall

Note: Boxed values indicate difference between 2017 and 2018 is significant at p≤0.10 (90% confidence).

Source: Participant survey. Question: “How satisfied are you with NIPSCO’s Home Energy

Assessment program?”

The survey asked respondents to briefly explain why they gave the program satisfaction ratings they did.

Nineteen respondents who were satisfied with the program generally said it was because of the

valuable information and service they were given from the program, as well as the friendly and

professional staff. As one respondent said, “Because it was a quality assessment. The assessor was very

professional, and I appreciate all the energy saving products given to me. The assessment has reduced

my energy bill.” Another said, “I noticed a big change. My house is brighter, and it is using less energy

also. I blew some energy savings with my oxygen machine and other machines I use, but I know they

can't help with medical reasons. I think it's a great program.”

Respondents who were neutral or dissatisfied with the program generally said their satisfaction rating

was related to the assessment not providing enough value. For example, one respondent said, “nothing

was done to fix anything,” and another stated, “I expected it to [be] more in depth and pressure tests,

and infrared readings. Also, I thought that they seal up cracks.”

Suggestions for Improvement

Just over one-quarter of respondents (28%, n=67) provided suggestions to improve the program. Eight

respondents wanted to see more out of the program including more light bulbs, more thorough testing

like blower-door tests, and more detailed information on their home. Three respondents suggested that

more advertising and marketing could help the program reach more people. As one respondent said,

“More advertising to get more well known. Specific list to show how they help people. How this program

can benefit people. To help the environment. Save money on the purchases and that it is a good, honest

program.” Other suggestions included providing a list of NIPSCO-approved contractors and offering

incentives for self-installed measures.


Satisfaction with NIPSCO

Overall, 80% of respondents were satisfied with NIPSCO as their utility service provider (Figure 35).

Nearly half (48%) said they were very satisfied with NIPSCO and another 32% said they were somewhat

satisfied. While the percentage of respondents who gave NIPSCO very satisfied ratings appears to be

trending back upward after a dip in 2017, the differences between 2018 and 2017 ratings are not

statistically significant.

Figure 35. Satisfaction with NIPSCO

Source: Participant survey. Question: “How satisfied are you with NIPSCO overall as your utility

service provider?”

Participant Survey Demographics

As part of the participant survey, the evaluation team collected responses on the following

demographics, shown in Table 64.

Table 64. HEA Program Respondent Demographics

Demographics                                             Percentage (n=67)
Home Ownership
  Own                                                    97%
  Rent                                                   2%
  Prefer not to say                                      2%
Type of Residence
  Single-family detached                                 85%
  Multi-family apartment or condo (four or more units)   5%
  Attached house (townhouse, rowhouse)                   6%
  Manufactured home                                      3%
  Other                                                  2%
Number of People in the Home
  One                                                    22%
  Two                                                    37%
  Three                                                  15%
  Four                                                   16%
  Five or more                                           16%
  Prefer not to say                                      9%
Annual Household Income
  Under $25,000                                          9%
  $25,000 to under $35,000                               10%
  $35,000 to under $50,000                               16%
  $50,000 to under $75,000                               12%
  $75,000 to under $100,000                              16%
  $100,000 to under $150,000                             9%
  Over $150,000                                          14%
  Prefer not to say                                      12%

Conclusions and Recommendations

Conclusion 1: At 54%, the realization rate for therms was low. This was primarily because ex post gross natural gas savings for duct sealing were much lower than the ex ante savings.

The program implementer used an ex ante savings value of 113.86 therms per year for duct sealing during 2018. In the 2017 evaluation, as a result of its engineering review, the evaluation team recommended 92.68 therms per year for duct sealing in homes with natural gas only and 94.14 therms per year for duct sealing in dual-fuel homes. While the ex post gross recommendation of 56.43 therms per year is approximately 50% of the ex ante value, it is closer to 70% of the 2017 recommended value after applying

the ISR. Factors such as customer behavior (energy conservation behaviors in general and how they use

their HVAC systems), relatively efficient baseline conditions, installation quality, or other factors may be

affecting savings. The three most important inputs to achievable savings for duct sealing are these: (1)

pre-condition of the ducts, including whether or not they are located in unconditioned space; (2) heating

capacity and efficiency of heating system; and (3) how HVAC system is used (schedules, occupancy, set

points, etc.). While the first two inputs can be tracked, the third cannot.
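The ratios cited above can be checked with quick arithmetic. This sketch reproduces the comparison against the 2018 ex ante value and the pre-ISR comparison against the 2017 recommended value; the ISR itself is not restated in this section, so the ~70% figure is only approached here, not reproduced:

```python
# Duct sealing natural gas savings comparisons cited in Conclusion 1.
ex_ante_2018 = 113.86   # therms/yr, 2018 ex ante value
rec_2017 = 92.68        # therms/yr, 2017 recommended value (gas-only homes)
ex_post_gross = 56.43   # therms/yr, ex post gross recommendation

print(f"{ex_post_gross / ex_ante_2018:.0%}")  # ~50% of the ex ante value
print(f"{ex_post_gross / rec_2017:.0%}")      # ~61% of the 2017 value before the ISR
```

Applying the ISR to the 2017 recommended value shrinks the denominator, which moves the second ratio toward the roughly 70% figure cited above.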

Recommendations:

• Use the ex post gross natural gas savings values from this report as new ex ante values to reduce

uncertainty in current savings estimates if the program is unable to conduct blower-door tests

and record the heating capacity and efficiency for duct sealing.

• Consider having the evaluation team conduct a billing analysis in the next evaluation with a

larger population if program participation increases.

Conclusion 2: The HEA program continued to deliver high customer satisfaction overall and across

most of the program components.

Most respondents (88%) were satisfied with the HEA program overall, with nearly three-quarters

reporting they were very satisfied. The percentage of very satisfied respondents in 2018 (71%) increased

significantly from 2017 (55%).

Customers rated their satisfaction with the professionalism of the energy assessor, the time to complete

assessment, quality of work performed, time between scheduling and the assessment, and the

assessment report very highly, with each component of the program receiving very or somewhat


satisfied ratings from at least 91% of respondents. As in 2017, the time between scheduling and the assessment, as well as the assessment report, received the lowest percentages of very satisfied ratings and is a

potential area for program improvement. Furthermore, the NIPSCO program manager reported

receiving phone calls from customers who had problems with the scheduling process.

Recommendation:

• Track metrics related to the scheduling process, such as length of time from scheduling to

assessment and the percentage of customers who call to schedule but never complete an

assessment. Use the tracked metrics to identify areas for improvement and set incremental

goals towards progress.

Conclusion 3: While the program fell short of achieving savings goals, program participants largely

retained measures installed and attributed their energy-saving actions to the program.

The program achieved 37% of its electric energy (kilowatt-hour) savings goal, 30% of its peak demand

reduction (kilowatt) goal, and 23% of its natural gas energy (therms per year) savings goal. The program

implementer, Lockheed Martin Energy, reported that after a strong start, participation declined because

trade allies prioritized work for IQW. Had the program been able to maintain the strong participation, it

would likely have been able to achieve its savings goals.

Recommendations:

• Determine reasons for possible prioritization of IQW over HEA by trade allies and take steps to

ensure participating trade allies do not prioritize one program over the other.

• Recruit additional trade allies to ensure there are enough trade allies to adequately serve

customers of both programs.

Conclusion 4: Some energy assessors may not be accurately recording the measures installed during

the assessment, communicating what they installed, or explaining to participants why they did not

install particular measures.

Through the verification questions included in the survey, the evaluation team found several untracked

LEDs, bathroom and kitchen aerators, and showerheads. Conversely, the evaluation team also found

several cases in which the respondent was unable to confirm that measures were installed, thereby

lowering the ISRs for these measures. The most notable discrepancy between program tracking data and customer confirmation was for duct sealing, where only 86% of the respondents confirmed that it was performed.

Additionally, many respondents who did not have water-saving measures installed said they were not

offered the measures (31% showerheads, 26% kitchen aerator, and 21% bathroom aerator), and large

percentages said they were not sure why they were not installed (10% showerheads, 26% kitchen

aerator, and 38% bathroom aerator).


Recommendations:

• Have energy assessors encourage their participants to accompany them during the walk-through

audit to provide an opportunity for the energy assessor to communicate what they will install

and the reasons for what they cannot install.

• Move the listing of installed measures to a more prominent location near the beginning of the CHA report. If possible within the reporting platform, also include a list of measures that were not installed along with the reasons why.

• Consider including a list of installed measures in the work authorization form that participants sign after the assessment, using check-boxes and quantity fill-ins when applicable so as not to be overly burdensome for the assessor or customer. This could provide an alternative to using self-reports for calculating ISRs.

• Revisit the 2017 recommendation to establish additional data-capture activities, such as having energy assessors take pictures of post-installation conditions while on site. This documentation

could provide another method for calculating ISRs for some measures such as duct sealing and

pipe wrap, potentially resulting in higher ISRs than self-report for measures participants do not

see and/or interact with on a daily basis.

Conclusion 5: Additional marketing collateral and leave-behind materials have not raised awareness

and knowledge of other NIPSCO energy efficiency programs among HEA participants.

Awareness of other NIPSCO energy efficiency programs has not increased significantly since 2017: 36% of respondents were aware this year compared to 28% in 2017. Additionally, many respondents who

said they knew that NIPSCO offers other programs were unable to name one (29% of those aware).

Similar to 2017, participants who could name another NIPSCO program were most frequently aware of

the Appliance Recycling, Energy Efficiency Rebate, and the Income Qualified Weatherization programs.

Recommendation:

• In addition to the leave-behind brochures, include references to other NIPSCO energy efficiency

programs and rebates, including the $500 instant discount for attic insulation, within the

CHA report to help increase awareness of other program offerings. Having this information

available when participants review their recommendations may serve as a reminder of these programs.

Conclusion 6: Participants not receiving the CHA report on site may contribute to lower verified

savings and may be missed opportunities to drive participation in other energy efficiency programs.

Eighty-five percent of respondents said that they received a report with findings and recommendations,

which is an upward trend from last year when only 74% said they received a report. However, of the

85% of people who received the report, only 14% said it was printed on site by the energy assessor

despite efforts by the implementer to shorten the report to make it more user-friendly for both the

assessor and the participants. The majority of respondents said the report was emailed to them after

the assessment (44%) or mailed to them after the assessment (42%) even though program requirements

specify that the report is provided at the time of the assessment. Additionally, the report is an important


feature of the program. Twenty percent of respondents cited it as a reason for participating in the

program.

Recommendations:

• Treat easy and immediate access to the CHA report (print and digital) and the ability to add custom content, such as related NIPSCO rebates and program information, as top features to look for in a new reporting platform.

• Coordinate with program implementation staff and trade allies to ensure program processes for

printing reports, installing measures and recording installation, and promoting other NIPSCO

programs are followed.

• Consider having the evaluation team conduct ride-alongs as part of the 2019 program

evaluation to better assess the extent to which program procedures are followed and

opportunities for program improvement.


Residential Appliance Recycling Program

The Appliance Recycling program provides a $50 incentive to electric customers who reduce energy

consumption through recycling unneeded refrigerators and freezers. Annually, participants can recycle

up to two working secondary refrigerators or freezers, sized 10 to 30 cubic feet, by scheduling a pick-up

of the units.

Program Performance

Table 65 presents a savings summary for the program, including program savings goals. The following sections provide details on how the evaluation team calculated each savings value. The program achieved 153% of its electric energy gross savings goal and 153% of its peak demand reduction gross savings goal. Continued outreach through bill inserts, radio spots, and social media posts, as well as general marketing pieces directing customers to NIPSCO's residential program website, helped ensure NIPSCO met its energy savings goal for this program.

Table 65. 2018 Appliance Recycling Program Savings Summary

Metric                             Gross Savings Goal   Ex Ante(a)   Audited     Verified    Ex Post Gross   Ex Post Net   Gross Goal Achievement
Electric Energy Savings (kWh/yr)   1,864,894            2,369,904    2,369,904   2,369,904   2,854,738       1,845,713     153%
Peak Demand Reduction (kW)         274                  348          349         349         419             271           153%

(a) Values presented at a measure level represent audited values since the scorecard provides only savings totals.

Overall, the program achieved a 120% realization rate for both energy and demand savings, primarily because the ex post per-unit refrigerator savings of 1,009 kWh was higher than the ex ante per-unit deemed savings value of 761 kWh.19 Refrigerators account for 80% of the total appliances recycled during the 2018 program year, so the overall gross realization rates for 2018 are heavily weighted toward the refrigerator-specific gross realization rate.

Table 66 shows ex post gross savings realization rates and net savings adjustment factors for energy and

demand savings.

Table 66. 2018 Appliance Recycling Program Adjustment Factors

Metric                          Realization Rate (%)(a)   Freeridership (%)   Spillover (%)   NTG (%)(b)
Electric Energy Savings (kWh)   120%                      35%                 0%              65%
Peak Demand Reduction (kW)      120%

(a) Realization rate is defined as ex post gross savings divided by ex ante savings.
(b) NTG is defined as ex post net savings divided by ex post gross savings.

19 Sourced from the Indiana TRM (v2.2)
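As a quick sanity check on these definitions, the kWh values from Table 65 can be plugged in directly (an illustrative sketch; the variable names are ours):

```python
# Check the Table 66 adjustment factors against the Table 65 kWh totals.
ex_ante = 2_369_904        # ex ante gross savings (kWh/yr)
ex_post_gross = 2_854_738  # ex post gross savings (kWh/yr)
ex_post_net = 1_845_713    # ex post net savings (kWh/yr)

# Realization rate = ex post gross savings / ex ante savings
print(f"{ex_post_gross / ex_ante:.0%}")      # ~120%

# NTG = ex post net / ex post gross; equivalently 1 - freeridership + spillover
print(f"{ex_post_net / ex_post_gross:.0%}")  # ~65%
print(f"{1 - 0.35 + 0.0:.0%}")               # 65% from the survey-based factors
```

The survey-based decomposition (1 minus freeridership plus spillover) and the ratio of ex post net to ex post gross savings agree at 65% after rounding.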


In 2018, the program achieved its savings goal but spent 124% of its $467,608 budget, as shown in

Table 67.

Table 67. 2018 Appliance Recycling Program Expenditures

Fuel       Program Budget   Program Expenditures   Budget Spent (%)
Electric   $467,608         $578,609               124%

Research Questions

The evaluation team conducted qualitative and quantitative research activities to answer the following

key research questions for the program:

• How accurate were the measure quantities recorded in the data?

• Were program marketing efforts effective at driving program participation?

• How familiar were participants with other NIPSCO energy efficiency programs?

• What were the characteristics of the recycled appliances?

• What motivated customers to participate in the program?

• Were replaced appliances energy-efficient?

• To what extent did freeridership or spillover affect NTG?

• Were customers satisfied with the program?

• What program improvements could be made?

• Is there demand to recycle window-mounted air conditioners?

Impact Evaluation

This section outlines steps that the evaluation team took in establishing ex post gross and net energy

savings and demand reduction.

In 2018, the evaluation team evaluated the Appliance Recycling program using a methodology consistent with the U.S. DOE's UMP evaluation protocol for refrigerator recycling.20 The team conducted phone surveys with 140 program participants and analyzed in situ metering data from other jurisdictions to evaluate program savings.

Impact Evaluation Methods

As in 2016 and 2017, the evaluation team determined gross per-unit energy consumption for retired

refrigerators and freezers for 2018 using a multivariate regression model. The UMP model is based on

20 U.S. Department of Energy. October 2018. The Uniform Methods Project: Methods for Determining Energy

Efficiency Savings for Specific Measures. https://www.energy.gov/eere/about-us/ump-protocols.


an aggregated in situ dataset of 591 appliances, metered for evaluations conducted in California,

Wisconsin, and Michigan.21

Collectively, the dataset offered a wide distribution of appliance ages, sizes, configurations, usage

scenarios (primary or secondary), and climate conditions. Because utility-specific in situ metering was

not conducted for NIPSCO, these data provided an ideal secondary source for determining the energy

savings from appliance recycling.

The UMP protocol methods focus on energy savings, but do not include other parameter assessments,

such as a peak coincidence factor (CF) for calculating demand reduction. The evaluation team calculated demand reduction

using the Indiana TRM (v2.2) algorithm.

The evaluation team estimated gross and net impact components on a per-unit basis and for the

program overall. This section presents program impacts in two major sections: gross impacts and

net impacts.

Gross impacts encompass estimates of the following:

• Per-unit energy consumption (through in situ metering-based regression modeling)

• Part-use factor (accounting for units not in use for the entire year)

• Average gross per-unit energy savings (based on the per-unit energy consumption and part-use

factors)

Net impacts encompass estimates of the following:

• Freeridership and the program’s secondary market impacts

• Spillover

• Average net per-unit energy savings, total program net savings, and part-use factors

Impact Evaluation Findings

This section details findings from the evaluation team's assessment of energy savings from recycling

refrigerators and freezers. It also discusses assumptions used to estimate savings.

Measure Verification

For the 2018 program year, the evaluation team verified appliances recycled through the program using

the implementer’s tracking database and participant surveys. All surveyed participants confirmed the

appliances listed in the tracking data were picked up. The evaluation team found no discrepancies

between ex ante unit quantities and program tracking data. Table 68 shows the resulting ISRs.

21 In California: Southern California Edison, Pacific Gas & Electric, and San Diego Gas & Electric; in Wisconsin:

Focus on Energy; in Michigan: DTE Energy and Consumers Energy.


Table 68. Appliance Recycling Program Measure Verification Results (In-Service Rate)

Measure        Ex Ante Installations   Ex Post Installations   Verified Installations   In-Service Rate
Refrigerator   2,404                   2,404                   2,404                    100%
Freezer        610                     610                     610                      100%
Total          3,014                   3,014                   3,014                    100%

Deemed and Evaluated Savings Review

Table 69 lists the 2018 Appliance Recycling program’s ex ante and evaluated per-unit annual gross

savings for each measure.

Table 69. 2018 Appliance Recycling Program Per-Unit Ex Ante and Ex Post Savings

               Gross Electric Energy Savings (kWh/yr)   Gross Electric Peak Coincident Demand Savings (kW)
Measure        Ex Ante        Ex Post                   Ex Ante        Ex Post
Refrigerator   761            1,009                     0.11           0.15
Freezer        886            704                       0.13           0.10

For comparison, Table 70 lists per-unit gross annual savings for the 2017 Appliance Recycling program.

The ex ante savings values for each measure remained the same between the two years.

Table 70. 2017 Appliance Recycling Program Per-Unit Ex Ante and Ex Post Savings

               Gross Electric Energy Savings (kWh/yr)   Gross Electric Peak Coincident Demand Savings (kW)
Measure        Ex Ante        Ex Post                   Ex Ante        Ex Post
Refrigerator   761            917                       0.11           0.13
Freezer        886            792                       0.13           0.12

Refrigerator and Freezer Models

The evaluation team used a regression model, specified in the UMP, to estimate consumption for

refrigerators. Since the UMP does not include specifications for freezers, the evaluation team created an

analogous freezer model. The coefficients for each independent variable indicated the influence of that

variable on daily consumption. Holding all other variables constant, a positive coefficient indicated an

upward influence on consumption, and a negative coefficient indicated a downward effect on

consumption.

Table 71 shows the UMP model specification that the evaluation team used to estimate a refrigerator’s

annual energy consumption and related parameters.22

22 The p-value, or calculated probability, is the probability of finding the observed, or more extreme, results

when the null hypothesis of a study question is true.


Table 71. Appliance Recycling Program Refrigerator Per-Unit Energy Consumption Regression Model Estimates

Independent Variables                     Coefficient   Variance Inflation Factor   p-Value
Intercept                                 0.80          0.0                         0.13
Age (years)                               0.02          2.0                         0.04
Dummy: Unit Manufactured Pre-1990s        1.04          1.7                         <.0001
Size (cubic feet)                         0.06          1.8                         0.02
Dummy: Single Door                        -1.75         1.2                         <.0001
Dummy: Side-by-Side                       1.12          1.5                         <.0001
Dummy: Primary                            0.56          1.6                         0.003
Interaction: Unconditioned Space x HDDs   -0.04         1.3                         <.0001
Interaction: Unconditioned Space x CDDs   0.03          1.5                         0.19

Note: Dependent variable = average daily kWh; R² = 0.30

The coefficient value indicates the marginal impact on per-unit energy consumption of a one-unit increase in the independent variable. For example, as shown in Table 71, an increase of one cubic foot in

refrigerator size resulted in an increase of 0.06 kWh in daily consumption. In the case of dummy

variables, the coefficient value represented the difference in consumption if the given condition proved

true. For example, the evaluation team’s refrigerator model used a coefficient of 0.56 for the variable

indicating whether a refrigerator was a primary unit; thus, with all else equal, a primary refrigerator

consumed 0.56 kWh per day more than a secondary unit.
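To make the coefficient interpretation concrete, the sketch below predicts daily consumption for a hypothetical unit using the Table 71 coefficients; the helper function and the example unit's characteristics are illustrative, not program data:

```python
# Predicted average daily kWh for a refrigerator, using the Table 71 coefficients.
def daily_kwh(age, size, pre_1990, single_door, side_by_side, primary,
              uncond_hdd, uncond_cdd):
    return (0.80                  # intercept
            + 0.02 * age          # age in years
            + 1.04 * pre_1990     # dummy: manufactured pre-1990s
            + 0.06 * size         # size in cubic feet
            - 1.75 * single_door  # dummy: single door
            + 1.12 * side_by_side # dummy: side-by-side
            + 0.56 * primary      # dummy: primary unit
            - 0.04 * uncond_hdd   # interaction: unconditioned space x HDDs
            + 0.03 * uncond_cdd)  # interaction: unconditioned space x CDDs

# Hypothetical 20-year-old, 20-cubic-foot top-freezer unit in conditioned space:
secondary = daily_kwh(20, 20, 0, 0, 0, primary=0, uncond_hdd=0, uncond_cdd=0)
primary = daily_kwh(20, 20, 0, 0, 0, primary=1, uncond_hdd=0, uncond_cdd=0)
print(round(secondary, 2))           # 2.4 kWh/day as a secondary unit
print(round(primary - secondary, 2)) # 0.56 kWh/day more as a primary unit
```

The 0.56 kWh/day gap between the two predictions is exactly the primary-unit dummy coefficient, since all other inputs are held constant.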

Table 72 details the final model specifications used to estimate annual per-unit energy consumption of

participating freezers and related parameters. Again, as the UMP only specified a refrigerator model, the

evaluation team created an analogous freezer model.

Table 72. Appliance Recycling Program Freezer Per-Unit Energy Consumption Regression Model Estimates

Independent Variables                     Coefficient   Variance Inflation Factor   p-Value
Intercept                                 -0.95         0.0                         0.54
Age (years)                               0.05          2.2                         0.12
Dummy: Unit Manufactured Pre-1990         0.54          2.1                         0.24
Size (cubic feet)                         0.12          1.2                         0.09
Dummy: Chest Freezer                      0.30          1.1                         0.07
Interaction: Unconditioned Space x HDDs   -0.03         1.1                         0.54
Interaction: Unconditioned Space x CDDs   0.08          0.1                         0.07

Note: Dependent variable = average daily kWh; R² = 0.45


The evaluation team analyzed the corresponding characteristics (that is, the independent variables) for

participating appliances (as captured in the 2018 program tracking database and participant survey).23

Table 73 summarizes program averages or proportions for each independent variable. CDDs presented

here are the weighted average CDDs from typical meteorological year (TMY3) data for weather stations

mapped to participating appliance zip codes.24

Table 73. Appliance Recycling Program Participant Mean Variables and Model Coefficients

Appliance      Independent Variables                        2018 Mean Value   2018 Model Coefficient
Refrigerator   Intercept                                    1.00              0.80
               Age (years)                                  20.61             0.021
               Dummy: Manufactured Pre-1990s                0.21              1.04
               Size (cubic feet)                            20.97             0.06
               Dummy: Single Door                           0.02              -1.75
               Dummy: Side-by-Side                          0.25              1.12
               Dummy: Primary                               0.54              0.56
               Interaction: Unconditioned Space x HDDs(a)   5.69              -0.04
               Interaction: Unconditioned Space x CDDs(a)   1.04              0.03
Freezer        Intercept                                    1.00              -0.95
               Age (years)                                  25.81             0.045
               Dummy: Unit Manufactured Pre-1990            0.47              0.54
               Size (cubic feet)                            15.87             0.12
               Dummy: Chest Freezer                         0.31              0.30
               Interaction: Unconditioned Space x HDDs(a)   8.43              -0.03
               Interaction: Unconditioned Space x CDDs(a)   1.52              0.08

(a) Cooling degree days (CDDs) and heating degree days (HDDs) are weighted averages, based on TMY3 data from weather stations mapped to participating appliance zip codes.

Per-Unit Energy Consumption

To determine annual and average daily per-unit energy consumption using regression models and 2018

Appliance Recycling program tracking data, the evaluation team applied average participant refrigerator

and freezer characteristics to regression model coefficients. This approach ensured that the resulting

per-unit energy consumption would be based on specific units recycled through NIPSCO’s program in

2018 rather than on a secondary data source.

Table 74 shows the average per-unit energy consumption for refrigerators and freezers recycled during

the 2018 program year, with refrigerators having a higher consumption value and freezers having a

23 Age, size, and configuration (side-by-side, single door, etc.) were captured by the tracking data. The evaluation

team used survey data to identify installed location (primary or secondary, conditioned or unconditioned

space).

24 TMY3 used median daily values for a variety of weather data collected from 1991 to 2005.


lower consumption value than found in the 2017 evaluation. The average per-unit energy consumption

shown in the table does not include the part-use adjustment factor.

Table 74. Appliance Recycling Program Average Per-Unit Energy Consumption by Appliance Type

| Appliance | 2017 Average Unit Energy Consumption (kWh/yr) | 2018 Average Unit Energy Consumption (kWh/yr) | 2018 Relative Precision (90% Confidence) |
|---|---|---|---|
| Refrigerator | 1,092 | 1,121 | 10% |
| Freezer | 966 | 848 | 22% |

For example, using values from Table 73, the evaluation team calculated the estimated annual per-unit energy consumption for 2018 freezers using the following equation:

2018 Freezer per-unit energy consumption
= 365.25 days × (−0.95 + 0.045 × [25.81 years old] + 0.54 × [47% of units manufactured pre-1990] + 0.12 × [15.87 cubic feet] + 0.30 × [31% of units that are chest freezers] − 0.03 × [8.43 unconditioned HDDs] + 0.08 × [1.52 unconditioned CDDs]) ≈ 848 kWh/year
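As a check on the arithmetic, the equation above can be sketched in a few lines of Python. This is a minimal illustration, not the evaluation team's code; because the published coefficients are rounded, the result lands within a few kWh of the reported 848 kWh/yr rather than matching it exactly.

```python
# Minimal sketch: applying the freezer regression coefficients (Table 73)
# to the 2018 participant mean characteristics. Coefficients are the
# published, rounded values, so the result differs from the reported
# 848 kWh/yr by a few kWh.
DAYS_PER_YEAR = 365.25

def annual_kwh(coefficients, means):
    """Average daily kWh from the linear model, scaled to a full year."""
    daily_kwh = sum(coefficients[k] * means[k] for k in coefficients)
    return DAYS_PER_YEAR * daily_kwh

freezer_coefficients = {
    "intercept": -0.95, "age_years": 0.045, "pre_1990": 0.54,
    "size_cu_ft": 0.12, "chest": 0.30, "uncond_hdd": -0.03, "uncond_cdd": 0.08,
}
freezer_means = {
    "intercept": 1.00, "age_years": 25.81, "pre_1990": 0.47,
    "size_cu_ft": 15.87, "chest": 0.31, "uncond_hdd": 8.43, "uncond_cdd": 1.52,
}

print(round(annual_kwh(freezer_coefficients, freezer_means), 1))
```

The same function reproduces the refrigerator estimate of roughly 1,121 kWh/yr when given the refrigerator coefficients and means from Table 73.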

In 2018, the average per-unit energy consumption for refrigerators increased by 29 kWh relative to 2017

evaluated values, and the average per-unit energy consumption for freezers decreased by 118 kWh.

The share of recycled refrigerators that participants used as primary units increased by 13 percentage points in 2018 (from 41% to 54%) compared to those recycled in 2017. This was the primary factor driving the higher refrigerator per-unit consumption in 2018 relative to the 2017 estimate of 1,092 kWh per year. Other contributing factors included increases in the average unit size and in the share of units with a side-by-side door configuration compared to 2017.

The decrease in freezer per-unit energy consumption from 2017 to 2018 resulted from several factors:

• A decrease in the average age of recycled appliances—from 29.40 years in 2017 to 25.81 years

in 2018

• A decrease in the percentage of recycled freezers manufactured prior to 1990—from 60% in

2017 to 47% in 2018

• A decrease in the percentage of recycled units that were chest freezers—from 47% in 2017 to

31% in 2018

Table 75 shows a direct comparison of the average 2017 and 2018 input values and coefficients for all

model variables; shaded cells highlight the differences outlined above.


Table 75. Appliance Recycling Program Participant Mean Variables Comparison, 2017 and 2018

| Appliance | Independent Variable | 2018 Mean Value | 2018 Model Coefficient | 2017 Mean Value | 2017 Model Coefficient |
|---|---|---|---|---|---|
| Refrigerator | Age (years) | 20.61 | 0.021 | 22.95 | 0.021 |
| Refrigerator | Dummy: Manufactured Pre-1990 | 0.21 | 1.04 | 0.32 | 1.04 |
| Refrigerator | Size (cubic feet) | 20.97 | 0.06 | 20.01 | 0.06 |
| Refrigerator | Dummy: Single Door | 0.02 | -1.75 | 0.05 | -1.75 |
| Refrigerator | Dummy: Side-by-Side | 0.25 | 1.12 | 0.23 | 1.12 |
| Refrigerator | Dummy: Primary | 0.54 | 0.56 | 0.41 | 0.56 |
| Refrigerator | Interaction: Unconditioned Space x HDDsᵃ | 5.69 | -0.04 | 6.79 | -0.04 |
| Refrigerator | Interaction: Unconditioned Space x CDDsᵃ | 1.04 | 0.03 | 1.25 | 0.03 |
| Freezer | Age (years) | 25.81 | 0.045 | 29.40 | 0.045 |
| Freezer | Dummy: Unit Manufactured Pre-1990 | 0.47 | 0.54 | 0.60 | 0.54 |
| Freezer | Size (cubic feet) | 15.87 | 0.12 | 16.18 | 0.12 |
| Freezer | Dummy: Chest Freezer | 0.31 | 0.30 | 0.47 | 0.30 |
| Freezer | Interaction: Unconditioned Space x HDDsᵃ | 8.43 | -0.03 | 8.18 | -0.03 |
| Freezer | Interaction: Unconditioned Space x CDDsᵃ | 1.52 | 0.08 | 1.49 | 0.08 |

ᵃ CDDs and HDDs derive from weighted average CDDs and HDDs from TMY3 data for weather stations mapped to participating appliance zip codes.

Part-Use

Part-use, an adjustment factor specific to appliance recycling, is used to convert a per-unit energy

consumption value into an average per-unit gross savings value. The per-unit energy consumption itself

does not equal the gross savings value, because the regression model yields an estimate of annual

consumption, and not all recycled refrigerators would have operated year-round had they not been

recycled by the program.

The part-use methodology applied in 2018 relied on information from surveyed customers regarding

their pre-program use patterns. The final part-use estimate reflects how appliances would have

operated had they not been recycled (rather than how they previously operated). For example, a

primary refrigerator, operated year-round, could have become a secondary appliance, operating

part-time.

This methodology accounts for possible shifts in usage types. Specifically, the evaluation team calculated

part-use using a weighted average of these prospective part-use categories and factors:

• Appliances that would have run full-time (part-use = 1.0)

• Appliances that would not have run at all (part-use = 0.0)

• Appliances that would have operated a portion of the year (part-use is between 0.0 and 1.0)


Using information gathered through the 2018 participant survey, the evaluation team employed the

following multistep process to determine part-use:

• The survey asked respondents whether the refrigerator or freezer that was recycled remained

unplugged, operated year-round, or operated for a portion of the preceding year.

• If participants said that their refrigerator or freezer operated for only a portion of the preceding

year, the survey asked participants for the total number of months that the appliance was

plugged in. (In 2018, responses from this participant subset resulted in secondary refrigerators

operating an average of 5.40 months and secondary freezers operating an average of 4.80

months.)

• The evaluation team divided each value by 12 to convert months of operation into an annual

part-use factor for all refrigerators and freezers. In 2018, for those refrigerators and freezers

that operated part of the time, the average refrigerator had a part-use factor of 0.45 and the

average freezer had a part-use factor of 0.40.

• If participants said that they would have discarded their appliance independently of the

program, the survey did not follow up about that appliance’s future use as those actions would

be determined by another customer. Since future use of discarded refrigerators remained

unknown, the evaluation team applied the 0.93 weighted part-use average of all units (primary

and secondary, including those that were expected to be in operation full time) to this subset. It

is possible that discarded appliances may be used as primary or secondary units in a would-be

recipient’s home.
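The month-to-factor conversion in the steps above can be sketched directly (illustrative only, not the evaluation team's code):

```python
# Sketch: converting surveyed months of operation into annual part-use
# factors, using the 2018 averages for part-time units reported above.
MONTHS_PER_YEAR = 12

refrigerator_months = 5.40  # average months plugged in, part-time refrigerators
freezer_months = 4.80       # average months plugged in, part-time freezers

refrigerator_part_use = refrigerator_months / MONTHS_PER_YEAR  # 0.45
freezer_part_use = freezer_months / MONTHS_PER_YEAR            # 0.40
```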

Table 76 lists the resulting part-use factor results by category.

Table 76. Appliance Recycling Program Part-Use Factor by Category

| Usage Type and Part-Use Category | Refrigerators: Recycled Units (%) | Refrigerators: Part-Use Factor | Refrigerators: Per-Unit Energy Savings (kWh/yr) | Freezers: Recycled Units (%) | Freezers: Part-Use Factor | Freezers: Per-Unit Energy Savings (kWh/yr) |
|---|---|---|---|---|---|---|
| Secondary Units Only | n = 32 | | | N/A | | |
| Not in Use | 6% | 0.00 | - | | | |
| Used Part Time | 16% | 0.45 | 504 | | | |
| Used Full Time | 78% | 1.00 | 1,121 | | | |
| Weighted Average | 100% | 0.85 | 955 | | | |
| All Units (Primary and Secondary) | n = 70 | | | n = 69ᵃ | | |
| Not in Use | 3% | 0.00 | - | 10% | 0.00 | - |
| Used Part Time | 7% | 0.45 | 504 | 12% | 0.40 | 339 |
| Used Full Time | 90% | 1.00 | 1,121 | 78% | 1.00 | 848 |
| Weighted Average | 100% | 0.93 | 1,044 | 100% | 0.83 | 702 |

Note: Totals may not sum properly due to rounding.
ᵃ All freezer units are considered secondary.
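The weighted-average part-use factors in Table 76 can be reproduced from the category shares and factors. A minimal sketch, not the evaluation team's code:

```python
# Sketch: weighted-average part-use across the prospective categories,
# using the shares and factors from the "All Units" rows of Table 76.
def weighted_part_use(categories):
    """categories: iterable of (share_of_units, part_use_factor) pairs."""
    return sum(share * factor for share, factor in categories)

# (share, part-use factor): not in use, used part time, used full time
refrigerators_all_units = [(0.03, 0.0), (0.07, 0.45), (0.90, 1.0)]
freezers_all_units = [(0.10, 0.0), (0.12, 0.40), (0.78, 1.0)]

print(round(weighted_part_use(refrigerators_all_units), 2))  # 0.93
print(round(weighted_part_use(freezers_all_units), 2))       # 0.83
```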


Combining the part-use factors shown in Table 76 with participants’ self-reported likely actions in the

program’s absence resulted in the distribution of future-use scenarios and corresponding part-use

estimates for refrigerators, as shown in Table 77. As the table shows, the weighted average of these

future scenarios produced a final part-use factor for refrigerators of 0.90 for the 2018 program. The final part-use estimate for freezers remains the 0.83 shown in Table 76; because all freezer units are considered secondary, no additional weighting was needed.

Table 77. Appliance Recycling Program Refrigerator Weighted Average Part-Use

| Use Prior to Recycling | Likely Use Independent of Recycling | Gross Savings Factor | Participants (%) |
|---|---|---|---|
| Secondary | Kept | 0.85 | 16% |
| Secondary | Discarded | 0.93 | 31% |
| Primary | Kept (as primary unit) | 1.00 | 6% |
| Primary | Kept (as secondary unit) | 0.85 | 22% |
| Primary | Discarded | 0.93 | 25% |
| Overall | | 0.90 | 100% |

Note: Totals may not sum properly due to rounding.
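The overall 0.90 factor in Table 77 is the same kind of weighted average, taken over the future-use scenarios (again a sketch, not the evaluation team's code):

```python
# Sketch: final refrigerator part-use as a weighted average of Table 77's
# gross savings factors over the scenario participant shares.
scenarios = [  # (gross savings factor, share of participants)
    (0.85, 0.16),  # secondary, kept
    (0.93, 0.31),  # secondary, discarded
    (1.00, 0.06),  # primary, kept as primary unit
    (0.85, 0.22),  # primary, kept as secondary unit
    (0.93, 0.25),  # primary, discarded
]
final_part_use = sum(factor * share for factor, share in scenarios)
print(f"{final_part_use:.2f}")  # 0.90
```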

In 2018, the part-use factor for refrigerators increased to 0.90 from 0.84 in 2017, and the 2018 part-use

factor of 0.83 for freezers increased slightly from 0.82 in 2017.

Gross Savings

As shown in Table 78, the evaluation team applied part-use factors from Table 76 to modeled annual

consumption to calculate average per-unit gross energy savings for the 2018 program. The evaluation team determined per-unit gross energy savings of 1,009 kWh per year for refrigerators and 704 kWh per year for freezers, as shown in Table 78 below.

Table 78. Appliance Recycling Program Evaluated Per-Unit Gross Energy Savings

| Appliance | Evaluated Per-Unit Consumption (kWh/yr) | Gross Demand Reduction (kW) | Part-Use Factor | Gross Energy Savings After Part-Use (kWh/yr) | Gross Demand Reduction After Part-Use (kW) |
|---|---|---|---|---|---|
| Refrigerators | 1,121 | 0.16 | 0.90 | 1,009 | 0.15 |
| Freezers | 848 | 0.12 | 0.83 | 704 | 0.10 |
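The two gross savings figures follow directly from the consumption and part-use inputs (sketch):

```python
# Sketch: per-unit gross savings = modeled annual consumption x part-use factor.
refrigerator_gross_kwh = round(1121 * 0.90)  # 1,009 kWh/yr
freezer_gross_kwh = round(848 * 0.83)        # 704 kWh/yr
```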

Net-to-Gross Analysis

In the case of appliance recycling, programs generate net savings only when the recycled appliances

would have continued to operate without program intervention (either in the participating customer’s

home or at the home of another utility customer).

The evaluation team employed a decision-tree approach to calculate net program savings and used a

weighted average of these scenarios to calculate net savings attributable to the program. The decision

tree—populated by responses from surveyed 2018 participants and by information gathered from local

market actors interviewed during other recent evaluations—represents all of a program’s possible


savings scenarios. Discussion of specific portions of the decision tree continue throughout this chapter,

highlighting particular aspects of the net savings analysis.

The decision-tree approach not only accounts for what a participating household would have done

independently of the program, but it also addresses the possibility that the recycled unit would have

transferred to another household, and whether the recipient of that appliance would have found an

alternate unit instead.

Freeridership

Independent of program intervention, participant refrigerators and freezers were generally subject to

one of three scenarios:

• Scenario 1: The participant keeps the refrigerator.

• Scenario 2: The participant discards the refrigerator by a method that transfers it to another

customer for continued use.

• Scenario 3: The participant discards the refrigerator by a method that removes the unit

from service.

The evaluation team applied freeridership only under Scenario 3, as the unit would have been removed

from service and destroyed in the absence of the program, even though it was recycled through the

program. As such, the program could not claim energy savings generated by recycling this appliance.

To determine the percentage of participants in each of the scenarios and to assess freeridership, the

team asked each surveyed participant what would likely have happened to the appliance had it not been

recycled by NIPSCO. Participants provided the following responses:

• Kept it and continued to operate the appliance

• Kept it, but stored it unplugged indefinitely

• Sold it to a private party, either to someone known or by running an ad

• Sold it to a used appliance dealer

• Gave it to a private party, such as a friend or neighbor

• Had it removed by the dealer from whom the new or replacement appliance was purchased

• Hauled it to the dump or recycling center

• Hired someone to haul it away for junking or dumping

To ensure the highest quality of responses possible and to mitigate social desirability bias, the evaluation team asked some participants follow-up questions to test the reliability of their initial responses. For example, local market actors interviewed for other evaluations indicated that used appliance dealers usually do not purchase appliances more than 10 years old. Therefore, the evaluation team asked participants who recycled an appliance more than 10 years old, and who indicated they would have sold their unit to a used appliance dealer, what they would have done had they been unable to carry through with their plans.


Upon determining the final assessments of participants’ actions independently of the program, the

evaluation team calculated the percentage of refrigerators and freezers that would have been kept or

discarded. Table 79 shows the results.

Table 79. Appliance Recycling Program Final Distribution of Kept and Discarded Appliances

| Stated Action Absent Program | Indicative of Freeridership | Refrigerators (n = 65)ᵃ | Freezers (n = 62)ᵃ |
|---|---|---|---|
| Kept | No | 49% | 63% |
| Discarded | Varies by discard method | 51% | 37% |
| Total | | 100% | 100% |

ᵃ Does not include “don’t know” responses and refusals.

Secondary Market Impacts

After determining that a participant would have directly or indirectly (through a market actor)

transferred the unit to another customer on the electric grid, the evaluation team addressed what the

recipient would have done had the recycled unit been unavailable. Three possible scenarios resulted:

• Scenario 1: None of the potential recipients would find another unit. That is, program

participation would result in a one-for-one reduction in the total number of refrigerators

operating on the electric grid. In this case, total energy consumption of avoided transfers

(participating appliances that otherwise would have been used by another customer) would be

credited as program savings. This position is consistent with the theory that participating

appliances are essentially convenience goods for would-be acquirers: the recipient would have

accepted the refrigerator had it been readily available, but, as the refrigerator was not a

necessity, the would-be acquirer would not have sought an alternate unit.

• Scenario 2: All potential recipients would find another unit. Thus, program participation would

not affect the total number of refrigerators operating on the grid. This position is consistent

with the concept that participating appliances are necessities and customers always seek

alternative units when participating appliances are unavailable.

• Scenario 3: Some potential recipients would find another unit, while others would not. This

scenario reflects the awareness that some acquirers were in the market for a refrigerator and

would acquire another unit, while others were not and would have taken the unit only

opportunistically.

To resolve which of these scenarios applied, the evaluation team assumed that one-half of would-be acquirers of avoided transfers would have found alternate units—an assumption consistent with the UMP.

The evaluation team then addressed the likelihood that the alternate unit would be another used

appliance (similar to those recycled through the program) or—with fewer used appliances presumably

available in the market due to program activity—the customer would acquire a new standard-efficiency

unit. Even if a would-be acquirer could select a new ENERGY STAR® unit, the evaluation team assumed it


likely that a customer in the market for a used appliance would upgrade to the next lowest price point.

For reasons previously discussed, the evaluation team applied a midpoint approach, with one-half of

potential program unit recipients finding a similar used appliance and one-half acquiring a new

standard-efficiency unit.25

Figure 36 explains the methodology used for assessing the program’s impact on the secondary

refrigerator market and the application of recommended midpoint assumptions (when primary data

proved unavailable). As shown, accounting for market effects resulted in three savings scenarios:

• Full savings (that is, per-unit gross savings)

• No savings (that is, the difference in energy consumption of the program unit and a similar unit)

• Partial savings (that is, the difference between the energy consumption of the program unit and

that of a new, standard-efficiency appliance)

Figure 36. Secondary Market Impacts—Refrigerators

After estimating the parameters of freeridership impacts and secondary market impacts, the evaluation

team used the UMP decision tree to calculate average per-unit program savings. Figure 37 shows how

these values were integrated into a combined savings estimate as a weighted average, net of

freeridership and secondary market impacts.

25 The evaluation team calculated the energy consumption of a new, standard-efficiency appliance using the

ENERGY STAR website, taking the average energy consumption of new, comparably sized, and standard-

efficiency appliances with similar configurations as the program units. U.S. Environmental Protection Agency.

ENERGY STAR. “Refrigerator Retirement Savings Calculator.” Accessed February 2019: http://www.energystar.gov/index.cfm?fuseaction=refrig.calculator


Figure 37. Savings Net of Freeridership and Secondary Market Impacts—Refrigerators
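The transferred-unit branch of the decision tree described above can be sketched as follows. This is a simplified illustration of the structure only, not the evaluated calculation; the 450 kWh/yr new-standard-unit consumption is a hypothetical placeholder, not a value from the report.

```python
# Hedged sketch of the transferred-unit branch of the UMP-style decision
# tree: half of would-be acquirers find no alternate unit (full savings);
# the other half split evenly between a similar used unit (no savings) and
# a new standard-efficiency unit (partial savings).
def transferred_unit_savings(program_unit_kwh, new_standard_kwh):
    full_savings = 0.50 * program_unit_kwh                      # no alternate found
    no_savings = 0.25 * 0.0                                     # similar used unit
    partial_savings = 0.25 * (program_unit_kwh - new_standard_kwh)  # new standard unit
    return full_savings + no_savings + partial_savings

# Hypothetical inputs for illustration only (not evaluated values):
print(round(transferred_unit_savings(1009, 450)))  # 644
```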

Participant Spillover

As recommended in the UMP, the evaluation team did not include spillover in program net savings

estimates for 2018. The UMP suggests that although appliance recycling programs promote enrollment

in other energy efficiency programs, spillover of unrelated measures is unlikely to occur because

appliance recycling programs do not provide comprehensive energy education like other programs.

Summary of Verified Net Program Impacts

The evaluation team calculated final verified per-unit net savings using the following

equation:

Net Program Savings (kWh per year) = Gross Program Savings − (Freeridership & Secondary Market Impacts) + Spillover

Table 80 lists all per-unit net impacts discussed in this chapter, and overall NTG ratios by appliance type.

Table 80. 2018 Appliance Recycling Program Net-to-Gross Results by Appliance Type

| Appliance | Gross Per-Unit Savings (kWh/yr) | Per-Unit Freeridership and Secondary Market Impacts (kWh/yr) | Additional (Spillover) Per-Unit Electric Energy Savings (kWh/yr) | Net Per-Unit Electric Energy Savings (kWh/yr) | NTG Ratio |
|---|---|---|---|---|---|
| Refrigerator | 1,009 | 376 | 0 | 633 | 63% |
| Freezer | 704 | 182 | 0 | 522 | 74% |
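The net figures and NTG ratios in Table 80 follow directly from the equation above (sketch):

```python
# Sketch: net per-unit savings = gross - (freeridership + secondary market
# impacts) + spillover; NTG ratio = net / gross.
def net_and_ntg(gross_kwh, impacts_kwh, spillover_kwh=0):
    net = gross_kwh - impacts_kwh + spillover_kwh
    return net, net / gross_kwh

refrigerator_net, refrigerator_ntg = net_and_ntg(1009, 376)
freezer_net, freezer_ntg = net_and_ntg(704, 182)
print(refrigerator_net, f"{refrigerator_ntg:.0%}")  # 633 63%
print(freezer_net, f"{freezer_ntg:.0%}")            # 522 74%
```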

Evaluated Net Savings Adjustments

Table 81 and Table 82 provide reported ex ante savings, evaluated ex post savings, realization rates, and

evaluated net savings for the Appliance Recycling program. In 2018, the program achieved an overall

realization rate of 120% for kilowatt-hour savings and a 120% realization rate for coincident peak kilowatt savings. The program achieved net savings of 1,845,713 kWh and a 271.0 kW coincident peak demand reduction.


Table 81. 2018 Appliance Recycling Program Electric Savings

| Measure | Reported Ex Anteᵃ (kWh) | Audited Ex Anteᵃ (kWh) | Verified Ex Anteᵃ (kWh) | Evaluated Ex Post Savings (kWh) | Realization Rate (kWh) | NTG Ratio | Evaluated Net Savings (kWh) |
|---|---|---|---|---|---|---|---|
| Refrigerator | 1,829,444 | 1,829,444 | 1,829,444 | 2,425,396 | 133% | 63% | 1,527,999 |
| Freezer | 540,460 | 540,460 | 540,460 | 429,342 | 79% | 74% | 317,713 |
| Total | 2,369,904 | 2,369,904 | 2,369,904 | 2,854,738 | 120% | 65% | 1,845,713 |

Note: Totals may not sum properly due to rounding.
ᵃ Values presented at a measure level represent Audited values, since the scorecard provides only savings totals.
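The realization-rate and net-savings arithmetic in Table 81 can be sketched as follows (using the rounded NTG ratios, which reproduce the published values to within a kilowatt-hour):

```python
# Sketch: realization rate = evaluated ex post / verified ex ante;
# evaluated net savings = evaluated ex post x NTG ratio.
verified_ex_ante_kwh = 2_369_904
evaluated_ex_post_kwh = 2_854_738
realization_rate = evaluated_ex_post_kwh / verified_ex_ante_kwh  # ~1.20 (120%)

refrigerator_net_kwh = round(2_425_396 * 0.63)  # 1,527,999 (Table 81)
freezer_net_kwh = round(429_342 * 0.74)         # 317,713 (Table 81)
```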

Table 82. 2018 Appliance Recycling Program Demand Reduction

| Measure | Reported Ex Anteᵃ (Coincident Peak kW) | Audited Ex Anteᵃ (Coincident Peak kW) | Verified Ex Anteᵃ (Coincident Peak kW) | Evaluated Ex Post Savings (Coincident Peak kW) | Realization Rate (Coincident Peak kW) | NTG Ratio | Evaluated Net Savings (Coincident Peak kW) |
|---|---|---|---|---|---|---|---|
| Refrigerator | 268.8 | 269.2 | 269.2 | 356.1 | 132% | 63% | 224.4 |
| Freezer | 79.3 | 79.3 | 79.3 | 63.0 | 79% | 74% | 46.6 |
| Total | 348.1 | 348.5 | 348.5 | 419.2 | 120% | 65% | 271.0 |

Note: Totals may not sum properly due to rounding.
ᵃ Values presented at a measure level represent Audited values, since the scorecard provides only savings totals.

Process Evaluation

For the 2018 process evaluation, the evaluation team conducted staff interviews with NIPSCO and Lockheed Martin Energy. The evaluation team also fielded participant surveys, collecting responses from 70 participants who recycled a freezer and 70 participants who recycled a refrigerator.

Program Design and Delivery

Lockheed Martin Energy and NIPSCO utilized the services of ARCA to schedule and pick up appliances,

carry out the recycling functions, and process the rebates. The program was marketed to customers

through bill inserts, emails, the NIPSCO website, and leave-behinds, along with other residential

programs in the NIPSCO “Meet the Greens” campaign (which promotes saving the planet and saving

money, and directs customers to the residential programs landing page).

Upon learning about the Appliance Recycling program offering, interested customers participated in the

program by scheduling a pickup with ARCA over the phone or via an online scheduler. For the online

scheduler, once customers entered their information, they received a call back, and were provided

appointment times to choose from. The day before their pickup, customers received a confirmation call

while the driver was en route. Two ARCA crew members conducted each pickup. On site, they confirmed

the appliances’ eligibility—whether they were plugged in and operational—and removed the appliances

for transport to the processing centers. ARCA also conducted rebate fulfilments, issuing rebates directly

to customers, and receiving reimbursement by Lockheed Martin Energy. Participants received $50 in

exchange for each qualifying appliance.


2018 marked the last year that ARCA provided the program services. NIPSCO and Lockheed Martin

Energy selected Recleim to take over the program services starting in 2019.

Changes from the 2017 Design

In 2018, the program’s participation goal was to recycle 2,354 units—a slight increase over the 2017 goal of 2,191 units. The goal was raised for 2018 after the program’s success at meeting its goal in 2017.

According to tracking data, the program collected 3,014 units in 2018 (128% of the goal), which likely

explains why the program exceeded its allocated budget for 2018 (see Table 67). NIPSCO confirmed that

they extended the Appliance Recycling program in order to make up for savings shortfalls in other

programs. As shown in Table 83, 2018 participation increased by 38% from 2017.

Table 83. Appliance Recycling Program Participation Goals and Actuals

| Unit Type | 2017 Actual | 2017 Goal | 2017 % of Goal | 2018 Actual | 2018 Goal | 2018 % of Goal |
|---|---|---|---|---|---|---|
| Refrigerator | 1,683 | | | 2,404 | | |
| Freezer | 508 | | | 610 | | |
| Total | 2,191 | 2,191 | 100% | 3,014 | 2,354 | 128% |

Note: Participation goals were set at the total program level rather than by unit type.

According to program data, the process to pick up appliances averaged 11 days, an improvement over

13 days in 2017 and 19 days in 2016. As shown in Figure 38, participation in the program fluctuated

throughout 2018 (and the final months of 2017, which were included in the 2018 evaluation) and peaked

in June and August, after the marketing push in February (the program bill insert) and April (the social

media campaign for the program). The last invoices for the 2018 program year were processed in

December. Additional requests were rolled over to the 2019 program year tracking.

Figure 38. 2018 Appliance Recycling Program Participation by Month

Characteristics of Recycled Appliances

Top-freezer refrigerators were the most common refrigerator type recycled (66%), followed by side-by-side refrigerators (25%), bottom-freezer refrigerators (7%), and single-door refrigerators (2%). The


most common type of recycled freezer was the upright model (69%), and the remaining 31% were chest

models. Recycled refrigerators averaged 20.6 years old, and recycled freezers averaged 25.8 years old.

2017 Recommendation Status

The evaluation team followed up with NIPSCO regarding the status of the 2017 evaluation

recommendations. Table 84 lists the 2017 Appliance Recycling program evaluation recommendations

and NIPSCO’s progress toward addressing those recommendations to date.

Table 84. Status of 2017 Appliance Recycling Program Evaluation Recommendations

| Summary of 2017 Recommendation | Status |
|---|---|
| When the marketing budget reaches 10% to 15% remaining for a given year, consider assessing progress towards measure-specific goals to target a final marketing push. Similar to recalibrating the 2017 participation goal, utilize 2016 and 2017 monthly freezer participation data and historical marketing data to identify an opportune time towards the end of the program year to conduct one additional marketing push, targeted at any measure that has not yet reached its participation goal. Digital marketing may provide a quick, cost-effective way to implement such a push. | Declined. The program was ahead of its participation goal. Lockheed Martin Energy did not foresee needing a final marketing push to help the program achieve its participation goal. |
| Insert a “next-step customer participation journey” marketing collateral piece along with incentive checks. Rather than leaving behind marketing collateral during the pick-up, inserting collateral with the incentive check may help enhance recall of other program offerings by tapping into the positive experience of receiving a check in the mail. Consider including marketing collateral which shows the variety of energy efficiency program offerings and which guides customers to take the next step in their journey to energy and cost savings. | In Progress. ARCA provided a leave-behind highlighting the other programs with customers at the time of appliance pickup. NIPSCO and Lockheed Martin Energy will consider options for inserting a marketing piece with the incentive check for 2019. |
| In future program years, adjust per-unit annual kWh savings to reflect the 2017 evaluation results (917 kWh for refrigerators and 792 kWh for freezers). Continue to update these savings as frequently as possible within the constraints of the planning and evaluation cycles. | In Progress. Lockheed Martin Energy noted that this will be updated in the 2019-2021 program design. |

Participant Feedback

This section presents the findings from the participant survey evaluation activity.

Energy Efficiency Awareness and Marketing

In 2018, the Appliance Recycling program was marketed through a variety of channels, including bill

inserts, radio spots, and social media posts, as well as through general marketing pieces directing

customers to the NIPSCO residential programs website. NIPSCO also developed a one-minute video

highlighting the ease of recycling through the program that was released in July 2018.

Participant respondents most frequently learned about the program via word of mouth (35%) and a

NIPSCO bill insert (35%), as shown in Figure 39. How respondents learned about the program in 2018

was consistent with 2017. Only 3% of respondents learned about the program via social media or emails

from NIPSCO in 2018.


Figure 39. How Participants Learned About the Appliance Recycling Program

Source: Participant Survey. Question: “How did you learn about NIPSCO’s Appliance Recycling program?”

(Multiple answers allowed; percentages may add up to more than 100%)

Fifty-nine percent of respondents reported that they were aware of other NIPSCO energy efficiency

programs, as shown in Figure 40. This was similar to 2017 where 56% were aware of other programs.

Among those aware of other energy efficiency programs, 44% could not name a program. Respondents

able to name a program most frequently cited the Energy Efficiency Rebate (HVAC Rebates) program

(37%), HEA program (33%), and Lighting program (25%).

Figure 40. Participant Awareness of Other NIPSCO Programs

Source: Participant Survey. Question: “Besides NIPSCO’s Appliance Recycling program, are you aware that

NIPSCO offers other energy efficiency programs?” “What energy efficiency programs are you aware of?”

(Multiple answers allowed; percentages may add up to more than 100%)


Participation Drivers

As shown in Figure 41, respondents’ top reasons for participating in the program were to get rid of an

extra appliance (48%), to get the rebate (30%), and to replace old equipment (25%). Those who

participated to get rid of an appliance rose significantly from 33% in 2017 to 48% in 2018.26 Respondents

who participated to replace old but working equipment significantly fell from 36% in 2017 to 25% in

2018.27

Figure 41. Motivation to Participate in the Appliance Recycling Program

Source: Participant Survey. Question: “Why did you decide to participate in NIPSCO’s Appliance Recycling

program?” (Multiple answers allowed; percentages may add up to more than 100%)

Overall, 68% (n=140) of respondents reported that they replaced their recycled appliance with a new

unit. As shown in Figure 42, 77% of those recycling a refrigerator replaced the unit with a new

refrigerator, significantly higher than the 59% of respondents recycling a freezer.28 The two “Other”

responses were for respondents who replaced their recycled units with a pre-owned unit.

Figure 42. Rate of Respondents Replacing Recycled Units

Source: Participant Survey. Question: “Did you replace the unit you recycled with a new unit?”

26 Difference is statistically significant at p≤0.05 (95% confidence).

27 Difference is statistically significant at p≤0.10 (90% confidence).

28 Difference is statistically significant at p≤0.05 (95% confidence).


When asked if replaced appliances were ENERGY STAR or high-efficiency models, 68% of respondents

said yes, as shown in Figure 43. This was statistically similar to 2017 (76%).

Figure 43. Efficiency of New Appliances Replacing Recycled Units

Source: Participant Survey. Question: “Is the new appliance an ENERGY STAR or high-efficiency model,

or a standard efficiency unit?”

Satisfaction with Program Processes and Overall Program

Respondents showed high satisfaction with the overall program. As shown in Figure 44, 99% of

respondents were satisfied (very or somewhat satisfied) with the program, similar to 2017 (98%

satisfied). Specifically, 91% of 2018 respondents were very satisfied, which is statistically similar to 2017

respondents (87%). The percentage of very satisfied respondents did not differ by unit type in 2018; 93%

of freezer respondents (n=70) and 89% of refrigerator respondents (n=70) were very satisfied.

Figure 44. Overall Satisfaction with the Appliance Recycling Program

Source: Participant Survey. Question: “How satisfied are you with NIPSCO’s Appliance Recycling program overall? Would you

say you are…” Percentages may add up to more than 100% due to rounding.

Among program components, respondents showed the greatest satisfaction with the contractors who

picked up appliances; 93% were very satisfied with these contractors, as shown in Figure 45.

Respondents showed the least satisfaction with the time it took to receive rebate checks, though 76% still said they were very satisfied.

Overall, very few respondents reported dissatisfaction (not too satisfied or not at all satisfied) with

program components. Four percent or less of respondents reported dissatisfaction with each aspect of


the program. Satisfaction with program components did not differ by unit type (refrigerator and freezer)

in 2018.

Figure 45. Satisfaction with Program Components

Source: Participant Survey. Question: “How satisfied are you with…?”

For those respondents who were less than very satisfied with contractors, the survey asked what the

contractor could have done to improve their experience. As shown in Figure 46, the majority (80%) of

respondents did not have any suggestions. Six percent of respondents said the contractor could improve

communication regarding picking up the appliance. One of these respondents said, “[They could have]

returned the 1st phone call.” Five percent of respondents suggested taking more care with removal. One

of these respondents said, “When the refrigerator was tipped onto the dolly, condensation leaked onto

the linoleum floor and rug that I had to clean up after they left.” Other suggestions included scheduling

pickups more quickly and arriving on time.

Figure 46. Participant Suggestions to Improve Contractor Experience

Source: Participant Survey. Question: “What, if anything, could the contractor have done to improve

your experience?”


Suggestions for Improvement

When the survey asked participants for suggestions of how to improve the program, 98 participants

responded; however, 33% said that they did not have suggestions for improvement. Those who

provided suggestions most frequently recommended increasing program advertising (19%) and increasing the rebate

amount (15%). As shown in Figure 47, other suggestions included expanding the types of appliances the

program will pick up (13%), improving the scheduling experience (12%), and sending out rebate checks

faster (3%).

Figure 47. Suggestions for Program Improvement

Source: Participant Survey. Question: “What is the one thing NIPSCO could do to improve its

Appliance Recycling program?”

Satisfaction with NIPSCO

Participants showed high satisfaction with NIPSCO: 92% of respondents said they were satisfied (very

and somewhat satisfied). As shown in Figure 48, the percentage of those very satisfied increased

significantly from 50% in 2017 to 68% in 2018.29 Those somewhat satisfied fell from 45% to 24%

between 2017 and 2018.

Figure 48. Satisfaction with NIPSCO

Source: Participant Survey. Question: “How satisfied are you with NIPSCO overall as your energy service provider?”

29 Difference is statistically significant at p≤0.05 (95% confidence).


Potential Program Enhancements

Some appliance recycling programs accept window-mounted air conditioning units. The savings

associated with recycling these units are typically low, so it is common for appliance recycling programs

that include this measure to minimize costs by offering pick-up only with other appliances (or not

offering incentives for these units). NIPSCO's Appliance Recycling program currently does not accept air conditioning units. To gauge customer interest in recycling window-mounted air

conditioning units, the survey asked participants if they would have recycled such units if the program

included them. As shown in Figure 49, more than one-half of respondents (59%) said yes, statistically

similar to the 56% in 2017.

Figure 49. Participants’ Interest in Recycling Window-Mounted Air Conditioning Units

Source: Participant Survey. Question: “Would you have recycled a window-mounted air conditioning unit, if

the program included the option?”

Participant Survey Demographics

The evaluation team collected responses on the following demographics (Table 85).

Table 85. Appliance Recycling Program Respondent Demographics

Demographics Percentage

Home Ownership (n=139)

Rent 3%

Own 97%

Type of Residence (n=140)

Single-family detached 91%

Attached house (townhouse, rowhouse) 4%

Multi-family apartment or condo (4 or more units) 3%

Manufactured home 2%

Something else 1%

Number of People in the Home (n=138)

One 18%

Two 42%

Three 14%

Four 12%

Five 9%

More than five 5%


Conclusions and Recommendations

Conclusion 1: The Appliance Recycling program set a higher participation goal for 2018 and exceeded both this goal and the original budget for the program year.

In total, the program attained 128% of its overall participation goal for 2018 and spent 124% of its budget. This suggests the budget was set close to the level needed to achieve the number of rebated measures planned for 2018. Including the last few months of 2017 in the 2018 program results (256 units from 2017 were covered by the 2018 program year budget) may have prevented the original program budget from covering the whole 2018 period.

Recommendation:

• Recalibrate program budget and planning to account for higher levels of participation,

particularly if the program expects to see holdover participants from the previous year.

Conclusion 2: The 2018 program delivered high customer satisfaction, especially for the program

components involving the contractor.

The top three program components in terms of percentage of very satisfied customers were satisfaction

with the contractor overall (93%), communication with the contractor (87%), and scheduling the pickup

(83%); all three are related to the contractor experience. These findings not only show how satisfied customers were with ARCA in 2018 but also show improvement over satisfaction ratings from 2017.

While customers did suggest improvements (when asked) such as improving communication, taking

more care with the appliance removal, and being available to do the pickup sooner, 80% of respondents

were not able to identify anything that could improve the contractor experience. While ARCA will not be

the contractor going forward, this high level of customer satisfaction indicates that ARCA had made

improvements over time and provided a satisfying experience.

Recommendations:

• Identify and retain the program elements key to customer satisfaction as the Appliance

Recycling program is transitioned from ARCA to Recleim in 2019. Maintaining key successful

program elements (such as marketing, program materials, and customer communication

processes) will help build consistency and save on program operation costs during the transition

period.

• Set goals for the new contractor on key areas of customer satisfaction and track the new

contractor’s progress. Goals may include the following: (1) call customers within half an hour of

the pickup, (2) do not leave behind any mess for the customer during removal, (3) return all

phone calls within 24 hours for scheduling or rescheduling a pickup, and (4) make certain the

new contractor has sufficient trucks and drivers to ensure timely pickups.


Conclusion 3: The ex ante savings were lower for refrigerators and higher for freezers than the 2018

evaluated ex post savings.

Ex ante deemed savings values are sourced from the Indiana TRM (v2.2) for refrigerators (761 kWh per

year) and freezers (886 kWh per year). For the ex post analysis, the evaluation team used actual

appliance characteristics from the program tracking database and weather data specific to the service

territory (South Bend, Indiana). Thus, the ex post savings values are expected to fluctuate from year to

year as they reflect the characteristics and location of appliances recycled during a given program

period.

Recommendation:

• In future program years, adjust per-unit annual kilowatt-hour savings to reflect the 2018

evaluation results (1,009 kWh for refrigerators and 704 kWh for freezers). Continue to update

these savings as frequently as possible within the constraints of the planning and evaluation

cycles.


Residential School Education Program

NIPSCO designed the School Education program to help fifth-grade students and their families

understand and identify opportunities to manage their energy consumption. The program distributed

energy efficiency kits at no cost to participating schools. Lockheed Martin Energy served as the program

implementer, managing the overall program and acting as a liaison between NIPSCO and program

subcontractors. To deliver the program, Lockheed Martin Energy contracted with AM Conservation

Group (the kit distributor) and the National Energy Foundation (NEF; curriculum and marketing).

AM Conservation Group and NEF distributed the kits and curriculum materials to individual teachers

who signed up for the program. This included two types of kits in 2018: combo kits to schools in

NIPSCO’s natural gas and electric territory and gas-only kits to schools in the natural gas territory. The

kits contained the following energy-saving measures, along with the other educational materials and

activities:

• Measures in Combo Kits

▪ One kitchen faucet aerator (1.5 gpm)

▪ One bathroom faucet aerator (1.0 gpm)

▪ One low-flow showerhead (1.5 gpm)

▪ Three 9-watt LEDs

▪ One 0.5-watt LED night-light

▪ One furnace filter whistle

• Measures in Gas Only Kits

▪ One kitchen faucet aerator (1.5 gpm)

▪ Two bathroom faucet aerators (1.0 gpm)

▪ Two low-flow showerheads (1.5 gpm)

Program Performance

Table 86 summarizes savings for the 2018 program, including program savings goals. Based on ex post

gross savings, the program achieved 100% of its participation goal (with 11,606 total kits distributed),30

74% of its electric energy savings goals, 67% of its demand reduction goals, and 78% of its natural gas

energy savings goals.

30 Quantity verified using implementer tracking data.


Table 86. 2018 School Education Program Savings Summary

Metric | Gross Savings Goal | Ex Ante(a) | Audited | Verified | Ex Post Gross | Ex Post Net | Gross Goal Achievement
Electric Energy Savings (kWh/yr) | 2,946,360 | 2,947,908 | 2,948,137 | 1,585,000 | 2,193,543 | 2,083,866 | 74%
Peak Demand Reduction (kW) | 400 | 400 | 471 | 188 | 266 | 253 | 67%
Natural Gas Energy Savings (therms/yr) | 202,304 | 202,409 | 204,394 | 77,389 | 157,106 | 149,251 | 78%

a Values presented at a measure level represent audited values since the scorecard provides only savings totals.

Table 87 shows the 2018 program realization rates and NTG ratio.

Table 87. 2018 School Education Program Adjustment Factors

Metric | Realization Rate (%)(a) | Freeridership (%) | Spillover (%) | NTG (%)(b)
Electric Energy Savings (kWh/yr) | 74% | 11% | 6% | 95%
Peak Demand Reduction (kW) | 67% | 11% | 6% | 95%
Natural Gas Energy Savings (therms/yr) | 78% | 11% | 6% | 95%

a Realization rate is defined as ex post gross savings divided by ex ante savings.
b NTG ratio is defined as ex post net savings divided by ex post gross savings.

Table 88 lists the 2018 program budget and expenditures.

Table 88. 2018 School Education Program Expenditures

Fuel | Program Budget | Program Expenditures | Spend (%)
Electric | $505,192 | $492,427 | 97%
Natural Gas | $544,281 | $532,932 | 98%

Research Questions

The evaluation team conducted qualitative and quantitative research activities to answer the following

research questions for the School Education program:

• What are the program’s verified measure installations?

• What are the program’s participant spillover and freeridership estimates?

• How satisfied are customers with individual measures?

• How satisfied are customers with the program overall?

• What are the demographic characteristics of participants?


Impact Evaluation

Audited Savings

To audit program savings, the evaluation team performed the following reviews to verify consistency

with the program’s scorecard:

• Step 1. Audited Kits Quantity. Checked program tracking data provided by the implementer to

audit the number of kits distributed (included combo and gas-only kits).

• Step 2. Scorecard Savings Estimate. Recalculated per-kit savings using the scorecard; divided

total electric energy (kWh), peak demand (kW), and natural gas energy (therms) savings values

listed on the scorecard by the number of kits listed on the scorecard.

• Step 3. Savings Estimate Review. Reviewed per-measure and per-kit savings in the

documentation provided by NIPSCO.

• Step 4. Audited Program Savings. Calculated total audited program savings using inputs from

the prior steps.

Step 1. Audited Quantity of Kits

NIPSCO reported a total count of 11,426 combo kits and 180 gas-only kits distributed through the School

Education program. NIPSCO reported that, though a number of combo kits were sent to gas-only

schools, electric savings were not claimed for these kits, effectively treating them as gas-only kits. The

evaluation team audited the number of kits distributed by comparing reported kit quantities from the

scorecard with year-end shipment data from Lockheed Martin Energy. The total quantity of gas-only and

combo kits matched between the scorecard and these shipment data.

Step 2. Scorecard Savings Estimates

The evaluation team used the scorecard to arrive at per-kit savings values. The evaluation team divided

the total scorecard savings of 2,947,908 kWh by 11,426 kits, yielding 258 kWh per kit. Similarly, for

demand savings, the evaluation team divided total scorecard savings of 400 kW by 11,426 kits, yielding

0.035 kW per kit. Dividing total reported natural gas energy savings—202,409 therms—by 11,606 kits

yielded 17.44 therms per kit.
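The Step 2 arithmetic can be sketched directly; this is a minimal illustration using the scorecard totals quoted above:

```python
# Per-kit scorecard savings, using the totals quoted in the text:
# 2,947,908 kWh and 400 kW over 11,426 combo kits, and 202,409 therms
# over all 11,606 kits (combo plus gas-only).
combo_kits = 11_426
all_kits = 11_426 + 180  # 11,606

kwh_per_kit = 2_947_908 / combo_kits    # ≈ 258.0 kWh per kit
kw_per_kit = 400 / combo_kits           # ≈ 0.035 kW per kit
therms_per_kit = 202_409 / all_kits     # ≈ 17.44 therms per kit
```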

Step 3. Savings Estimate Review

NIPSCO sent the evaluation team the program savings documentation provided by Lockheed Martin

Energy, presenting per-kit savings estimates of 258 kWh, 0.035 kW, and 17.44 therms. However, the

per-kit electric demand savings in the savings document contained an error: when adding kilowatt

savings for each electric measure in the kit, the evaluation team calculated a total peak demand savings

of 0.041 kW per kit. The evaluation team could not replicate calculations that arrived at a per-kit savings

value of 0.035 kW. Although the 0.035 kW per-kit value used for calculating total demand savings in the

scorecard matches the incorrect summarized per-kit value in the savings document, it does not match

the sum of all electric demand savings determined by the methodology outlined in the savings

document. The evaluation team therefore corrected the per-kit kW savings to 0.041 kW.


The savings document shows that the reported scorecard per-kit savings values were multiplied by the

correct number of kits (11,426 combo kits and 180 gas-only kits) to determine total program savings. However, after multiplying 11,426 combo kits by the per-kit kWh value from the savings

document, total electric energy (kilowatt-hour) savings did not match. The evaluation team determined

this mismatch resulted from a rounding difference. The savings document showed a per-kit electric

energy savings of 258.02 kWh, whereas the scorecard used a value of 258.00, resulting in a total electric

energy difference of 228.52 kWh.
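The rounding difference reconciles exactly with the figures above; a quick check:

```python
# Rounding the per-kit value from 258.02 to 258.00 kWh, multiplied
# across 11,426 combo kits, accounts for the 228.52 kWh gap.
kits = 11_426
difference = kits * (258.02 - 258.00)  # ≈ 228.52 kWh
```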

The savings document shows that the reported scorecard per-kit savings values were multiplied by the

correct number of kits (11,426 combo kits and 180 gas-only kits) to determine total natural gas energy

savings. However, the number of fixtures in the gas-only kits is not equal to the number of the same

fixtures in the combo kits. The gas-only kits include 2 low-flow showerheads, 2 bathroom aerators, and 1

kitchen aerator (as opposed to one of each for the combo kits). Because of this, the per-kit gas savings

values are not the same between kits as they were in past program years. The per-kit savings were

calculated separately for each of the two kit types. Following the methodology and details outlined in

the provided savings document, the evaluation team adjusted the combo per-kit savings values to

258.02 kWh and 0.041 kW, and the gas-only per-kit savings value to 28.47 therms, to calculate total

program audited savings.

Step 4. Audited Program Savings

Table 89 shows total reported ex ante and audited program savings. Reported savings were derived

from the scorecard; these savings reflect the number of kits listed on the scorecard. The evaluation

team calculated a per-kit savings estimate by dividing scorecard savings by the number of kits on the

scorecard. Audited savings use the number of kits tracked in implementer data and per-kit savings

estimates provided by NIPSCO.31

As described in the previous section, per-kit savings values did not match between the scorecard and

savings methodology. Table 89 shows audited savings after correcting for these inconsistencies.

31 NIPSCO. 2018 NIPSCO Res – Calculations – School Kits – MFDI Lighting_031219_v2. Provided to the evaluation

team winter 2018.


Table 89. 2018 School Education Program Audited Savings Results

Kit Measure | Ex Ante kWh | Audited kWh | Ex Ante kW | Audited kW | Ex Ante therms | Audited therms
Per combo kit savings | 258.00 | 258.02 | 0.035 | 0.041 | 17.44 | 17.44
Per gas-only kit savings | - | - | - | - | 17.44 | 28.47
Quantity of combo kits(a) | 11,426 | 11,426 | 11,426 | 11,426 | 11,426 | 11,426
Quantity of gas-only kits(a) | - | - | - | - | 180 | 180
Total Savings | 2,947,908 | 2,948,137 | 400 | 471 | 202,409 | 204,394

Note: Totals may not sum properly due to rounding.
a The number of kits represents the kit type applicable for the savings type.

Verified Savings

The evaluation team applied adjustments to two inputs based on the verification analysis:

• ISRs for each kit measure

• Water heater fuel-type saturation

In-Service Rates

Parent Survey

The evaluation team verified measure ISRs by conducting telephone surveys with the parents of

participating students who gave permission to be contacted for a survey. The evaluation team

completed 67 surveys, achieving the number of survey responses targeted for 90% confidence with

±10% precision. For ISRs, the evaluation team asked each parent to indicate whether each item in their

kit was currently installed at the time of the survey.

Home Energy Worksheet

The home energy worksheet (HEW) serves as an alternative data source for ISRs,32 though the timing of

HEW responses made these forms less suitable than parent surveys for calculating ISRs. Because the follow-up

parent surveys were completed months after student participation, whereas HEWs were distributed

with the kit, parent surveys allowed more time for participants to install (and uninstall, if deemed

unsatisfactory) kit measures. Consequently, parent surveys typically yield higher and more reliable ISRs

than HEWs.

32 The implementation contractor distributed the HEW shortly after students received the kit. Students, with

help from their parents, responded to questions about measure installations and either returned the

worksheets to the teacher, who returned them to the program implementer, or submitted their

responses online.


Table 90 compares ISRs from the HEWs and parent surveys; the evaluation team used the parent survey

ISRs in conducting savings verification calculations.

Table 90. In-Service Rates: Home Energy Worksheet vs. Parent Survey

Measure | Ex Ante Reported ISR | Home Energy Worksheet ISR (n=7,321) | Parent Survey ISR (n=67)
9-watt LED | 100% | 46% | 86%
Low-flow showerhead | 100% | 32% | 48%
Kitchen aerator | 100% | 28% | 43%
Bathroom aerator | 100% | 30% | 38%
Furnace filter whistle | 100% | 29% | 15%
LED night-light | 100% | 79% | 68%

Note: The evaluation team used parent survey ISR estimates to verify savings.

Water Heater Fuel Saturation

The evaluation team adjusted the ex ante electric and natural gas saturation rates for water-saving

measures by analyzing data from the 2018 HEW results, shown in Table 91. Results indicate a slight

discrepancy between ex ante and verified electric and natural gas domestic water heating saturation

rates.

Table 91. Measure Verification: Fuel Saturation Rates for Water-Saving Measures

Savings Type | Electric Water Heating Saturation Rate (%) | Natural Gas Water Heating Saturation Rate (%)
Reported ex ante | 20% | 80%
Verified(a) | 20% | 73%

a Electric and natural gas saturation rates do not add to 100% because 7% of respondents replied "Other" on the HEW.

After applying adjustments to ISRs and water heater fuel saturation rates, the evaluation team

calculated the verified electric energy and natural gas energy savings shown in Table 92. The table lists

per-kit savings for combo and gas-only kits, demonstrating the difference in natural gas energy savings

for each. Although verified electric water heater saturations were the same as ex ante saturations, verified savings were lower than scorecard-reported ex ante values for electric energy (kilowatt-hour) savings and peak demand (kilowatt) reduction, reflecting the applied ISRs. For all measures, NIPSCO assumed an ISR of 100% to determine ex ante savings.
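For the measures unaffected by water heater fuel saturation, the verification step reduces to multiplying audited savings by the parent survey ISR. A minimal sketch using two Table 92 rows (the water measures carry an additional saturation adjustment not shown here):

```python
# Verified savings = audited savings x in-service rate (ISR), shown for
# two non-water measures; values are per-unit kWh figures from Table 92.
audited_kwh = {"9-watt LED": 87.00, "LED night-light": 14.00}
isr = {"9-watt LED": 0.862, "LED night-light": 0.682}

verified_kwh = {m: audited_kwh[m] * isr[m] for m in audited_kwh}
# ≈ 75.0 kWh for the LEDs and ≈ 9.55 kWh for the night-light
```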


Table 92. School Education Program Verified Savings(a)

Measure | Verified ISR | Audited kWh | Verified kWh | Audited kW | Verified kW | Audited Therms | Verified Therms
Low-flow showerhead, combo kit | 48.5% | 54.30 | 26.48 | 0.003 | 0.001 | 9.55 | 3.84
Low-flow showerhead, gas-only kit | 48.5% | - | - | - | - | 19.10 | 7.68
Kitchen aerator, combo kit | 43.3% | 36.40 | 15.85 | 0.002 | 0.001 | 6.41 | 2.30
Kitchen aerator, gas-only kit | 43.3% | - | - | - | - | 6.41 | 2.30
Bathroom aerator, combo kit | 37.5% | 8.32 | 3.14 | 0.002 | 0.001 | 1.48 | 0.46
Bathroom aerator, gas-only kit | 37.5% | - | - | - | - | 2.96 | 0.92
9-watt LED | 86.2% | 87.00 | 74.95 | 0.012 | 0.010 | - | -
Furnace filter whistle | 15.1% | 58.00 | 8.75 | 0.023 | 0.003 | - | -
LED night-light | 68.2% | 14.00 | 9.55 | 0.000 | 0.000 | - | -
Per Combo Kit | -- | 258.02 | 138.72 | 0.041 | 0.016 | 17.44 | 6.60
Per Gas-Only Kit | -- | N/A | N/A | N/A | N/A | 28.47 | 10.90
Total | -- | 2,948,137 | 1,585,000 | 471 | 188 | 204,394 | 77,389

a All measure savings values in this table represent per-unit savings, with the exception of the 9-watt LED, which shows savings values for 3 bulbs (1 kit). All savings values are shown with 2 and 3 decimal places to provide additional granularity that informs the savings discussion. Totals may not sum properly due to rounding.


Ex Post Gross Savings

Ex post savings reflect engineering analysis adjustments made to verified measure savings. The

evaluation team calculated ex post electric energy, peak demand, and natural gas energy savings for

each kit measure using algorithms and inputs from the Indiana TRM (v2.2), the 2014 Indiana Statewide

Core program evaluation results, and secondary sources. A description of the various adjustments made

by the evaluation team follows.

Evaluated savings for low-flow showerheads and faucet aerators differed from verified savings. The

equations NIPSCO used to calculate savings drew gallon-per-minute values for the installed efficient measures from the Indiana TRM (v2.2), whereas the evaluation team used the actual gallon-per-minute values specified on kit measure documentation. The evaluation team also substituted

showers/faucets per home and persons per home TRM values with survey-derived, NIPSCO-specific

values (specifically, inputs from the HEW forms and the parent survey).
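TRM-style water-measure algorithms generally follow the shell below; the function and its inputs are illustrative placeholders, not the Indiana TRM (v2.2) values, and are shown only to make the role of the gallons-per-minute and household inputs concrete:

```python
# Illustrative shell of a TRM-style hot-water savings calculation; every
# input here is a hypothetical placeholder, not an Indiana TRM value.
def water_kwh_savings(gpm_base, gpm_low, minutes_per_day, days_per_year,
                      delta_t_f, dhw_electric_share, recovery_eff=0.98):
    # Gallons of hot water avoided by the lower-flow fixture
    gallons_saved = (gpm_base - gpm_low) * minutes_per_day * days_per_year
    # 8.33 Btu heats one gallon of water by 1 degree F; 3,412 Btu per kWh
    btu_saved = gallons_saved * 8.33 * delta_t_f
    return btu_saved / 3_412 / recovery_eff * dhw_electric_share
```

Substituting measured gallons-per-minute and survey-derived persons-per-home inputs for deemed TRM values, as the evaluation team did, changes `gallons_saved` and therefore the final estimate.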

For each measure, NIPSCO applied a percentage weighting between TRM-derived savings and evaluated savings from the 2015 program year. Table 93 shows the weightings used for aerators and showerheads. This weighting process compounded the differences between 2018 reported savings values and evaluated savings.

Table 93. Reported Ex Ante Savings Weights

Measure | TRM kWh Savings | 2015 EM&V kWh Savings | TRM Weighting | 2015 EM&V Weighting | Reported Ex Ante Savings
Low-flow showerhead | 290.3 | 267.4 | 16% | 84% | 271.0
Bathroom aerator | 33.4 | 61.8 | 71% | 29% | 41.6
Kitchen aerator | 170.3 | 330.0 | 93% | 7% | 182.0
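The blend in Table 93 is a simple weighted average; the low-flow showerhead row, for example:

```python
# Reported ex ante = TRM savings x TRM weight + 2015 EM&V savings x
# EM&V weight (low-flow showerhead row of Table 93).
trm_kwh, emv_kwh = 290.3, 267.4
trm_weight, emv_weight = 0.16, 0.84

reported_ex_ante = trm_kwh * trm_weight + emv_kwh * emv_weight  # ≈ 271.0 kWh
```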

Several discrepancies occurred between NIPSCO and the evaluation team’s methodologies for

calculating LED lighting savings. For the LED night-light, NIPSCO’s calculations showed a baseline wattage

of 0.33 watts, following the default baseline in the Indiana TRM (v2.2). However, since the LED night-

light’s wattage was known (0.5 watts), the evaluation team used the actual wattage. The evaluation

team also adjusted LED night-light savings by an incandescent replacement factor—a percentage

adjustment for the proportion of installed bulbs that replace an incandescent night-light, versus not

replacing a pre-existing night-light or an existing LED night-light.

For LED bulbs, NIPSCO applied an HOU value of 1,040, while the evaluation team used the 1,135 HOU

value listed in the Indiana TRM (v2.2) for school kits. Additionally, NIPSCO and the evaluation team used

a different electric WHF. The evaluation team used the weighted average WHFs listed in the Indiana

TRM (v2.2) for South Bend. NIPSCO utilized a value from an undocumented source. NIPSCO also

weighted its reported LED savings values by three different methodologies:

• TRM-derived savings estimates (40% weighting)

• An equivalent wattage methodology (30% weighting)

• LED ex post savings values from the Home Energy Analysis program’s 2015 evaluation

(30% weighting)


NIPSCO did not provide documentation for the equivalent wattage methodology.

The evaluation team used a lumen equivalence method to determine differences between efficient and

baseline wattages (delta watts). This method aligned with the approach recommended by the UMP, a resource more current than the Indiana TRM, and allowed the evaluation team to properly assess a lighting baseline in a market that has been highly dynamic in recent years.33
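A lumen-equivalence baseline lookup can be sketched as below; the bin boundaries and baseline wattages are illustrative assumptions for this sketch, not the UMP's published table:

```python
# Hypothetical lumen-equivalence lookup for delta watts: map the LED's
# lumen output to a baseline-bulb wattage bin, then subtract LED watts.
LUMEN_BINS = [
    (310, 749, 29),    # (min lumens, max lumens, baseline watts)
    (750, 1049, 43),
    (1050, 1489, 53),
]

def delta_watts(led_watts, led_lumens):
    for lo, hi, baseline_watts in LUMEN_BINS:
        if lo <= led_lumens <= hi:
            return baseline_watts - led_watts
    raise ValueError("lumens fall outside the binned range")
```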

Additionally, NIPSCO did not quantify natural gas energy savings resulting from the furnace filter whistle,

while the evaluation team did. NIPSCO used the 2016 Pennsylvania TRM to calculate savings for the

measure; the evaluation team used Quantec’s Engineering Review and Savings Estimates for

the “Filtertone” Filter Restriction Alarm assessment detailing algorithms for the measure.34 Appendix F.

Residential School Education Program Ex Post Measure Savings Calculation Methodologies provides

additional details regarding calculation methodologies for kit measures.

Table 94 summarizes the differences between the ex ante and the ex post gross savings methodologies

for each kit measure.

33 U.S. Department of Energy. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings

for Specific Measures, Chapter 21: Residential Lighting Evaluation Protocol. February 2015. Available online:

https://energy.gov/sites/prod/files/2015/02/f19/UMPChapter21-residential-lighting-evaluation-protocol.pdf

34 Reichmuth, Howard. Engineering Review and Savings Estimates for the “Filtertone” Filter Restriction Alarm.

Prepared by Quantec LLC. 1999. The evaluation team elected to use this source because it aligned with

previous Indiana evaluations, and because the evaluation team found it to be more detailed and technically

thorough than the 2016 Pennsylvania TRM methodology.


Table 94. Differences Between Ex Ante and Ex Post Gross Savings Methodologies

Measure: Low-flow showerheads; kitchen aerators; bathroom aerators
Ex Ante Savings Approach: NIPSCO utilized the Indiana TRM (v2.2) to derive values for gallons per minute of installed efficient measures and for showers/faucets per home and persons per home.
Ex Post Gross Savings Approach: The evaluation team used actual values specified on kit measure documentation for gallons per minute of installed measures, and survey-derived, NIPSCO-specific values for showers/faucets per home and persons per home.

Measure: 9-watt LED
Ex Ante Savings Approach: NIPSCO used an HOU value of 1,040 and a WHF value of -0.059.
Ex Post Gross Savings Approach: The evaluation team used the 1,135 HOU value for school kits and weighted average WHFs for South Bend, as listed in the Indiana TRM (v2.2), and used the UMP lumen equivalence method to determine delta watts.

Measure: Air filter alarm
Ex Ante Savings Approach: NIPSCO did not quantify natural gas energy savings resulting from the furnace filter whistle and used the 2016 Pennsylvania TRM to calculate savings.
Ex Post Gross Savings Approach: The evaluation team quantified natural gas energy savings resulting from the furnace filter whistle and used Quantec’s Engineering Review and Savings Estimates for the “Filtertone” Filter Restriction Alarm to calculate savings.

Measure: LED night-light
Ex Ante Savings Approach: NIPSCO used a baseline wattage of 0.33 watts per the Indiana TRM (v2.2) and did not apply an incandescent replacement factor.
Ex Post Gross Savings Approach: The evaluation team used the actual wattage of the night-light (0.5 watts) and applied an incandescent replacement factor.

Table 95 shows evaluated ex post savings by measure, including all adjustments for ISRs and electric and gas water heater saturation rates, compared to the audited ex ante savings. There are no claimed savings by measure in the scorecard, so the audited ex ante values were included in this by-measure savings comparison.

Table 95. School Education Program Ex Ante and Ex Post Gross Per-Measure Savings Values

Measure | Audited Ex Ante(a) kWh | kW | therms | Ex Post Gross kWh | kW | therms
Low-flow showerheads, combo kit | 54.30 | 0.003 | 9.55 | 54.97 | 0.002 | 7.98
Low-flow showerheads, gas-only kit | - | - | 19.10 | - | - | 15.95
Kitchen aerators, combo kit | 36.40 | 0.002 | 6.41 | 29.42 | 0.001 | 4.27
Kitchen aerators, gas-only kit | - | - | 6.41 | - | - | 4.27
Bathroom aerators, combo kit | 8.32 | 0.002 | 1.48 | 3.89 | 0.000 | 0.56
Bathroom aerators, gas-only kit | - | - | 2.96 | - | - | 1.13
9-watt LED | 87.00 | 0.012 | - | 92.76 | 0.010 | (1.90)
Air filter alarm | 58.00 | 0.023 | - | 8.74 | 0.011 | 2.50
LED night-light | 14.00 | 0.000 | - | 2.19 | 0.000 | -
Per Combo Kit | 258.02 | 0.041 | 17.44 | 191.98 | 0.023 | 13.41
Per Gas-Only Kit | N/A | N/A | 28.47 | N/A | N/A | 21.35

Note: Totals may not sum properly due to rounding.
a Per-measure audited ex ante savings are derived from the savings document provided to the evaluation team; the scorecard does not provide per-measure inputs.


Realization Rates

Table 96 shows evaluated realization rates for the program. The electric energy (kilowatt-hour), peak demand reduction (kilowatt), and natural gas energy (therm) realization rates were lower than expected, due in part to verified installation rates that were lower than those NIPSCO used. Differences in ex post per-unit savings also contributed.

Table 96. School Education Program Ex Post Gross Savings and Realization Rates

| Metric | Population Ex Ante | Realization Rate | Population Ex Post Gross |
|---|---|---|---|
| Electric Energy Savings (kWh/yr) | 2,947,908 | 74% | 2,193,543 |
| Peak Demand Reduction (kW) | 400 | 67% | 266 |
| Natural Gas Energy Savings (therms/yr) | 202,409 | 78% | 157,106 |

Ex Post Net Savings

The evaluation team calculated freeridership and participant spillover using the survey data collected from 2018 participants. The methodology is described in greater detail in Appendix B, Self-Report Net-to-Gross Evaluation Methodology. As shown in Table 97, the evaluation team estimated a 95% NTG for the 2018 School Education program.

Table 97. 2018 School Education Program Net-to-Gross Results

| Measure Type | n | Freeridership (%) | Participant Spillover (%) | NTG (%) | Ex Post Gross Sample Population Savings (MMBtu) |
|---|---|---|---|---|---|
| 9-watt LED | 60 | 36% | 6% | 62% | 1,448 |
| Low-flow showerhead | 32 | 8% | 6% | 98% | 11,543 |
| Kitchen aerator | 29 | 16% | 6% | 90% | 6,101 |
| Bathroom aerator | 24 | 10% | 6% | 96% | 817 |
| Furnace filter whistle | 7 | 0% | 6% | 106% | 3,194 |
| LED night-light | 41 | 12% | 6% | 94% | 85 |
| Overall | 193 | 11%b | 6%b | 95%b | 23,189 |

a Weighted by survey sample ex post gross program kilowatt-hour savings.
b Weighted by ex post gross program population kilowatt-hour savings.

Freeridership

To determine freeridership, the evaluation team asked 67 participants (representing 193 measure-specific freeridership responses) whether they would have installed equipment of the same efficiency level within one year in the absence of the School Education program. Based on survey feedback, the evaluation team calculated overall freeridership for the program as 11% (Table 98).


Table 98. 2018 School Education Program Freeridership Results

| Measure Type | n | Freeridership (%) | Ex Post Gross Sample Population Savings (MMBtu) |
|---|---|---|---|
| 9-watt LED | 60 | 36%a | 1,448 |
| Low-flow showerhead | 32 | 8%a | 11,543 |
| Kitchen aerator | 29 | 16%a | 6,101 |
| Bathroom aerator | 24 | 10%a | 817 |
| Furnace filter whistle | 7 | 11%a | 3,194 |
| LED night-light | 41 | 12%a | 85 |
| Overall | 193 | 11%b | 23,189 |

a Weighted by survey sample ex post gross program MMBtu savings.
b Weighted by ex post gross program population MMBtu savings.

The evaluation team estimated measure-level freeridership for each participant based on his or her responses to the following questions:

• FR1. "Before you received this program service, were you already planning to buy any equivalent, high-efficiency [MEASURE] for your residence within the next 12 months?"

• FR2. "Would you have purchased the [MEASURE]…around the same time you received the kit, later but within one year, or later but more than one year?"

Participants who answered "No" to FR1 were estimated as 0% freeriders. For participants who answered "Yes" to FR1, the freeridership estimate was based on their answer to FR2. Table 99 shows the response options to the freeridership questions, the freeridership score (FR Score) associated with each response, and the frequency of responses for each measure type.

Table 99. 2018 School Education Freeridership Responses and Scoring

| Freeridership Question / Response Option | FR Score | 9-Watt LED | Low-Flow Showerhead | Kitchen Aerator | Bathroom Aerator | Furnace Filter Whistle | LED Night-Light |
|---|---|---|---|---|---|---|---|
| FR1. If you had not received the kit, would you have purchased [MEASURE] on your own? — No | 0% | 22 | 22 | 25 | 19 | 18 | 45 |
| If Yes — FR2. Would you have purchased the [MEASURE]…? | | | | | | | |
| Around the same time you received the kit | 100% | 10 | 1 | 1 | 1 | 0 | 3 |
| Later but within one year | 50% | 23 | 3 | 7 | 3 | 0 | 4 |
| Later but more than one year | 0% | 5 | 4 | 1 | 0 | 0 | 2 |
| (Don't know) | 25% | 2 | 0 | 0 | 0 | 0 | 0 |
| Total | N/A | 60 | 32 | 29 | 24 | 7 | 41 |
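The FR1/FR2 scoring rule can be sketched in code (a simplified illustration; the function name and response labels are ours, not the survey's exact wording):

```python
# Sketch of the FR1/FR2 freeridership scoring rule (scores per Table 99).
# Response labels are illustrative shorthand, not the survey's exact wording.

def freeridership_score(fr1_would_have_purchased, fr2_timing=None):
    """Return a participant's freeridership score between 0.0 and 1.0."""
    if not fr1_would_have_purchased:
        return 0.0  # answered "No" to FR1: 0% freerider
    timing_scores = {
        "same time": 1.00,           # would have bought anyway
        "within one year": 0.50,     # partial freerider
        "more than one year": 0.00,  # outside the one-year horizon
        "don't know": 0.25,
    }
    return timing_scores[fr2_timing]

print(freeridership_score(False))                    # 0.0
print(freeridership_score(True, "within one year"))  # 0.5
```

Program-level freeridership is then the savings-weighted average of these participant scores, as described in the table footnotes.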

Figure 50 shows the distribution of assigned freeridership scores by program measure type.


Figure 50. 2018 School Education Program

Distribution of Freeridership Scores by Measure Type

Participant Spillover

The evaluation team estimated participant spillover35 using specific information about participants (determined through the evaluation) and the Indiana TRM (v2.2) as a baseline reference. The evaluation team estimated the percentage of program participant spillover by dividing the sum of additional spillover savings (as reported by survey respondents) by the total gross savings achieved by all program survey respondents. The participant spillover estimate for the School Education program is 6%, rounded to the nearest whole percentage, as shown in Table 100.

Table 100. 2018 School Education Program Participant Spillover

| Spillover Savings (MMBtu) | Survey Respondent Program Savings (MMBtu) | Participant Spillover (%) |
|---|---|---|
| 4.58 | 71.67 | 6% |
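The spillover estimate is a single division rounded to the nearest whole percentage; a minimal sketch using the Table 100 values:

```python
# Participant spillover = spillover savings / survey respondents' gross savings
# (MMBtu values from Table 100), rounded to the nearest whole percentage.
spillover_mmbtu = 4.58
respondent_gross_mmbtu = 71.67

spillover_pct = round(spillover_mmbtu / respondent_gross_mmbtu * 100)
print(f"{spillover_pct}%")  # 6%
```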

Five participants said that the program was very important in the decision to purchase and install an

additional energy-efficient product without receiving an incentive. Table 101 shows these additional

participant spillover measures and the total resulting energy savings.

35 Non-participant spillover evaluation activities were not conducted for the 2018 program year.


Table 101. 2018 School Education Program Participant Spillover Measures, Quantity, and Savings

| Spillover Measure | Quantity | Total Energy Savings (MMBtu) |
|---|---|---|
| Low-flow showerhead | 2 | 1.97 |
| ENERGY STAR Clothes Washer | 1 | 1.14 |
| ENERGY STAR Dishwasher | 1 | 1.16 |
| ENERGY STAR Refrigerator | 1 | 0.28 |
| Room Air Conditioner | 1 | 0.04 |
| Total | N/A | 4.58 |

Evaluated Net Savings Adjustments

Table 102 shows ex post gross and ex post net savings for the School Education program.

Table 102. School Education Program Ex Post Net Savings

| Program Measure | Ex Post Gross kWh | Ex Post Gross kW | Ex Post Gross therms | NTGa | Ex Post Net kWh | Ex Post Net kW | Ex Post Net therms |
|---|---|---|---|---|---|---|---|
| Low-flow showerheads | 628,122 | 19 | 94,018 | 95% | 596,716 | 18 | 89,317 |
| Kitchen aerators | 336,156 | 8 | 49,548 | 95% | 319,348 | 8 | 47,070 |
| Bathroom aerators | 44,471 | 3 | 6,657 | 95% | 42,248 | 3 | 6,324 |
| 9-watt LED | 1,059,858 | 115 | (21,653) | 95% | 1,006,866 | 109 | (20,570) |
| Air filter alarm | 99,912 | 122 | 28,537 | 95% | 94,916 | 116 | 27,110 |
| LED night-light | 25,023 | 0 | 0 | 95% | 23,772 | 0 | 0 |
| Program Total | 2,193,543 | 266 | 157,106 | 95% | 2,083,866 | 253 | 149,251 |

a The evaluation team converted electric and gas ex post gross savings to MMBtu savings for program measures so a common unit of analysis could be used for weighting the measure-level freeridership estimates. Measure-level freeridership estimates were weighted by ex post gross program population MMBtu savings to arrive at an 11% program-level freeridership estimate. Participant spillover was estimated as 6% of the total analysis sample ex post gross program MMBtu savings. Combining the program-level freeridership estimate of 11% with the 6% program-level spillover estimate results in a 95% NTG estimate for the program.
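The NTG arithmetic in the footnote can be sketched as follows (program-level values from Table 102; the variable names are ours):

```python
# NTG = 1 - freeridership + participant spillover (per the Table 102 footnote),
# applied to program-level ex post gross savings.
freeridership = 0.11
spillover = 0.06
ntg = 1 - freeridership + spillover

gross_kwh = 2_193_543  # program ex post gross electric savings
net_kwh = round(gross_kwh * ntg)
print(round(ntg, 2), net_kwh)  # 0.95 2083866
```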

Process Evaluation

The 2018 process evaluation included interviews with NIPSCO program staff and Lockheed Martin Energy program staff, and surveys with participating parents.

Program Design and Delivery

As the program implementer, Lockheed Martin Energy was responsible for program administration, making decisions about the kit-measure mix, and consulting with NIPSCO for approval of program materials. NEF handled outreach, teacher enrollment, and development of curriculum and educational materials, while AM Conservation Group took responsibility for kit assembly and distribution. NIPSCO approved the use of any logos and all marketing materials. NEF coordinated directly with teachers to enroll them, scheduled kit shipments (for a month and time selected by the teacher), sent the curriculum materials, and supported delivery of the curriculum to classrooms.


In 2018, the teacher recruitment process remained the same as in previous program years. NIPSCO

provided Lockheed Martin Energy with a list of eligible schools and contact information within its service

territory. NEF used the approved list and sent mailers and emails to teachers, inviting them to enroll.

Teachers could enroll over the phone or online through NIPSCO’s website. Once a teacher enrolled, NEF

mailed the curriculum and kits. In 2018, as in 2017, recruitment relied heavily on re-enrolling teachers from prior years.

Lockheed Martin Energy maintained a secure database that NIPSCO could access to determine the

number of enrollments and shipments, as well as the number of HEWs completed. AM Conservation

Group updated the database weekly with kit delivery reports. Lockheed Martin Energy submitted

monthly reports to NIPSCO, detailing program participation and progress towards goals. NIPSCO and

Lockheed Martin Energy frequently connected to discuss progress and issues, and they remained

available to each other for ad hoc issues throughout the process.

In 2018, the program set a distribution goal of 11,600 kits (11,420 combo kits and 180 gas-only kits),

higher than the 2017 goal of 10,127 kits.

After students received a kit, they completed the HEW at home with their parents. The worksheet asked

participants about measure installations and various home demographics, such as the number of

children and adults in the home. Students returned the HEW to their teachers (these were mailed back

to Lockheed Martin Energy for data collection and reporting), or students filled out the worksheet online

at thinkenergy.org/NIPSCO. Teachers received a $50 gift card incentive for sending in at least 80% of

their classrooms’ HEWs. The program achieved a 63% response rate for HEWs in 2018 (7,321 out of

11,606 total kits), an increase from the 62% response rate in 2017, but lower than the 72% response

rate in 2016.

NIPSCO and Lockheed Martin Energy expressed great satisfaction with the program, referring to it as

their best performing program.

Changes from 2017 Design

NIPSCO and Lockheed Martin Energy reported difficulties in making program design changes once the program was already past the midyear point. They reported one minor change to the 2018 program's design and delivery from 2017.

In 2017, the HEW incorporated a parent comment card to boost the number of contacts for the parent follow-up survey used to verify installation. The parent comment card changed in 2018 to allow parents to opt out of follow-up contact, as opposed to opting in. Of the 7,321 total HEWs returned (via mail or online), 849 parents (12%) did not opt out of the follow-up survey and provided their contact information, compared to 345 parents (9%) who opted in during 2017. The opt-out format contributed to the evaluation team successfully obtaining the targeted number of parent survey responses for the 2018 evaluation.


In the future, NIPSCO and Lockheed Martin Energy are considering updating some of the materials

found in the kits, updating the collateral that the teachers use, and adjusting some of the measures.

Specifics of these changes have not been determined.

2017 Recommendation Status

In addition to the research objectives set forth for the 2018 evaluation, the evaluation team followed up

on the 2017 evaluation recommendations. Table 103 lists the 2017 School Education program

evaluation recommendations and NIPSCO’s progress toward addressing those recommendations to

date.

Table 103. Status of 2017 School Education Program Evaluation Recommendations

Recommendation 1: Consider adding another 9-watt LED bulb to the program kits, bringing the total quantity to four.
Status: Declined. The number of bulbs remained the same as in 2017. NIPSCO explained that increasing the number of bulbs in the kit would require increasing the size of the box; a larger box would not fit in a typical backpack and therefore has less chance of making it home safely.

Recommendation 2: Send teachers reminders to help them remind students and parents to complete the HEW. The reminder could simply ask the teacher to verbally remind students in the classroom, or it could be integrated into the online HEW system, where teachers can dispatch an email to students and parents who have not completed the worksheet.
Status: Completed in 2018. Lockheed Martin Energy explained that no automated email reminders are dispatched to teachers, students, or parents with missing HEWs, but NEF sends several email reminders to teachers to keep them engaged in the program and to remind them of approaching HEW deadlines.

Recommendation 3: Document the reported scorecard savings methodology and sourcing to ensure it matches the base ex ante savings calculation methodologies, including rounding discrepancies.
Status: Completed in 2019. The savings calculation workbook only cited the 2017 kit savings assumptions, noting that they were not updated after the 2017 EM&V activities were completed in the summer of 2018. NIPSCO and Lockheed Martin Energy reported difficulties in changing methodology midyear. Lockheed Martin Energy applied the suggested changes to the 2019 design savings calculation workbook, per NIPSCO's recommendation.

Recommendation 4: Update ex ante savings algorithm inputs, including:
• Updating domestic water heater fuel saturation rates using evaluated results
• Using the known wattage of the LED night-light (0.5 watts) in calculating savings for the measure
• Using the known gallon-per-minute efficiency value for faucet aerators and low-flow showerheads
• Calculating demand reductions and natural gas energy savings resulting from the furnace whistle measure using the methodology outlined in the 2017 evaluation
• Updating all ISR values for each measure to match evaluated ISRs
• Discontinuing weighting savings values between various methodologies
• Fully documenting all savings equations and input sources
• Updating person-per-home and showers/bathroom faucets-per-home input estimates to match the higher values determined through the evaluation surveys
Status: In progress. NIPSCO and Lockheed Martin Energy reported difficulties in changing methodology, including inputs to ex ante savings calculations. NIPSCO noted that any changes suggested in this recommendation will be considered for the 2019 program year. Lockheed Martin Energy considered these suggestions in the updated design savings workbook for the 2019 program year, per NIPSCO's recommendation.


Participant Feedback

This section presents the satisfaction findings from the parent surveys. The evaluation team obtained survey responses from 68 parents.

Satisfaction with Kit Measures

In the parent survey, the evaluation team gathered parents’ responses to questions about satisfaction

with kit measures. As shown in Figure 51, virtually all respondents were satisfied (98%) with the LED

lightbulbs (81% very satisfied; 17% somewhat satisfied). Nearly all respondents (95%) were also satisfied

with the LED night-lights (78% very satisfied; 16% somewhat satisfied). Though still a majority, the

furnace filter whistle (65%) and low-flow showerheads (72%) had the lowest proportion of satisfied

respondents. These results differed slightly from last year’s, in which participants were most satisfied

with the LED night-light (99%) (91% very satisfied; 8% somewhat satisfied), and least satisfied (71%) with

the bathroom faucet aerator (55% very satisfied; 16% somewhat satisfied). Across both the 2017 and

2018 program years, the LED measures saw higher satisfaction than the water measures (aerators and

showerhead).

Figure 51. Satisfaction with Kit Measures

Source: Parent Survey question, “How satisfied are you with the [measure] overall?

Would you say you are…”

The evaluation team asked a follow-up question of respondents who were less than satisfied with the kit measures, asking them to state why. The one respondent who was not satisfied with the LED light bulbs said that the equipment did not work properly. The three respondents who were not satisfied with the LED night-light said the light was not bright enough or did not make any difference. The nine respondents not satisfied with the low-flow showerhead cited low water pressure and general dislike of the measure. Similarly, the four respondents who were less than satisfied with the kitchen faucet aerators did not like the change in water pressure, could not properly install the aerator or get it to work, or generally disliked it. For the bathroom aerator, the nine less-than-satisfied respondents mentioned the aerator not working properly, already owning one, or not being allowed to use it. The nine respondents less than satisfied with the furnace filter whistle said that they did not have a furnace or had no need for it.

The evaluation team also asked participants what could be done to improve the School Education program. Of the 20 respondents who had suggestions for improvement, five suggested informational materials on the following topics:

• How to properly install the measures

• How to calculate the impact of using efficient measures

• Other ways to save energy

Satisfaction with Program

In 2018, respondents showed high satisfaction with the School Education program. Overall, 95% said

they were satisfied with the program (85% very satisfied; 10% somewhat satisfied), as shown in Figure

52. The 2018 program observed statistically similar satisfaction results to 2017.

Figure 52. Satisfaction with Overall Program

Source: Parent Survey question, “How satisfied are you with the Energy Efficient School Kits program overall? Would you say

you are…”


Respondents offered the following positive comments about the program:

• “I think it’s awesome and it definitely saved me money.”

• “It’s a neat idea and I hope my daughter gets to do it again next year.”

• “I thought it was great. It was fun to do with my child, and it was a fun family thing to do. And

there was learning involved also, which is a plus.”

Satisfaction with NIPSCO

In 2018, respondents showed high satisfaction with NIPSCO, as shown in Figure 53. Overall, 84% of

respondents said they were satisfied with NIPSCO (48% very satisfied; 36% somewhat satisfied). The

2018 program observed statistically similar satisfaction results to 2017.

Figure 53. Satisfaction with NIPSCO

Note: Satisfaction with NIPSCO was not a survey question asked for this program during the 2016 program evaluation.

Source: Parent Survey question, “How satisfied are you with NIPSCO overall as your energy service

provider? Would you say you are…”


Conclusions and Recommendations

Conclusion 1: The program continues to achieve high savings and high satisfaction.

Based on ex post gross savings, the program achieved 100% of its participation goal (11,606 total kits distributed), 74% of its electric energy savings goal, 67% of its demand reduction goal, and 77% of its natural gas energy savings goal. The program has seen very high parent satisfaction for the last three years. Overall, 95% of parent respondents were satisfied with the program in 2018, and several parents gave positive feedback regarding the kit measures and educational materials. This is in line with the 95% parent satisfaction rates observed in both 2016 and 2017.

Conclusion 2: Audited ex ante savings differed from reported ex ante savings, largely due to inconsistencies between the calculated savings methodology and scorecard inputs.

As in the 2017 evaluation, program-reported and audited electric energy (kilowatt-hour) savings differed by 228.5 kWh due to a rounding error. Additionally, reported peak demand (kilowatt) savings per kit did not match the provided savings methodology documentation, and the evaluation team could not re-create the per-kit peak demand savings value of 0.035 kW. A review of the savings methodology revealed that the claimed ex ante value should have been 0.041 kW per kit.

Recommendation:

• Document the reported scorecard savings methodology and sourcing to ensure it matches the

base ex ante savings calculation methodologies, including rounding discrepancies.

Conclusion 3: Evaluated savings differed from ex ante savings, with realization rates below 100% for electric and natural gas energy savings.

As in the 2017 evaluation, all reported ex ante savings for the School Kit program assumed an ISR of 100%, considerably inflating the claimed electric energy (kilowatt-hour), peak demand (kilowatt), and natural gas energy (therm) savings above evaluated values. Although conservative savings calculations and verified water heater fuel-type saturations partially offset the impact of these ISRs, they ultimately reduced realization rates.

The equations NIPSCO used to calculate savings for faucet aerators and low-flow showerheads weighted the final savings value, by differing amounts, between savings derived from the Indiana TRM (v2.2) and 2015 evaluated savings; Lockheed Martin Energy could not verify the source of these weighting percentages. Savings equations for these measures also used TRM-suggested values for the gallon-per-minute efficiency rating of the installed equipment, whereas the evaluation team used the actual gallon-per-minute values. Additionally, savings for these measures differed because of discrepancies between the assumed persons-per-home and faucets-per-home values and the survey-verified values.
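To illustrate why these inputs matter, a generic TRM-style low-flow showerhead gas-savings calculation has roughly the following structure. This is a simplified sketch with placeholder values: the function, its defaults, and the example inputs are illustrative assumptions, not the Indiana TRM (v2.2) algorithm or NIPSCO's actual parameters.

```python
# Simplified, illustrative sketch of a generic low-flow showerhead gas-savings
# calculation; all constants and defaults are placeholders, not TRM values.

def showerhead_therms(gpm_base, gpm_eff, persons_per_home,
                      showers_per_person_day=0.6, minutes_per_shower=8.0,
                      delta_t_f=55.0, recovery_eff=0.75):
    """Annual natural gas savings (therms) from one low-flow showerhead."""
    gallons_saved = ((gpm_base - gpm_eff) * minutes_per_shower
                     * showers_per_person_day * persons_per_home * 365)
    # 8.33 Btu heats one gallon of water by 1 degree F; 100,000 Btu per therm
    return gallons_saved * 8.33 * delta_t_f / (recovery_eff * 100_000)

# A higher persons-per-home value, or a lower actual GPM for the installed
# measure, moves the result noticeably:
print(round(showerhead_therms(2.5, 1.5, persons_per_home=2.5), 2))  # 26.76
print(round(showerhead_therms(2.5, 1.5, persons_per_home=3.0), 2))  # 32.11
```

The gallons-per-minute rating, persons per home, and fixtures per home all enter the savings product directly, which is why using actual and survey-verified values changes the per-measure result.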

The evaluation team used the same savings methodology as that used to calculate the reported ex ante savings for LED night-lights, but employed the actual wattage of the efficient night-light in the kit, whereas the ex ante values used the TRM's suggested value. For 9-watt LED bulbs, the evaluation team identified two differing inputs used in the savings calculation (i.e., HOU and electric WHF) but could not verify the ex ante methodology used. TRM-derived LED bulb savings were also weighted, along with 2015 HEA evaluated LED savings and an undetermined methodology, to arrive at the final LED savings value. As a result, 2018 evaluated savings for LED bulbs differed from reported ex ante values.

NIPSCO also did not calculate natural gas savings from the furnace filter whistle, while the evaluation

team did. NIPSCO relied on the 2016 Pennsylvania TRM to calculate furnace filter alarm savings,

whereas the evaluation team used the Quantec Engineering Review and Savings Estimates for the

“Filtertone” Filter Restriction Alarm algorithms. The Quantec study aligned with previous Indiana

evaluations, and the evaluation team found it to be more detailed and technically thorough than the

2016 Pennsylvania TRM methodology.

Recommendations:

• Update domestic water heater fuel saturation rates using evaluated results.

• Use the known wattage of the LED night-light (0.5 watts) in calculating savings for the measure.

• Use the Uniform Methods Project (UMP) lumen equivalence method to determine delta watts

for the LED light bulb measure savings calculation.

• Use the known gallon-per-minute efficiency value for faucet aerators and low-flow

showerheads.

• Calculate natural gas energy savings resulting from the furnace filter whistle measure using the

methodology outlined in the 2018 evaluation.

• Update all ISR values for each measure to match evaluated ISRs.

• Discontinue weighting savings values between various methodologies.

• Fully document all savings equations and input sources.

• Update person-per-home and showers/bathroom faucets-per-home input estimates to match

the higher values determined through the evaluation surveys.


Residential Multi-Family Direct Install Program

NIPSCO's MFDI program relaunched in 2018 after a three-year hiatus. NIPSCO originally intended to relaunch the program in 2017, but encountered program design delays due to the transition to a new implementation contractor for the residential sector (from Good Cents to Lockheed Martin Energy). Lockheed Martin Energy restarted the program in 2018 as an opportunity to engage more customers within the NIPSCO territory, especially renters.

The MFDI program is designed to provide a "one-stop shop" to multifamily building owners, managers, and tenants of multifamily properties containing three or more residences and receiving service from NIPSCO. The program generates immediate energy savings and improvements in two distinct phases. Phase I is a walkthrough assessment of each property, conducted by a Lockheed Martin Energy program advisor. Phase II is an in-unit direct installation of energy-efficient devices at no cost or low cost to the tenant or landlord.

Additional savings can be identified in the common areas of each property, such as community rooms, hallways, and laundry rooms. The program advisor also conducts a walkthrough of these areas and provides a no-cost, detailed report identifying potential savings and rebates that can be incentivized through NIPSCO's SBDI program.

Program Performance

The MFDI program's reported savings surpassed its 2018 electric energy savings goal (921,264 kWh/yr), achieving 108%, but fell short of its other 2018 savings goals, reaching 54% of its peak demand reduction goal (206 kW) and 23% of its natural gas energy savings goal (175,387 therms/yr). Program staff reported a late first enrollment, in May 2018, as they spent the first half of 2018 on marketing efforts to get the program off the ground.

Table 104 presents a savings summary for the program, including program savings goals. The audited savings mostly aligned with the claimed ex ante savings, with a few minor exceptions described more fully later in this section.

• The evaluation team found no issues through the tracking system analysis.

• The engineering analysis completed for the ex post gross analysis reduced program peak

demand savings, but increased program electric energy and natural gas energy savings, due to a

difference in savings algorithms and the use of individual measure characteristics for some

measures.

While the evaluation team calculated ex post gross savings for most measures using Indiana TRM (v2.2),

the ex ante savings were generally an average of the TRM (v2.2) values and evaluated values from the

2015 evaluation report. Additionally, the evaluation team used actual participant data, such as customer

location, whenever possible.


Table 104. 2018 MFDI Program Savings Summary

| Metric | Goal | Ex Ante | Audited | Verified | Ex Post Gross | Ex Post Net | Goal Achievement |
|---|---|---|---|---|---|---|---|
| Electric Energy Savings (kWh/yr) | 921,264 | 937,088 | 937,088 | 937,088 | 993,817 | 993,817 | 108% |
| Peak Demand Reduction (kW) | 206 | 176 | 176 | 176 | 111 | 111 | 54% |
| Natural Gas Energy Savings (therms/yr) | 175,387 | 79,283 | 79,283 | 79,283 | 39,608 | 39,608 | 23% |

Table 105 outlines the ex post gross adjustment factors.

Table 105. 2018 MFDI Program Adjustment Factorsc

| Metric | Realization Rate (%)a | Freeridership | Spillover | NTG (%)b |
|---|---|---|---|---|
| Electric Energy Savings (kWh/yr) | 106% | N/A | N/A | 100% |
| Peak Demand Reduction (kW) | 63% | N/A | N/A | 100% |
| Natural Gas Energy Savings (therms/yr) | 50% | N/A | N/A | 100% |

a Realization rate is defined as ex post gross savings divided by ex ante savings.
b NTG is defined as ex post net savings divided by ex post gross savings.
c NTG (including freeridership and spillover) was not calculated for this program, as it is beyond the scope of this program evaluation.
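Footnote a can be checked directly against the savings summary values; a quick sketch (metric labels and variable names are ours):

```python
# Realization rate = ex post gross / ex ante (Table 105, footnote a),
# computed from the 2018 MFDI values in Table 104.
metrics = {
    "kWh/yr":    (937_088, 993_817),  # (ex ante, ex post gross)
    "kW":        (176, 111),
    "therms/yr": (79_283, 39_608),
}
for name, (ex_ante, ex_post_gross) in metrics.items():
    print(f"{name}: {ex_post_gross / ex_ante:.0%}")
# kWh/yr: 106%, kW: 63%, therms/yr: 50%
```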

Table 106 lists the 2018 program budget and expenditures by fuel type. The program spent

approximately 134% of its electric budget and 41% of its gas budget.

Table 106. 2018 MFDI Program Expenditures

| Fuel | Program Budget | Program Expenditures | Budget Spent (%) |
|---|---|---|---|
| Electric | $279,129 | $373,882 | 134% |
| Natural Gas | $374,754 | $154,883 | 41% |

Research Questions

The evaluation team conducted quantitative analyses and interviews with program staff to answer the following research questions:

• How is the program performing relative to its goals?

• How effective were program marketing efforts in driving participation?

• What opportunities exist for program improvements?

• What efforts are in place to drive customer participation and awareness?

Impact Evaluation

In 2018, NIPSCO claimed MFDI program savings for these six measure types:

• LED light bulbs

• Bathroom faucet aerators (1.0 gpm)

• Kitchen aerators (1.5 gpm)


• Low-flow showerheads (1.5 gpm)

• Hot water pipe insulation

• Smart thermostats (programmable)

This section details each impact evaluation step and its associated electric energy savings, peak demand

reduction, and natural gas savings.

Audited and Verified Savings

To develop an audited measure quantity, the evaluation team first checked the implementer tracking data for duplicates and other data quality issues. The evaluation team also looked for discrepancies between the program tracking data and the program scorecard. The only difference found was in the natural gas savings for lighting measures: 98% of candelabra measures, 71% of globe LEDs, and 66% of 9 W LEDs appeared in the tracking data with a reported 0 therm savings, despite the deemed savings workbook showing -0.05 therms/unit for each of these lighting measures. After review, it was clarified to the evaluation team that all projects with lighting measures claiming no therm savings in the tracking data were electric-only projects. These lighting measures were installed in units that do not have individually metered NIPSCO gas accounts but instead are served by central heating systems and/or electric heat.

Because a participant survey was beyond the scope of this evaluation, and trade allies were instructed to leave no old measures behind in-unit after installing new measures, the in-service rates (ISRs) for all program-installed measures were assumed to be 100%.36

Table 107 summarizes the audited quantity, applied installation rates, and resulting verified quantity per

measure. To calculate the verified measure quantity, the evaluation team multiplied the audited

measure quantity by the installation rate.

36 Per Indiana TRM v 2.2, for direct install measures


Table 107. 2018 MFDI Program Audited and Verified Quantities

| Measure | Unit of Measure | Audited Quantity | ISR | Verified Quantity |
|---|---|---|---|---|
| LED (9W) – Dual Fuel | Lamp | 8,546 | 100% | 8,546 |
| LED (9W) – Electric | Lamp | 16,885 | 100% | 16,885 |
| Candelabra – Dual Fuel | Lamp | 16 | 100% | 16 |
| Candelabra – Electric | Lamp | 638 | 100% | 638 |
| Globe LED – Dual Fuel | Lamp | 1,216 | 100% | 1,216 |
| Globe LED – Electric | Lamp | 2,935 | 100% | 2,935 |
| Bath Aerator – Gas | Aerator | 1,037 | 100% | 1,037 |
| Kitchen Aerator – Gas | Aerator | 648 | 100% | 648 |
| Low Flow Showerhead – Gas | Showerhead | 704 | 100% | 704 |
| Pipe Wrap – Gas | Per foot | 402 | 100% | 402 |
| WiFi Thermostat – Electric | Thermostat | 48 | 100% | 48 |
| WiFi Thermostat – Gas | Thermostat | 99 | 100% | 99 |
| Programmable Thermostat – Electric | Thermostat | 100 | 100% | 100 |
| Programmable Thermostat – Gas | Thermostat | 885 | 100% | 885 |
| Total | | 34,159 | | 34,159 |
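The audit-to-verified step is a simple product of quantity and in-service rate. A minimal sketch using a few of the quantities from Table 107 (measure names abbreviated for illustration):

```python
# Verified quantity = audited quantity x in-service rate (ISR).
# Quantities are from Table 107; the 100% ISR is the deemed value
# for direct-install measures per Indiana TRM v2.2.
audited = {
    "LED (9W) - Dual Fuel": 8_546,
    "LED (9W) - Electric": 16_885,
    "Programmable Thermostat - Gas": 885,
}
isr = 1.00  # deemed in-service rate for direct-install measures

verified = {measure: round(qty * isr) for measure, qty in audited.items()}
print(verified["LED (9W) - Dual Fuel"])  # 8546 when the ISR is 100%
```

With a 100% ISR the verified quantities simply equal the audited quantities, as Table 107 shows; a participant survey in a future cycle could supply a measured ISR below 100%.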

Ex Post Gross Savings

The evaluation team referred to the Indiana TRM (v2.2) for variable assumptions to calculate ex post gross electric savings, demand reduction, and natural gas savings. Where data were unavailable in the Indiana TRM (v2.2), the evaluation team used data from the Illinois TRM (v6), the Pennsylvania TRM (2016), or other secondary sources. The evaluation team revised assumptions for savings estimates applicable to the NIPSCO service territory, as needed. Appendix H. Multi-Family Direct Install Program contains more details on the specific algorithms, variable assumptions, and references for all program measure ex post gross calculations.

Table 108 shows the audited savings and ex post gross per-measure savings for 2018 MFDI program

measures.

Table 108. 2018 MFDI Program Ex Ante and Ex Post Gross Per-Measure Savings Values

| Measure | Units | Audited Savingsa kWh | Audited Savingsa kW | Audited Savingsa Therms | Ex Post Gross kWh | Ex Post Gross kW | Ex Post Gross Therms |
|---|---|---|---|---|---|---|---|
| LED (9W) – Dual Fuel | Lamp | 29.0 | 0.005 | -0.05 | 32.8 | 0.004 | -0.06 |
| LED (9W) – Electric | Lamp | 29.0 | 0.005 | - | 32.8 | 0.004 | - |
| Candelabra – Dual Fuel | Lamp | 29.7 | 0.004 | -0.05 | 23.2 | 0.003 | -0.04 |
| Candelabra – Electric | Lamp | 29.7 | 0.004 | - | 23.2 | 0.003 | - |
| Globe LED – Dual Fuel | Lamp | 28.9 | 0.004 | -0.05 | 22.2 | 0.003 | -0.04 |
| Globe LED – Electric | Lamp | 28.9 | 0.004 | - | 22.2 | 0.003 | - |
| Bath Aerator – Gas | Aerator | - | - | 1.85 | - | - | 1.47 |
| Kitchen Aerator – Gas | Aerator | - | - | 8.01 | - | - | 7.92 |
| Low Flow Showerhead – Gas | Showerhead | - | - | 11.94 | - | - | 14.02 |
| Pipe Wrap – Gas | Per Foot | - | - | 0.82 | - | - | 0.66 |
| WiFi Thermostat – Electric | Thermostat | 466.7 | 0.202 | - | 351.4 | 0.000 | - |
| WiFi Thermostat – Gas | Thermostat | - | - | 89.70 | - | - | 23.74 |
| Programmable Thermostat – Electric | Thermostat | 378.0 | 0.200 | 0.00 | 351.4 | 0.000 | - |
| Programmable Thermostat – Gas | Thermostat | - | - | 62.20 | - | - | 23.74 |

Note: Totals may not sum properly due to rounding.
a Values presented at the measure level represent audited values, since the scorecard provides only savings totals.

There are numerous reasons for differences between ex ante and ex post gross savings, but they are primarily driven by the following overarching factor:

• The evaluation team calculated ex post gross savings for most of the measures using the Indiana TRM (v2.2). The planning and reporting assumptions used to claim ex ante savings reference both the TRM (v2.2) and the 2015 EM&V report to calculate savings, sometimes averaging the results of the two sources, as documented in Table 109.

Sources for the ex post gross savings calculation for each measure are provided in Appendix H. Multi-Family Direct Install Program. Table 109 highlights notable differences between ex ante and ex post gross estimates.

Table 109. 2018 MFDI Program Differences Between Ex Ante and Ex Post Gross

| Measure | Ex Ante Sources and Assumptions | Ex Post Gross Sources and Assumptions | Primary Reasons for Differences |
|---|---|---|---|
| LEDs | Ex ante savings are an average of savings values from (1) HEA EM&V, (2) TRM v2.2 with a 60 W base, and (3) TRM v2.2 with a 30.69 W base. | Indiana TRM v2.2 and information in program tracking data. Baseline wattage value from the DOE Uniform Methods Project, Chapter 21 Residential Lighting Evaluation Protocol for post-EISA. Waste heat factors averaged across customer location, per customer type. | Baseline wattages and waste heat factors. |
| Bathroom Aerator | Roughly 80/20 split between TRM v2.2 and EM&V values; assumed unknown housing type for people per household and faucets per household; South Bend. | Indiana TRM v2.2 and information in program tracking data. Cold water inlet temperature averaged across customer location per customer type; assumed single-family household. Actual rating for gpm-eff. | Water temperatures, gpm assumptions, people per household, number of faucets per household, and drain factor; ex post gross assumes no impact of water down drain (DR=1). |
| Kitchen Aerator | Roughly 90/10 split between TRM v2.2 and EM&V values; assumed unknown housing type for people per household and faucets per household; South Bend. | Indiana TRM v2.2 and information in program tracking data. Cold water inlet temperature averaged across customer location per customer type; assumed multi-family housing inputs. Actual rating for gpm-eff. | Water temperatures, gpm assumptions, people per household, faucets per household, and drain factor. |
| Low-Flow Showerhead | TRM v2.2; assumed unknown housing type for people per household and showerheads per household; South Bend. | Indiana TRM v2.2 and information in program tracking data. Cold water inlet temperature averaged across customer location per customer type; assumed multi-family housing inputs. Actual rating for gpm-eff. | Water temperatures, gpm assumptions, people per household, and showerheads per household. |
| Pipe Wrap | Average of TRM v2.2 and 2015 EM&V savings values for both gas and electric water heaters. | Indiana TRM v2.2. | Initial R-value and other unknown assumptions. |

Realization Rates

The next three tables (Table 110 through Table 112) present the program's ex ante reported, audited, verified, and ex post gross savings. The program achieved electric energy, demand reduction, and natural gas energy realization rates of 106%, 63%, and 50%, respectively.

Thermostat measures account almost entirely for the low demand reduction and natural gas realization rates. The evaluation team calculated a substantially lower ex post gross demand reduction (kW) and natural gas energy (therm) value for each of these measures. Given the number of thermostats provided through the program (1,132 thermostats across 16 properties), this difference resulted in a considerable decrease in ex post gross savings compared with audited savings.
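The realization rates cited above can be reproduced directly from the program totals in Tables 110 through 112. A minimal sketch of the arithmetic:

```python
# Realization rate = ex post gross savings / ex ante reported savings.
# Program totals from Tables 110-112.
ex_ante = {"kwh": 937_088, "kw": 176, "therms": 79_283}
ex_post = {"kwh": 993_817, "kw": 111, "therms": 39_608}

rates = {metric: ex_post[metric] / ex_ante[metric] for metric in ex_ante}
for metric, rate in rates.items():
    print(f"{metric}: {rate:.0%}")  # kwh: 106%, kw: 63%, therms: 50%
```

The same ratio definition appears later for the Behavioral program (Table 115): realization rate is ex post gross savings divided by ex ante gross savings.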


Table 110. 2018 MFDI Program Evaluated Electric Energy Savings Steps

| Measure | Ex Antea Deemed Savings (kWh/yr/unit) | Ex Antea Savings (kWh/yr) | Audited Gross Savings (kWh/yr) | Verified Gross Savings (kWh/yr) | Ex Post Gross Per-Measure Savings (kWh/yr/unit) | Ex Post Gross Savings (kWh/yr) |
|---|---|---|---|---|---|---|
| LED (9W) – Dual Fuel | 29.0 | 247,834 | 247,834 | 247,834 | 32.8 | 280,435 |
| LED (9W) – Electric | 29.0 | 489,665 | 489,665 | 489,665 | 32.8 | 554,077 |
| Candelabra – Dual Fuel | 29.7 | 475 | 475 | 475 | 23.2 | 371 |
| Candelabra – Electric | 29.7 | 18,949 | 18,949 | 18,949 | 23.2 | 14,778 |
| Globe LED – Dual Fuel | 28.9 | 35,142 | 35,142 | 35,142 | 22.2 | 26,993 |
| Globe LED – Electric | 28.9 | 84,822 | 84,822 | 84,822 | 22.2 | 65,152 |
| Bath Aerator – Gas | - | - | - | - | - | - |
| Kitchen Aerator – Gas | - | - | - | - | - | - |
| Low Flow Showerhead – Gas | - | - | - | - | - | - |
| Pipe Wrap – Gas | - | - | - | - | - | - |
| WiFi Thermostat – Electric | 466.7 | 22,402 | 22,402 | 22,402 | 351.43 | 16,869 |
| WiFi Thermostat – Gas | - | - | - | - | - | - |
| Programmable Thermostat – Electric | 378.0 | 37,800 | 37,800 | 37,800 | 351.43 | 35,143 |
| Programmable Thermostat – Gas | - | - | - | - | - | - |
| Total Savings | N/A | 937,088 | 937,088 | 937,088 | N/A | 993,817 |
| Total Program Realization Rate | | | | | | 106% |

Note: Totals may not sum properly due to rounding.
a Values presented at the measure level represent audited values, since the scorecard provides only savings totals.


Table 111. MFDI Program Evaluated Peak Demand Reduction Steps

| Measure | Ex Ante Deemed Reduction (kW/unit) | Ex Antea Reduction (kW) | Audited Gross Reduction (kW) | Verified Gross Reduction (kW) | Ex Post Gross Per-Measure Reduction (kW/unit) | Ex Post Gross Reduction (kW) |
|---|---|---|---|---|---|---|
| LED (9W) – Dual Fuel | 0.005 | 42.73 | 42.73 | 42.73 | 0.004 | 33.18 |
| LED (9W) – Electric | 0.005 | 84.43 | 84.43 | 84.43 | 0.004 | 65.55 |
| Candelabra – Dual Fuel | 0.004 | 0.06 | 0.07 | 0.07 | 0.003 | 0.04 |
| Candelabra – Electric | 0.004 | 2.60 | 2.60 | 2.60 | 0.003 | 1.75 |
| Globe LED – Dual Fuel | 0.004 | 4.87 | 4.81 | 4.81 | 0.003 | 3.19 |
| Globe LED – Electric | 0.004 | 11.74 | 11.60 | 11.60 | 0.003 | 7.71 |
| Bath Aerator – Gas | - | - | - | - | - | - |
| Kitchen Aerator – Gas | - | - | - | - | - | - |
| Low Flow Showerhead – Gas | - | - | - | - | - | - |
| Pipe Wrap – Gas | - | - | - | - | - | - |
| WiFi Thermostat – Electric | 0.202 | 9.70 | 9.70 | 9.70 | 0.000 | 0.00 |
| WiFi Thermostat – Gas | - | - | - | - | - | - |
| Programmable Thermostat – Electric | 0.200 | 20.00 | 20.00 | 20.00 | 0.000 | 0.00 |
| Programmable Thermostat – Gas | - | - | - | - | - | - |
| Total Savings | N/A | 176 | 176 | 176 | N/A | 111 |
| Total Program Realization Rate | | | | | | 63% |

Note: Totals may not sum properly due to rounding.
a Values presented at the measure level represent audited values, since the scorecard provides only savings totals.


Table 112. MFDI Program Evaluated Natural Gas Energy Savings Steps

| Measure | Ex Antea Deemed Savings (therms/yr/unit) | Ex Antea Savings (therms/yr) | Audited Gross Savings (therms/yr) | Verified Gross Savings (therms/yr) | Ex Post Gross Per-Measure Savings (therms/yr/unit) | Ex Post Gross Savings (therms/yr) |
|---|---|---|---|---|---|---|
| LED (9W) – Dual Fuel | -0.05 | -427.3 | -427.3 | -427.3 | -0.06 | -497.97 |
| LED (9W) – Electric | - | - | - | - | - | - |
| Candelabra – Dual Fuel | -0.05 | -0.80 | -0.80 | -0.80 | -0.04 | -0.66 |
| Candelabra – Electric | - | - | - | - | - | - |
| Globe LED – Dual Fuel | -0.05 | -60.80 | -60.80 | -60.80 | -0.04 | -47.93 |
| Globe LED – Electric | - | - | - | - | - | - |
| Bath Aerator – Gas | 1.85 | 1,918.5 | 1,918.5 | 1,918.5 | 1.47 | 1,525.0 |
| Kitchen Aerator – Gas | 8.01 | 5,190.5 | 5,190.5 | 5,190.5 | 7.92 | 5,134.5 |
| Low Flow Showerhead – Gas | 11.94 | 8,405.8 | 8,405.8 | 8,405.8 | 14.02 | 9,867.6 |
| Pipe Wrap – Gas | 0.82 | 329.6 | 329.6 | 329.6 | 0.66 | 266.5 |
| WiFi Thermostat – Electric | - | - | - | - | - | - |
| WiFi Thermostat – Gas | 89.70 | 8,880.3 | 8,880.3 | 8,880.3 | 23.74 | 2,350.4 |
| Programmable Thermostat – Electric | - | - | - | - | - | - |
| Programmable Thermostat – Gas | 62.20 | 55,047.0 | 55,047.0 | 55,047.0 | 23.74 | 21,010.8 |
| Total Savings | N/A | 79,283 | 79,283 | 79,283 | N/A | 39,608 |
| Total Program Realization Rate | | | | | | 50% |

Note: Totals may not sum properly due to rounding.
a Values presented at the measure level represent audited values, since the scorecard provides only savings totals.


Ex Post Net Savings

Performing an NTG analysis for this program was beyond the scope of work, as the program began mid-year and participant surveys were not undertaken. Because this is a direct-install program with no cost to participants, the evaluation team applied a 100% NTG value for the 2018 program year.

Process Evaluation

As a part of the process evaluation, the evaluation team interviewed NIPSCO's program manager and implementation staff to gain a better understanding of the program design, the delivery process, and any challenges or successes experienced. The evaluation team's findings follow.

Program Design and Delivery

The MFDI program supports energy efficiency in residential units of multi-family properties. Common spaces of the property, including community rooms, hallways, and laundry rooms, can also be evaluated for energy efficiency upgrades through the program. A program advisor from Lockheed Martin Energy walks through these areas with the property manager and provides a no-cost report that identifies energy saving opportunities, estimates energy savings and project costs, and lists applicable NIPSCO rebates. Commercially metered common area improvements identified in the assessment receive rebates through the SBDI program; that program maintains a budget separate from the MFDI program, but the MFDI program allows property managers to apply for SBDI rebates directly through the MFDI application.

After the initial walk-through, the program advisor sits down with the property manager to identify measures of interest for in-unit installation. A member of Lockheed Martin's trade ally network then returns at a later date to conduct the installations.

To qualify for the program, property owners must manage multi-family buildings that have three or more adjoined units with individually metered, active NIPSCO residential electric and/or natural gas service. Additionally, the building must be more than five years old and must not have received a utility-sponsored energy assessment in the past three years.

The total incentive cap that can be claimed per multi-family building by a trade ally depends on the

number of residential units in the building. The per residential unit cap on incentives is $230, which does

not include any SBDI measures installed in building common areas.

Lockheed Martin Energy administered the MFDI program in 2018 and was responsible for program design and management, processing contractor payments, conducting quality assurance and quality control, executing technical training, and providing contractor support to facilitate quality installation of measures. The implementer also conducted the initial energy assessment walkthrough of each property and managed a network of trade allies to execute the direct installs.

Overall, NIPSCO expressed satisfaction with Lockheed Martin Energy as the program implementer.


Program Marketing and Outreach

Lockheed Martin Energy was responsible for the program marketing and outreach. They printed

banners, put out sandwich boards in front of buildings, developed door hangers for property managers

to remind tenants of any impending work, drafted an email for property managers to print and deliver

to tenants, and supplied common area notices, to ensure that all building tenants were aware that a

trade ally would be visiting the premises and performing in-unit installations.

To recruit properties and property managers to the program, Lockheed Martin Energy searched apartments.com for potential participants, arrived on-site with a box of measures on hand, and talked to property staff and management. During that initial visit, Lockheed Martin Energy staff aimed to conduct the walkthrough the same day, rather than returning later, in hopes of beginning the application process before leaving the property.

Recognizing that some property managers own more than one building or complex, Lockheed Martin

Energy was hopeful that if an owner or manager had a positive experience at one property, they would

be interested in making upgrades to another property as well, or would notify other building owners

through word of mouth.

Through these marketing and outreach efforts, the program installed energy efficiency measures at 24

multi-family properties in 2018.

Program Challenges

NIPSCO and Lockheed Martin Energy experienced a challenge around cultivating interest with property

managers in a very large territory. NIPSCO set a target of visiting and servicing 2,700 units through the

MFDI program in 2018, but fell short of this target. NIPSCO and Lockheed Martin Energy found it difficult

to predict which properties would ultimately express interest in participation, and with properties

having such a range in the quantity of units, strategizing an accurate target number of units was

complicated.

NIPSCO reported that the program delivered lower in-unit natural gas savings than expected among the projects completed in 2018, because an unexpectedly high proportion of properties had central heating systems rather than in-unit heaters.

Lockheed Martin Energy noted a disconnect between the program values and concerns of the central property management staff and those of the on-site property staff. The central property management staff, seeing the program as a free and beneficial service, typically favored moving forward with the program after initial contact with Lockheed Martin Energy. The on-site personnel, however, were more hesitant about the logistics of entering units and conducting the actual installations. This disconnect between the two parties resulted in extended or delayed project timelines.

NIPSCO noted delays in collecting data in LM Captures, and invoicing was put on hold so that the data could be transferred properly. While the data tracking is the same as in other NIPSCO programs, the on-site application completion by the Lockheed Martin Energy program advisor is relatively more time consuming. To improve and expedite data tracking in LM Captures, Lockheed Martin Energy is considering integrating an ongoing spreadsheet of its growing project pipeline and outreach efforts into LM Captures in 2019.

Conclusions and Recommendations

Conclusion 1: Although the program had a three-year hiatus and a late relaunch, the MFDI program operated well and exceeded its electric savings goal.

Lockheed Martin Energy and NIPSCO expressed satisfaction with on-the-ground marketing efforts to get

the program rolled out this year. The electric energy savings goal of 921 MWh was exceeded at a goal

achievement of 108%. The program is well positioned for the new program cycle, with an extensive

pipeline developed. In 2019, less effort will be needed to recruit property managers and more effort will

be needed to engage with interested property managers, organize 12-24 month installation plans with

those sites, and manage new and existing accounts.

Conclusion 2: The MFDI program underperformed on natural gas savings due, in part, to an incorrect

implementer assumption that each residential unit has its own gas heating equipment.

The evaluation observed a 50% realization rate for natural gas savings in 2018. The multi-family

buildings visited had central gas heating and/or electric baseboard heat, where the tenant is only

responsible for the electric portion of the utility bill. Only half of the expected reduction in therms per

unit was realized through gas saving measures. Having no baseline data for historical usage, or any

indication of heating systems in the buildings that were being targeted, NIPSCO and Lockheed Martin

Energy overestimated the savings potential.

Recommendation:

• Adjust the natural gas savings goal in future years, based on the findings from the 2018

evaluation.


Residential Behavioral Program

First launched in 2011, the Behavioral program provides paper and electronic Home Energy Reports (HERs) to selected NIPSCO customers. The reports detail the customer's energy usage, including historical consumption data and a comparison to other households, and provide low-cost and no-cost tips to save energy. Customers in the program with a valid email address also receive monthly eHERs and access to the program-affiliated web portal to review their energy consumption and see additional energy saving tips. The HERs also promote and encourage participation in other NIPSCO energy efficiency programs.

The program uses a randomized control trial (RCT) design whereby customers are randomly assigned to a

treatment or control group. Customers in the treatment group receive the HERs while customers in the

control group do not receive any HERs. The customer population is divided into seven waves based on

when a customer began receiving the HERs (Table 113). The initial five waves have respective natural

gas and electric populations known as cohorts. The program launched a sixth wave of gas only

customers in September 2017, and a seventh wave of electric only customers in May 2018. Treatment

group customers in all seven waves received the reports in 2018. The number of reports a treatment

group customer received varied by their fuel type and by availability of a valid email address.

Table 113. 2018 Behavioral Program Design

| Group and Wave | Fuel Type | Electric Customers (Beginning of 2018) | Natural Gas Customers (Beginning of 2018) |
|---|---|---|---|
| Treatment Group | | | |
| Wave 1 (first report March 2011) | Dual | 94,462 | 94,172 |
| Wave 2 (first report June 2012) | Dual | 7,618 | 7,597 |
| Wave 3 (first report July 2014) | Dual | 33,316 | 33,308 |
| Wave 4 (first report March 2015) | Dual | 25,653 | 25,540 |
| Wave 5 (first report June 2017) | Dual | 31,249 | 31,220 |
| Wave 6 (first report September 2017) | Natural Gas | - | 47,560 |
| Wave 7 (first report May 2018) | Electric | 23,450 | - |
| Total Treatment Group | | 215,749 | 239,397 |
| Control Group | | | |
| Wave 1 | Dual | 31,389 | 31,308 |
| Wave 2 | Dual | 7,657 | 7,620 |
| Wave 3 | Dual | 7,529 | 7,533 |
| Wave 4 | Dual | 6,528 | 6,503 |
| Wave 5 | Dual | 10,186 | 10,171 |
| Wave 6 | Natural Gas | - | 11,429 |
| Wave 7 | Electric | 11,260 | - |
| Total Control Group | | 74,549 | 74,565 |

Delivery frequency: treatment group customers receive four paper HERs a year and monthly eHERs; natural gas-only customers receive monthly paper HERs and eHERs during the winter months; all treatment customers have year-round web portal access. Control group customers receive no reports.

Note: For the dual fuel waves, the same group of customers receives natural gas and electric feedback. The customer counts shown are based on billing data. Due to missing billing data, there are differences in counts between electric and natural gas.


Program Performance

Table 114 presents a savings summary for the program, including goals. The program achieved 130% of its electric gross savings goal and 156% of its natural gas gross savings goal (aggregate of all seven waves). NIPSCO did not have a demand reduction goal for the program and did not track ex ante demand reduction. However, the evaluation team estimated demand reduction of 3,650 kW as part of the ex post analysis.

Table 114. 2018 Behavioral Program Savings Summary

| Metric | Gross Savings Goal | Ex Ante | Audited | Verified | Ex Post Gross | Ex Post Net | Gross Goal Achievement |
|---|---|---|---|---|---|---|---|
| Electric Energy Savings (kWh/year) | 24,516,272 | 31,504,104 | 31,504,104 | 32,126,813 | 31,976,257 | 31,976,257 | 130% |
| Peak Demand Reduction (kW) | N/A | N/A | N/A | N/A | 3,650 | 3,650 | N/A |
| Natural Gas Energy Savings (therms/year) | 751,885 | 1,110,584 | 1,110,584 | 1,176,246 | 1,169,960 | 1,169,960 | 156% |

Table 115 presents the program realization rates and NTG percentages. Electric energy savings yielded a

101% realization rate and natural gas energy savings yielded a 105% realization rate.

Table 115. 2018 Behavioral Program Adjustment Factors

| Metric | Realization Ratea | NTGb |
|---|---|---|
| Electric Energy Savings (kWh/year) | 101% | 100% |
| Peak Demand Reduction (kW) | - | 100% |
| Natural Gas Energy Savings (therms/year) | 105% | 100% |

a Realization rate is defined as ex post gross savings divided by ex ante gross savings.
b NTG is defined as ex post net savings divided by ex post gross savings. Here, the evaluation produces a net savings value, so the NTG is 100%.

Table 116 presents the operating expenses for the 2018 program year, split out by fuel type. The

program used 97% of its electric budget and 98% of its natural gas budget.

Table 116. 2018 Behavioral Program Expenditures

| Fuel | Program Budget | Program Expenditures | Budget Spent (%) |
|---|---|---|---|
| Electric | $1,828,988 | $1,782,342 | 97% |
| Natural Gas | $192,544 | $189,063 | 98% |

Research Questions

The evaluation team conducted qualitative and quantitative research activities to answer several key research questions for the Behavioral program:

• What energy saving actions have treatment group customers taken?

• How often do customers read the reports?

• What do customers do with the reports?


• How satisfied are customers with the different report sections and with the reports overall?

• Do customers have any recommendations for improving the reports?

• Are customers aware of the program-affiliated web portal?

• How satisfied are customers with the web portal overall?

• What energy efficiency programs are treatment group customers aware of?

• What is the overall satisfaction with NIPSCO?

• Do customers participate in other NIPSCO energy efficiency programs based on the information

provided in the reports?

• What are the demographic characteristics of treatment group customers?

Impact Evaluation

This section presents the results of the Residential Behavioral program impact evaluation. The program uses an RCT design whereby customers are randomly assigned to a treatment or control group. Customers in the treatment group receive the HERs while customers in the control group do not receive any HERs. The control group's energy use acts as a baseline, or counterfactual, representing what the treatment group's energy use would have looked like in the absence of the program. Because the savings are calculated using a modeled difference-in-differences approach with an RCT, gross savings are inherently equal to net savings37 by design.
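The difference-in-differences logic behind the RCT comparison can be illustrated with a toy calculation. This is a simplified sketch with made-up numbers, not the evaluation's actual regression model, which operates on panel billing data:

```python
# A stylized difference-in-differences estimate of per-household savings.
# Savings = (control group's change in usage) - (treatment group's change),
# so usage drift common to both groups (weather, economy) cancels out.
def did_savings(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """All inputs are average usage per household for a period."""
    return (ctrl_post - ctrl_pre) - (treat_post - treat_pre)

# Hypothetical average monthly kWh per household:
savings = did_savings(treat_pre=900.0, treat_post=880.0,
                      ctrl_pre=905.0, ctrl_post=902.0)
print(savings)  # 17.0 kWh/household/month under these made-up inputs
```

Because assignment is random, the control group's change is an unbiased estimate of what the treatment group would have done absent the reports, which is why gross and net savings coincide.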

The Behavioral program differs from other programs in the portfolio in that it claims savings from

December of the previous program year through November of the current program year. The Behavioral

program impact evaluation does not include December of the current program year because the

implementer claims savings based on customer billing data and complete data for December are not

available for up to three months. Thus, for 2018, the program claimed savings for December 2017

through November 2018, and this evaluation report covers that same time period.

The evaluation team calculated savings in two distinct steps:

• Regression analysis of program billing data, which results in verified savings

• Uplift analysis, which results in ex post gross savings

While program performance is reported as an aggregate of the seven individual waves, this report

discusses results that apply to individual waves and cohorts where pertinent. Each wave possesses

unique characteristics contingent upon when the wave began receiving treatment, the size of the

treatment and control populations, and the baseline energy consumption of the customers within the

wave. As such, some considerations and recommendations apply to a selection of treatment waves, but

not to the program as a whole. This in-depth discussion provides the context necessary to understand

37 In this instance, “net” means net of freeridership (in contrast to gross or excluding freeridership).


the program performance overall and the nuance of recommendations related to specific wave

considerations.

Ex Ante Program Savings

Oracle Utilities Opower, overseen by the program implementer, estimates savings monthly based on available billing data and statistical models. Oracle Utilities Opower updates savings estimates for any particular month for up to three months as additional billing data become available. Lockheed Martin Energy reports the current month plus any adjustments on the monthly scorecards, which constitute the ex ante program savings for the Behavioral program. The implementer reported 31,504,104 kWh and 1,110,584 therms of ex ante program savings.

Audited Program Savings

The Behavioral program does not have an audited step like other programs, as it does not include the installation of specific measures. The evaluation team reviewed the billing data and customer counts provided by the implementer and found no errors; therefore, audited program savings are equal to ex ante program savings.

Verified Program Savings

Verified program savings are the direct result of the regression analysis.

Regression Analysis (Step 1)

The regression analysis produced savings estimates of 32,127 MWh of electricity and 1,176,246 therms of natural gas in 2018. Note that modeled electric savings for one wave (Wave 2) and modeled natural gas savings for four waves (Wave 1 eHER, Wave 2, Wave 3, and Wave 5) are not statistically significant (p>0.10). Because the program is an RCT experimental design, these results are nonetheless the unbiased, best estimates of the true savings values. Although for these waves the evaluation team cannot rule out that savings equal zero, for every wave the team equally cannot rule out any other value within the confidence interval. The evaluation team therefore reports confidence intervals for all waves and uses the point estimate as the best estimate of savings (see Table 117). For example, the Wave 7 confidence interval for electric savings ranges from 985 MWh to 2,172 MWh, and the evaluation team reports the center point (1,578 MWh) as the evaluated savings. The evaluation team applied the same approach across all waves, even where the interval included zero.
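This reporting convention can be illustrated with Wave 7's published interval. The standard error below is backed out from the reported bounds for illustration only; it is not a value reported by the evaluation:

```python
# Point-estimate reporting: the center of a symmetric 90% CI is the
# verified savings, whether or not the interval excludes zero.
Z90 = 1.645  # two-sided 90% confidence z-value (normal approximation)

lower, upper = 985.0, 2172.0        # Wave 7 electric CI, in MWh
point = (lower + upper) / 2          # 1,578.5 MWh; report rounds to 1,578
se = (upper - lower) / (2 * Z90)     # implied standard error (illustrative)

significant = lower > 0 or upper < 0  # does the CI exclude zero?
print(round(point, 1), round(se, 1), significant)
```

Running the same check on Wave 2 (CI of -72 to 961 MWh) would yield `significant == False`, yet the midpoint would still be reported as the best estimate, matching the approach described above.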

These savings values do not account for double-counted savings from participation in other NIPSCO

offerings; these adjustments were generated through an uplift analysis and are presented in a

subsequent section, Program Uplift (Step 2).

Table 117 displays the claimed and verified savings (before uplift analysis) and the per-household electric savings percentage for each wave reporting electric savings. Verified savings exceeded implementer-reported savings for each wave except Wave 1, with realization rates ranging from 101% (Wave 7) to 136% (Wave 2). While Wave 2 demonstrates the highest realization rate among individual waves (136%), it is the only electric wave whose savings are not statistically significant. This is attributable to a small savings


point estimate, wide confidence interval, and small treatment and control groups. However, while not

statistically significant, the savings estimate is still expected to be unbiased.

Table 117. 2018 Behavioral Program Claimed and Verified Electric Savings

| Wave | Claimed (MWh) | Verified (MWh) | 90% CI Lower (MWh) | 90% CI Upper (MWh) | Per-Household Savings | 90% CI Lower | 90% CI Upper |
|---|---|---|---|---|---|---|---|
| Wave 1 (eHER)a | 18,870 | 6,589 | 5,042 | 8,135 | 2.25% | 1.72% | 2.78% |
| Wave 1 (No eHER)a | | 11,547 | 8,468 | 14,625 | 1.85% | 1.35% | 2.34% |
| Wave 2b | 327 | 445 | -72 | 961 | 0.99% | -0.16% | 2.15% |
| Wave 3 | 5,359 | 5,766 | 3,644 | 7,887 | 1.90% | 1.20% | 2.60% |
| Wave 4 | 3,194 | 3,726 | 2,039 | 5,414 | 1.91% | 1.04% | 2.77% |
| Wave 5 | 2,186 | 2,477 | 1,159 | 3,795 | 0.96% | 0.45% | 1.47% |
| Wave 7 | 1,568 | 1,578 | 985 | 2,172 | 1.34% | 0.84% | 1.85% |
| Total Unadjustedc | 31,504 | 32,127 | 30,371 | 33,882 | 1.75% | 1.65% | 1.85% |

a The eHER and no-eHER populations had significantly different baseline consumption numbers such that it was necessary to model them separately to achieve accurate and significant results. Claimed savings (18,870 MWh) were reported for Wave 1 as a whole.
b Savings for Wave 2 were not statistically significant for either the electric or natural gas fuel types.
c Unadjusted savings do not account for channeling (uplift) analysis.

Table 118 displays the claimed and verified savings (before uplift analysis) and per-household natural

gas savings percentage for each wave reporting natural gas savings. Among the natural gas cohorts,

each wave achieved less than 1% savings of per-household natural gas consumption, with only the Wave

1 (no eHER cohort) and Wave 4 exceeding 0.5% household savings.

Table 118. 2018 Behavioral Program Verified Natural Gas Savings

| Wave | Claimed (therms) | Verified (therms) | 90% CI Lower (therms) | 90% CI Upper (therms) | Per-Household Savings | 90% CI Lower | 90% CI Upper |
|---|---|---|---|---|---|---|---|
| Wave 1 (eHER)a | 469,960 | 36,511 | -63,649 | 136,670 | 0.13% | -0.23% | 0.50% |
| Wave 1 (No eHER) | | 479,823 | 257,857 | 701,789 | 0.72% | 0.39% | 1.05% |
| Wave 2a | 40,467 | 28,725 | -14,964 | 72,414 | 0.46% | -0.24% | 1.16% |
| Wave 3a | 157,683 | 132,668 | -4,465 | 269,801 | 0.42% | -0.01% | 0.85% |
| Wave 4 | 121,047 | 118,276 | 9,915 | 226,637 | 0.55% | 0.05% | 1.06% |
| Wave 5a | 97,599 | 78,822 | -12,406 | 170,050 | 0.29% | -0.05% | 0.63% |
| Wave 6 | 223,828 | 301,421 | 131,500 | 471,342 | 0.45% | 0.19% | 0.70% |
| Total Unadjustedb | 1,110,584 | 1,176,246 | 1,040,519 | 1,311,973 | 0.47% | 0.42% | 0.52% |

Note: Claimed savings of 469,960 therms were reported for Wave 1 as a whole. Relative to claimed savings, there is considerably more variation among natural gas cohorts than among electric cohorts. This is primarily driven by lower absolute energy savings: where the detected savings (point estimates) are small, a small absolute deviation in the detected savings can result in a realization rate that is either much higher or lower than 100%. These low savings point estimates, in combination with small treatment and control group sizes, contributed to four of the seven natural gas cohorts reporting savings that are not statistically significant. As with the Wave 2 electric cohort (the sole electric cohort reporting non-significant savings), the lack of statistical significance does not necessarily imply that the point estimates are not accurate.
a Savings for Wave 1 (eHER), Wave 2, Wave 3, and Wave 5 were not statistically significant for natural gas.
b Unadjusted savings do not account for the channeling analysis.

Page 174: 2018 DSM Portfolio Evaluation Report - NIPSCO

167

In general, industry research suggests that participants in residential behavior change programs save between 1.2% and 2.2% of household electricity usage per year and between 0.3% and 1.6% of household natural gas usage per year; most waves exhibit a one- or two-year ramp-up period, with savings continuing at the ramped-up level for at least the following five years.38 Within that context, the household savings percentage of each wave falls within these expectations (see Figure 54 and Figure 55), except for Wave 2, where savings are declining earlier than expected (Wave 5 is still within the ramp-up period).

When considering the future of each wave and its ability to contribute cost-effective savings to the

program, attention must be given to the size of each wave’s respective treatment and control

populations. Control groups for waves 2, 3, 4, and 5 have all fallen below 10,000 respondents due to

attrition over time. While these waves maintain statistical equivalence to their respective treatment

groups, as attrition continues, the loss of statistical power—the likelihood the study will detect a real

effect if present—may render it impossible to detect significant savings estimates even if treatment and

control groups remain equivalent. As an example, while Wave 2 and Wave 5 achieved nearly the same

household-level electric savings for 2018 (0.99% versus 0.96%), the confidence interval for Wave 2

household-level savings is more than twice as wide (2.30% versus 1.02%), due in part to Wave 2’s

comparatively smaller treatment population (7,618 versus 29,806). Additionally, Wave 2's relatively low baseline consumption in both the electric and natural gas cohorts will continue to impede the ability to observe statistically significant savings.

38 Sussman, R., and M. Chikumbo. 2016. “Behavior Change Programs: Status and Impact.” American Council for

an Energy-Efficient Economy. https://aceee.org/sites/default/files/publications/researchreports/b1601.pdf
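The sample-size effect behind the Wave 2 versus Wave 5 comparison can be sketched numerically. This is an illustration only, assuming the standard error of a difference-in-means estimate scales roughly with 1/sqrt(n) when group variances are similar; the populations and confidence interval widths are the figures quoted above.

```python
import math

# Treatment populations and 90% CI widths (percentage points) quoted in the
# text for Wave 2 and Wave 5. If usage variance is similar across cohorts,
# the CI half-width scales roughly with 1/sqrt(n).
n_wave2, n_wave5 = 7_618, 29_806
width_wave2, width_wave5 = 2.30, 1.02

predicted_ratio = math.sqrt(n_wave5 / n_wave2)  # widening expected from n alone
observed_ratio = width_wave2 / width_wave5      # widening actually reported

print(f"predicted: {predicted_ratio:.2f}x, observed: {observed_ratio:.2f}x")
```

Sample size alone predicts roughly a 2x wider interval for Wave 2; the remaining gap reflects differences in usage variance and savings variability between the two cohorts.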


Figure 54. Household-Level Percentage Savings of Electricity for

Behavioral Program Participants, by Wave and Year

Average household-level electric savings as a percentage of usage are shown for all six Behavioral program

waves from 2012 to 2018. Statistically insignificant savings are depicted by an empty circle. Error bars

corresponding to the 90% confidence interval are shown for each year for each wave.

Figure 55. Household-Level Percentage Savings of Natural Gas for

Behavioral Program Participants, by Wave and Year

Average household-level natural gas savings as a percentage of usage are shown for all six Behavioral

program waves from 2012 to 2018. Statistically insignificant savings are depicted by an empty circle. Error

bars corresponding to the 90% confidence interval are shown for each year for each wave.


Ex Post Gross Savings

Some treatment recipients participate in other NIPSCO program offerings; however, savings cannot simultaneously be claimed by other programs and by the Residential Behavioral program (this is known as double counting). The evaluation team accounted for double-counted savings by first calculating program uplift (the additional savings achieved by treatment customers, or in some cases by control customers, as a result of participation in other programs) and then subtracting these savings from the values determined through regression analysis. The evaluation team used this technique to account for program participation in, and remove savings claimed by, the Appliance Recycling, HEA, HVAC Rebates, and IQW programs. The Lighting program is addressed in the Upstream Lighting Program Uplift section.
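The adjustment itself is simple subtraction. A minimal sketch using the Wave 3 electric figures reported later in this section (chosen purely for illustration):

```python
# Double-counting adjustment: ex post gross savings = verified (billing
# analysis) savings minus savings claimed by other programs via uplift.
# Figures below are the Wave 3 electric values from this report (kWh).
verified_kwh = 5_765_510   # regression-based billing analysis estimate
uplift_kwh = 49_505        # savings also claimed by other programs
ex_ante_kwh = 5_359_391    # claimed (ex ante) savings

ex_post_kwh = verified_kwh - uplift_kwh
realization_rate = ex_post_kwh / ex_ante_kwh

print(ex_post_kwh)                # 5716005
print(f"{realization_rate:.0%}")  # 107%
```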

Program Uplift (Step 2)

Table 119 indicates the percentage difference (uplift) in program participation of treatment recipients

relative to their respective control groups, for five applicable programs. Each wave exhibits uplift in at

least two of the evaluated programs; however, all but waves 2 and 5 exhibit negative lift in at least one

program, indicating that the impact of HERs on participation in other programs may be negligible.

Programs may have increased participation from treatment customers early in a behavioral program’s

life, and may have reduced participation from treatment customers as programs mature because

interested respondents have already pursued those opportunities.

Table 119. Program Uplift Due to Home Energy Reports

Program | Wave 1 | Wave 2 | Wave 3 | Wave 4 | Wave 5 | Wave 6 | Wave 7
Appliance Recycling | -0.07% | 0.02% | 0.85% | 0.59% | 0.21% | 0.00% | 0.26%
HEA | 0.06% | 0.20% | 0.24% | 0.05% | 0.08% | 0.08% | 0.15%
HVAC Rebate | 0.12% | 0.22% | -0.02% | -0.11% | -0.07% | 0.14% | -0.02%
IQW | 0.04% | 0.00% | 0.06% | 0.00% | 0.11% | 0.07% | 0.03%
Multi-Family | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | 0.00% | -0.01%

Double-Counted Savings

Table 120 and Table 121 show the double-counted electric and natural gas savings, respectively, for all 2018 waves that were attributed to other programs as a result of the uplift analysis. The 150,556 kWh and 6,285 therms (shown in Table 122) claimed by other programs in 2018 were also counted in the Behavioral program billing analysis. Note that Table 120 and Table 121 calculate a per-home value for comparison to average per-home savings from the Behavioral program, though only a subset of treatment households participated in energy efficiency programs.


Table 120. Double-Counted Electric Savings

Each cell shows per-home savings (kWh) / total savings (MWh).

Program | Wave 1 (eHER) | Wave 1 (No eHER) | Wave 2 | Wave 3 | Wave 4 | Wave 5 | Wave 7
Appliance Recycling | -0.28 / -7.60 | 2.49 / 35.73 | -0.05 / -0.35 | 1.18 / 39.36 | -0.06 / -1.47 | 0.31 / 9.80 | 0.51 / 11.90
HEA | 0.24 / 6.70 | 1.53 / 21.95 | 0.38 / 2.93 | 0.54 / 18.02 | 0.15 / 3.84 | 0.20 / 6.35 | 0.44 / 10.24
HVAC Rebate | 0.27 / 7.36 | -0.83 / -11.91 | 0.65 / 4.99 | -0.17 / -5.76 | -0.26 / -6.55 | -0.28 / -8.76 | 0.05 / 1.12
IQW | 0.36 / 10.01 | 1.31 / 18.85 | 0.14 / 1.04 | -0.06 / -2.12 | -0.08 / -2.08 | 0.03 / 0.98 | 0.08 / 1.88
Multi-Family | 0.00 / 0.00 | 0.00 / 0.00 | 0.51 / 3.86 | 0.00 / 0.00 | 0.00 / 0.00 | 0.17 / 5.28 | -1.07 / -25.03
Total | 0.60 / 16.47 | 4.50 / 64.62 | 1.63 / 12.47 | 1.49 / 49.50 | -0.25 / -6.26 | 0.43 / 13.65 | 0.01 / 0.11

Note: Negative double-counted savings correspond to higher savings in non-behavioral programs for control respondents than for treatment respondents. Because the billing analysis would detect this difference as a reduction in the treatment effect, the Behavioral program is credited additional savings.

Table 121. Double-Counted Natural Gas Savings

Each cell shows per-home savings (therms) / total savings (therms).

Program | Wave 1 (eHER) | Wave 1 (No eHER) | Wave 2 | Wave 3 | Wave 4 | Wave 5 | Wave 6
Appliance Recycling | 0.00 / 0.00 | 0.00 / 0.00 | 0.00 / 0.00 | 0.00 / 0.00 | 0.00 / 0.00 | 0.00 / 0.00 | 0.00 / 0.00
HEA | 0.06 / 1,575.05 | 0.25 / 3,652.28 | 0.06 / 475.02 | 0.07 / 2,414.55 | -0.01 / -239.54 | 0.03 / 913.70 | 0.03 / 1,552.06
HVAC Rebate | 0.11 / 2,918.30 | -0.73 / -10,483.65 | 0.31 / 2,365.97 | 0.03 / 940.77 | -0.15 / -3,951.13 | -0.09 / -2,868.30 | 0.08 / 3,998.13
IQW | 0.02 / 511.82 | 0.00 / 37.66 | 0.05 / 376.19 | -0.05 / -1,771.81 | 0.00 / 110.12 | 0.03 / 854.62 | 0.05 / 2,301.10
Multi-Family | 0.00 / 0.00 | 0.00 / 0.00 | 0.00 / 0.00 | 0.00 / 0.00 | 0.00 / 0.00 | 0.02 / 602.52 | 0.00 / 0.00
Total | 0.19 / 5,005.17 | -0.48 / -6,793.71 | 0.42 / 3,217.18 | 0.05 / 1,583.51 | -0.16 / -4,080.55 | -0.01 / -497.46 | 0.16 / 7,851.29

Note: Negative double-counted savings correspond to higher savings in non-behavioral programs for control respondents than for treatment respondents. Because the billing analysis would detect this difference as a reduction in the treatment effect, the Behavioral program is credited additional savings.


Table 122. Double-Counted Electric and Natural Gas Savings as a Percentage of Total Wave Savings

Wave | Double-Counted Electric Savings (kWh) | Percentage of Total Electric Savings | Double-Counted Natural Gas Savings (therms) | Percentage of Total Natural Gas Savings
Wave 1 (eHER) | 16,472 | 0.25% | 5,005 | 13.71%
Wave 1 (No eHER) | 64,611 | 0.56% | -6,794 | -1.42%
Wave 2 | 12,458 | 2.80% | 3,217 | 11.20%
Wave 3 | 49,505 | 0.86% | 1,584 | 1.19%
Wave 4 | -6,247 | -0.17% | -4,081 | -3.45%
Wave 5 | 13,647 | 0.55% | -497 | -0.63%
Wave 6 | - | - | 7,851 | 2.60%
Wave 7 | 108 | 0.01% | - | -
Total | 150,556 | 0.47% | 6,285 | 0.53%

Upstream Lighting Program Uplift

As noted above, the double-counted savings analysis does not include NIPSCO’s upstream lighting

program. In upstream lighting programs, utilities work directly with manufacturers, distributors,

retailers, or a combination to offer built-in discounts on energy-efficient products, rather than paying

incentives directly to program participants. Because of this design, these programs do not track detailed

participation data such as respondent names and billing account numbers, which are typically available

for utility rebate programs. Consequently, the evaluation team could not identify HER treatment and

control group respondents who participated in an upstream lighting program. Obtaining the data

necessary to adjust for upstream programs requires expensive primary data collection that relies on

home visits or customer surveys and requires respondents to recall their lighting purchases. Given these

data limitations, the evaluation team did not estimate double-counted savings from upstream programs.

Because adjustments to electric savings due to other programs are small, this omission should not affect

the total claimed savings significantly.

Demand Reduction

The evaluation team conservatively estimated demand reduction by distributing annual electric savings equally across all 8,760 hours of the year.39 As such, the demand reduction estimates are directly proportional to the electric savings estimates calculated above. Table 123 displays the demand reduction estimates for all waves in 2018, at both the individual level and the program level, along with the 90% confidence intervals. The total demand reduction is calculated at 3,702 kW.

39 Demand reduction estimates from AMI data are as high as 2.3 times the 8,760 model estimate, because

electric savings are usually weighted to the summer and likely correspond to changes in peak air conditioner

usage. See also: Stewart, James, and Pete Cleff. November 2013. “Are You Leaving Peak Demand Savings on

the Table? Estimates of Peak-Coincident Demand Savings from PPL Electric’s Residential Behavior-Based

Program.” Oracle Utilities Opower Whitepaper.
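The 8,760-hour method can be sketched as follows, using verified electric savings figures from this section. This is an illustration of the calculation only, not the evaluation team's code:

```python
# Spreading annual kWh savings evenly over every hour of the year gives an
# average demand reduction of annual_kWh / 8,760 (in kW). Verified savings
# below are the figures reported for three of the 2018 waves.
HOURS_PER_YEAR = 8_760

verified_kwh = {
    "Wave 1 (eHER)": 6_588_659,
    "Wave 1 (No eHER)": 11_546_503,
    "Wave 3": 5_765_510,
}

demand_kw = {wave: kwh / HOURS_PER_YEAR for wave, kwh in verified_kwh.items()}
for wave, kw in demand_kw.items():
    print(f"{wave}: {kw:,.0f} kW")  # 752, 1,318, and 658 kW respectively
```

These match the wave-level totals shown in Table 123.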


Table 123. Demand Reduction Estimates for All Waves

Wave | Per Home (kW) | Total (kW) | 90% CI Lower Bound (kW) | 90% CI Upper Bound (kW)
Wave 1 (eHER) | 0.028 | 752 | 576 | 929
Wave 1 (No eHER) | 0.020 | 1,318 | 967 | 1,669
Wave 2a | 0.007 | 51 | -8 | 110
Wave 3 | 0.021 | 658 | 416 | 900
Wave 4 | 0.017 | 425 | 233 | 618
Wave 5 | 0.010 | 318 | 132 | 433
Wave 7 | 0.016 | 180 | 112 | 248
Total | - | 3,702 | 52 | 7,353

a Savings for Wave 2 were not statistically significant.

Savings Summary

Combining the billing analysis results with the analysis of double-counted savings from program uplift leads to the final ex post gross savings (shown in Table 124 and Table 125 for electric and natural gas, respectively). The Behavioral program achieved 101% of its ex ante electric claimed savings in 2018, with a ±5.5% bound on the 90% confidence interval for total savings, and achieved 105% of its ex ante natural gas savings in 2018, with a ±11.5% bound on the 90% confidence interval for total savings.

Table 124. 2018 Behavioral Program Electric Savings Summary

Annual net electricity savings (kWh/year). The ex ante value and realization rate for Wave 1 are reported for the eHER and no-eHER cohorts combined.

Wave | Ex Ante/Audited | Verified | Program Uplift | Ex Post Gross/Net | Realization Rate
Wave 1 (eHER) | 18,869,997 | 6,588,659 | 16,472 | 6,572,186 | 96%
Wave 1 (No eHER) | (combined) | 11,546,503 | 64,611 | 11,481,892 | (combined)
Wave 2 | 326,841 | 444,741 | 12,458 | 432,282 | 132%
Wave 3 | 5,359,391 | 5,765,510 | 49,505 | 5,716,005 | 107%
Wave 4 | 3,193,664 | 3,726,345 | -6,246 | 3,732,592 | 117%
Wave 5 | 2,186,296 | 2,476,661 | 13,647 | 2,463,014 | 113%
Wave 7 | 1,567,915 | 1,578,394 | 108 | 1,578,286 | 101%
Total | 31,504,104 | 32,126,813 | 150,556 | 31,976,257 | 101%


Table 125. 2018 Behavioral Program Natural Gas Savings Summary

Annual net natural gas savings (therms/year). The ex ante value and realization rate for Wave 1 are reported for the eHER and no-eHER cohorts combined.

Wave | Ex Ante/Audited | Verified | Program Uplift | Ex Post Gross/Net | Realization Rate
Wave 1 (eHER) | 469,960 | 36,511 | 5,005 | 31,506 | 110%
Wave 1 (No eHER) | (combined) | 479,823 | -6,794 | 486,617 | (combined)
Wave 2 | 40,467 | 28,725 | 3,217 | 25,508 | 63%
Wave 3 | 157,683 | 132,668 | 1,584 | 131,085 | 83%
Wave 4 | 121,047 | 118,276 | -4,081 | 122,357 | 101%
Wave 5 | 97,599 | 78,822 | -497 | 79,319 | 81%
Wave 6 | 223,828 | 301,421 | 7,851 | 293,570 | 131%
Total | 1,110,584 | 1,176,246 | 6,285 | 1,169,960 | 105%

Savings Analysis Methodology

2018 Wave Design

In 2018, NIPSCO added a new group of electric-only HER recipients to the program, Wave 7. The evaluation team helped assign respondents to the treatment and control groups to ensure an RCT design.

The implementer provided a list of eligible treatment and control group respondents. To ensure a high degree of equivalency, the evaluation team performed the randomization iteratively, checking for similarity between treatment and control groups at each iteration. The evaluation team repeated this process until it arrived at five randomizations that passed a stringent equivalency check: every month of the pre-period had to have a p-value for the t-test of the difference in mean electric usage of no less than 0.25. The evaluation team provided these five randomizations to the implementer, who selected one for deployment.
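The iterative randomization can be sketched as follows. This is a simplified illustration (hypothetical data structures, and a large-sample normal approximation in place of the exact t-test), not the evaluation team's implementation:

```python
import math
import random

def welch_p(x, y):
    """Two-sided p-value for a difference in means, using the Welch t
    statistic with a normal approximation (adequate for large samples)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    t = (mx - my) / math.sqrt(vx / nx + vy / ny)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))

def randomize_until_equivalent(monthly_usage, n_treatment, threshold=0.25, seed=1):
    """Redraw random treatment/control splits until every pre-period month's
    p-value exceeds the threshold. monthly_usage maps a customer ID to a
    list of 12 pre-period monthly usage values (hypothetical structure)."""
    rng = random.Random(seed)
    ids = sorted(monthly_usage)
    while True:
        rng.shuffle(ids)
        treat, ctrl = ids[:n_treatment], ids[n_treatment:]
        pvals = [
            welch_p([monthly_usage[c][m] for c in treat],
                    [monthly_usage[c][m] for c in ctrl])
            for m in range(12)
        ]
        if min(pvals) > threshold:  # all 12 months must look equivalent
            return treat, ctrl, pvals
```

With identically distributed groups, any single draw passes all 12 monthly checks only about 0.75^12 ≈ 3% of the time, which is why the randomization must be repeated until a passing split is found.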

2018 Evaluation Methods

The evaluation team applied several analysis steps for the Behavioral program evaluation:

• Data cleaning: The evaluation team identified respondent data to exclude from the analysis.

Reasons for exclusion include an insufficient number of pre-period or program period months or

insufficient billing days within a given month to determine a monthly average.

• Equivalency check: The evaluation team verified that the distribution of average monthly

energy usage prior to receiving the HERs was sufficiently similar between the treatment and

control groups, consistent with the random assignment of customers to treatment and control

groups.

• Regression analysis: The evaluation team verified program impacts using two alternative statistical models: a post-program regression (PPR) analysis with lagged participant controls and a linear fixed effects regression (LFER) analysis. Both models control for individual respondent differences: the PPR achieves this by including lagged controls for each participant as explanatory variables, while the LFER removes each participant's average energy consumption before modeling. The evaluation team applied both models to monthly energy usage data obtained from respondent bill records. The results of the PPR model are reported as the official impact estimates, with the LFER model serving as a check on those results. More details are provided in Appendix A. Regression Analysis.

• Uplift analysis: The evaluation team estimated the uplift in other energy efficiency programs

due to actions suggested by HERs through a post-only differences approach applied to tracking

data from other programs. Post-only differences are a direct comparison of program uptake in

the post-period as a percentage of respondents from treatment and control groups. More

details are provided in Appendix B. Uplift Analysis.

• Demand reduction: Monthly billing data do not have sufficient granularity to estimate demand impacts, which requires hourly or shorter-interval meter data. The evaluation team instead estimated demand reduction by assuming savings are distributed equally across all hours of the year.
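The post-only differences approach in the uplift step reduces to a direct comparison of participation rates between the treatment and control groups. A minimal sketch, with hypothetical counts (the function name and figures are illustrative, not program data):

```python
def uplift(treated_participants, n_treatment, control_participants, n_control):
    """Post-only differences: percentage-point difference in program
    participation rates between treatment and control groups."""
    return treated_participants / n_treatment - control_participants / n_control

# Hypothetical example: 260 of 24,300 treatment homes vs. 50 of 6,171
# control homes participated in some rebate program during the post-period.
lift = uplift(260, 24_300, 50, 6_171)
print(f"uplift: {lift:.2%}")  # uplift: 0.26%
```

A positive value means the HERs channeled additional customers into the other program; a negative value means control homes participated at a higher rate.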

Data Cleaning

As shown in Table 126 and Table 127 for electric and natural gas customers, respectively, the evaluation team cleaned the billing data to ensure that the data used in the billing analysis contained sufficient pre-period (11) and post-period (two) months and sufficient billing days within each month. Customers with insufficient post-period data had either moved or disconnected service after their respective waves' inception but before this evaluation period began; as a result, some of the earlier deployment waves have considerably higher numbers of customers removed. Treatment and control customers have shown near-identical rates of attrition: the difference between the percentages of treatment and control customers removed from any one wave does not exceed 0.5%.
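The data-sufficiency screen can be sketched as follows. The 20-day billing-day cutoff and the record structure are assumptions for illustration; the report does not specify them:

```python
# A customer is kept only with at least 11 usable pre-period months and at
# least 2 usable post-period months, where "usable" means the month has
# enough billing days for a reliable monthly average.
MIN_PRE_MONTHS = 11
MIN_POST_MONTHS = 2
MIN_BILLING_DAYS = 20  # assumed cutoff, not stated in the report

def passes_screen(bills):
    """bills: list of (period, billing_days) tuples, period in {'pre', 'post'}."""
    valid = [(p, d) for p, d in bills if d >= MIN_BILLING_DAYS]
    pre = sum(1 for p, _ in valid if p == "pre")
    post = sum(1 for p, _ in valid if p == "post")
    return pre >= MIN_PRE_MONTHS and post >= MIN_POST_MONTHS

customers = {
    "A": [("pre", 30)] * 12 + [("post", 30)] * 12,  # full history: kept
    "B": [("pre", 30)] * 12 + [("post", 30)] * 1,   # moved out early: dropped
    "C": [("pre", 5)] * 12 + [("post", 30)] * 12,   # too few billing days: dropped
}
kept = [cid for cid, bills in customers.items() if passes_screen(bills)]
print(kept)  # ['A']
```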


Table 126. Participants Filtered Out by Data Sufficiency Checks for Electric Customers

Each cell shows treatment / control counts.

 | Wave 1 | Wave 2 | Wave 3 | Wave 4 | Wave 5 | Wave 7
Original randomly assigned homes | 131,866 / 45,031 | 12,358 / 12,362 | 44,962 / 10,224 | 34,525 / 8,785 | 35,790 / 11,692 | 24,718 / 11,855
Insufficient post-period data | 38,075 / 13,855 | 4,802 / 4,772 | 12,007 / 2,773 | 9,205 / 2,342 | 5,243 / 1,704 | 2,127 / 1,035
Insufficient pre-period data | 1,524 / 655 | 201 / 198 | 679 / 141 | 1,017 / 272 | 740 / 247 | 141 / 67
Total Filtered | 39,601 / 14,510 | 5,003 / 4,970 | 12,691 / 2,914 | 10,225 / 2,614 | 5,984 / 1,951 | 2,269 / 1,103
Final Estimation Sample | 92,265 / 30,521 | 7,355 / 7,392 | 32,271 / 7,310 | 24,300 / 6,171 | 29,806 / 9,741 | 22,449 / 10,752

Table 127. Participants Filtered Out by Data Sufficiency Checks for Natural Gas Customers

Each cell shows treatment / control counts.

 | Wave 1 | Wave 2 | Wave 3 | Wave 4 | Wave 5 | Wave 6
Original randomly assigned homes | 130,335 / 44,465 | 12,123 / 12,118 | 44,520 / 10,111 | 34,012 / 8,671 | 35,583 / 11,603 | 49,336 / 11,844
Insufficient post-period data | 36,828 / 13,365 | 4,591 / 4,563 | 11,590 / 2,655 | 8,820 / 2,256 | 5,067 / 1,633 | 2,223 / 541
Insufficient pre-period data | 1,524 / 660 | 606 / 614 | 835 / 178 | 1,248 / 329 | 929 / 313 | 795 / 178
Total Filtered | 38,352 / 14,025 | 5,197 / 5,177 | 12,426 / 2,833 | 10,069 / 2,585 | 5,997 / 1,946 | 3,019 / 719
Final Estimation Sample | 91,983 / 30,440 | 6,926 / 6,941 | 32,094 / 7,278 | 23,943 / 6,086 | 29,586 / 9,657 | 46,317 / 11,125


Equivalency Check

Because the treatment and control groups are randomly assigned, pre-treatment energy use should theoretically be equivalent between the groups. The evaluation team performed an equivalency check of the energy usage patterns of the treatment and control groups of each wave in the year preceding the rollout to confirm that the data in each case were consistent with an RCT evaluation approach.

The evaluation team employed two methods to assess the equivalency of treatment and control energy usage:

• Visual inspection of overlaid plots of monthly mean energy use for treatment and control groups (an example is shown in Figure 56).

• T-tests of the differences in mean energy use between treatment and control groups in each month. A significant difference (p<0.05) indicates that pre-period usage is dissimilar between groups.40

Figure 56. Equivalency Check for 2018 Wave 7

This figure represents the equivalency check for the 2018 electric usage of Wave 7, with p-values reported above the data points. As the graph shows, average daily consumption is highly similar between the treatment and control groups. All analyzed groups passed equivalency checks.

Process Evaluation

The evaluation team interviewed program and implementation staff to review the HER design and the program web portal, and surveyed program participants.

40 A t-test is a statistical test of the difference between the mean values of observed characteristics between two

populations. In this case, it is a test of the difference in average electricity usage in each month between

treatment and control group respondents.



Program Design and Delivery

Lockheed Martin Energy serves as the program implementer and oversees Oracle Utilities Opower as

the producer and distributor of the HERs.

The HERs have three major components:

• The Similar Homes Comparison shows the household’s current usage in relation to their peers’

usage along with a summary statement of whether their usage is higher, lower, or about the

same as average energy users.

• The Weather Adjusted Self-Comparison shows the household’s monthly energy usage for the

current year and the previous year.

• The semi-customized Energy Saving Tips include low-cost and no-cost suggestions about how to

save energy.

In 2018, 453 treatment group customers (or 0.19%) voluntarily opted out of receiving the HERs. A total

of 1,454 treatment group customers (or 0.61%) have voluntarily opted out of receiving the HERs since

the program inception.

2017 Recommendation Status

The evaluation team followed up with NIPSCO regarding the status of the 2017 evaluation recommendations. Table 128 lists the 2017 Behavioral program evaluation recommendations and NIPSCO's progress toward addressing those recommendations to date.

Table 128. Status of 2017 Behavioral Program Evaluation Recommendations

Recommendation: Due to delays in contracting, waves 5 and 6 launched late in 2017 and did not provide enough billing data for the 2017 evaluation to detect statistically significant savings from these new waves. Conduct a billing analysis at the end of 2018 to provide statistically significant impact results.

Status: Completed in 2018. Waves 5 and 6 were not delayed because of contracting; these were one additional natural gas–only wave and one additional dual-fuel wave to increase savings.

Recommendation: After several years, the wave 2 and 3 control and treatment group sizes are beginning to shrink due to natural customer attrition out of NIPSCO territory or to new residences. NIPSCO and program implementers should monitor the significance of energy savings for these waves in future impact evaluations. Depending on NIPSCO's goals, this problem can be addressed in a variety of ways:

• To solely achieve evaluable savings: To achieve significant savings in future years, NIPSCO could terminate HERs for customers in the dwindling waves and start new waves. However, customers who like the reports and no longer receive them may be dissatisfied.

• To maintain experimental integrity and continuity: Several approaches could be used to minimize the discontinuity in treatment while continuing to send reports to wave 2 and 3 customers. For instance, the treatment group could be incorporated into the eligible participants in a new wave, then be randomly assigned to either treatment (high likelihood) or control (lower likelihood), with prior HER participation randomized and checked for equivalency. This would lead to slightly more muted early energy savings than could be expected from a typical new HER wave, as control group customers would persist in energy-saving behaviors for a few years, but long-run yearly savings would be the same. Ideally, such a change should be developed in consultation with the evaluation team.

Status: In Progress. These are known risks that Lockheed Martin Energy is addressing with the Behavioral program subcontractor.

Participant Feedback

The evaluation team contacted 2,700 treatment group customers who received HERs. The evaluation team stratified the sample by wave proportionate to the population, and 144 treatment customers completed the survey. Comparisons to previous program years are not included in the following survey results because 2018 was the first time the evaluation team conducted participant surveys for the Behavioral program.

Report Readership

Over 90% of respondents who completed the survey (n=144) recalled receiving the HERs. Of those who

recalled receiving the reports, 65% reported that they read all of the reports, while 30% reported

reading some and the remaining 5% reported that they did not read any of the reports. Respondents

varied on how thoroughly they read the most recent report. About one-third said they read the most

recent report thoroughly (31%), 34% read some of it, and 30% only glanced at the pictures and charts.

Figure 57 shows respondents' overall readership of the HERs. Altogether, 28% of respondents reported that they read all the reports and read the most recent report thoroughly, while 20% read all reports and read the most recent report partially. Another 18% of respondents read all the reports but only glanced at the images in the most recent report. About 13% of respondents read some reports and read the most recent report partially, while another 10% read some reports and simply skimmed the most recent one. About 2% of respondents read the most recent report thoroughly but typically read only some of the reports. The remaining 9% did not read the reports or could not say how closely they read them (reflected as "don't know").


Figure 57. Report Readership

Source: Participant Survey. Questions: “Thinking about all the Home Energy Reports that you have received,

about how many of them have you read or at least glanced through?” and “Thinking about the most recent

report that you received, which of the following most closely describes what you did with it?”

Report Engagement

The evaluation team asked respondents what they did with the most recent report. Figure 58 shows that respondents most frequently discussed the report with members of their household (32%) or with others outside their household (14%), or implemented tips from the report (13%). Respondents who discarded the reports (33%) are not represented in Figure 58.

Figure 58. Report Engagement

Source: Participant Survey. Question: “Thinking about the most recent report that you received, did you do

any of the following? Select all that apply.”

Usefulness of Report Components

Respondents rated the usefulness of the three main components of the HERs: Similar Homes

Comparison, Weather Adjusted Self-Comparison, and Energy Saving Tips. Figure 59 demonstrates that of

the three sections, respondents found the Weather Adjusted Self-Comparison to be the most useful

(78%), followed by the Energy Saving Tips (74%) and the Similar Homes Comparison (64%).


Figure 59. Usefulness of Home Energy Report Components

Source: Participant Survey. Questions: “How would you rate the usefulness of this personal comparison?,”

“How would you rate the usefulness of the energy saving tips provided?,” and “How would you rate the

usefulness of the comparison to similar homes?”

Similar Homes Comparison. Ninety-five respondents provided open-ended comments on the Similar Homes Comparison: 62 who rated the section positively and 33 who rated it negatively. Negative comments mostly raised concerns about the accuracy of the Similar Homes Comparison (31 mentions). Many respondents felt that the comparisons did not accurately reflect their usage against comparable neighbors (16 mentions). For example, one respondent remarked: "If you were actually using similar homes. There are condos in our neighborhood, and I feel we are compared to them (MUCH smaller in square footage and footprint) - so my report always shows my usage as 'high' - however I feel I do my part to conserve. It's not reflected that." This respondent's point is that the usage at their address would not equal that at dissimilar residential properties.

Other respondents did not have a clear understanding of who the comparable neighbors were (16 mentions).41 As one respondent said, "It would be useful to know what information NIPSCO uses to decide what a 'similar house' is." Several respondents expressed skepticism about the data in the reports (15 mentions); some believed they were being compared to their immediate street neighbors. For example, one respondent did not believe the comparisons: "I don't believe they use real data. One time I was way high up in the ranking and then the next time I was way down. I talked to our neighbors about it and they all said it is just a marketing flyer and the data isn't technically accurate."

Weather Adjusted Self-Comparison. Eighty-three respondents provided open-ended comments on the Weather Adjusted Self-Comparison: 65 who rated the section positively and 18 who rated it negatively. Respondents most often (six mentions) expressed that they had not realized that the self-comparison charts were weather adjusted and that this information is not immediately apparent, although they do expect temperature to affect their energy consumption patterns. Five respondents wanted to see a temperature trendline on the charts. Other respondents (three mentions) misunderstood the Weather Adjusted Self-Comparison as an extension of the Similar Homes Comparison and attributed their higher overall energy consumption to household occupancy patterns that differed from their neighbors': these customers explained that their neighbors were not home as often as they were.

41 The HERs include an explanation of the Similar Homes Comparison, describing similar homes as "based on 100 similar homes within approx. 1 mi.," "You're compared with 100 homes within 1 mi of you that are a similar size (1,961 sq. ft.) and have gas heat," or "This is based on 95 similar homes," and explaining that "Efficient neighbors are the 20% who use the least amount of electricity." The Frequently Asked Questions section on the back of the report provides further details on the Similar Homes Comparison.

Energy Saving Tips. Seventy-four respondents provided open-end comments on the Energy Saving Tips,

58 who gave positive ratings and 16 who gave negative ratings. Respondents frequently wanted tips that

were more customized and tailored to their unique living situation (five mentions). For example, one

respondent said their home configuration may hinder their home efficiency and expressed a desire for

tips about how to mitigate this issue, “I think everyone has to assess their own living situation. For

instance, a piece of furniture might be in front of a vent, because I have a really small living room. Not

because I wanted to partially block a vent, but [due to] size of room, furniture, etcetera. So, while some

of them [the tips] are no-brainers, that still doesn’t account for individual circumstances.”

Other respondents felt the tips were redundant or were ideas they had heard before (eight mentions)

and said they wanted more unique, low-cost tips (four mentions). One respondent would appreciate

greater detail in the Energy Saving Tips, “Give feedback on how to potentially utilize the tip. [Give]

examples of good fans and heaters, how to utilize them to maximize potential.”

Energy Saving Actions

Most respondents reported taking action to save energy in their home within the past year (72%).

Figure 60 shows that the majority of respondents (61% to 88% across all waves) reported taking an energy saving action. Overall, the oldest waves (1 and 2) and newest waves (6 and 7) tended to show higher rates of taking action than the middle waves (3, 4, and 5). However, the survey sample sizes at the wave level were small, so wave-level comparisons should be viewed with caution.


Figure 60. Taking Energy Saving Actions in Home by Wave

Source: Participant Survey. Question: “In the past year, has your household taken any actions to save energy?”

As shown in Figure 61, the most frequently taken actions reported by respondents were turning the

lights off more often (85%), setting back the thermostat when away or at night (84%), installing LED

lightbulbs (76%), and unplugging devices when not in use (59%). Respondents tended to take behavioral

actions rather than measure installation actions as three of the top four actions were behavioral.

When asked how important the HERs were in their decision to take the energy saving actions, of 97

respondents, 70% said the HERs were important. Specifically, 24% said the HERs were very important

and 46% said somewhat important. Only 8% said the HERs were not important.


Figure 61. Types of Energy Saving Actions Taken in Home

Source: Participant Survey. Question: “What energy saving actions has your household taken in the past

year? Select all that apply.”

Energy Efficiency Program Awareness

Survey respondents were asked to name any NIPSCO energy efficiency programs they were aware of. Without

providing a list of program names, 7% of respondents were able to name a program (n=136). Of those

who named a program, respondents frequently mentioned six programs:

• Appliance Recycling (four mentions)

• HVAC Rebate (three mentions)

• WIFI Programmable Thermostat Rebate (three mentions)

• Home Energy Analysis (three mentions)

• Income Qualified Weatherization (two mentions)

• Budget or flat billing plans (three mentions)

Web Portal Engagement

Treatment group customers had access to the program-affiliated web portal. Only 27 respondents (21%)

had ever visited the web portal; of those, 4% visited weekly, 26% visited monthly, 33% visited several

times a year, and 22% visited one time a year, while 15% could not recall how often they had visited the

web portal.

The evaluation team obtained the web tracking data for portal logins during 2018. As shown in

Figure 62, very few customers logged into the portal. On average, 71 customers logged into the portal


per month during 2018; the 855 logins recorded for the year represent 0.19% of the 455,146 treatment group customers. The web portal

requires users to create a separate online account, rather than employing a single sign-on account using

NIPSCO account credentials. Some survey respondents indicated frustration with the login process,

stating, “it is useful, but it takes time to go there and log in.” The separate account requirement likely explains part of the very low portal login rate.

Figure 62. 2018 Unique Monthly Web Portal Logins

Source: Web Portal Analytics provided by Oracle Utilities Opower.

Respondents (n=27) frequently visited the portal to understand their bill amounts (35%), to learn about

alternative billing options (23%), or for some other unique reason (31%). The high number of comments

referencing billing suggests that some respondents may have confused the portal with the general

NIPSCO web account page. Of 38 comments related to the portal, eight referred to bill pay features

(21%).

Satisfaction with Web Portal

Overall, 86% of respondents found the web portal useful (n=27). Specifically, 30% said the portal was

very useful and 56% said it was somewhat useful. Only 8% reported that the web portal was not useful.

However, these ratings were based on a small number of respondents who were aware of the portal.

Respondents who visited the portal were satisfied because it is convenient for them to access and use.

Respondents also liked the level of detail available on the portal and the additional energy saving tips.

Respondent suggestions and recommendations for improving the portal included changing the interface

design to make it more visually appealing.

Satisfaction with Home Energy Reports

As shown in Figure 63, most respondents (71%) were satisfied with the HERs. Specifically, 35% said they

were very satisfied and 36% said they were somewhat satisfied. Using satisfaction with the HERs as the

program satisfaction metric, the Behavioral program had the lowest satisfaction among the evaluated

NIPSCO residential programs in 2018. The evaluation team has observed that HER-style programs typically receive some of the lowest satisfaction scores because of their opt-out participation design and because they do not offer the incentives that traditional rebate programs do. Despite having the lowest satisfaction, NIPSCO’s Behavioral program still achieved a majority of satisfied respondents.


Figure 63. Satisfaction with Home Energy Reports

Source: Participant Survey. Question: “How satisfied are you with the home energy reports?”

Satisfaction with NIPSCO

As shown in Figure 64, most respondents (74%) were satisfied with NIPSCO. Specifically, 41% said they

were very satisfied and 33% said they were somewhat satisfied.

Figure 64. Overall Satisfaction with NIPSCO

Source: Participant Survey. Question: “How satisfied are you with NIPSCO overall as your utility service provider?”


Participant Survey Demographics

As part of the participant survey, the evaluation team collected responses on the demographics shown

in Table 129. Most respondents live in a single-family detached home (82%) and own their home (83%).

Although the program targets single-family homeowners to receive the HERs, 14% of respondents were

renters.

Table 129. Behavioral Program Respondent Demographics

Demographics Percentage (n=136)

Home Ownership Status

Rent 14%

Own 83%

Other/Prefer not to say 3%

Type of Residence

Single-family detached 82%

Multi-family apartment or condo (4 or more units) 9%

Attached house (townhouse, rowhouse) 4%

Manufactured home 3%

Other/Prefer not to say 2%

Number of People in Home

One 20%

Two 38%

Three 22%

Four 11%

Five or more 7%

Prefer not to say 2%

Annual Household Income

Under $25,000 9%

$25,000 to under $35,000 10%

$35,000 to under $50,000 13%

$50,000 to under $75,000 16%

$75,000 to under $100,000 10%

$100,000 or more 15%

Prefer not to say 28%

Conclusions and Recommendations

Conclusion 1: The program exceeded its electric and natural gas savings goals, achieving 130% of its

24,516 MWh electric goal (saving 31,976 MWh) and 156% of its 751,885 therm goal (saving

1,169,960 therms).

Industry research suggests that most waves will exhibit a one- or two-year ramp-up period, with savings

continuing at the ramped-up level for at least the following five years (Sussman and Chikumbo 2016).

For the most part, the NIPSCO Behavioral program has followed that pattern. Two waves—Wave 2

(electric and natural gas) and Wave 3 (natural gas)—show declining savings at the household level,

though this is not affecting the attainment of overall program goals.


Conclusion 2: As predicted in 2017, dwindling control group sizes have rendered savings estimates

insignificant for the Wave 2 natural gas and electric cohorts and for the Wave 3 natural gas cohort.

The control groups for waves 2, 3, 4, and 5 have all fallen below 10,000 customers due to attrition over

time. While these waves maintain statistical equivalence to their respective treatment groups, as

attrition continues, the loss of statistical power (that is, the likelihood the study will detect a real effect

if present) may render it impossible to detect significant savings estimates even if treatment and control

groups remain equivalent. Additionally, the relatively low baseline consumption numbers for Wave 2

(electric and natural gas) will continually impede the ability to observe statistically significant savings.
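The loss of statistical power from control group attrition can be illustrated with a normal-approximation minimum detectable effect (MDE) calculation. This is a sketch only: the group sizes and the standard deviation of consumption below are hypothetical, not drawn from the evaluation.

```python
from statistics import NormalDist

def minimum_detectable_effect(n_treat, n_control, sd, alpha=0.05, power=0.8):
    """Normal-approximation minimum detectable difference in group means
    for a two-sided test at the given significance level and power:
    MDE = (z_{1-alpha/2} + z_{power}) * sd * sqrt(1/n1 + 1/n2)."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # critical value for the two-sided test
    z_power = z(power)           # z-score for the target power
    return (z_alpha + z_power) * sd * (1 / n_treat + 1 / n_control) ** 0.5

# Hypothetical inputs: 60,000 treated homes, a control group shrunk to
# 8,000 by attrition, and a 2.5 therm/month standard deviation in usage.
mde_small_control = minimum_detectable_effect(60_000, 8_000, 2.5)
mde_large_control = minimum_detectable_effect(60_000, 30_000, 2.5)

# Attrition raises the smallest savings effect the design can reliably detect.
assert mde_small_control > mde_large_control
```

The same formula shows why low baseline consumption compounds the problem: a fixed percentage saving translates into a smaller absolute effect, which is more likely to fall below the MDE.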

Recommendations:

• To prioritize evaluable savings, terminate the HERs for customers in waves whose group sizes lead to results that are consistently not statistically significant, then start new waves. However, customers who like the reports may be dissatisfied when they no longer receive them.

• Several approaches could minimize the discontinuity in treatment while continuing to send reports to Wave 2 and Wave 3 customers. For instance, the existing treatment group could be incorporated into the eligible population for a new wave and randomly assigned to either treatment (high likelihood) or control (lower likelihood). Prior HER participation would then be randomized across groups and checked for equivalency. Early energy savings would be slightly more muted than expected from a typical new HER wave, because former treatment customers assigned to the control group would persist in energy saving behaviors for a few years, but long-run yearly savings would be the same. Ideally, such a change should be developed in consultation with the evaluation team.
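The re-randomization approach in the second recommendation can be sketched as follows; the pool size, prior-participation share, and 85% treatment probability are all hypothetical values chosen for illustration.

```python
import random

random.seed(42)

# Hypothetical pool: prior Wave 2/3 treatment customers mixed with
# newly eligible customers (field names are illustrative).
pool = [{"id": i, "prior_her": i < 4_000} for i in range(10_000)]

# Assign to treatment with high probability so most existing report
# recipients keep receiving reports, as the recommendation suggests.
for customer in pool:
    customer["group"] = "treatment" if random.random() < 0.85 else "control"

def prior_share(group):
    """Share of a group's members who previously received HERs."""
    members = [c for c in pool if c["group"] == group]
    return sum(c["prior_her"] for c in members) / len(members)

# Randomization should balance prior participation across groups
# (loose tolerance for a single random draw); a real design would
# also test equivalence on pre-period consumption.
assert abs(prior_share("treatment") - prior_share("control")) < 0.1
```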

Conclusion 3: The Behavioral program achieved high reader engagement in 2018.

Ninety-five percent of survey respondents reported reading some or all of the reports, with one-third of

those who read the report saying they discussed the report with members of their household. Most

respondents who recalled the reports stated that within the past year their household took action to

save energy in their home (72%) and 13% implemented tips. About 70% of respondents who reported

taking an action also said the HERs were important in their decision to take action to save energy in their

home.

Conclusion 4: Program awareness and participation uplift in other programs as a result of HERs was

limited.

A few respondents, 7%, were able to name specific energy saving programs that NIPSCO offers (n=136).

Respondents successfully recalled six programs, while most respondents could not name any NIPSCO

programs. Measured participation uplift was less than 1% in all programs.

Conclusion 5: Respondents rate the energy savings tips favorably: 78% rated them very or somewhat

useful. Sixteen survey respondents expressed interest in having more detailed and personalized tips.

Respondents expressed a desire for more detailed tips. The web portal already offers the more detailed tips and customization that these customers want, but few customers access these resources.


Recommendations:

• HERs are a logical place to educate customers about energy efficiency program opportunities,

but customers may need to see the messaging multiple times or at the right time (such as when

they are thinking about new equipment) for it to be effective. Consider testing different

approaches to messaging such as: 1) different frequency and timing of cross-program

marketing; and 2) dedicating more space in the printed report to more detailed tips, such as

those available online, which may mean less space for messaging about other programs.

• To help the tips stay useful to participants and address the concerns of some participants,

regularly review the Energy Saving Tips to ensure they remain relevant and current with changing

technologies and customer preferences (such as the use of smart devices).

Conclusion 6: Customers raised issues about the accuracy of the Similar Homes Comparison and want

to see improvements made to this section of the report.

Suggestions for improvements included more accurate peer comparisons, such as by considering home

occupancy in terms of the number of occupants as well as the amount of time the house is occupied.

Some respondents felt their usage is being compared to holiday homes. Respondents also expressed

being skeptical of the peer comparisons because the criteria used for determining similar homes are

unclear. Many respondents believe their immediate neighbors are used in the comparison.

Recommendation:

• Communicate to customers clearly that going to the web portal and updating their home

information details will help improve the accuracy of their Similar Homes Comparison. Part of

the clear communications could be using the term “accurate” or “accuracy” in the eHER and

embedding a hyperlink in the term that redirects customers to the web portal.

Conclusion 7: Very few customers visited the program-affiliated web portal, likely due to its separate

login requirement.

In 2018, there were 71 average logins to the web portal per month, indicating low levels of engagement.

There were 855 visits to the web portal during 2018, representing 0.19% of treatment group customers.

Only 21% of survey respondents reported visiting the web portal. Some may have mistaken the online

NIPSCO account page for the program-affiliated web portal, evidenced by the number of respondents

who said they had used the web portal to pay bills (which is not a current feature of the web portal).

Without single sign-on, the current web portal requirements may be cumbersome for customers: the web portal is separate from the NIPSCO page, and customers must take extra steps to create login credentials to access the information.

Recommendation:

• Work with the program implementer and Oracle Utilities Opower to improve the ease of access

to the web portal, either by enabling single sign-on access to the web portal or by embedding

tracking cookies in eHERs that customers can use to automatically sign in to the web portal.


Residential Income Qualified Weatherization Program

The IQW program provides no-cost, in-home energy assessments to income-qualified residential

customers. The program consists of three components:

• Assess. Program participants receive a home assessment, where an energy assessor first

analyzes the efficiency of heating and cooling systems and insulation levels in the home.

• Install. Depending on opportunities in the home, the assessor then installs energy-saving

lighting and water conservation measures, as well as duct sealing and air sealing, to qualifying

homes during the assessment. Homes with refrigerators more than 10 years old are also eligible

to receive a new, ENERGY STAR®–rated refrigerator, while those with attic insulation levels

lower than R-11 may qualify for attic insulation. Both items are installed after the initial

assessment.

• Report. The assessor provides a report of findings and energy-saving recommendations.

Program Performance

Table 130 summarizes evaluated savings, including gross savings goals. While conducting 1,131

assessments during 2018, the program fell short of achieving its energy goals for the year (in terms of ex

post gross savings), reaching 46% of the electric energy (kilowatt-hour) goal, 95% of its peak demand

goal and 73% of the natural gas (therm) goal. The program implementer reported that despite a slow

start in 2018, the program made good progress compared to previous years.

Table 130. 2018 IQW Program Savings Summary

Metric | Goal | Ex Antea | Audited | Verified | Ex Post Gross | Ex Post Net | Gross Goal Achievement

Electric Energy Savings (kWh/yr) | 1,845,908 | 1,304,976 | 1,303,717 | 1,248,092 | 843,410 | 843,410 | 46%

Peak Demand Reduction (kW) | 285 | 152 | 152 | 137 | 270 | 270 | 95%

Natural Gas Energy Savings (therms/yr) | 276,792 | 217,691 | 217,646 | 191,283 | 201,550 | 201,550 | 73%

The audited savings closely aligned with the claimed ex ante savings. With a few minor exceptions

documented further in this section, the evaluation team did not identify any issues through the tracking

system analysis.

Verified savings were somewhat lower than claimed values due to ISRs of select measures. The

engineering analysis completed for the ex post gross analysis served to increase the peak demand and

natural gas energy savings values.

Table 131 outlines the realization rates and net energy adjustment factors resulting from the evaluation.


Table 131. 2018 IQW Program Adjustment Factors

Metric | Realization Rate (%)a | Freeridership | Spillover | NTG (%)b

Electric Energy Savings (kWh/yr) | 65% | 0% | 0% | 100%

Peak Demand Reduction (kW) | 177% | 0% | 0% | 100%

Natural Gas Energy Savings (therms/yr) | 93% | 0% | 0% | 100%

a Realization rate is defined as ex post gross savings divided by ex ante savings.

b NTG is defined as ex post net savings divided by ex post gross savings.
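The footnote definitions in Table 131 can be checked directly against the savings figures in Table 130. A quick sketch, with values copied from the tables; the demand figure differs slightly from 177% because the published inputs are rounded:

```python
# Realization rate (Table 131, note a): ex post gross / ex ante savings.
# NTG (note b): ex post net / ex post gross. Values from Table 130.
ex_ante = {"kwh": 1_304_976, "kw": 152, "therms": 217_691}
gross   = {"kwh": 843_410,   "kw": 270, "therms": 201_550}
net     = {"kwh": 843_410,   "kw": 270, "therms": 201_550}

realization = {k: gross[k] / ex_ante[k] for k in ex_ante}
ntg         = {k: net[k] / gross[k] for k in gross}

assert round(realization["kwh"] * 100) == 65      # 65% electric energy
assert abs(realization["kw"] * 100 - 177) < 1     # ~177% peak demand
assert round(realization["therms"] * 100) == 93   # 93% natural gas
assert all(v == 1.0 for v in ntg.values())        # 100% NTG throughout
```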

Table 132 lists the 2018 IQW program budget and expenditures by fuel type. Compared to the percentage of goal attained based on ex ante savings (71% for electric and 79% for natural gas), program spending for electric savings closely aligned with program goal achievement; however, program spending for natural gas savings exceeded program goal achievement.

Table 132. 2018 IQW Program Expenditures

Fuel | Program Budget | Program Expenditures | Budget Spent (%)

Electric | $859,761 | $579,632 | 67%

Natural Gas | $1,354,340 | $1,264,785 | 93%

Research Questions

The evaluation team conducted qualitative and quantitative research activities to answer the following

key research questions for the program:

• What was the customer experience with the program, from sign-up through completion?

• How did customers become aware of the program?

• What were customer motivations for participation?

• How satisfied were customers with the program, including the participation process,

interactions with the program implementer, and satisfaction with each piece of equipment

received?

• How useful were the written reports that customers received after the assessment?

• Are customer homes more comfortable after the assessment?

• Are utility bills lower after the assessment?

• What do participants recommend for program improvement?

• During the assessment, were customers given additional energy-savings tips that they now have

put into practice? If yes, what have they done?

• As a result of the assessment, did customers install any other measures for which they did not

receive a utility rebate? If so, what were they?

• What are the program’s verified measure installations?


Impact Evaluation

This section details each step of the impact evaluation and its associated electric energy savings, peak

demand reduction, and natural gas energy savings.

Audited and Verified Savings

To determine the audited quantity and savings, the evaluation team first reviewed the program tracking

database for duplicates or other data quality issues. The evaluation team then looked for any

discrepancies between the program tracking data and the program scorecard and found two issues:

• Two duplicated projects were removed from the audited quantity to eliminate the duplicated measures.

• There was a data-quality issue wherein the description of a home’s heating or cooling system

included as part of the measure name did not align with what is stored separately in the heating

system or air conditioning type fields. Per guidance received in 2017, the evaluation team

treated the description included in the measure name as the correct type when discrepancies

existed. This issue did not affect savings or quantity.
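The duplicate check described in the first bullet can be sketched in a few lines; the record layout and project IDs here are invented for illustration:

```python
from collections import Counter

# Illustrative tracking-database extract; duplicated projects are
# identified and dropped before computing the audited quantity.
records = [
    {"project_id": "P-101", "measure": "LED"},
    {"project_id": "P-102", "measure": "Duct Sealing"},
    {"project_id": "P-101", "measure": "LED"},   # duplicate entry
]

# Flag any project ID that appears more than once.
counts = Counter(r["project_id"] for r in records)
duplicates = {pid for pid, n in counts.items() if n > 1}

# Keep only the first occurrence of each project.
seen, audited = set(), []
for r in records:
    if r["project_id"] not in seen:
        seen.add(r["project_id"])
        audited.append(r)

assert duplicates == {"P-101"}
assert len(audited) == 2
```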

The evaluation team also compared application materials and participant survey responses to the program tracking database and verified measure eligibility. One difference emerged between the ex ante scorecard quantity and the audited quantity: for four measures (LEDs, bathroom faucet aerators, kitchen faucet aerators, and energy-efficient showerheads), survey respondents verified more units than the program reported, indicating untracked measure installations. While this resulted in only minor increases in the number of units installed, it could indicate a larger problem and additional untracked savings that the program might claim.

The evaluation team established ISRs for all measures through the 2018 participant survey. Consistent

with the method established in the 2015 evaluation and used in the 2016 and 2017 evaluations, the

evaluation team used the percentage of customers who recalled receiving an assessment report as the

ISR for the energy assessment recommendations measure.

To increase the confidence level in the applied percentage, the evaluation team calculated a weighted

average of the ISRs established through the 2018 and 2017 participant surveys as the same program

trade allies installed the same program measures in both years.

The ISRs for all program-installed measures are listed in Table 133.


Table 133. 2018 IQW Program In-Service Rates

Measure | 2018 In-Service Rate | 2018 Sample Size (n) | 2017 In-Service Rate | 2017 Sample Size (n) | Final 2018 In-Service Rate | Total Sample Size (n)

LED | 96.5% | 61 | 98.1% | 63 | 97.3% | 124

Bath Aerator | 80.0% | 37 | 89.8% | 42 | 85.2% | 79

Kitchen Aerator | 83.3% | 36 | 100.0% | 41 | 92.2% | 77

Showerhead | 86.2% | 28 | 97.3% | 29 | 91.9% | 57

Pipe Wrap | 80.7% | 31 | 88.4% | 43 | 85.2% | 74

Filter Whistlea | 100.0% | 2 | 76.7% | 30 | 78.2% | 32

Water Heater Setback | 80.0% | 35 | 87.9% | 33 | 83.8% | 68

Programmable thermostat | 88.9% | 27 | 100.0% | 16 | 93.0% | 43

Duct Sealing | 80.0% | 35 | 84.0% | 50 | 82.4% | 85

Infiltration Reduction | 62.0% | 24 | 100.0% | 21 | 79.7% | 45

Attic Insulation | 100.0% | 37 | 100.0% | 2 | 100.0% | 39

Refrigerator Replacement | 100.0% | 26 | 100.0% | 16 | 100.0% | 42

Assessment Recommendations | 71.0% | 69 | 77.8% | 63 | 74.2% | 132

a The program installed a total of 47 filter whistles in 2018.
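The pooling of the 2018 and 2017 survey results can be reproduced for the LED row: each year's rate is weighted by its sample size to give the final 2018 rate, which then scales the audited quantity to the verified quantity (as in Table 134). A minimal sketch, using the published rounded rates, so the verified count lands within a few units of the report's figure:

```python
def pooled_isr(rate_2018, n_2018, rate_2017, n_2017):
    """Sample-size-weighted average of the two survey years' ISRs."""
    return (rate_2018 * n_2018 + rate_2017 * n_2017) / (n_2018 + n_2017)

# LED row of Table 133: 96.5% (n=61) in 2018 and 98.1% (n=63) in 2017.
led_isr = pooled_isr(0.965, 61, 0.981, 63)
assert round(led_isr * 100, 1) == 97.3   # matches the Final 2018 rate

# Verified quantity = audited quantity x ISR (Table 134). Small
# differences from the report come from rounding of the published rates.
audited_leds = 14_803
verified_leds = audited_leds * led_isr
assert abs(verified_leds - 14_408) < 10
```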

The ISRs fall below 100% for two reasons: (1) respondents reported removing items after the program installed them, and (2) respondents reported that a measure was not installed. A variety of factors, documented below, can affect ISRs. These calculations rely on self-reported surveys, and the evaluation team recognized that self-report error could affect the accuracy of the ISRs.

Respondents reported the following reasons for removing specific measures:

• LEDs: Nine respondents reported removing a total of 26 of the 749 LEDs installed through the

program because they stopped working (n=8, 21 bulbs) or they flickered (n=1, 5 bulbs).

• Kitchen aerators: Respondents removed the kitchen aerators because they replaced the whole

faucet (n=3), the aerator stopped working (n=3), and/or they did not like the water pressure

(n=3).

• Programmable thermostats: Two respondents reported removing programmable thermostats. One did so because the energy assessor would not touch the wires for the central air conditioner, so the air conditioning contractor replaced the thermostat when connecting the unit. The other replaced it because it appeared to be working incorrectly, although this respondent later learned the problem was with the furnace.

• Showerheads: Two respondents removed the showerheads because they stopped working.

• Other measures: The ISRs for pipe wrap, infiltration reduction, water heater setback, and duct

sealing are lower than 100% because survey respondents reported not receiving that measure

or service.

To calculate the verified quantity, the evaluation team multiplied audited quantity by ISR. Table 134

summarizes the audited quantity, applied ISRs, and resulting verified quantity per measure.


Table 134. 2018 IQW Audited and Verified Measure Quantities

Measure | Unit of Measure | Audited Quantity | ISR | Verified Quantity

LED (9 watt) | Lamp | 14,803 | 97.3% | 14,408

LED (9 watt) - Electric | Lamp | 362 | 97.3% | 352

Bathroom Aerator - Electric | Aerator | 26 | 85.2% | 22

Bathroom Aerator - Natural Gas | Aerator | 672 | 85.2% | 573

Kitchen Aerator - Electric | Aerator | 28 | 92.2% | 26

Kitchen Aerator - Natural Gas | Aerator | 543 | 92.2% | 501

Low-Flow Showerhead - Electric | Showerhead | 18 | 91.9% | 17

Low-Flow Showerhead - Natural Gas | Showerhead | 500 | 91.9% | 459

Pipe Wrap - Electric | Per foot | 71 | 85.2% | 60

Pipe Wrap - Natural Gas | Per foot | 4,617 | 85.2% | 3,931

HVAC Filter Whistle | Filter whistle | 47 | 78.2% | 37

Water Heater Temperature Setback - Electric Domestic Hot Water (DHW) | Participant | 3 | 83.8% | 3

Water Heater Temperature Setback - Natural Gas DHW | Participant | 518 | 83.8% | 434

Programmable thermostat - Electric Resistance Heated | Thermostat | 1 | 93.0% | 1

Programmable thermostat - Natural Gas Heated | Thermostat | 309 | 93.0% | 287

Duct Sealing - Dual Fuel | Participant | 401 | 82.4% | 330

Duct Sealing - Electric | Participant | 6 | 82.4% | 5

Duct Sealing - Electric Cooling | Participant | 1 | 82.4% | 1

Duct Sealing - Natural Gas | Participant | 327 | 82.4% | 269

20% Infiltration Reduction - Dual Fuel | Participant | 333 | 79.7% | 266

20% Infiltration Reduction - Electric Resistance Heated | Participant | 2 | 79.7% | 2

20% Infiltration Reduction - Natural Gas Heated | Participant | 97 | 79.7% | 77

Attic Insulation - Dual Fuel | Square foot | 271,370 | 100.0% | 271,370

Attic Insulation - Electric Heated | Square foot | 1,700 | 100.0% | 1,700

Attic Insulation - Natural Gas Heated | Square foot | 164,166 | 100.0% | 164,166

Refrigerator Replacement | Refrigerator | 315 | 100.0% | 315

Audit Recommendations - Dual Fuel | Participant | 897 | 74.2% | 666

Audit Recommendations - Electric | Participant | 26 | 74.2% | 19

Audit Recommendations - Natural Gas | Participant | 208 | 74.2% | 154

Total | N/A | 462,367 | N/A | 460,451

Note: Totals may not sum properly due to rounding.

Ex Post Gross Savings

The evaluation team referred to the Indiana TRM (v2.2) for variable assumptions to calculate ex post

gross electric energy, demand reduction, and natural gas energy savings. Where data were unavailable

in the Indiana TRM (v2.2), the evaluation team used data from the Illinois TRM (v6.0), the 2016

Pennsylvania TRM, or other secondary sources. The evaluation team revised assumptions for savings

estimates applicable to the NIPSCO service territory as needed. Appendix I. Residential Income Qualified

Weatherization Program Assumptions and Algorithms contains details on the specific algorithms,

variable assumptions, and references used for all program measure ex post gross calculations.


There are significant differences between ex ante and ex post gross savings, which are accounted for by

the following overarching factors:

• The evaluation team calculated ex post gross savings for most of the measures using the Indiana

TRM (v2.2). The planning and reporting assumptions used by NIPSCO to calculate ex ante savings

referenced the Indiana TRM (v2.2) and 2015 evaluation, which included taking the average of

the savings values provided in each source in some cases.

• The evaluation team used specific characteristics of installed measures provided within the

tracking data or program application materials for variables such as pre- and post-installation R-

values, square footage, and project location in ex post gross savings. NIPSCO used savings values

from past studies to calculate ex ante savings. Calculations using actual participant data are

invariably different from deemed values.

• The ex ante deemed savings value for several measures (filter whistle, duct sealing, air sealing,

and attic insulation) did not account for the heating and cooling systems of the home. However,

the evaluation team did account for these systems to calculate ex post gross per measure

savings; this generally resulted in an increase in energy savings and demand reduction when

central air conditioning was present and a decrease when it was not.

• The evaluation team also used geolocation for each customer address in the database, then

matched each address with the closest city from the Indiana TRM (v2.2), for example, South

Bend and Fort Wayne, to more precisely account for variations in climate.
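The geolocation matching step can be sketched as a nearest-city lookup. The coordinates below are approximate, and the real Indiana TRM (v2.2) city list is longer than the two examples named above:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * asin(sqrt(a))   # Earth radius ~3,958.8 miles

# Two of the TRM reference cities named in the report; approximate coords.
trm_cities = {"South Bend": (41.68, -86.25), "Fort Wayne": (41.08, -85.14)}

def closest_trm_city(lat, lon):
    """Match a geocoded customer address to the nearest TRM city."""
    return min(trm_cities, key=lambda c: haversine_miles(lat, lon, *trm_cities[c]))

# A Merrillville-area customer (approx. 41.48, -87.33) maps to South Bend.
assert closest_trm_city(41.48, -87.33) == "South Bend"
```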

The appendices include sources for the ex post gross savings calculation for each measure. Table 135

highlights the notable differences between ex ante and ex post gross estimates.

Table 135. 2018 IQW Notable Differences Between Ex Ante and Ex Post Gross

Measure Ex Ante Sources and

Assumptions Ex Post Sources and Assumptions

Primary Reasons for

Differences

Low-Flow

Showerhead

Roughly 90/10 split between

Indiana TRM (v2.2) and 2015

EM&V values; assumed

unknown housing type for

people per household and

faucets per household and 1.74

gpm-eff

Indiana TRM (v2.2) and participant

information in program tracking data,

assumed single-family housing; cold

water inlet temperature averaged

across customer location and customer

type; actual rating for gpm-eff of 1.5

Water temperatures, gpm

assumptions, people per

household, and

showerheads per household

Pipe Wrap Average between Indiana

TRM (v2.2) and 2015 EM&V

Indiana TRM (v2.2); entered water

temperature averaged across customer

locations

Entering water temperature

differences

Filter Whistle

2016 Pennsylvania TRM,

Pittsburgh deemed savings, ISR

= 0.474

2016 Pennsylvania TRM with Indiana

full load heating and cooling hours and

ISR = 1

ISR and full load heating and

cooling hours

Page 202: 2018 DSM Portfolio Evaluation Report - NIPSCO

195

| Measure | Ex Ante Sources and Assumptions | Ex Post Sources and Assumptions | Primary Reasons for Differences |
| --- | --- | --- | --- |
| Programmable Thermostat | Weighted average of Indiana TRM (v2.2) at 75% and 2015 EM&V at 25% | Indiana TRM (v2.2) and full load hours based on program tracking data home location; Btu/h heat: electric furnaces assumed 32,000 Btu/h per the 2016 Pennsylvania TRM | Ex post refers to Indiana TRM (v2.2) only, not a split between Indiana TRM (v2.2) and 2015 EM&V; also, differences in customer location and equipment rating |
| Duct Sealing | Reduction of HEA duct sealing savings values, which are a conservative average of Indiana TRM (v2.2) calculations and 2015 EM&V results, to account for smaller IQW homes and lower savings in the 2015 EM&V report. Program documentation reports that IQW savings are a 20% reduction of HEA savings; however, the ex ante demand reduction represents a 64% reduction of HEA savings. | Indiana TRM (v2.2) algorithm and 2015 EM&V for NIPSCO territory distribution efficiency values; full load heating and cooling hours averaged across customer location per customer type; Btu/h heat: electric furnaces assumed 32,000 Btu/h per the 2016 Pennsylvania TRM | Distribution efficiencies, equipment ratings, and full load hours; the ex ante IQW duct sealing kW per-measure savings are a 64% reduction of the HEA per-measure kW savings |
| Infiltration Reduction (Air Sealing) | Indiana TRM (v2.2) and 2015 EM&V | Indiana TRM (v2.2); blower door test results are not recorded, so the evaluation team used an average of delta cubic feet per minute (CFM) from the 2016 NIPSCO EM&V study, per customer type, and chose the n-factor corresponding to a 1.5-story home with normal exposure | Delta CFM is based on program participants of two years ago because current participant data are unavailable, and exposure level |
| Attic Insulation | 2015 EM&V results for savings per square foot; averaged savings of different pre-R-values | Indiana TRM (v2.2) using actual square footage and a sample of customer applications for actual pre-R-value; the evaluation team used interpolation to determine kWh/ksf, kW/ksf, and MMBtu/ksf savings per customer heating and cooling type | Pre-R-value and square footage assumptions of ex ante savings did not align with actual participation; methodology for determining savings/ksf |
| Refrigerator Replacement | Indiana TRM (v2.2) for Low Income, Early Replacement | ENERGY STAR calculator to estimate energy use of each existing refrigerator and ENERGY STAR ratings for energy use of new refrigerators | Reference and approach for determining energy use of existing and replacement refrigerators; the ENERGY STAR calculator provides more accurate savings estimates for the actual refrigerator replaced, which resulted in lower savings than Indiana TRM (v2.2) assumptions |
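The attic insulation ex post approach above interpolates per-ksf savings between tabulated pre-R-values. A minimal sketch of that interpolation step is below; the lookup values are purely illustrative and are not the Indiana TRM (v2.2) figures.

```python
# Hedged sketch: linear interpolation of per-ksf savings by pre-R-value.
# The (pre-R, savings/ksf) points below are illustrative placeholders only.

def interp_savings(pre_r, table):
    """Interpolate savings/ksf for a home's actual pre-R-value.

    table: sorted list of (pre_r_value, savings_per_ksf) points.
    Pre-R-values outside the table are clamped to the nearest endpoint.
    """
    if pre_r <= table[0][0]:
        return table[0][1]
    if pre_r >= table[-1][0]:
        return table[-1][1]
    for (r0, s0), (r1, s1) in zip(table, table[1:]):
        if r0 <= pre_r <= r1:
            frac = (pre_r - r0) / (r1 - r0)
            return s0 + frac * (s1 - s0)

# Illustrative lookup: lower existing R-value -> higher savings per ksf
kwh_per_ksf = [(0, 250.0), (5, 120.0), (11, 60.0)]

savings = interp_savings(2.5, kwh_per_ksf)  # halfway between the R-0 and R-5 points
total_kwh = savings * (800 / 1000)          # applied to 800 ft2 of insulated attic
```

The same pattern would apply for kW/ksf and MMBtu/ksf, with a separate lookup per customer heating and cooling type.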

Table 136 shows the ex ante deemed savings and ex post gross per-measure savings for 2018 IQW program measures.


Table 136. 2018 IQW Ex Ante and Ex Post Gross Per-Measure Savings Values

| Measure | Units | Ex Ante^a kWh | Ex Ante^a kW | Ex Ante^a therms | Ex Post kWh | Ex Post kW | Ex Post therms |
| --- | --- | --- | --- | --- | --- | --- | --- |
| LED (9 watt) - Joint | Lamp | 29.0 | 0.005 | (0.05) | 28.5 | 0.004 | (0.58)^d |
| LED (9 watt) - Electric | Lamp | 29.0 | 0.005 | - | 28.5 | 0.004 | - |
| Bath Aerator - Electric | Aerator | 41.6 | 0.011 | - | 34.0 | 0.003 | - |
| Bath Aerator - Natural Gas | Aerator | - | - | 1.85 | - | - | 1.50 |
| Kitchen Aerator - Electric | Aerator | 182.0 | 0.008 | - | 181.6 | 0.008 | - |
| Kitchen Aerator - Natural Gas | Aerator | - | - | 8.01 | - | - | 7.97 |
| Low-Flow Showerhead - Electric | Showerhead | 271.5 | 0.013 | - | 345.5 | 0.017 | - |
| Low-Flow Showerhead - Natural Gas | Showerhead | - | - | 11.94 | - | - | 2.31 |
| Pipe Wrap - Electric | Per foot | 18.6 | 0.003 | - | 22.3 | 0.003 | - |
| Pipe Wrap - Natural Gas | Per foot | - | - | 0.82 | - | - | 0.99 |
| Filter Whistle - Heating and Cooling | Filter Whistle | 58.0 | 0.023 | - | 139.4 | 0.049 | - |
| Filter Whistle - Heating Only | Filter Whistle | 58.0 | 0.023 | - | 107.0 | - | - |
| Water Heater Temperature Setback - Electric DHW | Participant | 86.4 | 0.010 | - | 81.6 | 0.009 | - |
| Water Heater Temperature Setback - Natural Gas DHW | Participant | - | - | 6.40 | - | - | 3.50 |
| Programmable Thermostat - Electric Resistance Heated | Thermostat | 378.0 | 0.200 | - | 910.1 | - | - |
| Programmable Thermostat - Natural Gas Heated with Central Air Conditioner (CAC) | Thermostat | - | - | 34.00 | 98.5 | - | 74.45 |
| Programmable Thermostat - Natural Gas Heated without CAC | Thermostat | - | - | 34.00 | - | - | 74.48 |
| Duct Sealing - Dual Fuel | Participant | 96.6 | 0.136 | 91.00 | 120.1 | 0.395 | 94.27 |
| Duct Sealing - Electric | Participant | 829.4 | 0.136 | - | 1,248.0 | 0.395 | - |
| Duct Sealing - Electric Cooling | Participant | 96.6 | 0.136 | - | 120.1 | 0.395 | - |
| Duct Sealing - Natural Gas | Participant | - | - | 91.00 | - | - | 92.88 |
| 20% Infil. Reduction - Dual Fuel with CAC | Participant | 413.9 | - | 127.00 | 85.0 | 0.044 | 105.00 |
| 20% Infil. Reduction - Dual Fuel without CAC | Participant | 413.9 | - | 127.00 | 52.8 | - | 110.78 |
| 20% Infil. Reduction - Electric Resistance Heated with CAC | Participant | 3,115.5 | - | - | 2,431.3 | 0.135 | - |
| 20% Infil. Reduction - Electric Resistance Heated without CAC | Participant | 3,115.5 | - | - | 2,375.1 | - | - |
| 20% Infil. Reduction - Natural Gas Heated | Participant | - | - | 127.00 | - | - | 107.89 |
| Attic Insulation - Dual Fuel with CAC^b | Square foot | 1.1 | - | 0.15 | 0.22 | - | 0.20 |
| Attic Insulation - Electric Heated with CAC^c | Square foot | 5.8 | - | - | 3.1 | - | - |
| Attic Insulation - Natural Gas Heated | Square foot | - | - | 0.15 | - | - | 0.18 |
| Refrigerator Replacement | Participant | 1,054.0 | 0.024 | - | 756.7 | 0.108 | - |
| Assessment Recommendations - Joint | Participant | 21.6 | 0.012 | 2.74 | 21.6 | 0.012 | 2.74 |
| Assessment Recommendations - Electric | Participant | 21.6 | 0.012 | - | 21.6 | 0.012 | - |
| Assessment Recommendations - Natural Gas | Participant | - | - | 2.74 | - | - | 2.74 |

Note: Totals may not sum properly due to rounding.
^a Values presented at a measure level represent audited values, since the scorecard provides only savings totals.
^b Ex post gross kilowatt savings for attic insulation are 0.000097498336599514 for natural gas heat with central air conditioning.
^c Ex post gross kilowatt savings for attic insulation are 0.00005984 for electric heat with central air conditioning.
^d The therm penalty results from an increased heating load when incandescent bulbs are replaced with LEDs.


Realization Rates

Table 137, Table 138, and Table 139 show the program's ex ante reported savings, verified savings, and ex post gross savings, respectively. The program achieved electric energy, peak demand reduction, and natural gas energy realization rates of 65%, 177%, and 93%, respectively.

Three measures account for the low realization rate for electric energy (kilowatt-hour) savings: attic insulation (dual fuel), air sealing (dual fuel), and refrigerator replacement. The evaluation team calculated a substantially lower ex post gross electric energy savings value for each of these measures (for reasons described in Table 135, above). Given the large number of homes insulated through the program, this difference resulted in a considerable decrease in ex post gross savings (approximately 242,436 kWh) relative to ex ante savings. The reductions in energy savings for air sealing and refrigerator replacement also led to considerable decreases in ex post gross savings, approximately 119,562 kWh and 94,698 kWh, respectively.

The evaluation team attributed much higher peak demand (kilowatt) reduction to duct sealing measures when central air conditioning was present and to the refrigerator replacement measure, which resulted in the high realization rate for demand reduction.

With a realization rate of 93%, the ex post gross therm savings were similar to the ex ante savings at the program level. While there are some differences between measure-level ex post gross and ex ante savings, the bulk of the difference at the program level is due to a reduction in the verified number of units.
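The realization-rate arithmetic here is simply ex post gross savings divided by ex ante reported savings. A short sketch using the program-level totals from Tables 137 through 139:

```python
# Realization rate = ex post gross savings / ex ante reported savings,
# using the IQW program-level totals from Tables 137-139.

ex_ante = {"kwh": 1_304_976, "kw": 152.34, "therms": 217_691}
ex_post = {"kwh": 843_410, "kw": 269.58, "therms": 201_550}

realization = {k: ex_post[k] / ex_ante[k] for k in ex_ante}
# kWh ~ 64.6%, kW ~ 177.0%, therms ~ 92.6%, matching the reported rates
```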


Table 137. 2018 IQW Ex Ante and Ex Post Gross Electric Energy Savings

| Measure | Ex Ante^a Deemed Savings (kWh/yr/unit) | Ex Ante^a Savings (kWh/yr) | Audited Gross (kWh/yr) | Verified Gross (kWh/yr) | Ex Post Gross Per-Measure (kWh/yr/unit) | Ex Post Gross (kWh/yr) |
| --- | --- | --- | --- | --- | --- | --- |
| LED (9 watt) | 29.0 | 429,374 | 429,287 | 417,825 | 28.5 | 410,927 |
| LED (9 watt) - Electric | 29.0 | 10,498 | 10,498 | 10,218 | 28.5 | 10,027 |
| Bathroom Aerator - Electric | 41.6 | 1,082 | 1,082 | 922 | 34.0 | 754 |
| Bathroom Aerator - Natural Gas | - | - | - | - | - | - |
| Kitchen Aerator - Electric | 182.0 | 5,096 | 5,096 | 4,699 | 181.6 | 4,689 |
| Kitchen Aerator - Natural Gas | - | - | - | - | - | - |
| Low-Flow Showerhead - Electric | 271.5 | 4,887 | 4,887 | 4,489 | 345.5 | 5,712 |
| Low-Flow Showerhead - Natural Gas | - | - | - | - | - | - |
| Pipe Wrap - Electric | 18.6 | 1,321 | 1,321 | 1,124 | 22.3 | 1,348 |
| Pipe Wrap - Natural Gas | - | - | - | - | - | - |
| HVAC Filter Whistle | 58.0 | 2,726 | 2,726 | 2,131 | 139.4, 107.0^b | 4,538 |
| Water Heater Temperature Setback - Electric | 86.4 | 259 | 259 | 217 | 81.6 | 205 |
| Water Heater Temperature Setback - Natural Gas | - | - | - | - | - | - |
| Programmable Thermostat - Electric Resistance Heated | 378.0 | 378 | 378 | 352 | 910.1 | 847 |
| Programmable Thermostat - Natural Gas Heated | - | - | - | - | 98.5 | 14,847 |
| Duct Sealing - Dual Fuel | 96.6 | 38,833 | 38,737 | 31,900 | 120.1 | 39,654 |
| Duct Sealing - Electric | 829.4 | 4,976 | 4,976 | 4,098 | 1,248.0 | 6,166 |
| Duct Sealing - Electric Cooling | 96.6 | 97 | 97 | 80 | 120.1 | 99 |
| Duct Sealing - Natural Gas | - | - | - | - | - | - |
| 20% Infiltration Reduction - Dual Fuel | 413.9 | 137,829 | 137,829 | 109,891 | 85.0, 52.8^c | 20,665 |
| 20% Infiltration Reduction - Electric Resistance Heated | 3,115.5 | 6,231 | 6,231 | 4,968 | 2,431.3, 2,375.1^d | 3,832 |
| 20% Infiltration Reduction - Natural Gas Heated | - | - | - | - | - | - |
| Attic Insulation - Dual Fuel | 1.1 | 298,507 | 298,507 | 298,507 | 0.2 | 60,620 |
| Attic Insulation - Electric Heated | 5.8 | 9,860 | 9,860 | 9,860 | 3.1 | 5,311 |
| Attic Insulation - Natural Gas Heated | - | - | - | - | - | - |
| Refrigerator Replacement | 1,054.0 | 333,064 | 332,010 | 332,010 | 756.7 | 238,366 |
| Audit Recommendations - Dual Fuel | 21.6 | 19,397 | 19,375 | 14,386 | 21.6 | 14,386 |
| Audit Recommendations - Electric | 21.6 | 562 | 562 | 417 | 21.6 | 417 |
| Audit Recommendations - Natural Gas | - | - | - | - | - | - |
| Total Program Savings | N/A | 1,304,976 | 1,303,717 | 1,248,092 | N/A | 843,410 |
| Total Program Realization Rate | | | | | | 64.6% |

Note: Totals may not sum properly due to rounding.
^a Values presented at a measure level represent audited values, since the scorecard provides only savings totals.
^b Values for "Filter Whistle - Natural Gas Heat with CAC" and "Filter Whistle - Electric Heat with CAC," respectively.
^c Values for "20% Infil. Reduction - Dual Fuel with CAC" and "20% Infil. Reduction - Dual Fuel without CAC," respectively.
^d Values for "20% Infil. Reduction - Electric Resistance Heated with CAC" and "20% Infil. Reduction - Electric Resistance Heated without CAC," respectively.

Table 138. IQW Ex Ante and Ex Post Gross Peak Demand Reduction

| Measure | Ex Ante^a Deemed Reduction (kW/unit) | Ex Ante^a Reduction (kW) | Audited Gross (kW) | Verified Gross (kW) | Ex Post Gross Per-Measure (kW/unit) | Ex Post Gross (kW) |
| --- | --- | --- | --- | --- | --- | --- |
| LED (9 watt) | 0.005 | 74.19 | 74.03 | 72.06 | 0.004 | 55.93 |
| LED (9 watt) - Electric | 0.005 | 1.81 | 1.81 | 1.76 | 0.004 | 1.37 |
| Bathroom Aerator - Electric | 0.011 | 0.28 | 0.29 | 0.24 | 0.003 | 0.07 |
| Bathroom Aerator - Natural Gas | - | - | - | - | - | - |
| Kitchen Aerator - Electric | 0.008 | 0.23 | 0.22 | 0.21 | 0.008 | 0.21 |
| Kitchen Aerator - Natural Gas | - | - | - | - | - | - |
| Low-Flow Showerhead - Electric | 0.013 | 0.23 | 0.23 | 0.21 | 0.017 | 0.28 |
| Low-Flow Showerhead - Natural Gas | - | - | - | - | - | - |
| Pipe Wrap - Electric | 0.003 | 0.18 | 0.17 | 0.15 | 0.003 | 0.15 |
| Pipe Wrap - Natural Gas | - | - | - | - | - | - |
| HVAC Filter Whistle | 0.023 | 1.08 | 1.08 | 0.84 | 0.049 | 0.91 |
| Water Heater Temperature Setback - Electric | 0.010 | 0.03 | 0.03 | 0.03 | 0.009 | 0.02 |
| Water Heater Temperature Setback - Natural Gas | - | - | - | - | - | - |
| Programmable Thermostat - Electric Resistance Heated | 0.200 | 0.20 | 0.20 | 0.19 | - | - |
| Programmable Thermostat - Natural Gas Heated | - | - | 0.00 | 0.00 | 0.000 | 0.00 |
| Duct Sealing - Dual Fuel | 0.136 | 54.78 | 54.54 | 44.91 | 0.395 | 130.35 |
| Duct Sealing - Electric | 0.136 | 0.82 | 0.82 | 0.67 | 0.395 | 1.95 |
| Duct Sealing - Electric Cooling | 0.136 | 0.14 | 0.14 | 0.11 | 0.395 | 0.33 |
| Duct Sealing - Natural Gas | - | - | - | - | - | 0.00 |
| 20% Infiltration Reduction - Dual Fuel | - | - | - | - | 0.044, 0.000^b | 9.09 |
| 20% Infiltration Reduction - Electric Resistance Heated | - | - | - | - | 0.135, 0.000^c | 0.11 |
| 20% Infiltration Reduction - Natural Gas Heated | - | - | - | - | - | - |
| Attic Insulation - Dual Fuel^d | - | - | - | - | 0.000 | 26.46 |
| Attic Insulation - Electric Heated^e | - | - | - | - | 0.000 | 0.10 |
| Attic Insulation - Natural Gas Heated | - | - | - | - | - | - |
| Refrigerator Replacement | 0.024 | 7.45 | 7.56 | 7.56 | 0.108 | 34.02 |
| Audit Recommendations - Dual Fuel | 0.012 | 10.62 | 10.76 | 7.99 | 0.012 | 7.99 |
| Audit Recommendations - Electric | 0.012 | 0.31 | 0.31 | 0.23 | 0.012 | 0.23 |
| Audit Recommendations - Natural Gas | - | 0.00 | 0.00 | 0.00 | 0.000 | 0.00 |
| Total Program Savings | N/A | 152.34 | 152.19 | 137.16 | N/A | 269.58 |
| Total Program Realization Rate | | | | | | 177.0% |

Note: Totals may not sum due to rounding.
^a Values presented at a measure level represent audited values, since the scorecard provides only savings totals.
^b Values for "20% Infil. Reduction - Dual Fuel with CAC" and "20% Infil. Reduction - Dual Fuel without CAC," respectively.
^c Values for "20% Infil. Reduction - Electric Resistance Heated with CAC" and "20% Infil. Reduction - Electric Resistance Heated without CAC," respectively.
^d Ex post gross kW savings for attic insulation are 0.000097498336599514 for natural gas heat with central air conditioning.
^e Ex post gross kW savings for attic insulation are 0.00005984 for electric heat with central air conditioning.

Page 209: 2018 DSM Portfolio Evaluation Report - NIPSCO

202

Table 139. IQW Ex Ante and Ex Post Gross Natural Gas Energy Savings

Measure

Ex Antea Deemed

Natural Gas Energy

Savings

(therms/yr/unit)

Ex Antea Natural

Gas Energy

Savings

(therms/yr)

Audited Gross

Natural Gas

Energy Savings

(therms/yr)

Verified Gross

Natural Gas

Energy Savings

(therms/yr)

Ex Post Gross Per-

Measure Natural

Gas Energy Savings

(therms/yr/unit)

Ex Post Gross

Natural Gas

Energy Savings

(therms/yr)

LED (9 watt) (0.05)d (740) (740) (720) (0.58) (8,395)

LED (9 watt) - Electric - - - - - -

Bathroom Aerator - Electric - - - - - -

Bathroom Aerator - Natural Gas 1.85 1,241 1,243 1,059 1.50 858

Kitchen Aerator - Electric - - - - - -

Kitchen Aerator - Natural Gas 8.01 4,325 4,349 4,011 7.97 3,990

Low-Flow Showerhead - Electric - - - - - -

Low-Flow Showerhead - Natural Gas 11.94 5,934 5,970 5,483 2.31 1,060

Pipe Wrap - Electric - - - - - -

Pipe Wrap - Natural Gas 0.82 3,793 3,786 3,224 0.99 3,907

HVAC Filter Whistle - - - - - -

Water Heater Temperature Setback - Electric - - - - - -

Water Heater Temperature Setback - Natural Gas 6.40 3,322 3,315 2,779 3.05 1,518

Programmable thermostat - Electric Resistance

Heated - - - - - -

Programmable thermostat - Natural Gas Heated 34.00 10,506 10,506 9,773 74.45,74.48b 21,403

Duct Sealing - Dual Fuel 91.00 36,582 36,491 30,050 94.27 31,130

Duct Sealing - Electric - - - - - -

Duct Sealing - Electric Cooling - - - - - -

Duct Sealing - Natural Gas 91.00 29,757 29,757 24,505 92.88 25,012

20% Infiltration Reduction - Dual Fuel 127.00 42,291 42,291 33,719 105.0,110.78c 28,219

20% Infiltration. Reduction - Electric Resistance

Heated - - - - - -

20% Infiltration Reduction - Natural Gas Heated 127.00 12,319 12,319 9,822 107.89 8,344

Attic Insulation - Dual Fuel 0.15 40,706 40,706 40,706 0.20 53,157

Attic Insulation - Electric Heated - - - - - -

Attic Insulation - Natural Gas Heated 0.15 24,625 24,625 24,625 0.18 29,100

Refrigerator Replacement - N/A - - - -

Audit Recommendations - Dual Fuel 2.74 2,461 2,458 1,825 2.74 1,825

Page 210: 2018 DSM Portfolio Evaluation Report - NIPSCO

203

Measure

Ex Antea Deemed

Natural Gas Energy

Savings

(therms/yr/unit)

Ex Antea Natural

Gas Energy

Savings

(therms/yr)

Audited Gross

Natural Gas

Energy Savings

(therms/yr)

Verified Gross

Natural Gas

Energy Savings

(therms/yr)

Ex Post Gross Per-

Measure Natural

Gas Energy Savings

(therms/yr/unit)

Ex Post Gross

Natural Gas

Energy Savings

(therms/yr)

Audit Recommendations - Electric - - - - - -

Audit Recommendations - Natural Gas 2.74 570 570 423 2.74 423

Total Program Savings N/A 217,691 217,646 191,283 N/A 201,550

Total Program Realization Rate 92.6%

Note: Totals may not sum properly due to rounding. a Values presented at a measure-level represent audited values since the scorecard provides only savings totals. b Values for "Programmable thermostat - Natural Gas Heated with CAC" and "Programmable thermostat - Natural Gas Heated without CAC," respectively. c Values for "20% Infil. Reduction - Dual Fuel with CAC" and "20% Infil. Reduction - Dual Fuel without CAC," respectively. d The therm penalty results from an increased heating load when incandescent bulbs are replaced with LEDs.


Ex Post Net Savings

The ex post net savings values reflect savings attributed to the program after adjusting for freeridership and spillover by applying an NTG ratio.

Evaluators typically calculate NTG using survey participants' self-reported responses to questions about what participants would have done in the absence of the program (freeridership) and the influence the program had on their decision to implement additional energy efficiency projects after participation (spillover). Because of the income-qualified focus of the program, the evaluation team used an industry-standard assumption that, absent the program, participants would not have purchased and installed the measures provided due to financial constraints. In this situation, the NTG ratio is 100%, where both freeridership and spillover equal 0%.

With an NTG ratio of 100%, the ex post net savings are identical to the ex post gross savings.
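A minimal sketch of the NTG adjustment described above, using the common net-to-gross convention (NTG = 1 - freeridership + spillover):

```python
# With freeridership and spillover both 0% for this income-qualified program,
# NTG = 1.0 and ex post net savings equal ex post gross savings.

def ex_post_net(gross, freeridership=0.0, spillover=0.0):
    ntg = 1.0 - freeridership + spillover
    return gross * ntg

# Program-level electric example: net savings equal the gross 843,410 kWh
net_kwh = ex_post_net(843_410)
```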

Process Evaluation

This section outlines the evaluation team's process evaluation findings derived from conducting interviews, reviewing program materials, and surveying program participants.

The evaluation team conducted interviews with the NIPSCO program coordinator and with program management staff at Lockheed Martin Energy, the program implementer. The evaluation team also reviewed promotional materials and assessment reports, and surveyed 69 of the 1,131 participants who had assessments performed.

Program Design and Delivery

Through the IQW program, NIPSCO provides walk-through energy assessments and direct installations of energy efficiency measures to income-qualified single-family homeowners or renters (with landlord approval). Customers are enrolled in the program on a first-come, first-served basis. Program staff reported that the primary program objectives are to increase customer satisfaction, serve customers who often do not have access to other programs, and reduce the burden of energy bills for families in need.

The program is open to income-qualified residential natural gas and/or electric customers living in homes that have not been weatherized in the past 10 years. The account holder must receive Low Income Home Energy Assistance, Temporary Assistance for Needy Families, or Supplemental Security Income, and total household income must meet Low-Income Guidelines. If customers do not receive any of these, they may still qualify for program services if they meet the DOE's Weatherization Assistance Program Low-Income Guidelines (per program documents, customers are income-qualified if they are at or below 200% of current federal poverty guidelines).

Lockheed Martin Energy is responsible for program design and management, contractor payment processing, quality assurance and quality control, technical training, and contractor support to facilitate the quality installation of energy-efficient measures. It also recruits and manages a network of trade allies (program-approved contractors and energy assessors) to implement the IQW program. The trade ally network for the IQW program consists largely of community-based organizations that have staff to perform the work and existing relationships with those in need. Lockheed Martin Energy trains the trade allies to ensure work quality and customer service meet program standards. Lockheed Martin Energy and NIPSCO jointly promote the program, and trade allies may also do their own marketing for the program.

During 2018, Lockheed Martin Energy and NIPSCO recruited participants through a variety of marketing efforts: email blasts to NIPSCO's energy assistance customers, a program introduction letter placed at community locations that income-qualified residents may visit for assistance, and outreach events, including an Earth Day event and a health fair at a large employer in the area. Additionally, Lockheed Martin Energy screened customers interested in HEA for income eligibility, so the program also benefits from the direct mail, email, radio, TV, and social media campaigns for that program. Interested customers enroll in the IQW program by calling the NIPSCO Residential Energy Efficiency program hotline or through the website.

Depending on the conditions and current equipment in the home, the energy assessor installs any or all of the following measures during the assessment:

• LED bulbs (up to 22)
• Programmable thermostat
• Kitchen faucet aerator
• Bathroom faucet aerators (up to two)
• Low-flow showerheads (up to two)
• Pipe wrap (up to 10 feet)
• Furnace filter whistle
• Water heater temperature setback
• Infiltration reduction < 20% (air sealing)
• Duct sealing

Participants may also qualify for a refrigerator replacement and attic insulation after the assessment. IQW participants with a primary refrigerator that is at least 10 years old may qualify for a refrigerator replacement. After a visual inspection, the energy assessor indicates eligibility on the application form submitted to Lockheed Martin Energy. Lockheed Martin Energy then processes the application and submits the request for the refrigerator replacement to its subcontractor, ARCA Recycling (ARCA), which contacts the customer to schedule a delivery date. Lockheed Martin Energy staff reported that the time between audit and delivery is typically four to six weeks.

Qualifying program participants can also receive up to 1,200 square feet of attic insulation, installed at a later date by the trade ally that performed the assessment, at no cost. To receive the attic insulation through the program, the existing attic insulation level must be less than R-11.

At the end of the assessment, the energy assessor provides a CHA report outlining information about the home's existing conditions, current energy use, and heating and cooling loads, as well as a prioritized (low, medium, and high) listing of recommendations specific to the home. These recommendations are based on the energy assessor's findings and include estimated costs and payback periods. The report also lists zero-cost and low-cost solutions to help lower energy costs, with a link to NIPSCO's website, but does not include details about other NIPSCO energy efficiency programs or incentives.

The assessor reviews the CHA report with the customer and discusses the findings and recommendations. This ensures that customers understand the information provided and the next steps they can take. In addition to the CHA report, which may contain recommendations that refer customers to other NIPSCO programs, Lockheed Martin Energy stated that the energy assessors also leave behind promotional materials for other programs.

Changes from 2017 Design

While Lockheed Martin Energy made no changes to the 2018 program design from 2017, they made the following improvements to program processes:

• Reduced the length of the CHA report significantly. They are also assessing other platforms to produce an assessment report that may offer enhanced reporting capabilities.
• Developed a tri-fold for the HVAC Rebates program that is left behind with program participants to increase awareness of other NIPSCO programs.
• Implemented procedures to improve the scheduling process, such as sending weekly email notifications to trade allies and follow-up procedures for hard-to-reach customers. However, the NIPSCO program manager said they received several phone calls during 2018 from customers who tried to schedule assessments online but were not contacted to complete scheduling.

Lockheed Martin Energy also took steps to manage customer expectations regarding the replacement refrigerator. Trade allies provided a brochure detailing the replacement refrigerator, so customers knew what they would be receiving before the replacement refrigerator was ordered. The trade allies also supplied ARCA with the current refrigerator's cubic feet, dimensions, and available space around the unit to help provide a replacement refrigerator of a similar size. If there was a significant size difference, the trade allies contacted customers prior to delivery to ensure they were still interested in the replacement refrigerator.

2017 Recommendation Status

The evaluation team followed up with NIPSCO on the status of the 2017 evaluation recommendations. Table 140 lists the 2017 IQW program evaluation recommendations and NIPSCO's progress toward addressing those recommendations to date.


Table 140. Status of 2017 IQW Program Evaluation Recommendations

| Summary of 2017 Recommendations | Status |
| --- | --- |
| Record all values necessary for accurate savings calculations and verification directly in the program tracking database, including the following for all program customers: customer type (combo, electric, or natural gas); number of above-ground stories; heating system type; heating capacity (Btu/h); cooling system type; (for insulation measures) pre-install and post-install R-value; (for lighting measures) existing bulb type and wattage; (for air sealing measures) pre-install and post-install CFM | Completed in 2019. Lockheed Martin Energy currently reports customer type, heating system type, cooling system, insulation, and lighting details in the application. For 2019, Lockheed Martin Energy will enter these parameters and pre/post CFM into the tracking system. Heating and cooling efficiency and system capacity use the Indiana TRM (v2.2) and evaluation results to calculate the deemed savings for each measure. Lockheed Martin Energy evaluated these values and reported they produce reasonable deemed savings; therefore, Lockheed Martin Energy will not record these variables at the project level. |
| Consider using blower door tests to measure air infiltration before and after performing air and duct sealing for more accurate savings calculations, using the modified blower door subtraction method. | Partially completed. Lockheed Martin Energy performs blower door testing for air sealing but not duct sealing. |
| Document the sources, assumptions, and, where possible, algorithms and key inputs included in the derivation of each deemed savings value for all program measures. | Completed in 2019. Lockheed Martin Energy is in the process of addressing this recommendation for the 2019-2021 program cycle. |
| Consider the fuel type and heating and cooling systems in measure descriptions and savings calculations. Use the ex post gross per-measure savings values provided in this report, which incorporate these factors, in the program for measures included in this evaluation. | Completed in 2019. Lockheed Martin Energy updated some measure descriptions and savings calculations to follow this recommendation, and will review and update descriptions and savings calculations where relevant for 2019. |
| Educate energy assessors on the importance of accurately recording measures installed, providing the CHA report, and verbally informing customers of the measures installed so that customers can accurately verify the installation of those measures. | Completed in 2018. Lockheed Martin Energy reported that the HEA and IQW programs require the trade ally to complete and generate a CHA, review results with the customer, and leave a copy of the CHA with the customer to reference. |
| Consider establishing additional data capture activities to document measure installation conditions, such as having energy assessors take pictures of post-installation conditions while on-site. | Declined. While trade allies are not required to provide post-installation pictures, Lockheed Martin Energy inspects 10% of all projects to verify installation and takes pictures for reference where necessary. |
| Provide additional training to assessors on how to print and email the report while at the participant's home to increase the likelihood that participants will refer to the report and recall that they received it. | Completed in 2018. Lockheed Martin Energy reminded and educated trade allies throughout the year, in person and via email, on properly providing the CHA to customers. |
| Redesign the CHA report so that it is shorter and more focused. This will make the report easier for the assessor to print at the home and easier for the participant to read. | Completed in 2018. Lockheed Martin Energy decreased the report length from about 40 pages to 21 pages for 2018, and is in the process of evaluating other CHA platforms to implement for 2019. |
| Request a participant signature verifying that the assessor has discussed the findings and recommendations of the assessment and has provided a CHA report. | Completed in 2018. Lockheed Martin Energy requires customers to sign a work authorization form upon project completion, and is in the process of assessing other CHA platforms to implement for 2019 that may have signature acknowledgment capability. |
| Establish and communicate clear product and delivery timing expectations across program staff and participants for the replacement refrigerator. This includes determining and clearly explaining the appropriate size of the replacement refrigerator and the duration of refrigerator delivery processes. NIPSCO may want to consider maintaining equivalence in size when replacing the refrigerator to meet customer expectations of replacement refrigerator size. | Completed in 2018. Lockheed Martin Energy developed marketing leave-behind materials for customers explaining the replacement process, educated trade allies on proper information acquisition, and added replacement data input to LM Captures to properly track and identify missing information. |
| To increase program awareness, use a folder of leave-behind materials to help facilitate the conversation about assessment results, which is identified as a best practice. Develop a leave-behind folder in conjunction with HEA to hold literature about NIPSCO's other programs, other energy-related resources available to income-qualified participants, and the printed copy of the CHA report. This conversation should be framed in terms of how much money the direct-install measures will save the participant per month and what type of impact additional actions can make in the short term. Additionally, the folder of materials may aid in recall of receiving the report. | Completed in 2018. Lockheed Martin Energy developed additional marketing collateral and distributed it for both the HEA and IQW programs, as well as the HVAC Rebates tri-fold. Additionally, NIPSCO approved yard signs and door hangers for trade allies to use at project sites. Lockheed Martin Energy conducted targeted marketing campaigns via direct mail, bill inserts, and social media to improve awareness. The CHA outlines potential savings and recommendations. |

Participant Feedback

The evaluation team surveyed 69 customers who participated in the program in 2018. The key findings

follow.

Energy Efficiency Awareness and Marketing

As shown in Figure 65, IQW respondents most frequently learned about the program through word of

mouth (41%), similar to 2017 (49%). Fewer 2018 respondents stated that they learned of the program

through a NIPSCO bill insert (19%), community service organization (9%), direct mail (7%), and phone

calls from NIPSCO (4%). However, significantly more respondents learned from bill inserts and

community-based organizations in 2018 than in 2017.42 As noted above, the implementer placed

program introduction letters at community locations income-qualified residents may visit for assistance

in 2018.

42 In 2017, 8% of respondents (n=63) learned about the program through bill inserts, a slight increase from the

6% of the 2016 survey respondents (n=63). While the difference between 2016 and 2017 is not statistically

significant, the difference between 2017 and 2018 is significant at p≤0.10 (90% confidence). No respondents in

2016 or 2017 reported learning about the program from community-based organizations. The difference

between 2017 and 2018 is significant at p≤0.05 (95% confidence).
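The footnote does not name the statistical test used; a standard choice for comparing survey proportions across years is a two-proportion z-test with a pooled standard error. The sketch below applies that test to the bill-insert figures from this footnote (8% of 63 respondents in 2017 versus 19% of 69 in 2018); the choice of test is an assumption, not the report's stated method.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(p1, n1, p2, n2):
    """Two-sided two-proportion z-test using a pooled standard error."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Bill-insert awareness: 19% of 69 respondents (2018) vs. 8% of 63 (2017)
z, p = two_proportion_z(0.19, 69, 0.08, 63)
print(f"z = {z:.2f}, p = {p:.3f}")  # z ≈ 1.83, p ≈ 0.067: significant at p≤0.10 but not p≤0.05
```

This result is consistent with the footnote's finding that the 2017-to-2018 difference for bill inserts is significant at the 90% but not the 95% confidence level.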

Figure 65. How Participants Learned about the IQW Program

Source: Participant survey. Question: “How did you learn about NIPSCO’s IQW program?”

(Multiple answers allowed)

Similar to 2017 (24%, n=15), less than one-quarter (23%, n=16) of IQW respondents were aware that

NIPSCO offers other energy efficiency programs. These respondents said they were aware of the Home

Energy Assessment (three respondents) and Appliance Recycling (two respondents) programs. Nine

respondents named programs other than those listed on the survey such as discounted rate programs

like Customer Assistance for Residential Energy and BudgetPlan Billing (two respondents), community

service program (one respondent), and tree planting (one respondent). Two others were unable to

describe a specific NIPSCO program.

Participation Drivers

Respondents most commonly reported participating in the IQW program to save money on utility bills

(45%), which has been the top reason for the last three years (Figure 66). Significantly more respondents

reported participating to replace old or broken equipment in 2018 (19%) than in 2017 (13%), while

significantly fewer respondents participated to save energy in 2018 (13%) than in 2017 (31%).

Figure 66. Top Reasons for Participating in IQW Program

Note: Boxed values indicate differences between years is significant at p≤0.10 (90% confidence).

Source: Participant survey. Question: “Why did you decide to participate in NIPSCO’s IQW program?”

(Multiple answers allowed)

In-Home Energy Assessment

After conducting their assessment of the home and installing the energy-efficient items, per the

program design, the energy assessor generates the CHA report and discusses their findings and

recommendations with the participant before leaving the home. The report contains the findings of

their assessment and additional energy-saving recommendations.

Most respondents (87%, n=60) recalled the assessor discussing the findings and recommendations of

their assessment, similar to 2017 (81%, n=51). Seventy percent (n=48) recalled receiving a report,

somewhat fewer than in 2017 (78%, n=49), although the difference was not statistically significant.

Of the 70% of respondents who recalled receiving the report, most received it after the assessment

(82%), with 54% of those receiving a printed report by mail, 17% receiving the report by email, and 9%

receiving a printed report that the trade ally dropped off after the assessment. Only 22% of the

respondents received a report during the assessment as specified in the program design, with 20%

receiving a printed report and 2% receiving the report via email (Figure 67).43 In interviews, the

implementer noted that the length of the report (about 20 pages, shortened from about 40 pages) is a

barrier to delivering the report to the customer during the assessment.

43 2018 findings related to report delivery timing and format are similar to 2017, with no significant differences.

In 2017, of the 78% of respondents who recalled receiving the report, most received it after the assessment

(73%), with 65% of those receiving a printed report by mail and 8% receiving a report by email. Only 29%

received a report during the assessment as specified in the program design, with 27% receiving a printed

report and 2% receiving a report via email.

Figure 67. When did the Energy Assessor Provide the CHA Report

Source: Participant survey. Question: “How was the report provided to you?”

(Multiple answers allowed)

Whether information was provided verbally or through the CHA report, most respondents thought the

information they received during the assessment was very useful (81%, n=50) or somewhat useful (16%,

n=10). Five respondents reported not receiving a report at all and provided suggestions for improving

the information provided during the assessment. All five respondents suggested that the energy

assessor provide a report.

Over half (56%) of respondents who received a CHA report made at least some of the recommended

improvements. However, as shown in Figure 68, 28% of respondents reported that they did not make

any of the recommended improvements. The expense of making improvements was the most

commonly cited reason for not acting on the recommendations.

Figure 68. Follow Through on Recommended Improvements

Source: Participant survey. Question: “After participating in NIPSCO’s Income-Qualified Weatherization

program, would you say you have made all, most, some or none of the energy-saving recommendations

made in the assessment report you received?”

Respondents were generally satisfied with direct-install measures or services received, with 84% to 97%

of respondents somewhat or very satisfied (Figure 69). At least 80% of respondents were very satisfied

with all measures except for programmable thermostats, of which 65% of respondents were very

satisfied.

Figure 69. Satisfaction with Direct-Install Measures

Source: Participant survey. Question: “How satisfied are you with the [equipment] installed?”

Perceived Effect on Home Comfort and Energy Bills

The majority of respondents noticed improved comfort in their homes (65%) and noticed lower energy

bills (58%) following their assessment and direct-install measures (Figure 70). A substantial proportion of

respondents said that they did not notice a difference, or that there was no difference, in their comfort after

the assessment (33%) or their energy bills (35%). Even though all respondents completed assessments a

minimum of two months prior to completing the survey, it may be that they did not have enough time

to notice a difference.

Figure 70. Changes Noticed Since Assessment

Source: Participant survey. Question: “Do you find that your home is more comfortable since your

assessment?” Question: “Is your NIPSCO energy bill lower since your assessment?”

Refrigerator Replacement

Twenty-six of the 315 IQW participants who received refrigerators through the program responded to

the survey. The survey verified that all 26 respondents’ refrigerators were still being used and were still

located in the same place where they were installed. Just over three-quarters (19 of 24 respondents)

knew that their new ENERGY STAR refrigerator was sponsored by NIPSCO.

The implementer reported that it takes four to six weeks from the time of the assessment until the

delivery of the replacement refrigerator. As shown in Figure 71, most respondents (18 of 23) said they

received their replacement refrigerator within or under this timeframe. Eleven respondents reported

receiving their refrigerator within four weeks of the assessment and another seven reported receiving it

within the four- to six-week timeframe. Many respondents (18 of 26) received some kind of material or

instruction at the time of delivery about who to contact if they experienced any problems with the new

refrigerator. Six respondents indicated that they did not receive any such materials or instructions.

Figure 71. Self-Reported Length of Time to Receive Refrigerator

Source: Participant survey. Question: “From the time you had your energy assessment, how long did it take

to get your ENERGY STAR refrigerator?”

As mentioned earlier, Lockheed Martin Energy took steps to ensure participants knew what to expect

with their new refrigerator and to provide replacement refrigerators closer in size to the participant’s

existing refrigerator. The survey results showed improvements in meeting participant expectations. Only

about half (12 of 26) of respondents received information about the replacement refrigerator before it

was ordered, but most respondents (22 of 26) said that the replacement refrigerator was what they

expected, compared to only half in 2017 (8 of 16). A review of program tracking data revealed that

fewer participants (23%) received refrigerators smaller than their old ones than in 2017 (28%).

Sixty-three percent received a refrigerator of the same size and 14% received a larger refrigerator.

Four respondents said the replacement refrigerator was not what they expected. Of these four

respondents, three said it was too small. The fourth respondent said that the new refrigerator broke

easily, specifically that the door cracked and the top shelf broke when they put weight on it.

Respondents rated their satisfaction with the process of removing their old refrigerator, with the

professionalism of the technician, and with the new ENERGY STAR refrigerator. Most respondents were

very satisfied with each of these components of the refrigerator replacement (Figure 72).

Regarding the professionalism of the technician, respondents said, “They were very warm and explained

everything,” and “He was so kind and so quick, very accurate.” Three respondents who were less than

satisfied with the technician’s professionalism all felt the technicians rushed the service; specifically, one

respondent said the technician was short with them for not clearing out space ahead of time, another

respondent said the technician delivered the refrigerator without completing the installation, and the

third respondent said the technician scratched the new refrigerator while unloading it from the truck.

Almost all respondents said they were satisfied with the new refrigerator (combined very satisfied and

somewhat satisfied; 25 of 26). The one respondent who was somewhat dissatisfied with the refrigerator

said the shelves in the door “always fall off.”

Figure 72. Satisfaction with Refrigerator Replacement Program Components

Source: Participant survey. Question: “How satisfied were you with the professionalism of the technician?”

Question: “How would you rate the process of removing your old refrigerator and installing your new

ENERGY STAR refrigerator?” Question: “How would you rate your new ENERGY STAR refrigerator?”

Satisfaction with Assessment

Respondents reported high levels of satisfaction with the in-home energy assessment overall. As shown

in Figure 73, 97% of respondents were satisfied in 2018, with 78% very satisfied and

19% somewhat satisfied. The percentage of very satisfied respondents in 2018 appears to have

increased from 2017, but this was not a statistically significant increase.

Figure 73. Satisfaction with the In-Home Energy Assessment Overall

Source: Participant survey. Question: “How satisfied were you with each of the following? The in-home

energy assessment overall?”

Figure 74 shows satisfaction with the individual components of the assessment. Respondents reported

high levels of satisfaction with all five components. Significantly more respondents were very satisfied

with the professionalism of the energy assessor in 2018 (90%) than 2017 (78%). Significantly more

respondents were very satisfied with the assessment report in 2018 (88%) than 2017 (67%). The other

three components did not show significant increases in satisfaction (very satisfied) from 2017 to 2018.

However, when very satisfied and somewhat satisfied are combined, significantly more respondents in

2018 (96%) than 2017 (87%) were satisfied with the time between scheduling and the assessment.44

Figure 74. Satisfaction with Components of In-Home Energy Assessment

Note: Boxed value indicates difference between 2017 and 2018 is significant at p≤0.10 (90% confidence).

Source: Participant survey. Question: “How satisfied were you with each of the following…?”

44 The difference between the combined satisfaction ratings for time between scheduling and the assessment is

statistically significant at p≤0.10 (90% confidence).

Satisfaction with Program

A large majority (94%) of respondents were satisfied with the IQW program. As shown in Figure 75, the

percentage of very satisfied ratings increased significantly year over year since 2016.

Figure 75. Satisfaction with the IQW Program

Note: Boxed value indicates difference between 2017 and 2018 is significant at p≤0.10 (90% confidence).

Source: Participant survey. Question: “How satisfied are you with NIPSCO’s Income-Qualified

Weatherization program?”

The survey included an open-ended question asking respondents to briefly explain their program

satisfaction rating. Respondents who were satisfied with the program attributed it to the value

of the program service and information (62%, n=43); the professional, friendly staff (29%, n=20); and the

money they save either on their bill through lower energy use or the cost savings of receiving program

measures for free (25%, n=17). Nine respondents (13%) said the program did not provide the additional

weatherization work they wanted. Of these respondents, three were dissatisfied with the program, but

the remaining respondents were still satisfied.

Suggestions for Improvement

Thirty respondents provided suggestions to improve the

program. Similar to 2017, respondents most frequently

suggested the program offer more weatherization services,

such as sealing drafty windows and doors, more insulation,

and checking underneath the home for opportunities to

improve energy efficiency (11 respondents). Five

respondents said that more advertising is needed to raise

customer awareness about the programs. Some

respondents also suggested providing better explanations of

the home’s conditions and recommendations for improving

energy efficiency (four respondents) and offering better or higher quality direct install measures (four

respondents). Three respondents said the program should ensure that there is follow-through on what

was promised to them after the assessment (see participant quote), and two respondents reported that

NIPSCO should provide more information about what the assessors should be doing while in their home

to help manage their expectations.

“I know that they wanted to come and insulate the home in the attic and they never returned to do

that. They were going to install a programmable thermostat and they did not come back to do that

either.” – IQW Participant

Satisfaction with NIPSCO

A large majority of IQW respondents (94%) were satisfied with NIPSCO (Figure 76). Specifically, 64% said

they were very satisfied with NIPSCO, though not a statistically significant increase from 52% in 2017.

Figure 76. Satisfaction with NIPSCO

Source: Participant survey. Question: “How satisfied are you with NIPSCO overall as your utility

service provider?

Participant Survey Demographics

As part of the participant survey, the evaluation team collected responses on the following

demographics, shown in Table 141.

Table 141. IQW Program Respondent Demographics

Demographics Percentage

(n=69)

Home Ownership

Rent 22%

Own 78%

Type of Residence

Single-family detached 91%

Attached house (townhouse, row house, or twin) 1%

Two-to-four unit 0%

Multi-family apartment or condo (four or more units) 1%

Manufactured home 6%

Number of People in the Home

One 41%

Two 32%

Three 9%

Four 10%

Five or more 9%

Annual Household Income

Under $25,000 67%

$25,000 to under $35,000 16%

$35,000 to under $50,000 6%

$50,000 to under $75,000 4%

Prefer not to say/Don’t know 7%

Note: Not all categories sum to 100% due to rounding.

Conclusions and Recommendations

Conclusion 1. Although the 2018 IQW program did not meet its savings goals, it still delivered high

customer satisfaction.

In 2018, 81% of respondents were very satisfied with the IQW program. The percentage of respondents

that were very satisfied with the program has increased significantly year over year since 2016 (67% in

2017; 51% in 2016). The percentage of respondents that were very satisfied with NIPSCO increased

slightly year over year, with a statistically significant increase between 2018 (64%) and 2016 (48%).

These increases show not only that customers are currently highly satisfied with the program and with

NIPSCO, but also show a positive trend in rates of satisfaction over time.

Conclusion 2: Drivers of participation in the IQW program may be changing.

In 2018, more than half (58%) of respondents noticed lower energy bills following program participation,

which indicates that the burden of paying them was reduced. For the last three years, respondents cited

saving money on utility bills as a driver to participate in the program more than any other factor.

However, the number of respondents saying they participated to replace old or broken equipment

appears to have grown each year, reaching 19% in 2018, while the share of respondents who said they

participated to save energy dropped significantly to 13%.

Conclusion 3: Realization rates vary widely due to ex ante savings not accounting for the fuel type and

heating and cooling systems of homes and actual participant housing characteristics differing from

planning assumptions, as well as low ISRs for some measures.

Realization rates ranged from 65% for kWh to 177% for kW reduction. The evaluation team calculated

ex post gross per-measure savings using algorithms and variable assumptions from the Indiana

TRM (v2.2). Where data were unavailable in the Indiana TRM (v2.2), the evaluation team used the

Illinois TRM (v6.0), the 2016 Pennsylvania TRM, or other secondary sources. The evaluation team revised

assumptions for savings estimates applicable to the NIPSCO service territory as needed. In most cases,

the ex post gross per-measure savings were very different from the ex ante per-measure savings for a

variety of reasons, including not accounting for the fuel type and heating and cooling systems in savings

calculations, which resulted in much lower ex post gross per-measure savings for attic insulation (dual

fuel) and air sealing (dual fuel). Additionally, the actual participant characteristics differed from planning

assumptions, which resulted in much lower ex post gross per-measure savings for the refrigerator

replacement.

The evaluation team understands that Lockheed Martin Energy will account for the different heating

and cooling systems in 2019. However, current program tracking data does not contain these data

points for every participant and, in many cases, the housing characteristics entered for a participant

home do not align with the description included as part of the measure name.

Another factor in the realization rates is the ISR, which is calculated from self-reported survey data; in

many instances, respondents were not able to verify that measures were installed. This resulted in low

ISRs for air sealing and duct sealing, which were verified by only 62% and 80% of program-tracked

recipients. This also affected the assessment recommendations measure as only 70% of respondents

recalled receiving the report.
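The ISR acts as a multiplicative adjustment on tracked savings. The minimal sketch below uses the ISRs cited in this paragraph (62% for air sealing, 80% for duct sealing); the tracked-savings quantities are illustrative placeholders, not figures from the report.

```python
# The ISRs (0.62 air sealing, 0.80 duct sealing) come from the evaluation text;
# the tracked kWh values are hypothetical, for illustration only.
measures = {
    "air sealing":  {"tracked_kwh": 1000.0, "isr": 0.62},
    "duct sealing": {"tracked_kwh":  800.0, "isr": 0.80},
}

def verified_savings(m):
    """Verified savings = tracked (audited) savings x in-service rate."""
    return m["tracked_kwh"] * m["isr"]

for name, m in measures.items():
    print(f"{name}: {verified_savings(m):.0f} kWh verified")
```

A low ISR therefore flows directly through to a low realization rate, which is why unverifiable measures depress the program's verified savings.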

Recommendations:

• Use the provided ex post gross savings values, particularly for air and duct sealing, attic

insulation, and refrigerator measures, which more accurately represent participant

characteristics, for planning to most closely align ex ante and ex post gross savings and achieve

realization rates closer to 100%.

• Break out the existing attic insulation measure into three categories based on existing R-value

(R-0, R-1 to R-7, R-8 to R-11) to more accurately capture savings based on existing conditions

and guard against discrepancies in ex ante and ex post gross savings due to actual home

characteristics not falling into the average savings category.

• Establish data validation procedures in the program tracking database to require that the

heating and cooling fuels and system types are entered and that the measure selected aligns

with those housing characteristics where they are necessary for accurate savings calculations.

• Enhance efforts to educate energy assessors on the importance of accurately recording

measures installed, providing the CHA report, and verbally informing customers of the measures

installed so that customers can accurately verify the installation of those measures.

• Revisit 2017 recommendation to establish additional data capture activities such as having

energy assessors take pictures of post-installation conditions while on-site. This documentation

could provide another method for calculating ISRs for some measures such as duct sealing, air

sealing, and pipe wrap, potentially resulting in higher ISRs than self-report for measures

participants do not see and/or interact with on a daily basis.

• Consider including, in the work authorization form that participants sign after the assessment, a

list of installed measures with check-boxes and fill-ins for quantity where applicable, so as not to

be overly burdensome for the assessor or customer. This could provide an alternative

to using self-report or photographs for calculating ISRs.
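The data-validation recommendation above can be illustrated with a small record check. The field names and record layout below are assumptions for illustration, not the actual LM Captures schema; the point is that required housing characteristics are enforced and that a measure name implying one fuel is flagged when the record says another.

```python
# Illustrative tracking-database validation sketch. Field names
# ("heating_fuel", "cooling_system", "measure_name") are hypothetical.
REQUIRED_FIELDS = ("heating_fuel", "cooling_system")

def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors for one tracked measure record."""
    errors = [f"missing {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    # Flag measures whose name implies a fuel that contradicts the record,
    # e.g. a "dual fuel" measure recorded for an electrically heated home.
    name = record.get("measure_name", "").lower()
    if "dual fuel" in name and record.get("heating_fuel") == "electric":
        errors.append("measure name implies gas heat but record says electric")
    return errors

record = {"measure_name": "Attic Insulation (Dual Fuel)", "heating_fuel": "electric"}
print(validate_record(record))
```

Rules of this kind, run at data entry, would catch the misaligned measure names and missing system types described in Conclusion 3 before they reach the evaluation.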

Conclusion 4. The program better managed customer expectations on the refrigerator replacement

because of new communication procedures.

The implementer took steps to ensure participants knew what to expect with their new refrigerator and

to provide replacement refrigerators closer in size to the participant’s existing refrigerator. About half of

refrigerator replacement participants surveyed said they received information about the replacement

before they fully agreed to participate in the program (12 of 26), and the majority of respondents said

the replacement refrigerator was what they expected (22 of 26).

Conclusion 5. The program may have had higher verified savings if more participants had received the

CHA report during the assessment rather than after the assessment.

The program design requires assessors to provide a written report to customers during the assessment.

In the last three years, 20% to 30% of respondents did not recall receiving any report at any time. The

evaluation uses the self-reported percentage of respondents who received a report to serve as the ISR

for the verified program savings claims, and because this percentage was low, so were the verified

savings.

Furthermore, in the last two years, of those that did recall receiving a report, only 22% to 29% received

it during the assessment. The implementer took steps to increase rates of customers receiving the

CHA report during the assessment including reminding and educating trade allies throughout the year to

do so and reducing the length of the report by about half (from about 40 pages to about 20 pages—as

noted under 2017 Recommendation Status). However, these steps did not appear to increase rates of

report delivery during the assessment. Most respondents who recalled receiving a report in 2018

received the report after the assessment by mail (80%), which is similar to 2017 (73%). NIPSCO and the

implementer are considering a new platform in 2019 that will streamline program processes like the

customer application and the delivery of the report.

Recommendations:

• Include easy and immediate access to the CHA report (print and digital) and the ability to add

custom content such as related NIPSCO rebates and program information as top features to look

for in a new reporting platform.

• Coordinate with program implementation staff and trade allies to ensure program processes for

printing reports, installing measures and recording installation, and promoting other NIPSCO

programs are followed.

• Provide energy assessors with a branded leave-behind postcard they can use to check off which

measures were installed. This postcard could also include the URL for NIPSCO’s energy efficiency

program website.

Conclusion 6. The program has room to improve upon optimal customer service delivery from its

trade allies.

Almost all respondents reported high satisfaction with the professionalism of energy assessors (97%)

and refrigerator replacement technicians (23 of 26 respondents). Although there is high satisfaction,

there is also some indication of possible customer service issues. The most prevalent issue among

energy assessors, as noted in the prior conclusion, is that 30% of all respondents did not receive the

CHA report. Additionally, the NIPSCO program manager reported receiving phone calls from participants

about not being contacted for scheduling after requesting an assessment. Three customers reported a

lack of follow-up from their energy assessors after the home assessment when they were told that the

assessor would return to install additional measures (like a programmable thermostat or attic insulation)

but the assessor never returned or contacted them again. Among refrigerator replacement technicians,

three customers experienced poor professionalism, reporting that the technicians rushed through the

service.

Recommendations:

• Monitor customer satisfaction with individual trade allies and refrigerator installation

technicians to identify any customer service issues through ongoing customer satisfaction

surveys.

• Consider having the evaluation team conduct ride-alongs as part of the 2019 program

evaluation to better assess the extent to which program procedures are followed and

opportunities for program improvement.

C&I Prescriptive Program

Through the Prescriptive program, NIPSCO offers a set rebate amount for installations from a

predetermined list of energy efficiency measures. The implementer, Lockheed Martin Energy, oversees

program management and delivery. With support from NIPSCO’s Major Account Managers and trade

allies, NIPSCO and Lockheed Martin Energy promote the program to customers.

Program Performance

Table 142 summarizes savings for the 2018 program, including program savings goals. In terms of gross

savings, the program achieved 109% of its electric energy savings, 46% of its peak demand reduction,

and 79% of its natural gas energy savings goals.

Table 142. 2018 Prescriptive Program Savings Summary

Metric                                    Gross Savings Goal   Ex Ante [a]   Audited      Verified     Ex Post Gross   Ex Post Net   Gross Goal Achievement

Electric Energy Savings (kWh/yr)          41,558,911           42,608,649    42,608,649   42,608,649   45,255,008      40,276,957    109%

Peak Demand Reduction (kW)                13,697               7,728         7,728        7,728        6,352           5,653         46%

Natural Gas Energy Savings (therms/yr)    364,789              363,958       363,237      363,237      288,206         256,504       79%

[a] Values presented at a measure level represent audited values since the scorecard provides only savings totals.

Audited, verified, and ex post savings aligned well with the program’s ex ante electric energy savings,

but the evaluation revealed variability in peak demand reduction values. Similar to 2017, the

implementer continued to overstate ex ante peak coincident demand reduction for exterior lighting,

which drove the large discrepancy in ex post gross and net compared to ex ante. The incorporation of

WHFs for interior lighting projects caused the electric energy realization rate to surpass 100%.

As in 2017, for natural gas savings, the evaluation team found that the updated source for steam trap

assumptions caused reductions in 2018 ex post gross therm savings. Additionally, the evaluation team

found that claimed assumptions in large commercial steam pipe insulation measures were overstating

therm savings for low-occupancy buildings (namely, religious worship). See the Impact Evaluation

section for further discussion of evaluation adjustments.

Table 143 lists the 2018 program realization rates and NTG ratio.

Table 143. 2018 Prescriptive Program Adjustment Factors

Metric                                    Realization Rate (%) [a]   Precision at 90% Confidence (Fuel-Level Energy Realization Rate)

Electric Energy Savings (kWh/yr)          106%                       3.9%

Peak Demand Reduction (kW)                82%                        (not reported)

Natural Gas Energy Savings (therms/yr)    79%                        9.7%

Program level: Freeridership 12%, Spillover 1%, NTG 89% [b]

[a] Realization rate is defined as ex post gross savings divided by ex ante savings.

[b] NTG is defined as ex post net savings divided by ex post gross savings.
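The table's two footnote definitions can be checked directly against the program's electric figures from Table 142:

```python
# Electric figures from Table 142 (kWh/yr)
ex_ante = 42_608_649
ex_post_gross = 45_255_008
ex_post_net = 40_276_957

realization_rate = ex_post_gross / ex_ante   # ex post gross / ex ante
ntg = ex_post_net / ex_post_gross            # ex post net / ex post gross
print(f"RR = {realization_rate:.0%}, NTG = {ntg:.0%}")  # RR = 106%, NTG = 89%
```

The computed values reproduce the 106% electric realization rate and 89% NTG ratio reported in Table 143.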

Table 144 lists the 2018 program budget and expenditures. The program spent 94% of its electric budget

and 95% of its natural gas budget.

Table 144. 2018 Prescriptive Program Expenditures

Fuel          Program Budget   Program Expenditures   Budget Spent (%)

Electric      $4,126,847       $3,899,345             94%

Natural Gas   $377,151         $358,855               95%

Research Questions

The evaluation team conducted a materials review, staff interviews, site visits, desk reviews, and a

participant survey to address the following research questions:

• How do participants hear about the program, and what are the best ways for the program to

reach potential participants?

• What are the barriers and challenges to energy efficiency and program participation?

• What are the primary reasons for participation?

• Are participants satisfied with the program and its components?

• How could NIPSCO improve the participants’ experience with the program?

• What are the program’s freeridership and participant spillover estimates?

• Are tracking database savings sourced with proper project documentation?

• Do claimed savings algorithms align with the Indiana TRM (v2.2) or other appropriate

secondary sources?

Impact Evaluation

To perform an impact analysis of the 2018 Prescriptive program, the evaluation team selected a

representative sample of measures to evaluate, and then extrapolated findings to the larger program

population. This process used a PPS (probability-proportional-to-size) sampling approach, a stratified

sampling technique that uses a savings parameter (here, total energy savings in MMBtu) as a

weighting factor. Out of 4,055 project

measures, the C&I Prescriptive evaluation sample resulted in 90 measures receiving either an

engineering review or an on-site M&V analysis. Electric measures accounted for 80% of the ex ante

population, in terms of MMBtu savings, with lighting measures contributing the greatest savings.
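The PPS selection described above can be sketched as a systematic draw over cumulative savings. This is an illustrative sketch, not the evaluation team's actual sampling code; the measure records and the `mmbtu` field name are hypothetical.

```python
import random

def pps_systematic_sample(measures, n, key=lambda m: m["mmbtu"], seed=1):
    """Systematic probability-proportional-to-size (PPS) sample: measures with
    larger savings have proportionally higher odds of selection."""
    total = sum(key(m) for m in measures)
    step = total / n                            # sampling interval in MMBtu
    start = random.Random(seed).uniform(0, step)
    points = [start + i * step for i in range(n)]
    sample, cum, idx = [], 0.0, 0
    for m in measures:
        cum += key(m)                           # running cumulative savings
        while idx < n and points[idx] <= cum:
            sample.append(m)                    # a selection point landed in this measure's span
            idx += 1
    return sample
```

A measure whose savings exceed the interval can be hit by more than one selection point; in practice such "certainty" measures are typically pulled into their own stratum first.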

Table 145 shows the total number of sample sites evaluated for this program year and their cumulative

savings, along with sample-size targets designed to yield sufficient precision. Measures represent the

number of unique energy efficiency upgrades performed within the population. Units represent the

quantity of each measure (number of light fixtures, capacity of heating equipment in MBH, feet of pipe

insulation, etc.).

Table 145. 2018 Prescriptive Program Impact Evaluation Activities

Gross Population Counta            Sample Electric Measure Count    Sample Natural Gas Measure Count    Evaluation Sample Share of Program Savings
Units        Measures              Evaluation      Target           Evaluation      Target
337,529      4,055                 26              16               64              59                  23%

a Population is based on queried data extracts from the implementer’s database, LM Captures, and may vary slightly with NIPSCO’s scorecard data source.

The evaluation team used findings from 2015, 2016, and 2017 evaluations to inform sampling targets for

2018. The evaluation team’s understanding of previous years’ savings variability (error ratio) of

prescriptive measures allowed the evaluation sample to efficiently target natural gas and electric

measures in 2018, achieving 90/10 confidence and precision for each fuel’s realization rates, while

representing 23% of the total program savings. The actual sample surpassed its targets due to the

representative measure-mix in the sample and the fact that several projects contained multiple

measures. Natural gas measures represent a larger portion of sampled measures in 2018 because the error ratio for natural gas measures in the C&I Prescriptive program is higher than that for electric measures (roughly 0.6 versus 0.2), requiring a larger sample size to accurately calculate a fuel-level realization rate.
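The link between error ratio and required sample size follows the ratio-estimator sizing formula commonly used in DSM evaluation sampling. The sketch below is illustrative (z = 1.645 for 90% confidence, ±10% relative precision) and is not claimed to reproduce the report's exact targets, which also reflect stratification and finite-population effects.

```python
import math

def required_sample_size(error_ratio, relative_precision=0.10, z=1.645):
    """Initial sample size n0 for a ratio estimator at the given confidence
    (z = 1.645 for 90%) and target relative precision."""
    return math.ceil((z * error_ratio / relative_precision) ** 2)

def with_fpc(n0, population_size):
    """Apply the finite population correction: n = n0 / (1 + n0/N)."""
    return math.ceil(n0 / (1 + n0 / population_size))

gas_n0 = required_sample_size(0.6)       # 98 before any correction
electric_n0 = required_sample_size(0.2)  # 11 before any correction
```

The roughly ninefold gap between the two uncorrected sizes illustrates why the natural gas target exceeded the electric target despite electric measures dominating ex ante savings.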

Natural gas and electric measures in the Prescriptive program primarily differed in that the ex ante

savings methodology for natural gas measures relied more heavily on deemed savings values. Non-

lighting electric measures relied on deemed savings values as well, but lighting represented the majority

of the electric program. The implementer calculated lighting savings using site-specific inputs (e.g.,

baseline wattage, efficient wattage, annual HOU), which aligns with evaluation methods. As a result,

lighting measures tended to have better precision (less variation) in evaluated savings relative to

claimed savings.

Deemed savings measures can cause large variations in evaluated savings because they rely on very general assumptions and fail to incorporate the specific characteristics of equipment upgrades, building operations, and weather locations. Additionally, correcting outdated deemed savings in the ex post results can impose a systematic adjustment on a whole group of measures, degrading the precision of the program-level realization rate. The savings variability (error ratio) for natural gas measures over the current three-year cycle required the sample target for natural gas measures to surpass that of electric measures in 2018, even though electric measures made up 80% of the program’s ex ante savings.

Conversely, the expected precision of electric measures allowed the evaluation team to more efficiently

target the electric evaluation sample, decreasing it to 16 measures in 2018. Sample planning, in

conjunction with surpassing target measure counts, allowed the sample realization rates (in terms of

MMBtu) to achieve ±10% precision with 90% confidence at both fuel levels.

Figure 77 illustrates the distribution of the 2018 C&I Prescriptive project population by energy savings (MMBtu), fuel type, and measure category, as labeled in the tracking database. In 2018, C&I Prescriptive projects accounted for 42,608,649 kWh and 363,958 therms of ex ante savings (181,777 MMBtu combined).

Figure 77. 2018 Prescriptive Program Ex Ante Savings Distribution by Fuel Type and Measure Category

Note: VFD = Variable Frequency Drive

Total program savings in 2018 were the highest of the past four years for both natural gas and electric measures. As usual, electric savings were driven by lighting measures, consisting mostly of LED retrofits. LED high-bay and exterior retrofits made up 92% of ex ante lighting savings, illustrating how LEDs continue to drive the market as a cost-effective alternative to fluorescents and metal halides. The increase in natural gas savings compared to past years appears to be the result of high customer participation in steam pipe insulation measures. The program saw steam pipe insulation measures increase from 17% to 78% of total prescriptive therm savings between 2017 and 2018. Figure 78 illustrates the Prescriptive program’s increase in energy savings over the past four years, driven by LEDs and steam pipe insulation measures.

Figure 78. Prescriptive Program Fuel Distribution Trends, 2015–2018

At the combined fuel level, the evaluation sample represented 28% of the total population’s combined

energy (MMBtu) savings. Figure 79 shows the sample’s composition between on-site M&V efforts and

engineering desk reviews.

Figure 79. 2018 Prescriptive Program Summary of 90/10 Sample Analysis by Ex Ante Savings

The evaluation team distributed the majority of the planned site visits in the first two years of the 2016–2018 cycle, reducing the number of targeted on-site visits from 19 in 2017 to six in 2018. In total, the evaluation team performed 50 site visits for the Prescriptive program over the three-year cycle, surpassing the target of 43. The absence of large energy-savings projects in 2018 drove the small portion of savings captured by site visits (for example, the top six projects in the program represented only 7% of total program savings in 2018 at the MMBtu level). However, the more evenly distributed savings contributed to a PPS sample with per-measure savings similar to the population average, allowing the evaluation sample (engineering reviews plus on-site M&V) to still achieve the 90% confidence with ±10% precision target for the energy realization rates.

Additionally, since the error ratio for natural gas measures is historically higher than for electric measures (roughly 0.6 and 0.2, respectively), a larger portion of natural gas savings (specifically HVAC measures) was required in the 2018 evaluation sample to account for the larger variability in measure-level savings compared to electric savings (specifically lighting measures).

Figure 80 shows the sample distribution, based on measure categories from the tracking database. VFD measures claimed electric savings on HVAC fans and pumps; “other” measures include various small HVAC, kitchen equipment, and water heater measures.

Figure 80. 2018 Prescriptive Program 90/10 Sample Savings Distribution by Measure Type

Desk reviews supplemented the six targeted site visits to achieve the sampled measure target. Due to the PPS sampling approach, the distribution of ex ante energy savings in the sample closely matched the distribution of measure types for each fuel. The only measure type not represented in the sample was VFDs, since their individual measure weights and representation in the population (in terms of ex ante savings) were not large enough for the PPS sampling approach to require them for the ±10% precision target at 90% confidence.

Audited and Verified Savings

As shown in Table 142 above, the evaluation team made only minor adjustments in the evaluation’s audit phase. For lighting measures, the claimed savings relied upon algorithms with site-specific inputs. The evaluation team based any adjustments in the audit phase largely on discrepancies with respect to these inputs, such as efficient or baseline wattages and fixture quantities. The evaluation team aligned these inputs with supporting documentation (specification sheets and invoices) when they differed from the calculation workbooks (program application files). The lighting documentation aligned well with claimed calculations presented in the application files, so audited savings did not vary from ex ante savings.

For natural gas measures and non-lighting electric measures, the implementer did not use approved algorithms to calculate savings. Rather, the implementer generated claimed savings using a well-cited deemed savings look-up table. Notes for each deemed value outlined the source of assumptions and algorithms, and the evaluation team could recreate the values using the notes provided. The small audited discrepancy in the 2018 natural gas savings was based on slight differences between claimed quantities in the project application and the supporting invoices (for example, one project claimed 702 feet of steam pipe insulated, but invoices totaled only 630 feet). The evaluation team corrected these inconsistencies in project documentation during the audit phase and carried them over to verified and ex post adjustments.

The version control issues with the implementer’s 2017 deemed savings tables were addressed in the

2018 program, and the evaluation team confirmed that a single table was consistently used for non-

lighting measures in 2018.

In-Service Rate

The evaluation team performed site visits to verify installations of 10 sampled measures at 10 sites in

2018 and to calculate ISR for the Prescriptive program. Through site visits, the evaluation team verified

that claimed unit counts were correct, finding no discrepancies. Depending upon the measure type, the

ex ante unit count in the tracking database either described the installed quantity or the equipment

capacity (for example, horsepower or MBH). The evaluation team applied a 100% ISR to the program’s

audited savings to determine verified savings in 2018.

Ex Post Gross Savings

The evaluation team adjusted 2018 lighting measure savings in the ex post analysis based upon these data:

• Fixture quantity, equipment capacity, or wattage discrepancies discovered during site visits or discussions with business owners.

• Annual operating hours metered during site visits, provided by business owners, or recorded from scheduling systems.

• Inclusion of electric WHFs and adjustment of peak summer CFs consistent with the Indiana TRM (v2.2).

• Review of any potential updates to sources listed in the implementer’s deemed savings tables.

Adjustments based on the above data resulted in only minor discrepancies in electric energy savings but yielded larger discrepancies in peak demand reduction. The evaluation team attributed this deviation in demand savings to the sizeable portion of exterior lighting projects in the gross population and the evaluation sample. Exterior lighting projects made up 18% of the population’s ex ante peak demand reduction. Although the evaluation team confirmed that the implementer began applying CFs based on building type to lighting projects in 2018 (based on evaluation recommendations in 2016 and 2017), the evaluation team determined that the implementer incorrectly applied CFs to exterior lighting measures. The implementer applied the CF associated with building type, which is appropriate for interior lighting measures, but exterior lighting has its own building category in the Indiana TRM (v2.2) and its associated

CF is zero. Therefore, the evaluation team removed all demand savings from exterior lighting as part of the ex post analysis. The Indiana TRM (v2.2) assumes exterior lighting operates only during non-daylight hours and is therefore non-operational during the summer peak period (defined as summer weekdays from 3:00 p.m. to 6:00 p.m.), hence CF = 0. This adjustment is the main contributor to the low realization rate for the program’s peak demand reduction (82%). The evaluation team made additional small adjustments to CFs for some interior lighting measures on a case-by-case basis, based on the building type of the project.

As noted in previous evaluations, the implementer disregarded WHFs in calculating electric lighting savings. To align with methodologies originating from the Indiana Statewide Core evaluation, the evaluation team applied WHFs to energy savings for interior lighting measures in cooled spaces. Cooling systems were verified during site visits for on-site measures and from measure application workbooks for engineering reviews. In these cases, the WHF contributed additional electric energy savings to lighting measures, which was the primary reason the electric energy realization rate reached 106% in 2018.

The natural gas realization rate dropped below 100% primarily because of the change in deemed savings

assumptions for several large steam pipe insulation measures. The implementer calculated the deemed

savings for steam pipe insulation with accepted engineering heat loss calculations based on diameter of

pipe. Table 146 outlines the deemed savings by pipe diameter.

Table 146. Steam Pipe Insulation Ex Ante Deemed Savings by Diameter

Pipe Diameter   Savings (therms/year/foot)
0.5”            2.8
1”              5.4
1.5”            7.9
2”              10.5
2.5”            13.0

In 2017, the evaluation team reviewed this methodology and accepted these values as reasonable for average commercial building heating systems. The Illinois TRM (v7.0)45 also aligns with these estimates. As a statewide average, the Illinois TRM (v7.0) assumes 10.8 therms per year per foot for the average commercial low-pressure steam system with 2-inch pipe and recirculation heating.

However, the evaluation team found that 90% of all steam pipe insulation measures in the 2018 C&I Prescriptive program were installed in religious buildings, which tend to have lower-than-average usage patterns and load profiles. As such, the evaluation team reviewed the 2017 natural gas billing data available in LM Captures for the sites in the evaluation sample that received steam pipe insulation. After normalizing for weather (using HDD and TMY3 data for South Bend), the evaluation team

45 2019. Illinois Statewide Technical Reference Manual – Volume 1: Overview and User Guide. http://ilsagfiles.org/SAG_files/Technical_Reference_Manual/Version_7/Final_9-28-18/IL-TRM_Effective_010119_v7.0_Vol_1-4_Compiled_092818_Final.pdf

found several instances of large energy-saving steam pipe projects claiming more savings than the site’s baseline consumption. For these instances, the evaluation team aligned per-foot savings assumptions with the religious-building entry in the Illinois TRM (v7.0) for low-pressure steam systems without recirculation (a statewide average of 3.2 therms per year per foot for 2-inch pipe) to determine a more reasonable savings estimate. An example project is outlined below in Table 147.

Table 147. Religious Building Steam Pipe Insulation Example

Metric                          Ex Ante Assumption                    Ex Post Assumption
Measure Name                    Pipe insulation, 2-3” diameter pipe
Length of Pipe (feet)           1,448                                 1,448
Per-Foot Annual Therm Savings   11.7a                                 3.6b
Total Annual Therms             16,942                                5,259

a Average of 2” and 2.5” pipe.
b 3.6 = 11.7 × (3.2/10.5)
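The arithmetic behind the per-foot adjustment scales the claimed value by the ratio of the Illinois TRM (v7.0) religious-building assumption to its statewide-average assumption for 2-inch pipe. A minimal sketch of that rescaling:

```python
def scaled_per_foot_savings(ex_ante_per_ft, building_value=3.2, average_value=10.5):
    """Rescale claimed per-foot therm savings by the ratio of the Illinois TRM
    (v7.0) religious-building value to the statewide-average value (2-inch pipe)."""
    return ex_ante_per_ft * (building_value / average_value)

per_ft = scaled_per_foot_savings(11.7)  # about 3.57, reported as 3.6 in Table 147
```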

As seen in Figure 81, ex ante savings for this site are 32% higher than the weather-normalized baseline consumption, which is a significant overstatement of savings. After the evaluation team adjusted the per-foot savings to align with the Illinois TRM (v7.0) assumption outlined above, the project received a 31% realization rate.

Figure 81. Weather-Normalized Cumulative Baseline Consumption Compared with Ex Ante and Ex Post Savings
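The weather normalization used for this comparison (regressing billed use against HDD and projecting onto typical-year conditions) can be sketched as below. The ordinary-least-squares model and the input shapes are illustrative, not the evaluation team's actual billing-analysis code.

```python
def weather_normalized_annual_therms(monthly_therms, monthly_hdd, tmy3_annual_hdd):
    """Fit use = base + slope * HDD by OLS, then project onto a typical
    meteorological year's (TMY3) annual heating degree days."""
    n = len(monthly_therms)
    mean_x = sum(monthly_hdd) / n
    mean_y = sum(monthly_therms) / n
    sxx = sum((x - mean_x) ** 2 for x in monthly_hdd)
    sxy = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(monthly_hdd, monthly_therms))
    slope = sxy / sxx                    # therms per HDD (heating slope)
    base = mean_y - slope * mean_x       # non-heating monthly base load
    return 12 * base + slope * tmy3_annual_hdd

def overstated(claimed_therms, normalized_baseline):
    """Flag a project whose claimed savings exceed the normalized baseline."""
    return claimed_therms > normalized_baseline
```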

Adjusting the deemed savings for projects that claimed more savings than baseline consumption was the primary reason the natural gas realization rate dropped to 79% in 2018. While a deemed savings approach is not expected to accurately depict savings at each specific installation site, on average it should accurately represent savings at the program level. However, since the overwhelmingly dominant building type in the 2018 population was religious worship, the ex ante deemed savings for steam pipe insulation no longer accurately represented the program population. Therefore, the evaluation team corrected individual steam pipe insulation projects on a case-by-case basis, since many

were the major drivers of ex ante energy savings (the top 10 steam pipe insulation measures accounted

for 33% of total natural gas savings in the program) and religious worship buildings tend to have lower

heating loads than typical commercial buildings.

Similar to 2017, another reason the natural gas realization rate dropped below 100% in 2018 was an evaluation update to the claimed methodology for steam trap replacement measures. The implementer referenced the Illinois TRM (v4.0) to calculate the deemed savings for steam trap replacements (this measure was not present in the Indiana TRM v2.2). The evaluation team verified proper use of the algorithm but, upon review of version 5 of the Illinois TRM (published in 2016), found an update to the steam trap replacement savings methodology. The Illinois TRM (v5.0) updated the input for average steam loss per leaking trap from 13.8 pounds per hour to 6.9 pounds per hour, effectively reducing per-unit savings by one-half. As the new steam loss assumption carried over to the most recent version of the Illinois TRM (v6.0) and was available for reference during the 2018 program year, the evaluation team aligned ex post savings for steam trap replacements with the new assumption, and steam trap replacement measures achieved a 48% realization rate.
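Because deemed steam trap savings scale linearly with the assumed loss rate, the v5.0 input update halves the v4.0 value. A one-line sketch of that rescaling (the residual gap between 50% and the reported 48% realization rate reflects other site-level adjustments):

```python
def updated_steam_trap_savings(ex_ante_therms, old_loss=13.8, new_loss=6.9):
    """Rescale deemed steam trap savings for the Illinois TRM v5.0 update to
    average steam loss per leaking trap (pounds of steam per hour)."""
    return ex_ante_therms * (new_loss / old_loss)
```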

Application of the 2015 Indiana TRM (v2.2) algorithms also contributed to variations in natural gas energy savings between the verified and ex post steps. For these measures, the implementer’s deemed savings understated actual natural gas energy savings in some cases and overstated them in others. The implementer primarily cited algorithms from the 2015 Indiana TRM (v2.2) for deemed savings, but the evaluation team used project-specific inputs (for example, equipment capacity, efficiency, and effective full-load hours) in the ex post analysis, resulting in realization rates both greater and less than 100% in several cases.

Realization Rates

Table 148 shows the sample’s percentage of verified savings and realization rates.

Table 148. Prescriptive Program Realization Rates

                                          Electric Energy (kWh)   Peak Demand (kW)   Natural Gas Energy (therms)
Verified/Ex Ante Savings                  100.0%                  100.0%             99.8%
Realization Rate for Evaluation Samplea   106.2%                  82.2%              79.2%

a Realization rate is defined as ex post gross savings divided by ex ante savings.

Table 149 lists measure and unit counts within the sample and aggregated (total energy in MMBtu)

ex post realization rates for each measure type. The Ex Post Gross Savings section outlines the drivers of

the realization rate for each measure type.

Table 149. Prescriptive Program Evaluation Sample Results by Measure Type

Fuel          Measure Type            Evaluation Sample Measure Count   Evaluation Sample Unit Count   Demand Ex Post Realization Rate   Energy Ex Post Realization Rate
Electric      Lighting                25                                22,368                         82%                               107%
Electric      Refrigeration           1                                 46                             N/Aa                              95%
Natural gas   Boiler Replacement      9                                 9,246                          N/A                               116%
Natural gas   Furnace Replacement     3                                 1,972                          N/A                               83%
Natural gas   Natural gas – Other     6                                 74                             N/A                               83%
Natural gas   Steam Pipe Insulation   45                                26,231                         N/A                               72%
Natural gas   Steam Traps             1                                 106                            N/A                               48%

a N/A denotes zero claimed demand savings.

It is important to note that the realization rates in Table 149 were not applied to each measure type in the population to determine ex post gross savings. Table 149 is presented only to provide a further outline of findings within the evaluation sample. To calculate the ex post gross impacts, the evaluation team took the metric-level realization rates resulting from the sample and applied them to the population ex ante energy and demand savings (shown in Table 150).

Table 150. Application of 2018 Prescriptive Program Realization Rates

Metric                                   Population Ex Ante   Realization Rate (From Evaluation Sample)   Population Ex Post Gross
Electric Energy Savings (kWh/yr)         42,608,649           106.2%                                      45,255,008
Peak Demand Reduction (kW)               7,728                82.2%                                       6,352
Natural Gas Energy Savings (therms/yr)   363,958              79.2%                                       288,206
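The extrapolation in Table 150 is a simple scaling of population ex ante totals by the sample realization rates. A minimal sketch (small differences from the table's totals reflect rounding of the published rates):

```python
def ex_post_gross(population_ex_ante, realization_rate):
    """Apply a sample-derived realization rate to a population ex ante total."""
    return population_ex_ante * realization_rate

therms = ex_post_gross(363_958, 0.792)  # about 288,255 vs. the reported 288,206
```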

Ex Post Net Savings

The evaluation team calculated freeridership and participant spillover using methods described in Appendix B. Self-Report Net-to-Gross Evaluation Methodology, using survey data collected from 2018 participants. As shown in Table 151, the evaluation team estimated an 89% NTG ratio for the program.

Table 151. 2018 Prescriptive Program Net-to-Gross Results

Program Category       Freeridership (%)a   Participant Spillover (%)   NTG (%)a
Prescriptive Program   12%                  1%                          89%

a Weighted by survey sample ex post gross program MMBtu savings.

Freeridership

To determine freeridership, the evaluation team asked 68 respondents (representing 90 measures) questions about whether they would have installed equipment at the same efficiency level, at the same time, and in the same amount in the Prescriptive program’s absence. Based on survey feedback, the evaluation team calculated an overall freeridership score of 12% for the program, as shown in Table 152.

Table 152. 2018 Prescriptive Program Freeridership Results

Program Category       Responses (n)a   Freeridership (%)b
Prescriptive Program   90               12%

a For customers who installed more than one measure, the evaluation team asked the freeridership battery of questions for a maximum of two measures, resulting in 90 unique responses.
b The freeridership score was weighted by the survey sample ex post gross program MMBtu savings.

By combining the previously used intention methodology with influence methodology, the evaluation

team produced a freeridership score for the program by averaging savings-weighted intention and

influence freeridership scores. Refer to Appendix B. Self-Report Net-to-Gross Evaluation Methodology for

further details on intention and influence questions and scoring methodologies.

Intention Freeridership

The evaluation team estimated intention freeridership scores for all participants, based on their

responses to the intention-focused freeridership questions. As shown in Table 153, the 2018 Prescriptive

program’s intention freeridership score was 22%.

Table 153. 2018 Prescriptive Program Intention Freeridership Results

Program Category       Responses (n)a   Intention Freeridership Score (%)b
Prescriptive Program   90               22%

a For customers who installed more than one measure, the evaluation team asked the freeridership battery of questions for a maximum of two measures, resulting in 90 unique responses.
b The freeridership score was weighted by survey sample ex post gross program MMBtu savings.

Figure 82 shows the distribution of individual intention freeridership scores.

Figure 82. 2018 Prescriptive Program Distribution of Intention Freeridership Scores

Source: Participant Survey. Questions: G1 to G9 and G11 are used to estimate an intention freeridership

score. See Table 239 in the Appendix B. Self-Report Net-to-Gross Evaluation Methodology for the full text of

the questions, response options, and scoring treatments used to estimate intention freeridership scores.

See Table 240 for the unique Prescriptive program participant response combinations resulting from

intention freeridership questions, along with intention freeridership scores assigned to each combination,

and the number of responses for each combination.

Influence Freeridership

The evaluation team assessed influence freeridership by asking participants how important various

program elements were in their purchasing decision-making process. Table 154 shows the program

elements participants rated for importance, along with a count and average rating for each factor.

Table 154. 2018 Prescriptive Program Influence Freeridership Responses

Influence Rating           Influence Score   NIPSCO Incentive   Information from NIPSCO on Energy Savings Opportunities   Recommendation from Contractor or Vendor   Participation in a NIPSCO Efficiency Program
1 (not at all important)   100%              4                  8                                                         5                                          11
2                          75%               9                  13                                                        9                                          9
3                          25%               12                 20                                                        19                                         18
4 (very important)         0%                64                 40                                                        50                                         19
Don’t Know                 50%               1                  9                                                         7                                          33
Average                                      3.5                3.1                                                       3.4                                        2.8

The evaluation team determined each respondent’s influence freeridership score for each measure category, using the maximum rating provided for any factor included in Table 154. As shown in Table 155, the respondents’ maximum influence ratings ranged from 1 (not at all important) to 4 (very important). A maximum score of 1 meant the customer ranked all factors from the table as not at all important, while a maximum score of 4 meant the customer ranked at least one factor as very important.

Counts refer to the number of “maximum influence” responses for each influence score response option.

Table 155. 2018 Prescriptive Program Influence Freeridership Score

Maximum Influence Rating   Influence Score   Count   Total Survey Sample Ex Post MMBtu Savings   Influence Score MMBtu Savings
1 (not important)          100%              0       0                                           0
2                          75%               0       0                                           0
3                          25%               12      622                                         156
4 (very important)         0%                77      12,279                                      0
Don’t Know                 50%               1       151                                         75

Average Maximum Influence Rating (Simple Average): 3.9
Average Influence Score (Weighted by Ex Post MMBtu Savings): 2%

The average influence score of 2% for the 2018 Prescriptive program is weighted by ex post MMBtu

program savings.
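The savings-weighted averaging behind Table 155 can be sketched as below. The rating-to-score map comes from Table 154, and the response tuples mirror Table 155's rows; this is illustrative code, not the evaluation team's actual scoring tool.

```python
# Rating-to-score map from Table 154 ("DK" = Don't Know).
INFLUENCE_SCORE = {1: 1.00, 2: 0.75, 3: 0.25, 4: 0.00, "DK": 0.50}

def weighted_influence_score(responses):
    """responses: (max_influence_rating, ex_post_mmbtu) pairs.
    Returns the savings-weighted average influence freeridership score."""
    total_savings = sum(s for _, s in responses)
    weighted = sum(INFLUENCE_SCORE[r] * s for r, s in responses)
    return weighted / total_savings

# Table 155 rows: rating 3 (622 MMBtu), rating 4 (12,279 MMBtu), Don't Know (151 MMBtu)
score = weighted_influence_score([(3, 622), (4, 12_279), ("DK", 151)])  # about 0.018, i.e. 2%
```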

Final Freeridership

The evaluation team calculated the mean of the intention and influence freeridership components to estimate final freeridership for the program at 12%:

Final Freeridership (12%) = [Intention FR Score (22%) + Influence FR Score (2%)] / 2

A higher freeridership score translates to more savings deducted from the gross savings estimates. Table 156 lists the intention, influence, and final freeridership scores for the 2018 C&I Prescriptive program.

Table 156. 2018 Prescriptive Program Freeridership Score

Responses (n)   Intention Score   Influence Score   Freeridership Score
90              22%               2%                12%
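Combining the two components and converting to NTG can be sketched as follows. The additive NTG construction (1 − freeridership + spillover) is assumed from the Appendix B methodology; it is consistent with the 12%, 1%, and 89% figures reported for this program.

```python
def final_freeridership(intention_score, influence_score):
    """Simple average of the intention and influence freeridership components."""
    return (intention_score + influence_score) / 2

def net_to_gross(freeridership, participant_spillover):
    """Assumed additive self-report construction: NTG = 1 - FR + SO."""
    return 1 - freeridership + participant_spillover

fr = final_freeridership(0.22, 0.02)  # 0.12
ntg = net_to_gross(fr, 0.01)          # 0.89
```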

Participant Spillover

As detailed in Appendix B. Self-Report Net-to-Gross Evaluation Methodology, the evaluation team estimated participant spillover46 measure savings using specific information about participants, determined through the evaluation, and employing the Indiana TRM (v2.2) as a baseline reference. The evaluation team estimated the percentage of program participant spillover by dividing the sum of additional spillover savings (as reported by survey respondents) by the total gross savings achieved by all

46 Non-participant spillover evaluation activities were not conducted for the 2018 program year.

survey respondents. The Prescriptive program had a participant spillover estimate of 1%, rounded to the

nearest whole percent, as shown in Table 157.

Table 157. 2018 Prescriptive Program Participant Spillover

Spillover Savings (MMBtu)   Participant Program Savings (MMBtu)   Participant Spillover
183                         13,052                                1%

Five participants reported that, overall, the program proved very important in their decisions to install

additional high-efficiency LED lighting (for which they did not receive a rebate from NIPSCO). Table 158

shows the additional spillover measures and the total resulting energy savings.

Table 158. 2018 Prescriptive Program Participant Spillover Measures, Quantity and Savings

Spillover Measures   Quantity   Total Energy Savings (MMBtu)
LED Lighting         321        183
Total                321        183

Table 159 summarizes the percentage of freeridership, participant spillover, and NTG for the program.

Table 159. 2018 Prescriptive Program Net-to-Gross Results

Responses (n)   Freeridershipa   Participant Spillover   NTGa
90              12%              1%                      89%

a Weighted by survey sample ex post gross program MMBtu savings.

Evaluated Net Savings Adjustments

Table 160 shows the savings, realization rates, and NTG for the program.

Table 160. 2018 Prescriptive Program Ex Post Net Savings

Fuel Type                                Ex Antea Gross Savings   Ex Post Gross Savings   Realization Rate (%)   NTG (%)   Ex Post Net Savings
Electric Energy Savings (kWh/yr)         42,608,649               45,255,008              106.2%                 89%       40,276,957
Peak Demand Reduction (kW)               7,728                    6,352                   82.2%                  89%       5,653
Natural Gas Energy Savings (therms/yr)   363,958                  288,206                 79.2%                  89%       256,504

a Values presented at a measure-level represent audited values since the scorecard provides only savings totals.

Process Evaluation

As a part of the process evaluation, the evaluation team reviewed the program database and program materials, and conducted participant surveys. The evaluation team also interviewed NIPSCO’s program manager and key program implementer staff to gain a better understanding of the program design and delivery process, and any associated changes or challenges experienced in 2018. The evaluation team’s findings follow.

Program Design and Delivery

Through the Prescriptive program, NIPSCO offers prescriptive rebates for one-for-one replacements of dozens of measures, including efficient lighting, pumps and drives, and heating, cooling, and refrigeration equipment. The implementer’s program manager, supported by a program supervisor, energy engineers, and other implementation staff, is responsible for delivering the Prescriptive program through trade allies and for coordinating customer outreach with NIPSCO staff. Implementation staff, including a marketing manager, trade ally and implementation specialists, and field and project engineers, handle program marketing, trade ally and customer engagement, quality assurance,47 and day-to-day program operations.

While program implementation staff, NIPSCO’s Major Account Managers, and trade allies all promote

the program, trade allies are instrumental in identifying energy-saving opportunities with customers and

recommending them to the Prescriptive program.

NIPSCO’s program manager guides program design and strategy, and NIPSCO’s Major Account Managers

assist with implementation efforts through direct support and program assistance to customers within

the service territory. The program implementer documented savings in its database for projects initiated

and completed in 2018. NIPSCO staff maintained a scorecard to track program energy savings, demand

reduction, and expenditures.

Changes from 2017 Design

In July 2018, NIPSCO consolidated savings goals across C&I programs regardless of project or program

type.

In 2018, Lockheed Martin Energy focused its marketing efforts on promoting natural gas savings; the

program implementer developed a natural gas–only prescriptive measure list to highlight the therm

offerings. In August 2018, the program implementer launched a simplified rebate application and

corresponding outreach campaign for boiler tune-ups, steam trap replacements, and steam pipe

insulation, contacting and distributing the new applications to targeted trade allies ahead of the heating

season. The program implementer said that although the outreach did not result in as many projects as

intended, it helped staff initiate conversations about these measures with trade allies. The program

achievements for these measures were mixed compared to 2017. Savings from steam trap replacements

and steam pipe insulation measures increased substantially, from 39% of the gross therm savings in

2017 to 80% of the gross therm savings in 2018, while savings from boiler tune-ups decreased in 2018,

contributing only 3% of the program’s gross therm savings, compared to 12% in 2017.

However, the program did not implement a bonus offering for therm-saving measures in 2018 due to budget limitations. The program implementer also noted that, after offering a bonus in 2016 and 2017, it was concerned that customers would come to rely on the bonus and refrain from executing a project until the bonus was available.

47 The implementer conducts a final inspection on at least 10% of all projects, prior to rebate payment, and 100% of projects with incentives exceeding $10,000.

2017 Recommendation Status

In addition to research objectives laid out for the 2018 evaluation, the evaluation team followed up on

the 2017 evaluation recommendations. Table 161 lists the 2017 Prescriptive program evaluation

recommendations and NIPSCO’s progress toward addressing those recommendations to date.

Table 161. Status of 2017 Prescriptive Program Evaluation Recommendations

Summary of 2017 Recommendations Status

To increase customer awareness and program

engagement, develop cross-over marketing

between digital channels and human channels, such

as offering a one-on-one consultation and

promoting through email marketing and NIPSCO’s

website. Ensure email campaigns encourage

customers to take action (for example, call or email

to set up a consultation appointment). Set goals to

follow up with a target market segment within a

few weeks of an email campaign.

In Progress. While every email communication included contact

information such as the general email and the program 800 number, the

implementer expressed concerns with adding content that could distract

from the primary message. The implementer’s outreach staff did not

administer a follow-up telephone or email campaign.

For commercial lighting, the implementer should

follow the algorithm for energy and demand

savings, outlined in the 2015 Indiana TRM (v2.2).

In Progress. The implementer began incorporating CFs into lighting demand savings in 2018, which more accurately reflects the program's peak coincident demand reduction. The implementer did not apply WHFs in 2018; it has reservations about incorporating them into lighting calculations because the WHF assumption would be applied to sites that vary significantly in size and layout, both of which greatly affect how a space is heated. For this evaluation cycle,

program stakeholders have agreed to allow the implementer to continue

this approach regarding WHFs, with the evaluation team counting electric

WHFs towards program savings in ex post gross savings. Historically,

Indiana evaluations capture WHF credits only as part of evaluated electric

savings. If the heating fuel is electric, the penalty comes in the form of

reduced electric savings. The evaluation team will continue to apply this methodology, as it is generally not appropriate to adjust natural gas portfolio savings based on electric measure impacts (or vice versa). Natural gas

and electric measures represent two distinct ratepayer and stakeholder

groups, often with different regulatory frameworks within a given

jurisdiction.


Have the implementer directly reference savings

values from the deemed savings look-up table for

each unique measure in the tracking database.

Completed in 2018. The evaluation team found that non-lighting measures

consistently referenced the appropriate savings from the implementer’s

look-up table. The evaluation team found that deemed savings were well

sourced, and they are reasonable estimates based on cited

documentation. However, deemed savings approaches will generally

produce less precise savings estimates than algorithms based on specific

installed measures at the project level. The program implementer

calculates claimed savings for lighting measures using algorithms at the

project level, which is preferred. The evaluation team recognizes that this approach is not feasible for all measure types but recommends it for other measures where possible.

Participant Feedback

The evaluation team surveyed participants about their project and program participation experiences to assess customer motivation, participation barriers, and customer satisfaction with program components (discussed in more detail below). The evaluation team contacted 487 businesses that participated in the Prescriptive program; 68 participants responded, for a 14% response rate.

Energy Efficiency Awareness and Marketing

NIPSCO and implementation staff continued efforts to build customer program awareness, though on a

more limited scale in 2018. The program implementer released five customer-facing email messages in

2018, compared to 13 in 2017 and five in 2016.

As shown in Figure 83, most participant respondents (54%) learned about the program through their

trade allies, a significantly higher percentage compared to 2017 (38%, n=68).48 Respondents reported

that they learned about the program second-most frequently through word of mouth (28%), which is

consistent with 2017 results (24%). In 2018, no respondents reported learning about the program

through printed or emailed materials, a significant decrease compared to 7% in 2017, and reflective of

the decrease in email marketing from the program implementer in 2018.49

48 Difference is statistically significant at p≤0.05 (95% confidence).

49 Difference is statistically significant at p≤0.01 (99% confidence).
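Footnotes like these flag year-over-year differences in survey proportions as statistically significant. The report does not state which test it used; a pooled two-proportion z-test is one common choice, sketched here with purely hypothetical proportions and sample sizes.

```python
from math import sqrt, erf

def two_prop_z(p1, n1, p2, n2):
    """Pooled two-proportion z-test; returns (z, two-tailed p-value)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # standard normal CDF
    return z, p_value

# Hypothetical: 30% of 100 respondents one year vs. 50% of 100 the next.
z, p = two_prop_z(0.30, 100, 0.50, 100)  # z ~ 2.89, p ~ 0.004 (p <= 0.01)
```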


Figure 83. How Participants Learned about the Prescriptive Program

Source: Participant Survey. Question: “How did you first learn about NIPSCO’s Prescriptive Incentive

program?” (multiple responses allowed)

Fifty-six percent of 2018 respondents knew that NIPSCO offered other energy efficiency programs for its

business customers, which is consistent with 2017 and 2016 results. Fifty-seven percent of these

respondents (n=37) could identify a specific program or offer, which is also consistent with 2017 results

(55%, n=31), but significantly higher than the 2016 results (30%, n=43).50

When asked for the best way for NIPSCO to keep businesses like them informed, respondents most

frequently cited email (48%; n=67), followed by mailings (27%) and the NIPSCO website (12%).

Participation Drivers

Respondents said the most important factor for participating in NIPSCO’s Prescriptive program was to

save money on utility bills (35%), followed by obtaining the program incentive (26%). However,

significantly fewer 2018 respondents than 2017 respondents (54%, n=68) and 2016 respondents (71%,

n=68) mentioned saving money on their utility bills as a top factor.51 Further, a significantly higher

percentage of 2018 respondents cited obtaining the program incentive as a top factor compared to 2017

respondents (12%) and 2016 respondents (13%).52 Figure 84 shows respondents’ participation drivers.

50 Difference is statistically significant at p≤0.05 (95% confidence).

51 Difference between 2017 and 2018 is statistically significant at p≤0.05 (95% confidence). Difference between

2016 and 2018 is statistically significant at p≤0.01 (99% confidence).

52 Difference is statistically significant at p≤0.05 (95% confidence).


Figure 84. Prescriptive Program Participation Drivers

Source: Participant Survey. Question: “What factors were important in your decision to make energy saving

improvements through NIPSCO’s Prescriptive Incentive program?” (multiple responses allowed)

Participation Barriers

Most 2018 respondents experienced relative ease in investing in energy efficiency, with 64% of respondents saying it was very easy (21%) or somewhat easy (41%). These results were similar to 2017

findings (21% said very easy and 44% said somewhat easy).

When the evaluation team asked respondents about challenges that businesses face in becoming more

energy-efficient, 65% of respondents said that the high initial cost posed the greatest challenge,

significantly more than the 2017 (41%, n=68) and 2016 findings (44%, n=68).53 In particular, a

significantly higher percentage of 2018 lighting participants (60%), who represent 65 of the 68

respondents in the sample, reported high cost as a barrier, compared to 2016 and 2017 lighting

respondents (3%, n=60, and 40%, n=58, respectively).54 In addition, significantly fewer 2018 respondents

(6%) said none or don’t know than in 2017 and 2016 (18% and 22%, respectively).55 All remaining

challenges mentioned in 2018, as shown in Figure 85, were similar to 2016 and 2017 responses.

53 Difference between 2016 and 2017 is statistically significant at p≤0.05 (95% confidence). The question’s

wording, however, changed from 2016 to 2017, from considering how their organizations struggled to how

organizations similar to theirs struggle. This change may have affected how participants responded from 2016

to the subsequent years.

54 Difference between 2016 and 2018 is statistically significant at p≤0.01 (99% confidence). Difference between

2017 and 2018 is statistically significant at p≤0.05 (95% confidence).

55 Difference is statistically significant at p≤0.01 (99% confidence).


Figure 85. Prescriptive Program Customer Barriers to Becoming Energy-Efficient

Source: Participant Survey. Question: “What do you think are the most significant challenges that

organizations face when investing in energy-efficient equipment?”

Ten respondents (15%; n=67) identified the following challenges with participating in the program:

• Completing the application (five respondents)

• Not understanding the application process (two respondents)

• Short program time frames to complete required steps (two respondents)

• High upfront cost (one respondent)

Eight of these 10 respondents offered the following suggestions for managing these program challenges:

• Provide more technical or engineering support (three respondents)

• Provide improvements to the application format or process, such as providing a video on how to

fill out the application or have the implementer help complete the application (six

respondents)56

• Allow customers to implement projects over time so that they can spread out the upfront cost (one respondent)

Satisfaction with Program Processes

Respondents rated their satisfaction with different program components. Figure 86 shows 2018

satisfaction levels for each program component. Similar to 2017 and 2016 findings, respondents were

generally satisfied with all of the program components. Based on the percentage of very satisfied

responses, respondents were most satisfied with the quality of their trade ally's work (83%) and least satisfied with the incentive amount (53%). Thirty-nine of 68 respondents (57%) worked with the program implementer in some capacity. Of these 39 respondents, 74% said they were very satisfied.

56 The program implementer developed a video to provide step-by-step instructions for completing its Excel-based application. http://www.lmenergyefficiency.com/excel-based-application-tutorial/.

Figure 86. Customer Satisfaction with Prescriptive Program Components

Source: Participant Survey. Question: “How would you rate your satisfaction with…”

Overall Satisfaction with Program and NIPSCO

As shown in Figure 87, most respondents (96%) were very or somewhat satisfied with the program

overall; only 4% of respondents reported they were somewhat dissatisfied. Responses were statistically

similar to 2017.

Figure 87. Overall Customer Satisfaction with the Prescriptive Program

Source: Participant Surveys. Question: “How satisfied are you with NIPSCO’s Prescriptive program overall?

Would you say you are…”

Two of the three 2018 respondents who were neither satisfied nor dissatisfied or somewhat dissatisfied

with the program gave reasons for their ratings. One respondent mentioned that the program time limits make it difficult to afford the full investment.57 The other respondent said that the energy-efficient upgrade has not resulted in reduced energy bills or any monetary savings.

When asked about their satisfaction with NIPSCO, a significantly higher percentage of 2018 respondents

(92%, n=68) said they were very or somewhat satisfied compared to 2017 respondents (85%, n=66).58

Figure 88 shows the full distribution of 2016, 2017, and 2018 responses for satisfaction with NIPSCO.

Figure 88. Satisfaction with NIPSCO, 2016-2018

Note: Values with a box around them indicate a significant difference, p < 0.05.

Source: Participant Surveys. Question: “How satisfied are you with NIPSCO overall as your organization’s

utility service provider? Would you say you are…”

Suggestions for Improvements

The evaluation team asked respondents to identify any suggestions they had for improving the

Prescriptive program. Most respondents (82%; n=65) did not offer any suggestions. As shown in

Figure 89, three of 12 respondents who offered suggestions said they would like to receive more

technical assistance from NIPSCO or Lockheed Martin Energy. Three recommended more or better

communication from NIPSCO about the program offerings or a check-in call from NIPSCO mid project.

Other suggestions for improving the program included simplifying the application process (two respondents), expanding program offerings (one respondent, who requested incentives for fluorescent lighting measures), higher incentives (one respondent), and a longer timeframe in which to complete the project (one respondent).

57 If the overall project allows, NIPSCO customers have the option to phase in projects over multiple years and

apply for multiple program year incentives.

58 Difference is statistically significant at p≤0.05 (95% confidence).


Figure 89. Suggestions for Improving the Prescriptive Program

Source: Participant Survey. Question: “Is there anything NIPSCO could have done to improve your overall

experience with the program?” (multiple responses allowed)

Participant Survey Firmographics

As part of the participant survey, the evaluation team collected responses on the firmographics shown

in Table 162.

Table 162. Prescriptive Program Respondent Firmographics

Firmographic Percentage

Building Square Footage (n=68)

Less than 5,000 square feet 16%

5,000 to less than 10,000 square feet 19%

10,000 to less than 50,000 square feet 26%

50,000 to less than 100,000 square feet 4%

100,000 square feet or greater 13%

Don’t know 21%

Building Ownership (n=66)

Own 91%

Lease 9%


Industry (n=68)

Retail/wholesale 22%

Education/schools 13%

Manufacturing 12%

Real estate/property management 10%

Other 7%

Religious – church 7%

Office/professional services 6%

Finance/Insurance 4%

Auto dealer/repair shop 3%

Construction 3%

Government 3%

Healthcare 3%

Grocery/food/convenience stores 1%

Hotel/motel 1%

Nonprofit 1%

Transportation 1%

Space Heating Fuel Type (n=63)

Natural Gas 86%

Electric 6%

Other 8%

Water Heating Fuel Type (n=64)

Natural Gas 70%

Electric 25%

Other 5%

Note: Totals may not properly sum due to rounding.

Conclusions and Recommendations

Conclusion 1: The 2018 Prescriptive program achieved high satisfaction among participating customers.

Most 2018 respondents (96%) were very or somewhat satisfied with the program overall, and a

significantly higher percentage of 2018 respondents (92%) said they were very or somewhat satisfied

with NIPSCO compared to 2017 respondents (85%).

Conclusion 2: Customers are concerned with the cost of implementing energy-efficient equipment and

are financially motivated to participate in the Program to offset project costs.

Upfront cost is the greatest challenge for businesses interested in energy-efficient equipment and

services, with more participants reporting initial cost as a barrier to project implementation than in 2016

and 2017. The majority of electric savings in 2018 came from lighting measures (93%), suggesting that customers face fewer financial challenges implementing lighting opportunities but may be deterred by the high costs and slow return on investment (ROI) of other energy-efficient equipment. Although saving money on utility bills remained the top reason for making improvements through the program from 2016 to 2018, fewer 2018 participants cited bill savings and energy savings as motivations, and more cited obtaining the program incentive, than in 2016 and 2017.

Recommendation:

• Consider promoting project phasing—breaking a large-scale project into manageable phases—

as a way to alleviate participant cost barriers.

Conclusion 3: Lockheed Martin Energy’s campaign to increase utilization of the program’s steam trap

replacements, steam pipe insulation, and boiler tune-up incentives yielded mixed savings results.

The program implementer launched a simplified rebate application and contractor outreach campaign

for boiler tune-ups, steam trap replacements, and steam pipe insulation in advance of the heating

season. The program implementer targeted an internal list of hydronic contractors and said that the outreach helped staff initiate conversations about these measures with trade allies. Savings from steam trap

replacements and steam pipe insulation measures increased from 39% of the gross therm savings in

2017 to 80% of the gross therm savings in 2018; however, savings from boiler tune-ups decreased in

2018, contributing only 3% of the program’s gross therm savings, compared to 12% in 2017. That said,

the program came very close to achieving its therm savings goal, despite not using the bonus offering

implemented in 2016 and 2017.

Recommendations:

• Continue outreach with hydronic contractors to increase savings from steam trap replacements

and steam pipe insulation, boilers, and boiler tune-ups.

• Consider developing targeted emails, followed by telephone calls and meetings with select

contractors, and tracking the outreach effort outcomes.

Conclusion 4: The implementer incorrectly applied CFs in the claimed impact calculations for exterior

lighting measures. This substantially overstated the program's peak demand savings.

Exterior lighting measures accounted for 18% of total ex ante program demand savings in 2018. For these measures, the implementer applied interior-fixture CFs instead of the zero CF prescribed by the 2015 Indiana TRM (v2.2), which reflects that exterior lighting is off during the summer peak period. A large portion of exterior lighting

project measures comprised the 2018 population, and the gross demand reduction for these measures

tends to be large. Since the ex post analysis reduced all exterior lighting demand savings to zero, the

demand reduction realization rate was very low in 2018.

Recommendation:

• The implementer should apply a CF of zero to all exterior lighting measures, as outlined in the

2015 Indiana TRM (v2.2).
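The TRM-style demand algorithm behind this conclusion can be sketched as follows. The wattages, fixture counts, and the 0.70 interior CF are illustrative assumptions; only the zero CF for exterior fixtures comes from the 2015 Indiana TRM (v2.2) as cited above.

```python
def peak_demand_savings_kw(base_watts, ee_watts, quantity, coincidence_factor):
    """Connected-load reduction (kW) times a summer-peak coincidence factor (CF)."""
    return (base_watts - ee_watts) * quantity / 1000.0 * coincidence_factor

# Interior fixtures: hypothetical 0.70 CF (not a TRM-quoted value)
interior = peak_demand_savings_kw(100, 40, 50, 0.70)  # ~2.1 kW
# Exterior fixtures: CF = 0 per the 2015 Indiana TRM (v2.2) -- off at summer peak
exterior = peak_demand_savings_kw(400, 150, 20, 0.0)  # 0.0 kW
```

Applying an interior CF to the exterior line instead of zero is exactly the error that inflated the claimed demand savings.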


Conclusion 5: The ex ante deemed savings for steam pipe insulation measures overstated natural gas savings in 2018, given the predominant building type in the population (religious worship).

Steam pipe insulation measures encompassed 78% of the 2018 Prescriptive therm savings. The

evaluation team additionally found that 90% of steam pipe insulation measures in the program were

installed in religious buildings, which tend to have lower-than-average usage patterns and load profiles.

As such, the evaluation team found several instances of large energy-saving steam pipe insulation

projects claiming more savings than the site’s baseline consumption. For these instances, the evaluation

team aligned per-foot savings assumptions specifically with the religious building entry in the Illinois

TRM (v7.0) for low-pressure steam systems without recirculation (statewide average of 3.2 therms per

year per foot for 2-inch pipe) to determine a more reasonable savings estimate.

Recommendation:

• Reference the 2019 Illinois TRM (v7.0) for better estimates of deemed savings by building type.

Currently, the ex ante deemed savings align with the Illinois TRM (v7.0) for low-pressure steam systems with recirculation. However, for systems without recirculation, use the building-specific deemed therm-per-foot values for either Chicago (the closest weather station to NIPSCO territory) or a statewide average. Extrapolate these values to different pipe diameters to provide new deemed savings for the three program measure offerings.
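The deemed per-foot approach described above reduces to a lookup and a multiplication. In this sketch, only the 3.2 therms-per-year-per-foot value for 2-inch pipe is taken from the text; the lookup structure and the 250-foot project length are hypothetical.

```python
# Deemed annual savings per insulated foot, by nominal pipe diameter (inches).
# Only the 2-inch religious-building value (3.2 therms/yr per foot) is quoted
# in the text; a real table would carry Illinois TRM (v7.0) values by diameter.
THERMS_PER_FOOT = {2.0: 3.2}

def pipe_insulation_savings(diameter_in, length_ft):
    """Annual therm savings = deemed per-foot value x insulated length."""
    return THERMS_PER_FOOT[diameter_in] * length_ft

savings = pipe_insulation_savings(2.0, 250)  # ~800 therms/yr
```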

Conclusion 6: An update in the Illinois TRM has reduced the natural gas savings estimate for steam

trap replacements.

The implementer referenced the Illinois TRM (v4.0) to calculate the deemed savings for steam trap

replacements. The evaluation team verified proper use of the algorithm, but, upon review of the Illinois

TRM (v5.0) published in 2016, the evaluation team found an update to the steam trap replacement

savings methodology. This update remains in the most recent version of the Illinois TRM (v7.0) and it

lowered the input for average steam loss per leaking trap from 13.8 pounds per hour to 6.9 pounds per

hour, effectively reducing per-unit savings by one-half.

Recommendation:

• Reference the 2019 Illinois TRM (v7.0) for better estimates of deemed savings for steam trap

replacements.
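Because the deemed algorithm scales linearly with the assumed steam loss per leaking trap, halving that input halves the per-unit savings. A sketch, with a hypothetical 500-therm baseline per-trap value:

```python
OLD_LOSS_LB_PER_HR = 13.8  # average steam loss per leaking trap, Illinois TRM (v4.0)
NEW_LOSS_LB_PER_HR = 6.9   # updated input, Illinois TRM (v5.0) through (v7.0)

def rescaled_savings(old_savings_therms):
    """Rescale a per-trap deemed value to the updated steam-loss input."""
    return old_savings_therms * (NEW_LOSS_LB_PER_HR / OLD_LOSS_LB_PER_HR)

savings = rescaled_savings(500.0)  # ~250 therms/yr per trap
```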


C&I Custom Incentive Program

Through the Custom Incentive (Custom) program, NIPSCO offers incentives for the installation of energy

efficiency measures or improvements where a measure is not available through the Prescriptive

program. Incentives are based on forecasted first-year electric and natural gas energy savings. The

implementer, Lockheed Martin Energy, oversees program management and delivery. NIPSCO and

Lockheed Martin Energy promote the program to customers, with support from NIPSCO’s Major

Account Managers and trade allies.

Program Performance

Table 163 summarizes savings for the 2018 Custom program, including program savings goals. In terms

of gross savings, the program achieved 92% of the electric energy goals, 59% of peak demand savings

goals, and 42% of the natural gas energy goals.

Table 163. 2018 Custom Program Savings Summary

Metric | Gross Savings Goal | Ex Ante(a) | Audited | Verified | Ex Post Gross | Ex Post Net | Gross Goal Achievement
Electric Energy Savings (kWh/yr) | 29,267,725 | 27,644,407 | 27,647,724 | 29,297,639 | 27,019,275 | 21,345,228 | 92%
Peak Demand Reduction (kW) | 5,716 | 3,465 | 3,465 | 3,672 | 3,359 | 2,654 | 59%
Natural Gas Energy Savings (therms/yr) | 1,265,347 | 554,281 | 552,358 | 585,320 | 536,686 | 423,982 | 42%

(a) Values presented at the measure level represent audited values, since the scorecard provides only savings totals.

Audited and ex post electric savings aligned well with the program’s ex ante estimates. The increase in

verified savings over audited was due to the 106% ISR calculated from site visits in 2018. Ex post savings

dropped back down for each metric based on similar findings from 2017. Similar to last year, the

implementer continued to overstate ex ante peak coincident demand reduction by inconsistently

applying CFs for Custom lighting projects. However, ex post adjustments to demand reduction were not

as significant as they were in 2017 since the implementer did not omit CFs for all lighting projects (which

was the case in 2017). The implementer started to incorporate CFs into 2018 demand calculations but

overlooked a portion of projects. The decrease in ex post demand reduction reflects the evaluation

team’s adjustment to these overlooked projects. Additionally, data logging and site visit findings for

several large electric measures revealed less electric energy and demand savings than ex ante estimates.

For the natural gas program, the 106% ISR caused verified savings to increase over audited and ex ante

savings. The reduction in ex post therm savings came from case-by-case adjustments to natural gas

savings calculations that differed from the UMP and Indiana TRM (v2.2) methodology and were

thermodynamically incorrect. Refer to the Impact Evaluation section for further discussion of evaluation

adjustments.

Table 164 shows the ex post gross and net energy adjustment factors resulting from the evaluation.


Table 164. 2018 Custom Program Adjustment Factors

Metric | Realization Rate (%)(a) | Precision at 90% Confidence (Fuel-Level Energy Realization Rate)
Electric Energy Savings (kWh/yr) | 98% | 4.4%
Peak Demand Reduction (kW) | 97% | n/a
Natural Gas Energy Savings (therms/yr) | 97% | 7.1%

Program level: Freeridership 21% | Spillover 0% | NTG(b) 79%

(a) Realization rate is defined as ex post gross savings divided by ex ante savings.
(b) NTG is defined as ex post net savings divided by ex post gross savings.
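The footnote definitions can be checked directly against the electric values reported in Table 163:

```python
# Electric-fuel values copied from Table 163 (kWh/yr).
ex_ante_gross = 27_644_407
ex_post_gross = 27_019_275
ex_post_net   = 21_345_228

realization_rate = ex_post_gross / ex_ante_gross  # ~0.977, reported as 98%
ntg = ex_post_net / ex_post_gross                 # ~0.790, reported as 79%
```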

NIPSCO spent 80% of its electric budget and 41% of its natural gas budget. Because NIPSCO did not achieve its electric or natural gas savings goals, the underspending aligned with project activity. Table 165 details the 2018 Custom program budget and expenditures.

Table 165. 2018 Custom Program Expenditures

Fuel | Program Budget | Program Expenditures | Spend (%)
Electric | $3,677,007 | $2,949,615 | 80%
Natural Gas | $1,373,354 | $561,814 | 41%

Research Questions

The evaluation team conducted a materials review, staff interviews, site visits with M&V analysis, desk reviews, a participant survey, and a trade ally survey to address the following research questions:

• How do participants hear about the program, and what are the best ways for the program to

reach potential participants?

• What are the barriers and challenges to energy efficiency and program participation?

• What are the primary reasons for participation?

• Are participants satisfied with the program and its components?

• How could NIPSCO improve participants’ experience with the program?

• What are the program’s participant spillover and freeridership estimates?

• Are the tracking database savings sourced with proper project documentation?

• Do claimed savings algorithms align with the 2015 Indiana TRM (v2.2) or another appropriate

secondary source?

• Are trade allies satisfied with NIPSCO’s program and its components, including communications

with program staff?

• What key factors influence a trade ally’s decision to participate in the program?

• What is the trade allies’ value proposition in participating? How did participation affect trade

allies’ businesses? Have they expanded services, staff, or measures offered?


Impact Evaluation

To perform an impact analysis of the 2018 Custom program, the evaluation team selected a

representative sample of measures to evaluate and then extrapolated the findings to the larger program

population. This process used a PPS sampling approach, a stratified sampling technique that uses a savings parameter (here, total energy savings in MMBtu) as a weighting factor. Out of the 579 project

measures in the population, the evaluation team conducted either an engineering review or an on-site

M&V analysis for 76 measures in the C&I Custom evaluation sample. Although the natural gas measures

accounted for 9% of program measures, they comprised 37% of the program ex ante energy (MMBtu)

savings in 2018. This indicates that natural gas measures, on average, contained larger per-installation

energy savings than electric measures. Lighting measures were the largest contributors of ex ante

savings to the electric population and furnace/unit heater HVAC upgrades were the largest contributors

to the natural gas population. Table 166 shows the total number of sample sites evaluated for this

program year and their cumulative savings, along with sample size targets designed to yield sufficient

precision.

Table 166. Custom Program Measures and Savings Evaluated

Gross (Unit) | Population Count (Measures) | Sample Electric Measure Count (Actual / Target) | Sample Natural Gas Measure Count (Actual / Target) | Evaluation Sample Share of Program Savings
132,439 | 579 | 41 / 40 | 35 / 33 | 57%
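A PPS draw weights each measure's selection probability by its savings. The following is a simplified sketch only (sampling with replacement, no stratification); the evaluation's actual design is more involved, and the project names and MMBtu values are hypothetical.

```python
import random

def pps_sample(measures, k, seed=0):
    """Draw k measure IDs with probability proportional to MMBtu savings
    (with replacement, for simplicity; duplicates collapse on de-duplication)."""
    rng = random.Random(seed)
    ids = list(measures)
    weights = [measures[i] for i in ids]
    return sorted(set(rng.choices(ids, weights=weights, k=k)))

# A few large projects dominate the savings, so they dominate the sample too.
population = {"proj_A": 5000.0, "proj_B": 120.0, "proj_C": 45.0, "proj_D": 8.0}
sample = pps_sample(population, k=2)
```

This weighting is why a 76-measure sample can cover 57% of total program savings.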

The evaluation team used findings from 2015-2017 evaluations to further inform sampling targets for

2018. The evaluation team’s understanding of the savings variability (error ratio) of natural gas and

electric custom measures allowed it to more efficiently target natural gas and electric measures in 2018;

the analysis achieved better than 90/10 confidence and precision for each fuel’s realization rates, while

representing 57% of the total program savings. The actual sample surpassed its targets because the

evaluation team used a representative measure-mix in the sample and included large energy-saving

project measures.

Figure 90 illustrates the distribution of the 2018 Custom project population by energy savings (MMBtu)

and fuel type, as labeled in the tracking database. In 2018, Custom projects accounted for 27,644,407

kWh and 554,281 therm ex ante savings (149,751 MMBtu savings combined).
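The combined figure follows from standard conversions: 3,412 Btu per kWh and 100,000 Btu (0.1 MMBtu) per therm.

```python
# Ex ante totals from the paragraph above.
kwh, therms = 27_644_407, 554_281

mmbtu = kwh * 3_412 / 1e6 + therms * 0.1
# ~94,323 MMBtu electric + ~55,428 MMBtu gas, or ~149,751 MMBtu combined
```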


Figure 90. 2018 Custom Program Ex Ante Savings Distribution by Fuel Type and Measure Category

The 2018 Custom population was a largely heterogeneous mix of measure types, containing the largest

mix of measure categories of all the NIPSCO C&I programs (Prescriptive, New Construction, SBDI, and

RCx). Natural gas HVAC measures consisted of natural gas furnace or unit heater replacements, boiler replacements, and ventilation control measures. Industrial and municipality-specific measures, such as grain dryers and wastewater treatment pump controls, made up the process category.

Since the majority of the program’s energy savings were concentrated in a small percentage of large-

energy saving projects (the top 5% of the measures contained 47% of program ex ante savings

[MMBtu]), the PPS sampling approach was able to represent 57% of the population’s total combined

savings across the sample of 76 project measures. Figure 91 shows the sample’s composition for on-site

M&V efforts and engineering reviews, compared to the population.


Figure 91. Custom Program Summary of 90/10 Sample Analysis Ex Ante Savings

Due to the high number of smaller lighting measures in the population (in all, lighting accounted for 74% of total program measures but only 36% of total program ex ante savings [MMBtu]), and the fact that large-energy measures tended to fall in the process and HVAC categories, the PPS sampling approach captured a larger portion of ex ante therm savings than electric savings. However, since savings variability (error ratio) has historically been lower for electric measures than natural gas measures for the NIPSCO Custom program, the evaluation met the 90% confidence with ±10% precision target for the energy realization rates. Figure 92 shows the sample distribution, based on measure categories from the tracking database. Electric HVAC measures consisted of air conditioner and chiller equipment measures, and “other” measures included various HVAC retrofits, controls, and heat recovery measures.

The evaluation team conducted on-site activities across most measure types during the evaluation. On-site M&V was especially informative for the non-lighting measure groups, which contained a large variety of measure types and generally had fewer project characteristics documented than the lighting measures. The evaluation team increased the evaluation sample with engineering desk reviews to ensure large energy-saving projects were included in the sample and to help match the measure distribution to the population.

Because of the PPS sampling approach, the distribution of ex ante energy savings in the sample was well aligned with the distribution of savings in the overall population by measure at the fuel level (Figure 92).


Figure 92. Custom Program 90/10 Sample Savings Distribution by Measure and Fuel Type

Since the PPS sampling approach favors large energy-saving measures, the larger measures from the

electric process and motors categories were responsible for a slightly larger portion of the savings in the

evaluation sample for those measures compared with the overall population.

Audited and Verified Savings

The evaluation team accessed project documentation in the implementer’s database, LM Captures, to

audit savings for the evaluation sample of Custom measures in 2018. Project documentation included all

custom assumptions and inputs included in the savings calculations performed and/or approved by the

implementer. Other documentation required for the audit included the final approved project

application, appropriate equipment specification sheets provided by the manufacturer, and invoices or

proof of installation of equipment by trade allies. In many instances, the evaluation team could not locate proper supporting documentation for project measures in the sample in LM Captures. This included calculations that were missing or that showed savings values that did not match the claimed savings in the program application or tracking data. In these cases, the evaluation team had to submit a secondary request to the implementer for the correct documentation.

As shown in Table 163, the evaluation team made only minor adjustments in the evaluation’s audit

phase. The slight reduction in audited electric energy savings came from rounding differences between ex ante savings in the tracking database and values sourced from project documentation. Natural gas savings also received slight adjustments in the audit phase in several instances where boiler upgrades used the wrong equipment capacity or efficiency level in the Indiana TRM (v2.2) algorithm. The

evaluation team corrected the inputs based on specification sheets in the project documentation, and

the adjustments reflected in the audited phase carried over to the ex post calculation for these

measures.


Installation Rate

The evaluation team performed site visits to verify the installation of 16 sampled measures and to calculate the ISR for the Custom program. During the site visits, the evaluation team verified that the claimed unit count was correct in most instances. However, the program-level ISR surpassed 100% in 2018 due to a measure where the evaluation team found that the total VFD-controlled process motor capacity was 600 hp rather than the 230 hp claimed in the tracking data.

Because these installation adjustments increased the verified quantity, the program-wide ISR was

calculated as an average of 106%. The evaluation team applied the ISR to the audited savings to

calculate the verified savings.
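A minimal sketch of the ISR arithmetic described above. The measure-level ratio uses the report's 600 hp verified versus 230 hp claimed; the remaining sample measures and the simple-average weighting are hypothetical assumptions, since the report does not state how the 106% average was weighted.

```python
def installation_rate(claimed_qty, verified_qty):
    """Measure-level ISR: verified quantity (or capacity) over claimed."""
    return verified_qty / claimed_qty

# VFD-controlled process motor from the report: 600 hp found on site
# versus 230 hp claimed in the tracking data.
vfd_isr = installation_rate(claimed_qty=230, verified_qty=600)

# Hypothetical: the other site-visit measures verified exactly as claimed.
sample_isrs = [1.0] * 15 + [vfd_isr]
program_isr = sum(sample_isrs) / len(sample_isrs)  # simple average (assumed)

# The program ISR is then applied to audited savings to get verified savings.
verified_savings = 1_000 * program_isr  # 1,000 is a placeholder audited value
```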

Ex Post Gross Savings

The evaluation team adjusted measure savings values in the ex post analysis based on the following data:

• Discrepancies in claimed equipment capacity, quantity, or wattages discovered during site visits

and/or discussions with business owners.

• Utilization of average power or annual operating hours metered during site visits and/or

provided by business owners.

• Inclusion of electric energy WHFs and peak summer CFs consistent with the 2015 Indiana

TRM (v2.2).

• Correcting custom natural gas equipment savings methodologies that were thermodynamically

incorrect.

Adjustments based on the above data resulted in large discrepancies for some project measures, but at the program level, realization rates declined only slightly (realization rates were between 97% and 98% for all fuel-level energy and demand metrics). Some realization rate drivers remained the same as in previous years, such as the application of CFs for lighting measures. Although the evaluation team confirmed that the implementer began applying CFs based on building type to a portion of lighting projects in 2018 (based on evaluation recommendations in 2016 and 2017), the evaluation team determined that the implementer incorrectly applied CFs to exterior lighting measures and omitted them for a number of interior lighting projects. Thus, the 2018 ex ante peak coincident demand reduction was overstated because of the inconsistent application of CFs for C&I Custom lighting projects. However, ex post adjustments to demand reduction were not as significant as they were in 2017, since the implementer did not omit CFs for all lighting projects (as was the case in 2017).

Electric Savings

The program’s electric realization rate dropped below 100% primarily due to on-site M&V findings for several large program measures, as outlined in the projects that follow.

Project 1. A fabricator replaced two constant-speed single-stage oil-cooled rotary air compressors with

two VFD oil-cooled rotary air compressors with less horsepower. The evaluation team visited the site

and confirmed all claimed equipment was fully commissioned and in stable condition. The evaluation

team also installed a three-phase power transducer and a temporary data logger to record operation


profiles for each of the compressed air systems for a duration of six weeks. The evaluation team

leveraged the specific power ratings of the new equipment (in kilowatts per 100 actual cubic feet per

minute of compressed air [ACFM]) with the logged data to compare actual operation histograms to what

was claimed in the ex ante calculations. Ex ante histograms were based on six days of pre-installation

operation. Figure 93 shows a comparison of the two histograms.

Figure 93. Comparison of Ex Ante and Ex Post Compressed Air Operating Histograms

For both systems, the evaluation team’s M&V data show much higher operating hours in the first histogram bin than claimed. This means the site’s operating profile has more downtime than originally expected, due to variable operation during the day and scheduled downtime during weekends. The evaluation team’s data also show that System A operates almost entirely at lower ACFM demands than claimed.

The evaluation team estimated the baseline consumption at these operating profiles using specifications provided in the project files. Combined savings for each system profile resulted in ex post savings of 172,135 kWh, which yielded a realization rate of 74% for this project. However, since the total capacity of the new systems was less than the baseline equipment (90 hp compared to 110 hp), this project received 5.34 kW of peak coincident demand savings (ex ante demand savings had been overlooked and claimed as zero).
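The bin-by-bin savings math implied above can be sketched as follows. All histogram hours, flow bins, and specific-power values here are hypothetical placeholders, and the linear kW-versus-flow model is a simplification of real compressor part-load behavior.

```python
def annual_kwh(hours_by_bin, acfm_by_bin, specific_power_kw_per_100acfm):
    """Annual energy from an operating histogram, using specific power
    (kW per 100 ACFM) to convert each flow bin to electric demand."""
    return sum(hours * (acfm / 100.0) * specific_power_kw_per_100acfm
               for hours, acfm in zip(hours_by_bin, acfm_by_bin))

# Hypothetical metered histogram: heavy low-load/idle time, as found on site.
hours = [3000, 2000, 1500, 500]   # annual hours in each flow bin
flows = [50, 150, 250, 350]       # ACFM bin midpoints

baseline = annual_kwh(hours, flows, specific_power_kw_per_100acfm=22.0)
vfd = annual_kwh(hours, flows, specific_power_kw_per_100acfm=18.0)
savings = baseline - vfd
```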

Project 2. A municipal wastewater treatment plant installed VFDs on aeration tank blowers to decrease

blower energy consumption when oxygen demand is below capacity. However, during the site visit and

interview with operational personnel, the evaluation team confirmed that the blowers were not able to

operate at maximum efficiency because of foaming issues in the aeration basin that occurred at low

blower loading. As such, the facility locked the speed of both blowers at 60%. Figure 94 shows the

settings at the time of the site visit. Both blowers were operating at 60% capacity and consuming

114 kW.


Figure 94. Screenshots of Wastewater Treatment Plant Blower Control Settings

Facility staff plan to investigate work-arounds that may reduce foaming, but confirmed that they will

keep the blowers at 60%, 24 hours per day, to mitigate the issue for the foreseeable future.

Therefore, the evaluation team modified the ex ante savings based on the observed kilowatts for the newly installed units. The ex ante analysis assumed substantial operating hours at less than 60% aeration capacity, leading to the assumption that the combined post-installation average electric demand would be 180 kW. Since the facility has operational issues preventing it from taking full advantage of the VFD controls, it is keeping the combined demand constant at 228 kW. Due to the increase in post-installation demand, the ex post energy and demand savings were less than claimed. Annual electric savings dropped from 2,017,316 kWh to 1,593,357 kWh, and peak coincident demand savings dropped from 210 kW to 182 kW.
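The drop in savings is consistent, to a first approximation, with the 48 kW increase in post-installation demand (228 kW versus the assumed 180 kW) running continuously for a year; the small residual reflects other details of the ex post analysis.

```python
assumed_post_kw = 180   # ex ante assumed combined average blower demand
locked_post_kw = 228    # both blowers locked at 60% speed, 24/7
hours_per_year = 8760

# Extra annual consumption from the higher-than-assumed locked demand.
extra_kwh = (locked_post_kw - assumed_post_kw) * hours_per_year

# Reported drop in annual savings from the report's figures.
reported_drop_kwh = 2_017_316 - 1_593_357
```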

Project 3. Another large electric energy-saving project was a manufacturing facility that upgraded a process chiller to a high-efficiency unit with free-cooling capability. The ex ante savings of 616,277 annual kWh were based on the assumption that the baseline and installed chillers would operate at 100% load for 6,000 hours per year. However, the evaluation team installed power metering and data logging equipment on the system for two months and found that the load factor (average percentage of full cooling load during the time chilled water is required) was 51% and the utility factor (average percentage of the time chilled water is required during production hours) was 75%. This reduced ex post savings to 275,167 annual kWh. Figure 95 outlines the weekly load profile from the evaluation team’s data logging effort.


Figure 95. Average Weekly Load Profile of Chiller

Figure 95 illustrates that average hourly production operation does not require full cooling capacity (remaining below 40% most of the time), yielding a 45% realization rate for this project measure.
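A quick consistency check on this project's realization rate from the figures above. The 45% follows directly from ex post over ex ante savings; note that the evaluation's adjustment involved metered load and utility factors and other inputs, not a simple scaling of the ex ante value.

```python
ex_ante_kwh = 616_277   # assumed 100% load for 6,000 hours per year
ex_post_kwh = 275_167   # after metering showed 51% load factor, 75% utility factor

realization_rate = ex_post_kwh / ex_ante_kwh  # about 0.45
```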

Natural Gas Savings

The evaluation team found inconsistencies in natural gas heating equipment replacement savings

methodologies in 2018 that drove the therm savings realization rate below 100%. Some boiler

replacements did not use the Indiana TRM (v2.2) or UMP algorithms and assumptions, which led to

overstated savings in some cases. For example, a school replaced a 2,576 MBH central boiler (90%

efficient) with a 2,175 MBH boiler rated at 97.2% efficiency. The ex ante savings utilized the algorithm

present in the Indiana TRM (v2.2) and UMP:

Savings = New Equipment Capacity × (New Efficiency / Old Efficiency − 1) × EFLH

Where:

EFLH = Effective full-load hours of the new equipment

However, in addition to these savings, the implementer also claimed savings due to the capacity reduction between the new and old systems. Combined, the ex ante savings equaled 7,280 therms per year. This is a form of double counting, since the UMP and Indiana TRM (v2.2) algorithm already accounts for the input capacity difference. In ex post, the evaluation team reduced savings by removing the capacity reduction estimate (thereby aligning with the Indiana TRM v2.2 and UMP), and the project received an 81% realization rate.
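The TRM/UMP algorithm above can be applied to the school project's equipment as a sketch. The capacity (2,175 MBH, i.e., kBtu/h) and the efficiencies (97.2% replacing 90%) come from the report; the EFLH value is a hypothetical placeholder, since the project's actual EFLH is not stated here.

```python
def boiler_savings_therms(new_capacity_kbtuh, new_eff, old_eff, eflh):
    """Indiana TRM (v2.2) / UMP boiler replacement savings in therms.

    Using the NEW equipment's input capacity already embeds the capacity
    reduction, which is why a separate capacity-reduction credit double
    counts savings. Conversion: 1 therm = 100 kBtu.
    """
    return new_capacity_kbtuh * (new_eff / old_eff - 1) * eflh / 100.0

# EFLH of 1,500 hours is a hypothetical assumption, not from the project file.
savings = boiler_savings_therms(2175, new_eff=0.972, old_eff=0.90, eflh=1500)
```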

In another school example, project documentation described the measure’s baseline as a central boiler

that feeds a pool heating loop and a DHW loop. The measure would reconfigure the system as two

separate, high-efficiency water heating systems that feed each loop independently. To estimate ex ante savings, the implementer calculated savings for each loop separately and combined them to claim 10,595 annual therm savings. The implementer based the DHW savings on Indiana TRM (v2.2) equations


but estimated pool savings by subtracting the calculated baseline consumption (based on engineering estimates) from the non-weather-sensitive baseline annual natural gas consumption. The difference between these two baselines does not properly describe the savings achieved by the new system. As such, the evaluation team corrected the spreadsheet model provided in LM Captures to reflect pool savings due to the increase in heating efficiency, which yielded a 37% realization rate.

Project Documentation and Measure Evaluability

Clear and understandable project documentation is a key piece of a robust evaluation. As mentioned previously, in 2018 the evaluation team found that some project documentation was missing calculations or contained savings values that did not match the claimed savings in the program application or tracking data. In these cases, the evaluation team had to submit a secondary request to the implementer for the correct documentation. However, for two large natural gas energy-saving measures, described below, the fulfillment of the secondary request was still insufficient for proper evaluation of impact metrics.

Project 1. The largest natural gas measure in the 2018 Custom population was a heating system retrofit

of a bank. Savings were achieved by converting several areas of the building from a three-pipe system,

where there is only a single return line for heating hot water and chilled water, to a four-pipe system

with separate heating and chilled water return lines. Ex ante savings were estimated at 64,150 annual

therms and based on a percent savings assumption applied to baseline billing data. The assumption was

based on engineering estimates originating from similar projects. The project description also mentioned TRACE energy modeling that helped support the savings estimate.

The evaluation team requested the following additional documentation to help support claimed savings:

• Screenshots of the major inputs to the TRACE model and justification for all critical inputs and

assumptions.

• Access to the ENERGY STAR® portfolio or at least screenshots of all inputs.

The implementer did not have these documents and forwarded contact information for a third party who had provided the engineering assumptions for this measure’s claimed savings. That contact was nonresponsive, so the evaluation team was not able to fully investigate the accuracy of the ex ante savings assumptions.

Project 2. Another large project measure claimed 20,181 annual therm savings due to a furnace

replacement with a high-efficiency air rotation heating unit. Supporting calculations showed savings

originating from a difference in total heating usage between a baseline and efficient case energy model.

Model inputs and engineering calculations were not present in LM Captures for this measure so the

evaluation team requested the following information from the implementer:

• Hours of operation assumptions

• Heating load profiles

• Baseline equipment efficiency and scheduling

• Measure equipment efficiency and scheduling


The implementer did not provide these assumptions, and the eQuest model the evaluation team attempted to build to benchmark savings showed baseline consumption an order of magnitude below what was claimed in the ex ante calculation. Historical billing data in LM Captures also did not align with the claimed estimate.

The evaluation team passed these measures through at 100% realization rates due to a lack of justification for adjustments. Because the implementer does not require documentation of assumptions and inputs, the evaluation team could not support corrections with any engineering rigor. However, the evaluation team recognizes that acceptance of these ex ante savings may overstate the program’s natural gas savings. Therefore, the evaluation team may consider a different treatment of savings in situations where large energy-saving projects (within the top 10% of the fuel-specific ex ante savings) continue to lack proper engineering assumptions and model inputs.

Realization Rates

Table 167 shows the percentage of verified savings and realization rates for the sample.

Table 167. Custom Program Realization Rates

ISR: 106.0%

Metric | Verified/Ex Ante | Realization Rate for Evaluation Samplea
Electric Energy (kWh) | 106.0% | 97.7%
Peak Demand (kW) | 106.0% | 96.6%
Natural Gas Energy (therms) | 105.6% | 96.8%

a Realization rate is defined as ex post gross savings divided by ex ante savings.

Table 168 lists the measure and unit counts within the sample, along with the aggregated ex post realization rates for each measure type. The Ex Post Gross Savings section outlines the realization rate drivers for each measure type.

Table 168. Custom Program Evaluation Sample Results by Measure Type

Fuel Type | Measure Type | Evaluation Sample Measure Count | Evaluation Sample Unit Count | Peak Demand Ex Post Realization Rate | Total Energy Ex Post Realization Rate
Electric | Compressed Air | 2 | 150 | N/A | 64%
Electric | Process | 3 | 233 | 105% | 70%
Electric | Lighting | 34 | 7,264 | 94% | 103%
Electric | Motors | 2 | 49 | 160% | 91%
Natural Gas | Controls | 2 | 390 | N/A | 153%
Natural Gas | Boilers | 14 | 54,705 | N/A | 84%
Natural Gas | Process | 3 | 1,202 | N/A | 100%
Natural Gas | HVAC/Furnace | 9 | 22,001 | N/A | 103%
Natural Gas | Other | 5 | 1,463 | N/A | 97%
Natural Gas | Ventilation | 2 | 2 | N/A | 100%

Note: N/A denotes no claimed demand savings.

It is important to note that the realization rates in Table 168 were not applied to each measure type in the population to determine ex post gross savings; Table 168 is presented only to provide further detail on findings within the evaluation sample. To calculate the ex post gross impacts, the evaluation team applied the metric-level realization rates resulting from the sample to the population ex ante energy and demand savings (shown in Table 169).

Table 169. Application of 2018 Custom Program Realization Rates

Metric | Population Ex Ante | Realization Rate (From Evaluation Sample) | Population Ex Post Grossa
Electric Energy Savings (kWh/yr) | 27,644,407 | 98% | 27,019,275
Peak Demand Reduction (kW) | 3,465 | 97% | 3,359
Natural Gas Energy Savings (therms/yr) | 554,281 | 97% | 536,686

a Totals may not calculate properly due to rounding.
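The ex post gross figures in Table 169 can be checked against the ex ante values. The table's realization rates are rounded, so the ratios below recover them only to two decimal places; the evaluation applied unrounded sample-level rates.

```python
ex_ante = {"kwh": 27_644_407, "kw": 3_465, "therms": 554_281}
ex_post = {"kwh": 27_019_275, "kw": 3_359, "therms": 536_686}

# Implied metric-level realization rates (ex post gross / ex ante).
realization = {metric: ex_post[metric] / ex_ante[metric] for metric in ex_ante}
```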

Ex Post Net Savings

The evaluation team calculated freeridership and participant spillover using the methods described in Appendix B. Self-Report Net-to-Gross Evaluation Methodology and survey data collected from 2018 participants. As shown in Table 170, the evaluation team estimated a 79% NTG for the program.

Table 170. 2018 Custom Program Net-to-Gross Results

Program Category | Freeridership (%)a | Participant Spillover (%) | NTG (%)a
Custom Program | 21% | 0% | 79%

a Weighted by survey sample ex post gross program MMBtu savings.

Freeridership

To determine freeridership, the evaluation team asked 38 respondents (representing 45 measures) questions focused on whether they would have installed equipment at the same level of efficiency, at the same time, and in the same amount in the absence of the Custom program.

Based on survey feedback, the evaluation team calculated overall freeridership of 21% for the program,

as shown in Table 171.

Table 171. 2018 Custom Program Freeridership Results

Program Category | Responses (n)a | Freeridership (%)b
Custom Program | 45 | 21%

a For customers who installed more than one measure, the evaluation team asked the freeridership battery of questions for a maximum of two measures, resulting in 90 unique responses.
b The freeridership score was weighted by the survey sample ex post gross program MMBtu savings.

By combining the previously used intention methodology with the influence methodology (through

simple averaging at the individual level), the evaluation team produced average freeridership for each

respondent. Refer to Appendix B. Self-Report Net-to-Gross Evaluation Methodology for further details on

intention and influence questions and scoring methodologies.


Intention Freeridership

The evaluation team estimated intention freeridership scores for all participants based on their

responses to the intention-focused freeridership questions. As shown in Table 172, the intention

freeridership score for the C&I Custom program was 36%.

Table 172. 2018 Custom Program Intention Freeridership Results

Program Category | Responses (n)a | Intention Freeridership Score (%)b
Custom Program | 45 | 36%

a For customers who installed more than one measure, the evaluation team asked the freeridership battery of questions for a maximum of two measures, resulting in 90 unique responses.
b The freeridership score was weighted by survey sample ex post gross program MMBtu savings.

Figure 96 shows the distribution of the individual intention freeridership scores.

Figure 96. 2018 Custom Program Distribution of Intention Freeridership Scores

Source: Participant Survey. Questions: G1 to G9 and G11 are used to estimate an intention freeridership

score. See Table 239 in the Appendix B. Self-Report Net-to-Gross Evaluation Methodology for the full text of

the questions, response options, and scoring treatments used to estimate intention freeridership scores.

See Table 240 for the unique Custom program participant response combinations resulting from intention

freeridership questions, along with intention freeridership scores assigned to each combination, and the

number of responses for each combination.

Influence Freeridership

The evaluation team assessed influence freeridership by asking participants how important various

program elements were in their decision-making process. Table 173 shows program elements

participants rated for importance, along with a count and average rating for each factor.


Table 173. 2018 Custom Program Influence Freeridership Responses

Influence Rating | Influence Score | NIPSCO Incentive | Information from NIPSCO on Energy Savings Opportunities | Recommendation from Contractor or Vendor | Participation in a NIPSCO Efficiency Program
1 (not at all important) | 100% | 0 | 4 | 3 | 4
2 | 75% | 2 | 3 | 3 | 3
3 | 25% | 10 | 7 | 6 | 5
4 (very important) | 0% | 32 | 29 | 28 | 20
Don’t Know | 50% | 1 | 2 | 5 | 13
Average | — | 3.7 | 3.4 | 3.5 | 3.3

The evaluation team determined each respondent’s influence freeridership rate for each measure category using the maximum rating provided for any factor included in Table 173. As shown in Table 174, the respondents’ maximum influence ratings ranged from 1 (not at all important) to 4 (very important). A maximum score of 1 meant the customer ranked all factors from Table 173 as not at all important, while a maximum score of 4 meant the customer ranked at least one factor as very important. Counts refer to the number of “maximum influence” responses for each influence score response option.

Table 174. 2018 Custom Program Influence Freeridership Score

Maximum Influence Rating | Influence Score | Count | Total Survey Sample Ex Post MMBtu Savings | Influence Score MMBtu Savings
1 (not important) | 100% | 1 | 0 | 0
2 | 75% | 2 | 2,050 | 1,538
3 | 25% | 8 | 333 | 83
4 (very important) | 0% | 30 | 28,367 | 0
Not applicable | 50% | 0 | 0 | 0

Average Maximum Influence Rating (Simple Average): 3.9
Average Influence Score (Weighted by Ex Post Savings): 5%

The average influence score of 5% for the 2018 Custom program is weighted by respondents’ ex post MMBtu program savings.
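The 5% weighted influence score can be reproduced from Table 174's rows, taking the influence score per maximum-rating bucket weighted by ex post MMBtu savings:

```python
# (influence score, ex post MMBtu savings) per maximum-rating bucket (Table 174).
buckets = [
    (1.00, 0),       # rating 1 (not important)
    (0.75, 2050),    # rating 2
    (0.25, 333),     # rating 3
    (0.00, 28367),   # rating 4 (very important)
    (0.50, 0),       # not applicable
]

total_mmbtu = sum(mmbtu for _, mmbtu in buckets)
weighted_score = sum(score * mmbtu for score, mmbtu in buckets) / total_mmbtu
```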

Final Freeridership

Next, the evaluation team calculated the mean of the intention and influence freeridership components

to estimate the final freeridership for the Custom program at 21%.

Final Freeridership (21%) = [Intention FR Score (36%) + Influence FR Score (5%)] / 2
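The freeridership and NTG arithmetic, written out. The simple average of 36% and 5% is 20.5%, which the report presents rounded as 21%; NTG is then 100% minus freeridership plus participant spillover.

```python
intention_fr = 0.36
influence_fr = 0.05

final_fr = (intention_fr + influence_fr) / 2   # 0.205, reported as 21%

spillover = 0.00                               # rounded participant spillover
ntg = 1 - 0.21 + spillover                     # using the reported 21% score
```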

A higher freeridership score translates to more savings that are deducted from the gross savings

estimates. Table 175 lists the intention, influence, and freeridership scores for the C&I Custom program.


Table 175. 2018 Custom Program Freeridership Score

Responses (n) | Intention Score | Influence Score | Freeridership Score
45 | 36% | 5% | 21%

Participant Spillover

As detailed in Appendix B. Self-Report Net-to-Gross Evaluation Methodology, the evaluation team estimated participant spillover59 measure savings using specific information about participants, determined through the evaluation, and using the Indiana TRM (v2.2) as a baseline reference. The evaluation team estimated the percentage of program participant spillover by dividing the sum of additional spillover savings (as reported by survey respondents) by the total gross savings achieved by all program respondents. The evaluation team estimated 0%60 participant spillover for the C&I Custom program, rounded to the nearest whole percent (Table 176).

Table 176. 2018 Custom Program Spillover

Spillover Savings (MMBtu) | Participant Program Savings (MMBtu) | Spillover
53.8 | 30,749.7 | 0%

Two participants reported that, overall, the program was very important in their decision to install additional LEDs, a lighting control, and an air makeup unit (without NIPSCO incentives). Table 177 shows these additional participant spillover measures and the total resulting energy savings.

Table 177. 2018 Custom Program Participant Spillover Measures, Quantity, and Savings

Spillover Measures | Quantity | Total Energy Savings (MMBtu)
Air make-up unit | 1 | 22.2
LED Lighting | 46 | 31.4
Lighting control | 1 | 0.2
Total | N/A | 53.8
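The spillover percentage follows from the measure savings in Table 177 and the participant program savings in Table 176; it rounds to 0.2% at one decimal place (footnote 60) and to 0% at whole percents.

```python
# Measure-level spillover savings (MMBtu) from Table 177.
spillover_mmbtu = 22.2 + 31.4 + 0.2

# Total ex post gross savings for survey respondents (Table 176).
participant_mmbtu = 30_749.7

spillover_pct = spillover_mmbtu / participant_mmbtu * 100
```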

Table 178 lists the percentage of freeridership, participant spillover, and NTG for the Custom program.

Table 178. 2018 Custom Program Net-to-Gross Results

Responses (n) | Freeridershipa | Participant Spillover | NTG
45 | 21% | 0% | 79%

a Weighted by survey sample ex post gross program MMBtu savings

59 Non-participant spillover evaluation activities were not conducted for the 2018 program year.

60 True value is 0.2%.


Evaluated Net Savings Adjustments

Table 179 shows the energy savings, realization rates, and NTG for the Custom program.

Table 179. 2018 Custom Program Ex Post Net Savings

Fuel Type | Ex Ante Gross Savings | Ex Post Gross Savings | Realization Rate (%) | NTG (%) | Ex Post Net Savings
Electric Energy Savings (kWh/yr) | 27,644,407 | 27,019,275 | 98% | 79% | 21,345,228
Peak Demand Reduction (kW) | 3,465 | 3,359 | 97% | 79% | 2,654
Natural Gas Energy Savings (therms/yr) | 554,281 | 536,686 | 97% | 79% | 423,982

Process Evaluation

The evaluation team reviewed the program database and materials, and conducted participant and trade ally surveys. The evaluation team also interviewed NIPSCO’s program manager and key program implementer staff to gain a better understanding of the program design and delivery process, and any associated changes or challenges experienced in 2018. The evaluation team’s findings follow.

Program Design and Delivery

Through the Custom program, NIPSCO offers incentives for non-standard projects that involve more complex technologies or equipment changes than are covered in the one-for-one replacement offers available through the Prescriptive program. Custom incentives are based on a project’s estimated first-year peak demand reduction and electric or natural gas energy savings.

NIPSCO’s C&I program manager guides program design and strategy, and NIPSCO’s Major Account

Managers assist with implementation efforts through direct support and program assistance to

customers within the service territory.

The implementer’s program manager, supported by a program supervisor, energy engineers, and other

implementation staff, is responsible for delivering the Custom program through trade allies and

coordinating customer outreach with NIPSCO staff. Implementation staff, including a marketing

manager, trade ally and implementation specialists, and field and project engineers, handle program

marketing, trade ally and customer engagement, quality assurance, and day-to-day program operations.

Implementation staff, NIPSCO’s Major Account Managers, and trade allies all promote the program.

Notably, it is the trade allies who work with customers to identify energy-savings opportunities and

drive most projects to the program through their direct outreach efforts.

To participate in the Custom program, a customer must submit an application for pre-approval, or a

trade ally can apply for Custom incentives on a customer’s behalf. The program implementer reviews

the application and conducts a site visit to verify the baseline equipment and projected savings. The

program implementer then notifies the customer of pre-approval and sends an installation agreement

with an 18-month installation deadline. After the installation, the customer or trade ally submits a


completion form. For at least 10% of completed projects and for every project with incentives exceeding

$10,000, the implementer conducts a final inspection prior to rebate payment.

Changes from 2017 Design

In July 2018, NIPSCO consolidated the savings goals across C&I programs, regardless of project or

program type.

To boost participation, NIPSCO increased the incentive cap from 50% to 75% of the project cost and

increased the electric incentive for non-lighting measures from $0.09 to $0.10 per kilowatt-hour.

However, NIPSCO did not implement a bonus offering for therm-saving measures in 2018. The program

implementer noted that after offering a therm bonus in 2016 and 2017, it was concerned that

customers would come to rely on the bonus and therefore refrain from executing a project until the

bonus was available.

2017 Recommendation Status

During interviews with NIPSCO staff and the program implementer, the evaluation team followed up on

the 2017 evaluation recommendations. Table 180 lists the 2017 Custom program evaluation

recommendations and NIPSCO’s progress toward addressing those recommendations to date.

Table 180. Status of 2017 Custom Program Evaluation Recommendations

2017 Recommendation: To increase customer awareness and program engagement, develop cross-over marketing between digital channels and human channels, such as offering a one-on-one consultation and promoting through email marketing and NIPSCO’s website. Ensure email campaigns encourage customers to take action (for example, call or email to set up a consultation appointment). Set goals to follow up with a target market segment within a few weeks of an email campaign.

Status: In Progress. While every email communications piece included contact information, such as a general email address and the program 800 number, the program implementer expressed concerns with adding content that could distract from the primary message. The program implementer’s outreach staff did not administer a follow-up telephone or email campaign.


Follow the algorithm for energy and demand

savings outlined in the 2015 Indiana

TRM (v2.2) for commercial lighting

In Progress. The implementer has started to incorporate CFs in lighting demand

savings in 2018, which is reflecting a more accurate depiction of peak

coincidence demand reduction for the program. The implementer has

reservations about incorporating WHFs into lighting calculations due to

concerns that the WHF assumption would be applied to sites that significantly

vary in size and layout, which both greatly impact how a space is heated, and it

did not apply them in 2018. For this evaluation cycle, program stakeholders

have agreed to allow the implementer to continue this approach regarding

WHFs, with the evaluation team counting electric WHFs towards program

savings in ex post gross savings. Historically, Indiana evaluations capture WHF

credits only as part of evaluated electric savings. If the heating fuel is electric,

the penalty comes in the form of reduced electric savings. The evaluation team

will continue to apply this methodology, as it is generally not appropriate to

adjust natural gas portfolio savings based on electric measure impacts (or vice versa).

Natural gas and electric measures represent two distinct ratepayer and

stakeholder groups, often with different regulatory frameworks within a given

jurisdiction.

Directly reference savings values from a

custom analysis template for each measure in

the tracking database. This creates a uniform

method of displaying calculation methodology

for each measure. Also, rely on spreadsheet

calculators, when applicable, instead of read-

only documents of calculation results.

In Progress. The implementer developed internal calculator tools for consistent

savings reporting. If a customer supplies calculation files, the implementer vets

and reviews the calculations on a project-by-project basis. The implementer

said customer-supplied calculations often include data applicable to a specific

project, which may be more accurate than an Indiana TRM (v2.2) calculation.

The implementer said the calculation files are too large to upload into

LM Captures, and the evaluation team can request these as needed from the

implementer. However, for two large natural gas energy-saving measures,

described in the Project Documentation and Measure Evaluability section, the

response to the secondary request was still insufficient for a proper evaluation

of impact metrics.

Participant Feedback

The evaluation team surveyed participants about their project and program participation experience to

assess customer motivation, participation barriers, and customer satisfaction with program

components. These are discussed in more detail below. The evaluation team contacted 159 businesses

that participated in the Custom program—40 participants completed the survey for a 25% response

rate. Because of the small sample sizes in 2016, 2017, and 2018, statistical differences in the responses

between years should be considered with caution.

Energy Efficiency Awareness and Marketing

NIPSCO markets all its C&I programs—Custom, Prescriptive, SBDI, New Construction, and RCx—as one

unit, and provides program-specific messaging, incentive summaries, and brochures to target its desired

customer segments. NIPSCO and implementation staff continued efforts to build customer program

awareness, though on a more limited scale in 2018. The program implementer maintained its program

materials, but released only five customer-facing email messages in 2018, compared to 13 in 2017 and

five in 2016.


Through the participant survey, the evaluation team assessed how participants learned about the

program. Respondents most frequently learned through trade allies (43%), followed by NIPSCO staff

(18%) and word of mouth (15%). Figure 97 shows a full breakdown of all the responses. As shown below,

a significantly higher percentage of 2018 respondents (18%) learned about the program through NIPSCO

staff than in 2017 (5%) and 2016 (3%).61 In 2018, no respondents reported learning about the program

through printed or emailed materials, compared to 3% in 2017 (likely reflective of the decrease in email

marketing from the program implementer in 2018).

Figure 97. How Participants Learned about the Custom Program, 2016-2018

Note: Boxed value indicates difference between 2017 and 2018 is significant at p≤0.05 (95% confidence).

Source: Participant Surveys. Question: “How did you first learn about NIPSCO’s Custom Incentive program?”

(Multiple responses allowed)
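The significance notes in this section come from comparing response proportions across survey years. A minimal sketch, assuming a pooled two-proportion z-test (the report does not specify its exact test, and the counts below are illustrative approximations of the reported percentages, not program data):

```python
# Hedged sketch of a pooled two-proportion z-test for comparing survey
# response rates across years. The counts (7 of 40 vs. 2 of 39, roughly
# 18% vs. 5%) are illustrative assumptions.
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """Return the pooled z statistic and two-sided p-value."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))   # standard normal CDF
    return z, 2 * (1 - phi)                   # two-sided p-value

z, p = two_proportion_z(7, 40, 2, 39)
```

With samples this small (roughly 40 respondents per year), only large differences in proportions clear the p≤0.05 threshold, which is why the report cautions against over-interpreting year-to-year changes.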

Most respondents were aware that NIPSCO offers other energy efficiency programs for business

customers. Sixty-eight percent of 2018 respondents reported awareness of other NIPSCO business

programs, compared with 77% in 2017 and 63% in 2016. Fifty percent of these respondents (n=26) could

identify a specific program or offer, consistent with the 2017 responses (60%, n=31) and 2016 responses

(45%, n=22).

When asked for the best way for NIPSCO to keep them informed of future savings opportunities,

respondents (n=40) most frequently said they prefer email (51%), bill inserts (22%), and in-person

contact from NIPSCO (15%).

61 Difference is statistically significant at p≤0.01 (99% confidence).


Participation Drivers

Respondents shared a variety of reasons for participating in NIPSCO’s Custom program. The most

important reasons mentioned were saving money on their utility bills (30%) and obtaining a program

incentive (30%). Figure 98 shows a breakdown of the 2018 respondents’ participation drivers. A

significantly higher percentage of 2018 respondents (20%, n=40) mentioned saving energy as a top

reason, compared to 2017 (5%, n=39).62

Figure 98. Custom Program Participation Drivers

Source: Participant Survey. Question: “What two factors were most important in your decision to make

energy-saving improvements through NIPSCO’s Custom program?” (two responses allowed)

Participation Barriers

The survey asked respondents about challenges that businesses face in becoming more energy-efficient.

Fifty-four percent of 2018 respondents (n=39) said it is very easy or somewhat easy for businesses to

invest in energy-efficient equipment. This is similar to 2017 responses (59%, n=39). As shown in

Figure 99, 2018 respondents most frequently mentioned the high initial cost as the greatest challenge

(50%), followed by understanding potential areas for improvement (15%) and lack of staff time

dedicated to energy efficiency upgrades (15%). The barriers reported in 2018 were statistically similar to

those in 2017.

62 Difference is statistically significant at p≤0.05 (95% confidence).


Figure 99. Barriers to Becoming Energy-Efficient

Source: Participant Surveys. Question: “What do you think are the most significant challenges that

organizations face when investing in energy-efficient equipment?”

Seven of the 40 respondents identified challenges with participating in the program. Four of these seven

respondents cited issues with understanding the rebate application process or completing the

application. Respondents cited the following other challenges: long payback period, confusion about

whom to contact with questions, long response time for answers to questions, level of detail needed to

apply, and challenges with a trade ally. When asked what NIPSCO could have done to help them

overcome their challenges, three would have liked timely assistance from NIPSCO or implementer staff,

and one wanted more education on the application process and program requirements in general.

Satisfaction with Program Processes

Respondents rated their satisfaction with different program components (Figure 100). Respondents

showed the highest level of satisfaction for working with Lockheed Martin Energy, with 97% of

respondents (n=40) being very satisfied or somewhat satisfied. Respondents were the least satisfied

with the program application process and the time to receive the incentive check. The 2018 results were

similar to 2017.


Figure 100. Satisfaction with Custom Program Components

Source: Participant Survey. Question: “How would you rate your satisfaction with…”

Overall Satisfaction with Program and NIPSCO

Most 2018 respondents (95%) were very satisfied or somewhat satisfied with the program overall, which

was similar to results in 2017 (100%) and 2016 (94%). The evaluation team asked respondents to

identify the reasons for their overall program rating. Respondents who were very satisfied gave a variety

of positive reasons for their scores, such as noting that they experienced a smooth application process,

found the information provided to be very clear, or thought that Lockheed Martin Energy’s staff was

professional and helpful. Two of the nine respondents who were somewhat satisfied with the program

overall had positive words for the program and program staff; the remaining seven offered reasons for

not giving the highest rating possible, including a desire for more technical assistance and timely

information throughout the application process (two respondents), a preference for learning about the

program through NIPSCO itself instead of their vendor (two respondents), an interest in higher incentive

levels (one respondent), simplified paperwork (one respondent), and a more accurate estimate of the

project cost (one respondent). 2018, 2017, and 2016 overall satisfaction ratings are shown in Figure 101.


Figure 101. Overall Satisfaction with the Custom Program

Source: Participant Surveys. Question: “How satisfied are you with NIPSCO’s Custom Incentive program

overall? Would you say you are…”

When asked about their satisfaction with NIPSCO, 98% of respondents said they were very satisfied or

somewhat satisfied, compared with 92% in 2017 and 94% in 2016. Figure 102 shows the distribution of

2016, 2017, and 2018 responses for satisfaction with NIPSCO.

Figure 102. Satisfaction with NIPSCO

Source: Participant surveys. Question: “How satisfied are you with NIPSCO overall as your organization’s

utility service provider? Would you say you are…”


Suggestions for Improvement

The evaluation team asked respondents to identify any suggestions they had for improving the Custom

program. Most respondents (85%; n=39) did not offer any suggestions. Seven respondents suggested

the following (multiple responses allowed):

• Simplify the application form such as by providing a drop-down menu and streamlining the

application form itself (three respondents).63

• Improve communication from NIPSCO, such as educating customers about the program,

providing more technical assistance on projects, and informing vendors about updated

paperwork (three respondents).

• Offer higher incentives (two respondents).

• Reduce the gap between the incentive estimated during the pre-approval process and the actual

incentive amount given post-project (one respondent).

Participant Survey Firmographics

As part of the participant survey, the evaluation team collected responses on the following

firmographics, shown in Table 181.

Table 181. Custom Program Respondent Firmographics

Firmographic Percentage

Industry (n=40)

Manufacturing 35%

Education/schools 13%

Government 10%

Retail/wholesale 10%

Religious/church 5%

Healthcare 5%

Transportation 5%

Utility 5%

Agriculture 3%

Construction 3%

Grocery/food stores/convenience stores 3%

Auto dealer/repair shop 3%

Office, professional services 3%

Building Ownership (n=38)

Own 84%

Lease 16%

63 Lockheed Martin Energy’s Excel-based application form contains drop-down options for baseline and efficient

measure entries.


Building Square Footage (n=40)

Less than 5,000 square feet 12%

5,000 to less than 10,000 square feet 2%

10,000 to less than 50,000 square feet 22%

50,000 to less than 100,000 square feet 20%

100,000 square feet or greater 18%

Don’t know 25%

Space Heating Fuel Type (n=38)

Natural gas 95%

Electric 3%

Process heat from equipment 3%

Water Heating Fuel Type (n=37)

Natural gas 71%

Electric 26%

None 3%

Trade Ally Feedback

Through an online survey conducted in January 2019, the evaluation team asked trade allies who

participated in NIPSCO’s 2018 C&I programs for feedback on their experiences. The evaluation team

contacted 82 trade allies who participated in the Custom program—16 trade allies responded for a 20%

response rate. Trade ally feedback for the C&I programs follows.

Trade Ally Engagement and Awareness

Trade allies’ agreement with statements about program engagement showed that they are fairly

engaged with the Custom program. Most trade ally respondents strongly agreed or somewhat agreed with

the statements shown in Figure 103. Similar to 2017 survey results, 2018 respondents most strongly

agreed with the statement that they are well-informed about the program’s requirements and

processes.


Figure 103. Trade Ally Agreement with Custom Program Engagement Statements

Source: Trade Ally Survey. Question: “Please indicate whether you strongly agree, somewhat agree, neither

agree nor disagree, somewhat disagree, or strongly disagree with the following statements…”

Of the five C&I programs, Custom program trade ally survey respondents said they were most familiar

with the Custom and Prescriptive programs. Figure 104 shows the 2016, 2017, and 2018 trade ally

familiarity ratings for all the C&I programs. On a scale of 1 to 5, where 1 is very familiar, and 5 is not at

all familiar, 75% of 2018 trade ally survey respondents rated their familiarity with the Custom program

as very (44%, n=16) or somewhat familiar (31%). These results were statistically similar to 2017 and

2016 responses, though the small sample size makes it difficult to detect significant differences.64

Although the evaluation team surveyed trade allies who submitted a Custom application in 2018, one of

16 was not at all familiar with the Custom program. Respondent familiarity with the New Construction

and RCx programs increased slightly in 2018 from 2016 and 2017, suggesting improvement in program

engagement. Familiarity with the SBDI program declined from 2017 to 2018, with 27% of 2018

respondents reporting that they were very familiar with the program, compared to 41% in 2017;

however, the difference in responses is not statistically significant.

64 Due to small 2016, 2017, and 2018 sample sizes, the evaluation team largely refrained from testing for

statistical differences among the groups. Observed and tested differences should be considered with caution.


Figure 104. Trade Ally Familiarity with C&I Programs, 2016-2018

Source: Trade Ally Surveys. Question: “How familiar are you with the following NIPSCO energy efficiency

programs and incentives for business customers, on a scale of 1 to 5, where 1 is very familiar and 5 is not at

all familiar?”

The survey asked trade allies to rank sources of program information from most to least useful. Custom

program trade ally respondents ranked personal calls from program implementer representatives as the

most useful source, followed by emails and the NIPSCO website. NIPSCO utility representatives and

trainings ranked equally in fourth place. Table 182 shows the sources ranked from most to least useful.

These results are consistent with the rankings from the 2017 survey results.


Table 182. Information Source Usefulness

Information Source Ranking

Personal calls from program implementer representatives 1 – most useful

Emails 2

NIPSCO program website 3

NIPSCO utility representative 4 (tie)

Trainings 4 (tie)

Trade Ally Network peers 5

Meetings 6

Newsletters 7 – least useful

Source: Trade Ally Survey. Question: “Please rank the following communication options from most useful to least useful.”

(n=15)

Ten of 16 trade ally respondents reported that they typically promote NIPSCO’s energy efficiency

programs to customers all the time (eight respondents) or frequently (two respondents). The six

respondents who promote the programs only sometimes (four respondents) or seldom (two

respondents) provided the following reasons for not promoting the program more often (multiple

responses allowed):

• Not familiar enough with the details of the program or who is eligible (four respondents)

• Too much paperwork or the incentives are not worth the hassle (three respondents)

• Out-of-state or not always in the NIPSCO service territory (two respondents)

• Perceived financial risk to the trade ally or customer (one respondent)

• Program is confusing to the customer (one respondent)

The survey asked trade allies about the benefits of promoting NIPSCO’s energy efficiency programs.

Thirteen of 15 respondents reported the incentives for their customers as the greatest benefit, and two

respondents reported increased business. These results are consistent with the 2017 survey findings.

Program’s Economic Impact on Trade Allies

As shown in Table 183, trade allies reported NIPSCO’s programs had varying levels of economic impact

on their business. Similar to the 2017 and 2016 survey results, approximately one-third of trade ally

respondents reported that at least 50% of their companies’ projects were eligible for and received a

NIPSCO incentive, with five of 16 trade allies reporting that more than 50% of their companies’

projects received NIPSCO incentives. However, over half of the 2018 trade allies (nine of 16) reported that

10% or less of their projects received an incentive.


Table 183. Self-Reported Percentage of Trade Ally Projects that Received a NIPSCO Incentive

Percentage of

Projects Receiving

Incentive

0% 10% 20% 30% 40% 50% 60% 70% 80% 90% 100%

Number of

Respondents 1 8 0 1 0 1 0 0 1 3 1

Source: Trade Ally Survey. Question: “What percentage of your company’s projects were eligible and received an incentive

from NIPSCO in 2018? These are projects for which you or your customers submitted an application to NIPSCO for an

incentive. Use your best guess.”

When asked how the volume of their sales has changed since their involvement with NIPSCO, 13

respondents answered the question. Seven of them said their sales increased, and six said sales have not

changed since their involvement with NIPSCO. In response to increased business, one trade ally added

more services and products to the company’s business offerings, one expanded the company’s service

location (territory served), one hired more staff, and five reported no changes to their business.

Trade Ally-Reported Spillover

Two of the 16 surveyed trade allies reported that participating in the Custom program influenced their

sales of high-efficiency air compressor equipment not eligible for incentives in NIPSCO’s service

territory. Both trade allies reported similar shares of air compressor sales that were standard

efficiency (50% each) and high efficiency without an incentive (25% and 30%). Only one of the

two trade allies provided sales numbers; this trade ally reported selling 60 to 80 units of air compressor

equipment in NIPSCO’s service territory in 2018, with 30% of those sales (18 to 24 units) being high-

efficiency units not eligible for incentives, sales to which participating in the Custom program was

somewhat important. The same trade ally said that 20% (80 units) of the company’s total air

compressor sales in NIPSCO’s service territory were high-efficiency units eligible for incentives

and 50% (80 units) were of standard efficiency. The other trade ally, who did not provide

unit counts, reported that 25% of sales were high-efficiency units not eligible for incentives,

sales to which participating in the Custom program was somewhat important; 25% of this company’s

total air compressor sales in NIPSCO’s service territory were high-efficiency units eligible for

incentives, and 50% were of standard efficiency.

The trade allies’ reported spillover activity was small relative to the gross evaluated program savings

associated with the 16 surveyed trade allies. As such, the evaluation team reported these results

qualitatively due to concerns about the applicability of extrapolating the findings to the program

population.

Trade Ally Satisfaction

Trade allies also rated their overall satisfaction with the Custom and Prescriptive programs on a 5-point

scale, where 1 means very satisfied and 5 means very dissatisfied. Sixty-nine percent of trade allies (11

of 16 respondents) reported that they were very satisfied (seven respondents) or somewhat satisfied (four

respondents) with the programs overall (Figure 105). These responses are consistent with 2016 and

2017 results. Also similar to 2017 and 2016 results, a minority of 2018 respondents reported that they

were not too satisfied or not at all satisfied (one respondent each). The two 2018 respondents who rated


satisfaction at a 4 or 5 each provided a suggestion to improve the programs: increase staff availability

for project-specific questions and make the incentives easier to obtain.

Figure 105. Custom Program Trade Ally Overall Satisfaction, 2016-2018

Source: Trade Ally Surveys. Question: “How satisfied are you with NIPSCO’s Prescriptive and Custom

programs overall?”

More trade allies in 2018 reported receiving program training from NIPSCO or the program implementer

than in previous years. Ten of 16 trade allies reported receiving training in 2018, compared to two of 13 in

2017 and four of 11 trade allies in 2016. Eight trade allies reported receiving training on site or having in-

depth communication with Lockheed Martin Energy, five participated in trade ally events such as a trade

ally breakfast or annual meeting, and one received information from a NIPSCO representative (multiple

responses allowed, n=10).

Most of the 2018 respondents (n=10) rated the trainings as very useful (five respondents) or somewhat

useful (three respondents), and just two rated the trainings as neither useful nor unuseful (one

respondent) or not useful at all (one respondent). In comparison, only two respondents in 2017 and four

respondents in 2016 rated the usefulness of the trainings. The 2017 respondents found the training to

be very useful or somewhat useful (one respondent each), while the ratings for the 2016 respondents

were split evenly between very useful, somewhat useful, neither useful nor unuseful, and not at all useful

(one respondent each). One 2018 trade ally recommended NIPSCO improve program training by offering

two annual kickoff meetings instead of one: one in-person meeting for new trade allies and one webinar

that only covers program changes for existing trade allies.

When asked what technology-specific training trade allies would be interested in attending, two of 13

respondents were not interested in training, but five were interested in energy management systems,

three in lighting, two in compressed air, one in steam trap repair and replacement, and one in solar

(multiple responses allowed).

When asked about the ease of completing the program’s incentive application, two-thirds of survey

respondents (10 of 16) stated that they run into application challenges seldom or never, which is similar

to 2017 results (nine of 13). One respondent encountered challenges all the time and explained that it


was difficult to reach program staff to ask questions and the approval process took too long. Two

respondents who encountered challenges frequently said the process takes too long (both respondents),

there were too many required supporting documents, and they were uncertain about the actual versus the

marketed rebate amount.

The evaluation team asked Custom program trade allies about their satisfaction working with program

implementer staff (Figure 106). Similar to results in 2016 and 2017, 10 of 15 were very satisfied and four

were somewhat satisfied with the support received from program representatives. Just one 2018

respondent reported being very dissatisfied. This respondent mentioned difficulties reaching a program

representative, but also said that the representative was very helpful once reached.

Figure 106. Trade Ally Satisfaction with Custom Program Representatives, 2016-2018

Source: Trade Ally Surveys. Question: “In regard to your business Prescriptive and Custom programs’

activities, how satisfied are you with the program support you received from the program representatives

(Lockheed Martin Energy)?”

Of the trade allies who worked with a NIPSCO Major Account Manager during a project (12 out of 15),

eight were very satisfied with the support they received, three were somewhat satisfied, and one was

neither satisfied nor dissatisfied. These results were similar to the 2017 results.

Trade allies were asked to provide suggestions for program improvement other than increasing rebates.

Six trade allies offered suggestions and said the following:

• Provide a list of customers who have opted out of NIPSCO’s DSM programs so that trade allies

can target those who remain in the program (one respondent).

• Increase customer awareness about the benefits of energy efficiency upgrades (one

respondent).

• Allow additional lighting types for de-lamping T8s (one respondent).

• Shift old lamp technologies from the Custom program to the Prescriptive program (one

respondent).


• Refer customers to top-performing trade allies (one respondent).

• Provide an overall project payback calculation, not a payback calculation for each line item (one

respondent).

Conclusions and Recommendations

Conclusion 1: The 2018 program delivered high satisfaction among participants and trade allies.

Most 2018 participants were very or somewhat satisfied with the program overall (95%) and with

NIPSCO as their utility provider (98%). Trade allies also reported reasonable levels of satisfaction in

2018, with 69% being very or somewhat satisfied with the program.

Conclusion 2: Customers value direct access to and responsiveness from program staff and would like

more one-on-one interactions with program staff.

The evaluation team recognizes NIPSCO’s and Lockheed Martin Energy’s concerted efforts to engage

customers directly. More participants interacted directly with utility staff in 2018 than in 2017, while

Lockheed Martin Energy decreased the frequency of its electronic outreach from 13 customer-facing

electronic communications in 2017 to five in 2018. NIPSCO staff play an important role in customer

awareness and their satisfaction with the utility, and customers truly value direct one-on-one support

from program and utility representatives, more so than the support received through their contractor.

Participants most commonly recommended improving the program through more interaction with

program and utility staff, particularly when encountering challenges with the program, and through a

simplified application process. NIPSCO and the implementer may increase program satisfaction by

improving processes for following up with businesses that struggle with project applications and

providing direct technical assistance and timely information during the application process.

Recommendations:

• Consider developing a process for responding to customers in a timely, personalized manner.

Part of the process can involve following up within 24 hours of the initial contact with any

customer planning a project or attempting to complete an application.

• Consider establishing a process for proactively following up with customers who have not

recently contacted program staff. Provide a courtesy email or phone call every few months to

help build or maintain relationships with businesses who have indicated interest in a project in

the past.

Conclusion 3: Trade allies seek and may benefit financially from a deeper understanding of the

program and its processes.

Four out of six trade ally respondents who promote the programs only sometimes or seldom—25% of all

respondents—said this was because they were not familiar enough with the details of the program or

who is eligible. Over half of 2018 surveyed trade allies (nine of 16) reported that 10% or fewer of their

projects received an incentive, and while seven trade allies said their sales have increased since

participating in the program, six said sales have not changed since their involvement with NIPSCO.

Because the evaluation team recruited trade ally survey respondents from a sample of those who


participated in the 2018 Custom program, this group is likely the most engaged with the program and its

processes compared to other trade allies in the service territory. This core group could benefit from a

thorough program review.

Recommendation:

• In addition to hosting kickoff meetings and overview webinars for trade allies, consider offering

thorough program process trainings quarterly for those familiar with the program but seeking a

full understanding of the program’s nuances. To keep content relevant to each trade ally

type, rotate the training topics to cover lighting, refrigeration, compressed air, HVAC, and

hydronic applications using actual project examples. Include content that encourages trade

allies to seek out and upsell equipment that is eligible for program incentives. These sessions

should improve trade ally confidence, increasing the number of program-eligible projects and

the program’s economic value to trade allies.

Conclusion 4: The implementer started to incorporate CFs into 2018 demand calculations but not

comprehensively.

Because CFs were not applied, peak demand savings were claimed for exterior lighting measures. Also,

the peak demand savings for interior lighting measures were overstated because, in many cases, the CF

of the installed lights was not applied.

Recommendations:

• The implementer should follow the savings calculations in the Indiana TRM (v2.2) for all

measures where appropriate. The implementer should ensure that the CFs are clearly laid out

and that the calculations are presented in an easy-to-follow format.

• Ensure that the CF=0 for all exterior lighting measures (unless they are 24/7 fixtures, where the

CF=1 regardless of interior or exterior location).
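The two recommendations above can be sketched as code. This is a simplified illustration of the TRM-style lighting algorithm, assuming hypothetical fixture wattages, hours, and factor values; in practice the CF and WHF values come from the Indiana TRM (v2.2) lookups for the building type:

```python
# Illustrative sketch (assumed values, not program data) of TRM-style
# commercial lighting savings with a coincidence factor (CF) and an
# electric waste heat factor (WHFe).

def coincidence_factor(is_exterior, runs_24_7, interior_cf):
    """Per the recommendation above: exterior fixtures get CF = 0 unless
    they run 24/7, in which case CF = 1 regardless of location.
    interior_cf stands in for the TRM lookup value."""
    if runs_24_7:
        return 1.0
    return 0.0 if is_exterior else interior_cf

def lighting_savings(watts_base, watts_ee, hours, cf, whf_e=1.0):
    """Return (annual kWh savings, peak coincident kW savings).
    whf_e acts as a penalty below 1.0 at electrically heated sites."""
    delta_kw = (watts_base - watts_ee) / 1000.0
    return delta_kw * hours * whf_e, delta_kw * cf

# Interior retrofit: 400 W -> 150 W, 4,000 annual hours, electric heat
kwh, kw = lighting_savings(400, 150, 4000,
                           cf=coincidence_factor(False, False, 0.8),
                           whf_e=0.95)

# An exterior dusk-to-dawn fixture claims no peak demand savings
assert coincidence_factor(is_exterior=True, runs_24_7=False, interior_cf=0.8) == 0.0
```

Separating the CF rule into its own function makes the exterior/24-7 logic auditable, which addresses the concern that CFs were applied inconsistently across measures.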

Conclusion 5: Lack of consistency in ex ante calculations, savings methodology, and supporting

documentation is a potential contributor to overstated or understated realized savings.

In many instances, the evaluation team failed to locate proper supporting documentation for project

measures in the sample in LM Captures. This included calculations that were missing or showed savings

values that did not match the claimed savings in the program application or tracking data. In these

cases, the evaluation team had to perform a secondary request to the implementer to provide the

correct documentation. Even so, in some instances, the documentation received from the secondary

request still did not allow the evaluation team enough detail to fully and rigorously evaluate impacts. In

such instances, the evaluation team passed the measures through at 100% realization rates due to a lack

of justification for adjustments. Since the implementer does not require documentation of assumptions

and inputs, the evaluation team could not base corrections with engineering rigor. However, the

evaluation team recognizes that acceptance of these ex ante savings may be overstating the program’s

natural gas savings. Therefore, the evaluation team may consider a different treatment of savings in

situations where large energy saving projects (within the top 10% of the fuel-specific ex ante savings)

continue to lack proper engineering assumptions and model inputs.

Page 290: 2018 DSM Portfolio Evaluation Report - NIPSCO

283

Additionally, some measures lacked consistency in savings methodology. For the direct-fired heating

measures, the savings methodology varied across four separate approaches even though one approach

seemed to most accurately estimate therm savings (see New Construction chapter for further discussion

regarding this issue).

Recommendations:

• For direct-fired heating measures that encompass efficiency, destratification, and setback

savings, utilize the building load model present in the file, LM Building Usage & Setback+Destrat

Savings Calc.xlsx or similar spreadsheet calculator. Avoid read-only PDFs.

• Include a general narrative of project savings and nuances of assumptions in the project

application. For example, include reasons for ignoring destratification or setback savings for

natural gas HVAC measures when similar measures tend to include them.

• Central Heating equipment measure savings should primarily be calculated using the Indiana

TRM (v2.2). C&I custom calculations may be necessary in cases where program controls or

technology is integrated into the measure.

• Ensure the most up-to-date savings calculation source is uploaded to LM Captures at the same

time as the final application workbook. This will ensure the correct supporting documentation is

present with the final savings being claimed. Employ documentation similar to the spreadsheet

file referenced in the first bullet point, which lays out all assumptions, inputs, and formulae used

in the calculation.

• If energy models are used, provide the actual model file in LM Captures, or all applicable inputs required to build a similar model.

Conclusion 6: The program implementer’s quality assurance and quality control (QA/QC) process

encourages high realization rates.

The implementer conducts a final inspection prior to rebate payment for at least 10% of completed

custom projects and for every project with incentives exceeding $10,000. While the evaluation team

needed to adjust some of the implementer’s calculations, this rigorous QA/QC protocol likely

contributed to the program’s strong realization rates (between 97% and 98% for all fuel-level energy

and demand metrics).


C&I New Construction Incentive Program

The New Construction program is designed to encourage energy-efficient new construction of C&I

facilities within NIPSCO’s service territory. This program offers financial incentives to building owners,

designers, and architects for projects that exceed the current statewide building code requirements for

energy. Trade allies primarily drive customer outreach to generate program-qualified projects, with

support from NIPSCO and the program implementer, Lockheed Martin Energy.

Program Performance

Table 184 summarizes savings for the 2018 New Construction program, including program gross savings

goals. Program performance exceeded expectations for electric and natural gas energy and nearly met demand goals, falling slightly short due to the low peak demand reduction realization rate. Ex post gross savings achieved 139% of the electric energy goal, 95% of the demand savings goal, and 122% of the natural gas energy goal.

Table 184. 2018 New Construction Program Savings Summary

| Metric | Gross Savings Goal | Ex Ante | Audited | Verified | Ex Post Gross | Ex Post Net | Gross Goal Achievement |
|---|---|---|---|---|---|---|---|
| Electric Energy Savings (kWh/yr) | 12,084,164 | 15,439,881 | 15,441,219 | 15,441,219 | 16,787,567 | 8,729,535 | 139% |
| Peak Demand Reduction (kW) | 2,469 | 2,575 | 2,606 | 2,606 | 2,341 | 1,217 | 95% |
| Natural Gas Energy Savings (therms/yr) | 294,687 | 366,300 | 367,051 | 367,051 | 360,639 | 187,532 | 122% |

Audited and verified electric and demand savings aligned well with the program’s ex ante estimates.

Ex post gross savings varied for each metric based on similar findings from 2017. Similar to last year, the

implementer continued to overstate ex ante peak coincident demand reduction by inconsistently

applying CFs for New Construction lighting projects. However, ex post adjustments to demand reduction

were not as significant as they were in 2017 since the implementer did not omit CFs for all lighting

projects (which was the case in 2017). The implementer started to incorporate CFs into 2018 demand

calculations but not all projects were corrected. The decrease in ex post gross demand reduction reflects

the evaluation team’s adjustment to these projects. Again, similar to the other C&I projects in both 2017

and 2018, the incorporation of WHFs for interior lighting projects mainly drove the electric energy

realization rate to surpass 100%.

For the natural gas program, audited, verified, and ex post therm savings aligned well with the

program’s ex ante estimate. The slight reduction in ex post therm savings came from adjustments to the

claimed methodology and baseline assumptions for several heating equipment replacement projects

that were inconsistent with other claimed savings for similar projects. The reasons for these savings

discrepancies are covered in detail in the Impact Evaluation section below.

Table 185 shows the ex post gross and net energy adjustment factors resulting from the evaluation.


Table 185. 2018 New Construction Program Adjustment Factors

| Metric | Realization Rate (%)a | Precision at 90% Confidence (Fuel-Level Energy Realization Rate) | Freeridership (%) | Spillover (%) | NTG (%)b |
|---|---|---|---|---|---|
| Electric Energy Savings (kWh/yr) | 109% | 3.7% | 48% | 0% | 52% |
| Peak Demand Reduction (kW) | 91% | | 48% | 0% | 52% |
| Natural Gas Energy Savings (therms/yr) | 98% | 6.7% | 48% | 0% | 52% |

a Realization rate is defined as ex post gross savings divided by ex ante savings.
b NTG is defined as ex post net savings divided by ex post gross savings.
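The two footnote definitions can be checked directly against the program-level totals reported in Table 184:

```python
# Program-level electric energy totals from Table 184 (kWh/yr)
ex_ante = 15_439_881
ex_post_gross = 16_787_567
ex_post_net = 8_729_535

realization_rate = ex_post_gross / ex_ante   # footnote a: ex post gross / ex ante
ntg = ex_post_net / ex_post_gross            # footnote b: ex post net / ex post gross

print(round(realization_rate * 100))  # 109
print(round(ntg * 100))               # 52
```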

Table 186 summarizes the budget and 2018 expenditures for the 2018 New Construction program.

Table 186. 2018 New Construction Program Expenditures

| Fuel | Program Budget | Program Expenditures | Budget Spent (%) |
|---|---|---|---|
| Electric | $1,353,864 | $1,555,177 | 115% |
| Natural Gas | $319,841 | $379,741 | 119% |

Research Questions

The evaluation team conducted qualitative and quantitative research activities to answer the following

research questions for the New Construction program:

• How do participants hear about the program, and what are the best ways for the program to

reach potential participants?

• What are the barriers and challenges to energy efficiency and program participation?

• What are the primary reasons for participation?

• Are participants satisfied with the program and its components?

• How could NIPSCO improve the participants’ experience with the program?

• What are the freeridership and participant spillover estimates?

• How do trade allies become aware of the program?

• What are the key factors influencing a trade ally’s decision to register with the program?

• Are there any barriers to program participation?

• How are trade allies marketing the program?

• How satisfied are trade allies with the program and NIPSCO as an energy service provider?

• How does a C&I New Construction program project differ from a regular project?

• What barriers do trade allies face when building energy-efficient buildings?

• What are participating trade allies’ company characteristics?


• Are tracking database savings sourced with proper project documentation?

• Do claimed savings algorithms align with the Indiana TRM (v2.2) or other appropriate

secondary sources?

Impact Evaluation

To perform an impact analysis of the 2018 New Construction program, the evaluation team selected a

representative sample of measures to evaluate, and then extrapolated findings to the larger program

population. This process used a PPS sampling approach, a stratified sampling technique that uses a savings parameter (here, total energy savings in MMBtu) as a weighting factor. Out of 155 project

measures, the New Construction evaluation sample resulted in 61 measures receiving either an

engineering review or an on-site M&V analysis.
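A minimal version of the PPS draw can be sketched as follows. The project IDs and MMBtu weights are hypothetical, and this simplified version samples with replacement; a production design would typically use systematic PPS with certainty selection of the largest projects.

```python
import random

def pps_sample(savings_mmbtu, n, seed=0):
    """Draw a probability-proportional-to-size (PPS) sample of measure IDs,
    weighting each measure by its total energy savings in MMBtu."""
    rng = random.Random(seed)
    ids = list(savings_mmbtu)
    weights = [savings_mmbtu[m] for m in ids]
    return rng.choices(ids, weights=weights, k=n)

# Hypothetical population: large projects dominate the draw, as in the program.
savings = {"proj_A": 5000, "proj_B": 300, "proj_C": 50}
print(pps_sample(savings, n=4))
```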

Table 187 details the completed impact evaluation activities for this program year, along with sample

size targets designed to yield sufficient precision. The measure number represents the total number of

individual upgrades that received funding through the program in 2018.

Table 187. 2018 New Construction Program Impact Evaluation Activities

| Gross Populationa Unit Count | Gross Populationa Measure Count | Sample Electric Measure Count (Actual) | Sample Electric Measure Count (Target) | Sample Natural Gas Measure Count (Actual) | Sample Natural Gas Measure Count (Target) | Evaluation Sample Share of Program Savings |
|---|---|---|---|---|---|---|
| 67,243 | 155 | 37 | 36 | 24 | 24 | 81% |

a Population is based on queried data extracts from the implementer's database, LM Captures, and may vary slightly with NIPSCO's scorecard data source.

Each measure is further quantified by the number of units of the specified technology installed at the

measure level. However, upon review of project documentation, the evaluation team found that the

implementer’s definition of “unit” was variable and did not consistently describe the quantity of the

installed measure. For example, in the other C&I programs (Custom, SBDI, and Prescriptive), the number

of units for lighting measures represents the number of lighting fixtures installed. This is not the case for

all C&I New Construction lighting measures, where the unit quantity represented the total wattage of

lighting installed for measures incorporating lighting power density reduction methodology.

Additionally, natural gas heating measures generally defined “unit” as the total 1,000 Btus per hour of

the installed equipment (consistent with other C&I programs), but in some cases used the number of

individual heating systems installed.

The evaluation team used findings from 2015, 2016, and 2017 evaluations to inform sampling targets for

2018. Its understanding of previous years’ savings variability (error ratio) of New Construction measures

allowed the evaluation sample to efficiently target natural gas and electric measures in 2018, achieving

90/10 confidence and precision for each fuel’s realization rates, while representing 81% of the total

program savings. The actual sample surpassed its targets because the team ensured a representative measure mix in the sample and because several projects contained multiple measures.

Figure 107 illustrates the distribution of the 2018 New Construction project population by energy savings

(MMBtu), fuel type, and measure category, as labeled in the tracking database. In 2018, New


Construction projects accounted for 15,439,881 kWh and 366,300 therms of ex ante savings (89,311

MMBtu savings combined).

Figure 107. 2018 New Construction Program Measure Type Distribution by Ex Ante MMBtu Savings

Similar to 2017, electric lighting projects were the single largest measure category in the 2018 program.

Again, HVAC projects accounted for the largest portion of natural gas savings, largely consisting of

direct-fired heating measures in warehouse and production spaces that captured savings by a

combination of efficiency improvement, destratification, and temperature setbacks.

Since the majority of the program’s energy savings were concentrated in a small number of the large

energy-saving projects (56% of the ex ante electric savings were achieved by just 10 project measures,

and the top 10 natural gas project measures encompassed 69% of the program’s ex ante therm savings),

the PPS sampling approach was able to represent 81% of the population’s total combined savings.

Figure 108 shows the sample’s composition for on-site M&V efforts and engineering reviews, compared

to the population.


Figure 108. New Construction Program Summary of 90/10 Sample Analysis by Ex Ante Savings

Since most of the large energy-saving project measures were lighting and direct-fired natural gas heating

measures, the PPS sampling approach helped to create a sample with a similar distribution of savings

when compared to the population ex ante savings. Figure 109 shows the sample distribution, based on

measure categories from the tracking database. Electric HVAC measures consisted of air conditioner and

chiller equipment measures, and “other” measures included various compressed air, energy management

system, and process measures.

Figure 109. New Construction Program 90/10 Sample Distribution by Measure and Fuel Type

Project documentation was thorough for the measures reviewed. For each audited measure, the

evaluation team reviewed the invoices and technical specifications supplied by Lockheed Martin Energy.

The evaluation team then used the 2015 Indiana TRM (v2.2) to calculate savings and compared the

evaluated results to NIPSCO’s program tracking database. During these reviews of the reported savings


calculation methodologies, assumptions, and results, the evaluation team checked to make certain of

the following:

• Calculation assumptions matched equipment specifications and supporting project

documentation

• Reported savings algorithms were correct and aligned with the 2015 Indiana TRM (v2.2) for

applicable measures

Audited and Verified Savings

The evaluation team accessed project documentation in the implementer’s database, LM Captures, to

audit savings for the evaluation sample of New Construction measures in 2018. Project documentation

required for the audit included the assumptions and inputs involved in the savings calculations that

were either calculated or approved by the implementer. Other documentation required for the audit

included the final approved project application, appropriate equipment specification sheets provided by

the manufacturer, and invoices or proof of installation of equipment by trade allies. In many instances,

the evaluation team failed to locate proper supporting documentation for project measures in the

sample in LM Captures. This included calculations that were missing or showed savings values that did

not match the claimed savings in the program application or tracking data. In these cases, the evaluation

team had to perform a secondary request to the implementer to provide the correct documentation.

As shown in Table 184, the evaluation team made only minor adjustments in the evaluation’s audit

phase. For lighting measures, the claimed savings used algorithms with site-specific inputs. The

evaluation team based any adjustments in the audit phase largely on discrepancies with respect to these

inputs, such as efficient or baseline wattages and fixture quantities. The evaluation team aligned these

inputs with supporting documentation (specification sheets and invoices) when they differed from the

calculation workbooks (program application files). In general, the lighting documentation aligned well

with claimed calculations presented in the application files, so audited energy savings only varied due to

small rounding errors between audited and ex ante estimates. Additionally, one chiller upgrade measure

claimed higher peak coincident demand savings in the project application than it did in the tracking

database. Since the team confirmed that the value in the program application aligned with appropriate

Indiana TRM (v2.2) assumptions, audited demand savings for the program were slightly higher than

ex ante.

After requesting updated project documentation for a batch of project measures, there remained two

natural gas heating projects where the savings calculation in the supporting documentation still varied

slightly from what was in the tracking database and project application (by about 3%-8%). In these

cases, the evaluation team confirmed the supporting savings calculations were correct and aligned

audited savings with the supporting calculation, which slightly increased the audited savings for natural

gas at the program level.

In-Service Rate

The evaluation team performed site visits to verify installations of 12 sampled measures in 2018 and to

calculate ISR for the New Construction program. As mentioned previously, the unit quantity for New


Construction projects in 2018 were not consistently defined: lighting measure units described either the number of fixtures installed or the total wattage installed, while heating and motor equipment measures described either the total equipment capacity or the number of systems installed.

aligned the definitions for each site visited. Since the team verified that claimed unit counts matched

what existed at each site, it applied a 100% ISR to the program’s audited savings to determine verified

savings in 2018.
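The ISR step reduces to a single multiplication; a minimal sketch:

```python
def verified_savings(audited_savings, isr=1.0):
    """Verified savings = audited savings x in-service rate (ISR)."""
    return audited_savings * isr

# With the 100% ISR found on site, verified savings equal audited savings.
print(verified_savings(15_441_219, isr=1.0))  # 15441219.0
```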

Ex Post Savings

The evaluation team adjusted 2018 lighting measure savings in the ex post analysis based upon

these data:

• Uncovering of any discrepancies in fixture quantity, equipment capacity, or wattage during site

visits or discussions with business owners.

• Utilization of annual operating hours metered during site visits, provided by business owners, or

recorded from scheduling systems.

• Inclusion of electric WHFs and adjustment of peak summer CFs consistent with the Indiana

TRM (v2.2).

• Review of any potential updates to sources listed in the implementer’s deemed savings tables.

Adjustments based on the above data resulted in only minor discrepancies in electric energy savings, but

resulted in larger discrepancies in peak demand reduction. Although the evaluation team confirmed that

the implementer began applying CFs based on building type to a portion of lighting projects in 2018 (based on evaluation recommendations in 2016 and 2017), the evaluation team determined that the implementer incorrectly applied CFs to exterior lighting measures and omitted them for a number of interior lighting

projects. Thus, 2018 ex ante peak coincident demand reduction continued to be overstated due to

inconsistent application of CFs for New Construction lighting projects. However, ex post adjustments to

demand reduction were not as significant as they were in 2017 since the implementer did not omit CFs

for all lighting projects (which was the case in 2017).

The evaluation team removed all demand savings from exterior lighting as part of the ex post analysis.

The evaluation team also applied CFs to those interior lighting measures that the implementer

overlooked. This adjustment serves as the main contributor to the realization rate for the program’s

peak demand reduction (91%).

As noted in previous evaluations, the implementer disregarded WHFs in calculating electric lighting

savings. To align with methodologies that originated during the Indiana Statewide Core evaluation, the

evaluation team applied WHFs to energy savings for interior lighting measures in cooled spaces. Cooling

systems were verified for on-site measures and from measure application workbooks for engineering

reviews. In these cases, the WHF contributed additional electric energy savings to lighting measures,

which was the primary factor in the electric energy realization rate achieving 109% in 2018.
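The WHF enters the savings algorithm as a multiplier on the lighting energy savings. A minimal sketch, with illustrative operating hours and WHF value (not the Indiana TRM figure):

```python
def lighting_kwh_savings(delta_kw, annual_hours, whf=1.0):
    """Annual lighting energy savings with an optional waste-heat factor (WHF).

    For interior fixtures in mechanically cooled spaces, reduced lighting
    waste heat also reduces cooling load, so a WHF > 1 credits the
    interactive cooling savings (1.08 below is an illustrative value).
    """
    return delta_kw * annual_hours * whf

base = lighting_kwh_savings(5.0, 4000)              # uncooled space: no WHF
cooled = lighting_kwh_savings(5.0, 4000, whf=1.08)  # cooled interior space
print(base, round(cooled))  # 20000.0 21600
```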

On the natural gas side of the program, the predominant HVAC measure in 2018 consisted of direct-

fired natural gas heating measures installed in production or warehouse spaces. These measures


incorporated the same make and model of equipment and achieved savings by a combination of

metrics:

1. Increase in efficiency over federal baseline (80%)

2. Destratification of vertical temperature differential in the conditioned space

3. Temperature setback schedules

The evaluation team found that the implementer used a variety of approaches in calculating savings for

these measures, outlined in Table 188.

Table 188. Ex Ante Savings Approaches for Direct-Fired Natural Gas Heating HVAC Measures

| Approach | Savings Methodology | Metrics Included | Notes |
|---|---|---|---|
| A | Indiana TRM (v2.2) algorithm for natural gas furnaces | 1 | Since the Indiana TRM (v2.2) does not provide estimates or algorithms for commercial destratification or T-stat savings, this approach ignores any savings that may have been achieved from metrics 2 and 3. |
| B.1 | Heating load model provided by trade ally | 1 & 2 | PDF/read-only source documents that the evaluation team was able to re-create as a separate model. Temperature setback was sometimes incorporated by assuming a 10% reduction in efficient-case consumption. |
| B.2 | Heating load model provided by trade ally | 1, 2, & 3 | PDF/read-only source document that the evaluation team was not able to re-create as a separate model. This methodology was primarily used for four large identical facilities built at an RV manufacturing facility. |
| C | Heating load model performed by implementer | 1, 2, & 3 | This Excel file (titled LM Building Usage & Setback+Destrat Savings Calc) contained an organized summary of all assumptions and results. There was also an instance where the implementer used this approach but only claimed the setback savings for the measure. |
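The three savings metrics can be combined in a single engineering sketch. The structure below loosely mirrors approach C, with destratification modeled as a heating-load reduction and the setback modeled as a fraction of efficient-case consumption (as in approach B.1); the load, efficiencies, and percentages are illustrative assumptions, not program inputs.

```python
def direct_fired_therm_savings(load_therms, eff_base=0.80, eff_installed=0.92,
                               destrat_load_reduction=0.10, setback_fraction=0.10):
    """Illustrative combination of the three metrics:
    (1) efficiency above the 80% federal baseline,
    (2) destratification, modeled as a reduction in heating load, and
    (3) temperature setbacks, modeled as a fraction of efficient-case use.
    All default values here are assumptions for illustration only.
    """
    base_consumption = load_therms / eff_base
    efficient_consumption = load_therms * (1 - destrat_load_reduction) / eff_installed
    efficient_consumption *= (1 - setback_fraction)
    return base_consumption - efficient_consumption

# A hypothetical 10,000-therm annual heating load
print(round(direct_fired_therm_savings(10_000)))  # 3696
```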

It was not always clear which approach the implementer used to report final claimed savings, which

required the evaluation team to make secondary requests to the implementer for correct

documentation. Once the implementer provided additional guidance and notes to the evaluation team

to illustrate which approach was used, the evaluation team aligned ex post methodology for these

natural gas measures with the ex ante approach. However, the evaluation team reduced some ex post

savings in these approaches when baseline efficiency assumptions were lower than the federal

minimum standard of 80%. Such assumptions may be warranted in the Custom program, but the New

Construction projects should never have baseline efficiencies less than the minimum code standards.

Furthermore, the evaluation team found that approach C provided the most comprehensive

understanding of the source of savings. For the four large projects that followed approach B.2 (each

claiming 17,847 ex ante therm savings per year), the evaluation team incorporated the trade allies’

model assumptions into the implementer’s model from approach C to better evaluate the savings as an

engineering review. The per-measure savings in this case dropped slightly from 17,847 therms to 17,347

therms, which the evaluation team applied as ex post savings.

One additional reason the realization rate dropped below 100% for natural gas savings was that the evaluation team updated the savings methodology for heating equipment installed in schools. Several


projects used the Commercial Buildings Energy Consumption Survey’s natural gas usage estimates as the

baseline consumption of natural gas heating in their natural gas savings estimates. However, a more

accurate approach is to use the EFLH assumption in the Indiana TRM (v2.2) since it describes the natural

gas heating consumption for specific buildings in the distinctive weather region.

Baseline Conditions for Direct-Fired Heating Measures

A common 2018 New Construction measure type involved a particular warehouse/production floor

heating technology that provides increased efficiency through direct heating and destratification (also

known as air turnover). Assuming the same run times and zone temperature setpoints, the main factor

that makes these heaters more efficient than more common heating technologies is that it actively

reduces the stratification of air temperature in spaces with high ceilings, such as warehouses and

manufacturing spaces, by reducing the amount of heat lost through the building envelope. However,

many of the air turnover projects this year also claimed savings from the installed system’s ability to set

back its temperature setpoint during unoccupied hours, as compared to a baseline system that could

not. This baseline assumption came into question during other Indiana utility custom incentive program

evaluations.

In short, the evaluation team found that this assumption does not meet Indiana energy code requirements, namely ASHRAE 90.1-2007 Section 6.4.3.3.2, which states, “Heating systems shall be

equipped with controls capable of and configured to automatically restart and temporarily operate the

system as required to maintain zone temperatures above an adjustable heating set point at least 10°F

below the occupied heating set point.”

During previous evaluations for another Indiana natural gas utility company, the evaluation team highlighted this discrepancy with the program administrators. The administrators relayed anecdotal reports from several trade allies, which install both these systems and less efficient systems not eligible for incentives, that almost all of the incentive-ineligible systems installed in new construction projects do not meet the ASHRAE 90.1-2007 requirement described above.

In previous evaluations for the other Indiana utility, the evaluation team recommended that further

research be performed to determine the appropriate baseline for new construction projects of this type.

Based on this recommendation, in October 2018 the program administrators performed a field

investigation to inform this discussion. Their findings showed that none of the “baseline” style systems

examined had thermostat controllers capable of unoccupied setback scheduling.

Based on these findings, the evaluation team, utility, and the program administrator agreed that this

baseline assumption of no temperature setback control capabilities was acceptable for the program year

under evaluation. However, setback savings should not continue to be claimed in successive years, given the 2010 Indiana Energy Conservation Code (IECC), which is based on ASHRAE 90.1-2007.

Based on this, the evaluation team accepts this baseline assumption for the relevant 2018 C&I New

Construction projects. However, the evaluation team also recommends that this assumption be reassessed for acceptability in NIPSCO territory in 2021.


Realization Rates

Table 189 lists realization rates for the New Construction program.

Table 189. New Construction Program Realization Rates

| ISR | Verified/Ex Ante: Electric Energy (kWh) | Verified/Ex Ante: Peak Demand (kW) | Verified/Ex Ante: Natural Gas (Therms) | Realization Rate for Evaluation Samplea: Electric Energy (kWh) | Realization Rate for Evaluation Samplea: Peak Demand (kW) | Realization Rate for Evaluation Samplea: Natural Gas (Therms) |
|---|---|---|---|---|---|---|
| 100.0% | 100.0% | 101% | 100% | 109% | 91% | 98% |

a Realization rate is defined as ex post gross savings divided by ex ante savings.

Table 190 lists measure and unit counts within the sample, and aggregated (total energy in MMBtu)

ex post realization rates for each measure type. The Ex Post Savings section outlines the drivers of

realization rate for each measure type.

Table 190. New Construction Program Evaluation Sample Results by Measure Type

| Fuel Type | Measure Type | Evaluation Sample Measure Count | Evaluation Sample Unit Count | Demand Ex Post Realization Ratea | Energy Ex Post Realization Ratea |
|---|---|---|---|---|---|
| Electric | Building Redesign | 2 | 300 | 100% | 100% |
| Electric | HVAC | 1 | 2,290 | 135% | 100% |
| Electric | Lighting | 31 | 2,702 | 89% | 110% |
| Electric | Motors | 3 | 3 | 100% | 100% |
| Natural Gas | Building Redesign | 1 | 160 | N/A | 100% |
| Natural Gas | HVAC | 22 | 50,998 | N/A | 98% |
| Natural Gas | Process | 1 | 1 | N/A | 100% |

a Realization rate is defined as ex post gross savings divided by ex ante savings.

It is important to note that the realization rates in Table 190 were not applied to each measure type in

the population to determine ex post gross savings. Table 190 is presented only to provide a further outline of findings within the evaluation sample. To calculate the ex post gross impacts, the evaluation team applied the metric-level realization rates resulting from the sample to the population ex ante energy and demand savings (shown in Table 191).

Table 191. Application of 2018 New Construction Program Realization Rates

| Metric | Population Ex Ante | Realization Rate (From Evaluation Sample) | Population Ex Post Grossa |
|---|---|---|---|
| Electric Energy Savings (kWh/yr) | 15,439,881 | 109% | 16,787,567 |
| Peak Demand Reduction (kW) | 2,575 | 91% | 2,341 |
| Natural Gas Energy Savings (therms/yr) | 366,300 | 98% | 360,639 |

a Totals may not calculate properly due to rounding.


Ex Post Net Savings

The evaluation team calculated freeridership and participant spillover using the methods described in

Appendix B. Self-Report Net-to-Gross Evaluation Methodology using the survey data collected from 2018

participants. As shown in Table 192, the evaluation team estimated a 52% NTG for the program.

Table 192. 2018 New Construction Program Net-to-Gross Results

| Program Category | Freeridership (%)a | Participant Spillover (%) | NTG (%)a |
|---|---|---|---|
| New Construction Program | 48% | 0% | 52% |

a Weighted by survey sample ex post gross program MMBtu savings.

Freeridership

To determine freeridership, the evaluation team asked nine participants questions focused on whether they would have installed equipment to the same level of efficiency, at the same time, and in the same amount in the absence of the program. Table 193 shows the results, an overall freeridership score of 48%.

Table 193. 2018 New Construction Program Freeridership Results

| Program Category | Responses (n) | Freeridershipa |
|---|---|---|
| New Construction Program | 9 | 48% |

a The freeridership score is weighted by the survey sample ex post gross program MMBtu savings.

By combining the previously used intention methodology with the influence methodology, the

evaluation team produced a freeridership score for the program by averaging the savings weighted

intention and influence freeridership scores. Refer to Appendix B. Self-Report Net-to-Gross Evaluation

Methodology for further details on the intention and influence questions and scoring methodologies.

Intention Freeridership

The evaluation team estimated intention freeridership scores for all participants based on their

responses to the intention-focused freeridership questions. As shown in Table 194, the intention

freeridership score for the New Construction program is 84%.

Table 194. 2018 New Construction Program Intention Freeridership Results

| Program Category | Responses (n) | Intention Freeridership Scorea |
|---|---|---|
| New Construction Program | 9 | 84% |

a The freeridership score is weighted by the survey sample ex post gross program MMBtu savings.


Figure 110 shows the distribution of individual intention freeridership scores.

Figure 110. 2018 New Construction Program Distribution of Intention Freeridership Scores

Source: Participant Survey. Questions: G14 to G22 and G24 are used to estimate an intention freeridership

score. See Table 243 in the Appendix B. Self-Report Net-to-Gross Evaluation Methodology for the full text of

the questions, response options, and scoring treatments used to estimate intention freeridership scores.

See Table 244 for the unique Prescriptive program participant response combinations resulting from

intention freeridership questions, along with intention freeridership scores assigned to each combination,

and the number of responses for each combination.

Influence Freeridership

The evaluation team assessed influence freeridership by asking participants how important various

program elements were in their purchase decision-making process. Table 195 shows program elements

participants rated for importance, along with a count and average rating for each factor.

Table 195. 2018 New Construction Program Influence Freeridership Responses

| Influence Rating | Influence Score | NIPSCO Incentive | Information from NIPSCO on Energy-Saving Opportunities | Recommendation from Contractor or Vendor | Participation in a NIPSCO Efficiency Program |
| 1 (not at all important) | 100% | 2 | 3 | 1 | 3 |
| 2 | 75% | 4 | 1 | 0 | 0 |
| 3 | 25% | 0 | 1 | 1 | 0 |
| 4 (very important) | 0% | 3 | 2 | 6 | 2 |
| Not applicable | 50% | 0 | 2 | 1 | 4 |
| Average rating | | 2.4 | 2.3 | 3.5 | 2.2 |

The evaluation team determined each respondent’s influence freeridership rate for each measure category using the maximum rating provided for any factor in Table 195. As shown in Table 196, the respondents’ maximum influence ratings ranged from 1 (not at all important) to 4 (very important). A maximum score of 1 means the customer ranked all factors from Table 195 as not at all important, while a maximum score of 4 means the customer ranked at least one factor as very important. Counts refer to the number of respondents whose maximum influence rating fell in each response option.

Table 196. 2018 New Construction Program Influence Freeridership Score

| Maximum Influence Rating | Influence Score | Count | Total Survey Sample Ex Post MMBtu Savings | Influence Score MMBtu Savings |
| 1 (not important) | 100% | 1 | 813 | 813 |
| 2 | 75% | 0 | 0 | 0 |
| 3 | 25% | 1 | 600 | 150 |
| 4 (very important) | 0% | 7 | 6,067 | 0 |
| Not applicable | 50% | 0 | 0 | 0 |

Average Maximum Influence Rating (Simple Average): 3.6
Average Influence Score (Weighted by Ex Post Savings): 13%

The average influence score of 13% for the 2018 New Construction program weights respondents by their ex post MMBtu program savings.
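The influence-score calculation just described can be sketched in a few lines (an illustrative recalculation from the Table 196 values, not the evaluation team's actual tooling; the even split of the 6,067 MMBtu across the seven respondents who rated a factor 4 is an assumption that does not affect the result, since their influence score is 0%):

```python
# Influence score assigned to each respondent's maximum influence rating,
# per the scoring treatment described above.
SCORE_BY_MAX_RATING = {1: 1.00, 2: 0.75, 3: 0.25, 4: 0.00, "NA": 0.50}

# (maximum influence rating, ex post MMBtu savings) per respondent, from Table 196
respondents = [(1, 813), (3, 600)] + [(4, 6067 / 7)] * 7

weighted = sum(SCORE_BY_MAX_RATING[r] * mmbtu for r, mmbtu in respondents)
total = sum(mmbtu for _, mmbtu in respondents)
influence_score = weighted / total  # (813 + 150 + 0) / 7,480, about 13%
```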

Final Freeridership

Next, the evaluation team calculated the arithmetic mean of the intention and influence freeridership components to estimate a final program freeridership of 48%.

Final Freeridership (48%) = [Intention FR Score (84%) + Influence FR Score (13%)] / 2

A higher freeridership score translates to more savings that are deducted from the gross savings

estimates. Table 197 lists the intention, influence, and freeridership scores for the program.

Table 197. 2018 New Construction Program Freeridership Score

| Responses (n) | Intention Score | Influence Score | Freeridership Score |
| 9 | 84% | 13% | 48% |

Freeridership results rely completely on self-reported responses and therefore can change considerably

from one year to the next, especially when sample sizes are small and when there is the potential for

large variations in the program energy savings of respondents, which has been the case throughout the

New Construction Program.


Participant Spillover

As detailed in Appendix B. Self-Report Net-to-Gross Evaluation Methodology, the evaluation team

estimated participant spillover [65] measure savings using specific information about participants, as

determined through the evaluation, and employing the Indiana TRM (v2.2) as a baseline reference. The

evaluation team planned to estimate the program participant spillover percentage by dividing the sum

of additional spillover savings (as reported by survey respondents) by the total gross savings achieved by

all program respondents. However, none of the participants attributed their program participation as an

influence on additional energy-efficient purchases. As such, the evaluation team estimated 0%

participant spillover for the 2018 New Construction program (Table 198).

Table 198. 2018 New Construction Program Participant Spillover Results

| Spillover Savings (MMBtu) | Participant Program Savings (MMBtu) | Participant Spillover |
| 0 | 7,479 | 0% |

Table 199 summarizes the percentage of freeridership, participant spillover, and NTG for the program.

Table 199. 2018 New Construction Program Net-to-Gross Results

| Responses (n) | Freeridership [a] | Participant Spillover | NTG [a] |
| 9 | 48% | 0% | 52% |

[a] Weighted by survey sample ex post gross program MMBtu savings
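The chain from the component scores to the program NTG ratio can be sketched as follows (values from Tables 197 through 199; note that (84% + 13%)/2 is 48.5%, so the reported 48% likely reflects averaging the unrounded component scores):

```python
# Final freeridership: simple average of the two component scores
intention_fr = 0.84
influence_fr = 0.13
final_fr_unrounded = (intention_fr + influence_fr) / 2  # 48.5%; reported as 48%

reported_fr = 0.48     # freeridership as reported in Table 197
spillover = 0 / 7_479  # Table 198: no participant spillover savings were found

# Net-to-gross ratio: 1 minus freeridership plus participant spillover
ntg = 1 - reported_fr + spillover  # 0.52, matching Table 199
```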

Evaluated Net Savings Adjustments

Table 200 shows the savings, realization rate, and NTG for the program.

Table 200. New Construction Program Ex Post Net Savings

| Fuel Type | Ex Ante Gross Savings [a] | Ex Post Gross Savings | Realization Rate (%) | NTG (%) | Ex Post Net Savings |
| Electric Energy Savings (kWh/yr) | 15,439,881 | 16,787,567 | 108.7% | 52% | 8,729,535 |
| Peak Demand Reduction (kW) | 2,575 | 2,341 | 90.9% | 52% | 1,217 |
| Natural Gas Energy Savings (therms/yr) | 366,300 | 360,639 | 98.5% | 52% | 187,532 |

[a] Values presented at a measure level represent audited values since the scorecard provides only savings totals.

[65] Non-participant spillover evaluation activities were not conducted for the 2018 program year.
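The Table 200 columns follow the definitions used throughout the report (realization rate = ex post gross / ex ante gross; ex post net = ex post gross × NTG) and can be checked directly (a sketch; variable names are illustrative):

```python
# Verify the Table 200 arithmetic from its ex ante and ex post gross columns.
rows = {  # metric: (ex ante gross savings, ex post gross savings)
    "kWh/yr": (15_439_881, 16_787_567),
    "kW": (2_575, 2_341),
    "therms/yr": (366_300, 360_639),
}
ntg = 0.52  # program NTG from Table 199

for metric, (ex_ante, ex_post_gross) in rows.items():
    realization_rate = 100 * ex_post_gross / ex_ante  # percent
    ex_post_net = round(ex_post_gross * ntg)
    print(f"{metric}: RR {realization_rate:.1f}%, net {ex_post_net:,}")
```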


Process Evaluation

The evaluation team conducted interviews with NIPSCO and implementation staff, a materials review, and surveys with participants and trade allies.

Program Design and Delivery

The New Construction program provides financial incentives to C&I new construction facilities that exceed the energy efficiency requirements of statewide building codes. The following new construction projects are eligible for the program:

• New buildings

• Additions or expansions to existing buildings

• Gut rehabs for a change of purpose requiring replacement of all electrical and mechanical

equipment

Trade allies primarily drive customer outreach to generate program-qualified projects, with support

from NIPSCO and the program implementer. A network of trade allies recruits prospective customers

and recommends measures that are eligible for program incentives.

The program process starts with the customer (or a trade ally on their behalf) submitting an application

via email, mail, or fax. The implementer then confirms project eligibility. The implementer encourages customers to include a program representative in project initiation meetings so that the representative can introduce the program, explain the process for securing incentives, and identify qualifying measures the customer can incorporate into the design and construction of the facility.

After the application is complete, the implementer conducts a documentation review and an initial

engineering review. Multiple engineering reviews are required for projects that apply for an incentive

greater than $10,000. After communications, documentation, and engineering reviews are complete,

the customer and trade ally receive an incentive offer.

The implementer may conduct up to three post-installation inspections (depending on complexity) at a

project site to validate savings once the project is complete. Otherwise the incentive payment is

distributed after a project is approved for completeness and accuracy.

Changes from 2017 Design

NIPSCO staff and the program implementer reported that the only change to program design in 2018 was a slight increase in the non-lighting electric incentive, from $0.09 to $0.10 per kilowatt-hour. The increase was intended to generate more non-lighting electric projects, and it appears to have worked: non-lighting measures accounted for 21% of electric (kWh) savings in 2017 and 29% in 2018. The program implementer also reported that natural gas incentives would increase in 2019.


2017 Recommendation Status

In addition to research objectives laid out for the 2018 evaluation, the evaluation team followed up on

the 2017 evaluation recommendations. Table 201 lists the 2017 New Construction program evaluation

recommendations and NIPSCO’s progress toward addressing those recommendations to date.

Table 201. Status of 2017 New Construction Program Evaluation Recommendations

2017 Recommendation: The implementer should follow the savings calculations in the Indiana TRM (v2.2) for all measures where appropriate. The implementer should ensure that the use of WHFs and CFs is clearly laid out and that the calculations are presented in an easy-to-follow format.
Status: In Progress. The implementer began incorporating CFs in lighting demand savings in 2018, which more accurately reflects peak coincident demand reduction for the program. The implementer has reservations about incorporating WHFs into lighting calculations, concerned that a single WHF assumption would be applied to sites that vary significantly in size and layout (both of which greatly affect how a space is heated), and it did not apply WHFs in 2018. For this evaluation cycle, program stakeholders agreed to allow the implementer to continue this approach, with the evaluation team counting electric WHFs toward program savings in ex post gross savings. Historically, Indiana evaluations capture WHF credits only as part of evaluated electric savings; if the heating fuel is electric, the penalty comes in the form of reduced electric savings. The evaluation team will continue to apply this methodology, as it is generally not appropriate to adjust natural gas portfolios based on electric measure impacts (or vice versa): natural gas and electric measures represent two distinct ratepayer and stakeholder groups, often with different regulatory frameworks within a given jurisdiction.

2017 Recommendation: Work with contractors and customers to encourage more comprehensive projects beyond lighting that might generate greater savings opportunities.
Status: Completed in 2018. The implementer successfully recruited additional trade allies capable of completing non-lighting electric projects as well as natural gas projects. In 2018, non-lighting electric measures increased to 29% of electric (kWh) savings from 21% of savings in 2017. At the same time, claimed natural gas savings increased from 163,312 therms in 2017 to 366,300 therms in 2018.

2017 Recommendation: The implementer should ensure that all measures submitted through the New Construction program exceed energy code baseline levels.
Status: In Progress. Generally, ex ante methodologies aligned with energy code baselines. However, the evaluation team continued to find some instances for heating measures where the baseline assumption did not meet minimum federal efficiency levels.

2017 Recommendation: Continue to address partially completed 2016 evaluation recommendations to diversify the trade allies involved in the program and to provide case studies. Specifically, focus on trade allies involved with natural gas projects and consider developing natural gas project case studies.
Status: Completed. The implementer successfully recruited more trade allies for natural gas projects in 2018. The number of completed natural gas projects increased from 10 in 2017 to 28 in 2018, and the program met its 2018 natural gas gross savings target.

Participant Feedback

The evaluation team surveyed participants about their project and program participation experiences to assess customer motivation, participation barriers, and customer satisfaction with program components. The evaluation team contacted 37 businesses that participated in the New Construction program; nine participants responded, for a 24% response rate.

Energy Efficiency Awareness and Marketing

NIPSCO and implementation staff reported that trade allies are the main drivers of customer awareness

of the program. As shown in Figure 111, the surveyed participants identified trade allies as their most

frequent source of awareness of the program: four of the eight respondents indicated trade allies. In

2017, participants also identified trade allies as the most frequent source of awareness of the program

(six out of 11 responses in 2017).

Figure 111. How Participants Learned about the New Construction Program

Source: Participant Survey. Question: “How did you learn about NIPSCO’s New Construction program?”

(multiple response allowed)

Participation Drivers

In 2018, participant respondents most frequently said that the most important factor in their decision to participate in the program was saving energy (four out of eight respondents). As shown in Figure 112, other factors reported by more than one participant included saving money on utility bills (two respondents) and ROI (two respondents). In 2017, participants reported that the top participation factors were obtaining a program incentive (eight out of 11 respondents) and saving money on utility bills (seven out of 11 respondents). Only one respondent in 2017 listed saving energy as a factor in their decision to participate in the program.


Figure 112. New Construction Program Participation Drivers

Source: Participant Survey. Question: “What two factors were most important in your decision to make

energy saving improvements through NIPSCO’s New Construction program?” (multiple response allowed)

Participation Barriers

Most respondents (six out of nine) reported that it was either very easy or somewhat easy for their

organizations to invest in energy efficiency. Two respondents said that it was neither easy nor difficult

and one respondent said that it was somewhat difficult. When asked about the challenges organizations

face when investing in energy-efficient equipment, the responses were varied. As shown in Figure 113,

the responses that were slightly higher included high initial costs (two out of eight respondents), and

lack of technical knowledge (two out of eight respondents). The evaluation team asked participants

about challenges to investing in energy-efficient equipment in 2017 as well. The most frequent

responses in 2017 included high initial cost (five out of 11 respondents) and long ROI (three out of 11

respondents).


Figure 113. Customer Barriers to Investment in Energy Efficiency

Source: Participant Survey. Question: “What do you think are the most significant challenges that

organizations face when investing in energy-efficient equipment?” (multiple response allowed)

Satisfaction with Program Processes

Participant respondents were asked about their satisfaction with different program components.

Figure 114 shows the reported satisfaction levels for each component. Based on the number of very

satisfied responses, respondents were most satisfied with the quality of work by their vendor or

contractor (seven respondents) and the post-inspection process (six respondents). Respondents were

least satisfied with the program application process (two respondents were very satisfied). The

evaluation team asked participants about their satisfaction with different program components in 2017

as well. In 2017, participants were most satisfied with the time it took to receive the incentive check

(seven out of ten respondents were very satisfied) and were least satisfied with the program application

process (two out of ten respondents were very satisfied).


Figure 114. Customer Satisfaction with New Construction Program Components

Source: Participant Survey. Question: “How would you rate your satisfaction with…”

Overall Satisfaction with Program and NIPSCO

As shown in Figure 115, when asked about their satisfaction with the program overall, all of the

respondents reported that they were very satisfied or somewhat satisfied. Some of the respondents

who indicated that they were very satisfied provided reasons for their satisfaction with the program. For

example, one respondent said, “I…liked the cooperation between the companies to [achieve] the final

outcome.” Another respondent commented on the application process: “NIPSCO made the application

easy…it happened very fast.” One respondent that was somewhat satisfied with the program overall

commented that while they were “very happy with it…it took a lot of time and jumping through hoops

to get qualified.” The evaluation team asked respondents the same question in 2016 and 2017. As shown in Figure 115, both respondents in 2016 were very satisfied with the program overall. In 2017, five of the 11 respondents were very satisfied, four were somewhat satisfied, and one was neither satisfied nor dissatisfied; that respondent cited difficulties with the application process as the reason for the rating.


Figure 115. Overall Satisfaction with the New Construction Program, 2016-2018

Source: Participant Survey. Question: “How satisfied are you with NIPSCO’s New Construction program

overall? Would you say you are…”

The evaluation team also asked participant respondents about their satisfaction with NIPSCO. The

majority of respondents (seven out of nine) indicated they were either very satisfied or somewhat

satisfied, while two respondents reported being very dissatisfied. While neither respondent that was

very dissatisfied with NIPSCO provided a reason for their dissatisfaction, one of these respondents

suggested that NIPSCO could improve their program experience by increasing incentives and “showing

up on time for work.” The latter appeared to be a criticism of their contractor. The evaluation team

asked respondents about their satisfaction with NIPSCO in 2016 and 2017 as well. As shown in

Figure 116, both respondents in 2016 were very satisfied with NIPSCO. Respondents in 2017 provided

varied responses ranging from very satisfied to somewhat dissatisfied with the most frequent response

(four out of 11) being somewhat satisfied. While the proportion of very satisfied responses increased

from 2017 to 2018, the two very dissatisfied responses in 2018 were the first instances of this response.

Figure 116. Overall Satisfaction with NIPSCO, 2016-2018

Source: Participant Surveys. Question: “How satisfied are you with NIPSCO overall as your organization’s

utility service provider? Would you say you are…”


Suggestions for Improvement

The evaluation team asked respondents to identify any suggestions they had for improving the New

Construction program. Most respondents (eight of nine) did not offer any suggestions. As reported

above, one respondent suggested increasing the incentive levels and also suggested “showing up on

time for work,” which appeared to be a comment for their contractor.

Participant Survey Firmographics

As part of the participant survey, the evaluation team collected responses on the firmographics shown

in Table 202.

Table 202. New Construction Respondent Firmographics

| Firmographic | Responses (n) |
| Industry (n=9) | |
| Manufacturing | 3 |
| Agriculture | 1 |
| Construction | 1 |
| Government | 1 |
| Hotel/motel | 1 |
| Real estate and property management | 1 |
| Other | 1 |
| Building Square Footage (n=9) | |
| Less than 5,000 square feet | 0 |
| 5,000 to less than 10,000 square feet | 1 |
| 10,000 to less than 50,000 square feet | 2 |
| 50,000 to less than 100,000 square feet | 2 |
| 100,000 square feet or greater | 4 |
| Space Heating Fuel Type (n=9) | |
| Natural gas | 8 |
| Electric | 1 |
| Space Heating Fuel Type (n=9) | |
| Natural gas | 7 |
| Electric | 1 |
| None | 1 |

Trade Ally Feedback

The evaluation team spoke with five trade allies who completed at least one New Construction project that received an incentive in 2018. The findings are detailed below.

Trade Ally Motivations

The evaluation team asked trade allies how they first learned about the program. Three of the five trade

allies first learned of the program through an email from NIPSCO. One trade ally reported they learned

of the program from the implementer and one trade ally learned about the program at an annual trade

show.


The evaluation team also asked trade allies to list their reasons for registering as a NIPSCO trade ally and

for participating in the New Construction program. As shown in Figure 117, respondents most frequently

said they registered as a NIPSCO trade ally to receive the rebate on their customers’ behalf (three out of four). In 2017, trade allies’ most frequent reason for registering (three out of four respondents) was to save customers money. All five trade allies said they participated in the New Construction program for the program incentives.

Figure 117. Reasons for Registering as a NIPSCO Trade Ally

Source: Trade Ally Survey. Question: “What are the reasons why your company chose to register with

NIPSCO’s Trade Ally Network?” (multiple response allowed)

Trade Ally Marketing

The evaluation team asked trade allies what percentage of their sales calls were promoting NIPSCO’s

New Construction program. Only one of the five trade allies reported making sales calls regularly. This

trade ally did not have a large client base in NIPSCO’s territory and estimated that only 1% of sales calls

promoted the New Construction program. The remaining trade allies reported that they did not make

sales calls. One of these trade allies focused on in-person sales and reported bringing up the New

Construction program about 50% of the time while on site.

Program Influence

The evaluation team asked trade allies how influential the program incentive was to their customers’

decision to complete energy-efficient projects. All five surveyed trade allies indicated that the incentive

was very influential in their customers’ decisions. The evaluation team also asked trade allies

if they felt the program’s incentives were set at an appropriate level. Four of the trade allies indicated

that they thought the incentive levels were appropriate, while one trade ally said that the lighting

incentive should be increased.

Suggestions for Improvement

Besides the suggestion for an increase to the lighting incentive, two other trade allies offered

suggestions for improving the program. One trade ally felt that the application required too much

supporting documentation and would prefer that the process was streamlined. Another trade ally had

only completed lighting projects through the New Construction program and was unaware of other

available rebates. They wanted to be informed of other (non-lighting) electrical rebates that were

available.


Trade Ally Satisfaction

The evaluation team asked trade allies how satisfied they were with the program. Figure 118 shows the

trade ally responses from 2016, 2017, and 2018. In 2016, two trade allies were very satisfied, one trade

ally was somewhat satisfied, and three trade allies were neither satisfied nor dissatisfied. 2017 saw the

largest number of very satisfied responses (four out of five), but it also had the only very dissatisfied

response. The trade ally that was very dissatisfied with the program in 2017 specified that it took too

much time and effort to participate in the program and that they had only participated in response to a

customer’s request. In 2018, all respondents were either very satisfied or somewhat satisfied with the

program.

Figure 118. Trade Ally Satisfaction with the New Construction Program, 2016-2018

Source: Trade Ally Survey. Question: “How satisfied are you with NIPSCO’s New Construction program?”

Conclusions and Recommendations

Conclusion 1: The implementer started to incorporate CFs into 2018 demand calculations but did not update a portion of projects.

Peak demand savings were claimed for exterior lighting measures because CFs were not applied. The peak demand savings for interior lighting measures were also overstated because, in many cases, the CF of the installed lights was not applied.

Recommendations:

• Have the implementer follow the savings calculations in the Indiana TRM (v2.2) for all measures

where appropriate. The implementer should ensure that the CFs are clearly laid out and that the

calculations are presented in an easy-to-follow format.

• Ensure that the CF=0 for all exterior lighting measures (unless they are 24/7 fixtures, where the

CF=1 regardless of interior or exterior location).
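The CF rule in the second recommendation can be expressed as a simple assignment check (an illustrative sketch, not the implementer's actual tooling; the interior, non-24/7 value of 0.72 is a placeholder for the building-type lookup in the Indiana TRM (v2.2), not a value from the report):

```python
def coincidence_factor(location: str, runs_24_7: bool, trm_interior_cf: float = 0.72) -> float:
    """Assign a lighting peak-coincidence factor (CF) per the recommendation.

    trm_interior_cf stands in for the building-type CF from the Indiana TRM
    (v2.2); 0.72 is an assumed example value, not a figure from the report.
    """
    if runs_24_7:
        return 1.0            # 24/7 fixtures coincide fully with peak, any location
    if location == "exterior":
        return 0.0            # exterior lights are off during the peak period
    return trm_interior_cf    # interior fixtures use the TRM lookup value

# Exterior lighting claims no peak demand savings unless it runs 24/7:
assert coincidence_factor("exterior", runs_24_7=False) == 0.0
assert coincidence_factor("exterior", runs_24_7=True) == 1.0
```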


Conclusion 2: The implementer’s definition of measure unit varies within measure types and does not

consistently describe the quantity of the installed measure.

In the other C&I programs (Custom, SBDI, and Prescriptive), the number of units for lighting measures

represents the number of lighting fixtures installed. This is not the case for all New Construction lighting

measures, where the unit quantity represented the total wattage of lighting installed for measures

incorporating the lighting power density reduction methodology. Additionally, natural gas heating measures generally defined the unit as the installed equipment capacity in thousands of Btu per hour (consistent with other C&I programs) but in some cases used the number of individual heating systems installed.

Recommendations:

• To align with the other C&I programs, define measure unit quantity as the following:

▪ A fixture for lighting measures

▪ Equipment capacity for non-lighting equipment

▪ Linear feet for pipe insulation

▪ Square feet for envelope measures

• Because integrated measures like process or controls measures are more difficult to define

quantities for, set those quantities to one. Other definition protocols may be appropriate, but

the implementer should ensure they are used consistently across like measures.
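The unit conventions recommended above could be captured in a single lookup that tracking-data quality checks reference (a sketch; the measure-type keys and function name are illustrative, not from the report):

```python
# Recommended measure-unit conventions for the New Construction program,
# aligned with the other C&I programs. Keys are illustrative measure types.
UNIT_BY_MEASURE_TYPE = {
    "lighting": "fixture",                 # one unit per installed fixture
    "non_lighting_equipment": "capacity",  # e.g., kBtu/h of installed equipment
    "pipe_insulation": "linear_foot",
    "envelope": "square_foot",
    "process": "each",                     # integrated measures default to quantity 1
    "controls": "each",
}

def default_quantity(measure_type: str, installed_count: int) -> int:
    """Integrated measures (process, controls) are recorded as quantity 1."""
    return 1 if UNIT_BY_MEASURE_TYPE.get(measure_type) == "each" else installed_count
```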

Conclusion 3: Inconsistency in ex ante calculations, savings methodology, and supporting documentation may contribute to overstated or understated realized savings.

In many instances, the evaluation team could not locate proper supporting documentation in LM Captures for project measures in the sample. This included calculations that were missing or that showed savings values that did not match the claimed savings in the program application or tracking data. In these cases, the evaluation team made a secondary request to the implementer for the correct documentation; for a few measures, the documentation received still did not match.

Additionally, some measures lacked consistency in savings methodology. For the direct-fired heating

measures, the savings methodology varied across four separate approaches even though one approach

seemed to most accurately estimate therm savings.

Recommendations:

• For direct-fired heating measures that encompass efficiency, destratification, and setback

savings, utilize the building load model present in the file, LM Building Usage & Setback+Destrat

Savings Calc.xlsx or similar spreadsheet calculator. Avoid read-only PDFs.

• Include a general narrative of project savings and nuances of assumptions in the project

application. For example, include reasons for ignoring destratification or setback savings for

natural gas HVAC measures when similar measures tend to include them.

• Calculate central HVAC equipment measure savings using the Indiana TRM (v2.2). Custom

calculations may be necessary where custom controls or technology is integrated into the measure.


• Ensure the most up-to-date savings calculation source is uploaded to LM Captures at the same time as the final application workbook. This will ensure the correct supporting documentation accompanies the final claimed savings. Employ documentation similar to the spreadsheet file referenced in the first bullet point, which lays out all assumptions, inputs, and formulae used in the calculation.

• Ensure all New Construction baseline assumptions align with the current federal minimum

efficiency standards and ASHRAE 90.1 energy code standards.

Conclusion 4: The program increased the number of natural gas projects and exceeded its therm goals

by adding more trade allies who focused on natural gas work.

In 2016 and 2017, the New Construction program achieved more than half of its savings through lighting projects. The evaluation team recommended adding more trade allies capable of completing natural gas projects. In 2018, the implementer reported making a concerted effort to recruit additional trade allies that work on natural gas projects. The effort to diversify and expand the trade ally network appears to have worked: the number of natural gas projects completed through the New Construction program increased from 10 in 2017 to 28 in 2018, and claimed natural gas savings increased from 163,312 therms in 2017 to 366,300 therms in 2018. Despite this success, there is room to continue to diversify and expand the projects completed in future years, even among existing trade allies. One trade ally indicated that they were only aware of lighting rebates available through the New Construction program but would like to learn about available non-lighting rebates.

Recommendation:

• Continue to diversify the trade allies involved in the program to ensure a mix of project types and that savings goals are met in future years.

Conclusion 5: The evaluation team accepts the implementer’s baseline assumption for direct-fired heating measures for 2018, but ongoing research is required to support continuing the non-setback baseline in future program years.

The evaluation found that the assumption of a non-setback baseline does not meet Indiana energy code

requirements, namely 2007 ASHRAE 90.1 section 6.4.3.3.2. However, research from program

administrators for another Indiana natural gas utility company showed that the non-setback baseline is

currently the de facto market practice by contractors in new construction projects installing unit heaters

for warehouses and production floor spaces.

Recommendations:

• Perform ongoing interviews with trade allies to gain insight into current baseline practices for

this measure type.

• Reassess the appropriateness of the non-setback unit heater baseline at the end of the next

three-year cycle (2021).


C&I Small Business Direct Install Program

Through the SBDI program, NIPSCO offers prescriptive incentives to small business electric customers with rate categories 720, 721, 722, or 723 and whose billing demand is less than 200 kW per month for twelve consecutive months. Eligible small business natural gas customers are those served under rate categories 421, 425, and 451. The implementer, Lockheed Martin Energy, oversees program management and delivery, and promotes the program to customers with support from trade allies.

Through trade allies, small business participants receive energy assessments and higher incentives for specific measures than those offered through the standard Prescriptive program. The program also offers an array of incentives for refrigeration, lighting, HVAC, and other natural gas–saving measures typically used in small business operations.

Program Performance

Table 203 summarizes savings for the 2018 program, including program savings goals. In terms of gross savings, the program achieved 102% of its electric energy goal, 34% of its demand reduction goal, and 41% of its natural gas energy savings goal.

Table 203. 2018 SBDI Program Savings Summary

| Metric | Gross Savings Goal | Ex Ante [a] | Audited | Verified | Ex Post Gross | Ex Post Net | Gross Goal Achievement |
| Electric Energy Savings (kWh/yr) | 6,076,864 | 5,730,613 | 5,730,613 | 5,730,613 | 6,177,763 | 5,992,430 | 102% |
| Peak Demand Reduction (kW) | 2,542 | 2,002 | 2,002 | 2,002 | 866 | 840 | 34% |
| Natural Gas Energy Savings (therms/yr) | 365,074 | 174,519 | 174,646 | 174,646 | 148,905 | 144,438 | 41% |

[a] Values presented at a measure level represent audited values since the scorecard provides only savings totals.
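Gross goal achievement in Table 203 is ex post gross savings divided by the gross savings goal, which can be verified directly (a sketch; names are illustrative):

```python
# Check the Gross Goal Achievement column of Table 203.
goal_and_ex_post_gross = {  # metric: (gross savings goal, ex post gross savings)
    "kWh/yr": (6_076_864, 6_177_763),
    "kW": (2_542, 866),
    "therms/yr": (365_074, 148_905),
}
achievement = {
    metric: round(100 * ex_post / goal)
    for metric, (goal, ex_post) in goal_and_ex_post_gross.items()
}
# achievement == {"kWh/yr": 102, "kW": 34, "therms/yr": 41}
```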

Audited, verified, and ex post savings aligned well with SBDI ex ante electric energy savings, but the evaluation revealed variability in peak demand reduction values. As in the prior year, the implementer continued to overstate ex ante peak coincident demand reduction for exterior lighting, which drove the large discrepancy between ex post gross and net values and ex ante values. The incorporation of WHFs for interior lighting projects caused the electric energy realization rate to surpass 100%.

As in 2017, the evaluation team found that the updated source for steam trap replacement assumptions reduced ex post gross therm savings. Additionally, the evaluation team found that claimed assumptions for large commercial steam pipe insulation measures overstated therm savings for low-occupancy buildings (namely, religious worship facilities). Refer to the Impact Evaluation section of this chapter for further discussion of evaluation adjustments.


Table 204 lists the 2018 program realization rates and NTG ratio.

Table 204. 2018 SBDI Program Adjustment Factors

Metric | Realization Rate (%)ᵃ | Precision at 90% Confidence (Energy-Level Realization Rate) | Freeridership (%) | Spillover (%) | NTG (%)ᵇ
Electric Energy Savings (kWh/yr) | 107.8% | 3.1% | 4% | 1% | 97%
Peak Demand Reduction (kW) | 43.3% | | | |
Natural Gas Energy Savings (therms/yr) | 85.3% | 9.9% | | |

Freeridership, spillover, and NTG are program-level values that apply to all three metrics.
ᵃ Realization rate is defined as ex post gross savings divided by ex ante savings.
ᵇ NTG is defined as ex post net savings divided by ex post gross savings.

Table 205 shows the budget and expenditures for the 2018 SBDI program. The program spent 93% of its electric budget and 47% of its natural gas budget.

Table 205. 2018 SBDI Program Expenditures

Fuel | Program Budget | Program Expenditures | Budget Spent (%)
Electric | $945,242 | $879,322 | 93%
Natural Gas | $412,006 | $195,033 | 47%

Research Questions

The evaluation team conducted a materials review, stakeholder interviews, site visits, desk reviews, and a participant survey to address the following research questions:

• How are participants hearing about the program, and what are the best ways for the program to reach potential participants?

• What are the barriers and challenges to energy efficiency and program participation?

• What are the primary reasons for participation?

• Are participants satisfied with the program and its components?

• How could NIPSCO improve the participants’ experience with the program?

• What are the program’s participant spillover and freeridership estimates?

• Are the tracking database savings sourced with proper project documentation?

• Do claimed savings algorithms align with the 2015 Indiana TRM (v2.2) or other appropriate secondary sources?

Impact Evaluation

To perform an impact analysis of the 2018 SBDI program, the evaluation team selected a representative sample of measures to evaluate and then extrapolated findings to the larger program population. This process used a PPS sampling approach, a stratified sampling technique that uses a savings parameter (here, total energy savings in MMBtu) as a weighting factor. Out of the 1,182 project measures in the population, the SBDI evaluation sample resulted in 63 measures receiving either an engineering review or an on-site M&V analysis. Natural gas measures accounted for 47% of the ex ante population in terms of MMBtu savings, of which steam pipe insulation was the predominant measure type. Table 206 shows the total number of sample sites evaluated for this program year and their cumulative savings, along with sample size targets designed to yield sufficient precision.

Table 206. SBDI Program Impact Evaluation Activities

Gross Population Unit Countᵃ | Gross Population Measure Countᵃ | Electric Measure Count (Evaluated / Target) | Natural Gas Measure Count (Evaluated / Target) | Evaluation Sample Share of Program Savings
37,875 | 1,182 | 39 / 34 | 24 / 15 | 39%

ᵃ Population is based on queried data extracts from the implementer’s database, LM Captures, and may vary slightly with NIPSCO’s scorecard data source.
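As a rough illustration of the PPS approach described above, the sketch below draws measures with selection probability proportional to MMBtu savings. The measure records here are hypothetical, and the actual evaluation used a stratified PPS design rather than this simple weighted draw:

```python
import random

def pps_sample(measures, n, seed=2018):
    """Draw n measures with probability proportional to their MMBtu
    savings (with replacement, for simplicity); larger measures are
    more likely to be selected."""
    rng = random.Random(seed)
    weights = [m["mmbtu"] for m in measures]
    return rng.choices(measures, weights=weights, k=n)

# Hypothetical measure records (IDs and savings are illustrative only)
population = [
    {"id": "pipe-ins-01", "mmbtu": 950.0},
    {"id": "led-retrofit-02", "mmbtu": 120.0},
    {"id": "steam-trap-03", "mmbtu": 310.0},
    {"id": "led-exterior-04", "mmbtu": 75.0},
]

sample = pps_sample(population, n=2)
print([m["id"] for m in sample])
```

Because selection probability scales with savings, a small sample drawn this way can still cover a large share of total program MMBtu, which is the property the evaluation relied on.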

The evaluation team used findings from the 2015, 2016, and 2017 evaluations to inform sampling targets for 2018. The evaluation team’s understanding of previous years’ savings variability (error ratio) for prescriptive measures allowed the evaluation sample to efficiently target natural gas and electric measures in 2018, achieving 90/10 confidence and precision for each fuel’s realization rates while representing 39% of the total program savings. The actual sample surpassed its targets because the team ensured a representative measure mix in the sample and because several projects contained multiple measures.

Figure 119 illustrates the distribution of the SBDI project population by energy savings (MMBtu), fuel type, and measure category, as labeled in the tracking database. In 2018, SBDI projects accounted for ex ante savings of 5,730,613 kWh and 174,519 therms (37,005 MMBtu savings combined).

Figure 119. SBDI Program Ex Ante Savings Distribution by Fuel Type and Measure Category


HVAC measures consisted entirely of natural gas–fueled steam-system upgrades; steam pipe insulation measures made up 77% of the HVAC ex ante MMBtu savings, with steam trap replacements encompassing the remaining 23%. Lighting measures consisted mostly of LED retrofits, with LED exterior fixtures accounting for 65% of ex ante lighting savings. Refrigeration measures consisted of refrigerated and freezer case LED lighting.

With the SBDI natural gas offering falling short of the 2018 goal for ex ante therm savings, total program savings were the lowest in the program’s four-year history. Figure 120 shows the distribution of savings between natural gas and electric measures over the past four years.

Figure 120. SBDI Program Fuel Savings Distribution Trends, 2015-2018

Natural gas and electric measures in the SBDI program primarily differed in that the ex ante savings methodology for natural gas measures relied more heavily on deemed savings values. The implementer calculated lighting savings using site-specific inputs (for example, baseline wattage, efficient wattage, annual HOU), which aligns with evaluation methods. As a result, lighting measures tended to have better precision (less variation) in evaluated savings relative to claimed savings. The expected precision of measures allowed the evaluation team to efficiently target the evaluation sample, decreasing targets to 15 natural gas measures and 34 electric measures in 2018. Sample planning, in conjunction with surpassing target measure counts, allowed the sample realization rates (in terms of MMBtu) to achieve ±10% precision with 90% confidence for both fuels.

At the combined-fuel level, the evaluation sample represented 39% of the total population’s combined energy (MMBtu) savings. Figure 121 shows the composition of the sample for on-site M&V efforts and engineering desk reviews.


Figure 121. SBDI Program Summary of 90/10 Sample Analysis by Ex Ante Savings

2018 was the final year in the 2016-2018 cycle, and the evaluation team had distributed the majority of the planned site visits across the first two years. This reduced the number of targeted on-site visits from 19 in 2017 to six in 2018, which accounted for the small portion of program savings captured by site visits. In total, the evaluation team performed 52 site visits for the SBDI program over the three-year cycle, surpassing the target of 44. However, because savings were more evenly distributed, the PPS sample’s per-measure savings resembled the population average, which allowed the evaluation sample (engineering reviews plus on-site M&V) to still achieve the 90% confidence with ±10% precision target for the energy realization rates.

Natural gas measures made up 47% of the ex ante energy savings for the 2018 SBDI program, even though they represented only 6% of the program’s installed measure count. This result indicates that per-measure savings for steam pipe insulation are much larger than for lighting measures in the SBDI program. As such, the PPS sampling plan allowed the evaluation team to represent the natural gas measures with a smaller sample size (24) than the electric measures (39). Sample planning, in conjunction with surpassing target measure counts, allowed the evaluation team to achieve realization rates of ±10% precision with 90% confidence for the natural gas sample as well.

Figure 122 shows the sample distribution, based on the measure categories from the tracking database.


Figure 122. SBDI Program Percentage Ex Ante MMBtu Savings Distribution of Sample Compared to Population

Desk reviews supplemented the six targeted site visits to reach the minimum target of sampled measures, and the sample represented all measure categories in the population. Because the PPS sampling approach favors the largest energy-saving measures in each fuel type, the distribution of ex ante energy savings in the sample has a larger share of HVAC ex ante savings (green categories) compared with electric measures (blue category) in the population. Steam trap replacement and steam pipe insulation projects comprised the largest measure installations in 2018, and the inclusion of these measures in the evaluation sample caused the sample distribution to diverge from the population at the combined MMBtu level.

Audited and Verified Savings

As shown in Table 203, the evaluation team made only minor adjustments in the evaluation’s audit phase. For lighting measures, the claimed savings relied upon algorithms with site-specific inputs. The evaluation team based any adjustments in the audit phase largely on discrepancies with respect to these inputs, such as efficient or baseline wattages and fixture quantities. The evaluation team aligned these inputs with supporting documentation (specification sheets and invoices) when they differed from the calculation workbooks (program application files). The lighting documentation generally aligned well with claimed calculations presented in the application files, so audited savings did not vary from ex ante savings.

For natural gas measures and non-lighting electric measures, the implementer did not use approved algorithms to calculate savings. Rather, the implementer generated claimed savings using a well-cited deemed savings look-up table. Notes for each deemed value outlined the source of assumptions and algorithms, and the evaluation team could recreate the values using the notes provided. The small audited discrepancy in the 2018 natural gas savings was based on rounding differences in claimed deemed savings (for example, the evaluation team calculated 4.07 therms per year per foot for a small-diameter steam trap instead of rounding to 4.0 therms per year per foot, as in the ex ante savings).

The version control issues with the implementer’s 2017 deemed savings tables were addressed in the 2018 program, and the evaluation team confirmed that a single table was used consistently for non-lighting measures in 2018.


In-Service Rate

The evaluation team performed site visits to verify installations of eight sampled measures across six sites in 2018 and to calculate the ISR for the SBDI program. Through site visits, the evaluation team verified that claimed unit counts were correct, finding no discrepancies. Depending upon the measure type, the ex ante unit count in the tracking database described either the installed quantity or the equipment capacity (for example, horsepower or MBH). The evaluation team applied a 100% ISR to the program’s audited savings to determine verified savings in 2018.

Ex Post Gross Savings

Electricity and Demand

The evaluation team adjusted 2018 lighting measure savings in the ex post analysis based upon the following data:

• Discrepancies in fixture quantity, equipment capacity, or wattage discovered during site visits or discussions with business owners

• Annual operating hours metered during site visits, provided by business owners, or recorded from scheduling systems

• Inclusion of electric WHFs and peak summer CFs consistent with the 2015 Indiana TRM (v2.2)

• Review of any potential updates to sources listed in the implementer’s deemed savings tables

Adjustments based on the above data resulted in only minor discrepancies in electric energy savings but yielded larger discrepancies in peak demand reduction. The evaluation team attributed this deviation in demand savings to the sizeable portion of exterior lighting projects in the gross population and the evaluation sample. Exterior lighting projects made up 49% of the population’s ex ante peak demand reduction. Although the evaluation team confirmed that the implementer began applying CFs based on building type to lighting projects in 2018 (based on evaluation recommendations in 2016 and 2017), the evaluation team determined that CFs were incorrectly applied to exterior lighting measures. The implementer applied the CF associated with building type, which is appropriate for interior lighting measures, but exterior lighting has its own building category in the Indiana TRM (v2.2), and its associated CF is zero. Therefore, the evaluation team removed all demand savings from exterior lighting as part of the ex post analysis. Exterior lighting is non-operational during the summer peak period (defined in the Indiana TRM v2.2 as summer weekdays from 3:00 p.m. to 6:00 p.m.), and the Indiana TRM (v2.2) assumes exterior lighting is operational only during non-daylight hours (hence, CF = 0). This adjustment is the main contributor to the program’s low peak demand reduction realization rate (43%). The evaluation team made additional small adjustments to CFs for some interior lighting measures on a case-by-case basis, based on a more accurate building type for the project.

As noted in previous evaluations, the implementer disregarded WHFs in calculating electric lighting savings. To align with methodologies originating from the Indiana Statewide Core evaluation, the evaluation team applied WHFs to energy savings for interior lighting measures in cooled spaces. Cooling systems were verified on site for sampled measures and from measure application workbooks for engineering reviews. In these cases, the WHF contributed additional electric energy savings to lighting measures, which was the primary reason the electric energy realization rate reached 108% in 2018.

Natural Gas

The natural gas realization rate dropped below 100% primarily because of an evaluation update to the claimed methodology for steam trap replacement measures. The implementer referenced the Illinois TRM (v4.0) to calculate the deemed savings for steam trap replacements; this measure is not listed in the Indiana TRM (v2.2). The evaluation team verified proper use of the algorithm, but upon review of the Illinois TRM (v5.0; published in 2016), the evaluation team discovered an update to the steam trap replacement savings methodology. The Illinois TRM (v5.0) updated the input for average steam loss per leaking trap from 13.8 pounds per hour to 6.9 pounds per hour, effectively reducing the per-unit savings by half. Because the new steam-loss assumption has carried over to the most recent version of the Illinois TRM (v7.0) and was available for reference during the 2018 program year, the evaluation team aligned ex post savings for steam trap replacements with the new assumption. Steam trap replacement measures then achieved a 48% realization rate, driving down the program’s natural gas realization rate.

Another driver of the reduced natural gas realization rate was a change in deemed savings assumptions for several large steam pipe insulation measures. The implementer calculated the deemed savings for steam pipe insulation with accepted engineering heat loss calculations based on pipe diameter. Table 207 outlines the deemed savings by pipe diameter; these are the same values used for steam pipe insulation measures in the Prescriptive program.

Table 207. Steam Pipe Insulation Ex Ante Deemed Savings by Diameter

Pipe Diameter | Savings (Therms/Year/Ft)
0.5” | 2.8
1” | 5.4
1.5” | 7.9
2” | 10.5
2.5” | 13.0

In 2017, the evaluation team reviewed this methodology and accepted these values as reasonable for average commercial building heating systems. The Illinois TRM (v7.0)66 also aligns with these estimates. As a statewide average, the Illinois TRM (v7.0) assumes 10.8 therms per year per foot for the average commercial low-pressure steam system with 2-inch pipe and recirculation heating.

However, the evaluation team found that 68% of all steam pipe insulation measures in the 2018 C&I Prescriptive program were installed in religious buildings (up from 24% in 2017). These tend to have lower-than-average usage patterns and load profiles for commercial buildings. As such, the evaluation team reviewed the 2018 natural gas billing data available in LM Captures for the sites in the evaluation sample that received steam pipe insulation. After normalizing for weather (using HDD and TMY3 data for South Bend), the evaluation team found several instances of large energy-saving steam pipe insulation projects claiming more savings than the site’s baseline consumption. For these instances, the evaluation team aligned per-foot savings assumptions specifically with the religious building entry in the Illinois TRM (v7.0) for low-pressure steam systems without recirculation (statewide average of 3.2 therms per year per foot for 2-inch pipe) to determine a more reasonable savings estimate. A specific example is outlined in the Ex Post Gross Savings section of the C&I Prescriptive chapter.

66 2019. Illinois Statewide Technical Reference Manual – Volume 1: Overview and User Guide. http://ilsagfiles.org/SAG_files/Technical_Reference_Manual/Version_7/Final_9-28-18/IL-TRM_Effective_010119_v7.0_Vol_1-4_Compiled_092818_Final.pdf

Finally, the evaluation team adjusted deemed savings for projects that claimed more savings than baseline consumption, which dropped the natural gas realization rate to 85% in 2018. The evaluation team recognizes that deemed savings approaches are not expected to accurately depict savings at each specific installation site but that the approach will, on average, accurately represent savings at the program level. However, since the overwhelming building type in the 2018 SBDI pipe insulation population was religious worship, the ex ante deemed savings for steam pipe insulation no longer accurately represented the program population. Therefore, the evaluation team corrected individual steam pipe insulation projects on a case-by-case basis, since many were major drivers of ex ante savings (the top 10 steam pipe insulation measures accounted for 35% of total natural gas savings in the program) and religious worship buildings tend to have lower heating loads than typical commercial buildings.

Realization Rates

Table 208 shows the percentage of verified savings and realization rates for the sample.

Table 208. SBDI Program Sample Realization Rates

Metric | ISR | Verified/Ex Ante Savings | Realization Rate for Evaluation Sampleᵃ
Electric Energy (kWh) | 100% | 100% | 108%
Peak Demand (kW) | 100% | 100% | 43%
Natural Gas Energy (therms) | 100% | 100% | 85%

ᵃ Realization rate is defined as ex post gross savings divided by ex ante savings.

Table 209 lists the measure and unit counts within the sample and the aggregated ex post realization rates for each measure type. The Ex Post Gross Savings section outlines the drivers of the realization rate for each measure type.


Table 209. SBDI Program Evaluation Sample Results by Measure Type

Fuel | Measure Type | Evaluation Sample Measure Count | Evaluation Sample Unit Count | Peak Demand Ex Post Realization Rate | Total Energy Ex Post Realization Rate
Electric | Lighting | 39 | 1,470 | 43% | 108%
Natural gas | Steam Pipe Insulation | 21 | 9,207 | N/A | 91%
Natural gas | Steam Trap Replacements | 3 | 479 | N/A | 48%

It is important to note that the realization rates in Table 209 were not applied to each measure type in the population to determine ex post gross savings; Table 209 is presented only to further outline findings within the evaluation sample. To calculate the ex post gross impacts, the evaluation team applied the metric-level realization rates resulting from the sample to the population ex ante energy and demand savings (shown in Table 210).

Table 210. Application of 2018 SBDI Program Realization Rates

Metric | Population Ex Ante | Realization Rate (From Evaluation Sample) | Population Ex Post Gross
Electric Energy Savings (kWh/yr) | 5,730,613 | 107.8% | 6,177,763
Peak Demand Reduction (kW) | 2,002 | 43.3% | 866
Natural Gas Energy Savings (therms/yr) | 174,519 | 85.3% | 148,905
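The realization rates in Table 210 can be recovered directly from the table's own values (ex post gross divided by ex ante); a minimal arithmetic check:

```python
# Values from Table 210: metric -> (population ex ante, population ex post gross)
metrics = {
    "electric_kwh": (5_730_613, 6_177_763),
    "peak_kw": (2_002, 866),
    "gas_therms": (174_519, 148_905),
}

for name, (ex_ante, ex_post) in metrics.items():
    rr = ex_post / ex_ante  # realization rate = ex post gross / ex ante
    print(f"{name}: RR = {rr:.1%}")
```

The printed rates round to the 107.8%, 43.3%, and 85.3% shown in the table.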

Ex Post Net Savings

The evaluation team calculated freeridership and spillover using the methods described in Appendix B. Self-Report Net-to-Gross Evaluation Methodology, using survey data collected from 2018 participants. As shown in Table 211, the evaluation team estimated a 97% NTG for the 2018 SBDI program.

Table 211. 2018 SBDI Program Net-to-Gross Results

Program Category | Freeridership (%)ᵃ | Spillover (%) | NTG (%)ᵃ
Small Business Direct Install Program | 4% | 1% | 97%

ᵃ The results are weighted by survey sample ex post gross program MMBtu savings.

Freeridership

The evaluation team conducted surveys with SBDI participants to determine freeridership for the program. The evaluation team asked 53 respondents (representing 71 measures) questions focused on whether they would have installed equipment of the same efficiency level, at the same time, and in the same quantity in the absence of the SBDI program. Based on this feedback, the evaluation team calculated overall freeridership at 4% (Table 212).


Table 212. 2018 SBDI Program Freeridership Results

Program Category | Responses (n)ᵃ | Freeridership (%)ᵇ
SBDI Program | 71 | 4%

ᵃ The evaluation team asked customers who installed more than one measure the freeridership battery of questions for a maximum of two measures, resulting in 71 unique responses.
ᵇ The freeridership score is weighted by the survey sample ex post gross program MMBtu savings.

By combining the previously used intention methodology with the influence methodology (through simple averaging at the individual level), the evaluation team produced average freeridership for each respondent. Refer to Appendix B. Self-Report Net-to-Gross Evaluation Methodology for further details on the intention and influence questions and scoring methodologies.

Intention Freeridership

The evaluation team estimated intention freeridership scores for all participants, based on their responses to the intention-focused freeridership questions.

The evaluation team calculated an intention freeridership score of 6% for the SBDI program after weighting individual estimates by ex post evaluated energy savings (Table 213).

Table 213. 2018 SBDI Program Intention Freeridership Results

Program Category | Responses (n)ᵃ | Intention Freeridership Scoreᵇ
SBDI Program | 71 | 6%

ᵃ The evaluation team asked customers who installed more than one measure the freeridership battery of questions for a maximum of two measures, resulting in 71 unique responses.
ᵇ The freeridership score is weighted by the survey sample ex post gross program MMBtu savings.


Figure 123 shows the distribution of individual intention freeridership scores.

Figure 123. 2018 SBDI Program Distribution of Intention Freeridership Scores

Source: Participant Survey. Questions G1 to G9 and G11 are used to estimate an intention freeridership score. See Table 239 for the full text of the questions, response options, and scoring treatments used to estimate intention freeridership scores. See Table 240 for the unique SBDI program participant response combinations resulting from intention freeridership questions, the intention freeridership scores assigned to each combination, and the number of responses for each combination.

Influence Freeridership

The evaluation team assessed influence freeridership by asking participants how important various program elements were in their purchase decision-making process. Table 214 shows the program elements that participants rated as important, along with a count and average rating for each factor.

Table 214. 2018 SBDI Program Influence Freeridership Responses

Influence Rating | Influence Score | NIPSCO Incentive | Information Provided by NIPSCO on Energy Saving Opportunities | Recommendation from Contractor or Vendor | Previous Participation in a NIPSCO Efficiency Program
1 (not at all important) | 100% | 2 | 2 | 4 | 5
2 | 75% | 0 | 5 | 5 | 0
3 | 25% | 9 | 26 | 7 | 6
4 (very important) | 0% | 57 | 34 | 54 | 27
Don’t Know | 50% | 3 | 4 | 1 | 33
Average | | 3.8 | 3.4 | 3.6 | 3.4

The evaluation team determined each respondent’s influence freeridership rate for each measure category using the maximum rating provided for any factor in Table 214. As shown in Table 215, the respondents’ maximum influence ratings ranged from 1 (not at all important) to 4 (very important). A maximum score of 1 means the customer rated all factors in the table as not at all important, while a maximum score of 4 means the customer rated at least one factor as very important. Counts refer to the number of “maximum influence” responses for each factor/influence score response option.

Table 215. 2018 SBDI Program Influence Freeridership Score

Maximum Influence Rating | Influence Score | Count | Total Survey Sample Ex Post MMBtu Savings | Influence Score MMBtu Savings
1 (not at all important) | 100% | 2 | 44 | 44
2 | 75% | 0 | 0 | 0
3 | 25% | 7 | 598 | 150
4 (very important) | 0% | 62 | 8,804 | 0
Don’t Know | 50% | 0 | 0 | 0

Average Maximum Influence Rating (Simple Average): 3.8
Average Influence Score (Weighted by Ex Post Savings): 2%
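The 2% average influence score in Table 215 is the savings-weighted mean of the row-level influence scores; a minimal check using the table's rows:

```python
# (influence score, ex post MMBtu savings) pairs from Table 215
rows = [
    (1.00, 44),     # max rating 1 -> 100% influence freeridership
    (0.75, 0),
    (0.25, 598),    # max rating 3 -> 25%
    (0.00, 8_804),  # max rating 4 -> 0%
    (0.50, 0),      # "Don't Know" -> 50%
]

total_mmbtu = sum(mmbtu for _, mmbtu in rows)
weighted = sum(score * mmbtu for score, mmbtu in rows) / total_mmbtu
print(f"{weighted:.0%}")  # rounds to the 2% shown in Table 215
```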

Final Freeridership

The evaluation team then calculated the arithmetic mean of the intention and influence freeridership components to estimate final freeridership of 4% for the SBDI program:

Final Freeridership (4%) = [Intention FR Score (6%) + Influence FR Score (2%)] / 2

A higher freeridership score translates to more savings deducted from the gross savings estimates. Table 216 shows the intention, influence, and final freeridership scores for the 2018 SBDI program.

Table 216. 2018 SBDI Program Freeridership Score

Responses (n) | Intention Score | Influence Score | Freeridership Score
71 | 6% | 2% | 4%
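A minimal check of the averaging step, using the component scores from Table 216:

```python
intention_fr = 0.06  # savings-weighted intention score (Table 216)
influence_fr = 0.02  # savings-weighted influence score (Table 216)

# Final freeridership is the simple average of the two components
final_fr = (intention_fr + influence_fr) / 2
print(f"{final_fr:.0%}")  # prints the 4% final score
```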

Participant Spillover

As detailed in Appendix B. Self-Report Net-to-Gross Evaluation Methodology, the evaluation team estimated participant spillover67 measure savings using specific information about participants determined through the evaluation and using the Indiana TRM (v2.2) as a baseline reference. The evaluation team estimated the percentage of program participant spillover by dividing the sum of additional spillover savings (as reported by survey respondents) by the total gross savings achieved by all survey respondents. The participant spillover estimate for the SBDI program is 1%, rounded to the nearest whole percent, as shown in Table 217.

67 Non-participant spillover evaluation activities were not conducted for the 2018 program year.


Table 217. 2018 SBDI Program Participant Spillover Results

Spillover Savings (MMBtu) | Participant Program Savings (MMBtu) | Participant Spillover
53 | 9,447 | 1%
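The 1% estimate in Table 217 follows from the ratio described above, rounded to the nearest whole percent (the unrounded value is roughly 0.6%):

```python
spillover_mmbtu = 53      # additional spillover savings reported by respondents
respondent_mmbtu = 9_447  # total ex post gross savings for survey respondents

# Participant spillover = spillover savings / respondents' program savings
spillover_pct = spillover_mmbtu / respondent_mmbtu
print(f"{round(spillover_pct * 100)}%")  # rounds to 1%
```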

Four respondents reported that overall the program was very important to their decision to install additional energy efficiency projects that were not rebated by NIPSCO. Table 218 shows these additional participant spillover measures and their total resulting energy savings.

Table 218. 2018 SBDI Program Participant Spillover Measures, Quantity, and Savings

Spillover Measures | Quantity | Total Energy Savings (MMBtu)
LED lighting | 2 | 2.5
Light fixtures | 5 | 1.8
Natural gas furnace | 1 | 22.2
Molding press equipment | 1 | 26.6

Table 219 summarizes freeridership, spillover, and NTG for the SBDI program.

Table 219. 2018 SBDI Program Net-to-Gross Results

Responses (n) | Freeridershipᵃ | Spillover | NTGᵃ
71 | 4% | 1% | 97%

ᵃ Weighted by survey sample ex post gross program MMBtu savings.

Evaluated Net Savings Adjustments

Using the calculated freeridership and participant spillover values, the evaluation team applied the overall NTG ratio to the ex post gross savings to identify the ex post net savings. Table 220 shows the savings, realization rate, and NTG for the 2018 SBDI program.

Table 220. 2018 SBDI Program Ex Post Net Savings

Fuel Type | Ex Anteᵃ Gross Savings | Ex Post Gross Savings | Realization Rate (%) | NTG (%) | Ex Post Net Savings
Electric Energy Savings (kWh/yr) | 5,730,613 | 6,177,763 | 107.8% | 97% | 5,992,430
Peak Demand Reduction (kW) | 2,002 | 866 | 43.3% | 97% | 840
Natural Gas Energy Savings (therms/yr) | 174,519 | 148,905 | 85.3% | 97% | 144,438

ᵃ Values presented at a measure level represent audited values since the scorecard provides only savings totals.
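The net values in Table 220 follow from applying the program-level NTG ratio (1 minus 4% freeridership plus 1% spillover, or 97%, consistent with Table 219) to each metric's ex post gross savings; a minimal check:

```python
ntg = 1 - 0.04 + 0.01  # NTG = 1 - freeridership + spillover = 0.97

# Ex post gross values from Table 220
ex_post_gross = {
    "electric_kwh": 6_177_763,
    "peak_kw": 866,
    "gas_therms": 148_905,
}

# Ex post net = ex post gross x NTG, rounded to whole units
ex_post_net = {k: round(v * ntg) for k, v in ex_post_gross.items()}
print(ex_post_net)
```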


Process Evaluation

As part of the process evaluation, the evaluation team reviewed the program database and program materials and conducted participant surveys. The evaluation team also interviewed NIPSCO’s program manager and implementation staff to gain a better understanding of the program design, the delivery process, and any associated changes, challenges, or successes experienced in 2018. The evaluation team’s findings follow.

Program Design and Delivery

The SBDI program was designed to encourage small business customers, who typically do not have in-house expertise, to service or replace standard-efficiency lighting, refrigeration, and heating and cooling equipment with high-efficiency measures. A network of registered trade allies recruited prospective customers, offering walk-through assessments of their facilities and rebates to install energy-efficient equipment. NIPSCO required pre-approval of projects when incentives exceeded $10,000. Customers could also apply for Prescriptive and Custom program incentives for equipment that fell outside the scope of the SBDI program. The implementer encouraged trade allies to reduce the total cost to the customer on the project invoice and submit the application on the customer’s behalf for payment, and the implementer reported that this occurred for the majority of SBDI projects.

Lockheed Martin Energy’s implementation program manager led program delivery and coordination with NIPSCO staff. Other implementation staff handled marketing, trade ally engagement, quality assurance, and day-to-day program operations across the five C&I programs. NIPSCO’s program manager guided implementation, and NIPSCO’s Major Account Managers assisted with implementation efforts through direct support and program assistance to customers within the service territory.

The program implementer, as noted above, was responsible for trade ally recruitment. Trade allies had to be registered to deliver the program to customers and were required to participate in program training. The program implementer recruited trade allies to attend the formal training and enrollment process, hosted formal webinar trainings, and delivered email communications to engage trade allies throughout 2018.

As part of its quality control process, the program implementer inspected the first five projects each

newly registered trade ally completed, then randomly inspected additional projects after that.

Changes from 2017 Design

NIPSCO and the program implementer reported no noteworthy changes from the 2017 design.

However, in August 2018, Lockheed Martin Energy developed, and performed outreach to, an internal list of trade allies that install steam traps and pipe insulation measures. The program implementer said that

although the outreach did not result in as many projects as intended, it helped staff initiate

conversations about these measures with trade allies. While all 2018 therm savings came from steam trap repair or replacement and steam pipe insulation, which was also the case in 2017, the

program produced over two-thirds (68%) more savings in 2017 (293,151 gross therms) compared to

2018 (174,519 gross therms). Increased outreach did not increase the number of steam system

Page 332: 2018 DSM Portfolio Evaluation Report - NIPSCO

325

contractors listed in the trade ally search engine available through the NIPSCO energy efficiency

program website (two total steam system contractors listed in the search engine since 2017).68

2017 Recommendation Status

During interviews with NIPSCO and the program implementer, the evaluation team followed up on the

2017 evaluation recommendation. Table 221 lists the 2017 SBDI program evaluation recommendations

and NIPSCO’s progress toward addressing those recommendations to date.

Table 221. Status of 2017 SBDI Program Evaluation Recommendations

Summary of 2017 Recommendations Status

Set steam trap trade ally outreach and enrollment

goals for individual program implementer field. Offer

training and/or incentives for local contractors who

want to learn to offer steam trap services, and

contractor bonuses for a specified number of each

trade ally’s first steam trap maintenance projects.

This approach should help transform the local

market and lower this market barrier while

increasing program participation and savings.

In Progress. The implementer fielded a steam system contractor outreach

campaign in August 2018, but it did not set formal outreach goals or

create contractor-facing incentives to bring in new trade allies.

Encourage the Implementer to follow the algorithm

for energy and demand savings outlined in the 2015

Indiana TRM (v2.2) for commercial lighting.

In Progress. The implementer began incorporating CFs into lighting demand savings in 2018, which more accurately reflects the program's peak coincident demand reduction. The implementer

has reservations about incorporating WHFs into lighting calculations due

to concerns that the WHF assumption would be applied to sites that

significantly vary in size and layout, which both greatly impact how a

space is heated, and it did not apply them in 2018. For this evaluation

cycle, program stakeholders have agreed to allow the implementer to

continue this approach regarding WHFs, with the evaluation team

counting electric WHFs towards program savings in ex post gross savings.

Historically, Indiana evaluations capture WHF credits only as part of

evaluated electric savings. If the heating fuel is electric, the penalty

comes in the form of reduced electric savings. The evaluation team will

continue to apply this methodology as it is generally not appropriate to

impact natural gas portfolios, based on electric measure impacts (or vice

versa). Natural gas and electric measures represent two distinct

ratepayer and stakeholder groups, often with different regulatory

frameworks within a given jurisdiction.

68 NIPSCO. “Trade Ally Search.” (A search engine for NIPSCO business energy efficiency programs.) Accessed

March 2019. http://www.lmenergyefficiency.com/contractor-search


Reference savings values directly from the deemed

savings look-up table for each unique measure in the

tracking database.

Completed in 2018. The evaluation team found that non-lighting

measures consistently referenced the appropriate savings from the

implementer’s look-up table. The evaluation team found that deemed

savings were well sourced, and they are reasonable estimates based on

cited documentation. However, deemed savings approaches will

generally produce less precise savings estimates than algorithms based

on specific installed measures at the project level. The program

implementer calculates claimed savings for lighting measures using

algorithms at the project level, which is preferred. The evaluation team

recognizes that this approach is not feasible for all measure types but recommends

this approach for other measures where possible.

Participant Feedback

The evaluation team surveyed participants about their project and program experience to assess

participation drivers, barriers to participating, and satisfaction with program components, all discussed

in more detail below. The evaluation team contacted 310 businesses that participated in the

SBDI program—53 participants completed the survey for a 17% response rate.

Energy Efficiency Awareness and Marketing

As noted earlier, NIPSCO delivers the SBDI program through trade allies, with support from NIPSCO and

the program implementer. Trade allies target potential participants through typical maintenance or

sales calls, and proactively through door-to-door campaigns. NIPSCO markets all the C&I programs—

Custom, Prescriptive, SBDI, New Construction, and RCx—as one unit and provides program-specific

messaging, incentive summaries, and brochures to the desired customer segments. Program

implementation staff target potential participants primarily through trade allies. NIPSCO and the

program implementer’s field staff also perform direct outreach to customers, while internal program

implementation staff deliver targeted or mass email marketing campaigns to customers.

NIPSCO’s emphasis on the trade ally role was reflected in how most SBDI survey respondents heard

about the program. Similar to the responses in 2017 and 2016, most 2018 respondents (52%) said they

learned about the program through their trade ally. The second most common ways were through word

of mouth (28%) and NIPSCO staff (6%). The increase in word of mouth as an awareness source from

2016 and 2017 to 2018 is notable but not statistically significant. However, significantly fewer

respondents learned about the program through NIPSCO staff in 2018 than in 2017 and 2016.69 Eight

percent could not recall how they heard about the program. Figure 124 shows the complete breakdown

of how all the respondents learned about the program.

69 Difference is statistically significant at p≤0.05 (95% confidence).
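The year-over-year significance notes in this section rest on standard comparisons of survey proportions. A two-proportion z-test of the kind typically used for such comparisons can be sketched as follows; the proportions and sample sizes below are hypothetical placeholders, not the survey's actual data.

```python
# Sketch of a two-proportion z-test for year-over-year survey comparisons.
# The proportions and sample sizes below are hypothetical placeholders.
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(p1: float, n1: int, p2: float, n2: int) -> float:
    """Two-sided p-value for the difference between two sample proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)              # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # pooled standard error
    z = (p1 - p2) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical: 6% of 53 respondents in one year vs. 20% of 67 in another
p_value = two_proportion_p_value(0.06, 53, 0.20, 67)
significant_95 = p_value <= 0.05
```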


Figure 124. How Participants Learned about SBDI Program

Note: Boxed value indicates difference between 2017 and 2018 is significant at p≤0.05 (95% confidence).

Source: Participant Surveys. Question: “How did you first learn about NIPSCO’s Small Business Direct Install

program?” (multiple response allowed)

Forty percent of 2018 respondents (n=52) were aware that NIPSCO offered other energy efficiency

programs for its business customers. Although this percentage was lower than the 56% of 2017

respondents (n=63) who reported awareness of other NIPSCO business programs, the difference was

not statistically significant. A statistically similar percentage of 2018 respondents (43%, n=21) compared

to 2017 respondents (57%, n=35) could identify a specific program or offer. The respondents who could identify one (n=8) most often named HVAC incentives (four respondents), followed by

incentives for lighting (two respondents), showerheads (one respondent), and pipe insulation (one

respondent).

When asked how NIPSCO could best keep organizations informed about energy-saving opportunities

(n=52), 50% of respondents preferred email, followed by mailings (29%) and bill inserts (19%). Other

open-ended responses included TV advertisement (one of four respondents), Chamber of Commerce

(one respondent), and through meetings hosted by NIPSCO (one respondent).

Participation Drivers

Saving money on utility bills was the top factor motivating businesses to participate in the 2018

program. Shown in Figure 125, 42% of respondents (n=52) said they participated in the program to save

money on their utility bills, followed by saving energy (19%), and obtaining a program incentive (12%).


Significantly fewer 2018 respondents than 2017 respondents (64%, n=66) reported saving money on

their utility bills as a top motivator.70

Figure 125. SBDI Program Participation Drivers

Source: Participant Survey. Question: “What two factors were most important in your decision to make

energy-saving improvements through NIPSCO’s Small Business Direct Install program…”

Participation Barriers

The evaluation team asked respondents about challenges businesses face in becoming more energy-

efficient. Notably, respondents’ level of ease to invest in energy-efficient equipment in 2018 decreased

from 2017. As shown in Figure 126, the difference between the percentage of 2018 (23%) and 2017

(41%) respondents who said it was very easy was statistically significant, as was the percentage of 2018

respondents (26%) and 2017 respondents (11%) who said it was somewhat difficult.71 The 2016 survey

did not ask this question.

70 Difference is statistically significant at p≤0.01 (99% confidence).

71 Difference is statistically significant at p≤0.05 (95% confidence).


Figure 126. Level of Ease to Invest in Efficient Equipment

Note: Boxed value indicates difference between 2017 and 2018 is significant at p≤0.05 (95% confidence).

Source: Participant Surveys. Question: “How easy or difficult is it for organizations like yours to invest in

energy-efficient equipment? Would you say it is…”

As shown in Figure 127, a significantly higher percentage of 2018 respondents (69%) than 2017

respondents (49%) said high initial cost was the greatest challenge to investing in energy-efficient

equipment.72 This increase was also identified in the Prescriptive program, where 65% of respondents

said that the high initial cost posed the greatest challenge, compared to 2017 (41%, n=68) and 2016

findings (44%, n=68).73 Significantly fewer 2018 SBDI respondents (8%) than 2017 respondents (24%)

reported understanding potential areas for improvement as a challenge.74

72 Difference is statistically significant at p≤0.05 (95% confidence).

73 Difference is statistically significant at p≤0.01 (99% confidence). The question’s wording, however, changed

from 2016 to 2017, from considering how their organizations struggled to how organizations similar to theirs

struggle. This change may have affected how participants responded from 2016 to the subsequent years.

74 Difference is statistically significant at p≤0.01 (99% confidence).


Figure 127. Barriers to Becoming Energy-Efficient

Note: Boxed value indicates difference between 2017 and 2018 is significant at p≤0.05 (95% confidence).

Source: Participant Surveys. Question: “What do you think are the most significant challenges that

organizations face when investing in energy-efficient equipment?”

Only five respondents identified challenges with participating in the program, reporting the following

(multiple responses allowed):

• Lack of program awareness (two respondents)

• Understanding equipment eligibility (two respondents)

• Understanding the application process (two respondents)

• Confusion about whom to contact with questions (two respondents)

• Dissatisfaction with the eligible outdoor LED lighting equipment (one respondent)

The five respondents provided the following recommendations to overcome program challenges

(multiple responses allowed):

• Improve the application process or simplify its language (three respondents)

• Provide more detail about how the program works (two respondents)

• Have a NIPSCO staff person readily available to answer customer questions (one respondent)

• Offer higher incentives (one respondent)

Satisfaction with Program Processes

Respondents rated their satisfaction with different program components; Figure 128 shows satisfaction

levels for each component. Based on the percentage of very satisfied responses, respondents were most

satisfied with working with Lockheed Martin Energy and least satisfied with the information provided


about the program. A significantly higher percentage of 2018 respondents (93%, n=27) were very

satisfied with working with Lockheed Martin Energy compared to 2017 (74%, n=38).75 Responses in 2018 were otherwise consistent with 2017 responses.

Figure 128. Customer Satisfaction with SBDI Program Components

Source: Participant Survey. Question: “How would you rate your satisfaction with…”

Overall Satisfaction with Program and NIPSCO

Most respondents (95%) were very satisfied or somewhat satisfied with the program overall, consistent

with 2017 and 2016 (94% and 97% very satisfied or somewhat satisfied, respectively). Figure 129

shows the distribution of responses, which were also similar between 2016, 2017, and 2018. One

respondent, who was neither satisfied nor dissatisfied, felt that the application process, which took about

two weeks according to the respondent, was too lengthy. Two respondents were somewhat dissatisfied,

with one of them saying this was because the business never received the rebate, and the other

respondent had issues with measure eligibility, saying the hired contractor claimed the business’ sign

lighting was eligible, while the program deemed it ineligible.

75 Difference is statistically significant at p≤0.05 (95% confidence).


Figure 129. Overall Satisfaction with the SBDI Program

Source: Participant Surveys. Question: “How satisfied are you with NIPSCO’s Small Business Direct Install

program overall? Would you say you are…”

When asked about their satisfaction with NIPSCO, 94% said they were very satisfied or somewhat

satisfied, consistent with 2017 (94%, n=67) and 2016 (97%, n=68). Figure 130 shows the distribution of

2016, 2017, and 2018 responses.

Figure 130. Satisfaction with NIPSCO

Source: Participant Surveys. Question: “How satisfied are you with NIPSCO overall as your organization’s

utility service provider? Would you say you are…”

Suggestions for Program Process Improvement

The evaluation team asked respondents to identify any suggestions they had for improving the

SBDI program. Most respondents (77%, n=52) did not offer any recommendations. Five of the 12

respondents who offered suggestions said that they would like to have more or better communication

with their NIPSCO representative. Other suggestions included these:

• Offer a simplified application process (one respondent)

• Send the incentive check faster (one respondent)


• Offer more technical assistance on projects (one respondent; this respondent specifically

requested assistance with outdoor lighting because the upgraded lighting equipment now has a

flickering problem)

• Complete the upgrade work more quickly (one respondent)

• Offer a warranty on the upgrade work (one respondent)

• Increase the knowledge of NIPSCO’s electrical representative about opportunities to improve

the efficiency of older building structures (one respondent)

Participant Survey Firmographics

As part of the participant survey, the evaluation team collected responses on the following

firmographics, shown in Table 222.

Table 222. SBDI Program Respondent Firmographics

Firmographic Percentage

Industry (n=53)

Religious – church 32%

Retail/wholesale 15%

Real estate and property management 8%

Other 8%

Auto repair shop 6%

Government 6%

Nonprofit 4%

Healthcare 4%

Manufacturing 4%

Office, professional services 4%

Restaurant/food service 4%

Transportation 4%

Grocery/food stores/convenience stores 2%

Media – TV, radio, newspaper, etc. 2%

Building Ownership (n=52)

Own 90%

Lease 10%

Building Square Footage (n=42)

Less than 5,000 square feet 26%

5,000 to less than 10,000 square feet 19%

10,000 to less than 50,000 square feet 40%

50,000 to less than 100,000 square feet 12%

100,000 square feet or greater 2%

Space Heating Fuel Type (n=50)

Natural Gas 92%

Electric 8%


Water Heating Fuel Type (n=48)

Natural Gas 75%

Electric 25%

Conclusions and Recommendations

Conclusion 1: The 2018 SBDI program achieved high satisfaction among participating customers.

Most 2018 respondents were very or somewhat satisfied with the SBDI program overall and with

NIPSCO as their utility provider (95% and 94%, respectively).

Conclusion 2: Lockheed Martin Energy’s campaign to increase utilization of the program’s steam trap

and steam pipe insulation incentives yielded mixed savings results.

The program implementer launched a contractor outreach campaign for steam traps and steam pipe

insulation in August 2018. The program implementer targeted an internal list of hydronic contractors

and said that it helped staff initiate conversations about these measures with trade allies. As in 2017, all

2018 therm savings came from steam trap repair or replacement and steam pipe insulation;

however, the program produced over two-thirds more savings in 2017 (293,151 gross therms) than in

2018 (174,519 gross therms). Additionally, increased outreach did not increase the number of steam

system contractors listed in the trade ally search engine; only two contractors were listed as serving the

SBDI program on the NIPSCO energy efficiency program website.

Recommendations:

• Continue outreach with hydronic contractors to increase savings from steam systems.

• Consider developing targeted emails, followed by telephone calls and meetings with select

contractors and tracking the outreach effort outcomes.

Conclusion 3: Customers are experiencing high initial cost as a substantial barrier to program

participation.

Upfront cost is the greatest challenge for businesses interested in energy-efficient equipment and

services, with more participants reporting initial cost as a barrier to project implementation than in

2017. This increase was also observed among C&I Prescriptive program survey respondents. Respondents' reported ease of investing in energy-efficient equipment also decreased from 2017 to 2018. Because the

program incentives did not change substantially (though there were a number of incentive level

decreases in 2017 from 2016), it is not clear why customers are reporting cost to be a greater barrier

than in 2017.

Recommendation:

• Consider promoting project phasing—breaking a large-scale project into manageable phases—

as a way to alleviate participant cost barriers.


Conclusion 4: The implementer incorrectly applied CFs in the claimed impact calculations for exterior

lighting measures. This substantially overstated the program's peak demand savings.

Exterior lighting measures accounted for 49% of total ex ante program demand savings in 2018. For

these measures, the implementer applied CFs for interior fixtures to exterior lighting measures instead

of using zero, as outlined in the 2015 Indiana TRM (v2.2). This error ignored exterior lighting load profiles during the summer peak period. Exterior lighting measures made up a large portion of the 2018 project population, and the gross demand reduction for these

measures tends to be large. Since the ex post analysis reduced all exterior lighting demand savings to

zero, the kilowatt realization rate was low in 2018.

Recommendation:

• The implementer should apply a CF of zero to all exterior lighting measures, as outlined in the

2015 Indiana TRM (v2.2).

• Target interior lighting opportunities (or 24/7 operating conditions) to help meet peak

coincidence demand reduction targets.
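The TRM demand treatment recommended above can be illustrated with a short sketch; the fixture wattages, quantities, and the interior CF value are hypothetical, and only the zero CF for exterior lighting follows the rule stated in the recommendation.

```python
# Peak coincident demand savings for a lighting measure using a coincidence
# factor (CF). Per the 2015 Indiana TRM (v2.2) as described above, exterior
# lighting takes CF = 0 because it is off during the summer peak period.
# Wattages, quantities, and the interior CF below are hypothetical.

def peak_demand_savings_kw(baseline_watts: float, efficient_watts: float,
                           quantity: int, cf: float) -> float:
    """Connected-load reduction times CF, converted from watts to kW."""
    return (baseline_watts - efficient_watts) * quantity * cf / 1000.0

interior_kw = peak_demand_savings_kw(400, 150, 20, cf=0.72)  # hypothetical CF
exterior_kw = peak_demand_savings_kw(400, 150, 20, cf=0.0)   # exterior: 0 kW
```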

Conclusion 5: The ex ante deemed savings for steam pipe insulation measures caused overstated

natural gas savings in 2018 based on the predominant building type in the population (religious

worship).

Steam pipe insulation measures accounted for 77% of 2018 SBDI ex ante therm savings. The evaluation

team additionally found that 68% of steam pipe insulation measures in the 2018 C&I SBDI program were

installed in religious buildings, which tend to have lower-than-average usage patterns and load profiles.

As such, the evaluation team found several instances of large energy-saving steam pipe insulation

projects claiming more savings than the site’s baseline consumption. For these instances, the evaluation

team aligned per-foot savings assumptions specifically with the religious building entry in the Illinois

TRM (v7.0) for low-pressure steam systems without recirculation (statewide average of 3.2 therms per

year per foot for 2-inch pipe) to determine a more reasonable savings estimate.

Recommendation:

• Reference the 2019 Illinois TRM (v7.0) for better estimates of deemed savings by building type.

Currently, the ex ante deemed savings align with the Illinois TRM (v7.0) for low pressure steam

systems with recirculation. However, for systems without recirculation, use the building-specific

deemed therm-per-foot values for either Chicago (closest weather station to NIPSCO territory)

or a statewide average. Extrapolate these values to different diameter pipes to provide new

deemed savings for the three program measure offerings.

• Target a more diverse population of building types for natural gas measures that utilize deemed

savings for ex ante impacts. Deemed savings generally describe a composite of end-use

operating profiles and if a program is only serving a predominant building type, ex ante

estimates become skewed and unrepresentative of the program population.
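The plausibility issue described in this conclusion (deemed claims exceeding a site's baseline consumption) lends itself to a simple screen. The sketch below uses the statewide 3.2 therms/year/foot value quoted above; the function and project figures are illustrative, not the evaluation team's actual tooling.

```python
# Screen steam pipe insulation claims against the site's baseline gas usage.
# Uses the statewide-average deemed value quoted in the text for 2-inch,
# low-pressure steam pipe without recirculation (Illinois TRM v7.0).
THERMS_PER_FOOT = 3.2

def claim_is_plausible(insulated_feet: float, baseline_therms: float) -> bool:
    """A deemed claim larger than annual baseline consumption is implausible."""
    claimed = insulated_feet * THERMS_PER_FOOT
    return claimed <= baseline_therms

# Hypothetical worship facility: 500 ft insulated, 1,200 therms/yr baseline
flag = claim_is_plausible(500, 1200)   # 1,600 claimed therms exceeds baseline
```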


Conclusion 6: An update in the Illinois TRM has reduced the natural gas savings estimate for steam

trap replacements.

The implementer referenced the Illinois TRM (v4.0) to calculate the deemed savings for steam trap

replacements. The evaluation team verified proper use of the algorithm, but, upon review of the Illinois

TRM (v5.0) published in 2016, the evaluation team found an update to the steam trap replacement savings methodology. This update remains in the most recent version of the Illinois TRM (v7.0) and

it lowered the input for average steam loss per leaking trap from 13.8 pounds per hour to 6.9 pounds

per hour, effectively reducing per-unit savings by one-half.

Recommendation:

• Reference the 2019 Illinois TRM (v7.0) for better estimates of deemed savings for steam trap

replacements.
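The effect of the TRM update can be shown with a simple proportional sketch. Treating per-trap savings as linear in the steam loss input is an assumption for illustration, and the 400-therm starting value is hypothetical; only the two steam loss inputs come from the text.

```python
# Effect of the Illinois TRM steam trap update on per-unit deemed savings.
# Assumes per-trap therm savings scale linearly with the steam loss input.
old_loss_lb_hr = 13.8   # average steam loss per leaking trap, IL TRM v4.0
new_loss_lb_hr = 6.9    # updated value, IL TRM v5.0 through v7.0

scale = new_loss_lb_hr / old_loss_lb_hr             # 0.5, i.e., cut in half

old_per_trap_therms = 400.0                         # hypothetical v4.0 estimate
new_per_trap_therms = old_per_trap_therms * scale   # halved per-unit savings
```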


C&I Retro-Commissioning Program

Through the RCx program, NIPSCO helps C&I electric and natural gas customers determine the energy

performance of their facilities and optimize the efficiency of existing operating systems. The RCx process

identifies, reduces, and/or removes operational inefficiencies present in functional equipment that is

not at the end of its useful life. NIPSCO provides program incentives to customers for completing a

retro-commissioning project. The incentive amount is up to 75% of the total installed cost (material and

labor) of energy efficiency measures and services, not to exceed $500,000 per year per project, and not

to exceed $1 million per year per fuel. Incentive rates per fuel are $0.09 per kilowatt-hour and $0.80 per

therm for annual savings. The project payback must be less than 12 months.
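The incentive rules above can be summarized in a short sketch. The project-level cap logic follows the text, while the example project values are hypothetical; the $1 million per-year per-fuel limit applies at the program level and is noted but not modeled here.

```python
# Project-level RCx incentive per the rules stated above: $0.09/kWh and
# $0.80/therm of annual savings, capped at 75% of total installed cost and
# at $500,000 per project per year. (The $1M per-fuel annual cap applies at
# the program level and is not modeled in this per-project sketch.)

def rcx_incentive(installed_cost: float, annual_kwh: float,
                  annual_therms: float) -> float:
    rate_based = annual_kwh * 0.09 + annual_therms * 0.80
    return min(rate_based, 0.75 * installed_cost, 500_000)

# Hypothetical project: $100,000 installed cost, 328,052 kWh/yr, 0 therms/yr
incentive = rcx_incentive(100_000, 328_052, 0)
```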

Program Performance

Table 223 presents a savings summary for the 2018 program, including program savings goals. The program served only one customer in 2018 and achieved only 15% of its electric energy savings goal, 9% of its peak demand reduction goal, and 0% of its natural gas energy savings goal.

Table 223. 2018 RCx Program Savings Summary

Metric | Gross Savings Goal | Ex Antea | Audited | Verified | Ex Post Gross | Ex Post Net | Gross Goal Achievement
Electric Energy Savings (kWh/yr) | 2,088,805 | 328,052 | 328,052 | 328,052 | 313,681 | 313,681 | 15%
Peak Demand Reduction (kW) | 406 | 0 | 0 | 0 | 37 | 37 | 9%
Natural Gas Energy Savings (therms/yr) | 38,745 | 0 | 0 | 0 | 0 | 0 | 0%

a Ex ante values presented at a measure level represent audited values, since the scorecard only provides savings totals.

The evaluation team conducted an engineering review of the single measure installed through the

RCx program in 2018. Table 224 presents the 2018 program realization rates and NTG percentages. The

measure was to repair leaks in a compressed air system, so there are no ex ante natural gas savings

associated with the measure. Additionally, even though the evaluation team calculated an ex post peak

demand reduction (discussed further below in the chapter), the implementer did not claim ex ante

demand reduction for the measure, so a realization rate is not available for kilowatts (due to division by

zero).

Table 224. 2018 RCx Program Adjustment Factors

Metric | Realization Ratea | Freeridership | Spillover | NTGb
Electric Energy Savings (kWh/yr) | 96% | 0% | 0% | 100%
Peak Demand Reduction (kW) | N/A | 0% | 0% | 100%
Natural Gas Energy Savings (therms/yr) | N/A | 0% | 0% | 100%

a The realization rate is defined as ex post gross savings divided by ex ante savings.
b NTG is defined as ex post net savings divided by ex post gross savings.


Table 225 lists the 2018 RCx program budget and expenditures associated with the single program

measure installed.

Table 225. 2018 RCx Program Expenditures

Fuel Program Budget Program Expenditures Budget Spent

Electric $258,406 $23,834 9%

Natural Gas $42,053 ($605)a -1%

a The program experienced canceled projects, which resulted in a negative expenditure balance in 2018.

Research Questions

The evaluation team conducted a materials review, staff interviews, engineering desk reviews, and a

participant survey to address several research questions:

• How do participants and nonparticipants hear about the program and what are the best ways

for the program to reach potential participants?

• What are the barriers and challenges to energy efficiency and program participation?

• How could NIPSCO improve the program experience for participants?

• What are the freeridership and participant spillover estimates?

• Are tracking database savings sourced with proper project documentation?

• Are custom calculation assumptions reasonable and expected for the characterized measure

and/or project site?

• Do claimed savings algorithms align with the 2015 Indiana TRM (v2.2) or other appropriate

secondary sources?

Impact Evaluation

For the impact analysis of the 2018 RCx program, the evaluation team performed an engineering review

on the single retro-commissioning project installed in 2018: a compressed air system leakage repair that

accounted for the entirety of the 328,052 kWh ex ante savings claimed in the program scorecard. As

mentioned, this project claimed no ex ante peak demand reduction or therm savings.

Audited and Verified Savings

The implementer provided a compressed air leakage repair report from the site trade ally, which

outlined all the fittings and connections that were sealed along with the size of the leak in CFM. The

evaluation team reviewed the ex ante approach used to calculate savings based on the CFM recovery

and efficiency of the existing system. The evaluation team determined an average kilowatt reduction

based on these inputs and multiplied it by the annual operating hours of the compressed air system to

determine annual energy savings. Calculated savings in the documentation matched the savings present

in the tracking data, so the audited energy savings match the ex ante impacts.
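The ex ante approach described above (an average kW reduction from the CFM recovery and system efficiency, multiplied by annual operating hours) can be sketched as follows; all input values are illustrative placeholders, not the project's actual data.

```python
# Sketch of the ex ante compressed air leak-repair savings approach.
# All input values are illustrative placeholders, not the project data.

def leak_repair_savings(recovered_cfm: float, kw_per_cfm: float,
                        annual_hours: float) -> tuple[float, float]:
    """Return (average kW reduction, annual kWh savings).

    recovered_cfm: total leakage sealed, from the trade ally's repair report
    kw_per_cfm: compressor specific power (efficiency of the existing system)
    annual_hours: annual operating hours of the compressed air system
    """
    avg_kw = recovered_cfm * kw_per_cfm
    return avg_kw, avg_kw * annual_hours

avg_kw, annual_kwh = leak_repair_savings(208.0, 0.18, 8_760)  # 24/7 assumption
```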


The evaluation team conducted a phone survey with the single RCx program participant in 2018 to

confirm the installation of the upgrade and operating schedule. During the survey the evaluation team

verified that the measure was installed completely in 2018, so the program achieved a 100% ISR.

Verified savings therefore match both the audited and ex ante savings for the RCx program in 2018.

Ex Post Gross Savings

As previously mentioned, the evaluation team agreed with the ex ante approach for calculating energy savings. The slight discrepancy in annual energy savings came from a reduction of operating hours in the ex post phase. Additionally, the evaluation team determined that peak

coincident demand reduction was being overlooked in the ex ante phase. Table 226 outlines these

findings.

Table 226. RCx Program Ex Post Discrepancies

Metric | Average Demand Savings (kW) | Annual Operating Hours | Annual Energy Savings (kWh) | Peak Coincident Ex Post Gross Demand Reduction (kW)
Ex ante | 37.45 | 8,760 | 328,052 | 0
Ex post | 37.45 | 8,376 | 313,681 | 37.45

The evaluation team agreed with the average demand reduction from the CFM sealing, but reduced the

24/7 operating hour assumption to account for 16 days a year of both planned and unplanned

production downtime. This estimated shutdown time was provided by the site contact during the phone

survey. Furthermore, the evaluation team applied the average demand reduction from the repair to the

peak coincident demand reduction in the ex post analysis since a significant and consistent cutback in

the compressed air load will reduce the electrical demand from the compressors during the peak period.
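The ex post adjustment can be reproduced arithmetically from the inputs reported above (a 37.45 kW average reduction and 16 days of combined planned and unplanned downtime):

```python
# Reproduce the ex post gross savings adjustment described above.
avg_kw_reduction = 37.45                 # average demand savings (kW)
downtime_days = 16                       # planned + unplanned downtime per year

ex_post_hours = 8_760 - downtime_days * 24      # 8,376 operating hours
ex_post_kwh = avg_kw_reduction * ex_post_hours  # ~313,681 kWh/yr

# The average reduction is also applied as peak coincident demand reduction,
# since the compressed air load cut persists through the peak period.
peak_kw = avg_kw_reduction
```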

Table 227 shows the program’s percentage of verified savings and realization rates. The program did not

claim any ex ante natural gas energy savings or peak coincident demand reduction in 2018.

Table 227. RCx Program Realization Rates

ISR: 100.0%

Metric | Verified/Ex Ante Savings | Realization Rate for Evaluation Samplea
Electric Energy (kWh) | 100.0% | 95.6%
Peak Demand (kW) | N/A | N/A
Natural Gas Energy (therms) | N/A | N/A

a The realization rate is defined as ex post gross savings divided by ex ante savings.

Ex Post Net Savings

The evaluation team calculated freeridership and spillover using the methods described in Appendix B (Self-Report Net-to-Gross Evaluation Methodology) and survey data collected from 2018 participants. As shown in Table 228, the evaluation team estimated 100% NTG for the program.


Table 228. 2018 RCx Program Net-to-Gross Results

Program Freeridership Spillover NTG

RCx Program 0% 0% 100%

Freeridership

To determine freeridership, the evaluation team asked the program’s only respondent questions about whether they would have installed equipment at the same efficiency level, at the same time, and in the same amount in the RCx program’s absence. Based on survey feedback, the evaluation team calculated an overall freeridership score of 0% for the program, as shown in Table 229.

Table 229. 2018 RCx Program Freeridership Results

Program Responses (n) Freeridershipa

RCx Program 1 0%

By combining the previously used intention methodology with a new influence methodology, the

evaluation team produced a freeridership score for the program by averaging savings-weighted

intention and influence freeridership scores. Refer to Appendix B. Self-Report Net-to-Gross Evaluation

Methodology for further details on intention and influence questions and scoring methodologies.

Intention Freeridership

The evaluation team estimated intention freeridership scores for the participant based on their

responses to the intention-focused freeridership questions. As shown in Table 230, the RCx program's

intention freeridership score was 0%.

Table 230. 2018 RCx Program Intention Freeridership Results

| Program | Responses (n) | Intention Freeridership Scoreᵃ |
|---|---|---|
| RCx Program | 1 | 0% |

Table 231 shows the unique RCx program participant response combination resulting from the intention freeridership questions, along with the intention freeridership score assigned to the combination. An "x" indicates a skipped question (depending on the participant's response to a previous question). The table includes "Yes," "Partial," and "No" to represent whether the respondent's answer to a given question indicated freeridership.


Table 231. 2018 RCx Program Frequency of Intention Freeridership Scoring Combinations

| 1. Completed exact same project without NIPSCO program? | 2. Already decided to complete project? | 3. Completed any type of project? | 4. In capital budget? | 5. Confirm, would not have installed any measure? [Ask if question 1 is No] | 6. Installed at the same time? | 7. In capital budget? | 8. Organization has ROI goal? | 9. Program incentive was key to meeting ROI goal? [Ask if question 8 is Yes] | Freeridership Score |
|---|---|---|---|---|---|---|---|---|---|
| Yes | Yes | Yes | Yes | x | No | x | x | x | 0% |

Influence Freeridership

The evaluation team assessed influence freeridership by asking the participant how important various

program elements were in their purchasing decision-making process. Table 232 shows the program

elements the participant rated for importance, along with a count and average rating for each factor.

Table 232. 2018 RCx Program Influence Freeridership Responses

| Influence Rating | Influence Score | NIPSCO Incentive | Information from NIPSCO on Energy Savings Opportunities | Recommendation from Contractor or Vendor | Previous Participation in a NIPSCO Efficiency Program |
|---|---|---|---|---|---|
| 1 (not at all important) | 100% | 0 | 0 | 0 | 0 |
| 2 | 75% | 0 | 0 | 0 | 0 |
| 3 | 25% | 0 | 0 | 0 | 0 |
| 4 (very important) | 0% | 1 | 1 | 1 | 1 |
| Don't Know | 50% | 0 | 0 | 0 | 0 |
| Average | | 4.0 | 4.0 | 4.0 | 4.0 |

The evaluation team determined the respondent's influence freeridership rate for the measure category using the maximum rating provided for any factor included in Table 232. As shown in Table 233, maximum influence ratings can range from 1 (not at all important) to 4 (very important). A maximum score of 1 means the customer ranked all factors in the table as not important, while a maximum score of 4 means the customer ranked at least one factor as very important. Counts refer to the number of "maximum influence" responses for each factor/influence score response option.


Table 233. 2018 RCx Program Influence Freeridership Score

| Maximum Influence Rating | Influence Score | Count (Total Survey Sample) | Ex Post MMBtu Savings | Influence Score MMBtu Savings |
|---|---|---|---|---|
| 1 (not at all important) | 100% | 0 | 0 | 0 |
| 2 | 75% | 0 | 0 | 0 |
| 3 | 25% | 0 | 0 | 0 |
| 4 (very important) | 0% | 1 | 1,069 | 0 |
| Don't Know | 50% | 0 | 0 | 0 |
| Average Maximum Influence Rating (Simple Average) | 4.0 | | | |
| Average Influence Score (Weighted by Ex Post Savings) | 0% | | | |
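The rating-to-score mapping in the table above can be sketched in a few lines of Python. This is a minimal illustration of the scoring rule, not the evaluation team's actual code; the function and dictionary names are ours:

```python
# Influence freeridership scoring: a higher program influence rating
# maps to a lower freeridership score, per the table above.
INFLUENCE_SCORES = {
    1: 1.00,      # "not at all important" -> 100% freeridership
    2: 0.75,
    3: 0.25,
    4: 0.00,      # "very important" -> 0% freeridership
    "DK": 0.50,   # "Don't Know" is scored at 50%
}

def influence_freeridership(max_rating):
    """Score one respondent from the maximum rating they gave any factor."""
    return INFLUENCE_SCORES[max_rating]

# The 2018 RCx respondent rated every factor 4 (very important):
print(influence_freeridership(4))  # -> 0.0
```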

Final Freeridership

The evaluation team calculated the mean of the intention and influence freeridership components to estimate final freeridership of 0% for the RCx program:

Final Freeridership (0%) = [Intention FR Score (0%) + Influence FR Score (0%)] / 2

A higher freeridership score translates to more savings being deducted from the gross savings estimates. Table 234 presents the intention, influence, and final freeridership scores for the RCx program.

Table 234. 2018 RCx Program Freeridership Score

| Responses (n) | Intention Score | Influence Score | Freeridership Score |
|---|---|---|---|
| 1 | 0% | 0% | 0% |
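The averaging step can be expressed as a one-line function (an illustrative sketch; the function name is ours, not from the evaluation):

```python
def final_freeridership(intention_score, influence_score):
    """Arithmetic mean of the intention and influence components."""
    return (intention_score + influence_score) / 2

# 2018 RCx program: both components were 0%, so the final score is 0%.
print(final_freeridership(0.0, 0.0))  # -> 0.0
```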

Participant Spillover

As detailed in Appendix B. Self-Report Net-to-Gross Evaluation Methodology, the evaluation team estimated participant spillover76 measure savings using specific information about participants, as determined through the evaluation, and using the Indiana TRM (v2.2) as a baseline reference. The evaluation team planned to estimate the program spillover percentage by dividing the sum of additional spillover savings (as reported by survey respondents) by the total gross savings achieved by all program respondents. However, the participant interviewed did not cite their program participation as an influence on additional energy-efficient purchases. As such, the participant spillover estimate for the RCx program is 0%, as shown in Table 235.

Table 235. 2018 RCx Program Participant Spillover

| Spillover Savings (MMBtu) | Participant Program Savings (MMBtu) | Participant Spillover |
|---|---|---|
| 0 | 1,069 | 0% |
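The planned spillover calculation can be sketched as follows (illustrative only; the function and variable names are ours):

```python
def participant_spillover_pct(spillover_savings, respondent_gross_savings):
    """Additional spillover savings reported by survey respondents,
    divided by the total gross savings of those respondents."""
    return spillover_savings / respondent_gross_savings

# 2018 RCx program: no reported spillover against 1,069 MMBtu of
# respondent program savings.
print(participant_spillover_pct(0, 1_069))  # -> 0.0
```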

76 Non-participant spillover evaluation activities were not conducted for the 2018 program year.


Table 236 summarizes the percentage of freeridership, participant spillover, and NTG for the RCx program.

Table 236. 2018 RCx Program Net-to-Gross Results

| Responses (n) | Freeridership | Spillover | NTGᵃ |
|---|---|---|---|
| 1 | 0% | 0% | 100% |

Evaluated Net Savings Adjustments

Table 237 presents the savings, realization rates, and NTG for the RCx program.

Table 237. 2018 RCx Program Ex Post Net Savings

| Fuel Type | Ex Ante Gross Savingsᵃ | Ex Post Gross Savings | Realization Rate | NTG | Ex Post Net Savings |
|---|---|---|---|---|---|
| Electric Energy Savings (kWh/yr) | 328,052 | 313,681 | 96% | 100% | 313,681 |
| Peak Demand Reduction (kW) | 0 | 37 | N/A | 100% | 37 |
| Natural Gas Energy Savings (therms/yr) | 0 | 0 | N/A | 100% | 0 |

ᵃ Ex ante values presented at a measure-level represent audited values, since the scorecard only provides savings totals.
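The relationships among the columns above can be checked with a short sketch (illustrative; the function names are ours). The realization rate is ex post gross savings divided by ex ante savings, and ex post net savings apply the NTG ratio to ex post gross savings:

```python
def realization_rate(ex_post_gross, ex_ante_gross):
    """Ex post gross savings divided by ex ante savings."""
    return ex_post_gross / ex_ante_gross

def ex_post_net(ex_post_gross, ntg):
    """Net savings apply the NTG ratio to ex post gross savings."""
    return ex_post_gross * ntg

# Electric energy row: 313,681 kWh ex post gross vs. 328,052 kWh ex ante.
print(f"{realization_rate(313_681, 328_052):.1%}")  # -> 95.6%
print(ex_post_net(313_681, 1.0))                    # -> 313681.0
```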

Process Evaluation

The evaluation team reviewed the program database and materials and interviewed the sole program

participant and trade ally for that project. The evaluation team also interviewed NIPSCO’s program

manager and key program implementer staff to gain a better understanding of the program design and

delivery process, program awareness and marketing, and program satisfaction. The evaluation team’s

findings follow.

Program Design and Delivery

NIPSCO designed the RCx program to serve the energy efficiency needs of nonresidential electric and

natural gas customers who operate existing office buildings, large hotels, hospitals, large retail stores,

industrial facilities, and warehouses. The program identifies energy savings opportunities in these

buildings or facilities through the optimization of existing systems. Lockheed Martin Energy implements

the RCx program and oversees project management and delivery. Lockheed Martin Energy works with a

network of trade allies to promote the program to eligible customers. Projects start with a thorough

examination of energy-consuming systems for cost-effective savings opportunities. Trade allies identify

operational inefficiencies that can be removed or reduced to yield energy savings, and then execute

the approved project work.


Customers with a building or facility that meets three criteria are eligible for the program:

• Higher-than-average electrical intensity usage

• Functional equipment in place that is not at the end of its useful life

• Commitment to implementing recommendations for energy savings

To participate in the program, customers submit an application that outlines the proposed

commissioning project. Engineers from Lockheed Martin Energy review the proposal to ensure that

costs align with estimated savings. If the project qualifies, customers receive an incentive offer. To

qualify for a program incentive, a project must have a payback of less than 12 months. If a proposal

qualifies for an incentive of more than $10,000, the implementer initiates three levels of engineering

review and an additional quality control review to examine the assumptions made in the first

engineering review of the application.

The completed project undergoes a post-inspection, in which Lockheed Martin Energy visually inspects

the as-installed operating conditions and confirms installation of the proposed equipment. During the

final engineering review, Lockheed Martin Energy inspects post-install data to validate savings.

In 2018, the program had one participant who completed one project.77

Changes from 2017 Design

The electric incentive increased from $0.08/kWh to $0.09/kWh in 2018. In July 2018, NIPSCO

consolidated savings goals across C&I programs regardless of project or program type. Lockheed Martin

Energy also indicated that they may move the RCx program offerings into the Custom program in 2019

rather than continue it as a stand-alone program.

2017 Recommendation Status

In addition to the research objectives laid out for the 2018 program, the evaluation team followed up on

our 2017 evaluation recommendations. Table 238 lists the 2017 RCx program evaluation

recommendations and NIPSCO’s progress toward addressing those recommendations to date.

77 In 2016, the program had no participants. In 2017, the program had five participants who completed six

projects.


Table 238. Status of 2017 RCx Program Evaluation Recommendations

| Summary of 2017 Recommendations | Status |
|---|---|
| Compare incentives offered by retro-commissioning programs in neighboring regions to determine if incentives are adequate to offset challenges caused by limits on the customer base due to opt-out customers. | Completed in 2018. The implementer reviewed other regional utility programs that have had more success attracting retro-commissioning projects. These programs were not stand-alone retro-commissioning programs, but rather custom programs that included incentives for retro-commissioning in their offerings. |
| Establish a primary focus of educating customers and trade allies about retro-commissioning, with program promotion as a secondary focus. Consider partnering with commissioning providers and engineering firms to develop educational materials to disseminate to C&I customers. Customers may be more familiar with new building commissioning and may not be aware of existing building commissioning (retro-commissioning). | In progress. The implementer provided education and outreach to trade allies who can complete retro-commissioning projects. The implementer also created a tri-fold informational brochure for prospective participants that details the program process and the benefits of retro-commissioning. However, NIPSCO and the program implementer should continue to educate customers about the benefits of retro-commissioning. The trade ally for the only retro-commissioning project completed in 2018 identified a lack of information as the primary barrier that prevents businesses from pursuing retro-commissioning projects. |

Program Awareness and Marketing

There was only one RCx program participant in 2018. Both NIPSCO and Lockheed Martin Energy

reported that there was a general lack of retro-commissioning awareness in their territory. Similarly, the

trade ally for the completed project identified a lack of information as the primary barrier that prevents

businesses from pursuing retro-commissioning projects. The trade ally elaborated that, “for a lot of

people, retro-commissioning is a bit of a mystery…more education [about] successful projects and how

much they can save would get people interested.”

The evaluation team surveyed 170 participants of four other C&I NIPSCO programs: Prescriptive,

Custom, New Construction, and SBDI. The evaluation team asked each of these surveyed participants if

they were aware that NIPSCO offers other energy efficiency programs to their business customers.

Ninety-two of the respondents reported being aware of other programs, but when asked to name or

describe those other program(s), none named or described the RCx program. Similarly in 2017, the

evaluation team surveyed 180 participants of other commercial NIPSCO programs, and when asked to

name or describe any other programs they were aware of, none named or described the RCx program.

Both NIPSCO and Lockheed Martin Energy said the most effective program marketing is through trade

allies. This aligns with the surveyed participant, who specified that she first learned about the program

through her trade ally. The trade ally for this project reported that he participates in several utilities’

retro-commissioning programs and first learned about the NIPSCO program on the NIPSCO website.

Lockheed Martin Energy said they continued to recruit trade allies capable of completing retro-

commissioning projects and attempted to recruit larger-scale retro-commissioning projects. However,

the program has had low participation over the past three years and has been mostly limited to smaller

leak repair or compressed air system projects as opposed to retro-commissioning projects that

encompass an entire facility.


Program Satisfaction

The evaluation team asked the participant and trade ally about their satisfaction with various aspects of

the program. The participant reported being very satisfied with all aspects of the program including

information about the program, the application process, the length of the review and approval process,

the post-inspection process, the final engineering review, the incentive amount, the quality of the trade

ally’s work, and interactions with Lockheed Martin Energy. However, the participant said her trade ally

was more directly involved with the program, and therefore the participant was not familiar with some of the program aspects covered in the survey, further indicating the participant's lack of information. Overall, the

participant was very satisfied with the program.

The trade ally was very satisfied with the program overall, with the implementation incentive application

format, with the project meeting with Lockheed Martin Energy, and with communication with Lockheed

Martin Energy throughout the project. The trade ally was somewhat satisfied with the application

format, application processing time, and required paperwork and documentation. The only program

process challenge the trade ally reported was related to the amount of supporting documents required.

Conclusions and Recommendations

Conclusion 1: The RCx program continues to encounter very low participation.

The RCx program was introduced in 2016 and did not have any participants in its first year. In 2017, the

program had five participants who completed six projects. In 2018, there was only one participant and

one completed project. Beginning in 2019, the implementer integrated retro-commissioning projects into the Custom program.

Conclusion 2: Lockheed Martin Energy overlooked peak coincident demand reduction for the air

leakage project in the 2018 program.

For continuous 24/7 operating compressed air systems, an increase in system efficiency or decrease in

CFM losses provides peak-coincident demand reduction. Lockheed Martin Energy did not report this

reduction in their project savings calculation for 2018.

Recommendation:

• Ensure that peak coincident demand reduction is being properly claimed. This is the portion of

demand reduction that occurs during the summer peak period as outlined in the Indiana

TRM (v2.2).

Conclusion 3: Customer awareness of retro-commissioning continues to be the main

participation barrier.

NIPSCO and Lockheed Martin Energy acknowledged that their customers lack general awareness of

retro-commissioning. Similarly, the trade ally for the completed project identified a lack of information

as the primary barrier that prevents businesses from pursuing retro-commissioning projects. In 2017

and 2018, no surveyed participants of other NIPSCO commercial programs listed or described the

RCx program when asked about their awareness of other NIPSCO programs. In 2017, the evaluation

team interviewed two nonparticipant businesses and asked specifically if they had heard of the


RCx program. One nonparticipant had heard of the program, but was unfamiliar with what it offered,

while the other had not heard of retro-commissioning or the RCx program, but after hearing a brief

description said their organization would likely be interested in the program.

Recommendation:

• Partner with out-of-network commissioning providers and engineering firms that service

customers in Indiana to promote the program. Expand outreach from Lockheed Martin Energy’s

trade ally network to out-of-network firms to help spread the information across various service

providers and reach a larger audience. Participants in all other C&I programs said they heard

about their respective programs through trade allies and word of mouth. It is likely the

RCx program can also benefit from more word-of-mouth referrals.


Appendix A. Evaluation Definitions

| Term | Definition | Notes |
|---|---|---|
| Goal | Target for claimed savings by utility. | Performance target for a given program year. |
| Ex Ante Savings | Claimed savings values from the Scorecard or other source of record, after utility reconciliation with implementer tracking data. | Baseline for comparison against evaluation findings. |
| Audited Savings | Project tracking data savings values, which are checked and adjusted (if needed) for alignment with the less-granular ex ante data. | Allows for checking the accuracy of the tracking system; program savings are based on adjusted program-tracking data. |
| Verified Savings | This step adjusts the audited savings as follows: adjusts ex ante deemed savings estimates and calculations (if needed) to align with the agreed-upon values provided by the implementer/utility; adjusts program tracking data (if needed) to correct any errors or omissions identified above (via Indiana TRM v2.2 or work paper), or other program-specified data and methods; re-calculates program savings based on the adjusted program tracking data; and applies an installation or in-service rate (ISR). The evaluation team's best available estimate without benefit of hindsight and in conformance with program methods. | Confirms program reach and persistence of installed measures, including that measures are still installed and operating as planned. This step may feature adjustments to address issues such as: measures rebated but never installed; measures not meeting program qualifications; measures installed but later removed; and measures improperly installed. |
| Ex Post Gross Savings | Evaluator's savings calculations, adjusted from verified values. The evaluation team's best estimate of savings, using provided project data and secondary sources. | This step considers the best-available information from all primary (logging, trend data, on-site data collection, etc.) and secondary (applicable TRM, ASHRAE, DOE, etc.) sources. These methods may differ from program-specified data and methods, and inform updates to same. Typical methods include engineering analysis, building simulation modeling, consumption data analysis and regression, metering analysis, or other accepted methods. May include changes to the baseline assumptions to adjust for weather, occupancy levels, production levels, etc. |
| Ex Post Net Savings | Evaluator's savings calculations, adjusted from ex post gross values. Calculated by adjusting the ex post gross savings estimates to account for attribution factors, including savings-weighted freeridership and spillover effects. The NTG ratio is determined from the following formula: NTG Ratio = 1 - freeridership adjustment + participant spillover adjustment. | Ex post net savings inform program design improvements, future program planning, cost-effectiveness analysis, and calculations of lost revenues. |
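The NTG ratio formula can be written directly (a sketch; the function name is ours):

```python
def ntg_ratio(freeridership, participant_spillover):
    """NTG ratio = 1 - freeridership adjustment
    + participant spillover adjustment."""
    return 1 - freeridership + participant_spillover

# 2018 RCx program: 0% freeridership and 0% spillover give 100% NTG.
print(ntg_ratio(0.0, 0.0))  # -> 1.0
```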


Appendix B. Self-Report Net-to-Gross Evaluation Methodology

NTG estimates play a critical role in DSM program impact evaluations, allowing a utility to determine the

portions of gross energy savings attributable to its programs, free from other influences. There are two

components of NTG: freeridership and participant spillover78. Freeriders are customers who would have

purchased a measure without a program’s influence. Participant spillover is the amount of additional

savings obtained by customers investing in energy‐efficient measures or activities due to their program

participation.

Various methods can be used to estimate program freeridership and participant spillover. The

evaluation team used self-reports, collected through participant surveys, as the 2018 baseline approach.

Survey Design

For the 2018 program survey design, the evaluation team continued to use a modification

of past freeridership measurements that were first implemented in 2016. Our NIPSCO freeridership

research prior to 2016 relied on customers’ self-reported intention to purchase a measure in the

absence of program incentives. Survey questions for this approach addressed the incentive’s effect on

the efficiency, quantity, and timing of purchases. This portion of the freeridership measurement did not

change in 2016, 2017 and 2018.

Persistent conjecture in the industry, however, indicates that intention self-reports may be subject to

biases, yielding an inflated freeridership value.79 To address this possibility and to provide a triangulation

of approaches to the estimate (a desirable measurement principle), the evaluation team integrated a

second set of survey questions in 2016 (which the evaluation team used again in 2017 and 2018),

designed to measure the program's perceived influence on the respondents' purchasing decisions.

78 Non-participant spillover evaluation activities were not conducted for the 2018 program year.

79 Some identified biases could lead to underestimated freerider rates; industry literature’s typical stance is that

the net biasing effect remains unknown: Peters, J.; McRae, M. 2008. “Freeridership Measurement Is Out of

Sync with Program Logic…or, We’ve Got the Structure Built, but What’s Its Foundation.” Proceedings of the

2008 American Council for an Energy-Efficient Economy Summer Study on Energy Efficiency in Buildings,

Washington, DC. Ridge, R.; Willems, P.; Fagan, J.; Randazzo, K. 2009. “The Origins of the Misunderstood and

Occasionally Maligned Self-Report Approach to Estimating Net-to-Gross Ratio.” Paper presented at the 2009

Energy Program Evaluation Conference, Portland, Oregon. Keating, K. 2009. “Freeridership Borscht: Don’t Salt

the Soup.” Paper presented at the 2009 International Energy Program Evaluation Conference.

www.aceee.org/files/proceedings/2008/data/papers/5_491.pdf


By combining the previously used intention methodology with the influence methodology, the

evaluation team produced a freeridership score for the program by averaging the savings-weighted

intention and influence freeridership scores.80 The evaluation team calculated the arithmetic mean of

the intention and influence freeridership components to estimate final freeridership for programs.

Final Freeridership = (Intention Freeridership Score + Influence Freeridership Score) / 2

Through the participant spillover questions, the evaluation team sought to determine whether program participants installed other energy-saving measures after participating in the program. Savings from

additional measures installed by participants would be considered participant spillover savings if they

met the following conditions:

• The program significantly influenced their decisions to purchase additional measures; and

• They did not receive additional incentives for those measures.

If the participant installed one or more measures, additional questions addressed the quantity they

installed and the program’s influence on their purchasing decisions (e.g., very important, somewhat

important, or not at all important).

The evaluation team combined freeridership and spillover questions in the same survey, asking both sets of questions of randomly selected program participants. The evaluation team used these

freeridership and participant spillover approaches for the Residential HVAC Rebates program, as well as

the C&I Prescriptive, Custom, SBDI, New Construction, and RCx programs.

Intention Freeridership Methodology

The evaluation team estimated intention freeridership scores for all participants based on

their responses to the intention-focused freeridership questions. As part of past NIPSCO evaluations, the

evaluation team developed a transparent, straightforward matrix approach to assign a single score to

each participant based on his or her objective responses.

Direct questions such as, “Would you have installed measure X without the program incentive?” tend to

result in exaggerated “yes” responses. Participants often provide answers they believe surveyors seek,

so a question becomes the equivalent of asking, “Would you have done the right thing on your own?”

Effectively avoiding such bias involves asking a question in several different ways, and then checking for

consistent responses.

Determining intention freeridership estimates from a series of questions rather than using a single

question helps to form a picture of the program’s influence on the participant. (For example, “Did the

program affect the timing of their decision and, if so, by how many months/years?” “Did the program

affect the efficiency of equipment installed and, if so, by how much?” “Did the program affect the

80 Intention and influence freeridership scores both have a maximum of 100%.


quantity of technology installed and, if so, by how much?”). Use of multiple questions also checks

consistency.

Not all questions are weighted equally. For example, if the respondent would not have installed

measures at the same efficiency level without the program, they automatically become a 0% intention

freerider. If they would not have installed the measures within two years without the program, they also

automatically become a 0% intention freerider. Other questions included in the intention freeridership

analysis are assigned partial weights for responses indicative of a non‐freerider.

The intention freeridership survey questions addressed five core freeridership dimensions for the

residential HVAC Rebates program and seven core freeridership dimensions for C&I programs:

• Would participants have installed measures without the program?

• Were participants planning on ordering or installing the measures before learning about the

program?

• Would participants have installed the measures at the same efficiency levels without the

program incentive?

• Would participants have installed the same quantity of measures without the program?

• In the program’s absence, would participants have installed the measures at a different time?

• Was the purchase of the measures in the organization’s most recent capital budget? (C&I only)

• Was the program incentive key to meeting a minimum acceptable ROI or hurdle rate when

selecting the energy-efficiency project? (C&I only)

The survey design included several skip patterns, allowing interviewers to confirm answers previously

provided by respondents by asking the same question in a different format.

After assigning an intention freeridership score to every survey respondent, the evaluation team calculated a savings-weighted average intention freerider score for the program category. The

following equation weighted respondents’ intention freerider scores by the estimated savings of

equipment installed:

Savings-Weighted Intention Freeridership = Σ([Respondent Intention Freeridership Score] × [Ex Post Measure Energy Savings]) / Σ([All Respondents' Measure Energy Savings])
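The weighting can be sketched as follows (illustrative; the respondent data here are hypothetical, not from the evaluation):

```python
def savings_weighted_intention_fr(respondents):
    """respondents: iterable of (intention_fr_score, ex_post_savings)
    pairs; returns the savings-weighted average intention score."""
    total_savings = sum(savings for _, savings in respondents)
    weighted_sum = sum(score * savings for score, savings in respondents)
    return weighted_sum / total_savings

# Hypothetical two-respondent sample: a 25% freerider with 800 MMBtu
# of savings and a 0% freerider with 200 MMBtu.
print(savings_weighted_intention_fr([(0.25, 800.0), (0.0, 200.0)]))  # -> 0.2
```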

Intention Freeridership

Table 239 illustrates how initial survey responses for the C&I Prescriptive, C&I Custom, and C&I SBDI programs are translated into whether the response is "yes," "no," or "partially" indicative of freeridership (in parentheses). The value in brackets is the scoring decrement associated with each response option. Each participant's intention freeridership score starts at 100%, which the evaluation team decrements based on their responses to the 10 questions, as shown in Table 239.


Table 239. Raw Survey Responses Translation to Intention Freeridership

Scoring Matrix Terminology and Scoring (C&I Prescriptive, Custom, and SBDI Programs)

Survey questions:

- G1. Without the incentive and program information from NIPSCO, would you have still purchased [MEASURE]?
- G2. [Ask if G1 = Yes or DK] Had your organization ALREADY ordered or purchased the [MEASURE] BEFORE you heard about the program?
- G3. Did your organization have specific plans to install the [MEASURE] BEFORE learning about the NIPSCO program incentive?
- G4. [Ask if G3 = Yes or DK] Prior to hearing about the program incentive, was the purchase of the [MEASURE] included in your organization's capital budget?
- G5. [Ask if G1 = No] So, without the incentive and program information from NIPSCO, you would not have installed [MEASURE] at all. Is that correct?
- G6. And would you have installed the same quantity of [MEASURE] without the incentive and program information from NIPSCO?
- G7. Without the incentive and program information from NIPSCO, would you have still purchased [MEASURE] that was just as efficient, more efficient, or less efficient than the one you purchased?
- G8. Without the incentive and program information from NIPSCO, when would you have installed this equipment? Would you have installed it...
- G9. Does your organization use a minimum acceptable return on investment (ROI) or hurdle rate when selecting energy-efficiency projects?
- G11. Do you know if the program incentive was key to meeting this minimum rate?

Response options, with the freeridership indication in parentheses and the scoring decrement in brackets:

| Question | Response options (freeridership indication) [decrement] |
|---|---|
| G1 | Yes (Yes) [-0%]; No (No) [-50%]; Don't Know (Partial) [-25%] |
| G2 | Yes (Yes) [100% FR]; No (No) [-0%]; Don't Know (Partial) [-0%] |
| G3 | Yes (Yes) [-0%]; No (No) [-50%]; Don't Know (Partial) [-25%] |
| G4 | Yes (Yes) [-0%]; No (No) [-50%]; Don't Know (Partial) [-25%] |
| G5 | Yes/correct, we would not have installed anything without the program incentive (No) [-100%]; No/not correct, would have installed something without the incentive (Yes) [-0%]; Don't Know (Partial) [-25%] |
| G6 | Yes, the same quantity (Yes) [-0%]; No, would have installed fewer (No) [-50%]; No, would have installed more (Yes) [-0%]; Don't Know (Partial) [-25%] |
| G7 | Just as efficient (Yes) [-0%]; More efficient (Yes) [-0%]; Less efficient (No) [-100%]; Don't Know (Partial) [-25%] |
| G8 | In the same year? (Yes) [-0%]; Within one to two years? (Partial) [-25%]; Within three to five years? (No) [-100%]; In more than five years? (No) [-100%]; Don't Know (Partial) [-25%] |
| G9 | Yes (No) [-0%]; No (Yes) [-0%]; Don't Know (Partial) [-0%] |
| G11 | Yes, the program incentive was key to meeting the ROI (No) [-50%]; No, the program incentive was not key to meeting the ROI (Yes) [-0%]; Don't Know (Partial) [-25%] |

Table 240 shows the unique Prescriptive program participant response combinations resulting from the intention freeridership questions, along with the intention freeridership scores assigned to each combination, and the number of responses for each combination. An "x" indicates a skipped question (depending on the participant's response to a previous question). The table's "Yes," "Partial," and "No" values represent whether the respondent's answer to a given question indicated freeridership.

Table 240. 2018 Prescriptive Program Frequency of Intention Freeridership Scoring Combinations

Columns G1 through G11 are the intention freeridership survey questions:

G1. Without the incentive and program information from NIPSCO, would you have still purchased [MEASURE]?
G2. [ASK IF G1 = Yes or DK] Had your organization ALREADY ordered or purchased the [MEASURE] BEFORE you heard about the program?
G3. Did your organization have specific plans to install the [MEASURE] BEFORE learning about the NIPSCO program incentive?
G4. [ASK IF G3 = Yes or DK] Prior to hearing about the program incentive, was the purchase of the [MEASURE] included in your organization's capital budget?
G5. [ASK IF G1=No] So, without the incentive and program information from NIPSCO, you would not have installed [MEASURE] at all. Is that correct?
G6. And would you have installed the same quantity of [MEASURE] without the incentive and program information from NIPSCO?
G7. Without the incentive and program information from NIPSCO, would you have still purchased [MEASURE] that was just as efficient, more efficient, or less efficient than the one you purchased?
G8. Without the incentive and program information from NIPSCO, when would you have installed this equipment without the program? Would you have installed it …
G9. Does your organization use a minimum acceptable return on investment (ROI) or hurdle rate when selecting energy-efficiency projects?
G11. Do you know if the program incentive was key to meeting this minimum rate?

G1 | G2 | G3 | G4 | G5 | G6 | G7 | G8 | G9 | G11 | Freeridership score | Response Frequency
Yes | Yes | x | x | x | x | x | x | x | x | 100% | 9
Partial | Yes | x | x | x | x | x | x | x | x | 100% | 1
Yes | No | Yes | Yes | x | Yes | Yes | Yes | Yes | x | 100% | 2
Yes | No | Yes | Yes | x | Yes | Yes | Partial | No | No | 25% | 2
Yes | No | Yes | Yes | x | Yes | Partial | Partial | Yes | x | 50% | 1
Yes | No | Yes | Yes | x | Yes | No | x | x | x | 0% | 1
Yes | No | Yes | Yes | x | Partial | Yes | Yes | Yes | x | 75% | 1
Yes | No | Yes | Yes | x | Partial | Partial | Partial | No | No | 0% | 1
Yes | No | Yes | Partial | x | Yes | Yes | Partial | Yes | x | 50% | 1
Yes | No | Yes | Partial | x | Partial | Yes | Partial | No | x | 25% | 1
Yes | No | Yes | No | x | Yes | Yes | Yes | Yes | x | 50% | 2
Yes | No | Yes | No | x | Yes | Yes | Partial | Yes | x | 25% | 1
Yes | No | Yes | No | x | Yes | Partial | Yes | Yes | x | 25% | 1
Yes | No | Yes | No | x | No | Yes | Partial | Yes | x | 0% | 1
Yes | No | Partial | Yes | x | Yes | Yes | Yes | Yes | x | 75% | 2
Yes | No | Partial | Partial | x | Partial | Yes | Partial | Yes | x | 12.5% | 1
Yes | No | No | x | x | Yes | Yes | Yes | No | x | 50% | 1
Yes | No | No | x | x | Yes | Yes | Partial | No | Partial | 25% | 1
Yes | No | No | x | x | Yes | Yes | Partial | No | No | 0% | 1
Yes | No | No | x | x | Yes | Yes | Partial | Yes | x | 25% | 2
Yes | No | No | x | x | Yes | Yes | No | x | x | 0% | 1
Yes | No | No | x | x | Yes | Partial | Partial | Yes | x | 12.5% | 1
Yes | No | No | x | x | Yes | No | x | x | x | 0% | 1
Yes | No | No | x | x | No | Yes | Partial | Yes | x | 0% | 2
Yes | No | No | x | x | No | Yes | No | x | x | 0% | 2
Partial | No | No | x | x | No | Yes | Partial | Yes | x | 0% | 1
Partial | No | No | x | x | No | Yes | No | x | x | 0% | 2
No | x | x | x | Yes | Yes | Yes | No | x | x | 0% | 2
No | x | x | x | Yes | No | Yes | Partial | No | No | 0% | 1
No | x | x | x | Yes | No | Yes | Partial | Yes | x | 0% | 1
No | x | x | x | Yes | No | Yes | No | x | x | 0% | 1
No | x | x | x | Partial | Yes | Yes | No | x | x | 0% | 1
No | x | x | x | No | x | x | x | x | x | 0% | 41

Table 241 shows the unique Custom program participant response combinations resulting from the intention freeridership questions, along with the intention freeridership score assigned to each combination, and the number of responses for each combination. An "x" indicates a skipped question (depending on the participant's response to a previous question). The "Yes," "Partial," and "No" values in the table represent whether the respondent's answer to a given question was indicative of freeridership.

Table 241. 2018 Custom Program Frequency of Intention Freeridership Scoring Combinations

Columns: 1. Installed same measure without incentive? | 2. Already ordered or installed? | 3. Already planning to purchase? | 4. In capital budget? | [Ask if question 1 is No] 5. Confirm, would not have installed any measure? | 6. Installed same quantity? | 7. Installed same efficiency? | 8. Installed at the same time? | 9. Organization has ROI goal? | 10. Program incentive was key to meeting ROI goal? | Freeridership score | Response Frequency

Yes | Yes | x | x | x | x | x | x | x | x | 100% | 5
Yes | No | Yes | Yes | x | Yes | Yes | Yes | No | Yes | 100% | 1
Yes | No | Yes | Yes | x | Yes | Yes | Yes | No | x | 100% | 2
Yes | No | Yes | Yes | x | Yes | Yes | Yes | Yes | x | 100% | 1
Yes | No | Yes | Yes | x | Yes | No | x | x | x | 0% | 1
Yes | No | Yes | No | x | Yes | Yes | Yes | Yes | x | 50% | 1
Yes | No | Yes | No | x | Yes | Yes | Partial | Yes | x | 25% | 2
Yes | No | Yes | No | x | No | Yes | Yes | Yes | x | 12.5% | 1
Yes | No | Partial | Partial | x | Yes | Yes | Yes | Yes | x | 50% | 1
Yes | No | No | x | x | Yes | Yes | Yes | No | x | 50% | 1
Yes | No | No | x | x | Yes | Yes | Yes | Yes | x | 50% | 3
Yes | No | No | x | x | Yes | Yes | Partial | Yes | x | 25% | 2
Yes | No | No | x | x | Yes | Yes | No | x | x | 0% | 1
Yes | No | No | x | x | No | Yes | Yes | No | x | 12.5% | 1
Yes | No | No | x | x | No | Yes | Partial | Yes | x | 0% | 1
Yes | No | No | x | x | No | Yes | No | x | x | 0% | 1
Yes | No | No | x | x | No | Partial | Partial | Yes | x | 0% | 1
Yes | No | No | x | x | No | No | x | x | x | 0% | 3
Partial | No | Yes | No | x | No | No | x | x | x | 0% | 1
Partial | No | No | x | x | No | No | x | x | x | 0% | 1
No | x | x | x | Yes | No | Yes | No | x | x | 0% | 1
No | x | x | x | No | x | x | x | x | x | 0% | 13

Table 242 shows the unique SBDI program participant response combinations resulting from the intention freeridership questions, along with the intention freeridership score assigned to each combination, and the number of responses for each combination. An "x" indicates a skipped question (depending on the participant's response to a previous question). The "Yes," "Partial," and "No" values in the table represent whether the respondent's answer to a given question was indicative of freeridership.

Table 242. 2018 SBDI Program Frequency of Intention Freeridership Scoring Combinations

Columns: 1. Installed same measure without incentive? | 2. Already ordered or installed? | 3. Already planning to purchase? | 4. In capital budget? | [Ask if question 1 is No] 5. Confirm, would not have installed any measure? | 6. Installed same quantity? | 7. Installed same efficiency? | 8. Installed at the same time? | 9. Organization has ROI goal? | [Ask if question 8 is Yes] 10. Program incentive was key to meeting ROI goal? | Intention Freeridership score | Response Frequency

Yes | Yes | x | x | x | x | x | x | x | x | 100% | 4
Partial | Yes | x | x | x | x | x | x | x | x | 100% | 2
Yes | No | Yes | Yes | x | Yes | Yes | Yes | No | x | 100% | 2
Yes | No | Yes | No | x | Yes | Yes | Yes | Yes | x | 50% | 1
Yes | No | Yes | No | x | Yes | Yes | Partial | Yes | x | 25% | 2
Yes | No | Yes | No | x | Yes | Yes | No | x | x | 0% | 1
Yes | No | Yes | No | x | No | Yes | No | x | x | 0% | 1
Yes | No | Partial | Partial | x | Yes | Yes | Partial | Yes | x | 25% | 1
Yes | No | Partial | Partial | x | Partial | Partial | Partial | Yes | x | 0% | 1
Yes | No | No | x | x | Yes | Yes | Yes | No | x | 50% | 2
Yes | No | No | x | x | Yes | Yes | Partial | Yes | x | 25% | 1
Yes | No | No | x | x | Yes | Yes | No | x | x | 0% | 1
Yes | No | No | x | x | Yes | Partial | Yes | Yes | x | 25% | 1
Yes | No | No | x | x | Yes | Partial | Partial | Yes | x | 13% | 1
Yes | No | No | x | x | No | Yes | Partial | Yes | x | 0% | 1
Yes | No | No | x | x | No | Yes | No | x | x | 0% | 2
Yes | No | No | x | x | No | Partial | Yes | Yes | x | 0% | 1
Partial | No | No | x | x | No | No | x | x | x | 0% | 2
No | x | x | x | No | x | x | x | x | x | 0% | 44

Table 243 illustrates how initial C&I New Construction program survey responses are translated into whether the response is "yes," "no," or "partially" indicative of freeridership (in parentheses). The value in brackets is the scoring decrement associated with each response option. Each participant freeridership score starts at 100%, which the evaluation team decrements based on the responses to the 10 questions, as shown in Table 243.

Table 243. Raw Survey Responses Translation to Intention Freeridership Scoring Matrix Terminology and Scoring – New Construction Program

G14. Without the incentive and program information from NIPSCO, would your construction project have been built above the Indiana building energy code requirement?
G15. [ASK IF G14 = Yes or DK] Had your organization ALREADY decided to construct your building above the Indiana building energy code requirement BEFORE your organization heard about the program?
G16. Did your organization have specific plans to construct the building above the Indiana building energy code requirement BEFORE learning about the NIPSCO program incentive?
G17. [ASK IF G16 = Yes or DK] Prior to hearing about the program, was the plan to construct the building above the Indiana building energy code requirement included in your organization's capital budget?
G18. [ASK IF G14=No] So, without the incentive and program information from NIPSCO, would you have constructed the building at all?
G19. And would your construction project have been built to the same square footage without the incentive and program information from NIPSCO?
G20. Without the incentive and program information from NIPSCO, would you have still constructed a building that was just as efficient, more efficient, or less efficient than the one you were originally planning to build?
G21. Without the incentive and program information from NIPSCO, when would you have constructed the building? Would you have built it …
G22. Does your organization use a minimum acceptable ROI or hurdle rate when selecting energy-efficiency projects?
G24. Do you know if the program incentive was key to meeting this minimum rate?

Response options and scoring (freeridership classification in parentheses; scoring decrement in brackets), by column G14 | G15 | G16 | G17 | G18 | G19 | G20 | G21 | G22 | G24:

Yes (Yes) [-0%] | Yes (Yes) [100%] | Yes (Yes) [-0%] | Yes (Yes) [-0%] | Yes, would have built a project (Yes) [-100%] | Yes, the same size (Yes) [-0%] | Just as efficient (Yes) [-0%] | In the same year? (Yes) [-0%] | Yes (No) [-0%] | Yes, the program incentive was key to meeting the ROI (No) [-50%]
No (No) [-50%] | No (No) [-0%] | No (No) [-50%] | No (No) [-50%] | No, would not have built a project (No) [-0%] | No, smaller size (No) [-50%] | More efficient (Yes) [-0%] | Within one to two years? (Partial) [-25%] | Yes (Yes) [-0%] | No, the program incentive was not key to meeting the ROI (Yes) [-0%]
Don't Know (Partial) [-25%] | Don't Know (Partial) [-0%] | Don't Know (Partial) [-25%] | Don't Know (Partial) [-25%] | Don't Know (Partial) [-25%] | No, larger size (Yes) [-0%] | Less efficient (No) [-100%] | Within three to five years? (No) [-100%] | Don't Know (Partial) [-0%] | Don't Know (Partial) [-25%]
— | — | — | — | — | Don't Know (Partial) [-25%] | Don't Know (Partial) [-25%] | In more than five years? (No) [-100%] | — | —
— | — | — | — | — | — | — | Don't Know (Partial) [-25%] | — | —

Table 244 shows the unique New Construction program participant response combinations resulting from the intention freeridership questions, along with the intention freeridership score assigned to each combination and the number of responses for each combination. An "x" indicates a skipped question (depending on the participant's response to a previous question). The "Yes," "Partial," and "No" values in the table represent whether the respondent's answer to a given question was indicative of freeridership.

Table 244. 2018 New Construction Program Frequency of Intention Freeridership Scoring Combinations

Columns G14 through G24 repeat the New Construction intention freeridership questions shown in Table 243.

G14 | G15 | G16 | G17 | G18 | G19 | G20 | G21 | G22 | G24 | Freeridership score | Response Frequency
Yes | Yes | x | x | x | x | x | x | x | x | 100% | 6
Yes | No | Partial | Partial | x | Yes | Yes | Yes | Yes | x | 50% | 1
No | x | x | x | Yes | Yes | Yes | Yes | Yes | x | 50% | 1
No | x | x | x | Yes | Yes | No | x | x | x | 0% | 1


Table 245 illustrates how initial Residential HVAC Rebates program responses are translated into whether the response is "yes," "no," or "partially" indicative of freeridership (in parentheses). The value in brackets is the scoring decrement associated with each response option. Each participant intention freeridership score starts at 100%, which the evaluation team then decrements based on the responses to the seven questions.

Table 245. Raw Survey Responses Translation to Intention Freeridership Scoring Matrix Terminology and Scoring – Residential HVAC Rebates Program

D1. BEFORE you heard about NIPSCO's Energy Efficiency Rebate program, had you already planned to purchase the [MEASURE]?
D2. BEFORE you heard anything about NIPSCO's Energy Efficiency Rebate program, had you already purchased or installed your [MEASURE]?
D3. So, just to be clear, you installed your new [MEASURE] before you heard anything about NIPSCO's Energy Efficiency Rebate program, correct?
D4. Would you have installed the same [MEASURE] without the rebate from NIPSCO?
D5. Just so I understand, would you have installed a different [MEASURE] without the NIPSCO rebate or would you have decided not to purchase one at all?
D6. When you say you would have installed a [MEASURE] without the rebate from NIPSCO, would you have installed one that had the same efficiency rating?
D8. And, thinking about timing, without the NIPSCO rebate, would you have installed the [MEASURE]…

Response options and scoring (freeridership classification in parentheses; scoring decrement in brackets), by column D1 | D2 | D3 | D4 | D5 | D6 | D8:

Yes (Yes) [-0%] | Yes (Yes) [-0%] | Yes, that is correct (Yes) [100%] | Yes (Yes) [-0%] | I would have installed a different [MEASURE] (Yes) [-0%] | Yes (Yes) [-0%] | In the same year? (Yes) [-0%]
No (No) [-50%] | No (No) [-0%] | No, that's not correct (No) [-50%] | No (No) [-50%] | I would have decided not to purchase one at all (No) [-100%] | No (No) [-100%] | Within one to two years? (Partial) [-100%]
Don't Know (Partial) [-0%] | Don't Know (Partial) [-0%] | Don't Know (Partial) [-25%] | Don't Know (Partial) [-25%] | Don't Know (Partial) [-25%] | Don't Know (Partial) [-25%] | Within three to five years? (No) [-100%]
— | — | — | — | — | — | In more than five years? (No) [-100%]
— | — | — | — | — | — | Don't Know (Partial) [-25%]


Table 246 shows the unique HVAC Rebates program participant response combinations resulting from the intention freeridership questions, along with the intention freeridership score assigned to each combination, and the number of responses for each combination. An "x" indicates a skipped question (depending on the participant's response to a previous question). The "Yes," "Partial," and "No" values in the table represent whether the respondent's answer to a given question was indicative of freeridership.

Table 246. 2018 HVAC Rebates Program Frequency of Intention Freeridership Scoring Combinations

Columns: 1. Already planning to purchase? | 2. Already ordered or installed? | 3. Confirmation: Already ordered or installed? | 4. Installed same measure without incentive? | [Ask if question 1 is No] 5. Confirm, would not have installed any measure? | 6. Installed same efficiency? | 7. Installed within same year? | Intention Freeridership score | Response Frequency

Yes | Yes | Yes | x | x | x | x | 100% | 9
Yes | Yes | No | Yes | x | Yes | Yes | 100% | 1
Yes | No | x | Yes | x | Yes | Yes | 100% | 16
Yes | No | x | Yes | x | Yes | No | 0% | 5
Yes | No | x | Yes | x | Partial | No | 0% | 1
Yes | No | x | Yes | x | No | x | 0% | 2
Yes | No | x | Partial | Yes | Partial | No | 0% | 1
Yes | No | x | Yes | x | x | Yes | 100% | 3
Yes | No | x | No | Yes | No | x | 0% | 5
Yes | No | x | No | Yes | x | No | 0% | 3
Partial | x | x | Yes | x | Yes | Yes | 75% | 1
No | x | x | Yes | x | Yes | Yes | 50% | 6
No | x | x | Yes | x | Yes | No | 0% | 1
No | x | x | Yes | x | Partial | Yes | 25% | 1
No | x | x | Yes | x | Partial | Partial | 12.5% | 1
No | x | x | Yes | x | x | No | 0% | 1
No | x | x | Yes | x | x | Yes | 50% | 5
No | x | x | Partial | No | x | x | 0% | 2
No | x | x | No | Yes | No | x | 0% | 1
No | x | x | No | No | x | x | 0% | 2
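The decrement logic behind the scoring combinations above can be sketched in a few lines of code. This is an illustrative sketch only, not the evaluators' actual tooling; the question keys and the particular decrement values shown are examples drawn from the scoring-matrix pattern in the tables, and a real implementation would carry the full per-question mapping.

```python
# Illustrative sketch of intention freeridership scoring: each participant
# starts at 100% and the score is reduced by the decrement tied to each
# response. Question keys and decrements below are examples only.
DECREMENTS = {
    "quantity":   {"Yes": 0.00, "No": 0.50, "Don't Know": 0.25},
    "efficiency": {"Just as efficient": 0.00, "More efficient": 0.00,
                   "Less efficient": 1.00, "Don't Know": 0.25},
    "timing":     {"Same year": 0.00, "1-2 years": 0.25,
                   "3-5 years": 1.00, ">5 years": 1.00, "Don't Know": 0.25},
}

def intention_freeridership(responses):
    """Start at 100% and subtract each response's decrement, floored at 0."""
    score = 1.00
    for question, answer in responses.items():
        score -= DECREMENTS.get(question, {}).get(answer, 0.0)
    return max(score, 0.0)

# Same quantity and efficiency, but installed one to two years later -> 75%
print(intention_freeridership(
    {"quantity": "Yes", "efficiency": "Just as efficient", "timing": "1-2 years"}))
```

Skipped questions (the "x" cells in the tables) simply contribute no decrement, which the dictionary lookup's default of 0.0 mirrors.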


Influence Freeridership Methodology

To estimate an influence freeridership score for the 2018 evaluation, the evaluation team asked respondents to rate the importance of four program elements on their purchasing decisions. The survey captured responses using a 1 to 4 scale, with 1 meaning not important and 4 meaning very important. A surveyed participant's overall influence rating equaled the maximum importance of any single program element. This approach drew upon an underlying principle: if a single element had a substantial influence on a respondent's purchasing decision, the program successfully influenced the respondent.

For example, the survey included a question such as that shown in Table 247 to capture respondents' perspectives on elements driving them to take energy-efficient actions. In this case, a 4 for the importance of the incentive provided by NIPSCO (the highest rating) represents the program's maximum influence rating, which determined the influence freeridership component score.

Table 247. Influence Freeridership Component Question

For the [MEASURE_1] and [MEASURE_2] purchases, on a scale from 1 to 4, with 1 being not at all important and 4 being very important, how important was each of the following factors in deciding which equipment to install?

Rate Influence of Program Elements | Not at all important … Very important
Recommendation from contractor or vendor | 1 | 2 | 3 | 4 | DK | N/A
Information provided by NIPSCO on energy savings opportunities | 1 | 2 | 3 | 4 | DK | N/A
Information on payback period (C&I only) | 1 | 2 | 3 | 4 | DK | N/A
The NIPSCO incentive | 1 | 2 | 3 | 4 | DK | N/A
Previous participation in a NIPSCO energy efficiency program | 1 | 2 | 3 | 4 | DK | N/A

Program influence and freeridership maintain an inverse relationship: the greater the program's influence, the lower the participant's final freeridership score. Table 248 presents the freeridership level implied by the influence rating.


Table 248. Influence Freeridership Implied by Response to Influence Items

Influence Rating | Influence Freeridership Score
1 (not at all important) | 100%
2 | 75%
3 | 25%
4 (very important) | 0%
Don't Know | 50%
Not applicable | 50%
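The max-of-elements rule and the Table 248 mapping can be combined in a short sketch. This is illustrative only (not the evaluators' code); the function name and the handling of all-DK/N-A respondents at 50% are assumptions consistent with the table above.

```python
# Sketch of the influence freeridership calculation: the participant's
# influence rating is the MAXIMUM importance given to any single program
# element, and Table 248 maps that rating to a freeridership score.
INFLUENCE_TO_FREERIDERSHIP = {
    1: 1.00,   # not at all important -> 100% freeridership
    2: 0.75,
    3: 0.25,
    4: 0.00,   # very important -> 0% freeridership
}

def influence_freeridership(element_ratings):
    """element_ratings: 1-4 importance ratings (strings for DK / N/A)."""
    numeric = [r for r in element_ratings if isinstance(r, int)]
    if not numeric:        # only Don't Know / Not applicable responses
        return 0.50
    return INFLUENCE_TO_FREERIDERSHIP[max(numeric)]

# Incentive rated 4 (very important) -> maximum influence, 0% freeridership
print(influence_freeridership([2, 3, 4, 1]))
```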

Participant Spillover Methodology

Participant spillover81 refers to additional savings generated by program participants through program participation but not captured by program records. Participant spillover occurs when participants choose to purchase energy-efficient measures or adopt energy-efficient practices due to a program's influence but do not receive a financial incentive for the additional measures. Because these customers have not received a financial incentive, the savings generated by spillover typically do not appear in program records.

The energy efficiency programs' participant spillover effect serves as an additional impact, which is added to the programs' direct results.

The evaluation team measured participant spillover by asking a sample of participants who purchased a particular measure and received an incentive whether they installed another efficient measure or undertook another energy efficiency activity due to the program. Surveys asked respondents to rate the program's (and incentive's) relative influence (e.g., highly, somewhat, not at all important) on their decisions to pursue additional savings.

Participant Spillover Analysis

Participant spillover savings calculations used a top-down approach. The analysis began with the subset of survey respondents who indicated they installed additional energy-saving measures after participating in the program. The evaluation team removed participants from this subset if they indicated the program had little influence on their decisions to purchase additional measures; thus, the subset retained only participants who rated the program as very important to their purchasing decisions.

For the remaining participants with spillover measures, the evaluation team estimated energy savings for the additional measures installed. The evaluation team calculated savings values based on the average savings calculated for this evaluation, using the Indiana TRM (V2.2) as a reference when data from the evaluation could not be used.

The evaluation team used the following equation to calculate the percentage of participant spillover per program category, dividing the sum of additional spillover savings reported by respondents for a given program category by the total incentivized ex post gross savings achieved by all respondents in that program category:

Spillover % = Σ (Spillover Measure Energy Savings for All Survey Respondents) / Σ (Program Measure Energy Savings for All Survey Respondents)

81 Non-participant spillover evaluation activities were not conducted for the 2018 program year.
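The ratio above is a straightforward calculation; the sketch below is illustrative only (function name and example figures are hypothetical, not from the evaluation).

```python
# Sketch of the participant spillover percentage: the ratio of spillover
# measure savings to incentivized program savings, summed across all
# survey respondents in a program category.
def spillover_percent(spillover_savings_kwh, program_savings_kwh):
    """Both arguments are lists of per-respondent savings (kWh)."""
    return sum(spillover_savings_kwh) / sum(program_savings_kwh)

# Hypothetical example: 500 kWh of spillover against 10,000 kWh of
# incentivized program savings -> 5% spillover
print(spillover_percent([300, 200], [6000, 4000]))
```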


Appendix C. Residential Lighting Program Assumptions and Algorithms

Gross Impact Methodology

This section contains the assumptions for the measure energy savings and demand reduction for the measures within the Lighting program. The evaluation team examined each assumption behind the algorithms used to capture savings and compared these against the Indiana TRM (v2.2), as well as other state and industry approaches.

Detailed information on the analysis and supporting assumptions for the following lighting measures is included within this appendix:

• LED lighting

The algorithms and assumptions the evaluation team used to calculate ex post savings for each of these measures follow.

LED Lighting

The evaluation team used the following equations to calculate energy and demand savings for LEDs:

ΔkWh = ((Wbase − WLED) / 1,000) × HOU × (1 + WHFe) × ISR

ΔkW = ((Wbase − WLED) / 1,000) × CF × (1 + WHFd) × ISR

Where:

Wbase = Weighted average wattage of the bulb being replaced, W

WLED = Wattage of the LED bulb, W

HOU = Average HOU per year

WHFe = Waste heat factor for energy to account for HVAC interactions with lighting, depends on

location

WHFd = Waste heat factor for demand to account for HVAC interactions with lighting, depends

on location

CF = Summer peak CF

1,000 = Constant to convert watts to kW, W/kW

ISR = In-service rate, lifetime net present value (NPV)

Table 249 lists input assumptions and sources for the LEDs measure savings calculations.


Table 249. Ex Post Variable Assumptions for LEDs

Input | Value | Source
Wbase | Varies | ENERGY STAR lumen bins
WLED | Varies | 2018 tracking data
HOU | 902 | Indiana TRM (v2.2)
WHFe | -0.070 | Indiana TRM (v2.2), South Bend values
WHFd | 0.038 | Indiana TRM (v2.2), South Bend values
CF | 0.11 | Indiana TRM (v2.2)
ISR | First year, all lamps: 86%; General service: 92%; Reflector/specialty: 96% | 2014 Indiana Market Effects Study, augmented using DOE UMP
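As a worked illustration of the equations and Table 249 inputs above, a minimal sketch follows. The 43 W baseline is a hypothetical example for one lumen bin (a roughly 800-lumen A-lamp under the 2017-2019 column of Table 250); actual baselines vary by ENERGY STAR lumen bin, and WLED comes from tracking data:

```python
# Hedged sketch of the LED ex post savings algorithm with Table 249 inputs.
HOU = 902        # annual hours of use, Indiana TRM (v2.2)
WHF_E = -0.070   # energy waste heat factor, South Bend
WHF_D = 0.038    # demand waste heat factor, South Bend
CF = 0.11        # summer peak coincidence factor
ISR_NPV = 0.92   # NPV lifetime in-service rate, general service

def led_savings(w_base, w_led, isr=ISR_NPV):
    delta_kw = (w_base - w_led) / 1_000         # connected-load reduction, kW
    kwh = delta_kw * HOU * (1 + WHF_E) * isr    # annual energy savings
    kw = delta_kw * CF * (1 + WHF_D) * isr      # peak demand reduction
    return kwh, kw

# Hypothetical 43 W baseline replaced by a 9 W LED
kwh, kw = led_savings(w_base=43, w_led=9)
print(f"{kwh:.1f} kWh/yr, {kw:.4f} kW")
```

Note the negative WHFe reduces energy savings (cooling benefit offset by heating penalty in this climate), while the positive WHFd increases peak demand savings.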

Baseline Wattages for Non-PAR, MR, and MRX Lamp Types

Table 250 shows the distribution of baseline wattages applied using the lumen equivalence method. This approach is specified in the UMP and uses the ENERGY STAR online database to calculate final baseline wattages, based on stated lumen output, for all program LEDs except certain PAR, MR, and MRX lamp types.

Table 250. Baseline Wattages for Qualifying LED Lamps by Lumens and Shape
(columns: Lower Lumen Range | Upper Lumen Range | 2017-2019 WattsBase | 2020+ WattsBase)

Omnidirectional, Medium Screw Base Lamps (A, BT, P, PS, S, or T) (†, ◊ see exceptions below):
250 | 309 | 25 | 25
310 | 749 | 29 | 12
750 | 1049 | 43 | 20
1050 | 1489 | 53 | 28
1490 | 2600 | 72 | 46
2601 | 3300 | 150 | 66
3301 | 3999 | 200 | 200
4000 | 6000 | 300 | 300

†S Shape <=749 lumens and T Shape (<=749 lumens or >10" length):
250 | 309 | 25 | 25
310 | 749 | 40 | 12

Decorative, Medium Screw Base (G Shape) (‡ see exceptions below):
250 | 309 | 25 | 25
310 | 749 | 29 | 12
750 | 1049 | 43 | 20
1050 | 1300 | 53 | 26

‡G16-1/2, G25, G30 <=499 lumens:
250 | 309 | 25 | 25
310 | 349 | 25 | 7
350 | 499 | 40 | 9

‡G Shape with diameter >=5":
250 | 349 | 25 | 25
350 | 499 | 40 | 40
500 | 574 | 60 | 60
575 | 649 | 75 | 75
650 | 1099 | 100 | 100
1100 | 1300 | 150 | 150

Decorative, Medium Screw Base (B, BA, C, CA, DC, F, and ST) (* see exceptions below):
70 | 89 | 10 | 10
90 | 149 | 15 | 15
150 | 299 | 25 | 25
300 | 309 | 40 | 40
310 | 499 | 29 | 9
500 | 699 | 29 | 13

*B, BA, CA, and F <=499 lumens:
70 | 89 | 10 | 10
90 | 149 | 15 | 15
150 | 299 | 25 | 25
300 | 309 | 40 | 40
310 | 499 | 40 | 9

Omnidirectional, Intermediate Screw Base Lamps (A, BT, P, PS, S, or T) († see exceptions below):
250 | 309 | 25 | 25
310 | 749 | 40 | 12

†S Shape with first number symbol <= 12.5 and T Shape with first number symbol <= 8 and nominal overall length <12":
250 | 309 | 25 | 25
310 | 749 | 40 | 40

Decorative, Intermediate Screw Base (G Shape) (‡ see exceptions below):
250 | 309 | 25 | 25
310 | 349 | 25 | 7
350 | 499 | 40 | 9

‡G Shape with first numeral less than 12.5 or with diameter >=5":
250 | 349 | 25 | 25
350 | 499 | 40 | 40

Decorative, Intermediate Screw Base (B, BA, C, CA, DC, F, and ST):
70 | 89 | 10 | 10
90 | 149 | 15 | 15
150 | 299 | 25 | 25
300 | 309 | 40 | 40
310 | 499 | 40 | 9

Omnidirectional, Candelabra Screw Base Lamps (A, BT, P, PS, S, or T) († see exceptions below):
250 | 309 | 25 | 25
310 | 749 | 40 | 12
750 | 1049 | 60 | 20

†S Shape with first number symbol <= 12.5 and T Shape with first number symbol <= 8 and nominal overall length <12":
250 | 309 | 25 | 25
310 | 749 | 40 | 40
750 | 1049 | 60 | 60

Decorative, Candelabra Screw Base (G Shape) (‡ see exceptions below):
250 | 309 | 25 | 25
310 | 349 | 25 | 7
350 | 499 | 40 | 9
500 | 574 | 60 | 12

‡G Shape with first numeral less than 12.5 or with diameter >=5":
250 | 349 | 25 | 25
350 | 499 | 40 | 40
500 | 574 | 60 | 60

Decorative, Candelabra Screw Base (B, BA, C, CA, DC, F, and ST):
70 | 89 | 10 | 10
90 | 149 | 15 | 15
150 | 299 | 25 | 25
300 | 309 | 40 | 40
310 | 499 | 40 | 9
500 | 699 | 60 | 13

Directional, Medium Screw Base, w/ diameter <=2.25":
400 | 449 | 40 | 9
450 | 499 | 45 | 10
500 | 649 | 50 | 13
650 | 1199 | 65 | 20

Directional, Medium Screw Base, R, ER, BR, BPAR, or similar bulb shapes w/ diameter >2.5" (** see exceptions below):
640 | 739 | 40 | 15
740 | 849 | 45 | 18
850 | 1179 | 50 | 22
1180 | 1419 | 65 | 29
1420 | 1789 | 75 | 36
1790 | 2049 | 90 | 43
2050 | 2579 | 100 | 51
2580 | 3300 | 120 | 65
3301 | 3429 | 120 | 120
3430 | 4270 | 150 | 150

Directional, Medium Screw Base, R, ER, BR, BPAR, or similar bulb shapes w/ diameter >2.26" and <=2.5" (** see exceptions below):
540 | 629 | 40 | 13
630 | 719 | 45 | 15
720 | 999 | 50 | 19
1000 | 1199 | 65 | 24
1200 | 1519 | 75 | 30
1520 | 1729 | 90 | 36
1730 | 2189 | 100 | 44
2190 | 2899 | 120 | 56
2900 | 3300 | 120 | 69
3301 | 3850 | 150 | 150

**ER30, BR30, BR40, or ER40:
400 | 449 | 40 | 9
450 | 499 | 45 | 10
500 | 649-1179 | 50 | 14

**BR30, BR40, or ER40:
650 | 1419 | 65 | 23

**R20:
400 | 449 | 40 | 9
450 | 719 | 45 | 13

**All reflector lamps below lumen ranges specified above:
200 | 299 | 20 | 20
300 | 399-639 | 30 | 9

◊Rough service, shatter resistant, 3-way incandescent, and vibration service:
250 | 309 | 25 | 25
310 | 749 | 40 | 12
750 | 1049 | 60 | 20
1050 | 1489 | 75 | 28
1490 | 2600 | 100 | 46
2601 | 3300 | 150 | 66
3301 | 3999 | 200 | 200
4000 | 6000 | 300 | 300

Baseline Wattages for PAR, MR, and MRX Lamp Types

For highly focused directional lamps, center beam candle power (CBCP) and beam angle measurements are needed for an accurate estimate of the equivalent baseline wattage. The formula below is based on the ENERGY STAR Center Beam Candle Power tool.82 If CBCP and beam angle information are not available, or if the equation below returns a negative or undefined value, the manufacturer's recommended baseline wattage equivalent is used instead.83 The WattsBase algorithm below is for reference.

WattsBase = 375.1 − 4.355(D) − √(227,800 − 937.9(D) − 0.9903(D²) − 1,479(BA) − 12.02(D × BA) + 14.69(BA²) − 16,720 × ln(CBCP))84

Where:

D = Bulb diameter (e.g. for PAR20 D = 20)

BA = Beam angle

CBCP = Center beam candle power

82 http://www.energystar.gov/ia/products/lighting/iledl/IntLampCenterBeamTool.zip

83 The ENERGY STAR CBCP tool does not accurately model baseline wattages for lamps with certain bulb

characteristic combinations – specifically for lamps with very high CBCP.

84 Illinois TRM (v6.0) Vol.3 P.245

Page 380: 2018 DSM Portfolio Evaluation Report - NIPSCO

373

The result of the ENERGY STAR calculator or the equation above should be rounded down to the nearest wattage established by ENERGY STAR, presented in Table 251:

Table 251. Baseline Wattages for Qualifying LED PAR, MR, and MRX Lamps

Lamp Diameter | Permitted Wattages
16 | 20, 35, 40, 45, 50, 60, 75
20 | 50
30S | 40, 45, 50, 60, 75
30L | 50, 75
38 | 40, 45, 50, 55, 60, 65, 75, 85, 90, 100, 120, 150, 250
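The CBCP formula and the round-down rule can be sketched together. This is a hedged illustration, not the report's tooling: the PAR38 inputs at the bottom are hypothetical, and the "30S"/"30L" distinction is keyed here by a caller-supplied label:

```python
import math

# Hedged sketch of the directional-lamp baseline: the ENERGY STAR CBCP
# formula quoted above (per the Illinois TRM v6.0 citation), rounded down
# to the permitted wattages in Table 251.

PERMITTED = {
    "16": [20, 35, 40, 45, 50, 60, 75],
    "20": [50],
    "30S": [40, 45, 50, 60, 75],
    "30L": [50, 75],
    "38": [40, 45, 50, 55, 60, 65, 75, 85, 90, 100, 120, 150, 250],
}

def watts_base(d, ba, cbcp):
    """Return the raw equivalent wattage, or None if undefined/negative."""
    radicand = (227_800 - 937.9 * d - 0.9903 * d**2 - 1_479 * ba
                - 12.02 * d * ba + 14.69 * ba**2 - 16_720 * math.log(cbcp))
    if radicand < 0:
        return None  # fall back to manufacturer's claimed equivalency
    w = 375.1 - 4.355 * d - math.sqrt(radicand)
    return w if w > 0 else None

def rounded_base(d_key, d, ba, cbcp):
    w = watts_base(d, ba, cbcp)
    if w is None:
        return None
    allowed = [p for p in PERMITTED[d_key] if p <= w]
    return max(allowed) if allowed else None

# Hypothetical PAR38 flood: 40-degree beam angle, CBCP of 1,500 candelas
print(rounded_base("38", 38, 40, 1500))
```

Very high CBCP values drive the radicand negative, reproducing the footnote's caveat that the tool does not model some bulb characteristic combinations.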

First-Year, Lifetime, and Net Present Value ISRs

The evaluation team relied on the UMP to calculate lifetime ISRs through 2021 to account for future installations of bulbs in storage. The methodology assumes 24% of all bulbs remaining in storage will be installed in each subsequent year after purchase. To account for the time sensitivity of these added savings, which stem from increased ISRs but take place after 2018, the evaluation team discounted the lifetime ISR 10% annually to obtain NPV lifetime ISRs for each LED. Table 252 compares first-year and lifetime ISRs, showing how marginal increases to first-year ISRs under the UMP methodology result in the NPV lifetime ISRs used in measure impact calculations.

Table 252. First-Year and Lifetime ISR Calculations

Measure | First-Year ISR | 2019 | 2020 | 2021 | Lifetime ISR | NPV ISR
General Service LED | 86% | 3% | 3% | N/Aᵃ | 97% | 92%
Specialty/Reflector LED | 86% | 3% | 3% | 3% | 97% | 96%
ᵃ General service lamps are not anticipated to have gross savings post EISA 2020 implementation, and the UMP recommends a final ISR for these measures of 92%.
Percentages are rounded.
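The buildup mechanics can be sketched as follows. This is a hedged illustration only: the UMP storage assumption yields roughly 3-percentage-point ISR increments in each year after 2018, and the exact rounding and discount-year conventions here are assumptions, so the results land near, not exactly on, the Table 252 values:

```python
# Hedged sketch of the ISR buildup: future annual ISR increments are
# added to the first-year ISR (lifetime ISR) and discounted at 10%/yr
# (NPV ISR). Discount-year convention is an assumption.

def isr_buildup(first_year, increments, rate=0.10):
    lifetime = first_year + sum(increments)
    npv = first_year + sum(
        inc / (1 + rate) ** (year + 1) for year, inc in enumerate(increments)
    )
    return lifetime, npv

# Specialty/reflector: 86% first year plus 3% in each of 2019-2021
lifetime, npv = isr_buildup(0.86, [0.03, 0.03, 0.03])
print(f"lifetime ISR = {lifetime:.0%}, NPV ISR = {npv:.0%}")
```

The key structural point survives any rounding convention: the NPV ISR always falls between the first-year ISR and the undiscounted lifetime ISR.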

Waste Heat Factors

The evaluation team applied the Indiana TRM (v2.2) waste heat factor (WHF) values for South Bend to each program lamp, using the values shown in Table 253.

Table 253. Indiana TRM (v2.2) Waste Heat Factors by City

City | WHFe | WHFd | WHFg
Indianapolis | -0.061 | 0.055 | -0.0018
South Bend | -0.070 | 0.038 | -0.0019
Evansville | -0.034 | 0.092 | -0.0017
Ft Wayne | -0.082 | 0.038 | -0.0019
Terre Haute | -0.048 | 0.061 | -0.0018
Statewide | -0.059 | 0.057 | -0.0018


Appendix D. Residential Home Energy Analysis Program Assumptions and Algorithms

This appendix contains the assumptions for the electric energy savings, peak demand reduction, and natural gas energy savings algorithms for the measures within the HEA program. The evaluation team examined each assumption used by the algorithms to capture savings and compared these against the Indiana TRM (v2.2), as well as other state and industry approaches.

Detailed information on the analysis and supporting assumptions for the following HEA program measures is included within this appendix:

• 9-watt LEDs

• Kitchen faucet aerators

• Bathroom faucet aerators

• Low-flow showerheads

• Pipe wrap

• Filter whistle

• Duct sealing

• Attic insulation

Table 254 lists our assumptions for the ex post per measure savings.

Table 254. HEA Program Measures

Measure | Reviewed Assumptions
LEDs | New and baseline wattages, HOU, WHF, CFs
Kitchen Faucet Aerator | New and baseline flow rates, occupants per dwelling, minutes of use per day, faucets per home, water temperatures, water heater fuel type and efficiency
Bathroom Faucet Aerator | New and baseline flow rates, occupants per dwelling, minutes of use per day, faucets per home, water temperatures, water heater fuel type and efficiency
Low-Flow Showerhead | New and baseline flow rates, occupants per dwelling, minutes of use per day, showerheads per home, water temperatures, water heater fuel type and efficiency
Pipe Wrap | New and baseline R-values, pipe diameter, water heater recovery efficiency
Filter Whistle | Full load heating and cooling hours, efficiency ratings, efficiency improvement
Duct Sealing | New and baseline distribution efficiencies, full load heating and cooling hours, capacity and efficiencies of heating and cooling equipment
Attic Insulation | Void space and compression factor, pre-installation and post-installation R-values, square footage of installed insulation


Details by Measure

The algorithms and assumptions the evaluation team used to calculate ex post savings for each of these measures follow.

LEDs

The following equations are used to calculate electric energy, demand, and therm savings for LEDs:

kWh savings per lamp = (Wbase − WLED) × (Daily hours of use × 365) × (1 + WHFe) / 1,000

kW reduction per lamp = (Wbase − WLED) × Coincidence Factor × (1 + WHFd) / 1,000

Therm savings per lamp = (Wbase − WLED) × (Daily hours of use × 365) × (1 + WHFg) / 1,000

Where:

Wbase = Wattage of the bulb being replaced, W

WLED = Wattage of the LED bulb, W

Daily hours of use = Average HOU per year, hr

WHFe = Waste heat factor for energy (depends on location)

WHFd = Waste heat factor for demand (depends on location)

WHFg = Waste heat factor for natural gas (depends on location)

Coincidence Factor = Summer peak CF

365 = Number of days per year, days/yr

1,000 = Constant to convert watts to kW

Table 255 lists the input assumptions and source of each assumption for the LED measure savings

calculations.


Table 255. Ex Post Variable Assumptions for LEDs

Input | Value | Source
WattsBase | 43 | 2016 Residential Lighting Evaluation Protocol, UMP
WattsEff | 9 | Actual installed wattage
ISR | 1 | Indiana TRM (v2.2)
Hours | 902 | Indiana TRM (v2.2)
Coincidence Factor | 0.11 | Indiana TRM (v2.2)
Energy Waste Heat Factor (WHFe) | -0.07 | Indiana TRM (v2.2), averaged across participant location
Demand Waste Heat Factor (WHFd) | 0.038 | Indiana TRM (v2.2), averaged across participant location
Natural Gas Waste Heat Factor (WHFg) | -0.0019 | Indiana TRM (v2.2), averaged across participant location
Conversion Factor | 1,000 | Convert watts to kW

Kitchen and Bathroom Faucet Aerators

The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas energy savings for low-flow kitchen and bathroom faucet aerators:

kWh savings = (GPMbase − GPMlow flow) × MPD × (PH / FH) × 8.3 × (Tmix − Tinlet) × 365 / (RE × 3,412)

kW savings = (GPMbase − GPMlow flow) × 60 × 8.3 × (Tmix − Tinlet) / (RE × 3,412) × CF

therm savings = (GPMbase − GPMlow flow) × MPD × (PH / FH) × 8.3 × (Tmix − Tinlet) × 365 / (RG × 100,000)

Where:

GPMbase = Gallons per minute of baseline faucet aerator

GPMlow flow = Gallons per minute of low-flow faucet aerator

MPD = Average minutes of faucet use per person per day

PH = Average number of people per household

FH = Average number of faucets per household

Tmix = Mixed water temperature exiting faucet, ℉

Tinlet = Cold water temperature entering the domestic hot water (DHW) system, ℉

RE = Recovery efficiency of electric hot water heater

RG = Recovery efficiency of natural gas hot water heater

CF = Summer peak CF

60 = Minutes per hour, min/hr

8.3 = Specific weight of water, lb/gal, multiplied by the specific heat of water (1.0 Btu/lb-°F)

3,412 = Constant to convert Btu to kWh


365 = Days per year, day/yr

100,000 = Constant to convert MMBtu to therm

Table 256 lists the input assumptions and source of each assumption for the kitchen and bathroom

faucet aerator measure savings calculations.

Table 256. Ex Post Variable Assumptions for Faucet Aerators

Input | Kitchen Value | Bathroom Value | Source
GPMbase | 2.44 | 1.90 | Indiana TRM (v2.2)
GPMlow flow | 1.5 | 1.0 | Actual
MPD | 4.5 | 1.6 | Indiana TRM (v2.2)
PH | 2.64 | 2.64 | Indiana TRM (v2.2)
FH | 1 | 2.04 | Indiana TRM (v2.2)
Tmix | 93 | 86 | Indiana TRM (v2.2)
Tinlet - Electric WHF | 57˚F | 56.9˚F | Indiana TRM (v2.2), averaged across participant location
Tinlet - Natural Gas WHF | 57.2˚F | 57.2˚F | Indiana TRM (v2.2), averaged across participant location
RE | 0.98 | 0.98 | Indiana TRM (v2.2)
RG | 0.76 | 0.76 | Indiana TRM (v2.2)
CF | 0.0033 | 0.0012 | Indiana TRM (v2.2)
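The kitchen-aerator electric case can be worked through directly from the kWh equation and the Table 256 column above, assuming an electric water heater. A minimal sketch:

```python
# Hedged sketch of the kitchen-aerator electric kWh algorithm using the
# Table 256 kitchen-column inputs; the arithmetic follows the kWh equation.

def aerator_kwh(gpm_base=2.44, gpm_low=1.5, mpd=4.5, ph=2.64, fh=1.0,
                t_mix=93.0, t_inlet=57.0, re=0.98):
    gal_saved_per_day = (gpm_base - gpm_low) * mpd * ph / fh
    btu_per_day = gal_saved_per_day * 8.3 * (t_mix - t_inlet)   # hot-water Btu avoided
    return btu_per_day * 365 / (re * 3_412)                     # annual kWh

print(f"{aerator_kwh():.0f} kWh/yr")
```

The bathroom case uses the second column (notably FH = 2.04, which spreads the per-person minutes across more faucets), and the therm case swaps RE × 3,412 for RG × 100,000.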

Low-Flow Showerheads

The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas energy savings for low-flow showerheads:

kWh savings = (GPMbase − GPMlow flow) × MS × SPD × (PH / SH) × 8.3 × (Tmix − Tinlet) × 365 / (RE × 3,412)

kW savings = (GPMbase − GPMlow flow) × 60 × 8.3 × (Tmix − Tinlet) / (RE × 3,412) × CF

therm savings = (GPMbase − GPMlow flow) × MS × SPD × (PH / SH) × 8.3 × (Tmix − Tinlet) × 365 / (RG × 100,000)

Where:

GPMbase = Gallons per minute of baseline showerhead, GPM

GPMlow flow = Gallons per minute of low-flow showerhead, GPM

MS = Average minutes per shower event

SPD = Average number of shower events per person per day

PH = Average number of people per household

SH = Average number of showerheads per household

Tmix = Mixed water temperature exiting faucet, ℉

Tinlet = Cold water temperature entering the DHW system, ℉


RE = Recovery efficiency of electric hot water heater

RG = Recovery efficiency of natural gas hot water heater

CF = Summer peak CF

60 = Minutes per hour, min/hr

8.3 = Specific weight of water in pounds per gallon, then multiplied by specific water

temperature (1.0 Btu/lb-°F)

3,412 = Constant to convert Btu to kWh

365 = Days per year

100,000 = Constant to convert MMBtu to therm

Table 257 lists the input assumptions and source of each assumption for the low-flow showerhead

measure savings calculations.

Table 257. Ex Post Variable Assumptions for Low-Flow Showerheads

Input | Value | Source
GPMbase | 2.63 | Indiana TRM (v2.2)
GPMlow flow | 1.5 | Actual
MS | 7.8 | Indiana TRM (v2.2)
SPD | 0.6 | Indiana TRM (v2.2)
PH | 2.64 | Indiana TRM (v2.2)
SH | 1.6 | Indiana TRM (v2.2)
Tmix | 101 | Indiana TRM (v2.2)
Tinlet - Electric WHF | 56.7˚F | Indiana TRM (v2.2), averaged across participant location
Tinlet - Natural Gas WHF | 57.2˚F | Indiana TRM (v2.2), averaged across participant location
RE | 0.98 | Indiana TRM (v2.2)
RG | 0.76 | Indiana TRM (v2.2)
CF | 0.0023 | Indiana TRM (v2.2)
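The natural gas case can be illustrated with the therm equation and the Table 257 inputs above. A minimal sketch for a home with a gas water heater:

```python
# Hedged sketch of the low-flow showerhead therm algorithm with the
# Table 257 inputs (natural gas water heater, Tinlet = 57.2 F).

def showerhead_therms(gpm_base=2.63, gpm_low=1.5, ms=7.8, spd=0.6,
                      ph=2.64, sh=1.6, t_mix=101.0, t_inlet=57.2, rg=0.76):
    gal_saved_per_day = (gpm_base - gpm_low) * ms * spd * ph / sh
    btu_per_day = gal_saved_per_day * 8.3 * (t_mix - t_inlet)   # hot-water Btu avoided
    return btu_per_day * 365 / (rg * 100_000)                   # annual therms

print(f"{showerhead_therms():.1f} therms/yr")
```

The electric case follows the same structure with Tinlet = 56.7°F and RE × 3,412 in the denominator.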

Pipe Wrap Insulation

The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas energy savings for pipe wrap insulation:

kWh savings = (1/Rexist − 1/Rnew) × L × C × ΔT × 8,760 / (NDHW × 3,412)

kW savings = ΔkWh / 8,760

therm savings = (1/Rexist − 1/Rnew) × L × C × ΔT × 8,760 / (NDHW × 100,000)


Where:

Rexist = Pipe heat loss coefficient (R-value) of existing, uninsulated pipe, °F·hr·ft²/Btu

Rnew = Pipe heat loss coefficient (R-value) of insulated pipe, °F·hr·ft²/Btu

L = Length of pipe from water heating source covered by pipe wrap, in feet

C = Circumference of pipe, in feet

∆T = Average temperature difference between supplied water and ambient air

temperature, ℉

NDHW = Recovery efficiency of hot water heater

8,760 = Hours per year, hr

3,412 = Constant to convert Btu to kWh

100,000 = Constant to convert MMBtu to therm

Table 258 lists the input assumptions and source of each assumption for the pipe wrap insulation

measure savings calculations.

Table 258. Ex Post Variable Assumptions for Pipe Wrap Insulation

Input | Value | Source
Rexist | 1 | Indiana TRM (v2.2)
Rnew | 3 | Indiana TRM (v2.2)
L | 1 | Calculates savings per foot of pipe length
C | 0.131 | Circumference for 0.5-inch pipe
ΔT | 65 | Indiana TRM (v2.2)
NDHW - electric | 0.98 | Indiana TRM (v2.2)
NDHW - natural gas | 0.75 | Indiana TRM (v2.2)
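Because L = 1, the Table 258 inputs yield a per-foot savings value. A minimal sketch of the electric case:

```python
# Hedged sketch of the per-foot pipe wrap algorithm with Table 258 inputs
# for an electric water heater (NDHW = 0.98); the gas case uses 0.75 and
# divides by 100,000 to get therms.

def pipe_wrap_kwh_per_ft(r_exist=1.0, r_new=3.0, c=0.131, dt=65.0,
                         n_dhw=0.98):
    btu_per_hr = (1 / r_exist - 1 / r_new) * 1.0 * c * dt  # L = 1 ft of pipe
    return btu_per_hr * 8_760 / (n_dhw * 3_412)            # annual kWh

kwh = pipe_wrap_kwh_per_ft()
print(f"{kwh:.1f} kWh/yr per foot, {kwh / 8_760 * 1000:.2f} W average")
```

Dividing the annual kWh by 8,760 reproduces the report's kW equation, since the standby loss is assumed constant across the year.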

Furnace Filter Whistles

The following equations are used to calculate electric energy, peak demand, and natural gas energy savings for filter whistles:

kWh SavingsCAC = FLHcool × BtuHCAC × (1/SEER) / 1,000 × EFelectric

kWh Savingsheat = FLHheat × BtuHheat × (1/(COP × 3.412)) / 1,000 × EFelectric

kW SavingsCAC = BtuHCAC × (1/EER) / 1,000 × EFelectric × CF

therm Savings = FLHheat × (BtuHgas / ηgas) / 100,000 × EFgas

Where:

FLHheat = Full load heating hours, hr

FLHcool = Full load cooling hours, hr

BtuHCAC = Size of central air conditioning unit, Btu/h

BtuHheat = Size of electric furnace, Btu/h

BtuHgas = Size of natural gas furnace, Btu/h

SEER = Seasonal energy efficiency ratio, Btu/h/Wh

EER = Energy efficiency ratio Btu/h/Wh

EFelectric = Efficiency savings for air conditioner and electric furnace

EFgas = Efficiency savings for natural gas furnace

COP = Coefficient of performance for electric furnace

CF = Coincidence factor

ηgas = Furnace efficiency

3.412 = Constant to convert Btu to Wh

1,000 = Constant to convert Wh to kWh

Table 259 shows the input assumptions and the source of each assumption in the filter whistle measure

savings calculations.

Table 259. Ex Post Variable Assumptions for Filter Whistles

Input | Value | Source
FLHcool | 431 | Indiana TRM (v2.2), averaged across participant location
FLHheat | 1,427 | Indiana TRM (v2.2), averaged across participant location
SEER | 11.15 | Indiana TRM (v2.2)
EER | 10.035 | Indiana TRM (v2.2): EER = SEER x 0.9
BtuHCAC | 28,994 | Indiana TRM (v2.2)
BtuHheat | 32,000 | 2016 Pennsylvania TRM
BtuHgas | 77,386 | Indiana TRM (v2.2)
EFgas | 0.0185 | Quantec analysis: Engineering Review and Savings Estimates for the "Filtertone" Filter Restriction Alarm
EFelec | 0.035 | Quantec analysis: Engineering Review and Savings Estimates for the "Filtertone" Filter Restriction Alarm
COP | 1 | Indiana TRM (v2.2)
ηgas | 0.8 | Indiana TRM (v2.2)
CF | 0.88 | Indiana TRM (v2.2)
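The cooling-kWh and gas-therm equations can be illustrated with the Table 259 inputs. A minimal sketch:

```python
# Hedged sketch of the filter-whistle cooling kWh and gas therm algorithms
# with Table 259 inputs; the EF factors scale the equipment's annual
# full-load consumption by the assumed efficiency improvement.

def whistle_cac_kwh(flh_cool=431, btuh_cac=28_994, seer=11.15, ef=0.035):
    # annual cooling kWh at full load, scaled by the electric EF
    return flh_cool * btuh_cac * (1 / seer) / 1_000 * ef

def whistle_gas_therms(flh_heat=1_427, btuh_gas=77_386, eta=0.8, ef=0.0185):
    # annual furnace input therms at full load, scaled by the gas EF
    return flh_heat * (btuh_gas / eta) / 100_000 * ef

print(f"{whistle_cac_kwh():.1f} kWh/yr, {whistle_gas_therms():.1f} therms/yr")
```

The electric-heat kWh equation follows the same pattern with BtuHheat, COP, and the 3.412 Btu/Wh conversion.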

Duct Sealing

The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas savings for duct sealing:


kWh Savings (heating) = ((DEafterheat − DEbeforeheat) / DEafterheat) × (FLHheat × BtuHheat / (3,412 × Nheating))

kWh Savings (cooling) = ((DEaftercool − DEbeforecool) / DEaftercool) × (FLHcool × BtuHcool / (SEER × 1,000))

kW savings = ((DEpkafter − DEpkbefore) / DEpkafter) × (BtuHcool / (EER × 1,000)) × CF

therm savings = ((DEafterheat − DEbeforeheat) / DEafterheat) × (FLHheat × BtuHheat / 100,000)

Where:

DEaftercool = Distribution efficiency after duct sealing

DEbeforecool = Distribution efficiency before duct sealing

DEafterheat = Distribution efficiency after duct sealing

DEbeforeheat = Distribution efficiency before duct sealing

DEpkafter = Distribution efficiency under peak summer conditions after duct sealing

DEpkbefore = Distribution efficiency under peak summer conditions before duct sealing

FLHcool = Full load cooling hours

FLHheat = Full load heating hours

BtuHcool = Cooling capacity of cooling equipment (Btu per hour)

BtuHheat = Heating capacity of heating equipment (Btu per hour)

Nheat = Efficiency in COP of heating equipment

SEER = Seasonal average efficiency of air conditioning equipment

EER = Peak efficiency of air conditioning equipment

CF = Coincidence factor

Table 260 lists the assumptions and source of each assumption for the duct sealing measure savings

calculations.

Table 260. Ex Post Variable Assumptions for Duct Sealing

Input | Value | Source
DEaftercool | 0.84 | Indiana TRM (v2.2). Average ducts in unconditioned basement and attic for 8-10% leakage
DEbeforecool | 0.75 | Indiana TRM (v2.2). Average ducts in unconditioned basement and attic for 15-30% leakage
DEafterheat | 0.82 | Indiana TRM (v2.2). Average ducts in unconditioned basement and attic for 8-10% leakage
DEbeforeheat | 0.75 | Indiana TRM (v2.2). Average ducts in unconditioned basement and attic for 15-30% leakage
DEpkafter | 0.79 | Indiana TRM (v2.2). Average ducts in unconditioned basement and attic for 8-10% leakage
DEpkbefore | 0.68 | Indiana TRM (v2.2). Average ducts in unconditioned basement and attic for 15-30% leakage
FLHheat | 1,425 | Indiana TRM (v2.2), averaged across participant location - joint customers, natural gas heat with CAC
FLHheat | 1,400 | Indiana TRM (v2.2), averaged across participant location - joint and electric-only customers, electric heat with CAC
FLHheat | 1,404 | Indiana TRM (v2.2), averaged across participant location - natural-gas-only customers
FLHcool | 429 | Indiana TRM (v2.2), averaged across participant location - joint customers, natural gas heat with CAC
FLHcool | 409 | Indiana TRM (v2.2), averaged across participant location - joint and electric-only customers, electric heat with CAC
FLHcool | 417 | Indiana TRM (v2.2), averaged across participant location - joint and electric cooling-only customers
BTUHcool | 28,994 | Indiana TRM (v2.2)
BTUHheat | 77,386 | Indiana TRM (v2.2) (natural gas heat)
BTUHheat | 32,000 | 2016 Pennsylvania TRM (electric heat)
SEER | 11.15 | Indiana TRM (v2.2)
EER | 10.035 | Calculated based on SEER
N-heating | 1 | Indiana TRM (v2.2)
Coincidence factor | 0.88 | Indiana TRM (v2.2)
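The therm equation can be worked through with the Table 260 inputs. A minimal sketch; the FLHheat value used here (1,404, natural-gas-only customers) is one of the three tabled values, and which one applies depends on the participant's fuel and equipment mix:

```python
# Hedged sketch of the duct-sealing therm algorithm with Table 260 inputs:
# the fraction of heating energy recovered by the distribution-efficiency
# improvement, applied to the furnace's annual full-load gas input.

def duct_therms(de_after=0.82, de_before=0.75, flh_heat=1_404,
                btuh_heat=77_386):
    leakage_recovered = (de_after - de_before) / de_after
    return leakage_recovered * flh_heat * btuh_heat / 100_000

print(f"{duct_therms():.1f} therms/yr")
```

The cooling-kWh and peak-kW equations have the same (after − before)/after structure, using the cooling and peak distribution efficiencies respectively.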

Attic Insulation

The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas savings for attic insulation:

kWh savings = (SF / 1,000) × (ΔkWh / kSF)

kW savings = (SF / 1,000) × (ΔkW / kSF) × CF

therm savings = (SF / 1,000) × (ΔMMBtu / kSF) × 10

Where:

SF = Total area of installed insulation in square feet

∆kWh/ksf = Energy savings expected for every 1,000 square feet of insulation installed with

respect to pre- and post-R-values from data tracking information

∆kW/ksf = Demand reduction expected for every 1,000 square feet of insulation installed

with respect to pre- and post-R-values from data tracking information

∆MMBtu/ksf = Natural gas savings expected for every 1,000 square feet of insulation installed

with respect to pre- and post-R-values from data tracking information

CF = Coincidence factor

Page 390: 2018 DSM Portfolio Evaluation Report - NIPSCO

383

Electric energy, peak demand, and natural gas energy savings depend upon pre- and post-measure insulation R-values, calculated using the following steps:

• Step 1. Determine variables for insulation compression, Rratio, and void factors

• Step 2. Calculate adjusted R-values, Radj

• Step 3. Interpolate within Indiana TRM (v2.2) tables to obtain savings per 1,000 square feet of insulation, yielding values for ΔkWh/kSF, ΔkW/kSF, and ΔMMBtu/kSF

Step 1. Determine variables for insulation compression, Rratio, and void factors:

Adjusted pre-installation and post-installation R-values are calculated using the following formula:

Radj = Rnominal x Fcompression x Fvoid

Where:

Rnominal = Total installed R-value per manufacturer's specifications. This value varies across participants and was calculated at the individual level to capture each participant's pre- and post-measure savings.

Fcompression = Insulation compression factor, assumed to be 1 for 0% compression (as shown in

Indiana TRM v2.2), because actual information is unknown.

Fvoid = Void factor, dependent on insulation grade level and percent coverage, assumed to

be at the 2% grade per the Indiana TRM (v2.2), because actual information is

unknown.

The void factor, Fvoid, varies based on the ratio of the full assembly R-value to the nominal R-value, Rnominal, including compression effects. Pre- and post-insulation values are then determined using the following equation:

Rratio = (Rnominal × Fcompression) / (Rnominal + Rframing&airspace)

Where:

Rnominal = Total installed R-value per manufacturer's specifications. This value varies across participants and was calculated at the individual level to capture each participant's pre- and post-measure savings.

Fcompression = Insulation compression factor, assumed to be 1 for 0% compression (as shown in the Indiana TRM (v2.2)), because actual information is unknown.

Rframing&airspace = R-value for materials, framing, and airspace for the area in which the insulation

is installed. Assumed to be R-5, per Indiana TRM (v2.2).

Values for void factors, based on the Rratio calculation are shown in Table 261. The evaluation team

assumed a void factor at 2% in accordance with the Indiana TRM (v2.2).


Table 261. Insulation Void Factors

Rratio | Fvoid, 2%
0.50 | 0.96
0.55 | 0.96
0.60 | 0.95
0.65 | 0.94
0.70 | 0.94
0.75 | 0.92
0.80 | 0.91
0.85 | 0.88
0.90 | 0.83
0.95 | 0.71
0.99 | 0.33
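Steps 1 and 2 can be sketched together: compute Rratio from the equation above, interpolate the 2%-grade void factor from Table 261, and form the adjusted R-value. The linear interpolation between table rows is an assumption (the TRM may use table lookups), and the R-38 input is a hypothetical example:

```python
# Hedged sketch of the R-value adjustment: Rratio -> Fvoid (Table 261,
# 2% grade, linearly interpolated) -> Radj = Rnominal x Fcompression x Fvoid.

RRATIO = [0.50, 0.55, 0.60, 0.65, 0.70, 0.75, 0.80, 0.85, 0.90, 0.95, 0.99]
FVOID2 = [0.96, 0.96, 0.95, 0.94, 0.94, 0.92, 0.91, 0.88, 0.83, 0.71, 0.33]

def interp(x, xs, ys):
    """Piecewise-linear interpolation, clamped to the table's ends."""
    if x <= xs[0]:
        return ys[0]
    for x0, x1, y0, y1 in zip(xs, xs[1:], ys, ys[1:]):
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return ys[-1]

def r_adjusted(r_nominal, f_compression=1.0, r_framing_airspace=5.0):
    r_ratio = (r_nominal * f_compression) / (r_nominal + r_framing_airspace)
    return r_nominal * f_compression * interp(r_ratio, RRATIO, FVOID2)

# Hypothetical post-installation nominal R-38 attic:
print(f"Radj = {r_adjusted(38):.1f}")
```

The same calculation is run on the pre-installation nominal R-value, and the two adjusted values drive the Step 3 TRM table interpolation.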

Step 2. Calculate Radj

Pre- and post-R-values, Radj, are calculated at the participant level using Rnominal and Rratio.

Step 3. Determine ∆kWh/ksf, ∆kW/ksf, and ∆MMBtu/ksf

Electric energy, peak demand, and natural gas energy savings per thousand square feet values were

obtained by interpolating within Indiana TRM (v2.2) tables and averaging across participant locations.


Appendix E. Residential Home Energy Analysis Program Billing Analysis Methodology

The evaluation team used billing analysis to estimate ex post gross natural gas savings from duct sealing installed through the HEA program. The evaluation team conducted the analysis in the following steps:

• Data Processing

• Data Summary

• Modeling

• Savings Estimation

• Robustness Checks

• Site Attrition

Data Processing

The evaluation team processed and validated participant consumption (or billing) data, program tracking data, and weather data. Prior to providing the participant billing data, NIPSCO staff verified the accuracy and completeness of the data.

Participant Consumption Data

The evaluation team received participant consumption data after initial, thorough cleaning completed by NIPSCO staff. The evaluation team conducted basic validation checks to confirm the data looked as expected:

• The data contained over 99% of the participants identified in the tracking data

• The data covered the requested date range; there were little-to-no duplicated rows

• Instances of high usage typically occurred at high-usage sites and during cold months.

Furthermore, as part of a series of robustness checks, the evaluation team conducted a sensitivity analysis around the effect of high-usage customers on savings; the effect on final savings was small and not statistically significant.

Program Tracking Data

The evaluation team reviewed program tracking data for duplicates or other data quality issues and verified its accuracy and completeness by reconciling it against the program scorecard. No participants were tracked as having multiple duct sealing measures installed. All natural gas savings for duct sealing measures were tracked at the expected claimed savings.

Weather Data

The evaluation team collected two forms of weather data: actual-year weather data and TMY3 weather data. The evaluation team downloaded hourly actual-year weather data from the


National Oceanic and Atmospheric Administration’s Integrated Surface Database.85 The evaluation team

downloaded TMY3 data from the National Renewable Energy Laboratory.86

The evaluation team matched each site to the closest weather station where both actual and TMY3 data were available. While this criterion limits the available weather stations, it ensures accurate weather normalization and generally narrows the candidates to stations with reliable data. The distance between a site and its closest weather station was calculated using the latitude and longitude of each site's zip code compared to the latitude and longitude of each weather station.
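The site-to-station match described above can be sketched as a great-circle (haversine) distance from each site's zip-code centroid to the candidate stations. This is an assumed implementation, and the station coordinates below are illustrative, not the evaluation's actual station list:

```python
import math

# Hedged sketch: nearest-station assignment by great-circle distance,
# restricted to stations that have both actual-year and TMY3 data.

def haversine_miles(lat1, lon1, lat2, lon2):
    r = 3_959.0  # mean Earth radius, miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def closest_station(site, stations):
    """site is (lat, lon); stations are (name, lat, lon) tuples."""
    return min(stations, key=lambda s: haversine_miles(site[0], site[1],
                                                       s[1], s[2]))

# Illustrative station coordinates (approximate):
stations = [("South Bend", 41.71, -86.32), ("Fort Wayne", 41.01, -85.21)]
print(closest_station((41.60, -87.34), stations)[0])  # Merrillville-area zip
```

Zip-centroid coordinates are coarser than street addresses, but at typical station spacing the nearest-station assignment is rarely affected.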

Data Summary

The evaluation team pursued a billing analysis evaluation for HEA because it offers an additional level of rigor beyond the engineering review used to evaluate HEA in previous program years, when sample sizes and/or billing data were insufficient. Upon reviewing the HEA tracking data and consumption data, the evaluation team decided that estimating natural gas savings from duct sealing would be the most appropriate approach: there was a reasonable sample size, the claimed savings are approximately equal to or greater than 10% of annual energy usage, and duct sealing is a weather-sensitive measure. Although other measures were installed for about 96% of duct sealing participants, duct sealing represents 88% of the site-level natural gas savings on average.

The average site-level electric savings are about 5% of annual usage, with lighting and low-flow showerheads contributing the highest savings per site. If the program installs duct sealing for participants with central air conditioning at a high enough volume, the site-level savings and sample size may be adequate to achieve statistically significant results from billing analysis. However, the evaluation team notes that electric billing analysis may still be challenging for customers with natural gas heat and electric cooling, given Indiana's climate and the substantial variability in electric usage data in general.

The evaluation team considered conducting natural gas billing analysis for sites with attic insulation, but

the program installed attic insulation in only 16 homes during the period under review.

Modeling

The evaluation team fit site-specific regression models for the pre- and post-treatment periods at each site. These models disaggregate heating from non-weather-related energy use and enable the evaluation team to estimate heating energy use during a typical weather year.

𝑎𝑑𝑢𝑖,𝑗,𝑡 = 𝛼𝑖,𝑗 + 𝛽𝑖,𝑗 ∗ 𝐻𝐷𝐷. 𝐴𝑐𝑡𝑖,𝑗,𝑡 + 𝜖𝑖,𝑗,𝑡

85 https://www.ncdc.noaa.gov/isd

86 https://rredc.nrel.gov/solar/old_data/nsrdb/1991-2005/tmy3/


Where:

𝑎𝑑𝑢𝑖,𝑗,𝑡 = Average daily usage (therms) for site i during month t in period j (pre- or

post-treatment period)

𝛼𝑖,𝑗 = Modeled intercept for site i in period j (pre- or post-treatment period)

𝛽𝑖,𝑗 = Modeled coefficient for change in adu over change in HDD for site i in

period j (pre- or post-treatment period)

𝐻𝐷𝐷. 𝐴𝑐𝑡𝑖,𝑗,𝑡 = Actual weather heating degree days base 65 for site i during month t in

period j (pre- or post-treatment period)

𝜖𝑖,𝑗,𝑡 = Model error for site i during month t in period j (pre- or post-treatment

period)

Rather than finding the optimal balance temperature for each site, the evaluation team used a set

balance temperature of 65°F. This approach avoids the complexity of whether to interpret changes in

balance temperature as savings or as non-program-related, exogenous effects. Furthermore, it reduces

the likelihood of overfitting the models, where overfitted models may inaccurately extrapolate the

results onto TMY3 weather data. As a robustness check on the final results, however, the evaluation team also estimated savings with an optimized balance temperature.
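The per-site model above can be sketched in a few lines. This is an illustrative reimplementation, not the evaluation team's actual code; the function name and the monthly HDD/usage values are made up for demonstration.

```python
def fit_site_model(hdd, adu):
    """Fit adu = alpha + beta * HDD by ordinary least squares for one site
    and one period (pre- or post-treatment). Returns (alpha, beta)."""
    n = len(hdd)
    mean_h = sum(hdd) / n
    mean_a = sum(adu) / n
    beta = sum((h - mean_h) * (a - mean_a) for h, a in zip(hdd, adu)) / \
        sum((h - mean_h) ** 2 for h in hdd)
    alpha = mean_a - beta * mean_h
    return alpha, beta

# Illustrative monthly observations for one site (values are hypothetical):
# HDD base 65 per billing month and average daily usage in therms/day.
hdd = [0.0, 5.0, 120.0, 400.0, 700.0, 850.0]
adu = [0.4 + 0.006 * h for h in hdd]  # 0.4 therms/day baseload plus heating slope

alpha, beta = fit_site_model(hdd, adu)
print(round(alpha, 3), round(beta, 4))  # 0.4 0.006
```

The intercept captures non-weather-related (baseload) use and the slope captures heating use per HDD, which is the quantity later extrapolated to TMY3 weather.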

To validate the modeling results, the evaluation team compared the model-estimated energy use to

actual annual energy use and verified the reasonableness of the model-estimated percent heating load.

As expected, the model-estimated heating load was about 90% of total annual natural gas consumption.

Savings Estimation
The evaluation team conducted the savings estimation in two steps: estimating site-level savings and isolating the duct sealing savings.

Site-Level Savings
With this approach, the evaluation team calculated site-level savings as the change in weather-normalized heating between the pre- and post-treatment periods for each site. The evaluation team calculated weather-normalized heating as the modeled heating slope (𝛽𝑖,𝑗) from the modeling step times the annual HDD calculated from TMY3 data.

𝑒𝑣𝑎𝑙. 𝑠𝑎𝑣𝑖 = 𝛽𝑖,𝑝𝑟𝑒 ∗ 𝐻𝐷𝐷. 𝑇𝑀𝑌𝑖 − 𝛽𝑖,𝑝𝑜𝑠𝑡 ∗ 𝐻𝐷𝐷. 𝑇𝑀𝑌𝑖

Where:

𝑒𝑣𝑎𝑙. 𝑠𝑎𝑣𝑖 = Evaluated savings for site i (therms/yr)

𝛽𝑖,𝑝𝑟𝑒 = Modeled coefficient for change in adu over change in HDD for site i in

the pre-treatment period

𝐻𝐷𝐷. 𝑇𝑀𝑌𝑖 = Annual typical weather heating degree days base 65 for site i


𝛽𝑖,𝑝𝑜𝑠𝑡 = Modeled coefficient for change in adu over change in HDD for site i in

the post-treatment period

Duct Sealing Savings
Although duct sealing represents the majority of claimed natural gas savings, the evaluation team made an additional effort to isolate an evaluated duct sealing savings value by subtracting the estimated ex post gross savings for other measures from the total site-level savings. The evaluation team estimated ex post gross savings for the other measures by referencing the 2017 audited savings values for each measure and incorporating a three-year weighted ISR.

𝑒𝑣𝑎𝑙. 𝑠𝑎𝑣𝑖,𝑑𝑢𝑐𝑡 𝑠𝑒𝑎𝑙 = 𝑒𝑣𝑎𝑙. 𝑠𝑎𝑣𝑖 − ∑ 𝑎𝑢𝑑𝑖𝑡. 𝑠𝑎𝑣𝑖,𝑞 ∗ 𝐼𝑆𝑅𝑞

𝑞

Where:

𝑒𝑣𝑎𝑙. 𝑠𝑎𝑣𝑖,𝑑𝑢𝑐𝑡 𝑠𝑒𝑎𝑙 = Evaluated duct sealing savings for site i (therms/yr)

𝑒𝑣𝑎𝑙. 𝑠𝑎𝑣𝑖 = Evaluated savings for site i (therms/yr)

𝑎𝑢𝑑𝑖𝑡. 𝑠𝑎𝑣𝑖,𝑞 = 2017 audited savings estimates for measures q installed at site i

(therms/yr)

𝐼𝑆𝑅𝑞 = Three-year weighted average ISR for measure q
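The two-step arithmetic can be sketched as follows. The helper names are ours, and the heating slopes, TMY3 HDD total, audited savings values, and ISRs are hypothetical.

```python
def site_savings(beta_pre, beta_post, hdd_tmy):
    """Weather-normalized site savings (therms/yr): change in heating slope
    between periods times typical-year (TMY3) HDD."""
    return beta_pre * hdd_tmy - beta_post * hdd_tmy

def duct_sealing_savings(total_savings, other_measures):
    """Isolate duct sealing by subtracting audited savings, scaled by ISR,
    for the other measures installed at the site."""
    return total_savings - sum(sav * isr for sav, isr in other_measures)

# Hypothetical site: heating slopes in therms/day per HDD and 6,000 TMY3 HDD.
total = site_savings(0.0300, 0.0285, 6000.0)
# Two other measures with 2017 audited savings of 2.0 and 1.5 therms/yr and
# three-year weighted ISRs of 0.9 and 0.8 (illustrative values only).
duct = duct_sealing_savings(total, [(2.0, 0.9), (1.5, 0.8)])
print(round(total, 1), round(duct, 1))  # 9.0 6.0
```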

Robustness Checks
The evaluation team estimated savings using other reasonable analysis techniques as robustness checks on the findings. The following techniques supported the finding that the evaluated savings are lower than the claimed savings:

• Pooled model

• Variable base degree day method

• Evaluation analysis with high users filtered out

• Evaluation analysis with various requirements on the number of months of data for each site

Site Attrition
The evaluation team produced results for 426 of the 618 HEA participants who installed duct sealing measures with claimed natural gas savings (Table 262). The evaluation team removed sites without enough observations before and after measure installation to ensure that the models accurately captured heating and non-heating-related energy use; the majority of site attrition occurred at this step. The evaluation team also removed sites where other significant natural gas–saving energy efficiency measures were installed, so that the analysis isolated duct sealing impacts from other measures. Lastly, the evaluation team


removed sites from the analysis where the regression diagnostics indicated that the models were not

able to accurately differentiate heating and non-heating related energy use.

These requirements reflect a reasonable balance of site attrition and restrictiveness: although relatively mild, they filter out 31% of eligible sites from the analysis. The evaluation team decided not to apply more stringent requirements in order to preserve more sites in the final analysis, a decision supported by the generally high R-squared values and by robust results that are not particularly sensitive to more stringent billing data requirements.

Table 262. Billing Analysis Disposition

Analysis Step | Total (Remaining / Removed) | Duct Sealing Package Natural Gas and Electric (Remaining / Removed) | Prescriptive Duct Sealing Package Natural Gas and Electric – 2014 Savings – Natural Gas (Remaining / Removed)
Step 1: Natural Gas Sites with Duct Sealing Installed through the HEA Program | 618 / – | 453 / – | 165 / –
Step 2: Sites with Adequate Natural Gas Billing Data(a) | 469 / 148 | 357 / 95 | 112 / 53
Step 3: Sites without Other Significant Natural Gas Energy Efficiency Measures Installed (e.g., Attic Insulation or through the IQW Program) | 462 / 7 | 351 / 6 | 111 / 1
Step 4: Sites with Pre- and Post-Installation Models with R-Squared ≥ 0.80 | 426 / 36 | 317 / 34 | 109 / 2
Final Analysis Data | 426 / – | 317 / – | 109 / –

(a) The evaluation team limited the analysis data set to sites with 3 or more bills pre- and post-installation and with usage data (pre- and post-installation) during a time period with about 25% or more of the year's HDDs. These requirements reflect a reasonable balance of site attrition and restrictiveness, as evidenced by generally high R-squared values on site-specific models and by robustness testing, which suggests that the results are not particularly sensitive to the required number of bills.


Appendix F. Residential School Education Program Ex Post Measure Savings Calculation Methodologies
This section contains the assumptions used in the electric energy, demand reduction, and natural gas energy savings algorithms for measures within the School Education program. The evaluation team examined each assumption behind the algorithms used to capture savings and compared these to the Indiana TRM (v2.2) as well as other state and industry sources.

This appendix includes detailed information on the analysis and supporting assumptions for the

following School program measures:

• 9-watt LEDs

• Kitchen faucet aerator

• Bathroom faucet aerators

• Low-flow showerheads

• Furnace filter whistle

• LED night-light

The algorithms and assumptions used to calculate ex post savings for each of these measures follow.

LEDs
The following equations are used to calculate energy, demand, and therm savings for LEDs:

kWh savings per lamp = (Wbase − WLED) ∗ (Daily hours of use ∗ 365) ∗ (1 + WHFe) / 1,000

kW reduction per lamp = (Wbase − WLED) ∗ Coincidence Factor ∗ (1 + WHFd) / 1,000

Therm savings per lamp = (Wbase − WLED) ∗ (Daily hours of use ∗ 365) ∗ (1 + WHFg) / 1,000

Where:

Wbase = Wattage of the bulb being replaced, W

WLED = Wattage of the LED bulb, W

Daily hours of use ∗ 365 = Average HOU per year

WHFe = Waste heat factor for energy to account for HVAC interactions with lighting (depends on

location)

WHFd = Waste heat factor for demand to account for HVAC interactions with lighting (depends

on location)


WHFg = Waste heat factor for natural gas to account for HVAC interactions with lighting

(depends on location)

Coincidence Factor = Summer peak CF

365 = Number of days per year, days/yr

1,000 = Constant to convert watts to kW

Table 263 lists the input assumptions and source of each assumption for the LED measure savings

calculations.

Table 263. Ex Post Variable Assumptions for LEDs

Input | Value | Source
Wbase for 9-watt (baseline) | 43 | DOE UMP, Chapter 21: Residential Lighting Evaluation Protocol, for post-EISA 9 W LED
WLED for 9-watt (LED) | 9 | Actual installed wattage
Daily hours of use x 365 | 1,135 | Indiana TRM (v2.2) – HOU for school kits
WHFe | -0.07 | Indiana TRM (v2.2), assumed South Bend to account for most of NIPSCO's service territory in the North
WHFd | 0.038 | Indiana TRM (v2.2), assumed South Bend to account for most of NIPSCO's service territory in the North
WHFg | -0.0019 | Indiana TRM (v2.2), assumed South Bend to account for most of NIPSCO's service territory in the North
Coincidence Factor | 0.11 | Indiana TRM (v2.2)
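Plugging the Table 263 inputs into the electric equations above can be sketched as follows. The function names are ours, and the therm interaction is omitted; this is an illustration of the arithmetic, not the evaluation team's tool.

```python
def led_kwh_savings(w_base, w_led, annual_hours, whf_e):
    """Annual kWh savings per lamp, with the energy waste heat factor applied."""
    return (w_base - w_led) * annual_hours * (1 + whf_e) / 1000.0

def led_kw_reduction(w_base, w_led, cf, whf_d):
    """Summer peak kW reduction per lamp, with coincidence and demand
    waste heat factors applied."""
    return (w_base - w_led) * cf * (1 + whf_d) / 1000.0

# Table 263 inputs: 43 W baseline, 9 W LED, 1,135 annual hours,
# WHFe = -0.07, CF = 0.11, WHFd = 0.038.
kwh = led_kwh_savings(43, 9, 1135, -0.07)
kw = led_kw_reduction(43, 9, 0.11, 0.038)
print(round(kwh, 2), round(kw, 5))  # 35.89 0.00388
```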

LED Night-Lights
The following equation was used to calculate energy savings for LED night-lights:

kWh savings per night-light = (Wbase − WLED) ∗ Hours ∗ IRF / 1,000

Where:

Wbase = Wattage of the bulb being replaced, W

WLED = Wattage of the LED night-light, W

Hours = Average HOU per year (daily hours of use ∗ 365), hr

1,000 = Constant to convert watts to kW

IRF = Incandescent replacement factor, or percentage of LED night-lights that replaced incandescent night-lights

Table 264 lists the input assumptions and source of each assumption for the LED night-light measure savings calculations.


Table 264. Ex Post Variable Assumptions for LED Night-Lights

Input | Value | Source
Wbase (baseline night-light) | 5.0 | Indiana TRM (v2.2)
WLED for LED night-light | 0.5 | Actual installed wattage
Daily hours of use x 365 | 2,920 | Indiana TRM (v2.2) – HOU for school kits
IRF | 0.4 | Indiana School Kit Parent Survey 2018 data
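The night-light equation with the Table 264 inputs can be sketched as follows; the function name is ours, for illustration only.

```python
def nightlight_kwh(w_base, w_led, annual_hours, irf):
    """Annual kWh savings per night-light, discounted by the share of units
    that replaced incandescent night-lights (IRF)."""
    return (w_base - w_led) * annual_hours * irf / 1000.0

# Table 264 inputs: 5.0 W baseline, 0.5 W LED, 2,920 annual hours, IRF 0.4.
kwh = nightlight_kwh(5.0, 0.5, 2920, 0.4)
print(kwh)  # 5.256
```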

Kitchen and Bathroom Faucet Aerators
The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas energy savings for low-flow kitchen and bathroom faucet aerators:

kWh savings = (GPMbase − GPMlow flow) ∗ MPD ∗ (PH / FH) ∗ DR ∗ 8.3 ∗ (Tmix − Tinlet) ∗ 365 / (RE ∗ 3,412)

kW savings = (GPMbase − GPMlow flow) ∗ DR ∗ 60 ∗ 8.3 ∗ (Tmix − Tinlet) / (RE ∗ 3,412) ∗ CF

therm savings = (GPMbase − GPMlow flow) ∗ MPD ∗ (PH / FH) ∗ DR ∗ 8.3 ∗ (Tmix − Tinlet) ∗ 365 / (RG ∗ 100,000)

Where:

GPMbase = Gallons per minute of baseline faucet aerator

GPMlow flow = Gallons per minute of low-flow faucet aerator

MPD = Average minutes of faucet use per person per day

PH = Average number of people per household

FH = Average number of faucets per household

Tmix = Mixed water temperature exiting faucet, ℉

Tinlet = Cold water temperature entering the DHW system, ℉

RE = Recovery efficiency of electric hot water heater

RG = Recovery efficiency of natural gas hot water heater

CF = Summer peak CF

60 = Minutes per hour, min/hr

DR = Percentage of water flowing down the drain

8.3 = Specific weight of water in pounds per gallon, multiplied by the specific heat of water (1.0 Btu/lb-°F)

3,412 = Constant to convert Btu to kWh

365 = Days per year, day/yr

100,000 = Constant to convert MMBtu to therm


Table 265 lists the input assumptions and source of each assumption for the kitchen and bathroom

faucet aerator measure savings calculations.

Table 265. Ex Post Variable Assumptions for Faucet Aerators

Input | Kitchen Value | Bathroom Value | Source
GPMbase | 2.44 | 1.90 | Indiana TRM (v2.2)
GPMlow flow | 1.50 | 1.00 | Provided in program materials
MPD | 4.5 | 1.60 | Indiana TRM (v2.2)
PH | 4.86 | 4.86 | NIPSCO School Kit 2018 HEW data
FH | 1 | 2.46 | Kitchen – Indiana TRM (v2.2); Bathroom – NIPSCO School Kit Parent Survey 2018 data
Tmix | 93 | 86 | Indiana TRM (v2.2)
Tinlet | 57.4 | 57.4 | Indiana TRM (v2.2)
RE | 0.98 | 0.98 | Indiana TRM (v2.2)
RG | 0.76 | 0.76 | Indiana TRM (v2.2)
CF | 0.0033 | 0.0012 | Indiana TRM (v2.2)
DR | 0.5 | 0.7 | Indiana TRM (v2.2)
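The aerator equations with the kitchen column of Table 265 can be sketched as follows. The function names are ours; as in the equations, the electric and gas results apply to homes with electric and gas water heat, respectively.

```python
LB_PER_GAL = 8.3         # specific weight of water, with 1.0 Btu/lb-F specific heat
BTU_PER_KWH = 3412
BTU_PER_THERM = 100_000

def hot_water_btu(gpm_base, gpm_low, mpd, ph, fh, dr, t_mix, t_inlet):
    """Annual Btu of avoided water heating from the reduced flow rate."""
    gallons = (gpm_base - gpm_low) * mpd * (ph / fh) * dr * 365
    return gallons * LB_PER_GAL * (t_mix - t_inlet)

def aerator_kwh(btu, re):
    """Electric kWh savings given water heater recovery efficiency RE."""
    return btu / (re * BTU_PER_KWH)

def aerator_therms(btu, rg):
    """Natural gas therm savings given recovery efficiency RG."""
    return btu / (rg * BTU_PER_THERM)

# Kitchen inputs from Table 265.
btu = hot_water_btu(2.44, 1.5, 4.5, 4.86, 1, 0.5, 93, 57.4)
kwh = aerator_kwh(btu, 0.98)
therms = aerator_therms(btu, 0.76)
print(round(kwh, 1), round(therms, 2))  # 331.5 14.59
```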

Low-Flow Showerheads
The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas energy savings for low-flow showerheads:

kWh savings = (GPMbase − GPMlow flow) ∗ MS ∗ SPD ∗ (PH / SH) ∗ 8.3 ∗ (Tmix − Tinlet) ∗ 365 / (RE ∗ 3,412)

kW savings = (GPMbase − GPMlow flow) ∗ 60 ∗ 8.3 ∗ (Tmix − Tinlet) / (RE ∗ 3,412) ∗ CF

therm savings = (GPMbase − GPMlow flow) ∗ MS ∗ SPD ∗ (PH / SH) ∗ 8.3 ∗ (Tmix − Tinlet) ∗ 365 / (RG ∗ 100,000)

Where:

GPMbase = Gallons per minute of baseline showerhead, GPM

GPMlow flow = Gallons per minute of low-flow showerhead, GPM

MS = Average minutes per shower event

SPD = Average number of shower events per person per day

PH = Average number of people per household

SH = Average number of showerheads per household

Tmix = Mixed water temperature exiting faucet, ℉

Tinlet = Cold water temperature entering the DHW system, ℉

RE = Recovery efficiency of electric hot water heater


RG = Recovery efficiency of natural gas hot water heater

CF = Summer peak CF

60 = Minutes per hour, min/hr

8.3 = Specific weight of water in pounds per gallon, multiplied by the specific heat of water (1.0 Btu/lb-°F)

3,412 = Constant to convert Btu to kWh

365 = Days per year

100,000 = Constant to convert MMBtu to therm

Table 266 lists the input assumptions and source of each assumption for the low-flow showerhead

measure savings calculations.

Table 266. Ex Post Variable Assumptions for Low-Flow Showerheads

Input | Value | Source
GPMbase | 2.63 | Indiana TRM (v2.2)
GPMlow flow | 1.5 | Provided in program materials
MS | 7.8 | Indiana TRM (v2.2)
SPD | 0.6 | Indiana TRM (v2.2)
PH | 4.86 | NIPSCO School Kit 2018 HEW data
SH | 1.99 | NIPSCO School Kit Parent Survey 2018 data
Tmix | 101 | Indiana TRM (v2.2)
Tinlet | 57.4 | Indiana TRM (v2.2)
RE | 0.98 | Indiana TRM (v2.2)
RG | 0.76 | Indiana TRM (v2.2)
CF | 0.0023 | Indiana TRM (v2.2)

Furnace Filter Whistles
NIPSCO calculated ex ante savings for the furnace filter whistle using the 2016 Pennsylvania TRM methodology. The evaluation team, however, decided to use the energy savings results from a furnace filter whistle engineering assessment (conducted by Quantec) for the ex post analysis. The evaluation team determined that Quantec's analysis of furnace whistle energy savings was more detailed and technically thorough than the sources used by the 2016 Pennsylvania TRM. The TRM referenced energy savings claims, reported on the DOE's website for air filter replacement and efficiency, that the evaluation team could not reproduce.

The 2016 Pennsylvania TRM also assumed that savings resulted only from reduced furnace blower fan motor power requirements during winter and summer use of the blower fan. Quantec's assessment

assumed cooling and heating savings (from the reduced time needed to cool or heat the home), in

addition to blower fan motor savings. Consequently, the evaluation team’s methodology calculated

demand reduction and natural gas savings resulting from the furnace filter whistle. Use of the Quantec

Filter Restriction Alarm whitepaper was consistent with ex post evaluation methods employed for the


2014 Energizing Indiana Statewide Core report for the School Kit program as well as for other Indiana

school kit program evaluations.

The evaluation team used the following equations to calculate savings for the measure:

kWh SavingsCAC = FLHcool ∗ BtuHCAC ∗ 1 / (SEER ∗ 1,000) ∗ EFelectric

kWh SavingsHP = (FLHcool ∗ BtuHCAC ∗ 1 / (SEER ∗ 1,000) + FLHheat ∗ BtuHHP ∗ 1 / (HSPF ∗ 1,000)) ∗ EFelectric

kW SavingsCAC = BtuHCAC ∗ 1 / (EER ∗ 1,000) ∗ EFelectric ∗ CF

kW SavingsHP = BtuHHP ∗ 1 / (EER ∗ 1,000) ∗ EFelectric ∗ CF

therm Savings = FLHheat ∗ (BtuHgas / ηgas) / 100,000 ∗ EFgas ∗ NGF %

total kWh Savings = (kWh SavingsCAC ∗ CAC %) + (kWh SavingsHP ∗ HP %)

total kW Savings = (kW SavingsCAC ∗ CAC %) + (kW SavingsHP ∗ HP %)

Where:

FLHcool = Full load cooling hours

FLHheat_elec = Full load heating hours – electric

FLHheat_gas = Full load heating hours – natural gas

SEER = Seasonal energy efficiency ratio

EER = Energy efficiency ratio

BtuHCAC = Size of central air conditioning units

BtuHHP = Size of heat pump

BtuHgas = Furnace size

EFgas = Efficiency savings for natural gas furnace

EFelec = Efficiency savings for heat pump/air conditioner

HSPF = Heating season performance factor

ηgas = Furnace efficiency

CF = Summer peak CF for heat pump/CAC

CAC % = CAC saturation rate (excluding heat pumps)

HP % = Heat pump saturation rate

NGF % = Natural gas furnace saturation rate


Table 267 lists the input assumptions and source of each assumption for the furnace filter whistle

measure savings calculations.

Table 267. School Education Program Furnace Filter Whistle Savings Inputs

Input | Value | Source
FLHcool | 431 | Indiana TRM (v2.2; South Bend)
FLHheat_elec | 1,427 | Indiana TRM (v2.2; South Bend)
FLHheat_gas | 1,427 | Indiana TRM (v2.2; South Bend)
SEER | 11.15 | Indiana TRM (v2.2): when unknown use 11.15 (Minimum Federal Standard)
EER | 10.035 | Indiana TRM (v2.2): EER = SEER x 0.9
BtuHCAC | 28,994 | Indiana TRM (v2.2): central air conditioner early replacement default existing unit cooling capacity
BtuHHP | 78,236 | Indiana Residential Baseline Report 2012
BtuHgas | 78,236 | Indiana Residential Baseline Report 2012
EFgas | 0.0185 | Quantec analysis: Engineering Review and Savings Estimates for the "Filtertone" Filter Restriction Alarm
EFelec | 0.035 | Quantec analysis: Engineering Review and Savings Estimates for the "Filtertone" Filter Restriction Alarm
HSPF | 7.7 | Indiana TRM (v2.2): when unknown use HSPF 7.7 (Minimum Federal Standard after 2006)
ηgas | 0.8 | Indiana TRM (v2.2): 80% AFUE for natural gas–fired condensing furnace
CF | 0.88 | Indiana TRM (v2.2)
CAC % | 63% | Residential Energy Consumption Survey 2015 data, based on East North Central Midwest region values(a)
NGF % | 64% | Residential Energy Consumption Survey 2015 data, based on East North Central Midwest region values(a)
HP % | 6% | Residential Energy Consumption Survey 2015 data, based on East North Central Midwest region values(a)

(a) As school kits were distributed for students to take home, and measures were not directly installed in homes, the evaluation team could not know details regarding the heating or cooling profiles of homes receiving the measures. This stands in contrast to programs such as HEA. Thus, the evaluation team made assumptions regarding home heating and cooling saturation rates using DOE survey data and applied these across the quantities of distributed kits in generating total savings. U.S. Department of Energy. Residential Energy Consumption Survey. 2015. https://www.eia.gov/consumption/residential/data/2015/
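The whistle equations with the Table 267 inputs can be sketched as follows. The function names are ours, and the heat-pump equation sums the cooling and heating terms before applying the efficiency factor, as in the equations above.

```python
def whistle_kwh_cac(flh_cool, btuh_cac, seer, ef_elec):
    """Cooling kWh savings for a home with central AC."""
    return flh_cool * btuh_cac / seer / 1000.0 * ef_elec

def whistle_kwh_hp(flh_cool, btuh_cac, seer, flh_heat, btuh_hp, hspf, ef_elec):
    """Cooling plus heating kWh savings for a heat-pump home."""
    cooling = flh_cool * btuh_cac / seer / 1000.0
    heating = flh_heat * btuh_hp / hspf / 1000.0
    return (cooling + heating) * ef_elec

def whistle_therms(flh_heat, btuh_gas, eta_gas, ef_gas, ngf_pct):
    """Heating therm savings, weighted by gas-furnace saturation."""
    return flh_heat * (btuh_gas / eta_gas) / 100_000.0 * ef_gas * ngf_pct

# Table 267 inputs, weighted by the saturation rates (CAC 63%, HP 6%, NGF 64%).
kwh_cac = whistle_kwh_cac(431, 28994, 11.15, 0.035)
kwh_hp = whistle_kwh_hp(431, 28994, 11.15, 1427, 78236, 7.7, 0.035)
total_kwh = kwh_cac * 0.63 + kwh_hp * 0.06
therms = whistle_therms(1427, 78236, 0.8, 0.0185, 0.64)
print(round(total_kwh, 1), round(therms, 2))
```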


Appendix G. Residential Behavioral Program Regression

Analysis and Uplift Analysis

Regression Analysis
The evaluation team conducted a regression analysis to determine energy savings for treatment and control respondents using two models: PPR and LFER. Both approaches produced unbiased estimates of

program savings. The evaluation team reported the PPR results and used the LFER results as a

robustness check. Although structurally different, assuming the RCT is well-balanced with respect to the

drivers of energy use, the two models should produce similar program savings estimates. Based on our

experience analyzing the impacts of similar programs, the savings estimates produced by the PPR

approach tend to be more precisely estimated (smaller standard errors) than those produced from the

LFER model. This increase in precision occurs because the PPR accounts for groupwide pre-post

consumption differences with a continuous term (ADClag) instead of a categorical term (post). Detailed

descriptions of both model types are provided below.

Post-Period Regression
The PPR model controls for anomalous differences in energy usage between treatment and control group respondents by using lagged energy use as an explanatory variable. In other words, the model

frames energy use in each calendar month of the post-program period as a function of both the

treatment variable and energy use in the same calendar month of the pre-program year. The underlying

logic is that any small systematic differences between the control and treatment respondents that

remain, despite the randomization, will be reflected in differences in their past energy use, which is

highly correlated with their current energy use. Including the lagged energy use in the model serves as a

control for any such differences. The version the evaluation team estimated includes monthly fixed

effects interacted with the pre-program energy use variable. These interaction terms allow pre-program

usage to have a different effect on post-program usage in each calendar month.

Equation 1. Post-Period Regression

ADCkt = β0 + β1ADClagkt + β2Treatmentk + Σj β3jMonthjt + Σj β4jMonthjt ∗ ADClagkt + εkt

Where:

ADCkt = The average daily usage in kilowatt-hours or therms for respondent k during

billing cycle t. This is the dependent variable in the model.

ADClagkt = Respondent k’s energy use in the same calendar month of the pre-

treatment year as calendar month t.

Treatmentk = A binary variable indicating whether respondent k is in the participant group

(taking a value of 1) or the control group (taking a value of 0).


Monthjt = A binary variable taking a value of 1 when j=t and 0 otherwise.87

εkt = The cluster-robust error term for respondent k during billing cycle t that

accounts for heteroscedasticity and autocorrelation at the respondent level.

In this model, 𝛽2 is the estimate of average daily energy savings due to the program. Program savings

are the product of the average daily savings estimate and the total number of participant-days in the

analysis.
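The role of the lagged-usage control can be illustrated with a simplified sketch: fit the baseline relationship on the control group, predict a counterfactual for each treatment household, and average the gap. This is a regression-adjustment simplification of the PPR logic, not the actual model, and it omits the monthly fixed effects and cluster-robust errors; all data below are simulated.

```python
import random

random.seed(1)

# Synthetic post-period observations: (ADClag, ADC) pairs per household,
# where ADC depends mainly on the same household's pre-period usage.
def simulate(n, effect):
    rows = []
    for _ in range(n):
        lag = random.gauss(30.0, 5.0)
        adc = 2.0 + 0.9 * lag + effect + random.gauss(0.0, 1.0)
        rows.append((lag, adc))
    return rows

control = simulate(3000, 0.0)
treatment = simulate(3000, -0.5)  # true savings: 0.5 kWh/day

# Simple OLS of ADC on ADClag, fit on the control group only.
n = len(control)
mx = sum(l for l, _ in control) / n
my = sum(a for _, a in control) / n
b1 = sum((l - mx) * (a - my) for l, a in control) / \
    sum((l - mx) ** 2 for l, _ in control)
b0 = my - b1 * mx

# Savings estimate: average gap between actual treatment usage and the
# control-group counterfactual (negative values indicate savings).
effect = sum(a - (b0 + b1 * l) for l, a in treatment) / len(treatment)
print(round(effect, 2))
```

With a well-balanced RCT this recovers the treatment effect because past usage absorbs most cross-household variation, which is the same intuition behind β2 in Equation 1.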

Linear Fixed Effects Regression
As with the PPR model, the LFER model combines cross-sectional and time series data. Unlike the PPR

essentially compares the pre- and post-program energy usage of participants to those in the control

group to identify the effect of the program. The purpose of the respondent-specific fixed effect is to

capture all systematic cross-respondent variation in electric energy usage that is not captured by the

model. Like the lagged usage variable in the PPR model, the fixed effect represents an attempt to

control for any small systematic differences between the treatment and control respondents that might

occur in the data despite the randomization.

Equation 2. Linear Fixed Effects Regression

ADCkt = β0kt + β1Postt + β2Treatmentk ∗ Postt + εkt

Where:

ADCkt = The average daily usage in kilowatt-hours or therms for respondent k during

billing cycle t. This is the dependent variable in the model.

𝛽0𝑘𝑡 = The respondent-specific fixed effect at month-year t.

β1 = The effect of being in the post-period on energy use to account for non-

program effects that impact both the treatment and control groups.

Postt = A binary variable indicating whether bill cycle t is in the post-program period

(taking a value of 1) or in the pre-program period (taking a value of 0).

β2 = The estimate of treatment effects: the average daily energy savings per

household due to behavioral program treatment.

Treatmentk = A binary variable indicating whether respondent k is in the participant group

(taking a value of 1) or in the control group (taking a value of 0).

εkt = The cluster-robust error term for respondent k during billing cycle t. Cluster-

robust errors account for heteroscedasticity and autocorrelation at the

respondent level.

87 For each post-program month, the model includes a monthly dummy variable, with the dummy variable for month t being the only one to take a value of 1 at time t. These are, in other words, monthly fixed effects.


Uplift Analysis
The HERs sent to treatment respondents included energy saving tips and marketing modules, some of

which encouraged respondents to participate in other NIPSCO energy efficiency programs. To assess the

interactions between these programs, the evaluation team analyzed both the HER program and the

Behavioral program data for participation overlap to address two factors:

• Participation lift: Does the Behavioral program treatment have an effect on participation in

other energy efficiency programs?

• Savings lift and adjustment: What portion of savings from the Behavioral program was obtained

through NIPSCO’s other energy efficiency efforts?

As with the energy savings calculations, the control group acts as the counterfactual, for both

participation and savings from other programs, to address the above questions and provide unbiased

estimates through the RCT model.

First, the evaluation team assessed whether the Behavioral program increased participation in NIPSCO’s

other energy efficiency programs by comparing participation rates between control and treatment

groups. If participation rates in other residential energy efficiency programs were the same across HER

treatment and control groups, then the savings estimates for HERs from the regression analysis were already net of savings from the other programs, indicating that the Behavioral program had no effect on participation in other energy efficiency programs.

However, if the Behavioral program channeled participants into other energy efficiency programs, then

savings detected in the HER billing analysis would include savings that are also counted by those other

energy efficiency programs. For instance, if the Behavioral program increased participation in the

HEA program, the increase in savings could be allocated to either the HER program or to HEAs provided

through the Behavioral program (or some portion to each), but it could not be fully allocated to both

programs simultaneously.

The evaluation team then calculated participant lift and savings lift and adjustment:

• Participant lift: Using participation flags, the evaluation team calculated a participation rate

based on the number of accounts (either by individual or by household) that initiated

participation in other tracked energy efficiency programs after the first report date. The

difference in treatment and control participation in the post-treatment period is participation

lift.

• Savings lift and adjustment: The evaluation team estimated the energy savings associated with

participation lift in other NIPSCO energy efficiency programs:

• First the evaluation team calculated annual savings for all measures installed in the post-

period.

• Then the evaluation team adjusted annual savings for each measure installation by the

number of days per year in the post-period in which the measure was installed while the


account was active; this step is necessary to most accurately estimate the savings that

would be captured by the billing analysis.

• Next the evaluation team determined the average household net savings per participant day

(the number of days a household was active in a given period) from other programs in the

post-period for both the treatment and control groups.

• Last, the evaluation team multiplied the average savings per participant day by the number

of treatment group participant days in the post-period to identify the incremental savings

attributable to other energy efficiency programs.

The evaluation team then reduced the overall savings estimated in the billing analysis by the final

estimated incremental savings of the treatment group to avoid double-counting savings.
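The uplift arithmetic described in the bullets above can be sketched as follows; the function names and all counts and per-day savings values are hypothetical.

```python
def participation_lift(treat_joiners, treat_n, ctrl_joiners, ctrl_n):
    """Difference in post-period cross-program participation rates
    (treatment minus control)."""
    return treat_joiners / treat_n - ctrl_joiners / ctrl_n

def incremental_savings(treat_sav_per_day, ctrl_sav_per_day, treat_days):
    """Other-program savings attributable to the treatment group: excess
    savings per participant day times treatment participant days."""
    return (treat_sav_per_day - ctrl_sav_per_day) * treat_days

# Hypothetical counts: 400 of 20,000 treatment homes and 150 of 10,000
# control homes initiated another program after the first report date.
lift = participation_lift(400, 20_000, 150, 10_000)
# 0.012 vs. 0.010 kWh per participant day over 7,000,000 treatment days.
adjustment = incremental_savings(0.012, 0.010, 7_000_000)
print(round(lift, 4), round(adjustment))  # 0.005 14000
```

The final adjustment is then subtracted from the billing-analysis savings, as the text describes, to avoid double-counting.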


Appendix H. Multi-Family Direct Install Program
This appendix contains the assumptions for the electric energy savings, peak demand reduction, and natural gas energy savings algorithms for the measures within the MFDI program. The evaluation team

examined each assumption used by the algorithms to capture savings and compared these against the

Indiana TRM (v2.2), as well as other state and industry approaches.

Detailed information on the analysis and supporting assumptions for the following MFDI program measures is included within this appendix:

• LED light bulbs

• Bathroom faucet aerators (1.0 gpm)

• Kitchen aerators (1.5 gpm)

• Low-flow showerheads (1.5 gpm)

• Hot water pipe insulation

• Smart thermostats (programmable and Wi-Fi enabled)

Table 268 lists our assumptions for the ex post per measure savings.

Table 268. MFDI Program Measures

Measure | Reviewed Assumptions
LEDs | New and baseline wattages, hours of use, waste heat factors, coincidence factors
Kitchen Faucet Aerator | New and baseline flow rates, occupants per dwelling, minutes of use per day, faucets per home, water temperatures, water heater fuel type and efficiency
Bathroom Faucet Aerator | New and baseline flow rates, occupants per dwelling, minutes of use per day, faucets per home, water temperatures, water heater fuel type and efficiency
Low-Flow Showerhead | New and baseline flow rates, occupants per dwelling, minutes of use per day, showerheads per home, water temperatures, water heater fuel type and efficiency
Pipe Wrap | New and baseline R-values, pipe diameter, water heater recovery efficiency
Smart Thermostat | Full load heating hours, energy savings factor, electric and gas heating capacity, COP of heating equipment

Details by Measure
The algorithms and assumptions the evaluation team used to calculate ex post savings for each of these measures follow.

LEDs
The following equations are used to calculate electric energy savings, demand reductions, and therm penalties for LEDs:

kWh savings per lamp = (Wbase − WLED) ∗ (Daily hours of use ∗ 365) ∗ (1 + WHFe) / 1,000

kW reduction per lamp = (Wbase − WLED) ∗ Coincidence Factor ∗ (1 + WHFd) / 1,000


Therm savings per lamp = (Wbase − WLED) ∗ (Daily hours of use ∗ 365) ∗ (1 + WHFg) / 1,000

Where:

Wbase = Wattage of the bulb being replaced, W

WLED = Wattage of the LED bulb, W

Daily hours of use = Average hours of use per year, hr

WHFe = Waste heat factor for energy (depends on location)

WHFd = Waste heat factor for demand (depends on location)

WHFg = Waste heat factor for natural gas (depends on location)

Coincidence Factor = Summer peak coincidence factor

365 = Number of days per year, days/yr

1,000 = Constant to convert watts to kW

Table 269 lists the input assumptions and source of each assumption for the LED measure savings

calculations.

Table 269. Ex Post Variable Assumptions for LEDs

Input | Value | Source
WattsBase (9 W LEDs) | 43 | 2016 Residential Lighting Evaluation Protocol, UMP
WattsBase (Globe LEDs) | 29 | 2016 Residential Lighting Evaluation Protocol, UMP
WattsBase (Candelabras) | 29 | 2016 Residential Lighting Evaluation Protocol, UMP
WattsEff (9 W LEDs) | 9 | Actual installed wattage
WattsEff (Globe LEDs) | 6 | Actual installed wattage
WattsEff (Candelabras) | 5 | Actual installed wattage
ISR | 1 | Indiana TRM (v2.2)
Hours | 902 | Indiana TRM (v2.2)
Coincidence Factor | 0.11 | Indiana TRM (v2.2)
Energy Waste Heat Factor (WHFe) | -0.07 | Indiana TRM (v2.2), averaged across participant location
Demand Waste Heat Factor (WHFd) | 0.038 | Indiana TRM (v2.2), averaged across participant location
Gas Waste Heat Factor (WHFg) | -0.0019 | Indiana TRM (v2.2), averaged across participant location
Conversion Factor | 1,000 | Convert watts to kW

Kitchen and Bathroom Faucet Aerators The evaluation team used the following equations to calculate natural gas energy savings for low-flow

kitchen and bathroom faucet aerators:

𝑡ℎ𝑒𝑟𝑚 𝑠𝑎𝑣𝑖𝑛𝑔𝑠 = 10 ∗ (𝐺𝑃𝑀𝑏𝑎𝑠𝑒 − 𝐺𝑃𝑀𝑙𝑜𝑤 𝑓𝑙𝑜𝑤) ∗ 𝑀𝑃𝐷 ∗𝑃𝐻

𝐹𝐻∗ 𝐷𝑅 ∗ 8.3 ∗ (𝑇𝑚𝑖𝑥 − 𝑇𝑖𝑛𝑙𝑒𝑡) ∗

365

𝑅𝐺 ∗ 100,000


Where:

GPMbase = Gallons per minute of baseline faucet aerator

GPMlow flow = Gallons per minute of low-flow faucet aerator

MPD = Average minutes of faucet use per person per day

PH = Average number of people per household

FH = Average number of faucets per household

Tmix = Mixed water temperature exiting faucet, ℉

Tinlet = Cold water temperature entering the domestic hot water (DHW) system, ℉

RG = Recovery efficiency of gas hot water heater

60 = Minutes per hour, min/hr

8.3 = Specific weight of water in pounds per gallon, multiplied by the specific heat of water (1.0 Btu/lb-°F)

3,412 = Constant to convert Btu to kWh

365 = Days per year, day/yr

100,000 = Constant to convert MMBtu to therm

Table 270 lists the input assumptions and source of each assumption for the kitchen and bathroom

faucet aerator measure savings calculations.

Table 270. Ex Post Variable Assumptions for Faucet Aerators

Input Kitchen Value Bathroom Value Source

GPMbase 2.44 1.90 Indiana TRM (v2.2)

GPMlow flow 1.5 1.0 Actual

MPD 4.5 1.6 Indiana TRM (v2.2)

PH 2.64 1.83 Indiana TRM (v2.2) for multi-family housing

FH 1 1.43 Indiana TRM (v2.2) for multi-family housing

Tmix 93 86 Indiana TRM (v2.2)

Tinlet 57.4˚F 56.9˚F Indiana TRM (v2.2), averaged across participant location

DR 0.5 0.7 Indiana TRM (v2.2)

RG 0.76 0.76 Indiana TRM (v2.2)
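The aerator equation and the Table 270 values can be combined in a short Python sketch (illustrative only; the function name is ours, and the example uses the kitchen-aerator column):

```python
def aerator_therm_savings(gpm_base, gpm_low, mpd, ph, fh, dr, t_mix, t_inlet, rg):
    """Annual therm savings per aerator: 8.3 Btu/gal-F, 365 days/yr, 100,000 Btu/therm."""
    return ((gpm_base - gpm_low) * mpd * (ph / fh) * dr * 8.3
            * (t_mix - t_inlet) * 365 / (rg * 100_000))

# Kitchen aerator, Table 270 values
kitchen_therms = aerator_therm_savings(2.44, 1.5, 4.5, 2.64, 1, 0.5, 93, 57.4, 0.76)
```

For the kitchen column this works out to roughly 7.9 therms per year per aerator.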

Low-Flow Showerheads

The evaluation team used the following equation to calculate natural gas energy savings for low-flow showerheads:

therm savings = (GPMbase − GPMlow flow) × MS × SPD × (PH / SH) × 8.3 × (Tmix − Tinlet) × 365 / (RG × 100,000)


Where:

GPMbase = Gallons per minute of baseline showerhead, GPM

GPMlow flow = Gallons per minute of low-flow showerhead, GPM

MS = Average minutes per shower event

SPD = Average number of shower events per person per day

PH = Average number of people per household

SH = Average number of showerheads per household

Tmix = Mixed water temperature exiting faucet, ℉

Tinlet = Cold water temperature entering the DHW system, ℉

RE = Recovery efficiency of electric hot water heater

RG = Recovery efficiency of gas hot water heater

CF = Summer peak coincidence factor

60 = Minutes per hour, min/hr

8.3 = Specific weight of water in pounds per gallon, multiplied by the specific heat of water (1.0 Btu/lb-°F)

3,412 = Constant to convert Btu to kWh

365 = Days per year

100,000 = Constant to convert MMBtu to therm

Table 271 lists the input assumptions and source of each assumption for the low-flow showerhead

measure savings calculations.

Table 271. Ex Post Variable Assumptions for Low-Flow Showerheads

Input Value Source

GPMbase 2.63 Indiana TRM (v2.2)

GPMlow flow 1.5 Actual

MS 7.8 Indiana TRM (v2.2)

SPD 0.6 Indiana TRM (v2.2)

PH 1.83 Indiana TRM (v2.2) for multi-family housing

SH 1.2 Indiana TRM (v2.2) for multi-family housing

Tmix 101 Indiana TRM (v2.2)

Tinlet - Electric WH 56.7˚F Indiana TRM (v2.2), averaged across participant location

Tinlet - Gas WH 57.4˚F Indiana TRM (v2.2), averaged across participant location

RG 0.76 Indiana TRM (v2.2)
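The showerhead gas-savings arithmetic can be checked with the Table 271 values (a sketch; function name illustrative, using the gas water heater inlet temperature):

```python
def showerhead_therm_savings(gpm_base, gpm_low, ms, spd, ph, sh, t_mix, t_inlet, rg):
    """Annual therm savings per low-flow showerhead."""
    return ((gpm_base - gpm_low) * ms * spd * (ph / sh) * 8.3
            * (t_mix - t_inlet) * 365 / (rg * 100_000))

# Table 271 values, gas water heater (Tinlet = 57.4 F)
therms = showerhead_therm_savings(2.63, 1.5, 7.8, 0.6, 1.83, 1.2, 101, 57.4, 0.76)
```

This yields roughly 14.0 therms per year per showerhead.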


Pipe Wrap Insulation

The evaluation team used the following equation to calculate natural gas energy savings for pipe wrap insulation:

therm savings = (1 / Rexist − 1 / Rnew) × L × C × ∆T × 8,760 / (NDHW × 100,000)

Where:

Rexist = Pipe heat loss coefficient (R-value) of existing, uninsulated pipe, °F·hr·ft²/Btu

Rnew = Pipe heat loss coefficient (R-value) of insulated pipe, °F·hr·ft²/Btu

L = Length of pipe from water heating source covered by pipe wrap, in feet

C = Circumference of pipe, in feet

∆T = Average temperature difference between supplied water and ambient air

temperature, ℉

NDHW = Recovery efficiency of hot water heater

8,760 = Hours per year, hr

3,412 = Constant to convert Btu to kWh

100,000 = Constant to convert MMBtu to therm

Table 272 lists the input assumptions and source of each assumption for the pipe wrap insulation

measure savings calculations.

Table 272. Ex Post Variable Assumptions for Pipe Wrap Insulation

Input Value Source

Rexist 1 Indiana TRM (v2.2)

Rnew 3 Indiana TRM (v2.2)

L 1 Calculate savings per foot of pipe length

C 0.131 Circumference for 0.5-inch pipe

∆T 65 Indiana TRM (v2.2)

NDHW - electric 0.98 Indiana TRM (v2.2)

NDHW - gas 0.75 Indiana TRM (v2.2)
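Because savings are calculated per foot of pipe (L = 1), the Table 272 inputs give a per-foot therm value that can be verified directly (sketch only; function name is illustrative):

```python
def pipe_wrap_therm_savings(r_exist, r_new, length_ft, circ_ft, delta_t, n_dhw):
    """Annual therm savings from insulating length_ft of hot water pipe."""
    return ((1 / r_exist - 1 / r_new) * length_ft * circ_ft * delta_t * 8760
            / (n_dhw * 100_000))

# Table 272 values: R-1 to R-3, per foot of 0.5-inch pipe, gas water heater
therms_per_ft = pipe_wrap_therm_savings(1, 3, 1, 0.131, 65, 0.75)
```

This works out to roughly 0.66 therms per foot per year.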

Smart Thermostats

This measure category consists of two general measure names: Wi-Fi thermostats and programmable thermostats. The Indiana TRM (v2.2) does not distinguish between programmable and Wi-Fi thermostats, and the evaluation team found no examples of separate savings factors for these two types of units. Therefore, the evaluation team followed the Indiana TRM (v2.2) and grouped programmable and Wi-Fi thermostats together for this evaluation.

All of the smart thermostat measures for the MFDI program were listed in the tracking data as one of

the following:


• WiFi Thermostat – Electric Heat

• WiFi Thermostat – Gas Heat

• Programmable Thermostat – Gas Heat

• Programmable Thermostat – Electric Heat

Because the heating fuel is specified in each measure name, the evaluation team assumed that none of the units receiving smart thermostats have air conditioning, and therefore accounted only for savings during the heating season. Additionally, the Indiana TRM does not specify a methodology for calculating demand reduction savings for smart thermostats; other TRMs (such as the Illinois TRM v7.0) calculate peak demand savings only for the cooling season. For these reasons, demand reduction savings for all thermostat measures were assumed to be zero.

The following equations are used to calculate electric energy and natural gas energy savings for smart thermostats:

∆kWhheat = EFLHheat × (1 / COP) × (Btuhheat / 3,412) × ESFheat

therm savings = EFLHheat × Btuhheat / (AFUE × 100,000) × ESFheat

Where:

EFLHheat = Equivalent full load heating hours, hr

Btuhheat = Heating equipment capacity

ESFheat = Efficiency savings factor

COP = Coefficient of performance for electric furnace

AFUE = Annual fuel utilization efficiency

3,412 = Constant to convert Btu to kWh

100,000 = Constant to convert Btu to therm

Table 273 shows the input assumptions and the source of each assumption in the thermostat measure

savings calculations.


Table 273. Variable, Values, and Sources Used for Thermostat Energy Savings Calculations

Variable Value Source

Btuhheat 18,397 Btu/hr (electric heating); 19,573 Btu/hr (gas heating) 2012 Residential Indiana Baseline Study and average of HVAC program heat pumps (2017 and 2018), scaled by a factor of 0.30 from single-family housing units to estimate multi-family heating equipment capacity. The 0.30 factor comes from 2015 EIA RECS data.

COP 1.49 Indiana TRM (v2.2); COPs for heat pumps and electric furnaces were weighted by the percentage of multi-family housing units using each electric heating type, with proportions calculated from 2015 EIA RECS data.

AFUE 0.8 Source forthcoming

EFLHH 1427 Indiana TRM (v2.2) for South Bend

ESFH 0.068 for smart thermostats Indiana TRM (v2.2)
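Applying the Table 273 inputs to the two smart thermostat equations gives the per-unit savings below (sketch only; function names are illustrative and not part of the evaluation tooling):

```python
def tstat_kwh_heat(eflh_heat, cop, btuh_heat, esf_heat):
    """Heating-season kWh savings for an electrically heated unit."""
    return eflh_heat * (1 / cop) * (btuh_heat / 3412) * esf_heat

def tstat_therm_savings(eflh_heat, btuh_heat, afue, esf_heat):
    """Heating-season therm savings for a gas-heated unit."""
    return eflh_heat * btuh_heat * esf_heat / (afue * 100_000)

# Table 273 values: EFLH 1,427 hr, ESF 0.068, capacities scaled for multi-family units
kwh = tstat_kwh_heat(1427, 1.49, 18_397, 0.068)
therms = tstat_therm_savings(1427, 19_573, 0.8, 0.068)
```

These inputs imply roughly 351 kWh per electrically heated unit and 23.7 therms per gas-heated unit.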


Appendix I. Residential Income Qualified Weatherization Program Assumptions and Algorithms

This appendix contains the assumptions used in the electric savings, demand reduction, and natural gas savings algorithms for the measures within the IQW program. The evaluation team examined each

assumption behind the algorithms to capture savings and compared it against the Indiana TRM (v2.2), as

well as other state and industry approaches. Detailed information on the analysis and supporting

assumptions for the following IQW program measures are included within this appendix:

• LEDs

• Programmable thermostats

• Kitchen faucet aerators

• Bathroom faucet aerators

• Low-flow showerheads

• Pipe wrap

• Filter whistles

• Water heater temperature setback

• Air sealing

• Duct sealing

• Attic insulation

• Refrigerator replacement

Table 274 lists the assumptions of the ex post per-measure savings.

Table 274. IQW Program Measures

Measure Reviewed Assumptions

LEDs New and baseline wattages, HOU, WHF, CFs

Programmable thermostat Full load heating and cooling hours and equipment efficiencies

Kitchen Faucet Aerator New and baseline flow rates, people per house, minutes of use per day, faucets per home,

water temperatures, water heater fuel type and efficiency

Bathroom Faucet Aerator New and baseline flow rates, people per house, minutes of use per day, faucets per home,

water temperatures, water heater fuel type and efficiency

Low-Flow Showerhead New and baseline flow rates, people per house, minutes of use per day, showerheads per

home, water temperatures, water heater fuel type and efficiency

Pipe Wrap New and baseline R-values, pipe diameter, water heater recovery efficiency

Filter Whistle Full load heating and cooling hours, efficiency ratings, efficiency improvement

Duct Sealing New and baseline distribution efficiencies, full load heating and cooling hours, capacities

and efficiencies of heating and cooling equipment

Air Sealing Pre-install and post-install infiltration rates, N-factor, CF

Attic Insulation Void space and compression factor, pre-install and post-install R-values, square footage of

installed insulation

Refrigerator replacement New and baseline energy use

Details by Measure

The algorithms and assumptions the evaluation team used to calculate ex post savings for each of these measures follow.


LEDs

The evaluation team used the following equations to calculate electric energy and peak demand savings, as well as natural gas energy penalties, for LEDs:

kWh savings per lamp = (Wbase − WLED) × (Daily hours of use × 365) × (1 + WHFe) / 1,000

kW reduction per lamp = (Wbase − WLED) × Coincidence Factor × (1 + WHFd) / 1,000

therm savings per lamp = (Wbase − WLED) × (Daily hours of use × 365) × (1 + WHFg) / 1,000

Where:

Wbase = Wattage of the bulb being replaced, W

WLED = Wattage of the LED bulb, W

Daily hours of use = Average HOU per day, hr

WHFe = Waste heat factor for energy to account for HVAC interactions with lighting

(depends on location)

WHFd = Waste heat factor for demand to account for HVAC interactions with lighting

(depends on location)

WHFg = Waste heat factor for natural gas to account for HVAC interactions with lighting

(depends on location)

Coincidence Factor = Summer peak CF

365 = Number of days per year, days/yr

1,000 = Constant to convert watts to kW

Table 275 lists the input assumptions and source of each assumption for the LED measure savings

calculations.

Table 275. Ex Post Variable Assumptions for LEDs

Input Value Source

Wbase 43 2016 Residential Lighting Evaluation Protocol, UMP

WLED for 9-watt (LED) 9 Actual installed wattage

Daily hours of use x 365 902 Indiana TRM (v2.2)

WHFe (Joint Customers) -0.07 Indiana TRM (v2.2), Joint customer with natural gas heat with central air

conditioning, averaged across participant location

WHFe (Electric Only) -0.072 Indiana TRM (v2.2), Electric customer with electric heat with central air

conditioning, averaged across participant location

WHFd 0.038 Indiana TRM (v2.2), averaged across participant location

WHFg -0.0019 Indiana TRM (v2.2), averaged across participant location

Coincidence Factor 0.11 Indiana TRM (v2.2)


Programmable Thermostats

The evaluation team used the following equations to calculate electric and natural gas energy savings for programmable thermostats. There are no summer peak coincident demand savings associated with this measure.

kWh savings (cooling) = (1 / SEER) × FLHcool × (Btuhcool / 1,000) × ESFcool

kWh savings (heating) = FLHheat × Btuhheat / (Nheat × 3,412) × ESFheat

therm savings = FLHheat × BtuhFF × ESFheat / 100,000

Where:

SEER = Seasonal energy efficiency ratio

FLHcool = Full load cooling hours

BtuHcool = Cooling system capacity in Btu per hour

ESFcool = Cooling energy savings fraction

FLHheat = Full load heating hours

BtuHheat = Heating system capacity in Btu per hour

Nheat = Efficiency in COP of heating equipment

BtuHFF = Heating capacity of natural gas equipment

Table 276 lists the assumptions and source of each assumption for the smart thermostat measure

savings calculations.

Table 276. Ex Post Variable Assumptions for Programmable thermostats

Input Value Source

SEER 11.15 Indiana TRM (v2.2)

FLHcool (Joint) 421 Indiana TRM (v2.2), joint customer with natural gas heat and central air conditioning,

averaged across participant locations

FLHcool (Electric Only) 431 Indiana TRM (v2.2), electric customer with electric heat and central air conditioning, averaged across participant locations

Btuhcool 28,994 Indiana TRM (v2.2)

ESFcool 0.09 Indiana TRM (v2.2)

FLHheat (Joint, CAC) 1,417.73 Indiana TRM (v2.2), joint customer with natural gas heat and central air conditioning, averaged across participant location

FLHheat (Joint, no CAC) 1,415.33 Indiana TRM (v2.2), joint customer with natural gas heat and no central air conditioning, averaged across participant location

FLHheat (Electric Only) 1,427 Indiana TRM (v2.2), electric customer with electric heating and central air conditioning, averaged across participant location

FLHheat (Gas Only) 1,415 Indiana TRM (v2.2), natural gas customer with natural gas heat, averaged across participant location


Input Value Source

Nheat 1 Indiana TRM (v2.2)

ESFheat 0.068 Indiana TRM (v2.2)

Btuhelec 32,000 2016 Pennsylvania TRM

BtuhFF 77,386 Indiana TRM (v2.2)

Constant 3,412 Factor to convert Btu to kWh

Constant 100,000 Factor to convert Btu to therm
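Using the Table 276 inputs, the cooling-season electric savings and the gas-heat therm savings can be checked as follows (sketch only; function names are illustrative; the example takes the joint-customer cooling hours and the gas-only heating hours):

```python
def cool_kwh_savings(seer, flh_cool, btuh_cool, esf_cool):
    """Cooling-season kWh savings from thermostat setback."""
    return (1 / seer) * flh_cool * (btuh_cool / 1000) * esf_cool

def gas_therm_savings(flh_heat, btuh_ff, esf_heat):
    """Heating-season therm savings for natural gas heat."""
    return flh_heat * btuh_ff * esf_heat / 100_000

kwh_cool = cool_kwh_savings(11.15, 421, 28_994, 0.09)  # joint customer with CAC
therms = gas_therm_savings(1415, 77_386, 0.068)        # natural gas heating customer
```

These inputs imply roughly 98.5 kWh of cooling savings and 74.5 therms of heating savings per thermostat.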

Kitchen and Bathroom Faucet Aerators

The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas energy savings for kitchen and bathroom faucet aerators:

kWh savings = (GPMbase − GPMlow flow) × MPD × (PH / FH) × DR × 8.3 × (Tmix − Tinlet) × 365 / (RE × 3,412)

kW savings = (GPMbase − GPMlow flow) × 60 × 8.3 × DR × (Tmix − Tinlet) / (RE × 3,412) × CF

therm savings = (GPMbase − GPMlow flow) × MPD × (PH / FH) × DR × 8.3 × (Tmix − Tinlet) × 365 / (RG × 100,000)

Where:

GPMbase = Gallons per minute of baseline faucet aerator

GPMlow flow = Gallons per minute of low-flow faucet aerator

MPD = Average minutes of faucet use per person per day

PH = Average number of people per household

FH = Average number of faucets per household

DR = Percentage of water flowing down drain

Tmix = Mixed water temperature exiting faucet, ℉

Tinlet = Cold water temperature entering the DHW system, ℉

RE = Recovery efficiency of electric hot water heater

RG = Recovery efficiency of natural gas hot water heater

CF = Summer peak CF

60 = Minutes per hour

8.3 = Specific weight of water in pounds per gallon, multiplied by the specific heat of water (1.0 Btu/lb-°F)

3412 = Constant to convert Btu to kWh

365 = Days per year

100,000 = Constant to convert MMBtu to therm


Table 277 lists the assumptions and source of each assumption for the kitchen and bathroom faucet

aerator measure savings calculations.

Table 277. Ex Post Variable Assumptions for Faucet Aerators

Input Kitchen Value Bathroom Value Source

GPMbase 2.44 1.90 Indiana TRM (v2.2)

GPMlow flow 1.5 1 Actual

MPD 4.5 1.6 Indiana TRM (v2.2)

PH 2.64 2.64 Indiana TRM (v2.2)

FH 1 2.04 Indiana TRM (v2.2)

DR 0.5 0.7 Indiana TRM (v2.2)

Tmix 93 86 Indiana TRM (v2.2)

Tinlet 57.4 57.4 Indiana TRM (v2.2) averaged across participant

location

RE 0.98 0.98 Indiana TRM (v2.2)

RG 0.76 0.76 Indiana TRM (v2.2)

CF 0.0033 0.0012 Indiana TRM (v2.2)

Low-Flow Showerheads

The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas energy savings for low-flow showerheads:

kWh savings = (GPMbase − GPMlow flow) × MS × SPD × (PH / SH) × 8.3 × (Tmix − Tinlet) × 365 / (RE × 3,412)

kW savings = (GPMbase − GPMlow flow) × 60 × 8.3 × (Tmix − Tinlet) / (RE × 3,412) × CF

therm savings = (GPMbase − GPMlow flow) × MS × SPD × (PH / SH) × 8.3 × (Tmix − Tinlet) × 365 / (RG × 100,000)

Where:

GPMbase = Gallons per minute of baseline showerhead

GPMlow flow = Gallons per minute of low-flow showerhead

MS = Average minutes per shower event

SPD = Average number of shower events per person per day

PH = Average number of people per household

SH = Average number of showerheads per household

Tmix = Mixed water temperature exiting faucet, ℉

Tinlet = Cold water temperature entering the DHW system, ℉

RE = Recovery efficiency of electric hot water heater

RG = Recovery efficiency of natural gas hot water heater

CF = Summer peak CF


60 = Minutes per hour

8.3 = Specific weight of water in pounds per gallon, multiplied by the specific heat of water (1.0 Btu/lb-°F)

3,412 = Constant to convert Btu to kWh

365 = Days per year

100,000 = Constant to convert MMBtu to therm

Table 278 shows the input assumptions and the source of each assumption for the low-flow showerhead

measure savings calculations.

Table 278. Ex Post Variable Assumptions for Low-Flow Showerheads

Input Value Source

GPMbase 2.63 Indiana TRM (v2.2)

GPMlow flow 1.5 Actual

MS 7.8 Indiana TRM (v2.2)

SPD 0.6 Indiana TRM (v2.2)

PH 2.64 Indiana TRM (v2.2)

SH 1.6 Indiana TRM (v2.2)

Tmix 101 Indiana TRM (v2.2)

Tinlet – electric water heater 57.1 Indiana TRM (v2.2), averaged across participant location

Tinlet – natural gas water heater 57.2 Indiana TRM (v2.2), averaged across participant location

RE 0.98 Indiana TRM (v2.2)

RG 0.76 Indiana TRM (v2.2)

CF 0.0023 Indiana TRM (v2.2)

Pipe Wrap Insulation

The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas energy savings for pipe wrap insulation:

kWh savings = (1 / Rexist − 1 / Rnew) × L × C × ∆T × 8,760 / (NDHW × 3,412)

kW savings = ∆kWh / 8,760

therm savings = (1 / Rexist − 1 / Rnew) × L × C × ∆T × 8,760 / (NDHW × 100,000)

Where:

Rexist = Pipe heat loss coefficient (R-value) of existing, uninsulated pipe, °F·hr·ft²/Btu

Rnew = Pipe heat loss coefficient (R-value) of insulated pipe, °F·hr·ft²/Btu

L = Length of pipe from water heating source covered by pipe wrap, in feet


C = Circumference of pipe, in feet

∆T = Average temperature difference between supplied water and ambient air

temperature, ℉

NDHW = Recovery efficiency of hot water heater

8,760 = Hours per year

3,412 = Constant to convert Btu to kWh

100,000 = Constant to convert MMBtu to therm

Table 279 lists the assumptions and source of each assumption for the pipe wrap insulation measure

savings calculations.

Table 279. Ex Post Variable Assumptions for Pipe Wrap Insulation

Input Value Source

Rexist 1 Indiana TRM (v2.2)

Rnew 3 Indiana TRM (v2.2)

L 1 Calculate savings per foot of pipe length

C 0.131 Circumference for 0.5” pipe

∆T 65 Indiana TRM (v2.2)

NDHW (electric) 0.98 Indiana TRM (v2.2)

NDHW (natural gas) 0.75 Indiana TRM (v2.2)

Filter Whistles

The evaluation team used the following equations to calculate electric energy and demand savings for filter whistles:

kWh savings = kWh savingsheat + kWh savingscool

kWh savingsheat = kWmotor × FLHheat × EI

kWh savingscool = kWmotor × FLHcool × EI

kW savingspeak = (kWh savingscool / FLHcool) × CF

Where:

kWmotor = Average motor full load electric demand

FLHheat = Full load heating hours, hr

FLHcool = Full load cooling hours, hr

EI = Efficiency improvement

CF = Coincidence factor


Table 280 shows the input assumptions and the source of each assumption in the filter whistle measure

savings calculations.

Table 280. Ex Post Variable Assumptions for Filter Whistles

Input Value Source

kWmotor 0.5 2016 Pennsylvania TRM

FLHcool 431 Indiana TRM (v2.2), for South Bend

FLHheat 1,427 Indiana TRM (v2.2), for South Bend

EI 0.15 2016 Pennsylvania TRM

CF 0.647 2016 Pennsylvania TRM
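The Table 280 inputs can be run through the filter whistle equations directly (sketch; function names illustrative). Note that FLHcool cancels out of the peak demand equation, so peak savings reduce to kWmotor × EI × CF:

```python
def whistle_kwh_savings(kw_motor, flh_heat, flh_cool, ei):
    """Annual fan-motor kWh savings across heating and cooling hours."""
    return kw_motor * (flh_heat + flh_cool) * ei

def whistle_kw_savings(kw_motor, ei, cf):
    """Summer peak kW savings; FLHcool cancels out of the peak equation."""
    return kw_motor * ei * cf

kwh = whistle_kwh_savings(0.5, 1427, 431, 0.15)
kw = whistle_kw_savings(0.5, 0.15, 0.647)
```

These inputs imply roughly 139.4 kWh and 0.049 kW per filter whistle.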

Water Heater Setback

The evaluation team used the following equations to calculate electric and natural gas energy savings for water heater temperature setback:

kWh savings = U × A × (Tpre − Tpost) × Hours / (3,412 × RE)

kW savings = (∆kWh / Hours) × CF

therm savings = U × A × (Tpre − Tpost) × Hours / (100,000 × RG)

Where:

U = Overall heat transfer coefficient of water heater tank, Btu/(hr·°F·ft²)

A = Surface area of tank, ft2

Tpre = Hot water setpoint prior to adjustment

Tpost = Hot water setpoint after adjustment

Hours = Number of hours in a year

RE = Recovery efficiency of electric hot water heater

RG = Recovery efficiency of natural gas hot water heater

CF = Summer peak CF

Table 281 lists the assumptions and source of each assumption for the hot water heater setback

measure savings calculations.


Table 281. Ex Post Variable Assumptions for Water Heater Setback

Input Value Source

U 0.083 Illinois TRM (v6.0)

A 24.99 Illinois TRM (v6.0), 50-gallon tank

Tpre 135 Illinois TRM (v6.0)

Tpost 120 Illinois TRM (v6.0)

RE 0.98 Illinois TRM (v6.0)

RG 0.78 Illinois TRM (v6.0)

CF 1 Illinois TRM (v6.0)

Hours 8,760 Hours per year

Note: This measure is not available in the Indiana TRM (v2.2).
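The standby-loss arithmetic behind Table 281 can be verified with a short sketch (function names are illustrative; inputs are the Illinois TRM v6.0 values above):

```python
def setback_kwh_savings(u, area, t_pre, t_post, hours, re):
    """Annual kWh savings from reduced standby losses (electric tank)."""
    return u * area * (t_pre - t_post) * hours / (3412 * re)

def setback_therm_savings(u, area, t_pre, t_post, hours, rg):
    """Annual therm savings from reduced standby losses (natural gas tank)."""
    return u * area * (t_pre - t_post) * hours / (100_000 * rg)

# Table 281 values: 50-gallon tank set back from 135 F to 120 F
kwh = setback_kwh_savings(0.083, 24.99, 135, 120, 8760, 0.98)
therms = setback_therm_savings(0.083, 24.99, 135, 120, 8760, 0.78)
```

These inputs imply roughly 81.5 kWh (electric) or 3.5 therms (gas) per setback.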

Duct Sealing

The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas energy savings for duct sealing:

kWh savings (heating) = ((DEafterheat − DEbeforeheat) / DEafterheat) × (FLHheat × Btuhheat / (3,412 × Nheat))

kWh savings (cooling) = ((DEaftercool − DEbeforecool) / DEaftercool) × (FLHcool × Btuhcool / (SEER × 1,000))

kW savings = ((DEpkafter − DEpkbefore) / DEpkafter) × (Btuhcool / (EER × 1,000)) × CF

therm savings = ((DEafterheat − DEbeforeheat) / DEafterheat) × (FLHheat × Btuhheat / 100,000)

Where:

DEaftercool = Distribution efficiency after duct sealing

DEbeforecool = Distribution efficiency before duct sealing

DEafterheat = Distribution efficiency after duct sealing

DEbeforeheat = Distribution efficiency before duct sealing

DEpkafter = Distribution efficiency under peak summer conditions after duct sealing

DEpkbefore = Distribution efficiency under peak summer conditions before duct sealing

FLHcool = Full load cooling hours

FLHheat = Full load heating hours

BtuHcool = Cooling capacity of cooling equipment (Btu per hour)

BtuHheat = Heating capacity of heating equipment (Btu per hour)

Nheat = Efficiency in COP of heating equipment

SEER = Seasonal average efficiency of air conditioning equipment


EER = Peak efficiency of air conditioning equipment

CF = Coincidence factor

Table 282 lists the assumptions and source of each assumption for the duct sealing measure savings

calculations.

Table 282. Ex Post Variable Assumptions for Duct Sealing

Input Value Source

DEaftercool 0.84 2015 EM&V, Average duct in unconditioned basement and attic, 8-10% leakage

DEbeforecool 0.75 2015 EM&V, Average duct in unconditioned basement and attic, 15-30% leakage

DEafterheat 0.82 2015 EM&V, Average duct in unconditioned basement and attic, 8-10% leakage

DEbeforeheat 0.75 2015 EM&V, Average duct in unconditioned basement and attic, 15-30% leakage

DEpkafter 0.79 2015 EM&V, Average duct in unconditioned basement and attic, 8-10% leakage

DEpkbefore 0.68 2015 EM&V, Average duct in unconditioned basement and attic, 15-30% leakage

FLHcool 431 Indiana TRM (v2.2), Joint customers, natural gas heat and central air conditioning, and

Electric cooling customers, averaged across participant location

FLHheat 1,427 Indiana TRM (v2.2), Joint customers, natural gas heat and central air conditioning, and

Electric heating only customers, averaged across participant location

FLHcool 419 Indiana TRM (v2.2), Electric customers, electric heat and central air conditioning,

averaged across participant location

FLHheat 1,413 Indiana TRM (v2.2), Electric customers, electric heat and central air conditioning,

averaged across customer location

FLHheat 1,406 Indiana TRM (v2.2), Natural gas heating customers, averaged across customer location

Btuhcool 28,994 Indiana TRM (v2.2)

Btuhheat – electric 32,000 2016 Pennsylvania TRM

Btuhheat – fossil fuel 77,386 Indiana TRM (v2.2)

SEER 11.15 Indiana TRM (v2.2)

EER 10.035 SEER*0.9, Indiana TRM (v2.2)

Nheat 1.0 Indiana TRM (v2.2)

CF 0.88 Indiana TRM (v2.2)

Constant 1,000 Convert W to kW

Constant 3,412 Convert Btu to kWh

Constant 100,000 Convert Btu to therm
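As a check on the Table 282 inputs, the gas therm savings and peak kW savings can be computed as follows (sketch only; function names illustrative; the example uses the natural-gas-customer heating hours and the peak distribution efficiencies):

```python
def duct_therm_savings(de_after, de_before, flh_heat, btuh_heat):
    """Heating therm savings from improved duct distribution efficiency."""
    return ((de_after - de_before) / de_after) * flh_heat * btuh_heat / 100_000

def duct_kw_savings(de_pk_after, de_pk_before, btuh_cool, eer, cf):
    """Summer peak kW savings."""
    return ((de_pk_after - de_pk_before) / de_pk_after) * btuh_cool / (eer * 1000) * cf

therms = duct_therm_savings(0.82, 0.75, 1406, 77_386)  # natural gas heating customer
kw = duct_kw_savings(0.79, 0.68, 28_994, 10.035, 0.88)
```

These inputs imply roughly 92.9 therms and 0.35 kW per sealed duct system.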

Air Sealing

The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas energy savings for air sealing:

kWh savings = ((CFM50exist − CFM50new) / Nfactor) × (∆kWh / CFM)

kW savings = ((CFM50exist − CFM50new) / Nfactor) × (∆kW / CFM) × CF

therm savings = ((CFM50exist − CFM50new) / Nfactor) × (∆MMBtu / CFM) × CF × 10

Where:

CFM50exist = Average of initial blower door results, by customer type, in CFM, pressurized

at 50 pascal; amount of leakage in the home prior to air sealing measures

CFM50new = Average of post measure blower door results, by customer type, in CFM,

pressurized at 50 pascal; amount of leakage in the home after installing air

sealing measures.

N-factor = Constant to convert 50 pascal air flow to natural airflow, dependent on

exposure levels (Table 283).

∆kWh/CFM = Expected savings in kWh per CFM reduction

∆kW/CFM = Expected savings in kW per CFM reduction

∆MMBtu/CFM = Expected savings in MMBtu per CFM reduction

CF = Summer peak CF

Table 283. N-factor by Number of Stories, Normal Exposure

Number of Stories Value

1 Story 18.5

1.5 Stories 16.7

2 Stories 14.8

Source: Indiana TRM (v2.2)

Table 284 lists the assumptions and source of each assumption for the air sealing savings calculations.

Table 284. Ex Post Variable Assumptions for Air Sealing

Input Value Source

∆CFM50 – Joint, with CAC 835 Averaged over homes in 2016 program tracking data; Indiana TRM (v2.2)

∆CFM50 – Joint, no CAC 881 Averaged over homes in 2016 program tracking data; Indiana TRM (v2.2)

∆CFM50 – Electric, with and without CAC 853 Averaged over homes in 2016 program tracking data; Indiana TRM (v2.2)

∆CFM50 – Natural Gas 858 Averaged over homes in 2016 program tracking data; Indiana TRM (v2.2)

N-factor 16.7 Average program tracking data, normal exposure; Indiana TRM (v2.2)

∆kWh/CFM 1.7 Joint customers with CAC, averaged over homes in program tracking data;

Indiana TRM (v2.2)

∆kWh/CFM 1.0 Joint customers without CAC, averaged over homes in program tracking data;

Indiana TRM (v2.2)

∆kWh/CFM 47.6 Electric customers with electric heat and CAC, averaged over homes in program

tracking data; Indiana TRM (v2.2)

∆kWh/CFM 46.5 Electric customers with electric heat without CAC, averaged over homes in

program tracking data; Indiana TRM (v2.2)

∆kW/CFM 0.001 Joint customers with CAC, averaged over homes in program tracking data;

Indiana TRM (v2.2)

∆kW/CFM 0.003 Electric customers with CAC, averaged over homes in program tracking data;

Indiana TRM (v2.2)


∆MMBtu/CFM 0.21 Averaged over homes in program tracking data; Indiana TRM (v2.2)

CF 0.88 Indiana TRM (v2.2)
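The air sealing arithmetic is easy to verify with the Table 284 inputs (a sketch; function names are illustrative, and the therm example keeps the CF term exactly as it appears in the equation above):

```python
def air_sealing_kwh(delta_cfm50, n_factor, kwh_per_cfm):
    """kWh savings: 50 Pa leakage reduction scaled to natural infiltration."""
    return (delta_cfm50 / n_factor) * kwh_per_cfm

def air_sealing_therms(delta_cfm50, n_factor, mmbtu_per_cfm, cf):
    """therm savings; the factor of 10 converts MMBtu to therms."""
    return (delta_cfm50 / n_factor) * mmbtu_per_cfm * cf * 10

kwh = air_sealing_kwh(835, 16.7, 1.7)               # joint customer with CAC
therms = air_sealing_therms(858, 16.7, 0.21, 0.88)  # natural gas customer
```

These inputs imply roughly 85 kWh and 94.9 therms per home.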

Attic Insulation

The evaluation team used the following equations to calculate electric energy, peak demand, and natural gas energy savings for attic insulation:

kWh savings = (SF / 1,000) × (∆kWh / kSF)

kW savings = (SF / 1,000) × (∆kW / kSF) × CF

therm savings = (SF / 1,000) × (∆MMBtu / kSF) × 10

Where:

SF = Total area of installed attic insulation in square feet

∆kWh/ksf = Energy savings expected for every 1,000 square feet of insulation installed with

respect to pre-install and post-install R-values from data tracking information

∆kW/ksf = Demand reduction expected for every 1,000 square feet of insulation installed

with respect to pre-install and post-install R-values from data tracking

information

∆MMBtu/ksf = Natural gas savings expected for every 1,000 square feet of insulation installed

with respect to pre-install and post-install R-values from data tracking

information

CF = Coincidence factor

Electric energy, peak demand, and natural gas energy savings are dependent upon pre-install and post-

install insulation R-values, calculated using the following steps:

• Step 1. Determine variables for insulation compression, Rratio, and void factors

• Step 2. Calculate adjusted R-values, Radj

• Step 3. Interpolate within Indiana TRM (v2.2) tables to obtain savings per 1,000 square feet of insulation: ∆kWh/kSF, ∆kW/kSF, and ∆MMBtu/kSF

Step 1. Determine variables for insulation compression, Rratio, and void factors:

Adjusted pre-install and post-install R-values are calculated using the following formula:

Radj = Rnominal x Fcompression x Fvoid


Where:

Rnominal = Total installed R-value per manufacturer’s specifications. This value varies across

participants and was calculated on an individual level to account for individual

savings between pre-install and post-install conditions.

Fcompression = Insulation compression factor, assumed to be 1 for 0% compression (as shown in

Indiana TRM v2.2), because actual information is unknown.

Fvoid = Void factor, dependent on insulation grade level and percent coverage, assumed to

be at the 2% grade per the Indiana TRM (v2.2), because actual information is

unknown.

The void factor, Fvoid, varies based on the ratio between the full assembly R-value and the nominal R-value, Rnominal, including compression effects. Pre-install and post-install insulation values are determined next, using the following equation:

Rratio = (Rnominal × Fcompression) / (Rnominal + Rframing&airspace)

Where:

Rnominal = Total installed R-value per manufacturer’s specifications. This value varies across

participants and was calculated on an individual level to account for individual

savings between pre-install and post-install conditions.

Fcompression = Insulation compression factor, assumed to be 1 for 0% compression (as shown in the Indiana TRM v2.2), because actual information is unknown.

Rframing&airspace = R-value for materials, framing, and airspace for the area in which the insulation

is installed. Assumed to be R-5, per Indiana TRM (v2.2).

Values for void factors, based on the Rratio calculation are shown in Table 285. The evaluation team

assumed a void factor at 2% in accordance with the Indiana TRM (v2.2).

Table 285. Attic Insulation Void Factors

Rratio Fvoid, 2%

0.50 0.96

0.55 0.96

0.60 0.95

0.65 0.94

0.70 0.94

0.75 0.92

0.80 0.91

0.85 0.88

0.90 0.83

0.95 0.71

0.99 0.33


Step 2. Calculate Radj

Pre-install and post-install R-values, Radj, are calculated at the participant level using Rnominal and Rratio.

Step 3. Determine ∆kWh/ksf, ∆kW/ksf, and ∆MMBtu/ksf

Electric energy, peak demand, and natural gas energy savings per thousand square feet values were

obtained by interpolating within Indiana TRM (v2.2) tables and averaging across participant locations.
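Steps 1 and 2 can be sketched as a short calculation (illustrative only: the R-30 input is hypothetical, and the linear interpolation in the Table 285 void-factor column is our reading of the TRM procedure):

```python
def r_ratio(r_nominal, f_compression=1.0, r_framing=5.0):
    """Rratio per the formula above; R-5 framing/airspace per Indiana TRM (v2.2)."""
    return r_nominal * f_compression / (r_nominal + r_framing)

def f_void(ratio):
    """Linear interpolation within the 2%-grade void-factor column of Table 285."""
    table = [(0.50, 0.96), (0.55, 0.96), (0.60, 0.95), (0.65, 0.94),
             (0.70, 0.94), (0.75, 0.92), (0.80, 0.91), (0.85, 0.88),
             (0.90, 0.83), (0.95, 0.71), (0.99, 0.33)]
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if x0 <= ratio <= x1:
            return y0 + (ratio - x0) / (x1 - x0) * (y1 - y0)
    raise ValueError("Rratio outside tabulated range")

# Hypothetical post-install nominal R-30 attic with 0% compression
ratio = r_ratio(30)               # 30 / 35
r_adj = 30 * 1.0 * f_void(ratio)  # Radj = Rnominal x Fcompression x Fvoid
```

For a nominal R-30 layer this gives Rratio of about 0.857 and an adjusted R-value of about 26.2.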

Refrigerator Replacement

The evaluation team used the following equations to calculate electric energy and peak demand savings for refrigerator replacement:

kWh savings = UECexist − UECes

kW savings = (∆kWh / 8,760) × TAF × ∆LSAF

Where:

UECexist = Energy consumption of existing unit, kWh

UECes = Energy consumption of new, efficient unit, kWh

TAF = Temperature adjustment factor

ΔLSAF = Load shape adjustment factor

Table 286 lists the assumptions and source of each assumption for the refrigerator replacement

measure savings calculations.

Table 286. Ex Post Variable Assumptions for Refrigerator Replacement

Input Value Source

UECexist Varies by age, size, and configuration ENERGY STAR calculator website

UECes Actual ENERGY STAR rating

TAF 1.21 Indiana TRM (v2.2)

LSAFexist 1.063 Indiana TRM (v2.2)

LSAFnew 1.124 Indiana TRM (v2.2)


Appendix J. Cost-Effectiveness

This chapter details the cost-effectiveness analysis results for each of the NIPSCO electric and gas energy efficiency programs, as well as for the portfolios of programs, for measures implemented in the 2018 program year. This benefit/cost analysis covers the second year of a three-year effort. Evaluating the cost-effectiveness of NIPSCO's energy efficiency programs involved following procedures specified in the Indiana Evaluation Framework (Framework), which primarily draws upon the California Standard Practice Manual (SPM).

Adherence to Framework and SPM procedures may follow a number of paths, but two approaches

prove most prevalent:

• One involves evaluating ex ante cost-effectiveness (i.e., the cost-effectiveness of proposed

programs)

• The other involves evaluating energy efficiency programs on an ex post basis

The ex ante approach uses projected measure impacts and costs, while the ex post approach uses actual

load impact results from EM&V and actual program costs. For this evaluation, the Evaluation Team used

the latter approach (i.e., ex post) for the cost-effectiveness analysis.

This chapter includes costs associated with the development, start-up, rollout, and operational

adjustments associated with the year 2018. Benefit/cost assessments presented in this chapter include

NIPSCO’s program implementation costs (e.g., administrative, marketing, EM&V, overhead costs). The

resulting benefit/cost assessments provide a perspective on the cost-effectiveness of NIPSCO’s

performance, including its oversight or management function costs.

Cost-effectiveness analysis is a form of economic analysis that compares an investment’s relative costs

and benefits. In the energy efficiency industry, it serves as an indicator of the energy supply’s relative

performance or the economic attractiveness of energy efficiency investments or practices, compared to

the costs of energy produced and delivered in the absence of such investments (but without considering

the value or costs of non-energy benefits or non-included externalities). The typical cost-effectiveness

formula provides an economic comparison of costs and benefits.

The Framework (and ultimately the SPM) guides the cost-effectiveness analyses of these energy

efficiency programs. The chapter provides benefit/cost test results for each program and for each full

portfolio. Though these tests are not necessarily used to recover costs, these tests provide information

that can be used to improve decisions regarding which programs should be adjusted or continued within

an energy efficiency portfolio. NIPSCO may use this evaluation’s results to true-up previous estimates

used in its cost-recovery mechanism.

Analysis results include the primary Framework tests at the program and portfolio levels, with all tests

reported based on the net present value of benefits versus costs. These tests employ the full effective

useful life (EUL) of installed measures and the utility’s cost of capital, as though program funds were


acquired via a utility loan from capital supply markets at a rate similar to that borrowed to construct a

new generation plant.

Cost-Effectiveness Model Description

EM&V and cost-effectiveness modeling prove critical to the long-term success of electric and gas energy

efficiency programs. To understand cost-effectiveness, the utility/program administrator should use a

model that enables them to evaluate changes to individual programs and to the portfolio. Such a model

should include (but not be limited to) the ability to evaluate cost-effectiveness impacts from changes in

numerous factors (e.g., incentive levels, participant levels, measure savings, measure costs, avoided

costs, end-use load shapes, coincident peak factors, NTG factors, administrative costs, and addition or

deletion of measures or programs).

To provide the best and most accurate DSM/demand response/energy efficiency portfolio cost-

effectiveness modeling, the Evaluation Team used the DSMore model—considered the leading

DSM/energy efficiency benefit/cost modeling tool in the country. DSMore, developed by Integral

Analytics, is currently being used by utilities in approximately 35 states and by numerous state

regulatory commissions. NIPSCO also has relied upon DSMore cost-effectiveness tool results for its ex

ante program planning and ex post evaluation. DSMore is the only tool in the country that captures

hourly price and load volatilities across multiple weather years, an ability needed to assess a program’s

true cost-effectiveness under expected supply and load conditions (even in extreme weather situations).

In its simplest form, energy efficiency’s cost-effectiveness can be measured by comparing the benefits of

an investment with the costs. Typically, the SPM identifies five cost-effectiveness tests for use in energy

efficiency program evaluations:

• The participant cost test (PCT)

• The utility cost test (UCT; sometimes called the program administrator cost test)

• The ratepayer impact measure test (RIM)

• The total resource cost test (TRC)

• The societal cost test (SCT)

For this EM&V analysis, the Evaluation Team did not use the SCT, as estimates of environmental and

other non-energy costs and benefits were not readily available and remained highly uncertain.88 The TRC

test result provides the closest proxy to the SCT.

88 These could include effects like the following: the value of power plant emissions displaced (or avoided) by the

programs’ direct energy impacts; direct and indirect effects of dollar flows in the economy of northern

Indiana; economic benefits from increased equipment life; productivity improvements; lowered waste

generation; increased sales; reduced personnel needs; reduced injuries and illnesses; reduced repair and maintenance

expenses; and increased property values.


The four remaining SPM tests consider the impacts of energy efficiency programs from different points

of view within the energy system. Though each test provides a unique stakeholder’s perspective, taken

together, the tests can provide a comprehensive view of program viability. The tests also can be used to

improve program design by answering questions such as the following:

• Is the program cost-effective overall?

• Are some costs or incentives too high or too low?

• What will be the potential impact on customer rates?

Each cost-effectiveness test shares a common structure, with all comparing total benefits and total costs

in dollars from a certain point of view to determine whether or not overall benefits exceed costs. A

program passes a cost-effectiveness test if it produces a benefit/cost ratio greater than 1.0; it fails the

test if it produces a benefit/cost ratio less than 1.0, as shown in the following equation:

Benefit/Cost Ratio = Benefits / Costs = NPV(Σ benefits, $) / NPV(Σ costs, $)

Table 287 and Table 288 provide an overview of the four SPM tests the Team used for this evaluation.

Table 287. Overview of Cost-Effectiveness Tests

• PCT. Objective: Are there positive benefits to the customer? Comparison: costs and benefits of customers installing measures.

• UCT. Objective: Will utility costs increase or decrease? Comparison: the program administration’s cost to achieve supply-side resource cost savings.

• RIM. Objective: Will utility rates increase? Comparison: program administration cost and utility bill reductions to achieve supply-side resource cost savings.

• TRC. Objective: Will the total cost of energy in the utility service territory decrease? Comparison: program administrator and customer costs to achieve utility resource savings.

Table 288. SPM Costs and Benefits

• Avoided energy costs (fuel and operation and maintenance of power plants and transmission and distribution [T&D] lines)*: benefit in the UCT, RIM, and TRC.

• Avoided capacity costs (constructing power plants, T&D lines, pipelines)*: benefit in the UCT, RIM, and TRC.

• Other benefits (fossil fuel savings, water savings, equipment operation and maintenance): benefit in the PCT and TRC.

• Participants’ incremental cost (above baseline) of efficient equipment: cost in the PCT and TRC.

• Program administration costs (staff, marketing): cost in the UCT, RIM, and TRC.

• Incentives (rebates): benefit in the PCT; cost in the UCT and RIM.

• Lost utility revenue/lower energy bills (due to lower sales): benefit in the PCT; cost in the RIM.

*Avoided energy and capacity costs are net of free-rider impacts.


The cost-effectiveness tests also examine measures from different perspectives:

• The TRC compares a program’s total costs and benefits for the whole population of customers.

The costs include the total costs to the utility and the incremental participation costs for

customers, while the benefits include tax incentives and avoided supply costs. The TRC

benefit/cost ratio is computed based on the present value of program benefits (primarily the

avoided cost of capacity, generation, and transmission/distribution) relative to the cost of total

program implementation and operation as well as incremental customer costs.

• The UCT measures a program’s net costs as a resource option, based on costs incurred by the

program administrator. The UCT’s benefits are the same as the TRC’s (i.e., energy savings and

demand reduction values); however, the more narrowly defined costs do not include

incremental customer costs but do include incentives paid to participants.

• The PCT assesses cost-effectiveness from the participating customers’ perspectives by

calculating a customer’s quantifiable benefits and costs for participating in the program. As

many customers do not base their participation decisions entirely on quantifiable variables, this

test does not necessarily provide a complete measure of all benefits and costs perceived by

a participant.

• The RIM measures a program’s effect on consumer rates due to resulting changes in utility

revenues and operating costs. This test indicates the direction and magnitude of expected

potential impacts on rates.

The following formulas describe the tests using the terminology from DSMore:

TRC = (Avoided Costs + Tax Saved) / (Utility Costs Net of Incentives + Participant Incremental Costs)

UCT = Avoided Costs / Utility Costs

PCT = (Lost Revenue + Incentives + Tax Savings) / Participant Incremental Costs

RIM = Avoided Costs / (Utility Costs + Lost Revenue)

RIM (Net Fuel) = Avoided Costs / (Utility Costs + Lost Revenue Net of Fuel)
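The four ratios can be sketched directly from these formulas. The dollar figures below are hypothetical NPV amounts, and the function and argument names are illustrative rather than DSMore's.

```python
# Sketch of the four SPM benefit/cost ratios defined above.
# All arguments are NPV dollar amounts; the sample values are made up.

def spm_tests(avoided, tax_saved, utility_costs, incentives,
              participant_costs, lost_revenue):
    """Return the TRC, UCT, PCT, and RIM ratios per the formulas above."""
    return {
        "TRC": (avoided + tax_saved)
               / ((utility_costs - incentives) + participant_costs),
        "UCT": avoided / utility_costs,
        "PCT": (lost_revenue + incentives + tax_saved) / participant_costs,
        "RIM": avoided / (utility_costs + lost_revenue),
    }

ratios = spm_tests(avoided=500_000, tax_saved=0, utility_costs=100_000,
                   incentives=60_000, participant_costs=200_000,
                   lost_revenue=300_000)
print({name: round(value, 2) for name, value in ratios.items()})
```

A program passes a given test when its ratio exceeds 1.0. Note that RIM places lost revenue in the denominator, which is why RIM ratios in the report's results tend to fall below 1.0 even when the other tests pass.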

DSMore Overview

DSMore is a financial analysis tool designed to evaluate the costs, benefits, and risks of electric and gas

energy efficiency programs and measures. The tool estimates the value of an energy efficiency measure

at an hourly level, across distributions of weather and/or energy costs or prices. By examining energy

efficiency performance and cost-effectiveness over a wide variety of weather and cost conditions, the

evaluator can better measure the risks and benefits of energy efficiency measures.


The tests address a range of weather conditions, including normal weather, and various cost and market

price conditions. Designed to analyze extreme conditions, DSMore allows users to obtain a distribution

of cost-effectiveness outcomes or expectations. Avoided costs for energy efficiency tend to increase

with increasing market prices and/or more extreme weather conditions due to the covariance between

load and costs/prices. Understanding the manner in which energy efficiency cost-effectiveness varies

under these conditions allows for a more precise valuation of energy efficiency and demand response

programs.

Using valuation or modeling methods that employ averages (e.g., annual use, monthly use, weather

normal load profiles) instead of actual and forecasted hourly usage and avoided costs, by definition,

undervalues energy efficiency and DSM programs, which tend to exhibit higher savings during times of

higher avoided costs (i.e., from HVAC, weatherization, and demand response). For programs exhibiting

around-the-clock energy savings, averaging-type methods yield results equivalent to DSMore results.

Such programs, however, do not represent the norm (exceptions include refrigeration and lighting left

on at all hours).

In all other cases, averaging-based methods yield cost-effectiveness test results lower than actual

values. DSMore’s methods and algorithms avoid this potential error through a very granular use of

hourly energy savings and hourly avoided costs, linked via the same set of actual, local, hourly

weather histories.

Generally, DSMore requires users to input specific information regarding energy efficiency measures or

programs to be analyzed as well as a utility’s cost and rate information. Such inputs enable analysis of a

measure’s or program’s cost-effectiveness.

DSMore uses a set of program inputs, entered through the first two tabs (worksheets) of an Excel

interface. These are combined with electric and gas load shape and price data to calculate the cost-

effectiveness tests. Integral Analytics creates a custom set of hourly loads and prices, based on analysis

over a 30+ year period for each customer type. Load files are specific to the customer class served by

energy efficiency programs.

The user enters the measure information data into Excel, selects the appropriate load file, selects the

appropriate price file, and executes DSMore. DSMore uses these inputs to calculate the cost-

effectiveness tests and then exports the results into the same Excel workbook (as worksheet tabs 3

through 8). Figure 131 provides an overview of DSMore and shows how the key inputs relate to the

application engine.


Figure 131. DSMore Overview

Analytical Approach

It is important to capture the relationship between loads and prices. Table 289 addresses two scenarios:

one using average load and prices and one using hourly loads and prices. Both scenarios use the same

average load (2 MW) and the same average price ($50/MWh) over the time period. However, the total

value of the hourly analysis differs ($500 versus $620). The program’s actual value in this example is 24%

higher when using hourly costs to estimate benefits compared to using average pricing.

Table 289. Average Versus Hourly Valuation

         Average Loads and Prices          Hourly Loads and Prices
Hour     MW    Cost/MWh    Total Cost      MW    Cost/MWh    Total Cost
1        2     $50         $100            1     $20         $20
2        2     $50         $100            1     $20         $20
3        2     $50         $100            2     $50         $100
4        2     $50         $100            3     $80         $240
5        2     $50         $100            3     $80         $240
Average  2     $50                         2     $50
Total                      $500                              $620
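The Table 289 arithmetic can be reproduced in a few lines; the hour-by-hour loads and prices below are copied from the table.

```python
# Reproducing Table 289: the same average load (2 MW) and average
# price ($50/MWh) valued with averages versus hour by hour.

hours = [(1, 20), (1, 20), (2, 50), (3, 80), (3, 80)]  # (MW, $/MWh)

avg_mw = sum(mw for mw, _ in hours) / len(hours)       # 2 MW
avg_price = sum(p for _, p in hours) / len(hours)      # $50/MWh
average_value = avg_mw * avg_price * len(hours)        # $500

hourly_value = sum(mw * price for mw, price in hours)  # $620

print(average_value, hourly_value)
print(f"{hourly_value / average_value - 1:.0%}")       # 24% higher
```

The 24% gap arises entirely from the covariance between load and price: the high-load hours are also the high-price hours, which averaging hides.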

To perform the hourly analysis, DSMore correlates historic loads and prices to actual historic weather.

DSMore uses the relationships (along with the covariance) between loads, weather, and price, along

with the probability distributions of these relationships, to calculate approximately 700 different

market/load/price scenarios, each with a unique test result. DSMore reports the endpoints or extremes

of this distribution, conveniently reducing the number of test results in the Excel output (i.e., reporting

five to nine test results rather than 693). Users can, however, simply use one test result, which reflects

their preferred set of avoided costs across normal weather conditions.


DSMore offers another more versatile function: simultaneously assessing multiple cost-effectiveness

results over many different avoided cost scenarios. For each weather-year scenario, DSMore can assess

21 different electric market/cost/price scenarios. Typically, DSMore uses 33 years of

weather as a default, yielding 693 cost-effectiveness test results (i.e., 33 weather scenarios multiplied by

21 market scenarios) to reflect a full spectrum of possible valuations for a particular program.

Figure 132 depicts how 693 weather/price scenarios capture the extremes, which an averaging type of

avoided cost method would ignore.

Figure 132. Weather and Market Scenarios

The average value of these tests represents an average, weather-normalized expectation across all

possible market prices and forward cost scenarios. Selecting one market price scenario (today’s value)

provides test results for the current market (this year) across 30 or more weather scenarios. Using fewer

than 30 years of weather jeopardizes the accuracy of the estimated normal weather and extreme

weather effects. DSMore strives to reflect an appropriate range or distribution of highs to lows, and

seeks to ensure appropriately extreme hourly weather patterns are reflected and valued, given

historically observed extreme hourly weather.

Regarding forward electric market prices, DSMore uses 21 different forward curves. The first set of

avoided costs or prices reflects traditional cost-based avoided costs (e.g., system lambda, avoided

production costs), leaving 20 different forward market price curves, ranging from low to high (e.g.,

$30/MWh to $70/MWh on average over 8,760 hours). Use of 20 curves ensures that approximately

every fifth percentile of increase in the forward curves can be observed. Generally, this means one can

safely interpolate test results between five percentiles (e.g., the 45th percentile and the 50th percentile)

in a linear fashion. It is not appropriate, however, to interpolate test results linearly between, for

example, the 25th percentile and the 80th percentile. Energy prices are notoriously nonlinear (i.e., peak

prices are much more volatile than off-peak prices).


The key benefit to valuing energy efficiency across such a wide range of future cost and weather

conditions arises from the ability to not only quantify short-run cost-effectiveness, but to create

long-term predictions.

DSMore’s Option Test result provides a long-run cost-effectiveness perspective. Essentially, the Option

Test result values programs across 21 future possible sets of avoided costs, and across 30 to 40 years of

actual hourly weather patterns. Traditionally, utilities only calculate one test result for the current year’s

avoided costs, but energy prices tend to boom and bust over time, reaching high price conditions during

periods of short supply (such as in 1999 and 2000) and low avoided cost conditions during periods of

excess supply (such as in 2003 and 2009). Valuing energy efficiency programs across all possible future

avoided cost conditions makes it easier to determine whether programs prove cost-effective for all

future possibilities rather than just for the current year. This produces meaningful short-run and long-run

test results. The long-run test result is called the Option Test because energy savings can be viewed as

an option (although it may not be possible to execute this because, for example, lighting equipment may

already be installed and cannot be removed) in comparison to paying for future, potentially higher

avoided cost conditions.

Inputs to Cost-Effectiveness Analysis

Hourly Prices and Energy Savings

Best practice cost-effectiveness modeling begins with hourly prices and hourly energy savings from the

specific measures/technologies considered and correlates both of these to weather. In turn, this allows

the model to capture and apply appropriate values to low-probability, high-consequence weather

events. This captures a more accurate view of the efficiency measure’s value compared to other supply

options. Additionally, completing the analysis requires several inputs, as summarized below.

The foundation of the hourly price analysis used for the study derives from an analysis of historic hourly

price data, matched with hourly weather to measure the price-to-weather covariance. The analysis can

measure the overall variation and portion attributable to weather, arriving at a normal weather price

distribution. Price variation results from several uncertain variables, including weather. Using over

30 years of weather data along with a relationship developed from two years of actual price and

weather data, allows the analysis to measure the full range of possible outcomes, reported in the

DSMore results as Minimum, Today's (expected), and Maximum test ratios.

Program-Related Inputs

The model’s program inputs include participation rates, incentives paid, measure load savings, measure

life, implementation costs, administrative costs including free rider incentives, and incremental costs to

the participant. These inputs come from the EM&V activities that the EM&V project team supplied to

Integral Analytics for cost-effectiveness analysis. The Evaluation Team applied measured kWh savings to

the appropriate hours for the customer, based on load curves for the customer group most likely to

install the measure. For example, the Team used the commercial load curve for commercial measures

and may have used various commercial load curves, depending on the measure type and size installed.


The Team calculated the value of electric savings by hour based on that hour’s market value for the

measure EUL, given the assumed escalation rates. This avoided cost served as the present value benefit,

with savings valued to today’s dollars. The model then used these present values relative to the

associated costs to determine the cost-effectiveness test results.

Effective Useful Life

Energy savings from installed energy efficiency program measures are counted and valued over a

measure’s full EUL. In addition, energy savings are incorporated into the cost-effectiveness analysis for

technologies with a remaining useful life. In such situations, energy savings reflect a higher impact for

the remaining useful life, then slowly decrease to a level consistent with the current baseline EUL.

Spillover and Freeriders

Spillover arises from participants’ energy savings that result from program activities, but which are not

captured in program tracking of energy savings:

• Example: a customer, due to the program’s influence, buys two units of a qualifying piece of

efficient equipment, but obtains a rebate for only one unit.

• Example: a program participant obtains a rebate in one location, then replicates the program-

induced purchasing decision in another building but does not apply for a rebate for the

second purchase.

In both cases, the program influenced the customer to the extent that their short-term, program-

induced actions spilled into other efficient purchases or behaviors not rebated or tracked by a program.

This evaluation identified spillover savings, including them in the benefit/cost assessment, as short-term

actions taken between the participation period and the evaluation effort. As a result, the spillover

included only represents a fraction of the total spillover that may be achieved. Longer-term spillover

results from actions taken due to the program but spread over many years. This assessment does not

include such spillover, which is associated with programs changing the way markets operate.

Freeriders are program participants who would have installed the same energy-efficient equipment in

the program’s absence. All programs include freeriders—often early adopters of a technology, with

many differing motivations beyond the program incentive. Program designs, however, can use several

methods to minimize freeriders:

• First, incentive levels must be sufficiently high to entice those who would not have participated

due to financial concerns.

• Second, some measures may be eliminated if known to produce high freeridership rates.89

89 For example, programs often eliminate residential ENERGY STAR refrigerators, even though they pass the

benefit/cost analysis, as the market already experiences high adoption rates for these units, and other studies

have indicated high freerider rates.


To calculate cost-effectiveness ratios, the Evaluation Team included spillover and freeridership impacts

as a net freeridership percentage.
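The net adjustment can be sketched as follows. The rates shown are illustrative, and the NTG = 1 − freeridership + spillover form is a common convention assumed here; the report states only that the two effects were combined into a net percentage.

```python
# Sketch of folding freeridership and spillover into one net-to-gross
# (NTG) factor. The NTG = 1 - FR + SO form is an assumed convention;
# the rates below are illustrative, not evaluated values.

def net_savings(gross_kwh, freeridership, spillover):
    """Apply a combined net-to-gross factor to gross program savings."""
    ntg = 1.0 - freeridership + spillover
    return gross_kwh * ntg

# e.g. 100,000 gross kWh with 15% freeridership and 5% spillover
print(round(net_savings(100_000, freeridership=0.15, spillover=0.05), 1))
```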

Utility Inputs

Regarding utility information, DSMore requires utility rates, escalation rates, avoided costs, and discount

rates for the utility, society, and participants. For this report, NIPSCO supplied the values used for

avoided costs, escalation rates, discount rates, loss ratios, and electric and gas rates.

Avoided Costs

The recommended avoided cost framework develops each hour’s electricity valuation using a bottom-up

approach to quantify an hourly avoided cost as the sum of forward-looking, incremental cost elements

for that hour. The resulting hourly, avoided electricity costs vary by hour of day, day of week, and time

of year. Weather-dependent results require a normal weather outcome and a distribution of outcomes

corresponding to the weather-related variation in outcomes.

Electric avoided costs, by cost component, include the following:

• Generation Costs: Variable by hour. The annual forecast of avoided generation costs is allocated

according to an hourly price shape, obtained from historic participant-specific data reflecting a

workably competitive market environment and expected weather variation. NIPSCO provided

average annual prices.

• Capacity Costs: Associated with generation or capacity markets, these reflect the cost of

acquiring the additional capacity. NIPSCO provided these cost estimates.

• T&D Costs: Variable by hour. Reductions in non-coincident peak loads provide value in avoiding

T&D capacity costs. Non-peak hours produce zero avoided T&D capacity costs, reflecting that

T&D capacity investments serve peak hours. NIPSCO provided the cost estimates.

Gas avoided costs, by cost component, include the following:

• Commodity Costs: Variable by month. NIPSCO provided the forecast of gas commodity costs.

• Other Costs: These can include fixed costs (e.g., pipeline charges, reservation charges). NIPSCO

provided any such applicable costs.

Net Present Value

The Evaluation Team calculated an energy efficiency measure’s cost-effectiveness based on the net

present value of costs and benefits valued in each test, discounted over the EULs for the installed

measures. The discount rates used for the present value calculations are 6.53% for electric programs

and 6.24% for gas programs.
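The discounting can be sketched as follows, using the report's electric discount rate of 6.53%; the constant $1,000/year benefit stream and 10-year EUL are illustrative assumptions.

```python
# Sketch of the net present value calculation described above.
# The rate is the report's electric discount rate; the benefit
# stream and EUL are illustrative.

def npv(annual_value, eul_years, rate):
    """Present value of a constant annual benefit over a measure's EUL,
    discounting each year-end amount back to today."""
    return sum(annual_value / (1 + rate) ** t
               for t in range(1, eul_years + 1))

# e.g. $1,000/year of avoided costs over a 10-year EUL at 6.53%
print(round(npv(1000.0, 10, 0.0653), 2))
```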

Programs

The Evaluation Team evaluated cost-effectiveness for each program implemented within NIPSCO’s

service area, as follows.


Electric Programs

Residential:

• HVAC Rebate

• Residential Lighting

• Home Energy Analysis

• Appliance Recycling

• School Education

• Multi-Family Direct Install

• Behavioral

• Income-Qualified Weatherization

C&I:

• Prescriptive

• Custom

• New Construction

• Small Business Direct Install

• Retro-Commissioning

Gas Programs

Residential:

• HVAC Rebate

• Home Energy Analysis

• School Education

• Multi-Family Direct Install

• Behavioral

• Income-Qualified Weatherization

C&I:

• Prescriptive

• Custom

• New Construction

• Small Business Direct Install

• Retro-Commissioning

The Team collected information on costs and impacts associated with each program throughout the

EM&V process. The cost-effectiveness analysis included indirect costs for each customer class

segment by program.


Results

DSMore provides energy efficiency planners with insights regarding their energy efficiency programs’

actual cost-effectiveness. The following tables provide the results of the cost-effectiveness analyses for

each program, per the UCT, TRC, RIM, and PCT. Results are reported at the program level as well as at

the customer class and portfolio level. Table 290 summarizes cost-effectiveness results for the electric

residential programs.

Table 290. Electric Residential Program Cost-Effectiveness Results

Program                                  UCT     TRC     RIM     PCT*

Residential

HVAC Rebate 5.02 1.37 0.99 0.99

Lighting 4.34 1.81 0.56 2.73

Home Energy Analysis 2.40 2.40 0.72 N/A

Appliance Recycling 2.95 2.51 0.54 8.68

School Education 2.45 2.45 0.49 N/A

Multi-Family Direct Install 2.69 2.69 0.51 N/A

Behavioral 1.20 1.20 0.39 N/A

Income Qualified Weatherization 2.02 2.02 0.61 N/A

Total Residential 3.15 1.79 0.55 3.20

Note: For programs without incremental participant costs, the PCT score is N/A, as it cannot be calculated.

All electric residential programs passed the UCT and TRC cost-effectiveness tests. The Evaluation Team

based these results on an evaluation of actual

program costs and actual load impacts. The overall residential portfolio passed the UCT, TRC, and PCT

cost-effectiveness tests.

Table 291 summarizes the cost-effectiveness results for the electric C&I programs.

Table 291. Electric Commercial and Industrial Program Cost-Effectiveness Results

Program                                  UCT     TRC     RIM     PCT*

Commercial and Industrial

Prescriptive 8.66 1.28 0.47 1.88

Custom 5.21 3.97 0.43 9.01

New Construction 4.70 4.70 0.45 16.58

Small Business Direct Install 5.71 3.09 0.46 8.89

Retro-Commissioning 10.62 1.93 0.46 3.03

Total Commercial and Industrial 6.63 1.84 0.46 3.28

Note: For programs without incremental participant costs, the PCT score is N/A, as it cannot be calculated.

All electric C&I programs passed the UCT, TRC, and PCT tests. The total C&I portfolio also passed the

UCT, TRC, and PCT.


None of the programs, and neither portfolio, passed the RIM test.

Table 292 provides a complete summary of the cost-effectiveness results for all programs in both

customer segments and for the full electric program portfolio.

Table 292. Electric Program Cost-Effectiveness Results

Program                                  UCT     TRC     RIM     PCT*

Residential

HVAC Rebate 5.02 1.37 0.99 0.99

Lighting 4.34 1.81 0.56 2.73

Home Energy Analysis 2.40 2.40 0.72 N/A

Appliance Recycling 2.95 2.51 0.54 8.68

School Education 2.45 2.45 0.49 N/A

Multi-Family Direct Install 2.69 2.69 0.51 N/A

Behavioral 1.20 1.20 0.39 N/A

Income Qualified Weatherization 2.02 2.02 0.61 N/A

Total Residential 3.15 1.79 0.55 3.20

Commercial and Industrial

Prescriptive 8.66 1.28 0.47 1.88

Custom 5.21 3.97 0.43 9.01

New Construction 4.70 4.70 0.45 16.58

Small Business Direct Install 5.71 3.09 0.46 8.89

Retro-Commissioning 10.62 1.93 0.46 3.03

Total Commercial and Industrial 6.63 1.84 0.46 3.28

Total 2018 Electric Portfolio 5.02 1.82 0.48 3.26

Note: For programs without incremental participant costs, the PCT score is N/A, as it cannot be calculated.

The 2018 total portfolio of electric programs proved cost-effective for the UCT, TRC, and PCT.

Table 293 summarizes cost-effectiveness results for the gas residential programs. Though these results

primarily focused on the cost-effectiveness of gas impacts from the gas energy efficiency programs, the

test results included offsetting increases in gas usage due to reduced internal heat gains resulting from

the electric energy efficiency programs. For example, more efficient lighting gives off less waste heat,

which can increase gas heating usage.


Table 293. Gas Residential Program Cost-Effectiveness Results

Program                                  UCT     TRC     RIM     PCT*

Residential

HVAC Rebate 5.54 2.31 0.84 2.13

Home Energy Analysis 1.42 1.42 0.58 N/A

School Education 1.18 1.18 0.54 N/A

Multi-Family Direct Install 2.81 2.81 0.73 N/A

Behavior 3.70 3.70 0.76 N/A

Income Qualified Weatherization 1.75 1.75 0.63 N/A

Total Residential 3.59 2.15 0.77 2.64

Note: For programs without incremental participant costs, the PCT score is N/A, as it cannot be calculated.

In general, the gas programs were significantly less cost-effective this year due to the lower gas prices.

Two programs in particular, Home Energy Analysis and Income-Qualified Weatherization, were affected,

though all gas residential programs still passed the UCT and TRC cost-effectiveness tests, as Table 293

shows. The Evaluation Team based these results on

an evaluation of actual program costs and actual load impacts. The overall residential portfolio passed

the UCT, TRC, and PCT cost-effectiveness tests.

Table 294 summarizes the cost-effectiveness results for the gas C&I programs.

Table 294. Gas Commercial and Industrial Program Cost-Effectiveness Results

Program Cost-Effectiveness

UCT TRC RIM PCT*

Commercial and Industrial

Prescriptive 6.13 2.19 0.85 1.85

Custom 5.72 3.61 0.84 3.72

New Construction 4.23 3.19 0.80 4.24

Small Business Direct Install 6.55 2.80 0.86 3.12

Retro-Commissioning N/A N/A N/A N/A

Total Commercial and Industrial 5.55 2.90 0.84 3.06

Note: For programs without incremental participant costs, the PCT score is NA, as it cannot be calculated.

As with the residential gas programs, the cost-effectiveness scores are lower this year due to lower gas prices. Even so, every gas C&I program with gas savings passed the UCT, TRC, and PCT tests; Retro-Commissioning produced no gas impacts in 2018, so its scores could not be calculated. The total C&I portfolio passed the UCT, TRC, and PCT. As with the electric programs, the gas energy efficiency programs did not pass the RIM test.

Table 295 provides a complete summary of cost-effectiveness results for all programs in both customer

segments and for the full gas program portfolio.


Table 295. Gas Program Cost-Effectiveness Results

Program Cost-Effectiveness

UCT TRC RIM PCT*

Residential

HVAC Rebate 5.54 2.31 0.84 2.13

Home Energy Analysis 1.42 1.42 0.58 N/A

School Education 1.18 1.18 0.54 N/A

Multi-Family Direct Install 2.81 2.81 0.73 N/A

Behavior 3.70 3.70 0.76 N/A

Income Qualified Weatherization 1.75 1.75 0.63 N/A

Total Residential 3.59 2.15 0.77 2.64

Commercial and Industrial

Prescriptive 6.13 2.19 0.85 1.85

Custom 5.72 3.61 0.84 3.72

New Construction 4.23 3.19 0.80 4.24

Small Business Direct Install 6.55 2.80 0.86 3.12

Retro-Commissioning N/A N/A N/A N/A

Total Commercial and Industrial 5.55 2.90 0.84 3.06

Total 2018 Gas Portfolio 4.07 2.36 0.79 2.76

Note: For programs without incremental participant costs, the PCT score is NA, as it cannot be calculated.

The 2018 total portfolio of gas programs proved cost-effective for the UCT, TRC, and PCT.

Table 296 through Table 303 provide estimates of the present value benefits and costs as well as the net present value of program benefits for each of the four tests: UCT, TRC, RIM, and PCT. The tables provide values for each electric and gas program, customer segment, and the total portfolio of programs. The net present values represent the difference between the present value of benefits and the present value of costs, including indirect costs (as applicable).
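The net-benefit arithmetic in these tables is simply the difference between the two present values. A minimal check using electric UCT figures taken directly from Table 296:

```python
# Present value of net benefits = PV(benefits) - PV(costs).
# Figures are electric UCT values from Table 296 of this report.
uct_electric = {
    "HVAC Rebate": (1_717_518, 341_842),
    "Lighting": (15_669_290, 3_611_711),
    "Total 2018 Electric Portfolio": (86_806_539, 17_276_639),
}

for program, (benefits, costs) in uct_electric.items():
    net = benefits - costs
    print(f"{program}: ${net:,}")
# → HVAC Rebate: $1,375,676
# → Lighting: $12,057,579
# → Total 2018 Electric Portfolio: $69,529,900
```

Small discrepancies of a dollar or two between a table's net-benefit column and this subtraction reflect rounding of the underlying present values.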


Table 296. Net Present Value of Electric Program Benefits: UCT

Program Total Benefits Total Costs Present Value of Net Benefits

UCT

Residential

HVAC Rebate 1,717,518 341,842 1,375,676

Lighting 15,669,290 3,611,711 12,057,579

Home Energy Analysis 500,590 208,900 291,690

Appliance Recycling 1,705,207 578,609 1,126,598

School Education 1,206,456 492,427 714,029

Multi-Family Direct Install 1,006,726 373,882 632,844

Behavioral 2,136,923 1,782,342 354,581

Income Qualified Weatherization 1,168,913 579,632 589,280

Total Residential $25,111,622 $7,969,345 $17,142,277

Commercial and Industrial

Prescriptive 33,751,045 3,899,345 29,851,700

Custom 15,367,708 2,949,615 12,418,093

New Construction 7,303,159 1,555,177 5,747,982

Small Business Direct Install 5,019,822 879,322 4,140,500

Retro-Commissioning 253,182 23,834 229,348

Total Commercial and Industrial $61,694,916 $9,307,294 $52,387,623

Total 2018 Electric Portfolio $86,806,539 $17,276,639 $69,529,900

Table 297. Net Present Value of Electric Program Benefits: TRC

Program Total Benefits Total Costs Present Value of Net Benefits

TRC

Residential

HVAC Rebate 1,717,518 1,253,170 464,348

Lighting 15,669,290 8,634,583 7,034,707

Home Energy Analysis 500,590 208,900 291,690

Appliance Recycling 1,705,207 680,301 1,024,906

School Education 1,206,456 492,427 714,029

Multi-Family Direct Install 1,006,726 373,882 632,844

Behavioral 2,136,923 1,782,342 354,581

Income Qualified Weatherization 1,168,913 579,632 589,280

Total Residential $25,111,622 $14,005,237 $11,106,385


Commercial and Industrial

Prescriptive 33,751,045 26,404,408 7,346,637

Custom 15,367,708 3,867,727 11,499,981

New Construction 7,303,159 1,555,177 5,747,982

Small Business Direct Install 5,019,822 1,622,229 3,397,594

Retro-Commissioning 253,182 131,350 121,833

Total Commercial and Industrial $61,694,916 $33,580,891 $28,114,026

Total 2018 Electric Portfolio $86,806,539 $47,586,128 $39,220,411

Table 298. Net Present Value of Electric Program Benefits: RIM

Program Total Benefits Total Costs Present Value of Net Benefits

RIM

Residential

HVAC Rebate 1,717,518 1,728,788 -11,270

Lighting 15,669,290 28,152,518 -12,483,228

Home Energy Analysis 500,590 694,712 -194,121

Appliance Recycling 1,705,207 3,163,098 -1,457,890

School Education 1,206,456 2,465,430 -1,258,974

Multi-Family Direct Install 1,006,726 1,965,477 -958,752

Behavioral 2,136,923 5,435,229 -3,298,306

Income Qualified Weatherization 1,168,913 1,930,352 -761,439

Total Residential $25,111,622 $45,535,603 ($20,423,980)

Commercial and Industrial

Prescriptive 33,751,045 71,873,501 -38,122,456

Custom 15,367,708 35,341,171 -19,973,463

New Construction 7,303,159 16,287,740 -8,984,581

Small Business Direct Install 5,019,822 10,992,558 -5,972,736

Retro-Commissioning 253,182 553,224 -300,042

Total Commercial and Industrial $61,694,916 $135,048,193 ($73,353,277)

Total 2018 Electric Portfolio $86,806,539 $180,583,796 ($93,777,257)


Table 299. Net Present Value of Electric Program Benefits: PCT

Program Total Benefits Total Costs Present Value of Net Benefits

PCT

Residential

HVAC Rebate 1,613,375 1,622,426 -9,050

Lighting 33,349,769 12,199,398 21,150,371

Home Energy Analysis 386,696 0 386,696

Appliance Recycling 2,708,338 312,115 2,396,223

School Education 1,779,506 0 1,779,506

Multi-Family Direct Install 931,867 0 931,867

Behavioral 3,652,887 0 3,652,887

Income Qualified Weatherization 790,835 0 790,835

Total Residential $45,213,273 $14,133,939 $31,079,335

Commercial and Industrial

Prescriptive 52,584,301 28,010,418 24,573,884

Custom 30,333,418 3,367,979 26,965,439

New Construction 19,683,442 1,360,849 18,322,593

Small Business Direct Install 6,806,454 765,883 6,040,572

Retro-Commissioning 355,104 117,333 237,771

Total Commercial and Industrial $109,762,720 $33,622,461 $76,140,259

Total 2018 Electric Portfolio $154,975,993 $47,756,400 $107,219,593

Table 300. Net Present Value of Gas Program Benefits: UCT

Program Total Benefits Total Costs Present Value of Net Benefits

UCT

Residential

HVAC Rebate 12,368,048 2,233,956 10,134,092

Home Energy Analysis 417,737 293,558 124,179

School Education 629,296 532,932 96,363

Multi-Family Direct Install 435,117 154,883 280,235

Behavior 699,742 189,063 510,679

Income Qualified Weatherization 2,214,131 1,264,785 949,347

Total Residential $16,764,072 $4,669,177 $12,094,895

Commercial and Industrial

Prescriptive 2,199,428 358,855 1,840,573

Custom 3,212,121 561,814 2,650,306

New Construction 1,608,025 379,741 1,228,284

Small Business Direct Install 1,276,809 195,033 1,081,776

Retro-Commissioning 0 0 0

Total Commercial and Industrial $8,296,383 $1,495,443 $6,800,940

Total 2018 Gas Portfolio $25,060,454 $6,164,620 $18,895,834


Table 301. Net Present Value of Gas Program Benefits: TRC

Program Total Benefits Total Costs Present Value of Net Benefits

TRC

Residential

HVAC Rebate 12,368,048 5,344,084 7,023,965

Home Energy Analysis 417,737 293,558 124,179

School Education 629,296 532,932 96,363

Multi-Family Direct Install 435,117 154,883 280,235

Behavior 699,742 189,063 510,679

Income Qualified Weatherization 2,214,131 1,264,785 949,347

Total Residential $16,764,072 $7,779,304 $8,984,767

Commercial and Industrial

Prescriptive 2,199,428 1,005,643 1,193,785

Custom 3,212,121 890,243 2,321,878

New Construction 1,608,025 504,607 1,103,418

Small Business Direct Install 1,276,809 455,500 821,309

Retro-Commissioning 0 0 0

Total Commercial and Industrial $8,296,383 $2,855,993 $5,440,390

Total 2018 Gas Portfolio $25,060,454 $10,635,297 $14,425,157

Table 302. Net Present Value of Gas Program Benefits: RIM

Program Total Benefits Total Costs Present Value of Net Benefits

RIM

Residential

HVAC Rebate 12,368,048 14,769,346 -2,401,297

Home Energy Analysis 417,737 715,108 -297,371

School Education 629,296 1,172,695 -543,400

Multi-Family Direct Install 435,117 592,751 -157,634

Behavior 699,742 915,082 -215,341

Income Qualified Weatherization 2,214,131 3,492,916 -1,278,785

Total Residential $16,764,072 $21,657,899 ($4,893,827)

Commercial and Industrial

Prescriptive 2,199,428 2,576,557 -377,129

Custom 3,212,121 3,810,540 -598,420

New Construction 1,608,025 2,001,126 -393,101

Small Business Direct Install 1,276,809 1,482,451 -205,641

Retro-Commissioning 0 0 0

Total Commercial and Industrial $8,296,383 $9,870,674 ($1,574,291)

Total 2018 Gas Portfolio $25,060,454 $31,528,573 ($6,468,118)


Table 303. Net Present Value of Gas Program Benefits: PCT

Program Total Benefits Total Costs Present Value of Net Benefits

PCT

Residential

HVAC Rebate 13,911,435 6,528,654 7,382,781

Home Energy Analysis 268,000 0 268,000

School Education 816,178 0 816,178

Multi-Family Direct Install 245,432 0 245,432

Behavior 726,019 0 726,019

Income Qualified Weatherization 1,248,901 0 1,248,901

Total Residential $17,215,964 $6,528,654 $10,687,310

Commercial and Industrial

Prescriptive 1,841,274 995,854 845,420

Custom 3,176,499 853,068 2,323,432

New Construction 2,260,305 533,167 1,727,138

Small Business Direct Install 812,269 260,468 551,801

Retro-Commissioning 0 0 0

Total Commercial and Industrial $8,090,348 $2,642,557 $5,447,791

Total 2018 Gas Portfolio $25,306,312 $9,171,211 $16,135,101

Conclusions

This cost-effectiveness analysis indicates that NIPSCO's electric and gas energy efficiency portfolios are cost-effective, and both generate significant net benefits. On a present-value basis, the electric portfolio generates over $69 million in net benefits, and the gas portfolio contributes over $18 million more. These programs deliver substantial value for NIPSCO's customers.