
Page 1: MSE Performance Metrics and Tentative Results Summary

MSE Performance Metrics and Tentative Results Summary

Joint Technical Committee
Northwest Fisheries Science Center, NOAA
Pacific Biological Station, DFO
School of Resource and Environmental Management, SFU

Page 2: MSE Performance Metrics and Tentative Results Summary

Outline

• Summarize the hake MSE
• Example simulations
• Performance metrics
• Summary figures

Page 3: MSE Performance Metrics and Tentative Results Summary

Objectives of the MSE

• Use the 2012 base case as the operating model.
• As defined in May 2012:
– Evaluate the performance of the harvest control rule.
– Evaluate the performance of annual, relative to biennial, survey frequency.

Page 4: MSE Performance Metrics and Tentative Results Summary

Organization of MSE Simulations

Operating Model
• Stock dynamics
• Fishery dynamics
• True population

Management Strategy
• Data choices
• Stock assessment
• Harvest control rule

Performance Statistics
• Conservation objectives
• Yield objectives
• Stability objectives

Feedback loop: the operating model supplies data to the management strategy, and the catch set by the management strategy is fed back to the operating model (a minimal code sketch of this loop follows below).

* Use the MPD (not posterior medians or other quantiles) for applying the harvest control rule.
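To make the feedback loop concrete, here is a minimal closed-loop sketch in Python. It is illustrative only: the surplus-production operating model, the assessment stand-in, and the 40:10-style control rule (assumed here because SB10% and SB40% are the reference points used later in this deck) are placeholders, not the models actually used in the hake MSE.

import numpy as np

# --- Illustrative parameters (hypothetical; not the hake operating model) ---
B0 = 2.0          # unfished biomass (arbitrary units)
r = 0.3           # growth rate of a simple surplus-production model
F_TARGET = 0.2    # harvest rate applied above the upper reference point
N_YEARS = 20
rng = np.random.default_rng(1)

def harvest_control_rule(depletion, f_target=F_TARGET):
    """Assumed 40:10-style rule: no catch below 10% depletion, a linear ramp
    between 10% and 40% depletion, and the full target harvest rate above 40%."""
    if depletion <= 0.10:
        return 0.0
    if depletion >= 0.40:
        return f_target
    return f_target * (depletion - 0.10) / (0.40 - 0.10)

def operating_model_step(biomass, catch):
    """True population dynamics: Schaefer surplus production with process error."""
    surplus = r * biomass * (1.0 - biomass / B0)
    process_error = rng.lognormal(mean=0.0, sigma=0.1)
    return max((biomass + surplus - catch) * process_error, 1e-6)

def assessment(true_biomass):
    """Stand-in for the stock assessment: the 'MPD' here is just the true
    biomass perturbed by estimation error."""
    return true_biomass * rng.lognormal(mean=0.0, sigma=0.2)

# Closed loop: operating model -> data/assessment -> control rule -> catch -> operating model
biomass = 0.6 * B0
for year in range(N_YEARS):
    est_biomass = assessment(biomass)               # management sees an estimate, not the truth
    harvest_rate = harvest_control_rule(est_biomass / B0)
    catch = harvest_rate * est_biomass              # quota set from the perceived stock
    biomass = operating_model_step(biomass, catch)  # true dynamics respond to the realized catch
    print(f"year {year:2d}  true depletion {biomass / B0:5.2f}  catch {catch:5.2f}")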

Page 5: MSE Performance Metrics and Tentative Results Summary

[Figure: spawning biomass (SSBt) by year, 1960-2030, comparing the existing (2012) assessment trajectory with the MSE simulation trajectories.]

Page 6: MSE Performance Metrics and Tentative Results Summary

Example MSE Run 1

Page 7: MSE Performance Metrics and Tentative Results Summary

But remember: the starting points are not the same.

Page 8: MSE Performance Metrics and Tentative Results Summary

Example MSE Run 2

Page 9: MSE Performance Metrics and Tentative Results Summary

Measuring Performance

• Choose metrics that capture the tradeoffs between conservation, variability in catch, and total yield for specific time periods.
• Define short, medium, and long time periods as Short = 2013-2015, Medium = 2016-2020, Long = 2021-2030.
• The main conservation metric is the proportion of years in which depletion is below 10%.
• The main variability-in-catch metric is the Average Annual Variability (AAV) in catch for a given time period.
• For yield we used the median average catch.
• We have chosen what we think are the top six metrics; we would like to discuss whether others are needed. (A sketch of how the three main metrics are computed follows below.)
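A minimal sketch, in Python, of how these three headline metrics might be computed from simulated trajectories. The 10% depletion threshold and the median summaries follow the definitions above; the replicate count, the example arrays, and the AAV denominator convention (years 2..n) are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(2)

# Hypothetical MSE output: depletion and catch by replicate and year for one
# time period (e.g. Medium = 2016-2020, so 5 years here).
n_reps, n_years = 1000, 5
depletion = rng.beta(4, 8, size=(n_reps, n_years))            # stand-in depletion (SB / SB0)
catch = rng.normal(220, 40, size=(n_reps, n_years)).clip(0)   # stand-in catch series

# Conservation: proportion of replicate-years with depletion below 10%.
p_below_10 = np.mean(depletion < 0.10)

# Variability: AAV per replicate = sum(|C_y - C_{y-1}|) / sum(C_y over years 2..n),
# then the median across replicates.
abs_changes = np.abs(np.diff(catch, axis=1)).sum(axis=1)
aav = abs_changes / catch[:, 1:].sum(axis=1)
median_aav = np.median(aav)

# Yield: average catch per replicate, then the median across replicates.
median_avg_catch = np.median(catch.mean(axis=1))

print(f"P(depletion < 10%):   {p_below_10:.1%}")
print(f"Median AAV in catch:  {median_aav:.1%}")
print(f"Median average catch: {median_avg_catch:.0f}")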

Page 10: MSE Performance Metrics and Tentative Results Summary

Key Performance Statistics (Medium term, 2016-2020)

                                                    Perfect Info   Annual   Biennial
Median average depletion                                28%          27%       28%
Proportion of years below SB10%                          1%           7%        6%
Proportion of years between SB10% and SB40%             70%          61%       58%
Proportion of years above SB40%                         29%          32%       36%
Median Average Annual Variability (AAV) in catch        23%          35%       36%
Median average catch                                    216          219       211

Page 11: MSE Performance Metrics and Tentative Results Summary

Other available options

• First quartile depletion
• Third quartile depletion
• Median final depletion
• Median of lowest depletion
• Median of lowest perceived depletion
• First quartile of lowest depletion
• Third quartile of lowest depletion
• First quartile of AAV in catch
• Third quartile of AAV in catch
• First quartile of average catch
• Third quartile of average catch
• Median of lowest catch levels
• First quartile of lowest catch levels
• Third quartile of lowest catch levels
• Proportion with any depletion below SB10%
• Proportion perceived to have any depletion below SB10%

Page 12: MSE Performance Metrics and Tentative Results Summary

Statistics Break - Medians vs Means
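The slide itself is an illustration; as a stand-in, a hypothetical example of why the summaries above report medians rather than means: for average catches of 180, 200, 210, 220, and 900 across five replicates, the mean is 342 but the median is 210, so a single extreme replicate dominates the mean while barely moving the median.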

Page 13: MSE Performance Metrics and Tentative Results Summary

Average Annual Variability in Catch (illustration)
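The illustration is a figure; for reference, one common definition of AAV over a period of years y = 1..n (the exact denominator convention used in the hake MSE is not stated on this slide) is

AAV = sum over y = 2..n of |C_y - C_{y-1}|, divided by the sum over y = 2..n of C_y.

For example, with hypothetical catches of 200, 240, 210, and 230, the absolute year-to-year changes are 40, 30, and 20, so AAV = 90 / (240 + 210 + 230) ≈ 13%.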

Page 14: MSE Performance Metrics and Tentative Results Summary

Perfect Information Case

• We created a reference, perfect-information case in which we simulated data with no error (sketched below).
• The purpose of the perfect-information case was:
– to separate observation error from process error (i.e. in this case variable data do not affect management procedure performance);
– to provide a standard against which the test cases (annual and biennial surveys) could be compared.
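A minimal sketch of what "data with no error" means here, assuming a multiplicative lognormal survey observation error for the annual/biennial cases; the actual error model used in the hake MSE is not described on this slide.

import numpy as np

rng = np.random.default_rng(3)

def survey_index(true_biomass, q=1.0, cv=0.3, perfect=False):
    """Generate one survey observation from the operating model's true biomass.

    perfect=True  -> the index equals q * biomass exactly (perfect-information case).
    perfect=False -> multiplicative lognormal observation error with the given CV
                     (stand-in for the annual/biennial survey cases).
    """
    expected = q * true_biomass
    if perfect:
        return expected
    sigma = np.sqrt(np.log(1.0 + cv**2))                 # lognormal sigma implied by the CV
    return expected * rng.lognormal(mean=-0.5 * sigma**2, sigma=sigma)  # mean-unbiased noise

true_biomass = 1.5
print("perfect:", survey_index(true_biomass, perfect=True))
print("noisy:  ", survey_index(true_biomass))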

Page 15: MSE Performance Metrics and Tentative Results Summary

Perfect information (cont'd)

Page 16: MSE Performance Metrics and Tentative Results Summary
Page 17: MSE Performance Metrics and Tentative Results Summary
Page 18: MSE Performance Metrics and Tentative Results Summary

Comparisons of Depletion, Catch and AAV for All Cases

Page 19: MSE Performance Metrics and Tentative Results Summary
Page 20: MSE Performance Metrics and Tentative Results Summary
Page 21: MSE Performance Metrics and Tentative Results Summary

Summary for long-term depletion

Page 22: MSE Performance Metrics and Tentative Results Summary

Summary for long term AAV

Page 23: MSE Performance Metrics and Tentative Results Summary

Summary for long-term catch

Page 24: MSE Performance Metrics and Tentative Results Summary

Discussion

• Does the group want alternative performance statistics considered?

• Progress and next steps