
Post on 22-Feb-2016


MSE Performance Metrics and Tentative Results Summary

Joint Technical Committee
Northwest Fisheries Science Center, NOAA
Pacific Biological Station, DFO
School of Resource and Environmental Management, SFU

Outline

• Summarize the hake MSE
• Example simulations
• Performance metrics
• Summary figures

Objectives of the MSE

• Use the 2012 base case as the operating model.

• As defined in May 2012:
– Evaluate the performance of the harvest control rule.
– Evaluate the performance of annual, relative to biennial, survey frequency.
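Since the harvest control rule is the object under evaluation, a minimal sketch may help fix ideas. This assumes a generic 40:10-style ramp (full target rate above 40% depletion, fishery closed below 10%, linear in between); the thresholds and the target rate `f_target` are illustrative placeholders, not the actual hake rule.

```python
def harvest_rate(depletion, f_target=0.18, lower=0.10, upper=0.40):
    """Target harvest rate as a function of depletion (SB / SB0).

    Illustrative ramp rule: the full target rate applies above `upper`,
    the fishery closes below `lower`, and the rate declines linearly
    in between.
    """
    if depletion <= lower:
        return 0.0
    if depletion >= upper:
        return f_target
    return f_target * (depletion - lower) / (upper - lower)
```

The two thresholds correspond to the SB10% and SB40% reference points that appear in the performance statistics later in this deck.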

Organization of MSE Simulations

Operating Model
* Stock dynamics
* Fishery dynamics
* True population

Management Strategy
* Data choices
* Stock Assessment
* Harvest control rule

Catch and data pass between the operating model and the management strategy, closing the feedback loop.

Performance Statistics
* Conservation objectives
* Yield objectives
* Stability objectives

* Use the MPD (not posterior medians or other quantiles) when applying the harvest control rule
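The closed loop above (operating model, observed data, assessment, harvest control rule, catch fed back into the operating model) can be sketched in a few lines. Everything here is illustrative: logistic stock dynamics, a lognormal survey, and a simple ramp rule standing in for the assessment and harvest control rule; none of it is the hake operating model.

```python
import random

def control_rule(depletion, f=0.2, lo=0.10, hi=0.40):
    # Illustrative ramp: full rate above hi, zero below lo
    if depletion <= lo:
        return 0.0
    if depletion >= hi:
        return f
    return f * (depletion - lo) / (hi - lo)

def run_mse(years=20, b0=1.0, r=0.3, obs_cv=0.2, seed=1):
    """One closed-loop run: simulate, observe, set catch, feed back."""
    rng = random.Random(seed)
    b = 0.5 * b0  # start the stock at half of unfished biomass
    catches = []
    for _ in range(years):
        survey = b * rng.lognormvariate(0, obs_cv)   # noisy survey index
        catch = control_rule(survey / b0) * b        # rule sets the harvest
        b = max(b + r * b * (1 - b / b0) - catch, 1e-6)  # logistic dynamics
        catches.append(catch)
    return catches
```

The same loop run with `obs_cv=0` corresponds to the perfect information case discussed later in the deck.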

[Figure: spawning biomass (SSBt) trajectories by year, 1960-2030, vertical scale 0.0-3.0; existing (2012) assessment vs. MSE simulations]

Example MSE Run 1

But remember – starting points are not the same

Example MSE Run 2

Measuring Performance

• Choose metrics that capture the tradeoffs between conservation, variability in catch, and total yield for specific time periods.
• Define short, medium, and long time periods as Short = 2013-2015, Medium = 2016-2020, Long = 2021-2030.
• The main conservation metric is the proportion of years in which depletion is below 10%.
• The main variability-in-catch metric is the Average Annual Variability (AAV) in catch for a given time period.
• For yield we used the median average catch.
• We've chosen what we think are the top six; we'd like to discuss whether others are needed.
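The three headline metrics above can be written as small functions. The AAV definition used here (total absolute year-to-year change in catch divided by total catch over the period) is a common convention and an assumption on our part, as is taking the median over runs of each run's mean catch.

```python
from statistics import mean, median

def aav(catches):
    """Average Annual Variability: total absolute year-to-year change
    in catch divided by total catch over the period."""
    return sum(abs(b - a) for a, b in zip(catches, catches[1:])) / sum(catches)

def prop_below(depletions, threshold=0.10):
    """Proportion of years in which depletion is below `threshold`."""
    return sum(d < threshold for d in depletions) / len(depletions)

def median_average_catch(runs):
    """Median across simulation runs of each run's average catch."""
    return median(mean(run) for run in runs)
```

A perfectly stable catch stream gives `aav(...) == 0`, which is why the perfect information case in the table below shows the lowest AAV.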

Key Performance Statistics (Medium term, 2016-2020)

                                                  Perfect Information  Annual  Biennial
Median average depletion                                  28%            27%     28%
Proportion of years below SB10%                            1%             7%      6%
Proportion of years between SB10% and SB40%               70%            61%     58%
Proportion of years above SB40%                           29%            32%     36%
Median Average Annual Variability (AAV) in catch          23%            35%     36%
Median Average Catch                                      216            219     211

Other available options
• First quartile depletion
• Third quartile depletion
• Median final depletion
• Median of lowest depletion
• Median of lowest perceived depletion
• First quartile of lowest depletion
• Third quartile of lowest depletion
• First quartile of AAV in catch
• Third quartile of AAV in catch
• First quartile of average catch
• Third quartile of average catch
• Median of lowest catch levels
• First quartile of lowest catch levels
• Third quartile of lowest catch levels
• Proportion with any depletion below SB10%
• Proportion perceived to have any depletion below SB10%

Statistics Break - Medians vs Means
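A toy illustration of why the summary tables report medians rather than means: a few extreme simulation runs can drag the mean of a skewed distribution well away from the typical run, while the median stays representative. The numbers below are invented for illustration.

```python
from statistics import mean, median

# Simulated average catches from six runs; one extreme run (900)
# pulls the mean upward while the median stays near the typical runs.
catches = [180, 190, 200, 210, 220, 900]

mean_catch = mean(catches)      # pulled up by the 900 outlier
median_catch = median(catches)  # (200 + 210) / 2 = 205.0
```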

Average Annual Variability in Catch (illustration)

Perfect Information Case

• We created a reference, perfect information case in which we simulated data with no error.

• The purpose of the perfect information case was:
– to separate observation error from process error, i.e. so that variable data do not affect management procedure performance;
– to provide a standard against which the test cases (biennial and annual) could be compared.
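The distinction can be sketched with a single observation model: multiplicative lognormal error around the true biomass, which collapses to the perfect information case when the error CV is zero. The function and its parameters are illustrative, not the survey model actually used.

```python
import random

def survey_index(biomass, cv, seed=0):
    """Observed survey index: true biomass times multiplicative
    lognormal error. With cv=0 this is the perfect-information case."""
    rng = random.Random(seed)
    return [b * rng.lognormvariate(0, cv) for b in biomass]

true_b = [1.0, 0.8, 0.6]
perfect = survey_index(true_b, cv=0.0)  # identical to true_b
noisy = survey_index(true_b, cv=0.2)    # scattered around true_b
```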

Perfect information (cont'd)

Comparisons of Depletion, Catch and AAV for All Cases

Summary for long-term depletion

Summary for long term AAV

Summary for long-term catch

Discussion

• Does the group want alternative performance statistics considered?

• Progress and next steps
