\documentclass[reqno]{amsart}
\usepackage{graphicx}
\usepackage[usenames]{color} \color{black}
\usepackage{hyperref}
\AtBeginDocument{{\noindent\small
2004 Conference on Diff. Eqns. and Appl. in Math. Biology,
Nanaimo, BC, Canada.\newline
{\em Electronic Journal of Differential Equations},
Conference 12, 2005, pp. 143--158.\newline
ISSN: 1072-6691. URL: http://ejde.math.txstate.edu or
http://ejde.math.unt.edu
\newline ftp ejde.math.txstate.edu (login: ftp)}
\thanks{\copyright 2005 Texas State University - San Marcos.}
\vspace{9mm}}
\setcounter{page}{143}
%**************************
\newcounter{saveEq}
\def\putEq{\setcounter{saveEq}{\value{equation}}}
\def\getEq{\setcounter{equation}{\value{saveEq}}}
\def\tableEq{ % equations in tables
\putEq \setcounter{equation}{0}
\renewcommand{\theequation}{T\arabic{table}.\arabic{equation}}}
\def\normalEq{ % renew normal equations
\getEq
\renewcommand{\theequation}{\arabic{section}.\arabic{equation}}}
\definecolor{OliveGreen}{rgb}{0.10,0.65,0.10}
% ***************************
\begin{document}
%\color{black}
\title[\hfilneg EJDE/Conf/12 \hfil Mathematics and fisheries]
{Mathematics and fisheries: Match or mismatch?}
\author[J. T. Schnute \hfil EJDE/Conf/12 \hfilneg]
{Jon T. Schnute}
\address{Jon T. Schnute \hfill\break
Fisheries and Oceans Canada \\
Pacific Biological Station \\
3190 Hammond Bay Road \\
Nanaimo, B.C. V9T 6N7, Canada}
\email{schnutej@pac.dfo-mpo.gc.ca}
\date{}
\thanks{Published April 20, 2005.}
\subjclass[2000]{62P10, 92B05, 92B15}
\keywords{Fishery models; state space models; statistical decisions}
\begin{abstract}
Mathematics plays a major role in contemporary
fisheries management. Stock assessments often depend on elaborate
models used to set catch levels and address other policy objectives. In
recent years, the collapse of various important fish stocks has caused
some critics to suggest that mathematical models actually obscure the
truth by narrowing scientific understanding to the realm of
quantifiable events. In the words of one fisherman, {\color{blue}
``Mathematics has highjacked the definition and position of real
science.''} In this paper, I present a number of typical fishery
models, examine their limitations, discuss controversies about their
use, and explore possible alternatives. I draw on examples from
economics and investment theory to illustrate the problem of making
credible predictions about an uncertain future. The constraints of the
real world, where people care deeply about policy consequences, have
altered my scientific perspective as an applied mathematician. This
paper reflects the evolution of thought that has accompanied my
experience working for 28 years at the Pacific Biological Station in
Nanaimo, B.C., the host city for this conference.
\end{abstract}
\maketitle
\numberwithin{equation}{section}
\normalEq
\section{Introduction}
Mathematics offers a wonderful tool for speculation about
biological principles that govern animal populations. Most sensible
rules for birth, growth, movement, and death can be stated
mathematically. Ideally, theoreticians can use these assumptions to
derive theorems that characterize model behavior. If theory seems
intractable, computer simulations allow scientists to explore a model's
properties. Unfortunately, even the most brilliant analysis leaves a
nagging question unresolved. Does this set of rules adequately describe
how the world actually works?

Fishery science often depends heavily on mathematical methods for
inferring the state of fish stocks from limited available data.
Fishermen care about fish, not theorems, and they can be highly critical
of models that fail to capture the biological world as they see it. An
optimistic theory that predicts a high birth rate becomes meaningless
if, in fact, these births do not take place. Similarly, a pessimistic
forecast that proves wrong can also bring mathematical models into
disrepute. Speaking in the context of a collapsed fishery, the fisherman
James O'Malley \cite{Om} concluded that {\color{blue} ``in our attempt
to comprehend the oceans $\dots$ mathematics has} {\color{blue} been
elevated to a status which suppresses knowledge and actually detracts
from our efforts to acquire knowledge.''}

In this paper, I examine the role of mathematics as a tool for
making decisions about the real world. Case studies from economics and
fishery management suggest reasons to apply mathematics with caution.
Statistics plays a key role in financial and fishery models, where
observed data never conform to a deterministic model. In practice, the
interpretation of a data set can vary greatly with the choice of
assumptions about statistical error. A detailed analysis of one fish
stock illustrates the process commonly used to estimate the unknown
total biomass from available data. I conclude by discussing a recent
shift in fishery research from estimation methods to robust decision
algorithms that perform well across a broad spectrum of possible models.
\section{Investments and fisheries}
In 1973, a new mathematical formula (Black and Scholes \cite{BS};
Merton \cite{Mer}) revolutionized the world of finance by providing a
rational system for setting prices in the options market. Essentially,
an option contract gives its owner the right, without obligation, to buy
or sell an asset at a specified future time and price. A \emph{call}
option gives the right to buy, whereas a \emph{put} option gives the
right to sell. In practice, traders want to know how to assign value to
such contracts. For example, if a stock has current price $S$, what is a
fair price $C$ for a call option to buy the stock at the future time $T$
and specified price $K$ (the so-called \emph{strike} price)? If the
prevailing interest rate is $r$ and the stock price moves log-normally
in continuous time with standard deviation $\sigma$ (a Wiener process),
then the famous Black-Scholes formula gives the call price
%
\begin{equation}
C(T,S,K,r,\sigma) = S\, \Phi\Big( x + \frac{\sigma \sqrt{T}}{2} \Big)
- K\, e^{-rT} \Phi\Big( x - \frac{\sigma\sqrt{T}}{2} \Big)
\,, \label{bsf} \end{equation}
%
where \[ x = \frac{ rT + \log\frac{S}{K} }{\sigma\sqrt{T}} \]
%
and $\Phi(\cdot)$ denotes the cumulative normal distribution function
%
\[ \Phi(y) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^y e^{-t^2/2}\, dt. \]
%
The Internet (e.g., \cite{BSwik}) provides much more information and
various mathematical proofs.
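Readers who want to experiment with \eqref{bsf} can evaluate it directly. The following Python sketch implements the formula as written above; the parameter values at the end are purely illustrative, not drawn from any market data.

```python
from math import log, sqrt, exp, erf

def norm_cdf(y):
    # Cumulative normal distribution Phi(y), expressed via the error function
    return 0.5 * (1.0 + erf(y / sqrt(2.0)))

def black_scholes_call(T, S, K, r, sigma):
    # Black-Scholes call price:
    #   C = S*Phi(x + sigma*sqrt(T)/2) - K*exp(-r*T)*Phi(x - sigma*sqrt(T)/2)
    # with x = (r*T + log(S/K)) / (sigma*sqrt(T)), as in the text.
    x = (r * T + log(S / K)) / (sigma * sqrt(T))
    half = sigma * sqrt(T) / 2.0
    return S * norm_cdf(x + half) - K * exp(-r * T) * norm_cdf(x - half)

# Illustrative values: an at-the-money call, one year to expiry,
# 5% interest rate, 20% annual volatility (about $10.45)
C = black_scholes_call(T=1.0, S=100.0, K=100.0, r=0.05, sigma=0.2)
```

Note how $x + \sigma\sqrt{T}/2$ reproduces the quantity usually written $d_1$ in textbook statements of the formula.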
The formula \eqref{bsf} solves an important problem in investment
theory and has the practical advantage that it can be implemented easily
on the trading floor. Mathematically, it follows from a continuous-time
stochastic model. In 1997, after Black had died, Scholes and Merton won
the Nobel prize in economics ``for a new method to determine the value
of derivatives'', including this result. Not surprisingly, investment
strategists looked for opportunities posed by such an elegant theory.
Lowenstein \cite{Low} gives an engaging account of the most famous
example, Long-Term Capital Management (LTCM). The name ``hedge fund''
derives from the expression ``to hedge one's bets'', i.e., to bound risk
like a common hedge bounds a garden \cite[p.~25]{Low}. A hedge fund
typically bets on the \emph{spread} between current and future prices,
where the future option usually sells at a discount and the spread
shrinks over time. In this scenario, an investor is protected from the
risk of market fluctuations because the two prices generally move in
tandem while the spread between them declines.

In February 1994, armed with a powerful mathematical theory and
supported by the now-famous economists Merton and Scholes, the
experienced investment manager John Meriwether started LTCM with
assets of \$1.25 billion contributed by wealthy investors
\cite[p.~39]{Low}. For about four years following its inception, the
investment grew at an astonishing rate that kept it above a steady
exponential growth curve of 40\% annually (Figure \ref{LTCM}). Managers
achieved this growth using highly leveraged betting on assets purchased
with loans from major banks. The fund's capital value, \$4.7 billion at
the start of 1998, had to support borrowed assets of about \$100 billion
\cite[p.~xix--xx]{Low}. Furthermore, in Lowenstein's words,
{\color{blue} LTCM ``had entered into thousands of derivative contracts,
which had endlessly intertwined it with every bank on Wall Street. These
contracts, essentially side bets on market prices, covered an
astronomical sum -- more than \$1 trillion worth of exposure.''}
% LTCM Figure *********************************************************
\begin{figure}[t]
\includegraphics[scale=0.5]{fig1} % ltcm.eps
\caption{
Gross value of \$1 invested in Long-Term Capital Management, March 1994
to October 1998 (green line; data from a graph by Lowenstein
\cite{Low}). The blue curve represents an exponential growth rate that
compounds to 40\% per year (proportional to $1.4^t$ with time $t$ in
years).}
\label{LTCM} \end{figure}

According to the model that underlies \eqref{bsf}, asset values
should move lognormally with a standard deviation $\sigma\sqrt{\delta
t}$ during a small time period $\delta t$. Theoretically, a huge change
in a short time should be nearly impossible, but extreme market
conditions sometimes cause the model to fail. On August 17, 1998, Russia
devalued the ruble and declared a moratorium on its Treasury debt of
about \$13.5 billion \cite{ER}. That event and other market crises
caused LTCM's equity to drop much more rapidly than the most extreme
model predictions. On September 23, 1998, it appeared that the fund
might not survive another day \cite[p.~xx]{Low}. Faced with the prospect
that a default by LTCM might trigger market panic and collapse, the
Federal Reserve Bank of New York organized a bail-out loan of \$3.6
billion. A consortium of 14 major firms now had the power to oversee all
fund transactions. The initial investors, including Meriwether,
Merton, and Scholes, lost millions of dollars. LTCM repaid its loans by
December, 1999, and quietly closed down a few weeks later \cite{TDB}.
% Two-Fisheries Figure ************************************************
\begin{figure}[t]
\includegraphics[scale=0.6]{fig2} %twofish.eps
\caption{
Annual catch ($\mbox{kt} = 10^6\ \mbox{kg}$) from two Canadian
fisheries. {\bf A}. British Columbia herring (\emph{Clupea pallasi}),
where red lines highlight two periods: (1930--1966) historical expansion
along the Pacific coast and (1978--present) current stable fishery with
an annual catch near 30 kt. Data from Jake Schweigert, Pacific
Biological Station, Nanaimo, B.C. {\bf B}. Newfoundland northern cod
(\emph{Gadus morhua}) in area 2J/3KL, where red `x' symbols show annual
quotas imposed since 1973. Data from \cite[Table 1, p.~63--64]{Cod}.}
\label{TwoFish} \end{figure}

This cautionary tale gives applied mathematicians ample opportunity
to speculate about the application of theory to real world problems.
Lowenstein \cite{Low} paints a vivid picture of the players involved,
and two related TV programs (\cite{MF}, \cite{TDB}) give us an
opportunity to hear their views. For example, Myron Scholes speculated
\cite{TDB} that the difficulties with LTCM didn't come only from the
models. {\color{blue} ``It could be inputs to the models, it could be
the models themselves, it could be a combination of many things. And so
just saying models were flawed is not necessarily the right answer.''}
Paul Samuelson, winner of the 1970 Economics Nobel Prize for his
analytical work in the field, observed \cite{TDB} that {\color{blue}
``There is a tempting and fatal fascination in mathematics. Albert
Einstein warned against it. He said elegance is for tailors, don't
believe in something because it's a beautiful formula. There will always
be room for judgment.''} U.S. Federal Reserve Chairman Alan Greenspan
asked \cite{TDB} {\color{blue} ``How much dependence should be placed on
financial modeling, which for all its sophistication can get too far
ahead of human judgment?''} Norbert Wiener, who invented the
continuous-time stochastic process that bears his name and underlies the
Black-Scholes formula, regarded skepticism as a professional obligation
\cite{W}: {\color{blue} ``One of the chief duties of a mathematician in
acting as an advisor to scientists is to discourage them from expecting
too much of mathematicians.''}

The shared word \emph{stock} curiously links financial and fishery
management. In the investment world, a stock has a current price $S$
known from the marketplace itself. By contrast, a fish stock has a
current biomass $B$ that generally is not known and may be impossible to
measure directly. Consequently, biological models to predict future
values $B$ have even greater uncertainty than financial models for
predicting $S$. Like financial managers who must decide how much stock
to buy, sell, or hold, fishery managers must decide how much stock
biomass to allow as catch. The same urge to use quantitative methods
prevails in both worlds. Unfortunately, like financial models, fishery
management models can fail to represent the world adequately.

Figure \ref{TwoFish} tells the tale of two Canadian fisheries, one
successfully managed for recovery and the other not. In the first case
(Figure~\ref{TwoFish}A), the British Columbia herring fishery
experienced a historical period of expansion, followed by severe
regulations to permit stock recovery. For the last two decades, under
restrictive management, the fishery has sustained a steady catch at a
fraction of historical levels. Because higher annual catches might only
drive down market prices, the industry finds economic reasons to agree
with current regulations. Furthermore, herring provide food for other
commercial fish species, and stakeholders generally recognize the
importance of a robust herring population for other Pacific fisheries.

In the second case (Figure \ref{TwoFish}B), the Newfoundland cod
fishery also experienced a historical period of expansion, followed by
quotas to limit the catch. However, seemingly moderate catch levels
during the 1980s still did not allow the population to recover, and the
fishery was closed in 1992. Evidence suggests that the population still
hasn't recovered enough to allow more than very low levels of catch.
Lilly et al.\ \cite{Cod} discuss possible reasons (still controversial)
for this fishery collapse. Why couldn't historical catch levels be
maintained? Why hasn't the stock recovered like Pacific herring?
According to one scenario, fishery management did not respond quickly
enough to curtail a fishery on a population becoming stressed by
changing ocean conditions in the North Atlantic.

Some people saw the problem coming and advised caution, just as
some econo\-mists had expressed concerns for the levels of risk
prevailing at LTCM near the end of 1997. But fishery and financial
management always entail risk, and it's tempting to suppose that things
won't change too dramatically in a short time. Rapid climate change,
like a Russian default on bonds, may not come up on the radar screen
until it's too late. Mathematical models, conditioned to past
performance, tend to handle regime shifts poorly. Myron Scholes
expressed frustration with this problem at LTCM \cite{TDB}:
{\color{blue} ``In August of 1998, after the Russian default, $\dots$
all the relations that tended to exist in a recent past seemed to
disappear.''}
% Model Figure ********************************************************
\begin{figure}[t]
\includegraphics[scale=0.48]{fig3} % model.eps
\caption{
A stochastic, dynamic fishery model. Colors indicate the role of model
components: {\color{blue} theory} (blue), {\color{OliveGreen} observed
data} (green), and {\color{red} unknown quantities} (red).}
\label{ModFig} \color{black}\end{figure}
\section{Fishery models}
Like economists seeking formulas for option prices, fishery
scientists try to find sensible rules for setting catch quotas. Figure~\ref{ModFig}
shows the logical structure of a typical fishery model. An {\color{red}
unknown biomass} follows a {\color{blue} dynamic process} governed by
{\color{red} unknown parameters} and the removal of a
{\color{OliveGreen} known catch}. Natural variability enters the
dynamics as {\color{blue} stochastic process error}. {\color{OliveGreen}
Other known data} come from biological observations, such as surveys and
samples of the catch. An {\color{blue} observation model} connects these
data to the biomass, subject to {\color{blue} measurement error}, which
introduces another source of statistical noise. Practical applications
depend on a key idea related to the color codes in Figure~\ref{ModFig}: {\bf Use
the {\color{blue} blue theory} to estimate the {\color{red} unknown red
quantities} from the {\color{OliveGreen} known green observations}.}

The simple deterministic model in Table~\ref{ModTab} illustrates
how this procedure might actually work. Imagine a fixed stock of fish
altered only by the removal of catch $\color{OliveGreen} C_t$ at various
times $t$. Let $\color{red} B_t$ denote the biomass available prior to
the catch $\color{OliveGreen} C_t$, and suppose that a survey method
exists to measure a biomass index $\color{OliveGreen} I_t$ proportional
to $\color{red} B_t$ through an unknown coefficient $\color{red} q$.
(For example, the survey might use acoustic sensors to produce a signal
proportional to biomass.) The index data $\color{OliveGreen} I_t$ at
times $t=1,2$ correspond to the two observation equations
\eqref{I1}--\eqref{I2}, and the transition of biomass $\color{red} B_t$
from $t=1$ to $t=2$ gives the dynamic equation \eqref{B12}. In this
example, only a few mathematical symbols define the dynamic
($\color{blue} =$,$\color{blue} -$) and observation ($\color{blue}
=$,$\color{blue} \times$) models. Three equations relate three unknowns
($\color{red} B_1$,$\color{red} B_2$,$\color{red} q$) to three
observations ($\color{OliveGreen} C_1$,$\color{OliveGreen}
I_1$,$\color{OliveGreen} I_2$). Simple algebra gives the estimates
\eqref{q}--\eqref{B2} of {\color{red} unknown reds} in terms of
{\color{OliveGreen} known greens}.
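The estimates \eqref{q}--\eqref{B2} are easy to check numerically. The following Python sketch solves the three equations of Table~\ref{ModTab} for invented survey values (all numbers are hypothetical, chosen only to make the arithmetic transparent).

```python
def depletion_estimates(C1, I1, I2):
    """Solve the two-survey depletion model:
        B2 = B1 - C1,  I1 = q*B1,  I2 = q*B2.
    Subtracting the observation equations gives I1 - I2 = q*C1,
    from which q, B1, and B2 follow by simple algebra."""
    q = (I1 - I2) / C1
    B1 = I1 * C1 / (I1 - I2)
    B2 = I2 * C1 / (I1 - I2)
    return q, B1, B2

# Hypothetical data: a catch of 100 units drops the index from 50 to 40,
# implying q = 0.1, B1 = 500, B2 = 400
q, B1, B2 = depletion_estimates(C1=100.0, I1=50.0, I2=40.0)
```

The returned values satisfy the dynamic equation $B_2 = B_1 - C_1$ exactly, as they must for a deterministic model.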
% Model Table *********************************************************
\begin{table}[t] \caption{
A simple fishery depletion model with deterministic equations for
dynamics and measurement. Three data values (catch $\color{OliveGreen}
C_1$ and biomass indices $\color{OliveGreen} I_1$, $\color{OliveGreen}
I_2$) determine three unknowns (parameter $\color{red} q$ and biomass
levels $\color{red} B_1$, $\color{red} B_2$).} \tableEq
%
\begin{align}
\mbox{Dynamics:}\quad
& \color{red} B_2 \color{blue} = \color{red} B_1 \color{blue}
- \color{OliveGreen} C_1 \label{B12} \\[1ex]
\mbox{Observations:}\quad
& \color{OliveGreen} I_1 \color{blue} =
\color{red} q {\color{blue} \times} B_1 \label{I1} \\
& \color{OliveGreen} I_2 \color{blue} =
\color{red} q {\color{blue} \times} B_2 \label{I2} \\[1ex]
\mbox{Estimates:}\quad
& \color{red} q \color{blue} =
\color{OliveGreen} \frac{I_1 - I_2}{C_1} \label{q} \\
& \color{red} B_1 \color{blue} =
\color{OliveGreen} \frac{I_1}{I_1-I_2} \, C_1 \label{B1} \\
& \color{red} B_2 \color{blue} =
\color{OliveGreen} \frac{I_2}{I_1-I_2} \, C_1 \label{B2}
\end{align}
\label{ModTab} \normalEq \end{table}
% End Model Table *****************************************************
\color{black}

The depletion model in Table \ref{ModTab} can readily be extended
to include multiple time steps $t=1,\dots,n$, and the equations then
imply the general result
%
\begin{equation}
{\color{OliveGreen} I_t} = {\color{red} qB_1}
- {\color{red} q} \sum_{i=1}^{t-1} {\color{OliveGreen} C_i}
\label{It} \end{equation}
%
for each observation $\color{OliveGreen} I_t$, where by definition
$\sum_{i=1}^0 {\color{OliveGreen} C_i} = 0$ when $t=1$. In this
formulation, the model has only two unknowns ($\color{red}
q$,$\color{red} B_1$), regardless of the number $n$ of observed time
steps. An estimate of $\color{red} B_1$ gives biomass estimates
%
\begin{equation}
{\color{red} B_t} = {\color{red} B_1}
- \sum_{i=1}^{t-1} {\color{OliveGreen} C_i}
\label{Bt} \end{equation}
%
for all future times $t>1$. The overdetermined system \eqref{It} of $n$
equations with two unknowns cries out for statistical analysis, such as
linear regression of $\color{OliveGreen} I_t$ on the cumulative catch
$\sum_{i=1}^{t-1} {\color{OliveGreen} C_i}$. But exactly how should we
introduce error into the model that underlies \eqref{It}? If we think
that immigration and emigration randomly alter the biomass $\color{red}
B_t$, we could introduce process error into the dynamics ${\color{red}
B_{t+1}} = {\color{red} B_t} - {\color{OliveGreen} C_t}$. If we think
that our measurements $\color{OliveGreen} I_t$ aren't exactly
proportional to $\color{red} B_t$, we could (and almost certainly
should) introduce measurement error into the observation equation
${\color{OliveGreen} I_t} = {\color{red} q B_t}$. Such choices
correspond to the decision by Black, Scholes, and Merton to use a Wiener
process for modeling stock price fluctuations. In some cases, different
choices can dramatically alter the analysis.
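As one concrete version of this analysis, the Python sketch below fits \eqref{It} by ordinary least squares, which corresponds to the choice of putting all error in the measurements $\color{OliveGreen} I_t$. The data are noise-free values invented for the example, so the fit recovers the generating parameters exactly.

```python
def fit_depletion(I, C):
    """Least-squares fit of I_t = q*B1 - q*K_t, where
    K_t = C_1 + ... + C_{t-1} is the cumulative catch before time t.
    Regressing I on K gives slope -q and intercept q*B1."""
    n = len(I)
    K = [sum(C[:t]) for t in range(n)]          # K_1 = 0, K_2 = C_1, ...
    Kbar = sum(K) / n
    Ibar = sum(I) / n
    slope = sum((k - Kbar) * (i - Ibar) for k, i in zip(K, I)) \
            / sum((k - Kbar) ** 2 for k in K)
    intercept = Ibar - slope * Kbar
    q = -slope
    B1 = intercept / q
    return q, B1

# Hypothetical noise-free data generated with q = 0.1, B1 = 500,
# and a constant catch of 50 units per time step
C = [50.0, 50.0, 50.0, 50.0]
I = [50.0, 45.0, 40.0, 35.0]    # I_t = 0.1 * (500 - cumulative catch)
q, B1 = fit_depletion(I, C)
```

Putting the error in the dynamics instead would lead to a different estimator, which is precisely the point made below.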
% Rectangle Figure ****************************************************
\begin{figure}[t]
\includegraphics[scale=0.4]{fig4} % rect.eps
\caption{
Schnute's \cite[Figure~1]{Sch87} demonstration of the importance of
noise in stochastic models. Four observed data points
({\color{OliveGreen} green circles}) lie at the corners of a rectangle.
If error occurs in the $y$ coordinate, the data determine a horizontal
regression line ({\color{red} solid red}) through two observed mean
values $y=2$ ({\color{blue} blue circles}). If error occurs in the $x$
coordinate, the data determine a vertical regression line ({\color{red}
broken red}) through two observed mean values $x=3$ ({\color{blue} blue
diamonds}). }
\label{rect} \end{figure}

Figure~\ref{rect} illustrates this issue for the problem of fitting
a straight line through four data points that lie on the corners of a
rectangle in the $xy$-plane \cite{Sch87}. The estimated regression line
is horizontal if the error occurs in $y$, but vertical if the error
occurs in $x$. Because a line must have slope somewhere between 0
(horizontal) and $\infty$ (vertical), this example shows that the
perceived signal depends entirely on the definition of noise. By
analogy, think of watching 3D movies with polarized glasses. The two
polarized axes define separate images for the left and right eyes. Noise
for the right eye is signal for the left, and vice-versa. Seemingly
innocent choices of error in biological models sometimes disguise other
interpretations of the data, even given the same deterministic model
(e.g., a straight line in Figure~\ref{rect}).
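The two regression lines can be reproduced in a few lines of code. In this Python sketch the corner coordinates are my own choice, picked only to match the mean values $y=2$ and $x=3$ quoted in the caption of Figure~\ref{rect}.

```python
def ols(x, y):
    """Ordinary least squares of y on x: returns (slope, intercept)."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
            / sum((xi - xbar) ** 2 for xi in x)
    return slope, ybar - slope * xbar

# Four points on the corners of a rectangle centred at (3, 2)
x = [1.0, 5.0, 1.0, 5.0]
y = [1.0, 1.0, 3.0, 3.0]

# Error assumed in y: regress y on x -> horizontal line y = 2
slope_y, icept_y = ols(x, y)

# Error assumed in x: regress x on y -> the vertical line x = 3
slope_x, icept_x = ols(y, x)
```

Both fits have slope zero in their own coordinates, yet they describe a horizontal and a vertical line respectively: the perceived signal depends entirely on where the noise is assumed to live.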
% POP Trend Figure ****************************************************
\begin{figure}[t]
\includegraphics[scale=0.45]{fig5} % poptrend.eps
\caption{
Annual estimates of biomass ($\mbox{kt} = 10^6\ \mbox{kg}$) for a stock
of Pacific ocean perch (POP, \emph{Sebastes alutus}) assessed in
\cite{SchPOP}. The model uses historical data on catch (kt,
{\color{OliveGreen} green bars}) and two survey indices with distinct
coefficients ${\color{red} q_j}\,(j=1,2)$ in \eqref{Itj}. Observed index
values $\color{OliveGreen} I_{tj}$ are scaled to biomass levels
${\color{OliveGreen} I_{tj}}/{\color{red} q_j}$ for $j=1$ ({\color{blue}
blue circles}) and $j=2$ ({\color{blue} blue triangles}). The biomass
vulnerable to the fishery ({\color{red} solid red line}, $\color{red}
B_t$ in \eqref{Bsel}) is smaller than the total biomass ({\color{red}
dotted line}, $\color{red} B'_t$ in \eqref{Btot}), a feature represented
in the model by selectivity coefficients $\color{red} \beta_a$ that
increase steadily to 1 with increasing fish age $a$. }
\label{POPtrend} \end{figure}

Realistic fishery models involve much more complexity and data than
the simple prototypes \eqref{B12}--\eqref{I2} or \eqref{It}--\eqref{Bt}.
For example, Figure~\ref{POPtrend} shows results obtained from a model
of a particular stock of Pacific ocean perch (POP) on the coast of
British Columbia, Canada \cite{SchPOP}. The analysis uses three main
data sets: the annual catch $\color{OliveGreen} C_t$, intermittent
biomass index measurements $\color{OliveGreen} I_{tj}$ obtained in
various years $t$ by two methods ($j=1,2$), and sample proportions
$\color{OliveGreen} p_{at}$ of fish in the catch that have age $a$ in
year $t$. Historical data also give reasonable values for the weight
$\color{OliveGreen} w_a$ of a fish at age $a$. POP start to appear in
the fishery at recruitment age $k=7$, although the data suggest that
young fish are less vulnerable to the gear than older fish. The model
tries to capture this phenomenon with unknown selectivity coefficients
$\color{red} \beta_a$ that increase steadily toward 1 as $a \rightarrow
\infty$.

Internally, the model keeps track of the number $\color{red}
N_{at}$ of fish at age $a$ in year $t$ prior to the fishery. From this
matrix, various annual population characteristics can be computed,
including
%
\begin{align}
\mbox{recruitment:}\quad &
{\color{red} R_t} = {\color{red} N_{kt}}\,, \label{Rt} \\
\mbox{total biomass:}\quad &
{\color{red} B'_t} =
\sum_{a \geq k} {\color{OliveGreen} w_a} {\color{red} N_{at}}\,,
\label{Btot} \\
\mbox{selected biomass:}\quad &
{\color{red} B_t} = \sum_{a \geq k}
{\color{red} \beta_a} {\color{OliveGreen} w_a}
{\color{red} N_{at}}\,,
\label{Bsel} \\
\mbox{selected proportion:}\quad &
{\color{red} u_{at}} = {\color{red} \beta_a N_{at}} /
\sum_{a \geq k} {\color{red} \beta_a N_{at}}\,.
\label{uat}
\end{align}
%
Given suitable definitions of measurement error, the index data
$\color{OliveGreen} I_{tj}$ should be proportional to the selected
biomass $\color{red} B_t$ with unknown coefficients ${\color{red} q_j}
\, (j=1,2)$, and the observed proportions $\color{OliveGreen} p_{at}$
should match the internal proportions $\color{red} u_{at}$. Thus, the
model's {\color{blue} observation} component (Figure~\ref{ModFig})
involves stochastic counterparts of the deterministic equations
%
\begin{gather}
{\color{OliveGreen} I_{tj}} = {\color{red} q_j B_t} \,, \label{Itj} \\
{\color{OliveGreen} p_{at}} = {\color{red} u_{at}} \,. \label{pat}
\end{gather}

Figure \ref{POPtrend} shows a fishery similar to the one portrayed
in Figure \ref{TwoFish}A. Large historical catches $\color{OliveGreen}
C_t$ have been curtailed by regulation to a relatively modest steady
catch during the last two decades. According to the model, historically
large biomass levels $\color{red} B_t$ were driven down by large catches
in the 1960s and 1970s, but have recovered somewhat after the catch was
reduced. Sporadic surveys lend support to this scenario, but the survey
index data deviate substantially from the biomass trend. (Note the
scatter of blue points around the solid red line.) A recent slow decline
in biomass stems from low recruitment in the 1990s, as discussed below.
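A minimal Python sketch of the bookkeeping in \eqref{Rt}--\eqref{uat} for a single year may help clarify the definitions. The numbers-at-age, weights, and selectivities below are invented toy values, not estimates from the POP assessment \cite{SchPOP}.

```python
def population_summaries(N, w, beta, k=7):
    """Annual summaries for one year's numbers-at-age N[a],
    weights w[a], and selectivities beta[a], ages a = k, k+1, ...:
      recruitment        R  = N[k]
      total biomass      B' = sum_a w[a]*N[a]
      selected biomass   B  = sum_a beta[a]*w[a]*N[a]
      selected props.    u[a] = beta[a]*N[a] / sum_a beta[a]*N[a]"""
    ages = sorted(N)
    R = N[k]
    B_total = sum(w[a] * N[a] for a in ages)
    B_sel = sum(beta[a] * w[a] * N[a] for a in ages)
    denom = sum(beta[a] * N[a] for a in ages)
    u = {a: beta[a] * N[a] / denom for a in ages}
    return R, B_total, B_sel, u

# Hypothetical toy values for three age classes (all numbers invented);
# selectivity beta rises toward 1 with age, as in the POP model
N = {7: 1000.0, 8: 800.0, 9: 600.0}      # numbers at age
w = {7: 0.5, 8: 0.7, 9: 0.9}             # weight at age
beta = {7: 0.4, 8: 0.7, 9: 1.0}          # selectivity at age
R, B_total, B_sel, u = population_summaries(N, w, beta)
```

Because $\beta_a \le 1$, the selected biomass $B_t$ always comes out smaller than the total biomass $B'_t$, matching the solid and dotted red curves in Figure~\ref{POPtrend}.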
% POP Age Figure ******************************************************
\begin{figure}[t]
\begin{center} \includegraphics[scale=0.70]{fig6} % popage.eps
\end{center}
\caption{
(A) {\color{OliveGreen} Observed} and (B) {\color{red} estimated} age
distributions $\color{OliveGreen} p_{at}$ and $\color{red} u_{at}$ of
the population in Figure \ref{POPtrend}. Circular areas represent
relative age proportions within each year, where $\sum_a
{\color{OliveGreen} p_{at}} = \sum_a {\color{red} u_{at}} = 1$ for each
$t$. Solid lines show the annual mean age. The circle at the top of each
column represents a ``plus class'', i.e., fish of the indicated age or
older. An improved ageing method in 1977 gave better discrimination to
ages in the range 17 to 29. Observed data in (A) show an abrupt
increment in age resolution. Model estimates in (B) show a more gradual
change as the plus class increases annually from age 17 in 1963. }
\label{POPage} \end{figure}

The observed age data $\color{OliveGreen} p_{at}$, portrayed as a
\emph{bubble plot} \cite{RSO} in Figure~\ref{POPage}A, display a pattern
with several diagonal lines of large bubbles. These correspond to
episodes of high recruitment; for example, a particularly strong cohort
proceeds from ages 7 to 24 during the years 1983--2000. POP live long
enough to spawn many times. Ocean conditions and other factors determine
the success of each annual spawning event, which occasionally pays off
very well. Like good financiers, the fish distribute their genetic
investments across time, taking advantage of occasional high returns.
The estimated proportions $\color{red} u_{at}$ show this cohort effect
even more strongly (Figure~\ref{POPage}B). Each diagonal has small
proportions of fish near the recruitment age $k=7$, due to low
selectivity by the fishery. Natural and fishing mortality also cause the
proportions to decline as fish reach advanced ages.

For brevity, I have omitted many technical details in this
discussion of the POP example. The model handles {\color{blue} process
error} (Figure~\ref{ModFig}) by allowing independent recruitments
$\color{red} R_t$ for each cohort, with some constraints on their
variability and serial correlation. Stochastic versions of
\eqref{Itj}--\eqref{pat} introduce {\color{blue} measurement error}. As
in the dilemma of Figure \ref{rect}, model definition here requires
balancing recruitment variability with survey observation error. The
scenario portrayed in Figures \ref{POPtrend}--\ref{POPage} involves
estimates of nearly 60 independent unknown quantities from which many
others (such as the matrices $\color{red} N_{at}$ and $\color{red}
u_{at}$) are calculated. I haven't even mentioned the statistical
properties of all these estimates, which might be assessed using
computer-intensive algorithms from Bayesian statistics. Interested
readers can find a complete model description in \cite{SchPOP}.

Like most fishery age-structured models, the POP model has an
essential simplicity at its core. Each cohort (represented as a diagonal
of bubbles in Figure~\ref{POPage}A) experiences successive depletion
from annual removals by the fishery, analogous to the simple model in
Table~\ref{ModTab}. Combining removals with the effects of selectivity
and natural mortality gives the pattern in Figure~\ref{POPage}B, where
suitable recruitment parameters scale the cohort sizes relative to each
other. Despite this simplicity, however, a complete mathematical
statement of the stochastic model and its likelihood functions occupies
several pages dense with equations and notation
\cite[p.~23--29]{SchPOP}.

When scientists meet to debate the legitimacy of such a model, the
discussion can become highly technical, often using terminology that
alienates fishermen and other stakeholders. Furthermore, analysts
typically invest substantial effort in writing computer code, running
various analyses, producing tables, and crafting figures. All this hard
work can generate resistance to new ideas, which might be difficult to
include in an otherwise tidy framework. Like the legendary sculptor
Pygmalion who fell in love with his own ivory statue, analysts can
become attached to their models. Picture a cartoon scenario at LTCM in
August, 1998: ``Sir, now that Russia has defaulted on its bonds, should
I trash the model that made 40\% annually and start again?''

Skeptical fisherman James O'Malley sensed this problem in his claim
\cite{Om} that {\color{blue} ``mathematics has been elevated to a status
which suppresses knowledge and actually detracts from our efforts to
acquire knowledge.''} In his view, {\color{blue} the problem is ``not
mathematics per se, but the place of idolatry we have given it. $\dots$
Like any priesthood, it has developed its own language, rituals and
mystical signs to maintain its status, and to keep a befuddled
congregation subservient, convinced that criticism is blasphemy. Late at
night, of course, many members of the scientific community will confess
their doubts. But in the morning, they reappear to preach the catechism
once again.''}

I can easily find late-night reasons for doubts about the POP
model. It represents only one stock along the British Columbia
coastline, chosen for the availability of historical survey data. The
precise spatial definition of this stock is somewhat arbitrary, and its
genetic extent remains uncertain. Although POP have a complex spatial
distribution associated with geological features of the continental
slope, the model includes no spatial structure at all. Furthermore, the
fishery that captures POP also removes significant amounts of about 100
other species, and smaller amounts of many others. Perhaps the most
reassuring message comes directly from the {\color{OliveGreen} catch
data} in Figure \ref{POPtrend}. After a history of much higher catches,
modest removals have been sustained for more than two decades.
% Strategy Figure *****************************************************
\begin{figure}[t]
\includegraphics[scale=0.45]{fig7} % strat.eps
\caption{
Robust management strategy. Uncertain {\color{red} biological dynamics}
generate observed {\color{OliveGreen} data} from which a {\color{blue}
management strategy}, specified as a mathematical algorithm, determines
the next allowable {\color{red} quota}. This decision influences the
dynamics during the next management cycle. The future {\color{red}
quota} remains unknown until calculated from the {\color{OliveGreen}
data}. What algorithms define sensible management policies over a broad
spectrum of potential dynamic systems?}
\label{strat} \end{figure}
\section{Decisions and uncertainty}
In economics, \emph{risk} is quantifiable, but \emph{uncertainty}
is not. For example, if a process can follow one of three scenarios $A$,
$B$, or $C$, then risk analysis requires knowing the corresponding
probabilities $(p_A,p_B,p_C)$, where $p_A + p_B +
p_C=1$. Casinos and lotteries, with games that have known outcomes and
probabilities, can exploit risk analysis to the fullest. Investors and
fisheries managers rarely have the benefit of such clear understanding.
Typically, they face uncertainty with unknown event probabilities and an
incomplete list of potential scenarios. For example, future events might
actually follow a scenario $X$ not contemplated when formulating policies
to deal with $A$, $B$, or $C$.
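To make the distinction concrete, consider a small computational sketch; the scenarios, payoffs, and probabilities below are invented purely for illustration.

```python
# Risk: every scenario and its probability is known, so the expected
# payoff is a well-defined quantity (payoffs here are invented).
scenarios = {"A": 100.0, "B": 40.0, "C": -50.0}   # payoff per scenario
probs     = {"A": 0.5,   "B": 0.3,  "C": 0.2}     # p_A + p_B + p_C = 1

expected = sum(probs[s] * scenarios[s] for s in scenarios)
print(expected)  # 0.5*100 + 0.3*40 + 0.2*(-50) = 52.0

# Uncertainty: an uncontemplated scenario X invalidates the calculation
# above, because no probability was ever assigned to it.
scenarios["X"] = -500.0   # e.g., a stock collapse nobody modeled
```

The expected payoff exists only because the list of scenarios is complete; once $X$ enters, the arithmetic of risk analysis has nothing to say.
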
Mathematics proceeds from assumptions to conclusions, and a
mathematical model necessarily describes a chosen set of scenarios. The
estimation model in Figure~\ref{ModFig} limits data interpretation to a
specified framework of dynamics and observation. In practice, the
analyst usually looks at several alternative models to see how much the
interpretation changes. But perhaps this entire approach is wrong. If,
for example, the goal is to set a catch quota for the next year, then
why worry about the profusion of estimates that spill out of a
complicated analysis like the one discussed for POP? All we really need
is a {\color{blue} formula or algorithm} to calculate the (currently
unknown) {\color{red} future quota} from the {\color{OliveGreen}
observed data}.

Figure \ref{strat} portrays the decision problem from this point
of view. We want a {\color{blue} management strategy} that is robust to
an {\color{red} unknown biological system}. For example, suppose that we
seek to operate a fishery based on a time series of known catches
$\color{OliveGreen} C_t$ and biomass indices $\color{OliveGreen} I_t$.
Then what function $\color{blue} F$ should we use to set the catch quota
%
\begin{equation}
{\color{red} Q_{n+1}} =
{\color{blue} F(} {\color{OliveGreen} C_1,\dots,C_n,I_1,\dots,I_n};
{\color{OliveGreen} \theta} {\color{blue})}
\label{QF}
\end{equation}
%
for time $t=n+1$, given data up to time $t=n$? This formulation allows a
specified parameter vector $\color{OliveGreen} \theta$ to configure the
rule for particular circumstances and policy objectives. For example,
the recruitment age might be used to set time lags in the definition of
$\color{blue} F$. Other components of $\color{OliveGreen} \theta$ might
deal with risk tolerance, margins for error, and catch stability. When
this policy is implemented in year $n+1$, the actual catch
$\color{OliveGreen} C_{n+1}$ normally equals the quota, but conceptually
Figure \ref{strat} and formula \eqref{QF} admit the possibility of
\emph{implementation error} with ${\color{OliveGreen} C_{n+1}} \neq
{\color{red} Q_{n+1}}$.
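As a purely hypothetical instance of \eqref{QF}, the sketch below scales the most recent catch by the trend in the biomass index, with a cap on year-to-year change; the functional form, the components of $\theta$, and the numbers are illustrative assumptions, not a rule used in any actual fishery.

```python
def quota_rule(C, I, theta):
    """Hypothetical control rule F(C_1,...,C_n, I_1,...,I_n; theta):
    scale the most recent catch by the recent trend in the biomass
    index, with a cap on year-to-year change for catch stability."""
    lag, gamma, max_change = theta
    recent  = sum(I[-lag:]) / lag            # mean index, last `lag` years
    earlier = sum(I[-2 * lag:-lag]) / lag    # mean index, preceding block
    adjust = (recent / earlier) ** gamma     # respond to the index trend
    adjust = max(1 - max_change, min(1 + max_change, adjust))
    return C[-1] * adjust

# A flat index leaves the quota unchanged: Q_{n+1} = C_n.
C = [50.0] * 10          # catch history (invented)
I = [1.0] * 10           # biomass index history (invented)
print(quota_rule(C, I, theta=(3, 1.0, 0.2)))  # 50.0
```

Here the time lag, the responsiveness $\gamma$, and the stability cap all live in $\theta$, so the same algorithm can be tuned to different stocks and policy objectives.
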
Fishery literature in recent years (e.g., \cite{dlM} and
\cite{SPS}) has begun to investigate problems similar to the one posed
in Figure~\ref{strat}. Unlike optimal control theory, this research
seeks pragmatic controls that work sensibly across a spectrum of
biological models, where the actual model remains unknown. For example,
the policy \eqref{QF} might simply be to stay the course:
%
\begin{equation}
{\color{red} Q_{n+1}} = {\color{OliveGreen} C_n} \,, \label{Cfix}
\end{equation}
%
as has happened recently in the herring and POP fisheries (Figures
\ref{TwoFish}A and \ref{POPtrend}). Under what circumstances would this
be a good or bad policy? Furthermore, could some really clever algorithm
\eqref{QF} guide fishery management toward a policy robust to nature's
unpredictability? As illustrated in a different context by the
Black-Scholes formula \eqref{bsf}, a standard calculation based on a
relatively simple model might give acceptable results most of the time.

Fishery scientists use simulation models with realistic biological
complexity to evaluate management strategies like \eqref{QF} and tune
the control parameters $\color{OliveGreen} \theta$ to meet management
objectives \cite{SPS}. In practice, each new year $n$ provides an
opportunity to review and update the strategy in light of new
information. This allows the process to be guided by human judgment and
reduces the risk of disaster, but no model can reduce that risk to zero.
Like financial markets altered by a crisis, nature sometimes changes the
perceived ``rules'' governing fish population dynamics \cite{SchR}.
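A closed-loop evaluation of this kind can be sketched in code. The Schaefer surplus-production dynamics, the parameter values, and the two candidate strategies below are illustrative assumptions (the index is taken to equal true biomass), not the operating models used in the studies cited above.

```python
import random

def simulate(policy, r=0.3, K=1000.0, sigma=0.1, years=50, seed=1):
    """Closed-loop test of a management strategy against Schaefer
    surplus-production dynamics with lognormal process error."""
    rng = random.Random(seed)
    B = K / 2.0                      # start at half carrying capacity
    C_hist, B_hist = [], []
    for _ in range(years):
        Q = policy(C_hist, B_hist)   # the strategy sees only past data
        C = min(Q, B)                # cannot remove more than exists
        eps = rng.lognormvariate(0.0, sigma)
        B = max((B + r * B * (1.0 - B / K) - C) * eps, 1e-6)
        C_hist.append(C)
        B_hist.append(B)
    return C_hist, B_hist

# Two candidate strategies: a fixed quota versus a fixed exploitation rate.
fixed_quota = lambda C, B: 80.0
fixed_rate  = lambda C, B: 0.1 * (B[-1] if B else 500.0)

for name, pol in (("fixed quota", fixed_quota), ("fixed rate", fixed_rate)):
    C, B = simulate(pol)
    print(name, "mean catch:", round(sum(C) / len(C), 1),
          "final biomass:", round(B[-1], 1))
```

With $r=0.3$ and $K=1000$ the maximum surplus production is $rK/4 = 75$, so the fixed quota of 80 overfishes the simulated stock while the fixed exploitation rate tracks it; running many such loops over many operating models is the essence of tuning $\theta$.
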
During the last few decades, the discovery of chaotic behavior has
stimulated extensive research into nonlinear dynamic models. May
\cite{May} characterizes the spirit of this revolution in the 1970s by
quoting a line from the play \emph{Arcadia} by Tom Stoppard:
{\color{blue} ``It's the best possible time to be alive, when almost
everything you knew is wrong.''} The boundary between determinism and
chance became blurred when scientists realized that deterministic models
from an orderly Newtonian tradition could generate highly complex
trajectories. Kuznetsov \cite{Kuz} illustrates the depth and variety of
results that have emerged in this field. Brauer and Castillo-Ch\'avez
\cite{Br} incorporate some of these developments in their modern
treatment of biological models. By discussing the historical and
conceptual context for each model, they also guide readers toward
appropriate model application.

While mathematicians developed the new science of chaos theory,
statisticians began exploring new approaches to data analysis made
possible by the computer revolution. For example, in 1979 Efron
\cite{Ef} speculated that computationally intensive methods might bypass
the need for statistical theory. In their textbook written 14 years
later, Efron and Tibshirani \cite[p.~xiv]{EfT} explain that
{\color{blue} ``The traditional road to statistical knowledge is
blocked, for most, by a formidable wall of mathematics. Our approach
here avoids that wall. The bootstrap is a computer-based
method of statistical inference that can answer many real statistical
questions without formulas.''} This technique belongs properly to
frequentist statistics, in which parameters have actual values in nature
and their estimates inherit a distribution from the data. The bootstrap
algorithm involves resampling the data with replacement and generating
an empirical distribution of parameter estimates.
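A minimal version of the algorithm fits in a few lines; the survey-index observations below are invented, and the estimator is simply the mean.

```python
import random

def bootstrap(data, estimator, n_boot=2000, seed=42):
    """Resample the data with replacement and return the empirical
    distribution of the estimator across the resamples."""
    rng = random.Random(seed)
    return [estimator([rng.choice(data) for _ in data])
            for _ in range(n_boot)]

# Invented survey-index observations; the estimator is simply the mean.
index = [3.1, 2.4, 4.0, 3.6, 2.9, 3.3, 2.7, 3.8]
mean = lambda xs: sum(xs) / len(xs)

dist = sorted(bootstrap(index, mean))
lo, hi = dist[round(0.025 * len(dist))], dist[round(0.975 * len(dist))]
print(lo, hi)   # an approximate 95% percentile interval, no formulas required
```

The percentile interval emerges directly from the resampled distribution, exactly the ``without formulas'' inference Efron and Tibshirani describe.
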
A similar computational revolution has also taken place in Bayesian
statistics, which uses probability distributions to describe subjective
uncertainty in parameter values. Starting from an initial \emph{prior}
distribution, new data determine a revised \emph{posterior}
distribution. Markov chain Monte Carlo algorithms use clever methods to
generate a random sample of parameter values from the posterior, and this
sample
represents the analyst's current understanding. Clifford
\cite[p.~53]{Clif} emphasized the practical impact of one sampling method:
{\color{blue} ``$\dots$ from now on we can compare our data with the
model that we actually want to use rather than a model which has some
mathematically convenient form. This is surely a revolution''}.
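One such method, the random-walk Metropolis algorithm, can be sketched in a few lines; the normal prior, normal likelihood, and data below are invented for illustration, not drawn from any assessment.

```python
import math
import random

def log_post(mu, data, prior_sd=10.0, obs_sd=1.0):
    """Unnormalized log posterior: normal prior on mu, normal likelihood."""
    lp = -0.5 * (mu / prior_sd) ** 2
    lp += sum(-0.5 * ((x - mu) / obs_sd) ** 2 for x in data)
    return lp

def metropolis(data, n=5000, step=0.5, seed=7):
    """Random-walk Metropolis: draw a sample from the posterior of mu."""
    rng = random.Random(seed)
    mu, chain = 0.0, []
    for _ in range(n):
        prop = mu + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_post(prop, data) - log_post(mu, data):
            mu = prop                 # accept the proposed move
        chain.append(mu)
    return chain[n // 2:]             # discard the first half as burn-in

data = [4.8, 5.3, 4.9, 5.6, 5.1]      # invented observations
sample = metropolis(data)
print(round(sum(sample) / len(sample), 2))   # posterior mean, near the data mean
```

Nothing in the sampler requires the posterior to have a mathematically convenient form; only the ability to evaluate it up to a constant, which is precisely Clifford's point.
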
To some extent, these historical developments have drawn
mathematicians into two camps. Using deliberate oversimplification, I'll
call them \emph{theorists} and \emph{realists}. In this hypothetical
world, a theorist explores deterministic models to find interesting
theorems and sometimes surprising dynamic behavior. Scenarios emerge
from first principles. For example, different regions of parameter space
might produce qualitatively different trajectories -- stable, periodic,
or chaotic. Data come into the discussion only after a clearly
formulated understanding of how things might work. In fact, too much
attention to data might restrict the theorist's range of imagination and
exploration.
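The Ricker map, a stock-recruitment relationship familiar in fisheries, illustrates this parameter-space exploration; the parameter values below are standard textbook choices, not estimates for any stock. The equilibrium at $x=1$ is stable for $r<2$, and larger $r$ leads through period-doubling into chaos.

```python
import math

def ricker_orbit(r, x0=0.1, n=1000, keep=16):
    """Iterate the Ricker map x -> x * exp(r * (1 - x)) and return a few
    points of the long-run trajectory (transients discarded)."""
    x = x0
    for _ in range(n):                    # burn off the transient
        x = x * math.exp(r * (1.0 - x))
    orbit = []
    for _ in range(keep):                 # record the long-run behavior
        x = x * math.exp(r * (1.0 - x))
        orbit.append(round(x, 4))
    return orbit

# Different regions of parameter space, qualitatively different dynamics.
for r in (1.5, 2.3, 3.0):                 # stable, periodic, chaotic
    print("r =", r, "->", len(set(ricker_orbit(r))), "distinct long-run values")
```

A single deterministic equation thus yields equilibrium, cycles, or apparent randomness depending only on where $r$ sits, the kind of surprise that drove the revolution May describes.
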
A realist, on the other hand, considers the data paramount, as in
the quote from Sherlock Holmes \cite{Hol}: {\color{blue} ``It is a
capital mistake to theorize before one has data. Insensibly one begins
to twist facts to suit theories, instead of theories to suit facts.''}
Data never fit a deterministic model exactly, so any analysis falls
automatically into the realm of statistics. The focus shifts from
theorems to algorithms, from abstract theory to detective work.

Given these stereotypes, it's easy to imagine areas of
disagreement. The realist asks: ``Why bother proving existence theorems
about models that don't fit any data?'' And the theorist counters: ``How
can you trust your complex model framework, with unknown global
properties and limited computer explorations that blunder around
randomly in the dark?'' Perhaps like many readers of this paper, I
consider myself an applied mathematician with roots in both camps.
Applied work demands respect for reality, not just mathematical elegance
(Samuelson \cite{TDB}): {\color{blue} ``$\dots$ don't believe in
something because it's a beautiful formula. There will always be room
for judgment.''} I'll give the final words to fisherman James O'Malley,
who puts mathematics in a singular position between knowledge and
understanding about the real world \cite{Om}:
{\color{blue} ``What is happening out there on the ocean, and why
is it happening? What will we do about it? $\dots$ We owe it to
ourselves, to the ocean, and especially to science itself, to assemble
that great body of knowledge, those millions of observations, and to use
every tool, including mathematics, to further our understanding of that
knowledge. Knowledge and understanding are not the same. They may, in
fact, be separated by a wide chasm. Mathematics is neither knowledge nor
understanding. It may be a useful tool to help us bridge that gap. That
is where it belongs, that is} {\color{blue} how we should use it, and we
need to start now -- before the bean-counters destroy us all.''}
\subsection*{Acknowledgements}
I thank Lev Idels and Elena Braverman for inviting me to this conference
and encouraging me to contribute to the proceedings. My conversations
with conference participants (particularly Fred Brauer, Odo Diekmann,
and Clara Nucci) stimulated my thinking about the relationship between
nonlinear model theory and its statistical application in practice. Jake
Schweigert supplied data and background on the herring fishery portrayed
in Figure~\ref{TwoFish}A. Rowan Haigh and an anonymous reviewer made
helpful suggestions leading to an improved final draft.
\begin{thebibliography}{99}
\bibitem{BS}
Black F. and Scholes M.; \emph{The pricing of options and corporate
liabilities}; Journal of Political Economy 81:~637--654 (1973).
\bibitem{Br}
Brauer F. and Castillo-Ch\'avez C.; \emph{Mathematical models in
population biology and epidemiology}; Springer-Verlag, New York; 416~p.
(2001).
\bibitem{Clif}
Clifford P.; \emph{Discussion on the meeting on the Gibbs sampler and
other Markov chain Monte Carlo methods}; Journal of the Royal
Statistical Society (Series B) 55:~53--102 (1993).
\bibitem{Hol}
Conan Doyle A.; \emph{A scandal in Bohemia}, The Adventures of Sherlock
Holmes (1892). Reprinted in \emph{The Illustrated Sherlock Holmes
Treasury}; Chatham River Press, New York, NY (1986).
\bibitem{dlM}
de la Mare W.K.; \emph{{\color{blue} Tidier fisheries management needs a
new MOP}\footnote{\color{blue} A blue font highlights several catchy
titles.} (management-oriented paradigm)}; Reviews in Fish Biology and
Fisheries 8:~349--356 (1998).
\bibitem{Ef}
Efron B.; \emph{\color{blue} Computers and the theory of statistics:
thinking the unthinkable}; SIAM Review 21:~460--480 (1979).
\bibitem{EfT} Efron B.E. and Tibshirani R.J.; \emph{An introduction to the
bootstrap}, Monographs on Statistics and Applied Probability 57; Chapman
\& Hall, New York; 436~p. (1993; republished by CRC Press 1998).
\bibitem{ER}
ERisk.com; \emph{Case study: Long-Term Capital Management}, ERisk
Learning Center. Available (December 2004) at
http://www.erisk.com/Learning/CaseStudies/ref\_case\_ltcm.asp
\bibitem{MF}
Horizon; \emph{\color{blue} The Midas Formula}; BBC TV Horizon program
(December 2, 1999). Transcript available (December 2004) at
http://www.bbc.co.uk/science/horizon/1999/midas.shtml
\bibitem{Kuz}
Kuznetsov Y.A.; \emph{Elements of applied bifurcation theory}, Applied
Mathematical Sciences 112; Springer-Verlag, New York; 631~p. (third
edition, 2004).
\bibitem{Cod}
Lilly G.R., Shelton P.A., Brattey J., Cadigan N.G., Healey B.P., Murphy
E.F., Stansbury D.E., and Chen N.; \emph{An assessment of the cod stock
in NAFO Divisions 2J+3KL in February 2003}; DFO Canadian Science
Advisory Secretariat Research Document 2003/023; 157 pp. (2003).
Available at\\
http://www.dfo-mpo.gc.ca/CSAS/Csas/English/Research\_Years/2003/2003\_023\_E.htm
\bibitem{Low}
Lowenstein R.; \emph{{\color{blue} When genius failed:} the rise and
fall of Long-Term Capital Management}; Random House, New York; 264~p.
(2001).
\bibitem{May}
May R., \emph{\color{blue} The best possible time to be alive}, chapter
in: Farmelo G. [ed.] \emph{It must be beautiful: Great equations of
modern science}; Granta Publications, London; 284~p. (2002).
\bibitem{Mer}
Merton R.C.; \emph{Theory of rational option pricing}; Bell Journal of
Economics and Management Science 4:~141--183 (1973).
\bibitem{TDB}
Nova; \emph{\color{blue} Trillion Dollar Bet}; PBS TV Nova program
(February 8, 2000). Transcript available (December 2004) at
http://www.pbs.org/wgbh/nova/transcripts/2704stockmarket.html
\bibitem{Om}
O'Malley J.; \emph{\color{blue} From science to illusion: mathematics in
fishery management}; talk presented in Halifax, Nova Scotia (November
30, 1998). Available (December 2004) at
http://www.fishingnj.org/artomalley.htm
\bibitem{RSO}
Richards L.J., Schnute J.T., and Olsen N.; \emph{Visualizing catch-age
analysis: a case study}; Canadian Journal of Fisheries and Aquatic
Sciences 54:~1646--1658 (1997).
\bibitem{SPS}
Sainsbury K.J., Punt A.E., and Smith A.D.M.; \emph{Design of operational
management strategies for achieving fishery ecosystem objectives}; ICES
Journal of Marine Science 57:~731--741 (2000).
\bibitem{Sch87}
Schnute J.; \emph{Data uncertainty, model ambiguity, and model
identification: ({\color{blue} ``Mirror, mirror on the wall, what
model's fairest of them all?''})}; Natural Resource Modeling 2:~159--212
(1987).
\bibitem{SchPOP}
Schnute J.T., Haigh R., Krishka B.A., and Starr P.; \emph{Pacific ocean
perch assessment for the west coast of Canada in 2001}; CSAS Research
Document 2001/138:~90~p (2001). Available at:
http://www.dfo-mpo.gc.ca/CSAS/Csas/English/Research\_Years/2001/2001\_138e.htm
\bibitem{SchR}
Schnute J.T. and Richards L.J.; \emph{\color{blue} Use and abuse of
fishery models}; Canadian Journal of Fisheries and Aquatic Sciences
58:~10--17 (2001).
\bibitem{W} Wiener, N.; \emph{Quotations by Norbert Wiener}, School of
Mathematics and Statistics, University of St.\ Andrews, Scotland.
Available (December 2004) at
http://www-history.mcs.st-andrews.ac.uk/Quotations/Wiener\_Norbert.html
\bibitem{BSwik}
Wikipedia; \emph{Black-Scholes}; Wikipedia, the free encyclopedia.
Available (December 2004) at http://en.wikipedia.org/wiki/Black-Scholes
\end{thebibliography}
\end{document}