Abstract
Ordinary differential equations (ODEs) are widely used for modeling in Systems Biology. As typically only some of the kinetic parameters are measurable or precisely known, parameter estimation techniques are applied to fit the model to experimental data. A main challenge for parameter estimation is the complexity of the parameter space, in particular its high dimensionality and its local minima.
Parameter estimation techniques consist of an objective function, measuring how well a certain parameter set describes the experimental data, and an optimization algorithm that optimizes this objective function. A lot of effort has been spent on developing highly sophisticated optimization algorithms to cope with the complexity of the parameter space, but surprisingly few articles address the influence of the objective function on the computational complexity of finding global optima. We extend a recently developed multiple shooting for stochastic systems (MSS) objective function, originally designed for parameter estimation in stochastic models, and apply it to parameter estimation of ODE models. This MSS objective function treats the intervals between measurement points separately. This separate treatment allows the ODE trajectory to stay close to the data, and we show that it reduces the complexity of the parameter space.
We use examples from Systems Biology, namely a Lotka-Volterra model, a FitzHugh-Nagumo oscillator and a Calcium oscillation model, to demonstrate the power of the MSS approach in reducing the complexity and the number of local minima in the parameter space. The approach is fully implemented in the COPASI software package and is, therefore, easily accessible for a wide community of researchers.
Introduction
Mathematical modeling is a vital technique for analyzing complex systems in the sciences. We focus on models of ordinary differential equations (ODEs), as these are widely used to model time course experiments in Systems Biology. Such models are often parametrized using parameter estimation methods that find sets of parameters for which the model optimally fits the data.
Challenges arise from the complexity of the parameter space, which may contain (many) local minima. Parameter estimation techniques consist of two components: an objective function quantifying how well the parametrized model fits the data, and an optimization algorithm optimizing this objective function. A huge amount of research focuses on optimization algorithms: there are global optimization techniques Moles03 ; Rodriguez06 ; PS , gradient based approaches Bock07 ; LM64 and Bayesian techniques Ghasemi11 ; Girolami08 . However, there have been few investigations into how the choice of the objective function leads to different landscapes and thus influences the difficulty of the optimization problem Bock07 ; Leander14 . We slightly extend a recently developed method Zimmer12 and show how it can be applied to ODE models to drastically reduce the complexity of the parameter search space. This “multiple shooting for stochastic systems” (MSS) method was recently developed for parameter estimation in stochastic models. The MSS approach splits the time course data into intervals that are treated separately. The initial values for each interval are composed from actual measurements and a state updating for the unobserved components. This article makes the approach more flexible by allowing the number of intervals to vary. Using only one interval corresponds to the common least squares functional for ODE models; using as many intervals as there are measurements corresponds to the MSS objective function in Zimmer12 .
The importance of state estimation has been recognized in Bock07 , which considers the state variables as additional optimization variables and uses continuity constraints on them to obtain a continuous trajectory in the end. Highly sophisticated structure exploitation then reduces the complexity of the enlarged optimization problem again. Leander14 uses techniques from stochastic differential equations combined with Kalman filtering. As both of these approaches and ours add flexibility to the states, this seems to be a crucial point in influencing the objective function landscape. The appeal of our method is its technical simplicity, which gives any user the chance to implement it easily on their own. Moreover, as the approach is implemented in COPASI, the full model import and analysis functionality of COPASI can be used.
This article demonstrates the benefits of applying the extended approach to time course data from ODE models. The parameter estimation landscapes are considerably smoother with this MSS objective function, which greatly simplifies optimization. It is fully implemented in the software COPASI Copasi , a popular modeling and simulation environment. Apart from its own file format, COPASI supports the import of models encoded in SBML hucka_2003full ; hucka_2010 . Therefore, a user can apply the method quickly to models available in model databases, or to models created by hundreds of other SBML compliant software programs.
The article uses three different example models to show the power of the MSS approach on ODE models: a FitzHugh-Nagumo oscillator, a Lotka-Volterra model and a Calcium oscillation model.
Method
MSS objective function
Assume that data $y_1, \dots, y_N$ is observed at time points $t_1 < t_2 < \dots < t_N$. The state of the system at time $t_i$ is composed of the observed and unobserved species: $x(t_i) = (x^{\text{obs}}(t_i), x^{\text{hid}}(t_i))$. A model that describes the dynamical behavior of the system is given by a system of ordinary differential equations:
(1) $\dot{x}(t) = f(x(t), \theta)$
(2) $x(t_1) = x_1$
with a right hand side function $f$ depending on the system state $x$ and some parameter vector $\theta$. In Systems Biology examples $f$ is often determined by the stoichiometric matrix $S$
and the rate law vector $v$
: $f(x, \theta) = S \cdot v(x, \theta)$. We recently published an objective function for parameter estimation Zimmer12 that decomposes the whole time series into intervals and fits the intervals individually. The initial value for each interval is obtained by a state updating. We suggest a slightly more flexible state updating here, namely the state updating is performed as follows: Choose a subset of the measurement time points, $T \subseteq \{t_1, \dots, t_N\}$, and define the state update at $t_i \in T$ as:
(3) $\hat{x}^{\text{obs}}(t_i) = y_i$
and $\hat{x}^{\text{hid}}(t_i) = x^{\text{hid}}(t_i; \hat{x}(t_{i-1}), \theta)$
with $i = 2, \dots, N$, “obs” denoting the observable components and “hid” the unobservable (hidden) components; hence the state estimate is composed as $\hat{x}(t_i) = (\hat{x}^{\text{obs}}(t_i), \hat{x}^{\text{hid}}(t_i))$. The same decomposition holds for the ODE solution $x(t; \hat{x}(t_{i-1}), \theta)$, which denotes the solution of equation (1) started at time $t_{i-1}$ with initial value $\hat{x}(t_{i-1})$. At time points $t_i \notin T$ no update is performed, i.e. $\hat{x}(t_i) = x(t_i; \hat{x}(t_{i-1}), \theta)$.
The objective function for parameter estimation is then defined as
(4) $F(\theta) = \sum_{i=1}^{N-1} \big( y_{i+1} - x^{\text{obs}}(t_{i+1}; \hat{x}(t_i), \theta) \big)^2$
with $x$ as in equation (1) and $\hat{x}$ as in equation (3).
If $T = \emptyset$, the objective function equals the standard least squares functional (LSQ). If $T$ contains all measurement time points, the objective function equals the MSS objective function in Zimmer12 ; Zimmer14 . The number of points in $T$ determines how many data points are used for updating the observable components of the system.
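The interval-wise evaluation described above can be sketched in plain Python. This is a minimal illustration with hypothetical function names, not the COPASI implementation; the hand-rolled Runge-Kutta integrator merely stands in for any ODE solver:

```python
def rk4_step(f, x, t, dt, theta):
    """One classical Runge-Kutta step for dx/dt = f(x, t, theta)."""
    k1 = f(x, t, theta)
    k2 = f([xi + 0.5 * dt * ki for xi, ki in zip(x, k1)], t + 0.5 * dt, theta)
    k3 = f([xi + 0.5 * dt * ki for xi, ki in zip(x, k2)], t + 0.5 * dt, theta)
    k4 = f([xi + dt * ki for xi, ki in zip(x, k3)], t + dt, theta)
    return [xi + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for xi, a, b, c, d in zip(x, k1, k2, k3, k4)]

def integrate(f, x0, t0, t1, theta, n_steps=50):
    """Integrate the ODE from t0 to t1 and return the final state."""
    x, t = list(x0), t0
    dt = (t1 - t0) / n_steps
    for _ in range(n_steps):
        x = rk4_step(f, x, t, dt, theta)
        t += dt
    return x

def mss_objective(f, times, data, obs_idx, x0, theta, update_set):
    """Piecewise least squares with state updating at points in update_set.

    times      : measurement time points t_1 < ... < t_N
    data       : data[i][k] = measurement of observable k at times[i]
    obs_idx    : indices of the observable state components
    x0         : full initial state at times[0]
    update_set : subset of times at which observables are reset to data;
                 an empty set gives the standard LSQ functional,
                 all measurement points give the full MSS objective.
    """
    x = list(x0)
    cost = 0.0
    for i in range(len(times) - 1):
        x = integrate(f, x, times[i], times[i + 1], theta)
        cost += sum((data[i + 1][k] - x[j]) ** 2
                    for k, j in enumerate(obs_idx))
        if times[i + 1] in update_set:
            # state update: observables from data, hidden parts keep
            # their predicted values
            for k, j in enumerate(obs_idx):
                x[j] = data[i + 1][k]
    return cost
```

Passing an empty `update_set` recovers the ordinary least squares functional; passing all measurement time points corresponds to the full MSS objective, and intermediate subsets interpolate between the two.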
Optimization and Software
The objective function can be optimized with gradient based methods, global optimization techniques or Bayesian approaches. The objective function is implemented in the software package COPASI. COPASI is a platform independent, user friendly software tool for setting up, simulating and analysing kinetic models of biochemical reaction networks. It is freely available, open source software and is developed in a well established international cooperation. For a user of COPASI, the workflow of running a parameter estimation with this objective function looks as follows: first the user imports a dataset containing the observed/measured time points and maps the columns of the dataset to the corresponding elements in their model. Once that is done, a feature in the Tools menu called Create Events for Timeseries Experiment turns the measured time points into discrete events. These events force the value of each observed model variable to its measured value at the corresponding time point. Users are free to modify the list of events, removing or adding further time points as desired. Removing all automatically created events restores the original LSQ functional. Example files have been placed on our website at http://copasi.org/Projects/piecewise_parameter_fitting/.
Results
The results section demonstrates the power of the method on three examples from Systems Biology: a FitzHugh-Nagumo oscillator, a Lotka-Volterra model and a Calcium oscillation model.
FitzHugh-Nagumo oscillator
The first example is a FitzHugh-Nagumo oscillator Nagumo62 ; FitzHugh61 . Measurements of one component are recorded at equidistant time points with normally distributed additive noise. The parametrization has been chosen according to Leander14 . Figure 2 (left panel) shows the fitness landscape with the LSQ objective function (no state updates) and figure 2 (right panel) shows the fitness landscape with the MSS objective function (state updates at all measurement time points). One can see that the objective function landscape is considerably smoother with the MSS objective function than with the conventional LSQ objective function. Therefore, optimization algorithms encounter fewer difficulties using the MSS objective function.
As the MSS objective function differs from the LSQ objective function, their global minima are not necessarily identical, and the two minima found for this scenario indeed differ slightly. Note that the LSQ minimum is not the true parameter, as measurement noise has been added to the pseudo data. As most users will be interested in the LSQ minimum, one can use the MSS minimum as start value for a second optimization with the LSQ functional. This second optimization will be very fast, as the MSS minimum lies within the valley of attraction of the global LSQ minimum. As described above, both objective functions are implemented in the software COPASI Copasi , making them readily available to users. They can enter their model using a graphical user interface, or start with models from model databases. Measured data is quickly applied to the model, and simulations can easily be carried out, plotted and analyzed.
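The two-stage strategy (a coarse search on the smooth MSS landscape followed by a local LSQ refinement started from the MSS minimum) can be illustrated with a deliberately simple one-parameter decay model. The model, the time grid and the fixed "measurement errors" below are purely hypothetical and chosen for brevity; they are not taken from the article's examples:

```python
import math

# Pseudo data: exponential decay x(t) = exp(-0.7 t), observed with small,
# fixed, illustrative measurement errors.
TIMES = [0.5 * i for i in range(11)]
NOISE = [0.01, -0.01, 0.008, 0.012, -0.006, 0.009,
         -0.011, 0.007, -0.009, 0.01, -0.008]
DATA = [math.exp(-0.7 * t) + e for t, e in zip(TIMES, NOISE)]

def lsq_cost(theta):
    """Classical least squares: integrate once from the first data point."""
    return sum((y - DATA[0] * math.exp(-theta * t)) ** 2
               for t, y in zip(TIMES, DATA))

def mss_cost(theta):
    """Piecewise cost: each interval restarts from the previous datum."""
    return sum((DATA[i + 1]
                - DATA[i] * math.exp(-theta * (TIMES[i + 1] - TIMES[i]))) ** 2
               for i in range(len(TIMES) - 1))

def argmin(cost, candidates):
    """Return the candidate with the smallest cost."""
    return min(candidates, key=cost)

# Stage 1: coarse search on the smooth MSS landscape.
theta_mss = argmin(mss_cost, [0.1 + 0.01 * k for k in range(191)])
# Stage 2: refine with the LSQ cost, starting in its valley of attraction.
theta_final = argmin(lsq_cost,
                     [theta_mss + 0.001 * k for k in range(-200, 201)])
```

For this one-dimensional toy problem both landscapes are benign; the point of the sketch is only the mechanics of using the MSS estimate as the start value of the LSQ refinement.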
Lotka-Volterra
The second example is a Lotka-Volterra system which describes the interaction of a prey population with a predator population. Measurements of both populations are recorded at equidistant time points with additive normally distributed measurement error.
Figure 3 (left) shows the objective function landscape with the LSQ objective function and figure 3 (right) shows the objective function landscape using the MSS objective function. The first observation is that the MSS objective function landscape is free of local minima, whereas the LSQ objective function landscape shows local minima. Therefore, optimization is much easier with the MSS objective function.
Again, as the objective functions are not identical, they lead to two slightly different minima. However, the important point is that the MSS minimum lies in the valley of attraction of the global LSQ minimum. This means that one can use the MSS minimum as start value for a second optimization run with the LSQ objective function, and this second optimization run is very fast.
Furthermore, the Lotka-Volterra example has been used to investigate bias and variance of the LSQ estimates and the MSS estimates. To this end, noisy time series have been created by adding Gaussian noise to equidistantly sampled time courses of observations. Each of the time series is used for parameter estimation with the LSQ and the MSS objective function, resulting in one LSQ and one MSS estimate per time series. The average deviation of the estimates from the true value is larger for the MSS method, which means that the LSQ method has a smaller bias for this example. The standard deviation of the MSS estimates is also larger than the standard deviation of the LSQ estimates. However, the authors suggest using the MSS method for a first estimation and the MSS estimate as start value for a second LSQ estimation.
Figure 4 illustrates why the LSQ objective function landscape has local minima: the globally optimal fit in (A1,A2) shows good agreement between data and model. The fit in the local minimum (B1,B2) shows an oscillation with a different frequency, but still some of the peaks are fitted well. A parameter halfway between the global and the local optimum, however, produces an oscillation that does not fit any of the peaks and therefore results in a worse objective function value. Our MSS method fits the data piecewise. This means that every interval is initialized with a state estimate that incorporates information from the observations. Therefore, effects such as out-of-phase oscillations can no longer occur. This seems to be a major effect in reducing the number of local minima in the objective function landscape.
Calcium Oscillation model
The third model is a Calcium oscillation core model from Kummer05 . Measurements are assumed to be recorded at equidistant time points.
The model with this parametrization exhibits a complex oscillatory behavior with bursting (figure 5 (A)), making parameter estimation especially challenging.
Figure 5 (B) shows the LSQ objective function landscape when varying one model parameter. The landscape is full of local minima, and optimization will require many function evaluations even on a relatively small parameter interval. Figure 5 (H) plots the MSS objective function landscape for the same setting. It is free of local minima and convex, which makes optimization simple. This shows again the beneficial influence of the MSS method on the objective function landscape.
To address the question of how many intermediate points are necessary to reduce the influence of local minima, the objective function is also plotted for different numbers of state update points. This shows that the introduction of even a single intermediate point (figure 5 (C)) already improves the landscape. However, local minima are still present, and indeed almost all observation points have to be included to obtain a landscape free of local minima.
If the MSS objective function is used with a non-empty set of update points, it differs from the LSQ objective function, which means that the minima will not necessarily be identical. Estimating the parameter for all of these scenarios shows that the estimates all lie within a small interval. This means that the MSS objective function introduces only a small bias compared to the LSQ functional for this model.
More importantly, it means that all the pre-estimates lie in the valley of attraction of the global minimum of the LSQ functional. The software COPASI enables the user to use such a pre-estimate as start value for a second run with the LSQ objective function. As this second run starts very close to the global minimum, the optimization is very fast.
Discussion
We extended a recently developed objective function for parameter estimation in stochastic models. This extension can be used for parameter estimation in ODE models and greatly reduces the complexity of the parameter search space. We demonstrate this feature on three models for which the LSQ objective function exhibits multiple local minima and show the power of our MSS objective function.
The MSS method treats the intervals between successive data points separately and uses a state updating to obtain initial values for each interval. The extension increases the flexibility of this approach by allowing some of the intervals to be treated separately and others jointly. This makes it possible to investigate the influence of state estimation on the objective function landscape.
As Bock07 and Leander14 also add flexibility to the states, this seems to be a crucial point in influencing the objective function landscape. Figure 4 illustrates how local minima can occur in the LSQ objective function landscape: as the LSQ approach uses only the very first initial value for integration, phase shifts in the oscillations may occur. This leads to an increase in the objective function (C1,C2). As soon as, e.g., a doubled frequency is reached, the objective function value decreases again (B1,B2), leading to a local minimum. More generally, in complex systems an increasing deviation of the parameter from the true parameter does not necessarily lead to a stronger deviation of the model response from the data. Approaches using state estimation reduce this problem, as they incorporate information from all observations (and not only the first) into the initial values for the intervals and therefore only require integration over shorter time intervals. The special appeal of our method is its technical simplicity, which gives any user the chance to implement it easily on their own. Moreover, as the approach is implemented in COPASI, the full model import and analysis functionality of COPASI can be used.
As the MSS objective function is not identical to the conventionally used least squares objective function, an MSS estimate will differ slightly from an LSQ estimate. This has been demonstrated for the Lotka-Volterra model, for which the MSS method shows a slightly higher bias and variance in a simulation study on pseudo data sets. We do not have a general result, but we think that the flatter curve of the MSS objective function makes optimization easier while also leading to a higher variance, as experimental errors may shift the minimum of a flatter curve more strongly. In any case, it is possible to use the MSS parameter estimate as an already close start value for a subsequent (e.g. gradient based) LSQ parameter estimation. As the computational cost of an optimization depends heavily on the landscape, this two-stage procedure has the advantage that the first optimization faces a landscape free of local minima and the second starts within the valley of attraction of the global minimum. An additional benefit of the COPASI implementation of our approach is easy switching between the MSS and the LSQ optimization.
Acknowledgement
The software implementation of the presented method benefited from the efforts of the whole COPASI development team, currently Stefan Hoops, Brian Klahn, Ursula Kummer, Pedro Mendes, Ettore Murabito, and Jürgen Pahle. Christoph Zimmer is supported by BIOMS, Frank Bergmann by Virtual Liver & NIH R01 GM070923 and Sven Sahle by the Klaus Tschira Stiftung. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
References
 [1] C.G. Moles, P. Mendes, and J.R. Banga. Parameter estimation in biochemical pathways: A comparison of global optimization methods. Genome Research, 13:2467–2474, 2003.
 [2] M. Rodriguez-Fernandez, J. A. Egea, and J. R. Banga. Novel metaheuristic for parameter estimation in nonlinear dynamic biological systems. BMC Bioinformatics, 7:483, 2006.

 [3] J. Kennedy and R. Eberhart. Particle swarm optimization. Proceedings of the Fourth IEEE International Conference on Neural Networks, Perth, Australia, 4(6):1942–1948, 1995.
 [4] H.G. Bock, E. Kostina, and J.P. Schlöder. Numerical methods for parameter estimation in nonlinear differential algebraic equations. GAMM Mitteilungen, 30(2):376–408, 2007.
 [5] D. W. Marquardt. An algorithm for least-squares estimation of nonlinear parameters. J. Soc. Indust. Appl. Math., 11:431–441, 1963.
 [6] Omid Ghasemi, Merry Lindsey, Tianyi Yang, Nguyen Nguyen, Yufei Huang, and Yu-Fang Jin. Bayesian parameter estimation for nonlinear modelling of biological pathways. BMC Systems Biology, 5(Suppl 3):S9, 2011.
 [7] Mark Girolami. Bayesian inference for differential equations. Theoretical Computer Science, 408:4–16, 2008.
 [8] J. Leander, T. Lundh, and M. Jirstrand. Stochastic differential equations as a tool to regularize the parameter estimation problem for continuous time dynamical systems given discrete time measurements. Mathematical Biosciences, 251, 2014.
 [9] C. Zimmer and S. Sahle. Parameter estimation for stochastic models of biochemical reactions. Journal of Computer Science & Systems Biology, 6:011–021, 2012.
 [10] S. Hoops, S. Sahle, R. Gauges, C. Lee, J. Pahle, N. Simus, et al. COPASI - a COmplex PAthway SImulator. Bioinformatics, 22(24):3067–3074, 2006.
 [11] M. Hucka, A. Finney, H. M. Sauro, H. Bolouri, J. C. Doyle, H. Kitano, A. P. Arkin, B. J. Bornstein, D. Bray, A. Cornish-Bowden, A. A. Cuellar, S. Dronov, E. D. Gilles, M. Ginkel, V. Gor, I. I. Goryanin, W. J. Hedley, T. C. Hodgman, J.-H. Hofmeyr, P. J. Hunter, N. S. Juty, J. L. Kasberger, A. Kremling, U. Kummer, N. Le Novère, L. M. Loew, D. Lucio, P. Mendes, E. Minch, E. D. Mjolsness, Y. Nakayama, M. R. Nelson, P. F. Nielsen, T. Sakurada, J. C. Schaff, B. E. Shapiro, T. S. Shimizu, H. D. Spence, J. Stelling, K. Takahashi, M. Tomita, J. Wagner, and J. Wang. The Systems Biology Markup Language (SBML): A medium for representation and exchange of biochemical network models. Bioinformatics, 19(4):524–531, 2003.
 [12] Michael Hucka, Frank T. Bergmann, Stephan Hoops, Sarah M. Keating, Sven Sahle, James C. Schaff, and Lucian P. Smith. The Systems Biology Markup Language (SBML): Language specification for Level 3 Version 1 core. Technical report, California Institute of Technology, October 2010.

 [13] C. Zimmer and S. Sahle. Deterministic inference for stochastic systems using multiple shooting and a linear noise approximation for the transition probabilities. IET Systems Biology, accepted.
 [14] J. Nagumo, S. Arimoto, and S. Yoshizawa. An active pulse transmission line simulating nerve axon. Proceedings of the IRE, pages 2061–2070, 1962.
 [15] R. FitzHugh. Impulses and physiological states in theoretical models of nerve membrane. Biophysical Journal, 1:445–466, 1961.
 [16] U. Kummer, B. Krajnc, J. Pahle, A.K. Green, C.J. Dixon, and M. Marhl. Transition from stochastic to deterministic behavior in calcium oscillations. Biophysical Journal, 89:1603–1611, 2005.