Towards Learning Stochastic Population Models by Gradient Descent

Kreikemeyer, Justin N. and Andelfinger, Philipp and Uhrmacher, Adelinde M. (2024) Towards Learning Stochastic Population Models by Gradient Descent. In: 38th ACM SIGSIM Principles of Advanced Discrete Simulation (PADS 2024), 24-26 Jun 2024, Atlanta, Georgia, USA. Proceedings, published by Association for Computing Machinery (ACM), New York, NY, USA, pp. 88-92.

Full text not available from this repository.
Official URL: https://dl.acm.org/doi/10.1145/3615979.3656058

Abstract

Increasing effort is being devoted to methods for learning mechanistic models from data. This task entails not only the accurate estimation of parameters but also the identification of a suitable model structure. Recent work on the discovery of dynamical systems formulates this problem as a linear equation system. Here, we explore several simulation-based optimization approaches, which allow much greater freedom in the formulation of the objective and impose weaker conditions on the available data. We show that, even for relatively small stochastic population models, the simultaneous estimation of parameters and structure poses major challenges for optimization procedures. In particular, we investigate the application of the local stochastic gradient descent method commonly used for training machine learning models. We demonstrate accurate estimation of models but find that enforcing the inference of parsimonious, interpretable models drastically increases the difficulty. We give an outlook on how this challenge can be overcome.
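To make the simulation-based optimization setting concrete, the following is a minimal, deliberately simplified sketch (not the authors' method): the rate parameter of a small population growth model is recovered by gradient descent on a loss computed from a simulated trajectory. The paper addresses the harder stochastic case and also infers model structure; here a deterministic Euler-discretized mean-field model and finite-difference gradients stand in for the differentiable simulation, and all parameter values are illustrative.

```python
# Hypothetical example: parameter estimation for a population model via
# gradient descent on a simulation-based objective. Simplified to a
# deterministic, single-parameter model for clarity.

def simulate(r, x0=10.0, dt=0.1, steps=50):
    """Euler discretization of the mean-field growth model dx/dt = r*x."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + dt * r * xs[-1])
    return xs

def loss(r, data):
    """Mean squared deviation between simulated and observed trajectories."""
    return sum((s - d) ** 2 for s, d in zip(simulate(r), data)) / len(data)

# Synthetic "observations" generated with a known true rate of 0.3.
data = simulate(0.3)

r, lr, eps = 0.1, 1e-5, 1e-6   # initial guess, step size, finite-diff width
for _ in range(2000):
    # Forward finite difference as a stand-in for an automatic gradient.
    grad = (loss(r + eps, data) - loss(r, data)) / eps
    r -= lr * grad

print(f"estimated rate: {r:.3f}")  # converges close to the true rate 0.3
```

The stochastic setting studied in the paper complicates exactly this step: when `simulate` draws random transitions, the loss is noisy and non-smooth, so obtaining useful gradients is itself a research question.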

Item Type: Conference or Workshop Item (Paper)
Projects: GrEASE, SODA