Entropy-metric estimation of small data models with stochastic parameters

Heliyon. 2024 Jan 17;10(2):e24708. doi: 10.1016/j.heliyon.2024.e24708. eCollection 2024 Jan 30.

Abstract

Formalizing dependencies between datasets under specific hypotheses about data properties is a persistently relevant task, and one that is especially acute for small data. The aim of this study is to formalize a procedure for computing optimal estimates of the probability density functions of the parameters of linear and nonlinear, dynamic and static small data models built under specific hypotheses about the properties of the studied object. The research methodology draws on probability theory and mathematical statistics, information theory, estimation theory, and methods of stochastic mathematical programming. The mathematical apparatus presented in the article rests on the principle of maximizing information entropy over sets determined from a small number of censored measurements of "input" and "output" entities in the presence of noise. These data structures form the basis for linear and nonlinear, dynamic and static small data models with stochastic parameters, covering both controlled and noise-affected input and output measurement entities. For each of these model variants, the problem of determining optimal estimates of the parameters' probability density functions is posed, and the resulting optimization problems are reduced to the canonical form of a stochastic linear programming problem with probabilistic constraints.
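The entropy-maximization idea underlying the abstract can be illustrated on a toy discrete case: choose the probability mass function that maximizes Shannon entropy subject to constraints implied by measurements. The sketch below is an assumption-laden simplification, not the paper's method: the support, the single mean constraint, and all variable names (`support`, `target_mean`, `neg_entropy`) are hypothetical, and the paper's censored, noisy measurement sets and probabilistic constraints are replaced by one exact moment constraint.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical illustration of the maximum-entropy principle:
# estimate a discrete probability mass function over a small support
# by maximizing Shannon entropy subject to a measured-mean constraint.
# Support points and target mean are invented for this sketch.
support = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # candidate parameter values
target_mean = 3.4                              # stand-in for a measurement

def neg_entropy(p):
    # Negative Shannon entropy; small epsilon guards against log(0).
    return np.sum(p * np.log(p + 1e-12))

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},            # normalization
    {"type": "eq", "fun": lambda p: support @ p - target_mean},  # mean constraint
]
bounds = [(0.0, 1.0)] * len(support)
p0 = np.full(len(support), 1.0 / len(support))  # start from uniform

res = minimize(neg_entropy, p0, method="SLSQP",
               bounds=bounds, constraints=constraints)
p_opt = res.x  # maximum-entropy pmf consistent with the constraints
```

Because the target mean (3.4) exceeds the uniform mean (3.0), the resulting maximum-entropy distribution tilts toward the larger support points; with only moment constraints, the solution belongs to an exponential family. The paper's setting replaces these exact equalities with probabilistic constraints, which is what leads to stochastic linear programming formulations.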

Keywords: Dynamic stochastic model; Information entropy; Machine learning; Parametric optimization; Probability density functions estimation; Small data model; Static stochastic model.