About this project
In the late 1950s and early 1960s, two Russian scientists, Andrei Kolmogorov and Vladimir Arnold (top two pictures), published several papers proving that any continuous multivariate
function $M$ can be represented exactly by a particular tree of univariate functions $\Phi_q, \phi_{q,p}$:
$$ M(x_1, x_2, x_3, ... , x_n) = \sum_{q=0}^{2n} \Phi_q\left(\sum_{p=1}^{n} \phi_{q,p}(x_{p})\right). $$
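The nested structure of this formula can be illustrated with a short sketch. Purely for illustration, we assume each univariate function is piecewise linear (the theorem itself only requires continuity); the function names `make_pwl` and `ka_model` are our own, not from any published code:

```python
import numpy as np

def make_pwl(n_knots=8, lo=0.0, hi=1.0, seed=None):
    """A random piecewise-linear univariate function on [lo, hi].

    Piecewise-linear functions are one simple choice for the univariate
    building blocks; any continuous function would do.
    """
    rng = np.random.default_rng(seed)
    xs = np.linspace(lo, hi, n_knots)
    ys = rng.uniform(-1.0, 1.0, n_knots)
    return lambda x: np.interp(x, xs, ys)

def ka_model(n):
    """Assemble the Kolmogorov-Arnold structure for n input variables:
    2n+1 outer functions Phi_q and (2n+1)*n inner functions phi_{q,p}."""
    # Inner functions map [0, 1] -> [-1, 1]; their sums over p lie in
    # [-n, n], so the outer functions are defined on that interval.
    outer = [make_pwl(lo=-float(n), hi=float(n), seed=q)
             for q in range(2 * n + 1)]
    inner = [[make_pwl(seed=100 * q + p) for p in range(n)]
             for q in range(2 * n + 1)]

    def M(x):
        # M(x) = sum_q Phi_q( sum_p phi_{q,p}(x_p) )
        return sum(outer[q](sum(inner[q][p](x[p]) for p in range(n)))
                   for q in range(2 * n + 1))
    return M

# Example: a model of three variables evaluated at one point.
model = ka_model(3)
value = model([0.2, 0.5, 0.9])
```

The point of the sketch is only the shape of the computation: an outer sum over $2n+1$ branches, each applying one univariate function to a sum of $n$ univariate functions of the individual inputs.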
It was obvious from the beginning that this representation could be used in artificial intelligence (which went by other names back then), but, as
always, it took some time to find an effective way of training the model.
It was later found that this exact number of functions is not always necessary: the number of layers can be
chosen differently for different datasets. The model can also be used for data labeling and for solving partial differential
equations, and, as our research showed, it can be extended to stochastic modeling, where some functions in the representation are
probability densities.
One truly amazing and counterintuitive property of this representation is how easily and quickly the model can be
trained. The goal of this site is to show these methods to working software engineers, with or without previous
AI experience. The prerequisites are a bachelor's degree in a technical field and some experience with object-oriented programming.
The third and fourth pictures from the top show the other two scientists who made this new form of AI possible:
Pavel Urysohn and Stefan Kaczmarz.
The authors of this site, Andrew Polar and Mike Poluektov, also made some modest theoretical contributions in 2020, 2021 and 2023:
Discrete Urysohn Operator
Kolmogorov-Arnold Representation
Newton-Kaczmarz Method
but we suggest not jumping into these articles right away, because they may scare off anyone without a mathematical background.
This site explains what is published in these articles in a reader-friendly form.
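To give a flavor of why training can be cheap, here is a minimal sketch in the spirit of the Kaczmarz projection method, applied to the simplest additive piece of the representation, $\sum_p f_p(x_p) \approx y$, with each $f_p$ piecewise linear. Each training record touches only two knot values per predictor, so one update is very cheap. All names, parameters, and details below are our illustration, not the authors' published code:

```python
import numpy as np

def fit_additive_kaczmarz(X, y, n_knots=12, epochs=30, mu=0.5):
    """Fit y ~ sum_p f_p(x_p), each f_p piecewise linear on a knot grid.

    For every record, the prediction is linear in the few knot values it
    touches; a Kaczmarz-style projection spreads the residual over them.
    """
    n, d = X.shape
    lo, hi = X.min(axis=0), X.max(axis=0)
    U = np.zeros((d, n_knots))               # knot values of all f_p
    idx = np.arange(d)
    for _ in range(epochs):
        for i in range(n):
            # position of each x_p between its two neighbouring knots
            t = (X[i] - lo) / (hi - lo + 1e-12) * (n_knots - 1)
            k = np.minimum(t.astype(int), n_knots - 2)
            w = t - k                        # interpolation weights in [0, 1]
            pred = np.sum((1 - w) * U[idx, k] + w * U[idx, k + 1])
            r = y[i] - pred
            # project onto the hyperplane defined by this record
            step = mu * r / np.sum((1 - w) ** 2 + w ** 2)
            U[idx, k] += step * (1 - w)
            U[idx, k + 1] += step * w
    return lo, hi, U

def predict(lo, hi, U, X):
    """Evaluate the fitted additive model on rows of X."""
    d, n_knots = U.shape
    t = (X - lo) / (hi - lo + 1e-12) * (n_knots - 1)
    k = np.minimum(t.astype(int), n_knots - 2)
    w = t - k
    idx = np.arange(d)
    return np.sum((1 - w) * U[idx, k] + w * U[idx, k + 1], axis=-1)
```

After a few passes over the data the knot values settle, which is the intuition behind the fast training mentioned above; the full Kolmogorov-Arnold model adds outer functions on top of this additive core.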
A recent paper from MIT sparked interest in the problem or, to put it more accurately,
created a hype. Their solution is different; their code is also published, so users can compare and choose whichever fits
their particular needs better. Obviously, the entire body of theoretical research on the Kolmogorov-Arnold representation is immense; this site
narrows that huge pool down to practical applications and tries to present them in the friendliest possible form.
Contact
Andrew Polar  


