Determinism and dynamism
Let's put this in context. I am in the very early stages of learning machine learning, and going through some of the fundamental, very simple concepts behind neural nets.
There is an interesting question that we humans tend to ponder: what are the essentials of an “intelligent” system?
If you look back at the decades prior to the 2020s, a lot of things we could call preludes to machine learning had already been made in the 1980s. There is a method called “simulated annealing”, which borrows the concept of heat from Boltzmann statistics. Compute (the amount of CPU power) was not yet ripe for some of the extreme computation needed in modern, vision-based systems, for example.
Nowadays we have been reaping the benefits of Moore's law, and thus in 2024 there is immense compute packed into the area of a single fingernail. In the 1980s, supercomputers ran industrial solutions. We could roughly say that with the development of computer science and algorithmic methods, plus the development of hardware (silicon), machine learning has entered a fruitful new era.
What is this simulated annealing thing? Annealing is a metallurgical process, in which the atoms of an alloy settle into a particular, rather rigid conformation. Think of a melt: the alloy exists in liquid form, but when you let it cool down, it becomes solid metal.
Regarding the “heat” in a computational Boltzmann machine, I don't have all the details of the Boltzmann machine, but there are certainly good articles out there. One example here, on Medium: A Complete Guide to Boltzmann Machine — Deep Learning | by Soumallya Bishayee | Medium
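To make the annealing idea concrete, here is a minimal sketch in Python (the energy function, step size, and cooling schedule are toy assumptions of mine, not taken from any particular paper). The key trick is that worse moves are sometimes accepted, with probability exp(-ΔE/T), so the search can escape local minima while the “temperature” is still high:

```python
import math
import random

# Toy energy function (an assumption for illustration):
# we look for the x that minimizes it, roughly around x = 3.
def energy(x):
    return (x - 3) ** 2 + math.sin(5 * x)

def simulated_annealing(start=0.0, t_start=10.0, t_end=0.01, cooling=0.95):
    x = start
    t = t_start
    while t > t_end:
        # Propose a small random move.
        candidate = x + random.uniform(-0.5, 0.5)
        delta = energy(candidate) - energy(x)
        # Always accept improvements; accept worse moves with a
        # probability that shrinks as the temperature drops.
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate
        t *= cooling  # cool down, like the alloy solidifying
    return x

print(simulated_annealing())  # ends up somewhere near 3
```

As the temperature drops, accepting a worse move becomes rare, and the solution “freezes” into place, much like the cooling alloy.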
Determinism is the result of classical programming: when you express logic as IF/THEN statements, you have a limited (finite) number of states the program can end up in. These correspond to the outputs, the results of that program.
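As a trivially small illustration (my own toy example, not from any particular codebase): no matter what input arrives, a classical program like this can only ever land in one of the states the programmer wrote down in advance.

```python
# A classical, fully deterministic program: every input maps to one
# of a fixed, finite set of outcomes that the programmer enumerated.
def classify_temperature(celsius: float) -> str:
    if celsius < 0:
        return "freezing"
    elif celsius < 20:
        return "cool"
    else:
        return "warm"

# No input can ever produce a fourth state; the program cannot
# invent a new category like "scorching" without a code change.
print(classify_temperature(25))  # warm
```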
But there is one catch: a traditional program basically cannot learn, in the full sense of the word. A traditional program is limited to what the programmer “saw coming” at the time the program was built (released).
The program's environment and its features must have been conceived by the programmer before the app is set to run and deal with the world.
This distinction between dynamism and determinism is perhaps a philosophical and nuanced one, but it really sets machine learning apart from traditional finite-state programs. Traditional programs need quite labor-intensive changes to accommodate new situations. I think what the AI and machine learning paradigms are aiming at is to reduce this exhausting maintenance, needed just to keep a program running without errors.
Machine learning learns to do classifications, or to react to changes in the environment. The “environment” can be something really tangible, like the visible world: think of an automated vision system coupled to a CCTV camera, which recognizes the registration plates of cars and decides whether to lift the gate or show the driver instructions to pay the parking bill first.
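As a rough sketch of how such a system might be wired together (the recognize_plate function and the paid-plates set below are hypothetical placeholders of mine; a real system would call a trained vision model here):

```python
# Hypothetical gate controller: the learned part (plate recognition)
# is a black box here; only the surrounding decision logic is shown.
PAID_PLATES = {"ABC-123", "XYZ-789"}  # stand-in for a payment database

def recognize_plate(camera_frame) -> str:
    # Placeholder: a real system would run a trained vision model
    # (for example a convolutional network) on the camera frame.
    return "ABC-123"

def handle_car(camera_frame) -> str:
    plate = recognize_plate(camera_frame)
    if plate in PAID_PLATES:
        return "lift gate"
    return "show payment instructions"

print(handle_car(camera_frame=None))  # lift gate
```

Note that only the plate recognizer is the learned part; the surrounding gate logic is still plain, deterministic IF/THEN code.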
In machine learning, a model is first taught by showing it a lot of data (cases). These cases are presented over repeated cycles called training. During training, the model adjusts itself. I wrote earlier about these adjustments. Also, if you want to look up other writings about different kinds of machine learning setups, and the methods to use in particular situations, take a look at the scientific literature around machine learning.
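To show what “the model adjusts itself” means in practice, here is a minimal toy training loop (the data and the one-parameter model are assumptions of mine, purely for illustration): each pass over the cases nudges the weight so the model's guesses fit the data a little better.

```python
# Toy training loop: fit y = w * x to a handful of cases by
# nudging w a little after every example (gradient descent).
cases = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (x, y) pairs; true w is 2

w = 0.0               # the model's single adjustable weight
learning_rate = 0.05

for epoch in range(50):          # the training cycle
    for x, y in cases:
        prediction = w * x
        error = prediction - y
        # Adjust the weight against the gradient of the squared error.
        w -= learning_rate * error * x

print(w)  # close to 2.0 after training
```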