Como, 22 February 2019. Ontonix launches a fast version of its Quantitative Complexity Management engine OntoNet™ - OntoNetS™ - for the analysis of very large systems and processes. OntoNetS™ features an approximate algorithm which significantly reduces memory usage and accelerates computation. For example, with 5000 variables the speed-up is 25x, while with 10000 variables it is 32x.
At present, only a Linux version is available.
“With such a runtime reduction we are now in a position to tackle problems with thousands of variables even on a laptop,” said Dr. J. Marczyk, Founder and President of Ontonix. “OntoNetS™ opens new possibilities in terms of real-time applications, such as the monitoring of networks or IT systems,” he added. “The results of the approximate algorithm have a correlation of nearly 97% with those of the original one. The small difference is not only negligible, it is also irrelevant. In the spirit of L. Zadeh’s Principle of Incompatibility, which states that high precision is incompatible with high complexity, highly complex systems are, by their very nature, never precise and cannot be studied with high precision. A small loss of precision in the analysis of highly complex systems is therefore perfectly tolerable,” he concluded.
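To give a sense of what a ~97% correlation between approximate and exact results means in practice, the sketch below computes a Pearson correlation between two synthetic result vectors, one of which is the other plus small perturbation noise. This is purely illustrative: the data and the noise level are invented for the example and have no connection to OntoNet™'s actual outputs or algorithm.

```python
# Illustrative only: synthetic data, NOT Ontonix code or results.
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(42)
# Hypothetical "exact" results for 1000 quantities.
exact = [random.gauss(0.0, 1.0) for _ in range(1000)]
# Hypothetical "approximate" results: exact values plus small noise.
approx = [v + random.gauss(0.0, 0.25) for v in exact]

r = pearson(exact, approx)
print(f"correlation: {r:.3f}")
```

With noise whose standard deviation is a quarter of the signal's, the correlation comes out close to 0.97, i.e. the two sets of results rank and scale the quantities almost identically even though individual values differ slightly.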