Ontonix Releases Fast Version of OntoNet™

Como, 22 February 2019. Ontonix launches a fast version of its Quantitative Complexity Management engine OntoNet™ - the OntoNetS™ - for the analysis of very large systems and processes. OntoNetS™ features an approximate algorithm which significantly reduces memory usage and accelerates computation. For example, in the case of 5000 variables the speed-up is a factor of 25, while with 10000 variables it is a factor of 32.

OntoNetS™ is currently available for Linux only.

“With such runtime reductions we are now in a position to tackle problems with thousands of variables even on a laptop,” said Dr. J. Marczyk, the Founder and President of Ontonix. “OntoNetS™ opens new possibilities in terms of real-time implementations, such as the monitoring of networks or IT systems,” he added. “The approximate algorithm produces results that have a correlation of nearly 97% with those of the original algorithm. The small difference is not only negligible, it is also irrelevant. In the spirit of L. Zadeh’s Principle of Incompatibility, which states that high precision is incompatible with high complexity, highly complex systems, by their very nature, are never precise and cannot be studied with high precision. Therefore, a small loss of precision in the analysis of highly complex systems is perfectly tolerable,” he concluded.
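The ~97% correlation figure quoted above can be understood as a Pearson correlation between the result sets produced by the exact and approximate algorithms. The sketch below is illustrative only: the "exact" and "approximate" values are synthetic stand-ins, not OntoNet™ output, and merely show how such an agreement measure is computed.

```python
# Illustrative sketch: quantifying agreement between an exact and an
# approximate algorithm via the Pearson correlation coefficient.
# The data here is synthetic, NOT actual OntoNet(TM) results.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical scores from the exact algorithm (5000 variables)
exact = rng.normal(loc=100.0, scale=15.0, size=5000)

# Hypothetical approximate results: exact values plus a small perturbation
approx = exact + rng.normal(loc=0.0, scale=4.0, size=5000)

# Pearson correlation between the two result sets
r = np.corrcoef(exact, approx)[0, 1]
print(f"correlation: {r:.3f}")
```

With a small perturbation relative to the spread of the data, the correlation comes out close to 1, which is the sense in which a 97% agreement leaves the results practically interchangeable.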

Jacek Marczyk

Visionary, scientist, businessman and writer with over 35 years of experience in quantitative large-scale Uncertainty and Complexity Management in diverse fields (manufacturing, finance, economics).

Author of nine books on simulation, uncertainty and complexity management, and rating.

In the mid-1990s developed the theory of eigenvalue orbits, a generalization of the concept of an eigenvalue.

In 2000-2005 developed the first Quantitative Complexity Theory (QCT), including a comprehensive measure of complexity.

Founded Ontonix Complexity Management in 2005 in the USA and launched in 2006 the first commercial system for measuring and managing complexity: OntoSpace.

In 2007 launched the first on-line Resilience Rating for businesses, an objective and transparent rating system.

In 2009 delivered real-time technology to measure the complexity and stability of patients during surgery or stays in Intensive Care Units.

Developed a new theory of risk and rating, published in 2009 in a book entitled "A New Theory of Risk and Rating".

Over the last decade has developed quantitative complexity management (QCM) technology and solutions for applications in economics, finance, risk rating and management, as well as asset management and medicine. In the past five years has worked towards the democratization of ratings.

In 2013 he founded London-based Assetdyne, which focuses on the design of complexity-based high-performance portfolios and complexity-based asset allocation and asset management.

He is currently focused on creating a new rating agency and a fund that will be managed via complexity technology (QCT).