We Find Out Why Complex Systems Malfunction. On the Fly.
Beyond Machine Learning.
High Complexity = Vulnerability
Conventional means of risk assessment and management are not suited to a complex and turbulent context: they are subjective, and their results can be manipulated. There is a pressing need for more modern, objective, science-based means of dealing with uncertainty and complexity. This is exactly what we offer.
Modern IT and software-packed products, such as cars, aircraft, critical infrastructures, or the IoT, offer many examples of how complexity can become a nightmare. What makes these systems powerful also makes them fragile. Complexity is a next-generation risk which requires next-generation technology and approaches.
We detect anomalies without using machine learning for one very good reason: our clients don't have the luxury of multiple failures with which to train a piece of software to recognize them, or to establish rules.
Correlations play a central role in data and risk analysis, and well beyond. However, conventional linear correlations may deliver misleading results.
Ontonix offers an innovative generalized correlation, which takes into account the non-linear aspects of data. The method uses brand-new, next-generation AI technology which transforms data into images, emulating an expert looking at them. The system actually 'sees' correlations.
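To see why linear correlation can mislead, consider a minimal sketch (this is a generic illustration using distance correlation as an example of a non-linear dependence measure, not Ontonix's proprietary method): a perfect quadratic relationship yields a Pearson coefficient near zero, while a non-linear measure still detects the dependence.

```python
import numpy as np

def distance_correlation(x, y):
    """Distance correlation: zero only when x and y are independent."""
    # Pairwise distance matrices for each variable.
    a = np.abs(x[:, None] - x[None, :])
    b = np.abs(y[:, None] - y[None, :])
    # Double-center each matrix (subtract row/column means, add grand mean).
    A = a - a.mean(axis=0)[None, :] - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0)[None, :] - b.mean(axis=1)[:, None] + b.mean()
    dcov2 = (A * B).mean()                 # squared distance covariance
    dvar_x = (A * A).mean()
    dvar_y = (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

x = np.linspace(-1.0, 1.0, 201)
y = x ** 2                                 # deterministic, but non-linear

r = np.corrcoef(x, y)[0, 1]                # Pearson: essentially zero
dc = distance_correlation(x, y)            # clearly positive
```

A linear measure declares the two variables unrelated even though `y` is completely determined by `x`; the non-linear measure does not make that mistake.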
In 2015, Ontonix was the principal author of the world's first 'Business Complexity Assessment' standard, UNI 11613, published in Italy by UNI. The ISO 22375 standard on business complexity, which follows the UNI standard, was published in 2018.
In 2018 Ontonix launched a Complexity Monitoring Chip, developed in partnership with SAIC.
Solutions For a Complex World
Traditional technology is unable to deal with the immense complexity of ICT systems or critical infrastructures, which makes them vulnerable. High complexity is a nightmare.
Our solutions have been crafted specifically for:
pinpointing concentrations of vulnerability
extracting new knowledge from data
delivering crisis early warnings
The new science is based on model-free methods, which allow us to concentrate on solving real problems, not on building exotic mathematical constructs.
The most recent application of Quantitative Complexity Management is a futuristic Cyber Attack System, CODE, whose objective is to deliver a systemic, low-visibility strike that induces the collapse of enemy networks.
Complexity X Uncertainty = Fragility™
The above equation is the Principle of Fragility, coined by Ontonix in 2005. It reveals why, in an uncertain context, a highly sophisticated and complex business or infrastructure is more exposed, hence more vulnerable. As the uncertainty and turbulence of our world increase, simpler solutions are preferable because they are more resilient.
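The principle can be read as a simple product relationship. The sketch below uses hypothetical values in arbitrary units, purely to illustrate the reading of the formula; it is not an Ontonix measurement:

```python
def fragility(complexity, uncertainty):
    """Principle of Fragility read literally: Complexity x Uncertainty."""
    return complexity * uncertainty

# Two hypothetical systems operating under the same uncertainty.
uncertainty = 1.5                                   # arbitrary units
simple_system = fragility(2.0, uncertainty)         # low complexity
complex_system = fragility(8.0, uncertainty)        # high complexity
```

At equal uncertainty, the more complex system is the more fragile one; conversely, when uncertainty grows, only reducing complexity keeps fragility in check.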
Serious Science Starts When You Begin To Measure
Examples of Complexity Maps, which reveal the structure of complexity and its drivers.