We Find Out Why Complex Systems Malfunction. On the Fly.
Beyond Machine Learning.
High Complexity = Vulnerability
Highly complex systems are inherently fragile because they can fail in countless, often non-intuitive ways. In effect, high complexity implies high vulnerability: it is a next-generation risk that can only be countered with a fundamentally new technology and a new approach. This is exactly what we offer.
Modern IT systems and software-packed products, such as cars, aircraft, critical infrastructure, or weapon systems, are highly complex and offer many examples of how complexity can become a nightmare. What makes these systems powerful also makes them fragile.
With our technology we detect anomalies without using machine learning, for one very good reason: our clients don’t have the luxury of multiple failures from which a piece of software could learn to recognize them or derive rules. Our tools catch anomalies without having seen them first. Sometimes, you must get it right the first and only time!
Ontonix offers an innovative generalized correlation, which takes into account the non-linear aspects of data. The method uses brand-new, next-generation AI technology that transforms data into images, emulating an expert looking at them. The system actually ‘sees’ correlations.
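Ontonix’s method itself is proprietary, but the general idea of a correlation measure that captures non-linear dependence can be illustrated with a well-known public statistic. The sketch below implements distance correlation (Székely et al.) in pure Python; unlike Pearson’s r, it is zero only when two variables are independent. This is a generic illustration of the concept, not Ontonix’s algorithm.

```python
import math

def distance_correlation(xs, ys):
    """Distance correlation: detects non-linear dependence that
    Pearson's r can miss entirely (r can be 0 for y = x**2)."""
    n = len(xs)
    # Pairwise distance matrices for each variable
    a = [[abs(xs[i] - xs[j]) for j in range(n)] for i in range(n)]
    b = [[abs(ys[i] - ys[j]) for j in range(n)] for i in range(n)]

    def double_center(m):
        # Subtract row means and column means, add back the grand mean
        row = [sum(r) / n for r in m]
        col = [sum(m[i][j] for i in range(n)) / n for j in range(n)]
        grand = sum(row) / n
        return [[m[i][j] - row[i] - col[j] + grand for j in range(n)]
                for i in range(n)]

    A, B = double_center(a), double_center(b)
    dcov2 = sum(A[i][j] * B[i][j] for i in range(n) for j in range(n)) / n**2
    dvar_x = sum(v * v for r in A for v in r) / n**2
    dvar_y = sum(v * v for r in B for v in r) / n**2
    if dvar_x * dvar_y == 0:
        return 0.0
    return math.sqrt(dcov2 / math.sqrt(dvar_x * dvar_y))
```

For y = x² sampled on a range symmetric about zero, Pearson’s correlation is essentially zero while distance correlation remains clearly positive; that is exactly the kind of non-linear relationship a generalized correlation is meant to expose.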
In 2018 Ontonix launched a Complexity Monitoring Chip, developed in partnership with SAIC.
The chip, housed in a ruggedized enclosure, processes data from a CAN bus and delivers real-time early warnings of failures, anomalies and systemic collapses.
Solutions For a Complex World
Traditional technology is unable to deal with the immense complexity of ICT systems or critical infrastructures because complexity is not taken into account when these systems are designed. Since 2005, however, the complexity of any system can be measured. Our Quantitative Complexity Management solutions have been crafted specifically for:
pinpointing concentrations of vulnerability in highly complex systems
delivering collapse or malfunction early warnings in highly complex systems
extracting new knowledge from data
The new complexity science is based on model-free methods, which allow us to concentrate on solving real problems, not on building exotic mathematical constructs.
The most recent application of Quantitative Complexity Management technology is a futuristic Offensive Cyber Operations (OCO) system, CODE, whose goal is to deliver a systemic low-visibility strike that induces the collapse of enemy networks.
Complexity X Uncertainty = Fragility™
The above equation is the Principle of Fragility, coined by Ontonix in 2005. It reveals why, in an uncertain context, a highly sophisticated and complex system is more exposed, hence more vulnerable. As uncertainty increases, simpler solutions are preferable because they are inherently more resilient.
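The principle can be illustrated numerically. The toy sketch below assumes complexity and uncertainty have been reduced to scalar scores (the numbers and score scales are invented for illustration); it shows that as uncertainty rises, the fragility gap between a complex design and a simpler one delivering the same function keeps widening.

```python
def fragility(complexity, uncertainty):
    # Principle of Fragility: fragility grows with both complexity
    # and the uncertainty of the operating environment.
    return complexity * uncertainty

# Two hypothetical designs for the same function (scores are made up):
complex_design = 80.0   # feature-rich, highly interdependent
simple_design = 20.0    # stripped-down alternative

for uncertainty in (0.1, 0.5, 1.0):  # increasingly turbulent environment
    gap = (fragility(complex_design, uncertainty)
           - fragility(simple_design, uncertainty))
    print(f"uncertainty={uncertainty}: fragility gap = {gap:.1f}")
```

At low uncertainty the two designs are nearly interchangeable; at high uncertainty the complex design is several times more fragile, which is why simpler solutions win in turbulent environments.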