Solutions that work when the floodgates open
The explosion of social media over the last decade presents both threats and opportunities to industries where real-time events, consumer sentiment and the accelerating news cycle have a material impact on their activities. Obvious examples are news organisations, public relations firms, financial institutions and the security industry.
Working with this vast, evolving corpus of information requires cutting-edge technology and algorithms. We have deep experience using AI techniques to transform streams of “big data” into streams of relevant knowledge.
We have developed a platform that derives business-critical knowledge from news and social media in real time
We were invited to develop a platform for a start-up that delivers precision intelligence to firms needing rapid insight into real-world events. We developed a highly distributed data-gathering platform that collects news and social media conversations pertinent to our client’s business and stores the data in a graph database. This storage format lays bare the relationships between events – providing the context that is critical to understanding events in our hyper-connected world.
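The idea of an event graph can be sketched with plain Python dictionaries. This is a minimal illustration, not the platform itself, which used a distributed graph database; the node attributes and edge semantics here are assumptions.

```python
from collections import defaultdict

class EventGraph:
    """Toy graph store: events are nodes, shared context forms edges."""

    def __init__(self):
        self.nodes = {}                  # event id -> attributes
        self.edges = defaultdict(set)    # event id -> related event ids

    def add_event(self, event_id, **attrs):
        self.nodes[event_id] = attrs

    def relate(self, a, b):
        # Undirected edge: the two events share some context
        # (an entity, a place, a topic).
        self.edges[a].add(b)
        self.edges[b].add(a)

    def context(self, event_id):
        """Return the neighbouring events that give this one its context."""
        return {n: self.nodes[n] for n in self.edges[event_id]}

# Illustrative usage: a news report and a social-media post about the
# same (hypothetical) merger become linked nodes.
g = EventGraph()
g.add_event("e1", source="news", topic="merger")
g.add_event("e2", source="social", topic="merger")
g.relate("e1", "e2")
```

Querying `g.context("e1")` surfaces the related social-media conversation – the kind of relationship a graph store makes cheap to traverse.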
Our data scientists developed algorithms to learn from this evolving graph, inferring stories as they are discussed and predicting their relevance to – and impact on – different business domains. The platform is cloud-hosted and delivers its knowledge via a web application to end customers.
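One simple way such a graph can yield "stories" is to cluster posts that are linked through shared entities. The sketch below uses connected components as a stand-in for the platform's inference algorithms, which the source does not specify; the input format is an assumption.

```python
from collections import defaultdict

def infer_stories(mentions):
    """Cluster posts into 'stories': two posts belong to the same story
    if they are connected through a chain of shared entity mentions.
    `mentions` maps post id -> set of entity names (illustrative input)."""
    # Invert the mapping: which posts mention each entity.
    by_entity = defaultdict(set)
    for post, entities in mentions.items():
        for entity in entities:
            by_entity[entity].add(post)

    # Breadth-first search over the implicit post-entity-post graph.
    seen, stories = set(), []
    for post in mentions:
        if post in seen:
            continue
        story, frontier = set(), [post]
        while frontier:
            p = frontier.pop()
            if p in story:
                continue
            story.add(p)
            for entity in mentions[p]:
                frontier.extend(by_entity[entity] - story)
        seen |= story
        stories.append(story)
    return stories

# Hypothetical posts: two mention "AcmeCorp", two mention "merger",
# one is unrelated weather chatter.
posts = {
    "p1": {"AcmeCorp"},
    "p2": {"AcmeCorp", "merger"},
    "p3": {"merger"},
    "p4": {"weather"},
}
stories = infer_stories(posts)
```

Here `p1`, `p2` and `p3` collapse into one story even though `p1` and `p3` share no entity directly; `p2` bridges them. Predicting a story's relevance to a business domain would then be a scoring step on top of these clusters.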
We designed algorithms to detect malicious behaviour across the entire IT estate of a global financial institution
Our client had launched a major new cyber-security initiative with the objective of building a central nervous system capable of detecting and correlating threats occurring anywhere in their IT systems in real time.
Our data scientists were invited to help develop the algorithms that would sift billions of incoming data points every day for anomalies and threats. Multiple independent “detector” algorithms wrote their findings into a distributed graph database. A second set of algorithms then processed this graph, making connections between seemingly unrelated entities or events. In this way billions of messages each day could be compressed into mere hundreds of alerts for specialist analysts to follow up.
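The two-stage shape of this pipeline – independent detectors writing findings into a shared store, then a correlation pass that compresses them into alerts – can be sketched as follows. The detector rules, thresholds and field names are invented for illustration and are not the client's.

```python
from collections import defaultdict

def run_detectors(events, detectors):
    """Stage 1: each independent detector tags the entities it flags,
    writing into a shared findings store (a stand-in for the graph DB)."""
    findings = defaultdict(set)
    for name, detect in detectors.items():
        for event in events:
            if detect(event):
                findings[event["entity"]].add(name)
    return findings

def correlate(findings, threshold=2):
    """Stage 2: raise an alert only when several independent detectors
    implicate the same entity, compressing many findings into few alerts."""
    return {entity: tags for entity, tags in findings.items()
            if len(tags) >= threshold}

# Hypothetical detectors: unusual login volume, and a login from an
# unexpected country.
detectors = {
    "volume": lambda e: e["logins"] > 100,
    "geo":    lambda e: e["country"] not in e["usual_countries"],
}
events = [
    {"entity": "acct-7", "logins": 250, "country": "XX",
     "usual_countries": {"GB"}},
    {"entity": "acct-9", "logins": 3, "country": "GB",
     "usual_countries": {"GB"}},
]
alerts = correlate(run_detectors(events, detectors))
```

Only `acct-7`, flagged by both detectors, produces an alert; the benign account is filtered out. At scale, this corroboration step is what turns billions of raw messages into a few hundred actionable alerts.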