Data analysis and artificial intelligence have a problem. Despite the transformative power they hold, as evidenced by many successful projects in recent years, the field is plagued by a high failure rate.
The exact scale of the problem is up for debate – Gartner has estimated that 85% of big data projects fail to deliver, while a talk at Transform 2019 between Deborah Leff, IBM CTO for Data Science and AI, and Chris Chapo, Gap SVP of Data and Analytics, put the figure at 87% of data science projects never making it into production.
Whatever the exact statistic, there is a clear gulf between the potential offered by data and AI and what they actually deliver – a gulf that too frequently means organisations lose their investments.
To better understand this disparity and explore how organisations can close it, DIGIT spoke with Professor Stephen McArthur, CTO and Co-founder of Bellrock Technology.
Proof of Value
Perhaps the key factor behind the gap, according to McArthur, is the failure of many projects to provide an adequate proof of value.
As a professor at the University of Strathclyde, McArthur works to develop proofs of concept to show how advances in data science and AI could deliver value in the electricity industry.
“I had the same problem many times,” he says. “We would create analytics that showed you could take data and predict when something was going to fail. But we could never get that proof of concept to business value because the companies didn’t have the time, resources, and the ability to connect up this AI solution to their systems.”
McArthur notes that when companies become interested in the potential held by data solutions, senior management's first instinct is to immediately bring in data scientists. However, this can merely exacerbate the problem.
“There are people that are very skilled at building data science models and AI techniques that can help industry and commerce, but they don’t tend to be skilled in turning it into a secure, authenticated, encrypted industry strength solution,” he explains.
Leadership, on the other hand, tends not to understand the data component, leaving data projects for IT teams to handle.
As such, it is vital that data projects be built with a proof of value incorporated into them from the beginning. If they start off simply as a novel science experiment, a solution in search of a problem, then they are more likely to fail.
When creating a proof of value, there are several considerations. Most of these revolve around designing the system to be simple, adaptable, and easy to use. For example, adding an intelligent and accessible front-end is vital to ensure that business users can get value out of it.
Simplicity, paired with the ability to explain the system to other people, ensures that everyone in an organisation can understand it and how to use it.
“You need systems to be able to explain how they came up with their advice,” McArthur says. “Not in a way that a data scientist understands but in a way that the person digesting the information can understand.”
Part of what makes explainability so important goes back to the divide between management and data science. Too often, the algorithms that define an AI are contained within a black box, leaving even the people who created them in the dark.
Ensuring that everyone in an organisation can understand what a system is doing helps them understand why it is doing it. With this, management can ensure the data project fits into the group’s operations.
Furthermore, with a simple, automated, and explainable system, analysis is open to everyone in an organisation, not just people with data science degrees.
“Anyone from the organisation might be coming up with analytics to help others,” McArthur says. “You want to make sure there’s a way of getting those into your system and exposing them to others.”
The Power to Change
Adaptability is another key element of a valuable data project, and a big part of that is the ability to connect the system to live data.
“People need to think about how to make it as easy as possible to get new analytics or improve and update those existing analytics, and constantly deliver new use cases with the analytics,” McArthur says.
While most data projects will utilise an organisation's historical data, this only reveals insights about what has previously happened. The real power of analysis and prediction comes when the system uses up-to-date information delivered in real time.
However, if the ability to use live data is not incorporated from the beginning, it can be difficult to adapt it and create a valuable business solution.
“That might require a whole new team with cloud development skills,” McArthur explains. “It might mean getting a team that’s already very busy in IT to build it, so it goes on a shelf and waits until it’s ready.
“Either way, it introduces cost and delay, or they just can’t justify the cost to turn it into a full solution because the business value is not strong enough.”
Furthermore, the power of live data means that the AI can adapt, creating lifelong learning algorithms.
The system can use new information to refine its analysis, create new predictions, or be used to automate additional processes. This increases the lifespan, and the return on investment, of the product.
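The idea of a model refining itself with each new reading can be sketched in a few lines. The example below is illustrative only – a minimal online forecaster, not Lumen's actual product or API – showing how a prediction is updated in place as live data arrives, rather than the model being trained once on historical data.

```python
class OnlineForecaster:
    """Illustrative online learner: an exponentially weighted
    moving average that refines its estimate with every new
    reading from a live data stream."""

    def __init__(self, alpha: float = 0.5):
        self.alpha = alpha      # weight given to the newest reading
        self.estimate = None    # current prediction (None until first reading)

    def update(self, reading: float) -> float:
        """Fold one live reading into the model and return the new estimate."""
        if self.estimate is None:
            self.estimate = reading
        else:
            self.estimate = self.alpha * reading + (1 - self.alpha) * self.estimate
        return self.estimate


# Each arriving data point refines the prediction in place –
# the "lifelong learning" loop in miniature.
forecaster = OnlineForecaster(alpha=0.5)
for reading in [10.0, 12.0, 11.0, 13.0]:
    estimate = forecaster.update(reading)
```

A production system would use a far richer model, but the shape is the same: the update step runs continuously against the live feed, so the analytics stay current without a retraining project.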
In the Cloud
To access these live data streams, the cloud provides an ideal solution. Data can be drawn from multiple sources and fed into the system in real time.
Furthermore, operating in the cloud helps prevent data and analysis becoming isolated from each other. There is no point having the analytics on one employee’s laptop in the field while the data is spread out on a variety of thumb drives.
“Companies can get economies of scale as you can run up as much processing, or down, as you need,” McArthur says. “Some of the data algorithms are very large, and where there are huge volumes of data, this can be very expensive to run and process. But it’s even more expensive if you have to build all that as your own infrastructure.”
But the real benefit of cloud infrastructure, he adds, is the ability to use standard tools, interfaces, and protocols for interoperability. This helps organisations connect all their different data sources to the system.
“You might want to inject some offline data, you might have some real time data stored in your organisation you need to put in and you might want to attach that to publicly available data,” he explains.
As part of his work at the university, McArthur developed an AI system to connect data with analytics. This ultimately led to the creation of Bellrock Technology and its Lumen product.
“What Lumen does, as a product, is make sure your proof of value is effective and immediately operational, so that you’re not having to think about how to build the next stage.”
The cloud-based SaaS system automates the creation of the application. This means that companies without dedicated IT or software teams can use it to develop their data projects.
With the data attached to it, and the analytics specified, Lumen works out which data to connect with the analytics and when to make the outputs available to users.
For companies looking to initiate their first data projects, focusing on a proof of value and the creation of a minimum viable product is key to ensuring their success.
“If you just build a proof of concept without knowing how to reach the next stage, or if there are uncontrolled costs to do that, you’re never going to go beyond that proof of concept,” McArthur says.
“So, think about the end-to-end use case you’re trying to prove and make sure it’s of value to the business.
“Then you can scale up, and then consider how you’re going to construct it to the point where it can be used by the people who can get value from it – make sure you think of that at the early stages of a project.”