If you want to see data put to work, look no further than artificial intelligence and machine learning – never before has timely, accurate data been so critical.
A recent survey released by Narrative Science finds 61 percent of enterprises have already implemented AI within their businesses. At least 71 percent said AI was part of their enterprises’ innovation strategies. At the same time, 90 percent of respondents in the business intelligence function reported that they would be interested in incorporating AI to make their data and analytics tools smarter.
Yet AI and machine learning are not overnight projects – they take considerable time and expertise to get rolling, and they need constant refreshing and updating. So expect a great deal of work in this area in the year ahead.
That is one of the takeaways from Jim Kobielus, analyst with SiliconAngle/Wikibon and one of the industry’s top data analysts, who recently provided his insights on what to expect in the year ahead in an interview with TechTarget’s Jack Vaughn.
While AI has been around for decades, what has changed in recent years is that AI “has shifted away from fixed, declarative, rule-based systems toward statistical, probabilistic, data-driven systems,” Kobielus says. Machine learning is also helping to reshape the AI value proposition, employing “algorithms to infer correlations and patterns in data sets” for predictive analysis, speech recognition, and other solutions. In addition, neural networks are also on the rise, fueled by the fact there “is much more data,” Kobielus points out.
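The shift Kobielus describes, from hard-coded rules to models that infer patterns from data, can be made concrete with a toy sketch (illustrative only, not from the interview): a one-variable least-squares fit that "learns" the relationship in observed (x, y) pairs rather than having it declared up front.

```python
# A rule-based system hard-codes its logic; a data-driven one infers it.
# Here a least-squares fit discovers the slope and intercept that best
# explain the observed data -- a minimal stand-in for the statistical,
# data-driven systems described above.

def fit_linear(xs, ys):
    """Return (slope, intercept) minimizing squared error over the data."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

# Data generated by y = 2x + 1; the fit recovers that pattern from data alone.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [3.0, 5.0, 7.0, 9.0, 11.0]
slope, intercept = fit_linear(xs, ys)
```

Nothing in the code states the rule "y = 2x + 1"; the model infers it, which is the essence of the value proposition Kobielus outlines.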
Kobielus sees deep learning as the next frontier – “machine learning with more processing layers, more neural layers, able to infer higher level abstractions of the data.”
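The "more processing layers" idea can be sketched in a few lines (a simplified illustration with hand-picked weights, not a trained network): each layer applies a weighted sum and a nonlinearity to the previous layer's output, so stacked layers build progressively higher-level representations of the input.

```python
import math

def layer(inputs, weights, biases):
    """One neural layer: weighted sums followed by a tanh nonlinearity."""
    return [
        math.tanh(sum(w + 0.0 == w and w * x or w * x for w, x in zip(row, inputs)) + b)
        for row, b in zip(weights, biases)
    ]

def forward(x, layers):
    """Stack layers: each one re-abstracts the previous layer's output."""
    for weights, biases in layers:
        x = layer(x, weights, biases)
    return x

# A tiny two-layer network: 2 inputs -> 3 hidden units -> 1 output.
# The weights here are arbitrary; in practice they are learned by training.
net = [
    ([[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]], [0.0, 0.1, -0.1]),
    ([[1.0, -1.0, 0.5]], [0.0]),
]
out = forward([1.0, 2.0], net)
```

"Deeper" simply means more entries in `net`; each added layer lets the model infer a higher level of abstraction, at the cost of more data and compute to train.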
AI, machine learning and deep learning systems will take time to fully develop. “It’s not enough to build the algorithms; you have to train them to make sure they are fit for the purpose for which they have been built. And training is tough work. You have to prepare the data — that’s no easy feat. Three-quarters of the effort in building out AI involves acquiring and preparing the data to do the training and so forth. The data sets are huge, and they run on distributed clusters. Often, Hadoop and NoSQL are involved. It costs money to deploy all that.”
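What that data-preparation effort looks like in miniature (an illustrative sketch; the field names and steps are hypothetical, not from the interview): dropping incomplete records, coercing types, and scaling features before any training can begin.

```python
def prepare(records):
    """Typical prep steps: drop incomplete rows, coerce types, scale a feature."""
    # 1. Drop rows missing required fields.
    rows = [
        r for r in records
        if r.get("age") is not None and r.get("income") is not None
    ]
    # 2. Coerce types (raw feeds often deliver numbers as strings).
    ages = [float(r["age"]) for r in rows]
    # 3. Min-max scale the feature into [0, 1] for training.
    lo, hi = min(ages), max(ages)
    span = (hi - lo) or 1.0  # guard against a constant column
    for r, a in zip(rows, ages):
        r["age_scaled"] = (a - lo) / span
    return rows

raw = [
    {"age": "34", "income": 52000},
    {"age": None, "income": 61000},  # dropped: missing age
    {"age": "58", "income": 47000},
]
clean = prepare(raw)
```

Multiply steps like these across huge, distributed data sets and the "three-quarters of the effort" figure becomes easy to believe.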
Model decay – an age-old problem – gets even more intense in an AI-driven environment, Kobielus also cautions. “You have to keep re-evaluating and retraining the AI models you have deployed. Models become less predictive over time. That’s simply because the world changes. The model behind predicting an item a customer may have clicked on three years ago in your e-commerce portal may not be as predictive anymore.”
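The re-evaluation loop Kobielus describes can be automated. A minimal sketch (an assumed monitoring pattern, not a specific product): track the live model's recent hit rate against its accuracy at deployment time, and flag it for retraining once the gap exceeds a tolerance.

```python
from collections import deque

class DriftMonitor:
    """Track recent prediction outcomes and flag when accuracy decays."""

    def __init__(self, baseline, window=100, tolerance=0.05):
        self.baseline = baseline          # accuracy measured at deployment
        self.tolerance = tolerance        # acceptable drop before retraining
        self.outcomes = deque(maxlen=window)  # rolling window of hits/misses

    def record(self, correct):
        self.outcomes.append(1 if correct else 0)

    def needs_retraining(self):
        if not self.outcomes:
            return False
        recent = sum(self.outcomes) / len(self.outcomes)
        return (self.baseline - recent) > self.tolerance
```

As the world changes and the model's recent accuracy slides below its deployment baseline, the monitor fires – the signal to retrain on fresher data.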
To be successful with AI and cognitive technologies, Kobielus advocates embedding them into a DevOps workflow. “You need to create a workflow that is very operational. It means always being sure you have the best training data and the best-fit AI and machine learning models.”
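One cycle of such an operational workflow might look like this (an illustrative sketch under assumed interfaces; `train` and `evaluate` stand in for whatever training and scoring steps a team actually uses): retrain a candidate on fresh data, compare it to the live model on a holdout set, and promote it only if it scores better.

```python
def run_cycle(live_model, train, evaluate, fresh_data, holdout):
    """One DevOps-style cycle: retrain, compare, and promote only if better."""
    candidate = train(fresh_data)
    if evaluate(candidate, holdout) > evaluate(live_model, holdout):
        return candidate   # promote the retrained model
    return live_model      # keep the current champion

# Hypothetical stand-ins for a real training job and evaluation harness.
def train(data):
    return {"name": "candidate", "score": data}

def evaluate(model, holdout):
    return model["score"]

live = {"name": "live", "score": 0.70}
promoted = run_cycle(live, train, evaluate, fresh_data=0.80, holdout=None)
kept = run_cycle(live, train, evaluate, fresh_data=0.60, holdout=None)
```

Running this loop on a schedule, fed by the drift signals above, is what keeps training data current and models best-fit over time.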
One thing is clear: AI and cognitive technologies are looming as a force that can transform the way we use and understand data. But it’s going to take time to get there.
Source: Informatica Perspectives