The use of Big Data has continued to grow and mature, with some organizations reaping considerable rewards. Big Data processing has recently advanced to a new stage of evolution in the form of Artificial Intelligence (AI) platforms, which promise significant impact (and disruption) over the next decade. Using AI to process massive datasets will bring previously unattainable improvements to Business Intelligence and Analytics, among innumerable other technologies.
According to Anil Kaul, CEO and co-founder of Absolutdata, the concept of using Big Data to “train” Artificial Intelligence emerged in the mid-2000s and advanced through several early successes. Machine Learning (ML), Deep Learning (DL), and new Data Architectures have led to smarter AI in the years since.
Machine Learning uses algorithms to analyze data, learn from it, and then make predictions. Algorithmic approaches include decision tree learning, clustering, reinforcement learning, and inductive logic programming. Deep Learning uses algorithms and an “artificial neural network” to develop Artificial Intelligence.
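The Machine Learning loop described above — an algorithm analyzes labeled data, “learns” a rule from it, and then makes predictions on unseen inputs — can be illustrated with a minimal sketch. A one-level decision tree (a “decision stump”) stands in for the fuller algorithms listed; the training data is hypothetical.

```python
# Minimal sketch of the ML loop: learn a rule from labeled data,
# then predict on new inputs. A decision stump (one-threshold
# decision tree) is used as the simplest possible learner.

def train_stump(samples, labels):
    """Pick the threshold that best separates the two classes."""
    best = (None, 0.0)  # (threshold, accuracy)
    for t in samples:
        preds = [1 if x >= t else 0 for x in samples]
        acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
        if acc > best[1]:
            best = (t, acc)
    return best[0]

def predict(threshold, x):
    return 1 if x >= threshold else 0

# Hypothetical training data: feature value -> class label.
xs = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
ys = [0, 0, 0, 1, 1, 1]

t = train_stump(xs, ys)
print(predict(t, 2.5))   # small values fall in class 0 -> prints 0
print(predict(t, 10.5))  # large values fall in class 1 -> prints 1
```

Real systems swap the stump for deeper trees, clustering, or neural networks, but the train-then-predict shape is the same.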
In a recent DATAVERSITY® interview, Anil Kaul discussed the current landscape of these technologies, along with how newly developed AI platforms, such as the one created by Absolutdata, are changing the Data Management industry. According to Kaul, Artificial Intelligence and Machine Learning are being used today primarily as personal assistants for research and internet activities, as well as to accomplish tasks such as answering phones, making sales predictions, and driving vehicles.
Kaul also discussed the overall structure of these highly aligned technologies. In general, Machine Learning is treated both as a process for training AI and as a “more primitive” version of AI. They are allied technologies working together, with Artificial Intelligence as the larger, more evolved concept. Deep Learning is often considered a more advanced version of Machine Learning, and is used as a teaching process. When used with targeted focus, these technologies have already shown great promise, said Kaul:
“As an example, everyone uses analytics for email campaigns, but because we used AI with a client, they had a 51 percent increase in sales with the AI-driven campaign. While Analytics can figure out who you should target, AI recommends and generates what campaigns should be run.”
The AI Platform
An AI Platform is a framework designed to function more efficiently and intelligently than traditional frameworks. “When designed well,” said Kaul, “it provides organizations with faster, more efficient, more effective collaboration with Data Scientists and staff.” It can help minimize costs in a number of ways – preventing duplication of effort, automating simple tasks, and eliminating some expensive activities, such as copying or extracting data. An AI Platform can also provide Data Governance, ensuring the use of best practices by a team of AI scientists and ML engineers. And it can help ensure that work is distributed more evenly and completed more quickly.
An Artificial Intelligence Platform generally has its elements organized into five layers of logic:
- The Data and Integration Layer provides access to the data. This access is critical because developers do not hand-code the rules; instead, the rules are “learned” by the AI from the data it can access.
- The Experimentation Layer allows Data Scientists to develop, test, and prove hypotheses. A well-designed Experimentation Layer offers automated feature engineering, feature selection, and model selection.
- The Operations and Deployment Layer provides model governance and deployment. It is here that a model’s risk assessments are tested, allowing the model governance team to validate it. This layer offers tools to manage the deployment of various “containerized” models and components across the platform.
- The Intelligence Layer supports the AI when it’s working (training activities take place in the Experimentation Layer). The Intelligence Layer organizes and delivers intelligent services, and is the primary component used in directing service delivery. Ideally, this layer implements concepts such as dynamic service discovery to offer a flexible response platform supporting cognitive interaction.
- The Experience Layer interacts with users through technologies such as augmented reality, conversational UI, and gesture control. This layer is often controlled by a cognitive experience team that strives to create rich and meaningful experiences, which are enabled by AI technology.
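The flow through the five layers above could be sketched, in deliberately simplified form, as plain classes handing work to one another. All names and methods here are hypothetical illustrations, not the API of any real platform; the “model” is a trivial stand-in.

```python
# Illustrative-only sketch of the five platform layers, wired together.
# Each class is a stand-in for a whole subsystem in a real platform.

class DataLayer:
    """Data and Integration: gives the AI access to raw records."""
    def __init__(self, records):
        self.records = records
    def fetch(self):
        return list(self.records)

class ExperimentationLayer:
    """Develop and test a model hypothesis against the data."""
    def train(self, records):
        # Stand-in "model": predict the most common label seen so far.
        labels = [r["label"] for r in records]
        return max(set(labels), key=labels.count)

class OperationsLayer:
    """Model governance and deployment: validate before release."""
    def validate_and_deploy(self, model):
        assert model is not None, "governance check failed"
        return model

class IntelligenceLayer:
    """Serve the deployed model behind an intelligent service."""
    def __init__(self, model):
        self.model = model
    def answer(self, _query):
        return self.model

class ExperienceLayer:
    """User-facing interaction (here, a plain text interface)."""
    def __init__(self, intelligence):
        self.intelligence = intelligence
    def ask(self, query):
        return f"prediction: {self.intelligence.answer(query)}"

data = DataLayer([{"label": "churn"}, {"label": "stay"}, {"label": "stay"}])
model = ExperimentationLayer().train(data.fetch())
deployed = OperationsLayer().validate_and_deploy(model)
ui = ExperienceLayer(IntelligenceLayer(deployed))
print(ui.ask("will this customer churn?"))  # prediction: stay
```

The point of the sketch is the separation of concerns: data access, experimentation, governance, serving, and user experience each live in their own layer.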
Using Artificial Intelligence to analyze Big Data can provide a deeper understanding of both the external and internal dynamics impacting a business, said Kaul. Employing the latest Big Data Architecture and Machine Learning supports the use of Artificial Intelligence. According to Kaul, in a modern, cutting-edge AI-based platform:
- The AI has access to all available data
- It learns from the history of clients or prospects
- It takes experience from previous, similar clients, and shows tactics that worked in the past
- The AI monitors and learns, finding patterns humans might miss
- The AI learns and responds in real time, adjusting to new data
- It provides guidance based on changing data
- The AI incorporates Machine Learning
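The “learns in real time, adjusting to new data” behavior in the list above is the idea behind online learning: the model updates with every new observation instead of retraining from scratch. A minimal sketch, using a running-mean estimator as a deliberately simple stand-in for a real model, with hypothetical sales figures:

```python
# Sketch of online (real-time) learning: the model's estimate is
# updated incrementally with each new data point, so it adjusts
# as soon as the underlying data changes.

class OnlineModel:
    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def update(self, x):
        # Incremental mean update: O(1) work per new observation.
        self.n += 1
        self.mean += (x - self.mean) / self.n

    def predict(self):
        return self.mean

model = OnlineModel()
for sale in [100, 102, 98]:        # historical pattern
    model.update(sale)
print(round(model.predict(), 1))   # prints 100.0

model.update(200)                  # sudden shift in the data
print(round(model.predict(), 1))   # estimate moves toward the new level
```

A production system would use an online-capable learner (e.g., gradient updates on streaming features), but the shape is the same: observe, update, predict, repeat.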
To maximize the results offered by cutting-edge Artificial Intelligence, there are three requirements, noted Kaul. The first is an Analytical Framework. Analytical Frameworks are methodologies that have been developed over time to solve specific, often complex, business problems. “Using an Analytical Framework is critical in supporting the system’s Artificial Intelligence and Machine Learning capabilities,” said Kaul.
Context is also a necessity. Artificial Intelligence and Machine Learning are currently very poor at determining context. AI can pick up on trends and can determine what is happening in the data, but to move beyond trend insights to recommending what staff should actually do, “context must be included,” noted Kaul. While it is hoped AIs will be able to learn how to determine context, this is not yet a reality. Currently, context needs to be determined and added to the model by a human.
Appropriate technology is the third requirement. Unlike traditional Analytical Systems, an AI-supported platform has to be scalable for the AI to learn and create solutions. A traditional Analytical System delivers insights on the data, while an AI provides recommendations in real time.
There are a variety of approaches to scaling databases up to very large sizes while simultaneously supporting ever-faster transaction rates. One tactic used by the majority of Database Management Systems is partitioning data-heavy tables, which allows a database to scale out across clusters of separate database servers. Additionally, multi-core CPUs, large SMP multiprocessors, and 64-bit microprocessors now support multi-threaded implementations that offer a significant scaling up of transaction processing capacity.
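The partitioning tactic described above can be sketched as hash partitioning: rows are routed to separate shards by hashing a partition key, so a large table can scale out across a cluster of servers. The shard layout and row schema here are hypothetical.

```python
# Sketch of hash-based table partitioning: each row is assigned to
# exactly one shard by hashing its partition key, so reads and
# writes for a key only touch the shard that owns it.

NUM_SHARDS = 4
shards = {i: [] for i in range(NUM_SHARDS)}

def shard_for(key):
    """Deterministically map a partition key to a shard index."""
    return hash(key) % NUM_SHARDS

def insert(row):
    shards[shard_for(row["customer_id"])].append(row)

# Load 1,000 hypothetical rows; they spread across the shards.
for cid in range(1000):
    insert({"customer_id": cid, "total": cid * 10})

print(sum(len(rows) for rows in shards.values()))  # prints 1000
```

In a real DBMS the shards live on separate servers and the router also handles rebalancing and replication, but the key-to-shard mapping is the core idea.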