Companies today are looking for a data-driven competitive edge. That edge is increasingly derived from the timeliness of the data being processed and analysed.
For most companies, the methods and ways in which they are generating insights from data are developing at pace. Being able to capture and analyse data in real time is the new challenge.
Hadoop was the breakthrough that opened up the world of unstructured “Big Data”, enabling data scientists to start finding new patterns in data. That breakthrough transformed businesses, accelerated research and development, and enabled accurate predictions in a way that had not been possible before.
The first wave of big data projects focused on analysing historical data sets. While the volume and variety of data being analysed were on a massive scale, it was always data that was recorded after the event and ingested into a data store like HDFS. Times have moved on, and analysis of historical data alone is no longer likely to deliver the competitive advantage that businesses strive to find.
The approach to analytics is still valid, but increasingly the answers that companies are looking for are held in data at the moment it is created.
Companies need to be able to access, process and analyse data in real time. The industrial IoT and the Internet of Anything (IoAT) are providing new streams of telemetry from devices that monitor systems, people, vehicles, machines and equipment in real time.
As an example, capturing data from thousands of sensors in real time enables businesses to make by-the-second decisions to improve efficiency, deliver goods faster and even avoid accidents or failures.
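To make that concrete, a by-the-second decision often boils down to a rolling computation over the latest sensor readings. The sketch below is a minimal, self-contained illustration in plain Python (the sensor values and the 90-degree threshold are made up for the example, not taken from any real system):

```python
from collections import deque

class RollingWindow:
    """Keep the last `size` readings from one sensor and flag sustained anomalies."""

    def __init__(self, size=10, threshold=90.0):
        self.readings = deque(maxlen=size)  # old readings fall off automatically
        self.threshold = threshold

    def add(self, value):
        self.readings.append(value)

    def mean(self):
        return sum(self.readings) / len(self.readings)

    def is_alert(self):
        # Alert only when the window is full and the rolling average
        # crosses the threshold, not on a single noisy spike.
        return (len(self.readings) == self.readings.maxlen
                and self.mean() > self.threshold)

window = RollingWindow(size=5, threshold=90.0)
for temperature in [85, 88, 91, 93, 95]:
    window.add(temperature)
print(window.is_alert())  # True: the rolling average (90.4) exceeds the threshold
```

Averaging over a window rather than reacting to each reading is a common way to trade a second or two of latency for far fewer false alarms.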
Real-time analytics enables big-data-style analysis and insight to be applied to processes, procedures and production as they occur. However, a word of caution is required: just as in the early days of big data, a lot of time, resources and money can be wasted if it is not done correctly. With these pitfalls in mind, here are five important points you need to consider when working with data in real time.
1) Use the Right Technology. Get help from experts to do so. Analysing thousands of real-time data streams is not a simple task, and neither can it be achieved with a one-size-fits-all approach. Reading data science message boards, you will see data scientists ask questions like which is better, Apache Kafka, NiFi or Storm. The truth is that these are separate, complementary Apache projects, and the right choice depends on your use case. You may even need to combine all three. An experienced big data provider like Cloudera can help companies integrate the technologies that best suit their own data sources.
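One way to see why these projects are complementary rather than competing is to sketch the roles they typically play in a streaming pipeline. This toy uses plain Python generators purely as an illustration (the record format and the `pump-1`-style sensor names are invented for the example); it is not how any of these systems is actually implemented:

```python
def ingest(raw_records):
    # Ingest stage (the role NiFi typically plays): collect raw records
    # and parse them into structured events.
    for rec in raw_records:
        sensor_id, value = rec.split(",")
        yield {"sensor": sensor_id, "value": float(value)}

def buffer(events):
    # Buffer stage (the role Kafka typically plays): in production this
    # would be a durable, replayable log that decouples producers from
    # consumers; here it simply passes events through to stay self-contained.
    for event in events:
        yield event

def process(events):
    # Processing stage (the role Storm typically plays): continuous
    # computation over the stream, emitting results as events arrive.
    for event in events:
        if event["value"] > 100.0:
            yield f"ALERT {event['sensor']}: {event['value']}"

raw = ["pump-1,98.6", "pump-2,104.2", "pump-1,99.1"]
for alert in process(buffer(ingest(raw))):
    print(alert)  # ALERT pump-2: 104.2
```

Each stage solves a different problem (collection, durable transport, computation), which is why "Kafka or Storm?" is usually the wrong question.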
2) Customer Touchpoints. In addition to connected devices, most companies are using increasing numbers of SaaS applications to service their own staff and customer interactions. Effectively, this means that customers are interacting with businesses across multiple disparate touch points. Much of this data can be valuable to real-time analytics, especially if it can be applied to supply chain decisions on the fly. These touch points need to be identified and integrated into a real-time data streaming framework.
3) Preserve & Understand Object Relationships. If data objects in different systems have relationship dependencies, traditional ETL can maintain those dependencies in a data store. Preserving the same relationships between objects streaming in real time from two different data sources is much harder to do. This ties back to point one above: ensure you use the right technology for the data sources you stream.
4) Data and Storage Management. Building and running infrastructure for a big data installation has always come with unique challenges. Adding massive streams of data in real time increases this challenge further. You need a data management strategy and culture that maximises the efficiency of the data you stream. As an example, Twitter developed analytics management tools to identify fake and fraudulent accounts and remove them from its real-time data analytics streams.
In addition, you also need storage that can cope with unexpected volumes being generated from real-time data sources. The cloud (be it public, private or hybrid) becomes a great option here as it can provide you with the flexibility, scalability and elasticity you need in order to cope with huge volumes of varying data cost-effectively.
5) Data Integrity. Real-time streaming comes with new data integrity challenges. When you ingest data into something like HDFS, part of the ingest process can include data cleansing. This “luxury” is not possible with real-time data analytics. The quality of the data streams needs to be high at the source. This is not just a technical issue. Promoting a culture of data accountability across staff at all levels can help with the integrity of data at the source but requires that top management subscribes to the benefits they will see from successful real-time analytics.
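In practice, "high quality at the source" often means validating each event before it is ever published to the stream. The sketch below shows one plausible shape for such a check; the field names and plausibility bounds are assumptions made up for the example, not a standard schema:

```python
def validate(reading):
    """Return a list of integrity problems with a sensor reading.
    An empty list means the event is safe to publish to the stream;
    there is no downstream batch-cleansing step to catch it later."""
    errors = []
    if not reading.get("sensor_id"):
        errors.append("missing sensor_id")
    temperature = reading.get("temperature")
    if temperature is None or not (-50.0 <= temperature <= 150.0):
        errors.append("temperature out of plausible range")
    if "timestamp" not in reading:
        errors.append("missing timestamp")
    return errors

good = {"sensor_id": "t-17", "temperature": 21.5, "timestamp": 1700000000}
bad = {"sensor_id": "", "temperature": 999.0}

print(validate(good))  # []
print(validate(bad))   # all three checks fail
```

Rejected events would typically be routed to a dead-letter queue for inspection rather than silently dropped, so the source system can be fixed.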
Your business will benefit from real-time data analytics done right; done wrong, it can have an even stronger negative effect. Working with the right people, companies and technologies will accelerate your time to success in the world of real-time analytics.