Mastering Data Usage: Tackling the Data Latency Issue

“Data is the new oil,” declared Clive Humby, a British mathematician.

And he is right. Data has become a crucial part of today’s world. In fact, it propels the world forward, having become the lifeblood of modern business and devices.

As a result, the amount of data created worldwide is expected to exceed 180 zettabytes by 2025. This projection not only underscores the world’s increasing reliance on data but also raises a critical question: How can one be insulated against data latency, the time that passes between data acquisition and the moment that data is made available for use?

To answer this, Genie Yuan, Regional Vice President, APAC at Couchbase, shared his insights and expert opinions about the matter in an exclusive interview with Data Storage ASEAN.

The Evolution of the Internet of Things (IoT) and Edge Computing

Before getting to the heart of the question, Genie briefly explained the Internet of Things (IoT) and edge computing as a foundation and context for the topic. Genie noted that IoT “initially began as a method to track inventory with data flowing in one direction” but has, over time, “evolved into a complex and interconnected system generating massive amounts of data.”

As such, legacy solutions such as RESTful Web Services are no longer adequate for managing this data. In response to this inadequacy, businesses and devices have adopted edge computing architecture, offering embedded and cloud databases, seamless data synchronisation, big data integration, and guaranteed data availability. This integration of edge computing with IoT allows for the collection of sensor data from various sources for analytics and value creation.

Why the IoT and Edge Computing Evolution Matters

To emphasise the necessity and benefits of IoT and edge computing evolution, Genie provided a scenario as an example:

“On an offshore oil drilling platform, where sensors constantly monitor critical data like pressure and temperature, real-time access to data is crucial. Delays in sending this data to the cloud for processing can pose significant risks, especially when every second counts.”

“Edge computing overcomes this issue by processing data locally on-site, thus solving the latency and downtime challenges. This approach has driven the growth of IoT at the network edge. In addition, edge computing enables immediate responses to anomalies while conserving bandwidth by sending only aggregated data to the cloud for long-term storage.”
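The pattern Genie describes, reacting to anomalies on-site and sending only aggregates upstream, can be sketched in a few lines. The sensor field, threshold value, and class names below are hypothetical illustrations, not any vendor's API:

```python
from statistics import mean

class EdgeNode:
    """Processes sensor readings on-site; only aggregates go to the cloud."""

    def __init__(self, pressure_limit):
        self.pressure_limit = pressure_limit  # hypothetical anomaly threshold
        self.buffer = []                      # raw readings stay local

    def ingest(self, reading):
        # React immediately at the edge -- no round trip to the cloud.
        if reading["pressure"] > self.pressure_limit:
            return {"alert": True, "reading": reading}
        self.buffer.append(reading)
        return {"alert": False}

    def flush_aggregate(self):
        # Only a compact summary is sent upstream, conserving bandwidth.
        pressures = [r["pressure"] for r in self.buffer]
        summary = {
            "count": len(pressures),
            "min": min(pressures),
            "max": max(pressures),
            "mean": mean(pressures),
        }
        self.buffer.clear()
        return summary
```

The key design point is that raw readings never leave the node: alerts fire locally within microseconds, while the cloud receives only periodic summaries for long-term storage.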

Needless to say, low latency is crucial for most organisations today, as it is one of the factors that determine customer retention rates.

Lags from latency put customers at odds with the business or service they are using because they expect smooth, real-time transactions. For example, lag when accessing a booking site to reserve a hotel room, or a delayed push notification confirming the booking, can frustrate potential customers to the point of abandoning the transaction entirely. If such incidents are widespread and frequent, the site may see a significant drop in active users.

High latency also impairs back-end system processes, which in turn leaves business or service processes incomplete, leading to data loss and downtime. That downtime causes revenue loss for 53% of organisations and productivity drops for 47%; it also damages brand reputation for 41% of organisations.

Mitigating Latency Issues Due to Increased Data Creation

To mitigate latency issues that will arise from increased data creation in the coming years, organisations should start leveraging edge computing, as it allows them to make the most efficient use of network connectivity by bringing data processing and compute infrastructure closer to the source. Organisations can start by offloading legacy workloads and migrating their databases to the cloud.

In addition to that, organisations also should understand the unique characteristics of databases and how they relate to specific business needs.

  • Relational, or Structured Query Language (SQL), databases are well-suited for applications with structured data and complex queries.

  • Non-relational (NoSQL) databases excel in applications with unstructured or semi-structured data and high scalability requirements. This enables IoT organisations to meet any demand and scale linearly, easily, and without disruption.
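The contrast between the two models can be seen in miniature below: the relational side needs a fixed schema up front, while the document side stores varying shapes without migrations. The table, field, and device names are hypothetical examples (SQLite stands in for a relational database, a JSON document for a NoSQL record):

```python
import json
import sqlite3

# Relational: fixed schema, declared up front, queried with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (device TEXT, pressure REAL)")
conn.execute("INSERT INTO readings VALUES ('rig-7', 101.3)")
row = conn.execute(
    "SELECT device, pressure FROM readings WHERE pressure > 100"
).fetchone()

# Non-relational: a schemaless JSON document; each device can add
# its own fields without altering any shared table definition.
doc = {"device": "rig-7", "pressure": 101.3, "extras": {"temp_c": 41}}
stored = json.dumps(doc)
```

Adding a new sensor field to the relational side would require an `ALTER TABLE` across every row; the document side simply stores a differently shaped record, which is why NoSQL scales more easily across heterogeneous IoT devices.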

However, adopting NoSQL is not enough, as some of these databases, such as MongoDB, require a third-party cache, which adds to both cost and complexity. On the other hand, a fully integrated read-through and write-through caching layer establishes a memory- and network-centric architecture that consistently delivers the real-time responsiveness that users today expect.
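The caching behaviour described above follows a well-known pattern, sketched minimally here. A plain dict stands in for the underlying datastore; this illustrates the read-through/write-through pattern in general, not Couchbase's implementation:

```python
class ReadWriteThroughCache:
    """Minimal read-through / write-through cache in front of a datastore."""

    def __init__(self, backing_store):
        self.store = backing_store  # any dict-like datastore (hypothetical)
        self.cache = {}             # in-memory layer serving hot reads

    def get(self, key):
        # Read-through: serve from memory, fall back to the store on a miss
        # and populate the cache so the next read is fast.
        if key not in self.cache:
            self.cache[key] = self.store[key]
        return self.cache[key]

    def put(self, key, value):
        # Write-through: update cache and store together in one step,
        # so the two layers never diverge.
        self.cache[key] = value
        self.store[key] = value
```

Because the application talks only to the cache object, there is no separate caching tier to deploy and keep consistent, which is the cost and complexity an integrated layer removes.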

The adoption and integration of edge computing with IoT will help business organisations improve the speed, availability, and governance of IoT data. This, in turn, strengthens data security and protection, overcomes latency and distance limitations, and improves compliance, all while enabling continuous operation even during network interruptions.

A noteworthy use case would be Malaysia Digital Economy Corporation’s laudable effort in assisting farmers in adopting an IoT-enabled fertilisation and irrigation system that helped increase the quality of monthly yield by as much as 90% while cutting manpower requirements by 25% and using 20% less fertiliser.

Apart from the business improvements in operational efficiency and productivity, this success story highlights the possibilities edge computing holds for improving service delivery in key sectors like healthcare for underserved and rural communities.

Integrating edge computing and IoT also benefits sectors like e-commerce and tourism, enabling them to handle millions of queries every second with round-the-clock availability and accessibility. Time delays are one of the main reasons users disengage from a service. Reducing latency not only eliminates those delays but also gives users a smoother, faster, and more responsive real-time experience, resulting in higher user engagement and retention.

Couchbase’s Approach to Integrating Edge Computing Into IoT

Firstly, Couchbase equips organisations with the agility and flexibility to eliminate database sprawl and capture billions of data points via a single platform that manages and syncs IoT data from any cloud to every edge.

In addition, Couchbase’s network-centric architecture enables easier, more affordable scalability. Equipped with a high-performance replication backbone that seamlessly extends databases without compromising performance at scale, Couchbase arms organisations with a single-pane view of data through real-time capture and consolidation of application information.

Couchbase ensures data is always handy via built-in high availability and flexible cross-data centre replication capabilities. This provides organisations with the requisite support in the event of disaster recovery, buttressing their ability to meet data locality requirements, and ensuring data is always available. Combined with a microservices architecture, Couchbase empowers IoT data management by ensuring each component of a larger application continues to do what it does best, regardless of failures or load elsewhere.

Therefore, Couchbase's integration of edge computing is poised to become a crucial solution—if it isn’t already. It’s not for nothing that Couchbase’s Genie is championing the importance of IoT evolution and edge computing in resolving real-time data challenges. This union not only mitigates latency issues but also revolutionises service delivery across various sectors, cementing data's vital role in this data-driven world.
