This post originally appeared on InfoWorld on September 27, 2017.
What real-time application pattern works for you?
3 common real-time application patterns that require a real-time decision
At first glance, building a real-time application may sound like a daunting proposition, one that involves technical challenges as well as a significant financial investment, especially when you have an application goal of responding within a fraction of a second. But advances in hardware, networking, and software—both commercial as well as open source—make building real-time applications today very achievable. So what do these real-time applications look like?
This article presents three common real-time application patterns that require a real-time decision, meaning a response returned or transaction executed based on real-time input. To determine which pattern to apply to your application, you must first define your real-time objective. Ask yourself: How fast does the application need to respond?
Each application pattern addresses a particular level of real-time response: sub-millisecond, milliseconds, or seconds (100 milliseconds and greater).
Pattern 1: Embedded applications—delivering responses in sub-milliseconds
To achieve sub-millisecond response, you need to eliminate any server-side networking and embed your application onto a computer or hardware appliance. This is the bleeding edge of real-time processing, reserved for specialized applications that are not very common. The pattern is relevant for areas such as high-frequency trading, nuclear power plant control systems, and signal processing and sensor applications.
Delivering sub-millisecond responses involves low-level programming, often at the kernel level. Standard kernels, operating systems, and device drivers can add unwanted processing overhead resulting in extra latency. Applications that care about every microsecond or nanosecond, every clock cycle, should seek to eliminate this overhead and code directly on the hardware. Alternatively, if you can withstand some additional latency, you can forgo writing low-level code and build and run your application directly on the operating system, embedding a data store such as SQLite, if needed.
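To make the embedded alternative concrete, here is a minimal sketch of the in-process approach the paragraph describes: SQLite runs inside the application's own address space, so a query involves no server and no network round trip. The schema and data are illustrative, not from the original article.

```python
import sqlite3
import time

# In-process SQLite database: the query executes inside the application's
# own address space, so there is no server-side networking at all.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor_id INTEGER, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [(i % 10, float(i)) for i in range(10_000)],
)

start = time.perf_counter()
(total,) = conn.execute(
    "SELECT SUM(value) FROM readings WHERE sensor_id = 3"
).fetchone()
elapsed_us = (time.perf_counter() - start) * 1_000_000
print(f"sum={total}, query took {elapsed_us:.0f} microseconds")
```

On commodity hardware an indexed in-memory query like this typically completes in well under a millisecond, which is the point of embedding: the operating system is still in the picture, but the client-server hop is not.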
Pattern 2: High speed OLTP—delivering responses in milliseconds
This is the classic client-server OLTP application architecture where a client application talks to a server-side application and database. These applications are very common — you have likely interacted with them several times today without even realizing it. These applications detect credit card fraud, compute personalized webpages, and deliver optimized digital ads. For instance, when you use your iPhone or Android phone to make a call, run an app, or access the internet, several decisions (transactions) must be made by the telco provider before your action is allowed to occur: Is your account valid? Do you have enough quota (voice or data)? What policy should apply to the action (throttling etc.)? And each transaction must respond in milliseconds.
Optimizing network performance between the client application and the server allows for low-latency responses in high-speed OLTP application patterns. Low-cost gigabit Ethernet (GigE) and relatively low-cost 10GigE networking are readily available to most application developers. Network performance can be further optimized by keeping the application on the same network switch or rack as the server, or at minimum on the same LAN. In other words, keep the client and server in close proximity. Within the server, the application and database usually minimize blocking disk I/O, either by avoiding it completely, by applying sequential I/O, or by using advanced storage such as SSDs or the newly emerging non-volatile RAM.
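The telco example above boils down to a single server-side decision function that must return in milliseconds. The sketch below is hypothetical (the account table, field names, and thresholds are invented for illustration), but it shows the three outcomes the article describes: valid account, quota check, and policy.

```python
import time

# Hypothetical in-memory account table; a production system would back
# this with a low-latency data store, but the decision logic is the same.
ACCOUNTS = {
    "555-0100": {"active": True, "data_quota_mb": 512},
    "555-0199": {"active": True, "data_quota_mb": 0},
}

def authorize_data_session(number: str, requested_mb: int) -> str:
    """Decide whether a data session may start: allow, throttle, or deny."""
    account = ACCOUNTS.get(number)
    if account is None or not account["active"]:
        return "deny"                      # invalid or inactive account
    if account["data_quota_mb"] < requested_mb:
        return "throttle"                  # out of quota: apply policy
    account["data_quota_mb"] -= requested_mb
    return "allow"

start = time.perf_counter()
decision = authorize_data_session("555-0100", 10)
elapsed_ms = (time.perf_counter() - start) * 1000
print(decision, f"decided in {elapsed_ms:.3f} ms")
```

The decision itself is microseconds of work; in practice the millisecond budget is consumed by the network hop and the data-store read, which is why the surrounding paragraph focuses on network proximity and I/O.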
One additional point worth noting is that with next generation in-memory data stores and caches, it is even possible to achieve low single-digit millisecond latency with highly available clustered data stores; that is, databases and systems spanning multiple nodes or processes. Today, many shared-nothing, in-memory databases, data grids, and NoSQL stores offer highly available data stores with predictable low latency (often single-digit millisecond) response times.
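A toy illustration of the shared-nothing idea: each "node" below is just an independent dictionary, and a key always hashes to the same node, so no node ever needs another node's data. Real clustered stores add replication and failover on top of this routing scheme; everything here is illustrative.

```python
import hashlib

# Toy shared-nothing "cluster": four independent stores, with keys
# deterministically partitioned across them by hash.
NODES = [dict() for _ in range(4)]

def node_for(key: str) -> dict:
    """Route a key to its owning node via a stable hash."""
    digest = hashlib.sha256(key.encode()).digest()
    return NODES[digest[0] % len(NODES)]

def put(key: str, value) -> None:
    node_for(key)[key] = value

def get(key: str):
    return node_for(key).get(key)

put("user:42", {"plan": "unlimited"})
print(get("user:42"))
```

Because no coordination between nodes is needed on the read path, each lookup stays a single-node, in-memory operation, which is what makes the predictable single-digit-millisecond latencies mentioned above achievable even as the cluster grows.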
Pattern 3: Streaming fast data pipelines—delivering responses in seconds
A fast data pipeline, historically rooted in complex event processing (CEP) applications, is becoming a more broadly deployed real-time application pattern today. In this application pattern, a never-ending stream of immutable events is ingested, and real-time analytics are applied as the events arrive.
Typical applications have a queuing or streaming system that delivers events, ultimately feeding the data lake, managed by Hadoop, Spark, or a data warehouse. Before arriving at the historical archive, the event stream is processed by a fast data store or computational engine. It is the role of this engine to aggregate, dedupe, and compute real-time analytics on incoming events and generate real-time alerts or decisions as required. The analytics are often displayed on a dashboard, and alerts or decisions are generated. A person or business process reacts to the alert at human speed. A few seconds is often enough time to ensure any late data has arrived to inform the decision.
In this pattern, data flows in one direction. This real-time engine often holds a predetermined amount of “hot data,” either in the form of continuously computed analytics or a database of the last hour, day, or week’s worth of data. Older data is delivered to the historic data lake or data warehouse.
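The hot path of such an engine can be sketched in a few lines: dedupe each event by id, maintain a continuously computed aggregate (the "hot data"), and emit an alert when a threshold is crossed. The event shape, sensor names, and threshold are illustrative assumptions, not from the original article.

```python
from collections import defaultdict

seen_ids = set()                 # dedupe window for duplicate deliveries
totals = defaultdict(float)      # continuously computed analytics per sensor
ALERT_THRESHOLD = 100.0          # illustrative alerting rule

def ingest(event):
    """Process one immutable event; return an alert string or None."""
    if event["id"] in seen_ids:          # duplicate delivery: drop it
        return None
    seen_ids.add(event["id"])
    totals[event["sensor"]] += event["value"]
    if totals[event["sensor"]] > ALERT_THRESHOLD:
        return f"ALERT: sensor {event['sensor']} total {totals[event['sensor']]:.1f}"
    return None

events = [
    {"id": 1, "sensor": "a", "value": 60.0},
    {"id": 1, "sensor": "a", "value": 60.0},   # duplicate, ignored
    {"id": 2, "sensor": "a", "value": 50.0},   # pushes the total past 100
]
alerts = [a for a in map(ingest, events) if a]
print(alerts)
```

A production engine would bound the dedupe set and aggregates by time window and expire older data to the data lake, exactly as the paragraph above describes; this sketch keeps only the one-directional flow and the alerting decision.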
Advances in queuing systems like Kafka, in-memory databases, data grids, and NoSQL data stores make implementing this pattern possible. This pattern has broad usage across the internet of things (IoT), electric smart grids, log file management, and mobile in-game analytics, among others. We'll be seeing more of this pattern in future applications.
The age of real time is now
If you are just starting out with your real-time application, first consider what response rate your problem domain requires. If it requires sub-millisecond response, consider an embedded application. If your application is high-velocity OLTP, explore high-performance network configurations and new offerings in low-latency data store and in-memory database technology. If you need to handle relentless streams of data, consider a fast data-pipeline architecture.
Low-cost computing, readily accessible high-speed networking, and numerous open source and commercial data storage software offerings capable of low-latency data processing means that real-time applications are no longer out of reach.