
"Real-Time" vs Truly Real-Time

by Bill French

Chief Architect at Intelliscape

For more than a decade, business intelligence firms, publishers, integration platform companies, and database vendors have claimed the high ground on "real-time" data, reporting, and all manner of information distribution. Companies in every business segment have enjoyed great success standing atop the real-time heap. And for the most part, real-time solutions were indeed faster than previous-generation architectures.

Most of them are overstating the case; what they're actually selling is seemingly real-time or, at best, near-real-time.

There's a difference between [seemingly] real-time data interchange and truly real-time performance. And while we're putting a finer point on the definition of real-time, we might also want to consider whether a real-time architecture scales. After all, it's easy to say that your solution is real-time, but if only one person at a time can get real-time updates, the definition falls short in most business contexts.

Near Real-Time

This is a common performance characteristic. Your server in Los Angeles has data; your sales team, spread across 120 cities, needs that data. There are many ways to deliver critical information to those sales professionals, at a variety of speeds. Delivering sales-related data in near-real-time is generally fine until, of course, your competitor updates their sales force slightly faster.

This is the moment you realize that near-real-time is adequate but not competitive.


Information delivery performance depends largely on the shelf life of the information. Most data has a moderate shelf life, but some content is worthless in less than a minute.

Information architects rarely consider the shelf life of data, but emerging requirements are transforming the shelf life of information into a key competitive attribute.

We live in an increasingly real-time consumer economy; we expect information instantly. In an always-on society, fresh and informative content is considered a necessity, not a luxury. This expectation is wending its way into every aspect of business.

Truly Real-Time

Until recently, near-real-time was about the best we could do. Even the most modern integration and API architectures were incapable of moving data with truly real-time performance.

This has changed in the last year; truly real-time performance will soon be ubiquitous across many data platforms and web services.

The Promise of Real-Time Data

I recently completed a project that required rapid access to a list of 2 million companies over HTTP - a basic web app.

As is typical - and this was initially the case here - the client's IT group built a traditional architecture: a SQL database behind a RESTful integration. It performed reasonably well, but it gave users no sense of zippy performance.

I modified the app to use Firebase as the data store, which allowed me to update web pages in about a quarter of a second, all without the heavy-handedness of the request-response architecture commonly used in modern web applications.

How is this possible?

Sockets. Instead of opening a socket for each request/response via an API, open a single socket and maintain it for the entirety of the user's session with the web app.

The socket connection is able to move data - even complex data structures that need to be queried - from the server to the client as if the two were texting each other. In fact, that is precisely what they are doing, and Firebase provides all the socket machinery to do it with just a few lines of JavaScript.

Publish-Subscribe (and then some)

Think of sockets as a pipe that your database has opened, and an endpoint that clients may subscribe to for specific data elements. It's like a wormhole for data, providing a super-highway between two points.

This architecture makes it possible for dozens, hundreds, or tens of thousands of clients to receive the same (or different) data almost instantly, with latency low enough to feel nonexistent.

Client applications are able to specify what flows through the pipes and the real-time database is able to convey the desired information with true real-time performance.
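A bare-bones version of that publish-subscribe pattern can be sketched in a few lines. This is an in-memory stand-in, not Firebase's API: the `RealtimeStore` class, its path strings, and the dashboard arrays are all hypothetical names for illustration.

```javascript
// A minimal publish-subscribe sketch: clients subscribe to the data paths
// they care about, and the "database" pushes every change to all matching
// subscribers the moment it happens.
class RealtimeStore {
  constructor() {
    this.subscribers = new Map(); // path -> Set of callbacks
    this.data = new Map();        // path -> current value
  }
  subscribe(path, callback) {
    if (!this.subscribers.has(path)) this.subscribers.set(path, new Set());
    this.subscribers.get(path).add(callback);
    // Late subscribers immediately receive the current value, if any.
    if (this.data.has(path)) callback(this.data.get(path));
  }
  set(path, value) {
    this.data.set(path, value);
    for (const cb of this.subscribers.get(path) || []) cb(value);
  }
}

const store = new RealtimeStore();
const dashboardA = [];
const dashboardB = [];

// Two clients subscribe to the same path...
store.subscribe('sales/west', (v) => dashboardA.push(v));
store.subscribe('sales/west', (v) => dashboardB.push(v));

// ...and one write fans out to both at once.
store.set('sales/west', { total: 42 });
```

Scaling this fan-out to tens of thousands of subscribers is exactly the machinery a platform like Firebase provides so you don't have to build it.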

This is the future of real-time analytics, but it's game-changing on two sides of the process:

  1. Capturing event data.
  2. Rendering insights.

This demonstrates a very important aspect of sockets: they are just as good at conveying information out to applications as they are at collecting information from those same applications.

Imagine a mouse-over event on an image. It calls a JavaScript function that drops the event data - date, time, image name, user name, cursor location on the page, and so on - into a real-time data wormhole. The events land in Firebase instantly. Now imagine a dashboard that uses the same sockets architecture to render the data captured by the client application(s) - also in true real-time.
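That capture-and-render round trip might look like the sketch below. In a browser the handler would be wired to a real `mouseover` event and the arrays would be a Firebase path and a subscribed dashboard; here both are in-memory stand-ins, and every name is illustrative.

```javascript
// Sketch of the same channel used in both directions: a client handler
// pushes event data in, and a dashboard "subscribed" to that data sees it.
const eventLog = [];   // stand-in for the real-time database path
const dashboard = [];  // stand-in for a subscribed dashboard view

// What a mouse-over handler would push. In a browser, the event object
// would supply the cursor coordinates and the target image's name.
function onImageMouseOver(imageName, userName, x, y) {
  const record = {
    when: new Date().toISOString(),
    image: imageName,
    user: userName,
    cursor: { x, y },
  };
  eventLog.push(record);   // capture side: the event flows in
  dashboard.push(record);  // render side: subscribers see it immediately
}

onImageMouseOver('hero.png', 'jdoe', 120, 48);
console.log(dashboard[0].image); // hero.png
```

The symmetry is the point: the same pipe that delivers query results to a page can carry interaction events back to the database, and on to any dashboard watching them.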

Firebase and other truly real-time platforms change everything. They make it faster, simpler, and easier than ever to deliver solutions that are faster than [seemingly] fast - they're instant.


