The Future Rests On The Internet BETWEEN Things – not IoT

The Internet of Things (IoT) is all the rage today when it comes to conversations about the future of technology. The term is reaching the same pervasiveness as "Big Data". Depending on who you are and your point of view, this is either a good thing or a bad thing. I am on the latter side of that equation, primarily because the concept of IoT is becoming too watered down, too generic, and widely misunderstood.

First, let's take a moment to understand where IoT came from. Some would say it was a marketing term used to "sexy up" and better describe the original intent of Machine-to-Machine (M2M). M2M started in the industrial world and by default was not considered cool on the consumer side of the fence. It was a Business-to-Business (B2B) concept framed up by equipment talking to other equipment. Specifically, it was a term used to describe any technology that enables networked devices to exchange information and perform actions without the manual assistance of humans. It became associated with groups of sensors and telemetry data.

On the other hand, IoT has been aligned with consumer-related items. It is often associated with the idea of combining disparate systems and data streams into one bigger system and data stream to create a larger, combined application. In theory it has also become synonymous with the cloud and wireless connectivity to the Internet as we know it today. Think of the smart home.

The story often told early on was that your fridge would know that you are low on eggs or some other product via some weight or pressure sensor. Then your fridge would add that item to your shopping list or your next Peapod delivery. But isn't your fridge a machine connected to your wireless network through a router (i.e. another machine) that connects to the Internet? In other words, aren't two machines connected to make the IoT kitchen work?

There is a difference, though. M2M and IoT are the same genus but different species. What is key to acknowledge is that M2M was (and is) the foundation for IoT. Without M2M, there is no IoT. Now that we have that cleared up, let's discuss why IoT will give birth to the next (and dare I say more valuable) generation of technology: the Internet between Things (IbT). IbT will be the key to making the intent of IoT real.

So what is IbT and why is it important? Let's first take a look at the difference between the words "of" and "between". The word "of" means "expressing the relationship between a part and a whole." The word "between" means "at, into, or across the space separating (two objects or regions)." One expresses a relationship. The other focuses on what exists between and across those two things. That is a subtle yet important distinction. Now let's look at how IoT works in the real world today to provide more context.

A common example is energy consumption in the home. A consumer uses a gadget and an app that monitor and track how much energy the home is burning while they are there and while they are away. It tracks when appliances are on and off, and how much they use. The consumer gets an alert while on vacation that the computer was left on. They turn it off remotely, or, with the premium service, it is turned off for them based on some trigger. The result: they save a bit on the energy bill and feel better about it. But all of these devices are not talking to each other to solve the problem autonomously. They just report up the chain to the central processing center in the sky – the cloud. They report out the relationship of being powered on and consuming power.
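
To make that centralized pattern concrete, here is a minimal sketch in Python. The function, device names, and rule are entirely illustrative, not any vendor's actual service: every device reports its state upward, and a single rule at the center decides what to switch off.

```python
# Minimal sketch of the centralized model: devices report state to the
# cloud, and one cloud-side rule decides what to command. All names and
# the rule itself are illustrative assumptions.

def cloud_rule(home_occupied: bool, device_states: dict) -> list:
    """Return shutdown commands for devices left on while the home is empty."""
    commands = []
    if not home_occupied:
        for device, is_on in device_states.items():
            if is_on:
                commands.append(f"turn_off:{device}")
    return commands

# The homeowner is on vacation and the computer was left on.
print(cloud_rule(home_occupied=False,
                 device_states={"computer": True, "tv": False}))
# -> ['turn_off:computer']
```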

Yes, there are efficiencies being gained and insights being created, but all of it is limited compared to what could be achieved. It's not that I don't think IoT is making big strides and having a real impact on the world as we know it today. It is, just not as big or as pervasive an impact as we all expected and wanted. This is because IoT cannot solve problems autonomously as well as IbT can.

Why? Well, the current applications of IoT and their corresponding impact are often limited because of their design. More specifically, they are limited because IoT solutions today rely on centralization. They create a system of things that obtain or generate data and send it back into the cloud in order to spit out aggregated data insights or functionality. Again, as noted above, IoT's focus is to combine disparate systems and data streams into one bigger system and data stream to create a larger, combined application. Data runs up from individual nodes to a central processing station. It is steeped in the old adage that the whole is greater than the sum of its parts.

Unfortunately, this model creates a unique set of challenges. First, it requires centralizing data from all the elements on the edge to drive aggregated decisions from the core, or top, of the system. This in turn assumes (or demands) that the data and information are relatively homogeneous in order for a broad-based decision or insight to be "acceptable" across the edge. If not, a blended or generalized decision is made which works for one part of the system but perhaps not another.

For example, if you are hauling ice cream and the temperature went above freezing to 50 degrees, you have to assume all of it melted. But if you are hauling crates of strawberries along with crates of milk and the temperature of the truck hits 75 degrees, you can't assume both products are spoiled.
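
A small sketch makes the difference plain. The product limits below are illustrative placeholders, not real food-safety numbers; the point is that a single truck-level threshold forces a blended verdict, while per-product thresholds do not.

```python
# Illustrative per-product limits in degrees Fahrenheit (assumed values),
# versus one blended truck-level threshold.
PRODUCT_MAX_TEMP_F = {"ice cream": 32, "strawberries": 80, "milk": 45}
TRUCK_LEVEL_MAX_F = 32  # a single threshold for the whole trailer

def blended_decision(trailer_temp_f, cargo):
    """Centralized view: one reading, one threshold, every product judged alike."""
    return {product: trailer_temp_f > TRUCK_LEVEL_MAX_F for product in cargo}

def per_product_decision(trailer_temp_f, cargo):
    """Judge each product against its own limit instead."""
    return {product: trailer_temp_f > PRODUCT_MAX_TEMP_F[product] for product in cargo}

print(blended_decision(75, ["strawberries", "milk"]))
# -> {'strawberries': True, 'milk': True}   (everything written off)
print(per_product_decision(75, ["strawberries", "milk"]))
# -> {'strawberries': False, 'milk': True}  (only the milk is at risk)
```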

What if you redefine how you calculate the concept of "sum"? Today the concept of "sum" in IoT is defined by adding together the data from individual assets on the edge of a system to produce a centralized insight or result. But what if you instead forced communication between the assets on the edge, in order to create insights and drive decisions locally, and then communicated back centrally?

You first decentralize data aggregation and analysis, and then deploy coordination between the edge and the center for communication and correlation. In this model you are forcing local assets to communicate with one another to create a new "local sum" regarding the environment and what actions or impacts are created as a result.
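
Sketched in code, the idea looks something like this (the names, numbers, and threshold are assumptions, not a real protocol): the edge assets share readings with each other, compute a local summary, and only that summary, and only when it matters, travels back to the center.

```python
from statistics import mean

def local_sum(peer_readings):
    """Aggregate readings shared between neighbouring assets, locally."""
    return {
        "count": len(peer_readings),
        "mean": round(mean(peer_readings), 1),
        "max": max(peer_readings),
    }

def report_to_center(summary, threshold):
    """Only the locally computed summary crosses the network, and only when it matters."""
    return summary if summary["max"] > threshold else None

readings = [41.2, 43.8, 40.5, 44.1]               # exchanged between assets on the edge
summary = local_sum(readings)                     # the new "local sum"
print(report_to_center(summary, threshold=45.0))  # -> None: nothing to send upstream
```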

For example, let's revisit the delivery of food. Normally a refrigerated truck has a traditional remote monitoring system. The system has a standard probe that provides the temperature at the truck level (i.e. the ambient temperature in the trailer). Each reading is pushed back in real time to a central dispatch operation that is responsible for reacting to any temperatures that exceed the approved thresholds for that product.

The airflow in a refrigerated trailer has a major impact on the temperature of the pallets and goods within it. Specifically, the airflow is affected by how those pallets are placed in the truck. The deviations can be several degrees from front to back or top to bottom because of airflow. Also, different food products are often hauled together, each with different temperature range requirements.

If each pallet on this truck had a simple Bluetooth sensor attached and configured to the product's temperature requirements, a decentralized IbT network could be created. Each sensor could report its temperature to a small node on the truck, which then relays the figures to the trailer's refrigeration system. The node could also calculate and recommend the best temperature for the truck to be adjusted to, based on an algorithm that considers the impact on each pallet of airflow, ambient (outside) temperature and other factors.
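
As a rough sketch of what that on-truck node might do (the function and the numbers are illustrative; a real algorithm would also weigh airflow position and outside temperature, which are left out here for readability): each pallet sensor reports its own temperature and its product's limit, and the node nudges the refrigeration set point just enough to bring the warmest pallet back into range.

```python
# Each pallet dict holds the sensor's reading and the product's limit.
# The airflow and ambient-temperature factors mentioned above are omitted
# from this sketch; values are illustrative.

def recommend_set_point(pallets, current_set_point_f):
    """Lower the trailer set point just enough to bring the warmest pallet back in range."""
    overages = [p["temp_f"] - p["max_f"] for p in pallets if p["temp_f"] > p["max_f"]]
    if not overages:
        return current_set_point_f            # every pallet is within its own limit
    return current_set_point_f - max(overages)

pallets = [
    {"product": "milk", "temp_f": 47.0, "max_f": 45.0},          # rear of trailer, warm spot
    {"product": "strawberries", "temp_f": 42.0, "max_f": 50.0},  # near the cooling unit
]
print(recommend_set_point(pallets, current_set_point_f=38.0))    # -> 36.0
```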

Now let's also assume you are driving that truck in a remote area where you have no cellular communications. A good example would be Central or Northern Mexico, where produce is often grown for delivery to the United States. Only a decentralized system at the edge can ensure service and product delivery quality there. This in turn allows the products to maintain optimal freshness while reducing the fuel consumption of the refrigeration system. And none of this requires a centralized IoT system back at a home office to drive changes and adjustments to the temperature.

The key in this example is that the data model is based on a bottom-up approach to product temperature rather than a top-down one. The product becomes self-managing of its required temperature without the need for centralized monitoring. In addition, it becomes a local neural network that drives the remote refrigeration system. The system can be changed on the fly by placing small sensors on the product prior to pick-up. That removes the need to centrally pre-set temperatures ingested from the bill of lading, which in turn allows for more flexible and efficient cold chain operations.

What about real time? It still exists, but its intent and role shift. Real-time adjustments would happen within the short-range network, and there is no need to push those communications back to a centralized hub manned by dispatchers for real-time processing. Data would only be pushed back to the center once a product has surpassed its temperature thresholds and expired. At that point, the purpose would not be to "react" and save the product (a moot point), but rather to push a re-supply order through the system to ensure inventory is not impacted.
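
In code, that shift might look like the following sketch (names and thresholds are assumed): nothing is relayed in real time unless a pallet has stayed out of range long enough to be written off, and what goes upstream is a re-supply order rather than an alert.

```python
def should_notify_center(minutes_out_of_range, expiry_minutes):
    """Relay upstream only once the product is past saving."""
    return minutes_out_of_range >= expiry_minutes

def center_message(pallet_id):
    """Not a 'react and save it' alert, but a replenishment request."""
    return {"type": "resupply_order", "pallet": pallet_id}

if should_notify_center(minutes_out_of_range=95, expiry_minutes=90):
    print(center_message("PALLET-07"))
# -> {'type': 'resupply_order', 'pallet': 'PALLET-07'}
```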

The result: more efficient operations, lower costs, less reliance on centralized processing, and smarter decisions made for services and applications where they need to be made … at the edge.