The Future Rests On The Internet BETWEEN Things – not IoT

The Internet of Things (IoT) is all the rage today when it comes to conversations around the future of technology. The term is reaching the same pervasiveness as “Big Data”. Depending on your point of view, this is either a good thing or a bad thing. I am on the latter side of that equation, primarily because the concept of IoT is becoming too watered down, too generic, and very misunderstood.

First, let’s take a moment to understand where IoT came from. Some would say that it was a marketing term used to “sexy up” and better describe the original intent of Machine-to-Machine (M2M). M2M started in the industrial world, and by default was not considered cool on the consumer side of the fence. It was a Business-to-Business (B2B) concept framed by equipment talking to other equipment. Specifically, it was a term used to describe any technology that enables networked devices to exchange information and perform actions without the manual assistance of humans. It became associated with groups of sensors and telemetry data.

On the other hand, IoT has been aligned with consumer-related items. It is often associated with the idea of combining disparate systems and data streams into one bigger system and data stream to create a larger, combined application. It has also become synonymous with the cloud and wireless connectivity to the Internet as we know it today. Think of the smart home.

The story often told early on was that your fridge would know you are low on eggs or some other product via a weight or pressure sensor. Your fridge would then add that item to your shopping list or next Peapod delivery. But isn’t your fridge a machine connected to your wireless network through a router (i.e., another machine) that connects to the Internet? In other words, aren’t two machines connected to make the IoT kitchen work?

There is a difference, though. M2M and IoT are the same genus but different species. What is key to acknowledge is that M2M was (and is) the foundation for IoT. Without M2M, there is no IoT. Now that we have that cleared up, let’s discuss why IoT will give birth to the next (and dare I say more valuable) generation of technology: the Internet between Things (IbT). IbT will be the key to making the intention of IoT real.

So what is IbT and why is it important? Let’s first look at the difference between the words “of” and “between”. The word “of” means “expressing the relationship between a part and a whole.” The word “between” means “at, into, or across the space separating (two objects or regions).” One expresses a relationship. The other focuses on what exists between and across two things. That is a subtle yet important distinction. Now let’s look at how IoT works in the real world today to provide more context.

A common example is energy consumption in the home. A consumer uses a gadget and an app that monitor and track how much energy they burn when they are home and away. The system tracks when appliances are on and off, and how much energy they use. The consumer gets an alert while on vacation that the computer was left on. They remotely turn it off, or if they have the premium service, it is turned off for them based on some trigger. The result: they save a bit on the energy bill and feel better about it. But all of these devices are not talking to each other to solve the problem autonomously. They just report up the chain to the central processing center in the sky – the cloud. They report the state of being powered on and consuming power.
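
To make the centralized pattern concrete, here is a minimal sketch, assuming hypothetical names throughout: devices report state up to a cloud service, and a cloud-side rule decides on the user’s behalf. Nothing here is a real product’s API.

```python
# Hypothetical cloud-side rule for the home-energy example: devices
# report up, and the decision is made centrally -- never between devices.

def cloud_rule(home_mode: str, device: str, is_on: bool) -> str | None:
    """Return a command to push down to the device, if any."""
    if home_mode == "away" and device == "computer" and is_on:
        return "power_off"  # the "premium service" trigger
    return None

# The thermostat, the computer, and the app never talk to each other;
# every state change is a round trip through the cloud.
print(cloud_rule(home_mode="away", device="computer", is_on=True))  # power_off
```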

Yes, there are efficiencies being gained and insights being created, but all of it is limited compared to what could be achieved. It’s not that I don’t think IoT is making big strides and having a real impact on the world as we know it today. It is, but just not as big or as pervasive as we all expected and wanted. This is because IoT can’t solve problems autonomously as well as IbT can.

Why? Well, the current applications of IoT and their corresponding impact are often limited by their design. More specifically, they are limited because IoT solutions today rely on centralization. They create a system of things that obtain or generate data and send it back to the cloud in order to spit out aggregated insights or functionality. Again, as noted above, IoT’s focus is to combine disparate systems and data streams into one bigger system and data stream to create a larger, combined application. Data runs up from individual nodes to a central processing station. It is steeped in the old adage that the sum is greater than the parts.

Unfortunately, this model creates a unique set of challenges. First, it requires centralizing data from all the elements on the edge in order to drive aggregated decisions from the core, or top, of the system. This in turn assumes (or demands) that the data and information is relatively homogeneous in order for a broad-based decision or insight to be “acceptable” across the edge. If not, a blended or generalized decision is made, which works for one part of a system but perhaps not another.

For example, if you are hauling ice cream and the temperature went above freezing to 50 degrees, you have to assume all of it melted. But if you are hauling crates of strawberries along with crates of milk and the temperature of the truck hits 75 degrees, you can’t assume both products are spoiled.
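
A minimal sketch of that distinction, with illustrative product names and thresholds (assumptions, not real spoilage data): a single blended truck-level rule writes everything off, while per-product thresholds only flag what is actually at risk.

```python
# Illustrative spoilage thresholds in degrees F -- assumptions, not data.
PRODUCT_LIMITS_F = {
    "ice cream": 32.0,
    "strawberries": 75.0,
    "milk": 45.0,
}

def blended_decision(truck_temp_f: float) -> bool:
    """One centralized rule for the whole truck, tuned for the most
    fragile product."""
    return truck_temp_f > 32.0

def per_product_decision(truck_temp_f: float) -> dict[str, bool]:
    """Evaluate each product against its own threshold."""
    return {name: truck_temp_f > limit
            for name, limit in PRODUCT_LIMITS_F.items()}

print(blended_decision(75.0))      # True -- the whole load is "spoiled"
print(per_product_decision(75.0))  # only the products actually at risk
```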

What if you redefine how you calculate the concept of “sum”? Today the concept of “sum” in IoT is defined by adding together the data from individual assets on the edge of a system to produce a centralized insight or result. But what if you were instead to force communication between the assets on the edge in order to create insights and drive decisions locally, and then communicate back centrally?

You first decentralize data aggregation and analysis, and then deploy coordination between the edge and the center for communication and correlation. In this model you are forcing local assets to communicate with one another to create a new “local sum” regarding the environment and what actions or impacts are created as a result.
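
A minimal sketch of that “local sum”, under assumed names: edge sensors pool their readings on a local node, the aggregate drives action in place, and only the summary travels upstream.

```python
# Edge-side aggregation sketch: raw readings stay local; only the
# computed "local sum" is sent to the center. All names are assumptions.

from dataclasses import dataclass
from statistics import mean

@dataclass
class EdgeSensor:
    sensor_id: str
    reading_f: float

def local_sum(sensors: list[EdgeSensor]) -> dict[str, float]:
    """Aggregate locally instead of shipping raw readings to the cloud."""
    readings = [s.reading_f for s in sensors]
    return {"mean_f": mean(readings),
            "min_f": min(readings),
            "max_f": max(readings)}

def report_to_center(summary: dict[str, float]) -> None:
    # Stand-in for an uplink call; only the summary leaves the edge.
    print(f"uplink: {summary}")

sensors = [EdgeSensor("pallet-1", 38.2), EdgeSensor("pallet-2", 41.7)]
report_to_center(local_sum(sensors))
```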

For example, let’s revisit the delivery of food. Normally a refrigerated truck has a traditional remote monitoring system. The system has a standard probe that provides the temperature at the truck level (e.g., the ambient temperature in the trailer). Each reading is pushed back in real time to a central dispatch operation that is responsible for reacting to any temperatures that exceed the approved thresholds for that product.

The airflow in a refrigerated trailer has a major impact on the temperature of the pallets and goods within it. Specifically, pallet temperature is affected by how those pallets are placed in the truck. The deviations can be several degrees from front to back or top to bottom because of airflow. Also, different food products are often hauled together, each with different temperature range requirements.

If each pallet on this truck had a simple Bluetooth sensor attached and configured to the product’s temperature requirements, a decentralized IbT network could be created. Each sensor reports its temperature to a small node on the truck, which relays the figures to the trailer’s refrigeration system. The node can also calculate and recommend the best temperature setting for the truck based on an algorithm that considers the per-pallet impact of airflow, ambient (outside) temperature, and other factors.
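
Here is a hedged sketch of the kind of setpoint logic such a truck-side node might run. The pallet positions, airflow offsets, and product ranges below are illustrative assumptions, not a published algorithm.

```python
# pallet_id: (min_ok_f, max_ok_f, airflow_offset_f). The offset
# approximates how much warmer that position runs than the reefer
# setpoint because of airflow shadows -- an assumed, simplified model.
PALLETS = {
    "front-top": (33.0, 39.0, 1.5),
    "rear-bottom": (33.0, 41.0, 4.0),
}

def recommend_setpoint() -> float:
    """Warmest setpoint that keeps every pallet inside its range.

    Assumed model: pallet_temp ~= setpoint + airflow_offset. A fuller
    version would also fold in outside temperature and door openings.
    """
    ceiling = min(hi - off for _, hi, off in PALLETS.values())
    floor = max(lo - off for lo, _, off in PALLETS.values())
    if ceiling < floor:
        raise ValueError("no single setpoint satisfies every pallet")
    return ceiling  # warmest safe choice minimizes reefer fuel burn

print(recommend_setpoint())  # 37.0 with the sample pallets above
```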

Now let’s also assume you are driving that truck in a remote area where you have no cellular communications. A good example would be Central or Northern Mexico, where produce is often grown for delivery to the United States. Only a decentralized system at the edge can ensure service and product delivery quality there. This in turn allows the products themselves to achieve optimal freshness while reducing the fuel consumption of the refrigeration system. And none of it requires a centralized IoT system back at a home office to drive changes and adjustments to the temperature.

The key in this example is that the data model is based on a bottom-up approach driven by product temperature, versus a top-down approach. The product becomes self-managing of its required temperature without the need for centralized monitoring. In addition, it becomes a local neural network that drives the remote refrigeration system. The system can be changed on the fly by placing small sensors on the product prior to pick-up. This removes the need to centralize temperature settings via pre-set ingestion from the bill of lading, which in turn allows for more flexible and efficient cold chain operations.

What about real-time? That still exists, but its intent and role shift. Real-time adjustments happen within the short-range network. There is no need to push those communications for processing in real time back at a centralized hub manned by dispatchers. Data would only be pushed back to the center when a product has surpassed its temperature thresholds and expired. At that point, the relay would not be to “react” to save the product (moot point), but rather to push a re-supply order through the system to ensure inventory is not impacted.
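
A minimal sketch of that shifted role, with hypothetical names and thresholds: in-range readings stay on the truck, short-range corrections happen at the edge, and the center only hears from the system once a product is already lost and needs re-supply.

```python
def on_reading(pallet_id: str, temp_f: float, limit_f: float,
               minutes_over: float, kill_minutes: float) -> None:
    """Route a sensor reading: local action first, uplink only on loss."""
    if temp_f <= limit_f:
        return                    # in range: purely local, nothing sent
    if minutes_over < kill_minutes:
        adjust_reefer_locally()   # real-time reaction stays at the edge
    else:
        # Product is already lost; uplink a re-supply order, not an alert.
        send_resupply_order(pallet_id)

def adjust_reefer_locally() -> None:
    print("edge: lowering reefer setpoint")

def send_resupply_order(pallet_id: str) -> None:
    print(f"center: push re-supply order for {pallet_id}")

on_reading("pallet-7", temp_f=46.0, limit_f=41.0,
           minutes_over=95.0, kill_minutes=60.0)
```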

The result: more efficient operations, lower costs, less reliance on centralized processing, and smarter decisions made for services and applications where they need to be made… at the edge.

The Danger of Airport Tarmacs to Pharmaceuticals

The pharmaceutical industry scraps $15 Billion in product each year due to temperature deviations during shipment. It spends another $20 Billion on resupply costs and mandated regulatory investigations into root causes. The vast majority of these temperature excursions occur on the airport tarmac.

The current “industry-standard” technology deployed provides very limited information when it comes to understanding a root cause. Specifically, a PDF document that provides a temperature graph only tells a pharmaceutical company that they had an issue during transport, not why or how it possibly occurred.

For example, you may know that your product went out of its specified temperature range for 30 minutes on a certain date, at a certain time, rendering it unusable. But the pharmaceutical firm is still required by law to obtain information from its logistics partners to manually reconstruct the shipment timeline and the changes of custody along the way.

The accuracy and fidelity of this data is usually less than optimal due to a reliance on human-generated data, disparate reporting systems from vendors, and planned or approximated transportation legs versus reality. For example, the airline may not report that your six pallets of product were split between two separate flights a few hours apart and then recombined at the final airline destination.

Why does this matter?

Without definitive insight, no organization can effectively change its standard operating procedures to eliminate expensive recurring events within its supply chain. To illustrate, let’s review an example regarding flights.

This image represents nine pharmaceutical shipments from Switzerland to Japan for a multi-billion dollar pharmaceutical firm. The shipments are ordered from newest to oldest, with each shipment traveling on the same lane. Purple represents truck segments, grey represents warehouse segments, orange represents tarmac time, and blue represents flights. All of this data was derived from sensor-based data and pattern recognition – not human input or estimates.

Notice something different in the pattern? The airline changed to three and four flight legs, versus two, to get the product from point A to point B. Why does this matter? The extra flights exposed the product to 50% or more incremental tarmac time. Thus, the product was left on a hot concrete surface for much longer than desired (or possibly allowed).

How hot can the surface of the airport tarmac get?

Research shows that the surface temperature of asphalt pavement can reach roughly 50ºC (90ºF) higher than the air temperature. Concrete pavement surface temperature can reach about 30ºC (54ºF) higher than the air temperature at noon.

Most pharmaceuticals must be kept at 2ºC to 8ºC or 15ºC to 25ºC. Even basic over-the-counter drugs like Claritin must be stored between 20ºC and 25ºC (68ºF-77ºF). To quote the study, “Exposure of 1 second to pavement at 158 ºF can burn human skin. At 158 ºF you can cook an egg on pavement surface in five minutes.”
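
The arithmetic is easy to check. Here is a short sketch using the offsets quoted above; everything else, including the sample air temperature, is an illustrative assumption.

```python
# Estimated pavement surface temperature versus pharma storage ranges.
SURFACE_OFFSET_C = {"asphalt": 50.0, "concrete": 30.0}  # above air temp

def c_to_f(c: float) -> float:
    return c * 9 / 5 + 32

air_c = 25.0  # a mild 25 C (77 F) day -- an assumed example
for pavement, offset in SURFACE_OFFSET_C.items():
    surface_c = air_c + offset
    print(f"{pavement}: {surface_c:.0f} C ({c_to_f(surface_c):.0f} F)")
# asphalt: 75 C (167 F); concrete: 55 C (131 F) -- both far outside the
# 2-8 C and 15-25 C ranges most pharmaceuticals require.
```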

Even when pharmaceuticals are packaged with dry ice or special phase-change material to maintain temperature during transport, a 50% increase in exposure to potentially damaging temperatures is not ideal. The packaging breaks down faster, increasing the risk of product loss.

Not being made aware of that exposure is even worse, because the pharmaceutical company would not be able to truly diagnose the source of the issue. Logistics professionals can only achieve clarity and effect change through complete supply chain oversight from CargoSense. A simple data logger acts only like a canary in the coal mine: if it’s dead, so is the product. But knowing why is more important in the long run.


Logistics Beacons: Using Proximity As A Logistics Oversight Service (an Excerpt)

The global logistics industry generates $9.1 Trillion in revenue. The global Third Party Logistics (3PL) industry generates $750 Billion of that figure. Further, the U.S. trucking industry alone generates approximately $700 Billion in revenue, and the air and sea cargo industries generate $62 Billion each. These are huge numbers that represent the movement of every good across the globe, and they are the reason we are seeing the rise of Logistics Beacons as a way to increase logistics and supply chain visibility.

In the simplest terms, any product that is produced – or the raw goods used to produce it – moves through some sort of logistics network. There is no larger industry in the world than logistics. And there is no more important industry to the world than logistics. Food, medicine, technology, fuel, building materials, waste – the very basic inputs and outputs of any modern society depend upon logistics.

Technology has always played a critical part in logistics. Over the past two decades, several technologies have played an important role in increasing supply chain visibility. Two of those were barcodes and GPS. There were also technologies that over-promised and under-delivered, such as RFID.

Each of these technologies provided visibility of a single item flowing through a logistics network. GPS was, and is, a proactive technology providing the location of a truck, train, plane, or container. Barcodes, on the other hand, were reactive technologies and often operated at the product level. RFID can operate in either a reactive or proactive (passive versus active) manner, and can be paired at the transportation-asset or product level.

But RFID fell short for a myriad of reasons, most of which could fill a stand-alone research paper. For simplicity’s sake, let’s just state that RFID infrastructure (gates and readers) is expensive and cumbersome to deploy. This cost and difficulty became a natural antagonist to the desired ubiquity of the technology. In addition, the cost of active tags, combined with the “dumb” nature (no power or memory) of passive tags, further plagued adoption within logistics. Thus, RFID is now more synonymous with inventory management.

Today, we know roughly where a product is or has been in terms of single fixed points in time – two dimensions, if you will (where and when). But what logistics does not know today is what that asset or product was around, flowed through, or was handled by. You may know that a shipment arrived at a location. You may not know that it was moved through docking bay 7, handled by forklift 123, driven by employee ABC, and placed in location XYZ at 1:12 PM. Or, if you do, you have to cobble that information together from six different sources – many of which are not IoT-based in terms of data integration.

But much of this can and will change with the introduction of Bluetooth® Low Energy (BLE) technology in the form of Logistics Beacons. A beacon is an independent, low-cost device built upon the latest Bluetooth® standards. These devices are self-powered and require no data plan to provide proximity services. They broadcast their presence to other nearby devices, such as smartphones, tablets, computers, and sensors.
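
As one concrete example of what “broadcast their presence” means on the wire, here is a minimal sketch that decodes the widely documented iBeacon manufacturer-data layout. A logistics beacon could use any BLE advertisement format; the frame and field meanings below are entirely hypothetical.

```python
import struct

def parse_ibeacon(mfg_data: bytes):
    """Return (uuid, major, minor, tx_power) or None if not an iBeacon.

    Layout: Apple company ID 0x004C (little endian), type 0x02, length
    0x15, 16-byte proximity UUID, 2-byte major, 2-byte minor, and a
    signed 1-byte calibrated TX power.
    """
    if len(mfg_data) < 25 or mfg_data[:4] != b"\x4c\x00\x02\x15":
        return None
    uuid = mfg_data[4:20].hex()
    major, minor = struct.unpack(">HH", mfg_data[20:24])
    (tx_power,) = struct.unpack("b", mfg_data[24:25])
    return uuid, major, minor, tx_power

# Hypothetical frame: major 7 could mean "docking bay 7" and minor 123
# "forklift 123" -- proximity context stamped onto a shipment record.
frame = b"\x4c\x00\x02\x15" + b"\xab" * 16 + b"\x00\x07\x00\x7b\xc5"
print(parse_ibeacon(frame))
```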


Cold Chain Shipping Loss in Pharmaceuticals – $35B per year and growing

In 2014, the pharmaceutical industry had sales of $790 Billion in non-cold-chain products (77.8%) and $225 Billion in cold-chain or controlled-room-temperature products (22.2%), for a total of $1.015 Trillion. If we estimate a 5% CAGR (compound annual growth rate), that number grows to roughly $1.36 Trillion by 2019.
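
For the curious, the compounding arithmetic is below; note that five periods at 5% lands near $1.30T, so the quoted $1.36T corresponds to roughly six compounding periods.

```python
# Compound annual growth: value = base * (1 + rate) ** years.
def project(base_trillions: float, cagr: float, years: int) -> float:
    return base_trillions * (1 + cagr) ** years

print(round(project(1.015, 0.05, 5), 2))  # 1.3
print(round(project(1.015, 0.05, 6), 2))  # 1.36
```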

The losses associated with temperature excursions in healthcare come to roughly $35 Billion per year, broken down as follows:

  • $15.2B in lost product cost
  • $8.6B in root cause analysis
  • $5.65B in clinical trial loss
  • $3.65B in replacement costs
  • $1B in wasted logistics costs

Within clinical trials, the total loss of $5.65B is broken down further as follows*:

  • $1.3B in opportunity labor costs
  • $2B in direct labor costs
  • $2.34B in trial product loss

Losses run high across the industry. For example:

  • 25% of vaccines reach their destination degraded because of incorrect shipping
  • 30% of scrapped pharmaceuticals can be attributed to logistics issues alone
  • 20% of temperature-sensitive products are damaged during transport due to a broken cold chain

A pallet of unprotected product on an airport tarmac with an ambient temperature of ~70°F (21°C) can quickly reach temperatures above ~130°F (55°C). At that temperature, you can fry an egg in 20 minutes.

So, what is a billion worth?

The top ten pharma firms incur approximately $16 Billion per year in costs due to temperature excursions. Applying the average big-pharma price-to-earnings multiple of roughly 20 to that recurring cost implies about $320 Billion in forgone corporate value.

That $320 Billion in wasted corporate value is larger than the 2015 total market capitalization of Johnson & Johnson ($274 Billion).

Get the infographic here


* Ray Goff, Wyeth Vaccines – white paper entitled “Cold Chain to Clinical Site: The Shipping Excursion”; Indeed website salary estimates.
Other sources: World Health Organization (WHO); Parenteral Drug Association (PDA); other industry estimates.

Supply Chain Intelligence, the Internet of Things (IoT) and its impact on healthcare logistics

Over $15 Billion in product losses occur every year in the pharmaceutical industry due to temperature excursions alone.

This figure does not include the costs associated with replacing those goods, the labor costs (direct / indirect) associated with the root cause analysis process, or other causes of product loss such as shock, humidity, etc. All accounted for, it is estimated that over $35B is lost each year.

When Cold Chain IQ surveyed pharmaceutical executives, it found that at least 10 percent of respondents recorded temperature deviations in more than 15 percent of their temperature-sensitive shipments. Twenty percent didn’t know whether excursions had occurred. According to a report on cold chain by ChainLink Research, conservative industry estimates put climate-sensitive shipments at 80M annually; more aggressive estimates suggest up to 130M.

Very specific industry statistics have been quantified with respect to the losses that consistently occur within the healthcare space:

  • 25% of vaccines reach their destination degraded because of incorrect shipping
  • 5% of pharmaceutical sales are marked as scrap
  • 30% of scrapped pharmaceuticals can be attributed to logistics issues alone
  • 20% of temperature-sensitive products are damaged during transport due to a broken cold chain
  • The average cost of root cause analysis for each excursion ranges from $3K to $10K
  • An average pharmaceutical organization spends 6% of its revenue on logistics requirements

Sources: World Health Organization (WHO), Parenteral Drug Association (PDA), and other industry estimates

The costs associated with these types of losses in the pharmaceutical supply chain center on expensive product replacement and wasted shipping costs. In addition, there are large operational costs due to the human capital required to manage the quality and control process when damage occurs. Damage can also be caused by environmental factors beyond temperature – such as shock, pressure, humidity, and tilt events.

These losses not only occur in volume product shipments; they also occur in clinical trials. To that end, the average cost of shipping excursions for a single clinical study shipped to a clinical site can well exceed $150,000 and commits over 2,300 staff hours to excursion resolution for an average cold-chain-related study (source: Ray Goff, Wyeth Vaccines – white paper titled “Cold Chain to Clinical Site: The Shipping Excursion”). This does not include another 1,500 hours of lost opportunity labor.

All the while, key regulatory pressures continue to grow. Regulatory compliance for healthcare companies revolves around various standards, such as the Good Distribution Practice (GDP) guidelines in the EU and related standards published in the U.S. by the FDA. Recently the EU added the following update to the GDP:

“it is the responsibility of the supplying wholesale distributor to protect medicinal products against breakage, adulteration and theft and to ensure that temperature conditions are maintained within acceptable limits during transport.”

Experts interpret this to require the wholesale distributor or manufacturer to measure the temperature to ensure it is properly maintained (data logging, etc.). But this is just one of the many regulations that impact the industry – whether from the manufacturer side or from the logistics provider attempting to resolve these issues on their behalf.

To learn more about the issues Bio-Pharma is facing, how technology can be used to address them, and how data can drive corrective and preventative action programs, read the white paper here.

How Do We Know if Goods Are Still Good?

As the need for temperature-controlled transportation rises, problems persist. In The Loadstar article “Unreliable Air Cargo Industry Loses Pharma Traffic While IATA Sleeps”, Kuehne + Nagel’s senior vice president for global air logistics products and services, Marcel Fujike, is quoted as saying there was “a lack of skills, training and standards throughout cool-chain logistics”, with “no SOPs or working instructions in place overall”.

Furthermore, Mr. Fujike noted that vulnerable spots in the air transport chain included handling, loading, the tarmac phase, “which is considered the weakest link in the chain”, and customs clearance.

Read more here.

CargoSense Announced as a Winner of the 2015 Internet of Things (IoT) Evolution Asset Tracking Award

TMC, a global, integrated media company that helps clients build communities in print, in person and online, announced CargoSense was among the recipients awarded the 2015 Internet of Things (IoT) Evolution Asset Tracking Award.

CargoSense’s new “Black Box” cargo solution shared the stage with the likes of Cisco Systems, Savi, SkyBitz and Vodafone.