The Future Rests On The Internet BETWEEN Things – not IoT

The Internet of Things (IoT) is all the rage today when it comes to conversations around the future of technology. The term is reaching the same pervasiveness as “Big Data”. Depending on who you are and your point of view, this is either a good thing or a bad thing. I am on the latter side of that equation. This is primarily because the concept of IoT is becoming too watered down, too generic, and very misunderstood.

First, let's take a moment to understand where IoT came from. Some would say it was a marketing term used to “sexy up” and better describe the original intent of Machine-to-Machine (M2M). M2M started in the industrial world and, by default, was not considered cool on the consumer side of the fence. It was a Business-to-Business (B2B) concept framed by equipment talking to other equipment. Specifically, it was a term used to describe any technology that enables networked devices to exchange information and perform actions without the manual assistance of humans. It became associated with groups of sensors and telemetry data.

On the other hand, IoT has been aligned with consumer-related items. It is often associated with the idea of combining disparate systems and data streams into one bigger system and data stream to create a larger, combined application. In theory it has also become synonymous with the cloud and wireless connectivity to the Internet as we know it today. Think of the smart home.

The story often told early on was that your fridge would know you are low on eggs or some other product via a weight or pressure sensor. Then your fridge would add that item to your shopping list or next Peapod delivery. But isn't your fridge a machine connected to your wireless network through a router (i.e., another machine) that connects to the Internet? In other words, aren't two machines connected to make the IoT kitchen work?

There is a difference, though. M2M and IoT are the same genus but different species. What is key to acknowledge is that M2M was (and is) the foundation for IoT. Without M2M, there is no IoT. Now that we have that cleared up, let's discuss why IoT will give birth to the next (and dare I say more valuable) generation of technology: the Internet between Things (IbT). IbT will be the key to making the intention of IoT become real.

So what is IbT and why is it important? Let's first take a look at the difference between the words “of” and “between”. The word “of” means “expressing the relationship between a part and a whole.” The word “between” means “at, into, or across the space separating (two objects or regions).” One expresses a relationship. The other focuses on what exists between and across those two things. That is a subtle yet important distinction. Now let's look at how IoT works in the real world today to provide more context.

A typical example is energy consumption in the home. A consumer uses a gadget and an app that monitor and track how much energy the home is burning while occupied and away. The system tracks when appliances are on and off, and how much they consume. The consumer gets an alert while on vacation that the computer was left on. They remotely turn it off, or, with the premium service, it is turned off automatically based on some trigger. The result: they save a bit on the energy bill and feel better about it. But none of these devices are talking to each other to solve the problem autonomously. They just report up the chain to the central processing center in the sky, the cloud. They report out the relationship of being powered on and consuming power.

Yes, there are efficiencies being gained and insights being created, but all of it is limited compared to what could be achieved. It's not that I don't think IoT is making big strides and having a real impact on the world we know today. It is, but just not as big or as pervasive as we all expected and wanted. This is because IoT can't solve problems autonomously as well as IbT can.

Why? The current applications of IoT and their corresponding impact are often limited because of their design. More specifically, they are limited because IoT solutions today rely on centralization. They create a system of things that obtain or generate data and send it back into the cloud in order to spit out aggregated insights or functionality. Again, as noted above, IoT's focus is to combine disparate systems and data streams into one bigger system and data stream to create a larger, combined application. Data runs up from individual nodes to a central processing station. It is steeped in the old adage that the sum is greater than the parts.
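To make the pattern concrete, here is a minimal sketch of that centralized model in Python. Everything in it (the `push_to_cloud` stand-in, the device names, the reading values) is invented for illustration; the point is that no device talks to a peer, and every raw data point goes straight up the chain.

```python
import json
import time

def read_sensor(device_id: str) -> float:
    """Stand-in for a real hardware read (e.g., watts drawn by an appliance)."""
    return 42.0

def push_to_cloud(reading: dict) -> None:
    """Stand-in for an HTTPS POST to a central ingest endpoint."""
    print("-> cloud:", json.dumps(reading))

# Each edge device samples locally and immediately ships the raw reading
# to the center. Aggregation, correlation, and decisions all happen in
# the cloud; the devices never coordinate with one another.
for device in ("fridge", "computer", "thermostat"):
    push_to_cloud({
        "device": device,
        "value": read_sensor(device),
        "timestamp": time.time(),
    })
```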

Unfortunately, this model creates a unique set of challenges. First, it requires centralizing data from all the elements on the edge to drive aggregated decisions from the core, or top, of the system. This in turn assumes (or demands) that the data and information are relatively homogeneous in order for a broad-based decision or insight to be “acceptable” across the edge. If not, a blended or generalized decision is made, which works for one part of a system but perhaps not another.

For example, if you are hauling only ice cream and the temperature goes above freezing to 50 degrees, you have to assume all of it melted. But if you are hauling crates of strawberries along with crates of milk and the temperature of the truck hits 75 degrees, you can't assume both products are spoiled.

What if you redefined how you calculate the concept of “sum”? Today the concept of “sum” in IoT is defined by adding together the data from individual assets on the edge of a system to produce a centralized insight or result. But what if you instead forced communication between the assets on the edge in order to create insights and drive decisions locally, and then communicated back centrally?

You first decentralize data aggregation and analysis, and then deploy coordination between the edge and the center for communication and correlation. In this model you are forcing local assets to communicate with one another to create a new “local sum” regarding the environment and what actions or impacts are created as a result.
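As a rough sketch of the idea, consider peers that exchange readings and compute their “local sum” before anything leaves the local network. The `EdgeNode` class and its readings below are hypothetical; the point is only that the aggregate is formed at the edge, and the center receives a summary rather than the raw stream.

```python
from statistics import mean

class EdgeNode:
    """A hypothetical edge asset that can see its peers' readings."""
    def __init__(self, name: str, reading: float):
        self.name = name
        self.reading = reading
        self.peers: list["EdgeNode"] = []

    def local_sum(self) -> float:
        """Aggregate this node's reading with its peers' -- the 'local sum'."""
        return mean([self.reading] + [p.reading for p in self.peers])

# Three neighboring assets form a local network and share readings.
a, b, c = EdgeNode("a", 34.0), EdgeNode("b", 36.5), EdgeNode("c", 35.2)
a.peers, b.peers, c.peers = [b, c], [a, c], [a, b]

local_insight = a.local_sum()                  # decision made at the edge
summary_for_center = {"local_avg": round(local_insight, 2)}
print(summary_for_center)                      # only this goes upstream
```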

For example, let's revisit the delivery of food. Normally a refrigerated truck has a traditional remote monitoring system. The system has a standard probe that provides the temperature at the truck level (i.e., the ambient temperature in the trailer). Each reading is taken and pushed back in real time to a central dispatch operation that is responsible for reacting to any temperatures that exceed approved thresholds for that product.

The airflow in a refrigerated trailer has a major impact on the temperature of the pallets and goods within it. Specifically, it is affected by how those pallets are placed in the truck. Temperatures can deviate several degrees from front to back or top to bottom because of airflow. Also, different food products are often hauled together, each with different temperature range requirements.

If each pallet on this truck had a simple Bluetooth sensor attached and configured to the product's temperature requirements, a decentralized IbT network could be created. Each sensor reports its temperature to a small node on the truck, which then relays the figures to the trailer's refrigeration system. The node can also calculate and recommend the best temperature setting for the truck based on an algorithm that considers the impact on each pallet of airflow, ambient (outside) temperature and other factors.
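A simplified sketch of what that node-side recommendation might look like follows. The airflow offsets, slot names and product thresholds are all invented for illustration; a real system would calibrate them empirically per trailer and load plan.

```python
# Hypothetical model: each slot in the trailer runs somewhat warmer than
# the refrigeration setpoint, depending on airflow. The node recommends
# the warmest setpoint that still keeps every pallet within its limit,
# protecting the load while minimizing fuel burned on cooling.
AIRFLOW_OFFSET_F = {"front": 0.5, "middle": 2.0, "rear": 4.0}

def recommend_setpoint(pallets: list[dict]) -> float:
    """Return the highest setpoint (degF) that keeps all pallets safe."""
    candidates = []
    for pallet in pallets:
        offset = AIRFLOW_OFFSET_F[pallet["slot"]]
        # A pallet stays safe as long as setpoint + offset <= its max temp.
        candidates.append(pallet["max_temp_f"] - offset)
    return min(candidates)  # governed by the most sensitive pallet

load = [
    {"product": "milk",         "slot": "front", "max_temp_f": 40.0},
    {"product": "strawberries", "slot": "rear",  "max_temp_f": 36.0},
]
print(recommend_setpoint(load))  # 32.0 -- the rear strawberries govern
```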

Now let's also assume you are driving that truck in a remote area with no cellular communications. A good example would be central or northern Mexico, where produce is often grown for delivery to the United States. There, only a decentralized system at the edge can ensure service and product delivery quality. This in turn allows the products themselves to achieve optimal freshness while reducing the fuel consumption of the refrigeration system. And this does not require a centralized IoT system back at a home office to drive changes and adjustments to the temperature.

The key in this example is that the data model is based on a bottom-up approach driven by product temperature, versus a top-down approach. The product becomes self-managing of its required temperature without the need for centralized monitoring. In addition, it becomes a local neural network that drives the remote refrigeration system. The system can be changed on the fly by placing small sensors on the product prior to pick-up. This removes the need to centralize temperature settings from pre-set ingestion of the bill of lading, which in turn allows for more flexible and efficient cold chain operations.

What about real-time? It still exists, but its intent and role shift. Real-time adjustments would happen within the short-range network, and there is no need to push those communications back for processing in real time at a centralized hub manned by dispatchers. Data would only be pushed back to the center once a product has surpassed its temperature thresholds and expired; only then is a real-time relay needed. At that point, it would not be to “react” to save the product (a moot point), but rather to push a re-supply order through the system to ensure inventory is not impacted.
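In code, the shift from streaming everything to exception-based reporting might look like the following sketch. The threshold value and the `send_to_dispatch` hook are assumptions, standing in for a real product profile and a real long-range uplink.

```python
THRESHOLD_F = 40.0  # assumed spoilage threshold for this product

def send_to_dispatch(message: dict) -> None:
    """Stand-in for the long-range (cellular/satellite) uplink."""
    print("uplink:", message)

def handle_reading(pallet_id: str, temp_f: float) -> None:
    if temp_f <= THRESHOLD_F:
        return  # handled locally; nothing leaves the truck
    # The product is already lost -- no point "reacting" to save it.
    # The relay exists to protect inventory, not the shipment.
    send_to_dispatch({
        "pallet": pallet_id,
        "event": "threshold_exceeded",
        "action": "reorder",
    })

handle_reading("pallet-7", 38.5)  # silent: within range
handle_reading("pallet-7", 43.2)  # uplinks a re-supply request
```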

The result: more efficient operations, lower costs, less reliance on centralized processing, and smarter decisions made for services and applications where they need to be made… at the edge.

Cold Chain Shipping Loss in Pharmaceuticals – $35B per year and growing

In 2014, the pharmaceutical industry had sales of $790 Billion in non-cold-chain (77.8%) and $225 Billion in cold-chain or controlled-room-temperature (22.2%) products. That totals $1.015 Trillion. If we estimate a 5% CAGR (compound annual growth rate), then by 2019 that number will be roughly $1.30 Trillion.
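The compounding is straightforward to check (a quick sketch, using only the figures above):

```python
base_2014 = 1.015e12  # $790B non-cold-chain + $225B cold-chain
cagr = 0.05           # assumed 5% compound annual growth rate
years = 5             # 2014 -> 2019

projection_2019 = base_2014 * (1 + cagr) ** years
print(f"${projection_2019 / 1e12:.2f} Trillion")  # ~$1.30 Trillion
```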

The losses associated with temperature excursions in healthcare come to approximately $35 Billion per year. That breaks down as follows:

  • $15.2B in lost product costs
  • $8.6B in root cause analysis
  • $5.65B in clinical trial losses
  • $3.65B in replacement costs
  • $1B in wasted logistics costs

Within clinical trials, the total loss of $5.65B breaks down further as follows*:

  • $1.3B in opportunity labor costs
  • $2B in direct labor costs
  • $2.34B in trial product loss

Loss is present across the industry in high numbers. For example:

  • 25% of vaccines reach their destination degraded because of incorrect shipping.
  • 30% of scrapped pharmaceuticals can be attributed to logistics issues alone.
  • 20% of temp-sensitive products are damaged during transport due to a broken cold chain.

A pallet of unprotected product on an airport tarmac with an ambient temperature of ~70°F (21°C) can quickly reach temperatures above ~130°F (55°C). At that temperature, you can fry an egg in 20 minutes.

So, what is a billion worth?

$16 Billion is the approximate cost incurred by the top ten pharma firms due to temperature excursions. Multiply that by the average price-to-earnings ratio of big pharmaceutical firms, roughly 20, and you get the market value those excursions destroy.

$320 Billion, the total corporate value wasted due to temperature ($16 Billion at that 20x multiple), is larger than the 2015 total market capitalization of Johnson & Johnson ($274 Billion).

Get the infographic here


* Ray Goff, Wyeth Vaccines, white paper entitled “Cold Chain to Clinical Site: The Shipping Excursion”; Indeed salary estimates.
Other sources: World Health Organization (WHO); Parenteral Drug Association (PDA); worldpharmaceuticals.net; other industry estimates.

Supply Chain Intelligence, The Internet of Things (IoT) and its impact on healthcare logistics

Over $15 Billion in product losses occurs every year in the pharmaceutical industry due to temperature excursions alone.

This figure does not include the costs associated with replacing those goods, the labor costs (direct and indirect) associated with the root cause analysis process, or other causes of product loss such as shock, humidity, etc. All accounted for, it is estimated that over $35B is lost each year.

When Cold Chain IQ surveyed pharmaceutical executives, it found that at least 10 percent of respondents recorded temperature deviations in more than 15 percent of their temperature-sensitive shipments, and twenty percent didn't know whether excursions had occurred. According to a report on cold chain by ChainLink Research, conservative industry estimates cite 80M climate-sensitive shipments annually; more aggressive estimates suggest up to 130M.

There are very specific industry statistics quantifying the losses that consistently occur within the healthcare space:

  • 25% of vaccines reach their destination degraded because of incorrect shipping
  • 5% of pharmaceutical sales are marked as scrap
  • 30% of scrapped pharmaceuticals can be attributed to logistics issues alone
  • 20% of temp-sensitive products are damaged during transport due to a broken cold chain
  • The average cost of root cause analysis for each excursion can range from $3K to $10K
  • An average pharmaceutical organization spends 6% of its revenue on logistics requirements

Sources: World Health Organization (WHO), Parenteral Drug Association (PDA), worldpharmaceuticals.net and other industry estimates

The costs associated with these types of losses in the pharmaceutical supply chain center on expensive product replacement and wasted shipping. In addition, there are large operational costs due to the human capital required to manage the quality and control process when damage occurs. Damage can also be caused by environmental factors beyond temperature, such as shock, pressure, humidity and tilt events.

These losses not only occur in volume product shipments; they also occur in clinical trials. To that end, the average cost of shipping excursions for a single clinical study at a clinical site can well exceed $150,000 and commits over 2,300 staff hours to excursion resolution for an average cold-chain-related study (source: Ray Goff, Wyeth Vaccines, white paper entitled “Cold Chain to Clinical Site: The Shipping Excursion”). This does not include another 1,500 hours of lost opportunity labor.

All the while, key regulatory pressures continue to grow. Regulatory compliance for healthcare companies revolves around standards produced by government bodies, such as the Good Distribution Practice (GDP) guidelines in the EU and related standards published in the U.S. by the FDA. Recently the EU added the following update to the GDP:

“it is the responsibility of the supplying wholesale distributor to protect medicinal products against breakage, adulteration and theft and to ensure that temperature conditions are maintained within acceptable limits during transport.”

Experts interpret this to mean that the wholesale distributor/manufacturer must measure the temperature to ensure it is properly maintained (data logging, etc.). But this is just one of many regulations that impact the industry, whether from the manufacturer's side or from the logistics providers attempting to resolve these issues on their behalf.
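As a sketch of what that minimal data-logging obligation could translate to in practice, consider the record below. The field names and the 2°C to 8°C band are illustrative assumptions, not taken from the GDP text, but the shape is the audit trail a QA team would review after delivery.

```python
import csv
import datetime

ACCEPTABLE_RANGE_C = (2.0, 8.0)  # assumed band for a refrigerated product

def log_reading(writer, temp_c: float) -> None:
    """Append one timestamped reading, flagging any excursion."""
    low, high = ACCEPTABLE_RANGE_C
    status = "OK" if low <= temp_c <= high else "EXCURSION"
    writer.writerow([
        datetime.datetime.utcnow().isoformat(),
        f"{temp_c:.1f}",
        status,
    ])

# Write a small, reviewable log for one shipment.
with open("shipment_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp_utc", "temp_c", "status"])
    for reading in (4.5, 5.1, 9.3, 7.8):  # sample probe readings
        log_reading(writer, reading)
```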

To learn more about the issues Bio-Pharma is facing, how technology can be used to address these issues, and how data can drive corrective and preventative action programs, read the white paper here.

How Do We Know if Goods are Still Good?

As the need for temperature-controlled transportation rises, problems persist. In The Loadstar article “Unreliable Air Cargo Industry Loses Pharma Traffic While IATA Sleeps”, Kuehne + Nagel's senior vice president for global air logistics products and services, Marcel Fujike, is quoted as saying there was a lack of skills, training and standards throughout cool-chain logistics, with “no SOPs or working instructions in place overall”.

Furthermore, Mr. Fujike noted that vulnerable spots in the air transport chain included handling, loading, the tarmac phase (“considered the weakest link in the chain”) and customs clearance.

Read more here

CargoSense Announced as a Winner of the 2015 Internet of Things (IoT) Evolution Asset Tracking Award

TMC, a global, integrated media company that helps clients build communities in print, in person and online, announced CargoSense was among the recipients awarded the 2015 Internet of Things (IoT) Evolution Asset Tracking Award.

http://www.iotevolutionworld.com/newsroom/articles/405688-winners-the-2015-iot-evolution-asset-tracking-award.htm

CargoSense’s new “Black Box” cargo solution (http://cargosense.com/solution/) shared the stage with the likes of Cisco Systems, Savi, SkyBitz and Vodafone.