June 21st, 2013 by Chance Coble

Some of the software needed for the industrial internet to be successful is still missing. The industrial internet is often discussed in terms of the changes everyday machines need in order to be governed by intelligent, easy-to-update software. Granted, many of the tools for this are already common, including web services for machine-to-machine communication, big data repositories, and machine learning tools to optimize the parameters of a governing model. However, a new breed of flexible analytics tools will be required before everything is fully trusted to the machines. There will be an in-between step in which reporting and analytics tools will need to meet us halfway, so that the right humans are involved when they need to be.


The old AI was all about controlling a single machine so effectively that it could perform tasks as well as or better than a human. The new AI and the industrial internet approach the problem of making a massive number of machines all behave smarter as a system.

Over the past year we have seen plenty of discussion of the coming industrial internet and its implications. Framing software abstractions as controlling some aspect of the physical world means we can put a common, well-specified interface between the sensors and actuators on machines, giving us system-wide visibility and control. The implications are so numerous that visions of statistically optimized energy grids and cars that report their own problems are becoming common. Complete now with its own online forum and magazine, the futuristic vision for the industrial internet is becoming a somewhat defined plan. Descriptions of automotive and aviation sensor monitoring, efficiency, and governance already exist, and smarter energy grids, predictive healthcare outfits, and rapid product evolution in manufacturing are beginning to take shape as well.

Recent events indicate that we will see some real progress over the coming years. GE’s CEO announced the industrial internet as a major initiative a few weeks ago, opening up $2 billion in investment capital for next-generation healthcare built on the industrial internet and making data available for analysis. Major car companies are scrambling to include more connected interfaces in their latest designs, and Silicon Valley is heating up with startups seeking investment for ideas to advance the initiative. The business opportunities are tremendous: optimization and efficiency, improved consumer outcomes, and leaner product cycles that use software to update the behavior of machines based on what the entire system is reporting.


The challenges of the industrial internet are substantial for many applications. Machine monitoring and profiling will mean taking advantage of massive data tools. Analyzing that data will require machine learning techniques that identify patterns requiring intervention. Those patterns could indicate that maintenance is required, or that a consumer outcome could be improved. Finally, reports, alerts, and recommendations will need to be delivered to people who can make a change at the time that change is relevant. That is, content delivery will need to evolve, and I believe modern, flexible business intelligence tools have a role to play in that evolution.
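As a minimal sketch of that pipeline, the snippet below flags unusual sensor readings and turns them into a human-readable alert. The machine name, readings, and threshold are invented, and a simple z-score check stands in for the machine learning step:

```python
from statistics import mean, stdev

def flag_anomalies(readings, threshold=2.5):
    """Return (index, value) pairs whose z-score exceeds the threshold."""
    mu, sigma = mean(readings), stdev(readings)
    if sigma == 0:
        return []
    return [(i, r) for i, r in enumerate(readings)
            if abs(r - mu) / sigma > threshold]

def build_alert(machine_id, anomalies):
    """Summarize flagged readings as a maintenance alert for a human."""
    if not anomalies:
        return None
    return (f"Machine {machine_id}: {len(anomalies)} readings "
            f"outside normal range; inspection recommended.")

# Hypothetical vibration readings from one machine, with one outlier
vibration = [0.52, 0.49, 0.51, 0.50, 0.48, 0.51, 2.90, 0.50, 0.49, 0.52]
alert = build_alert("pump-7", flag_anomalies(vibration))
```

A real deployment would replace the z-score with a trained model, but the shape of the flow (monitor, detect, deliver to a person) is the same.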

Big data tools have now added real value for the consumer internet, and many open source options exist. However, many companies still struggle to unlock value from this data: to clean it properly or even get basic reporting out of it, much less the sophisticated forecasting that is often promised. The question is, if machines were producing that massive data volume from sensors and diagnostics, would we be able to better understand each machine’s maintenance needs? Would we be able to optimize it more effectively in the context of what other machines are reporting? The potential certainly exists, but initially this is likely to be within reach only of the big investors, and out of reach for the small-to-medium-sized business market for the foreseeable future.

So what is the small inventor of new machinery to do? One possibility looks like the app store model: the small inventor purchases components from another company that come with the infrastructure to do this reporting. The algorithms for analyzing the resulting data can be offered as a service to the smaller company, or the inventor can plug in algorithms of their own to analyze the results and communicate instructions back to the machine.
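Under that model, the vendor’s component might expose a plug-in point for the inventor’s own analysis code. The class, method, and field names below are all hypothetical; this is only a sketch of the shape such an interface could take:

```python
class ComponentTelemetry:
    """Hypothetical vendor component that streams diagnostics and
    accepts plug-in analysis algorithms from the inventor."""

    def __init__(self):
        self.analyzers = []

    def register(self, analyzer):
        """Plug in the inventor's own analysis function.

        An analyzer takes a reading dict and returns an instruction
        for the machine, or None if no action is needed."""
        self.analyzers.append(analyzer)

    def process(self, reading):
        """Run every registered analyzer and collect the instructions
        to communicate back to the machine."""
        instructions = []
        for analyze in self.analyzers:
            instruction = analyze(reading)
            if instruction is not None:
                instructions.append(instruction)
        return instructions

# The inventor plugs in a simple overheating rule of their own
telemetry = ComponentTelemetry()
telemetry.register(lambda r: "throttle" if r["temp_c"] > 90 else None)
```

The vendor owns the reporting infrastructure; the inventor owns only the analyzers, which is what makes the model workable for a small company.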

This latter scenario is an exciting one for a new generation of data products. Bringing data product expertise to businesses is exciting work, and there is tremendous value in today’s market in adding data products onto existing software services. However, that work is all human-focused. Creating data products that both glue machines together and involve humans with reports, alerts, and recommendations only where necessary will be an interesting new challenge.

The governance of the data product and its results will be another challenge. Machine learning does very well at optimizing parameters, but in the case of a context switch the results can be disastrous. The problem gives rise to several questions. Will these systems be more problematic in the event of a context switch? A rare cold front in summer, or a plane that suddenly enters a dive, are examples of unlikely events that don’t represent the use cases the algorithms were trained on. How will human override work?
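One simple guard against a context switch is to record the envelope of conditions the model was trained on and defer to a human whenever the current conditions fall outside it. The feature ranges and policy below are made up for illustration:

```python
def within_training_envelope(features, envelope):
    """True if every feature falls inside the (min, max) range seen in training."""
    return all(lo <= x <= hi for x, (lo, hi) in zip(features, envelope))

def decide(features, model_action, envelope):
    """Apply the learned policy only inside its training envelope;
    otherwise hand control back to a human operator."""
    if within_training_envelope(features, envelope):
        return ("auto", model_action(features))
    return ("human_override", None)

# Hypothetical envelope learned from summer operating data: (temperature C, load)
envelope = [(15.0, 40.0), (0.2, 0.9)]
policy = lambda f: "reduce_output" if f[1] > 0.8 else "hold"

normal = decide([28.0, 0.85], policy, envelope)    # conditions seen in training
cold_front = decide([2.0, 0.5], policy, envelope)  # the rare cold front in summer
```

This does not answer the governance question, but it shows one mechanical form a human-override policy could take: the system knows what it was trained on, and escalates when reality leaves that range.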

In particular, the reduced but necessary intersection between these systems and human beings, which will form a step between systems today and the promise of the industrial internet, is intriguing. There may be no way to build this kind of system as a big upfront design-and-deploy effort, because the scale is simply too massive: too many machines, too many systems, and too many modalities. Achieving it incrementally means starting out by reducing human involvement rather than eliminating it. For preventive maintenance, we will need tools that anticipate the maintenance need and deliver that to a human in the simple case. In the more sophisticated case it may mean asking a plant manager to delay running a process by 10 minutes, when there should be less power demand, or recommending to a ground controller, and coordinating with a pilot, an in-flight configuration change to pick up additional fuel savings. The tricky piece is that the machines will be smarter than us in many instances, but will rely on human judgement for context.

Flow for content automatically generated by a business intelligence system.

This in-between stage for the industrial internet means we need an advanced set of tools that can create and deliver that content in a non-intrusive way. The demand for tooling like this could really change the business intelligence market. Why? Because the business intelligence market increasingly relies on the same kinds of data. Big transactional data (website analytics, call detail records) and machine-generated data (GPS, sensor data, RFID, biometrics, application logs) are exactly the kinds of high-volume, high-velocity information that the industrial internet will have to deliver on. That information only has value if you can inform the right person about an event, or better, if you can anticipate an impact on efficiency or consumer outcome and alert the right person to intervene.

That generation of business intelligence tools would still work on data and metadata and incorporate visualizations, but the interaction would be different. This set of tools might be constantly running and available for query, or it might run in the background and help the business analyst boil their concerns down to notifications from the system. In that case, the business analyst’s job will likely be assisting the system in coming up with the set of rules and models that let it inform the analyst of a loss of efficiency or a threat to outcome.
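One way such a tool might let the analyst encode their concerns is as a set of notification rules evaluated against the latest metrics. The rule names, thresholds, and metrics below are invented for illustration:

```python
def evaluate_rules(metrics, rules):
    """Run analyst-defined rules over the latest metrics and
    return the notifications that fired."""
    notifications = []
    for name, predicate, message in rules:
        if predicate(metrics):
            notifications.append((name, message.format(**metrics)))
    return notifications

# Hypothetical analyst-defined rules for a plant efficiency dashboard
rules = [
    ("efficiency_drop",
     lambda m: m["efficiency"] < 0.90,
     "Line efficiency at {efficiency:.0%}; investigate."),
    ("power_spike",
     lambda m: m["power_kw"] > 500,
     "Power draw {power_kw} kW exceeds budget."),
]

metrics = {"efficiency": 0.87, "power_kw": 480}
fired = evaluate_rules(metrics, rules)
```

The analyst’s work shifts from reading dashboards to curating the rule set: the system watches constantly, and the analyst only sees the notifications that fire.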

Today, an awkward set of teams has to work together to set up this kind of system for a business. A business analyst requires an executive sponsor, someone from data governance to make sure the proper MDM is in place, a business intelligence expert (often an outside consultant), a team of data scientists to assist in model generation, and possibly data warehouse and ETL experts if the data is not yet in the proper form for reporting. The result, after considerable effort, is a set of reports and dashboards that the business analyst uses on a regular basis.

The difference in the system that might evolve from the needs of the industrial internet is that the business analyst may point the system at a set of data … and the system works with the analyst to iteratively define the important views to present and events to anticipate. This may not cut down the initial time to develop the result, but the resulting data product will be much more agile because the feedback loop is direct. This approach would certainly not remove the need for the roles mentioned above, but their work would shift to a systems focus rather than an individual-deployment focus. The key is that this kind of system has to be willing to take a guess at what sort of content the business analyst wants to see, and to change that content based on the analyst’s feedback. That would be a radically different way to interact with a business intelligence tool.
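That guess-and-adjust loop can be sketched very simply: the system keeps a score per candidate view, nudges it on each accept or reject from the analyst, and proposes the highest-scoring view next. The view names and scoring scheme here are invented; a real system would use a richer model:

```python
def update_preferences(scores, view, accepted, step=0.2):
    """Nudge the score for a proposed view up or down based on
    whether the analyst accepted it."""
    scores[view] = scores.get(view, 0.5) + (step if accepted else -step)
    return scores

def propose_view(scores):
    """Guess the view the analyst most likely wants to see next."""
    return max(scores, key=scores.get)

# Hypothetical candidate views, both starting with a neutral score
scores = {"downtime_by_machine": 0.5, "energy_by_shift": 0.5}
update_preferences(scores, "energy_by_shift", accepted=False)
update_preferences(scores, "downtime_by_machine", accepted=True)
best = propose_view(scores)
```

Even this toy loop captures the essential change: the system commits to a guess, and the analyst’s feedback, not an upfront specification, is what shapes the content over time.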