I recently attended an Internet of Things (IoT) forum in Cambridge, MA, and was pleasantly surprised by the quality of presentation, the
quantity of interaction, and the technical depth of discussion. For a
meeting unaffiliated with a larger technical conference or trade show,
it attracted a knowledgeable and engaged audience with a good mix of
engineering and management staff.
Ascent Venture Partners sponsored and co-hosted the event, as they
do for several industry meetings during the year in the Boston/Cambridge area. I give a tip of the hat to them for keeping the session free
of commercial messaging, save one tastefully brief announcement at the
meeting’s start. Another hat tip goes to Ascent and co-host INEX Advisors for assembling a panel of industry participants with a wide range
of application foci. Doing so makes more work for the moderator, who
must juggle topics and pull together disparate threads of discussion to
keep the session moving. However, such a panel makeup provides the
audience with a broader range of perspectives and a more multifaceted
view of the topic — a valuable attribute for a session on IoT at this
point in its evolution.
The panel included Stephen Pavlosky, Equipment Insight Lead at GE’s Intelligent Platforms Division; Doug Merritt, SVP of Field Operations at Splunk, an operational intelligence platform that analyzes and visualizes machine data; Jamshed Dubash, COO of Senaya, a supply chain technology company focused on the IoT for remote and mobile asset management; Mike Helfrich, CEO of Blue Force Development, which focuses on mobile IoT for national security, public safety, and distributed enterprise; Carl Levine, Community Manager at Dyn, which uses IoT technologies and methods for monitoring, control, and optimization of online infrastructure; and moderator Christopher Rezendes, President of INEX Advisors, a global firm focused on IoT.
When things learn to talk
The panel’s range of interests demonstrated, perhaps, why many discussions about the IoT seem to lack cohesiveness: the IoT is, I’d assert, varied in application, scope, and technology, so one tends to view
the capability and requisite resources through the lens of one’s primary
interests. For example, GE’s Pavlosky, Senaya’s Dubash, and Blue Force
Development’s Helfrich use IoT capabilities to monitor distributed
physical assets and to collect, process, and deliver data about those
assets to decision makers, both in the field and in central offices.
Pavlosky observed that the sensing technologies aren’t new, but “it’s
an interesting time, driven by ubiquitous connectivity, large amounts
of computing power in small devices, along with cloud-based infrastructure that allows us to deploy a large number of applications at a
small price point.”
In many cases, asset monitoring provides machine health data. Combining and comparing data from large numbers of distributed assets
allows asset providers like GE to refine prediction models to minimize
unscheduled downtime of, say, power generation equipment, railroad
traction systems, or jet engines. For example, GE systems monitor every Delta flight, collecting data three times per flight that allows the company to detect anomalies in an engine’s operation and trigger non-scheduled maintenance. This protects the lives of passengers and crew, minimizes schedule interruptions, and lowers costs to the airline by keeping expensive assets in service with greater safety.
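The fleet-comparison idea Pavlosky described can be sketched in a few lines. The sketch below is purely illustrative: the z-score approach, the threshold, and the sample readings are my assumptions, not GE’s actual prediction models.

```python
# Illustrative fleet-baseline anomaly check (not GE's actual method):
# flag a sensor reading that deviates sharply from comparable engines.
from statistics import mean, stdev

def is_anomalous(reading, fleet_readings, threshold=3.0):
    """Return True if the reading's z-score against the fleet exceeds the threshold."""
    mu = mean(fleet_readings)
    sigma = stdev(fleet_readings)
    if sigma == 0:
        return False
    return abs(reading - mu) / sigma > threshold

# Hypothetical exhaust-gas temperatures (deg C) from comparable engines:
fleet = [612, 615, 610, 618, 611, 614, 613, 616]
print(is_anomalous(655, fleet))  # True: well outside the fleet spread
print(is_anomalous(614, fleet))  # False: within normal variation
```

The point of comparing against a large, distributed fleet rather than a fixed limit is that the baseline itself improves as more assets report in.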
Senaya’s Dubash mentioned that of the $80 trillion global GDP, $8 to $10 trillion is in the supply chain. For the supply-chain sector, the IoT provides a means of determining asset status without adding to, or depending on, infrastructure at ports or transfer points not controlled by asset owners or shipping service providers.
Helfrich of Blue Force Development offered one of the most compelling use cases of the evening. Blue Force builds software for the military and counter-terrorism space. As with many other IoT applications, the military’s use models predate ubiquitous bandwidth access.
Blue Force’s goal was to build secure software to allow rapid (i.e., within 10 seconds) formation of asset teams — confederations of people, sensors, and software-based services.
The software needed to support non-traditional partnerships such
as coalitions or interagency security teams. Among other outputs, the
software provides to field personnel and command and control staff
what Helfrich described as a common relevant operational picture. Instead of giving forward forces a roster of people, resources, and sensors,
the software allows users to create a view of machine and sensor data, correlate across sources, and set up a notification framework so the team can get information and make decisions quickly. For example,
the system can deliver time-and-distance-to-threat information, which
directly affects safety and operational efficiency.
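To make “time-and-distance-to-threat” concrete, here is a minimal, hypothetical calculation from two GPS fixes and an assumed closing speed. The coordinates and speed are invented for illustration; Blue Force’s actual system is certainly far more sophisticated.

```python
# Hypothetical distance-to-threat and time-to-threat from two GPS fixes.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def minutes_to_threat(distance_km, speed_kmh):
    """Time to close the distance at a constant speed, in minutes."""
    return distance_km / speed_kmh * 60

# A threat roughly 10 km away, closing at 40 km/h (Boston-area fixes):
d = haversine_km(42.3601, -71.0589, 42.4430, -71.0230)
print(round(minutes_to_threat(d, 40), 1))
```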
When data is the product
By contrast, Dyn’s Levine and Splunk’s Merritt focus less on physical assets and more on data about data flows. While Levine concentrates on internet infrastructure, Merritt’s interests center on traffic within IT data centers. Merritt noted that data centers generate massive amounts of log data that is unstructured and can vary from
system to system. Historically, the ability to analyze that data has
been, at best, limited. With increasing use of interactive online services and the growing market for cloud-based computing, traditional
use models, which were heavily weighted toward file serving, are shifting to include app delivery and app management.
The heterogeneous use models have complicated an already difficult security environment: millions or billions of unstructured data items
pass through the environment and, if analyzed properly, they could
reveal an impending breach or other malicious behavior.
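The kind of triage Merritt described, sifting attack patterns out of heterogeneous log lines, can be sketched minimally. The log formats, regular expression, and threshold below are illustrative assumptions, not Splunk’s implementation.

```python
# Illustrative triage of unstructured logs: flag source addresses with
# repeated failed-login attempts across differently formatted systems.
import re
from collections import Counter

LOGS = [
    "2014-11-03T10:01:12 sshd[311]: Failed password for root from 203.0.113.9",
    "web01 GET /login 401 client=203.0.113.9",
    "2014-11-03T10:01:15 sshd[311]: Failed password for admin from 203.0.113.9",
    "db02 INFO checkpoint complete in 412ms",
    "web01 GET /index.html 200 client=198.51.100.4",
]

# Matches two hypothetical failure formats and captures the source IP.
FAIL = re.compile(r"(?:Failed password .* from|401 client=)\s*([\d.]+)")

def suspicious_sources(lines, threshold=3):
    """Return source IPs with at least `threshold` failed attempts."""
    counts = Counter()
    for line in lines:
        m = FAIL.search(line)
        if m:
            counts[m.group(1)] += 1
    return {ip for ip, n in counts.items() if n >= threshold}

print(suspicious_sources(LOGS))  # prints {'203.0.113.9'}
```

The hard part at data-center scale is not the pattern match but normalizing billions of lines from systems that each log differently, which is the problem operational-intelligence platforms exist to solve.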
Security was one of the topics that resonated with the forum’s audience. Data security remains a seemingly intractable problem online.
For example, Forbes reports that one breach of Home Depot servers