
Software is the new sensor

Originally published by Jeff Sieracki, CTO and Co-Founder of Reality Analytics, Inc.

Sensor: a device that detects or measures a physical property and records, indicates, or otherwise responds to it.

We all know what a sensor is, right? A sensor makes “sense” of a physical property – it turns something about the physical world into data upon which a system can act. Traditionally, sensors have filled well-defined, single-purpose roles: a thermostat, a pressure switch, a motion detector, an oxygen sensor, a knock detector, a smoke detector, a voltage arrestor. Measure one thing, and transmit a very simple message about that one thing. This thinking stems from several hundred years of physical engineering of devices, and it persists today in part because of the convenience of modular thinking in system design.

But this is changing. Fundamentally. Software is becoming the new sensor.

Consider your smartphone: it has sound and image capability, along with a multi-axis accelerometer, 3-axis gyroscope, magnetic compass, air pressure, light levels, touch, you name it. The sensor suite on a current-generation smartphone would completely outclass sensor packages flown by the US military only a few years ago. Some of these phone-based sensors still use dedicated hardware to reduce transduced data to information, but increasingly it is all done in software: acceleration and gyro data are reduced to a screen orientation, to a “phone-to-ear” detector, and to navigational inputs. Instead of the old-school sensor design, chips capable of capturing highly granular physical inputs at high-frequency sample rates feed software running in local memory on a local processor, reducing that data stream into the specific inputs needed for a variety of different purposes by the OS and by apps.
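To make the idea concrete, here is a minimal sketch of that reduction step – raw accelerometer samples in, an orientation event out. It is a hypothetical illustration, not any phone OS's actual algorithm (real implementations filter and debounce many samples, and axis conventions vary by platform):

```python
import math

def screen_orientation(ax, ay, az):
    """Classify device orientation from one accelerometer sample (in g units).

    Hypothetical sketch of a software-defined sensor: the chip delivers raw
    acceleration; software reduces it to the single input the OS needs.
    """
    # If gravity lies mostly along the z-axis, the device is lying flat.
    if abs(az) > 0.9 * math.sqrt(ax * ax + ay * ay + az * az):
        return "flat"
    # Angle of gravity within the screen plane decides portrait vs. landscape.
    angle = math.degrees(math.atan2(ax, ay))
    if -45 <= angle <= 45:
        return "portrait"
    if 45 < angle <= 135:
        return "landscape-left"
    if -135 <= angle < -45:
        return "landscape-right"
    return "portrait-upside-down"
```

The same raw stream could just as easily feed a step counter or a “phone-to-ear” detector – the transducer stays fixed while the sensor is defined in code.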

Because the decision engine no longer owns the transducer, the underlying data is also available in its raw form. This means a smartphone app can increasingly leverage the same sensor data to make its own decisions in ways never specifically intended by the hardware designer. Am I running, walking, or standing in line? Am I on the bus or in the car? How’s my driving? How’s my workout going? Is it getting dark out? Is the ambient crowd noise loud enough that I should turn up the volume? What’s the gender and age of the speaker? A home security system based on similar thinking, with a microphone and a suitable microcontroller, can do much with a software-defined audio processing capability: a glass-break detector, a footstep detector, a heartbeat counter, a doorknob-rattle detector, a dog-bark or shout detector, a trip-and-fall sensor for grandma, an unauthorized-teenager-party alarm – all of these sensors defined in software within the same flexible hardware box.
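A toy version of one of those audio sensors shows how little is required to turn a raw microphone stream into an event. This is a hedged sketch with made-up thresholds – production detectors use trained models rather than two hand-set cutoffs – but the reduction from samples to a yes/no decision works the same way. Glass breaking is loud (high energy) and bright (many zero crossings):

```python
def glass_break_score(samples):
    """Toy software-defined 'glass break' detector over one audio frame.

    `samples` is a list of floats in [-1, 1]. Both threshold values below
    are hypothetical placeholders, not tuned production numbers.
    """
    n = len(samples)
    energy = sum(s * s for s in samples) / n  # mean signal power
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    zcr = crossings / n                       # zero-crossing rate
    return energy > 0.1 and zcr > 0.3
```

Swap the feature math and the same microphone becomes a footstep detector or a bark detector – the hardware box never changes.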


Industrial IoT applications are numerous, and many are already in production. Dedicated physical thermocouples and vibration limit switches are being replaced with digital temperature probes and accelerometers attached to embedded microcontrollers. New software-defined sensing can now employ AI and predictive analytics to intervene before a problem happens. We can alert operators to a pending issue or needed maintenance in time for critical, high-value processes to be spooled down in a controlled, planned fashion – or resolved during the next scheduled downtime so that no interruption is necessary at all. Manufacturers and insurers can be kept in the loop regarding equipment field issues and parts needs, and can perform post-event forensics after critical failures.
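The difference between a hard limit switch and predictive software sensing can be sketched in a few lines. This is a hypothetical illustration (real systems would use trained models and tuned thresholds): instead of tripping only at an absolute limit, the monitor tracks a rolling baseline and flags vibration that creeps above it, while the next scheduled downtime is still an option:

```python
from collections import deque

class VibrationMonitor:
    """Hypothetical software-defined vibration sensor for predictive maintenance.

    A classic limit switch only 'trips' at a hard threshold. This sketch
    also watches the trend: it keeps a rolling baseline of RMS vibration
    and raises a maintenance alert before the emergency limit is reached.
    """

    def __init__(self, window=100, warn_ratio=1.5, trip_limit=4.0):
        self.history = deque(maxlen=window)  # recent RMS values (baseline)
        self.warn_ratio = warn_ratio         # alert when RMS > ratio * baseline
        self.trip_limit = trip_limit         # absolute emergency-stop level

    def update(self, frame):
        rms = (sum(s * s for s in frame) / len(frame)) ** 0.5
        if rms >= self.trip_limit:
            return "trip"                    # classic limit-switch behavior
        if self.history:
            baseline = sum(self.history) / len(self.history)
            if rms > self.warn_ratio * baseline:
                return "schedule-maintenance"
        self.history.append(rms)
        return "ok"
```

The "schedule-maintenance" signal is the part a physical limit switch can never produce – and it is pure software over the same accelerometer data.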

Modularity is increasingly moving from the physical layer to a network layer, in which modules are connected on a peer-to-peer network, exchanging packetized information. While this network layer begins as a digital substitute for individual electrical circuits, with increasing bandwidth capacity it can also provide the flexibility for devices to share underlying data as well as local yes/no decisions. This creates an unprecedented opportunity both to integrate information across modalities and to add brand-new capability, ad hoc, in the form of software sensors. This shift in thinking also opens the door to incorporating more complex, AI-based algorithms, rather than just simple condition thresholds. Sensor information can be integrated in ever more complex ways, and even the humble electrical-panel circuit breaker is becoming a microcontroller-powered, software-sensing device. Tools like Reality AI for IoT are enabling machine learning smarts in these environments.
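The "share raw data, not just decisions" idea might look like the following sketch. The packet format and function names here are invented for illustration – no standard is implied – but the design choice is the point: a module publishes both its legacy-style yes/no output and the underlying window, so any peer can define a new software sensor over the same data, ad hoc:

```python
import json

def make_packet(node_id, decision, raw_window):
    """Encode one module's output for the peer-to-peer network layer.

    Hypothetical format: 'decision' is the digital stand-in for a dedicated
    electrical circuit; 'raw' is the underlying data window that lets peers
    layer new sensing capability on top.
    """
    return json.dumps({"node": node_id, "decision": decision, "raw": raw_window})

def peer_mean_level(packet):
    """A peer-defined 'sensor': re-analyze another node's raw data."""
    msg = json.loads(packet)
    return sum(msg["raw"]) / len(msg["raw"])
```

A second peer could subscribe to the same packets and compute something entirely different – the new capability costs a software update, not a new wire.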


About the author:

Jeff Sieracki is CTO and Co-Founder of Reality Analytics, Inc. He is blogging regularly about advanced artificial intelligence techniques optimized for sensor inputs and topics such as Connected Devices, Internet of Things, Wearable Technology, Smart Home, Smart Car, and Machine Health. Find out more about Reality Analytics: http://www.reality.ai/
