The curious incident of the dog in the night-time…

In the 90s, the world of software development pivoted around a concept called “Object Oriented Programming.” The previous unit of capability in an application was its functions or procedures (hence the later moniker “Procedural Programming”), where a program would move through a series of steps, often in a loop (check for user input, process that input, write the result somewhere, repeat).
In Object Oriented Programming, an application is made up of things (or, “objects”) just like in the real world. Objects have functions, but they also have properties, and events. This lent itself well to the Graphical User Interface, where something like a button is easily understood as an object. A button has a property that describes its label text, an event that happens when it’s pressed, and a function it performs when that press is complete. Objects also have hierarchy, where one object is the “parent” of another — a model that lets a programmer traverse a program from its higher-level functions down to its smallest blocks of capability.
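To make that concrete, here is a minimal sketch of the button-as-object idea. The names (Button, on_press, press) are illustrative, not taken from any particular GUI toolkit:

```python
# A hypothetical Button object: one property, one event hook, one function.
class Button:
    def __init__(self, label):
        self.label = label        # property: the button's label text
        self.on_press = None      # event: a callback fired when pressed

    def press(self):              # function: performed when the press completes
        if self.on_press:
            self.on_press(self)

# Wire up an event handler, then simulate a press.
pressed = []
ok = Button("OK")
ok.on_press = lambda btn: pressed.append(btn.label)
ok.press()
```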
More powerful languages included a sophisticated feature called “classes.” These are a little harder to explain. If you think of a car as an object, then abstract that a little: a car is an object in a Vehicle class. All vehicles share some common properties, like a chassis and a seat, common functions like MoveForward, MoveBackward, and common events like StartingUp. Creating the class lets you define those once, and every object that derives from that class shares those attributes automatically.
You can then create a “sub-class”, called FourWheeledMotorVehicles, which shares all the common properties and functions of the Vehicle class, but declares its own set of common attributes. You can apply this cascading inheritance through as many sub-classes as you want, but you can’t actually interact with them — to do that, you have to create an object which is an instance of a class. An object inherits all the properties of the class it derives from, and those become interactive when the object is instantiated (created).
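The class-and-instance relationship described above can be sketched like this (property values and method names are invented for illustration):

```python
# Illustrative sketch of the Vehicle hierarchy described above.
class Vehicle:
    def __init__(self):
        self.chassis = "steel"    # common property, defined once
        self.seats = 1            # common property

    def move_forward(self):       # common function shared by all vehicles
        return "moving forward"

class FourWheeledMotorVehicle(Vehicle):
    def __init__(self):
        super().__init__()        # inherit everything Vehicle defines
        self.wheels = 4           # attribute declared by the sub-class

# You can't interact with the class itself — you instantiate an object from it.
car = FourWheeledMotorVehicle()
```

Once `car` exists, it carries both the sub-class's own attributes and everything inherited from Vehicle.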
An information “super highway” then is full of objects that can be queried for information (through their properties), given instructions (through their functions), and can inform other objects of what’s happening to them (through their events). And the class inheritance approach lets a programmer understand things about many objects, without having to know the full details about each of them (e.g., most objects on this highway derive from the Vehicle class, so if I know how to get information from a Vehicle, I know how to get information from most of the objects).
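That “one interface, many objects” idea can be sketched in a few lines. The class names and the describe method are assumptions for illustration:

```python
# Hypothetical sketch: querying a mixed set of objects through the shared
# Vehicle interface, without knowing each object's concrete class.
class Vehicle:
    def describe(self):
        return "a vehicle"

class FourWheeledVehicle(Vehicle):
    def describe(self):
        return "a four-wheeled vehicle"

class TwoWheeledVehicle(Vehicle):
    def describe(self):
        return "a two-wheeled vehicle"

highway = [FourWheeledVehicle(), TwoWheeledVehicle(), Vehicle()]

# One loop works for every object that derives from Vehicle.
reports = [v.describe() for v in highway]
```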
To get more information, a programmer may have to learn how to talk to the FourWheeledVehicle class; those capabilities are a little more specialized (some of what I learn might not be applicable to another class called TwoWheeledVehicle) but for that set of objects, also more specific and powerful.
Of course, this is only a surface explanation, and I provide it only to point out that this was a powerful concept that helped change computing permanently. One of the pioneers of Object Oriented Programming was NeXT, who delivered a brand new operating system around this model. That OS, called NeXTStep, became Apple’s OS X and iOS, powering modern Macs and iPhones. Other vendors were working on similar efforts, and by the end of the 90s, the pivot was complete… for some people.
The thing is, this powerful shift never really came to manufacturing — not fully. Manufacturing technology has layers, with primitive, physical switches, motors and buttons at the bottom, and orchestration and business software at the top. Those layers were famously modeled at Purdue in the 90s; a first step toward understanding the flow of data in an operation. At the top layers, most software moved to Object Oriented Programming — having been delivered by, and for, Information Technology (IT) resources. In the mid-to-bottom layers, technology remained largely in the realm of the electrical engineer: skilled in orchestrating logic flow and used to working with physical wiring, these engineers saw their programming evolve from wiring diagrams into a concept called “ladder logic.” Ladder logic allows a programmer to express functions and loops in software, which are designed to be applied against physical equipment. This kind of programming is written to a PLC (Programmable Logic Controller) or PAC (Programmable Automation Controller) that controls the operation. We often call this Operational Technology (OT).
The result of this alternate evolution is that there is no object model in the lower layers of a manufacturing network. Sure, there are physical objects, and there’s a mapping between rungs of ladder logic, and points of data (called tags), to the real world — but that mapping is largely in the programmer’s head. There’s no forcing mechanism (and rarely even the facility) to model manufacturing objects in code. As a result, the individual or team who programmed a machine can look into the code and understand how the machine works — but no one else can. Nothing else can. Another program can’t come along, inspect the objects, map them against classes of common capabilities, and create value out of the information the system emits… because there’s no way to understand what that information means, without asking a human being to participate.
This is where my career began, about 20 years ago: building higher level information software systems that attempted to assemble an after-the-fact object model for a manufacturing system — by asking a human to construct it. If we could just get people to participate in (re)establishing an object model, we could give them powerful information (analytics) about what was happening with their systems. It took the industry most of two decades to realize that this wasn’t broadly accomplishable. The divide between the skills of the people who build manufacturing systems, and the skills of the people who build information systems, was too hard and too expensive to bridge. For an “information superhighway” to come to life inside manufacturing, we were going to need to bridge the gap on behalf of our users.
That’s what Shelby does — in a small way. By discovering devices on a manufacturing network, Shelby can match each of them to a device Class that we pre-define inside of Shelby (we call these “Profiles”). Using the closest match we can find, we’re able to create an instance of that device inside Shelby’s object model, and automatically begin asking the device about its status (calling functions), collecting information (defining properties) and surfacing diagnostic notifications (events), about some of the primitive components of the network. Shelby can’t understand what those parts are doing, or many details about how the objects are related to each other — that info is still trapped in the ladder logic. But it can do something we couldn’t before: it can create information value automatically, and begin to bridge operational and informational technology worlds together. That’s why Shelby works in minutes, where all other information software takes days, weeks or months.
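The discover-match-instantiate flow described above might be sketched like this. To be clear, this is purely illustrative pseudocode under my own assumptions, not Shelby’s actual implementation — the profile names, fields, and matching rule are all invented:

```python
# Purely illustrative sketch (not Shelby's real code): match a discovered
# device to the closest pre-defined Profile, then instantiate it in an
# object model so its properties can be collected automatically.
PROFILES = {
    "plc":   {"properties": ["status", "firmware_version"]},
    "drive": {"properties": ["status", "speed", "torque"]},
}

def closest_profile(device_type):
    # Use the closest match we can find; fall back to a generic profile
    # that only knows how to ask for basic status.
    return PROFILES.get(device_type, {"properties": ["status"]})

# A device discovered on the network (fields are hypothetical).
discovered = {"address": "10.0.0.12", "type": "drive"}

profile = closest_profile(discovered["type"])
instance = {
    "address": discovered["address"],
    "collect": profile["properties"],   # what to begin querying automatically
}
```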
Sherlock

This is only the beginning, of course. Understanding parts of a system is an important step along the way, but we need to get inside the head of the implementer — and the best surface we have for that is the Controller (PLC); the place where the engineer articulated their understanding of, and intent for, the system. To automate this understanding is another technology leap, one that goes beyond modeling the presence of devices, and begins to understand the physics of an operation. That’s where Sherlock comes in, and that’s why it’s one of the most important innovations manufacturing will see this decade…
