
When I started in technology thirty-plus years ago, I never would have imagined we’d have the smart devices we enjoy today. There are so many gadgets that improve our quality of life, from smart lighting and smart refrigerators (to remind you it’s time to buy more milk) to smart hairbrushes for healthier hair. I have to admit, there are times when I just sit back, a little awed, and wonder how in the world this all got started.

A first-person history of the Internet of Things

I began my career working for a very small company of about thirteen people. We spent our days developing products for monitoring diecasting and injection molding machines. Think of each device as a network-attached digital oscilloscope. Our goal was to analyze, in real time, the complex process of making a specific part, including speeds, temperatures, pressures, and so on. But in retrospect, the most interesting part was that we were also setting acceptable windows around those variables to determine whether each part created was good or bad.
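To make the idea concrete, here’s a minimal sketch of that kind of window check in modern Python. The parameter names and limits are hypothetical, purely for illustration, not values from our actual system: each measured variable gets an acceptable range, and a part is flagged bad if any reading falls outside its range.

```python
# Hypothetical parameter windows for one part; names and values are invented.
LIMITS = {
    "shot_speed_mps":    (1.8, 2.4),      # plunger speed, meters/second
    "die_temp_c":        (180.0, 220.0),  # die temperature, Celsius
    "peak_pressure_psi": (9000, 11000),   # peak injection pressure
}

def classify_part(readings):
    """Return ('good', []) or ('bad', [list of violated parameters])."""
    violations = [
        name for name, (lo, hi) in LIMITS.items()
        if not (lo <= readings[name] <= hi)
    ]
    return ("good" if not violations else "bad", violations)

print(classify_part({"shot_speed_mps": 2.1,
                     "die_temp_c": 231.0,
                     "peak_pressure_psi": 9800}))
# -> ('bad', ['die_temp_c'])
```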

I remember a conversation with a gentleman who had been operating diecast machines for over thirty years and was the “guru” for dialing in the process. I asked him how he knew if he was making good parts, and he said it was the “feel” of the machine. So our efforts to embed technology into the mix drastically changed the manufacturing process, turning it from art into science. Even back then, we realized we were on the cusp of a new revolution, and I was excited to be a part of it.

Early IoT: Frying eggs with sticks and stones

As you can imagine, in the late 1980s we were developing with sticks and stones compared with what’s available today. The early Internet of Things (IoT) products we created were based on the first 16-bit home computer processor: Texas Instruments’ TMS9900 central processing unit (CPU). It was big enough to fry an egg on and hot enough to burn off a fingerprint (I’m speaking purely from experience, and not about the egg).

Fortunately, cooling and space requirements weren’t a big concern since our cabinets were about the size of the robot in the 1960s series Lost in Space. For reference, that’s about the size of a modern smart fridge. The bigger issues we faced were:

  • processing the inputs locally from the diecasting and injection molding machines
  • sending that data to the central controller for storage and additional analysis

Basically, this was fog computing in its infancy. And we were knee-deep in figuring it all out.

The birth of digital transformation

To better understand what we were dealing with back in the 80s, try to visualize a sea of machines pumping out parts. The time it takes to make each part is called the cycle time. Say you have a 120-second cycle time per part. Now imagine shrinking that to 15 seconds while every machine reports over a multi-point RS-485 serial network that only allows one device to transmit at a time. Definitely a challenge, especially back in the 80s.
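Some back-of-the-envelope math shows why. The machine count and baud rate below are assumptions I’m picking for illustration, not figures from our actual installations:

```python
# Rough bus-budget math for a shared, one-talker-at-a-time serial line.
# All of these numbers are illustrative assumptions.
machines      = 30     # drops sharing one RS-485 bus
cycle_s       = 15     # seconds between parts on the fastest machines
baud          = 9600   # a common serial rate of the era
bits_per_byte = 10     # 8 data bits + start + stop bits (8N1 framing)

window_s  = cycle_s / machines             # transmit window per machine
max_bytes = window_s * baud / bits_per_byte

print(f"{window_s:.2f} s per machine, ~{max_bytes:.0f} bytes per cycle")
# -> 0.50 s per machine, ~480 bytes per cycle -- and that's before
#    polling overhead, retries, or any headroom for slower drops.
```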

We had to develop techniques to compress the time it took to send information, one of which was to send the starting value and then only the subsequent differences. We were working with bits, not bytes (and certainly not kilobytes). Given the speed at which many of these machines operated, we even had to develop custom hardware that would send data directly to random access memory (RAM), since the CPU wasn’t fast enough to collect the information itself, essentially an early form of direct memory access (DMA). I realize how hard that is to believe given today’s processing capabilities, but it was a simple fact of everyday life for us back then.
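Here’s a rough sketch of that difference-based trick, known as delta encoding, in modern Python rather than the low-level code we actually wrote: the first sample goes out whole, and every later sample is sent as its difference from the previous one, so slowly changing readings fit into far fewer bits.

```python
def delta_encode(samples):
    """First value as-is, then each value as a difference from the last."""
    if not samples:
        return []
    return [samples[0]] + [b - a for a, b in zip(samples, samples[1:])]

def delta_decode(encoded):
    """Rebuild the original samples by running-summing the differences."""
    samples, total = [], 0
    for d in encoded:
        total += d
        samples.append(total)
    return samples

pressures = [10240, 10244, 10243, 10247, 10251]  # slowly drifting readings
encoded = delta_encode(pressures)
print(encoded)  # -> [10240, 4, -1, 4, 4]: small deltas need only a few bits
assert delta_decode(encoded) == pressures
```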

You may be asking why we used multi-point RS-485. Pretty straightforward: all the manufacturing equipment relied on induction motors, which produce serious electromagnetic interference (EMI), and RS-485 was the only viable technology at the time that could withstand that harsh environment.

The heart of our system was a central controller that stored the information collected from the remote devices for quality assurance, let us update process parameters on each remote device, and provided alarming for machines operating out of parameters. Many manufacturers were so reliant on our technology that they couldn’t run their machines without it collecting and processing data.

Next up: How the IoT transitioned from past to future

[Photo: the circuit board I designed in 1988 alongside a Raspberry Pi]

Ironically, we are still dealing with many of the same challenges from 30 years ago, but at a much greater scale. The photo above shows a circuit board I designed in 1988 next to a Raspberry Pi. The devices are very similar in function, with the exception of the analog-to-digital (A/D) conversion capability on the large board, though that functionality could easily be added to a Pi with a single A/D chip. There is obviously no comparison in processing, storage, and display when it comes to speed and performance: the Pi wins hands-down!
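As a quick illustration of how little it takes today, here’s a sketch of reading such a chip from a Pi. I’m assuming an MCP3008 (a common 10-bit, 8-channel A/D converter) wired to the Pi’s SPI bus and read through the spidev Python library; none of this is specific to my old design.

```python
import spidev  # Python bindings for the Pi's SPI interface

spi = spidev.SpiDev()
spi.open(0, 0)                 # SPI bus 0, chip-select 0
spi.max_speed_hz = 1_350_000   # within the MCP3008's rated clock

def read_adc(channel):
    """Read one 10-bit sample (0-1023) from MCP3008 channel 0-7."""
    # Start bit, then single-ended mode + channel, then a filler byte.
    reply = spi.xfer2([1, (8 + channel) << 4, 0])
    return ((reply[1] & 3) << 8) | reply[2]

print(read_adc(0))  # e.g. a pressure transducer wired to channel 0
```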

Every day you and I leverage the IoT for a better quality of life. And IoT devices are helping private and public sector organizations be better stewards of our planet’s limited resources. Plus, the power of the IoT is helping to monitor and improve manufacturing processes, better control traffic flow, ease parking stresses in communities large and small, and more. In part two, I’ll address the transition from past to present.

Additional resources

Interactive: Create your own blueprint for digital transformation

Article: Powering digital transformation in municipal government

Digital transformation for government

FedRAMP Authorized Collaboration Solutions for Government