The movement to ever smaller intervals of time has dramatic implications for the field of control. Growing processing power and dramatic reductions in cost are making the future of the industry much harder to predict. When the cost barrier to implementing sophisticated control systems drops low enough, a wave of new applications will follow, representing a major shift in how we do things. What are those applications and what will they look like in the future? No one knows for sure.
You cannot control what you cannot measure. The converse is not strictly true; measuring something only gives you a chance to control it. A central question in measurement, particularly in the digital age, is how often a sample must be taken to determine a value for the quantity being measured.
Digital sampling theory requires that the sample rate be at least twice the event rate. For more complex interactions, 10 samples per event are often required. This technique, known as oversampling, also helps to ensure data integrity. One of the anomalies of digital measurement is the potential for a dropout, where a zero value is returned that is actually an error. The audio industry discovered this early in the conversion from analog (vinyl) recordings to digital CDs.
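A quick numerical sketch makes the point. The example below (in Python, using an illustrative 60 hertz sine wave) shows why the bare-minimum 2x rate can mislead, and why oversampling at 10x reveals the waveform:

```python
import math

def sample(freq_hz, rate_hz, n):
    """Sample a unit-amplitude sine of freq_hz at rate_hz, n points."""
    return [math.sin(2 * math.pi * freq_hz * i / rate_hz) for i in range(n)]

# Sampling a 60 Hz sine at exactly 2x (120 Hz) can land every sample on a
# zero crossing -- each reading is ~0, and the signal appears to vanish.
at_2x = sample(60, 120, 8)

# Oversampling at 10x (600 Hz) captures the shape of the wave; samples
# reach about +/-0.95 of the true peak over one full cycle.
at_10x = sample(60, 600, 10)
```

The zero readings from the 2x case are also a tidy illustration of the dropout anomaly: a string of zeros that is an artifact of the sampling, not a property of the signal.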
In the industrial arena, one of the biggest issues is electric motor control. Regardless of the application, whether it is pumping fluids, blowing air, powering a conveyor or moving a robot, electric motor control is a major activity. The industry builds and ships $12 billion worth of electric motors a year.
How does time relate to the electric motor? In the simplest applications, alternating-current motors run from a 60 hertz power source. That 60 hertz line frequency forms the time base for all ac motors, whose windings change state every 16.7 milliseconds. To measure the presence or absence of power, we could try sampling at twice the event frequency, but since the power being measured changes sinusoidally, data taken at that rate is nearly useless. Even at 10 times the frequency, a sample every 1.67 milliseconds, or roughly 600 hertz, the values can be sketchy depending on how accurately we need to see the waveform.
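The arithmetic behind those intervals is worth making explicit; a minimal sketch:

```python
line_hz = 60.0                            # ac line frequency
cycle_ms = 1000.0 / line_hz               # one full cycle: ~16.7 ms
nyquist_ms = 1000.0 / (2 * line_hz)       # sampling at 2x: one sample every ~8.3 ms
oversample_hz = 10 * line_hz              # 10x oversampling: 600 Hz
oversample_ms = 1000.0 / oversample_hz    # one sample every ~1.67 ms

print(round(cycle_ms, 1), round(nyquist_ms, 1), round(oversample_ms, 2))
# -> 16.7 8.3 1.67
```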
Consider that the rate of change of current over time (di/dt) defines the threshold of the “safe operating area” for a power transistor. To keep the power transistor from blowing up, the control system must measure the incoming current at a high enough frequency to capture the value over several samples, compute the rate of rise of current from those measured values, and turn off the output to the power transistor if the rise is too steep. All of which needs to happen very quickly.
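That measure-compute-shut-off loop can be sketched as follows. This is a hypothetical guard written in Python for clarity; the function name, sample interval, current values, and limit are all illustrative, and a real implementation would run as firmware or a hardware comparator on the microcontroller:

```python
def di_dt_within_limit(samples_amps, dt_s, max_rise_a_per_s):
    """Check the rate of rise of current across consecutive samples.

    Returns False as soon as di/dt exceeds the safe-operating-area
    limit -- the cue to turn off the power transistor's output.
    """
    for prev, cur in zip(samples_amps, samples_amps[1:]):
        if (cur - prev) / dt_s > max_rise_a_per_s:
            return False   # rise too steep: shut the output off
    return True

# A gentle ramp of 0.1 A per 1 ms sample (100 A/s) stays inside a
# 500 A/s limit...
ramp_ok = di_dt_within_limit([1.0, 1.1, 1.2], dt_s=0.001,
                             max_rise_a_per_s=500.0)   # True
# ...but a 1 A jump in 1 ms (1000 A/s) trips the guard.
surge_ok = di_dt_within_limit([1.0, 2.0], dt_s=0.001,
                              max_rise_a_per_s=500.0)  # False
```

The design point is the one the paragraph makes: the check is only as good as the sample interval dt_s, so the current must be sampled fast enough that a dangerous rise spans several samples before the transistor is damaged.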
In today’s electronics market, this is a piece of cake. Microcontrollers can perform this function at clock rates of 50 megahertz and beyond for only a few dollars, often with Ethernet or CAN bus networking already integrated into the device.
Which leads back to the original question: where is all this technology leading? To the future and beyond.