Using tools such as graphical system design, researchers are developing new, safer ways of interacting with machines that also permit more efficient operation.
By Gerardo Garcia, Product Manager
Ben Black, Systems Engineer
National Instruments
Have you ever played a car racing video game that shakes when you go off-road? If so, you have interacted with a haptic interface. The word haptic comes from the Greek haptikos, which means to touch, grasp, or perceive.
With haptic robotics, a user can feel a remote or virtual environment. A haptic interface provides sensory feedback — typically in the form of pressure or physical resistance — so users feel as if they are physically interacting with something, even though they are not. For example, a haptic interface may be used to provide a feeling of resistance in the rudder controls of a flight simulator. The feedback would help the pilot know when to apply more or less force to the instruments.
Figure 1. Ray Goertz, seen here operating a mechanical-link teleoperator, later invented the first electronic remotely operated manipulators. (Source: Argonne National Laboratory).
Haptic technology can be incorporated in surgical procedure training, remote-controlled robotics, and virtual-reality applications. It provides the user with sensory information other than, or in
addition to, the visual data provided by, say, an LCD and a remote camera. By offering additional information about the machine being controlled and the environment in which it is operating, haptic technology can permit more precise and rapid human response.
A brief haptic history
Haptic technology is said to have had its origins in the work of Raymond C. Goertz (Figure 1), who, while working for the Atomic Energy Commission at Argonne National Laboratory in 1951, developed the first master/slave remote manipulator. This electro-mechanical device, also referred to as a teleoperation system, allowed humans to handle radioactive materials more safely: the operator was linked to the mechanical arms that moved the dangerous substances solely by electrical connections.
However, until the development of modern computing components such as microprocessors, the joints of the master and slave were directly coupled, with the motion of each master joint directly replicated by the corresponding joint of the slave unit. With the widespread availability of microprocessors in the 1980s, research in haptics began to flourish. In that era, it was Antal Bejczy, working at the Jet Propulsion Laboratory, who recognized the importance of force reflection/feedback in the man-machine interface.
Figure 2. The PHANTOM Premium system from SensAble is a commercially available active-haptic device.
Until recently, the development of haptic systems has relied on the use of active components, such as motors and pneumatics, to provide force feedback. For example, the PHANTOM Premium system from SensAble Technologies (Figure 2) uses an active force feedback system to provide resistance in accord with the design of the teleoperation system. Other active systems with multiple degrees-of-freedom have recently been unveiled for use with gaming systems.
Recent haptic research
One of the cutting-edge areas of haptic research is called “passive haptics.” Since active haptic interfaces use actuators such as motors, they present the risk that the actuators could add too much force and injure the user. Passive haptic interfaces offer a safer alternative.
Instead of adding force to the system, passive haptics remove force from the system, using passive actuators such as magneto-rheological brakes. Passive haptic interfaces are not only inherently safer, but also more efficient, more compact, and less expensive. They also present the possibility of using energy harvesting to gather power for operating other systems.
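The core idea can be illustrated with a short sketch. The following Python snippet is a hypothetical illustration (not code from any of the systems described here): because a passive actuator such as a magneto-rheological brake can only resist motion, never drive it, a passive haptic controller must clamp the desired feedback force to the dissipative direction before commanding the brake. The function name and limits are assumptions for illustration.

```python
# Hypothetical sketch: clamp a desired haptic force to what a passive
# (brake-type) actuator can actually render on one joint.

def passive_brake_command(desired_force: float, joint_velocity: float,
                          max_brake_force: float) -> float:
    """Return the brake force magnitude to command for one joint.

    desired_force   -- force the haptic algorithm wants to apply to the user
    joint_velocity  -- current joint velocity measured by the encoder
    max_brake_force -- saturation limit of the brake (assumed)
    """
    # A brake can only produce force opposing the current motion.
    # If the desired force points the same way the joint is moving,
    # a passive device cannot render it; command zero instead.
    if desired_force * joint_velocity >= 0.0:
        return 0.0
    return min(abs(desired_force), max_brake_force)


# Joint moving at +0.2 m/s, algorithm requests -5 N (resistive) -> 5 N brake.
print(passive_brake_command(-5.0, 0.2, 20.0))   # 5.0
# Algorithm requests +3 N (assistive) -> cannot be rendered passively.
print(passive_brake_command(3.0, 0.2, 20.0))    # 0.0
```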
Working on passive haptic systems at Northwestern University’s Department of Mechanical Engineering in the mid-1990s, Professors J. Edward Colgate, Michael A. Peshkin, and Witaya Wannasuphoprasit explored the use of energetically neutral passive haptic devices that use continuously variable transmissions (CVTs) as actuators that neither dissipate energy from nor add energy to the system. An early result of this work was a two-degree-of-freedom device, called a cobot (Figure 3). While CVT devices are capable of producing very stiff surfaces (one way of judging a haptic system’s performance), in free motion they require processing and actuation that can introduce a time lag into the system.
Figure 3. This two-degree-of-freedom unicycle cobot developed at Northwestern University uses a roller blade wheel on a smooth surface to constrain the user’s motion to a single-degree-of-freedom path.
Colgate and Peshkin, along with Professor Kevin Lynch, currently co-direct Northwestern’s Laboratory for Intelligent Mechanical Systems. The lab’s major focus is on interaction mediated by physical contact and the exchange of forces and motions, and it has produced multiple generations of cobots.
Using graphical system design
Researchers at the Georgia Institute of Technology Intelligent Machine Dynamics Laboratory (IMDL) also study the use of passive haptic systems. In one project, Drs. Wayne Book and Benjamin Black explored whether passive haptic systems can be as effective as active haptic systems for remote operation of a device, with the additional guarantee of safety.
One of the main limitations of passive haptic systems, however, is that the passive actuators cannot drive the device to a specific position on their own. Instead, they can only guide the operator’s motion toward the desired position. Book and Black worked on overcoming this limitation by developing advanced control strategies for the passive actuators.
The design of the system involved several steps that were made possible by the graphical system design approach. Graphical system design employs the combination of graphical development software tools and off-the-shelf hardware to rapidly design, prototype, and deploy embedded control devices. The researchers used National Instruments LabVIEW, a graphical software development environment, to design and simulate the haptic control system and communication for remote operation.
The design was deployed to real-time PXI control and acquisition systems to test the control strategies. The advantage of this approach is that Book and Black could iterate quickly toward a better design, because deployment required no low-level embedded software development or custom hardware design.
The Georgia Tech researchers were able to quickly import their master and slave controller algorithms into LabVIEW and then instrument these with actuators and sensors using a high-level programming interface. By instrumenting the algorithms with real hardware, they could verify theories with real-world data. Figure 4 shows the graphical source code the researchers used to control the position of the slave.
Figure 4. LabVIEW source code for the slave controller provides a high-level view of the system performance.
Moreover, the software tools provided high-level abstraction interfaces, such as the timed-loop feature. The timed loop is a LabVIEW programming structure that abstracts the details of priorities, multi-threading, and processor core assignment. With these types of abstraction, engineers and scientists can easily apply the performance advantages of multithreading and multicore processors to their applications. This frees researchers to spend more time refining the design itself rather than developing low-level code.
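As a rough textual analogy (hypothetical, and not the LabVIEW timed loop itself, which additionally manages priorities and processor-core assignment), the fixed-rate structure such a loop provides can be sketched in Python:

```python
# Hypothetical sketch of a fixed-period control loop. On a desktop OS the
# timing is best effort; a real-time target such as the PXI controller
# enforces the period deterministically.
import time

PERIOD_S = 0.001   # 1-kHz loop rate chosen for illustration

def run_fixed_rate_loop(step, duration_s=1.0):
    """Call step() once every PERIOD_S seconds for duration_s seconds."""
    next_wakeup = time.perf_counter()
    end_time = next_wakeup + duration_s
    while next_wakeup < end_time:
        step()                      # one iteration of the control algorithm
        next_wakeup += PERIOD_S
        remaining = next_wakeup - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)   # sleep off the rest of this period

# Example: run a trivial iteration counter at the fixed rate for 0.1 s.
count = 0
def step():
    global count
    count += 1

run_fixed_rate_loop(step, duration_s=0.1)
print(count)   # roughly 100 iterations
```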
Deploying the design to hardware
The researchers deployed the software algorithms to PXI modular hardware systems. These systems include a deterministic, real-time controller and appropriate I/O modules that interface with sensors of the experimental haptic devices. Using the LabVIEW Real-Time Module, the researchers deployed their algorithms to the PXI controller for headless operation. They used a plug-in PXI motion control module to control the linear slave motor, and they used multifunction data acquisition devices to interface with the position sensors.
Figure 5. The test apparatus used in developing the passive haptic system at Georgia Tech.
The test apparatus for this research uses a two-degree-of-freedom manipulator called the magneto-rheological passive trajectory enhancing robot, or MR PETR, that serves as the master device to control a two-degree-of-freedom robotic manipulator that acts as the slave. There is no physical connection between the master and slave; rather, there is a PXI real-time control system coupled to the master and another system coupled to the slave, as shown in Figure 5. PXI System 1 executes a deterministic application programmed in LabVIEW that reads a gamma force sensor and two optical encoders from the master manipulator. The researchers use the data to determine the position of the master and then send that position to PXI System 2.
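A minimal sketch of that master-side loop is shown below, as a hypothetical Python stand-in for the deterministic LabVIEW application on PXI System 1. The network address, function names, and kinematics are assumptions, and the forward link is shown over UDP only because the article describes UDP for the return path.

```python
# Hypothetical master-side loop: read sensors, compute the master position,
# and stream it to the slave controller.
import socket
import struct

SLAVE_ADDR = ("192.168.0.2", 5005)   # assumed IP/port of PXI System 2

def read_master_sensors():
    """Stand-in for reading the force sensor and the two optical encoders
    through the data acquisition hardware (returns dummy values here)."""
    force = 0.0
    theta1, theta2 = 0.0, 0.0
    return force, theta1, theta2

def encoders_to_position(theta1, theta2):
    """Stand-in for the forward kinematics of the two-DOF master."""
    return theta1, theta2

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def master_step():
    """One loop iteration: read sensors, compute the master position,
    and send it to the slave controller as the position setpoint."""
    force, theta1, theta2 = read_master_sensors()
    x, y = encoders_to_position(theta1, theta2)
    sock.sendto(struct.pack("!dd", x, y), SLAVE_ADDR)

master_step()   # in practice this would run inside the fixed-rate loop
```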
PXI System 2 uses the master position as the setpoint input to a 4-kHz PD (proportional-derivative) controller designed in LabVIEW to actuate the linear motor while reading position data from an optical encoder. The slave device encounters a physical constraint that resists its movement. The slave position is sent back to the master over UDP to PXI System 1, which feeds the data to a control algorithm that determines the haptic force that should be applied to the user to communicate the presence of the physical constraint. The force is applied using actuation of the magneto-rheological brakes. The goal of the system is for the slave position to track the master position.
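The slave-side position loop reduces to a textbook PD calculation, sketched below as a hypothetical Python stand-in for the LabVIEW controller on PXI System 2. The gains and signal values are assumptions for illustration, not values from the Georgia Tech system.

```python
# Hypothetical sketch of the 4-kHz PD position loop on the slave side.
KP = 50.0           # proportional gain (assumed)
KD = 0.5            # derivative gain (assumed)
DT = 1.0 / 4000.0   # 4-kHz loop period

class PDController:
    def __init__(self, kp, kd, dt):
        self.kp, self.kd, self.dt = kp, kd, dt
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        """Return the motor command for one control period."""
        error = setpoint - measured
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative

controller = PDController(KP, KD, DT)

def slave_step(master_position, slave_position):
    """One iteration: the master position (received over UDP) is the
    setpoint, the slave encoder reading is the feedback, and the output
    drives the linear motor."""
    return controller.update(master_position, slave_position)

# Example iteration: master at 0.10 m, slave currently at 0.08 m.
print(slave_step(0.10, 0.08))
```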
Book and Black then improved the performance of their system by implementing a sophisticated model-based control algorithm that runs in real time to compensate for the dynamics of the master device. Black developed a differential algebraic equation solver in LabVIEW that calculates how the master device will respond to any set of actuation parameters. The calculated response is then fed into the controller so that it can compensate appropriately to ensure more accurate haptic feedback to the operator. The researchers have completed human trials of this haptic system and are in the process of publishing the results.
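To give a feel for what such model-based compensation does, the greatly simplified Python sketch below uses a one-degree-of-freedom mass-damper model of the master and inverts it to find the brake force predicted to produce a desired deceleration. The real system solves differential algebraic equations for the full device; the model, parameters, and names here are assumptions made purely for illustration.

```python
# Hypothetical illustration of model-based compensation for a brake-type
# passive actuator, using a one-DOF mass-damper stand-in for the master.
MASS = 2.0      # effective master inertia, kg (assumed)
DAMPING = 1.5   # viscous damping coefficient, N*s/m (assumed)

def brake_for_desired_accel(velocity, user_force, desired_accel):
    """Invert the model m*a = F_user - c*v - sign(v)*F_brake to find the
    brake force magnitude predicted to yield the desired acceleration."""
    sign = 1.0 if velocity >= 0 else -1.0
    brake = sign * (user_force - DAMPING * velocity - MASS * desired_accel)
    # A brake can only dissipate energy, so negative commands are clamped.
    return max(0.0, brake)

# Example: the user pushes with 10 N while the master moves at 0.3 m/s, and
# the haptic constraint calls for decelerating the master at 2 m/s^2.
print(brake_for_desired_accel(0.3, 10.0, -2.0))   # ~13.55 N
```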