Virtual reality applied for programming CNC machine tools

Future CNC programming may include a manual teach mode. Mechatronics researchers demonstrate how future CNC machines could be designed with virtual reality elements that allow an operator to move machine body elements manually so motion sequences can be saved in the machining program.


For simpler machine programming, imagine a computer numerical controlled (CNC) machine tool that could interact with an operator using virtual reality technology to make programming easier and faster.

CNC machine tools are equipped with various interfaces enabling their control and programming, with a variety of features to support the operator in preparing the machining program. However, operating CNC machine tools, especially five-axis machines, requires highly skilled operators with a solid grounding in basic programming.

Figure 2: The implemented system for CNC programming allows an operator to move the virtual table and the machine tool headstock to generate the trajectory of the tool within the workpiece system. Courtesy: West Pomeranian University of Technology, Control Engineering Poland, Control Engineering

There is an expanding trend toward more intuitive, easier-to-use control systems for machine tools. Programming basic technological operations should be easy and intuitive enough not to cause difficulties for the average operator. A research team from the Centre for Mechatronics of the West Pomeranian University of Technology proposed a solution by introducing a technique for manual control and programming of a machine tool. As shown, the operator can move the body elements of a machine tool manually using appropriate levers, which measure the applied force and, depending on its value, send the corresponding speed to the machine tool's control system. This allows the machine subassemblies to follow the movements of the operator. By applying this technique, the operator can teach the machine tool a movement trajectory, and the individual motion sequences can be saved in a machining program.
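The force-to-speed mapping behind such lever control can be sketched as below. This is a minimal illustration, not the published implementation: the deadband, maximum force, and feed limit are assumed values, and the function name is hypothetical. The idea is that a light touch does nothing, a firmer push produces a proportionally faster axis feed, and the direction of the push sets the direction of motion.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical mapping from measured lever force [N] to axis feed
// rate [mm/min]. Forces inside the deadband are ignored; above it,
// feed scales linearly up to a saturation limit, preserving the sign
// (direction) of the push.
double forceToFeedRate(double forceN,
                       double deadbandN = 2.0,
                       double maxForceN = 50.0,
                       double maxFeedMmMin = 2000.0) {
    double magnitude = std::fabs(forceN);
    if (magnitude <= deadbandN) return 0.0;          // ignore hand tremor
    double span = std::min(magnitude, maxForceN) - deadbandN;
    double feed = span / (maxForceN - deadbandN) * maxFeedMmMin;
    return std::copysign(feed, forceN);              // keep push direction
}
```

The deadband keeps the axis still when the operator merely rests a hand on the lever; saturation caps the feed at a safe maximum no matter how hard the lever is pushed.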

Virtual reality (VR) technology was applied to enable analogous programming of the machine tool in an off-line mode, away from the machine itself. The VR technology used in the developed system lets the operator manually move the machine tool subassemblies on a virtual model. This article describes the designed virtual reality system for programming CNC machine tools.

Operator interface

The designed operator interface consists of input and output elements as well as drivers to enable the communication of these devices with a PC. The input devices in the designed system are digital gloves, scanners of the operator's upper limb movements, a head-tracking system, and a system for the operator's body orientation. 5DT Data Glove 5 Ultra was used for measuring average finger flexure for each of the five fingers of the left and the right hand. Two scanners were designed and constructed to measure arm movements. The scanner of the left upper limb was based on orientation sensors, whereas the scanner of the right upper limb was based on a mechanical structure with potentiometers.

The first type of scanner uses an ADIS16405 sensor from Analog Devices, equipped with a three-component accelerometer, a three-component gyroscope, and a three-component magnetometer. An orientation filter for MARG (magnetic, angular rate, and gravity) sensor fusion was implemented on an STM32 microcontroller produced by STMicroelectronics.

The system communicates with a PC over a USB interface. The measured orientations of the main joints of the upper limb, combined with simple kinematics, allow the position of the operator's arm in space to be determined. ADIS16405 sensors also track the operator's head and torso orientation.
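The "simple kinematics" step can be sketched with a two-link planar arm: given joint angles from the orientation sensors and known segment lengths, the hand position follows by chaining the segments. The real system works with full 3-D orientations; this 2-D sketch, with assumed segment lengths and a hypothetical function name, only shows the principle.

```cpp
#include <cmath>

struct Point { double x, y; };

// Planar forward kinematics for a two-link arm: shoulder angle is
// measured from the horizontal, elbow angle is relative to the upper
// arm. Returns the hand position relative to the shoulder joint.
Point handPosition(double shoulderRad, double elbowRad,
                   double upperArmM = 0.30, double forearmM = 0.25) {
    double elbowX = upperArmM * std::cos(shoulderRad);
    double elbowY = upperArmM * std::sin(shoulderRad);
    double total  = shoulderRad + elbowRad;   // absolute forearm angle
    return { elbowX + forearmM * std::cos(total),
             elbowY + forearmM * std::sin(total) };
}
```

With the arm fully extended horizontally (both angles zero), the hand sits at the sum of the two segment lengths, 0.55 m from the shoulder.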

Figure 1: This diagram shows the input and output elements of the virtual reality CNC programming interface. Courtesy: West Pomeranian University of Technology, Control Engineering Poland, Control Engineering

The output element is an eMagine HMD (head-mounted display) with two displays that present an independent image to the left and the right eye. Through the stereoscopic display of the image, the operator can judge the distance of items in the VR space. Figure 1 presents a schematic view of the input and output elements of the designed interface.

The VR simulation program receives all signals from the input interfaces, processes and interprets them, and generates a stereoscopic image from them in real time, reflecting the movements performed by the operator. The system was written in C++ and uses the OpenGL graphics library.

Programming without writing

The main objective was to enable the operator to program the CNC machine without having to write a program. This was achieved by giving the operator the ability to grab the rendered three-dimensional body elements of the machine tool, move them, and perform various control gestures. Appropriate operator gestures enable:

  • Saving or deleting a trajectory point
  • Changing the tool
  • Issuing the command to generate the G-code.

The machined workpiece is shown on the machine tool. The operator may initially set its size by grabbing its walls and pulling them in the intended direction. By grabbing and moving the virtual table and the headstock of the machine tool, the operator generates the trajectory of the tool in the workpiece system. If the tool passes through the workpiece, the effect that machining would produce can be seen. The operator can save the current position with an OK gesture (raising the thumb) or delete a saved point with a thumb-down gesture. Points are stored along with information on the tool and its working parameters. The tool and its parameters may be changed by performing appropriate gestures. Finally, the operator can issue the command to generate the G-code by raising the thumbs of both hands. The operator can move within the VR environment and look around the scene by moving his head.
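Turning the saved points into G-code can be sketched as below. The article does not specify the emitter, so this is an assumed implementation: each stored point carries its position, tool number, and feed rate, and the emitter issues a tool change whenever the tool number differs from the previous point, followed by a linear move. The word formats (T/M6, G1, F, M30) follow common ISO-style practice.

```cpp
#include <iomanip>
#include <sstream>
#include <string>
#include <vector>

// A saved trajectory point: position plus the active tool and feed rate.
struct TrajPoint { double x, y, z; int tool; double feedMmMin; };

// Hypothetical G-code emitter: a T..M6 tool change is issued only when
// the tool number changes, then a G1 linear move for every point.
std::string emitGCode(const std::vector<TrajPoint>& pts) {
    std::ostringstream out;
    out << std::fixed << std::setprecision(3);
    int currentTool = -1;
    for (const auto& p : pts) {
        if (p.tool != currentTool) {
            out << "T" << p.tool << " M6\n";   // tool change
            currentTool = p.tool;
        }
        out << "G1 X" << p.x << " Y" << p.y << " Z" << p.z
            << " F" << p.feedMmMin << "\n";
    }
    out << "M30\n";                            // program end
    return out.str();
}
```

Keeping the tool number with every point, rather than as separate events, makes deleting a point with the thumb-down gesture trivially safe: the remaining points still carry all the information needed to regenerate a valid program.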

Intuitive, voice, feedback

Figure 3: This diagram shows the completed and planned work on manual programming using virtual reality for CNCs. Courtesy: West Pomeranian University of Technology, Control Engineering Poland, Control Engineering

The designed system performs the assumed functions and is intuitive and easy to use. It builds on the experience gained from programming the actual machine tool with control levers. The system displays the result of the performed operations. Figure 3 presents the work implemented so far and the tasks planned for the future. Further expansion of the system will add a voice control function as well as sound feedback.

The research is partially financed by the Polish Ministry of Science and Higher Education, Project No. NN503 243 138, entitled "Development of the Project and Experimental Studies on a Prototype System for Manual CNC Machine Tool Programming."

- Mirosław Pajor, PhD, DSc, prof. ZUT, Kamil Stateczny, MSc, and Krzysztof Pietrusewicz, PhD, DSc, of West Pomeranian University of Technology, Szczecin, work on self-diagnostic, intelligent CNC machining with open architecture control systems. They are also contributors to Control Engineering Poland. Edited by Mark T. Hoske, content manager, CFE Media, Control Engineering and Plant Engineering, mhoske(at)




KHAMMONH , Non-US/Not Applicable, Thailand, 08/19/13 10:11 PM:

This is only a conceptual discussion. I didn't see how it could work in a real industrial environment. First of all, it would need a solid machining knowledge base. The article said the operator would determine how much force to apply by gesturing with the digital glove. Actually, the operator won't know how much force is needed, as it depends on the material properties of the cutting tools, the material being cut, tool geometry, feed rate, depth of cut, etc. The operator won't be able to accurately determine the tool path and cutting tolerances by gesturing either, and if the shape of the part is a complex curve it would not be possible to draw a path by gesturing. The most powerful way to generate G-code is post-processing the tool path from CAD data through a CAD/CAM interface. In real industry, the operator just sets up the machine and operates it according to the loaded/transferred program. Making this concept work in reality would need a lot of work. I didn't see how the operator could gesture an accurate tool path, given human sensing limitations.