
Design, implementation and integration of sensor-aided assembly systems with industrial robots

Development and integration of the missing elements for the sensor-guided assembly processes of tomorrow: task-oriented strategies for evaluating sensor signals for systems programming and process guidance, sensor signal processing in robot control, and the integration capabilities of sensors and robots.

1. State-of-the-art:
The assembly process offers great potential for rationalisation through new assembly techniques and new types of shop floor organisation. Assembly systems with a higher level of automation than achieved today require subsystems that are able to compensate tolerances in the position and geometry of joining partners and cell components, and to perform inspection tasks automatically. Deviations of the actual geometry and arrangement of system components from the programmed position, as well as deviations from the programmed assembly process, result from workpieces and fixtures and from joining and handling processes during assembly. Measuring process deviations with sensor systems is very often the most economical solution for robot-guided assembly.

The use of sensor technology in assembly is not restricted to the control of assembly processes. Sensors can also be used to cut the time needed for systems programming, so that the shutdown time of the assembly system for the adaptation of on-line or off-line generated programs is reduced. Besides systems for force/torque measurement and one-dimensional sensors for distance measurement, opto-electronic sensor systems will become very important in the future, as they surpass current sensor principles in application flexibility and in the amount of information retrieved.

Until now, sensor-guided assembly has very rarely been achieved in industry. This is due to the insufficient software flexibility of sensor systems, their insufficient design with respect to real shop floor conditions, and the insufficient capability for task-oriented coupling of sensor systems with robot controllers and the factory information system. The use of these systems requires intelligent data processing, but the sensor systems used in industry today are generally built for rather specialised tasks. Existing industrial software modules, developed for fast recognition of specific geometric characteristics and specific groups of parts, cannot easily be transferred to other recognition problems. The resulting engineering effort in most cases prevents the economical realisation of sensor-guided assembly systems.

2. Objectives:
It is therefore the objective of the SEMOS project to develop and integrate the missing elements for the sensor-guided assembly processes of tomorrow. Together with the functional optimisation of hardware and software components within their technical limitations, their adaptation to operating conditions and to the requirements of economically efficient production will be improved. The following topics will be considered:
- planning of sensor integration,
- systems programming,
- systems calibration and testing,
- sensor-guided process control,
- integration of guided process quality control functions.
The emphasis is put on:
- development of task-oriented strategies for evaluating sensor signals for systems programming and process guidance,
- improvement of robot control systems concerning sensor signal processing, and
- improvement of the integration capabilities of sensors and robots.
An important objective of the project is to bring together manufacturers of subsystems such as robots, sensors and integration equipment (e.g. networks) with users of flexible automated assembly systems. This enables users' requirements and manufacturers' potentials to be considered at an early stage. In this project the aspect of sensor integration will therefore take precedence over the development of new sensors.
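The central principle described above, compensating measured deviations of workpiece position before executing a programmed motion, can be illustrated with a short sketch. This is not part of the SEMOS specification; the 2-D rigid-transform model, the numeric values and all names are illustrative assumptions only.

```python
import numpy as np

def pose_matrix(x, y, theta):
    """Homogeneous 2-D pose: translation (x, y) and rotation theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

# Programmed (nominal) grasp pose taken from the off-line generated robot program.
nominal_grasp = pose_matrix(400.0, 250.0, 0.0)              # units: mm, rad

# Deviation of the actual workpiece from its nominal position, as it might be
# reported by an opto-electronic sensor (vision or laser scanning system).
measured_deviation = pose_matrix(2.5, -1.2, np.deg2rad(1.5))

# Corrected robot target: the measured deviation is applied to the nominal pose,
# so tolerances of workpiece and fixture are compensated at run time instead of
# being absorbed by expensive mechanical precision.
corrected_grasp = nominal_grasp @ measured_deviation
print(corrected_grasp)
```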
3. Project structure and working groups:
The overall project structure is designed to ensure flexible system solutions for sensor applications that are adapted to industrial requirements. Prototypes of typical sensor-guided processes are developed in the form of demonstrators, based on the development of subsystems. The subsystem development considers different kinds of sensors, robot control and data transmission networks. The demonstrators will integrate sensor systems and robots into prototype industrial applications.

The development of subsystems and demonstrators will be carried out in working groups for each type of subsystem and demonstrator, based on a Work Programme for each Working Group and the work plans of each partner. The definition of application classes and process requirements will be worked out in the first phase of the project by all customers and users of subsystems and the scientific institutes. Based on the results of this phase, subsystems and their integration in demonstrators are developed with the support of national and international co-operation between specific partners. The following Working Groups will be set up from the beginning of the project:
1. Geometry Processing Sensor Systems (Laser Scanning Sensors)
2. Image Processing Sensor Systems (Vision Systems)
3. Force/Torque Sensor Systems and Robot Control (6-axis Force/Torque Sensors)
4. KUKA Demonstrator
5. ZENON Demonstrator
6. OFZS Demonstrator
7. HCS Demonstrator
8. IPK Demonstrator
9. OLCSAN Demonstrator

4. SEMOS Research Programme:
The project work of the co-operating companies and academic institutes will contain the following research items:

4.1 Definition of application classes:
The subsystems and system integrations shall be valid not only for specific applications but for whole application classes, in order to open wider markets for system suppliers and to avoid high engineering costs for users. These application classes and corresponding representative prototypes will be defined in close cooperation between customers, users and academic institutes. All project partners will contribute to this work package. Definition items:
- classification of assembly and inspection tasks for sensor integration
- process-related classification of assembly operations in automotive assembly
- analysis of application areas for opto-electronic sensor systems
- classification of image processing tasks for visual inspection
- definition of scenes for image processing and for data acquisition
- classification of robotic tasks using force/torque sensors
- selection of representative assembly and inspection tasks
- specification of demonstrators.

4.2 Definition of process requirements for subsystems and system integration:
System integration and subsystem development will be carried out on the basis of a system control concept that considers sensor functions for specific manufacturing tasks as well as for process planning and quality control tasks. Process requirements will be defined in detail in order to match system/subsystem specifications and to bridge the customer/supplier interface. All project partners will contribute to this work package.
Definition items:
- analysis of manual assembly operations and assembly processes for sensor requirements and sensor integration
- analysis of demands on the hardware and software of sensor systems for different groups of assembly operations
- analysis of demands on the hardware and software of sensor systems for sensor-guided programming tasks
- definition of requirements for opto-electronic sensor systems
- definition of a priori process and geometry information for sensor data processing
- analysis of the measured parameters and of the required precision and reliability of sensor data
- system-theoretical definition of the required basic control functions for robot control
- analysis of requirements for process communication
- analysis of the interfaces for the integration of sensor systems, including the quantity and qualification of the data stream
- definition of requirements for force/torque sensors and distance sensors with respect to multi-sensor systems
- definition of requirements for user interfaces of industrial assembly systems
- definition of multi-sensor information processing for assembly tasks.

4.3 Development of subsystems and demonstrators:
Subsystem development and system integration will balance optimisation for specific manufacturing tasks against system solutions valid for wide application classes. Corresponding to the different Working Groups, the R&D items are as follows:

Geometry processing sensor systems (Laser Scanning Sensors)
R&D items:
- syntactic description of curved 2-D profiles
- template generation for curved 2-D profiles
- user interface for 2-D template generation
- template matching for curved 2-D profiles
- use of CAD data for template generation
- modular sensor data processing.
Application fields:
- geometry and position detection for part handling and sensor-guided arc welding.

Image processing sensor systems (Vision Systems)
R&D items:
- image acquisition and preprocessing for data of multi-sensor systems
- modular software packages
- image-processing algorithms for inspection tasks
- part recognition and position determination
- hardware components for image processing
- methods of camera calibration
- integration into industrial control systems.
Application fields:
- inspection during assembly, part matching, texture analysis of surfaces, precision gauging.

Force/torque sensor systems and robot control (6-axis Force/Torque Sensors)
R&D items:
- development of a force/torque sensor system for multi-sensorial information processing
- development of a distance sensor system for multi-sensorial information processing
- sensor-guided control algorithms for multiple sensors
- interfaces for sensors in multi-sensorial information processing
- user interface for industrial assembly using multiple sensors
- functional control concept for advanced robot control
- interface between robot control and sensor systems
- real-time control for force/torque-guided mating (a simple illustration is sketched after this list)
- real-time control for path tracking
- interface between robot control and peripheral processes
- off-line programming
- interface between robot control and the production control system for actual process state information and off-line programming
- optimisation of redundant kinematic systems
- optimum part placement.
Application fields:
- force/torque-guided mating processes, automated assembly using multi-sensorial information processing, sensor-guided assembly.
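As a concrete illustration of the real-time control item for force/torque-guided mating listed above, a minimal admittance-style insertion loop is sketched below. The interfaces read_force_torque and move_relative are hypothetical placeholders for whatever the 6-axis force/torque sensor and the robot controller actually provide, and the gains and thresholds are arbitrary illustrative values, not SEMOS results.

```python
import numpy as np

# Hypothetical interfaces to the 6-axis force/torque sensor and the robot
# control; in a real cell these would be supplied by the sensor and
# controller manufacturers.
def read_force_torque():
    """Return the current wrench [Fx, Fy, Fz, Mx, My, Mz] in N and Nm."""
    raise NotImplementedError

def move_relative(delta_xyz):
    """Command a small Cartesian correction [dx, dy, dz] in mm."""
    raise NotImplementedError

def force_guided_insertion(target_fz=-10.0, kp=0.05,
                           max_lateral_force=5.0, step_down=0.2,
                           max_cycles=500):
    """Simple admittance-style loop for a peg-in-hole mating operation:
    feed the part downwards until a desired insertion force is reached,
    while steering sideways to null the lateral contact forces."""
    for _ in range(max_cycles):
        fx, fy, fz, *_ = read_force_torque()

        # Lateral correction proportional to the measured side forces.
        dx = -kp * fx
        dy = -kp * fy

        # Keep feeding down until the target insertion force is reached.
        seated = fz <= target_fz
        dz = 0.0 if seated else -step_down

        move_relative(np.array([dx, dy, dz]))

        if seated:
            return True      # part is fully mated
        if abs(fx) > max_lateral_force or abs(fy) > max_lateral_force:
            return False     # jamming detected, abort
    return False
```

In the project, a loop of this kind would sit inside the advanced robot control and be fed through the sensor interfaces developed in Working Group 3.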
KUKA Demonstrator:
Development items:
- planning tools for sensor integration
- automatic calibration and self-test
- shop-floor-oriented user interfaces for flexible programming
- sensor-driven robot programming.
Application fields:
- sensor-guided assembly in automotive production.

ZENON Demonstrator:
Development items:
- flexible, sensor-guided welding cells
- part mating techniques
- off-line programming
- vision-based process control and inspection
- system design for small/medium-sized enterprises.
Application fields:
- sensor-guided process control and inspection for welding applications.

OFZS Demonstrator:
Development items:
- sensor-guided inspection
- flexible inspection cell
- optimisation of inspection kinematics
- off-line programming of robot movements for inspection tasks
- hardware and software integration of vision systems and robot control systems.
Application fields:
- sensor-guided assembly and inspection.

HCS Demonstrator:
Development items:
- flexible software environment for image processing
- design of a multiple-camera sensor
- design of a camera calibration system.
Application fields:
- sensor-guided assembly and inspection.

IPK Demonstrator:
Development items:
- development of part mating strategies
- integration of algorithms into an advanced robot control
- fast image processing algorithms
- development of supervised learning methods
- surface inspection by texture analysis.
Application fields:
- sensor-guided robotics.

OLCSAN Demonstrator:
Development items:
- development of a leather cutting press
- image processing for defect detection
- coupling of vision system and CAD system.
Application fields:
- leather cutting in shoe production.

5. Work packages:
The common project work programme of the industrial and scientific partners is structured into the following common Work Packages (WP):
1. WP 100: Definition of Application Classes, Process Requirements and Representative Prototypes
2. WP 200: Development of Solution Strategy
3. WP 300: Development of Algorithms
4. WP 400: Simulation Tests
5. WP 500: Prototype Development and Testing
6. WP 600: Documentation and Reports
7. WP 700: Project Management.
Work package WP 100 is substructured into:
1. WP 110: Definition of Application Classes
2. WP 120: Definition of Process Requirements
3. WP 130: Definition of Representative Prototypes.
According to the different subsystems used for sensor-guided assembly systems, work packages WP 200 - WP 500 are substructured into Work Packages for:
- Geometry Processing Sensor Systems
- Image Processing Sensor Systems
- Force/Torque Sensors and Robot Control.

6. Work Programme:
The execution times of the individual work packages for particular partners may be shorter but will not exceed the indicated execution periods.
1. Definition of Application Classes and Process Requirements: March 1989 - July 1990
2. Development of Solution Strategies: July 1989 - March 1991
3. Development of Algorithms and Interfaces: October 1989 - August 1991
4. Simulation Tests: January 1990 - August 1991
5. Prototype Realisation and Testing: April 1991 - December 1992
6. Documentation and Reports: November 1989 - December 1992
7. Project Management: March 1989 - December 1992.

7. Milestones:
There will be three milestones for the SEMOS project work:
Milestone 1: set after the definition of application classes, process requirements and representative prototypes (end of WP 100).
Milestone 2: set after the verification of the developed solution strategies and the developed sensor data processing algorithms and interfaces by simulation tests in an appropriate test environment (end of WP 400).
Milestone 3: set after the development of prototypes of the subsystems and the demonstrators (end of WP 500).

8. Working Group Leaders, Working Group Members and Project Secretary:
1. Geometry processing sensor systems (Laser Scanning Sensors); cooperation partners: IPK-PLT, OLDELFT, ZENON, KUKA, NTUA; Working Group Leader: IPK, Dept. PLT, Dipl.-Ing. Voit.
2. Image processing sensor systems (Vision Systems); cooperation partners: KONTRON, IPK-PT, ISRA, HCS, KUKA, ZENON, OLCSAN, ITU; Working Group Leader: IPK, Dept. PT, Dipl.-Ing. Mollath.
3. Force/torque sensor systems and robot control (6-axis Force/Torque Sensors); cooperation partners: ISRA, IPK-AT, KUKA; Working Group Leader: ISRA, Dipl.-Ing. Wienand.
4. KUKA Demonstrator; cooperation partners: KUKA, KONTRON, ISRA, IPK-PLT, IPK-PT, OLDELFT; Working Group Leader: KUKA, Dr. Woern.
5. ZENON Demonstrator; cooperation partners: ZENON, NTUA, ELEFSIS SHIPYARDS; Working Group Leader: Dr. Ikonomopolous.
6. OFZS Demonstrator; cooperation partners: OFZS, ISRA, IPK-PLT; Working Group Leader: OFZS, Dr. P. Moellner.
7. HCS Demonstrator; cooperation partners: HCS; Working Group Leader: HCS, Ing. Fehervari.
8. IPK Demonstrator; cooperation partners: IPK-PT, IPK-AT, KONTRON, ISRA; Working Group Leader: IPK-PT, Dipl.-Ing. Mollath.
9. OLCSAN Demonstrator; cooperation partners: OLCSAN, ISRA; Working Group Leader: OLCSAN, Mr. Sondal.

9. Project Secretary: IPK Berlin, Dept. PLT, Dipl.-Ing. H. Schuler.
Acronym: 
SEMOS
Project ID: 
276
Start date: 
01-03-1989
Project Duration: 
46 months
Project costs: 
9 200 000.00€
Technological Area: 
Market Area: 
