UC7 - Human-Robot Collaboration in a Disassembly Process with Workers with Disabilities

UC7
Industrial Robotics

This use case aims to develop and validate a collaborative robotic disassembly solution for end-of-life products that is adapted to working with people with disabilities. The solution will give people with reduced working capabilities the means to improve their working conditions and gain productivity, improving their ability to find meaningful work and a place in the labour market. It will provide technologies that enable collaboration through intelligent human-robot interaction (e.g. natural language interfaces such as voice and possibly gestures) and through learning from monitoring of the disassembly process. Additionally, augmented reality technologies will be used to validate the robotic solution in different virtual scenarios before implementing it in a real environment.
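As a lightweight illustration of the voice-based interaction mentioned above, the Python sketch below maps recognized spoken phrases to robot actions in the disassembly cell. The phrases, the action functions and the dispatcher are hypothetical placeholders; they do not represent the TedCube interface or the actual robot API.

# Minimal sketch of a voice-command dispatcher for the collaborative cell.
# All command phrases, action functions and the recognizer interface are
# hypothetical placeholders; the real TedCube device and robot controller
# interfaces are not modelled here.

from typing import Callable, Dict


def stop_robot() -> None:
    print("Robot: safety-rated monitored stop requested")


def resume_task() -> None:
    print("Robot: resuming disassembly task")


def hand_over_part() -> None:
    print("Robot: presenting the next part to the operator")


# Spoken phrases (as returned by the speech recognizer) mapped to actions.
COMMANDS: Dict[str, Callable[[], None]] = {
    "stop": stop_robot,
    "continue": resume_task,
    "give me the part": hand_over_part,
}


def dispatch(recognized_text: str) -> None:
    """Trigger the action associated with a recognized phrase, if any."""
    action = COMMANDS.get(recognized_text.strip().lower())
    if action is not None:
        action()
    else:
        print(f"Unrecognized command: {recognized_text!r}")


if __name__ == "__main__":
    for phrase in ["Stop", "give me the part", "open the window"]:
        dispatch(phrase)

In a real system the recognized text would come from the voice management device, and the actions would issue motion or stop requests to the robot controller.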

The robotic solution will be based on the KUKA LBR iiwa serial manipulator. In terms of safety, this robot achieves Performance Level d (PL d) in Category 3, meeting the requirements of the EN ISO 13849 standard on machine safety. The LBR iiwa has seven degrees of freedom, which provides greater versatility of movement by allowing positions to be reached with more than one axis configuration. In addition, it has joint torque sensors in each axis, which enables direct and safe interaction between the robot and the operator.

The figure below shows the generic hardware architecture of the proposed robotic solution. The interaction between the operator and the robot will be carried out through the TedCube voice management device. A camera will also be incorporated to monitor the robot's movement, together with a computer to program its motion. All of these devices will be connected to an Ethernet network, each with its own IP address, and linked to one another through a router.

Figure: Generic hardware architecture of the proposed robotic solution (aldakin_uc.png)
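To make the networked layout concrete, the following Python sketch lists each device of the cell with an address on the shared Ethernet network and checks whether it is reachable over TCP. All IP addresses, ports and device assignments are hypothetical placeholders, not the real configuration of the cell.

# Minimal sketch of the cell's network layout: each device has its own IP
# address on the shared Ethernet network behind the router. All addresses
# and ports below are hypothetical placeholders, not the real configuration.

import socket

DEVICES = {
    "KUKA LBR iiwa controller": ("192.168.1.10", 30001),
    "TedCube voice device":     ("192.168.1.20", 8080),
    "Monitoring camera":        ("192.168.1.30", 554),
    "Programming computer":     ("192.168.1.40", 22),
}


def is_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to (host, port) can be opened."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    for name, (host, port) in DEVICES.items():
        status = "reachable" if is_reachable(host, port) else "unreachable"
        print(f"{name} at {host}:{port} is {status}")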

With regard to safety and security, collaborative robotics provides a shared working environment for the operator and the robot, in which the worker contributes flexibility and problem-solving ability while the robot contributes strength and accuracy. However, collaborative robotics is not yet a widely exploited field, mainly because the absence of fences increases both the actual risk and the perceived sense of risk, a perception that, based on the experience and knowledge of companies whose staff includes workers with some kind of disability, is expected to be even more pronounced.

Currently, the most widely used safety standards in the field of collaborative robotics are ISO/TS 15066:2016, ISO 10218-1:2011 and ISO 10218-2:2011. The latter standard sets out four strategies for reducing risk in collaborative robotics: 1) safety-rated monitored stop, 2) hand guiding, 3) speed and separation monitoring, and 4) power and force limiting by inherent design or control. In this use case, safety will be based on ISO/TS 15066:2016 and the third strategy proposed by ISO 10218-2:2011. By monitoring the distance between the worker and the robotic cell, this strategy allows the robot's speed to be regulated according to the safety zone in which the operator is located (a sketch of this zone-based speed regulation is given below). The use of SafetyEYE® technology is being considered for this purpose. For the verification of the system's solution, simulations will be performed in the Gazebo simulator to identify possible improvements at an early stage. Later, as mentioned above, augmented reality will be used to validate the solution with the operators in different proposed scenarios (human in the loop). Finally, a validation in a relevant environment is planned.
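The following minimal Python sketch illustrates the idea behind speed and separation monitoring: the measured operator distance is mapped to a safety zone, and each zone defines an allowed speed override. The zone boundaries and override values are illustrative assumptions, not parameters of the actual safety system.

# Minimal sketch of speed and separation monitoring (strategy 3 above):
# the robot's speed override is scaled according to the safety zone in
# which the operator is detected. Zone boundaries and speed values are
# illustrative placeholders, not values from the use case.

from dataclasses import dataclass


@dataclass
class SafetyZone:
    name: str
    min_distance_m: float   # zone applies when the operator is at least this far away
    speed_override: float   # fraction of programmed speed (0.0 = stop)


# Zones ordered from farthest to closest to the robot.
ZONES = [
    SafetyZone("free",    2.0, 1.0),   # full programmed speed
    SafetyZone("warning", 1.0, 0.3),   # reduced speed
    SafetyZone("stop",    0.0, 0.0),   # protective stop
]


def speed_override_for(distance_m: float) -> float:
    """Return the allowed speed override for the measured operator distance."""
    for zone in ZONES:
        if distance_m >= zone.min_distance_m:
            return zone.speed_override
    return 0.0  # negative or invalid distance -> safest option


if __name__ == "__main__":
    for d in [3.0, 1.5, 0.4]:
        print(f"Operator at {d:.1f} m -> speed override {speed_override_for(d):.0%}")

In a real deployment these thresholds would be derived from the risk assessment and enforced by a certified safety device, such as the SafetyEYE® system under consideration, rather than by application code.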
