Robot Autonomously Assembles IKEA Chair

Summary: A new robotics system has succeeded where many humans fail: autonomously assembling an IKEA chair without interruption.

Source: NTU Singapore.

Nanyang Technological University, Singapore (NTU Singapore) scientists have developed a robot that can autonomously assemble an IKEA chair without interruption.

Designed by Assistant Professor Pham Quang Cuong and his team from NTU’s School of Mechanical and Aerospace Engineering, the robot comprises a 3D camera and two robotic arms equipped with grippers to pick up objects. The team coded algorithms using three different open-source libraries to help the robot complete its job of putting together the IKEA chair.

It assembled IKEA’s Stefan chair in 8 minutes and 55 seconds. Prior to the assembly, the robot took 11 minutes and 21 seconds to independently plan the motion pathways and 3 seconds to locate the parts.

The results are published in the journal Science Robotics today.

Asst Prof Pham said, “For a robot, putting together an IKEA chair with such precision is more complex than it looks. The job of assembly, which may come naturally to humans, has to be broken down into different steps, such as identifying where the different chair parts are, the force required to grip the parts, and making sure the robotic arms move without colliding into each other. Through considerable engineering effort, we developed algorithms that will enable the robot to take the necessary steps to assemble the chair on its own.

“We are looking to integrate more artificial intelligence into this approach to make the robot more autonomous so it can learn the different steps of assembling a chair through human demonstration or by reading the instruction manual, or even from an image of the assembled product.”

The NTU team of Asst Prof Pham, research fellow Dr Francisco Suárez-Ruiz and alumnus Mr Zhou Xian believe that their robot could be of greatest value in performing specific tasks with precision in industries where tasks are varied and do not merit specialised machines or assembly lines.

How it works

The robot is designed to mimic the versatility of the human “hardware” used to assemble objects: “eyes” in the form of a 3D camera, and “arms” in the form of industrial robotic arms capable of six-axis motion. Each arm is equipped with a parallel gripper to pick up objects. Mounted on the wrists are force sensors that determine how strongly the “fingers” are gripping and how hard they push objects into contact with each other.
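As a rough illustration of this architecture, the setup can be summarised as a small configuration structure. The class and field names below are illustrative assumptions for the sake of the sketch, not the team's actual software.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ArmConfig:
    """One industrial arm in the two-arm setup (illustrative values only)."""
    name: str
    dof: int = 6                      # six-axis motion, as described in the article
    gripper: str = "parallel"         # parallel gripper for picking up parts
    wrist_force_sensor: bool = True   # force sensing at the wrist

@dataclass
class AssemblyCellConfig:
    """The overall cell: one 3D camera plus two arms."""
    camera: str = "3D camera"
    arms: List[ArmConfig] = field(default_factory=lambda: [
        ArmConfig("left_arm"), ArmConfig("right_arm"),
    ])

cell = AssemblyCellConfig()
print(cell)
```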

The robot starts the assembly process by taking 3D photos of the parts laid out on the floor to generate a map of the estimated positions of the different parts. This replicates, as closely as possible, the cluttered environment after humans unbox and prepare to put together a build-it-yourself chair. The challenge is to localise the parts in this clutter quickly, reliably, and with sufficient precision.
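The article does not name the perception pipeline, but a common way to estimate a part's pose from 3D camera data is rigid point-cloud registration, for example iterative closest point (ICP) against a known model of each part. The following is a minimal, self-contained sketch of that idea, not the team's implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src -> dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def icp(model, scene, iters=50, tol=1e-6):
    """Align a part's model point cloud (Nx3) to the camera's scene cloud (Mx3)."""
    src = model.copy()
    tree = cKDTree(scene)
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        dists, idx = tree.query(src)                 # nearest scene point per model point
        R, t = best_rigid_transform(src, scene[idx])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dists.mean()
        if abs(prev_err - err) < tol:                # converged
            break
        prev_err = err
    return R_total, t_total                          # estimated part pose in camera frame
```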

Next, using algorithms developed by the team, the robot plans a two-handed motion that is fast and collision-free. This motion pathway needs to be integrated with visual and tactile perception, grasping and execution.
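The article does not say which planner the team used; a widely used family for this kind of problem is sampling-based planning, such as a rapidly exploring random tree (RRT), over the combined joint space of both arms. Below is a minimal single-tree RRT sketch with a placeholder collision check; the `in_collision` stand-in and all parameters are assumptions for illustration, not the team's algorithm.

```python
import numpy as np

def in_collision(q):
    """Placeholder: return True if joint configuration q (both arms combined)
    would cause an arm-arm or arm-environment collision. A real system would
    call a geometric collision checker here."""
    return False

def rrt_plan(q_start, q_goal, joint_low, joint_high,
             step=0.05, goal_tol=0.05, max_iters=5000, rng=None):
    """Minimal RRT over the combined joint space of the two arms."""
    rng = rng or np.random.default_rng(0)
    q_goal = np.asarray(q_goal, float)
    nodes, parents = [np.asarray(q_start, float)], [-1]
    for _ in range(max_iters):
        # Occasionally bias sampling towards the goal configuration
        q_rand = q_goal if rng.random() < 0.1 else rng.uniform(joint_low, joint_high)
        i_near = int(np.argmin([np.linalg.norm(q_rand - n) for n in nodes]))
        direction = q_rand - nodes[i_near]
        q_new = nodes[i_near] + step * direction / (np.linalg.norm(direction) + 1e-12)
        if in_collision(q_new):
            continue
        nodes.append(q_new)
        parents.append(i_near)
        if np.linalg.norm(q_new - q_goal) < goal_tol:
            # Reconstruct the path by walking back to the root
            path, i = [], len(nodes) - 1
            while i != -1:
                path.append(nodes[i])
                i = parents[i]
            return path[::-1]
    return None  # no collision-free path found within the sampling budget
```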

To make sure that the robotic arms are able to grip the pieces tightly and perform tasks such as inserting wooden plugs, the amount of force exerted has to be regulated. This is challenging because industrial robots, designed to be precise at positioning, are bad at regulating forces, Asst Prof Pham explained.

The force sensors mounted on the wrists help to determine the amount of force required, allowing the robot to precisely and consistently detect holes by sliding the wooden plug on the surfaces of the work pieces, and perform tight insertions.
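To make the described behaviour concrete, the sketch below illustrates one way a force-guided hole search and insertion could look: slide the plug along the surface under light pressure, watch for a drop in the measured normal force, then push in until a force limit is reached. The `robot` interface (`move_relative`, `read_wrist_force`), thresholds, and step sizes are hypothetical assumptions, not the team's code.

```python
def find_hole_and_insert(robot, slide_step=0.001, drop_ratio=0.5,
                         insert_force_limit=20.0, max_steps=200):
    """Slide a held plug along the workpiece while pressing down lightly; a sudden
    drop in normal force signals the plug has reached a hole, after which the robot
    pushes down until the insertion force limit is reached.
    `robot` is a hypothetical interface: move_relative(dx, dy, dz) in metres,
    read_wrist_force() returning the normal force in newtons."""
    baseline = robot.read_wrist_force()   # normal force while pressing on the flat surface
    for _ in range(max_steps):
        robot.move_relative(dx=slide_step, dy=0.0, dz=0.0)   # slide along the surface
        if robot.read_wrist_force() < drop_ratio * baseline:  # force drop -> over the hole
            break
    else:
        return False                       # no hole found in this pass

    # Tight insertion: push down in small increments until resistance builds up
    while robot.read_wrist_force() < insert_force_limit:
        robot.move_relative(dx=0.0, dy=0.0, dz=-slide_step)
    return True
```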

An example of successful autonomous dexterous manipulation

The robot developed by the NTU Singapore scientists is being used to explore dexterous manipulation, an area of robotics that requires precise control of forces and motions with fingers or specialised robotic hands. As a result, the robot is more human-like in its manipulation of objects.

So far, autonomous demonstration of dexterous manipulation has been limited to elementary tasks, said Asst Prof Pham.

“One reason could be that complex manipulation tasks in human environments require many different skills. These include being able to map the exact locations of the items, plan a collision-free motion path, and control the amount of force required. On top of these skills, you have to be able to manage the complex interactions between the robot and the environment,” he explained.


“The way we have built our robot, from the parallel grippers to the force sensors on the wrists, all work towards manipulating objects in a way humans would,” he added.

Now that the team has achieved its goal of demonstrating the assembly of an IKEA chair, it is working with companies to apply this form of robotic manipulation to a range of industries.

The team is now working to deploy the robot to do glass bonding that could be useful in the automotive industry, and drilling holes in metal components for the aircraft manufacturing industry. Cost is not expected to be an issue as all the components in the robotic setup can be bought off the shelf.

About this neuroscience research article

Funding: The research, which took three years, was supported by grants from the Ministry of Education, NTU’s innovation and enterprise arm NTUitive, and the Singapore-MIT Alliance for Research & Technology.

Source: Jie Ying – NTU Singapore
Publisher: NeuroscienceNews.com.
Image Source: NeuroscienceNews.com image is credited to NTU Singapore.
Original Research: Open access research for “Can robots assemble an IKEA chair?” by Francisco Suárez-Ruiz, Xian Zhou and Quang-Cuong Pham in Science Robotics. Published April 19, 2018.
doi:10.1126/scirobotics.aat6385

Cite This NeuroscienceNews.com Article

MLA: NTU Singapore. “Robot Autonomously Assembles IKEA Chair.” NeuroscienceNews, 19 April 2018. <https://neurosciencenews.com/ikea-chair-robot-8838/>.
APA: NTU Singapore. (2018, April 19). Robot Autonomously Assembles IKEA Chair. NeuroscienceNews. Retrieved April 19, 2018, from https://neurosciencenews.com/ikea-chair-robot-8838/
Chicago: NTU Singapore. “Robot Autonomously Assembles IKEA Chair.” https://neurosciencenews.com/ikea-chair-robot-8838/ (accessed April 19, 2018).


Abstract

Can robots assemble an IKEA chair?

The limits of robotic manipulation were explored by automatic assembly of an IKEA chair.
