
Meta AI has created new tools that will allow robots to touch and feel like humans

Meta’s AI research team, known as FAIR (Fundamental AI Research), is advancing robotics with new tools designed to give robots the ability to “feel,” move skillfully, and work alongside people. The goal is robots that are not only technically capable but can also handle real-world tasks in a way that feels natural and safe around humans. Here’s a simple overview of what they announced and why it matters.

Imagine the daily tasks humans perform: getting a cup of coffee, stacking dishes, or even shaking hands. All of this requires a sense of touch and careful control that humans take for granted. However, robots do not naturally possess these abilities. They typically rely on vision or programmed instructions, making it difficult to manipulate delicate objects, understand textures, or adapt to changes on the spot.

Meta’s latest tools help robots overcome these limitations. By giving robots a sense of “touch” through advanced sensors and systems, these tools could enable robots to perform tasks with human-like sensitivity and adaptability. This could open up a world of possibilities for robots in areas such as healthcare, manufacturing, and even virtual reality.

Meta has released three new technologies to improve robots’ touch, movement, and collaboration with humans:

1. Meta Sparsh:
Sparsh is a touch-sensing technology that helps AI recognize textures, pressure and even movement through touch, not just sight. Unlike many AI systems that require labeled data for each task, Sparsh uses raw data, making it more adaptable and accurate for various tasks. It’s like giving a robot a general sense of touch that can work with many different touch sensors.
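The core idea, one shared representation that works across different touch sensors and tasks, can be illustrated with a toy sketch. Everything here is an assumption for illustration: the embedding size, the `encode_touch` function, and the random projection standing in for what would really be a network pretrained with self-supervision on raw touch data. This is not Meta’s actual model or API.

```python
import numpy as np

# Toy sketch of a sensor-agnostic tactile encoder in the spirit of
# Sparsh. A real encoder is a neural network pretrained on raw,
# unlabeled touch data; a deterministic random projection stands in
# here, just to show the interface downstream tasks would rely on.

EMBED_DIM = 64  # size of the shared touch representation (assumed)

def encode_touch(raw_frame: np.ndarray) -> np.ndarray:
    """Map a raw tactile frame (any HxW sensor image) to a fixed-size,
    unit-norm embedding. It never assumes a particular resolution or a
    labeled task, which is what lets one encoder serve many sensors."""
    flat = raw_frame.astype(float).ravel()
    # The per-sensor projection would normally be learned; here it is
    # derived deterministically from the input size for illustration.
    proj = np.random.default_rng(flat.size).normal(size=(EMBED_DIM, flat.size))
    emb = proj @ flat
    return emb / (np.linalg.norm(emb) + 1e-8)

# Two sensors with different resolutions yield embeddings of the same
# shape, so a single downstream model can consume both.
rng = np.random.default_rng(0)
frame_a = rng.random((32, 32))   # e.g. a camera-based tactile image
frame_b = rng.random((16, 24))   # a different, smaller sensor
print(encode_touch(frame_a).shape, encode_touch(frame_b).shape)
```

The point of the sketch is the contract, not the math: downstream grasping or texture-recognition code sees only fixed-size embeddings, never sensor-specific pixels or per-task labels.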

2. Meta Digit 360:
Digit 360 is an advanced artificial fingertip with human-level touch sensitivity. It can pick up tiny differences in texture and sense very small forces, capturing tactile details similar to those of a human finger. It’s built with a lens that covers the entire fingertip, allowing it to “see” in all directions, and it can even react to temperature changes. This makes it useful for delicate tasks, such as medical procedures or detailed virtual interactions.

3. Meta Digit Plexus:
Plexus is a platform that connects multiple touch sensors across a robotic hand, giving it tactile sensing from the fingertips to the palm. This makes it easier to build robotic hands with the kind of precise control humans have, helping researchers create robots that can manipulate fragile or irregular objects.
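Hand-wide sensing like this boils down to gathering readings from many pads into one snapshot a grasp controller can act on. The sketch below shows that idea only; the sensor names, the force units, and the `should_loosen` grip rule are all illustrative assumptions, not Meta’s actual interface.

```python
from dataclasses import dataclass, field

# Toy sketch of hand-wide tactile integration in the spirit of Plexus:
# readings from several fingertip and palm pads are collected into one
# snapshot, and a simple grip rule consumes them together.

@dataclass
class TactileSnapshot:
    # Normal force per sensor pad, in newtons (assumed units).
    forces: dict[str, float] = field(default_factory=dict)

    def max_force(self) -> float:
        """Largest force anywhere on the hand right now."""
        return max(self.forces.values(), default=0.0)

    def should_loosen(self, fragile_limit: float = 2.0) -> bool:
        """Back off the grip if any pad presses harder than a fragile
        object can tolerate (illustrative threshold)."""
        return self.max_force() > fragile_limit

# One synchronized reading across the whole hand.
snap = TactileSnapshot(forces={
    "thumb_tip": 1.1, "index_tip": 2.6, "middle_tip": 0.9, "palm": 0.4,
})
print(snap.should_loosen())  # the index fingertip presses too hard
```

The design point is that the controller reasons about the hand as a whole, not about each sensor in isolation, which is what a shared platform like Plexus makes practical.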

Meta is not developing these tools alone. It has partnered with two companies, GelSight Inc. and Wonik Robotics, to manufacture and distribute these touch-sensing tools:

– GelSight Inc. will produce and distribute the Digit 360 fingertip sensor. Researchers can apply to gain early access and explore new uses of this technology in areas such as healthcare, robotics, and more.

– Wonik Robotics will integrate Plexus technology into its existing robotic hand model, Allegro Hand, improving its tactile capabilities and making it accessible to researchers who wish to study or build on this technology.

These partnerships help ensure that researchers and developers around the world have access to these cutting-edge tools, allowing them to explore and expand on Meta’s work in tactile perception and robot dexterity.

New PARTNR tool

For robots to be useful in our daily lives, they need more than just physical skills; they must also work well with people. To address this, Meta created PARTNR (Planning And Reasoning Tasks in humaN-Robot collaboration), a tool that helps test and improve how robots handle tasks that involve interaction with humans.

PARTNR functions as a simulation platform that allows robots to practice different household tasks and situations with “virtual” human partners. This is a safer and more scalable way to train robots before putting them in real-world situations. It allows AI to acquire important social and spatial skills, such as knowing how to move in shared spaces or adapt to human instructions.
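A simulated collaboration episode of this kind can be caricatured in a few lines. In the sketch below the task list, the turn-taking rule, and the failure rate are all invented for illustration; this is not Meta’s benchmark API, just the shape of the loop: a virtual human and a robot share subtasks, attempts can fail, and the harness measures how the work got done.

```python
import random

# Toy sketch of a PARTNR-style evaluation loop: a scripted "virtual
# human" and a simple robot policy take turns attempting household
# subtasks, and the harness records how much work each partner did.

TASKS = ["clear_table", "wash_dishes", "put_away_dishes", "wipe_counter"]

def run_episode(seed: int, max_steps: int = 20) -> dict:
    """Run one seeded collaboration episode; return steps per actor."""
    rng = random.Random(seed)
    remaining = list(TASKS)
    steps = {"human": 0, "robot": 0}
    while remaining and sum(steps.values()) < max_steps:
        # Whoever is free picks up the next subtask (illustrative rule).
        actor = "human" if rng.random() < 0.5 else "robot"
        steps[actor] += 1
        # An attempt can fail (dropped item, blocked path), forcing a retry.
        if rng.random() < 0.2:
            continue
        remaining.pop(0)
    return steps

# Average the division of labor over several seeded episodes.
results = [run_episode(s) for s in range(5)]
print("steps per episode:", results)
```

Seeding each episode makes runs repeatable, which is what lets researchers compare planners fairly: the same virtual partner behaves the same way every time a given scenario is replayed.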

With PARTNR, researchers can evaluate how robots and AI plan, track tasks, and respond to unexpected events. This is the key to creating robots that not only work alone but collaborate effectively with humans.

Meta’s advancements in robotic touch and human-robot interaction could have significant impacts across industries:

– Healthcare: Robots with tactile skills could assist in surgeries or help provide care with gentle movements.
– Manufacturing and warehousing: Robots could handle fragile objects or perform complex assembly tasks.
– Virtual reality: Robots and VR devices could “feel” objects, creating more immersive experiences.