AI-trends.today

A robot with feelings: Tactile AI could transform human-robot relationships

Robotics · By admin · 27/05/2025 · 8 Mins Read

Science fiction has been filled with intelligent robots for decades. These machines raise tantalizing moral questions, and they also hint at how difficult it is to create artificial consciousness. In reality, recent advances in deep learning have enabled the tech industry to make striking progress in artificial intelligence (AI), allowing machines to learn from data on their own.

This breakthrough eliminates the need for painstaking, manual feature engineering—a key reason why deep learning stands out as a transformative force in AI and tech innovation. 

Building on this momentum, Meta — which owns Facebook, WhatsApp and Instagram — is diving into bold new territory with advanced “tactile AI” technologies. The company recently introduced three new AI-powered tools—Sparsh, Digit 360, and Digit Plexus—designed to give robots a form of touch sensitivity that closely mimics human perception. 

The objective? To create robots that don’t simply mimic human actions, but actively engage with their environment.

Fittingly, Sparsh is named after the Sanskrit word for “touch.” It is a general-purpose touch model that gives robots the ability to react in real time to tactile input. Along the same lines, the Digit 360 sensor is an artificial robot fingertip that can perceive even the smallest physical sensations, such as a tiny needle poke or a change in pressure. Digit Plexus provides a standard framework for integrating tactile sensors into various robotic designs, making it easier for robots to collect and analyze touch data. Meta believes these AI-powered tools can help robots tackle complex tasks that require a “human” touch, particularly in fields such as healthcare, where sensitivity, precision, and accuracy are crucial.
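Meta has not published a public API for these tools, but the division of labor described above, raw fingertip readings flowing through a shared hub into a learned touch model, can be illustrated with a small and purely hypothetical sketch. The `TactileHub` class, `classify_touch` function, and all threshold values below are invented for illustration; they are not Meta's actual interfaces.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class TactileReading:
    """One sample from a fingertip-style sensor (units are illustrative)."""
    sensor_id: str
    pressure_kpa: float   # normal force expressed as pressure
    shear: float          # tangential force component

class TactileHub:
    """Hypothetical Digit Plexus-style hub: routes readings from many
    sensors to a single perception callback, mirroring the 'standard
    framework' role described in the article."""
    def __init__(self, perceive: Callable[[TactileReading], str]):
        self.perceive = perceive
        self.log: list[tuple[str, str]] = []

    def ingest(self, reading: TactileReading) -> str:
        label = self.perceive(reading)
        self.log.append((reading.sensor_id, label))
        return label

def classify_touch(r: TactileReading) -> str:
    """Stand-in for a learned touch model such as Sparsh: maps raw
    signals to a coarse contact label. Thresholds are invented."""
    if r.pressure_kpa < 0.5:
        return "no-contact"
    if r.pressure_kpa < 5.0:
        return "light-touch"   # e.g. a tiny needle poke
    return "firm-grip"

hub = TactileHub(classify_touch)
print(hub.ingest(TactileReading("index-tip", 0.8, 0.1)))   # light-touch
print(hub.ingest(TactileReading("thumb-tip", 12.0, 0.4)))  # firm-grip
```

The point of the sketch is the separation of concerns: sensors only produce readings, the hub only routes and logs them, and the perception model is a swappable callback.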

The introduction of sensor robots also raises a larger question: will this technology enable new levels of cooperation, or introduce complexity that society is not prepared to deal with?

“As robots unlock new senses, and gain a high degree of intelligence and autonomy, we will need to start considering their role in society,” Ali Ahmed, co-founder and CEO of Robomart, told me. “Meta’s efforts are a major first step towards providing them with human-like senses. As humans become exceedingly intimate with robots, they will start treating them as life partners, companions, and even go so far as to build a life exclusively with them.”

What is the future of human-robot harmony? 

Meta unveiled its tactile AI advancements alongside PARTNR, a benchmark framework for evaluating human-robot interaction at scale. PARTNR is designed to let robots and humans interact in both structured and unstructured environments. It uses large language models (LLMs) to guide the interactions, allowing robots to be assessed on key elements such as coordination and task tracking, and moving them from mere “agents” toward genuine “partners” capable of working fluidly with human counterparts.
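The actual PARTNR benchmark defines its own task suites and metrics; the underlying scoring idea, checking whether a robot completes the subtasks a planner assigned to it without duplicating its human partner's work, can be sketched with invented data. The metric names, task labels, and scoring rules below are assumptions for illustration, not Meta's real benchmark.

```python
# Hypothetical sketch of a PARTNR-style scoring loop: compare a robot's
# executed actions against the subtasks assigned to it by a planner.
# Task labels, metric names, and scoring rules are invented.

def evaluate_episode(assigned: list[str], executed: list[str],
                     human_done: set[str]) -> dict[str, float]:
    # Task tracking: fraction of assigned subtasks the robot completed.
    completed = [t for t in assigned if t in executed]
    tracking = len(completed) / len(assigned) if assigned else 1.0
    # Coordination: penalize redoing work the human partner already did.
    duplicated = [t for t in executed if t in human_done]
    coordination = 1.0 - len(duplicated) / len(executed) if executed else 1.0
    return {"task_tracking": tracking, "coordination": coordination}

scores = evaluate_episode(
    assigned=["fetch-cup", "wipe-table"],
    executed=["fetch-cup", "wipe-table", "open-fridge"],
    human_done={"open-fridge"},
)
print(scores["task_tracking"], round(scores["coordination"], 2))  # 1.0 0.67
```

Here the robot tracked its assignment perfectly but lost coordination points for opening a fridge its partner had already handled, which is the kind of distinction that separates an “agent” from a “partner.”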

“The current paper is very limited for benchmarking, and even in Natural Language Processing (NLP), it took a considerable amount of time for LLMs to be perfected for the real world. It will be a huge exercise to generalize for the 8.2 billion population with a limited lab environment,” Ram Palaniappan, CTO of TEKsystems, told me. “There will need to be a larger dedicated effort to boost this research paper to get to a workable pilot.”

Meta has joined forces with Wonik Robotics and GelSight, Inc. to bring its tactile AI advances to market. GelSight is responsible for the Digit 360 sensor, due for release next year, which will give researchers access to advanced tactile capabilities. Wonik Robotics will produce the next generation of the Allegro Hand, which incorporates Digit Plexus and allows robots to perform complex, touch-sensitive tasks with a high degree of precision. But not everyone is convinced that this is a positive step.

“Although I still believe that adding sensing capabilities could be meaningful for robots to understand the environment, I believe that current use cases are more related to robots for mass consumers and improving on their interaction,” Agustin H. Huerta, SVP of Digital Innovation for North America at Globant, told me. “I don’t believe we are going to be close to giving them human-level sensations, nor that it’s actually needed. Rather, it will act more as an additional data point for a decision-making process.”

Meta’s tactile AI developments are part of a larger trend, with European countries such as Germany, France, and the UK also pushing the boundaries of robotic sensing and awareness. The Horizon 2020 programme, for example, supports projects that range from tactile sensing and environmental awareness all the way up to decision-making capabilities. The Karlsruhe Institute of Technology in Germany recently unveiled ARMAR-6, a humanoid robotic system designed specifically for industrial environments. ARMAR-6 can operate tools such as drills and hammers, and its AI capabilities allow it to learn how to pick up objects and help human workers.

But Dr. Peter Gorm Larsen, vice-head of section at Aarhus University’s Department of Electrical and Computer Engineering and coordinator of the EU-funded RoboSAPIENS project, cautions that Meta may be missing a major challenge: the disconnect between robots operating in a virtual world and their physical environment.

“Robots do NOT have intelligence in the same way that living creatures do,” he told me. “Tech companies have a moral obligation to ensure that their products respect ethical boundaries. Personally, I’m most concerned about the potential convergence of such advanced tactile feedback with 3D glasses as compact as regular eyewear.”

Are we ready for robots that “feel”?

For Dr. Larsen, the real problem is not the tactile sensors themselves, but how they are deployed in autonomous settings. “In the EU, the Machinery Directive currently restricts the use of AI-driven controls in robots. But, in my view, that’s an overly stringent requirement, and we hope to be able to demonstrate that in the RoboSAPIENS project that I currently coordinate.”

Robots already work alongside humans in various industries around the globe. Kiwibot, for instance, has helped Swiss firms and logistics companies deal with the labor shortage in warehouses, and Anybotics recently raised $60 million to help bring more industrial robots into the US, according to TechCrunch. As industries continue to adopt artificial intelligence, this should come as no surprise. “AI accelerates productivity in repeatable tasks like code refactoring, addresses tech debt and testing, and transforms how global teams collaborate and innovate,” said Vikas Basra, global head of the Intelligent Engineering Practice at Ness Digital Engineering.

At the same time, the safety of these robots, now as well as in their potentially “sentient” future, is the main concern the industry must address in order to progress.

Matan Libis, VP of product at SQream, a company specializing in advanced data processing, told The Observer: “The next major mission for companies will be to establish AI’s place in society—its roles and responsibilities … We need to be clear about its boundaries and where it truly helps. Unless we identify AI’s limits, we’re going to face growing concerns about its integration into everyday life.”

When AI includes tactile sensing, it raises the question of whether society is ready to accept robots that “feel.” Experts argue that pure software-based superintelligence may hit a ceiling; for AI to reach a true, advanced understanding, it must sense, perceive, and act within our physical environments, merging modalities for a more profound grasp of the world—something robots are uniquely suited to achieve. Superintelligence, however, is not the same as sentience. “We must not anthropomorphize a tool to the point of associating it as a sentient creature if it has not proven that it is capable of being sentient,” explained Ahmed. “However, if a robot does pass the test for sentience, then they should be recognized as a living sentient being, and then we shall have the moral and fundamental responsibility to grant them certain freedoms and rights as a sentient being.”

Meta’s tactile AI is a significant advancement, but it remains to be seen whether these technologies can bring about revolutionary changes in society, or whether they will cross ethical lines. For now, society is left to ponder a future where AI not only sees and hears but also touches—potentially reshaping our relationship with machines in ways we’re only beginning to imagine.

“I don’t think that increasing AI’s sensing capabilities crosses the line on ethics. It’s more related to how that sensing is later used to make decisions or drive others’ decisions,” said Huerta. “The robot revolution is not going to be different from the industrial revolution. It will affect our lives and leave us in a state that I think can make humanity thrive. In order for that to happen, we need to start educating ourselves and the upcoming generations on how to foster a healthy relationship between humans and robots.”
