Robots are becoming smarter every year. They can see, hear, and even make decisions using artificial intelligence. But one human ability has remained extremely difficult to replicate: the sense of touch.
Now, researchers at King's College London have developed a powerful new approach that could dramatically change the future of tactile robotics. Their new simulation platform can reduce the design and training time of touch-sensitive robots from eighteen months to just two weeks.
Published in Cyborg and Bionic Systems and supported by complementary research in Nature Communications, the work introduces two major innovations: SimTac, a physics-based simulator for bio-inspired tactile sensors, and GenForce, an AI training method that mimics human tactile memory.
Together, these breakthroughs could significantly cut the cost and time needed to develop next-generation robots.
Why Touch Matters for Robots
Most robots today rely heavily on cameras and visual systems. While vision is important, it is not enough for tasks that require delicate handling.
Humans naturally adjust force when picking up different objects. We instinctively know the difference between holding a strawberry and gripping a hammer. A robot, however, lacks this built-in understanding. Without touch feedback, it may grip too hard and crush a delicate item, or too gently and let it slip.
Tactile robots solve this problem by using sensors embedded in robotic hands or grippers. These sensors measure pressure, force, and surface texture, allowing robots to interact more carefully with objects.
This ability is crucial for:
Automated manufacturing
Warehouse logistics
Handling fragile goods
Medical robotics
Advanced prosthetic limbs
Despite these benefits, designing tactile robots has traditionally been slow and expensive. Engineers rely heavily on trial and error. Each prototype must be physically built, tested, recalibrated, and redesigned multiple times.
The result? It can take up to eighteen months to produce a single working tactile robot prototype — with no guarantee of success.
Learning from Nature’s Best Sensors
Nature has already solved the problem of touch through millions of years of evolution.
Researchers drew inspiration from some of the most sensitive biological systems:
Cats’ paws
Octopus tentacles
Elephant trunks
The human hand
Each of these structures has evolved for a specific type of interaction. A cat’s paw can silently sense vibrations. An octopus tentacle can grasp irregular shapes. An elephant’s trunk can handle both delicate flowers and heavy logs.
Instead of copying these designs physically through slow prototyping, the research team decided to simulate them digitally.
SimTac: A Virtual Playground for Tactile Design
The team developed SimTac, a physics-based simulation platform that allows researchers to design and test bio-inspired tactile sensors inside a virtual environment.
Traditional simulation tools mostly modeled flat sensor pads, roughly analogous to a single human fingertip. But real-world interactions are more complex. Try picking up a sheet of paper using only a flat fingertip, for example: it is extremely difficult.
SimTac expands the design possibilities. Engineers can now create:
Simulated cat-like paw pads
Flexible tentacle structures
Trunk-inspired gripping surfaces
Complex curved tactile geometries
The system generates realistic data based on virtual interactions with real-world object shapes. This removes the need for repeated physical prototyping during early development.
By exploring thousands of designs digitally, engineers can quickly identify the most effective structure before building a physical robot.
This drastically reduces development time from months to weeks.
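SimTac itself models the full soft-body physics of vision-based tactile sensors. As a much simpler illustration of the underlying idea of generating a tactile image purely from geometry in software, here is a toy Python sketch. The parabolic-gap contact model (borrowed from Hertzian contact theory), the function name, and all parameter values are assumptions for illustration, not SimTac's actual solver.

```python
import numpy as np

# Toy contact geometry: a convex, paw-pad-like sensor pressed against a
# spherical object. Near the contact point, the gap between two convex
# surfaces is approximately parabolic, so the simulated "tactile image"
# is the indentation depth, clipped at zero where the surfaces do not touch.
def tactile_image(pad_radius=10.0, obj_radius=3.0, press_depth=0.5, n=33):
    x = np.linspace(-4.0, 4.0, n)
    xx, yy = np.meshgrid(x, x)
    r2 = xx**2 + yy**2
    # Parabolic approximation of the gap between the two curved surfaces.
    gap = 0.5 * r2 * (1.0 / obj_radius + 1.0 / pad_radius)
    return np.clip(press_depth - gap, 0.0, None)

img = tactile_image()
print(img.max(), img[0, 0])  # deepest contact at the centre, none at corners
```

Changing `pad_radius` or `obj_radius` reshapes the resulting contact patch, which hints at how a simulator can sweep through many sensor geometries without building any of them.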
Cutting Costs with GenForce
Designing better sensors is only half the challenge. Training them is equally expensive.
High-accuracy force and torque sensors can cost more than £10,000 each. Training a full robotic hand often requires multiple sensors working together, increasing costs significantly.
To solve this, the researchers introduced GenForce, an AI model that mimics how humans learn to sense force.
Humans do not need to repeatedly test every finger to learn how much force to apply. If one finger touches an object, the brain quickly generalizes that knowledge across the entire hand.
GenForce works in a similar way.
Instead of training every tactile sensor individually, the AI abstracts force information into a simplified representation — similar to a 2D image. This “memory” of force can then calibrate the entire robotic hand.
In simple terms:
Train one sensor
Teach the AI the force pattern
Apply that knowledge to the whole device
This approach can dramatically reduce both hardware costs and training time.
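The transfer idea above can be sketched in a few lines of Python. This is a minimal toy, not the published GenForce model: the linear pixel-to-force physics, the sensor simulator, and all names and parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: each tactile pixel reports local pressure, and the total
# applied force is a fixed linear function of the 2D pressure map.
# (Invented physics for illustration only.)
N_PIX = 64                    # 8x8 tactile "image", flattened
TRUE_W = rng.random(N_PIX)    # unknown pixel-to-force weights

def read_sensor(n_samples, noise=0.05):
    """Simulate one sensor: random contacts plus measurement noise."""
    maps = rng.random((n_samples, N_PIX))
    forces = maps @ TRUE_W    # ground-truth force for each contact
    return maps + noise * rng.standard_normal((n_samples, N_PIX)), forces

# Step 1: calibrate using data from ONE sensor only.
x_a, f_a = read_sensor(500)
w_hat, *_ = np.linalg.lstsq(x_a, f_a, rcond=None)

# Step 2: apply the learned force mapping to a second, uncalibrated
# sensor of the same design, with no retraining.
x_b, f_b = read_sensor(100)
rel_err = np.mean(np.abs(x_b @ w_hat - f_b)) / np.mean(f_b)
print(f"relative force error on unseen sensor: {rel_err:.3f}")
```

The calibration fitted on the first sensor predicts force on the second sensor's data directly, mirroring the train-once, apply-everywhere pattern described above.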
Why This Is a Major Breakthrough
This combined approach — simulation plus intelligent training — offers several advantages:
1. Massive Time Savings
Development cycles shrink from eighteen months to roughly two weeks.
2. Lower Financial Barriers
Fewer expensive physical sensors are needed.
3. Expanded Design Freedom
Engineers can test nature-inspired shapes easily.
4. Improved Dexterity
Robots can handle delicate and complex objects more effectively.
5. Scalable Manufacturing
Industries can deploy tactile robots at larger scale without excessive costs.
Real-World Impact
The impact of this research could extend across multiple sectors.
Automated Manufacturing
Factories could deploy smarter robotic pickers capable of handling fragile components without damage.
E-Commerce and Warehousing
Robots could safely sort items of varying shapes and textures.
Healthcare and Prosthetics
Advanced prosthetic hands could provide more natural grip control.
Service Robotics
Domestic robots could perform household tasks requiring gentle touch.
The key benefit is flexibility. Instead of building specialized robots for narrow tasks, companies could design adaptable tactile systems faster and more affordably.
The Future of Bio-Inspired Robotics
The researchers believe this is just the beginning.
Their long-term goal is to fully fabricate bio-inspired tactile robots based on designs first perfected in simulation. By expanding the design space even further, they aim to unlock new capabilities in robotics.
As artificial intelligence and simulation tools continue to improve, we may soon see robots that can:
Adjust grip strength like a human
Recognize surface textures instantly
Manipulate soft and fragile objects safely
Learn new force interactions after a single touch
This level of sensitivity would mark a major step forward in human-robot interaction.
A New Era for Robotic Touch
For decades, robotic touch has been a bottleneck in development. While robots became excellent at vision and movement, their inability to “feel” limited their versatility.
The combination of SimTac and GenForce changes that equation.
By learning from nature and mimicking the way the human brain processes tactile information, researchers have created a faster, smarter path toward truly dexterous machines.
If this approach scales successfully, the next generation of robots may not just see and think — they may finally feel.
And thanks to this breakthrough, they could be built in weeks instead of years.
References:
(1) Zhang X, Jiang J, Chen Z, Zhao Y, Yang T, Fernandes Gomes D, Wang J, Luo S. SimTac: A Physics-Based Simulator for Vision-Based Tactile Sensing with Biomorphic Structures. Cyborg Bionic Syst. 2026;7:0510. DOI: 10.34133/cbsystems.0510
(2) Chen Z, Ou N, Zhang X, et al. Training tactile sensors to learn force sensing from each other. Nat Commun 17, 2101 (2026). https://doi.org/10.1038/s41467-026-68753-1
