In a quiet laboratory chamber, a tiny drama unfolds. A male fruit fly stretches out his wings and begins to vibrate them rapidly, producing a soft buzzing sound. To another fruit fly nearby, this is not noise—it is a carefully crafted love song. The female pauses, listening. For a moment, courtship seems successful.
Then, without warning, a brief green flash of light cuts across the chamber.
The male’s song stops instantly. His wings fold back against his body. The female, unimpressed by the sudden silence, turns away and walks off.
The reason for this failed romance is not another rival fly or a mistake by the male. The culprit is an advanced artificial intelligence (AI) system that detected the very start of the courtship dance and instantly shut down the specific brain cells responsible for producing the song.
This remarkable experiment, developed by scientists at Nagoya University, together with collaborators from Osaka University and Tohoku University, marks a major leap in neuroscience. Their AI system does more than just observe animals—it recognizes their behavior in real time and directly controls the brain circuits that cause those behaviors.
The study, published in Science Advances, introduces a powerful new tool called YORU (Your Optimal Recognition Utility). It opens the door to understanding how brains control social interactions with a level of precision that was previously impossible.
The Challenge of Understanding Behavior
For decades, scientists have tried to understand how the brain produces behavior. Watching animals interact—courting, fighting, grooming, sharing food—has always been central to this effort. But observing behavior accurately, especially when several animals interact at the same time, is extremely difficult.
Traditional behavior analysis relies on tracking individual body parts frame by frame, much like motion-capture systems used in video games or movies. While effective in simple situations, this approach struggles when animals overlap, move quickly, or interact closely. It is also too slow for experiments that require instant responses.
In many brain experiments, timing is everything. A delay of even a fraction of a second can mean missing the moment when a specific brain circuit becomes active.
Scientists needed a faster, smarter way to recognize behavior—one that works in real time and in complex social settings.
Meet YORU: An AI That Recognizes Behavior Instantly
YORU was designed to solve exactly this problem. Instead of tracking body parts over time, YORU takes a completely different approach. It recognizes entire behaviors from their appearance in a single video frame.
Using a camera and AI-based image recognition, YORU can instantly identify what an animal is doing and which individual is doing it—even in a group.
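The idea can be sketched in a few lines. This is a minimal illustration, not YORU's actual code: the detector stub, the `Detection` record, and the label names are all hypothetical stand-ins for a trained neural network that, in the real system, would return a behavior class and a bounding box for each animal in a single frame.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # behavior class, e.g. "wing_extension"
    bbox: tuple       # (x, y, w, h) locating the individual animal
    confidence: float # detector's confidence score

def detect_behaviors(frame):
    """Stand-in for a trained single-frame object detector.

    A real system would run a neural network on the frame here.
    This stub only illustrates the output format: each detection
    carries both WHAT is happening and WHICH individual is doing it.
    """
    # Hypothetical output for one video frame with two flies:
    return [
        Detection("wing_extension", (120, 80, 40, 30), 0.96),
        Detection("standing", (300, 200, 35, 28), 0.91),
    ]

frame = None  # placeholder for a camera frame
for d in detect_behaviors(frame):
    if d.confidence > 0.9:
        print(d.label, d.bbox)
```

Because each frame is classified on its own, there is no need to track body parts across time, which is what makes the approach robust when animals overlap or move quickly.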
According to Hayato Yamanouchi, co-first author of the study from Nagoya University’s Graduate School of Science, this method is both faster and more accurate than existing tools. YORU achieved 90–98% accuracy across different species and ran 30% faster than competing systems.
YORU successfully recognized a wide range of behaviors, including:
Food-sharing between ants
Social orientation and movement in zebrafish
Grooming behavior in mice
What makes YORU especially powerful is its flexibility. It works across species, can be trained to recognize new behaviors using only small amounts of data, and does not require programming skills to operate. This makes it accessible to scientists around the world.
The Real Breakthrough: Linking AI to the Brain
While YORU’s ability to recognize behavior is impressive, the true breakthrough came when researchers connected it to optogenetics—a technology that uses light to control neurons.
Senior author Azusa Kamikouchi explained that this combination allows scientists to intervene in the brain at the exact moment a behavior begins.
In the fruit fly experiment, YORU was trained to detect a specific behavior: wing extension during courtship. The instant the male fly began this movement, YORU sent a signal to a light source. A precise beam of light then targeted the fly, silencing the neurons responsible for producing the courtship song.
The result was immediate. The song stopped mid-note, and the male’s chances of mating dropped sharply.
In another experiment, the team went even further. They used a moving, targeted light system that followed individual flies as they walked freely. This allowed researchers to block the hearing neurons of just one fly, while others nearby remained unaffected.
Why Individual Control Matters
Before this technology, most optogenetic experiments illuminated an entire chamber at once. This meant that all animals were affected at the same time, making it impossible to study the role of a single individual during social interactions.
YORU changes this completely.
By identifying both the behavior and the individual animal in real time, YORU enables individual-focused brain control. Scientists can now ask questions such as:
What happens if only one animal in a group cannot hear?
How does silencing one individual’s behavior affect others?
Which specific actions trigger responses during social interactions?
These questions were nearly impossible to answer before.
How the Brain Control Technology Works
The system combines AI, genetics, and light in a carefully coordinated process:
Step 1: Genetic Engineering
Scientists genetically modify the animals so that specific neurons in their brains produce light-sensitive proteins, known as opsins. These proteins can either activate or silence neurons when exposed to certain wavelengths of light.
Step 2: Detection and Response
A camera records the animals’ behavior in real time.
YORU analyzes each video frame and detects the target behavior instantly.
The moment the behavior appears, YORU sends an electrical signal to a light source.
Step 3: Light Controls the Brain
The light shines only on the target animal.
The light-sensitive proteins in the selected neurons respond by opening ion channels in the neuron membrane.
This changes the activity of those neurons—either turning them on or shutting them off.
As a result, the animal’s behavior is altered in real time.
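The three steps above form a closed loop that can be sketched as follows. Everything here is a hypothetical stand-in: the stub camera, the pre-labeled frames, the `fire_light` function, and the confidence threshold are assumptions for illustration, not the published system's interface. A real rig would replace the stubs with camera I/O, the trained detector, and a driver for the light source.

```python
# Closed-loop sketch: recognize the target behavior, then trigger the light.

TARGET = "wing_extension"  # behavior that should trigger the light (hypothetical label)
THRESHOLD = 0.9            # hypothetical confidence cutoff

def camera_frames():
    """Stub camera: yields (frame_id, detected_label, confidence).

    In a real setup, each frame would come from the camera and the
    label/confidence from the detector analyzing that frame.
    """
    yield 1, "standing", 0.95
    yield 2, "wing_extension", 0.97  # courtship song begins
    yield 3, "standing", 0.93

pulses = []  # frame ids on which the light fired

def fire_light(frame_id):
    """Stub for the light source; a real driver would pulse the LED or laser
    aimed at the target animal, activating or silencing its opsin-bearing neurons."""
    pulses.append(frame_id)

for frame_id, label, confidence in camera_frames():
    # The instant the target behavior is recognized, trigger the light.
    if label == TARGET and confidence >= THRESHOLD:
        fire_light(frame_id)

print(pulses)  # → [2]
```

The key property is latency: the trigger decision happens inside the per-frame loop, so the light fires within the same frame in which the behavior is first detected, rather than after offline analysis.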
A New Era for Neuroscience
YORU represents a major step toward understanding how the brain generates behavior during real-life social interactions. By closing the loop between observation and intervention, scientists can now test cause-and-effect relationships with unmatched precision.
Importantly, the Nagoya University team has made YORU available online, allowing researchers worldwide to use it in their own studies. This openness could accelerate discoveries in fields ranging from neuroscience and biology to robotics and artificial intelligence.
From silencing a fruit fly’s love song to unraveling the mysteries of social behavior, YORU shows how AI is no longer just watching nature—it is actively helping us understand how the brain works, moment by moment.
Reference:
Hayato Yamanouchi et al., YORU: animal behavior detection with object-based approach for real-time closed-loop feedback, Science Advances (2026). DOI: 10.1126/sciadv.adw2109.
