Image from the 3D-POP dataset.

Sensory ecology - 3D Tracking of movement and behavior.


My research on sensory ecology focuses on the mechanisms behind complex behavioral patterns of animals, studied by measuring their movement and posture. I use computer vision & AI to automatically measure subtle differences in the behavioral patterns of individuals in a group by tracking their interactions in three dimensions. 3D tracking and individual identification offer a remarkable window onto complex attributes such as an animal's gaze or visual field of view. Since 2018, I have developed several fundamental datasets, tracking methods, and infrastructures with birds as my primary system, and I am now working with wild birds and primates on topics such as social learning and tool use. I am also interested in Virtual Reality systems for animals, which probe behavioral mechanisms directly by triggering visuo-motor responses with artificial visual stimulation.

Focus

In 2023, we introduced SMART-BARN, one of the largest facilities for tracking animals (insects, mammals, and birds) in indoor environments. We combined motion capture cameras, video cameras, and audio microphones for complete 3D tracking of movement and sound, and conducted three concrete case studies to demonstrate the system's capability to support a wide range of biological questions. The research facility has since been adapted as the Imaging Hangar by the Centre for the Advanced Study of Collective Behaviour (CASCB) at the University of Konstanz. The study is published in Science Advances, and a hands-on overview can be found at New Scientist or Uni Konstanz. The facility currently supports a wide range of experiments in collective behavior, robotics, and swarm intelligence.

3D Tracking with Markers

I used SMART-BARN to create the first large-scale datasets for markerless 3D posture tracking of birds. I developed a novel technique to record the 3D postural movements (head and body orientation) of birds using motion capture markers, and then used this technique to create the millions of annotations necessary to train a machine learning model for markerless tracking. 3D-POP is one of the first datasets consisting of sub-millimeter-accurate 3D postures for an entire group of birds over several hours. We show that the resulting model works with pigeons without markers. Read more details here.
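To illustrate the underlying geometry, the sketch below shows how a head orientation can be recovered from three motion-capture markers by building an orthonormal head frame. This is a minimal, hypothetical example (the marker names and frame conventions are my illustrative assumptions, not the actual 3D-POP pipeline):

```python
import math

# Small vector helpers (pure Python, no external dependencies).
def sub(a, b):
    return [x - y for x, y in zip(a, b)]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def head_orientation(beak, left, right):
    """Build an orthonormal head frame from three hypothetical marker
    positions: beak tip, left side of head, right side of head.

    Returns (forward, up, lateral) unit vectors; yaw and pitch of the
    head can then be read off the forward vector.
    """
    mid = [(l + r) / 2 for l, r in zip(left, right)]
    forward = normalize(sub(beak, mid))       # beak direction
    lateral = normalize(sub(right, left))      # left-to-right axis
    up = normalize(cross(lateral, forward))    # completes the frame
    lateral = cross(forward, up)               # re-orthogonalize
    return forward, up, lateral

# Example: a head looking straight along the +x axis.
fwd, up, lat = head_orientation([1.0, 0.0, 0.0],
                                [0.0, 0.05, 0.0],
                                [0.0, -0.05, 0.0])
yaw = math.atan2(fwd[1], fwd[0])   # horizontal gaze angle
pitch = math.asin(fwd[2])          # vertical gaze angle
```

Given such a frame per video frame, gaze direction and visual field coverage follow directly from the forward vector, which is what makes marker-based posture so useful for attention studies.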

Foundation for markerless 3D movement & posture tracking

My collaborators used 3D-POP to extend the 3D posture work to birds in the wild. We show that data created in indoor environments can be used effectively for tracking wild birds with minimal additional annotations. My collaborators and I have further extended this approach to other systems, such as great tits and Siberian jays, to study social learning.

Going from the lab to the wild

My research on 3D posture has led to an understanding of the visual mechanisms of attention in pigeons. The SMART-BARN technology has further shaped research on predator-prey dynamics and collective information processing in pigeons. Pigeons are a model system for behavioral studies, and thus the techniques and models developed for tracking them are useful to scientists across the globe.

What did we learn so far?

Other projects and relevant work

I am currently extending this work toward single- and multi-view posture tracking of primates, i.e., chimpanzees, capuchin monkeys, and olive baboons. These posture tracking methods will be used to explore topics such as social learning, tool use, and the collective dynamics of sleep.

On the right side, you can find a video of my seminal review on Virtual Reality systems used with animals. This review traces the history of visual-stimulation experiments and takes the reader on a journey through how and why various visual stimuli have been used to study animal behavior.

Previous

Mating Ecology - Project MELA (Mating Ecology of a Lek Breeding Antelope)