Revolutionizing Wildlife Tracking: Markerless Tracking Using AI

A significant advance in animal tracking has been made by researchers from the Cluster of Excellence Collective Behaviour at the University of Konstanz. They have developed a powerful computer vision framework that estimates posture and tracks individual identities without the use of markers. The technology is not only suitable for indoor environments but also adapts efficiently to applications in the wild, enabling markerless tracking of animals.

The research aims to set a new standard in computer vision and machine learning by enabling markerless tracking of animals in the wild. A setup of four cameras covered a scene in a park in Konstanz, capturing the interactions of three pigeons. After the footage was collected, it was analyzed with the computer vision framework the team developed.

The framework detected all the pigeons, outlined their central body parts, and determined their posture, position, and interactions with the surrounding environment. Remarkably, all of this tracking was accomplished without attaching any markers to the pigeons or requiring human intervention.

This work paved the way for 3D-MuPPET, a newly developed computer vision framework capable of estimating and tracking the 3D body postures of up to ten pigeons in real time. It is an exciting leap forward for tracking larger animal groups in 3D, a domain that has lacked comprehensive frameworks and benchmarks.

3D-MuPPET, short for 3D Multi-Pigeon Pose Estimation and Tracking, is a further contribution to the field from Urs Waldmann and Alex Chan. With it, they have produced the first example of 3D tracking for an entire group of up to ten individuals.

A further advantage of the framework is that it can be used to track pigeons in the wild. It is one of the first case studies demonstrating the transition from tracking animals in captivity to tracking them in the wild, a monumental step towards measuring the fine-scale behaviour of animals in their natural habitats. Moreover, the methods can be extended in the future to collect data on other species in a non-invasive manner.

3D-MuPPET is a valuable tool for researchers who want to use 3D posture reconstruction to understand collective behaviour across environments and species. The only requirements are a multi-camera setup and a 2D posture estimator; with those in place, the framework can track the 3D postures of any set of animals.
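To make the multi-camera requirement concrete, here is a minimal sketch of the geometric step that lets 2D posture estimates from several calibrated cameras become a 3D posture: triangulating each keypoint by direct linear transform (DLT). This is a standard technique, not code from 3D-MuPPET itself, and the camera matrices and points below are illustrative placeholders.

```python
import numpy as np

def triangulate(projections, points_2d):
    """Triangulate one 3D point from two or more views.

    projections: list of 3x4 camera projection matrices P = K[R|t]
    points_2d:   list of (x, y) pixel coordinates of the same keypoint
    """
    rows = []
    for P, (x, y) in zip(projections, points_2d):
        # Each view contributes two linear constraints on the homogeneous
        # 3D point X: x*(P[2]@X) = P[0]@X  and  y*(P[2]@X) = P[1]@X
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.asarray(rows)
    # Homogeneous least-squares solution: right singular vector of A
    # belonging to the smallest singular value
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenise

# Two toy cameras observing the same keypoint at (1, 2, 10)
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])          # camera at the origin
P2 = np.hstack([np.eye(3), np.array([[-1.0, 0, 0]]).T])  # shifted along x
X_true = np.array([1.0, 2.0, 10.0])
projected = [P @ np.append(X_true, 1.0) for P in (P1, P2)]
pts = [(p[0] / p[2], p[1] / p[2]) for p in projected]
print(np.round(triangulate([P1, P2], pts), 3))
```

Running a 2D posture estimator in every camera view and applying this triangulation per keypoint yields the full 3D posture; in practice the cameras must first be calibrated so the projection matrices are known.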

Disclaimer: The above article was written with the assistance of AI. The original source can be found on ScienceDaily.