
Projects

A user interface for automated behavior quantification and multimodal data visualization

PI: David J. Anderson (Division of Biology and Biological Engineering)
SASE: Dave Rumph, Senior Software Engineer

Researchers in the Anderson lab investigate the function of neural circuits that control behaviors such as aggression, mating and defense, with the aid of two software toolsets. One, the Mouse Action Recognition System (MARS), takes videos of two interacting mice as input and automatically identifies common behaviors by locating the mice, determining their orientation relative to each other, and classifying their behaviors. The other, the Behavior Ensemble and Neural Trajectory Observatory (BENTO), visualizes the videos used by MARS alongside associated neural, audio and pose data. BENTO also allows researchers to review behavior annotations produced by MARS and to annotate behaviors manually.
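The MARS workflow described above (locate the animals, estimate their relative orientation, classify the behavior) can be sketched as a per-frame pipeline. This is a minimal illustration only; the function names, keypoints, and the toy distance-based classifier are hypothetical stand-ins, not MARS's actual API or models.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """Stand-in for a single video frame (real MARS operates on image data)."""
    index: int

def detect_mice(frame):
    # Hypothetical detector stage: locate the two animals in the frame.
    return [{"id": "resident", "xy": (10, 20)}, {"id": "intruder", "xy": (30, 40)}]

def estimate_pose(frame, detections):
    # Hypothetical pose stage: per-animal keypoints (nose, tail base, ...).
    return {d["id"]: {"nose": d["xy"], "tail_base": (d["xy"][0] + 5, d["xy"][1])}
            for d in detections}

def classify_behavior(poses):
    # Hypothetical classifier stage: map relative geometry to a behavior label.
    # Real MARS uses trained classifiers, not a hand-set distance threshold.
    dx = abs(poses["resident"]["nose"][0] - poses["intruder"]["nose"][0])
    return "investigation" if dx < 25 else "other"

def annotate(frames):
    """Run the three stages frame by frame, yielding (frame index, label) pairs."""
    labels = []
    for frame in frames:
        detections = detect_mice(frame)
        poses = estimate_pose(frame, detections)
        labels.append((frame.index, classify_behavior(poses)))
    return labels

print(annotate([Frame(0), Frame(1)]))
```

The value of the staged design is that each stage (detection, pose estimation, classification) can be retrained or replaced independently.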

Over the course of this project, MARS has been upgraded from a very early version of TensorFlow to TF 1.15, and from Python 2 to Python 3. In addition, the MARS user code has been performance-tuned so that it runs at usable speeds on personal computers such as laptops, not just on lab servers with expensive GPUs.

A beta version of a Python-language version of BENTO has been released. The goal is to supply functionality equivalent to the existing MATLAB version, with improved experiment-management capabilities, an internal architecture that supports extending its capabilities via plug-ins, and eventual integration with MARS. That integration will allow researchers both to submit experiments for automatic behavior annotation and to supply MARS with new training data directly from BENTO. At present, the new version of BENTO supports display of video, pose, neural and annotation data and manual editing of annotations, and has a new database backend that organizes experimental data in a consistent way. A plug-in interface supports multiple pose formats; collaborators have implemented a plug-in for the SLEAP format, in addition to the existing MARS and DeepLabCut formats.
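A common way to structure a pose-format plug-in interface like the one described above is a registry that maps a format name to a loader, so a new format (such as SLEAP) can be added without touching the core application. The sketch below illustrates that pattern only; the names (`register_pose_format`, `load_pose`) and the returned dictionaries are illustrative, not BENTO's actual plug-in API.

```python
# Registry mapping a pose-format name to its loader function.
POSE_PLUGINS = {}

def register_pose_format(name):
    """Decorator that registers a loader under the given format name."""
    def wrap(loader):
        POSE_PLUGINS[name] = loader
        return loader
    return wrap

@register_pose_format("MARS")
def load_mars(path):
    # Hypothetical loader; a real one would parse a MARS pose file.
    return {"format": "MARS", "source": path}

@register_pose_format("SLEAP")
def load_sleap(path):
    # Hypothetical loader contributed as a plug-in.
    return {"format": "SLEAP", "source": path}

def load_pose(fmt, path):
    """Dispatch to whichever plug-in handles the requested format."""
    try:
        return POSE_PLUGINS[fmt](path)
    except KeyError:
        raise ValueError(f"No pose plug-in registered for {fmt!r}")

print(load_pose("SLEAP", "session1_predictions.h5"))
```

Because registration happens at import time, dropping a new plug-in module into the application is enough to make its format available everywhere the dispatcher is used.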

BENTO repo: https://github.com/neuroethology/bento.git
MARS repo for users: https://github.com/neuroethology/MARS.git
MARS repo for developers: https://github.com/neuroethology/MARS_Developer.git


Diagram: BENTO, with top and side views of the video data at top left, neural data at right, and the main control window and available behaviors below. Green and blue lines overlaying the top-view video image represent the pose determined automatically by MARS. Behavior annotations are shown graphically as color bars in the neural and main windows, and as a label in the upper-left corner of each video viewer.