SmartAnnotator Help
How the behaviors are classified
Behavior annotation in SmartAnnotator consists of the following steps:
1. The top-view videos are preliminarily annotated with the Rat Behavior Recognition Module or Mouse Behavior Recognition Module of EthoVision XT version ***. See the section Behavior Recognition Requirements in the EthoVision XT Help for best practices for camera setup, arena setup, and detection settings.
2. SmartAnnotator creates fragments, each containing a separate event, based on the automatic behavior classification in EthoVision XT.
3. Video fragments are shown to the user for manual annotation.
4. The AI model is updated with these manual annotations.
5. Steps 3 and 4 are repeated until the classifier reaches an accuracy threshold.
6. The resulting classifier is applied to all events that have not yet been annotated and assigns them an automatic annotation when it is sufficiently certain.
7. The events that are most informative for the classifier are shown to the user for manual annotation.
8. Steps 3-7 are repeated until all events are annotated. A simplified sketch of this loop is given below.
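The following Python sketch illustrates the general shape of this active-learning loop. It is not SmartAnnotator source code: the function names (ask_user, train_model, evaluate, predict_with_confidence, select_most_informative), the batch size, and both thresholds are illustrative assumptions.

```python
# Illustrative sketch of the annotation loop described above (not SmartAnnotator code).
# All helper functions and threshold values are hypothetical placeholders.

ACCURACY_THRESHOLD = 0.90      # assumed stopping criterion for the bootstrap phase (step 5)
CONFIDENCE_THRESHOLD = 0.80    # assumed certainty needed for an automatic annotation (step 6)

def annotate(events, ask_user, train_model, evaluate, predict_with_confidence,
             select_most_informative, batch_size=20):
    """events: hashable fragments produced from the EthoVision XT pre-classification."""
    labeled = {}
    unlabeled = list(events)

    # Steps 3-5: bootstrap the model with manual annotations until it is accurate enough.
    model = None
    while unlabeled and (model is None or evaluate(model, labeled) < ACCURACY_THRESHOLD):
        batch, unlabeled = unlabeled[:batch_size], unlabeled[batch_size:]
        labeled.update({ev: ask_user(ev) for ev in batch})   # manual annotation
        model = train_model(labeled)                         # update the AI model

    # Steps 6-8: alternate automatic annotation and targeted manual annotation.
    while unlabeled:
        still_uncertain = []
        for ev in unlabeled:
            label, confidence = predict_with_confidence(model, ev)
            if confidence >= CONFIDENCE_THRESHOLD:
                labeled[ev] = label                          # automatic annotation
            else:
                still_uncertain.append(ev)
        if not still_uncertain:
            break
        # Ask the user about the events the classifier would learn most from.
        queries = select_most_informative(model, still_uncertain, batch_size)
        labeled.update({ev: ask_user(ev) for ev in queries})
        unlabeled = [ev for ev in still_uncertain if ev not in labeled]
        model = train_model(labeled)

    return labeled
```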
How does the model identify a behavior?
Preliminary analysis by EthoVision XT
The preliminary analysis of the mouse or rat behavior by EthoVision XT makes use of the following information:
- Location
- Body shape
- Movement
- Environment proximity information based on the location of the animal
- Multi-scale temporal window features. See, for example, https://en.wikipedia.org/wiki/Multiscale_modeling and https://medium.com/@data-overload/sliding-window-technique-reduce-the-complexity-of-your-algorithm-5badb2cf432f for explanations of these concepts. A sketch illustrating the idea is given after this list.
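The sketch below illustrates the idea of multi-scale temporal window features using a simple per-frame speed signal: the same signal is summarized over several window sizes around each frame. It only illustrates the concept; the actual signals, window sizes, and statistics used by EthoVision XT are not specified here, and the numbers shown are assumptions.

```python
import numpy as np

def multiscale_window_features(signal, frame, window_sizes=(5, 15, 45)):
    """Summarize a per-frame signal (e.g. speed in cm/s) around `frame`
    over several window sizes (in frames). Window sizes are illustrative."""
    features = {}
    for w in window_sizes:
        half = w // 2
        segment = signal[max(0, frame - half): frame + half + 1]
        features[f"mean_{w}"] = float(np.mean(segment))   # average over this time scale
        features[f"std_{w}"] = float(np.std(segment))     # variability over this time scale
    return features

# Example: features around frame 50 of a short synthetic speed trace.
speed_trace = np.abs(np.random.default_rng(0).normal(4.0, 1.5, size=100))
print(multiscale_window_features(speed_trace, frame=50))
```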
More information on the Mouse Behavior Recognition and Rat Behavior Recognition modules in EthoVision XT can be found in the following paper: https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2023.1198209/full
SmartAnnotator
More information on the AI model in SmartAnnotator can be found in the paper Fast Annotation of Rodent Behaviors with AI Assistance: Human Observer and SmartAnnotator Collaborate through Active Learning, pages 230-235:
https://www.measuringbehavior.org/wp-content/uploads/2024/06/Measuring-Behavior-2024-Final-web.pdf