Researcher(s)
- Abby Fowler, Neuroscience, University of Delaware
Faculty Mentor(s)
- Joshua Neunuebel, Psychological and Brain Sciences, University of Delaware
Abstract
Social behaviors are critical for the survival and reproduction of many animals and rely on continuous feedback between brain and behavior. To investigate the neural mechanisms underlying these interactions, researchers must first segment behavior into meaningful units. Our lab is developing an unsupervised model that uses a self-organizing map (SOM) to segment mouse behavior from video data. Currently, however, the model outputs behavioral segments as short as one video frame (33 milliseconds), resulting in fragmented and unrealistic representations. Previous research identified behavioral motifs at sub-second timescales in videos of single mice using a filtered derivative change point algorithm, which detects statistically significant shifts in time series data, such as a change in local mean, that indicate potential behavioral transitions. To identify meaningful segmentation timescales in socially interacting mice, we compare two approaches for detecting behavioral transitions. The first applies the filtered derivative algorithm to transformed image pixels to detect changes in the visual scene. The second uses mouse tracking software to fit an ellipse to each mouse, estimates spine lengths from the fitted ellipses, and identifies the associated postural change points. Preliminary results show mean change point durations of 489 ms (SD = 315 ms) and 591 ms (SD = 598 ms) for the two approaches, respectively. Once optimized, the chosen timescale will inform SOM-based clustering, improving the interpretability of behavioral segmentation.
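The core idea of a filtered derivative change point detector, as described above, is to score each time point by the difference between local means just after and just before it, then keep well-separated peaks of that score. The sketch below is a minimal illustration of that idea only; the function name, parameters, and thresholding scheme are hypothetical and are not the lab's actual implementation.

```python
import numpy as np

def filtered_derivative_changepoints(x, window=15, threshold=1.5, min_gap=15):
    """Detect shifts in local mean via a filtered-derivative statistic.

    For each index t, compare the mean of the `window` samples after t
    with the mean of the `window` samples before t; large absolute
    differences (relative to the signal's overall std) mark candidate
    change points. Nearby candidates within `min_gap` samples are merged,
    keeping the strongest.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = np.zeros(n)
    for t in range(window, n - window):
        d[t] = x[t:t + window].mean() - x[t - window:t].mean()
    # Normalize by the global std so `threshold` is unitless.
    scale = np.std(x) or 1.0
    score = np.abs(d) / scale
    # Keep peaks above threshold, enforcing a minimum spacing.
    candidates = np.where(score > threshold)[0]
    changepoints = []
    for t in candidates:
        if not changepoints or t - changepoints[-1] >= min_gap:
            changepoints.append(int(t))
        elif score[t] > score[changepoints[-1]]:
            changepoints[-1] = int(t)
    return changepoints

# Example: a clean step in the local mean at sample 100 is recovered.
signal = np.concatenate([np.zeros(100), np.ones(100)])
print(filtered_derivative_changepoints(signal))
```

In the abstract's setting, `x` would be a per-frame quantity such as a transformed-pixel summary or an estimated spine length, and the spacing between detected change points gives the segment durations reported above.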