This animation shows a reconstruction of how the brain dynamically represents musical pitches. Pitches that are closer together in this space are perceived as more similar at that moment. 4/n
The brain doesn’t process pitch in an unstructured way. Typically, it represents pitches in a mostly linear structure—think piano keyboard layout. BUT—just 0.3 seconds after hearing a sound, something wild happens: the brain briefly represents pitch in a helix-like structure! 5/n
The helix model reflects the idea that pitches separated by an octave (e.g., the repeating piano keys) are perceived as inherently similar. This concept was first explored in the early 1900s by Géza Révész, laying the groundwork for modern music cognition! 🧠🎹 6/n
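For intuition, the helix idea can be sketched in a few lines of code: pitch class (chroma) is an angle around a circle, pitch height is a vertical axis, so notes an octave apart stack directly above one another. This is just an illustrative parameterization of the classic pitch helix, not the paper's analysis; the function name and scaling defaults are made up here.

```python
import math

def pitch_helix_coords(midi_note, radius=1.0, rise_per_octave=1.0):
    """Map a MIDI note number to 3-D coordinates on a pitch helix.

    Chroma wraps around the circle once per octave, while height
    rises linearly with pitch, so notes an octave apart share the
    same (x, y) position and differ only in height.
    """
    octave_fraction = midi_note / 12.0        # one full turn per octave
    angle = 2 * math.pi * octave_fraction
    x = radius * math.cos(angle)
    y = radius * math.sin(angle)
    z = rise_per_octave * octave_fraction     # height grows with pitch
    return (x, y, z)

# C4 (MIDI 60) and C5 (MIDI 72) land at the same angle, one turn apart:
c4 = pitch_helix_coords(60)
c5 = pitch_helix_coords(72)
```

On this helix, octave-related notes are close in the horizontal plane even though they are far apart on a linear (piano-keyboard-like) pitch axis, which is exactly the similarity the model is meant to capture.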
In short: By combining machine learning and MEG, we show how the brain’s dynamic pitch representation echoes ideas proposed over 100 years ago. Feels like completing a full circle in music cognitive neuroscience! Huge thanks to my collaborators! End/n