mkachlicka.bsky.social
Postdoctoral researcher at Birkbeck University of London @birkbeckpsychology.bsky.social @audioneurolab.bsky.social working on speech + sounds + brains 🧠 cognitive sci, auditory neuroscience, brain imaging, language, methods https://mkachlicka.github.io
28 posts 696 followers 1,056 following

We are pleased to announce the next AFNI Bootcamp, May 28-30, 2025. First 2 days: data visualization, single subject analysis and QC. 3rd day: statistics, results reporting and group analysis. Please see here for details, registration link and preliminary schedule: afni.nimh.nih.gov/bootcamp

New paper in Imaging Neuroscience by Britta U. Westner, Tim M. Tierney, et al: Cycling on the Freeway: The perilous state of open-source neuroscience software doi.org/10.1162/imag...

An evolutionary model of rhythmic accelerando in animal vocal signalling. New paper by Yannick Jadoul et al. with Taylor Hersh and @andrearavignani.bsky.social doi.org/10.1371/journal.pcbi.1013011

New preprint! Thrilled to share my latest work with @esfinn.bsky.social -- "Sensory context as a universal principle of language in humans and LLMs" osf.io/preprints/ps...

Closing tomorrow! Apply for my fully-funded PhD studentship, open to UK students

This is important work from @thomasserre.bsky.social and co! AI and biology are diverging as AI has gotten better; but AI is also plateauing and scale alone won’t fix it… Considering neurobiology could also help us build better AI (see my extended thoughts here: arxiv.org/abs/2411.15234)

And there is another paper in our special issue in JASA which has a slightly different, but not exclusive, angle: doi.org/10.1121/10.0...

Greebles are back! www.nature.com/articles/s41...

Amplifying Humanities Research with AI and Network Science Methods workshop, Northeastern University London, May 7th, 9am - 5pm. This introductory in-person workshop gives participants a hands-on grounding in AI and Network Science methods. Book your place (open to all): tinyurl.com/32f8jfn2

🚨New preprint alert!🚨 Say you have a new hypothesis that the brain encodes a particular musical information X, which is theoretically plausible.🧐 So, you ran an encoding analysis on EEG data. Now the topography of prediction accuracies looks like this!🤩 What would you conclude?🥳 🧪

Proud to announce our primer on "Ten principles for reliable, efficient, and adaptable coding in psychology and cognitive neuroscience" www.nature.com/articles/s44... This primer is for beginners to get started, advanced programmers to improve, and PIs. #psychology #psychsci #cogsci #neuroskyence

PINEAPPLE, LIGHT, HAPPY, AVALANCHE, BURDEN Some of these words are consistently remembered better than others. Why is that? In our paper, just published in J. Exp. Psychol., we provide a simple Bayesian account and show that it explains >80% of variance in word memorability: tinyurl.com/yf3md5aj

A map of neural signals and circuits traces the logic of brain computation www.nature.com/articles/d41...

New brain/language study w/ @evfedorenko.bsky.social! We applied task-agnostic individualized functional connectomics (iFC) to the entire history of fMRI scanning in the Fedorenko lab, parcellating nearly 1200 brains into networks based on activity fluctuations alone. doi.org/10.1101/2025... 🧵

🚨Our paper with @brainlanglab.bsky.social on how multilingual phonology shapes the auditory cortex is out in @elife.bsky.social : elifesciences.org/reviewed-pre... 🚨 Loved the process and looking forward to tackling the reviews out in the open!

✍️ #BCBLpaper in #HumanBrainMapping |Temporal structure of #music improves the cortical encoding of #speech Fernández-Merino, L. (@laufdezlaura.bsky.social), Lizarazu, M., Molinaro, N. (@nicolaml.bsky.social) and Kalashnikova, M.  +info ⬇️ https://onlinelibrary.wiley.com/doi/full/10.1002/hbm.70199

New preprint from the lab: Aakash Agrawal and I use 7T fMRI and MEG to dissect the cortical stages of invariant word recognition and test our recent theoretical proposal of an ordinal letter code in the Visual Word Form Area (VWFA). www.biorxiv.org/content/10.1...

Thrilled to share Aline-Priscillia Messi’s first PhD paper on how context shapes meaning in the brain! Disambiguated noun/verb stems are neurally organized by syntactic category and context-driven item-specific semantics, not by homonymy or polysemy. @sfnjournals.bsky.social tinyurl.com/59amm8j7

Just presented our work using #ECoG to decode words during sentence production at #HSP2025. Really grateful for all the great feedback. I got more clever ideas for future directions than I can possibly follow up on. Love this conference! doi.org/10.1101/2024...

What is the promise of computational language modeling for language neuroscience? @mitpress.bsky.social journal @jneurolang.bsky.social @shaileejain.bsky.social @alexanderhuth.bsky.social direct.mit.edu/nol/article/...

It's all about the waves, cats. Musical neurodynamics: www.nature.com/articles/s41... #neuroscience

Are high perceptual precision demands always necessary for training to drive auditory neural plasticity? 🧠 🔊🎙️ Find out more in this cool study with voice actors doi.org/10.1016/j.co... by @mkachlicka.bsky.social & @adamtierney.bsky.social @birkbeckpsychology.bsky.social #auditory #neuroscience 🧵1/6

Happy to share my very first first-author paper! In this study, we show that rhythms can shape how our brains track speech (in very simple words). Work with Marina Kalashnikova, @nicolaml.bsky.social and Mikel Lizarazu. onlinelibrary.wiley.com/doi/full/10....

Gaser & Schlaug (doi: 10.1523/JNEUROSCI.23-27-09240.2003): Brain structures differ between musicians and non-musicians. Gray matter volume differences in motor, auditory, & visual-spatial brain regions in pro musicians vs matched groups of amateur musicians and non-musicians... #neuroskyence

Our brains have an internal GPS that guides us through physical space. But does it do more than that? A recent preprint from the lab shows that the brain’s navigation system also maps sound 🧠 🌎 🔉 Read more here: doi.org/10.1101/2025... #auditory #neuroscience @birkbeckpsychology.bsky.social 🧵1/4

Excited to introduce funROI: A Python package for functional ROI analyses of fMRI data! funroi.readthedocs.io/en/latest/ #fMRI #Neuroimaging #Python #OpenScience Work w @neuranna.bsky.social 🧵👇

Want to procedurally generate large-scale relational reasoning experiments in natural language, to study human psychology 🧠 or eval LLMs 🤖? We have a tool for you! Our latest #ICLR work on long-context/relational reasoning evaluation for LLMs ReCogLab! github.com/google-deepm... Thread ⬇️

⏰CALL FOR PARTICIPANTS⏰ We're looking for London-based #participants who would like to take part in an fMRI study and gain insight into how our brains process second language speech 🧠🧠🧠 #auditory #neuroscience @birkbeckpsychology.bsky.social Contact @mkachlicka.bsky.social for details. Please share!

Predicting the future of academia, given the news: painful. Predicting the next couple of seconds, given this new paper: a joyful, rigorous, near-mechanistic cogneuro explanation. Great work by Matthias Grabenhorst and Georgios Michalareas. (tl;dr hazard rate is again the wrong model) rdcu.be/edKK0

By studying songbirds, our own Vikram Gadagkar, @neurokim.bsky.social, Jonathan Kasdin and colleagues witnessed the role the brain’s reward machinery plays as the brain naturally learns over time through practice. Learn more! zuckermaninstitute.columbia.edu/songbirds-hi...

Nicely done!

new preprint on Theory of Mind in LLMs, a topic I know a lot of people care about (I care. I'm part of people): "Re-evaluating Theory of Mind evaluation in large language models" (by Hu* @jennhu.bsky.social , Sosa, and me) link: arxiv.org/pdf/2502.21098

🚨 New Preprint!! LLMs trained on next-word prediction (NWP) show high alignment with brain recordings. But what drives this alignment—linguistic structure or world knowledge? And how does this alignment evolve during training? Our new paper explores these questions. 👇🧵

New preprint! In arxiv.org/abs/2502.20349 “Naturalistic Computational Cognitive Science: Towards generalizable models and theories that capture the full range of natural behavior” we synthesize AI & cognitive science work into a perspective on seeking generalizable understanding of cognition. Thread:

Decoding semantics from natural speech using human intracranial EEG https://pubmed.ncbi.nlm.nih.gov/39990331/

New paper led by the unstoppable @qingtianmi.bsky.social. We study how people learn a task that involves combining multiple sources of information. We work out which curricula work best, build a model, and use it to successfully discover new curricula. enjoy! osf.io/preprints/ps...

Imagine listening to a language you don't know. When does one word end, and another begin? Human infants face a similar challenge. Scientists have looked at exactly how babies recognise patterns in speech from birth.

Excited to kick off 2025 with new research in #MachineLearning, #Decoding, #MusicNeuroscience! Our paper, “Temporally Dissociable Neural Representations of Pitch Height and Chroma”, now in @sfnjournals.bsky.social doi.org/10.1523/JNEU... @davidpoeppel.bsky.social, @xiangbin-teng.bsky.social! 🧠🎵 1/n

🚨New preprint🚨 “Measuring Naturalistic Speech Comprehension in Real Time” ➡️ bit.ly/4iESDdV w/ @kriesjill.bsky.social, Shiven Gupta, & @lauragwilliams.bsky.social Do you use naturalistic listening paradigms, but wish you could record a continuous behavioral measure of comprehension? 🧵1/8

New review on the neurobiology of sentence production by my grad student Jeremy Yeaton. www.sciencedirect.com/science/arti...

The developmental emergence of reliable cortical representations 🧠🤖 www.nature.com/articles/s41...