avsp.bsky.social
The official(ish) account of the Auditory-Visual Speech Association (AVISA) AV 👄 👓 speech references, but mostly what interests me avisa.loria.fr
296 posts 998 followers 843 following

Dynamic changes in large-scale functional connectivity prior to stimulation determine performance in a multisensory task www.frontiersin.org/journals/sys... Investigated cortical network dynamics at multiple spatial scales & in subnetworks, using ECoG & an AV detection task in ferrets - lots of cortical areas

To read it later

Speech outcomes in cochlear implant users depend on visual cross-modal cortical activity measured before or after implantation academic.oup.com/braincomms/a... Measured visual x-modal activity in a silent lip-reading task using EEG in a cross-sectional, observational study - a pre-/post-implantation study is still needed

Opening Social Interactions: The Coordination of Approach, Gaze, Speech, and Handshakes During Greetings onlinelibrary.wiley.com/doi/10.1111/... "Despite the importance of greetings for opening social interactions, their multimodal coordination processes remain poorly understood" 🚶‍♂️👀 🙂 👀 🤝👅 🎵👂😄

Transforming literature screening: The emerging role of LLMs in systematic reviews www.pnas.org/doi/10.1073/... Github github.com/fmdelgado/LL... Also "Large language models for conducting systematic reviews: on the rise, but not yet ready for use – a scoping review" www.medrxiv.org/content/10.1...

a plea to think carefully about surprisal + what it means to understand how we understand >> link.springer.com/article/10.1... brand new paper in Computational Brain & Behavior with @andreaeyleen.bsky.social at @mpi-nl.bsky.social

New paper out!! Children use spatial gestures to complement speech more for complex than for simpler spatial relations - gestures might reduce mapping difficulties of complex spatial concepts to arbitrary speech symbols - in Cognitive Science!! @ercenurunal.bsky.social @dilaykaradoller @beyzasumer

Decoding semantics from natural speech using human intracranial EEG https://pubmed.ncbi.nlm.nih.gov/39990331/

Jiwar: A database and calculator for word neighborhood measures in 40 languages link.springer.com/article/10.3... Neighbourhood information for 3 levels (orthographic, phonological, and phonographic) across 40 languages
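A minimal sketch of what an orthographic neighbourhood count means (this is the classic Coltheart's N idea, not Jiwar's actual code, and the toy lexicon is invented): count the lexicon words that differ from a target by exactly one letter substitution.

```python
# Hypothetical sketch: Coltheart's N (orthographic neighbourhood size).
# Jiwar also provides phonological & phonographic neighbourhoods;
# this only illustrates the orthographic level.

def is_substitution_neighbour(a: str, b: str) -> bool:
    """True if a and b are the same length and differ in exactly one letter."""
    return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1

def coltheart_n(target: str, lexicon: list[str]) -> int:
    """Number of one-substitution neighbours of `target` in `lexicon`.

    The target itself contributes 0 differences, so it is never counted."""
    return sum(is_substitution_neighbour(target, w) for w in lexicon)

lexicon = ["cat", "bat", "cot", "cap", "dog", "cart"]
print(coltheart_n("cat", lexicon))  # bat, cot, cap -> 3
```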

Prior expectations guide multisensory integration during face-to-face communication www.biorxiv.org/content/10.1... Face-to-face communication as a Bayesian Causal Inference problem - looked at how prior expectations act on the integration/segregation of vocal & bodily signals (via the ventriloquist effect)

Temporal Window of Integration XOR Temporal Window of Synchrony osf.io/preprints/ps... The temporal boundaries of integration phenomena are aptly described as the temporal window of integration; temporal boundaries of simultaneity judgments should be referred to as the temporal window of synchrony 👍

Learning to operate an imagined speech Brain-Computer Interface involves the spatial and frequency tuning of neural activity https://pubmed.ncbi.nlm.nih.gov/39979463/

Pupil respiratory-phase physoc.onlinelibrary.wiley.com/doi/10.1113/... "Across all conditions - free & controlled breathing; different tasks, lighting & fixation distances; & with & without olfactory bulbs ... consistently found ... pupil size is smallest around inhalation onset & largest during exhalation"

Early selective attention to the articulating mouth as a potential female-specific marker of better language development in autism: a review www.frontiersin.org/journals/psy... Speculative article proposes [see title] ... looks to build a case

In this review article, I summarize some of our recent work on the neural basis of visual search in scenes, showing how attention and expectation interactively drive preparatory activity in visual cortex and jointly modulate the visual processing of potential target objects. doi.org/10.1177/0963...

Isochrony in titi monkeys duets: social context as a proximate cause of duets’ rhythm and regularity royalsocietypublishing.org/doi/10.1098/... Investigated titi monkeys’ duet rhythms to assess adherence to rhythmic patterns (seen in Old World primates) & try to understand the proximate causes 🐒 🐵

Atypical audio-visual neural synchrony and speech processing in early autism jneurodevdisorders.biomedcentral.com/articles/10.... EEG & gaze patterns in 31 children with ASD & 33 typically developing children as they watched cartoon videos -> reports anomalies in AV integration for those with ASD

Statistical learning beyond words in human neonates Examined: Can neonates compute transitional probabilities on one dimension despite irrelevant variation on another? Does the linguistic dimension enjoy an advantage over the voice dimension? elifesciences.org/articles/101...
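For context, a transitional probability is just TP(A→B) = count(A followed by B) / count(A), computed over a continuous stream. A toy sketch (my own illustration, with an invented three-syllable "word"; not the paper's stimuli):

```python
from collections import Counter

def transitional_probabilities(syllables):
    """TP(A->B) = count of pair (A, B) / count of A as a pair-initial item."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

# Toy stream: the "word" to-ki-bu repeated. Within-word TPs are 1.0;
# in a real mixed stream, between-word TPs would be lower, and that
# contrast is what statistical learning exploits.
stream = ["to", "ki", "bu", "to", "ki", "bu", "to", "ki", "bu"]
tps = transitional_probabilities(stream)
print(tps[("to", "ki")])  # 1.0
```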

"Perceived Multisensory Common Cause Relations Shape the Ventriloquism Effect but Only Marginally the Trial‐Wise Aftereffect" Kayser and Heuer, European Journal of Neuroscience onlinelibrary.wiley.com/doi/full/10....

Audiovisual speech perception deficits in unaffected siblings of children with developmental language disorder https://pubmed.ncbi.nlm.nih.gov/39954391/

Step-by-Step Guide to Analyzing Webcam Eye-Tracking Data Webcam eye-tracking presents unique design & preprocessing challenges. To help researchers overcome these, this tutorial focuses on visual world webcam eye-tracking for L2 language research osf.io/preprints/ps...

Sign Languages in Healthy Aging Population: Review onlinelibrary.wiley.com/doi/10.1111/... Changes in healthy aging signers show increased reliance on basic syntactic & lexical structures not complex interface ones. Nice to see sign contributing to understanding of language changes in healthy aging!

Paper🚨 "Objects, Faces, and Spaces: Organizational Principles of Visual Object Perception as Evidenced by Individual Differences in Behavior" by @heidasigurdar.bsky.social & @ingamariao.bsky.social JEP:G editor's choice ->free to read psycnet.apa.org/fulltext/202... #visionscience #psychscisky 🧵1/13

Learning produces an orthogonalized state machine in the hippocampus rdcu.be/d9HOD

The neuroethology of ant navigation www.cell.com/current-biol... A working model of the neural basis of the multimodal navigational strategies of ants - outlines the anatomy & functioning of major central brain areas & neural circuits involved in the coordination of navigational behaviour 🐜📡

Vinegar fly fun >Drosophila melanogaster exhibit play-like behaviour in voluntary spinning on a carousel - note: some avoid it, others seek stimulation, engaging in repeated, prolonged visits > sensation-seeking allows for activity-dependent circuit refinement & it's fun www.cell.com/current-biol...

Cortical tracking of hierarchical rhythms orchestrates the multisensory processing of biological motion elifesciences.org/articles/98701 Presented dot motion and/or walking person sound & found evidence that EEG activity tracks the step rhythm, as well as the gait (2-step cycle) rhythm 🚶‍♀️🦶🔉👂

Immature vocalizations elicit simplified adult speech across multiple languages https://pubmed.ncbi.nlm.nih.gov/39919741/

Presenting Natural Continuous Speech in a Multisensory Immersive Environment Improves Speech Comprehension & Reflects the Allocation of Processing Resources in Neural Speech Tracking direct.mit.edu/jocn/article... Looked at memory load & naturalistic immersive AV speech comprehension in older adults

Animal models of the human brain: Successes, limitations & alternatives www.sciencedirect.com/science/arti... Kanwisher describes the "parts list" of the human brain & the role of animal models; yet mental functions not shared with other animals (e.g., language & music) are a problem - ANNs a way forward?

Language-like efficiency in whale communication www.science.org/doi/10.1126/... Analyzed vocal sequences from 16 baleen & toothed whale species to look for Menzerath’s law & Zipf’s law of abbreviation -> that longer sequences consist of shorter elements & more frequent elements will be shorter 🐋📣
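The signature of Zipf's law of abbreviation is a negative rank correlation between how often an element type occurs and how long it is. A self-contained toy check (illustrative data I made up, not the whale measurements):

```python
# Toy illustration of Zipf's law of abbreviation: frequent element
# types should tend to be shorter, i.e. frequency and length should
# be negatively rank-correlated (Spearman's rho).

def rankdata(values):
    """Average 1-based ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

freqs = [120, 80, 40, 10, 5]         # occurrences of 5 element types
lengths = [0.2, 0.3, 0.5, 0.9, 1.4]  # mean duration (s) of each type
print(spearman(freqs, lengths))      # -1.0: perfectly Zipf-like toy data
```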

Are humans the only species that communicates when a collaborator is missing information? In @pnas.org, Luke Townrow and I show that our closest relatives, bonobos, can track when a partner is knowledgeable or ignorant, and tailor communication accordingly www.pnas.org/doi/10.1073/...

The Less Meaningful the Understanding, the Faster the Feeling: Speech Comprehension Changes Perceptual Speech Tempo onlinelibrary.wiley.com/doi/10.1111/... Lack of research on how speech comprehension affects the perception of speech tempo > perception of speech tempo & comprehensibility interact 🥁

Origins of food selectivity in human visual cortex www.cell.com/trends/neuro... Proposes how visual properties of food & nonvisual signals associated with multimodal reward processing, social cognition & physical interactions with food contribute to the emergence of food selectivity

Using hearing & vision for motion psycnet.apa.org/doiLanding?d... 4 expts looked at localization for auditory, visual & AV targets cf. maximum likelihood estimation -> observers use hearing & vision for localising static objects but unisensory input for localising moving objects & occluded motion
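The maximum-likelihood-estimation benchmark such studies compare against: with independent Gaussian cues, the optimal bimodal estimate is a reliability-weighted average of the unisensory estimates, with lower variance than either alone. A minimal sketch with made-up numbers:

```python
# MLE cue combination: weight each cue by its reliability (1/variance).

def mle_combine(mu_a, var_a, mu_v, var_v):
    """Optimal fusion of auditory & visual location estimates
    under independent Gaussian noise."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_v)  # auditory weight
    mu = w_a * mu_a + (1 - w_a) * mu_v           # combined estimate
    var = 1 / (1 / var_a + 1 / var_v)            # combined variance
    return mu, var

# Illustrative values: a noisy auditory cue at 2 deg, a reliable
# visual cue at 0 deg.
mu, var = mle_combine(mu_a=2.0, var_a=4.0, mu_v=0.0, var_v=1.0)
print(mu, var)  # 0.4 0.8 -> pulled toward the more reliable (visual) cue
```

The key prediction is that combined variance (0.8 here) is below both unisensory variances; the paper's finding is that this holds for static localisation but not for moving or occluded targets.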

I’m looking for a postdoc to join my lab at the University of Sydney for a project investigating how perception changes during walking (VR, psychophysics, EEG, AI). Full time, 3 years fixed-term position; full details here: usyd.wd3.myworkdayjobs.com/USYD_EXTERNA... Please repost 🥹

Thoughts, loud and silent. Comment on "The sound of thought: Form matters - The prosody of inner speech" by Kreiner and Eviatar https://pubmed.ncbi.nlm.nih.gov/39874619/

NIRS-BIDS: Brain Imaging Data Structure Extended to Near-Infrared Spectroscopy www.nature.com/articles/s41... BIDS supports dozens of neuroimaging modalities, this paper extends BIDS to NIRS data and details tools to organise data with the goal of promoting public disseminations of fNIRS datasets 👍

Cool #musicscience #neuroskyence alert: Auditory Rhythm Encoding during the Last Trimester of Human Gestation: From Tracking the Basic Beat to Tracking Hierarchical Nested Temporal Structures www.jneurosci.org/content/45/4...

In our latest preprint we’ve used EEG imaging to provide more direct evidence that “blind see w/ sounds” in sensory substitution devices (SSDs)-we’ve shown that IT cx is activated within the first wave of activity when blind hear sounds rep’ing learnt faces vs. same sounds scrambled #neuroskyence 🧠📈

Multisensory naturalistic decoding with high-density diffuse optical tomography www.spiedigitallibrary.org/journals/neu... HD-DOT (fNIRS ++) data sample cortical hemodynamics with sufficient resolution & fidelity to support decoding complex, naturalistic, multisensory stimuli via template matching

A tutorial on representational similarity analysis for research in social cognition osf.io/preprints/ps... Accessible introduction to RSA for research in social cognition > strengths/limitations cf. other multivariate methods. Code & functions (R & Python) for 2 examples showcasing different applications
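The core RSA move, in a bare-bones Python sketch (my own illustration, not the tutorial's code, with simulated data): build a representational dissimilarity matrix (RDM) per measure, then correlate the upper triangles of the two RDMs.

```python
import numpy as np

def rdm(patterns):
    """RDM as pairwise 1 - Pearson correlation between condition patterns
    (rows = conditions, columns = features/channels)."""
    return 1 - np.corrcoef(patterns)

def rsa_similarity(patterns_a, patterns_b):
    """Correlation between the upper triangles of the two RDMs."""
    iu = np.triu_indices(patterns_a.shape[0], k=1)
    return np.corrcoef(rdm(patterns_a)[iu], rdm(patterns_b)[iu])[0, 1]

# Simulated example: 6 conditions x 50 channels, and a noisy copy
# standing in for a second measure (e.g. a model or another modality).
rng = np.random.default_rng(0)
neural = rng.normal(size=(6, 50))
model = neural + rng.normal(size=(6, 50)) * 0.5
print(rsa_similarity(neural, model))  # positive when the RDMs share structure
```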