21–22 May 2024, Aix-en-Provence (France)

Topics and Program

Why don’t we do what we should do?

 

The overarching aim is to understand how neural constraints, nonlinearities, and cross-frequency coupling organize the interactions between oscillatory activity in the brain and behavior. The workshop is organized around three main topics, each featuring one or more short presentations followed by a longer panel discussion. A final panel session will summarize the workshop’s outcomes. Several posters will be displayed, describing individual studies or individual points that may contribute to this discussion.

 

21/05

9:30 – 12:30

Session I: Looking beyond the amplitude envelope / theta scale

 

 

 

Moderators:  A. Strauss, F. Pellegrino

 

Most research has been devoted to the relation between the amplitude envelope and brain oscillations at the theta scale. However, other acoustic variables (e.g., spectral flux, f0) and articulatory variables (e.g., jaw height, total articulatory displacement) display periodicities at time scales both faster and slower than 5 Hz and are potentially related to the production of linguistic units other than the syllable (e.g., gestures, prosodic prominence). Further, informational content may also play a significant role in shaping oscillatory activity. We should therefore ask how large (supra-segmental) prosodic constituents on the one hand, and sub-syllabic features on the other, are related to the expression of different oscillatory frequency bands across motor and auditory regions.

 


L. Goldstein (remote presentation): The structure of articulatory actions that produce acoustic rhythms

 

J. Li: Intersyllabic cohesion and the coordination between syllabic and suprasyllabic rhythms

 

J. Giroud: Investigation of the spectral flux as an acoustic marker of fast time scales in speech

 

 

21/05

14:00 – 16:00

Session II: How does the brain generate and exploit oscillatory activity and how is it related to the acoustic signal and its processing?

 

 

 

Moderators:  B. Giordano, L. Lancia

 

Does the brain simply mirror signal properties?
 Does it make use of time scale and time scale integration to decode the signal?
 Are the tools we have used so far appropriate to analyze both acoustic and neural data?
 How are the tools we used so far affected by our conceptualization of acoustic and brain data analysis?
 When does linguistic processing come in?

 


E. Thoret: Rethinking the modelling of the auditory periphery with cascaded envelope interpolation

 

C. Daube: A stimulus-computable model of beta-power responses to speech

 

 

21/05

16:00 – 18:00

Poster Session

 

 

22/05
9:30 – 12:30

Session III: Localized vs. network-based processing

 

 

 

Moderators: S. Kotz, B. Morillon, D. Schön

 

Assuming an interface between auditory and motor areas: is it simply an interaction, or do the auditory and motor cortices hold multimodal representations?
 Can we ignore one modality because the area subserving a specific modality is multimodal in essence?
 How does this translate into oscillatory activity and, potentially, oscillatory coupling? [bringing beta into the equation]

 


F. Assaneo: Speech rhythms as a consequence of the underlying neural architecture

 

S. ten Oever: Are oscillations a mechanism for neural communication or organization?
