Kira Wegner-Clemens
cognitive neuroscientist


kira at gwu dot edu

CV

google scholar / github
linkedin / twitter
Hi! I'm a PhD candidate in cognitive neuroscience at George Washington University.

My research focuses on semantic guidance of audiovisual attention. I'm particularly interested in how real-world multisensory scenes are perceived and attended to.

Before grad school, I studied cognitive science at Rice University & worked as a post-baccalaureate researcher at Baylor College of Medicine.

Full list also available on Google Scholar.

* = equal contributions

Work in preparation


Wegner-Clemens, K., Kravitz, D.J., Shomstein, S. Task-irrelevant guidance of attention in audiovisual semantic relationships.


McEvoy, K.*, Wegner-Clemens, K.*, Auer, E., Eberhardt, S., Bernstein, L., Shomstein, S. Covert attention modulates visual speech perception independent of eye position.


Nag, S.*, Mahableshwarkar, P.*, Wegner-Clemens, K., Cox, P., Kaplan, S., Teng, C., Kravitz, D., Mitroff, S. Efficiencies of online data collection.


Bean, S., Wegner-Clemens, K., Shomstein, S., Malcolm, G.L. Semantically congruous sounds draw and hold gaze on objects across different scene backgrounds.


Wegner-Clemens, K., Malcolm, G., Shomstein, S. Search efficiency scales with audiovisual semantic relatedness in a continuous manner.


2024


Wegner-Clemens, K., Malcolm, G. L., Shomstein, S. (2024) Predicting attention in real-world environments: the need to investigate crossmodal semantic guidance. WIRES Cognitive Science. (link)

2022


Wegner-Clemens, K., Malcolm, G. L., Shomstein, S. (2022) How much is a cow like a meow? A novel database of human judgements of audiovisual semantic relatedness. Attention, Perception, & Psychophysics. (link; preprint)


2020


Magnotti, J.F., Dzeda, K.B., Wegner-Clemens, K., Rennig, J., & Beauchamp, M.S. (2020). Weak observer-level correlation and strong stimulus-level correlation between the McGurk effect and audiovisual speech-in-noise: a causal inference explanation. Cortex. doi:10.1016/j.cortex.2020.10.002 (pdf; link)


Wegner-Clemens, K., Rennig, J., & Beauchamp, M.S. (2020) A relationship between Autism-Spectrum Quotient and face viewing behavior in 98 participants. PLoS ONE 15(4): e0230866. (pdf; link)


2019


Wegner-Clemens K, Rennig J, Magnotti JF, Beauchamp MS. (2019) Using principal components analysis to characterize eye movement fixation patterns during face viewing. Journal of Vision, November 2019, Vol.19, 2. doi:10.1167/19.13.2 (pdf; link)


Rennig, J., Wegner-Clemens, K., & Beauchamp, M.S. (2019) Face Viewing Behavior Predicts Multisensory Gain During Speech Perception. Psychonomic Bulletin & Review, 27, 70–77 (2020). (pdf; link)


Convento, S., Wegner-Clemens, K. A., & Yau, J. M. (2019). Reciprocal Interactions Between Audition and Touch in Flutter Frequency Perception, Multisensory Research, 32(1), 67-85. (pdf; link)
This page is still under construction!


Semantic guidance of audiovisual attention:
What we know about the world shapes how we attend to and perceive it.

My dissertation research investigated how meaningful sounds can modulate visual attentional priority, which is essential to understanding attention in complex real-world scenes (reviewed here!).

I used online data collection techniques (JavaScript, AWS servers, mTurk) to:

- create a freely available database of semantic relatedness values based on human judgements

- demonstrate that a sound's semantic relationship to the target predicts visual search speeds (preprint available).

- investigate the importance of task relevance (ongoing work)

In ongoing work, I'm also investigating the neural mechanisms underlying audiovisual semantic processing and am learning modern MRI analysis techniques that leverage machine learning on complex biological data (MVPA, RSA).


Eye movements & speech perception:
Faces are among the most common and important visual objects, and may therefore be attended differently from other stimuli. In particularly noisy environments, visual information from the mouth can be crucial to understanding speech.

I have contributed to several projects investigating face and speech perception:

- developing a novel eye tracking analysis for face viewing

- investigating the role of eye movements in speech perception (1, 2)

- relating face viewing behavior to social/behavioral traits (1)

- examining the relationship between attention, face viewing preferences, and speech perception (manuscript in prep)