SLPAT 2022 (Organized by the ACL/ISCA SIG-SLPAT)

 
9th Workshop on Speech and Language Processing for Assistive Technologies (SLPAT)
May 27, 2022 – Dublin, Ireland
Co-located with ACL 2022
 
Submission deadline: February 28, 2022

Hello,

We are pleased to announce the first call for papers for the Ninth Workshop on Speech and Language Processing for Assistive Technologies (SLPAT) on May 27, 2022, co-located with ACL 2022 in Dublin, Ireland.

This workshop will bring together researchers from areas such as natural language processing, speech signal processing, (special) education, rehabilitation sciences, computer science, HCI, communication, psychology, psycholinguistics, computer vision, and computer graphics with a common interest in making everyday life more accessible for people with physical, cognitive, sensory, emotional, or developmental disabilities as well as older adults. The workshop will provide an opportunity for researchers, domain experts, and users of assistive technology (AT) to share their findings, to discuss present and future challenges, and to explore possibilities for collaboration.

Possible topics include but are not limited to:

  • Speech synthesis for physical, cognitive, or sensory impairments (talking devices in Augmentative and Alternative Communication (AAC), screen readers, audio description/audio subtitling using speech synthesis)
  • Sign synthesis (sign language animation, synthetic videos)
  • Speech recognition (AAC, respeaking for live subtitling, fully automatic subtitling)
  • Sign recognition (AT, natural user interfaces for sign language resources, computer-augmented corpus annotation, sign language assessment)
  • Speech and language technologies for daily assisted living and Ambient/Active Assisted Living (AAL)
  • Translation to and from speech, text (including subtitles), pictographs, Braille, and sign language
  • Novel modeling and machine learning approaches for AT
  • Personalized voices for AAC based on limited data
  • Biofeedback for therapy in neurological disorders
  • Text generation for improved comprehension (e.g., sentence and text simplification)
  • Silent speech: speech technology based on sensors without audio
  • Nonverbal communication
  • Multimodal user interfaces and dialogue systems adapted to AT
  • Speech and language technologies for cognitive assistance applications
  • Presentation of graphical information for people with visual impairments
  • Speech and language technologies applied to typing interface applications
  • Brain-computer interfaces for language processing applications
  • Assessment of speech and language processing within the context of AT
  • Web accessibility, media accessibility
  • Deployment of speech and language technologies in the clinic or in the field, such as language analysis for diagnosis or intervention
  • Linguistic resources; corpora and annotation schemes
  • Automatic evaluation within the context of AT
  • Reception studies with target user groups
  • Ethical considerations and standards within the context of AT
  • Crowdsourcing and Citizen Science efforts within the context of AT

Please contact the workshop organizers at slpat2022-organizers@googlegroups.com with any questions.

Important dates (subject to change)

February 28, 2022: Deadline for papers
March 26, 2022: Notification of acceptance
April 10, 2022: Camera-ready papers due
May 27, 2022: Workshop

Instructions for authors

Papers must be submitted through the OpenReview paper submission system, which you can access here:

openreview.net/group?id=aclweb.org/ACL/2022/Workshop/SLPAT

Paper submissions must use the official ACL style templates, which are available as an Overleaf template and are also downloadable directly (LaTeX and Word). Please follow the general paper formatting guidelines for *ACL conferences available here. Authors may not modify these style files or use templates designed for other conferences.

Full papers should contain up to 6 pages of content, not including references. Demo papers should be up to 4 pages, not including references.

Organizers

  • Emily Prud’hommeaux, Boston College
  • Sarah Ebling, University of Zurich
  • Preethi Vaidyanathan, Eyegaze Inc.

