[Accessibility] 03/30 Open A11y meeting minutes

Pete Brunet pete at a11ysoft.com
Mon Apr 12 13:49:27 PDT 2010


Regarding the following exchange in the minutes:

PK: ... The refresh adds a bunch of new rules. One of the notable
additions is real-time text for the deaf, along with a much more
significant focus on the deaf and on communication technologies.

PB: On the speech reco angle, did they talk about whether it was good
enough for what we want to do?

PK: I don't remember any mention of speech recognition. The NPRM says at
a high level that someone without hands needs to be able to use your
app. Doesn't specifically mention speech recognition. Apps that support
audio and video chat must also support real-time text.

This is an accurate representation of the exchange during the meeting,
but what I was really asking is: in the case where speech recognition
might be used to transcribe speech, will the current (or near-term)
state of the art of speech recognition technology be able to provide
acceptable real-time text, considering the challenges of large
vocabularies, speaker independence, the variety of speakers, and
conversational speech?  Or do the requirements allow for the use of
real-time human transcribers (either local or remote)?

-- 
*Pete Brunet*
                                                                
a11ysoft - Accessibility Architecture and Development
(512) 238-6967 (work), (512) 689-4155 (cell)
Skype: pete.brunet
IM: ptbrunet (AOL, Google), ptbrunet at live.com (MSN)
http://www.a11ysoft.com/about/
Ionosphere: WS4G