Microsoft Corporation

10/07/2021 | News release

What’s new in Microsoft 365 accessibility features

Online meetings are a staple of hybrid work, and we've gotten some great feedback from the deaf and hard of hearing community about how they could be improved. While we still have some way to go on this journey, Microsoft Teams has rolled out several features that enhance the meeting experience for deaf and hard of hearing people and increase flexibility and focus for everyone.

Turn on live captions or transcripts to help meeting attendees follow along

One way to make meetings more inclusive for participants who are deaf or hard of hearing - as well as those working in noisy or quiet environments - is by using captioning and transcription tools to translate the spoken content of the meeting into text. In Microsoft Teams, participants using the desktop or mobile apps can turn on live captions that only they can see during a meeting, so long as the feature has been enabled by the organization's admin. Over the past quarter, we have continued to expand live captions by broadening support for speaker attribution, increasing the number of languages covered to 28, and extending the feature beyond meetings to Live Events.

Like live captions, live transcription converts speech to text in near real time and identifies each speaker. Unlike captioning, transcription is available both during and after the meeting. If your admin has enabled both transcription and recording, live transcription will begin automatically when a participant starts recording the meeting, conveniently capturing the discussion and detailing what was spoken in the transcript with one click. Transcription can also be turned on without recording. Live transcripts can also be reviewed after the meeting, providing an opportunity to catch up on items you may have missed. Learn more about how to view live transcriptions in a Teams meeting.
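
For teams that want to pull a saved transcript into their own tooling after a meeting, Microsoft Graph exposes online meeting transcripts through its cloud communications APIs. The Python sketch below is a minimal illustration rather than official guidance: it assumes you already hold a Graph access token with transcript read permission (for example, OnlineMeetingTranscript.Read.All) and know the meeting's ID, and the exact endpoints and permissions available can vary by tenant and Graph release.

```python
# Minimal sketch: download Teams meeting transcripts via Microsoft Graph.
# Assumes an access token with transcript read permission and a known
# online meeting ID; both placeholders below are hypothetical.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<access-token>"      # obtained separately, e.g. via MSAL
MEETING_ID = "<online-meeting-id>"   # hypothetical placeholder

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# List the transcripts produced for this meeting.
resp = requests.get(f"{GRAPH}/me/onlineMeetings/{MEETING_ID}/transcripts",
                    headers=headers)
resp.raise_for_status()
transcripts = resp.json().get("value", [])

# Download each transcript as WebVTT text for review or archiving.
for t in transcripts:
    content = requests.get(
        f"{GRAPH}/me/onlineMeetings/{MEETING_ID}/transcripts/{t['id']}/content?$format=text/vtt",
        headers=headers,
    )
    content.raise_for_status()
    print(content.text)
```

The WebVTT output includes timestamps and speaker tags, so it can be searched or archived like any other text file.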

Bring in a captioner with CART support


In addition to native live captioning and transcription in Teams, this month we are rolling out support for CART (Communication Access Realtime Translation), or real-time captioning, in Teams meetings. The National Court Reporters Association describes CART services as "the instant translation of the spoken word into English text using a stenotype machine, notebook computer and real-time software." CART can be especially helpful in contexts that are challenging for speech recognition software, such as those with a lot of specialized terminology, or where participants speak with a range of different accents. To use CART, the meeting organizer provides an invitation and a special link to their preferred CART captioner. During the meeting, the CART provider transcribes the conversation in real time. All participants can see the captions and have the option to toggle between CART and AI-generated captioning.

Keep both presenters and interpreters onscreen by pinning or spotlighting multiple videos

Individuals who communicate using sign language often need to keep both an interpreter and a presenter onscreen and visible throughout a video call. This is an area where we still have significant work underway, but in the meantime, Teams now supports both pinning and spotlighting multiple videos. While pinning changes only the participant's private view, spotlighting allows an organizer or presenter to highlight up to seven videos for everyone on the call. Spotlighting also enlarges the video, which can help make it easier for deaf and hard of hearing participants to read lips. And as of last month, both of these features are also available for Microsoft Teams Rooms on Windows. Learn how to adjust your view or spotlight someone's video in a Teams meeting.