How does player and authoring software create, store and read closed captions?

There are two competing mechanisms. Closed captions may be:

1. Stored in a video file: closed captions are carried as CEA-608 (analog) or CEA-708 (digital) data, and 3GPP Timed Text is the standard for subtitle tracks stored in MP4 and 3GP files. These formats are either stored as a separate data track or encoded in the video stream itself (see the inspection sketch after this list). Most DVDs distributed in America contain analog closed captions. Broadcasters use digital closed captions for all their programming. The HTTP Live Streaming (HLS) format supports both analog and digital closed captions. The most recent FCC regulations require closed captions to be stored in the video file.

2. Stored in a text file: SCC, SRT, DFXP, and SAMI files may contain closed captions. These formats provide timed textual information outside the video file (see the SRT parsing sketch below). SCC is frequently used in authoring environments (e.g., Avid, Premiere, Final Cut Pro). Browser-based video players (e.g., Flash, Silverlight, HTML5) generally rely on sidecar files to deliver closed captions. Video players native to Android generally require sidecar files; we have yet to encounter an Android device with a built-in closed caption decoder. Apple’s mobile devices can decode both analog and digital captions. Subtitles for the hard of hearing (and other users) are likewise stored in text files.
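
For the in-file case, a quick way to see how a particular file carries its captions is to inspect its streams. The sketch below is a minimal example, assuming the ffprobe tool (part of FFmpeg) is installed and on the PATH; the file name sample.mp4 is a placeholder. A 3GPP Timed Text track shows up as a separate subtitle stream (codec name mov_text), while CEA-608/708 data embedded in the video stream does not appear as a stream of its own.

```python
import json
import subprocess

def list_streams(path):
    """Ask ffprobe for the streams in a media file and return them as a list of dicts."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-print_format", "json", "-show_streams", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)["streams"]

if __name__ == "__main__":
    # "sample.mp4" is a placeholder file name.
    for stream in list_streams("sample.mp4"):
        # A 3GPP Timed Text caption track is reported as a subtitle stream
        # with codec_name "mov_text"; embedded CEA-608/708 data rides inside
        # the video stream and is not listed as a separate stream.
        print(stream["index"], stream["codec_type"], stream.get("codec_name"))
```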
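
For the sidecar case, the text formats are simple enough to read directly. Below is a minimal, illustrative SRT parser (the names Cue and parse_srt are our own, not part of any standard); SCC, DFXP, and SAMI follow the same idea of pairing timecodes with caption text, though their syntax differs.

```python
import re
from dataclasses import dataclass

# One caption cue: when it appears, when it disappears, and what it says.
@dataclass
class Cue:
    start: str  # "HH:MM:SS,mmm"
    end: str    # "HH:MM:SS,mmm"
    text: str

TIMING = re.compile(r"(\d{2}:\d{2}:\d{2},\d{3})\s*-->\s*(\d{2}:\d{2}:\d{2},\d{3})")

def parse_srt(content):
    """Parse SRT text into a list of Cue objects. Blocks are separated by blank lines."""
    cues = []
    for block in re.split(r"\n\s*\n", content.strip()):
        lines = block.splitlines()
        for i, line in enumerate(lines):
            match = TIMING.search(line)
            if match:
                start, end = match.groups()
                # Everything after the timing line is the caption text.
                cues.append(Cue(start, end, "\n".join(lines[i + 1:]).strip()))
                break
    return cues

if __name__ == "__main__":
    sample = """1
00:00:01,000 --> 00:00:03,500
Hello, world.

2
00:00:04,000 --> 00:00:06,000
Captions travel outside the video file."""
    for cue in parse_srt(sample):
        print(cue.start, "->", cue.end, ":", cue.text)
```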
