Starr Andersen
Posted April 14, 1999
Contents
Introduction
Providing Closed Captioning for a Live Stream
Inserting Files into the Stream
Adding Links to a Live Stream
Summary
Introduction

Once you have created a Web site complete with streaming media, what is the best way to add new content or enhance what is already there? There are several things you can do with your live stream once it is created. If you have users who need closed captioning, you can add it to your live stream, although you will need a fast typist or a scripted presentation so that you can prepare a SAMI file in advance. You can also provide helpful links to other Web sites during your broadcast; a link can either take the viewer to the Web site automatically or wait for the viewer to click it. Finally, you can insert stored .asf files to cover breaks in the live event or to serve as advertisements or promotional material. This article provides an overview of these methods and some sample code to help you get started implementing them on your Web site. More information on these topics is available in the Windows Media Player Software Development Kit and in the Windows Media Technologies documentation.
Providing Closed Captioning for a Live Stream

There are two methods you can use to provide closed-captioned content with your live ASF stream: a SAMI (Synchronized Accessible Media Interchange) file, or script commands.
First, let's look at how to prepare and use a SAMI file. The Microsoft® Windows® Media Player control provides an easy means of adding closed captioning to your media presentation and supports SAMI for this purpose. SAMI (.smi) files contain text strings associated with specified times within the presentation. The text strings are displayed in the Windows Media Player's closed-captioning display area as the clip reaches the designated times.
SAMI files can also include conditional text strings for providing closed captioning in different languages, as well as changeable font types and font sizes to customize the captioning for specific audiences.
The SAMI interchange format is based on SGML/HTML. Every SAMI document should start with a <SAMI> tag and end with a </SAMI> tag. The document's class name is "Synchronized Accessible Media Interchange" and its file extension is .smi or .sami.
The following is a sample SAMI document:
<SAMI>
  <HEAD>
    <Title>George Graham Vest Speech</Title>
    <SAMIParam><!--
      Copyright="(C)Copyright 1999, Microsoft Corporation"
      Media="Great_Speeches.nsc", none
      Length=73000
      CaptionMetrics=scaleable
      CaptionLineLength=180
      CaptionFontSize=12
      CaptionTextLines=3
    --></SAMIParam>
    <STYLE TYPE="text/css"><!--
      P {margin-left: 29pt; margin-right: 29pt; font-size: 10pt;
         text-align: left; font-family: tahoma, arial, sans-serif;
         font-weight: normal; color: white; background-color: black;}
      TABLE {Width: "248pt";}
      .ENUSCC {Name: "English Captions"; lang: en-US-CC;}
      #Source {margin-bottom: -15pt; background-color: silver; color: black;
         vertical-align: normal; font-size: 10pt;
         font-family: tahoma, arial, sans-serif; font-weight: normal;}
      #Youth {color: greenyellow; font-size: 18pt;}
      #BigPrint-1 {color: yellow; font-size: 24pt;}
    --></STYLE>
  </HEAD>
  <BODY>
    <TABLE>
      <SYNC Start=0>
        <P Class=ENUSCC ID=Source>Senator George Graham Vest
      <SYNC Start=10>
        <P Class=ENUSCC>Gentlemen of the Jury: The best friend a man has in the world may turn against him and become his enemy
      <SYNC Start=8800>
        <P Class=ENUSCC>His son or daughter that he has reared with loving care may prove ungrateful.
      <SYNC Start=19500>
        <P Class=ENUSCC>Those who are nearest and dearest to us, those whom we trust with our happiness and our good name may become traitors to their faith.
    </TABLE>
  </BODY>
</SAMI>
The SAMI file name must be embedded within either an .asx file or an .htm file in order to be played with the live stream. Here is the sample code for a Web page with an embedded player and closed captioning:
<HTML>
<HEAD>
<TITLE>Windows Media Technologies - Closed Captioning Sample</TITLE>
</HEAD>
<BODY>
<CENTER>
This is a sample of closed captioning using Windows Media Technologies<BR>
and SAMI (Synchronized Accessible Media Interchange) files.<BR><BR><BR>
<!-- BEGIN -->
<OBJECT ID="MediaPlayer1" width=320 height=240
  classid="CLSID:22D6F312-B0F6-11D0-94AB-0080C74C7E95"
  codebase="http://activex.microsoft.com/activex/controls/mplayer/en/nsmp2inf.cab#Version=5,1,52,701"
  standby="Loading Microsoft Media Player components..."
  type="application/x-oleobject">
  <PARAM name="FileName" value="great speeches.asx">
  <PARAM name="SAMIFileName" value="vest.smi">
  <PARAM name="ShowCaptioning" value="1">
  <PARAM name="ShowControls" value="1">
</OBJECT>
<BR>
<A href="http://servername/great speeches2.asx">Start the Windows Media presentation in the stand-alone player.</A>
<!-- END -->
</CENTER>
</BODY>
</HTML>
The .asx file referenced for the embedded player is just a standard .asx file, for example:
<ASX version="3.0">
  <Title>Great Speeches</Title>
  <Abstract>Presentations of Great Speeches from History</Abstract>
  <Entry>
    <Title>Tribute to the Dog</Title>
    <Author>George Graham Vest</Author>
    <Copyright>1999, Microsoft</Copyright>
    <Abstract>Presenter Jerald Amunsen</Abstract>
    <Ref href="http://servername/great speeches.nsc"/>
  </Entry>
</ASX>
However, to support closed captioning via a SAMI file in the stand-alone player, the .asx file must be modified by appending the path to the SAMI file to the href of the .nsc file:
<Ref href="http://servername/great speeches.nsc?sami=http://servername/vest.smi"/>
Two important notes about creating an .asx file that will support SAMI content in a stand-alone player:

- Embedding this reference in the .asx file causes the same behavior in the stand-alone player as the earlier HTML page did for the embedded player.
- The stand-alone player is a useful choice if your users are going to continue Web browsing while receiving your content.
If you do not have the timing and content of the presentation available before your broadcast, closed-captioning text can be added to your stream on the fly using script commands. To do this, an operator must be at the encoder throughout the broadcast; the operator uses the script command box to enter the text.
To provide closed-captioning content manually, the operator types the TEXT script command into the script command box, followed by the caption text, and sends the command. When the command is sent, the text appears in the caption area of the player window. This is a cumbersome method, but it allows you to respond to events as they happen.
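For example, to caption the opening line of the speech by hand, the operator might enter something like the following in the script command box (a sketch; the exact layout of the command fields depends on your encoder):

  TEXT Gentlemen of the Jury: The best friend a man has in the world may turn against him and become his enemy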
A third option combines aspects of the SAMI file method and the manual method. It requires that you have the presentation script, but you do not need the timings. You can prepare the text before the broadcast and assign each line of text to a script command event that is identified in the .asx file. When the event is sent, the assigned text is displayed in the caption area. For example, the speech used earlier in the preparation of the SAMI file would be inserted into the .asx file. Instead of each sentence being preceded by timing and language tags (<SYNC> and <P Class>) as in the SAMI file, each sentence is wrapped in an <EVENT> tag. The .asx file would appear as follows:
<ASX version="3.0">
  <Title>Great Speeches</Title>
  <Abstract>Presentations of Great Speeches from History</Abstract>
  <Entry>
    <Title>Tribute to the Dog</Title>
    <Author>George Graham Vest</Author>
    <Copyright>1999, Microsoft</Copyright>
    <Abstract>Presenter Jerald Amunsen</Abstract>
    <Ref href="http://servername/great speeches.nsc"/>
  </Entry>
  <Event Name="I1" Whendone="Resume">
    <Text>Gentlemen of the Jury: The best friend a man has in the world may turn against him and become his enemy</Text>
  </Event>
</ASX>
This method allows you to prepare the majority of the text strings before the live event, which saves time and is less prone to error. However, it still requires an operator at the encoder at all times during the broadcast. During the broadcast, the operator follows along with the script and, as each line is delivered, sends the EVENT script command with the name of the corresponding event, as shown below.
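For example, to trigger the caption defined in the sample .asx file above, the operator might send the following from the script command box (a sketch that reuses the event name I1 from the sample):

  EVENT I1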
A benefit of this method is that the operator can fall back on the manual method if a speaker departs from the prepared text.
Inserting Files into the Stream

While the live event is streaming, there may be occasions when you would like to play a stored .asf file rather than the live content, for example, when you are experiencing technical difficulties, playing commercials for event sponsors, or giving live performers a short break. A file can be inserted into a live stream either on the fly through a script command, or by modifying the .asx file.
One way to insert a file into a live stream is through the FILENAME script command: type FILENAME in the script command box, followed by either the URL or the UNC path of the file you want to play, as in the sketch below. However, once the command has been sent, the stream does not return to the live content.
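For example, the command might look like the following (a sketch; the server and file names are placeholders):

  FILENAME http://servername/sponsor_break.asf

or, using a UNC path:

  FILENAME \\servername\content\sponsor_break.asf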
A better way to insert files into a live stream is through the .asx file. Opening a file through an .asx reference is called an event. Files are referenced in the .asx file by their URLs and given event names; they are activated when the EVENT script command is sent in the ASF stream, referencing the given event name. To create an event that will open a stored .asf file during a broadcast, add the following lines to your .asx file:
<Event Name="Promo" Whendone="Resume">
  <EntryRef href= />
</Event>
The NAME property gives the event a name that you will use to call the event from the encoder using the EVENT script command. The WHENDONE property tells the player to resume playing the live stream when the stored .asf file is complete. The ENTRYREF property provides the path to the .asf file.
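To call the event defined above during the broadcast, the operator would send the EVENT script command with the event name from the encoder's script command box, for example (a sketch that reuses the event name Promo):

  EVENT Promo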
For more information on adding ASF files to streams, see the article "Ad Insertion" and the section "Playlists" in the article "All About ASX Files."
Adding Links to a Live Stream

If you are broadcasting a live stream to promote an event or a product, one of the most useful things you can do during the broadcast is provide links to other Web sites that promote the same thing. URL links can provide a way of getting feedback about your broadcast, letting viewers place orders for products, or simply providing additional information. URL links in a live stream are sent using script commands. To send a URL link, use the URL script command followed by the Web site address, as in the sketch below.
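For example, the command might look like the following (a sketch; the Web address is a placeholder):

  URL http://servername/promotions/order.htm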
When the player receives this script command, it will open up the browser and navigate to the site specified.
If you do not want the browser to open automatically, you can use the TEXT script command to create an HTML link that your users can click to open your site. In the example below, the text appears in the caption window, and clicking it opens the site in that window.
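For example, the command might look like the following (a sketch; the page address and link text are placeholders):

  TEXT <A HREF="http://servername/moreinfo.htm">Click here for more information</A>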
To have the link open in a separate browser window, add TARGET=_TOP to your Web site reference, as shown below.
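The same sketch with the TARGET attribute added:

  TEXT <A HREF="http://servername/moreinfo.htm" TARGET=_TOP>Click here for more information</A>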
Summary

Whether you choose to enhance your live stream by adding closed captioning, URL flips, or inserted files depends on the audience for your event and your goals as a content producer. All three may be used together if that is the best solution for your event. Closed captioning is a great way to provide alternative languages and additional meaning for your audio content; it can be accomplished by creating a SAMI file or by typing the captions manually during the broadcast. URL flips are a means of taking your viewers to other Web sites during your broadcast; these sites can be used to capture demographic data, gather feedback, or promote a product. File insertion is a means of providing additional content during a live broadcast; a good example is the insertion of advertising. The best way to insert a file is to create a playlist in your .asx file that brings the user back to your broadcast after the file has played. Adding these features to a well-designed Web site will create an attractive and enthralling user experience.