I have been playing with Flash/Flex video streaming + chat, just for fun. There are probably people searching for working examples, and I didn't find much on the net that works out of the box with the latest versions. So here is my working code:
Features:
- Publish a Stream (sound included)
- Receive a Stream (playing sound)
- Minimal chat components
What's needed: Flash Builder/Flex with ActionScript 3 (you can adapt the example for plain Flash), and FMS, aka Flash Media Server, with the application "live", which is usually installed by default. FMS also installs Apache and publishes an admin console where you can view connections, application status and stream status (hint: check the stream status of the "live" app).
3 mxml files:
- Publisher
- Viewer
- Main app
You may also want to create your own application on FMS, but I'm not covering that in detail here; a minimal sketch is shown below.
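If you do want a custom application instead of "live", the simplest server-side piece is a main.asc dropped into the application's folder on the FMS machine. This is only a sketch of the default accept-everything behaviour (the rest of this post does not depend on it):
main.asc
-- start here --
// Minimal FMS application script (sketch): accept every connection
application.onAppStart = function() {
trace("Application started: " + application.name);
};
application.onConnect = function(client) {
// Accept every incoming NetConnection; add your own checks here
application.acceptConnection(client);
};
application.onDisconnect = function(client) {
trace("Client disconnected: " + client.ip);
};
-- stop here --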
So, here we go:
The publisher sends the stream to the server over a NetConnection, with the webcam attached to a one-direction NetStream. It also attaches the microphone, if there is one. The core calls are sketched below, followed by the full component.
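Stripped of the UI, the publish path boils down to this (a minimal sketch; the server URL and stream name are placeholders):
-- start here --
// Sketch only: the essential publish calls, without UI or error handling
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.media.Camera;
import flash.media.Microphone;
var nc:NetConnection = new NetConnection();
nc.connect("rtmp://your-fms-host/live"); // the real component waits for NetConnection.Connect.Success
var ns:NetStream = new NetStream(nc);
var cam:Camera = Camera.getCamera();
var mic:Microphone = Microphone.getMicrophone();
ns.attachCamera(cam); // webcam frames go into the stream
ns.attachAudio(mic); // microphone audio goes into the stream
ns.publish("myStream", "live"); // triggers NetStream.Publish.Start
-- stop here --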
Publisher.mxml
-- start here --
<?xml version="1.0" encoding="utf-8"?>
<s:Group xmlns:fx="http://ns.adobe.com/mxml/2009"
xmlns:s="library://ns.adobe.com/flex/spark"
xmlns:mx="library://ns.adobe.com/flex/mx" width="410" height="490" creationComplete="LiveStreams()">
<fx:Declarations>
<!-- Place non-visual elements (e.g., services, value objects) here -->
</fx:Declarations>
<fx:Script>
<![CDATA[
import flash.display.MovieClip;
import flash.events.ActivityEvent;
import flash.events.MouseEvent;
import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.media.Microphone;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
import mx.utils.ObjectUtil;
private var nc:NetConnection;
private var ns:NetStream;
private var video:Video;
private var camera:Camera;
private var mic:Microphone;
private var meta:Object;
public function sendChat():void {
var obj:Object = new Object();
obj.chat = this.txtSend.text;
this.txtChat.text = this.txtChat.text + "\n" + this.txtSend.text;
this.txtChat.verticalScrollPosition = this.txtChat.maxVerticalScrollPosition;
this.txtSend.text = "";
// send the chat object over the published stream;
// subscribers handle it in their NetStream client's receiveChat()
if (ns) {
ns.send("receiveChat", obj);
}
}
public function LiveStreams():void
{
startBtn.addEventListener(MouseEvent.CLICK, startHandler);
clearBtn.addEventListener(MouseEvent.CLICK, clearHandler);
stopBtn.addEventListener(MouseEvent.CLICK, stopHandler);
}
private function ns_onMetaData(item:Object):void {
trace("meta");
trace(ObjectUtil.toString(item));
/*
meta = item;
// Resize Video object to same size as meta data.
video.width = item.width;
video.height = item.height;
// Resize UIComponent to same size as Video object.
uic.width = video.width;
uic.height = video.height;
panel.title = "framerate: " + item.framerate;
panel.visible = true;
*/
}
private function ns_onCuePoint(item:Object):void {
trace("cue");
}
/*
* Connect and start publishing the live stream
*/
private function startHandler(event:MouseEvent):void {
trace("Okay, let's connect now");
nc = new NetConnection();
nc.client=this;
nc.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
nc.connect(this.txtServer.text);
}
/*
* Disconnect from the server
*/
private function stopHandler(event:MouseEvent):void {
trace("Now we're disconnecting");
nc.close();
}
/*
* Clear the MetaData associated with the stream
*/
private function clearHandler(event:MouseEvent):void {
if (ns){
trace("Clearing MetaData");
ns.send("@clearDataFrame", "onMetaData");
}
}
private function netStatusHandler(event:NetStatusEvent):void
{
trace("connected is: " + nc.connected );
trace("event.info.level: " + event.info.level);
trace("event.info.code: " + event.info.code);
switch (event.info.code)
{
case "NetConnection.Connect.Success":
trace("Congratulations! you're connected");
publishLiveStream();
break;
case "NetConnection.Connect.Rejected":
trace ("Oops! the connection was rejected");
break;
case "NetStream.Play.Stop":
trace("The stream has finished playing");
break;
case "NetStream.Play.StreamNotFound":
trace("The server could not find the stream you specified");
break;
case "NetStream.Publish.Start":
trace("Adding metadata to the stream");
// when publishing starts, add the metadata to the stream
var metaData:Object = new Object();
metaData.title = "UnStreammmm";
metaData.width = 200;
metaData.height = 150;
ns.send("@setDataFrame", "onMetaData", metaData);
break;
case "NetStream.Publish.BadName":
trace("The stream name is already used");
break;
}
}
public function onBWDone():void{
}
private function activityHandler(event:ActivityEvent):void {
trace("activityHandler: " + event);
trace("activating: " + event.activating);
}
/*
* Create a live stream, attach the camera and microphone, and
* publish it to the local server
*/
private function publishLiveStream():void {
ns = new NetStream(nc);
ns.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
var nsClient:Object = {};
nsClient.onMetaData = ns_onMetaData;
nsClient.onCuePoint = ns_onCuePoint;
ns.client = nsClient;
camera = Camera.getCamera();
mic = Microphone.getMicrophone();
if (camera != null){
camera.addEventListener(ActivityEvent.ACTIVITY, activityHandler);
video = new Video();
video.smoothing=true;
video.width=200;
video.height=150;
video.attachCamera(camera);
ns.attachCamera(camera);
uic.addChild(video);
//this.addChild(video);
//myvid.source=camera;
}
if (mic != null) {
mic.addEventListener(ActivityEvent.ACTIVITY, activityHandler);
mic.setUseEchoSuppression(true);
mic.rate = 44;
ns.attachAudio(mic);
}
if (camera != null || mic != null){
// start publishing
// triggers NetStream.Publish.Start
ns.publish(this.txtStream.text, "live");
} else {
trace("Please check your camera and microphone");
}
}
]]>
</fx:Script>
<mx:UIComponent id="uic" x="5" y="72" width="200" height="150" />
<mx:Button id="startBtn" x="15" y="40" label="Start"/>
<mx:Button id="clearBtn" x="93" y="40" label="Clear"/>
<mx:Button id="stopBtn" x="171" y="40" label="Stop"/>
<s:TextInput x="59" y="11" id="txtServer" text="rtmp://192.168.1.130/live"/>
<s:TextInput x="266" y="11" id="txtStream"/>
<s:Label x="15" y="15" text="Server"/>
<s:Label x="206" y="16" text="Stream"/>
<mx:TextArea x="10" y="380" width="390" height="66" id="txtChat" verticalScrollPolicy="auto" editable="false"/>
<s:TextInput x="10" y="458" width="308" id="txtSend"/>
<s:Button x="326" y="459" label="Send" click="sendChat();"/>
</s:Group>
-- stop here --
The Viewer component connects to a stream, plays it (including sound) and receives chat messages sent over the published stream. The core calls are sketched below, followed by the full component.
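Again stripped of the UI, the playback path is just this (a minimal sketch; server URL and stream name are placeholders):
-- start here --
// Sketch only: the essential playback calls, without UI or error handling
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.media.Video;
var nc:NetConnection = new NetConnection();
nc.connect("rtmp://your-fms-host/live"); // the real component waits for NetConnection.Connect.Success
var ns:NetStream = new NetStream(nc);
ns.client = {
onMetaData: function(item:Object):void { trace(item.width, item.height); },
receiveChat: function(obj:Object):void { trace(obj.chat); }
};
var video:Video = new Video(200, 150);
video.attachNetStream(ns); // video frames render into this Video object
ns.play("myStream"); // audio plays automatically through the NetStream
-- stop here --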
Viewer.mxml
-- start here --
<?xml version="1.0" encoding="utf-8"?>
<s:Group xmlns:fx="http://ns.adobe.com/mxml/2009"
xmlns:s="library://ns.adobe.com/flex/spark"
xmlns:mx="library://ns.adobe.com/flex/mx" width="410" height="490" creationComplete="LiveStreams()">
<fx:Declarations>
<!-- Place non-visual elements (e.g., services, value objects) here -->
</fx:Declarations>
<fx:Script>
<![CDATA[
import flash.display.MovieClip;
import flash.events.ActivityEvent;
import flash.events.MouseEvent;
import flash.events.NetStatusEvent;
import flash.media.Camera;
import flash.media.Microphone;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;
import flash.media.SoundTransform;
import mx.utils.ObjectUtil;
private var nc:NetConnection;
private var ns:NetStream;
private var video:Video;
private var meta:Object;
public function LiveStreams():void
{
startBtn.addEventListener(MouseEvent.CLICK, startHandler);
clearBtn.addEventListener(MouseEvent.CLICK, clearHandler);
stopBtn.addEventListener(MouseEvent.CLICK, stopHandler);
}
// called remotely when the publisher does ns.send("receiveChat", obj)
public function receiveChat(obj:Object):void {
trace(ObjectUtil.toString(obj));
this.txtChat.text = this.txtChat.text + "\n" + obj.chat;
this.txtChat.verticalScrollPosition = this.txtChat.maxVerticalScrollPosition;
}
private function ns_onMetaData(item:Object):void {
trace("meta"+item.toString()+" "+item.title );
trace(ObjectUtil.toString(item));
/*
meta = item;
// Resize Video object to same size as meta data.
video.width = item.width;
video.height = item.height;
// Resize UIComponent to same size as Video object.
uic.width = video.width;
uic.height = video.height;
panel.title = "framerate: " + item.framerate;
panel.visible = true;
trace(ObjectUtil.toString(item));
*/
}
private function ns_onCuePoint(item:Object):void {
trace("cue");
}
/*
* Connect to the server and start playing the live stream
*/
private function startHandler(event:MouseEvent):void {
trace("Okay, let's connect now");
nc = new NetConnection();
nc.client=this;
nc.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
nc.connect(this.txtServer.text);
}
/*
* Disconnect from the server
*/
private function stopHandler(event:MouseEvent):void {
trace("Now we're disconnecting");
nc.close();
}
/*
* Clear the MetaData associated with the stream
*/
private function clearHandler(event:MouseEvent):void {
if (ns){
trace("Clearing MetaData");
ns.send("@clearDataFrame", "onMetaData");
}
}
private function netStatusHandler(event:NetStatusEvent):void
{
trace("connected is: " + nc.connected );
trace("event.info.level: " + event.info.level);
trace("event.info.code: " + event.info.code);
switch (event.info.code)
{
case "NetConnection.Connect.Success":
trace("Congratulations! you're connected");
playLiveStream();
break;
case "NetConnection.Connect.Rejected":
trace ("Oops! the connection was rejected");
break;
case "NetStream.Play.Stop":
trace("The stream has finished playing");
break;
case "NetStream.Play.StreamNotFound":
trace("The server could not find the stream you specified");
break;
case "NetStream.Publish.Start":
trace("Adding metadata to the stream");
// when publishing starts, add the metadata to the stream
var metaData:Object = new Object();
metaData.title = "UnStreammmm";
metaData.width = 200;
metaData.height = 150;
ns.send("@setDataFrame", "onMetaData", metaData);
break;
case "NetStream.Publish.BadName":
trace("The stream name is already used");
break;
}
}
public function onBWDone():void{
}
private function activityHandler(event:ActivityEvent):void {
trace("activityHandler: " + event);
trace("activating: " + event.activating);
}
/*
* Create a stream on the connection, attach it to the Video object
* and start playing the published stream
*/
private function playLiveStream():void {
ns = new NetStream(nc);
ns.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
var nsClient:Object = {};
nsClient.receiveChat = receiveChat;
nsClient.onMetaData = ns_onMetaData;
nsClient.onCuePoint = ns_onCuePoint;
ns.client = nsClient;
video = new Video();
video.smoothing=true;
video.attachNetStream(ns);
video.width=200;
video.height=150;
// audio from the played stream is handled by the NetStream itself;
// the SoundTransform just sets the volume (1 = full)
ns.soundTransform = new SoundTransform(1);
ns.play(this.txtStream.text);
uic.addChild(video);
//this.addChild(video);
//myvid.source=camera;
}
]]>
</fx:Script>
<mx:UIComponent id="uic" x="5" y="70" width="200" height="150" />
<mx:Button id="startBtn" x="15" y="40" label="Start"/>
<mx:Button id="clearBtn" x="98" y="40" label="Clear"/>
<mx:Button id="stopBtn" x="176" y="40" label="Stop"/>
<s:TextInput x="54" y="9" id="txtServer" text="rtmp://192.168.1.130/live"/>
<s:TextInput x="261" y="9" id="txtStream"/>
<s:Label x="10" y="13" text="Server"/>
<s:Label x="201" y="14" text="Stream"/>
<mx:TextArea x="10" y="380" width="390" height="100" id="txtChat" verticalScrollPolicy="auto" editable="false"/>
</s:Group>
-- stop here --
Now we merge both components, Publisher and Viewer, into a single app:
MainApp.mxml
-- start here --
<?xml version="1.0" encoding="utf-8"?>
<s:Application xmlns:fx="http://ns.adobe.com/mxml/2009"
xmlns:s="library://ns.adobe.com/flex/spark"
xmlns:mx="library://ns.adobe.com/flex/mx" minWidth="955" minHeight="600" xmlns:ns1="*">
<fx:Declarations>
<!-- Place non-visual elements (e.g., services, value objects) here -->
</fx:Declarations>
<ns1:Publisher x="10" y="10" width="410" height="350">
</ns1:Publisher>
<ns1:Viewer x="432" y="10" width="410" height="350">
</ns1:Viewer>
</s:Application>
-- stop here --
To run the example, set the IP of your FMS server in the Server field (the code above defaults to rtmp://192.168.1.130/live). Then open the page in 2 different browsers/tabs.
In browser/tab 1, set the stream name of the publisher component to, for example, "Jaime" and the stream name of the viewer to "Pablo".
In browser/tab 2, do the opposite: set the publisher's stream name to "Pablo" and the viewer's to "Jaime" (thanks, Jaime and Pablo, for testing).
Now click Start on the publisher components and then Start on the viewer components. You can also send some text through the chat box. The chat works through NetStream.send(): the publisher sends a data message with a handler name ("receiveChat") and an object, and the subscriber's NetStream client object must expose a method with that name to handle it.
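In other words, the two sides of the chat are wired together like this (a sketch; the names match the components above):
-- start here --
// Publisher side: push a data message down the published stream
var obj:Object = { chat: "hello" };
ns.send("receiveChat", obj);
// Viewer side: the NetStream client must define a matching handler
var nsClient:Object = {};
nsClient.receiveChat = function(obj:Object):void {
trace("chat received: " + obj.chat);
};
ns.client = nsClient;
-- stop here --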
I ran into some issues caused by older server-side calls (for example onBWDone()), which is why you will find a few empty definitions in the code. They are not needed once you attach a custom client object to the NetConnection, as sketched below.
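A minimal sketch of that approach (assuming you only care about onBWDone; add whatever other callbacks your server invokes):
-- start here --
// A dedicated client object keeps server callbacks like onBWDone()
// off the component itself, so no empty stubs are needed there
import flash.net.NetConnection;
var ncClient:Object = {};
ncClient.onBWDone = function(... args):void {
trace("onBWDone called by the server");
};
var nc:NetConnection = new NetConnection();
nc.client = ncClient; // set before connect() so early callbacks are handled
nc.connect("rtmp://your-fms-host/live");
-- stop here --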
As a final note, I'm not catching all the exceptions the application can throw, so please be careful with it. It is just a proof of concept.
;-)