Multilingual audio in a single stream?

I'm streaming two languages simultaneously: one from a live camera and one from a live translator. Right now it is configured as two separate components, which works very well, but I would like to combine the two audio streams so that a single recorded FLV contains all the audio data and the viewer can choose which language to listen to.

Is this feasible with FMS3? How have other people dealt with live-translated streams?

Any thoughts would be appreciated.

Thanks, it was pretty much what I thought.
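
For anyone finding this later: an FLV can only carry a single audio track, so the usual approach is to keep the translation as a separate audio-only stream (recorded alongside the camera stream) and let the viewer pick a language on the client by muting one audio source or the other. A rough client-side AS3 sketch; the server URL and the stream names "camera" and "translation" are placeholders, not anything FMS defines:

    import flash.events.NetStatusEvent;
    import flash.media.SoundTransform;
    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    var vid:Video = new Video();          // assumed to be on the display list already
    var nc:NetConnection = new NetConnection();
    var videoStream:NetStream;            // camera: video + original-language audio
    var translationStream:NetStream;      // translator: audio only

    nc.addEventListener(NetStatusEvent.NET_STATUS, function(e:NetStatusEvent):void {
        if (e.info.code == "NetConnection.Connect.Success") {
            videoStream = new NetStream(nc);
            videoStream.play("camera");                  // illustrative stream name
            vid.attachNetStream(videoStream);

            translationStream = new NetStream(nc);
            translationStream.play("translation");       // illustrative stream name
            setLanguage("original");
        }
    });
    nc.connect("rtmp://yourserver/yourapp");             // placeholder URL

    // The viewer picks a language by muting one audio source and unmuting the other.
    function setLanguage(lang:String):void {
        videoStream.soundTransform = new SoundTransform(lang == "original" ? 1 : 0);
        translationStream.soundTransform = new SoundTransform(lang == "original" ? 0 : 1);
    }

The same idea should carry over to the recorded case: record the two streams as separate files on the server and play them back side by side, rather than trying to multiplex both languages into one FLV.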

Tags: Adobe Media Server

Similar Questions

  • Can I stream audio from several sources into a single stream?

    If I have 5 users in the same broadcast, can I let any of them send audio (talk) so that everyone hears all of them at the same time? (I know you can do one-to-many streaming; I'm asking about few-to-many audio broadcasting in this situation.)

    fix.
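
    For what it's worth, few-to-many audio on FMS is usually built by having each participant publish their microphone as its own stream and subscribe to everyone else's streams; Flash mixes all playing streams on output, so every voice is heard at once. A minimal sketch, with a placeholder application URL and illustrative stream names:

        import flash.media.Microphone;
        import flash.net.NetConnection;
        import flash.net.NetStream;

        var nc:NetConnection = new NetConnection();
        nc.connect("rtmp://yourserver/audiochat");   // placeholder URL
        // (in real code, wait for "NetConnection.Connect.Success" before publishing/playing)

        // Publish this user's microphone as one stream...
        var outStream:NetStream = new NetStream(nc);
        outStream.attachAudio(Microphone.getMicrophone());
        outStream.publish("user1_audio");            // illustrative stream name

        // ...and play every other participant's stream.
        for each (var s:String in ["user2_audio", "user3_audio", "user4_audio", "user5_audio"]) {
            var inStream:NetStream = new NetStream(nc);
            inStream.play(s);
        }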

  • Since the update "Microsoft - Audio, Communication Device, Streaming Media and Broadcast" my webcam does not work? (Installation status: Successful. Update type: Recommended. Microsoft - Audio, Communication Device, Streaming Media and Broadcast.)

    Microsoft LifeCam VX-5500. Help! Since the update "Microsoft - Audio, Communication Device, Streaming Media and Broadcast - Microsoft LifeCam VX-5500" it has not worked. The program does not open, and when I try it says "stopped working". Also, MSN Messenger freezes and nobody can see me, nor can I see myself. What can I do to fix this? I know it's the update, because everything worked perfectly before, and I've read of others having this problem. Help!

    Hi john.w.11,

    I suggest you install the latest driver for the VX-5500; that could help you solve the issue.

    Click on the link below for more information.
    Troubleshoot failures to find the new LifeCam hardware
    http://support.Microsoft.com/kb/929087

    Check whether the problem is resolved.

    Please post back and let us know if it helped to solve your problem.

    Kind regards
    KarthiK TP

  • PlayBook HTML5 audio problem (streaming media problem)?

    Hello. I'm working on a simple app using HTML5 audio tags. I deploy the application to a virtual PlayBook (BlackBerry WebWorks SDK for Tablet 2.2.0.5 - BlackBerry PlayBook Simulator 1.0.7).

    This is my config.xml file:

    
    <?xml version="1.0" encoding="UTF-8"?>
    <widget xmlns="http://www.w3.org/ns/widgets" xmlns:rim="http://www.blackberry.com/ns/widgets"
        version="1.0.0.0">
        <name>Radio</name>
        <description>
            A sample application to demonstrate some of the possibilities.
        </description>
        <author>
            CESAR
        </author>
        <license>
            Example license.
        </license>
        <!-- ... -->
    </widget>

    My index.html:

    
    
    
    <html>
    <head>
        <title>audio testing live stream!</title>
    </head>
    <body>
        <p>REMOTE RADIO (works on playbook browser)</p>
        <audio src="..." controls></audio>

        <p>MP3 (local file)</p>
        <audio src="..." controls></audio>

        <p>REMOTE FILE</p>
        <audio src="..." controls></audio>
    </body>
    </html>

    The remote stream (SHOUTcast radio) and remote files (ogg, http://blackberry.github.com/WebWorks-Samples/kitchenSink/html/html5/audio.html) do not work in the HTML5 app.

    For the remote streams, the error was: "There was an error decoding this media. The media format is not supported."

    What's wrong?

    What can I do?

    Now I know what my mistake was: the URL does not point to a stream, only to a playlist (PLS) with multiple URLs for accessing the stream. For more details see:

    http://en.Wikipedia.org/wiki/PLS_%28file_format%29
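
    For anyone who hits the same error: a PLS playlist is just a small text file that lists the real stream URLs, so the audio tag has to point at one of its File entries rather than at the playlist itself. A typical .pls looks roughly like this (the URL is purely illustrative):

        [playlist]
        NumberOfEntries=1
        File1=http://example.com:8000/live
        Title1=Example Radio
        Length1=-1
        Version=2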

  • Can I normalize the audio levels in a single operation?

    I have a video that lasts 17 minutes, with 15 sequences and God knows how many clips.  The audio level is not as consistent as I wish it were, so I wonder if I can normalize all of the output with a single audio effect?  Or should I leave it as it is and use Audition on the rendered MP4 files?

    I realize that this is not recommended, so I would appreciate being pointed in the right direction to avoid getting it wrong in the future.  I've just been going by ear and keeping everything below 0 dB.  I have a subscription to Lynda.com.

    Thanks for any help.

    On the Effects tab of the export settings, there is an overall volume normalization option.

    I have no idea how effective it will be, however.

  • iOS RTMP live stream audio

    Hi guys, I hope some of you will be able to help me.

    I'm trying to play audio from a live RTMP stream using the NetConnection and NetStream classes. I managed to get my app working without problems on Android, but I'm having major difficulty getting it to play the audio back on iPad. Interestingly, it works in the device emulators when debugging, but I guess that isn't really an accurate representation. I've tried streaming the RTMP in AAC and MP3, but without success. I can verify through debugging that I have connected to the stream, yet I just get no audio playback.

    Everything I've read leads me to believe that this is possible on iOS, since I'm only interested in audio and not video. Can anyone help?

    Code example below (it's quick and dirty!).

    Thanks in advance!

    <?xml version="1.0" encoding="utf-8"?>
    <s:View xmlns:fx="http://ns.adobe.com/mxml/2009"
            xmlns:s="library://ns.adobe.com/flex/spark"
            title="Audio" creationComplete="init()">

        <s:layout>
            <s:VerticalLayout paddingLeft="10" paddingRight="10"
                              paddingTop="10" paddingBottom="10"/>
        </s:layout>

        <fx:Script>
            <![CDATA[
                import flash.events.AsyncErrorEvent;
                import flash.events.NetStatusEvent;
                import flash.events.SecurityErrorEvent;
                import flash.media.Video;
                import flash.net.NetConnection;
                import flash.net.NetStream;
                import flash.utils.setInterval;
                import mx.core.UIComponent;

                private var vid:Video;
                private var videoHolder:UIComponent;
                private var nc:NetConnection;
                private var defaultURL:String = "[STREAM]";
                private var streamName:String = "[STREAMNAME]";
                private var ns:NetStream;
                private var msg:Boolean;
                private var intervalMonitorBufferLengthEverySecond:uint;

                private function init():void
                {
                    vid = new Video();
                    vid.width = 864;
                    vid.height = 576;
                    vid.smoothing = true;

                    // Add the video to the stage
                    videoHolder = new UIComponent();
                    videoHolder.addChild(vid);
                    addEventListener(SecurityErrorEvent.SECURITY_ERROR, onSecurityError);
                    grpVideo.addElement(videoHolder);

                    connect();
                }

                public function onSecurityError(e:SecurityErrorEvent):void
                {
                    trace("Audio - security error");
                }

                public function connect():void
                {
                    nc = new NetConnection();
                    nc.client = this;
                    nc.addEventListener(NetStatusEvent.NET_STATUS, netStatusHandler);
                    nc.objectEncoding = flash.net.ObjectEncoding.AMF0;
                    nc.connect(defaultURL);
                }

                public function netStatusHandler(e:NetStatusEvent):void
                {
                    switch (e.info.code) {
                        case "NetConnection.Connect.Success":
                            trace("Audio - connected successfully");
                            createNS();
                            break;
                        case "NetConnection.Connect.Closed":
                            trace("Audio - connection closed");
                            connect();
                            break;
                        case "NetConnection.Connect.Failed":
                            trace("Audio - connection failed");
                            break;
                        case "NetConnection.Connect.Rejected":
                            trace("Audio - connection rejected");
                            break;
                        case "NetConnection.Connect.AppShutdown":
                            trace("Audio - application shutdown");
                            break;
                        case "NetConnection.Connect.InvalidApp":
                            trace("Audio - invalid app connection");
                            break;
                        default:
                            trace("Audio - " + e.info.code + " - " + e.info.description);
                            break;
                    }
                }

                public function createNS():void
                {
                    trace("Creating NetStream");
                    ns = new NetStream(nc);
                    nc.call("FCSubscribe", null, "live_production"); // Only use this if your CDN requires it
                    ns.addEventListener(NetStatusEvent.NET_STATUS, netStreamStatusHandler);
                    vid.attachNetStream(ns);

                    // Handle onMetaData and onCuePoint event callbacks: solution at http://tinyurl.com/mkadas
                    // See another solution at http://www.Adobe.com/devnet/flash/QuickStart/metadata_cue_points/
                    var infoClient:Object = new Object();
                    infoClient.onMetaData = function():void {};
                    infoClient.onCuePoint = function():void {};
                    ns.client = infoClient;

                    ns.bufferTime = 0;
                    ns.play(streamName);
                    ns.addEventListener(AsyncErrorEvent.ASYNC_ERROR, asyncErrorHandler);

                    function asyncErrorHandler(event:AsyncErrorEvent):void {
                        trace(event.text);
                    }

                    intervalMonitorBufferLengthEverySecond = setInterval(monPlayback, 1000);
                }

                public function netStreamStatusHandler(e:NetStatusEvent):void
                {
                    switch (e.info.code) {
                        case "NetStream.Buffer.Empty":
                            trace("Audio - buffer empty");
                            break;
                        case "NetStream.Buffer.Full":
                            trace("Audio - buffer full");
                            break;
                        case "NetStream.Play.Start":
                            trace("Audio - playback started");
                            break;
                        default:
                            trace("Audio - " + e.info.code + " - " + e.info.description);
                            break;
                    }
                }

                public function monPlayback():void {
                    // Print the current buffer length
                    trace("Audio - buffer length: " + ns.bufferLength);
                    trace("Audio - FPS: " + ns.currentFPS);
                    trace("Audio - live delay: " + ns.liveDelay);
                }

                public function onBWDone():void {
                    // Do nothing
                }

                public function onFCSubscribe(info:Object):void {
                    // Do nothing. Prevents an error when connecting to a CDN.
                }

                public function onFCUnsubscribe(info:Object):void {
                    // Do nothing. Prevents an error when connecting to a CDN.
                }
            ]]>
        </fx:Script>

        <s:Group id="grpVideo">
        </s:Group>
    </s:View>

    Just an update on this for all of you who come along after me.

    I managed to get this working with MP3 but not AAC (I guess AAC just isn't supported?).

    My problem was the bufferTime. The docs seemed to indicate that it should be set to 0 for live streaming, however switching it to 1 solved my problem on the particular stream I pointed it at.

    So, essentially, in the code above I just changed the line:

    ns.bufferTime = 0;

    TO

    ns.bufferTime = 1;

    Would be nice to know if someone got AAC working though...

  • No Digital Audio output when Internet Streaming on STR-DA5700ES

    I have a STR-DA5700ES that I bought precisely because it has a digital Toslink output.  I used to use this connection on older receivers to connect my Sony MDR-DS6500 7.1 headphones.  The headset unit accepts a digital Toslink connection for listening to Dolby Digital audio.

    I can listen to Dolby Digital happily when I watch TV, because the audio comes out of the Toslink output to my headset.  However, when streaming an Internet movie with 5.1 audio through the STR-DA5700ES, no audio comes out of the Toslink output.

    Am I missing something here?

    Hello

    Unfortunately the receiver is working as designed and will not output digital audio (HDMI or optical) from any of the internal streaming services. The receiver expects that you will either use its connected speakers to listen to surround audio, or the headphone jack available on the front. There is no firmware update available or planned; the receiver was designed to work this way and this cannot be changed by modifying the software.  Keep in mind that it is impossible to get true 5.1 surround sound from the 2 speakers in a headset; the surround effect is 'virtual' or 'simulated'.  As a workaround you can use a media player with an optical output connected to your TV if it offers the same streaming services, or use the headphone jack on the front of the receiver, which was provided for this exact purpose. I'm sorry for the inconvenience this has caused and I have relayed your concerns to the right team.

  • Audio but no video when streaming?

    When I go to play the CNN live stream, I can't.

    CNN Live will not play on MacMini - Solution

  • Is there a limit on the number of connections for a single stream in Cirrus / RTMFP?

    In this article, it says the number of subscriber streams is limited by the maxPeerConnections property:

    "The NetConnection.maxPeerConnections property specifies the number of streams of peers who are allowed to connect to the Publisher. The default value is set to 8... »

    But I remember having read somewhere, and thought for a long time, that RTMFP implements a stream-relay feature that creates a mesh of peers, allowing an unlimited number of stream connections while keeping the publisher's upload bandwidth fixed.

    If RTMFP had this feature, the maxPeerConnections property would be useless, so does that mean there is no such feature?

    Unless... maxPeerConnections sets a maximum number of DIRECT connections and isn't actually the maximum length of the peerStreams array.

    Could someone clarify this for me, please?

    I figured it out. The article I posted does not use cooperative networking. Note the NetStream.DIRECT_CONNECTIONS in sendStream = new NetStream(netConnection, NetStream.DIRECT_CONNECTIONS);

    However, the other article does use cooperative networking. Note the GroupSpecifier groupspec string in _outgoingStream = new NetStream(_netConnection, _groupSpec);
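
    For reference, the two setups contrasted above look roughly like this (assuming netConnection is an RTMFP NetConnection already connected to Cirrus; the group and stream names are illustrative):

        import flash.net.GroupSpecifier;
        import flash.net.NetStream;

        // Direct 1:N publishing - every subscriber opens a peer connection to the publisher,
        // so NetConnection.maxPeerConnections (default 8) effectively caps the audience.
        var sendStream:NetStream = new NetStream(netConnection, NetStream.DIRECT_CONNECTIONS);
        sendStream.publish("media");

        // Multicast over a NetGroup - peers relay the stream to one another, so the publisher's
        // upload bandwidth and maxPeerConnections no longer limit the number of viewers.
        var groupSpec:GroupSpecifier = new GroupSpecifier("multicastGroup");
        groupSpec.multicastEnabled = true;
        groupSpec.serverChannelEnabled = true;
        var outgoingStream:NetStream = new NetStream(netConnection, groupSpec.groupspecWithAuthorizations());
        outgoingStream.publish("media");

    So maxPeerConnections only limits the first, direct case; the NetGroup/multicast case is the one that gives the peer-relay behaviour described above.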

    Thank you from me

  • Audio keeps getting stuck when streaming two audio feeds at once

    Hello!

    I'm live streaming two videos to our web page, each with its own audio accompaniment.   On my laptop I could stream without any problem, but it overheats, so I had two new computers built just for the encoding.   Now, on the new computers, the two videos stream fine but the audio only works on one at a time. The other just hangs on a single note of sound and, very annoyingly, repeats it over and over.

    Could you please help me if you know what could be causing my audio not to work on the second stream?   I use an Intel i7 processor and a Gigabyte motherboard with Windows 7 Home Premium and Flash Media Live Encoder 3.2.

    It's driving me crazy.   I do not understand why my laptop would do the trick but a more powerful desktop computer would not!

    For example, when I listen to the audio on the first stream, it plays perfectly.   When I turn that down and check the other audio stream, it plays the first second of sound over and over again.   Like 'ah, ah, ah, ah, ah, ah, ah, ah, ah, ah, ah,' and it goes on like this until I stop it.

    I think it might have something to do with device drivers?   This configuration was working fine on my laptop with the same capture devices and OS (Windows 7 Home Premium).   The desktop has twice the RAM and an i7 processor versus the laptop's i5, but for some reason the desktop is not working properly.

    The only thing I can think of that might be different is that the OS is a more recent build of Windows 7?   Or maybe it's because the desktop has a Gigabyte motherboard?

    Thank you!   CHunter32

    Hello

    Thank you for trying out FMLE and FMS.

    Also, I guess the problem is with the device drivers. FMLE 3.2 and FMS do support Windows 7 (although for development purposes only), so there should be no difference in functionality.

  • Streaming videos, such as YouTube, won't play in full screen mode: no video, only audio.

    Windows XP SP3. Just installed Firefox. I can play any video on YouTube, but if I click on full screen my screen is black except for the progress bar. I hear the audio fine. Full-screen streaming video plays fine in IE8, but I prefer to use Firefox. Thanx

    -> Tap the ALT key or press F10 to display the menu bar

    -> go to the Tools menu -> Options -> Advanced -> General -> uncheck "Use hardware acceleration when available" -> click OK

    Managing the Flash Plugin

    Check and tell us if it's working.

  • 4th gen plays multiple audio streams at the same time.

    Hello

    I have a weird problem with video that was remuxed with ffmpeg: it has both a 2-channel AAC stream and a 6-channel AC3 stream, and both are played at the same time. Is this a common problem?

    Video playback on other devices works as expected, i.e. only a single audio stream plays.

    Welcome to the Apple community.

    I suspect that they are simply encoded incorrectly; I have no issue with the compressor.

  • WTVConverter problem with multiple audio streams...

    The WTVConverter that Microsoft includes with Windows 7 does not correctly handle multiple audio streams. When converting from the '.wtv' format to '.dvrms', instead of keeping the main audio stream, the WTVConverter usually keeps the other, visually-impaired (descriptive) audio stream. NCIS and NCIS: Los Angeles, both CBS programs, have this problem because they include SAP signals (accessibility options for blind and partially sighted viewers).
    How can I tell WTVConverter to include only the main audio instead of the visually-impaired audio stream?
    Thank you
    Chandra

    Hello, Chandra

    If you think that there should be an option to select a stream when converting, you can leave your comments at the following link: http://mymfe.microsoft.com/Windows%207/Feedback.aspx?formID=195

    In addition, you can try searching for a free third-party program to convert WTV files.

    David
    Microsoft Answers Support Engineer
    Visit our Microsoft answers feedback Forum and let us know what you think.

  • Playing a secured audio stream with HTML5 Audio

    Hello

    Is it possible to play a protected stream with the HTML5 Audio tag? The stream that I play is password protected

    (Basic HTTP authentication); how can I supply credentials? Thanks in advance.

    Best regards

    I solved the problem by providing the credentials as parameters: http://aaaa.bbbb.com:8117/stream?user=adriana&pass=lima

  • How to record video and audio in a single file like the default BB Video Recorder?

    Hello

    Is it possible to record audio and video in a single file, something like the default video recorder? My goal is to stream this recording to the server using a SocketConnection. I streamed the video successfully, but it contains no audio...

    Manager.createPlayer ("capture://video?" +);

    I went through another post and did not find anything.

    Is it possible to record audio & video in a single instance? ...

    I tried the following encodings:

    1. encoding=video/3gpp&mode=standard
    2. encoding=video/3gpp&mode=mms
    3. encoding=video/3gpp&width=480&height=352&video_codec=MPEG-4&audio_codec=AMR
    4. encoding=video/3gpp&width=176&height=144&video_codec=MPEG-4&audio_codec=AMR
    5. encoding=video/3gpp&width=480&height=352&video_codec=H263&audio_codec=AMR
    6. encoding=video/3gpp&width=176&height=144&video_codec=H263&audio_codec=AMR

    Unfortunately, none of them contained Audio...

    How can I record audio & video using a single Player instance? Is this possible? ...

    (I am able to convert the raw .3gp file from encoding #1 using a YouTube downloader.)

    Help, please...

    Thanks in advance...

    Yes, it does. I usually just use #1, but I don't know how it works on the simulator.
