This discussion is archived
7 Replies Latest reply: Aug 16, 2011 10:39 AM by chummer

A/V stream out of sync when playing audio on SourceDataLine

chummer Newbie
Hi,

I'm receiving a video stream and an audio stream via RTP and trying to play them in sync (audio matching the lip movement).
I'm using a DataSourceReader to read from a DataSource and play the audio on a JavaSound SourceDataLine. This works pretty well, but unfortunately the audio is played with a delay of approximately 1 second.

else if (evt instanceof NewReceiveStreamEvent) {
    try {
        stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
        DataSource ds = stream.getDataSource();
        Format format = null;

        // Find out the format of the incoming stream.
        RTPControl ctl = (RTPControl) ds.getControl("javax.media.rtp.RTPControl");
        if (ctl != null) {
            format = ctl.getFormat();
            log.info("Received new RTP stream: " + format);
        } else {
            log.info("Received new RTP stream");
        }

        if (format instanceof VideoFormat) {
            // Video: let JMF create and start a regular Player.
            Player video = javax.media.Manager.createRealizedPlayer(ds);
            video.start();
            this.video_player = video;
        } else if (format instanceof AudioFormat) {
            // Audio: play it myself on a SourceDataLine.
            JavaSoundDataSourceReader dsr = new JavaSoundDataSourceReader();
            Player audio = dsr.open(ds);
            audio.start();
            this.audio_player = audio;
        }
    } catch (Exception e) {
        log.info("Could not handle new receive stream: " + e);
    }
}
I already tried to synchronize them with
videoPlayer.addController(audio_player);
audio_player.start();
but this throws an exception:
javax.media.IncompatibleTimeBaseException
     at com.sun.media.multiplexer.RawBufferMux.setTimeBase(RawBufferMux.java:377)
     at com.sun.media.BasicSinkModule.setTimeBase(BasicSinkModule.java:52)
     at com.sun.media.PlaybackEngine.setTimeBase(PlaybackEngine.java:1672)
     at com.sun.media.BasicPlayer.setTimeBase(BasicPlayer.java:272)
     at com.sun.media.BasicPlayer.addController(BasicPlayer.java:1043)
     at VideoPlayback.update(VideoPlayback.java:281)
Line 281 of VideoPlayback.java is the "videoPlayer.addController(audio_player);" line.

My guess is that it should use RawSyncBufferMux instead of RawBufferMux, but I don't know how to change that. Maybe directly as a processor option?
Please help, as I've no clue how to improve this.
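For reference, the usual JMF idiom before chaining controllers is to hand both players the same TimeBase, so addController has nothing to reconcile. A minimal sketch (assuming JMF on the classpath; video_player and audio_player are the realized Players from the snippet above):

```java
import javax.media.IncompatibleTimeBaseException;
import javax.media.Player;

// Sketch only: both arguments are assumed to be realized Players.
public class AVSync {
    static void sync(Player video_player, Player audio_player) {
        try {
            // Drive the audio player from the video player's clock.
            audio_player.setTimeBase(video_player.getTimeBase());
            video_player.addController(audio_player);
            // Starting the managing player now starts both together.
            video_player.start();
        } catch (IncompatibleTimeBaseException e) {
            // RawBufferMux-backed handlers may refuse a foreign time base,
            // which is exactly the exception shown below.
            e.printStackTrace();
        }
    }
}
```

If the handler rejects the shared time base (as in the stack trace), synchronization by controller chaining is not available for that playback path.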
  • 1. Re: A/V stream out of sync when playing audio on SourceDataLine
    captfoss Pro
    chummer wrote:
    I'm using a DataSourceReader in order to read from a datasource and play on a JavaSound SourceDataLine. This works pretty good but unfortunately the audio is played with a delay of approx 1 second.
    My question for you is, why are you doing that? Why aren't you just using JMF to play the audio?
  • 2. Re: A/V stream out of sync when playing audio on SourceDataLine
    chummer Newbie
    Because the project I'm working on is about high-quality audio recording with 3 or more soundcards at the same time, streaming without compression (bandwidth > 50 Mbit/s down and up).
    The JMF part is just a goody to include a webcam chat between the connected users. As there are multiple soundcards installed, the user needs to select one for JMF sound (normally the worst card, as it only carries a low-quality RTP stream).
  • 3. Re: A/V stream out of sync when playing audio on SourceDataLine
    captfoss Pro
    chummer wrote:
    Because the project I'm working on is about high quality audio recording with 3 or more soundcards at the same time and streaming without compression (bandwidth > 50 mbit/s down and up).
    The JMF part is just a goody to include a webcam chat between the connected users. As there are multiple soundcards installed, the user needs to select one for JMF sound (normally the worst as it's a low quality RTP stream).
    First off, I'm not sure that you answered my question. Recording is done elsewhere... streaming is done elsewhere. It seems like you're receiving the audio via JMF RTP, so why can't you play it with JMF?

    Also, you do realize that JMF renders audio by wrapping around JavaSound, right?
  • 4. Re: A/V stream out of sync when playing audio on SourceDataLine
    chummer Newbie
    captfoss wrote:

    First off, I'm not sure that you answered my question. Recording is done elsewhere... streaming is done elsewhere. It seems like you're receiving the audio via JMF RTP, so why can't you play it with JMF?
    The thing is that I want to control which soundcard the audio is recorded from on one side and which it is played on on the other side.
    I could do the audio part completely without JMF, but... it should also be synchronized with the video :-)
    captfoss wrote:

    Also, you do realize that JMF renders audio by wrapping around JavaSound, right?
    I know that, but JMF's abstraction forbids access to the hardware, such as SourceDataLines or TargetDataLines (as far as I know).
    The software I'm writing is a kind of mixing desk where you can select channels and route them to physical hardware channels.
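    For context, plain JavaSound does let you pick the device explicitly, which plain JMF playback does not. A minimal sketch (the choice of mixer index 0 and the 44.1 kHz stereo format are placeholders, not values from this thread):

```java
import javax.sound.sampled.*;

public class PickMixer {
    public static void main(String[] args) throws LineUnavailableException {
        // Enumerate the installed soundcards/mixers.
        Mixer.Info[] mixers = AudioSystem.getMixerInfo();
        for (Mixer.Info mi : mixers) {
            System.out.println(mi.getName() + " - " + mi.getDescription());
        }
        // Pick one by name or index (index 0 here is just a placeholder)
        // and ask it for a playback line in the desired format.
        if (mixers.length > 0) {
            Mixer mixer = AudioSystem.getMixer(mixers[0]);
            AudioFormat fmt = new AudioFormat(44100f, 16, 2, true, false);
            DataLine.Info info = new DataLine.Info(SourceDataLine.class, fmt);
            if (mixer.isLineSupported(info)) {
                SourceDataLine line = (SourceDataLine) mixer.getLine(info);
                line.open(fmt);
                // ... write PCM to the line, then drain and close ...
                line.close();
            }
        }
    }
}
```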
  • 5. Re: A/V stream out of sync when playing audio on SourceDataLine
    captfoss Pro
    chummer wrote:
    it should also be synchronized with the video :-)
    The problem with getting it synchronized is that JavaSound and your sound card have internal buffering, so there is some delay between writing to them and hearing the sound come out...
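    To put a rough number on that delay (a back-of-the-envelope sketch; the 64 KiB buffer and the 44.1 kHz 16-bit stereo format are assumed, not taken from this thread), a line buffer holds bufferBytes / (frameRate * frameSize) seconds of audio:

```java
public class BufferLatency {
    // Seconds of audio held by a line buffer of the given size.
    static double latencySeconds(int bufferBytes, float frameRate, int frameSize) {
        return bufferBytes / (frameRate * frameSize);
    }

    public static void main(String[] args) {
        // 44.1 kHz, 16-bit stereo PCM => 4 bytes per frame (assumed values).
        double latency = latencySeconds(64 * 1024, 44100f, 4);
        System.out.printf("%.3f s%n", latency); // about 0.372 s for a 64 KiB buffer
    }
}
```

So buffering alone, before any network jitter buffer, can account for a good fraction of a second.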

    Normally, JMF handles this internally by aligning things to a time base, which can be synchronized, and by allowing Processors to fill their internal buffers ahead of time during the prefetching phase, so that everything that needs to play together can have an instant start time. As playback goes along, presumably adjustments are made as things drift out of sync.

    My best advice to you would be to implement your JavaSound stuff inside a Renderer class instead of a DataSink class. DataSinks are designed to take in data and write it out as fast as they can... whereas Renderers are given data at "real-time" intervals and are designed to be played.

    When you implement your Renderer class, JMF should handle the synchronization for you. I.e., it'll give you data as it should be played by calling the process function, so you'll just need to write the data to your SourceDataLine as it comes in. I believe that process may be called before start to allow you to prebuffer some data, but you'll need to test that, as I've never actually tried it.
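    A skeleton of that approach might look like the following (a sketch only, assuming JMF on the classpath; obtaining the SourceDataLine from the desired mixer and converting the JMF AudioFormat to a javax.sound.sampled.AudioFormat are elided):

```java
import javax.media.Buffer;
import javax.media.Format;
import javax.media.Renderer;
import javax.media.ResourceUnavailableException;
import javax.media.format.AudioFormat;
import javax.sound.sampled.SourceDataLine;

// Sketch of a JMF audio Renderer that plays through a chosen SourceDataLine.
public class SourceDataLineRenderer implements Renderer {
    private SourceDataLine line;   // obtained from the desired mixer in open()
    private AudioFormat format;

    public String getName() { return "SourceDataLine Renderer"; }

    public Format[] getSupportedInputFormats() {
        // Accept linear PCM; narrow this to what your line supports.
        return new Format[] { new AudioFormat(AudioFormat.LINEAR) };
    }

    public Format setInputFormat(Format f) {
        if (!(f instanceof AudioFormat)) return null;
        format = (AudioFormat) f;
        return format;
    }

    public void open() throws ResourceUnavailableException {
        // Open the SourceDataLine on the selected mixer here.
    }

    public void start() { if (line != null) line.start(); }
    public void stop()  { if (line != null) line.stop(); }

    public int process(Buffer buffer) {
        // JMF calls this at playback pace; just push the bytes out.
        byte[] data = (byte[]) buffer.getData();
        line.write(data, buffer.getOffset(), buffer.getLength());
        return BUFFER_PROCESSED_OK;
    }

    public void reset() { if (line != null) line.flush(); }
    public void close() { if (line != null) line.close(); }
}
```

    You would then register it ahead of the default audio renderer via PlugInManager.addPlugIn(...) with type PlugInManager.RENDERER so JMF picks it up.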
  • 6. Re: A/V stream out of sync when playing audio on SourceDataLine
    chummer Newbie
    Perfect, thanks for that. I'll give it a try and report back.
  • 7. Re: A/V stream out of sync when playing audio on SourceDataLine
    chummer Newbie
    Ok, I've done this and changed from the DataSink to a custom Renderer.
    At first the result was the same as before (audio a second behind the video), but adjusting the receive buffers got them perfectly in sync:
    BufferControl bc1 = (BufferControl)receivingVideoRTPManager.getControl("javax.media.control.BufferControl");
    BufferControl bc2 = (BufferControl)receivingAudioRTPManager.getControl("javax.media.control.BufferControl");
    if (bc1 != null)
        bc1.setBufferLength(1024*1024*1024);
    if (bc2 != null)
        bc2.setBufferLength(0);
    Video now has the same delay as audio (about 1 second).
    I'm quite sure it would also have worked this way with the custom DataSink, but I prefer the Renderer solution as it's cleaner.

    Thanks captfoss for your help.
