13 Replies Latest reply: Mar 17, 2011 12:34 PM by captfoss

    AUDIO CONFERENCE USING JMF

    843867
      I use the JMF RTP Manager to send and receive audio streams. File1 captures the audio device, gets a DataSource from it, and listens for an audio stream coming from another system. After receiving the stream, it mixes the captured audio and the received audio together, transmits the resulting DataSource back to the other system, and plays it with a Player.
      File2 captures the audio device and gets a DataSource from it. It transmits that DataSource and listens for an audio stream. When one arrives, it simply plays it.

      Run File1 on one system and File2 on another.

      PROBLEM:
      The system running File1 executes well and plays the mixed audio. But the other system, running File2, does not play anything. It prints up to "in receive stream evt..." but never prints "receiver thread started..".



      //FILE1
      /**
       *
       * @author Sathishkumar
       *
       */
      import java.util.*;
      import javax.media.*;
      import javax.media.protocol.*;
      import javax.media.format.*;
      import java.net.*;
      import javax.media.rtp.*;
      import javax.media.rtp.event.*;

      /* This class adds and removes peers, receives voice from all the peers,
         mixes it, and sends the mix back to all the peers. */
      public class Mixing implements SessionListener, ReceiveStreamListener
      {
          public static Processor audioProcessor, audioProcessor1;
          public static String mixer_ip;
          DataSource dataSource;
          DataSource processedAudioSource, processedAudioSource1;
          DataSource audioInputSource, audioInputSource1;
          public static RTPManager srcMgrs;
          public static SendStream sendStream;   // used below but previously undeclared
          DataSource[] sources;
          List dataSources;
          public static SessionAddress srclocalAddr;

          public Mixing()
          {
              try
              {
                  dataSources = new ArrayList();
                  if (srcMgrs == null)
                  {
                      System.out.println("CHILD NODES");
                      srcMgrs = RTPManager.newInstance();
                      srclocalAddr = new SessionAddress(
                              InetAddress.getByName(InetAddress.getLocalHost().getHostName()), 3000);
                      srcMgrs.initialize(srclocalAddr);
                      // addTarget() takes a SessionAddress, not a (host, port) pair
                      srcMgrs.addTarget(new SessionAddress(
                              InetAddress.getByName("HOST NAME of another sstm"), 3000));
                      srcMgrs.addReceiveStreamListener(this);
                      srcMgrs.addSessionListener(this);
                      System.out.println("initiated listeners........");
                  }
                  process();
              }
              catch (Exception e)
              {
                  System.out.println("Error in Mixing Add Child Nodes11");
                  System.out.println("Error" + e);
              }
          }

          public synchronized void update(SessionEvent evt)
          {
              System.out.println("Come inside Session event");
          }

          public void process()
          {
              try
              {
                  Vector info = CaptureDeviceManager.getDeviceList(null);
                  if (info == null)
                  {
                      System.out.println("No Capture devices known to JMF");
                  }
                  else
                  {
                      CaptureDeviceInfo deviceInfo = (CaptureDeviceInfo) info.elementAt(0);
                      MediaLocator locator = deviceInfo.getLocator();
                      System.out.println("Got Locator:" + locator);
                      audioInputSource = Manager.createDataSource(locator);
                      System.out.println("started to capture........");
                      System.out.println("Mixing Configuration in Process");
                      Format[] formats = new Format[1];
                      formats[0] = new AudioFormat(AudioFormat.GSM_RTP, 8000, 16, 1);
                      audioProcessor = Manager.createRealizedProcessor(new ProcessorModel(
                              audioInputSource, formats,
                              new FileTypeDescriptor(FileTypeDescriptor.RAW)));
                      processedAudioSource = audioProcessor.getDataOutput();
                      dataSources.add(processedAudioSource);
                      audioProcessor.start();
                      System.out.println("Transferring to peers");
                  }
              }
              catch (Exception e)
              {
                  System.out.println("Error in Mixing Process");
                  System.out.println("Error" + e);
              }
          }

          public static void main(String[] ar)
          {
              Mixing m = new Mixing();
          }

          public synchronized void update(ReceiveStreamEvent evt)
          {
              System.out.println("in receive stream evt...");
              try
              {
                  if (evt instanceof RemotePayloadChangeEvent)
                  {
                      System.err.println(" - Received an RTP PayloadChangeEvent.");
                      System.exit(0);
                  }
                  else if (evt instanceof NewReceiveStreamEvent)
                  {
                      System.out.println("inside Receive event of play");
                      RTPManager mgr = (RTPManager) evt.getSource();
                      Participant participant = evt.getParticipant();
                      ReceiveStream stream = evt.getReceiveStream();
                      audioInputSource1 = stream.getDataSource();

                      dataSources.add(audioInputSource1);
                      sources = (DataSource[]) dataSources.toArray(new DataSource[0]);
                      dataSource = Manager.createMergingDataSource(sources);

                      sendStream = srcMgrs.createSendStream(dataSource, 0);
                      sendStream.start();

                      Player audioPlayer = Manager.createRealizedPlayer(dataSource);
                      System.out.println("receiver thread started");
                      audioPlayer.start();
                  }
              }
              catch (Exception e)
              {
                  System.out.println("Error in Play Receive Event");
                  System.out.println("Error" + e);
              }
          }
      }



      //FILE2
      /**
       *
       * @author Sathishkumar
       *
       */
      import java.util.Vector;
      import javax.media.*;
      import javax.media.protocol.*;
      import javax.media.format.*;
      import java.net.*;
      import javax.media.rtp.*;
      import javax.media.rtp.event.*;

      public class Mixing_c implements SessionListener, ReceiveStreamListener
      {
          public static Processor audioProcessor;
          String mixer_ip;
          DataSource processedAudioSource;
          DataSource audioInputSource, audioInputSource1;
          public static RTPManager srcMgrs;
          public static SendStream sendStream;   // used below but previously undeclared
          public static SessionAddress srclocalAddr;

          public Mixing_c()
          {
              try
              {
                  if (srcMgrs == null)
                  {
                      System.out.println("CHILD NODES in mixing_c ");
                      srcMgrs = RTPManager.newInstance();
                      srclocalAddr = new SessionAddress(
                              InetAddress.getByName(InetAddress.getLocalHost().getHostName()), 3000);
                      srcMgrs.initialize(srclocalAddr);
                      // addTarget() takes a SessionAddress, not a (host, port) pair
                      srcMgrs.addTarget(new SessionAddress(
                              InetAddress.getByName("HOst name of sstm"), 3000));
                      srcMgrs.addReceiveStreamListener(this);
                      srcMgrs.addSessionListener(this);
                      System.out.println("initiated listeners in mixing_c........");
                  }
                  process();
              }
              catch (Exception e)
              {
                  System.out.println("Error in Mixing Add Child Nodes11");
                  System.out.println("Error" + e);
              }
          }

          public synchronized void update(SessionEvent evt)
          {
              System.out.println("Come inside Session event");
          }

          public void process()
          {
              try
              {
                  Vector info = CaptureDeviceManager.getDeviceList(null);
                  if (info == null)
                  {
                      System.out.println("No Capture devices known to JMF");
                  }
                  else
                  {
                      CaptureDeviceInfo deviceInfo = (CaptureDeviceInfo) info.elementAt(0);
                      MediaLocator locator = deviceInfo.getLocator();
                      System.out.println("Got Locator:" + locator);
                      audioInputSource = Manager.createDataSource(locator);
                  }

                  System.out.println("started to capture........");
                  System.out.println("Mixing Configuration in Process");

                  Format[] formats = new Format[1];
                  formats[0] = new AudioFormat(AudioFormat.GSM_RTP, 8000, 16, 1);

                  audioProcessor = Manager.createRealizedProcessor(new ProcessorModel(
                          audioInputSource, formats,
                          new FileTypeDescriptor(FileTypeDescriptor.RAW)));

                  processedAudioSource = audioProcessor.getDataOutput();

                  sendStream = srcMgrs.createSendStream(processedAudioSource, 0);
                  sendStream.start();

                  audioProcessor.start();
                  System.out.println("Transferring to peers");
              }
              catch (Exception e)
              {
                  System.out.println("Error in Mixing Process");
                  System.out.println("Error" + e);
              }
          }

          public static void main(String[] ar)
          {
              Mixing_c m = new Mixing_c();
          }

          public synchronized void update(ReceiveStreamEvent evt)
          {
              System.out.println("in receive stream evt...");
              try
              {
                  if (evt instanceof RemotePayloadChangeEvent)
                  {
                      System.err.println(" - Received an RTP PayloadChangeEvent.");
                      System.exit(0);
                  }
                  else if (evt instanceof NewReceiveStreamEvent)
                  {
                      System.out.println("inside Receive event of play");
                      RTPManager mgr = (RTPManager) evt.getSource();
                      Participant participant = evt.getParticipant();
                      ReceiveStream stream = evt.getReceiveStream();
                      audioInputSource1 = stream.getDataSource();

                      Player audioPlayer = Manager.createRealizedPlayer(audioInputSource1);
                      System.out.println("receiver thread started");
                      audioPlayer.start();
                  }
              }
              catch (Exception e)
              {
                  System.out.println("Error in Play Receive Event");
                  System.out.println("Error" + e);
              }
          }
      }
        • 1. Re: AUDIO CONFERENCE USING JMF
          captfoss
          You can't just transmit the output of a MergingDataSource without transcoding it into an RTP format first...
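          A minimal sketch of that idea (illustrative only, not code from the thread: `toRtpSource` is a hypothetical helper, JMF is assumed to be on the classpath, and the GSM_RTP parameters simply mirror the code above):

          ```java
          // Sketch: transcode a merged DataSource into an RTP-ready stream
          // before handing it to RTPManager.createSendStream().
          import javax.media.*;
          import javax.media.format.AudioFormat;
          import javax.media.protocol.ContentDescriptor;
          import javax.media.protocol.DataSource;

          class RtpTranscode {
              static DataSource toRtpSource(DataSource merged) throws Exception {
                  Format[] formats = { new AudioFormat(AudioFormat.GSM_RTP, 8000, 16, 1) };
                  // RAW_RTP (not RAW) tells the Processor to packetize for RTP
                  Processor p = Manager.createRealizedProcessor(new ProcessorModel(
                          merged, formats,
                          new ContentDescriptor(ContentDescriptor.RAW_RTP)));
                  p.start();
                  return p.getDataOutput();   // feed this to createSendStream(...)
              }
          }
          ```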
          • 2. Re: AUDIO CONFERENCE USING JMF
            843867
            How do I transcode it to an RTP format? What should I do now?
            • 3. Re: AUDIO CONFERENCE USING JMF
              captfoss
              840864 wrote:
              how to transcode it to RTP format?... Now what should i do..
              So, basically what you're saying is that "your" code is actually hacked-together pieces of sample code, done in complete and utter ignorance of what any of the code does?

              http://www.cs.odu.edu/~cs778/spring04/lectures/jmfsolutions/examplesindex.html

              Look at the RTP transmitting examples and figure out what they're doing that you aren't...
              • 4. Re: AUDIO CONFERENCE USING JMF
                843867
                Thanks for your reply. Is this what you are asking me to do? But it didn't work; I get the same result when I execute this:


                dataSource = Manager.createMergingDataSource((DataSource[]) sources);

                audioProcessor1 = Manager.createRealizedProcessor(new ProcessorModel(
                        dataSource, formats, new FileTypeDescriptor(FileTypeDescriptor.RAW)));

                processedAudioSource1 = audioProcessor1.getDataOutput();
                audioProcessor1.start();

                sendStream = srcMgrs.createSendStream(processedAudioSource1, 0);
                sendStream.start();

                Player audioPlayer = Manager.createRealizedPlayer(processedAudioSource1);
                System.out.println("receiver thread started");
                audioPlayer.start();
                • 5. Re: AUDIO CONFERENCE USING JMF
                  EJP
                  Please use the { code } tags and proper indentation when you post code here. Posting it unreadably like that greatly reduces your chances of anybody reading it.
                  • 6. Re: AUDIO CONFERENCE USING JMF
                    843867
                    In File1 I modified the last 18 lines as follows, but I get the same result:


                    //code starts here...


                    dataSource = Manager.createMergingDataSource((DataSource[]) sources);

                    audioProcessor1 = Manager.createRealizedProcessor(new ProcessorModel(
                            dataSource, formats, new FileTypeDescriptor(FileTypeDescriptor.RAW)));

                    processedAudioSource1 = audioProcessor1.getDataOutput();
                    audioProcessor1.start();

                    sendStream = srcMgrs.createSendStream(processedAudioSource1, 0);
                    sendStream.start();

                    Player audioPlayer = Manager.createRealizedPlayer(processedAudioSource1);
                    System.out.println("receiver thread started");
                    audioPlayer.start();
                    }
                    catch (Exception e)
                    {
                        System.out.println("Error in Play Receive Event");
                        System.out.println("Error" + e);
                    }
                    }
                    }

                    What else should I do?

                    Edited by: 840864 on Mar 11, 2011 10:13 AM
                    • 8. Re: AUDIO CONFERENCE USING JMF
                      843867
                      Is it enough to use an audio processor as I posted in the previous reply, or should I use something else?

                      Also, I give a HOST NAME when adding a system to the TARGET list. Instead of a host name I can use an IP address, but I only know the local (private) IPs of the systems, e.g. 192.168.1.23.

                      How can I use this over the Internet? How can I identify the public IPs of those systems?
                      • 9. Re: AUDIO CONFERENCE USING JMF
                        captfoss
                        840864 wrote:
                        Is this you are asking me to do?...
                        No, I asked you to read the RTP transmitting examples I gave you. You obviously did not, because they don't use ProcessorModels, and they sure as hell don't do something as blatantly ignorant as transcoding into a RAW file type for an RTP transmission...
                        audioProcessor1 =Manager.createRealizedProcessor(new ProcessorModel(dataSource,formats,new FileTypeDescriptor(FileTypeDescriptor.RAW)));
                        • 10. Re: AUDIO CONFERENCE USING JMF
                          843867
                          I read that, but I find it difficult to understand. It uses a DataSink, and it also uses video formats. I need only audio streams, so I should edit the program to use only audio streams.

                          Could you help me, please?
                          • 11. Re: AUDIO CONFERENCE USING JMF
                            captfoss
                            it uses DataSink and
                            Have you not already grasped that JMF is modular and you can swap out the pieces?
                            it also use video formats... i need to use only audio streams.. so i should edit the program to only use the audio streams..
                            Yes, that's exactly what you need to do...
                            could you help me pls?..
                            No. You either lack the technical expertise in programming in general or Java in specific for anyone to be able to help you...
                            for (int i = 0; i < tracks.length; i++) {
                                Format format = tracks[i].getFormat();
                                if (tracks[i].isEnabled() && format instanceof VideoFormat) {

                                    // Found a video track. Try to program it to output JPEG/RTP.
                                    // Make sure the sizes are multiples of 8.
                                    Dimension size = ((VideoFormat) format).getSize();
                                    float frameRate = ((VideoFormat) format).getFrameRate();
                                    int w = (size.width % 8 == 0 ? size.width : (size.width / 8) * 8);
                                    int h = (size.height % 8 == 0 ? size.height : (size.height / 8) * 8);
                                    VideoFormat jpegFormat = new VideoFormat(VideoFormat.JPEG_RTP,
                                            new Dimension(w, h), Format.NOT_SPECIFIED,
                                            Format.byteArray, frameRate);
                                    tracks[i].setFormat(jpegFormat);
                                } else {
                                    tracks[i].setEnabled(false);
                                }
                            }
                            The above code loops through all of the tracks on the Processor, and turns them on / configures them to transmit as an RTP type if they're a video track, and turns them off otherwise... modify it to check if they're audio formats, and configure them accordingly.
                            
                            You'll also want to make sure that the content descriptor of your Processor is set to RAW_RTP, not RAW.
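                            An audio-only version of that loop might look like the following sketch (assuming the same `tracks` array from `processor.getTrackControls()`; the GSM_RTP parameters just mirror the format used earlier in the thread):

                            ```java
                            // Sketch: enable only audio tracks and program them to an RTP audio payload.
                            // Assumes "tracks" is the TrackControl[] from processor.getTrackControls().
                            for (int i = 0; i < tracks.length; i++) {
                                Format format = tracks[i].getFormat();
                                if (tracks[i].isEnabled() && format instanceof AudioFormat) {
                                    // Found an audio track: ask for a GSM/RTP payload
                                    AudioFormat gsmFormat =
                                            new AudioFormat(AudioFormat.GSM_RTP, 8000, 16, 1);
                                    if (tracks[i].setFormat(gsmFormat) == null) {
                                        // Codec chain can't produce GSM/RTP from this track; drop it
                                        tracks[i].setEnabled(false);
                                    }
                                } else {
                                    tracks[i].setEnabled(false);
                                }
                            }
                            // ...and, as noted above, set the content descriptor to RAW_RTP:
                            // processor.setContentDescriptor(
                            //         new ContentDescriptor(ContentDescriptor.RAW_RTP));
                            ```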
                            • 12. Re: AUDIO CONFERENCE USING JMF
                              843867
                              Thank you for your reply.

                              I didn't study JMF theoretically from the start, and I don't have time to begin from scratch, so I searched the Internet and understood only what my program does. I didn't actually know what the Processor is or what it is for. After reading your previous post I studied it, and I'm still studying. Thanks for making me learn these concepts; I'm certainly better off than before.

                              You are right. I modified the code you posted, and now I can send the mixed stream to another system. It works fine.

                              But now the problem is this: for my application I have to send and play the mixed stream simultaneously. Sending the mixed stream is no problem, but when I try to create a Player from the same DataSource it throws NoPlayerException com.sun.media.multiplexer.RawBufferMux$RawBufferDataSource.

                              I understand this is because I changed the tracks of the audio stream to RAW_RTP as you said. So I tried creating clones and using one clone each for sending and playing, but that doesn't work.

                              What should I do now? Please help.
                              • 13. Re: AUDIO CONFERENCE USING JMF
                                captfoss
                                When you use a merging DataSource, it'll do one of two things...
                                1) Merge two streams of different formats
                                2) Mix two streams of the same format

                                You don't want to merge them, you want to mix them... therefore you need to make sure that they're the same format before you merge them...

                                How do you do that? By transcoding one of them into the format of the other before you merge them.

                                Which one should you transcode? Probably the RTP stream since it's unusable without transcoding anyway... so just set it to be transcoded into the format that you're capturing your audio stream as.
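                                A sketch of that last step (illustrative only: `matchCaptureFormat` is a hypothetical helper, `rtpSource` would be the ReceiveStream's DataSource, and the LINEAR parameters must be adjusted to whatever format your capture device actually delivers):

                                ```java
                                // Sketch: transcode the received RTP stream into the same linear
                                // format as the captured audio, so the MergingDataSource mixes
                                // the streams rather than merging them.
                                import javax.media.*;
                                import javax.media.format.AudioFormat;
                                import javax.media.protocol.ContentDescriptor;
                                import javax.media.protocol.DataSource;

                                class MixPrep {
                                    static DataSource matchCaptureFormat(DataSource rtpSource)
                                            throws Exception {
                                        // Assumed capture format: 8 kHz, 16-bit, mono linear PCM
                                        Format[] formats =
                                                { new AudioFormat(AudioFormat.LINEAR, 8000, 16, 1) };
                                        Processor p = Manager.createRealizedProcessor(
                                                new ProcessorModel(rtpSource, formats,
                                                new ContentDescriptor(ContentDescriptor.RAW)));
                                        p.start();
                                        // Now the same format as the capture source, so the two
                                        // can be handed to Manager.createMergingDataSource()
                                        return p.getDataOutput();
                                    }
                                }
                                ```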