
Streaming live Android audio to a server



This article covers streaming live Android audio to a server in detail, and along the way answers four related questions: android – how to get the duration of an audio stream and continue it from any point; android – streaming audio to a device via MediaRouter; android – playing a live audio stream with minimal latency; and android – how to stream a file to a server with something like Retrofit.


Streaming live Android audio to a server

I am currently trying to stream live microphone audio from an Android device to a Java program. I started by sending the live audio between two Android devices to confirm my approach was correct; the audio could be heard perfectly, with almost no delay, on the receiving device. Next I sent the same audio stream to a small Java program and verified that the data arrived correctly there as well. What I want to do now is encode this data and somehow play it back on the server running the Java program. I would prefer to play it in a web browser using HTML5 or JavaScript, but alternatives such as VLC would be fine.

Here is the code for the Android app that sends the live microphone audio:

public class MainActivity extends Activity {

    private Button startButton, stopButton;
    public byte[] buffer;
    public static DatagramSocket socket;
    AudioRecord recorder;

    private int sampleRate = 44100;
    private int channelConfig = AudioFormat.CHANNEL_CONFIGURATION_MONO;
    private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
    int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
    private boolean status = true;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        startButton = (Button) findViewById(R.id.start_button);
        stopButton = (Button) findViewById(R.id.stop_button);
        startButton.setOnClickListener(startListener);
        stopButton.setOnClickListener(stopListener);

        minBufSize += 2048;
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }

    private final OnClickListener stopListener = new OnClickListener() {
        @Override
        public void onClick(View arg0) {
            status = false;
            recorder.release();
            Log.d("VS", "Recorder released");
        }
    };

    private final OnClickListener startListener = new OnClickListener() {
        @Override
        public void onClick(View arg0) {
            status = true;
            startStreaming();
        }
    };

    public void startStreaming() {
        Thread streamThread = new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    DatagramSocket socket = new DatagramSocket();
                    Log.d("VS", "Socket Created");

                    byte[] buffer = new byte[minBufSize];
                    Log.d("VS", "Buffer created of size " + minBufSize);

                    InetAddress IPAddress = InetAddress.getByName("192.168.1.5");
                    Log.d("VS", "Address retrieved");

                    recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
                            channelConfig, audioFormat, minBufSize);
                    Log.d("VS", "Recorder initialized");

                    recorder.startRecording();

                    while (status == true) {
                        // read the microphone data into the buffer before sending it
                        int read = recorder.read(buffer, 0, buffer.length);
                        DatagramPacket sendPacket = new DatagramPacket(buffer, read, IPAddress, 50005);
                        socket.send(sendPacket);
                    }
                } catch (UnknownHostException e) {
                    Log.e("VS", "UnknownHostException");
                } catch (IOException e) {
                    Log.e("VS", "IOException");
                    e.printStackTrace();
                }
            }
        });
        streamThread.start();
    }
}

Here is the code for the Java program that reads the data:

class Server {
    public static void main(String args[]) throws Exception {
        DatagramSocket serverSocket = new DatagramSocket(50005);
        byte[] receiveData = new byte[1024];

        while (true) {
            DatagramPacket receivePacket = new DatagramPacket(receiveData, receiveData.length);
            serverSocket.receive(receivePacket);
            String sentence = new String(receivePacket.getData(), 0, receivePacket.getLength());
            System.out.println("RECEIVED: " + sentence);
        }
    }
}

I know I should encode the audio on the app side before sending it to the Java program, but I am not sure how to do that while using AudioRecord. I would rather avoid the NDK, since I have no experience with it and no time to learn it.

Answer 1


So I solved my problem. The issue was mostly on the receiving side: the receiver takes in the audio stream and pushes it out through the PC's speakers. The resulting sound is still quite choppy, but it works; experimenting with the buffer size may improve it.

Edit: you can read the audio on a separate thread to avoid lag. Also, a sample rate of 16,000 is preferable, since it is adequate for speech.

Android code:

package com.example.mictest2;

import java.io.IOException;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.UnknownHostException;

import android.app.Activity;
import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaRecorder;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;

public class Send extends Activity {

    private Button startButton, stopButton;
    public byte[] buffer;
    public static DatagramSocket socket;
    private int port = 50005;
    AudioRecord recorder;

    private int sampleRate = 16000; // 44100 for music
    private int channelConfig = AudioFormat.CHANNEL_CONFIGURATION_MONO;
    private int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
    int minBufSize = AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat);
    private boolean status = true;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        startButton = (Button) findViewById(R.id.start_button);
        stopButton = (Button) findViewById(R.id.stop_button);
        startButton.setOnClickListener(startListener);
        stopButton.setOnClickListener(stopListener);
    }

    private final OnClickListener stopListener = new OnClickListener() {
        @Override
        public void onClick(View arg0) {
            status = false;
            recorder.release();
            Log.d("VS", "Recorder released");
        }
    };

    private final OnClickListener startListener = new OnClickListener() {
        @Override
        public void onClick(View arg0) {
            status = true;
            startStreaming();
        }
    };

    public void startStreaming() {
        Thread streamThread = new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    DatagramSocket socket = new DatagramSocket();
                    Log.d("VS", "Socket Created");

                    byte[] buffer = new byte[minBufSize];
                    Log.d("VS", "Buffer created of size " + minBufSize);

                    DatagramPacket packet;
                    final InetAddress destination = InetAddress.getByName("192.168.1.5");
                    Log.d("VS", "Address retrieved");

                    recorder = new AudioRecord(MediaRecorder.AudioSource.MIC, sampleRate,
                            channelConfig, audioFormat, minBufSize * 10);
                    Log.d("VS", "Recorder initialized");

                    recorder.startRecording();

                    while (status == true) {
                        // reading data from MIC into buffer
                        minBufSize = recorder.read(buffer, 0, buffer.length);

                        // putting buffer in the packet
                        packet = new DatagramPacket(buffer, buffer.length, destination, port);
                        socket.send(packet);
                        System.out.println("MinBufferSize: " + minBufSize);
                    }
                } catch (UnknownHostException e) {
                    Log.e("VS", "UnknownHostException");
                } catch (IOException e) {
                    e.printStackTrace();
                    Log.e("VS", "IOException");
                }
            }
        });
        streamThread.start();
    }
}

Android XML:

<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:paddingBottom="@dimen/activity_vertical_margin"
    android:paddingLeft="@dimen/activity_horizontal_margin"
    android:paddingRight="@dimen/activity_horizontal_margin"
    android:paddingTop="@dimen/activity_vertical_margin"
    tools:context=".MainActivity" >

    <TextView
        android:id="@+id/textView1"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:text="@string/hello_world" />

    <Button
        android:id="@+id/start_button"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_below="@+id/textView1"
        android:layout_centerHorizontal="true"
        android:layout_marginTop="130dp"
        android:text="Start" />

    <Button
        android:id="@+id/stop_button"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignLeft="@+id/button1"
        android:layout_below="@+id/button1"
        android:layout_marginTop="64dp"
        android:text="Stop" />

</RelativeLayout>

Server code:

package com.datagram;

import java.io.ByteArrayInputStream;
import java.net.DatagramPacket;
import java.net.DatagramSocket;

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.FloatControl;
import javax.sound.sampled.SourceDataLine;

class Server {

    AudioInputStream audioInputStream;
    static AudioInputStream ais;
    static AudioFormat format;
    static boolean status = true;
    static int port = 50005;
    static int sampleRate = 44100; // must match the sender (16000 in the app above)

    public static void main(String args[]) throws Exception {
        DatagramSocket serverSocket = new DatagramSocket(port);

        // 1280 for 16000 Hz, 3584 for 44100 Hz; use
        // AudioRecord.getMinBufferSize(sampleRate, channelConfig, audioFormat)
        // on the sender to get the correct size
        byte[] receiveData = new byte[1280];

        format = new AudioFormat(sampleRate, 16, 1, true, false);

        while (status == true) {
            final DatagramPacket receivePacket = new DatagramPacket(receiveData, receiveData.length);
            serverSocket.receive(receivePacket);

            ByteArrayInputStream baiss = new ByteArrayInputStream(receivePacket.getData());
            ais = new AudioInputStream(baiss, format, receivePacket.getLength());

            // playing each packet on its own thread works around the chunky audio
            new Thread(new Runnable() {
                @Override
                public void run() {
                    toSpeaker(receivePacket.getData());
                }
            }).start();
        }
    }

    public static void toSpeaker(byte soundbytes[]) {
        try {
            DataLine.Info dataLineInfo = new DataLine.Info(SourceDataLine.class, format);
            SourceDataLine sourceDataLine = (SourceDataLine) AudioSystem.getLine(dataLineInfo);

            sourceDataLine.open(format);

            FloatControl volumeControl =
                    (FloatControl) sourceDataLine.getControl(FloatControl.Type.MASTER_GAIN);
            // 100.0f is outside MASTER_GAIN's legal range and throws; use the maximum instead
            volumeControl.setValue(volumeControl.getMaximum());

            sourceDataLine.start();
            System.out.println("format? :" + sourceDataLine.getFormat());

            sourceDataLine.write(soundbytes, 0, soundbytes.length);
            sourceDataLine.drain();
            sourceDataLine.close();
        } catch (Exception e) {
            System.out.println("Not working in speakers...");
            e.printStackTrace();
        }
    }
}
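The open/start/drain/close cycle for every UDP packet, plus one thread per packet, is a likely source of the choppiness. A common refinement (not part of the original answer; a sketch under the same 16 kHz mono 16-bit PCM assumptions) is to open a single SourceDataLine once and keep writing every packet into it, letting the line's internal buffer smooth out the gaps:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;

import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.DataLine;
import javax.sound.sampled.SourceDataLine;

public class PersistentLineServer {

    // 16 kHz, 16-bit, mono, signed, little-endian: must match the Android sender
    static final AudioFormat FORMAT = new AudioFormat(16000, 16, 1, true, false);

    // Open the speaker line once and reuse it for every packet, instead of
    // open/start/drain/close per packet as in the answer above.
    static void playForever(int port) throws Exception {
        DatagramSocket socket = new DatagramSocket(port);
        byte[] buf = new byte[1280];
        SourceDataLine line = (SourceDataLine) AudioSystem.getLine(
                new DataLine.Info(SourceDataLine.class, FORMAT));
        line.open(FORMAT, 1280 * 8); // a few packets of internal buffering
        line.start();
        while (true) {
            DatagramPacket p = new DatagramPacket(buf, buf.length);
            socket.receive(p);
            line.write(p.getData(), 0, p.getLength()); // blocks, keeps the line fed
        }
    }

    public static void main(String[] args) {
        // Sanity-check the sizing used above (no audio device needed):
        int frameSize = FORMAT.getFrameSize();                   // bytes per frame
        int bytesPerSecond = (int) (FORMAT.getFrameRate() * frameSize);
        System.out.println("frameSize=" + frameSize);            // 2
        System.out.println("bytesPerSecond=" + bytesPerSecond);  // 32000
        System.out.println("packetMillis=" + 1280 * 1000 / bytesPerSecond); // 40
        // playForever(50005); // uncomment on a machine with speakers
    }
}
```

main only checks the sizing arithmetic, so it runs on a headless machine; each 1280-byte packet carries 40 ms of audio, which is why the line buffer of a few packets is enough.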

I hope this saves someone a few hours of pain :)

android – How to get the duration of an audio stream and continue streaming from any point

Description:

I have the following code for an audio player. I can resume playback from any point by tapping anywhere on the seek bar (0 to mediaPlayer.getDuration()). This works fine for local audio playback.

The problem when streaming audio:

> When I stream an audio file from an internet server (say an s3 bucket), it starts streaming correctly.
> But mediaPlayer.getDuration() and mediaPlayer.getCurrentPosition() return wrong values: at the very start of streaming, mediaPlayer.getCurrentPosition() returns 5 hours.
> Because of this, I cannot resume the stream from a specified position within (0 to stream duration).

Questions:

> How do I get the duration of an audio stream?
> How do I resume an audio stream from a specified position? For example, for a 10-minute file I want to start streaming at minute 6.

Code:

public class MyAudioPlayer extends Activity 
implements OnClickListener{


    MediaPlayer mediaPlayer = null;
    private boolean isPaused=false;
    private boolean isstop = true;

    String filePath = null;
    String productName = null;

    ImageButton btnPlay = null;
    ImageButton btnPause = null;
    ImageButton btnReset = null;
    ImageButton btnStop = null;

    AudioManager audioManager = null;
    SeekBar volControl = null;
    SeekBar progressControl = null;
    TextView progresstext = null;
    long durationInMillis = -1;
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.ltd_audio_player);

        volControl = (SeekBar)findViewById(R.id.player_volume);
        progressControl = (SeekBar)findViewById(R.id.player_seekbar);
        progresstext = (TextView) findViewById(R.id.player_progress_text);

        btnPlay = (ImageButton) findViewById(R.id.ic_player_play); 

        btnPause = (ImageButton) findViewById(R.id.ic_player_pause);  

        btnReset = (ImageButton) findViewById(R.id.ic_player_reset); 

        btnStop = (ImageButton) findViewById(R.id.ic_player_stop);   

        btnPlay.setOnClickListener(this);
        btnPause.setOnClickListener(this);
        btnReset.setOnClickListener(this);
        btnStop.setOnClickListener(this);

        filePath = getIntent().getExtras().getString("localPath");

        this.setPlayer();
        this.resetAndStartPlayer();


    }

    @Override
    protected void onResume() {
        super.onResume();   
        isPaused=false;
        progresstext.postDelayed(onEverySecond,1000);
    }

    @Override
    protected void onPause() {
        super.onPause();

        isPaused=true;
    }
    private void setProgressControl() {
        int maxVolume = mediaPlayer.getDuration();
        int curVolume = mediaPlayer.getCurrentPosition();

        progressControl.setMax(maxVolume);
        progressControl.setProgress(curVolume);
        progressControl.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {

            @Override
            public void onProgressChanged(SeekBar seekbar,int progress,boolean fromUser) {
                mediaPlayer.seekTo(progress);
            }

            @Override
            public void onStartTrackingTouch(SeekBar seekBar) {
                // TODO Auto-generated method stub

            }

            @Override
            public void onStopTrackingTouch(SeekBar seekBar) {
                // TODO Auto-generated method stub

            }
        });     
    }
    @Override
    public void onClick(View v) {
        switch(v.getId()){
        case R.id.ic_player_play:
            if(isstop==true){
                try {
                    mediaPlayer.prepareAsync();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }else{
                mediaPlayer.start();
                isstop = true;
            }
            break;
        case R.id.ic_player_pause:
            mediaPlayer.pause();
            break;
        case R.id.ic_player_reset:
            mediaPlayer.seekTo(0);
            break;
        case R.id.ic_player_stop:
            isstop = true;
            progressControl.setProgress(0);
            mediaPlayer.stop();
            break;
        }

    }
    private void resetAndStartPlayer(){
        try {
            if(filePath!=null){
                mediaPlayer.setDataSource(filePath);
                mediaPlayer.prepareAsync();
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    private void setPlayer(){

        getWindow().setFormat(PixelFormat.UNKNOWN);
        mediaPlayer = new MediaPlayer();    

        mediaPlayer.setOnBufferingUpdateListener(new OnBufferingUpdateListener() {

            @Override
            public void onBufferingUpdate(MediaPlayer mp,int percent) {
                progressControl.setSecondaryProgress((progressControl.getMax()/100)*percent);

            }
        });
        mediaPlayer.setOnPreparedListener(new OnPreparedListener() {

            @Override
            public void onPrepared(MediaPlayer mp) {
                mediaPlayer.start();
                isstop=false;
                durationInMillis = mediaPlayer.getDuration();
                MyAudioPlayer.this.setProgressControl();
            }
        });
        mediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
    }
    @Override
    protected void onDestroy() {
        // TODO Auto-generated method stub
        mediaPlayer.release();
        super.onDestroy();
    }
    protected void setProgresstext() {
        durationInMillis = mediaPlayer.getDuration();
        int curVolume = mediaPlayer.getCurrentPosition();
        long HOUR = 60*60*1000;
        if(progresstext!=null){
            if(durationInMillis>HOUR){
                progresstext.setText(String.format("%1$tH:%1$tM:%1$tS",new Date(curVolume))
                        +" / "+String.format("%1$tH:%1$tM:%1$tS",new Date(durationInMillis)));
            }else{
                progresstext.setText(String.format("%1$tM:%1$tS",new Date(curVolume))
                        +" / "+String.format("%1$tM:%1$tS",new Date(durationInMillis)));
            }
        }       
    }
    private Runnable onEverySecond=new Runnable() {
        public void run() {

            if (mediaPlayer!=null) {
                progressControl.setProgress(mediaPlayer.getCurrentPosition());

                MyAudioPlayer.this.setProgresstext();
            }

            if (!isPaused) {
                progresstext.postDelayed(onEverySecond,1000);
            }
        }
    };
}

The time is displayed above the progress bar.

Time: 'current position' / 'total duration'

Solution

I hope this solves your problem.

1) Duration and progress of the audio stream

I have looked at your code, and there is a significant error in how it computes the time. You create new Date(durationInMillis); Date adds your locale's offset from GMT (GMT+XX hours), which is why you get 5 hours at the start of streaming. You should use the following method to compute currentProgress/duration:

protected void setProgresstext() {

    final int HOUR = 60*60*1000;
    final int MINUTE = 60*1000;
    final int SECOND = 1000;

    int durationInMillis = mediaPlayer.getDuration();
    int curVolume = mediaPlayer.getCurrentPosition();

    int durationHour = durationInMillis/HOUR;
    int durationMint = (durationInMillis%HOUR)/MINUTE;
    int durationSec = (durationInMillis%MINUTE)/SECOND;

    int currentHour = curVolume/HOUR;
    int currentMint = (curVolume%HOUR)/MINUTE;
    int currentSec = (curVolume%MINUTE)/SECOND;

    if(durationHour>0){
        System.out.println(" 1 = "+String.format("%02d:%02d:%02d/%02d:%02d:%02d",currentHour,currentMint,currentSec,durationHour,durationMint,durationSec));            
    }else{
        System.out.println(" 1 = "+String.format("%02d:%02d/%02d:%02d",currentMint,currentSec,durationMint,durationSec));
    }
}
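Both the time-zone bug and the fixed arithmetic can be reproduced in plain Java. The GMT+5 zone and the 6 min 5 s position below are made-up values for illustration:

```java
import java.util.Date;
import java.util.TimeZone;

public class ProgressFormatDemo {

    static final int HOUR = 60 * 60 * 1000;
    static final int MINUTE = 60 * 1000;
    static final int SECOND = 1000;

    public static void main(String[] args) {
        // Pretend the device sits in a GMT+5 time zone, as in the question.
        TimeZone.setDefault(TimeZone.getTimeZone("GMT+5"));

        int position = 6 * MINUTE + 5 * SECOND; // pretend getCurrentPosition()

        // Buggy: Date treats the millis as an instant in time and renders it
        // in the local zone, so 00:06:05 shows up as 05:06:05.
        System.out.println(String.format("%1$tH:%1$tM:%1$tS", new Date(position)));

        // Fixed: plain integer arithmetic, no time zone involved.
        System.out.println(String.format("%02d:%02d:%02d",
                position / HOUR, (position % HOUR) / MINUTE, (position % MINUTE) / SECOND));
    }
}
```

Running it prints 05:06:05 for the buggy version and 00:06:05 for the fix, which matches the mysterious "5 hours" the question describes.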

2) Scrubbing the stream.

MediaPlayer does allow scrubbing an audio stream. I have implemented it in one of my projects, but it takes some time: resuming the audio stream from another position needs a while to buffer.
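For question 2 (resuming a 10-minute file at minute 6), the target position is just the offset converted to milliseconds. A minimal sketch, with the MediaPlayer call left as a comment since it needs a prepared player:

```java
public class SeekDemo {
    public static void main(String[] args) {
        int resumeAtMinutes = 6;
        int seekMillis = resumeAtMinutes * 60 * 1000;
        System.out.println(seekMillis); // 360000

        // On a prepared MediaPlayer: mediaPlayer.seekTo(seekMillis);
        // For a progressively downloaded stream this only succeeds once enough
        // of the file has buffered, which is why resuming takes some time.
    }
}
```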

android – Streaming audio to a device via MediaRouter

I want to stream an audio file from my Android device to another one via the MediaRouter class. As far as I understand, I need

mediaRouter.addCallback(MediaRouter.ROUTE_TYPE_LIVE_AUDIO, mCallback);

to listen for whether a connected device is playing audio, but I am not sure. My question is: how do I get hold of the audio stream?

Solution

If you are playing the audio with AudioTrack, you need to make sure its streamType matches the route's streamType. To get the route's streamType, simply do:

private final MediaRouter.Callback mMediaRouterCallback =
    new MediaRouter.Callback() {

        @Override
        public void onRouteSelected(MediaRouter router, MediaRouter.RouteInfo route) {
            Log.i(TAG, "streamType = " + route.getPlaybackStream());
        }
};

Usually it will be AudioManager.STREAM_MUSIC, so when you initialize the AudioTrack make sure that:

_audioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, channelConfig,
        audioFormat, bufferSize, mode);

android – Playing a live audio stream with minimal latency

We are building a kind of intercom system. We need to play a live audio stream in an Android application over RTSP or HTTP with minimal latency. The standard approach of MediaPlayer.setDataSource(URL) introduces too much delay (about 2-3 seconds); we are on Android 2.2. As far as I understand, the media player's buffer size can only be set at the firmware level. Can you give me some advice on how to achieve this, or should I dig into real VoIP?

Solution:

I found a flexible solution: use the AudioTrack API. Another interesting article about the audio APIs available in Android: http://www.wiseandroid.com/post/2010/07/13/Intro-to-the-three-Android-Audio-APIs.aspx
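As a rough back-of-envelope calculation (my own estimate, not from the answer), the latency contributed by an AudioTrack playback buffer is its size divided by the byte rate, so a small buffer at a voice sample rate stays around a hundred milliseconds:

```java
public class LatencyMath {
    public static void main(String[] args) {
        int sampleRate = 16000;     // Hz, enough for voice
        int bytesPerSample = 2;     // 16-bit PCM
        int channels = 1;           // mono
        int bufferSizeBytes = 4096; // illustrative AudioTrack buffer size

        int bytesPerSecond = sampleRate * bytesPerSample * channels;
        int bufferLatencyMs = bufferSizeBytes * 1000 / bytesPerSecond;
        System.out.println("bufferLatencyMs=" + bufferLatencyMs); // 128

        // Network jitter and any decoding add on top of this figure, which is
        // why MediaPlayer's multi-second internal buffer dominates in practice.
    }
}
```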

android – How to stream a file to a server with something like Retrofit?

I can upload files to a server without trouble using Retrofit's multipart annotations.

What I want, however, is to stream a file that is still being written to disk up to the server. Is that possible with Retrofit?

Solution:

Retrofit uses okhttp3.RequestBody for multipart requests, so you have to create your own RequestBody that generates the data.

Here is an example taken from the OkHttp recipes (https://github.com/square/okhttp/wiki/Recipes#post-streaming):

public static final MediaType MEDIA_TYPE_MARKDOWN
  = MediaType.parse("text/x-markdown; charset=utf-8");

RequestBody requestBody = new RequestBody() {
  @Override public MediaType contentType() {
    return MEDIA_TYPE_MARKDOWN;
  }

  @Override public void writeTo(BufferedSink sink) throws IOException {
    sink.writeUtf8("Numbers\n");
    sink.writeUtf8("-------\n");
    for (int i = 2; i <= 997; i++) {
      sink.writeUtf8(String.format(" * %s = %s\n", i, factor(i)));
    }
  }

  private String factor(int n) {
    for (int i = 2; i < n; i++) {
      int x = n / i;
      if (x * i == n) return factor(x) + " × " + i;
    }
    return Integer.toString(n);
  }
};
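The key point of the recipe is that writeTo produces the body on demand as OkHttp drains it, so the full content never has to exist before the upload starts. The same push-style generation in plain stdlib Java (a Writer standing in for okio's BufferedSink, with a trimmed range and a plain "x" for the multiplication sign):

```java
import java.io.IOException;
import java.io.StringWriter;
import java.io.Writer;

public class StreamingBodyDemo {

    // Mirrors RequestBody.writeTo above: content is generated while it is
    // being written, never materialized up front.
    static void writeTo(Writer sink) throws IOException {
        sink.write("Numbers\n");
        sink.write("-------\n");
        for (int i = 2; i <= 6; i++) {
            sink.write(String.format(" * %s = %s\n", i, factor(i)));
        }
    }

    static String factor(int n) {
        for (int i = 2; i < n; i++) {
            int x = n / i;
            if (x * i == n) return factor(x) + " x " + i;
        }
        return Integer.toString(n);
    }

    public static void main(String[] args) throws IOException {
        StringWriter out = new StringWriter(); // stands in for the HTTP socket
        writeTo(out);
        System.out.print(out);
    }
}
```

With Retrofit, an instance of such a RequestBody can be passed to a service method parameter annotated with @Part or @Body, exactly as with any other body.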
