How to stream data from MediaCodec to AudioTrack with Xamarin for Android


I am trying to decode an mp3 file and stream it to an AudioTrack. Everything works, but it causes a lot of GC on the Java side. I have made sure I don't allocate any memory in my play/stream loop, and I suspect the ByteBuffer.Get(byte[], int, int) binding of allocating a temporary Java array. Can anyone confirm this and/or show a better way of feeding the data from MediaCodec to an AudioTrack? (I know API 21 introduced AudioTrack.write(ByteBuffer, ...).) Thanks.

Here is what I do:

byte[] audioBuffer = new byte[...];

...

ByteBuffer codecOutputBuffer = codecOutputBuffers[outputIndex];

// The next line seems to be the source of a lot of GC during playback
codecOutputBuffer.Get(audioBuffer, 0, bufInfo.Size);

audioTrack.Write(audioBuffer, 0, bufInfo.Size);
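For reference, the API 21+ route mentioned above avoids the intermediate byte[] copy entirely: MediaCodec.getOutputBuffer(int) returns a ByteBuffer that can be handed straight to AudioTrack.write(ByteBuffer, int, int). Below is a minimal sketch of that path, shown in Java; the class and method names (Api21Drain, drainOutputBuffer) are mine, the decoder/track/bufferInfo arguments are assumed to be set up as elsewhere in this post, and the same members should also be reachable through the Xamarin.Android bindings.

import java.nio.ByteBuffer;

import android.media.AudioTrack;
import android.media.MediaCodec;

final class Api21Drain {

    // Writes one decoded output buffer straight to the AudioTrack, with no intermediate byte[].
    // Needs API 21+ for MediaCodec.getOutputBuffer(int) and the ByteBuffer overload of write().
    static void drainOutputBuffer(MediaCodec decoder, AudioTrack track,
                                  MediaCodec.BufferInfo bufferInfo, int outputIndex) {
        ByteBuffer output = decoder.getOutputBuffer(outputIndex);
        if (output != null && bufferInfo.size > 0) {
            // WRITE_BLOCKING preserves the back-pressure behaviour of the byte[] overload.
            track.write(output, bufferInfo.size, AudioTrack.WRITE_BLOCKING);
        }
        decoder.releaseOutputBuffer(outputIndex, false);
    }
}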

Update 1: I used the allocation tracker to confirm the allocation site. The allocated objects turn out to be 8 KB byte arrays. Unfortunately, the allocation tracker does not show an allocation-site stack trace for them:

#  Size  Allocated Class                       Thread  Allocated In (Class / Method)
1  32    org.apache.harmony.dalvik.ddmc.Chunk  6       org.apache.harmony.dalvik.ddmc.DdmServer / dispatch
2  16    java.lang.Integer                     6       java.lang.Integer / valueOf
3  16    byte[]                                6
4  8192  byte[]                                20
5  8192  byte[]                                20
6  8192  byte[]                                20

To confirm that it really is ByteBuffer.Get(byte[], int, int) that allocates the arrays, I re-ran the app with:

  1. audioTrack.Write(...) commented out - no change

  2. codecOutputBuffer.Get(audioBuffer, 0, bufInfo.Size) commented out - the allocations are gone

I will rewrite this in Java to check whether I get the same results in a native app.

Update 2: I have rewritten the code in Java and now get a perfectly flat graph in the memory monitor - no allocations during playback.

My conclusion/guess is that the ByteBuffer.Get(byte[], int, int) binding from Mono to Java allocates a temporary array. I'm not really sure why it is 8 KB, since my audioBuffer only ever grows to slightly above 4 KB.

UPDATE 3: My ultimate goal is a cross-platform app (with more complex functionality than an mp3 player on top), so I went ahead and created another experiment. I now have a Java component with the audio streaming/decoding/playing functionality that exposes only play() and pause() methods, which I consume from C#. This way I don't have the allocation problem but can still drive the player from my hopefully reusable C# code. Source code below (this is just research, not production code).

Java:

import android.content.Context;
import android.content.res.AssetFileDescriptor;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;    
import java.nio.ByteBuffer;

public class AudioPlayer {

    public void play(Context aContext, final int resourceId){

        final Context context = aContext;

        new Thread()
        {
            @Override
            public void run() {

                try {
                    AssetFileDescriptor fd = context.getResources().openRawResourceFd(resourceId);

                    MediaExtractor extractor = new MediaExtractor();
                    extractor.setDataSource(fd.getFileDescriptor(), fd.getStartOffset(), fd.getLength());
                    extractor.selectTrack(0);

                    MediaFormat trackFormat = extractor.getTrackFormat(0);

                    MediaCodec decoder = MediaCodec.createDecoderByType(trackFormat.getString(MediaFormat.KEY_MIME));
                    decoder.configure(trackFormat, null, null, 0);

                    decoder.start();
                    ByteBuffer[] decoderInputBuffers = decoder.getInputBuffers();
                    ByteBuffer[] decoderOutputBuffers = decoder.getOutputBuffers();

                    // Block (timeout -1) until the decoder provides an input buffer to fill.
                    int inputIndex = decoder.dequeueInputBuffer(-1);
                    ByteBuffer inputBuffer = decoderInputBuffers[inputIndex];
                    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
                    byte[] audioBuffer = null;
                    AudioTrack audioTrack = null;

                    // Feed compressed samples from the extractor and drain decoded PCM until the stream ends.
                    int read = extractor.readSampleData(inputBuffer, 0);
                    while (read > 0) {
                        decoder.queueInputBuffer(inputIndex, 0, read, extractor.getSampleTime(), 0);

                        extractor.advance();

                        int outputIndex = decoder.dequeueOutputBuffer(bufferInfo, -1);
                        if (outputIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {

                            trackFormat = decoder.getOutputFormat();

                        } else if (outputIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {

                            // The codec can swap its output buffer set (pre-API 21 behaviour); refresh the array.
                            decoderOutputBuffers = decoder.getOutputBuffers();

                        } else if (outputIndex >= 0) {

                            if (bufferInfo.size > 0) {

                                ByteBuffer outputBuffer = decoderOutputBuffers[outputIndex];
                                if (audioBuffer == null || audioBuffer.length < bufferInfo.size) {
                                    audioBuffer = new byte[bufferInfo.size];
                                }

                                // Copy the decoded PCM out of the codec buffer into the reused byte array.
                                outputBuffer.rewind();
                                outputBuffer.get(audioBuffer, 0, bufferInfo.size);
                                decoder.releaseOutputBuffer(outputIndex, false);

                                if (audioTrack == null) {
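                                    // The output format is now known, so create and start the AudioTrack lazily.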
                                    int sampleRateInHz = trackFormat.getInteger(MediaFormat.KEY_SAMPLE_RATE);
                                    int channelCount = trackFormat.getInteger(MediaFormat.KEY_CHANNEL_COUNT);
                                    int channelConfig = channelCount == 1 ? AudioFormat.CHANNEL_OUT_MONO : AudioFormat.CHANNEL_OUT_STEREO;

                                    audioTrack = new AudioTrack(
                                            AudioManager.STREAM_MUSIC,
                                            sampleRateInHz,
                                            channelConfig,
                                            AudioFormat.ENCODING_PCM_16BIT,
                                            AudioTrack.getMinBufferSize(sampleRateInHz, channelConfig, AudioFormat.ENCODING_PCM_16BIT) * 2,
                                            AudioTrack.MODE_STREAM);

                                    audioTrack.play();
                                }

                                audioTrack.write(audioBuffer, 0, bufferInfo.size);
                            }
                        }

                        inputIndex = decoder.dequeueInputBuffer(-1);
                        inputBuffer = decoderInputBuffers[inputIndex];

                        read = extractor.readSampleData(inputBuffer, 0);
                    }
                } catch (Exception e) {
                    // Errors are swallowed in this research code; a real player should log or surface them.
                }
            }
        }.start();    
    }    
}
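Update 3 mentions a pause() method on the Java component that the research snippet above does not show. A minimal, hypothetical sketch of one way to support it follows (PlaybackGate and its method names are my own): the decode thread would call awaitResumed() before each audioTrack.write(...), and the player's public pause()/resume() would call the matching gate methods as well as AudioTrack.pause()/play() on the track created inside the loop.

// Hypothetical helper, not part of the original code.
final class PlaybackGate {

    private boolean paused;

    synchronized void pause() {
        paused = true;
    }

    synchronized void resume() {
        paused = false;
        notifyAll();
    }

    // Blocks the calling (decode) thread while paused; returns once resume() is called.
    synchronized void awaitResumed() throws InterruptedException {
        while (paused) {
            wait();
        }
    }
}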

C#:

[Activity(Label = "AndroidAudioTest", MainLauncher = true, Icon = "@drawable/icon")]
public class MainActivity : Activity
{
    protected override void OnCreate(Bundle bundle)
    {
        base.OnCreate(bundle);

        SetContentView(Resource.Layout.Main);

        var play = FindViewById<Button>(Resource.Id.Play);
        play.Click += (s, e) =>
        {
            new AudioPlayer().Play(this, Resource.Raw.PianoInsideMics);
        };
    }
}