Android Audio Visualization: Visualizer

Posted by 立花泷える宫水三叶



Audio Visualization with Visualizer

Demo

Visualizer

From the official documentation:

The Visualizer class enables application to retrieve part of the currently playing audio for visualization purpose. It is not an audio recording interface and only returns partial and low quality audio content. However, to protect privacy of certain audio data (e.g voice mail) the use of the visualizer requires the permission android.permission.RECORD_AUDIO.

The audio session ID passed to the constructor indicates which audio content should be visualized:

  • If the session is 0, the audio output mix is visualized
  • If the session is not 0, the audio from a particular MediaPlayer or AudioTrack using this audio session is visualized

Two types of representation of audio content can be captured:

  • Waveform data: consecutive 8-bit (unsigned) mono samples by using the getWaveForm(byte[]) method
  • Frequency data: 8-bit magnitude FFT by using the getFft(byte[]) method

The length of the capture can be retrieved or specified by calling respectively the getCaptureSize() and setCaptureSize(int) methods. The capture size must be a power of 2 in the range returned by getCaptureSizeRange().

In addition to the polling capture mode described above with the getWaveForm(byte[]) and getFft(byte[]) methods, a callback mode is also available by installing a listener by use of the setDataCaptureListener(android.media.audiofx.Visualizer.OnDataCaptureListener, int, boolean, boolean) method. The rate at which the listener capture method is called as well as the type of data returned is specified.

Before capturing data, the Visualizer must be enabled by calling the setEnabled(boolean) method. When data capture is not needed any more, the Visualizer should be disabled.

It is good practice to call the release() method when the Visualizer is not used anymore to free up native resources associated to the Visualizer instance.

Creating a Visualizer on the output mix (audio session 0) requires permission Manifest.permission.MODIFY_AUDIO_SETTINGS

The Visualizer class can also be used to perform measurements on the audio being played back. The measurements to perform are defined by setting a mask of the requested measurement modes with setMeasurementMode(int). Supported values are MEASUREMENT_MODE_NONE to cancel any measurement, and MEASUREMENT_MODE_PEAK_RMS for peak and RMS monitoring. Measurements can be retrieved through getMeasurementPeakRms(android.media.audiofx.Visualizer.MeasurementPeakRms).
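
Measurement mode is independent of the waveform/FFT capture used in the rest of this article. A minimal sketch of peak/RMS monitoring, assuming a Visualizer that has already been created and enabled on a valid audio session:

    // Ask the Visualizer for peak/RMS measurements (API 19+), then poll them.
    visualizer.setMeasurementMode(Visualizer.MEASUREMENT_MODE_PEAK_RMS);
    Visualizer.MeasurementPeakRms measurement = new Visualizer.MeasurementPeakRms();
    if (visualizer.getMeasurementPeakRms(measurement) == Visualizer.SUCCESS) {
        // Both fields are in millibels (mB, i.e. 1/100 dB) relative to full scale.
        Log.d(TAG, "peak=" + measurement.mPeak + " mB, rms=" + measurement.mRms + " mB");
    }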

Implementation overview

  1. Play the audio with MediaPlayer.
  2. Create a Visualizer object; its constructor takes an audio session ID, obtained from MediaPlayer's getAudioSessionId() method.
  3. As the official documentation above describes, register a setDataCaptureListener callback to capture waveform data or frequency data.
  4. Walk the captured data and draw the graphics from it.

Code walkthrough

  1. First, request the required permissions

    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
    
    RxPermissions rxPermissions = new RxPermissions(this);
    rxPermissions.requestEach(Manifest.permission.RECORD_AUDIO,
            Manifest.permission.WRITE_EXTERNAL_STORAGE, Manifest.permission.READ_EXTERNAL_STORAGE)
            .subscribe(new Consumer<Permission>() {
                @Override
                public void accept(Permission permission) throws Exception {
                    if (permission.granted) {
                        Log.d(TAG, "accept: true");
                    } else if (permission.shouldShowRequestPermissionRationale) {
                        finish();
                    } else {
                        finish();
                    }
                }
            });
    

    RxPermissions dependency:

    implementation 'com.tbruyelle.rxpermissions2:rxpermissions:0.9.5'
    
  2. Play the audio file with MediaPlayer

    mMediaPlayer = MediaPlayer.create(this, R.raw.daoxiang);
    mMediaPlayer.setOnErrorListener(null);
    mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mediaPlayer) {
            mediaPlayer.setLooping(true); // loop playback
        }
    });
    mMediaPlayer.start();
    
  3. Get the audioSessionId

    int audioSessionId = mediaPlayer.getAudioSessionId();
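
    As the docs quoted earlier note, passing session 0 instead visualizes the global output mix; a sketch (this additionally requires Manifest.permission.MODIFY_AUDIO_SETTINGS):

    // Hypothetical alternative: attach to the output mix rather than one MediaPlayer.
    Visualizer mixVisualizer = new Visualizer(0);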
    
  4. Create the Visualizer object

    visualizer = new Visualizer(audioSessionId);
    // After creating the Visualizer, set the capture size. The valid range is
    // Visualizer.getCaptureSizeRange()[0] ~ Visualizer.getCaptureSizeRange()[1]; the maximum is used here:
    visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
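
    The valid range can also be inspected at runtime before picking a size; a small sketch:

    // getCaptureSizeRange() returns {min, max}; the capture size must be a power of 2 in this range.
    int[] range = Visualizer.getCaptureSizeRange();
    Log.d(TAG, "capture size range: " + range[0] + " ~ " + range[1]);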
    
  5. Register the data capture callback with setDataCaptureListener

    visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
        @Override
        public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate) {
        }

        @Override
        public void onFftDataCapture(Visualizer visualizer, byte[] fft, int samplingRate) {
            float[] model = new float[fft.length / 2 + 1];
            model[0] = Math.abs(fft[0]);                // fft[0] is the real-only DC bin
            model[model.length - 1] = Math.abs(fft[1]); // fft[1] is the real-only n/2 (Nyquist) bin
            for (int i = 2, j = 1; i < fft.length; i += 2, j++) {
                // every other bin is a (real, imaginary) pair; keep its magnitude
                model[j] = (float) Math.hypot(fft[i], fft[i + 1]);
            }
            // model is the final data used for drawing
        }
    }, Visualizer.getMaxCaptureRate() / 2, false, true);
    

    The parameters of setDataCaptureListener are:

    listener: the callback object
    rate: the capture rate, in the range 0 ~ Visualizer.getMaxCaptureRate(); here it is set to half the maximum.
    waveform: whether to capture waveform data
    fft: whether to capture FFT (frequency) data

    The two callbacks in OnDataCaptureListener are:

    onWaveFormDataCapture: delivers waveform data
    onFftDataCapture: delivers FFT data, i.e. frequency-domain data

    Here the FFT data is used for drawing. The byte array delivered to onFftDataCapture is the output of the fast Fourier transform, but it still needs some post-processing:

    With the capture size set above to Visualizer.getCaptureSizeRange()[1], i.e. 1024 sample points, each block of 1024 real-valued samples is run through the FFT, producing 1024 complex points. Because of conjugate symmetry, the first 512 points mirror the last 512, so only the first 513 points (including point 0) are kept.

    Of these, point 0 and point 512 are purely real; the 511 points in between are complex.

    onFftDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate)
    

    The FFT data arrives as bytes in a byte[1024]. It holds 1 + 1 + (1024 - 2) / 2 = 513 meaningful FFT values in total: the DC bin and the n/2 (Nyquist) bin each occupy a single slot, while every other frequency bin occupies two slots, real part + imaginary part.

    The frequency range covered is 0 ~ samplingRate / 2, i.e. 0 ~ 22.05 kHz.

    In other words, the 513 frequency bins are spread over [0 Hz, 22.05 kHz].

    The spacing between adjacent bins (in mHz) = samplingRate / captureSize = 44,100,000 / 1024 ≈ 43,066 mHz ≈ 43.07 Hz. That spacing is the frequency resolution; frequency differences smaller than it cannot be resolved. (Note that the samplingRate passed to the callback is in millihertz.)

    Sampling rate: the number of points sampled from the audio stream per second.

    frequencyEach = samplingRate / visualizer.getCaptureSize();  // ≈ 43,066 mHz; samplingRate = 44,100,000 mHz, getCaptureSize() = 1024
    
    float[] model = new float[fft.length / 2 + 1];
    // the returned bytes are signed, so take absolute values / magnitudes:
    model[0] = Math.abs(fft[0]);                // DC bin (real only)
    model[model.length - 1] = Math.abs(fft[1]); // Nyquist bin (real only)
    for (int i = 2, j = 1; i < fft.length; i += 2, j++) {
        model[j] = (float) Math.hypot(fft[i], fft[i + 1]); // magnitude of (real, imaginary) pair
    }
    // model is the final data used for drawing
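
    Using the bin spacing derived above, a bin index can be mapped back to a frequency; a small sketch (j, samplingRate and visualizer as in the callback above):

    // Frequency in Hz represented by bin j of model[]; samplingRate is in mHz, hence the / 1000.
    float binHz = j * (samplingRate / 1000f) / visualizer.getCaptureSize();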
    
  6. Enable the Visualizer

    visualizer.setEnabled(true);
    
  7. Draw the graphics:

    public class VisualizeView extends View {
    
        private static final String TAG = "SingleVisualizeView";
    
        /**
         * the count of spectrum
         */
        protected int mSpectrumCount = 60;
        /**
         * the width of every spectrum
         */
        protected float mStrokeWidth;
        /**
         * the color of drawing spectrum
         */
        protected int mColor;
        /**
         * audio data transformed by Math.hypot
         */
        protected float[] mRawAudioBytes;
        /**
         * the margin of adjoin spectrum
         */
        protected float mItemMargin = 12;
    
        protected float mSpectrumRatio = 2;
    
        protected RectF mRect;
        protected Paint mPaint;
        protected Path mPath;
        protected float centerX, centerY;
        private int mode;
        public static final int SINGLE = 0;
        public static final int CIRCLE = 1;
        public static final int NET = 2;
        public static final int REFLECT = 3;
        public static final int WAVE = 4;
        public static final int GRAIN = 5;
        float radius = 150;
    
        public VisualizeView(Context context) {
            super(context);
            init();
        }

        public VisualizeView(Context context, @Nullable AttributeSet attrs) {
            super(context, attrs);
            init();
        }

        protected void init() {
            mStrokeWidth = 5;

            mPaint = new Paint();
            mPaint.setStrokeWidth(mStrokeWidth);
            mPaint.setColor(getResources().getColor(R.color.black));
            mPaint.setStrokeCap(Paint.Cap.ROUND);
            mPaint.setAntiAlias(true);
            mPaint.setMaskFilter(new BlurMaskFilter(5, BlurMaskFilter.Blur.SOLID));

            mRect = new RectF();
            mPath = new Path();
        }
    
        @Override
        protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
            super.onMeasure(widthMeasureSpec, heightMeasureSpec);
            int finallyWidth;
            int finallyHeight;
            int wSpecMode = MeasureSpec.getMode(widthMeasureSpec);
            int wSpecSize = MeasureSpec.getSize(widthMeasureSpec);
            int hSpecMode = MeasureSpec.getMode(heightMeasureSpec);
            int hSpecSize = MeasureSpec.getSize(heightMeasureSpec);
            if (wSpecMode == MeasureSpec.EXACTLY) {
                finallyWidth = wSpecSize;
            } else {
                // fall back to a default size when the spec is not exact
                finallyWidth = 500;
            }

            if (hSpecMode == MeasureSpec.EXACTLY) {
                finallyHeight = hSpecSize;
            } else {
                finallyHeight = 500;
            }

            setMeasuredDimension(finallyWidth, finallyHeight);
        }
    
        @Override
        protected void onLayout(boolean changed, int left, int top, int right, int bottom) {
            super.onLayout(changed, left, top, right, bottom);
            mRect.set(0, 0, getWidth(), getHeight() - 50);
            centerX = mRect.width() / 2;
            centerY = mRect.height() / 2;
        }

        @Override
        protected void onDraw(Canvas canvas) {
            super.onDraw(canvas);
            if (mRawAudioBytes == null) {
                Log.d(TAG, "onDraw: no data yet");
                return;
            }
            drawChild(canvas);
        }
    
        protected void drawChild(Canvas canvas) {
            mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
            mPaint.setStrokeWidth(mStrokeWidth);
            mPaint.setStyle(Paint.Style.FILL);

            switch (mode) {
                case SINGLE:
                    for (int i = 0; i < mSpectrumCount; i++) {
                        canvas.drawLine(mRect.width() * i / mSpectrumCount, mRect.height() / 2,
                            mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 - mRawAudioBytes[i], mPaint);
                    }
                    break;
                case CIRCLE:
                    mStrokeWidth = (float) ((Math.PI * 2 * radius - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f);
                    mPaint.setStyle(Paint.Style.STROKE);
                    mPaint.setStrokeWidth(2);
                    canvas.drawCircle(centerX, centerY, radius, mPaint);
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    mPath.moveTo(0, centerY);
                    for (int i = 0; i < mSpectrumCount; i++) {
                        double angle = (360d / mSpectrumCount * 1.0d) * (i + 1);
                        double startX = centerX + (radius + mStrokeWidth / 2) * Math.sin(Math.toRadians(angle));
                        double startY = centerY + (radius + mStrokeWidth / 2) * Math.cos(Math.toRadians(angle));
                        double stopX = centerX + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.sin(Math.toRadians(angle));
                        double stopY = centerY + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.cos(Math.toRadians(angle));
                        canvas.drawLine((float) startX, (float) startY, (float) stopX, (float) stopY, mPaint);
                    }
                    break;
                case NET:
                    mStrokeWidth = (float) ((Math.PI * 2 * radius - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f);
                    mPaint.setStyle(Paint.Style.STROKE);
                    mPaint.setStrokeWidth(2);
                    canvas.drawCircle(centerX, centerY, radius, mPaint);

                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    mPath.moveTo(0, centerY);
                    for (int i = 0; i < mSpectrumCount; i++) {
                        double angle = (360d / mSpectrumCount * 1.0d) * (i + 1);
                        double startX = centerX + (radius + mStrokeWidth / 2) * Math.sin(Math.toRadians(angle));
                        double startY = centerY + (radius + mStrokeWidth / 2) * Math.cos(Math.toRadians(angle));
                        double stopX = centerX + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.sin(Math.toRadians(angle));
                        double stopY = centerY + (radius + mStrokeWidth / 2 + mSpectrumRatio * mRawAudioBytes[i]) * Math.cos(Math.toRadians(angle));
                        canvas.drawLine((float) startX, (float) startY, (float) stopX, (float) stopY, mPaint);
                        if (i == 0) {
                            mPath.moveTo((float) startX, (float) startY);
                        }
                        mPath.lineTo((float) stopX, (float) stopY);
                    }
                    mPaint.setStyle(Paint.Style.STROKE);
                    canvas.drawPath(mPath, mPaint);
                    mPath.reset();
                    break;
                case REFLECT:
                    mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    for (int i = 0; i < mSpectrumCount; i++) {
                        canvas.drawLine(mRect.width() * i / mSpectrumCount, mRect.height() / 2, mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 - mSpectrumRatio * mRawAudioBytes[i], mPaint);
                        canvas.drawLine(mRect.width() * i / mSpectrumCount, mRect.height() / 2, mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 + mSpectrumRatio * mRawAudioBytes[i], mPaint);
                    }
                    break;
                case WAVE:
                    mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    mPath.moveTo(0, centerY);

                    for (int i = 0; i < mSpectrumCount; i++) {
                        mPath.lineTo(mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 + mRawAudioBytes[i]);
                    }
                    mPath.lineTo(mRect.width(), centerY);
                    mPath.close();
                    canvas.drawPath(mPath, mPaint);
                    mPath.reset();
                    break;
                case GRAIN:
                    mStrokeWidth = (mRect.width() - (mSpectrumCount - 1) * mItemMargin) / mSpectrumCount * 1.0f;
                    mPaint.setStrokeWidth(mStrokeWidth);
                    mPaint.setStyle(Paint.Style.FILL);
                    for (int i = 0; i < mSpectrumCount; i++) {
                        canvas.drawPoint(mRect.width() * i / mSpectrumCount, 2 + mRect.height() / 2 - mRawAudioBytes[i], mPaint);
                        canvas.drawPoint(mRect.width() * i / mSpectrumCount, mRect.height() / 4 + 2 + (mRect.height() / 2 - mRawAudioBytes[i]) / 2, mPaint);
                    }
                    break;
                default:
                    break;
            }
        }
    
        public void setMode(int mode) {
            this.mode = mode;
            if (mRawAudioBytes != null) {
                invalidate();
            }
        }

        public void setData(float[] parseData) {
            mRawAudioBytes = parseData;
            invalidate();
        }
    }
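
    The view can then be declared in the activity layout like any other custom view; a sketch (the package name is assumed, and the id matches the mBinding.visualizerView references used elsewhere in this article):

    <!-- Adjust the package to wherever VisualizeView actually lives. -->
    <com.example.visualizer.VisualizeView
        android:id="@+id/visualizer_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />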
  8. Release resources when the app exits:

    @Override
    protected void onDestroy() {
        super.onDestroy();
        if (mMediaPlayer != null) {
            mMediaPlayer.stop();
            mMediaPlayer.reset();
            mMediaPlayer.release();
            mMediaPlayer = null;
        }
        if (visualizer != null) {
            visualizer.setEnabled(false);
            visualizer.release();
        }
    }
    

Tip: using a Spinner

  1. Create an arrays resource file under the values directory:

    <?xml version="1.0" encoding="utf-8"?>
    <resources>
        <string-array name="view_type">
            <item>SINGLE</item>
        <item>CIRCLE</item>
            <item>NET</item>
            <item>REFLECT</item>
            <item>WAVE</item>
            <item>GRAIN</item>
        </string-array>
    </resources>
    
  2. Use the Spinner in the layout and hook up its listener:

    <Spinner
        android:id="@+id/spinner_view"
        android:layout_width="200px"
        android:layout_height="wrap_content"
        android:entries="@array/view_type"
        app:layout_constraintEnd_toEndOf="parent"
        app:layout_constraintTop_toTopOf="parent" />
    
    mBinding.spinnerView.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
        @Override
        public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
            mBinding.visualizerView.setMode(position);
        }

        @Override
        public void onNothingSelected(AdapterView<?> parent) {
        }
    });
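
    If the entries are not known at build time, an adapter built in code is the usual alternative to android:entries; a sketch using the standard ArrayAdapter:

    // Build the adapter from the same string-array resource instead of android:entries.
    ArrayAdapter<CharSequence> adapter = ArrayAdapter.createFromResource(
            this, R.array.view_type, android.R.layout.simple_spinner_item);
    adapter.setDropDownViewResource(android.R.layout.simple_spinner_dropdown_item);
    mBinding.spinnerView.setAdapter(adapter);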
    

Complete code

  1. MainActivity

    public class MainActivity extends AppCompatActivity {

        private static final String TAG = "MainActivity";
        Visualizer visualizer;
        int mCount = 60;
        ActivityMainBinding mBinding;
        private MediaPlayer mMediaPlayer;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            mBinding = DataBindingUtil.setContentView(this, R.layout.activity_main);
            RxPermissions rxPermissions = new RxPermissions(this);
            rxPermissions.requestEach(Manifest.permission.RECORD_AUDIO,
                Manifest.permission.WRITE_EXTERNAL_STORAGE, Manifest.permission.READ_EXTERNAL_STORAGE)
                .subscribe(new Consumer<Permission>() {
                    @Override
                    public void accept(Permission permission) throws Exception {
                        if (permission.granted) {
                            Log.d(TAG, "accept: true");
                        } else if (permission.shouldShowRequestPermissionRationale) {
                            finish();
                        } else {
                            finish();
                        }
                    }
                });
            mMediaPlayer = MediaPlayer.create(this, R.raw.daoxiang);
            if (mMediaPlayer == null) {
                Log.d(TAG, "mediaPlayer is null");
                return;
            }

            mMediaPlayer.setOnErrorListener(null);
            mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mediaPlayer) {
                    mediaPlayer.setLooping(true); // loop playback
                    int audioSessionId = mediaPlayer.getAudioSessionId();
                    visualizer = new Visualizer(audioSessionId);
                    visualizer.setCaptureSize(Visualizer.getCaptureSizeRange()[1]);
                    visualizer.setDataCaptureListener(new Visualizer.OnDataCaptureListener() {
                        @Override
                        public void onWaveFormDataCapture(Visualizer visualizer, byte[] bytes, int samplingRate) {
                        }

                        @Override
                        public void onFftDataCapture(Visualizer visualizer, byte[] fft, int samplingRate) {
                            Log.d(TAG, "onFftDataCapture: fft " + fft.length);
                            float[] model = new float[fft.length / 2 + 1];
                            model[0] = Math.abs(fft[0]);                // DC bin (real only)
                            model[model.length - 1] = Math.abs(fft[1]); // Nyquist bin (real only)
                            for (int i = 2, j = 1; i < fft.length; i += 2, j++) {
                                model[j] = (float) Math.hypot(fft[i], fft[i + 1]);
                            }
                            // model is the final data used for drawing
                            mBinding.visualizerView.setData(model);
                        }
                    }, Visualizer.getMaxCaptureRate() / 2, false, true);
                    visualizer.setEnabled(true);
                }
            });
            mMediaPlayer.start();

            mBinding.spinnerView.setOnItemSelectedListener(new AdapterView.OnItemSelectedListener() {
                @Override
                public void onItemSelected(AdapterView<?> parent, View view, int position, long id) {
                    mBinding.visualizerView.setMode(position);
                }

                @Override
                public void onNothingSelected(AdapterView<?> parent) {
                }
            });
        }

        @Override
        protected void onDestroy() {
            super.onDestroy();
            if (mMediaPlayer != null) {
                mMediaPlayer.stop();
                mMediaPlayer.reset();
                mMediaPlayer.release();
                mMediaPlayer = null;
            }
            if (visualizer != null) {
                visualizer.setEnabled(false);
                visualizer.release();
            }
        }
    }
Related Q&A: visualizing microphone input while recording

I am making an Android app that records audio, and I want a visual representation of the recorded audio, e.g. a line visualizer.

I tried using https://github.com/gauravk95/audio-visualizer-android but failed.

I am a beginner, so step-by-step help would be appreciated. Thanks.

This is what my main activity looks like:

    public class MainActivity extends AppCompatActivity {
    
    private Button buttonStart, buttonStop, buttonPlayLastRecordAudio,
            buttonStopPlayingRecording ;
    private String AudioSavePathInDevice = null;
    private MediaRecorder mediaRecorder ;
    private Random random ;
    private static final int RequestPermissionCode = 1;
    private MediaPlayer mediaPlayer ;
    
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
    
        buttonStart = findViewById(R.id.button);
        buttonStop = findViewById(R.id.button2);
        buttonPlayLastRecordAudio = findViewById(R.id.button3);
        buttonStopPlayingRecording = findViewById(R.id.button4);
    
        buttonStop.setEnabled(false);
        buttonPlayLastRecordAudio.setEnabled(false);
        buttonStopPlayingRecording.setEnabled(false);
    
        random = new Random();
    
        buttonStart.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
    
                if(checkPermission()) {
    
                    AudioSavePathInDevice =
                            Environment.getExternalStorageDirectory().getAbsolutePath() + "/" + CreateRandomAudioFileName() + "AudioRecording.3gp";
    
                    MediaRecorderReady();
    
                    try {
                        mediaRecorder.prepare();
                        mediaRecorder.start();
                    } catch (IllegalStateException e) {
                        //TODO Auto-generated catch block
                        e.printStackTrace();
                    } catch (IOException e) {
                        //TODO Auto-generated catch block
                        e.printStackTrace();
                    }
    
                    buttonStart.setEnabled(false);
                    buttonStop.setEnabled(true);
    
                    Toast.makeText(MainActivity.this, "Recording Started", LENGTH_LONG).show();
                } else {
                    requestPermission();
                }
    
            }
        });
    
        buttonStop.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                mediaRecorder.stop();
                buttonStop.setEnabled(false);
                buttonPlayLastRecordAudio.setEnabled(true);
                buttonStart.setEnabled(true);
                buttonStopPlayingRecording.setEnabled(false);
    
                Toast.makeText(MainActivity.this, "Recording Completed", LENGTH_LONG).show();
            }
        });
    
        buttonPlayLastRecordAudio.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) throws IllegalArgumentException,
                    SecurityException, IllegalStateException {
    
                buttonStop.setEnabled(false);
                buttonStart.setEnabled(false);
                buttonStopPlayingRecording.setEnabled(true);
    
                mediaPlayer = new MediaPlayer();
                try {
                    mediaPlayer.setDataSource(AudioSavePathInDevice);
                    mediaPlayer.prepare();
                } catch (IOException e) {
                    e.printStackTrace();
                }
    
                mediaPlayer.start();
                Toast.makeText(MainActivity.this, "Recording Playing", LENGTH_LONG).show();
            }
        });
    
        buttonStopPlayingRecording.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                buttonStop.setEnabled(false);
                buttonStart.setEnabled(true);
                buttonStopPlayingRecording.setEnabled(false);
                buttonPlayLastRecordAudio.setEnabled(true);
    
                if(mediaPlayer != null){
                    mediaPlayer.stop();
                    mediaPlayer.release();
                    MediaRecorderReady();
                }
            }
        });
    
    }
    
    public void MediaRecorderReady(){
        mediaRecorder=new MediaRecorder();
        mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
        mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AMR_NB);
        mediaRecorder.setOutputFile(AudioSavePathInDevice);
    }
    
    private String CreateRandomAudioFileName(){
        StringBuilder stringBuilder = new StringBuilder(5);
        int i = 0 ;
        while(i < 5) {
            String randomAudioFileName = "ABCDEFGHIJKLMNOP";
            stringBuilder.append(randomAudioFileName.
                    charAt(random.nextInt(randomAudioFileName.length())));
    
            i++ ;
        }
        return stringBuilder.toString();
    }
    
    private void requestPermission() {
        ActivityCompat.requestPermissions(MainActivity.this, new
                String[]{WRITE_EXTERNAL_STORAGE, RECORD_AUDIO}, RequestPermissionCode);
    }
    
    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull  int[] grantResults) {
        if (requestCode == RequestPermissionCode) {
            if (grantResults.length > 0) {
                boolean StoragePermission = grantResults[0] ==
                        PackageManager.PERMISSION_GRANTED;
                boolean RecordPermission = grantResults[1] ==
                        PackageManager.PERMISSION_GRANTED;
    
                if (StoragePermission && RecordPermission) {
                    Toast.makeText(MainActivity.this, "Permission Granted",
                            LENGTH_LONG).show();
                } else {
                    Toast.makeText(MainActivity.this, "Permission Denied", LENGTH_LONG).show();
                }
            }
        }
    }
    
    public boolean checkPermission() {
        int result = ContextCompat.checkSelfPermission(getApplicationContext(), WRITE_EXTERNAL_STORAGE);
        int result1 = ContextCompat.checkSelfPermission(getApplicationContext(), RECORD_AUDIO);
        return result == PackageManager.PERMISSION_GRANTED &&
                result1 == PackageManager.PERMISSION_GRANTED;
    }
    

    }

Answer

You just need an audio DSP library.

Check this link: https://developer.android.com/ndk/guides/audio

You can also check: https://github.com/james34602/JamesDSPManager

The MWEngine audio engine can do this as well: "real-time recording and processing from the device inputs (e.g. the microphone)"

https://github.com/igorski/MWEngine

Once you start thinking in DSP terms, you will see how to visualize audio frequencies.
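
A lighter-weight option, if a full DSP engine is overkill, is to poll MediaRecorder.getMaxAmplitude() while recording and feed the level into a custom view. A minimal sketch against the code above (the view hook is hypothetical; wire it to whatever view you actually draw with, and start the loop only after mediaRecorder.start()):

    // Poll the recorder roughly 20 times per second. getMaxAmplitude() returns the
    // maximum absolute sample value (0..32767) measured since the previous call.
    final Handler handler = new Handler(Looper.getMainLooper());
    handler.post(new Runnable() {
        @Override
        public void run() {
            if (mediaRecorder != null) {
                int amplitude = mediaRecorder.getMaxAmplitude();
                // Hypothetical hook: push the level into your own view, e.g.
                // visualizerView.setLevel(amplitude);
                handler.postDelayed(this, 50);
            }
        }
    });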
