MTAudioProcessingTap EXC_BAD_ACCESS, doesn't always fire the finalize callback. How do I release it?

Posted 2018-12-05 16:29:59


Question:

I'm trying to implement an MTAudioProcessingTap, and it works great. The problem comes when I'm done using the tap, re-instantiate my class, and create a new tap.

How am I supposed to release the tap?
1 - I keep the tap as a property when creating it, hoping I can access it later and release it.
2 - In the class's deinit() method, I set the audioMix to nil and try to call self.tap?.release().

The thing is... sometimes it works and the FINALIZE callback gets called and everything is fine, and sometimes it doesn't and it just crashes on this line of the tapProcess callback:

let selfMediaInput = Unmanaged<VideoMediaInput>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).takeUnretainedValue()

Here is the full code: https://gist.github.com/omarojo/03d08165a1a7962cb30c17ec01f809a3

import Foundation
import UIKit
import AVFoundation
import MediaToolbox

protocol VideoMediaInputDelegate: class {
    func videoFrameRefresh(sampleBuffer: CMSampleBuffer) //could be audio or video
}

class VideoMediaInput: NSObject {
    private let queue = DispatchQueue(label: "com.GenerateMetal.VideoMediaInput")

    var videoURL: URL!

    weak var delegate: VideoMediaInputDelegate?

    private var playerItemObserver: NSKeyValueObservation?
    var displayLink: CADisplayLink!
    var player = AVPlayer()
    var playerItem: AVPlayerItem!
    let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [String(kCVPixelBufferPixelFormatTypeKey): NSNumber(value: kCVPixelFormatType_32BGRA)])
    var audioProcessingFormat: AudioStreamBasicDescription? //UnsafePointer<AudioStreamBasicDescription>?
    var tap: Unmanaged<MTAudioProcessingTap>?

    override init() {
    }

    convenience init(url: URL) {
        self.init()
        self.videoURL = url

        self.playerItem = AVPlayerItem(url: url)

        playerItemObserver = playerItem.observe(\.status) { [weak self] item, _ in
            guard item.status == .readyToPlay else { return }
            self?.playerItemObserver = nil
            self?.player.play()
        }

        setupProcessingTap()

        player.replaceCurrentItem(with: playerItem)
        player.currentItem!.add(videoOutput)

        NotificationCenter.default.removeObserver(self)
        NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil, queue: nil) { [weak self] notification in

            if let weakSelf = self {
                /*
                 Setting actionAtItemEnd to None prevents the movie from getting paused at item end. A very simplistic, and not gapless, looped playback.
                 */
                weakSelf.player.actionAtItemEnd = .none
                weakSelf.player.seek(to: CMTime.zero)
                weakSelf.player.play()
            }

        }
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(applicationDidBecomeActive(_:)),
            name: UIApplication.didBecomeActiveNotification,
            object: nil)
    }

    func stopAllProcesses() {
        self.queue.sync {
            self.player.pause()
            self.player.isMuted = true
            self.player.currentItem?.audioMix = nil
            self.playerItem.audioMix = nil
            self.playerItem = nil
            self.tap?.release()
        }
    }


    deinit {
        print(">> VideoInput deinited !!!!")
        if let link = self.displayLink {
            link.invalidate()
        }
        NotificationCenter.default.removeObserver(self)

        stopAllProcesses()
    }

    public func playVideo() {
        if (player.currentItem != nil) {
            print("Starting playback!")
            player.play()
        }
    }

    public func pauseVideo() {
        if (player.currentItem != nil) {
            print("Pausing playback!")
            player.pause()
        }
    }

    @objc func applicationDidBecomeActive(_ notification: NSNotification) {
        playVideo()
    }




    //MARK: GET AUDIO BUFFERS
    func setupProcessingTap() {

        var callbacks = MTAudioProcessingTapCallbacks(
            version: kMTAudioProcessingTapCallbacksVersion_0,
            clientInfo: UnsafeMutableRawPointer(Unmanaged.passUnretained(self).toOpaque()),
            init: tapInit,
            finalize: tapFinalize,
            prepare: tapPrepare,
            unprepare: tapUnprepare,
            process: tapProcess)

        var tap: Unmanaged<MTAudioProcessingTap>?
        let err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks, kMTAudioProcessingTapCreationFlag_PostEffects, &tap)
        self.tap = tap

        print("err: \(err)\n")
        if err == noErr {
        }

        print("tracks? \(playerItem.asset.tracks)\n")

        let audioTrack = playerItem.asset.tracks(withMediaType: AVMediaType.audio).first!
        let inputParams = AVMutableAudioMixInputParameters(track: audioTrack)
        inputParams.audioTapProcessor = tap?.takeRetainedValue() //tap?.takeUnretainedValue()
//        tap?.release()

        // print("inputParms: \(inputParams), \(inputParams.audioTapProcessor)\n")
        let audioMix = AVMutableAudioMix()
        audioMix.inputParameters = [inputParams]

        playerItem.audioMix = audioMix
    }

    //MARK: TAP CALLBACKS

    let tapInit: MTAudioProcessingTapInitCallback = {
        (tap, clientInfo, tapStorageOut) in
        tapStorageOut.pointee = clientInfo

        print("init \(tap, clientInfo, tapStorageOut)\n")
    }

    let tapFinalize: MTAudioProcessingTapFinalizeCallback = {
        (tap) in
        print("finalize \(tap)\n")
    }

    let tapPrepare: MTAudioProcessingTapPrepareCallback = {
        (tap, itemCount, basicDescription) in
        print("prepare: \(tap, itemCount, basicDescription)\n")
        let selfMediaInput = Unmanaged<VideoMediaInput>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).takeUnretainedValue()
        selfMediaInput.audioProcessingFormat = AudioStreamBasicDescription(mSampleRate: basicDescription.pointee.mSampleRate,
                                                                           mFormatID: basicDescription.pointee.mFormatID,
                                                                           mFormatFlags: basicDescription.pointee.mFormatFlags,
                                                                           mBytesPerPacket: basicDescription.pointee.mBytesPerPacket,
                                                                           mFramesPerPacket: basicDescription.pointee.mFramesPerPacket,
                                                                           mBytesPerFrame: basicDescription.pointee.mBytesPerFrame,
                                                                           mChannelsPerFrame: basicDescription.pointee.mChannelsPerFrame,
                                                                           mBitsPerChannel: basicDescription.pointee.mBitsPerChannel,
                                                                           mReserved: basicDescription.pointee.mReserved)
    }

    let tapUnprepare: MTAudioProcessingTapUnprepareCallback = {
        (tap) in
        print("unprepare \(tap)\n")
    }

    let tapProcess: MTAudioProcessingTapProcessCallback = {
        (tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut) in
        print("callback \(bufferListInOut)\n")

        let selfMediaInput = Unmanaged<VideoMediaInput>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).takeUnretainedValue()

        let status = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut)
        //print("get audio: \(status)\n")
        if status != noErr {
            print("Error TAPGetSourceAudio :\(String(describing: status.description))")
            return
        }

        selfMediaInput.processAudioData(audioData: bufferListInOut, framesNumber: UInt32(numberFrames))
    }
    func processAudioData(audioData: UnsafeMutablePointer<AudioBufferList>, framesNumber: UInt32) {
        var sbuf: CMSampleBuffer?
        var status: OSStatus?
        var format: CMFormatDescription?

        //FORMAT
//        var audioFormat = self.audioProcessingFormat//self.audioProcessingFormat?.pointee
        guard var audioFormat = self.audioProcessingFormat else {
            return
        }
        status = CMAudioFormatDescriptionCreate(allocator: kCFAllocatorDefault, asbd: &audioFormat, layoutSize: 0, layout: nil, magicCookieSize: 0, magicCookie: nil, extensions: nil, formatDescriptionOut: &format)
        if status != noErr {
            print("Error CMAudioFormatDescriptionCreate :\(String(describing: status?.description))")
            return
        }

        print(">> Audio Buffer mSampleRate:\(Int32(audioFormat.mSampleRate))")
        var timing = CMSampleTimingInfo(duration: CMTimeMake(value: 1, timescale: Int32(audioFormat.mSampleRate)), presentationTimeStamp: self.player.currentTime(), decodeTimeStamp: CMTime.invalid)

        status = CMSampleBufferCreate(allocator: kCFAllocatorDefault,
                                      dataBuffer: nil,
                                      dataReady: Bool(truncating: 0),
                                      makeDataReadyCallback: nil,
                                      refcon: nil,
                                      formatDescription: format,
                                      sampleCount: CMItemCount(framesNumber),
                                      sampleTimingEntryCount: 1,
                                      sampleTimingArray: &timing,
                                      sampleSizeEntryCount: 0, sampleSizeArray: nil,
                                      sampleBufferOut: &sbuf)
        if status != noErr {
            print("Error CMSampleBufferCreate :\(String(describing: status?.description))")
            return
        }
        status = CMSampleBufferSetDataBufferFromAudioBufferList(sbuf!,
                                                                blockBufferAllocator: kCFAllocatorDefault,
                                                                blockBufferMemoryAllocator: kCFAllocatorDefault,
                                                                flags: 0,
                                                                bufferList: audioData)
        if status != noErr {
            print("Error CMSampleBufferSetDataBufferFromAudioBufferList :\(String(describing: status?.description))")
            return
        }

        let currentSampleTime = CMSampleBufferGetOutputPresentationTimeStamp(sbuf!)
        print(" audio buffer at time: \(currentSampleTime)")
        self.delegate?.videoFrameRefresh(sampleBuffer: sbuf!)
    }
}

How I use my class:

self.inputVideoMedia = nil
self.inputVideoMedia = VideoMediaInput(url: videoURL)
self.inputVideoMedia!.delegate = self

The second time I do this... it crashes (but not always). The times it doesn't crash, I can see the FINALIZE print in the console.


Answer 1:

If the VideoMediaInput is deallocated before the tap is released (which can happen, because there seems to be no way to stop the tap synchronously), then the tap callbacks can choke on references to your deallocated class.

You can fix this by passing a (wrapped, I guess) weak reference to your class. You can do it like this:

First remove your tap instance variable, and any references to it - it isn't needed. Then make these changes:

class VideoMediaInput: NSObject {

    class TapCookie {
        weak var input: VideoMediaInput?

        deinit {
            print("TapCookie deinit")
        }
    }
...

    func setupProcessingTap() {
        let cookie = TapCookie()
        cookie.input = self

        var callbacks = MTAudioProcessingTapCallbacks(
            version: kMTAudioProcessingTapCallbacksVersion_0,
            clientInfo: UnsafeMutableRawPointer(Unmanaged.passRetained(cookie).toOpaque()),
            init: tapInit,
            finalize: tapFinalize,
            prepare: tapPrepare,
            unprepare: tapUnprepare,
            process: tapProcess)
...

    let tapFinalize: MTAudioProcessingTapFinalizeCallback = {
        (tap) in
        print("finalize \(tap)\n")

        // release cookie
        Unmanaged<TapCookie>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).release()
    }

    let tapPrepare: MTAudioProcessingTapPrepareCallback = {
        (tap, itemCount, basicDescription) in
        print("prepare: \(tap, itemCount, basicDescription)\n")
        let cookie = Unmanaged<TapCookie>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).takeUnretainedValue()
        let selfMediaInput = cookie.input!
...

    let tapProcess: MTAudioProcessingTapProcessCallback = {
        (tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut) in
        print("callback \(bufferListInOut)\n")

        let cookie = Unmanaged<TapCookie>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).takeUnretainedValue()

        guard let selfMediaInput = cookie.input else {
            print("Tap callback: VideoMediaInput was deallocated!")
            return
        }
...

I'm not sure the cookie class is needed - it exists only to wrap the weak reference. Cutting-edge Swift experts may know how to get the weakness through all the teenage mutant raw pointers, but I don't.
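For what it's worth, the wrapper can be made generic. The sketch below is my own illustration of the same idea (WeakBox is a made-up name, not part of the answer): Unmanaged can only carry a class instance through the tap's raw-pointer storage, and a weak reference by itself is not an object, so something like TapCookie has to exist.

import MediaToolbox

// Hypothetical generic replacement for TapCookie: the weak reference must
// live inside some class instance before it can travel through the tap's
// UnsafeMutableRawPointer storage slot, because Unmanaged only wraps objects.
final class WeakBox<T: AnyObject> {
    weak var value: T?
    init(_ value: T) { self.value = value }
    deinit { print("WeakBox deinit") }
}

// Creation side (in setupProcessingTap): retain the box, not the input.
//     let box = WeakBox(self)
//     clientInfo: UnsafeMutableRawPointer(Unmanaged.passRetained(box).toOpaque())
//
// Callback side (prepare/process): the box stays alive until finalize, but
// the boxed VideoMediaInput may already be gone, so unwrap it defensively.
//     let box = Unmanaged<WeakBox<VideoMediaInput>>
//         .fromOpaque(MTAudioProcessingTapGetStorage(tap)).takeUnretainedValue()
//     guard let selfMediaInput = box.value else { return }
//
// Finalize side: balance the passRetained from creation.
//     Unmanaged<WeakBox<VideoMediaInput>>
//         .fromOpaque(MTAudioProcessingTapGetStorage(tap)).release()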

Comments:

Can you explain what you mean by passing a weak reference to the class?
Your comment showed up just as I was explaining it!
OH MY GOD, it worked, not a single crash. Honestly I don't really know what is going on there, but it works: it deallocates the class and the cookie, and unprepare and finalize get called. I guess that GUARD is what prevents the crash.
Exactly - if you can't stop your class being deallocated first, at least you can detect it.
This answer seems to assume that the code in the process callback is safe to run after the object it uses has been deallocated.

Answer 2:

An audio context runs in its own real-time thread, so audio processing does not stop synchronously with a stop or cancel function call, but only some unknown time later (on the order of the duration of the audio samples sitting in some internal audio buffer), after the real-time thread drains.

Thus, audio buffers, objects, and callbacks should not be released (or re-allocated) until some time (unknown, but less than a couple of seconds) after stopping any real-time audio stream.

Depending on object-release messages or instance variable state (including weak references) across real-time threads is reportedly unsafe in Swift at the moment (see the WWDC 2018 audio sessions). So I would suggest using a semaphore (outside the real-time context, i.e. outside the audio callbacks) or a POSIX memory barrier (inside a bridged call to a C function). (...at least until some future version of Swift works out a real-time concurrency mechanism.) (...especially on iOS devices that can reorder memory writes.)
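To make the semaphore suggestion concrete, here is a rough sketch of one way to gate teardown on the finalize callback. It is my own illustration, not code from this answer: the tapFinalized and recreateInput names are made up, and the one-second timeout is arbitrary.

import Dispatch

// Hypothetical: a semaphore that the finalize callback signals once
// MediaToolbox has actually torn the tap down.
let tapFinalized = DispatchSemaphore(value: 0)

// In tapFinalize, after the existing cleanup:
//     tapFinalized.signal()

// In whatever object owns `inputVideoMedia` (and acts as its delegate):
// release the old instance first, then give finalize a bounded amount of time
// before building a new tap. Never block the real-time audio thread this way.
func recreateInput(with url: URL) {
    inputVideoMedia = nil                         // triggers deinit -> stopAllProcesses
    _ = tapFinalized.wait(timeout: .now() + 1.0)  // bounded wait, not forever
    inputVideoMedia = VideoMediaInput(url: url)
    inputVideoMedia!.delegate = self
}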

Comments:

I hadn't made the connection that Swift's lack of a threading model affects weak references! That makes sense. Can weak references be used from Objective-C then? I guess it doesn't matter, since the tap shouldn't be written in Swift or ObjC anyway. Sigh, all I've done here is spread more MTAudioProcessingTap misinformation. I should prefix my answer with a warning (use C and a memory barrier if your "this" can go out of scope).
