Saving Audio After Effect in iOS

Posted: 2016-12-21 04:26:39

Question:

I'm developing an app that lets people record their voice, change it with effects, and share it. Most of it is in place; now it's time to ask for your help. This is my play function, which plays the recorded audio file and applies effects to it.

private func playAudio(pitch : Float, rate: Float, reverb: Float, echo: Float) {
    // Initialize variables
    audioEngine = AVAudioEngine()
    audioPlayerNode = AVAudioPlayerNode()
    audioEngine.attachNode(audioPlayerNode)

    // Setting the pitch
    let pitchEffect = AVAudioUnitTimePitch()
    pitchEffect.pitch = pitch
    audioEngine.attachNode(pitchEffect)

    // Setting the playback-rate
    let playbackRateEffect = AVAudioUnitVarispeed()
    playbackRateEffect.rate = rate
    audioEngine.attachNode(playbackRateEffect)

    // Setting the reverb effect
    let reverbEffect = AVAudioUnitReverb()
    reverbEffect.loadFactoryPreset(AVAudioUnitReverbPreset.Cathedral)
    reverbEffect.wetDryMix = reverb
    audioEngine.attachNode(reverbEffect)

    // Setting the echo effect on a specific interval
    let echoEffect = AVAudioUnitDelay()
    echoEffect.delayTime = NSTimeInterval(echo)
    audioEngine.attachNode(echoEffect)

    // Chain all these up, ending with the output
    audioEngine.connect(audioPlayerNode, to: playbackRateEffect, format: nil)
    audioEngine.connect(playbackRateEffect, to: pitchEffect, format: nil)
    audioEngine.connect(pitchEffect, to: reverbEffect, format: nil)
    audioEngine.connect(reverbEffect, to: echoEffect, format: nil)
    audioEngine.connect(echoEffect, to: audioEngine.outputNode, format: nil)

    audioPlayerNode.stop()

    let length = 4000
    let buffer = AVAudioPCMBuffer(PCMFormat: audioPlayerNode.outputFormatForBus(0), frameCapacity: AVAudioFrameCount(length))
    buffer.frameLength = AVAudioFrameCount(length)

    try! audioEngine.start()

    let dirPaths: AnyObject = NSSearchPathForDirectoriesInDomains(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.UserDomainMask, true)[0]
    let tmpFileUrl: NSURL = NSURL.fileURLWithPath(dirPaths.stringByAppendingPathComponent("effectedSound.m4a"))

    do {
        print(dirPaths)
        let settings = [AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatMPEG4AAC), AVSampleRateKey: NSNumber(integer: 44100), AVNumberOfChannelsKey: NSNumber(integer: 2)]
        self.newAudio = try AVAudioFile(forWriting: tmpFileUrl, settings: settings)

        audioEngine.outputNode.installTapOnBus(0, bufferSize: AVAudioFrameCount(self.player!.duration), format: self.audioPlayerNode.outputFormatForBus(0)) {
            (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) in

            print(self.newAudio.length)
            print("=====================")
            print(self.audioFile.length)
            print("**************************")
            if (self.newAudio.length) < (self.audioFile.length) {
                do {
                    //print(buffer)
                    try self.newAudio.writeFromBuffer(buffer)
                } catch _ {
                    print("Problem Writing Buffer")
                }
            } else {
                self.audioPlayerNode.removeTapOnBus(0)
            }
        }
    } catch _ {
        print("Problem")
    }

    audioPlayerNode.play()
}

I think the problem is that I install the tap (installTapOnBus) on audioPlayerNode, while the effected audio is on audioEngine.outputNode. I tried installing the tap on audioEngine.outputNode instead, but it gives me an error. I also tried connecting the effects to audioEngine.mixerNode, but that wasn't a solution either. So, do you have any experience with saving effected audio? How can I get this effected audio into a file?

Any help is appreciated.

Thanks


Answer 1:

This doesn't seem to be hooked up correctly. I'm just learning all of this myself, but I found that the effects are added correctly when you connect them to a mixer node. Also, you need to tap the mixer, not the engine's output node. I've simply copied your code and made a few modifications to account for this.

private func playAudio(pitch : Float, rate: Float, reverb: Float, echo: Float) {
    // Initialize variables
    audioEngine = AVAudioEngine()
    audioPlayerNode = AVAudioPlayerNode()
    audioEngine.attachNode(audioPlayerNode)

    // Setting the pitch
    let pitchEffect = AVAudioUnitTimePitch()
    pitchEffect.pitch = pitch
    audioEngine.attachNode(pitchEffect)

    // Setting the playback-rate
    let playbackRateEffect = AVAudioUnitVarispeed()
    playbackRateEffect.rate = rate
    audioEngine.attachNode(playbackRateEffect)

    // Setting the reverb effect
    let reverbEffect = AVAudioUnitReverb()
    reverbEffect.loadFactoryPreset(AVAudioUnitReverbPreset.Cathedral)
    reverbEffect.wetDryMix = reverb
    audioEngine.attachNode(reverbEffect)

    // Setting the echo effect on a specific interval
    let echoEffect = AVAudioUnitDelay()
    echoEffect.delayTime = NSTimeInterval(echo)
    audioEngine.attachNode(echoEffect)

    // Set up a mixer node
    let audioMixer = AVAudioMixerNode()
    audioEngine.attachNode(audioMixer)

    // Chain all these up, ending with the output
    audioEngine.connect(audioPlayerNode, to: playbackRateEffect, format: nil)
    audioEngine.connect(playbackRateEffect, to: pitchEffect, format: nil)
    audioEngine.connect(pitchEffect, to: reverbEffect, format: nil)
    audioEngine.connect(reverbEffect, to: echoEffect, format: nil)
    audioEngine.connect(echoEffect, to: audioMixer, format: nil)
    audioEngine.connect(audioMixer, to: audioEngine.outputNode, format: nil)

    audioPlayerNode.stop()

    let length = 4000
    let buffer = AVAudioPCMBuffer(PCMFormat: audioPlayerNode.outputFormatForBus(0), frameCapacity: AVAudioFrameCount(length))
    buffer.frameLength = AVAudioFrameCount(length)

    try! audioEngine.start()

    let dirPaths: AnyObject = NSSearchPathForDirectoriesInDomains(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.UserDomainMask, true)[0]
    let tmpFileUrl: NSURL = NSURL.fileURLWithPath(dirPaths.stringByAppendingPathComponent("effectedSound.m4a"))

    do {
        print(dirPaths)
        let settings = [AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatMPEG4AAC), AVSampleRateKey: NSNumber(integer: 44100), AVNumberOfChannelsKey: NSNumber(integer: 2)]
        self.newAudio = try AVAudioFile(forWriting: tmpFileUrl, settings: settings)

        // Tap the mixer node, not the engine's output node
        audioMixer.installTapOnBus(0, bufferSize: AVAudioFrameCount(self.player!.duration), format: audioMixer.outputFormatForBus(0)) {
            (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) in

            print(self.newAudio.length)
            print("=====================")
            print(self.audioFile.length)
            print("**************************")
            if (self.newAudio.length) < (self.audioFile.length) {
                do {
                    //print(buffer)
                    try self.newAudio.writeFromBuffer(buffer)
                } catch _ {
                    print("Problem Writing Buffer")
                }
            } else {
                audioMixer.removeTapOnBus(0)
            }
        }
    } catch _ {
        print("Problem")
    }

    audioPlayerNode.play()
}
I also couldn't get the file format right at first. I finally got it working when I changed the output file's path from m4a to caf. Another suggestion is not to pass nil for the format parameter; I used audioFile.processingFormat instead. I hope this helps. My audio effects/mixing works, although I haven't chained my effects. So feel free to ask questions.
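To make those two tweaks concrete, here is a rough sketch (my illustration, not part of the original answer), reusing the audioMixer, audioFile and newAudio variables from the code above; the 1024 buffer size is an arbitrary choice for the example:

// Write to a .caf file instead of .m4a (only the path/extension changes)
let dirPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as NSString
let tmpFileUrl = NSURL.fileURLWithPath(dirPath.stringByAppendingPathComponent("effectedSound.caf"))

// Use the source file's processing format instead of nil / a hand-built settings dictionary
self.newAudio = try! AVAudioFile(forWriting: tmpFileUrl, settings: self.audioFile.processingFormat.settings)

// Pass the same format when installing the tap on the mixer
audioMixer.installTapOnBus(0, bufferSize: 1024, format: self.audioFile.processingFormat) {
    (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) in
    do {
        try self.newAudio.writeFromBuffer(buffer)
    } catch _ {
        print("Problem Writing Buffer")
    }
}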

Comments:

- @KBB One question I have: where does the original audio file come from? Is it recorded first and then played back? If so, you need to schedule that file with audioPlayerNode.scheduleFile(recordedAudioFile, atTime: nil, completionHandler: nil) and start the audioEngine before calling audioPlayerNode.play().
- This is exactly what I was about to write as the answer to my own question :D I solved it by connecting the last effect to a mixer node and tapping the mixer node. I'll write up my own solution below.
- Awesome! Glad it helped.
- I kept getting data from the buffer, and the file length grew as buffers were written to it, but the resulting audio file had no duration. Once I changed the file type to .caf it could be read/played.

Answer 2:

Here is the solution to my problem:

func playAndRecord(pitch : Float, rate: Float, reverb: Float, echo: Float) {
    // Initialize variables
    // These are global variables. If you prefer, you can declare them locally here instead (let audioEngine = ..., etc.)
    audioEngine = AVAudioEngine()
    audioPlayerNode = AVAudioPlayerNode()
    audioEngine.attachNode(audioPlayerNode)
    playerB = AVAudioPlayerNode()

    audioEngine.attachNode(playerB)

    // Setting the pitch
    let pitchEffect = AVAudioUnitTimePitch()
    pitchEffect.pitch = pitch
    audioEngine.attachNode(pitchEffect)

    // Setting the playback-rate
    let playbackRateEffect = AVAudioUnitVarispeed()
    playbackRateEffect.rate = rate
    audioEngine.attachNode(playbackRateEffect)

    // Setting the reverb effect
    let reverbEffect = AVAudioUnitReverb()
    reverbEffect.loadFactoryPreset(AVAudioUnitReverbPreset.Cathedral)
    reverbEffect.wetDryMix = reverb
    audioEngine.attachNode(reverbEffect)

    // Setting the echo effect on a specific interval
    let echoEffect = AVAudioUnitDelay()
    echoEffect.delayTime = NSTimeInterval(echo)
    audioEngine.attachNode(echoEffect)

    // Chain all these up, ending with the main mixer
    audioEngine.connect(audioPlayerNode, to: playbackRateEffect, format: nil)
    audioEngine.connect(playbackRateEffect, to: pitchEffect, format: nil)
    audioEngine.connect(pitchEffect, to: reverbEffect, format: nil)
    audioEngine.connect(reverbEffect, to: echoEffect, format: nil)
    audioEngine.connect(echoEffect, to: audioEngine.mainMixerNode, format: nil)

    // Good practice to stop before starting
    audioPlayerNode.stop()

    // Stop the AVAudioPlayer if it is already playing
    // (this player is also a global variable, an AVAudioPlayer)
    if player != nil {
        player?.stop()
    }

    // audioFile here is our original audio
    audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: {
        print("Complete")
    })

    try! audioEngine.start()

    let dirPaths: AnyObject = NSSearchPathForDirectoriesInDomains(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.UserDomainMask, true)[0]
    let tmpFileUrl: NSURL = NSURL.fileURLWithPath(dirPaths.stringByAppendingPathComponent("effectedSound2.m4a"))

    // Save tmpFileUrl into a global variable so we don't lose it (not important if you want to do something else with it)
    filteredOutputURL = tmpFileUrl

    do {
        print(dirPaths)

        self.newAudio = try! AVAudioFile(forWriting: tmpFileUrl, settings: [
            AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatAppleLossless),
            AVEncoderAudioQualityKey: AVAudioQuality.Low.rawValue,
            AVEncoderBitRateKey: 320000,
            AVNumberOfChannelsKey: 2,
            AVSampleRateKey: 44100.0
        ])

        let length = self.audioFile.length

        // Tap the main mixer node to capture the effected audio
        audioEngine.mainMixerNode.installTapOnBus(0, bufferSize: 1024, format: self.audioEngine.mainMixerNode.inputFormatForBus(0)) {
            (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in

            print(self.newAudio.length)
            print("=====================")
            print(length)
            print("**************************")

            if (self.newAudio.length) < length { // Lets us know when to stop saving the file; otherwise it saves infinitely
                do {
                    //print(buffer)
                    try self.newAudio.writeFromBuffer(buffer)
                } catch _ {
                    print("Problem Writing Buffer")
                }
            } else {
                self.audioEngine.mainMixerNode.removeTapOnBus(0) // If we don't remove it, it will keep on tapping infinitely

                // DO WHAT YOU WANT TO DO HERE WITH THE EFFECTED AUDIO
            }
        }
    } catch _ {
        print("Problem")
    }

    audioPlayerNode.play()
}
Comments:

- Can we save the audio file with the effect applied?
- I only want to save the newly effected audio file, but the code above plays the file as well. Is there any way I can stop the sound from playing while it is mixing the audio?

Answer 3:

For anyone who had to play the audio file twice before it would save, I just added the following line in the appropriate place and it solved my problem. It might help someone in the future.

P.S.: I used exactly the same code as the accepted answer above; just adding this one line solved my problem.

// Do what you want to do here with the effected audio
self.newAudio = try! AVAudioFile(forReading: tmpFileUrl)
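For context, here is a rough sketch (my addition, not from the original answer) of how the tap block from Answer 2 reads with that line in place, reusing the length, tmpFileUrl and newAudio variables defined there:

audioEngine.mainMixerNode.installTapOnBus(0, bufferSize: 1024, format: audioEngine.mainMixerNode.inputFormatForBus(0)) {
    (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in

    if self.newAudio.length < length {
        do {
            try self.newAudio.writeFromBuffer(buffer)
        } catch _ {
            print("Problem Writing Buffer")
        }
    } else {
        self.audioEngine.mainMixerNode.removeTapOnBus(0)

        // Reopen the file we just finished writing so it is readable/playable right away,
        // instead of having to run the whole playback graph a second time
        self.newAudio = try! AVAudioFile(forReading: tmpFileUrl)
    }
}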


Answer 4:

This is what I got after adding

self.newAudio = try! AVAudioFile(forReading: tmpFileUrl)

It returns

Error Domain=com.apple.coreaudio.avfaudio Code=1685348671 "(null)"
UserInfo=failed call=ExtAudioFileOpenURL((CFURLRef)fileURL, &_extAudioFile)


Answer 5:

Just change the unsignedInt parameter from kAudioFormatMPEG4AAC to kAudioFormatLinearPCM and change the file type to .caf. It will definitely help, my friend.
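A minimal sketch of that change (my illustration, using the same Documents-directory approach and variable names as the answers above): the format ID becomes kAudioFormatLinearPCM and the output path gets a .caf extension.

let dirPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as NSString
let tmpFileUrl = NSURL.fileURLWithPath(dirPath.stringByAppendingPathComponent("effectedSound.caf"))

// Linear PCM in a .caf container; per this answer, the written file can then be reopened for reading without the error above
self.newAudio = try! AVAudioFile(forWriting: tmpFileUrl, settings: [
    AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatLinearPCM),
    AVNumberOfChannelsKey: 2,
    AVSampleRateKey: 44100.0
])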

Comments:

- You saved my whole day :) Thank you.
- Thanks bro. You can also do this in other ways; I learned more about this from this project, so check out github.com/alvesmarcel/V-Voice-Changer/tree/master/…
