How to save the audio with changed pitch and speed in iOS?
Posted: 2016-08-20 09:33:37

I can change the pitch and speed of the audio, but I'm having trouble saving the audio with the changed pitch and speed.
// This method connects the pitch effect and plays the audio.
[self.audioEngine connect:audioPlayerNode
                       to:timePitchEffect
                   format:nil];
[self.audioEngine connect:timePitchEffect
                       to:self.audioEngine.outputNode
                   format:nil];
[audioPlayerNode scheduleFile:self.audioFile
                       atTime:nil
            completionHandler:nil];
[self.audioEngine startAndReturnError:&audioEngineError];
NSLog(@"%@", self.audioFile.url);
if (audioEngineError) {
    NSLog(@"Failed to start audio engine: %@", audioEngineError);
}
// Called when the button is tapped.
[self playAudioWithEffect:EAudioEffectPitch effectValue:@-500.0];

I'm using this to change the pitch, but how do I save the audio with the changed pitch? Please help.
Comments:

Did you find a solution yet?
No, I'm still working on it. If you find one, please let me know.
@MiteshVaru If you have any solution, please update your answer.

Answer 1:

You can enable offline manual rendering mode on AVAudioEngine. In this mode, the engine's input and output nodes are disconnected from the audio hardware, and your app drives the rendering.
Prepare the source audio
let sourceFile: AVAudioFile
let format: AVAudioFormat
do {
    let sourceFileURL = Bundle.main.url(forResource: "YOUR_AUDIO_NAME", withExtension: "caf")!
    sourceFile = try AVAudioFile(forReading: sourceFileURL)
    format = sourceFile.processingFormat
} catch {
    fatalError("Unable to load the source audio file: \(error.localizedDescription).")
}
Create and configure the audio engine
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let reverb = AVAudioUnitReverb()
engine.attach(player)
engine.attach(reverb)
// Set the desired reverb parameters.
reverb.loadFactoryPreset(.mediumHall)
reverb.wetDryMix = 50
// Connect the nodes.
engine.connect(player, to: reverb, format: format)
engine.connect(reverb, to: engine.mainMixerNode, format: format)
// Schedule the source file.
player.scheduleFile(sourceFile, at: nil)
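The graph above uses a reverb effect for illustration. Since the question is about pitch and speed, the same setup can use an AVAudioUnitTimePitch node instead of the reverb. This is a sketch that replaces the engine-configuration step; the pitch and rate values are illustrative (the -500 matches the value used in the question):

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()

engine.attach(player)
engine.attach(timePitch)

// Pitch is measured in cents (100 cents = 1 semitone);
// rate is a playback-speed multiplier (1.0 = normal speed).
timePitch.pitch = -500.0
timePitch.rate = 1.5

// Connect the nodes: player -> time pitch -> main mixer.
engine.connect(player, to: timePitch, format: format)
engine.connect(timePitch, to: engine.mainMixerNode, format: format)

// Schedule the source file.
player.scheduleFile(sourceFile, at: nil)
```

The rest of the offline rendering steps below work unchanged, since they only depend on the engine, not on which effect node is in the chain.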
Enable offline manual rendering mode
do {
    // The maximum number of frames the engine renders in any single render call.
    let maxFrames: AVAudioFrameCount = 4096
    try engine.enableManualRenderingMode(.offline, format: format,
                                         maximumFrameCount: maxFrames)
} catch {
    fatalError("Enabling manual rendering mode failed: \(error).")
}
// Start the engine.
do {
    try engine.start()
    player.play()
} catch {
    fatalError("Unable to start audio engine: \(error).")
}
Prepare the output destination
// The output buffer to which the engine renders the processed data.
let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
frameCapacity: engine.manualRenderingMaximumFrameCount)!
let outputFile: AVAudioFile
do {
    let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let outputURL = documentsURL.appendingPathComponent("Rhythm-processed.caf")
    outputFile = try AVAudioFile(forWriting: outputURL, settings: sourceFile.fileFormat.settings)
} catch {
    fatalError("Unable to open output audio file: \(error).")
}
Render the audio manually
while engine.manualRenderingSampleTime < sourceFile.length {
    do {
        let frameCount = sourceFile.length - engine.manualRenderingSampleTime
        let framesToRender = min(AVAudioFrameCount(frameCount), buffer.frameCapacity)
        let status = try engine.renderOffline(framesToRender, to: buffer)

        switch status {
        case .success:
            // The data rendered successfully. Write it to the output file.
            try outputFile.write(from: buffer)
        case .insufficientDataFromInputNode:
            // Applicable only when using the input node as one of the sources.
            break
        case .cannotDoInCurrentContext:
            // The engine couldn't render in the current render call.
            // Retry in the next iteration.
            break
        case .error:
            // An error occurred while rendering the audio.
            fatalError("The manual rendering failed.")
        }
    } catch {
        fatalError("The manual rendering failed: \(error).")
    }
}
// Stop the player node and engine.
player.stop()
engine.stop()
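Once the engine has stopped, the processed file is on disk. As a quick sanity check (a sketch, reusing the outputFile and sourceFile from the steps above), you can leave manual rendering mode and reopen the file to confirm it contains the rendered audio:

```swift
// Leave manual rendering mode so the engine can be reused for live playback later.
engine.disableManualRenderingMode()

// Reopen the rendered file and compare its length with the source.
do {
    let renderedFile = try AVAudioFile(forReading: outputFile.url)
    print("Rendered \(renderedFile.length) frames (source: \(sourceFile.length)).")
} catch {
    print("Could not reopen the rendered file: \(error).")
}
```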
The reference link
Comments: