kAudioUnitType_Output audio unit with kAudioUnitSubType_VoiceProcessingIO clipping
Posted: 2015-04-17 14:47:08

[Question]: I am developing a recording and playback app. I am using a kAudioUnitType_Output audio unit with the kAudioUnitSubType_VoiceProcessingIO subtype. Sometimes it works fine, but sometimes there is a lot of clipping. I suspect ambient noise is involved, but I don't know whether the clipping is a side effect of the AEC or whether my audio unit is set up incorrectly.
Here are my setup functions:
struct CallbackData {
    AudioUnit rioUnit;
    BOOL     *audioChainIsBeingReconstructed;

    CallbackData(): rioUnit(NULL), audioChainIsBeingReconstructed(NULL) {}
} cd;

static OSStatus performRender (void                        *inRefCon,
                               AudioUnitRenderActionFlags  *ioActionFlags,
                               const AudioTimeStamp        *inTimeStamp,
                               UInt32                       inBusNumber,
                               UInt32                       inNumberFrames,
                               AudioBufferList             *ioData)
{
    OSStatus err = noErr;
    if (*cd.audioChainIsBeingReconstructed == NO) {
        // Pull the captured (voice-processed) audio from input bus 1 into ioData
        err = AudioUnitRender(cd.rioUnit, ioActionFlags, inTimeStamp, 1, inNumberFrames, ioData);

        float *inputFrames = (float *)(ioData->mBuffers[0].mData);
        //engine_process_ios(inputFrames, ioData->mBuffers[0].mNumberChannels * inNumberFrames);
    }
    return err;
}
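Since the unit is configured for interleaved Float32 (see setupIOUnit below), one way to check whether the clipping is already present in the rendered buffers, rather than being introduced later, is to watch the peak magnitude inside the callback; any sample at or above 1.0 will clip at the hardware. A minimal diagnostic sketch, where measurePeak is a hypothetical helper and not part of the original code:

#include <math.h>

// Diagnostic only: return the largest sample magnitude in the rendered buffer.
// With the Float32 format used here, values >= 1.0f will clip on output.
static float measurePeak(const AudioBufferList *ioData, UInt32 inNumberFrames)
{
    const float *samples = (const float *)ioData->mBuffers[0].mData;
    UInt32 count = inNumberFrames * ioData->mBuffers[0].mNumberChannels;
    float peak = 0.0f;
    for (UInt32 i = 0; i < count; ++i) {
        float mag = fabsf(samples[i]);
        if (mag > peak) peak = mag;
    }
    return peak; // call from performRender and log it occasionally, not on every slice
}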
- (void)setupAudioSession
{
    try {
        // Configure the audio session
        AVAudioSession *sessionInstance = [AVAudioSession sharedInstance];
        NSError *error = nil;

        [sessionInstance setCategory:AVAudioSessionCategoryPlayAndRecord
                         withOptions:AVAudioSessionCategoryOptionAllowBluetooth
                               error:&error];

        NSTimeInterval bufferDuration = .005;
        [sessionInstance setPreferredIOBufferDuration:bufferDuration error:&error];
        [sessionInstance setPreferredSampleRate:44100 error:&error];

        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(handleInterruption:)
                                                     name:AVAudioSessionInterruptionNotification
                                                   object:sessionInstance];
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(handleRouteChange:)
                                                     name:AVAudioSessionRouteChangeNotification
                                                   object:sessionInstance];
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(handleMediaServerReset:)
                                                     name:AVAudioSessionMediaServicesWereResetNotification
                                                   object:sessionInstance];

        [[AVAudioSession sharedInstance] setActive:YES error:&error];
    }
    catch (NSException *e) {
        NSLog(@"Error returned from setupAudioSession");
    }
    catch (...) {
        NSLog(@"Unknown error returned from setupAudioSession");
    }
    return;
}
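Worth noting before the I/O unit setup below: the calls above only state preferences, and the active hardware may come back with a different sample rate or I/O buffer duration than the requested 44100 Hz / 5 ms. A short sketch, purely as an illustration, of reading back what was actually granted after activation:

// After -setActive:error:, query the values the hardware actually granted.
AVAudioSession *session = [AVAudioSession sharedInstance];
NSLog(@"granted sample rate: %.0f Hz", session.sampleRate);
NSLog(@"granted IO buffer duration: %.4f s", session.IOBufferDuration);
// If the granted rate differs from 44100, consider using it when building the ASBD in setupIOUnit.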
- (void)setupIOUnit
{
    try {
        // Create a new instance of the voice-processing I/O unit
        AudioComponentDescription desc;
        desc.componentType = kAudioUnitType_Output;
        desc.componentSubType = kAudioUnitSubType_VoiceProcessingIO;
        desc.componentManufacturer = kAudioUnitManufacturer_Apple;
        desc.componentFlags = 0;
        desc.componentFlagsMask = 0;

        AudioComponent comp = AudioComponentFindNext(NULL, &desc);
        AudioComponentInstanceNew(comp, &_rioUnit);

        // Enable input on bus 1 and output on bus 0
        UInt32 one = 1;
        AudioUnitSetProperty(_rioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input, 1, &one, sizeof(one));
        AudioUnitSetProperty(_rioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, 0, &one, sizeof(one));

        // Interleaved 32-bit float, stereo, 44.1 kHz on both the input and output side of the unit
        CAStreamBasicDescription ioFormat = CAStreamBasicDescription(44100, 2, CAStreamBasicDescription::kPCMFormatFloat32, true);
        AudioUnitSetProperty(_rioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 1, &ioFormat, sizeof(ioFormat));
        AudioUnitSetProperty(_rioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &ioFormat, sizeof(ioFormat));

        UInt32 maxFramesPerSlice = 4096;
        AudioUnitSetProperty(_rioUnit, kAudioUnitProperty_MaximumFramesPerSlice, kAudioUnitScope_Global, 0, &maxFramesPerSlice, sizeof(UInt32));

        UInt32 propSize = sizeof(UInt32);
        AudioUnitGetProperty(_rioUnit, kAudioUnitProperty_MaximumFramesPerSlice, kAudioUnitScope_Global, 0, &maxFramesPerSlice, &propSize);

        cd.rioUnit = _rioUnit;
        cd.audioChainIsBeingReconstructed = &_audioChainIsBeingReconstructed;

        // Set the render callback on AURemoteIO
        AURenderCallbackStruct renderCallback;
        renderCallback.inputProc = performRender;
        renderCallback.inputProcRefCon = NULL;
        AudioUnitSetProperty(_rioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Input, 0, &renderCallback, sizeof(renderCallback));

        // Initialize the AURemoteIO instance
        AudioUnitInitialize(_rioUnit);
        //if (err) NSLog(@"couldn't start AURemoteIO: %d", (int)err);
    }
    catch (NSException *e) {
        NSLog(@"Error returned from setupIOUnit");
    }
    catch (...) {
        NSLog(@"Unknown error returned from setupIOUnit");
    }
    return;
}
What could be causing this clipping?
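One way to narrow this down, sketched here only as a diagnostic idea rather than as anything from the original post, is to temporarily bypass the voice-processing block, or disable its automatic gain control, and check whether the clipping persists. Both properties are set on the global scope before AudioUnitInitialize:

// Diagnostic: bypass the voice-processing DSP entirely (plain pass-through I/O).
UInt32 bypass = 1;
AudioUnitSetProperty(_rioUnit, kAUVoiceIOProperty_BypassVoiceProcessing,
                     kAudioUnitScope_Global, 0, &bypass, sizeof(bypass));

// Or keep AEC but turn off automatic gain control, one candidate for level jumps on noisy input.
UInt32 enableAGC = 0;
AudioUnitSetProperty(_rioUnit, kAUVoiceIOProperty_VoiceProcessingEnableAGC,
                     kAudioUnitScope_Global, 0, &enableAGC, sizeof(enableAGC));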
[Comments]:
What happens inside engine_process_iOS()?

engine_process_iOS() is the function I use to process the samples and run the audio through my own equalizer. I tried starting the audio unit without calling that function, and the problem is the same.

Hi, did you ever fix that clipping?

[Answer 1]: The samples in the audio buffer in the callback should be SInt16. Try casting them:
SInt16 *inputFrames = (SInt16 *)(ioData->mBuffers[0].mData);
[Discussion]:
But in my AudioUnit init I set the stream format to "CAStreamBasicDescription::kPCMFormatFloat32".
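Whether the callback buffers really hold Float32 or SInt16 samples can be settled by asking the unit for the format it is actually using. A small sketch, assuming the same _rioUnit handle as in setupIOUnit:

// Read back the stream format the render callback is expected to supply
// (input scope of the output element, bus 0).
AudioStreamBasicDescription asbd = {0};
UInt32 size = sizeof(asbd);
AudioUnitGetProperty(_rioUnit, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Input, 0, &asbd, &size);
BOOL isFloat = (asbd.mFormatFlags & kAudioFormatFlagIsFloat) != 0;
NSLog(@"bits per channel: %u, float: %d, bytes per frame: %u",
      (unsigned)asbd.mBitsPerChannel, (int)isFloat, (unsigned)asbd.mBytesPerFrame);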
如何翻转正面带有标签而背面带有另一个标签的视图 - 参见图片
CakePHP 如何处理带有/不带有 'id' 字段的 HABTM 表?
带有 RecyclerView 的 DialogFragment 比带有 Recyclerview 的 Fragment 慢