Can't play system sounds after capturing audio / video
【Posted】2011-12-15 16:36:29
【Question】I'm using AVFoundation to record audio/video. Before starting the video/audio capture I need to play a sound through System Sound Services. The first time this works fine, but the second time I try it the system sound does not play. My guess is that something in AVFoundation is not being released properly.
In my application delegate I have the following code in the applicationDidFinishLaunching method:
VKRSAppSoundPlayer *aPlayer = [[VKRSAppSoundPlayer alloc] init];
[aPlayer addSoundWithFilename:@"sound1" andExtension:@"caf"];
self.appSoundPlayer = aPlayer;
[aPlayer release];
And this method:
- (void)playSound:(NSString *)sound
{
    [appSoundPlayer playSound:sound];
}
As you can see, I'm using VKRSAppSoundPlayer, and it works great!
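A player like that is usually just a thin wrapper around System Sound Services. For reference, here is a minimal sketch of that idea, assuming an AudioServicesCreateSystemSoundID-based implementation (VKRSAppSoundPlayer's actual internals may differ):

#import <AudioToolbox/AudioToolbox.h>

// Minimal sketch: load a short .caf from the main bundle and play it
// as a system sound. Error handling is omitted for brevity.
NSURL *soundURL = [[NSBundle mainBundle] URLForResource:@"sound1" withExtension:@"caf"];
SystemSoundID soundID;
OSStatus status = AudioServicesCreateSystemSoundID((CFURLRef)soundURL, &soundID);
if (status == kAudioServicesNoError)
{
    AudioServicesPlaySystemSound(soundID);
    // AudioServicesDisposeSystemSoundID(soundID); // when the sound is no longer needed
}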
In one of the views I have this code:
- (void) startSession
{
    self.session = [[AVCaptureSession alloc] init];

    [session beginConfiguration];
    if([session canSetSessionPreset:AVCaptureSessionPreset640x480])
        session.sessionPreset = AVCaptureSessionPresetMedium;
    [session commitConfiguration];

    CALayer *viewLayer = [videoPreviewView layer];
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = viewLayer.bounds;
    [viewLayer addSublayer:captureVideoPreviewLayer];

    self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:[self frontFacingCameraIfAvailable] error:nil];
    self.audioInput = [AVCaptureDeviceInput deviceInputWithDevice:[self audioDevice] error:nil];

    if(videoInput)
    {
        self.videoOutput = [[AVCaptureMovieFileOutput alloc] init];
        [session addOutput:videoOutput];
        //[videoOutput release];
    }

    if([session canAddInput:videoInput])
    {
        //[session beginConfiguration];
        [session addInput:videoInput];
        //[videoInput release];
    }

    [session removeInput:[self audioInput]];
    if([session canAddInput:audioInput])
    {
        [session addInput:audioInput];
        //[audioInput release];
    }

    if([session canAddInput:audioInput])
    {
        [session addInput:audioInput];
    }

    NSLog(@"startRunning!");
    [session startRunning];

    [self startRecording];
    if(![self recordsVideo])
    {
        [self showAlertWithTitle:@"Video Recording Unavailable" msg:@"This device can't record video."];
    }
}
- (void) stopSession
{
    [session stopRunning];
    [session release];
}
- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;
    Boolean cameraFound = false;

    for (AVCaptureDevice *device in videoDevices)
    {
        NSLog(@"1 frontFacingCameraIfAvailable %d", device.position);
        if (device.position == AVCaptureDevicePositionBack)
        {
            NSLog(@"1 frontFacingCameraIfAvailable FOUND");
            captureDevice = device;
            cameraFound = true;
            break;
        }
    }

    if(cameraFound == false)
    {
        for (AVCaptureDevice *device in videoDevices)
        {
            NSLog(@"2 frontFacingCameraIfAvailable %d", device.position);
            if (device.position == AVCaptureDevicePositionFront)
            {
                NSLog(@"2 frontFacingCameraIfAvailable FOUND");
                captureDevice = device;
                break;
            }
        }
    }

    return captureDevice;
}
- (AVCaptureDevice *) audioDevice
{
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio];
    if ([devices count] > 0)
        return [devices objectAtIndex:0];
    return nil;
}
- (void) startRecording
{
#if _Multitasking_
    if ([[UIDevice currentDevice] isMultitaskingSupported])
    {
        [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^{ }]];
    }
#endif
    [videoOutput startRecordingToOutputFileURL:[self generatenewVideoPath]
                             recordingDelegate:self];
}
- (void) stopRecording
{
    [videoOutput stopRecording];
}
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSFileManager *man = [[NSFileManager alloc] init];
    NSDictionary *attrs = [man attributesOfItemAtPath:[outputFileURL path] error:NULL];
    NSString *fileSize = [NSString stringWithFormat:@"%llu", [attrs fileSize]];

    // close this screen
    [self exitScreen];
}
-(BOOL)recordsVideo
{
    AVCaptureConnection *videoConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo
                                                                    fromConnections:[videoOutput connections]];
    return [videoConnection isActive];
}
-(BOOL)recordsAudio
{
    AVCaptureConnection *audioConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeAudio
                                                                    fromConnections:[videoOutput connections]];
    return [audioConnection isActive];
}
If I do [videoInput release]; and [audioInput release]; I get a bad access error. That's why they are commented out, and it could be part of the problem.
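The bad access fits the basic manual retain/release ownership rule: deviceInputWithDevice:error: returns an object you do not own (it is autoreleased), so calling release on it over-releases it; the retained property should instead be cleared by assigning nil (or released once in dealloc). A minimal sketch of that pattern, assuming videoInput and audioInput are declared as retain properties (the declarations below are an assumption, not shown in the original code):

// Assumption: declared as retain properties, e.g.
// @property (nonatomic, retain) AVCaptureDeviceInput *videoInput;
// @property (nonatomic, retain) AVCaptureDeviceInput *audioInput;

// deviceInputWithDevice:error: returns an autoreleased object; the
// property setter retains it, so no extra release is needed here.
self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:[self frontFacingCameraIfAvailable] error:nil];
self.audioInput = [AVCaptureDeviceInput deviceInputWithDevice:[self audioDevice] error:nil];

// When tearing down, release through the property setters:
self.videoInput = nil;   // setter releases the old value
self.audioInput = nil;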
If I just play the system sound n times in a row it works, but once I have gone through the recording flow first, it no longer plays.
Any ideas?
【Comments】:
You need a better understanding of what self.iVar means and what release does. release decrements the retain count; once the count reaches 0, the object becomes eligible for deallocation. Assigning through self.iVar (assuming you declared it as a property) retains the object, so you can release your local reference right afterwards. But I don't think that's the cause of your audio problem.
【Answer 1】: The correct way to release an AVCaptureSession is as follows:
- (void) destroySession
{
    // Notify the view that the session will end
    if ([delegate respondsToSelector:@selector(captureManagerSessionWillEnd:)])
        [delegate captureManagerSessionWillEnd:self];

    // remove the device inputs
    [session removeInput:[self videoInput]];
    [session removeInput:[self audioInput]];

    // release
    [session release];

    // remove AVCamRecorder
    [recorder release];

    // Notify the view that the session has ended
    if ([delegate respondsToSelector:@selector(captureManagerSessionEnded:)])
        [delegate captureManagerSessionEnded:self];
}
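Applied to the stopSession method from the question, the same idea would mean detaching the inputs and clearing the retained properties rather than only releasing the session. A rough sketch, under the assumption that session, videoInput, audioInput and videoOutput are declared as retain properties:

// In startSession, balance the alloc so the property holds the only retain:
//     AVCaptureSession *newSession = [[AVCaptureSession alloc] init];
//     self.session = newSession;
//     [newSession release];

- (void) stopSession
{
    [session stopRunning];

    // detach the inputs so the session lets go of the camera and microphone
    [session removeInput:[self videoInput]];
    [session removeInput:[self audioInput]];

    // clear the retained properties; each setter releases the old value
    self.videoInput  = nil;
    self.audioInput  = nil;
    self.videoOutput = nil;
    self.session     = nil;
}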
If you're running into some kind of release problem (bad access), I'd suggest pulling the code out of your current "messy" project into a fresh project and debugging the issue there.
That's what I did when I ran into a similar problem. I shared it on GitHub; you may find the project useful: AVCam-CameraReleaseTest