AVCaptureSession abandoned memory - Allocations - Instruments
Posted: 2011-03-11 15:45:40

I am using a default AVCaptureSession to capture the camera view. Everything works and I don't have any leaks, but when I use Allocations to look for abandoned memory after starting and shutting down the AVCaptureDevice, it shows me about 230 objects that are still alive.
Here is my code:
Controller.h:
    @interface Controller : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
    {
        AVCaptureSession *captureSession;
        AVCaptureDevice *device;
        IBOutlet UIView *previewLayer;
    }

    @property (nonatomic, retain) AVCaptureSession *captureSession;
    @property (nonatomic, retain) UIView *previewLayer;

    - (void)setupCaptureSession;
    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection;
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer;

    @end
Controller.m:
    - (void)setupCaptureSession
    {
        NSError *error = nil;

        [self setCaptureSession:[[AVCaptureSession alloc] init]];
        self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

        device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error])
        {
            [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
            [device unlockForConfiguration];
        }

        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (!input)
        {
            // TODO: Handle the error when the input cannot be created
        }
        [[self captureSession] addInput:input];

        AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
        [[self captureSession] addOutput:output];

        dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
        [output setSampleBufferDelegate:self queue:queue];
        dispatch_release(queue);

        output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                           forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        output.minFrameDuration = CMTimeMake(1, 15);

        [[self captureSession] startRunning];

        AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        captureVideoPreviewLayer.frame = previewLayer.bounds;
        [previewLayer.layer insertSublayer:captureVideoPreviewLayer atIndex:0];
        [previewLayer setHidden:NO];
    }
    // Delegate routine that is called when a sample buffer was written
    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        if (mutex && ![device isAdjustingFocus] && ![device isAdjustingExposure] && ![device isAdjustingWhiteBalance])
        {
            mutex = NO;

            // Create a UIImage from the sample buffer data
            UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
            image = [Tools rotateImage:image andRotateAngle:UIImageOrientationUp];

            CGRect rect;
            rect.size.width = 210;
            rect.size.height = 50;
            rect.origin.x = 75;
            rect.origin.y = 175;

            UIImage *croppedImage = [image resizedImage:image.size interpolationQuality:kCGInterpolationHigh];
            croppedImage = [croppedImage croppedImage:rect];
            croppedImage = [self processImage:croppedImage];

            [NSThread detachNewThreadSelector:@selector(threadedReadAndProcessImage:) toTarget:self withObject:croppedImage];
        }
    }
    // Create a UIImage from sample buffer data
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
    {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

        // Lock the pixel buffer and get its base address
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

        // Create a bitmap graphics context with the sample buffer data
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                     colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);

        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        UIImage *image = [UIImage imageWithCGImage:quartzImage];
        CGImageRelease(quartzImage);

        return image;
    }
I clean everything up with this code:
    - (void)cancelTapped
    {
        [[self captureSession] stopRunning], self.captureSession = nil;

        for (UIView *view in self.previewLayer.subviews)
        {
            [view removeFromSuperview];
        }

        [self dismissModalViewControllerAnimated:YES];
    }

    - (void)dealloc
    {
        [super dealloc];
        [captureSession release];
        [device release];
        [previewLayer release];
    }
Instruments shows me something like this:

http://i.stack.imgur.com/NBWgZ.png
http://i.stack.imgur.com/1GB6C.png
Any idea what I am doing wrong?
Comments:
Can you take a look at the question here? ***.com/questions/11717962/…

Answer 1:

    - (void)setupCaptureSession
    {
        NSError *error = nil;
        [self setCaptureSession:[[AVCaptureSession alloc] init]];
        ...
    }
This leaks the capture session, which will keep all of its inputs and outputs, and all of their internal little helpers, alive.

Two options:
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    self.captureSession = session;
    [session release], session = nil;

    // or:
    self.captureSession = [[[AVCaptureSession alloc] init] autorelease];
Comments:
But what is the problem? I release it in dealloc: [captureSession release];

You declared @property (nonatomic, retain) AVCaptureSession *captureSession;, so that setter retains your newly created object, which (because of alloc/init) you already own.
@danyowdee Can you take a look at the question here? ***.com/questions/11717962/…

Answer 2:
    - (void)dealloc
    {
        [super dealloc];
        [captureSession release];
        [device release];
        [previewLayer release];
    }
[super dealloc] should be called after the other releases; otherwise your instance's memory may no longer contain valid pointers to the objects you are releasing, so you won't actually release them (in particular if those ivars have been wiped to nil).