View in XIB Not Resizing For Different Devices

Posted: 2016-07-30 20:20:15

Question:

In my app I subclass UIView in order to capture video from the camera, which is shown in the view with a UIImage overlaid on top. I have AutoLayout turned on, but the problem is that the view always comes out the same size as whatever is listed under Simulated Metrics in the XIB. If I leave Simulated Metrics at Inferred or Freeform it doesn't work for either of them, and if I set it to iPhone 4.7-inch, on other devices you can see the background behind it at the edges. Here is the code of the ViewController where the AVCaptureSession happens:

-(void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];

    NSNumber *value = [NSNumber numberWithInt:UIInterfaceOrientationLandscapeLeft];
    [[UIDevice currentDevice] setValue:value forKey:@"orientation"];

    NSDate *today = [NSDate date];
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"MMM d hh:mm:ss a"];
    // display in 12HR/24HR (i.e. 11:25PM or 23:25) format according to User Settings
    NSString *currentTime = [dateFormatter stringFromDate:today];

    NSError *error4 = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryAmbient error:&error4];
    OSStatus propertySetError = 0;
    UInt32 allowMixing = true;
    propertySetError |= AudioSessionSetProperty(kAudioSessionProperty_OtherMixableAudioShouldDuck, sizeof(allowMixing), &allowMixing);

    // Activate the audio session
    error4 = nil;
    if (![audioSession setActive:YES error:&error4]) {
        NSLog(@"AVAudioSession setActive:YES failed: %@", [error4 localizedDescription]);
    }

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [paths objectAtIndex:0];

    self.session = [[AVCaptureSession alloc] init];
    [self.session beginConfiguration];
    self.session.sessionPreset = AVCaptureSessionPreset1280x720;
    self.navigationController.navigationBarHidden = YES;

    NSError *error = nil;

    AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    NSError *error2 = nil;
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error2];

    // get the default (back-facing) camera and its input
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

    AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

    NSString *archives = [documentsDirectoryPath stringByAppendingPathComponent:@"archives"];
    NSString *editedfilename = [[@"ComeOnDown" lastPathComponent] stringByDeletingPathExtension];
    NSString *datestring = [[editedfilename stringByAppendingString:@" "] stringByAppendingString:currentTime];
    NSLog(@"%@", datestring);
    NSString *outputpathofmovie = [[archives stringByAppendingPathComponent:datestring] stringByAppendingString:@".mp4"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputpathofmovie];

    [self.session addInput:audioInput];
    [self.session addInput:deviceInput];
    [self.session addOutput:movieFileOutput];
    [self.session commitConfiguration];
    [self.session startRunning];

    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    previewLayer.backgroundColor = [[UIColor blackColor] CGColor];
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    previewLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;

    CALayer *overlayLayer = [CALayer layer];
    UIImage *overlayImage = [UIImage imageNamed:@"LiveBorder.png"];
    [overlayLayer setContents:(id)[overlayImage CGImage]];
    overlayLayer.frame = CGRectMake(0, 0, self.view.bounds.size.width, self.view.bounds.size.height);
    [overlayLayer setMasksToBounds:YES];

    CALayer *rootLayer = [self.view layer];
    [rootLayer setMasksToBounds:YES];
    [previewLayer setFrame:[rootLayer bounds]];
    [rootLayer addSublayer:previewLayer];
    [rootLayer addSublayer:overlayLayer];
    [rootLayer addSublayer:label2.layer];

    movieFileOutput.movieFragmentInterval = kCMTimeInvalid;
    [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];

    if (error) {
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:
                                  [NSString stringWithFormat:@"Failed with error %d", (int)[error code]]
                                                            message:[error localizedDescription]
                                                           delegate:nil
                                                  cancelButtonTitle:@"Dismiss"
                                                  otherButtonTitles:nil];
        [alertView show];
    }
}

-(void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    self.navigationController.navigationBarHidden = NO;
}


-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections {
    // recording started; nothing to do here
}

-(void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    NSLog(@"Finished with error: %@", error);
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    NSLog(@"Finished");
    NSString *proud = [[NSString alloc] initWithString:[outputFileURL path]];
    // note: a @selector() name must not contain spaces
    UISaveVideoAtPathToSavedPhotosAlbum(proud, self, @selector(video:didFinishSavingWithError:contextInfo:), (__bridge void *)(proud));
}

-(IBAction)goBackNow {
    [myAVPlayer stop];
    [self.session stopRunning];
    // [captureView performSelector:@selector(stopRecording) withObject:nil afterDelay:0.0];

    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@/%@", [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0], @"output.mp4"];
    UISaveVideoAtPathToSavedPhotosAlbum(outputPath, self, @selector(video:didFinishSavingWithError:contextInfo:), (__bridge void *)(outputPath));

    [self.navigationController popToRootViewControllerAnimated:YES];
    self.navigationController.navigationBarHidden = NO;
}

-(void)allDone {
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}


@end

Here is the code for the UIView subclass:

- (void) writeVideoFrameAtTime:(CMTime)time;
@end

@implementation ScreenCaptureView

@synthesize currentScreen, frameRate, delegate;

- (void) initialize {
    // Initialization code
    self.clearsContextBeforeDrawing = YES;
    self.currentScreen = nil;
    self.frameRate = 10.0f;     // 10 frames per second
    _recording = false;
    videoWriter = nil;
    videoWriterInput = nil;
    avAdaptor = nil;
    startedAt = nil;
    bitmapData = NULL;
}

- (id) initWithCoder:(NSCoder *)aDecoder {
    self = [super initWithCoder:aDecoder];
    if (self) {
        [self initialize];
    }
    return self;
}

- (id) init {
    self = [super init];
    if (self) {
        [self initialize];
    }
    return self;
}

- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        [self initialize];
    }
    return self;
}

- (CGContextRef) createBitmapContextOfSize:(CGSize)size {
    CGContextRef    context = NULL;
    CGColorSpaceRef colorSpace;
    int             bitmapByteCount;
    int             bitmapBytesPerRow;

    bitmapBytesPerRow   = (size.width * 4);
    bitmapByteCount     = (bitmapBytesPerRow * size.height);
    colorSpace = CGColorSpaceCreateDeviceRGB();
    if (bitmapData != NULL) {
        free(bitmapData);
    }
    bitmapData = malloc(bitmapByteCount);
    if (bitmapData == NULL) {
        fprintf(stderr, "Memory not allocated!");
        CGColorSpaceRelease(colorSpace);
        return NULL;
    }

    context = CGBitmapContextCreate(bitmapData,
                                    size.width,
                                    size.height,
                                    8,      // bits per component
                                    bitmapBytesPerRow,
                                    colorSpace,
                                    kCGImageAlphaNoneSkipFirst);
    if (context == NULL) {
        free(bitmapData);
        bitmapData = NULL;
        CGColorSpaceRelease(colorSpace);
        fprintf(stderr, "Context not created!");
        return NULL;
    }
    // check for NULL before touching the context
    CGContextSetAllowsAntialiasing(context, NO);
    CGColorSpaceRelease(colorSpace);

    return context;
}

//static int frameCount = 0;            //debugging
- (void) drawRect:(CGRect)rect {
    NSDate *start = [NSDate date];
    CGContextRef context = [self createBitmapContextOfSize:self.frame.size];

    // the image renders upside-down and mirrored without this flip
    CGAffineTransform flipVertical = CGAffineTransformMake(1, 0, 0, -1, 0, self.frame.size.height);
    CGContextConcatCTM(context, flipVertical);

    [self.layer renderInContext:context];

    CGImageRef cgImage = CGBitmapContextCreateImage(context);
    UIImage *background = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    self.currentScreen = background;

    // debugging
    //if (frameCount < 40) {
    //    NSString *filename = [NSString stringWithFormat:@"Documents/frame_%d.png", frameCount];
    //    NSString *pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
    //    [UIImagePNGRepresentation(self.currentScreen) writeToFile:pngPath atomically:YES];
    //    frameCount++;
    //}

    // NOTE: to record a scroll view while it is scrolling, implement your
    // UIScrollViewDelegate so that it calls setNeedsDisplay on the ScreenCaptureView.
    if (_recording) {
        float millisElapsed = [[NSDate date] timeIntervalSinceDate:startedAt] * 1000.0;
        [self writeVideoFrameAtTime:CMTimeMake((int)millisElapsed, 1000)];
    }

    float processingSeconds = [[NSDate date] timeIntervalSinceDate:start];
    float delayRemaining = (1.0 / self.frameRate) - processingSeconds;

    CGContextRelease(context);

    // redraw at the specified frame rate
    [self performSelector:@selector(setNeedsDisplay) withObject:nil afterDelay:delayRemaining > 0.0 ? delayRemaining : 0.01];
}

- (void) cleanupWriter {
    [avAdaptor release];
    avAdaptor = nil;

    [videoWriterInput release];
    videoWriterInput = nil;

    [videoWriter release];
    videoWriter = nil;

    [startedAt release];
    startedAt = nil;

    if (bitmapData != NULL) {
        free(bitmapData);
        bitmapData = NULL;
    }
}

- (void)dealloc {
    [self cleanupWriter];
    [super dealloc];
}

- (NSURL*) tempFileURL {
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@/%@", [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0], @"output.mp4"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:outputPath]) {
        NSError *error;
        if ([fileManager removeItemAtPath:outputPath error:&error] == NO) {
            NSLog(@"Could not delete old recording file at path:  %@", outputPath);
        }
    }

    [outputPath release];
    return [outputURL autorelease];
}

-(BOOL) setUpWriter {
    NSError *error = nil;
    videoWriter = [[AVAssetWriter alloc] initWithURL:[self tempFileURL] fileType:AVFileTypeQuickTimeMovie error:&error];
    NSParameterAssert(videoWriter);

    // Configure video compression
    NSDictionary *videoCompressionProps = [NSDictionary dictionaryWithObjectsAndKeys:
                                           [NSNumber numberWithDouble:1024.0*1024.0], AVVideoAverageBitRateKey,
                                           nil];

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:self.frame.size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:self.frame.size.height], AVVideoHeightKey,
                                   videoCompressionProps, AVVideoCompressionPropertiesKey,
                                   nil];

    videoWriterInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings] retain];
    NSParameterAssert(videoWriterInput);
    videoWriterInput.expectsMediaDataInRealTime = YES;

    NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
                                      [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
    avAdaptor = [[AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput sourcePixelBufferAttributes:bufferAttributes] retain];

    // add input and start writing
    [videoWriter addInput:videoWriterInput];
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:CMTimeMake(0, 1000)];

    return YES;
}

- (void) completeRecordingSession {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    [videoWriterInput markAsFinished];

    // Wait for the writer to leave the Unknown state
    int status = videoWriter.status;
    while (status == AVAssetWriterStatusUnknown) {
        NSLog(@"Waiting...");
        [NSThread sleepForTimeInterval:0.5f];
        status = videoWriter.status;
    }

    @synchronized(self) {
        BOOL success = [videoWriter finishWriting];
        if (!success) {
            NSLog(@"finishWriting returned NO");
        }

        [self cleanupWriter];

        id delegateObj = self.delegate;
        NSString *outputPath = [[NSString alloc] initWithFormat:@"%@/%@", [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0], @"output.mp4"];
        NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];

        NSLog(@"Completed recording, file is stored at:  %@", outputURL);
        if ([delegateObj respondsToSelector:@selector(recordingFinished:)]) {
            [delegateObj performSelectorOnMainThread:@selector(recordingFinished:) withObject:(success ? outputURL : nil) waitUntilDone:YES];
        }

        [outputPath release];
        [outputURL release];
    }

    [pool drain];
}

- (bool) startRecording {
    bool result = NO;
    @synchronized(self) {
        if (!_recording) {
            result = [self setUpWriter];
            startedAt = [[NSDate date] retain];
            _recording = true;
        }
    }
    return result;
}

- (void) stopRecording {
    @synchronized(self) {
        if (_recording) {
            _recording = false;
            [self completeRecordingSession];
        }
    }
}

-(void) writeVideoFrameAtTime:(CMTime)time {
    if (![videoWriterInput isReadyForMoreMediaData]) {
        NSLog(@"Not ready for video data");
    }
    else {
        @synchronized (self) {
            UIImage *newFrame = [self.currentScreen retain];
            CVPixelBufferRef pixelBuffer = NULL;
            CGImageRef cgImage = CGImageCreateCopy([newFrame CGImage]);
            CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));

            int status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, avAdaptor.pixelBufferPool, &pixelBuffer);
            if (status != 0) {
                // could not get a buffer from the pool
                NSLog(@"Error creating pixel buffer:  status=%d", status);
            }

            // copy the image data into the pixel buffer
            CVPixelBufferLockBaseAddress(pixelBuffer, 0);
            uint8_t *destPixels = CVPixelBufferGetBaseAddress(pixelBuffer);
            CFDataGetBytes(image, CFRangeMake(0, CFDataGetLength(image)), destPixels);  // XXX: works only if the pixel buffer is contiguous and has the same bytesPerRow as the input data

            if (status == 0) {
                BOOL success = [avAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
                if (!success) {
                    NSLog(@"Warning:  Unable to write buffer to video");
                }
            }

            // clean up
            [newFrame release];
            CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
            CVPixelBufferRelease(pixelBuffer);
            CFRelease(image);
            CGImageRelease(cgImage);
        }
    }
}

@end


Answer 1:

From what I've observed, the size of self.view is not final until

-(void)viewDidAppear:(BOOL)animated

so putting your code in viewWillAppear won't help.

You could also do this on every orientation change:

    self.view.frame = [[UIScreen mainScreen] bounds];
    _prevLayer.frame = [[UIScreen mainScreen] bounds];
    _prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [_prevLayer.connection setVideoOrientation:statusBarOrientation];
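Another way to apply the same idea is a minimal sketch like the following, assuming the preview layer is kept in a property (in the question's code it is only a local variable, so `self.previewLayer` is an assumed name). `viewDidLayoutSubviews` runs after Auto Layout has given `self.view` its final, device-specific size, so resizing the layer there covers both the first appearance and later rotations:

```objc
// Sketch under the assumption that the controller stores the layer:
// @property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;

- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    // By this point self.view.bounds reflects the actual device,
    // not the Simulated Metrics size from the XIB.
    self.previewLayer.frame = self.view.bounds;
}
```

Unlike views, CALayers are not resized by Auto Layout, which is why the preview layer must be re-framed manually whenever the view's bounds change.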

