Convert incoming NSStream to View

Asked: 2014-03-13 20:54:06

Question:

I am successfully sending a stream of NSData. The delegate method below receives that stream and appends it to the NSMutableData self.data. How can I take this data and turn it into a UIView/AVCaptureVideoPreviewLayer (which should display video)? I feel like I am missing another conversion: AVCaptureSession > NSStream > MCSession > NSStream > ?

- (void)stream:(NSStream *)stream handleEvent:(NSStreamEvent)eventCode
{
    switch (eventCode) {
        case NSStreamEventHasBytesAvailable:
        {
            if (!self.data) {
                self.data = [NSMutableData data];
            }
            uint8_t buf[1024];
            // read:maxLength: returns NSInteger (-1 on error), not unsigned int
            NSInteger len = [(NSInputStream *)stream read:buf maxLength:1024];
            if (len > 0) {
                [self.data appendBytes:(const void *)buf length:len];
            } else {
                NSLog(@"no buffer!");
            }

            // Code here to take self.data and convert the NSData to UIView/Video
            break;
        }
        default:
            break;
    }
}
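One way to fill in that missing step, sketched below under heavy assumptions: it presumes each accumulated frame is a complete raw 32BGRA bitmap and that the receiver already knows the sender's width, height, and bytes-per-row (the code in this question never transmits them, so the constants here are placeholders, and the helper name is invented for illustration):

```objc
#import <UIKit/UIKit.h>

// Placeholder dimensions – must match the sender's pixel buffer and be
// agreed on out of band; they are NOT carried in the byte stream above.
static const size_t kFrameWidth       = 640;
static const size_t kFrameHeight      = 480;
static const size_t kFrameBytesPerRow = 640 * 4;

// Hypothetical helper: wrap raw BGRA pixel bytes in a UIImage.
static UIImage *ImageFromBGRAData(NSData *data)
{
    if (data.length < kFrameBytesPerRow * kFrameHeight) {
        return nil; // not a complete frame yet – keep accumulating
    }
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate((void *)data.bytes,
                                             kFrameWidth, kFrameHeight,
                                             8, kFrameBytesPerRow, colorSpace,
                                             kCGBitmapByteOrder32Little |
                                             kCGImageAlphaPremultipliedFirst);
    CGColorSpaceRelease(colorSpace);
    if (!ctx) return nil;

    CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
    CGContextRelease(ctx);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}
```

You could call this from the NSStreamEventHasBytesAvailable branch once a full frame's worth of bytes has arrived, show the result in a UIImageView on the main thread, and then reset self.data for the next frame.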

I send the stream with this:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
//    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    NSError *error;
    self.oStream = [self.mySession startStreamWithName:@"videoOut"
                                                toPeer:[[self.mySession connectedPeers] objectAtIndex:0]
                                                 error:&error];
    self.oStream.delegate = self;
    [self.oStream scheduleInRunLoop:[NSRunLoop mainRunLoop]
                            forMode:NSDefaultRunLoopMode];
    [self.oStream open];

    [self.oStream write:[data bytes] maxLength:[data length]];

    CGSize imageSize = CVImageBufferGetEncodedSize(imageBuffer);
    // also in the 'mediaSpecific' dict of the sampleBuffer

    NSLog(@"frame captured at %.fx%.f", imageSize.width, imageSize.height);
}
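Two things stand out in this sender: it opens a brand-new output stream on every frame, and it writes raw pixel bytes with no frame boundaries, so the receiver cannot tell where one frame ends and the next begins. A minimal sketch of one common fix, assuming the stream is set up once and each frame is prefixed with a 4-byte big-endian length (the method names and the one-time-setup pattern here are assumptions, not part of the original code):

```objc
// Assumed to be called once, e.g. when the peer connects – not per frame.
- (void)setUpOutputStreamIfNeeded
{
    if (self.oStream) return;
    NSError *error = nil;
    self.oStream = [self.mySession startStreamWithName:@"videoOut"
                                                toPeer:[[self.mySession connectedPeers] objectAtIndex:0]
                                                 error:&error];
    self.oStream.delegate = self;
    [self.oStream scheduleInRunLoop:[NSRunLoop mainRunLoop]
                            forMode:NSDefaultRunLoopMode];
    [self.oStream open];
}

// Write one frame with a 4-byte big-endian length prefix so the receiver
// can reassemble frame boundaries from the continuous byte stream.
- (void)writeFramedData:(NSData *)data
{
    uint32_t length = CFSwapInt32HostToBig((uint32_t)data.length);
    [self.oStream write:(const uint8_t *)&length maxLength:sizeof(length)];
    [self.oStream write:data.bytes maxLength:data.length];
}
```

Note that NSOutputStream's write:maxLength: may write fewer bytes than requested; a production version would loop until the whole buffer is consumed.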

Comments:

You might want to see whether you can use OpenGL: take your data, convert it to a GL texture, and display it with GL. There may be a higher-level API. Is the data not in any standard format?

What is the video format? A UIView? What is the link to the video?

The video format is AVCaptureSession.

Do you have to lock/unlock the pixels for every frame? I wonder whether that costs a lot of time.

I don't know what I'm doing. Where do you see that in the code?

Answer 1:

I think you need AVCamCaptureManager; see whether the code below works for you.

AVCamCaptureManager *manager = [[AVCamCaptureManager alloc] init];
[self setCaptureManager:manager];

[[self captureManager] setDelegate:self];

if ([[self captureManager] setupSession]) {
    // Create video preview layer and add it to the UI
    AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[[self captureManager] session]];
    UIView *view = self.videoPreviewView; // Add a view in the XIB where you want to show video
    CALayer *viewLayer = [view layer];
    [viewLayer setMasksToBounds:YES];
    CGRect bounds = [view bounds];

    [newCaptureVideoPreviewLayer setFrame:bounds];

    [newCaptureVideoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    [viewLayer insertSublayer:newCaptureVideoPreviewLayer below:[[viewLayer sublayers] objectAtIndex:0]];

    [self setCaptureVideoPreviewLayer:newCaptureVideoPreviewLayer];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        [[[self captureManager] session] startRunning];
    });
}
Manager delegate methods:

- (void)captureManager:(AVCamCaptureManager *)captureManager didFailWithError:(NSError *)error
{
}

- (void)captureManagerRecordingBegan:(AVCamCaptureManager *)captureManager
{
}

- (void)captureManagerRecordingFinished:(AVCamCaptureManager *)captureManager outputURL:(NSURL *)url
{
}

- (void)captureManagerStillImageCaptured:(AVCamCaptureManager *)captureManager
{
}

- (void)captureManagerDeviceConfigurationChanged:(AVCamCaptureManager *)captureManager
{
}
Hope this helps.

Comments:

None of captureManager's delegate methods handles the video. Am I missing something?

@Eric Take a look at developer.apple.com/library/ios/samplecode/AVCam/Introduction/… if it helps.

Answer 2:

You can create a UIImageView in the handleEvent method like this:

UIImageView *iv = [[UIImageView alloc] initWithImage:[UIImage imageWithData:self.data]];

You can also allocate it only once and then just call init again.

Each time you receive data from the socket, you initialize the UIImageView, and you can then display it by adding the UIImageView to a UIView.
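A lighter variant of this idea, sketched below, reuses a single image view instead of creating one per frame and only swaps its image on the main thread (the previewImageView property is an assumption for illustration, not part of the original code):

```objc
// Assumes one UIImageView created once (e.g. in viewDidLoad) and kept in a
// property; each received frame only replaces its image.
- (void)displayFrameData:(NSData *)frameData
{
    UIImage *image = [UIImage imageWithData:frameData];
    if (!image) return;
    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewImageView.image = image; // UIKit must be touched on the main thread
    });
}
```

Caveat: imageWithData: only succeeds if the sender transmits an encoded image format (e.g. JPEG via UIImageJPEGRepresentation per frame); it will return nil for the raw pixel bytes the question's sender currently writes.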

Sorry for my English; I'm not sure whether I understood you correctly.
