ChromaKey video in ARKit


Posted: 2018-10-02 06:29:04

Question:

I'm trying to chroma key a video in ARKit, very much along the lines of what @Felix did here: GPUImageView inside SKScene as SKNode material - Playing transparent video on ARKit

However, when the video is supposed to appear (in this case, when an AR reference image is detected) I get a [SceneKit] Error: Cannot get pixel buffer (CVPixelBufferRef) error and the video no longer plays. It did work before I implemented the chromaKeyMaterial. Here is my code, starting from the point where the AR reference image has been detected:

DispatchQueue.main.async {
    let filePath = Bundle.main.path(forResource: "wigz", ofType: "mp4")
    let videoURL = NSURL(fileURLWithPath: filePath!)
    let player = AVPlayer(url: videoURL as URL)

    let spriteKitScene = SKScene(size: CGSize(width: 640, height: 480))
    let videoSpriteKitNode = SKVideoNode(avPlayer: player)
    let videoNode = SCNNode()
    videoNode.geometry = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                                  height: imageAnchor.referenceImage.physicalSize.height)
    videoNode.eulerAngles = SCNVector3(-Float.pi/2, 0, 0)

    // Use SpriteKit with the video node inside
    spriteKitScene.scaleMode = .aspectFit
    videoSpriteKitNode.position = CGPoint(x: spriteKitScene.size.width / 2,
                                          y: spriteKitScene.size.height / 2)
    videoSpriteKitNode.size = spriteKitScene.size
    videoSpriteKitNode.yScale = -1.0
    videoSpriteKitNode.play()

    // Loop video
    NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player.currentItem, queue: .main) { _ in
        player.seek(to: kCMTimeZero)
        player.play()
    }

    spriteKitScene.addChild(videoSpriteKitNode)

    videoNode.geometry?.firstMaterial?.diffuse.contents = spriteKitScene
    videoNode.geometry?.firstMaterial?.isDoubleSided = true
    let chromaKeyMaterial = ChromaKeyMaterial()
    chromaKeyMaterial.diffuse.contents = player
    videoNode.geometry!.materials = [chromaKeyMaterial]

    node.addChildNode(videoNode)

    self.imageDetectView.scene.rootNode.addChildNode(node)
}

In the ChromaKeyMaterial.swift file, I changed these lines to:

float maskY = 0.0 * c_colorToReplace.r + 1.0 * c_colorToReplace.g + 0.0 * c_colorToReplace.b;
float maskCr = 0.7132 * (c_colorToReplace.r - maskY);
float maskCb = 0.5647 * (c_colorToReplace.b - maskY);

float Y = 0.0 * textureColor.r + 1.0 * textureColor.g + 0.0 * textureColor.b;
float Cr = 0.7132 * (textureColor.r - Y);
float Cb = 0.5647 * (textureColor.b - Y);

in an attempt to key out pure green, but I'm not sure whether that's the right approach.
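(For reference, with the unmodified coefficients a pure-green key color of (0, 1, 0) works out to maskY = 0.5866, maskCr = 0.7132 × (0 − 0.5866) ≈ −0.418 and maskCb = 0.5647 × (0 − 0.5866) ≈ −0.331, so zeroing the R and B weights rescales every color rather than isolating green; the accepted answer below keeps the standard coefficients and sets c_colorToReplace to pure green instead.)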

Any help would be greatly appreciated!

Question comments:

Note that using an SKScene and SKVideoNode is unnecessary in recent versions of SceneKit. You can set an AVPlayer directly as the contents of an SCNMaterialProperty instance.

@mnuages Interesting! I'll check that out. Thanks.

@mnuages Wow. Much simpler, and it seems to run better, although that could just be my memory failing me or a placebo effect. Thanks for the tip!

Answer 1:

Figured it out. I had my color set incorrectly (and in the wrong place, facepalm), and there appears to be a bug that keeps the video from playing unless you delay it slightly. The bug has supposedly been fixed, but that doesn't seem to be the case.

In case anyone is interested, here is my corrected and cleaned-up code (edited to include the tip from @mnuages):

// Get Video URL and create AV Player
let filePath = Bundle.main.path(forResource: "VIDEO_FILE_NAME", ofType: "VIDEO_FILE_EXTENSION")
let videoURL = NSURL(fileURLWithPath: filePath!)
let player = AVPlayer(url: videoURL as URL)

// Create SceneKit videoNode to hold the spritekit scene.
let videoNode = SCNNode()

// Set geometry of the SceneKit node to be a plane, and rotate it to be flat with the image
videoNode.geometry = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
              height: imageAnchor.referenceImage.physicalSize.height)
videoNode.eulerAngles = SCNVector3(-Float.pi/2, 0, 0)

//Set the video AVPlayer as the contents of the video node's material.
videoNode.geometry?.firstMaterial?.diffuse.contents = player
videoNode.geometry?.firstMaterial?.isDoubleSided = true

// Alpha transparency stuff
let chromaKeyMaterial = ChromaKeyMaterial()
chromaKeyMaterial.diffuse.contents = player
videoNode.geometry!.materials = [chromaKeyMaterial]

// Video does not start without delaying the player.
// Playing the video before just results in [SceneKit] Error: Cannot get pixel buffer (CVPixelBufferRef)
DispatchQueue.main.asyncAfter(deadline: .now() + 0.001) {
    player.seek(to: CMTimeMakeWithSeconds(1, 1000))
    player.play()
}

// Loop video
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player.currentItem, queue: .main) { _ in
    player.seek(to: kCMTimeZero)
    player.play()
}


// Add videoNode to ARAnchor
node.addChildNode(videoNode)

// Add ARAnchor node to the root node of the scene
self.imageDetectView.scene.rootNode.addChildNode(node)

And here is the chroma key material:

import SceneKit

public class ChromaKeyMaterial: SCNMaterial {

    public var backgroundColor: UIColor {
        didSet { didSetBackgroundColor() }
    }

    public var thresholdSensitivity: Float {
        didSet { didSetThresholdSensitivity() }
    }

    public var smoothing: Float {
        didSet { didSetSmoothing() }
    }

    public init(backgroundColor: UIColor = .green, thresholdSensitivity: Float = 0.50, smoothing: Float = 0.001) {

        self.backgroundColor = backgroundColor
        self.thresholdSensitivity = thresholdSensitivity
        self.smoothing = smoothing

        super.init()

        didSetBackgroundColor()
        didSetThresholdSensitivity()
        didSetSmoothing()

        // chroma key shader is based on GPUImage
        // https://github.com/BradLarson/GPUImage/blob/master/framework/Source/GPUImageChromaKeyFilter.m

        let surfaceShader =
        """
        uniform vec3 c_colorToReplace;
        uniform float c_thresholdSensitivity;
        uniform float c_smoothing;

        #pragma transparent
        #pragma body

        vec3 textureColor = _surface.diffuse.rgb;

        float maskY = 0.2989 * c_colorToReplace.r + 0.5866 * c_colorToReplace.g + 0.1145 * c_colorToReplace.b;
        float maskCr = 0.7132 * (c_colorToReplace.r - maskY);
        float maskCb = 0.5647 * (c_colorToReplace.b - maskY);

        float Y = 0.2989 * textureColor.r + 0.5866 * textureColor.g + 0.1145 * textureColor.b;
        float Cr = 0.7132 * (textureColor.r - Y);
        float Cb = 0.5647 * (textureColor.b - Y);

        float blendValue = smoothstep(c_thresholdSensitivity, c_thresholdSensitivity + c_smoothing, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));

        float a = blendValue;
        _surface.transparent.a = a;
        """

        shaderModifiers = [
            .surface: surfaceShader,
        ]
    }

    required public init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    // setting background color to be keyed out
    private func didSetBackgroundColor() {
        // getting pixel from background color
        //let rgb = backgroundColor.cgColor.components!.map { Float($0) }
        //let vector = SCNVector3(x: rgb[0], y: rgb[1], z: rgb[2])
        let vector = SCNVector3(x: 0.0, y: 1.0, z: 0.0)
        setValue(vector, forKey: "c_colorToReplace")
    }

    private func didSetSmoothing() {
        setValue(smoothing, forKey: "c_smoothing")
    }

    private func didSetThresholdSensitivity() {
        setValue(thresholdSensitivity, forKey: "c_thresholdSensitivity")
    }
}

Comments:

Hi, can you tell me how node and imageAnchor are initialized?

Answer 2:

Using RealityKit 2 - iOS 14

I believe that with RealityKit you need Metal to write a chroma key shader. I don't know Metal well enough yet to say how to create one, but I found another way to play chroma-keyed video in AR with RealityKit.

Starting with iOS 14, a video asset can be used as the texture of a ModelEntity.

Chroma keying takes a couple of extra steps:

First we process the video asset and key out the chroma color. Then we load this asset into a player and apply RealityKit's new VideoMaterial to a ModelEntity (iOS 14).

We start by importing this incredible package by Yu Ao: https://github.com/MetalPetal/MetalPetal/issues/289

Don't forget to import the package: import MetalPetal

Here is the code:


// in the viewmodel you process the asset and create the player
let context = try! MTIContext(device: MTLCreateSystemDefaultDevice()!)
let chromaKeyBlendFilter = MTIChromaKeyBlendFilter()
let color = MTIColor(red: 0.998, green: 0.0, blue: 0.996, alpha: 1)
//let backgroundColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 0)
let backgroundColor = MTIColor(red: 0.0, green: 0.0, blue: 0, alpha: 0)
chromaKeyBlendFilter.color = color
chromaKeyBlendFilter.smoothing = 0.001
chromaKeyBlendFilter.thresholdSensitivity = 0.4 // 0.475
chromaKeyBlendFilter.inputBackgroundImage = MTIImage(color: backgroundColor, sRGB: false, size: videoSize)
let composition = MTIVideoComposition(asset: asset, context: context, queue: DispatchQueue.main, filter: { request in
    guard let sourceImage = request.anySourceImage else {
        return MTIImage(color: backgroundColor, sRGB: false, size: videoSize)
    }
    return FilterGraph.makeImage(builder: { output in
        sourceImage => chromaKeyBlendFilter.inputPorts.inputImage
        chromaKeyBlendFilter => output
    })!
})

videoPlayerItem = AVPlayerItem(asset: asset)
videoPlayerItem.videoComposition = composition.makeAVVideoComposition()

let player = AVPlayer(playerItem: videoPlayerItem)
player.volume = 0.5
// player.play()

We can use video textures in RealityKit 2.0 (Xcode 12 and iOS 14). See this answer by Andy Jazz about how to set it up.
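For reference, a minimal sketch of that setup, assuming arView is an existing ARView and player is the AVPlayer created above (the plane size and anchoring are placeholders):

import RealityKit

// RealityKit 2 (iOS 14+): wrap the chroma-keyed player in a VideoMaterial.
let videoMaterial = VideoMaterial(avPlayer: player)
let videoPlane = ModelEntity(mesh: .generatePlane(width: 0.4, depth: 0.225),
                             materials: [videoMaterial])

// Anchor the plane on a horizontal surface and start playback.
let planeAnchor = AnchorEntity(plane: .horizontal)
planeAnchor.addChild(videoPlane)
arView.scene.addAnchor(planeAnchor)
player.play()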

Comments:

This filter also removes white. Can you suggest any adjustments to avoid that?

I wasn't aware of that, and I don't know offhand. The animated movie I used has no white in it, so I never tested that case. Your best bet is probably to open an issue with the package maintainer on GitHub. He has been very helpful to me and others, and usually replies quickly.
