How to create a video from ARSKView (or SKView) frames, including mic audio (iOS 11)
Posted: 2017-06-22 16:00:10

【Question】I'm using ARKit and I want to create a video from the ARSKView frames. I tried using ReplayKit, but its behavior is not what I expect:
- I don't want to record the entire screen.
- I don't want the user to be prompted that the screen is being recorded.
Also, how can I combine the mic input with the video? I'm guessing the audio isn't streamed through the ARSKView. Here is the code (from the Apple sample):
import UIKit
import SpriteKit
import ARKit

class ViewController: UIViewController, ARSKViewDelegate {

    @IBOutlet var sceneView: ARSKView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Set the view's delegate
        sceneView.delegate = self

        // Show statistics such as fps and node count
        sceneView.showsFPS = true
        sceneView.showsNodeCount = true

        // Load the SKScene from 'Scene.sks'
        if let scene = SKScene(fileNamed: "Scene") {
            sceneView.presentScene(scene)
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Create a session configuration
        let configuration = ARWorldTrackingSessionConfiguration()

        // Run the view's session
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        // Pause the view's session
        sceneView.session.pause()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Release any cached data, images, etc that aren't in use.
    }

    // MARK: - ARSKViewDelegate

    func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
        // Create and configure a node for the anchor added to the view's session.
        let labelNode = SKLabelNode(text: "👾")
        labelNode.horizontalAlignmentMode = .center
        labelNode.verticalAlignmentMode = .center
        return labelNode
    }

    func session(_ session: ARSession, didFailWithError error: Error) {
        // Present an error message to the user
    }

    func sessionWasInterrupted(_ session: ARSession) {
        // Inform the user that the session has been interrupted, for example, by presenting an overlay
    }

    func sessionInterruptionEnded(_ session: ARSession) {
        // Reset tracking and/or remove existing anchors if consistent tracking is required
    }
}
In case it's needed, the Scene class:
import SpriteKit
import ARKit

class Scene: SKScene {

    override func didMove(to view: SKView) {
        // Setup your scene here
    }

    override func update(_ currentTime: TimeInterval) {
        // Called before each frame is rendered
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let sceneView = self.view as? ARSKView else {
            return
        }

        // Create anchor using the camera's current position
        if let currentFrame = sceneView.session.currentFrame {

            // Create a transform with a translation of 0.2 meters in front of the camera
            var translation = matrix_identity_float4x4
            translation.columns.3.z = -0.2
            let transform = simd_mul(currentFrame.camera.transform, translation)

            // Add a new anchor to the session
            let anchor = ARAnchor(transform: transform)
            sceneView.session.add(anchor: anchor)
        }
    }
}
【Answer 1】If you only need to record the frames (as you would from an AVCaptureSession, rather than the "real" 3D scene with its SCNNodes), just grab them as ARFrame.capturedImage inside the renderer's updateAtTime delegate method:
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    createMovieWriterOnce(frame: session.currentFrame)
    appendFrameWithMetadaToMovie(frame: session.currentFrame)
}
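The snippet above is the ARSCNView renderer delegate; for an ARSKView the same ARFrame can be obtained by making your view controller the session delegate. A minimal sketch, assuming the view controller is set as the session delegate and reuses the helper functions described below:

// Sketch for the ARSKView case: ARSessionDelegate delivers every ARFrame as it is captured.
// Assumes `sceneView.session.delegate = self` is set in viewDidLoad, and that the
// createMovieWriterOnce / appendFrameWithMetadaToMovie helpers shown below are available.
extension ViewController: ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        createMovieWriterOnce(frame: frame)
        appendFrameWithMetadaToMovie(frame: frame)
    }
}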
I haven't found a way to get the frame size from the ARSession up front, so the MovieWriter waits for the first frame to set up its dimensions:
func createMovieWriterOnce(frame: ARFrame?) {
    if frame == nil { return }
    DispatchQueue.once(token: "SimplestMovieWriter.constructor") {
        movieWriter = SimplestMovieWriter(frameWidth: CVPixelBufferGetWidth(frame!.capturedImage),
                                          frameHeight: CVPixelBufferGetHeight(frame!.capturedImage))
    }
}
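Note that DispatchQueue.once(token:block:) is not a standard API; the answer presumably relies on a small helper extension. A commonly used sketch, with the token storage as an assumption:

import Foundation

extension DispatchQueue {
    private static var onceTokens = [String]()

    // Executes the block at most once per token for the lifetime of the process.
    class func once(token: String, block: () -> Void) {
        objc_sync_enter(self)
        defer { objc_sync_exit(self) }
        guard !onceTokens.contains(token) else { return }
        onceTokens.append(token)
        block()
    }
}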
Every subsequent CVPixelBuffer is then fed to the MovieWriter:
func appendFrameWithMetadaToMovie(frame: ARFrame?) {
    if !isVideoRecording || frame == nil { return }
    let interestingPoints = frame?.rawFeaturePoints?.points
    movieWriter.appendBuffer(buffer: (frame?.capturedImage)!, withMetadata: interestingPoints)
}
MovieWriter is a custom class built around AVAssetWriter, AVAssetWriterInput, and AVAssetWriterInputPixelBufferAdaptor.
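The answer does not include that class, but a minimal sketch of such a writer could look roughly like this (the class and method names mirror how the answer calls it; the output path, codec settings, and fixed 60 fps timing are assumptions):

import AVFoundation
import CoreVideo

// Minimal sketch of a writer fed with CVPixelBuffers such as ARFrame.capturedImage.
// Real code should handle AVAssetWriter errors instead of force-trying.
class SimplestMovieWriter {

    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var frameCount: Int64 = 0
    private let fps: Int32 = 60

    init(frameWidth: Int, frameHeight: Int) {
        let url = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("capture.mov")
        try? FileManager.default.removeItem(at: url)
        writer = try! AVAssetWriter(outputURL: url, fileType: AVFileTypeQuickTimeMovie)

        input = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: [
            AVVideoCodecKey: AVVideoCodecH264,
            AVVideoWidthKey: frameWidth,
            AVVideoHeightKey: frameHeight
        ])
        input.expectsMediaDataInRealTime = true

        // ARFrame.capturedImage is delivered as bi-planar YCbCr, hence the pixel format below.
        adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input, sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
            kCVPixelBufferWidthKey as String: frameWidth,
            kCVPixelBufferHeightKey as String: frameHeight
        ])

        writer.add(input)
        writer.startWriting()
        writer.startSession(atSourceTime: kCMTimeZero)
    }

    // Appends one captured pixel buffer; the metadata parameter is ignored in this sketch.
    func appendBuffer(buffer: CVPixelBuffer, withMetadata metadata: Any?) {
        guard input.isReadyForMoreMediaData else { return }
        adaptor.append(buffer, withPresentationTime: CMTimeMake(frameCount, fps))
        frameCount += 1
    }

    // Call when recording stops; the file at `writer.outputURL` is then ready for composition/export.
    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}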
You can save the video without audio and then use AVAssetExportSession to add whatever you want (audio, subtitles, metadata):
let composition = AVMutableComposition()
...
let trackVideo = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
let videoFileAsset = AVURLAsset(url: currentURL!, options: nil)
let videoFileAssetTrack = videoFileAsset.tracks(withMediaType: AVMediaTypeVideo)[0]
// add audio track here
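For the mic part of the question, one option (not shown in the original answer) is to capture the microphone separately, for example with AVAudioRecorder, while the frames are being written, and then insert both tracks before exporting. A rough continuation of the snippet above, where audioURL (the mic recording) and outputURL (the export destination) are assumed placeholders:

// Sketch: merge the silent video with a separately recorded mic track, then export.
// `composition`, `trackVideo`, `videoFileAsset` and `videoFileAssetTrack` come from the snippet above.
let audioFileAsset = AVURLAsset(url: audioURL, options: nil)
let trackAudio = composition.addMutableTrack(withMediaType: AVMediaTypeAudio,
                                             preferredTrackID: kCMPersistentTrackID_Invalid)

let videoRange = CMTimeRangeMake(kCMTimeZero, videoFileAsset.duration)
try? trackVideo.insertTimeRange(videoRange, of: videoFileAssetTrack, at: kCMTimeZero)

if let micTrack = audioFileAsset.tracks(withMediaType: AVMediaTypeAudio).first {
    try? trackAudio.insertTimeRange(videoRange, of: micTrack, at: kCMTimeZero)
}

let export = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)
export?.outputFileType = AVFileTypeQuickTimeMovie
export?.outputURL = outputURL
export?.exportAsynchronously {
    // Check export?.status and export?.error, then use the merged file at outputURL.
}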
【Comments】:
Thanks a lot! This is exactly what I needed, and I don't need the nodes :)