Swift: Instance member cannot be used on type in ARKitVision example


Posted: 2018-06-06 07:36:56

Question:

Apple's ARKitVision sample contains the following declaration in its ViewController.swift file:

// The view controller that displays the status and "restart experience" UI.
private lazy var statusViewController: StatusViewController = {
    return children.lazy.compactMap({ $0 as? StatusViewController }).first!
}()

However, if I copy the same views and source files into another test storyboard/project, I get the error "Instance member 'children' cannot be used on type 'StatusViewController'".

So why does this work in the ARKitVision sample, but not when I set it up from scratch myself? What else does the ARKitVision sample do to make it work? Thanks!
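For context, an error of the form "Instance member ... cannot be used on type ..." generally means an instance property is being referenced in a context that is evaluated without an instance, such as a non-lazy default property value. A minimal sketch of the distinction (the class name here is illustrative, not from the sample; the exact diagnostic wording varies between Swift versions):

```swift
import UIKit

class ContainerViewController: UIViewController {
    // Compiles: `lazy` defers the closure until after `self` exists,
    // so the instance member `children` is available inside it.
    lazy var firstChild: UIViewController? = {
        return children.first
    }()

    // Would NOT compile: a non-lazy default value is evaluated during
    // initialization, before `self` is available, so referencing the
    // instance member `children` here is rejected by the compiler.
    // var broken: UIViewController? = children.first
}
```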

The full class definition of StatusViewController is:

/*
See LICENSE folder for this sample’s licensing information.

Abstract:
Utility class for showing messages above the AR view.
*/

import Foundation
import ARKit

/**
 Displayed at the top of the main interface of the app that allows users to see
 the status of the AR experience, as well as the ability to control restarting
 the experience altogether.
 - Tag: StatusViewController
 */
class StatusViewController: UIViewController {
    // MARK: - Types

    enum MessageType {
        case trackingStateEscalation
        case planeEstimation
        case contentPlacement
        case focusSquare

        static var all: [MessageType] = [
            .trackingStateEscalation,
            .planeEstimation,
            .contentPlacement,
            .focusSquare
        ]
    }

    // MARK: - IBOutlets

    @IBOutlet weak private var messagePanel: UIVisualEffectView!

    @IBOutlet weak private var messageLabel: UILabel!

    @IBOutlet weak private var restartExperienceButton: UIButton!

    // MARK: - Properties

    /// Triggered when the "Restart Experience" button is tapped.
    var restartExperienceHandler: () -> Void = {}

    /// Seconds before the timer message should fade out. Adjust if the app needs longer transient messages.
    private let displayDuration: TimeInterval = 6

    // Timer for hiding messages.
    private var messageHideTimer: Timer?

    private var timers: [MessageType: Timer] = [:]

    // MARK: - Message Handling

    func showMessage(_ text: String, autoHide: Bool = true) {
        // Cancel any previous hide timer.
        messageHideTimer?.invalidate()

        messageLabel.text = text

        // Make sure status is showing.
        setMessageHidden(false, animated: true)

        if autoHide {
            messageHideTimer = Timer.scheduledTimer(withTimeInterval: displayDuration, repeats: false, block: { [weak self] _ in
                self?.setMessageHidden(true, animated: true)
            })
        }
    }

    func scheduleMessage(_ text: String, inSeconds seconds: TimeInterval, messageType: MessageType) {
        cancelScheduledMessage(for: messageType)

        let timer = Timer.scheduledTimer(withTimeInterval: seconds, repeats: false, block: { [weak self] timer in
            self?.showMessage(text)
            timer.invalidate()
        })

        timers[messageType] = timer
    }

    func cancelScheduledMessage(`for` messageType: MessageType) {
        timers[messageType]?.invalidate()
        timers[messageType] = nil
    }

    func cancelAllScheduledMessages() {
        for messageType in MessageType.all {
            cancelScheduledMessage(for: messageType)
        }
    }

    // MARK: - ARKit

    func showTrackingQualityInfo(for trackingState: ARCamera.TrackingState, autoHide: Bool) {
        showMessage(trackingState.presentationString, autoHide: autoHide)
    }

    func escalateFeedback(for trackingState: ARCamera.TrackingState, inSeconds seconds: TimeInterval) {
        cancelScheduledMessage(for: .trackingStateEscalation)

        let timer = Timer.scheduledTimer(withTimeInterval: seconds, repeats: false, block: { [unowned self] _ in
            self.cancelScheduledMessage(for: .trackingStateEscalation)

            var message = trackingState.presentationString
            if let recommendation = trackingState.recommendation {
                message.append(": \(recommendation)")
            }

            self.showMessage(message, autoHide: false)
        })

        timers[.trackingStateEscalation] = timer
    }

    // MARK: - IBActions

    @IBAction private func restartExperience(_ sender: UIButton) {
        restartExperienceHandler()
    }

    // MARK: - Panel Visibility

    private func setMessageHidden(_ hide: Bool, animated: Bool) {
        // The panel starts out hidden, so show it before animating opacity.
        messagePanel.isHidden = false

        guard animated else {
            messagePanel.alpha = hide ? 0 : 1
            return
        }

        UIView.animate(withDuration: 0.2, delay: 0, options: [.beginFromCurrentState], animations: {
            self.messagePanel.alpha = hide ? 0 : 1
        }, completion: nil)
    }
}

extension ARCamera.TrackingState {
    var presentationString: String {
        switch self {
        case .notAvailable:
            return "TRACKING UNAVAILABLE"
        case .normal:
            return "TRACKING NORMAL"
        case .limited(.excessiveMotion):
            return "TRACKING LIMITED\nExcessive motion"
        case .limited(.insufficientFeatures):
            return "TRACKING LIMITED\nLow detail"
        case .limited(.initializing):
            return "Initializing"
        case .limited(.relocalizing):
            return "Recovering from interruption"
        }
    }

    var recommendation: String? {
        switch self {
        case .limited(.excessiveMotion):
            return "Try slowing down your movement, or reset the session."
        case .limited(.insufficientFeatures):
            return "Try pointing at a flat surface, or reset the session."
        case .limited(.relocalizing):
            return "Return to the location where you left off or try resetting the session."
        default:
            return nil
        }
    }
}

The definition of the ViewController class is:

/*
See LICENSE folder for this sample’s licensing information.

Abstract:
Main view controller for the ARKitVision sample.
*/

import UIKit
import SpriteKit
import ARKit
import Vision

class ViewController: UIViewController, UIGestureRecognizerDelegate, ARSKViewDelegate, ARSessionDelegate {

    @IBOutlet weak var sceneView: ARSKView!

    // The view controller that displays the status and "restart experience" UI.
    private lazy var statusViewController: StatusViewController = {
        return children.lazy.compactMap({ $0 as? StatusViewController }).first!
    }()

    // MARK: - View controller lifecycle

    override func viewDidLoad() {
        super.viewDidLoad()

        // Configure and present the SpriteKit scene that draws overlay content.
        let overlayScene = SKScene()
        overlayScene.scaleMode = .aspectFill
        sceneView.delegate = self
        sceneView.presentScene(overlayScene)
        sceneView.session.delegate = self

        // Hook up status view controller callback.
        statusViewController.restartExperienceHandler = { [unowned self] in
            self.restartSession()
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Create a session configuration
        let configuration = ARWorldTrackingConfiguration()

        // Run the view's session
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)

        // Pause the view's session
        sceneView.session.pause()
    }

    // MARK: - ARSessionDelegate

    // Pass camera frames received from ARKit to Vision (when not already processing one)
    /// - Tag: ConsumeARFrames
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Do not enqueue other buffers for processing while another Vision task is still running.
        // The camera stream has only a finite amount of buffers available; holding too many buffers for analysis would starve the camera.
        guard currentBuffer == nil, case .normal = frame.camera.trackingState else {
            return
        }

        // Retain the image buffer for Vision processing.
        self.currentBuffer = frame.capturedImage
        classifyCurrentImage()
    }

    // MARK: - Vision classification

    // Vision classification request and model
    /// - Tag: ClassificationRequest
    private lazy var classificationRequest: VNCoreMLRequest = {
        do {
            // Instantiate the model from its generated Swift class.
            let model = try VNCoreMLModel(for: Inceptionv3().model)
            let request = VNCoreMLRequest(model: model, completionHandler: { [weak self] request, error in
                self?.processClassifications(for: request, error: error)
            })

            // Crop input images to square area at center, matching the way the ML model was trained.
            request.imageCropAndScaleOption = .centerCrop

            // Use CPU for Vision processing to ensure that there are adequate GPU resources for rendering.
            request.usesCPUOnly = true

            return request
        } catch {
            fatalError("Failed to load Vision ML model: \(error)")
        }
    }()

    // The pixel buffer being held for analysis; used to serialize Vision requests.
    private var currentBuffer: CVPixelBuffer?

    // Queue for dispatching vision classification requests
    private let visionQueue = DispatchQueue(label: "com.example.apple-samplecode.ARKitVision.serialVisionQueue")

    // Run the Vision+ML classifier on the current image buffer.
    /// - Tag: ClassifyCurrentImage
    private func classifyCurrentImage() {
        // Most computer vision tasks are not rotation agnostic so it is important to pass in the orientation of the image with respect to device.
        let orientation = CGImagePropertyOrientation(UIDevice.current.orientation)

        let requestHandler = VNImageRequestHandler(cvPixelBuffer: currentBuffer!, orientation: orientation)
        visionQueue.async {
            do {
                // Release the pixel buffer when done, allowing the next buffer to be processed.
                defer { self.currentBuffer = nil }
                try requestHandler.perform([self.classificationRequest])
            } catch {
                print("Error: Vision request failed with error \"\(error)\"")
            }
        }
    }

    // Classification results
    private var identifierString = ""
    private var confidence: VNConfidence = 0.0

    // Handle completion of the Vision request and choose results to display.
    /// - Tag: ProcessClassifications
    func processClassifications(for request: VNRequest, error: Error?) {
        guard let results = request.results else {
            print("Unable to classify image.\n\(error!.localizedDescription)")
            return
        }
        // The `results` will always be `VNClassificationObservation`s, as specified by the Core ML model in this project.
        let classifications = results as! [VNClassificationObservation]

        // Show a label for the highest-confidence result (but only above a minimum confidence threshold).
        if let bestResult = classifications.first(where: { result in result.confidence > 0.5 }),
            let label = bestResult.identifier.split(separator: ",").first {
            identifierString = String(label)
            confidence = bestResult.confidence
        } else {
            identifierString = ""
            confidence = 0
        }

        DispatchQueue.main.async { [weak self] in
            self?.displayClassifierResults()
        }
    }

    // Show the classification results in the UI.
    private func displayClassifierResults() {
        guard !self.identifierString.isEmpty else {
            return // No object was classified.
        }
        let message = String(format: "Detected \(self.identifierString) with %.2f", self.confidence * 100) + "% confidence"
        statusViewController.showMessage(message)
    }

    // MARK: - Tap gesture handler & ARSKViewDelegate

    // Labels for classified objects by ARAnchor UUID
    private var anchorLabels = [UUID: String]()

    // When the user taps, add an anchor associated with the current classification result.
    /// - Tag: PlaceLabelAtLocation
    @IBAction func placeLabelAtLocation(sender: UITapGestureRecognizer) {
        let hitLocationInView = sender.location(in: sceneView)
        let hitTestResults = sceneView.hitTest(hitLocationInView, types: [.featurePoint, .estimatedHorizontalPlane])
        if let result = hitTestResults.first {

            // Add a new anchor at the tap location.
            let anchor = ARAnchor(transform: result.worldTransform)
            sceneView.session.add(anchor: anchor)

            // Track anchor ID to associate text with the anchor after ARKit creates a corresponding SKNode.
            anchorLabels[anchor.identifier] = identifierString
        }
    }

    // When an anchor is added, provide a SpriteKit node for it and set its text to the classification label.
    /// - Tag: UpdateARContent
    func view(_ view: ARSKView, didAdd node: SKNode, for anchor: ARAnchor) {
        guard let labelText = anchorLabels[anchor.identifier] else {
            fatalError("missing expected associated label for anchor")
        }
        let label = TemplateLabelNode(text: labelText)
        node.addChild(label)
    }

    // MARK: - AR Session Handling

    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        statusViewController.showTrackingQualityInfo(for: camera.trackingState, autoHide: true)

        switch camera.trackingState {
        case .notAvailable, .limited:
            statusViewController.escalateFeedback(for: camera.trackingState, inSeconds: 3.0)
        case .normal:
            statusViewController.cancelScheduledMessage(for: .trackingStateEscalation)
            // Unhide content after successful relocalization.
            setOverlaysHidden(false)
        }
    }

    func session(_ session: ARSession, didFailWithError error: Error) {
        guard error is ARError else { return }

        let errorWithInfo = error as NSError
        let messages = [
            errorWithInfo.localizedDescription,
            errorWithInfo.localizedFailureReason,
            errorWithInfo.localizedRecoverySuggestion
        ]

        // Filter out optional error messages.
        let errorMessage = messages.compactMap({ $0 }).joined(separator: "\n")
        DispatchQueue.main.async {
            self.displayErrorMessage(title: "The AR session failed.", message: errorMessage)
        }
    }

    func sessionWasInterrupted(_ session: ARSession) {
        setOverlaysHidden(true)
    }

    func sessionShouldAttemptRelocalization(_ session: ARSession) -> Bool {
        /*
         Allow the session to attempt to resume after an interruption.
         This process may not succeed, so the app must be prepared
         to reset the session if the relocalizing status continues
         for a long time -- see `escalateFeedback` in `StatusViewController`.
         */
        return true
    }

    private func setOverlaysHidden(_ shouldHide: Bool) {
        sceneView.scene!.children.forEach { node in
            if shouldHide {
                // Hide overlay content immediately during relocalization.
                node.alpha = 0
            } else {
                // Fade overlay content in after relocalization succeeds.
                node.run(.fadeIn(withDuration: 0.5))
            }
        }
    }

    private func restartSession() {
        statusViewController.cancelAllScheduledMessages()
        statusViewController.showMessage("RESTARTING SESSION")

        anchorLabels = [UUID: String]()

        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }

    // MARK: - Error handling

    private func displayErrorMessage(title: String, message: String) {
        // Present an alert informing about the error that has occurred.
        let alertController = UIAlertController(title: title, message: message, preferredStyle: .alert)
        let restartAction = UIAlertAction(title: "Restart Session", style: .default) { _ in
            alertController.dismiss(animated: true, completion: nil)
            self.restartSession()
        }
        alertController.addAction(restartAction)
        present(alertController, animated: true, completion: nil)
    }
}

Answer 1:

This suggests that your StatusViewController class is not inheriting from UIViewController, because the `children` property has been available to UIViewController subclasses for a long time.

Could you share how you set up your StatusViewController?
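To illustrate the answer's point: `children` is an ordinary instance property inherited from UIViewController, so any subclass can read it from instance context. A minimal sketch (the class name is illustrative, not from the sample):

```swift
import UIKit

class AnyContainerViewController: UIViewController {
    // `children` lists the view controllers embedded via container APIs,
    // e.g. addChild(_:) or a storyboard "embed segue".
    func childTypeNames() -> [String] {
        return children.map { String(describing: type(of: $0)) }
    }
}
```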

Comments:

I've now added both source files above. That's what I don't understand, because both classes do inherit from UIViewController, yet for some reason Xcode can't see the parent class's definition...
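One more thing worth checking when copying the sample: the lazy lookup force-unwraps with `first!`, which assumes the storyboard actually embeds a StatusViewController as a child (via a container view's embed segue). If the copied storyboard lacks that embed segue, the code compiles but crashes at runtime. A defensive variant (an illustrative sketch, not the sample's code) fails with a clearer message:

```swift
private lazy var statusViewController: StatusViewController = {
    guard let status = children.compactMap({ $0 as? StatusViewController }).first else {
        // Hypothetical diagnostic: makes the storyboard requirement explicit.
        fatalError("Expected the storyboard to embed a StatusViewController via a container view")
    }
    return status
}()
```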
