ARKit in a SwiftUI app

In this article I’ll show you how to integrate ARKit in a SwiftUI app.
With this simple app, you’ll be able to tap on two points of the AR scene and calculate the distance between them.
As usual, the code is available on GitHub.

ARKit view

Let’s start with ARKit. The example is quite simple, yet powerful. With a few lines of code we’ll be able to measure the distance between two points in the physical world with the help of ARKit and a little bit of geometry. Pretty much everything happens in the ARDelegate class.


func setARView(_ arView: ARSCNView) {
    self.arView = arView
    
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = .horizontal
    arView.session.run(configuration)
    
    arView.delegate = self
    arView.scene = SCNScene()
    let tapGesture = UITapGestureRecognizer(target: self, action: #selector(tapOnARView))
    arView.addGestureRecognizer(tapGesture)
}

This is the initial setup. We have an ARSCNView, which is the view responsible for showing the camera feed with AR content. We create an ARWorldTrackingConfiguration to start tracking planes; since we are interested in horizontal planes, we set planeDetection accordingly.
Then we run a session with that configuration and add a tap gesture recognizer to the view. Remember, we want to measure the distance between two points, so we need to intercept a tap to create a new point.


@objc func tapOnARView(sender: UITapGestureRecognizer) {
    guard let arView = arView else { return }
    let location = sender.location(in: arView)
    if let query = arView.raycastQuery(from: location,
                                    allowing: .existingPlaneGeometry,
                                    alignment: .horizontal) {
        let results = arView.session.raycast(query)
        if let result = results.first {
            addCircle(raycastResult: result)
        }
    }
}

Above you can see the action bound to the tap. If you’re wondering why I annotated the function with @objc, that’s because the method has to be exposed to the Objective-C runtime for the target-action mechanism to work.
Anyway, we have the location of the point (relative to the AR view) and we want to add a small circle there so the user can see the points in the real world.
In order to do that, we first create a raycast query and see if we get any results. The query casts a ray from the given point (the one we got from the gesture recognizer) and checks whether that ray intersects with something in the real world. If it does, we get an ARRaycastResult and we can add our circle there.

Create the circle


private func addCircle(raycastResult: ARRaycastResult) {
    let circleNode = GeometryUtils.createCircle(fromRaycastResult: raycastResult)
    if circles.count >= 2 {
        for circle in circles {
            circle.removeFromParentNode()
        }
        circles.removeAll()
    }
    
    arView?.scene.rootNode.addChildNode(circleNode)
    circles.append(circleNode)
    
    if circles.count == 2 {
        let distance = GeometryUtils.calculateDistance(firstNode: circles[0], secondNode: circles[1])
        print("distance = \(distance)")
        message = "distance " + String(format: "%.2f cm", distance)
    }
    else {
        message = "add second point"
    }
}

This is how we add the circle. Before I show you how to create one from the ARRaycastResult, a few words on this function. As you can see, if two circles are already in the scene we remove them and start from scratch, so the user can tap a third time to begin a new measurement. If we have exactly two circles, we calculate the distance between them.
A circle is an SCNNode, an object that can be added to an SCNScene. We add it to the root node, and ARKit will add it to the scene and keep track of it. Try moving your iPhone and coming back to the point you tapped on: the circle is still there.


static func createCircle(fromRaycastResult result:ARRaycastResult) -> SCNNode {
    let circleGeometry = SCNSphere(radius: 0.005)
    
    let material = SCNMaterial()
    material.diffuse.contents = UIColor.systemBlue
    
    circleGeometry.materials = [material]
    
    let circleNode = SCNNode(geometry: circleGeometry)
    circleNode.simdWorldTransform = result.worldTransform
    
    return circleNode
}

This is how we create an SCNNode representing a circle. We start with the geometry: since we want a circle we can use SCNSphere; if we preferred a cube we could use SCNBox. We then create an SCNMaterial, a set of properties that define the appearance of a geometry. In our example we want the circle to be blue, so we set blue as the color via diffuse.contents. You can also apply textures to nodes, but let’s keep it simple for now.
The final step is to create an SCNNode with that geometry and place it into the real world. This is done by setting simdWorldTransform to the worldTransform of the raycast result returned by the query.

Calculate the distance


static func calculateDistance(firstNode: SCNNode, secondNode: SCNNode) -> Float {
    let firstPosition = firstNode.position
    let secondPosition = secondNode.position
    let distance: Float = sqrt(
        pow(secondPosition.x - firstPosition.x, 2) +
        pow(secondPosition.y - firstPosition.y, 2) +
        pow(secondPosition.z - firstPosition.z, 2)
    )
    
    return distance * 100 // convert to cm
}

As I said, with a little bit of geometry we can calculate the distance between two points, represented by two SCNNodes.
Each SCNNode has a position with x, y and z coordinates. The formula is the standard Euclidean distance: the square root of the sum of the squared differences between the x, y and z coordinates of the two points.
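As a side note, the same computation can be written with the standard library’s SIMD3 type, which SceneKit’s simd APIs use as well. A minimal sketch (the free-standing function and its name are mine, for illustration only):

```swift
// Euclidean distance via SIMD3: element-wise subtraction, a squared sum,
// and a square root. In the article the inputs would come from
// SCNNode.position; here they are plain SIMD3<Float> values.
func distanceInCentimeters(_ a: SIMD3<Float>, _ b: SIMD3<Float>) -> Float {
    let delta = a - b                              // element-wise difference
    let meters = (delta * delta).sum().squareRoot()
    return meters * 100                            // ARKit units are meters
}
```

For example, the distance between (0, 0, 0) and (0.03, 0.04, 0) comes out at roughly 5 cm, matching the 3-4-5 triangle.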

Move the circles

What if we want to move a circle after placing it in the AR scene? I’m going to show you how to move a node with a UIPanGestureRecognizer.


@objc func panOnARView(sender: UIPanGestureRecognizer) {
    guard let arView = arView else { return }
    let location = sender.location(in: arView)
    switch sender.state {
    case .began:
        if let node = nodeAtLocation(location) {
            trackedNode = node
        }
    case .changed:
        if let node = trackedNode {
            if let result = raycastResult(fromLocation: location) {
                moveNode(node, raycastResult:result)
            }
        }
    default:
        break
    }
}

The gist is recognising that the pan gesture began on an existing node, and then moving that node using a raycast result, exactly as we added the node with a tap.
To find out whether a tap (or the beginning of the pan gesture) is on an existing node we have two alternatives. We could use a raycast result and check whether one of our nodes shares the same world transform, or we can use the hitTest method provided by ARSCNView. I’ll show you the latter.
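One detail worth noting: the setARView we saw earlier only registers the tap recognizer, so the pan recognizer has to be attached to the view as well. A minimal sketch of that registration (I’m assuming it lives in setARView, next to the tap setup):

```swift
// Hypothetical addition to setARView: register the pan recognizer
// alongside the tap recognizer, targeting the handler shown above.
let panGesture = UIPanGestureRecognizer(target: self,
                                        action: #selector(panOnARView))
arView.addGestureRecognizer(panGesture)
```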


private func nodeAtLocation(_ location:CGPoint) -> SCNNode? {
    guard let arView = arView else { return nil }
    let result = arView.hitTest(location, options: nil)
    return result.first?.node
}

With hitTest we get back SCNHitTestResult objects, each carrying an SCNNode. If we have a node, we know the user tapped on an existing one, and we can then move it by changing its simdWorldTransform with the raycast result as we saw previously.
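The pan handler calls two helpers that aren’t shown above, raycastResult(fromLocation:) and moveNode(_:raycastResult:). A minimal sketch of how they could look, reusing the same raycast query as the tap handler and the same simdWorldTransform assignment we used when placing a circle:

```swift
// Hypothetical helpers for ARDelegate: not shown in the article,
// reconstructed from how the pan handler uses them.
private func raycastResult(fromLocation location: CGPoint) -> ARRaycastResult? {
    guard let arView = arView,
          let query = arView.raycastQuery(from: location,
                                          allowing: .existingPlaneGeometry,
                                          alignment: .horizontal) else { return nil }
    return arView.session.raycast(query).first
}

private func moveNode(_ node: SCNNode, raycastResult: ARRaycastResult) {
    // Snap the node to the point where the ray hits the detected plane.
    node.simdWorldTransform = raycastResult.worldTransform
}
```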

Integrate in SwiftUI

All right, there is SwiftUI in the title and I guess you finally want to know how to put ARKit inside a SwiftUI view. In order to integrate ARKit into a SwiftUI app we need UIViewRepresentable, since we are wrapping a UIKit view, in particular ARSCNView.


struct ARViewRepresentable: UIViewRepresentable {
    let arDelegate:ARDelegate
    
    func makeUIView(context: Context) -> some UIView {
        let arView = ARSCNView(frame: .zero)
        arDelegate.setARView(arView)
        return arView
    }
    
    func updateUIView(_ uiView: UIViewType, context: Context) {
        
    }
}

The ARDelegate class serves two purposes: it is the delegate of the ARSCNView and handles the gesture recognizers, but I also use it as a view model for SwiftUI.
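A rough skeleton of what ARDelegate could look like, inferred from how it is used throughout the article (the stored properties are my reconstruction): an NSObject so it can adopt ARSCNViewDelegate and receive gesture callbacks, an ObservableObject so SwiftUI can observe it, and a @Published message for the overlay text.

```swift
// Sketch only: the exact declaration lives in the GitHub repo.
class ARDelegate: NSObject, ARSCNViewDelegate, ObservableObject {
    @Published var message: String = ""

    private(set) var arView: ARSCNView?
    private var circles: [SCNNode] = []
    private var trackedNode: SCNNode?

    // setARView, tapOnARView, panOnARView, addCircle, …
    // as shown in the snippets above.
}
```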


struct ContentView: View {
    @ObservedObject var arDelegate = ARDelegate()
    
    var body: some View {
        ZStack {
            ARViewRepresentable(arDelegate: arDelegate)
            VStack {
                Spacer()
                Text(arDelegate.message)
                    .foregroundColor(Color.primary)
                    .frame(maxWidth: .infinity)
                    .padding(.bottom, 20)
                    .background(Color.secondary)
            }
        }.edgesIgnoringSafeArea(.all)
    }
}

This is the SwiftUI view with the embedded AR view.
I use a ZStack so I can place something in the foreground, above the ARKit live view: the Text bound to the message property of ARDelegate, which shows the user a live message with the distance.
Remember to declare your view model as an @ObservedObject so your view is updated every time one of its published properties changes, in this case message, which is declared as @Published in ARDelegate.

That’s it. As you can see, it is really easy to integrate ARKit into a SwiftUI app.
Happy coding 🙂