In this tutorial, we’re going to combine the power of ARKit, CoreLocation, and Pusher to create a geolocation AR app. This article is part of our ARKit course.
Augmented Reality (AR) has a lot of interesting and practical use cases. One of them is location-based experiences.
With iOS 11, the ability to use ARKit to create AR apps and combine them with multiple libraries has opened a lot of possibilities.
Let’s think of a taxi service. Some services let you track the car that’s coming to pick you up on a map, but wouldn’t it be great to have an AR view where you can see the car’s route and watch it get closer to you?
Something like this:
As you can see, the information used to position the car in the AR world is not always accurate, on both the CoreLocation side and the ARKit side. However, for this use case, it will be good enough most of the time.
Here’s what you’ll need:
You can find free 3D models on sites like Free3D, Turbosquid, or Google’s Poly.
The most common format is OBJ (with its textures defined in a MTL file), which can be converted to DAE with a program like Blender.
For this project, I chose this model, which is available in DAE format.
The math for this project is a bit heavy. I’ll dedicate more time to explaining the operations related to geolocation than the ones related to rotating and translating a model with ARKit.
If you don’t know about transformation matrices or how to convert your 3D model to the DAE format, take a look at my previous tutorial about ARKit.
For reference, the source code of this project is on GitHub.
Let’s start by setting up a Pusher app.
If you haven’t already, create a free account at Pusher. Then, go to your Dashboard and create an app, choosing a name, the cluster closest to your location, and iOS as your front-end technology:
This will give you some sample code to get started:
Save your app id, key, secret and cluster values. We’ll need them later.
Finally, go to the App Settings tab, check the option Enable client events, and click on Update:
Through this app, the drivers will send their locations as latitude/longitude coordinates along with the direction they’re heading (in degrees) as a client event.
But let’s not get ahead of ourselves, let’s set up the Xcode project first.
Open Xcode 9 and create a new Single View App:
We’re choosing this option because we are going to manually set up an AR view along with other controls.
Enter the project information, choosing Swift as the language:
Create the project and close it. We’re going to use CocoaPods to install the project’s dependencies. Open a terminal window, go to the root directory of your project and, in case you don’t have CocoaPods installed (or if you want to update it), execute:
```shell
sudo gem install cocoapods
```
Once installed, create the file `Podfile` with the command:

```shell
pod init
```
Edit this file to set the platform to iOS 11 and add the Pusher’s Swift library as a dependency of the project:
```ruby
# Uncomment the next line to define a global platform for your project
platform :ios, '11.0'

target 'ARKitCarGeolocation' do
  # Comment the next line if you're not using Swift and don't want to use dynamic frameworks
  use_frameworks!

  # Pods for ARKitCarGeolocation
  pod 'PusherSwift', '~> 5.0.1'
end
```
Once you’ve edited the `Podfile`, execute the following command to install the dependency:

```shell
pod install
```
In case version 5.0.1 (or later) is not installed (the output of the installation will tell you the installed version), you can update your CocoaPods repository and install the latest version of the library with the command:
```shell
pod install --repo-update
```
Now open the Xcode workspace instead of the project file. The workspace has the dependency already configured:
```shell
open ARKitCarGeolocation.xcworkspace
```
If you build your project at this point, a couple of warnings may show up, but the operation should be successful.
Next, select the file `Info.plist`, add a row of type Privacy – Camera Usage Description (`NSCameraUsageDescription`) and give it a description. This is required for ARKit to access the camera.
We’ll also need a row of type Privacy – Location When In Use Usage Description (`NSLocationWhenInUseUsageDescription`). This is required to get the location from your device’s GPS (only while the app is being used, not all the time):
Finally, configure a team so you can run the app on your device:
Now let’s build the user interface.
Go to Main.storyboard
and drag an ARKit SceneKit View to the view:
Next, add constraints to all sides of this view so that it fills the entire screen. You do this by pressing the Ctrl key while dragging a line from the ARSCNView to each side of the parent view, choosing leading, top, trailing, and bottom to the superview, each with a value of 0:
Next, add a text view and disable its Editable and Selectable behaviors in the Attributes inspector:
Change its background color (I chose a white color with 50% opacity):
Add a height constraint with a value of 90, and leading, top, and trailing constraints with a value of 0, so it remains fixed to the top of the screen:
In `ViewController.swift`, import ARKit:

```swift
import ARKit
```
Then, create two `IBOutlet`s, one for the scene view and another one for the text view:
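The storyboard connections themselves aren't shown here, but the snippets later in this tutorial reference the outlets as `sceneView` and `statusTextView`, so the declarations look like this (connect them to the views in Interface Builder):

```swift
import UIKit
import ARKit

class ViewController: UIViewController {
    // Connected to the ARSCNView that fills the screen
    @IBOutlet weak var sceneView: ARSCNView!
    // Connected to the text view fixed to the top of the screen
    @IBOutlet weak var statusTextView: UITextView!
}
```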
You’re ready to start coding the app, but before that, let me explain what needs to be done. If you’re already familiar with geolocation concepts, or simply not interested, feel free to skip the next section.
Imagine you are standing at some point in the world. It doesn’t matter where, or in which direction you’re looking.
Your location is given by two numbers, latitude and longitude.
Latitude is the angular distance of a point north or south of the equator (an imaginary circle around the Earth halfway between the poles). It goes from 0º to 90º for places north of the equator, and from 0º to -90º for places south of it.
Longitude is the angular distance of a point east or west of the prime meridian (an imaginary line running from north to south through Greenwich, England). It goes from 0º to 180º for places east of the prime meridian, and from 0º to -180º for places west of it.
For example, if you’re in Brazil, your latitude and longitude will be negative because you are on the southwest side of the Earth:
And if you’re in Japan, for example, your latitude and longitude will be positive because you are on the northeast side of the Earth:
This app will take into account your position and the driver’s position in a latitude and longitude coordinate system:
But if it’s easier for you, you can think of your position as the origin (0, 0):
You need to calculate two things:
The distance will tell you how far you have to position the 3D model in the AR world.
The bearing will help you create a rotation transformation to position your model in the right direction at the above distance.
If we were talking about a simple x and y coordinate system, we could get those calculations by applying the Pythagorean theorem and some simple trigonometry with sine and cosine operations.
But we are talking about latitudes and longitudes of the Earth. And as the Earth is not a flat plane, the math gets more complex.
The distance is calculated by simply calling a method of the CLLocation class. It uses the haversine formula, which, given two latitude/longitude pairs, calculates the distance along a line between them that follows the curvature of the Earth.
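As an illustration (not the code the app uses, since CLLocation does this for us), the haversine formula can be sketched in Swift like this; the mean Earth radius of 6,371 km is a common approximation:

```swift
import Foundation

// Haversine distance in meters between two latitude/longitude pairs (in degrees).
// Illustrative sketch only; the app itself calls CLLocation's distance(from:).
func haversineDistance(lat1: Double, lon1: Double,
                       lat2: Double, lon2: Double) -> Double {
    let earthRadius = 6_371_000.0 // mean Earth radius in meters (approximation)
    let dLat = (lat2 - lat1) * .pi / 180
    let dLon = (lon2 - lon1) * .pi / 180
    let a = sin(dLat / 2) * sin(dLat / 2)
          + cos(lat1 * .pi / 180) * cos(lat2 * .pi / 180)
          * sin(dLon / 2) * sin(dLon / 2)
    return earthRadius * 2 * atan2(sqrt(a), sqrt(1 - a))
}

// One degree of longitude at the equator is roughly 111 km.
let d = haversineDistance(lat1: 0, lon1: 0, lat2: 0, lon2: 1)
```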
On the other hand, we have to calculate the bearing between two different latitude/longitude pairs of values manually. This is the formula:
```
atan2(X, Y)
```

Where X equals:

```
sin(long2 - long1) * cos(lat2)
```

And Y equals:

```
cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(long2 - long1)
```
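Putting the formula into Swift, here is a standalone sketch of the calculation (the same one the app's `bearingBetweenLocations` method performs later), with all angles converted to radians:

```swift
import Foundation

// Bearing (in radians) from point 1 to point 2, both given in degrees.
func bearing(lat1: Double, lon1: Double, lat2: Double, lon2: Double) -> Double {
    let phi1 = lat1 * .pi / 180, phi2 = lat2 * .pi / 180
    let deltaLambda = (lon2 - lon1) * .pi / 180
    let x = sin(deltaLambda) * cos(phi2)
    let y = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(deltaLambda)
    return atan2(x, y)
}

// From the origin to a point due east, the bearing is 90º (pi/2 radians).
let b = bearing(lat1: 0, lon1: 0, lat2: 0, lon2: 1)
```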
Another thing to consider is that the matrix transformations use radians instead of degrees as the angle unit. As an entire circumference is equal to 2π radians (360º), one radian is equal to 180/π degrees.
So this is the plan.
Using Pusher, the drivers will publish their location and the direction they’re heading in realtime.
Using CoreLocation, the AR app is going to get your location. It will also listen to the driver’s location updates.
When a location update is received, using the formulas explained above, the app will place a 3D model of a car in a position relative to your location inside the AR world, and it will orient the model to the same direction the driver is heading.
The app is only going to get your location once, so it assumes your location is fixed (which is true most of the time).
In addition, an arrow emoji (⬇️) will be shown on top of the model at all times so you can spot it easily, and the text view you added in the last section will show the status of the app and the distance between you and the car.
Now that you know what to do, let’s get into the code.
Let’s start by defining two extensions.
One to provide conversion methods between radians and degrees to all floating-point types. Create a new Swift file, `FloatingPoint+Extension.swift`, with the following content:

```swift
import Foundation

extension FloatingPoint {
    func toRadians() -> Self {
        return self * .pi / 180
    }

    func toDegrees() -> Self {
        return self * 180 / .pi
    }
}
```
And another extension to create an image from a string. Create another Swift file, `String+Extension.swift`, with the following content (taken from this StackOverflow answer):

```swift
import UIKit

extension String {
    func image() -> UIImage? {
        let size = CGSize(width: 100, height: 100)
        UIGraphicsBeginImageContextWithOptions(size, false, 0)
        UIColor.clear.set()
        let rect = CGRect(origin: CGPoint(), size: size)
        UIRectFill(CGRect(origin: CGPoint(), size: size))
        (self as NSString).draw(in: rect, withAttributes: [NSAttributedStringKey.font: UIFont.systemFont(ofSize: 90)])
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }
}
```
You’ll use this extension to create an image out of the arrow emoji (a string). It creates a rectangle of width 100 and height 100, with a transparent background, and draws the string inside of it with a font size of 90.
Next, open the New File dialog and scroll down to choose the Asset Catalog type:
Enter `art.scnassets` as the file name (confirming the use of the extension `.scnassets`):
Now copy your model to this folder:
Open the Scene Graph View, select the main node of your model and, in the properties tab, give it a name, which you’ll use to reference it in the code:
Back in `ViewController.swift`, let’s add the import statements we’ll need:

```swift
import SceneKit
import CoreLocation
import PusherSwift
```
And the delegates the controller will use:

```swift
class ViewController: UIViewController, ARSCNViewDelegate, CLLocationManagerDelegate {
    ...
}
```
Next, let’s add some instance variables.
First, a `CLLocationManager` to request the user's location, and another variable to store it:

```swift
class ViewController: UIViewController, ARSCNViewDelegate, CLLocationManagerDelegate {
    ...
    let locationManager = CLLocationManager()
    var userLocation = CLLocation()

    ...
}
```
Then, variables to store the direction the driver is heading, the distance between the driver and the user, and the status of the app:

```swift
class ViewController: UIViewController, ARSCNViewDelegate, CLLocationManagerDelegate {
    ...
    var heading: Double! = 0.0
    var distance: Float! = 0.0 {
        didSet {
            setStatusText()
        }
    }
    var status: String! {
        didSet {
            setStatusText()
        }
    }

    ...

    func setStatusText() {
        var text = "Status: \(status!)\n"
        text += "Distance: \(String(format: "%.2f m", distance))"
        statusTextView.text = text
    }
}
```
Whenever a new value for the distance or the status is set, the text view will be updated. Notice that the distance is calculated in meters.
Next, a variable to store the root node of the car model, and the name of this node, which should be the same as the one you set in the SceneKit editor:

```swift
class ViewController: UIViewController, ARSCNViewDelegate, CLLocationManagerDelegate {
    ...
    var modelNode: SCNNode!
    let rootNodeName = "Car"

    ...
}
```
You’ll also need the original (first) transformation of that node:

```swift
class ViewController: UIViewController, ARSCNViewDelegate, CLLocationManagerDelegate {
    ...
    var originalTransform: SCNMatrix4!

    ...
}
```
Why?
To calculate the orientation (rotation) of the model in the best possible way.
Ideally, the driver’s device will always give you the correct heading so you can take the first received reading, rotate the model in that direction, and then calculate the next rotations relative to the first one.
However, if the first reading is wrong (which happens sometimes), the next rotations will be wrong even if the rest of the readings are correct.
So you always need to calculate the orientation as if it were the first time you rotate the model, because once you rotate the model by a certain angle, the following rotations will be relative to that angle. Resetting the rotation to 0º won’t work either, because of the way transformations work (matrix multiplication).
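To see why, here's a tiny standalone sketch, using plain angles instead of matrices (purely illustrative, not the app's code): applying each new heading relative to the current orientation accumulates, while applying it relative to the saved original orientation always yields the intended absolute angle.

```swift
// Illustrative sketch: composing rotations accumulates, so each new
// heading must be applied relative to the original orientation.
var currentAngle = 0.0

// Naive approach: rotate the node by each new heading reading.
func rotateRelative(by heading: Double) {
    currentAngle += heading // rotations compose: the result depends on history
}

// The approach used in this tutorial: always start from the original
// transform (angle 0 here, like originalTransform), so the result
// depends only on the latest reading.
func rotateFromOriginal(to heading: Double) {
    let originalAngle = 0.0
    currentAngle = originalAngle + heading
}

rotateRelative(by: 30)
rotateRelative(by: 50)       // ends up at 80º, not the intended 50º
let naive = currentAngle

rotateFromOriginal(to: 50)   // ends up exactly at 50º
let absolute = currentAngle
```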
Finally, you’ll need to store the Pusher object and the channel you'll use to receive the updates:

```swift
class ViewController: UIViewController, ARSCNViewDelegate, CLLocationManagerDelegate {
    ...
    let pusher = Pusher(
        key: "YOUR_PUSHER_APP_KEY",
        options: PusherClientOptions(
            authMethod: .inline(secret: "YOUR_PUSHER_APP_SECRET"),
            host: .cluster("YOUR_PUSHER_APP_CLUSTER")
        )
    )
    var channel: PusherChannel!

    ...
}
```
Notice the value of the `authMethod` option. You’ll be receiving the updates through a private channel, and subscriptions to private channels need to be authenticated by a server. However, at development time, you can use the `inline` option to bypass the need to set up an auth endpoint on a server.
You can learn more about the object’s options here. If you need it, you can learn how to create an authentication endpoint on this page.
In the `viewDidLoad` method, set up the SceneKit scene and the location service:

```swift
override func viewDidLoad() {
    super.viewDidLoad()

    // Set the view's delegate
    sceneView.delegate = self

    // Create a new scene
    let scene = SCNScene()

    // Set the scene to the view
    sceneView.scene = scene

    // Start location services
    locationManager.delegate = self
    locationManager.desiredAccuracy = kCLLocationAccuracyBest
    locationManager.requestWhenInUseAuthorization()

    // Set the initial status
    status = "Getting user location..."

    // Set a padding in the text view
    statusTextView.textContainerInset = UIEdgeInsetsMake(20.0, 10.0, 10.0, 0.0)
}
```
Next, configure the AR session:
```swift
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    // Create a session configuration
    let configuration = ARWorldTrackingConfiguration()
    configuration.worldAlignment = .gravityAndHeading

    // Run the view's session
    sceneView.session.run(configuration)
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)

    // Pause the view's session
    sceneView.session.pause()
}
```
The option gravityAndHeading will set the y-axis to the direction of gravity as detected by the device, and the x- and z-axes to the longitude and latitude directions as measured by Location Services.
To get the user's position, once they have authorized the use of location services, you have to request the location (the requestLocation method is used so the location is only requested once):
```swift
//MARK: - CLLocationManager
func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
    // Implementing this method is required
    print(error.localizedDescription)
}

func locationManager(_ manager: CLLocationManager,
                     didChangeAuthorization status: CLAuthorizationStatus) {
    if status == .authorizedWhenInUse {
        locationManager.requestLocation()
    }
}
```
Once the user’s location is received, take the last element of the array, update the status, and connect to Pusher (it doesn’t make sense to connect to Pusher before having the user's location, because all the calculations would be wrong):
```swift
func locationManager(_ manager: CLLocationManager,
                     didUpdateLocations locations: [CLLocation]) {
    if let location = locations.last {
        userLocation = location
        status = "Connecting to Pusher..."

        self.connectToPusher()
    }
}
```
In the method `connectToPusher`, you subscribe to `private-channel` and, when a `client-new-location` event is received, extract the driver’s latitude, longitude, and heading, then update the status and the location of the 3D model with the method `updateLocation`:
```swift
//MARK: - Utility methods
func connectToPusher() {
    // subscribe to channel and bind to event
    let channel = pusher.subscribe("private-channel")

    let _ = channel.bind(eventName: "client-new-location", callback: { (data: Any?) -> Void in
        if let data = data as? [String : AnyObject] {
            if let latitude = Double(data["latitude"] as! String),
                let longitude = Double(data["longitude"] as! String),
                let heading = Double(data["heading"] as! String) {
                self.status = "Driver's location received"
                self.heading = heading
                self.updateLocation(latitude, longitude)
            }
        }
    })

    pusher.connect()
    status = "Waiting to receive location events..."
}
```
In `updateLocation`, create a CLLocation object to calculate the distance between the user and the driver. Remember that the distance is calculated in meters:

```swift
func updateLocation(_ latitude: Double, _ longitude: Double) {
    let location = CLLocation(latitude: latitude, longitude: longitude)
    self.distance = Float(location.distance(from: self.userLocation))
}
```
If this is the first update received, `self.modelNode` will be `nil`, so you have to instantiate the model:

```swift
func updateLocation(_ latitude: Double, _ longitude: Double) {
    ...
    if self.modelNode == nil {
        let modelScene = SCNScene(named: "art.scnassets/Car.dae")!
        self.modelNode = modelScene.rootNode.childNode(withName: rootNodeName, recursively: true)!
    }
}
```
Next, you need to move the pivot of the model to its center in the y-axis so it can be rotated without changing its position:
```swift
func updateLocation(_ latitude: Double, _ longitude: Double) {
    ...
    if self.modelNode == nil {
        ...
        // Move model's pivot to its center in the Y axis
        let (minBox, maxBox) = self.modelNode.boundingBox
        self.modelNode.pivot = SCNMatrix4MakeTranslation(0, (maxBox.y - minBox.y)/2, 0)
    }
}
```
Save the model’s transform to calculate future rotations, position it, and add it to the scene:
```swift
func updateLocation(_ latitude: Double, _ longitude: Double) {
    ...
    if self.modelNode == nil {
        ...
        // Save original transform to calculate future rotations
        self.originalTransform = self.modelNode.transform

        // Position the model in the correct place
        positionModel(location)

        // Add the model to the scene
        sceneView.scene.rootNode.addChildNode(self.modelNode)
    }
}
```
Notice that there’s no need to create an ARAnchor and add the node as a child of it. An `ARAnchor` gives you the ability to track positions and orientations of models relative to the camera, but in this case, it’s better to work with the node directly, mostly because you cannot delete or change the position of the whole `ARAnchor` manually, only of its children.
Finally, create the arrow from an emoji, position it on top of the car (using the y-axis, I got the value by trial and error), and add it as a child of the model (so it stays with it at all times):
```swift
func updateLocation(_ latitude: Double, _ longitude: Double) {
    ...
    if self.modelNode == nil {
        ...
        // Create arrow from the emoji
        let arrow = makeBillboardNode("⬇️".image()!)
        // Position it on top of the car
        arrow.position = SCNVector3Make(0, 4, 0)
        // Add it as a child of the car model
        self.modelNode.addChildNode(arrow)
    }
}
```
This is the definition of the `makeBillboardNode` method (taken from this StackOverflow answer, modifying the width and height of the plane so the arrow can be properly seen):

```swift
func makeBillboardNode(_ image: UIImage) -> SCNNode {
    let plane = SCNPlane(width: 10, height: 10)
    plane.firstMaterial!.diffuse.contents = image
    let node = SCNNode(geometry: plane)
    node.constraints = [SCNBillboardConstraint()]
    return node
}
```
Now, if this is not the first update, you just need to position the model, animating the movement so it looks nice:
```swift
func updateLocation(_ latitude: Double, _ longitude: Double) {
    ...
    if self.modelNode == nil {
        ...
    } else {
        // Begin animation
        SCNTransaction.begin()
        SCNTransaction.animationDuration = 1.0

        // Position the model in the correct place
        positionModel(location)

        // End animation
        SCNTransaction.commit()
    }
}
```
To position the model, you rotate it first, then translate it to the correct position, and finally scale it:
```swift
func positionModel(_ location: CLLocation) {
    // Rotate node
    self.modelNode.transform = rotateNode(Float(-1 * (self.heading - 180).toRadians()), self.originalTransform)

    // Translate node
    self.modelNode.position = translateNode(location)

    // Scale node
    self.modelNode.scale = scaleNode(location)
}
```
The order is important because of how matrix multiplication works (a * b is not the same as b * a).
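As a quick standalone illustration of non-commutativity (plain 2×2 matrices rather than SceneKit types), multiplying a shear matrix and a scale matrix in different orders gives different results:

```swift
// Illustrative sketch: 2x2 matrices as nested arrays, row by row.
typealias Matrix2 = [[Double]]

func multiply(_ a: Matrix2, _ b: Matrix2) -> Matrix2 {
    return [
        [a[0][0] * b[0][0] + a[0][1] * b[1][0], a[0][0] * b[0][1] + a[0][1] * b[1][1]],
        [a[1][0] * b[0][0] + a[1][1] * b[1][0], a[1][0] * b[0][1] + a[1][1] * b[1][1]]
    ]
}

let shear: Matrix2 = [[1, 1], [0, 1]]   // shear along x
let scale: Matrix2 = [[2, 0], [0, 1]]   // scale x by 2

let ab = multiply(shear, scale)  // [[2, 1], [0, 1]]
let ba = multiply(scale, shear)  // [[2, 2], [0, 1]]
```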
In ARKit, rotation around the y-axis is counterclockwise (and handled in radians), so we need to subtract 180º and make the angle negative. This is the definition of the method `rotateNode`:
```swift
func rotateNode(_ angleInRadians: Float, _ transform: SCNMatrix4) -> SCNMatrix4 {
    let rotation = SCNMatrix4MakeRotation(angleInRadians, 0, 1, 0)
    return SCNMatrix4Mult(transform, rotation)
}
```
I scale the node in proportion to the distance. They are inversely proportional: the greater the distance, the smaller the scale. In my case, I just divide 1000 by the distance and don't allow the value to be less than 1.5 or greater than 3:
```swift
func scaleNode(_ location: CLLocation) -> SCNVector3 {
    let scale = min(max(Float(1000/distance), 1.5), 3)
    return SCNVector3(x: scale, y: scale, z: scale)
}
```
I got these values from trial and error. They will vary depending on the model you’re using.
To translate the node, you have to calculate the transformation matrix and get the position values from that matrix (from its fourth column, referenced by a zero-based index):
```swift
func translateNode(_ location: CLLocation) -> SCNVector3 {
    let locationTransform =
        transformMatrix(matrix_identity_float4x4, userLocation, location)
    return positionFromTransform(locationTransform)
}

func positionFromTransform(_ transform: simd_float4x4) -> SCNVector3 {
    return SCNVector3Make(
        transform.columns.3.x, transform.columns.3.y, transform.columns.3.z
    )
}
```
To calculate the transformation matrix, you rotate an identity matrix by the bearing between the two locations, translate another identity matrix by the distance along the negative z-axis, and multiply both matrices. All this is done with the following methods:
```swift
func transformMatrix(_ matrix: simd_float4x4, _ originLocation: CLLocation, _ driverLocation: CLLocation) -> simd_float4x4 {
    let bearing = bearingBetweenLocations(userLocation, driverLocation)
    let rotationMatrix = rotateAroundY(matrix_identity_float4x4, Float(bearing))

    let position = vector_float4(0.0, 0.0, -distance, 0.0)
    let translationMatrix = getTranslationMatrix(matrix_identity_float4x4, position)

    let transformMatrix = simd_mul(rotationMatrix, translationMatrix)

    return simd_mul(matrix, transformMatrix)
}

func getTranslationMatrix(_ matrix: simd_float4x4, _ translation: vector_float4) -> simd_float4x4 {
    var matrix = matrix
    matrix.columns.3 = translation
    return matrix
}

func rotateAroundY(_ matrix: simd_float4x4, _ radians: Float) -> simd_float4x4 {
    var matrix = matrix

    matrix.columns.0.x = cos(radians)
    matrix.columns.0.z = -sin(radians)

    matrix.columns.2.x = sin(radians)
    matrix.columns.2.z = cos(radians)
    return matrix.inverse
}

func bearingBetweenLocations(_ originLocation: CLLocation, _ driverLocation: CLLocation) -> Double {
    let lat1 = originLocation.coordinate.latitude.toRadians()
    let lon1 = originLocation.coordinate.longitude.toRadians()

    let lat2 = driverLocation.coordinate.latitude.toRadians()
    let lon2 = driverLocation.coordinate.longitude.toRadians()

    let longitudeDiff = lon2 - lon1

    let y = sin(longitudeDiff) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(longitudeDiff)

    return atan2(y, x)
}
```
Regarding the rotation around the y-axis, the method returns the inverse of the matrix because rotations in ARKit are counterclockwise. Here’s an answer from Mathematics Stack Exchange that explains rotation matrices pretty well.
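A useful property to keep in mind here (shown as a standalone sketch with plain 2×2 matrices, not the app's simd code): the inverse of a rotation matrix is simply the rotation by the negative angle, so composing the two gives back the identity.

```swift
import Foundation

// 2x2 rotation matrix for an angle in radians, stored row by row.
func rotation2D(_ angle: Double) -> [[Double]] {
    return [[cos(angle), -sin(angle)],
            [sin(angle),  cos(angle)]]
}

func multiply(_ a: [[Double]], _ b: [[Double]]) -> [[Double]] {
    return [
        [a[0][0] * b[0][0] + a[0][1] * b[1][0], a[0][0] * b[0][1] + a[0][1] * b[1][1]],
        [a[1][0] * b[0][0] + a[1][1] * b[1][0], a[1][0] * b[0][1] + a[1][1] * b[1][1]]
    ]
}

// R(theta) * R(-theta) should be the identity matrix (up to floating-point error).
let theta = Double.pi / 3
let product = multiply(rotation2D(theta), rotation2D(-theta))
```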
And that’s it, time to test the app.
The first time you run the app, you’ll have to give permissions to the camera:
And to the location service:
And wait for a few seconds so the app can get the location and connect to Pusher.
To test it, you’ll need someone that publishes location events while driving.
On this GitHub repository, you can find an app for iOS that publishes location events.
It uses CoreLocation, and the code is pretty similar to the one shown in the previous section but it requests the location information every one or two seconds.
As a note, for the heading measurement, it’s important to hold the device in the direction the driver is heading.
For a quick test, you can use the following Node.js script to manually send some location coordinates (that you can get from this site) every two seconds:
```javascript
const Pusher = require('pusher');

const pusher = new Pusher({
  appId: 'YOUR_PUSHER_APP_ID',
  key: 'YOUR_PUSHER_APP_KEY',
  secret: 'YOUR_PUSHER_APP_SECRET',
  cluster: 'YOUR_PUSHER_APP_CLUSTER',
  encrypted: true
});

const locations = [
  {latitude: "", longitude: "-", heading: ""},
  {latitude: "", longitude: "-", heading: ""},
  {latitude: "", longitude: "-", heading: ""},
  {latitude: "", longitude: "-", heading: ""},
  {latitude: "", longitude: "-", heading: ""},
  {latitude: "", longitude: "-", heading: ""},
  {latitude: "", longitude: "-", heading: ""},
  {latitude: "", longitude: "-", heading: ""},
  {latitude: "", longitude: "-", heading: ""}
];

locations.forEach((loc, index) => {
  setTimeout(() => {
    console.log(loc);
    pusher.trigger('private-channel', 'client-new-location', loc);
  }, 2000*index);
});
```
Once you have Node.js installed, you just have to copy this script to a file, say `publish.js`, and create a `package.json` file with the command:
```shell
npm init
```
Install the Pusher Node.js library with:
```shell
npm install --save pusher
```
Enter your Pusher and location info and execute the script with:
```shell
node publish.js
```
Once the app starts receiving location events, the 3D model of the car will appear in the direction where it is in the real world (with a small size if it’s far from you):
You have learned how to combine the power of ARKit, CoreLocation, and Pusher to create an AR app.
You can add more features to make it more useful.
However, keep in mind that the app depends on the quality of the information received.
In my tests, for a few seconds after starting the driver’s app, the heading information was completely wrong, and overall, the position was off by a few meters.
ARKit occasionally gets confused too. Sometimes this can be a problem, and it is another area for improvement. However, we’re just at the beginning, and without a doubt, these frameworks will improve over time.
Remember that you can find the entire project on this GitHub repository and you can contact me if you have questions.