Unity ARKit By Example: Part 2

Creating a Hello World application.


This article is part of a series starting with Unity ARKit By Example: Part 1.

Hello World

We start by building the absolute minimum application that uses Unity-ARKit-Plugin.

First, we create a scene that supports the plugin:

  1. In Unity, we create a new folder, Assets > Scenes, and create a new scene in it, e.g., Hello World
  2. We modify the Main Camera, setting Transform > Position to (0, 0, 0) and Camera > Clear Flags to Depth only
  3. We add the Unity AR Video script to the Main Camera. We then set Unity AR Video > Clear Material to Assets > UnityARKitPlugin > Plugins > iOS > UnityARKit > Materials > YUVMaterial
  4. We add the Unity AR Camera Near Far script to the Main Camera
  5. We create a new empty GameObject, e.g., ARCameraManager, and add the Unity AR Camera Manager script to it. We set Unity AR Camera Manager > Camera to the Main Camera
  6. We add the prefab, Assets > UnityARKitPlugin > ARKitRemote > ARKitRemote, to the scene
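For reference, the camera wiring in steps 2 through 5 can also be done from a script. The following is a sketch only, assuming the component and field names found in the Unity-ARKit-Plugin source (UnityARVideo.m_ClearMaterial and UnityARCameraManager.m_camera); verify these against your plugin version:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS; // Unity-ARKit-Plugin namespace

public class SceneSetup : MonoBehaviour
{
    // Assign the plugin's YUVMaterial to this field in the Inspector.
    public Material clearMaterial;

    void Awake()
    {
        // Step 2: position the camera at the origin and clear depth only.
        Camera mainCamera = Camera.main;
        mainCamera.transform.position = Vector3.zero;
        mainCamera.clearFlags = CameraClearFlags.Depth;

        // Step 3: add Unity AR Video and set its Clear Material.
        UnityARVideo video = mainCamera.gameObject.AddComponent<UnityARVideo>();
        video.m_ClearMaterial = clearMaterial;

        // Step 4: add Unity AR Camera Near Far.
        mainCamera.gameObject.AddComponent<UnityARCameraNearFar>();

        // Step 5: create ARCameraManager and point it at the Main Camera.
        GameObject managerObject = new GameObject("ARCameraManager");
        UnityARCameraManager cameraManager =
            managerObject.AddComponent<UnityARCameraManager>();
        cameraManager.m_camera = mainCamera;
    }
}
```

In practice the editor workflow above is simpler; the script version is mainly useful if you need to build the scene programmatically.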

Now we add something specific to our application: a Cube with:

  • Position: (0, 0, 2)
  • Scale: (0.3, 0.3, 0.3)
  • Rotation: (0, 45, 0)
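In script form, the same cube could be created like this (a sketch; in the article it is added through the editor):

```csharp
// Create a cube 2 m in front of the camera origin,
// scaled down to 30 cm and rotated 45° about the y-axis.
GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
cube.transform.position = new Vector3(0f, 0f, 2f);
cube.transform.localScale = new Vector3(0.3f, 0.3f, 0.3f);
cube.transform.rotation = Quaternion.Euler(0f, 45f, 0f);
```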

With this in place, we follow the instructions in the section Using ARKitRemote of the article Unity ARKit By Example: Part 1 to run the application in Unity.



  • It is important to size the Game pane to the same aspect ratio as a portrait iOS device; otherwise the cube will appear distorted

We now follow the instructions in the section The Build Process of the article Unity ARKit By Example: Part 1 to build and deploy the application to an iOS device. A couple of things to keep in mind:

  • Change the bundle identifier to something unique to your application; in my case com.larkintuckerllc.hellounityarkit
  • Make sure to only include the Hello World scene in the build

Now running the application on an iOS device, we can see the cube projected into our room.



  • The night before, I happened to run the application in pitch darkness. Under those conditions, the cube danced around the room, which makes sense: ARKit had no visual features to orient to

Hello World (Point Cloud)

The Hello World application only used Unity-ARKit-Plugin indirectly, to project the cube into our room. Let us update the application to use one of the plugin's APIs directly.

A number of the provided APIs are events that one can subscribe to; the simplest of them is ARFrameUpdatedEvent. Based on the event's name, I presume that it behaves much like Unity's Update event: it fires frequently and can be used to observe changes in ARKit.

In this example, we display the number of points in ARKit's point cloud on each ARFrameUpdatedEvent.

So, what is a point cloud?

Use the ARFrame rawFeaturePoints property to obtain a point cloud representing intermediate results of the scene analysis ARKit uses to perform world tracking.

— Apple — ARPointCloud

As the point cloud is used by ARKit under the hood, it is something we will likely never need to interact with directly; but it is both interesting and informative to explore.

We first create the GameObjects to display the number:

  • In the Hello World scene, we create a UI > Image GameObject, used as a background
  • We create a Text GameObject in the Canvas GameObject (created automatically along with the Image GameObject), renaming it PointCount
  • We set PointCount’s default text to 0 and its alignment to center


  • The EventSystem GameObject is also automatically created along with the Image GameObject

Now we create a script (and intermediate folders) and add it to PointCount to display the number of detected points: Assets / Scripts / Hello World / PointCount.cs:


  • The integration with Unity-ARKit-Plugin is the single line in the Start method, which subscribes a method to ARFrameUpdatedEvent
  • We use the flag _pointCountUpdated to update the text in the Update method only after the point count has changed
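A sketch of what PointCount.cs can look like, assuming the UnityARSessionNativeInterface API from Unity-ARKit-Plugin; the name of the point cloud field on UnityARCamera (pointCloudData here) has varied across plugin versions, so check yours:

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.iOS; // Unity-ARKit-Plugin namespace

public class PointCount : MonoBehaviour
{
    private Text _text;
    private int _pointCount;
    private bool _pointCountUpdated;

    void Start()
    {
        _text = GetComponent<Text>();
        // The single line of Unity-ARKit-Plugin integration:
        // subscribe a handler to ARFrameUpdatedEvent.
        UnityARSessionNativeInterface.ARFrameUpdatedEvent += ARFrameUpdated;
    }

    private void ARFrameUpdated(UnityARCamera camera)
    {
        // pointCloudData holds the feature points ARKit has detected.
        _pointCount = camera.pointCloudData != null
            ? camera.pointCloudData.Length
            : 0;
        _pointCountUpdated = true;
    }

    void Update()
    {
        // Only touch the UI after the point count has changed.
        if (!_pointCountUpdated) return;
        _text.text = _pointCount.ToString();
        _pointCountUpdated = false;
    }

    void OnDestroy()
    {
        // Unsubscribe to avoid callbacks into a destroyed object.
        UnityARSessionNativeInterface.ARFrameUpdatedEvent -= ARFrameUpdated;
    }
}
```

Updating the UI from Update rather than from the event handler keeps all UI work on Unity's main loop, which is why the _pointCountUpdated flag exists.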

With these changes in place, we run the application in Unity with ARKitRemote.



  • Observe that when we focus the iOS device on a distant wall (left), ARKit detects few (2) points
  • When focused on a near scene with a number of objects, ARKit detects many (149) points

Next Steps

In the next article, Unity ARKit By Example: Part 3, we will build an application that makes more extensive use of Unity ARKit.
