Apple is finally getting into augmented reality
Augmented reality isn’t a new technology, but as our phones and tablets become more powerful, the limits of what is possible have started to melt away. With Apple jumping into AR, there are a few additional reasons to get excited. At WWDC 2017, Apple showed off ARKit, which the company boasted would make iOS the “largest AR platform in the world.”
In a pretty amazing demo, Apple highlighted ARKit’s marker-less spatial awareness with a tabletop gaming title featuring an airship attack on an enemy outpost. ARKit supports Unity, Unreal Engine, and SceneKit, and will be coming to iPad and iPhone.
There are already a variety of third-party SDKs for AR development, such as ARToolKit, ARmedia, Vuforia, and Wikitude, in addition to Microsoft’s HoloLens SDK and Google’s Tango SDK. This year, Apple and Facebook have thrown their hats into the ring. With so many SDKs around, a few questions come to mind:
How good is ARKit? Have you seen it in action? And does it feel strangely familiar?
Do you remember our favourite Metaio SDK?
We used the Metaio SDK before the German company behind it was bought by Apple shortly after launch. After that, the SDK disappeared, and we were left in the dark about when Apple would finish integrating it into its ecosystem. With the upcoming release of iOS 11, that time seems to have come, and given what ARKit promises to do, I am genuinely excited.
Apple doesn’t usually talk about the origins of its technology and frameworks, but Metaio’s technology looked solid and it had an API that, according to reports, was already very good.
With the 3D nature of augmented and mixed reality becoming an important element of our world, the development workflow is becoming more complex than it was before. For those new to this space, the Unity 3D engine does an amazing job of stripping unnecessary complexity out of 3D development.
Along with the release of Apple’s new ARKit, Unity announced the Unity ARKit Plugin to help developers use the Unity environment to build for iOS devices. This incredibly useful starter kit includes examples of some of the more advanced areas of the new iOS functionality, like surface detection.
Developers interested in building things for iOS 11 can head to the Apple Developer site today, where you’ll find forums for building AR apps and beta downloads for everything you need to get started.
In this augmented reality tutorial, we will use Unity 3D and Apple’s ARKit to create an augmented reality app for your iPhone or iPad.
Prerequisite (Tools Needed):
First, you will need a Mac computer (macOS); Apple is really strict about this. Second, you will need an Apple developer account to download the necessary tools to get started.
- Unity Engine (Unity’s ARKit plugin requires patch version 5.6.1p1 of Unity or later)
- Xcode 9 beta
- A device with an Apple A9 or A10 processor, with the iOS 11 beta installed
- Unity ARKit Plugin Download Here
Step by Step Implementation
1. Install all the software you will need (Prerequisite):
- Log into the Apple developers website and click on the “Downloads” button. If your developer account is active, you will see a collection of beta software options.
2. Setup a Simple Unity Project:
Once everything is installed and ready to go, it is time to create an empty project in Unity. Let’s start by creating a new Unity 3D project named “Tutorial4”. (Disable Unity Analytics for now.)
3. Import the ARKit Plugin into newly created Unity Project:
- Import the Unity ARKit Plugin which we downloaded in the prerequisites section.
- This is done by importing it as a custom UnityPackage; I stored the package in a folder on my machine.
- To import a new custom package:
- Choose Assets > Import Package > Custom Package… to bring up File Explorer (Windows) or Finder (Mac).
- Select the package you want from Explorer or Finder, and the Import Unity Package dialog box displays, with all the items in the package pre-checked, ready to install. (See Fig 4: New install Import Unity Package dialog box.)
- Click the “Import” button, and Unity puts the contents of the package into the Assets folder, which you can access from your Project View.
4. Open Unity-Scene “UnityARKitScene” inside Unity Project
- In the Project window, double-click the “UnityARKitScene” file.
5. Final Deployment Setup & Build Process
We are almost done. Let’s save the scene (File >> Save Scene) and move on to the deployment step.
- The last step is to build the project for the iOS (iPhone/iPad) platform. Go to “File >> Build Settings”, add the current scene by selecting “Add Open Scenes”, then select the iOS platform and click “Switch Platform”.
- Build Project: this will export the current Unity project to Xcode so it can be deployed on an iOS device.
- Unity provides a number of settings when building for iOS devices – select from the menu (File > Build Settings… > Player Settings…) to see/modify the current settings.
- With the Player Settings up in the Inspector, set the Bundle Identifier, then find the Camera Usage Description field and type a short description such as ‘camera use’ into it. Then, back in the Build Settings window, click the “Build” button.
- Now Unity wants to know where to put your Xcode project. In the Save As box, type “AR_DEMO” and click the “Save” button as per the screenshot below.
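The Camera Usage Description you typed in the Player Settings ends up in the generated Xcode project’s Info.plist, which is what lets iOS show the camera-permission prompt. As a rough illustration (the description string below is just a placeholder), the relevant entry looks something like this:

```xml
<!-- Excerpt from the generated Info.plist -->
<key>NSCameraUsageDescription</key>
<string>camera use</string>
```

If this key is missing or empty, iOS 11 will terminate the app the first time it tries to access the camera.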
6. Compile & Run through Xcode
- When the build process completes, a Finder window will appear. Open the “AR_DEMO” folder and double-click “Unity-iPhone.xcodeproj”.
- We are almost done. You should now be seeing a screen similar to the one in the screenshot below.
- Assign Team to the Project: In the section labeled ‘Signing’, click on the ‘Team‘ drop-down and select your ‘Developer account’.
- With that out of the way, look at the upper-left corner of the window. With your iPhone or iPad connected to your computer, hit the “Play” button. It can take a little bit to compile the first time.
7. Show Time 🙂
Once it is done, you will be able to look at the point cloud information and surfaces that the system detects.
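Under the hood, those detected surfaces come from ARKit’s plane anchors, which the Unity plugin wraps for you. Purely as an illustration of what the native API is doing (this is not part of the Unity workflow above, and the class and method names reflect my reading of the iOS 11 API, so treat it as a sketch), enabling horizontal plane detection in Swift looks roughly like this:

```swift
import ARKit
import UIKit

class ARViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    func startTracking() {
        // Ask ARKit to track the world and detect horizontal surfaces
        // (tabletops, floors) as it builds its point cloud.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        session.delegate = self
        session.run(configuration)
    }

    // Called whenever ARKit adds new anchors, e.g. a freshly detected plane.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            print("Detected plane, extent: \(plane.extent)")
        }
    }
}
```

The Unity ARKit Plugin listens for the same anchor events and turns them into GameObjects, which is why surfaces simply appear in your scene without any extra code.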
If you get stuck at any point or want to view the source code, you can find it on Github. Let me know if you have any questions in the comments section below!
You did a great job!!! It’s Guinness time now! 🙂