Original: How To Make An App Like Pokemon Go
Author: Jean-Pierre Distler
Translator: kmyhy
One of the most popular mobile games today is Pokémon GO. It uses augmented reality to bring the game into the "real world" and encourages players to get outside and be active.
In this tutorial, we will write our own augmented reality monster-catching game. The game shows a map with your location and the enemies' locations, and uses a 3D SceneKit view to display the feed from the rear camera together with the enemies' 3D models.
If you're new to augmented reality, take a look at our location-based AR tutorial. It isn't required for following this tutorial, but it contains a lot of useful math and AR background that isn't covered here.
Getting Started
Download the starter project for this tutorial here. The project contains two view controllers and an art.scnassets folder with the 3D models and textures you'll need.
ViewController.swift contains a UIViewController subclass that displays the app's AR content. MapViewController displays a map with your current location and the locations of nearby enemies. Basic things such as constraints and outlets are already set up, so you can focus on the core of this tutorial: how to make an app like Pokémon GO.
Add enemies to the map
Before you can fight an enemy, you need to know where it is. Create a new Swift file called ARItem.swift.
In ARItem.swift, add the following after the import Foundation line:
import CoreLocation
struct ARItem {
  let itemDescription: String
  let location: CLLocation
}
ARItem has a description and a location, so we know what kind of enemy it is and where it is.
Open MapViewController.swift, add an import CoreLocation statement, and add a property:
var targets = [ARItem]()
Add the following methods:
func setupLocations() {
  let firstTarget = ARItem(itemDescription: "wolf", location: CLLocation(latitude: 0, longitude: 0))
  targets.append(firstTarget)

  let secondTarget = ARItem(itemDescription: "wolf", location: CLLocation(latitude: 0, longitude: 0))
  targets.append(secondTarget)

  let thirdTarget = ARItem(itemDescription: "dragon", location: CLLocation(latitude: 0, longitude: 0))
  targets.append(thirdTarget)
}
We create three hard-coded enemies. You'll replace the (0, 0) coordinates with coordinates near your physical location.
There are many ways to find coordinates. For example, you could generate random coordinates near your current location, use the PlacesLoader from our previous tutorial, or use Xcode to simulate your location. Of course, you don't want a random coordinate to land in your neighbor's bedroom. That would be embarrassing.
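If you'd rather generate test coordinates in code instead of picking them by hand, a minimal sketch could look like the one below. The randomLocation(around:radius:) helper and the 100-meter radius are assumptions for illustration, not part of the starter project:

import CoreLocation

// Returns a location a random distance (up to `radius` meters) from `center`.
// Rough approximation: one degree of latitude is about 111,000 meters; the
// longitude offset is scaled by cos(latitude) so the spread stays roughly circular.
func randomLocation(around center: CLLocation, radius: Double = 100) -> CLLocation {
  let distance = Double(arc4random_uniform(UInt32(radius)))
  let bearing = Double(arc4random_uniform(360)) * M_PI / 180.0
  let latDelta = (distance * cos(bearing)) / 111_000.0
  let lonDelta = (distance * sin(bearing)) /
    (111_000.0 * cos(center.coordinate.latitude * M_PI / 180.0))
  return CLLocation(latitude: center.coordinate.latitude + latDelta,
                    longitude: center.coordinate.longitude + lonDelta)
}

You could then build each ARItem with randomLocation(around:) applied to your own location instead of a hand-picked coordinate.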
The simplest approach is to use Google Maps. Open https://www.google.com/maps/ and find your current location. When you click on the map, a pin appears and an info bubble pops up at the bottom.
The latitude and longitude are shown in the bubble. I suggest you pick locations around your own home or along your street, so you don't have to knock on your neighbor's door and explain that you need to get into their bedroom to catch a dragon.
Select three locations and replace the 0 in the code above with the coordinates you selected.
<img src='https://koenig-media.raywenderlich.com/uploads/2016/12/Google_Maps.png' width='400'/>
Mark enemies on the map
Now that the enemy coordinates are set, we should show them on the map. Add a new Swift file named MapAnnotation.swift and write the following code in it:
import MapKit
class MapAnnotation: NSObject, MKAnnotation {
  //1
  let coordinate: CLLocationCoordinate2D
  let title: String?

  //2
  let item: ARItem

  //3
  init(location: CLLocationCoordinate2D, item: ARItem) {
    self.coordinate = location
    self.item = item
    self.title = item.itemDescription

    super.init()
  }
}
We created a MapAnnotation class and implemented the MKAnnotation protocol.
- The protocol requires a coordinate property and an optional title.
- The item property holds the ARItem associated with the pin.
- The initializer assigns values to all of the properties.
Back in MapViewController.swift, add the following at the end of the setupLocations() method:
for item in targets {
  let annotation = MapAnnotation(location: item.location.coordinate, item: item)
  self.mapView.addAnnotation(annotation)
}
This loops through the targets array and adds a pin to the map for each target.
At the end of viewDidLoad(), call setupLocations():
override func viewDidLoad() {
  super.viewDidLoad()
  mapView.userTrackingMode = MKUserTrackingMode.followWithHeading
  setupLocations()
}
Before we can use location services, we need the user's permission.
Add a new property to the MapViewController:
let locationManager = CLLocationManager()
At the end of viewDidLoad(), add the code that requests permission:
if CLLocationManager.authorizationStatus() == .notDetermined {
  locationManager.requestWhenInUseAuthorization()
}
Note: Without this permission request, the map view cannot load the user's location, and no error message appears. Whenever you use location services and can't get location data, checking the authorization status is a good first troubleshooting step.
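If location data still doesn't come through, remember that requestWhenInUseAuthorization() only shows the system prompt when Info.plist contains the NSLocationWhenInUseUsageDescription key (the starter project should already include it). As a quick debugging aid, you can also print the current authorization status; this snippet is only an illustration and isn't part of the tutorial code:

// Print the current authorization status as a debugging aid.
switch CLLocationManager.authorizationStatus() {
case .notDetermined:
  print("Location permission not requested yet")
case .denied, .restricted:
  print("Location permission denied or restricted")
default:
  print("Location permission granted")
}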
Run the app and wait for the map to zoom to your current location and show red pins marking the enemies' locations.
<img src='https://koenig-media.raywenderlich.com/uploads/2016/12/FirstRun.png' width='200'/>
Add Augmented Reality
We have an app that looks good, but we still need the AR elements. In the next sections, we'll add a camera view and a simple cube to represent the enemy.
First we need to track the user's location. Declare a property in MapViewController.swift:
var userLocation: CLLocation?
Then add an extension:
extension MapViewController: MKMapViewDelegate {
  func mapView(_ mapView: MKMapView, didUpdate userLocation: MKUserLocation) {
    self.userLocation = userLocation.location
  }
}
This delegate method is called each time the device's location changes. Here we simply save the user's location for use in another method later.
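Note that these MKMapViewDelegate callbacks only fire if the map view's delegate is set. The starter project presumably wires this up in the storyboard; if it doesn't, you can set it yourself in viewDidLoad():

// Assumption: only needed if the storyboard doesn't already connect the delegate.
mapView.delegate = self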
Add a delegate method to the extension:
func mapView(_ mapView: MKMapView, didSelect view: MKAnnotationView) {
  //1
  let coordinate = view.annotation!.coordinate
  //2
  if let userCoordinate = userLocation {
    //3
    if userCoordinate.distance(from: CLLocation(latitude: coordinate.latitude, longitude: coordinate.longitude)) < 50 {
      //4
      let storyboard = UIStoryboard(name: "Main", bundle: nil)

      if let viewController = storyboard.instantiateViewController(withIdentifier: "ARViewController") as? ViewController {
        // more code later
        //5
        if let mapAnnotation = view.annotation as? MapAnnotation {
          //6
          self.present(viewController, animated: true, completion: nil)
        }
      }
    }
  }
}
This presents the AR view when the user taps an enemy pin that is no more than 50 meters away:
- Gets the coordinates of the selected pin.
- Make sure userLocation is not nil.
- Confirm that the pin is within 50 meters of the user's position.
- Instantiate an instance of ARViewController from the storyboard.
- Check that the type of pin clicked on is MapAnnotation.
- Show viewController.
Run the app and click any pin near your location to display a blank view controller:
<img src='https://koenig-media.raywenderlich.com/uploads/2016/12/SecondRun.png' width='200'/>
Add a camera
Open ViewController.swift and add import AVFoundation after import SceneKit:
import UIKit
import SceneKit
import AVFoundation
class ViewController: UIViewController {
...
Add two properties to hold an AVCaptureSession object and an AVCaptureVideoPreviewLayer object:
var cameraSession: AVCaptureSession?
var cameraLayer: AVCaptureVideoPreviewLayer?
We use a capture session to access video inputs, such as the rear camera, and outputs, such as the video preview.
Add a method:
func createCaptureSession() -> (session: AVCaptureSession?, error: NSError?) {
  //1
  var error: NSError?
  var captureSession: AVCaptureSession?

  //2
  let backVideoDevice = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera, mediaType: AVMediaTypeVideo, position: .back)

  //3
  if backVideoDevice != nil {
    var videoInput: AVCaptureDeviceInput!
    do {
      videoInput = try AVCaptureDeviceInput(device: backVideoDevice)
    } catch let error1 as NSError {
      error = error1
      videoInput = nil
    }

    //4
    if error == nil {
      captureSession = AVCaptureSession()

      //5
      if captureSession!.canAddInput(videoInput) {
        captureSession!.addInput(videoInput)
      } else {
        error = NSError(domain: "", code: 0, userInfo: ["description": "Error adding video input."])
      }
    } else {
      error = NSError(domain: "", code: 1, userInfo: ["description": "Error creating capture device input."])
    }
  } else {
    error = NSError(domain: "", code: 2, userInfo: ["description": "Back video device not found."])
  }

  //6
  return (session: captureSession, error: error)
}
This method takes care of the following:
- Create the variables used for the return value.
- Get the rear camera.
- If the camera exists, get its device input.
- Create an AVCaptureSession.
- Add the rear camera's input to the capture session.
- Return a tuple containing the captureSession and any error.
Now that we have the input from the camera, we can add it to the view:
func loadCamera() {
  //1
  let captureSessionResult = createCaptureSession()

  //2
  guard captureSessionResult.error == nil, let session = captureSessionResult.session else {
    print("Error creating capture session.")
    return
  }

  //3
  self.cameraSession = session

  //4
  if let cameraLayer = AVCaptureVideoPreviewLayer(session: self.cameraSession) {
    cameraLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
    cameraLayer.frame = self.view.bounds

    //5
    self.view.layer.insertSublayer(cameraLayer, at: 0)
    self.cameraLayer = cameraLayer
  }
}
The code is explained as follows:
- First call the method above to get a capture session.
- If an error occurred or the capture session is nil, return immediately; in that case there will be no AR.
- Otherwise, store the capture session in the cameraSession property.
- Create a camera preview layer and, if that succeeds, set its videoGravity and frame so it fills the screen.
- Insert the preview layer (the viewfinder) as a sublayer of the view and store it in the cameraLayer property.
Then add the following to viewDidLoad():
loadCamera()
self.cameraSession?.startRunning()
This does two things: it calls the method you just wrote, and then starts the capture session so frames immediately appear on the preview layer.
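One optional refinement, not part of the original flow: startRunning() blocks until the session is running, so if presenting this screen ever feels sluggish you can start the session on a background queue instead:

// Hypothetical tweak: start the capture session off the main thread.
DispatchQueue.global(qos: .userInitiated).async { [weak self] in
  self?.cameraSession?.startRunning()
}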
Run the app, tap a pin near you, and you'll see the camera preview:
<img src='https://koenig-media.raywenderlich.com/uploads/2016/12/CameraPreview.png' width='200'/>
Add a Cube
Good job, but it's not really AR yet. In this section, we'll add a simple cube to represent the enemy and move it based on the user's location and heading.
This game has two enemies: wolf and dragon.
So the ViewController needs to know which kind of enemy it should display and where.
Add the following property to ViewController to hold that information:
var target: ARItem!
Open MapViewController.swift, find mapView(_:didSelect:) and modify the last if statement to:
if let mapAnnotation = view.annotation as? MapAnnotation {
  //1
  viewController.target = mapAnnotation.item

  self.present(viewController, animated: true, completion: nil)
}
Before presenting the viewController, we assign it the ARItem stored in the tapped pin's item property. This lets the viewController know which enemy it's dealing with.
Now the ViewController has the target information.
Open ARItem.swift and import SceneKit:
import Foundation
import SceneKit
struct ARItem {
...
}
Add a property to hold an SCNNode:
var itemNode: SCNNode?
Make sure this property is declared after the other properties of the ARItem struct, because the memberwise initializer defines its parameters in the same order as the property declarations.
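To illustrate (this snippet is only an example, not code you need to add): with itemNode declared last, the memberwise initializer keeps the parameter order itemDescription, location, itemNode.

// The compiler-generated memberwise initializer is now:
//   init(itemDescription: String, location: CLLocation, itemNode: SCNNode?)
let example = ARItem(itemDescription: "wolf",
                     location: CLLocation(latitude: 0, longitude: 0),
                     itemNode: nil)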
Xcode now reports errors in MapViewController.swift. To resolve them, open that file and find the setupLocations() method.
We need to modify each line that shows a red error marker on the left side of the editor.
<img src='https://koenig-media.raywenderlich.com/uploads/2017/01/ErrorMark.png' width='20'/>
For each of these lines, add the missing itemNode parameter and pass nil.
For example, in this line:
let firstTarget = ARItem(itemDescription: "wolf", location: CLLocation(latitude: 50.5184, longitude: 8.3902))
It should be changed to:
let firstTarget = ARItem(itemDescription: "wolf", location: CLLocation(latitude: 50.5184, longitude: 8.3902), itemNode: nil)
We know which enemies exist and where they are, but we also need to know which way the device is pointing.
Open ViewController.swift and import CoreLocation:
import UIKit
import SceneKit
import AVFoundation
import CoreLocation
Then add these property declarations:
//1
var locationManager = CLLocationManager()
var heading: Double = 0
var userLocation = CLLocation()
//2
let scene = SCNScene()
let cameraNode = SCNNode()
let targetNode = SCNNode(geometry: SCNBox(width: 1, height: 1, length: 1, chamferRadius: 0))
The code is explained as follows:
- We use a CLLocationManager to monitor the device's heading. The heading is measured in degrees from true north or magnetic north.
- Create an empty SCNScene, a node for the camera, and the targetNode, which uses an SCNBox geometry to display a cube.
Add the following at the end of viewDidLoad():
//1
self.locationManager.delegate = self
//2
self.locationManager.startUpdatingHeading()
//3
sceneView.scene = scene
cameraNode.camera = SCNCamera()
cameraNode.position = SCNVector3(x: 0, y: 0, z: 10)
scene.rootNode.addChildNode(cameraNode)
The code is explained as follows:
- Set the ViewController as the CLLocationManager's delegate.
- Call startUpdatingHeading() to start receiving heading updates. By default, the delegate method is called whenever the heading changes by more than one degree (see the optional tweak after this list).
- Set up the SCNView: assign the empty scene, give the camera node an SCNCamera, position it, and add it to the scene's root node.
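If one-degree updates turn out to be more frequent than you need, CLLocationManager lets you raise that threshold through its headingFilter property. This optional tweak is not required by the tutorial; it would go right after the call to startUpdatingHeading():

// Optional: only deliver heading updates when the heading changes by 5 degrees or more.
self.locationManager.headingFilter = 5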
Add an extension to implement the CLLocationManagerDelegate protocol:
extension ViewController: CLLocationManagerDelegate {
  func locationManager(_ manager: CLLocationManager, didUpdateHeading newHeading: CLHeading) {
    //1
    self.heading = fmod(newHeading.trueHeading, 360.0)
    repositionTarget()
  }
}
The CLLocationManager calls this delegate method whenever a new heading is available. fmod takes the heading modulo 360 to keep it in the range 0 to 359; for example, fmod(365.0, 360.0) returns 5.0.
Add a repositionTarget() method to ViewController.swift. Note that it belongs in the class implementation, not in the CLLocationManagerDelegate extension:
func repositionTarget() {
  //1
  let heading = getHeadingForDirectionFromCoordinate(from: userLocation, to: target.location)

  //2
  let delta = heading - self.heading

  if delta < -15.0 {
    leftIndicator.isHidden = false
    rightIndicator.isHidden = true
  } else if delta > 15 {
    leftIndicator.isHidden = true
    rightIndicator.isHidden = false
  } else {
    leftIndicator.isHidden = true
    rightIndicator.isHidden = true
  }

  //3
  let distance = userLocation.distance(from: target.location)

  //4
  if let node = target.itemNode {
    //5
    if node.parent == nil {
      node.position = SCNVector3(x: Float(delta), y: 0, z: Float(-distance))
      scene.rootNode.addChildNode(node)
    } else {
      //6
      node.removeAllActions()
      node.runAction(SCNAction.move(to: SCNVector3(x: Float(delta), y: 0, z: Float(-distance)), duration: 0.2))
    }
  }
}
The code is explained as follows:
- getHeadingForDirectionFromCoordinate(from:to:) calculates the heading from the current location to the target; it's implemented shortly.
- Calculate the difference (delta) between the device's current heading and the heading to the target. If delta is less than -15, show the left arrow; if it's greater than 15, show the right arrow; if it's between -15 and 15, hide both arrows, because the enemy is on screen.
- Calculate the distance from the device's location to the enemy.
- If the itemNode exists...
- ...and it has no parent yet, set the itemNode's position using delta and distance and add it to the scene.
- Otherwise, remove all of the node's actions and run a new action that moves it to the updated position.
If you know SceneKit or SpriteKit, the last line of code will look familiar. If not, here's a quick explanation.
SCNAction.move(to:duration:) creates an action that moves a node to a given position over a given duration, and runAction(_:) is the SCNNode method that executes an action. You can also combine actions into groups and sequences. For more information, read our book 3D Apple Games by Tutorials.
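For example, a sequence runs its actions one after another. The snippet below is only an illustration (you don't need to add it); you'll build a real sequence later in the tutorial:

// Hypothetical example: wait half a second, then slide the node 5 units
// along the x-axis over one second.
let slideRight = SCNAction.sequence([
  SCNAction.wait(duration: 0.5),
  SCNAction.move(by: SCNVector3(x: 5, y: 0, z: 0), duration: 1.0)
])
targetNode.runAction(slideRight)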
Next, implement the methods used above. Add these to ViewController.swift:
func radiansToDegrees(_ radians: Double) -> Double {
  return (radians) * (180.0 / M_PI)
}

func degreesToRadians(_ degrees: Double) -> Double {
  return (degrees) * (M_PI / 180.0)
}

func getHeadingForDirectionFromCoordinate(from: CLLocation, to: CLLocation) -> Double {
  //1
  let fLat = degreesToRadians(from.coordinate.latitude)
  let fLng = degreesToRadians(from.coordinate.longitude)
  let tLat = degreesToRadians(to.coordinate.latitude)
  let tLng = degreesToRadians(to.coordinate.longitude)

  //2
  let degree = radiansToDegrees(atan2(sin(tLng-fLng)*cos(tLat), cos(fLat)*sin(tLat)-sin(fLat)*cos(tLat)*cos(tLng-fLng)))

  //3
  if degree >= 0 {
    return degree
  } else {
    return degree + 360
  }
}
The radiansToDegrees(_:) and degreesToRadians(_:) methods convert between radians and degrees.
The getHeadingForDirectionFromCoordinate(from:to:) method works as follows:
- First, convert the latitudes and longitudes from degrees to radians.
- Then use those values to calculate the heading and convert the result back to degrees (the underlying formula is shown after this list).
- If the result is negative, add 360 degrees to normalize it. This works because, for example, -90 degrees is the same heading as 270 degrees.
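For reference, step 2 is the standard initial-bearing (forward azimuth) formula between two coordinates, with dLng = lng2 - lng1:

heading = atan2( sin(dLng) * cos(lat2),
                 cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLng) )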
It will take a few more steps to run your app.
First, you must pass the user's location to the viewController. Open MapViewController.swift, find the last if statement in mapView(_:didSelect:), and add this line before the view controller is presented:
viewController.userLocation = mapView.userLocation.location!
Then add this method to ViewController.swift:
func setupTarget() {
  targetNode.name = "enemy"
  self.target.itemNode = targetNode
}
This method sets a name for the targetNode and assigns it to the target.
Now call this method at the end of viewDidLoad(), right after the line that adds the camera node:
scene.rootNode.addChildNode(cameraNode)
setupTarget()
Run the app and you'll see the cube move as you turn:
<img src='https://koenig-media.raywenderlich.com/uploads/2016/12/MovingCube.png' width='200'/>
Beautify our app
Using a cube or a sphere is a simple way to start, since it saves the time spent on 3D modeling, but real 3D models look much nicer. In this section, we'll polish the app by adding 3D models for the enemies and giving the player the ability to throw fireballs.
Open the art.scnassets folder; it contains two .dae files with the enemy models: the wolf and the dragon.
Next, modify the setupTarget() method in ViewController.swift to load these 3D models and assign them to the target's itemNode property.
Modify the setupTarget() method to:
func setupTarget() {
  //1
  let scene = SCNScene(named: "art.scnassets/\(target.itemDescription).dae")
  //2
  let enemy = scene?.rootNode.childNode(withName: target.itemDescription, recursively: true)

  //3
  if target.itemDescription == "dragon" {
    enemy?.position = SCNVector3(x: 0, y: -15, z: 0)
  } else {
    enemy?.position = SCNVector3(x: 0, y: 0, z: 0)
  }

  //4
  let node = SCNNode()
  node.addChildNode(enemy!)
  node.name = "enemy"
  self.target.itemNode = node
}
The code is explained as follows:
- First load the model into a scene. The target's itemDescription matches the name of the .dae file.
- Then search the scene for a node with the same name as itemDescription. There is only one such node, the root node of the model.
- Adjust the model's position so that both models appear in the same place. If both models came from the same designer this might not be necessary, but these two come from different sources: the wolf from 3dwarehouse.sketchup.com and the dragon from https://clara.io.
- Add the model to an empty node and assign that node to the itemNode property of the current target. One small detail, touch handling, is covered later.
Run the app and you'll see a three-dimensional wolf, which is much scarier than a plain cube!
In fact, this wolf might be scary enough to make you run, but as a brave hero, running away is not an option! Next, you'll add fireballs so you can defeat the wolf before you become its snack.
The best moment to throw a fireball is when the user's touch ends, so override touchesEnded(_:with:) in ViewController.swift:
override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
  //1
  let touch = touches.first!
  let location = touch.location(in: sceneView)

  //2
  let hitResult = sceneView.hitTest(location, options: nil)
  //3
  let fireBall = SCNParticleSystem(named: "Fireball.scnp", inDirectory: nil)
  //4
  let emitterNode = SCNNode()
  emitterNode.position = SCNVector3(x: 0, y: -5, z: 10)
  emitterNode.addParticleSystem(fireBall!)
  scene.rootNode.addChildNode(emitterNode)

  //5
  if hitResult.first != nil {
    //6
    target.itemNode?.runAction(SCNAction.sequence([SCNAction.wait(duration: 0.5), SCNAction.removeFromParentNode(), SCNAction.hide()]))
    let moveAction = SCNAction.move(to: target.itemNode!.position, duration: 0.5)
    emitterNode.runAction(moveAction)
  } else {
    //7
    emitterNode.runAction(SCNAction.move(to: SCNVector3(x: 0, y: 0, z: -30), duration: 0.5))
  }
}
The code is explained as follows:
- Get the first touch and convert its location to scene view coordinates.
- The hitTest(_:options:) method casts a ray from the tapped location and returns an array of SCNHitTestResult describing every node the ray passes through.
- Load a particle system from a SceneKit particle file to act as the fireball.
- Attach the particle system to an empty node and place the node just off screen, so the fireball looks like it was fired from the player's position.
- Check whether the ray hit a node...
- If so, wait 0.5 seconds and remove the enemy's itemNode; at the same time, move the particle emitter node to the enemy's position.
- If nothing was hit, the fireball simply flies to a fixed point in the distance.
Run the app and let the hungry wolf burn in the flames!
<img src='https://koenig-media.raywenderlich.com/uploads/2016/12/Wolf_Model.png' width='200'/>
Finishing Touches
To complete the app, we also need to remove the enemy from the list, close the AR view, and return to the map to find the next enemy.
Removing enemies should happen in MapViewController, because that's where the enemy list lives. We'll define a delegate protocol with a single method that is called when a target is hit.
In ViewController.swift, add the following protocol above the class declaration:
protocol ARControllerDelegate {
  func viewController(controller: ViewController, tappedTarget: ARItem)
}
Also declare a property for ViewController:
var delegate: ARControllerDelegate?
The delegate method tells the delegate that a hit occurred and which target was hit; the delegate can then decide what to do next.
Find the touchesEnded(_:with:) method in ViewController.swift and change the body of the if statement to:
if hitResult.first != nil {
  target.itemNode?.runAction(SCNAction.sequence([SCNAction.wait(duration: 0.5), SCNAction.removeFromParentNode(), SCNAction.hide()]))
  //1
  let sequence = SCNAction.sequence(
    [SCNAction.move(to: target.itemNode!.position, duration: 0.5),
     //2
     SCNAction.wait(duration: 3.5),
     //3
     SCNAction.run({ _ in
       self.delegate?.viewController(controller: self, tappedTarget: self.target)
     })])
  emitterNode.runAction(sequence)
} else {
  ...
}
Here's what changed:
- Replace the emitter node's single move action with an action sequence, keeping the move action as the first step.
- After the move action, pause for 3.5 seconds.
- Notify the delegate that the target is hit.
Open MapViewController.swift to declare a property that holds the selected pin:
var selectedAnnotation: MKAnnotation?
You'll use this property later to remove the pin from the map. Now modify the code where the viewController is instantiated and conditionally bound (the if let block) so it looks like this:
if let viewController = storyboard.instantiateViewController(withIdentifier: "ARViewController") as? ViewController {
  //1
  viewController.delegate = self

  if let mapAnnotation = view.annotation as? MapAnnotation {
    viewController.target = mapAnnotation.item
    viewController.userLocation = mapView.userLocation.location!

    //2
    selectedAnnotation = view.annotation
    self.present(viewController, animated: true, completion: nil)
  }
}
It's simple:
- Set the MapViewController as the viewController's delegate.
- Save the tapped annotation so it can be removed from the map later.
Add under the MKMapViewDelegate extension:
extension MapViewController: ARControllerDelegate {
  func viewController(controller: ViewController, tappedTarget: ARItem) {
    //1
    self.dismiss(animated: true, completion: nil)
    //2
    let index = self.targets.index(where: {$0.itemDescription == tappedTarget.itemDescription})
    self.targets.remove(at: index!)

    if selectedAnnotation != nil {
      //3
      mapView.removeAnnotation(selectedAnnotation!)
    }
  }
}
The code is explained as follows:
- Dismiss the AR view.
- Remove the target from the targets array.
- Remove the pins from the map.
Run the app and you will see the end result:
<img src='https://koenig-media.raywenderlich.com/uploads/2016/12/Finished_app.png' width='200'/>
Conclusion
You can download the finished project here.
If you want to learn more about the techniques used in this app, check out the following tutorials:
- For MapKit and location services, see our Getting Started with MapKit Swift.
- For video capture, please refer to our AVFoundation Series.
- For SceneKit, please refer to our SceneKit Series Tutorials.
- To avoid hard-coding enemy locations, you'll need backend support. See How to Write a Simple PHP/MySQL Service as well as How to Program Server-Side with Vapor.
I hope you enjoyed this tutorial. If you have any questions or suggestions, please leave a comment below.