BuildAShip: Semantic Segmentation
Demonstrates classifying pixels from the camera feed into categories, also known as semantic segmentation. Three categories are demonstrated: natural ground, foliage, and sky.
Overview
After some instructions, the app turns on the device’s camera and the player is prompted to scan the ground to find an AR ground plane, and to choose a position to place Captain Doty. Doty offers more instructions, then disappears, and the semantic segmentation feature activates, attempting to classify one of three environmental categories, starting with natural ground (e.g., grass). The goal is to help Doty collect enough resources to build an airship.
A category is visualized by filling that region of the screen with icons representing it. A button appears near the bottom of the screen; holding it down activates a vacuum that collects “particles” of that category into a funnel. When enough particles have been collected, the scene advances to the second and then third categories (foliage, then sky).
(If the player cannot find a category in their environment, a debug menu option allows them to Skip Current Step.)
ARDK Features in use
ARSceneManager
ARSemanticSegmentationManager
- used in this scene for classifying:
  - natural_ground
  - foliage
  - sky
ARPlaneManager
- for finding an AR ground plane
ARDepthManager
- for occlusion via depth
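The collection flow steps through the three semantic channels listed above in order. A minimal sketch of that progression logic is below; the class name and method here are hypothetical helpers for illustration, not part of the sample's actual source.

```csharp
using UnityEngine;

// Sketch: tracking which semantic channel the player is currently collecting.
// The channel identifiers match the ones listed above; everything else in
// this class is a hypothetical illustration of the scene's progression.
public class ResourceChannelSwitcher : MonoBehaviour
{
    // The scene's three targets, in collection order.
    private readonly string[] _channels = { "natural_ground", "foliage", "sky" };
    private int _currentIndex;

    public string CurrentChannel => _channels[_currentIndex];

    // Called when enough particles of the current resource have been vacuumed.
    // Returns false once all three resources have been collected.
    public bool AdvanceChannel()
    {
        if (_currentIndex >= _channels.Length - 1)
            return false;

        _currentIndex++;
        return true;
    }
}
```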
Additional Helpers
BuildAShipResourceRenderer
- visualizes a category by filling that region of the screen with icons representing it
BuildAShipDebugManager
- manages the debug functionality for the scene
States
StateInstructions
A UI displaying instructions.
StateWarning
A UI displaying a warning about using AR; this is only displayed once per execution of the app, before the user enters an ARSession for the first time.
StateScanning
The device’s camera is turned on, and the player is prompted to scan the ground for a place to put Captain Doty. This state uses the ARPlaneHelper to find a safe surface on which to place Doty.
StateYetiRequest
Additional instructions from Captain Doty.
StateCollect
The player is prompted to find an environmental resource – grass, trees, or sky – and collect particles of it into a funnel via the vacuuming button. Each resource corresponds to an ARDK semantic segmentation channel. This state uses the BuildAShipResourceRenderer to visualize the resource icons, spatially aligning them with the texture for the semantic segmentation channel. This state is entered a total of three times, once per resource.
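One way to spatially align icons with a segmentation result is to sample the channel's texture at each candidate screen position. The sketch below assumes the segmentation has already been copied into a readable Texture2D whose red channel marks the active category; this is a simplification for illustration, not the BuildAShipResourceRenderer's actual implementation.

```csharp
using UnityEngine;

// Sketch: deciding whether a resource icon belongs at a given screen position
// by sampling a semantic channel texture. Assumes a readable Texture2D whose
// red channel is high where the active category was detected (a simplifying
// assumption; the real renderer works with ARDK's semantic buffer directly).
public static class ResourceIconPlacement
{
    public static bool ShouldDrawIcon(Texture2D semanticTexture, Vector2 screenPos)
    {
        // Normalize the screen position to UV space so the texture sample
        // lines up with the camera image.
        float u = screenPos.x / Screen.width;
        float v = screenPos.y / Screen.height;

        // Treat the pixel as belonging to the category when its value is high.
        return semanticTexture.GetPixelBilinear(u, v).r > 0.5f;
    }
}
```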
StateCollectDone
A UI displaying success at collecting that resource.
StateGameOver
A UI displaying a success message, with options to Restart or Return to Map.
Editor-Only Mock Support
When developing ARDK applications in the Unity Editor, it is convenient to employ mock versions of AR assets so that iteration does not always require building to a device. Each scene’s mock objects are attached to the scene’s MockScene GameObject. This object has a MockSceneConfiguration component that destroys the object when running outside the Unity Editor.
This scene’s MockScene object includes:
A mock AR plane
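The editor-only pattern described above can be sketched with standard Unity conditional compilation. This is an illustration of the idea, not the source of the actual MockSceneConfiguration component that ships with the sample.

```csharp
using UnityEngine;

// Sketch: a component that removes its GameObject in on-device builds, so
// mock AR assets (like the mock AR plane above) exist only in the Editor.
public class EditorOnlyObject : MonoBehaviour
{
    private void Awake()
    {
#if !UNITY_EDITOR
        // Outside the Unity Editor, mock AR assets would conflict with real
        // device AR data, so destroy the whole object immediately.
        Destroy(gameObject);
#endif
    }
}
```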