Lightship ARDK Beta
The Lightship AR Developer Kit brings real-time mapping technology together with shared multiplayer experiences, semantic segmentation, and depth estimation for more realistic AR experiences. ARDK is made for developing AR experiences for both Android and iOS mobile platforms, and it integrates directly with Unity.
Meshing
Create persistent and realistic experiences. ARDK’s meshing feature takes the color images from the user’s camera, runs them through a neural network, and builds a mesh of tessellated triangles that creates a machine-readable representation of the physical world. With this feature, your app has a persistent, real-time map for realistic AR interactions. For example, you can enable “physics,” where a user can throw a ball and have it bounce off a wall and roll off of a table.
Meshing works best at a 5–10 meter range and is therefore not meant for large-scale AR experiences; improvements to meshing range, latency, resolution, and stability will come in future releases. Recommended for iPhone X+, Pixel 3+, and Samsung S9+ devices only. More devices coming soon.
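The camera-to-mesh pipeline described above can be sketched in miniature. The snippet below is a generic illustration, not ARDK’s implementation: it assumes a per-pixel depth grid (such as one produced by a depth-estimating neural network) and hypothetical pinhole camera intrinsics `fx`, `fy`, `cx`, `cy`, then connects neighboring depth samples into two tessellated triangles per grid cell.

```python
import numpy as np

def depth_grid_to_mesh(depth, fx=500.0, fy=500.0, cx=None, cy=None):
    """Back-project a depth image (meters) into 3D vertices and
    connect neighboring pixels into two triangles per grid cell."""
    h, w = depth.shape
    cx = (w - 1) / 2 if cx is None else cx
    cy = (h - 1) / 2 if cy is None else cy
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Pinhole back-projection: pixel (u, v) at depth z -> camera-space point
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    vertices = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    # Index of pixel (row, col) in the flattened vertex array
    idx = np.arange(h * w).reshape(h, w)
    tl, tr = idx[:-1, :-1], idx[:-1, 1:]
    bl, br = idx[1:, :-1], idx[1:, 1:]
    # Split each grid cell into an upper-left and lower-right triangle
    upper = np.stack([tl, tr, bl], axis=-1).reshape(-1, 3)
    lower = np.stack([tr, br, bl], axis=-1).reshape(-1, 3)
    triangles = np.concatenate([upper, lower])
    return vertices, triangles
```

Feeding vertices and triangles like these into a physics engine as a collider is what makes interactions such as a bouncing ball possible.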
Multiplayer
Unlock a world of powerful, social use cases for your app. From gaming to treasure hunts, multiplayer lets AR experiences shift from single-user play to multi-user, social sessions. Peer-to-peer messaging is built in, along with a provided server and back end and other handy server-side features (e.g., a synchronized clock and a session-persistent, last-write-wins database).
Although eight players is the maximum, optimal multiplayer experiences are shared between 2–3 users over a 1–5 minute session due to processing requirements.
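To make “last-write-wins” concrete, here is a minimal, generic sketch of such a store (not ARDK’s actual API): every write carries a timestamp, and conflicting writes to the same key resolve in favor of whichever arrived with the latest timestamp.

```python
class LastWriteWinsStore:
    """Minimal last-write-wins key/value store: each write carries a
    timestamp, and a key keeps whichever value was written latest."""

    def __init__(self):
        self._data = {}  # key -> (timestamp, value)

    def write(self, key, value, timestamp):
        current = self._data.get(key)
        # Accept the write only if it is newer than what we already hold
        if current is None or timestamp > current[0]:
            self._data[key] = (timestamp, value)

    def read(self, key):
        entry = self._data.get(key)
        return entry[1] if entry is not None else None
```

Because every peer applies the same rule, writes from different players converge deterministically once timestamps (e.g., from a synchronized session clock) are compared.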
Semantic Segmentation
Create AR experiences without having to be a computer vision or distributed systems expert. Lightship classifies outdoor natural features that a user scans (such as the ground and sky), enabling AR content to interact on or with specific surfaces. For example, using ‘sky’ segmentation you can create an experience where large AR objects are displayed on the horizon, properly occluded by buildings, trees, and other ground objects.
Current segmentation includes ground and sky (more coming soon). Recommended for iPhone X+, Pixel 3+, and Samsung S9+ devices only. More devices coming soon.
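The horizon example above boils down to masked compositing. The sketch below is a generic illustration, not ARDK’s API: it assumes the segmentation step yields a per-pixel boolean `sky_mask`, and draws AR content only on sky pixels, so buildings and trees in front of the sky occlude it automatically.

```python
import numpy as np

def composite_on_sky(camera_rgb, ar_rgb, sky_mask):
    """Overlay AR content only on pixels the segmentation labels 'sky'.

    camera_rgb, ar_rgb: (H, W, 3) uint8 arrays; sky_mask: (H, W) booleans."""
    out = camera_rgb.copy()
    out[sky_mask] = ar_rgb[sky_mask]  # non-sky pixels keep the camera image
    return out
```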
Discover more features from ARDK
Cross Platform APIs
Build once and run across multiple platforms. ARDK APIs work across iOS and Android, providing core AR capabilities such as plane detection (horizontal and vertical), scene light estimation, hit testing (against detected and undetected objects), and image detection to bring a consistent experience to your app across devices.
ARCore and ARKit capabilities vary by device; device lists are available on the developer portal.
Occlusion
Create and maintain the illusion of reality with occlusion. ARDK’s unique occlusion technology applies a layer of machine learning on top of depth information, allowing virtual objects to “go behind” real-world objects that are closer to the camera, adding dimension to your app experience.
Recommended for iPhone X+, Pixel 3+, and Samsung S9+ devices only. More devices coming soon.
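At its core, this kind of occlusion is a per-pixel depth test. The following generic sketch (not ARDK’s implementation) assumes a real-world depth map from depth estimation and a rendered virtual layer with its own depth buffer; a virtual pixel is drawn only where it is closer to the camera than the real surface at that pixel.

```python
import numpy as np

def occlude(camera_rgb, real_depth, virtual_rgb, virtual_depth):
    """Per-pixel depth test: draw the virtual object only where it is
    closer to the camera than the real surface at that pixel."""
    visible = virtual_depth < real_depth  # (H, W) boolean mask
    out = camera_rgb.copy()
    out[visible] = virtual_rgb[visible]
    return out
```

Where the real surface is nearer, the camera image wins, which is exactly the “go behind” effect described above.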
Depth
Integrate virtual objects realistically into the world. ARDK’s depth estimation API determines depth in a user’s surroundings using the device’s camera within milliseconds. Depth-based features such as occlusion and meshing activate instantly and consistently across platforms, making it possible for moving objects like cars and people to occlude virtual objects.
Recommended for iPhone X+, Pixel 3+, and Samsung S9+ devices only. More devices coming soon. As ARDK depth-based features are in active development, improvements to range, latency, resolution, and stability are expected in future releases.