Today Leap Motion announced the public developer beta of its V2 tracking. Great news for HMDs like the Oculus Rift! Check out the samples; they open up far more than a handful of new possibilities.
From the Developer Newsletter:
Message from Leap Motion CEO Michael Buckwald
Hi everyone,
Today I’m pleased to (officially) announce that the next generation of Leap Motion tracking (Version 2) is in public developer beta. This is the first major step towards a full V2 consumer release, which will be available as a free software update to all controllers later this year.
It’s been a whirlwind year, and we’ve learned a lot in that time, both thanks to community feedback and our own testing and research. We’ve seen a wide range of experiments from developers pushing the boundaries of the platform. Art installations and musical instruments. Flying drones and manipulating massive data sets. Learning aids for children and sign language translation.
But this is only scratching the surface, and these are still early days. Our vision for the platform has always been to let people do more with technology, and today the Leap Motion Controller is designed to augment your existing computing experience – opening up a wide range of 3D possibilities alongside the mouse and keyboard. We’re just beginning our journey in realizing the full potential of motion-controlled interaction.
V2 tracking is a critical step forward as the first of several major updates we’ll be pushing prior to the V2 consumer release. Ultimately, we want to make it much easier for developers to build transformative applications across a range of platforms – starting with computers and moving towards virtual and augmented reality, automotive, smart homes, healthcare, and beyond.
The features we’ve built into the beta reflect your feedback over the past year – what we’ve heard in person at events, on social media, and in forum discussions – with the aim of answering some of the frustrations that many people felt as early customers:
- Persistent finger and hand labels – every finger, hand, and joint now has anatomical labels like ‘pinky’, ‘left hand’, and ‘proximal phalanges’
- Improved tracking of two-handed interactions
- Tracking confidence is exposed (view in the visualizer by pressing k)
- Massively improved resistance to ambient infrared light – sunlight, powerful halogens, etc.
- New Bone API with joint tracking
- New gestures including pinch and grab
- A new examples gallery including new demos and on-screen hands for Unity and LeapJS
- Top-down tracking mode – now available in the Control Panel for better head-mounted and inverted tracking
- Updated API docs and Developer Portal
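As a rough sketch of how the new anatomical labels and pinch/grab values listed above might be consumed from LeapJS: the field names (`hand.type`, `hand.pinchStrength`, `hand.grabStrength`, `finger.type`) follow the public LeapJS API, but the frame data below is a hand-written mock rather than real controller output, and the 0.8 thresholds are illustrative assumptions.

```javascript
// Map LeapJS V2 finger type indices (0-4) to the anatomical labels
// described above. The hand object below is a mock, not device output.
const FINGER_NAMES = ['thumb', 'index', 'middle', 'ring', 'pinky'];

function describeHand(hand) {
  // hand.type is 'left' or 'right'; pinchStrength/grabStrength run 0..1.
  const fingers = hand.fingers.map(f => FINGER_NAMES[f.type]);
  return {
    label: hand.type + ' hand',
    fingers: fingers,
    pinching: hand.pinchStrength > 0.8, // illustrative threshold
    fist: hand.grabStrength > 0.8,      // illustrative threshold
  };
}

// Mock hand standing in for one entry of frame.hands in Leap.loop():
const mockHand = {
  type: 'left',
  pinchStrength: 0.95,
  grabStrength: 0.1,
  fingers: [{type: 0}, {type: 1}, {type: 2}, {type: 3}, {type: 4}],
};
console.log(describeHand(mockHand));
```

In a real app the same function would be called from inside `Leap.loop(frame => frame.hands.forEach(describeHand))`, with the controller supplying the hand objects.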
V2 tracking is just the first step, and there’s still a lot of work to be done. This is beta software: we have many more features to add before the consumer launch, and further tracking and UX enhancements to come. We greatly appreciate your thoughts as we continue to build over the coming months, as your feedback directly influences our roadmap moving forward.
Thank you,
Michael Buckwald
- Robust to Occlusion – fingers are tracked even when not seen by the controller (e.g. intertwined or blocked)
- Tracking Finger Joints – more granular data about the user’s hands and fingers, including individual finger joint tracking
- Finger & Hand Labels – every finger, hand, and joint now has anatomical labels like pinky, left hand, and proximal phalanges
- Ambient Light Resistance – massively improved resistance to ambient infrared light like sunlight and powerful halogens
- Pinch/Grab Capability – new APIs indicate whether one finger is touching another and how close a hand is to being a fist
- Example Gallery – brand-new demos like “Ragdoll Thrower” and onscreen hands for Unity and LeapJS
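To illustrate the joint tracking mentioned above: in LeapJS each V2 finger carries four bones (metacarpal, proximal, intermediate, distal), each with `prevJoint`/`nextJoint` positions in millimeters. This is a hedged sketch; the bone field names match the LeapJS API, but the finger data below is mocked (a straight finger along the y axis), not real tracking output.

```javascript
// V2 Bone API sketch: walk one finger's bone chain and collect its joints.
// All position data here is mocked, not from a controller.
const BONE_NAMES = ['metacarpal', 'proximal', 'intermediate', 'distal'];

function jointPositions(finger) {
  // Each bone contributes its prevJoint; the last bone's nextJoint
  // is the fingertip, giving five joints per finger.
  const joints = finger.bones.map(b => b.prevJoint);
  joints.push(finger.bones[finger.bones.length - 1].nextJoint);
  return joints;
}

// Mock index finger: four 20 mm bones stacked along the y axis.
const mockFinger = {
  type: 1, // index
  bones: [0, 1, 2, 3].map(i => ({
    type: i,
    name: BONE_NAMES[i],
    prevJoint: [0, 100 + i * 20, 0],
    nextJoint: [0, 100 + (i + 1) * 20, 0],
  })),
};

const joints = jointPositions(mockFinger);
console.log(joints.length, joints[joints.length - 1]); // 5 joints; fingertip at y = 180
```

With a live controller, the same walk over `finger.bones` is what drives the on-screen skeletal hands shipped in the new examples gallery.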
Stay informed: https://www.leapmotion.com/blog/