After the hackathon we’re scrambling to document the proof of concept. Here is a quick overview of the bits and pieces and how they fit together.
Two videos showing the progress of the AprilTags detector for the bots – hello localization in 50 lines or less!
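The localization step behind those videos boils down to composing rigid transforms: the detector reports each tag’s pose in the camera frame, and since the tag’s pose in the world map is known, the bot’s world pose falls out by inverting and chaining the transforms. Here is a minimal sketch of that math in plain numpy — the function names and frame conventions are illustrative, and the detector call itself (e.g. the `pupil_apriltags` bindings) is assumed to have already run:

```python
# Sketch of tag-based localization, assuming an AprilTags detector has
# already returned the tag's pose in the camera frame as a 3x3 rotation
# matrix R and a 3-vector translation t. Names and frames here are
# illustrative, not the project's actual API.
import numpy as np

def to_homogeneous(R, t):
    """Pack a 3x3 rotation and a translation into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def localize_bot(T_world_tag, R_cam_tag, t_cam_tag):
    """World pose of the camera (i.e. the bot) from one tag detection.

    T_world_tag -- known 4x4 pose of the tag in the world map.
    R_cam_tag, t_cam_tag -- the tag's pose in the camera frame,
    as reported by the detector.

    T_world_cam = T_world_tag @ inv(T_cam_tag): go from camera to tag,
    then from tag to world.
    """
    T_cam_tag = to_homogeneous(R_cam_tag, t_cam_tag)
    return T_world_tag @ np.linalg.inv(T_cam_tag)

# Example: a tag pinned 2 m along the world x-axis, seen 1 m straight
# ahead of the camera, puts the bot at world (2, 0, -1) facing the tag.
T_world_tag = to_homogeneous(np.eye(3), np.array([2.0, 0.0, 0.0]))
T_world_cam = localize_bot(T_world_tag, np.eye(3), np.array([0.0, 0.0, 1.0]))
print(T_world_cam[:3, 3])
```

With one known tag this pins down the full 6-DOF pose in a handful of lines, which is roughly where the “50 lines or less” claim comes from; handling multiple tags or noisy detections is where the remaining work lives.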
It is not the cessation of dreams I fear, but the ceasing of desire.
Dreams are plentiful, rampant and unbidden in every waking moment.
It is in the moment when I cease to desire them,
Disregarding my imprudent pursuit,
Abandoning my unrelenting yearning,
that I, too, shall cease to exist.
P.S. Yes, desires/dreams may kill you, but apathy/complacency will just shatter your back…
While working on ARNerve last week a few friends stopped by to have a look, so we recorded some of their feedback as they tried the slightly duct-taped, straight-out-of-the-oven framework. Thanks guys for giving it a kick! We thought we would put the current progress into a quick demo movie that demonstrates the NUI menu system (kudos to Daniel for building it out!), the performance of the user tracking, and the user testing:
…More to follow once we sort out bot localization. When that is working, we’ll try to whittle this project down into a set of tutorials.
ARNerve is a framework for interacting with environments through a decentralized natural user interface, and at the moment we’re focusing on using it for robotics. Think Ender’s Game (the movie) and the way the characters interact with their environment – we’re trying to achieve something similar. It’s far from complete, but as the hackathon comes to a close, we’re starting to find a few minutes to release parts of the project on GitHub and SemiSorted.