Motion controls entered the public consciousness a little over a decade ago, as touchpads and touchscreens became more popular. The main limitation of gesture control, at least as far as [Norbert] is concerned, is that it can only manipulate objects in a virtual space. He wanted to use gestures to control a real-world object instead, and built this device that uses gestures to manipulate an actual, physical picture.
In this unique augmented-reality device, not only is the controlled object in the real world, but the gestures are tracked there too, thanks to a computer vision system running OpenCV that watches his hand. The position data is fed into an algorithm that drives a physical picture mounted on a slender robotic arm. Now, when [Norbert] "pinches to zoom," the servo attached to the photo physically moves it closer to or farther from his field of view. He can also use other gestures to move the picture around.
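The original code isn't shown here, but the basic idea is easy to sketch. The snippet below is a minimal, hypothetical example of how a pinch gesture could be turned into a servo command: it assumes MediaPipe for hand-landmark tracking, a webcam as input, and an Arduino-style controller listening for angle values on a serial port, none of which are confirmed details of [Norbert]'s build.

```python
# Hypothetical sketch: map a pinch gesture to a servo position.
# Assumes MediaPipe hand tracking and a microcontroller reading
# angle values over serial -- both are illustrative choices only.
import math
import cv2
import serial
import mediapipe as mp

ser = serial.Serial("/dev/ttyUSB0", 115200)  # assumed serial link to the servo controller
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        # Distance between thumb tip (landmark 4) and index fingertip (landmark 8),
        # in normalized image coordinates.
        pinch = math.dist((lm[4].x, lm[4].y), (lm[8].x, lm[8].y))
        # Map the pinch distance onto a 0-180 degree servo angle (scale is arbitrary).
        angle = int(max(0, min(180, pinch * 600)))
        ser.write(f"{angle}\n".encode())
    cv2.imshow("pinch to zoom", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

In practice you'd likely smooth the pinch distance over a few frames before commanding the servo, since raw landmark positions jitter enough to make the picture shake.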
While this gesture-controlled machine is certainly a proof-of-concept, there are plenty of other uses for gesture control of real-world objects. Any robotics platform could benefit from an interface like this, or even something more mundane like an office PowerPoint presentation. Opportunities abound, but if you need an introduction to OpenCV, check out this build that tracks a hand down to the tiniest detail.
This post, "OpenCV Brings Pinch to Zoom Into the Real World," was originally published at https://hackaday.com/2022/03/24/opencv-brings-pinch-to-zoom-into-the-real-world/