Wednesday

Visual Tracking Continued... Look, No Tape!

My view of the JunkBot from my PC while I direct its movements with my mouse



Physical:

Parts/Stuff:

iPod with EpocCam (or an actual webcam) sends data via WiFi to
Computer running Processing sketch sends data via USB/Serial connection to
Arduino Nano is attached to
315MHz RF Transmitter sends data to
JunkBot (detailed in previous posts)

This sketch should work anywhere there is consistent lighting and no movement other than that of the object being tracked (i.e., indoors). Because the current method compares a single reference screenshot against the live video input, it will work poorly or not at all if other objects are moving or the lighting changes significantly.
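To make that concrete, the core of the single-screenshot comparison in the Processing OpenCV library (detailed under Logical below) boils down to something like this sketch; the threshold value and the 'r' key binding here are assumptions, not the actual code:

import hypermedia.video.*;   // OpenCV library for Processing (ubaa.net)

OpenCV opencv;

void setup() {
  size(320, 240);
  opencv = new OpenCV(this);
  opencv.capture(width, height);     // live video input
}

void draw() {
  opencv.read();                     // grab the current frame
  opencv.absDiff();                  // difference against the stored screenshot
  opencv.threshold(30);              // assumed value; lighting changes break this
  image(opencv.image(), 0, 0);       // anything bright here is "the bot"
}

void keyPressed() {
  // Assumed binding: store the empty scene (bot out of view) as the reference
  if (key == 'r') opencv.remember();
}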



Logical:

The sketch has been updated since the video above was taken. It now uses a single display window, and the views are toggled using the 1, 2, and 3 keys.

Processing Sketch: here

The Processing sketch uses the OpenCV library (http://ubaa.net/shared/processing/opencv/) to track the position and movements of the JunkBot. The iPod is using webcam-style software to transmit the video over WiFi, but any webcam supported by the OpenCV library will work.
The sketch simply compares a screenshot taken without the bot in view against the current video capture and determines the position of the bot. From there, the information is sent to the Arduino-based RF transmitter listed below (2 bytes: the speeds for motors 1 and 2).
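A sketch along these lines might find the bot's centroid in the difference image and send the two speed bytes over serial, roughly as below. The blob-size limits, serial port index, baud rate, and placeholder speeds are all assumptions, not the actual code:

import hypermedia.video.*;
import processing.serial.*;
import java.awt.Point;

OpenCV opencv;
Serial port;

void setup() {
  size(320, 240);
  opencv = new OpenCV(this);
  opencv.capture(width, height);
  port = new Serial(this, Serial.list()[0], 9600);  // assumed port and baud
}

void draw() {
  opencv.read();
  opencv.absDiff();                   // difference against the remembered screenshot
  opencv.threshold(30);               // assumed threshold
  // Treat the largest blob in the difference image as the JunkBot
  Blob[] blobs = opencv.blobs(100, width * height / 2, 1, false);
  if (blobs.length > 0) {
    Point pos = blobs[0].centroid;
    // ...derive motor speeds from pos relative to the target point...
    int speed1 = 150, speed2 = 150;   // placeholders for the real steering math
    port.write(speed1);               // byte 1: motor 1 speed
    port.write(speed2);               // byte 2: motor 2 speed
  }
}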
Movement is currently a bit 'swervy' compared to the physical line-following version from a previous post, due to the delays involved in processing the video data and relaying commands. I am fairly sure this can be improved with better code for determining the required movements.
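For example (this is an assumption about one way to smooth things out, not what the sketch currently does), simple proportional steering would scale the left/right speed difference by the heading error:

// Hypothetical proportional steering. headingErr is the angle (radians)
// between the bot's current heading and the direction to the target.
int[] steer(float headingErr, int baseSpeed) {
  float k = 40.0;                                  // assumed gain; tune empirically
  int turn = int(constrain(k * headingErr, -50, 50));
  int m1 = constrain(baseSpeed + turn, 0, 255);    // motor 1 speed byte
  int m2 = constrain(baseSpeed - turn, 0, 255);    // motor 2 speed byte
  return new int[] { m1, m2 };
}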

Controls:

'4' key: Grab a screenshot and save it to file as specified.
'3' key: Load the image file saved using '4' into memory and display the difference image.
'2' key: Display the screenshot file only.
'1' key: Display the main view, including lines, tracking box, etc.

'S' key: Begin. On the first start only, signals are sent at increasing motor speed until movement is detected; that speed is used as the base speed (see the ranging sketch after this list). Once ranging is complete, commands are sent to move to the indicated position at this speed.
'A' key: Stop
'+' key: Faster
'-' key: Slower

Mouse: Click and drag adjusts the threshold (see the OpenCV reference); a click without dragging sets a new target position.
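The ranging step on first start might look roughly like this; the step size, delay, and movement test are assumptions, and opencv and port are the objects from the sketches above:

// Hypothetical ranging: ramp both motors together until the tracked
// centroid moves a few pixels, then keep that speed as the base speed.
int findBaseSpeed(Point start) {
  for (int s = 0; s <= 255; s += 5) {   // assumed 5-unit steps
    port.write(s);                      // motor 1 speed
    port.write(s);                      // motor 2 speed
    delay(250);                         // give the bot time to respond
    opencv.read();
    opencv.absDiff();
    opencv.threshold(30);
    Blob[] blobs = opencv.blobs(100, width * height / 2, 1, false);
    if (blobs.length > 0 && blobs[0].centroid.distance(start) > 5) {
      return s;                         // first speed that produced movement
    }
  }
  return 255;                           // never moved; fall back to full speed
}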

Transmitter: here
The transmitter uses the VirtualWire library and a 315MHz transmitter to relay commands to the JunkBot. Commands are formatted to work with the standard JunkBot code listed below. The code simply waits for 2 bytes of serial data from the Processing sketch and relays that information to the JunkBot. If no new command is received within 1/4 second, the last command is sent again to help ensure reception.
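A stripped-down version of that transmitter sketch could look like the following; the TX pin, bit rate, and baud rate are assumptions:

#include <VirtualWire.h>

uint8_t cmd[2] = { 0, 0 };              // motor 1 and motor 2 speeds
unsigned long lastSend = 0;

void setup() {
  Serial.begin(9600);                   // assumed baud; must match the sketch
  vw_set_tx_pin(12);                    // assumed data pin for the 315MHz TX
  vw_setup(2000);                       // bits per second
}

void loop() {
  // Fresh command from Processing: two raw bytes, one per motor
  if (Serial.available() >= 2) {
    cmd[0] = Serial.read();
    cmd[1] = Serial.read();
    sendCommand();
  }
  // Repeat the last command every 1/4 second to help ensure reception
  if (millis() - lastSend > 250) {
    sendCommand();
  }
}

void sendCommand() {
  vw_send(cmd, 2);
  vw_wait_tx();                         // block until the message is sent
  lastSend = millis();
}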


Current JunkBot_SD: here

JunkBot_SD code is as described in the previous post. I might post a simple version just for visual tracking, since this one is pretty rough and bloated.

Monday

Visual Tracking with Arduino, Processing, and an iPod


A first attempt at tracking movement:



Starting and stopping of the JunkBot while it moves in a circle.

Using the OpenCV (Computer Vision) Processing library to receive video wirelessly from an iPod camera, then process it and send controlling data to an Arduino. This Arduino relays the information via a 315MHz RF link to the JunkBot. In this demo, the JunkBot is told to hang around the center point of the circle.
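The 'hang around' behavior amounts to a distance check against the center point, roughly like this (the radius and speeds are assumptions, and port is the serial connection from the sketches above):

// Hypothetical "hang around" logic: stop inside the circle, drive when
// the bot wanders outside it (real code would also steer toward center).
void hangAround(Point bot, Point center) {
  float radius = 40;          // assumed radius in pixels
  if (bot.distance(center) < radius) {
    port.write(0);            // inside: stop both motors
    port.write(0);
  } else {
    port.write(150);          // outside: drive at a placeholder speed
    port.write(150);
  }
}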

As usual, the JunkBot does not analyze the incoming commands. Like any good robot, it just does what it is told... More to come.


