Okay, I think I have the answer to the zero-calibration problem with drawbots.
I made an eyelet out of a piece of music wire. The drawbot string will be run through the eyelet. There will be a bead threaded on the string that will push the music wire.
This will trigger a micro-switch and, presto, we have a consistent reference point.
I’ll just use one micro-switch. There will be a thread running from each piece of music wire down to the micro-switch.
Just received this!
Since the Arduino workshop here in Ottawa, I’ve had Arduino on the brain, and I’ve ordered lots of goodies to experiment with.
But Canada Post was kind enough to keep all my stuff in storage, so I’m only getting it now.
It has 8 pins, in two rows of four:

GND  Vin  G2  G1
SL   Z    Y   X
Here’s the product description:
Triple Axis Accelerometer MMA7260 Module Arduino PIC MCU DSP ARM Compatible
This auction is for one pcs Triple Axis Accelerometer MMA7260 Module. It can measure acceleration in the X, Y, and Z axes with a range of ±1.5 g to 6 g (software-selectable sensitivity).
Voltage: 5 V / 3.3 V
Selectable sensitivity: ±1.5 g / 2 g / 4 g / 6 g
Low power: 500 μA in measurement mode, 3 μA in standby
High sensitivity: 800 mV/g @ 1.5 g
Board size: 2.4 cm x 1.2 cm
Weight: 5 gram
1 X Triple Axis Accelerometer MMA7260 Module
Email AVR Sample code (Download link will be provided after you receive the hardware)
I would like to make a rooster with a head that can turn to look at you. I’m looking for lots of ideas on how to do this, as I like this sort of interaction.
Obviously a motion detector is probably enough. Perhaps more than one, so we can get a fix on the person’s actual location.
A more generalized solution might be to look at an Arduino, a camera, and Processing.
Teddy Bear Vision — Arduino Experiment from Amanda Tasse on Vimeo.
Here’s a cool one I got from Micheal of http://krazatchu.ca/.
Here’s one I found on Google:
Here’s another cool vision project from Micheal