
Pleo Robot (Photo courtesy Ugobe)
Pleo is a robotic dinosaur toy designed and created by Ugobe, which later sold its intellectual property to Jetta, Pleo's manufacturer.
We decided to use Pleo because its readily available robot skills and behaviors let us quickly prototype whether non-humanoid robots work well in treating children with autism in a clinical setting.
Research Objectives
What are the differences between a humanoid robot and a non-humanoid robot when used in treating children with autism in clinical settings?
Can a robotic pet help develop joint-attention skills in children with autism?
How can a therapist "program" the robot pet in preparation for clinical sessions using a skill/behavior-based interactive learning interface?
What interface techniques help therapists interact with the robot to better treat children with autism?
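To make the third objective concrete, here is a minimal sketch of what a skill/behavior-based preparation interface could look like: a therapist maps triggering events to named robot skills before a session. All class, trigger, and skill names here are illustrative assumptions, not an existing API.

```python
from dataclasses import dataclass, field

@dataclass
class Skill:
    name: str          # e.g. "happy", "tail_wag" (illustrative names)
    duration_s: float  # how long the robot performs the skill

@dataclass
class SessionPlan:
    """An ordered trigger -> skill mapping prepared before a clinical session."""
    steps: list = field(default_factory=list)

    def on(self, trigger: str, skill: Skill):
        # Record one rule; return self so rules can be chained fluently.
        self.steps.append((trigger, skill))
        return self

    def describe(self):
        # Human-readable summary the therapist can review before the session.
        return [f"when '{t}': perform '{s.name}' for {s.duration_s}s"
                for t, s in self.steps]

plan = (SessionPlan()
        .on("child_pats_head", Skill("happy", 3.0))
        .on("child_offers_leaf", Skill("eat", 5.0)))
print("\n".join(plan.describe()))
```

A fluent trigger-to-skill mapping like this keeps the session plan readable to a non-programmer, which matters if therapists are to prepare sessions themselves.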
Robot Specifications
Sensors
Existing Sensors:
Range Sensor: Two directional infrared sensors in the nose to detect object distances.
Visual Sensor: One low-resolution (320×240) camera in the nose.
Audio Sensor: Two microphones, one on each side of the head, enabling directional sound detection.
Feet Touch Sensor: Four switch touch sensors, one on the bottom of each foot.
Pat Sensor: One touch sensor on top of the head, one under the chin, and at least two on the back to detect the user's patting behavior.
Leg Touch Sensor: Touch sensors on the sides of the four legs to detect touches/pokes from the user.
Tilt/Accelerometer Sensor: Detects whether the robot has been tilted or turned upside down.
Shake Sensor: No dedicated hardware; shaking by the user is inferred from the other sensors' readings.
Food Sensor: Senses whether a leaf (food) has been offered.
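For a therapy application, the raw sensor readings above would need to be mapped onto discrete events the software can react to. The sketch below shows one way to do that; the field names and thresholds are assumptions for illustration, not the Pleo SDK.

```python
def classify_events(reading: dict) -> list:
    """Turn one snapshot of (hypothetical) sensor readings into high-level events."""
    events = []
    # Infrared range sensor in the nose: something close in front.
    if reading.get("ir_range_cm", 999) < 10:
        events.append("object_near_nose")
    # Pat sensor on top of the head.
    if reading.get("head_touch", False):
        events.append("patted_on_head")
    # Foot switches: a released switch suggests the robot was picked up.
    if any(reading.get(f"foot_{i}", True) is False for i in range(4)):
        events.append("lifted_off_ground")
    # Tilt/accelerometer: flipped past 90 degrees.
    if abs(reading.get("tilt_deg", 0)) > 90:
        events.append("upside_down")
    return events

print(classify_events({"ir_range_cm": 5, "head_touch": True}))
# → ['object_near_nose', 'patted_on_head']
```

Keeping this classification in one place makes it easy for a clinical application to log exactly which child-initiated interactions occurred during a session.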
Possible Sensors:
Actuators
Existing Actuators:
Eye movements: Open or close eyes (1-DOF)
Mouth movements: Open or close mouth (1-DOF)
Head/neck movements: Up/down, left/right (2-DOF)
Leg movements: Bend knee, move leg forward/backward (2-DOF)
Hip movements: Move hip left/right (1-DOF)
Tail movements: Move tail up/down, left/right (2-DOF)
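The degrees of freedom listed above can be modeled in software as named joints with safe position ranges. This is a minimal sketch assuming illustrative joint names and limits; Pleo's actual firmware interface and motion ranges are not specified here.

```python
# Hypothetical joint table covering the listed DOFs; units and ranges are
# assumptions for illustration, not Pleo's real specifications.
JOINT_LIMITS = {
    "eyes":       (0.0, 1.0),     # 0 = closed, 1 = open (1-DOF)
    "mouth":      (0.0, 1.0),     # 0 = closed, 1 = open (1-DOF)
    "neck_pitch": (-45.0, 45.0),  # up/down, degrees (head/neck, 2-DOF total)
    "neck_yaw":   (-60.0, 60.0),  # left/right, degrees
    "tail_pitch": (-30.0, 30.0),  # up/down, degrees (tail, 2-DOF total)
    "tail_yaw":   (-30.0, 30.0),  # left/right, degrees
}

def clamp_command(joint: str, value: float) -> float:
    """Clamp a requested joint position into its safe range."""
    lo, hi = JOINT_LIMITS[joint]
    return max(lo, min(hi, value))

print(clamp_command("neck_yaw", 90.0))  # request beyond range is clamped
# → 60.0
```

Clamping every command at one chokepoint is a simple safety measure when children interact with the robot: no higher-level behavior can drive a joint past its range.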
Possible Actuators:
Processor
Communication
Robot Skills/Behaviors
Existing skills/behaviors
Body movements: Head/neck, hip, leg, tail movements.
Movements: Walk forward/backward (very slowly). Turn left/right (very slowly). Stretch.
Sounds: Make sounds/noises, speak (if the sound-effect files are replaced)
Cognitive behaviors: Detect patting behavior from the user. Detect tilted or upside-down position. Detect food (leaf).
Emotions: Angry (moves violently and tries to bite), scared (cuddles, or shakes head, body, and legs), bored, happy (moves head and tail left and right), sad (head and tail droop). Many different versions of each emotion are available.
Expressions: Blink, sniff, hiccup, yawn, lip smack, cough, howl, tail wag, wake up, fall asleep
Social behaviors: Curious, threaten (tries to bite quickly)
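The existing expressions and emotions above are the building blocks a therapist would sequence into short routines. As a sketch, here is a hypothetical "greeting" routine for the start of a session, built only from skill names listed in this section; the `perform` backend is a stand-in that logs rather than driving a real robot.

```python
def perform(skill: str, log: list) -> None:
    """Stand-in for the robot backend: record the skill instead of executing it."""
    log.append(skill)

def greeting_routine(log: list) -> list:
    """Chain existing Pleo expressions/emotions into a session-opening routine."""
    for skill in ("wake_up", "blink", "sniff", "tail_wag", "happy"):
        perform(skill, log)
    return log

print(greeting_routine([]))
# → ['wake_up', 'blink', 'sniff', 'tail_wag', 'happy']
```

Separating the routine (what to do) from the backend (how to do it) would also let the same session scripts run against a simulator during preparation and the physical robot in the clinic.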
Possible skills/behaviors
Desired skills/behaviors