Friday Gadget: The BIOSwimmer Fish Robot

The Biomimetic In-Oil Swimmer (BIOSwimmer) is a robotic fish that has been under development for the last 4-5 years by Boston Engineering Corporation’s Advanced Systems Group in Waltham, MA, for the U.S. Department of Homeland Security.

The U.S. Department of Homeland Security is tasked with uncovering covert attempts to damage, disrupt, or illegally use the flow of commerce.  As you can imagine, this is a challenging mission.  With regard to waterways, a balance must be struck between monitoring ports, rivers, and other waterways and keeping commerce moving.  The BIOSwimmer is being developed to help the U.S. secure and protect these very waterways.  It is a fish-inspired robot (it looks like a tuna) that can be deployed rapidly.  It is designed to maneuver into locations inaccessible to current robots and provide security intelligence far beyond current capabilities.

The robot is a hybrid that combines the design features of a conventional submarine (dive planes, thruster-powered locomotion, and a rigid hull) with the flexible keel of a fish.  The tuna is used as the biological model because its natural swimming gait holds the front two-thirds of the body rigid while the rear third moves; this allows the robot to use the front two-thirds of its body as a rigid, watertight hull while the rear third becomes a flooded, flexible structure.  Hydraulic actuators move the flexible tail from side to side, and electric motors control the dive planes.
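
To picture that gait, here is a toy numerical sketch of the rigid/flexible split described above.  It is purely illustrative; the body length, tail amplitude, and beat frequency are invented placeholders, not Boston Engineering’s figures.

```python
import math

# Toy model of the swimming gait described above: the front two-thirds of
# the body stays rigid while the rear third oscillates side to side.
# All numbers below are made-up placeholders.

BODY_LENGTH_M = 1.0        # placeholder body length
RIGID_FRACTION = 2 / 3     # watertight hull section
TAIL_FREQ_HZ = 1.5         # placeholder tail-beat frequency
TAIL_AMP_RAD = 0.4         # placeholder maximum tail deflection

def tail_angle(t: float) -> float:
    """Side-to-side deflection of the flooded rear section at time t."""
    return TAIL_AMP_RAD * math.sin(2 * math.pi * TAIL_FREQ_HZ * t)

for t in (0.0, 0.1, 0.2, 0.3):
    rigid_len = RIGID_FRACTION * BODY_LENGTH_M
    print(f"t={t:.1f}s  rigid hull: 0 to {rigid_len:.2f} m, "
          f"tail angle: {tail_angle(t):+.2f} rad")
```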

It is a drone controlled via a laptop-based system, so it requires a human operator.  It uses an onboard camera and computer suite for navigation, sensor processing, and communications.  Its onboard sensors are designed for the challenging environment of constricted spaces and high-viscosity fluids found in crowded, active ports on our waterways.

All this capability produces a robotic fish-inspired drone that can both move through the water quickly and turn on a dime, a set of traits not usually seen together in underwater vehicles of any type. 

The BIOSwimmer will be expected to perform tasks like conducting ship hull inspections, performing search-and-rescue missions, and checking cargo holds that may contain toxic fluids.  It can inspect the interior voids of ships, such as flooded bilges and tanks, and hard-to-reach external areas such as steerage, propulsion, and sea chests.  It can also inspect and protect harbors and piers, perform area searches, and carry out other security missions.

Friday Gadget: The Mab Automated Cleaning System for Your House

In my house, I am the one usually dusting, cleaning, and vacuuming.  It’s not that I like doing the cleaning…it’s just that the other family members never seem to be interested in having a clean house.  So this Friday’s Gadget post is one that I really like, as it paints a future where I don’t have to do the cleaning.

The automated robotic cleaning concept, called Mab, relies on flying mini-robots.  The concept won the 2013 Electrolux Design Lab Competition.  Check out the video below.

The Mab automated cleaning system uses hundreds of tiny robots that fly around and collect dust and dirt.  Designer Adrian Perez Zapata says he created the system with the idea of freeing the human race from the tedious task of cleaning.  In his Mab design, micro-robots do the work of cleaning every surface of your house while you sit back and relax.  I love that idea.

Here’s how his Mab concept works.  Think of the Mab core unit as a beehive and the flying robots as the bees.  In this case, hundreds of tiny flying robots are loaded with drops of water mixed with soap.  The Mab core unit scans the room, identifying its dimensions and potential problem areas.  It then releases the flying robots to clean.  As the robots touch surfaces, the cleaning fluid picks up dirt, and each flying robot returns it to the central unit.  Back at the Mab core unit, the dirt is filtered out of the liquid, which is then cycled through the unit for reuse.
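
To make the cycle concrete, here is a toy simulation of one cleaning pass.  Mab is a design concept, not a shipping product, so the robot count, class names, and numbers below are all invented for illustration.

```python
# Toy simulation of one Mab cleaning cycle as described above. Everything
# here (robot count, methods, numbers) is invented for illustration.
import random

class MiniRobot:
    def __init__(self):
        self.droplet_dirty = False    # each robot carries one soapy droplet

    def clean_surface(self):
        self.droplet_dirty = True     # touching a surface picks up dirt

class MabCore:
    def __init__(self, num_robots=500):
        self.robots = [MiniRobot() for _ in range(num_robots)]

    def scan_room(self):
        # The core unit scans the room and identifies problem areas;
        # here we just pick a random number of dirty spots.
        return random.randint(100, len(self.robots))

    def run_cycle(self):
        dirty_spots = self.scan_room()
        for robot in self.robots[:dirty_spots]:   # release the swarm
            robot.clean_surface()
        # Robots return to the hive; dirt is filtered out and the
        # cleaning fluid is recycled for the next pass.
        recovered = sum(r.droplet_dirty for r in self.robots)
        for r in self.robots:
            r.droplet_dirty = False
        print(f"cleaned {recovered} spots; fluid filtered and recycled")

MabCore().run_cycle()
```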

Embedded in Adrian’s design concept is the idea that the Mab could be powered by wireless or solar energy.  He also says the wings of each robot could carry solar panels to collect energy.

Just think…in the future you may never have to clean again.

Friday Gadget: Rapport Device Detects and Reacts to Human Emotions

I’ve decided to bring back the Friday Gadget posts after a very long absence. 

I am not really a gadget guy, but I do like to think about what types of products future generations will have to make their lives easier, and about how emerging technologies will be a part of our lives in the future.  When I first started blogging back in 2006, every Friday I would post about a concept for a future technology or gadget.  The series was designed to help us all take a step back on a Friday, have a little fun, and imagine how technology can disrupt the future.

So I am bringing back the Friday Gadget posts.  I am not sure how long the series will last this time, but we will have fun with it while it lasts…

For this first new post, I found a project team that asked the question: what if your gadgets knew how you were feeling and could then respond appropriately?  A group of designers developed a device they call Rapport that can observe, analyze, and react to your facial expressions in order to select the music playlist that suits you best.  Once you make eye contact with the device, it leans forward and analyzes your facial expression.  Taking into account the time of day, it selects a song that it feels might suit your current mood.  The Rapport device starts playback at a fairly low volume, but will boost the volume if it sees you smiling or looking excited.

Under the covers, the team used four different software programs: Visual Studio (stores the facial recognition library and eye-tracking code), Processing (runs the facial recognition library), Max/MSP (controls volume and curates music), and Arduino (drives the stepper motors inside the device).
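
To make that behavior concrete, here is a minimal Python sketch of the kind of control loop Rapport runs.  This is not the team’s code, which is split across the four tools above; every function and mood label here is a hypothetical stub.

```python
import time

# A hypothetical sketch of Rapport's control loop. The real system splits
# this across Processing (vision), Max/MSP (audio), and Arduino (motion);
# every function below is a stand-in stub.

MOOD_PLAYLISTS = {
    ("happy", "day"): "upbeat mix",
    ("happy", "night"): "feel-good mix",
    ("neutral", "day"): "background mix",
    ("neutral", "night"): "wind-down mix",
}

def detect_eye_contact() -> bool:
    return True                        # stub: pretend the user looked over

def classify_expression() -> str:
    return "happy"                     # stub: pretend the user is smiling

def lean_forward():
    print("leaning toward the user")   # stub for the stepper-motor tilt

def play(playlist: str, volume: float):
    print(f"playing {playlist!r} at volume {volume:.1f}")

def time_of_day() -> str:
    return "day" if 6 <= time.localtime().tm_hour < 18 else "night"

volume = 0.2                           # start playback fairly quiet
for _ in range(3):                     # a few iterations instead of a real loop
    if detect_eye_contact():
        lean_forward()
        mood = classify_expression()
        play(MOOD_PLAYLISTS.get((mood, time_of_day()), "default mix"), volume)
        if mood == "happy":            # boost the volume when the user smiles
            volume = min(1.0, volume + 0.2)
    time.sleep(0.1)
```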

Potential initial applications include smart homes, retirement homes, entertainment events, and education.  In the future, application developers will use emotion-detection systems to design robots that better understand how to interact with humans.  Over time, robots could learn how different humans react emotionally and treat each person differently based on both visual and auditory inputs.

For more, check out these resources: 1) Rapport Introduction (YouTube), 2) Rapport Demonstration (YouTube), 3) Emotional Intelligence (Yanko Design), 4) Feeling the Music: Gadget Reads Emotions to Choose Songs (Gajitz).

Friday Gadget: Pumpkin Sensor Project

For those of you hackers out there looking for a Halloween-themed project:

Here is a quick project for an electronic Halloween pumpkin that will scare your guests as they approach.  A sensor embedded in the nose detects when people get close and can trigger anything you program it to do.  The project site I found has the pumpkin playing scary sounds and lighting up LEDs on its face.

The sounds are stored on an SD card (like the one you use in your digital camera), so it’s easy to change and customize what the pumpkin says.  The project site says the code is very easy to modify if you want to change what happens.
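
The project’s own code isn’t reproduced here, but the detect-then-scare loop is simple enough to sketch.  Here is a hypothetical Raspberry Pi version using the gpiozero and pygame libraries; the pin numbers and sound file name are placeholders.

```python
# Hypothetical Raspberry Pi take on the pumpkin's behavior. The original
# project runs on an Arduino with sounds on an SD card; this sketch just
# shows the same detect-then-scare loop. Pins and file names are placeholders.
from gpiozero import DistanceSensor, LED
import pygame

pygame.mixer.init()
scream = pygame.mixer.Sound("scream.wav")     # any sound file will do

eyes = LED(17)                                # LEDs behind the eyes
nose = DistanceSensor(echo=24, trigger=23,    # ultrasonic sensor in the nose
                      threshold_distance=0.5) # trigger within half a meter

while True:
    nose.wait_for_in_range()                  # block until someone is close
    eyes.blink(on_time=0.1, off_time=0.1, n=10)
    scream.play()
    nose.wait_for_out_of_range()              # reset once they back away
```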

The project instructions seem well written and come with lots of pictures to help you along the way.  It looks like an easy project for first-time sensor hackers/programmers.

Friday Gadget: City-Sheet E-Paper Concept

Every once in a while I go check the Red Dot Award site (http://en.red-dot.org/design.html).  The Red Dot Awards are given out annually for excellence in design.  On my visit today I came across the City-Sheet concept design submitted by Yeon Haejung.

The City-Sheet is a portable device designed for travellers, who today can be seen carrying travel guidebooks while touring.  They use guidebooks and maps to plan out their day and to get more detailed information on tourist sites, public transportation, and restaurants.

The City-Sheet concept is an electronic ‘book’ containing five ‘e-paper’ sheets that can display information.  While the device is attached to a computer via a USB connection, users download information to it.  The user can customize their route for the day, including potential sites they will see, stores they want to visit, museums, sporting events, and restaurants.  The City-Sheet can be rolled up and placed in a bag or large purse.

See more at  http://red-dot.sg/concept/porfolio/06/ic/R098city.htm

While this concept design is certainly interesting, I don’t think it will succeed in the market.  With the flurry of activity surrounding Google Maps mashup mobile applications, along with recent augmented reality apps for iPhone and Android devices, I think users will end up preferring to get their city travel information on their mobile devices.

Friday Gadget: i-REAL Personal Mobility Device

Toyota has been experimenting with personal mobility devices for some time (the i-unit and i-swing concepts).  Their latest prototype is called the i-REAL.  It is a personal mobility vehicle with three wheels (two at the front and one at the back).  The driver sits in a chair when operating the i-REAL.

It operates in both low-speed and high-speed modes.  In low-speed mode, it shortens its wheelbase to move naturally among pedestrians (and at a similar eye height) without taking up much sidewalk space.  In high-speed mode, the wheelbase lengthens to provide a lower center of gravity and better driving performance.  The i-REAL is like a three-wheeled Segway and tops out around 20 mph.
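
As a rough sketch of the idea, the two modes can be modeled as a simple configuration switch.  Toyota hasn’t published the geometry, so the wheelbase figures below are placeholders; only the roughly 20 mph top speed comes from the coverage above.

```python
# Hypothetical sketch of the i-REAL's two driving modes. Toyota has not
# published specs, so the wheelbase figures are placeholders; only the
# ~20 mph top speed comes from press coverage.
from dataclasses import dataclass

@dataclass
class DriveMode:
    name: str
    wheelbase_m: float      # longer wheelbase -> lower center of gravity
    max_speed_mph: float

LOW_SPEED = DriveMode("pedestrian", wheelbase_m=0.8, max_speed_mph=4)
HIGH_SPEED = DriveMode("road", wheelbase_m=1.2, max_speed_mph=20)

def select_mode(target_speed_mph: float) -> DriveMode:
    # Sit upright among pedestrians; stretch out and lean back for speed.
    if target_speed_mph > LOW_SPEED.max_speed_mph:
        return HIGH_SPEED
    return LOW_SPEED

print(select_mode(15))      # -> road mode: longer wheelbase, 20 mph limit
```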

Watch the 4-minute video from the BBC here.  The interesting part of the demo starts about 2 minutes into the video.

Toyota says the i-REAL ensures safe handling [both for the driver and those around the vehicle] by employing perimeter-monitoring sensors to detect whenever a collision with a person or object is imminent.  It alerts the driver through noise and vibration, and alerts people around it to its movements through light and sound.  The i-REAL is also designed to communicate with other i-REALs, allowing you to find and navigate to them on command.

Friday Gadget: The Snuza Halo

The Snuza Halo (www.snuza.com) is among the latest gadgets that use sensors to monitor a baby’s crib for signs of trouble.  The Halo, designed in South Africa, is unusual: rather than clipping the device to a crib or mattress, you attach it to the baby’s diaper.  The Halo senses the slightest movements.  If it does not detect movement for 15 seconds – a sign the baby might not be breathing – the device vibrates the baby’s abdomen.  If after a few seconds the vibration doesn’t appear to restart movement, the Halo sounds an alarm.  Source: Clip-on sensor monitors infants for trouble – The Boston Globe
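In effect, the Halo is a small timer-driven escalation loop.  Here is an illustrative Python sketch: the 15-second threshold comes from the article, while the grace period and the sensor/actuator functions are hypothetical stand-ins for the real hardware.

```python
import time

# Illustrative sketch of the Halo's escalation logic. The 15-second
# threshold comes from the article; the grace period and the sensor and
# actuator functions are hypothetical stand-ins.

NO_MOVEMENT_LIMIT = 15.0    # seconds without movement before vibrating
VIBRATION_GRACE = 5.0       # "a few seconds" to respond to the vibration

def movement_detected() -> bool:
    return False            # stub: a real device reads a movement sensor

def vibrate():
    print("vibrating the baby's abdomen")

def sound_alarm():
    print("ALARM: no movement detected")

last_movement = time.monotonic()
while True:
    if movement_detected():
        last_movement = time.monotonic()
    elif time.monotonic() - last_movement > NO_MOVEMENT_LIMIT:
        vibrate()
        deadline = time.monotonic() + VIBRATION_GRACE
        while time.monotonic() < deadline:
            if movement_detected():     # vibration restarted movement
                last_movement = time.monotonic()
                break
            time.sleep(0.1)
        else:
            sound_alarm()               # vibration didn't help
            break
    time.sleep(0.1)
```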

Friday Gadget: Anti-Paparazzi Clutch Bag

Hey, for all you celebrities out there: I bet you just hate it when the paparazzi surround you and start taking pictures as you come out of restaurants and clubs.  Fear no more.  The anti-paparazzi clutch is a wearable device designed to counter attacks of flash photography from paparazzi.  Its unique patent-pending technology allows the celeb to block any number of incoming shots.  And in case you do like to be photographed, the design allows you to control whether the device’s flash is on or off by the way you hold the bag.  The innovation comes from NYU graduate student Adam Harvey.  Currently the gadget fits inside a ladies’ clutch, but Mr. Harvey hopes to shrink it down to the size of a pendant or a tie tack.  For more information, check out this article at pdngearguide.com.

Friday Gadget: The Hummingbird Robot

AeroVironment has been designing and building small, portable, reliable, and rugged unmanned aerial platforms for front-line day/night reconnaissance and surveillance.  One of the projects they are working on now is a tiny drone that looks and flies like a hummingbird, flapping its little robotic wings to stay in the air.

Check out the video here on YouTube (2:18 video; the hummingbird technology is displayed 50 seconds in).

Based on what I’ve read, this first version of the ‘robot’ has only stayed aloft for 20 seconds at a time so far.  But that short flight was enough to show the potential of a whole new class of miniature spies inspired by nature.  The news is that DARPA just handed AeroVironment more money to research and develop a second version of this hummingbird.

The goals of DARPA’s NAV program…

“The NAV program will push the limits of aerodynamic and power conversion efficiency, endurance, and maneuverability for very small, flapping wing air vehicle systems. The goals of the NAV program — namely to develop an approximately 10 gram aircraft that can hover for extended periods, can fly at forward speeds up to 10 meters per second, can withstand 2.5 meter per second wind gusts, can operate inside buildings, and have up to a kilometer command and control range — will stretch our understanding of flight at these small sizes and require novel technology development.” – Dr. Todd Hylton, DARPA program manager as quoted in AviationWeek

If you sit back and imagine where this technology is ultimately going, it’s fair to expect that we will end up with nano-sized hummingbirds loaded with all types of sensors and deployed all over the place, for applications ranging from security surveillance to bird’s-eye-view video coverage of sporting events.  It reminds me of the nano-bots deployed in Neal Stephenson’s 1995 book, The Diamond Age: Or, a Young Lady’s Illustrated Primer.

Friday Gadget: Microsoft Photosynth

Microsoft Photosynth is a free software application that analyzes a multitude of photos and creates a browsable 3D model by identifying overlapping points in the images.  The tool stitches together dozens of photos so that a place or event can be viewed from multiple angles.

Photosynth works by analyzing each photo for similarities to the others and using that data to estimate where each photo was taken.  It then re-creates the environment and uses it as a canvas on which to display the photos.  The result is a fresh way to organize and share photography, opening up new possibilities for a 180-year-old art form.  Potential uses of Photosynth range from sharing experiences to storytelling and documentation.
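
Photosynth’s own pipeline isn’t public, but the overlap-finding step described above is standard feature matching.  Here is a minimal sketch using OpenCV; the file names are placeholders, and this covers only the first step of a full reconstruction.

```python
# Illustrative only: Photosynth's pipeline is proprietary. This shows the
# generic first step it describes -- detecting overlapping points between
# two photos -- using OpenCV's ORB features. File names are placeholders.
import cv2

img1 = cv2.imread("photo_a.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("photo_b.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)           # detect corner-like keypoints
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Brute-force Hamming matcher; cross-checking keeps only mutual matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

print(f"{len(matches)} overlapping points found")
# A full reconstruction would next estimate each camera's pose from these
# correspondences (structure from motion) and place the photos in 3D space.
```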

It takes 75 photos or more to get the optimal experience, but with big events one can also rely on crowdsourcing.  As an example, CNN asked viewers to send in their photos of Barack Obama’s swearing-in.  See the resulting Inaugural Photosynth.  You can check out other examples at the Photosynth website.

The original announcement press release (from last August) is here, and some background is available at MS Live Labs.

Microsoft recently announced (May 2009) that they have integrated their Photosynth software into Virtual Earth, allowing users to flip between overhead satellite imagery and photographic stitches.  You can check that announcement out here.