Posts Tagged 'Robots'

Lethal Autonomous Robots and Responsibility

Science, Technology, and Society Program
Spring 2012 Colloquium Series
Date:  Thursday Feb. 23,  2012
Time:  3:30 – 5:00 p.m.
Location:  Rodman Room, Thornton Hall, University of Virginia

In the second STS colloquium talk of 2012, Merel Noorman, a postdoc in the STS department, will present her current research on autonomous military robots and responsibility. One of the primary ethical concerns about future military robots is that these technologies will further obfuscate the distribution of responsibility as they become more complex and increasingly capable of autonomous operation. Who will be held responsible when these robots make life-and-death decisions? In her talk, Merel will take a closer look at the discourse on autonomous military robots in order to explore how we can best address such concerns.

Autonomous Quadcopters Work Together To Build Structures

Clay Dillow writes in Popular Science (1/19), “Whenever a new video emerges from UPenn’s GRASP lab (that’s General Robotics, Automation, Sensing and Perception), it’s usually awesome, and this one is no exception.” The video features a team of autonomous quad-rotor helicopters “working from a preset algorithm…constructing a cubic tower structure using specially designed parts that snap together via magnets when placed in the proper arrangement.” The quadcopters “can even judge the quality of their own construction, checking to make sure a piece is properly in place before moving on to the next segment.” Considering potential uses, Dillow writes, “Beyond the obvious applications in automated construction processes, swarms of construction ‘bots could be launched from naval vessels to autonomously construct shelters in disaster-stricken areas or to set up a forward operating base before live troops arrive in a combat zone.”
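The place-then-verify behavior described above can be sketched as a simple control loop. This is purely illustrative: the function and callback names (`build_tower`, `place`, `is_seated`) are invented for this sketch, not the GRASP lab's actual software.

```python
# Hypothetical sketch of the place-and-verify construction loop:
# attempt to place each part, confirm it snapped into the proper
# arrangement, and only then move on to the next segment.

def build_tower(parts, place, is_seated, max_retries=3):
    """Place each part in order, verifying seating before moving on.

    `place(part)` attempts to set a part in position; `is_seated(part)`
    reports whether it locked into the proper arrangement. Returns the
    list of parts successfully placed.
    """
    placed = []
    for part in parts:
        for _attempt in range(max_retries):
            place(part)
            if is_seated(part):  # quality check before the next segment
                placed.append(part)
                break
        else:
            # A real system might re-plan or fetch a new part instead.
            raise RuntimeError(f"part {part!r} never seated properly")
    return placed
```

The key design point the article highlights is the verification step: the robots do not assume a placement succeeded, they check before proceeding.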

Reposted from the 1/20/11 ASEE First Bell

Robot Lifeguard

Robotic Lifeguard To Begin Patrolling US Beaches.

Popular Science (6/25, Calderin) reported, “This summer, EMILY (for EMergency Integrated Lifesaving lanYard) began patrolling Malibu’s dangerous Zuma Beach and will watch over about 25 more by December.” The autonomous robot, which is capable of achieving 28 mph in the water, uses sonar to “scan for the underwater movements associated with swimmers in distress.” It also has a “camera and speakers [to] let an onshore lifeguard calm the person and instruct him to wait for human help or to hold on as EMILY ferries him back.”

Reposted from ASEE First Bell for May 28, 2010

A Fly on the Wall

Fixed-Wing Drone Lands Vertically On Walls.

Popular Science (4/27, Hsu) reports that researchers at Stanford University’s Biomimetics Laboratory have developed “a fixed-wing, non-transforming drone” that can land vertically on walls. “Their drone approaches the wall at full speed,” and “then pitches sharply upward to angle its belly toward the wall and slows its approach speed to just under 7 mph.” The drone uses carbon-fiber and balsa landing legs “tipped with steel spines” in order to make a vertical landing. “The researchers still face engineering challenges such as tuning the suspension system so that the drone doesn’t simply rebound upon landing approach.” They will be presenting “an update on their work at next month’s 2010 IEEE International Conference on Robotics and Automation in Anchorage, Alaska.”
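The approach-and-perch maneuver described above can be framed as a small state machine. Only the 7 mph contact-speed figure comes from the article; the pitch-up distance and the function itself are invented for illustration, not the Stanford team's actual controller.

```python
# Toy sketch of the perching sequence: cruise at full speed, pitch up
# near the wall to present the belly, bleed speed, then engage the
# steel-spined legs once slow enough.

PITCH_UP_DISTANCE_M = 5.0    # assumed distance at which to begin the pitch-up
MAX_CONTACT_SPEED_MPH = 7.0  # from the article: just under 7 mph at contact

def perch_command(distance_to_wall_m, speed_mph):
    """Return the flight-phase command for the current state."""
    if distance_to_wall_m > PITCH_UP_DISTANCE_M:
        return "cruise"          # full-speed approach
    if speed_mph >= MAX_CONTACT_SPEED_MPH:
        return "pitch_up"        # belly toward the wall, shedding speed
    return "engage_spines"       # slow enough for the spines to grip
```

The suspension-tuning challenge the researchers mention would live downstream of this logic, in how the legs absorb the residual contact speed without rebounding.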

Reposted from the April 27, 2010 ASEE First Bell

Humanoid Robot

Students Unveil Full-Sized, Walking Humanoid Robot.

Popular Science (4/27, Ngo) reports, “A group of undergraduate and graduate students at the Virginia Tech College of Engineering’s Robotics and Mechanisms Laboratory (RoMeLa) have unveiled” the Cognitive Humanoid Autonomous Robot with Learning Intelligence (CHARLI), “which they are calling the first full-sized, walking, untethered, humanoid robot, complete with four moving limbs and a head, to be built in the United States.” Dennis Hong, an associate professor who is leading the research, explained that “the environment we live in is designed for humans.” Therefore, the researchers “focused on making a humanoid robot with motor skills that can handle human tasks.” Popular Science noted, “There are two versions of CHARLI in development: CHARLI-L, for Lightweight, and CHARLI-H, for Heavy.” The former “will debut in Singapore’s RoboCup tournament later this year.”

Reposted from April 28, 2010 ASEE First Bell

I Love My Robot!

Some Roomba Owners Become Emotionally Attached, Study Finds.

The AP (3/3) reports, “A new study shows how deeply some Roomba owners become attached to the robotic vacuum and suggests there’s a measure of public readiness to accept robots in the house – even flawed ones.” Beki Grinter, an associate professor at Georgia Tech’s College of Computing and one of the researchers involved, said, “They’re more willing to work with a robot that does have issues because they really, really like it.” Grinter added, “It sort of begins to address more concerns: If we can design things that are somewhat emotionally engaging, it doesn’t have to be as reliable.” The article details the phases of the research and lists some of the specific findings.

Reposted from the March 3, 2010 ASEE First Bell.

Picky about Trash

Robot Sorts Plastic Recyclables From Trash.

The Daily Telegraph (UK) (3/2, Demetriou) reported on a “device, created by Mitsubishi Electric Engineering Corp and Osaka University researchers, [that] identifies different plastic materials among rubbish and sorts them into piles.” The robot “uses five laser beams and sensors to detect a range of different plastics for recycling purposes.” Plastic recycling in Japan is comparatively limited, and the new device “aims to boost plastic recycling levels by identifying six different types of plastics that can be recycled and sorted from general rubbish collections.”
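One simple way to frame the sorting step described above is matching each sensor reading against known optical signatures. The article says the device uses five laser beams to distinguish six recyclable plastics; the signature values and nearest-match rule below are invented stand-ins, not the Mitsubishi/Osaka method.

```python
# Illustrative sketch of sorting plastics by optical signature:
# compare a 5-beam sensor reading against reference profiles for six
# plastic types and pick the closest match.

SIGNATURES = {  # hypothetical 5-beam reflectance profiles, one per type
    "PET":  (0.9, 0.2, 0.1, 0.4, 0.7),
    "HDPE": (0.3, 0.8, 0.2, 0.5, 0.1),
    "PVC":  (0.1, 0.1, 0.9, 0.2, 0.3),
    "LDPE": (0.4, 0.7, 0.3, 0.6, 0.2),
    "PP":   (0.2, 0.5, 0.6, 0.8, 0.4),
    "PS":   (0.7, 0.3, 0.4, 0.1, 0.9),
}

def classify(reading):
    """Match a 5-beam reading to the nearest known plastic signature."""
    def dist(sig):
        return sum((a - b) ** 2 for a, b in zip(reading, sig))
    return min(SIGNATURES, key=lambda name: dist(SIGNATURES[name]))
```

Each classified item would then be routed to the pile for its plastic type, which is the sorting behavior the article describes.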

Reposted from the March 3, 2010 ASEE First Bell.

A Robot for My Co-Pilot

In-Dash Robot Uses Facial Expressions To Communicate With Driver

Wired’s “Autopia” blog (11/17, Squatriglia) reported, “Audi and the Massachusetts Institute of Technology envision a future where robots riding shotgun make us happier, safer drivers and create a ‘symbiotic relationship’ between car and driver.” The robot, called Affective Intelligent Driving Agent, or Aida, “would analyze our driving habits, keeping track of frequent routes and destinations to provide real-time traffic info, and make friendly suggestions along the way,” as well as “give gentle reminders to buckle up, watch our speed or slow down for that school bus up ahead.” The robot “uses a small laser video projector to convey facial expressions and other information.” Having “human-like motion” and the ability to express emotions, researchers say, “makes it easier to convey information,” since “reading a facial expression is instantaneous.” The researchers “plan to build a driving simulator for a controlled study” by next year, and “real-world tests will follow in 2011.”

Reposted from the November 18, 2009 ASEE First Bell briefing.

Just in Time for Halloween…

ChemBot Unveiled

CNET News (10/15, Katz) reports on the “shape-shifting ChemBot” that “looks like the love child of a beating heart and a wad of Silly Putty.” It is the product of a contract awarded by the Defense Advanced Research Projects Agency and the U.S. Army Research Office to iRobot. The maker, “along with University of Chicago researchers, showed off the oozy results at the IROS conference (the IEEE/RSJ International Conference on Intelligent Robots and Systems) in St. Louis this week. DARPA envisions the palm-size ChemBot as a mobile robot that can traverse soft terrain and navigate through small openings, such as tiny wall cracks, during reconnaissance and search-and-rescue missions.” The robot “inflates and deflates parts of its body, changing size and shape — and scaring the living daylights out of us. We don’t know exactly when ChemBot will join the Armed Forces, but we can only beg: please, oh please, keep it away from us.”

Reposted from the October 15, 2009 ASEE First Bell briefing.

High Speed Robot Arm

A high-speed robot arm dribbles a ball, spins a pen, throws and catches, ties knots, and uses tweezers to pick up a grain of rice.  Link to video at

The robot is under development at the Ishikawa Komuro Laboratory.  You can see additional videos and demonstrations at their web site.
