ASIMO

From Citizendium



Revision as of 13:15, 16 October 2011

(CC) Photo: Honda
The New ASIMO at the Consumer Electronics Show in 2007.

ASIMO (アシモ ashimo) is an advanced humanoid robot, often described as the most advanced in the world, developed by the Japanese company Honda. The first ASIMO was completed after 15 years of research and was officially unveiled on October 31, 2000. The robot resembles a small astronaut wearing a backpack and can perform a variety of tasks, including running, kicking a ball, walking up and down stairs, and recognizing people by their appearance and voice. The name is short for "Advanced Step in Innovative MObility" and has also been presented as an abbreviation of ashita no mobility, meaning 'mobility in the future.'[1] It was named in reference to Isaac Asimov, the American professor and science fiction writer credited with coining the term robotics and proposing the Three Laws of Robotics.

Design concept

"Following in the steps of Honda motorcycles, cars and power products. Honda has taken up a new challenge in mobility - the development of a two-legged humanoid robot that can walk."[2]

Bipedal movement has been the primary focus of Honda's humanoid robotics research since it began in 1986 with the development of the 'E0' prototype, with the goal of creating general-purpose, intelligent robots that can "coexist and cooperate with humans".[3] While many different visions of futuristic robots existed, such as R2-D2 and C-3PO from Star Wars, Honda concluded that human-like robots with bipedal mobility are best suited to operating and interacting with humans in human surroundings.[4]

Based on this concept, ASIMO's design addresses three main elements: human-friendliness, adaptability to the human environment, and engineering feasibility. The robot's height was set at 130 cm, similar to a child's, as this is practical both from an engineering standpoint (a smaller and lighter robot is less challenging to build than an adult-sized robot such as the P2 prototype) and for operability in the environment, where light switches are normally located 110 cm from the floor. With less bulk, the robot can move more efficiently around obstacles and through narrow passages,[5] presents a less overwhelming presence to humans, and is less hazardous in case of accidents.

Its humanoid form, which is not only functionally but also proportionally similar to the human body, was meant to enhance its human- and environment-friendly qualities by allowing it to make gestures and communicate face-to-face, as well as to use stairs or take a seat in a car. Its strikingly minimalist appearance, which lacks a detailed face and toes on its feet, means fewer moving parts that could fail, while remaining clearly discernible to humans as a head, torso, arms, hands, legs, and feet. As a popular icon, ASIMO has contributed to the conceptual diversity of futuristic robots through a distinctive design language rooted in modern Japanese aesthetics.[6]

Honda has suggested several future uses for robots like ASIMO, which, despite its impressive list of feats and features, remains an experimental technology demonstrator that must operate in controlled, predictable environments. With further advances, ASIMO could take on tasks such as elderly care assistance, firefighting, and toxic cleanup.[7] At present, ASIMO is leased to companies for receptionist work.[8]

On August 12, 2011, the Japanese newspaper Asahi Shimbun reported that Honda was seeking to develop a robot based on ASIMO to handle the radiation leakage at the Fukushima nuclear power plant caused by the earthquake and tsunami that struck Japan in March 2011:[9]

If successful, in another year or two, an improved version of the Asimo could be taking over the work at the Fukushima site... Adjustments can be made to the degree of strength that is applied in the robot's shoulder, elbow and wrists that are operated by motors. While using the Asimo as the base, Honda officials want to create a robot devoted exclusively for working at the nuclear accident site by taking advantage of the arm technology of the Asimo. Because the footing at the Fukushima site is treacherous due to rubble that could topple the robot, the biped aspect of the Asimo would be replaced by either tires or crawlers used on tanks.

According to the AFP, US Honda spokeswoman Lauren Ebner denied the report, dismissing it as mere "speculation."[10]

Technology

System structure

ASIMO's control system consists of approximately 20 CPUs and associated sensors grouped into several independent sub-systems that interact asynchronously through an internal message board. Each sub-system is essentially a PC running its own operating system, and the sub-systems are responsible respectively for audio-visual sensing and recognition, communication with the operator, actuation of movement, and power management. The internal message board is updated at different frequencies because the processing load and processing power of each sub-system differ. Contrary to popular belief, the system's activities are event-driven, without a central, intelligent AI to handle unexpected situations.[11][12]
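The asynchronous message-board architecture described above can be sketched as a simple blackboard pattern: each sub-system posts its latest output at its own rate, and readers only ever see the most recent value. Everything below is illustrative; the sub-system names, update periods, and payloads are assumptions, not Honda's actual design.

```python
# Hypothetical sketch of an internal message board (blackboard) shared by
# asynchronous sub-systems; names, periods, and payloads are assumptions.

class MessageBoard:
    """Shared board: each sub-system posts its latest output with a tick stamp."""
    def __init__(self):
        self._entries = {}

    def post(self, topic, value, tick):
        self._entries[topic] = (value, tick)  # newer posts overwrite older ones

    def read(self, topic):
        return self._entries.get(topic)

class SubSystem:
    """Posts to the board every `period` ticks, modeling differing update rates."""
    def __init__(self, name, period, produce):
        self.name, self.period, self.produce = name, period, produce

    def step(self, board, tick):
        if tick % self.period == 0:
            board.post(self.name, self.produce(tick), tick)

board = MessageBoard()
subsystems = [
    SubSystem("vision", period=4, produce=lambda t: f"frame-{t}"),  # heavy load, slow
    SubSystem("audio",  period=2, produce=lambda t: f"sound-{t}"),
    SubSystem("gyro",   period=1, produce=lambda t: f"tilt-{t}"),   # light load, fast
]

# Each sub-system runs at its own rate; no central controller sequences them.
for tick in range(8):
    for s in subsystems:
        s.step(board, tick)

print(board.read("vision"))  # ('frame-4', 4)
print(board.read("gyro"))    # ('tilt-7', 7)
```

The point of the pattern is decoupling: a slow sub-system never blocks a fast one, at the cost that readers may see slightly stale data, which is why each entry carries its tick stamp.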

The three senses

ASIMO's sensors are responsible for visual, auditory, and tactile functions, as well as for movement, via its gyro sensor, force sensors, and inclinometer. Compared with humans, it is limited to three senses: it can see, hear, and touch, but it lacks the chemical detectors that could substitute for a nose and tongue, as well as any capacity to react with enjoyment or disgust.

ASIMO's vision system handles spatial perception, object mapping, human tracking, facial recognition, and gestural-postural detection. It relies on stereo images (two views from slightly different angles) provided by the frame grabber connected to the two color board cameras on the robot's head unit. 3D and moving objects are identified by calculating scene depth with the Sum of Absolute Differences (SAD) method, using images captured in black-and-white and calibrated for lens distortion. A local map of 3D objects is constructed and provided to the agent programs for walking and movement. The vision system also identifies the moving parts of moving objects, from which human gestures and postures are recognized and passed to the agent programs for human-robot interaction and eye control.[13]
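The SAD depth step can be illustrated with a small NumPy sketch of block matching: for each pixel in the left image, slide a window along the same row of the right image and pick the horizontal shift (disparity) with the lowest sum of absolute differences; larger disparity means a closer object. The window size, search range, and synthetic image pair are assumptions for demonstration; ASIMO's actual implementation is not public.

```python
import numpy as np

def sad_disparity(left, right, window=3, max_disp=4):
    """Per-pixel disparity by minimizing the Sum of Absolute Differences (SAD)."""
    h, w = left.shape
    half = window // 2
    disp = np.zeros((h, w), dtype=int)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(int)
            # SAD cost of matching the left patch at each candidate shift d
            costs = [np.abs(patch - right[y - half:y + half + 1,
                                          x - d - half:x - d + half + 1].astype(int)).sum()
                     for d in range(max_disp + 1)]
            disp[y, x] = int(np.argmin(costs))  # best-matching shift = disparity
    return disp

# Synthetic rectified pair: the right view is the left view shifted by 2 pixels,
# so the true disparity in the overlapping region is 2.
left = np.tile(np.arange(12), (7, 1))
right = np.zeros_like(left)
right[:, :-2] = left[:, 2:]
print(sad_disparity(left, right)[3, 6])  # 2
```

Real stereo pipelines add the refinements the sketch omits: rectification so that matches lie on the same scanline, sub-pixel interpolation of the cost minimum, and rejection of ambiguous (low-texture) matches.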

ASIMO can identify multiple people in a scene by using the snake (active contour) algorithm to estimate the contours of human shapes. When in motion, ASIMO tracks humans with an optical-flow-based algorithm that locks onto their changing positions. A 2D gesture-recognition algorithm identifies a handshake, hand-circling, farewell, hand swing, high hand, and a 'come here' call from the probability of each gesture given the hand's position, using a Bayes statistical model. A 3D gesture-recognition algorithm identifies where a person is pointing from the relative positions of the person's head and hand in the depth-map data. Faces are recognized by applying the eigenvector method to face-contour and eye-image data.[13]
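The Bayes-based gesture classification can be sketched as choosing the gesture with the highest posterior probability given an observed hand position. Only the gesture names below come from the text; the single feature (hand height relative to the shoulder), the Gaussian likelihoods, and the priors are invented for illustration.

```python
import math

# gesture: (mean hand height relative to shoulder, std dev, prior)
# All numbers are made up for the sketch, not ASIMO's trained parameters.
GESTURES = {
    "high hand":  (0.6, 0.15, 0.2),
    "hand swing": (0.0, 0.20, 0.4),
    "handshake":  (-0.3, 0.10, 0.4),
}

def gaussian(x, mu, sigma):
    """Gaussian likelihood P(position | gesture)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def classify(hand_height):
    """Pick the gesture maximizing P(gesture | position) by Bayes' rule."""
    posteriors = {g: gaussian(hand_height, mu, sd) * prior
                  for g, (mu, sd, prior) in GESTURES.items()}
    total = sum(posteriors.values())
    best = max(posteriors, key=posteriors.get)
    return best, posteriors[best] / total  # label and normalized confidence

print(classify(0.55)[0])  # high hand
```

A real system would use richer features (hand trajectory over time, both hands, body pose) and likelihoods estimated from labeled examples, but the decision rule, maximizing prior times likelihood, is the same.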

ASIMO's hearing can recognize human voice tones and footsteps. It calculates the direction of a sound's origin from the volume and time difference of the signals arriving at two separate microphones. Hearing makes ASIMO aware of changes outside its field of vision, allowing it to turn toward a person calling it or toward the sound of something falling on the floor.[13]
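The time-difference part of this localization can be sketched with cross-correlation: find the lag at which the two microphone signals best align, convert it to a time delay, and then to a bearing angle using far-field geometry (sin θ = c·Δt / d). The microphone spacing and sample rate below are assumed values, not ASIMO's specifications.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s at room temperature
MIC_SPACING = 0.2        # m between the two microphones (assumed)
SAMPLE_RATE = 44100      # Hz (assumed)

def arrival_lag(sig_a, sig_b):
    """Samples by which sig_a lags sig_b (positive: sig_a arrives later)."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    return int(np.argmax(corr)) - (len(sig_b) - 1)

def bearing_degrees(left_sig, right_sig):
    """Source bearing: 0 deg = straight ahead, positive = toward the left mic."""
    dt = arrival_lag(right_sig, left_sig) / SAMPLE_RATE  # right mic hears it later
    # Far-field geometry: sin(theta) = c * dt / d; clip against rounding overshoot
    s = np.clip(SPEED_OF_SOUND * dt / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(s)))

# Synthetic test: the same tone burst reaches the right microphone 10 samples
# late, i.e. the source sits off to the left of center.
burst = np.sin(np.linspace(0, 20 * np.pi, 200))
left = np.concatenate([burst, np.zeros(50)])
right = np.concatenate([np.zeros(10), burst, np.zeros(40)])
print(round(bearing_degrees(left, right), 1))  # 22.9
```

Two microphones resolve only a left-right angle, with a front-back ambiguity; the text's mention of volume differences suggests amplitude cues are combined with the time delay to disambiguate.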

Movement

ASIMO v2 changes

Development history

Notes

  1. Hirose, Masato, and Kenichi Ogawa, 2006. p. 14
  2. "ASIMO Technology." Honda Worldwide. Honda Motor Co., Ltd. Web. 24 Aug. 2011. <http://world.honda.com/ASIMO/technology/>.
  3. Pfeiffer, Friedrich, and Hirochika Inoue, 2007. p. 5
  4. Hirose, Masato, and Kenichi Ogawa, 2006. p. 11
  5. Hirose, Masato, and Kenichi Ogawa, 2006. p. 15
  6. Mansfield, Stephen. "Japanese Aesthetics and High-Tech Design." Nov. 2001. J@pan Inc. Japan Inc Communications, Inc. Web. 10 Oct. 2011. <http://www.japaninc.com/article.php?articleID=515>.
  7. Honda. ASIMO Technical Guide. The Honda Humanoid Robot ASIMO. Web. 9 Oct. 2011. <http://www.asimo.com.au/pdf/Asimo_Tech_Guide.pdf>.
  8. Sakagami, et al, 2002. p. 2478
  9. Hashimoto, Yukio. "Honda to Improve Robot to Work at Fukushima." The Asahi Shimbun. The Asahi Shimbun Company, 12 Aug. 2011. Web. 9 Oct. 2011. <http://ajw.asahi.com/article/0311disaster/recovery/AJ201108126075>.
  10. AFP. "Honda Denies Nuclear Robot Mission." The New Zealand Herald. 15 Aug. 2011. Web. 10 Oct. 2011. <http://www.nzherald.co.nz/japan/news/article.cfm?l_id=57>.
  11. Hirose, Masato, and Kenichi Ogawa, 2006. pp. 16-17
  12. Sakagami, et al, 2002. pp. 2479-2480
  13. Sakagami, et al, 2002. pp. 2480-2481