Like many Japanese, Kogoro Kurata grew up watching futuristic robots in movies and animation, wishing he could bring them to life and pilot one himself. Unlike most other Japanese, he has actually done it.

His 4-tonne, 4-metre (13-foot) tall Kuratas robot is a grey behemoth with a built-in pilot's seat and hand-held controller that allows an operator to flex its massive arms, raise and lower its body, and drive it at speeds of up to 10 kph (6 mph).

"The robots we saw in our generation were always big and always had people riding them, and I don't think they have much meaning in the real world," said Kurata, a 39-year-old artist. "But it really was my dream to ride in one of them, and I also think it's one kind of Japanese culture. I kept thinking that it's something that Japanese had to do."
Japan loves its robots, especially ones that look like attractive women, and the "Miim" robot, otherwise known as HRP-4C, from developers at the National Institute of Advanced Industrial Science and Technology (AIST), is no exception. Well, with one caveat: they were never happy with the way she walked.

With 30 motors that let it walk and move its arms, as well as 8 motors for facial expressions, Miim is based on a User Centered Robot Open Architecture. This allows the robot to draw on fundamental robotic techniques as well as real-time Linux, RT middleware, the robot simulator OpenHRP3, and speech recognition.
Japan’s humanoid robots smile, laugh, and sing. But what if they could read your facial expression, converse in words, and scale the latest peak in communication—tweet on the microblogging service Twitter? All this from space?
A humanoid robot developed by Japan's National Institute of Advanced Industrial Science and Technology (AIST) sings and dances with performers in Tokyo in October 2010.
That’s exactly what Japan’s national aerospace agency is aiming to develop by 2013. The Japan Aerospace Exploration Agency, or JAXA, said earlier this week that it has begun reviewing a possible joint venture with Tokyo University and advertising and communications company Dentsu Inc. to develop a humanoid robot that will join astronauts in space as a permanent resident on the International Space Station, or ISS.
The robot wouldn’t be the first aboard the ISS: NASA is launching a humanoid robot of its own later this month. But the NASA machine has been engineered to assist astronauts with various operational tasks on the ISS, while the Japanese robot’s main task would essentially be in the service sector—to keep astronauts company.
Meet Yuki-taro, a self-guided, GPS- and camera-equipped robot snowplow that somehow manages to look as cute as Pokémon's Pikachu – this is Japan, after all!
Snow? In Japan? Yes indeed, and not just on top of Mount Fuji. Some parts of northern Japan can receive a surprising amount of snow in wintertime, enough to block roads and isolate people living in mountain villages. Elderly people in particular are at risk in these areas, both from being shut-in and from trying to shovel all the snow. That’s where “Yuki-taro, the friendly snowbot”, comes in!
Though only a prototype at present, Yuki-taro’s creators in the snowy city of Niigata expect to have a marketable version ready within 5 years at a cost of under 1 million yen (about $9000). That may sound like a lot, but it’s likely that municipalities would pay the cost and deploy them where needed.
Besides, Yuki-taro is packed with high-tech features such as a GPS positioning sensor, twin video cameras in its "eyes" for obstacle avoidance, and an integral snow-block maker that will thrill local kiddies looking to build an igloo or two!
The International Robot Exhibition 2009 is underway in Tokyo, and Yaskawa is again showcasing its Motoman robots. Previous models proved dexterous in robo-chef demonstrations, but it seems they now have a taste for the Force.
Man and machine expanded their relationship today when Honda announced the new thought-controlled ASIMO. What doors of the mind does this open? Read the facts for yourself…
Honda Research Institute Japan Co., Ltd. (HRI-JP), a subsidiary of Honda R&D Co., Ltd., Advanced Telecommunications Research Institute International (ATR) and Shimadzu Corporation have collaboratively developed the world's first Brain Machine Interface (BMI) technology that uses electroencephalography (EEG) and near-infrared spectroscopy (NIRS), along with newly developed information-extraction technology, to enable control of a robot by human thought alone. It does not require any physical movement such as pressing buttons. This technology will be further developed for application to human-friendly products in the future by integrating it with intelligent and/or robotic technologies.
During the human thought process, slight changes in electrical current and blood flow occur in the brain. The most important factor in developing the BMI technology is the accuracy of measuring and analyzing these changes. The newly developed BMI technology uses EEG, which measures changes in electrical potential on the scalp, and NIRS, which measures changes in cerebral blood flow, together with a newly developed information-extraction technology that enables statistical processing of the complex information from these two types of sensors. As a result, it became possible to distinguish brain activities with high precision from human thought alone, without any physical motion.
The BMI technology announced by HRI-JP and ATR in 2006 used a functional magnetic resonance imaging (fMRI) scanner to measure brain activity. The large size of the fMRI scanner and the powerful magnetic field it generates limited the locations and conditions in which it could be used. Because the newly developed measuring device uses EEG and NIRS sensors, it can be transported to and used in various locations.