People do not need to have seen a staircase before to know what it is and how best to climb it. For robots, however, this can be an insurmountable problem.
A potential solution to this dilemma could be to have machines imitate the effortless movements of people. That is exactly the premise of an idea from researchers at the University of Illinois and MIT that has just been published in "Science Robotics".
The scientists have created a connection between human and machine that transfers the operator's movements to the robot. The operator stands on a platform fitted with motion sensors so that different movements (jumping, running or climbing) can be tracked. The system reads upper-body movements via a vest equipped with sensors. The data from the upper body and the legs are streamed to a two-legged robot (a small version of the HERMES robot developed at MIT).
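The article does not spell out how operator motion is translated for a much smaller robot, so the following is only an illustrative sketch with invented names: a common approach in teleoperated balancing is to scale the operator's centre-of-mass shift to the robot's size, so that balance strategies transfer proportionally.

```python
def map_operator_to_robot(operator_com_offset_m: float,
                          operator_height_m: float,
                          robot_height_m: float) -> float:
    """Scale the operator's centre-of-mass offset to the robot's scale.

    Simple proportional mapping: the robot leans by the same *relative*
    amount as the operator, so a lean that is safe for a 1.8 m human
    stays safe for a 0.9 m robot.
    """
    scale = robot_height_m / operator_height_m
    return operator_com_offset_m * scale

# Example: the operator (1.8 m tall) shifts their centre of mass
# 0.09 m forward; a 0.9 m robot shifts half as far.
robot_offset = map_operator_to_robot(0.09, 1.8, 0.9)
print(robot_offset)  # 0.045
```

The scaling factor and function name here are assumptions for illustration, not the interface described in the paper.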
Feedback in both directions
(Source: University of Illinois)
The system works in both directions: it also allows the operator to "feel" what the robot does. If the robot runs or bumps into a wall, for example, this sensation is transmitted to the person at the other end of the line via haptic feedback. This lets the person adjust accordingly: depending on what is needed, the operator can apply more or less pressure. The feedback also includes safety measures that automatically cut the power if the robot experiences dangerous levels of force, says João Ramos, assistant professor at the University of Illinois and co-author of the study.
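The bidirectional loop with a safety cutoff can be sketched roughly as follows. The threshold, gain and function name are assumptions for illustration, not values from the study:

```python
FORCE_CUTOFF_N = 200.0  # assumed danger threshold for the automatic cutoff

def feedback_step(robot_contact_force_n: float, gain: float = 0.05):
    """One step of the bidirectional loop.

    Returns (haptic_force_for_operator, power_on). If the measured contact
    force exceeds the cutoff, power is switched off automatically, mirroring
    the safety measure Ramos describes; otherwise a scaled-down version of
    the force is rendered to the operator's vest.
    """
    if robot_contact_force_n > FORCE_CUTOFF_N:
        return 0.0, False  # dangerous impact: cut the power, render nothing
    return gain * robot_contact_force_n, True

print(feedback_step(50.0))   # (2.5, True): mild bump relayed to the operator
print(feedback_step(500.0))  # (0.0, False): dangerous impact, power cut
```

The point of the sketch is the shape of the loop (sense force, check a safety threshold, render scaled feedback), not the specific numbers.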
The current state of the system is fairly basic. It needs many cables, suffers from some communication delays and captures only quite simple movements. It is also geared towards specific tasks rather than offering a general system for all movements. Nevertheless, it is a step towards more mobile and more useful robots.
"Getting robots to move autonomously is one of the big challenges in robotics. This idea sidesteps it by using the power of the human brain, which captures sensory information about the world, processes it and links it to a control system, for example for tasks such as balancing or walking", explains Mike Mistry, who researches robotics at the University of Edinburgh.
Robotic rescue operations
Connected directly to a person, robots could help respond to disasters and other situations that would put the lives of human responders in danger.
The researchers say such a system could be used, for example, in robot-assisted rescue operations like those after the nuclear disaster in Fukushima, Japan, in 2011. People could have steered the robots directly and thus navigated precisely through the plant, all from a safe distance.
Although the process does not currently involve machine learning, Ramos believes that the data collected by the system could help train autonomous robots. "In 50 years we will have fully autonomous robots. Human control, however, offers a lot of potential that we have not yet explored. In the meantime, it therefore makes sense to combine robots and people to the benefit of both", he says.