Updated Technology News

Unbelievable Latest Technology Improves Humanoid Robots

Tesla recently unveiled a robot prototype that would operate on a framework already in place thanks to the company's autopilot system.

Initially, this type of robot will be task-oriented. Even so, it has sparked a heated debate about what life with a humanoid robot would be like. As I have noted before, one of the major challenges is that most robotics research is compartmentalized: some robots are designed for locomotion, others for social interaction, and still others purely for task execution. That is a problem, because a lifelike humanoid robot requires all three attributes.

Now, I feel Tesla is on the right track with its development because it is built on existing technology, particularly its car autopilot system.

It's fascinating to watch how cars have climbed the automation scale. We're currently at level-four development, with a few companies such as Waymo running experimental tests, while level five, full automation with no human involvement, is still a few years away. Once we reach level five, though, that technology could carry over into humanoid robots.

But it's important to remember that these neural networks are mostly vision-based, and they're only one part of the puzzle for a lifelike humanoid robot.

As a result, there are certain qualities a lifelike robot must possess.

Key Features of a Humanoid Robot:

  • Touch Sensitivity
  • Auditory Response
  • Kinematics and Energy
  • Processing with SNNs

The sense of touch is the primary and most important ability for manipulating objects. There's already a lot of work being done on developing robots with a restricted sense of touch.

A 3D-printed exoskeleton from Columbia University is a good example of this. It features 32 photodiodes and 30 LEDs, as well as a reflective silicone skin that functions as a light barrier. When the robot touches something, the soft exterior deforms, and the photodiodes detect the changing light levels from the LEDs.

This enables it to discern where contact is made as well as the intensity of the pressure, making it extremely precise. A different technique is to implement a spiking neural network to encode and analyze temporal information.
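To make the localization idea concrete, here is a minimal Python sketch, not Columbia's actual design: contact dims the photodiodes nearest the touch, and a weighted centroid of the intensity drop recovers where, and how hard, the press was. The grid layout and all constants are illustrative assumptions.

```python
import numpy as np

# Illustrative layout: 32 photodiodes on an 8x4 grid over a unit patch.
xs, ys = np.meshgrid(np.linspace(0, 1, 8), np.linspace(0, 1, 4))
positions = np.column_stack([xs.ravel(), ys.ravel()])
baseline = np.ones(len(positions))          # readings with no contact

def simulate_contact(point, pressure, sigma=0.2):
    """Light drop falls off with distance from the contact point."""
    d2 = np.sum((positions - point) ** 2, axis=1)
    return baseline - pressure * np.exp(-d2 / (2 * sigma ** 2))

def localize(readings):
    """Estimate contact location and strength from the intensity drop."""
    drop = np.clip(baseline - readings, 0.0, None)
    if drop.sum() < 1e-9:
        return None, 0.0                    # no contact detected
    point = (positions * drop[:, None]).sum(axis=0) / drop.sum()
    return point, float(drop.max())

est, strength = localize(simulate_contact(np.array([0.4, 0.6]), 0.5))
print(est, strength)
```

With the simulated press at (0.4, 0.6), the centroid estimate lands close to the true point; a real sensor would additionally need calibration and noise handling.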

FZI

FZI in Germany has already turned this into a robotic arm, with each finger equipped with a neural circuit that detects contact using motor currents and joint velocity.

A controller is also triggered to regulate the amount of force the finger applies. This is the closest approach yet to a human sense of touch, and it can adapt to a wide range of objects, so in my opinion it should be able to clean the dishes. In the medium term, I believe spiking neural networks will be the way to go.
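As a minimal sketch of the spiking idea, not FZI's actual circuit: feed a leaky integrate-and-fire neuron the mismatch between the measured motor current and what free motion alone would predict; sustained mismatch from contact drives the neuron over threshold, and the spikes signal the touch. The time constant and threshold here are illustrative assumptions.

```python
# Minimal leaky integrate-and-fire (LIF) neuron. Input is the mismatch
# between measured motor current and the free-motion prediction;
# contact produces a sustained mismatch and hence spikes.

def lif_spikes(inputs, tau=0.9, threshold=1.0):
    """Leak the membrane potential, integrate input, spike and reset."""
    v, spikes = 0.0, []
    for i in inputs:
        v = tau * v + i            # leak, then integrate this step's input
        if v >= threshold:
            spikes.append(1)       # threshold crossed: emit a spike
            v = 0.0                # reset after spiking
        else:
            spikes.append(0)
    return spikes

free_motion = [0.05] * 10            # small mismatch: no contact
contact = [0.05] * 5 + [0.6] * 5     # sustained mismatch: contact
print(sum(lif_spikes(free_motion)), sum(lif_spikes(contact)))
```

The free-motion trace never reaches threshold, while the contact trace produces spikes, which is exactly the temporal encoding property that makes SNNs attractive here.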

And with neuromorphic computing, this will only get better, but I'll get to that in a minute. Auditory sensing is another important sense a future humanoid robot will need, though it will most likely be focused on speech recognition.

Digital personal assistants, smart speakers, and smart homes have already incorporated this technology; I'm sure most of us are familiar with Alexa. Many speech applications combine automatic speech recognition with natural language processing: the audio is first transcribed to text, and the meaning of the text is then deciphered. This is easier said than done, as we all know, because speech carries accents, emotions, and other variations.
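The audio-to-text half requires a real ASR model, but the text-to-meaning half can be sketched with a toy rule-based parser. This is purely illustrative; the intents and keyword lists below are made up.

```python
# Toy rule-based intent parser: the text half of a speech pipeline.
# A real system would put an ASR model (audio -> transcript) in front.
# The intents and keyword lists are made-up examples.

INTENTS = {
    "clean_dishes": ("clean", "dishes"),
    "mow_lawn": ("mow", "lawn"),
}

def parse_intent(transcript):
    """Map a transcript to the first intent whose keywords all appear."""
    words = transcript.lower().split()
    for intent, keywords in INTENTS.items():
        if all(k in words for k in keywords):
            return intent
    return "unknown"

print(parse_intent("Please clean the dishes"))   # clean_dishes
print(parse_intent("tell me a joke"))            # unknown
```

Real NLP systems use learned models precisely because keyword matching falls apart on the accents, paraphrases, and ambiguity mentioned above.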

The first generation of commercial humanoid robots will most likely be task-oriented, meaning that if you instruct it to clean the dishes, it will first confirm that you want the dishes cleaned before continuing. It'll probably rely largely on this type of system at first, because you don't want the robot to misinterpret a command and then discover it has been mowing a neighbor's lawn for the past two hours.
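That confirm-before-acting behavior could be gated as in the following sketch, where the `ask` callback and the action table are hypothetical placeholders.

```python
# Toy sketch of a confirm-before-acting gate: the robot echoes its
# parsed intent back and only proceeds on an explicit yes.

def confirm_and_run(intent, ask, actions):
    """Echo the parsed intent back and only act on an explicit yes."""
    reply = ask(f"Did you want me to {intent.replace('_', ' ')}? (yes/no)")
    if reply.strip().lower() == "yes":
        return actions.get(intent, lambda: "no such task")()
    return "cancelled"

actions = {"clean_dishes": lambda: "cleaning dishes"}
print(confirm_and_run("clean_dishes", lambda q: "yes", actions))  # cleaning dishes
print(confirm_and_run("mow_lawn", lambda q: "no", actions))       # cancelled
```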

The ultimate goal would be for the robot to genuinely grasp what it is feeling, but it's too early to say whether that's even conceivable, since it borders on self-awareness.

However, robotics may reach the point where it can replicate feelings so convincingly that you wouldn't know the difference unless you had programmed it yourself.

Obviously, movement is an important feature, and when it comes to task-oriented robots, Boston Dynamics is pretty much on top. The starting points are real-time perception and a model that can anticipate motion over time. Atlas's mobile hydraulic systems are superb, with more than 28 hydraulic joints and a top speed of 2.5 meters per second.

The robot works by taking the information available to its perception system and determining appropriate behaviors for its surroundings online. Essentially, it combines offline and online templates, although this technique has limits because it cannot represent every conceivable action. Deep reinforcement learning, which can smoothly transfer from training in a simulation environment to a physical robot without any additional real-world training or offline processes, may be able to solve this challenge.

This means that joint angles, torques, and other parameters are passed through a network, allowing the robot to learn to walk on any surface. At least in terms of mobility, deep reinforcement learning is, in my opinion, the way to go for future humanoid robots, since they can genuinely adapt to their surroundings without a human hand-crafting all these different models. Another important component that is sometimes forgotten is the power supply.
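As a rough sketch of that angles-and-torques interface, here is a tiny untrained policy network mapping joint angles and velocities to bounded torques; in a sim-to-real setup, the same weights trained in simulation would drive the hardware. The layer sizes are assumptions, and the 28-joint count echoes the Atlas figure above.

```python
import numpy as np

# Tiny untrained policy network: observations (joint angles and
# velocities) in, bounded joint torques out. Sizes are illustrative.

rng = np.random.default_rng(1)

class Policy:
    """Two-layer tanh network mapping observation -> torques."""
    def __init__(self, obs_dim, act_dim, hidden=32):
        self.w1 = rng.normal(0.0, 0.1, (obs_dim, hidden))
        self.w2 = rng.normal(0.0, 0.1, (hidden, act_dim))

    def act(self, obs):
        h = np.tanh(obs @ self.w1)     # hidden features
        return np.tanh(h @ self.w2)    # torques bounded to [-1, 1]

n_joints = 28
policy = Policy(obs_dim=2 * n_joints, act_dim=n_joints)

obs = rng.normal(0.0, 0.1, 2 * n_joints)   # angles + velocities
torques = policy.act(obs)
print(torques.shape)
```

The appeal of this setup is exactly what the paragraph above describes: because the policy is a single learned mapping rather than a library of hand-built motion templates, the training process, not a human, covers the space of situations.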

Atlas has a 3.7-kilowatt-hour lithium-ion battery, which gives it about an hour of operating time. However, in future robots, chores like cutting the grass or collecting groceries could take more than an hour.
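A quick back-of-envelope check of those figures: a 3.7-kilowatt-hour pack lasting roughly an hour implies an average draw of about 3.7 kilowatts, so a longer chore needs either a bigger pack or a lower average draw.

```python
# Back-of-envelope runtime math using the battery figures from the text.

def runtime_hours(capacity_kwh, avg_draw_kw):
    """Hours of operation = stored energy / average power draw."""
    return capacity_kwh / avg_draw_kw

print(runtime_hours(3.7, 3.7))   # 1.0 hour at the implied full draw
print(runtime_hours(3.7, 1.5))   # a lighter chore stretches the pack
```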



Because there are so many design aspects to consider, such as temperature range, specific energy, and energy density, this could become a difficult problem to solve. The more pressing question is whether batteries will be the power source of the future. There are many new chemistries on the horizon, including graphene and sodium-ion, so a battery with a very rapid charging time and a high energy density may yet emerge.

For the time being, though, power remains a significant issue, and I'm not convinced future robots will run on this type of source. The processing unit is the final and most significant feature of a humanoid robot. As you may be aware, quantum computers are the subject of extensive research, and there have been breakthroughs in error correction and qubit stability, but it's still too early to say whether quantum algorithms will ever be used in everyday life.

It is extremely likely that we will witness a digital AI helper that is intelligent enough to operate as your friend or business partner.

Finally, in order to be sold, the future humanoid robot will require certain characteristics: enhanced processing, perception, mobility, and a feasible energy source. Many businesses concentrate on just one or a few of these. To bring a fully functional humanoid robot to market, collaboration or technology transfer will almost certainly be required. We'll probably see task-oriented bots first.

These will be devoid of any personality. In 15 to 20 years, though, we may encounter something that possesses all of these characteristics and appears human. I also want to underline that neuromorphic computing and SNNs will have a significant impact, which could mean that within the next five to ten years we see highly realistic, lifelike digital AI systems.

When combined with virtual reality, this might result in some really intriguing interactions.



