Deep learning is not the end point of making robots smart; their five senses also need enhancing.

For a robot to truly think like a human being, it is not enough to equip it with a human-like "brain"; a human-like way of perceiving the outside world is equally indispensable. Humans are intelligent because both their senses and their minds are sharp; likewise, robots need both the center and the terminals.
Deep learning ability is an important indicator of a robot's degree of intelligence, and is therefore a research and development focus for major technology companies. By studying and simulating the structure and operation of human neural networks, and gradually applying the research results in design practice, engineers have steadily strengthened robots' deep learning ability; robots are becoming more intelligent and "think" more and more like humans. However, for a robot to truly think like a human, it is not enough to equip it with a human-like "brain"; it also needs a human-like way of perceiving the outside world.

Human cognition of the outside world comes mainly from the five senses: sight, hearing, smell, taste and touch. Accordingly, when the industry gives robots human-like perception, the work is divided into five corresponding branches. Whether a robot is virtual or physical, if it is developed in a human-like direction it cannot do without imitating these five senses, although its sensory types and levels of perception differ with its positioning and function.

Vision

About 90% of the information humans perceive comes from vision. Correspondingly, machine vision plays a pivotal role among a robot's five senses. A robot's "visual organs" consist mainly of photosensitive sensors and color-sensitive sensors.

Last month, researchers at the Carnegie Mellon University College of Engineering developed a machine vision technology that automatically identifies and classifies different types of metal 3D-printing powders with an accuracy of over 95%, and it is expected to enter widespread use within five years. The computer's ability to recognize powders already exceeds that of trained people. The researchers believe their work will contribute to autonomous microstructural analysis in the future.
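The classification step can be illustrated with a minimal sketch. This is not the CMU team's method: the feature names (mean particle size, roundness), the training values and the nearest-centroid rule are all invented here for illustration.

```python
# Hypothetical sketch: classify metal-powder micrographs by simple
# image-derived features using a nearest-centroid rule.
# All feature values below are invented for illustration.

def centroid(samples):
    """Element-wise mean of a list of feature vectors."""
    n = len(samples)
    return [sum(v[i] for v in samples) / n for i in range(len(samples[0]))]

def nearest_centroid(features, centroids):
    """Return the label whose centroid is closest (Euclidean) to `features`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# Invented training data: [mean particle size (um), mean roundness]
training = {
    "gas-atomized":   [[30.1, 0.95], [28.7, 0.93], [31.5, 0.96]],
    "water-atomized": [[45.2, 0.60], [47.8, 0.58], [44.1, 0.63]],
}
centroids = {label: centroid(samples) for label, samples in training.items()}

print(nearest_centroid([29.5, 0.94], centroids))  # a fine, round powder
```

A production system would extract such features from real micrographs and likely use a trained convolutional model rather than hand-picked features, but the pipeline shape (features in, label out) is the same.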

Hearing

For humans, hearing is the second most important sense for receiving outside information. Today's popular intelligent voice services place high demands not only on a robot's language-learning ability but also on its hearing. Machine hearing relies mainly on acoustic sensors.

Not content with software such as the iPhone's Siri, which can only recognize speech, American robotics expert Joseph Romano and his collaborators at the University of Pennsylvania created a software tool called ROAR (an open-source audio recognizer for the Robot Operating System). The software helps roboticists train machines to react to a broader range of sounds. Although the technology is still at an early stage of development, Romano believes its potential is huge.
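Recognizing non-speech sounds can be sketched in miniature. This is not ROAR's algorithm: the feature (zero-crossing rate, a crude pitch proxy), the synthetic tones and the labels are all assumptions made for illustration.

```python
# Hypothetical sketch of sound recognition: classify a short clip by
# comparing its zero-crossing rate (a crude pitch proxy) against
# labelled templates. Signals and labels are invented for illustration.
import math

def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs whose signs differ."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    return crossings / len(samples)

def tone(freq_hz, sample_rate=8000, duration_s=0.5):
    """Generate a pure sine tone as a stand-in for recorded audio."""
    n = int(sample_rate * duration_s)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate) for i in range(n)]

templates = {"doorbell": zero_crossing_rate(tone(880)),
             "alarm":    zero_crossing_rate(tone(2000))}

def classify(clip):
    zcr = zero_crossing_rate(clip)
    return min(templates, key=lambda label: abs(templates[label] - zcr))

print(classify(tone(900)))  # closer in pitch to the doorbell template
```

Real systems use far richer features (spectrograms, learned embeddings), but the matching idea, comparing a clip's features to stored templates, is the same.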

Smell

Smell is not as widely applied as sight and hearing, but it is a critical sense for robots built for certain special purposes. Gas sensors are the main devices that give machines a sense of smell.

In 2015, the University of Tokyo, Sumitomo Chemical Co., Ltd. and the Kanagawa Institute of Science and Technology drew on the olfactory structure of insects to develop a sensor that detects the smell of human sweat. Installed on a robot, the device can help rescue teams quickly locate missing persons and avoid secondary disasters. Last year, a team of engineers at Washington University in St. Louis, inspired by the locust's sense of smell, developed a new bionic sensing system that can be used to sniff out dangerous goods such as explosives.
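At its simplest, reacting to a gas sensor means spotting when the reading rises above its recent baseline. The sketch below is a generic illustration, not any of the systems above; the readings, window size and threshold are invented.

```python
# Hypothetical sketch: detect an odour plume in a stream of gas-sensor
# readings by flagging samples that rise well above a moving-average
# baseline. Readings and threshold are invented for illustration.
from collections import deque

def detect_plume(readings, window=5, threshold=0.3):
    """Return indices where a reading exceeds the recent baseline by `threshold`."""
    recent = deque(maxlen=window)  # sliding window of past readings
    hits = []
    for i, r in enumerate(readings):
        if len(recent) == window and r - sum(recent) / window > threshold:
            hits.append(i)
        recent.append(r)
    return hits

readings = [0.10, 0.11, 0.09, 0.10, 0.12, 0.11, 0.55, 0.60, 0.12, 0.10]
print(detect_plume(readings))  # indices where concentration spikes
```

The baseline subtraction matters because cheap gas sensors drift; comparing against a moving average rather than a fixed threshold keeps the detector usable as ambient conditions change.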

Taste

Robots do not need to eat, so the industry's research on machine taste is relatively limited. Robots serving the catering industry, however, naturally need a good sense of taste. Thanks to chemical sensors, robots can indeed "taste."

Last year, Maria Luz Rodriguez-Mendez, a professor of chemistry at the University of Valladolid in Spain, developed a sommelier robot, BeerTongue. Its "electronic tongue" tastes beer through sensors and chemical analysis, with an accuracy of 82%.

Touch

Tactile sensation is undoubtedly important for physical robots. Whether in the factory or in the home, a sense of touch can often improve a robot's quality of service. Robots obtain tactile sensation primarily through pressure-sensitive, temperature-sensitive and fluid sensors.

A few days ago, scientists at Carnegie Mellon University in the United States developed a robotic system, FingerVision, which uses a small camera inside the fingertip to "perceive" objects. They used a pair of grippers printed on a desktop 3D printer as the robot's hands. The significance of FingerVision is that when the robot grasps an object, it can sense whether the item is slipping and adjust its grip accordingly.
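The slip-then-tighten logic can be sketched very simply. This is not the FingerVision code: the marker coordinates, tolerance and force step below are invented, and the real system tracks dots in camera images rather than receiving coordinates directly.

```python
# Hypothetical sketch of camera-based slip detection: track marker dots
# between two frames; if they drift beyond a tolerance, the object is
# slipping, so increase the grip force. Values are invented for illustration.

def mean_displacement(frame_a, frame_b):
    """Average Euclidean shift of corresponding markers between two frames."""
    shifts = [((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
              for (ax, ay), (bx, by) in zip(frame_a, frame_b)]
    return sum(shifts) / len(shifts)

def adjust_grip(force, frame_a, frame_b, tolerance=0.5, step=0.1):
    """Return a grip force, increased by `step` if the markers indicate slip."""
    if mean_displacement(frame_a, frame_b) > tolerance:
        return force + step
    return force

frame1 = [(10.0, 10.0), (20.0, 10.0), (15.0, 18.0)]  # marker positions, frame t
frame2 = [(10.0, 11.2), (20.0, 11.1), (15.0, 19.3)]  # markers drifting: slip
print(adjust_grip(1.0, frame1, frame2))
```

Running this in a control loop, one comparison per camera frame, lets the gripper hold delicate objects with the minimum force that keeps them from sliding.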

Only by using keen five senses to collect external information effectively, combined with powerful deep learning for analysis and processing, can a robot's intelligence be comprehensively improved. Humans are intelligent because both their senses and their minds are sharp; likewise, robots need both the center and the terminals.