To meet the challenge of big data, build intelligent sensor solutions that emulate the human brain, with layered intelligence running from edge sensors through fusion centers to cloud data analysis.

Robert Bosch, the German electronics company, believes that meeting the challenge of big data requires building intelligence into every level of the solution: the edge sensors, the centralized sensor hubs, and the cloud data analysis.

Fortunately, the human brain already runs the most intelligent sensors we know of – eyes, ears, nose, taste buds, and touch – and it can serve as the template for shaping electronic big data solutions to the needs of the Internet of Things (IoT).

Marcellino Gemelli, director of business development at Bosch Sensortec, said at SEMI's recent annual MEMS & Sensors Executive Congress (MSEC): "We must feed the big data problem into a model generator based on the human brain, then use that model to predict what the optimal solution should look like. Because neurons are so versatile, these machine learning solutions can operate at multiple levels."

The neuron is the brain's microprocessor: it accepts thousands of big data inputs, yet after integrating the thousands of dendritic inputs mediated by its synapses, it emits only a single voltage spike along its axon. In the same way, the receptors of the eyes, ears, nose, taste buds, and touch sensors (which mainly register presence, pressure, and temperature) pre-process enormous volumes of raw big data and then pass summary information, encoded in voltage spikes, along the spinal cord to the "old brain" – the brain stem and automatic-behavior centers responsible for tasks such as breathing, heartbeat, and reflexes.
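To make the "thousands of inputs, one spike" idea concrete, here is a minimal sketch of an artificial neuron in Python; the readings, weights, and threshold are invented for illustration and are not drawn from Bosch's designs.

```python
# Minimal sketch of a single artificial neuron: many weighted inputs,
# one binary "spike" output. All names and values are illustrative.

def neuron_output(inputs, weights, threshold=1.0):
    """Sum the dendritic inputs weighted by synaptic strength; fire one
    spike if the total crosses the threshold, otherwise stay silent."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Example: many raw sensor readings reduced to a single spike / no-spike.
readings = [0.2, 0.9, 0.1, 0.7]           # raw "dendritic" inputs
synapses = [0.5, 0.8, 0.1, 0.6]           # learned synaptic weights
print(neuron_output(readings, synapses))  # -> 1 (spike)
```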

Finally, the preprocessed data reaches its ultimate destination, the conscious part of the brain (the cortical gray matter), via a huge interconnected array called the white matter. Different regions of the cerebral cortex are dedicated to the senses of vision, language, smell, taste, and touch, as well as to the cognitive functions of attention, reasoning, evaluation, judgment, and planning.

The intelligent sensor emulates the brain with a three-level model: a sensory level, represented here by a wrist-worn device capturing big data readings in real time; a second-level hub (a smartphone in this example) that collects the trends; and a third-level cloud to which the trends are transmitted every few minutes.

Gemelli said: "The mathematical equivalent of the brain's neural network is the perceptron. It can learn through its variable-conductance synapses, through which big data is transmitted. We can add multiple levels of perceptrons to learn anything a human can learn, such as the different ways people walk."
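A rough sketch of what such a perceptron looks like in code, with adjustable weights standing in for the "variable conductance synapses" Gemelli describes; the toy gait features and labels below are assumptions made purely for illustration.

```python
# Minimal perceptron learning sketch: the weights play the role of
# variable-conductance synapses that strengthen or weaken with training.

def train_perceptron(samples, labels, lr=0.1, epochs=20):
    weights = [0.0] * len(samples[0])
    bias = 0.0
    for _ in range(epochs):
        for x, target in zip(samples, labels):
            total = sum(xi * wi for xi, wi in zip(x, weights)) + bias
            prediction = 1 if total > 0 else 0
            error = target - prediction
            # Adjust each "synapse" in proportion to its input and the error.
            weights = [wi + lr * error * xi for xi, wi in zip(x, weights)]
            bias += lr * error
    return weights, bias

# Toy gait features: [step length, cadence]; label 1 = walking, 0 = running.
samples = [[0.6, 1.0], [0.7, 1.1], [1.2, 2.0], [1.3, 2.2]]
labels = [1, 1, 0, 0]
print(train_perceptron(samples, labels))
```

Stacking several such layers on top of one another is what turns this simple unit into the multi-level learner the article goes on to describe.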

How sensors interpret big data

How the brain's cognitive and limbic systems process perceptual data

Influence of Moore’s law

Moore's law also helps make multi-level perception – known as deep learning – practical, because it provides a general-purpose way to put intelligent processing in the edge sensor, in the hub, and in the cloud.

Gemelli said: "First, quantity helps a great deal – the more big data, the better. Second, diversity helps the network learn different aspects of a thing, such as the different gaits people use when walking. Third, the speed at which the perceptron must respond needs to be quantified. Once you define these three parameters, you can optimize the neural network for any specific application."
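As a purely illustrative sketch of how those three parameters might be captured and used to size a network, consider the following; the field names and the sizing heuristic are assumptions, not anything Gemelli or Bosch specified.

```python
# Hypothetical sketch: the three parameters (quantity, diversity, speed)
# expressed as a profile that could guide how large a network to train
# and how fast it must answer. The heuristic is invented for illustration.

from dataclasses import dataclass

@dataclass
class BigDataProfile:
    samples_per_day: int    # quantity: how much data is collected
    feature_kinds: int      # diversity: how many distinct signal types
    max_latency_ms: float   # speed: how quickly the perceptron must respond

def suggest_hidden_units(profile: BigDataProfile) -> int:
    """Crude rule of thumb: more data and more variety allow a bigger
    network, while a tight latency budget pushes the size back down."""
    size = int((profile.samples_per_day ** 0.5) * profile.feature_kinds / 100)
    if profile.max_latency_ms < 10:
        size //= 4  # edge devices need small, fast models
    return max(4, size)

print(suggest_hidden_units(BigDataProfile(1_000_000, 3, 5.0)))  # -> 7
```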

For example, Gemelli said, a smartwatch/smartphone/cloud combination can divide the big data workload among its tiers. The smartwatch evaluates the continuous real-time data from an individual user, then sends the most important summary data to the smartphone every few minutes. The smartphone, in turn, only needs to send trend summaries to the cloud a few times a day. Detailed analysis of the most important data points happens in the cloud and is fed back to the specific smartwatch wearer, along with timely suggestions to other smartwatch wearers on how to reach the same goals.
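A simple sketch of that three-tier reduction, assuming made-up window sizes, thresholds, and field names rather than any actual Bosch or wearable API:

```python
# Tier 1 (watch) condenses raw readings into periodic summaries,
# tier 2 (phone) condenses summaries into a daily trend, and only
# the trend and a short verdict reach tier 3 (cloud).

from statistics import mean

def watch_summarize(raw_readings, window=60):
    """Tier 1: reduce a continuous stream to one summary per window."""
    return [mean(raw_readings[i:i + window])
            for i in range(0, len(raw_readings), window)]

def phone_trend(summaries):
    """Tier 2: condense the summaries into a single daily trend record."""
    return {"daily_mean": mean(summaries), "daily_peak": max(summaries)}

def cloud_analyze(trend):
    """Tier 3: turn the trend into feedback for the wearer (placeholder rule)."""
    return "increase activity" if trend["daily_mean"] < 50 else "on track"

raw = [40 + (i % 30) for i in range(600)]   # pretend heart-rate-like stream
trend = phone_trend(watch_summarize(raw))
print(trend, cloud_analyze(trend))
```

The point of the design is that each tier forwards only a small fraction of what it receives, which is exactly the data reduction the brain model calls for.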

Bosch is now adding a processor to its edge sensors to emulate this three-level brain model, so that the sensor itself can identify and condense big data trends before passing them on to the intelligent hub.
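One way such on-sensor trend detection could look is sketched below, using an invented short-window versus long-window comparison; the thresholds and message format are assumptions, not Bosch's implementation.

```python
# Hypothetical edge trend detector: the sensor's on-board processor
# compares a short-term average with a longer-term one and only sends
# a message to the hub when the two diverge.

from collections import deque
from statistics import mean

class EdgeTrendDetector:
    def __init__(self, short=5, long=50, threshold=2.0):
        self.short = deque(maxlen=short)
        self.long = deque(maxlen=long)
        self.threshold = threshold

    def update(self, reading):
        """Return a trend message for the hub, or None to stay silent."""
        self.short.append(reading)
        self.long.append(reading)
        if len(self.long) < self.long.maxlen:
            return None  # not enough history yet
        drift = mean(self.short) - mean(self.long)
        if abs(drift) >= self.threshold:
            return {"trend": "rising" if drift > 0 else "falling", "drift": drift}
        return None

detector = EdgeTrendDetector()
for t, value in enumerate([20.0] * 50 + [25.0] * 10):
    message = detector.update(value)
    if message:
        print(t, message)   # only the detected trend leaves the sensor
```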

Gemelli said: "Smart cities in particular need intelligent sensors with built-in processors that can extract trends at the edge in real time. The sensors then send those trends to a sensor hub, which analyzes them and forwards the most important messages to the cloud, where they become actionable information for city managers."
