How did brain structure inspire us to create a conscious AI model?

A new era of computers is coming.

Many researchers are now developing brain-inspired computer architectures, which borrow concepts from the human brain and promise to be more effective than conventional computers.

In this article, we’re going to discuss IBM’s brain-inspired computer, the work of the Jülich Research Centre, quantum computers, AI’s contribution to material science, and Minsky’s Model Six.

A new brain-inspired computer by IBM

One example that draws everyone’s attention is an IBM Research project. Unlike today’s computers, with their separate central processor, memory unit, storage, and input and output devices, the researchers aim to build a computer in which processing and memory coexist. This change promises gains in both efficiency and energy savings.


What aspects exactly were inspired by the brain?

There are three. The first mimics the co-location of memory and processing in the brain by performing computational tasks in the memory itself. The second draws on the structure of the brain’s synaptic network, using arrays of phase change memory (PCM) devices to accelerate the training of deep neural networks. Last but not least, IBM’s team built a computational substrate for spiking neural networks that mirrors the way neurons and synapses behave.
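The co-location idea can be illustrated with a toy model. In a PCM crossbar, a matrix of conductances performs a matrix-vector multiplication directly where the data is stored, via Ohm’s and Kirchhoff’s laws. Below is a minimal numerical sketch of that principle; the sizes, noise level, and variable names are all illustrative, not IBM’s actual design:

```python
import numpy as np

# Toy model of analog in-memory matrix-vector multiplication.
# Each PCM cell stores a weight as a conductance G[i, j]; applying
# voltages V[j] to the columns yields currents I[i] = sum_j G[i, j] * V[j]
# on the rows, so the multiply happens inside the memory array itself.

rng = np.random.default_rng(0)
G = rng.uniform(0.0, 1.0, size=(4, 3))   # conductances (the stored matrix)
V = np.array([0.5, -1.0, 2.0])           # applied voltages (the input vector)

# Ideal analog result: one "read" computes the whole product at once.
I_ideal = G @ V

# Real devices are noisy; model conductance variation as a small perturbation.
G_noisy = G + rng.normal(0.0, 0.01, size=G.shape)
I_noisy = G_noisy @ V

print(I_ideal)
print(float(np.max(np.abs(I_noisy - I_ideal))))  # small deviation from noise
```

The key point of the sketch is that the whole product is computed in a single analog operation, rather than shuttling each weight between memory and a processor.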

As a result of their experiments, the team reported that the outcomes exceeded all expectations. Abu Sebastian, an IBM researcher and manager, expressed his astonishment, saying:


“We always knew they would be efficient, but we didn’t expect them to outperform by this much… We could achieve 200 times faster performance in the phase change memory computing systems as opposed to conventional computing systems.”

Neural networks: SpiNNaker

Another computer built by imitating the brain’s neural networks, called SpiNNaker, can compete with the best current brain-simulation supercomputer software and is expected to outperform traditional computers in both speed and power consumption. The project of Dr. Sacha van Albada of the Theoretical Neuroanatomy group at the Jülich Research Centre in Germany focuses on studying learning and brain disorders. Professor Markus Diesmann, co-author and head of the Computational and Systems Neuroscience department at the Jülich Research Centre, emphasized the energy consumption of ordinary computers, which is strikingly higher than that of the brain. He believes that neuromorphic machines like SpiNNaker will eventually reach the same energy efficiency as the brain.
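The workload such machines are built for, simulating spiking neurons, can be sketched with the classic leaky integrate-and-fire model. This is a generic textbook model, not SpiNNaker’s actual implementation, and all parameters below are illustrative:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, the kind of model
# spiking-network hardware such as SpiNNaker is designed to simulate
# at scale. All parameters are illustrative.

def simulate_lif(input_current, v_rest=0.0, v_thresh=1.0, tau=10.0, dt=1.0):
    """Return the time steps at which the neuron spikes."""
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest while integrating input.
        v += dt / tau * (-(v - v_rest) + i_in)
        if v >= v_thresh:       # threshold crossing -> emit a spike
            spikes.append(t)
            v = v_rest          # reset after spiking
    return spikes

# A constant input current produces regular spiking.
spike_times = simulate_lif([1.5] * 100)
print(spike_times[:5])
```

Each neuron only needs a few arithmetic operations per time step and communicates through rare, discrete spikes, which is why dedicated hardware for this model can be so energy efficient.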

Quantum computers

Quantum machine learning is a further step toward modern efficiency. Quantum computers can exploit quantum coherence and entanglement to solve problems, such as searching large databases, that classical computers cannot handle as efficiently. While conventional machine learning methods are used in biotechnology, pharmaceuticals, particle physics, and many other fields, they can struggle with certain tasks. Quantum technologies are expected to enhance deep learning, with chemical physics and the pharmaceutical industry as notable examples.
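The database-search advantage refers to Grover’s algorithm, which finds a marked item among N with roughly √N queries instead of N. The quantum circuit can be simulated classically for tiny cases; here is a sketch for N = 4, where a single Grover iteration already finds the marked item with certainty:

```python
import numpy as np

# Grover's search over N = 4 items, simulated with a plain statevector.
N = 4
marked = 3                           # index of the item we are searching for

state = np.full(N, 1 / np.sqrt(N))   # uniform superposition over all items

oracle = np.eye(N)
oracle[marked, marked] = -1          # oracle: phase-flip the marked item

s = np.full(N, 1 / np.sqrt(N))
diffusion = 2 * np.outer(s, s) - np.eye(N)   # inversion about the mean

# For N = 4, one Grover iteration (oracle + diffusion) suffices.
state = diffusion @ (oracle @ state)
probs = np.abs(state) ** 2

print(int(np.argmax(probs)), round(float(probs[marked]), 3))  # -> 3 1.0
```

On real hardware the statevector is never written out explicitly; the simulation above just makes the amplitude amplification visible.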

Neural Networks

Neural networks are a subset of machine learning systems that impress everyone with how much information they can analyze at once. These systems learn as they work and adjust quickly. One, presented by MIT researchers, offers a technique for making sense of neural networks that carry out natural language processing tasks: it reads text as input and produces its own symbols as output. The technique was tested on three different types of language processing systems: one that modeled how words are pronounced, a set of translators, and a simple computer dialogue system that provided responses to questions or remarks.

Material science and design

Active machine learning, a new approach developed by a team led by Leroy Cronin at the University of Glasgow (UK), has been successfully used in material science to help scientists find new structures of polyoxometalates. The machine competed against a group of experimenters and their intuition (new polyoxometalate structures are difficult to find and predict, so expert intuition is normally relied upon). The results showed that even though the human experts achieved more successful crystallizations, the machine was far more “adventurous”, covering a wider domain of the “crystallization space”. The method was also unbiased, since it did not rely on human perception; moreover, it discovered a range of completely unexpected conditions that led to crystallization.
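The general active-learning strategy behind such work can be sketched as a loop: a model proposes the experiment it is least certain about, the experiment returns an outcome, and the model is retrained. The toy below uses scikit-learn with uncertainty sampling; the “crystallization” oracle and all numbers are made-up stand-ins, not the Glasgow team’s actual setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical "crystallization space": each point is a set of reaction
# conditions; the oracle says whether a crystal forms (synthetic rule).
pool = rng.uniform(0.0, 1.0, size=(500, 3))

def run_experiment(x):
    return int(x[0] + 0.5 * x[1] - x[2] > 0.3)

# Seed with a few random experiments, then actively choose the rest.
labeled_idx = list(range(10))
labels = [run_experiment(pool[i]) for i in labeled_idx]

model = RandomForestClassifier(n_estimators=50, random_state=0)
for _ in range(20):
    model.fit(pool[labeled_idx], labels)
    proba = model.predict_proba(pool)
    uncertainty = 1.0 - proba.max(axis=1)      # low max-probability = unsure
    tried = set(labeled_idx)
    # Query the most uncertain untried conditions next.
    order = np.argsort(uncertainty)[::-1]
    next_i = next(int(i) for i in order if int(i) not in tried)
    labeled_idx.append(next_i)
    labels.append(run_experiment(pool[next_i]))

accuracy = model.score(pool, [run_experiment(x) for x in pool])
print(round(accuracy, 3))
```

Because the model queries where it is least sure, it naturally explores unfamiliar regions of the space, which matches the “adventurous” behavior described above.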

Using machine learning algorithms, Elizabeth Holm, professor of materials science and engineering at Carnegie Mellon University, and her group have built an automatic system that recognizes and categorizes materials from microstructural images. The applications are endless: research, industry, publishing, academia, and more. It can quickly search huge archives of images, a task a group of people would spend days on.

Furthermore, chemists from the University of Basel in Switzerland have used AI to calculate the characteristics of about two million crystals made of just four chemical elements. Ninety new crystals found during the research can even be regarded as new materials.

Artificial intelligence helped Felix Faber, a doctoral student in Prof. Anatole von Lilienfeld’s group at the University of Basel’s Department of Chemistry, solve a material design problem involving elpasolite, a chemically complex type of crystal that is practically impossible to design by hand. He combined quantum mechanics with machine learning methods to determine chemical compositions with good accuracy. Machine learning proved to be vastly quicker than relying on quantum-mechanical calculations alone on a supercomputer.
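The speed-up comes from a surrogate model: the expensive quantum-mechanical calculation is run only on a small training set, and a cheap regression model predicts everything else. Here is a generic sketch of that pattern using kernel ridge regression; the “quantum-mechanical” energy function is an entirely synthetic stand-in, not real chemistry:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(1)

# Stand-in for an expensive quantum-mechanical calculation: maps a
# composition descriptor to a formation energy (purely synthetic).
def expensive_qm_energy(x):
    return np.sin(3.0 * x[0]) + 0.5 * x[1] ** 2

# Run the "expensive" calculation on a small training set only...
X_train = rng.uniform(-1.0, 1.0, size=(200, 2))
y_train = np.array([expensive_qm_energy(x) for x in X_train])

# ...then let a cheap surrogate predict the rest of the search space.
model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=2.0)
model.fit(X_train, y_train)

X_new = rng.uniform(-1.0, 1.0, size=(1000, 2))
y_pred = model.predict(X_new)
y_true = np.array([expensive_qm_energy(x) for x in X_new])
mae = float(np.mean(np.abs(y_pred - y_true)))
print(round(mae, 4))  # small error at a fraction of the cost
```

Once trained, each prediction costs milliseconds, which is why screening millions of candidate compositions becomes feasible.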

Minsky’s Model Six

A completely different approach was proposed by Marvin Minsky, whose “Model Six” appeared in his book The Emotion Machine. He believed that to fully understand intelligence, one must first comprehend its other aspects, such as human emotions. His model outlined six levels of thinking: self-conscious reflection, self-reflective thinking, reflective thinking, deliberative thinking, learned reactions, and instinctive reactions. Self-conscious reflection is considered the most sophisticated process, revolving around higher values, while instinctive reactions are something not only humans but also animals have. According to Minsky, these six processes together enable us to generate new ideas and thoughts. Minsky believed that AI would eventually be able to solve humanity’s biggest problems.

The biggest challenge in AI’s future development is creating a new technological framework. Evolwe is currently researching a new AI architecture based on neurobiology, neuropsychology, and studies of the human brain. Evolwe’s AI platform is currently based on deep learning and is trained to recognize a person’s emotional and psychological state. The developing technology can respond to human emotions in an empathic way, aiming to accurately recognize them through text, voice, and other information about a person.

In this article, we’ve discussed several approaches: neuromorphic computing, quantum computing, machine learning in material science, and Marvin Minsky’s emotional model of thinking. Each deserves recognition and proposes valuable, successful solutions that can improve human life.

Although not every innovative brain-inspired architecture is impeccable, the examples above show that new AI algorithms can be time and energy efficient, cheaper, more convenient, less biased, pleasantly unpredictable, and promising.


1. IBM Research. IBM.

2. Abu Sebastian. IBM.

3. American Institute of Physics (2018, October 3). A new brain-inspired architecture could improve how computers handle data and advance AI. EurekAlert.

4. SpiNNaker. Wikipedia.

5. Sacha van Albada. Google Scholar.

6. Jülich Research Centre.

7. Research. MIT.

8. The Cronin Group. Digital Chemistry.

9. University of Glasgow.

10. Wiley (2017, August 3). Active machine learning for the discovery and crystallization of gigantic polyoxometalate molecules.

11. Diorio-Toth, H. (2016, September 13). Using machine learning to understand materials. TechXplore.

12. University of Basel.

13. Felix A. Faber’s research works.

14. University of Basel (2016, September 21). Artificial intelligence helps in the discovery of new materials.

15. The Emotion Machine. Marvin Minsky.

16. Evolwe.