Table of contents:
- Memory hierarchy
- The power of prediction
- Remembering everything
- Memory as art
- How not to drown in the sea of facts
A real contest between rapidly developing technologies and the human brain is unfolding before our eyes, and we already hear that the "machine vs. man" struggle will end badly for the latter, and in the near future at that. How legitimate is the idea of replacing "an imperfect biological computer with a more perfect electronic one," and how does the human brain differ from the most modern device? The author is a psychologist and a certified trainer in Edward de Bono's thinking methods.
Many people think that as technology progresses, the need to memorize information will disappear by itself. After all, the need for mental arithmetic disappeared with the advent of calculators! Already, almost any fact can be Googled in a few minutes, and slogans like "your brain is the most powerful computer" are losing their relevance. Computers, the cloud, and Google can store so much more, and so much more reliably, than we can that there seems to be no point in competing with them. But is our brain really a computer in our head? And why can't even the most advanced technology match the work of human gray matter?
Memory hierarchy
Let's take a simple example. Anyone who works on a computer knows that a file with instructions for making a table of contents in Word looks something like this: "Mark the place in the document where you want to insert the table of contents, open the References tab, click the Table of Contents button," and so on. But in our heads it all works differently. Otherwise, if a friend asked me over the phone how to build an automatic table of contents, I could answer right away. Instead I say, "Wait, let me open the program," and only once I see Word in front of me can I remember what to do.
The explanation is that, unlike files, which are written and read linearly, memories are stored in the brain hierarchically. What happens when a person sees, say, the letter H? The image lands on the retina and travels from there to the primary visual cortex, which recognizes simple features: two vertical strokes and one horizontal. It passes the data about these strokes to the secondary visual cortex, which assembles them into a more complex pattern ("H") and hands the result to the next zone, where letters received from different parts of the secondary visual cortex are combined into words and passed further "upward."
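The bottom-up path described above can be caricatured in a few lines of code. This is only an illustrative sketch, not a model of the visual cortex: the stroke patterns and function names are invented for the example, and each "level" simply combines simpler features from the level below into a more complex pattern.

```python
# Level 1 ("primary visual cortex"): simple strokes detected for each letter.
STROKES = {"H": ["|", "|", "-"], "I": ["|"], "T": ["-", "|"]}

def recognize_letter(strokes):
    """Level 2: combine detected strokes into a letter."""
    for letter, pattern in STROKES.items():
        if sorted(strokes) == sorted(pattern):
            return letter
    return None

def recognize_word(groups_of_strokes):
    """Level 3: combine letters received from level 2 into a word."""
    letters = [recognize_letter(group) for group in groups_of_strokes]
    return "".join(letter for letter in letters if letter)

print(recognize_word([["|", "|", "-"], ["|"]]))  # "HI"
```

The point of the toy example is the shape of the computation: no level sees the whole picture, yet the word emerges at the top of the hierarchy.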
The power of prediction
The cerebral cortex is divided into many zones through which information constantly moves, not only up the hierarchy but also down. The human brain is so efficient, Jeff Hawkins argues in his book On Intelligence, because it can predict future events from the experience stored in memory. To perform an action (for example, to catch a ball), the brain does not have to run lengthy calculations; it is enough to remember how it acted before and, on that basis, predict the ball's flight and coordinate its movements. The circuits of neurons in the cortex form a hierarchical structure in which the higher levels constantly send information down to the lower ones. This allows an incoming sequence of images to be compared with sequences from previous experience. Thus, from the words "A long time ago, many years …" one can predict that the next word is "… ago."
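The "predict the next word from stored sequences" idea can be sketched as a tiny n-gram predictor. This is a deliberately crude stand-in for Hawkins's memory-prediction framework, not anything from his book: it just remembers which word followed which context and predicts the most frequent continuation.

```python
from collections import Counter, defaultdict

def train(corpus, n=2):
    """Store every n-word context and count what followed it."""
    model = defaultdict(Counter)
    words = corpus.split()
    for i in range(len(words) - n):
        context = tuple(words[i:i + n])
        model[context][words[i + n]] += 1
    return model

def predict(model, context):
    """Predict the most frequent continuation seen after this context."""
    counts = model.get(tuple(context))
    return counts.most_common(1)[0][0] if counts else None

model = train("a long time ago many years ago in a galaxy far far away")
print(predict(model, ["many", "years"]))  # "ago"
```

The analogy is loose, but the mechanism is the same in spirit: prediction here is nothing more than retrieval of a stored sequence that matches the current input.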
Hawkins compares the brain to a military chain of command: "Generals at the top of the army say, 'Move the troops to Florida for the winter.' A simple high-level command unfolds into ever more detailed commands as it goes down the hierarchy, and thousands of individual units carry out tens of thousands of actions that result in the troop movement. Reports of what is happening are generated at each level and sent upward until the general receives the final report: 'The transfer was successful.' The general doesn't go into details."
What we think about is always simpler than what we think with
Remembering everything
Unlike the brain, a computer has two very different devices responsible for "memory": the hard disk and RAM. The analogy seems obvious: the hard disk is the cortex, and RAM is the hippocampus. But let's look more closely at how the system works. New information first enters the hippocampus through the cortex. If we never encounter that information again, the hippocampus gradually forgets it. The more often we recall something, the stronger the corresponding connections in the cortex become, until the hippocampus "hands over all authority" for that pattern. This process is called memory consolidation, and it can take up to two years. Until it is finished, it is too early to say that the information is safely stored in long-term memory.
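The handover from a fast, forgetful store to a slow, durable one can be sketched as a toy model. Everything here is invented for illustration, including the freshness values and the handover threshold; real consolidation is, as the article says, a process that can take years.

```python
HANDOVER_STRENGTH = 3  # recalls needed before the "cortex" owns the pattern

class Memory:
    def __init__(self):
        self.hippocampus = {}   # item -> freshness (fast store, decays)
        self.cortex = {}        # item -> connection strength (slow store)

    def learn(self, item):
        """New information lands in the fast store first."""
        self.hippocampus[item] = 5

    def recall(self, item):
        """Each recall strengthens the cortical trace."""
        found = item in self.hippocampus or item in self.cortex
        if found:
            self.cortex[item] = self.cortex.get(item, 0) + 1
            if self.cortex[item] >= HANDOVER_STRENGTH:
                self.hippocampus.pop(item, None)  # consolidated
        return found

    def sleep(self):
        """Items never recalled fade from the fast store."""
        self.hippocampus = {k: v - 1 for k, v in self.hippocampus.items() if v > 1}

m = Memory()
m.learn("word definition")
for _ in range(3):
    m.recall("word definition")
print("word definition" in m.cortex)  # True: the pattern has been handed over
```

The design point the toy captures: forgetting is the default, and repetition, not the initial write, is what makes a memory durable.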
On a dull autumn day, try to remember your summer vacation: how you lay on the beach and looked at the sand. Look closer: you can already make out individual grains of sand, pebbles, fragments of shells. It is very doubtful that you actually remember all this; at some point imagination wedges itself into the image and helpfully supplies the missing details. But at what point memory and fantasy merged into a single whole is impossible to determine.
Thus, any information returned from long-term memory to working memory is brought into line with the changed context and current tasks, and is then consolidated again in its updated form. Every time we recall an event from the past, it is not a memory of the event itself but of the brain's latest "revision" of it. Our memory simply has no option to "open a file read-only": any access to it entails some change.
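The "no read-only access" point can be made concrete with a sketch in which every retrieval is also a write. The blending rule below is invented purely for illustration: each recall re-saves the record stamped with the current context.

```python
def retrieve(store, key, current_context):
    """Retrieval is also a write: the recalled version is re-saved,
    updated with details from the present moment."""
    memory = store[key]
    revised = {**memory,
               "last_context": current_context,
               "revisions": memory.get("revisions", 0) + 1}
    store[key] = revised
    return revised

store = {"beach holiday": {"place": "seaside", "revisions": 0}}
retrieve(store, "beach holiday", "dull autumn evening")
retrieve(store, "beach holiday", "chatting with a friend")
print(store["beach holiday"]["revisions"])  # 2: two recalls, two rewrites
```

Contrast this with an ordinary dictionary lookup, which leaves the stored value untouched: that untouched lookup is exactly the operation human memory lacks.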
Memory as art
In a computer, deleting and saving a file are opposites; for human memory, they are two sides of the same coin.
"For our intellect, forgetting is as important a function as memorization," William James wrote over a hundred years ago. "If we remembered absolutely everything, we would be in as desperate a situation as if we remembered nothing. Recalling an event would take as much time as the event itself."
Yes, a computer may store information better, but it cannot forget it as well as we do. It is far from accidental that we forget: memory is cleared of chaff (which, as in the example with the sand, imagination can fill in if needed) and only the essential framework is preserved. And reflection helps us identify and mark out that framework.
This is why William James stated that "the art of remembering is the art of thinking." To remember means to associate new information with what we already know. The more a person remembers, the more easily new material sticks in memory. And the best way to remember something is persistent reflection on the information received.
How not to drown in the sea of facts
What conclusions can we draw? We can only rejoice at the capabilities of our own brain. Our memory, unlike computer memory, is not just a storehouse of information but an integral part of thinking. And that is a colossal opportunity for development.
To replenish your knowledge, you can look up any information on Google, but to do that you need to understand what exactly you don't know. It's like a jigsaw puzzle: when the picture is already assembled around the missing piece, it is very easy to see what needs to be found. But when all the pieces lie in disarray, it isn't even clear where to start. In that case Google can only drown us in a sea of facts without bringing us any closer to understanding them. Only the brain can tell which pieces are missing. So all we have to do is regularly load ourselves with new, interesting tasks to keep the brain in shape.
More about this
Hawkins J., Blakeslee S. On Intelligence. Moscow: Williams, 2016.
The book presents a theory at the intersection of neurobiology, psychology, and cybernetics, describing the "memory-prediction" system as the basis of human intelligence. The author argues that previous attempts to create intelligent machines failed because of a fundamental mistake: developers sought to recreate human behavior without taking into account the nature of biological intelligence.