
GOLEM #2: The Awakening

In the article mentioned above I described the #Golem, the new type of actor that has appeared in our society. It may belong to a bank, a hospital, a telco or a public administration, but above all it is an amalgam of processes, systems and protocols with which none of us can establish a dialogue. There is no place where awareness of the behavior of the hospital as a whole resides. There is no place where a dialogue specific to me can be established.

According to Wikipedia,

A golem is a personification [...] of an animate being made from inanimate matter [...] The golem is strong, but not smart. If it is ordered to carry out a task, it will do so systematically, slowly and by following the instructions literally, without question.

Automatic systems differ from golems in one respect: they are fast. Everything else applies to them. We have designed them based on analytical thinking: we have achieved a collection of parts with maximum efficiency, but with insane behavior, because they lack two things:

-Understanding and managing their own aggregate behavior, and

-Being able to listen to and accept the response of the “other” in order to adjust that behavior.

The first point matters more the more complicated the system is, because there are too many separate parts for consistent behavior to emerge on its own: imagine trying to walk without a place in the brain where that verb is translated into all the muscle movements that are needed.

The second point matters more the more critical, intense or present in our lives the system is: imagine that your father did not recognize you.

The solution must necessarily be to provide the piece that the Golems lack, the one that allows them to manage their behavior towards each one of us. A technician would call it the component with which to manage the emergent behavior of the complex system according to the context of each client.

In this article we are going to talk about emergent behavior, or what we could call the awakening of consciousness.

1. “Once you see, you can't unsee”

This video is very enlightening.

All the balls move in a straight line, at constant speed, going back and forth between two points.

However, as soon as several of them move together, a new element appears: a wheel that rotates within the perimeter of the circle.

The little balls don't know anything about the wheel, just as the wheel doesn't know anything about the little balls. If you look at one, you don't see the other and vice versa.

The wheel is the emergent behavior of the system composed of the balls.

The little balls know nothing of the wheel they make up. From any one of them you can neither imagine nor see the wheel. The system we are talking about is therefore a complex system, in which that “something” arises not only from its components, but also from the relationships that exist between them. There are thousands of examples around us (schools of fish, water... even companies), but the most perfect is the human being: looking at our organs it is not possible to imagine the person we are, and that person is what other humans relate to.
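For readers who want to play with this, here is a minimal sketch, assuming the video shows the classic “dots on diameters” effect and using harmonic motion along each segment so that the geometry works out exactly. Each ball knows only its own straight segment, yet at every instant all of them sit on a single smaller circle whose center rotates: the emergent wheel.

```python
import numpy as np

# Each ball only knows its own straight segment: it oscillates along the
# diameter at angle phi_k of a circle of radius R.
R = 1.0
n_balls = 6
phis = np.array([k * np.pi / n_balls for k in range(n_balls)])  # one diameter per ball

def ball_positions(t):
    """Position of every ball at time t: straight-line motion along its own diameter."""
    s = R * np.cos(t + phis)  # scalar position along the segment [-R, +R]
    return np.stack([s * np.cos(phis), s * np.sin(phis)], axis=1)

# The emergent behavior: at every instant, all the balls sit on a single circle
# of radius R/2 whose center itself rotates -- the "wheel" no ball knows about.
for t in np.linspace(0, 2 * np.pi, 8):
    pts = ball_positions(t)
    center = np.array([np.cos(t), -np.sin(t)]) * R / 2  # where the wheel happens to be
    radii = np.linalg.norm(pts - center, axis=1)
    print(f"t={t:4.2f}  distance of each ball from the moving center: {np.round(radii, 3)}")
    # every value prints as 0.5 == R/2: the wheel exists only in the aggregate
```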

2. The elephant in the room… now that we see it

Since the Enlightenment, the most widely accepted scientific approach has been based on reductionism, which breaks things down into their component parts so that they can be understood better. This strategy has brought enormous success in the scientific field, but it poses a problem for the study of emergent behaviors, which are precisely what cannot be seen when you divide up what you are studying.

In medicine, for example, although there were already known medical practices in Mesopotamia 6,000 years ago, the nineteenth century had to end before psychiatry was born as the first discipline dedicated to the study of the human being as a whole, their behavior, as something different from the behavior of their organs.

Today, cognitive neuroscience groups several branches of knowledge that deal with behavior, and all of them rest on the fact that the human being, thanks to the proper functioning of the frontal cortex of the brain, is aware of their own behavior. This subjective awareness is something that not all animals have, not even all primates.

In technology we need systems capable of consciousness, that is, of reflecting and deciding on what they do according to the context, even though reflection and decision remain a human role and only a human one: even with artificial intelligence looking at the data, only a human will decide what action to implement or what to correct. But without that awareness of behavior, we will not be able to avoid the #Golems.

The idea of this new module has been on our minds for some time. Many readers will think they have recognized the CRM, and yet a CRM is just a long-term memory, one that requires a human's analysis to recognize what is going well or badly, sometimes after several minutes of effort. Other readers will have thought of Big Data, and that is not it either: big data is also a long-term memory, from which you can draw statistical conclusions, but it is not enough to correct the actions of a case while it is happening.

If we look for the analogy in the human world, we are undoubtedly in the “pre-psychiatry” era. The study of #Golem behavior and its impact, that huge elephant in the room of our digital world, is still orphaned: it is not found in any technology curriculum, nor is it heard of in business schools.

3. The digital twin: consciousness in the automated world

According to Wikipedia, “consciousness is the capacity of a being to recognize the surrounding reality and to relate to it, as well as the immediate or spontaneous knowledge that the subject has of himself, of his acts and reflections”. That is, a subjective knowledge of the impact of one's own behavior on the surrounding reality, independent of that behavior itself.

Our insane systems execute orders automatically, without questions or objectives. The #Golem has no consciousness, no equivalent of a cerebral frontal cortex: a cortex module in which to observe its own behavior, that of the entire system, and change it according to its impact on the context, which is crucial.

The cortex module cannot sit among the internal systems, just as our frontal lobe could not sit inside any of our specialized organs. It receives some information from them, very little, and is in charge of verifying the correct functioning of external exchanges, according to parameters that the internal world completely ignores, such as manners.

An example: if we are struck by nausea during a commercial presentation, we will know how to apologize and run as fast as our insides demand. It is very little information, but it is enough. The important thing is to know how to apologize before running away, and to know who to apologize to again when we return from the emergency, or who to ask. Our digestion is the same as that of most mammals; our ability to learn and relate is what distinguishes us.

Therefore, a cortex module needs a digital twin of the system's important relationships with the outside world, and the ability to act on those exchanges when things do not seem to be going the way they should. Important relationships can be customers, but also employees and, in a digital ecosystem like ours, certainly suppliers.
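As a purely illustrative sketch (every name, rule and threshold below is hypothetical, not a prescription), such a module could start out very small: a digital twin that only records external exchanges per relationship, plus a review rule that escalates to a human when the pattern of exchanges looks wrong.

```python
from collections import defaultdict

class CortexModule:
    """Holds a minimal digital twin of external relationships and watches the exchanges,
    not the internal machinery. All names and rules here are illustrative only."""

    def __init__(self, max_open_issues=3):
        self.twin = defaultdict(list)           # relationship id -> recent exchanges
        self.max_open_issues = max_open_issues  # a context rule the internal systems ignore

    def observe(self, relationship_id, exchange):
        """Record one exchange, e.g. {'type': 'complaint', 'resolved': False}."""
        self.twin[relationship_id].append(exchange)

    def review(self, relationship_id):
        """Decide, from the aggregate view only, whether a corrective action is needed."""
        open_issues = [e for e in self.twin[relationship_id]
                       if e.get("type") == "complaint" and not e.get("resolved")]
        if len(open_issues) >= self.max_open_issues:
            return f"escalate to a human: {relationship_id} has {len(open_issues)} open issues"
        return "no action"

cortex = CortexModule()
for _ in range(3):
    cortex.observe("customer-42", {"type": "complaint", "resolved": False})
print(cortex.review("customer-42"))   # -> escalate to a human: customer-42 has 3 open issues
```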

Going back to our system of balls and wheel, the cortex module deals exclusively with the behavior of the wheel and with translating changes in the wheel back to the balls. If there is no suitable place from which to observe the behavior of our wheel, we may simply never see it.

That is why the world is filling up with #Golems.

4. Biomimicry: inspiration to manage complexity

We have talked about complex systems and emergent behaviors. Our example was six identical balls moving in straight lines, giving rise to a wheel turning inside a static circle. That is, something very simple compared to the problem facing companies that have turned to automation and that, because of the complexity of their business, are suffering the #Golem effect.

In the book Thinking in Systems [1], Donella Meadows presents hierarchy as a brilliant mechanism for reducing complexity, that is to say, for reducing the exchange of information and thereby making elements independent of one another. Hierarchy understood in this way is what supports the functional complexity of the most resilient animals, from insects to humans. In our body, the organs are clear specialists and are grouped into systems that try to share as little as possible, to the point that in some cases they can be exchanged or replaced by machines. It is actually very similar to what Jeff Bezos talks about in his memo on company architecture.

Translated, it says that:

-Each of the systems or teams (the balls) has to expose its functionality through a service, and only through that service.

-If it is done very well, people in the outside world will want to use it, and maybe even pay for it.

-It competes with any other equivalent service in the world, so if it is not done right, it will fall out of use and the team will be disbanded.

And it does not say, but it is understood, that the behavior of the entire organism is directed from a single organ, the brain, which collects only the information it strictly needs.

This architecture that Amazon mandates for its teams has two monumental implications:

-Silos are not broken down; quite the opposite: they are structured and completely isolated.

-The cortex module, which manages the relationship with the outside world, uses internal or external services, with no internal discussion, as sketched below.
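A rough sketch of what this could look like, under the assumption of toy names and interfaces rather than Amazon's actual design: each team exposes its functionality only through a service contract, and the cortex module consumes those services exactly as an external client would.

```python
from abc import ABC, abstractmethod

class Service(ABC):
    """The only doorway into a team's functionality."""
    @abstractmethod
    def handle(self, request: dict) -> dict: ...

class BillingService(Service):
    """One team (one 'ball'). Its internals stay private; only the contract is visible."""
    def handle(self, request: dict) -> dict:
        return {"invoice_resent": True, "customer": request["customer"]}

class Cortex:
    """The cortex module: it composes services, it never reaches into anyone's internals."""
    def __init__(self, services: dict[str, Service]):
        self.services = services

    def correct(self, customer: str) -> dict:
        # corrective action expressed purely as a service call, like any external client
        return self.services["billing"].handle({"customer": customer, "action": "resend"})

cortex = Cortex({"billing": BillingService()})
print(cortex.correct("customer-42"))   # -> {'invoice_resent': True, 'customer': 'customer-42'}
```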

Looking at how the most evolved living organisms, and the most mature technologies, work, this seems to be the path to software maturity. From what Bezos says, it may also be the path for organizations.

5. In summary

When our systems are complicated enough, they become complex systems with emergent behaviors: behavior that arises from the interrelation of the system's components.

To manage the emergent behavior, an additional system is needed. In humans, it lives in the cerebral frontal cortex. In the automated world, that module must be a digital twin of the system, from which the behavior can be observed and changed when there are problems.

The knowledge and training of this module, and the study of its impact on systems and on human relationships, are not yet part of general knowledge. That is why this is disruption territory. And you, dear reader, if you have followed me this far, will never be able to unsee it.

You can also read this post at Technological Atlas
