The concept that connects the origin of life to Microsoft’s deal with OpenAI is an elusive one: contradicting intuition and belief, yet promising a sci-fi near-future
One of the smartest people living among us is a 96-year-old professor named Noam Chomsky.
He is a philosopher, a political thinker, some say an anarchist, but first and foremost one of the most (if not the most) famous and influential linguists of all time. He developed the theory of the Language Acquisition Device (LAD). He showed that all languages share common grammatical traits – a Universal Grammar (UG) – which is intuited from infancy and does not require learning. He argued that our brain developed a language center dedicated to our human ability to communicate with each other, similar to the centers for vision, hearing and touch.
He was (probably) wrong.
The intertwined story of design, adaptation, randomness, specialization and emergence is a challenging one to tell. It is broad (spanning physics to psychology), it is debated (among scholars and laymen), it is sensitive (challenging intuition and religion), and it is expensive (billions of dollars are at stake).
I will try to simplify and will round a few corners to fit the format. It will not be the story of the universe, but it will start with its birth.
An Emergent Property of The Universe
It all begins with entropy.
Or actually – lack thereof.
Entropy is mess. But when everything started – at the Big Bang – entropy was at its lowest. All things – matter, energy, time – were packed tightly together, sitting in complete peace and order. Then *bang* – everything scattered, and from that point the universe took the shape of my boys’ room: a huge pile of randomness.
And in this randomness – given a very long period of time, a large enough number of elements and a constant set of rules – here and there things became less random and started to take shape. Atoms clumped into molecules, molecules into cells, and cells into organs within creatures.
“Wait, stop. That cannot be right. How can a space full of random atoms colliding into each other produce such a specific thing as a manga-loving teenager wearing Nike, texting on an iPhone?”
This is known as emergence.
Emergence is the phenomenon where complex systems exhibit properties, behaviors, or patterns that are not present in their individual components. These new properties ‘emerge’ from the interactions between the simpler parts.
Not to be confused with design. Which is purposeful and intended.
If you feel “the universe is too amazing and complex to not be planned by something”, you are in good company. Some 84% of people still believe in a higher power putting order in everything.
Hard to Perceive and Everywhere
Emergence is all around us. Molecules that happen (by random chance) to self-replicate are better at becoming abundant in an environment, and emerge as ‘life’.
Random variations (mutations) generate traits for proliferation in the environment. Adaptation (the evolution of traits) makes a clump of cells able to distinguish light from dark. A few million years and levels of complexity later – the ability to see emerges. Billions of cells connected by electric wiring and dipped in chemicals emerge the ability to store and then generate abstract concepts.
Multiple such regions in the brain, each able to generate different abstractions, can emerge language. It is understandable why we would assume there is a dedicated region in the brain that is responsible for it. Not so long ago, it was believed that the pineal gland is where our soul comes from.
There are two types of emergence – simple and complex. Simple emergence is the kind where we can predict what might emerge from the parts (snowflakes from water molecules). Complex emergence is the kind that is more difficult to “guess” from the base components (consciousness from neurons).
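Simple emergence can even be demonstrated in a few lines of code. Here is a minimal sketch of my own (not from the article), using Conway’s Game of Life: two local rules and no “design”, yet a glider – a shape that walks diagonally across the grid – emerges from the interactions.

```python
from collections import Counter

def step(live):
    """Advance one generation; `live` is a set of (x, y) cell coordinates."""
    # Count the live neighbours of every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A live cell survives with 2-3 neighbours; an empty cell is born with exactly 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The classic five-cell "glider" pattern.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):  # a glider repeats every 4 generations, shifted diagonally
    state = step(state)
print(state == {(x + 1, y + 1) for (x, y) in glider})  # True
```

Nothing in the rules mentions motion; the glider’s travel belongs to the interactions, not to any single cell.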
Surprisingly, the emergence currently in fashion – the one we hear (or fear) about daily in the news – belongs to the simple type.
Emergent Phenomena – Also in Digital
This principle of emergence extends beyond natural phenomena to the digital realm. Large Language Models (LLMs) – the backbone of popular AIs – are extremely complex entities. And from complexity we can expect (be certain, even) that phenomena will emerge that we did not build or plan for (did not “design”). LLMs are not just language engines – algorithms that know which words fit together grammatically. They have been trained on language in the wild – not in a lab.
So what can we expect to emerge from wild language? What is there besides knowledge and facts? Here’s a list –
- Code – people write code using language – words, numbers and “grammatical” rules
- Math – people describe mathematics (and chemistry and physics) using words, numbers and math-grammar
- Logic – people describe thought, reasoning and order using language
- Emotions
- Intuition
- Opinions – all appear in language
This is a partial list. And all these things do not appear at random.
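The point can be made concrete with a toy of my own (a hypothetical example, nothing like a real LLM): even a crude bigram model “trained” on a few sentences picks up word-order regularities nobody explicitly programmed – a faint shadow of what language models do at scale.

```python
import random
from collections import defaultdict

# A hypothetical miniature "corpus of wild language".
corpus = ("we love eating apples . we love our mother . "
          "we write code . we write math .").split()

# Record which word follows which (a bigram table).
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start, n, seed=0):
    """Sample n words, each chosen among the observed successors of the last."""
    random.seed(seed)
    words = [start]
    for _ in range(n):
        words.append(random.choice(follows[words[-1]]))
    return " ".join(words)

print(generate("we", 3))  # word order mirrors the corpus, though no grammar was programmed
```

Scale the corpus from four sentences to trillions of words, and far more than word order comes along for the ride.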
I remember a book I read as a teenager called The Notebook (Le Grand Cahier) by Ágota Kristóf. It is written in the first person plural and is deliberately void of emotions (as the twin protagonists say: “We don’t understand why we say ‘I love eating apples’ and ‘I love my mother’ with the same verb”). It is so uncommon that my usually-challenged memory has held this quote for decades.
So one can expect LLMs will ‘emerge’ the abilities to –
- Write code
- Excel at math (and chemistry and physics)
- Display logic
- Show emotions
- Have intuition
- Have opinions
Where we stand now is at number 3 out of the 6.
AIs are already proficient in code, math, and logic, and these skills are expected to improve further.
Emotions, intuition, opinions – expect them to follow. Though I can imagine why their creators (OpenAI, Google, Apple) might consider suppressing them (while Elon Musk’s x.ai Grok won’t…).
Does that mean they adopted these abilities? Evolved traits to better suit an environment of the humans surrounding them? That their creators, god-like, “designed” these traits?
No.
These are emerging capabilities “jumping out” of complexity, with their origins hidden in the language sources the LLMs are fed.
This is also where the difference is.
Emergence is No Organ, But Does it Matter?
Living creatures are built of “things” – organs, chemicals, electricity, environment, community, biome – which produce the complexity from which sometimes emerge new things – such as language.
Reverse-engineering language does not create the “things” – it creates their shadows. The organs, chemistry, electricity and environment are not there to produce emotions or motivation.
Does it matter? Maybe the shadow is detailed enough to be “the thing”? After all – we emerged the ability to fly without flapping anything and surpassed every bird in the sky.
I argue that it does make a difference (picking sides in Plato’s Dilemma).
No matter how many works of art we see – we do not emerge artist-abilities. We do not become motivated, talented or driven from examples.
So if I were Microsoft, with a multi-billion-dollar stake in the matter, I would ask “yes, but does it have an AGI organ?”. The rest of us can expect a wild ride, with many human-like abilities emerging from old-school language-based A(G)I.