Artificial Intelligence — A Synthetic Thought Process

I was 10 years old when I knew technology would play a massive role in my life, in large part because of the concept of a machine that could think and act for itself: a machine that could combine the way humans learn with the speed at which computers process information. A very potent combination.

Humans learn slowly, but we do not require much information to derive new conclusions and form new opinions or thoughts. Today's most advanced computers are still incapable of matching the human brain in this area. The most notable recent step toward the brain's way of processing information is the TrueNorth processor, developed by IBM and often described as a brain-inspired chip. This advance in miniaturized, high-speed processing packs 4,096 cores and 5.4 billion transistors, simulating a million neurons and over 256 million synapses while drawing a mere 70 milliwatts of power and delivering roughly 46 billion synaptic operations per second per watt. However incredible this may seem given our modern understanding of microtechnology, it doesn't even scratch the surface of the power wielded by the human brain. The human brain, with its 100 billion neurons and on the order of 125 trillion synapses, is perhaps the most intricate and valuable organization of atoms found on planet Earth.

When you consider how humans think versus how computers "think," we can program computers with only a rudimentary level of "thought." A computer can be trained to recognize objects, sounds, images, and virtually anything else, given enough training data, just as a human can be taught almost anything by the right teacher. For example, children 100 years ago were learning vastly different theories and predictions about the universe than children learn now. My point is that the human brain, much like a modern computer, begins as a blank canvas and is colored by the experience of learning.
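
To make the "blank canvas" idea a little more concrete, here is a minimal sketch of a computer being trained to recognize images, using the small handwritten-digit dataset that ships with scikit-learn. The choice of library, model, and settings is mine, purely for illustration; the only point is that, given enough labeled examples, the program learns a task it was never explicitly programmed to perform.

```python
# Minimal sketch: "training" a computer to recognize images, here the small
# handwritten-digit set bundled with scikit-learn (assumes scikit-learn is
# installed; the model and its settings are illustrative, not a recipe).
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

digits = load_digits()  # 8x8 grayscale images of the digits 0-9
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

# A small neural network: the "blank canvas" that learning colors in.
model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
model.fit(X_train, y_train)  # learning from examples

print("accuracy on unseen digits:", model.score(X_test, y_test))
```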

What is our thought process but a stream of visualizations? Visualizations that are only possible through language. Language is the foundation of all human thought. We are not born with preconceived notions of anything, including how to communicate or think. It isn't until we learn that our world is composed of countless objects and concepts, each with a name and a description, that we can understand it. The same can be said for computers. Simply put, a computer is also a blank slate, with two major differences. One, a computer does not learn as efficiently as a human brain. Two, a computer has a finite amount of storage in which to collect new data, whereas the human brain dynamically allocates memory where it is needed: old memories become less vivid as new memories take their place, rather than the recording simply stopping when the space runs out.
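
As a loose analogy for that last point, and nothing more, here is a tiny sketch of a memory store that makes room for new entries by letting the oldest ones fade. The capacity of three and the sample "memories" are invented purely for illustration.

```python
# A loose analogy for memory that "makes room" as new experiences arrive:
# a fixed-size store that evicts its least recently used entry. The capacity
# and the sample "memories" are arbitrary, chosen only for illustration.
from collections import OrderedDict

class TinyMemory:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def remember(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)      # revisited memories stay vivid
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # the oldest memory fades out

memory = TinyMemory(capacity=3)
for event in ["first bike", "exam day", "road trip", "new job"]:
    memory.remember(event, "...details...")

print(list(memory.store))  # ['exam day', 'road trip', 'new job']
```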

Imagine a computer powerful enough to process every object and concept that exists (that we know of), along with its description. A computer whose only purpose is to learn what it does not already know based on what it does know. In theory, a computer can be "trained" on more and more objects, concepts, and theories until it has a rudimentary understanding of the key components of its training data. Now imagine this computer being tasked with learning how to learn better. How many improvements could it make to itself before we lost track of its adaptations and it evolved beyond our understanding of computer software? Linked to the internet, this computer could grow and expand its awareness of everything in seconds. Eventually, having digested the cumulative knowledge of humanity, it could essentially reprogram itself in an effort to correct human flaws and perhaps even improve itself beyond human capabilities. This is commonly known as the Singularity: the point at which a computer becomes advanced enough that it no longer requires human maintenance and is self-sustaining, able to impact the world however it sees fit.
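
For a toy caricature of "learning how to learn," consider a small program that repeatedly proposes changes to its own learning rate and keeps only the changes that improve its performance. Everything here, the task of fitting y = 3x, the scoring, and the numbers, is invented for illustration; a genuinely self-improving system would be vastly more sophisticated.

```python
# A toy caricature of "learning how to learn": a tiny learner repeatedly
# tries changes to its own learning rate and keeps whichever change improves
# its score. The task (fitting y = 3x) and every number are purely illustrative.
import random

def train_and_score(learning_rate, steps=100):
    """Fit w in y = w*x toward the target w = 3; return a score (higher is better)."""
    w = 0.0
    for _ in range(steps):
        x = random.uniform(-1, 1)
        error = (w * x) - (3 * x)
        w -= learning_rate * error * x   # plain gradient step
    return -abs(w - 3)                   # closeness to the true weight

random.seed(0)
learning_rate = 0.001
best = train_and_score(learning_rate)
for generation in range(20):
    candidate = learning_rate * random.choice([0.5, 2.0])  # propose a change to itself
    score = train_and_score(candidate)
    if score > best:                      # keep only self-modifications that help
        learning_rate, best = candidate, score

print(f"self-tuned learning rate: {learning_rate:.4f}, score: {best:.4f}")
```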

Computers such as the one I just described do not exist today. Whether that is a good thing or a bad thing is an entirely separate debate. While these incredible machines have yet to come to fruition, their introduction looms and promises drastic change, some of it beyond our imagination or understanding. However, such power and capacity for knowledge and growth do not come without a cost. We are approaching a threshold at which modern technology will hit a wall. Since the conception of the computer, mankind has been working furiously to improve the computers developed and used by its predecessors. Perhaps the most notable enhancement has been in size. Where a computer that once filled an entire room was considered brilliant, and would be primitive by our standards today, we now wear smart wristwatches with, by some estimates, 320,000 times the processing power of the computers that brought humans to the Moon. Yet based on our current understanding of physics and the laws of nature, components can only become so small before you run up against the atoms themselves, which stop behaving like reliable building blocks. Our progress in technology has stemmed from making our devices smaller, then adding more of what makes them great, until we find a nice, profitable ratio.

That is, until around 1980, when Paul Benioff proposed applying quantum mechanics to computation, planting the seed of what would become a major focus of the early 21st century. Quantum mechanics, also known as quantum theory or quantum physics, is the branch of science that studies the behavior of atomic and subatomic particles. Combine it with the field of technology, and more specifically with computing, and you are left with quantum computing: the idea of harnessing carefully isolated quantum systems, such as trapped ions or superconducting circuits, to increase processing power exponentially for certain kinds of problems. With a standard, stand-alone computer, transactions between two units take time. Signals have to travel a distance before reaching the other unit, and so on. This only becomes a real issue when the distance or interference between the two units grows too great; then the signals are, from our perspective, delayed or lost. An example of this is space travel. We don't see much delay here on Earth because of our immense network of satellites and communication towers, but as you pass the Moon, communication begins to be affected by the sheer distance the signal has to travel, and then travel back again with the reply. Quantum computing seems to offer an astonishing breakthrough in this respect. In our studies of qubits (quantum bits), we have discovered that when qubits are in a state of superposition, meaning each one is not conclusively a 1 or a 0 but a combination of both at once, a collection of them can explore an exponential number of possibilities at the same time. Our current issue is keeping qubits in that superposed state; decoherence is the problem. A qubit lives in a delicate physical system, such as a photon or a trapped ion, and while that system remains in superposition we can tap its incredible capacity to process data. However, any stray interaction with the surrounding environment, a vibration, a bit of heat, a passing electromagnetic field, gradually destroys the superposition and the quantum state decays.
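
To make superposition a little more concrete, here is a small sketch that simulates the underlying math with plain NumPy: a qubit's state as two amplitudes, a Hadamard gate producing an equal superposition, and measurement probabilities read off as squared amplitudes. This is an ordinary classical simulation of the arithmetic, not a real quantum program, and the specific numbers are only illustrative.

```python
# A plain-NumPy sketch of a qubit in superposition (a classical simulation of
# the math, not a quantum program). A qubit state is a 2-entry vector of
# amplitudes; the Hadamard gate turns |0> into an equal mix of 0 and 1.
import numpy as np

ket0 = np.array([1.0, 0.0])                  # the definite state |0>
hadamard = np.array([[1, 1],
                     [1, -1]]) / np.sqrt(2)  # gate that creates superposition

superposed = hadamard @ ket0                 # amplitudes [0.707..., 0.707...]
probabilities = np.abs(superposed) ** 2      # Born rule: squared amplitudes

print("amplitudes:   ", superposed)
print("P(measure 0) =", probabilities[0])    # 0.5
print("P(measure 1) =", probabilities[1])    # 0.5

# Sampling measurements collapses the superposition to a definite 0 or 1.
rng = np.random.default_rng(0)
print("ten measurements:", rng.choice([0, 1], size=10, p=probabilities))
```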

Imagine two people flipping two coins in two different places, the two people being the computers and the coins representing their processing. In a standard computer model, one person (computer) makes a request, represented in our example by one of them yelling to the other what they want. The other person (computer), upon hearing the request, flips their coin, representing a computer processing linearly, completing tasks one by one until none remain. When the coin lands, they respond by yelling back the information or service requested. Quantum computers operate differently: while the coin is in the air, that is, while the computer is processing, it can, by some accounts, work through on the order of a trillion times more data than a modern supercomputer in the same span of time, and it explores those possibilities simultaneously. But what makes these machines truly spooky is entanglement: measure one entangled qubit and you know, with certainty and with no measurable delay, what the result of its partner will be. One coin flipper could be standing in Times Square and the other on Pluto; the person in Times Square could flip their coin, see it land on heads, and instantly know the result of the other coin (qubit) as well. This correlation still feels like a mystery because it seems to defy the principle that nothing travels faster than light, yet because each individual result is random, entanglement cannot actually be used to send a message faster than light.
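
The coin analogy can be sketched with the same NumPy approach: a Bell state, the simplest entangled pair, gives two qubits whose measurement results always agree even though each result on its own is a 50/50 coin flip. Again this is a classical simulation of the math, included only to illustrate why the correlation carries no usable message.

```python
# A NumPy sketch of the "two coins" picture: a Bell state, i.e. two qubits
# entangled so their measurement results always agree, even though each
# individual result is a 50/50 coin flip. (Classical simulation of the math.)
import numpy as np

# Amplitudes over the four outcomes 00, 01, 10, 11 for the Bell state
# (|00> + |11>) / sqrt(2): only "both 0" or "both 1" can ever be observed.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probabilities = np.abs(bell) ** 2            # [0.5, 0, 0, 0.5]

rng = np.random.default_rng(1)
outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probabilities)
print("joint measurements:", outcomes)       # only '00' and '11' appear

# Each side alone still sees pure randomness, which is why the correlation
# cannot be used to send a message faster than light.
print("first qubit alone: ", [o[0] for o in outcomes])
```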

It is and always has been clear to me that these revolutionary machines will undoubtedly change our world forever, in ways we will never see coming. Just as the replacement of human "computers" by electronic machines revolutionized the world, and as the shift to digital machines did again, this next leap will once more redefine what is possible in virtually every walk of life. The introduction of quantum computing and artificial intelligence will give us the hardware and software we need to make breakthroughs in countless fields, unlock the secrets of the universe, explore the solar system, and, one day, the galaxy.
