It is, of course, a mangling of the English language, but it resonates because it’s true. Prognostication is a tricky business, usually best left to carnival hucksters and high-priced Wall Street analysts.

However, despite the pitfalls, we can make some reasonable assumptions. Some important digital trends have highly foreseeable outcomes. We have a good idea of which limits will be hit, what problems will follow and some concrete notions about how they will be solved. The future exists, it just hasn’t arrived yet.

The emergent near future

As I wrote in an earlier post, our technological near future is driven by four major forces:

1. The Cloud: In the PC era, innovation was driven by Moore’s Law. Processing speeds doubled every eighteen months, and that determined what we could get our hardware to do.

These days, however, Kryder’s Law, which projects a doubling in storage capacity roughly every year, and Nielsen’s Law, which sets the pace for bandwidth at a doubling roughly every 21 months, are the more decisive constraints (a quick back-of-the-envelope comparison of the three rates follows this list). We can now store huge amounts of data and access it from anywhere, and our ability to do so is increasing rapidly.

2. Evolution of Client-Side Scripting: In the early days, the Web was merely a bunch of static electronic pages linked together. You could access documents, but there was no functionality.

Then came server-side languages like PHP, which made pages dynamic (i.e. they changed each time they were generated). Client-side technologies such as JavaScript, jQuery and Flash then made it possible to actually manipulate the data on the page and watch video.

Today, mobile apps and HTML5 are enabling entirely new experiences. Not surprisingly, that’s where a lot of innovation effort is going.

3. Linked Data: As early as the late 1990s, Tim Berners-Lee recognized a shortcoming of the Web he had created. While it gave much greater access to the world’s information, that same information was still trapped inside incompatible databases. So he proposed a new, semantic web to unify them.

A decade later, linked data is finally gaining traction and is becoming an important technology driver in its own right.

4. The Mobile Explosion: As my agency, Moxie, described in a recent report, we are increasingly operating in a post-PC computing environment. We now carry around multiple connected devices, all of which have greater computing power than our office desktops did a decade ago.
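To get a rough sense of what those doubling rates mean, here is a minimal back-of-the-envelope sketch in Python. It simply compounds the doubling periods quoted above over a ten-year span; the periods are the figures cited in this post, not measured industry data.

```python
# Rough comparison of the three doubling rates cited above, compounded
# over a decade. The periods are the figures quoted in this post.

doubling_periods_months = {
    "processing (Moore's Law)": 18,
    "storage (Kryder's Law)": 12,
    "bandwidth (Nielsen's Law)": 21,
}

DECADE_MONTHS = 120  # ten years

for name, period in doubling_periods_months.items():
    doublings = DECADE_MONTHS / period
    growth = 2 ** doublings
    print(f"{name}: ~{doublings:.1f} doublings, ~{growth:,.0f}x growth per decade")
```

By these numbers, storage capacity grows roughly 1,000-fold over a decade and bandwidth roughly 50-fold, against roughly 100-fold for processing speed, which is why storage and bandwidth now set the pace for what the cloud can do.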

Those four trends are what drive innovation today, and the stuff coming online now, such as augmented reality, app-driven digital TV and mobile sharing sites like Instagram, combines two or more of them to create something truly new and exciting.

However, around 2020, all the rules will change and we will enter the realm of the unknown. What exactly happens then is anybody’s guess, but there are already early signs of what’s to come.


Moore’s Law hits a speed bump

As mentioned above, Moore’s Law predicts that processing speed doubles every eighteen months. It’s driven by how many transistors engineers can squeeze into a single integrated circuit. What happens when transistors shrink to the size of a single atom? The laws of physics will not let us make them any smaller.

That will happen by the end of this decade, and scientists are already working on a solution: quantum computing. The idea is to no longer encode information in the binary chains of ones and zeroes (called bits) that our computers run on now, but in qubits, quantum states that can represent a one, a zero or a superposition of both.
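To make the idea a bit more concrete, here is a minimal sketch in Python that simulates the arithmetic of a single qubit on an ordinary computer (this is only the math, not a quantum machine): a qubit is a pair of complex amplitudes, and a Hadamard gate turns a definite zero into an equal superposition of zero and one.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is a unit vector of two complex
# amplitudes; measurement yields 0 or 1 with probability |amplitude|^2.
zero = np.array([1, 0], dtype=complex)          # the |0> state

# The Hadamard gate puts a definite 0 into an equal superposition of 0 and 1.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposition = H @ zero
probabilities = np.abs(superposition) ** 2
print(probabilities)                             # [0.5 0.5]
```

The power comes from the fact that n qubits can hold a superposition of 2^n such states at once, which is part of what lets certain algorithms run dramatically faster than their classical counterparts.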

It seems fantastical, but the future is closer than you might think. This past spring, Lockheed Martin bought the first commercially available quantum computer. It’s still early days, but further developments are sure to follow.


E-commerce in peril

Vastly increased processing speeds are exciting, but like any advance they create their own problems. As I wrote in a post a few years back, quantum computing will make existing encryption methods obsolete.

Think about that for a minute. All of our electronic transactions need to be encrypted. If they weren’t, we certainly wouldn’t be willing to put our financial information online. Without secure encryption, e-commerce, along with many of our national security and diplomatic structures, would grind to a screeching halt.
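The reason quantum computers are such a threat is that today’s public-key encryption rests on math problems, such as factoring very large numbers, that classical computers can’t solve in any reasonable time but that Shor’s algorithm on a large enough quantum computer could. The toy sketch below, with absurdly small numbers for illustration only, shows how an RSA-style key pair depends entirely on the factors of the public modulus staying secret.

```python
# Toy RSA-style example. Real keys use primes hundreds of digits long;
# these tiny numbers are purely to show where factoring fits in.

p, q = 61, 53                  # secret primes
n = p * q                      # public modulus (3233)
phi = (p - 1) * (q - 1)        # derivable only if you can factor n
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert recovered == message

# An attacker who can factor n back into p and q can recompute phi and d.
# Shor's algorithm would make that factoring step fast on a quantum computer.
```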

Perhaps not surprisingly, scientists intend to solve the problem in much the same way as they intend to solve our Moore’s Law problem. There are several efforts underway to develop forms of quantum cryptography, based on the surreal EPR paradox, which would keep communications secure.


There’s plenty of computers at the bottom

In late December 1959, a young scientist named Richard Feynman got up to address the American Physical Society. His lecture was not the usual fare of decaying sub-atomic particles and obscure, Greek-letter-strewn formulas. Yet it would become one of the most significant and consequential scientific talks ever given.

It was entitled There’s Plenty of Room at the Bottom and it is an absolute delight. In that room a half century ago, speaking at roughly a middle school level, Feynman introduced the world to nanotechnology. Today, the technology is used to create new materials such as super-strong carbon structures called fullerenes.

The next step is nanocomputing. We will create devices out of microscopic components, and there will be computers as small as a grain of sand. In the future, we will literally be able to spray on information technology (and, in the case of cosmetics, possibly rub it on). This isn’t science fiction; as this article shows, the effort is already well underway.


A new organic evolution in the digital space

In 1945, in connection with the development of the EDVAC, the legendary John von Neumann published a report that described a computer architecture with a central processing unit and a memory unit that stores both programs and data. It became known as the von Neumann architecture and remains the dominant design in computers to this day.

Ironically, around the same time he developed cellular automata, mathematical constructs that mimic biological systems. Researchers today are using cellular automata and related concepts to develop a new type of organic computing architecture that can self-organize.

The idea is based on the realization that when we have computers that are not only vastly more powerful, but also vastly smaller, more connected and ubiquitous, present-day architectures will be unable to manage the complexity. In effect, programmers will write rules by which computers evolve the ability to optimize tasks themselves.
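To give a flavor of what “simple rules, emergent behavior” means, here is a minimal sketch of a one-dimensional cellular automaton (Rule 110) in Python. Each cell updates itself by looking only at its two neighbors, yet complex global patterns emerge with no central controller; this illustrates the principle, not any specific organic computing project.

```python
# One-dimensional cellular automaton (Rule 110). Each cell's next state is
# looked up from the rule number using its left neighbor, itself and its
# right neighbor -- purely local decisions, globally complex behavior.

RULE = 110
WIDTH, STEPS = 64, 32

row = [0] * WIDTH
row[WIDTH // 2] = 1            # start with a single live cell

for _ in range(STEPS):
    print("".join("#" if cell else "." for cell in row))
    row = [
        (RULE >> (4 * row[(i - 1) % WIDTH] + 2 * row[i] + row[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    ]
```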


The quirky digital paradigm

We are truly still at the dawn of the digital age, with more to come than we can possibly fathom. However, we can take some solace from looking at how widely off the mark previous visions of the future have been.

In the past, we were given fantasies of refrigerators ordering our milk for us and people marching in silver spandex suits down antiseptic hallways. None of that has come to pass. In fact, the digerati tend to be quirky, retro and organic. The reality is that we’ve used the digital technology we possess to give us more control.

The enhanced compression and storage of the cloud give us vastly more entertainment choices. Semantic technologies are freeing data to be used more efficiently. Mobile devices have freed us from our desks. Computers crunch the numbers so that we can concentrate on the math and logic.

Everywhere you look, we are deploying digital technology to express our individuality, use our resources more wisely and do what we want, where and when we want to do it. So while exactly what 2020 will look like remains anybody’s guess, it will truly be a world we make in our own image.

Greg Satell is a blogger and a consultant at the American online media site Digital Tonto. You can read his blog entries at http://www.digitaltonto.com