Knowing which bets to place when adopting emerging technologies is impossible. But it's still worth trying. Betting on the winners of the next generation, not as an investment but by incorporating them into your life or work, can give you a head start and future-proof you, and for that reason alone it is well worth attempting.
Most of what was recently considered cutting-edge, such as artificial intelligence and machine learning, is already finding its way into production systems. Sometimes you have to look far ahead to anticipate the next wave, and the farther out you look, the riskier the bets become.
Here are seven next-horizon ideas that might prove to be crackpot schemes or might be savvy plays for business value emerging along the fringe; it all depends on your perspective. William Gibson is fond of saying that the future is already here, it's just not evenly distributed yet. These ideas may be too insane for your team to try, or they may be just the right thing for moving forward.
Off-the-grid mesh networks

For the past few decades, the internet has been the answer to communications problems: just hand the bits to the internet and they'll get there. It's a good solution that works most of the time, but it can be fragile and, when cellular networks are involved, fairly expensive.
Some hackers have been moving off the grid by creating their own ad hoc networks using the radio electronics that are already in most laptops and phones. The Bluetooth stack will link up with other devices nearby and move data without asking "mother may I" of some central network.
Enthusiasts dream of creating elaborate local mesh networks built out of nodes that pass along packets of bits until they reach the right corner of the network. Ham radio hobbyists have been doing it for years.
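The node-to-node relaying those enthusiasts describe can be sketched in a few lines. This toy simulation floods a packet outward from its source until it reaches its destination; the festival topology and node names are invented for illustration, and real mesh protocols (ham packet radio, B.A.T.M.A.N., and the like) use far smarter routing:

```python
from collections import deque

def deliver(links, src, dst):
    """Return the hop path a relayed packet takes from src to dst.

    Breadth-first flooding: each node hands the packet to every
    unvisited neighbor until the destination is reached.
    """
    frontier = deque([[src]])
    seen = {src}
    while frontier:
        path = frontier.popleft()
        if path[-1] == dst:
            return path
        for neighbor in links.get(path[-1], []):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append(path + [neighbor])
    return None  # no route: the mesh is partitioned

# Hypothetical festival: each phone is only in radio range of its neighbors.
links = {
    "stage": ["food"],
    "food": ["stage", "gate"],
    "gate": ["food", "lot"],
    "lot": ["gate"],
}
print(deliver(links, "stage", "lot"))  # ['stage', 'food', 'gate', 'lot']
```

No node here talks to a central server; the packet hops phone to phone, which is the whole appeal when the cell towers are overloaded or absent.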
Potential early adopters: Highly localized applications that group people near each other. Music festivals, conferences, and sporting events are just some of the obvious choices.
Potential for success in five years: High. There are several good projects and many open source experiments already running.
Green AI

The buzzwords "green" and "artificial intelligence" go well together, but AI algorithms require computational power, and at some point computational power is proportional to electrical power. The ratio keeps improving, but AIs can still be expensive to run, and generating that electrical power can produce tons of carbon dioxide.
There are two strategies for solving this. One is to buy power from renewable energy sources, a solution that works in some parts of the world with easy access to hydro-electric dams, solar farms or wind turbines.
The other approach is to simply use less electricity, a strategy that can work even when questions arise about the green power. (Are the windmills killing birds? Are the dams killing fish?) Instead of asking the algorithm designers to find the most awesome algorithms, ask them to find the simplest functions that come close enough. Then ask them to optimize this approximation to put the smallest load on the most basic computers. In other words, stop dreaming of a million-layer model trained on a dataset with billions of examples and start constructing solutions that use less electricity.
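A back-of-the-envelope sketch shows why "close enough" pays off: a network's energy use scales roughly with its multiply-accumulate (MAC) operations, so shrinking the layers shrinks the power bill. The layer sizes below are invented for illustration:

```python
def macs(layer_sizes):
    """Multiply-accumulate operations for one forward pass
    through a fully connected network with these layer widths."""
    return sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))

big = [1024, 4096, 4096, 1000]    # the dream model
small = [1024, 256, 256, 1000]    # the "close enough" approximation

print(macs(big))                  # 25067520
print(macs(small))                # 583680
print(f"{macs(big) / macs(small):.1f}x")  # 42.9x fewer operations
```

If the small network's answers are acceptable, every inference costs roughly a fortieth of the compute, which is the alignment between the bean counters and the environmentalists in action.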
The real secret force behind this drive is alignment between the bean counters and the environmentalists. Simpler computations cost less money, and they use less electricity, which means less stress on the environment.
Potential early adopters: Casual AI applications whose economics can't support expensive algorithms.
Potential for success in five years: High. Saving money is an easy incentive to understand.
Quantum computing

Of all the out-there technologies, nothing gets more press than quantum computers, and nothing is spookier. The work is done by a mixture of physicists and computer scientists fiddling with strange devices at temperatures near absolute zero. If it requires liquid helium and lab coats, well, it's got to be innovation.
The potential is huge, at least in theory. The machines can work through bazillions of combinations in an instant, delivering exactly the right answer to a mathematical version of Tetris, an answer that would take millions of years of conventional cloud computing to find.
Cynics, though, point out that 99 percent of the work that we need to do can be accomplished by standard databases with good indices. There are few real needs to look for strange combinations, and if there are, we can often find perfectly acceptable approximations in a reasonable amount of time.
The cynics, though, are still looking at the world through old glasses. We've only tackled the problems we could solve with old tools. If you've got something your programmers say is impossible, trying out IBM's Q Experience quantum cloud service may be just the right move. Microsoft has also launched Azure Quantum for experimentation, and AWS is following suit with Braket.
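To see what "exponentially growing combinations" means in practice, here is the classical brute force that quantum machines promise to shortcut: a subset-sum search that may have to check up to 2^n subsets. The weights and target are invented toy numbers:

```python
from itertools import combinations

def subset_sum(weights, target):
    """Exhaustively search every subset for one hitting the target.

    Worst case checks all 2**len(weights) subsets; add one more
    weight and the search space doubles. That exponential wall is
    what quantum algorithms hope to climb.
    """
    checked = 0
    for r in range(len(weights) + 1):
        for combo in combinations(weights, r):
            checked += 1
            if sum(combo) == target:
                return combo, checked
    return None, checked

weights = [3, 34, 4, 12, 5, 2]
combo, checked = subset_sum(weights, 9)
print(combo, checked)  # (4, 5) found after 18 subsets
```

Six weights are trivial; at a few hundred options, no amount of classical cloud time finishes the worst case, which is exactly the regime the quantum vendors are targeting.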
Potential first adopters: Domains where the answer lies in the search for an exponentially growing combination of hundreds of different options.
Potential for success in five years: Low. Google and IBM are warring with press releases. Your team will spend many millions just to get to the press release stage.
Rethinking the keyboard

The world has been stuck with the old QWERTY keyboard since it was designed to keep typewriter arms from jamming. We don't need to worry about that anymore. Some people have imagined rearranging the keys to put the most common letters in the most convenient, fastest locations. The Dvorak keyboard is just one example, and it has fans who will teach you how to use it.
A more elaborate option is to combine multiple keys to spell out entire words or common combinations. This is how the stenotype machines used by court reporters keep accurate transcripts; just to pass the qualifying exam, new reporters must be able to transcribe more than 200 words per minute, and good transcriptionists are said to handle 300.
One project, Plover, is building tools for making regular keyboards work like stenotypes. If it catches on, there could be an explosion in creative expression, not merely a faster proliferation of interoffice memos and fine print.
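The core idea of chorded input can be sketched simply: several keys pressed at once look up a whole word. The dictionary below is a made-up illustration; Plover's real dictionaries use proper steno theory and notation:

```python
# Hypothetical chord dictionary: a set of simultaneously pressed keys
# maps to a full word. Entries here are invented for illustration and
# are not real steno theory.
CHORDS = {
    frozenset("KAT"): "cat",
    frozenset({"T", "H", "E"}): "the",
    frozenset({"W", "R", "D"}): "world",
}

def stroke(keys):
    """Translate one simultaneous key press (a chord) into text.

    Unknown chords fall back to the raw keys, the way a steno
    system shows an untranslated stroke.
    """
    return CHORDS.get(frozenset(keys), "".join(sorted(keys)))

print(stroke({"T", "H", "E"}), stroke("KAT"))  # the cat
```

One stroke per word instead of one press per letter is where the 200-plus words per minute come from.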
Potential first adopters: Novelists, writers, and social media addicts.
Potential for success in five years: Medium. Two-finger typing is a challenge for many.
Build your own cloud (or cloud-in-a-box)
Wait, weren't we supposed to be rushing to move everything to the cloud? When did the pendulum change direction? When some businesses started looking at monthly bills filled with thousands of line items. All of those pennies per hour add up.
The cloud is an ideal option for sharing resources, especially for work that is intermittent. If your load varies dramatically, turning to the public cloud for big bursts in computation makes plenty of sense. But if your load is fairly consistent, bringing the resources back under your roof can reduce costs and remove any worries about what happens to your data when it’s floating around out in the ether.
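The trade-off above is easy to sketch as a break-even calculation. Every price here is an invented placeholder, but the shape of the math is the point: cloud bills scale with utilization, while owned hardware costs the same whether idle or busy.

```python
HOURS_PER_MONTH = 730  # average hours in a month

def cloud_monthly(instances, price_per_hour, utilization):
    """Cloud cost: you pay only for the hours you actually run."""
    return instances * price_per_hour * HOURS_PER_MONTH * utilization

def onprem_monthly(instances, server_cost, lifetime_months, ops_per_month):
    """On-prem cost: hardware amortized over its life, plus operations,
    regardless of how busy the machines are."""
    return instances * (server_cost / lifetime_months + ops_per_month)

# Hypothetical shop: 10 servers, $0.50/hr cloud instances vs. $8,000
# machines amortized over 3 years with $60/month of care and feeding.
for util in (0.2, 0.5, 0.9):
    cloud = cloud_monthly(10, 0.50, util)
    owned = onprem_monthly(10, 8000, 36, 60)
    winner = "cloud" if cloud < owned else "on-prem"
    print(f"{util:.0%} utilization: cloud ${cloud:,.0f} vs on-prem ${owned:,.0f} -> {winner}")
```

With these placeholder numbers, bursty 20 percent utilization favors the cloud and steady 90 percent utilization favors owned hardware, which is exactly the pattern driving the repatriation trend.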
The major clouds are embracing solutions that offer hybrid options for moving data back on premises. Some desktop boxes come configured as private cloud servers ready to start up virtual machines and containers. And AWS recently announced Outposts, fully managed compute and storage racks that are built with the same hardware Amazon uses in its datacenters, run the same workloads, and are managed with the same APIs.
Potential first adopters: Shops with predictable loads and special needs for security.
Potential for success in five years: High. Some are already shifting load back on premises.
Homomorphic encryption

The weak spot in the world of encryption has been using the data. Keeping information locked up with a secure encryption algorithm is straightforward; the standard algorithms (AES, SHA, DH) have withstood sustained assault from mathematicians and hackers for years. The trouble is that to do anything with the data, you must unscramble it, and that leaves it sitting in memory, prey to anyone who can sneak through a garden-variety hole.
The idea behind homomorphic encryption is to redesign computational algorithms so they work directly on encrypted values. If the data is never unscrambled, it can't leak. Plenty of active research has produced algorithms with varying degrees of utility. Some basic schemes can accomplish simple tasks, such as looking up records in a table. Fully general arithmetic is trickier: the algorithms are so complex that their overhead runs orders of magnitude beyond computing on plaintext. If your computation is simple, though, you may find it's safer and simpler to work with encrypted data.
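Additive homomorphism is concrete enough to demonstrate. Below is a toy version of the Paillier cryptosystem, in which multiplying two ciphertexts adds the hidden plaintexts. The primes are absurdly small so the arithmetic is readable; a real deployment would use a vetted library and thousand-bit keys, never this sketch:

```python
import math
import random

# Toy Paillier keypair (tiny primes for illustration only; NOT secure).
p, q = 47, 59
n = p * q
n2 = n * n
g = n + 1                        # standard generator choice g = n + 1
lam = math.lcm(p - 1, q - 1)     # Carmichael's lambda(n)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # decryption helper

def encrypt(m):
    """Encrypt m (0 <= m < n) with fresh randomness r."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

def add_encrypted(c1, c2):
    """Multiplying ciphertexts adds the underlying plaintexts."""
    return (c1 * c2) % n2

a, b = encrypt(20), encrypt(22)
print(decrypt(add_encrypted(a, b)))  # 42, computed without ever decrypting a or b
```

The sum was produced by a party that never saw 20 or 22 in the clear, which is exactly the property that lets a cloud tally encrypted records it cannot read.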
Potential first adopters: Medical researchers, financial institutions, data-rich industries that must guard privacy.
Potential for success in five years: Varies. Some basic algorithms are commonly used to shield data. Elaborate computations are still too slow.
Blockchains beyond bitcoin

The headlines focus on the dramatic rise and fall of bitcoin's value, but in the background developers have created dozens of different uses for blockchains, immortalizing complex transactions and digital contracts. Folding this functionality into your data hierarchy can bring much-needed assurance and certainty to existing processes.
The challenge is deciding among the philosophical approaches. Do you want to rely on proof of work or on a looser consensus that evolves from a trusted circle? Do you want to fret over elaborate Turing-complete digital contracts, or just record transactions in a shared, trustworthy ledger? Sometimes a simple API that offers timely updates is enough to keep partners synchronized; a few digital signatures guaranteeing database transactions may be all you need. There are many options.
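The "shared, trustworthy ledger" end of that spectrum can be sketched without any consensus machinery at all: a hash chain in which each entry commits to the previous entry's hash, so tampering with any record breaks every later link. The transaction records are invented examples:

```python
import hashlib
import json

def entry_hash(record, prev):
    """Deterministic hash over a record plus the previous entry's hash."""
    payload = json.dumps({"record": record, "prev": prev}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append(ledger, record):
    """Add a record, chaining it to whatever came before."""
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    ledger.append({"record": record, "prev": prev,
                   "hash": entry_hash(record, prev)})

def verify(ledger):
    """Recompute every link; any edited record breaks the chain."""
    prev = "0" * 64
    for e in ledger:
        if e["prev"] != prev or e["hash"] != entry_hash(e["record"], e["prev"]):
            return False
        prev = e["hash"]
    return True

ledger = []
append(ledger, {"from": "acme", "to": "globex", "amount": 100})
append(ledger, {"from": "globex", "to": "acme", "amount": 40})
print(verify(ledger))                  # True
ledger[0]["record"]["amount"] = 999    # a frenemy quietly edits history
print(verify(ledger))                  # False: the tampering is detected
```

Hand each partner a copy and a way to sign new entries, and many of the disputes the ledger is meant to prevent simply can't arise, no proof-of-work mining required.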
Potential first adopters: Industries with tight, synchronized operations between businesses that don’t want to trust each other but must. These frenemies can use a shared blockchain database to eliminate some of the disputes before they happen.
Potential for success in five years: High. There are dozens of active prototypes already running and early adopters can dive in.