As the saying goes, forecasting is difficult, particularly about the future. On one hand, studies of what the future may bring are constrained by our current knowledge, which tends to see progress as a linear extrapolation of recent trends. On the other hand, sweeping claims are made about emerging technologies and how much they will shape our lives. Usually, neither is right. Instead, disruptive technologies come along and lead us down paths we cannot foresee, and those technologies are rarely the ones trumpeted as the way forward.
The reasons for this are often complex, but the crux of the matter is that, in a market economy, it is the consumer who decides; for infrastructure projects, where the ‘consumer’ is the government acting on behalf of the country, economics is (eventually) the decider. Of course, there are always a few enthusiasts for a particular development who correctly call its success, but there are many more failed technologies that were at least as strongly supported.
When something new comes along, companies and governments often cannot see its true potential because they view it simply as a better replacement for what went before. Take, for example, the Xerox machine. Before the photocopiers we now take for granted, copying was a wet process using purpose-made stencils. It was messy and expensive, and many documents existed only as the original typed copy plus a couple of carbon copies (hence the ubiquitous ‘cc’ in emails, doubtless a popular question in future trivia quizzes).
Xerox copiers did not just replace messy wet copiers; they transformed the market. Many more copies were made, simply and quickly, and the Xerox Corporation grew rich on the proceeds. Investing in new technology, the company is credited with developing the first proper example of what we now recognise as the personal computer. However, they clearly did not see its real potential, and allowed the young Steve Jobs to tour their development facility, take note of whatever he wanted and poach their key staff.
Apple itself, while now an enormously successful global company, was technically innovative but was in its early days a niche player, at least in part because of its pricing strategy. The standard PC architecture was developed by the dominant company in the computer sector at the time, IBM. IBM – Big Blue – made its money from mainframe computers, massive beasts in air-conditioned rooms, programmed via punch cards. They, in common with other companies in the sector, saw a very limited market for the personal computer: who would want one, and what would they do with it?
Well, now we know. Everyone has one, increasingly in the form of a smartphone in their pocket, and we use them constantly for a whole range of things. But at the time IBM did not keep control of the technology, leaving upstarts such as Intel and Microsoft to move in and supply components and standardised software to other manufacturers. IBM missed the boat and no longer makes personal computers at all.
In the energy sector, meanwhile, governments are betting (with our money) that electricity generation networks can be transformed into reliable, secure, low-emissions systems despite being based on intrinsically intermittent renewable technologies. No matter how much the technology of harvesting solar and wind energy develops, the Sun always sets and the winds are not ours to command.
Perhaps mindful that renewables at their present stage of development (that is, without a massive energy storage network) cannot be relied upon, governments have also chosen another ‘winner’, Carbon Capture and Storage (CCS). The theory is that we can continue to burn coal and gas but reduce emissions by removing the carbon dioxide from the flue gases. Removal itself is the easy bit, if rather expensive; capturing the CO2 and compressing it for storage takes a lot of energy, reducing the effective output of a power station by perhaps 20%.
The more difficult bit is storage. In principle, liquefied or compressed CO2 can be injected into appropriate underground rock formations for permanent storage. This is already done to some extent as a way to increase extraction from oil fields whose productivity is declining, but each major source of carbon dioxide would need a dedicated pipeline or other system to carry it to the point of injection, and development of each reservoir would be unique and costly. The alternative of injection beneath the deep seabed is an unproven and potentially hazardous solution.
It is possible that support for CCS is primarily window dressing to assuage the lobbyists; suffice it to say we are no further forward than we were a decade or two ago. Meanwhile, taxpayers are funding grants to buyers of hybrid and electric cars, whose batteries make them considerably more expensive than their conventional equivalents. The emissions avoided by this shift from internal combustion engines are modest at best while the greater part of the electricity is still generated by burning fossil fuels.
Not that any of these technologies is in itself bad; knowledge is essentially neutral, and how it is applied is what matters. So, in a world where abundant clean electricity were available (for example, from nuclear generation), hybrid vehicles might make more sense. Even fully electric vehicles could be the way forward if anyone had a good answer to the problem of battery swapping or recharging across the entire road network. And in polluted urban areas there is indeed a strong case for electric cars, quite apart from their wider pros and cons.
It is the role of government to encourage innovation within a broad policy framework. Rather than deciding that the drive to reduce carbon dioxide emissions must come via an expansion of solar and wind energy, mandated use of transport biofuels and CCS (none of which has yet delivered the promised environmental benefits, and none of which has any chance of being economically viable without public subsidy for the foreseeable future), policymakers need only set a top-level target and introduce the simplest possible incentives to encourage competition between innovative potential solutions.
In this way, the technologies currently chosen could compete on a level playing field against alternatives, with public acceptance being the measure of success. This does not even mean that the winner would necessarily be the least expensive option: people base their choices on perceived value rather than price alone, as Apple have proven several times over. If open innovation is encouraged, the disruptive technologies of the future will emerge naturally.