Energy Realities at the Nexus of Techno-Optimism

George Gilder's COSM Summit, an annual, glittering technology gathering, offers a glimpse of a future that excites many and frightens some. It is an optimistic view of the future, very different from what many speculative predictions assume. But one thing will not change in any future, whatever the technology: the central role of energy.

The physics of energy connects everything. That includes the topics of this year's COSM meeting: the unleashing of artificial intelligence into the wild, the prospects of graphene as an entirely new and revolutionary class of materials, and China's role on the world stage.

This may be obvious, but it bears noting that not just society, but life as we know it, and indeed the universe, do not exist without energy. Not to wax philosophical — though at Gilder's COSM that cannot be avoided — but all possible futures occur at the intersection of three fundamental realities: information, atoms, and energy. As George has said, the only thing that distinguishes our time from the Neanderthals' is what we know; the building blocks we are all made of are the same. Today we "simply" have far more information about the same atoms and forces that have always existed.

It is within our power to expand the body of information about how humanity can rearrange nature's atoms in unique and magical ways to create all the products and services of the present and future. But acquiring and processing that information, and arranging atoms according to it, always requires energy. Every innovation, product, and service that makes life fun, safe, convenient, entertaining, and even beautiful consumes energy.

And throughout history, inventors have found far more ways to use energy than to produce it. The invention of materials such as alloys, polymers, pharmaceuticals, and monocrystalline silicon created new energy demands for their production. Likewise, the invention of machines made from those materials — cars, airplanes, computers — brought new energy demands of its own.

Central to the inescapable nature of energy is the fact that our information machines are themselves energy-hungry. All software, including virtual reality, requires the reality of machines that consume electricity to perform logic. This may seem obvious, but one consequence is that the global cloud now uses energy on a scale comparable to global aviation. And the former is growing much faster than the latter.

That brings us to artificial intelligence, the centerpiece of this summit and a new, power-hungry way to use silicon engines. Although AI has been around for some time, November 30, 2022 — the day ChatGPT launched — is the date AI went mainstream.

AI is the most energy-intensive use of silicon ever. In terms of capability, it resembles the transition from the age of steamships to the age of jet aircraft. The latter made personal global travel vastly better, more convenient, and more productive, saving the most precious commodity in the universe: our time. Of course, flying is also a far more energy-intensive way to travel. So is AI.

AI naturally entails both a training phase — so-called machine learning — and an inference phase, in which the acquired knowledge is applied. Both training and inference consume serious energy, at minimum more than the conventional computing they displace, adding to society's energy use. For example, a few years ago a simple machine-learning algorithm trained to solve a puzzle, the Rubik's Cube, used enough electricity to drive a Tesla a million miles. And for many real-world tasks, training is not a one-time event. Then comes the inference that follows training, which, although less energy-intensive than training, is repeated often — sometimes continuously — and over time can consume more total energy than the training did. Think of it as the difference between the energy needed to make the aluminum and build an airplane, and then the fuel needed to fly it.
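The airplane-versus-fuel analogy can be made concrete with a toy calculation; every figure below is an invented assumption for illustration, not a measured value from the article:

```python
# Toy model: one-time training energy vs. cumulative inference energy.
# All numbers are hypothetical assumptions, chosen only to show the dynamic.

training_energy_kwh = 1_000_000      # assumed one-time training cost
energy_per_query_kwh = 0.01          # assumed per-inference cost
queries_per_day = 50_000_000         # assumed usage at scale

# Cumulative inference energy over a year of operation
inference_per_year_kwh = energy_per_query_kwh * queries_per_day * 365

# Days of operation until cumulative inference exceeds the training cost
breakeven_days = training_energy_kwh / (energy_per_query_kwh * queries_per_day)

print(f"Inference energy per year: {inference_per_year_kwh:,.0f} kWh")
print(f"Inference overtakes training after {breakeven_days:.1f} days")
```

Under these assumed numbers, the repeated "fuel" of inference surpasses the one-time "airplane" of training within days — which is why inference, not training, can dominate lifetime energy use.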

The full capabilities of artificial intelligence remain to be seen; both the software and the hardware are still in their infancy, roughly where mainframe computers were around 1980. Dystopians fear AI, but it is an exciting and productive new tool that will unleash all sorts of innovations — not just self-driving cars and robots, but greater productivity, new ways of making fundamental discoveries, and things yet unimagined. And the infrastructure that will be built and deployed to democratize AI will rival the build-out of the Internet, to borrow Andreessen Horowitz's phrase.

At a recent meeting of energy-company executives, Elon Musk gently teased them for underestimating the scale of coming electricity needs. His concern was not the power demands of electric cars but of artificial intelligence. For perspective: today's global cloud already consumes roughly ten times more electricity than all the world's electric vehicles combined. Even if EV adoption grows faster than the bulls believe, the cloud — especially now that AI hardware is being added to its infrastructure — will stay far ahead in power demand.

A regular response to observations about the energy appetites of computing — and now especially of artificial intelligence — is that innovation keeps making silicon technology more efficient. Of course it will. But efficiency does not curb the growth of energy demand; it fuels it. This fact is called the Jevons Paradox, and information systems are a striking example of it.

Consider that the energy efficiency of computing has improved more than a billionfold over the past 60 years. That is precisely why there are billions of smartphones and thousands of datacenters today. Had computing stayed at 1980s levels of energy efficiency, the power demands of today's devices would have been prohibitive. In other words, without those astonishing gains in energy efficiency, the smartphone and cloud era could not have happened.
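A minimal sketch of the Jevons Paradox dynamic described above — per-unit energy collapses, but usage grows even faster, so total demand still rises. Only the billionfold efficiency gain comes from the text; the usage figures are invented for illustration:

```python
# Jevons Paradox sketch: efficiency per operation improves a billionfold,
# yet total energy demand still grows because usage grows faster still.
# Usage figures are hypothetical assumptions.

energy_per_op_then = 1.0            # arbitrary energy units per operation, ~1960s
energy_per_op_now = 1.0 / 1e9       # a billionfold efficiency gain (per the text)

ops_then = 1e12                     # assumed global operations per year, then
ops_now = 1e23                      # assumed global operations per year, now

total_then = energy_per_op_then * ops_then
total_now = energy_per_op_now * ops_now

growth = total_now / total_then
print(f"Total energy demand grew {growth:.0f}x despite a 1,000,000,000x efficiency gain")
```

The point of the sketch: efficiency gains lower the cost per use, which expands use enough that aggregate demand rises anyway.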

Now something remarkable is happening, set in motion two decades ago by a serendipitous accident: the discovery of graphene, a groundbreaking new class of material made of almost unimaginably thin layers of pure carbon, just atoms thick, with magical properties. Graphene is already appearing in some commercial products. One possibility is to use graphene as a far more efficient base material to replace silicon in computer chips. Expect the Jevons Paradox to accelerate.

Graphene has many useful properties beyond conduction, relevant to both structures and biology. In one form it is stronger than steel. In another formulation, it shows promise in biochemical applications, including nerve regeneration. And graphene is just one — though perhaps the most celebrated — of a number of new classes of materials emerging from research laboratories.

But back to our topic: producing any material requires energy. Compared with centuries past, when everything was built from simpler materials such as stone, wood, and animal parts, the materials of our era require far more energy per kilogram to produce. Switch from wood to polymers — the latter used in medicine and far more useful than wood — and the energy cost per kilogram of product rises tenfold. Use aluminum instead of polymers and the energy cost per kilogram rises tenfold again. Semiconductor silicon is 30 times more energy-intensive still than aluminum. Producing one kilogram of such silicon takes 100 times more energy than producing a kilogram of steel. And the world produces kilotons of that silicon — the energy equivalent of producing a megaton of steel — not just for computer chips but also for solar cells.
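The compounding of those per-kilogram multipliers can be sketched directly from the figures in the paragraph above:

```python
# Chaining the per-kilogram energy multipliers cited in the text:
# wood -> polymers (~10x) -> aluminum (~10x) -> semiconductor silicon (~30x).
wood = 1.0               # baseline: relative energy per kg of wood
polymer = wood * 10      # polymers: ~10x wood, per the text
aluminum = polymer * 10  # aluminum: ~10x polymers
silicon = aluminum * 30  # semiconductor silicon: ~30x aluminum

print(f"Silicon embodies ~{silicon:,.0f}x the production energy of wood, per kg")
```

On the multipliers as stated, a kilogram of semiconductor-grade silicon embodies roughly 3,000 times the production energy of a kilogram of wood.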

As for graphene, we are just beginning to figure out how to produce it at scale. George Gilder has suggested that graphene production could be approaching an "aluminum moment." That refers to 1886, when inventors discovered a process — the Hall–Héroult process — for cheaply producing what had until then been an interesting but very expensive material. Pure aluminum was once more valuable than gold.

But as the literature shows, producing graphene is a challenge more akin to producing silicon than aluminum. So I would propose that graphene awaits not an "aluminum moment" but a "Czochralski moment." In 1916 the Polish metallurgist Jan Czochralski accidentally discovered how to grow single crystals from a melt. That discovery led directly to the commercial process for producing monocrystalline silicon, perfected at Bell Labs in 1949 — 33 years after the accidental discovery. Without monocrystalline silicon, there would be no silicon computer era. If the path from accidental discovery to a viable commercial process takes as long for graphene, we will be waiting another decade or more. But today, AI-infused supercomputing applied to materials, machines, and data could help compress that timeline.

The company — and country — that first produces commercially viable graphene will reap very real benefits. Which brings us to the third of the three themes of COSM 2023: China and the energy imperative.

Consider the situation across several important energy-intensive materials at once — those needed for energy-producing machines as well as for energy-consuming ones.

China produces more than 60% of the world's aluminum; refines more than half of the world's copper, the element at the core of 90% of all electrical devices; supplies 90% of the world's critical rare-earth elements, essential for many electric-car motors and wind-turbine generators and for countless high-tech applications, including solar panels; produces 90% of the world's pure gallium, used in lasers and light-emitting diodes; refines 60% of the world's lithium and 80% of the world's graphite, used in all lithium batteries; and supplies 50% to 90% of the key chemical components and polymers needed to make those batteries. There is more, but you get the gist.

China does not shy away from energy-intensive industrial materials; it set out two decades ago to become their main supplier. That leadership emerged at the intersection of three kinds of policies: first, policies that encouraged and incentivized students to pursue the old-fashioned chemical, electrical, and materials sciences in secondary and higher education; second, policies that encourage and accelerate the construction of large chemical and energy-intensive industrial facilities, rather than resisting and hindering them as we do in the United States; and third, policies that guarantee a reliable, low-cost energy supply to power those facilities. In China, the latter means a grid that is two-thirds powered by coal.

Now the US has enacted the Inflation Reduction Act (IRA), the largest industrial spending package in US history. The specific purpose of much of the IRA's spending is no secret: the goal is to cut this country's carbon emissions by promoting an energy transition away from hydrocarbons. Whatever you believe about climate change and carbon dioxide, there are two facts at the intersection of technology, politics, and energy worth keeping in mind.

The first fact:

The IRA's spending, now estimated at nearly $2 trillion, would — if the government's projections are correct — reduce U.S. carbon emissions by about 1 gigaton a year. Cost inflation aside, the bottom line on those theoretical reductions is that China, meanwhile, keeps building more coal-fired power plants, and President Xi has made clear it will continue to do so. That means China will enjoy decades of cheap power for its energy-intensive industries. And once completed, those additional coal plants will add another 2 gigatons a year to the world's CO2 emissions. Meanwhile, to theoretically save 1 gigaton here, a significant share of the nearly $2 trillion the IRA requires from taxpayers will go toward purchasing critical energy materials from China to build wind, solar, and battery facilities.
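The net arithmetic implied by the two projections above can be laid out in a few lines; both gigaton figures are the projections quoted in the text, not measurements:

```python
# Net global emissions arithmetic from the two projections cited in the text:
# the IRA's projected U.S. reduction vs. emissions from China's new coal plants.
us_reduction_gt_per_year = 1.0     # projected U.S. savings (government estimate)
china_addition_gt_per_year = 2.0   # projected additions from new coal plants

net_change_gt_per_year = china_addition_gt_per_year - us_reduction_gt_per_year
print(f"Net global change: +{net_change_gt_per_year:.0f} gigaton(s) of CO2 per year")
```

Taken at face value, the two projections net out to an increase, not a decrease, in global emissions.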

And the second fact:

With China and energy materials inevitably at the heart of energy-transition goals, it is worth noting where the world stands today — two decades and at least $5 trillion of global spending on wind and solar energy later, along with similar efforts to replace hydrocarbons and avoid oil, natural gas, and coal.

That spending has reduced the share of world energy supplied by hydrocarbons — but only by two percentage points. Today, hydrocarbons still supply 82 percent of the world's energy. And the combined contribution of solar and wind technologies currently provides less than 4% of global energy; for comparison, burning wood still supplies 10%. Meanwhile, over the past two decades the absolute quantity — not the share — of hydrocarbons used worldwide has increased, by an amount equivalent in energy terms to adding six Saudi Arabias of oil production.
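A share can fall even while absolute use rises, as a short sketch shows; the world-energy totals below are assumed for illustration, while the roughly two-point share decline follows the text:

```python
# How hydrocarbons' share can drop ~2 points while absolute use still grows.
# Total world energy figures are hypothetical assumptions, in relative units.
total_energy_then = 420.0    # assumed world energy use two decades ago
total_energy_now = 620.0     # assumed world energy use today (total grew)

hydrocarbons_then = total_energy_then * 0.84   # ~84% share then
hydrocarbons_now = total_energy_now * 0.82     # 82% share today (per the text)

grew = hydrocarbons_now > hydrocarbons_then
print(f"Absolute hydrocarbon use grew despite a falling share: {grew}")
```

Because the total pie expanded, a slightly smaller slice of it is still a bigger slice in absolute terms.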

The data make clear that "energy transition" spending has so far produced meager results. Whether we can and will spend even more to deploy non-hydrocarbon energy sources is a political question. But no amount of spending can escape the physics of energy materials — or the fact that China is the dominant supplier of those materials.

As the COSM 2023 meeting made clear, some truly amazing, transformative innovations are coming in computing and materials. But those revolutions demonstrate once again that inventors keep finding more ways to use energy than to produce it.

Big changes in the kinds and scales of energy-producing machines await as-yet-unknown breakthroughs, revelations, or unexpected discoveries. Such breakthroughs will surely come, but to quote Bill Gates, that kind of revolutionary progress is "unpredictable."

Meanwhile, AI infrastructure continues to expand, and it is not hard to imagine that someone will, before long, figure out how to produce graphene at scale. One can also hope — perhaps more wishfully — for a return of realism to industrial policy, given the geopolitics.
