Minimum Energy (United) States
Interatomic Potentials, Regional Integration, and Eigenfederalism
Cloud first, then land.
Alexis de Tocqueville, reflecting on America’s early political development, had this to say:
“… it may be said that the township was organized before the county, the county before the state, the state before the union.”
America’s earliest political associations were forged at the local level. In line with geographical determinism, early colonists found themselves separated from their sovereign’s authority and protection by a vast ocean, and from their fellow colonists by a vast geographic expanse; as a consequence, day-to-day organization and governance were independent and local. The dynamic struggle between nested collectives, the ebb and flow of competing, sometimes conflicting, spheres of local and global influence, is exactly the story of the basic organizing principles of American government and of how that struggle has been resolved.
In 1643, the first American effort to create a political union among colonies began in Boston. The English Civil War made it nearly impossible for the New England settlers to receive aid from England, and the need to defend and secure a large territory necessitated the banding together of Massachusetts, New Plymouth, Connecticut, and New Haven into the New England Confederation, which lasted four decades until the colonies were absorbed into the new Dominion of New England under James II. The Confederation was initially described as:
“a firm and perpetual league of friendship and amity for offense and defense, mutual advice and succor upon all just occasions, both for preserving and propagating the truth and liberties of the Gospel and for their own mutual safety and welfare”
After a century, mutual pressing concerns (including relations with Native Americans and the possibility of a French attack) again created the need to confederate; delegates from the British North American colonies adopted the Albany Plan of Union in 1754. Each colony would select members of a Grand Council, and the British government would appoint a President General. Benjamin Franklin was one of the plan’s most prominent supporters, as depicted by his Join, or Die sketch for the Pennsylvania Gazette.
In addition to being the first political cartoon ever published in an American newspaper, it was also the earliest pictorial representation of American federalism. The idea was not to sever or weaken ties with England, but to increase British participation in the colonies, “… one whole, and not as different states with separate interests.” This set the precedent for the idea that colonies could join together to pursue mutual interests while simultaneously retaining individual power over day-to-day political activities.
This very image later became a rallying cry for the Revolutionary War (1775-1783). The difficulties of command and control, supply, fundraising, and capital allocation exceeded the capabilities of local governments. The recognition that some form of confederation was needed conflicted heavily with a deep suspicion of centralized power, a recurring theme throughout history. The necessity of union, intensified by wartime urgency and combined with fear of an overarching sovereign, led these revolutionaries to ratify the Articles of Confederation in 1781: the states would wield sovereign power, but centralized governance would be derived from the consent of the states.
In order to prevent a new Leviathan from taking England’s place, the states delegated the central government limited power, and even limited resources. Weak governance meant revenue dependence, difficulty in levying taxes, and poor regulation of commerce; there was also no executive, no judiciary, and no standing land or naval forces. Any change to the Articles required unanimous approval, and exercising the limited powers the new government did have (including making treaties and coining money) required majority or supermajority approval.
Conflict over the use of the Potomac River in 1784 highlighted the difficulties in resolving disputes among states, the inability to compel contributions to common expenditures, and the lack of funding needed to govern effectively. It seemed preposterous to expect such unanimity from such antagonistic elements; something had to be done to save the Union from disintegration and the American experiment from disgrace. The experiment was being corrupted by the worst inclinations of human nature, especially in the state governments. The challenge of preserving state sovereignty within a national polity, one which could operate on the world stage, resolve interstate differences, and facilitate common interests, was offset by the common fear that central governance translated directly into the accumulation of too much power and the erosion of state sovereignty (coupled with the absence of faith in the ability of a central authority to govern a huge expanse of territory).
The Framers posited a solution, one embodied in the rights of the American individual to this day: American citizens would have two political capacities, one state and one federal, each protected from incursion by the other, a mix of necessity and theory forming the basis of American governance. After the Constitutional Convention (Philadelphia, 1787) met to remedy the failures of the Articles of Confederation and to delineate the powers of Congress, Publius published The Federalist Papers to promote state ratification of the Constitution and to assuage concerns that the states would lose sovereignty under Congress.
Epistolary battles ensued between the Federalists and Anti-Federalists, resulting in the Bill of Rights, the first ten amendments to the Constitution, ratified in 1791. The federal courts, and ultimately the Supreme Court, quickly became the arbiters of federalism, and through a series of rulings on what was constitutional or unconstitutional, set many of the critical precedents that form the latticework of the present-day United States.
The Civil War (1861-1865) threatened the survival of the American experiment because of the possibility of the secession of states. President Lincoln reinforced the idea that the bond of geography and the bond of the Constitution were diametrically opposed to the anarchical concept of the sovereign state. After the South lost, it became clear that the Constitution did not protect the sovereignty of the states as abstract political entities, but divided authority between federal and state governments for the protection of the individual and for other constitutional ends.
The Reconstruction Era further clarified the roles of the federal and state governments, resulting in three more amendments (the Thirteenth, Fourteenth, and Fifteenth; the Eleventh and Twelfth had concerned judicial power and presidential elections, respectively). The Fourteenth Amendment in particular imposed restrictions on state power and expanded the power of the federal government; it has been the very pivot upon which the balance between the states and the federal government is resolved, and the amendment around which the struggle between these two entities revolves. These concerns have been pervasive in their consequences, profoundly interesting to the American people, and critical in their bearing upon the relations of the United States and of the several states to each other and to the citizens of the states (and of the United States).
The beginning of the Industrial Era, precipitated by the construction of railroads connecting the South and West and further intensified by the rise of immigration, led to new constitutional amendments which further entrenched federal power. The Bill of Rights came to be applied through the Fourteenth Amendment to invalidate state action; before the Civil War, the Bill of Rights did not apply to the states, but this changed over time as the Supreme Court slowly applied specific protections afforded by the Bill to the states.
My recounting of history takes an unlikely turn here, as the Progressive Era (and the New Era) ushered in so much innovation that it changed the very trajectory of the American story and created a persistent era of ferment that lives on to this day.
In addition to being called the Roaring Twenties, the third decade of the 1900s was also the Golden Age of Quantum Physics. The backdrop of the previous decades was no less significant, with the publication of seminal works including Heike Kamerlingh Onnes’ discovery of superconductivity, Max Planck’s use of quanta to explain black-body radiation, Rutherford’s discovery of the atomic nucleus, Bohr’s atomic model describing the quantization of matter, Einstein’s treatises on special and general relativity and his quantum explanation of the photoelectric effect, and finally, in line with the theme of this newsletter, the Braggs’ work on crystal analysis using X-rays, which I shall talk about in subsequent articles to further define material intelligence. The formalization of quantum theory by Bohr and Einstein was heavily predicated on the work of Max Planck, and their Nobel prizes (Planck in 1918, Bohr in 1922) solidified Bohr and Planck as the fathers of quantum theory.
In his 1924 PhD thesis, de Broglie, a French physicist and aristocrat, postulated the wave nature of electrons, and in so doing pushed forward the idea of wave-particle duality, a central part of the theory of quantum mechanics. This created a great deal of excitement in European physics circles. In 1925, Werner Heisenberg, Max Born, and Pascual Jordan published three foundational papers attempting to describe atomic systems in terms of observables only. Born observed that Heisenberg’s initial formulation for calculating the intensities of spectral lines using position and momentum could be transcribed and extended into the systematic language of matrices, which until then were seldom used by physicists. Together with his assistant and former student Pascual Jordan, Born submitted their solution for publication just 60 days after Heisenberg’s initial paper. The three of them then published On Quantum Mechanics together.
That same year, during the fall of 1925, Pieter Debye, Einstein’s successor, suggested to Erwin Schrödinger that he give a seminar on de Broglie’s work. Schrödinger obliged, but at the very end Debye remarked that the theory seemed rather childish: why should a wave confine itself to a circle in space? Real waves in space diffracted and diffused; they obeyed three-dimensional wave equations. Schrödinger then spent some weeks in the Swiss mountains analyzing electron orbits from a geometric point of view, taking into account the relativistic Doppler effect of spectral lines, the hypothesis of light quanta, and considerations of energy and momentum. This culminated in his 1926 publication, Quantization as an Eigenvalue Problem, in which he presented what we now know as the Schrödinger equation. He showed that, for time-independent systems, his wave equation correctly predicted the energy eigenvalues of a hydrogen-like atom. Afterwards, he published three more papers that dealt with different quantum-mechanical analogs, the equivalence of his approach to Heisenberg’s, and time dependence, avoiding fourth- and sixth-order differential equations and ultimately reducing the order of the equation to one.
In order to ridicule the Bohr-Heisenberg probability interpretation (the Copenhagen interpretation) of quantum mechanics, Schrödinger contrived the famous thought experiment of Schrödinger’s cat, in what was possibly the first logically consistent scientific sh*tpost.
In 1927, Max Born and his graduate student J. Robert Oppenheimer put forward an approximation that assumed the separation of electronic and nuclear motion in molecules in order to describe many-body systems, leading to a molecular wave function expressed in terms of electronic and nuclear positions only.
For a molecule in three-dimensional space consisting of m nuclei and n electrons, solving for the energy levels and wavefunction is computationally complex. Moreover, the position of each particle is affected by the pull of its neighbors; this is called coupling. The underlying reasoning for their approximation is that electrons are much lighter than nuclei, so with respect to the electrons, the nuclei are almost stationary (just as the moon is kept in orbit by the earth’s pull). For any two isolated particles interacting with each other, there exists a minimum-energy distance (an effective potential corresponding to the lowest energy state) at which these particles are stabilized in space: too close and they are on top of each other, too far and they never interact. This relationship between energy and distance is known as the interatomic potential. There are many kinds, depending on the interaction parameters and the degree of accuracy required.
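To make that energy-distance relationship concrete, here is a minimal sketch of my own (not from any historical source) using one of the simplest interatomic potentials, the Lennard-Jones form; the epsilon and sigma values are rough, argon-like placeholders rather than fitted constants.

```python
import numpy as np

def lennard_jones(r, epsilon=0.0103, sigma=3.40):
    """Lennard-Jones 12-6 pair potential, in eV, for a separation r in angstroms.

    epsilon sets the depth of the energy well; sigma is the separation at which
    the potential crosses zero. The defaults are rough, argon-like placeholders.
    """
    return 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

# Scan pair separations and locate the minimum-energy (most stable) distance.
r = np.linspace(3.0, 8.0, 500)
energy = lennard_jones(r)
r_min = r[np.argmin(energy)]
print(f"Well depth {energy.min():.4f} eV at a separation of {r_min:.2f} angstroms")
# Analytically the minimum sits at 2**(1/6) * sigma, about 3.8 angstroms here:
# any closer and the repulsive term dominates, any farther and the attraction fades.
```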
By separating the molecular energy (and its corresponding Hamiltonian) in the Schrödinger equation into individual relations for the electronic and nuclear (vibrational, rotational, and nuclear-spin) energies, certain terms (the electronic-nuclear cross-terms) can be ignored by treating the nuclei as stationary (uncoupled from the motion of their electrons). The procedure is, roughly:
1. Solve the electronic Schrödinger equation (consisting of the electronic kinetic energies, interelectronic repulsions, internuclear repulsions, and electron-nuclear attractions) for a fixed position of the nuclei.
2. Use the resulting potentials to solve a second, nuclear Schrödinger equation containing only the coordinates of the nuclei.
3. Repeat for different positional configurations of the atoms in the molecule. These energies are connected to give a potential energy surface, approximated by an analytical function that gives potential energy as a function of the nuclear coordinates. Forces on the nuclei correspond to the slopes (negative gradients) of this surface.
Instead of a large equation containing (n+m)^2 calculations, only mn^2 calculations are required.
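A toy version of that workflow, assuming a simple diatomic and a Morse potential standing in for the electronic Schrödinger solution at each fixed geometry (the parameters are rough, H2-like placeholders of my own choosing, not fitted values), might look like this:

```python
import numpy as np

def electronic_energy(bond_length, well_depth=4.7, width=1.9, r_eq=0.74):
    """Stand-in for solving the electronic Schrodinger equation at one fixed
    nuclear geometry. A Morse potential (eV, angstroms) plays that role here;
    the parameters are rough, H2-like placeholders."""
    return well_depth * (1.0 - np.exp(-width * (bond_length - r_eq))) ** 2 - well_depth

# Step through nuclear configurations (here, a single bond length) and collect
# the electronic energies into a one-dimensional potential energy surface.
bond_lengths = np.linspace(0.4, 3.0, 200)
surface = electronic_energy(bond_lengths)

# Forces on the nuclei are the negative slopes of that surface.
forces = -np.gradient(surface, bond_lengths)
print(f"Equilibrium bond length ~ {bond_lengths[np.argmin(surface)]:.2f} angstroms")
```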
The basic functional form used to describe many-body systems can be resolved into bonded and nonbonded energies.
For more complex interactions, more parameters can be added based on fitting to available experimental data. A collection of these empirical parameters for a given interatomic potential is known as a force field.
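As an illustration (a generic textbook form, not the expression used by any particular package), such a force field typically sums harmonic bonded terms over bonds, angles, and dihedrals, plus Lennard-Jones and Coulomb nonbonded terms over atom pairs:

$$
E_{\text{total}} = \sum_{\text{bonds}} k_b (r - r_0)^2 + \sum_{\text{angles}} k_\theta (\theta - \theta_0)^2 + \sum_{\text{dihedrals}} V_n \left[1 + \cos(n\phi - \delta)\right] + \sum_{i<j} \left( 4\varepsilon_{ij}\left[\left(\tfrac{\sigma_{ij}}{r_{ij}}\right)^{12} - \left(\tfrac{\sigma_{ij}}{r_{ij}}\right)^{6}\right] + \frac{q_i q_j}{4\pi\varepsilon_0 r_{ij}} \right)
$$

The force constants, equilibrium values, and charges appearing here are precisely the empirical parameters that, collected together, constitute the force field.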
The slopes of potential energy surfaces can be used to simulate molecular dynamics by expressing the mean force on the nuclei caused by the electrons. The energy of two or more particles can be given as a function of their geometry, and the (local or global) minima and saddle points correspond to stable or quasi-stable species and to transition states, respectively.
By virtue of the fact that there are local and global minima, it can be deduced that the initial geometry of a reactant molecule determines the ease of formation of specific products.
Molecular dynamics (MD) is the solution of the classical equations of motion for atoms and molecules to obtain the time evolution of a defined system, useful for explaining structure-function relationships. It is therefore a deterministic technique, meaning that an accurate description of the system is necessary to prevent the accumulation of errors. It can also be used in statistical mechanics to generate a set of configurations distributed according to statistical distribution functions.
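A minimal sketch of what “solving the classical equations of motion” means in practice, assuming the Lennard-Jones pair from earlier, reduced units, and a velocity Verlet integrator (again a toy illustration of mine, not a production MD code):

```python
import numpy as np

def lj_force(r, epsilon=1.0, sigma=1.0):
    """Signed magnitude of the Lennard-Jones pair force along the separation
    direction (reduced units); positive means repulsive."""
    return 24.0 * epsilon * (2.0 * (sigma / r) ** 12 - (sigma / r) ** 6) / r

# Two particles on a line, unit masses, velocity Verlet time integration.
dt, steps = 0.001, 5000
x = np.array([0.0, 1.5])   # positions: start stretched past the potential minimum
v = np.array([0.0, 0.0])   # velocities: start at rest

f = lj_force(x[1] - x[0])
forces = np.array([-f, f])  # equal and opposite pair forces

for _ in range(steps):
    x += v * dt + 0.5 * forces * dt ** 2          # advance positions
    f = lj_force(x[1] - x[0])
    new_forces = np.array([-f, f])
    v += 0.5 * (forces + new_forces) * dt         # advance velocities
    forces = new_forces

print(f"Separation after {steps} steps: {x[1] - x[0]:.3f} (oscillates about ~1.12)")
```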
These discoveries form the underpinnings of modern-day computational chemistry, physics, biology, and materials science.
There is something to be said about the accurate description of defined systems, the most accurate being quantum-mechanical; the most accurate descriptions are also computationally cost-prohibitive. We fast-forward in history, past the development of quantum field theory, to the formulation of density-functional theory (DFT) in 1964-1965 by Hohenberg, Kohn, and Sham, whose publications (Inhomogeneous Electron Gas, and Self-Consistent Equations Including Exchange and Correlation Effects) ultimately earned Kohn a share of the 1998 Nobel Prize in Chemistry. This was, of course, predicated on papers by Thomas, Fermi, and Dirac (in line with blue-sky thinking), who imagined that the kinetic and exchange energies could be locally modeled by their uniform-electron-gas energy densities. The basic premise of DFT is that all the intricate motions and pair correlations in a many-electron system are somehow contained in the total electron density alone.
To wit, the quantum-mechanical wavefunction contains, in principle, all the information about a given system, allowing us to determine the allowed energy states of that system (remember, particles are said to be “quantized”, existing in specific energy states). It is computationally intractable to solve the Schrödinger equation exactly for an N-body system of interacting electrons, and as such we must invoke some approximations. In a regular crystal, the electrons are affected not only by the nuclei at their lattice sites but also by the other electrons. For those who took high-school chemistry, you may remember Pauli’s exclusion principle, which states that no two electrons can occupy the same quantum state (equivalent to having the same four electronic quantum numbers).
Since an orbital can contain a maximum of two electrons, those two electrons must have opposite spins.
The implications of this principle are that:

- electrons with parallel spins are kept apart, reducing their Coulombic (classical) repulsion potential. The difference in energy between parallel and antiparallel spins is the exchange energy. As such, a spin interaction exists in addition to an electronic (charge) one.
- there is correlated motion between electrons of antiparallel spins, which arises because of their mutual classical repulsion.

Moving parallel-spin electrons apart lowers the exchange energy, and moving antiparallel-spin electrons apart lowers the correlation energy.
The total energy of any given system is decomposed into the kinetic energy, the potential (Coulombic) energy and a term called the exchange-correlation energy that sufficiently captures all many-body interactions.
For a small region, the exchange-correlation energy is an approximation that accounts for the remaining electronic energy not included in the non-interacting kinetic and electrostatic terms.
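Written out (in atomic units), this is the standard Kohn-Sham decomposition, where everything not captured by the non-interacting kinetic term, the external electron-nuclear potential, and the classical Hartree (Coulomb) term is swept into the exchange-correlation functional:

$$
E[n] = T_s[n] + \int v_{\text{ext}}(\mathbf{r})\, n(\mathbf{r})\, d\mathbf{r} + \frac{1}{2}\iint \frac{n(\mathbf{r})\, n(\mathbf{r}')}{|\mathbf{r}-\mathbf{r}'|}\, d\mathbf{r}\, d\mathbf{r}' + E_{\text{xc}}[n]
$$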
By reducing the number of degrees of freedom of the system using its most basic form (the Born-Oppenheimer approximation), we express the total energy as a functional (a function of a function) of the electron density, which is itself a function of position; remember, the density of a system contains all of its ground-state properties. In our case, the total ground-state energy of the many-electron system is a functional of the density: if we know the density, we know the total energy of the system.
We can then compute the total energy of the system by decomposing it into functionals of the charge density: ion-electron, ion-ion, electron-electron, kinetic, and exchange-correlation energies. The latter two are intractable to treat exactly, and so DFT reduces the many-body problem of N electrons with 3N spatial coordinates to a problem in just three spatial coordinates (the density).
In order to correctly describe a correlated system, the equations need to be self-consistent: a quantity M is expressed in terms of a mapping F whose ingredients themselves depend on M, so that M = F(M).
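Schematically, the M = F(M) condition is solved by fixed-point iteration with mixing: guess a density, build the effective potential it generates, solve for the density that potential produces, blend old and new, and repeat until nothing changes. The two maps below are hypothetical stand-ins of my own (a real code would solve the Kohn-Sham equations at the `solve_for_density` step), but the loop structure is the point:

```python
import numpy as np

def effective_potential(density):
    """Hypothetical placeholder for the density -> effective potential map."""
    return 2.0 * density - 1.0

def solve_for_density(potential):
    """Hypothetical placeholder for the expensive step that maps an effective
    potential back to a density (in real DFT, solving the Kohn-Sham equations)."""
    return 1.0 / (1.0 + np.exp(-potential))

density = np.full(10, 0.2)          # initial guess for the density
mixing, tolerance = 0.3, 1e-8       # damping factor and convergence threshold

for iteration in range(500):
    new_density = solve_for_density(effective_potential(density))
    change = np.max(np.abs(new_density - density))
    # Linear mixing damps the update so the iteration does not oscillate.
    density = (1.0 - mixing) * density + mixing * new_density
    if change < tolerance:
        print(f"Self-consistent (M = F(M)) after {iteration + 1} iterations")
        break
```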
Just as interatomic potentials and force fields have parameters that increase their accuracy, different exchange-correlation functionals exist to describe electronic densities. Also, since nonlocal effects are always present, there have been attempts to refine DFT by extending beyond intra-atomic to interatomic distances, which cannot be treated by local functionals.
Advances in DFT can be observed in the exponential increase in its use, as seen in the graph of DFT papers published over time.
The computational intractability of these two methods (DFT and MD) has necessarily led to:

- advancements in the field of supercomputing via massive parallelization, efficient algorithms and hardware design, and quantum computing and analog simulation, in order to find less data-intensive ways of describing and studying atomic and molecular systems (to clarify, supercomputing advancements were mostly driven by the computational needs of national scientific laboratories solving engineering and molecular simulation problems);
- the creation of many repositories (to minimize computational redundancy) with defined crystal, material, physical, and biological systems;
- the use of standard data formats and protocols to allow sharing and agreement worldwide;
- the intersection of quantum theory and machine learning, resulting in machine-learning interatomic potentials and universal potentials, which I learned of three days ago and whose existence may very well precipitate multiple Cambrian explosions in scientific computation.
Why was America’s historical backdrop necessary in the first place? And why was the history of interatomic potentials and exchange-correlation functionals important to this article?
The institution of science can best be described as an informal international organization whose organizing principles are the scientific method and the search for ground truth. There is indeed a reason why it was initially called natural philosophy during periods when religion was the main Leviathan, and is now described as the Invisible College while the state is the main Leviathan. Based on advances in computation and machine learning, another phase transition has occurred as the search for ground truth has become coupled to massively parallelized solutions of mathematically expressed physical and quantum phenomena. The trend towards ubiquitous computational collaboration has formed the basis for the emergence of the network, a Leviathan more powerful than the previous two. Balaji Srinivasan describes it as the transition from theism to statism to numism. It is well recognized that the new frontier for warfare is cyberspace (ask the DOD). The paradox here is the very one I talked about in my first publication: the advances that brought us together as a nation and a world are the very ones which threaten to tear us apart.
The evolution of networks, best visualized as a connected collection of clusters (themselves collections of nodes) unrestricted by physical space, says a lot about the formation of computational communities, which in the long run translates to computational (and possibly quantum) advantage; these are my personal beliefs. As the Invisible College bands together to solve more complex potentials, that knowledge becomes more accessible and ground truth becomes more decentralized.
One can then imagine a minimum-energy state where the tension between computation and virtual location (ease of access to digital description ability) reaches some form of equilibrium, resulting in a positive feedback loop of compute → discovery → commercialization → cashflow → more compute, which translates directly into geographic superiority.
These things necessarily follow what I describe as a self-consistent power law, where physical, economic, and social networking assets compound over time. It is easy to spot the apparent flaw in my argument: networks seem to be geography-agnostic, so my argument for geographical determinism seems to fall apart. The counterpoint, of course, is that there is something to be said about initial states of geography (specifically, the derived resources of the land, including the ability to import inputs) and “alchemy” intersecting in the realm of compute. Whoever is able to take stock of their assets, minimize their costs (in energy, logistics, and inputs), achieve economies of scale quickly, AND do so using compute will necessarily grow much faster than their counterparts. Just like force fields, the initial states will determine what global and local minima form.
This phenomenon will be further reinforced by the idea of the innovation ecosystem/district, where the rate of ferment among physical, economic, social, and computing assets will lead to many local minima in the United States, best depicted as a topographic map of the U.S. with minima marking areas where larger computing resources couple with rapid economic growth. Regional integration (with respect to innovation) is becoming more and more critical, as can be observed in the different agency-based approaches to innovation (NSF’s Regional Innovation Engines is a notable example).
Taken to its logical conclusion, and tying it together with the history of the United States experiment, states have long been drivers of progress and change at the federal level, described by U.S. Supreme Court Justice Louis Brandeis as laboratories of democracy. Jenna Bednar and Mariano-Florentino Cuéllar describe U.S. authority on the world stage as being associated with a federal system in which Washington is dominant. They then argue that this conventional understanding is both flawed and out of date.
In recent times, however, states have taken it upon themselves to seek their own international partnerships or agreements when they have significant stakes in energy, trade, and technology. This gives them the ability to enhance American international influence, and although the Constitution prohibits states from entering into formal treaties, international agreements can be made by other means, as long as they are not legally binding. “It is then possible for states to leverage their soft power and convening capacity to facilitate policy coordination and form coalitions with like-minded foreign governments, [leveraging the constitutional flexibility to collaborate on international agreements to address globally significant problems with focused vigor, rather than through Washington, which is sometimes overextended].”
Inspired by the decentralized paradigm of matrix mechanics and cloud computation, I have personally dubbed this approach eigenfederalism: an equilibrium of decentralization within a federal union of states.
I shall attempt to whimsically describe eigenfederalism, defining domestic and foreign policy as respective eigenvectors, technology as the eigenvalue and the initial states of… well, each state, described by the matrix A.
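Purely as a playful gloss on that metaphor (the symbols here are mine and carry no quantitative weight): the claim is that, for the right policy directions v, a state’s initial conditions A simply scale the policy by a technological factor λ rather than rotating it into some other direction,

$$
A\,\vec{v} = \lambda\,\vec{v}
$$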
“It is not without its attendant risks– it may complicate our national security strategies for safeguarding information from other countries. Also, as states increasingly use litigation to contest federal action, foreign governments may be able to exploit tensions between states and the federal government.”
The growing influence of the states on domestic and foreign policy necessitates the pursuance of the following strategies delineated by the authors:
“State governments must further develop their potential to act as laboratories of democracy [and innovation] in the federal system to shape global deployment in ways that advance U.S. interests.
Policymakers in Washington should recognize the value of allowing states to experiment on core issues, especially consistent with the projection of state influence on a global level.
Foreign governments can strengthen long-term relationships with the U.S., [specifically through the Invisible College], by building ties with individual states and their dependent cities.”
The rise of quantum-centric supercomputing and advances in high-performance computing directly tie into the constant interplay between the states and federal government, and can provide a powerful strategic advantage to the United States. Every state, based on its innovation specialty, can contribute to continued U.S. leadership on the most vital international policy challenges of our time, as well as bolster the resilience of the American experiment. The pessimistic scenario that this Faustian bargain poses is that it could become a source of conflict and tension.
Cloud first, then land.