It’s pretty hard to imagine a world without electrons. With no electrons, there would be no TV, no radio, no internet. No smartphones, no computers, no electricity. Not to mention no chemicals, no food, no life. No atoms.
Of course, electrons have in fact always been around, in abundance. They have permeated the universe since the earliest instants after the Big Bang. But despite their ubiquity, no human knew very much about them until nearly the 20th century. Before then, only the foggiest clues existed about what caused the curiosities of static electricity and electric currents.
Pursuing those clues proceeded slowly for centuries. But once the quarry was captured, and its identity established, the electron enabled the magic of modern technology and fathered new fields of science. It was the electron that led scientists into the wild and weird world of quantum mechanics, which is marking its centennial this year. Knowledge of the electron’s behavior and its quantum powers transformed civilization in ways that defied anything the ancients could have imagined.
Ancient Greek philosophers did have an inkling that something mysterious was afoot in matter’s interactions. It was well known that amber rubbed with silk or fur acquired the ability to attract small, light objects — an example of what now is known as static electricity. Thales of Miletus, active around 600 B.C.E., even speculated that amber’s power and the attraction of iron to the mineral magnetite had something in common.

In ancient times, humans discovered that amber rubbed with a cloth would gain the power to attract small, lightweight objects like the bits of paper shown here, but the reason for this power — static electricity — remained mysterious for millennia.
CREDIT: NTV / SHUTTERSTOCK
Progress during antiquity and through the Middle Ages was limited. But around the end of the 16th century in England, Queen Elizabeth’s physician, William Gilbert, noted that many substances, including glass rods, acquired attractive powers similar to amber’s when rubbed with silk. Gilbert referred to such rods as “electric bodies” or “electrics” from elektron, the Greek word for amber.
A deeper pursuit of electricity’s mysteries came in the mid-18th century from Benjamin Franklin. Famous for proving that lightning is a form of electricity, Franklin also introduced the basic concepts and supplied much of the terminology for future electrical science research.
“He introduced into the language of scientific discourse relating to electricity such technical words as plus and minus, positive and negative, charge, and battery,” wrote the science historian I.B. Cohen.
Franklin believed in a single electrical fluid — or “electrical fire” — that existed independently of other material substances. Glass rubbed with human hands, for example, did not create electrical fire; rather, bits of preexisting electrical fire were transferred from the hands to the glass during the rubbing. The glass, in other words, acquired what Franklin called a positive electric charge; the rubbing material, left with a deficit of electrical fire, carried a negative charge.
The electrical fire acquired by glass turned out to be nothing other than electrons. (Alas, later terminological conventions required assigning electrons a negative charge. But that wasn’t Franklin’s fault.)
Franklin surmised that his electrical fire, or fluid, “consists of particles extremely subtile” that “can permeate common matter” with ease. If anyone doubted electrical fire’s ability to pass through bodies, Franklin remarked that “a shock from an electrified large glass jar … will probably convince him.”

In the 18th century, Benjamin Franklin performed a suite of electrical experiments leading to his deduction that some sort of “electrical fluid” could be transferred from one object to another. That fluid turned out to be composed of what scientists now know to be electrons.
CREDIT: SCIENCE HISTORY IMAGES / ALAMY STOCK PHOTO
Electrical research flourished in the 19th century, leading to the eventual understanding of a mutual relationship between electricity and magnetism, manifested in the electromagnetic waves that would later make radio, TV and Wi-Fi possible. But the nature of Franklin’s electrical fire remained obscure.
A key development came with the discovery that a glass tube containing a low-pressure gas could conduct an electric current. When wires from a battery were connected to electrodes sealed inside each end of the tube, a green glow appeared to emanate from the negative electrode. Since the negative electrode was called the cathode, the green glow became known as cathode rays.
Experiments by the British physicist William Crookes showed that cathode rays traveled in a straight line, suggesting they were a form of light. But Crookes then showed that a magnet bent the rays’ path, ruling light out. A debate then swirled among Europe’s leading physicists over whether the rays consisted of waves or tiny particles.
As the end of the 19th century neared, the cathode ray debate merged with two other electrical questions: whether a fundamental unit of electric charge existed, and if so, whether some particle smaller than an atom carried that charge.
At the forefront of investigating those questions was the British physicist J.J. Thomson. Thomson was trained as a mathematician but took up physics at the famous Cavendish Laboratory in Cambridge, working under the esteemed Lord Rayleigh. In 1884 Thomson succeeded Rayleigh as Cavendish Professor, the head of the laboratory.
In 1897 Thomson showed that the electric charge in the cathode rays was associated with a definite mass, establishing the electron as a particle. The ratio of this mass to the electric charge indicated that the unit of charge — the atom of electricity — was carried by a mass less than a thousandth the mass of the hydrogen atom.
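The logic behind that deduction can be sketched in modern notation (the symbols here are illustrative, not Thomson’s own). Thomson tuned crossed electric and magnetic fields until the beam flew straight, which fixed its speed; the electric field alone then bent the beam by a measurable angle:

```latex
% Crossed fields tuned so the beam is undeflected give the speed:
eE = evB \quad\Longrightarrow\quad v = \frac{E}{B}
% The electric field alone deflects the beam by a small angle \theta
% over plates of length \ell:
\theta \approx \frac{eE\ell}{mv^{2}}
% Combining the two yields the charge-to-mass ratio entirely from
% measurable quantities:
\frac{e}{m} \approx \frac{\theta v^{2}}{E\ell} = \frac{\theta E}{B^{2}\ell}
```

The ratio came out enormous — roughly a thousand times that of a hydrogen ion — which, assuming the charge itself was comparable, meant the mass had to be startlingly small.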
“The assumption of a state of matter more finely subdivided than the atom of an element is a somewhat startling one,” Thomson admitted in announcing his findings in a lecture at the Royal Institution. Yet that was exactly what his experiment had demonstrated.
What’s more, Thomson showed that this particle was the same mass no matter what gas was used in the tube and no matter what element the cathode was made of.
“After that no reasonable person could really refuse belief that there were particles smaller than atoms, or lighter than atoms at least, and that these particles played a fundamental part in the constitution of matter,” wrote J.J.’s son, George.

In 1897, J.J. Thomson subjected cathode rays (produced in a cathode ray tube) to electric and magnetic fields. By analyzing the response to those fields, Thomson showed that whatever carried the charge had a specific mass, no matter the element used in the tube. He deduced that cathode rays consisted of small electrically charged particles that he called corpuscles, now known as electrons.
Hence Thomson (the father) earned credit for the discovery of the electron, the first subatomic particle to be identified. He called the particles “corpuscles.”
But oddly enough, the particle had previously been christened the electron in 1891, years before its discovery, by the Irish physicist George Johnstone Stoney. Stoney coined the term (from the Greek word for amber, remember) to refer to the fundamental unit of electricity, even though nobody yet knew what it was. Soon after Thomson identified the particle, electron became the popular term.
Inside the atom
Coming shortly after the discovery of X-rays and radioactivity, the electron’s arrival further accelerated the frenzied efforts to figure out what was going on inside atoms.
A particular problem was how atoms, known to be electrically neutral in ordinary circumstances, could contain charged particles. To offset the electron’s negative charge, positive electric charge of some sort must also reside within the atom. But nobody knew the proper architecture that permitted such cohabitation.
Thomson proposed that the negatively charged electrons embedded themselves in a pudding of positive charge, electrons playing the role of plums. No evidence for such an arrangement existed, though, and the whole idea was shattered in 1911, when Ernest Rutherford announced the discovery of the atomic nucleus. Each atom contained a tiny core, like a positively charged stage of a theater in the round, with the negatively charged electrons relegated to the cheap seats.
Rutherford’s discovery of the nucleus was a surprise that seemed impossible. Even Ben Franklin would have been befuddled. Everything physicists had discovered about electric charge required the negatively charged electron to spiral into a positively charged nucleus in a fraction of a second, releasing electromagnetic energy in the process.
But soon the Danish physicist Niels Bohr rescued the electron from its death spiral, invoking the novel rules of quantum physics.
Bohr’s atom pictured electrons circling the nucleus in certain allowed orbits, in which they could not radiate away energy and spiral into the nucleus. (Energy was released or absorbed only when an electron jumped from one allowed orbit to another.)
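In modern shorthand (with the derivation of the constants suppressed), Bohr’s rules for hydrogen amount to:

```latex
% Only orbits whose angular momentum is a whole multiple of \hbar
% are allowed:
m v r = n\hbar, \qquad n = 1, 2, 3, \dots
% That quantization restricts hydrogen's electron to discrete
% energy levels:
E_n = -\frac{13.6\ \text{eV}}{n^{2}}
% Light is emitted or absorbed only in jumps between two levels:
h\nu = E_{n_2} - E_{n_1}
```

The jump rule was the payoff: it reproduced the sharp spectral lines of hydrogen that had baffled physicists for decades.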

The Danish physicist Niels Bohr attempted to explain the electron’s role in atomic structure as a set of orbiting trajectories around a central nucleus, as with the element radium shown here in this vintage drawing. After the introduction of quantum mechanics a century ago, precise orbits were replaced by electron energy levels without specific trajectories.
CREDIT: H. HOLST ET AL / THE ATOM AND THE BOHR THEORY OF ITS STRUCTURE 1923
Bohr’s idea (as he well knew) was preliminary. His math didn’t work for atoms more complicated than hydrogen. But a more complex approach, initiated by the German physicist Werner Heisenberg in 1925, established quantum mechanics as the rule book for electron behavior. Soon thereafter chemists began to apply quantum math to explain how electrons mediated the bonding between atoms to make chemical compounds.
But the electron was not done with surprises. Even before Heisenberg constructed his picture of the atom with electrons as particles, French physicist Louis de Broglie suggested that electrons might actually travel through space as waves. Soon after Heisenberg’s work appeared, Austrian physicist Erwin Schrödinger devised an electron wave model of the atom. Schrödinger’s wave math gave precisely the same results as Heisenberg’s particle picture.
Experimental verification of the wave picture soon came from Clinton Davisson and colleagues at Bell Labs and independently from George Thomson at the University of Aberdeen in Scotland. Both showed that a beam of electrons directed at a crystal scattered into a diffraction pattern, something only waves could produce.
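The numbers work out neatly. De Broglie’s relation assigns an electron a wavelength equal to Planck’s constant divided by its momentum, and at the beam energies used in those experiments that wavelength is comparable to the spacing of atoms in a crystal — the condition for diffraction. A back-of-envelope version, for an electron accelerated through a potential of $V$ volts:

```latex
% de Broglie wavelength in terms of momentum p, then accelerating
% voltage V:
\lambda = \frac{h}{p} = \frac{h}{\sqrt{2 m_e e V}} \approx \frac{1.23\ \text{nm}}{\sqrt{V}}
% At V \approx 54 volts (as in Davisson's experiments),
% \lambda \approx 0.17 nm -- about the spacing between atomic planes
% in a crystal, so the Bragg diffraction condition can be met:
n\lambda = 2d\sin\theta
```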
Davisson and Thomson were awarded the Nobel Prize in physics in 1937. It was one of the great ironies in physics history: J.J. Thomson won the 1906 Nobel for proving electrons are particles; his son George won the 1937 Nobel for proving electrons are waves.
A way out of the conundrum was proposed in 1927 by Bohr. He argued that both the wave and particle pictures were correct, but they applied only to mutually exclusive experimental arrangements. You could devise an experiment showing the electron to be a wave, or you could design one showing it to be a particle, but you could not construct an experiment that would reveal both wave and particle at the same time.
Bohr’s solution, called complementarity, solved the problem for the moment, but it birthed a century’s worth of debate about how the math of quantum mechanics should be interpreted.
Despite continuing interpretational controversy, quantum physics eventually matured into a driver of exotic technology relying on the electron. As electronic circuitry miniaturized, from its origins in bulky vacuum tubes to tidy transistors and tiny integrated circuits, society witnessed a flood of technological revolutions, along with a deeper understanding of the natural world.
Electron behavior permeates all realms of nature, from the chemical properties of individual atoms to the complexities of biological molecules. Understanding the electron enabled the era of designer materials, consumer electronics and prodigious computational power. From email to electron microscopes, solar-electric cells to lasers, electrons have been the key ingredient in making the modern world modern.
As Benjamin Franklin foresaw, his “electrical fluid” would someday offer humankind ample reward for pursuing its properties. “The beneficial uses of this Electrical Fluid we are not yet well acquainted with,” he wrote, “tho’ doubtless such there are and great ones.”