Beyond Numbers: Unveiling the Significance of Units of Measurement in Scientific Research and Human Endeavors - Sykalo Eugene 2025
Coulomb (C) - Electric charge
Somewhere between brushing your sweater sleeve and getting zapped by a doorknob, there’s a unit hiding in plain sight. Coulomb. Symbol: C. One of those strange, stately symbols—like the ohm or the mole—that appears in textbooks with the ceremonial tone of a courtroom oath. But behind this letter is one of the quietest revolutions in human thought: the decision to count electric charge.
What even is electric charge? Try explaining it to a teenager without metaphor, and you’ll start to sweat. You can’t hold it. You can’t see it. Yet everything electronic you own exists because we learned to measure it—not just acknowledge that it zaps and fizzes and powers your phone, but to quantify it with the rigorous detachment of a postal scale.
We call that quantification the coulomb. Named after Charles-Augustin de Coulomb, an 18th-century French physicist whose hair looked like he got too close to the experiments he was running. He’s mostly remembered for Coulomb’s law, a mathematical statement of how electric charges attract or repel each other. But his name stuck not because of that law alone, but because of what it helped birth: a way of treating electricity with the same mathematical precision as mass, distance, or time. You’d think it’s obvious now—“of course we measure charge!”—but at the time, that wasn’t a given. Before Coulomb, electricity was mostly parlor tricks and confused alchemy. After him, it started behaving like a science.
A single coulomb is a specific quantity: the amount of charge transported by a constant current of one ampere in one second. That’s not poetry—it’s accounting. But here’s where it gets weirder. One coulomb equals approximately 6.24 × 10¹⁸ elementary charges. That is: one coulomb is the charge carried by that many protons, or the negative of that many electrons. This number is so large it feels fake. Your brain, if you’re like most people, will skip it entirely and just nod politely. Which is a shame, because tucked inside it is the extraordinary strangeness of scale in physics: we only interact with electric charge in bulk. You’ve never felt a single electron. What you’ve felt is an avalanche.
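The definition and the avalanche can be put into arithmetic. A minimal sketch (the function name is my own; the constant is the exact value fixed by the 2019 SI redefinition, discussed later):

```python
# The paragraph above, as arithmetic: Q = I * t, then the "avalanche" count
# of elementary charges per coulomb.
E_CHARGE = 1.602176634e-19  # elementary charge e, in coulombs (exact by definition)

def charge_from_current(amperes: float, seconds: float) -> float:
    """Charge in coulombs delivered by a constant current over a duration."""
    return amperes * seconds

one_coulomb = charge_from_current(1.0, 1.0)  # 1 A flowing for 1 s -> 1 C
electrons = one_coulomb / E_CHARGE           # ~6.24e18 elementary charges
print(f"{electrons:.3e}")
```

Run it and the number your brain wants to skip stares back: roughly 6.24 quintillion charges in a single coulomb.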
In the early days of electricity, nobody had units. Not really. There were sparks. Crackles. Eels. It took obsessive minds and painful experiments (some with literal electric shocks) to wrangle these phenomena into anything resembling measurement. Alessandro Volta stacking zinc and copper plates. André-Marie Ampère scribbling equations about current before breakfast. Michael Faraday, a bookbinder’s apprentice, trying to trap lightning in a glass jar. And Coulomb, designing a torsion balance so delicate it could measure the invisible tug between two electrically charged pith balls.
Electricity wasn’t just mysterious—it was deeply inconvenient. It didn’t stay put. It leaked, it sparked, it didn’t scale. You couldn’t bottle it or count it, and so you couldn’t sell it. Until someone did.
Here’s the thing most people miss: units are not just tools. They are statements of faith—faith that something elusive can be tamed, pinned, understood well enough to trade or build with. The coulomb, like the second or the kelvin or the kilogram, is a vote for coherence in a universe that doesn’t owe us any.
We measure electric charge because we have to. Every battery, every transistor, every motor that twitches or phone that boots up is a temple to this countable quantity. Without the coulomb, we couldn’t write Ohm’s law or build circuit diagrams. We couldn’t price energy in kilowatt-hours or debug software that controls electrical pulses with eerie microsecond precision. The coulomb lets us talk about electricity as if it were real estate, not mythology.
Even the international system of units—the SI, that cool and slightly intimidating club of base measures—respects the coulomb, but in a very specific way. It’s not a base unit. It’s derived. The ampere is the base unit, and the coulomb rides shotgun: 1 coulomb = 1 ampere × 1 second. Which is hilarious, if you think about it. It means we define electric charge based on current over time. In other words, we define what charge is by watching it move.
Pause on that. It’s like trying to define a raindrop by measuring how wet the ground gets over a minute.
And that’s where the story becomes existential.
What is charge, really? Is it a substance? A property? A vibration? A symmetry? Quantum field theory tells us that charge is a kind of “bookkeeping” for interactions—something conserved across space and time. In the Standard Model, electric charge is tied to the U(1) symmetry of the electromagnetic interaction. If you’re blinking right now, good. It means you’re still sane. These abstractions aren’t here to make things fuzzy—they’re the best tools we have to describe how the universe refuses to contradict itself.
But none of that high-energy physics works unless we agree to count the damn electrons. And that means agreeing on the unit.
I once saw a graduate student panic because their lab results were off by six orders of magnitude. The problem? A mislabeled data column in which coulombs had been recorded as microcoulombs. That’s it. One Greek letter, the μ, slipped in unnoticed. Months of work, thrown into question. Science is full of such landmines. That’s why units matter. They’re not just conventions. They’re contracts between minds.
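One way to honor that contract in code is to carry the unit with each reading instead of trusting a column label. A hypothetical guard, not from any real lab’s pipeline:

```python
# A small guard against exactly the uC-vs-C slip described above: every
# reading is normalized through an explicit prefix table, and unknown unit
# labels fail loudly instead of silently corrupting a dataset.
PREFIX_TO_COULOMBS = {"C": 1.0, "mC": 1e-3, "uC": 1e-6, "nC": 1e-9}

def to_coulombs(value: float, unit: str) -> float:
    """Normalize a charge reading to coulombs; reject unknown unit labels."""
    if unit not in PREFIX_TO_COULOMBS:
        raise ValueError(f"unknown charge unit: {unit!r}")
    return value * PREFIX_TO_COULOMBS[unit]

# The same number means charges six orders of magnitude apart,
# depending entirely on the label:
q_big = to_coulombs(3.2, "C")     # 3.2 C
q_small = to_coulombs(3.2, "uC")  # 3.2e-6 C
```

The design choice is the point: a bare float has no opinion about what it measures, so the check has to live somewhere explicit.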
There’s something deeply human about units. They bridge our senses and our intellect. We can’t see a coulomb, but we can build a sensor that registers one. We can’t feel the flow of electrons, but we can watch an LED glow as a quiet signal of their journey. Measurement is our prosthetic sense. It’s how we reach beyond our evolved limits.
And yet, I’ve always found it slightly hilarious that our name for this unit comes from a man whose biggest fame was the torsion balance. A wire, suspended, twisting slightly when charges are near. It feels almost medieval in delicacy. But it works. Still. The math hasn’t aged, even if the haircuts have.
Today, we don’t use torsion balances to define the coulomb. In fact, as of the 2019 SI redefinition, all SI units are tied to fundamental constants. The coulomb is now defined indirectly through the elementary charge e, which is fixed at exactly 1.602176634 × 10⁻¹⁹ C. That’s the charge of one proton. This might seem like trivia, but it’s the opposite. It’s a reorientation of physics toward nature itself, not our apparatus.
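Because e is now exact by definition, the question “how many elementary charges make one coulomb” has an exact rational answer, not just a decimal approximation. A sketch using Python’s `fractions` to keep the arithmetic exact:

```python
# Since 2019 the elementary charge is fixed exactly, so one coulomb is an
# exact (if unwieldy) count of elementary charges: 1/e.
from fractions import Fraction

e = Fraction(1602176634, 10**28)   # e = 1.602176634e-19 C, exact
charges_per_coulomb = 1 / e        # exactly 10**28 / 1602176634
print(float(charges_per_coulomb))  # the familiar ~6.24e18
```

The direction of definition is what changed in 2019: the count follows from the constant, not the other way around.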
We no longer define units by artifacts—no more platinum-iridium rods in Paris basements. We define them by constants we believe to be universal. The coulomb is a result, not a premise. A downstream truth from deeper assumptions. That’s cleaner, more abstract, but also somehow lonelier.
And yet, here we are. Using coulombs every time we check the battery on a laptop, every time we swipe a credit card, every time a defibrillator discharges into someone’s chest in an ER somewhere.
One coulomb. A crisp unit. A packet of electric charge. A name in honor of a Frenchman with an idea and some wires.
And beneath that—quintillions of electrons marching invisibly through copper veins, summoned and dismissed with the ease of a keystroke.
We don’t just live with electricity. We live because we measure it.
And the unit—silent, precise, taken for granted—is how we whisper to the forces that shape our world.