## Modern Ideas

Let’s remind ourselves of modern interpretations of these ideas—things you all know. The historical discussion above was about regular-sized objects that were electrically charged or magnetized. The idea of atoms was still highly speculative, let alone that of electrically charged particles. The sub-atomic objects in nature are either positively charged, negatively charged, or neutral (zero charge). As we will soon see, the total electrical charge of any macroscopic object (your hair in winter and the comb that charged it?) can’t be just any value, but instead is a multiple of an apparently special amount.

```{admonition} Please answer Question 9 for points:
:class: danger
Forces on wires carrying currents
```

When electricity was first studied—Franklin’s time—there was no idea of an electron or a proton and their particular charges. Rather, “stuff” was charged: everyday-sized things, the charge of which might be pretty large. The unit of charge is the “Coulomb” (abbreviation “C”), named for our Charles Coulomb of previous pages. He studied regular-sized objects, and so they had lots of charge. But when the 20th century came around and people started to study atomic-sized objects, it became apparent that the most natural value of charge was not Coulombs...but nano-nano Coulombs: the fundamental electrical charge is that of single electrons and protons.

It was a couple of decades into the 20th century when the value of the discrete, fundamental charge was finally measured as the charge of the proton, which in magnitude is the same as the charge of the electron. Here it is. Lots of zeros:

$$e = 0.0000000000000000001602176487(40)\mbox{ C}.$$

Let’s look at this number. There’s a lot of information in it—beyond just its value. The first thing you notice is that it’s tiny, as I warned. The second thing you notice is that there are lots of non-zero digits to the right of the decimal point! That is, it’s measured to very high precision, which should suggest to you years and years of patient, careful measurements by lots of hard-working people. The third thing you notice is the $(40)$ at the end of that string of digits: it tells you that only the last two digits are considered to be slightly uncertain. So the precision to which this number has been measured is $0.00000000040/1.602176487 = 2.5 \times 10^{-10}$, or about $2.5 \times 10^{-8}$ percent! You would correctly conclude that knowing this number must be a pretty important thing. Its precision is much higher than that of the other fundamental physical constant that we saw earlier, the Gravitational Constant, G.
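
If you want to check that arithmetic yourself, here is a quick back-of-the-envelope sketch in Python (the variable names are ours, purely for illustration):

```python
# Relative precision of the measured elementary charge (illustrative check)
e_value = 1.602176487e-19        # measured value of e, in Coulombs
e_uncertainty = 0.000000040e-19  # the "(40)" uncertainty in the last two digits, in Coulombs

relative = e_uncertainty / e_value                        # dimensionless fraction
print(f"relative uncertainty = {relative:.3g}")           # ~2.5e-10
print(f"as a percentage      = {relative * 100:.3g} %")   # ~2.5e-08 %
```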

Electrical charge appears as multiples of a fixed, fundamental amount called $e$.

We use the symbol $e$ to represent the value of this fundamental amount of charge (so we don’t have to carry around all of those decimal places!). So in this way, we would say that the electrical charge of:

* an electron is $Q_e = -e$,
* a proton is $Q_p = +e$, and
* a neutron is $Q_n = 0$.

You might wonder why the electron and proton charges have exactly the same magnitude. We don’t know. Franklin had a 50-50 chance of guessing which direction current flowed, and he got it wrong, since the objects that move in a wire—that are an electric current—are negative electrons. So we’re stuck with his assignment.

> **Wait.** Why is there a fundamental unit of electrical charge?

> **Glad you asked.** We’d love to understand this, and it’s an object of speculation and research. Well, guess what. Just when we thought we’d nailed down the fundamental unit of charge in Nature...quarks happened. We’ll talk a lot about these fundamental entities later, but they have the surprising property of electrical charges which are not whole multiples of $e$ but rational fractions of it, namely $Q_{\text{quark}} = \pm\tfrac{2}{3}e$ or $\pm\tfrac{1}{3}e$. Quarks go together to make many of the particles that we understand to be composite by adding their fractional charges to make...whole multiples of $\pm e$. We take $e$ as the fundamental quantity primarily out of habit and because the electron has this fundamental, whole unit.

### Electrical Current {#chargescurrent}

Earlier I noted that we define the unit of electrical current as the current necessary to cause a particular force between two parallel current-carrying wires. But most of us think in practical terms: a "current." An electrical current is a measure of how much charge passes a given point during a given time. It's measured in "Amperes," as you know. Here's what an Ampere is:

$$1\mbox{ Ampere}= 1\mbox{ Coulomb per second:}\;\;\;\; 1 \text{ A}= 1\text{ C/s}.$$

You’ve probably all played with batteries and wires in school, and you know that if you set up a little circuit with an AA battery, some wire, and a bulb, then after some trial and error you can make it light. We say that the current flows from the positive terminal of the battery to the negative terminal, but that’s Ben’s mistake: it’s actually the electrons that move in a wire, and so the current is in the opposite direction from the actual charge motion. We’ve learned to live with that, and we call positive current *from* the positive terminal of a battery.
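
To make the Ampere concrete, here is a small illustrative sketch (the numbers and names are ours, not from the text) that counts how many electrons flow past a point each second in a 1 A current:

```python
# How many electrons per second make up a 1 Ampere current? (illustrative sketch)
e = 1.602176487e-19    # elementary charge, in Coulombs

current = 1.0                               # Amperes, i.e. Coulombs per second
charge_per_second = current * 1.0           # Coulombs passing a point each second
electrons_per_second = charge_per_second / e

print(f"{electrons_per_second:.3g} electrons per second")  # ~6.24e18
```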

An electrical current is the rate at which electrical charge moves.

```{admonition} Please study Example 3:
:class: warning
How many electrons are in a Coulomb of electrons?
```

```{admonition} Please answer Question 10 for points:
:class: danger
Coulombs in a current
```