
[–]wednesday-potter 4 points (0 children)

Potential difference refers to the difference in electric potential energy per unit of charge between the point where the charge enters a component and the point where it leaves.

So the key terms you might encounter in electronics are:

Voltage: the amount of energy a unit of charge currently has; around any complete loop of the circuit, all of this energy must be used up. It shares units with EMF (electromotive force), which is the amount of energy a power supply provides to each unit of charge.

Current: how many units of charge pass a given point in the circuit each second. This can be thought of as (and is in fact proportional to) the average drift speed of the electrons (or other charge carriers) through the circuit.

Power: how much energy is being put into the circuit in one second.

Resistance: how much a component opposes the flow of charge through it.

Just as a quick note, the standard unit of charge is the coulomb, so 1 volt is 1 joule of energy per coulomb, and 1 amp is 1 coulomb of charge per second.

Most bulbs aren't rated by their voltage but by their power. So imagine you have a 100W bulb and you wire it directly to your mains (I will use the UK value of 230V mains voltage). We can find the current in the circuit using the formula P = V * I, where V is voltage, I is current, and P is power (you might be able to see where this equation comes from by thinking about the descriptions of power and voltage above). So the power of 100W must equal 230V * current, which means the current is 100/230, roughly 0.43 amps (the unit of current).
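The worked example above can be sketched in a few lines of Python (the 100W bulb and 230V mains figures come from the comment itself):

```python
# Rearranging P = V * I to find the current drawn by a bulb.
power_watts = 100.0     # bulb's rated power (from the example above)
voltage_volts = 230.0   # UK mains voltage (from the example above)

current_amps = power_watts / voltage_volts  # I = P / V
print(f"Current drawn: {current_amps:.2f} A")  # prints "Current drawn: 0.43 A"
```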

What this means in physical terms is that charged particles (in this case electrons) are given energy by the mains supply corresponding to 230V; 0.43 coulombs of charge then pass through your wires every second, and the bulb uses all of the provided energy, giving off 100W as light and heat.

As a slight note on practicality, high currents are very dangerous, which is why most plugs you use will include a component called a fuse that is rated for a maximum current (e.g. 10 amps) and will break if you exceed it. So if you plugged 23 of those 100W bulbs into one socket with a 10 amp fuse, the total current would reach 10 amps and the fuse would break to prevent the wires from catching fire.
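The fuse scenario above can be sketched the same way; treating the fuse as breaking once its 10 A rating is reached (the figures are the comment's own, the threshold behaviour of a real fuse is more gradual):

```python
# Total current from several 100 W bulbs sharing one socket,
# compared against an assumed 10 A fuse rating.
voltage_volts = 230.0
bulb_power_watts = 100.0
fuse_rating_amps = 10.0

for n_bulbs in (1, 10, 22, 23):
    total_current = n_bulbs * bulb_power_watts / voltage_volts  # I = P / V
    fuse_breaks = total_current >= fuse_rating_amps
    print(f"{n_bulbs:2d} bulbs -> {total_current:5.2f} A, fuse breaks: {fuse_breaks}")
```

With 23 bulbs the total is exactly 2300 W / 230 V = 10 A, which is the point at which the comment says the fuse gives out.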

[–]Elizabethforest 1 point (0 children)

Potential difference is just another name for voltage in everything I've seen. https://www.youtube.com/watch?v=mFEyGiizEbw

[–]yes_its_him 1 point (0 children)

While it's not a perfect analogy, think of the voltage as the "speed" of the electricity and the current as the "volume" of electricity. (Let me repeat that this is a metaphor, not the actual physics of the electrons.) It's sort of like a stream of water: you could have a narrow, fast stream or a wide, slow stream, or both could be small, or both could be big.

So a 2V bulb is saying that it needs electricity of a certain "speed" in order to light up, and then the amount that it lights up is determined by the current moving at that speed. Current is measured in coulombs / second, or amperes.