If you’re looking to use a charger to power up your phone, tablet or something else, then in an ideal world you’d use the charger that came with it. This isn’t always possible, and sometimes, you might need to use a different charger that you have lying around the house.
Different chargers have different amperage ratings. The most common chargers for tech products nowadays are rated at either 1 or 2 amps. But what’s the difference between 1 and 2 amps?
And what will happen if you charge a 1 amp device with a 2 amp charger, and vice versa? Well, that’s what we’re going to explain today.
1 amp vs 2 amps charger
The difference between these two charger amperages is unlikely to cause any problems with your devices. If you charge a device that can accept 2 amps with a 1 amp charger, it will simply charge at a slower rate. And if you charge a 1 amp device with a 2 amp charger, it will just charge as normal, as the device won’t draw anything more than 1 amp.
That’s the simple version, and it’s probably all most people need. But we can dig a little deeper and work out what exactly an amp is.
What is an amp?
An amp is simply the unit we use to measure electrical current. It essentially tells us how many electrons travel through the wire per second. Let’s look at this a little more in depth.
Your charger’s wires are made of metal, which contains billions upon billions of free electrons. We measure electric charge with a unit called the coulomb: 1 coulomb is equal to roughly 6,240,000,000,000,000,000 electrons (I did warn you it would get complex).
And overall, 1 amp is equal to 1 coulomb travelling through the wire per second. 2 amps is literally just double this, so the electrons are flowing at twice the rate in comparison to a 1 amp charger. This is assuming that the chargers have the same voltage.
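If you like seeing this as numbers, here’s a minimal sketch of the same idea. The electron count per coulomb is the rounded figure from above, and the function names are just for illustration.

```python
# Approximate number of electrons in one coulomb of charge.
ELECTRONS_PER_COULOMB = 6.24e18

def amps(electrons_per_second: float) -> float:
    """Current in amps = coulombs passing a point in the wire per second."""
    return electrons_per_second / ELECTRONS_PER_COULOMB

one_amp_flow = 6.24e18      # electrons per second through a 1 amp charger
two_amp_flow = 2 * one_amp_flow  # a 2 amp charger moves exactly double

print(amps(one_amp_flow))   # 1.0
print(amps(two_amp_flow))   # 2.0
```

So “2 amps” really is nothing more mysterious than twice as many electrons per second.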
Voltage and Watts
The voltage of your charger is essentially the pressure applied to the electrical current, pushing the electrons through the wires. It’s more important to match the voltage listed on your device than the amps, because using a drastically different voltage could actually damage your device.
From these two figures we can get the wattage of your charger, which is quite simply the amps multiplied by the volts. Wattage is the electrical power of the charger, which we can now see is made up of amps and volts.
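Since it’s just multiplication, the relationship is easy to sketch in code. The 5 volt figures below are typical for USB chargers, used here purely as illustration.

```python
def watts(volts: float, amps: float) -> float:
    """Electrical power in watts = volts x amps."""
    return volts * amps

# Two typical USB chargers, both 5 V, differing only in amperage:
print(watts(5, 1))  # 5  -> a 1 amp charger delivers 5 watts
print(watts(5, 2))  # 10 -> a 2 amp charger delivers 10 watts
```

Same voltage, double the amps, double the power.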
An example of using amps
So, here’s a good example of how this works. I have an old Nokia 105 phone that I use for calls. It’s not a smartphone, it’s just a cheap £20 phone that you might expect a pensioner to use.
If I look at the charger of this phone, its DC (direct current) is listed as 5 volts. Its amperage is 0.55A. Multiplying these together gives us a wattage of 2.75W.
Now, compare this to my Kindle charger: it has a direct current of 5 volts and an amperage of 1A, which gives us 5W overall. But the main thing to focus on here is the amperage.
The Kindle has a greater amperage than the Nokia device. So what would happen if I were to use my Nokia charger to try and charge up my Kindle? You guessed it. It would just charge at a slower rate.
And what about if I use my Kindle charger to charge up my Nokia? Well, the Nokia device is limited to only being able to accept 0.55A. So it wouldn’t charge any faster, but you could still use the Kindle charger.
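You can put rough numbers on this too. The sketch below assumes an 800mAh battery purely for illustration (that’s not the real Nokia 105 spec), and real charging isn’t perfectly linear, but it shows why the bigger charger doesn’t speed things up.

```python
def hours_to_charge(capacity_mah: float, charger_amps: float,
                    device_max_amps: float) -> float:
    """Rough full-charge time: the device draws whichever is lower,
    its own limit or the charger's rating."""
    effective_amps = min(charger_amps, device_max_amps)
    return (capacity_mah / 1000) / effective_amps

# Hypothetical 800 mAh phone limited to drawing 0.55 A:
print(hours_to_charge(800, 0.55, 0.55))  # on its own 0.55 A charger
print(hours_to_charge(800, 1.0, 0.55))   # on the 1 A Kindle charger - same time
```

Both prints come out identical (around an hour and a half), because the phone caps its own draw at 0.55A no matter how much the charger can supply.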
This is quite simply the difference between a 1A charger and a 2A charger, and whether you can safely use them for the devices around your household. Nowadays, with most of us having a ton of chargers lying about, it’s useful to know whether we can reuse them or throw them away!
When you charge a battery, you do want to make sure that you use the correct amperage. However, if you do use a charger with a slightly higher or lower amperage than your device, it shouldn’t make too much of a difference, aside from your charging time.