mAh is a stupid way to measure batteries. Wh is more relevant.
It also tells nothing about the efficiency of the device. You can add a 50kWh battery to a device but it doesn’t matter if it uses 2kWh at idle
I’d argue Wh is a complete waste. Just use J, which is the much more established unit.
I disagree. Joules are really hard for laypeople to understand. Watt-hours relate directly to the power of a device without conversion, and can even be translated straight into terms of your power bill.
3.6 megajoules? Eh, I guess that’s maybe a lot? Or not?
1000 watt-hours? Oh, like running a microwave for a whole hour? Dang that’s a LOT!
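Just to show those two numbers are the same amount of energy (the 1 kW microwave rating is an assumption for illustration):

```python
# 1 Wh is 1 W sustained for 3600 s, i.e. 3600 J
wh_to_joules = 3600

battery_wh = 1000                  # 1000 Wh = 1 kWh
print(battery_wh * wh_to_joules)   # 3600000 J = 3.6 MJ, the same energy

microwave_w = 1000                 # assumed 1 kW microwave
print(battery_wh / microwave_w)    # 1.0 -> about an hour of microwaving
```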
I believe it actually has more to do with historical conventions in electronics and math (this is just what I remember from hearsay when I was in university as an electronics engineer), but there is also a mathematical reason.
History/hearsay theory
Way back before there were affordable, quality ICs that could measure voltage and current and essentially count joules, the easiest way to measure power draw was to measure current draw (as a voltage across a sense resistor).
To add to this, current sensors are much easier and cheaper than test machines that do the calculations for you.
When lithium and NiCd batteries became standard, replacing the earlier lead-acid batteries (which are measured in Wh), they had an extremely flat voltage curve by comparison and could be treated as sitting at a constant voltage.
Cheaper electronics were also being made, so if a designer wanted to know how long a battery would last, they could take the nominal voltage the battery would sit at the vast majority of the time, measure the circuit's current draw over a short period, do ten seconds of calculation, and have an approximate battery life. There is a joke that engineers approximate π to 3.
Even when designing electronics today, everything is specced in current draw, not power draw. ICs draw X mA during Y operations. Your DC-DC converters have Z quiescent current, and from there you can calculate efficiency. It is much easier to work in current for the energy running through the circuit.
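A rough sketch of that workflow, assuming a nominal 3.7 V single-cell lithium battery and made-up current figures (none of these numbers come from a real datasheet):

```python
# Treat the cell as a constant-voltage source and work entirely in current,
# the way the comment above describes.
nominal_v = 3.7          # assumed Li-ion single-cell nominal voltage
capacity_mah = 5000      # assumed capacity printed on the battery
avg_current_ma = 250     # assumed average draw, measured across a sense resistor

print(capacity_mah / avg_current_ma)             # 20.0 h of runtime, no voltage needed

# Voltage only enters if you want energy/power instead of charge/current:
energy_wh = capacity_mah / 1000 * nominal_v      # 18.5 Wh
avg_power_w = avg_current_ma / 1000 * nominal_v  # 0.925 W
print(energy_wh / avg_power_w)                   # 20.0 h again, same answer
```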
Math/units
Ah is a measure of electrical charge.
Wh is a measure of energy.
Batteries and capacitors hold charge, so they are measured in Ah. Generators that power the grid produce energy, and use of that energy is measured in Wh (the grid also isn't a "constant" voltage source like a battery, since it is AC).
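For reference, the conversion between the two is just the nominal voltage; the voltages below are assumed typical values, not taken from any particular product:

```python
def ah_to_wh(ah, nominal_v):
    """Charge (Ah) -> energy (Wh) at a given nominal voltage."""
    return ah * nominal_v

def wh_to_ah(wh, nominal_v):
    """Energy (Wh) -> charge (Ah) at a given nominal voltage."""
    return wh / nominal_v

print(ah_to_wh(5.0, 3.7))    # 18.5 Wh for a 5 Ah cell at 3.7 V nominal
print(wh_to_ah(54.0, 11.4))  # ~4.7 Ah for a 54 Wh pack at an assumed 11.4 V nominal
```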
The thing is, it does not matter how much charge the battery holds; what matters is how much energy it holds. Without knowing the voltage, the Ah figure is useless.
Sorry, but you are simply wrong. Simple math says that you are wrong.
You can buck or boost convert nearly any voltage to any other voltage.
Then measure the current output of the battery and, boom, you have battery life.
Also electrical charge can be used in many, many very valuable calculations without involving voltage at all.
Let's take an arbitrary example with an arbitrary battery-powered device. Let's say the battery is somewhere between 1 V and 10,000,000 V. You can't measure it, because you might blow up your multimeter.
You know that the battery is 5000 mAh. You can safely measure that all of the circuitry is drawing 1000 mA, because sense resistors and contactless magnetic current measurements don't involve anywhere near dangerous voltages. So you know that the battery will last about 5 hours. What is the voltage? Doesn't matter.
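The same back-of-the-envelope as a snippet, using the made-up numbers from the example above:

```python
capacity_mah = 5000   # printed on the battery
measured_ma = 1000    # measured with a sense resistor or current clamp, no voltage needed

print(capacity_mah / measured_ma)   # 5.0 -> about 5 hours, and the voltage never appears
```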
Yes, charge and the flow of charge are not the entire story, but to say they are useless or do not matter is just a straight lie. It is fine if you don't understand electronics, but then don't spit out misinformation.
Yes, watt-hours would give a more complete picture to slightly tech-inclined consumers (it makes zero difference for 99% of consumers), but then it goes back to not mattering, because you can do the five-second conversion yourself: single-cell lithium batteries overwhelmingly share one nominal voltage.
Literally 90% of calculations related to efficiency are JUST as valid using mA as using W.
"Your device uses 12 mA at idle with a 5000 mAh battery" carries the same information as "your 18.5 Wh battery uses 45 mW at idle."
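Those two spec lines really are the same numbers, run through an assumed 3.7 V single-cell nominal voltage:

```python
nominal_v = 3.7                 # assumed single-cell Li-ion nominal voltage

print(5000 / 1000 * nominal_v)  # 18.5 (Wh) for the 5000 mAh battery
print(12 * nominal_v)           # 44.4 (mW), i.e. the ~45 mW idle figure
print(5000 / 12)                # ~417 (h) of idle from mAh / mA
print(18.5 / 0.045)             # ~411 (h) of idle from Wh / W, same ballpark
```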
I am ONLY speaking from a consumer position, and for consumers Wh is more useful.
The consumer looks at device A and device B and then determines how often they can recharge their device. With Ah you cannot do this unless you know the voltage; with Wh you can make this comparison without any further knowledge.
Yes, this does not include battery life or conversion efficiency. But a consumer measures nothing; they look at the label.
"It is fine if you don't understand electronics, but then don't spit out misinformation."
Btw, no need to insult me. I have never put out misinformation; I may not have stated clearly enough that I am viewing this as a consumer.
Please explain to me what the difference in battery life is between having a 5000 mAh battery and an 18 Wh battery.
Please state the calculation that you would use to "determine how often you have to recharge" that is valid for Wh and not for Ah. I am all for it. If you can cite a single source where the manufacturer gives a specification that would give battery life in Wh, and not in Ah, I will concede the entire argument, say that you were right the whole time, and make a note in every comment that you were right. Please show your calculation work.
"The thing is, it does not matter how much charge the battery holds; what matters is how much energy it holds. Without knowing the voltage, the Ah figure is useless."
This is patently, objectively misinformation and completely false. That is a direct quote of your words, today. That was your last comment. I have already laid out multiple examples of how Ah is a useful measurement and what you can do with it. Therefore, it is misinformation. It is not disinformation, but stating untrue things as fact is misinformation, even if you have no idea you are wrong.
Basically every Laptop manufacturer.
https://www.dell.com/en-us/shop/dell-computer-laptops/latitude-5550-laptop/spd/latitude-15-5550-laptop/s0035l5550usvp?ref=variantstack
"If you can cite a single source where the manufacturer gives a specification that would give battery life in Wh, and not in Ah, I will concede the entire argument ..."
Lol, you literally quoted me, didn't actually read what you quoted, and then did something completely different.
Do you know that battery life ≠ battery capacity? That is not the same measurement, as I have already tried to teach you three times.
"Please state the calculation that you would use to 'determine how often you have to recharge' that is valid for Wh and not for Ah."
What is its idle power draw? What is its power draw under load? Playing video? In sleep mode? That source gives nothing that determines battery life. All it gives is a nearly useless capacity number, just like every other manufacturer. So it is not valid at all. You still have exactly zero more information about battery life.
If I am wrong, please state your calculation of what the battery life is with that 54 Wh battery.
Your entire argument was "Ah is useless and Wh gives consumers the information to determine battery life." So go ahead: determine the battery life.
How is this any different at all if they said it is a 5.8 Ah battery? They don't give any current or power draw.
As an exercise:
Can you tell me the battery life difference between an arbitrary Laptop A with a 54 Wh battery and Laptop B with a 27 Wh battery?
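To make the exercise concrete: a runtime estimate needs an average power (or current) figure that no spec sheet gives, so the draws in this sketch are pure assumptions:

```python
def runtime_h(capacity_wh, avg_power_w):
    """Battery life = capacity / average draw; capacity alone tells you nothing."""
    return capacity_wh / avg_power_w

# Assumed average draws, NOT from any spec sheet:
print(runtime_h(54, 12.0))  # Laptop A: 4.5 h if it averages 12 W
print(runtime_h(27, 4.5))   # Laptop B: 6.0 h if it averages 4.5 W - smaller battery, longer life
```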
"Your entire argument was 'Ah is useless and Wh gives consumers the information to determine battery life.' So go ahead: determine the battery life."
Not quite sure where battery life is coming from; that's not my argument. To restate, and also better phrase, my argument:
Stored energy is the better measure for consumers for quantifying the battery size of a consumer device, compared to Ah (stored charge).
Now I can cross-compare devices based on that and do not have to worry about the voltage of the battery of any other device.
"Please state the calculation that you would use to 'determine how often you have to recharge' that is valid for Wh and not for Ah."
I never claimed that this is possible. I wrote "can recharge", not "have to". I am referring to devices like a power bank, where I can calculate with a simple:
The power bank has 100 Wh and the phone has 25 Wh, so 100/25 = 4 -> I can recharge my phone 4 times using that power bank.
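That calculation as a quick sketch; note it ignores conversion losses, so real-world numbers come in a bit lower (the ~85% efficiency below is an assumption):

```python
powerbank_wh = 100   # from the power bank's label
phone_wh = 25        # from the phone's spec

print(powerbank_wh / phone_wh)               # 4.0 charges, ignoring losses

efficiency = 0.85                            # assumed boost-converter / charging efficiency
print(powerbank_wh * efficiency / phone_wh)  # 3.4 charges as a more realistic estimate
```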
I guess it comes down to whether we want to primarily communicate battery size in terms of charge (Coulombs = Amps * Time) or energy (Joules = Watts * Time).
You multiply the first metric by your operating voltage to get the second, whereas you divide the second metric by your voltage to get the first. It depends on which comes more easily to most people.
With the increasing abundance of electric vehicles people are getting used to (k)Wh as the unit for battery size. It would make sense to use the same unit for smaller electronics as well, IMO.
Yes. I really wish all batteries used watt-hours. All it’d take would be for someone to design a phone that runs at a different voltage and their battery numbers would stop being comparable.
That’s why you watch performance tests on YouTube
A 4Ah battery at 5V would be a 20Wh battery, drop the kilo. Electronics draw power at idle, not energy. 2kWh is meaningless without an idle duration. What are you saying?
Wh may be better for determining total energy storage across differing cell chemistry. mAh is standard for electronics and makes more sense at the design level as the battery voltage is chemistry dependent and known to the designer.
I don't think any manufacturer publishes the voltage their devices run at; it could be anywhere from 3.3 to 5 V. So I don't know how an end user is supposed to compare battery sizes between devices.
They would also have to give current draw, which isn't really possible since each end user has different apps and behavior. So you more often get standby time or video playback time, which are based on an "ideal" (probably non-bloated) clean OS. That's more useful to an end user, but also subject to marketing fudging the figures.
You can often look up the battery chemistry or use an app to access sensors btw.
At the end of the day battery capacity is only one factor of many in battery/charge life and is generally just marketing in the context of phones.
What? They draw power, not energy?
Energy is just the product of power and time. And just like amperage, the power draw of a device varies.
And this should be obvious, but what makes more sense to an electronics engineer doesn’t matter one bit to the end user. And the end user doesn’t know anything about milli-amperes or volts (except maybe their wall outlet voltage).
Yes, power is a rate. As you said, energy is the time integral of power. So it's meaningless to state an "energy draw" without a duration, implied or explicit. E.g., what does drawing 2 kWh at idle even mean?
I agree about end user sentiment; that's what I was trying to suggest as well. The only way to know which battery/phone is going to have better battery life is to find reviews with usage similar to your own, or to cross-compare metrics across devices you're familiar with. In general, phone A with a 4000 mAh battery won't necessarily outlast phone B with a 4500 mAh battery.
Well, you don't say it draws 2 kWh at idle; you say it draws 2 kW at idle. While that would be incredibly inefficient, it means that for every hour the device is idle, it consumes 2 kWh of energy.
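Spelled out with the example figure from above (2 kW is just the number being discussed, not a realistic idle draw):

```python
idle_power_kw = 2.0   # power is a rate
hours_idle = 1.0

energy_kwh = idle_power_kw * hours_idle   # energy = power x time
print(energy_kwh)                         # 2.0 kWh consumed per hour of idling
```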
Oh yeah, battery size isn't sufficient to fully gauge battery life. You need to know power draw to calculate that. And it's good to get battery life ratings from reviews. Great. It helps a lot.
But that doesn't mean we shouldn't get good, comparable physical specs.
Kinda like processors: gigahertz and core counts are far from telling you everything, but that doesn't mean they should be abstracted into some weird unit.
Per the kW vs kWh, see top level reply.
Yeah, a metric would be nice, but it would need a standard test. That's why idle time and video playback time make a good amount of sense. But it's not entirely clear how that would translate into usage, for example in the back country (where the cell network drains power harder) or while traveling. So it's not perfect, but it is probably the best measure given hardware and usage variation. In any case it's subject to marketing fudging the numbers in various ways.
You can optimize your Android device's battery in ways iPhones can't. For example, you can't disable or remove any system app consuming your battery on an iPhone, but that is instantly doable on Android.
Try disabling Google Play services.
To be fair, you can do pretty much anything on a rooted Android.
But I wouldn’t say “instantly” since you’d have to root it first.
Settings -> Apps -> Google Play services -> Disable. Very easy. Nobody is saying "you can disable any app you want on Android and your phone will magically keep running perfectly as though it's not dependent on it", just that it is possible to do so. Yes, I understand disabling Google Play services will cripple many features. It is, however, possible, and you'll still have a functional phone afterwards. The same cannot be said about iPhones.
You can take background permissions from system apps on iOS
iPhones cannot temporarily disable apps, cannot prevent specific apps from accessing the network, can't spoof live location sharing, and cannot even multi-window several apps at once. Those are four simple examples, which I personally find very helpful, that all Androids have been able to do for more than 10 years while iPhones still can't.