As written here, I had a good idea on paper of how much electricity and money an electric car would use. Using the battery size and the range estimate of the car, it was easy to ballpark a figure for the cost to operate. But I still wanted a real number to pin to that estimate. What made a real number even more important: my mother-in-law and I had an agreement that I'd pay her for the electricity the car used. When you buy gas, you're clearly the one paying for it, but charging on someone else's electric bill muddies things. In the interest of fairness I wanted an accurate number for how much the car was actually using. How would I get that?
The car itself has resettable trip meters that also track the kilowatt-hours (kWh) used. I was hesitant to trust this number because it assumes that whatever the car says it used is what actually came through the outlet, and that might not be the case. I needed to know what was really being drawn.

I thought of the possibility of a plug-in electric meter, sort of like the one on the side of your house but portable, so you could plug it in wherever you needed it. I didn't even know if such a thing existed, but thanks to Amazon I found one: the Kill-A-Watt meter, which had decent reviews. I'll probably shill for the product later because it is tremendously useful. You plug the meter into an outlet, plug an appliance into the meter, and it tells you the wattage, total energy used, power factor, voltage, and a few other numbers. With one of these I could slap it into an outlet, plug the car charger into the meter, and get a real number for how much juice was going into the battery.
Here's the idea: I'd compare the Kill-A-Watt meter to the car's display. You'd expect the two numbers to track each other, with the wall meter reading perhaps slightly higher. That's what I found, although the discrepancy between them, sadly, wasn't small at all. The first month the car said it used 248 kWh while the meter said 348 kWh actually came through the outlet. That works out to a ratio of 1.40: for every kWh the car reported, 1.40 kWh came out of the wall, meaning the charging process was only about 71% efficient (248/348). Basically, the car was drawing noticeably more energy than it said it was.
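If you want to double-check that math, here's a quick Python sketch using nothing but the two readings above:

```python
# Sanity check of the meter-vs-car numbers from the first month.
car_kwh = 248.0   # energy the car's trip meter claimed it used
wall_kwh = 348.0  # energy the Kill-A-Watt said came through the outlet

ratio = wall_kwh / car_kwh       # how much extra the wall delivered
efficiency = car_kwh / wall_kwh  # fraction of wall energy the car reported

print(f"wall/car ratio: {ratio:.2f}")            # -> 1.40
print(f"charging efficiency: {efficiency:.0%}")  # -> 71%
```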

Why? My guess is that charging the car with the 110 volt charger is quite inefficient. Some power is lost stepping the voltage up between the outlet, the charger, and the battery. I believe the Ford Focus keeps its battery around 300 volts, so the charger has to "step up" the 110 volt input to above 300 volts for the car to charge. There are going to be losses involved, and apparently they're around 30% (1 - 248/348 ≈ 29%).
It was slightly disheartening to find this out. The car was using quite a bit more electricity than I expected. But crunching some numbers eased the pain: the car still only cost about $0.04 per mile! If the car actually used what it claimed, the number would be lower, around $0.027 per mile, but 4 cents per mile is nothing to gripe about. Compare that to our old Saturn, which did, at very best, 8 cents per mile.
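For the curious, here's a rough sketch of that cost-per-mile math. The electricity rate and monthly mileage below are made-up placeholders, not my real figures, so treat this as an illustration; the gap between the two results is dictated by the 348-versus-248 split:

```python
# Rough cost-per-mile math. The rate and mileage are hypothetical
# placeholders, not actual billing numbers.
rate_per_kwh = 0.12  # hypothetical utility rate, $/kWh
miles = 1050.0       # hypothetical miles driven in the month
wall_kwh = 348.0     # measured at the outlet (first month)
car_kwh = 248.0      # what the car's trip meter reported

wall_cost_per_mile = wall_kwh * rate_per_kwh / miles  # what I actually pay
car_cost_per_mile = car_kwh * rate_per_kwh / miles    # what the car implies

print(f"paying for what the wall delivered: ${wall_cost_per_mile:.3f}/mi")
print(f"if the car's number were right:     ${car_cost_per_mile:.3f}/mi")
```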
Is there a way to get rid of that efficiency loss? As I said, I think it's due to the 110 volt slow charger that came with the car. A 240 volt charger would cut the charging time drastically, and it should be more efficient. One caveat: I don't know if they make 240 volt Kill-A-Watt meters, so I wouldn't be able to measure the true electrical usage. One could also charge elsewhere and get free power; I was trying to do that last summer, but that's a whole other story on its own. For now I'm stuck paying a 40% upcharge on all the power the car uses, probably due to efficiency losses. It's still far cheaper than a gas car, though.
If anyone has any information about this discrepancy or using 240 volt home chargers, please fill me in! I’m open to new information and maybe my “efficiency loss” idea is totally wrong.