It's rare that I can talk about green energy and software issues in one post. Here's some software that someone needs to write.
Over at "This week in batteries", Venkat Srinivasan explains the tradeoffs in modern battery charging. Essentially, the problem is that the higher the voltage you charge a battery to, the more capacity it has to discharge. However, higher voltages also shorten the battery's lifespan, so the next charge holds a little less capacity. Virtually all battery chargers charge to a fixed voltage and then stop, and that voltage is chosen to strike a good balance between capacity and lifespan without really maximizing either. From the article:
> Operating to 4.1V makes things better and extends the life, 4.0 V is even better and so on. So why don’t battery manufacturers cut the voltage off at, say, 4 V to get better battery life? Because every time you cut this voltage down you decrease the capacity of the battery and its run time. The 4.2V cutoff is a compromise between good run time and decent (read “not pathetic”) life.

Venkat goes on to suggest that this means you should charge your devices up to the maximum limit and then unplug them, so that they slowly drop a little below the max, to extend lifetime. He also mentions that Lenovo laptops even let you set the charge limit (as a percentage, not a voltage) if you want to play with these settings yourself.
This is a problem not just for laptops, but also for cell phones, plug-in cars, or any device that uses a battery. I'm certainly not a battery expert, but it seems to me that there is a better solution out there.
Most devices using an expensive battery have some amount of processing power and some amount of long-term storage (flash drive, hard drive, etc.). Laptops, cell phones, and vehicles certainly fall into this category. If the device were able to predict roughly how long it will be used before the next recharge, it could charge to the lowest voltage required to stay alive until then.
Humans are creatures of habit; we often keep roughly the same schedule from day to day or week to week. If the device kept a history of these habits, it should be able to predict usage. For example, storing data like:
- Plug in timestamp
- Unplug timestamp
- Running out of juice timestamp
- Day of week, time of day
- Which user is logged in, or how much they weigh according to the driver's seat sensor
- IP address allocated to the device (or nearby wireless APs)
- GPS location
- Applications that were run
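A minimal sketch of what one record in such a history might look like, assuming the device logs one entry per charge/discharge cycle (all field names here are hypothetical, not from the article):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class ChargeEvent:
    """One charge/discharge cycle as the device might log it (illustrative schema)."""
    plugged_in: datetime                       # plug-in timestamp
    unplugged: Optional[datetime] = None       # unplug timestamp
    ran_dry: Optional[datetime] = None         # when the battery ran out of juice, if it did
    user_id: Optional[str] = None              # logged-in user, or driver's-seat sensor profile
    network: Optional[str] = None              # allocated IP / nearby wireless APs
    location: Optional[tuple] = None           # (latitude, longitude) from GPS
    apps: list = field(default_factory=list)   # applications run during the session

    @property
    def weekday(self) -> int:
        """Day of week derived from the plug-in time (0 = Monday)."""
        return self.plugged_in.weekday()

    @property
    def hour(self) -> int:
        """Hour of day derived from the plug-in time."""
        return self.plugged_in.hour
```

The day-of-week and time-of-day fields fall out of the timestamps for free, so they don't need to be stored separately.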
Using a little bit of simple machine learning, the software could predict how much capacity is needed before the next recharge and charge accordingly. Thus, if the laptop is always plugged in at a desk except on the weekend, a low-voltage charge could be maintained during the week and then ramped up all the way on Saturday morning. Or if the car makes short trips to and from work, but on Friday night is driven to the next city for happy hour with friends, the voltage can be kept high for happy hour and low for the commute. If the device sees a completely new scenario (suddenly turned on in Japan at 3am on a different network), it might switch temporarily to a high voltage/capacity until enough data comes in to know what to expect in the new situation.
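The "simple machine learning" here could be as basic as averaging past drain in similar contexts. A sketch under that assumption, with the context reduced to (weekday, network) and all thresholds and numbers purely illustrative:

```python
from statistics import mean

def choose_charge_level(history, weekday, network,
                        margin=0.15, default=1.0):
    """Pick a target charge fraction (0..1) from past drain seen in a
    similar context. Names and numbers are illustrative, not a real API.

    history: list of (weekday, network, drain_fraction) tuples, where
             drain_fraction is how much capacity the corresponding
             unplugged session actually consumed (0..1).
    """
    similar = [drain for wd, net, drain in history
               if wd == weekday and net == network]
    if not similar:
        # Completely new scenario: charge fully until enough data comes in.
        return default
    predicted = mean(similar)
    # Add a safety margin so a slightly-longer-than-usual day doesn't strand
    # the user, and cap at 100% of capacity.
    return min(1.0, predicted + margin)

# Weekday desk sessions drain little; the Friday road trip drains a lot.
history = [
    (0, "office-lan", 0.20), (1, "office-lan", 0.25),
    (2, "office-lan", 0.22), (4, "roaming",    0.90),
]
light_day = choose_charge_level(history, 0, "office-lan")  # modest charge suffices
road_trip = choose_charge_level(history, 4, "roaming")     # charge all the way
```

A real implementation would fold in more of the logged features (user, GPS, applications) and weight recent history more heavily, but the shape of the decision is the same: predict the next session's drain, then charge only as high as that prediction plus a margin requires.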
This approach would maximize both battery life and capacity while keeping the user blissfully unaware.