Selecting and Testing Batteries to Ensure IoT Success
By Brad Jolly, Applications Engineer, Keysight Technologies, Inc.
To garner commercial appeal, manufacturers must carefully consider the right battery design for their IoT devices.
The Internet of Things (IoT) continues its remarkable growth, and although early estimates were overly optimistic, there are still several credible projections in the range of 25 to 30 billion devices by 2020. Despite improvements in energy harvesting technologies, most IoT devices use battery power. As applications become more demanding and customer expectations increase, IoT device manufacturers are challenged to make the right battery design decisions to ensure commercial success.
For example, consider wearable IoT devices, such as smart watches, fitness trackers, virtual reality displays, digital corrective eyewear, and even “smart” socks and shirts for athletes. In these applications, long battery life is a competitive differentiator, as users prefer not to charge batteries frequently. In other contexts, such as smart city and agriculture applications, the cost of getting to a remote sensor to change a battery is often much higher than the cost of the battery itself. In medical IoT devices, especially those implanted in the human body, a battery replacement may cost tens of thousands of dollars, and the cost of a failed battery may go well beyond mere dollars.
Factors to Consider
There are many factors to consider when selecting a battery for an IoT device. The most obvious are the electrical requirements of the device that the battery will power. These, along with the physical characteristics of the battery, must be considered in the environmental and electromagnetic context in which the device will operate. Finally, various cost and business considerations will likely influence battery selection.
The primary consideration, of course, is that the battery must meet electrical requirements, such as the nominal voltage of the device and the total battery capacity, which is usually specified in amp-hours (Ah) or watt-hours (Wh). These values must be considered in light of the application’s physical environment, because different battery technologies have different capacity de-rating curves. For example, one particular 2200 mAh lithium-ion battery de-rates its capacity by 6 percent at 0 degrees C, 17 percent at -10 degrees C, and nearly 60 percent at -20 degrees C. In contrast, the rated discharge time under a constant load for a particular manganese lithium coin cell is only 5 percent less than the room-temperature (23 degrees C) rating. Furthermore, different battery technologies de-rate their capacities differently depending on the rate at which they are discharged.
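
To make the arithmetic concrete, the short Python sketch below applies the de-rating figures cited above to the 2200 mAh lithium-ion cell and estimates runtime at each temperature. The 15 mA average load is a hypothetical value chosen purely for illustration.

    # Estimate usable capacity and runtime after temperature de-rating.
    # De-rating fractions for the 2200 mAh Li-ion cell come from the
    # figures cited above; the 15 mA average load is a hypothetical value.

    NOMINAL_CAPACITY_MAH = 2200
    DERATING_BY_TEMP_C = {23: 0.00, 0: 0.06, -10: 0.17, -20: 0.60}

    def usable_capacity_mah(temp_c: int) -> float:
        """Capacity remaining after de-rating at the given temperature."""
        return NOMINAL_CAPACITY_MAH * (1.0 - DERATING_BY_TEMP_C[temp_c])

    def runtime_hours(temp_c: int, avg_load_ma: float) -> float:
        """Naive runtime estimate: de-rated capacity over average load."""
        return usable_capacity_mah(temp_c) / avg_load_ma

    for t in (23, 0, -10, -20):
        print(f"{t:>4} degC: {usable_capacity_mah(t):7.0f} mAh, "
              f"{runtime_hours(t, 15.0):6.1f} h at 15 mA")

At -20 degrees C, the same cell that delivers roughly 147 hours at room temperature drops below 59 hours, which illustrates why temperature de-rating belongs in any battery-life budget.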
The second characteristic is the battery’s ability to be charged and recharged. How many times can it be recharged? How long does a recharge take? How does capacity de-rate over time? How well does the battery retain charge in storage? How does it respond to multiple incremental trickle charges as opposed to complete recharges?
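
The sketch below shows how such questions translate into back-of-the-envelope planning math, assuming a simple linear fade model. The 0.02 percent per-cycle fade rate is hypothetical; real cells fade nonlinearly, and the rate depends on chemistry, depth of discharge, temperature, and charge profile.

    # Minimal sketch of capacity fade across recharge cycles, assuming a
    # simple linear fade model. The per-cycle fade rate is hypothetical.

    NOMINAL_CAPACITY_MAH = 2200
    FADE_PER_CYCLE = 0.0002  # assumed 0.02% capacity lost per full cycle

    def capacity_after_cycles(cycles: int) -> float:
        """Capacity remaining after the given number of full charge cycles."""
        return NOMINAL_CAPACITY_MAH * max(0.0, 1.0 - FADE_PER_CYCLE * cycles)

    def cycles_until(threshold_fraction: float) -> int:
        """Cycles until capacity falls to the given fraction of nominal."""
        return int((1.0 - threshold_fraction) / FADE_PER_CYCLE)

    print(f"After 500 cycles: {capacity_after_cycles(500):.0f} mAh")
    print(f"Cycles until 80% of nominal: {cycles_until(0.80)}")

Under this assumed model, the cell reaches 80 percent of nominal capacity after 1,000 full cycles; measured fade data from the battery vendor should replace the linear assumption in any real design.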
You should also consider the battery’s cut-off voltage, because your IoT device may need to implement cut-off circuitry to predict and deal with this end-of-life condition. For example, you may include a feature in your device that sends a “battery nearing end-of-life” indication when the battery’s voltage reaches a certain level. In critical applications, it may be important to use the last bit of available battery capacity.
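
As a rough illustration, the sketch below shows how firmware might classify voltage readings against warning and cut-off thresholds. The 3.3 V warning level and 3.0 V cut-off are assumed values for a generic lithium-ion cell, not figures from any particular datasheet.

    # Hypothetical sketch of firmware-style cut-off logic: compare a
    # measured battery voltage against a warning threshold and the cell's
    # cut-off voltage. Both threshold values are illustrative assumptions.

    CUTOFF_V = 3.0   # assumed cut-off voltage for a generic Li-ion cell
    WARNING_V = 3.3  # send "nearing end-of-life" indication below this

    def check_battery(voltage_v: float) -> str:
        """Classify the battery state from a single voltage reading."""
        if voltage_v <= CUTOFF_V:
            return "CUTOFF"  # stop drawing load; operation is unreliable
        if voltage_v <= WARNING_V:
            return "WARN"    # report battery nearing end-of-life
        return "OK"

    for reading in (3.7, 3.25, 2.9):
        print(f"{reading:.2f} V -> {check_battery(reading)}")

In practice, the warning threshold should be set far enough above cut-off that the device has time to transmit the end-of-life indication before the supply collapses.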