The integration of artificial intelligence at the edge (commonly referred to as Edge AI) has opened up new possibilities for IoT applications.
Traditionally, most AI workloads required powerful cloud infrastructure to process large volumes of sensor data. This meant that IoT devices were largely limited to collecting and transmitting raw data, leaving the heavy lifting to remote servers.
Edge AI brings machine learning inference directly onto the device itself (or onto a nearby gateway). Rather than constantly sending data to the cloud for analysis, the device can interpret and act on data locally, in real time. This is made possible by the development of lightweight AI models and increasingly capable edge hardware.
The benefits can be significant. Edge AI reduces latency, cuts down on bandwidth usage, enables operation in disconnected environments, and allows faster responses to critical events.
However, making the most of these advantages requires an understanding of the implications for battery selection. Depending on how it's implemented, Edge AI can be either a battery saver or a battery drainer.

A double-edged sword
There are some obvious burdens on a battery that come with running AI at the edge.
Perhaps the most obvious is the increase in computational demands. AI models, especially deep learning networks, require substantial processing power. When these models are executed locally on an edge device, they ramp up CPU or GPU activity, which in turn consumes more energy. This becomes particularly problematic for devices with limited power resources.
Beyond raw computation, AI inference also requires frequent memory access. For small, battery-dependent devices, memory operations are notoriously power-hungry.
Plus, as the amount of computation required increases, so does heat generation. In environments where thermal management is necessary, additional energy is consumed to keep the system cool and stable.
Put together, these demands can shorten battery life considerably, and careful planning may be needed to select a power source that can cope with the additional load.
However, there are also instances where AI at the edge could help conserve energy.
One of the most energy-intensive operations on IoT devices is data transmission, particularly over cellular or long-range wireless networks like LoRaWAN. By processing data locally and transmitting only filtered or summarized results, AI can greatly reduce the frequency and volume of communication, which directly translates to a longer battery life.
Deployed correctly, Edge AI can also enable smarter power management. AI models can be designed to predict idle periods, dynamically adjust sampling rates, and even trigger radios or sensors only when necessary. This kind of event-driven processing ensures that the device is only active when it needs to be, leading to significant energy savings.
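The event-driven pattern described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the names `run_inference`, `process_window`, and the mean-based "model" are stand-ins, not a real API): the device scores each sensor window locally and wakes the radio only when the score is worth transmitting.

```python
# Hypothetical sketch of event-driven processing: run a cheap local check
# on each sensor window and only power the radio when the result matters.
# run_inference is a stand-in for a tiny on-device model.

def run_inference(window):
    # Illustrative placeholder: score a window by its mean reading.
    return sum(window) / len(window)

def process_window(window, threshold, transmit_log):
    """Transmit a one-number summary only when the score crosses threshold."""
    score = run_inference(window)
    if score > threshold:
        transmit_log.append(round(score, 2))  # radio wakes only here
        return True
    return False  # radio stays asleep; energy saved

# Three windows of readings; only the anomalous one triggers the radio.
windows = [[0.1, 0.2, 0.1], [0.9, 1.1, 1.0], [0.1, 0.1, 0.2]]
sent = []
decisions = [process_window(w, threshold=0.5, transmit_log=sent) for w in windows]
```

The key design point is that the expensive operation (the transmit) sits behind the cheap one (the local score), so the common case costs almost nothing.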
The shifting demands on battery power in edge-enabled IoT devices are already visible in certain sectors.
Where devices are used for predictive maintenance of equipment, a small AI model running locally can analyze vibration patterns and only send alerts when anomalies suggest potential failures. Though the local processing consumes more power than idle operation, the drastic reduction in transmission requirements can significantly extend battery life.
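One simple form this local analysis can take is an energy check on each vibration window against a healthy-machine baseline. The sketch below is illustrative only (the baseline data, `factor`, and function names are assumptions, not a specific product's method): it computes the RMS of a window and flags it only when it exceeds the baseline by a wide margin.

```python
# Hedged sketch of the predictive-maintenance pattern: compare each
# vibration window's RMS energy to a learned baseline and alert only on
# a large deviation. All names and thresholds are illustrative.
import math

def rms(samples):
    """Root-mean-square energy of one vibration window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def is_anomalous(window, baseline_rms, factor=2.0):
    # Flag the window only if its energy exceeds the baseline by `factor`.
    return rms(window) > factor * baseline_rms

baseline = rms([0.1, -0.1, 0.12, -0.09])   # healthy-machine reference
normal = [0.11, -0.1, 0.09, -0.12]         # stays quiet: no transmission
faulty = [0.5, -0.6, 0.55, -0.48]          # triggers an alert
```

In practice the "model" might be a small neural network rather than a threshold, but the battery logic is the same: most windows end with no radio activity at all.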
It’s a similar story in the ever-growing smart agriculture sector.
In agricultural settings, sensors are regularly deployed to monitor soil moisture, light, and temperature to guide smarter approaches to irrigation. These sensors transmit data at regular intervals, but when coupled with Edge AI, they can recognize when conditions have changed meaningfully and communicate only when irrigation is truly necessary. This, too, changes the battery requirements for the device.
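A "send only on meaningful change" policy like the one described can be as simple as a delta filter. The sketch below is an assumption-laden illustration (the readings, the `min_delta` value, and `filter_readings` are all hypothetical): a reading is transmitted only when it drifts far enough from the last value that was sent.

```python
# Illustrative send-on-change filter for the soil-moisture example:
# report a reading only when it differs meaningfully from the last
# transmitted value, instead of at every fixed interval.

def filter_readings(readings, min_delta):
    """Return the subset of readings that would actually be transmitted."""
    transmitted = [readings[0]]  # always send an initial reading
    for value in readings[1:]:
        if abs(value - transmitted[-1]) >= min_delta:
            transmitted.append(value)  # moisture changed enough to matter
    return transmitted

moisture = [42, 42, 41, 43, 35, 34, 22, 23]  # % soil moisture over time
sent = filter_readings(moisture, min_delta=5)
```

Here eight readings collapse to three transmissions; on a LoRaWAN or cellular link, that reduction is where the battery savings come from.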
There are several strategies that developers are exploring to fully capitalize on AI’s potential without compromising battery life, including:
- Using lightweight models designed for microcontrollers to ensure that AI runs efficiently on low-power hardware.
- Applying techniques such as pruning and quantization to cut down on both memory usage and processing time.
- Equipping devices with dedicated AI chips (e.g., Google Coral, NVIDIA Jetson, or custom Neural Processing Units) that are designed to execute tasks with far greater energy efficiency than general-purpose CPUs.
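To make the second bullet concrete, here is a minimal sketch of post-training quantization in plain Python, under simplifying assumptions (a single affine scale/zero-point per tensor; real toolchains are more involved): 32-bit float weights are mapped onto 8-bit integers, cutting memory per weight by roughly 4x.

```python
# Minimal sketch of affine 8-bit quantization: each float weight is
# stored as an integer in [0, 255] plus a shared scale and zero-point.
# Simplified for illustration; real frameworks quantize per-channel,
# calibrate activations, etc.

def quantize(weights):
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255 or 1.0            # guard against flat weights
    zero_point = round(-lo / scale)           # integer that represents 0.0
    q = [max(0, min(255, round(w / scale) + zero_point)) for w in weights]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    return [(v - zero_point) * scale for v in q]

weights = [-1.0, -0.25, 0.0, 0.5, 1.0]
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
# Each restored weight matches the original to within one quantization step.
```

The energy win is twofold: 8-bit values quarter the memory traffic (the power-hungry operation noted earlier), and integer arithmetic is cheaper than floating point on most microcontrollers.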
But whatever the strategies chosen to reduce the impact on battery life, integrating edge processing with an IoT device means there is a need to fully understand the requirements for a battery.
Whether you’re an experienced AI developer or new to edge processing, our team of experts is on hand to guide you through selecting the right battery for your device. They can talk you through your application’s profile and how edge processing affects the choice of power source.
