Ensure a good end-user experience with proper fuel gauging.

Efficient operation on batteries is a core requirement of mobile devices, and good battery-powered behavior is critical to a product's success. Battery-powered operation revolves around three main areas:

Well-tuned system power consumption

Proper battery charging

Accurate fuel-gauge prediction of battery status

For mobile devices, tuning system power consumption begins with defining the device use case and modeling its power consumption in various modes. Typically this includes a low "static" suspend state when the device is not in use, and various dynamically managed power states when running in different modes.

Battery charging and fuel gauging are often overlooked or underestimated in system design and implementation efforts. Especially for consumer products, marginal failures in charging or fuel gauging can lead to customer dissatisfaction. Issues such as failing to detect the charger type on 0.1% of plug-in events, or the product suddenly powering off while indicating 15% charge remaining, can be a disaster for your product's reputation when you ship a million units, and they incur significant customer-support costs. Issues such as failing to stop charging when a device overheats can even create life-threatening safety (and liability) issues.

These aspects require careful system design from the start, in both hardware and software. The goal here is to emphasize that charging and fuel gauging follow the 80/20 rule: it takes 20% of the effort to get 80% of the functionality, but more than 80% of the effort to get the final 20% of "finishing touches". Careful initial design, such as in the Snapdragon Open-Q family of development kits, provides a good start.

Battery Choice: Chemistry

Many parameters and characteristics can influence the selection of the appropriate battery technology. The first is the chemistry/technology of the battery itself. Most people are familiar with energy density (Figure 1). Various lithium chemistries are the leaders here, with the highest energy per unit volume or weight. These would be the obvious choices for many electronic consumer devices.

While important, energy density may not be the only deciding factor. Temperature can be an important constraint on charging and discharging. Lithium batteries should generally not be charged below 0°C and are typically specified to stop charging at around 45°C. In general, they should not operate above 60°C, and they contain a permanent thermal fuse to prevent catastrophic thermal runaway if other safety precautions fail. 60°C may seem beyond the normal operating range of the device, but it can easily be reached within a few minutes on the seat or dashboard of a warm car. Consumer products for outdoor use may also face challenges at the low-temperature limit, requiring careful system design.

In addition to temperature, the internal impedance of the battery, and hence the rate at which it can release energy, can also affect your battery choice. In situations where brief high currents (say, >4C) are required, a nickel- or lead-based chemistry may be more suitable. Examples might be a power tool or a system with a large display.

Lithium is the most common chemistry in consumer mobile applications, and a variety of cathode compounds are available, offering different properties, battery shapes, flexibility, and configurations. Each lithium variant has subtly different characteristics in terms of voltage versus state of charge, internal impedance behavior, and stored energy and voltage as a function of temperature. Fuel gauge chip manufacturers refer to these as "golden parameters" and will carefully characterize any battery type to determine the appropriate values to use when configuring the fuel gauge for that battery.

Charging

For a specific battery chemistry, there is an appropriate paradigm for charging that battery. For lithium-based chemistries, the method involves three stages: preconditioning, constant-current charging, and constant-voltage charging.

If lithium-based batteries are discharged beyond a certain low voltage, for example by being connected directly to a resistive load like a light bulb, they can be permanently damaged. To prevent this, most lithium battery packs contain a small protection-circuit PCB with a low-voltage cutoff. When the battery voltage falls below ~2.5 V, the protection FET opens and the apparent voltage at the battery terminals drops to 0 V.

Before charging a battery in this state, it is necessary to "precondition" it by applying a small trickle-charge current (usually 100 mA). To detect whether a battery is connected, apply this current for a few minutes, then remove it and check the battery voltage. This process should continue until the battery reaches the minimum regulation voltage specified by the battery supplier, typically 2.8 to 3.0 V.
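The preconditioning loop described above can be sketched as follows. The hardware-access helpers (`apply_current`, `read_voltage_mv`) and the threshold values are illustrative assumptions, not from any specific charger datasheet:

```python
PRECHARGE_CURRENT_MA = 100    # typical trickle current during preconditioning
MIN_REGULATION_MV = 2900      # supplier-specified minimum, typically 2.8-3.0 V

def precondition(apply_current, read_voltage_mv, max_cycles=20):
    """Apply trickle current in short bursts until the battery voltage
    recovers above the supplier's minimum regulation voltage.

    apply_current(ma, minutes) and read_voltage_mv() stand in for
    charger-hardware access. Returns True once the battery is ready
    for constant-current charging, False if it never recovers
    (battery missing or permanently damaged)."""
    for _ in range(max_cycles):
        apply_current(PRECHARGE_CURRENT_MA, minutes=2)  # trickle burst
        apply_current(0, minutes=0)                     # remove current...
        if read_voltage_mv() >= MIN_REGULATION_MV:      # ...then check voltage
            return True
    return False
```

As noted below, this logic typically has to live in the charger hardware itself, since software may not be running on a dead battery.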

Note that most charger designs allow some or all of the externally supplied charger current to power the system (known as "bypass") rather than flowing into the battery as charging current. Unless your system can run entirely on external power (booting and running software), this precharging of dead batteries must be managed entirely by the charger hardware itself.

Once the constant-current charging stage is reached, the charging current can be increased, usually up to about 1C. This current (measured in mA) is numerically equal to the battery's rated capacity (measured in mAh). Newer chemistries allow higher charging currents (and therefore shorter charge cycles), especially if the battery temperature during charging is carefully controlled. During this stage, the battery voltage rises slowly. The maximum charging current will affect your choice of external chargers, cables, connectors, and so on.

The control of charging current with temperature is the subject of several safety standards, including IEEE 1725 and JEITA. The charger circuit itself can be damaged if subjected to excessive load currents at high temperatures, so charger circuits often include thermally controlled throttling based on chip temperature. More importantly, however, lithium-based batteries can suffer catastrophic thermal runaway if charged at high temperatures (or discharged at high currents).

To prevent the risk of fire and meltdown, batteries typically contain an internal thermal fuse that permanently disables the battery at around 90°C. Well before this point, the system must cut off the charging current when the battery temperature exceeds its limits. The battery manufacturer will provide the temperature limits, but these typically specify no charging above 45°C or below 0°C. In the past these were hard on/off charging limits, but the JEITA standard (required in Japan and increasingly common as a de facto standard elsewhere) defines a more nuanced charge-current derating curve based on battery temperature (Figure 3).
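A JEITA-style derating policy has the general shape below. The breakpoints and derating factors here are illustrative placeholders; real values come from your battery datasheet and the applicable JEITA guideline, not from this sketch:

```python
def jeita_charge_limits(temp_c, fast_current_ma=1000, full_voltage_mv=4200):
    """Return (max charge current in mA, max charge voltage in mV) for a
    given battery temperature, following the shape of a JEITA-style
    derating curve. All breakpoints and factors are illustrative only."""
    if temp_c < 0 or temp_c >= 60:
        return (0, 0)                                   # no charging at all
    if temp_c < 10:
        return (fast_current_ma // 2, full_voltage_mv)  # cold: reduced current
    if temp_c < 45:
        return (fast_current_ma, full_voltage_mv)       # normal fast charge
    # warm zone: reduced current and a lower float voltage
    return (fast_current_ma // 2, full_voltage_mv - 100)
```

Note how the warm zone reduces both current and voltage rather than simply cutting charging off, which is exactly the "more complex derating curve" behavior that distinguishes JEITA from older hard limits.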

Such fine-grained control of charge current typically requires a combination of software and hardware, such as driver-level adjustments to charge thresholds. However, software alone cannot be relied upon. To prevent the battery from overheating and damaging other components, your system may need to shut down in a hot environment, such as a hot car. Startup and charging must be prevented even when no software is running or the software has malfunctioned. See IEEE 1725 and IEEE 1625 for safety-critical requirements in this area.

The battery temperature is usually monitored by a dedicated thermistor, often integrated inside the battery pack itself, especially in systems where the battery is somewhat isolated from the main circuit board. However, this increases the cost of the battery pack, so in designs without a built-in thermistor, careful system thermal design is required.

Full Charge and Termination Current

At the end of the constant-current portion of the charge, the battery will be near its maximum voltage, approximately 4.1 to 4.2 V. At this point the charger must limit its applied voltage to the cutoff voltage, and the charging current naturally tapers off. The voltage chosen for this constant-voltage charging phase affects battery life: too high a voltage accelerates aging, while too low a threshold results in sub-optimal full-charge capacity. A trade-off is made here, typically choosing a voltage around 4.15 to 4.2 V.

The battery is also chemically damaged if subjected to a permanently applied external charging voltage, so when the charging current falls below a certain threshold, the charger terminates its charge cycle and removes the applied voltage entirely. This level is referred to as the taper (or termination) current and is another key parameter when tuning charger circuits.
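The CC/CV policy described across the last few paragraphs boils down to a small decision function per control step. The setpoint and taper values below are the typical figures quoted above, used here purely as illustration:

```python
FULL_VOLTAGE_MV = 4200   # typical CV setpoint (4.15-4.2 V trade-off above)
TAPER_CURRENT_MA = 100   # illustrative termination threshold; a tuning parameter

def charge_step(voltage_mv, current_ma):
    """One decision step of the CC/CV charge policy. Returns the next
    charger mode: 'cc' (hold constant current, voltage rising), 'cv'
    (hold constant voltage, current tapering), or 'done' (remove the
    applied voltage entirely to avoid chemical damage)."""
    if voltage_mv < FULL_VOLTAGE_MV:
        return "cc"                       # below setpoint: constant current
    if current_ma >= TAPER_CURRENT_MA:
        return "cv"                       # at setpoint: let current taper
    return "done"                         # taper reached: terminate charge
```

A real charger IC implements this in hardware; the point of the sketch is only to show why the taper current is a distinct tuning parameter from the CV setpoint.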

If your system is left plugged into the charger, the battery will go through periodic "top-up" charge cycles whenever it discharges by a certain percentage of charge or voltage. The charging system should ensure that these top-ups are mini charge cycles with proper constant-current and constant-voltage portions.

Charging System Design

The system charging circuit is usually a dedicated charging IC, or integrated in the system's power management IC (PMIC). When included as part of a PMIC, the charging system design can define a master node from which all system power is derived. This node can be powered by current bypassed by an external charger, and in some designs also includes the ability to supplement system current from the battery at the same time.

Depending on the system power load, a plugged-in system may charge the battery at the full rate, charge at a throttled rate (because system operation is consuming much of the available power), or even discharge the battery while simultaneously drawing all available power from the external source. Example patterns of charge-current bypass and supplement can be seen in Figure 4.

The charger subsystem in the Qualcomm Snapdragon PMIC family includes SMBB technology, which can reverse the charger circuit into boost mode to generate 5 V on the charger power node to drive power-hungry camera flash LEDs.

USB Type-C includes a power-delivery specification that negotiates higher charger input voltages over a dedicated channel for faster charging. Qualcomm supports a variety of fast-charging technologies built on this principle of charge-voltage negotiation, including Intelligent Negotiation of Optimal Voltage (INOV). Providing a higher input voltage increases the power transfer rate and can greatly reduce charging time.

Fuel Gauging

The final component of a successful battery-powered system is the ability to measure, at any given time, the amount of energy remaining in the system's battery. This is called fuel gauging (or gas gauging). For the user, accurate gauging means that your system does not crash unexpectedly while still extracting the maximum energy from the battery. Allowing your system to fail suddenly due to low voltage is annoying, potentially dangerous, and can result in corrupted or lost data. Some devices, such as those with electrophoretic e-ink screens, can appear to still be on when the battery is actually dead, leading to user confusion, complaints, or customer-support calls.

There are two basic principles for measuring the energy in a battery: mapping the battery voltage to the current state of charge, and "what goes in must come out". The latter, called coulomb counting, sounds simple enough: a circuit integrates the current flowing into and out of the battery to maintain a measurement of the charge (and thus energy) in the battery. In practice it presents challenges, including:

Determining the initial state of charge of the battery

Accounting for battery self-discharge due to internal resistance and leakage

Accounting for energy lost during discharge to the battery's internal impedance

Accurately measuring all discharge, including small leakage currents while the system is powered off and the energy contained in short spikes when large subsystems power on or off
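The core integration step of coulomb counting is simple; it is the compensations listed above that make real gauges hard. A minimal sketch (ignoring self-discharge, impedance loss, and sense-offset error):

```python
class CoulombCounter:
    """Minimal coulomb counter: integrate current samples into a running
    charge estimate. Real gauges must also compensate for self-discharge,
    impedance losses, and measurement offsets -- the challenges listed
    above -- and must be seeded with a trustworthy initial state."""

    def __init__(self, capacity_mah, initial_mah):
        self.capacity_mah = capacity_mah
        self.charge_mah = initial_mah   # the "initial state of charge" problem

    def sample(self, current_ma, dt_s):
        """Integrate one current sample; current_ma > 0 means charging,
        < 0 means discharging. Estimate is clamped to [0, capacity]."""
        self.charge_mah += current_ma * dt_s / 3600.0
        self.charge_mah = max(0.0, min(self.capacity_mah, self.charge_mah))

    def soc_percent(self):
        return 100.0 * self.charge_mah / self.capacity_mah
```

For example, a 2000 mAh battery starting at 1000 mAh that discharges at 500 mA for one hour ends at 25% state of charge.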

Mapping a battery's voltage to its state of charge also has its challenges:

The voltage-versus-state-of-charge curve varies with battery chemistry (see Figure 5)

The battery voltage depends on the internal impedance of the battery: the voltage drop can be large under high current loads

Hysteresis due to charging or discharging: The voltage may be higher or lower than the open-circuit "relaxation" value, depending on how the state of charge is reached

Open circuit voltage decreases with increasing temperature

Because of these challenges, careful tracking of the battery's impedance is critical for effective fuel gauging. This impedance varies with battery condition, such as age and charge/discharge cycle count, so for maximum accuracy an impedance fingerprint is maintained for the specific battery. This means that when a battery is replaced (for example, if your system has a user-accessible battery pack), your system must identify the new pack and re-learn its condition.

Overall, effective fuel gauging requires a combination of open-circuit voltage measurement and coulomb counting. The open-circuit voltage (measured at very low system discharge currents, such as during a suspend or sleep state) is translated to a state of charge using a standard profile for the battery's known chemistry, as shown in Figure 5.
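The profile lookup amounts to interpolating between characterized points on the OCV curve. The profile points below are made-up placeholders; real values come from the "golden parameters" characterized for your specific battery:

```python
# Hypothetical open-circuit-voltage (mV) -> state-of-charge (%) profile
# points for a lithium cell; real points come from battery characterization.
OCV_PROFILE = [(3300, 0), (3600, 10), (3700, 30), (3800, 55),
               (3950, 80), (4100, 95), (4200, 100)]

def ocv_to_soc(ocv_mv):
    """Translate a relaxed open-circuit voltage into state of charge by
    linear interpolation between adjacent profile points."""
    if ocv_mv <= OCV_PROFILE[0][0]:
        return 0.0
    if ocv_mv >= OCV_PROFILE[-1][0]:
        return 100.0
    for (v0, s0), (v1, s1) in zip(OCV_PROFILE, OCV_PROFILE[1:]):
        if v0 <= ocv_mv <= v1:
            return s0 + (s1 - s0) * (ocv_mv - v0) / (v1 - v0)
```

Note the flatness of real lithium OCV curves in the mid range: a few millivolts of measurement error can translate to several percent of state of charge, which is one reason the reading is only trusted after the battery has relaxed.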

An open-circuit voltage measurement can provide a fairly accurate point value of the battery's state of charge, especially if the system has been quiescent ("relaxed") for some time, allowing any hysteresis from charging or discharging to subside. This measurement helps track the battery's actual full-charge capacity against its factory-rated capacity; over time, the full-charge capacity drops as the battery is able to hold less and less charge.

Coulomb counting is used to track energy gains and losses during activity, because your system's use case may not routinely allow effective open-circuit voltage measurements (for example, under continuously varying system current loads). It can be combined with knowledge of the battery's internal impedance to track the state of charge via the battery voltage.

Fuel Gauging System Design

Doing this effectively requires either complex driver software or leveraging the knowledge and development efforts of semiconductor manufacturers who have built fuel gauge hardware and firmware implementing these algorithms efficiently. The gauge can be designed into the battery pack itself, placed as a discrete fuel gauge chip on the power-subsystem PCB, or integrated as a component of the system PMIC.

The option involving the least system integration effort, and generally the fewest integration issues, is in-pack gauging. It has many distinct advantages, including easy tracking of aging data and impedance for each battery pack, easy dead-battery charging and start-up (since the battery capacity can be retrieved from the pack in low-level bootloader software), and very low system integration and debugging effort. However, the BOM cost of the battery pack can be high.

A discrete fuel gauge chip (shown in Figure 6) provides excellent gauging performance and gives access to parameters such as minutes-to-empty based on a windowed average of system power consumption. Solutions from vendors such as Texas Instruments (TI) encapsulate complex algorithms such as impedance tracking and provide customizable features, such as ensuring that the reported state of charge is as close to monotonically decreasing as possible. (If the remaining capacity suddenly jumps from 10% to 20% after a system relaxation period, the end user may be confused, leading to complaints.) As we will see, the separation of the fuel gauge from the charger IC can sometimes lead to tricky edge cases.

Using a highly integrated system PMIC as a combined charger/fuel gauge solution enables full customization and provides the lowest system BOM cost, component footprint, and PCB complexity. Solutions that include road-tested driver software can help reduce software development and tuning effort, especially if you stay close to the reference design.

Paired SoC and PMIC solutions, such as Qualcomm's family of highly integrated Snapdragon processors, provide a rich feature set for charging, gauging, and system power conversion. These include the ability to drive a user-feedback LED (useful during dead-battery charging), internal battery boost technology, and built-in overvoltage protection on the power input. Qualcomm's APQ8016 and its PMIC, the PM8916, are designed into Intrinsyc's Open-Q 410 SOM and reference carrier boards, which provide BSP support for the charging and gauging solution. This reduces the integration and tuning effort required for your chosen battery and power use cases.

Defining Fuel Gauge Parameters

Whether you use a discrete fuel gauge or a PMIC-integrated solution, you need to define gauging parameters specific to your system. The most important of these is the system's empty voltage. This is derived from the system's dead voltage, which in turn is defined by the system hardware: the dead voltage is the lowest battery voltage at which the system can operate normally. Various system components affect this figure, including each regulator's input specifications and its buck or boost configuration. On top of this dead voltage, you need to allow for voltage input tolerances, temperature variation, measurement error, and the spikes that occur when components switch on or off. The end result is a safe empty voltage at which your system must shut down immediately (see Figure 7).

Having defined the system's empty voltage, you now need to specify the system's reserve capacity. This measurement ultimately defines when your battery reaches 0% capacity and when your system should shut down cleanly, ensuring the user is told the system is shutting down and that no data corruption or loss can occur. Reserve capacity is defined in terms of energy and is based on the amount of run-time headroom needed to react to a low battery and perform the shutdown. Sampling delays, the time required to provide user feedback, flushing data stores, and the shutdown itself are all inputs to this parameter. If the empty voltage and reserve capacity are not properly defined, your system will suffer sudden power loss and crash unexpectedly.
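The margin stack-up for the empty voltage can be expressed as a simple sum. The margin values below are placeholders for illustration; real ones must be derived from component datasheets and bench measurements:

```python
def safe_empty_voltage_mv(dead_voltage_mv, regulator_tolerance_mv=50,
                          temperature_margin_mv=50, measurement_error_mv=25,
                          load_spike_mv=100):
    """Stack the margins described above on top of the hardware dead
    voltage to obtain the safe empty voltage (Figure 7). All default
    margin values are illustrative placeholders."""
    return (dead_voltage_mv + regulator_tolerance_mv + temperature_margin_mv
            + measurement_error_mv + load_spike_mv)
```

For example, a hypothetical system with a 3.2 V dead voltage and these placeholder margins would need to declare empty at about 3.425 V, well above the point where the hardware actually fails.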

To wrap up, here is an example of a complex edge case in system charging and gauging. The system features a discrete charger IC that provides system bypass from the source and charge-current throttling, allowing external input current to charge the battery or power the system rails. A discrete fuel gauge IC tracks the state of charge and maintains a measurement of the battery's full-charge capacity based on a combination of open-circuit voltage and coulomb counting.

The system allows charging from either a dedicated charger (1 A) or a USB supply (less than 500 mA). With the system's LCD panel on and playing video, the system power requirement is approximately 800 mA; with the LCD off, it is less than 100 mA, so the battery can be effectively charged from USB power (up to 350 mA). When the system is plugged into USB power and the screen is turned on and off, the battery alternates between charging and discharging.

The fuel gauge IC monitors the battery pack's full-charge capacity (how much charge the battery can hold, accounting for aging and cycling). The amount of energy a fully charged battery can hold is used to calculate and predict the shutdown capacity. The gauge declares a full-charge condition whenever it measures the battery voltage close to the full voltage (greater than 4.0 V) while the average charge current is below the termination current.

Unfortunately, when the user turns the screen on at an inopportune moment, the fuel gauge IC misinterprets the charge throttling (imposed by the separate charger IC) as a full-charge condition: the average charge current drops to 0 mA, so the gauge assumes the battery is fully charged. Then, when the screen turns off, charging resumes, and the gauge ultimately concludes that the battery holds more energy than it actually does.

The result is that when the system is later unplugged and discharged, the misestimated full-charge capacity causes the system to brown out and shut down abruptly, even while the user is being told 20% energy remains. Preventing this requires careful tuning of parameters, good gauging algorithms, coordination between the charger and gauge circuits, and good design around the system use case.
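One way to express the needed charger/gauge coordination is to make full-charge detection refuse to trust a low average current while the charger is throttling. The `charger_throttled` flag is a hypothetical status signal that such coordination would have to provide; the thresholds are illustrative:

```python
def full_charge_detected(voltage_mv, avg_current_ma, charger_throttled,
                         full_voltage_mv=4200, term_current_ma=100):
    """Full-charge detection that avoids the edge case above: a low
    average charge current only counts as termination when the battery
    is near the CV setpoint AND the charger is not currently throttling
    charge current to feed the system load. `charger_throttled` is a
    hypothetical status bit from the charger IC; thresholds are
    illustrative values, not from a specific part."""
    at_cv_voltage = voltage_mv >= full_voltage_mv - 50   # near the setpoint
    current_tapered = 0 <= avg_current_ma < term_current_ma
    return at_cv_voltage and current_tapered and not charger_throttled
```

In the scenario above, the screen turning on would raise `charger_throttled`, so the 0 mA average current would no longer be mistaken for termination.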

Reviewing Editor: Guo Ting
