Life-Cycle Analysis of Energy Efficiency in Battery Electric Vehicles
Abstract:
This study examines the energy efficiency of battery electric vehicles (BEVs) from a system-level and lifecycle perspective. The analysis highlights that the high mass of automotive battery packs, often an order of magnitude greater than conventional fuel tanks for equivalent energy storage, contributes to increased vehicle weight and may necessitate higher installed power to maintain performance. Consequently, larger battery capacities and associated increases in vehicle dimensions are commonly required, which in turn influences subsystems such as tires, resulting in higher rolling resistance. Operational advantages of BEVs are primarily observed in low-speed acceleration, while overall efficiency can be limited by frictional losses and auxiliary energy demands. Battery production is particularly energy-intensive, accounting for a substantial portion of the embodied energy in BEVs. Charge and discharge efficiencies vary with current rates and usage conditions. Slow charging, which takes approximately 10–12 hours, can reach an efficiency of around 95%, while fast charging from 25 to 75% of capacity over one hour typically achieves an efficiency of 85–90%. Discharge efficiency decreases from near 95% at low rates to roughly 70% at high C-rates (10–15 C). Despite the high efficiency of modern electric motors, including permanent magnet machines, system-level efficiency is further impacted by battery losses, power electronics, and auxiliary components. The effective energy delivery from the grid depends on the generation and distribution infrastructure; in contexts dominated by thermal power and limited renewable penetration, overall electricity efficiency may be around 40%, comparable to modern internal combustion engine vehicles, which operate at 40–50% thermal efficiency. Finally, current battery recycling technologies recover only 40–50% of materials and require additional energy input, highlighting limitations in end-of-life management. These findings suggest that BEVs may not always offer a clear energy efficiency advantage over conventional vehicles when evaluated on a comprehensive, life-cycle basis.
1. Introduction
This work is based on three previous studies, currently under review in the same ACADlore series. The first study focuses on battery-electric vehicles equipped with on-board diagnostic and task-screen systems. The second investigates the power and energy requirements of a fully electric national vehicle fleet in the Italian context. The third analyzes the upgrades required for the national electrical grid, along with the associated costs, to support the large-scale deployment of a fully electric vehicle fleet. The present study focuses on the overall energy sustainability of a fully electric vehicle fleet. It investigates the cost of electricity generation in terms of energy efficiency and compares it with the energy efficiency of modern internal combustion vehicles. The analysis, therefore, considers the energy cost of producing electrical energy relative to that of generating mechanical energy through thermal engines. Furthermore, this work examines the energy costs associated with the manufacturing of current-generation battery systems, with a particular focus on the often-overlooked issue of battery disposal, which remains a contentious and debated topic. The question of battery safety will not be addressed here, as extensive literature already exists on the matter. Notably, a comprehensive aeronautical report on a well-known incident involving a cargo aircraft transporting lithium batteries provides detailed information and experimental evidence regarding the behavior and risks of lithium-based battery technologies [1].
2. Structure of the Work
The present study is structured as follows. The first section examines the energy losses related to the increased mass of battery systems with respect to fossil fuels, with particular attention to the resulting energy dissipation associated with higher vehicle mass. The subsequent section analyzes the energy efficiency of the refueling process, encompassing the electrical grid, distribution infrastructure, and charging and discharging processes. Another section is devoted to characterizing electric motors and evaluating their efficiency. The following section addresses battery end-of-life management, focusing on disposal costs and on the proportion of materials that may be recovered through recycling. Additionally, a specific section is dedicated to assessing the energy costs associated with battery manufacturing, taking into account the overall vehicle mass.
3. Energy Implications of Increased Vehicle Mass in Battery Electric Vehicles
Battery electric vehicles (BEVs) typically exhibit higher curb weights compared to conventional internal combustion engine (ICE) vehicles, primarily due to the mass of the battery pack. The additional mass can have direct consequences on energy consumption during vehicle operation, as increased inertia and rolling resistance lead to higher energy traction requirements. Several studies have suggested that the energy penalty associated with additional battery mass can partially offset the efficiency advantages of electric drivetrains [2], [3]. The overall energy impact of increased vehicle mass depends on multiple factors, including vehicle type, driving cycle, regenerative braking efficiency, and the distribution of additional weight. From a life-cycle perspective, heavier vehicles may also require more energy for chassis and structural reinforcement, which contributes further to the embodied energy of BEVs [4].
As an illustrative example, consider a medium-size electric vehicle with a battery pack of 75 kWh, which adds approximately 500 kg compared to a conventional ICE vehicle of comparable size [5]. The additional energy required for propulsion can be estimated using the relationship between vehicle mass and energy consumption per kilometer:

$ E_{\text {extra }}=\Delta m \cdot \alpha \cdot d $

where $\Delta m$ is the additional mass (kg), $\alpha$ is the mass-specific energy consumption factor (Wh/km/kg), and $d$ is the annual driving distance (km). Literature values for $\alpha$ range from 0.15 to 0.25 Wh/km/kg depending on driving cycle and regenerative braking effectiveness [3]. Using $\Delta m$ = 500 kg, $\alpha$ = 0.2 Wh/km/kg, and $d$ = 15,000 km/year, the additional annual energy expenditure is:
$ E_{\text {extra }}=500 \cdot 0.2 \cdot 15,000=1,500,000 \mathrm{~Wh}=1.5\; \mathrm{MWh} / \text {year} $
This estimate suggests that the battery-induced mass increase can lead to a non-negligible energy penalty during vehicle operation, which should be considered in comparative life-cycle assessments of BEVs and ICE vehicles. The magnitude of this effect may be partially mitigated by regenerative braking and optimized vehicle design [2], [3], [4], [5], [6]. Indicative values of the additional annual energy required for vehicle propulsion due to increased battery mass are provided in Table 1. A graphic representation of the same trend is shown in Figure 1.
| Battery Capacity (kWh) | Additional Mass (kg) | Energy Factor (Wh/km/kg) | Additional Annual Energy (MWh/Year) |
|---|---|---|---|
| 40 | 250 | 0.2 | 0.75 |
| 60 | 375 | 0.2 | 1.125 |
| 75 | 500 | 0.2 | 1.5 |
| 90 | 600 | 0.2 | 1.8 |
| 100 | 650 | 0.2 | 1.95 |
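For readers who wish to reproduce these figures, a minimal Python sketch of the relation $E_{\text {extra }}=\Delta m \cdot \alpha \cdot d$ is given below. The capacity-to-mass pairs are the indicative values of Table 1, while $\alpha$ = 0.2 Wh/km/kg and an annual distance of 15,000 km are the assumptions stated above.

```python
# Illustrative sketch (not the authors' tool): reproduces the mass-induced
# energy penalty E_extra = delta_m * alpha * d reported in Table 1.

ALPHA_WH_PER_KM_KG = 0.2      # mass-specific consumption factor (Wh/km/kg)
ANNUAL_DISTANCE_KM = 15_000   # assumed annual driving distance (km)

PACKS = [  # (battery capacity in kWh, additional mass in kg), from Table 1
    (40, 250), (60, 375), (75, 500), (90, 600), (100, 650),
]

def extra_annual_energy_mwh(delta_m_kg: float,
                            alpha: float = ALPHA_WH_PER_KM_KG,
                            distance_km: float = ANNUAL_DISTANCE_KM) -> float:
    """Additional propulsion energy per year (MWh) due to extra vehicle mass."""
    return delta_m_kg * alpha * distance_km / 1e6  # Wh -> MWh

for capacity_kwh, delta_m in PACKS:
    print(f"{capacity_kwh:>4} kWh pack, +{delta_m} kg: "
          f"{extra_annual_energy_mwh(delta_m):.3f} MWh/year")
```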

4. Energy Efficiency of Electricity Production in the Italian Grid
The efficiency of electricity production in Italy is influenced by the mix of generation technologies, grid losses, and the relative contribution of renewable sources. According to recent data, the Italian electricity system is characterized by a combination of thermal power plants, hydroelectric facilities, and an increasing share of wind and solar generation [7]. The dominant utility, Enel, contributes substantially to both conventional and renewable electricity production [8]. Thermal power plants, particularly those fueled by natural gas, typically exhibit conversion efficiencies of 40–45%, depending on the technology and load factor [7]. Renewable sources such as hydro, wind, and solar contribute electricity with effectively higher conversion efficiency in terms of primary energy input, since they do not require combustion of fossil fuels [9]. The weighted average generation efficiency in Italy is therefore higher than that of thermal-only plants, but still substantially affected by the contribution of fossil-based generation. Transmission and distribution losses further reduce the effective energy available to end users. Italian grid statistics report overall losses in the order of 5–7% for transmission and an additional 5–7% for distribution. Considering these losses, the effective efficiency of delivering 1 kWh of electricity from the grid can be estimated as the product of the weighted generation efficiency and the fraction of energy remaining after losses. The efficiency is expected to improve over time due to the increasing share of low-carbon and renewable generation, as well as ongoing grid modernization efforts [7], [8]. However, until the share of fossil-based generation is substantially reduced, the overall well-to-grid efficiency remains below the ideal limit implied by purely renewable generation.
For illustration, consider a hypothetical 1 kWh of electricity delivered to the end user. Let the weighted average primary energy conversion efficiency of the Italian generation mix be $\eta_g$ = 45%, and the combined transmission and distribution losses be $L$ = 10%. The effective primary energy required to supply 1 kWh is given by:

$ E_{\text {primary }}=\frac{1}{\eta_g \cdot(1-L)}=\frac{1}{0.45 \cdot 0.90} \approx 2.5\; \mathrm{kWh} $
This calculation indicates that approximately 2.5 kWh of primary energy is required to deliver 1 kWh of electricity to an Italian end user under current conditions. The value is sensitive to the proportion of renewable generation and the efficiency of thermal plants, as well as improvements in grid losses [9], [10].
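The same estimate can be expressed as a short computational sketch; the 45% generation efficiency and 10% combined grid losses are the illustrative values assumed above, not measured grid data.

```python
# Minimal sketch of the well-to-grid estimate in Section 4: primary energy
# needed to deliver 1 kWh, assuming eta_g = 45% generation efficiency and
# L = 10% combined transmission/distribution losses (illustrative values).

def primary_energy_per_kwh(eta_generation: float, grid_losses: float) -> float:
    """Primary energy (kWh) required to deliver 1 kWh to the end user."""
    delivered_fraction = 1.0 - grid_losses
    return 1.0 / (eta_generation * delivered_fraction)

print(primary_energy_per_kwh(0.45, 0.10))  # ~2.47 kWh of primary energy per kWh delivered
```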
5. Efficiency of Current Internal Combustion Engines (ICE) in Road Vehicles
In conventional ICEs for road vehicles, comparatively high efficiency levels can still be achieved, particularly with compression ignition engines. Several studies indicate that modern diesel engines can achieve peak thermal efficiencies of approximately 45%, although the effective efficiency observed under real driving conditions may decrease to values closer to 35% when exhaust aftertreatment systems and emission constraints are considered [10], [11].

Alternative combustion concepts have been proposed with the objective of further improving efficiency while maintaining relatively simple engine architectures. Among these, dual-fuel operation, typically combining diesel combustion with gaseous fuels such as liquefied petroleum gas (LPG), has received increasing attention. In such configurations, the base engine design often remains largely unchanged and may correspond to emission levels originally associated with Euro 3 standards. Dual-fuel operation, when combined with appropriate control strategies, may allow compliance with more stringent emission regulations, potentially approaching Euro 6d limits [12]. Under favorable operating conditions, brake thermal efficiencies exceeding 50% have been reported in experimental investigations, although such values are generally limited to specific operating points [13]. Vehicles equipped with these powertrains may retain relatively low mechanical complexity and moderate mass, which could translate into reduced manufacturing costs when compared to more highly electrified vehicle architectures. However, such advantages are strongly dependent on system integration and regulatory constraints.

The situation differs for spark ignition gasoline engines. Under partial load operation, which is representative of typical driving conditions, achieving high thermal efficiency remains challenging. Reported efficiency values typically fall within the range of 20–25%, primarily due to throttling losses and combustion limitations [10]. Engines operating on gaseous fuels such as natural gas may achieve somewhat higher efficiencies, particularly when optimized combustion concepts are employed, and in some cases values comparable to those of diesel engines have been observed [14].

Overall, ICE-powered road vehicles cannot be unambiguously characterized as inherently low-efficiency systems. Incremental improvements may be achieved through mild hybridization, in which an auxiliary electric machine supports the internal combustion engine. Such systems can reduce efficiency penalties during start-up, low-speed operation, and transient conditions, while assisting torque delivery and enabling operation closer to optimal efficiency regions. As a result, mild hybrid architectures may offer efficiency gains without a substantial increase in vehicle mass or system complexity [15]. Indicative efficiency ranges for different propulsion technologies, based on representative literature sources and subject to operating conditions, are provided in Table 2. A graphical comparison of typical operating efficiencies is shown in Figure 2.
| Powertrain Type | Fuel or Energy Source | Peak Efficiency | Typical Operating Efficiency |
|---|---|---|---|
| Diesel ICE | Diesel fuel | 40–50% | 30–50% |
| Gasoline ICE | Gasoline | 30–35% | 20–25% |
| Natural gas ICE | Compressed natural gas | 35–50% | 25–50% |
| Diesel-LPG dual fuel ICE | Diesel + LPG | 45–50% | 35–45% |
| Mild hybrid diesel ICE | Diesel + electric assist | 45–50% | 35–45% |
| Battery electric drivetrain | Electricity (on-board) | 90–95% | 70–85% |
| Electricity generation and distribution | Grid electricity | – | 35–45% |
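As a hedged illustration of how the ranges in Table 2 can be read together, the sketch below multiplies the typical operating efficiencies of electricity generation and distribution with those of the on-board battery electric drivetrain to obtain an approximate primary-to-wheel range for BEVs. This chain model is an assumption of the present illustration rather than a method from the cited sources, and upstream fuel production losses are not included in the tank-to-wheel figures quoted for the ICE cases.

```python
# Hedged illustration (assumed chain model): combining the "typical operating
# efficiency" ranges of Table 2 to compare a BEV primary-to-wheel estimate
# with ICE tank-to-wheel values. Fuel refining losses are not considered.

GRID = (0.35, 0.45)            # electricity generation and distribution (Table 2)
BEV_DRIVETRAIN = (0.70, 0.85)  # on-board battery, power electronics, motor (Table 2)
DIESEL_ICE = (0.30, 0.50)      # tank-to-wheel (Table 2)
GASOLINE_ICE = (0.20, 0.25)    # tank-to-wheel (Table 2)

def chain(*stages):
    """Multiply (low, high) efficiency ranges of successive conversion stages."""
    low = high = 1.0
    for lo, hi in stages:
        low *= lo
        high *= hi
    return low, high

bev_low, bev_high = chain(GRID, BEV_DRIVETRAIN)
print(f"BEV, primary-to-wheel: {bev_low:.0%}-{bev_high:.0%}")
print(f"Diesel ICE, tank-to-wheel: {DIESEL_ICE[0]:.0%}-{DIESEL_ICE[1]:.0%}")
print(f"Gasoline ICE, tank-to-wheel: {GASOLINE_ICE[0]:.0%}-{GASOLINE_ICE[1]:.0%}")
```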

6. Charge and Discharge Efficiency of Lithium-Ion Batteries
The energy efficiency of lithium-ion batteries is a key factor affecting the overall performance and life-cycle energy balance of electric vehicles. Battery charging and discharging processes are associated with losses due to internal resistance, electrochemical polarization, and thermal effects. Efficiency depends on both the current rate and the state of charge, as well as the operating temperature [2], [16]. At a reference temperature of 32 ℃, typical lithium-ion cells exhibit different efficiency characteristics depending on the charging protocol. Slow charging over an extended period, such as a 12-hour full charge, achieves high efficiency, with reported energy recovery rates of approximately 95% [2]. In contrast, fast charging over a reduced state-of-charge window, for example from 25% to 75% capacity at a rate of 1 C, results in slightly lower efficiency, around 88%, due to higher resistive and kinetic losses [5]. Discharge efficiency is similarly dependent on current. At moderate rates of approximately 0.9 C, lithium-ion cells maintain high efficiency, typically near 90%. However, at high discharge rates, such as 10–15 C, internal losses increase, and efficiency can drop to about 70% [2], [5]. These variations highlight that the effective energy available from a battery pack is strongly influenced by both usage patterns and the rate of power delivery. Considering these effects is essential for realistic modeling of vehicle energy consumption and life-cycle energy assessments. High rate charging and discharging, while improving operational flexibility, incur an energy penalty that partially offsets the advantages of electric drivetrains. Maintaining temperature control around nominal conditions, such as 32 ℃, is critical for minimizing additional losses and ensuring consistent performance [16]. Typical lithium-ion battery efficiencies for various charging and discharging rates are reported in Table 3.
| Mode | Current (C-Rate) | Efficiency (%) |
|---|---|---|
| Charge (slow, full 12 h) | 0.08–0.1 | 95 |
| Charge (fast, 25–75%) | 1 | 88 |
| Discharge (moderate) | 0.9 | 90 |
| Discharge (high rate) | 10–15 | 70 |
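A simple multiplicative combination of the single-step values in Table 3 gives an indicative round-trip figure for different usage patterns; this is only a rough sketch, since it neglects the temperature and state-of-charge dependencies discussed above.

```python
# Sketch (assumed multiplicative combination of the values in Table 3):
# approximate round-trip battery efficiency for two usage patterns.

EFFICIENCY = {  # single-step efficiencies from Table 3
    "charge_slow": 0.95,          # ~0.08-0.1 C, full 12 h charge
    "charge_fast": 0.88,          # 1 C, 25-75% state-of-charge window
    "discharge_moderate": 0.90,   # ~0.9 C
    "discharge_high": 0.70,       # 10-15 C
}

def round_trip(charge_key: str, discharge_key: str) -> float:
    """Product of charge and discharge efficiency (thermal effects neglected)."""
    return EFFICIENCY[charge_key] * EFFICIENCY[discharge_key]

print(f"Slow charge + moderate discharge: "
      f"{round_trip('charge_slow', 'discharge_moderate'):.0%}")
print(f"Fast charge + high-rate discharge: "
      f"{round_trip('charge_fast', 'discharge_high'):.0%}")
```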
7. Energy Expenditure Associated with Automotive Battery Manufacturing
The manufacturing of lithium-ion battery packs for electric vehicles is generally regarded as an energy-intensive process, and their contribution to the overall life-cycle energy balance has been widely discussed in the literature. For large battery packs, such as those employed in long-range battery electric vehicles, the energy required for raw material extraction, material processing, cell production, and pack assembly may represent a substantial fraction of the total embodied energy of the vehicle. Several life-cycle assessment (LCA) studies indicate that the energy demand for battery production varies significantly depending on cell chemistry, manufacturing location, and process efficiency. Reported values for the cumulative energy required to produce lithium-ion battery cells typically range from approximately 50 to 100 kWh of primary energy per kWh of battery capacity, although higher values have been reported in cases characterized by carbon-intensive electricity mixes or less optimized production processes [2], [5]. For battery packs with capacities representative of current long-range electric vehicles, such as those exceeding 60 kWh, the total energy expenditure associated with battery manufacturing may therefore reach several MWh. This energy demand is often dominated by energy-intensive steps such as electrode material synthesis, drying processes, and climate-controlled dry rooms required during cell assembly [4]. The geographical location of battery production plays a critical role in determining both the energy intensity and the associated environmental impact. Manufacturing facilities supplied by electricity grids with a high share of fossil-based generation may exhibit significantly higher primary energy consumption and greenhouse gas emissions compared to facilities operating with low-carbon electricity sources [16]. Consequently, the same battery design may exhibit markedly different embodied energy values depending on regional production conditions. It is also important to note that technological progress and process optimization may lead to reductions in manufacturing energy demand over time. Improvements in cell chemistry, increased production scale, and more efficient manufacturing equipment have been identified as key factors that could lower the energy intensity of battery production in future industrial scenarios [5]. However, at present, battery manufacturing remains a non-negligible contributor to the overall energy footprint of battery electric vehicles and should be explicitly considered in comparative assessments of propulsion technologies.
To provide a quantitative perspective, the energy cost of producing a lithium-ion battery pack can be estimated by multiplying the nominal capacity of the pack by the energy demand per unit of capacity reported for a given production process. Using reported ranges from life-cycle assessments, the cumulative energy demand for a typical automotive lithium-ion cell is approximately 50–100 kWh of primary energy per kWh of battery capacity [2], [5]. For a battery pack of 75 kWh, which is representative of a long-range electric vehicle, the total energy expenditure can be estimated as:

$ E_{\text {pack }}=C_{\text {pack }} \cdot E_{\text {per kWh }} $

where $C_{\text {pack }}$ is the battery capacity in kWh and $E_{\text {per kWh}}$ is the energy demand per unit of capacity. Using $E_{\text {per kWh}}$ = 50–100 kWh/kWh, the total energy requirement for a 75 kWh battery pack is approximately:
$ E_{\text {pack }}=75 \cdot 50 \;\text{to}\; 75 \cdot 100 \approx 3.75-7.5 \;\mathrm{MWh} $
These values highlight the substantial primary energy input required for battery production and underscore the importance of considering manufacturing energy in life-cycle assessments of electric vehicles. Regional electricity mix and production efficiency can cause significant deviations from these estimates [4], [16].
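The same estimate can be reproduced with the short sketch below, using the 50–100 kWh of primary energy per kWh of capacity quoted above from the cited life-cycle assessments.

```python
# Minimal sketch of the estimate E_pack = C_pack * E_per_kWh used in Section 7,
# with the literature range of 50-100 kWh of primary energy per kWh of capacity.

def pack_embodied_energy_mwh(capacity_kwh: float,
                             energy_per_kwh_low: float = 50.0,
                             energy_per_kwh_high: float = 100.0) -> tuple[float, float]:
    """Return a (low, high) estimate of manufacturing primary energy in MWh."""
    return (capacity_kwh * energy_per_kwh_low / 1000.0,
            capacity_kwh * energy_per_kwh_high / 1000.0)

low, high = pack_embodied_energy_mwh(75.0)
print(f"75 kWh pack: {low:.2f}-{high:.1f} MWh of primary energy")  # 3.75-7.5 MWh
```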
8. Battery Recycling
Battery recycling represents a critical aspect of the sustainability of electric vehicles, as it addresses both resource recovery and environmental impacts associated with end-of-life battery disposal. Lithium-ion batteries used in automotive applications contain multiple components, including an external casing typically made of steel, copper conductors, and active electrode materials embedded in electrochemical cells [16], [17]. While the metallic casing and copper conductors are relatively straightforward to recycle, the electrochemical cells present significant challenges. The cells are composed of wound electrode assemblies encapsulated in plastic or polymer foams, which provide mechanical stability but hinder the disassembly and separation of individual components [18]. Although in principle the electrodes could be extracted and processed, in practice, mechanical resistance, residual electrolytes, and contamination reduce the efficiency of direct material recovery. Current industrial recycling methods generally involve shredding or granulating the battery modules, followed by physical and chemical separation processes to recover metals and other valuable materials [17]. Reported recovery rates for lithium, cobalt, nickel, and other active materials vary widely. Optimistic estimates suggest that up to 75% of materials can be recovered under ideal conditions, whereas typical operational recovery rates in commercial facilities often do not exceed 40% [16], [18]. These limitations underscore that battery recycling remains partial and energy-intensive, with significant scope for improvement in both material recovery efficiency and environmental performance.
9. Life-Cycle Energy Comparison Between BEV and ICE Vehicles
This section presents a comparative life-cycle primary energy analysis of BEVs and ICE vehicles operating on gasoline, diesel, and natural gas. The comparison includes vehicle manufacturing, energy carrier production, vehicle operation, and end-of-life management. Three reference driving distances are considered: 100,000 km, 200,000 km, and 300,000 km. Due to battery lifetime limitations, the BEV is not assumed to remain operational up to 300,000 km.

The BEV configuration is based on a medium-size passenger vehicle equipped with a 75 kWh lithium-ion battery pack. According to recent life-cycle assessments, the cumulative primary energy required for battery manufacturing ranges between 50 and 100 kWh per kWh of battery capacity. A conservative mid-range value of 75 kWh per kWh is adopted, resulting in an embodied battery energy of approximately 5.6 MWh. Including vehicle manufacturing and end-of-life battery processing, the total production energy is estimated at 8.0 MWh. Electricity supplied to the BEV is assumed to be generated within a grid characterized by an average well-to-wheel efficiency of 40 percent, accounting for generation and transmission losses. The operational electricity consumption of the BEV is assumed to be 18 kWh per 100 km at the plug and is converted to primary energy accordingly. The resulting life-cycle primary energy values are summarized in Table 4.

The gasoline ICE vehicle is assumed to consume 6.5 L per 100 km under real driving conditions. The lower heating value of gasoline is assumed to be 8.6 kWh per liter, and an average tank-to-wheel efficiency of 23 percent is used. Diesel and natural gas vehicles are grouped into a single category and are assumed to operate at an average tank-to-wheel efficiency of 45 percent, which is representative of modern, high-efficiency compression-ignition and optimized gaseous-fuel engines. For consistency, the same mechanical energy delivered per kilometer implied by the gasoline case is preserved.

End-of-life recycling is not credited as an energetic benefit. For conventional ICE vehicles, dismantling and recycling primarily involve metallic components, with recovery rates exceeding 80 percent and limited additional energy demand. For BEVs, end-of-life management is dominated by traction battery recycling, which typically recovers only 40–50 percent of active materials and requires significant additional energy input. Disposal energy is therefore treated as an additive penalty in the life-cycle primary energy balance.

A direct visual comparison of cumulative primary energy consumption as a function of vehicle technology and driving distance is provided in Figure 3. Percentage variations of total life-cycle primary energy consumption relative to the gasoline ICE baseline are reported in Table 5.
| Vehicle Type and Distance | Production (MWh) | Operation (MWh) | Total (MWh) |
|---|---|---|---|
| BEV – 100,000 km | 8.0 | 11.3 | 19.3 |
| BEV – 200,000 km | 8.0 | 22.5 | 30.5 |
| BEV – 300,000 km | – | – | Not applicable |
| Gasoline ICE – 100,000 km | 4.0 | 24.3 | 28.3 |
| Gasoline ICE – 200,000 km | 4.0 | 48.6 | 52.6 |
| Gasoline ICE – 300,000 km | 4.0 | 72.9 | 76.9 |
| Diesel/CNG ICE (45%) – 100,000 km | 4.0 | 12.4 | 16.4 |
| Diesel/CNG ICE (45%) – 200,000 km | 4.0 | 24.8 | 28.8 |
| Diesel/CNG ICE (45%) – 300,000 km | 4.0 | 37.3 | 41.3 |

| Vehicle Type | 100,000 km (%) | 200,000 km (%) |
|---|---|---|
| BEV | -31.8 | -42.0 |
| Gasoline ICE (reference) | 0.0 | 0.0 |
| Diesel/CNG ICE (45%) | -42.0 | -45.2 |
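The percentage variations of Table 5 follow directly from the totals of Table 4; the short sketch below reproduces them and may be useful for extending the comparison under different assumptions.

```python
# Sketch reproducing Table 5 from the totals of Table 4: percentage variation
# of life-cycle primary energy relative to the gasoline ICE baseline.

TOTALS_MWH = {  # (vehicle, distance in km): total life-cycle primary energy (Table 4)
    ("BEV", 100_000): 19.3,
    ("BEV", 200_000): 30.5,
    ("Gasoline ICE", 100_000): 28.3,
    ("Gasoline ICE", 200_000): 52.6,
    ("Diesel/CNG ICE", 100_000): 16.4,
    ("Diesel/CNG ICE", 200_000): 28.8,
}

def variation_vs_gasoline(vehicle: str, distance_km: int) -> float:
    """Percentage variation of total energy relative to the gasoline ICE case."""
    baseline = TOTALS_MWH[("Gasoline ICE", distance_km)]
    return 100.0 * (TOTALS_MWH[(vehicle, distance_km)] - baseline) / baseline

for vehicle in ("BEV", "Diesel/CNG ICE"):
    for distance in (100_000, 200_000):
        print(f"{vehicle}, {distance} km: "
              f"{variation_vs_gasoline(vehicle, distance):+.1f}%")
```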
10. Conclusions
The analysis suggests that the energy efficiency of BEVs may be more limited than is often implied, particularly when evaluated from a system-level and life-cycle perspective. One relevant factor is the mass of the battery pack, which can be approximately an order of magnitude higher than that of a conventional fuel tank for an equivalent amount of stored energy. This additional mass generally increases overall vehicle weight and may necessitate higher installed power to achieve acceptable performance characteristics. As a result, larger battery capacities are often required, contributing to an increase in vehicle dimensions. This design scaling tends to affect several vehicle subsystems, including tire dimensions. Larger tires are typically associated with higher rolling resistance, which may increase frictional losses and negatively impact overall efficiency. Such interactions can promote vehicle architectures that are comparatively large and whose performance advantages are mainly concentrated in low-speed acceleration, a regime in which electric motors typically exhibit favorable torque characteristics.

From a life-cycle perspective, vehicle size is also correlated with increased energy demand during manufacturing. Battery production appears to be an energy-intensive process and may represent a substantial share of the total embodied energy of BEVs.

Operational efficiency is further influenced by battery charging and discharging processes. Under slow charging conditions, with durations on the order of 10 to 12 hours, charging efficiencies close to 95% are commonly reported. In contrast, fast charging scenarios, for example charging from 25 to 75% of nominal capacity within approximately one hour, may exhibit efficiencies closer to 85–90%. Discharge efficiency is strongly dependent on the power demand, potentially decreasing from values near 95% at low discharge rates to approximately 70% at high C-rates, on the order of 10–15 C. Although modern electric traction motors, including permanent magnet machines that employ rare-earth materials, can reach peak efficiencies approaching 95%, the overall drivetrain efficiency is reduced once battery losses, power electronics, and auxiliary systems are factored in.

The system-level efficiency of electric mobility is also highly dependent on the electricity generation and distribution infrastructure. In the Italian context, and assuming the absence of nuclear power generation in the medium term, electricity production is likely to remain largely dependent on conventional thermal power plants, predominantly fueled by natural gas. If the penetration of renewable energy sources is close to saturation, the combined efficiency of electricity generation and grid distribution may result in an effective energy delivery efficiency of around 40%. These values are comparable to the efficiency range of modern internal combustion engines operating on diesel or natural gas, which may reach 40–50% thermal efficiency while complying with advanced emission standards. Under these boundary conditions, BEVs may not exhibit a clear advantage over contemporary internal combustion engine vehicles in terms of overall energy efficiency.
When the analysis is extended to include cumulative primary energy consumption over representative vehicle lifetimes, high-efficiency internal combustion vehicles operating on diesel or natural gas appear, under the assumptions adopted in this study, to offer a more favorable life-cycle energy balance than current-generation BEVs, particularly at medium to high mileages. This outcome does not imply an intrinsic superiority of these technologies in all contexts but rather highlights the sensitivity of comparative results to factors such as electricity generation efficiency, battery lifetime, and end-of-life management. Consequently, from a strictly energetic perspective, diesel and natural gas vehicles may represent a competitive solution within the present technological and infrastructural framework. Ultimately, the end-of-life management of traction batteries remains a significant challenge. Current recycling technologies appear to recover only approximately 40–50% of battery materials and typically require additional energy input, which should be accounted for in comprehensive energy and sustainability assessments.
The data used to support the research findings are available from the corresponding author upon request.
The author declares no conflict of interest.
