by Enrico Barbierato and Matteo Montrucchio (Catholic University of the Sacred Heart)
Blockchains are often proposed as trustworthy backbones for AI-driven science, yet their energy costs remain poorly understood. As research infrastructures scale, these costs become a constraint rather than a footnote. This article asks what blockchain energy consumption really implies for sustainable scientific computing.
This work builds on research conducted within the project “Linea di intervento D3.2 – Scientific and Ethical Impacts of Artificial Intelligence–Based Applications”, an interdisciplinary initiative of the Catholic University of the Sacred Heart in Brescia, Italy. Launched in 2019 and carried out over a 24-month period, the project examined the societal impact of Information Technology, with particular focus on finance, healthcare, and industry. Special attention was devoted to artificial intelligence techniques—especially deep learning—for large-scale data analysis, alongside issues of interpretability and ethical responsibility. Within this framework, AI systems were evaluated not only for predictive accuracy but also for broader ethical implications, anticipating the need to assess computational practices beyond performance alone.
Against this background, blockchain energy consumption emerges as a relevant case study for AI-driven scientific infrastructure. In AI-enabled research workflows, blockchains are increasingly proposed as supporting infrastructures for data provenance, auditability, reproducibility, and the secure sharing of scientific artefacts, making their computational and energy costs a non-negligible constraint. Introduced in 2008 with Bitcoin, blockchain technology operates as a distributed ledger in which transactions are grouped into cryptographically linked blocks and replicated across networks of autonomous nodes. Consensus mechanisms such as Proof of Work (PoW) require participants to perform computationally intensive tasks to validate new blocks, replacing centralised authorities with algorithmic consensus and distributing control across the network [1]. Figure 1 schematically illustrates the lifecycle of a transaction from block formation to network dissemination, consensus evaluation, and final immutability.
Figure 1: Flow chart of the introduction of a new block into the blockchain.
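The PoW validation step described above can be illustrated with a minimal sketch of the nonce search: a miner repeatedly hashes the block payload with a counter until the digest falls below a target derived from the difficulty. SHA-256 matches Bitcoin's hash function, but the payload string and the 16-bit difficulty here are purely illustrative assumptions chosen so the search completes quickly.

```python
import hashlib

def mine(block_data: str, difficulty_bits: int) -> tuple[int, str]:
    """Search for a nonce whose SHA-256 digest lies below the difficulty target."""
    target = 2 ** (256 - difficulty_bits)  # smaller target = harder puzzle
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest  # any node can verify this with one hash
        nonce += 1

nonce, digest = mine("block with transactions", difficulty_bits=16)
```

Each additional difficulty bit doubles the expected number of hash attempts (about 2^16 here), while verification always costs a single hash: the asymmetry that lets the network replace a central validator with open computational competition.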
The decentralised nature of permissionless blockchains has made traditional financial institutions cautious: such institutions depend on centralised governance, auditable controls, and predictable performance guarantees that conflict with the openness, volatility, and jurisdictional ambiguity of public networks. Consequently, despite growing interest in distributed ledger technologies, institutional adoption of public blockchains remains limited.
At the same time, blockchain has evolved from a niche cryptographic experiment into an infrastructure supporting cryptocurrency markets, decentralised finance, supply-chain traceability, and digital identity systems. This expansion, driven by investment cycles and programmable smart-contract platforms, has also exposed structural limitations, particularly in PoW networks, where constraints in throughput, latency, and resource consumption raise concerns about long-term scalability without protocol-level changes.
A central concern is the environmental impact of PoW validation, which demands substantial computational power and, consequently, large amounts of electricity and ancillary resources such as water for cooling. Major PoW networks, most notably Bitcoin, are estimated to consume electricity on the order of tens to hundreds of terawatt-hours annually—comparable to the electricity demand of medium-sized countries. However, this cost is often mischaracterised by transaction-based metrics, as overall energy use is driven primarily by mining competition and difficulty adjustments rather than by transaction volume.
This tension between decentralisation, security, and environmental impact has drawn attention to efficiency-oriented approaches commonly referred to as Green AI, which evaluate progress not only through predictive performance but also through computational cost, energy consumption, and resource use. By contrast, Red AI prioritises accuracy through scaling data, models, and experimentation, often with limited transparency regarding compute requirements. This distinction is particularly relevant for blockchain systems, where AI components themselves do not drive energy consumption but follow a comparable scaling logic: in PoW networks, security is effectively purchased through computation, incentivising open-ended competition in processing power.
From a computational standpoint, transaction validation is relatively inexpensive: verifying a block containing B transactions requires O(B) operations, while block propagation incurs overhead that grows with network size, approximately O(N). PoW mining, by contrast, is fundamentally different, as the expected effort required to mine a block scales with the difficulty parameter D, yielding a cost of O(D) per block. Since D is dynamically adjusted to maintain a constant block interval as aggregate hashpower increases, energy consumption follows economic incentives rather than transaction volume [2].
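The O(D) scaling can be made concrete with the standard Bitcoin convention that the expected number of hashes per block is the difficulty times 2^32. The difficulty value and the hardware efficiency (joules per terahash) below are illustrative assumptions in the rough vicinity of recent Bitcoin values, not live network data.

```python
def expected_hashes_per_block(difficulty: float) -> float:
    # Bitcoin convention: expected hashes to find a block ≈ difficulty × 2^32
    return difficulty * 2**32

def energy_per_block_joules(difficulty: float, joules_per_terahash: float) -> float:
    # Convert expected hashes to terahashes, then apply fleet efficiency
    return expected_hashes_per_block(difficulty) / 1e12 * joules_per_terahash

D = 80e12    # illustrative difficulty, on the order of recent Bitcoin values
EFF = 25.0   # illustrative fleet efficiency in J/TH for modern ASICs

e_block = energy_per_block_joules(D, EFF)
print(f"{e_block / 3.6e12:.2f} GWh per block")  # 1 GWh = 3.6e12 J
```

Note that transaction count appears nowhere in this calculation: under these assumptions a block costs the same energy whether it carries one transaction or several thousand, which is precisely why per-transaction metrics misrepresent the system.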
Measuring blockchain energy use, therefore, requires a system-level perspective. For PoW networks, the most informative indicator is total electricity demand over time, typically expressed in terawatt-hours per year, as consumption is driven by aggregate hashing activity and hardware efficiency. Per-transaction or per-user metrics are often misleading, since they distribute an incentive-driven system cost over unstable or ill-defined quantities; network-level reporting, complemented by measures such as energy per block, provides a more robust basis for scientific and economic comparison.
According to the Cambridge Bitcoin Electricity Consumption Index (CBECI), maintained by the Cambridge Centre for Alternative Finance, recent annualised estimates place Bitcoin’s electricity consumption in the range of approximately 90–120 TWh, with variation reflecting different modelling assumptions regarding mining hardware efficiency and profitability. Although PoW protocols impose no explicit energy cap, practical limits emerge from economic viability, hardware availability, grid capacity, regulatory constraints, and social acceptance.
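To make the network-level framing concrete, the sketch below converts an assumed annual figure of 100 TWh (within the CBECI range quoted above) into energy per block, then divides it by several arbitrary transaction counts. The per-transaction number swings with the chosen denominator while the system total is unchanged, illustrating why network-level reporting is the more robust metric.

```python
ANNUAL_TWH = 100.0               # illustrative, mid-range CBECI-style estimate
BLOCKS_PER_YEAR = 365 * 24 * 6   # ~52,560 blocks at one per 10 minutes

# Network-level metric: energy per block (TWh → MWh)
energy_per_block_mwh = ANNUAL_TWH * 1e6 / BLOCKS_PER_YEAR

# Per-transaction metric: depends entirely on an unstable denominator
for txs_per_block in (1000, 2000, 4000):
    per_tx_kwh = energy_per_block_mwh * 1000 / txs_per_block
    print(f"{txs_per_block} tx/block -> {per_tx_kwh:.0f} kWh/tx (total unchanged)")
```

Halving or doubling the transaction count halves or doubles the headline "cost per transaction" without a single joule of difference at the network level.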
Within this context, efficiency-oriented AI approaches do not seek to eliminate the intrinsic computational cost of PoW, but rather to improve measurement accuracy, transparency, and optimisation under resource constraints. Data-driven models can reduce uncertainty in energy estimates, support more efficient resource management, and inform protocol-level decisions, where architectural choices often outweigh marginal efficiency gains.
Link:
https://www.nytimes.com/2018/09/24/business/walmart-blockchain-lettuce.html
References:
[1] C. Berg, S. Davidson, J. Potts: "Understanding the Blockchain Economy", Edward Elgar Publishing, 2019.
[2] A. Narayanan et al.: "Bitcoin and Cryptocurrency Technologies: A Comprehensive Introduction", Princeton University Press, 2016.
Please contact:
Enrico Barbierato
Catholic University of the Sacred Heart, Italy
