375857 Grand Canonical Monte Carlo (GCMC) Simulations on the Effect of Pore Size on Electrosorption Capacitance during Capacitive Deionization (CDI)
Securing clean, potable water for future generations presents the technological challenge of developing energy-efficient and cost-effective desalination processes. Ninety percent of the water on the planet is either brackish water or seawater.
A promising water desalination technology attempting the leap from bench scale to commercial application is capacitive deionization (CDI). During CDI, a low electrical potential (at most ±1.5 V) is applied to an array of highly porous electrodes to drive the electrosorption of oppositely charged ions within the electrical double layer (EDL) formed at the electrode surfaces. The separation process is reversible, and part of the electrical energy invested in separating the ions can be recovered during the discharge cycle.
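The relevance of pore size to EDL formation can be illustrated with the Debye screening length, which sets the thickness of the double layer at a given salt concentration. The sketch below is an illustrative calculation, not part of the original study; the concentrations used for seawater (~0.6 M) and brackish water (~0.02 M) are assumed representative values.

```python
import math

def debye_length_nm(c_molar, eps_r=78.5, T=298.15):
    """Debye screening length (nm) of a 1:1 electrolyte in water.

    c_molar : salt concentration in mol/L (assumed NaCl-like 1:1 salt)
    eps_r   : relative permittivity of the solvent (water at 25 C)
    """
    e = 1.602176634e-19       # elementary charge, C
    kB = 1.380649e-23         # Boltzmann constant, J/K
    eps0 = 8.8541878128e-12   # vacuum permittivity, F/m
    NA = 6.02214076e23        # Avogadro number, 1/mol
    n = c_molar * 1000.0 * NA # number density of each ion species, 1/m^3
    # kappa^2 = 2 e^2 n / (eps_r eps0 kB T) for a symmetric 1:1 electrolyte
    kappa_sq = 2.0 * e**2 * n / (eps_r * eps0 * kB * T)
    return 1e9 / math.sqrt(kappa_sq)

print(debye_length_nm(0.6))   # seawater-like: ~0.4 nm
print(debye_length_nm(0.02))  # brackish-like: ~2.2 nm
```

Since both screening lengths are well below 12 nm, overlapping double layers are not expected in the wide pores discussed below, which is consistent with wide pores contributing less to salt uptake per unit area.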
Grand Canonical Monte Carlo (GCMC) simulations were used in this work to explore the effect of pore size on the equilibrium electrosorption capacity of sodium chloride ions during CDI, at operating conditions for the desalination of seawater and brackish water. The scientific question driving this study was: does the structure of the electrode, specifically its pore size, contribute to the low separation efficiencies observed in CDI? Experimental reports state that only fifteen to fifty percent of the electrodes' available BET surface area is used during CDI.
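The core of any GCMC simulation is the pair of particle insertion and deletion moves, accepted with probabilities that sample the grand canonical ensemble at fixed chemical potential, volume, and temperature. The toy sketch below is not the authors' simulation; it applies the standard GCMC acceptance rules to a non-interacting (ideal-gas) system, for which the exact answer, an average particle number of z·V, is known and serves as a sanity check.

```python
import random

def gcmc_ideal_gas(z, V, steps=200_000, seed=1):
    """Toy grand canonical MC for non-interacting particles.

    z : activity (exp(mu/kT) / Lambda^3), units of 1/volume
    V : system volume
    For an ideal gas the exact result is <N> = z * V.
    """
    rng = random.Random(seed)
    N = 0
    total, samples = 0, 0
    for step in range(steps):
        if rng.random() < 0.5:
            # trial insertion, accepted with min(1, z*V / (N+1));
            # an interacting system would also include exp(-dU/kT)
            if rng.random() < z * V / (N + 1):
                N += 1
        else:
            # trial deletion, accepted with min(1, N / (z*V))
            if N > 0 and rng.random() < N / (z * V):
                N -= 1
        if step > steps // 4:  # discard the first quarter as equilibration
            total += N
            samples += 1
    return total / samples

print(gcmc_ideal_gas(z=0.5, V=100.0))  # should fluctuate around z*V = 50
```

In the actual study the energy change of each trial move would include ion-ion, ion-wall, and ion-electrode electrostatic interactions inside the slit pore, but the insertion/deletion bookkeeping follows this same pattern.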
It was found that lower uptake of chloride ions within the EDL occurs in slit-type pores with widths equal to or larger than 12 nm, at all applied potentials explored. This observation implies that larger electrode pores do not contribute to the desalination process to the same extent as smaller pores, thereby negatively affecting the performance of CDI processes.
See more of this Group/Topical: Computational Molecular Science and Engineering Forum