Carbon dioxide absorption

Published on 07/02/2015 by admin

Filed under Anesthesiology

Last modified 22/04/2025



John M. VanErdewyk, MD

Scientists first began experimenting with substances capable of absorbing carbon dioxide (CO2) in the early 1900s. Progress was made during World War I, when chemical warfare stimulated research to eliminate CO2 from the closed breathing system of the gas mask. Today, CO2 absorption is used daily to remove CO2 from semiclosed or closed anesthetic circuits.

CO2 absorbers

An ideal CO2 absorber would have the following characteristics: efficiency, ease of handling, low resistance to airflow, low cost, lack of toxicity, and lack of reactivity when used with common anesthetics.

The amount of CO2 that can be absorbed varies depending upon the absorbent. In practical use, the maximum amount is rarely achieved because of factors such as channeling of gas flow around the granules of absorbent. Channeling refers to the preferential passage of exhaled gases through the canister via the pathways of least resistance. Excessive channeling will bypass much of the granule bulk and decrease efficiency. Proper canister design—with screens and baffles plus proper packing—helps decrease channeling (Figure 8-1).

A dual-chamber canister is more efficient than a single-chamber canister, and an ideal water content of the absorbent is needed for optimal CO2 absorption. For the greatest efficiency of absorption, the patient’s entire tidal volume should be accommodated within the void space of the container. Therefore, a properly packed canister should contain approximately one half of its volume in granules and one half as intergranular space.
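As a rough worked illustration of this packing rule (the canister and tidal volumes below are assumed for illustration and do not come from the text):

```latex
V_{\text{void}} \approx \tfrac{1}{2}\,V_{\text{canister}}
\qquad\text{e.g.,}\qquad
V_{\text{canister}} = 1000~\text{mL}
\;\Rightarrow\;
V_{\text{void}} \approx 500~\text{mL}
```

A void space of about 500 mL would accommodate a typical adult tidal volume on the order of 400 to 500 mL, consistent with the requirement that the entire tidal volume fit within the intergranular space.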

The greater the surface area available for CO2 absorption, the greater the absorptive ability. However, as granule size decreases (surface area increases), the resistance to airflow through the canister increases. A compromise has been reached in a granule size of 4 to 8 mesh, which allows good CO2 absorption with an acceptable resistance to flow.

Degradation of inhaled anesthetic agents

All CO2 absorbers utilize Ca(OH)2 as the primary component. Some absorbents contain various amounts of NaOH and KOH (e.g., soda lime) to serve as catalysts. Unfortunately, inhalation anesthetic agents can interact with these catalysts to produce toxic byproducts (compound A, carbon monoxide [CO], methanol, and formaldehyde) and thermal injuries.
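The commonly cited reaction sequence for soda lime (not spelled out in the text above) shows why NaOH is described as a catalyst: it is consumed in the fast second step and regenerated in the slower third step.

```latex
\begin{align*}
\mathrm{CO_2 + H_2O} &\rightarrow \mathrm{H_2CO_3}\\
\mathrm{H_2CO_3 + 2\,NaOH} &\rightarrow \mathrm{Na_2CO_3 + 2\,H_2O} \quad \text{(fast)}\\
\mathrm{Na_2CO_3 + Ca(OH)_2} &\rightarrow \mathrm{CaCO_3 + 2\,NaOH} \quad \text{(slower; regenerates NaOH)}
\end{align*}
```

The net reaction is CO2 + Ca(OH)2 → CaCO3 + H2O, an exothermic process, which accounts for the heat production discussed below.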

Compound A is a vinyl ether degradation product formed only from the interaction of sevoflurane with the bases KOH and NaOH (KOH > NaOH). Compound A is produced with either a moist or a desiccated CO2 absorber and is present in any rebreathing circuit utilizing sevoflurane and soda lime. Despite initial concern about possible renal toxicity, extensive evaluation has found no clinically significant adverse effect on renal function in humans.

Thermal injuries and CO production

All modern inhalation anesthetic agents cause an exothermic reaction and production of CO when they interact with a desiccated CO2 absorber containing KOH or NaOH. CO production increases with increasing temperature. Desflurane is the agent most likely to produce CO, followed by isoflurane, with sevoflurane the least likely. Other factors related to CO production include the dryness of the absorbent and the concentration of the anesthetic agent.

Desiccation of the absorbent can occur after a prolonged period of high fresh-gas flow, such as occurs when an anesthesia machine is inadvertently left on over a weekend. When an inhalation agent is introduced during the first case on Monday morning, the reaction can occur. Effects of CO can range from subclinical to severe, with carboxyhemoglobin concentrations greater than 30% described in case reports. Signs and symptoms of subclinical CO toxicity are nonspecific or masked under anesthesia, making diagnosis difficult without a high index of suspicion; measurement of arterial blood gases with a CO-oximeter will document the carboxyhemoglobin concentration.

Despite producing the least CO, sevoflurane has been shown to generate the most heat. Case reports have described explosions and fires in CO2 absorbers secondary to the interaction of sevoflurane with desiccated Baralyme. In experimental settings, sevoflurane has produced canister temperatures of 400°C, with associated smoldering, melting of plastic, explosion, and fire. Cases of extreme heat associated with the use of soda lime have also been reported.