17
Perioperative Thermoregulation and Heat Balance


Kurt A. Grimm


Veterinary Specialists Services, PC, Conifer, Colorado, USA


Introduction


Perianesthetic hypothermia, and less commonly hyperthermia, occurs in almost all patients undergoing anesthesia. Despite the common occurrence of body temperature changes, few anesthetists appreciate the complex patterns of heat transfer and the multitude of factors that influence the rate and extent to which these changes occur. This chapter will address these factors, the techniques used to limit heat loss, and the impact of hypo- and hyperthermia on patient morbidity.


Thermodynamics


Thermodynamics is the study of the transfer of heat and work between systems. In biology, the thermodynamic laws can be applied to explain many phenomena, but of particular interest to the anesthetist is the transfer of body heat, clinically measured as core body temperature, to the external environment. The reverse (i.e., heat transfer from the external environment to the patient) is also of interest, since an understanding of the process allows safe and effective maintenance or elevation of patient temperature during the perianesthetic period. The heat transfer and storage properties of materials that contact the patient are also important, since they often present hazards as potential sources of burns. Fortunately, a detailed knowledge of the laws of thermodynamics is not necessary to understand heat balance; a general understanding of the conservation of energy (the first law of thermodynamics) and of the flow of heat between systems toward equilibrium (the second law of thermodynamics) will suffice.


Thermoregulation


An organism possesses heat energy, which is conveniently measured as body temperature. The total amount of heat energy is a function of the temperature and the mass of the patient, just as kinetic energy is a function of the velocity of an object and its mass. In most domestic mammalian species, body temperature is relatively constant (homeothermic) despite continual metabolic heat production and environmental heat gains and losses. The amount of body heat can be described using the terms normothermia (euthermia), hypothermia, and hyperthermia, which refer to normal, decreased, and increased amounts of body heat, respectively. Hyperthermia is an abnormal increase in body temperature, usually occurring in response to increased environmental temperature or altered thermoregulation. Pyrexia (i.e., fever) is similar in that it is an abnormal increase in temperature, but it is due to an increase in the thermoregulatory set-point, often related to an immune response to pathogens. Hypothermia is the opposite condition and is most commonly associated with heat loss in excess of metabolic production or with a decreased thermoregulatory set-point, which delays shivering and other homeostatic mechanisms.
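Because heat content scales with both mass and temperature, small patients represent a much smaller heat reservoir than large ones. As a minimal illustrative calculation (the average specific heat of body tissue, roughly 3.5 kJ/(kg·°C), is an approximation, and the patient mass is hypothetical):

```python
# Approximate heat deficit represented by a drop in mean body temperature.
# Assumes an average specific heat for body tissue of ~3.5 kJ/(kg.degC),
# a commonly used approximation; the patient mass is hypothetical.

SPECIFIC_HEAT_TISSUE = 3.5  # kJ/(kg.degC), approximate

def heat_deficit_kj(mass_kg: float, temp_drop_c: float) -> float:
    """Heat (kJ) that must be replaced to reverse a given temperature drop."""
    return mass_kg * SPECIFIC_HEAT_TISSUE * temp_drop_c

# Example: a 25 kg dog that has cooled by 1.5 degC has lost roughly:
print(f"{heat_deficit_kj(25, 1.5):.0f} kJ")  # ~131 kJ
```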


Measured temperature values corresponding to these different states depend on the species, since normal ranges of core body temperature vary. Temperature can also fluctuate with the time of day (or time of year for hibernating animals), hormonal influences, and activity levels, although these variations are usually minor compared to the changes induced by anesthesia. Furthermore, some species (especially amphibians, reptiles, and some fish) are normally subject to significant environmental temperature influences (poikilothermic), and defining a normal body temperature for them becomes problematic. Interestingly, the naked mole rat is the only known mammal to exhibit poikilothermic responses to environmental temperature changes as an adult [1].


Body temperature is sensed by temperature-responsive cells throughout the body. There are distinct populations of peripheral nerve endings (receptors) in the skin, which discharge when their thermal thresholds are reached. Most receptors appear to utilize ion channels that belong to the transient receptor potential (TRP) family of cation channels [2]. There are also visceral receptors, found generally in the brain (especially the anterior hypothalamus and preoptic area), spinal cord, and abdominal structures such as the gastrointestinal tract and urinary bladder [2]. Afferent input to the central nervous system (CNS) is carried by different nerve fiber types (e.g., Aδ and C fibers) depending on whether the input is cold, hot, or noxious. These signals traverse ascending tracts of the spinal cord and eventually reach the hypothalamus, where they are integrated and responses are issued. Since local temperature can vary with tissue type and metabolic activity, the thermal input to the CNS is a summary of multiple core and peripheral sensors. This “averaging” of temperature allows the body to maintain a narrow thermal set-point, which is associated with an interthreshold range (the temperature variation over which no compensatory responses [e.g., shivering, sweating, vasoconstriction, or vasodilation] occur) of approximately ± 0.2 °C (Fig. 17.1). Anesthetic drugs can alter the thermoregulatory thresholds for compensatory responses, which is why perioperative patients often fail to shiver even though they are mildly hypothermic. The widening of the interthreshold range (to approximately 3.5 °C) caused by drugs such as opioids, sedatives, and anesthetics is to some degree drug- and dose-dependent and reduces the patient’s ability to tightly regulate core body temperature.
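To make the interthreshold concept concrete, the following minimal sketch classifies the expected compensatory response for a given core temperature. The set-point and range values are illustrative assumptions only; the ~0.4 °C awake range and ~3.5 °C anesthetized range are taken from the figures cited above.

```python
# Classify the expected thermoregulatory response to a core temperature.
# The set-point (38.5 degC, a typical canine value used here purely for
# illustration) and interthreshold ranges (~0.4 degC awake, widening to
# ~3.5 degC under anesthesia, per the text) are assumptions.

def compensatory_response(core_c: float, set_point_c: float = 38.5,
                          interthreshold_c: float = 0.4) -> str:
    half_range = interthreshold_c / 2
    if core_c < set_point_c - half_range:
        return "heat conservation (vasoconstriction, shivering)"
    if core_c > set_point_c + half_range:
        return "heat dissipation (vasodilation, panting or sweating)"
    return "no compensatory response (within interthreshold range)"

# Awake, a core of 37.8 degC triggers heat conservation ...
print(compensatory_response(37.8))
# ... but under anesthesia the widened range leaves it unopposed:
print(compensatory_response(37.8, interthreshold_c=3.5))
```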

[Figure 17.1 plots (see caption below): thermoregulatory responses (vasoconstriction, non-shivering thermogenesis, shivering, sweating or panting, and vasodilation) as functions of environmental and core body temperature, under normal conditions and under anesthesia.]

Figure 17.1 Thermoregulation is tightly controlled in healthy, unmedicated individuals. Many drugs used during the perianesthetic period (e.g., opioids and inhalant anesthetics) can alter the range over which compensatory responses to altered environmental and core body temperature occur.


Core‐to‐periphery gradient


It is important to understand that body heat is not uniformly distributed throughout the organism. For example, the core body temperature is often several degrees (2–4 °C) higher than skin temperature (Fig. 17.2) [3]. There is also significant longitudinal variation in temperature in the limbs, with the core‐to‐skin difference being greater the further the measurement is made from the trunk. This temperature gradient is maintained by the autonomic nervous system through mechanisms which regulate peripheral blood flow [4].


The majority of heat transfer between the core and periphery occurs via blood‐borne convection (with some due to tissue‐to‐tissue conduction). Factors which influence the distribution of blood include arteriovenous anastomoses, cutaneous vasoconstriction or dilation, and countercurrent vascular heat exchange [3]. Sweating (in those species which possess this capability) and environmental temperature will also modify the redistribution of heat [3]. Additional mechanisms exist such as panting or shivering which can modify heat loss and gain, although these mechanisms more directly affect the core temperature rather than the skin‐to‐core temperature gradient.


Interestingly, in humans, about 95% of metabolically generated heat is lost to the environment via transfer across the skin, with only about 5% lost through the respiratory tract. This implies that heat conservation devices such as airway heat exchangers will have minimal influence on the rate of change of core temperature during anesthesia [5–7]. Since human skin and hair density differ significantly from those of many domestic species, the relative importance of cutaneous heat loss may vary, but in general it remains a major mechanism contributing to perianesthetic hypothermia.


Understanding the core-to-skin temperature gradient is important because it helps explain why anesthetized patients undergo a rapid decrease in body temperature following administration of anesthetic drugs, especially those which cause profound peripheral vasodilation (e.g., acepromazine and inhalant anesthetics). Drugs which cause less peripheral vasodilation often result in a less rapid decrease in temperature (e.g., total intravenous anesthesia with propofol or following premedication with dexmedetomidine) [8,9].


Most of the clinical research into anesthesia-associated body temperature change has been performed with human subjects. It should be remembered that body size, and more specifically the surface area-to-mass ratio (alternatively, the core-to-peripheral compartment ratio), can greatly influence the rate of change of core body temperature. The typical pattern of perianesthetic hypothermia has been described as having three phases: an initial rapid hypothermic phase, a slower linear decrease, and a plateau phase (Fig. 17.3).
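This three-phase pattern can be mimicked with a simple two-exponential cooling model. The sketch below is purely qualitative: every parameter (starting temperature, redistribution drop, rate constants, and plateau) is hypothetical and chosen only to reproduce the shape described in Fig. 17.3.

```python
import math

# Illustrative three-phase model of perianesthetic core temperature:
# a fast redistribution drop, a slower environment-driven decline, and
# a plateau. All parameter values are hypothetical.

def core_temp_c(t_h: float, t0: float = 38.5, plateau: float = 36.0,
                redistribution_drop: float = 1.2,
                k_redistribution: float = 3.0,
                k_environment: float = 0.3) -> float:
    """Modeled core temperature (degC) after t_h hours of anesthesia."""
    # Phase 1: rapid exponential drop from core-to-periphery redistribution.
    phase1 = redistribution_drop * (1 - math.exp(-k_redistribution * t_h))
    # Phases 2-3: slower exponential approach toward the plateau.
    remaining = (t0 - redistribution_drop) - plateau
    phase23 = remaining * (1 - math.exp(-k_environment * t_h))
    return t0 - phase1 - phase23

for t in (0, 0.5, 1, 2, 4, 6):
    print(f"t = {t} h: {core_temp_c(t):.1f} degC")
# Drops ~1.5 degC in the first hour, then slowly approaches ~36 degC.
```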


Mechanisms of heat transfer


Heat energy that has reached the patient’s body surface is transferred to the environment via four main mechanisms: radiation, conduction, convection, and evaporation. Radiation is the electromagnetic (photon) transfer of infrared and far-infrared energy between surfaces [10]. It does not depend upon the temperature of the air around the patient, but does depend on the emissivity of the involved surfaces and on the difference of their absolute temperatures (in kelvin), each raised to the fourth power. Emissivity describes an object’s capacity to emit and absorb radiant energy, with a value of 1.0 representing a perfect absorber and 0 a perfect reflector. Human skin has been described as having an emissivity of 0.95 (regardless of pigmentation) for infrared light [3]. Radiation has been identified as the most important mechanism resulting in heat loss and perianesthetic hypothermia [3] and, of note, it is not significantly inhibited by common methods used to limit patient hypothermia (e.g., cloth blankets, circulating water blankets, and forced warm air heaters).
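The fourth-power dependence is the Stefan-Boltzmann relation. As a rough illustration (a minimal sketch: the exposed surface area and temperatures are hypothetical, patients are not uniform flat radiators, and the 0.95 emissivity is the human skin value cited above):

```python
# Net radiant exchange between a surface and its surroundings:
# q = emissivity * sigma * area * (T_skin**4 - T_env**4), temperatures in kelvin.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiant_loss_w(area_m2: float, skin_c: float, surroundings_c: float,
                   emissivity: float = 0.95) -> float:
    t_skin = skin_c + 273.15
    t_env = surroundings_c + 273.15
    return emissivity * SIGMA * area_m2 * (t_skin**4 - t_env**4)

# E.g., 0.6 m^2 of exposed surface at 33 degC in a 20 degC room:
print(f"{radiant_loss_w(0.6, 33.0, 20.0):.0f} W")  # ~45 W
```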

[Figure 17.2 diagram (see caption below): under normal vasoconstriction, the skin and periphery are at 29–33 °C; under anesthesia-induced vasodilation, the periphery warms to 34–36 °C.]

Figure 17.2 A temperature gradient normally exists between the skin surface and the body’s core. The gradient is maintained by physiologic mechanisms such as peripheral vasoconstriction and altered blood flow distribution. Many anesthetics which cause indiscriminate vasodilation result in mixing of core and peripheral blood, leading to a lessening of the gradient and ultimately a decrease in core body temperature.

[Figure 17.3 plot (see caption below): temperature change during anesthesia; horizontal axis, elapsed time (h); vertical axis, body temperature (°C).]

Figure 17.3 The hypothetical pattern of body temperature decrease during general anesthesia of a canine patient with minimal or no external heat support. During the first hour, core temperature usually decreases 1–1.5 °C. A second, slower phase follows, which reflects less influence of core-skin blood redistribution and more dependence on environmental heat loss. Eventually, patients reach a pseudoequilibrium with the environment, and heat loss is minimal. Patients with a greater surface area-to-mass ratio (e.g., cats or small dogs) will experience faster changes, while larger patients (e.g., horses) will experience slower changes. Use of supplemental heat sources in the perioperative period will tend to slow the rate of decrease in body temperature and elevate the minimum temperature reached. The surgical or diagnostic procedure being performed can also modify the rate of heat loss.


Conduction and convection share a common theme in that heat energy flows from a warmer to a cooler surface. Conduction is direct transfer between two adjacent surfaces, whereas convection is facilitated via an intermediary (e.g., moving air or flowing liquid). Transfer of heat via conduction is proportional to the difference between the two surface temperatures and can be inhibited by placing insulation between them. With convection, the movement of air (“wind chill”) increases heat loss in proportion to the square root of the air velocity. The use of an air-trapping sheet or blanket around the patient will limit airflow and thus reduce the effects of convection by about 30% [11,12]. Adding additional layers does little to further reduce heat loss from convection, emphasizing the importance of decreasing airflow rather than focusing on the insulating capability of the cover. Using towels, pads, or circulating water blankets underneath the patient will act as a layer of insulation and limit direct transfer of heat via conduction.
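A minimal numeric sketch of these relationships follows. Only the square-root scaling with air velocity and the approximately 30% reduction from a single air-trapping layer come from the text; the baseline loss coefficient is hypothetical.

```python
import math

# Convective heat loss scales with the square root of air velocity, and a
# single air-trapping layer reduces it by ~30% (per the text). The 20 W
# baseline at 1 m/s is a hypothetical coefficient for illustration.

def convective_loss_w(air_velocity_ms: float, baseline_w: float = 20.0,
                      covered: bool = False) -> float:
    loss = baseline_w * math.sqrt(air_velocity_ms)
    return loss * 0.7 if covered else loss

# Quadrupling airflow only doubles the loss ...
print(f"{convective_loss_w(1.0):.0f} W -> {convective_loss_w(4.0):.0f} W")
# ... and one air-trapping layer trims roughly 30%:
print(f"{convective_loss_w(4.0, covered=True):.0f} W")  # 28 W
```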


One important species difference to consider when interpreting studies of heat loss in domestic mammals compared to human subjects is the insulating effect of fur on conduction and convection. Convection is generally regarded as the second most important cause of intraoperative heat loss in humans, but can become the most important in environments with high airflow.


Evaporation of liquids from the skin or body cavity surface results in patient heat loss due to the “donation” of heat energy required to vaporize the liquid. It is recognized that heat loss is greater with surgeries requiring large incisions and exposure of internal surfaces than with small incisions or non-invasive procedures. Evaporative losses can reach 50% of the total heat loss in smaller animal patients (rabbits) with large surgical fields [13], but are estimated to account for a smaller proportion, relative to cutaneous loss, in larger animals (swine) [14]. Another source of evaporative heat loss is the use of water- or alcohol-based solutions to prepare the surgical site. Heat loss has been suggested to be less with water-based solutions than with alcohol [15]; however, recent studies in cats and dogs have found little or no difference in temperature outcomes related to surgical preparation solutions [16,17]. Relative to other sources of heat loss, evaporative losses due to surgical site preparation tend to be small but, in some scenarios, may be significant.
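A worked example shows why evaporation is so costly. In this minimal sketch, the evaporated volume is hypothetical; the latent heat of vaporization of water near body temperature (~2400 kJ/kg) is the only physical constant used.

```python
# Evaporative heat loss: vaporizing water draws its latent heat of
# vaporization (~2400 kJ/kg near body temperature) from the patient.
# The evaporated volume is hypothetical.

LATENT_HEAT_KJ_PER_KG = 2400.0  # approximate for water at 33-37 degC

def evaporative_loss_kj(evaporated_ml: float) -> float:
    return (evaporated_ml / 1000.0) * LATENT_HEAT_KJ_PER_KG

# Evaporating just 20 mL from an open body cavity removes:
print(f"{evaporative_loss_kj(20):.0f} kJ")  # ~48 kJ
# For the hypothetical 25 kg patient above (~87.5 kJ/degC), that is
# roughly half a degree of core cooling.
```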


Decreased core body temperature has long been associated with intravenous fluid therapy. The magnitude of this effect is a function of the temperature of the fluids being administered and their volume relative to the patient’s body mass. When fluids are introduced at less than blood temperature, heat energy is transferred from the patient to the solution until it reaches equilibrium with the patient’s blood. Warming intravenous fluid can minimize this source of heat loss, but a medical fluid warming device should be used to avoid accidental overheating, which could damage blood proteins and enzymes. Many commercial devices limit fluid temperature to 104 °F (40 °C), which approximates the blood temperature of a febrile patient. Important limitations of fluid warmers include the rate of fluid flow and the distance at which they can be positioned from the patient. Many fluid warmers have a specified range of flow rates over which the fluid they deliver will be at or near the indicated temperature. If flow rates are too high, there is inadequate time for the fluid to absorb heat and reach equilibrium with the warmer surface, resulting in cooler-than-indicated fluid. If flow rates are too low, the fluid loses significant heat to room air on its way to the patient [18]. Long fluid lines positioned between the warmer and the patient have a similar effect. Under clinical conditions, the amount of perianesthetic heat loss prevented by warming intravenous fluids is minor relative to other routes [19,20].
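The heat debt imposed by room-temperature fluids is simple to estimate. In this minimal sketch, crystalloid is approximated as water (specific heat ~4.18 kJ/(kg·°C)), and the volumes and temperatures are hypothetical:

```python
# Heat drawn from the patient to warm room-temperature IV fluid to blood
# temperature. Crystalloid approximated as water, c ~4.18 kJ/(kg.degC);
# the volumes and temperatures are hypothetical.

C_WATER = 4.18  # kJ/(kg.degC)

def fluid_heat_debt_kj(volume_l: float, fluid_c: float = 21.0,
                       blood_c: float = 38.0) -> float:
    return volume_l * C_WATER * (blood_c - fluid_c)

debt = fluid_heat_debt_kj(1.0)
print(f"{debt:.0f} kJ per liter of 21 degC crystalloid")  # ~71 kJ
# For a 25 kg patient at ~3.5 kJ/(kg.degC), that equates to roughly:
print(f"{debt / (25 * 3.5):.2f} degC of core cooling")    # ~0.81 degC
```

This magnitude is consistent with the statement above: unless large volumes are infused rapidly, warming intravenous fluids prevents only a modest fraction of total perianesthetic heat loss.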


Active patient warming


Although patient heat loss cannot be completely prevented during anesthesia, several methods can replace lost heat or limit the magnitude of the loss. The first phase of heat loss results primarily from the transfer of core heat to the periphery due to vasodilation (see Fig. 17.2). This phase is nearly impossible to prevent by applying an external heat source after induction. The most effective strategy is prewarming the skin and peripheral tissues to minimize the thermal gradient between the skin and the core, so that once blood flow to the periphery increases, the heat energy required to re-establish equilibrium is minimal [21]. This strategy is employed in some human patients but has obvious limitations in veterinary patients. It is often impossible to restrain the patient to apply a warming device, although some species can be placed into warm ambient environments prior to induction. A major downside to this approach is that the patient will often respond by attempting to maintain a normal core temperature through panting, sweating, or other compensatory mechanisms.
