European Journal of Applied Physiology (2024) 124:147–218
https://doi.org/10.1007/s00421-023-05262-9

INVITED REVIEW

A century of exercise physiology: concepts that ignited the study of human thermoregulation. Part 4: evolution, thermal adaptation and unsupported theories of thermoregulation

Sean R. Notley (1,5) · Duncan Mitchell (2,3) · Nigel A. S. Taylor (4)

Received: 6 February 2023 / Accepted: 13 June 2023 / Published online: 5 October 2023
© The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2023

Abstract

This review is the final contribution to a four-part, historical series on human exercise physiology in thermally stressful conditions. The series opened with reminders of the principles governing heat exchange and an overview of our contemporary understanding of thermoregulation (Part 1). We then reviewed the development of physiological measurements (Part 2) used to reveal the autonomic processes at work during heat and cold stresses. Next, we re-examined thermal-stress tolerance and intolerance, and critiqued the indices of thermal stress and strain (Part 3). Herein, we describe the evolutionary steps that endowed humans with a unique potential to tolerate endurance activity in the heat, and we examine how those attributes can be enhanced during thermal adaptation. The first of our ancestors to qualify as an athlete was Homo erectus, who were hairless, sweating specialists with eccrine sweat glands covering almost their entire body surface. Homo sapiens were skilful behavioural thermoregulators, which preserved their resource-wasteful, autonomic thermoeffectors (shivering and sweating) for more stressful encounters. Following emigration, they regularly experienced heat and cold stress, to which they acclimatised and developed less powerful (habituated) effector responses when those stresses were re-encountered. We critique hypotheses that linked thermoregulatory differences to ancestry.
By exploring short-term heat and cold acclimation, we reveal sweat hypersecretion and powerful shivering to be protective, transitional stages en route to more complete thermal adaptation (habituation). To conclude this historical series, we examine some of the concepts and hypotheses of thermoregulation during exercise that did not withstand the tests of time.

Keywords Adaptation · Acclimation · Acclimatisation · Body temperature · Exercise · Evolution · Genetic · Homeostasis · Thermoeffector · Thermoregulation

Abbreviations
Tb-i  Body temperature at time i
Tb-0  Basal body temperature
dt    The duration of the thermal stimulus

Communicated by Michael I Lindinger.

* Nigel A. S. Taylor, nigelastaylor@gmail.com
1 Defence Science and Technology Group, Department of Defence, Melbourne, Australia
2 Brain Function Research Group, School of Physiology, University of the Witwatersrand, Johannesburg, South Africa
3 School of Human Sciences, University of Western Australia, Crawley, Australia
4 Research Institute of Human Ecology, College of Human Ecology, Seoul National University, Seoul, Republic of Korea
5 School of Human Kinetics, University of Ottawa, Ottawa, Canada

Introduction

Many small mammals, both wild-caught and captive-bred, engage in voluntary endurance activity when in captivity (e.g., wheel running), but why they do so is unknown (Sherwin 1998; Halsey 2016); “wheel running has no directly analogous naturally occurring behaviour” (Sherwin 1998 [P. 11]). Of course, many animal species play, but Homo sapiens is the only species that engages voluntarily in arduous endurance exercise for recreational or occupational purposes (Fig. 1), which raises the question: why do humans of widely varying abilities run marathons, often in the heat, and swim across the English Channel, which is rarely anything other than cold? Why do highly trained military personnel and emergency workers (occupational athletes) frequently push themselves very hard, and sometimes into oblivion?

No other extant primate species is capable of such endurance exercise (Bramble and Lieberman 2004). Nevertheless, humans do not routinely walk the thousands of kilometres each year that reindeer, Asian asses and wolves do (Teitelbaum et al. 2015; Joly et al. 2019). Those species are obliged to move for nutritional reasons, and they walk slowly and in cool weather. Humans are fast endurance mammals and have evolved to be so. During the evolution of all species, successful individuals, through natural variations or mutations, acquired attributes that enabled them to survive. Humans did not evolve with the traits of competitive sprinters. The fastest sprinters are quadrupeds: a cheetah can run 200 m in less time than it takes the fastest human sprinter to run 100 m (Hetem et al. 2013). Humans are not even the fastest bipedal sprinters; ostriches are (Alexander et al. 1979). The Australian emu (Patak and Baldwin 1998) and South American rhea (Roberts et al.
1998) are also faster than humans.

In this, the final part of our series of historical reviews on thermoregulation during exercise (Notley et al. 2023a, 2023b, 2023c), we explore the evolutionary steps that endowed each of us with the potential to be fast endurance mammals and sweating specialists. We have not included recent developments in the fields of epigenetics, gene mapping or heat-shock proteins. Against that genotypic background, we will examine how we are able to enhance those attributes by acquiring, either naturally or artificially, the physiological (phenotypic) adaptations necessary for us to tolerate endurance running in the heat and endurance swimming in cold water. Those attributes, if developed, allow us to play and compete for pleasure in today’s world. Over the centuries, those characteristics allowed our ancestors to be selected ahead of other hominids, but our largely sedentary contemporary lifestyles within a warming climate will mean that natural selection will again test human survival.

How thermoregulation enabled fast endurance exercise in Homo sapiens

Before our ancestors could evolve the traits for fast endurance exercise, they needed to acquire thermoeffectors to dissipate metabolic heat, a regulatory system to control heat dissipation and bipedal locomotion. Bipedality enabled those hominins to carry water, which supported thermoregulation. Indeed, we possess a curious collection of attributes and regulatory systems selected by nature through the random application of environmental and competitive stresses that could be tolerated only by a unique species (Balter 2002; Grigg et al. 2004), or members of a species. In achieving those capabilities, non-thermoregulatory mechanisms were co-opted to serve thermoregulation (Euler and Söderberg 1958; Simon et al. 1986).
Indeed, “it would be unnecessarily burdensome to require the evolutionary process to create a new system to solve a problem already solved by an existing system” (Satinoff 1978 [P. 21]). Since our thermoeffectors operate from soft tissues, they left no evidence within the fossil record. Fortunately, archaeological and genetic clues enabled anthropologists and physiologists to divine evidence-based interpretations of how our ancestors evolved into the bipedal, less-hairy sweating specialists that we are today.

Fig. 1 A Participants in the Comrades Marathon, held annually in South Africa, and regarded as the world’s most-famous ultramarathon (Burfoot 2007). The course is 90 km long with a climb of 870 m on the uphill run. The field is capped at 25,000 participants, and winners run at an average speed of 16.5 km h−1 for over five hours. Over that distance humans are as fast as horses; over longer distances humans are faster (Minetti 2003). Source: Used with permission of The Comrades Marathon Association, which retains copyright. B Zasson (Jason) Zirganos (Greece) at the completion of his English Channel swim from Dungeness Point (France) to Folkestone (England; 1951 [14 h 10 min]). Source: https://www.channelswimmingdover.org.uk/content/photo/major-jason-zirganos-of-greece-with-sam-rockett Accessed: July 11th, 2022. This photograph is in the Public Domain.
Thermoeffector evolution: vascular networks, homeothermy, sweating and hairlessness

For our hominid ancestors and ourselves, there were crucial steps in acquiring the traits necessary for fast endurance exercise: the emergence of bipedality occurred perhaps 6 million years ago, the drying and opening up of parts of the African continent happened about 2 million years ago and its conversion from a cold to a tropical climate occurred ~250,000 years ago (Gowlett 2001). Owen-Smith suggested that the evolution of the genus Homo could have occurred nowhere else other than in Africa, because that process required warm savanna grasslands and plentiful large-animal protein (Owen-Smith 2021). Those events gave rise to the Anthropocene epoch, the era dominated by humans, our activities and our often adverse influences on our shared home. Our evolving traits were superimposed on structures and functions with a much longer evolutionary history.

We have discovered, from observations on primitive organisms, that learned behaviours are the most-ancient mechanism that animals use to manipulate their body temperatures (Malvin and Wood 1992; Nelson et al. 1984). Indeed, behavioural thermoregulation (e.g., clothing, buildings, air conditioning) is our most effective method for manipulating and defending our body temperatures. For more precise regulation, those behaviours were supplemented by autonomic mechanisms (Cabanac 1972; Nelson et al. 1984), and inter-species comparisons have revealed that progression has occurred across animal phyla, and may have been driven by selective pressures that favoured animals with a preference for thermal stability, rather than heterothermy (Bligh 1998). The first autonomic thermoeffectors to evolve were likely to have been those that depended upon vascular networks. Those networks appeared >600 million years ago (Fig.
2), predicated by increases in body size, which necessitated transportation networks to supply the metabolic requirements for, and the removal of wastes from, all cells, which are diffusion dependent when organisms are very small (Monahan-Earley et al. 2013). Jump forwards some 300 million years to the appearance of reptiles (Carroll 1970; Simões et al. 2020), and we find animals that exploited their cutaneous vasculature, in combination with behavioural strategies, to use external heat to warm their deeper tissues (Templeton 1970), and to reduce heat loss in the absence of an ambient heat source. That transportation is bi-directional and gradient dependent, with all heat exchanges obeying the Laws of Thermodynamics (Notley et al. 2023a). It is not only the reptiles that possess such a capability, but all ectotherms that rely on their environments as a heat source and a heat sink use their cutaneous vasculature in this way (Smith 1979; Dzialowski and O’Connor 1999).

Jump another 80–140 million years forwards, and we find evidence for the existence of endothermy and homeothermy (Fig. 2), especially amongst mammals and birds (Ruben 1995; Hillenius and Ruben 2004). Endotherms can produce enough metabolic heat (tachymetabolism) to regulate their body temperature using both shivering and non-shivering mechanisms. That capability became a feature of mammals and birds ~200 million years ago (Phillips et al. 2009; Benton 2021), possibly as a consequence of different selective pressures (Legendre and Davesne 2020).

Homeothermic mammals, with their stable body temperatures, appeared around 160–220 million years ago (Phillips et al. 2009; Benton 2021), when egg-laying mammals (Monotremata) diverged from amphibious reptiles (Phillips et al. 2009). Neither the addition of endothermy to ectothermy, nor the emergence of homeothermy, occurred instantaneously. Those transitions probably related to a decline in body size (Lovegrove 2017; Rezende et al. 2020).
By virtue of their high thermal inertia, large dinosaurs may well have had stable enough body temperatures to qualify as homeotherms (McNab 1978), or at least as mesotherms (Grady et al. 2014), and the energy requirements of running for large, bipedal dinosaurs may have required them to be tachymetabolic, so they too were endothermic (Pontzer et al. 2009).

Fig. 2 A probable timeline for the development of autonomic and behavioural thermoregulatory phenomena that contributed to the evolution and survival of the endurance capabilities of the first Homo sapiens, as they progressed from Homo sudomotor to become Homo cursor (Sands 2010). In the lower right, we find the oldest mummified athlete of his time (>3000 years old), Ötzi, a Neolithic hunter and farmer, who was found with his personal protective clothing and equipment, trapped within the Niederjoch Glacier in the Ötztal Alps (Italy)

The biochemical mechanism selected for elevating metabolic heat production across mammals and birds was mitochondrial oxidative phosphorylation. That highly inefficient process liberates significant amounts of thermal energy (thermogenesis), which becomes problematic for occupational athletes working hard in protective clothing. That oxidation is not unique to animals. It occurs in higher plants and in organisms that have cell nuclei (eukaryotes; Müller et al. 2012; Poole and Gribaldo 2014). What had been selected in mammals and birds was the ability to use that metabolic engine to continually produce heat. For example, the metabolic rates of mammals, measured in the field (doubly-labelled water), are about 12 times higher than those of similarly sized reptiles (Nagy 2005). Let us now consider the evolution of tachymetabolism from an exercise perspective, and how our oxidative capability may have provided a selective advantage.
Of course, it was a double-edged sword, because it not only kept the engine running, but the conversion of chemical potential energy into kinetic energy, which was used to exercise and perform external (mechanical) work, incurs a significant heat load (Helmholtz 1848; Hill and Hartree 1920; Christensen 1931; Nielsen 1938). In Part 3 of this review series, we examined the physiological and health implications of excessive heat storage and the corresponding increases in tissue temperatures (Notley et al. 2023c).

The resting heat production of humans is dominated by the metabolic activities of our brain and intra-thoracic organs (Müller et al. 2011). When we commence exercising, there is an intensity-dependent elevation in heat produced by the skeletal muscles, which, on its own, can approach 90% of the overall heat production during maximal exercise (Mitchell and Blomqvist 1971; Hochachka 1994). That energy conversion is irreversible (Wilkie 1954, 1960), so the heat generated must either be stored or dissipated, and that is the other edge of this sword, since excessive heat storage can be life threatening.

The egg-laying monotremes of Australia and New Guinea (Holz 2015; Wagstaff et al. 2020; Flannery et al. 2022) thermoregulate with less precision, and at lower temperatures, than do other mammals (Miklouho-Maclay 1884a, 1884b; Martin 1903). Since they were the next mammalian order that diverged from reptiles, it is not unreasonable to assume that they possessed some control over their cutaneous vasculature, although those mechanisms probably differed widely across species (Morgareidge and White 1969). The next mammals to evolve seem to have been the marsupials (Metatheria or Marsupialia; Fig. 2), the neonates of which complete their development within a pouch, attached to a mammary gland. They inhabit Asia, Australia, North and South America (Vogelnest 2015; Taylor et al. 2022) and appeared 140–180 million years ago (Phillips et al. 2009; Luo et al. 2011).
Our forebears also appeared at about that time (placental [eutherian] mammals; Phillips et al. 2009; Luo et al. 2011), and were likely to have been proficient thermoregulators. After nearly 200 million years of evolution within the endotherms and homeotherms, the hominid lineage emerged. They would become the most-proficient of the thermoregulators (Fig. 2). The phylogenetic tree of primates provides compelling evidence that Hominidae (humans, their predecessors and the other great apes) shared a common ancestor with the other catarrhine primates: the Cercopithecidae (Old World monkeys) and Hylobatidae (gibbons and siamangs; Mittermeier et al. 2013). The human clade (hominids) split from the chimpanzees and bonobos ~6–7 million years ago, and the oldest known hominid fossils are those of Sahelanthropus tchadensis (Pontzer 2012a), which then lived in the tropical and sub-tropical habitats of Africa. Those habitats were characterised by abundant water, maximum dry-bulb temperatures of 30–32 °C, low wind speeds, little or no penetrating solar radiation due to the dense vegetation, and almost fully saturated air (close to 100% relative humidity; Newman 1970; WoldeGabriel et al. 2001). Although the African Equatorial Belt contracted progressively, the hominids and other great apes did not have to leave those habitats until ~2 million years ago, though they may well have explored the forest edges and the nearby woodlands (Kingston et al. 1994). Indeed, it was unlikely that they had any reason to travel far on any one day (Pontzer 2017a).

We know nothing of the skin of Sahelanthropus tchadensis, but the ancestors of chimpanzees, bonobos and gorillas shared the same habitat as Sahelanthropus, and they have dense body hair. We therefore assume that Sahelanthropus also had dense body hair.
All three primate families possess a unique, whole-body distribution of apocrine and eccrine sweat glands (Gelineo 1964; Szabo 1967; Newman 1970; Jenkinson 1973; Taylor and Machado-Moreira 2013; Best and Kamilar 2018). The eccrine glands secrete a serous fluid (Folk and Semken 1991), and its evaporation fulfilled a significant thermoregulatory role (Johnson and Elizondo 1979; Mahoney 1980). In the catarrhine primates, those sweat glands are distributed over the entire body surface (Montagna and Yun 1963; Folk and Semken 1991; Best and Kamilar 2018). That enabled evaporative cooling without inhibiting locomotion. Animals that pant tend to remain stationary while panting. Unless those primate families evolved sweating independently after their divergence, their last common ancestor was likely to have been an eccrine sweater.

The last common ancestor of Hominidae lived ~20 million years ago (Pontzer 2012a) and would have inherited that trait. The oldest date proposed for the loss of body hair in that lineage was 4 million years ago. So for at least 16 million years, to the extent that they relied on evaporative cooling to dissipate metabolic heat, ancestors of the great apes had a sweating system that operated through their body hair. The fastest of the extant primates (easily outstripping humans) is the patas monkey (Erythrocebus patas), which inhabits the savannas of tropical Africa. It has a dense hair coat, but thermoregulates effectively using eccrine sweating (Mahoney 1980; Kolka and Elizondo 1983), challenging the contention “that the benefits of sweating only accrue to hairless animals” (Dávid-Barrett and Dunbar 2016 [P. 75]). It is not even necessary to have an eccrine sweating system to achieve potent cutaneous evaporative cooling.
Elephants, which lack eccrine sweat glands, achieve controlled evaporative cooling of up to 360 W m−2 (>500 g m−2 h−1), which is greater than that of any extant non-human primate (Dunkin et al. 2013).

Only two thirds of the fluid-producing cutaneous glands of our closest phylogenetic relatives (chimpanzees) are eccrine glands (Lieberman 2015), and their glandular density is about a tenth that of humans (Kamberov et al. 2018). The maximum sweat rate of chimpanzees is ~80 g m−2 h−1 (Hiley 1976), which can achieve ~53 W m−2 of cooling if it all evaporates on the skin. That is just enough to dissipate their resting metabolic heat, and is rather modest compared to human endurance athletes (Hora et al. 2020). For example, the sweat rate of Alberto Salazar, an elite marathon runner, was 3.7 L h−1 (~2 kg m−2 h−1, achieving ~1.3 kW m−2) during an Olympic marathon (1984; Armstrong et al. 1986a). He lost 8.1% of his body mass, despite drinking just before (1 L) and during the race. It is presumed the evaporative cooling achieved by our ancestral great apes was adequate for their lifestyle, but would have been inadequate for hominids engaged in endurance exercise.

Humans have hair-follicle densities similar to those of chimpanzees, but a density much lower than is found in other primates (Kamberov et al. 2018). Moreover, the majority of human body hairs are microscopic (vellus hair; Szabo 1967). It appears that the in-utero determination of human hair-follicle and eccrine-gland densities, which arise from the same epithelial appendages, results from the competing influences of two genes: sonic hedgehog proteins (hair-follicle formation) and bone morphogenetic proteins (eccrine-gland formation; Lu et al. 2016). Aldea et al. (2021) have now shown that an ectodermal enhancer gene (hECE18) determines the number of sweat glands.
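The evaporative-cooling rates quoted above (elephant, chimpanzee and elite marathoner) all follow from the latent heat of vaporisation of sweat, commonly taken as ~2426 J g−1 at skin temperature. As a quick check of that arithmetic, here is a minimal sketch; the function name and the rounded latent-heat constant are our assumptions, not values given in the source:

```python
# Convert a sweat rate into evaporative cooling power, assuming
# complete evaporation on the skin. The latent heat of vaporisation
# of sweat is taken as ~2426 J per gram (an assumed, commonly used value).
LATENT_HEAT_J_PER_G = 2426.0

def evaporative_cooling_w_per_m2(sweat_g_per_m2_h: float) -> float:
    """Cooling power (W m^-2) from a sweat rate (g m^-2 h^-1)."""
    # Dividing by 3600 converts joules per hour into joules per second (watts).
    return sweat_g_per_m2_h * LATENT_HEAT_J_PER_G / 3600.0

# Chimpanzee maximum, ~80 g m^-2 h^-1: ~54 W m^-2 (text quotes ~53 W m^-2).
print(round(evaporative_cooling_w_per_m2(80)))
# Elephant, ~500 g m^-2 h^-1: ~337 W m^-2 (of the order of the 360 W m^-2 quoted).
print(round(evaporative_cooling_w_per_m2(500)))
# Salazar, ~2 kg m^-2 h^-1 = 2000 g m^-2 h^-1: ~1.35 kW m^-2 (~1.3 kW m^-2 quoted).
print(round(evaporative_cooling_w_per_m2(2000)))
```

The calculation assumes every gram of secreted sweat evaporates on the skin; any sweat that drips provides no cooling, so these figures are upper bounds.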
Compared to other great apes, and presumably our hominid ancestors, humans are unequivocally naked (though not hairless). When our ancestors lost their body hair, they lost most of their apocrine sweat glands, which secrete into hair follicles (Folk and Semken 1991); eccrine glands secrete onto the skin surface. Humans retained apocrine glands only where we have dense body hair on our torsos. So it was nakedness that enabled our ancestors to become almost completely eccrine sweaters, with human sweat-gland densities being ~10 times those of the chimpanzee (Kamberov et al. 2018). Nakedness became particularly important for humans (Kuno 1934; List 1948; Weiner and Hellmann 1960; Sato 1977) as it optimised evaporative cooling, although that optimisation was not the evolutionary force behind nakedness.

Nakedness does not fossilise, so the question of when our lineage became naked has to be approached indirectly, and it has provided an arena for entertaining and ongoing disputes (Rantala 2007; Allen et al. 2013). According to Ebling (1985 [P. 33]), “the evolution of near nakedness in the human species has been accounted for by a series of myths which owe more to the predilections of their creators than to the available evidence”. One indirect way of estimating when hominids became naked is to track the genetic clock of a gene locus associated with skin darkening by the sun (MC1R), which establishes that hominids were naked and regularly sun exposed 1.2 million years ago (Rogers et al. 2004). But whatever the evolutionary forces were that drove nakedness, it was unlikely to be those that enhanced thermoregulation when exercising in the sun (Newman 1970; do Amaral 1996). Mutations selected for that outcome would have carried more hair, not less, in a hair coat that allowed the free transport of water vapour, but which intercepted solar radiation before it reached the skin, as seen in the patas monkey (do Amaral 1996).
Indeed, we would have retained more than just scalp hair to act as a sunshield (Lasisi et al. 2023). Consequently, behavioural practices were needed, such as the loose-fitting Bedouin robe (Shkolnik et al. 1980; Horowitz 2022), which supports evaporation while intercepting solar radiation.

Though it is a topic unlikely to grace the dinner tables of the International Olympic Committee, the most-compelling evidence for when and how hominid nakedness occurred, we believe, relates to ectoparasites. The only human hairs to which the louse (Pthirus pubis) could attach were the pubic hairs, which co-evolved with nakedness; hairy great apes have no pubic hairs. The louse’s molecular clock tells us that they first attached to the pubic hair of hominids ~3–4 million years ago, probably host-switching from the coarse hair of gorillas (Reed et al. 2007). Our ancestors were not yet exploiting the savannas, so nakedness emerged within enclosed forest habitats. Global temperatures in the Pliocene (1.6–5 million years ago) were higher than they would be again until the current Anthropocene, so nakedness imposed little risk of cold stress for forest dwellers, and would have facilitated evaporation, which is helpful in an environment with low water vapour-pressure gradients.

The louse may tell us when nakedness occurred, but not why. Hairy primates were vulnerable to another class of ectoparasite, blood-sucking ticks, which are powerful vectors of disease. Evolution would have favoured the less hairy members of a species, which were less vulnerable to ticks (Pagel and Bodmer 2003; Allen et al. 2013) because the ticks would have been more visible and easily removed during grooming (Brown 2021). That parasite hypothesis was put forward by Belt (1874), and rejected by Charles Darwin (1809–1882, England; Darwin 1888) because he could not envisage why it applied to hominids, and not to other great apes (Rantala 2007).
So why did hominids become naked, and not other great apes? At that time, hominids started functioning in groups, sharing activities and food, and living in communal lairs and dens (Rantala 2007). Those social circumstances can lead to lethal tick infestations in extant primates (Brain 1992; Brain and Bohrman 1992), and could well have done so for our hominid ancestors. So it would appear that our naked ancestors may have been selected by ticks, and our hairy ancestors (with notable contemporary exceptions) died out. Any benefits for evaporative cooling were secondary. The cost of that was lice, but human life was not meant to be easy!

Which hominids first became naked? Gilligan (2010) says it was early Homo that became naked, but there are no Homo fossils from 3–4 million years ago (Pontzer 2012a). That was the age of Australopithecus, but Dávid-Barrett and Dunbar (2016) argue that nakedness did not emerge in Australopithecus, because they lived at high altitude, and had an over-riding need to stay warm, rather than to lose heat. They assert that nakedness appeared within the genus Homo ~2.5 million years ago. Macdonald (2018) has suggested several ways in which a hominin could cope with low temperatures without either hair or fire.

Whichever hominin species entered the Pleistocene epoch naked (2.6 million to 12,000 years ago), it needed thermoregulatory competence in both hot and cold environments. During the Pleistocene, the earth experienced successive periods of falling and rising air temperatures, resulting in continental ice sheets and glaciers forming (the ice age; Pisias and Moore 1981). The advantage that mammals and birds possessed, in addition to their tachymetabolism, was their ability to modify their thermal insulation, through anatomical, autonomic and behavioural changes (Irving 1966).
For our now-naked ancestors, the critical behavioural change was adopting clothing, and it is another louse species that allows us to estimate when that happened. Pediculus humanus lives exclusively in clothing, and its molecular clock indicates that Homo sapiens first used clothing between 83,000 and 170,000 years ago (Fig. 2; Allen et al. 2013; Gilligan 2018; Hallett et al. 2021). So when the most recent wave of Homo sapiens left Africa, en route to Australia and tropical Asia, they were clothed, which was necessary in the colder winters of Europe and Asia (Toups et al. 2011). Any reversion to near-nakedness by subsequent indigenes was a matter of choice. Those who travelled north and into the colder European climates certainly needed clothing, and Notley et al. (2023c) introduced Ötzi, an occupational athlete of his time (>3000 years B.C.E.), as the person who provided the earliest example of thermally protective clothing during exercise in the cold. Our objective for this section has been to highlight phenomena and developmental stages that accompanied the evolution of human thermoregulation (Fig. 2); but why did humans adopt bipedal exercise and eventually become endurance specialists?

Why we needed bipedality for endurance exercise: not what you might think

When the hominid lineage split from chimpanzees and bonobos, our ancestors were quadrupedal and arboreal, though they were likely to have made excursions to the ground. But within 1–2 million years, the hominids changed their mode of locomotion, with successive hominid genera developing progressively better capacities for bipedal movement, presumably due to random genetic changes and mutations that served their arboreal activity (Drummond-Clarke et al. 2022). Those changes gave them a selective advantage by allowing them to venture into more open habitats, with bipedal locomotion eventually becoming their usual, and ultimately their only, mode of locomotion.
Paleoanthropologists adduce evidence for proficiency in bipedal locomotion primarily from the orientation between the skull and spinal column, the size, shape and alignment of the forelimb and hindlimb bones, and the shape and alignment of the foot bones. Bipedality was claimed for Sahelanthropus tchadensis (Zollikofer et al. 2005), but recent analysis does not support habitual bipedal locomotion (Macchiarelli et al. 2020). There seems to be agreement that the fossils of Orrorin tugenensis (~6 million years old) were from an habitually bipedal hominid (Senut et al. 2001; Pickford et al. 2002), although it would not have been naked, nor was it a prolific eccrine sweater.

Which evolutionary forces led to habitual bipedal locomotion in our ancestors? We know it was not selection for speed, because not just patas monkeys, but baboons and other quadrupedal primates are faster than modern humans. The effortless grace of the Kenyan athlete, Eliud Kipchoge, might beguile us into thinking that it was locomotor efficiency, but we would be wrong. When our hominid ancestors first walked bipedally, it was with bodies better suited to arboreal movement. Think not of Kipchoge, but of the inelegant crouching walks of modern chimpanzees that are reluctantly bipedal over short distances. It would take 2 million years after Orrorin walked bipedally before Australopithecus walked comfortably (Fig. 2), and another 2 million years of bipedalism before Homo erectus walked and ran competitively with other mammalian species (Pontzer 2017b).

We believe that the evolutionary forces behind bipedal locomotion had more to do with forelimbs than with hindlimbs. That was also Charles Darwin’s explanation, and he proposed that bipedality freed the hands and arms for defence and food acquisition (Darwin 1888; Pontzer 2012b).
Whilst that is true, we do not believe those were the most important benefits, since freeing the forelimbs enabled carrying to become a normal behaviour of our bipedal ancestors (Berecz et al. 2020), with no more precious load to be carried than one’s offspring (Fig. 3). Hominid primates carry their offspring for at least the first few weeks of life, and sometimes much longer, with the offspring of haired hominids clinging to the mother (Berecz et al. 2020). That form of offspring carriage fell away when our ancestors became naked. Chimpanzees and gorillas use one arm to assist the attachment of infants, but locomotion is then difficult, although it would have been much easier once genetic changes allowed hominids to become bipedal. Parents could then use both arms, with an upright posture resulting in a better load distribution (do Amaral 2008; Berecz et al. 2020; Brown 2021). Although the asymmetry of such load carriage is energetically costly (Watson et al. 2008), that posture freed the arms for other functions, and possibly improved reproductive success. Collectively, those changes would have withstood selective pressures, so it is believed that bipedality and nakedness were co-selected.

Our bipedal ancestors could now also carry water. That would not have been required in the forested habitats, but would have been a substantial advantage when our ancestors began to encounter more open and drier habitats (deMenocal 2011), where they would have to range much further for food. They would also have been exposed to more solar radiation, because they were forced into diurnal activity by nocturnal predators, such as leopards (Brain 1983). Reed (1997) insists that that ancestor was Paranthropus (Constantino 2013), which survived another 600,000 years. In some ways, various hominins were pre-adapted for life under greater heat stress, with eccrine sweat glands secreting directly onto their near-naked skin.
In one important respect, though, they were not prepared. They were about to embark on an evolutionary journey that would lead to them having a lower daily water turnover than other primates (Pontzer et al. 2021), but there had been no evolutionary pressure to prepare them for habitats in which drinking water was not readily available.

Fig. 3 Offspring carriage by hairy and naked primates. A Baboon offspring cling to the hair of their mothers. Source: Royalty-free image from Pixabay. B Chimpanzee offspring also cling, but mothers assist very young offspring with an arm; locomotion is then more difficult. Source: Royalty-free image from Pixabay. C Human infants cannot cling, but our upright posture, which freed the arms, made offspring carriage possible. Source: Needpix.com (https://www.needpix.com/photo/1345586/), used under Creative Commons Zero License for Public Domain

Their ability to range widely would have been constrained by access to water to replace that lost as sweat. If their ability to withstand hypohydration was similar to that of modern humans (Beis et al. 2012), they were unlikely to encounter lethal hypohydration (Hora et al. 2020), even from the sweating required to range at pace. But if their thermoregulatory characteristics resembled those of modern humans, hypohydration would have made them progressively hyperthermic during exercise in the heat, and perhaps dangerously so. When body-fluid homeostasis and homeothermy come into regulatory conflict, the regulation of blood volume and pressure might be given a higher priority than the regulation of body temperature (Notley et al. 2023a [Table 1]). Thus, hypohydration may suppress sweating (Greenleaf and Castle 1971; Fortney et al. 1984; Sawka et al. 1985), although sudomotor suppression is not always seen (Ladell 1955; Strydom et al.
1966; van den Heuvel et al. 2020b), particularly if the resulting increase in body temperature leads to extra sweating (Notley et al. 2023c). Whether they became hyperthermic or not, they may have carried water or water-rich food (Hewes 1961) to counteract hypohydration. Since our hominin ancestors had ample access to water in their earlier forest-dwelling forms, there was no selection pressure for them, or other primates, to evolve a physiological mechanism, such as selective brain cooling, that would help them to conserve water when they moved into arid regions (Maloney et al. 2007; Strauss et al. 2017). Some dismiss the idea that hominins carried water, because it would have required carrying several kilograms (Hanna and Brown 1983), for which they did not have the means (Ruxton and Wilkinson 2011a). However, the provision of just a little water to hyperthermic (non-human) mammals following the suppression of evaporative cooling can restore cooling and reverse hyperthermia within minutes (e.g., McKinley et al. 2009). The sensory cues for that seem to be related to the act of drinking (Takamata et al. 1995), but it can even be sufficient just to show water to an hypohydrated animal (Claus Jessen, personal communication), so the response seems to resemble a Pavlovian (classical conditioning) mechanism. That phenomenon occurs in baboons (Fig. 4; Brain and Mitchell 1999; Mitchell et al. 2009), modern humans (Lee and Mulder 1935; Senay and Christensen 1965; Takamata et al. 1995; Nose and Takamata 1997) and presumably occurred within our ancestors. Remarkably little water is required, so carrying small volumes, or water-rich plants, could have allowed our ancestors to roam widely, for some time after drinking, in dry and open habitats without incurring lethal hyperthermia.
Whilst hyperthermia was temporarily averted, neither the blood volume nor its osmolality would be modified, although the suppression of hyperthermia reduces water loss via evaporative cooling, and in Namib Desert baboons (Brain and Mitchell 1999) that effect can last from noon into the night, when body heat could be lost non-evaporatively. Wheeler (1991a, 1991b) provided an alternative explanation for the avoidance of hyperthermia in those dry and open habitats. That is, a bipedal posture reduced the skin surface area exposed to direct solar radiation and enhanced both convective and evaporative cooling. Wheeler's modelling was confined to a standing hominin, but his hypothesis was endorsed for a walking hominin by Ruxton and Wilkinson (2011a, 2011b). When running, evaporation is further elevated by the relative air movements that accompany locomotion, but that does not imply that a thermoregulatory disadvantage existed for quadrupedal primates in those environments. For example, quadrupedal baboons, which exploited those habitats at the same time as hominins, do not show any disadvantage, providing they have access to water (Mitchell et al. 2009). It is hard to conceive of any thermoregulatory pressures being an evolutionary driving force for bipedality, if being quadrupedal confers no thermoregulatory disadvantage. Of course, that does not deny the possibility that bipedality might confer secondary thermoregulatory benefits.

Thermoregulation in our ancestral Homo cursor

By the time that our hominin ancestors began exploiting open and dry habitats, they had acquired the thermoregulatory capacity to dissipate the heat they generated when roaming, as well as the radiant heat gained during diurnal activity.

Fig. 4 Baboons exposed to a simulated desert environment, including solar radiation, showed progressive hyperthermia when deprived of water, and fed only dry food for three days. Providing water at 38 °C at peak hyperthermia on Day 3 resulted in complete resolution of the hyperthermia, with abdominal (implanted retroperitoneal loggers) temperatures falling very rapidly to <38 °C. Redrawn from Mitchell et al. (2009)

The key to that thermoregulatory capacity was their prolific, whole-body sweating from naked skin. Resource density would have been reduced in those habitats, so one can envisage more expansive roaming, presumably achieved by walking. But unless running emerged only with Homo sapiens, then our hominin ancestors also ran. Why did they incur the energy costs, thermoregulatory demands and water requirements of running? How did running humans evolve? Sands (2010), who coined the pseudonym Homo cursor, believes that it had nothing to do with exercise performance. For our body mass, compared to other endotherms and particularly to other primates, running is energetically costly, except for highly trained athletes, whilst walking is less costly and more economical (Steudel-Numbers 2003). Those differences are of thermoregulatory importance. During horizontal locomotion, if heat storage is to be avoided, every Joule of metabolically released thermal energy has to be dissipated (Notley et al. 2023a). In hot environments, heat dissipation is reliant upon evaporative cooling. The metabolic heat production of humans during horizontal (unloaded), steady-state walking and running is a non-linear (allometric) function of body mass (Bowes et al. 2021b; steady-state oxygen consumption = 0.023 × mass^0.865). We assume that was similar for our hominin ancestors. Thus, a 60-kg hominin, moving at 4.8 km h−1, would consume oxygen at a rate of 9.9 L km−1.
That equates with a heat production of 217 kJ km−1, and dissipating that heat would require a sweat secretion of 89 g km−1, if all of that sweat evaporated on the skin, which does not happen (Candas et al. 1979a, 1980). If walking continuously for 20 km, that hominin would need to evaporate 1.9 kg of sweat if sweating was 100% efficient. Consequently, in the absence of fluid replacement, our hominin would end up hypohydrated by >3%, and by analogy with modern humans (Wyndham and Strydom 1969a, b; Notley et al. 2023c [Fig. 11]), its deep-body temperature might exceed 39.0 °C, even in temperate conditions. Whilst the energetic cost of locomotion is both speed- and efficiency-dependent, the overall cost of locomotion is determined principally by exercise duration. Let us return to Alberto Salazar, whose metabolic power generation during a marathon would not have exceeded 1.5 kW (Costill et al. 1971). To dissipate the resulting metabolic heat, he would have to evaporate sweat at 2.2 L h−1. In hot-humid environments, only ~60% of discharged sweat can evaporate (Stewart 1981). Greater evaporation can be expected during marathons, which are rarely run in unfavourable thermal environments, although a 60% evaporation rate of Salazar's peak sweat rate (3.7 L h−1) matches the 2.2 L h−1 required to dissipate the metabolic heat. Solar heat might add 100–150 W to the metabolic load (Nielsen 1990). That can be offset by his running speed (5.8 m s−1), because it would require the dry-bulb temperature to be only 4 °C below his skin temperature to dissipate all of that solar heat via convective cooling (Mitchell et al. 1969). His performance would not have been limited by his capacity to dissipate metabolic heat through evaporation. Our hominin ancestors were unlikely to have run for hours at such a high metabolic power generation (Halsey 2016), nor would they have had the fatigue resistance of contemporary athletes (Marino et al. 2022).
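The heat-budget arithmetic above can be verified with a short script. It is a sketch, not the authors' model: the energy equivalent of oxygen (~21.9 kJ per litre, chosen so the result matches the 217 kJ km−1 quoted above), the latent heat of sweat vaporisation (~2.43 kJ g−1) and the L min−1 unit assumed for the allometric equation are our assumptions, not values stated in the source.

```python
# Worked check of the walking-hominin heat budget described in the text.
# Assumptions (not stated in the source): the allometric equation returns
# L O2 per minute; ~21.9 kJ of metabolic heat per litre of O2; latent heat
# of sweat vaporisation ~2.43 kJ per gram.

def vo2_l_per_min(mass_kg: float) -> float:
    """Steady-state oxygen consumption, allometric in body mass."""
    return 0.023 * mass_kg ** 0.865

MASS_KG = 60.0          # body mass of the hypothetical hominin
SPEED_KM_H = 4.8        # walking speed
KJ_PER_L_O2 = 21.9      # assumed energy equivalent of oxygen
KJ_PER_G_SWEAT = 2.43   # assumed latent heat of vaporisation

vo2_per_km = vo2_l_per_min(MASS_KG) * 60.0 / SPEED_KM_H  # ~9.9 L O2 per km
heat_per_km = vo2_per_km * KJ_PER_L_O2                   # ~217 kJ per km
sweat_per_km = heat_per_km / KJ_PER_G_SWEAT              # ~89 g per km
sweat_20km_kg = sweat_per_km * 20.0 / 1000.0             # ~1.8 kg over 20 km
hypohydration_pct = 100.0 * sweat_20km_kg / MASS_KG      # ~3% of body mass

# The same bookkeeping for the Salazar figures quoted in the text:
SALAZAR_KW = 1.5
req_sweat_l_h = SALAZAR_KW * 3600.0 / (KJ_PER_G_SWEAT * 1000.0)  # ~2.2 L/h
evap_from_peak_l_h = 3.7 * 0.60                                  # ~2.2 L/h
```

With these constants the 20-km total comes to roughly 1.8 kg; the 1.9 kg quoted in the text presumably reflects rounding at intermediate steps.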
Whilst we do not know when their sweating capacity evolved to match that of modern humans, their running speed may have been limited by their sweating capacity. Which of our ancestors could have been the first Homo cursor? Experts in hominin energetics believe that the first to have the posture, limb length and hind-limb extension necessary for endurance running was Homo erectus (e.g., Steudel-Numbers and Wall-Scheffler 2009; Pontzer 2012b, 2017b; Raichlen and Pontzer 2021), which emerged ~1.9 million years ago (Pontzer 2012a). Initially, it may only have been the male Homo erectus that was capable of endurance running, because the pelvis of his average female counterpart seems less suited to running (Simpson et al. 2008). The australopithecine precursors of early Homo were prey, not predators (Brain 1983), who ate predominantly underground tubers (Dávid-Barrett and Dunbar 2016), and their skeletal morphology indicates they would not have been proficient runners (Carrier et al. 1984; Pontzer 2012b). Early Homo was omnivorous, not carnivorous. They had the capacity to be hunters, and not the hunted. Homo erectus, the hunter, has been the source of enthusiastic conjecture about the origin of running. We have restricted our enthusiasm to the thermoregulatory aspects and to the hypotheses advanced about why Homo erectus needed to run. The first theory was that it enabled them to get to carcasses faster than other scavengers, while the second hypothesis was based on persistence hunting (Lieberman 2015). Scavenging could have occurred at any time, but persistence hunting, which entailed running prey, usually ungulates (hooved animals), to exhaustive hyperthermia, would have been most effective in the hottest part of the day. That would require resistance to fatigue (Marino et al. 2022), and would impose a combined metabolic and environmental heat load on the hunter, with the inevitable hypohydration.
The question of whether Homo erectus could tolerate hypohydration has been asked more often than the question of whether he could tolerate hyperthermia better than his prey. Like Ruxton and Wilkinson (2011a), we suspect that hyperthermia imposed the greater risk, and we have elaborated on that within Part 3 of this historical series (Notley et al. 2023c). Scavenging is unlikely to have challenged thermoregulation. On the other hand, persistence hunting would have relied on the thermoregulatory competence of Homo erectus outperforming that of its prey. Much of the conjecture about persistence hunting by our ancestral hominins is based on observations of how persistence hunting has been carried out by modern hunter-gatherers in arid and semi-arid, sub-Saharan Africa (Liebenberg 2006; Glaub and Hall 2017), although they are not the only indigenous persistence hunters (Liebenberg 2006). Persuasive in conjectures about hominin persistence hunting has been the view of Liebenberg (2006) that "data from observations of !Xo and /Gwi hunters of the central Kalahari in Botswana … suggest that persistence hunting was a very efficient method under certain conditions. Compared with other forms of hunting, it may have been one of the most efficient" (P. 1017). Those Kalahari hunters pursued their prey relentlessly, often tracking its footprints without stopping their running. One successful hunt covered 33 km in 4 h 57 min, at an average speed of 6.6 km h−1. If that was typical of the speed of persistence hunting for our hominin ancestors, it is slow by modern standards, since winning the Comrades Marathon (Fig. 1) requires running at 16.5 km h−1 for five hours. Nevertheless, persistence hunting by running was likely to have been more successful than persistence walking (Hora et al. 2022).
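The paces quoted above are easy to verify; a short check, using only the figures given in the text:

```python
# Average speed of the quoted Kalahari persistence hunt versus the pace
# needed to win the Comrades Marathon (both figures as given in the text).
hunt_speed_km_h = 33.0 / (4.0 + 57.0 / 60.0)   # 33 km in 4 h 57 min -> ~6.6
comrades_speed_km_h = 16.5                     # sustained for five hours
pace_ratio = comrades_speed_km_h / hunt_speed_km_h  # modern race pace ~2.5x
```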
It was that Kalahari persistence hunt that was modelled by Hora and colleagues (2020) when evaluating whether the thermoregulatory capacity of Homo erectus was compatible with persistence hunting. They set as a criterion that hypohydration from sweating should not exceed 10% of the body mass, and, according to their model, it would not have done so. They considered, but did not analyse, the likely concomitant hyperthermia. Though it requires extrapolation of the predictive equation of Wyndham and Strydom (1969a, b) well beyond the original data (a potentially dangerous practice), a predicted deep-body temperature of 43.4 °C at 10% hypohydration would result. Our Homo erectus persistence hunter would have increased his risk of developing heat stroke, even if he survived dehydration, although marathon runners often experience similarly high deep-body temperatures without sequelae (Pugh et al. 1967; Maron et al. 1977; Maughan et al. 1985). Not everyone shares the enthusiasm for the idea that either scavenging or persistence hunting contributed to the evolution of fast endurance running in modern humans. The sceptics, Pickering and Bunn (2007), pointed out that high carcase densities would not have required running to scavenge, and the habitat encountered by Homo erectus was not open, but savanna-woodland, with reduced visibility. They suggested that tracking prey in those habitats was very difficult and that persistence hunting is extraordinarily rare in modern hunter-gatherers. Indeed, they believe that the main motivation for the Kalahari hunt described by Liebenberg (2006) was that the hunters were being filmed for a television documentary. Sands (2010) was also sceptical about the role of subsistence hunting in the evolution of Homo cursor, instead supporting the possibility that the runner's high might have driven the desire to engage in fast endurance running.
Whatever the reason, endurance running (and swimming) will disturb one or more of our regulated variables (Notley et al. 2023a [Table 1]), and when those practices become habitual, they will induce a suite of physiological adaptations that are overlaid onto the genotypic attributes acquired during the ascent of Homo sapiens, and the progression of our ancestral sweating specialists (Homo sudomotor; Notley et al. 2023c) through to Homo cursor (running man; Sands 2010).

Physiological adaptations to changes in body temperature

We embark on the last topic from this review series recognising that it could form two independent reviews, one on the adaptations to heat (e.g., Sundstroem 1927; Kuno 1934, 1956; Yoshimura 1964; Wenger 1988; Taylor 2006a, 2014; Horowitz 2014) and another on cold adaptation (e.g., Hammel 1964; Hicks 1964; LeBlanc 1992; Himms-Hagen 1996; Young 1988, 1996; Launay and Savourey 2009; Mäkinen 2010; Daanen and Van Marken Lichtenbelt 2016; Yurkevicius et al. 2021). Nevertheless, in the footsteps of Wyndham (1969), Bligh (1973), Zeisberger and Roth (1996) and Tipton et al. (2008), we proceed with the enthusiasm of the brave, a desire for thoroughness, but with a wish to avoid reader exhaustion, as we discuss the critical historical steps that led to our contemporary understanding of human adaptations to hot and cold conditions. As with Parts 1–3 of this series, our work has been divided into three epochs: the period before 1900, the Krogh-Hill epoch (1900–1930), named after August Krogh (1874–1949, Denmark) and Archibald V. Hill (1886–1977, England), and the modern epoch. Our primary emphasis is upon the potential benefits that thermal adaptation has for those who engage in endurance-dependent athletic and occupational pursuits for which Homo sapiens have a genetic capability.
Our focus is upon the thermoeffectors that defend the thermal integrity of the deep-body tissues, but in the cold, there is the hazard of peripheral soft-tissue injuries, so it is of interest to know whether local adaptation of those peripheral tissues can be protective. At the end of the nineteenth Century, some were anxious about living and working in unfamiliar tropical climates (Sambon 1898a). Some believed Europeans could neither tolerate nor adapt to those climates. Therefore, we open this section by showing how opinions on that theme changed from 1771 to 1923. The last statement, which reveals the cardinal sins of bias and unfamiliarity with the literature, appeared within one of Britain's most prestigious medical journals and was made by the Director-in-Chief of a prestigious scientific trust.

"... some employments, which are of such a nature, as cannot well be performed in hot and unhealthy countries, by such as are lately arrived, without imminent danger of their health and lives ..." (Lind 1771 [P. 104]).

"So late as 1850 Dr Knox of London declared that Englishmen transplanted to America and Australia must inevitably deteriorate and would die out in a few generations. The absurdity of such a statement does not need to be pointed out ..." (Cutler 1902 [P. 421]).

"THE title of this lecture is intended to indicate that the white race, under existing circumstances, cannot be looked upon as more than wayfarers in the tropics, as people who have there no continuing city in the sense that they can settle and rear a progeny blessed with the energy and virility ..." (Balfour 1923a [P. 1329]).

On the nature of adaptation

Notley et al. (2023c) reviewed the acute thermoeffector responses of humans that defend body temperatures when confronted with diverse thermal stresses, and thereby reduce physiological strain.
Those processes are forms of physiological accommodation, since they allow us to remain within, and to tolerate, thermally stressful conditions. Our individual capacities for accommodation are a function of our genetically determined anatomical and physiological characteristics (our genotype), but we can modify our phenotype through physiological adaptation. Over millennia, the genotypic characteristics of some individuals provided them with survival advantages, so nature gave them the opportunity to transmit their genes (Darwin 1859). Through natural selection, humans acquired, and then inherited, the genetic attributes of sweaty, endurance specialists ("Thermoeffector evolution: vascular networks, homeothermy, sweating and hairlessness"). Given the wide, inter-individual variations that exist within our genetically determined characteristics, our attention now turns towards the degree to which we can modify those anatomical and physiological attributes during adaptation. Phenotypic adaptations can enhance stress tolerance during athletic and occupational pursuits, and they are revealed through modifications in thermoeffector control and the stability of our regulated variables (Notley et al. 2023a [Table 1]). Such changes can occur when living, working or exercising in one's natural environment (acclimatisation), or through artificially imposed environmental conditions (acclimation), both of which repeatedly disturb thermal homeostasis. Whilst acclimatisation is a long-term process that takes years to be fully expressed, acclimation effects occur within a few days (short-term process), with ergogenic advantages occurring within a few weeks. In this section, we consider thermal adaptation as it pertains to physiologists with interests in athletic performance, physiologists who focus upon occupational pursuits and thermal physiologists with interests in both resting and exercising states.
Since increased thermal tolerance can be induced either passively or actively (exercise), one might wonder whether sitting in a hot bath or a sauna with a good book is advantageous for marathon runners preparing to race in the heat. Will sitting in an ice bath be beneficial for swimmers of the English Channel? Will the regular exposure of hands and feet to ice-cold water help to protect them from cold injuries? Which athletes and workers would benefit more from thermal adaptation; those who are about to experience the heat or those embarking on stressful cold exposures? How important are the physiological changes relative to the benefits gained from experiencing and discovering how stressful heat and cold conditions can dramatically affect athletic and work performance? Well-trained endurance athletes may already be so well adapted to the applicable climatic conditions that additional physiological improvements may be quite small, and perhaps not even quantifiable. However, the experience and psychological benefit obtained from trying to race at personal-best pace in oppressive conditions may well lead to the development of a strategic approach that makes the difference between successful racing and failing to complete the race (Taylor and Cotter 2006), or worse still, succumbing to heat illness (Notley et al. 2023c). Such exposures afford opportunities to prepare and rehearse behavioural strategies, including race pacing and rehydration strategies. The significance of those benefits depends upon factors that are beyond the expertise of most physiologists. We may describe adaptations according to their impact upon our regulated or controlled variables, with thermal adaptation truly being an integrated physiological process; "every adaptation is an integration" (Barcroft 1934 [P. 187]). Thus, our observations will always be constrained by our methodological limitations (Notley et al.
2023b) or the scope of our research questions, since they are just pieces of the overall puzzle. No single measurement or mechanism will enable us to explain the entire process of thermal adaptation. Moreover, whilst our emphasis is not behavioural thermoregulation, behaviours are among our first, as well as our most powerful responses (Cabanac 1972; Nelson et al. 1984; Malvin and Wood 1992). They can ensure that modern humans seldom encounter stress, with such stress avoidance possibly being disadvantageous to the survival of individuals and our species (Selye 1973). Furthermore, we advise caution concerning the application of observations from the general population to athletes, who have unique and more homogeneous genotypes and phenotypes. The reverse application is also cautioned. One reason for caution lies in the different capacities of those sub-populations for adaptation. The general population contains stress-intolerant, sedentary individuals who respond vigorously to acutely applied homeostatic disturbances. Following repeated disturbances, they will undergo pronounced physiological adaptations that increase their stress tolerance. Those individuals are high responders. At the other end of that spectrum are the adventurers, and the occupational and recreational athletes. Through those activities, they will already be physiologically adapted, although perhaps not completely. They are low responders. It is a mistake to assume that the large physiological adaptations observed in high responders will also occur in low responders exposed to the same stresses. A classical example of that truism is seen with increases in the maximal ability to extract and consume oxygen during exercise (peak aerobic power; Hill and Lupton 1923) that occur in response to endurance training. Whilst substantial changes to those attributes may occur in the general population (Saltin et al.
1968b; Ekblom 1969), well-trained individuals exhibit little or no change. Indeed, our adaptation responses even seem to be genetically linked (Prud'homme et al. 1984; Vollaard et al. 2009), with most elite athletes attaining their genetic potential for training-induced adaptations. Thus, the measurement of peak aerobic power is of limited value with regard to predicting athletic performance in elite athletes, and that myth has been debunked elsewhere (Noakes and Ekblom 2008). At the close of the Krogh-Hill epoch, two scientists from the Southern Hemisphere highlighted the implications of inter-individual variability, as it related to thermal adaptation. Edward S.J. Sundstroem ([Sundström] 1880–1970, Finland and Australia; Sundstroem 1927) wrote that most populations include individuals lacking any thermal adaptation, those who are in the process of adapting and those who are well adapted. Moreover, he recognised "that individuals may not be all alike in their choice of the acclimatization mechanisms which nature has placed at their disposal. Some may succeed in striking the medium road preferring to divide up the burden imposed by a trying environment on several functions without unduly overtaxing any of them" (Sundstroem 1927 [P. 352]). Secondly, Aldo O. Dreosti (1902–1971, South Africa; Dreosti 1935, 1950), when evaluating the heat tolerance of recruits from different African communities prior to their employment within hot-humid mines, found that, from more than 42,000 miners tested, 15% were intolerant, while 25% had acquired heat tolerance naturally. Almost a century earlier, Johnson and Martin (1841) wrote that "the Constitution of Man is better adapted to bear those changes of temperatures and other circumstances, experienced in migrating from the northern to a tropical region, and vice versâ, than that of any other animal … this … is a distinctive characteristic of the human species …" (Johnson and Martin 1841 [P. 1]).
Modern occupational athletes are required to wear protective clothing, but recreational athletes may choose clothing best suited to the thermal stress imposed by the expected ambient conditions. What can those athletes learn from the physiological responses of different populations that lived and worked their entire lives in hot climates? What can they learn from clothed workers in hot industries? Clothing compromises heat loss, with physiological strain being dramatically elevated when metabolic heat production approaches its maximal rate. For those wearing protective clothing, the advantage of increasing sweat rate, a common feature of short-term heat adaptation, is negligible, with heavy sweating significantly degrading thermal protection and elevating discomfort (McLellan et al. 2013; Taylor 2015). Whilst elevating sweat rate may be advantageous when competing in the heat, high sweat rates can be problematic during ultra-distance events, even when drinking is possible, because water absorption rates do not match sweat rates (Gisolfi et al. 2001; Notley et al. 2023c). In the cold, clothing choices are generally aimed at reducing heat loss, but the swimmers of Ice Miles, as well as those in events controlled by the International Winter Swimming Association, compete without thermal protective or buoyancy clothing. Those are worst-case scenarios that are normally associated with unintentional cold exposures. Not surprisingly, the wearing of wetsuits when racing in water temperatures <18 °C has been mandated for many events (Fédération Internationale de Natation). However, insulated clothing means that individuals who live in very cold conditions often experience the cold only briefly, and those exposures are rarely of the entire body (Rodahl 1958). There are some occupations in which the hands (e.g., fish filleters) or entire bodies (e.g., military personnel) are regularly cold exposed.
Much has been learned about the physiology of human cold adaptation from sub-populations that live and work in cold climates. But before we examine different forms of thermal adaptation, it is useful to consider the first principles of physiological adaptation.

To which thermal stimuli do humans adapt?

We open this sub-section with a strength-training analogy, during which repeated loading of the skeletal muscles with an external resistance (stress) will induce neuromuscular adaptations that result in subsequent exercise with the same external resistance imposing less physiological strain (Edgerton et al. 1996). That is an habituated response. As adaptation progresses, the strain experienced when that same external stress is re-applied diminishes. However, for the strength athlete to continue to adapt and improve, that external stress (forcing function) must be progressively increased (load, repetitions, volume [cumulative mass]) and that is the cornerstone of the overload principle (DeLorme and Watkins 1951). Indeed, regardless of which physiological function one examines, adaptation will not occur unless sufficiently large disturbances are repeatedly applied (Adolph 1956, 1964, 1972). As time progresses, our responses to the application of those same external stimuli will become progressively less pronounced. Let us weave that general principle into that which is known about regulated and controlled physiological variables (Notley et al. 2023a [Table 1]). Repeated disturbances to one or more of our regulated, but not our controlled, variables will induce physiological adaptation. Thermal adaptations occur when one (or more) of our regulated variables is sufficiently and frequently disturbed from its normothermic state. Consider our body temperatures, with the temperatures of our deep-body, skin and other thermosensitive tissues combining to form the regulated variable.
Repeatedly changing any of those temperatures may lead to local or whole-body thermal adaptations. During heat adaptation, thermal stress can be imposed externally (exogenous heat stress), which elevates both deep-body and skin temperatures, or metabolically (endogenous stress), which elevates deep-body temperature, but not necessarily skin temperatures, depending on the environment. For cold adaptation, heat passes through the skin to the environment, so there is no natural equivalent whereby the deep tissues can be cooled independently of the skin, although internal cooling could be achieved using extracorporeal cooling of the blood. Thus, cooling of the deep-body tissues is invariably accompanied by cooling of the superficial tissues. The stimuli for thermal adaptation are those body-temperature disturbances (physiological strain), which can be induced passively and actively. One might quantify the physiological overloads that induce adaptation, just as coaches might quantify their training programmes using indices that reflect the size (intensity) of each stimulus, the duration and frequency of their application and the volume of each overload. They might even seek to evaluate the cumulative physiological burden of each training session. We are not seeking to quantify the stimulus per se (e.g., ambient temperature or metabolic heat production), but its impact on physiological strain, as reflected within changes in a regulated variable. For endurance training, Banister and Calvert (1980, Canada) developed the concept of the endurance-training impulse, within which strain was quantified using heart-rate data. However, to apply that concept to physiological adaptations, we must replace heart rate, which is a controlled variable, with one of our regulated variables (e.g., body temperature), since they are the variables to which our regulatory centres respond.
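Substituting a regulated variable for heart rate amounts to integrating the deviation of body temperature from its basal value over the duration of the exposure. A minimal numerical sketch, using hypothetical sampling data and trapezoidal integration (the function and variable names are ours, for illustration only):

```python
# Numerical sketch of a thermal impulse: the time integral of the
# deviation of body temperature (Tb) from its basal value (Tb0).
# Sampling times and temperatures below are hypothetical.

def thermal_impulse(times_min, tb, tb_basal):
    """Trapezoidal integration of (Tb - Tb0) dt; returns degC.min."""
    impulse = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        mean_dev = ((tb[i] - tb_basal) + (tb[i - 1] - tb_basal)) / 2.0
        impulse += mean_dev * dt
    return impulse

# Hypothetical 60-min heat exposure: deep-body temperature rises from a
# basal 36.8 degC to 38.4 degC, sampled every 15 min.
times = [0, 15, 30, 45, 60]
tb = [36.8, 37.3, 37.9, 38.2, 38.4]
impulse = thermal_impulse(times, tb, 36.8)  # degC.min for this session

# Summing the impulses of successive sessions gives a cumulative
# adaptation impulse; here, two identical sessions.
cumulative = impulse + impulse
```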
Therefore, Taylor and Cotter (2006) modified the training-impulse concept for use during thermal adaptation, resulting in a thermal impulse (Eq. 1), which enabled the quantification of either heat- or cold-adaptation stimuli.

Thermal impulse = ∫ (Tb-i − Tb-0) dt [°C·min]   (Eq. 1)

where: Tb-i is body temperature at time i, Tb-0 is the basal body temperature and dt is the duration of the thermal stimulus.

Unbeknown to those authors, a similar concept (thermal response or fever index) had been used for investigating the pathophysiology of fever (e.g., Kenedi et al. 1982). Since we do not know the integrated body temperature to which our regulatory centres respond, we use a surrogate of one of its components (deep-body temperature), but that surrogate must faithfully track changes in the temperature of the blood perfusing those centres (Notley et al. 2023b). The impact of successive thermal loads can be estimated by adding individual impulses, which then becomes the cumulative adaptation impulse. Details of these concepts and adaptation theory are available elsewhere (Taylor 2014). During natural forms of thermal adaptation, our regulatory centres are responding to those same stimuli.

Natural adaptation: acclimatisation within hot and cold climates

Interests in thermal adaptation were initially driven by pragmatic issues (e.g., colonisation, the industrial revolution, international conflicts), and preceded athletic pursuits. An outcome of colonisation that ignited the interests of scientists was the apparent inability of Europeans to tolerate tropical climates (Lind 1771; Balfour 1923a), relative to tropical indigenes. Did those indigenes possess anatomical or physiological attributes that were not present in Europeans? If they did, then how did they acquire that advantage? Our summary timeline (Fig. 5) highlights critical historical steps in the development of our understanding of human heat adaptation.
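To make the thermal impulse of Eq. 1 concrete, it and its cumulative counterpart can be sketched numerically. The fragment below is illustrative only: the temperature trace is invented, and the trapezoidal discretisation of the integral is our assumption, not a method taken from the papers cited above.

```python
# Illustrative sketch of Eq. 1 (thermal impulse). All data are invented;
# the trapezoidal rule is an assumed way to discretise the integral.

def thermal_impulse(times_min, temps_c, basal_c):
    """Approximate the integral of (Tb-i - Tb-0) dt, in degC.min."""
    impulse = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        # mean deviation from the basal temperature over this interval
        dev = ((temps_c[i] - basal_c) + (temps_c[i - 1] - basal_c)) / 2.0
        impulse += dev * dt
    return impulse

# A 60-min heat exposure sampled every 15 min (fabricated deep-body trace)
t = [0, 15, 30, 45, 60]
tb = [37.0, 37.3, 37.7, 38.0, 38.2]
session = thermal_impulse(t, tb, 37.0)
print(round(session, 1))  # 39.0 (degC.min)

# The cumulative adaptation impulse is simply the sum across sessions
print(round(sum([session, session, session]), 1))  # 117.0
```

Summing session impulses, as in the final line, mirrors the cumulative adaptation impulse described above.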
Perhaps the person of greatest current relevance in the nineteenth Century was James Johnson (1777–1845, England), who, with James R. Martin (1796–1874, England), had just published the sixth edition of The Influence of Tropical Climates on European Constitutions (Johnson and Martin 1841 [colonial India]). That text preceded the birth of Patrick Manson (1844–1922, England), the first Director of the London School of Hygiene and Tropical Medicine (To and Yuen 2012), which was established in 1899, following its counterpart in Liverpool (1898). Both were part of the constructive imperialism (an oxymoron) policy of the British Colonial Secretary of the time (Harrison 1992). Johnson and Martin (1841) knew that we could elevate heat production in the cold, and dissipate excess thermal energy in the heat. Whilst they did not understand either mechanism, Johnson did not accept prevailing teachings without question (Harrison 1992), and he believed the view that humans could not adapt to the heat to be a misconception; he was not alone.

“... the almost universal opinion that the European cannot colonize the tropics, but must inevitably fall, sooner or later, a victim to the influence of their deadly climate. I will endeavour to prove that this statement is wrong ... there is no reason why the European should not conquer the tropical world.” (Sambon 1898a [P. 589]).

We introduced John Davy (1790–1868, England) in Parts 1 and 2 of this series (Notley et al. 2023a, 2023b), and noted that he was possibly the first person to record deep-body temperatures during exercise, as well as circadian and seasonal variations (Davy 1845). We now accompany him on a cruise to Barbados (West Indies), during which he replicated those measurements within a tropical climate (Davy 1850 [tabulated data: Pp. 449–466]).
Those data provided some of the earliest records of the acute impact of the tropics on body temperatures, and soon became the accepted dogma. Davy noted that deep-body (sublingual) temperatures were almost 1 °C higher in the tropics than within temperate regions, and that their circadian variation covered almost 2 °C, but with the nadir in the morning, rather than the evening (as recorded in England).

To those data, we add the observations of Richard Neuhauss (1855–1915, Germany) on deep-body temperature (rectum and axilla), heart rate and urine production, recorded four times daily (0600, 1200, 1800 and 2200 h) on his trip around the world in 1884 (March–November; Neuhauss 1893 [Pp. 373–379]). That trip included northern and southern temperate zones, as well as tropical regions. Like Wunderlich (1871) and Richet (1889), he found rectal temperature to be ~ 0.6 °C warmer than the axilla, and observed slightly higher body temperatures when in the tropics (also see: Pembrey 1898; Castellani and Chalmers 1913). However, the data of Davy (1850) and Neuhauss (1893) involved tropical sojourners, so they offer little to our understanding of thermal adaptation.

[Fig. 5: Human heat adaptation: mileposts in the progression of our understanding over the past two centuries. The image is a stamp issued in Grenada to commemorate the Nobel laureate Christiaan Eijkman. Source: Singapore Medical Journal (Merritt and Tan 2011). Accessed: March 17th, 2022.]

To address that limitation, we return to William J. Young (1878–1942, England and Australia), whom we introduced in Notley et al. (2023b). He studied the rectal, sublingual and urine temperatures of tropical residents of European ancestry (Young 1915a [tabulated data: Pp. 225 and 227–229]). Under basal conditions, he found no significant variation between the rectal temperatures of Europeans acclimatised to the tropics, and those of their counterparts living in Europe.
From that apparently unremarkable, yet important, observation, we turn our attention to studies that examined the possibility that differences in thermal tolerance (cold and hot) might exist between indigenes and transplanted Europeans, for it was within the former that physiologists first observed thermal adaptation. Those indigenes completed physically demanding work in their local climates without undue strain, while their recently arrived, European counterparts could not (Lind 1771). Over 150 years later, Eijkman (1924) described tropical indigenes as “permanent summermen”, and their relevance to our interests is that they were the occupational athletes of their time, so it is from those athletes that we gain our first insights into human heat adaptation.

Heat-induced acclimatisation in indigenous populations

Another colonial power was The Netherlands, and working almost right through the Krogh-Hill epoch was the Dutch physician Christiaan Eijkman (1858–1930, Fig. 5; Nobel Prize in Medicine in 1929; Nobelstiftelsen 1965). He led the Medical Laboratory at the Military Hospital in Indonesia (1888–1896; Pietrzak 2019), and discovered that polyneuritis endemica perniciosa (beriberi), which was common in soldiers of the Dutch East Indies, was nutritionally related (Jansen 1950; Merritt and Tan 2011; Pietrzak 2019). Whilst unknown at the time, it was the result of a thiamine deficiency, but our interest in him stems from his thermal investigations.

Unlike Woodruff (1900, U.S.A.) and Balfour (1923a, England), Eijkman believed that people of European ancestry could indeed attain a state of acclimatisation similar to that of the indigenes, if they were protected from endemic diseases (Eijkman 1924). Nevertheless, there existed a belief that tropical climates were associated with, and perhaps causally linked to, the origin of many diseases (Sargent 1960).
Two so-called neurological disorders highlight that school of thought: tropical neurasthenia (Woodruff 1900; Taylor 1916; Huntington 1924) and Arctic hysteria (Aberle 1952). The former was a type of climatic determinism (Taylor et al. 2022), in which the climate was believed to predetermine both the choice of where to live and one’s health following the making of that choice. Tropical climates were thought to result in a “… lack of industry, an irascible temper, drunkenness, and sexual indulgence …” (Huntington 1924 [P. 68]). That state is colloquially known as “going troppo”, but tropical neurasthenia was domesticated by Cilento (1925), Sundstroem (1926) and Ladell (1958/59). With regard to Arctic hysteria, Sargent (1960 [P. 243]) said that it “is not only not an Arctic disease but it is not even a climatic disease”. Scientific advancement required people to step away from a “prefabricated set of interpretations” (Kennedy 1962), and Johnson and Martin (1841) stepped away in the first half of the nineteenth Century, with Eijkman (1895, 1924) following that lead.

Christiaan Eijkman also investigated the effects of tropical climates on indigenes and recent European arrivals. He challenged the conclusions of Woodruff (1900) that Europeans could withstand cold stress better than could people of African ancestry, but were less able to lose heat via radiation. Whilst the first statement has been supported within the modern epoch (Burgess and Macfarlane 2009; Maley et al. 2015), Eijkman knew that radiative heat loss was a function of temperature (first principle), and unrelated to skin colour. Woodruff’s interpretation was “physical heresy” (Eijkman 1924 [P. 890]). Although he could not measure radiant heat loss, he used skin dissected from European and Malaysian cadavers (Malaysia was then a Dutch colony), wrapped the skin around a cylinder of heated water and observed equivalent heat loss from both skin types (Eijkman 1895).
Indeed, as demonstrated when radiant heat loss could be measured accurately, skin-colour differences have no effect on the emissivity of human skin in the heat-loss waveband (Mitchell et al. 1967), although darker objects absorb radiation better across a wider range of the electromagnetic spectrum, as Eijkman (1895) observed. In recognition of his wide-ranging contributions to tropical medicine and health, the Royal Tropical Institute (previously the Colonial Institute, Amsterdam; Notley et al. 2023b [Table 2]) initiated The Christiaan Eijkman Medal in 1927.

Further evidence that people of European ancestry were not precluded from successfully adapting to work in the heat came from the research of Joseph S. Weiner (1915–1982, South Africa and England), whom we have met throughout this series. Weiner had observed that, after six months of hard physical work in hot-humid mines, miners of African ancestry appeared not to be as well adapted (acclimatised) to work in the heat as artificially adapted (acclimated) Europeans (Weiner 1950). Those physiological differences disappeared if both groups went through the same heat-acclimation programme (Wyndham et al. 1964b). From a collaboration among South African, French and Australian scientists, Nick Strydom and Cyril H. Wyndham (1916–1987, South Africa; Anonymous 1987; Wolstenholme 1987; Mitchell and Laburn 2022) compared the acclimatisation state of various indigenous groups with Europeans who lived with them (Strydom and Wyndham 1963). They found that different indigenous groups had the same deep-body (rectal) temperature response to exercise in the heat, but that desert-dwelling Bushmen and Arabs had higher sweat rates. Most remarkable was how similar the responses were for both the tropical indigenous men and the European men with whom they lived. They all revealed similar adaptations to exercising in the heat.
Sudomotor responses

The colonial powers sent both administrative and military personnel to their colonies. Whilst the administrators were just uncomfortably hot, the soldiers suffered heat-related problems, including many deaths from exertional heat stroke (Notley et al. 2023c). Christiaan Eijkman decided to investigate the causes, and whether or not they could be managed. He compared people of European and Malaysian ancestry, in possibly the earliest investigation of ethnic differences in thermoregulation (Eijkman 1895, 1924), and observed equivalent deep-body (axilla) temperatures in the morning (37.0 and 36.9 ℃, respectively).

Of greater interest, however, are his observations concerning sweating, and the experimental design he developed for that study. Eijkman (1895) studied pairs of Europeans and Malaysians, matched for body mass, averaging 57.4 and 55.6 kg (respectively). From 27 parallel experiments (75 min) in humid air (32 ℃ dry bulb), presumably at rest, he found their average deep-body temperatures to be equivalent (37.1 and 37.0 ℃, respectively), whilst the Europeans lost sweat 30.4 g h−1 more rapidly. He also recognised that sudorific response as superfluous and wasteful (Eijkman 1895), since not all of that sweat evaporated on the skin. Superfluous sweating was rare in the Malaysians, but common in the Europeans, especially in new arrivals (Eijkman 1924). Those observations provided perhaps the first illustration of what would become known as the acute (physiological accommodation) and short-term adaptation responses of the Europeans, and the longer-term habituated responses of the Malaysians.

To explore those ethnic differences, Eijkman examined sweat-gland densities across ten skin regions, observing “no perceptible difference between the two races” (Eijkman 1924 [P. 891]). The average glandular density of the Malaysians was 160 glands cm−2; that of the Europeans was 162 glands cm−2.
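Eijkman’s point that superfluous sweat is wasteful can be made quantitative. The arithmetic below is our illustration, not his: the latent heat figure (~2426 J per gram of evaporated sweat) is the value commonly used in partitional calorimetry, and the sweat rates are invented.

```python
# Worked arithmetic (our illustration): only the evaporated fraction of
# secreted sweat cools the body. The latent heat value is the figure
# commonly used for sweat; all rates are invented.

LATENT_HEAT_J_PER_G = 2426.0  # J liberated per gram of sweat evaporated

def evaporative_cooling_w(sweat_rate_g_per_h, evaporated_fraction):
    """Cooling power (W) from the evaporated portion of secreted sweat."""
    evaporated_g_per_s = sweat_rate_g_per_h * evaporated_fraction / 3600.0
    return evaporated_g_per_s * LATENT_HEAT_J_PER_G

# One litre of sweat per hour, fully evaporated versus half dripped away
print(round(evaporative_cooling_w(1000.0, 1.0)))  # 674 (W)
print(round(evaporative_cooling_w(1000.0, 0.5)))  # 337 (W)
```

On this arithmetic, a hypersecreting new arrival whose extra sweat largely drips away gains little cooling for the water spent, which is precisely the inefficiency Eijkman identified.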
Next, he studied both groups resting in air at 30 ℃ (360 min; 200 mL water provided), with urine volumes and body masses determined before and after exposure. Both sweat secretion and urine production were higher in the Europeans (Eijkman 1924). When those trials were repeated (120 min, 32 ℃ dry bulb, artificially humidified air), but with the Europeans water deprived, their sweat and urine flows were still higher than those of the Malaysians. In another first, he observed that “unevaporated sweat, the quantity of which, weighed together with the clothing, amounted to more than three times as much in the thirsting white as in the water-drinking Malayan” (Eijkman 1924 [P. 891]). Since those Europeans had lived and worked in Malaysia for some time, they were partially heat acclimatised. We now know that only ~ 10% of the increase in sweating, following short-term adaptation to exercise in a hot-humid environment, actually contributes to evaporative cooling, with the rest being superfluous (Mitchell et al. 1976).

Eijkman (1895, 1924) was also well aware of the morphological differences between Europeans and Malaysians, and the impact those differences might have on heat exchange and storage. Indeed, he designed his experiments to negate those effects (P. 138), with his matching of participants marking possibly the earliest attempt to control for differences in the surface area to body-mass ratio (mass-specific surface area), although few followed his lead (Havenith and van Middendorp 1990; Havenith et al. 1995; Taylor 2006a; Notley et al. 2016, 2017). Since dry-heat exchanges are energy-efficient avenues for heat dissipation, larger mass-specific surface areas tend to support heat tolerance during rest and exercise, although other factors are sometimes equally important (Wyndham 1965).
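The surface area to body-mass ratio that Eijkman controlled for can be illustrated with a worked example. The sketch below is ours: body surface area is estimated with the Du Bois and Du Bois (1916) formula, which is our choice rather than anything specified in the sources above, and the two physiques are invented.

```python
# Illustration (our example): smaller bodies carry more surface area per
# unit mass. Body surface area is estimated with the Du Bois formula;
# the two physiques are invented.

def dubois_bsa_m2(mass_kg, height_cm):
    """Body surface area (m^2), Du Bois and Du Bois (1916)."""
    return 0.007184 * mass_kg ** 0.425 * height_cm ** 0.725

def mass_specific_area_cm2_per_kg(mass_kg, height_cm):
    """Surface area available for heat exchange, per unit body mass."""
    return dubois_bsa_m2(mass_kg, height_cm) * 1e4 / mass_kg

slender = mass_specific_area_cm2_per_kg(50.0, 160.0)   # ~300 cm^2/kg
heavier = mass_specific_area_cm2_per_kg(100.0, 180.0)  # ~220 cm^2/kg
print(slender > heavier)  # True: the smaller person exchanges dry heat
                          # across relatively more skin per kilogram
```

The comparison shows why unmatched body sizes confound between-group comparisons of sweat rate and heat storage: the smaller member of a pair starts with a substantial dry-heat-exchange advantage.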
Because humans have the same general (allometric) shape, larger mass-specific surface areas are found in smaller individuals, such as children, many women and all slender individuals. As a consequence, such people are less dependent on activating their autonomically driven avenues for heat loss (Notley et al. 2016, 2017).

During our first epoch, Carl Bergmann (1847, Germany) described broad differences in the morphology of endothermic, non-human animals living in different climates, and suggested that body mass appeared to be inversely related to environmental temperature, and was possibly a climatic adaptation. Joel Allen (1877, 1907; U.S.A.) described differences in limb shape which varied with climate, with animals in colder climates tending to have shorter and thicker limbs. In possibly the first detailed investigation of variations in human morphology, Derek Roberts (1953, England; Crawford 2017) reported evidence across 116 indigenous groups (also see: Suominen 1929; Roberts 1978), finding that body mass was inversely related to the average annual temperature of their country of origin.

That outcome for humans was consistent with the ecological generalisation of Bergmann (1847), in that sphericity favours heat conservation (Sundstroem 1927). Whilst the physical principle remains valid, whether Bergmann’s Rule indeed was a biological rule was challenged by Per F. Scholander (1905–1980, Norway; Scholander 1955, 1959; Blix et al. 2022) and Ernst Mayr (Mayr 1956), with challenges of its applicability to humans continuing (e.g., Foster and Collard 2013). Whether any rule relating human body size to climate is currently relevant is doubtful, given the extensive immigration and world-wide improvements in dietary resources and health services that occurred during the second half of the twentieth Century.
Nevertheless, when we examine research from the Krogh-Hill epoch, during which adequate diets and immigration were much less common, we should be sceptical about comparisons of climate-related raw data from different ethnic groups, unless those data have been corrected for variations in body size, either by the authors or the readers. Eijkman (1895) recognised that requirement.

Similar caution is required when comparing the microanatomy of eccrine sweat glands, their regional distribution and their functional characteristics across people of varying ancestry. That material is covered in detail by Smith and Havenith (2011, England) and Taylor and Machado-Moreira (2013, Australia and Brazil), and summarised by Notley et al. (2023b, 2023c). With that precaution, we now turn to emphasise possible ethnic differences in sudomotor activity, and commence with contributions from Yas Kuno (1882–1977, Japan; Morimoto 2015; Nagasaka 2022; Notley et al. 2023b [Table 1]). In his seminal monograph (Kuno 1934, 1956), he hypothesised that the number of functional sweat glands was determined within the first two years of life. Whilst there is a well-established postnatal conversion of the sudomotor neurons from an adrenergic to a cholinergic phenotype (Schotzinger and Landis 1988; Landis 1990; Guidry et al. 2005), it may be impossible ever to test Kuno’s hypothesis.

However, that hypothesis may imply an ancestral attribute, if being born and raised within warmer climates resulted in higher sweat-gland densities. Whilst there are historical and contemporary data consistent with that possibility (Schiefferdecker 1922; Homma 1926; Kawahata and Sakamoto 1951; Kuno 1956; Toda 1967; Hwang and Baik 1997), others have either challenged those observations (Woollard 1930; Weiner and Hellmann 1960) or provided contradictory evidence (Eijkman 1924; Thompson 1954; Collins and Weiner 1965; Roberts et al. 1970; Green 1971; Knip 1972; Garcia et al.
1977; Samueloff 1987; Inoue et al. 2009). We are amongst the sceptics, as ancestry-related differences in glandular densities seem to be more artefactual than real (Taylor 2014). Instead, we suggest that the whole-body density of eccrine sweat glands for adults of all ancestries will be about 112 glands cm−2, with extensive inter-segmental variation (Taylor and Machado-Moreira 2013).

If glandular densities are similar, then the observation of greater sweat rates for Europeans than Malaysians (Eijkman 1895, 1924) must be attributable to differences in glandular function. How glandular function can differ across individuals was established within the modern epoch, and there are several functional and morphological characteristics of sudomotor activity that are amenable to modification during the course of both passive (resting) and active heat adaptation. Those changes might also explain differences in sweating across short- and long-term residents of tropical and equatorial climates, when compared with residents of temperate climates. Those adaptations relate to three primary changes: an alteration in the initiation of sweating (sudomotor threshold), an increased capacity of the glands to secrete sweat (glandular hypertrophy) and a change in the reactivity of the sweat glands to a given change in body temperature (sudomotor sensitivity). Such changes can reflect adaptations within the regulatory centre itself (threshold) or process adaptations that occur at the sweat glands. Modifications in sudomotor sensitivity can occur both centrally, via altered thermoefferent drive, and peripherally, through changes in cholinergic sensitivity of the glands themselves (Folk 1974; Brück 1986; Werner et al. 2008).

With regard to adaptations to the regulatory centre, we will examine four studies that are relevant to thermoregulation within indigenous populations.
The first two were led by Cyril Wyndham, who introduced the most extensive programme of supervised heat adaptation (acclimation) within climatic chambers, sometimes holding 1000 miners and operating over four shifts a day. Miners worked in simulated underground conditions for four hours a day for ten days (Mitchell and Laburn 2022), with hundreds of thousands of miners being heat acclimated (Wyndham and Strydom 1969b; Wyndham et al. 1973). From trials conducted in research-specific climate chambers, Wyndham and his colleagues found lower sudomotor thresholds in miners of African ancestry and in Bushmen (the MaSarwa, the original modern inhabitants of southern Africa; Wyndham et al. 1966, 1967b). Unfortunately, in neither study was there a control group, so those observations could simply reflect a controller adaptation that might occur within any group of people.

The next two studies included experimental controls. The first was led by Ronald H. Fox (1923–2009, England), and it was he who developed the controlled-hyperthermia technique for artificially inducing thermal adaptation (Fox et al. 1963a). Some contemporary thermal physiologists seem unfamiliar with both the method and its true origin. Those studies were indirectly linked to South Africa, since Wyndham had studied in Oxford (England; Wolstenholme 1987), following in the footsteps of another expatriate, Joseph Weiner (Milton 2022). Weiner had a strong interest in ethnic differences in thermoregulation, which he pursued with vigour in the 1970s (Collins and Weiner 1977a, b), and Fox was part of that team. Fox and his colleagues compared indigenes from a tropical region of New Guinea with two control groups: indigenes of similar ancestry, but who lived in a cool-dry climate, and people of European ancestry (Fox et al. 1974).
In another of the twists that we have sought to illuminate, our fourth investigation came from a combined Malaysian and Japanese team who also compared indigenes from those countries (Wijayanto et al. 2011). Neither of those research groups found evidence for differences in sudomotor thresholds that might be attributable to ethnicity. Thus, the central control of sudomotor function seemed not to differ systematically with ancestry, at least among those experimental groups.

If we accept that conclusion, and the interpretation that eccrine-gland densities do not differ with ancestry and that heat adaptation, at least in the short term, does not change the number of active sweat glands (Sargent et al. 1965 [U.S.A.]; Peter and Wyndham 1966; Inoue et al. 1999a [Japan]), then we must entertain the possibility that differences might exist in the secretion capacity or cholinergic sensitivity of those glands. Indeed, heat adaptation was classically defined on the basis of greater sweat rates (hypersecretion) during subsequent heat exposures (Adolph 1947d), and it was once the consensus that adaptation was incomplete unless sweating was more prolific (Wyndham et al. 1964a; Duncan and Horvath 1988). That is the second of our three possible adaptation responses.

That interpretation is still held by many modern thermal and applied physiologists, although it has long been contested (Burton et al. 1940; Yoshimura 1964; Hori et al. 1976; Candas 1987; Taylor 2006a). Since both genotypic and phenotypic adaptations tend to move physiological mechanisms towards more efficient states, an increase in the sudomotor capacity of indigenes from hot, and particularly from tropical climates, might seem counter-intuitive, because evaporative cooling depletes body water.
Therefore, minimising excessive (superfluous) sweating would improve physiological efficiency, and that is precisely what Eijkman (1895, 1924) had observed in his Malaysian volunteers, whilst his Europeans produced much wasteful sweat. Similarly, Sundstroem (1927) reported that well-adapted individuals, regardless of their ancestral origin, lost less heat through evaporation, and more through the less wasteful dry-heat exchanges. An elevated sudomotor efficiency, with reduced sweat dripping, has also been described (Eijkman 1924; Hori et al. 1976). Such responses are ideally suited to life in humid climates (Garden et al. 1966) and the wearing of protective clothing (Taylor et al. 2021), whilst short-term, adaptation-induced increases in sweating during exercise in humid heat can result in as much as a 200% elevation in unevaporated, wasteful or superfluous sweat production (Mitchell et al. 1976).

Perhaps unsurprising is the observation that long-term heat adaptation does not necessarily result in hypersecretion, particularly for those living and working in the tropics for many years (Eijkman 1895; Burton et al. 1940; Ladell 1958/59; Yoshimura 1964; Hori et al. 1976; Candas 1987; Taylor 2006a). Instead, a more resource-conservative sudorific response is typically observed, with lower sweat rates during exposures to hot-humid conditions. Ladell wrote that “we were never able to get an African to sweat more than a “fully acclimatized” European under the same temperature conditions and working at the same rate” (Ladell 1958/59 [P. 4]). Such responses reflect less pronounced effector activity and reduced physiological strain. That outcome reflects an attenuation of our thermoeffector responses, which we now call an habituation process (Henane 1980). The phenomenon of sudomotor habituation was described by Seiki Hori and colleagues (Japan; Hori et al.
1976; Nagasaka 2022): “… people born and raised in Okinawa have been exposed to heat constantly for years, it is presumed that the sweating center may have been habituated to the stimulation of heat” (Hori et al. 1976 [P. 243]). They attributed the descriptor “habituation” to Glaser and Whittow (1953, England), and it was then adopted by the IUPS Thermal Commission (2001). Victor Candas (France) hypothesised the existence of a long-term (third) phase of heat adaptation, during which evaporation was optimal and sweat dripping minimal (Candas 1987). As Sundstroem (1927) had suggested, there are three stages of heat adaptation, with the last (habituated) stage perhaps being most easily seen in long-term residents and individuals with a morphological configuration suited to resource-conservative, dry-heat exchanges (e.g., those of Asian ancestry). Such an outcome is entirely consistent with adaptations increasing physiological efficiency and enhancing survival, so it perhaps should have been anticipated.

It follows from this domestication of a very persistent hypothesis that lower sweat rates during heat exposure might reflect more complete heat adaptation, a phenomenon hypothesised by others (e.g., Dreosti 1935; Burton et al. 1940; Thompson 1954; Yoshimura 1964; Fox et al. 1974; Senay et al. 1976). Indeed, if one examines the literature with that possibility also in mind, there is ample evidence of lower sweat rates in indigenes from the tropics, relative to similarly heated individuals from cooler climates (Eijkman 1924; Thompson 1954; Edholm et al. 1964; McCance and Purohit 1969; Fox et al. 1974; Samueloff 1987; Duncan and Horvath 1988; Kosaka et al. 1994; Saat et al. 2005; Bae et al. 2006; Inoue et al. 2009; Wijayanto et al. 2011). Thorough and careful consideration of the literature can lead to reduced adherence to a “prefabricated set of interpretations” (Kennedy 1962).
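The sudomotor thresholds and sensitivities compared across the studies in this section are typically derived from the relationship between sweat rate and body temperature. A minimal sketch of one common approach is given below, assuming a simple least-squares line fitted to observations recorded after sweating onset; the data points are fabricated.

```python
# Hedged sketch (our construction): fit sweat rate against deep-body
# temperature once sweating has begun. The slope is the sudomotor
# sensitivity and the x-intercept the sudomotor threshold.

def fit_threshold_and_sensitivity(tb_c, sweat_rate):
    """Least-squares line through (Tb, sweat-rate) pairs.
    Returns (threshold in degC, sensitivity in sweat-rate units per degC)."""
    n = len(tb_c)
    mean_x = sum(tb_c) / n
    mean_y = sum(sweat_rate) / n
    sxx = sum((x - mean_x) ** 2 for x in tb_c)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(tb_c, sweat_rate))
    slope = sxy / sxx                      # sensitivity
    intercept = mean_y - slope * mean_x
    return -intercept / slope, slope       # zero-crossing = threshold

# Fabricated observations recorded above the sweating onset
tb = [37.1, 37.2, 37.3, 37.4, 37.5]   # deep-body temperature (degC)
sr = [0.0, 0.4, 0.8, 1.2, 1.6]        # sweat rate (mg cm^-2 min^-1)
threshold, sensitivity = fit_threshold_and_sensitivity(tb, sr)
print(round(threshold, 2), round(sensitivity, 2))  # 37.1 4.0
```

With data of this kind, a central (controller) adaptation shifts the fitted line leftward or rightward along the temperature axis (threshold), whilst a peripheral change in cholinergic responsiveness alters its steepness (sensitivity), the central and peripheral modifications distinguished above.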
The third form of sudomotor adaptation is an elevated effector (cholinergic) sensitivity, wherein, for a given amount of acetylcholine released from the axonal terminals, more precursor sweat is produced. That process is analogous to the sweat-gland training described by Ken J. Collins (1929–2017, England; Collins et al. 1965) and Kenzo Sato (1939–2010, U.S.A.; Sato et al. 1990; Anonymous 2010), yet, until relatively recently, it had not been investigated across different ethnic groups. Using acetylcholine iontophoresis, Inoue et al. (2009, Japan) found that the eccrine glands of Thai nationals from a hot-humid climate were less responsive to acetylcholine. It appeared that a sudorific habituation in those individuals was associated with a reduced cholinergic sensitivity. Was that a genotypic or a phenotypic adaptation?

Thoicharoen and Taylor studied Thai women who had lived in a cooler Australian climate for > 1 year (Taylor 2014). Each woman was matched with an anthropometrically similar woman of European ancestry, and with equivalent physical-activity behaviours, and then studied during light-intensity, steady-state cycling (90 min; 35.0 ℃ dry bulb). Sudomotor thresholds (37.1 [Thai] versus 37.0 ℃), sensitivities (4.07 [Thai] versus 3.06 mg cm−2 min−1 °C−1) and steady-state sweat rates did not differ significantly. Since a genotypic difference would be retained when moving to, and living in, a cooler climate, those data were consistent with the absence of a genotypic difference related to sweating between those ethnic groups. More recently, Muia et al. (2020) used direct calorimetry to quantify whole-body heat storage, dry and evaporative heat exchange and total heat loss in second-generation males of African and European ancestry living in a temperate, continental climate (Canada).
When cycling (light, moderate and vigorous; 30 min each) in conditions that resulted in matched heat-loss requirements (40 ℃ dry bulb), heat storage, when averaged across exercise bouts, did not differ significantly between those groups (568 kJ [African] and 623 kJ [European]). Similarly, total heat losses were not significantly different (177, 217 and 244 W m−2 [African]; and 172, 212 and 244 W m−2 [European]). Those observations were interpreted to show that ancestral differences did not appreciably influence either whole-body heat exchange or heat storage when exercising in the heat.

Vasomotor responses

In the heat, cutaneous heterothermy is replaced by more uniform skin temperatures (Werner and Reents 1980; Notley et al. 2023c [Fig. 7]), driven by increased convective heat delivery to the skin. That delivery reduces the core-skin thermal gradient and, if the air temperature is above that of the skin, the skin-air thermal gradient and the associated dry-heat exchanges (Sundstroem 1927; Eichna et al. 1950). There is also a simultaneous increase in the cutaneous water-vapour pressure, which increases evaporative cooling, even when the ambient vapour pressure and sweating remain stable. Those vascular effects were first described by Alan C. Burton (1904–1979, Canada and U.S.A.; Tikuisis 2022) and colleagues as part of the heat-adaptation process (Burton et al. 1940). However, few have examined the cutaneous vascular responses of people from different ancestral backgrounds during heat adaptation. Two exceptions are highlighted. Firs