
DESPAIR AND HOPE IN THE AGE OF DYSPHORIA

(Added November 2018, last updated March 2021)



ABSTRACT



The life of humans underwent rapid and multifaceted change with the transition from hunter-gatherer societies to civilizations. Civilizations are technology based and therefore required advances in cognition beyond the level of hunter-gatherers; an abrupt rise in cognition occurred that allowed for the advent of written language, which in turn permitted the accumulation of knowledge and rapid technological advancement. At the same time, mental and physical robustness declined, as suggested by the need for substances (e.g., ethanol, narcotics) to allay anxiety, depression and fear. Added protections such as sophisticated clothing, shelter and bedding were also required to tolerate environmental conditions in regions where hunter-gatherers formerly thrived without them. These changes cannot be explained by the genetic-evolutionary model, and therefore must have been the result of the effects of this new environment on an existing genome.


Here it is proposed that hunter-gatherers maintained high levels of endorphins, in part as a result of living in groups of ideal size for their survival and under conditions of the high mobility needed to assure adequate foraging intensity (e.g., to produce the adiposity needed to survive seasonal fasting). This endorphin production created wonder, defined here as modest levels of anxiety and depression, minimal mortality and existential fears owing to dulled cognition, and the highest environmental temperature tolerance and pain thresholds that allow good health. Fertility and social cohesion are optimized by wonder.


Dysphoria is the decline of one or more elements of wonder from optimal levels. Civilization brought about dysphoria through endorphin decline, caused by maintaining excessively large group sizes and by mobility falling below the levels required for wonder. Whereas hunter-gatherers were highly socially cohesive, early civilizations were authoritarian, characterized by weaponized minorities dominating largely conquered groups. Improved living standards and later government social programs reduced existential fears, but insufficiently to compensate for the rise in dysphoria brought on by the continued reduction in mobility that accompanied advances in technology. The further improvement in cognition from lowered endorphins created additional dysphoria, as perhaps best illustrated by the invention of ever more effective means of authoritarian control and mass destruction.


The recent plunge in mobility that accompanies the use of sedentary digital technology in the workplace, in school, in socialization and in online shopping has brought about a dysphoria spike, one that is increasingly sending humans to seek answers and false leadership from medical pseudo-scientists and demagogic populists. It has also brought on a rash of substance abuse to temporarily relieve dysphoria.


It is no longer sensible to seek the endorphin levels typical of hunter-gatherer societies, as this would entail massive cognitive decline, and social programs have become unable to quiet population dysphoria. Antidepressant medication combined with analgesia can relieve the most distressing aspects of dysphoria at no cost to cognition. Under such a regime, individual sense of well-being will increase, as will social cohesion. Examples will be a decline in substance abuse, violence-free conflict resolution, the sharing of resources and other behaviors that advance the common good.





INTRODUCTION


The rise in anxiety, depression, generalized fatigue and chronic pain experienced by inhabitants of economically advanced countries has occurred largely without a clearly identifiable cause. Added to this is the decline in social cohesion, as measured by rising substance abuse, suicide, income disparity and the appearance of demagogic populists. Here it is proposed that the transition from hunter-gatherer society to civilization is at the root of these problems.



H. sapiens lived as hunter-gatherers for nearly all of their existence, which spans perhaps 250,000 years. Throughout this time they became progressively better adapted to this environment. Civilizations first appeared less than 10,000 years ago, with the majority of humans living in this form of social organization for less than five millennia. This change is so recent in the human story that significant genetic differences between late hunter-gatherers and contemporary humans are improbable, and this forms the basis for all that follows. Small numbers of hunter-gatherers continued to exist totally isolated from the influences of civilizations until the Age of Discovery commenced in the early 15th century. Reports from encounters with some of the last remaining hunter-gatherers suggest they displayed lower cognition, as measured by the absence of written language and sophisticated technologies, yet they appeared to be free of chronic intense anxiety and depression, and few complained of chronic pain despite living under relatively harsh conditions. Moreover, they displayed high levels of social cohesion characterized by prolific sharing, even of scarce resources, and violence within their groups was minimal.



Since both hunter-gatherers and humans living in civilizations share essentially the identical genome, these dissimilarities must be accounted for by differences between the environments of hunter-gatherers and civilizations. This report is the first to treat these current problems in individual health and social cohesion as environmental in nature. Identifying the environmental influences that have caused them is essential to their mitigation and resolution.





PERTINENT ANTHROPOLOGY AND PHYSIOLOGY


Adaptation is used here to signify genetic change that improves survival and reproduction in a specific environment. A highly socially cohesive society is defined by the OECD Development Centre publication entitled Social Cohesion in a Shifting World as one that “works towards the well-being of all its members, fights exclusion and marginalisation, creates a sense of belonging, promotes trust, and offers its members the opportunity of upward social mobility.” Fear associated with eventual death is called here mortality fear, whereas fear of premature death from immediate situations such as starvation and war is called existential fear. Fear without a source is anxiety. Pseudo-science is used here to indicate belief systems that are presented as scientific and proven but are either not based on scientific methods or have been invalidated. Pseudo-truths are concepts or details that may have originated as hypotheses advanced by legitimately trained experts but were never properly tested and verified, or have later been invalidated, yet are thought to have been and to remain scientifically validated. Pseudo-truths exist because information assumed to be scientifically verified is rarely examined to certify the correctness of that assumption.



Homo sapiens appeared in sub-Saharan Africa perhaps 250,000 years ago as nomadic hunter-gatherers. Despite being falsely classified as omnivorous, radioisotope evidence indicates early humans displayed a strong preference for nutrients of animal origin. Mounds of crushed, previously marrow-rich bones of large grazing animals that date to periods earlier than the appearance of civilizations have been discovered in sub-Saharan Africa, together with the tools used to extract the marrow. It is impossible for any animal to extract this marrow without tools because of the thickness of the bone cortex; this nutrient was therefore exclusive to humans. It is now concluded that humans largely scavenged bone marrow from large grazing animals as a staple food source. The vast number of adaptations required to make lipid-rich bone marrow available suggests that essential fatty acids of animal origin were a species requirement rather than a choice, needed to develop and maintain the unique lipid-rich central nervous system that accounted for humans' superior cognition, bipedal locomotion and ability to create and use tools.


Fulfilling this preference for lipid-rich animal-based nutrients came with intense existential risk, because it required living in close proximity to predators that fed on the same large animals as humans did, and that probably killed the animals from which humans scavenged remains. These predators could kill individual humans more easily than large grazing animals, so humans were able to exist in this dangerous environment only through adaptations consisting of a complex network of collective defense behaviors combined with the strategic awareness provided by their superior cognition.


The preference for nutrition sourced from large grazing animals also came at the cost of relying on a precarious food source, because these herds migrated during the dry season only to return many months later, during which time humans had to survive fasting as they resisted utilizing alternative nutrient sources poorer in essential fatty acids. To moderate the risk of starvation accompanying this prolonged fast they required complex adaptations that are present in a small number of other primates such as Macaca fuscata (the snow monkeys of Japan), which unlike humans are mainly herbivores. Hunter-gatherers must have been able to adjust their foraging intensity according to the availability of their preferred nutrient. We do know that satiation control in humans is central and resides in the hypothalamus; however, our understanding of the details of this control remains rudimentary.


When their preferred nutrient became available, satiation must have sharply declined and mobility increased, which amplified foraging. Since the dry season in sub-Saharan Africa varies in timing and length, extreme obesity would have been occasionally rewarded; foraging therefore probably remained maximal, constrained only by the effect of obesity on mobility. When their primary food source was unavailable, satiation must have risen, eliminating mobility-intensive foraging entirely; they became lethargic to conserve energy until their preferred food source returned. The need for adiposity to sustain extensive fasting disappeared with civilizations, which gained a relatively predictable food supply through agronomy and the maintenance of animals in captivity, both present in all civilizations. Despite this, morbid obesity was documented as a disease in some of the earliest records from civilizations, and it remains a major source of ill health in contemporary humans. Clearly humans have retained an adaptation that was appropriate for life as nomadic hunter-gatherers in sub-Saharan Africa but that began causing disease in the context of civilization. This adaptation has not been extinguished, probably because humans remain in the early stages of adapting to civilizations. Lack of appreciation of this retained adaptation may account for the limited success in controlling obesity in contemporary humans.
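
The seasonal cycle just described can be summarized as a simple feedback model. The following sketch in Python is purely illustrative; the season lengths, the storage and fasting rates, and the rule that adiposity constrains mobility are hypothetical values invented for this illustration, not measurements from any study.

    # Toy model of the proposed satiation-mobility cycle (all numbers hypothetical).
    WET_SEASON_DAYS = 240   # herds present, satiation low
    DRY_SEASON_DAYS = 120   # herds absent, satiation high, fasting

    def daily_foraging(adiposity):
        # Foraging intensity falls as adiposity impairs mobility.
        return max(0.0, 1.0 - adiposity)

    adiposity = 0.2                         # fraction of some maximal fat store
    for day in range(WET_SEASON_DAYS):      # preferred food available
        adiposity += 0.004 * daily_foraging(adiposity)
    for day in range(DRY_SEASON_DAYS):      # preferred food absent: no foraging
        adiposity -= 0.003                  # fasting draws down reserves
    print(round(adiposity, 2))              # reserves remaining after the fast

In a modern setting the dry-season branch never arrives: food stays available year-round, so under this toy rule fat stores only accumulate, which is consistent with the retained-adaptation account of obesity given above.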


Living in civilizations required a pronounced change from mainly animal-based nutrients consisting of protein and fats to plant-based food rich in cellulose and starch. While this remains the diet of contemporary humans, this should not be taken to mean that humans are well adapted to it. There is considerable evidence that humans remain poorly adapted to a diet rich in cellulose and starch. Fundamental digestive chemistry, mechanics and anatomy remain adapted to the diet of hunter-gatherers. The slowing of stomach emptying into the small intestine, where nutrient absorption takes place, varies in relation to nutrient type, with greater gastric delay translating into better intestinal digestion and absorption. Contemporary humans retain an adaptation whereby essential fatty acids of animal origin best retard gastric emptying. The same essential fatty acids of plant origin also reduce gastric emptying, but this food exits the stomach at approximately double the rate of animal-sourced fats. It remains unclear what mechanism humans use to distinguish between plant and animal essential fatty acids, although it has been proposed that it is through distinguishing the mix of essential and non-essential fatty acids, which differs greatly between plant- and animal-sourced foods.


Humans living in civilizations have failed to adjust to carbohydrate and cellulose ingestion, since both are given such a short gastric delay that their consumption without fat or protein is associated with the risk of abdominal pain, brought on by distention via gas production from bacterial action upon partially digested carbohydrates and breakdown products of cellulose reaching the colon. Cellulose-rich diets present additional problems resulting from the volume of undigested cellulose reaching the colon. Chimpanzees are of similar size to humans but are mainly herbivores, and they digest cellulose as incompletely as humans do. They have adapted to this diet through a thicker and structurally more robust colonic wall and far greater storage capacity for undigested cellulose than is available to humans. This suggests that the human colon is suitable only for the modest residual volume typical of animal-sourced fat and protein ingestion.


Despite the above uncontroversial arguments, it has become customary for physicians and nutritionists to recommend large quantities of dietary fiber based on unverified hypotheses, because the benefit of dietary fiber in treating human intestinal disease has never been shown according to current scientific standards of measuring treatment effectiveness. In some cases the diseases for which it is recommended have increased in incidence since dietary fiber recommendations were introduced; the presumed benefit is therefore a pseudo-truth. It has finally been called into question in a report directed at understanding the increase in incidence of irritable bowel syndrome (IBS) and diverticular disease (diverticulosis and diverticulitis) in economically advanced societies. It has been proposed that chronic over-stretching, in great part from excessive colonic contents, causes the inflammation and motility disturbance that characterize IBS, and that this eventually leads to micro-disruptions of the colonic wall in the form of diverticula, accompanied by a higher risk of developing diverticulitis. The recent introduction of even greater amounts of dietary fiber into human diets, often at the recommendation of physicians and nutritionists and without recognition of the potential hazards of this practice, can account for the rising rates of IBS and diverticular disease, and can explain why these conditions now occur in younger age cohorts as well.


With the high population density of civilizations, simply maintaining animals in captivity was insufficient to supply all human nutritional needs with nutrients of animal origin. Despite remaining well adapted to nutrients of animal origin and poorly adapted to consuming plant-based nutrients, records indicate that humans entering civilizations began eating large amounts of plant-based food. More than a millennium ago a large group of humans began forsaking consumption of nutrients of animal origin on religious grounds, and relatively recently others have acted similarly for a variety of reasons. These represent the most extreme expression of loss of the strong preference for animal-based food. This loss, which records show commenced with early civilizations, is best explained by natural selection favoring those with a less pronounced preference for animal-based nutrients and a willingness to consume more cellulose and starches. Presumably a reliable food supply without the need for prolonged fasting obviated the nutritional focus on essential fatty acids of animal origin for normal human development, particularly when the products of early agronomy in the form of grains and legumes also supplied essential fatty acids.


Looking at civilizations from a social perspective, a likely prolonged period of environmental stability resulted in the ancestors of H. sapiens developing morphologic, physiologic and social adaptations that eventually led to the human population burgeoning to the point of being growth-constrained by the limited amount of land suitable for nomadic foraging. This nutritional survival pressure demanded an alternative form of social organization. Civilizations presumably were the best of a series of attempts to achieve population growth through alternative forms of social organization. Civilization appeared so recently and abruptly that differences between late hunter-gatherers and even contemporary humans cannot be explained by the genetic-evolutionary model. Accordingly, humans today remain essentially late hunter-gatherers that have entered civilizations, with characteristics some of which continue to aid their survival in civilizations, others that may no longer serve a useful purpose, and some that perhaps result in disease and death in civilizations. This evolutionary perspective is often ignored in contemporary research dealing with human disease.

 

More advanced cognition was a pre-condition for civilizations, because civilizations required the novel technologies of agronomy and animal husbandry, whereas hunter-gatherers were cognitively limited and lived with the most primitive of technologies; formulation of written language was outside their capacity. With civilization came not only the rapid development of written language but documented, remarkable intellectual proficiency. Early civilizations produced many of the basic systems of mathematics and logic that continue to be used today, and early discourses dealing with ethics and contemplation of mortality are still studied and admired. In fact, there is no convincing argument to suggest that the cognitive performance of humans living in early civilizations was inferior to that of contemporary humans. Thus, further advances in technology and intellectual achievement in later stages of civilizations are likely the result of a steadily enlarging base of accumulated recorded knowledge rather than significant further increases in intellectual capacity.


This change in cognition was therefore the result of environmental differences between life as hunter-gatherers and life in civilizations. This indicates that the potential for superior cognition existed in hunter-gatherers but was unexpressed. The dulling of this potential can thus be viewed as an adaptation that advanced survival. The best available explanation of the advantage of dulled cognition in hunter-gatherers is that it moderated existential and mortality fears. Presumably, preoccupation with their precarious existence and eventual death would have dissipated their will to live or distracted them from concentrating on the immediate task of survival in a challenging environment. But with the transition between late hunter-gatherer and civilized life came the rise in existential and mortality fears.


Most scholars attribute the survival of hunter-gatherers in part to high levels of social cohesion, consistent with the warding off of predators, the acquisition and sharing of a precarious food supply, and the cooperative effort that assured offspring would reach maturity and attain optimal fertility. This concept is supported by reports from recent first contacts with hunter-gatherers such as the Inuit and Australian aboriginals. These reports indicate intense social cohesion between members of groups and suspicion of outsiders. Reports are also available from the study of primates such as chimpanzees, which lived in the same region as the earliest hunter-gatherers and resembled them in group size and vulnerability to similar predators. These primates lack violent behavior that results in the death of group members, aside from infant deaths associated with alpha-male succession. Territorial boundaries between groups are maintained through intimidation. There is also some cooperation between groups, as indicated by females occasionally being peacefully integrated into a receiving group, a practice that may improve survival by maintaining optimal group size.


Whereas hunter-gatherers survived due to their intense social cohesion, humans in civilizations are widely considered to have survived despite their dismal social cohesion. Humans experienced a precipitous decline in social cohesion with civilizations. Early civilizations were authoritarian, with a weaponized minority dominating an enslaved majority of conquered groups. Rulers often used sadistic practices to intimidate others and maintain their leadership. Slavery and serfdom were later abandoned, but this was not necessarily a sign of improved social cohesion; it was done mainly for economic reasons and because technological advances made such human labor uneconomical. Any further advances in labor productivity required intellectual resources typically supplied by educated, freed humans. In fact, most technological advances that improved the overall well-being of the population were spinoffs of research to advance weapons of mass destruction. As a sign of continued poor social cohesion, human activity has recently endangered the bio-systems humans rely upon for their existence, and it remains uncertain whether contemporary humans will choose to direct resources to their survival rather than continue to consume resources in ways that further degrade the environment.


Civilizations were likely responsible for massive emotional change in humans. Ethanol likely pre-dated civilizations, but it must have been used minimally, because high-volume production of ethanol requires feedstocks produced through agronomy and civilization. Ethanol has been continuously produced and consumed for more than 9,000 years, the totality of the existence of civilizations. It has been suggested that ethanol was consumed mainly for its nutritional value, despite the fact that producing ethanol through fermentation lowers the nutritional content of feedstocks. Others have proposed that it was used primarily as an antiseptic or for the decontamination of polluted waters, but this is unlikely because it would have imparted minimal protection to the population as a whole. The most plausible explanation for ethanol use by early civilizations is the one that applies today: to impart relaxation, sedation and analgesia, to increase tolerance of low environmental temperatures, and to act as an antidepressant and a moderator of mortality fear, existential fear and anxiety. The importance of ethanol to humans living in civilizations is further suggested by its continued popularity despite being toxic and addictive. Indeed, ethanol dulls the cognition that is so essential to success in civilizations, and so the escape from discomfort provided by ethanol may have been an essential pre-condition to civilized life. The present-day popularity of non-medical opioid use can be similarly explained, which suggests that humans still emotionally resemble hunter-gatherers.


Fertility rates in hunter-gatherers are thought to have been near optimal, considering the species survived despite a harsh environment replete with predators and characterized by prolonged fasting. Fertility rates in early civilizations are not known; however, reliable records from the 19th century in economically advanced countries indicate that they were well below optimal. Since then, fertility has fallen to at or below population replacement levels. It appears that there has been a progressive though non-linear decline in fertility since civilizations first commenced.


Contemporary humans retain an adaptation whereby fertility is so negatively related to anxiety and depression that either can be used as a surrogate measure of the other. The optimal fertility rate of hunter-gatherers suggests that their anxiety and depression were low, far lower than with civilization. When they did become anxious or depressed, lowered reproductive behavior may have advanced survival by diminishing interest and energy directed at reproduction so that they could concentrate on the immediate threat. The current low fertility rate in economically advanced countries is best explained by the rise in anxiety and depression that has been documented in contemporary society and that likely commenced with the transition to life in civilizations. This decline in fertility would probably reverse if anxiety and depression were attenuated.


Humans experienced a decline in some measures of physical robustness that was also of external cause. Whereas there is no physical evidence indicating that hunter-gatherers living in sub-Saharan Africa relied heavily on shelters, clothing or bedding, there are extensive physical remains, images and written records of humans using these technologies in early civilizations in similar climates. This suggests that humans entering civilizations rapidly developed intolerance to temperatures at the lower end of the range that hunter-gatherers found satisfactory, as well as lower pain thresholds to high-amplitude tactile stimulation of the skin such as would be produced by lying on rigid, irregular surfaces, conditions that hunter-gatherers likely found acceptable without the interface of bedding. Contemporary humans seem even less robust than those of early civilizations, implying that low-temperature tolerance and pain thresholds to tactile stimulation have continued to diminish. The environmental factors that caused this decline in robustness likely have increased further.


Hunter-gatherers likely were adapted to tolerate these low temperatures and relatively intense tactile stimulation, so their sensory thresholds did not impair their survival, and contemporary humans probably retain these adaptations. It follows that the intolerance to low temperatures and intense tactile stimulation that accompanied civilized life does not improve human survival; it is therefore unnecessarily low and can be viewed as a source of unnecessary distress. It also drives a suite of unnecessary comfort-seeking behaviors, such as varying thermal insulation through clothing changes and shifting positions in the attempt to achieve comfort when supine, even with the use of highly resilient bedding. Analgesia may be required to attain the comfort in sleep that hunter-gatherers could attain.


The aggregate number of humans living in both civilizations and as late hunter-gatherers during the first 5,000 years after civilizations commenced has been estimated to equal the number that lived as late hunter-gatherers before civilizations. This suggests that life in early civilizations was characterized by relatively higher mortality and/or lower fertility rates compared with life as hunter-gatherers. The continued existence of early civilizations must therefore have been uncertain, even with the accompanying massive improvement in cognition and the reduction in death from starvation and predator attacks. Large numbers of humans have lived in civilizations for only about 5,000 years, which coincides with the emergence of civilizations in proximity to the Nile, Tigris and Euphrates Rivers. The success of civilizations after 5,000 years can be explained by the emergence of modest resistance to infectious diseases such as tuberculosis, plague and syphilis and the viral diseases of smallpox and polio, all likely prevalent in early civilizations. Modest resistance was also likely attained to ethanol addiction, and perhaps to addiction to other similar substances that appeared.


Skeletal remains of late hunter-gatherers and humans living in early civilizations suggest that the life expectancy of both groups was the same, about 35 years. This gradually increased with advancing civilizations, probably due to increased resistance to infections and to ethanol addiction. Life expectancy in civilizations was stable until the 19th century, whereupon it rose modestly and then more rapidly with the advent of germ theory, which led to improved hygienic practices, and with the discovery of antibiotics.


Hunter-gatherer group size is thought to have been maintained at about 50 individuals, a size that resembles that of some primate groups currently living in sub-Saharan Africa. Groups of this size must have been associated with optimal species survival under the conditions of a nomadic existence, optimizing fertility and group protective behavior. Maintaining a nomadic existence likely permitted more successful foraging for animal remains and attenuated disease from contamination of the immediate location with human waste. Group size, like most social behaviors, is probably actuated through operant conditioning with endorphin release as the reinforcer.


Group size in civilizations increased far above the number characteristic of hunter-gatherer societies, though there is evidence that humans living in civilizations continue both to desire and to show better social cohesion in groups that approximate the size of those of hunter-gatherers. For example, all armies in the world are designed around platoons, which consist of groups of between 20 and 30 adults. The platoon size was determined empirically from results in combat, where groups of this size showed intense social cohesion as measured by predictably accomplishing dangerous missions, often with acts of heroism. This optimal group size also helps to explain the success of group therapies in dealing with disease: group therapies based on a variety of theories have been successful by current scientific standards in dealing with addictions of all types even without professionally trained leadership, whereas individual therapies have shown no success in these areas even with trained personnel. Furthermore, individual psychotherapy and behavioral therapies, when examined objectively, have never been shown to aid disturbed patients beyond control groups in which individuals received support of the type offered by a concerned friend. Another example of the retained preference for small groups in adult humans comes from primary school education. Parents and many educators attach so much importance to maintaining school class sizes of near 30 individuals that it has become a perennial nightmare for school administrators and a source of excessive educational expense. This must be an externalization of the parents' own sense of comfort in groups of this size; there is no evidence that children learn better in classes of ca. 30 students compared to larger classes, and children seem indifferent to class size if not informed of its presumed benefit by parents. Life in excessively large groups, contrary to this group-size adaptation, may at least in part explain the loss of social cohesion within civilizations.


Perhaps the greatest external change that occurred with civilizations was the decline in the magnitude of physical work that humans performed. Most of the skeletal muscle mass in humans is used for maintaining stable equilibrium, walking and running, and weight-bearing without locomotion, which is surprisingly work intensive because of constant postural adjustments. Whereas the elite classes of early civilizations were minimally mobile, for the majority, who lived in fixed locations caring for crops and domesticated animals, day-to-day activities may have initially required weight-bearing work close to that of hunter-gatherers, though somewhat less energy-intensive due to shorter distances traversed. Afterwards came the steady decline in mobility from advances in transportation technologies, up until the Age of Industrialization and beyond, when both the elite class and the less privileged became minimally mobile and more sedentary.


The rise of sedentary lifestyles began in Europe with the Renaissance and later spread to other countries through colonization. Chair use is a surrogate measure of sedentary behavior, and the number of chairs in the population can be used as a direct measure of chair use. Ancient chairs are rare, valuable artifacts that are largely bought and sold through auction houses and displayed in museums, so their number can be estimated through auction catalogs and museum records. These sources indicate that few true pre-Renaissance chairs were fabricated because, as symbols of status and power, they were rarely used at home or work as they are today. In contrast, early post-Renaissance chairs are plentiful, tend to be far less ornate, and often were made in sets, which suggests a utilitarian rather than symbolic function. Written and visual records from this period confirm this. Moreover, the word sedentary dates back to Renaissance England.


The suddenness of the transition to a sedentary existence is thought to have been an unexpected consequence of the equally rapid switch to footwear use by all social classes. Rudimentary foot coverings were worn for insulation by hunter-gatherers who migrated to the extreme northern latitudes, and later seasonally in some civilizations. Aside from these functional applications, footwear use was preceded pre-historically by decoration of the extremities with coloring agents such as henna to create body art. This progressed with civilizations to more elaborate foot decorations worn by the less mobile elite class, which eventually extended these decorations to the plantar surface with footwear composed initially of woven grasses and eventually, by the Romans, of leather. This footwear invariably functioned poorly compared with bare feet, both when weight-bearing and during locomotion, but mobility was not important because it was worn by the immobile class. Shoes therefore were initially poorly functional decorations worn as a symbol of elevated status. This changed with the bubonic plague that preceded the Renaissance in Europe, which made leather inexpensive as production exceeded demand reduced by depopulation. Plague also resulted in a skilled labor shortage in cities, which was satisfied by ex-serfs who became relatively immobile. They now earned wages, and footwear was high on their list of important purchases because footwear use had become a symbol of membership in the middle class, and the mobility impairment was less of a problem for urban dwellers. The decorative origins of footwear were forgotten, and its use was instead explained by unsubstantiated claims regarding improved foot health. Chair use followed footwear introduction, probably because footwear use leads to intense discomfort with prolonged weight-bearing that has to be relieved by sitting.


Humans remain poorly adapted to footwear use, since footwear directly causes abundant foot and leg disorders that do not occur in barefoot humans. For example, patellofemoral pain syndrome (PPS) is the most common knee disease. It is caused by attenuated plantar sensory feedback from footwear, which encourages excessive knee extension when standing, walking and running. It is underappreciated that humans retain an adaptation whereby direct contact between the bare foot and the support surface produces sensations that maintain sufficient knee flexion to keep the patella fixed in the femoral trochlea; attenuated plantar feedback from footwear therefore leads to damage from repetitive partial subluxation of the patella, which results in PPS. Footwear also harms humans indirectly through impaired balance and amplified impact during locomotion, the result of poor sensory feedback from the interface between the extremely sensitive plantar surface and the support surface, feedback that normally moderates high-amplitude transient mechanical stimulation of the plantar surface. The potential dangers of footwear are rarely considered when dealing with sports injuries, obesity and the frequent skeletal injuries associated with footwear use.





ENDORPHINS


Wonder in hunter-gatherers consisted of the highest level of cognition consistent with maintaining mortality and existential fears at optimal levels, together with minimal anxiety and depression, and the highest environmental temperature tolerance and pain thresholds that allow good health. Dysphoria is decline in one or more elements of wonder from optimal levels. Opioids have been used by humans for 2,000 years for their pleasing effects, which include allaying existential and mortality fears, anxiety and depression, improving low-temperature tolerance, and relieving pain. The relatively recent discovery of substances similar to opioids but produced within the human body explains the neologism endorphin, formed by combining endogenous and morphine; morphine was the most widely used opioid in civilizations until recently. Endorphins are a heterogeneous group of psychoactive peptides (endorphins, enkephalins and dynorphins) produced in the central and peripheral nervous systems and the pituitary gland. They act upon opioid receptors, which are widely distributed in the central and peripheral nervous systems and the digestive tract. The parallel between the differences separating hunter-gatherers from those living in civilizations and the known effects of endorphins on humans suggests that hunter-gatherers maintained a far higher level of endorphins than do humans in civilizations.


Release of endorphins in humans is associated with a vast number of environmental conditions, behaviors and disease states, and their actions cannot easily be characterized. Their release may represent a selected adaptation to the environment. Humans likely had more than sufficient time as hunter-gatherers in sub-Saharan Africa to adapt well to that environment, and their high endorphin production is well suited to those conditions. Adaptations involving endorphin production are best understood from this perspective rather than as a response to life in civilizations, which has been relatively brief in the history of Homo sapiens. The notion that humans largely remain hunter-gatherers in terms of their physiology is often forgotten and rarely considered when dealing with disease causation.


Adaptations involving endorphins can involve direct effects on tissues and organs, but also indirect effects such as shaping complex behaviors through the operant conditioning learning model, with endorphin release acting as a positive reinforcer. Civilizations represented a massive and recent environmental change for hunter-gatherers, considering that life changed from an extraordinarily mobile foraging mode based on eating mainly animal remains to a diet rich in cellulose and starch and an existence dependent on superior cognition to advance technologies.
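
The operant-conditioning mechanism invoked here can be illustrated with a minimal sketch. Everything in it is hypothetical: the two candidate behaviors, the rule that endorphin release follows the cohesive behavior, and the learning rate are invented solely to show how a reinforcer can shape the frequency of a behavior.

    # Minimal operant-conditioning sketch: endorphin release as positive reinforcer.
    import random

    tendencies = {"share_food": 0.5, "hoard_food": 0.5}   # initial response strengths
    LEARNING_RATE = 0.1

    def endorphin_release(behavior):
        # Hypothetical rule: only the socially cohesive behavior is reinforced.
        return 1.0 if behavior == "share_food" else 0.0

    for trial in range(200):
        choice = random.choices(list(tendencies),
                                weights=list(tendencies.values()))[0]
        reward = endorphin_release(choice)
        # Reinforced behaviors grow stronger; unreinforced ones decay.
        tendencies[choice] += LEARNING_RATE * (reward - tendencies[choice])

    print({b: round(w, 3) for b, w in tendencies.items()})

After a few hundred trials the reinforced behavior dominates, which is all the operant model requires: no foresight or reasoning, only differential endorphin release following behavior.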


A significant rise in blood levels of endorphins beyond those seen in normal contemporary humans not only increases pain thresholds, lowers anxiety, depression and fears, and raises tolerance of extremes of environmental temperature, but also impairs cognition as measured by tests of learning and memory. In addition, maintaining optimal group size involves complex social behaviors that likely are shaped by continual endorphin release.


The only means available for healthy, disease-free, non-pregnant and non-starving humans to maintain persistently high endorphin levels is voluntary prolonged work of moderate intensity involving a large mass of striated muscle, and since most voluntary muscle in humans is used for weight-bearing and mobility, the changes that occurred in humans with the advent of civilizations were primarily the result of decline in mobility.


Foraging for food is a mobility-intensive activity. It is proposed that when the preferred food of hunter-gatherers was available, the endorphin level required to achieve wonder rose, likely through the blocking of endorphin receptors by endogenous receptor antagonists and partial agonists. Wonder was then achievable only through intense mobility, which led to foraging amplified beyond current nutritional needs and so created adiposity, limited in turn by the effect of adiposity on mobility. When their primary food source was unavailable, the receptors became totally unblocked, which resulted in wonder with no mobility at all, thereby conserving energy.
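
This proposed receptor mechanism amounts to a simple threshold relation, sketched below. The baseline release, the mobility coefficient, the blockade strength and the wonder threshold are all hypothetical numbers chosen only so that the cases described above fall out of the arithmetic.

    # Toy threshold model of the proposed receptor blockade (numbers hypothetical).
    WONDER_THRESHOLD = 1.0      # effective endorphin signal needed for wonder

    def effective_signal(mobility, food_available):
        endorphins = 1.0 + 1.2 * mobility          # release scales with muscle work
        blockade = 0.8 if food_available else 0.0  # endogenous antagonists active
        return endorphins - blockade               # signal reaching the receptors

    # Food present: low mobility leaves the signal below the wonder threshold...
    print(effective_signal(0.1, True) >= WONDER_THRESHOLD)    # False
    # ...so wonder demands intense, foraging-level mobility.
    print(effective_signal(1.0, True) >= WONDER_THRESHOLD)    # True
    # Food absent: receptors unblocked, wonder achievable with no mobility at all.
    print(effective_signal(0.0, False) >= WONDER_THRESHOLD)   # True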


Since essentially all humans are now relatively immobile compared to the hunter-gatherer state, and require advanced cognition in their daily lives, it is no longer appropriate to consider diminishing dysphoria through significantly increasing mobility, because of the adverse effect on cognition. Since dysphoria likely carries some selective disadvantage, an adaptation will eventually arise through natural selection that diminishes the relation between dysphoria and endorphin levels. Until then, sub-optimal endorphin levels with their inherent dysphoria will remain the principal cost of advanced cognition, and this will continue as long as civilizations exist and dysphoria is not considered a prevalent disease that must be treated.


The law of parsimony (Ockham's razor) advances the notion that hypotheses containing the fewest assumptions or steps should be preferred. This principle remains valuable in contemporary science for evaluating the strength of causal hypotheses in particular, because it has empirically proved reliable in predicting valid hypotheses once methods became available to test them directly. The mobility-endorphin hypothesis proposes that the decline in endorphins resulted from attenuated mobility, and this single external factor directly accounts for the abrupt rise in cognition, amplified anxiety, depression and fear, environmental temperature intolerance and pain: essentially all of the changes that occurred when hunter-gatherers entered civilizations.



DYSPHORIA AND SOCIAL COHESION


The survival of hunter-gatherers in the challenging environment of sub-Saharan Africa required intensely socially cohesive behaviors that were adaptations based on endorphin reinforcement, such as the maintenance of the ideal group size for hunter-gatherer survival. Many of these behaviors could also have advanced the survival of humans living in civilizations, yet social cohesion precipitously declined due to the large group size inherent to civilizations, compounded by reactions to the dysphoria from attenuated mobility. Since the loss of social cohesion was the result of the decline in endorphin levels and the consequent rise of dysphoria, it follows that contemporary humans would probably exhibit greater social cohesion in large groups were dysphoria substantially reduced through the endorphin rise gained from prolonged mobility. But this would leave them poorly functional in modern civilizations because of the loss of cognition from the endorphins. The use of ethanol or similar substances has been the traditional means of lessening dysphoria temporarily in civilizations, but the hazards associated with their use and the loss of cognition they produce make them unsuitable for the long-term control of dysphoria that would be essential to raise social cohesion. To be useful in the context of civilizations, a substance would need to lessen dysphoria safely without impairing cognition.


This analysis regarding the decline of social cohesion within civilizations follows deductive logic, but does it pass the test of being consistent with historical records? The following is a simple sketch of how differences in cognition and dysphoria from endorphin decline may have shaped history. No attempt has been made to be comprehensive; for example, minimal attention was directed at the origin of superstition, supernaturalism and religious expression, but the careful reader should have no difficulty extending the implications of this hypothesis to these areas.


Humans entering civilizations became reliant on technologies rather than mobility. The greatest existential fears of hunter-gatherers were death from starvation during their prolonged periodic fasts and death from predators. Both declined, but paradoxically the existential fear component of dysphoria rose, as spontaneous anxiety and depression appeared with the endorphin decline caused by diminished mobility and by living in groups larger than are reinforced through endorphin feedback. This combined with increased dysphoria from the amplified mortality fear that came with freed cognition, and with distress from persistent pain due to the lower pain thresholds that accompany lower baseline endorphin levels.


Humans lived in a social system composed of ruling and dominated classes from the time civilizations commenced until the European Renaissance. Survival of these larger groups was found to be aided by assigning a single individual as leader who performed minimal physical work, which released his full cognitive potential through lowered endorphin levels, allowing sufficient organizational awareness to successfully direct others in the specialized tasks required in technology-based systems. This lack of mobility, however, left leaders intensely dysphoric. As civilizations grew in size, the solitary leader became insufficient for directing these complex systems, which necessitated expanding leadership to a supreme ruler and an elite ruling class composed at least in part of the supreme ruler's heirs, all of whom were also minimally mobile and hence highly dysphoric. In attempting to explain their dysphoria, members of this class, much as humans do today, failed to consider that it was an irrational and inappropriate response caused by retained hunter-gatherer adaptations out of place in civilizations. Instead, some thought they had contracted a disease, which they usually attributed to punishment from unhappy gods. Others justified the existential fear component of their dysphoria by exaggerating the external dangers of insurrection and attack from neighboring societies, much as is also done today. These irrational existential fears led to the deployment of a military to prevent organized insurrections, and of a standing army both to defend against attacks and to initiate pre-emptive strikes against neighboring states similarly ruled by dysphoric ruling classes.


While they initially ruled with the consent of the dominated class, the rulers, noting its passivity, dismissed many of its concerns. This eventually led to repressive regimes characterized by thuggery and intimidation directed at any opposition. Dysphoria also created impatience with the negotiation and compromise normally inherent to peaceful conflict resolution, and a preference for dealing with presumed externally justified dysphoria through ill-conceived expedient actions, often violent. These would temporarily allay misguided existential fear and anxiety, but were often contrary to even the short-term interests of the ruling class, and they produced long-term negative consequences, for instance cycles of violence both within the state and with other nations. Poor decision-making in such conflict situations was amplified by the use of ethanol to moderate dysphoria.


Hoarding is defined in this report as the accumulation of anything in excess of need that is sufficiently durable to be later exchangeable for goods and services, and it is proposed here that hoarding amplitude, along with the quantity of violent conflict resolutions, are the best measures of social cohesion in civilizations. Evidence of both hoarding and violent conflict resolution is present in some of the earliest written records from civilizations, and both have continued uninterrupted. Dysphoria from existential fears could be moderated by paying mercenary soldiers, bribes and ransoms using hoarded goods. As with many individuals in contemporary society, amplified dysphoria was thought to be retribution from gods, and this led rulers of early civilizations to attempt to bribe them. One method in early civilizations was sacrificing animals and slaves to honor their unhappy gods. Another was, and remains, constructing impressive edifices of no utilitarian value in their honor.


The elite class became aware, much as elites do today, that their hoarded arsenal could considerably outlast their own lives, and that if transmitted without dilution to one or a few children it could indirectly produce a sense of immortality, thereby moderating their mortality fears. There is no evidence that the early ruling classes ever envisioned how charitable foundations could fulfill this need, but if they had, foundations surely would have become popular within the dysphoric ruling class.


The power of the urge to hoard in dealing with dysphoria should not be underestimated. The philosopher and economist Adam Smith (1723-1790) based his notions about free enterprise on his observations regarding the irrational power of hoarding in humans, but never considered it an expression of a primal mortality fear. He never called it hoarding, however; rather, he sanitized the practice by calling it acquiring wealth. To him, treating humans as goods through slavery was an inevitable and unavoidable aspect of hoarding. He is often credited with devising a system whereby hoarding is optimized by a society.


Even awareness of the harm to oneself, family and others caused by hoarding has done little to moderate its influence. Documented examples of prominent individuals driven mindlessly by hoarding are too numerous to mention. One better-known example was a contemporary of Adam Smith, Thomas Jefferson. An educated, worldly, insightful and admittedly highly dysphoric person, Jefferson stated in his writings that slavery was an abomination that contradicted his belief that all humans, which to his mind included slaves, are created equal. Nevertheless, he kept a vast proportion of his wealth invested in slave capital until the time of his death, presumably because it provided good returns, and in his will he did not free his much-loved slave mistress, Sally Hemings (though he did free the children he fathered by her). Nor did he legally recognize himself as the biological father of those children, presumably out of a desire to prevent dilution of his estate among numerous heirs, since reducing the inherited principal could decimate his immortality.


Hoarding in dysphoric humans has always been a more powerful motivator than social justice. In this light, many acts presumed to be motivated by humanitarianism deserve reconsideration. Staying with slavery examples, the notion that social awareness led to the abolition of slavery is inconsistent with the fact that the abolitionist movement only gained traction when advances in technology and industrialization minimized the value of enslaved physical labor in production. Slaves were freed after it had become apparent that uneducated humans kept against their will were unsuitable workers in a complex industrial setting.


Until the end of serfdom, the vast majority of the population of civilizations was composed of the dominated class, which in early civilizations consisted of slaves and later of serfs. This class performed intense physical work, thereby producing enough endorphins to dull their cognition and minimize their dysphoria, which combined to make them easily manipulated by the ruling class and both unwilling and incapable of organizing opposition to rulers. This explains why, despite their large numbers, they rarely overthrew their leaders even when forced to exist under harsh living conditions. They managed to survive at these subsistence living standards through the sharing and cooperation that come with higher endorphin levels. Their low cognition accounted for their naive willingness to believe the demonization of enemies advanced by the ruling class, and often made them reliable defenders of the regime, willing participants in defense and pre-emptive strikes.


This two-class structure proved durable so long as the endorphin levels of the dominated class remained high, and it only started to break down with the European Renaissance, when large numbers of the dominated class became less mobile, causing a decline in endorphin levels and a consequent rise in cognition and dysphoria. Serfdom disappeared because the depopulation from bubonic plague lowered the demand for, and therefore the value of, agricultural products. The demand for agricultural workers declined just as the need for more technical workers rose in urban areas, both because depopulation was more rampant in cities and because of the new demand for a workforce to supply manufactured goods as Europe entered the Age of Discovery. This theme of progressive decline in mobility, with equivalent lowering of endorphins in populations and the dependent dysphoria, has been the main force behind historical events since.


The heightened dysphoria of ex-serfs combined with their improved cognition to make them skeptical of the self-serving arguments advanced by the elite class promoting its leadership, with its luxurious lifestyle, as essential to the survival of the less privileged. This began giving them an intense desire to hoard, much as it did the elites, and they demanded a far greater share of the resources hoarded by their rulers. England and city-states such as those of the Italian peninsula were able to formulate, sometimes with the encouragement of bloody insurrections, an acceptable deal between the rulers and the ex-dominated class that largely avoided violent revolution, but most of Europe was slow to realize the importance of a new social contract, because the ruling classes irrationally minimized the risk of violent class conflict, or were unwilling to share their hoarded bounty, or both. Revolution was in the air, and through violence it produced more representative governance. A system was created, replete with a judiciary, whereby all were given the right and means to hoard and to express their recently amplified dysphoria through constant verbal and written complaining about the unfairness and tragedies of life and the natural disasters that they felt explained their dysphoria, a condition that they defined as freedom.


By the end of the Industrial Age, the majority of the population of the emerging economically advanced countries was relatively sedentary and therefore highly dysphoric, and with this came impatience with non-violent conflict resolution, which typically involves time-consuming negotiation. Populism and demagoguery satisfied the desire for expedient solutions to complex problems. Their cost was catastrophic spasms of poor social cohesion. These were particularly associated with spikes in existential fear caused by economic collapse, in which some actually suffered true hardships such as hunger and loss of shelter, but more became dismayed by the sharp decrease in the value of their hoarded goods. The Long Depression (1873-1896) led to a spike in collective dysphoria from amplified existential fear together with the ever-present mortality fears. This in turn is thought to have led to nationalism, militarism and the colonialism of Bismarck, and eventually the Great War (1914-1918). The Great Depression (1929-1939) similarly raised dysphoria through elevated existential fears that led to ultra-nationalism and eventually to World War II.


Following these wars, officials in many nations, partly motivated by altruism but more significantly by their desire to survive as leaders, introduced a series of laws aimed at moderating the rise in dysphoria caused by heightened existential fear following economic collapse. These programs involved redistributing hoarded goods from those who were more successful at hoarding, or who had inherited goods from their ancestors, to others, a few of whom were enlightened and not motivated by hoarding, but most of whom were either less talented at hoarding or not fortunate enough to receive an inheritance. The disadvantaged elderly, ill and unemployed received the greatest attention. These social programs diminished dysphoria by moderating existential fear and probably account for fewer intense declines in social cohesion following economic failures, but they were challenged by increasing dysphoria from the further decline in mobility brought by work-reducing technological advances.


The seeds of a future collapse of social cohesion were sowed in the early 1980s, when most voters in industrialized countries became convinced, through intense propaganda campaigns, that the amount of hoarded goods in their possession would multiply if they re-created an aristocracy that possessed most of the aggregate of hoarded goods. Not only was this notion implausible; less than two centuries earlier it had been shown that revolution results when an aristocracy exists within a dysphoric population. Income disparities soared as this aristocracy failed to hoard through innovation as it had promised, hoarding instead through the uncompetitive practice of changing the rules in its favor by bribing legislators and influencing judicial appointments. It eventually became evident that this aristocracy's mountain of hoarded goods came mainly from redistributing to themselves goods destined for the majority. As in the eighteenth century, political instability followed despite the presence of social programs. While actual revolution has failed to materialize, ultra-nationalism and populism have risen, rife with demagogues, and true revolution is no longer considered impossible.


Greater income equality might help moderate the population dysphoria caused by this flirtation with a new aristocracy. But an ominous sign has also recently appeared. Objective reports have shown a rise in many measures of poor social cohesion over the past two decades despite social programs that have muted the existential fears of populations. Social cohesion has declined, rife with populism, addiction and suicide, even in countries with exemplary social programs and modest income disparities such as Norway, Sweden, Germany and Denmark, and during a period of rapid economic growth, which has never before happened. The only available explanation is amplified dysphoria from lower population mean endorphin levels owed to attenuated population mobility, which can only be accounted for by the introduction of digital technology in the workplace, at school, while socializing and when shopping. An increasingly sedentary lifestyle is now raising population dysphoria beyond what even advanced social programs can moderate. At the same time, increasing mobility cannot moderate dysphoria because it is time intensive and would reverse the slight improvement in cognition that this inactivity has produced. Treatment of dysphoria using scientifically proven methods on a large scale may now be the only means available of maintaining even the modest level of social cohesion present in societies, but it also has the potential of raising social cohesion to levels that have never existed in civilizations.




MANAGEMENT OF DYSPHORIA – INTRODUCTION


Technology has mitigated some components of dysphoria. Dysphoria caused by intolerance to innocuous low environmental temperatures, due to sub-optimal endorphin levels in humans living in civilizations, has been greatly relieved by advances in the insulation technologies integrated into clothing and shelters, as well as by domestic heating. Dysphoria from pain caused by innocuous tactile stimulation, the result of excessively low pain thresholds from sub-optimal endorphin levels, has been mitigated at least in relation to sleep through resilient bedding. In other situations it can be moderated through the use of analgesic medications. Unfortunately no effective analgesic medication is currently available that is considered safe for continual use at effective doses, and even with this caveat, analgesics are underused because it is not widely appreciated that humans in civilizations live with inappropriately low pain thresholds.


The most troubling components of dysphoria remain excessive existential and mortality fears, anxiety and depression, and these will therefore be dealt with more extensively. As previously mentioned, the existential fears released by attenuated endorphins have been amenable to social interventions and to health advances in general that have extended the high-quality life expectancy of populations. Since there is evidence that these social programs are becoming increasingly ineffective in controlling dysphoria, and since other measures that improve the quality and quantity of life advance slowly, direct management of dysphoria is now the only practical means available to deal with this rapidly rising dysphoria. The remainder of this report will be directed at these medical interventions.



INTRINSIC CYCLICAL DYSPHORIA


The Age of Dysphoria began approximately 5,000 years ago when large numbers of hunter-gatherers transitioned to civilizations. For the first time in human existence, human nutrition became based more on technological competence than on mobility. The cognition required for this technology became available through releasing cognitive potential that already existed, by lowering endorphin levels through attenuated mobility. The cost of this cognition was descent from wonder, which is attainable only at the optimal endorphin levels of hunter-gatherer existence, levels that are not available in civilization. The continuous dysphoria of humans living in civilizations is interrupted only temporarily through the use of certain substances. With this dysphoria came decline in the other adaptations that were dependent on wonder, such as fertility and social cohesion. All humans in civilizations have been dysphoric, with amplitude varying in relation to the external factors of existential fear, endorphin-producing muscular activity and group size.


When known environmental influences on dysphoria are constant, humans experience periodic worsening of dysphoria followed by a return to near its previous intensity. The complete cycle is variable in length but usually approximates one year. It is not certain whether dysphoria is continually oscillating in everyone, although this seems probable. These spontaneous periodic swings in dysphoria became evident when examining the effectiveness of psychotherapies and behavioral therapies for the treatment of excessive depression and anxiety. This research revealed that symptoms of dysphoria disappeared after many months both in groups given these treatments and in control groups that received no treatment but were otherwise handled identically. Psychotherapy and behavioral therapy provided greater moderation of symptoms than no-treatment controls, but this was matched by groups offered encouragement of equal frequency and duration as the psychotherapeutic sessions. This inherent oscillation of dysphoria is referred to here as intrinsic cyclical dysphoria (ICD).


The amplitude and duration of these dysphoric cycles loosely follow a familial pattern, which suggests polygenetic inheritance of ICD frequency and intensity. Mild episodes of ICD cause nothing more than mild irritability, insomnia and fatigue, which are attributed to minor external irritants, whereas high-amplitude ICD peaks can be totally disabling, with risk of suicide. Because of the duration of these cycles, attempts have been made to explain ICD by the duration of sunlight, but this is unlikely because ICD is equally present near the equator, where there is no seasonal change in daylight time. ICD is most likely caused by the actions of endogenous peptides with partial agonist or antagonist effects on endorphin receptors. Bipolar disorders may represent a disorder of endorphin receptors or of these intermediaries.


While psychotherapies and behavioral therapies fail to treat dysphoria when judged by scientific standards of measurement, it is common knowledge that both patients and therapists are often convinced that psychotherapy is an effective treatment. This is because heightened dysphoria from ICD is inherently self-limited; therapy lasting many months will therefore usually be associated with a decline in dysphoria from ICD, which gives the illusion that the therapeutic intervention was effective.
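To make this reasoning concrete, the following minimal simulation sketch (an illustration of my own, not taken from this report; the decay rate, noise level and group sizes are arbitrary assumptions) models symptom scores that remit spontaneously over roughly a year while the "therapy" adds no true effect. A naive before/after comparison makes treatment appear effective, while the untreated control group reveals the illusion.

import random

def episode(months=12, start=10.0, decay=0.25):
    """Symptom score for one self-limited dysphoric episode: it decays
    back toward zero over about a year, with month-to-month noise."""
    score, scores = start, []
    for _ in range(months):
        score = max(0.0, score * (1 - decay) + random.gauss(0, 0.5))
        scores.append(score)
    return scores

random.seed(1)
treated = [episode() for _ in range(200)]   # given months of "therapy"
control = [episode() for _ in range(200)]   # no treatment at all

# Naive before/after view: therapy looks dramatically effective...
print("treated: start 10.0 -> end %.2f" % (sum(t[-1] for t in treated) / 200))
# ...but the untreated controls improve just as much.
print("control: start 10.0 -> end %.2f" % (sum(c[-1] for c in control) / 200))

Because both groups improve equally, only the comparison against an untreated control group exposes the absence of a true treatment effect.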


In contrast to psychotherapies, when antidepressant medication is similarly tested for effectiveness in the treatment of dysphoria, the duration of the symptomatic period is substantially shortened compared to control groups. These medications have been found to be an effective treatment for all conditions in which anxiety and depression figure prominently, whether the existing arcane disease classification systems call the condition anxiety, depression, post-traumatic stress disorder (PTSD), seasonal affective disorder, pre-menstrual stress syndrome, fibromyalgia, chronic fatigue syndrome or many others.


Hunter-gatherers must have had the changes at the endorphin receptor level that could have brought on ICD, but they would not have lost wonder, because they would have seamlessly lowered or amplified mobility, thereby adjusting endorphin levels to maintain wonder. This suggests that ICD is part of the adaptation that raised mobility amplitude and, in so doing, foraging intensity, to gain the adiposity needed to raise the probability that the prolonged seasonal fast would not end in death from starvation. Accordingly, when their preferred nutrients were available, partial blocking of endorphin receptors raised the circulating endorphin levels required to maintain wonder, thereby eliciting amplified mobility. Endorphin receptors became unblocked when the large animal herds migrated during the dry season, which produced wonder with no foraging. This encouraged a decline in mobility, and survival through conserving energy while catabolizing adipose reserves. The dry season in sub-Saharan Africa is variable in timing and duration, so extreme obesity might occasionally have had survival value. This suggests that the control of foraging intensity was not a modulated system but rather one that switched between foraging and fasting in relation to the availability of nutrients.
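Read this way, the proposed control of foraging intensity is a two-state switch rather than a graded response. The toy sketch below is my own illustration of that bang-bang behavior, not a model specified in this report.

from enum import Enum

class Mode(Enum):
    FORAGING = "foraging"  # receptors partially blocked -> amplified mobility
    FASTING = "fasting"    # receptors unblocked -> energy conservation

def mode(nutrients_available: bool) -> Mode:
    """Bang-bang control: the state depends only on nutrient availability,
    with no intermediate, modulated levels of foraging intensity."""
    return Mode.FORAGING if nutrients_available else Mode.FASTING

assert mode(True) is Mode.FORAGING   # wet season: preferred nutrients present
assert mode(False) is Mode.FASTING   # dry season: herds gone, catabolize fat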


Contemporary humans likely retain the above adaptation. With the current continuous availability of nutrients, this adaptation, which often results in acquiring adiposity, would be permanently set at nutrient acquisition, resulting in persistent obesity unless individuals exercise extreme dietary vigilance. We look forward to a safe and effective medication that produces satiation to control the disease of obesity, which is inherent to humans living in civilizations.


Humans in civilizations are unable to deal effectively with either the baseline level of dysphoria or ICD through mobility because, even in the unlikely event that many hours were available daily to perform enough aerobic exercise to significantly raise endorphin levels, doing so would come at the cost of unacceptably inadequate cognition.




MEDICAL TREATMENT OF MENTAL DISORDERS – RELEVANT HISTORY



Darwin's On the Origin of Species was published in 1859, and as a medical student in Vienna, Sigmund Freud was taught the concept of evolution by Carl Claus, a prominent zoology professor. Freud seemed convinced enough of Darwin's precepts that he spent seven years performing scientific investigation, consisting of field work in biological nomenclature and research in physiology, before turning to mental disorders. Freud began presenting his revelations about human emotions and mental disorders in 1895, with the publication of Studies on Hysteria, co-authored with Josef Breuer. Despite his scientific background, and perhaps as a reaction to his uninspiring and tedious early laboratory work, Freud departed from science and entered the realm of the unobservable and untestable by envisaging the emotional component of the human brain as organized into a hierarchy of emotional stations, which he called id, ego and superego, proposing that normal human emotions are accounted for by tensions between them and that psychopathology is owed to unresolved conflicts within and between these presumed emotional structures. His method of accessing these conflicts and treating them was equally unscientific, relying heavily on the meaningfulness of free association (i.e., ideas that came to mind while lying on a couch in the psychoanalyst's office), on analyzing dreams and on retrieving early memories, never questioning the validity of any of them, and unfettered by evidence that they are all poor indicators of actual events.


Krafft-Ebing, a contemporary of Freud known for examining mental processes from a biological perspective, dismissed Freud's notions as a “scientific fairy tale”. The reliability of human recall and the usefulness of analyzing free association and dreams have repeatedly been called into question. A noteworthy example came from Sartre in 1943, in Being and Nothingness: An Essay on Phenomenological Ontology, in which he gave a convincing objective critique of the imprecision of recalled memories.


Heinz Lehmann is credited with performing the initial clinical trials of chlorpromazine for the treatment of schizophrenia in Montreal in the early 1950s, which resulted in the introduction of the first highly effective treatment for this disorder and marked the beginning of clinical neurochemistry. When the chemical structure of chlorpromazine was slightly altered, the resulting compound lost its effectiveness against schizophrenia but could now treat intense dysphoria from ICD (mortality fear, existential fear, anxiety and depression). In 1957 imipramine became the first of the family of tricyclic antidepressants (TCAs) to be introduced, followed by amitriptyline in 1961. They display modest adverse secondary effects, are safe for long-term use, safe even in the first trimester of pregnancy, do not impair cognition and are not addictive. They became the mainstay of treatment for severe dysphoric episodes associated with ICD until they were replaced by selective serotonin re-uptake inhibitors (SSRIs), beginning with the introduction of fluoxetine in 1987, and soon after by similarly effective medications with different chemical structures. These newer antidepressant medications are equally safe, more effective and produce fewer adverse secondary effects than do TCAs. Their use has expanded. They have revolutionized the treatment of troubled children and are perhaps now more widely used for excessive chronic anxiety than for depression, so anti-dysphoria medication would be a more appropriate name for them. They are capable of lowering dysphoria from mild ICD and even of improving the baseline dysphoria that is now present in most contemporary humans. Yet regardless of dose, they are incapable of producing pleasure, happiness or tranquilization, and they do not impede pleasure and tranquility that is justified by environmental conditions. They appear to block the spontaneous rise in anxiety, existential fear, mortality fear and depression from sub-optimal endorphin levels, although this has never been objectively examined.


Behavior therapists, psychoanalysts and those using psychoanalytically based psychotherapies treat both patients suffering from the more intense dysphoria brought on by ICD and those with modest baseline dysphoria. They have resisted testing the effectiveness of their treatment methods ever since these were initially shown to be ineffective in treating dysphoria in all of its forms according to current scientific standards of treatment effectiveness, which include the use of control groups. Their designation as pseudo-sciences is therefore justified.


The ineffectiveness of Freud's notions as applied to treatment notwithstanding, they have become the zeitgeist of our times, perhaps originally owing to the titillating aspects of Freud's narratives during the Victorian era. The continued use of treatment methods based closely or loosely on disease causation rooted in presumed past pathological experiences can be blamed on an army of practitioners who follow Freud's pseudo-science out of ignorance, and on others who promote these invalid ideas and quash alternative visions out of self-interest. They often argue that antidepressant medication alters the personality of those who take it. Personality is a vaguely defined concept, but in this context it is used to indicate the mental processes and behaviors that make an individual different from others. In truth, dysphoria destroys individuality: dysphoric people are strikingly uniform in expressed emotions, behavior and social perceptions, which become disparate once dysphoria is moderated. It is also claimed by these pseudo-sciences that treating dysphoria with medication adversely affects the character of individuals by altering their values. Character and values are nebulous terms often used by the pseudo-sciences; they most likely refer to the collection of adaptations that gave hunter-gatherers intense social cohesion, such as refraining from violent behavior and sharing. In fact it is dysphoria that accounts for the loss of socially cohesive behaviors in civilizations, with the rise in prejudice and in individual and collective violence typical of lost social cohesion. Social cohesion should be restored to near hunter-gatherer levels through the use of antidepressant medication.


Despite its prevalence, psychotherapy is not what it used to be. Few psychotherapists will now attempt to treat patients with disorders requiring antipsychotic medication, patients with extreme dysphoria from ICD at high risk of suicide, or substance abusers. But they continue to attempt to treat both unhappy individuals with baseline dysphoria and highly dysphoric patients with ICD, both of whom could be treated effectively with antidepressant medication. Their methods also introduce an additional hazard, because rather than explaining the genetic and chemical nature of the illness, they often suggest fanciful presumed social causes of the patient's pain that can lead to upheaval in the support structures of already vulnerable individuals. Because these pseudo-therapists typically, and erroneously, consider mild dysphoria unavoidable, and because they treat the more severe dysphoria of ICD ineffectively, psychotherapy contributes to misery, increased risk of substance abuse, suicide, lower fertility and other signs of poor social cohesion.




DYSPHORIA IS A UBIQUITOUS DISEASE



The word health is derived from an Old English word meaning whole. The World Health Organization defined health in its constitution as “a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity”; therefore by both its original and its more current meaning, dysphoria of all intensities is inconsistent with good health. The word disease comes from Middle English, where it denoted “lack of ease or inconvenience”, which in turn has origins in Old French (OED). Its more current meaning is a “disorder of structure or function in a human, animal, or plant, especially one that produces specific symptoms or that affects a specific location and is not simply a direct result of physical injury” (OED).


Dysphoria of all intensities is therefore both an indication of ill health and a disease, but this is not reflected in current mental health classification systems. Since its initial publication in 1948, the International Statistical Classification of Diseases and Related Health Problems has become the most widely used system of classification for diseases, with the exception of mental disorders in North America, where the preferred classification system since its first publication in 1952 has been the Diagnostic and Statistical Manual of Mental Disorders (DSM). These systems represent the consensus view of the health professionals who deal with these disorders, which includes some who follow scientific principles exclusively, practitioners of the pseudo-sciences of psychotherapy and behavioral therapy, and many confused therapists who try to combine pseudo-science and science by practicing both. The pseudo-scientists have made certain that their views and treatment methods can be accommodated by these classification systems and therefore retain a certain level of legitimacy. Both classification systems differentiate between mental disorders by subjective description of signs and symptoms without taking into account advances in clinical neurochemistry. For example, antidepressant medication can effectively treat a host of disorders that are considered distinct according to either classification system, which suggests that they are all variations of the same disorder. The term dysphoria was created in this report both to indicate the evolutionary origin of this emotional disorder with civilizations and to consolidate under one name all the conditions that respond to antidepressant medication, a consolidation the disease classification systems would reflect if they were evidence based.


Dismal management of dysphoria portends massive individual and social costs. Dysphoria will be poorly managed as long as psychotherapy and behavioral therapies are considered acceptable treatments. Their use has also delayed advancement in areas of importance to the treatment of mental illness, such as mental resiliency.





MENTAL RESILIENCY


Mental resiliency is defined here as the acquired ability to recover fully to the pre-morbid level of functioning following a mental illness. Elizabeth Zetzel (1907-1970) was the first to document this, through observations she made early in her career while treating troubled military personnel during World War II who were suffering from what is now popularly called PTSD in a war context, which in the terms of this report represents an intense dysphoric swing of ICD, with dysphoria further amplified, in front-line soldiers, by the existential fear of death. Zetzel observed that all of her patients with this condition presented similarly, but that their recovery varied in relation to life experiences prior to adulthood. Those who recovered fully had experienced events in their childhood that are now popularly considered mentally traumatic, for example sexual and physical abuse, neglect in the form of hunger and poverty, or being brought up in one- or no-parent families. The opposing group, whose members never fully overcame their illness, had experienced no similar events; in other words, they had what current belief would call an exemplary childhood.


These observations refute Freud's hypotheses regarding the origins of mental disorders, which predict the opposite results from early experiences. Mental resiliency probably represents a form of the imprinting that occurs in many species of animals, thought to be the result of relatively immutable neural connections that can be formed while the central nervous system is not yet fully developed. In this case, childhood events that posed intense existential threats in the child's mind and were beyond the child's control become firmly linked to the child's subsequent return to wonder. Similar intense associations may be unattainable in mature individuals, leaving those without this early learning subject to persistent heightened anticipatory fear regarding an imminent return to a state of helplessness. Mental resiliency must have been an adaptation that improved the survival of hunter-gatherers; however, unlike the life of humans in civilizations, the life of hunter-gatherers was so precarious, rife with intense existential fears from unpredictable death, high infant mortality and low life expectancy, prolonged fasting, the ever-present risk of predator attacks and environmental exposure with minimal protection, that none could have reached adulthood without experiencing events that would have given them mental resiliency.


Zetzel never attempted to objectively validate her observations, nor has anyone else, but recent data support them. Human racial classification systems have no scientific basis: they are arbitrary, do not indicate any significant genetic differences and were developed ad hoc by those advancing preconceived theories designed to raise the status of the group to which they belonged and to dismiss others as inferior, be it the Aryanism of the German National Socialists or the white supremacy used to justify slavery. White supremacy has become so deeply embedded in the culture of the United States and other countries that it has produced massive social differences between those designated as black and white, with blacks disproportionately living in poverty since being considered chattel, and with institutionalized discrimination leading to poverty-related social problems such as poor education, minimal physical and mental health care, broken family structures, violence and criminality.


The experiences of blacks prior to adulthood therefore typically include many of the early experiences that Zetzel suggested may contribute to mental resiliency later in life. Reports from the United States Veterans Administration since the United States Armed Forces were integrated in terms of racial designation indicate that there is no difference in the incidence of PTSD in relation to race, which is consistent with its being a disorder of neurochemistry occurring in groups that are not genetically distinct. But the quality of recovery from PTSD varies considerably, with white current and former soldiers about four times more likely than equivalent black soldiers to commit suicide, often well after their time in military service. Relatively poor mental resiliency has left white current and past military personnel, compared with black cohorts of similar military background, unable to recover completely from PTSD. Poor mental resiliency also satisfactorily explains why those designated as white in the civilian population suffer more from the opioid epidemic and suicide.


Severe episodes of what is now popularly referred to as PTSD in non-military situations actually represent intense dysphoria from ICD combined with events that raise existential or mortality fear: critical academic disappointments, severe financial loss, the loss of social support that comes with the death of a loved one or the breakup of a long relationship, mortality fear from a life-threatening illness, and many similar events. The more mentally resilient snap back, whereas those without mental resiliency may be left with persistently amplified dysphoria. PTSD of any intensity can be treated successfully with antidepressant medication whether it occurs in those with high or low mental resiliency, but those with poor mental resiliency might require prolonged use of these medications in order to return to their pre-morbid levels of functioning.


Even a modest degree of mental resiliency may help to prevent permanently amplified dysphoria, or the need to use antidepressant medication for the rest of one's life, following episodes of ICD. Most contemporary children, at least in industrially advanced countries, are probably now unable to achieve high levels of mental resiliency because they do not experience the amplified existential fear in childhood that is needed to create it. It may be worthwhile to consider changing some child-rearing practices, such as the practice of eliminating all stressful childhood experiences. Clearly, emotionally fragile children should be identified and treated, but perhaps the others deserve a chance to acquire mental resiliency through exposure to some existential fear in childhood. No one would argue that physical and sexual abuse of children is desirable, but it may not follow that all social harassment, social exclusion, harsh criticism and disappointment from failure to achieve goals should also be eliminated, as this may make it impossible to achieve even modest levels of mental resiliency. Better mental resiliency in our children may raise their academic achievements and lower their risk of suicide and their use of dangerous substances.



DISCUSSION


In 1859, modern biology was ushered in with the first treatise on evolution, and since then a rudimentary knowledge of evolution has become an essential requirement for everyone entering the health sciences. For example, students are taught that human blood has a power of hydrogen (pH) similar to that of ocean water, and that gill structures appear in the early fetal development of humans, both of which suggest that part of early human evolution took place in an aquatic environment. Likewise, evolution is presented as responsible for the ascent of Homo sapiens from hunter-gatherers into the cognitively superior but in some ways less physically and mentally robust organisms we currently are. This conclusion is both implausible and unsupported by data, and there is little evidence that even early hunter-gatherers differed genetically from current humans. With the cascading nature of the advancement of knowledge, this superficially logical but scientifically impossible conclusion has become a pseudo-fact with costly consequences for the understanding of human health and social behaviors. This report advances the far more plausible thesis that late hunter-gatherers were essentially genetically identical to contemporary humans. It follows that current humans should be considered essentially hunter-gatherers who have recently been thrown into a vastly new environment and remain in the early phase of adapting to life in civilizations.


Once humans in civilizations are considered genetically nearly identical to hunter-gatherers, and the dissimilarities between hunter-gatherers and modern man are compiled, it becomes apparent that these dissimilarities can all be explained by decline from the high baseline endorphin levels maintained by hunter-gatherers. Only a substantial decline in mobility could account for such a massive endorphin decline in healthy humans. This report is the first to propose that the decline in mobility with civilizations released pre-existing cognitive potential, caused descent from wonder and brought loss of physical and mental robustness. The term dysphoria was introduced to signify the emotional decline from wonder.


A vast trove of insights appears once contemporary humans are viewed as genetically hunter-gatherers, and the differences between hunter-gatherers and modern man are considered environmental and endorphin mediated. Disciplines that deal with social interactions, such as political science and economics, may benefit from knowing that class structures in early civilizations were the result of differences in dysphoria and cognition between the classes, due to the endorphin disparity that arose from their differences in mobility. Likewise, the revolutions following the European Renaissance were the result of the rise in cognition and dysphoria in the general population from the mobility decline that lowered endorphin levels. Poor conflict resolution in the form of wars, replete with the rise of populism and demagogues, can be accounted for by the rise in existential fear following economic collapse. The exception is the current drop in social cohesion, which is occurring during a period of economic growth. It has probably been brought about by a rise in dysphoria in the general population from the endorphin decline that accompanies the attenuated mobility of sedentary information technology.


This report presented a sampling of examples of how the lack of adaptation to civilizations provides insights into disease cause and treatment. For example, humans remain adapted to barefoot mobility, and the use of footwear accounts for a host of common, highly disabling injuries. Also, humans in civilizations sense pain from innocuous stimuli because their pain thresholds remain appropriate for hunter-gatherers; this has led to the universal use of analgesia, to pain clinics and to questionable pain syndromes such as fibromyalgia that have no pathological basis. Another example used in this report is the failure of humans to adapt at all to the diets rich in cellulose and starches that were introduced when they entered civilizations. The human digestive system continues to treat essential fatty acids of animal origin as the most important nutrient, assigning them the highest priority of any nutrient for complete digestion and absorption by giving them the slowest entry into the small intestine from the stomach. The structure of the colon remains far too delicate, in wall thickness and storage capacity, to safely cope with a cellulose-rich diet. Diets rich in cellulose adequately explain irritable bowel syndrome (IBS) and diverticular disease. The pseudo-fact that consumption of dietary fiber is beneficial, which unfortunately has been advanced by health professionals, has led to increased consumption of cellulose and consequently to a higher incidence of these conditions at earlier ages.


The notion that humans remain hunter-gatherers in terms of social adaptations is supported both by their behavioral change as they entered civilizations and by their retained social adaptations. Social cohesion and fertility collapsed with the rise of dysphoria in civilizations and remain dismal. Despite this, humans living in civilizations are capable of displaying intense social cohesion even with endorphin levels well below what is required for wonder. An example appears when endorphin reinforcement encourages a group size similar to that of hunter-gatherer groups, as with platoons in the military and with therapy groups. This indicates that a relatively modest endorphin rise, well below the level required to produce wonder, yields a substantial rise in social cohesion even in dysphoric humans living in civilizations. Consequently, improved social cohesion, and perhaps improved fertility, may be obtainable if dysphoria in the population is only moderately reduced. This may be achievable via effective treatment of those suffering from ICD, and of others experiencing persistently amplified dysphoria because of poor mental resiliency following ICD.


Among the medical implications of these insights, I have chosen to emphasize baseline dysphoria in the population and ICD because of the immense magnitude of these diseases and their implications for social cohesion. For more than half a century, antidepressant medication has been available as a proven medical treatment for dysphoria, and had it been used optimally it might already have reduced human suffering and improved social cohesion. This unacceptable situation is considered in this report to be caused mainly by the continued romance with the pseudo-science of psychotherapy, and the health care establishment deserves much of the blame. Some of the oxygen that has allowed psychotherapy to survive has come from an anachronistic mental disorder classification system in which disease categories have no relation to neurochemistry, but rather are based on subjective signs and symptoms (e.g., anxiety, depression) that are influenced by popular culture (e.g., PTSD, chronic fatigue syndrome). An evidence-based classification system is proposed here in which subjectivity is replaced by categories identified by the individual medications, or families of medications, that are effective in treating specific disorders. In this system one category would be responding to antidepressant medication, and it would include anxiety, depression and all other disorders that respond to these medications. Psychotherapy has no place in this classification system because the system is based only on effective treatments. The system would presumably be replaced by neurotransmitter-based categories once that information becomes available.
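As a toy illustration of the proposed system (the category and disorder names below are drawn from this report, but the structure itself is a hypothetical sketch of mine, not an established nosology), disorders would be grouped by the medication family that effectively treats them rather than by subjective symptoms:

# Sketch of a treatment-response-based classification (illustrative only).
TREATMENT_BASED_CLASSIFICATION = {
    "responds to antidepressant medication": [
        "anxiety", "depression", "post-traumatic stress disorder",
        "seasonal affective disorder", "pre-menstrual stress syndrome",
        "fibromyalgia", "chronic fatigue syndrome",
    ],
    "responds to antipsychotic medication": [
        "schizophrenia",
    ],
}

def category_of(legacy_diagnosis):
    """Map a legacy symptom-based diagnosis to its treatment-response
    category; returns None when no effective medication is known."""
    for category, diagnoses in TREATMENT_BASED_CLASSIFICATION.items():
        if legacy_diagnosis in diagnoses:
            return category
    return None

print(category_of("fibromyalgia"))  # responds to antidepressant medication

Under such a scheme, the many symptom-defined labels that respond to the same medication collapse into a single category, which is exactly the consolidation that the term dysphoria is intended to capture.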


This rise in dysphoria from mobility decline is now leading more dysphoric individuals to attempt to escape their pain through the use of substances that mimic endorphins. Legalizing marijuana has been a misguided attempt by societies to douse the fires of social instability caused by amplified dysphoria through encouraging substance use, but no technology-based society will prosper with the lowered productivity from the impaired cognition that this substance produces. Societies have failed to deal adequately even with the more intense episodes of ICD through effective treatments. The baseline level of dysphoria has risen and now accounts for irritability, insomnia, excessive pain and fatigue in nearly everyone. There is an urgent need for treatment of excessive dysphoria not only in those experiencing ICD but in a much larger part of the population.


As previously mentioned, even countries with the most advanced social programs are experiencing intense hoarding, nationalism and populism with demagogues, which suggests that social programs are now increasingly unable to reduce existential fear sufficiently to compensate for the spontaneous dysphoria arising from the rapid decline in mean endorphin levels in societies. It is proposed here that this can be accounted for by reduced mobility from the integration of sedentary information technologies into so many parts of our lives. This process will continue despite its effect on dysphoria, because of the advantages of further technological advances and the rewards of the modest cognition gains that come from further attenuated endorphins. The industrialized world has previously faced only relatively brief spasms of declining social cohesion following the rise in existential fear from economic collapse. We are now returning to the instability that civilizations last experienced when serfdom ceased and dysphoric ruling classes were overthrown. In the present context this endangers not only individuals, but nations and humanity itself.
