
DESPAIR AND HOPE IN THE AGE OF DYSPHORIA (Added November, last updated December, 2020)



Humans underwent rapid, multifaceted changes when transitioning from life as hunter-gatherers to living in civilizations. A sudden rise in cognition allowed written language for the first time in human history, which permitted accumulation of knowledge and rapid technological advancement. At the same time mental and physical robustness abruptly declined, as is suggested by a need for ethanol and later other substances to allay anxiety, depression and fears, and for clothing, shelter and bedding to tolerate environmental conditions in regions where hunter-gatherers thrived without their use. These changes cannot be explained by the genetic-evolutionary model; therefore they must have been phenotypic, which means they are of environmental rather than genetic cause. It is proposed that hunter-gatherers maintained high endorphin levels following the operant conditioning model, with endorphin reward for maintaining the ideal group size and mobility needed to assure sufficient foraging intensity to produce the adiposity required to survive seasonal fasting. This endorphin production created wonder, which is defined as cognition dulled sufficiently to eliminate existential fears, together with the lowest levels of anxiety and depression and the highest pain thresholds and ambient temperature tolerance that were consistent with optimal health. Fertility and social cohesion were also optimized at wonder. Dysphoria is decline from wonder. Civilizations caused dysphoria because they inherently require larger group size, less mobility and better cognition than does nomadic foraging. Whereas hunter-gatherers were highly socially cohesive, early civilizations were authoritarian, with weaponized minorities dominating enslaved majorities composed of conquered groups. Improved living standards reduced existential fears, but insufficiently to compensate for the rise in dysphoria from further mobility decline as transportation technology advanced.
The net result was the use of further improved cognition to create ever more ingenious methods of mass destruction. The recent mobility plunge due to sedentary digital technology at work, at school, and when socializing and shopping, and more recently due to Covid-19, has caused a dysphoria spike which is sending humans to pseudo-scientists such as psychotherapists for treatment, to demagogic populists for leadership, and to substances that temporarily relieve dysphoria regardless of risk. It is not sensible to seek hunter-gatherer endorphin levels because doing so entails massive cognitive decline. Antidepressant medication combined with analgesia can relieve the most distressing aspects of dysphoria with no loss of cognition. Individual sense of well-being will rise, with a resulting attenuated demand for substances. From a collective perspective social cohesion will rise, which will lead to conflict resolution without violence, sharing of resources and other behaviors that advance the common good.


Inhabitants of economically advanced countries have been experiencing a recent sharp rise in anxiety, depression, generalized fatigue and chronic pain, all without a clearly identifiable cause. Social cohesion has declined, as measured by rising substance abuse, suicides, income disparities and the appearance of demagogic populists, which historically has preceded spasms of mass destruction.

Civilizations first appeared less than 10,000 years ago, and the majority of humans have lived in this form of social organization for less than 5 millennia. They are therefore so recent in the human story that significant genetic difference between the hunter-gatherers that existed prior to the appearance of civilizations and even contemporary humans is highly improbable. Small numbers of hunter-gatherers continued to exist totally isolated from the influences of civilizations until the Age of Discovery commenced in the early fifteenth century. Reports from early encounters with some of the last remaining hunter-gatherers suggest they displayed lower cognition, as measured by the inability to create written language or develop sophisticated technologies. They appeared to be free of chronic intense anxiety and depression, and few complained of chronic pains despite relatively harsh living conditions. They displayed high levels of social cohesion, with prolific sharing even of scarce resources, and they exhibited minimal violence within their groups.

Since both hunter-gatherers and humans living in civilizations share essentially the identical genome, these differences must be phenotypic, which means they are accounted for by variations between the environments of hunter-gatherers and civilizations. This report is the first to look at these current problems in individual health and social cohesion as phenotypic. Identifying the environmental influences that have caused them is essential to their mitigation. Accordingly, the object of this report is to identify the environmental causes of these phenotypic changes so that we may finally progress in their resolution.


Phenotypes are differences in expression of the same genome that are accounted for by environmental influences. Adaptation is used here to signify genetic change that improves survival in a specific environment. A highly socially cohesive society is defined by the OECD Development Centre publication entitled Social Cohesion in a Shifting World as one that “works towards the well-being of all its members, fights exclusion and marginalisation, creates a sense of belonging, promotes trust, and offers its members the opportunity of upward social mobility.” Fear associated with eventual death is called here mortality fear, whereas fear of premature death from immediate situations such as starvation and war is existential fear. Fear without a source is anxiety. Pseudoscience is used here to indicate belief systems that are presented as scientific and proven but are either not based on scientific methods or have been invalidated. Pseudo-truths related to human biology and health are concepts or details, often originating as hypotheses advanced by legitimately trained experts, that are widely believed to have been scientifically validated but have either never been properly tested and verified or have been invalidated. Belief in pseudo-truths is owed to how rarely scientific information thought to be valid is investigated to determine either its origins or whether it continues to be valid.

H. sapiens appeared in sub-Saharan Africa perhaps 250,000 years ago as nomadic hunter-gatherers. Despite pseudo-truths to the contrary, although classified as omnivorous they displayed such a strong preference for nutrition based on large grazing animals that when seasonal migration of herd animals made this food source unavailable they fasted and survived by catabolizing fats from their adipose tissue rather than obtaining nutrients from vegetation or other animals. Hunter-gatherer group size was maintained at approximately 50 individuals of all ages. Their existence was continuously challenged by death from starvation if their adiposity was insufficient to sustain their prolonged seasonal fast, and by death from attacks by the predators they depended upon to kill the large animals from which they scavenged remains. Survival of hunter-gatherers was a testament to their social cooperation, to anatomy which allowed use of tools and weapons, and to cognition superior to that of other animals, which gave them the advantage of situationally-adaptive protective behaviors.

A prolonged period of environmental stability resulted in morphologic, physiologic and social adaptations that led to their population burgeoning to the point of being growth constrained by the limited amount of land suitable for nomadic foraging. Civilizations presumably were the best of a series of attempts to achieve population growth through alternative forms of social organization. Civilizations appeared so recently and abruptly that differences between late hunter-gatherers and even contemporary humans cannot be explained following the genetic-evolutionary model, and with no evidence of significant genetic differences between hunter-gatherers and humans in civilizations, any dissimilarities between them must be phenotypic. Accordingly, humans remain essentially well adapted hunter-gatherers that have entered civilizations with adaptations, some of which continue to aid their survival in civilizations, others that may no longer serve a useful purpose in the context of civilizations, and some that perhaps result in disease and death in civilizations. This evolutionary perspective is often ignored in contemporary research dealing with human disease causation.

More advanced cognition was a pre-condition for civilizations because civilizations required the novel technologies of agronomy and keeping animals in captivity, whereas hunter-gatherers were so cognitively limited that they lived with the most primitive of technologies and were incapable of even formulating a written language. Humans suddenly had not only written language; records of early civilizations also document remarkable intellectual proficiency despite minimal written accumulated knowledge. Early civilizations produced some basic systems of mathematics and logic that continue to be used today, and their early discourses dealing with ethics and contemplation of mortality are presently admired. No convincing argument is available to suggest that the cognitive performance of humans living in early civilizations was inferior to that of contemporary humans; therefore further advances in technology and other measures of intellectual achievement in later civilizations are probably the result of a steadily enlarging base of accumulated recorded knowledge rather than a significant further increase in intellectual capacity.

This phenotypic change in cognition was therefore the result of environmental differences between life as hunter-gatherers and life in civilizations. This indicates that the potential for superior cognition existed in hunter-gatherers but was not expressed, and that a dulling of this potential was an adaptation that advanced their survival in that environment. The best available explanation of the advantage of dulled cognition in hunter-gatherers is that it moderated existential and mortality fears. Presumably preoccupation with their precarious existence and eventual death would have dissipated their will to live, or at least distracted them from concentrating on the immediate task of surviving a challenging environment. This further implies that the transition from hunter-gatherers to civilizations came with a rise in existential and mortality fear.

Most scholars attribute survival of hunter-gatherers in part to high levels of social cohesion, consisting of warding off predators, acquiring and sharing a precarious food supply, cooperative effort to assure that offspring reached maturity, and optimal fertility. This concept is supported by reports from relatively recent first contacts with hunter-gatherers such as the Inuit and Australian aboriginals. They indicate intense social cohesion between members of groups and suspicion of those from outside their groupings. Reports are also available from the study of the behavior of primates such as chimpanzees that lived in the same region as the earliest hunter-gatherers and resembled them in terms of the group size maintained and the dangerous predators they had to protect against through group behavior. These primates exhibit no violent behavior resulting in death between members of their group, aside from infant deaths at alpha-male succession, which is probably an adaptation that advances survival through limiting inbreeding. There is suspicion of those from outside the group but no violence between groups causing death. Territorial boundaries between groups are maintained through intimidation. There is also some cooperation between groups, as indicated by females occasionally leaving their group without apparent coercion to be rapidly and peacefully integrated into the receiving group. This practice appears to be an adaptation that improves survival through maintaining optimal group size.

Humans experienced a precipitous phenotypic decline in social cohesion with civilizations. Early civilizations were authoritarian, with a weaponized minority dominating an enslaved majority composed of conquered groups. Rulers often used sadistic practices to intimidate others in order to maintain their leadership. Slavery and serfdom were later abandoned, but this was not necessarily a sign of improved social cohesion because it was done mainly for economic reasons: technological advances made human labor uneconomical, and further advances in labor productivity required intellectual resources typically supplied by educated freed humans. Furthermore, most technological advances in civilizations that have improved the overall well-being of a population have originated from research that attempted to advance weapons of mass destruction. As a sign of continued poor social cohesion, human activity has recently endangered the bio-systems humans rely upon for their existence, and it remains uncertain whether contemporary humans will choose to direct resources to their own survival and that of future generations rather than to consumption of resources for immediate gratification that further degrades their environment.

Ethanol use may have pre-dated civilizations, but it must have been used minimally then because high-volume production of ethanol requires feedstocks produced through the agronomy that commenced with civilizations. Ethanol has been continuously produced and consumed for more than 9,000 years, which encompasses the totality of the existence of civilizations. It has been suggested that ethanol was consumed mainly for its nutritional value, despite the fact that producing ethanol through fermentation lowers the nutritional value of feedstocks. Others have proposed that it was used primarily as an antiseptic or for decontaminating polluted water, but this seems unlikely because it would have conferred minimal protection for the whole population. The most plausible explanation of ethanol use in early civilizations is that it was used then as it is today: to impart relaxation, sedation and analgesia, to increase tolerance of low environmental temperatures, and as an antidepressant and moderator of mortality and existential fears and anxiety. The importance of ethanol to humans living in civilizations is further suggested by its continued popularity despite being toxic and addictive, and despite its dulling of the cognition that is essential to success in civilizations. Perhaps the availability of ethanol was an essential pre-condition to this form of social organization because it provided an escape from emotions that may have amplified with cognition. The present popularity of non-medical use of opioids can be explained similarly.

Fertility rates in hunter-gatherers are thought to have been near optimal, considering the species survived despite challenges consisting of a harsh environment replete with predators and prolonged fasting. Fertility rates in early civilizations are not known; however, reliable records from the nineteenth century in economically advanced countries indicate that they were well below optimal levels. Since then fertility has declined further, reaching levels in contemporary societies that are at or even at times below the population replacement level, even with the advantage of a substantial increase in life expectancy. This leads to the conclusion that there has been a progressive but non-linear decline in fertility since civilizations commenced.

Contemporary humans retain an adaptation whereby fertility is so negatively related to anxiety and depression that either can be used as a surrogate measure of the other. This adaptation may have at times challenged the survival of humans in civilizations, but it advanced survival in hunter-gatherers by diminishing interest and energy in reproductive behaviors at times of extreme survival stress so that all their energy and attention could be directed at an immediate threat. The current low fertility rate in economically advanced countries is best explained by the rise in anxiety and depression that has been recently documented in contemporary society and that must have commenced with the transition from life as hunter-gatherers to living in civilizations. This phenotypic decline in fertility presumably would reverse if anxiety and depression were attenuated in societies.

Humans experienced declines in some measures of physical robustness that also were phenotypic. Whereas there is no physical evidence indicating that hunter-gatherers living in sub-Saharan Africa relied heavily on shelters, clothing or bedding, there are extensive physical remains, images and written records of humans using these technologies in civilizations in regions with a similar climate. This suggests that humans entering civilizations rapidly developed intolerance of temperatures at the lower end of the range that hunter-gatherers found satisfactory, and lower pain thresholds to high-amplitude skin tactile stimulation such as would be produced by lying on rigid and irregular surfaces, which hunter-gatherers found acceptable without the interface of bedding. Contemporary humans seem even less robust in these terms than those of early civilizations, which implies that low-temperature tolerance and pain thresholds to tactile stimulation have continued to decline. The environmental factor that caused this decline in robustness likely has increased further.

Robustness in terms of tolerance of low temperatures and of modest pain from intense tactile stimulation consists of adaptations that evolved in hunter-gatherers through natural selection; therefore these thresholds do not impair human survival. It follows that the intolerance to low temperatures and intense tactile stimulation that came with civilizations does not improve human survival, and is therefore a source of unnecessary distress and of unnecessary behaviors required to attain comfort: varying thermal insulation through clothing changes, and attempting to achieve enough comfort when supine to attain satisfactory sleep even when using highly resilient bedding.

The aggregate number of humans living both in civilizations and as hunter-gatherers during the first 5,000 years after civilizations commenced has been estimated to be equal to the number that lived as hunter-gatherers prior to civilizations, which means that few humans entering civilizations survived. This was probably accounted for by a higher mortality rate or lower fertility or both when compared to hunter-gatherers. This indicates that the continued existence of civilizations was uncertain for a considerable time, even with massive improvement in cognition and reduction of the risk of dying from starvation and attacks from predators. This also indicates that large numbers of humans have lived in civilizations for only 5,000 years, which coincides with the emergence of civilizations in proximity to the Nile, Tigris and Euphrates Rivers. The initial risk to civilizations surviving as a social system, and their success after 5,000 years, can be explained by the time needed to develop through natural selection modest resistance to infectious diseases that were prevalent in civilizations, such as the bacterial diseases tuberculosis, plague and syphilis and the viral diseases smallpox and polio. Modest resistance likely was also attained to addiction to ethanol and perhaps other similar substances that appeared with civilizations.

Skeletal remains of hunter-gatherers and of humans living in early civilizations suggest that the life expectancy of both groups was about 35 years. It gradually increased with advancing civilizations, probably due to further resistance to infections and greater resistance to ethanol addiction, until the nineteenth century, when it rose modestly, only to be followed by a rapid rise with germ theory, which led to improved hygienic practices and later to the discovery of antibiotics 100 years ago.

As previously mentioned, hunter-gatherer group size is thought to have been maintained at near 50 individuals of all ages, which resembles the size of some other primate groups currently living in sub-Saharan Africa. Maintaining group size must be an adaptation which improved species survival through optimizing fertility and group protective behavior while maintaining a nomadic existence, which permitted better foraging of animal remains and attenuated disease from contamination of their immediate location with human waste. Group size, like most social behaviors, is probably actuated through operant conditioning with endorphin release as the reinforcer.

Group size in civilizations increased far above the ideal number according to the adaptation made by hunter-gatherers. There is evidence that this hunter-gatherer adaptation for group size is retained by contemporary humans. All armies in the world are designed around the platoon, which consists of between 20 and 30 adults. The platoon size was determined empirically from results in combat, where groups of this size predictably accomplish dangerous missions, often with acts of heroism. This adaptation to group size also helps to explain the success of group therapies in dealing with psychological disorders: group therapies based on a variety of theories have been successful by current scientific standards in dealing with addictions of all types even without professionally trained leadership, whereas individual therapies have shown no success even with trained personnel. Furthermore, individual psychotherapy when examined objectively has never been shown to aid disturbed patients beyond control groups that received support of the type offered by a concerned friend. Another example of this adaptation comes from primary school education. Parents of children and many educators attach so much importance to maintaining school class size at near 30 individuals that it has become a perennial nightmare for school administrators and a source of excessive educational expenses. This must be an externalization of their own sense of comfort in groups of this size, because there is no evidence that children learn better in classes of this size compared to larger classes, and children seem indifferent to class size if not informed of its presumed benefit by parents. Excessively large groups, according to this group size adaptation, may at least in part explain the loss of social cohesion with civilizations.

Humans entering civilizations probably never realized that civilizations offered the advantage of alleviating potential death from starvation associated with their prolonged seasonal fast as hunter-gatherers, because this advantage to survival came at the same time as diseases from the carbohydrates and cellulose introduced into their diet with civilizations. Radioisotope evidence indicates that hunter-gatherers consumed mainly animal-based nutrients. Mounds of crushed, previously marrow-rich bones of large grazing animals that date to periods earlier than the appearance of civilizations have been discovered in sub-Saharan Africa, together with the tools used to extract marrow. The cortex of these bones is so thick that no animal other than humans using tools could extract the marrow. It is now concluded that humans largely scavenged bone marrow as a staple food source. As previously mentioned, their reliance on animal-based nutrients was so great that they risked living with predators that could kill them even more easily than they could kill the large animals from which humans scavenged remains, and to moderate this risk humans had to develop a complex network of collective defense adaptations.

Humans prevented their yearly fast, combined with their disinterest in resorting to alternative food groups, from resulting in death from starvation through a complex adaptation that is shared with a small number of other primates such as Macaca fuscata (the snow monkeys of Japan), who unlike humans are mainly herbivores. It consists of fasting and surviving by catabolizing retained adiposity. This adaptation also includes low satiation and an intense mobility drive to forage intensely when the preferred food is available; when it is unavailable for an extended period they become lethargic during their fast, which presumably conserves energy. Natural selection chose only those hunter-gatherers who were willing to suffer the cost of eating plant-based nutrients consisting of carbohydrates and cellulose, in addition to a relatively small amount of animal-sourced food, to survive life in civilizations, but that does not mean humans are well adapted to a diet dominated by these nutrients.

Without regular fasting to reduce adiposity, civilizations produced persistent obesity for the first time in human history, at first only in the elite class that performed minimal physical work. This expanded to the whole population as machines replaced physical labor. Obesity was documented as a disease in some of the earliest records from civilizations. Obesity will only be managed successfully when it is acknowledged that it is an expression of an adaptation that allowed humans to survive in sub-Saharan Africa.

Human digestive physiology remains best adapted for animal-based nutrients. Slowing of stomach emptying in relation to a nutrient type allows greater time for small intestinal absorption of that nutrient. Contemporary humans retain an adaptation whereby essential fatty acids of animal origin greatly retard gastric emptying. The same essential fatty acids of plant origin also reduce gastric emptying, but they exit the stomach at approximately double the rate of animal-sourced fats. It remains unclear what mechanism humans use to distinguish between plant and animal essential fatty acids, although it has been proposed that this is accomplished by distinguishing the mix of essential and non-essential fatty acids, which differs greatly between plant- and animal-sourced foods. Humans have failed to adjust to carbohydrate and cellulose ingestion, since both are given such a short gastric delay that when humans eat them without adding fat or protein they risk suffering abdominal pain from gas production by bacteria acting upon the partially digested carbohydrates and breakdown products of cellulose reaching the colon. Cellulose-rich diets present additional problems resulting from the elevated amount of undigested cellulose reaching the colon. Chimpanzees are of similar size to humans but are mainly herbivores, and they digest cellulose as incompletely as humans do. They have adapted to this diet through a thicker and structurally more robust colonic wall and a far greater storage capacity for undigested cellulose than is available to humans. This suggests that the human colon is suitable only for the modest residual volume typical of animal-sourced fat and protein ingestion.
It has become customary for physicians and nutritionists to recommend large quantities of dietary fiber, but this must be based on pseudo-truths, because of underappreciated data indicating that humans retain digestive adaptations optimized for an animal-based diet, and because the benefit of dietary fiber in humans has never been evidence based. The presumed benefit of a fiber-rich diet has finally been called into question in a report directed at understanding the increased incidence of irritable bowel syndrome (IBS), diverticulosis and diverticulitis in economically advanced societies. It has been proposed that chronic over-stretching, in great part from excessive colonic contents, causes the inflammation and motility disturbance that characterize IBS, and this eventually leads to micro-disruptions of the colonic wall in the form of diverticula and with them the risk of developing diverticulitis. The recent introduction of even greater amounts of dietary fiber into human diets, often at the recommendation of physicians and nutritionists without recognition of the potential hazards of this practice, explains the increased rates of IBS, diverticula formation and diverticulitis, and why these conditions are now occurring with greater incidence and in younger age cohorts.

Perhaps the greatest external change with civilizations was the decline in the magnitude of physical work humans performed. Most of the skeletal muscle mass in humans is used to maintain stable equilibrium, walking and running, and weight-bearing without locomotion, which is surprisingly work intensive because of constant postural adjustments. In early civilizations an elite class was minimally mobile, but for the majority, living in a fixed location and caring for crops and animals in captivity initially may have required weight-bearing comparable to that of hunter-gatherers while being somewhat less energy intensive because less distance was traversed. Then came a steady decline in mobility from advances in transportation technologies, until the present time, when both the elite class and the less privileged are minimally mobile and sedentary and therefore perform minimal physical work compared to hunter-gatherers.

The rise of a sedentary lifestyle began in Europe with the Renaissance and later spread to other countries through colonization. Chair use is a surrogate measure of sedentary behavior, and the number of chairs in a population is a measure of chair use. Ancient chairs are rare, valuable artifacts that are largely bought and sold through auction houses and displayed in museums; therefore their number can be estimated through auction catalogs and museum records. These sources indicate that few true pre-Renaissance chairs were fabricated, because they served as symbols of status and power and were rarely used at home or work as they are today. In contrast, early post-Renaissance chairs are plentiful, tend to be far less ornate and were often made in sets, which suggests a utilitarian rather than symbolic function. Written and visual records from this period confirm this. Also, the word sedentary dates back to Renaissance England.

The suddenness of the transition to a sedentary existence is thought to have been an unexpected consequence of an equally rapid adoption of footwear use by all social classes. Rudimentary foot coverings were worn for insulation by hunter-gatherers who migrated to the extreme northern latitudes, and later seasonally in some civilizations. Aside from these functional applications, footwear use was preceded pre-historically by decoration of the extremities with coloring agents such as henna as a form of body art. This progressed with civilizations to more elaborate foot decorations worn by the less mobile elite class, which eventually extended these decorations to the plantar surface with footwear composed initially of woven grasses and eventually of leather by the Romans. This footwear invariably functioned poorly during weight-bearing, with and without mobility, compared to bare feet, but mobility was not important because it was used by the immobile class. Shoes therefore were initially poorly functional decorations worn as a symbol of elevated status. This changed with the bubonic plague that preceded the Renaissance in Europe, which resulted in inexpensive leather because leather production exceeded the demand reduced by depopulation. Plague also resulted in a skilled labor shortage in cities, which was satisfied by ex-serfs who became relatively immobile. They now earned wages, and footwear was high on their list of important purchases because footwear use had become a symbol of membership in the middle class, and the mobility impairment shoes produced was less of a problem for urban dwellers. The decorative origins of footwear were forgotten, and its use began being explained by pseudo-truths regarding improved foot health, the human foot supposedly being delicate and subject to dysfunction without footwear.
Chair use followed the introduction of footwear, probably because footwear leads to intense discomfort with prolonged weight-bearing that had to be relieved by sitting. Humans remain poorly adapted to footwear: directly, shoes cause abundant foot and leg disorders that do not occur in barefoot humans; indirectly, they harm humans through impaired balance and amplified impact during locomotion, by placing an interface between the extremely sensitive plantar surface and the support surface that moderates high-amplitude transient mechanical stimulation of the plantar surface, and through the hazards of the sedentary lifestyle. These potential dangers of footwear are rarely considered when dealing with sports injuries, obesity and the frequent skeletal injuries associated with its use.


Wonder consists of the lowest levels of mortality and existential fear, anxiety and depression, and the highest environmental temperature tolerance and pain thresholds, that are associated with good health. Dysphoria is decline in one or more elements of wonder from optimal amplitudes. Opioids have been used by humans for 2,000 years to manage pain and for their pleasing effects, which include allaying existential and mortality fears, anxiety and depression, improving low temperature tolerance and relieving pain. The relatively recent discovery of similar substances produced by humans explains the neologism endorphin, formed by combining endogenous and morphine, the most widely used opioid at that time. Endorphins are a heterogeneous group of psychoactive peptides (endorphins, enkephalins and dynorphins) produced in the central and peripheral nervous systems and the pituitary gland that act upon opioid receptors, which are widely distributed in the central and peripheral nervous systems and the digestive tract.

Endorphin releases in humans are associated with such a vast number of environmental conditions, behaviors and disease states that their actions cannot easily be characterized other than that their release represents adaptations through natural selection to the environment in which they developed. Since humans had sufficient time living as hunter-gatherers in sub-Saharan Africa, they are likely well adapted to those conditions, and adaptations involving endorphins are therefore best understood from that perspective rather than from their time in civilizations, which has been brief. The notion that humans largely remain hunter-gatherers in terms of adaptations is often forgotten and rarely considered when dealing with disease causation.

Adaptations involving endorphins can involve their direct effects on tissues and organs, but endorphins also act indirectly by shaping complex behaviors through the operant conditioning learning model, with endorphin release acting as a positive reinforcer. Civilizations represented a massive recent environmental change for hunter-gatherers: they changed from extraordinarily mobile nomadic foragers eating mainly animal remains, living in small groups with limited cognition and therefore minimal technology, to large, relatively immobile groups living densely in a fixed location, eating a diet rich in carbohydrates and cellulose and dependent on superior cognition to advance technologies.

A significant rise in blood levels of endorphins beyond those seen in normal contemporary humans increases pain thresholds, impairs cognition as measured by tests of learning and memory, lowers anxiety, depression and fear, and raises tolerance of extremes of environmental temperature; amplification of endorphin levels can therefore account for all of the phenotypic changes that appeared in humans as they entered civilizations. The only means available for healthy, disease-free, non-pregnant and non-starving humans to maintain persistently high endorphin levels is voluntary prolonged work of moderate intensity involving a large mass of striated muscle, and since most voluntary muscle in humans is used for weight-bearing and mobility, the phenotypic changes that occurred in humans were primarily the result of decline in mobility. In addition, maintaining optimal group size involves complex social behaviors that are likely shaped by continual endorphin release.

It is proposed that optimal endorphin levels were set at relatively high levels, reachable only after being highly mobile, as an adaptation that prevented starvation by amplifying mobility-constrained foraging sufficiently to assure enough adiposity to survive seasonal starvation in sub-Saharan Africa. When the preferred food consisting of large animal remains was available, a low sense of satiation alone was insufficient to guarantee adequate adiposity in hunter-gatherers, so another adaptation evolved that further amplified mobility intensity, the limiting factor in foraging. Endorphins were maintained within a relatively high but narrow range through operant conditioning, with endorphin levels just high enough for wonder but below dysfunctional euphoria. When large animal remains were unavailable, hunter-gatherers did not enthusiastically forage alternative foods but rather became lethargic, as is typical of many animals under similar conditions, and relied upon the adiposity acquired from intense foraging for their survival. These reserves ensured that their prolonged fast did not end in death from starvation. This energy-conserving behavior was also mediated through the intense endorphin release that occurs with prolonged fasting and minimal mobility, which continues in humans as a means of energy conservation during fasting and starvation and becomes a problem in contemporary humans suffering from anorexia.

Since essentially all humans are now relatively immobile compared to hunter-gatherers and require advanced cognition in their daily lives, it is no longer appropriate to consider diminishing dysphoria through significantly increasing mobility, because of its adverse effect on cognition. Sub-optimal endorphin levels have become the cost of advanced cognition, and this will remain so as long as civilizations exist, unless dysphoria represents enough of a selective disadvantage that an adaptation eventually arises through natural selection that diminishes the relation between dysphoria and endorphin levels, or unless dysphoria is treated effectively by medication.

The law of parsimony (Ockham's razor) advances the notion that hypotheses containing the fewest assumptions or steps are most likely correct. This principle remains valuable in contemporary science for evaluating the strength of causal hypotheses in particular, because it has empirically proved reliable in predicting which hypotheses are validated once methods become available to test them directly. The mobility-endorphin hypothesis proposes that decline in endorphins resulted from attenuated mobility, and this single external factor directly accounts for the abrupt rise in cognition and the amplified anxiety, depression, fear, environmental temperature intolerance and pain - essentially all of the phenotypic changes that occurred when hunter-gatherers entered civilizations. Since a single independent variable accounts directly for a vast number of dependent variables, this hypothesis is likely valid.


Hoarding is defined in this report as accumulation of anything that could be exchanged for goods and services in excess of need. The survival of hunter-gatherers in the challenging environment of sub-Saharan Africa required intensely socially cohesive behaviors, and many of these behaviors could also have advanced the survival of humans living in civilizations, yet social cohesion precipitously declined in synchrony with the other changes previously mentioned. Loss of social cohesion was therefore phenotypic and likely the result of decline in endorphins and the consequent rise of dysphoria. It follows that contemporary humans would probably exhibit improved social cohesion if dysphoria were reduced through a substantial endorphin rise from prolonged forms of mobility. This would, however, leave them poorly functional in modern, technology-based civilizations because of the loss of cognition from these endorphins. Better social cohesion might also follow the use of opioids or similar substances, but this is even more problematic than raising endorphins through mobility because of the additional hazards associated with these substances. Social cohesion presumably would rise from lessening dysphoria through medication that neither impairs cognition nor is hazardous. Group size of hunter-gatherers was probably maintained through persistent endorphin release when group size was ideal. Since humans retain this adaptation, maintaining group sizes comparable to hunter-gatherer groups might raise endorphin levels enough to moderate dysphoria; this deserves consideration when designing work and therapeutic groups, but it would be of little consequence in population-dense civilizations.

This analysis regarding decline of social cohesion with civilizations follows deductive logic but does it pass the test of plausibility by being consistent with historical records? The following is a simple sketch of how differences in cognition and dysphoria from endorphin decline may have shaped history. No attempt has been made to be comprehensive, for example nothing is mentioned about the origin of superstition, supernaturalism and religious expression, but the careful reader should have no difficulty expanding the implications of this hypothesis to these areas.

Humans lived in a social system composed of ruling and dominated classes from the commencement of civilizations until the European Renaissance. Civilizations involved far larger group sizes and substantially greater reliance on technologies compared to life as nomadic hunter-gatherers. Survival of these larger groups was aided by assigning a single individual as leader who performed minimal physical work, which released his full cognitive potential through lowered endorphin levels and gave him enough organizational awareness to successfully direct others toward the specialization of tasks required in technology-based systems. The leader's lack of mobility, however, left him highly dysphoric and self-absorbed, with symptoms that included intense existential and mortality fear, anxiety, depression and persistent pain of moderate intensity. As civilizations grew in size, the solitary leader became insufficient for directing the complex system, which necessitated expanding leadership to a supreme ruler and a ruling class, often composed at least in part of the supreme ruler's heirs, who were also minimally mobile and hence highly dysphoric. In attempting to explain their dysphoria, members of this class concluded, much as contemporary humans do, not that their dysphoria was an irrational, inappropriate response caused by an adaptation useful to hunter-gatherers that became inappropriate in the new environment, but that it was justified by exaggerated external dangers such as attacks from neighboring societies and insurrection. The agony of dysphoria was also considered, again analogous to the thinking of many contemporary humans, to represent contempt from the deities that were the source of life. While they initially ruled with the consent of the dominated class, this dysphoria led to repressive regimes replete with thuggery directed at any opposition and intimidation of the masses.
Dysphoria also created impatience with the negotiation and compromise inherent to peaceful conflict resolution, and a preference for dealing with presumed externally justifiable threats with ill-conceived expedient actions, often through violence. Such actions would temporarily allay misguided existential fear and anxiety, but were often contrary even to the rulers' short-term interests and were nearly always the source of long-term negative consequences, leading to cycles of violence both within their state and with neighboring nations. Poor decision making in conflict situations was amplified by the use of ethanol to moderate dysphoria.

Hoarding by the ruling class became as embedded in early civilizations as it is in current societies. Hoarded goods directly addressed their amplified existential fear through deployment in maintaining a police force to prevent organized insurrection and a standing army both to defend against and to initiate pre-emptive attacks on neighboring states, which were also ruled by highly dysphoric ruling classes. Hoarded goods were also used to pay mercenary soldiers, bribes and ransoms. As today, hoarded goods were used to deal with the mortality fear released by endorphin decline by placating deities whose wrath, to their minds, both partly explained their dysphoria and denied them immortality after death. This was typically accomplished by building large-scale structures of no utilitarian value and, in some early civilizations, by performing animal and occasionally human sacrifices to honor these deities. They also dealt with mortality fear by constructing in their own names elaborate structures built to the highest standards of the time, which they thought would last for an eternity. Inheritance of hoarded goods was also used to achieve some sense of immortality, much as it is today. The early ruling classes never envisioned how charitable foundations could fill this need, but had they done so, foundations surely would have become popular with the dysphoric ruling class.

The power of the urge to hoard in dealing with dysphoria should not be underestimated. The philosopher and economist Adam Smith (1723-1790) based his notions about free enterprise on the irrational power of hoarding in humans, but he sanitized it by calling it wealth. Slavery was to him an inevitable and unpreventable aspect of hoarding. He is credited with devising a system whereby hoarding could be optimized by a society.

Even awareness of the harm caused by hoarding to oneself, one's family and others has done little to moderate its influence. Documented examples of prominent individuals driven entirely by hoarding are too numerous to mention, but one better known contemporary of Adam Smith is particularly compelling. An educated, worldly, insightful and admittedly highly dysphoric person, Thomas Jefferson stated in his writings that slavery was an abomination that contradicted his belief that all humans, which to his mind included slaves, are created equal. Yet he invested a vast proportion of his worth in slave ownership until his death, and did not free his slaves in his will as some other slave owners did as a reward for their loyalty. He also did not legally recognize the existence of his own children mothered by his much loved slave mistress, and sold them as property, presumably to maximize the value of his estate and prevent its dilution among numerous heirs.

Hoarding has always been a more powerful motivator in dysphoric humans than social justice, and many acts presumed to be motivated by humanism deserve reconsideration. The notion that social awareness led to the abolition of slavery is inconsistent with the fact that the abolitionist movement only gained traction when advances in technology and industrialization minimized the value of physical labor in production. Slaves were freed after it had become apparent that uneducated humans kept against their will were unsuitable in a complex industrial setting.

Until the end of serfdom, the vast majority of the population was composed of the dominated class, which in early civilizations consisted of slaves and later of serfs. They performed intense physical work which, at least in the early civilizations, produced near-wonder endorphin levels; their cognition was therefore dulled and their dysphoria minimal, which together made them easily manipulated by the ruling class and usually both unwilling and incapable of organizing the overthrow of the ruling class despite their numbers. They managed to survive subsistence living standards through the social cohesion that comes with higher endorphin levels. Their low cognition accounted for their naive willingness to believe the demonization of enemies advanced by the ruling class. They often made reliable defenders of the regime, willingly participating in defense and pre-emptive strikes against external threats.

This two-class structure proved amazingly durable and only started to break down with the European Renaissance, when serfdom disappeared because the bubonic plague caused a shortage of labor and a decline in the value of agricultural products. This coincided with a need for more skilled labor to produce manufactured goods for trade as mercantilism rose following the Age of Discovery. What followed was a progressive decline in mobility and rise in dysphoria that has continued to the present day.

Absolute power of a ruling class could not be sustained as ex-serfs moved to urban settings and became more dysphoric with improved cognition. They began feeling a need to hoard as their existential and mortality fears rose. They became increasingly difficult to manipulate and demanded a far greater share of the hoarded resources from the rulers. City-states such as those on the Italian peninsula were able to formulate an acceptable deal between the rulers and the ex-dominated and avoid violent revolution. Most of Europe was less fortunate because its ruling classes were deaf to the threat, unwilling to share their hoarded bounty, or both. Revolution was in the air, and through violence it produced more representative government that created a system whereby all were given the right to become hoarders and to complain openly, which they called freedom.

By the end of the Industrial Age the majority of the population of the emerging economically advanced countries were dysphoric, and with this dysphoria came the same impatience with non-violent conflict resolution and desire for expedient solutions to complex problems as the rulers before them. Populism and demagoguery satisfied this need at the cost of catastrophic spasms of poor social cohesion. These were particularly associated with spikes in the existential fear component of dysphoria caused by economic collapse in the absence of social programs to mitigate distress. Not only did some people suffer true hardships such as hunger and loss of shelter, but many were also dismayed by a sharp decrease in the value of hoarded goods. Examples include the Long Depression (1873-1896), whose rise in collective dysphoria and amplified existential fear is thought to have led to the nationalism, militarism and colonialism of Bismarck and eventually the Great War (1914-1918). The Great Depression (1929-1939) similarly raised dysphoria through heightened existential fears based on economic decline, leading to fascism and eventually World War II.

After these wars many governments recognized this pattern of destruction following economic collapse and instituted programs to allay the rise in existential fear from precipitous economic decline, paid for through taxation and redistribution of funds to those less successful at hoarding, particularly the more vulnerable elderly, the ill and the unemployed without hoarded reserves. This accounts for more stable governance through economic vicissitudes even in the face of a slow rise in dysphoria due to an equally modest decline in mobility.

The Financial Crisis of 2007-2008 has not yet led to war, and perhaps never will because of social programs, but its consequences remain unfinished, with fascism and hegemony rising and the loss of the multi-national cooperation whose absence, before social programs existed, typically led to war. The present loss of social cohesion is best explained not by economic factors amplifying the existential fear component of dysphoria, but rather by a sharp increase in dysphoria from lowered endorphins as mobility declined with the introduction of digital technology at work, at school, while socializing and when shopping. Another, briefer spike in dysphoria with the Covid-19 pandemic has a similar cause, but in this case it is more fully accounted for by mobility decline from working and studying at home and increased on-line shopping.


Some elements of dysphoria can be mitigated without raising endorphin levels. Dysphoria-caused intolerance of innocuous low temperatures has been successfully dealt with from the earliest civilizations through clothing and shelter technologies, and these technologies have advanced to keep pace with deepening dysphoria as endorphins declined further with mobility attenuated by transportation technologies and, more recently, by information technologies. Dysphoria owed to lower pain thresholds has been partly moderated, at least in relation to its effect on sleep, by resilient bedding. Humans can moderate near-constant pain from innocuous stimuli with analgesic medications, but unfortunately no effective analgesic currently available is considered safe for continual use at the effective dose levels that would be required to relieve pain from innocuous stimuli caused by sub-optimal endorphin levels.

A few hours per week of aerobic activity divided into several sessions appears to be associated with optimal cardiovascular health, perhaps a minute lessening of dysphoria, and an equally insignificant decline in cognition due to endorphin release, so it seems sensible for everyone to consider. More intense aerobic activity can actually improve cognitive performance in humans disabled by dysphoria, because the endorphin rise from exercise can decrease distraction from amplified fear, anxiety and depression more than it impairs cognition, but cognition remains substantially impaired and dysphoria remains intense with or without this exercise. Prolonged intense exercise in an attempt to allay severe dysphoria or to reach wonder is hazardous in civilizations: leisure time for aerobic exercise is typically limited, a substantial endorphin rise requires intense exercise that is associated with frequent injuries, and the result would in any case be poor cognition.

The most troubling components of dysphoria are undoubtedly the existential and mortality fears, anxiety and depression that are released from sub-optimal endorphin levels. The pseudo-science of psychotherapy remains a popular approach to dealing with this, but it has never met modern scientific standards of effectiveness. Behavior therapies are equally wanting.

In 1957 imipramine became the first of a family of tricyclic anti-depressants (TCAs) to be introduced, followed by amitriptyline in 1961. They have been proven effective by current scientific standards in treating existential and mortality fear, anxiety and depression. Their secondary adverse effects are modest; they are safe for long-term use and in the first trimester of pregnancy, do not impair cognition and are not addictive. They were the mainstay of treatment of dysphoria until they were replaced by selective serotonin re-uptake inhibitors (SSRIs), beginning with the introduction of fluoxetine in 1987, and by similarly effective medications with different chemical structures soon thereafter. These newer anti-depressants are equally safe, are more effective than TCAs and produce even fewer adverse secondary effects. Safe and effective means of treating these components of dysphoria have therefore been available for more than 50 years, yet paradoxically excessive dysphoria continues to be a massive health problem, not because adequate treatment is unavailable but because this treatment is egregiously underused. This accounts for unnecessary distress from unmanaged dysphoria, frequent deaths from attempts to escape dysphoria through substances and suicide, and dismal social cohesion.


Moderate persistent dysphoria (MPD) is used here to indicate dysphoria below the level that makes it impossible to work. The word health derives from an Old English word meaning whole. The World Health Organization defined health in its constitution as a state of complete physical, mental and social well-being and not merely the absence of disease or infirmity; by both its original and its more current meaning, then, MPD is inconsistent with good health. The word disease comes from Middle English, where it meant lack of ease or inconvenience, which in turn has origins in Old French (OED). Its more current meaning is a disorder of structure or function in a human, animal, or plant, especially one that produces specific symptoms or that affects a specific location and is not simply a direct result of physical injury (OED). MPD is therefore both an indication of ill health and a disease; with a proven treatment available for half a century, why has it not been relegated to medical history? Because health professionals have been reluctant to consider MPD a signal of illness, owing to an absurd, illogical and destructive attachment to the pseudo-science of Freudian-based psychiatry.

Since it was first published in 1948, the International Statistical Classification of Diseases and Related Health Problems (ICD) has become the most widely used system of classification of all diseases, with the exception of mental disorders in North America, where the preferred classification system since its first publication in 1952 has been the Diagnostic and Statistical Manual of Mental Disorders (DSM). These classification systems represent a consensus view of the health professionals who deal with the disorders under consideration. They do not respect advances in neurochemistry: pharmaceuticals that alter brain neurotransmitter concentrations can treat a host of disorders that are distinct according to either classification system, so those disorders are probably variations of the same underlying disorder. In terms of psychiatric disorders, these classification systems have always accommodated the pseudoscience of psychotherapy. By both systems, excessive fear, anxiety and depression with no clear immediate external cause must reach a level above MPD to be considered a disease; universal MPD is thus treated as inherent to healthy humans, so no effort is directed at eliminating it, and those disturbed enough by it are typically directed for help to practitioners of the pseudo-sciences of psychotherapy and behavioral therapy.

Inherent is not a term from biological science. MPD would more properly be called an adaptation, meaning a genetically coded change arrived at through natural selection that improves survival in an environment. But MPD has never been shown to improve human survival; rather, it impairs survival by lowering fertility and by increasing the probability of using potentially lethal substances such as ethanol and opioids, of performing potentially lethal risk-taking behaviors, and of committing suicide. MPD is not normal in humans but a serious disease that should be treated by scientifically effective methods.

Darwin's On the Origin of Species was published in 1859, and as a medical student in Vienna Sigmund Freud was taught its precepts by Carl Claus, a prominent zoology professor. Freud seemed convinced enough of Darwin's precepts that he spent seven years performing scientific investigation, consisting of field work in biological nomenclature and research in physiology, before turning to mental disorders. Despite this scientific background, Freud reverted to vague, untestable notions, envisioning the emotional component of the human brain as organized in a hierarchy consisting of the id, ego and superego, and proposing that psychiatric pathology was the result of unresolved conflicts within and between these emotional structures. His method of accessing and resolving these conflicts relied heavily on free association, the ideas that came to mind while lying on a couch in the psychoanalyst's office, on analyzing dreams, and on retrieving memories. Dreams and memories are now known to be unreliable indicators of actual events.

Krafft-Ebing, a contemporary of Freud who approached mental processes from a biological perspective, dismissed Freud's notions as a scientific fairy tale. Sartre questioned Freud's use of recall of distant memory in Being and Nothingness: An Essay on Phenomenological Ontology, in which he gave a convincing objective critique of the imprecision of recalled memories.

Practitioners of psychoanalysis and its less intense cousin, psychoanalytically based psychotherapy, have historically resisted testing the effectiveness of these treatments by current scientific standards, for good reason: when examined objectively they have been shown to be ineffective. These treatment methods and the hypotheses on which they rest deserve the designation of pseudo-science. Despite these criticisms and invalidations, Freud's notions have become the zeitgeist of our times, perhaps originally owing to the titillating aspects of Freud's narratives during the Victorian era. Their continued acceptance since then may be explained by the lack of a more plausible alternative, but it can more likely be blamed on the army of practitioners of Freud's notions who promote themselves and quash alternative visions out of self-interest. Regardless of cause, the result has been an unwillingness to consider MPD a disease worth treating effectively. It has also delayed the advancement of important scientific developments that do not follow Freud's notions, such as mental resiliency.


Mental resiliency is defined in this report as the ability to return to pre-illness functioning after suffering a psychiatric disorder. Elizabeth Zetzel (1907-1970) was the first to document mental resiliency, through observations made early in her career while treating military personnel during World War II who were suffering from what is now referred to as post-traumatic stress disorder (PTSD). Since PTSD responds well to anti-depressant medications and includes the signs and symptoms of what is called a major depressive disorder, from a scientific perspective it probably represents a major depressive disorder with anxiety amplified by immediate external stressors. She observed that all of the patients she saw with PTSD presented similarly but recovered variably in relation to their life experiences prior to adulthood. Those who recovered fully from PTSD had experienced events in childhood now popularly considered mentally traumatic, such as sexual and physical abuse, neglect in terms of hunger and poverty, and one- or no-parent families, all resulting in loss of opportunity for social advancement, whereas the opposing group, who never approached their pre-morbid functioning, had experienced no similar stressful events and therefore had an exemplary childhood by popular belief. These observations refute Freud's hypotheses regarding the origins of mental disorders, which predict the opposite results. This resiliency perhaps represents a form of imprinting, as occurs in many animal species: relatively immutable neural connections that can be formed while the central nervous system is not fully developed, in this case linking events that pose existential threats beyond an individual's control with a return to less threatening conditions. Associations of this strength may be difficult to attain in mature individuals, which leaves those without this early learning subject to persistent fear of imminent return to a state of helplessness.

Zetzel never objectively validated her observations; however, recent data support them. Human racial classification systems have no scientific basis: they are arbitrary, do not indicate measurable genetic differences, and were developed ad hoc by those advancing preconceived theories driven by the desire to raise the status of the group to which they belonged and to dismiss others as inferior, be it Aryanism with the German National Socialists or white supremacy to justify slavery. White supremacy has become so intensely embedded in the culture of the United States and some other countries that it has produced massive social differences between those designated as black and white, with blacks disproportionately living with poorer education and poverty for generations since slavery, and with the poverty-related problems of poor physical and mental health due to unavailability of resources, fractured family structures, violence and criminality. The experiences of blacks prior to adulthood therefore typically include many of the early experiences that Zetzel suggested contribute to mental resiliency later in life. Reports from the United States Veterans Administration since the armed forces of the United States were racially integrated indicate that the incidence of PTSD does not vary with racial designation, which is consistent with its being a disorder of neurochemistry in groups that are not genetically distinct and therefore part of the same binomial distribution of disorders. However, the quality of recovery from PTSD varies considerably between these cohorts, with white current and former soldiers about four times more likely to commit suicide than their black counterparts. Relatively poor mental resiliency among whites compared to blacks has left them unable to recover completely from PTSD. Poor mental resiliency also satisfactorily explains why those designated as white in the civilian population suffer more from the opioid epidemic and from suicide.

Mental resiliency must have been an adaptation that improved survival among hunter-gatherers. The life of hunter-gatherers was so precarious that few children reached adulthood without experiencing events that gave them resiliency. Better mental resiliency would save lives and moderate the distress of contemporary humans, but achieving it may require reconsideration of current ideas regarding early life experiences. Are unpleasant, emotionally intense early experiences in childhood, such as bullying, expressions of prejudice and failures in achievement, actually destructive, or do they provide useful mental resiliency in adults? In attempting to protect a small number of extremely fragile, emotionally disturbed, high-risk children who are adversely affected by the above stressors, are we eliminating the possibility that all children in our society can achieve useful mental resiliency?


We live in the Age of Dysphoria. It commenced when humans began venturing into civilizations nearly 10,000 years ago. Those who risked civilizations instantly contracted the disease of dysphoria, together with an awakening of cognitive potential that had previously existed in the brains of hunter-gatherers but had to be suppressed for their survival as a species. There were few survivors of civilizations, because of the desire to temporarily escape dysphoria, mainly through the consumption of ethanol, with prevalent addiction to it, and because of the lack of resistance to the infectious diseases that became prevalent with the greater population density and fixed location inherent to civilizations. After 5,000 years of natural selection humans developed a modest resistance to these infectious diseases and addictions, and only then did this technology-based form of social organization have a chance to survive. Freed cognition proved more powerful in advancing human survival than the poor adaptation to this form of existence with which humans entered it.

Humans remain challenged by resistance to infectious disease, but the consequences of dysphoria have emerged as the greatest threat to human survival: dysphoric humans, historically prone to engaging in wars, have developed weapons that could instantly destroy all of humanity, and intense hoarding behavior has led to degrading the ecosystem humans depend upon for their survival to levels that choke off human survival.

Despite these threats to human survival, this report should be considered optimistic, because lowering dysphoria in humans is now possible. Doing so would reduce human suffering and, at least equally important, would elicit the social cohesion that is inherent to humans, thereby avoiding violent conflicts and moderating hoarding, and so advancing behaviors that improve the common good.

This report has unearthed both new and previously identified but under-appreciated data of practical significance, the most important of which is that currently available medications can safely and effectively deal with the most distressing aspects of dysphoria with no decline in cognition. Considerable effort was directed at exposing why management of MPD has been terribly inadequate, considering that it should have been dealt with decades ago when proven treatment became available.

This report provides insights into the causes of a number of diseases that have been considered of uncertain origin, such as IBS, diverticula formation and diverticulitis. It also identifies a novel digestive disorder that causes abdominal distress through excessively fast gastric emptying following the consumption of carbohydrates and cellulose, resulting in unabsorbed nutrients from the small intestine reaching the colon to be acted upon by gas-producing bacteria.

This report identifies a number of prevalent pseudo-truths advanced both by pseudoscientists and by scientifically but inadequately trained health professionals. Perhaps the most important example in the context of this report is the supposed usefulness of psychotherapy in all of its forms, and the viewing of human thought and interaction through the optics of this fiction. Other pseudo-truths include the importance of cellulose and the hazards of fat of animal origin in the human diet; the importance of maintaining relatively small class sizes in early education; the notion that wealth differs from hoarding; and the notion that sharing and social cohesion are possible in dysphoric humans.

This report explains why humans living in civilizations suffer pain from innocuous stimuli, experience emotional distress in general, and have impaired sleep through failure to achieve comfort when supine. It explains the need to have access to clothing incrementally differing in insulation value, and to change clothing frequently, to avoid discomfort at innocuous ambient temperatures.

This discussion begins with general issues relevant to this report, followed by insights emanating from the identification of the disease of dysphoria. The order of presentation in no way indicates the importance assigned to any element. Science has made magnificent advances by dividing, somewhat arbitrarily, a vast body of knowledge into more manageable units, followed by vertical exploration within each niche. The full potential of new discoveries made by such vertical mining can only be realized when they are integrated horizontally, which allows an understanding of how each new discovery influences the functioning organism. Most investigators now appear to lack the knowledge of other branches of scientific inquiry needed to perform this integration. This was supposed to have been remedied by multi-disciplinary research, but that has failed because the practice groups together solitudes without integrative vision. This report serves to inform about the importance of fostering a broad educational base and of aiding investigators who attempt to traverse other disciplines.

Pseudoscience, whether advanced by trained individuals carrying prestigious university degrees or by the insincere marketer of panaceas, is treated here as a threat to public safety that needs to be controlled. It is dismissed outright rather than coddled as traditional medicine. This is based on the belief that it is the duty of all responsible investigators and practitioners of health science to show the courage that is often required to confront and eliminate the elements of pseudoscience that remain embedded in medicine and to dispel pseudo-medicine. The existence of osteopathy, chiropractic, acupuncture, holistic medicine, alternative medicine, etc., is a danger. It can be seen from this report why nutritional science is fast becoming a pseudoscience by failing to integrate our current knowledge of the evolutionary cause of obesity, to acknowledge the adaptations humans retain favoring animal fat and protein ingestion, and to recognize the morphological limitations of the human colon with consumption of large amounts of dietary fiber. Likewise, parts of orthopedic medicine and physiatry remain a pseudoscience because they have been slow to discredit the pseudoscience of arch height in disease causation and to integrate the role of footwear in the cause of patellofemoral pain syndrome, plantar fasciitis, osteoarthritis of the metatarsal-phalangeal joints of the foot, and many other disabilities. Pseudoscience is so embedded within physical therapy that few physiotherapy practices have proven validity.

Legitimately trained practitioners in the health-related sciences often forget how many of the diseases they attempt to treat are of unknown cause, and how many of their treatment methods lack verification of effectiveness by contemporary scientific standards. Rather than admitting to patients that both the cause and the treatment they propose represent at best an educated guess, based either on their own uncontrolled observations or on those of others, they express far more certainty about disease cause and management than is justified. Not only is this inappropriate from an ethical perspective, but it contributes to pseudo-knowledge. More importantly, since few practitioners trouble themselves to research the anecdotal origins of many of their practices, those practices are accepted as proven truths by most professionals and disseminated by them to the public.

This gradually changed. Advancing cognition allowed technological innovation, including the introduction of analgesics and, more importantly, of effective antidepressant medication some 50 years ago, which is capable of reducing the most disturbing symptoms of dysphoria, namely excessive existential fear, mortality fear, anxiety and depression, without loss of cognition. Antidepressant medication nevertheless remains massively underused, owing to persistent pseudoscience that has become embedded in medical practice and popular culture.

Dysphoria seems to explain well the historical patterns that have appeared in the growing experiment called civilization. Understanding of dysphoria has come at an opportune time, when it is rising rapidly from attenuated mobility caused by the application of information technology and now by Covid-19, resulting in rapid loss of social cohesion as measured by the emergence of populist demagogues, the rise in opioid addiction and massive challenges to mental health services. This spike in dysphoria accounts for uncertainty over whether humans will attempt to save the ecosystem that they and future generations depend upon for their survival. The possibility of a happy ending to the harm caused by this disease was foretold by the discovery of its cause.