
Ghetto: an alternative etymology

The etymology of the word ghetto has long been debated, and several solutions have been offered. The very first use of the word has been traced back to 1516, to the Jewish area of Venice. There, residents will proudly tell you that ghèto meant 'foundry'. The problem is that it seems strange to name an area after a single foundry and not 'foundries', because there must have been more than one.

Other suggestions are that it originates from the Yiddish gehektes ('enclosed'), from the Italian borghetto ('little town') or from the Old French guect ('guard'). All these suggestions eventually fail, mostly on phonetic grounds.
Southern Germany isn't far removed from northern Italy. There, we find the term Jüdische Gass(e) or 'Jewish Street'. Yiddish gas means 'street'. Between German and English one often finds a switch from 'ss' to 't' (Strasse to 'street', Wasser to 'water', Scheisse to 'shit', and more). Gasse is therefore also rather similar to English 'gate' and Dutch gat ('hole', 'opening'). The trail seems to turn cold here, as the Etymology Dictionary claims that the word is 'of unknown origin'.

But in the Dutch language we find several words that describe a steeg ('alley'). In the southernmost province of Limburg a steeg is called a gats. That changes to gas in the city of Nijmegen, and finally to steiger in Enkhuizen in the north. Both steeg ('alley') and steiger ('jetty') are related to stijgen, 'to rise up'.

In English we discover that 'jetty' ('pier') also once had the meaning of 'a passage between two houses' in central and northern England.
And there we have it: the word ghetto simply means '(a series of small) alleys', in the sense of a medina quarter, the distinct city section found in a number of North African and Maltese medieval cities. A medina is typically walled, with many narrow and maze-like streets.

During my research for this article I found a text by Anatoly Liberman that seemed to have traveled much the same route as I did.

On President Trump's Dementia

President Donald Trump (1946) is 71 years old. Everyone experiences at least some degree of cognitive and motor decline over time, and almost 10 percent of people over 65 now have dementia.

Trump exhibits some worrisome symptoms that fall into three main categories: problems with language and executive function; problems with social cognition and behaviour; and problems with memory, attention and concentration. They raise concern about a neurocognitive disease process or cognitive decline, which can result in dementia.

So, when Trump went in for his annual presidential physical exam at Walter Reed National Military Medical Center, he was also given a test to see if he is plagued by early signs of dementia. He came out with 'no issues whatsoever' with his mental ability. He got a perfect score on a 'gold-standard dementia test'. That standard is the Montreal Cognitive Assessment (MOCA).
The Montreal Cognitive Assessment is a cognitive test, meaning that it assesses memory, executive function, spatial skills and calculation - so it's mostly cognition that is assessed, not the rest of our mental abilities.

The MOCA test is a 10-minute routine screening test and, unless the patient is indeed displaying signs of dementia or Alzheimer's disease, incredibly basic. In other words: Trump passed a test designed for patients who are in advanced stages of dementia.
Passing that very basic test was reason for Donald Trump to boast that he was a 'stable genius'. Which means that he's neither stable nor a genius.

One typhoon away from disaster

The US has poisoned an entire region of the Pacific Ocean with nuclear weapons tests. Beginning in 1977, more than 8,000 people worked to clean up the Marshall Islands, shifting 80,000 cubic meters of contaminated soil and debris into a blast crater. This 10-meter-deep crater on Enewetak Atoll is called the Runit Dome, also known as 'Cactus Dome' or - locally - 'The Tomb'.
The dome spans 100 meters across, its almost 50-centimeter-thick concrete cap covering radioactive debris from 12 years of US government nuclear tests.

The costs of nuclear testing have been devastating for surrounding communities in every country that conducted them. Enewetak Atoll is a large coral atoll of 40 islands in the Pacific Ocean, where the US detonated 30 megatons of weapons between 1948 and 1958 – equivalent to 2,000 Hiroshima blasts, the Hiroshima bomb having yielded roughly 15 kilotons. In total, 67 nuclear bombs were detonated on Enewetak Atoll and Bikini Atoll of the Marshall Islands in the Pacific Ocean.


The massive explosions created cracks in the coral. This allowed the tides of the ocean to pump water into the dome and then pump radioactive water out. Now, the dome's concrete structure is rapidly deteriorating. Since global warming will result in a continued rise in sea levels, the concrete dome may soon be exposed to the tides. The dome could be just one typhoon away from a breach.

Rise of the Planet of the Rats in the US

While President Trump still denies climate change, warmer weather is fueling a rodent surge in his own country. It’s no surprise that rats thrive in cities, where humans provide an abundance of food and shelter. But experts now agree that the weather is playing a role in these recent increases. Extreme summer heat and this past winter’s mild temperatures have created urban rat heavens.
Breeding usually slows down during the winter months. But with shorter, warmer winters becoming more common—2016 was America's warmest winter on record[1]—rats are experiencing a baby boom. The rats have taken advantage of these conditions to squeeze out one more litter, resulting in exploding numbers.

One more litter makes a serious difference, because a rat population boom is not only a nuisance, but a public health and economic crisis. Rats breed like rabbits: two rats in an ideal environment can turn into 482 million rats over a period of three years. Urban rats caused $19 billion worth of economic damage in the year 2000, partially because they eat away at buildings and other infrastructure. Imagine how much they're costing now.
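
To see how two rats can compound toward numbers of that magnitude, here is a minimal back-of-the-envelope sketch. Every parameter is an illustrative assumption (a litter of 12 pups every two months, half female, breeding from three months old, zero mortality), not the unstated figures behind the 482 million claim:

```python
# Back-of-the-envelope rat population model. All parameters are
# illustrative assumptions, not measured values.
LITTER = 12   # pups per litter (assumption)
CYCLE = 2     # months between litters (assumption)
MATURE = 3    # age in months at first litter (assumption)

def population_after(months: int) -> int:
    females = [0] * (months + 1)  # females[m] = females born in month m
    females[0] = 1                # start with one breeding pair
    males = 1
    for m in range(1, months + 1):
        born = 0
        for birth_month in range(m):
            age = m - birth_month
            # a female litters at ages MATURE, MATURE+CYCLE, ...
            if age >= MATURE and (age - MATURE) % CYCLE == 0:
                born += females[birth_month] * LITTER
        females[m] = born // 2    # assume half of each litter is female
        males += born - born // 2
    return sum(females) + males

for years in (1, 2, 3):
    print(f"after {years} year(s): {population_after(12 * years):,} rats")
```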

What’s more, every new litter increases the risk of a rodent-borne disease. A 2014 study showed that New York City’s rats carry diseases like E. coli, salmonella and Seoul hantavirus[2]. Rats also carry the bacteria that cause leptospirosis, which recently killed one person and sickened two in the Bronx[3].

No one really knows how many rats there are. Not in New York City, nor Washington, DC, nor Chicago—all three of which rank among the most rodent-infested cities in the U.S.
Clearly, the coming 'ratpocalypse' is threatening the health of millions across the US, is costing billions of dollars, and is being fueled by a global climate change that the US itself primarily created.

[1] Holthaus: It’s Official: This Was America’s Warmest Winter on Record in The Slatest – 08 March 2016. See here.
[2] New York rats carry some pretty scary diseases in Washington Post – 21 October 2014. See here.
[3] This rare disease spreads through contact with rat urine. In New York, it has left 1 dead in Washington Post – 15 February 2017. See here.

The domestication of turkeys

Researchers studied the remains of 55 turkeys (Meleagris gallopavo), dating from between 300 BC and 1500 AD, from various parts of pre-Columbian Mesoamerica[1]. They discovered that turkeys weren't just a prized food source, but were also culturally significant for sacrifices and ritual practices.
The team measured the carbon isotope ratios in the turkey bones to reconstruct their diets. They found that the turkeys were gobbling crops cultivated by humans such as corn in increasing amounts, particularly in the centuries leading up to Spanish exploration, implying more intensive farming of the birds.

Interestingly, the gradual intensification of turkey farming does not directly correlate with an increase in human population size, a link you would expect to see if turkeys were reared simply as a source of nutrition.

Lead author, Dr Aurélie Manin, said: “Turkey bones are rarely found in domestic refuse in Mesoamerica and most of the turkeys we studied had not been eaten – some were found buried in temples and human graves, perhaps as companions for the afterlife. This fits with what we know about the iconography of the period, where we see turkeys depicted as gods and appearing as symbols in the calendar.

“The archaeological evidence suggests that meat from deer and rabbit was a more popular meal choice for people in pre-Columbian societies. Turkeys are likely to have also been kept for their increasingly important symbolic and cultural role”.
Dr Camilla Speller said: “Even though humans in this part of the world had been practising agriculture for around 10,000 years, the turkey was the first animal, other than the dog, that people in Mesoamerica started to take under their control. Turkeys would have been easy to domesticate, as they would have been drawn to human settlements in search of scraps.”

Some of the remains the researchers analysed were from a cousin of the common turkey – the brightly plumed Ocellated turkey (Meleagris ocellata). The diets of these more ornate birds remained largely composed of wild plants and insects, suggesting that they were left to roam free and never really domesticated.

By analysing the DNA of the birds, the researchers were also able to confirm that modern European turkeys descend from Mexican ancestors.

Manin et al. Diversity of management strategies in Mesoamerican turkeys: archaeological, isotopic and genetic evidence in Royal Society Open Science - 2018

The evolution of the word 'tea' (or 'cha')

If you look around the world, you might notice that there are two ways to designate 'tea'. One consists of variations of the English term tea, such as thee in Dutch. The other is some variation of 'cha', like chay in Hindi.

Both versions come from China. The words that sound like 'cha' spread across land, along the ancient Silk Road. The 'tea'-like forms spread over water, carried by Dutch traders of the VOC, the Dutch East India Company and the very first to bring tea to Europe.
The term cha (茶) is 'Sinitic', meaning it is common to many varieties of Chinese. It began in China and made its way through central Asia, eventually becoming chay (چای) in Persian. That is no doubt due to the trade routes of the Silk Road, along which, according to a recent discovery, tea was already traded over 2,000 years ago. This form spread beyond Persia, becoming chay in Urdu, shay in Arabic and chay in Russian. It even made its way to sub-Saharan Africa, where it became chai in Swahili. The Japanese and Korean terms for tea are also based on the Chinese cha, though those languages likely adopted the word even before its westward spread into Persian.

The Chinese character for tea, 茶, is pronounced differently by different varieties of Chinese, though it is written the same in them all. In today’s Mandarin, it is chá. But in the Min Nan variety of Chinese, spoken in the coastal province of Fujian, the character is pronounced te.

The te form, used in coastal Chinese languages, spread to Europe via the Dutch, who became the primary traders of tea between Europe and Asia in the 17th century. The main Dutch ports in east Asia were in Fujian and Taiwan, both places where people used the te pronunciation. The Dutch East India Company's expansive tea importation into Europe gave us the Dutch thee, the French thé, the German Tee and the English tea.
Yet the Dutch were not the first to Asia. That honour belongs to the Portuguese. The Portuguese traded not through Fujian but through Macao, where chá is used. That's why, on the map above, Portugal is a pink anomaly in a sea of blue.

A few languages have their own way of talking about tea. These languages are generally in places where tea grows naturally, which led locals to develop their own way to refer to it. In Burmese, for example, tea leaves are lakphak.

The map demonstrates two different eras of globalization in action: the millennia-old overland spread of goods and ideas westward from ancient China and the 400-year-old influence of Asian culture on the seafaring Europeans of the age of exploration.

Whisky kills bacteria in ice

Italian researchers studied 60 samples of ice from domestic, restaurant and industrial producers. Across those samples they found 52 different strains of bacteria, including Pseudomonas, Staphylococcus, Bacillus and Acinetobacter, some of them 'agents of human infection' indicating environmental contamination[1].
The researchers then took samples of contaminated ice and, to simulate a bar environment, used this ice to serve a range of drinks, including vodka, whisky, peach tea, tonic water and cola.

For each drink, they found that the population of bacteria in the sample was reduced, and they cited the level of alcohol, the drink's pH and the amount of carbon dioxide in each serving as reasons for the reduction.

However, their results also showed that the ice sample served with whisky saw the greatest reduction in bacteria – none of the bacterial strains on the ice cubes survived after they were added to the whisky. The researchers noted that this was likely because whisky is somewhat more acidic than vodka. They speculated that the more acidic a drink is, the less likely bacteria are to survive.

The question remains, however, why in the world you would add ice to your whisky or any other alcoholic drink.

[1] Settanni et al: Presence of pathogenic bacteria in ice cubes and evaluation of their survival in different systems in Annals of Microbiology – 2017

Why Painkillers are Killing America

Here, we wrote about the epidemic of opioid-related deaths in the USA. But just stating the problem does not explain it. So, why do Americans flock to painkillers in the first place?

For several decades now the American Midwest has suffered from unprecedented economic decay courtesy of a persistent outsourcing of manufacturing jobs in the automotive and steel industries, among others.
Yes, the stock markets are reaching new highs every day, while industrial production lags. Normally one would expect that both numbers 'travel' in the same direction. Not now.

Which means that the American working middle class is experiencing the slow destruction of its way of life. Wages have stagnated for the last 50 years, and for people without a university degree median wages have actually been going down. The model whereby American capitalism really delivered for people who were not particularly well educated seems to be broken.
The century-long decline in mortality rates that had gone on since the beginning of the 20th century had just stopped, and rates were starting to rise. For mortality rates to rise instead of fall is extremely rare. It typically takes a war or an epidemic for death rates to jump.
It's not far-fetched to think that these deaths are tied to 'deaths of despair' from alcohol, suicide and opioids. Because opioids are often taken together with other pain medication, one can expect that accidental overdoses are rife.

The domestication of chickens

Chickens and humans have conquered the world together. Where humans went, chickens went too. Chickens (Gallus gallus domesticus) were first domesticated some 8,000 years ago from a hybrid of the wild red junglefowl (Gallus gallus) and the gray junglefowl (Gallus sonneratii).
[Red Junglefowl]
Domesticated chickens are less active, have fewer social interactions, are less aggressive to would-be predators and are less likely to go looking for foreign food sources than their wild ancestors. Chickens now have increased adult body weight and simplified plumage, while their egg production starts earlier, is more frequent and produces larger eggs.
[Grey junglefowl]
Research suggests there may have been multiple origins in distinct areas of South and Southeast Asia. The earliest archaeological evidence to date is from China, about 5400 BC, though a few studies support an even earlier domestication of chickens in northern and central China[1]. Researchers think that chickens were a rare occurrence in northern and central China, and thus probably an import from southern China or Southeast Asia, where the evidence of domestication is stronger.

The red junglefowl and gray junglefowl also live in India. Domestication of chickens appears in the Indus Valley around 2000 BC[2]. From there the chicken spread into Europe and Africa. Chickens arrived in the Middle East starting with Iran by 3900 BC, followed by Turkey and Syria (2400-2000 BC) and Jordan by 1200 BC.

The earliest firm evidence for chickens in east Africa consists of illustrations from several sites in Egypt's New Kingdom. Chickens were introduced into western Africa multiple times, arriving at Iron Age sites in Mali, Burkina Faso and Ghana by the mid-first millennium AD. Chickens arrived in the southern Levant about 2500 BC and reached Iberia circa 2000 BC.

Chickens were brought to the Polynesian islands from Southeast Asia by Pacific Ocean sailors about 3,300 years ago. While it was previously assumed that they had been brought to the Americas by the Spanish conquistadors, pre-Columbian chickens have been identified at several sites throughout the Americas, most notably in Chile and dated at about 1350 AD.

But there's a problem: some archaeologists argue that the presence of haplogroup E in chickens from Rapa Nui (Easter Island) and coastal Chile must have come from chickens that travelled the Pacific with the Polynesians[3]. Others claim that the presence of haplogroup E in chickens from Rapa Nui is from contamination[4]. If the latter is true, then chickens must have travelled the Atlantic with Columbus.

[1] Xiang et al: Early Holocene chicken domestication in northern China in PNAS – 2014
[2] Kanginakudru et al: Genetic evidence from Indian red jungle fowl corroborates multiple domestication of modern day chicken in BMC Evolutionary Biology – 2008
[3] Storey et al: Polynesian chickens in the New World: a detailed application of a commensal approach in Archaeology in Oceania – 2013
[4] Thomson et al: Using ancient DNA to study the origins and dispersal of ancestral Polynesian chickens across the Pacific in PNAS – 2014.

Ancient solar eclipse dates reign of pharaoh

The Bible speaks of a peculiar natural event. In the book of Joshua (10:13) it says 'And the sun stood still, and the moon stayed, until the people had avenged themselves upon their enemies. ... So the sun stood still in the midst of heaven, and hasted not to go down about a whole day'.

The visionary Immanuel Velikovsky (1895-1979) thought this was proof that the proto-planet Mars came very close to the earth[1]. Now, scientists have found another explanation for that mythical story[2].

Going back to the original Hebrew text, they determined that an alternative meaning could be that the sun and moon just stopped doing what they normally do: they stopped shining. In this context, the Hebrew words could be referring to a solar eclipse, when the moon passes between the earth and the sun, and the sun appears to stop shining. This interpretation is supported by the fact that the Hebrew word translated as 'stand still' has the same root as a Babylonian word used in ancient astronomical texts to describe eclipses.
An ancient Egyptian text, dated to 1205 BC and chiselled on the Merneptah Stele, recounts the military conquests of the pharaoh Merneptah, son of the fabled Ramesses the Great. The inscription mentions a people called 'Israel' that is said to have been wiped out by the conquering pharaoh. The stele mentions the same event as the text in the Bible.

Earlier historians have used these two texts to try to date the possible eclipse, but were unsuccessful, as they were only looking at total eclipses. What they failed to understand was that in the ancient world the same word was used for both total and annular eclipses.
[Path of Solar Eclipse in 1207 BC]
The researchers developed a computer code that takes into account variations in the Earth's rotation over time. From their calculations, they determined that the only annular eclipse visible from Canaan between 1500 and 1050 BC occurred on 30 October 1207 BC, in the afternoon. This enabled the researchers to date the reigns of Ramesses the Great and his son Merneptah to within a year: Merneptah's reign began in 1210 or 1209 BC. As it is known from Egyptian texts how long he and his father reigned, it would mean that Ramesses the Great reigned from 1276 to 1210 BC, with a precision of plus or minus one year - the most accurate dates available. The precise dates of the pharaohs have been subject to some uncertainty among Egyptologists, but this new calculation, if accepted, could lead to an adjustment in the dates of several of their reigns and enable us to date them precisely.

I suppose you're now wondering why the column mentions 1207 BC and the image -1206? This is because the image was generated using astronomy software, and the convention in astronomy is that there is a year zero: astronomical year 0 equals 1 BC, so 1207 BC corresponds to astronomical year -1206. In the calendar used by historians there is no year zero between 1 BC and AD 1, hence in that calendar the date is 30 October 1207 BC. So both dates are correct[3].
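
For those checking the arithmetic, the conversion between the historians' count and astronomical year numbering is a one-liner; a minimal sketch (the helper name is ours):

```python
# Convert a historical BC year to astronomical year numbering, which
# inserts a year 0 for 1 BC: 1 BC -> 0, 2 BC -> -1, ..., 1207 BC -> -1206.
def bc_to_astronomical(year_bc: int) -> int:
    return 1 - year_bc

assert bc_to_astronomical(1207) == -1206
assert bc_to_astronomical(1) == 0
```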

[1] Velikovsky: Worlds in Collision – 1950 
[2] Humphreys, Waddington: Solar eclipse of 1207 BC helps to date pharaohs in News and Reviews Astronomy and Geophysics - 2017. See here.
[3] Personal communication with Colin Humphreys (9 November 2017)

Smoking and Stunting

Stunting, or being too short for one’s age, is defined as a height that is more than two standard deviations below the World Health Organization (WHO) Child Growth Standards median. Factors that contribute to stunted growth and development include – but are not limited to – poor maternal health and nutrition, inadequate infant and young child feeding practices, and infection. Stunting should be made a development indicator.
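
That 'more than two standard deviations below the median' is simply a z-score threshold. Below is a minimal sketch of the calculation, with made-up reference values rather than real WHO table entries (the actual standards use age- and sex-specific tables and a slightly more involved method):

```python
# Height-for-age z-score: how many standard deviations a child's height
# lies from the reference median. z < -2 counts as stunted.
def height_for_age_z(height_cm: float, median_cm: float, sd_cm: float) -> float:
    return (height_cm - median_cm) / sd_cm

# Illustrative numbers only: an 88 cm child against an assumed
# median of 96 cm and an assumed SD of 3.5 cm.
z = height_for_age_z(88.0, 96.0, 3.5)
print(f"z = {z:.2f} -> stunted: {z < -2}")   # z = -2.29 -> stunted: True
```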

Here we explained that stunting can also be the result of exploitation and here we found that voluntary restrictions of the intake of food, such as in anorexia, might also result in stunted growth.
So, are there any other, less obvious factors that can result in stunted growth? There is at least one.

If you start smoking at a very young age, as happens so often in developing countries, you might experience stunting. In other words, you might not achieve your maximum height. Stunted growth is, of course, only an issue for those still growing.
A scientific study of 451 boys and 478 girls showed that a boy who smokes ten cigarettes a day (or more) from age 12 to 17 will be about 2.5 centimeters shorter than a boy who does not smoke at all[1].

Strangely, in girls, cigarette use was not associated with any height or weight loss. Cigarette use appears to decrease height and body mass index in boys only. Young girls might be less likely to take up smoking if they understood that cigarette use may not be associated with reduced weight in adolescent females.

Part 1 'Stunting: Malnutrition or Exploitation?' can be read here.
Part 2 'Stunting and Anorexia' can be read here.

[1] O'Loughlin et al: Does cigarette use influence adiposity or height in adolescence? in Annals of Epidemiology – 2008

The Evolution of Melons

Cucurbitaceae are a plant family consisting of about 965 species. Well known genera are Cucurbita (squash, pumpkin, zucchini, some gourds), Lagenaria (calabash), Citrullus (watermelon), Cucumis (cucumber, various melons) and Luffa (luffa). This great diversity of related species wouldn't have been possible if it weren't for an ancient event in plant evolution.
About 90 to 102 million years ago, the genome of a single melon-like fruit copied itself. Over time, this one ancestor became a whole family of plants with different colors, shapes, sizes, defenses and flavors, such as pumpkins, squash, watermelons and cucumbers, according to a recently published paper[1].

The researchers compared the genomes and evolutionary trees of a number of plants including cucumbers, melons and gourds. Millions of years of environmental changes allowed the fruits to lose genes over time and tailor their own codes to become what we know them as today.

After each major divergent event, genes were deleted, chromosomes were rearranged and new genetic patterns were created. Knowing more about which genes survived to do different things in each plant means scientists can now get closer to creating even more variations of these fruits.

[1] Wang et al: An overlooked paleo-tetraploidization in Cucurbitaceae in Molecular Biology and Evolution - 2017

[Review] 'Chaos' by Patricia Cornwell

I've read a few bad books in my life, some were even pretty bad, but 'Chaos' by Patricia Cornwell must certainly rank as the worst thriller I've read in a decade. Yes, Patricia Cornwell can write words that constitute a sentence, and she produces many sentences. Far too many, in fact, and I wonder how she bribed her editor, because 100 pages of drivel could easily have been cut from 'Chaos'.

Patricia Cornwell seems to enjoy the wealth she has accumulated, but she does so as a nouveau riche, a person who has recently become rich and needs to show the world just how knowledgeable she is about expensive food, wines and cars. And the book drags on and on about it (Kay Scarpetta wonders if husband Bryce may arrive in his Porsche Cayenne Turbo S or his Audi RS 7).

I wondered if Patricia Cornwell just started writing this thriller without any clue of a plot. Then, halfway in, she ran into difficulties. I will not refrain from warning the reader about *SPOILERS* and just mention that she uses a drone to kill people. A drone using electric wires that whizz down to the victim to electrocute him (or her). Then, perhaps realising that the electric current could not possibly be powerful enough to kill the intended victim (Ohm's law), she 'invents' the idea that panguite, a rare mineral found only in minute traces in meteorites, can supply that power. She even claims that the mineral involves nano-technology. It doesn't: the amounts of panguite in some meteorites are so small that they are measured in nanometers (nm), which means you would need tons of meteorites to get just a bit of panguite. Sloppy writing at its best, an uneducated woman at its worst.

What we have then is a drone targeting people who seem not to have noticed the sound of a strange apparatus above their heads, nor that wires were coming down. You would have thought that the intended victims would take evasive action, but no, they don't. So, we have victims that seem electrocuted by lightning without any thunder.

Like I said: 'Chaos' is easily one of the worst books I have ever read. Do not – I repeat NOT – buy this book. To be honest, it's the first time I ever reviewed a book with this sad result.

Earth's second sun

Earth already has a second moon, but a second sun is impossible. Right? Not quite.

In the constellation of Orion, Betelgeuse forms the left-hand shoulder of the warrior (see the sword dangling from his belt). It is a red giant, a semi-regular variable star in the latter stages of its life, whose apparent magnitude varies between 0.0 and 1.3. Which is a lot.
As Betelgeuse is using up the last of its fuel, it will become increasingly unstable over time and will eventually collapse under its own gravity. Then Betelgeuse will become a supernova. Supernovae can outshine the whole galaxy they live in. Supernovae have a 'rising time' of about a week, when the star is increasing in brightness. It stays at its peak brightness for several days and then slowly declines into obscurity over a period of a couple of weeks. At its point of maximum brightness it can compete with the brightness of a full moon (magnitude -11). Because Betelgeuse is a star, it will become a second sun. Our second sun.
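
A rough sanity check on that full-moon comparison, using the standard distance modulus and an assumed peak absolute magnitude of about -16.5 for a core-collapse supernova (a textbook ballpark figure, not one from this article):

```python
import math

# Distance modulus: apparent magnitude m = M + 5*log10(d / 10 pc).
M_PEAK = -16.5         # assumed peak absolute magnitude of the supernova
D_PC = 640 / 3.2616    # 640 light years expressed in parsecs (~196 pc)

m = M_PEAK + 5 * math.log10(D_PC / 10)
print(f"apparent magnitude ~ {m:.1f}")  # ~ -10, in the full-moon ballpark
```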

Will we ever live to see such a spectacle in the heavens? Scientists have calculated that Betelgeuse could implode and explode at any moment within the next million years. And since Betelgeuse lies a mere 640 light years from earth, it might already have gone supernova without us knowing: the light of such an explosion would take 640 years to reach us.

So, keep watching the southern sky (if you live in the northern hemisphere).

The White Horse of the Sun

Carved into the chalk of a hillside in southern England, the Uffington White Horse stretches 110 meters from head to tail. It is the only prehistoric geoglyph - a large-scale design created using elements of the natural landscape - known in Europe. Its closest parallels are the Nazca lines in Peru.
Excavations in the 1990s yielded dates that showed it was created during the Late Bronze Age or the Iron Age, sometime between 1380 and 550 BC.

Archeologist Joshua Pollard usually works on sites dating to the Neolithic, a period when people erected large monuments, such as Stonehenge, that were for the most part aligned with astronomical events. That experience led him to wonder if the Uffington Horse could have been designed along similar lines, and he investigated how the geoglyph was positioned relative to celestial bodies[1]. He found that when observed from a hill opposite, in midwinter, the sun rises behind the horse, and as the day progresses, seems to gain on the horse and finally pass it. From the same vantage point, at all times of the year, the horse appears to be galloping along the ridge in a westerly direction, toward the sunset.

Both the form and the setting of the site led Pollard to conclude that the White Horse was originally created as a depiction of a 'solar horse', a creature found in the mythology of many ancient Indo-European cultures. These people believed that the sun either rode a horse or was drawn by one in a chariot across the sky. Depictions of horses drawing this so-called solar chariot have been unearthed in Scandinavia and Celtic coins often show horses associated with the sun.
[Scandinavian Sun Horses]
"The White Horse is depicted as a horse in motion, and the people who created it must have thought that it was responsible for the sun’s movement across the sky," says Pollard. He posits that the geoglyph was not a static symbol, but an animated creature on the landscape, one that connected ancient Britons with the sun.

Over time, though its original purpose was lost, local people maintained a connection with the White Horse that ensured its continued existence. If it weren't maintained, the White Horse would be overgrown and would disappear within 20 years. Each summer, a few hundred local volunteers weed the White Horse and then crush fresh chalk on top of it so that it keeps the same brilliant white appearance it has had for 3,000 years. The site, as it must have done throughout the millennia, continues to be meaningful to the people around it.

[1] Pollard: The Uffington White Horse geoglyph as sun-horse in Antiquity - 2016

[Review] 'Classical Traditions in Science Fiction'

'Classical Traditions in Science Fiction' (edited by Brett M. Rogers and Benjamin Eldon Stevens) is a book that contains 14 essays by scholars of the classics, Greek, English, and philosophy. The essays explore connections between Jules Verne and the Greek satirist Lucian; Dune and the Iliad; Alien Resurrection and the Odyssey; antiquity and Western identity in Battlestar Galactica; the Iliad and Dan Simmons’ Ilium; The Hunger Games and the Roman Empire; and the graphic novel Pax Romana, which explores the transition from antiquity to a Christian world.

The term 'science fiction' is inherently vague, and finding an all-encompassing definition proves surprisingly elusive. Adam Roberts' dictum holds that science fiction is 'premised on a material, instrumental version of the cosmos', in contrast to its close ally, fantasy, which concerns 'magic, the supernatural, the spiritual'. Alternately, Susan Sontag summed up the whole genre as consisting of the 'imagination of disaster', a fascination with the dread of irresistible destruction.

At first, science fiction kept itself busy with 'novel ideas' about a possible future, in line with Adam Roberts' dictum. Yet the next wave of SF consisted of visions of a drab and depressing future, as summed up by Susan Sontag. During the Victorian era, the world was changing fast - for some, too fast. When extrapolated, the rapid industrialisation, with its smog and crumbling institutions, could herald an apocalypse in the future.

To be literature, one school of thought goes, a science fiction novel must be depressing, giving an account of hubris and failure, such as George Orwell's 1984. Some consider Mary Shelley's Frankenstein the first science fiction novel: the optimism that drives scientific advance is thwarted by that unreliable factor, the human element.

Jesse Weiner’s essay “Lucretius, Lucian, and Mary Shelley’s Frankenstein” gives a thorough account of the book’s debate with the ancients, its later influence, and Shelley’s ambivalence about scientific progress.

But Frankenstein is subtitled The Modern Prometheus. Shelley drew upon the myth of Prometheus, who steals fire from the gods and is condemned to eternal torment. Dr. Frankenstein is seeking higher human knowledge, the secret to the spark of life, and pays dearly for it.

'Classical Traditions in Science Fiction' contains a fascinating collection of essays that gives readers a new understanding of the place of science fiction within the Western literary tradition. Science fiction certainly traces its history back to classical Greek literature. Well worth your time.

Diet Soda Linked to Weight Gain, Not Weight Loss?

Olive Oil Times, formerly a site with a good reputation, ran an article with the heading 'Diet Soda Linked to Weight Gain, Not Weight Loss'.
The article used data from recent Canadian research that claimed that 'Evidence from RCTs does not clearly support the intended benefits of nonnutritive sweeteners for weight management, and observational data suggest that routine intake of nonnutritive sweeteners may be associated with increased BMI and cardiometabolic risk'[1].

Well, that was a strange outcome, because a previous study that used the same data reached a different conclusion: 'Overall, the balance of evidence indicates that use of LES (Low-Energy Sweeteners) in place of sugar, in children and adults, leads to reduced EI (Energy Intake) and BW (Body Weight), and possibly also when compared with water'[2].


What I think is that the effects of all interventions to combat obesity are limited and inconsistent. There are many variables, and no study will ever be able to control for all of them. People might drink diet soda, but still eat too much fast food, nullifying the effect of the zero calories of the diet soda.

So, if you are trying to lose weight, replacing sugary drinks with low-calorie drinks can be a helpful part of your overall strategy. It will not be a panacea or make weight loss easy. See here.

The Olive Oil Times made things even worse by asking a naturopath (quack alert!) for her opinion. 'Carolyn Dean, medical doctor and naturopath, didn’t mince words in giving her opinion about the research. “This study, which exposes the false claims of synthetic sweeteners, should have the industry quaking in its boots”'.

As Wikipedia rightly warns: 'Naturopathy or naturopathic medicine is a form of pseudoscientific, alternative medicine.' Poor Olive Oil Times. I hope they didn't pay the writer of that article, because it did more harm than good.

[1] Azad et al: Nonnutritive sweeteners and cardiometabolic health: a systematic review and meta-analysis of randomized controlled trials and prospective cohort studies in Canadian Medical Association Journal – 2017
[2] Rogers et al: Does low-energy sweetener consumption affect energy intake and body weight? A systematic review, including meta-analyses, of the evidence from human and animal studies in International Journal of Obesity – 2015

Human Brain Is Still Evolving

Two genes involved in determining the size of the human brain have undergone substantial evolution in the last 60,000 years, suggesting that the brain is still undergoing rapid evolution[1].
New versions of the genes - or alleles - appear to have spread because they enhanced the brain's size and function in some way. Even if these new alleles improve brain function, that would not necessarily mean that the populations where they are common have any brain-related advantage over those where they are rare. Different populations often take advantage of different alleles, which occur at random, to respond to the same evolutionary pressure, as has happened in the emergence of genetic defenses against malaria, which are somewhat different in Mediterranean and African populations.

The researchers studied two genes, Microcephalin (MCPH1) and ASPM (Abnormal Spindle-like Microcephaly Associated), that came to light because they are disabled in microcephaly ('small brain'), now better known because Zika virus causes it[2].

Lahn and his colleagues have studied the worldwide distribution of the alleles by decoding the DNA of the two genes in many different populations. They report that with microcephalin, a new allele arose ~37,000 years ago (between 60,000 and 14,000 years ago)[3]. Some 70 percent or more of people in most European and East Asian populations carry this allele of the gene, as do 100 percent of those in three South American Indian populations, but the allele is much rarer in most sub-Saharan Africans.

With the other gene, ASPM, a new allele emerged ~5,800 years ago (between 14,100 and 500 years ago). The allele has attained a frequency of about 50 percent in populations of the Middle East and Europe, is less common in East Asia, and is found at low frequency in some sub-Saharan African peoples. They note that the ASPM allele emerged at about the same time as the spread of agriculture in the Middle East 10,000 years ago and the emergence of the civilizations of the Middle East some 5,000 years ago, but say any connection is not yet clear.
The Microcephalin and ASPM genes are known to be involved in determining brain size and so far have no other known function. They are known to have been under strong selective pressure as brain size increased from monkeys to man, and the chances seem "pretty good" that the new alleles are a continuation of that process, Dr. Lahn said.

[1] Mekel-Bobrov et al: Ongoing adaptive evolution of ASPM, a brain size determinant in Homo sapiens in Science – 2005
[2] Evans et al: Microcephalin, a gene regulating brain size, continues to evolve adaptively in humans in Science – 2005
[3] Evans et al: Evidence that the adaptive allele of the brain size gene microcephalin introgressed into Homo sapiens from an archaic Homo lineage in PNAS – 2006

Stunting and Anorexia

Most experts now probably agree that stunting is a development disorder[1]. Stunting, or being too short for one’s age, is defined as a height that is more than two standard deviations below the World Health Organization (WHO) Child Growth Standards median[2]. It is a largely irreversible outcome of inadequate nutrition and repeated bouts of infection during the first 1000 days of a child’s life.

In my previous paper, 'Stunting: Malnutrition or Exploitation?'[3], I claimed that stunting is not only the result of malnutrition, but also of child exploitation. Both are indicative of poverty.

I also linked stunting to rigorous training by athletes. These athletes eat meals that contain more than enough nutrients to grow, but their bodies use those nutrients for short-term performance to the detriment of long-term growth. My conclusion was that, while stunting is usually monitored in children less than five years of age, it should also be monitored in children older than five.
But what if malnutrition is the result of an ill-advised choice? What if anorexia also leads to stunting? The Diagnostic and Statistical Manual of Mental Disorders 5 (DSM-5) classifies Anorexia Nervosa as an eating disorder. Criteria include (1) restriction of energy intake relative to requirements, leading to a significantly low body weight in the context of age, sex, developmental trajectory, and physical health, (2) intense fear of gaining weight or becoming fat, even though underweight, and (3) disturbance in the way in which one's body weight or shape is experienced, undue influence of body weight or shape on self-evaluation, or denial of the seriousness of the current low body weight[4].

How will a voluntary restriction of energy intake relative to requirements, one that leads to a significantly lower body weight in the context of age, sex, developmental trajectory and physical health, influence your growth?
One study revealed that 'Male children of women with a history of Anorexia Nervosa [...], and female children of women with Anorexia Nervosa, were shorter throughout childhood'[5]. Another study found that 'linear growth retardation was a prominent feature of Anorexia Nervosa in our sample of male adolescent patients, preceding, in some cases, the reported detection of the eating disorder. Weight restoration, particularly when target weight is based on the premorbid height percentile, may be associated with significant catch-up growth, but complete catch-up growth may not be achieved'[6].

Therefore, anorexia is a type of malnutrition and can lead to stunting.

Part 1 'Stunting: Malnutrition or Exploitation?' can be read here.
Part 3 'Smoking and Stunting' can be read here.

[1] Kraemer: Making Stunting a Development Indicator in Sight and Life - 2016 
[2] WHO Global Nutrition Targets 2025: Stunting Policy Brief. See here
[3] De Vries: Stunting: Malnutrition or Exploitation? in Sight and Life - 2016 
[4] American Psychiatric Association: Diagnostic and Statistical Manual of Mental Disorders 5 – 2013
[5] Easter et al: Growth trajectories in the children of mothers with eating disorders: a longitudinal study in BMJ Open – 2014
[6] Modan-Moses et al: Stunting of growth as a major feature of anorexia nervosa in male adolescents in Pediatrics – 2003

[Review] 'The Evidence of Ghosts' by AK Benedict

Maria King, blind from birth and now blind by choice, sits by the Thames mudlarking, sifting through the history of London. Having been blind all her life, she can't get used to being gifted with sight after surgery. She wears a blindfold that gives her a feeling of security. Only, one day, while mudlarking, she finds a ring still on a finger in a box with 'Marry me Maria' on the lid in braille.

DI Jonathan Dark is assigned to the case. The finger and the ring belonged to the last woman who received a similar proposal - and was murdered. Jonathan Dark was unable to prevent that murder, and he is determined not to let Maria become the stalker's next victim.

Jonathan Dark is a detective with a disintegrating private life. His personal problems constantly interfere with his professional life, but the real question in 'The Evidence of Ghosts' is: who's stalking Maria King and why?

The other question that may be on our lips is: if I were being stalked by a murderer, would I want to keep wearing a blindfold? I know that seems an odd question, but when you consider that Maria wears one by choice all the time, it makes sense to ask. While most reviewers think that this doesn't reflect true life, I can assure readers that one can never fully understand the psychology of the human mind.

Alexandra Benedict weaves a fascinating supernatural world where the dead are always with us, sometimes helping, sometimes obstructing and sometimes urging the living to kill.

What do I think of 'The Evidence of Ghosts'? I got the distinct feeling that Alexandra Benedict was trying to weave too many storylines into this book and not quite succeeding. Yet it still is a perfect, albeit unusual, amalgamation of a crime novel and a Gothic novel. A.K. Benedict has a rich imagination and a dark sense of humour that lightens nearly every page.

Death has no sequel. So ends the book. But I'm certain that AK Benedict's fertile imagination has already conjured up other adventures for our troubled detective Jonathan Dark. Highly Recommended.


[Review] 'Classical Traditions in Modern Fantasy'

'Classical Traditions in Modern Fantasy' is the second in a series, the first being 'Classical Traditions in Science Fiction'.

'Classical Traditions in Modern Fantasy' is a collection of essays focusing on how fantasy draws deeply on ancient Greek and Roman mythology and literature.

Edited by Brett M. Rogers and Benjamin Eldon Stevens, the book contains fifteen essays intended for scholars and readers of fantasy alike. This volume explores many of the most significant examples of the modern genre, including H. P. Lovecraft's dark stories, J. R. R. Tolkien's 'The Hobbit', C. S. Lewis's 'Chronicles of Narnia', J. K. Rowling's 'Harry Potter' and George R. R. Martin's 'A Song of Ice and Fire' (aka 'Game of Thrones'), in relation to ancient classical texts such as Aeschylus' Oresteia, Aristotle's Poetics, Virgil's Aeneid and Apuleius' Metamorphoses (aka 'The Golden Ass').

So, the writers of the essays try to find links and similarities between modern fantasy and classical texts. It's a comparatively easy task, because both hark back to universal stories that lie buried deep within us. All writers, ancient and recent, tell stories that have the same issues at their heart: a quest for freedom, a rebellion against repression or the urge to discover unknown lands.

What most of the essays fail to mention is the education the modern fantasy writers have had. We know that Tolkien was a philologist and university professor, but he said his main inspiration for 'The Hobbit' was the Old English epic 'Beowulf'. I agree with Benjamin Eldon Stevens, writer of the essay on Tolkien, that Bilbo's travels into the tunnels and his encounters with Gollum/Sméagol echo the underworlds of Dante and Virgil. We also know that Rowling studied classics at the University of Exeter, so her classical 'roots' are also not in doubt. But what of George R. R. Martin, who 'only' studied journalism? Did he write his sprawling fantasy series with the classics in mind? Or did he simply write a story that has so many similarities with classical stories that one is easily tempted to deduce that he was influenced by them? H.P. Lovecraft never finished high school, but was interested in chemistry and astronomy. His dark writing was fueled by his nightmares, the result of parasomnia or 'night terrors'.

In the end, 'Classical Traditions in Modern Fantasy' is certainly a book that you should read, because it gives you a reason to ask yourself a lot of questions. And that's the very best one might expect from a book.

Viking: an alternative etymology

Everybody knows about Vikings, the fearless warriors from the cold and barren north. People who have studied history (but not etymology) will tell you that Viking is an Old Norse word meaning 'pirate' or 'raider'. It's not.
Actually, the English word 'Viking' went extinct in Middle English; it was revived in the 19th century, borrowed from the Scandinavian languages of that time.

The etymology of víkingr and víking is hotly debated by scholars. A víkingr was someone who went on expeditions, usually abroad, usually by sea, and usually in a group with other víkingar (the plural).

Both words are thought to be connected with Old Norse vík, meaning 'fjord', 'small bay', 'inlet' or 'cove'. Towns such as Reykjavik and Lerwick may trace their origins back to the Vikings. But it would be a step too far to conclude that the Vikings were named after a 'fjord'. Vikings were a diverse group and originated from the entire Scandinavian peninsula. We need another explanation.

The Swedes will tell you that vig means 'battle' and therefore a viking would be a 'warrior'. Not so.

Both wic in Old English and wick in Old Frisian meant 'camp' or 'a temporary living space'. So it's quite possible that, if vic means 'camp', then víkingr could well mean '(one) going on a camping trip'.

As camps grew into more permanent settlements, the word vic also came to denote something different. We can discover the word in Old English wīc ('dwelling place', 'abode') and Middle English wik, wich ('village', 'hamlet', 'town'). Modern Dutch wijk and modern Frisian wyk still mean 'part of a city'. This solution also ties in very neatly with the Old Norse word vestrvíking. It is usually translated as 'raiding in the west', in the context of 'the British Isles'. Now we can give it its original meaning: 'camping in the west'.

[Review] 'Sleeper' by J.D. Fennell

'Sleeper', the debut by J. D. Fennell, is marketed as a young adult thriller. Yes, it is that, and much more. The book is also a masterful melange of fantasy and war-time chaos. The protagonist, Will Starling, is a sixteen-year-old who must keep a mysterious notebook out of the hands of VIPER, a murderous bunch of villains. After being shot and falling into the icy water near Dover, he is rescued, only to discover his memory is gone. You might think that this is some sort of homage to Jason Bourne, but 'Sleeper' is different. Very different.

Slowly but surely Starling's memory returns and he understands that he's no ordinary lad. He's been trained to kill and to maim. As could be expected, the story takes place against the backdrop of air raids on London, which adds another layer of fear and chaos to 'Sleeper'.

Just a few pages into the tale, I was certain that this was no ordinary thriller. This was something new. If I were a native English speaker, I would be able to say that it is a ripping yarn told at breakneck speed. What can it be compared to, I wondered aloud. In the end I decided 'Sleeper' might well be the start of a wonderful series that emulates the 'Indiana Jones' movies, with elements of Young Bond (by Steve Cole) and Alex Rider (by Anthony Horowitz) thrown in for good measure.

This is a thriller I would certainly highly recommend to young adults, but adult readers might also find 'Sleeper' very entertaining. If I were pressed to name a minor negative, I missed a bit of British tongue-in-cheek humour to lighten the narrative at opportune moments. But I would only admit that after a fair bit of torture.

The fear of cats in Victorian times

It was in the late nineteenth century that medicine turned its attention to irrational fears. The German physician Carl Westphal (1833-1890) made the initial diagnosis of a phobia - agoraphobia, the fear of open spaces - in 1871[1]. He studied the behaviour of three otherwise sane and rational men who were terrified of crossing an open city space. Following this diagnosis, the notion that individuals could be overtaken by various forms of inexplicable fear was quickly taken up by medical practitioners around the world.

The American psychologist Granville Stanley Hall (1846-1924) soon identified 138 different forms of pathological fear[2]. Not only did these include recognised phobias, such as agoraphobia and claustrophobia, but also some fears that were particular to the Victorian era: amaxophobia (fear of carriages), pteronophobia (fear of feathers) and hypegiaphobia (fear of responsibility).
However, it was the fear of cats (ailurophobia) that attracted the most attention from Victorian researchers. Hall, with his colleague Silas Weir Mitchell, even conducted experiments, such as placing sufferers in a room with a hidden cat, to see if they picked up the animal's presence. He became convinced that many of his patients could always sense them. Trying to explain the phobia, he ruled out asthma and evolutionarily inherited fears (people who were terrified of cats could look at lions and tigers without problems).

Eventually Hall suggested that emanations from the cat 'may affect the nervous system through the nasal membrane, although not recognised as odours'. He remained baffled over why cats seemed to have an urge to get as close as possible to individuals who were scared of them.

Research now suggests that the Victorian urge to classify almost everything was the result of a rapidly changing, industrialising society, in which new scientific theories were starting to challenge long-held religious beliefs, explanations and dogma.

[1] Westphal: Die Agoraphobie, eine neuropathische Erscheinung in Archiv für Psychiatrie und Nervenkrankheiten - 1871
[2] Stanley Hall: Synthetic Genetic Study of Fear in American Journal of Psychology - 1914

Astronomy and watches in Friesland

Astronomical devices have been made for thousands of years. A famous example is the Antikythera mechanism, an artifact recovered off the Greek island of Antikythera. It is an ancient planetarium (or orrery) used to predict astronomical positions and eclipses for calendrical and astrological purposes. Even the Olympiads, the cycles of the ancient Olympic Games, could be calculated. The ancient device is a complex clockwork mechanism composed of at least 30 meshing bronze gears and is dated at around 205 BC.
[Model of the Antikythera mechanism]
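
As an aside, the cycles such a geared device encodes can be checked with simple arithmetic. A minimal sketch follows; the 235- and 19-tooth gear pair is illustrative, not the mechanism's actual tooth counts:

```python
from fractions import Fraction

# The Antikythera mechanism encodes astronomical cycles as gear ratios.
# One relation it models is the Metonic cycle: 19 solar years very
# nearly equal 235 synodic (lunar) months.
SYNODIC_MONTH = 29.530589   # days, modern mean value
SOLAR_YEAR = 365.2422       # days

print(f"235 lunar months: {235 * SYNODIC_MONTH:.1f} days")  # ~6939.7
print(f"19 solar years:   {19 * SOLAR_YEAR:.1f} days")      # ~6939.6

# A gear pair with 235 and 19 teeth (illustrative tooth counts) would
# advance a lunar-month pointer 235/19 times per solar-year revolution.
print(Fraction(235, 19))
```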
Just a few kilometers west of my hometown of Harlingen lies the city of Franeker. In that Frisian town once lived Eise Eisinga (1744-1828), an amateur astronomer who built a planetarium in his own house. The planetarium still exists and is the oldest functioning planetarium in the world. Eisinga never went to school, but he did publish a book about the principles of astronomy when he was only 17 years old.
[Eise Eisinga's planetarium]
And today the Frisians are still world famous for their – sometimes – astronomical watches with – yes – astronomical price tags. Christiaan van der Klaauw, based in Heerenveen, creates astronomical watches such as the Planetarium, which contains the smallest mechanical planetarium in the world, showing in real time the orbits of Mercury, Venus, Earth, Mars, Jupiter and Saturn around the Sun. Don't worry, it also tells you the time, and the mechanism is extremely accurate.
[Van der Klaauw Planetarium CKPT3304]
It isn't quite known why Frisians are so fascinated with planetary movements. It might have something to do with their healthy dairy products or their perfect night skies, but my bet is on Beerenburg, a traditional alcoholic drink that contains a host of medicinal herbs. It is almost exclusively consumed by Frisians.

Painkillers are killing America

You might remember House MD self-medicating on Vicodin to keep the pain in his leg at bay, allowing him to function as a brilliant doctor. Vicodin is a painkiller that consists of a combination of two ingredients: hydrocodone and acetaminophen. Hydrocodone is an opioid, while acetaminophen (paracetamol) is a non-opioid analgesic. It is indicated for the relief of moderate to severe pain.
Fentanyl is another potent, synthetic opioid pain medication with a rapid onset and short duration of action. It is approved for treating severe pain, typically advanced cancer pain. Fentanyl is more than 50 times more potent than morphine, thus increasing the risks for users. Fentanyl has emerged as the drug of choice in many parts of the United States and its legal and illegal use is now termed an 'epidemic' by scientists.
Opioid use has exploded in the US, after decades of doctors over-prescribing painkillers in the 1990s and 2000s. Authorities believe it is now pouring into the US, mostly directly from China through the mail, sometimes via Mexico.

A recent report by the Centers for Disease Control and Prevention shows that drug overdose deaths nearly tripled during 1999–2014[1]. Among the 47,055 drug overdose deaths that occurred in 2014 in the United States, nearly 30,000 involved an opioid. There are now more people killed by opioids than by bullets[2]. The numbers (read: deaths) keep rising: in 2015 there were 52,404 drug overdose deaths, 33,091 of which involved the use of opioids[3]. Drug overdose deaths in 2016 are expected to exceed 64,000, which works out to a rate of about 175 deaths a day.
Another factor is that opioids are often taken with other painkillers and alcohol, which also acts as a sedative.

[Update March 16, 2017] The Commission on Narcotic Drugs, part of the UN, has decided to help the US by adding two chemicals, used to make the drug Fentanyl, to an international list of controlled substances. It is hoped that it will help fight a wave of deaths by overdose in America. The substances are two precursors of Fentanyl: 4-anilino-N-phenethylpiperidine (ANPP) and N-phenethyl-4-piperidone (NPP). It also added a fentanyl analog called butyrfentanyl, a drug similar to fentanyl.

[1] Rose et al: Increases in Drug and Opioid-Involved Overdose Deaths — United States, 2010–2014 in Morbidity and Mortality Weekly Report (MMWR) – 2016
[2] Washington Post: Heroin deaths surpass gun homicides for the first time, CDC data shows – 2016. See here
[3] Rose et al: Increases in Drug and Opioid-Involved Overdose Deaths — United States, 2010–2015 in Morbidity and Mortality Weekly Report (MMWR) – 2016

Abigail Thaw on 'Morse' and 'Endeavour'

[Guest post by Damian Michael Barcroft, previously published here]

2017 comes around and I had no inkling it was 30 years since Morse first crossed our TV screens. Perhaps that’s a credit to the Endeavour series, that we’ve become so immersed in our characters and our own program. Suddenly I am in the thick of the “30 years” thing and I can’t believe it was so long ago that it all started.
[Abigail Thaw as Dorothea Frazil]
But I remember thinking, while waiting to shoot my first scene of series 4 in 2016, that being in Oxford is a pertinent reminder of my father for me. It brings me back to him with a jolt; the colleges, the streets, the Randolph Hotel, the Ashmolean. Strange, because I lived there as a child long after my parents divorced so I’ve rarely been there with him. But the character of Morse is so ingrained in that golden stone and the legacy (although I hate that cliched word) is quite sobering. Staring round at this wonderful, talented crew and actors, there to tell the stories of Inspector Morse’s crime solving… I mean, how extraordinary is that!

Thank you Colin Dexter and thank you Dad for giving 'Morse' a corporal existence and everyone for continuing to make it happen: Damien, Russell, Kevin who drives you to the set happy and rested, Shaun with all that weight on his slender shoulders that he carries effortlessly… The list is very long. And then I stop thinking about it because if I didn’t I’d be overwhelmed and wouldn’t be able to do my job!

Having James Laurenson in the first episode was a treat and it was lovely to hear his stories of that very first Morse; the uncertainty of whether it “had legs”. But for the rest of the time I don’t think about “Morse” or “Dad”. I look across at my fellow actors and I think, Hello Endeavour or Hello Thursday, and when the camera’s not rolling I’m having a jolly good laugh; or putting the world to right over a custard cream and a tepid cup of tea; or trying to remember my lines and not bump into the furniture. Or trying to look as though I drive a 1960 Triumph with exceptionally stiff gears every day of my life…

And I love Dorothea. I fall for her more with each series. Russell thinks up all sorts for her, some make it to the final cut and many don’t but I know they’re there and they help me fill her out. Russell graciously allows me to feel I have some input into her development as I email him with the odd thought but I have to admit, he’s the puppet master. And I love the glimpses we get of her private life. Her friendship with Endeavour is touching and particularly comes to fruition in this series. Not to give anything away! She’s a lonely soul much like her Morse compatriot. But she’s got such gumption and life force. She can be utterly charmless when she wants to be which is rare in playing or being a woman. Something men take for granted. I wish I was more like her in many ways. But not at the witching hour after a scotch too many. Or those dark hours before dawn. I doubt she’s a stranger to the Dark Night of the Soul.

Whatever other job I do during the year, there is nothing like the thrill of a fresh new Endeavour script arriving, the comfort of all those familiar faces working for the same thing, making it as brilliant and enjoyable as possible. Putting on Dorothea’s rather uncomfortable clothes and pointy bra and drowning in a sea of Irene’s (Napier) hairspray, I’m plunged back into “Ah yes, I know this. Hello, girl. Cheers.”

BTW: The name Dorothea Frazil is a clever find. 'Frazil' means 'Ice crystals formed in turbulent water, as in swift streams or rough seas'. D. Frazil can thus be read as 'De-ice' or 'Thaw'.

The (Short) Evolution of Smallpox

New research suggests that smallpox, a viral disease that caused millions of deaths worldwide, may not be an ancient disease[1]. The findings raise new questions about when the Variola virus first emerged and later evolved, possibly in response to inoculation and vaccination.
Smallpox, one of the most devastating viral diseases, had long been thought to have appeared in human populations thousands of years ago, in ancient Egypt, India and/or China, with some historical accounts suggesting, on the basis of lesions found on his face, that pharaoh Ramses V, who died circa 1145 BC, suffered from smallpox.

To better understand its evolutionary history, scientists extracted the DNA from the partial mummified remains of a Lithuanian child, interred in the crypt of a church in Vilnius and believed to have died between 1643 and 1665, a period in which several smallpox outbreaks were documented throughout Europe with increasing levels of mortality. Researchers compared the 17th-century strain to those from a databank of samples dating from 1940 up to the virus's eradication in 1977. Surprisingly, the results show that the evolution of the smallpox virus occurred far more recently than previously thought, with all the available strains of the virus having an ancestor no older than 1580 AD.

The poxvirus strain that represents the true reservoir for human smallpox remains unknown to this day. Camelpox is very closely related, but is not regarded as the likely ancestor of smallpox, suggesting that the real reservoir remains at large or has gone extinct[2].
The researchers also discovered that the smallpox virus evolved into two circulating strains, Variola major and Variola minor, after English physician Edward Jenner developed a vaccine in 1796.

One form, Variola major, was highly virulent and deadly; the other, Variola minor, was more benign. However, the two forms experienced a 'major population bottleneck' with the rise of immunization efforts. The date of the ancestor of the minor strain corresponds well with the Atlantic slave trade, which was likely responsible for its partial worldwide dissemination.

This raises important questions about how a pathogen diversifies in the face of vaccination. While smallpox is now eradicated in humans, we should remain vigilant about its possible reemergence until we fully understand its origins.

[1] Duggan et al: 17th Century Variola Virus Reveals the Recent History of Smallpox in Current Biology – 2016. See here
[2] Smithson et al: Prediction of steps in the evolution of variola virus host range in PLoS One - 2014