The Wizard and the Prophet

Charles Mann, author of 1491, has a new book called The Wizard and the Prophet: Two Remarkable Scientists and Their Dueling Visions to Shape Tomorrow’s World.

Here’s the Amazon description:

From the best-selling, award-winning author of 1491 and 1493–an incisive portrait of the two little-known twentieth-century scientists, Norman Borlaug and William Vogt, whose diametrically opposed views shaped our ideas about the environment, laying the groundwork for how people in the twenty-first century will choose to live in tomorrow’s world.

In forty years, Earth’s population will reach ten billion. Can our world support that? What kind of world will it be? Those answering these questions generally fall into two deeply divided groups–Wizards and Prophets, as Charles Mann calls them in this balanced, authoritative, nonpolemical new book. The Prophets, he explains, follow William Vogt, a founding environmentalist who believed that in using more than our planet has to give, our prosperity will lead us to ruin. Cut back! was his mantra. Otherwise everyone will lose! The Wizards are the heirs of Norman Borlaug, whose research, in effect, wrangled the world in service to our species to produce modern high-yield crops that then saved millions from starvation. Innovate! was Borlaug’s cry. Only in that way can everyone win! Mann delves into these diverging viewpoints to assess the four great challenges humanity faces–food, water, energy, climate change–grounding each in historical context and weighing the options for the future. With our civilization on the line, the author’s insightful analysis is an essential addition to the urgent conversation about how our children will fare on an increasingly crowded Earth.

I made my own attempt to reconcile these world views a few years ago. My conclusion was that it is theoretically possible to grow without exceeding limits, if almost all innovation that occurs is aimed at transcending those limits. In the real world, I don’t think there is any evidence our species is capable of that. What is more likely is that technology helps us grow until we come up against the limits, then we experience a setback that takes us back under the limits, then eventually we start again. We may push the limits a little further each time, but the setbacks can be long and painful enough to ruin entire human lifetimes. If I am right, we haven’t even finished the first cycle yet as a planetary civilization. Mann’s book 1491, along with Jared Diamond’s Collapse, was instrumental in helping me realize that regional and even continental cultures have experienced major setbacks before.
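For what it’s worth, here is a toy sketch of that grow-overshoot-setback-ratchet cycle. Every number in it is invented for illustration; it is not a calibrated model of anything, just a way to picture the dynamic I am describing.

```python
# Toy model of the grow-overshoot-setback cycle described above. All numbers
# are invented for illustration; this is not a calibrated model of anything.

def simulate(years=500, pop=1.0, limit=10.0, growth=0.03,
             setback=0.5, limit_gain=1.2):
    """Population grows until it exceeds the current limit, then crashes to
    a fraction of the limit while the limit itself ratchets up a little
    (standing in for innovation that pushes the limits further each cycle)."""
    history = []
    for year in range(years):
        pop *= 1 + growth
        if pop > limit:
            pop = limit * setback      # the long, painful setback
            limit *= limit_gain        # limits pushed a little further each time
        history.append((year, pop, limit))
    return history

if __name__ == "__main__":
    for year, pop, limit in simulate()[::50]:
        print(f"year {year:3d}: population {pop:6.2f}, limit {limit:6.2f}")
```

Run it and you get long stretches of growth punctuated by sudden drops, with each ceiling a bit higher than the last.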

Polarization, Partisanship and Junk News Consumption over Social Media in the US

Maybe this is just the Brits picking on us. Or, maybe they are onto something.

Vidya Narayanan, Vlad Barash, John Kelly, Bence Kollanyi, Lisa-Maria Neudert, and Philip N. Howard. “Polarization, Partisanship and Junk News Consumption over Social Media in the US.” Data Memo 2018.1. Oxford, UK: Project on Computational Propaganda. comprop.oii.ox.ac.uk

What kinds of social media users read junk news? We examine the distribution of the most significant sources of junk news in the three months before President Donald Trump’s first State of the Union Address. Drawing on a list of sources that consistently publish political news and information that is extremist, sensationalist, conspiratorial, masked commentary, fake news and other forms of junk news, we find that the distribution of such content is unevenly spread across the ideological spectrum. We demonstrate that (1) on Twitter, a network of Trump supporters shares the widest range of known junk news sources and circulates more junk news than all the other groups put together; (2) on Facebook, extreme hard right pages—distinct from Republican pages—share the widest range of known junk news sources and circulate more junk news than all the other audiences put together; (3) on average, the audiences for junk news on Twitter share a wider range of known junk news sources than audiences on Facebook’s public pages.

I hadn’t heard the term computational propaganda before. Here is how they describe it:

The Computational Propaganda Research Project (COMPROP) investigates the interaction of algorithms, automation and politics. This work includes analysis of how tools like social media bots are used to manipulate public opinion by amplifying or repressing political content, disinformation, hate speech, and junk news.

We use perspectives from organizational sociology, human computer interaction, communication, information science, and political science to interpret and analyze the evidence we are gathering. Our project is based at the Oxford Internet Institute, University of Oxford.

So in other words, we are all being manipulated by some very old and tired ideas using powerful new technologies that Hitler and Stalin could only have dreamed of.

January 2018 in Review

Most frightening stories:

  • Larry Summers says we have a better than even chance of recession in the next three years. Sounds bad, but I wonder what that stat would look like for any randomly chosen three-year period in modern history.
  • The United States is involved in at least seven wars: Afghanistan, Iraq, Syria, Yemen, Libya, Somalia, and Pakistan. Nuclear deterrence may not actually work.
  • Cape Town, South Africa is in imminent danger of running out of water. Longer term, there are serious concerns about snowpack-dependent water supplies serving large urban populations in Asia and western North America.

Most hopeful stories:

Most interesting stories, that were not particularly frightening or hopeful, or perhaps were a mixture of both:

precision nutrition

The Lancet has an article on precision nutrition and diabetes. Precision nutrition is the idea of a diet tailored specifically to an individual based on analysis of factors such as their genetics, proteins, and gut bacteria.

Precision nutrition for prevention and management of type 2 diabetes

Precision nutrition aims to prevent and manage chronic diseases by tailoring dietary interventions or recommendations to one or a combination of an individual’s genetic background, metabolic profile, and environmental exposures. Recent advances in genomics, metabolomics, and gut microbiome technologies have offered opportunities as well as challenges in the use of precision nutrition to prevent and manage type 2 diabetes. Nutrigenomics studies have identified genetic variants that influence intake and metabolism of specific nutrients and predict individuals’ variability in response to dietary interventions. Metabolomics has revealed metabolomic fingerprints of food and nutrient consumption and uncovered new metabolic pathways that are potentially modified by diet. Dietary interventions have been successful in altering abundance, composition, and activity of gut microbiota that are relevant for food metabolism and glycaemic control. In addition, mobile apps and wearable devices facilitate real-time assessment of dietary intake and provide feedback which can improve glycaemic control and diabetes management. By integrating these technologies with big data analytics, precision nutrition has the potential to provide personalised nutrition guidance for more effective prevention and management of type 2 diabetes. Despite these technological advances, much research is needed before precision nutrition can be widely used in clinical and public health settings. Currently, the field of precision nutrition faces challenges including a lack of robust and reproducible results, the high cost of omics technologies, and methodological issues in study design as well as high-dimensional data analyses and interpretation. Evidence is needed to support the efficacy, cost-effectiveness, and additional benefits of precision nutrition beyond traditional nutrition intervention approaches. Therefore, we should manage unrealistically high expectations and balance the emerging field of precision nutrition with public health nutrition strategies to improve diet quality and prevent type 2 diabetes and its complications.

I don’t want to be cynical, but I can imagine a scenario where this technology really catches on but remains accessible only to the rich. The result would be the rich living much longer than the rest of us (and they already live longer).

quantifying ecological functions

Here is an interesting article on quantifying ecological functions. The main application appears to be wetland mitigation, but the theory seems more general and could perhaps be adapted to a variety of ecosystem restorations or creations.

Landscape consequences of aggregation rules for functional equivalence in compensatory mitigation programs

Mitigation and offset programs designed to compensate for ecosystem function losses due to development must balance losses from affected ecosystems and gains in restored ecosystems. Aggregation rules applied to ecosystem functions to assess site equivalence are based on implicit assumptions about the substitutability of functions among sites and can profoundly influence the distribution of restored ecosystem functions on the landscape. We investigated the consequences of rules applied to aggregation of ecosystem functions for wetland offsets in the Beaverhill watershed in Alberta, Canada. We considered the fate of 3 ecosystem functions: hydrology, water purification, and biodiversity. We set up an affect-and-offset algorithm to simulate the effect of aggregation rules on ecosystem function for wetland offsets. Cobenefits and trade-offs among functions and the constraints posed by the quantity and quality of restorable sites resulted in a redistribution of functions between affected and offset wetlands. Hydrology and water-purification functions were positively correlated and negatively correlated with biodiversity function. Weighted-average rules did not replace functions in proportion to their weights. Rules prioritizing biodiversity function led to more monofunctional wetlands and landscapes. The minimum rule, for which the wetland score was equal to the worst performing function, promoted multifunctional wetlands and landscapes. The maximum rule, for which the wetland score was equal to the best performing function, promoted monofunctional wetlands and multifunctional landscapes. Because of implicit trade-offs among ecosystem functions, no-net-loss objectives for multiple functions should be constructed within a landscape context. Based on our results, we suggest criteria for the design of aggregation rules for no net loss of ecosystem functions within a landscape context include the concepts of substitutability, cobenefits and trade-offs, landscape constraints, heterogeneity, and the precautionary principle.
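As I read the abstract, the aggregation rules themselves are simple to state even though their landscape consequences are not. Here is a minimal sketch of the three rules mentioned (weighted average, minimum, maximum), using made-up function scores and weights for a single hypothetical wetland:

```python
# Minimal sketch of the three aggregation rules named in the abstract, applied
# to a single hypothetical wetland. The function scores and weights below are
# made up for illustration only.

scores = {"hydrology": 0.8, "water_purification": 0.7, "biodiversity": 0.3}
weights = {"hydrology": 0.4, "water_purification": 0.4, "biodiversity": 0.2}

def weighted_average_rule(scores, weights):
    """Wetland score is the weighted average of its function scores."""
    return sum(scores[f] * weights[f] for f in scores) / sum(weights.values())

def minimum_rule(scores):
    """Wetland score equals its worst-performing function."""
    return min(scores.values())

def maximum_rule(scores):
    """Wetland score equals its best-performing function."""
    return max(scores.values())

print("weighted average:", round(weighted_average_rule(scores, weights), 2))  # 0.66
print("minimum rule:    ", round(minimum_rule(scores), 2))                    # 0.3
print("maximum rule:    ", round(maximum_rule(scores), 2))                     # 0.8
```

You can see why the minimum rule pushes toward multifunctional wetlands: a site cannot score well unless every function scores reasonably well, whereas the maximum rule lets one strong function carry a site that does little else.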

why deny science when you can just make it up?

There is no reason to deny facts or evidence when you can just make up new ones that suit your preconceived notions, when you truly believe anything that comes out of your own mouth, and when tens of millions of other people believe it too.

This is not supposed to be a political blog. But it is supposed to be a blog about whether our civilization is progressing or at risk of a catastrophic downfall. And when the things described in the paragraph above are happening, I have to lean toward the catastrophic downfall side.

From Bloomberg:

“The ice caps were going to melt, they were going to be gone by now, but now they’re setting records,” Trump said in excerpts of an interview with Piers Morgan on the U.K. television network ITV broadcast Jan. 28. Trump didn’t specify the data behind his statement about setting records…

“There is a cooling, and there’s a heating,” he said. “I mean, look, it used to not be climate change, it used to be global warming. That wasn’t working too well because it was getting too cold all over the place…”

In 2014, less than a year before he entered the 2016 presidential race, Trump said on Twitter that the “POLAR ICE CAPS are at an all time high, the POLAR BEAR population has never been stronger. Where the hell is global warming.”

Anybody with some basic science or information literacy knows that a short-term fluctuation in the data does not prove or disprove a long-term trend. You can look at a lot of those short-term fluctuations together and begin to determine whether they represent random noise or whether they are consistent with some longer-term trend you are seeing in the larger data set, as scientists are doing with recent hurricanes, droughts and fires.
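To put a number on that intuition, here is a quick sketch (the trend and noise magnitudes are invented): a series with a small upward trend buried in large year-to-year noise will routinely produce ten-year windows that slope downward, even though the slope over the whole record is clearly positive.

```python
# Quick sketch: a small upward trend buried in year-to-year noise will still
# produce plenty of short windows that slope downward. The trend and noise
# magnitudes here are invented for illustration.
import random

random.seed(0)
TREND = 0.02   # made-up trend, units per year
NOISE = 0.5    # made-up year-to-year noise (standard deviation)
series = [TREND * year + random.gauss(0, NOISE) for year in range(100)]

def slope(values):
    """Ordinary least-squares slope of the values against their index."""
    n = len(values)
    xbar = (n - 1) / 2
    ybar = sum(values) / n
    num = sum((i - xbar) * (v - ybar) for i, v in enumerate(values))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

print("slope over the full 100 years:", round(slope(series), 3))
down = sum(1 for i in range(90) if slope(series[i:i + 10]) < 0)
print("10-year windows that slope downward:", down, "out of 90")
```

The long record shows the trend; any given decade may not.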

This was my favorite quote of all though:

“The Paris accord, for us, would have been a disaster,” Trump said in excerpts of an interview with Piers Morgan. “Would I go back in? Yeah, I’d go back in. I like, as you know, I like Emmanuel” Macron.

I can’t picture Emmanuel Macron, but what I can picture is Sacha Baron Cohen kissing Will Ferrell in Talladega Nights. Sometimes fiction actually does turn into reality!