Category Archives: Peer Reviewed Article Review

The title of this article in Nature pretty much says it all. The authors make a case that the IPCC is underestimating the risk of a rapid deterioration in the climate situation. There are a couple of counterintuitive points here. First, there is good news about air pollution, particularly in China. This is good news for public health in the near term, but paradoxically the air pollution has actually been bad enough in recent years to measurably block sunlight, masking some of the warming. Second, there really is a non-manmade component to global warming, and it may be significant in the coming decades. This is not good news at all, because the manmade component is of course very real, and the two are additive.
ecosystem restoration and carbon sequestration
Ecosystem and soil restoration could offset around a fifth of U.S. carbon emissions, according to this article in Science Advances.
Limiting climate warming to <2°C requires increased mitigation efforts, including land stewardship, whose potential in the United States is poorly understood. We quantified the potential of natural climate solutions (NCS)—21 conservation, restoration, and improved land management interventions on natural and agricultural lands—to increase carbon storage and avoid greenhouse gas emissions in the United States. We found a maximum potential of 1.2 (0.9 to 1.6) Pg CO2e year−1, the equivalent of 21% of current net annual emissions of the United States. At current carbon market prices (USD 10 per Mg CO2e), 299 Tg CO2e year−1 could be achieved. NCS would also provide air and water filtration, flood control, soil health, wildlife habitat, and climate resilience benefits.
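The abstract's units are easy to trip over (Pg vs. Tg vs. Mg), so here is a quick back-of-the-envelope check in Python. The input numbers come straight from the abstract; the implied total-emissions figure is my own arithmetic, not a number reported in the paper:

```python
# Sanity-check the abstract's figures with simple unit conversions.
# 1 Pg (petagram) = 1000 Tg (teragram) = 1e9 Mg (megagram, i.e. one metric ton).

max_potential_pg = 1.2       # Pg CO2e per year, central estimate from the abstract
share_of_emissions = 0.21    # stated as 21% of current net annual US emissions

# Implied net annual US emissions consistent with those two numbers:
implied_us_emissions_pg = max_potential_pg / share_of_emissions
print(f"Implied net US emissions: {implied_us_emissions_pg:.1f} Pg CO2e/yr")  # 5.7

# The cost-effective slice at USD 10 per Mg CO2e:
affordable_tg = 299          # Tg CO2e per year achievable at current carbon prices
fraction_of_max = affordable_tg / (max_potential_pg * 1000)
print(f"Share of max potential achievable at $10/Mg: {fraction_of_max:.0%}")  # 25%
```

So at today's carbon prices only about a quarter of the maximum potential pencils out, which puts the "fifth of U.S. emissions" headline number in context.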
microdosing
There are people taking small doses of LSD and other psychedelics every day for their possible/suspected health benefits, and there may actually be some science behind this.
Microdosing psychedelics – the regular consumption of small amounts of psychedelic substances such as LSD or psilocybin – is a growing trend in popular culture. Recent studies on full-dose psychedelic psychotherapy reveal promising benefits for mental well-being, especially for depression and end-of-life anxiety. While full-dose therapies include perception-distorting properties, microdosing may provide complementary clinical benefits using lower-risk, non-hallucinogenic doses. No experimental study has evaluated psychedelic microdosing, however; this pre-registered study is the first to investigate microdosing psychedelics and mental health. Recruited from online forums, current and former microdosers scored lower on measures of dysfunctional attitudes and negative emotionality and higher on wisdom, open-mindedness, and creativity when compared to non-microdosing controls. These findings provide promising initial evidence that warrants controlled experimental research to directly test safety and clinical efficacy. As microdoses are easier to administer than full doses, this new paradigm has the exciting potential to shape future psychedelic research.
Oumuamua
I tracked down the Harvard astrophysics paper that suggests this object could be an alien spacecraft or probe. They never say that it is one, only that its behavior would be consistent with one. Sadly, it seems like we missed the boat and it is too late to train our telescopes or send probes of our own in time to get a good look at the thing, so the best we can do is be on the lookout for others like it in the future.
“organic” eating may lower cancer risk after all
I haven’t always been on the “organic” bandwagon 100%. For one thing, the name is stupid. Chugging a glass of diesel fuel would be about as organic as you could get, in terms of the definition of the word I learned in high school chemistry. I am strongly in favor of sustainable farming practices that build soil, protect biodiversity, and prevent groundwater and surface water pollution. But in terms of health benefits, I have never felt the evidence was all that strong, and to some extent the industry is just based on scare tactics. I also wonder whether the billions of humans on the planet can be fed without resorting to fossil fuel-derived fertilizer, and I still think that is dubious. But here is one large study in JAMA that did find significant evidence of a link between organic food (as labeled at the grocery store) and reduced cancer risk.
Association of Frequency of Organic Food Consumption With Cancer Risk: Findings From the NutriNet-Santé Prospective Cohort Study
Main Outcomes and Measures This study estimated the risk of cancer in association with the organic food score (modeled as quartiles) using Cox proportional hazards regression models adjusted for potential cancer risk factors.
Results Among 68 946 participants (78.0% female; mean [SD] age at baseline, 44.2 [14.5] years), 1340 first incident cancer cases were identified during follow-up, with the most prevalent being 459 breast cancers, 180 prostate cancers, 135 skin cancers, 99 colorectal cancers, 47 non-Hodgkin lymphomas, and 15 other lymphomas. High organic food scores were inversely associated with the overall risk of cancer (hazard ratio for quartile 4 vs quartile 1, 0.75; 95% CI, 0.63-0.88; P for trend = .001; absolute risk reduction, 0.6%; hazard ratio for a 5-point increase, 0.92; 95% CI, 0.88-0.96).
Conclusions and Relevance A higher frequency of organic food consumption was associated with a reduced risk of cancer. Although the study findings need to be confirmed, promoting organic food consumption in the general population could be a promising preventive strategy against cancer.
I researched the risk measures a little. The hazard ratio of 0.75 means that people eating the most organic food (scoring in the top 25% of however they are measuring that) were 25% less likely to develop cancer during follow-up than people eating the least organic food. That seems significant. From a quick skim, it appears they did try to control for differences in lifestyle (i.e., similar nutrition and exercise levels) and family history of cancer when coming to their conclusions.
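To make the arithmetic concrete, here is a short Python sketch of how I read those hazard ratios. The 20-point extrapolation at the end is my own illustration and assumes the effect is log-linear, which the paper doesn't necessarily claim:

```python
# The paper's hazard ratios compose multiplicatively, so they are easiest
# to reason about on that scale. The two numbers below are from the abstract.

hr_q4_vs_q1 = 0.75    # top vs. bottom quartile of the organic food score
hr_per_5pt = 0.92     # per 5-point increase in the score

# A hazard ratio of 0.75 means a 25% lower instantaneous risk:
reduction = 1 - hr_q4_vs_q1
print(f"Hazard reduction, Q4 vs Q1: {reduction:.0%}")  # 25%

# Extrapolating the per-5-point ratio to a 20-point difference means
# multiplying it by itself four times; my illustration, not the paper's claim:
hr_20pt = hr_per_5pt ** (20 / 5)
print(f"Implied hazard ratio over a 20-point difference: {hr_20pt:.2f}")  # 0.72
```

Note the abstract also reports the absolute risk reduction (0.6%), which is the more sobering way to read the same result.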
the quantum internet
A paper in Nature explains what a “quantum internet” could look like.
In stage 1, users will start getting into the quantum game, in which a sender creates quantum states, typically for photons. These would be sent to a receiver, either along an optical fibre or through a laser pulse beamed across open space. At this stage, any two users will be able to create a private encryption key that only they know…
In stage 2, the quantum internet will harness the powerful phenomenon of entanglement. Its first goal will be to make quantum encryption essentially unbreakable. Most of the techniques that this stage requires already exist, at least as rudimentary lab demonstrations.
Stages 3 to 5 will, for the first time, enable any two users to store and exchange quantum bits, or qubits. These are units of quantum information, similar to classical 1s and 0s, but they can be in a superposition of both 1 and 0 simultaneously. Qubits are also the basis for quantum computation. (A number of laboratories — both in academia and at large corporations, such as IBM or Google — have been building increasingly complex quantum computers; the most advanced ones have memories that can hold a few dozen qubits.)
So it seems as though the main advantage of a quantum internet would be truly secure communications. Which I guess is at least something, but doesn’t seem as though it would revolutionize everyday life anytime soon. There are no predictions in this article about when it might happen other than “a long time”.
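For the curious, the key exchange in stage 1 (the BB84 protocol) is easy to sketch classically. This toy Python simulation captures only the bookkeeping; the real security comes from quantum measurement physics, which plain Python obviously cannot provide:

```python
import random

# Toy classical simulation of the BB84 key-exchange idea behind "stage 1".
# The sender encodes random bits in randomly chosen bases; the receiver
# measures in randomly chosen bases; only the bits where the two bases
# happen to match survive into the shared key.

def bb84_sketch(n_photons=32, seed=0):
    rng = random.Random(seed)  # seeded so the sketch is reproducible
    sender_bits = [rng.randint(0, 1) for _ in range(n_photons)]
    sender_bases = [rng.choice("+x") for _ in range(n_photons)]
    receiver_bases = [rng.choice("+x") for _ in range(n_photons)]

    key = []
    for bit, b_send, b_recv in zip(sender_bits, sender_bases, receiver_bases):
        if b_send == b_recv:   # matching basis: measurement recovers the bit
            key.append(bit)
        # mismatched basis: the outcome would be random, so discard the bit
    return key

key = bb84_sketch()
print(f"{len(key)} of 32 bits survived basis reconciliation:",
      "".join(map(str, key)))
```

On average half the bits survive, which is why real systems send many more photons than the key length they need.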
climate change threatens barley yields
A new study says climate change is likely to threaten barley yields, leading to high beer prices later in the century. I’m hoping this is wrong and we can grow hops and barley in the formerly frozen tundra of Canada and Siberia. Of course the bigger picture is about grain yields overall, and that is not just about average temperature but about extreme heat and drought.
Decreases in global beer supply due to extreme drought and heat
Beer is the most popular alcoholic beverage in the world by volume consumed, and yields of its main ingredient, barley, decline sharply in periods of extreme drought and heat. Although the frequency and severity of drought and heat extremes increase substantially in a range of future climate scenarios simulated by five Earth System Models, the vulnerability of beer supply to such extremes has never been assessed. We couple a process-based crop model (decision support system for agrotechnology transfer) and a global economic model (Global Trade Analysis Project model) to evaluate the effects of concurrent drought and heat extremes projected under a range of future climate scenarios. We find that these extreme events may cause substantial decreases in barley yields worldwide. Average yield losses range from 3% to 17% depending on the severity of the conditions. Decreases in the global supply of barley lead to proportionally larger decreases in barley used to make beer and ultimately result in dramatic regional decreases in beer consumption (for example, −32% in Argentina) and increases in beer prices (for example, +193% in Ireland). Although not the most concerning impact of future climate change, climate-related weather extremes may threaten the availability and economic accessibility of beer.
aging populations – a good thing?
This article in Trends in Ecology and Evolution argues that aging populations can actually be a good thing for the environment and the human economy, and that the challenges they pose are overblown and manageable.
statistical analysis of the Supreme Court
A statistical analysis of the U.S. Supreme Court suggests that it is not all that partisan after all. Okay, I admit it, I am really just pretending to understand half the words in the abstract below. It’s always kind of fun when physicists dabble in fields outside their usual boundaries, like economics or politics.
The US Supreme Court throughout the twentieth century has been characterized as being divided between liberals and conservatives, suggesting that ideologically similar justices would have voted similarly had they overlapped in tenure. What if they had? I build a minimal, pairwise maximum entropy model to infer how 36 justices from 1946–2016 would have all voted on a Super Court. The model is strikingly consistent with a standard voting model from political science, W-Nominate, despite using 10^5 fewer parameters and fitting the observed statistics better. I find that consensus dominates the Super Court and strong correlations in voting span nearly 100 years, defining an emergent institutional timescale that surpasses the tenure of any single justice. Thus, the collective behavior of the Court over time reveals a stable institution insulated from the seemingly rapid pace of political change. Beyond consensus, I discover a rich structure of dissenting blocs with a heavy-tailed, scale-free distribution consistent with data from the Second Rehnquist Court. Consequently, a low-dimensional description of voting with a fixed number of ideological modes is inherently misleading because even votes that defy such a description are probable. Instead of assuming that strong higher order correlations like voting blocs are induced by features of the cases, the institution, and the justices, I show that such complexity can be expressed in a minimal model relying only on pairwise correlations in voting.
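To get a feel for what a "pairwise maximum entropy model" actually is, here is a toy Ising-style version with three hypothetical justices. The couplings and biases are made up purely for illustration; the paper fits such parameters to real voting statistics, which this sketch does not do:

```python
import itertools
import math

# Toy pairwise maximum-entropy (Ising-like) model of three hypothetical
# "justices". Votes are +1/-1; positive couplings J reward pairs of
# justices voting the same way, and h encodes individual biases.
J = {(0, 1): 1.0, (0, 2): 0.5, (1, 2): 0.5}   # pairwise couplings (made up)
h = [0.2, 0.0, -0.1]                           # individual biases (made up)

def energy(votes):
    e = -sum(h[i] * v for i, v in enumerate(votes))
    e -= sum(Jij * votes[i] * votes[j] for (i, j), Jij in J.items())
    return e

# With only 3 justices we can enumerate all 2^3 vote configurations exactly.
configs = list(itertools.product([-1, 1], repeat=3))
weights = [math.exp(-energy(c)) for c in configs]
Z = sum(weights)                               # partition function
probs = {c: w / Z for c, w in zip(configs, weights)}

# Positive couplings make unanimous outcomes the most probable ones,
# which is the toy version of "consensus dominates the Super Court".
best = max(probs, key=probs.get)
print(f"Most probable configuration: {best}, p = {probs[best]:.2f}")
```

The real model does the same thing with 36 justices, where exact enumeration is 2^36 states and cleverer fitting is needed, but the structure (biases plus pairwise couplings, nothing higher-order) is the whole point of the paper.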
IPCC terminology
I find some of the IPCC terminology interesting. Alternatives analysis and communication of uncertainty are professional interests of mine. I am afraid I am not all that good at them, but when I see the state of the art in scientific communication from the experts sometimes I feel a little better.
Here is a footnote in the Summary for Policy Makers on the terminology they use to try to communicate uncertainty.
A level of confidence is expressed using five qualifiers: very low, low, medium, high and very high, and typeset in italics, for example, medium confidence. The following terms have been used to indicate the assessed likelihood of an outcome or a result: virtually certain 99–100% probability, very likely 90–100%, likely 66–100%, about as likely as not 33–66%, unlikely 0–33%, very unlikely 0–10%, exceptionally unlikely 0–1%. Additional terms (extremely likely 95–100%, more likely than not >50–100%, more unlikely than likely 0–<50%, extremely unlikely 0–5%) may also be used when appropriate. Assessed likelihood is typeset in italics, for example, very likely.
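That calibrated language amounts to a lookup table. Here is a little Python sketch (my own encoding of the footnote's standard terms, not an IPCC tool) that returns the narrowest term covering a given probability; note the ranges deliberately overlap, since "very likely" is a subset of "likely":

```python
# The IPCC calibrated-likelihood scale from the footnote, as (term, low, high)
# probability ranges, ordered from narrowest to broadest so the first match
# is the most specific standard term that applies.
LIKELIHOOD_TERMS = [
    ("virtually certain", 0.99, 1.00),
    ("exceptionally unlikely", 0.00, 0.01),
    ("very likely", 0.90, 1.00),
    ("very unlikely", 0.00, 0.10),
    ("likely", 0.66, 1.00),
    ("unlikely", 0.00, 0.33),
    ("about as likely as not", 0.33, 0.66),
]

def ipcc_likelihood(p):
    """Return the narrowest standard IPCC likelihood term covering p."""
    for term, lo, hi in LIKELIHOOD_TERMS:
        if lo <= p <= hi:
            return term
    return "about as likely as not"  # unreachable for p in [0, 1]

print(ipcc_likelihood(0.95))   # very likely
print(ipcc_likelihood(0.05))   # very unlikely
print(ipcc_likelihood(0.50))   # about as likely as not
```

The additional terms (extremely likely, more likely than not, etc.) could be slotted into the same table, which is part of what makes the scheme workable in practice.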
Here are some definitions of scenarios and pathways in Chapter 1 of Global Warming of 1.5 °C.
A ‘scenario’ is an internally consistent, plausible, and integrated description of a possible future of the human–environment system, including a narrative with qualitative trends and quantitative projections (IPCC, 2000). Climate change scenarios provide a framework for developing and integrating emissions, climate change and climate impact projections, including an assessment of their inherent uncertainties. The long-term and multi–faceted nature of climate change requires climate scenarios to describe how assumptions about inherently uncertain socio-economic trends in the 21st century could influence future energy and land use, resulting in emissions, and climate change as well as human vulnerability and exposure to climate change. Such driving forces include population, GDP, technological innovation, governance, and lifestyles. Climate change scenarios are used for analysing and contrasting climate policy choices.
The notion of a ‘pathway’ can have multiple meanings in the climate literature. It is often used to describe the temporal evolution of a set of scenario features, such as GHG emissions and socioeconomic development. As such, it can describe individual scenario components or sometimes be used interchangeably with the word ‘scenario’. For example, the RCPs describe GHG concentration trajectories (van Vuuren et al., 2011) and the SSPs are a set of narratives of societal futures augmented by quantitative projections of socio-economic determinants such as population, GDP, and urbanization (Kriegler et al., 2012; O’Neill et al., 2014). Socio-economic driving forces consistent with any of the SSPs can be combined with a set of climate policy assumptions (Kriegler et al., 2014) that together would lead to emissions and concentration outcomes consistent with the RCPs (Riahi et al., 2017). This is at the core of the scenario framework for climate change research that aims to facilitate creating scenarios integrating emissions and development pathways dimensions (Ebi et al., 2014; van Vuuren et al., 2014).