Category Archives: Peer Reviewed Article Review

NAS study on genetically modified crops

The National Academy of Sciences has released a massive study of genetically modified crops. It has been tough to discern the facts on this issue because there has been a lot of corporate propaganda coming from one side, and a lot of emotion from well-meaning but not-all-that-scientific activists on the other. I would consider the NAS to be pretty close to an impartial, science-based source, although you could argue that the academics involved probably do a lot of research funded by the agriculture industry. Still, a very large number of academics were involved and the report is very thoroughly peer-reviewed, so I think you could regard this as the academic consensus.

First, on human health effects, they offer some reassuring news:

There have been claims that GE crops have had adverse effects on human health. Many reviews have indicated that foods from GE crops are as safe as foods from non-GE crops, but the committee reexamined the original studies of this subject. The design and analysis of many animal-feeding studies were not optimal, but the large number of experimental studies provided reasonable evidence that animals were not harmed by eating food derived from GE crops. Additionally, long-term data on livestock health before and after the introduction of GE crops showed no adverse effects associated with GE crops. The committee also examined epidemiological data on incidence of cancers and other human-health problems over time and found no substantiated evidence that foods from GE crops were less safe than foods from non-GE crops.

You could still argue, as the Europeans do, that the precautionary principle means new technologies must be treated as guilty until proven innocent. It is somewhat the opposite here in the big-business-friendly U.S. Still, there is no smoking gun here.

Nor is there a smoking gun on the ability of genetic engineering to deliver yield increases. Some argue that the real smoking gun is the evidence that it has not done so yet. That is somewhat disappointing, but with biotechnology continuing to accelerate, I don't think you can point to limited progress so far as evidence that no further progress will be made. That is like saying we have not cured cancer to date, so it is time to give up.

There is disagreement among researchers about how much GE traits can increase yields compared with conventional breeding. In addition to assessing detailed surveys and experiments comparing GE with non-GE crop yields, the committee examined changes over time in overall yield per hectare of maize, soybean, and cotton reported by the U.S. Department of Agriculture (USDA) before, during, and after the switch from conventional to GE varieties of these crops. No significant change in the rate at which crop yields increase could be discerned from the data. Although the sum of experimental evidence indicates that GE traits are contributing to actual yield increases, there is no evidence from USDA data that they have substantially increased the rate at which U.S. agriculture is increasing yields…

One of the critical questions about the new traits that may be produced with emerging genetic engineering technologies is the extent to which these traits will contribute to feeding the world in the future. Some crop traits, such as insect and disease resistance, are likely to be introduced into more crop species and the number of pests targeted will also likely increase. If deployed appropriately, those traits will almost certainly increase harvestable yields and decrease the probability of losing crop plantings to major insect or disease outbreaks. However, there is great uncertainty regarding whether traits developed with emerging genetic-engineering technologies will increase crop potential yield by improving photosynthesis and increasing nutrient use. Including such GE traits in policy planning as major contributors to feeding the world must be accompanied by strong caveats.

They don't talk much about one of my questions: the extent to which corporate, profit-driven genetic engineering reduces genetic diversity, potentially making the global food system less resilient in the face of future shocks. Nor do they seem concerned about the possibility of genetically engineered crops escaping and wreaking havoc in our remaining natural ecosystems.

I'll reproduce one graphic I found interesting, distinguishing between the concepts of potential and actual yield. One point they seem to be making is that the focus of genetic engineering to date has been on reducing crop losses to weeds, pests, and diseases. It has not increased the plant's ability to use water, nutrients, and ultimately sunlight more efficiently than naturally derived crops have in the past. So this is why there is still the potential for a lot of progress, as well as the potential for risks to diversity, resilience, human health, and ecosystems. It also reinforces my general sense that medical biotech is farther along than agricultural biotech.

National Academies of Sciences, Engineering, and Medicine. 2016. Genetically Engineered Crops: Experiences and Prospects. Washington, DC: The National Academies Press. doi: 10.17226/23395.
silicon-based life

Scientists have now evolved an enzyme that forms carbon–silicon bonds, a first step toward incorporating silicon into living systems.

Directed evolution of cytochrome c for carbon–silicon bond formation: Bringing silicon to life

Enzymes that catalyze carbon–silicon bond formation are unknown in nature, despite the natural abundance of both elements. Such enzymes would expand the catalytic repertoire of biology, enabling living systems to access chemical space previously only open to synthetic chemistry. We have discovered that heme proteins catalyze the formation of organosilicon compounds under physiological conditions via carbene insertion into silicon–hydrogen bonds. The reaction proceeds both in vitro and in vivo, accommodating a broad range of substrates with high chemo- and enantioselectivity. Using directed evolution, we enhanced the catalytic function of cytochrome c from Rhodothermus marinus to achieve more than 15-fold higher turnover than state-of-the-art synthetic catalysts. This carbon–silicon bond-forming biocatalyst offers an environmentally friendly and highly efficient route to producing enantiopure organosilicon molecules.

street tree survey using Google Street View

An automated analysis program can produce street tree data using Google Street View.

Google Street View shows promise for virtual street tree surveys

Geospatial technologies are increasingly relevant to urban forestry, but their use may be limited by cost and technical expertise. Technologies like Google Street View™ are appealing because they are free and easy to use. We used Street View to conduct a virtual survey of street trees in three municipalities, and compared our results to existing field data from the same locations. The virtual survey analyst recorded the locations of street trees, identified trees to the species level, and estimated diameter at breast height. Over 93% of the 597 trees documented in the field survey were also observed in the virtual survey. Tree identification in the virtual survey agreed with the field data for 90% of trees at the genus level and 66% of trees at the species level. Identification was less reliable for small trees, rare taxa, and for trees with multiple species in the same genus. In general, tree diameter was underestimated in the virtual survey, but estimates improved as the analyst became more experienced. This study is the first to report on manual interpretation of street tree characteristics using Street View. Our results suggest that virtual surveys in Street View may be suitable for generating some types of street tree data or updating existing data sets more efficiently than field surveys.

climate change, ecosystems, and food

This 17-author paper in Science describes evidence for how natural organisms and ecosystems are already adapting to climate change, and what it means for humans.

The broad footprint of climate change from genes to biomes to people

Species are undergoing evolutionary adaptation to temperature extremes, and climate change has substantial impacts on species physiology that include changes in tolerances to high temperatures, shifts in sex ratios in species with temperature-dependent sex determination, and increased metabolic costs of living in a warmer world. These physiological adjustments have observable impacts on morphology, with many species in both aquatic and terrestrial systems shrinking in body size because large surface-to-volume ratios are generally favored under warmer conditions. Other morphological changes include reductions in melanism to improve thermoregulation, and altered wing and bill length in birds.

Broader-scale responses to climate change include changes in the phenology, abundance, and distribution of species. Temperate plants are budding and flowering earlier in spring and later in autumn. Comparable adjustments have been observed in marine and freshwater fish spawning events and in the timing of seasonal migrations of animals worldwide. Changes in the abundance and age structure of populations have also been observed, with widespread evidence of range expansion in warm-adapted species and range contraction in cold-adapted species. As a by-product of species redistributions, novel community interactions have emerged. Tropical and boreal species are increasingly incorporated into temperate and polar communities, respectively, and when possible, lowland species are increasingly assimilating into mountain communities. Multiplicative impacts from gene to community levels scale up to produce ecological regime shifts, in which one ecosystem state shifts to an alternative state…

The many observed impacts of climate change at different levels of biological organization point toward an increasingly unpredictable future for humans. Reduced genetic diversity in crops, inconsistent crop yields, decreased productivity in fisheries from reduced body size, and decreased fruit yields from fewer winter chill events threaten food security. Changes in the distribution of disease vectors alongside the emergence of novel pathogens and pests are a direct threat to human health as well as to crops, timber, and livestock resources. Humanity depends on intact, functioning ecosystems for a range of goods and services. Enhanced understanding of the observed impacts of climate change on core ecological processes is an essential first step to adapting to them and mitigating their influence on biodiversity and ecosystem service provision.

As smug as we are about the advanced state of our civilization, this planet still gives us an enormous amount for free, and we simply can’t afford to replace all the free goods and services with our own effort and technology. I continue to hear alarm bells sounding from many different quarters on one particular issue – food.

learn about carbon trading and R

This is pretty cool – an interactive website that lets you explore a real-world carbon trading research problem while learning new tricks in R.

Many economists would agree that the most efficient way to fight global warming would be a worldwide tax or an emissions trading system for greenhouse gases. Yet, if only part of the world implements such a scheme, a reasonable concern is that firms may decide to relocate to other parts of the world, causing job losses and less effective emissions reduction…

In their article ‘Industry Compensation under Relocation Risk: A Firm-Level Analysis of the EU Emissions Trading Scheme’ (American Economic Review, 2014), Ralf Martin, Mirabelle Muûls, Laure B. de Preux and Ulrich J. Wagner study the most efficient way to allocate a fixed amount of free permits among facilities in order to minimize the risk of job losses or carbon leakage. Given their available data, they establish simple alternative allocation rules that can be expected to substantially outperform the current allocation rules used by the EU.
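The flavor of that optimization can be illustrated with a toy greedy rule: hand out the fixed budget of free permits first to the facilities with the highest relocation risk per permit requested. The facility names, numbers, and the rule itself are all illustrative assumptions here, not the authors' estimated allocation scheme.

```python
# Toy greedy allocation of a fixed permit budget. Each facility is a tuple
# (name, relocation_risk, permits_needed); all values are made up for
# illustration, not taken from the paper.

def allocate_permits(facilities, budget):
    """Grant free permits to facilities with the highest risk per permit first."""
    allocation = {name: 0.0 for name, _, _ in facilities}
    # Sort by relocation risk per permit requested, descending.
    for name, risk, need in sorted(facilities, key=lambda f: f[1] / f[2], reverse=True):
        grant = min(need, budget)      # never grant more than remains
        allocation[name] = grant
        budget -= grant
        if budget <= 0:
            break
    return allocation

plants = [("steel", 0.9, 50.0), ("cement", 0.6, 30.0), ("paper", 0.1, 40.0)]
allocation = allocate_permits(plants, budget=70.0)
```

With a budget of 70 permits, the cement plant (highest risk per permit) is fully covered, the steel plant gets the remainder, and the low-risk paper plant gets nothing. The actual paper's rules are estimated from firm-level data rather than assumed like this.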

As part of his Master’s Thesis at Ulm University, Benjamin Lux has generated a very nice RTutor problem set that allows you to replicate the insights of the paper in an interactive fashion. You learn about the data and institutional background, run explorative regressions and dig into the very well explained optimization procedures to find efficient allocation rules. At the same time you learn some R tricks, like effective usage of some dplyr functions.

It’s an interesting question at a time when some U.S. states and Canadian provinces have started introducing carbon trading and taxation schemes that differ from their neighbors (sometimes because their neighbors have nothing at all). Perhaps there is a win-win where a policy can gradually phase out less productive, dirtier industries while replacing them with cleaner and higher-value-added industries, then sharing enough of the wealth so everyone benefits.

passive haptic learning

Researchers at Georgia Tech have taken a small step toward the dream of learning without effort.

Tactile taps teach rhythmic text entry: passive haptic learning of morse code

Passive Haptic Learning (PHL) is the acquisition of sensorimotor skills with little or no active attention to learning. This technique is facilitated by wearable computing, and applications are diverse. However, it is not known whether rhythm-based information can be conveyed passively. In a 12 participant study, we investigate whether Morse code, a rhythm-based text entry system, can be learned through PHL using the bone conduction transducer on Google Glass. After four hours of exposure to passive stimuli while focusing their attention on a distraction task, PHL participants achieved a 94% accuracy rate keying a pangram (a phrase with all the letters of the alphabet) using Morse code on Glass’s trackpad versus 53% for the control group. Most PHL participants achieved 100% accuracy before the end of the study. In written tests, PHL participants could write the codes for each letter of the alphabet with 98% accuracy versus 59% for control. When perceiving Morse code, PHL participants also performed significantly better than control: 83% versus 46% accuracy.

reduced work week as a carbon emissions strategy

Reducing the work week to four days could cut carbon emissions, according to this comparison of five UK scenarios.

Worktime Reduction as a Solution to Climate Change: Five Scenarios Compared for the UK

Reducing working hours in an economy has been discussed as a policy which may have benefits in achieving particular economic, social and environmental goals. This study proposes five different scenarios to reduce the working hours of full-time employees by 20% with the aim of cutting greenhouse gas emissions: a three-day weekend, a free Wednesday, reduced daily hours, increased holiday entitlement and a scenario in which the time reduction is efficiently managed by companies to minimise their office space. We conceptually analyse the effects of each scenario on time use patterns through both business and worker activities, and how these might affect energy consumption in the economy. To assess which of the scenarios may be most effective in reducing carbon emissions, this analytical framework is applied as a case study for the United Kingdom. The results suggest that three of the five scenarios offer similar benefits, and are preferable to the other two, with a difference between the best and worst scenarios of 13.03 MTCO2e. The study concludes that there is a clear preference for switching to a four-day working week over other possible work-reduction policies.

reservoirs, resilience and system dynamics

This article in Water Resources Research uses a system dynamics simulation to examine the resilience of a reservoir. Some of these concepts may be adaptable to other types of water resources systems or systems in general.

Comparison of static and dynamic resilience for a multipurpose reservoir operation

Reliability, resilience and vulnerability are the traditional risk measures used to assess the performance of a reservoir system. Among these measures, resilience is used to assess the ability of a reservoir system to recover from a failure event. However, the time independent static resilience does not consider the system characteristics, interaction of various individual components and does not provide much insight into reservoir performance from the beginning of the failure event until the full performance recovery. Knowledge of dynamic reservoir behavior under the disturbance offers opportunities for proactive and/or reactive adaptive response that can be selected to maximize reservoir resilience. A novel measure is required to provide insight into the dynamics of reservoir performance based on the reservoir system characteristics and its adaptive capacity. The reservoir system characteristics include, among others, reservoir storage curve, reservoir inflow, reservoir outflow capacity and reservoir operating rules. The reservoir adaptive capacity can be expressed using various impacts of reservoir performance under the disturbance (like reservoir release for meeting a particular demand, socio-economic consequences of reservoir performance, or resulting environmental state of the river upstream and downstream from the reservoir). Another way of expressing reservoir adaptive capacity to a disturbing event may include aggregated measures like reservoir robustness, redundancy, resourcefulness and rapidity. A novel measure that combines reservoir performance and its adaptive capacity is proposed in this paper and named ‘dynamic resilience’. The paper also proposes a generic simulation methodology for quantifying reservoir resilience as a function of time. The proposed resilience measure is applied to a single multi-purpose reservoir operation and tested for a set of failure scenarios. 
The dynamic behavior of reservoir resilience is captured using the system dynamics simulation approach, a feedback-based object-oriented method, very effective for modelling complex systems. The results of dynamic resilience are compared with the traditional performance measures in order to identify advantages of the proposed measure. The results confirm that the dynamic resilience is a powerful tool for selecting proactive and reactive adaptive response of a multipurpose reservoir to a disturbing event that cannot be achieved using traditional measures. The generic quantification approach proposed in the paper allows for easy use of dynamic resilience for planning and operations of various civil infrastructure systems.
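The dynamic resilience idea above can be sketched with a minimal system-dynamics loop: step a reservoir mass balance through a failure scenario, track performance as the fraction of demand met, and compute resilience as a function of time from the onset of failure. The inflow series, storage numbers, and resilience formula below are all illustrative assumptions, not the paper's calibrated model.

```python
# Toy sketch of "dynamic resilience" for a single reservoir, using a simple
# mass-balance simulation loop. All numbers are illustrative assumptions.

def simulate(inflows, capacity=100.0, demand=10.0, storage=30.0):
    """Step the reservoir mass balance; performance = fraction of demand met."""
    performance = []
    for q in inflows:
        storage = min(storage + q, capacity)   # add inflow, spill at capacity
        release = min(demand, storage)         # shortfall when storage runs out
        storage -= release
        performance.append(release / demand)   # 1.0 = full performance, 0.0 = total failure
    return performance

def dynamic_resilience(performance, t_fail):
    """Resilience as a function of time: cumulative mean performance
    from the onset of the failure event onward."""
    window = performance[t_fail:]
    return [sum(window[:k + 1]) / (k + 1) for k in range(len(window))]

# Drought scenario: normal inflows, an 8-step drought, then recovery.
perf = simulate([10.0] * 5 + [0.0] * 8 + [15.0] * 10)
rho = dynamic_resilience(perf, t_fail=8)   # storage empties at step 8
```

Unlike a single static resilience number, the trajectory `rho` starts at zero when the reservoir empties and climbs back toward 1.0 as inflows recover, which is roughly the kind of time-dependent performance picture the paper argues for.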

“virgin soil epidemics” in the Americas

This is a seminal 1976 paper by Alfred Crosby on the epidemics that devastated Native Americans after Europeans first came. I’m sure there is plenty of scholarly work since then that may have refined this, but it is horrifying even if some of the details have changed. The most extreme estimates are that as many as 100 million people lived in the Americas pre-Columbus, or one-sixth of all humans alive at the time, and only a few million survived. If true, this is much worse than the Black Death in Europe. This would mean that Native American civilizations might have been equivalent in size and sophistication to European and Asian ones. We just don’t know.

I think this is also a cautionary tale for what a novel disease, or combination of novel diseases, natural or man-made, could do to our current civilization. He does point out, though, that genetics and lack of prior exposure were only part of the story. People at the time did not understand quarantine, for example, and some practices for dealing with the dead led to more contagion. People might have been weakened by exotic diseases like smallpox, then finished off by diseases they had experience with, like malaria or pneumonia. They did not understand how hydration, nutrition, and keeping warm could maintain their strength to fight off secondary infections, or else people may have been too sick to fetch water and food and keep fires burning. Hopefully we can do much better today if and when some terrible epidemic strikes.

radiation during your flight to Mars

Radiation exposure could be a problem on flights to Mars, according to Nature.

Cosmic radiation exposure and persistent cognitive dysfunction

The Mars mission will result in an inevitable exposure to cosmic radiation that has been shown to cause cognitive impairments in rodent models, and possibly in astronauts engaged in deep space travel. Of particular concern is the potential for cosmic radiation exposure to compromise critical decision making during normal operations or under emergency conditions in deep space. Rodents exposed to cosmic radiation exhibit persistent hippocampal and cortical based performance decrements using six independent behavioral tasks administered between separate cohorts 12 and 24 weeks after irradiation. Radiation-induced impairments in spatial, episodic and recognition memory were temporally coincident with deficits in executive function and reduced rates of fear extinction and elevated anxiety. Irradiation caused significant reductions in dendritic complexity, spine density and altered spine morphology along medial prefrontal cortical neurons known to mediate neurotransmission interrogated by our behavioral tasks. Cosmic radiation also disrupted synaptic integrity and increased neuroinflammation that persisted more than 6 months after exposure. Behavioral deficits for individual animals correlated significantly with reduced spine density and increased synaptic puncta, providing quantitative measures of risk for developing cognitive impairment. Our data provide additional evidence that deep space travel poses a real and unique threat to the integrity of neural circuits in the brain.