Category Archives: Peer Reviewed Article Review

evaporation energy

There is a lot of energy in evaporation, and there are technologies that theoretically could harvest it for human use.

About 50% of the solar energy absorbed at the Earth’s surface drives evaporation, fueling the water cycle that affects various renewable energy resources, such as wind and hydropower. Recent advances demonstrate our nascent ability to convert evaporation energy into work, yet there is little understanding about the potential of this resource. Here we study the energy available from natural evaporation to predict the potential of this ubiquitous resource. We find that natural evaporation from open water surfaces could provide power densities comparable to current wind and solar technologies while cutting evaporative water losses by nearly half. We estimate up to 325 GW of power is potentially available in the United States. Strikingly, water’s large heat capacity is sufficient to control power output by storing excess energy when demand is low, thus reducing intermittency and improving reliability. Our findings motivate the improvement of materials and devices that convert energy from evaporation.

This is interesting. Cutting evaporation losses in half could be a good thing in some situations, like reservoirs and swimming pools in arid regions. Cut too much evaporation elsewhere, and you could imagine a science fiction scenario where you have a full reservoir but nearby ecosystems or farmland turn into deserts. Or you end up pumping water from that reservoir for irrigation with the energy you have harvested, in the end using technology to efficiently recreate the hydrologic cycle and ecosystem services nature used to provide for free.
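To put the claim of “power densities comparable to current wind and solar technologies” in perspective, here is a back-of-envelope sketch (mine, not the paper’s) of the latent heat flux leaving an open water surface, assuming roughly one meter of evaporation per year, a typical figure for U.S. reservoirs:

```r
# Back-of-envelope latent heat flux from an evaporating water surface.
# Assumptions (mine, not the paper's): ~1 m of evaporation per year and a
# latent heat of vaporization of ~2.45 MJ/kg near 20 C.
evap_m_per_yr  <- 1.0                  # annual evaporation depth (m)
rho_water      <- 1000                 # density of water (kg/m^3)
latent_heat    <- 2.45e6               # latent heat of vaporization (J/kg)
seconds_per_yr <- 365.25 * 24 * 3600

mass_flux <- evap_m_per_yr * rho_water / seconds_per_yr  # kg of water per m^2 per second
heat_flux <- mass_flux * latent_heat                     # W per m^2
heat_flux   # roughly 78 W/m^2 carried off as latent heat
```

That ~78 W/m² is the energy the vapor carries away, not what any device could capture; the harvestable fraction would be much smaller, but it gives a sense of why the resource is worth studying at all.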

Battlefield Casualties and Ballot Box Defeat

This surprising study from Boston University and the University of Minnesota concludes that communities that suffered casualties in the Afghanistan and Iraq wars might have supplied the swing voters who put Trump over the top in Pennsylvania, Michigan, and Wisconsin.

Kriner, Douglas L. and Shen, Francis X., Battlefield Casualties and Ballot Box Defeat: Did the Bush-Obama Wars Cost Clinton the White House? (June 19, 2017). Available at SSRN: https://ssrn.com/abstract=2989040

America has been at war continuously for over 15 years, but few Americans seem to notice. This is because the vast majority of citizens have no direct connection to those soldiers fighting, dying, and returning wounded from combat. Increasingly, a divide is emerging between communities whose young people are dying to defend the country, and those communities whose young people are not. In this paper we empirically explore whether this divide—the casualty gap—contributed to Donald Trump’s surprise victory in November 2016. The data analysis presented in this working paper finds that indeed, in the 2016 election Trump was speaking to this forgotten part of America. Even controlling in a statistical model for many other alternative explanations, we find that there is a significant and meaningful relationship between a community’s rate of military sacrifice and its support for Trump. Our statistical model suggests that if three states key to Trump’s victory – Pennsylvania, Michigan, and Wisconsin – had suffered even a modestly lower casualty rate, all three could have flipped from red to blue and sent Hillary Clinton to the White House. There are many implications of our findings, but none as important as what this means for Trump’s foreign policy. If Trump wants to win again in 2020, his electoral fate may well rest on the administration’s approach to the human costs of war. Trump should remain highly sensitive to American combat casualties, lest he become yet another politician who overlooks the invisible inequality of military sacrifice. More broadly, the findings suggest that politicians from both parties would do well to more directly recognize and address the needs of those communities whose young women and men are making the ultimate sacrifice for the country.

I acknowledge and am willing to believe the numbers. I am not sure I buy the conclusions these authors draw from the numbers – that communities with ties to the military will vote for candidates they think are least likely to send their children off to war. On the contrary, I would hypothesize that people in these communities might respond more strongly to patriotic rhetoric, and be more likely to support military approaches to geopolitical problems.

climate change and Hurricane Harvey

Michael Mann, a climate scientist at Penn State, has posted a long (for Facebook) article about how climate change contributes to events like this. In short, climate determines the probability of a particular weather event occurring, but ultimately any one weather event is a roll of the (now slightly loaded) dice. Warmer water and warmer air than in the past have made events like this more likely, and are making them more destructive when they do occur. The article links to several journal articles that would be worth reading for anyone who wants to know something about hydrology and climate change. But right now I can’t do that, because I’m late for my job, where I have to convince people I know something about, among other topics, hydrology and climate change.

Sea level rise attributable to climate change (some is due to coastal subsidence due to human disturbance e.g. oil drilling) is more than half a foot over the past few decades (see http://www.insurancejournal.com/…/sou…/2017/05/31/452704.htm for a decent discussion).

That means that the storm surge was a half foot higher than it would have been just decades ago, meaning far more flooding and destruction.

In addition to that, sea surface temperatures in the region have risen about 0.5C (close to 1F) over the past few decades, from roughly 30C (86F) to 30.5C (87F), which contributed to the very warm sea surface temperatures (30.5-31 C or 87-88F). There is a simple thermodynamic relationship known as the “Clausius-Clapeyron equation” (see e.g. https://en.wikipedia.org/…/Clausius%E2%80%93Clapeyron_relat…) that tells us there is a roughly 3% increase in average atmospheric moisture content for each 0.5C (~1F) of warming. Sea surface temperatures in the area where Harvey intensified were 0.5-1C warmer than current-day average temperatures, which translates to 1-1.5C warmer than the ‘average’ temperatures a few decades ago. That means 3-5% more moisture in the atmosphere.
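The ~3% per 0.5C figure is easy to sanity-check. Here is a small sketch using a standard empirical approximation of saturation vapor pressure (the Bolton formula); the code is mine, not Mann’s:

```r
# Saturation vapor pressure over water (hPa), Bolton (1980) approximation,
# with temperature in degrees C.
e_sat <- function(temp_c) 6.112 * exp(17.67 * temp_c / (temp_c + 243.5))

# Fractional increase in the atmosphere's water-holding capacity for a
# 0.5 C warming near a 30 C sea surface:
(e_sat(30.5) / e_sat(30.0) - 1) * 100   # about 2.9%, i.e. roughly the 3% quoted above
```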

Sowing density effects and patterns of colonization

That’s plant colonization, in case you were wondering what kind of colonization I am talking about. This study has a fairly simple premise – that in restoration you can sow the seeds that have the most trouble establishing at the highest densities, and seeds of plants that germinate and spread easily at lower densities, or even not at all.

Sowing density effects and patterns of colonization in a prairie restoration

A cost-effective approach in plant restorations could be to increase sowing density for species known to be challenging to establish, while reducing sowing density for species that easily colonize on their own. Sowing need not occur evenly across the site for rapidly dispersing species. We explored these issues using a prairie restoration experiment on a high-school campus with three treatments: plots sown only to grasses (G plots), to grasses and forbs (GF1), and to grasses and forbs with forbs sown at twice the density (GF2). In year 2, GF1 and GF2 plots had higher diversity than G plots, as expected, but GF2 treatments did not have twice the sown forb cover. However, high forb sowing density increased forb richness, probably by reducing stochastic factors in establishment. Cover of nonsown species was highest in G plots and lowest in GF2 plots, suggesting suppressive effects of native forbs on weedy species. Colonization of G plots by two sown forbs (Coreopsis tinctoria and Rudbeckia hirta) was apparent after 2.5 years, providing evidence that these species are self-sustaining. Colonization was greater in edges than in the central areas of G plots. Through construction of establishment kernels, we infer that the mean establishment distance was shorter for R. hirta (6.7 m) compared to C. tinctoria (21.1 m). Our results lead us to advocate for restoration practices that consider not only seed sowing but also subsequent dispersal of sown species. Furthermore, we conclude that restoration research is particularly amenable for outdoor education and university-high school collaborations.
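I was curious what “construction of establishment kernels” might look like in practice. A minimal sketch (my own illustration, not the authors’ method) is to fit a simple exponential kernel to the distances of colonizing seedlings from the seed source and read the mean establishment distance off the fitted rate:

```r
# Hypothetical sketch: estimate a mean establishment distance by fitting an
# exponential kernel to seedling distances. The distances are simulated here;
# the paper's actual kernel construction may differ.
library(MASS)
set.seed(1)
seedling_dist_m <- rexp(200, rate = 1 / 6.7)   # fake distances with a 6.7 m mean

fit <- fitdistr(seedling_dist_m, "exponential")
1 / fit$estimate["rate"]                       # recovered mean establishment distance (m)
```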

200,000 annual deaths from air pollution in the U.S.

A 2013 study estimated the number of annual premature deaths due to air pollution in the U.S. at about 200,000. That’s kind of a shocking number, considering it is more than deaths from other preventable causes like car accidents and suicides. An interesting (not in a good way) finding is that road transportation causes more deaths (~53,000/yr) from air pollution than from crashes. On the other hand, it means you can kill two birds with one stone when you institute policies and technologies that reduce vehicle emissions, driving, or both. Of course, a shift to electric cars just moves the emissions to power plants in the short term, but that concentrates them at a much smaller number of centralized sources, which might be easier to deal with. A shift to more muscle-powered transportation in our cities would be a huge win for health (fewer violent deaths and injuries, fewer deaths from dirty air, more exercise in all that clean fresh air, probably better mental health), and a win for land use, vibrancy, and getting to know one another in our cities.

Air pollution and early deaths in the United States. Part I: Quantifying the impact of major sectors in 2005

Combustion emissions adversely impact air quality and human health. A multiscale air quality model is applied to assess the health impacts of major emissions sectors in the United States. Emissions are classified according to six different sources: electric power generation, industry, commercial and residential sources, road transportation, marine transportation and rail transportation. Epidemiological evidence is used to relate long-term population exposure to sector-induced changes in the concentrations of PM2.5 and ozone to incidences of premature death. Total combustion emissions in the U.S. account for about 200,000 (90% CI: 90,000–362,000) premature deaths per year in the U.S. due to changes in PM2.5 concentrations, and about 10,000 (90% CI: −1000 to 21,000) deaths due to changes in ozone concentrations. The largest contributors for both pollutant-related mortalities are road transportation, causing ∼53,000 (90% CI: 24,000–95,000) PM2.5-related deaths and ∼5000 (90% CI: −900 to 11,000) ozone-related early deaths per year, and power generation, causing ∼52,000 (90% CI: 23,000–94,000) PM2.5-related and ∼2000 (90% CI: −300 to 4000) ozone-related premature mortalities per year. Industrial emissions contribute to ∼41,000 (90% CI: 18,000–74,000) early deaths from PM2.5 and ∼2000 (90% CI: 0–4000) early deaths from ozone. The results are indicative of the extent to which policy measures could be undertaken in order to mitigate the impact of specific emissions from different sectors — in particular black carbon emissions from road transportation and sulfur dioxide emissions from power generation.
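For a sense of where numbers like “~53,000 PM2.5-related deaths” come from, health impact studies typically combine an epidemiological concentration-response coefficient with a modeled change in exposure. A generic sketch follows; the inputs are illustrative values of my own choosing, not this paper’s coefficients or exposures:

```r
# Generic log-linear concentration-response calculation of the kind used in
# air pollution health impact assessment. All inputs are illustrative.
rr_per_10  <- 1.06           # assumed relative risk per 10 ug/m^3 of PM2.5
beta       <- log(rr_per_10) / 10
delta_pm   <- 2.0            # assumed sector-attributable change in PM2.5 (ug/m^3)
baseline   <- 0.008          # assumed baseline mortality rate (deaths per person per year)
population <- 250e6          # assumed exposed population

deaths <- population * baseline * (1 - exp(-beta * delta_pm))
round(deaths)   # about 23,000 premature deaths per year for these made-up inputs
```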

eyes on the street

A group at the University of Pennsylvania looked for statistical evidence that “eyes on the street” are a deterrent to crime. The results are a bit puzzling, as real world data often can be.

Analysis of Urban Vibrancy and Safety in Philadelphia

Statistical analyses of urban environments have been recently improved through publicly available high resolution data and mapping technologies that have been adopted across industries. These technologies allow us to create metrics to empirically investigate urban design principles of the past half-century. Philadelphia is an interesting case study for this work, with its rapid urban development and population increase in the last decade. We focus on features of what urban planners call vibrancy: measures of positive, healthy activity or energy in an area. Historically, vibrancy has been very challenging to measure empirically. We explore the association between safety (violent and non-violent crime) and features of local neighborhood vibrancy such as population, economic measures and land use zoning. Despite rhetoric about the negative effects of population density in the 1960s and 70s, we find very little association between crime and population density. Measures based on land use zoning are not an adequate description of local vibrancy and so we construct a database and set of measures of business activity in each neighborhood. We employ several matching analyses within census block groups to explore the relationship between neighborhood vibrancy and safety at a higher resolution. We find that neighborhoods with more vacancy have higher crime but within neighborhoods, crimes tend not to be located near vacant properties. We also find that more crimes occur near business locations but businesses that are active (open) for longer periods are associated with fewer crimes.

This is particularly fascinating to me because I live my life in the middle of this particular data set and am part of it. So it is very interesting to compare what the data seem to be saying with my own experiences and impressions.

The lack of correlation between population density and crime is not surprising. Two neighborhoods with identical density can be drastically different. The correlation between poverty and crime is not surprising – people who are not succeeding in the formal economy and who are not mobile turn to the informal economy, in other words drug dealing, loan sharking and other illegal ways of trying to earn an income. If they are successful at earning an income, they tend to have a lot of cash around, and other people who know about the cash will take advantage of them, knowing they will not go to the police. Other than going to the police, the remaining options are to be taken advantage of repeatedly, or to retaliate. This is how violence escalates, I believe, and it goes hand in hand with development of a culture that tolerates and even celebrates violence, in a never-ending feedback loop.

The puzzling part comes when they try to drill down and look at explanatory factors at a very fine spatial scale. They found a correlation between crime and mixed-use zoning, which appears to contradict the idea that round-the-clock eyes on the street help to deter crime, and they found more crime around businesses like cafes, restaurants, bars, and retail shops. On the other hand, businesses with longer open hours seemed to have some deterrent effect on crime relative to those with shorter hours.

I think they have made an excellent effort to do this, and I am not sure it can be done a lot better, but I will point out one idea I have. They discuss some limitations and nuances of their data, but one they do not mention is that they are looking at reported crimes, most likely police reports or 911 calls. It could be that business owners, staff, and patrons are much more likely to call 911 and report a crime than are residential neighbors. Business staff and patrons may see reporting as being in their economic interest and as increasing the safety of their families, and the (alleged) criminals they are reporting are generally strangers. In quieter all-residential neighborhoods, people may not observe as many of the crimes that do occur (fewer “eyes on the street”), and they may prefer not to report crimes, whether out of loyalty to their neighbors, a habit of minding their own business, quid pro quo, or in some cases fear of retaliation. There is also the factor of some demographic groups trusting the police more than others, although the authors’ statistical attempts to control for demographics may tend to factor this out.
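For readers wondering what a “matching analysis” looks like in practice, here is a minimal sketch using the MatchIt package on made-up block-group data; the variables are hypothetical stand-ins, not the authors’ Philadelphia database:

```r
# Hypothetical sketch of a matching analysis: compare crime counts for block
# groups with and without an active business, after matching on population
# and vacancy. All data below are simulated stand-ins.
library(MatchIt)
set.seed(7)

blocks <- data.frame(
  has_business = rbinom(500, 1, 0.4),
  population   = rpois(500, 1200),
  vacancy_rate = runif(500, 0, 0.3),
  crime_count  = rpois(500, 20)
)

# Match each block group with a business to a similar block group without one
m <- matchit(has_business ~ population + vacancy_rate,
             data = blocks, method = "nearest")
matched <- match.data(m)

# Compare mean crime counts across the matched groups
with(matched, tapply(crime_count, has_business, mean))
```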


“automated curation of wild places”

This is a fascinating idea that could even be attempted on other planets, and it provides limitless material for dystopian science fiction about what could go wrong and/or whether we could all be experiencing some form of “automated curation” right now.

Designing Autonomy: Opportunities for New Wildness in the Anthropocene
Bradley Cantrell, Laura J. Martin, and Erle C. Ellis

Maintaining wild places increasingly involves intensive human interventions. Several recent projects use semi-automated mediating technologies to enact conservation and restoration actions, including re-seeding and invasive species eradication. Could a deep-learning system sustain the autonomy of nonhuman ecological processes at designated sites without direct human interventions? We explore here the prospects for automated curation of wild places, as well as the technical and ethical questions that such co-creation poses for ecologists, conservationists, and designers. Our goal is to foster innovative approaches to creating and maintaining the autonomy of evolving ecological systems.

After rooting around just a bit I was able to find an open-access copy of this paper here.

R and differential equations

Here’s a new R package for solving differential equations. Sounds like something that might be of interest to only a few ivory tower mathematicians, right? But solving differential equations numerically is the critical core of almost any dynamic simulation model, whether it is simulating water, energy, money, ecology, social systems, or the intertwinings of all of these. So if we are going to understand our systems well enough to solve their problems, we have to have some people around who understand these things on a practical level.
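The post doesn’t name the package, so as a stand-in, here is what numerically solving a simple dynamic model looks like with the long-established deSolve package (my example, not necessarily the package being described): logistic growth of a stock, which could just as well be water in a reservoir, a population, or money in an account.

```r
# Numerically integrate a logistic growth ODE, dN/dt = r * N * (1 - N/K),
# with deSolve's general-purpose ode() solver.
library(deSolve)

logistic <- function(t, state, parms) {
  with(as.list(c(state, parms)), {
    dN <- r * N * (1 - N / K)   # rate of change of the stock
    list(c(dN))
  })
}

out <- ode(y = c(N = 10), times = seq(0, 50, by = 1),
           func = logistic, parms = c(r = 0.2, K = 1000))
head(out)   # a matrix of time versus the simulated stock N
```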

inequality and carbon emissions

A paper in Ecological Economics explores the links between inequality and carbon emissions.

The Trade-off Between Income Inequality and Carbon Dioxide Emissions

We investigate the theoretically ambiguous link between income inequality and per capita carbon dioxide emissions using a panel data set that is substantially larger (in both regional and temporal coverage) than those used in the existing literature. Using an arguably superior group fixed effects estimator, we find that the relationship between income inequality and per capita emissions depends on the level of income. We show that for low and middle-income economies, higher income inequality is associated with lower carbon emissions while in upper middle-income and high-income economies, higher income inequality increases per capita emissions. The result is robust to the inclusion of plausible transmission variables.

It could be that as developing countries develop, greener technologies become available to the working and middle classes faster than their household incomes actually increase. I am thinking of a switch from biomass and coal to electricity and natural gas, for example. These will lower people’s ecological footprint without necessarily costing them a lot more money. Once they start to get more money, they may start to transition to higher-impact behaviors, like driving instead of bicycling, and eating more meat and less grain.

You certainly wouldn’t want to promote income inequality as a policy measure to help the environment. There are social and tax policies that could be pursued instead, for example keeping communities walkable and mixed use even as incomes rise, and pricing meat at its true cost to the environment. These aren’t easy things to do politically in developing countries or anywhere else, of course, because they would require a political system willing to take on corporate interests such as the oil, automobile, highway, and agriculture industries, which tend to be immensely powerful and intertwined with political, bureaucratic, and military elites.
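The paper’s group fixed effects estimator is more elaborate than anything I can sketch here, but for a sense of what a basic panel regression of emissions on inequality looks like, here is a conventional two-way fixed effects (“within”) estimator using the plm package; the panel is simulated, not the authors’ data set:

```r
# Hypothetical sketch of a country-and-year fixed effects panel regression of
# per capita CO2 emissions on income inequality and income. Data are simulated.
library(plm)
set.seed(3)

panel_df <- expand.grid(country = paste0("c", 1:40), year = 2000:2019)
panel_df$gini           <- runif(nrow(panel_df), 25, 55)
panel_df$gdp_per_capita <- exp(rnorm(nrow(panel_df), 9, 1))
panel_df$co2_per_capita <- exp(0.5 * log(panel_df$gdp_per_capita) +
                               0.01 * panel_df$gini +
                               rnorm(nrow(panel_df), 0, 0.3))

# Conventional two-way (country and year) fixed effects "within" estimator
fe_model <- plm(log(co2_per_capita) ~ gini + log(gdp_per_capita),
               data = panel_df, index = c("country", "year"),
               model = "within", effect = "twoways")
summary(fe_model)
```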

synergy, uniqueness, and redundancy in interacting environmental variables

This is a bit over my head, but one thing I am interested in is analyzing and making sense of a large number of simultaneous time series, whether measured in the environment, the economy, or the output of a computer model. This can easily be overwhelming, so one place people often start is trying to figure out which time series are telling essentially the same story, or directly opposite stories. Understanding this allows you to reduce the number of variables you need to analyze to a more manageable number. Time series make this more complicated, though, because two variables could be telling the same or opposite stories, and if the signals are offset in time, simple ways of looking at correlation may not lead to the right conclusions (a toy illustration of this follows the abstract below). Simulations add yet another complicating factor: the implicit links between your variables, intended or not, which may or may not exist in the real world.

Temporal information partitioning: Characterizing synergy, uniqueness, and redundancy in interacting environmental variables

Information theoretic measures can be used to identify non-linear interactions between source and target variables through reductions in uncertainty. In information partitioning, multivariate mutual information is decomposed into synergistic, unique, and redundant components. Synergy is information shared only when sources influence a target together, uniqueness is information only provided by one source, and redundancy is overlapping shared information from multiple sources. While this partitioning has been applied to provide insights into complex dependencies, several proposed partitioning methods overestimate redundant information and omit a component of unique information because they do not account for source dependencies. Additionally, information partitioning has only been applied to time-series data in a limited context, using basic pdf estimation techniques or a Gaussian assumption. We develop a Rescaled Redundancy measure (Rs) to solve the source dependency issue, and present Gaussian, autoregressive, and chaotic test cases to demonstrate its advantages over existing techniques in the presence of noise, various source correlations, and different types of interactions. This study constitutes the first rigorous application of information partitioning to environmental time-series data, and addresses how noise, pdf estimation technique, or source dependencies can influence detected measures. We illustrate how our techniques can unravel the complex nature of forcing and feedback within an ecohydrologic system with an application to 1-minute environmental signals of air temperature, relative humidity, and windspeed. The methods presented here are applicable to the study of a broad range of complex systems composed of interacting variables.
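As a toy illustration of the lag problem I mentioned above, two series that share a signal but are offset in time can look only weakly related at lag zero; base R’s cross-correlation function makes the offset visible. This is my own example, far simpler than the information partitioning the paper develops:

```r
# Two noisy copies of the same autocorrelated signal, one lagging the other by
# 5 time steps. The instantaneous correlation is modest, but the cross-
# correlation function shows a strong peak at the true offset.
set.seed(42)
signal <- as.numeric(arima.sim(model = list(ar = 0.8), n = 500))

x <- signal + rnorm(500, sd = 0.5)
y <- c(rep(NA, 5), head(signal, -5)) + rnorm(500, sd = 0.5)   # y lags x by 5 steps

cor(x, y, use = "complete.obs")                # roughly 0.3 at lag zero
ccf(x, y, lag.max = 20, na.action = na.pass)   # strong peak near lag -5 reveals the shared signal
```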