Author Archives: rdmyers75@hotmail.com

critical natural capital

This article in Ecological Economics is about the idea of critical natural capital, which is meant to bridge the gap between strong sustainability, which says manufactured capital cannot be substituted for natural capital, and weak sustainability, which says it can. The critical natural capital view is that some, but not all, natural capital can be substituted for, because some of it is, well, critical.

The other theme of this paper is the “capability approach”, which is based on the ideas of Amartya Sen. Reading Amartya Sen is on my list of things to do eventually someday, but I haven’t gotten to that yet.

This article is an attempt to conceptually improve the notion of strong sustainability by creating a rapprochement between its core concept, critical natural capital, and the capability approach. We first demonstrate that the capability approach constitutes a relevant framework for analysing the multiple links between human well-being and critical natural capital. Second, we demonstrate that the rapprochement between critical natural capital and the capability approach can form both the normative basis and the informational basis for a deliberative approach to human development which embraces a strong sustainability perspective. This conceptual rapprochement, as illustrated in our case study, opens up avenues of research towards the practical implementation of human development projects from a strong sustainability perspective.

Pennsylvania is #1…

…in government fragmentation, according to this 2003 paper by David Rusk at the Brookings Institution.

  • The Commonwealth of Pennsylvania has created the nation’s most fragmented system of local government within its metropolitan areas.
  • State policies have contributed to uncontrolled urban sprawl by making its “little boxes” governments so highly dependent on local property taxes, promoting a constant ratables chase. Over the last fifty years Pennsylvania ranks second only to West Virginia in consuming the most land for the least population growth.
  • The combination – constant outward development overlaying a pattern of immutable local government boundaries – has condemned Pennsylvania’s “inelastic” central cities, most boroughs, and many “built-out” townships to population, economic, and fiscal decline.
  • The many governmental “little boxes” actively contribute to the high degree of racial and economic segregation that characterizes Pennsylvania’s metropolitan areas.
  • Whether through costly inefficiencies, high social and economic disparities, or cutthroat inter-municipal competition, Pennsylvania’s governmental system of “little boxes” also retards its economic growth.
  • Sprawl and steady abandonment of “inelastic” central cities, most boroughs, and many “built-out” townships also implicitly means abandonment (or certainly underutilization) of existing physical infrastructure (houses, stores, factories, water and sewer lines, etc.) that cost prior generations a fortune to create originally and is even more expensive to duplicate anew. Discarding this investment is decidedly fiscally wasteful.

The obvious answer would be to reorganize around metropolitan areas. The Minneapolis-St. Paul Metropolitan Council is one model of a regional government with real teeth. I am not an expert on state constitutional law, but it may be that Pennsylvania’s “home rule” state constitution makes it difficult to do something similar. Or it may be that the system of representative government gives outsize power to representatives from relatively less populous places, so the state legislature is unlikely to overhaul things even if the constitution would allow it. Even if these problems could be solved at the state level, the Philadelphia metro area would still cover parts of New Jersey and Delaware. So the remaining option is massive changes to the United States Constitution abolishing states entirely in favor of metropolitan areas. I haven’t noticed that in any campaign platforms lately.

 

DIY programming and robotics stuff

Here’s some programming and robotics stuff that could be useful for education and do-it-yourself projects around the house. Oh, the things I could do if I didn’t have to work for a living…

  • The Raspberry Pi is a low cost, credit-card sized computer that plugs into a computer monitor or TV, and uses a standard keyboard and mouse. It is a capable little device that enables people of all ages to explore computing, and to learn how to program in languages like Scratch and Python. It’s capable of doing everything you’d expect a desktop computer to do, from browsing the internet and playing high-definition video, to making spreadsheets, word-processing, and playing games.

  • Arduino is an open-source electronics platform based on easy-to-use hardware and software. It’s intended for anyone making interactive projects.

  • VEX IQ is a robotics platform designed to transform STEM learning for young students and their teachers. Students as young as 8 can jump right in and snap robots together using this intuitive, toolless platform while educators can utilize the free VEX IQ Curriculum to help teach them valuable lessons and skills that are needed in today’s changing world. The VEX IQ Challenge, presented by the Robotics Education & Competition Foundation, gives students affordable access to the inspiration, excitement and learning that comes from participating in a STEM challenge.
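As a taste of the kind of first project these platforms invite, here is a tiny Python sketch – smoothing noisy sensor readings with a moving average, something a beginner might try on a Raspberry Pi before wiring up real hardware. The readings here are invented for illustration.

```python
# A moving-average filter: smooths out spiky sensor readings, a classic
# beginner exercise before connecting a real temperature or distance sensor.

def moving_average(readings, window=3):
    """Return a list where each value is the mean of the last `window` readings."""
    smoothed = []
    for i in range(len(readings)):
        chunk = readings[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

raw = [20.1, 20.3, 25.0, 20.2, 20.4]  # one spiky reading at index 2
print(moving_average(raw))
```

Nothing here is specific to any one board – the same few lines run on a Raspberry Pi, a laptop, or inside a robotics controller that speaks Python.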

Joe Jenkins

Here’s an interview with Joe Jenkins, author of The Humanure Handbook, a guide to composting toilets. Composting toilets are a potentially very good idea – they could save enormous amounts of water, energy, and money; address problems caused by aging and inadequate wastewater infrastructure in developed countries; and bring life-saving basic sanitation to billions who don’t have it now.

There are commercial composting toilet systems available – you can see some in Chapter 6 of Jenkins’s book. I don’t know why they have not caught on more widely. Maybe it’s a case like the QWERTY keyboard where the design that caught on is not the best one available, but simply an adequate design that got implemented at scale first, and is now hard to displace. Or maybe the designs are too expensive and/or just not good enough to overcome the incredible power of social taboos about human waste, which are not to be taken lightly. If this is the case, the technology may be stuck in a chicken and egg problem where it is not quite good enough to be adopted on a larger scale, and it is not going to be improved unless and until it is subjected to a larger marketplace. People are not going to take a risk on it as long as they are content with the flush toilet system they already have. That said, it really would not be rocket science to come up with better designs. It would just have to be taken seriously as a research and development project and have some real resources thrown at it, the kind of resources we routinely throw at weapons, chemicals, drugs and electronics.

Let’s assume we get to a better, cheaper composting design that everyone will want in their house – what then? The composting toilets people are using now need a carbon source such as sawdust to balance out the nitrogen in the feces. That is fine on a small scale, but on a large scale we would now need a system to produce and distribute sawdust or something similar to billions of people. That sounds like a sustainability problem. A possible solution there would be to build the carbon source into packaging of consumer products – instead of all the plastic wrap we use now, make consumer packaging out of some sort of carbonaceous waste (corn stalks, switchgrass?). When people unwrap things they would just throw it into the toilet.
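The sawdust balancing act can be put in rough numbers. Composting guides typically target a blended carbon-to-nitrogen ratio of around 25–30:1; the carbon and nitrogen fractions below are illustrative textbook figures, not measurements, and the helper function is just a sketch of the arithmetic.

```python
# Rough carbon-to-nitrogen blending arithmetic for a compost mix.
# The fractions are typical textbook figures, assumed for illustration:
# nitrogen-rich waste roughly C:N = 10:1, sawdust roughly C:N = 400:1.

def blended_cn(components):
    """components: list of (mass_kg, carbon_fraction, nitrogen_fraction)."""
    carbon = sum(m * c for m, c, n in components)
    nitrogen = sum(m * n for m, c, n in components)
    return carbon / nitrogen

# 10 kg of nitrogen-rich waste (about 40% C, 4% N -> C:N = 10)
# plus 15 kg of sawdust (about 48% C, 0.12% N -> C:N = 400)
mix = [(10, 0.40, 0.040), (15, 0.48, 0.0012)]
print(round(blended_cn(mix), 1))
```

By this rough math, balancing 10 kg of nitrogen-rich waste takes something like 15 kg of sawdust to land in the target range – which gives a feel for the scale of the carbon-source distribution problem.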

The next problem is what to do with the compost. Compost is great stuff that gardeners love. But not everybody is a gardener, and now you have done this on a large scale. You have to collect the stuff and get it to gardens, parks or farm fields where it can be used. So now you are back to a system of trucks or pipes to do this – not much different from what we do now, except you have moved the treatment step from the central wastewater plant to individual homes.

A biogas system is a possible alternative technology. Instead of the aerobic composting system, you would put your bodily waste, carbonaceous packaging material, and food waste (the same ingredients from your aerobic system) in a sealed reactor with the right microbes to break it down to methane. The solids remaining should be less than with the aerobic system, although you still have to deal with them. You can use the methane for anything you use natural gas for now – heating, hot water, or electricity, which you can either use or sell back to the grid. An intriguing possibility is to feed it into a fuel cell rather than burning it. Whereas both aerobic composting and combustion will liberate the carbon from the carbon source back into the atmosphere (if the carbon source is plant-based, it will be the same carbon absorbed from the atmosphere when the plants were grown), an ideal fuel cell (which may not have been invented yet) would theoretically produce only electricity, clean water, and elemental carbon. So in theory, the carbon is sequestered. You still need to pick it up and do something with it. Since I’m daydreaming, we’ll use some kind of biotechnology to turn it into cement.
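To get a feel for the energy side, here is some back-of-envelope Python. The yield figures (biogas per kilogram of volatile solids, methane share, energy content) are ballpark literature values assumed for illustration, not a design calculation.

```python
# Back-of-envelope biogas arithmetic. All three constants are rough
# literature ballparks, assumed here for illustration only.

BIOGAS_PER_KG_VS = 0.4   # m^3 of biogas per kg of volatile solids (assumed)
METHANE_FRACTION = 0.6   # share of biogas that is methane (assumed)
MJ_PER_M3_CH4 = 36.0     # energy content of methane, approximate

def energy_mj(kg_volatile_solids):
    """Estimated usable energy (MJ) from digesting a given mass of waste."""
    methane_m3 = kg_volatile_solids * BIOGAS_PER_KG_VS * METHANE_FRACTION
    return methane_m3 * MJ_PER_M3_CH4

# A household digesting roughly 2 kg of volatile solids per day:
print(round(energy_mj(2), 1), "MJ/day")
```

Under these assumptions a household digester yields on the order of 17 MJ a day – roughly one hot shower's worth of heat – consistent with biogas being a useful bonus rather than a home's main energy supply.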

algorithms

Algorithms don’t sound like a topic for riveting reading, but these two articles are pretty good.

The first is from the anti-consumerist magazine Adbusters. The claim it makes – that markets have never really worked before, but are starting to work now because of computer algorithms – is a bit of a stretch, but entertaining. Here’s a quote:

The critical flaw in Hayek’s vision of the hand was that a “central body” could never gather enough information. We know this to be untrue, and with big data and the analysis and manipulation of that data through algorithmic equation, the missing link between money and the machine was discovered.

The searches we make, the news we read, the dates we go on, the advertisements we see, the products we buy and the music we listen to. The stock market … All informed by this marriage between mathematics and capital, all working together in perfect harmony to achieve a singular goal — equilibrium. But it’s a curious sort of equilibrium. Less to do with the relationship between supply and demand, and more about the man and the market.

All these algorithms we encounter throughout the day, they’re working toward a greater goal: solving problems and learning how to think. Like the advent and rise of high–frequency trading, they’re part of an optimization trend that leads to a strange brand of perfection: automated profit.

The second, from ESPN, is about how numbers are being crunched by big-time professional sports gamblers:

Eventually, he grew to understand one of Walters’ keys to success: Some of his bets were intentional losers, designed to manipulate the bookmakers’ odds. Walters might bet $50,000 on a team giving 3 points, then $75,000 more on the same team when the line reaches 3.5. The moment the line gets to 4, a runner is instructed to immediately place a larger bet — perhaps $250,000 — on the other team. The $125,000 on the initial lines will be lost, but if things go according to plan, the $250,000 on the other side will win enough to make up for it many times over. Walters uses the same method on multiple games, often risking millions each weekend.

Since the days of the Computer Group, analytically inclined professional gamblers have relied on technology as well as research to produce what is called a delta: the difference between the Vegas line and what the bettors conclude the point spread should be. The greater the delta, the more money a gambler like Walters will bet. There’s nothing illegal about manipulating lines, and many prominent gamblers have the ability to move a line with as little as $1,000. Walters’ strategy is simply more sophisticated and uses more people, better information and, of course, more dollars bet in far more places than anyone else’s, insiders say…

The vast Walters network also includes a guy on the East Coast known as The Reader, who scans local newspapers, websites, blogs and Twitter for revealing tidbits or injury updates. That information is weighed and plugged into the computer alongside other statistical data — from field conditions to intricate breakdowns of officiating crews. Armed with algorithms and probability theories, the objective is to find the mispriced team, then hammer the line to where Walters wants it.
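The arithmetic in the Walters example is easy to check, with one assumption the article leaves out: standard -110 point-spread pricing, where you risk $110 to win $100.

```python
# Net result of the line-moving scheme described in the ESPN excerpt,
# assuming standard -110 point-spread pricing (an assumption; the article
# doesn't state the odds). Pushes and middles are ignored for simplicity.

VIG_PAYOUT = 100 / 110  # winnings per dollar staked at -110 (assumed)

def net_result(losing_stakes, winning_stakes):
    """Net profit when the decoy bets lose and the real bet wins."""
    lost = sum(losing_stakes)
    won = sum(stake * VIG_PAYOUT for stake in winning_stakes)
    return won - lost

# Decoy bets: $50,000 at -3 and $75,000 at -3.5 (both lose).
# Real bet: $250,000 on the other side at +4 (wins).
print(round(net_result([50_000, 75_000], [250_000])))
```

Under that assumption the winning $250,000 pays out about $227,000, so the scheme nets roughly $102,000 even after burning $125,000 on the decoy bets.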

 

designing ecosystem complexity

This article in Ecological Engineering is about measuring and purposely designing complexity into ecosystems to support biodiversity. I like this idea – certainly grass and trees are a step up from concrete in cities, but there might be some relatively simple design choices that could improve conditions for both wildlife and people without adding effort or cost. We actually expend enormous amounts of time, effort, and money maintaining our grass and trees, whereas natural ecosystems manage to maintain themselves while being more beautiful, diverse, and productive. The first step is to understand these systems better; the second is to understand which variables we can manipulate; the third and most difficult step, as always, is translating that new understanding into actions on the ground and getting people to actually take them.

Simplification of natural habitats has become a major conservation challenge and there is a growing consensus that incorporating and enhancing habitat complexity is likely to be critical for future restoration efforts. Habitat complexity is often ascribed an important role in controlling species diversity, however, despite numerous empirical studies the exact mechanism(s) driving this association remains unclear. The lack of progress in untangling the relationship between complexity and diversity is partly attributable to the considerable ambiguity in the use of the term ‘complexity’. Here, we offer a new framework for conceptualizing ecological complexity, an essential prerequisite for the development of analytical methods for creating and comparing habitat complexity. Our framework distinguishes between two fundamental forms of complexity: information-based complexity and systems-based complexity. Most complexity–diversity studies are concerned with informational complexity which can be measured in the field through a variety of metrics (e.g. fractal dimensions, rugosity, etc.), but these metrics cannot be used to re-construct three-dimensional complex habitats. Drawing on our operational definition of informational complexity, it is possible to design habitats with different degrees of physical complexity. We argue that the ability to determine or modify the variables of complexity precisely has the potential to open up new lines of research in diversity theory and contribute to restoration and reconciliation by enabling environmental managers to rebuild complexity in anthropogenically-simplified habitats.
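One of the informational-complexity metrics the abstract mentions, rugosity, is simple enough to sketch. A common definition is the ratio of surface-contour distance to straight-line planar distance; this one-dimensional Python version is a generic illustration of that idea, not the authors' specific method.

```python
# Rugosity for a one-dimensional elevation profile: the ratio of the
# distance measured along the surface contour to the flat planar distance.
# A perfectly flat habitat scores 1.0; more physical structure scores higher.

import math

def rugosity(heights, spacing=1.0):
    """heights: elevations sampled at equal horizontal spacing."""
    contour = sum(
        math.hypot(spacing, heights[i + 1] - heights[i])
        for i in range(len(heights) - 1)
    )
    planar = spacing * (len(heights) - 1)
    return contour / planar

flat = [0, 0, 0, 0]
bumpy = [0, 2, 0, 2]
print(rugosity(flat), rugosity(bumpy))
```

A flat profile scores exactly 1.0 and the bumpy one about 2.24; in principle a designer could dial a constructed habitat toward a target value, which is exactly the kind of manipulable variable the paper argues for.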

What Should We Be Worried About?

Need new stuff to worry about? They have a book for that!

What Should We Be Worried About?: Real Scenarios That Keep Scientists Up at Night
by John Brockman

Steven Pinker uncovers the real risk factors for war * Mihaly Csikszentmihalyi peers into the coming virtual abyss * Nobel laureate Frank Wilczek laments our squandered opportunities to prevent global catastrophe * Seth Lloyd calculates the threat of a financial black hole * Alison Gopnik on the loss of childhood * Nassim Nicholas Taleb explains why firefighters understand risk far better than economic “experts” * Matt Ridley on the alarming re-emergence of superstition * Daniel C. Dennett and George Dyson ponder the impact of a major breakdown of the Internet * Jennifer Jacquet fears human-induced damage to the planet due to “the Anthropocebo Effect” * Douglas Rushkoff fears humanity is losing its soul * Nicholas Carr on the “patience deficit” * Tim O’Reilly foresees a coming new Dark Age * Scott Atran on the homogenization of human experience * Sherry Turkle explores what’s lost when kids are constantly connected * Kevin Kelly outlines the looming “underpopulation bomb” * Helen Fisher on the fate of men * Lawrence Krauss dreads what we don’t know about the universe * Susan Blackmore on the loss of manual skills * Kate Jeffery on the death of death * plus J. Craig Venter, Daniel Goleman, Virginia Heffernan, Sam Harris, Brian Eno, Martin Rees, and more

Sustainable Cities Index

Before I read the new Sustainable Cities Index published by ARCADIS, I gave some thought to how I would put together a sustainable cities index. (Disclosure: I work in the same industry as this company, but have no affiliation with it.) Here is what I would try to do, at least for the environmental component (I haven’t given much thought to the social component):

  • Estimate the consumption (food, water, energy, and materials) of people in that city. Also estimate the waste produced – solid waste, air emissions, and water pollution – that is not recycled in some way.
  • Try to figure out roughly where this food, water, energy, and stuff came from, whether from within the city or without. Figure out the impact on soils, water bodies, the atmosphere, and natural ecosystems of producing all these materials. You could try to express this in land terms, energy terms, dollar terms, theoretically even information terms – there is no best or perfect way, although “ecological footprint” in terms of land is possibly the most intuitive.
  • Figure out, if everybody on Earth lived in cities exactly like a particular city, whether the Earth could support that indefinitely, and what the ultimate state of the remaining natural ecosystems would be.
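The third step reduces to simple arithmetic in the ecological-footprint framing. The global biocapacity figure (about 1.6 global hectares per person) is the commonly cited Global Footprint Network number, and the sample city footprints below are invented for illustration.

```python
# "How many Earths would it take if everyone lived like city X?"
# Biocapacity figure is the commonly cited ~1.6 global hectares (gha)
# per person; the city footprints are made up for illustration.

GLOBAL_BIOCAPACITY_GHA_PER_PERSON = 1.6  # approximate, assumed

def earths_needed(city_footprint_gha_per_person):
    return city_footprint_gha_per_person / GLOBAL_BIOCAPACITY_GHA_PER_PERSON

for city, footprint in [("frugal city", 1.2), ("typical rich-world city", 6.4)]:
    print(city, round(earths_needed(footprint), 2))
```

A city whose residents' footprint is 6.4 gha per person would need four Earths if universalized, while one at 1.2 gha fits within a single planet.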

By doing this, you would have an absolute benchmark to compare any city against, rather than just a relative benchmark to compare cities to each other. The problem with a relative benchmark is that it doesn’t tell you whether your best cities are good enough.

A possible variation would be to attribute impacts to the city where a particular corporation or industry is headquartered, rather than where the demand is. Let’s say a particularly unsustainable, sprawling development pattern is built in India, but is designed by a firm based in Singapore, or Geneva, or wherever. Or people in a city with very clean air and water eat food grown in recently burned-down rainforests overseas, on plantations owned by corporations headquartered in that city. You could assign all or a portion of the blame to the designer or owner rather than the people living where the impact is actually occurring, who may not always have had much of a choice.

Now, let’s see what ARCADIS did. I don’t mean to be too critical – good for them for doing something and giving people something to think about, which is more than most companies do.

The People sub-index rates transport infrastructure, health, education, income inequality, work-life balance, the dependency ratio and green spaces within cities. These indicators can be broadly thought of as capturing ‘quality of life’ for the populace in the respective cities.
The Planet sub-index looks at city energy consumption and renewable energy share, recycling rates, greenhouse gas emissions, natural catastrophe risk, drinking water, sanitation and air pollution.
The Profit sub-index examines performance from a business perspective, combining measures of transport infrastructure (rail, air, other public transport and commuting time), ease of doing business, the city’s importance in global economic networks, property and living costs, GDP per capita and energy efficiency.

how U.S. taxes are spent

In a poll of U.S. taxpayers, 95% of respondents had no idea how much the country spends on foreign aid, which is in fact around 1% of the federal budget. It just shows that although rational people can disagree on how our tax money should be spent, we are not having a rational debate, because most of us have no clue what it is really being spent on. A taxpayer receipt is a simple idea to help cure this problem. Ideally this would be done by the IRS or Treasury Department, but the White House has stepped in to do it since nobody else will.

I picked a hypothetical married couple with children making an income of $80,000 per year. There are a million different ways you can slice it. But no matter what you do, the biggest categories jump out at you. Income tax is only about 40% of what the government takes in from this family, with Social Security and Medicare taxes making up the other 60%. Pensions for retired people make up the majority of how the money is spent, with Social Security alone making up almost half. Even without an official public health care system, the federal government spends a lot on health care – almost 20% of all the money spent between Medicare (for older people) and programs to help lower-income people (Medicaid and children’s health insurance programs mostly). Of course, state and local governments also tax and spend on health care, which is not reflected here.

The military makes up around 10% of all federal spending. If you add veterans’ benefits (I’m double counting here, since these include retirement and health care) and homeland security, that number comes to more like 13%.
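A taxpayer receipt like the one described above is easy to mock up. The category shares below loosely follow the post's rough numbers (pensions a majority, health care around 20%, military and security around 13%) and are illustrative only, not an official federal breakdown.

```python
# A minimal "taxpayer receipt": split a family's total federal tax
# across spending categories. Shares are illustrative approximations
# loosely based on the figures discussed in the post.

SHARES = {
    "retirement/pensions": 0.50,
    "health care": 0.20,
    "military & security": 0.13,
    "everything else": 0.17,
}

def receipt(total_tax):
    """Return a category -> dollars breakdown of a given total tax bill."""
    return {category: round(total_tax * share, 2)
            for category, share in SHARES.items()}

for category, amount in receipt(10_000).items():
    print(f"{category:>22}: ${amount:,.2f}")
```

Even a crude printout like this would tell a family more about where their money goes than most of us currently know.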

So however you slice it, the big numbers are retirement, health care, and defense. If we want to make significant changes in either the amount of tax or the outcomes of government programs, we should focus most of our debating energies in these areas.