Tag Archives: simulation

simulation games

This Wired article has a run-down of new(ish) simulation games. Before I entered the intensive child-rearing years, I was one of those people who, like the author, was into this type of game (and also sports games, which are a simulation of sorts) and not so much into arcade-type games. So it is somewhat comforting that there are other people like me.

I keep hearing that the intensive child-rearing years do eventually wind down, and that you remember them fondly as you start to have some time to enjoy your own grownup life again. For my wife and me, there are just some slight twinkles of light at the end of the tunnel. It’s been a long dark tunnel, particularly with Covid, although of course there have been many joyful moments along the way, and over time we will probably remember those and forget the hard parts.

Also appealing to me is the idea of writing my own simulations of real things that I can play, something like games. For example: the stock market? Climate change? The ecology of my neighborhood? Geopolitics? Can I link these things together into one simulation of the universe as it actually plays out, Asimov Foundation-style? Of course not. Many people smarter than me have tried and failed. But the fun could be in the trying. Now, if you will excuse me, I need to attend to the (beautiful, healthy, wonderful in every way) whining children and mountains of dirty laundry and dishes and unpaid bills and things in my house that are broken.

water, energy, and food in macroeconomic models

Here’s an article on how water, energy, and food fit into macroeconomic models. My basic understanding is that traditionally, they don’t. Production functions focus on labor and capital because these are assumed to explain most of the output, and including water, energy, raw material, and even land prices does not make enough difference to bother with. So the methods exist, but economists generally don’t bother because historical data shows these things don’t make much difference. We have certainly seen short-term and regional price shocks in food and energy that have affected economies. But we haven’t really seen a sustained, long-term rise in prices of water, energy, or food; in fact, the long-term trend has clearly been the opposite. Will climate change begin to reverse this at some point? Or is it already happening but our technology is keeping up? Or is it happening slowly, we are adjusting, but the system is becoming more fragile and we are headed for a sudden panic at some point? Like dead wood building up in a forest – the forest may look okay for a long time, and then one day there is a spark, followed by an intense crisis, and then you are left with ashes…
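A toy production function helps show why economists get away with leaving these inputs out: if energy’s share of output is small, even a big supply shock barely moves the total. This is a minimal sketch with made-up parameter values, not a calibrated model:

```python
def output(labor, capital, energy, alpha=0.60, beta=0.35, gamma=0.05):
    """Cobb-Douglas output Y = L^alpha * K^beta * E^gamma (shares sum to 1)."""
    return labor**alpha * capital**beta * energy**gamma

base = output(100, 100, 100)
shock = output(100, 100, 50)   # halve the energy input
print(f"halving energy cuts output by only {100 * (1 - shock / base):.1f}%")
# about 3.4%, because energy's assumed share (gamma) is only 5%
```

Of course, this logic only holds as long as the historical factor shares keep describing the future – which is exactly the question the article raises.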

Critical Reflections on Water-Energy-Food Nexus in Computable General Equilibrium Models: A Systematic Literature Review

The paper analyses how the Water-Energy-Food Nexus is treated in Computable General Equilibrium (CGE) models, discussing their design, importance and possible ways of improvement. The analysis of their structure is critical for evaluating their potential efficiency in understanding the Nexus, which will be particularly effective for gauging the importance of the topic, the reciprocal dependency of its elements and the expected macroeconomic, demographic and climatic pressures that will act on its components. General equilibrium models can be useful devices to this end, as they are specifically built to track interdependencies and transmission effects across sectors and countries. Nevertheless, the review showed that most CGEs in the literature struggle to represent the competing water uses across sectors and, in particular, those concerning the energy sector. Therefore, it highlights the need to resolve this issue as a necessary step toward improving future research.

Environmental Modelling & Software

one more Covid-19 dashboard

In the U.S., it feels like we are done with Covid. At least, for those of us who are vaccinated adults. For those of us with children, life is still not back to normal: even as we are being told we can return to the office, we can’t actually do that while the children are still home. And the world is clearly not done with Covid-19, as vaccination is proceeding slowly in many countries outside North America and Europe.

Anyway, here is one more simulation dashboard that shows an ensemble of simulations going forward up to four weeks. This might be useful to see if there are blips on the horizon when (if?) the kids really are allowed to go back to school in the fall. Here’s an article in MIT Technology Review describing the site.

attribution science, and some thoughts on computer modeling

This Slate article explains how attribution science works. It depends on modeling. Basically, scientists simulate an event (like a storm, flood, fire, whatever) under a counterfactual condition – say, a climate without the human influence – where the event would have been less likely or less severe, and compare that to simulations and data from our actual universe where it did occur.
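As a caricature of the approach, you can simulate an extreme event many times under “with forcing” and “without forcing” assumptions and compare the frequencies. Everything in this sketch – the distribution, the shift, the threshold – is invented for illustration:

```python
import random

random.seed(42)

def annual_max_rainfall(mean_shift):
    # stand-in for a full climate model run: one simulated annual maximum
    return random.gauss(100 + mean_shift, 15)

THRESHOLD = 130      # the observed extreme event
N = 100_000          # number of simulated "years" per world

# World as it is (with the forcing) vs. counterfactual world (without it)
p_factual = sum(annual_max_rainfall(10) > THRESHOLD for _ in range(N)) / N
p_counterfactual = sum(annual_max_rainfall(0) > THRESHOLD for _ in range(N)) / N

risk_ratio = p_factual / p_counterfactual
print(f"event is {risk_ratio:.1f}x more likely in the world with the forcing")
```

Real attribution studies use ensembles of physical climate models rather than a one-line Gaussian, but the comparison of probabilities is the core idea.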

I do a fair amount of modeling in my job, and there are always skeptics (some more informed than others). Why would anyone trust a computer model? Isn’t empirical measurement always better? Well, we model things we can’t measure – often things that could or would have occurred if conditions were different, or things that might happen in the future.

To trust a model, first, somewhat obviously, you need to say clearly what the model is for. Second, you need to be confident that it adequately represents the real-world processes underlying the system you are interested in. Whether this is true requires expert judgment, and the expert needs to really understand the system. If the expert is confident in this, and the expert knows what they are doing, the model has some usefulness even if there is no data. (Purely empirical models like regression equations don’t represent processes, and therefore have limited predictive value if conditions change significantly.) But we always want data.

Third, the modeler will compare what the model predicts to some real data. The modeler needs to be aware that there is always uncertainty in how well measurements represent the real condition of the actual physical universe, and that this uncertainty will propagate through the model (the uninformed often think of this as “model error”). If the prediction is reasonably accurate without tweaking, you may have a pretty good model. Often the modeler will do a little tweaking to improve the fit, but the more tweaking you do, the more you are moving toward an empirical model with less predictive value. In a somewhat old-fashioned (according to me) but common approach in the engineering field, the modeler will set a portion of the data aside while doing the tweaking, then compare the tweaked model to the portion they set aside. I don’t usually do this, because there is never enough data. I tend to use it all, then check the model again when more data becomes available in the future.
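The set-some-data-aside approach can be sketched in a few lines. The “model” here is a hypothetical one-parameter process and the “data” is synthetic, but the calibrate-then-validate mechanics are the same:

```python
import random

random.seed(0)

def model(x, k):
    return k * x                    # hypothetical one-parameter process model

# Synthetic "measurements": the true k is 2.0, plus measurement noise
data = [(x, 2.0 * x + random.gauss(0, 0.5)) for x in range(1, 21)]
calibration, holdout = data[:15], data[15:]

def sse(k, pairs):                  # sum of squared errors against the data
    return sum((model(x, k) - y) ** 2 for x, y in pairs)

# "Tweaking": grid-search k against the calibration set only
k_best = min((k / 100 for k in range(100, 301)),
             key=lambda k: sse(k, calibration))

print(f"fitted k = {k_best:.2f}")
print(f"holdout SSE = {sse(k_best, holdout):.2f}")   # the honest check
```

If the holdout error is similar to the calibration error, the tweaking probably captured the process rather than the noise.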

Finally, we have a model that we are confident represents underlying processes, matches real-world measurements reasonably well, and is suitable for its stated purpose. We can use the model for that purpose, be clear about the known unknowns and unknown unknowns, and draw some conclusions that might be useful in the real world. We have some information that can inform decisions better than guesses alone could have, and that we couldn’t have learned from data alone.

if the universe is a simulation, can it crash?

Hopefully, if the universe is a simulation, it is a stable one. And if it crashes, whatever intelligent entity is out there can call his or her IT guy, spin it up again, fast forward to where it left off, and we won’t know the difference. If it is a simulation, do we really want to know? This article in Scientific American says that if we really want to know, one way to test whether it is a simulation is to try to crash it on purpose. So how would you do that? One way is to build our own simulated universes, then let them build their own simulated universes, and so on. At some point, the hardware of our universe should not be able to run all those universes. So to get to this point, we need to keep working on building way faster computers.
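For what it’s worth, the arithmetic behind “the hardware should not be able to run all those universes” is easy to sketch. Assuming each simulated universe spawns some fixed number of child universes, the total the base hardware must run grows geometrically with nesting depth (the branching factor here is arbitrary):

```python
def total_universes(children_per_universe, depth):
    # every universe at every nesting level, including the top-level one
    return sum(children_per_universe ** level for level in range(depth + 1))

for depth in range(6):
    print(depth, total_universes(3, depth))
# with 3 children each, depth 5 already means 364 simultaneous universes
```

So even a modest branching factor exhausts any finite base machine surprisingly quickly, which is the whole point of the proposed crash test.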

April 2020 in Review

Most frightening and/or depressing story:

  • The coronavirus thing just continued to grind on and on, and I say that with all due respect to anyone reading this who has suffered serious health or financial consequences, or even lost someone they care about. After saying I was done posting coronavirus tracking and simulation tools, I continued to post them throughout the month – for example here, here, here, here, and here. After reflecting on all this, what I find most frightening and depressing is that if the U.S. government wasn’t ready for this crisis, and isn’t able to competently manage this crisis, it is not ready for the next crisis or series of crises, which could be worse. It could be any number of things, including another plague, but what I find myself fixating on is a serious food crisis. I find myself thinking back to past crises – We got through two world wars, then managed to avoid getting into a nuclear war to end all wars, then worked hard to secure the loose nuclear weapons floating around. We got past acid rain and closed the ozone hole (at least for a while). Then I find myself thinking back to Hurricane Katrina – a major regional crisis we knew was coming for decades, and it turned out no government at any level was prepared or able to competently manage the crisis. The unthinkable became thinkable. Then the titans of American finance broke the global financial system. Now we have a much bigger crisis in terms of geography and number of people affected all over the world. The crises may keep escalating, and our competence has clearly suffered a decline. Are we going to learn anything?

Most hopeful story:

  • Well, my posts were 100% doom and gloom this month, possibly for the first time ever! Just to find something positive to be thankful for, it’s been kind of nice being home and watching my garden grow this spring.

Most interesting story, that was not particularly frightening or hopeful, or perhaps was a mixture of both:

  • There’s a comet that might be bright enough to see with the naked eye from North America this month.

March 2020 in Review

To state the obvious, March 2020 was all about the coronavirus. At the beginning of the month, we here in the U.S. watched with horror as it spread through Europe. We were hearing about a few cases in Seattle and California, and stories about people flying back from Italy and entering the greater New York area and other U.S. cities without medical screening. It was horrible, but still something happening mostly to other people far away on TV. In the middle of the month, schools and offices started to close. By the end of the month, it was a full-blown crisis overwhelming hospitals in New York and New Jersey and starting to ramp up in other U.S. cities. It’s a little hard to follow my usual format this month but I’ll try.

Most frightening and/or depressing story:
  • Hmm…could it be…THE CORONAVIRUS??? The way the CDC dropped the ball on testing and tracking, after preparing for this for years, might be the single most maddening thing of all. There are big mistakes, there are enormously unfathomable mistakes, and then there are mistakes that kill hundreds of thousands of people (at least) and cost tens of trillions of dollars. I got over-excited about coronavirus dashboards and simulations towards the beginning of the month, and kind of tired of looking at them by the end of the month.
Most hopeful story:
  • Some diabetics are hacking their own insulin pumps. Okay, I don’t know if this is a good thing. But if medical device companies are not meeting their patients’ needs, and some of those patients are savvy enough to write software that meets their needs, maybe the medical device companies could learn something.
Most interesting story, that was not particularly frightening or hopeful, or perhaps was a mixture of both:
  • I studied up a little on the emergency powers available to local, state, and the U.S. federal government in a health crisis. Local jurisdictions are generally subordinate to the state, and that is more or less the way it has played out in Pennsylvania. For the most part, the state governor made the policy decisions and Philadelphia added a few details and implemented them. The article I read said that states could choose to put their personnel under CDC direction, but that hasn’t happened. In fact, the CDC seems somewhat absent in all this other than as a provider of public service announcements. The federal government officials we see on TV are from the National Institute of Allergy and Infectious Diseases, which most people had never heard of, and to a certain extent the surgeon general. I suppose my expectations on this were created mostly by Hollywood, and if this were a movie the CDC would be swooping in with white suits and saving us, or possibly incinerating the few to save the many. If this were a movie, the coronavirus would also be mutating into a fog that would seep into my living room and turn me inside out, so at least there’s that.
https://www.youtube.com/watch?v=4chSOb3bY6Y

coronavirus simulations

The Washington Post has some interesting simulations that explain why quarantine is not all that effective a strategy, and why aggressive social distancing can be so effective. Basically, by isolating healthy people from each other you can drastically slow down the rate of spread and reduce the number of cases hitting the health care system at any one time to something manageable. These are agent-based simulations with accompanying time series graphs, and I find them pretty intuitive and informative.
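You can get a feel for this kind of simulation with a few dozen lines of code. This is not the Washington Post’s model, just a minimal agent-based sketch with invented parameters, where “distancing” is the fraction of agents who stay home on any given day:

```python
import random

random.seed(1)

def run(pop=2000, days=120, contacts=8, p_infect=0.05,
        recover_days=10, distancing=0.0):
    # "S" susceptible, "I" infected, "R" recovered
    state = ["S"] * pop
    timer = [0] * pop
    for i in range(5):                      # seed a few infections
        state[i] = "I"
    peak = 0
    for _ in range(days):
        # each day, a fraction of agents stays home entirely
        active = [i for i in range(pop) if random.random() > distancing]
        for i in [a for a in active if state[a] == "I"]:
            for _ in range(contacts):       # mix with other active agents
                j = random.choice(active)
                if state[j] == "S" and random.random() < p_infect:
                    state[j] = "I"
        for i in range(pop):                # progress the disease
            if state[i] == "I":
                timer[i] += 1
                if timer[i] >= recover_days:
                    state[i] = "R"
        peak = max(peak, state.count("I"))
    return peak

print("peak infected, no distancing:  ", run(distancing=0.0))
print("peak infected, 60% distancing: ", run(distancing=0.6))
```

Even this crude version shows the “flatten the curve” effect: the same number of agents may eventually get sick, but the peak load on the (hypothetical) health care system is much lower when most agents stay home.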

dystopian Schumpeter meets Keynes

This article is about a serious attempt to consider climate change in a traditional economic model. Where does the dystopian part come in? Well, it sounds like the model suggests we are not going to innovate our way out of the consequences of climate change.

For these reasons, we develop the Dystopian Schumpeter meeting Keynes (DSK) model, which is the first attempt to provide a fully-fledged agent-based integrated assessment framework. It builds on Dosi et al. (2010, 2013, 2016) and extends the Keynes+Schumpeter (K+S) family of models, which account for endogenous growth, business cycles and crises. The model is composed of heterogeneous firms belonging to a capital-good industry and to a consumption-good sector. Firms are fed by an energy sector, which employs dirty or green power plants. The production activities of energy and manufacturing firms lead to CO2 emissions, which increase the Earth surface temperature in a non-linear way as in Sterman et al. (2013). Increasing temperatures trigger micro stochastic climate damages impacting in a heterogeneous way on workers’ labour productivity, and on the energy efficiency, capital stock and inventories of firms.

The DSK model accounts both for frequent and mild climate shocks and low-probability but extreme climate events. Technical change occurs both in the manufacturing and energy sectors. Innovation determines the cost of energy produced by dirty and green technologies, which, in turn, affect the energy-technology production mix and the total amount of CO2 emissions. In that, structural change of the economy is intimately linked to the climate dynamics. At the same time, climate shocks affect economic growth, business cycles, technical-change trajectories, green-house gas emissions, and global temperatures…

Simulation results show that the DSK model is able to replicate a wide array of micro and macro-economic stylized facts and climate-related statistical regularities. Moreover, the exploration of different climate shock scenarios reveals that the impact of climate change on economic performances is substantial, but highly heterogeneous, depending on the type of climate damages. More specifically, climate shocks to labour productivity and capital stocks lead to the largest output losses and the highest economic instability, respectively. We also find that the ultimate macroeconomic damages emerging from the aggregation of agent-level shocks are more severe than those obtained by standard IAMs, with the emergence of tipping-points and irreversible catastrophic events.
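The feedback loop the authors describe (production drives emissions, emissions raise temperature non-linearly, temperature damages productivity, and damaged productivity feeds back into production) can be caricatured in a few lines. Every coefficient here is invented for illustration, and this is in no way the DSK model itself:

```python
def simulate(years=50, damage_on=True):
    productivity, cumulative_co2 = 1.0, 0.0
    output_path = []
    for _ in range(years):
        output = 100 * productivity
        output_path.append(output)
        cumulative_co2 += 0.01 * output
        temperature_rise = 0.05 * cumulative_co2 ** 1.3    # convex response
        if damage_on:
            # a hotter world erodes labor productivity, and the hit compounds
            productivity *= 1 - 0.0005 * temperature_rise ** 2
        productivity *= 1.02                               # technical change
    return output_path

with_damage = simulate(damage_on=True)
no_damage = simulate(damage_on=False)
print(f"year-50 output: {with_damage[-1]:.0f} with damages, "
      f"{no_damage[-1]:.0f} without")
```

Even in this toy version, the dystopian flavor comes through: because the damages compound and the temperature response is convex, the economy with climate damages falls further and further behind the counterfactual growth path over time.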