Week 1:

Valerie P -

Good morning!

 

For my first week, I've collected two readings. The first one is called "Initial results from Phase 2 of the international urban energy balance model comparison" by C. S. B. Grimmond et al., focusing on the development of urban land surface schemes.

 

To begin, we need to define a few terms. Land surface models (LSMs) seek to represent the physical and biogeochemical exchanges of energy, water, and other matter that make up land-atmosphere interactions, especially under conditions of global change. These models originated with simple representations of land-surface biophysics but over the past decade have grown to encompass more interrelated processes. They simulate various fluxes in an environment. Simply defined, a flux is a rate of flow: the transfer of some quantity per unit area per unit time. LSMs represent various types of fluxes, including heat flux (the transfer of heat between surfaces and the atmosphere), radiative flux (the total amount of radiation absorbed, converted into heat, and re-emitted), and, in urban settings, fluxes of gases such as carbon dioxide.
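To make "flux" concrete, here is a small sketch of a standard textbook bulk-transfer estimate of sensible heat flux. This is my own illustration, not a formula or values taken from the Grimmond paper; the transfer coefficient, wind speed, and temperatures are made-up example numbers:

```python
# Hedged sketch: a bulk-transfer estimate of sensible heat flux, a common
# textbook form (not taken from the paper). All values are illustrative.

RHO_AIR = 1.2      # air density, kg m^-3 (assumed constant)
CP_AIR = 1005.0    # specific heat of air, J kg^-1 K^-1

def sensible_heat_flux(c_h, wind_speed, t_surface, t_air):
    """Sensible heat flux Q_H (W m^-2): heat carried from the surface to
    the air, per unit area per unit time -- a flux in the sense above."""
    return RHO_AIR * CP_AIR * c_h * wind_speed * (t_surface - t_air)

# A warm pavement (35 C) under 28 C air with a light breeze:
q_h = sensible_heat_flux(c_h=0.005, wind_speed=3.0, t_surface=35.0, t_air=28.0)
print(f"Q_H ~ {q_h:.0f} W m^-2")  # positive: the surface is heating the air
```

The sign convention here follows the surface's perspective: a positive value means the surface loses heat to the atmosphere.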

 

LSMs parametrize energy exchanges between the surface and the atmosphere. To model an urban environment, LSMs can vary in complexity. You can visualize first just a simple slab, then patches of color representing different materials, then, slowly, like a clay pot taking shape, forms emerging from the slab, from simple blobs to complex geometric shapes. The simple slab represents the simplest of LSMs, but these models can go on to account for the 3D geometry, height, and materials of buildings. However, increased model complexity comes at increased cost, including higher computational demands and more parameters requiring specification. No model includes every possible exchange process, so it's in the interest of the people constructing these LSMs to understand whether there's any performance improvement with increased complexity.
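To show what the "simple slab" end of that spectrum looks like, here's a minimal sketch of the slab-style surface energy balance that urban models close, Q* = Q_H + Q_E + dQ_S (net radiation = sensible + latent + storage heat flux). The balance itself is standard; the function name and numbers are my own illustration, not from the paper:

```python
# Hedged sketch of the "simple slab" idea: treat the whole city as one
# uniform surface and close the energy balance Q* = Q_H + Q_E + dQ_S.
# Q*   : net all-wave radiation (W m^-2)
# Q_H  : sensible heat flux, Q_E: latent heat flux
# dQ_S : storage heat flux (heat soaking into the urban fabric)
# Numbers below are illustrative, not from the paper.

def slab_storage_flux(q_star, q_h, q_e):
    """Solve the slab balance for storage: dQ_S = Q* - Q_H - Q_E."""
    return q_star - q_h - q_e

# Midday example: strong radiation, dry urban surface (small Q_E)
dq_s = slab_storage_flux(q_star=450.0, q_h=200.0, q_e=50.0)
print(f"Storage heat flux dQ_S = {dq_s:.0f} W m^-2")
```

A large positive storage term like this is characteristic of cities: concrete and asphalt absorb heat by day and release it at night, which is one reason urban models need a storage term at all.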

 

In this paper, Grimmond and her team analyzed 32 urban LSMs (ULSMs) with a range of approaches. At first, all teams were told only whether their sites were urban or not. Then, over four stages, Grimmond's team released a controlled amount of information about the sites. Groups were told how their own models performed but not how others performed. The experiment had three main objectives:

  1. To evaluate how ULSMs modeled urban energy balance fluxes given varying degrees of information
  2. To evaluate and compare how models with similar characteristics and complexities performed relative to each other
  3. To use these findings to look for ways to improve future ULSMs
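To give a feel for how objective 1 might be scored, here's a hedged sketch, entirely my own illustration rather than the paper's actual methodology, comparing one model's flux predictions to observations with a root-mean-square error at two hypothetical information stages:

```python
# Hedged sketch: scoring a model's flux predictions against observations
# with RMSE at different information stages. Stage labels and all numbers
# are invented for illustration; they are not the paper's data.
import math

def rmse(modeled, observed):
    """Root-mean-square error between two equal-length flux series (W m^-2)."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(modeled, observed)) / len(observed))

observed = [100.0, 150.0, 200.0, 180.0]   # hypothetical observed fluxes
stage_runs = {
    "stage 1 (site type only)":  [80.0, 120.0, 230.0, 150.0],
    "stage 4 (full site detail)": [95.0, 145.0, 210.0, 175.0],
}
for stage, modeled in stage_runs.items():
    print(f"{stage}: RMSE = {rmse(modeled, observed):.1f} W m^-2")
```

In this toy setup the error shrinks as more site information is released, which is the kind of stage-by-stage comparison the experiment's design makes possible.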

 

The conclusions were rather lengthy, and many related to specific variables defined in the methodology, so I'll just mention the major one most pertinent to the objectives outlined above. The results showed that simpler models often improved with additional information while complex models did not, suggesting that increased model complexity does not necessarily increase model performance. Additionally, the authors noted that only two sites were compared in this research and urged future comparisons across a wider range of urban morphologies and building materials.

 

Thanks for reading,

Valerie Polukhtin


Comments:


    mr_mcclernon
    Good report, Valerie. What sort of numerical technique is used in the models? mac
    vishruth_p
    Hi Valerie! The LSMs and ULSMs seem very interesting! I know you mentioned that these models track the flow of energy and fluxes such as carbon dioxide, so I was curious, are LSMs capable of tracking human flow? While based on what you said I understood that these models can track many things humans have an impact on, like heat and radiation as you mentioned, I wasn't sure if these models can track/manage human movement and possibly aid with issues like congestion.
      valerie_p
      Hey Vishi! I think some ULSMs take human movement/traffic patterns/congestion into account, but not the ones from this specific paper. There are specific models built to look at just traffic: its patterns, how different pathways affect congestion, spatial distribution, etc. LSMs focus more on the biophysics side of climate models, but technically, within your model, you could incorporate whatever parameters you have data for (though poorly chosen parameters are more likely to increase bias in your model). Hope that helps!
    jessie_z
    Hi Valerie! Your research topic sounds very interesting, and I'm excited to follow along. I'm also wondering: how do you think future urban LSMs can strike a balance between model complexity and computational efficiency?
      valerie_p
      Hi Jessie! That's a great question that sort of gets to the heart of what these researchers were looking at. The thing is, I don't think there's ever going to be an objective balance of complexity and efficiency. It can be somewhat subjective, with one researcher believing the increased cost is worth a certain gain in accuracy while another might not. Additionally, sometimes a simple model might work better than a more complex one because the data the complex one relies on isn't as thorough, but as the methods for acquiring data and the kinds of data change, the accuracy of complex models that rely on that data may change too. So I believe that balance can evolve. But it's a great question.
    bhavitha_s
    Hi Valerie! Great post on LSMs :) You mentioned that more complex models require more computing power. Do you know how much more computing power that might be? Would the relationship between model complexity and computing power be more linear or exponential? Either way, it would be important to reduce the computing power needed for these models while maximizing performance.
      valerie_p
      Hi Bhavitha! That's a great question. It isn't discussed in this article, and I'm actually not sure exactly how much computing power these models require. I do know that computational costs are directly related to model complexity (I just don't know the specifics). However, interestingly enough, the amount of computing power needed in models across the board has the potential to significantly decrease due to machine learning. Machine learning could allow these models to be pre-trained or to find more efficient algorithms/pathways, so the question of how much computing power is needed is actually changing. I mean, even at that guest lecture I attended, Dr. McPhearson was comparing how computations that would have taken days to finish only a few years ago can now be done in seconds or minutes, so there's a lot of potential in the field right now to reduce computational costs. Sorry I couldn't provide a more specific answer to your question.
