# Possible titles
- Comparing ocean and climate models using new physics-informed methods and tools
- Novel tools and methods ...
- How good is your ocean model? Methods and tools beyond a point-by-point comparison.
(current word count is 191)
## How good is your ocean model? Methods and tools beyond a point-by-point comparison
As the number and complexity of ocean and climate models grow, more opportunities arise to compare simulations with each other and with observations. Such comparisons have the potential to elucidate how well models represent physical processes. In addition, greater model diversity will enable us to identify emergent constraints that increase confidence in future climate projections even in the presence of model bias. Differences in the mean state (caused by different setups, spinups, or initial conditions) limit the utility of point-by-point comparisons between models.
Multiple new comparison strategies have begun to arise, many of which involve comparing fields in alternative coordinate systems. For example, it may be useful to compare changes in the volume of a particular water mass or biogeochemical tracer patch, or to examine the transport of tracers across a density surface or streamline. These kinds of methods highlight integrated properties in physically-relevant regions, allowing multiple resolutions, grid types, and sparse observations to be compared.
We invite presentations that describe novel methods and software packages for comparing ocean and climate models with each other and with observations beyond the point-by-point approach. We particularly invite presentations that showcase method implementation based on open-source packages.
___
- Mention explicitly that we are not just interested in physical 'water masses' but that biogeochemical tracers present a huge opportunity to expand these frameworks...
Tuning models to represent key physical processes like heat uptake and distribution, instead of, e.g., the exact location of an isotherm, gives more confidence in extrapolating into the future.
Maximize the utility of observations to validate/compare models. We do not have multidecadal observations, so what can we learn about the dynamics in the model? Does good representation of seasonal/interannual dynamics lend faith to projected changes?
In this session we are soliciting abstracts presenting novel methods and tools to compare climate models with each other (e.g., CMIP) or to compare models with observations. We encourage methods that provide a complementary view to point-by-point comparisons, e.g., methods based on ocean water masses.
As the number and complexity of ocean and climate models grows, there are more opportunities to compare the representation of the ocean physics in simulations with other simulations and with observations.
As the number and complexity of ocean and climate models grow, there are more opportunities to compare ocean/climate models with other models and with observations. Climate model comparison needs to be both easy and fast, but at the same time should focus on whether key physical processes are represented.
Hence, point-by-point comparisons of ocean and climate model fields have limited utility, and are insufficient for identifying whether the same physics is taking place across models. The same problem arises when comparing models with observations.
Hence, identifying complementary metrics based on water properties will inform process-based research and improve model bias correction.
## Promo Text
Dear ____
Are you interested in new and creative methods for ocean model evaluation? Submit an abstract to our Ocean Sciences Session on this topic!
[**How good is your model? Methods and tools beyond a point-by-point comparison**](https://agu.confex.com/agu/OSM24/prelim.cgi/Session/195591)
> As the number and complexity of ocean and climate models grow, more opportunities arise to compare simulations with each other and with observations. Such comparisons have the potential to elucidate how well models represent physical processes. In addition, greater model diversity will enable us to identify emergent constraints that increase confidence in future climate projections even in the presence of model bias. Differences in the mean state (caused by different setups, spinups, or initial conditions) limit the utility of point-by-point comparisons between models.
> Multiple new comparison strategies have begun to arise, many of which involve comparing fields in alternative coordinate systems. For example, it may be useful to compare changes in the volume of a particular water mass or biogeochemical tracer patch, or to examine the transport of tracers across a density surface or streamline. These kinds of methods highlight integrated properties in physically-relevant regions, allowing multiple resolutions, grid types, and sparse observations to be compared.
> We invite presentations that describe novel methods and software packages for comparing ocean and climate models with each other and with observations beyond the point-by-point approach. We particularly invite presentations that showcase method implementation based on open-source packages.
[Submission Link](https://agu.confex.com/agu/OSM24/cc/papers/index.cgi?sessionid=195591)
The deadline for abstract submission is Wednesday, September 13th, 2023. Please contact Spencer Jones (spencerjones@tamu.edu) if you have any questions about the session.
Thanks,
Spencer, Julius and Kenechukwu
We should all promote this on social media, and retweet/boost each other's posts.
Julius will post this to:
- [x] OCP list
- [x] LEAP list
- [x] M2lines list
- [ ] ~~Princeton geoclim~~ (seems like I don't have access anymore... oh well)
Spencer will send to:
- [x] Jon Baker,
- [x] Geoff Vallis,
- [x] Graeme MacGilChrist,
- [x] Steve Griffies,
- [x] Maike Sonnewald,
- [x] Jan-Erik Tesdal,
- [x] Jon Krasting,
- [x] Jan Zika,
- [x] Yassir Eddebar,
- [ ] Taimoor Sohail,
- [x] Dan Whitt,
- [ ] Sjoerd Groeskampf,
- [ ] Fabien Roquet