This funded work involves implementing synthetic flight profiles in the Hurricane WRF (HWRF) model by taking aircraft flight paths from real storms, transforming these flight paths into a frame relative to the moving center of HWRF's simulated storm, and then comparing the resulting synthetic wind profiles to the observed wind structure from the real storm. This enables an "apples-to-apples" comparison between the wind structures of the simulated and real storms. This advanced diagnostics activity should lead to insight into how to improve HWRF's structure and intensity predictions. I am the Principal Investigator and main developer on this project. Collaborating institutions include the Development Testbed Center, the NOAA Hurricane Research Division, and the NOAA Environmental Modeling Center.
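The core of the storm-relative transformation can be sketched in a few lines. This is a minimal Python illustration (not the project's actual code) using a flat-earth approximation near the vortex; the function name and constants are illustrative assumptions:

```python
import math

def storm_relative(obs_lat, obs_lon, center_lat, center_lon):
    """Offset (km) of an aircraft observation from the storm center,
    using an equirectangular approximation adequate near the vortex.
    Illustrative sketch only, not the project's implementation."""
    km_per_deg = 111.195  # mean km per degree of latitude
    dx = (obs_lon - center_lon) * km_per_deg * math.cos(math.radians(center_lat))
    dy = (obs_lat - center_lat) * km_per_deg
    return dx, dy

# An observation 1 degree east of a center at 25N sits roughly 100.8 km east.
dx, dy = storm_relative(25.0, -74.0, 25.0, -75.0)
```

Interpolating the moving simulated-storm center to each observation time, then applying such a transform, places the real and simulated wind observations in the same storm-relative frame.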
This work is funded by the DTC Visitor Program.
NCPP is a large multi-agency project whose goal is to advance the provision of regional and local information about the evolving climate and to accelerate its use in adaptation planning and decision making. The NCPP team includes members from NOAA, NCAR, the University of Colorado, the University of Michigan, and other institutions.
I have been working in a support role under Caspar Ammann (NCAR PI) since May 2013. My contribution thus far has been the creation of an evaluation engine to compute various metrics and indices across a large set of downscaled regional climate data sets.
To build the evaluation engine, I created a 20,000+ line code system in NCAR Command Language (NCL). Using a highly efficient and integrated workflow, this code set: (a) restructures each of the downscaled data sets over monthly, seasonal, and annual timescales; (b) computes base statistics for a variety of metrics and indices; (c) computes climatological period statistics; and (d) generates a unique evaluation plot for each metric or index for the designated climatological period, along with an associated self-contained NetCDF data file and XML metadata file. Metrics computed include the mean, median, standard deviation, and the 5th, 10th, 25th, 75th, 90th, and 95th percentiles. Several groups of indices are also computed, including the ETCCDI climate extremes indices, the BioClim indices, and additional health-related indices. Comparison datasets are also generated to allow users to compare the downscaled regional climate model data to several observational standards, including the Maurer BCCA dataset and the Daymet 2.1 dataset. Altogether, 159,000 plots and datasets have been created.
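The base-statistics step above can be illustrated with a short sketch. This is a Python analogue, not the NCL implementation; the linear-interpolation percentile rule is an assumption chosen for illustration:

```python
import statistics

def base_stats(values, pcts=(5, 10, 25, 75, 90, 95)):
    """Mean, median, standard deviation, and selected percentiles for one
    variable's time series. Illustrative sketch of the metric set only."""
    vals = sorted(values)

    def pct(p):
        # Linear-interpolation percentile (an assumed convention)
        k = (len(vals) - 1) * p / 100.0
        f = int(k)
        c = min(f + 1, len(vals) - 1)
        return vals[f] + (vals[c] - vals[f]) * (k - f)

    stats = {"mean": statistics.fmean(vals),
             "median": statistics.median(vals),
             "stdev": statistics.stdev(vals)}
    stats.update({f"p{p}": pct(p) for p in pcts})
    return stats

s = base_stats(range(1, 101))  # values 1..100
```

In the real engine, statistics like these are computed per grid point and per timescale, then aggregated into climatological period statistics.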
Project website: National Climate Predictions & Projections Platform (NCPP)
View evaluation and comparison data on the NCPP data portal: https://earthsystemcog.org/search/ncpp/
This project seeks to develop a new historical database of tropical cyclone wind and size parameters. Unlike other historical databases, such as the National Hurricane Center's Hurricane Database (HURDAT2), this new database will use objective methods to provide time-dependent error bounds on the estimated wind parameters. The goal is to provide the highest quality database possible for parametric wind modeling applications. Such models are used by the (re)insurance industry to simulate wind risk from tropical cyclones.
As the project PI, I am working to build several source datasets, including the Enhanced Vortex Data Message Dataset (VDM+) and the Extended Flight Level Dataset. Another project team member, Daniel Chavas, is updating his TC QuikSCAT Dataset of outer wind parameters. From these three datasets, the project team will build the new historical database with time-dependent uncertainty bounds. We will also devise new metrics for structure and intensity and examine how well these predict historic landfall losses.
This project is funded by the Risk Prediction Initiative (RPI2.0).
Project website: http://verif.rap.ucar.edu/tcdata/
I began working on this NCAR-led project in Fall 2013 under the leadership of Caspar Ammann. I am working on implementing the computation of return periods and other ensemble and extreme value analysis techniques to examine the effects of changing climate on agriculture and food security.
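As a minimal illustration of one such technique, an empirical return period can be estimated from a series of annual maxima using a plotting-position formula. This Python sketch (not project code) assumes the Weibull plotting position T = (n + 1) / m, where m is the rank from the largest:

```python
def return_periods(annual_maxima):
    """Empirical return period (years) for each observed annual maximum,
    via the Weibull plotting position. Illustrative sketch only."""
    n = len(annual_maxima)
    ranked = sorted(annual_maxima, reverse=True)
    return [(x, (n + 1) / (m + 1)) for m, x in enumerate(ranked)]

# Ten hypothetical annual maxima: the largest has an ~11-year return period.
rp = return_periods([31.0, 28.5, 35.2, 30.1, 29.7, 33.4, 27.9, 32.8, 30.9, 34.1])
```

Fitted extreme value distributions (e.g., GEV) extend this idea beyond the observed record to longer return periods.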
I initiated this project in 2011 to provide a global platform for the real-time collection and dissemination of tropical cyclone guidance aids. Used by forecasters and non-specialists alike, this site provides clear and easy-to-read plots of the track and intensity forecasts of the various global and regional hurricane models and other forecast aids. In the future, the site will also serve as a platform for the exchange and display of aircraft-based structure information and other types of data.
Project website: http://www.ral.ucar.edu/hurricanes/
In August 2012, I wrapped up an 8-month diagnostics effort for the DTC's Hurricane Task. I started this task by making an integrative assessment of known and perceived problems with the HWRF model, and then formulated a menu of possible diagnostics approaches. With input from NCEP's Environmental Modeling Center (EMC), the top-priority task selected was to examine the climatology of large-scale errors in the basin-scale HWRF model (bHWRF). By comparing retrospective bHWRF forecasts against 0-hr forecast fields from retrospective runs of the Global Forecast System (GFS) model, a grid-point climatology of the spatial structure of errors and biases was constructed. Numerous systematic biases were discovered, leading to recommendations for ways to improve the bHWRF.
I crafted a 17,000-line integrated code system in NCAR Command Language (NCL) to accomplish this task. This set of NCL scripts manages the workflow of staging the model data sets from the mass stores on Jet, Zeus, and Bluefire; transferring data onto disk on Zeus; subsetting and interpolating model data onto a common grid; converting to compressed NetCDF4; computing paired differences of numerous 3D variables at each grid point; accumulating these model errors for each month and the entire season; and creating output plots for evaluation and analysis. This tera-scale data analysis effort was implemented on the NESCC Zeus supercomputer using parallel batch processing, allowing for high rates of throughput. Altogether, approximately 60 TB of data have been processed, resulting in over 5,000 plots of the mean error (bias), RMS error, and de-biased RMS error.
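The three error statistics above can be sketched in a few lines. This is a minimal Python illustration (not the project's NCL code), assuming the forecast and verifying analysis fields have already been interpolated to a common grid and flattened to one dimension:

```python
import math

def error_stats(forecast, analysis):
    """Grid-point paired errors: mean error (bias), RMS error, and
    de-biased RMS error (the RMS of the bias-removed errors).
    Illustrative sketch only."""
    errs = [f - a for f, a in zip(forecast, analysis)]
    n = len(errs)
    bias = sum(errs) / n
    rmse = math.sqrt(sum(e * e for e in errs) / n)
    debiased = math.sqrt(sum((e - bias) ** 2 for e in errs) / n)
    return bias, rmse, debiased

bias, rmse, deb = error_stats([2.0, 4.0, 6.0], [1.0, 2.0, 3.0])
```

Accumulating these statistics over many retrospective forecasts at each grid point yields the spatial error climatology described above.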