
C


Cloud Formation Processes

by Lawrence Pologne - Friday, 21 July 2006, 08:11
 

Source: COMET Module, "Influence of Model Physics on NWP Forecasts"

The importance of cloud thickness and cloud parameterization processes for model precipitation output.

(Graphic: "cloud processes")

Additional Resources:


D


Data Assimilation Process

by John Horel - Thursday, 20 July 2006, 14:55
 
Source: COMET Module: "Understanding Data Assimilation"

Appropriate Level: Some graphics and concepts could be used in introductory classes, but the core content is appropriate for upper-division majors.

Outline of Module

1. Data Assimilation Process. A yes/no question that can be skipped.

2. Data Assimilation Wizard. Motivation for data assimilation that may appeal to some users.

3. Data/Observation Increment/Analysis. The core of the module is contained in these three sets of pages. Graphics and simple examples help to highlight critical concepts (there are a handful of broken links to NCEP operational resources in the Data section). The graphics shown below summarize the components of the data assimilation process.

Step 1. Observations are received and processed (graphic: "Observations").

Step 2. Deviations between the observations and background values are computed (graphic: "Constructing Observation Increments").

Step 3. Observation increments are used to adjust the background (graphic: "Objective Analysis").

(A minimal code sketch of these three steps appears after this outline.)

4. Operational Tips/Judging Analysis Quality. Brief review of some issues associated with assessing analysis quality.

5. Exercise/Assessment. Ordering the steps involved in data assimilation produces a graphic of the end-to-end process (graphic: "Data Assimilation Process").
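The following is a minimal sketch, not taken from the module, of the three-step update above for a single hypothetical temperature observation at one grid point. The background value, observation, and error variances are invented, and the scalar weight stands in for the full error statistics an operational scheme would use.

```python
# Minimal sketch (not from the COMET module) of the three assimilation steps
# for one hypothetical temperature observation at a single grid point.
background = 271.5   # model first-guess temperature (K) at the grid point

# Step 1: the observation is received and (we assume) quality-controlled
observation = 270.2  # observed temperature (K)

# Step 2: observation increment = observation minus background value
increment = observation - background

# Step 3: adjust the background toward the observation; the weight is built
# from assumed background and observation error variances
background_error_var = 1.0   # assumed background error variance (K^2)
obs_error_var = 0.5          # assumed observation error variance (K^2)
weight = background_error_var / (background_error_var + obs_error_var)

analysis = background + weight * increment
print(f"increment = {increment:+.2f} K, analysis = {analysis:.2f} K")
```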


Other relevant COMET resources:

Operational Models Matrix: Brief overviews of operational data assimilation systems are listed at the bottom.

Good Data/Poor Analysis Case Study: Very nice illustration of some of the issues associated with objective analysis/data assimilation.

WRF Model WebCast: Provides very useful information on the Gridpoint Statistical Interpolation (GSI) assimilation approach used in the NCEP NAM.



E


ECMWF Training on Data Assimilation

by John Horel - Thursday, 20 July 2006, 15:23
 
Source: ECMWF Training Material

Level: Instructor

Technical information on data assimilation as applied at the ECMWF.


Ensemble Filters

by John Horel - Thursday, 20 July 2006, 14:55
 
COMET Module: None on this subject yet

Appropriate Level: Graduate level

Ensemble Filters for Geophysical Data Assimilation: A Tutorial, by Jeff Anderson (Data Assimilation Research Testbed).

This is an excellent, but technical, introduction to ensemble-based data assimilation.
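As a companion to the tutorial, here is a minimal, hypothetical sketch of one common ensemble filter variant, a perturbed-observation ensemble Kalman filter update for a single observed scalar. The ensemble size, error variance, and values are invented; this is not code from the tutorial or from the Data Assimilation Research Testbed.

```python
# Hypothetical perturbed-observation ensemble Kalman filter update for a
# single observed scalar (e.g., temperature at one location).
import numpy as np

rng = np.random.default_rng(0)

# Prior (background) ensemble of the observed quantity
prior = rng.normal(loc=271.5, scale=1.0, size=20)

obs = 270.2          # the observation
obs_error_var = 0.5  # assumed observation error variance

# Ensemble estimate of the prior error variance and the resulting gain
prior_var = prior.var(ddof=1)
gain = prior_var / (prior_var + obs_error_var)

# Each member is nudged toward its own perturbed copy of the observation
perturbed_obs = obs + rng.normal(scale=np.sqrt(obs_error_var), size=prior.size)
posterior = prior + gain * (perturbed_obs - prior)

print(f"prior mean {prior.mean():.2f} K, posterior mean {posterior.mean():.2f} K")
print(f"prior spread {prior.std(ddof=1):.2f} K, posterior spread {posterior.std(ddof=1):.2f} K")
```

The update pulls the ensemble mean toward the observation and shrinks the spread, which is the behavior the tutorial develops in detail.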


Ensemble Forecasting

by Dave Dempsey - Thursday, 20 July 2006, 14:51
 
COMET Module: "Ensemble Forecasting Explained" (2004)

"The assumptions made in constructing ensemble forecast systems (EPS) and the intelligent use of their output will be the primary focus of this module. In order for you to better understand EPS output, we will also look at some necessary background information on statistics, probability, and probabilistic forecasting."

Appropriate Level: Advanced undergraduate and above.

Summary comments: Lots of text, some of which can be dense (when the concepts described are relatively challenging). Simple, mostly static diagrams provide clear illustrations. Assessment exercises (multiple-choice and multiple-answer questions, with feedback) help. Quite a few ensemble forecast products and verification tools are described. Basic concepts of probability and statistics underlie most of these (as befits the topic), and an attempt is made to provide some background about them, but a basic statistics course would be a valuable prerequisite to this material. A COMET webcast by NCEP's Dr. Bill Bua, "Introduction to Ensemble Prediction", offers a simpler, one-hour introduction to a subset of the material in this module; for most users unfamiliar with ensemble forecasting, I recommend viewing the webcast before completing the module.

Preassessment. This module starts with a randomized, 15-item pre-assessment quiz, which the user repeats at the end of the module as a measure of what the user has learned. The pre-assessment quiz therefore should provide a preview of at least some of the module's content.

The mostly (entirely?) multiple-answer quiz questions include graph and weather map interpretations and verbal questions about aspects of ensemble forecasting and its applications. The questions, graphs and maps--and hence necessarily the module itself--use language and invoke concepts technical enough to be well out of the reach of beginning students. Here's an example of a graph with very limited explanation offered to help the user interpret it (which of course is the point--the user presumably will learn more about it in the module):

(Graph: hypothetical distributions of ensemble forecast, climatology, and observations)

(The user is asked about the resolution and reliability of the two forecasts depicted in the graph.)

Depending on the degree of support provided by the instructor and the module itself, the preassessment suggests that upper division majors should be able to learn something from the module. Taking the preassessment quiz yourself is one way of judging whether the module content as it stands is appropriate in content and level for your students. Warning: although the preassessment strikes me as a useful pedagogical tool (for measuring learning progress and offering a preview of the module content), completing it can be discouraging unless users understand that they aren't expected to do well on it and that they should look forward to learning enough from the module to improve dramatically.

Introduction.
  1. Why Ensembles? A table summarizes the advantages of ensemble forecasts over single forecasts.
  2. Chaos and NWP Models. A Flash animation superimposes an 84-hour "control" 500 mb height forecast at 12-hour intervals and a second forecast with perturbed initial conditions, together with color-filled contours of the difference between them. The accompanying text describes the animation and points out how the difference between the two forecasts tends to grow with forecast time. The idea of chaos and its manifestation in the atmosphere and in numerical weather forecasting is introduced using the same Flash animation. The overarching content of the module is then summarized.
Generation. Brief, bulleted summary of possible methods used to create ensemble members, based on uncertainty in data or in the NWP model itself.
  1. Perturbation Sources. Uncertainty in initial conditions, and the range of that uncertainty (one static graphic similar to the animation in Section 2 of the Introduction); uncertainty in model formulation (including dynamical formulation and grid- and subgrid-scale physical parameterizations; illustrated with contour plots of precipitation plus a sounding); and boundary value uncertainty (text only).
  2. Implementations. Singular vectors (used by ECMWF); "breeding" cycle (one static graphic); reference to ensemble Kalman filter (EnKF).
Statistical Concepts. An overview of basic concepts and their relevance to ensemble forecasting, including the idea of probability density functions. Very little math; a few static graphs.
  1. Probability Distributions.
  2. Middleness.
  3. Variability.
  4. Shape.
  5. Using PDFs.
  6. Data Application. A spaghetti plot is introduced, with some basic guidance about how to read it. Measures of central tendency (mean, median, mode) and their limitations are applied to interpreting ensemble forecasts (see the first sketch after this outline).
  7. Exercises. Two multiple-choice questions with feedback and discussion. These strike me as pedagogically helpful.
Summarizing Data. Motivates the need for products that help forecasters digest the vast amount of information in an ensemble forecast.
  1. Products. Thirteen pages of examples with explanation, comparison, and discussion of various ensemble forecast analysis products, illustrated with static graphics.
  2. Product Interpretation. Relatively clear, simplified diagrams show how to interpret the mean and the spread of ensemble forecasts from spaghetti diagrams. One multiple-choice question with feedback.
  3. Exercises. Three multiple choice questions with feedback, all involving interpretation of figures.
Verification. The problem of verifying ensemble forecasts.
  1. Concepts. Categorical and probabilistic forecasts and verification of the former. Skill score. Verification of probabilistic forecasts: "reliability" and "resolution" (a Brier-score sketch appears after this outline).
  2. Tools. Six pages on verification tools. Some of these are relatively dense, but interesting.
  3. Applications. A few examples of how some verification tools are used at NCEP (an animated Talagrand diagram is shown). Comparison of NCEP's EPS to those of other forecast centers (links).
  4. Exercises. Three multiple answer questions with feedback.
Case Applications. Links to case studies and to the module quiz.
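Two minimal sketches, not drawn from the module, may help make the Statistical Concepts, Summarizing Data, and Verification sections concrete. The first computes central tendency, spread, and an exceedance probability from hypothetical 24-h precipitation forecasts by 21 ensemble members; all values are invented.

```python
# Hypothetical summary statistics for an ensemble of 21 precipitation forecasts.
import numpy as np

members_precip = np.array([  # invented 24-h totals (mm), one per member
    0.0, 1.2, 3.5, 0.4, 7.8, 2.1, 0.0, 5.6, 4.3, 1.0, 0.2,
    6.9, 3.3, 0.0, 2.8, 9.1, 0.7, 4.0, 1.5, 0.0, 2.2])

ens_mean = members_precip.mean()          # measure of central tendency
ens_median = np.median(members_precip)    # less sensitive to outlying members
ens_spread = members_precip.std(ddof=1)   # measure of forecast uncertainty

threshold = 2.5  # mm; event of interest
prob_exceed = (members_precip > threshold).mean()  # fraction of members

print(f"mean {ens_mean:.1f} mm, median {ens_median:.1f} mm, spread {ens_spread:.1f} mm")
print(f"P(precip > {threshold} mm) = {prob_exceed:.0%}")
```

The second sketch, also hypothetical, verifies a small set of invented probabilistic forecasts with the Brier score and a Brier skill score measured against a climatological reference.

```python
# Brier score and Brier skill score for invented probabilistic forecasts.
import numpy as np

prob_forecasts = np.array([0.9, 0.1, 0.7, 0.3, 0.8, 0.2, 0.6, 0.4])  # P(event)
observed = np.array([1, 0, 1, 0, 1, 1, 0, 0])  # 1 = event occurred, 0 = did not

brier = np.mean((prob_forecasts - observed) ** 2)

# Reference forecast: always issue the sample climatological frequency
climo = observed.mean()
brier_ref = np.mean((climo - observed) ** 2)

skill_score = 1.0 - brier / brier_ref  # positive means better than climatology
print(f"Brier score {brier:.3f}, reference {brier_ref:.3f}, BSS {skill_score:.2f}")
```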




Ensemble Forecasting: Information Matrix (NCEP)

by Dave Dempsey - Thursday, 20 July 2006, 14:52
 
COMET Module: "Ensemble Prediction System Matrix: Characteristics of Operational Ensemble Prediction Systems (EPS)" (updated as needed)

"This matrix describes the operational configuration of NCEP EPS, including the method of perturbing the forecasts (initial condition or model), the configuration(s) of the model(s) used in the EPS, and other operationall significant matters."

Appropriate level: Advanced undergraduate and above.

Summary comments: This module is not an instructional module but rather a matrix of hypertext links to information about aspects of NCEP's medium-range and short-range ensemble forecasting systems, including perturbation methods, characteristics of the models used, and postprocessing and verification. Some of the informational Web pages are illustrated with simple, clear diagrams, and some links lead to the "Ensemble Forecasting Explained" module (see the entry in this glossary under "Ensemble Forecasting"). For students already familiar with the basics of how numerical weather prediction systems work and how numerical models are formulated, the matrix provides a useful way to investigate these topics in more detail, at least for NCEP's ensemble forecasting system. Some explanations are illustrated with clear, easy-to-read, albeit static, graphics. Instructors might find some of these explanatory Web pages useful for instructional purposes, such as assigned reading.



Ensemble Forecasting: Webcast Introduction

by Dave Dempsey - Thursday, 20 July 2006, 14:55
 
COMET Webcast: "Introduction to Ensemble Prediction" (2005)

This one-hour webcast by NCEP's William Bua "introduce[s] concepts in the COMET module Ensemble Forecasting Explained" (see the glossary entry for the module).

Appropriate level: Advanced undergraduates and above.

Summary comments: Dr. Bua's webcast presents a subset of the concepts presented in the Ensemble Forecasting Explained module. Generally speaking, the webcast treats its topics more simply than the module does. Quizzes with feedback, including a preassessment, are interspersed throughout. Viewing the webcast first should make completing the module, which is text intensive and sometimes conceptually challenging, easier.

F


Focus COMET NWP Case Studies Modules

by Rodney Jacques - Thursday, 20 July 2006, 14:41
 

(Graphic: "Water Vapor Loop")

Appropriate Level: Advanced Forecaster

Outline of Module

1. Introduction - The model forecasts for the snowstorm of January 6-7, 2002, failed to capture the amount of snowfall, the areal coverage of snowfall, and the timing of the snow event because of failures in data assimilation and model dynamics. The ETA and SREF models fell short in the following areas: 1) shear and curvature vorticity; 2) upper-level diffluence and convection; 3) model resolution, upper-level dynamics, and data assimilation.

Questions:

Does an advanced forecaster have to review 4-5 COMET NWP modules to gain further understanding of this case study?

Can we FOCUS the NWP modules to gain insight into mid- to upper-level diabatic processes within the NWP model?

Can this NWP case study provide me with a 3D visualization of the model underperforming or overperforming?

2. Discussion - The advanced weather forecaster assimilates and interrogates environmental data prior to displaying NWP models in order to build a conceptual view of the atmosphere. A thorough analysis is critical to assess the initial state of numerical models and to discover possible errors that may exist. A problem arises when the advanced forecaster does not have sufficient knowledge of NWP dynamics or structure. How can a forecaster obtain a better working knowledge of the inner workings of an NWP model, "the black box"?

3. Instructional Design - One suggestion would be to develop visualization software that can rerun the model and focus on the errors that this case study presents (mid- to upper-level dynamics, convective processes, frozen precipitation processes). The advanced forecaster would then have good and bad model runs from which to evaluate their impact on forecast products. The case study should finish this training scenario by providing educational content on how a forecaster adjusts model guidance using the digital forecast process.

4. Smart Tools - The digital forecast process is in place at NWS forecast offices. Weather forecasters account for errors in model output by running Python scripts ("smart tools") to adjust guidance; a hypothetical example of such an adjustment appears after the resource list below. There are over 300 tools, and most run basic routines that allow a forecaster to adjust or tweak guidance. These results are detached from the model. Most forecasters do not have a complete enough understanding of the NWP models to interpret the digital output in scenario-driven situations.

5. Summary - The NWP case studies can be improved by providing 3D visualizations of good and bad model forecasts. A 3D visualization helps to show graphically where a good model goes bad. Specific sections of COMET NWP modules could be injected into the case studies to focus the student's learning and address the errors in model guidance. Lastly, the NWP case studies need to complete the end-to-end forecast training by providing content on the digital forecast process.

6. Resources -

  • Smart Tool Repository - http://www.nws.noaa.gov/mdl/prodgenbr.htm
  • Interactive Forecast Preparation System - http://www.nws.noaa.gov/mdl/icwf/IFPS_WebPages/indexIFPS.html
  • Forecast Verification Homepage - http://www.nws.noaa.gov/mdl/adappt/verification/verify.html
  • Meteorological Development Lab - http://www.nws.noaa.gov/mdl/
  • Integrated Data Viewer - http://www.unidata.ucar.edu/software/idv/
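As mentioned in item 4 above, here is a hypothetical sketch in the spirit of a simple smart tool. It does not use the actual IFPS/GFE smart tool framework; the adjustment rule, bias value, elevation threshold, and grids are all invented to illustrate the idea of tweaking gridded guidance with a short Python routine.

```python
# Hypothetical "smart tool"-style adjustment of gridded 2-m temperature
# guidance (deg F), warming points over high terrain where a cold bias is
# assumed. Not based on the operational IFPS/GFE smart tool API.
import numpy as np

def adjust_temperature(guidance_grid, elevation_grid, cold_bias=1.5,
                       high_terrain_m=1500.0):
    """Warm the guidance where a cold bias is suspected over high terrain."""
    adjusted = guidance_grid.copy()
    high = elevation_grid >= high_terrain_m   # mask of high-terrain points
    adjusted[high] += cold_bias               # remove the assumed cold bias
    return adjusted

# Toy 3x3 grids standing in for real forecast and terrain fields
t2m = np.array([[31.0, 30.5, 29.8],
                [32.1, 28.9, 27.4],
                [33.0, 26.5, 25.0]])
elev = np.array([[300.0, 800.0, 1600.0],
                 [250.0, 1200.0, 1800.0],
                 [200.0, 1550.0, 2100.0]])

print(adjust_temperature(t2m, elev))
```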






G


Gridpoint Statistical Interpolation

by John Horel - Thursday, 20 July 2006, 14:56
 
COMET Module: Technique mentioned only in passing

Level: Graduate level

Journal article (Wu et al. 2002, MWR) that provides a technical description of selected aspects of the gridpoint statistical interpolation used in the NCEP GFS model. A version of this three-dimensional variational approach is also used in the 2-D Real-Time Mesoscale Analysis (RTMA).
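As a rough illustration of the variational idea behind such schemes (this is not GSI code and the numbers are invented), the following toy example finds the analysis that minimizes the usual background-plus-observation cost function for a three-point 1-D state, using the closed-form solution available in the linear, Gaussian case.

```python
# Toy 1-D variational analysis: minimize
#   J(x) = (x - xb)^T B^-1 (x - xb) + (y - Hx)^T R^-1 (y - Hx)
# For this linear/Gaussian problem the minimizer is xa = xb + K (y - H xb).
import numpy as np

xb = np.array([271.0, 272.0, 273.0])   # background state (K) at three points
B = np.diag([1.0, 1.0, 1.0])           # assumed background error covariance
y = np.array([270.0])                  # one observation (K)
H = np.array([[0.0, 1.0, 0.0]])        # observation operator: obs at point 2
R = np.array([[0.5]])                  # assumed observation error covariance

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain matrix
xa = xb + K @ (y - H @ xb)                    # analysis

print("background:", xb)
print("analysis:  ", np.round(xa, 2))
```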

I


Interactions of Parameterizations

by Sen Chiao - Thursday, 20 July 2006, 11:13
 

Module: Influence of Model Physics on NWP Forecasts


A simple diagram illustrates the interactions of the physical parameterizations in NWP models, making the overall model physics processes easy to understand.

(Graphic: processes.JPG; source: WRF User's Workshop)


