Overview
- History
- The site owner's first experience with a visualization lab was one built through Nevada's ACES program at the Desert Research Institute in the mid-2000s (Grubišić 2005). Stereoscopic viewing was available on both the main screen and each workstation. Vis5D was used for model output and IDV for research aircraft flight tracks.
- This experience was later applied to creating a weather center for an undergraduate program in 2012. Large touchscreens ran IDV, GR2, and web browsers, controlled with taps and 'flicks'. When a visualization laboratory was opened on campus in 2013, a provost grant funded the creation of several meteorology education exercises, described in more detail by Billings and Kubesh (2015).
- References
- Billings, B. J. and R. J. Kubesh, 2015: The use of visualization for advancing student analysis of meteorological data and forecasts. 12 pp. (Available on request.)
- Grubišić, V., 2005: Advanced Computing in Environmental Sciences (ACES): Cyberinfrastructure for atmospheric research in Nevada. 14th Symp. on Education, San Diego, CA.
- Jerald, J., 2016: The VR book. Morgan & Claypool Publishers. 523 pp.
- Milgram, P., and F. Kishino, 1994: A taxonomy of mixed reality visual displays. IEICE Trans. Inf. Syst., E77-D, 1321-1329.
- Mori, M., 1970: The Uncanny Valley. Energy, 7, 33-35.
Reality-Virtuality Continuum
Milgram and Kishino (1994) introduced the concept of a spectrum of visual display systems. At one end is a real environment, while at the other end is virtual reality (VR), where a "participant-observer is totally immersed in, and able to interact with, a completely synthetic world". The authors were the first to define the term mixed reality, referring to the inner portion of the spectrum encompassing any "merging of real and virtual worlds".
Nearer to a real environment, in augmented reality (AR) "an otherwise real environment is augmented by means of virtual (computer graphic) objects". In contrast, augmented virtuality is closer to VR: such displays are "completely graphic display environments ... to which video reality is added". This website contains information on topics related to all four areas of the reality-virtuality continuum.
Reality
There are other ways to add value to real weather footage besides overlaying computer graphics. One example is converting time-lapse footage into a lenticular plate, a format even easier to carry and handle than a smartphone. Lenticular plates are easier to create than many would expect.
If photos are collected looking in the same direction but separated by some horizontal distance, then stereoscopic pairs can be created. These produce a three-dimensional effect when viewed by any of several methods. Furthermore, with accurate location information for the two cameras, the geographic location of every pixel in the photo can be determined using photogrammetry. One can then obtain accurate measurements of cloud formations and locate those clouds relative to other operational or research meteorological measurements.
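The core of the stereo calculation can be sketched with an idealized pinhole-camera model: two level, aligned cameras separated by a known baseline see the same feature at slightly different horizontal pixel positions, and that disparity yields the distance. A minimal sketch; the function name and all numbers are hypothetical, and a real photogrammetry workflow would also handle lens distortion and camera orientation:

```python
def triangulate(pix_left, pix_right, focal_px, baseline_m):
    """Distance to a feature from a rectified stereo pair.

    pix_left, pix_right: horizontal pixel coordinate of the same cloud
    feature in each image, measured from each image center.
    focal_px: camera focal length expressed in pixels.
    baseline_m: horizontal separation of the two cameras in meters.
    """
    disparity = pix_left - pix_right            # pixels
    depth = focal_px * baseline_m / disparity   # distance along the view direction
    offset = pix_left * depth / focal_px        # horizontal offset from the left camera
    return depth, offset

# Hypothetical numbers: a 2000-px focal length, cameras 500 m apart,
# and a feature whose position shifts by 40 px between the two frames.
depth, offset = triangulate(120.0, 80.0, 2000.0, 500.0)
# depth = 2000 * 500 / 40 = 25000 m, i.e. the cloud is about 25 km away
```

The wide baselines practical for cloud photography (hundreds of meters) are what make kilometer-scale distances resolvable; a handheld stereo camera's few-centimeter baseline could not do this.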
IOP 3 - 14 June 2014 - Greyhound Road
Augmented Reality
When overlaying computer-generated graphics on a real-world environment, there can be varying degrees of correspondence between the two. The images may depend on the user's geographic location, or on a more distant location viewed in a given direction. A meteorology example would be providing forecast information along a route a user wishes to travel.
On the other hand, AR graphics can be used simply to allow easier manipulation of a 3D view. Such views are often the best way to illustrate meteorological concepts, such as how momentum dissipation in the vertical flow over an island produces different types of wakes. Placing such graphics alongside textbook pages, whiteboard notes, or student assignments increases the value of the AR framework.
Augmented Virtuality
During the mid-1990s, computer games were first developed that incorporated pre-recorded live-action video clips into actual gameplay. In the extreme case, only the actors and certain props were recorded, while the backgrounds and most of the game objects were entirely computer generated. Research has suggested that viewers find this approach far more comfortable than computer-generated characters that fall just short of complete realism, a phenomenon known as the Uncanny Valley (Mori 1970; Jerald 2016).
The most prominent current use of this technique in meteorology is that featured on The Weather Channel. A live broadcaster is surrounded by animations depicting significant weather events, including tornadoes, wildfires, and snowstorms. These segments, referred to as immersive mixed reality, are also provided to select news programs airing on other channels.
Virtual Reality
Any atmospheric scientist with experience visualizing numerical model output will find creating a VR environment a simple task. Isosurfaces of relative humidity and a grid of terrain elevation values are straightforward to convert to the necessary object files. The user can then navigate downstream of a virtual mountain range with its associated cloud patterns.
While the technical aspects can be relatively simple, getting value from an atmospheric VR display involves several careful considerations. The most important question to answer is why rendering a display in VR will be superior to showing it in a textbook, as a 3D display on a monitor, or on a handheld AR device. Examples include showing the user realistic weather at a location they have never visited, visualizing multiple datasets collected during large field experiments, and linking different complex-terrain phenomena by placing them in their correct geographic locations.