Creating a timelapse video with weather observations + radar

I recently purchased a GoPro H3B (Hero 3 Black edition for non-GoPro folks) for numerous reasons – one of which was to create awesome timelapses. (Is it time lapse, time-lapse, or timelapse? I don’t know.) The H3B has a built-in timelapse mode with intervals ranging from 0.5 to 60 seconds. For a recent snow storm, I wanted to test out the timelapse feature, so I set it to a 60-second interval – plenty for a snow storm.

Let me just take a half-step back and say I like making snowfall timelapses. A lot. I’ve created many snowfall timelapses over the years, using various hardware/software combinations. Sometimes I’ve used a simple webcam, basic capture software, and Windows Movie Maker, while others have been created with a DSLR and Sony Vegas Movie Maker.

I’ve always wanted to create an outdoor timelapse with radar included, but I’ve never had the time or energy to figure out how to do it. This time, however, I was inspired – by black bars, of all things. See, the GoPro captures images using its full 4000x3000px (4:3 aspect ratio) sensor, which gives great big 12 MP images. Unfortunately, YouTube’s video player is 16:9, and if you upload anything but 16:9 ratio, it adds black bars. I decided I wanted to fill that blank space with radar data, and it turned out the radar part was actually pretty easy – putting the actual weather observations on the video was the more difficult part.

Getting the Radar data ready for video

My photo captures were at 60-second intervals, so as long as I knew the radar VCP, I could match up the times with the images. Fortunately, only two VCPs were used over the course of the event – 31 and 21. A VCP 31 volume scan takes about ten minutes and a VCP 21 scan about six. I downloaded the radar data from NCDC, then used Unidata’s Integrated Data Viewer (IDV) to put together the video.

You will need:

  • IDV (see above) – you can actually run it right from the Java Web Start without installation!
  1. Download radar data from the NCDC NEXRAD Inventory site (not the HAS site)
    1. Click the radar site you want data from
    2. Change the date at the top to the first day you want data for. (All dates/times are UTC.) Then select the product you want – I used the Level 2 (“raw”) data, but Level 3 takes up a lot less space and will be faster on a slower machine. For L2 data, leave the default selected; for L3, use either N0Q (long-range reflectivity) or NCR (composite short-range reflectivity, good for light snowfall events). Click Create Graph.
    3. You will see a graph of each radar volume scan, color-coded by mode – you should see a lot of blue and red lines (blank space means the radar was down). Before you leave this page, click the link towards the bottom that says “View Actual Timestamps / Op. Mode / VCP” – this lists the times at which VCPs were changed. Copy this and save it to a text file for future reference. Enter your email address, change the times to what you desire, and click Order Data – you will receive the data within minutes to hours.
    4. Download the data, and ungzip it. (On Windows, use something like 7-Zip; Linux/Mac users can just use gunzip)
  2. Download/Open IDV
  3. Load data (Data Choosers > Files; Set Data Source Type to “Radar”; Add Source)
  4. For L2 data, choose reflectivity (or whatever parameter you want); for L3 I don’t think you have much of a choice – just select the data and load.
  5. Radar data will be displayed. Set up the map the way you’d like. For radar colors, choose whatever you want. I’ll share with you my AWIPS 256 color table that I created because I like those colors best – it was a pain to make, it’s not perfect, but it looks good enough for me! In the IDV color table manager, just import the XML file and it should work – just make sure the range is set from -32 to 80!
  6. Everything look good? Are you sure? Now we’re ready to capture. On the Map View, go to the View menu just above the map; Capture > Movie.
  7. In the Movie Capture dialogue, click on the Time Animation button and let it go. Don’t touch your computer while it runs!
  8. After capturing, a new window will pop up asking about timing and formats. Set these to what you’d like, as it will depend on your video speed and VCP.
  9. Movie saved! It happens very fast – you might think it didn’t work, but go ahead and check where you saved the file and it should play just fine.
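The download-and-ungzip step (1.4 above) can also be scripted if you have a lot of files. Here’s a minimal sketch – the KGLD filename prefix and timestamp are invented examples (KGLD was my radar site); adjust the glob to match your actual order. The first two lines just create a stand-in file so the loop has something to chew on:

```shell
# Batch-decompress an NCDC radar order. Filename is a made-up example;
# the echo/gzip lines stand in for an actual downloaded .gz file.
echo "sample radar volume" > KGLD20130220_000000
gzip KGLD20130220_000000

for f in KGLD*.gz; do
    [ -e "$f" ] || continue   # skip if nothing matched the glob
    gunzip "$f"               # leaves the decompressed file in place
done
```

Windows users without gunzip can just point 7-Zip at the whole folder instead.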

I recommend saving in shorter (6-10 hour) chunks to reduce taxing your computer too much. Also, because VCP scan times aren’t exact, you’ll want some wiggle room to speed up or slow down the radar data in your final production.

Tip: Want to save some video then come back later to the exact same view? You can! When everything looks good, go to Displays > Favorite Bundles > Save As Favorite. Make sure Views and Displays are checked, and un-check Data Sources. You can then re-load the current view next time without having to re-do your map lines/zoom/etc.

Preparing the Weather Observations for the Video

This was by far the biggest challenge, and the reason I am documenting this process. It took time to figure it out, but I have ideas to make it easier next time. This process did take a while, so I’m going to break it up into a few sections. My process was basically to create an image of the text of the latest weather observation every minute. I did this because I could then just add these images alongside my timelapse images and know that they would be perfectly time-matched! I tried to find a better way to do this without the extra step of making images, but I couldn’t find anything despite searching a few internet forums.

You will need:

  • Excel (Yes, Excel – you can’t use OpenOffice or Google Docs for this one, folks)
  • ImageMagick

1) Download and format the data

Head to Wunderground and find your closest ASOS/AWOS. Note: if you have a closer or more representative station, you could combine, say, the weather/sky/visibility observations from the ASOS/AWOS with the temp/dew point/RH/wind from another station. Go to the first day you want data for. You should be at a page that looks like this.

  1. You don’t need this, but I recommend turning on full METARs by clicking the link at the very bottom of the table of data
  2. Download the CSV – The link is all the way at the bottom of the page. If your browser does not prompt you to save the data, copy it all and paste it into Excel.
  3. Open the CSV with Excel; Use the Text-to-Columns tool if need be.
  4. Delete all the columns you don’t care about. Don’t need the altimeter reading? Don’t keep it around!
  5. Make any unit conversions by creating a new column and converting the original data with a formula. If you want to change the decimal places of a number, format it to look the way you want. THEN, Copy > Paste Special > Values Only. Why? Otherwise the formatting won’t survive when we concatenate.
  6. Now we need to create a new sheet with a line for every minute. This is pretty simple – just enter the first two times you need (eg. MM/DD/YYYY 00:00, MM/DD/YYYY 00:01) then drag down as far as you need.
  7. Now that we have a sheet with a time for every minute, we will re-create the sheet with the formatted data, filling in the times we don’t have obs for using the LOOKUP function. If you want to see how I did this, see the Excel file I used.
  8. Finally, use the CONCATENATE function to take all of the values and put them in one string. You can add units, remarks, etc – again, see my Excel file to see how I did it. You’ll notice random ‘q’ characters in my text, too – I replace these with newlines when creating the images.
  9. Select all of the column with our concatenated strings (not including any label at the top), then paste it into a text file (eg. using Notepad) and save the text file.
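If you’d rather skip Excel for step 6, the minute-by-minute timestamp column can also be generated in the shell. A sketch, assuming GNU date (standard on Linux; on a Mac, install coreutils and substitute gdate) – the start time is just an example:

```shell
# Emit one line per minute starting at an example time -- the Excel
# "drag down" from step 6, scripted. GNU date syntax assumed.
start="2013-02-20 00:00"
for min in $(seq 0 4); do            # use: seq 0 1439 for a full day
    date -u -d "$start UTC + $min minutes" "+%m/%d/%Y %H:%M"
done > minutes.txt
cat minutes.txt
```

You can then paste the resulting column straight back into the spreadsheet, or keep going in the shell from here.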

2) Create images with ImageMagick

  1. You now have a text file with a line of text for each minute of the timelapse (see my sample). Move this to the machine with ImageMagick installed, if needed. (I used Dropbox to send the text file from my Windows machine to my Mac.)
  2. Write a short shell script to use ImageMagick to make the images. You can steal mine (below these instructions) or write your own. I used caption: to write the text, but there are a few ways to draw text with ImageMagick. I also used tr to convert the q’s to \n (newlines).
  3. Run the script to make the images! My script took around a half hour to finish, creating over two thousand images.
#!/bin/bash
# Render one caption image per line of the obs text file.
iter=0
while read -r l; do
    echo "$iter"
    # tr expands the placeholder q's into newlines before ImageMagick
    # reads the caption text from stdin (caption:@-).
    printf '%s' "$l" | tr 'q' '\n' | convert -background green -fill white \
        -strokewidth 2 -stroke black -font AvantGarde-Demi -pointsize 72 \
        -size 1800x -gravity Center caption:@- "out_${iter}.png"
    iter=$((iter + 1))
done < kgld_out_final.txt
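Before committing to the full half-hour run, you can test the q-to-newline substitution in isolation – the sample ob text below is made up for illustration:

```shell
# Expand the placeholder q separators into newlines -- this is exactly
# what each caption looks like before ImageMagick renders it.
printf '%s' "28F / 85% RHqLight SnowqVis 1/2SM" | tr 'q' '\n' > caption.txt
cat caption.txt
```

One caveat the script inherits: tr will also convert any literal lowercase q in the ob text into a newline, so pick a separator character that never appears in your data.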

Putting it all together

How to put it all together is up to you. I used Sony Movie Studio, which is just as good as Vegas for my purposes. You can’t use something like iMovie or Windows Live Movie Maker, since you need some advanced features such as chroma key (for the weather ob images) and multiple video tracks. If you’re using GoPro images on a 1920×1080 (standard 1080p HD) timeline, the 4:3 stills scale to 1440×1080, which leaves a 480×1080 strip of empty space – that’s what I filled with the radar imagery.

How to make it easier – ideas for next time

Here are some ideas I have for making this process much easier…

  1. For the obs, write a script to archive the obs every minute using, for instance, a shell script and the Wunderground API. You could then just QC the data and make the images from that text, assuming you archive the data in a format you like.
  2. For radar, use GRLevel3 to save images in realtime using the FTP Publish feature, which can publish to a local directory.
  3. Also for the radar, use IDV within a few days to avoid having to download the data – Radar and satellite data is archived in IDV’s numerous free data sources for days.
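Idea 1 could look something like the sketch below. fetch_ob is a stand-in for whatever the data source ends up being – the commented-out curl line only shows the general shape of a Wunderground API call, and the endpoint, key, and JSON fields are all hypothetical:

```shell
# Append one timestamped observation per run; schedule with cron to
# build a minute-by-minute archive.
fetch_ob() {
    # In real use, something like (hypothetical endpoint/key):
    #   curl -s "http://api.wunderground.com/api/YOUR_KEY/conditions/q/KGLD.json"
    # This placeholder JSON stands in for the API response.
    echo '{"temp_f": 28, "weather": "Light Snow"}'
}

archive="obs_archive.txt"
printf '%s %s\n' "$(date -u '+%Y-%m-%d %H:%M')" "$(fetch_ob)" >> "$archive"

# crontab entry to run the script every minute:
#   * * * * * /path/to/archive_obs.sh
```

From there, the QC and image-generation steps above would read straight from the archive file.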

Concluding Thoughts

I hope you found this post useful if you’d like to do something similar to what I did! If you have any questions feel free to ask me on Twitter – I couldn’t include every little step for every level of user, but I hope this guide was good enough as a starting point for anyone interested in doing this. I hope to do this again in the future, and especially this spring!

Posted in Technology, Weather

April 8, 2013 Storm Chase and landspout

On April 8, a few severe to tornadic cells moved through parts of northeast Colorado, northwest Kansas, and extreme southwest Nebraska. I had been watching the situation during the afternoon, deciding whether it would be worth it to chase or whether I might do better just waiting for something to come to me in Goodland, KS. I took the wait-and-see approach, because while the environment was very favorable south of I-70, there didn’t seem to be a strong forcing mechanism. I watched a cell strengthen in eastern Colorado, and as it was passing I-70 just west of Burlington, CO, I noticed the cell trying to split. A right turn would mean the storm could cross into Kansas and be fairly close to me, so a friend and I decided to make a go at it.

We took KS-27, a paved highway north of Goodland, and about 5 miles north made a left turn to head west on dirt roads. The cell was moving north-northeast but appeared to be weakening. Even so, this low-precipitation storm looked pretty great from our vantage point and we were able to get some good photos! This was my first time chasing in the Plains and seeing a real supercell thunderstorm in person. I was very thankful to have a co-pilot guiding me where to drive (and when to stop) – I couldn’t imagine going out there alone.

After about an hour of watching the storm it appeared to be weakening, so we decided to head back to Goodland. As I was about to get back on the paved highway KS-27, I stopped to glance at my phone and saw a text from a friend: “JOE CONFIRMED TORNADO ON THAT CELL NOW TURN AROUND.” Knowing how my friends and I joke with each other, I figured he was just kidding me. While I was without radar data for a time, based on what we saw of the storm structure it didn’t look exciting. I continued on the road and my co-pilot checked the radar to find my friend was not kidding! Soon after, I received a call from NWS Goodland asking what I could see, since my last Spotter Network location was near the storm. I told him we were already on our way back to Goodland, and he let me know that they had multiple landspout reports. The storm later went on to produce 2″ hail and another tornado warning.

While I missed the landspout, I was very curious about what had happened, since there were a ton of chasers on the storm. Below are a number of videos from the storm showing the landspout, which I found on YouTube:

[video embeds removed]

Another from the same folks above:

[video embed removed]

And one last one showing the intense blowing dust (skip to 1:00):

[video embed removed]

All in all, I thought it was a great first “chase” of the season – a good dry-run of things to come this spring. And I really can’t complain about the location of the chase – within a half hour I was getting great photos and was home before sunset.

LSRs and Warnings from IEM

Posted in Storm Chasing, Weather

Top 5 Reasons to Intern at the Meteorological Development Lab!

During the entire year of 2011, I’m working at the National Weather Service’s Meteorological Development Lab (MDL) in the Student Career Experience Program (SCEP). The coordinator of the program recently sent an email to the current interns (~6 of us) asking us to promote the internship because we have some openings coming up in January. So, I decided to create this list of the top five reasons why students should apply for the SCEP at MDL based on my experience.

5. Work at a world-class laboratory with some of the top scientists within the National Weather Service. MDL is the only lab within the National Weather Service. (Others, like GSD, are part of NOAA Research.)

4. See how products go from research to operations and understand the challenges that are faced in bringing a product from an idea to an operational product. Learn more about how the National Weather Service and NOAA work from the inside out.

3. Get paid as a real scientist! SCEPs are considered full-time employees, so you get paid just like everyone else. You also get the benefits of working for the federal government, such as health benefits and a retirement savings account. You also receive transit benefits which cover the cost of public transportation to/from work. (Note: You don’t get housing like some internships, but you can usually find a place to live with some roommates to keep costs down.)

2. Live and work in the nation’s capital! MDL is located in the NOAA Silver Spring Metro Campus (SSMC) inside NWS headquarters, which is a stone’s throw away from the metro (rail) station and just a few minutes from DC via metro rail or bus!

1. Learn new skills and apply them for a full year! Unlike some internships, the SCEP at MDL is a 12-month commitment. I’m currently working my 12 months in one session (Jan 2011 – Jan 2012), but you can also break it up into two segments. In addition, after graduating there is a “possible 120-day work period, after graduation and possible conversion to career-conditional appointment, depending upon the availability of positions.”

Here’s some more information about the SCEP at MDL:

  • Applications are welcomed year-round! Unlike the separate nationwide program, the SCEP at MDL hires at all times of the year.
  • You can intern as an undergraduate or graduate student. (You MUST NOT have completed your degree requirements before beginning the SCEP.)
  • The branch you work in will depend on available openings.
  • You only need to fill out the application and provide a resume and (unofficial) transcript to begin the application process!

So what are you waiting for? Apply today!

If you have any questions, feel free to comment below, ask me on Twitter or Facebook or shoot me an email (joe at this website’s domain).

 

Posted in Millersville University, Weather

How-to: Installing the WRF-EMS on a virtual machine

At my internship, I’ve learned a ton about Unix/Linux. As a meteorology student, I have always had an interest in modeling, especially since we run a version of the WRF at Millersville. I knew it would be an enormous task to compile and run the WRF myself… then I stumbled upon the WRF-EMS. The WRF-EMS is a pre-compiled WRF with a ton of built-in functionality that makes it a cinch to get up and running in hours! It’s a pre-configured version of the WRF model controlled by Perl scripts, with lots of extra goodies that make running it almost too easy.

Everything listed here is free; in addition to the Open Source community that makes Unix and other projects so successful, you can thank your tax dollars and the hard-working people at NOAA/NWS/NCEP and UCAR/NCAR for making something like this possible. Finally, the ease of installation and operation would not be possible without Robert Rozumalski, the NOAA NWS SOO Science and Training Resource Coordinator at UCAR and author of the WRF-EMS*. (*Just to be clear, Robert is the author of the WRF-EMS. The core WRF code (the actual model that runs) comes from the WRF development teams at NCAR (ARW) and NCEP (NMM). Read the introduction on the WRF-EMS homepage if you’re confused.)

Step -1: Disclaimer and Requirements

  • I did nothing to make any of this happen! Everything in this guide relies on the hard work of others (see above). I’m writing this post to help out other students and scientists (and myself!) learn more about the WRF and modeling in general.
  • This guide is made for someone interested in modeling. If you are serious about running the WRF for operations, you should not be using a virtual machine. However, this is a great way to get started if you’re not yet ready to take the plunge on dual-booting or buying a new machine.
  • You should have at least 4 GB of RAM, a modern (minimum dual core) processor and 150-200GB of hard disk space. A fast internet connection (or a load of patience) is required as well.
  • I am not responsible if this breaks your computer! Working in a virtual machine should prevent any computer disasters beyond the “virtual” machine. Please proceed at your own risk!
  • Ready to go? Let’s go!

Continue reading

Posted in Technology, Weather

WxBlogging: How to get started with your own weather blog

One of the first things I learned about meteorology was that to be a good forecaster, it took a lot of forecasting experience. Through my years of education and internships, I am still a firm believer in this, and I think many others in the industry would agree. There’s no great how-to manual for every single forecasting scenario – you have to apply your education every day to become better at it. For this reason, I think starting a weather blog is an excellent idea for meteorology students (or meteorology students-to-be). It’s easy to look at the radar, look at models, take some MetEd modules and think you have a handle on what’s going on; when you sit down and try to describe this in written form it can be a real challenge!

This guide is for anyone looking to set up their own weather blog, though it’s primarily aimed at my peers at Millersville and other colleges who are looking to get into blogging about the weather. I’ve helped a number of peers through this process and I hope I can help you, too!
Continue reading

Posted in Communication, Weather

Welcome to Twitter, NWS!

The National Weather Service (NWS) recently announced a number of experimental Twitter feeds:

Over the next several weeks, prototype Twitter feeds will be established for the following sites:

NWS Norman OK
NWS LMRFC (Lower Mississippi River Forecast Center)
NWS Pleasant Hill/Kansas City MO
NWS Salt Lake City UT
NWS Charleston SC
NWS Western Region
NWS Honolulu HI
NHC Atlantic | NHC Pacific – The National Hurricane Center will run 2 accounts, basin specific.

The NWS Norman account is currently posting stories relevant to their CWA. I have a pretty strong feeling it’s no accident that Norman is one of the first WFOs on Twitter, though: their WCM, Rick Smith, interacts on Twitter regularly using his ounwcm account. NWS Kansas City also has an active account with re-tweets of other accounts.

Contrast this with NHC Atlantic, which is strictly a feed (also available via RSS) of their latest products, a la the IEM Bots. It doesn’t seem (so far, at least) that there is any human using the Twitter account.

Twitter means different things to different people. Many people tweet “at” or mention other users (eg. @username) in their tweets, and many have come to expect a response. Some companies even have entire teams dedicated to providing help on Twitter. In fact, when I needed help with my phone, I got a faster response by tweeting @ATTCustomerCare than by calling! While this is great for some people, I think many would agree that the NWS shouldn’t be constantly watching who mentions them. However, I hope they do occasionally reply and generally keep an eye on mentions, just as they do for their Facebook pages.

Ideally, this is what I hope to see from these Twitter accounts:

  • Stories and short-term forecasts (eg. an upcoming snowfall event, enhanced fire weather risks, public outreach like Lightning Awareness Week)
  • Occasional re-tweets of reliable sources (eg. an emergency management agency publishing helpful information)
  • Occasional replies (eg. towards members of the general public- but only when time permits)
  • Automated posting of severe weather information, ideally published through the IEM/NWSChat bot system

I hope this Twitter trial is successful and expands to other offices soon!

FYI: How many people are on Twitter? It’s always important to get a reality check: A recent Pew survey found that roughly 13% of online Americans use Twitter. Using various sources, this comes out to about 10% of the entire US population. In comparison, 38% of Americans are on Facebook. These aren’t huge numbers, but they’re likely larger than the current number of Americans who have Weather Radios- which, in my opinion, makes a project like this worth the effort.

Posted in Communication, Reviews, Weather

Enhancing #wxreport with Skywarn spotters

I’ve been following the progress of the National Weather Service’s #wxreport Twitter project since its inception. In fact, I was researching the idea of using Twitter for high-density (but low-quality) weather reports as an observation network.

It seems the project website hasn’t been very active (though the RFC has been renewed through 2011), and I can guess why: while there may be some great reports, most of the tweets tagged with #wxreport are just plain noise. So how do we fix this? If only there were a way to verify people who were trained in some way to spot the weather, so that their reports carried more weight than others. Hm… Skywarn, anyone?

What if trained spotters could let their local WFO know their twitter username, so that forecasters knew they could trust the reports more than others? There are a few ways this could be done:

  1. NWS could set up a central registration page for users to log in and enter their Skywarn info (possibly with other information, like their current phone number/email address). Then, this data could be used in applications used by WFOs.
  2. Users could tweet “at” or “direct message” a special account with the necessary information. (For example, @nws_skywarn_register #SkywarnID #WFO #email could be sent by users, and a script could be run to catch and organize these tweets, with some manual QC, to establish a database.)

On a related note: I still don’t understand the Spotter Network – a non-profit organization started by Tyler Allison (of AllisonHouse) that “is in no way sanctioned or affiliated with the NWS nor any of the other government agencies.” They have their own training, not related to Skywarn (or so it seems). And if this confuses me, a senior meteorology student, I have no doubt others have been confused as well. I would like to see this program and the Skywarn program joined or merged in some way to create one private-public-partnership spotter network. I know this is tough given the current budget constraints, but I think it would eliminate some duplication while strengthening NWS’s relationship with the private weather enterprise.

Posted in Communication, Weather

Watch/Warning/Advisory colors should be standard across all NWS platforms and partners

What color is a tornado warning? Red? Usually. Unless you use the default on GRLevelX products- then it’s pink. I also found a Memphis TV station that used bright orange.

How about a Severe Thunderstorm Warning? The National Weather Service website uses bright orange, while most media outlets use yellow. (GRLevelX Products use a default of red.)

And don’t even get me started on other products, like severe watches and winter products. Watch/Warning/Advisory colors are a messy topic across platforms, and one I believe should be standardized.

In doing my research for this post, I found I wasn’t alone. Turns out there was a *paper published by the American Meteorological Society’s Interactive Information Processing Systems (IIPS) Subcommittee for Color Guidelines in 1993 that contained color guidelines. Under severe weather, they recommended all thunderstorm and tornado WWA products be displayed in red. However, that was nearly 20 years ago! (The age of the paper is evident in phrases like, “Consideration should be given to how the color set will map to gray shades if some users will ultimately view the color set in monochrome mode (for example, monochrome television or hard copy)…”) Still, the paper has some relevant points:

  • “The colors selected should be used consistently everywhere they are used (Travis 1991): for example, if green is used to represent landmasses in one place it should not be used to represent water bodies in another instance”
  • “Limit the number of different colors that are used in any single visual display or product.” (Ahm… NWS…)
  • “When selecting colors to represent various features or conditions, choose colors that have familiar relationships…”

While I don’t see having inconsistent color schemes for WWA colors across NWS partners as a critical safety issue, I don’t believe that it’s helping convey important weather information, either! Luckily, at least one broadcast market has recognized the issue:

The local TV stations in Kansas City, MO got together and standardized their colors. While I haven’t been able to find any articles about this other than the video, I’m sure implementing these changes cost next to nothing.

Who should implement this? I think the National Weather Service has the authority to do this, at least as a strong recommendation to its customers. If not NWS, I think it would be a great thing for an AMS committee to develop guidelines.

What are your thoughts? In an informal, non-scientific poll amongst my peers I could not find one dissenting voice. What’s stopping this from happening?

 

*Guidelines for Using Color to Depict Meteorological Information: IIPS Subcommittee for Color Guidelines. Bulletin of the American Meteorological Society 74.9 (1993): 1709–1713.

Posted in Communication, Weather

Where’s my storm chaser TV network?

Technology is amazing. Every day that there is severe weather, dozens of storm chasers stream their experience live via Severe Studios, ChaserTV, UStream, Livestream- you name it. The proliferation of affordable mobile broadband cards in recent years has made live streaming mobile video not only technically possible, but relatively affordable. This is great!

While I’ll be the first to admit I haven’t got much of a clue when it comes to “real” storm chasing in the Midwest, I do know that much of the time, chasing is not very exciting. It’s a lot of the “hurry up and wait” sort of thing. So when a severe outbreak is predicted, many chasers might be streaming while they wait for storms to begin. While it’s certainly exciting to be there waiting for a storm to begin, it’s not the most entertaining thing to watch.

Once storms do begin to fire, it can be hard to keep up with what’s going on! Twitter will be on fire* and while it’s a really great tool to stay up to date, it can really be overwhelming. It’s an overload of information… much like the first few days of March Madness! How do you stay up-to-date when there are so many details unfolding on so many scales?

I think a web-based Storm Chaser TV Network would be a great idea. Here’s what I’m thinking:

  • Who would host it? Ideally, experienced storm chasers or broadcasters who not only know the meteorology but also know the geography.
  • When would it air? It would have to be something with maybe a day’s notice, depending on the type of outbreak. It could be limited to days with a Moderate Risk, or it could be a daily thing (depending on the availability of those involved). I would see something airing from just before storms are expected to start in the afternoon until sunset.
  • What would the format be? I see a mix of in-depth severe analysis, community involvement, SPC Multimedia Briefings, call-ins from chasers, and, of course, live video of chasers. A live chat with a few IEM_Bots would be excellent as well.
  • Who would produce it? This is one of the more difficult questions to answer, but I think either of the two most popular live storm chase video sites (Severe Studios or ChaserTV) have the technical know-how to pull it off.
  • Why? It would be much more entertaining to watch a well put-together live production than just to wait and watch chasers drive around. Why not showcase the best-of-the-best live streams with educational analysis? Plus, advertising revenue could be shared fairly among chasers- heck, I’m sure some people would pay a small amount to watch something like this.

What do you think? Too much work? Not a big enough market? Is this already happening? Feel free to leave your comments below or share them with me on Twitter.

 

*FYI, Google’s Realtime search, which searches Twitter, Facebook and Buzz (ha), is a very powerful social search engine. Not only does it get the latest data (and updates the page as updates roll in) but you can search back in time, and also narrow down your search by location.

Posted in Communication, Weather

Idea for MU-AMS: A Data Visualization Workshop

I’ve wanted to plan something like this for nearly a year, but I’ve always been too busy to actually plan it. And while I am now taking a year off from school to be a SCEP at NWS/MDL, I wanted to get this idea at least written down and hopefully prompt some discussion.

My idea is to have a one- or two-day workshop focused on some of the current visualization tools. Why? We don’t really learn them formally in our classes, yet many of us will need to know how to use them in our careers. I think it would be an excellent service to our members to hold something like this and would give our students (myself included) an edge in the current employment environment.

Focus: To teach attendees how to install and use the current visualization tools, with a focus on operational forecasting and research.

Audience: Our main audience would be MU-AMS members (~100), but we could also have it open to pre-college students, weather enthusiasts and possibly met students at other schools.

Location/Venue: Naturally we would use the facilities we already have in Caputo Hall.

Date/Time: A Saturday or Friday/Saturday in the Spring or Fall.

Costs: Minimal. Some refreshments, maybe a catered lunch? If we really wanted to go all-out and fly in some guest speakers, I would see costs around $1,000, max. At minimum this could be done for $200 or less.

Cost to participants: Less than $20.

Topics:

  • GRLevel2Analyst – An excellent radar tool that some of us already use.
  • The Integrated Data Viewer (IDV) – I LOVE this program, though it has a steep learning curve. You can do everything from models to radar to satellite to… well, anything in almost any format.
  • ArcGIS or other GIS software – While the GIS class is available to take instead of IDL, I still think a tutorial in how to use GIS software would be immensely helpful!
  • GEMPAK/NAWIPS – And maybe AWIPS2 when it becomes available.
  • BUFKIT – A tool used in many NWS offices.
  • Other web-based tools?

Format: I have no idea, but I would think after giving some background and installation information on each software package, there could be a series of tutorials for different scenarios. Perhaps participants could request subjects ahead of time so that the workshop focused on what they really wanted to learn.

Speakers: This would be the biggest issue in planning. We would need to find the best speakers for each software package; these candidates could include NWS personnel, professionals at private businesses, Unidata people, or even our own professors.

Technology: Some of this software is already installed on the department computers, but the rest could probably be installed as well. However, it would likely benefit participants to work on their own machines.

Can it happen? Of course it can! Millersville actually hosted a Unidata Regional Workshop on IDV back in 2004. (Edit: Plymouth State hosted the regional workshop in 2008, too, though it looks like this was the last Unidata workshop on the east coast.)

So that’s my idea. Feel free to discuss in the comments below, on Facebook, on Twitter or at any open officer meeting (even if I won’t be there).

Posted in Millersville University, Weather