On Communicating Winter Storm Threats and the Idea of a Winter Storm Outlook

Throughout the winter season, time and time again, someone will publish a QPF x 10 model map (otherwise known as “snowfall”) and the meteorological community (TV, NWS, academic, and private sector mets, students, and weather enthusiasts alike) will throw their hands up: It’s too early! It hasn’t even reached land yet! Irresponsible!

But in the warm season, when NWS meteorologists have confidence in severe thunderstorms or flash flooding, outlooks are issued by national centers (SPC/WPC). SPC has perhaps the most well-known of the “outlook” stage products, the Convective Outlook. These outlooks are issued as far out as Day 8 when SPC has confidence in widespread severe weather. They’re so popular the categories were recently revamped to add more detail in the short range, acknowledging the wide-spanning use of the product. (For Day 4-8, “15%” and “30%” risk areas are now possible, where previously only 30% was used; for Day 1-3, the categories were expanded and names adjusted “to bring better consistency to the risks communicated in SPC outlooks.”) For flash flooding, WPC issues a similar (but, in my experience, much less popular) Excessive Rainfall Outlook for Day 1-3.

Winter Weather Left out in the Cold?

Winter storm fatalities are vastly – but knowingly – under-counted by the NWS
When you look at high-impact weather – weather that disrupts people, property, and the economy – there are a number of major categories: Floods, Extreme Heat, Extreme Cold, Tornadoes, Hurricanes, Lightning, etc. Here’s how the National Weather Service counts weather fatality statistics:

You might notice the “winter” category is very low in the 10-year average (light blue bars). More people have been killed by lightning than by winter storms? No way! The reason is that capturing these statistics – work done on a case-by-case basis at local National Weather Service offices – is extremely time-consuming and challenging. When it comes to winter weather, the NWS Storm Data directive (NWS 10-1605) declares winter weather deaths from vehicle accidents to be almost always “indirect” fatalities: “Fatalities and injuries due to motor vehicle accidents on slippery, rain, or ice-covered roads are indirect.” Indirect fatalities are not included in weather-related fatality statistics, hence the smaller number in the graph above.

While fatalities from car accidents are not captured in NOAA/NWS statistics, the U.S. Department of Transportation does capture this data in the National Highway Traffic Safety Administration’s Fatality Analysis Reporting System (FARS). Vox recently reported on a pair of studies that reviewed the data and counted 31,098 car crash fatalities attributed to winter weather conditions over the past 36 years, an average of 863 per year nationwide. If these were included in Storm Data, winter weather would exceed every other category combined. While this comes as no surprise to anyone in the industry, it quantifies the fatal nature of winter storms.

While having a high impact on life (not to mention economic impacts), this phenomenon lacks any formal product in the “outlook” phase beyond local (and inconsistent) mentions in the Hazardous Weather Outlook, a great but often hidden product. (When no threat exists for Day 1 – today and tonight – the product is highlighted neither on the local WWA map nor on the point-and-click forecast pages.)

WPC Launches Experimental Day 4-7 Probabilistic Winter Weather Outlook

For the 2015-2016 winter season, WPC launched an experimental probabilistic Winter Weather Outlook (more details): a single map for each of Days 4-7 containing a range of binned probabilities (10-30%, 30-50%, 50-70%, 70-90%, and 90%+) for the “probability of snow/sleet accumulating > 0.25″ liquid equivalent.” Here’s what the maps looked like for the 2016 east coast blizzard, an event that, in my observation, was generally well predicted at long range: (Note the color changes at the end – they must’ve changed the scheme at some point.)


And the verifying snow water equivalent (SWE) for the same 24-hour period. (This isn’t exactly the same as actual reports, but close enough for this purpose.) Roughly the third-darkest shade of blue and above would verify the WPC maps above.

Not a bad job at all! Keep in mind this event lasted longer than 24 hours, so while it looks like a “miss” for areas of eastern PA and NJ, that snowfall fell on the other side of 12Z. I think this product is a great first attempt at filling the need for a winter weather outlook in the medium range, but it’s not the only experiment going on.

WFO Sterling’s experimental Long Range Winter Storm Threat product

WFO Sterling (AKA Baltimore/Washington, DC, or simply “LWX”) is running a very similar experiment (more details/live product) using threat categories and colors (green, yellow, orange, red, and purple, representing none, slight, enhanced, moderate, and high – the same as SPC’s categorical outlooks) and a corresponding 3×3 “threat matrix” to convey confidence and potential impact for a “winter storm” in the Day 4-7 range. This is displayed on a map for two areas of their County Warning Area (CWA): west of the Blue Ridge & Catoctin Mountains, and east of those mountains, including northern Virginia (NoVA), DC, and Baltimore. The documentation does not specifically define a “winter storm,” but given that the Winter Storm Warning criterion is 4″ for most of their region outside the higher terrain, it could be inferred as a confidence marker for at least 4″ of snowfall. It’s important to note this product focuses on potential impact, not snowfall amounts. For the same storm as above, here’s a collection of examples of this product taken from Sterling’s Facebook page.

As the storm approached, the confidence level increased, reaching the highest possible level in both confidence and potential impact at Day 4 – four days of advance notice for high confidence in a high-impact winter storm. It’s important to note that even before this, the 3×3 threat matrix highlighted the high (3 of 3) potential impact despite medium (2 of 3) confidence.

Comparing These Attempts at a Winter Storm Outlook

While these two experiments are not completely separate (the WFO Sterling experiment discusses using the WPC experimental outlooks in producing their winter storm threat product), they provide different information in different formats; the WPC product seems aimed at more technical users, while WFO Sterling’s product is tuned to the needs of their local partners such as emergency managers. Do both have a reason to exist side by side? Comparing the two ideas:

WPC's Winter Weather Outlook
  • Output: Probabilistic guidance for CONUS (0-100% in bins)
  • Criteria: > 0.25" liquid equivalent of snow/sleet
  • Intended audience: NWS forecasters, Emergency Managers, and other interested parties (advanced users)
  • Advantages:
    + Easy to produce (mainly automated with human adjustment)
    + Gives a precise probability range
  • Disadvantages:
    - Criteria may not be well understood (0.25" liquid could be <1" to >6" of snowfall)
    - May be difficult to understand (is a 30% chance high?)
    - Only one probability instead of the few/many categories in the usual Day 1-3 probabilistic guidance
    - Does not take into account impacts or climatology

WFO Sterling's Winter Storm Threat
  • Output: Categorical (5 categories)
  • Criteria: Winter storm impact (subjective)
  • Intended audience: Emergency Managers, school officials, general public
  • Advantages:
    + Output categories are easier to explain/convey
    + Simple one-word forecast summary for each day
    + Ability to depict both confidence and potential impact
    + Subjectivity gives the forecaster some leeway in communicating the threat
    + Focuses on impacts relative to geographic area (e.g. the impact of 2" of snow in Minnesota vs. Georgia)
  • Disadvantages:
    - May be more difficult to produce (due to its subjective nature, though based on a suite of automated guidance)
    - Large geographic areas have bust potential for storms with a sharp gradient

Thoughts on WPC’s Winter Storm Outlook

The output’s focus on liquid equivalent instead of snowfall is confusing and requires the user to also know the expected snow-to-liquid ratio – not very user friendly for forecasters wanting simple guidance, let alone a non-meteorologist who won’t know how to find this information. I like the idea as guidance, but liquid-equivalent amounts are not as helpful as a specific snowfall amount (or a few amount categories) would be, especially for non-technical users.

Thoughts on WFO Sterling’s Winter Storm Threat

The simple categories focusing on impact are excellent. The use of a threat matrix may be challenging to understand, but I like the idea. The production could be difficult, but with proper training and experience I really think this is the direction NWS should pursue. The output maps are not very clearly presented in my opinion (more of a web display/GIS issue), but otherwise I think this is a great experiment and I’m very happy they’ve had at least one major event to really get some use out of it!

Taking Aim at a National Winter Storm/Threat Outlook

Ultimately, I think the experiment at Sterling could and should be expanded to a national scale with some small modifications. The SPC model, in which a national office produces the product with some local input, could be imitated by WPC. One of the main challenges would be understanding local impacts – forecasting snowfall is difficult enough, but understanding impacts on a nationwide scale is very challenging. An answer to this challenge could be developing a threat index based on local climatology, essentially trying to equalize winter weather impact everywhere from the intermountain west to the Gulf Coast. (NWS meteorologists are actually already working towards this idea, though it is currently in a pre-experimental phase.)

The production of such a national outlook could be handled by WPC using an approach similar to its current suite of winter weather products. Developing confidence/potential-impact ratings would be more subjective, but using ensemble guidance, three maps could be produced: potential impact (based on the amount of snow/sleet/ice), confidence (based on ensemble and run-to-run spread), and a final outlook category based on the first two. The first two maps could be developed and collaborated on internally, just as WPC produces its winter weather and QPF guidance. WFOs, in turn, could use the final categorical outlook in their communication, with consistent terminology nationwide. The potential impact and confidence maps could be shown separately or displayed in a visual point-forecast display such as the point-and-click pages. Enhanced or higher outlook categories could even be highlighted on the point-and-click icons, with details in the wording, similar to the new WWA display.

A friendly reminder that this post as well as all of the content on this website, on my Twitter (@wxjoe) and on my Facebook page do not reflect the views of my employer. These are my opinions only.

Posted in Communication, Ideas, Weather | Comments Off on On Communicating Winter Storm Threats and the Idea of a Winter Storm Outlook

Creating a timelapse video with weather observations + radar

I recently purchased a GoPro H3B (Hero3 Black edition, for non-GoPro folks) for numerous reasons – one of which was to create awesome timelapses. (Is it time lapse, time-lapse, or timelapse? I don’t know.) The H3B has a built-in timelapse mode which can do intervals ranging from 0.5 to 60 seconds. For a recent snow storm, I wanted to test out the timelapse feature and set it to a 60-second interval – plenty for a snow storm.

Let me just take a half-step back and say I like making snowfall timelapses. A lot. I’ve created many snowfall timelapses over the years, using various hardware/software combinations. Sometimes I’ve used a simple webcam, basic capture software, and Windows Movie Maker, while others have been created with a DSLR and Sony Vegas Movie Maker.

I’ve always wanted to create an outdoor timelapse with radar included, but I’ve never had the time or energy to figure out how to do it. This time, however, I was inspired – by black bars, of all things. See, the GoPro captures images using its full 4000x3000px (4:3 aspect ratio) sensor, which gives great big 12 MP images. Unfortunately, YouTube’s video player is 16:9, and if you upload anything but a 16:9 ratio, it adds black bars. I decided I wanted to fill that blank space with radar data, and it turned out the radar part was actually pretty easy – putting the actual weather observations on the video was the more difficult part.

Getting the Radar data ready for video

My photo captures were at 60-second intervals, so as long as I knew the radar VCP, I could match up the times with the images. Fortunately, only two VCPs were used over the course of the event – 31 and 21. VCP 31 takes about ten minutes and VCP 21 about six. I downloaded the radar data from NCDC, then used Unidata’s Integrated Data Viewer (IDV) to put together the video.

You will need:

  • IDV (see above) – you can actually run it right from the Java Web Start without installation!
  1. Download radar data from the NCDC NEXRAD Inventory site (not the HAS site)
    1. Click the radar site you want data from
    2. Change the date at the top to the first day you want data for. (All dates/times are UTC.) Then select the product you want – I used the Level 2 (“raw”) data, but Level 3 takes up a lot less space and will be faster on a slower machine. For L2 data, leave the default selected; for L3, use either N0Q (long-range reflectivity) or NCR (composite short-range reflectivity, for light snowfall events). Click Create Graph.
    3. You will see a graph of each radar volume scan, color-coded by mode. There should be a lot of blue and red lines – blank space means the radar was down. Before you leave this page, click the link towards the bottom that says “View Actual Timestamps / Op. Mode / VCP” – this lists when VCPs were changed. Copy this and save it to a text file for future reference. Enter your email address, change the times to what you desire, and click Order Data – you will receive the data within minutes to hours.
    4. Download the data and un-gzip it. (On Windows, use something like 7-Zip; Linux/Mac users can just use gunzip.)
  2. Download/Open IDV
  3. Load data (Data Choosers > Files; Set Data Source Type to “Radar”; Add Source)
  4. For L2 data, choose reflectivity (or whatever parameter you want); for L3, I don’t think you have much of a choice – just select the data and load.
  5. Radar data will be displayed. Set up the map the way you’d like. For radar colors, choose whatever you want. I’ll share with you my AWIPS 256 color table that I created because I like those colors best – it was a pain to make, it’s not perfect, but it looks good enough for me! In the IDV color table manager, just import the XML file and it should work – just make sure the range is set from -32 to 80!
  6. Everything look good? Are you sure? Now we’re ready to capture. On the Map View, go to the View menu just above the map; Capture > Movie.
  7. In the Movie Capture dialogue, click on the Time Animation button and let it go. Don’t touch your computer while it runs!
  8. After capturing, a new window will pop up asking about timing and formats. Set these to what you’d like, as it will depend on your video speed and VCP.
  9. Movie saved! It happens very fast – you might think it didn’t work, but go ahead and check where you saved the file and it should play just fine.

I recommend saving in shorter (6-10 hour) chunks to reduce taxing your computer too much. Also, because VCP scan times aren’t exact, you’ll want some wiggle room to speed up or slow down the radar data in your final production.
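By the way, the un-gzipping in step 1.4 is easy to script if your order contains many files. A minimal sketch, assuming the archives landed in the current directory (the *.gz pattern is my assumption – match whatever filenames NCDC actually delivered):

```shell
# Decompress every downloaded archive file in the current directory.
# The *.gz filename pattern is an assumption -- adjust to your order.
for f in *.gz; do
    gunzip "$f"    # replaces file.gz with the uncompressed file
done
```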

Tip: Want to save some video then come back later to the exact same view? You can! When everything looks good, go to Displays > Favorite Bundles > Save As Favorite. Make sure Views and Displays are checked, and un-check Data Sources. You can then re-load the current view next time without having to re-do your map lines/zoom/etc.

Preparing the Weather Observations for the Video

This was by far the biggest challenge, and the reason I am documenting this process. It took time to figure it out, but I have ideas to make it easier next time. This process did take a while, so I’m going to break it up into a few sections. My process was basically to create an image of the text of the latest weather observation every minute. I did this because I could then just add these images alongside my timelapse images and know that they would be perfectly time-matched! I tried to find a better way to do this without the extra step of making images, but I couldn’t find anything despite searching a few internet forums.

You will need:

  • Excel (Yes, Excel – you can’t use OpenOffice or Google Docs for this one, folks)
  • Imagemagick

1) Download and format the data

Head to Wunderground and find your closest ASOS/AWOS. Note: if you have a closer or more representative station, you could combine, say, the weather/sky/visibility observations from the ASOS/AWOS with temp/dew point/RH/wind from another station. Go to the first day you want data for. You should be at a page that looks like this.

  1. You don’t need this, but I recommend turning on full METARs by clicking the link all the way at the bottom of the table of data
  2. Download the CSV – The link is all the way at the bottom of the page. If your browser does not prompt you to save the data, copy it all and paste it into Excel.
  3. Open the CSV with Excel; Use the Text-to-Columns tool if need be.
  4. Delete all the columns you don’t care about. Don’t need the altimeter reading? Don’t keep it around!
  5. Make any unit conversions by creating a new column and converting the original data with a formula. If you want to change the decimal places of a number, format it to look the way you want, THEN Copy > Paste Special > Values Only. Why? Because concatenation keeps the underlying value, not the new formatting.
  6. Now we need to create a new sheet with a line for every minute. This is pretty simple – just enter the first two times you need (eg. MM/DD/YYYY 00:00, MM/DD/YYYY 00:01) then drag down as far as you need.
  7. Now, with a sheet of times for every minute, we will re-create the sheet of formatted data and fill in the times we don’t have obs for using the LOOKUP function. If you want to see how I did this, see the Excel file I used.
  8. Finally, use the CONCATENATE function to take all of the values and put them in one string. You can add units, remarks, etc – again, see my Excel file to see how I did it. You’ll notice random ‘q’ characters in my text, too – I replace these with newlines when creating the images.
  9. Select all of the column with our concatenated strings (not including any label at the top), then paste it into a text file (e.g. using Notepad) and save the text file.
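If you’d rather skip Excel entirely, the fill-in-the-minutes part (steps 6-7) can be approximated at the command line. This is just a sketch of the idea, not what I used: it assumes the raw obs are in a file with one “HHMM,observation” line per row, and it simply repeats the last observation for any missing minute.

```shell
# Expand sparse observations (HHMM,text) to one line per minute by
# carrying the last observation forward. The input format is an assumption.
awk -F, '
function tomin(t) { return int(t/100)*60 + t%100 }   # HHMM -> minutes
NR == 1 { cur = tomin($1) }                          # start at the first ob
{
    m = tomin($1)
    while (cur < m) { print last; cur++ }            # fill gaps with last ob
    last = $0
    print last
    cur = m + 1
}' obs_raw.txt > obs_by_minute.txt
```

Note the repeated lines keep the original observation time; that’s fine for this purpose since the images are matched to the timelapse frames by position, not by timestamp.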

2) Create images with Imagemagick

  1. You now have a text file with a line of text for each minute of the timelapse. (See my sample) – Move this to your platform with Imagemagick installed if needed. (I used Dropbox to send the text file from my Windows machine to my Mac.)
  2. Write a short shell script to use Imagemagick to make images. You can steal mine (below these instructions) or write your own. I used caption: to write the text, but there are a few ways to write text with Imagemagick. I also used tr to convert the q’s to \n (newlines).
  3. Run the script to make the images! My script took around a half hour to finish, creating over two thousand images.
 # Render one PNG per line of the obs text file; 'q' characters become newlines.
 iter=0
 while read -r l; do
     echo $iter    # progress indicator
     echo -n "$l" | tr 'q' '\n' | convert -background green -fill white \
         -strokewidth 2 -stroke black -font AvantGarde-Demi -pointsize 72 -size 1800x \
         -gravity Center caption:@- out_${iter}.png
     iter=$((iter + 1))
 done < kgld_out_final.txt

Putting it all together

How to put it all together is up to you. I used Sony Movie Studio, which is just as good as Vegas for my purposes. You can’t use something like iMovie or Windows Live Movie Maker, since you need some advanced features such as chroma key (for the images of weather obs) and multiple video tracks. If you’re using GoPro images in a 1920×1080 (standard 1080p HD) project, the 4:3 stills scale to 1440×1080, which leaves a 480×1080 strip of empty space – that’s what I used the radar imagery to fill.

How to make it easier – ideas for next time

Here are some ideas I have for making this process much easier…

  1. For the obs, write a script to archive the obs every minute using, for instance, a shell script and the Wunderground API. You could then just QC the data and make the images from that text, assuming you archive the data in a format you like.
  2. For radar, use GRLevel3 to save images in realtime using the FTP Publish feature, which can publish to a local directory.
  3. Also for the radar: use IDV within a few days of the event to avoid having to download the data – radar and satellite data remain available in IDV’s numerous free data sources for several days.
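For idea 1, the per-minute archiver might look something like the sketch below, run from cron each minute. The fetch command is a placeholder – I haven’t settled on the real Wunderground API endpoint, key, or response parsing, so swap in an actual curl call and parser when you build this for real.

```shell
#!/bin/sh
# Append one timestamped observation line per run to a daily log file.
# FETCH is a placeholder command -- replace it with a real API call
# (e.g. curl against the Wunderground API) plus whatever parsing you need.
FETCH='echo TEMP 28F q DP 24F q WIND NW 12'
LOG="obs_$(date -u +%Y%m%d).txt"
printf '%sZ %s\n' "$(date -u +%H%M)" "$($FETCH)" >> "$LOG"
```

A crontab entry like `* * * * * /path/to/archive_ob.sh` would fire it every minute, and keeping the q separators in the logged text means the tr-to-newline trick from the image step still works.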

Concluding Thoughts

I hope you found this post useful if you’d like to do something similar to what I did! If you have any questions feel free to ask me on Twitter – I couldn’t include every little step for every level of user, but I hope this guide was good enough as a starting point for anyone interested in doing this. I hope to do this again in the future, and especially this spring!

Posted in Technology, Weather | Comments Off on Creating a timelapse video with weather observations + radar

April 8, 2013 Storm Chase and landspout

On April 8, a few severe to tornadic cells moved through parts of northeast Colorado, northwest Kansas, and extreme southwest Nebraska. I had been watching the situation during the afternoon, deciding whether it would be worth chasing or if I might do better just waiting for something to come to me in Goodland, KS. I took the wait-and-see approach, because while the environment was very favorable south of I-70, there didn’t seem to be a strong forcing mechanism. I watched a cell strengthen in eastern Colorado, and as it was passing I-70 just west of Burlington, CO, I noticed the cell trying to split. A right turn would mean the storm could cross into Kansas and be fairly close to me, so a friend and I decided to make a go at it.

We took to KS-27, a paved highway north of Goodland, and about 5 miles north made a left turn to head west on dirt roads. The cell was moving north-northeast but appeared to be weakening. Even so, this low-precipitation storm looked pretty great from our vantage point, and we were able to get some good photos! This was my first time chasing in the Plains and seeing a real supercell thunderstorm in person. I was very thankful to have a co-pilot guiding me where to drive (and when to stop) – I couldn’t imagine going out there alone.

After about an hour of watching the storm, it appeared to be weakening, so we decided to head back to Goodland. As I was about to get back on the paved highway, KS-27, I stopped to glance at my phone and saw a text from a friend: “JOE CONFIRMED TORNADO ON THAT CELL NOW TURN AROUND.” Knowing how my friends and I joke with each other, I figured he was just kidding. While I was without radar data for a time, based on what we saw of the storm structure, it didn’t look exciting. I continued down the road while my co-pilot checked the radar – and saw my friend was not kidding! Soon after, I received a call from NWS Goodland asking what I could see, since my last Spotter Network location was near the storm. I told him we were already on our way back to Goodland, and he let me know that they had multiple landspout reports. The storm later went on to produce 2″ hail and another tornado warning.

While I missed the landspout, I was very curious about what had happened, since there were a ton of chasers on the storm. Below are a number of videos showing the landspout, which I found on YouTube:



Another from the same folks above:




And one last one showing the intense blowing dust (skip to 1:00):


All in all, I thought it was a great first “chase” of the season – a good dry-run of things to come this spring. And I really can’t complain about the location of the chase – within a half hour I was getting great photos and was home before sunset.

LSRs and Warnings from IEM

Posted in Storm Chasing, Weather | 1 Comment

Top 5 Reasons to Intern at the Meteorological Development Lab!

During the entire year of 2011, I’m working at the National Weather Service’s Meteorological Development Lab (MDL) in the Student Career Experience Program (SCEP). The coordinator of the program recently sent an email to the current interns (~6 of us) asking us to promote the internship because some openings are coming up in January. So, I decided to create this list of the top five reasons students should apply for the SCEP at MDL, based on my experience.

5. Work at a world-class laboratory with some of the top scientists within the National Weather Service. MDL is the only lab within the National Weather Service. (Others, like GSD, are part of NOAA Research.)

4. See how products go from research to operations and understand the challenges that are faced in bringing a product from an idea to an operational product. Learn more about how the National Weather Service and NOAA work from the inside out.

3. Get paid as a real scientist! SCEPs are considered full-time employees, so you get paid just like everyone else. You also get the benefits of working for the federal government, such as health benefits and a retirement savings account, plus transit benefits which cover the cost of public transportation to/from work. (Note: You don’t get housing like some internships, but you can usually find a place to live with some roommates to keep costs down.)

2. Live and work in the nation’s capital! MDL is located in the NOAA Silver Spring Metro Campus (SSMC) inside NWS headquarters, which is a stone’s throw away from the Metro (rail) station and just a few minutes from DC via Metro rail or bus!

1. Learn new skills and apply them for a full year! Unlike some internships, the SCEP at MDL is a 12-month commitment. I’m currently working my 12 months in one session (Jan 2011 – Jan 2012), but you can also break it up into two segments. In addition, after graduating there is a “possible 120-day work period, after graduation and possible conversion to career-conditional appointment, depending upon the availability of positions.”

Here’s some more information about the SCEP at MDL:

  • Applications are welcomed year-round! Unlike the separate nationwide program, the SCEP at MDL hires at all times of the year.
  • You can intern as an undergraduate or graduate student. (You must NOT have completed your degree requirements before beginning the SCEP.)
  • The branch you work in will depend on available openings.
  • You only need to fill out the application and provide a resume and (unofficial) transcript to begin the application process!

So what are you waiting for? Apply today!

If you have any questions, feel free to comment below, ask me on Twitter or Facebook or shoot me an email (joe at this website’s domain).


Posted in Millersville University, Weather | Comments Off on Top 5 Reasons to Intern at the Meteorological Development Lab!

How-to: Installing the WRF-EMS on a virtual machine

At my internship, I’ve learned a ton about Unix/Linux. As a meteorology student, I have always had an interest in modeling, especially since we run a version of the WRF at Millersville. I knew it would be an enormous task to compile and run the WRF myself… then I stumbled upon the WRF-EMS. The WRF-EMS is a pre-compiled WRF with a ton of built-in functionality that makes it a cinch to get up and running in hours! It is built on a pre-configured version of the WRF model, controlled by Perl scripts, with lots of extra goodies that make running it almost too easy.

Everything listed here is free; in addition to the open source community that makes Unix and other projects so successful, you can thank your hard-earned tax dollars – and the people at NOAA/NWS/NCEP and UCAR/NCAR – for making something like this possible. Finally, the ease of installation and operation would not be possible without Robert Rozumalski, the NOAA NWS SOO Science and Training Resource Coordinator at UCAR and author of the WRF-EMS.* (*Just to be clear, Robert is the author of the WRF-EMS. The core WRF code (the actual model that runs) comes from the WRF development teams at NCAR (ARW) and NCEP (NMM). Read the introduction on the WRF-EMS homepage if you’re confused.)

Step -1: Disclaimer and Requirements

  • I did nothing to make any of this happen! Everything in this guide relies on the hard work of others (see above). I’m writing this post to help out other students and scientists (and myself!) learn more about the WRF and modeling in general.
  • This guide is made for someone interested in modeling. If you are serious about running the WRF for operations, you should not be using a virtual machine. However, this is a great way to get started if you’re not yet ready to take the plunge on dual-booting or buying a new machine.
  • You should have at least 4 GB of RAM, a modern (minimum dual core) processor and 150-200GB of hard disk space. A fast internet connection (or a load of patience) is required as well.
  • I am not responsible if this breaks your computer! Working in a virtual machine should prevent any computer disasters beyond the “virtual” machine. Please proceed at your own risk!
  • Ready to go? Let’s go!

Continue reading

Posted in Technology, Weather | 1 Comment

WxBlogging: How to get started with your own weather blog

One of the first things I learned about meteorology was that becoming a good forecaster takes a lot of forecasting experience. Through my years of education and internships, I remain a firm believer in this, and I think many others in the industry would agree. There’s no great how-to manual for every single forecasting scenario – you have to apply your education every day to become better at it. For this reason, I think starting a weather blog is an excellent idea for meteorology students (or meteorology students-to-be). It’s easy to look at the radar, look at models, take some MetEd modules, and think you have a handle on what’s going on; when you sit down and try to describe it in written form, it can be a real challenge!

This guide is for anyone looking to set up their own weather blog, though it’s primarily aimed at my peers at Millersville and other colleges who are looking to get into blogging about the weather. I’ve helped a number of peers through this process, and I hope I can help you, too!
Continue reading

Posted in Communication, Weather | Comments Off on WxBlogging: How to get started with your own weather blog

Welcome to Twitter, NWS!

The National Weather Service (NWS) recently announced a number of experimental Twitter feeds:

Over the next several weeks, prototype Twitter feeds will be established for the following sites:

NWS Norman OK
NWS LMRFC (Lower Mississippi River Forecast Center)
NWS Pleasant Hill/Kansas City MO
NWS Salt Lake City UT
NWS Charleston SC
NWS Western Region
NWS Honolulu HI
NHC Atlantic | NHC Pacific – The National Hurricane Center will run 2 accounts, basin specific.

The NWS Norman account is currently posting stories relevant to their CWA. I have a pretty strong feeling it’s no accident that Norman is among the first WFOs on Twitter: their WCM, Rick Smith, interacts on Twitter regularly using his @ounwcm account. NWS Kansas City also has an active account with re-tweets of other accounts.

Contrast this with NHC Atlantic, which is strictly a feed (also available via RSS) of their latest products, a la the IEM Bots. It doesn't seem (so far, at least) that there is any human behind the account.

Twitter means different things to different people. Many people tweet "at" or mention other users (e.g. @username) in their tweets, and many have come to expect a response. Some companies even have entire teams dedicated to customer support on Twitter. In fact, when I needed help with my phone, I got a faster response by tweeting @ATTCustomerCare than by calling! While this is great for some people, I think many would agree that the NWS shouldn't be constantly watching who mentions them. However, I hope they do occasionally reply and generally keep an eye on mentions, just as they do for their Facebook pages.

Ideally, this is what I hope to see from these Twitter accounts:

  • Stories and short-term forecasts (e.g. an upcoming snowfall event, enhanced fire weather risks, public outreach like Lightning Awareness Week)
  • Occasional retweets of reliable sources (e.g. an emergency management agency publishing helpful information)
  • Occasional replies (e.g. to members of the general public, but only when time permits)
  • Automated posting of severe weather information, ideally published through the IEM/NWSChat bot system
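To make the automated-posting idea concrete, here's a minimal sketch of the formatting step such a bot would need: trimming a product headline so the tweet never exceeds the character limit while the link always survives intact. Everything here (the function name, the example headline and URL, and the 140-character limit) is my own hypothetical illustration, not part of any actual NWS or IEM bot.

```python
def to_tweet(headline, link, limit=140):
    """Format a warning headline plus a link into a tweet-sized string.

    If the headline is too long, truncate it with an ellipsis so the
    link is never cut off.
    """
    room = limit - len(link) - 1  # reserve space for the link and a space
    if len(headline) > room:
        headline = headline[: room - 1].rstrip() + "…"
    return f"{headline} {link}"

# Hypothetical example product headline and shortened link
tweet = to_tweet(
    "Severe Thunderstorm Warning for Lancaster County PA until 5:45 PM EDT",
    "https://example.gov/warn/1234",
)
# The result always fits within the 140-character limit
```

The real work for a production bot is in parsing the product feed and handling rate limits; the point of the sketch is just that the message-formatting side is trivial.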

I hope this Twitter trial is successful and expands to other offices soon!

FYI: How many people are on Twitter? It's always important to get a reality check: a recent Pew survey found that roughly 13% of online Americans use Twitter. Using various sources, this comes out to about 10% of the entire US population. In comparison, 38% of Americans are on Facebook. These aren't huge numbers, but they're likely larger than the current number of Americans who own Weather Radios, which, in my opinion, makes a project like this worth the effort.

Posted in Communication, Reviews, Weather | Tagged , , , | Comments Off on Welcome to Twitter, NWS!

Enhancing #wxreport with Skywarn spotters

I’ve been following the progress of the National Weather Service’s #wxreport Twitter project since its inception. In fact, I was researching the idea of using Twitter as an observation network of high-density (but low-quality) weather reports.

It seems the project website hasn’t been very active (though the RFC has been renewed through 2011), and I can guess why: while there may be some great reports, most of the tweets tagged with #wxreport are just plain noise. So how do we fix this? If only there were a way to verify people who were trained in some way to spot the weather, so that their reports carried more weight than others. Hm… Skywarn, anyone?

What if trained spotters could let their local WFO know their twitter username, so that forecasters knew they could trust the reports more than others? There are a few ways this could be done:

  1. NWS could set up a central registration page for users to log in and enter their Skywarn info (possibly with other information, like their current phone number/email address). Then, this data could be used in applications used by WFOs.
  2. Users could tweet “at” or direct message a special account with the necessary information. (For example, @nws_skywarn_register #SkywarnID #WFO #email could be sent by users, and a script could be run to catch and organize these tweets, with some manual QC, to establish a database.)
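The second option could be prototyped with very little code. As a sketch (the account name, hashtag order, and field names are all my invention, not a real NWS convention), a catcher script would just need a regular expression to pull the fields out of each registration tweet:

```python
import re

# Hypothetical registration format:
#   @nws_skywarn_register #SpotterID #WFO #email
REGISTRATION = re.compile(
    r"@nws_skywarn_register\s+"
    r"#(?P<spotter_id>\S+)\s+"      # spotter's Skywarn ID
    r"#(?P<wfo>[A-Za-z]{3})\s+"     # three-letter WFO identifier
    r"#(?P<email>\S+@\S+)"          # contact email
)

def parse_registration(tweet_text):
    """Return a dict of registration fields, or None if malformed."""
    m = REGISTRATION.search(tweet_text)
    return m.groupdict() if m else None

reg = parse_registration("@nws_skywarn_register #PA-1234 #PHI #jane@example.com")
# reg["wfo"] is "PHI"; malformed tweets come back as None for manual QC
```

Anything that fails to parse would land in the manual QC pile, so typos wouldn't silently corrupt the database.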

On a related note: I still don’t understand the Spotter Network, a non-profit organization started by Tyler Allison (of AllisonHouse) that “is in no way sanctioned or affiliated with the NWS nor any of the other government agencies.” They have their own training, unrelated to Skywarn (or so it seems). And if this confuses me, a senior meteorology student, I have no doubt others have been confused as well. I would like to see this program merged with the Skywarn program in some way to form one public-private partnership spotter network. I know this is tough given the current budget constraints, but I think it would eliminate some duplication while strengthening the NWS’s relationship with the private weather enterprise.

Posted in Communication, Weather | Tagged , | Comments Off on Enhancing #wxreport with Skywarn spotters

Watch/Warning/Advisory colors should be standard across all NWS platforms and partners

What color is a tornado warning? Red? Usually. Unless you use the default on GRLevelX products- then it’s pink. I also found a Memphis TV station that used bright orange.

How about a Severe Thunderstorm Warning? The National Weather Service website uses bright orange, while most media outlets use yellow. (GRLevelX Products use a default of red.)

And don’t even get me started on other products, like severe watches and winter products. Watch/Warning/Advisory colors are a mess across platforms, and I believe they should be standardized.

In doing my research for this post, I found I wasn’t alone. It turns out a *paper containing color guidelines was published by the American Meteorological Society’s Interactive Information Processing Systems (IIPS) Subcommittee for Color Guidelines in 1993. Under severe weather, they recommended all thunderstorm and tornado WWA products be displayed in red. However, that was nearly 20 years ago! (The age of the paper is evident in phrases like, “Consideration should be given to how the color set will map to gray shades if some users will ultimately view the color set in monochrome mode (for example, monochrome television or hard copy)…”) Still, the paper has some relevant points:

  • “The colors selected should be used consistently everywhere they are used (Travis 1991): for example, if green is used to represent landmasses in one place it should not be used to represent water bodies in another instance”
  • “Limit the number of different colors that are used in any single visual display or product.” (Ahem… NWS…)
  • “When selecting colors to represent various features or conditions, choose colors that have familiar relationships…”

While I don’t see inconsistent WWA color schemes across NWS partners as a critical safety issue, I don’t believe they’re helping convey important weather information, either! Luckily, at least one broadcast market has recognized the issue:

The local TV stations in Kansas City, MO got together and standardized their colors. While I haven’t been able to find any articles about this other than the video, I’m sure implementing these changes cost next to nothing.

Who should implement this? I think the National Weather Service has the authority to do this, at least as a strong recommendation to its customers. If not NWS, I think it would be a great thing for an AMS committee to develop guidelines.
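Whoever publishes it, the deliverable could be as simple as one shared lookup table that every platform renders from. The VTEC-style keys in this sketch are real product codes (TO = tornado, SV = severe thunderstorm, W = warning, A = watch), but the hex values are illustrative placeholders, not an official palette:

```python
# Illustrative standard palette keyed by VTEC phenomenon.significance.
# The product codes are real VTEC codes; the hex colors are placeholders
# standing in for whatever palette the standard body would choose.
WWA_COLORS = {
    "TO.W": "#FF0000",  # Tornado Warning
    "SV.W": "#FFA500",  # Severe Thunderstorm Warning
    "TO.A": "#FFFF00",  # Tornado Watch
    "SV.A": "#DB7093",  # Severe Thunderstorm Watch
    "WS.W": "#FF69B4",  # Winter Storm Warning
}

def color_for(phenomenon, significance):
    """Look up the display color for a product; gray for anything unmapped."""
    return WWA_COLORS.get(f"{phenomenon}.{significance}", "#808080")
```

If every vendor and broadcaster keyed off the same published table, a Tornado Warning would look identical on a TV crawl, a radar app, and the NWS website.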

What are your thoughts? In an informal, non-scientific poll amongst my peers I could not find one dissenting voice. What’s stopping this from happening?


*Guidelines for Using Color to Depict Meteorological Information: IIPS Subcommittee for Color Guidelines. Bulletin of the American Meteorological Society 74, no. 9 (1993): 1709–1713.

Posted in Communication, Weather | Tagged , , | 1 Comment

Where’s my storm chaser TV network?

Technology is amazing. Every day that there is severe weather, dozens of storm chasers stream their experience live via Severe Studios, ChaserTV, UStream, Livestream– you name it. The proliferation of affordable mobile broadband cards in recent years has made live streaming mobile video not only technically possible, but relatively affordable. This is great!

While I’ll be the first to admit I haven’t got much of a clue when it comes to “real” storm chasing in the Midwest, I do know that much of the time, chasing is not very exciting. It’s a lot of “hurry up and wait.” So when a severe outbreak is predicted, many chasers might be streaming while they wait for storms to fire. While it’s certainly exciting to be there in anticipation, it’s not the most entertaining thing to watch.

Once storms do begin to fire, it can be hard to keep up with what’s going on! Twitter will be on fire*, and while it’s a great tool to stay up to date, it can be overwhelming. It’s an overload of information… much like the first few days of March Madness! How do you stay up-to-date when there are so many details unfolding on so many scales?

I think a web-based Storm Chaser TV Network would be a great idea. Here’s what I’m thinking:

  • Who would host it? Ideally, experienced storm chasers or broadcasters who not only know the meteorology but also know the geography.
  • When would it air? It would have to be something with maybe a day’s notice, depending on the type of outbreak. It could be limited to days with a Moderate Risk, or it could be a daily thing (depending on the availability of those involved). I could see it airing from just before storms are expected to start in the afternoon until sunset.
  • What would the format be? I see a mix of in-depth severe analysis, community involvement, SPC Multimedia Briefings, call-ins from chasers, and, of course, live video of chasers. A live chat with a few IEM_Bots would be excellent as well.
  • Who would produce it? This is one of the more difficult questions to answer, but I think either of the two most popular live storm chase video sites (Severe Studios or ChaserTV) have the technical know-how to pull it off.
  • Why? It would be much more entertaining to watch a well put-together live production than to watch chasers drive around and wait. Why not showcase the best-of-the-best live streams with educational analysis? Plus, advertising revenue could be shared fairly among chasers; heck, I’m sure some people would pay a small amount to watch something like this.

What do you think? Too much work? Not a big enough market? Is this already happening? Feel free to leave your comments below or share them with me on Twitter.


*FYI, Google’s Realtime search, which searches Twitter, Facebook and Buzz (ha), is a very powerful social search engine. Not only does it get the latest data (and updates the page as updates roll in) but you can search back in time, and also narrow down your search by location.

Posted in Communication, Weather | Tagged , , , | Comments Off on Where’s my storm chaser TV network?