Sunday, December 9, 2018

Winter Storms and Flurries

High-impact weather events can either be annoyingly difficult to forecast...or a joy, depending on how you view it. While there is no right or wrong view on that, it can be an absolute nightmare when an event is forecast and doesn't occur, or vice versa. Several years ago, I wrote a post about the "Winter Storm that Never Was". The focus of that post was more about sharing (or not sharing) snowfall total maps from a single model run. For this post, I wanted to look a bit more at the forecasting aspect.

Seven to ten days ago, model guidance was hinting at the potential for a significant snowfall across portions of the Central/Southern Plains. Snow lovers rejoice! And, haters...well, it's winter.

One example of what some model guidance was suggesting (this was the GFS forecast from last Sunday)
Fast-forward to the present day, and what was supposed to be a big snow ended up being scattered flurries, at least for parts of the Plains. Across parts of the Southern Plains, winter impacts were still felt, though (just ask the fine folks in Lubbock).

NOHRSC Modeled Snow Depth for Dec 9, 2018
This wasn't a case of no snow occurring at all, but the actual swath of snow was quite a bit different compared to what many models were showing days in advance. The tough part with this forecast was that the models continued to suggest a higher-impact snow even up to 2-3 days or so in advance for areas that ended up seeing no flakes at all, or more of a wintry mix as opposed to all snow.

NAM Snowfall Forecast 2-3 Days Out
I worked leading up to, and during, this event. Several things stuck out to me and/or came up in conversation within our office and with neighboring offices.

1) Trends are your friend, but know when to lock in. While not completely consistent, there was a noticeable southward trend with successive model runs. The trick is knowing when to bite on a solution in the middle of a trend. This can be especially difficult when a trend continues well into the Watch/Warning/Advisory decision window. It's kind of like figuring out when to fill up on gas while prices are falling. You want to get the best price (forecast), but you don't want to run out of gas (miss the forecast). From a messaging standpoint, you want to give people as much lead time as possible while still balancing out the potential for crying wolf. The suggestion here is to be cautious with specific impacts if the models are in the middle of a consistent trend. If you can, try to wait until the guidance "levels off". This may be especially important when models show a drier, less snowy, or otherwise lower-impact trend for your particular area.

2) Consistency doesn't always equal higher confidence. There were several model runs in which the guidance was well clustered on QPF / snowfall amounts. Typically, this would equate to higher confidence for the forecaster. The catch is that run-to-run consistency on where the heaviest snow would fall wasn't always there. Consistency within one model cycle is great, but make sure to look at it in the context of previous runs.

3) I cannot stress this enough...don't let social media get to you. We can continue to educate folks and message events as best we can, but some things are simply misunderstood. Keep in mind, too, that the dreaded phrase, "They said...", while directed at us, likely includes non-meteorologists as well. John Q posting a 400-hr snowfall map from some model is probably getting lumped into people's view of the error in the forecast. We didn't post those maps, and yet we still get blamed. One suggestion is to take each event and, if you can, try to explain things to folks. I realize it won't always be received well, but don't give up trying. If that doesn't work, know when to just let it go.

4) Be honest with yourself. No matter how hard we try, we are going to bust at times. The models aren't perfect and neither are we. We all know this, but do we truly account for it in an honest post-event reflection? If you, personally, can do something better next time, then work at it. But, realize that even after considering all of the above suggestions on trends, messaging, science, and consistency...you will miss a forecast from time to time. Period. You are not alone...we all will miss forecasts. Richelle Goodrich said it well - “Many times what we perceive as an error or failure is actually a gift. And eventually we find that lessons learned from that discouraging experience prove to be of great worth.”

5) Postmortems! If you and/or your office do this already, great! If not, now is as good a time as any to start. It doesn't have to be a lengthy, detailed process. It could simply be an email pointing out which models did well or what stuck out to you during the event. Start a discussion. Figure out what went well and what didn't. Remember...be honest with yourself and as an office and learn from it. With each event, successful or not, we have an opportunity as individuals and as a team to improve. Make the most of each opportunity.

Forecasting the weather has its challenges, especially when high-impact events are at stake. What I'm learning is to implement change through lessons learned, to figure out how best to interpret model guidance in various scenarios, and to be honest with myself. But, don't take my word for it...give it a try for yourself!

Note: if there are things you have learned from forecasting high-impact events, then let me know and add to the discussion!

Thursday, December 6, 2018

Confessions of a Prideaholic

Six months ago, I started at my new office here in Wichita. On my first day, my MIC (Meteorologist-in-Charge) was giving me a run-down of the office - a who's who of the staff, mentioning various forecasters who would be good resources for radar, outreach, etc. In that moment, I immediately found myself wanting my name added to that list for future run-downs with new staff members.

Okaaay...so what's the big deal, you might ask. After all, what's wrong with shooting for doing your best and getting credit for it? For me, the problem is my motivation. I don't know where it came from or when it started, but somewhere down the line, I developed the meteorologist's version of a borderline superiority complex. When I first joined the National Weather Service (NWS) several years ago, I walked into that office acting like I had it all figured out. Turns out I didn't. Almost four years in, and those pesky thoughts of superiority keep trying to creep back in.

In my short time here at NWS Wichita (ICT), I am once again reminded that I don't know everything, that I'm not the best thing since sliced bread. But, here's the thing. Deep down, I've never actually believed that I know EVERYTHING, and yet if you could read my thoughts, you might think otherwise. The cause seems to be rooted in a poor self-assessment.

I have this tendency to analyze/assess people...their strengths and weaknesses, motivations, etc. Part of that is enjoyable, especially when I am able to help others figure out what may be driving a person to do a certain thing - almost like a detective. But, when it comes to self-assessment, my effective analyzing seems to go out the window at times. On one hand, there are things that I actually do well, but struggle to believe it. On the other hand, there are things that I believe I do well but, in reality, am not as good at.

Both sides create problems. Option "A" leads to an unrealistic lack of confidence which can cause others to believe I am not as good as I actually am at something. Think about the potential for missed opportunities there. Option "B" leads to an unrealistic surplus of confidence, potentially causing folks to trust me with something that is better suited for someone else. This can also cause me to miss out on opportunities, specifically opportunities to improve.

My self-assessment is worst where my pride is strongest. I can quickly say I am not the best fire weather forecaster out there because, frankly, I don't get wrapped up in what others think of my fire weather forecasting abilities, or lack thereof, and it isn't high on my passion list. But, ask me about convection, severe weather, or radar and that's a different ballgame. Those three are big on my passion list, and maybe pride is strongest where passion is greatest. Passion is a great asset, but with great passion comes great responsibility...a challenge to properly assess myself and others. I say 'and others' because pride can cause me to give a poor assessment not only of myself, but of others as well.

I believe it is always important to strive to do our best, whether something is high on our passion list or not. But equally important is the call to give ourselves and others a proper assessment. Be humbly confident in what you are good at, but honest enough to know where you could use some improvement. There will, in fact, be areas where you are better than someone else, but don't make that your goal. Instead, find ways to encourage others and help them succeed. Similarly, keep an open mind when people who are better than you at something come to help you succeed. Or, better yet, go ask someone more knowledgeable than you for guidance.

Being humble in this way and keeping our thoughts from going off the prideful deep end can aid in effective collaboration, improved service, and a stronger, more knowledgeable workforce.