Fall 2019 Update: We recorded an in-depth podcast with AMGA Ski Guide and avalanche expert Chris Brown on this same subject – check it out here!

An exploration into the human factors and heuristics that lead to avalanche incidents and our recommendations on: (1) how to overcome them and (2) how to improve how avalanche education courses teach them.

The American Institute for Avalanche Research and Education (AIARE) Avalanche courses are designed to train backcountry users (skiers, snowboarders, hikers, snowmobilers, etc.) in avalanche awareness to help them make more educated decisions in backcountry and high-risk winter alpine environments.

AIARE is a non-profit organization that began in the 1990s in an attempt to bring unification to the avalanche awareness and education courses that were being offered at the time. As backcountry use was becoming more popular, it was becoming more and more necessary to ensure that people understood the dangers involved and to provide a platform to educate people on how to: minimize risk, read terrain and snow conditions, and understand how to navigate avalanche terrain.

Since the course began, the program has continued to evolve. By studying the patterns involved in avalanche-related accidents, it became apparent that in addition to understanding the natural and physical conditions involved – terrain, snow, weather, equipment, etc. – there was also significant value in teaching the human factors that guide our decision-making process in avalanche terrain. A theory evolved that by providing education in our human biases and the heuristics that guide our decisions, backcountry safety could be improved, and thus fatalities could be decreased. These ideas culminated in a 2002 research paper by researcher Ian McCammon titled “Heuristic Traps in Recreational Avalanche Accidents: Evidence and Implications.” The findings from this research evolved into “The 6 Heuristic Traps in Avalanche Accidents (FACETS),” which are used today to educate participants in the human factors that cause avalanche accidents. We will dive deeper into these factors shortly. If you are an avid backcountry user, avalanche forecaster Bruce Tremper does an amazing job of covering how to handle these situations in “Staying Alive in Avalanche Terrain.” I highly recommend adding it to your library.

The current adaptation of the 3-day AIARE 1 course involves a series of pre-course modules and reading, an eight-hour classroom session, a day of applied field training in avalanche recovery and rescue, and a full day in a backcountry environment applying these skills. The curriculum is built on the concept that there are 3 major contributing factors in avalanche accidents: snow, weather, and human/heuristic factors. While the course still leans heavily on the first two, it takes an interesting dive into the human factors as well. As a behavioral design agency with a skiing problem (or is it the other way around?), this seems to be the appropriate place for us to dive in.

Amongst the pre-course technical modules and avalanche safety books is a recommendation to read Daniel Kahneman’s groundbreaking “Thinking, Fast and Slow.” If you have not read it, we also highly recommend it. It introduces the concept of System 1 and System 2 thinking.

System 1 thinking is fast and reactionary. It is the type of thinking involved in activities such as driving a car, or for this purpose, shredding down a slope on a pair of sticks. It operates automatically, intuitively, involuntarily, and effortlessly – “see tree, turn skis, feel bump, pivot body”. System 1 runs on heuristics – mental shortcuts learned from past experience – which makes it valuable in some situations but inaccurate in others. Because it relies on these shortcuts, it can also be gullible, biased, and prone to emotion.

System 2 thinking requires slowing down, deliberating, solving problems, reasoning, computing, focusing, concentrating, considering other data, and not jumping to quick conclusions. This is the type of thinking involved in solving a complex math problem, or in this instance, analyzing and planning a complex backcountry ski route through the Wasatch Mountains in Utah – “analyze maps and contours, review the avalanche forecast, scope out the terrain, research weather patterns”. System 2 is in charge of doubt and is prone to questioning rather than believing; however, it uses more mental energy and as such can be lazy and let System 1 take over.

So how does this correlate to backcountry safety? The more we understand our reactive versus analytical thinking, the more we can analyze our decision-making process and build our own system of checks and balances in high-risk environments. A heightened sense of awareness around our emotional and heuristic thinking can help us stop, think analytically, and make safer, less reactionary decisions unclouded by emotional judgment.

In this specific situation, a balance of System 1 and System 2 is the key. By intentionally tapping into System 2, we can mitigate the poor reactive decisions that lead to incidents. As mentioned before, System 2 should be used for trip and route planning. Analyzing the conditions and setting your approach for the day with this mindset is the basis for a lower-risk day. However, System 2 should not stop there, as it often does. In many of the incidents studied by McCammon and his peers, this step had been completed and a well-set plan was in place for the day. It was an emotional or heuristic decision made in the field that led to the incident. In these instances, a human factor was to blame.

In analyzing numerous incidents over the course of several years, McCammon was able to identify six specific heuristics, or human factors, that repeatedly led to incidents. Each of these factors is an example of System 1 taking over in the field and an emotionally charged decision overriding the analytical choice that System 2 would have made. He labeled these factors “The 6 Heuristic Traps in Avalanche Accidents (FACETS)”: F = Familiarity, A = Acceptance, C = Consistency, E = Expert Halo, T = Tracks (First Tracks), and S = Social Facilitation.

Let’s take a look at each of them and how we can overcome them.


Familiarity

Our past actions guide our behavior in a familiar setting. You are in your home terrain, you know these peaks, and you have skied them more times than you can remember. Despite being in avalanche terrain, you’ve never seen that couloir slide… so why would it now? There are obvious warning signs that would make you question that route in unfamiliar terrain, but this is your home turf and familiarity takes over. We gain a false sense of security when we know the area, and System 1 will let us overlook the obvious dangers.

Interestingly enough, this same concept can be seen in consumer behavior. People are more likely to buy a product from a brand they are familiar with, even if there are better options available. We create a rule of thumb that our past behavior of purchasing this specific brand’s product was most likely correct and should be repeated. Unlike at your local supermarket, though, this rule of thumb can be the difference between a safe day with friends and a life-or-limb situation.

The takeaway here: treat your own backyard as if it were a new environment every time. In reality, it is – snow conditions, weather, environmental factors, and your own abilities are constantly changing. Override your familiarity bias and analyze each route you choose with the same rigor you would apply anywhere else.


Acceptance

This one is pretty simple; we have dealt with it since day one on the playground. More commonly known as “peer pressure”, we tend to engage in activities that we think will get us noticed or impress people we like or respect.

We all do it. Our desire to impress the group can cause us to overlook warning signs. This can work two ways. You may find yourself suggesting higher-risk routes in an attempt to impress the group, or you may find yourself saying yes to another group member’s suggestion, even if you have reservations, so as not to be the “lame one”. It may sound elementary, but we are all susceptible to it.

The advice here is to always ask yourself: “Are the conditions safe?”, “Is this truly within my ability level?”, “Would I do this if I were by myself?”. Equally important, be a leader – if something feels off, speak up. Point out your reservations and explain the risk factors to your group. Chances are others in your group are falling victim to the same heuristic, and they may agree with you when you point out the specific risks, especially if you back them up with empirical evidence such as snow conditions.


Consistency

We tend to make decisions based on consistency. We look to our previous decisions to decide how we should behave in a current situation.

For example, you have created a trip plan and set a goal to ski a specific slope. This decision anchors your forthcoming decisions for the remainder of the trip and you are more likely to make subsequent decisions that align with that goal as opposed to formulating new ones that contradict it.

Along the way, you encounter some environmental warning signs that increase the risk of the plan you had previously made. Our natural biases and System 1 are more likely to tell us to keep going and stick to the original plan – we committed to skiing this slope and are willing to overlook the dangers by justifying it through consistency with the original plan.

To engage System 2 and help overcome this heuristic, try two things. First, pre-commit to set intervals at which you will stop and evaluate the current conditions. Second, consciously check for this bias whenever environmental conditions force you to stop and reassess. At each of these checkpoints, ask yourself: “Am I making the right decision, or am I making an emotional decision based on my goal to ski this slope regardless of the consequences?”

Expert Halo

We tend to assign trust to a leader, even if that leader is informal or not truly an expert. By associating a member of the group with the identity of leadership, our System 1 decision making falls back on this heuristic and assumes they are right. As the “leader”, our brain’s shortcutting system assigns them the badge of “expert”. In doing so, we allow their decisions to overpower our own along with those of others in the group, even if the decision is not the best option. It is cognitively easier to trust their judgment than it is to do our own deep analysis and challenge the “expert”.

To overcome this heuristic, have a group accountability discussion before taking off into the backcountry. Agree as a group that even though one member may be more familiar with the terrain, or more experienced, decisions should be discussed as a group.

Furthermore, don’t simply rely on your own ability or the ability of others to speak up. Commit to periodic check-ins and ask each member of the group if they are comfortable with the current decisions. Whenever a new plan or decision is made, look around and pick an individual, ask them directly if they are good with the decision at hand or if they have any reservations. Inversely, ask them to do the same for you. If one member of the group speaks up, it may provoke others to as well.

Tracks (Scarcity)

If you have ever been involved in a snow sport then you know this one very well – it goes by many names: powder fever, first tracks, etc. This heuristic revolves around the concept of scarcity. If we think something is limited, we are willing to risk more for it, we are willing to overlook the negatives, and we are more likely to make an impulsive decision to make sure we don’t miss out.

This is another heuristic that is heavily leveraged in consumer behavior. Think about it: the last time you booked a hotel room or a flight, I am willing to bet that the site you used had an “Act now, only 3 left at this price” or similar statement right next to the item you were looking at. You may not be aware of it, but there is a good chance this impacted your purchasing decision. Studies have shown that this method drives faster purchases.

Well, this same concept applies to snow. If we see 3’ of fresh, light, fluffy powder and the line we have been dreaming about is untracked, there is a solid chance that little will stop us from wanting to ski it. This desire is compounded by our full awareness of how rare and limited the opportunity is. With these two factors combined, we are willing to overlook the obvious warning signs to get it.

I would argue that this is one of the hardest heuristics to overcome. Having an awareness of why we want it so badly, and being able to stop and rationalize the risks based on the current forecast and environmental conditions, is a good start. Stop, think, engage System 2.

Social Facilitation

Behavioral gurus will recognize this as “Social Proof”. It’s a fairly simple concept: we look to others to determine how we should behave. It is easier to rationalize a poor decision when you are not the first to make it.

In the case of backcountry safety, it is easy to look at a slope that has been skied earlier that day, see that no avalanches were triggered and assume it is safe, even if the avalanche danger is high that day. The more tracks, the safer it must be – that slope must be the good one! This is not always the case; it could just as easily be that a lot of people who made poor decisions that day got lucky. They simply missed the trigger point.

In these situations, rather than blindly follow the crowd, revert back to your plan for the day. Examine the environmental factors you determined were in play that morning and determine if it is a good decision for you, regardless of how many others have successfully navigated that slope.

That rounds out the FACETS concept. Now let’s look at Reflective Thinking.

The current course relies on the individual instructor to drive education in the human factors during field training. My group was particularly lucky in that we had an instructor who was passionate about the power of these tools and used them throughout the course. One specific exercise that we did that I believe would be insightful for all backcountry users to adopt is the post-trip exercise of reflective thinking for better decision making. The concept is simple – ask yourself two questions upon returning from the backcountry:

  • Did I make good decisions, or did I get away with something? If your answer is the latter, then it is important to recognize it, identify the poor decisions you made, hold yourself accountable, and use that awareness to improve your decision-making process the next time.
  • What was the most dangerous situation you were in today? This question helps you identify where and when you may have put yourself in a questionable situation. In some cases, as was the case with our group, the answer may simply be “In the car driving to get to the trailhead”, and that’s ok.

With all of these factors and concepts in mind, our best defense in making better backcountry decisions is to stop, think, rationalize, act. Continuous, intentional decision making along the way, combined with a basic knowledge of these heuristic traps, keeps System 2 engaged. Don’t let System 2 take a backseat after the planning process. Engage it throughout the trip to re-analyze and re-validate your decisions based on changing field conditions. The trick is to avoid System 1–based emotional decisions.

Now, all this being said, System 1 is an amazing tool. It certainly has its uses in this environment, and it can save your life. Once you pull the trigger and commit, it is System 1 that takes over: it helps you carve, jump, drop, react naturally, and let loose. Should an avalanche occur, System 1 will be there, hopefully guiding you to safety.

Closing Thoughts and Recommendations

While I believe that the FACETS concept is a fantastic step toward using behavioral insights to mitigate avalanche accidents, I would like to make a few recommendations to simplify the concept from a learning standpoint. I would recommend that AIARE break the heuristics into 2 major human factors that lead to poor decisions – Social Factors and Emotional Factors – each with its own subset of heuristics. I would also recommend some adjustments to the terminology to make the traps clearer and more descriptive.

  1. Social Factors:
    • Social Proof – Basing our decisions off of the behavior of others and overlooking warning signs because someone else has already been there.
    • Social Acceptance – Making decisions to impress others and overlooking warning signs. (Formerly Acceptance. Reworded to avoid confusion with the concept of “acceptance of risk”. Risk acceptance is a personal choice – it is how much risk you are comfortable taking on for a reward, and we all have different levels of acceptance).
    • Expert Halo – Assuming the “Leader” is always right and not questioning their decisions despite warning signs.
  2. Emotional Factors:
    • Familiarity – Ignoring warning signs because you are familiar with the terrain and are comfortable being there.
    • Scarcity (First Tracks) – Ignoring warning signs because the conditions are “too good to pass up.”
    • Commitment & Escalation (Peak Fever) – Ignoring warning signs because we are committed to our previous decisions. This bias increases as we get closer to our destination – the further we go, the less likely we are to turn back. (Formerly Consistency. For more information on this concept, check out the “Goal Gradient Effect” – studies show that our motivation increases as we get closer to our goals).

I am still mulling over these suggestions and would love to hear your opinions on how to improve them and/or simplify them – reach out here with your recommendations, comment below, or email us at behavior@lanterngroup.com.

I believe the FACETS acronym is a great way to remember the idea but doesn’t fully dive into the depth of the human factors and leaves me wracking my brain to remember what A and T stand for. I also believe that tying them into the concept of “ignoring” or “overlooking warning signs” adds value because the act of engaging System 2 primarily revolves around intentionally stopping and looking for these signs.

AIARE has expressed a desire to ramp up the education around human factors to help teach better decision making, and we couldn’t agree more about the power of these tools! If you have taken the program or are involved in AIARE training – reach out now! We would love to hear your opinions on the effectiveness of the current training, any suggested improvements on how to increase awareness of human factors for safety in both backcountry and other high-risk environments, or your overall impression of the concept of applying human factors to these types of training.

Additional Insights

Below are some additional concepts that we believe could be integrated into these programs to increase their effectiveness.

Loss Aversion

Studies have shown that humans are loss averse. Loss aversion refers to the tendency of people to be more motivated by avoiding a loss than by acquiring an equivalent gain. This is particularly prevalent when it comes to finances – aka the green stuff.

Our recommendation – before embarking on your next backcountry trip, calculate the financial implications of an injury or rescue. A helicopter evacuation can be exorbitantly expensive, and the lost work and lingering costs associated with recovery can be even more. Write this number down, commit it to memory, and perhaps even put it at the top of your trip plan.

Every time you approach a decision to veer off your plan for the day, or come to a juncture that presents a surprise and leaves you wondering whether to continue or turn back, look at the number you wrote down and ask yourself whether the experiential gain of proceeding or veering would outweigh the financial loss of an incident.
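To make the exercise concrete, the back-of-the-envelope math can be sketched in a few lines of Python. Every figure below is a made-up placeholder, not a real quote – research the actual costs for your region and insurance coverage before your trip:

```python
# A minimal sketch of the "write the number down" exercise.
# All dollar figures are hypothetical placeholders.

def incident_cost(helicopter_evac, medical, lost_work_weeks, weekly_income):
    """Rough estimate of the financial downside of a backcountry incident."""
    return helicopter_evac + medical + lost_work_weeks * weekly_income

# Made-up numbers for illustration only.
cost = incident_cost(
    helicopter_evac=25_000,  # hypothetical evacuation fee
    medical=15_000,          # hypothetical out-of-pocket medical costs
    lost_work_weeks=8,       # hypothetical recovery time
    weekly_income=1_500,     # hypothetical lost income per week
)
print(f"Write this at the top of your trip plan: ${cost:,}")
```

With these placeholder inputs the estimate comes to $52,000 – the point is not precision, but having one vivid number to weigh against the thrill of a single line.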

Leveraging Social Proof & the Expert Halo

There is a positive power in some of the FACETS heuristics that have been identified in this research. Specifically, social proof and expert halo. Not only do we tend to replicate poor behaviors in social situations or when presented with evidence from a perceived expert, but we also tend to replicate positive ones.

We recommend that AIARE consider this when framing their education materials. An example of this is to provide social proof statements such as “90% of professionals create a comprehensive trip plan before entering the backcountry and a trip report upon completion.”**

Separating “Plans” and “Goals”

If we can separate our plan for the day from our goal for the day, we may be able to overcome some emotional biases. Think of your plan as the “empirical” approach to the day, and your goal as the “emotional approach”. We are more likely to overcome our biases on the empirical side than the emotional side.

Let’s look at an example. There is 6” of new snow, and you and three friends are heading into the Indian Peaks on the Front Range of Colorado. Your group intends to ski the Audubon Couloirs on the southeast face of Mount Audubon. None of you have skied it before and, no doubt, that’s exciting.

The traditional approach to creating a combined plan/goal for the day might look something like this. “Ski the Audubon Couloirs on the southeast face of Mount Audubon by skinning up via the Mitchell Lake/Blue Lake route.”*

In this instance, the “goal” defaults to the intent to ski the Audubon Couloirs, and our emotional biases/heuristics will commit to that goal. If we were to approach it as a split plan/goal, however, it may look something like this:

  • Plan (empirical): “Ski the Audubon Couloirs on the southeast face of Mount Audubon by skinning up via the Mitchell Lake/Blue lake route.”***
  • Goal (emotional): “Enjoy a fun ski tour through the Indian Peaks Wilderness and safely navigate back to our car as a group.”

Now that you have separated your plan and goal, you have assigned the more biased emotional portion of your decision making to the idea of “having fun and getting back safely” as opposed to the specific route you intend to ski. If you reach a point in the day where your plan and your goal no longer align, reconsider your plan and the empirical and environmental factors that align with it to achieve your goal.

For example, you get to the base of the couloirs and realize that there are roller balls coming down the face and the morning sun has cooked the southeast exposure, putting it at high risk of a wet loose slide into a terrain trap.

  • In the combined plan/goal scenario, your commitment and escalation (consistency) bias may kick in: goal gradient theory might push you to ski that line despite the dangers that have presented themselves.
  • In the separate plan/goal scenario, your emotions and biases may instead link themselves to the goal of having fun and getting home safe. In this instance, you may be more likely to engage System 2 and re-evaluate your empirical plan, choosing a more west-facing route that the sun has not yet cooked.

With this in mind, it may be valuable for AIARE to have a separate line item for “Plan your route” and “Trip Goal” in the pre-trip decision making guide that they offer and recommend using for all trips into the backcountry. It is not a magic cure, but it may just make the marginal difference that matters when it comes to a life or limb decision.

*Author’s note: I would conjecture from personal experience that the concept of Acceptance can go a step further. As a competitive person who prefers individual sports over team-based ones, I have often found myself looking inward for competition. In analyzing past behaviors while writing this, I have noticed that when I face a personal challenge that puts me at the edge of my abilities, I tend to push it over the edge even if no one else is around to impress.

This leaves me thinking: why? My assumption is that it is a form of “internal personal acceptance bias”. The thought of looking back and saying, “I could have done that” versus “I did”, while not as manipulative as social acceptance, could potentially be as dangerous. Studies have shown that we idealize our future selves; perhaps we do so to a point that makes us willing to make poor decisions in order to impress our future selves or live up to the image we hold of them. This is purely hypothetical, but I think it could be an interesting concept to study.

**Example only, the actual percentage is unknown.

***Abbreviated example – a full plan for the day would be far more complex and consider environmental conditions and more.

Cover Photo by Clement Delhaye on Unsplash