Scoring a Business Simulation Competition Is Not for Sissies

Unlike in a 2nd-grade soccer game, not everyone is a winner in a business simulation competition.  But even with grown-ups competing to run simulated companies, scoring a simulation is not a simple process.  The key is to have a clear, precise, and fair approach to determining the “winner”.

In the sections below, we’ve put together a few best practices that we use at PriSim to score our competitions and to reinforce the financial learning in our classes.  As a reference, take a look at a sample Strategic Performance Measures sheet we use for our aerospace and defense classes.

Choose a prize for the winning team

Putting up a prize for the winning team helps fan the flames of a rousing competition.  To get a sense of just how rousing a business simulation class can get, go to YouTube and watch any UFC fight-clip…

In our PriSim classes, we offer inexpensive chrome-plated “floating” pens as prizes to the members of the winning team.  We are always amazed at how much energy and competitiveness surrounds this humble prize and the bragging rights that come with it.  Some of our clients go further, offering Amazon gift cards to the winners, “grab bags” of tech goodies like phone cords and chargers, and even cash prizes.  One of our clients offered $100 gift cards to the winners – but, to our dismay, not to the instructors.

Regardless of the prize, be sure to repeat enthusiastically throughout the class that everyone is actually a winner, even the non-winners (don’t call them losers…), because everyone walks out with increased business acumen!

Define the process that will be used to determine the winner

The process for determining the winner needs to account for teams running their simulated companies using very different approaches and strategies.  But how can you compare a big company to a small company; a niche strategy to a cost-leader strategy; Walmart to Neiman Marcus?  Answer: you let the company pick the metrics!  Walmart would likely choose revenue and market share; Neiman Marcus, profit margin.  They’re both great companies, but they excel at different things and should be measured by different yardsticks.  We typically take a similar approach in our business simulation competitions – we let the teams choose the metrics by which they want their company to be measured.

The specific process we use at PriSim is a “weighted proportional ranking” rather than a simple ordinal ranking (see Page 2 of the sample scoring sheet above):

  1. Teams weight how they’d like to be measured on specific metrics (e.g., Sales, Profit Margin, ROE, CSI).
  2. Performance is then ranked proportionally (0–100) across all of the teams on each metric. If a team doesn’t choose a metric, that metric doesn’t help or hurt the team, but its result is still used to produce the proportional ranking for that metric.
  3. Each team’s weightings are then applied to its proportional scores on the metrics it chose.
  4. The total scores are calculated and compared to determine the winner (a quick sketch of the math follows this list).
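
To make the math concrete, here’s a minimal sketch in Python of the four steps above.  It’s an illustration under assumptions, not our actual Excel formulas – the team names and results are made up, and we read “ranked proportionally” as a simple min-max scaling in which the best result on a metric scores 100 and the worst scores 0.

```python
# A minimal sketch of weighted proportional ranking.
# Assumption: "proportional" = min-max scaling (best = 100, worst = 0).

def proportional_scores(results, higher_is_better=True):
    """Scale one metric's results to 0-100 across all teams."""
    lo, hi = min(results.values()), max(results.values())
    span = (hi - lo) or 1  # guard against division by zero on an all-way tie
    scores = {}
    for team, value in results.items():
        pct = (value - lo) / span * 100
        scores[team] = pct if higher_is_better else 100 - pct
    return scores

# Hypothetical final results for every team on every metric.  Every team's
# result feeds the 0-100 scaling, even for metrics the team didn't choose.
results = {
    "Sales":         {"Team A": 920.0, "Team B": 610.0, "Team C": 750.0},
    "Profit Margin": {"Team A": 4.1,   "Team B": 9.8,   "Team C": 6.5},
    "ROE":           {"Team A": 11.0,  "Team B": 15.5,  "Team C": 9.0},
}

# Each team's self-chosen weightings (each set must total 100).
weightings = {
    "Team A": {"Sales": 60, "Profit Margin": 40},
    "Team B": {"Profit Margin": 70, "ROE": 30},
    "Team C": {"Sales": 40, "Profit Margin": 30, "ROE": 30},
}

# Step 2: proportional 0-100 scores per metric, across all teams.
scaled = {metric: proportional_scores(vals) for metric, vals in results.items()}

# Steps 3 and 4: apply each team's weightings to its chosen metrics and total.
totals = {
    team: sum(w / 100 * scaled[metric][team] for metric, w in weights.items())
    for team, weights in weightings.items()
}

for team, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{team}: {total:.1f}")
```

In this toy run, the team that put its weight on the metrics it actually dominated comes out on top – exactly the behavior the process is designed to reward.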

We ask teams to choose their metrics in the second-to-last Round of the competition (e.g., the 3rd Round in a 4-Round competition).  That way, they commit to their metrics and to producing a result at their company over time, instead of waiting until the very end when it’s clear how the competition has played out.  We also ask the teams to sign their scoring sheets so there is no misunderstanding of what they’ve picked.

And yes, you can have ties – and that’s actually OK!  It’s unlikely, but if two teams happened to get exactly the same scores on the same metrics, or if both got top scores on different metrics, you could end up with a tie.  If you do, the class actually benefits by having two winning teams happily high-fiving at the end.

Define the specific scoring metrics to be used

The choice of metrics that you will allow for scoring is critical to the process.  You should choose metrics that are generated directly from the simulation and that you can easily track and monitor.  And look for metrics that reflect a “holistic” higher-level view of the company, not just one small part of the company that can be easily “gamed” or manipulated.

You will also need to decide whether the metrics will be “prescribed” with teams having limited or no choice in the matter, or if the teams will be allowed to choose their own metrics.  Some of our clients choose to prescribe metrics in alignment with their real-world goals and strategies.  However, in our PriSim classes, we prefer that teams select their own metrics that they think mesh well with the strategies and approaches they’ve used in running their simulated companies.  You can also use a hybrid approach, with some metrics required and others optional.

Be flexible and ready to adjust as needed.  There will be trial and error in this process, and your approach may evolve over time.  You will have smart people in your classes who will figure out ways to “manage” (a.k.a., game) the metrics and the process – and good for them!  Their feedback will allow you to tighten up the process for future classes.

Some tips on choosing metrics:

  1. Include non-financial metrics such as Customer Satisfaction or Lifetime Value for a customer perspective, and Employee Satisfaction or Attrition for an employee perspective.
  2. Allow an option for “Other – Instructor Approval Required” to encourage financial creativity from the teams and build comfort with financial ratios.
    1. Get ready for some interesting requests such as, “Can we use the highest debt as a metric?”
    2. Be ready to say “no”, and to give your reasons why. And don’t laugh out loud at the flimsy arguments that will be presented to you as the team begs for the metric…
    3. If you do want to try out a new metric but aren’t sure what the unexpected consequences could be, you can allow it within limits – say, 10% of the team’s total weighting. Make sure all the teams in the competition know that they can choose the new metric.  And if it works, offer it as a metric in your next class; if it doesn’t, at least you gave it a shot!
  3. List the rules and constraints on the scoring sheet. Our sample in the link above clearly states the constraints, including weighting limits, profit-metric requirements, and grounds for disqualification (though we’ve actually only DQ’d a team once).  For one way these constraints might be checked automatically, see the sketch after this list.
  4. PriSim’s simulations also have specific limits on certain operations to prevent “gaming”, such as limiting the amount of stock that can be issued/repurchased.
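
If you automate your scoring sheet, the constraints can also be checked mechanically when teams turn in their weightings.  Below is a hypothetical validator sketch – the specific rules (a required profit metric, a 10% cap on an experimental metric) are illustrative stand-ins, not our actual scoring-sheet rules.

```python
# A hypothetical weighting validator.  The metric names, the required
# profit metrics, and the 10% experimental cap are illustrative examples.

PROFIT_METRICS = {"Profit Margin", "Net Income", "ROE"}
EXPERIMENTAL_CAP = 10  # cap, in %, for an "Other - Instructor Approval" metric

def validate_weightings(weights, experimental_metric=None):
    """Return a list of rule violations; an empty list means the sheet passes."""
    problems = []
    if sum(weights.values()) != 100:
        problems.append("Weightings must total exactly 100%.")
    if not any(metric in PROFIT_METRICS for metric in weights):
        problems.append("At least one profit metric is required.")
    if experimental_metric and weights.get(experimental_metric, 0) > EXPERIMENTAL_CAP:
        problems.append(
            f"'{experimental_metric}' is experimental and capped at {EXPERIMENTAL_CAP}%."
        )
    return problems

# Example: a sheet that over-weights an instructor-approved experimental metric.
sheet = {"Sales": 50, "Profit Margin": 30, "Inventory Turns": 20}
print(validate_weightings(sheet, experimental_metric="Inventory Turns"))
```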

Use an Excel spreadsheet to automate data analysis and scoring

We recommend that you use an Excel spreadsheet to tally results, score each team, and determine the winner.  At PriSim, we’ve automated the process of transferring data from the simulation into our scoring sheet after each Round, analyzing the round results, and calculating the winner.  Contact us if you’d like us to walk you through our approach.
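
As a rough idea of what that automation can look like, here’s a minimal sketch of the data-transfer step in Python (rather than Excel).  It assumes a hypothetical CSV export with team, metric, and value columns; your simulation’s export format will differ.

```python
# A minimal sketch of pulling round results into the scoring pipeline.
# Assumes a hypothetical CSV export with columns: team, metric, value.
import csv
from collections import defaultdict

def load_results(path):
    """Group exported simulation results as {metric: {team: value}}."""
    results = defaultdict(dict)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            results[row["metric"]][row["team"]] = float(row["value"])
    return results

# results = load_results("round4_export.csv")  # hypothetical file name
# ...then feed `results` into the scoring sketch shown earlier.
```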

Show the scores, announce the winner, and get off the stage

Put some thought into when you will show the final scores of the competition.  In some of our classes, we announce the scores before the teams deliver their final “Earnings Call” presentations, which allows them to incorporate the results into their summaries.  But typically, we show the scores at the very end of the class so that we keep the participants’ attention and focus high right up to the end.

Be ready for questions and even some pushback from a team or participant now and again – the competitions can be intense, and non-winning teams can be too.  And we recommend that you don’t leave the final scores up on the screen for too long, to avoid “rubbing it in” to the non-winning teams…