Hey, thanks for your interest in how we calculate an experience's ranking score. It's at the core of Rankers so I'm pleased you're curious.
The ranking score percentage is used to compare and sort experiences in ranking tables. It is not necessarily a direct measurement of the quality of a particular experience as rated by its customers. I've found it a useful tool to allow me to find the best experiences with confidence. But I've also found it important to read the customer reviews before making any final judgements!
We calculate an experience's ranking score using a multi-factor data model instead of a raw data average (mean). This model considers several important questions. For instance: is there a trusted body of reviews? How old is each review, and does it come from a credible source?
Below you'll find details around some of the important factors that went into calculating the ranking score for Avalanche Peak track.
If you have any questions or comments about our ranking score calculation please get in touch at info@rankers.co.nz. We don't believe this is perfect or complete so we're always interested in ways we might make improvements.
33 Valid Reviews
The Avalanche Peak track experience has a total of 33 valid reviews. There are no invalid reviews that are excluded from the calculation. Reviews can be excluded only when a reviewer is not verified or after an investigation by our team determines the reviewer is not genuine.
Below is the distribution of ratings for the 33 valid reviews:
Rating | Count | Percentage
---|---|---
10/10 | 16 | 48%
9/10 | 6 | 18%
8/10 | 7 | 21%
7/10 | 4 | 12%
6/10 | 0 | 0%
5/10 | 0 | 0%
4/10 | 0 | 0%
3/10 | 0 | 0%
2/10 | 0 | 0%
1/10 | 0 | 0%
90.30% Average
The raw data average (mean) across all 33 valid reviews for the Avalanche Peak track is 90.30%. This value is not used to calculate the ranking score; it is provided here only as a comparison to the weighted average.
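For reference, this raw average can be reproduced directly from the rating distribution above. A minimal sketch in Python (the figures are simply those from the table):

```python
# Reproduce the raw (unweighted) mean from the rating distribution above.
distribution = {10: 16, 9: 6, 8: 7, 7: 4}  # rating out of 10 -> number of reviews

total_reviews = sum(distribution.values())                                    # 33
total_points = sum(rating * count for rating, count in distribution.items())  # 298
raw_mean = total_points / total_reviews * 10  # convert a /10 rating to a percentage

print(f"{raw_mean:.2f}%")  # 90.30%
```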
31 Face-to-Face Reviews
The Rankers team meets with travellers while they’re in New Zealand and conducts face-to-face surveys. These reviews, in our opinion, are the most trusted in the industry and represent a critical control sample. To our knowledge, we are the only travel review website in the world that has gone to this extent.
More about face-to-face reviews
Within the 33 valid reviews, the experience has 31 face-to-face reviews collected during interviews by our team.
Below is the distribution of ratings for the 31 face-to-face reviews:
Rating | Count | Percentage
---|---|---
10/10 | 14 | 45%
9/10 | 6 | 19%
8/10 | 7 | 23%
7/10 | 4 | 13%
6/10 | 0 | 0%
5/10 | 0 | 0%
4/10 | 0 | 0%
3/10 | 0 | 0%
2/10 | 0 | 0%
1/10 | 0 | 0%
89.68% Average
The raw data average (mean) across the 31 face-to-face reviews for the Avalanche Peak track is 89.68%. This value is not used to calculate the ranking score; it is provided here for comparison purposes only.
90.88%
Rankers calculates a weighted mean as a base average on which we can improve. Each individual review's rating is given a weight based on several factors, and that weight determines the overall impact the review has on the final weighted average.
Recent reviews have more weight as they are more relevant and reflect the experience as it currently operates. Over time reviews become less relevant and lose their impact on the ranking score.
Low-rated reviews carry slightly less weight. This dampens the effect of very low ratings across the board, which is especially important when an experience has few reviews overall and a single negative rating could grossly mischaracterise it. Consistently poor reviews will still result in the experience receiving a comparatively low ranking score.
Credible sources provide reviews that can be trusted. If we have verified a reviewer is genuine via a face-to-face meeting then the review carries additional weight.
Reviewer | Rating | Age | Relative Weight |
---|---|---|---|
Matthew Hall | 10/10 | 2683 days | 100% |
Sandra Piechoua | 10/10 | 3076 days | 73% |
Ramon Corbett | 10/10 | 3095 days | 72% |
Wolfgang Ellenrieder | 10/10 | 3320 days | 56% |
Daniel Weber | 8/10 | 3332 days | 54% |
Munne | 9/10 | 3355 days | 53% |
Isaliner and Mathieu | 7/10 | 3357 days | 49% |
Jared and Evi | 9/10 | 3399 days | 50% |
Sona | 10/10 | 3721 days | 28% |
Martin Kroek | 7/10 | 3746 days | 24% |
Stefanie | 10/10 | 4045 days | 6% |
Jan-Peter Stripp | 10/10 | 4045 days | 6% |
Simeon W | 9/10 | 4059 days | 5% |
Jana Rutkowski | 8/10 | 4063 days | 4% |
Dennis Philippi | 8/10 | 4063 days | 4% |
Philip Schumann | 10/10 | 4073 days | 4% |
Florent Bouillon | 9/10 | 4074 days | 4% |
Vera Kreipe | 10/10 | 4082 days | 3% |
Anouck Roudet | 10/10 | 4098 days | 2% |
Mayan Goddat | 10/10 | 4125 days | 0% |
Magdalene Zech | 8/10 | 4126 days | 0% |
Marco Newald | 10/10 | 4126 days | 0% |
Paul Liecke | 10/10 | 4444 days | 18% |
Stella Thoben | 10/10 | 4444 days | 18% |
Colin Webster | 8/10 | 4453 days | 17% |
Peter Adams | 7/10 | 4801 days | 16% |
R Straathof | 9/10 | 4823 days | 18% |
Jo Meekley | 8/10 | 5157 days | 17% |
Hadar Zevulun | 8/10 | 5172 days | 17% |
Manfred Liuduer | 9/10 | 5177 days | 18% |
Matthias A | 10/10 | 5540 days | 18% |
Brenda | 10/10 | 5564 days | 18% |
markus Kieper | 7/10 | 5923 days | 16% |
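To make the weighting more concrete, here is a simplified sketch of how a weighted mean along these lines can be computed. The half-life, dampening factor, and credibility bonus shown are illustrative values only, not the exact parameters we use:

```python
# A simplified sketch of a recency/rating/credibility weighted mean.
# The half-life, dampening factor, and credibility bonus are illustrative
# assumptions, not the actual Rankers parameters.

def review_weight(rating, age_days, face_to_face,
                  half_life_days=1500, low_rating_dampen=0.85,
                  credibility_bonus=1.25):
    """Return a relative weight for a single review."""
    recency = 0.5 ** (age_days / half_life_days)              # newer reviews count for more
    dampen = low_rating_dampen if rating <= 5 else 1.0        # soften very low ratings
    credibility = credibility_bonus if face_to_face else 1.0  # verified reviewers count for more
    return recency * dampen * credibility

def weighted_average(reviews):
    """reviews: list of (rating out of 10, age in days, face-to-face flag) tuples."""
    weights = [review_weight(rating, age, f2f) for rating, age, f2f in reviews]
    total_weight = sum(weights)
    if total_weight == 0:
        return None
    # Weighted mean of the ratings, expressed as a percentage.
    return sum(w * rating * 10 for w, (rating, _, _) in zip(weights, reviews)) / total_weight

# Example: a recent, verified 10/10 outweighs an old, unverified 7/10.
print(f"{weighted_average([(10, 200, True), (7, 4000, False)]):.2f}%")
```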
No Adjustment
Several adjustments may be applied to the weighted average to further improve relevancy and credibility. Avalanche Peak track does not meet the criteria for any of these adjustments to apply.
0.88% Adjustment
Every experience's review score is adjusted to balance out the disproportionate number of negative reviews that are contributed.
You won't be surprised to learn that disgruntled folk are more likely to leave a review than happy ones. They are motivated to share their experience and warn others. We consider this a good thing and it's why reading the reviews is important. However, we've learned it can misrepresent the experience as a whole.
We apply a balancing adjustment to counteract this effect and ensure the ranking score is a fairer representation of the experience. This adjustment is applied equally to all experiences.
92%
This is the final ranking score once the weighting, adjustments, and rounding have been applied. The value is recalculated each day and smoothed with a short rolling average, so it may not exactly match the other values presented here.
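As a rough, simplified illustration of how the figures above fit together (the daily rolling average is ignored and the balancing adjustment is treated as a simple addition to the weighted average, so take this as indicative only):

```python
# Indicative arithmetic only: assumes the balancing adjustment is simply
# added to the weighted average before rounding, and ignores the daily
# rolling average, so the published score can differ slightly.
weighted_average = 90.88      # weighted mean from above
balancing_adjustment = 0.88   # balancing adjustment from above
final_score = round(weighted_average + balancing_adjustment)  # 91.76 -> 92

print(f"{final_score}%")  # 92%
```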