Ranking Score Explained

Hi there, thanks for your interest in how we calculate an experience's ranking score. It's at the core of Rankers, so I'm pleased you're curious.

The ranking score percentage is used to compare and sort experiences in ranking tables. It is not necessarily a direct measurement of the quality of a particular experience as rated by its customers. I've found it a useful tool for finding the best experiences with confidence, but I've also found it important to read the customer reviews before making any final judgements!

We calculate an experience's ranking score using a multi-factor data model instead of a raw data average (mean). This model takes several important questions into account. For instance: is there a trusted body of reviews? How old is each review, and does it come from a credible source?

Below you'll find details of some of the important factors that went into calculating the ranking score for Double Bay.

If you have any questions or comments about our ranking score calculation please get in touch at info@rankers.co.nz. We don't believe this is perfect or complete so we're always interested in ways we might make improvements.


Nick Morrison

Rankers owner

Double Bay

Valid Reviews

34 Valid Reviews

The Double Bay experience has a total of 35 reviews. There are 34 valid reviews that are included when calculating the ranking score and 1 invalid review that is excluded from the calculation. Reviews are excluded only when a reviewer is not verified or when an investigation by our team determines the reviewer is not genuine.

Below is the distribution of ratings for the 34 valid reviews:

Rating   Count   Percentage
10/10    6       18%
9/10     15      44%
8/10     4       12%
7/10     6       18%
6/10     1       3%
5/10     0       0%
4/10     0       0%
3/10     0       0%
2/10     1       3%
1/10     1       3%

81.76% Average

The raw data average (mean) for all the Double Bay valid reviews is 81.76%, based on 34 valid reviews. This value is not used to calculate the ranking score and is provided here only as a comparison to the weighted average.
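As a quick sanity check, the raw average can be reproduced directly from the distribution table above. The sketch below is an illustration only (Python, with the counts copied from the table), not the code Rankers runs:

```python
# Reproducing the raw (unweighted) mean from the valid-review distribution above.
distribution = {10: 6, 9: 15, 8: 4, 7: 6, 6: 1, 5: 0, 4: 0, 3: 0, 2: 1, 1: 1}

total_reviews = sum(distribution.values())                                   # 34 valid reviews
total_points = sum(rating * count for rating, count in distribution.items())  # 278 points
raw_average = total_points / total_reviews * 10                              # /10 rating -> percentage

print(f"{raw_average:.2f}%")  # -> 81.76%
```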

Face-to-Face Reviews

17 Face-to-Face Reviews

The Rankers team meets with travellers while they’re in New Zealand and conducts face-to-face surveys. These reviews, in our opinion, are the most trusted in the industry and represent a critical control sample. To our knowledge, we are the only travel review website in the world that has gone to this extent.


Within the 34 valid reviews, the experience has 17 face-to-face reviews collected during interviews by our team.

Below is the distribution of ratings for the 17 face-to-face reviews:

Rating   Count   Percentage
10/10    4       24%
9/10     5       29%
8/10     2       12%
7/10     4       24%
6/10     1       6%
5/10     0       0%
4/10     0       0%
3/10     0       0%
2/10     0       0%
1/10     1       6%

80.00% Average

The raw data average (mean) for all the Double Bay face-to-face reviews is 80.00%, based on 17 face-to-face reviews. This value is not used to calculate the ranking score and is provided here for comparison purposes only.

Weighted Average

84.30%

Rankers calculates a weighted mean as a base average on which we can improve. Each review's rating is given a weight based on several factors, and that weight determines how much impact the review has on the final weighted average.

Recent reviews have more weight as they are more relevant and reflect the experience as it currently operates. Over time, reviews become less relevant and lose their impact on the ranking score.

Low-rating reviews carry slightly less weight. This dampens the effect of very low ratings for every experience across the board, which is especially important when an experience has few reviews overall and a single negative rating could grossly mischaracterise it. Consistently poor reviews will still result in the experience receiving a comparatively low ranking score.

Credible sources provide reviews that can be trusted. If we have verified that a reviewer is genuine via a face-to-face meeting, the review carries additional weight.
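The exact weighting formula isn't published on this page, so the snippet below is purely a hypothetical illustration of the three factors just described: age decay, dampening of low ratings, and a face-to-face credibility boost. The half-life, dampening factor, and boost values are assumptions made for the sake of the example, not Rankers' real parameters.

```python
def review_weight(age_days: float, rating: int, face_to_face: bool) -> float:
    """Hypothetical weight combining review age, rating level and credibility."""
    # Assumed exponential decay: recent reviews count more, old ones fade away.
    half_life_days = 365.0
    age_factor = 0.5 ** (age_days / half_life_days)

    # Assumed dampening: very low ratings carry slightly less weight so a single
    # harsh review cannot grossly mischaracterise an experience with few reviews.
    rating_factor = 1.0 if rating >= 5 else 0.8

    # Assumed boost for reviewers verified face-to-face by the Rankers team.
    credibility_factor = 1.2 if face_to_face else 1.0

    return 100.0 * age_factor * rating_factor * credibility_factor

# Example: a 54-day-old 9/10 review from an unverified reviewer.
print(round(review_weight(54, 9, False), 2))  # -> 90.25 with these assumed parameters
```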

Reviewer                Rating   Age (days)   Face-to-Face   Weight   Relative Weight
Lia                     9/10     54                          99.54    100%
Gus                     9/10     419                         72.18    72%
Ella                    7/10     511                         53.35    52%
The Weathersons         7/10     585          yes            42.06    41%
Michael Pitman          10/10    645          yes            37.09    36%
Stefan Hohmann          8/10     679                         30.80    29%
Alice Addy              9/10     884                         12.05    10%
Peter Moore             10/10    1037                        5.53     3%
Marina                  9/10     1328                        4.60     2%
R Werder                8/10     1346                        4.34     2%
Ron Web                 7/10     1423                        4.04     2%
Dan Young               9/10     1454                        4.39     2%
Alan White              9/10     1454                        4.39     2%
Mike Merrick            9/10     1666                        4.02     2%
Matthias Bohmert        6/10     1694         yes            3.30     1%
Lieselotte Michels      9/10     1700                        3.96     2%
Jurgen Moors            2/10     1734                        2.93     1%
David and Stephan       10/10    1737         yes            3.90     2%
Sophie James            7/10     1744         yes            3.54     1%
Joanna Doran            9/10     1751                        3.88     2%
Greg Gilmore            9/10     1995         yes            3.46     1%
Fiona Hawkins           9/10     1995         yes            3.46     1%
Alex                    10/10    2011         yes            3.43     1%
Granjon                 8/10     2034         yes            3.22     1%
Maria Moller Hansen     9/10     2044         yes            3.38     1%
Rebecca Luke            9/10     2047         yes            3.37     1%
Emilie Chanbon          7/10     2064         yes            3.04     1%
Sina Sacranie           9/10     2073         yes            3.33     1%
Sanne Sprogoe Borgaa    10/10    2073         yes            3.33     1%
Lars Borgaa             8/10     2073         yes            3.16     1%
Lara Mroseck            1/10     2090         yes            2.41     0%
Robert Geissler         7/10     2090         yes            3.00     1%
David Diaper            9/10     2306                        2.93     1%
Will and Taylor         10/10    2398                        2.77     0%
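For readers who want to check the arithmetic, the weighted average is the weight-weighted mean of the ratings: multiply each rating by its weight, sum the results, and divide by the sum of the weights. The sketch below is a Python illustration (not Rankers' own code) using the rounded (rating, weight) pairs from the table above, in the same order; it reproduces the published 84.30% to within rounding.

```python
# (rating out of 10, weight) pairs copied from the table above, top to bottom.
reviews = [
    (9, 99.54), (9, 72.18), (7, 53.35), (7, 42.06), (10, 37.09),
    (8, 30.80), (9, 12.05), (10, 5.53), (9, 4.60), (8, 4.34),
    (7, 4.04), (9, 4.39), (9, 4.39), (9, 4.02), (6, 3.30),
    (9, 3.96), (2, 2.93), (10, 3.90), (7, 3.54), (9, 3.88),
    (9, 3.46), (9, 3.46), (10, 3.43), (8, 3.22), (9, 3.38),
    (9, 3.37), (7, 3.04), (9, 3.33), (10, 3.33), (8, 3.16),
    (1, 2.41), (7, 3.00), (9, 2.93), (10, 2.77),
]

weighted_sum = sum(rating * weight for rating, weight in reviews)
total_weight = sum(weight for _, weight in reviews)
weighted_average = weighted_sum / total_weight * 10  # /10 rating -> percentage

print(f"{weighted_average:.2f}%")  # -> 84.30%
```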

Adjustments

No Adjustment

Several adjustments may be applied to the weighted average to improve relevance and credibility. Double Bay does not meet the criteria for any of these adjustments, so none apply.

Final Ranking Score

84%

This is the final ranking score once rounding has been applied. The value is cached and recalculated each day, so it may not precisely match the other values presented on this page.

If you have any questions or comments about our ranking score calculation please get in touch at info@rankers.co.nz.