Ranking Score Explained

Hi there, thanks for your interest in how we calculate an experience's ranking score. It's at the core of Rankers so I'm pleased you're curious.

The ranking score percentage is used to compare and sort experiences in ranking tables. It is not necessarily a direct measurement of the quality of a particular experience as rated by its customers. I've found it a useful tool for finding the best experiences with confidence, but I've also found it important to read the customer reviews before making any final judgements!

We calculate an experience's ranking score using a multi-factor data model instead of a raw data average (mean). This model takes several important questions into account. For instance: is there a trusted body of reviews? How old is each review, and does it come from a credible source?

Below you'll find details about some of the important factors that went into calculating the ranking score for Double Bay.

If you have any questions or comments about our ranking score calculation please get in touch at info@rankers.co.nz. We don't believe this is perfect or complete so we're always interested in ways we might make improvements.

Nick Morrison

Rankers owner

Double Bay

Valid Reviews

35 Valid Reviews

The Double Bay experience has a total of 36 reviews. There are 35 valid reviews that are included when calculating the ranking score and 1 invalid review that is excluded from the calculation. Reviews are excluded only when a reviewer is not verified or when an investigation by our team determines the reviewer is not genuine.

Below is the distribution of ratings for the 35 valid reviews:

Rating   Count   Percentage
10/10        7          20%
 9/10       15          43%
 8/10        4          11%
 7/10        6          17%
 6/10        1           3%
 5/10        0           0%
 4/10        0           0%
 3/10        0           0%
 2/10        1           3%
 1/10        1           3%

82.29% Average

The raw data average (mean) for all the Double Bay valid reviews is 82.29%, based on 35 valid reviews. This value is not used to calculate the ranking score; it is provided here only as a comparison to the weighted average.
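The raw mean above can be reproduced directly from the rating distribution. A minimal sketch in Python (the distribution values are the ones from the table above):

```python
# Reproduce the raw (unweighted) mean for Double Bay from the
# published rating distribution: {rating out of 10: review count}.
distribution = {10: 7, 9: 15, 8: 4, 7: 6, 6: 1, 5: 0, 4: 0, 3: 0, 2: 1, 1: 1}

total_reviews = sum(distribution.values())                   # 35 valid reviews
total_points = sum(r * c for r, c in distribution.items())   # 288 rating points

raw_mean = total_points / total_reviews * 10                 # as a percentage
print(f"{raw_mean:.2f}%")                                    # → 82.29%
```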

Face-to-Face Reviews

17 Face-to-Face Reviews

The Rankers team meets with travellers while they’re in New Zealand and conducts face-to-face surveys. These reviews, in our opinion, are the most trusted in the industry and represent a critical control sample. To our knowledge, we are the only travel review website in the world that has gone to this extent.

Within the 35 valid reviews, the experience has 17 face-to-face reviews collected during interviews by our team.

Below is the distribution of ratings for the 17 face-to-face reviews:

Rating   Count   Percentage
10/10        4          24%
 9/10        5          29%
 8/10        2          12%
 7/10        4          24%
 6/10        1           6%
 5/10        0           0%
 4/10        0           0%
 3/10        0           0%
 2/10        0           0%
 1/10        1           6%

80.00% Average

The raw data average (mean) for all the Double Bay face-to-face reviews is 80.00%, based on 17 face-to-face reviews. This value is not used to calculate the ranking score; it is provided here only for comparison purposes.

Weighted Average

89.23%

Rankers calculates a weighted mean as a base average on which we can improve. Each review's rating is given a weight based on several factors, and that weight determines how much impact the review has on the final weighted average.

Recent reviews have more weight because they are more relevant and reflect the experience as it currently operates. Over time reviews become less relevant and lose their impact on the ranking score.

Low-rating reviews carry slightly less weight. This dampens the effect of very low ratings for every experience across the board, which is especially important when an experience has few reviews overall and a single negative rating could grossly mischaracterise it. Consistently poor reviews will still result in a comparatively low ranking score.

Credible sources provide reviews that can be trusted. If we have verified a reviewer is genuine via a face-to-face meeting then the review carries additional weight.
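To make the three factors concrete, here is a sketch of how such a weight function could combine them. The half-life, dampening multiplier, and face-to-face boost below are illustrative assumptions, not Rankers' actual (unpublished) parameters:

```python
# Illustrative only: HALF_LIFE_DAYS, LOW_RATING_DAMPING, and
# FACE_TO_FACE_BOOST are assumed values, not Rankers' real parameters.
HALF_LIFE_DAYS = 365       # assumed: a review's weight halves every year
LOW_RATING_DAMPING = 0.8   # assumed: multiplier applied to ratings of 3/10 or less
FACE_TO_FACE_BOOST = 1.5   # assumed: multiplier for face-to-face-verified reviewers

def review_weight(rating: int, age_days: int, face_to_face: bool) -> float:
    """Combine recency, rating level, and credibility into one weight."""
    weight = 0.5 ** (age_days / HALF_LIFE_DAYS)   # recency: exponential decay
    if rating <= 3:
        weight *= LOW_RATING_DAMPING              # dampen very low ratings
    if face_to_face:
        weight *= FACE_TO_FACE_BOOST              # boost credible sources
    return weight

# A recent review counts far more than an old one with the same rating:
print(review_weight(9, 150, False))   # ≈ 0.75
print(review_weight(9, 1500, False))  # ≈ 0.06
```

Whatever the real parameters are, the shape is the same: recency dominates, while the dampening and credibility factors nudge the weight up or down.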

Reviewer Rating Age Face-to-Face Weight Relative Weight
Cath 10/10 152 days 96.34 100%
Lia 9/10 305 days 85.26 88%
Gus 9/10 670 days 33.62 34%
Ella 7/10 762 days 20.54 20%
The Weathersons 7/10 836 days true 14.22 13%
Michael Pitman 10/10 896 days true 11.28 10%
Stefan Hohmann 8/10 930 days 8.85 7%
Alice Addy 9/10 1135 days 4.92 3%
Peter Moore 10/10 1288 days 4.62 3%
Marina 9/10 1579 days 4.05 2%
R Werder 8/10 1597 days 3.82 2%
Ron Web 7/10 1674 days 3.52 2%
Dan Young 9/10 1705 days 3.81 2%
Alan White 9/10 1705 days 3.81 2%
Mike Merrick 9/10 1917 days 3.39 2%
Matthias Bohmert 6/10 1945 days true 2.77 1%
Lieselotte Michels 9/10 1951 days 3.32 2%
Jurgen Moors 2/10 1985 days 2.44 1%
David and Stephan 10/10 1988 days true 3.25 1%
Sophie James 7/10 1995 days true 2.95 1%
Joanna Doran 9/10 2002 days 3.23 1%
Greg Gilmore 9/10 2246 days true 2.75 1%
Fiona Hawkins 9/10 2246 days true 2.75 1%
Alex 10/10 2262 days true 2.72 1%
Granjon 8/10 2285 days true 2.54 1%
Maria Moller Hansen 9/10 2295 days true 2.65 1%
Rebecca Luke 9/10 2298 days true 2.65 1%
Emilie Chanbon 7/10 2315 days true 2.38 1%
Sina Sacranie 9/10 2324 days true 2.59 1%
Sanne Sprogoe Borgaa 10/10 2324 days true 2.59 1%
Lars Borgaa 8/10 2324 days true 2.47 1%
Lara Mroseck 1/10 2341 days true 1.87 0%
Robert Geissler 7/10 2341 days true 2.33 0%
David Diaper 9/10 2557 days 2.14 0%
Will and Taylor 10/10 2649 days 1.96 0%
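The weighted average of 89.23% can be checked from the table: multiply each rating by its weight, sum, and divide by the total weight. A sketch using the (rating, weight) pairs published above:

```python
# (rating out of 10, weight) for each of the 35 valid reviews,
# in the same order as the table above.
reviews = [
    (10, 96.34), (9, 85.26), (9, 33.62), (7, 20.54), (7, 14.22),
    (10, 11.28), (8, 8.85), (9, 4.92), (10, 4.62), (9, 4.05),
    (8, 3.82), (7, 3.52), (9, 3.81), (9, 3.81), (9, 3.39),
    (6, 2.77), (9, 3.32), (2, 2.44), (10, 3.25), (7, 2.95),
    (9, 3.23), (9, 2.75), (9, 2.75), (10, 2.72), (8, 2.54),
    (9, 2.65), (9, 2.65), (7, 2.38), (9, 2.59), (10, 2.59),
    (8, 2.47), (1, 1.87), (7, 2.33), (9, 2.14), (10, 1.96),
]

# Weighted mean = sum(weight * rating) / sum(weight)
weighted_mean = sum(w * r for r, w in reviews) / sum(w for _, w in reviews)
print(f"{weighted_mean * 10:.2f}%")  # → 89.23%
```

Note how the top few recent reviews carry most of the total weight, so they effectively determine the score.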

Adjustments

No Adjustment

Several adjustments may be applied to the weighted average to improve relevancy and credibility. Double Bay does not meet the criteria for any of these adjustments.

Final Ranking Score

89%

This is the final ranking score after rounding. The value is cached and recalculated once each day, so it may not precisely match the other values presented here.

If you have any questions or comments about our ranking score calculation please get in touch at info@rankers.co.nz.