nz blogocracy | FAQ: "the ranking system is a mess"
Re: Kiwiblogblog query.
(I must respond here because I simply will not register to make comments on someone's blog, I'm afraid; we don't force people to register on this site in order to participate, and personally I think the practice should be discouraged.)
Q. For no apparent reason, the blogs are scored on:
+ the number of unique visitors they get each day from people with the Alexa Toolbar installed, or unique visitors according to the site’s meter if available
+ the number of posts a week (not a day? not a month? not a fortnight?)
+ the incoming links (number of blogs linking to your blog in the last 6 months) by Technorati
+ incoming links as scored by Truth Laid Bear (many blogs, like us, are not signed up with them, so score = 0)
I can’t for the life of me figure out why an incoming link every 6 months should be worth 2 posts a week or two unique visitors a day. I don’t know what an appropriate conversion rate would be; indeed, it’s impossible to determine an objective one, but this rate is just arbitrary.
A. The reason is to rate blogs on how good they are. The ratios are necessarily arbitrary as it is an artificial formula based on our nevertheless considered opinion of what constitutes a good blog. The formula combines and quantifies different measurements of blog performance to create a total score that can be used to rank the set. "Objective" is an unhelpful word to employ in understanding this formula.
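To make the formula concrete, here is a minimal sketch of how a combined score of this kind could be computed. The equal weighting below is an illustrative assumption for the sketch, not a statement of the actual ratios the ranking uses:

```python
# Illustrative sketch of a combined blog score.
# Equal weights are an assumption here; the real ranking's
# ratios are a considered editorial choice, as described above.

def blog_score(unique_visitors_per_day, posts_per_week,
               technorati_links_6mo, tlb_links):
    """Combine the four public measurements into one total.

    Blogs not signed up with Truth Laid Bear simply contribute 0
    on that component, as noted in the question.
    """
    return (unique_visitors_per_day
            + posts_per_week
            + technorati_links_6mo
            + tlb_links)

# Hypothetical blog: 120 daily uniques, 10 posts a week,
# 40 Technorati inlinks, not registered with Truth Laid Bear.
print(blog_score(120, 10, 40, 0))  # prints 170
```

Changing the weights changes the ordering, which is exactly the questioner's point; the answer is that some set of weights must be chosen, and the chosen set reflects an opinion of what makes a good blog.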
We are trying to assess a blog's popularity and quality. To do this without making personal judgements about each blog's editorial content, or any other aspect assessable only by professional opinion, we confined our evaluation to:
public support (ie. traffic)
peer support (ie. links incoming from other blogs)
editorial frequency (ie. posts per week)
so we may discover their value relative to one another within the nz blogosphere.
We want this to be done with:
transparency (ie. the data must be publicly available)
verifiability (ie. every bit of data will be linked to where available)
simplicity (eg. an easily understood equation; comments are too difficult to count, but posts can often be counted simply so posts are used and comments are not)
"An appropriate conversion rate" or weighting is problematic. Some acknowledgment of editorial output seems reasonable, and peer support through incoming links seems an even better prospect, but by far the most important measure must ultimately be traffic.
Newspapers are rated via readership surveys or sales figures, TV uses electronic set meters for ratings, and radio uses diaries; the nz blogosphere, being on the internet, should be measured in ways appropriate to, and available on, the internet.
Q. Furthermore, what is a unique visitor? An IP address? Yet many organisations share one IP address between many internet users, while many individuals go through several IP addresses a day (at work, at home, Bluetooth, and more if they have dial-up). Simply counting up the number of IP addresses a day does not tell you how many people are visiting a site.
The major problem I think, however, is relying on Alexa’s ranking of sites for the bulk of the points in each blog’s ‘score’.
The Alexa system works by recording the sites visited and page-views from everybody who has an Alexa toolbar running, and tabulating them into a rough guide to who’s going where on the internet. But there’s something weird when you look at the stats Alexa produces: the numbers jag about hugely between days.
A. The reliability of Alexa stats has been dealt with empirically. The Alexa score used is relatively stable because it is not a snapshot but an average. Over a short polling period, day-to-day variance is normal, remembering that weekends have significantly less traffic than weekdays. Alexa is also a very good predictor at the higher end, as I have discovered. No alternative method has yet been found.
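The point about averaging can be shown with a toy example. The daily figures below are made up for illustration, but they demonstrate why a weekly average is far more stable than any single day's snapshot:

```python
# Made-up daily reach figures for one week, including a weekend dip.
# An individual day jags about; the average over the period does not.
daily_reach = [35, 120, 90, 15, 110, 95, 40]

average = sum(daily_reach) / len(daily_reach)
print(round(average, 1))  # prints 72.1
```

A single-day snapshot from this series could report anything from 15 to 120, while the average sits near the middle regardless of which day the list starts on.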
Q. Nzblogosphere should give away the Alexa toolbar as a measure of blogs’ performance. A far more accurate and fair solution would be to ask those blogs who wish to be ranked to sign up to a program like Google Analytics and make the results available.
A. And the comprehensive nature of the list, and of what constitutes the nz blogosphere, would be destroyed. We include a blog's stats where available. Some bloggers have sent me screen grabs of their stats or the stats they have published themselves, and I have used that data. Many bloggers are reluctant to produce the goods, inter alia Public Address and The Standard, and the fact they haven't might *maybe, possibly* indicate that they are being over-counted. The one thing we would never want to do is discount any blog's existence, which means we must include as many as possible. The original base for the list was my own searching and the exploration of the blog rolls of the top 50.