This morning Stuff published a story about the number of followers that various party leaders have. Within hours there were those questioning whether the story was being pushed from the PM's Office.
I had a read of the story, and when it comes to Twitter, it is pretty light on analysis. So I thought I would have a bit of a dig. The party leaders covered are John Key, David Cunliffe, Russel Norman, Metiria Turei, Colin Craig, Jamie Whyte, Laila Harré, Pita Sharples, Tariana Turia and Hone Harawira. Stuff decided to exclude Hone as they could only find an account that had been inactive for a number of years, although there is this account in his name that is still active. We will ignore that omission for the moment: the account has 651 followers, and I am excluding any account with under 1,000 followers as I feel that is too small a sample.
So the graphic that Stuff included in their article was this: (image links to story, screen grab from here)
So the immediate thing you notice is the massive number of followers that John Key has compared to everyone else. John Key has 10 followers for every one of Russel Norman's and 11 for every one of David Cunliffe's. However, some have questioned whether all of these followers are real:
Now the page that Dovil linked to in their tweet, the Twitter Audit for John Key, had data that was over a year old. The way Twitter Audit works is that if you want up-to-date info on an account that isn't yours, it shows the info from the last time it was calculated, either by the account holder or by someone willing to pay US$4 for the information. Which, as Dovil pointed out, for most people is not worth it. But Twitter Audit makes this clear.
I was going to use this information, with a caveat that it was a year old. But when I went to do the screen grabs, I noticed something interesting:
Someone had updated John Key's, and on closer inspection they have also updated all the other party leaders I am going to be looking at, apart from David Cunliffe and Russel Norman. Though I am checking them as I write this blog, in case that changes.
Trying to measure certain aspects of social media and social media engagement is rather difficult. So I am going to use three different websites to help provide information: Twitter Audit, Status People's Fakers and Klout. There are also some stats from Twitonomy.
Twitter Audit gives you a score of how many of the accounts that follow the account being audited are "real". Their methodology is stated on their website:
Each audit takes a random sample of 5000 Twitter followers for a user and calculates a score for each follower. This score is based on number of tweets, date of the last tweet, and ratio of followers to friends. We use these scores to determine whether any given user is real or fake. Of course, this scoring method is not perfect but it is a good way to tell if someone with lots of followers is likely to have increased their follower count by inorganic, fraudulent, or dishonest means.
They don’t give a full picture of how each of these elements is used, but as with the other tools being used, the same method will be applied for each person we look at, so any inaccuracies should be similar across each account.
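To make that concrete, here is a minimal sketch of what a follower-scoring heuristic along those lines might look like. The thresholds and weights below are my own guesses for illustration only; Twitter Audit does not publish theirs.

```python
from datetime import datetime, timezone

# Hypothetical heuristic using the three signals Twitter Audit names:
# number of tweets, date of the last tweet, and follower/friend ratio.
# All thresholds here are invented for illustration.
def looks_real(tweet_count, last_tweet, followers, friends,
               now=datetime(2014, 8, 1, tzinfo=timezone.utc)):
    """Return True if a follower looks like a real, active account."""
    score = 0
    if tweet_count >= 50:                           # has actually tweeted
        score += 1
    if (now - last_tweet).days <= 90:               # tweeted recently
        score += 1
    if friends == 0 or followers / friends >= 0.1:  # not follow-spamming
        score += 1
    return score >= 2                               # majority of signals pass

# An account that tweets often and recently, with a sane ratio:
active = looks_real(1200, datetime(2014, 7, 20, tzinfo=timezone.utc), 300, 400)
# A dormant account following thousands but followed by almost no one:
dormant = looks_real(3, datetime(2012, 1, 5, tzinfo=timezone.utc), 2, 4000)
print(active, dormant)  # → True False
```

Run over a random sample of 5,000 followers, the share of accounts passing a check like this would give the audit percentage.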
*Screen grabs listed in same order as Stuff list accounts in above graph*
So using the data that Stuff provides, I have adjusted the follower figures for each of the party leaders using the audit score. This doesn't change the order at all, but it does cluster some of them a bit closer together. As you can see, both above and below, accounts with under 3,500 followers all have audit scores above 80%, whereas four of the six accounts above 4,500 followers have scores below 80%. Both of the accounts over that figure with audit scores above 80% have audits that are over a year old, and both have grown substantially in followers since their last audit.
*Table: Twitter followers | Twitter followers adjusted for Twitter Audit*
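The adjustment itself is just multiplying each raw follower count by the audit percentage. A quick sketch, using placeholder numbers rather than the actual figures from the screen grabs:

```python
# Illustrative only: the follower counts and audit percentages below are
# placeholders, not the actual figures from the screen grabs.
followers = {"Leader A": 110_000, "Leader B": 10_000, "Leader C": 9_000}
audit_pct = {"Leader A": 0.45, "Leader B": 0.85, "Leader C": 0.90}

# Adjusted followers = raw followers x share judged "real" by the audit.
adjusted = {name: round(followers[name] * audit_pct[name]) for name in followers}
ranked = sorted(adjusted, key=adjusted.get, reverse=True)
print(adjusted)  # → {'Leader A': 49500, 'Leader B': 8500, 'Leader C': 8100}
print(ranked)    # → ['Leader A', 'Leader B', 'Leader C']
```

As with the real data, a big account with a low audit score can shrink dramatically while the overall ordering stays the same.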
The next measure is Status People's Fakers. This divides an account's followers into three groups: fake, inactive and good. Because Twitter is a non-mutual network, unlike Facebook (someone can follow you without you following them back), it is a lot harder for account holders to see which of their followers are active and cull those who aren't. (Fakers limits you to three free searches per account, but I have access to a number of accounts, hence being able to get all these screen grabs. Nor does it give discrete addresses for each account searched, so no links on these screen grabs.)
So when we adjust for the percentage of good accounts following each account, things still look pretty similar order-wise, but the numbers are different, some significantly so.
*Table: Twitter followers | Twitter followers adjusted by Fakers | Twitter followers adjusted for Twitter Audit*
So it would appear that Fakers is a lot harder on accounts when it comes to proving that their followers aren't fake. I am not in a position to judge which service is right, but there is not a huge change in order; the only shift is Pita Sharples and Colin Craig swapping places. The stand-out figure has to be John Key's, where both services have his number of followers taking a drop of more than 50%.
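Checking whether the order changes between two adjusted lists is easy enough to do programmatically. A sketch, again with placeholder figures (only the Sharples/Craig swap mirrors what the real data showed):

```python
# Placeholder scores for illustration; not the actual adjusted figures.
def ranking(scores):
    """Return names sorted from highest to lowest score."""
    return sorted(scores, key=scores.get, reverse=True)

audit_adjusted = {"Key": 49_500, "Norman": 9_000, "Sharples": 3_000, "Craig": 2_800}
fakers_adjusted = {"Key": 40_000, "Norman": 8_000, "Sharples": 2_500, "Craig": 2_700}

a, b = ranking(audit_adjusted), ranking(fakers_adjusted)
# Names whose position differs between the two rankings:
moved = [name for name in a if a.index(name) != b.index(name)]
print(moved)  # → ['Sharples', 'Craig']
```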
The thing about all of these figures is that they are based on the assumption that follower count is the be-all and end-all of social media. But we all know it isn't. None of us enjoy following accounts that don't interact, with the obvious exception of news accounts pushing out links. So is follower count a fair way to measure success? I don't think it is.
So what measures are available to assess aspects other than just followers? Klout attempts to do so. It looks at a number of factors concerning engagement. Again, the weight it places on different elements is not totally clear, but the same weighting will be applied across the accounts concerned. Not all accounts are measured on Klout, but I have included all the accounts I could find.
So the results are pretty consistent order-wise. I do want to mention that John Key's Klout score includes his Facebook page, whereas the other accounts' don't. This will elevate his score. What we can see is that despite his apparent huge lead when it comes to raw follower numbers, John Key doesn't have a massive lead when we look at engagement. This indicates that the huge follower numbers are not translating into engagement. This could be because of the content being produced, but as laid out above, it is just as likely, if not more likely, to be influenced by the fact that many of his 110,000 followers are not active, so they aren't even seeing the content, let alone sharing it. I am surprised at the low score of Metiria, who normally appears to be reasonably active and gets a solid number of retweets. But this could be an artefact of the people I follow, who may not be a fair indication of the wider following she has.
The final thing I want to look at is Twitonomy stats.
So what you can see here is that Russel, David, Metiria, Laila and Peter Dunne are all engaging with the people who follow them: between 20% and 53% of their tweets are replies. Compare that with the likes of Colin Craig, John Key, Tariana and Pita, who are all below 2%, and Colin's is only that high because he has sent so few tweets. John Key obviously gets a lot of retweets, but this will partly be driven by his lack of replies. By that I mean that more of his tweets are going to be retweet-worthy to followers: most people won't retweet many tweets sent at them, but if an account is simply putting out promotional material, chances are there is going to be at least one person who finds a tweet worth retweeting.
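Twitonomy's reply percentage can be approximated with a simple heuristic: treat any tweet that starts with an @mention as a reply, and report replies as a share of all tweets. A sketch, with made-up sample tweets:

```python
# Rough version of the Twitonomy reply stat: a tweet beginning with an
# @mention is counted as a reply.
def reply_percentage(tweets):
    """Percentage of tweets that are replies."""
    if not tweets:
        return 0.0
    replies = sum(1 for t in tweets if t.startswith("@"))
    return 100 * replies / len(tweets)

sample = [
    "@someone thanks, good point",
    "Out campaigning in Epsom today",
    "@another see the link in my bio",
    "New policy announcement at 2pm",
]
print(reply_percentage(sample))  # → 50.0
```

An account tweeting only promotional material would score near 0% on this measure, which is the pattern described above.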
One thing that needs pointing out is that when you sign up for a Twitter account, Twitter suggests a number of accounts for you to follow. These are based partly on the number of followers accounts have, and on whether they are verified. So accounts with lots of followers are more likely to be suggested than accounts with fewer. But on the flip side, those accounts will attract more followers who use Twitter for a few days or weeks and then never log in again. So I am not suggesting that any of the accounts above have bought followers or employed bots to boost their numbers; that is a lot harder to prove. But what can be seen is that pure follower numbers are not useful in and of themselves.
So it appears that Stuff have fallen into the typical trap of "oh my god, followers! The account must be successful!" But follower count is a pretty inaccurate way to judge accounts, especially when you are talking about accounts with tens, if not hundreds of thousands, of followers. If you want to judge how successful an account is, there are a number of ways to do that, but it takes a little bit of digging and explaining. I can understand why people were questioning why this story was published by Stuff; it comes across as a lazy attempt to lay out the current state of social media in New Zealand politics. It has elicited a number of reactions, from those trumpeting John Key's success, like David Farrar, to those like Dovil pointing out the issues with the numbers. But there is much more to running a successful social media presence than just gaining numbers, and even those numbers can be misleading.