What metrics / outcomes mattered to you at the time? Top 2 are Pageviews and Links
How did it perform in terms of those metrics / outcomes? To date, 174,612 pageviews, and 180 linking root domains (via Open Site Explorer)
Was there a reason why it performed so well?
To be fair, the post benefited greatly by being published on a high-visibility platform (Moz).
Most of the traffic has come from a steady stream of search. Last month alone saw 10,000+ visits. The keyword targeting wasn't intentional (we actually do very little keyword research at Moz), but the topic seemed to hit a nerve.
The post is useful and often referenced
The post has been updated to keep it current
What were the takeaways from this experience? Make a useful resource, and try to do it better than anything that currently exists.
At Moz, we actually use dozens of tools to report and share metrics from almost every aspect of our business. This includes standard SaaS metrics such as conversion, retention, and cash flow, as well as other performance metrics such as website uptime and customer service stats. We even report on the amount of food Mozzers consume in the office each week!
Each department takes responsibility for reporting, and the information is distributed both through emails and recorded on our intranet. It's not unusual to receive 10-15 status emails each week.
For core software subscription metrics we use a custom-built system called Gizmo, which tracks new free trials, vesting rates, and churn, among other things. Gizmo also helps visualize our customer acquisition channel. I'm not sure how much I'm allowed to show you, so here is a screenshot:
Staff can also subscribe to automated emails that Gizmo sends out each morning detailing metrics for that day, week, month and year.
If that weren't enough, each month the entire company participates in Town Hall and All Hands meetings where each department head and the company's executive staff share detailed metrics on nearly every aspect of the business. Here's a slide from a Town Hall presentation I will give later today:
@edfryed In my limited research, anchor text seems to play a significant role. I'm doing some experiments and trying to feed Google the proper clues to trigger these answer boxes - so far not much luck. I wrote a little about it here: http://moz.com/blog/feeding-the-hummingbird
I don't know everything about the BrightEdge story, in part because the company itself has been mostly silent on the issue.
Perhaps I'm jumping on the anti-lawsuit bandwagon and letting my emotions get the better of me. Perhaps BrightEdge has legitimate complaints, SearchMetrics acted with malice, and of course they should be sued!
I don't know. But I do know the practice of patenting and suing over broad analytics and SEO techniques is not something I want to see encouraged. I've read through the patents, and many of them feel overly broad. For now, at least until we get all the facts, I'd like to err on the side, as Russ says, of investing in innovation instead of litigation.
Actually, I agree that "The Death of" is a little hyperbolic, but it was the first title that came to my head, and it stuck. I could have used "The Demise of" and been less dramatic. ;)
One thing to remember is that the default filter in GWT Query Reports is set to "Web". This filter should be set to "All" to include image, mobile, video, etc. queries - which should, in theory, produce more accurate results. I'm not sure why GWT sets this as the default, as it's easy for most folks to miss.
That said, even when you set the filter to "All", this doesn't mean there is a perfect 1:1 relationship between GWT data and Google Analytics. In my own testing, there are still significant discrepancies.
WordStream - Fair enough, I totally understand your point. But for years SEOs and even Google reps have distinguished between "good" linkbait and "bad" linkbait. There's got to be more granularity. On the other hand, I always tend to think of "linkable assets" as good and having value beyond linkbait, hence use of the word. http://www.mattcutts.com/blog/seo-advice-linkbait-and-linkbaiting/
As for the SEO > Inbound comment... aw screw it. I'm not going to get into it :)
The tool ranks your friends and pages relative to your engagement with them. Presumably this internal algorithm helps determine whether a friend's post or activity will show up in your feed.
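The scoring behind a tool like this is presumably some kind of weighted engagement tally. Here's a minimal sketch of that idea - the signal names and weights are my own invention for illustration, not the tool's (or Facebook's) actual algorithm:

```python
# Hypothetical engagement scoring -- the signals and weights below are
# invented for illustration; the real algorithm is not public.

WEIGHTS = {"likes": 1.0, "comments": 2.0, "messages": 3.0}

def engagement_score(interactions):
    """Sum each interaction count times its assumed weight."""
    return sum(WEIGHTS.get(kind, 0.0) * count
               for kind, count in interactions.items())

def rank_friends(friends):
    """Return (name, score) pairs, highest engagement first."""
    scored = [(name, engagement_score(ix)) for name, ix in friends.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

friends = {
    "alice": {"likes": 10, "comments": 3},  # 10*1 + 3*2 = 16
    "bob":   {"likes": 2, "messages": 5},   # 2*1 + 5*3 = 17
}
print(rank_friends(friends))  # [('bob', 17.0), ('alice', 16.0)]
```

A feed could then surface posts from whoever sits at the top of that ranking.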
Sorry to hear you found it too long. In user testing, 90% of participants finished it in 10 minutes or less, which we found was a big improvement over past years. I fear if we reduced it any further, the data wouldn't be as actionable.
I think the discrepancies in the numbers may be due to the fact that the API checks each post individually. (I can't say for sure, because we don't support that API and didn't intend it for public consumption.) If you look at the totals for all blog posts, we actually see something like 55K visits a day on the blog.
So I think what we are looking at is a portion of Moz traffic. If you look at this screenshot, http://i.imgur.com/dfFcE9O.png it actually lines up with your experiment, just on a different scale.
While the data is accurate (the correlation between +1's and higher rankings is pretty amazing!), if I could do it over again I would have crafted a different title, considering 80% of folks see the headline but only 20% read the article.
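Studies like this typically report a rank correlation between a feature (here, +1 counts) and search position. A small sketch of that calculation on made-up numbers - this is the general technique, not Moz's actual study code or data:

```python
# Illustrative rank correlation (Spearman's rho) on hypothetical data --
# not the actual methodology or figures from the Moz study.

def ranks(xs):
    """1-based average ranks; tied values share their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation computed on the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical: +1 counts for pages in positions 1..5 (position 1 = best).
plus_ones = [120, 95, 60, 30, 10]
positions = [1, 2, 3, 4, 5]
print(round(spearman(plus_ones, positions), 2))  # -1.0 in this toy data
```

A rho near -1 here means more +1's go with better (numerically lower) positions - correlation, of course, not causation.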
Okay, this is fun. Here's mine:
Title: Must-Have Social Meta Tags for Twitter, Google+, Facebook and More
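For reference, the kinds of tags that post covers look roughly like this - the values are placeholders, and the exact recommended set may have changed since publication:

```html
<!-- Twitter Cards -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:site" content="@example">
<meta name="twitter:title" content="Page Title">
<meta name="twitter:description" content="Short page description.">
<meta name="twitter:image" content="http://example.com/image.jpg">

<!-- Open Graph (Facebook and others) -->
<meta property="og:title" content="Page Title">
<meta property="og:type" content="article">
<meta property="og:url" content="http://example.com/page.html">
<meta property="og:image" content="http://example.com/image.jpg">
<meta property="og:description" content="Short page description.">

<!-- Schema.org microdata (used by Google+) -->
<meta itemprop="name" content="Page Title">
<meta itemprop="description" content="Short page description.">
<meta itemprop="image" content="http://example.com/image.jpg">
```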
- Hit the blue "M" icon in your upper right: http://screencast.com/t/7PpEamBQ OR:
- Use the nifty shortcut we put in by hitting ctrl+alt+shift+M
@edfryed :)
Looking forward to it. If there is anything we can do to help, don't hesitate to ask.
Haven't seen anything from Google directly, which is making me start to suspect the rumors may be true :)
Quite a few folks within Google debunking this article: https://plus.google.com/+ChrisLang/posts/ZQkwzH5ufWV
Cyrus here. To an extent, the answer box queries Moz triggers don't seem to affect traffic much, e.g. https://www.google.com/search?q=title+tag+length
Thank goodness Moz isn't in the weather prediction business: https://www.google.com/search?q=seattle+weather
I actually haven't seen this before so I thought it worth sharing. Amazing and slightly creepy how much Google knows about our location history.
I've done 'nighttime link building' with at least half of these guys and let me tell you, it wasn't worth it.
Cool project Yousaf!
It's #86 :)