Absolutely agree, Takeshi; this was a key point in the summary:
"The preference should be for the agency to prove their ability through
references and case studies, and ensuring that the relationship between
agency and client has a really tight set of objectives as well as very
good reporting in place for evaluation. It's also important that there
be a contract with no long-term tie-ins, and an easy break clause to
activate."
Big thumbs up for making this actionable rather than another huge list of seemingly random points of the kind many people write up after a conference!
Love the time you've taken on this to add in the other variables, etc. As I started reading, I immediately thought: hmmm, what about the number of words?
I think this could be really useful, as in the past I've made forecasting errors looking at highly generic, low-competition keywords. I incorrectly assumed they would be easy to target directly, and then got washed out by all the other content indirectly targeting the same keyword.
Nice stuff; I love to see an analytical approach. It might not be perfect, and may never be, but it gives you numbers to base decisions on, or to explain why you made them.
Thanks for sharing
Hi Patrick, very good point, and yes, overall control is a big thing. As an agency we don't do full web development (we do have developers, but just for SEO), but if an agency had a contract for both web dev and SEO, it would be much better placed to offer this type of deal.
It definitely should be a cause for concern (internal and external teams working together); it requires very strong management, and my view is that one team has to be given the power of final say. In my opinion, that would normally be the external expert.
Thanks, Kristian - some really great opinions from both sides of the fence. We've already had quite a few additional experts offer their insights, so we will be updating this as time goes on.
Quote from the article (below): "Of course it was from using PostJoint! Clean up all posts that you have created using the system and get a recon request in."
The question is: how did this 16% get on Google's hit list? Although the timing coincides with our penalty, we cannot conclusively say this is down to using PostJoint.
Not so different after all, but it's interesting to see they believe that, due to the lack of a footprint, their users are not at risk of the penalty being passed on (see the comments in the article).
Has anyone used it who can say whether these claims are correct?
Nice to see some actual first-hand experience rather than another set of best practices. Very useful, thanks.
Good term, 'white hattedness'; I know exactly that room and that argument.
Phenomenal stuff
Very nicely executed! So much better than a standard infographic.