We’ve heard it before about mobile or video… this year will be THE year. The year it matures, comes of age and becomes a key ‘marketing tool’.
I have a growing hatred of Google’s “Questions and Answers” feature for Google My Business. Let me tell you why. We (both SEO folk and general web users) do a lot for Google, I think. Granted, they do a lot for us – but they have some of the brightest minds on the planet, teamed with machine learning and near-unlimited computing power. Yet some people using Google don’t even know if a zoo is a zoo. I’m not kidding. Read on.
1. It’s Google wanting us to do their job for them. Again
So we’re over here making information more accessible to them – putting it online in the first place, marking it up as they want, ensuring Google can discover it – or we review things, let them use our location data, identify areas of images for them… these lists could go on and on. Now, with Q&A on GMB, we’re having to provide answers to some pretty silly questions, police and report dodgy questions, or promote good ones. I don’t think we should have to.
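To make the ‘marking it up as they want’ point concrete, here’s a minimal sketch of the kind of schema.org structured data Google asks local businesses to publish. Python is used purely for illustration, and every business detail below is made up:

```python
import json

# A hypothetical local business, marked up the way Google asks for it
# (schema.org LocalBusiness as JSON-LD). All details here are invented.
markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Zoo",  # a zoo that is, we promise, a zoo
    "url": "https://www.example.com",
    "telephone": "+44 1234 567890",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Exampletown",
        "postalCode": "EX1 2MP",
        "addressCountry": "GB",
    },
    "openingHours": "Mo-Su 09:00-17:00",
}

# The output would normally sit on the page inside a
# <script type="application/ld+json"> tag for Google to pick up.
print(json.dumps(markup, indent=2))
```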
If you’re in SEO or a related field you’ve probably got Screaming Frog at your disposal – maybe a cloud-based crawler like DeepCrawl too. Well, there’s a new kid on the block.
He’s installable, a little cocky and not afraid of a swear word or two…
From the horse’s mouth, their aim with Sitebulb is:
To help small agencies and consultants do amazing technical SEO work, at scale, using desktop software.
And, whilst it’s sometimes hard to get people to shift over to a newcomer when the incumbent (SF) has such a strong level of adoption – I genuinely believe Sitebulb will have no trouble picking up a tonne of customers at all levels. It looks good, isn’t expensive, works really well and has all sorts of cool stuff to make your job easier and your clients happier (note: Patrick offers this as a personal guarantee if you pay the money and sign up…I *think*…)
So what cool stuff is there to play with I hear you ask…?
Crawl maps, oh yeah!
My absolute favourite is the Crawl Maps feature. Check it out…
Cool, huh? The map above shows a client site; the atom-like blob to the far left is the homepage, surrounded by product/category pages one hop from it, and the hub to the right is their blog. The long ‘whips’ coming off the blog show pagination in action.
This stuff is hard to visualise. You can mess about with other tools to get somewhere close – we’ve used Gephi for this in the past – but the crawl maps in Sitebulb are much more accessible and faster to get to. Without visualising a site’s structure you can miss things and not fully understand why Google’s doing what it’s doing, and that’s never good :/
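If you’re curious about the general idea behind this kind of map (to be clear, not Sitebulb’s actual implementation), here’s a rough sketch assuming you’ve exported a crawl’s internal links as (source, target) pairs and have networkx and matplotlib installed. The URLs are made up:

```python
import networkx as nx
import matplotlib.pyplot as plt

# Hypothetical (source, target) internal link pairs exported from a crawl.
edges = [
    ("/", "/category-a"), ("/", "/category-b"), ("/", "/blog"),
    ("/category-a", "/product-1"), ("/category-a", "/product-2"),
    ("/blog", "/blog/page-2"),
    ("/blog/page-2", "/blog/page-3"),  # pagination forms the long 'whips'
]

graph = nx.DiGraph(edges)

# A force-directed layout pulls densely interlinked pages into the
# 'atom-like' clusters you see in a crawl map.
pos = nx.spring_layout(graph, seed=42)
nx.draw(graph, pos, with_labels=True, node_size=300, font_size=7)
plt.show()
```

On a real site with thousands of URLs you’d skip the labels and let the clusters do the talking – which is essentially what Gephi forced us to do by hand, and what Sitebulb now does in a couple of clicks.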
Hints are actually useful *shocker*
Hints are probably underused in most SEO tools by the intermediate to expert level folks. Why? They’re usually dumb – or at least not intelligent (the hints, I mean, not the folks) – and so by simply following what a hint or statement says verbatim you can fuck things up. Lots. Entry-level SEOs get themselves into all sorts of pointless arguments with devs by ‘telling them’ what a tool has told them 😉
The hints in Sitebulb are better. They just are – download it and try it for yourself.
Finally, the release notes are honest & hilarious
No more boring “Bug fixes” notes… oh no, my friend. Patrick has made the release notes (long may they continue post-launch!) a joy to receive and read. You know what they’ve changed, why they’ve changed it and whose fuck-up it was 😉 Plenty of puns, in-jokes and references. Please have a play with it and drop the guys a line on Twitter if you have any questions.
The folk over at Majestic.com have been hard at work restructuring their data centre so they can bring us even more juicy link data! So, what’s new?…
Outbound link data – Internal/External Links & Overall Domains
Previously the tool would show you the number of individual links/domains that point to a URL/subdomain/domain of your choosing. That’s undoubtedly amazingly useful info and something we use almost daily. Now Majestic have added data showing how many internal and external pages & domains each linking page points out to.
To show an example, the screenshot below shows that a site I ran through the beta has a link from the MillionDollarHomepage. On the far right you can see that the MillionDollarHomepage links to 8 internal pages (FAQs, Buy Pixels, etc.) as well as 1,040 external pages – a total of 1,048 outbound links. Of those 1,048, there are 989 unique domains. Really useful stuff…
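As a quick illustration of how those columns relate to each other, here’s a toy sketch of the arithmetic. This has nothing to do with Majestic’s API – the URLs are invented and the list is tiny, but the classification is the same idea:

```python
from urllib.parse import urlparse

# Invented outbound links found on a single linking page.
linking_page_domain = "milliondollarhomepage.com"
outbound_links = [
    "http://milliondollarhomepage.com/faq",         # internal
    "http://milliondollarhomepage.com/buy-pixels",  # internal
    "http://example-advertiser-1.com/",             # external
    "http://example-advertiser-2.com/offer",        # external
    "http://example-advertiser-2.com/about",        # external, same domain as above
]

internal = [u for u in outbound_links
            if urlparse(u).netloc.endswith(linking_page_domain)]
external = [u for u in outbound_links if u not in internal]

total = len(internal) + len(external)   # the screenshot's 8 + 1,040 = 1,048
domains = {urlparse(u).netloc for u in outbound_links}  # its 989 unique domains

print(len(internal), len(external), total, len(domains))  # 2 3 5 3
```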
Title of linking page & language detection
Another nice touch is the addition of the title tag of the linking page – you can see this below on the lovely Neil Patel’s site…
Also in that screenshot you can see the new language detection function working, the small ‘EN’ below the TrustFlow figure denoting that it’s an English language page…not sure they’ve installed a bullshit detector yet 😉
Finally, the page layout has been redesigned with a new, wider default view – see the comparison below:
It’s not quite fully responsive (due to the sheer amount of data they need to show I suppose), but who’s looking at tonnes of link data on their mobile?! You? Me..? Yeah OK, that would be useful – one for the future maybe 😉
You can read the full announcement on the Majestic site.
Majestic is a tool I find myself using on an almost daily basis. Great for domain analysis (before acquiring), competitor analysis, pitch work and all manner of other things. But one thing I hadn’t done with Majestic until recently is use it to monitor things on an ongoing basis – like a campaign.
Yes, links from governing bodies and industry associations are useful, of course they are. But getting links from them isn’t always easy. This post is less about how to get them and more about how to get the right ones and make them useful.
By now you’ve probably read Jon Cooper’s gigantic blog post listing almost every link building strategy known. If you haven’t, then go and read it…now. Back? Great.
I say ‘almost’ every link building strategy because I’ve got one that’s not listed and that you don’t see talked about very often. I’m not claiming it’ll work for every site and every industry, but there’s so much scope that I’d be shocked if you couldn’t apply it to your site in some way.
So Thursdays are normally fairly standard, but today I received a hardback review copy of ‘The Avengers’ – edition 24 of the new Marvel’s Mightiest Heroes graphic novel collection. This is a new collection published by Hachette Partworks – and it looks like they’ve done a great job too.
Since the ‘cookie law’ was introduced in the UK, the number of websites asking for permission to set cookies has been growing. Even small company websites are affected; in fact, strangely, they seem to be the group that has adopted these mechanisms most actively.
Ever since Facebook announced ‘usernames’ in 2009, they’ve been a handy tool in the armoury, adding an extra touch of professionalism to pages, keeping the brand guys happy and helping them to rank for company or brand names.
The ’25 likes’ rule
It used to be the case that you could only claim a custom Facebook username (or vanity URL) if you had over 25 likes. That number was probably picked to stop spammers, or people who wanted to ‘squat’ the URLs, from registering them en masse. It meant you couldn’t just set up a page and choose a URL; you had to hustle a little – get friends, family and employees to like the page – so you could hit that magic number and claim the name you wanted before someone else got it.