This past week, SEO software company Moz launched a free tool for assessing risky links, and Google's John Mueller shared more of his wisdom in a few webmaster forum comments worth highlighting.
Full details on each of these stories in this week’s SEO news roundup!
Latest SEO News – Week Ending 4/3/2015
Moz Releases New Tool For Assessing Risky Links
The company behind some of the most useful free and paid SEO software on the market now has a brand-new tool for assessing how risky your link profile is.
You may be familiar with Moz through using its Open Site Explorer tool, which is free web-based software that can analyze a website’s inbound links. Based on who is linking to a site, and how often, Moz can determine how authoritative a site is and assign it a “Domain Authority” score.
Now built into the Open Site Explorer tool is a new module for risky link analysis. Based on a checklist of 17 spam flags, Moz will score how risky your links are on a scale of 0 to 17.
Here is a brief overview of the types of spam indicators the tool looks for:
- Low MozTrust/MozRank score
- Low link diversity
- Ratio of followed/nofollowed domains
- Ratio of followed/nofollowed sub-domains
- Thin content
- Large number of external links
- Heavy anchor text
- No contact info or social profiles
- Use of a spammy domain extension
- Numbers in domain name
- Few links to linking site
- Small amount of branded anchor text
- High ratio of text to HTML
- Linking site has few internal links
- External links in navigation bar
- Linking site has few pages
- Domain name length is above average
If the tool finds you have a large number of risky links, it will even help you remove them. The tool can help you build a file of the riskiest links, which you can then submit to Google's disavow tool. That way, those links will no longer be counted as part of your inbound link profile.
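Google's disavow file is just a plain text file: one URL or `domain:` rule per line, with `#` lines treated as comments. Here is a minimal sketch of turning a risky-link report into that format (the domains and URLs are hypothetical placeholders):

```python
# Sketch: build a disavow.txt from a risky-link report.
# The domains and URLs below are hypothetical placeholders.
risky_domains = ["spammy-directory.example", "link-farm.example"]
risky_urls = ["http://low-quality.example/blogroll.html"]

lines = ["# Risky links flagged for disavowal"]    # comment lines start with '#'
lines += [f"domain:{d}" for d in risky_domains]    # disavow an entire domain
lines += risky_urls                                # or disavow individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```

The resulting disavow.txt is what you would upload through Google's disavow tool.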
For full details about this new tool, check out Moz’s official announcement.
Wisdom from Google’s John Mueller
Google engineer John Mueller is well known for frequently monitoring and responding to threads on Google's webmaster help forum. Occasionally his responses are rather insightful.
Here are some new things we learned this week thanks to some of his comments.
Your site had better load in two seconds or less.
To ensure Google fully crawls and indexes every page on your site, you'd better keep the load time under two seconds.
In this Webmaster Help thread, Mueller says that if your site takes longer than two seconds to load, it will "severely limit" the number of pages Google will crawl.
“My recommendation would be to make sure that your server is fast & responsive across the board. As our systems see a reduced response-time, they’ll automatically ramp crawling back up.”
To see how long your site takes to load, check it with the GTmetrix tool.
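If you would rather check from the command line, here is a rough sketch using only Python's standard library; it times a full page fetch and applies the two-second rule of thumb from Mueller's comment (the example URL is a placeholder):

```python
import time
import urllib.request

def response_time(url, timeout=10):
    """Fetch the page and return elapsed seconds, download included."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.monotonic() - start

def crawl_verdict(seconds, threshold=2.0):
    """Apply the two-second rule of thumb from the thread above."""
    return "OK" if seconds < threshold else "too slow"

# Example usage (any reachable URL):
# print(crawl_verdict(response_time("https://www.example.com/")))
```

Keep in mind this measures one request from your location; Googlebot's view of your server response time may differ, so treat it as a quick sanity check rather than a definitive answer.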
Googlebot is always watching.
Some of the more conscientious site owners have a habit of manually submitting pages to Google’s index every time they add a new one to their site.
According to John Mueller, that’s unnecessary. If you make a change to your site, Google will find it:
“You don’t need to submit pages when they’re changed — we recrawl the web automatically to pick those changes up.”
If you really want to have Google crawling your site as efficiently as possible, make sure you’re using XML and HTML sitemaps.
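If you don't already have an XML sitemap, it is simply a list of your pages in the format defined by the sitemaps.org protocol. A minimal example, with hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2015-04-03</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
  </url>
</urlset>
```

You can point Google at it by adding a `Sitemap: https://www.example.com/sitemap.xml` line to your robots.txt or by submitting it in Webmaster Tools.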
Reminder: Google’s Mobile-Friendly Algorithm Launches on April 21st
To keep this fresh in everyone's mind, we'd like to remind you that a significant algorithm update is coming in just under a month that will affect how non-mobile-friendly sites appear in search results.
Wrapping it Up
We now have a new tool for assessing and removing risky links, a new understanding of the importance of site speed, and John Mueller's assurance that Google will find our pages.
If you have any questions about any of this past week’s SEO news, please leave a comment and I will be sure to respond.