Matt Cutts on Nofollow, Links-Per-Page and the Value of Directories

Matt and I have a beautiful relationship - I relentlessly pester him and he gives me answers to questions that are pestering webmasters. It's a cross between symbiotic and vampiric. Thankfully, Matt has once again let me suck away some of his precious time to address some big issues. After this, I think I'll let him have a rest and probably go after poor Tim (his website is so lonely).

I've posted the list of six questions I asked Matt, three of which he answered (his answers, where given, are quoted directly beneath the relevant option). This is followed by some discussion of these and other topics that warrant a mention and may be serious news to many folks (they certainly were to me).

1. Why was John Chow's website (johnchow.com) penalized in the Google search results?
* A) John engaged in manipulative link acquisition (buying links, affiliate linking, etc)
* B) John's site consistently provided links to paying companies without using nofollow
* C) John's site engaged in other manipulative practices on-page – cloaking, keyword stuffing, etc.
* D) He said mean things about Google and we're soft-shelled here at the 'plex
* E) No answer provided by Matt
2. Does Google recommend the use of nofollow internally as a positive method for controlling the flow of internal link love?
* A) Yes – webmasters can feel free to use nofollow internally to help tell Googlebot which pages they want to receive link juice from other pages
(Matt's precise words were: The nofollow attribute is just a mechanism that gives webmasters the ability to modify PageRank flow at link-level granularity. Plenty of other mechanisms would also work (e.g. a link through a page that is robots.txt'ed out), but nofollow on individual links is simpler for some folks to use. There's no stigma to using nofollow, even on your own internal links; for Google, nofollow'ed links are dropped out of our link graph; we don't even use such links for discovery. By the way, the nofollow meta tag does that same thing, but at a page level.) A markup sketch of these mechanisms appears just after this question list.
* B) Sometimes – we don't generally encourage this behavior, but if you're linking to user-generated content pages on your site whose content you may not trust, nofollow is a way to tell us that
* C) No – nofollow is intended to say "I don't editorially vouch for the source of this link." If you're placing un-trustworthy content on your site, that can hurt you whether you use nofollow to link to those pages or not.
3. If I have a website in Spanish that serves Spanish speakers in Mexico, Spain, Latin America & the United States, how does Google recommend that I tell them my content is relevant to searches in all of those countries?
* A) Host a single site in any country you want, but use a general domain extension like .com, .net, .org, etc. Then, acquire links from sites in all of those markets and Google will realize that your content is intended for an international, Spanish-speaking audience
* B) Build a website for each country, but host them anywhere so long as they are registered with the proper top level domain. Feel free to use the same content on each, as duplicate content filters only apply to a single country.
* C) Build multiple websites for each country, host them individually in each country, register them on different top level domains and create unique content for each (even though the content might technically be exactly the same). Then, link build and market each site in each individual country separately. Don't interlink excessively between these sites.
* D) Use a single site for now and target your largest potential market. Then, give us some time to work on this issue – we'll have greater functionality for single sites targeting multiple countries/languages in the near future.
* E) No answer provided by Matt
4. Google has noted in the past that a maximum of 100 links per page was wise and would ensure that all of the links on a page would be crawled. Does this rule still apply, or is there some flexibility?
* A) The rule is still a good one – even very important pages with lots of PageRank should stay away from linking to more than 100 other pages.
* B) There's some flexibility. If you have a high PageRank page with lots of link juice, we may spider well beyond 100 links per page – possibly even 200-300 depending on how valuable we feel that page to be.
* C) The rule really only applies to pages of low importance/PageRank. Googlebot now regularly can crawl 150-200 links per page without breaking a sweat and those numbers can be even higher for pages we consider particularly important.
* D) Although we may crawl more than 100 links per page (maybe even many hundreds), we don't recommend linking to that many because of the dilution of link juice that occurs. Instead, use sub-navigation pages to help ease the link per page burden.
* E) B & D
Matt's exact words - The "keep the number of links to under 100" is in the technical guideline section, not the quality guidelines section. That means we're not going to remove a page if you have 101 or 102 links on the page. Think of this more as a rule of thumb. Originally, Google only indexed the first 100 kilobytes or so of web documents, so keeping the number of links under 100 was a good way to ensure that all those links would be seen by Google. These days I believe we index deeper within documents, so that's less of an issue. But it is true that if users see 250 or 300 links on a page, that page is probably not as useful for them, so it's a good idea to break a large list of links down (e.g. by category, topic, alphabetically, or chronologically) into multiple pages so that your links don't overwhelm regular users. (A sketch of that kind of category break-down follows this question list.)
5. What is Google's position on the value of generic web directories that market to webmasters as a way to boost link strength, PageRank and relevance? Would you largely agree or disagree with my assertions on the subject?
* A) While Google certainly feels that many directories are valuable, those that are built with SEO purposes in mind are generally viewed as somewhat manipulative and we make an effort to see that their value is limited. We're generally in agreement with your post.
* B) Google doesn't treat SEO directory sites any differently than any other sites – if their PageRank is high and their content is relevant, we pass just as much link value and anchor text weight as any other similar link. So, we differ a bit in opinion with your post.
* C) Mostly A with a little B
Matt's exact words - We tend to look more at the quality of a directory than whether it is SEO-related. Of course, plenty of directories that are targeted only for SEO don't add that much value in my experience. I talked about some of the factors that we use to assess the quality of a directory in an earlier post at http://www.mattcutts.com/blog/how-to-report-paid-links/
"Q: Hey, as long as we're talking about directories, can you talk about the role of directories, some of whom charge for a reviewer to evaluate them?
A: I'll try to give a few rules of thumb to think about when looking at a directory. When considering submitting to a directory, I'd ask questions like:
- Does the directory reject urls? If every url passes a review, the directory gets closer to just a list of links or a free-for-all link site.
- What is the quality of urls in the directory? Suppose a site rejects 25% of submissions, but the urls that are accepted/listed are still quite low-quality or spammy. That doesn't speak well to the quality of the directory.
- If there is a fee, what's the purpose of the fee? For a high-quality directory, the fee is primarily for the time/effort for someone to do a genuine evaluation of a url or site.
Those are a few factors I'd consider. If you put on your user hat and ask "Does this seem like a high-quality directory to me?" you can usually get a pretty good sense as well, or ask a few friends for their take on a particular directory."
6. Are pages with "noarchive" treated any differently than those without the tag? (The tag itself is shown in a short snippet after this list.)
* A) Yes, Google may not treat links, content or other factors on a page the same if it has the "noarchive" tag.
* B) No, these pages are treated the same as a page that permits us to archive.
* C) No answer provided by Matt
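
To make Matt's answer on question 2 concrete, here's a minimal markup sketch of the mechanisms he mentions. The /login URL is just an invented example:

    <!-- Link-level: this one link is dropped from Google's link graph -->
    <a href="/login" rel="nofollow">Log in</a>

    <!-- Page-level: the robots meta tag nofollows every link on the page -->
    <meta name="robots" content="nofollow">

    <!-- Alternative mechanism Matt mentions: block the linked-to page
         in robots.txt (these two lines live in the robots.txt file at
         your site root, not in your HTML) -->
    User-agent: *
    Disallow: /login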
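
Similarly, for the question 4 advice about breaking a large list of links down, here's one hedged sketch of a category-based sub-navigation; the category names, counts, and URLs are made up for illustration:

    <!-- Hub page: link to a few category pages instead of
         listing several hundred resources directly -->
    <ul>
      <li><a href="/directory/tools">SEO Tools (80 links)</a></li>
      <li><a href="/directory/blogs">SEO Blogs (95 links)</a></li>
      <li><a href="/directory/forums">Forums &amp; Communities (60 links)</a></li>
    </ul>
    <!-- Each category page then stays comfortably under ~100 links,
         and regular users aren't overwhelmed by one giant list -->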
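
And for reference on question 6 (which Matt didn't answer), the noarchive directive itself is just a robots meta tag asking engines not to show a cached copy of the page:

    <!-- Keeps the "Cached" link for this page out of search results -->
    <meta name="robots" content="noarchive">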

BTW - For the remaining unanswered questions, Matt responded:

The rest deserve longer answers and I'm too swamped to do a full reply, or the question is not how I'd pose it, so multiple answers (or none) could apply or the answer is more nuanced.

So, I guess I need to work on my question-posing abilities.

Here are the big takeaway points from my perspective:

* Nofollow is now, officially, a "tool" that power users and webmasters should be employing on their sites as a way to control the flow of link juice and point it in the very best directions. Good architectural SEO has always involved some internal link structuring work, but nofollow and Matt's position on it make it clear that those of us who are professionals should be using it to the best of our abilities. A rough markup sketch of this follows the list.
* For pages with many, many links, sticking close to 100 links per page is probably still a very good idea, though high-PageRank, high-link-juice pages can certainly get more of their links spidered. I note that on a page like the Web 2.0 Awards, well over 200 links are being followed and passing link juice (at least from what I can see).
* Directories (and all websites) that link out need to be very, very careful of whom they link to, as this is a big way that Google is algorithmically identifying and discounting paid links and other types of links it doesn't want to count. For really solid evidence on this, check out SEOmoz's own recommended list - the page is indexed at Google, but can't rank for its full title tag; we're not even ranking for the phrase in quotes. Why? I'm almost sure it's because of someone on the list that we're linking out to that Google doesn't like. I don't think that's a hand penalty - I think it's algorithmic. Full disclosure on this - SEOmoz used to be a directory operator, with a junky piece of crap called SOCEngine. We've officially shut it down at this point, even though it's been barely active for the last 2 years, as it would be totally hypocritical to operate a low quality directory and then proclaim that such directories aren't worth paying for.
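
As a rough sketch of that internal nofollow idea (all page names here are hypothetical), the point is simply to nofollow boilerplate links so more link juice flows to the pages you actually want ranking:

    <!-- Boilerplate pages that don't need to rank: nofollowed -->
    <a href="/signup" rel="nofollow">Sign up</a>
    <a href="/terms" rel="nofollow">Terms of Service</a>

    <!-- Content pages you want ranking: followed as normal -->
    <a href="/guides/keyword-research">Keyword Research Guide</a>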

Matt also addressed a few other issues that deserve their own blog posts, and those should be coming soon.
