Webmasters have been expecting a BIG Penguin update from Google for quite some time, and a couple weeks ago, Google’s Matt Cutts promised that one was on the way.
Finally, on Wednesday, he announced that Google had not only started
the roll-out, but completed it. While it was said to be a big one, it
remains to be seen just how big it has been in terms of impacting
webmasters.
Have you been impacted by the latest Penguin update? Let us know in the comments.
Just what did Cutts mean by “big” anyway? When discussing the update a
couple weeks ago, he said it would be “larger”. When it rolled out, he
announced that “about 2.3% of English-US queries are affected to the
degree that a regular user might notice,” and that “the scope of Penguin
varies by language, e.g. languages with more webspam will see more
impact.”
As far as English queries go, the update actually appears to be smaller. The original Penguin (first called the “Webspam”
update) was said to impact about 3.1% of queries in English. So, perhaps
this one is significantly larger in terms of other languages.
Cutts has also been tossing around the word “deeper”. In the big “What should we expect in the next few months” video
released earlier this month, Cutts said this about Penguin 2.0: “So
this one is a little more comprehensive than Penguin 1.0, and we expect
it to go a little bit deeper, and have a little bit more of an impact
than the original version of Penguin.”
Cutts talked about the update a little more in an interview with Leo Laporte
on the day it rolled out, and said, “It is a leap. It’s a brand new
generation of algorithms. The previous iteration of Penguin would
essentially only look at the homepage of a site. The newer generation of
Penguin goes much deeper. It has a really big impact in certain small
areas.”
We asked Cutts if he could elaborate on that part about going deeper. He said he didn’t have anything to add.
The whole thing has caused some confusion in the SEO community. In fact, it’s driving Search Engine Roundtable’s Barry Schwartz “absolutely crazy.” Schwartz wrote a post ranting about this “misconception,” saying:
The SEO community is translating “goes deeper” to mean that
Penguin 1.0 only impacted the home page of a web site. That is
absolutely false. Deeper has nothing to do with that. Those who were hit
by Penguin 1.0 know all too well that their whole site suffered, not
just their home page.
What Matt meant by “deeper” is that Google is going deeper
into their index, link graph and more sites will be impacted by this
than the previous Penguin 1.0 update. By deeper, Matt does not mean how
it impacts a specific web site architecture but rather how it impacts
the web in general.
He later updated the piece after noting what Cutts actually said in the
video, adding, “Matt must mean Penguin only analyzed the
links to the home page. But anyone who had a site impacted by Penguin
noticed not just their home page ranking suffer. So I think that is the
distinction.”
Anyhow, there have still been plenty of people complaining that they
were hit by the update, though we’re also hearing from a bunch of people
that they saw their rankings increase. One reader says this particular
update impacted his site negatively, but was not as harsh as the
original Penguin. Paul T. writes:
Well, in a way I like this update better than any of the others.
It is true I lost about 50% of my traffic on my main site, but the
keywords only dropped a spot or two–so far anyway.
The reason I like it is because it is more discriminating. It doesn’t just wipe out your whole site, but it goes page by page.
Some of my smaller sites were untouched. Most of my loss came from hiring people to do automated back-linking. I thought I would be safe doing this because I was really careful with anchor text diversity, but it was not to be.
I am going to try to use social signals more to bring back my traffic.
Another reader, Nick Stamoulis, suggests that Google could have taken
data from the Link Disavow tool into consideration when putting
together Penguin 2.0:
I would guess that the Disavow tool was factored into Penguin
2.0. If thousands of link owners disavowed a particular domain I can’t
imagine that is something Google didn’t pick up on. It’s interesting
that they are offering site owners the chance to “tell” on spammy sites
that Penguin 2.0 might have overlooked.
Cutts has tweeted about the Penguin spam form several times.
With regards to the Link Disavow tool, Google did not rule out the
possibility of using it as a ranking signal when quizzed about it in the
past. Back in the fall, Search Engine Land’s Danny Sullivan shared a Q&A with Matt Cutts
in which he did not rule out the possibility. Sullivan asked him whether,
if “someone decides to disavow links from good sites as perhaps an attempt to
send signals to Google these are bad,” Google is mining this data to
better understand what bad sites are.
“Right now, we’re using this data in the normal straightforward way,
e.g. for reconsideration requests,” Cutts responded. “We haven’t decided
whether we’ll look at this data more broadly. Even if we did, we have
plenty of other ways of determining bad sites, and we have plenty of
other ways of assessing that sites are actually good.”