Greekgeek's Online Odyssey - Hubpages and Online Article Writing Tips


The Long Tail in the Age of Semantic SEO

I recently did a long tail experiment to catch a few different search phrases.

See my introduction to the long tail, The New Long Tail of SEO, if you don’t know what I mean by that term.

Okay. Here’s the story.

(more…)

Panda Update 3.2 Happened January 18

Has your traffic profile changed recently? The culprit may be Panda 3.2, confirmed on Jan 18, 2012. See that link on SearchEngineLand for more info.

To review what this Panda thing is about:

Google’s search algorithm ranks pages’ relevance to a given search query based on over 200 factors. For example, are the words in the search query (“what’s in a hot dog?”) found in the page’s headers, or does that page link to other good pages about that topic? The pages that rank highest on relevance get listed first for that query when someone searches for it on Google. A better Google listing means more clicks, more visitors, more traffic.

Starting last February, Google introduced a new factor, code-named Panda. This factor is weighted more strongly than many other factors, and Panda differs from most of them in that it's a measure of the domain where the page is found. Are there a lot of spammy pages on that domain (e.g. Squidoo.com)? Are there a lot of pages whose content is found elsewhere? Or is that domain full of unique, useful pages? Panda attempts to determine the overall quality of a website. It then boosts or lowers the raw rank of any page found on that site.
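To make that concrete, here's a toy model (my own invention for illustration; Google's actual math is secret, and the function and numbers below are made up):

    def final_score(raw_page_score, panda_rating):
        # Toy model only: a page's raw relevance (from the ~200 per-page
        # factors) scaled by a sitewide Panda rating, where 1.0 is
        # neutral, above 1.0 boosts, and below 1.0 demotes.
        return raw_page_score * panda_rating

    # The same article fares differently depending on the site hosting it:
    print(final_score(80, 1.25))  # on a domain Panda likes  -> 100.0
    print(final_score(80, 0.75))  # on a pandalized domain   -> 60.0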

Panda isn’t calculated every day. Instead, it’s recalculated manually whenever someone at Google says, “Time to run a Panda update again.” It then crawls all the sites on the web and re-evaluates whether they’re full of spam and junk or excellent content.

The long and short of it: each time Panda is recalculated, ALL articles on Squidoo may be somewhat impacted, depending on whether Squidoo gets a good Panda rating or a poor one. A good rating means that, other things being equal, a page on Squidoo will be listed higher in search results than the same page posted somewhere else. Or, if Squidoo gets downgraded, every lens carries a slight disadvantage, like a golf handicap.

January 18th is about the time my Squidoo traffic jumped by roughly 20%. However, I haven't seen a lot of Squidoo members gloating over a sudden traffic jump, so this is evidently not much of a sitewide change, in which case my own traffic boost is probably not due to Panda.

There's another Google update muddying the waters right now, making it difficult to tell which factor is causing what. Search Plus Your World now shows strongly personalized results in Google searches, including things your friends and circles have tweeted and shared. I'm not clear on whether Google has started giving more weight to socially shared links as a ranking factor (one of those 200+ factors mentioned above), or whether it still regards only social signals from "trusted authorities" (say, a link posted by Neil Gaiman) as important, with the rest of our tweets, Facebook likes, etc. significant only to our friends.

At any rate, any one of the recent reshufflings of what Google displays as search results could explain my traffic boost. It's not just traffic rebounding after a holiday lull; this is significantly more traffic than I saw in 2011.

ETA: Click the widget below to view the full-sized Quantcast chart for Squidoo traffic. It may show a modest bump in traffic from the latest Panda update, or it may be within seasonal variation. (Here’s Hubpages’ traffic, too, for comparison.)

    

 

Google Panda 2.5 Winners & Losers

No time for a detailed post, but I wanted to recommend this link partly so I can find it later when I update my own page on Hubpages, Squidoo and Panda:

Google Panda 2.5 Winners & Losers

Supposedly, Hubpages has regained a lot of its traffic. Quantcast shows it's still down from pre-Panda levels, and I have seen scattered complaints from some members that their traffic hasn't recovered.

There is the uncomfortable possibility that Google has judged their content "shallow" and downgraded it on a subdomain-by-subdomain basis. That would account for an overall traffic increase that still falls short of the levels seen before Panda started dinging shallow content.

Many of those who lost traffic feel their content is excellent, unique, and original, and that it doesn't deserve to be penalized any more than Daniweb. Are they right, or, in the view of average web users rather than those of us inside the fishbowl, are those pages spammy, shallow, or just not something most of the web would be interested in reading?

It would be an interesting exercise to examine a sample of Hubpages profiles: which members say their traffic has returned, and which say theirs remains flatlined? Are there any particular features that the "winner" hubs have in common, or that the "losers" do?

Stay tuned for your next big bad Panda.

 

 

 

 

Subdomains: That Is the Question

Thanks to Hubpages' June 2011 experiment with subdomains as an attempt to get out from under Panda, Squidoo is now in the beta testing stage of something similar. Hubpages' subdomain experiment picked up a lot of buzz when it landed in the Wall Street Journal, and I was one of many who were excited by the possibilities, since I thought it made sense. SearchEngineLand, one of the better SEO journals out there, made cautious noises and checked with Google (see that article for Google's response).

Based on Google's responses and Hubpages' traffic rebound (see below), I thought subdomains couldn't hurt and might help, and said so. However, after more pondering, I've joined the ranks of Squidoo members who are concerned. Apologies for the about-face. Let me explain.

(more…)

Panda 2.3, Hubpages, and a Suggestion for Zazzle Members

By the way, Google reran the Panda algorithm around July 25.

What this means is that every month or so, someone at Google pushes the "Panda button." Panda then reassesses the quality of content on each domain versus the amount of junk/spam on it, and gives that site, shall we say, a Panda Rating. That Panda Rating then becomes one of the factors Google's everyday search algorithm uses to decide how well to list a page in search engine results. Panda's rating is apparently a fairly strong factor, as traffic on pages across a domain tends to rise or fall together, unless individual pages on that site have acquired enough other factors (say, backlinks from highly respected sites) to offset the Panda factor.

Good news for Hubpages members: reorganizing Hubpages along subdomains has helped many of you by partitioning off your content from spammy members' content. That helps convince Panda to judge your content on its own merits versus that of other authors on the Hubpages domain. So Hubpages is now slightly outperforming Squidoo, ezinearticles, suite101, and other open publishing sites (as opposed to sites that vet content with an editorial board, which Panda is going to like better). We can clearly see Hubpages getting an uptick from Panda 2.3 at the end of July:

Now, wait, why did I put DeviantArt on there? A hunch. Just look at all that traffic! I think Zazzle members should have a DeviantArt account where they showcase some of their work and link to their Zazzle gallery and/or accounts on Squidoo and HP where they showcase more of their work.

DeviantArt has an advantage over sites like HP and Squidoo, as you can see. A social community that appeals to a large niche market (share your art! writing! photography!) would get tons of traffic even if search engines didn't care diddly squat about it. Members market it by pointing friends, relatives, and peers to their stuff. Search engine traffic, for DeviantArt, is a bonus on top of the social buzz it generates.

Now, don't all run out and create DeviantArt accounts for the purpose of spamming DA with backlinks. That won't help much for SEO purposes. DeviantArt does not let you link directly out to some other website. Instead, when you enter links on a DeviantArt page like your profile, they're stored in a special in-house format, which is deciphered by a script only when a user clicks the link.

For instance, here’s our friend Flynn the Cat on DeviantArt. Hover over that link in Flynn’s sidebar and see what the URL is:

http://www.deviantart.com/users/outgoing?http://www.squidoo.com/flynn_the_cat

I bet that Google, at least, is clever enough to detect the hidden URL in there and crawl it for indexing purposes: "Aha, there's a webpage at http://www.squidoo.com/flynn_the_cat." But indexing is not the same as ranking. This link probably doesn't count as a backlink when Google is checking backlinks as one of the factors it uses to decide how high up to list a page in search engine results.
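Just to illustrate (a quick Python sketch of my own; I'm guessing at how a crawler might handle these wrappers, and this is not anything DeviantArt or Google publishes), here's how the hidden destination can be fished out of that URL:

    from urllib.parse import urlsplit

    def unwrap_outgoing(link):
        # DeviantArt-style outgoing wrappers park the real destination
        # in the query string, after the "?". Return it, or None if the
        # link isn't in that wrapper format.
        parts = urlsplit(link)
        if parts.path.endswith("/users/outgoing") and parts.query.startswith("http"):
            return parts.query
        return None

    print(unwrap_outgoing(
        "http://www.deviantart.com/users/outgoing?"
        "http://www.squidoo.com/flynn_the_cat"))
    # -> http://www.squidoo.com/flynn_the_cat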

So why bother with backlinks on DeviantArt, if they don't count for SEO? Pages on Hubpages, Squidoo, etc. get crawled and indexed pretty quickly anyway.

Because links have two audiences: (a) search engines, which may use that link to rank your page better in search engine results, and (b) humans, who will click on links that look interesting or useful to them.

In this case, your target audience is (b), people.

When writing backlinks for people, you have to give them something they'll be interested in. On DeviantArt, if they see an excellent portfolio of art, photos, or other kinds of creativity, some visitors will follow your link to see more of your creative work hosted elsewhere. Note that just because DeviantArt itself has a huge amount of traffic doesn't mean your account will. As with Twitter, Facebook, or other social sites, you'll only get traffic if you participate in the community and/or post really good stuff that attracts a following.

But if you are an artistic person like Flynn here, and upload stuff regularly, you will attract a following. You could then direct some of that following to a Zazzle store, Squidoo gallery, or blog where you showcase your stuff.

By the way, Digg, StumbleUpon, and many social media sites create outlinks the same way as DeviantArt: they are stored in a non-standard, in-house format, and then a script untangles them and sends the user to the real link. So everyone measuring links from those social sites as backlinks is missing the boat. Those may help Google index a page, but they probably don’t count much as far as helping a page rank better. As with DeviantArt, those links won’t help much for traffic unless you’re an active, contributing member of those communities who has gained a following by frequently posting good stuff of the kind that community tends to like.

Hubpages' Subdomains Approach Is Forward-Thinking

This is another of my off-the-cuff observations not backed up by evidence, but I really like one approach Hubpages has taken to recover from Panda: establishing author-based subdomains.

On the one hand, this means backlink churn. They've got redirects in place, but any time you shift the URLs of part of a website, there are bound to be problems. Those will iron themselves out over time.

But on the other hand, this makes it much, much clearer who's written what. Is everything in one subdomain scraped garbage? Fine, penalize it. But if another subdomain has unique, well-written content with sound links to related content, don't give it a penalty because of Jane Q. Scraper/Spammer in the next subdomain over. It's the same principle as shared web hosting from the last decade: there's quite a mix of websites on the hosting service where I'm posting this blog, and search engines don't judge us all the same way.

There’s one other piece of the puzzle that Hubpages and Squidoo are getting half right.

Both Hubpages and Squidoo have added a hidden rel=author link from individual articles (lenses, hubs) to the member’s profile page. Good. That makes clear that the member is the author of all those pages.

But as Marisa Wright of the HP forums reminded me, there's something more to do. There needs to be a rel="me" field on our Squidoo and Hubpages profiles linking to our Google profile; otherwise, the authorship won't be confirmed, and Google won't count our suite of articles as our own work, separate from the rest of the site.

Update: Squidoo has now implemented this field. (And it didn’t matter anyway, since we could add a rel=”me” link manually, but still, the field makes it easier.)
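For the technically curious, here's a little sketch of my own (the URLs are made up) showing what the completed two-link chain looks like to a parser: rel="author" on the article pointing at the profile, and rel="me" on the profile pointing at the Google profile.

    from html.parser import HTMLParser

    class RelLinkFinder(HTMLParser):
        # Collects the targets of <a> or <link> tags marked rel="author"
        # or rel="me", the two halves of the authorship chain.
        def __init__(self):
            super().__init__()
            self.found = []

        def handle_starttag(self, tag, attrs):
            if tag in ("a", "link"):
                attrs = dict(attrs)
                if attrs.get("rel") in ("author", "me") and attrs.get("href"):
                    self.found.append((attrs["rel"], attrs["href"]))

    # A made-up article page linking to its author's profile, and a
    # made-up profile page linking onward to a Google profile:
    article = '<a rel="author" href="http://www.squidoo.com/lensmasters/example">my profile</a>'
    profile = '<a rel="me" href="https://profiles.google.com/example">me on Google</a>'

    for page in (article, profile):
        finder = RelLinkFinder()
        finder.feed(page)
        print(finder.found)
    # [('author', 'http://www.squidoo.com/lensmasters/example')]
    # [('me', 'https://profiles.google.com/example')]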

Hubpages, Squidoo, and Panda 2.2

I was just checking to see how Hubpages and Squidoo are doing following the latest tweak to the Panda algorithm, 2.2, which I reported on back in June. Unfortunately, Hubpages' traffic data has disappeared from Quantcast. [Update: It's back. Phew.]

Panda 2.2 rolled out back on June 21st. By now we've had enough traffic data that we should begin to see a bit of a before-and-after change. To my surprise, Hubpages has stopped letting Quantcast report its numbers. I would've expected a slight uptick from Panda 2.2, which may be the first chance Hubpages has had to get back in Google's good graces after spam stomping. (Panda is a special calculation done separately from the ranking of individual pages; it ranks a whole domain, and that number is then applied as a boost or penalty to pages posted on the domain, like an extra ranking factor. Since the Panda calculation is only performed again when someone at Google manually punches a button, a domain has to wait to be reassessed.)

So anyway. Hubpages has followed Mahalo’s lead in hiding its data. A pity.

[UPDATE Aug 13: Hubpages is back on Quantcast! And I see a slight uptick after Panda 2.2. Pardon me for mentioning Mahalo and Hubpages in the same sentence; Hubpages tries to highlight quality content and stamp out spam, even if it sometimes has to mop up the mess created by unscrupulous people taking advantage of its free publishing platform.]

So how's Squidoo holding up? I wish I could get a detailed breakdown of past years versus this year, since there's always a summertime drop. But here's the 3-year overview:

And here’s the past 3 months.

Not much to tell us, but from what I can see, no drastic change from Panda 2.2.

Just as another interesting comparison, here’s Suite101.com vs. Squidoo for the past six months:

Owie. Again, so far so good for the Squid, but not so happy for Suite 101, an old web 1.0 site that’s got lots of good amidst the bad, from what I remember. (It probably depends on the neighborhood.)

Stay tuned for the next Panda Punch.

 

I realize some of the upheaval at Squidoo right now is, once again, Squidoo's attempt to be prepared for the next round. I think the newest layer of spam filters needs some fine-tuning, and I'm anxious about the process for dealing with false positives, but I understand the need for even more aggressive spam/scraper filters.

Important Google News: Panda 2.2, rel=author, analytics

WOW. LOTS of Google news to report to Squidoo users this week: it's piling up faster than I can digest it. Let me start with the most recent, since it's the easiest to explain, though it will take you more time to put to use:

Squidoo Has Added Google Analytics

Finally. If you have Google Analytics, go to your profile right now and edit it to add your tracking number. If you haven't a clue what Google Analytics is or how it works with Squidoo, see How to Track Lenses With Google Analytics by theFluffanutta. Also see SquidHQ's official announcement: What Is the Advantage of Using Google Analytics Over regular Squidoo stats?

But wait! Don’t go yet! There is more Google news for Squids.

Google Panda Update News

SMX (Search Marketing Expo) is the big online search convention where all the experts line up to catch pearls of wisdom from Google Spokespundit Matt Cutts. (There's even a term for his groupies, Cuttlets. Lordie.) A rough transcript of web expert Danny Sullivan's interview with Matt Cutts is posted on SearchEngineLand. Takeaway lessons:

  • Google Panda is an algorithm run less frequently than Google’s daily indexing. Panda re-evaluates sites occasionally for spamminess, content quality, etc, and then the regular Google algorithm uses Panda’s site ranking to boost/lower pages found on that site.
  • Google Panda has yet to be implemented on non-English-based Google (Google has a different search engine gateway and database in each country, so for example, search results by someone using Google in France do not match the search results for someone using Google in England).
  • Expect a Panda 2.2 soon. No word on what it will entail.
  • ALSO, the “web spam team” is implementing a tweak to cut down on scraped content outranking the original. Huzzah.

Google Pushes Rel=Author Tag

Also from the Matt Cutts interview: Google has implemented two new voluntary tags, rel=author and rel=me, which let you link from your content to an author profile page you've set up, and back.

I was fussing with this post for several days because I'm still not 100% sure whether to implement the rel=author tag based on this news. But let me try to explain what it means and why it matters, and then you can ponder along with me!

10-word summary: Using rel=author might boost traffic for some sites. Maybe.

[[UPDATE: See Giltotherescue’s comments below. Based on what he says, I suggest you skip this discussion unless you’re interested. I do NOT advise using rel=author on Squidoo at this time.]]

But if you’re curious…

(more…)

Hubpages vs. Squidoo Traffic: Holding Steady

With all the hullaballoo lately I haven’t had much time to follow my pet project, the impact of Google Panda on Hubpages and Squidoo (there’s another lens that needs rewriting before page breaks vanish, sigh).

I just wanted to post a quick follow-up. I was actually checking to see if Squidoo traffic is down across the board, because I and a number of members have seen a very slight drop, but traffic drops every summer. Here's Hubpages traffic vs. Squidoo traffic, measured directly via Quantcast:

The Feb 24 and Apr 11 Panda updates are visible on Hubpages' line. They've implemented a lot of changes, but it may take a while for Google to recrawl and reassess. The problem is (I believe) that part of Panda is a special algorithm that evaluates the quality of a domain/site, and from that derives a handicap which it applies to pages on the site. When someone asked how long before traffic came back after one totally re-tooled a site, Matt Cutts said the Panda algorithms have to be re-run. If I'm interpreting that correctly, it means that the site penalties are being updated less frequently than the daily crawl that finds and indexes content.

So Hubpages members need to sit tight a little longer and wait for Google to reassess what Hubpages has done to correct its problems. I'm hoping for their sakes (and mine; I'm trying to get a few irons in the fire over there) that they will have good news soon. Meanwhile, Squidoo members need to sit tight and see whether Squidoo has second-guessed itself in a wise or foolish way by implementing vast numbers of changes after successfully passing through Panda I and II unscathed. Most of the changes aren't content-related, but some are navigation-related; in particular, we've lost a vast number of internal links with lensroll getting phased out. And I'm uneasy about the extra line of AdSense above the fold. We'll see.

A quick survey of Panda news reveals nothing much, but M. Martinez has detected hints that Panda might unroll in Latin America next. To recap, Panda was implemented on U.S. Google results on Feb 24 and on all English-based Google results on Apr 11, plus there was a minor Google update whose impact I haven't been able to see in the sites I've studied. I shall be interested to see what happens when Panda is implemented for French, Russian, and especially German Google.

 

My Lens on the Google Panda Update

Yes, you’ve probably already seen it: I’ve written a 3-page article discussing the Panda update’s impact on Squidoo and Hubpages traffic, and what lessons we can learn from it to stay ahead of the Google wrecking ball.

You may notice that the root message boils down to, “write unique content that your readers find USEFUL,” which isn’t exactly earth-shattering, but it’s amazing how many people try every approach but that one.

Unfortunately, if you publish on an open article-submission site like Squidoo or Hubpages, your content is partly judged by association, so you can get marked down by Google even if that’s exactly what you’re doing. Fight back by encouraging quality content on your favorite sites and leading by example.

I think Hubpages is taking some good steps to put its house back in order (as is Squidoo). We’ll see if Google agrees a few months from now.