Greekgeek's Online Odyssey - Hubpages and Online Article Writing Tips


Three notes on Rel=”me”, Rel=”author” (They work!)

EDIT: DRAT. I spoke too soon. Google has changed how rel=”author” works, and try as I might, I can no longer get it to recognize authorship with Squidoo pages. Or at least, Google’s snippet validator isn’t recognizing it.

—–

 

Three notes on rel=”me” and rel=”author,” which I talked about last month.

  • It WORKS with an ordinary Google Profile, as opposed to a Google+ profile, if you’re annoyed with Google+ for various reasons. Here’s a screenshot of some Google results showing my author icon, linked to an ordinary Google account, not Google Plus. (All the way at the bottom, but at least it draws the eye.) Ignore the cache preview on the right… or don’t, because as you see, it’s one more way users may decide whether or not to visit your page:

Notice how the author icon makes my link stand out from other text links on the same page, although perhaps I ought to create and add a “how to” YouTube video as well to see if I can land in that section of Google results.

  • Your author icon will not appear next to your claimed content immediately. Over time, more and more content pages are showing my author icon. For search results that do not show my authorship icon, my author name is not listed either. This suggests that the author icon appears next to authored content AFTER it is re-crawled. 

Therefore, to get the author icon to show up on your older articles, edit and tweak the content, and PING them. (On Squidoo, get SquidUtils’ Workshop Add-on and then click “ping” on the SU link that appears in the “Just published” page.) Or just wait. Google re-crawls everything eventually.

Haven’t implemented rel=me on Squidoo yet? Here’s that tutorial again.

  • Thirdly, Google has CHANGED the way links are listed on your Google Profile. They’ve now been divided into “Other Profiles,” “Contributor to” and “Recommended Links.” The first one, “Other Profiles,” is obviously where you put your Squidoo, Wizzley, Twitter, Facebook and other social media accounts. But what about blogs? I tried moving my blog links to “Contributor to,” and it dropped rel=”me” and tagged those links with rel=”contributor-to” instead. That doesn’t seem right. I’m still trying to figure out where one files blogs.

I think, perhaps, the best thing to do would be to create an Author Profile page on each blog where you are an author, set the other pages/entries on the site to point to that profile page with rel=”author,” and set up reciprocal rel=”me” links between the author profile and your Google profile. In other words, mimic the rel=”author” and rel=”me” setup that I’ve suggested with Squidoo, which we know works (see screencap above). But I haven’t implemented this yet, so I’m not sure I’m right. Why is it so bally complicated? Well, I’m sure we’ll be doing it with our eyes closed just like basic HTML in a few years.
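For blogs, the setup I’m imagining would look roughly like this. (A sketch only — all the URLs here are placeholders, not real pages.)

```html
<!-- On each blog post: point to the blog's own author profile page -->
<a href="http://yourblog.example.com/author-profile" rel="author">About the Author</a>

<!-- On that author profile page: link out to your Google profile -->
<a href="https://profiles.google.com/yourprofileid" rel="me">My Google Profile</a>

<!-- Finally, on your Google profile, list the author profile page among
     your links, so the rel="me" relationship runs in both directions. -->
```

The key is that the loop closes: post → author page → Google profile → author page.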

Alas, Google Toolbar Pagerank is NOT Dead

“Ding, Dong the –
Oh DAGNABBIT.”

— Twittersphere all a-flutter over (exaggerated) rumors of Google Toolbar Pagerank’s demise.

Alas no. What happened is Google changed the URL where it stored Toolbar Pagerank, so most 3rd party tools aren’t displaying it.

Google itself has been trying to kill Google Toolbar Pagerank since 2007, but like a zombie, Toolbar Pagerank keeps lurching around the web, a macabre and thin caricature of actual, true, living Pagerank which is never revealed.

Toolbar Pagerank haunts us. Yes, I’ve actually looked at a Pagerank checking tool within the last month to get a sense of Hubpages vs. Squidoo Pagerank and see how my lenses and hubs rank. (I’ve got a fair number of Toolbar Pagerank 4 to 6 lenses on Squidoo, and so far none on HP, but so what?)

Here’s why it’s silly that I checked:

Google Toolbar Pagerank and the ACTUAL Pagerank Google uses to rank pages are not the same.

Toolbar Pagerank is updated every few weeks (or months). It is not stored in the same place as the Pagerank that Google uses to calculate search results, which is recalculated far more frequently. This is to prevent gaming the system, so that SEOers can’t reverse-engineer Pagerank and discover exactly what factors Google uses to order search results. (Source: “Why You Shouldn’t Rely on Toolbar Pagerank“)

Interesting tidbits about Pagerank from SEO Theory:
Matt Cutts stated that real Pagerank is calculated several times a day, and it’s not just a (logarithmic) scale of 1 to 10; it’s got a lot more degrees to it. This suggests that it may not be 100% the same algorithm as Toolbar Pagerank. Matt Cutts says: “…At Google you’ve got full access to the raw values, so I rarely look at the truncated histograms of stuff.”

And by the way, it’s not called Pagerank as in “the rank of a webpage.” It’s named after Larry Page, CEO of Google, who co-pioneered the original algorithm (here it is, from 1998). But of course, Pagerank is nothing like it was 13 years ago, any more than the web is.
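For the curious, the formula from that 1998 paper is compact enough to quote here. In it, d is a “damping factor” (the paper suggests 0.85), T1…Tn are the pages linking to page A, and C(T) is the number of outbound links on page T:

```
PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )
```

Each page splits its “vote” evenly among its outbound links, and the scores are recomputed over and over until they settle. Whatever Google runs today is, as noted, far removed from this one-line version.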

More recently, Google has talked about 200+ ranking factors used to determine the order of search results. That figure was cited several years before we had heard of Google Panda, and (see link above) those factors are constantly being tweaked and changed. In the past two years, social media data has entered the equation, for example.

Google Panda 2.5 Winners & Losers

No time for a detailed post, but I wanted to recommend this link partly so I can find it later when I update my own page on Hubpages, Squidoo and Panda:

Google Panda 2.5 Winners & Losers

Supposedly, Hubpages has regained a lot of its traffic. Quantcast shows it’s still down from pre-Panda, and I have seen scattered complaints from some members that their traffic hasn’t recovered.

There is the uncomfortable possibility that Google has judged their content “shallow” and downgraded it on a subdomain-by-subdomain basis. That would account for an overall traffic increase that still falls short of the levels seen before Panda started dinging shallow content.

Many of those who lost traffic feel their content is excellent, unique, and original, and it doesn’t deserve to be penalized any more than Daniweb. Are they right, or… in the view of average web users, rather than those of us on the inside of the fishbowl, are those pages spammy, shallow or just not something most of the web would be interested in reading?

It would be an interesting exercise to examine a sample of Hubpages profiles: which members say their traffic has returned, and which say theirs remains flatlined? Are there any particular features that the “winner” hubs have in common, or that the “losers” do?

Stay tuned for your next big bad Panda.
Subdomains: That Is the Question

Thanks to Hubpages’ June 2011 experiment in subdomains as an attempt to get out from under Panda, Squidoo is in the beta testing stage of something similar. Hubpages’ subdomain experiment picked up a lot of buzz when it landed in the Wall Street Journal, and I was one of many who were excited by the possibilities, since I thought it made sense. SearchEngineLand, one of the better SEO journals out there, made cautious noises and checked with Google (see that article for Google’s response).

Based on Google’s responses and Hubpages’ traffic rebound (see below), I thought subdomains couldn’t hurt and might help, and said so. However, after more pondering, I’ve joined the ranks of Squidoo members who are concerned. Apologies for the about-face. Let me explain.

(more…)

Claim Authorship of Your Content on Google

Claiming authorship of your unique, original content could help your content rank better in Google, if Google determines that you generally write good content. It also might help Google find your new content faster, since it will check your author profile (lensmaster profile) from time to time. Most importantly, if you establish yourself as the author of content in Google’s eyes, it will privilege the original content above that of scrapers.

The downside is that while HTML lets you link your content to any username, Google will only recognize your authorship if you tie it to a Google profile with your real name and a photo. This is a serious problem for millions of web users who have privacy concerns, especially minors and women who are sometimes targets of stalkers.

But if you already have a Google+ account, and/or you’re willing to take the risk, here’s what to do:

How to claim authorship with
rel=”author” and rel=”me” : a Squidoo Tutorial

I did this at the beginning of September, and saw my traffic spike across most of my lenses. See my Squidoo Stats for the week of Sep 4-10, showing my weekly traffic jumping from about 12,500 to 15,000, and this chart of my top 25 lenses by lensrank:

Traffic increase a week and a half after implementing rel="author"
I wish I knew whether these traffic spikes were coincidence or significant. I did not see similar almost-across-the-board traffic increases from other search engines; some were up and some were down. If you’re an established web author with a lot of good content on the web, I’m curious to know whether you’ve seen similar results after a week or two of hooking up your content to your Google profile with rel=”author” and rel=”me”.

Panda 2.3, Hubpages, and a Suggestion for Zazzle Members

By the way, Google reran the Panda algorithm again on about July 25.

What this means is that every month or so, someone at Google pushes the “Panda button.” Panda then reassesses the quality of content on each domain versus the amount of junk/spam on it, and gives that site, shall we say, a Panda Rating. That Panda Rating then becomes one of the factors Google’s everyday search algorithm uses to decide how well to list a page in search engine results. Panda’s rating is apparently a fairly strong factor, as traffic on each domain tends to rise or fall together, unless individual pages on that site have acquired enough other factors (say, backlinks from highly-respected sites) to offset the Panda factor.

Good news for Hubpages members: reorganizing Hubpages along subdomains has helped many of you, by partitioning off your content from spammy members’ content. That helps convince Panda to judge your content on its own merits, versus lumping it in with that of other authors on the Hubpages domain. So Hubpages is now slightly outperforming Squidoo, ezinearticles, suite101, and other open publishing sites (as opposed to sites that vet content with an editorial board, which Panda is going to like better). We can clearly see Hubpages getting an uptick from Panda 2.3 at the end of July:

Now, wait, why did I put DeviantArt on there? A hunch. Just look at all that traffic! I think Zazzle members should have a DeviantArt account where you showcase some of your work and link to your Zazzle gallery and/or accounts on Squidoo and HP where you showcase more of your work.

DeviantArt has an advantage over sites like HP and Squidoo, as you see. A social community that appeals to a large niche market (share your art! writing! photography!) would get tons of traffic even if search engines didn’t care diddly squat for it. Members market it by pointing friends, relatives, and peers to their stuff. Search engine traffic, for DeviantArt, is a bonus on top of the social buzz it generates.

Now, don’t all run out and create DeviantArt accounts for the purpose of spamming DA with backlinks. That won’t help much for SEO purposes. DeviantArt does not let you link directly out to some other website. Instead, when you enter links on a DeviantArt page like your profile, they’re stored in a special in-house format, which is deciphered by a script only when a user clicks that link.

For instance, here’s our friend Flynn the Cat on DeviantArt. Hover over that link in Flynn’s sidebar and see what the URL is:

http://www.deviantart.com/users/outgoing?http://www.squidoo.com/flynn_the_cat

I bet that Google, at least, is clever enough to detect the hidden URL in there and crawl it for indexing purposes: “Aha, there’s a webpage at http://www.squidoo.com/flynn_the_cat.” But indexing is not the same as ranking. This link probably doesn’t count as a backlink, when Google is checking backlinks as one of the factors it uses to decide how high up to list a page in search engine results.
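In other words, instead of a plain anchor tag, you get something like this (my reconstruction for illustration, not copied from DeviantArt’s actual page source):

```html
<!-- A normal backlink, the kind search engines count: -->
<a href="http://www.squidoo.com/flynn_the_cat">Flynn the Cat</a>

<!-- What DeviantArt actually serves: the real destination is tucked into
     the query string of an in-house redirect URL -->
<a href="http://www.deviantart.com/users/outgoing?http://www.squidoo.com/flynn_the_cat">Flynn the Cat</a>
```

The href itself points at deviantart.com, not at Squidoo, which is why it’s unlikely to pass link value to the destination.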

So why bother with backlinks on DeviantArt, if they don’t count for SEO? Pages on Hubpages, Squidoo, etc get indexed / crawled pretty quickly anyway.

Because links have two audiences: (a) search engines, which may use that link to rank your page better in search engine results and (b) humans, who will click on links that look interesting or useful to them.

In this case, your target audience is (b), people.

When writing backlinks for people, you have to give something they’ll be interested in. On DeviantArt, if they see an excellent portfolio of art, photos, or other kinds of creativity, some visitors will follow your link to see more of your creative work hosted elsewhere. Note that just because DeviantArt itself has a huge amount of traffic doesn’t mean your account will. As with Twitter, Facebook, or other social sites, you’ll only get traffic if you participate in and/or post really good stuff that attracts a following.

But if you are an artistic person like Flynn here, and upload stuff regularly, you will attract a following. You could then direct some of that following to a Zazzle store, Squidoo gallery, or blog where you showcase your stuff.

By the way, Digg, StumbleUpon, and many social media sites create outlinks the same way as DeviantArt: they are stored in a non-standard, in-house format, and then a script untangles them and sends the user to the real link. So everyone measuring links from those social sites as backlinks is missing the boat. Those may help Google index a page, but they probably don’t count much as far as helping a page rank better. As with DeviantArt, those links won’t help much for traffic unless you’re an active, contributing member of those communities who has gained a following by frequently posting good stuff of the kind that community tends to like.

Hubpages’ subdomains approach is forward-thinking

This is another of my off-the-cuff observations not backed up by evidence, but I really like one approach Hubpages has taken to recover from Panda: establishing author-based subdomains.

On the one hand, this means backlink churn. They’ve got redirects in place, but any time you shift the URLs of part of a website, there are bound to be problems. Those will iron themselves out over time.

But on the other hand, this makes it much, much clearer who’s written what. Is everything in one subdomain scraped garbage? Fine, penalize it. But if another subdomain has unique, well-written content with sound links to related content, don’t give it a penalty because of Jane Q. Scraper/Spammer in the next subdomain over. It’s the same principle as web hosting from the last decade. There’s quite a mix of websites on the hosting service where I’m posting this blog, and search engines don’t judge us the same way.

There’s one other piece of the puzzle that Hubpages and Squidoo are getting half right.

Both Hubpages and Squidoo have added a hidden rel=author link from individual articles (lenses, hubs) to the member’s profile page. Good. That makes clear that the member is the author of all those pages.

But as Marisa Wright of the HP forums reminded me, there’s something more to do. There needs to be a rel=”me” field on our Squidoo and Hubpages profiles linking to our Google profile; otherwise Google won’t confirm the authorship, and won’t count our suite of articles as our own work, separate from the rest of the site.

Update: Squidoo has now implemented this field. (And it didn’t matter anyway, since we could add a rel=”me” link manually, but still, the field makes it easier.)
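Put together, the chain described above looks something like this. (A sketch only: the URLs are placeholders, and the rel=”author” link the site inserts is hidden in the page markup rather than shown as a visible link.)

```html
<!-- On each lens or hub: point to the member's profile page -->
<a href="http://www.squidoo.com/lensmasters/yourname" rel="author">yourname</a>

<!-- On the profile page, via the new field: point to your Google profile -->
<a href="https://profiles.google.com/yourprofileid" rel="me">My Google Profile</a>
```

Without that second link, the chain dead-ends at the profile page and Google can’t tie the articles to a confirmed author.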

Mormon Search Engine Optimization

Wow. You learn something every day. This post got long, so I turned it into a Hub:

The Church of Latter-Day SEO

SEO basics and ethical questions raised by the Church of Latter-Day Saints’ grass-roots SEO campaign.

Check Your Article in Google’s Cache

Do you know what Google knows of your article?

The result may surprise you.

Take the URL for your article, hub or lens and Google the following:

cache:http://www.squidoo.com/yourlens

If you see spinny “loading” icons, that means Google doesn’t have that part of your page indexed. Google will only send search traffic based on the content it can see and index.

Other places you can check:

  • Webconfs’ Spider Simulator, especially useful to see what links are crawlable on your page
  • The Internet Archive: does it have a copy of your page yet? If it does, search engines most certainly do.

SEO Tip: Is That the BEST You Can Do?

For obscure reasons that Glen and Janet will understand, I’m going to call this the Potato Chip Challenge.

In the past, we’ve learned that adding “unique” to gift-related keywords captures long-tail searches. I have also observed that the word “stuff” can collect people who are searching vaguely for interesting, er, stuff. As in “stuff about volcanoes.” “Review” gets people looking for “[product name] review” before making a purchase, and as I noted in a previous post, people often search for types of products, news, movies, etc. by appending the year to the search (“lcd TVs 2011”).

Well, here’s another to add to the list. I’d been doing it already, for some topics where keyword research suggested a match, but I hadn’t consciously added it to my toolbox of pointless yet useful qualifiers: “best.” I’ve got Best Books on Greek Mythology, for example.

Here’s the Potato Chip Challenge: Take a lens where you’re reviewing several of the same kind of thing — or even one thing, if you’re really sure it’s a good one — and open its traffic stats, the detailed stats where you’ve got all the keywords that have brought visits to your lens. Set the time span to “90 days.”

Now, open another window to edit the lens. Add “Best” to the title. Work in “best” next to the main keyword in a few places on the lens where it sounds natural. IMPORTANT: As you edit, keep an eye on your traffic stats to make sure you don’t accidentally delete/screw up a phrase that’s bringing you traffic.

Publish and use SquidUtils’ workshop add-on to ping the updated lens.

Wait! You’re not done yet. Look at the traffic stats again. Open a text document, jot down the date, and record the weekly and monthly traffic totals. Copy and paste the complete list of keywords. Save the document as “potato chip” in your Squidoo projects folder.

Come back in a month and compare traffic stats (keeping in mind that shopping-related traffic often dips in summer and rises in the fall). Hopefully, “Best [thingie]” will now be part of your lens traffic.

I don’t know how successful this will be, but based on observations, it looks like an experiment worth trying. Please report results one way or the other, if you give this a try!