Squidbits – Greekgeek's Squidoo Blog

September, 2011:

Subdomains: That Is the Question

Thanks to Hubpages’ June 2011 experiment in subdomains as an attempt to get out from under Panda, Squidoo is in the beta testing stage of something similar. Hubpages’ subdomain experiment picked up a lot of buzz when it landed in the Wall Street Journal, and I was one of many who were excited by the possibilities, since I thought it made sense. SearchEngineLand, one of the better SEO journals out there, made cautious noises and checked with Google (see that article for Google’s response).

Based on Google’s responses and Hubpages’ traffic rebound (see below), I thought subdomains couldn’t hurt and might help, and said so. However, after more pondering, I’ve joined the ranks of Squidoo members who are concerned. Apologies for the about-face. Let me explain.


“What Do You Want?”

That headline won’t make any sense unless you’re a Babylon 5 fanatic. (And if you love any science fiction, fantasy, or thought-provoking fiction whatsoever, go find it and watch the first four seasons).

Ahem. I had a really good pow-wow with a couple of Squids yesterday, 2muchtrash and her partner. We talked about successes and failures, and about the challenges of making money and getting traffic to our articles. We picked our collective brains. I coredumped everything I know about succeeding on Squidoo (which alas is still too scanty on earnings; I spent 3 of the last 4 years experimenting with ways to increase traffic).

For all my tips, advice, and tricks, once again I was making the same error most of us do. We discuss the power of backlinks, the use of nofollow, the use of rel=”author”, SEO, encouraging clickouts, interlinking content, time spent on lens, boosting lensrank, article marketing and keyword research and all these other little techniques for maximizing traffic and profits and lensrank and…

phew!

Even as we start to get comfortable with our skills, search engine algorithms and the way people browse the web keep changing. Even if they did not change, we never know exactly what Google, Bing, or all these different browsing platforms are optimized for: they never tell us, so people won’t game the system. All we know is (a) our own areas of expertise and (b) what visitors are doing on our pages, more or less.

 

So once again I return to lens stats and traffic stats, digging up whatever information I can about user behavior. Because that’s where the staying power is, the one thing we can master that will survive the web’s inevitable evolution. We need to keep asking ourselves: What do our visitors want? What do web users look for, and need, and enjoy? What do the comments we receive tell us about our visitors?

Those questions still sound like, “What can I get out of my visitors, and how can I use their behavior to my advantage?” So let’s turn them around and ask: What are we giving our visitors? What function do our webpages serve? What GOOD is our content? Not “how good is it?” but really, truly, what purpose does it serve? What can people get out of it?

Why do we spend so much time pondering backlinks and stats and keywords and Tweets and not what are we doing, what are we creating, and how can we make our content more useful, readable, interesting, and/or entertaining?  What about article structure and form: not simply heat maps and click maps, but “How can I make this page as functional as possible?” the way Apple did when it designed the iPhone/iPad interface. We know content quality matters. So how can we improve our content? And why do we spend so little time thinking about it?

Look at your own articles with the eyes of a total stranger who has the whole web to browse. Why yours? Is your content really good enough to hold somebody’s attention? If not, what’s missing? What kind of webpages appeal to you, and why do you find yourself reading them, visiting them, clicking links on them, or buying from them?

I don’t have great answers, how-tos, or tutorials on how to make fantastic, useful content. All I can do is suggest we be self-observers of our own web behavior, looking to see what we like and use. We can also monitor visitor stats, trying to discover what visitors like and respond to. It’s psychology (deducing what users want); it’s research (building exceptional content that isn’t simply rehashing what’s already all over the web); it’s writing craftsmanship.

Quality isn’t everything: the web is so vast that people may never stumble across even an excellent page. But really unique, excellent, useful pages have at least as good a chance of long-term success as ones put together strategically, following certain tips, checklists, techniques and “how to” rules.

Silly moneymaking tip…

I’ve got just one word to say to you:


“The Graduate”

Or, wait, two words…

I noticed this phenomenon several years ago when I converted a seminar paper on the Egyptian god Thoth into a Squidoo lens.

People love action figures. Even in this economy, and especially if it’s something for which one would not expect an action figure, they’ll click on small thumbnail images of action figures to get a better look. Sometimes they’ll buy. Sometimes they’ll buy something else on Amazon instead.

There are sports-figure bobbleheads, politician action figures (and voodoo dolls), and collectible action figures for nearly every movie and video game and for most pop music stars. There are collectible Greek mythology action figures. There’s Seth Godin and his mismatched socks.

The eco-friendly part of me shudders at promoting collectibles, because they’re plastic, plastic, plastic, and they’re a waste of resources. Mea culpa.

The pragmatic part of me says that they pay the bills. And I’m really fond of the one that sits on my computer guarding my hard drive.

So there’s a thought, which I’m offering in lieu of intelligent commentary on Squidoo’s experiments with subdomains and “digest”-style magazines, following the Hubpages subdomain success that has the Panda-watching world in a tizzy. Other than this: it’s worth testing.

Claim Authorship of Your Content on Google

Claiming authorship of your unique, original content could help your content rank better in Google, if Google determines that you generally write good content. It also might help Google find your new content faster, since it will check your author profile (lensmaster profile) from time to time. Most importantly, if you establish yourself as the author of content in Google’s eyes, it will privilege the original content above that of scrapers.

The downside is that while HTML lets you attribute your content to any username, Google will only recognize your authorship if you link it to a Google profile with your real name and a photo. This is a serious problem for millions of web users who have privacy concerns, especially minors and women who are sometimes targets of stalkers.

But if you already have a Google+ account, and/or you’re willing to take the risk, here’s what to do:

How to claim authorship with
rel=”author” and rel=”me” : a Squidoo Tutorial
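In outline, the pattern is a two-way handshake: a link from the page you wrote to your Google profile tagged rel=”author” (or rel=”me” from a bio page), and a link back from that profile to where you publish. The URLs below are placeholders, not real profiles, so treat this as a sketch and check Google’s own documentation for the authoritative details:

```html
<!-- On the lens (or your lensmaster bio page): point at your Google profile.
     The profile URL here is a placeholder. -->
<a href="https://plus.google.com/00000000000000000000/" rel="author">Greekgeek</a>

<!-- On the Google profile, in its links section: point back at the place
     you publish, closing the loop. -->
<a href="http://www.squidoo.com/lensmasters/greekgeek" rel="me">My Squidoo lenses</a>
```

Google only confirms authorship when both ends of the loop are in place; a one-way link isn’t enough.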

I did this at the beginning of September, and saw my traffic spike across most of my lenses. See my Squidoo Stats for the week of Sep 4-10, showing my weekly traffic jumping from about 12,500 to 15,000, and this chart of my top 25 lenses by lensrank:

 

Traffic increase a week and a half after implementing rel="author"

 

I wish I knew whether these traffic spikes were coincidence or significant. I did not see similar almost-across-the-board traffic increases from other search engines; some were up and some were down. If you’re an established web author with a lot of good content on the web, I’m curious to know whether you’ve seen similar results after a week or two of hooking up your content to your Google profile with rel=”author” and rel=”me”.


Panda 2.3, Hubpages, and a Suggestion for Zazzle Members

By the way, Google ran the Panda algorithm again on or about July 25.

What this means is that every month or so, someone at Google pushes the “Panda button.” Panda then reassesses the quality of content on each domain versus the amount of junk/spam on it, and gives that site, shall we say, a Panda Rating. That Panda Rating then becomes one of the factors Google’s everyday search algorithm uses to decide how well to list a page in search engine results. Panda’s rating is apparently a fairly strong factor, as traffic on each domain tends to rise or fall together, unless individual pages on that site have acquired enough other factors (say, backlinks from highly-respected sites) to offset the Panda factor.
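As a toy illustration only (my own sketch with invented numbers and weights, not Google’s actual code), the behavior described above amounts to blending per-page signals with a domain-wide rating:

```python
# Toy model: how a domain-wide "Panda rating" could act as one factor
# alongside per-page signals. The weights and scores are invented for
# illustration; Google's real algorithm is unpublished.
def page_score(page_signals, panda_rating, panda_weight=0.4):
    """page_signals: 0-1 score from per-page factors (backlinks, relevance).
    panda_rating: 0-1 domain-wide quality score, refreshed every month or so."""
    return (1 - panda_weight) * page_signals + panda_weight * panda_rating

# A strong page on a poorly-rated domain can still outrank
# a weak page on a well-rated domain:
strong_page_weak_domain = page_score(0.9, 0.2)   # ≈ 0.62
weak_page_strong_domain = page_score(0.3, 0.8)   # ≈ 0.50
```

In a model like this, all of a domain’s pages rise or fall together when the monthly rating changes, yet a page with strong enough independent signals can still buck the trend.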

Good news for Hubpages members: reorganizing Hubpages along subdomains has helped many of you by partitioning your content off from spammy members’ content. That helps convince Panda to judge your content on its own merits rather than lumping it in with everything else on the Hubpages domain. So Hubpages is now slightly outperforming Squidoo, ezinearticles, suite101, and other open publishing sites (as opposed to sites that vet content with an editorial board, which Panda is going to like better). We can clearly see Hubpages getting an uptick from Panda 2.3 at the end of July:

Now, wait, why did I put DeviantArt on there? A hunch. Just look at all that traffic! I think Zazzle members should have a DeviantArt account where you showcase some of your work and link to your Zazzle gallery and/or accounts on Squidoo and HP where you showcase more of your work.

DeviantArt has an advantage over sites like HP and Squidoo, as you can see. A social community that appeals to a large niche market (share your art! writing! photography!) would get tons of traffic even if search engines didn’t care diddly-squat for it. Members market it by pointing friends, relatives, and peers to their stuff. Search engine traffic, for DeviantArt, is a bonus on top of the social buzz it generates.

Now, don’t all run out and create DeviantArt accounts for the purpose of spamming DA with backlinks. That won’t help much for SEO purposes. DeviantArt does not let you link directly out to some other website. Instead, when you enter links on a DeviantArt page like your profile, they’re stored in a special in-house format, which is deciphered by a script only when a user clicks the link.

For instance, here’s our friend Flynn the Cat on DeviantArt. Hover over that link in Flynn’s sidebar and see what the URL is:

http://www.deviantart.com/users/outgoing?http://www.squidoo.com/flynn_the_cat

I bet that Google, at least, is clever enough to detect the hidden URL in there and crawl it for indexing purposes: “Aha, there’s a webpage at http://www.squidoo.com/flynn_the_cat.” But indexing is not the same as ranking. This link probably doesn’t count as a backlink, when Google is checking backlinks as one of the factors it uses to decide how high up to list a page in search engine results.
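A quick sketch of that untangling in Python (the function name is mine; the format is simply “real URL after the ?”):

```python
from urllib.parse import urlparse

def outgoing_target(url):
    """Recover the destination hidden in a DeviantArt 'outgoing' redirect link.
    Everything after the '?' is the real URL; other links pass through unchanged."""
    parsed = urlparse(url)
    if parsed.path == "/users/outgoing" and parsed.query:
        return parsed.query
    return url

print(outgoing_target(
    "http://www.deviantart.com/users/outgoing?http://www.squidoo.com/flynn_the_cat"
))  # http://www.squidoo.com/flynn_the_cat
```

A crawler that knows this convention can recover the destination just as easily, which is why I suspect Google indexes these targets even if it doesn’t credit them as backlinks.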

So why bother with backlinks on DeviantArt, if they don’t count for SEO? Pages on Hubpages, Squidoo, etc get indexed / crawled pretty quickly anyway.

Because links have two audiences: (a) search engines, which may use that link to rank your page better in search engine results and (b) humans, who will click on links that look interesting or useful to them.

In this case, your target audience is (b), people.

When writing backlinks for people, you have to give something they’ll be interested in. On DeviantArt, if they see an excellent portfolio of art, photos, or other kinds of creativity, some visitors will follow your link to see more of your creative work hosted elsewhere. Note that just because DeviantArt itself has a huge amount of traffic doesn’t mean your account will. As with Twitter, Facebook, or other social sites, you’ll only get traffic if you participate in and/or post really good stuff that attracts a following.

But if you are an artistic person like Flynn here, and upload stuff regularly, you will attract a following. You could then direct some of that following to a Zazzle store, Squidoo gallery, or blog where you showcase your stuff.

By the way, Digg, StumbleUpon, and many social media sites create outlinks the same way as DeviantArt: they are stored in a non-standard, in-house format, and then a script untangles them and sends the user to the real link. So everyone measuring links from those social sites as backlinks is missing the boat. Those may help Google index a page, but they probably don’t count much as far as helping a page rank better. As with DeviantArt, those links won’t help much for traffic unless you’re an active, contributing member of those communities who has gained a following by frequently posting good stuff of the kind that community tends to like.

Squidoo Pay Day Coming: Two Things to Check

Squidoo Pay Day is almost here. Someone usually posts a thread in SquidU when earnings start showing up in our dashboard.

You can find them by clicking the “stats” link under an individual lens, then the “earnings” tab. There’s an Ad Pool “earnings” amount showing for 7/30. That’s July earnings, which will be paid in September.

Nice to know, but first something to check: are your Payment Settings correct?  It’s a bummer when a charity drops from Squidoo’s list, so that your donation to your favorite charity goes to another instead. It’s even more of a bummer when you make a co-brand lens that sends all your earnings to charity by default, or when Squidoo glitches and sends your earnings to charity. I’ve started taking a screenshot of all my lens payment settings for my records.

You get to the “Payment Settings” overview of which lens is set to donate to which charity by clicking “My Settings”  at the upper right of Squidoo’s control strip, then “Payouts,”  then scroll down and click “Individual Lens Settings.”

One more thing. Do you have multiple accounts? There are advantages and disadvantages to niche accounts. One disadvantage is that you pay the Paypal transaction fee on EACH account, which is (I think?) something like 2%, capped at $1. I’m a little worried about Squidoo glitches and the hopper, but I’ve just raised the payout threshold on my accounts to $50.
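If the fee does work that way (a percentage with a cap; again, I’m not certain of the exact numbers), the per-account arithmetic is simple:

```python
def paypal_fee(payout, rate=0.02, cap=1.00):
    """Fee per payout transaction: a flat percentage up to a maximum.
    The 2% rate and $1 cap are as I recall them -- check PayPal's current terms."""
    return min(payout * rate, cap)

# Two niche accounts each paying out $50 pay the capped fee twice:
print(2 * paypal_fee(50.00))   # 2.0
# One combined account paying out $100 pays the cap only once:
print(paypal_fee(100.00))      # 1.0
```

Which is the niche-account disadvantage in a nutshell, and why raising the payout threshold at least reduces how often each account pays that fee.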