Greekgeek's Online Odyssey - Hubpages and Online Article Writing Tips

linkbuilding

My Thoughts About Hubscore on Hubpages

Many Hubpages members are anxious about Hubscores. I feel almost the same way about them as I did about Squidoo points and levels: they’re in-house metrics that have no bearing on my success as an online writer. From time to time, I’ll glance at my dashboard to see which of my articles Hubpages scores as the best, partly to feel a tingle of self-satisfaction, and partly to get a general sense of the kinds of articles Hubpages prioritizes. (In that case, I’m judging Hubpages: never mind how it arrives at those scores, are the articles it’s giving 90s to actually good articles? If Hubpages starts ranking junk above what I consider quality, then I’ll worry.)

As I mentioned in my previous post, my recent Hub of the Day had a Hubscore of 83. That shows just how futile it is to chase Hubscores. Looking at all my Hubs with better scores, a quick glance showed no obvious patterns. Some had more comments, some more traffic, some more user interaction, some more or fewer words, some more or fewer photos. Maybe I could figure out why they outrank my HotD if I analyzed them all carefully, but it’s just not important. (Especially since I agree that, while it’s a successful article, it’s not my very best work.)

The only reason Hubscore matters is that it’s Hubpages’ requirement for DoFollow links. If your overall Hubberscore drops below 85, or an individual Hub’s score drops below 40, then links in your article are set to NoFollow. That means Google won’t count them when assessing the value of the sites they point to. Many people actively plant DoFollow links as a way to make their own websites rank better on Google— or so they hope.
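In the page source, that switch is a single attribute. Here’s a sketch of the before and after, with a made-up URL:

```html
<!-- A normal ("DoFollow") link: Google counts it when ranking the target site -->
<a href="http://example.com/my-site">My website</a>

<!-- The same link once the score drops below the threshold:
     rel="nofollow" tells Google not to pass ranking credit -->
<a href="http://example.com/my-site" rel="nofollow">My website</a>
```

The visible link works exactly the same either way; only search engines treat the two differently.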

Personally, I’ve stopped caring whether links on my hubs are DoFollow. My websites, blogs and good articles attract plenty of dofollow links on their own from other people linking to them.  That spares me from worrying about Google cracking down on various kinds of spammy backlinks. What Google wants to see (and use as a ranking factor) is links from other people who genuinely find your content valuable. To that end, Google’s Penguin algorithm is designed to filter out self-promotional links which tell it nothing about how good the content actually is.

Many years ago, I decided that the best thing to do is not to build my own backlinks, but to try to build content that’s worth linking to. As long as I do that, my Hubscore generally stays above 85 anyway. (I just checked, and all my Hubpages accounts are 85 or better except one, which is an old test account.)

Squidoo NoFollows Links to Strangle Spam

Once again, spammers pooping in Squidoo’s sandbox have caused inconvenience for the rest of us. Squidoo has now followed Wikipedia’s example and nofollowed all outbound links.

My opinion: it stings, but on the whole it’s a reasonable move. There are some drawbacks to nofollowing all outbound links. But those drawbacks are outweighed by the benefit: this will discourage those using Squidoo as a place to drop self-serving links, and encourage the use of Squidoo as a place to post actual content. For obvious reasons, Google prefers the latter.

Lakeerieartist concurs, in this succinct post on SquidLog: Spammers Be Gone.

Let me see if I can explain nofollow/dofollow links in plain English for those who don’t understand what this is all about.
(more…)

Flattery Bots: The New Paid Blog Networks?

Fastidious respond in return of this difficulty with genuine arguments
and explaining the whole thing regarding that.

~ Comment I just removed as spam from this blog’s “pending comments” box this morning. The username included a link back to what looks like a linkbuilding site in Germany (I didn’t check).

wow, I’ve been following your blog for a while, and I’ve recommended this post to all my followers!

~ Comment from someone with a link pointing back to a vast student essay slush site (where students can buy a paper to turn in and pretend it’s their own work).

Remarkable lens.

~ Comment I received from a new lensmaster on a page comprising a few word-search and crossword puzzles. Fun? I hope so. Useful study aid? Sure. “Remarkable?” Hardly.

Sometimes, the comments are genuine. I’ve made some good lenses, and commenters are kind enough to say so.

But for the past month, I’ve had my ego stroked by a vast influx of generic flattering comments. The common thread is that nothing in the comment matches the lens content; they don’t seem to fit the lens at all. They remind me of the worst-case generic rejection letter my aunt parodied when trying to get a short story published:

Thank you for your thing. We are not accepting things at this time. Please do not send us any more of your things.

Nobody likes rejection letters. But everyone likes flattery. Therefore, if you’re looking to build up backlinks, design a bot that leaves human-sounding compliments plus a link back to your site. Or, if you’re trying to build up a social network, create a bunch of fake accounts on social sites and have the bots schmooze up the locals, building a following and getting a percentage of that site’s members to check out the account that flattered them (and the promoted link in its profile).

The timing of this flood of flattery is suspicious: it started up in April 2012 right after Google jettisoned links built by BuildMyRank and other paid blog networks (creating automated fake content on fake blogs, planting links in the posts to the websites of paying customers). Links from spam sites are no longer a way to get your site listed. So paid linkbuilders are now trying to Trojan Horse their links onto reputable sites that still have good standing in Google. Flattery is even more effective than a horse-shaped sculpture on wheels!

I don’t know for certain, but I think the paid linkbuilders are now using the Flattery Gambit to get their backlinks on your webpages. If the comment could apply to any written piece of material whatsoever (“this is really insightful”), I’m suspicious. Doubly so if it’s written in English as a second language, as was the comment I quoted at the start of this post.

On Squidoo, there’s an additional, insidious use of flattery: reciprocal visits and “likes” can boost lensrank, leading to high payouts. So why not build a bot that leaves friendly comments on thousands of lenses in order to get your lens to tier one within a week of starting on Squidoo, as Tipi just reported? Or, even if you don’t have access to a bot, why not scramble around leaving short comments on everyone’s lenses to get lots and lots of reciprocal likes, boosting your Squidpoints and payout rank? (I’ve just had one lensmaster leave about 20 lame comments like “who are the girls?” on a long article reviewing a video game with female leads; it’s obvious from his comments that he’s only looking at the lens title and graphic. But at least he’s commenting on specific lens content, unlike the majority of the comments I’m talking about.)

Unfortunately, the upshot of all of this is that the bots and vague flatterers are making me less appreciative than I should be of genuine, sincere, friendly comments which someone thoughtfully took the time to make! I now delete most variants of “nice lens,” unless there’s some clue in the comment to tell me the person read the article. On some days, I hit the point of saying to myself, “If the comment isn’t useful to my readers, and it’s just directed at me, there doesn’t need to be a public record of it. The message was received, after all.” Doubtless, when I’m in that frame of mind, some real comments get caught up in the lawnmower blades. I’m inconsistent in my comment moderation, letting some stand and removing others.

If I’ve pulled up one of your comments during an attack of weeding, I apologize. It’s easy to pull up a flower by accident! Thank you for your comment — really! I’m sorry the bad behavior of some people is making me cynical regarding genuine forms of human courtesy. (Or, more likely, I haven’t actually deleted your comment; my comment backlog may have swelled to 50+, at which point I fail to keep up with moderation for a while.)

Backlink Seekers Target Squidoo For Pagerank

Pagerank is a measurement Google came up with in the late 1990s to help it decide how highly to rank webpages, based on which webpages linked to that page (backlinks) and which pages it linked to. Nowadays, Pagerank is only one of 200+ factors that Google uses to decide how high up to list a webpage in its search results. Google has come up with many ways to detect relevance to a particular search query, making Pagerank somewhat obsolete. (See this post by Google spokespundit Matt Cutts for an explanation of Pagerank.) Nevertheless, many old-time backlinkers are convinced that Pagerank is still the number one factor in making webpages rank well in Google, so they keep trying to find webpages with Pagerank on which to plant backlinks.
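For the curious, the formula from Brin and Page’s original paper makes the idea concrete: a page inherits a share of the rank of every page linking to it, divided by how many outbound links each of those pages has.

```latex
% PageRank of page A, given backlinking pages T_1 ... T_n.
% C(T_i) = number of outbound links on page T_i; d = damping factor, ~0.85.
PR(A) = (1-d) + d\left(\frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)}\right)
```

That division by C(T_i) is why a link from a page stuffed with hundreds of other links passes almost nothing.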

Squidoo is a target for these pagerank-seekers. It’s six years old, and many of its older articles have good pagerank. (Many of my older lenses are pagerank 3 to 5, which isn’t bad).

Squidoo is a web 2.0 website with multiple opportunities for visitors to leave links: guestbooks and link plexos and duels. If you leave a guestbook or link plexo unmoderated — and even if you don’t — link spammers will hit your lenses, trying to exploit your pagerank to boost their own rankings. Linkspam is not harmless. If your webpage links to bad neighborhoods, to sites that engage in shady linking practices, or to a lot of non-relevant content, those links could lower the quality, trustworthiness and relevance of your article in Google’s eyes.

Link spam has always been a problem on Squidoo, but two events within the past year have made it more of a target. First, it has been largely unaffected by Google’s Panda algorithm updates, which demoted a huge number of other websites. Second, on March 19, 2012, Google did a major algorithm tweak which de-indexed (removed from Google results) a batch of paid blog networks and other websites whose sole purpose was to publish thin, computer-generated content which appeared to be real articles, and which contained links to sites that paid them to feature those links. People were paying linkbuilding services to create backlinks for them in this way. Now, suddenly, those backlink sites are worthless, and some paid linkbuilding services like “BuildMyRank” have actually shut down.

All the sites which those backlinks pointed to have now lost standing in Google search results. The linkbuilders behind them are now searching for new places to plant backlinks to replace those they lost. Any blog, guestbook, or “submit your links here” widget is a target, especially on websites that still have some pagerank.

These link droppers are getting ever more clever about disguising what they’re doing so that you let their link through. Today I deleted two comments left on this blog saying it was a very well-written blog, asking me if I coded it from scratch, or saying that the person liked my blog so much he tweeted it to all his followers. It sounded like real humans had written these comments. However, the generic reference to “your blog” without any reference to its subject matter was a dead giveaway that they were cut-and-paste comments being dropped on any old blog. Their usernames included backlinks to their websites. They were using not only flattery, but one of the “six persuaders”: reciprocity. If someone does something for you, it’s human nature to feel you should return the favor in some fashion. (Hence the “I tweeted this to all my followers” ploy, which I’ve seen on several link drops lately.)

I’ve also received a flood of emails from people offering to pay me to put a link to their sites on my lenses.

Don’t be fooled. Google just dropped or demoted a whole bunch of domains these link droppers used to try and make their own sites rank better. You don’t want your blog, lens or website to be showcasing links to the very people Google just penalized for shady backlinking practices and shallow content. Your lens could get hit by the same algorithm filter that demoted the sites they were using for backlinks before.

Your sole criterion for allowing a link onto your page should be the benefit it gives your readers. Is the site it links to useful, helpful, interesting, and strongly relevant to your subject matter? Will your readers be interested in it? Then approve it. Is it off-topic, or would readers who clicked on it be disappointed? Reject it.

By making sure your lenses only link to good, relevant content that is useful to your readers, you’ll not only make that particular article look good to Google. You’ll help keep Squidoo from looking like “a place for spammers to post their links.” By keeping our own lenses spam-free, we ensure that Squidoo continues to be ranked well by Google and doesn’t get hit with a Panda penalty (which would cause a traffic drop for all pages on Squidoo).

Are Cross-Links About to Get Google-Punched?

Uh oh. Remember how I noticed the murmurs about content farm penalties back in January 2011, and got scoffed at for suggesting Google was going to be unrolling domain-based rather than single-page-based penalties?

Weeeell, I don’t like the sound of this. Something in seoMOZ’s whiteboard Friday vid this week caught my eye:

Here’s the part that concerns me:

We’ve particularly seen this in the past few weeks with Google cracking down on linked networks, low quality links. They announced that they are going to be devaluing more administrative links. By administrative links, we mean sites that are related to each other. So if you own a network of 50 sites and you’re interlinking all of those to each other, the value that you are getting is over-optimizing. They are going to be diminished, and you could face a penalty. If you do it too much, you could face de-indexing, like we’ve seen Google do with the linking indexes.

I cannot find the source for this: where has Google announced it’s about to crack down on administrative links (cross-links between our own content on different sites)? But actually, it makes sense that Google would treat links we build to our own content as less value-passing than links other people have built, since self-promotion is not the same as third party recommendation. Furthermore, since Google (and Bing) define webspam as artificial practices designed to boost ranking in search engines, it will crack down on any linking practices — such as building a whole bunch of websites and cross-linking them to simulate backlinks — that are designed primarily for that purpose.

Once again, there’s one thing that worries me, and one thing that doesn’t.

I don’t care if Google decides to treat those links as less important. Many people mistake Google ignoring signals it used to weight more heavily for a penalty; the effect can be catastrophic if you relied too heavily on those signals, but it isn’t the same thing.

But there is a difference between “Google starts ignoring X…” and “Google starts penalizing X.” I may do things that Google pretty much ignores: they could be of benefit to my readers. What I try to avoid is things that I believe Google may actively penalize. (For example, since Google is on record as penalizing paid links, I do not use Redgage, even though it may be perfectly safe.)

I’m not saying I’m going to stop cross-linking my sites, articles and content: that would be a silly knee-jerk reaction, and I’m still not entirely sure what Cyrus Shepherd’s possible “administrative link penalties” will entail. After all, prior to Panda, the punditsphere was full of people predicting the demise of “Content Farms,” expecting Google to create some sort of blacklist of user-generated sites like Blekko did, and just penalizing those. In fact, Panda worked in an entirely different way. So we don’t yet know what form Google’s announcement will take when it’s implemented. (WHERE is this announcement?) But it’s time to brace, just in case.

To avoid possible algorithm tweaks in the future, it may be time to reconsider whether our cross-links are for our readers’ benefit or for ours.

If this “administrative linking” algorithm adjustment materializes and is confirmed by reputable sources, I’m going to watch my author-linked content closely compared to my alternate pen name content, which is not linked to my real name, my “Greekgeek” pseudonym, or my Google profile. It will be interesting to see whether the network of blogs, articles and content Google associates with my author name drops in rankings while the stuff associated with no particular author name (and thus missing the authorship benefit) stays unchanged.

I also want to leave you with a word of wisdom picked up from a guest interview at seoBook (I do not necessarily endorse most of what Aaron Wall says, and I am a “useful/exceptional content and on-page optimization” advocate rather than a professional backlinker like Jim Boykin, but still):

SeoBook: Google recently nailed a bunch of lower quality bulk link networks. Were you surprised these lasted as long as they did? Was the fact that they worked at all an indication of the sustained importance of links?

Boykin: Well…surprised…no… filtering out networks is something that’s always going to happen….once something gets too big, or too popular, or too talked about…then it’s in danger of being burned… the popular “short cuts” of today are the popular penalized networks of tomorrow.

Emphasis mine. They’re talking about BuildMyRank and other link/blog networks getting deep sixed by a recent Google penalty, but the wider message is a Google variant of Tall Poppy Syndrome: various tricks will work for a while to draw traffic, boost lensrank, or succeed in any sphere where success is measured by a computer algorithm, but once a particular strategy for gaming the system becomes popular, then, sooner or later, the algorithm maker will notice and attempt to thwart the tactic. (And the collateral damage is sometimes more devastating to innocent bystanders than those the algorithm tweak is meant to thwart.)

Rel=”me” Rel=”author” UPDATE for Squidoo lensmasters

I just got a note from Gil on my Rel=”author” Squidoo tutorial. (Thanks, Gil!)

The slots on our Squidoo Profile for “other profiles” (Facebook, Twitter, MySpace) are now labeled with rel=”me” automatically. So is the “My Blog” slot.

More importantly, Squidoo has now added a slot on our lensmaster profile for a link to “Google Plus” (which will work just fine for a regular Google profile account as well). This link is automatically marked with rel=”me” in the code.

Therefore, in order to connect your Squidoo lenses to your Google profile, the process is now:

  1. Create a Google Profile
  2. Edit your Google Profile, add a link to your Squidoo Lensmaster Page in the “Other Profiles” box
  3. View your Google profile and copy its URL
  4. On Squidoo, go to My Settings > Profile, scroll down, and paste your Google Profile URL into the “Google Plus” box
  5. Save, and you’re done!
(You don’t have to fuss with rel=”author” at all, because the bio box in the upper right corner of lenses automatically creates rel=”author” from each lens to your lensmaster profile page.)
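For the curious, the markup pattern behind those automatic slots looks roughly like this. (The URLs and username are placeholders for illustration, not actual Squidoo output; I haven’t pasted the live source here.)

```html
<!-- Bio box on a lens: links to the lensmaster profile with rel="author" -->
<a href="http://www.squidoo.com/lensmasters/example-user" rel="author">Example User</a>

<!-- "Google Plus" slot on the lensmaster profile: marked rel="me" -->
<a href="https://profiles.google.com/0123456789" rel="me">Google Plus</a>

<!-- "Other Profiles" box on your Google Profile: links back with rel="me" -->
<a href="http://www.squidoo.com/lensmasters/example-user" rel="me">Squidoo</a>
```

The reciprocal rel=”me” links are what let Google verify that the Squidoo profile and the Google Profile belong to the same person.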

P.S. Remember those slots in our Squidoo Profile that we haven’t been able to access since the Dashboard update? They’re editable again!

Three notes on Rel=”me”, Rel=”author” (They work!)

EDIT: DRAT. I spoke too soon. Google has changed how rel=”author” works, and try as I might, I can no longer get it to recognize authorship with Squidoo pages. Or at least, Google’s snippet validator isn’t recognizing it.

—–

 

Three notes on rel=”me” and rel=”author,” which I talked about last month.

  • It WORKS with an ordinary Google Profile, as opposed to a Google+ profile, if you’re annoyed with Google+ for various reasons. Here’s a screenshot of some Google results showing my author icon, linked to an ordinary Google account, not Google Plus. (All the way at the bottom, but at least it draws the eye.) Ignore the cache on the right… or don’t, because as you see, it’s one more way users may decide whether or not to visit your page:

Notice how the author icon makes my link stand out from other text links on the same page, although perhaps I ought to create and add a “how to” YouTube video as well to see if I can land in that section of Google results.

  • Your author icon will not appear next to your claimed content immediately. Over time, more and more content pages are showing my author icon. For search results that do not show my authorship icon, my author name is not listed either. This suggests that the author icon appears next to authored content AFTER it is re-crawled. 

Therefore, to get the author icon to show up on your older articles, edit and tweak the content, and PING them. (On Squidoo, get SquidUtils’ Workshop Add-on and then click “ping” on the SU link that appears in the “Just published” page.) Or just wait. Google re-crawls everything eventually.

Haven’t implemented rel=me on Squidoo yet? Here’s that tutorial again.

  • Thirdly, Google has CHANGED the way links are listed on your Google Profile. They’ve now been divided into “Other Profiles,” “Contributor to” and “Recommended Links.” The first one, “Other Profiles,” is obviously where you put your Squidoo, Wizzley, Twitter, Facebook and other social media accounts. But what about blogs? I tried moving my blog links to “Contributor to,” and it dropped rel=”me” and tagged those links with rel=”contributor-to” instead. That doesn’t seem right. I’m still trying to figure out where one files blogs.

I think, perhaps, the best thing to do would be to create an Author Profile page on each blog where you are an author, set the other pages/entries on the site to point to that profile page with rel=”author,” and set up reciprocal rel=”me” links between the author profile and your Google profile. In other words, mimic the rel=”author” and rel=”me” setup that I’ve suggested with Squidoo, which we know works (see screencap above). But I haven’t implemented this yet, so I’m not sure I’m right. Why is it so bally complicated? Well, I’m sure we’ll be doing it with our eyes closed just like basic HTML in a few years.
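Sketched as markup, with hypothetical URLs standing in for your own, the blog setup I have in mind would be:

```html
<!-- On each blog post: byline pointing at the blog's own author page -->
<a href="http://myblog.example.com/author/greekgeek" rel="author">Greekgeek</a>

<!-- On that author page: rel="me" out to the Google profile -->
<a href="https://profiles.google.com/0123456789" rel="me">My Google Profile</a>

<!-- On the Google profile, a reciprocal rel="me" link back to
     http://myblog.example.com/author/greekgeek completes the loop. -->
```

Again, this is untested on blogs; it simply mirrors the Squidoo pattern that we know works.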

Claim Authorship of Your Content on Google

Claiming authorship of your unique, original content could help your content rank better in Google, if Google determines that you generally write good content. It also might help Google find your new content faster, since it will check your author profile (lensmaster profile) from time to time. Most importantly, if you establish yourself as the author of content in Google’s eyes, it will privilege the original content above that of scrapers.

The downside is that while HTML has a mechanism for you to establish your content linked to any username, Google will only recognize your authorship if you link it to a Google profile including your real name and a photo. This is a serious problem for millions of web users who have privacy concerns, especially minors and women who are sometimes targets of stalkers.

But if you already have a Google+ account, and/or you’re willing to take the risk, here’s what to do:

How to claim authorship with
rel=”author” and rel=”me” : a Squidoo Tutorial

I did this at the beginning of September, and saw my traffic spike across most of my lenses. See my Squidoo Stats for the week of Sep 4-10, showing my weekly traffic jumping from about 12,500 to 15,000, and this chart of my top 25 lenses by lensrank:

 

Traffic increase a week and a half after implementing rel="author"

 

I wish I knew whether these traffic spikes were coincidence or significant. I did not see similar almost-across-the-board traffic increases from other search engines; some were up and some were down. If you’re an established web author with a lot of good content on the web, I’m curious to know whether you’ve seen similar results after a week or two of hooking up your content to your Google profile with rel=”author” and rel=”me”.

Multiple Backlinks from One Zazzle Store

NoFollow backlinks aren’t that useful, but people and Google do follow them. (Yes, Google does follow NoFollow Links, and in fact counts them a tiny bit for Pagerank.)

Also, it’s possible that search engines take notice of how many different domains link to a page. We don’t know, but counting backlink diversity is no more foolish than counting raw backlinks with no idea which of them are actually weighted as relevant to a particular search.

In that context, I was intrigued to discover through SquidUtils’ Backlink Checker that when you build a shop on Zazzle.com, it propagates on Zazzle.co.uk, Zazzle.de, Zazzle.fr and Zazzle.pt. (Where’s that?) Now, links in Zazzle descriptions are nofollow, so the backlink on my Mythphile Shop is not passing much pagerank.

Google is probably sophisticated enough to realize those multiple domains are not totally independent: they’re obviously part of an international network of sites. Also, it sees the duplicate content. (I think the duplicate content scare triggered by Panda has set off a bit of hysteria… a few mirror sites won’t knock your content off the Google SERPs; they just may not rank quite as well, or maybe only one will rank well in each country. Oops, tangent.) Nevertheless, those links have to count at least as much as forum signature links, which Google is also sophisticated enough to recognize as (a) self-promotion, not an unbiased recommendation, and (b) a forum signature — multiple posts with it shouldn’t be weighted any more, or much more, than a single post.

All of this means that you might as well open a Zazzle shop, if you’ve got some visual assets related to your niche.

What kind of assets?

Have you taken your own digital photos related to your topic? Are they photographs of public landmarks, nature, or out-of-copyright (pre-1920 should be safe) products or images? (See this “Legal Pitfalls of Using Photographs” copyright FAQ for more info on what’s allowed.) Commercially-licensed Creative Commons images are also permissible, with credit and a backlink.

Consider making postcards or small prints with them. (Don’t be misleading and print ordinary-sized images on a poster when the original picture is 600×800 pixels; it’ll look awful blown up to poster-size.) Write keyword-rich descriptions. And tie it in somehow to your topic, as I did with my Mythphile Shop. Plant the backlink. It’s not much link juice, but it’s a little. It’s worth expanding your online assets and footprint while creating a possible venue for money-earning.

(This is where I plug my Zazzle tutorial.) Anyway, it’s a thought.

 

Mormon Search Engine Optimization

Wow. You learn something every day. This post got long, so I turned it into a Hub:

The Church of Latter-Day SEO

SEO basics and ethical questions raised by the Church of Latter-Day Saints’ grass-roots SEO campaign.