Greekgeek's Online Odyssey - Hubpages and Online Article Writing Tips


Hubpages – A Grab Bag of Tips and Observations

I became active on Hubpages again in 2011 and churned out a little over a hundred hubs in two years. I have been next to useless on that site for the past year. So take anything I say about Hubpages with a grain of salt. That said, I may have learned a few useful things.


The Great Squidoo vs. Panda Death Match: Are We Having Fun Yet?

[Chart: Recent major Google algorithm updates that have helped or harmed Squidoo, according to Sistrix.com’s “Google Updates” tracking tool.]

So, we all knew that Squidoo had to do some major damage control to rescue itself, since Google’s downranked it for… well, we’re all making educated guesses, but Google’s webmaster guidelines provide us with a list of likely culprits (links are to the specific part of Google guidelines detailing each big no-no):

Most of these are content-related problems which are the responsibility of Squidoo members. Some are in the hands of HQ. Let’s take a closer look at each of these problems and how it’s playing out on Squidoo:


Backlink Seekers Target Squidoo For Pagerank

Pagerank is a measurement Google came up with in the late 1990s to help it decide how highly to rank webpages, based on which webpages linked to a given page (backlinks) and which pages it linked to. Nowadays, Pagerank is only one of 200+ factors that Google uses to decide how high up to list a webpage in its search results. Google has come up with many other ways to detect relevance to a particular search query, making Pagerank somewhat obsolete. (See this post by Google spokespundit Matt Cutts for an explanation of Pagerank.) Nevertheless, many old-time backlinkers are convinced that Pagerank is still the number one factor in making webpages rank well in Google, so they keep trying to find webpages with Pagerank on which to plant backlinks.
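
For the curious, the original Pagerank paper boils down to one recursive formula (this is the simplified 1998 textbook version; whatever Google actually runs today has long since diverged from it):

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + PR(T2)/C(T2) + … + PR(Tn)/C(Tn) )

Here T1…Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a “damping factor” traditionally set around 0.85. In plain English: a page earns Pagerank from the pages linking to it, and each linking page divides its vote evenly among all of its outbound links. That divvying-up is exactly why spammers covet a link on a high-Pagerank page, and why every spam link you approve dilutes the share passed along by your legitimate links.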

Squidoo is a target for these Pagerank-seekers. It’s six years old, and many of its older articles have good Pagerank. (Many of my older lenses are Pagerank 3 to 5, which isn’t bad.)

Squidoo is a web 2.0 website with multiple opportunities for visitors to leave links: guestbooks, link plexos, and duels. If you leave a guestbook or link plexo unmoderated — and even if you don’t — link spammers will hit your lenses, trying to exploit your Pagerank to boost their own rankings. Link spam is not harmless. If your webpage links to bad neighborhoods, to sites that engage in shady linking practices, or to a lot of non-relevant content, those links could lower the quality, trustworthiness and relevance of your article in Google’s eyes.

Link spam has always been a problem on Squidoo, but two events within the past year have made it more of a target. First, Squidoo has been largely unaffected by Google’s Panda algorithm updates, which demoted a huge number of other websites. Second, on March 19, 2012, Google rolled out a major algorithm tweak which de-indexed (removed from Google results) a batch of paid blog networks and other websites whose sole purpose was to publish thin, computer-generated content that looked like real articles and contained links to sites that paid to have those links featured. People were paying linkbuilding services to create backlinks for them in this way. Now, suddenly, those backlink sites are worthless, and some paid linkbuilding services like “BuildMyRank” have actually shut down.

All the sites those backlinks pointed to have now lost standing in Google search results. Their owners are now searching for new places to plant backlinks to replace the ones they lost. Any blog, guestbook, or “submit your links here” widget is a target, especially on websites that still have some Pagerank.

These link droppers are getting ever more clever about disguising what they’re doing so that you let their links through. Today I deleted two comments left on this blog: one said it was a very well-written blog and asked if I coded it from scratch; the other said the writer liked my blog so much he had tweeted it to all his followers. They sounded like real humans had written them. However, the generic reference to “your blog,” without any mention of the blog’s actual subject matter, was a dead giveaway that they were cut-and-paste comments being dropped on any old blog. Their usernames included backlinks to their websites. They were using not only flattery, but one of the “six persuaders”: reciprocity. If someone does something for you, it’s human nature to feel you should return the favor in some fashion. (Hence the “I tweeted this to all my followers” ploy, which I’ve seen on several link drops lately.)

I’ve also received a flood of emails from people offering to pay me to put a link to their sites on my lenses.

Don’t be fooled. Google just dropped or demoted a whole bunch of the domains these link droppers were using to make their own sites rank better. You don’t want your blog, lens or website to be showcasing links to the very people Google just penalized for shady backlinking practices and shallow content. Your lens could get hit by the same algorithm filter that demoted their old backlink sites.

Your sole criterion for allowing a link onto your page should be the benefit it gives your readers. Is the site it links to useful, helpful, interesting, and strongly relevant to your subject matter? Will your readers be interested in it? Then approve it. Is it off-topic, or would readers who clicked on it be disappointed? Reject it.

By making sure your lenses only link to good, relevant content that is useful to your readers, you’ll not only make that particular article look good to Google. You’ll help keep Squidoo from looking like “a place for spammers to post their links.” By keeping our own lenses spam-free, we ensure that Squidoo continues to be ranked well by Google and doesn’t get hit with a Panda penalty (which would cause a traffic drop for all pages on Squidoo).

Are Cross-Links About to Get Google-Punched?

Uh oh. Remember how I noticed the murmurs about content farm penalties back in January 2011, and got scoffed at for suggesting Google was going to be rolling out domain-based rather than single-page-based penalties?

Weeeell, I don’t like the sound of this. Something in SEOmoz’s Whiteboard Friday video this week caught my eye:

Here’s the part that concerns me:

We’ve particularly seen this in the past few weeks with Google cracking down on linked networks, low quality links. They announced that they are going to be devaluing more administrative links. By administrative links, we mean sites that are related to each other. So if you own a network of 50 sites and you’re interlinking all of those to each other, the value that you are getting is over-optimizing. They are going to be diminished, and you could face a penalty. If you do it too much, you could face de-indexing, like we’ve seen Google do with the linking indexes.

I cannot find the source for this: where has Google announced it’s about to crack down on administrative links (cross-links between our own content on different sites)? But actually, it makes sense that Google would treat links we build to our own content as passing less value than links other people have built, since self-promotion is not the same as a third-party recommendation. Furthermore, since Google (and Bing) define webspam as artificial practices designed to boost ranking in search engines, they will crack down on any linking practice — such as building a whole bunch of websites and cross-linking them to simulate backlinks — that is designed primarily for that purpose.

Once again, there’s one thing that worries me, and one thing that doesn’t.

I don’t care if Google decides to treat those links as less important. Many people think that Google ignoring a signal it used to give more weight to is a penalty, and yes, the effect can be catastrophic if you relied too heavily on that signal.

But there is a difference between “Google starts ignoring X” and “Google starts penalizing X.” I don’t mind doing things that Google pretty much ignores, as long as they benefit my readers. What I try to avoid is anything that I believe Google may actively penalize. (For example, since Google is on record as penalizing paid links, I do not use Redgage, even though it may be perfectly safe.)

I’m not saying I’m going to stop cross-linking my sites, articles and content: that would be a silly knee-jerk reaction, and I’m still not entirely sure what Cyrus Shepard’s possible “administrative link penalties” will entail. After all, prior to Panda, the punditsphere was full of people predicting the demise of “Content Farms,” expecting Google to create some sort of blacklist of user-generated sites, as Blekko did, and just penalize those. In fact, Panda worked in an entirely different way. So we don’t yet know what form Google’s announcement will take when it’s implemented. (WHERE is this announcement?) But it’s time to brace, just in case.

To avoid getting caught out by future algorithm tweaks, it may be time to reconsider whether our cross-links are for our readers’ benefit or for ours.

If this “administrative linking” algorithm adjustment materializes and is confirmed by reputable sources, I’m going to watch my author-linked content closely, compared to my alternate-pen-name content, which is not linked to my real name, my “Greekgeek” pseudonym, or my Google profile. It will be interesting to see whether the network of blogs, articles and content Google associates with my author name drops in rankings while the stuff associated with no particular author name (and thus missing the authorship benefit) stays unchanged.

I also want to leave you with a word of wisdom picked up from a guest interview at SEOBook (I do not necessarily endorse most of what Aaron Wall says, and I am a “useful/exceptional content and on-page optimization” advocate rather than a professional backlinker like Jim Boykin, but still):

SEOBook: Google recently nailed a bunch of lower quality bulk link networks. Were you surprised these lasted as long as they did? Was the fact that they worked at all an indication of the sustained importance of links?

Boykin: Well…surprised…no… filtering out networks is something that’s always going to happen….once something gets too big, or too popular, or too talked about…then it’s in danger of being burned… the popular “short cuts” of today are the popular penalized networks of tomorrow.

The key line is the last one: the popular “short cuts” of today are the popular penalized networks of tomorrow. They’re talking about BuildMyRank and other link/blog networks getting deep-sixed by a recent Google penalty, but the wider message is a Google variant of Tall Poppy Syndrome: various tricks will work for a while to draw traffic, boost lensrank, or succeed in any sphere where success is measured by a computer algorithm, but once a particular strategy for gaming the system becomes popular, sooner or later the algorithm’s maker will notice and attempt to thwart the tactic. (And the collateral damage is sometimes more devastating to innocent bystanders than to those the tweak is meant to thwart.)

Bing Still Uses the Meta Keywords Tag!

Uh, oh! Bing still uses the META keywords tag!

META tags. Gotta love ‘em. They are pesky bits of HTML code hidden on (some) webpages to give information about each page. Ten years ago, search engines consulted META tags to help them learn what search phrases each page was relevant for. Then people started manipulating META tags, trying to convince search engines that their pages were the best pages for particular topics simply because their META tags said so. Search engines wised up to this elementary trick (or went bust).

Not that META tags are completely, utterly, totally dead. On rare occasions, Google still uses the META description tag as the page excerpt it quotes in search results. That is, when no other snippet from the page fits the search query better.
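
In case you’ve never poked at one, the description tag is a single line of HTML in the page’s head. Here’s a made-up example (not from any real page):

<META name="description" content="A quick guide to writing search-result blurbs that make people want to click.">

When Google does choose to quote it, that one sentence is effectively your ad in the search results, so it’s worth writing with care.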

The META keywords tag, however, was buried several years ago, by which point even Yahoo/Bing had apparently abandoned it. Keywords as in…

<META name="keywords" content="spam, spam and eggs, spam and bacon, spam spam spam and bacon, and oh hey bing this is the greatest webpage ever on spam, so let me repeat the word spam a few more times, spam spam, spam, spammity spam">

Squidoo fills in the META keywords tag on each lens with your Squidoo tags, by the way. It’s quaint that way.

However — wait! Stop the presses! Our old friend Danny Sullivan has checked with Bing and discovered that Bing still uses the META keywords tag as a signal! 

Woo!

Whee!

Ha!


Digg the SEO Vampire: It Drinks Your Backlinks Dry

Just in time for Hallowe’en, I have an SEO horror story that’s happening right now. You may even be a victim!

You think submitting your page to Digg will help SEO, right? Or at least, it can’t hurt, can it?

Ha. Ahaha. Ahahaha.

In September ’09, Digg announced that links would be NoFollow until they proved themselves worthy (lots of Diggs). And I vaguely remember a flap about the DiggBar totally screwing up SEO. I didn’t follow the story closely because I don’t use social media for SEO: social media means promoting your site to people, whereas SEO means promoting your site to search engines.
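
(A quick refresher, in case “NoFollow” is new to you: it’s an attribute added to a link tag that tells search engines not to pass any ranking credit through that link. It looks like this, with a placeholder URL:)

<a href="http://example.com/my-page" rel="nofollow">my page</a>

Human readers can still click a nofollowed link just fine; it simply does nothing for SEO, which is the whole punchline of this story.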


The “Antwerp Sound of Music” viral video and SEO

I just made a new lens on a popular funny YouTube video, the “Antwerp Train Station Sound of Music” prank.

If you haven’t seen the video, you need to: it’ll make you smile. VERY effective. So far it’s gotten nearly 13 million hits, and that’s not counting all the duplicate copies floating around on YouTube, plus a few million more views on various European video sites.

It’s a great case study in “linkbait,” content that’s so good people start linking to it. (Also known as “viral,” since linkbait this good can spread by word-of-mouth to millions of web users within days, even hours).

It also illustrates an SEO blunder.


Making Search Engine Results Look Sexy, Part II

In SEO Blunders: Very Un-sexy Search Results, I showed off my first Stupid SEO Trick! I’d decided not to worry too much about optimizing this blog, since I don’t want to shell out the money for a second webhost and domain name (a URL is the best spot for keyword optimization, after the page title). However, I did at least want to optimize well enough that people searching for my blog by title would find it; that’s all the more important since the domain name doesn’t match.

Unfortunately, I forgot one of my favorite tricks: make sure your keyword’s first appearance on your webpage is in a sentence that reads well when Google excerpts it in search results.

I took steps to correct the problem. To some extent, my corrections helped, but I still haven’t got it quite right. So here’s another quick lesson in how to shape your search engine results to make them look sexy, or at least what NOT to do.


SEO Blunders: Very Un-sexy Search Results

Remember how I talked about “making your search results look sexy” in my lens on Squidoo and SEO?

I noted that the two-line blurb that shows up in Google search results is your big chance to “sell” your page to the searching public:

A juicy, “I want to read more!” excerpt is what you want people to see. You can’t necessarily bait every possible “long tail” search phrase with juicy verbal bait, but at least make sure your keywords first appear in a sentence that shows off what your page has to offer.

Did I pay attention to my own advice when making this blog?

Look! Look! It took 1 day to get my Squidbits blog to the top spot on Google, even though there’s a surprising number of webpages out there about “Squid bits”!

There’s just one… little… problem…
