Greekgeek's Online Odyssey - Hubpages and Online Article Writing Tips


Making Search Results Sexy, Revisited

I’ve talked about making search results sexy before. By this I mean tweaking your lenses so that whatever shows up in Google or other search engines tempts your target audience to click on YOUR webpage, instead of all the other search results that come up.

This is vital. Getting your webpage to appear in search engine results is important. But what you want is those clicks! Whatever appears in Google is your billboard, your front door, your commercial that will get people off the street and into your article.  Your search engine listing is the number one way to attract search traffic — forget backlinks! — so you want it to look its best.

Here’s an example:

Search results for 'Free Web Graphics' on Google

Suppose I've landed the #4 spot for the search "Free Web Graphics." (Actually, I haven't; those are Google's personalized results, which tend to favor Squidoo in my case.) While searchers tend to click the TOP results on a page, a photo can draw the eye down. My photo is more appealing than the one above it, simply because of the contrast and vibrancy of the colors (mine is a professional portrait). Also, my snippet's excerpt sounds a lot more friendly — doesn't it? — while still showing that I'm covering the search query.

So, let’s take Google’s search results step by step.

1) Title: You know the drill: include your keyword phrase AND something to engage your audience.

2) Breadcrumbs (the green links in the above screencap): make sure the category you file your article under suits your keyword, if you possibly can.

3) Photo and Google Authorship links.

Set up Google Authorship if you haven’t. Once Google has crawled your lenses with authorship included, it will often place your Google Profile photo in search results for your articles. The photo can draw the searcher’s eye — or repel them if your photo is off-putting.

See the bottom part of my Is Your Profile Picture a Zombie? article for tips on how to tweak your profile photo. Or see this recent SEOMoz post, which reminded me of this topic: How Optimizing My Google Profile Pic Increased Traffic 35%. Take a look at the examples he tried and discarded. That will give you more ideas about what works!

4) Google snippet (excerpt) from your lens, usually 156 characters.

I’ve covered this before, but just in case you missed it:

Google will give a short excerpt: either the first sentence of your post/article/lens, or the first place where the searcher's query shows up on the page. You can't optimize every single snippet on the page, but you can optimize the first sentence plus the place where your top keyword (or the 2-3 most common searches) appears. Do this by Googling for your article — assuming it's already been indexed — and seeing what snippet comes up.

Tweak it. Use SEOMofo's Snippet Optimizer to rewrite it so that the excerpt accomplishes two things: (1) shows that the page covers the search query and (2) shows that the page is well-written and competent. (This means proofreading and crisp language.) If possible, make the snippet engaging, intriguing, fun, depending on your topic. Be un-boring.
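If you'd rather eyeball this offline than rely on an online tool, a few lines of Python will approximate the truncation. This is only a rough sketch under a couple of assumptions: you paste your intro text into the script, and Google cuts off at roughly 156 characters (in reality it truncates by pixel width, so treat the number as a guideline rather than gospel):

    # Rough preview of how Google might truncate an excerpt.
    # The 156-character limit is an approximation; Google actually truncates by pixel width.
    def preview_snippet(text, keyword, limit=156):
        pos = text.lower().find(keyword.lower())
        start = 0 if pos == -1 else max(0, pos - 40)   # show a little context before the keyword
        excerpt = text[start:start + limit]
        if len(text) > start + limit:
            excerpt = excerpt.rstrip() + " ..."
        return excerpt

    intro = ("Looking for free web graphics? Here are hand-picked sources of "
             "public domain clipart, textures, and backgrounds for your site.")
    print(preview_snippet(intro, "free web graphics"))

Run it against your real intro and your top two or three search phrases, and you'll see at a glance whether the cutoff leaves a readable, enticing fragment or chops you off mid-thought.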

 

The True Power of Social Media

Wow. I had started to create a lens on the Mars Curiosity Rover a little over a week ago, claimed the URL, then stupidly failed to finish it.

I wound up creating the article on Hubpages instead, since its clean interface looks a little less corny for educational pages.

I discovered one more reason why I was glad I’d created my Mars Rover page on Hubpages: it’s really fast to edit, update, and add pictures.  Sunday night, I decided to use my hub on the Mars Rover to liveblog the whole event. With a few Tweets and hashtags related to the event as it was unfolding, I started getting traffic before the spacecraft hit the atmosphere!

Today, I saw just how fast Google can crawl after a Tweet or link on a social website gets posted (possibly this is only true of social sites it owns).

On Squidoo, I wound up making a lighthearted Mars Curiosity Rover 3D model lens showing off an amazing free app from NASA that lets you plunk a virtual reality model of the rover down on your cat, er, your desk, and move the model around. (It’s bizarre. It’s SO COOL.) Obviously, if I’m combining a trending topic with a funny cat video, I should Tweet it. In this case, I tweeted the video on YouTube, which is liable to get more visitors. In the video description, I included a link to the Squidoo lens showing how to get your own copy of the virtual rover so you can put it on your cat, er, desk.

I published that lens a few hours ago, but it’s got a lensrank of almost 2 million, since Squidoo ranks unpublished lenses lower and lower if you don’t publish them pronto. This means it’s a WIP lens, and is not yet plugged into Squidoo’s internal links, so Squidoo hasn’t yet informed search engines that the page exists.  To my surprise, it immediately started getting Google search traffic wanting to know how to get and use the app I was demoing in the video!

Remember, YouTube is owned by Google. It must have seen the link in the video description, and/or seen the Tweet, followed it back, and crawled it, a good 12 hours before Squidoo acknowledged that the lens existed.

I'm actually not sure whether the YouTube video link or the Tweet got the page indexed and ranked well by Google, but it's good to use both, and to remember that you already have to be part of the conversation on social media like Twitter, or nobody is going to follow your interruption (link drop) into the conversation.

Backlink Seekers Target Squidoo For Pagerank

Pagerank is a measurement Google came up with in the late 1990s to help it decide how highly to rank webpages, based on which webpages linked to that page (backlinks) and which pages it linked to. Nowadays, Pagerank is only one of 200+ factors that Google uses to decide how high up to list a webpage in its search results. Google has come up with many ways to detect relevance to a particular search query, making Pagerank somewhat obsolete. (See this post by Google spokespundit Matt Cutts for an explanation of Pagerank.) Nevertheless, many old-time backlinkers are convinced that Pagerank is still the number one factor in making webpages rank well in Google, so they keep trying to find webpages with pagerank on which to plant backlinks.
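For the curious, the original idea is simple enough to sketch in a few lines of Python. This is the classic textbook iteration (each page's score is fed by the scores of the pages that link to it, split evenly across their outbound links); it's a toy illustration only, not anything Google runs today:

    # Toy PageRank. 'links' maps each page to the pages it links out to;
    # every page mentioned must appear as a key.
    def pagerank(links, damping=0.85, iterations=50):
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, outlinks in links.items():
                if not outlinks:
                    continue
                share = rank[page] / len(outlinks)   # each outbound link passes an equal share
                for target in outlinks:
                    new_rank[target] += damping * share
            rank = new_rank
        return rank

    # Three pages: A links to B and C, B links to C, C links back to A.
    print(pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]}))

The sketch shows why backlink seekers hunt for pages that already have Pagerank: a link from a page with rank passes a share of that rank along to whatever it links to.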

Squidoo is a target for these pagerank-seekers. It’s six years old, and many of its older articles have good pagerank. (Many of my older lenses are pagerank 3 to 5, which isn’t bad).

Squidoo is a web 2.0 website with multiple opportunities for visitors to leave links: guestbooks and link plexos and duels. If you leave a guestbook or link plexo unmoderated — and even if you don’t — link spammers will hit your lenses, trying to exploit your pagerank to boost their own rankings. Linkspam is not harmless. If your webpage links to poor neighborhoods, to sites that engage in shady linking practices, or to a lot of non-relevant content, those links could lower the quality, trustworthiness and relevance of your article in Google’s eyes.

Link spam has always been a problem on Squidoo, but two events within the past year have made it more of a target. First, Squidoo has been largely unaffected by Google's Panda algorithm updates, which demoted a huge number of other websites. Second, on March 19, 2012, Google did a major algorithm tweak which de-indexed (removed from Google results) a batch of paid blog networks and other websites whose sole purpose was to publish thin, computer-generated content which appeared to be real articles, and which contained links to sites that paid them to feature those links. People were paying linkbuilding services to create backlinks for them in this way. Now, suddenly, those backlink sites are worthless, and some paid linkbuilding services like "BuildMyRank" have actually shut down.

All the sites which those backlinks pointed to have now lost standing in Google search results. Their owners are now searching for new places to plant backlinks to replace those they lost. Any blog, guestbook, or "submit your links here" widget is a target, especially on websites that still have some pagerank.

These link droppers are getting ever more clever about trying to disguise what they're doing so that you let their link through. Today I deleted two comments left on this blog saying it was a very well-written blog, asking me if I coded it from scratch, or saying that the person liked my blog so much he tweeted it to all his followers. It sounded like real humans had written these comments. However, the generic reference to "your blog" without any reference to the subject matter of the blog was a dead giveaway that they were cut-and-paste comments being dropped on any old blog. Their usernames included backlinks to their websites. They were using not only flattery, but one of the "six persuaders": reciprocity. If someone does something for you, it's human nature to feel you should return the favor in some fashion. (The "I tweeted this to all my followers" ploy, which I've seen on several link drops lately.)

I’ve also received a flood of emails from people offering to pay me to put a link to their sites on my lenses.

Don’t be fooled. Google just dropped or demoted a whole bunch of domains these link droppers used to try and make their own sites rank better. You don’t want your blog, lens or website to be showcasing links to the very people Google just penalized for shady backlinking practices and shallow content. Your lens could get hit by the same algorithm filter that demoted the sites they were using for backlinks before.

Your sole criterion for allowing a link onto your page should be the benefit it gives your readers. Is the site it links to useful, helpful, interesting, and strongly relevant to your subject matter? Will your readers be interested in it? Then approve it. Is it off-topic, or would readers who clicked on it be disappointed? Reject it.

By making sure your lenses only link to good, relevant content that is useful to your readers, you'll not only make that particular article look good to Google. You'll help keep Squidoo from looking like "a place for spammers to post their links." By keeping our own lenses spam-free, we ensure that Squidoo continues to be ranked well by Google and doesn't get hit with a Panda penalty (which would cause a traffic drop for all pages on Squidoo).

Are Cross-Links About to Get Google-Punched?

Uh oh. Remember how I noticed the murmurs about content farm penalties back in January 2011, and got scoffed at for suggesting Google was going to be unrolling domain-based rather than single-page-based penalties?

Weeeell, I don’t like the sound of this. Something in seoMOZ’s whiteboard Friday vid this week caught my eye:

Here’s the part that concerns me:

We’ve particularly seen this in the past few weeks with Google cracking down on linked networks, low quality links. They announced that they are going to be devaluing more administrative links. By administrative links, we mean sites that are related to each other. So if you own a network of 50 sites and you’re interlinking all of those to each other, the value that you are getting is over-optimizing. They are going to be diminished, and you could face a penalty. If you do it too much, you could face de-indexing, like we’ve seen Google do with the linking indexes.

I cannot find the source for this: where has Google announced it’s about to crack down on administrative links (cross-links between our own content on different sites)? But actually, it makes sense that Google would treat links we build to our own content as less value-passing than links other people have built, since self-promotion is not the same as third party recommendation. Furthermore, since Google (and Bing) define webspam as artificial practices designed to boost ranking in search engines, it will crack down on any linking practices — such as building a whole bunch of websites and cross-linking them to simulate backlinks — that are designed primarily for that purpose.

Once again, there’s one thing that worries me, and one thing that doesn’t.

I don't care if Google decides to treat those links as less important. Many people treat it as a penalty when Google starts ignoring signals it used to weight heavily, and the effect can indeed be catastrophic if you relied too heavily on those signals.

But there is a difference between "Google starts ignoring X…" and "Google starts penalizing X." I'm happy to do things that Google pretty much ignores, as long as they benefit my readers. What I try to avoid is things that I believe Google may actively penalize. (For example, since Google is on record as penalizing paid links, I do not use Redgage, even though it may be perfectly safe.)

I'm not saying I'm going to stop cross-linking my sites, articles and content: that would be a silly knee-jerk reaction, and I'm still not entirely sure what Cyrus Shepherd's possible "administrative link penalties" will entail. After all, prior to Panda, the punditsphere was full of people predicting the demise of "Content Farms," expecting Google to create some sort of blacklist of user-generated sites, like Blekko did, and just penalize those. In fact, Panda worked in an entirely different way. So we don't yet know what form Google's announcement will take when it's implemented. (WHERE is this announcement?) But it's time to brace, just in case.

To avoid possible algorithm tweaks in the future, it may be time to reconsider whether our cross-links are for our readers’ benefit or for ours.

If this "administrative linking" algorithm adjustment materializes and is confirmed by reputable sources, I'm going to watch my author-linked content closely, compared to my alternate pen name content which is not linked to my real name, my "Greekgeek" pseudonym, or my Google profile. It will be interesting to see whether the network of blogs, articles and content Google associates under my author name drops in rankings while the stuff associated with no particular author name (and thus missing the authorship benefit) stays unchanged.

I also want to leave you with a word of wisdom picked up from a guest interview at seoBook (I do not necessarily endorse most of what Aaron Wall says, and I am a “useful/exceptional content and on-page optimization” advocate rather than a professional backlinker like Jim Boykin, but still):

SeoBook: Google recently nailed a bunch of lower quality bulk link networks. Were you surprised these lasted as long as they did? Was the fact that they worked at all an indication of the sustained importance of links?

Boykin: Well…surprised…no… filtering out networks is something that’s always going to happen….once something gets too big, or too popular, or too talked about…then it’s in danger of being burned… the popular “short cuts” of today are the popular penalized networks of tomorrow.

Emphasis mine. They're talking about BuildMyRank and other link/blog networks getting deep-sixed by a recent Google penalty, but the wider message is a Google variant of Tall Poppy Syndrome: various tricks will work for a while to draw traffic, boost lensrank, or succeed in any sphere where success is measured by a computer algorithm, but once a particular strategy for gaming the system becomes popular, then, sooner or later, the algorithm maker will notice and attempt to thwart the tactic. (And the collateral damage is sometimes more devastating to innocent bystanders than to those the algorithm tweak is meant to thwart.)

Panda Update 3.2 Happened January 18

Has your traffic profile changed recently? The culprit may be Panda 3.2, confirmed on Jan 18, 2012. See that link on SearchEngineLand for more info.

To review what this Panda thing is about:

Google’s search algorithm ranks pages’ relevance to a given search query based on over 200 factors. For example, are the words in the search query (“what’s in a hot dog?”) found in the page’s headers, or does that page link to other good pages about that topic? The pages that rank highest on relevance get listed first for that query when someone searches for it on Google. A better Google listing means more clicks, more visitors, more traffic.

Starting last February, Google introduced a new factor, code-named Panda. This factor is weighted more strongly than many other factors. Panda is different from most of the factors in that it's a measure of the domain where the page is found. Are there a lot of spammy pages on that domain (e.g. Squidoo.com)? Are there a lot of pages whose content is found elsewhere? Or is that domain full of unique, useful pages? Panda attempts to determine the overall quality of a website. It then boosts or lowers the raw rank of any page found on that site.

Panda isn’t calculated every day. Instead, it’s recalculated manually whenever someone at Google says, “Time to run a Panda update again.” It then crawls all the sites on the web and re-evaluates whether they’re full of spam and junk or excellent content.

The long and short of it: each time Panda is recalculated, ALL articles on Squidoo may be somewhat impacted, depending on whether Squidoo gets a good Panda rating or a poor one. A good one means that, other things being equal, a page on Squidoo will be listed higher in search results than the same page posted somewhere else. Or, if Squidoo gets downgraded, it'll give lenses a slight disadvantage, like a golf handicap.
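Nobody outside Google knows the real math, but you can picture the "golf handicap" idea as a site-wide multiplier applied on top of a page's own relevance score. The numbers below are completely made up and purely illustrative; this is not Google's formula:

    # Purely hypothetical illustration of a domain-level modifier (NOT Google's actual formula).
    def adjusted_score(page_relevance, domain_quality):
        return page_relevance * domain_quality

    same_article = 0.80   # identical content, identical page-level relevance
    print(round(adjusted_score(same_article, 1.05), 2))   # on a domain Panda rates well -> 0.84
    print(round(adjusted_score(same_article, 0.90), 2))   # on a Panda-demoted domain    -> 0.72

The same article, word for word, can land higher or lower in the results purely because of the site it's hosted on.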

January 18th is about the time my Squidoo traffic jumped by about 20%. However, I haven't seen a lot of Squidoo members gloating over a sudden traffic jump, so this is evidently not much of a sitewide change — in which case, my own traffic boost is probably not due to Panda.

There's another Google update muddying the waters right now, making it difficult to tell which factor is causing what. Search Plus Your World now shows strongly personalized results in Google searches, including things your friends and circle have tweeted and shared. I'm not clear on whether Google has started giving more weight to socially shared links as a ranking factor — one of those 200+ factors mentioned above — or whether it's still only regarding social signals from "trusted authorities" (say, a link posted by Neil Gaiman) as important and all the rest of our Tweets, Facebook Likes, etc. as only significant to our friends.

At any rate, any one of the recent reshufflings of what Google displays as search results could explain my traffic boost. It's not just more traffic following a holiday lull, as this is significantly more traffic than I saw in 2011.

ETA: Click the widget below to view the full-sized Quantcast chart for Squidoo traffic. It may show a modest bump in traffic from the latest Panda update, or it may be within seasonal variation. (Here’s Hubpages’ traffic, too, for comparison.)


Tip: Check Your Google Snippets!

When Google lists your page, it lists a “snippet” — a small excerpt of your content. This snippet will be one of two things:

  • Your META description tag. On Squidoo, this is the first 255 characters of the lens introduction.
  • OR: an excerpt from your page showing the first instance (usually) of the keyword the Google user searched for.

You can't predict what people will search for. But you should at least do a command-F when viewing your article to see where your top keywords appear. Do the few words on either side of each one make someone feel that your article is useful, relevant, and may possibly answer their questions? Or are they vague and poorly written, giving a poor impression of what your page is like?
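If command-F across a dozen keywords gets tedious, here's a rough Python sketch that does the same check in bulk. It assumes you've saved the page's visible text into a local file; the filename and the phrases below are placeholders, so swap in your own. It simply prints the words around the first occurrence of each phrase so you can judge how they'd read as a snippet:

    # Print the first occurrence of each phrase with a little surrounding context,
    # roughly the material Google might pull as a query-matched snippet.
    def keyword_contexts(text, phrases, window=60):
        for phrase in phrases:
            pos = text.lower().find(phrase.lower())
            if pos == -1:
                print(phrase, "-> NOT FOUND on the page")
                continue
            start = max(0, pos - window)
            end = pos + len(phrase) + window
            print(phrase, "-> ...", text[start:end].strip(), "...")

    page_text = open("my_lens.txt", encoding="utf-8").read()   # the page's visible text, saved locally
    keyword_contexts(page_text, ["free web graphics", "Mythphile", "your brand name here"])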
Also, do this for your business name or blog name. Here’s an example. My main mythology blog, Mythphile, gets enough Google love to receive the special Google table-of-contents treatment. (Search for Mythphile and you’ll see what I mean.)
However, recently I finally clued in to the snippet description that was showing in search results. I forgot to take a screenshot, but what it said was:

Mythphile by  is powered by WordPress using theme Tribune.

WHOOPS! That doesn't tell us a THING about this blog or the content on the page. That's from the footer at the bottom of the blog. Apparently, I don't have the blog's name anywhere on the blog except in the header and navigation links (e.g. "What Is Mythphile?"), and the snippet tool does NOT excerpt the header when it's a single word, too short to be a useful snippet.

So I ran to my blog template and added a widget in the upper righthand corner of the sidebar. Now that blurb is what displays in the Google snippet:

What Is Mythphile Mythphile is a blog exploring the intersection between mythology and modern culture, timeless symbols and current events.

Moral: Make sure that the first instance of your top keyword, username, and brand name/business name/blog name appears in a meaningful sentence, because that's likely going to be the only data web searchers have to go on when trying to decide whether to click your link in a page of search results.

Yes, this is yet another example of my SEO axiom, “Make Search Results SEXY!”

Google Is Now Hiding Our Keyword Traffic

So, last week, the SEO industry was abuzz with Google’s October 18 announcement that it would no longer report visitors’ search queries in any of its tools IF the visitors doing the search were logged into Google.

Say what? Let me show you.

BEFORE THIS CHANGE (this is an actual example from my own content):

  • A user searches the web for “where can I find a pet sea hare?” and lands on my page.
  • Google records the “where can I find a pet sea hare” query in its tools and sends it to Squidoo stats
  • I see in my Squidoo stats “where can I find a pet sea hare?” brought a visitor
  • Amused, I do the research and find some C. aplysia suppliers and add them to my sea hare fanpage in hopes that info will be useful (or at least entertaining) to future visitors.
NOW:
  • A user searches for “where can I find a pet sea hare?” and lands on my page.
  • Google's webtools record the search as "(not provided)"
  • Squidoo doesn’t report the search query in its traffic stats. (In fact, it may not even know that’s a visit… I can’t tell, but my visits have suddenly dropped on my Squidoo dashboard stats without a corresponding drop in Google Analytics).
  • I don’t know what my readers are interested in, want to know, or need me to clarify.
In this case, it’s a one-time query that won’t tell me much. But how about more common queries? Looking at Google Analytics, the third most common search phrase people use to find my content is now listed as “not provided,” so I no longer know what hundreds of my visitors are looking for, and I can no longer respond as effectively to what my readers want.
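If you're curious how much of your own search traffic has gone dark, you can export the keywords report from Google Analytics as a CSV and tally it up. A rough sketch; the filename and the column headers ("Keyword", "Visits") are assumptions, so rename them to match whatever your export actually contains:

    import csv

    # Tally visits whose keyword was withheld as "(not provided)" in an exported keywords report.
    # Column names are assumptions; adjust them to match your CSV's header row.
    total = hidden = 0
    with open("keywords.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            visits = int(row["Visits"].replace(",", ""))
            total += visits
            if row["Keyword"].strip().lower() == "(not provided)":
                hidden += visits

    if total:
        print(hidden, "of", total, "search visits arrived with no keyword data")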

Again, Google claims this is to protect users’ privacy. However:

  • Search queries don't tell me who is doing the searching. They're like questions shouted from the back of a dark room: I can hear the words, but I have no idea who's speaking. So this doesn't protect privacy.
  • The only search queries Google is concealing are those from logged-in Google members. It’s a protection racket: “Join Google, and we won’t spy on you!”
  • EXCEPT that Google still shares all the search query data with its paying advertisers, which is why the SEO industry is rife with articles like Google Puts a Price on Privacy and Google Invests in Privacy For Profit .
  • And speaking of privacy, Google is NOT concealing referrer data — where the visitor comes from — which is more private than search queries.
If Google said they were going to start charging for valuable data they've hitherto given away for free, that wouldn't annoy me, but these self-righteous claims of protecting user privacy offend me.

Some parts of the internet are gloating about this, because they consider all SEO to be dirty tricks, and they have swallowed Google’s white lie that this change will help protect users’ privacy. But I was using that data to improve my content for my readers. And see Matt Cutts’ video from this week:

Matt Cutts: “Does Google Consider SEO to be Spam?”

Well, at least that’s a little reassuring.

I am also concerned about how Squidoo is handling this change. If Google isn’t reporting a significant number of search queries to Squidoo’s traffic stats, does Squidoo still count them as visits? This impacts Lensrank.  Yes, Google claims the cloaked data represent only a fraction of visits, but on the other hand, it’s everyone logged into Google from, say, Gmail, YouTube, Picasa, Reader, Google Plus, or a ton of other Google services. My third-most-common search query is now cloaked, and you can bet a bucketload of long-tail queries are (Analytics only gives me my top 500 queries, and one of my top lenses used to get more than 500 unique queries a week).

This change will also impact Hubpages and other sites that rely on Google’s API to report keywords that brought visitors to your site.

2011 Google Quality Raters Guidelines (Oops!)

Google did something wrong. I did something wrong. Yet I believe that good will come of this. Let’s recap what happened with the 2011 Google Quality Raters Guidelines:

  • Step 1: I see a post in the Squidoo forums noting that Potpiegirl (aka Jennifer Ledbetter, WAHM affiliate marketer) had a new post up about Google.
  • Step 2: I read Jennifer’s lengthy (and fairly useful) post on How Google Makes Algorithm Changes.
  • Step 3: I notice that Jennifer’s post links to Google’s 2011 Quality Raters Handbook.
  • Step 4: Classics major training kicks in: Wait, hang on, is this real? Is this legitimate? Why aren’t the major SEO websites like searchengineland, seomoz and seobook salivating over this carcass like a pack of rabid hyenas circling a dying zebra?
  • Step 5: I share the tip with SearchEngineLand, asking if the document is legitimate. Barry Schwartz seems to think so and posts about it.
  • Step 6: Lots of people download the 2011 Quality Raters Handbook.
  • Step 7: Google contacts Barry Schwartz and asks him to take down SEL’s mirror of the document. Google also contacts PotPieGirl and asks her to remove the link from her blog.
  • Step 8: Too late: the guidelines have gone viral. As a result, various SEO bloggers and experts discuss ways to make content more relevant and useful. (There, Google, was that so bad?)

I owe Jennifer an apology for tipping without thinking. Hopefully the amount of traffic that has landed on her blog as a result of this offsets the inconvenience of having to delete that link. I also feel guilty for my part in spreading the leak, but I honestly think that having the Quality Rater Guidelines out there will encourage people to focus more on the quality of their content, which is not a bad thing.

So, well, mea culpa. Now, what are these Quality Rater Guidelines? Simply put, they are the rating system that Google beta testers use to test, refine, and improve Google's automated algorithm. They are not the algorithm itself. But in order to create a computer algorithm which can detect and rank sites relevant to a given search query, Google first needs to know which sites real people judge to be the best ones for a given search query.

The reason these raters guidelines are useful to us is that they give us some idea what Google considers “quality content.” I can’t talk too specifically about what’s in the guidelines, but here are three takeaway lessons:

  • The rating system is based on relevance to a topic. Content is king, but relevance is queen. And “relevance” here means “gives the searcher what they wanted when they typed in that search.” Is a site absolutely THE go-to place for a particular search query? It wins. Is a site incredibly relevant for that query, satisfying most people who search for it? It ranks pretty well. Does the site only have some relevant content, or is it from a less trustworthy source? That’s going to lose points. If it’s only slightly relevant, fuggeddaboudit.
  • Google defines webspam as anything designed to trick search engines into getting more traffic. So while backlink spamming, keyword stuffing, or other sneaky tricks may work for a while, sooner or later, Google will tweak its algorithm to negate those practices. If you're doing something only for search engines, it's probably not worth doing (save, perhaps, making your content structured, organized and clear enough for search engines to comprehend it). If you're doing something that really is for your readers, hopefully, long-term, you'll win.
  • Google doesn’t define all affiliate marketing as spam or “thin” content, but it’s extra wary of affiliate marketing. Raters are told to watch out for warning signs like a product review on one page that sends people to buy things on another domain entirely, suggesting the review is there to benefit the reviewer (with commissions) not the visitor. If you’re doing affiliate marketing, you have to create relevant content that is useful to your readers — price comparisons, pros and cons, your own photos of the product in use, etc. If you only excerpt/quote customer reviews and info from the site selling the product, then your page has provided nothing of value to the reader that cannot be found on the original product page. That’s thin, that’s shallow, and it’s asking for Google to bury your page so far down in search results that no one sees it.

In sum, Google is trying its best to design an algorithm that rewards pages which are useful to readers and relevant to the search query.  Over time, the algorithm gets more and more successful in doing this (we hope). So, if you want your pages to rank well on Google, take a page from Kennedy:

Ask not what search traffic can do for your webpage, but what your webpage can do for search traffic.

 

UPDATE: I discuss this topic a little more here: Google’s “Quality Content” Guidelines: Do You Make the Cut?

Bing Still Uses the Meta Keywords Tag!

Uh, oh! Bing still uses the META keywords tag!

META tags. Gotta love ‘em. They are pesky bits of HTML code hidden on (some) webpages to give information about each page. Ten years ago, search engines consulted META tags to help them learn what search phrases each page was relevant for. Then people started manipulating META tags to try and convince search engines their pages were the best pages for particular topics by virtue of their META tags saying so. Search engines wised up to this elementary trick (or went bust).

Not that META tags are completely, utterly, totally dead. On rare occasions, Google still uses the META description tag as the page excerpt it quotes in search results. That is, if there's no other quote on the page that fits the search query better.

The META keywords tag, however, was buried several years ago, when even Yahoo/Bing apparently abandoned it. Keywords as in…

<META name="keywords" content="spam, spam and eggs, spam and bacon, spam spam spam and bacon, and oh hey bing this is the greatest webpage ever on spam, so let me repeat the word spam a few more times, spam spam, spam, spammity spam">

Squidoo fills in the META keywords tag on each lens with your Squidoo tags, by the way. It’s quaint that way.

However — wait! Stop the presses! Our old friend Danny Sullivan has checked with Bing and discovered that Bing still uses the META keywords tag as a signal! 

 

Woo!

 

Whee!

 

Ha!


Rel=”me” Rel=”author” UPDATE for Squidoo lensmasters

I just got a note from Gil on my Rel=”author” Squidoo tutorial. (Thanks, Gil!)

The slots on our Squidoo Profile for “other profiles” (Facebook, Twitter, MySpace) are now labeled with rel=”me” automatically. So is the “My Blog” slot.

More importantly, Squidoo has now added a slot on our lensmaster profile for a link to “Google Plus” (which will work just fine for a regular Google profile account as well). This link is automatically marked with rel=”me” in the code.

Therefore, in order to connect your Squidoo lenses to your Google profile, the process is now:

  1. Create a Google Profile
  2. Edit your Google Profile, add a link to your Squidoo Lensmaster Page in the “Other Profiles” box
  3. View your Google profile and copy its URL
  4. On Squidoo, go to My Settings > Profile, scroll down, and paste your Google Profile URL into the “Google Plus” box
  5. Save, and you’re done!
(You don’t have to fuss with rel=”author” at all, because the bio box in the upper right corner of lenses automatically creates rel=”author” from each lens to your lensmaster profile page.)
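If you want to double-check that the markup is really in place, you can fetch your lensmaster profile (or any lens) and list every link carrying rel="me" or rel="author". A minimal sketch using only Python's standard library; the URL below is a placeholder, so substitute your own page:

    from html.parser import HTMLParser
    from urllib.request import urlopen

    # Collect every <a> or <link> tag whose rel attribute includes "me" or "author".
    class RelLinkFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.found = []

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            rel = (attrs.get("rel") or "").lower().split()
            if tag in ("a", "link") and ("me" in rel or "author" in rel):
                self.found.append((" ".join(rel), attrs.get("href")))

    page = urlopen("http://www.squidoo.com/lensmasters/your-name")   # placeholder URL
    finder = RelLinkFinder()
    finder.feed(page.read().decode("utf-8", "replace"))
    for rel, href in finder.found:
        print(rel, href)

If your Google Profile URL shows up with rel="me", and each lens links back to your lensmaster page with rel="author", the chain Google needs for authorship is complete.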

P.S. Remember those slots in our Squidoo Profile that we haven’t been able to access since the Dashboard update? They’re editable again!