When I released my updated WordPress SEO article a few weeks back, my buddy Avinash was kind enough to tweet it. He tweeted it, at first, with a link through a sharing service that basically lets you make a copy of a page and add some notes, or even some changes, to it. The idea is nice; as a webmaster, though, I hate it. Let me explain why.

You see, the service makes a copy of the page at the moment it's prepared for sharing; they say they do that for speed. As Avinash tweets a lot, he probably made that copy a couple of hours before he shared it. This wouldn't be so much of an issue if I hadn't added content to the page in the meantime and fixed a lot of typos. Everyone who used Avinash's link wouldn't see those changes. And the service decided that for me, without asking me anything, or even giving me the option to opt out.

SEO Impact

Surely those guys are at least trying to give the pages people share through their service their rankings back, by linking to the original? No, they don't. Well, not unless you're already adding rel="canonical" elements to your site yourself. Each user gets their own subdomain, and as you can see for yourself, quite a few of Avinash's shared pages are indexed by Google. That shouldn't be possible. The service should add a canonical link back to the original page if there isn't one in the source already.
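For reference, this is what such a canonical link element looks like in the head of a page; the URL here is just a placeholder, it should point at the original article:

```html
<!-- Placed in the <head> of the copied page, pointing back at the original.
     The URL below is an example placeholder. -->
<link rel="canonical" href="http://example.com/original-article/" />
```

If the copy carried that element, Google would know to credit the original page instead of indexing the duplicate.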

No Analytics

They claim a webmaster gets all his normal stuff: ads, analytics, etc. Except that for both Clicky and Google Analytics, no views are measured for that link, because Clicky refuses pageviews from other domains and I've filtered those out of Google Analytics to prevent others from rendering my analytics useless (yes, people do try that). So "my" visitors don't get the changes I made after the copy was taken, making me look stupid, and I can't track who those visitors were or where they came from… At this point, I want out.

Opt Out

I've gone through their documentation, both the normal and the developer docs, and there simply is no documented way to opt out. So I decided to dive a bit deeper and figure out which user-agent the service uses. It turns out they actually do have a page about their user-agent. The next step would normally be simple: add a line to your robots.txt blocking that user-agent. Unfortunately, in my tests, their crawler never actually retrieved the robots.txt file, so they're not adhering to the robots.txt protocol. They really should. They're taking my content, they're not asking for permission, and they're not allowing me to opt out. Someone could sue them over that. I'm just going to request, through this blog post: please add an option to opt my sites out of your service.
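For the record, if they did honor robots.txt, opting out would be a two-line affair. The user-agent name below is a placeholder; you'd use the name from their user-agent documentation page:

```
# robots.txt — "TheirBot" is a placeholder for the service's actual user-agent name
User-agent: TheirBot
Disallow: /
```

That's exactly why the robots.txt protocol exists: it's the standard, low-effort way for webmasters to say "no thanks" to a crawler.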

Also, in my opinion, if you're using this service, you should probably start considering alternatives.

Disclaimer: please be aware that I like Avinash a lot and don’t blame him for anything. He’s a great guy and an inspiration to a lot of us in the online marketing industry. It’s the service I dislike and I think that after reading this he will switch to something else as well.

A “hard” out

I figured out a "hard" way to stop the service from doing its thing: add the following to your .htaccess file:

RewriteEngine on
RewriteBase /
# The pattern below is a placeholder; use the user-agent string from their documentation
RewriteCond %{HTTP_USER_AGENT} TheirUserAgent [NC]
RewriteRule . - [F,L]

This will block their crawler, serving it a 403 "Forbidden" response.

Why I dislike this service is a post on Yoast – Tweaking Websites.