5 Good Reasons Why I Persevered with Panda


I’m sure you’re all very much aware of the Google Panda update that’s caused many a heart attack in affiliate land since it was launched 6 long months ago. My own sad experience has seen me almost completely lose a very large site, largely (IMHO) because Google never did work out the difference between my original content and the stuff that had been stolen by oodles of horrid scrapers. I had to remove 1,100 pages of a 1,300-page site and consign most of it to the internet dustbin. Ouch.

Despite the fact that I had other sites with much better EPCs I could have thrown my effort into to replace the income I lost, I’ve sweated blood these last few months going through the painstaking process of rebuilding my site. To date, I’m about 25% of the way there, with a lovely uphill climb over razorblades to look forward to.

However, I decided to persevere with Panda, and these are the reasons I think it was the right thing to do:

1. I Need to Understand Google’s New Way Of Thinking – Yes, it’s tempting just to consign an apparently unsavable site to the rubbish bin. But if I don’t take the time to understand the fundamentals of what Panda needs to see from me, I’m not going to have an affiliate business at all in 2 or 3 years. Working on my hammered site may not be the best short-term use of time from a revenue point of view, but I’d rather use this experience to improve my business than focus on short-term gain.

2. Let Google Get The Better Of Me? Get F**ked! – I’d rather stick hot needles under my fingernails than admit to myself that Google took one of my favourite sites. I will fix it; the Panda will not win! Someone mentioned to me the other day that I may or may not have masochistic tendencies when it comes to things like this. If I do, more power to my elbow.

3. It’s Sort Of Fun – Whenever things like this happen I always have a renewed appreciation for my SEO roots; I love the problem-solving aspect of algo changes. To be fair, this has been the toughest one to date because it’s had a huge and admittedly depressing effect on Lingerie Brands in particular, and to a lesser extent on two of my other main sites.

4. I Can Use What I Learn on My Other Sites – I had two other sites hit by Panda in the June update. Not badly, but enough to annoy me quite a bit. I used what I’d laboriously learned with my other poorly site to immediately make some key changes I thought could help. In the August update I recouped about half of what I’d lost. Cheered me up immensely, that did!

5. They Might Not Be Bloody Done Yet! – As with all Google’s wonderful ideas, they’re working hard to refine it. If I don’t take the time to stay up to date with it and they raise the bar again, things could get really gnarly. By working hard to recreate an improved site and applying all I’ve learned to my existing sites, I hope to avoid any further traffic upsets. Incidentally, I really hope we’ve seen all the big changes that will affect affiliate content sites go through – I don’t think I can take any more Pandalisation in my fragile pregnant state, LOL. Mmmm, morning sickness and Panda – thanks for nothing, Google 😉

Well, those are my reasons for persevering with Panda. I could have made this post about why “YOU” should persevere, but I know a lot of people have been totally stumped by this. I guess I’m lucky I found a path forward with my sites.

Good luck to all my Pandalised affiliate pals :)


11 Responses to “5 Good Reasons Why I Persevered with Panda”

  1. Mike Behnken Says:

    Thanks for the post; you’re definitely not alone. I was (and still am) hit extremely hard by Panda. I have come to the conclusion (based on data from my other sites and general common sense) that Google Panda is hitting large sites the hardest simply because the sites targeted were large sites.

    Of course it sickens someone who has spent a lot of time creating hundreds or thousands of pages of content by themselves, but the fact is, the giant content farms out there put out thousands of pages of rubbish per week, or even per day.

    Google had to do something to stop the “quantity is better than quality” approach, even though there is NO WAY Panda has come even close to accomplishing this.

    Small sites of a couple hundred pages with long articles have not been hit nearly as hard as large sites with a mix of articles, products, and other pages with less text (content). In addition to not being hit as hard, smaller sites are a lot easier to fix.

    Good luck on your quest, and wish me the same. These scumbags at Google offer next to zero support to the little guys, so we are fighting an uphill battle.

  2. Jez Says:

    I had a similar issue which I recovered from. What I found was that the post excerpts had been scraped heavily on some posts, so it was only the first couple of lines from the “excerpt” in the RSS feed that had been swiped.

    I re-wrote the first portion of those posts and the site came back. Assuming you were not running full-post RSS at the time, you may be able to do the same.

    The other change I made was to limit the number of posts in the RSS feed to 1 (you still have 10). That cut down the amount people could scrape considerably.

    I have reasons for wanting to retain that feed, but if I did not, I would remove the feed completely.

  3. Simon Says:

    Great post and valiant effort, Kirsty.

    As a part-time affiliate with small websites (100-200 posts), I initially got hit hard by Panda, then was reinstated on the 2.2 update – so what Mike says holds true in my experience.

    Perhaps the future for affiliates is small and agile?

  4. Kirsty Says:

    Thanks for that, Jez – good point on those posts in the RSS feed. Mind you, a lot of my posts were scraped in full; I’ve no idea how or why (not my area of specialism). I’ve found since installing Cloudflare that it’s less of an issue, but I need to do more to protect my sites in future. I still see the odd scraped post from my new content, but I have been checking for scrapers regularly and blocking their IP addresses. If anyone knows what a scraper tends to look like in your raw logs, I’d love to hear, as I think I should also be scanning those weekly.

    Simon & Mike – you could well be right. I’m lucky in that my income is spread over several sites (although the badly hit site was a huge contributor). I’ll be working hard to spread that risk further over the next year, and doing things like finding more PPC projects that will pay on Yahoo & MSN (there are some, honest!!)

    Good luck all :)

  5. Kirsty Says:

    Oh P.S. My issues haven’t all been about scraped content, but I think a LOT of it is!

  6. Jez Says:

    There isn’t much to give away a scraper, other than that they may not have a referer set, and the stupid ones will rip all your posts in quick succession.

    Scrapers typically want to build networks of link sites for SEO, or mash content up on auto-blog money sites. So they may hit Google Blog Search for posts based on certain keywords, then mash that up with Amazon, eBay and other scraped snippets to automagically create a “franken post” capable of earning a few dollars a month.

    Returns per site are poor, so it’s a numbers game. An auto-blogger will run hundreds of sites to make their money.

    Unless you have real “readers” or a specific need for that feed (feeds can be useful in getting backlinks), I would consider nulling it altogether.
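    [Editor’s note: the two tells described above – no referer set, and ripping posts in quick succession – can be checked for in a standard Apache/Nginx “combined” access log. Below is a minimal illustrative sketch in Python; the `find_scraper_ips` helper, its thresholds, and the sample IPs are all hypothetical, not anything the commenters used.]

    ```python
    import re
    from collections import defaultdict
    from datetime import datetime

    # Apache/Nginx "combined" log format:
    # IP ident user [timestamp] "request" status bytes "referer" "user-agent"
    LOG_RE = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
        r'"(?P<req>[^"]*)" (?P<status>\d{3}) \S+ '
        r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
    )

    def find_scraper_ips(lines, window_seconds=60, min_hits=20):
        """Flag IPs whose traffic is mostly referer-less AND who made
        at least `min_hits` requests within `window_seconds`."""
        hits = defaultdict(list)  # ip -> list of (timestamp, referer)
        for line in lines:
            m = LOG_RE.match(line)
            if not m:
                continue  # skip lines that aren't in combined format
            ts = datetime.strptime(m.group('ts'), '%d/%b/%Y:%H:%M:%S %z')
            hits[m.group('ip')].append((ts, m.group('referer')))

        flagged = set()
        for ip, events in hits.items():
            events.sort()
            no_referer = sum(1 for _, ref in events if ref in ('', '-'))
            if no_referer < 0.8 * len(events):
                continue  # most requests carried a referer: looks human
            # sliding window: min_hits requests inside window_seconds?
            for i in range(len(events) - min_hits + 1):
                span = (events[i + min_hits - 1][0] - events[i][0]).total_seconds()
                if span <= window_seconds:
                    flagged.add(ip)
                    break
        return flagged
    ```

    The returned set can then be fed into whatever IP-blocking mechanism you already use (firewall rules, a deny list, etc.); the 80%-referer-less and 20-hits-per-minute cutoffs are arbitrary starting points to tune against your own logs.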

  7. Jez Says:

    My sites were fine after Panda; they are mostly 30 pages, 500 words per page. You don’t need big sites. You need sites that give off good quality signals.

    EzineArticles, eHow et al. were massive sites with high PR, and all were culled.

    It’s not really about size, once you get out of the “thin” zone at least.

  8. Mark Says:

    Hi Kirsty,

    Point 5 made me smile, given its date and the Panda 2.5 update last week (Sept 27). Whilst my site wasn’t affected by previous Panda updates, it got smashed to hell in the 2.5 update.

    The only thing I can think of is that I have recently added an affiliate link to most of my 800 pages, to monetise the site more. I didn’t cloak the links, but I’m now thinking that was a bad idea. Call me paranoid, but I’m now thinking of doing it manually rather than using a WP plugin, as I ‘think’ Big G may be tracking some of the more popular cloaking plugins. What do you think?

  9. Mark Says:

    For those still struggling with Panda, please persevere. Religiously follow the Big G’s guidelines and do the right things, and in my experience your site will come back – perhaps even stronger than before.

  10. BDR Acton Says:

    One of the sites I work on was not hit by previous Panda updates but was hit in October; the site has hundreds of pages. I have gone back through the site and consolidated many pages which were repetitive, but my strong feeling is that Mark may be right about the cloaking plugins, as the site with the problem does use one for affiliate links. The actual owner of the site has far too many affiliate ads on each page; I have recommended reducing the number of ads in relation to the text on the page. This long and slow process of consolidating pages is helping, and I am starting to see increases in traffic, particularly in longer-tail terms I am not actively link building for.

  11. Mark Says:

    @BDR Acton,
    I am also seeing long-tail success, with no link building for those terms. In fact, I am doing no link building at all, but the site is growing in the SERPs.
