If you believe some sales letters, you can set your internet marketing running on something close to perpetual motion.
But does that kind of promise really work in practice?
The short answer is “no”.
The longer answer is “sometimes, kind of, but for nothing even close to perpetual really”.
There are lots of reasons for this and I’m going to delve into a few of them here.
Automatic blogging
Plugins like WP Robot claim to drip feed quality content on a regular basis.
They do that by using the keywords you supply and then using those keywords to search sites like Amazon, Yahoo! Answers, PR Web, YouTube, EzineArticles and various other places to get content that you’re allowed to use on your site.
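In rough terms, that drip-feed mechanism looks something like the sketch below. This is a toy illustration of my own, not WP Robot’s actual code – the sources and titles are faked in-memory stand-ins, where a real plugin would be calling each site’s API or scraping its search pages:

```python
# Toy model of an autoblog drip feed. The "sources" are hard-coded here;
# a real plugin would query Amazon, EzineArticles, Yahoo! Answers etc.
CONTENT_SOURCES = {
    "EzineArticles": [
        "Overcoming a Fear of Heights",
        "Overcoming Your Fear of Squirrels",  # loosely matches, so it strays in
    ],
    "Yahoo! Answers": [
        "How do I get over my fear of heights?",
    ],
    "Clickbank": [
        # The substring "of" hiding inside "Professional" drags this in,
        # much like the horseback riding result in the real Clickbank search.
        "Professional Horseback Riding for Beginners",
    ],
}

def fetch_candidates(keyword):
    """Pull every item whose title loosely matches any word of the keyword."""
    matches = []
    for source, titles in CONTENT_SOURCES.items():
        for title in titles:
            # Crude relevance test: any single keyword word appearing anywhere
            # in the title. This looseness is exactly why autoblog results stray.
            if any(word in title.lower() for word in keyword.lower().split()):
                matches.append((source, title))
    return matches

def drip_feed(keyword):
    """Yield one scheduled 'post' at a time until the pool runs dry."""
    for day, item in enumerate(fetch_candidates(keyword), start=1):
        yield day, item
```

The point of the sketch is the finite pool: once `fetch_candidates` has been exhausted, the drip feed simply stops – and the looser the matching, the more off-topic the posts that pad it out.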
There is a finite amount of content available on those sources – even if you’re targeting a very broad keyword.
And once you start getting reasonably specific and drilling down to the kind of keyword phrases that are likely to attract buyers, those choices narrow dramatically.
Let’s take fear of heights as our example – it’s a reasonable size market and there are plenty of solutions including hypnosis, Kindle books (27 at the time of writing), real printed books (5,008 according to the Amazon search but I don’t believe that for one minute) and Clickbank products to name but a few.
The problem is that the results stray.
Fast.
I searched the Clickbank marketplace for the phrase “fear of heights” and it claimed to find 77 products.
So your autoblog would happily list all 77 of these and – always assuming that you ever looked at it – you’d wonder why there were pages about how to grow taller, a complete beginner’s guide to horseback riding, how to train to be a tough mudder (whatever that is) and bamboo plant care.
Those were in the first 10 Clickbank results.
I dread to think what the less relevant results would include.
EzineArticles has 1,280 results according to Google. By the 10th page – so only in the first 100 results – they’d strayed far enough to include fear of squirrels and fear of driving.
Yahoo! Answers was better, as you’d expect with a claimed 19,200 results. But even on the first page the titles were getting repetitive.
Sure, you’d have reasonably related content from that source but any human would think that it was close to duplicate content.
And a computer algorithm like Google’s would know that the pages were scraped from Yahoo! and elsewhere.
Which is one of the main problems with automated blogging systems – they’re quite unlikely to fool a computer and almost certainly won’t fool a human.
So Google’s algorithm doesn’t give them any credibility in the results. And your site just eats up storage space and resources until something breaks sufficiently to cause the automatic updating to stop working as planned.
But at least fear of heights is relatively timeless as a subject.
If you’ve chosen something faster moving like internet marketing, there are extra things to watch out for.
A friend of mine (who shall remain nameless here) is posting regular articles to his internet marketing blog.
I don’t think he’s using an autoblog system but he’s definitely not using original articles – Google claimed to find 4,770 copies of one of his posts using the allintitle: search option.
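If you want to run the same check on your own content, the allintitle: operator asks Google only for pages whose title contains every word you give it. The few lines below just build the search URL – the article title is a made-up example, and the URL isn’t fetched here, so paste it into a browser to see the count:

```python
from urllib.parse import quote_plus

def allintitle_query(title):
    """Build a Google search URL restricted to page titles."""
    # quote_plus escapes the colon and spaces so the query survives the URL.
    return "https://www.google.com/search?q=" + quote_plus(f"allintitle: {title}")

# Hypothetical article title, purely for illustration.
url = allintitle_query("5 upsell strategies that work")
```

A result count in the thousands means you’re competing with that many identical titles before anyone even looks at the body text.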
And I’m certain he’s not checking the articles before they’re posted, as one that was recently posted talked about paid inclusion in AltaVista – something AltaVista hasn’t offered in over a decade, and the AltaVista domain has redirected to Yahoo! for over a year.
I wouldn’t have known about the article, but he’d set up another option to post “new” articles on his blog to his Facebook wall. So the article would potentially be seen by anyone following him on Facebook.
The post has since been taken down but if you’re trying to build up credibility it’s not exactly helpful to publish content like that. And the link from Facebook is still there at the time of writing but now shows a page not found error.
In my view, it’s far better to spend the time creating your own content. Then at least you’ll be sure about what you’ve written and what’s being said.
If you publish other content, I think it’s pointless competing with over 4,000 identical pages.
In my opinion, there’s no credibility in publishing a “recent post” on your website that has been doing the rounds on the web since the turn of the century – the top result I found in Google was published on 15th October 2000.
Hardly fresh content however you try to twist that round.
Social media
Another way that’s often suggested to automate your internet marketing is with social media.
Sites like Facebook, Twitter, Pinterest and LinkedIn are the most commonly suggested and a few others such as Reddit and maybe Digg also regularly get a mention.
The logic here is that you post regular snippets to these sites and build up a following.
And that can happen – but you need to do it properly.
Sites like Hootsuite offer various ways to automate this.
Or you could hand craft a snippet when you post something.
The latter option takes more time but (hopefully) makes more sense to your potential readers.
Automation usually takes shortcuts.
I’m not picking on my friend but here’s the description that was posted on his Facebook feed for a recent post:
up-sell, upsell, up-sell strategies, upsell strategies,
So the posts are being auto-posted and the meta tags are being autogenerated.
There’s no way Google would use that as the description. Assuming the page ever showed in a search result, Google’s algorithm would change it to something more relevant.
And he’s missing a trick by using the shortcut of an autogenerated description, because leaving something that important to a computer is just plain dumb.
You and I can spot immediately that the auto-generated description is gobbledegook.
A computer just thinks that it’s keyword rich and – depending on whether or not it counts the hyphenated up-sell as different from upsell – maybe also thinks it’s using keyword variants.
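You can see the computer’s-eye view in a few lines of Python – a toy illustration of my own, not how any real plugin or search engine actually counts keywords:

```python
# The auto-generated description quoted above.
description = "up-sell, upsell, up-sell strategies, upsell strategies,"

# A naive keyword counter splits on commas and sees four "different" terms.
raw_terms = {t.strip() for t in description.split(",") if t.strip()}

# Strip the hyphens and the "variety" collapses to just two phrases.
normalised = {t.replace("-", "") for t in raw_terms}
```

Four raw terms, two after normalising – keyword-rich to the plugin, gobbledegook to any human reading the Facebook snippet.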
That’s almost certainly the promise the plugin would have made.
This kind of automation maybe worked for a short while a few years ago.
But with thousands of identical pages it’s easy for Google to spot what’s being done.
My guess is that at best the original article – or the first user of it if the article is PLR – will show up occasionally in the search results – Google knows when it first crawled a page and can deduce the original owner correctly a reasonable amount of the time. The other thousands of identical copies will be relegated to the supplemental results that are essentially never shown to anyone apart from SEO techies.
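To get a feel for why spotting copies is easy, here’s one classic technique sketched in a few lines – hashing overlapping word “shingles” and comparing the sets. Google’s real pipeline is far more sophisticated and undisclosed; the function names and sample sentences below are entirely my own:

```python
def shingles(text, n=3):
    """Break text into overlapping n-word chunks."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    """Jaccard similarity of two shingle sets (1.0 = identical content)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Made-up snippets for illustration.
original = "five proven upsell strategies to grow your average order value"
scraped = "five proven upsell strategies to grow your average order value"
rewritten = "ten fresh ideas for writing original blog content every week"
```

A scraped copy scores a perfect match while genuinely original text scores near zero – which is why thousands of identical PLR pages are trivial for any search engine to group together and discard.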
Put enough of those pages on your automated blog and Google will then figure out that it’s not showing any original content and will downplay it in the search results accordingly.
Which explains why the thousands of pages generated get no real visitors. Probably just robots looking for places to potentially post spam comments.
As far as I can figure out, the only way that automatically created sites work is by using a program that you’ve had custom written to populate hundreds – or more likely thousands – of blogs that are monetised.
Of course, they almost certainly can’t be monetised with AdSense adverts as they’re no longer in favour with Google (they used to be fine but that was a long time ago) and would most likely get your account flagged for closure.
So you could maybe use an alternative such as Chitika or Amazon, but they monitor the quality of the clicks and an automated site is unlikely to generate good ones. Something like showing retargeting adverts could be the best way of earning a reasonable income.
The thing to remember is that by the time an automated script has been produced and used on a million blogs (which is what WP Robot claims), the script is likely to be of more benefit to the software company selling it than it is to its users.
If you’ve got any comments or thoughts, feel free to share them below.