August 20, 2011 at 12:46 pm #4117
ratracegrad (Member)
I have an ebook in PDF format that I sell on my website. If you do a search for “grinder strength pullup”, the exact download path appears on the first page of Google, which lets people download the book without paying for it.
theme: thesis 1.7
Is there any way to prevent Google from finding the download link to the product and showing it in search results? Thanks.

August 20, 2011 at 1:39 pm #35746
wzp (Moderator)
You could use a robots.txt file to exclude the directory containing your downloadable files. HOWEVER, search engines are not forced to “obey” the exclusion directives; it is “voluntary.” This is akin to placing a “please don’t hack me” sign on your site.
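For completeness, a robots.txt exclusion for a downloads directory looks like the following (the directory path here is illustrative; substitute the folder that actually holds your files). Remember that this only asks well-behaved crawlers to stay out; it does not block access:

```
User-agent: *
Disallow: /wp-content/uploads/downloads/
```

Note that robots.txt is itself publicly readable, so listing a directory in it also advertises where your files live; it should be combined with real access controls, not used alone.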
If you ARE NOT using the PDF Stamper on the PDF files, you could migrate your files to Amazon S3 and use the Amazon S3 integration option of eStore to secure your files. This offers the best security.
If you ARE using the PDF Stamper on your PDF files, you should periodically clear (by hand) the PDF Stamper destination directory of any stamped files that have already been downloaded. This creates a situation in which you must delete the stamped files faster than the search engines can index them.

August 20, 2011 at 10:22 pm #35747
admin (Keymaster)
Search engines do not index content that has no link to it from content that is already indexed. So, for example, if you just place a PDF file on your server in some folder, there is no way for search engines to index it, because they don’t even know about it (a search engine bot crawls links to discover content; it cannot magically find content). So I would recommend you check all of your posts and pages and make sure you do not have a link to this PDF file from a page that is indexed (it could even be an external link coming to this file from an external page).
With that said, there are plenty of ways to protect against this. This post should help you:

August 26, 2011 at 1:45 am #35748
ratracegrad (Member)
Thanks so much for the information. I followed your link, added the .htaccess and index file, and updated robots.txt. Your support is fantastic.
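For anyone following along, the usual version of this setup looks roughly like the snippet below (the directory name is illustrative, and the syntax shown is for Apache 2.2, which was current at the time; Apache 2.4 uses `Require all denied` instead). The .htaccess blocks direct HTTP requests to the folder, and an empty index file prevents directory listings as a fallback:

```
# .htaccess placed inside the protected downloads directory
Order Allow,Deny
Deny from all
```

Alongside it, an empty index.html (or index.php) in the same directory ensures that even if the deny rule is ever disabled, visitors browsing to the folder see a blank page rather than a file listing.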
Jennifer

November 9, 2011 at 3:59 pm #35749
Louis (Member)
In case the following option works for you guys, make it a sticky or append it to the relevant post(s):
I am testing… and here is what seems to work for me: folder/directory permission set to 700, file permission set to 400.
Only the plugin can access it.
BUT, please try it in different browsers and see if it works 100% ok.
Of course, adding the .htaccess is good too.
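The permission scheme described above can be applied from the command line like this (the directory and file names are illustrative; substitute your real download folder, and note that the owner must be the user the web server/plugin runs as):

```shell
# Create an example protected directory and file.
# Substitute your actual eStore download folder and ebook file.
mkdir -p protected-downloads
touch protected-downloads/ebook.pdf

# 700: owner can read/write/traverse the directory; group and others get nothing.
chmod 700 protected-downloads

# 400: owner can only read the file; nobody else can touch it.
chmod 400 protected-downloads/ebook.pdf

# Verify the permissions took effect (GNU coreutils stat).
stat -c '%a %n' protected-downloads protected-downloads/ebook.pdf
```

Because only the owning user can traverse the directory, PHP running as that user (and hence the plugin) can still serve the file, while direct access by other system users is denied. This does not by itself stop the web server from serving the file over HTTP, so it complements rather than replaces the .htaccess approach.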
Let me know what you think…

November 10, 2011 at 12:01 am #35750
admin (Keymaster)
Thank you! Added your tip to this post: