Tips and Tricks HQ Support

Support site for Tips and Tricks HQ premium products


wp estore – digital download url


Tagged: Amazon, Download, https, S3, security, SSL

  • This topic has 20 replies, 9 voices, and was last updated 6 years, 6 months ago by wzp.
Viewing 15 posts - 1 through 15 (of 21 total)
  • October 12, 2010 at 8:31 am #2013
    webar2007
    Spectator

    Security question: If I upload a zip file here: [http://cheapsalesleads.com/downloads/travel.zip] for download, won’t this file still be indexed by the search engines or found by various users/surfers?

    Is there a way to secure the “downloads” folder or encrypt the download link/file when I add the product in WP eStore admin?

    October 12, 2010 at 11:13 am #25320
    wzp
    Moderator

    eStore takes care of encrypting the download links that are provided to customers.

    As for securing the actual files, consider the following…

    Create an empty “index.html” file in the directory that contains your downloads. For extra sarcasm, you might put a message in it like:

    Find what you’re looking for?

    Or better yet, a popup ad…

    Or you can “play nice” and just redirect them back to your main page:

    <meta http-equiv="refresh" content="0; url=http://www.yourdomain.com">

    Also, inside the downloads directory, create an “.htaccess” file with one line:

    Options -Indexes

    In the server’s document root directory, create a “robots.txt” file. It should contain the following lines…

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-content/
    Disallow: /wp-includes/
    Disallow: /wp-content/themes/
    Disallow: /wp-content/plugins/
    Disallow: /downloads/

    Basically, create a “Disallow” line for each directory you’d like robots and spiders to ignore. Note: the “robots.txt” is an optional “please don’t do that” type of thing. Search engines are not required to honor it.

    To enforce mandatory exclusion of directories and files, you can add/edit the “.htaccess” file in the server’s root directory, but that gets into “running with scissors” territory. You can do some serious damage if you aren’t careful… 8)
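    The mandatory approach can also be done without touching the root “.htaccess” at all, by adding a rule to the “.htaccess” inside the downloads directory itself. A sketch, assuming Apache 2.4 syntax and that your products are zip/pdf/mp3 files (adjust the extension list to whatever you actually sell):

```apache
# /downloads/.htaccess — deny all direct web requests for product files.
# The plugin's download script reads files server-side via PHP, so it is
# unaffected; only direct browser/spider access is blocked.
Options -Indexes
<FilesMatch "\.(zip|pdf|mp3)$">
    Require all denied
</FilesMatch>
```

    On Apache 2.2 the equivalent inside the `<FilesMatch>` block is `Order allow,deny` followed by `Deny from all`.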

    May 16, 2011 at 2:26 am #25321
    vfx001
    Member

    wzp,

    If using the Amazon S3 addon feature to deliver secure downloads with expiring links, do you recommend also having an index.html file in each bucket and/or folder, as mentioned above?

    I was wondering if doing that would add another layer of security along with the buckets being locked down from the public, or whether it is irrelevant when using the S3 feature?

    May 16, 2011 at 10:48 am #25322
    wzp
    Moderator

    It is not necessary to have index files in your buckets, because Amazon security is based on each object’s individual security setting. If an object is marked as private, it doesn’t matter whether someone knows it exists in your bucket.
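    The “expiring links” mentioned above work by signing the request parameters into the URL itself. A minimal sketch of the idea, using the legacy query-string authentication scheme (Signature Version 2) that was current when this thread was written; the bucket name, key, and credentials below are all placeholders:

```python
# Sketch: building a time-limited download link for a private S3 object.
# Anyone holding the URL can fetch the object, but only until "Expires".
import base64
import hashlib
import hmac
import time
import urllib.parse

def presigned_s3_url(bucket, key, access_key, secret_key, lifetime_secs=3600):
    """Return a time-limited GET URL for a private S3 object (SigV2 style)."""
    expires = int(time.time()) + lifetime_secs
    # The string S3 expects to be signed for a plain GET with no headers.
    string_to_sign = f"GET\n\n\n{expires}\n/{bucket}/{key}"
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    signature = urllib.parse.quote_plus(base64.b64encode(digest))
    return (f"https://{bucket}.s3.amazonaws.com/{key}"
            f"?AWSAccessKeyId={access_key}&Expires={expires}"
            f"&Signature={signature}")

url = presigned_s3_url("my-products", "His-Stuff/test.txt",
                       "AKIAEXAMPLE", "secretEXAMPLE")
print(url)  # the link stops working once "Expires" is in the past
```

    The eStore S3 add-on generates links of this shape for you; the point of the sketch is only that the expiry time is baked into the signature, so the link cannot be extended by editing the URL.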

    May 18, 2011 at 3:04 pm #25323
    vfx001
    Member

    wzp,

    As a follow-up, then: having additional folders within your bucket in which your objects are placed would seem necessary only for organization, rather than the traditional approach of “burying” a file in multiple layers of arbitrarily named folders?

    example: bucket name/klepqd/fewOihe/mopWio/woPiejs/your_file.zip

    May 18, 2011 at 3:42 pm #25324
    wzp
    Moderator

    Correct…

    A Bucket Full Of Objects — Amazon S3

    http://www.tipsandtricks-hq.com/a-bucket-full-of-objects-amazon-s3-3052

    Objects are referenced by their “keys,” which consist of an optional “pseudo folder” (directory) path name followed by the name of the object. The keys “His-Stuff/test.txt” and “Her-Stuff/test.txt” refer to two separate instances of the “test.txt” object. Because the “folder” part of each key is unique, so are the object instances. The term “pseudo folder” is used because S3 does not really store objects in folders the way Windows, OS X, or Linux does; the entire object key is considered (by S3) to be the equivalent of a file name.
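    The flat-namespace point can be illustrated in a few lines (the key names are taken from the example above):

```python
# S3 keys are flat strings; "folders" are just prefixes embedded in the key.
keys = ["His-Stuff/test.txt", "Her-Stuff/test.txt"]

# The trailing "file name" part is identical in both keys...
names = {k.rsplit("/", 1)[-1] for k in keys}
print(names)           # {'test.txt'}

# ...but the full keys differ, so they name two distinct objects.
print(len(set(keys)))  # 2
```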

    May 18, 2011 at 4:16 pm #25325
    vfx001
    Member

    wzp,

    Thank you for the confirmation…

    I look forward to utilizing this feature ! :-)

    November 10, 2011 at 6:25 pm #25326
    Louis
    Member

    Great tips.

    IMPORTANT to note that Amazon S3 has some limitations with SSL and CNAME / redirections.

    You cannot combine secured downloads/SSL (https instead of http) with CNAME redirections:

    your_subdomain.your_domain.com/folder/product.zip

    INSTEAD OF

    your_subdomain.your_domain.com.s3.amazonaws.com/folder/product.zip

    In short, you can do

    THIS: http://your_subdomain.your_domain.com/folder/product.zip

    OR

    THAT: https://your_subdomain.your_domain.com.s3.amazonaws.com/folder/product.zip

    so

    You CANNOT do this: https://your_subdomain.your_domain.com/folder/product.zip

    I am weighing the pros and cons…

    July 2, 2013 at 3:25 pm #25327
    whitelight308
    Member

    Hi,

    I uploaded all my MP3 and PDF products to the WP Media Library so I can easily add them to the products in eStore. The File URL for each product was created automatically by WP when I uploaded them. Will people be able to find these files somehow if they know how to search for them (I don’t)? If so, how can I further protect the files in the Media Library? Sorry if the answer is above, but I don’t understand the terminology and missed it if it’s there.

    From above: “Create an empty “index.html” file in the directory that contains your downloads. For extra sarcasm, you might put a message in it like: Find what you’re looking for?”

    Can I do this with the WP Media Library?

    Thanks!

    July 2, 2013 at 9:57 pm #25328
    wzp
    Moderator

    The following post will address your concern and suggest methods to better protect your downloadable files:

    https://support.tipsandtricks-hq.com/forums/topic/download-directory-protection

    It is suggested that you store product files in a directory outside the normal media library directory, with a directory permission of 0700; that the uploaded product files have permission 0400; and that you create an empty index.html file within the directory, with permission 0644. If you do this and use encrypted download links, you should be reasonably “fine.”
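    The permission scheme above can be sketched in a few lines; the temp directory here is just a stand-in for wherever you keep product files outside the media library, and “product.zip” is a placeholder file name:

```python
# Sketch of the 0700 / 0400 / 0644 layout described above.
import os
import stat
import tempfile

downloads = os.path.join(tempfile.mkdtemp(), "protected-downloads")
os.makedirs(downloads)
os.chmod(downloads, 0o700)        # only the owner may enter or list it

index = os.path.join(downloads, "index.html")
open(index, "w").close()          # empty index page to defeat listings
os.chmod(index, 0o644)            # world-readable

product = os.path.join(downloads, "product.zip")
open(product, "w").close()        # stand-in for an uploaded product file
os.chmod(product, 0o400)          # owner read-only

print(oct(stat.S_IMODE(os.stat(downloads).st_mode)))  # 0o700
```

    The web server’s PHP process (running as the directory owner) can still read the files to serve encrypted download links, while direct web access falls through to a 404 or forbidden response.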

    For maximum protection, you should use Amazon S3 to host your product files, instead.

    “Sarcasm” towards hackers is not only unprofessional, but can be a self-fulfilling invitation to have people attack your site.

    September 24, 2014 at 9:34 pm #25329
    russhansen
    Member

    I have a Synology NAS for storing my files. Access to one of my video files is [http://216.177.233.225:5000/fbdownload/EX3B0059_01_1.mov?dlink=2f766964656f2f4d54524e522d3030312f45583342303035395f30315f312e6d6f76] and if you enter that into a browser my video file begins to download.

    However, when I put this link into eStore with the product listing for the video, everything works up until the point the customer receives the download email and clicks on the encrypted download link, which brings up a browser window with the following message: “The secure download manager ran into a problem that it couldn’t handle, and is unable to process your download request. Please contact the site administrator. Please tell the site administrator the problem was due to the following reason: The file (on the server) containing your download couldn’t be opened.”

    Any suggestions? Is there a way to turn off the encrypted download link, or what do you suggest to make it work so the customer gets their download?

    Thanks, Russ

    September 25, 2014 at 12:33 am #25330
    wpCommerce
    Moderator

    @russhansen, You can uncheck the “Downloadable” checkbox so the plugin does not encrypt the URL of that product (it’s under the “Digital Content Details” section).

    September 25, 2014 at 1:48 am #25331
    wzp
    Moderator

    I have a Synology NAS for storing my files. Access to one of my video files is …

    Did you also set up your router to port-forward access to the NAS? If you are doing your testing from the same network as the NAS, you wouldn’t notice any problems.

    November 12, 2015 at 11:49 pm #25332
    clearscopedesign
    Participant

    Hello,

    We recently realized that many of our PDF digital downloads were appearing in search engine results. These items should not be linked to from anywhere else on the site, so we’re not exactly sure how the search engines are finding them.

    We followed the instructions to set the root downloads folder to 0700, and that seems to have solved the problem of those documents being accessible via the direct URL (a 404 page now appears instead of the files).

    We have also followed the instructions for creating the robots.txt file in the root directory and the .htaccess file in the downloads folder.

    If you could please just help clarify a couple things for us:

    1. The robots.txt and .htaccess files are what will hopefully keep search engines from indexing the download materials, correct (we understand that it is up to the search engine to obey or not)? Is it just a matter of waiting now to see if the search results update? Is there anything else that we can do in this regard to keep our files from appearing in search results?

    2. Since applying the 0700 permission to the root download folder seems to have solved the problem of our documents being freely downloadable, is there anything else that we need to do in this regard? We have tested in several browsers and in all of them a 404 error appears when attempting to access any of the files directly, which is perfect. Does this mean that we should be good to go now, or is it possible that in some other browsers the files are still accessible and we need to look at taking extra security measures?

    Thank you!

    November 13, 2015 at 1:07 am #25333
    admin
    Keymaster

    What you have done sounds good to me. So wait and see what result you get.

    Just a couple of clarifications…

    Yes, search engines can choose whether to obey the robots.txt file. Google does obey it.

    They don’t have any choice when it comes to the .htaccess file, though. If something is blocked via the .htaccess file, they can’t access it.

    You shouldn’t really need to change folder permissions for this; the robots.txt and/or the .htaccess file will do the job.


Copyright © 2023 | Tips and Tricks HQ