WP eStore and Amazon S3
February 19, 2011 at 5:19 pm #2441 | vfx001 (Member)
Hello,
Just purchased eStore and am integrating it with Amazon S3 for my downloads.
I’d first like to confirm something: even though S3 is serving the content, will the RAM allocated to my hosting account still restrict the size of the downloads? Is that limit still applicable when using the S3 functionality?
OK, I made a test product and transacted a live purchase to myself. All seems fine on the PayPal side, and the email notices are going out. For the product I followed your URL guidance for the S3 bucket naming, and eStore had no problem finding and serving the download. The concern I have is that in Firefox I can see the location of the bucket in the prompt the browser gives me to either open the downloaded file or save it. I also have “Digital Product URL” selected in the plugin settings.
History of the process: I’ve had to attempt multiple downloads with this test purchase in order to troubleshoot a situation that initially returned a 0 KB zip file. I was using an existing CNAME path when getting the 0 KB error. I then created a new bucket and entered the URL structure as3tp://bucket.s3.amazonaws.com/folder/object, adjusted for my bucket name and file location. Now the 0 KB problem is gone and the download is fine, but I still see the location in the Firefox download prompt window.
So I am wondering what I may be doing wrong here…
Also, can you explain what the “AWS S3 Presigned URL Expiry” setting does?
Much thanks!
February 20, 2011 at 2:32 pm #29235 | wzp (Moderator)
Don’t panic about seeing the bucket name in Firefox.
What is happening is that eStore converts the encrypted URL into a pre-signed URL request, and the browser is then redirected to the S3 bucket. Because control passes from eStore to S3, you are free of all the previous problems that have occasionally been plaguing eStore users. The request is only valid for less than 5 minutes: if you try to bookmark the resulting URL, it won’t work 300 seconds after it is created. You could adjust this to a lesser value, but only if you are sure that your server’s clock is in sync with the one at Amazon.
As long as the object is not marked public in the S3 console (the default setting), you are fine.
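For readers who want to see what this hand-off amounts to, here is a minimal sketch in Python with boto3, purely for illustration: eStore itself is a PHP plugin, and the bucket and key names below are placeholders, not eStore’s actual code. The signature in the query string is what enforces the short validity window:

```python
import boto3

s3 = boto3.client("s3")  # credentials and region come from the environment

# Generate a time-limited GET URL for a private object. Whoever holds the
# URL can fetch the object until it expires; the object itself stays
# private, and no bucket ACL changes are needed.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-bucket", "Key": "folder/object.zip"},  # placeholders
    ExpiresIn=300,  # seconds; mirrors the 5-minute window described above
)
print(url)
# e.g. https://my-bucket.s3.amazonaws.com/folder/object.zip?X-Amz-Signature=...
```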
February 20, 2011 at 3:14 pm #29236 | vfx001 (Member)
Hi wzp,
Thank you for the feedback.
A follow-up question regarding the pre-signed URL that expires after the default 300 seconds: does that override the time set for the buyer’s expiring link in their confirmation email (24 hours by default)?
Would the buyer still be able to download the clip via their encrypted link multiple times in those 24 hours?
Also, when using S3, am I still constrained by my hosting server’s RAM for a continuous download while the script is running? From your comments above about eStore handing control off to S3, it sounds like RAM limitations on the hosting server are NOT an issue with the S3 functionality. Am I understanding that correctly?
Thank you!
February 20, 2011 at 4:09 pm #29237 | wzp (Moderator)
There are two expiration periods at work: the expiry of the encrypted eStore URL and the expiry of the pre-signed URL. The expiry of the encrypted URL is 24 hours. It can be used multiple times during that period, up to the download count limit you have set (5 by default). Each time the encrypted URL is used, a pre-signed URL request is generated. Each pre-signed URL can also be used multiple times within its own (default 300 second) expiry period.
Correct. Once eStore generates the pre-signed URL request and transfers control to Amazon, you are no longer constrained by any hosting provider limitations. If you can find a way to break S3 with your downloads, I’m sure Amazon would love to hear from you.
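To make the two expiry layers concrete, here is a hypothetical sketch (Python with boto3 again; this is not eStore’s actual code, and every name in it is invented) of a handler that checks the long-lived encrypted link before minting a short-lived pre-signed URL:

```python
import time
import boto3

s3 = boto3.client("s3")

LINK_TTL = 24 * 3600   # encrypted-link lifetime: 24 hours
MAX_USES = 5           # download count limit (the default mentioned above)
PRESIGN_TTL = 300      # pre-signed URL lifetime: 300 seconds

def redeem(link):
    """Validate the long-lived link, then mint a fresh short-lived S3 URL."""
    if time.time() - link["issued_at"] > LINK_TTL:
        raise PermissionError("encrypted link expired")
    if link["uses"] >= MAX_USES:
        raise PermissionError("download limit reached")
    link["uses"] += 1
    # Each redemption produces its own 300-second pre-signed URL.
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": link["bucket"], "Key": link["key"]},
        ExpiresIn=PRESIGN_TTL,
    )

# Hypothetical record created at purchase time.
link = {"bucket": "my-bucket", "key": "folder/object.zip",
        "issued_at": time.time(), "uses": 0}
print(redeem(link))
```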
February 20, 2011 at 4:44 pm #29238 | vfx001 (Member)
wzp,
Thanks! Very clear and concise; I appreciate it.
I was initially having issues with 0 KB zip files being downloaded with nothing in them. At that time I was using a pre-established CNAME for the bucket location in the file location field. Subsequently, I set up a new bucket, followed the naming convention
as3tp://bucket.s3.amazonaws.com/folder/object, and entered this as the file location in eStore.
I suspect I have to set up a new CNAME (if I wish to use one) with my host following the “as3tp://” path structure?
February 20, 2011 at 4:54 pm #29239 | wzp (Moderator)
The 0 KB file problem was related to eStore handling the downloads. The as3tp prefix is a “hint” that tells eStore to process the download using S3. You should be able to use the existing CNAME as part of an as3tp URI, as long as it resolves to the form that uses the bucket name as part of the host name (bucket.s3.amazonaws.com/folder/object) and not the form that uses the bucket name as part of the directory path (s3.amazonaws.com/bucket/folder/object).
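The two forms wzp contrasts here are what Amazon calls virtual-hosted-style and path-style addressing. As an illustration (Python and boto3 again; the bucket and key are placeholders), you can force either style and compare the hosts of the resulting pre-signed URLs:

```python
import boto3
from botocore.config import Config

params = {"Bucket": "my-bucket", "Key": "folder/object.zip"}  # placeholders

# Virtual-hosted style: the bucket name is part of the host name,
# e.g. https://my-bucket.s3.amazonaws.com/folder/object.zip?...
virtual = boto3.client("s3", config=Config(s3={"addressing_style": "virtual"}))
print(virtual.generate_presigned_url("get_object", Params=params, ExpiresIn=300))

# Path style: the bucket name is part of the directory path,
# e.g. https://s3.amazonaws.com/my-bucket/folder/object.zip?...
path = boto3.client("s3", config=Config(s3={"addressing_style": "path"}))
print(path.generate_presigned_url("get_object", Params=params, ExpiresIn=300))
```

Only the virtual-hosted form can sit behind a CNAME, since the bucket must be identifiable from the host name alone.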
February 20, 2011 at 5:30 pm #29240vfx001MemberThe existing CNAME uses the bucket name as part of the host path BUT when that was set up the ‘as3tp’ URI was not part of the record… just bucket name.s3.amazonaws.com
THAT must have been the problem ???
Also, my host has 64MB Ram for php and the zip file was in the low 30 MB size…. so IF it didn’t hand off to S3 and tried to buffer the download could it have been a ram issue seeing that WordPress has to take a good chunk of that ram allotment ?
February 20, 2011 at 5:48 pm #29241 | wzp (Moderator)
When eStore does the download processing (no as3tp), it acts as a middleman between the actual file location and the browser. As such, the process has always depended on the absence of any resource restrictions on the part of the hosting provider, and there are simply too many things a provider can do that can mess up the download process. The as3tp solution does not suffer from this problem. It counts on the fact that Amazon uses its own network for its own business (selling books and music) and will not allow anything to screw up the download experience for its own customers.
You should be able to use the existing CNAME in the product URL as as3tp://CNAME/folder/object.
February 20, 2011 at 9:11 pm #29242 | vfx001 (Member)
wzp,
Success!
I very much appreciate the explanation and support!
I did a live test purchase of a 52 MB file with no issues whatsoever. Looking ahead, I can see downloads in the 300-400 MB range at times. For those, I figured a VPS host with dedicated RAM in the 700 MB to 1 GB range would be required for eStore to act as the middleman and keep the locations cloaked. The way I see it, a shared hosting plan would not be able to handle downloads of that size?
Without S3, and given downloads of that size, would a VPS with dedicated resources (RAM) be the next hosting plan to move to? Or are there other potential issues with a VPS, e.g. timeouts during the downloads eStore would be buffering and securing?
February 21, 2011 at 3:25 am #29243 | wzp (Moderator)
Personally, I feel that S3 is the more cost-effective option. You gain an unprecedented level of reliability and speed for a fraction of the cost and aggravation you’d pay for a VPS and all the support costs. The only time I can see S3 not being cost-effective is when you start getting into massive sales levels and downloads.
February 21, 2011 at 3:46 am #29244 | vfx001 (Member)
wzp,
Thank you for your insight; it is MUCH appreciated!
Correct me if I’m wrong on this observation regarding content on S3 and eStore. I have found that when hosting some of my site’s static content on S3 (jpegs, flv files, gifs, pngs, mp4 files), I have to set the permissions on those items to world “read”. THAT isn’t required for content being approved by eStore for download after purchase, is it?
It appears to me that with the eStore S3 integration, the approval is sent via the expiring key that serves the file for download, rather than the file needing to be world-readable as in browser calls for static content. Is that a correct observation?
February 21, 2011 at 5:53 am #29245 | wzp (Moderator)
Yes, that is correct: you do not have to set permissions. The expiring URL takes care of everything.
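One way to convince yourself of this, sketched in Python with boto3 and requests (the bucket and key names are placeholders): an object left private refuses a plain request but downloads fine through a pre-signed URL.

```python
import boto3
import requests

BUCKET, KEY = "my-bucket", "folder/object.zip"  # placeholders
s3 = boto3.client("s3")

# A plain, unsigned URL to a private object is refused by S3.
plain = f"https://{BUCKET}.s3.amazonaws.com/{KEY}"
assert requests.get(plain).status_code == 403  # Access Denied

# The same private object fetched via a pre-signed URL succeeds.
signed = s3.generate_presigned_url(
    "get_object", Params={"Bucket": BUCKET, "Key": KEY}, ExpiresIn=300
)
assert requests.get(signed).status_code == 200
```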