Tips and Tricks HQ Support Portal › Forums › WP eStore Forum › curl problem: CURLOPT_FOLLOWLOCATION
Tagged: cURL, curl_setopt, estore, S3
This topic has 9 replies, 3 voices, and was last updated 13 years, 10 months ago by wzp.
January 2, 2011 at 11:29 pm #2443 (dc, Member)
I sell large downloads, and it has been recommended that I use cURL as my download method. However, when I try to initiate a download, I get the following error:
Warning: curl_setopt() [function.curl-setopt]: CURLOPT_FOLLOWLOCATION cannot be activated when safe_mode is enabled or an open_basedir is set
I've done some digging, and it appears this is due to a security restriction in PHP.
Any fix for this?
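[Editor's note: a common workaround for this restriction, sketched here as an illustration and not taken from this thread, is to follow redirects manually instead of enabling CURLOPT_FOLLOWLOCATION. The function names below are illustrative assumptions.]

```php
<?php
// Parse the Location header out of a raw HTTP header block.
function extract_location($raw_headers) {
    if (preg_match('/^Location:\s*(.+)$/mi', $raw_headers, $m)) {
        return trim($m[1]);
    }
    return null;
}

// Fetch a URL, following 3xx redirects ourselves so that
// CURLOPT_FOLLOWLOCATION is never needed.
function fetch_following_redirects($url, $max_redirects = 5) {
    for ($i = 0; $i <= $max_redirects; $i++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_HEADER, true); // include headers in output
        $response = curl_exec($ch);
        $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        $header_size = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
        curl_close($ch);
        if ($code >= 300 && $code < 400) {
            $next = extract_location(substr($response, 0, $header_size));
            if ($next !== null) {
                $url = $next; // follow the redirect ourselves
                continue;
            }
        }
        return substr($response, $header_size); // final body
    }
    return false; // too many redirects
}
?>
```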
January 3, 2011 at 2:49 am #27711 (amin007, Participant)
Is your server running PHP in safe mode?
January 3, 2011 at 3:09 am #27712 (dc, Member)
It is not.
safe_mode is set to Off and max_execution_time is set to 0.
I have tried all the methods and downloads still time out prematurely. I am hoping cURL might be the answer, although it seems from one of the other threads here that the culprit may be Apache rather than PHP.
January 3, 2011 at 1:35 pm #27713 (wzp, Moderator)
Shameless plug for a new eStore feature:
As an alternative to hosting large files on your own site, you might want to consider using the native Amazon S3 support in eStore 4.8.x and higher. Not only is it more reliable than trying to self-host the files, but your customers will also enjoy a much faster download experience.
January 3, 2011 at 4:32 pm #27714 (dc, Member)
I'd rather not pay another fee just to get downloads to work properly.
I think the solution is to hand the file off to the browser, thus circumventing the PHP execution and memory limits entirely.
It's not the most elegant thing in the world, but I've tested it and it works without a hitch with files of all sizes (tested up to 1.5 GB).
The basic idea: create a temporary symbolic link that points to the actual file being requested, use a header('Location: ...') redirect to hand the transfer to the browser, and then clean up/delete the symlink after a designated amount of time or number of access attempts (just as the other methods currently do).
I've got the guts of the symlink creation/transfer working:
<?php
$ran = rand();
$file_name = "mylargefile.zip";
$path = "/mypath/path/";
# Create a uniquely named symlink: download_<randomnumber>.zip
$randomFile = sprintf("download_%s.zip", $ran);
# PHP's built-in symlink() avoids shelling out to `ln -s`
# (and the quoting issues that come with it).
if (!symlink($path . $file_name, $randomFile)) {
die("Failed to create symbolic link or retrieve download.");
}
header("Location: http://www.mydomain.com/pathtosymlinks/$randomFile");
?>
Still need to sort out cleaning it up, etc. Would love some help with making this a bona fide Method 8.
January 3, 2011 at 7:22 pm #27715 (dc, Member)
With a few tweaks I now have this working as a selectable 'Method 8'.
URL obfuscation works as planned, access limits work as they should, and best of all, you can download large files with no PHP timeouts!
Now to look into getting the symlinks cleaned up after a certain amount of time.
I also have some things hardcoded which I'd like to move to the user-definable menu settings.
January 3, 2011 at 8:01 pm #27716 (wzp, Moderator)
At one time I considered the symbolic link approach as well, but then I'd have to account for users who have safe mode or other restrictions turned on. It might be worth looking into as "yet another available option..."
Amazon is offering a one-year free trial of all AWS products. It includes a monthly storage quota of 5 GB for S3.
As for cleanup of the symbolic links, I think that once the browser begins downloading, it may be possible to remove the link. But that's just a theory that would have to be tested by someone.
January 3, 2011 at 8:55 pm #27717 (dc, Member)
Of course, it's not without its own set of limitations. For those of us who need to serve large downloads and can use it, however, it seems to be the best workaround. You can never have too many options!
My file sizes and monthly transfer amounts make S3 cost-prohibitive. It's a great service, though... maybe if I can sell enough product I can justify the cost.
Cleanup shouldn't be too bad. I'm thinking of a script that checks a file's age and deletes it if it is beyond a certain threshold, then using cron to run that periodically.
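[Editor's note: the age-based cleanup described above could be sketched roughly as follows. The function name, the download_* naming pattern, and the 30-minute threshold are illustrative assumptions, not values from this thread.]

```php
<?php
// Delete symlinks in $dir older than $max_age_seconds; returns the
// number of links removed. Intended to be run periodically from cron.
function cleanup_old_symlinks($dir, $max_age_seconds) {
    $removed = 0;
    foreach ((array) glob($dir . "/download_*") as $link) {
        // Use lstat() rather than filemtime(): filemtime() follows the
        // symlink and would report the age of the real file instead.
        $info = lstat($link);
        if ($info !== false && (time() - $info['mtime']) > $max_age_seconds) {
            unlink($link);
            $removed++;
        }
    }
    return $removed;
}

// Example crontab entry to run it every 10 minutes:
// */10 * * * * php /path/to/cleanup.php
cleanup_old_symlinks("/mypath/pathtosymlinks", 1800);
?>
```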
January 4, 2011 at 1:47 am #27718 (dc, Member)
Just a little update:
There seems to be no issue deleting symlinks once the transfer has started.
Thus, my final(ish) solution is:
When a request comes in for a download, create a temporary, random directory containing a temporary, random filename. That file is a symbolic link pointing to the actual file being requested.
Initiate the transfer via the browser using a header('Location: ...') redirect.
Call an external PHP file, which runs in the background, waits 30 seconds (to be sure you don't delete the file before the transfer gets going), and then removes the temporary file and temporary directory.
So you've got a very small window in which a temporary, random file location is exposed to the client browser. Even if they catch it, the link is gone forever in 30 seconds.
All the nifty access restrictions and link-duration features already included with WP eStore still function as you would expect.
Gotta clean up this mess of code, but it looks like I can now serve large downloads.
Hooray, Method 8!
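[Editor's note: the flow described above could be sketched roughly like this. The function name, parameters, and the shape of the background cleanup call are illustrative assumptions, not actual eStore code.]

```php
<?php
// Create a random temporary directory under the web root containing a
// symlink to the real file; return the relative URL path to redirect to.
function create_download_link($real_path, $web_root, $file_name) {
    $tmp_dir = "dl_" . bin2hex(random_bytes(8)); // PHP 7+; use uniqid() on PHP 5
    mkdir($web_root . "/" . $tmp_dir);
    symlink($real_path, $web_root . "/" . $tmp_dir . "/" . $file_name);
    return $tmp_dir . "/" . $file_name;
}

// Then hand the transfer to the browser, e.g.:
//   header("Location: http://www.mydomain.com/" . create_download_link(...));
// and launch the background cleanup, e.g.:
//   exec("php cleanup.php " . escapeshellarg($tmp_dir) . " > /dev/null 2>&1 &");
// where cleanup.php does: sleep(30); unlink($link); rmdir($dir);
?>
```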
January 4, 2011 at 4:31 am #27719 (wzp, Moderator)
Please visit my contact form and leave me your e-mail address...
Thanks.