How to sequence events in PHP when uploading files to Amazon S3

2023-09-11 09:08:49  Author: 掏心掏肺不如掏钱

I am getting varied results with some PHP code I have written to upload files to S3 then call an EC2 instance to perform actions on the uploaded file.

Here is the order I do things -

1) use S3 class to put file

$result = $s3->putObjectFile($uploadDIR, $bucket, $name, S3::ACL);
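
For reference, here is a fuller sketch of that call, assuming the standalone S3.php class (which is where putObjectFile() comes from); the access keys, bucket name, object key and the ACL_PUBLIC_READ constant below are placeholders for illustration, not values from the question:

require_once 'S3.php';

$s3 = new S3('AWS_ACCESS_KEY', 'AWS_SECRET_KEY'); // placeholder credentials

$uploadDIR = '/tmp/video.mp4';   // local path of the uploaded file
$bucket    = 'my-bucket';        // placeholder bucket name
$name      = 'videos/video.mp4'; // object key to create in the bucket

// putObjectFile() returns true on success and false on failure
$result = $s3->putObjectFile($uploadDIR, $bucket, $name, S3::ACL_PUBLIC_READ);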

2) check $result

if($result == "1") {
//file made it to s3

3) use cURL to call EC2 instance and perform action on file in S3
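
Something along these lines for the cURL call; the EC2 hostname, the script name and the "file" parameter are made-up examples, not taken from the question:

// notify the EC2 instance which S3 object to work on
$ch = curl_init('http://ec2-xx-xx-xx-xx.compute-1.amazonaws.com/process.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('file' => $name)); // S3 key of the uploaded file
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);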

I am using this with video files. When I upload a small video file (e.g. 3MB) it works fine, but for larger videos (e.g. 80MB) the code doesn't seem to get past step 1. The file is moved to S3 OK, but my guess is that after a while PHP gives up waiting to see if $result == 1 and so does not execute the rest of the code.

What is the best way to handle something like this? How can I detect that the file has been uploaded to S3 and then run some code when it has?

Recommended answer

For big files, the time it takes to upload them will probably be longer than the script's max_execution_time.

You could just use the set_time_limit() function but it's still probably not a good idea for web pages as the script would just "hang" there without any user feedback (output to the browser).
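
For what it's worth, raising the limit is a one-liner (0 removes the limit for the current request only):

set_time_limit(0); // no time limit for this request
// ... long-running upload and cURL call go here ...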

It would probably be better to:

1) Use set_time_limit() in the script and don't let it die.
2) Store the filename in a temporary location (DB, session, etc.) and get a unique ID for it.
3) Output a page with some AJAX code that (repeatedly) queries a second script for the status of the file (done, failed, undefined?).
4) In the initial script, wait for the operation to finish and update the DB with the result.
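
Not part of the original answer, but here is a rough sketch of what that could look like, using a small status file in place of the DB; every name below (the status script, the id parameter, the state strings) is invented for illustration:

// In the initial, long-running upload script:
set_time_limit(0);
$id   = uniqid();                                     // unique ID handed to the AJAX page
$file = sys_get_temp_dir() . '/upload_status_' . $id; // stand-in for a DB row

file_put_contents($file, 'pending');
$result = $s3->putObjectFile($uploadDIR, $bucket, $name, S3::ACL_PUBLIC_READ);
file_put_contents($file, $result ? 'done' : 'failed');

// status.php - the second script the AJAX code polls with ?id=...
$id   = preg_replace('/[^a-zA-Z0-9]/', '', $_GET['id']); // sanitise the ID
$file = sys_get_temp_dir() . '/upload_status_' . $id;

$status = is_file($file) ? trim(file_get_contents($file)) : 'undefined';

header('Content-Type: application/json');
echo json_encode(array('status' => $status)); // "pending", "done" or "failed"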