How do I batch upload to S3?

2023-09-11 08:35:21 Author: 总有刁民想捶朕

I recently refactored some of my code to stuff rows into a db using 'load data', and it works great -- however, for each record I must upload 2 files to S3, which totally destroys the magnificent speed upgrade I was getting. Whereas I was able to process 600+ of these documents/second, they now trickle in at 1/second because of S3.
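For context, the bulk insert described above might look like this from Java, assuming MySQL's LOAD DATA LOCAL INFILE over JDBC (the connection string, table name, and file name here are hypothetical placeholders):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class BulkLoad {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details -- adjust to your setup.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://localhost/mydb?allowLoadLocalInfile=true",
                    "user", "password");
                 Statement stmt = conn.createStatement()) {
                // One LOAD DATA statement replaces thousands of row-by-row
                // INSERTs, which is where the 600+ records/second comes from.
                stmt.execute("LOAD DATA LOCAL INFILE 'records.csv' "
                           + "INTO TABLE records "
                           + "FIELDS TERMINATED BY ',' "
                           + "LINES TERMINATED BY '\\n'");
            }
        }
    }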

What are your workarounds for this? Looking at the API, I see that it is mostly RESTful, so I'm not sure what to do -- maybe I should just stick all this into the database. The text files are usually no more than 1.5 KB. (The other file we stuff in there is an XML representation of the text.)

I already cache these files for HTTP requests to my web server, as they are used quite a lot.

Btw: our current implementation uses Java; I have not yet tried threads, but that might be an option.

Suggestions?

Recommended answer

You can use multiple threads to upload the files in parallel. S3 handles concurrent connections well, and since each upload is small and I/O-bound, running many at once should recover most of your throughput.
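A minimal sketch of that threaded approach, assuming the AWS SDK for Java v1; the bucket name, pool size, and file list below are hypothetical placeholders:

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;

    import java.io.File;
    import java.util.List;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;

    public class ParallelS3Uploader {
        public static void main(String[] args) throws InterruptedException {
            // One shared, thread-safe client for all uploads.
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

            // Hypothetical pool size; tune until throughput stops improving.
            ExecutorService pool = Executors.newFixedThreadPool(20);

            // Hypothetical file list -- in practice, the text file and its
            // XML twin for each record.
            List<File> files = List.of(new File("doc1.txt"), new File("doc1.xml"));

            for (File f : files) {
                // Each small PUT is I/O-bound, so the uploads overlap well.
                pool.submit(() -> s3.putObject("my-bucket", f.getName(), f));
            }

            pool.shutdown();
            pool.awaitTermination(10, TimeUnit.MINUTES);
        }
    }

Since each upload is an independent PUT, a pool like this needs no changes to the rest of the pipeline; the 'load data' step and the S3 uploads can proceed concurrently.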