Node.js dies at 1000 concurrent connections

2023-09-11 10:14:12 Author: 秀满别走^

We're benchmarking node performance using a simple Hello World node server on AWS (EC2).

No matter what size instance we use, Node always appears to max out at 1000 concurrent connections (this is NOT 1000 per second, but 1000 connections open at one time). Shortly after that the CPU spikes and Node basically freezes.

Node v0.10.5

var http = require('http');
var server = http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('loaderio-dec86f35bc8ba1b9b604db6c328864c1');
});
server.maxHeadersCount = 0; // 0 disables the limit on incoming header count
server.listen(4000);

Node should be able to handle more than this, correct? Any thoughts would be greatly appreciated.

Also, the file descriptor limits (soft, hard, system) are set to 65096.
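
One thing worth double-checking (a Linux-only sketch, not part of the original post): limits configured for the login shell are not always inherited by the Node process, for example when it is launched by a supervisor, so it can help to print what the process itself actually sees:

var fs = require('fs');
var exec = require('child_process').exec;

// Soft and hard per-process fd limits, as seen by a child shell of this
// Node process (assumes a POSIX shell with the ulimit builtin).
exec('ulimit -Sn; ulimit -Hn', function (err, stdout) {
  if (err) throw err;
  console.log('soft/hard nofile limits:\n' + stdout);
});

// System-wide maximum number of open files (Linux-specific path).
fs.readFile('/proc/sys/fs/file-max', 'utf8', function (err, data) {
  if (err) throw err;
  console.log('fs.file-max: ' + data.trim());
});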

Recommended answer

Use the posix module to raise the limit on the number of file descriptors your process can use.

Install posix:

npm install posix

Then in your code that runs when you launch your app...

var posix = require('posix');

// raise maximum number of open file descriptors to 10k,
// hard limit is left unchanged
posix.setrlimit('nofile', { soft: 10000 });
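
Putting the two pieces together, a minimal sketch of the benchmark server that raises its own descriptor limit at startup could look like the following; the getrlimit call is only there to print the resulting limits for verification, and 10000 is the same illustrative soft limit used above:

var posix = require('posix');
var http = require('http');

// Raise the soft limit on open file descriptors before accepting traffic;
// the hard limit is left unchanged, so 'soft' must not exceed it.
posix.setrlimit('nofile', { soft: 10000 });

// Print the limits the process now actually has, as a sanity check.
console.log('nofile limits:', posix.getrlimit('nofile'));

var server = http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.end('loaderio-dec86f35bc8ba1b9b604db6c328864c1');
});
server.maxHeadersCount = 0;
server.listen(4000);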