

node.js - Node.js queue over a large array runs out of memory

The error is as follows:
FATAL ERROR: CALL_AND_RETRY_0 Allocation failed - process out of memory

My code:

connection.query('select * from `table` limit 100000', function(err, rows, fields) {
The database connection basically works through a queue: it keeps fetching data, roughly 10,000 rows per batch. Most of the time it runs unstably and throws the error above.

var http = require('http');

var array = [1, 2, 3, 4]; // queue of items (URLs in practice) still to be fetched

function check() {
    if (array.length > 0) {
        var url = array.pop();
        load(url);
    }
}

function load(url) {
    http.get(url, function (res) {
        res.resume(); // drain the response so the socket is freed
        check();      // then move on to the next item in the queue
    });
}

http://stackoverflow.com/ques... is similar to my situation.
What is a good way to solve this? Is there a way to keep garbage-collecting the memory?

PHP中文网 · 2874 days ago

All replies (1)

  • PHP中文网    2017-04-17 15:53:41

    • If my count is right, you are pulling 100,000 rows out in one go. Assuming one row is about 10 KB, 100,000 rows come to roughly 1 GB of memory, hence the OOM. (A quick way to confirm this is sketched at the end of this reply.)

    • The best way to deal with this is stream processing; you will want this approach. Set the highWaterMark to 10,000 each time. For example, if there are 1,000,000 rows in total to fetch, you no longer have to manage the array and load in your code yourself: just write a ReadStream that processes 10,000 rows per batch.

    • TALK IS CHEAP, SHOW ME THE CODE

    const { Writable } = require('stream')

    const SQL_SINGLE_RETURN_ROWS = 20000

    // consumer side: object-mode writable that buffers at most
    // SQL_SINGLE_RETURN_ROWS rows before backpressure kicks in
    const resStream = new Writable({
      highWaterMark: SQL_SINGLE_RETURN_ROWS,
      objectMode: true,
      write(row, _encoding, callback) {
        // process one row here, then ask for the next
        callback()
      }
    })
    resStream.on('finish', () => {
      // stream end: every row has been processed
    })

    const SQL = 'select * from `table` limit 100000'
    const query = connection.query(SQL).stream({ highWaterMark: SQL_SINGLE_RETURN_ROWS / 2 })
    query.pipe(resStream)
    query.on('error', e => console.error(e))
    query.on('end', () => {
      // if you need to release the connection
      // connection.release()
    })
    • The pseudocode above is untested; adapt it to your actual business logic.

    Note: it is best to set the highWaterMark of the consuming stream (resStream) a bit larger than that of the MySQL read stream, otherwise the data cannot be consumed in time.
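    If you do not want to hand-tune highWaterMark at all, the driver's own row streaming with manual pause/resume is another option. The following is only a sketch, assuming the mysqljs/mysql "streaming query rows" API (connection.pause() / connection.resume()); handleRow is a hypothetical per-row processor with a done callback:

    const SQL = 'select * from `table` limit 100000'

    connection.query(SQL)
      .on('error', err => console.error(err))
      .on('result', row => {
        // stop the driver from delivering more rows while this one is handled
        connection.pause()
        handleRow(row, () => {
          // hypothetical per-row processor; resume once the row is done
          connection.resume()
        })
      })
      .on('end', () => {
        // all rows delivered
      })

    Compared with the piped version above, this holds only one row at a time, which is easier to reason about but gives lower throughput.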

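    To confirm the ~1 GB estimate on your own data (the 10 KB per row figure is only an assumption), a minimal check is to log heap usage before and after buffering the full result set:

    const heapMB = () => Math.round(process.memoryUsage().heapUsed / 1024 / 1024)

    const before = heapMB()
    connection.query('select * from `table` limit 100000', (err, rows) => {
      if (err) throw err
      // rows is now fully buffered in memory
      console.log(`rows: ${rows.length}, heap grew by ~${heapMB() - before} MB`)
    })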
