
php - json_encode runs out of memory on large data

When using json_encode to convert a large dataset to JSON, I found that memory gets exhausted. Is there a solution that can replace json_encode? The same problem also seems to occur when looping over large data. How can this be solved?

PHP中文网 2823 days ago

All replies (4)

  • 世界只因有你 2017-05-16 13:11:11

    // Remove the script execution time limit
    set_time_limit(0);
    
    // Raise the maximum memory available to the script
    ini_set("memory_limit", "2048M");
    

  • 巴扎黑 2017-05-16 13:11:11

    For loops, consider the yield keyword (a generator) to cut memory consumption; see the sketch below.
    As for json_encode, the question is too vague to say more.
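
    A minimal sketch of the yield idea, assuming the goal is to produce one large JSON array without holding it all in memory. fetchRows() is a hypothetical data source; replace it with your own query loop.

    // Stream rows from a generator and write JSON incrementally,
    // so json_encode only ever sees one small row at a time.
    function fetchRows(): Generator
    {
        for ($i = 0; $i < 1000000; $i++) {
            yield ['id' => $i, 'name' => "row$i"];
        }
    }

    $out = fopen('data.json', 'w');
    fwrite($out, '[');
    $first = true;
    foreach (fetchRows() as $row) {
        if (!$first) {
            fwrite($out, ',');
        }
        fwrite($out, json_encode($row)); // encodes a single row, not the whole set
        $first = false;
    }
    fwrite($out, ']');
    fclose($out);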

  • 黄舟 2017-05-16 13:11:11

    If the data is only saved and parsed back by PHP itself, you can use serialize(), which performs much better than json_encode.

    My answer isn't comprehensive, so feel free to skip it; it only suits certain specific scenarios. A sketch of the idea follows.
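
    A minimal sketch of the serialize() approach described above, for data that only PHP itself will read back. The file name is just an example.

    $data = ['user' => 42, 'tags' => ['a', 'b'], 'ts' => time()];

    // Store using PHP's native serialization format...
    file_put_contents('cache.ser', serialize($data));

    // ...and restore it later without json_encode/json_decode.
    $restored = unserialize(file_get_contents('cache.ser'));

    var_dump($restored === $data); // bool(true) for plain arrays/scalars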

  • PHPz 2017-05-16 13:11:11

    When I run into large-data problems, I usually ask whether the data can be split. For example, to cache a data list, I can cache just the IDs and then fetch each record by ID (with every record cached individually); see the sketch below. Of course, the right approach depends on the specific case.
    Also, serialize can be very slow, and when you later need to process that JSON, reading and parsing it is a problem too.
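
    A minimal sketch of the "split the data" idea, assuming an APCu-backed cache; cacheSet()/cacheGet() are hypothetical wrappers, and $rows is example data standing in for your real source.

    function cacheSet(string $key, $value): void { apcu_store($key, $value); }
    function cacheGet(string $key) { return apcu_fetch($key); }

    $rows = [['id' => 1, 'name' => 'a'], ['id' => 2, 'name' => 'b']];

    // Write side: one small cache entry per record, plus the ID list.
    $ids = [];
    foreach ($rows as $row) {
        cacheSet("item:{$row['id']}", $row);
        $ids[] = $row['id'];
    }
    cacheSet('item:ids', $ids);

    // Read side: rebuild only the slice you actually need.
    $items = [];
    foreach (array_slice(cacheGet('item:ids'), 0, 20) as $id) {
        $items[] = cacheGet("item:$id");
    }
    echo json_encode($items); // encode one page, not the whole dataset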
