I'm having serious problems deploying my Laravel application. It's a news site, nothing complicated. The entire homepage is cached in Redis as pure HTML, and query results are also stored in Redis, so MySQL usage is minimal. The application is fully functional, so we decided to test it in the production environment at night. After deployment, we noticed that CPU usage jumped to 90-100% on all 12 cores, so the application was running very slowly or not at all. There are about 400 users and 17-20 requests per second.
We tried changing server settings (e.g. PHP-FPM) without success. We ran some tests using ApacheBench (ab), and the results showed CPU usage around 80-100% with just 10 concurrent users. Next, we repeated the test against a clean Laravel install (without our application code) with similar results.
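For what it's worth, the PHP-FPM knobs that usually matter here are the process-manager settings in the pool config, plus OPcache: a stock Laravel install pegging 12 cores at only 10 concurrent users is often a sign that scripts are being recompiled on every request. The paths, PHP version, and values below are assumptions for a 12-core box, not your actual config:

```ini
; /etc/php/8.2/fpm/pool.d/www.conf — path and numbers are assumptions
; At ~20 req/s, a static pool sized to the hardware avoids fork churn.
pm = static
pm.max_children = 24        ; ~2x cores is a common starting point
pm.max_requests = 500       ; recycle workers to contain memory leaks

; php.ini — verify OPcache is actually enabled for the FPM SAPI
opcache.enable = 1
opcache.validate_timestamps = 0   ; production only: skip per-request stat() checks
```

You can confirm the OPcache state with `php-fpm -i | grep opcache` (note that the CLI and FPM SAPIs read separate ini files, so checking `php -i` alone can mislead).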
What we use:
We noticed that Nginx caching can reduce CPU usage in applications like WordPress or Drupal, but I don't think Laravel uses it. Does Laravel need to be configured in some special way to sit behind an Nginx reverse proxy with caching?
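To answer my own sub-question as far as I understand it: Nginx-level page caching is framework-agnostic — Nginx caches whatever the proxied upstream returns, so Laravel needs no special support. A hedged sketch of a micro-cache in front of the app (zone name, cache path, TTL, and hostname are all illustrative assumptions):

```nginx
# In the http block — cache path and zone name are made up for this example
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=laravel_cache:10m
                 max_size=256m inactive=10m;

server {
    listen 80;
    server_name your_news_app.com;   # assumed hostname from the question

    location / {
        proxy_cache laravel_cache;
        proxy_cache_valid 200 10s;             # micro-cache: even 10s absorbs traffic bursts
        proxy_cache_use_stale updating error;  # serve stale copies while revalidating
        proxy_ignore_headers Cache-Control;    # assumption: cache regardless of app headers
        proxy_set_header Host $host;
        proxy_pass http://localhost:8000;
    }
}
```

Note that `proxy_ignore_headers Cache-Control` caches every 200 response, including personalized pages, so it only fits content that is identical for all visitors (like a news homepage).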
I really don’t know what to do. Has anyone encountered similar problems?
P粉463840170 2024-03-28 00:33:42
Laravel runs great on Nginx. I use Nginx to run my application in a container on Google Cloud without any performance issues.
If you want to use a reverse proxy, this one does the job for me:
server {
    listen 80;
    server_name your_news_app.com;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_pass http://localhost:8000;
    }
}
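One caveat: proxying to `localhost:8000` suggests the app behind it is `php artisan serve`, which is a single-process development server and a plausible cause of CPU saturation under real load. The usual production setup has Nginx talk to PHP-FPM directly. A sketch, where the deploy path, PHP version, and socket path are assumptions you'd adjust to your server:

```nginx
server {
    listen 80;
    server_name your_news_app.com;
    root /var/www/your_news_app/public;   # assumed deploy path; must point at public/
    index index.php;

    location / {
        # Route anything that is not a real file through Laravel's front controller
        try_files $uri $uri/ /index.php?$query_string;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $realpath_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php8.2-fpm.sock;  # assumed PHP-FPM socket
    }
}
```

This removes one whole proxy hop and lets PHP-FPM's worker pool, rather than a single dev-server process, handle concurrency.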