# Scaling Node.js Applications with NGINX and Load Balancing
As your Node.js applications grow and receive more traffic, scaling becomes crucial to maintaining performance and stability. In this article, we'll dive into key techniques and tools that help scale your Node.js applications, focusing on NGINX as a reverse proxy and load balancer, as well as other methods for handling high traffic and demand.
NGINX is a high-performance web server known for its capabilities as a reverse proxy, load balancer, and HTTP cache. It efficiently handles large volumes of simultaneous connections, making it an excellent tool for scaling Node.js applications.
A reverse proxy routes client requests to one or more backend servers. Using NGINX as a reverse proxy for your Node.js application can offload tasks such as SSL termination, caching, and load balancing from your application.
Install NGINX:
First, install NGINX on your server. On Ubuntu, this can be done using:
```shell
sudo apt update
sudo apt install nginx
```
Configure NGINX:
Create a new configuration file for your Node.js app in the /etc/nginx/sites-available/ directory.
Here's an example configuration file:
```nginx
server {
    listen 80;
    server_name yourdomain.com;

    location / {
        proxy_pass http://localhost:3000;  # Your Node.js app
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
```
Enable the Configuration:
Create a symbolic link from sites-available to sites-enabled to enable the configuration:
```shell
sudo ln -s /etc/nginx/sites-available/yourdomain.com /etc/nginx/sites-enabled/
```
Restart NGINX:
Restart NGINX to apply the changes:
```shell
sudo systemctl restart nginx
```
Now, NGINX will forward any incoming requests to your Node.js application running on port 3000. The reverse proxy setup ensures that your Node.js app is isolated from direct client access, adding a layer of security.
## Load Balancing with NGINX

When traffic increases, a single Node.js instance might struggle to handle all incoming requests. Load balancing allows traffic to be distributed evenly across multiple instances of your application, improving reliability and performance.
1. **Define an Upstream Block**: In your NGINX configuration, define a pool of Node.js instances and proxy requests to it:

```nginx
upstream node_app {
    server localhost:3000;
    server localhost:3001;
    server localhost:3002;
}

server {
    listen 80;
    server_name yourdomain.com;

    location / {
        proxy_pass http://node_app;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
```
2. **Explanation**:
   - The `upstream` block defines a pool of Node.js servers running on different ports.
   - NGINX will distribute incoming requests evenly among these servers.
3. **Load Balancing Algorithms**: By default, NGINX uses a round-robin algorithm to balance traffic. You can specify other load balancing methods such as:
   - **Least Connections**: Sends requests to the server with the fewest active connections.
     ```nginx
     upstream node_app {
         least_conn;
         server localhost:3000;
         server localhost:3001;
     }
     ```
4. **Test and Scale**: You can now test the setup by running multiple instances of your Node.js app on different ports (3000, 3001, 3002, etc.) and monitoring how NGINX balances the traffic.

## Caching Static Content with NGINX

Caching static content such as images, CSS, and JavaScript files can significantly reduce the load on your Node.js application by serving cached versions of these assets directly from NGINX.

### Caching Setup in NGINX

1. **Modify Configuration for Caching**: Add caching rules to your server block:

```nginx
server {
    listen 80;
    server_name yourdomain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }

    # Caching static content
    location ~* \.(jpg|jpeg|png|gif|ico|css|js)$ {
        expires 30d;
        add_header Cache-Control "public, no-transform";
    }
}
```
## Additional Scaling Techniques

Scaling a Node.js application isn't just about using NGINX. Below are a few more techniques to ensure that your application scales effectively:
Vertical scaling means upgrading the server's hardware resources, such as increasing the number of CPUs or adding more memory. While this can improve performance in the short term, it's limited by the physical capabilities of the machine.
Horizontal scaling involves running multiple instances of your application across different servers and balancing the traffic among them using tools like NGINX or a cloud load balancer. This method allows virtually unlimited scaling by adding more instances.
Node.js can run on multiple cores by using the built-in cluster module. This allows you to utilize all the available CPU cores on a server, increasing throughput.
Example:
```javascript
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  const numCPUs = os.cpus().length;

  // Fork one worker per CPU core.
  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }

  cluster.on('exit', (worker) => {
    console.log(`Worker ${worker.process.pid} died`);
  });
} else {
  // Workers can share any TCP connection
  http.createServer((req, res) => {
    res.writeHead(200);
    res.end('Hello, world!\n');
  }).listen(8000);
}
```
This example shows how to use all CPU cores available on a machine by forking worker processes.
## Case Study: Handling High Traffic

Problem: An e-commerce website is experiencing high traffic during sales events, leading to slow response times and occasional server crashes.
Solution:

- Run multiple Node.js instances and distribute traffic among them with NGINX load balancing.
- Use the cluster module so each instance utilizes all available CPU cores.
- Serve static assets (images, CSS, JavaScript) directly from the NGINX cache.
Outcome: The e-commerce website can now handle thousands of concurrent users without slowdowns, ensuring a smooth user experience during peak traffic times.
## Securing Traffic with SSL/TLS

When scaling applications, security should not be overlooked. Implementing SSL/TLS ensures that data transmitted between the client and server is encrypted and protected from attacks.
Configure NGINX to use SSL:
```nginx
server {
    listen 443 ssl;
    server_name yourdomain.com;

    ssl_certificate /etc/ssl/certs/yourdomain.crt;
    ssl_certificate_key /etc/ssl/private/yourdomain.key;

    location / {
        proxy_pass http://localhost:3000;
    }
}
```
Redirect HTTP to HTTPS to ensure all traffic is secure:
```nginx
server {
    listen 80;
    server_name yourdomain.com;
    return 301 https://$host$request_uri;
}
```
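One side effect of terminating SSL at NGINX is that the Node.js app only ever sees plain HTTP. If the proxied location also sets `proxy_set_header X-Forwarded-Proto $scheme;` (an addition not shown in the config above), the app can recover the original protocol. A sketch under that assumption:

```javascript
// Sketch: detect the original client protocol behind an SSL-terminating
// proxy. Assumes NGINX forwards it via the X-Forwarded-Proto header,
// which requires the proxy_set_header directive mentioned above.
function isSecure(req) {
  return req.headers['x-forwarded-proto'] === 'https';
}

module.exports = isSecure;
```

This is useful, for example, for generating absolute URLs or setting `Secure` cookies correctly from behind the proxy.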
## Conclusion

Scaling a Node.js application is essential as traffic and demand grow. By utilizing NGINX as a reverse proxy and load balancer, you can distribute traffic effectively, cache static assets, and ensure high availability. Combining these techniques with horizontal scaling and Node.js clustering enables your applications to handle massive traffic loads while maintaining performance and stability.
Implement these strategies in your projects to achieve better scalability, improved user experience, and increased uptime.