How to prevent some important pages from being maliciously crawled?
The existing page a.php mainly displays the interface to the user. It fetches some data by sending an AJAX POST to b.php. But I don't want others to simulate that request directly and capture the data b.php returns. How should this be handled? Could an expert help analyze the approach?
As long as the HTTP protocol is used, it is impossible to completely prevent requests from being simulated.
The usual method is to check whether the Referer request header comes from your own domain name, but that header can also be forged; see the sketch below.
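A minimal sketch of such a Referer check in b.php, assuming the site is served from example.com (a placeholder domain, not anything the original poster specified):

```php
<?php
// b.php -- reject requests whose Referer host is not our own domain.
// "example.com" is a placeholder; substitute the real domain.
// The Referer header is client-supplied and easily forged, so this
// only deters casual scrapers.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$host    = parse_url($referer, PHP_URL_HOST);

if ($host !== 'example.com' && $host !== 'www.example.com') {
    http_response_code(403);
    exit('Forbidden');
}

// ... normal data handling continues here ...
```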
So the best you can do is raise the cost of simulation: encrypt or sign the interface's request parameters and validate the various request headers to reduce the probability of requests being simulated. The front-end script that does the signing must also be compressed and obfuscated, otherwise it gives the scheme away and the protection is useless. One common pattern is sketched below.
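A minimal sketch of a session-token plus HMAC-signature scheme (the names api_token, ts, and sig are illustrative assumptions): a.php issues a per-session token, the page's JavaScript signs each request with it, and b.php verifies the signature and freshness.

```php
<?php
// a.php -- issue a random per-session token and embed it in the page
// so the AJAX code can sign its requests with it (random_bytes needs PHP 7+).
session_start();
$_SESSION['api_token'] = bin2hex(random_bytes(16));
// Then print the token into the rendered page for the JavaScript to read,
// for example as: <script>var apiToken = '...token...';</script>
```

```php
<?php
// b.php -- verify the timestamp and HMAC signature that the page's
// JavaScript attached to the AJAX POST. sig must equal
// HMAC-SHA256(ts, api_token), computed client-side with the same token.
session_start();

$token = isset($_SESSION['api_token']) ? $_SESSION['api_token'] : '';
$ts    = isset($_POST['ts'])  ? $_POST['ts']  : '';
$sig   = isset($_POST['sig']) ? $_POST['sig'] : '';

$expected = hash_hmac('sha256', $ts, $token);

// Reject missing tokens, stale requests (older than 5 minutes),
// and signature mismatches.
if ($token === '' || abs(time() - (int)$ts) > 300 || !hash_equals($expected, $sig)) {
    http_response_code(403);
    exit('Forbidden');
}

// ... return the protected data here ...
```

Because the signing logic lives in the browser, a determined scraper can still replicate it, which is exactly why obfuscating the front-end script matters: the scheme only raises the cost of simulation, it does not make it impossible.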