
How to prevent some important pages from being maliciously crawled?

WBOY (Original)
2016-09-27 14:18:12

I have a page, a.php, whose main job is to display the interface to the user.
It fetches some data by sending an AJAX POST request to b.php.
But I don't want others to simulate that request directly and scrape the data b.php returns.
How should I handle this? Could an expert help analyze the approach?

Reply content:


As long as the interface is exposed over HTTP, it is impossible to completely prevent simulated requests.
The usual first step is to check whether the Referer request header comes from your own domain, but that header can be forged as well.
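
As a minimal sketch, b.php could reject requests whose Referer does not point at your own site. The domain example.com below is a placeholder, and the check only deters the laziest scrapers because the header is entirely client-controlled.

<?php
// b.php -- crude Referer check (the header is client-supplied and forgeable,
// so treat this as a speed bump, not real protection).
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$host    = parse_url($referer, PHP_URL_HOST);

if ($host !== 'example.com' && $host !== 'www.example.com') {
    http_response_code(403);   // refuse requests that did not come from a.php
    exit('Forbidden');
}

// ...otherwise continue and return the real data...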
So the best you can do is lower the probability of being simulated: sign or encrypt the interface's request parameters and validate various request headers. The front-end script that builds those requests must also be minified and obfuscated, otherwise anyone can read how the parameters are generated and the whole scheme is useless.
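
As a rough sketch of that idea, a.php could issue a per-session token, the (obfuscated) front-end script could sign its POST parameters with a shared key, and b.php could verify everything before returning data. The field names (token, ts, sign), the HMAC scheme, and the hard-coded key below are illustrative assumptions, not a standard recipe.

<?php
// a.php -- issue a per-session token when rendering the page.
session_start();
if (empty($_SESSION['api_token'])) {
    $_SESSION['api_token'] = bin2hex(random_bytes(16));
}
// Embed $_SESSION['api_token'] in the page so the obfuscated front-end
// script can send it, a timestamp, and a signature with its AJAX POST.

<?php
// b.php -- verify token, freshness, and signature before returning data.
session_start();

$secret = 'server-side-signing-key';   // assumption: same key the obfuscated JS uses
$token  = isset($_POST['token']) ? $_POST['token'] : '';
$ts     = isset($_POST['ts'])    ? (int) $_POST['ts'] : 0;
$sign   = isset($_POST['sign'])  ? $_POST['sign'] : '';

// 1. The token must be the one a.php stored in this session.
if (empty($_SESSION['api_token']) || !hash_equals($_SESSION['api_token'], $token)) {
    http_response_code(403);
    exit('Invalid token');
}

// 2. Reject stale requests to limit replay of captured traffic (5-minute window).
if (abs(time() - $ts) > 300) {
    http_response_code(403);
    exit('Request expired');
}

// 3. The signature over the parameters must match what the front-end script computed.
$expected = hash_hmac('sha256', $token . '|' . $ts, $secret);
if (!hash_equals($expected, $sign)) {
    http_response_code(403);
    exit('Bad signature');
}

// Checks passed: return the real data.
echo json_encode(array('data' => 'your payload here'));

None of this stops a determined scraper who reverse-engineers the obfuscated script; it only raises the cost, which is the most you can expect at the HTTP layer.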
