
Some countermeasures against the impact of IP- and traffic-brushing software on statistics (jQuery)

WBOY (Original) · 2016-05-16 18:04:59

Running into IP- and traffic-brushing software

I used to think that counting each IP only once per day would effectively prevent users from cheating by refreshing the page over and over. I have to admit I was naive: I had been ignoring how powerful today's traffic-brushing software is. Our project ran into this software too, and it produced a pile of junk data that affected the accuracy of our statistics.

To get a handle on the problem, I downloaded two well-known traffic-brushing tools, "Flow Bao" and "Traffic Wizard", to study them. You don't appreciate how effective they are until you try them.

Both tools work on the same principle, and I suspect the others are similar. They exploit mutual visiting between peers, the geographic spread of network nodes, and the randomness of real users to make the fake visits look genuine: while your computer idles with the software running, your URL is visited by every other user who is also idling with it. Of course, while they brush data for you, you are brushing data for them. The software does all of this quietly in the background, and within a few minutes of idling you can watch your traffic climb. Enough background; the countermeasures follow.

Countermeasures

Since some readers like to pass judgment without reading the full article, let me emphasize up front: all data submitted to the backend is already deduplicated by IP. This article discusses what to do when those IPs themselves are being brushed.

Option 1: Submit data asynchronously via Ajax (ineffective)

Originally, when a promotional link was clicked, the visitor's IP, time, and other information were recorded on the backend while the page was being served. This obviously cannot tell brushing software apart from real visitors, so we considered submitting the data asynchronously via Ajax instead.

At first I still underestimated this rogue software, assuming it merely simulated HTTP requests and would never execute JavaScript. So the first fix was to fire an asynchronous Ajax request to record the visit once the page had loaded. It had no effect; the experiment showed this approach only filters out the most primitive bots.
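A minimal sketch of that first attempt; the /stat/record endpoint here is a hypothetical backend URL that logs the visitor's IP and time, not something from the original project:

// Option 1 (ineffective): record the visit via Ajax after page load,
// so a client that never runs JavaScript is never counted.
// '/stat/record' is a hypothetical endpoint for illustration only.
$(function () {
    $.post('/stat/record', { page: location.href });
});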

Option 2: Check the width and height of the client's browser window (ineffective)

From option 1 we can infer that this software does more than simulate HTTP requests; the requests come from real browsers. Yet while my machine was idling and "helping" brush traffic for others, I never saw a web page actually open; I could only see the constant stream of requests in a packet-capture tool. So I guessed the software either hides the browser window or shrinks it to a tiny size. The idea, then, was to use JavaScript to check whether the client's visible browser window exceeds some minimum size (say 300px wide by 200px high; few people view a website in an area that small) and submit the data via Ajax only when it does.
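A minimal sketch of that check, again using the hypothetical /stat/record endpoint:

// Option 2 (ineffective): only report the visit when the visible window
// exceeds a lower bound (300x200 here, as discussed above).
$(function () {
    if ($(window).width() > 300 && $(window).height() > 200) {
        $.post('/stat/record', { page: location.href });
    }
});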

The result was still no effect, so I also wrote a small script to log the browser type and window size of each request. The results left me speechless: the reported viewport sizes were perfectly normal, and some reported resolutions were even higher than my own monitor's. I can only despise myself...
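A sketch of the kind of diagnostic script described above, posting to a hypothetical /stat/debug logging endpoint:

// Log each visitor's browser type and window/screen size for inspection.
// '/stat/debug' is a hypothetical endpoint for illustration only.
$(function () {
    $.post('/stat/debug', {
        ua: navigator.userAgent,
        winWidth: $(window).width(),
        winHeight: $(window).height(),
        screenWidth: screen.width,
        screenHeight: screen.height
    });
});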

Option 3: Use mouse events as evidence of normal access (effective)

After several rounds of experiments we concluded that these bots are not simple, but they are still bots. So consider using mouse events such as mousemove, mousedown, and mouseover to decide whether the visitor is human; you could also treat an explicit user action, such as clicking a button, as the basis for the judgment (keeping the user experience in mind, of course). Here is a simple script.

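A minimal sketch of the idea, using the same hypothetical /stat/record endpoint: the visit is reported only after the first real mouse event, and only once per page view.

// Option 3 (effective): treat a mouse event as evidence of a human visitor.
$(function () {
    var reported = false;
    function report() {
        if (reported) return;
        reported = true;
        // stop listening once the visit has been recorded
        $(document).off('mousemove mousedown mouseover', report);
        $.post('/stat/record', { page: location.href });
    }
    $(document).on('mousemove mousedown mouseover', report);
});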