With a site that active, they really need something that can identify strange traffic patterns. Hell, maybe they do but no one cared to do anything. Maybe no one listens to IT… that never happens /s
Get something like Splunk to do it. I'm wondering what the rules for this might look like, especially if this were e.g. distributed scraping.
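Not SPL, but a minimal Python sketch of what such rules could check, assuming access logs arrive as (timestamp, ip, path) tuples. All names and thresholds here are illustrative, not from any real deployment. The per-IP rate check catches a single noisy client; the distinct-IPs-per-path check is a crude signal for distributed scraping, which deliberately stays under per-IP limits:

```python
from collections import Counter, defaultdict

def flag_suspicious(requests, window=60, per_ip_limit=100, path_ip_limit=50):
    """Flag per-IP rate spikes and paths crawled by unusually many
    distinct IPs within one time window (hypothetical thresholds).

    requests: iterable of (timestamp_sec, client_ip, path) tuples.
    Returns (noisy_ips, scraped_paths).
    """
    per_ip = Counter()            # requests per client IP
    path_ips = defaultdict(set)   # distinct client IPs per path
    if not requests:
        return set(), set()
    start = min(t for t, _, _ in requests)
    for t, ip, path in requests:
        if t - start <= window:   # only consider the first window
            per_ip[ip] += 1
            path_ips[path].add(ip)
    noisy_ips = {ip for ip, n in per_ip.items() if n > per_ip_limit}
    scraped_paths = {p for p, ips in path_ips.items() if len(ips) > path_ip_limit}
    return noisy_ips, scraped_paths
```

A real SIEM rule would run this over a sliding window and probably weight in user-agent entropy and URL enumeration patterns, but the two aggregations above are the core idea.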