by netsectoday on 10/5/22, 1:37 PM
If you expose a web server to the internet today you'll get 10 malicious requests for every 1 legitimate request.
This constant and unrelenting beating at your doors doesn't go away unless you add perimeter protection.
The options here are:
1) Block the IPs and CIDR ranges that are giving you trouble
2) Silently scan the connection request and block it when things look fishy
3) Provide a challenge in the return response that is difficult for bots to complete
Most of the bot protection on the internet is #2, where you don't notice you've been verified as a human and the site just loads. People hate #3, completing a challenge, but the only other option is #1, where the site doesn't load at all.
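A minimal sketch of what #1 looks like in application code (the block list here is made up, and real setups do this at the firewall or load balancer, not per-request in Python):

    import ipaddress

    # Hypothetical block list -- real deployments pull ranges from logs or threat feeds.
    BLOCKED_NETWORKS = [
        ipaddress.ip_network("198.51.100.0/24"),
        ipaddress.ip_network("203.0.113.0/24"),
    ]

    def is_blocked(client_ip: str) -> bool:
        """Return True if the client falls inside any blocked CIDR range."""
        addr = ipaddress.ip_address(client_ip)
        return any(addr in net for net in BLOCKED_NETWORKS)

    print(is_blocked("198.51.100.7"))  # True: request dropped, site never loads
    print(is_blocked("192.0.2.10"))    # False: request passes through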
I'd argue that bots are breaking the internet.
by IceWreck on 10/5/22, 12:33 PM
Another thing that annoys me so much is that a lot of websites offer RSS feeds, and then those feeds are broken because of Cloudflare.
If your feed reader periodically requests a feed, Cloudflare starts showing its JavaScript-based "checking your browser" thingy.
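You can see the failure mode in a few lines of Python (the feed URL is a placeholder; any Cloudflare-fronted feed shows the same symptom): the reader asks for XML and gets an HTML challenge page back.

    import requests

    FEED_URL = "https://example.com/feed.xml"  # placeholder feed

    resp = requests.get(FEED_URL, timeout=10)

    # A feed reader expects XML. When the JS challenge kicks in, the response
    # is an HTML interstitial (often 403/503) that no feed parser can use.
    content_type = resp.headers.get("Content-Type", "")
    if resp.status_code in (403, 503) or "text/html" in content_type:
        print("Got a challenge page instead of the feed")
    else:
        print("Feed fetched OK:", content_type)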
by phillipseamore on 10/5/22, 11:57 AM
The site owner has complete control over this in the CF dashboard, and can easily disable it or lower the threshold. Myself, I'm quite happy with stopping bad traffic (about 20% of the requests to my sites) at the edge with CF and keeping my hosting costs down.
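It's scriptable too. If I remember Cloudflare's v4 API correctly, the zone's security level is one PATCH away (zone ID and token below are placeholders):

    import requests

    ZONE_ID = "your_zone_id"      # placeholder
    API_TOKEN = "your_api_token"  # placeholder

    resp = requests.patch(
        f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/settings/security_level",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        # Accepted values include "essentially_off", "low", "medium", "high", "under_attack".
        json={"value": "low"},
        timeout=10,
    )
    print(resp.json())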
by LinuxBender on 10/5/22, 1:39 PM
I've never used Cloudflare so apologies for what is probably documented somewhere. Can the site owners not set JS requirements per-URL? I ask because the same hidden JS browser tests can be added to Nginx and HAProxy using Lua scripting, and it can be done by ACL for specific URLs, e.g. no JS for static content and URLs that use GET, but require passing the hidden JS browser test before any page that would take a POST. That is just one example of the myriad of possibilities. Can that not be set up in CF? Or is it all-or-none?
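To make the GET-vs-POST split I mean concrete, here's a middleware sketch with a made-up cookie name (this mimics the Nginx/HAProxy Lua ACL approach, not anything Cloudflare actually ships):

    from flask import Flask, request, abort

    app = Flask(__name__)

    # Hypothetical cookie set once a client passes the hidden JS browser test.
    CHALLENGE_COOKIE = "js_check_passed"

    @app.before_request
    def gate_mutating_requests():
        # Static content and plain GETs go through with no JS required.
        if request.method == "GET":
            return None
        # Anything that mutates state must have passed the JS test first.
        if request.cookies.get(CHALLENGE_COOKIE) != "1":
            abort(403)  # a real setup would redirect to the challenge page

    @app.route("/article")
    def article():
        return "readable without JS"

    @app.route("/comment", methods=["POST"])
    def comment():
        return "accepted"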
For people not using a CDN and wanting to keep bots off the static content, this can for now be partially accomplished by doing two things: forcing HTTP/2 and adding one raw-table iptables rule to drop TCP SYN packets that do not have an MSS in the desired range. Most poorly written bots do not even bother to set MSS. I'd wager this is something CF looks at in their eBPF logic. Blocking non-HTTP/2 requests will drop all search engine crawlers except for Bing.
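To illustrate the MSS heuristic (the 536-65535 range below is my assumption; tune it, and do the actual dropping in iptables or eBPF, not Python), a scapy sketch of what the rule keys on:

    from scapy.all import sniff, IP, TCP  # requires root

    # The equivalent raw-table drop rule would be something like:
    # iptables -t raw -A PREROUTING -p tcp --syn -m tcpmss ! --mss 536:65535 -j DROP

    def check_syn(pkt):
        if IP not in pkt or TCP not in pkt:
            return
        tcp = pkt[TCP]
        if tcp.flags != "S":  # bare SYNs only, ignore SYN-ACKs
            return
        # TCP options are (name, value) tuples; well-behaved stacks set MSS.
        opts = {k: v for k, v in tcp.options if isinstance(k, str)}
        mss = opts.get("MSS")
        if mss is None or not (536 <= mss <= 65535):
            print(f"suspect SYN from {pkt[IP].src}: MSS={mss}")

    sniff(filter="tcp[tcpflags] & tcp-syn != 0", prn=check_syn, store=False)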
by flyingfences on 10/5/22, 1:29 PM
Imgur is breaking the internet by requiring JavaScript, too. Oh the irony.
by ynbl_ on 10/5/22, 12:27 PM
Cloudflare, as of this month, shows propaganda on the captcha page, like "40% of the internet was historically bots" (as if that matters). It fits right in with the common sentiment that the old internet was bad: welcome to the new internet, where nothing is allowed unless it's a legitimate commercial use. This is getting out of hand.
by endigma on 10/5/22, 11:43 AM
How exactly do you imagine bot/attack protection (Cloudflare's main product) working without JS? Even bypassing a captcha by using your browser to assert trust requires JS.
by plgonzalezrx8 on 10/5/22, 4:39 PM
Cloudflare is not the one breaking the internet; bots are. Cloudflare is just providing a solution to deal with the bot problem.
This is also controlled by the Cloudflare customer. If I'm having issues with my server due to fake/hostile traffic coming to my website, you're dang right I will do what it takes to stop it.
by rpigab on 10/5/22, 3:21 PM
It's annoying to some of us, and it will only result in escalation: browser plugins will probably be made to run JS only in this challenge context and not in the final render.
That's some more wasted processing power. It might block unwanted requests, but apart from DDoSes, those requests shouldn't be a threat anyway. Maybe DDoS zombie agents will be updated to run a bit of JS, if it's worth the hassle.
Every day we stray further from what the web could have been if we could have nice things.
by tosspottery on 10/5/22, 10:24 AM
Technically this is just breaking the web, not the internet - none of the other protocols are being interfered with.
Even Cloudflare's DNS product is just standard DNS protocol, sitting behind network-level DDoS protections. It's only HTTP where they tamper with the application layer.
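You can check this yourself with dnspython: a plain UDP query against 1.1.1.1 needs no JS, no cookies, no challenge.

    import dns.resolver  # pip install dnspython

    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = ["1.1.1.1"]  # Cloudflare's public resolver

    # A standard DNS query: no JavaScript, no interstitial, just the protocol.
    answer = resolver.resolve("example.com", "A")
    for record in answer:
        print(record.address)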
by giancarlostoro on 10/5/22, 1:05 PM
This is JS served from Cloudflare's own domain, is it not? I don't think you need JS enabled afterwards. So a JS/cookie-blocking setup should be versatile enough to let you allow only Cloudflare's JS and cookies, which are then self-destructed.
by nalaz on 10/5/22, 1:42 PM
No, you broke the Internet for yourself by disabling two core technologies of the web.
by vnkr on 10/5/22, 9:35 AM
Cloudflare is a tool. Website owners use this tool to solve the challenges they face. You are shouting at a wrench.
by hulitu on 10/5/22, 10:23 AM
TBH, I thought about an experiment where 1.1.1.1 is redirected to 127.0.0.1 to check how much of the internet still works.
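A related, easier version of the experiment: count how many sites answer with Cloudflare's Server header, which approximates how much would break (the site list here is a stand-in; a real run would use a top-sites list):

    import requests

    SITES = ["https://example.com", "https://example.org"]  # stand-in sample

    behind_cf = 0
    for site in SITES:
        try:
            headers = requests.head(site, timeout=5, allow_redirects=True).headers
            server = headers.get("Server", "")
        except requests.RequestException:
            server = ""
        if server.lower() == "cloudflare":
            behind_cf += 1
            print(f"{site}: fronted by Cloudflare")

    print(f"{behind_cf}/{len(SITES)} sampled sites would likely break")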