- cross-posted to:
- [email protected]
I think I understand why this is bad, but I am not confident in my technical understanding of the mechanics here. Would appreciate an explainer :)
cross-posted from: https://lemmy.dbzer0.com/post/978408
looks like rendering adblocker extensions obsolete with Manifest V3 was not enough, so now they're trying to implement DRM in the browser, giving any website the ability to refuse you traffic if you don't run a compliant browser (cough… Firefox)
here is an article on Hacker News, since i'm sure they can explain this better than i can.
and also some GitHub docs
So the person you cross-posted this from does not seem to have actually read this.
This does not impact extensions or different browsers. The main point of this actually seems to be replacing CAPTCHAs.
The dumbed-down version: the software stack attests to its own integrity, so that it is reasonable to assume a human is actually using it and not an automated process.
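To make that concrete, here is a rough, purely hypothetical sketch of what checking such an attestation could look like on the server side. None of these names (`AttestationToken`, `isAttested`, the key handling) come from the actual proposal; it only illustrates the general "trust a signed token instead of a CAPTCHA" idea:

```ts
// Illustrative sketch only, NOT the real Web Environment Integrity API.
// The idea: a trusted attester signs a statement about the client's
// environment, and the site verifies that signature instead of showing
// a CAPTCHA.
import { createVerify } from "node:crypto";

interface AttestationToken {
  payload: string;   // hypothetical claims about the browser/OS environment
  signature: string; // attester's signature over the payload, base64-encoded
}

function isAttested(token: AttestationToken, attesterPublicKeyPem: string): boolean {
  const verifier = createVerify("SHA256");
  verifier.update(token.payload);
  return verifier.verify(attesterPublicKeyPem, token.signature, "base64");
}

// If the signature checks out, the server assumes a legitimate, unmodified
// browser (and very likely a human) and can skip the CAPTCHA challenge.
```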
Quite frankly, as a web dev I can already prevent certain browsers from accessing my webpage by requiring access to browser-specific functions as a condition of loading the rest of the content.
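For example (my own illustrative sketch, not code from any real site), a page can refuse to load its actual content unless a Chromium-only global like `window.chrome` is present:

```ts
// Illustrative only: gate the page on a Chromium-specific global.
// window.chrome exists in Chromium-based browsers but not in Firefox,
// so Firefox users never get the real content.
function loadRealContent(): void {
  const script = document.createElement("script");
  script.src = "/app.js"; // hypothetical bundle path
  document.body.appendChild(script);
}

if (typeof (window as any).chrome !== "undefined") {
  loadRealContent();
} else {
  document.body.textContent = "Please use a supported browser.";
}
```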
So what the other user is concerned about already exists. In fact, Google Meet already does this to prevent Firefox users from accessing certain features, and changing the user agent doesn't change whether the features are available. (In this case it's because Firefox would crash, but most of the time this is done for bad reasons.)
Edit: this is the most reasonable criticism https://github.com/RupertBenWiser/Web-Environment-Integrity/issues/44
I do agree with it completely (that the proposal can't actually work).