- cross-posted to:
- [email protected]
- [email protected]
- [email protected]
- [email protected]
A.I. company Worldcoin has rolled out 1,500 Orbs to more than 35 cities in a bid to create digital identities for the world’s citizens.
deleted by creator
The article is bullshit click bait.
Yeah, this is nefarious data collection masking as slightly less nefarious data collection
Copy that.
Just hell no. Sounds like a spray paint campaign is in order. I’m gonna go post this on the anarchy subs and see how they feel about it (unless you already got there first).
Baseball bats sound more effective. Make ‘em eat the costs.
Just hide your eyes or they’ll scan you before you can beat the ball
Sunglasses and a mask are still totally fine to wear nowadays. Just walk up and pretend to trip.
“I’ve been very interested in things like universal basic income and what’s going to happen to global wealth redistribution,” said Sam Altman, Worldcoin’s cofounder.
Holy crap it’s Sam Altman, the CEO of OpenAI. After that recent article about his $2 Kenyan workers it’s much harder to believe in benevolent intentions.
Any time someone creates a new coin instead of using the thousands available, it’s 99.9% a scam. We don’t need a new money supply for a UBI, this has been discussed to death in crypto circles for over a decade.
That’s twice the minimum wage where I live.
Well I look forward to hearing the endless tales of people smashing the fuck out of these, and taking the hardware to figure out how to do greater damage to the entire project.
I think these would look pretty cool in my art deco living room, and they’re free too! Such a great deal ;)
People trying to force social credit onto the free world.
Shiit, if you think about it, we kinda already have a social credit system in the US. It’s less social I suppose, but it does affect things tied to our social status, like being able to finance a car or house affordably.
Hmm, based on the pictures in the article this thing is basically a camera in a shiny ball about 1ft in diameter (it appears to be about the width of 3 bricks laid side by side). It’s not like a Cloud Gate-sized object. To get a scan of your irises you would have to be pretty close to it for at least a few seconds - it’s not like it could get a scan if you’re just walking by a few feet away. You’d have to walk up and point your face at it on purpose. The camera in it also looks fixed - I doubt it can rotate to follow you, that would be mechanically complex, expensive and prone to failure.
Based on the description, their software takes an image of your irises and reduces it to a hash value. The original image is deleted (they claim) and the hash value is stored as an ID code. It seems likely that the hash value will be unique to their software - e.g. if you wrote your own code to produce hash values from images, you would get a different number even if you had the same picture of the same eyes. So the hash value doesn’t necessarily represent anything about your eyes that would be much of a privacy invasion… It’s just a mathematically derived number string which is unique to their software.
It’s not clear what part of this system is “AI”, though my guess would be it has something to do with re-identifying your eyes next time you want to access whatever is secured with your hash code. It’s really not clear how that would work… a new image of your eyes collected a year later under different lighting conditions would probably produce a different hash value, so how does this system match them, if it only records the hashes?
FWIW, I think smashing or spray painting these things, while fun ideas in the rebellious teenager sense, is probably overkill and likely to get you more attention from law enforcement than you want. But, you could probably just walk up behind it and slap a sticker or tape over the camera… they’d still have to pay someone to go out and peel it off.
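The “unique to their software” point above can be illustrated with a keyed hash: the same image bytes hashed under different keys produce unrelated digests. This is just a sketch of the general idea (the vendor key names are made up; Worldcoin’s actual iris-code scheme is not public):

```python
import hashlib
import hmac

# Stand-in for a real iris capture
iris_image = b"...raw iris image bytes..."

# Two hypothetical vendors keying the same hash function differently:
digest_a = hmac.new(b"vendor-A-secret", iris_image, hashlib.sha256).hexdigest()
digest_b = hmac.new(b"vendor-B-secret", iris_image, hashlib.sha256).hexdigest()

print(digest_a == digest_b)  # False: same eyes, different ID codes
```

So even if another party got the identical image, they couldn’t reproduce Worldcoin’s ID code without the key (or whatever proprietary transform they actually use).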
Taking a picture instantly after would probably create a different hash value. The thing about hashing is that even if one bit is different between source images, the resulting hashes would look entirely different.
I suppose I could conceive of a proprietary hash algorithm that would allow for fuzzy matching of iris photos, but as you said, eyes taken years apart in different conditions wouldn’t match the original hash. Or falsely match similar looking eyes. It’s not like this system allows them to get high resolution perfectly lit iris photos, after all.
The whole thing sounds dubious, and I suspect AI is mentioned solely to secure investor funding, much like how several years back everything mentioned Blockchain.
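The one-bit avalanche behavior mentioned above is easy to demonstrate with an ordinary cryptographic hash (toy byte strings standing in for images):

```python
import hashlib

image_a = bytes([0x00] * 1024)           # toy "image"
image_b = bytes([0x01]) + image_a[1:]    # identical except a single flipped bit

h_a = hashlib.sha256(image_a).hexdigest()
h_b = hashlib.sha256(image_b).hexdigest()

# Count how many hex digits of the two digests differ
diff = sum(c1 != c2 for c1, c2 in zip(h_a, h_b))
print(f"{diff}/64 hex digits differ")  # most of the digest changes
```

Which is exactly why a plain cryptographic hash can’t re-identify an eye photographed twice; they’d need something fuzzier.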
They are likely using a form of https://en.wikipedia.org/wiki/Perceptual_hashing
The noise level a perceptual hash is sensitive to can be tuned.
The “falsely match similar looking” is harder than one would expect. I used to work on an audio fingerprinting system which was extremely robust to “similar” audio matching. What sounded similar to us was always identified uniquely by the hash with high confidence.
For example. Take the same piano piece done by the same artists on the same piano performed as close as they could to the same: never confused the perceptual hash with ~10 sec of audio. Not once. We could even identify how much of a pre-recorded song was used in a “live” performance.
There are adversarial attacks for perceptual hashes. However, “similar eyes” would not be one to a standard perceptual hash. More like: a picture of an abstract puppy happens to have the same hash as an eye.
I’d be curious about the details of the hash; that’s necessary to know what the adversarial attacks are. But I see no mention of the details, which is suspicious on its own.
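A toy sketch of the perceptual-hash idea discussed above, including the tunable noise threshold: an “average hash” sets each bit by whether a pixel is brighter than the image mean, and matching is a Hamming-distance comparison. (Grayscale images are assumed to be 2-D lists here; whatever the Orb actually runs is undisclosed and surely far more elaborate.)

```python
def average_hash(pixels):
    """Perceptual 'average hash': 1 if a pixel is brighter than the mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

img       = [[10, 200], [220, 30]]
noisy     = [[12, 198], [221, 28]]   # small sensor noise
different = [[200, 10], [30, 220]]   # structurally different image

THRESHOLD = 1  # the tunable noise tolerance mentioned above

h = average_hash(img)
print(hamming(h, average_hash(noisy)) <= THRESHOLD)      # True: noise absorbed
print(hamming(h, average_hash(different)) <= THRESHOLD)  # False: no match
```

Raising `THRESHOLD` tolerates more noise but increases the chance of false matches, which is presumably the trade-off any iris-matching scheme has to tune.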
The absolute absurdity of a news article on nefarious data collection requiring that I enable JS to read it, just so that it can load a ridiculous number of trackers.
What the hell is this? How can they even do this without getting deleted.
I’m not sure I understood it correctly. Do people just need to look at the mirrored surface to get scanned, and then they get a coin?
You didn’t know and don’t have an account? Your bad! We have your eyes now!
Or do people need to read a privacy policy and accept everything before they get scanned?
Well it’s not magic at least https://worldcoin.org/blog/engineering/opening-orb-look-inside-worldcoin-biometric-imaging-device
It’s not a 360° camera. People have to look at the dark spot where the glass for the cameras is.
Cool now we know how to approach and destroy these abominations.
Don’t destroy them. Ship them to me so I can convert them into stationary cameras for my front door and yard with my self hosted security system.
Eyes for the Eye God!
This article described some concerning methods they used to develop and deploy the system…
https://www.technologyreview.com/2022/04/06/1048981/worldcoin-cryptocurrency-biometrics-web3/

So this company has ties with OpenAI? That is kinda concerning…
deleted by creator
One step closer to minority report
Yeah, all those futuristic dystopias, and Gantz.
Another shitcoin lol !!!
So, no way in hell this could comply with GDPR then.
Passersby need only gaze into its mirrored surface, whereupon the device will scan their irises and generate a unique hash or numeric code attached to their particular set of eyes. In exchange, each participant will receive a World ID and a WLD token.
What th’ . . . are you kids on _dope?!_
The article is misrepresenting the whole thing. It’s voluntary, the devices are not just randomly picking up iris scans.
Click bait at its finest.
BUT WE WANT TO FREAK OUT AND BE AFRAID FOREVER : The Internet