A new report warns that the proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos.
If we had access to the original model, we could give it the same seed and prompt and get the exact image back. Or we could mandate techniques like statistical fingerprinting. Without the model, though, reliable detection looks effectively impossible, and it will only get harder as models improve over the coming years. And what do you do if someone takes a real image, compresses it into an embedding, and then reassembles it?
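To make the replay idea concrete, here's a minimal sketch assuming the Hugging Face `diffusers` library and a public Stable Diffusion checkpoint. The checkpoint name, prompt, seed, and `render` helper are all hypothetical, but the mechanism is the point: with the weights in hand, fixing the seed and prompt makes generation reproducible, so a claimed output can be verified pixel-for-pixel.

```python
# Minimal sketch of replay-based verification, assuming `diffusers` is
# installed and a CUDA GPU is available. All names here are illustrative.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

def render(prompt: str, seed: int):
    # Seeding the generator fixes the initial latent noise; with identical
    # weights, scheduler, and settings, the diffusion process is then
    # deterministic on the same hardware.
    g = torch.Generator(device="cuda").manual_seed(seed)
    return pipe(prompt, generator=g, num_inference_steps=30).images[0]

a = render("a lighthouse at dusk", seed=42)
b = render("a lighthouse at dusk", seed=42)
assert list(a.getdata()) == list(b.getdata())  # bit-identical renders
```

Without the weights there is nothing to replay against. And the roundtrip mentioned above (real image, encoded to a latent, then decoded) yields pixels that never exactly match the original, which is precisely what defeats exact-hash matching.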
Another worry: how do you know whether an image shows a real victim who needs help or is entirely AI-generated?