This is our Independence Moment.
From defence to clean tech, we choose independence over dependence.
We invest in security, democracy, and dignity – from housing and jobs to AI and energy.
We defend democracy. We protect media freedom. Because truth and trust are Europe's strongest shields.
Europe must fight for its freedom, unity, and future.
Leading, protecting, and delivering.
These are just some of the key battles Europe is fighting
https://europa.eu/!NNWdhF
@EUCommission "Europe must fight for its freedom" as if Europe were a person. Hellooo? WE, THE PEOPLE are Europe, I AM Europe. And if you make #ChatControl mandatory you take a piece of my #freedom away
Hello @ranx!
Let us be clear: under this proposal, there is no general monitoring of online communications. There will be no such thing as “chat control”. Only material that is clearly child sexual abuse will be searched for and can be detected.
Detection orders can only be issued by judicial or independent administrative authorities at the end of a thorough process to assess necessity and proportionality, balancing all the fundamental rights at stake.
I have a question about: "Only material that is clearly child sexual abuse will be searched for and can be detected."
How? How will one determine when it's "sexual abuse" on all media? One has to scan each and every single media file to find these; I do not see any other way.
Don't get me wrong, I would love a better way to deal with CSAM content, but scanning everything to find the few is not the way.
There's a function that creates a hash of an image (like SHA-256), except it's designed so that re-scaling, cropping, or making subtle changes to the image will usually not cause the hash to change.
I suspect they will send a database of hashes to devices, or else have the device hash whatever picture is sent, and send that to their central server. Now of course if they only have a database and various police forces are submitting hashes to it, there's no way to know if that's CSAM or political memes.
Incidentally, these databases already exist and as I recall, the admin of Poast was trying to get access so he could block CSAM on his site, but they wouldn't give him access without him paying some absurd fee. You could ask him if you're interested in the back story on that.
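To make the idea above concrete, here is a minimal sketch of a "difference hash" (dHash), one simple member of the perceptual-hash family being described. This is only an illustration of why small edits usually leave the hash unchanged; real systems such as PhotoDNA use different, often proprietary algorithms, and nothing below reflects any actual deployment.

```python
# Toy perceptual hash (dHash-style): shrink the image to a tiny grayscale
# thumbnail, then record one bit per adjacent-pixel brightness comparison.
# Uniform edits (brightening, mild re-scaling) tend to preserve these
# comparisons, so the hash stays the same -- unlike SHA-256, where any
# changed byte flips the whole digest.

def downscale(pixels, w, h):
    """Nearest-neighbour downscale of a 2D grayscale image (list of rows)."""
    src_h, src_w = len(pixels), len(pixels[0])
    return [[pixels[y * src_h // h][x * src_w // w] for x in range(w)]
            for y in range(h)]

def dhash(pixels, hash_size=8):
    """64-bit hash: one bit per left>right comparison on a 9x8 thumbnail."""
    small = downscale(pixels, hash_size + 1, hash_size)
    bits = 0
    for row in small:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; a small distance means 'same image'."""
    return bin(a ^ b).count("1")

# A synthetic horizontal-gradient "image" and a uniformly brightened copy:
img = [[x for x in range(64)] for _ in range(64)]
brighter = [[min(255, p + 40) for p in row] for row in img]

# Brightening preserves every left/right comparison, so the hashes match.
print(hamming(dhash(img), dhash(brighter)))  # 0
```

A matching system would then compare an image's hash against a database of known-bad hashes by Hamming distance, which is exactly why (as noted above) whoever controls the database controls what gets flagged.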
@cjd @EUCommission @ranx Oh yeah, it has to go the hash route, but the hashes still need to be generated, and since those algorithms are often kept secret (they would be easier to bypass otherwise), we cannot run them ourselves, I believe.
So the media still has to go from X to Y and back to X for the hash, right?