Just a few minutes ago, I was told by my mentor o...
How to prevent indecent videos and images from being spread all over the Internet.
We can stop people from posting from their cell phones and smartphones. Take away the function itself. I am not sure how to do that, but I think it is the right thing to do.
The main media, for example: YouTube, Instagram, posting forums, blog comment sections, and photo-posting sites. We can take away the very function of sending to them from mobile phones and smartphones.
It is not an app but a function, and the function itself is removed from all portable devices.
Today's approach is to use conditional branching or AI on the receiving side to recognize what kind of photo or video is being submitted: code (a program) is defined to screen submissions, and the receiving service rejects ("kicks") them. Instead of that, remove the "send" function itself from cell phones and smartphones, so nothing can be posted to YouTube, X (Twitter), Instagram, bulletin boards, blog comment sections, photo-posting sites, and so on. The receiving side is no longer the screener that kicks submissions; the sending function is taken out of the machines themselves, out of all the machines we carry around with us every day. It is the opposite idea.
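The contrast can be sketched in a few lines. This is a purely illustrative sketch; every name in it (server_side_screen, client_can_send, looks_indecent) is a hypothetical assumption for illustration, not a real API.

```python
# Hypothetical sketch of the two approaches contrasted above.
# All names here are illustrative assumptions, not a real API.

def server_side_screen(upload: bytes, looks_indecent) -> bool:
    """Today's approach: the receiving service inspects each upload
    (e.g. with AI or conditional checks) and rejects ("kicks") it.
    Returns True if the upload is accepted."""
    return not looks_indecent(upload)

def client_can_send(device_has_upload_function: bool) -> bool:
    """The opposite idea: the portable device simply lacks the
    'send to public media' function, so nothing reaches the server
    and no screening is ever needed."""
    return device_has_upload_function

# With the function removed from the device, the upload never happens:
print(client_can_send(False))
```

The design difference is where the check lives: server-side screening must classify every submission after it arrives, while removing the client capability means there is nothing to classify.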
In Japan, you can be arrested for possessing child pornography.
In other countries, you can be arrested just for possessing obscene images and videos.
If you take away, from portable devices, the ability to send images and videos to any place where they can be made public to an unspecified number of people (the function, not the application), then, surely and absolutely, they cannot be sent. If they cannot be sent, they cannot be published at all.
Then, first of all, obscene images and videos, and images and videos of violence, could no longer spread. Next, portable devices would be unable to send more than a certain amount of data to the public, and the production of devices with capacity above a certain threshold would be banned. It would then be impossible to take overly sharp images and videos.
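The data cap described above might look something like this. It is a minimal sketch under assumptions: the limit value and the function name can_transmit are invented for illustration and are not taken from any real device firmware or regulation.

```python
# Hypothetical sketch of a device-wide outgoing-data cap: the portable
# device refuses to transmit any payload above a fixed size limit, so
# only low-resolution images and videos can leave the device.

MAX_OUTGOING_BYTES = 200_000  # assumed cap; a real value would be set by regulation

def can_transmit(payload: bytes, limit: int = MAX_OUTGOING_BYTES) -> bool:
    """Return True only if the payload fits under the device-wide cap."""
    return len(payload) <= limit

print(can_transmit(b"\x00" * 100_000))  # small clip: True
print(can_transmit(b"\x00" * 500_000))  # high-resolution video: False
```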
AI-generated images would also be less detailed, so the system would always know that such images are fake.
Also, the price of portable devices would drop, so anyone could have one. And with a cheap, easy-to-obtain portable device, victims could record photographic and video evidence.
I mean, some countries and agencies would need to ensure that voiceprint matching has evidentiary value, but that can be done in about ten minutes, right? The voice-recording function would also remain (within the range where voiceprint matching is possible, of course). Obscene images and videos made public to an unspecified number of people do not reveal the identity of the person who made them, but if a voice believed to be the perpetrator's is recorded in an indecent video, the person can still be identified by voiceprint matching. That's right: voiceprint matching can only be done by a specialized agency, so no ordinary viewer can know who it is. And you can tell who the perpetrator is when the victim is called by name or nickname during the violence, can't you? If the perpetrator yells or says something horrible, they can be identified. To the general public, though, even the audio will not reveal who they are.
In short: remove, from all cell phones and smartphones, the ability to send indecent images and videos to all posting sites (media).
This idea first came to My Master O. on November 9.
Today is November 11, 2:27 PM Japan time.
I think there are still a lot of holes, but the core is to take away this function itself. If professionals develop it further from this point of view, there will be no more people in this world who are threatened, who continue to be threatened, or who continue to be victimized.