
noname223

Angelic
Aug 18, 2020
4,441
I read a newspaper article about this some months ago. I think Google Images cannot be used for this purpose; I am not sure why, but it does not seem to be accurate enough.

Such a technology can easily be misused, and it is likely that some companies know that but don't intervene. They would rather put profits over ethics.

In my country (and I think in the EU more broadly), certain companies violate privacy regulations with this technology. It can also easily be used by governments to monitor citizens, for example by brutal regimes during protests.

The main use of it is clear: a tool to search for porn, often even revenge porn. Certain companies even used this as a marketing trick, and only changed their strategy when confronted with it.

So I think this could mean a lot of problems in the future, especially for women. The mixture of deepfake porn and face search engines like that can have devastating consequences, though it can cause problems for men as well.


I don't want to scare people. Either we get used to it or there will be regulations and stricter laws. I assume the latter.
 
Insomniac

𝔄 𝔲 𝔱 𝔦 𝔰 𝔪
May 21, 2021
1,357
For something to even be dangerous in this world, there would have to be safety.

There is no safety. There never has been.

The whole universe is a race to destruction. The more you destroy, the higher your chances of survival.

So anyway, I don't think face recognition is the problem here.
 
another@

Member
Nov 13, 2022
96
I agree with Insomniac. What is society looking to do with such technology? I doubt that there is a dangerous bug in the software...