by chetangoti on 2/2/23, 6:00 AM with 399 comments
by jakelazaroff on 2/2/23, 7:01 AM
> on the contrary, it must be developed.
No, it mustn’t. There’s not a gun to your head, forcing you to do this. You want to develop this technology. That’s why it’s happening.
Technology isn’t inevitable. It’s a choice we make. And we can go in circles about whether it’s good or bad, but at least cop to the fact that you have agency and you’re making that choice.
[1] https://github.com/iperov/DeepFaceLive/issues/41#issuecommen...
by lucumo on 2/2/23, 10:37 AM
During WWII, the protagonist's parents are shot by Germans after the body of a collaborator is found in front of their house. The parents were arguing with the soldiers about something when arrested, and ended up shot during the arrest.
The collaborator was shot by the resistance in front of their neighbours' house, and the neighbours moved the body in front of the protagonist's house.
Over the years, he encounters many people involved in this event and starts seeing things from many sides. One of the themes explored is who bears moral responsibility for his parents' death. The Germans for shooting them? His mother for arguing? The neighbours for moving the body? The resistance for shooting the collaborator? The collaborator for collaborating? All of their actions were a necessary link in the chain that led to their death.
One of the characters utters a simple and powerful way of dealing with that morality: "He who did it, did it, and not somebody else. The only useful truth is that everybody is killed by whom he is killed, and not by anyone else."
It's a self-serving morality, because the character was part of the resistance group that shot the collaborator at a time when reprisals were very common. But it's also very appealing in its simplicity and clarity.
I find myself referring back to it in cases like this. In the imagined future where this tech is used for Bad Things, who is responsible for those Bad Things? The person that did the Bad Thing? The person that developed this tech? The people that developed the tech that led up to this development?
I'm much inclined to only lay blame on the person that did the Bad Thing.
by LeanderK on 2/2/23, 2:05 PM
If this is used to scam old people out of their belongings, then you really have to question your actions and, imho, bear some responsibility. Was it worth it? Do the positive uses outweigh the negatives? They cite examples of misuse of other technologies as if that would free them of any guilt, as if previous errors would allow them to do anything because greater mistakes were made.
You are not, of course, completely responsible for the actions others take, but if you create something you have to keep in mind that bad actors exist. You can't just close your eyes if your actions lead to a strictly worse world. Old people scammed out of their savings are real people, and it is real pain. I can't imagine the desperation and the helplessness that follow. It really makes me angry how someone can ignore so much pain and not even engage in an argument about whether it's the right thing to do.
by mihaic on 2/2/23, 8:38 AM
I'm starting to think this is a way to justify away the collective guilt of bringing harmful products into the mainstream.
It seems to come from the same origins as "crypto can't be regulated", "government can't do anything" and "it's OK because it's legal", and it always worries me that I don't really see any sort of moral stance being taken anymore.
by dusted on 2/2/23, 7:12 AM
Using fake media to trick people into believing anything used to be a privilege reserved for nation states and the ultra rich. Now that _ANYONE_ and their cat can do it, it should follow that nobody can believe anything that's on a screen anymore (this comment included).
by yipbub on 2/2/23, 7:23 AM
Once you let the genie out of the bottle, a wish will be made. A technology might not be inherently bad, but neither are knives, and we don't leave those lying around.
That said, it is the human species that develops technology; rarely is a single individual capable of holding a technology back.
by kefka_p on 2/2/23, 8:14 AM
edit: I’ll take your knee-jerk DV, and any others, as an admission of an inability to speak to the positive utility of this technology.
by kstenerud on 2/2/23, 10:31 AM
This means that the incentive to develop this technology is already there, and so it WILL be developed no matter how much people wish it wouldn't.
The only difference at this point is whether some of the implementations are developed in public view or not. If none are public, then all of them will be done in secret, and our opportunities to develop countermeasures will be severely hampered by having fewer eyes on it, and a smaller entry funnel for potential white hats.
by college_physics on 2/2/23, 10:47 AM
But software is not "tech". It is the explicit expression and projection of cultural objectives and values onto a particular type of tech. You can take the exact hardware we have today [0] and program a million different worlds onto it, some better, some worse.
Developers are simply the willing executioners of prevalent power structures. Deal with it. If you have a moral backbone (i.e., you don't agree with the prevalent morality as expressed in what the software industry currently does), do something about it.
[0] Of course, upon deeper examination, overall system design (e.g. how client- or server-heavy the configuration is, what kind of UI is promoted, etc.) is not neutral either. Cultural/political/economic choices creep in *everywhere*.
by rvieira on 2/2/23, 8:38 AM
Machine guns: an advanced piece of engineering widely known to have been developed purely as an academic exercise. No one could have expected other uses.
by proto-n on 2/2/23, 12:56 PM
This technology is going to be developed regardless of what we do here. Please realize that you are not advocating for it not to be developed: rather, you are advocating for it not to be developed in the open.
by wruza on 2/2/23, 10:25 AM
We actually blame them, except for airplanes. Most of these were invented at a time when lives had much less value, and they are of no use unless some half-minded pig attacks you or tries to undermine your defenses.
I’d like to see how this line of reasoning changes when someone releases a virus for your DNA in your backyard, made with funnyjokes/easy-create-virus-for-a-drone-app.
by tgv on 2/2/23, 9:58 AM
So, what's the possible scenario for that outcome? Well, look at the upcoming elections in Nigeria. The BBC writes: "With an estimated 80 million Nigerians online, social media plays a huge role in national debates about politics. Our investigation uncovered different tactics used to reach more people on Twitter. Many play on divisive issues such as religious, ethnic and regional differences." ABC News writes: "At least 800 people died in post-election violence after the 2011 polls."
Adding deepfakes into this mix can trigger violent reactions. Should that happen, the creators of the deepfakes are obviously to blame, but those who enabled them, including the original researchers, are responsible too. Ignoring that is just putting your head in the sand.
by Hard_Space on 2/2/23, 3:00 PM
But as the barrier to entry for really convincing output goes up (768px/1024px training pipelines, and beyond), and it suddenly becomes something that one person alone can't really do well any more, the 'amateur' stuff is going to look far worse to people than it does now. You just have to wait for that barrier to rise, and I can tell you as a VFX insider that that is happening right now.
Deepfakes are the reverse of CGI, which began as an inaccessible technology and gradually became accessible, before the scale of its use in VFX reversed that again.
Now, assuming you can either afford or will pirate the right software, you could probably match any CGI VFX shot in a major blockbuster if you gave up your job and worked on it non-stop for a year of 18-hour days (assuming you'd already been through the steep learning curve of the pipeline). So it's out of reach, really, and so will the best deepfakes be.
This stuff everyone is so scared of will end up gate-kept, if only for logistical reasons (never mind any new laws that would address it) - at least at the quality that's so feared in these comments.
by jablala on 2/2/23, 11:48 AM
Those possess a complete lack of a moral compass.
by AlexAltea on 2/2/23, 12:12 PM
Do you trust videocall participants because you recognize their faces and voices? ...Or because a server certified by a root CA has authenticated the other participants?
The age of deepfakes has started, and nobody can stop it. Improving our mental security models will become as essential as literacy.
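To make the first point concrete, here is a minimal Python sketch of the check your machine already performs on every videocall, with meet.example.com standing in as a hypothetical videocall server:
    import socket
    import ssl

    hostname = "meet.example.com"  # hypothetical videocall server

    # create_default_context() loads the system's trusted root CAs
    context = ssl.create_default_context()

    with socket.create_connection((hostname, 443)) as sock:
        # wrap_socket() verifies the server's certificate chain and hostname;
        # it raises ssl.SSLCertVerificationError if either check fails
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            print(tls.getpeercert()["subject"])
The face on screen never enters into it: if the chain verifies, the session is "trusted", whatever pixels are streamed over it.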
by Lacerda69 on 2/2/23, 7:04 AM
the alternative is that it's developed in hiding and used by the most vile and evil without many being aware of it (which it most definitely will be)
As with nuclear, the cat's out of the bag, or the baby's bathwater(?) already spilled; there's no way to turn back the clock on technological innovation.
by epups on 2/2/23, 9:00 AM
If somehow we could get the United Nations to agree to ban deepfake development worldwide, then surely we should enter an ethical discussion about whether we should do it or not. In a world where sophisticated actors already have access to these (and much better) tools, having an open-source GitHub repo is a good thing in my view.
by JasonFruit on 2/2/23, 2:28 PM
I know that it's expected that a man of a certain age will begin to say these things, but I think it's true now in a way that it was not true about "those young whippersnappers with their motorcars": we've taken a wrong turn, and I don't like where we're headed.
by blondin on 2/2/23, 6:22 AM
so, deepfake authors want credit for their work. that's perplexing.
what's more, this is happening while they seem to be ignoring the ethical concerns raised in the issue, citing that people can do whatever they want with the tech.
by ttsalami on 2/2/23, 8:19 AM
by zzo38computer on 2/3/23, 4:53 AM
If they do not make them FOSS in public, then the Conspiracy will invent their own and use it for bad uses only.
Furthermore, even if a program is written, you can decide not to use it; that it is written (as FOSS) means that you can read how it works, now that someone else has written it. You can also execute it on a computer, if that is what is desired. Also, if it is well known enough, then hopefully, if someone does use it deceptively against you, you might be able to guess, or to figure it out (although it might be difficult, at least it might be possible if it is known enough).
I have no intention of using such a thing, but someone else might figure out uses for it.
(For example, maybe there are some uses in movies: if the original actor has been injured for an extended period of time (including if they are dead), or if they want to make up a picture of someone who does not exist. (Although they should avoid being deceptive; for example, include in the credits a mention of using such a thing.) Even if it is considered acceptable, though, some people will prefer to make movies without it, and such a thing should be acceptable too.)
(I think even in Star Trek, in-story, in some episodes they made deepfake movies of someone. And even in Star Trek, both good and bad uses are possible. Or am I mistaken?)
Nevertheless, there may be some dangers involved, but there are potential dangers with anything; if you are careful, then you can try to avoid them, hopefully.
by qwerty456127 on 2/2/23, 11:07 AM
Can you believe a politician saying something on TV? Hell no! You should exercise logic about the whole political play he is a part of. Should you think badly of a person you find on a porn site? Absolutely not; what good could come of that in any case?
It has always been like this, but now there is something that can push this attitude into common sense.
by nathias on 2/2/23, 8:20 AM
1. I want to deepfake myself to have an avatar for online interaction.
2. I want to generate videos instead of filming by pasting people into existing videos.
3. Prevent a Face/Off scenario.
by amarant on 2/2/23, 12:33 PM
Imagine what that would mean for dubbed movies/TV if it gets good enough.
There are legit use cases, and that justifies the technology's existence. The bad actors don't make it immoral to develop a technology, IMO.
by DennisP on 2/2/23, 3:49 PM
by givemeethekeys on 2/2/23, 10:44 AM
Face swapping + voice swapping + auto translate = your customer support can be anyone on the planet but look and sound familiar to you. Maybe you're getting over a facial injury.
Face swapping = you no longer have to put on make up. Just swap your made up face for meetings.
Face swapping + voice recordings + AI that learns = that scene in Contact where Jodie Foster talks to the alien, but it takes the form of her father to make her feel more comfortable.
by kderbyma on 2/2/23, 2:46 PM
by CyborgCabbage on 2/2/23, 12:34 PM
by lonelyasacloud on 2/2/23, 5:19 PM
Here it's obvious what's going to happen without robust legislation protecting the likeness of all individuals (and not just special-case celebs) from the non-consensual generation of new material.
The fact that such legislation is unlikely to happen before an awful lot of suffering has occurred is a testament both to the naive belief that everything new is good, and to legislative processes, riddled with vested interests and operating with a bandwidth from the age of sail, that cannot handle the downsides when scaling breaks the happy-path assumptions.
Focusing efforts on the legislative process seems likely to be more productive than point solutions that rely on techies and scientists not to develop tech that can be used for nefarious purposes.
by dmingod666 on 2/2/23, 9:12 AM
What if this same repo were owned by Nvidia, and it had some commercial interest in the product and were ready to litigate... would everyone still pile up on it?
Is it not, on some level, disdain that it's just run by a bunch of guys who can be pushed around without much consequence?
Would we have a thread saying "screw it, shut down ChatGPT, it doesn't fit my moral world-view"? Why is that absurd, but this is fair discussion?
by bondarchuk on 2/2/23, 12:09 PM
You might argue that the technology to make pixels on a screen resembling real humans is bad, but then you have to actually make that argument (and "some people got scammed" is indeed such an argument, albeit a pretty weak one), not just shift it to "this is technology, machine guns are technology, machine guns are bad".
by 55555 on 2/2/23, 1:27 PM
It’s not going to end the argument though, not least because you’ll then have to assign a relative value to abstract things like “personal freedom” or “artistic expression”.
Even if you could quantify the losses to crypto scams, you can’t put an objective value on some of its more ideological benefits.
by gloosx on 2/2/23, 2:24 PM
by winrid on 2/2/23, 8:04 AM
by halicarnassus on 2/2/23, 1:24 PM
I guess the nonsensical argument "it's not the <insert technology> that <does the bad thing>, but the person using the <technology>" will never die out.
If you don't have <the technology>, it's much harder to <do the bad thing>; it has to be done hands-on, from a very close distance, with much higher risk for the perpetrator.
by deniscepko2 on 2/2/23, 1:21 PM
by caporaltito on 2/2/23, 8:55 AM
by WFHRenaissance on 2/2/23, 2:19 PM
by baal80spam on 2/2/23, 9:49 AM
If Company A / Country A / Person A won't do it, then Company B / Country B / Person B will do it and use it to bankrupt you / attack and possibly kill you / take advantage of you.
It's that simple.
by jmnicolas on 2/2/23, 1:10 PM
For me everything (image, text, sound etc) that comes from a computer is suspect nowadays.
by spaceman_2020 on 2/2/23, 12:00 PM
Could crypto unironically be the way out of this mess? If a document isn't signed by a wallet associated with you, it should not be considered authentic?
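The signing half of that doesn't even need a chain; a bare keypair already does it. A minimal Python sketch, assuming the third-party cryptography package (the "wallet" here is just an Ed25519 private key):
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    private_key = Ed25519PrivateKey.generate()  # stays in your "wallet"
    public_key = private_key.public_key()       # published, so anyone can verify

    document = b"I actually said this."
    signature = private_key.sign(document)

    try:
        public_key.verify(signature, document)  # raises if forged or altered
        print("authentic")
    except InvalidSignature:
        print("not signed by this wallet")
The hard part a blockchain would be solving is key distribution: binding that public key to "you" in a way a stranger can check.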
by EVa5I7bHFq9mnYK on 2/2/23, 1:56 PM
by Sevii on 2/2/23, 2:30 PM
by azubinski on 2/2/23, 10:15 AM
LOL
BTW.
It's not a "technology" in any classical sense of the word.
This is a funny and technologically useless rattle that can nonetheless be used like a Chinese-made Kalashnikov assault rifle.
by dtx1 on 2/2/23, 10:34 AM
> “Your scientists were so preoccupied with whether they could, they didn’t stop to think if they should.”
by ncr100 on 2/2/23, 4:46 PM
by SergeAx on 2/2/23, 7:02 PM
by return_to_monke on 2/2/23, 5:32 PM
BUT.
what are people using deepfakes for, in good faith? can someone provide one example that isn't malicious?
the best I could imagine is maybe amateur filmmakers deepfaking their faces onto existing footage to cut costs - but it doesn't seem that this outweighs the drawbacks
by sva_ on 2/2/23, 1:09 PM
by renewiltord on 2/2/23, 10:40 AM
They've cried wolf enough times. Everything is dangerous and everything is a crisis.
Consequently, I will ignore their warnings about this as well. It'll be okay. Tomorrow the community will forget about it. Tomorrow the crisis will be that some one-person blog is not GDPR compliant.
by arein3 on 2/2/23, 6:22 PM
There will be a break-in period, but the conclusion will be: check the source of the information.
Making this easily available will make the break-in period easier.
by ge96 on 2/2/23, 8:32 AM
by qthrowayq0909 on 2/2/23, 7:32 AM
by badrabbit on 2/2/23, 11:27 AM