Big Tech's timid deepfake defense

Social media platforms are loath to pass judgment on a clip’s veracity on their own — an approach experts say could lead to a new election crisis.

“A deepfake could cause a riot; it could tip an election; it could crash an IPO. And if it goes viral, [social media companies] are responsible,” says Danielle Citron, a University of Maryland law professor who has written extensively about deepfakes.

Citron says shying away from arbitrating truth is a “cop-out” and that platforms should more aggressively block and filter potentially dangerous manipulated videos.

Jack Clark, policy director at OpenAI, says companies can do more to verify whether a video was shot by the person who posted it, and that they should plaster prominent banners across manipulated videos to warn users away.