Yeah, that would definitely get slr into major legal issues. But to weigh in on the whole subject: the box has long been open. If you want to deepfake someone into porn, you can already do it. It just requires know-how and time, but it's only going to get higher quality and much easier going forward.
As for non-porn deepfakes: all these institutions and companies are doing is delaying the inevitable, really. It's like OpenAI not releasing voice cloning. They act like it isn't already possible. That's just companies covering their own asses from lawsuits. It's not safety for you and me but safety for themselves. I'd even argue it's actively harmful, since it keeps the masses oblivious to threats that already exist, so they can't get accustomed to them while the technology is still imperfect.