We are already there: you can no longer trust any image or video you see, so what is the point?
Bad actors will still be able to create fake images and videos as they already do.
Limiting it for the average user is stupid.
We are not actually there yet. First, you still need some technical understanding and a somewhat decent setup to run these models yourself without the guardrails. So the average greasy dude who wants to share HD porn based on your daughter's LinkedIn profile pic on nsfw subreddits still has too many hoops to jump through. Right now you can also still spot AI images pretty easily if you know what to look for, especially with older Stable Diffusion models. But all of this could change very soon.
Generating porn is easier and cheaper. You don't have to spend the substantial time it takes to learn to draw naked bodies. (The joke being that serious artists go through a lot of nude model drawing sessions, but that isn't porn.)
The models art schools get for nude drawing sessions usually aren't that attractive, definitely not anywhere near a porn ideal. The objective is to learn the body, not to become aroused.
There is a lot of (mostly non-realistic) porn that comes from art school students via the skills they gain.