Really solid legal analysis here. The existing civil remedies angle is underrated in these debates; people forget that defamation and privacy torts still apply. I dealt with a similar issue in a tech case where everyone wanted new regs when we already had the tools for accountability.
When photography first started, it wasn't long before "dirty postcards" began showing up in France, and the advent of moving pictures quickly spawned the "stag film" industry. Early VCRs catered to pornography. Taking a new technology and using it for sexual material, and most importantly profit, has always been the case. It was inevitable that AI would open up another Pandora's box, and social media would make the spread far faster than anything experienced in the past. I think we're just at the beginning of this, and we would be wise to have serious discussions sooner rather than later. The coming of "the feelies" with someone you actually know is not far away.
I can report, having tested it out (for research purposes, of course), that at this point Grok will not produce any new sexualized image or video from an uploaded photo of a person or drawing, even if the original image is an already existing pornographic photo or even comic art. Nor can it even generate actual celebrities - I tried generating some images from "The Shining" with Malcolm McDowell substituted for Jack Nicholson, and got images that resembled neither. And yet many press sources are claiming that Grok will still "nudify" real people.