I did the bad thing and clicked the clickbait. It was for crazy movie credits, and I am sorry to everyone for increasing the hit count for those guys. One of the credits came from RoboCop; you can see it in the featured image. The text reads:
This motion picture is protected under the laws of the United States and other countries and its unauthorized duplication, distribution, or exhibition may result in civil liability and criminal prosecution by enforcement droids.
In 1987, that was a crazy idea. By 2012, it is routine for copyright enforcement droids to police YouTube and BitTorrent. These robots issue automatic takedown notices routinely. I have no beef with copyright holders protecting their property. I have a serious problem with their incompetence. These robots routinely misbehave, flagging content that the claimants do not actually own. That's bad enough, but the robots also have no ability to distinguish fair use from actual infringement. These robots need human guidance and supervision. Just like all predictive algorithms.
There’s a qualitative difference between Amazon’s algorithms showing me a bad pick for my next purchase and law enforcement misreading a dataset. If Amazon messes up, they just lose revenue. If law enforcement messes up, someone’s life is about to get a lot worse. And if that were not enough, the incentives are such that only Amazon will move to fix a bad algorithm. Predictive analytics for legal purposes must be used as an augmentation tool, not as an automated tool that proceeds without supervision by someone capable of exercising reasoned judgement. Copyright enforcement has made that exceptionally clear in the last five years.
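To make the augmentation point concrete, here is a minimal sketch. Everything in it is hypothetical (the match scores, the `Claim` type, the threshold); the only design decision that matters is that the robot flags and a human decides.

```python
# Sketch of "augmentation, not automation": the classifier routes likely
# matches to a human review queue instead of issuing takedowns on its own.
# All names here are made up for illustration; this is not any real
# content-ID system.

from dataclasses import dataclass

MATCH_THRESHOLD = 0.90  # assumed confidence cutoff for flagging, not for auto-acting


@dataclass
class Claim:
    video_id: str
    score: float        # classifier's confidence that this matches claimed content
    claimed_work: str


def triage(claims: list[Claim]) -> list[Claim]:
    """Flag likely matches for *human* review; never issue takedowns directly."""
    review_queue = []
    for claim in claims:
        if claim.score >= MATCH_THRESHOLD:
            # The robot's job ends here. A person who can weigh fair use
            # (criticism, parody, commentary) makes the actual call.
            review_queue.append(claim)
        # Low-confidence matches are dropped rather than auto-escalated.
    return review_queue


if __name__ == "__main__":
    queue = triage([
        Claim("vid123", 0.97, "RoboCop (1987)"),
        Claim("vid456", 0.42, "RoboCop (1987)"),
    ])
    for claim in queue:
        print(f"Needs human review: {claim.video_id} ({claim.claimed_work})")
```

The whole point is where the loop stops: the algorithm narrows the pile, and reasoned judgement does the rest.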
Image by Orion Pictures, and criticism constitutes fair use, so the robots can suck it. Also, if you want to come at me, bro, this.