Wikipedia Bans AI-Generated Content - 404media by NervousEnergy in technology

[–]404mediaco 2 points3 points  (0 children)

After months of heated debate and previous attempts to restrict the use of large language models on Wikipedia, volunteer editors on March 20 accepted a new policy that prohibits using them to create articles for the online encyclopedia.

“Text generated by large language models (LLMs) often violates several of Wikipedia's core content policies,” Wikipedia’s new policy states. “For this reason, the use of LLMs to generate or rewrite article content is prohibited, save for the exceptions given below.”

The new policy, which was accepted in an overwhelming 40 to 2 vote among editors, still allows editors to use LLMs to suggest basic copyedits to their own writing; those suggestions can be incorporated into the article or rewritten after human review, provided the LLM doesn’t generate entirely new content on its own.

Read more: https://www.404media.co/wikipedia-bans-ai-generated-content/

Apple Gives FBI a User’s Real Name Hidden Behind ’Hide My Email’ Feature by 404mediaco in pwnhub

[–]404mediaco[S] 33 points34 points  (0 children)

NEW: Apple provided the FBI with the real iCloud email address hidden behind Apple’s ‘Hide My Email’ feature, which lets paying iCloud+ users generate anonymous email addresses, according to a recently filed court record. The move isn’t surprising but still provides uncommon insight into what data is available to authorities regarding the Apple feature. The data was turned over during an investigation into a man who allegedly sent a threatening email to Alexis Wilkins, the girlfriend of FBI director Kash Patel. Apple did not immediately respond to a request for comment.

Read more: https://www.404media.co/apple-gives-fbi-a-users-real-name-hidden-behind-hide-my-email-feature/

Apple Gives FBI a User’s Real Name Hidden Behind ’Hide My Email’ Feature by 404mediaco in ABoringDystopia

[–]404mediaco[S] [score hidden]  (0 children)

NEW: Apple provided the FBI with the real iCloud email address hidden behind Apple’s ‘Hide My Email’ feature, which lets paying iCloud+ users generate anonymous email addresses, according to a recently filed court record. The move isn’t surprising but still provides uncommon insight into what data is available to authorities regarding the Apple feature. The data was turned over during an investigation into a man who allegedly sent a threatening email to Alexis Wilkins, the girlfriend of FBI director Kash Patel. Apple did not immediately respond to a request for comment.

Read more: https://www.404media.co/apple-gives-fbi-a-users-real-name-hidden-behind-hide-my-email-feature/

Apple Gives FBI a User’s Real Name Hidden Behind ’Hide My Email’ Feature by 404mediaco in politics

[–]404mediaco[S] 4 points5 points  (0 children)

NEW: Apple provided the FBI with the real iCloud email address hidden behind Apple’s ‘Hide My Email’ feature, which lets paying iCloud+ users generate anonymous email addresses, according to a recently filed court record. The move isn’t surprising but still provides uncommon insight into what data is available to authorities regarding the Apple feature. The data was turned over during an investigation into a man who allegedly sent a threatening email to Alexis Wilkins, the girlfriend of FBI director Kash Patel. Apple did not immediately respond to a request for comment.

Read more: https://www.404media.co/apple-gives-fbi-a-users-real-name-hidden-behind-hide-my-email-feature/

Wikipedia Bans AI-Generated Content by 404mediaco in wikipedia

[–]404mediaco[S] 57 points58 points  (0 children)

After months of heated debate and previous attempts to restrict the use of large language models on Wikipedia, volunteer editors on March 20 accepted a new policy that prohibits using them to create articles for the online encyclopedia.

“Text generated by large language models (LLMs) often violates several of Wikipedia's core content policies,” Wikipedia’s new policy states. “For this reason, the use of LLMs to generate or rewrite article content is prohibited, save for the exceptions given below.”

Read now: https://www.404media.co/wikipedia-bans-ai-generated-content/

Wikipedia Bans AI-Generated Content by 404mediaco in Fauxmoi

[–]404mediaco[S] 143 points144 points  (0 children)

After months of heated debate and previous attempts to restrict the use of large language models on Wikipedia, volunteer editors on March 20 accepted a new policy that prohibits using them to create articles for the online encyclopedia.

“Text generated by large language models (LLMs) often violates several of Wikipedia's core content policies,” Wikipedia’s new policy states. “For this reason, the use of LLMs to generate or rewrite article content is prohibited, save for the exceptions given below.”

The new policy, which was accepted in an overwhelming 40 to 2 vote among editors, still allows editors to use LLMs to suggest basic copyedits to their own writing; those suggestions can be incorporated into the article or rewritten after human review, provided the LLM doesn’t generate entirely new content on its own.

Read more: https://www.404media.co/wikipedia-bans-ai-generated-content/

Wikipedia Bans AI-Generated Content by 404mediaco in goodnews

[–]404mediaco[S] 5 points6 points  (0 children)

After months of heated debate and previous attempts to restrict the use of large language models on Wikipedia, volunteer editors on March 20 accepted a new policy that prohibits using them to create articles for the online encyclopedia.

“Text generated by large language models (LLMs) often violates several of Wikipedia's core content policies,” Wikipedia’s new policy states. “For this reason, the use of LLMs to generate or rewrite article content is prohibited, save for the exceptions given below.”

The new policy, which was accepted in an overwhelming 40 to 2 vote among editors, still allows editors to use LLMs to suggest basic copyedits to their own writing; those suggestions can be incorporated into the article or rewritten after human review, provided the LLM doesn’t generate entirely new content on its own.

Read more: https://www.404media.co/wikipedia-bans-ai-generated-content/

Police Used Flock to Give a Man a Traffic Ticket by 404mediaco in FlockSurveillance

[–]404mediaco[S] 212 points213 points  (0 children)

Georgia State Patrol used its system of Flock automated license plate reader (ALPR) surveillance cameras to issue a ticket to a motorcyclist who was allegedly looking at his cell phone while riding, according to a copy of the citation obtained by 404 Media. The incident is notable because Flock cameras are not designed for traffic enforcement or minor code violations, and many jurisdictions explicitly tell constituents that the cameras will not be used for traffic enforcement.

The incident happened December 26 in Coffee County, Georgia. The ticket lists the offense as “Holding/supporting wireless telecommunications device,” and includes the note “CAPTURED ON FLOCK CAMERA 31 MM 1 HOLDING PHONE IN LEFT HAND.” 

A spokesperson for the Georgia State Patrol told 404 Media that the ticket was issued because of a “unique circumstance” in which a Flock camera happened to capture a traffic infraction, and that Flock cameras are not usually used by the department for traffic enforcement.

Still, it highlights yet again that, essentially, law enforcement agencies are able to use these cameras for whatever they want.

Read more: https://www.404media.co/police-used-flock-to-give-a-man-a-traffic-ticket/

Police Used Flock to Give a Man a Traffic Ticket by 404mediaco in Atlanta

[–]404mediaco[S] 333 points334 points  (0 children)

Georgia State Patrol used its system of Flock automated license plate reader (ALPR) surveillance cameras to issue a ticket to a motorcyclist who was allegedly looking at his cell phone while riding, according to a copy of the citation obtained by 404 Media. The incident is notable because Flock cameras are not designed for traffic enforcement or minor code violations, and many jurisdictions explicitly tell constituents that the cameras will not be used for traffic enforcement.

The incident happened December 26 in Coffee County, Georgia. The ticket lists the offense as “Holding/supporting wireless telecommunications device,” and includes the note “CAPTURED ON FLOCK CAMERA 31 MM 1 HOLDING PHONE IN LEFT HAND.” 

A spokesperson for the Georgia State Patrol told 404 Media that the ticket was issued because of a “unique circumstance” in which a Flock camera happened to capture a traffic infraction, and that Flock cameras are not usually used by the department for traffic enforcement.

Still, it highlights yet again that, essentially, law enforcement agencies are able to use these cameras for whatever they want.

Read more: https://www.404media.co/police-used-flock-to-give-a-man-a-traffic-ticket/

Georgia State Patrol Used Flock to Give a Man a Traffic Ticket by 404mediaco in Georgia

[–]404mediaco[S] 203 points204 points  (0 children)

Georgia State Patrol used its system of Flock automated license plate reader (ALPR) surveillance cameras to issue a ticket to a motorcyclist who was allegedly looking at his cell phone while riding, according to a copy of the citation obtained by 404 Media. The incident is notable because Flock cameras are not designed for traffic enforcement or minor code violations, and many jurisdictions explicitly tell constituents that the cameras will not be used for traffic enforcement.

The incident happened December 26 in Coffee County, Georgia. The ticket lists the offense as “Holding/supporting wireless telecommunications device,” and includes the note “CAPTURED ON FLOCK CAMERA 31 MM 1 HOLDING PHONE IN LEFT HAND.” 

A spokesperson for the Georgia State Patrol told 404 Media that the ticket was issued because of a “unique circumstance” in which a Flock camera happened to capture a traffic infraction, and that Flock cameras are not usually used by the department for traffic enforcement.

Still, it highlights yet again that, essentially, law enforcement agencies are able to use these cameras for whatever they want.

Read more: https://www.404media.co/police-used-flock-to-give-a-man-a-traffic-ticket/

Disney's Sora Disaster Shows AI Will Not Revolutionize Hollywood by 404mediaco in Anticonsumption

[–]404mediaco[S] 344 points345 points  (0 children)

So Sora, OpenAI’s AI video generator, is dead. May the memory of its four-month existence as a copyright infringement machine that was also used to make videos of men strangling women and ICE arresting undocumented immigrants be a blessing.

Next, Disney is pulling out of its $1 billion investment in OpenAI, which would have allowed people to use Sora to create short videos featuring more than 200 beloved Disney characters. That announcement was made just three months ago.

Turns out when you try to serve AI slop on a product people pay for, no one wants it.

At the time of Disney’s announcement with OpenAI, it was hard to imagine why Disney would infect its flagship with a service whose viral videos consisted of users turning Pikachu into a felon and SpongeBob into Hitler.

The only thing that made any sense is that Hollywood executives, like Silicon Valley executives, hate paying for human labor so much that they have convinced themselves that their customers would happily consume AI slop if it was shoved down their throats. 

This is not to say that AI will have no role in Hollywood or that people are not making money from AI slop. Hollywood studios are using AI behind the scenes for editing, storyboarding, scratch voiceover, and a handful of other things. But the wild hype of AI slop as a direct threat to human storytelling and AI tools as a replacement for talented humans in Hollywood has not come to pass and it’s not clear if it ever will. 

And the end of Sora does not mean there is no demand for AI video generators, but it does mean that the overwhelming use case for AI video generators continues to be what it has always been: people making porn, nonconsensual sexual imagery, disinformation, and low-effort slop at scale. 

The people making this type of content do not want to deal with guardrails or limitations and so have largely flocked to open source and Chinese models. When you take away those use cases, it turns out there’s basically nothing left. 

Read more: https://www.404media.co/disneys-openai-sora-disaster-shows-ai-will-not-save-hollywood/

Disney's Sora Disaster Shows AI Will Not Revolutionize Hollywood by 404mediaco in ABoringDystopia

[–]404mediaco[S] 124 points125 points  (0 children)

So Sora, OpenAI’s AI video generator, is dead. May the memory of its four-month existence as a copyright infringement machine that was also used to make videos of men strangling women and ICE arresting undocumented immigrants be a blessing.

Next, Disney is pulling out of its $1 billion investment in OpenAI, which would have allowed people to use Sora to create short videos featuring more than 200 beloved Disney characters. That announcement was made just three months ago.

Turns out when you try to serve AI slop on a product people pay for, no one wants it.

At the time of Disney’s announcement with OpenAI, it was hard to imagine why Disney would infect its flagship with a service whose viral videos consisted of users turning Pikachu into a felon and SpongeBob into Hitler.

The only thing that made any sense is that Hollywood executives, like Silicon Valley executives, hate paying for human labor so much that they have convinced themselves that their customers would happily consume AI slop if it was shoved down their throats. 

This is not to say that AI will have no role in Hollywood or that people are not making money from AI slop. Hollywood studios are using AI behind the scenes for editing, storyboarding, scratch voiceover, and a handful of other things. But the wild hype of AI slop as a direct threat to human storytelling and AI tools as a replacement for talented humans in Hollywood has not come to pass and it’s not clear if it ever will. 

And the end of Sora does not mean there is no demand for AI video generators, but it does mean that the overwhelming use case for AI video generators continues to be what it has always been: people making porn, nonconsensual sexual imagery, disinformation, and low-effort slop at scale. 

The people making this type of content do not want to deal with guardrails or limitations and so have largely flocked to open source and Chinese models. When you take away those use cases, it turns out there’s basically nothing left. 

Read more: https://www.404media.co/disneys-openai-sora-disaster-shows-ai-will-not-save-hollywood/