
Trump signs the Take It Down Act. What is it?

President Donald Trump signed the Take It Down Act on Monday, bipartisan legislation that imposes stricter penalties for the distribution of non-consensual intimate images, sometimes called "revenge porn," as well as deepfakes created with artificial intelligence.

The measure, which takes effect immediately, was introduced by Senator Ted Cruz, a Republican from Texas, and Senator Amy Klobuchar, a Democrat from Minnesota, and later won the support of First Lady Melania Trump. Critics of the measure, which applies to both real and AI-generated imagery, say its language is too broad and could raise censorship and free-speech questions.

What is the Take It Down Act?

The law makes it illegal to "knowingly publish" or threaten to publish intimate images without a person's consent, including AI-created "deepfakes." It also requires websites and social media companies to remove such material within 48 hours of notice from a victim. Platforms must also take steps to delete duplicate content. Many states have already banned the dissemination of sexually explicit deepfakes or revenge porn, but the Take It Down Act is a rare example of federal oversight of internet companies.

Who supports it?

The Take It Down Act has received strong bipartisan support and was championed by Melania Trump, who lobbied on Capitol Hill in March, saying it was "heartbreaking" to see what teenagers, especially girls, go through after falling victim to people who spread such content.

Cruz said the measure was inspired by Elliston Berry and her mother, who visited his office after Snapchat refused for nearly a year to remove an AI-generated "deepfake" of the then-14-year-old.

Meta, which owns and operates Facebook and Instagram, supports the legislation.

"Having an intimate image – real or AI-generated – shared without consent can be devastating, and Meta has developed and backs many efforts to help prevent it," Meta spokesman Andy Stone said in March.

The Information Technology and Innovation Foundation, a tech industry-supported think tank, said in a statement after the bill's passage last month that it is "an important step forward that will help people pursue justice when they are victims of non-consensual intimate imagery, including deepfake images created using AI."

"We must provide victims of online abuse with the legal protections they need when intimate images are shared without their consent, especially now that deepfakes are creating horrifying new opportunities for abuse," Klobuchar said in a statement. "These images can ruin lives and reputations, but now that our bipartisan legislation is becoming law, victims will be able to have this material removed from social media platforms and law enforcement can hold perpetrators accountable."

Klobuchar called the law's passage a "major victory for victims of online abuse," saying it gives victims legal protections and tools for when their intimate images, including deepfakes, are shared without their consent, and enables law enforcement to hold perpetrators accountable.

"This is also a landmark step toward establishing sensible rules of the road for social media and AI," she added.

Cruz said: "Predators who weaponize new technology to post this exploitative filth will now rightfully face criminal consequences, and Big Tech will no longer be allowed to turn a blind eye to the spread of this vile material."

What are the censorship concerns?

Free speech advocates and digital rights groups say the legislation is too broad and could lead to the censorship of legitimate images, such as legal pornography and LGBTQ content, as well as of government critics.

"While the bill is meant to address a serious problem, good intentions alone are not enough to make good policy," said the nonprofit Electronic Frontier Foundation, a digital rights advocacy group. "Lawmakers should be strengthening and enforcing existing legal protections for victims, rather than inventing new takedown regimes that are ripe for abuse."

The takedown provision in the bill "applies to a much broader category of content – potentially any images involving intimate or sexual content" than the narrower definitions of non-consensual intimate imagery found elsewhere in the text, the EFF said.

"The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests. Services will rely on automated filters, which are notoriously blunt tools," the EFF said. "They frequently flag legal content, from fair commentary to news reporting. The law's tight time frame requires that apps and websites remove speech within 48 hours, rarely enough time to verify whether the speech is actually illegal."

As a result, the group said, online companies, especially smaller ones that lack the resources to sift through large volumes of content, will likely choose to avoid the burdensome legal risk by simply taking the speech down rather than attempting to verify it.

The measure, the EFF said, also pressures platforms to "actively monitor speech, including speech that is presently encrypted," to address liability threats.

The Cyber Civil Rights Initiative, a nonprofit that helps victims of online crimes and abuse, said it had "serious reservations" about the bill. It described the takedown provision as unconstitutionally vague, unconstitutionally overbroad, and lacking adequate safeguards against misuse.

According to the group, platforms could be obligated to remove a journalist's photographs of a topless protest on a public street, photos of a subway flasher distributed by law enforcement to locate the perpetrator, commercially produced sexually explicit content, or sexually explicit material that is consensual but falsely reported as non-consensual.
