The request for the solicitor general’s views will likely delay the high court’s decision on whether to take up the matter. At issue is the constitutionality of state laws in Florida and Texas that limit how social media companies can moderate content on their platforms and require transparency about how such decisions are made.
Momentum for such laws grew on the right after major social media sites blocked Donald Trump in the wake of the Jan. 6, 2021, attack on the U.S. Capitol.
The court’s decisions could have far-reaching implications for the future of democracy and elections, as tech companies play an increasingly important role in spreading political news and hosting debate. The companies say that limiting their ability to moderate content could lead to an onslaught of hate speech, misinformation and other violent material.
The Supreme Court will likely have to address the issue during its next term, which begins in October. Federal appeals courts have reached conflicting rulings: the 11th Circuit Court of Appeals struck down much of Florida’s law, while the 5th Circuit upheld the Texas law.
Both the states and the tech industry have urged the Supreme Court justices to take up the cases, saying only the high court can settle the rules going forward.
In challenging the 5th Circuit’s ruling, the technology trade group NetChoice told the court that if the opinion remains intact, it threatens to overturn First Amendment precedents and transform online discourse as it currently exists.
Texas’s filing described the case in equally weighty terms. “A small number of modern communications platforms effectively control access to the modern digital public square,” Texas Attorney General Ken Paxton, a Republican, wrote in a Supreme Court petition. The platforms, he wrote, claim an absolute First Amendment right to exclude . . . whomever they want, for whatever reason, without explanation.
The cases set up the most significant test yet of claims that Silicon Valley companies unlawfully censor conservative views.
The Supreme Court already has two key technology cases scheduled for next month. In Gonzalez v. Google, set for argument on Feb. 21, the court will consider for the first time whether a statutory provision known as Section 230 shields technology companies from claims that their algorithmic recommendations of content cause harm. The suit was brought by the family of an American killed in Paris in an attack by Islamic State sympathizers; it alleges that the perpetrators may have been influenced by YouTube’s recommendations of content supporting terrorism.
The next day, the justices will consider Twitter v. Taamneh, which raises a related question about platforms’ responsibility for posts that support terrorism.
Will Oremus contributed to this report.