Protecting the integrity of U.S. elections will require a massive regulatory overhaul, academics say

Ahead of the 2020 elections, former Facebook chief security officer Alex Stamos and his colleagues at Stanford University have unveiled a sweeping new plan to secure U.S. electoral infrastructure and combat foreign campaigns seeking to interfere in U.S. politics.

As the Mueller investigation into electoral interference made clear, foreign agents from Russia (and elsewhere) engaged in a strategic campaign to influence the 2016 U.S. elections. As the chief security officer of Facebook at the time, Stamos was both a witness to the influence campaign on social media and a key architect of the efforts to combat its spread.

Along with Michael McFaul, a former U.S. ambassador to Russia, and a host of other academics from Stanford, Stamos lays out a multi-pronged plan that incorporates securing U.S. voting systems, providing clearer guidelines for advertising and the operations of foreign media in the U.S., and integrating government action more closely with media and social media organizations to combat the spread of misinformation or propaganda by foreign governments.

The paper lays out a number of suggestions for securing elections, including:

  • Increase the security of U.S. election infrastructure.
  • Explicitly prohibit foreign governments and individuals from purchasing online advertisements targeting the American electorate.
  • Require greater disclosure measures for FARA-registered foreign media organizations.
  • Create standardized guidelines for labeling content affiliated with disinformation campaign producers.
  • Mandate transparency in the use of foreign consultants and foreign companies in U.S. political campaigns.
  • Foreground free and fair elections as part of U.S. policy and identify election rights as human rights.
  • Signal a clear and credible commitment to respond to election interference.

Enacting all of these policy recommendations would require a lot of heavy lifting from Congress and from media and social media companies, and many of them speak to core issues that policymakers and corporate executives are already attempting to manage.

For lawmakers, that means drafting legislation that would require paper trails for all ballots and improve threat assessments of computerized election systems, along with a complete overhaul of campaign laws related to advertising, financing, and press freedoms (for the foreign press).
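
Paper trails are what make after-the-fact verification of machine counts possible at all. As a rough illustration of why (this sketch is not taken from the Stanford paper), here is a minimal ballot-polling risk-limiting audit in the style of BRAVO; the contest size, winner share, and risk limit are hypothetical values chosen for the demo.

```python
# Illustrative only: a minimal ballot-polling risk-limiting audit in the
# style of BRAVO. Paper ballots are what make this kind of check possible;
# all parameters below are hypothetical values for the demo.
import random

def ballot_polling_audit(ballots, reported_winner_share, risk_limit=0.05, seed=1):
    """Sample paper ballots in random order until the reported outcome is
    confirmed at the given risk limit, or the ballots run out (which would
    signal escalation to a full hand recount)."""
    rng = random.Random(seed)
    order = rng.sample(range(len(ballots)), len(ballots))  # random audit order
    t = 1.0  # Wald sequential test statistic against the null of a tied race
    for n, i in enumerate(order, start=1):
        if ballots[i] == "W":            # ballot shows a vote for the reported winner
            t *= reported_winner_share / 0.5
        else:                            # ballot shows a vote for the loser
            t *= (1 - reported_winner_share) / 0.5
        if t >= 1 / risk_limit:
            return f"outcome confirmed after examining {n} paper ballots"
    return "audit did not confirm the outcome; escalate to a full hand recount"

if __name__ == "__main__":
    # Hypothetical contest: 10,000 paper ballots, winner reported at 54%.
    ballots = ["W"] * 5400 + ["L"] * 4600
    print(ballot_polling_audit(ballots, reported_winner_share=0.54))
```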

The Stanford proposals call for strict regulation of foreign involvement in campaigns, including a ban on foreign governments and individuals buying online ads that target the U.S. electorate with an eye toward influencing elections. The proposals also call for greater disclosure requirements identifying articles, opinion pieces, or media produced by foreign media organizations. Furthermore, any campaign working with a foreign company or consultant, or with significant foreign business interests, should be required to disclose those connections.

Clearly, the echoes of Facebook’s Cambridge Analytica and political advertising scandals can be heard in some of the suggestions made by the paper’s authors.

Indeed, the paper leans heavily on the use and abuse of social media and tech as a critical vector of attack on future U.S. elections. And the Stanford proposals don’t shrink from calling on legislators to demand that these companies do more to protect their platforms from being used and abused by foreign governments or individuals.

In some cases, companies are already working to enact suggestions from the report. Facebook, Alphabet, and Twitter have said that they will work together to coordinate and encourage the spread of best practices. Media companies need to create (and are working to create) norms for handling stolen information. Labeling manipulated videos or propaganda (or articles and videos that come from sources known to disseminate propaganda) is another task that platforms are undertaking, but an area where there is still significant work to be done (especially when it comes to deepfakes).

As the report’s authors note:

Existing user interface features and platforms’ content delivery algorithms need to be utilized as much as possible to provide contextualization for questionable information and help users escape echo chambers. In addition, social media platforms should provide more transparency around users who are paid to promote certain content. One area ripe for innovation is the automatic labeling of synthetic content, such as videos created by a variety of techniques that are often lumped under the term “deepfakes”. While there are legitimate uses of synthetic media technologies, there is no legitimate need to mislead social media users about the authenticity of that media. Automatically labeling content, which shows technical signs of being modified in this manner, is the minimum level of due diligence required of the major video hosting sites.
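
As a hedged sketch of the "minimum level of due diligence" the authors describe, the snippet below shows how a hosting pipeline might attach a label when a video carries technical signs of modification. The forensic signals, threshold, and label text are all assumptions made for illustration; real platforms would rely on dedicated media-forensics models that this placeholder logic merely stands in for.

```python
# A hypothetical sketch of automatic labeling for synthetic media. The
# signal names, threshold, and label text are assumptions for illustration,
# not any platform's actual pipeline.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ForensicSignals:
    """Signals an upload pipeline might compute for each video."""
    reencoded_after_capture: bool   # container re-muxed after the original encode
    face_artifact_score: float      # 0.0-1.0 from a (hypothetical) deepfake detector
    editing_tool_tags: List[str] = field(default_factory=list)  # tools named in metadata

ARTIFACT_THRESHOLD = 0.8  # assumed cutoff; a real system would tune this per model

def synthetic_label(signals: ForensicSignals) -> Optional[str]:
    """Return a user-facing label when a video shows technical signs of
    manipulation, or None when no label is warranted."""
    if signals.face_artifact_score >= ARTIFACT_THRESHOLD:
        return "This video shows technical signs of synthetic manipulation."
    if signals.reencoded_after_capture and signals.editing_tool_tags:
        return "This video appears to have been edited after capture."
    return None

if __name__ == "__main__":
    suspect = ForensicSignals(
        reencoded_after_capture=True,
        face_artifact_score=0.93,
        editing_tool_tags=["FaceSwapTool"],  # hypothetical tool name
    )
    print(synthetic_label(suspect))  # -> synthetic-manipulation label
```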

There’s more work that needs to be done to limit the targeting capabilities of political advertising and to improve transparency around paid and unpaid political content as well, according to the report.

Somewhat more troubling is the report’s call to remove barriers to sharing information about disinformation campaigns, a change that would require amendments to privacy laws.

Here’s the argument from the report:

At the moment, access to the content used by disinformation actors is generally restricted to analysts who archived the content before it was removed or governments with lawful request capabilities. Few organizations have been able to analyze the full paid and unpaid content created by Russian groups in 2016, and the analysis we have is limited to data from the handful of companies who investigated the use of their platforms and were able to legally provide such data to Congressional committees. Congress was able to provide that content and metadata to external researchers, an action that is otherwise proscribed by U.S. and European law. Congress needs to establish a legal framework within which the metadata of disinformation actors can be shared in real-time between social media platforms, and removed disinformation content can be shared with academic researchers under reasonable privacy protections.
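
To give a sense of what "reasonable privacy protections" might look like in practice, here is a minimal sketch in which account identifiers are pseudonymized with a salted hash before metadata leaves a platform, letting researchers link one actor's activity across datasets without ever seeing raw account IDs. The record fields and the shared-salt scheme are assumptions; the actual framework would be whatever the legislation the report calls for defines.

```python
# A hypothetical sketch of privacy-protected sharing of disinformation-actor
# metadata. Field names and the shared-salt scheme are assumptions, not a
# real platform or legal standard.
import hashlib
from dataclasses import dataclass, asdict

# Hypothetical salt agreed among participating platforms; in practice it
# would be rotated and held under the legal framework's controls.
SHARED_SALT = b"cross-platform-research-salt"

def pseudonymize(account_id: str) -> str:
    """Replace a raw account ID with a stable pseudonym so researchers can
    link one actor's activity across datasets without seeing the real ID."""
    return hashlib.sha256(SHARED_SALT + account_id.encode("utf-8")).hexdigest()[:16]

@dataclass
class SharedRecord:
    actor_pseudonym: str   # never the raw account ID
    platform: str
    content_type: str      # e.g. "paid_ad" or "organic_post"
    posted_at: str         # ISO-8601 timestamp
    removal_reason: str    # the platform's stated reason for the takedown

def to_shared_record(account_id: str, platform: str, content_type: str,
                     posted_at: str, removal_reason: str) -> SharedRecord:
    return SharedRecord(pseudonymize(account_id), platform,
                        content_type, posted_at, removal_reason)

if __name__ == "__main__":
    record = to_shared_record("user-123456", "ExamplePlatform", "paid_ad",
                              "2016-10-01T12:00:00Z",
                              "coordinated inauthentic behavior")
    print(asdict(record))  # safe-to-share metadata, pseudonymized
```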

Ultimately, these suggestions are meaningless without real action from Congress and the President to ensure the security of elections. As the events of 2016, documented in the Mueller report, revealed, there are substantial holes in the safeguards erected to secure our elections. As the country looks for a place to build walls for security, perhaps one around election integrity would be a good place to start.
