Elon Musk warns of AI's impact on elections, calls for US oversight: 'Things are getting weird... fast'

Elon Musk called on the government to regulate artificial intelligence chatbots to protect the public from the emerging threats associated with the new technology.

Twitter CEO and tech billionaire Elon Musk told Tucker Carlson that AI will likely have a "significant influence" on future elections, and called on the U.S. government to establish some level of oversight to sensibly regulate the rapidly emerging technology.

The Tesla and SpaceX CEO, who has been outspoken about his concerns with AI and the dangers associated with it, said earlier in his conversation with Carlson that he fears AI could lead to "civilizational destruction" if mismanaged.

Asked how he thinks the new technology will impact democracy in the future, Musk told Carlson, "Well, that’s why I raised the concern of AI being a significant influence in elections.


 "Even if you say that AI doesn’t have agency, well it’s very likely that people will use the AI as a tool in elections. And then, you know, if AI's smart enough, are they using the tool or is the tool using them? So I think things are getting weird, and they’re getting weird fast," he told the Fox News host in part two of "Tucker Carlson Tonight" interview that aired Tuesday.

In part one of the "Tucker Carlson Tonight" interview on Monday, Musk revealed his plans to develop his own artificial intelligence chatbot, "TruthGPT," to counter bias from ChatGPT and other AI chatbots. Musk said he fears slanted or ideological programmers will use AI technology to "lie" and spread falsehoods without consequence.

"What's happening is they're training the AI to lie. It's bad," he told Carlson. "AI is more dangerous than, say, mismanaged aircraft design or production maintenance or bad car production," he explained. "In the sense that it has the potential, however, small one may regard that probability, but it is non-trivial, it has the potential of civilization destruction."


Musk said that without government oversight regulating the rapidly developing software, AI could quickly become a "danger to the public" and pose a real threat to the future of mankind.

"It's already past the point of what most humans can do," he told Carlson. "Most humans cannot write as well as ChatGPT and certainly no human can write that well that fast, to the best of my knowledge. Maybe Shakespeare"

Pundits and others have suggested blowing up AI server farms as a last resort if the technology surpasses human control. Musk said he agrees that the U.S. government should have a contingency in place to power down server farms in the event of an emergency, but offered a more subtle solution.

"I'm not suggesting we blow up the server centers right now but there may be some – it may be wise to have some sort of contingency plan where the government’s got an ability to shut down power to these service centers. Like you don’t have to blow it up, you can just cut the power," he told Carlson.

If "the administrator passwords, if they somehow stopped working, where we can’t slow down or you know – I don’t have a precise answer," he added. "But if it’s something that we’re concerned about and are unable to stop it with software commands, then we probably want to have some kind of hardware off switch."


 
