Celebrities from Prince Harry to Steve Bannon call for ban on AI 'superintelligence.' What is it?

(Nexstar) – An open letter calling for a ban on the development of “superintelligence” by AI companies has received support from former royals, Hollywood actors, conservative political commentators and a former U.S. national security adviser.

The diverse group of signatories is calling on all companies to stop developing advanced forms of artificial intelligence unless it can be done safely and controllably.

The letter warns that the type of AI companies are building will “essentially outperform all humans in all cognitive tasks.”

The statement continues, “This has raised concerns ranging from human economic obsolescence and disempowerment, loss of freedom, civil liberties, dignity and control, to national security risks and even potential human extinction.”

What is AI ‘Superintelligence’?

In discussions of AI technology, “superintelligence” is sometimes also referred to as artificial general intelligence or AGI.

It is not a technical term with a universally accepted definition, but “a serious, albeit ill-defined concept,” AI scientist Geoffrey Hinton told The Associated Press last year.

He said, “I use it to mean an AI that is almost as good at all cognitive things as humans are.”

“Superintelligence” research is not about building a specific AI device. It’s about creating a “thinking machine,” said Pei Wang, a professor who teaches AGI courses at Temple University. Such an AI would be able to reason, plan and learn from experience the way people do.

According to the AP, OpenAI, Amazon, Google, Meta, and Microsoft have all invested heavily in researching it. Some AI experts warn that companies are in an arms race to develop technology that they cannot guarantee they will be able to fully control.

In an interview with Ezra Klein of The New York Times, AI researcher Eliezer Yudkowsky described a scenario in which “Now AI is completely redesigning itself. We don’t know what’s going on there. We don’t even understand the thing that’s driving AI.”

Even so, rather than backing away, a company may keep investing heavily to ensure its technology stays ahead of its competitors’.

“And of course, if you create superintelligence, you don’t have to have superintelligence — superintelligence has you,” Yudkowsky said.

While some are concerned that AI will spin out of control, there is also criticism that developers sometimes overstate the capabilities of their products. OpenAI recently drew ridicule from mathematicians and AI scientists after one of its researchers claimed that ChatGPT had solved unsolved math problems, when what it had really done was find and summarize work that was already online.

Who has signed the letter?

Prince Harry and his wife Meghan, Duchess of Sussex, made headlines on Wednesday for joining others in signing the warning letter. Actors Stephen Fry and Joseph Gordon-Levitt and musician will.i.am also signed.

Two prominent conservative commentators, Steve Bannon and Glenn Beck, have also signed. Also on the list are Apple co-founder Steve Wozniak; British billionaire Richard Branson; former Chairman of the U.S. Joint Chiefs of Staff Mike Mullen, who served under Republican and Democratic administrations; and Democratic foreign policy expert Susan Rice, who was national security adviser to President Barack Obama.

They join AI pioneers, including Yoshua Bengio and Geoffrey Hinton, who are co-winners of the Turing Award, computer science’s top prize. Hinton also won the Nobel Prize in Physics last year. Both have been vocal in drawing attention to the dangers of the technology they helped create.

“This is not a ban or even a moratorium in the usual sense,” wrote another signatory, AI pioneer Stuart Russell, a computer science professor at the University of California, Berkeley. “It is simply a proposal to require substantial safeguards for a technology that, according to its developers, has a significant chance of causing human extinction. Is that too much to ask?”

The Associated Press contributed to this report.
