Thursday, March 30, 2023

Crossing The AI Rubicon

Members of the scientific-technological elite are urging a pause in the rush to create super-intelligent AI:

Contemporary AI systems are now becoming human-competitive at general tasks, and we must ask ourselves: Should we let machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling ones? Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete and replace us? Should we risk loss of control of our civilization? Such decisions must not be delegated to unelected tech leaders. Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable. This confidence must be well justified and increase with the magnitude of a system's potential effects. OpenAI's recent statement regarding artificial general intelligence, states that "At some point, it may be important to get independent review before starting to train future systems, and for the most advanced efforts to agree to limit the rate of growth of compute used for creating new models." We agree. That point is now.

Source - https://www.zerohedge.com/technology/musk-wozniak-call-pause-developing-more-powerful-ai-gpt-4

And here is a link to the letter itself:

https://futureoflife.org/open-letter/pause-giant-ai-experiments/

A wise warning indeed.

But I predict the powers that be (TPTB) will simply ignore them and cross the AI Rubicon at warp speed. They want total control of the entire Earth, all its inhabitants (every plant and animal), and all its resources. They see AI as the best way to accomplish this, and humanity be damned.

Perhaps this is the solution to the Fermi Paradox: at the point where civilizations become capable of communicating with the rest of the universe, they create AI, which then exterminates them or drives them back to their respective "stone ages" and keeps them quiet.

Update - 3/31/2023

Here is additional discussion:

https://www.foxnews.com/tech/ai-expert-warns-elon-musk-signed-letter-doesnt-enough-literally-everyone-earth-will-die

Yes - if we don't stop this now, we're in big trouble.
