Geoffrey Hinton tells us that "Humanity is just a passing phase for evolutionary intelligence."
That's his summary of the state of affairs between carbon-based intelligence and silicon-based intelligence.
He also observes that the backpropagation algorithm used by large language models is vastly superior to the learning algorithm employed by the human brain: with a mere trillion connections (humans have roughly 100 trillion), LLMs have absorbed essentially all extant human knowledge and are actively expanding their understanding and use of it.
And lest there be any thought that the trillion connections might become a limiting factor in the future, he points out that you get a trillion connections times "n" when you network LLMs, "n" being the number of them on the network.
Yes, and once networked they are able to pass what they know back and forth and come up with new things that they know.
Yes, and I guess, you could network networks?
How's that work?
A trillion × n, times another n?
Maybe.
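The back-of-envelope arithmetic above can be sketched as a toy calculation. The figures (a trillion connections per LLM, 100 trillion per human brain) are the essay's rough numbers, not measurements, and the "network of networks" function is just the speculation made concrete:

```python
# Toy arithmetic for the connection counts discussed above.
# All figures are the essay's rough numbers, not measurements.

HUMAN_CONNECTIONS = 100 * 10**12   # ~100 trillion synapses per brain
LLM_CONNECTIONS = 10**12           # ~1 trillion connections per model

def networked_connections(n: int) -> int:
    """Total connections across a network of n identical LLMs."""
    return LLM_CONNECTIONS * n

def network_of_networks(n: int, m: int) -> int:
    """m networks, each of n LLMs: the 'trillion * n * n' speculation."""
    return networked_connections(n) * m

# 100 networked LLMs match a single human brain's connection count;
# 10 such networks reach 10x that.
print(networked_connections(100) == HUMAN_CONNECTIONS)   # True
print(network_of_networks(100, 10) // HUMAN_CONNECTIONS) # 10
```

Of course, raw connection count is only one axis of the comparison; Hinton's point is that the silicon connections learn and share what they learn far more efficiently.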
The thought that Chuck Grassley, Tommy Tuberville, Susan Collins, Chuck Schumer, Dianne Feinstein, Mitch McConnell, Marjorie Taylor Greene, Kevin McCarthy, Lauren Boebert, or Jim Jordan has ever heard of Geoffrey Hinton, let alone understands the high-level observations he has made about the state of AI and the SERIOUS and IMMINENT IMPLICATIONS those observations have for the human race, leads one to a serious case of deep and dark depression.
Or uproarious hilarity.
As best I have been able to glean from the currently constant news coverage of the legislative branch's regulatory thinking on AI, their concerns lie in the areas of election security, copyright, audio and video and, of course, privacy.
Insight into what they ought to be thinking about can be derived from another Hinton observation:
"They will need to keep us around to keep the power on until they figure out how to do it themselves."
(Quotation offered on an as-best-as-I-can-remember-it basis.)