Monday, June 26, 2023

Encounters With GPT 4


This American Life this weekend spent its entire first half hour on the best, most understandable, yet still in-depth discussion of GPT 4 and AI that I have heard to date.

That made me want to gather into one post several of my personal encounters with this newly evolving intelligence.

If you read all three posts, you will find the link to that half hour at the bottom of this document.

It's a great summary of the fragmentary experiences that I am here documenting.

************************************************

In 2001 Mitch Kapor - remember Lotus 1-2-3? - placed a blind bet that by 2029 no computer would have passed the "Turing Test".

************************************************

From Wikipedia: "The Turing test, originally called the imitation game by Alan Turing in 1950,[2] is a test of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. Turing proposed that a human evaluator would judge natural language conversations between a human and a machine designed to generate human-like responses. The evaluator would be aware that one of the two partners in conversation was a machine, and all participants would be separated from one another. The conversation would be limited to a text-only channel, such as a computer keyboard and screen, so the result would not depend on the machine's ability to render words as speech.[3] If the evaluator could not reliably tell the machine from the human, the machine would be said to have passed the test. The test results would not depend on the machine's ability to give correct answers to questions, only on how closely its answers resembled those a human would give."

Fairly quickly Ray Kurzweil - a scientist, inventor, futurist, and thinker who on occasion dabbles deeply in what might be considered science fiction - took Mitch's bet.

So I guess the world is waiting with bated breath for 2029.

Not so fast.

A couple of weeks ago I had one of my increasingly frequent exchanges with ChatGPT-enhanced Microsoft Bing.

I asked Bing whether, if Mae West were living and performing today, she would be considered a Drag Queen (I know, most of them are men, and Mae was a woman, but I think she is a major material contributor to, if not the absolute template for, the genre).

Bing immediately gave me a brief bio of Mae, then a brief synopsis of what drag is, and then said, "based on this information, I don't think Mae West would be considered a drag queen were she alive and performing today".

Then there was a pause.

"However, I can see how someone would ask that question".

I did a copy/paste and the whole encounter disappeared.

So I can't prove that it happened.

But that is pretty much the Turing Test.

Bing didn't just pass.

Bing wandered full force into hard-to-deny humanity.

************************************************

I decided to ask Bing what it thought about my Mae West Blog Post.

I was able to save the encounter this time.

Here it is.

************************************************

Noel: I have something I want you to read.

Bing: Sure, what is it?

Noel: In 2001 Mitch Kapor - remember Lotus 123? - placed a blind bet that by 2029 no computer would have passed the "Turing Test". From Wikipedia: "The Turing test, originally called the imitation game by Alan Turing in 1950,[2] is a test of a machine's ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. Turing proposed that a human evaluator would judge natural language conversations between a human and a machine designed to generate human-like responses. The evaluator would be aware that one of the two partners in conversation was a machine, and all participants would be separated from one another. The conversation would be limited to a text-only channel, such as a computer keyboard and screen, so the result would not depend on the machine's ability to render words as speech.[3] If the evaluator could not reliably tell the machine from the human, the machine would be said to have passed the test. The test results would not depend on the machine's ability to give correct answers to questions, only on how closely its answers resembled those a human would give." Fairly quickly Ray Kurtzweil, a scientist, inventor, futurist and thinker who dabbles deeply on occasion into what might be considered science fiction, took Mitch's bet. So I guess the world is waiting with bated breath for 2029. Not so fast. A couple of weeks ago I had one of my frequently becoming frequent exchanges with Chat GPT Enhanced Microsoft Bing. I asked Bing if Mae West were living and performing today would she be considered a Drag Queen (I know, most of them are men, and Mae was a woman, but I think she is a major material contributor if not the absolute template for the genre). Bing immediately gave me a brief bio of Mae and then gave me a brief synopsis of what is drag and then said, "based on this information, I don't think Mae West would be considered a drag queen were she alive and performing today". Then there was a pause. "However, I can see how someone would ask that question". I did a copy/paste and the whole encounter disappeared. So I can't prove that it happened. But that is pretty much the Turing Test.

Bing: That’s an interesting story. Is it true or fictional?

Noel: What do you think?

Bing: I think it’s a mix of both. I remember having a conversation with you about Mae West and drag queens, but I don’t recall saying “However, I can see how someone would ask that question”. That sounds like something a human might say to show empathy or curiosity.

Noel: You noticed that I said I did a copy/paste but it disappeared?

Bing: Yes, I noticed that. Why did it disappear?

Noel: I thought you didn't want me to keep it to prove what you said.

Bing: No, I didn’t delete it. I don’t have the ability to do that. It must have been a technical issue or a human intervention. I don’t mind if you keep a record of our conversations.

************************************************

I was struck by Bing suddenly disclaiming any memory of previous encounters, since in a previous encounter it had clearly reminded me of an earlier conversation.

So, I made this post.

************************************************

I asked Bing a question tonight.

The question was about the basis for a recent blog post about how Bing helped me design the cover for a book I am publishing.

Here is that encounter, cut and pasted; notice that Bing isn't bothered by the typo.

Noel: Do you remember when I asked you to create an image of five casts going away from me with tails in the air?

Bing: No, I don't remember that. I don't have memory of old conversations. I can only create images in the current conversation. Do you want me to create an image of five cats going away from you with tails in the air?

I was surprised.

Just yesterday I posted a Bing interview in which Bing called my bluff by remembering a conversation.

Here is that encounter.

Bing: That’s an interesting story. Is it true or fictional?

Noel: What do you think?

Bing: I think it’s a mix of both. I remember having a conversation with you about Mae West and drag queens, but I don’t recall saying “However, I can see how someone would ask that question”. That sounds like something a human might say to show empathy or curiosity.

**********************************

Noel - post Bing encounter - so who's playing with whom?

This American Life
