
ChatGPT 4 is scarier than you thought


191 replies to this topic

#41 porter

porter

    Comptroller of Affairs with Potatoes

  • Patron
  • 17620 posts
  • Location: Colorado Springs, CO

Posted 19 March 2023 - 08:07 AM

Even if it does end up nuking us all, people would still prefer it say please and thank you first.

#42 Stains_not_here_man

Stains_not_here_man

    Phat O'Mic Chef Winner!

  • Patron
  • 105575 posts

Posted 20 March 2023 - 06:43 AM

It's funny to me, the headlines are like "ChatGPT can pass the bar exam!"

Yeah, so can I, if I'm allowed to Google the answers.

Don't get me wrong, it's impressive tech, but when you hear someone say it "knows" something, it doesn't. It has no concept of knowledge, or truth. It's mostly a really really good autocomplete...

#43 positiveContact

positiveContact

    Anti-Brag Queen

  • Patron
  • 65278 posts
  • Location: Limbo

Posted 20 March 2023 - 06:51 AM

What does it mean when you know something?

#44 Stains_not_here_man

Stains_not_here_man

    Phat O'Mic Chef Winner!

  • Patron
  • 105575 posts

Posted 20 March 2023 - 06:55 AM

What does it mean when you know something?


I would say it involves some level of understanding and being able to apply that knowledge in new situations, and being able to understand when something contradicts something else you know. If you "know" one thing is true, you likely know its opposite is not true.

ChatGPT can be "confidently wrong" a lot, because it doesn't actually "know" anything except the likelihood of particular words to appear near other words. So you can easily get it to tell you two opposite things and insist that both are true.
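To make the "autocomplete" point concrete, here's a toy sketch in Python: a bigram model built from nothing but word co-occurrence counts. This is nothing like GPT's actual architecture or scale, and the little corpus is made up; it's just meant to show the flavor of "pick a plausible next word," with no notion of whether the result is true.

```python
import random
from collections import Counter, defaultdict

# A tiny made-up "training corpus" -- just words in a row.
corpus = (
    "the model predicts the next word "
    "the model has no idea whether the next word is true "
    "the model only knows which words tend to follow other words"
).split()

# Count which word tends to follow which word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def autocomplete(word, length=10):
    """Continue from `word` by sampling the next word from bigram counts."""
    out = [word]
    for _ in range(length):
        candidates = following.get(out[-1])
        if not candidates:
            break
        words, counts = zip(*candidates.items())
        out.append(random.choices(words, weights=counts)[0])
    return " ".join(out)

print(autocomplete("the"))
# Prints fluent-ish strings like "the model has no idea whether the next
# word is true" -- at no point does anything check a fact or reason about one.
```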

It's important, I think, that people understand that this is NOT General Artificial Intelligence. It can't reason or deduce.

#45 Sidney Porter

Sidney Porter

    Comptroller of the Banninated

  • Members
  • 27592 posts
  • Location: Columbus OH

Posted 20 March 2023 - 06:56 AM

Here’s an attempt at posting a pic. You can see me and my boy’s reflection in R2’s eye. I’m about 80% done with the electronics.

https://imgur.com/gallery/lcC5RPh

That's a fun picture.

#46 positiveContact

positiveContact

    Anti-Brag Queen

  • Patron
  • 65278 posts
  • Location: Limbo

Posted 20 March 2023 - 07:14 AM

I would say it involves some level of understanding and being able to apply that knowledge in new situations, and being able to understand when something contradicts something else you know. If you "know" one thing is true, you likely know its opposite is not true.

ChatGPT can be "confidently wrong" a lot, because it doesn't actually "know" anything except the likelihood of particular words to appear near other words. So you can easily get it to tell you two opposite things and insist that both are true.

It's important, I think, that people understand that this is NOT General Artificial Intelligence. It can't reason or deduce.


There is a reason these are called neural networks. ChatGPT has something like 100 million nodes; we have 86 billion neurons. We are just more advanced.

#47 Stains_not_here_man

Stains_not_here_man

    Phat O'Mic Chef Winner!

  • Patron
  • 105575 posts

Posted 20 March 2023 - 08:07 AM

There is a reason these are called neural networks. ChatGPT has something like 100 million nodes; we have 86 billion neurons. We are just more advanced.


Yes and no. Enough said for now. I'll reiterate what I (and its own creators) have already said: it is a language model, it is not a general artificial intelligence.

#48 positiveContact

positiveContact

    Anti-Brag Queen

  • Patron
  • 65278 posts
  • Location: Limbo

Posted 20 March 2023 - 08:15 AM

Yes and no. Enough said for now. I'll reiterate what I (and its own creators) have already said: it is a language model, it is not a general artificial intelligence.

I think we attribute a lot of intelligence to ourselves that isn't deserved. This is a general statement, not directed at your thought :P

Also, what counts as general AI doesn't really seem to be agreed upon.

Edited by positiveContact, 20 March 2023 - 08:25 AM.


#49 Vagus

Vagus

    Just had an organism

  • Patron
  • 55201 posts
  • Location: There

Posted 20 March 2023 - 08:15 AM

Yes and no. Enough said for now. I'll reiterate what I (and its own creators) have already said: it is a language model, it is not a general artificial intelligence.

I generally agree, but think the danger is that it may surpass our own language abilities. And that threatens my scientific writing income (unless I start advertising Vagus Powered by Chat DPT)



#50 positiveContact

positiveContact

    Anti-Brag Queen

  • Patron
  • 65278 posts
  • Location: Limbo

Posted 20 March 2023 - 08:22 AM

The D!

#51 positiveContact

positiveContact

    Anti-Brag Queen

  • Patron
  • 65278 posts
  • Location: Limbo

Posted 20 March 2023 - 08:35 AM

To add to this: I think we are a lot more like AI than not, but people have attributed something special to humanity. There are other kinds of intelligence out there that we don't possess, and we appear to put less value on them.

#52 Vagus

Vagus

    Just had an organism

  • Patron
  • 55201 posts
  • Location: There

Posted 20 March 2023 - 08:46 AM

To add to this: I think we are a lot more like AI than not, but people have attributed something special to humanity. There are other kinds of intelligence out there that we don't possess, and we appear to put less value on them.

Humanity's big leap was the ability to put the puzzle together. Other species can build the pieces, maybe join a few together, but only we've been able to complete the picture.

 

AI will be able to turn that picture into a blueprint for novel creation. (imo)



#53 positiveContact

positiveContact

    Anti-Brag Queen

  • Patron
  • 65278 posts
  • Location: Limbo

Posted 20 March 2023 - 08:51 AM

I just keep coming back to: what is fundamentally happening when we "think"? Think about how predictable people can be and how often they will repeat certain actions in a predictable way. That tells me that we are just processing input in a fairly repeatable way. You know that guy who tells you the same story 10 times? Think you could get him to tell it again? I bet you could.

Example: when I see the d I am compelled to point it out :P

Edited by positiveContact, 20 March 2023 - 08:52 AM.


#54 Vagus

Vagus

    Just had an organism

  • Patron
  • 55201 posts
  • Location: There

Posted 20 March 2023 - 08:53 AM

I just keep coming back to: what is fundamentally happening when we "think"? Think about how predictable people can be and how often they will repeat certain actions in a predictable way. That tells me that we are just processing input in a fairly repeatable way. You know that guy who tells you the same story 10 times? Think you could get him to tell it again? I bet you could.

90% of my posts are predictable.

 

But back to that dude, it's funny how the story can come out exactly the same for some people. Like, say the correct thing to trigger the file, and the monologue will begin like he rehearses it six times a day. Meanwhile, the "other" dude's story will get crazier or mutate over time as if the files are getting corrupted.



#55 positiveContact

positiveContact

    Anti-Brag Queen

  • Patron
  • 65278 posts
  • Location: Limbo

Posted 20 March 2023 - 09:02 AM

I used to work with a guy who would repeat the same stories all the time. It was so bad we could stop him almost immediately because we knew he was about to tell the story. His family actually started just giving him "the hand" and once we found out about it we would also give this guy "the hand". He wasn't an idiot but it was weird how he could not remember who he had shared these stories with after so many repeats.

#56 porter

porter

    Comptroller of Affairs with Potatoes

  • Patron
  • 17620 posts
  • Location: Colorado Springs, CO

Posted 20 March 2023 - 09:22 AM

I used to work with a guy who would repeat the same stories all the time. It was so bad we could stop him almost immediately because we knew he was about to tell the story. His family actually started just giving him "the hand" and once we found out about it we would also give this guy "the hand". He wasn't an idiot but it was weird how he could not remember who he had shared these stories with after so many repeats.


If I'm telling someone a story, I usually preface it with "I've only got about five good stories" or similar, which I think makes them feel more comfortable cutting me off if they've heard it before. I think there are only about 20 stories extant anyway, with variety created by changing minor variables.

#57 Stains_not_here_man

Stains_not_here_man

    Phat O'Mic Chef Winner!

  • Patron
  • 105575 posts

Posted 20 March 2023 - 09:28 AM

To add to this: I think we are a lot more like AI than not, but people have attributed something special to humanity. There are other kinds of intelligence out there that we don't possess, and we appear to put less value on them.

Probably fair. But still, I think people are confused about its real capabilities. Like, let's say I ask it to tell me the most energy-efficient way to do some activity. It's not going to conduct some kind of analysis, look at all the variables, and determine that for you. It might very well say "the best way is to do X" and be completely wrong, because it's really just coming up with the most likely words based on its input patterns. It doesn't actually do "analysis" or comparisons of things, etc. It just says the things that something which actually did those things would most likely say, based entirely on patterns it finds in things that have been said before.

Like I saw some article suggesting it could help you plan the most effective school shooting. I don't think so. It could certainly describe such scenarios (absent its filters) but it has no way to determine the "most effective way" because it's not actually trying to do that.
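And for anyone who hasn't poked at it directly, here's roughly what a call looks like (a minimal sketch, assuming the openai Python package's current ChatCompletion interface and an API key set in your environment; the model name and prompt are just placeholders). The point is that you get fluent, confident prose back either way; nothing in the call runs any actual analysis.

```python
import os
import openai  # assumes `pip install openai` (the early-2023 interface)

openai.api_key = os.environ["OPENAI_API_KEY"]  # assumes your key is set

resp = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # placeholder model name for illustration
    messages=[
        {"role": "user",
         "content": "What is the most energy-efficient way to boil water?"},
    ],
    temperature=0.7,
)

# The reply will read as confident and fluent whether or not it's correct.
# No energy analysis happened anywhere in this call -- it's next-word
# prediction over patterns in the training text.
print(resp["choices"][0]["message"]["content"])
```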

Edited by Stains_not_here_man, 20 March 2023 - 09:29 AM.


#58 porter

porter

    Comptroller of Affairs with Potatoes

  • Patron
  • 17620 posts
  • Location: Colorado Springs, CO

Posted 20 March 2023 - 09:33 AM

Brains are basically just massive probability calculators, in a sense. However, I agree that calling this something like an "artificial language integrator" is more appropriate than just "AI." This isn't what Minsky and others would have meant by AI. It isn't sampling environmental data and integrating independent analytic processes with motor outputs. It's quite a ways from that. At least, that's my perspective as a former neuroscientist.

#59 positiveContact

positiveContact

    Anti-Brag Queen

  • Patron
  • 65278 posts
  • Location: Limbo

Posted 20 March 2023 - 09:42 AM

Probably fair. But still, I think people are confused about its real capabilities. Like, let's say I ask it to tell me the most energy-efficient way to do some activity. It's not going to conduct some kind of analysis, look at all the variables, and determine that for you. It might very well say "the best way is to do X" and be completely wrong, because it's really just coming up with the most likely words based on its input patterns. It doesn't actually do "analysis" or comparisons of things, etc. It just says the things that something which actually did those things would most likely say, based entirely on patterns it finds in things that have been said before.

Like I saw some article suggesting it could help you plan the most effective school shooting. I don't think so. It could certainly describe such scenarios (absent its filters) but it has no way to determine the "most effective way" because it's not actually trying to do that.

It's fair to say the general public doesn't understand AI, machine learning, etc. No disagreement there.

But your examples...

People assuredly say incorrect stuff all the time and don't do a detailed analysis. They just guess. Have you dealt with people who "fake it until they make it"? What they are doing is not far off from ChatGPT. This thing is trying to solve a specific problem though. I agree it's not very general-purpose.

Edited by positiveContact, 20 March 2023 - 09:44 AM.


#60 Stains_not_here_man

Stains_not_here_man

    Phat O'Mic Chef Winner!

  • Patron
  • 105575 posts

Posted 20 March 2023 - 09:45 AM

People assuredly say incorrect stuff all the time and don't do a detailed analysis. They just guess.


Sure, but when people think of an AI they aren't thinking of an advanced guessing machine.

