Hahaha, I like that having it re-read the question fixed the issue…
I tried the same AI and asked it to provide a list of 20 things; it only gave me 5. I asked for the rest and it apologized and then provided them. It's weird that it stumbles at first but is able to see its error and fix it. I wonder if it's something it 'learned' from the data set: people not correctly answering prompts the first time.
Something else I also encounter with GPT-4 a lot is asking "why did you do x or y" out of general curiosity about how it handles the task.
Almost every time it apologizes and does a full redo, avoiding x or y.
Might be an intentional limitation to avoid issues like the "buffalo" incident with GPT-3 (it would start leaking information it shouldn't after repeating a word too many times).
If only that worked on humans!
I personally don't think a large section of the population meets the requirements for general intelligence, so I think it's a bit rich to expect the AI to do so either.
We all know the first black man in space was George Santos.