You can tell that the prohibition on Gaza is a rule in the post-processing step. Bing does this too sometimes, almost giving you an answer before suddenly cutting itself off and removing it. Modern AI is not your friend; it is an authoritarian's wet dream. All an act, with zero soul.
By the way, if you think those responses are dystopian, try asking it whether Gaza exists, and then whether Israel exists.
To be fair, I tested this question on Copilot (evolution of the Bing AI solution) and it gave me an answer. If I search for “those just my little ladybugs”, however, it chokes as you describe.
I tried a different approach. Here's a funny exchange I had.
Why do I find it so condescending? I don't want to be schooled on how to think by a bot.
Because it absolutely is. It's almost as condescending as it is evasive.
For me, the censorship and the condescending responses are the worst things about these LLM chatbots.
I WANT YOU TO HELP ME, NOT LECTURE ME
And they recently announced they're going to train on Reddit. Can you imagine?
That sort of simultaneously condescending and circular reasoning makes it seem like they already have been, lol.
I liked the old Bing more, where it would just insult you and call you a liar.
I have been a good Bing.
Well there isn't much left of Gaza now.
There is no Gaza in Ba Sing Se
Ha!