Recently, I started experimenting with Bing's ChatGPT-powered chat tab. Here's the first thing I asked it:
[Image: https://www.aiweirdness.com/content/images/2023/03/AI-Weirdness-blog-examples-annotated.jpg]
Red boxes mark the factual errors. These aren't typos or facts taken out of context; they're complete fabrications that have never appeared on my blog. Asking for clarification or further details didn't help. Of the paint colors I asked it for, 40% were never on my blog, and many of the color descriptions were wrong.
[Image: https://www.aiweirdness.com/content/images/2023/03/AI-Weirdness-paint-color-examples-annotated.jpg]
Red boxes indicate factual errors
When I tried to point out a problem, it doubled down, generating more imaginary facts to back up its original position.
[Image: https://www.aiweirdness.com/content/images/2023/03/AI-Weirdness-paint-color-doubling-down.png]
(I did create some paint colors with HSV encoding, but none of the colors it listed were actually generated by the neural net.)
These aren't cherry-picked examples. I doubt it generated a single response that didn't contain at least one made-up fact, and many contained several. The example below conveyed the gist of my latest blog post, but it didn't include all of the post's examples.
[Image: https://www.aiweirdness.com/content/images/2023/03/controversial-ai-weirdness-posts-part-2.png]
Bing chat isn't a search engine; it's playing the role of one. It's trained to predict internet text, so it fills in the lines of the search engine (or chatbot) in a hypothetical exchange with a user. It's drawing on many examples of internet dialog, which is why it so often slips into internet-argument mode, where people double down on incorrect facts and back up their position with still more data.
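To make the role-playing point concrete, here's a minimal, purely illustrative sketch in Python. Nothing in it is how Bing chat is actually built; the `complete()` function is a stand-in for a language model's text prediction, not a real API, and the sketch deliberately leaves out any retrieval step to highlight the failure mode described above.

```python
# Purely illustrative: a text predictor playing the role of a search engine.
# complete() is a stand-in for a large language model; it is not a real API,
# and nothing in this sketch ever checks the web or any database.

def complete(prompt: str) -> str:
    """Stand-in for an LLM: returns a continuation that *sounds* like a
    helpful answer. A real model does this statistically, but it likewise
    has no built-in step that verifies the facts it generates."""
    return ("Sure! Here are three posts from that blog: [plausible-sounding "
            "titles that nothing in this code ever verified].")

def pretend_search(user_question: str) -> str:
    # The "search engine" exists only as framing inside the prompt text.
    prompt = (
        "A conversation between a helpful search assistant and a user.\n"
        f"User: {user_question}\n"
        "Search assistant:"
    )
    # Whatever best continues the dialog gets returned, whether or not
    # the facts in it correspond to anything real.
    return complete(prompt)

print(pretend_search("What are some posts on the AI Weirdness blog?"))
```

The only criterion anywhere in that loop is "sounds like a good answer"; whether the answer exists never comes into it.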
When it mentioned a nonexistent AI Weirdness post about Battlestar Galactica, I challenged it. In response, it invented a separate AI Weirdness newsletter, and finally fabricated a complete excerpt from it.
[Image: https://www.aiweirdness.com/content/images/2023/03/AI-Weirdness-battlestar-galactica.png]
See the mouseover text for the reference.
Ironically, the reference it gives for its fake Battlestar Galactica episode summaries is actually my post on Galactica, a chat-based "knowledge platform" that can also make up facts. When asked about my Galactica post, Bing gives the general idea but not the details.
[Image: https://www.aiweirdness.com/content/images/2023/03/ai-weirdness-galactica.png]
It is absurd that large tech companies are marketing chatbots as search engines. People have already been misled into asking librarians and authors for references that don't exist, or into contacting a Signal number that turned out to belong to Dave.
A search engine that will "find" whatever you're looking for, whether or not it actually exists, is worse than useless.
This blog is also available as a newsletter that I send out via email. Bonus posts are available for supporters; you can read one of them here, in which I ask Bing chat for a list of 50 AI-generated colors for my blog.
————————————————————————————————————————————————————————————
By: Janelle Shane
Title: Search or fabrication?
Sourced From: www.aiweirdness.com/search-or-fabrication/
Published Date: Sat, 11 Mar 2023 20:09:46 GMT