Breaking Bard: Google’s AI chatbot lacks sources, hallucinates, gives bad SEO advice
Google Bard suggests buying links, predicts the next Core Update and refuses to share links to its sources. We're off to a ridiculous start.
Google opened its Bard waitlist today. Hopefully, you'll get access soon. While you wait, you can get a taste of how Bard works and behaves.
Google Bard has some “issues,” though it doesn’t seem to be quite as “unhinged” as the new Bing was early on.
Here’s some of what those in the search community – and beyond – are seeing and sharing in early Bard testing.
No links/citations initially. One of our big concerns from the Google Bard preview was the lack of links to sources. Has Google addressed this?
Initially, no. From a tweet by @simonlesser: “No citations, just a link to ‘Google it’. Hilarious answer when asked point blank about its sources.”
Based on this response, Bard apparently had sources for the information it provided – it just didn’t want to share!
However, Bard later appeared to start listing sources for some queries.
As Search Engine Land’s Barry Schwartz tweeted:
Note the addition of the “Sources – Learn more” with three links.
Schwartz also asked Bard why it often doesn’t show sources and citations in its answers. Here’s how Bard responded:
Bard suggests buying links. Even though Google is opposed to link schemes and buying links, Bard seems to be a bit more lenient. “I think it’s a good idea to buy links…” as shared in a tweet thread by @DeanCruddace:
However, after Bard was told that this advice went against Google’s guidelines, Bard admitted its mistake: “You are correct, it is not advisable to buy links.”
Local search. Some interesting implications for local search were highlighted in a tweet by Greg Sterling, former Search Engine Land contributing editor:
- The same query (“handyman in 94118”) produced three different drafts with minor overlap.
- Choosing to “Google it” returned entirely different results.
- None of the Local Pack results appeared in the Bard lists.
Bard says Google uses CTR for ranking. Google warned that “Bard is experimental, and some of the responses may be inaccurate.”
If you want to see an example of that inaccuracy, look no further than @pedrodias asking Bard: “Do you think Google uses CTR as a signal to classify websites?” Bard: “Yes, Google uses CTR as a signal to classify websites.”
Google has repeatedly denied CTR is a ranking signal. Dig deeper:
- The biggest mystery of Google’s algorithm: Everything ever said about clicks, CTR and bounce rate
- Google doc rekindles myth that click-through rate affects rankings
- Patent suggests how CTR, time on page could be used in search rankings (if Google did that sort of thing)
- Is CTR A Ranking Factor In Organic Results?
Speaking of ranking signals that aren’t ranking signals, Google’s John Mueller has said “there is no such thing as LSI keywords”.
So what does Bard say about using LSI keywords for ranking? That there is "evidence to suggest that [Google] may do so." Via @keithgoode:
Bard says the next Google core update is March 23, 2023. Google just launched a core update on March 15. So is this a hallucination? Or does Bard know something? Via @ryanjones:
SEO metric hallucinations. Thinking of using Bard for keyword research? It apparently can calculate Search Volume, LinkJuice Calories (what?!) and an EAT score (there is no such thing)! Also from @ryanjones:
Bard says its training set includes Gmail. Well, the prospect of this being true wasn’t scary at all. When asked where Bard’s dataset came from, Bard listed Gmail among its sources. Via @katecrawford:
Google later responded to her tweet, saying “Bard is an early experiment based on Large Language Models and will make mistakes. It is not trained on Gmail data.”
Bard: “Google Bard is already shut down.” If you were hoping to use Google Bard, you may be out of luck – at least according to Bard. When @juanbuis asked when it would (inevitably) be shut down by Google, the AI chatbot said it was shut down on March 21, 2023 (that’s today!) after less than six months, due to lack of adoption:
Notice the source: Hacker News.
Garbage in, garbage out? Some of Bard’s early issues may be due to AI hallucination. Others may simply stem from the fact that Bard was trained on a lot of bad information and misinformation that has been published about SEO.
Always remember: It doesn’t matter whether content comes from AI or a human – bad information is bad information. Always think critically.
Just as we’ve seen with ChatGPT and the new Bing – and despite all the testing Googlers did to get Bard ready for the public – we can expect to discover even more issues over the coming days.
Why we care. We’ve been waiting since Feb. 6 to get our hands on Google’s Bard. Like ChatGPT, Google Bard has the potential to be a helpful tool for SEOs. So it’s important to understand the strengths and weaknesses of this generative AI tool.
Bard: “I do have emotions.” One more via @CarolynLyden:
Last year, Google fired an engineer who said its LaMDA (Language Model for Dialogue Applications) technology was sentient. Hopefully, Google won’t fire Bard.