AI Answers: How they Impact Content Production
Generative AI Answers can deliver instant answers from your company’s knowledge base.
Brand entities intent on “selling” over providing helpful content will likely find themselves missing from vast question-answer search engine result pages (SERPs).
The remarkable evolution and advancing pace of generative AI are built on data foundations. However, success with these technologies boils down to the quality of that data and your prompt refinements. You need the correct data when combining generative AI, search, and content marketing to publish question-answer content.
Table of Contents
- AI Answers: How they Impact Content Production
- Google Improves Bard’s Answer Accuracy with Search Injections
- AI Content Marketing Workflows to Win Answer SERPs
- Tools for AI-Generated Question Answering Content
- Concern for Fake or Misleading AI Answers
- AI Answer Content Must Factually Support Health Claims
- Google Search Generative Experience’s AI-Powered Answers
- How Google AI may Formulate Quality Answers
- How to Evaluate and Edit AI Generated Answers
- Generative AI Answer Outputs Need Human Reviewers
- SUMMARY: AI Answers Can Ignite In-Person Conversations
You can create better answers once you realize that most AI tools rely primarily on internet-based information. Many can’t respond to prompts or questions about proprietary content or knowledge. According to seagate.com, “Data is exploding, with the IDC projecting that the size of global data will reach 175 zettabytes by 2025.” This article helps you provide correct answers that can show up in this sea of data.
“Getting the answer right could mean the difference between higher sales or losing to the competition. But new Wharton research shows that 57% of marketers are incorrectly crunching the data and potentially getting the wrong answer — and perhaps costing companies a lot of money.” – What Marketers are Doing Wrong in Data Analytics 
Google Improves Bard’s Answer Accuracy with Search Injections
An October 5, 2023 Google publication discusses how “in-context learning is an appealing alternative in which real-time knowledge can be injected into an LLM’s prompt for conditioning generation.” Work to further augment LLMs with web search results (Lazaridou et al., 2022; Press et al., 2022) aims to improve LLM factuality, and with it the quality of Google Search outputs.
“Most large language models (LLMs) are trained once and never updated; thus, they lack the ability to dynamically adapt to our ever-changing world. In this work, we perform a detailed study of the factuality of LLM-generated text in the context of answering questions that test current world knowledge. Specifically, we introduce FRESHQA, a novel dynamic QA benchmark encompassing a diverse range of question and answer types, including questions that require fast-changing world knowledge as well as questions with false premises that need to be debunked.” – Refreshing Large Language Models With Search
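A minimal sketch of this search-injection idea, assuming a generic retrieval step; the function names and snippet format here are my own illustration, not Google's implementation:

```python
def build_fresh_prompt(question, search_results):
    """Prepend retrieved search snippets to the prompt so the LLM
    conditions its answer on current evidence rather than on stale
    training data (the in-context learning idea described above)."""
    evidence = "\n".join(
        f"[{r['date']}] {r['title']}: {r['snippet']}" for r in search_results
    )
    return (
        "Answer using ONLY the evidence below. If the evidence is "
        "insufficient, say so.\n\n"
        f"Evidence:\n{evidence}\n\nQuestion: {question}\nAnswer:"
    )

# Hypothetical retrieved results, for illustration only:
results = [
    {"date": "2023-10-01", "title": "WHO report",
     "snippet": "China had 4.1m doctors in 2021."},
]
prompt = build_fresh_prompt("How many doctors are in China?", results)
```

The final prompt would then be sent to whichever LLM you use; the key point is that fresh evidence, not the model’s memory, anchors the answer.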
The Google study classifies question-answer retrieval into four categories:
- Never-changing: in which the answer almost never changes.
- Slow-changing: in which the answer typically changes over the course of several years.
- Fast-changing: in which the answer typically changes within a year or less.
- False-premise: which includes questions whose premises are factually incorrect and thus have to be rebutted.
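One practical way to use this taxonomy in an editorial workflow is to set a content-review cadence per category. A sketch; the intervals below are my own suggestion, not from the study:

```python
# Suggested content-review intervals, keyed by the study's four categories.
REVIEW_INTERVAL_DAYS = {
    "never-changing": 365,   # verify roughly once a year
    "slow-changing": 180,    # answers drift over several years
    "fast-changing": 30,     # answers change within a year or less
    "false-premise": 90,     # re-check that the rebuttal still holds
}

def next_review_due(category, last_reviewed_day):
    """Return the day number when a Q&A page should be re-checked."""
    return last_reviewed_day + REVIEW_INTERVAL_DAYS[category]

print(next_review_due("fast-changing", 0))  # → 30
```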
AI prompt engineering skills are vital to produce quality content
It may be easy to assume that prompt engineering is similar to writing Google searches; however, it is more about knowing “how Google or GPT-4 thinks” and how Bard or the latest algorithm works.
On May 12, 2023, at Google I/O, we heard Philip Moyer, Global VP of AI & Business Solutions at Google Cloud, indicate that it’s possible to get fresh answers with the right retrieval methods.
“For…frequently-changing data, organizations can limit the information the model considers. Rather than asking the model to devise answers from training data, they can direct it to specific sources that are constantly refreshed, like an internal repository or a public database. By narrowing the aperture, and focusing the model on specific information instead of every bit of data to which it’s ever been exposed, organizations can enjoy more accurate responses based on the most up-to-date data.” – The Prompt: Invest in AI platforms, not just models + a recap of Google I/O 
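A sketch of this “narrowing the aperture” idea: retrieve only from a curated, regularly refreshed repository and decline to answer when nothing matches. The repository contents and lookup logic are illustrative stand-ins:

```python
# A tiny stand-in for an internal repository that is refreshed regularly.
INTERNAL_REPO = {
    "return policy": "Items can be returned within 30 days with a receipt.",
    "support hours": "Support is open 9am-5pm ET, Monday through Friday.",
}

def answer_from_repo(question):
    """Answer only from the curated repository; otherwise decline,
    instead of letting a model extrapolate from stale training data."""
    q = question.lower()
    for topic, fact in INTERNAL_REPO.items():
        if topic in q:
            return fact
    return "No answer found in the current repository."

print(answer_from_repo("What is your return policy?"))
```

In a production system the keyword lookup would be replaced by proper retrieval (embeddings, search index), but the refusal path is the point: the model never answers from sources you don’t control.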
AI answer optimization
Significant funds and research are continually invested in improving AI answer optimization methodology. Better AI question answering entails generating fresh prompts, all with the singular goal of enhancing answer accuracy.
Your prompts provide key criteria that refine the responses the model outputs. Take SEO best practices into account when preparing your prompts, and specify that the model output only factual answers. This should improve the quality of the answers! These models generate text as an extrapolation from your prompt.
Google Research Scientist Tu Vu announced Google’s FRESHPROMPT. It is introduced as being able to respond to a specific question by taking full advantage of a search engine, extracting all up-to-date and relevant information (including knowledge from related questions that search users also ask). It then leverages few-shot in-context learning to teach a model to reason over the retrieved evidence and come up with the most relevant and up-to-date answer. It collects evidence and stores it in a unified format that identifies the source webpage, date, title, text snippet, and highlighted words.
He states that “We show that FRESHPROMPT significantly boosts LLMs’ factuality: for example, our best GPT-4 + FRESHPROMPT variant yields an improvement of 32.6% and 49.0% accuracy over the vanilla GPT-4 on FRESHQA under RELAXED and STRICT, respectively.”
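The unified evidence format described above might look like this as a data structure. This is a sketch based only on the fields the paper names; the field and function names are mine:

```python
from dataclasses import dataclass

@dataclass
class Evidence:
    """One retrieved item in the unified format FRESHPROMPT describes:
    source page, date, title, text snippet, and highlighted words."""
    source_url: str
    date: str          # ISO date of the source page
    title: str
    snippet: str
    highlighted: list

def sort_most_recent_first(items):
    # Reasoning over evidence typically favors the freshest sources.
    return sorted(items, key=lambda e: e.date, reverse=True)

items = [
    Evidence("https://example.com/a", "2022-01-05", "Old", "…", []),
    Evidence("https://example.com/b", "2023-09-30", "New", "…", []),
]
print(sort_most_recent_first(items)[0].title)  # → New
```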
I like how the document provides an image that labels Google’s generative AI answer display.
It also gives this list of types of Google Search results for one query:
- The answer box.
- Organic results.
- AI-populated knowledge graphs (KG).
- Questions and answers from crowdsourced QA platforms.
- Related questions that search users also ask.
One thing we are confident of: Google Search will continue to test and display evolving question-answer SERP types.
What AI answers mean for your content strategy
Every business entity needs a flexible, applicable marketing strategy that can adapt to a variety of scenarios. Skip the pain of investing your marketing efforts in people without the money, interest, or incentive to contact you. Rather, answer your audience’s burning questions.
We find queries asking “What is the AI that answers questions?” From a search engine perspective, the real question may be “What is the AI website that answers questions?” How does your website perform in Google’s answer engine?
While attempting to future-proof your business by introducing a culture of experimentation with AI may feel perilous, it’s riskier for marketers to fall behind.
AI Content Marketing Workflows to Win Answer SERPs
AI can augment your content workflows and editorial processes to increase the chances of your answers displaying on the SERP.
AI makes it easier for everyone to produce content faster. However, it cannot supply your tone, personality, and answers drawn from your personal experience. Search engines are evolving to sort low-quality AI-generated content from high-quality, human-edited content. According to Forbes Advisor, 64% of business owners expect AI to increase productivity. We urge embracing AI to increase content productivity, with awareness that it could backfire if you aren’t properly prepared.
With the jump in content production and publication, you need to easily organize, find and distribute your digital assets with increasing accuracy and efficiency.
Turn your answer-rich content and creative teams, enhanced with AI technology, into a workable process that increases content ROI. Aim to have a robust question-answering section in Google’s Knowledge Graph.
Tools for AI Generated Question Answering Content
The emergence of tools and controls includes AI generative APIs.
The following tools assist with generative AI marketing through the seeding of training data to predictably answer queries with amalgamated answers:
- Falcon 180B: An open-source LLM similar to Google’s PaLM-2.
- Wordlift AI Question Answering App (Free)
- Market Muse AI Dataset
- IBM Watsonx Assistant
AI-powered tools assist content creators by reducing both the time and effort required to consistently publish high-quality content. As tempting as it is, the goal isn’t just to create more content in less time. It’s about adding value for searchers. AI can also help content creators improve the accuracy and consistency of their work – if properly reviewed and edited.
We hear of new tools every week and are selective in which ones we try.
“Making simple and scalable controls, like Google-Extended, available through robots.txt is an important step in providing transparency and control that we believe all providers of AI models should make available. However, as AI applications expand, web publishers will face the increasing complexity of managing different uses at scale. That’s why we’re committed to engaging with the web and AI communities to explore additional machine-readable approaches to choice and control for web publishers.” – Google-Extended helps sites improve Bard and Vertex AI generative APIs 
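Google-Extended is controlled in robots.txt like any other user agent. For example, a publisher who wants pages to remain crawlable for Search but excluded from Bard and Vertex AI training could add:

```
User-agent: Google-Extended
Disallow: /
```

Because Google-Extended is a separate token from Googlebot, blocking it does not affect how pages are crawled or ranked in Search.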
While we find that in Search Labs, Google cites the website source(s) used while generating the answer, many ask if they can trust those AI answers. Google explains its reasoning is to help the searcher:
“So instead of asking a series of questions and piecing together that information yourself, Search now can do some of that heavy lifting for you.” – 3 new ways generative AI can help you search 
Augmenting Content with an AI Question Answer Generator
Note of caution: An AI question-answer generator should be used to augment your content, not to replace the human expert. Large language models combined with a topic model will analyze specific documents rather than whatever is published on the web.
The trend toward direct AI answers may affect website traffic. With people finding answers directly on SERPs, there is less need to click through search results to gain information. Some sites already believe this is the cause of their declining organic traffic. However, we believe you can still earn those clicks by offering depth that an AI answer cannot provide.
Now you understand more about the workflow for augmenting AI answers and some useful tools. What else is critical to know?
Concern for Fake or Misleading AI Answers
Without humans involved in content creation, consumers end up engaging with AI bots for answers.
Public consumer concerns have led to the Federal Trade Commission (FTC) getting involved. If you rely on generative AI for creating your QA content, you are responsible for its accuracy. We believe this can be accomplished responsibly with the right workflows in place.
“1. Bias and inaccuracies. A consumer says they asked a chat-based generative AI interface for the customer service phone number of the bank that issues their credit card and received the number of a scammer pretending to be the bank instead. This is something we have been tracking at the FTC. In a 2022 report to Congress, our agency warned of harms from bias and inaccuracies in AI showing up in products.
2. Limited pathways for appeal and bad customer service (AKA can we talk to a human please!?). Another frequently cited concern is limited pathways to appeal decisions for products using AI. There are numerous complaints of consumers who are unable to reach a human for customer service complaints or to end subscriptions and are stuck trying to communicate with AI-powered service bots.” – Consumers Are Voicing Concerns About AI – FTC
So, does this apply to all niches’ content creation processes?
AI Answer Content Must Factually Support Health Claims
Health-related articles have experienced a noticeable decline during Google’s recent algorithm update. The health sites that were hit include those that failed to clarify the difference between a “health fact,” a “health claim,” and a strongly stated health opinion.
For example, if you state, “This cake recipe is gluten-free,” that is not a health claim. It’s a health fact: the sum of the ingredients is either gluten-free or not. Writing for anyone with celiac disease is a Your Money or Your Life (YMYL) article and must not mislead readers. Someone may get really sick if you are not factual.
Or if a blog author says, “This is how you can self-manage sports injury TMJ pain,” that’s a health claim that should be supported.
Whether you miss proper fact and claim support accidentally or intentionally, it may impact your site’s AI content marketing effectiveness. Always have a human editor double- or triple-check for AI answer hallucinations.
Google’s helpful content system guidelines stress asking yourself, “Does the information make you trust the author by means of clear sourcing or verifiable information?”
Google Search Generative Experience’s AI-Powered Answers
Providing answers is occurring right on the SERPs. And if you aren’t providing them there, your competitors using conversational AI strategies are.
A good approach for your content marketing strategy is to realize that search is becoming conversational. The Google Search Generative Experience (SGE) lets people explore further. Directly under the snapshot is the option to “ask a follow-up question” or click on a suggested next step. This opens up a conversational mode, where you can ask Google more about the query you’re exploring. Context is carried over from question to question to help continue answer explorations.
Retailers are happy to see ads now displaying in SGE answers. And Google explained they will be further testing sponsored content in the SGE answer. Google Search intends to further simplify SERPs.
Also, local pack displays now show fewer results when that avoids repeating what Google showed directly above in its AI-powered answer.
“Google would show a ‘three pack’ in the main traditional search results, and it would often be redundant to what was shown in the AI-SGE answer.” – “AI powered answers in SGE” by Gary Illyes
SGE currently produces AI-generated “overviews” to assist with tasks. Generative search lets you find answers to how-to questions or get suggested code snippets for common tasks. It also lets you “Explore on page” to see the questions an article answers and jump to the relevant section.
AI-generated answers can be trusted – if you know their limitations and how to evaluate them. You’ll need to validate AI-generated answers by comparing them to other trusted sources and by using your own judgment to assess their accuracy.
How Google AI may Formulate Quality Answers
AI-generated answers display above Google’s traditional search results listings.
By finding examples in Google patents, how Google may populate generative AI answers becomes clearer.
“As a working example of some implementations, assume a user interacts with a client device to submit a query of “How many doctors will there be in China in 2050?” to a search engine. Assuming that an authoritative answer to this query is not included in one or more existing resources (such as a knowledge graph, a proprietary database of a company, and/or structured database), the user is typically provided search results that are responsive to one or more terms in the query (e.g., results including population of China, current number of doctors in China). However, none of these search results may provide a satisfactory answer to the user’s query. Accordingly, various implementations described herein provide, in response to a query, a predicted answer to the query that is predicted based on a trained machine learning model and/or provide an interactive interface to the trained machine learning model, where the user can interact with the interface to be presented with the predicted answer and/or other answer(s) predicted using the trained machine learning model.” – Generating and/or utilizing a machine learning model in response to a search request, Patent application granted 2023-05-09
It goes further to say:
“Content from the user’s query can then be processed using the machine learning model to generate a predicted answer, and the predicted answer provided to the user in response to the search query. Alternatively or additionally, an interface can be provided to the user to allow the user to submit additional parameters and/or parameter values to receive additional predicted answers based on the model.”
We can only guess whether Google ends up implementing what the patent covers. However, its patents provide key insights as AI answer technology continues to advance. We see in Google’s new Search Generative Experience that not only do images populate in the AI-generated answer, but also playable videos.
How to Evaluate and Edit AI Generated Answers
- Evaluate your answer sources when including them in your content strategy. Are they from websites or organizations with Experience, Expertise, Authoritativeness, and Trust (E-E-A-T)?
- Provide an article publication date and note when it is updated. Let readers know it is current.
- Is your author providing evidence to support answers? Are there links to quoted facts? Are you using FactCheck schema markup?
- Think through it yourself. Is there a good flow to your answer? Does the answer seem logical? Does it demonstrate credibility?
- Offer more on your website so that people find it valuable enough to click beyond the AI-generated answer. This may be when they want a more comprehensive answer.
- Check any Bard’s Answer you plan to incorporate by using Google Search.
By thinking like the searcher, you can see why providing accurate and reliable information within AI-generated answers matters. To check and refine Bard answers, do the following.
“To help you check Bard’s statements, you can use the Google button. It uses Google Search to find content that’s likely similar or different, and shows links. If a link to similar content is shown, Bard didn’t necessarily use it to generate its response. Tap Double-check response.” – Double Check Bard’s Responses 
Just click on the “G” at the top right of your screen to have Google Search read the answer that Bard generated. Google Search identifies what it thinks are correct answers by color-coding with green. Incorrect answers have a deep orange color.
Generative AI Answer Outputs Need Human Reviewers
If you rely totally on an LLM’s output, your answers lack differentiation. They are “copied” and cannot represent your unique brand entity.
We’ve had a long time to prepare, but the pace of AI answer adoption has exploded in 2023. Back in 2011, Google’s (then) executive chairman Eric Schmidt stated the following at the annual All Things Digital conference.
“But the other thing that we’re doing that’s more strategic is we’re trying to move from answers that are link-based to answers that are algorithmically based, where we can actually compute the right answer. And we now have enough artificial intelligence technology and enough scale and so forth that we can, for example, give you — literally compute the right answer.” – Google now wants to answer your questions without links and with AI. 
We may complain about AI Question Answering without sourced links, but we need to be transparent ourselves. Content creators should state when content is purely AI generated, as well as when a human reviewer has refined the content.
Be transparent about how people (companies, bots, systems, cameras, etc.) are involved in your content creation workflow. Did they edit, write, strategize, copy, or outsource it? The human reviewer is like a final checkup. Trust is earned through your transparent details; otherwise, your writing may raise more questions than it answers. It must be helpful content, not meaningless filler.
It’s about providing helpful, verified answers
Google Search, Bing, and other search engines continue to improve how artificial intelligence is used to match answers to questions. Google has made it clear multiple times: it is not about whether your content is AI-generated as much as it is about high-quality content.
“We haven’t said AI content is bad. We’ve said, pretty clearly, content written primarily for search engines rather than humans is the issue. That’s what we’re focused on. If someone fires up 100 humans to write content just to rank, or fires up a spinner, or AI, same issue.
As said before when asked about AI, content created primarily for search engine rankings, however it is done, is against our guidance. If content is helpful & created for people first, that’s not an issue.” – Danny Sullivan, Google Search Liaison
Here is what we believe can guide an AI content marketing process.
Creating helpful AI answers should focus on:
- Writing (or augmenting) from personal knowledge.
- Letting your experience come through, which supports relationship building.
- Keeping answers helpful, concise, and correct.
- Providing a natural and intuitive search experience.
Kevin Indig talks about why he’s waiting to see whether “AI answers are inevitable in the medium term to long term.” The verdict is still out as to whether users will prefer AI answers over classic web results. In my opinion, it may be that in the future they lack the choice. Or that most people will lack the ability to tell them apart.
Verified AI answers added to existing Google Search
The human editorial element adds necessary authenticity and correctness. For companies facing data management challenges, editors also protect you from the possible degradation of training material for generative AI itself.
With ChatGPT and Bard, OpenAI and Google are testing more accurate and efficient dialog-based general-purpose language models. Once search engines are confident of correct and needed answers, your QA content has a greater chance of being rolled into Google’s no-click search. This is my opinion and what our ongoing SERP analysis results lead us to think.
To gain a verified AI answer for your Knowledge Graph (KG), employ research-backed metrics to evaluate the quality of your inputs (data quality, RAG context quality, etc.) and your AI tool outputs (avoid hallucinations). This also assists in building LLM-powered applications.
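A minimal sketch of one such output metric: flag answers that are not grounded in the retrieved context. The substring check below is a deliberately crude proxy; real groundedness metrics use entailment models or human review:

```python
def grounded_fraction(answers, contexts):
    """Fraction of answers whose key claim appears verbatim in the
    retrieved context -- a crude proxy for 'no hallucination'."""
    grounded = sum(
        1 for ans, ctx in zip(answers, contexts) if ans.lower() in ctx.lower()
    )
    return grounded / len(answers)

# Hypothetical QA test set: the second answer contradicts its context.
answers = ["30 days", "9am-5pm ET"]
contexts = ["Returns accepted within 30 days.", "Hours are 10am-4pm."]
print(grounded_fraction(answers, contexts))  # → 0.5
```

Tracking a score like this over time tells you whether workflow changes (new prompts, new retrieval sources) are actually reducing hallucinated answers.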
The SGE lets users ask longer, context-rich questions and receive comprehensive answers powered by AI directly on the SERPs. This new search summarizes answers for searchers without their needing to click a source link or pull information from Wikipedia.
Focus on nurturing important customers’ top questions
One study by Digital Information World determined that the AI’s directness and less emotional tone fared better with respondents than the more emotional responses made by human CEOs. This may inspire a company to revise and implement better procedures for publishing answer-rich content.
Data science and research experts can help you take advantage of the mathematical strengths of generative AI. Continuously test your LLM outputs and how your content performs. The number of correct answers measures test and output quality.
Another bit of incentive – Google Assistant sometimes elects to show ads in some answers. Meaning your answer, typically related to a product or service, can lead a potential buyer to your ad.
Make ALL of your communications valuable. Google’s AI chatbot can now answer questions based on the information it finds in your Gmail inbox and Drive storage. The AI technology company also announced that its chatbot will connect with Maps, YouTube, and Google Flights. Answers in search are drawn from many sources. 
To wrap up, here is a statement that I align with.
“While the volume of global data grows exponentially, at the same time, SEO is changing as consumer demands continue to evolve, and search engines cater to these changes by creating new experiences and experimenting with the integration of AI in search engine results pages (SERPs). As a result, marketers need to carefully reconsider their approach to data, technical SEO, and generative AI outputs.” – What Is Quality Data And How It Connects Search, Content, And AI Success by Lemuel Park 
SUMMARY: AI Answers Can Ignite In-Person Conversations
We’ve anticipated that search will produce more shopping-related results. Product carousel SERPs encourage user spending simply by the ease of seeing options in front of them. Semantic search and skilled use of LLMs can expand your content-to-query matching capabilities. Answering vital consumer questions can give you a competitive lead to gain more revenue while providing people with accurate responses.
Success in QA content creation is about the number of useful conversations you trigger with potential clients.
In preparation for advanced AI answer content creation, Request Your Website Content Audit.