Google Gemini | G2: Make clear how well the system can do what it can do

Google Gemini makes clear how well the system can do what it can do (Guideline 2) by citing sources in its responses and helping users double-check those responses. Within cited sources, the statements relevant to Gemini’s response are highlighted. The “Double-check response” feature walks users through Gemini’s automated verification process, with steps like “Searching on Google,” “Evaluating Gemini’s statements,” “Preparing results,” and “Check completed.” Once the check is finished, statements in Gemini’s response are highlighted (green, orange, or not at all) based on their similarity to other content found via Google Search. The color-coded statements express uncertainty visually and are explained in a pop-up that appears when users click “Understand the results.”
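To make the pattern concrete, here is a minimal sketch of how a similarity-based highlighting step might be structured. The types, the `classifyStatement` function, and the thresholds are illustrative assumptions for this sketch only; Google has not published the actual scoring logic behind the Double-check feature.

```typescript
// Hypothetical types and thresholds; not Google's actual implementation.
type HighlightColor = "green" | "orange" | "none";

interface SearchEvidence {
  url: string;        // source found via search
  similarity: number; // assumed 0..1 similarity between statement and source text
}

interface CheckedStatement {
  text: string;
  color: HighlightColor;
  evidence?: SearchEvidence;
}

// Assumed thresholds, chosen for illustration only.
const CORROBORATED_THRESHOLD = 0.75; // strong match -> green (similar content found)
const DIVERGENT_THRESHOLD = 0.35;    // weak match -> orange (content likely differs)

function classifyStatement(
  statement: string,
  evidence: SearchEvidence[]
): CheckedStatement {
  // No evidence found: leave the statement unhighlighted so the user sees
  // the check was inconclusive rather than a verdict either way.
  if (evidence.length === 0) {
    return { text: statement, color: "none" };
  }

  // Base the highlight on the best-matching source.
  const best = evidence.reduce((a, b) => (a.similarity >= b.similarity ? a : b));

  if (best.similarity >= CORROBORATED_THRESHOLD) {
    return { text: statement, color: "green", evidence: best };
  }
  if (best.similarity <= DIVERGENT_THRESHOLD) {
    return { text: statement, color: "orange", evidence: best };
  }
  return { text: statement, color: "none", evidence: best };
}
```

The key design choice this sketch illustrates is that the system communicates graded confidence per statement, including an explicit “not enough information” state, rather than a single pass/fail verdict for the whole response.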
The verification experience, which includes citing sources, highlighting the relevant statements within them, double-checking responses, and using latency moments to educate users about the process, is proactive and eases the burden of verification on users.
The techniques used in this example have the potential to foster appropriate reliance on AI by reducing overreliance. Keep in mind that overreliance mitigations can backfire, so be sure to test them in context with your AI system’s actual users. Learn more about overreliance on AI and appropriate reliance on generative AI.