How to track your AI visibility with Bing

For the past two years, tracking AI visibility and citations has largely been a black box.

Third-party tools like Semrush and Ahrefs introduced ways to estimate AI visibility by monitoring search results and citations. But until recently, we didn’t have first-party data directly from the search engines themselves confirming when and how often content was used inside AI-generated answers.

That changed a few weeks ago.

Bing Webmaster Tools introduced AI performance reporting, allowing website owners to see impressions and queries tied specifically to AI-generated answers.

Naturally, there’s a lot of hype online. So, we tested it ourselves. 

Here’s what it actually shows, what you can do with it today, and where it still falls short.

Quick Overview: 

  • Early findings: AI citations strongly correlate with traditional SEO performance. The pages Bing cites most already rank well organically. That undercuts the idea that “GEO” replaces SEO, which means strong search fundamentals still drive AI visibility.
  • The smartest move right now:
    • Export grounding queries (real AI retrieval language).
    • Test those queries in Google and ChatGPT to see who gets cited.
    • Use GSC long-tail query filters (7+ words) as a proxy for AI-style searches.
    • Double down on pages that already rank and get cited.
  • Bottom line: AI visibility is finally measurable. But until we get click data and broader platform coverage, it’s a directional signal, not a full performance metric. Keep your SEO foundation strong, because that’s what’s getting cited anyway.

What Bing’s report actually shows (and what it doesn’t)

The report is built around citations: instances when Bing’s AI retrieves and references your content to generate an answer.

Bing AI Performance Reporting dashboard

Four metrics are available: 

  • Total citations: How often your pages were referenced in AI answers.
  • Cited pages: Which URLs are most frequently used by AI.
  • Grounding queries: The types of queries Bing’s AI used to retrieve your content.
  • Visibility trends: How citation activity changes over time.

That’s a legitimate and meaningful dataset. 

Knowing which pages Bing’s AI is pulling from and which queries prompted those retrievals is far more actionable than anything third-party tools have offered.

But here’s the gap that makes the whole report incomplete: there’s no click data.

You can see that you’re being cited. You can’t see whether anyone actually visited your site as a result. That’s not a minor omission. It’s the difference between measuring visibility and measuring impact. Which cited pages are actually driving traffic? Which topics make users curious enough to click through? 

Without that layer, you’re essentially optimizing for citations you can’t prove are doing anything.

The feature has only been live for a few weeks, so hopefully Microsoft expands it. But as of today, you’re working with half the picture.

The finding that challenges the GEO hype

Here’s the most useful signal from our own dashboard, and it cuts against a lot of the noise around generative engine optimization: our top 10 most-cited pages also rank, on average, around position five in organic search.

CXL top 10 most-cited pages (Citations and Average position)

Let that sit for a second.

The pages Bing’s AI is pulling from aren’t obscure deep cuts that got lucky with an AI retrieval query. They’re pages that already perform well in traditional search. Strong SEO correlates directly with AI citation frequency.

This matters because a whole industry has sprung up around GEO and AEO as replacements for SEO: the idea that you need an entirely different content strategy to show up in AI answers. 

Our data doesn’t support that framing. 

AI visibility is building on traditional SEO, not replacing it.

Yes, there are things worth doing at the margin: tightening your authority signals, earning third-party mentions, structuring content to be reference-worthy rather than just readable. But the foundation hasn’t shifted. 

Well-researched, high-quality content that earns organic rankings is also the content that gets cited in AI answers. Treating GEO as a separate discipline adds complexity without evidence that it’s necessary.

Tracking AI visibility on Google and other LLMs

It’s also important to keep the context in mind: Bing represents a relatively small share of overall search usage compared to Google. What gets cited in Bing’s AI answers or Microsoft Copilot doesn’t automatically reflect what’s happening in Google’s AI Overviews, ChatGPT, or other LLM platforms.

So the data is real, but it’s not representative, which means you’re only looking at one slice of AI visibility, not the full picture.

For now, the most effective way to use it to track AI visibility is:

  1. Export the grounding queries from the Bing report. These are real queries—actual language your audience is using when AI retrieves your content. 
Grounding query
  2. Run those queries manually in Google, ChatGPT, and other platforms. See if your content gets cited. If it doesn’t, identify who does and reverse-engineer what they’ve done differently.
manual query citation check in Google
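To keep those manual checks organized, the exported grounding queries can be turned into a simple checklist worksheet. A minimal sketch, with assumptions: the query strings and the `citation_checks.csv` filename are illustrative, and the column headers in Bing’s actual export may differ.

```python
import csv

# Illustrative grounding queries pulled from the Bing export
# (replace with your real exported list).
grounding_queries = [
    "best way to measure ai search visibility for b2b sites",
    "does bing webmaster tools show ai citations",
]

# Platforms to check each query against by hand.
platforms = ["Google AI Overviews", "ChatGPT", "Copilot"]

# One row per query per platform, with blank columns to fill in
# during the manual check.
with open("citation_checks.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["query", "platform", "cited (y/n)", "who is cited instead"])
    for query in grounding_queries:
        for platform in platforms:
            writer.writerow([query, platform, "", ""])
```

The blank “who is cited instead” column is the point: it becomes the competitive gap list you reverse-engineer later.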

This won’t be as precise as getting first-party data from each platform directly. Results vary by user and context, and you’re doing comparative analysis rather than working with clean metrics. But you’re starting from real queries connected to real AI retrieval events, which is a significantly better starting point than guessing.

The grounding queries alone justify setting up the report. Everything else is a bonus.

The Google Search Console workaround

The move we’re hoping for: Microsoft’s report pressures Google to add equivalent AI citation data to Search Console. Until that happens, there’s a GSC workaround worth using.

  1. Go to GSC → Performance → Queries
  2. Click Add filter → Query → Custom (regex)
  3. Use a long-tail extractor like: ([^" "]*\s){7,}?
Google Search Console AI citation long-tail extractor

This surfaces queries of seven or more words: long, problem-focused searches that closely resemble how people write prompts in ChatGPT and other LLMs. 

long-tail queries surfaced in GSC

It’s not the same as AI citation data, but it gives you a proxy signal for the types of queries where AI systems are likely pulling content. Pages that rank well for these long-tail patterns are probably your best candidates for AI citation across platforms.
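If you’d rather audit outside the GSC interface, the same seven-word threshold is easy to apply locally to an exported query list. A minimal sketch, assuming you’ve exported queries to a plain list; a simple word count does the same job as the regex here.

```python
def is_long_tail(query: str, min_words: int = 7) -> bool:
    """Proxy for AI-style prompts: queries of seven or more words,
    the same threshold the GSC regex filter targets."""
    return len(query.split()) >= min_words

# Illustrative queries (replace with your exported GSC query list).
queries = [
    "seo tools",
    "how do i get my site cited in ai overviews and chatgpt answers",
]

long_tail = [q for q in queries if is_long_tail(q)]
print(long_tail)  # only the 13-word query survives the filter
```

Pipe your full export through this and you have the same candidate list the regex filter surfaces, ready for the cross-platform checks above.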

It’s a workaround, and it should be treated as one. But it’s a useful one until Google gives us something better.

What to do this week

1. Set up Bing Webmaster Tools if you haven’t already. The AI performance report is available now at no cost. If your site isn’t verified, that’s the first step (verification takes under 10 minutes).

2. Pull your grounding queries and build a tracking list. Export every query showing up in your AI citation data. This is your most valuable output from the report: real retrieval language that Bing’s AI associates with your content.

3. Run those queries across Google and ChatGPT manually. For each grounding query, check who’s being cited in AI Overviews and ChatGPT responses. If it’s not you, document who it is. That’s your competitive gap analysis.

4. Cross-reference cited pages with your organic rankings. If your most-cited pages are also your top-ranked pages, your SEO foundation is solid and the GEO-specific tactics are secondary. If cited pages are scattered across your rankings, look at what they have in common structurally.

5. Set up the GSC regex filter for long-tail queries. Treat the resulting query list as a content audit trigger. Pages ranking well for 7+ word queries are your AI visibility candidates on Google—prioritize keeping them current and well-structured.
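The cross-reference in step 4 takes a few lines once you have both exports. A sketch under stated assumptions: the URLs, citation counts, and average positions below are illustrative stand-ins for a Bing cited-pages export and a GSC pages export, whose actual column names will differ.

```python
# Illustrative data: url -> citation count (from Bing's cited-pages export)
cited = {"/blog/ai-visibility": 42, "/guides/seo-basics": 17}

# Illustrative data: url -> average organic position (from a GSC pages export)
rankings = {"/blog/ai-visibility": 4.2, "/guides/seo-basics": 6.1,
            "/blog/old-post": 38.0}

# Pages that appear in both exports, with citations and rank side by side.
overlap = [(url, cited[url], rankings[url]) for url in cited if url in rankings]

# Average organic position of your cited pages: if it's strong (low),
# your SEO foundation is what's driving AI citations.
avg_pos_of_cited = sum(pos for _, _, pos in overlap) / len(overlap)
print(f"Cited pages' average organic position: {avg_pos_of_cited:.1f}")
```

If that average sits near the top of page one, as it did in our data, GEO-specific tactics are secondary; if cited pages are scattered down the rankings, inspect what they share structurally.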

Close the loop on AI discovery

AI visibility is measurable now. That’s real progress after two years of speculation.

But measurable isn’t the same as actionable, not yet anyway. Bing’s AI visibility tracking report tells you when your content gets cited. It doesn’t tell you whether those citations are doing anything for your business. Until click data gets added, you’re optimizing toward a metric you can’t fully connect to outcomes.

For now, your best bet is to use the grounding queries, cross-reference across platforms, and keep your traditional SEO foundation strong. Because the data increasingly shows that’s what gets you cited anyway.

→ Improve your LLM content strategy with expert insights from the top 1% in AI B2B Marketing:

→ Secure your seat at our 5-day n8n webinar series and learn how to automate manual marketing tasks
→ Don’t miss our free webinar 5-day AI Adoption for Leaders with Ramir Arya and Ilinca Munteanu (Co-Founders of WeSimplify)

Discover more live and on-demand B2B AI courses here.

And push for Google to follow Microsoft’s lead on Search Console. That’s the data that would actually change how teams make content decisions. Everything until then is useful, but incomplete.
