Technical SEO in the Age of AI: What B2B Companies Need to Know

Conor McDonough

Last week, I attended “Winning the AI Search Race: Lessons from 2025 Trends,” a webinar put on by Search Engine Land and Semrush. This post is based on the discussion and the slide deck from that webinar.

Think about how you find information these days. When you need to know something, do you still click ten links on the first page of Google? Or do you ask ChatGPT, Perplexity, or Gemini to give you a direct answer?

More and more of your potential clients are doing the latter. And here’s what matters: AI systems are reading your website right now to decide whether to recommend your business when someone asks about services like yours.

What This Means for Your Blog Posts and Service Pages

Let’s start with the practical stuff. If you’re publishing blog posts about your industry expertise, AI systems are reading them. When they scan your content, they’re looking for clear, well-organized information they can understand and potentially use to answer questions.

Here’s what that means in practice:

For your blog posts: Write with clear headings that tell readers (and AI) exactly what each section covers. If you write a post titled “How to Choose Tax Compliance Software,” make sure your content directly answers that question in the first few paragraphs. AI systems want straightforward answers, not marketing fluff.

For your service pages: Be specific about what you do, who you serve, and what problems you solve. Instead of “We provide innovative solutions for the modern enterprise,” try “We help accounting firms automate sales tax compliance across all 50 states.” The second version tells both humans and AI exactly what you offer.

For your product pages: List features clearly, include pricing when possible, and explain benefits in simple terms. AI systems are looking for facts they can compare and contrast when someone asks “What’s the best [your product category] for [specific use case]?”

The technical health of your website—how fast it loads, how it’s built, and how it communicates with these AI systems—now directly affects whether you show up in AI-generated answers. A technically sound website isn’t just about looking professional anymore. It’s about being discoverable in a world where AI is becoming the primary research assistant.

Understanding AI Crawlers (And Why They Matter)

When we talk about AI “reading” your website, we’re talking about automated programs called crawlers (or bots) that visit your site, scan your content, and store that information. Think of them like very fast readers taking notes on everything you’ve published.

Different companies run different crawlers, and they don’t all behave the same way:

Google’s crawler is the most sophisticated. It can handle complex websites and understands most types of content.

OpenAI’s crawler (which powers ChatGPT) has increased its activity by 305% in the past year. It’s visiting websites much more often, gathering information to help answer user questions.

Google’s Gemini uses Google’s existing infrastructure, so it’s also quite good at understanding websites.

Perplexity’s crawler has gotten in trouble for aggressive behavior—essentially sneaking around websites that tried to block it. Cloudflare, a major internet security company, publicly called them out for this.

Why does this matter to your business? Because if these systems can’t properly read your website, you won’t appear in AI-generated answers when potential clients ask questions about your industry.

The Technical Basics (Explained Simply)

What is Robots.txt?

Every website can have a file called “robots.txt” that works like a “Do Not Enter” sign for certain areas of your site. It tells crawlers which parts of your site they can and cannot read.

For example, you might not want AI systems reading your internal employee portal or your website’s administrative pages. Your robots.txt file tells them to skip those sections.

Most B2B companies should make sure their public-facing content—blog posts, service pages, product descriptions—is accessible to legitimate crawlers. The key word is “legitimate.” You want Google and ChatGPT to read your content. You probably don’t want spam bots or malicious crawlers accessing your site.
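As a sketch, a minimal robots.txt might look like this (the paths and bot name below are examples; substitute your own private directories and sitemap URL):

```text
# Keep all crawlers out of private areas; everything public stays readable.
User-agent: *
Disallow: /admin/
Disallow: /portal/

# To block one specific AI crawler entirely, give it its own group:
# User-agent: PerplexityBot
# Disallow: /

Sitemap: https://www.example.com/sitemap.xml
```

Note that a crawler with its own User-agent group ignores the rules in the `*` group, so repeat any Disallow lines you still want for that bot.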

What is JavaScript?

JavaScript is programming code that makes websites interactive. When you click a button and something happens without the page reloading, that’s usually JavaScript at work.

Here’s the problem: Many AI crawlers struggle to read content that’s loaded through JavaScript. Imagine if you visited a library, but all the books were locked in cases that required a special key to open. You’d leave without reading anything, right? That’s what happens when AI crawlers encounter JavaScript-heavy websites.

If your website uses a lot of JavaScript to display critical information—like your service descriptions, pricing, or key features—many AI systems might not be able to read that content. They’ll essentially see a blank page where your important information should be.

How to check: Ask your web developer to test your site with JavaScript disabled. If your main content disappears or your navigation doesn’t work, that’s a red flag for AI crawlability.
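A rough do-it-yourself version of that test: fetch the raw HTML (which is all a non-JavaScript crawler sees) and search it for phrases that should be there. This is a minimal Python sketch, not a substitute for a real rendering test; the URL and phrases in the usage note are placeholders.

```python
from urllib.request import Request, urlopen

def phrases_in_raw_html(html: str, phrases: list[str]) -> dict[str, bool]:
    """Report which key phrases appear in the raw HTML, before any JavaScript runs."""
    return {phrase: (phrase in html) for phrase in phrases}

def check_page(url: str, phrases: list[str]) -> dict[str, bool]:
    """Fetch a page the way a simple crawler would (no JavaScript execution)."""
    req = Request(url, headers={"User-Agent": "Mozilla/5.0 (crawlability check)"})
    html = urlopen(req, timeout=10).read().decode("utf-8", errors="replace")
    return phrases_in_raw_html(html, phrases)

# Hypothetical usage:
# print(check_page("https://example.com/services", ["sales tax compliance"]))
# Any False value suggests that content is injected by JavaScript and may be
# invisible to many AI crawlers.
```

If a phrase comes back False here but is visible in your browser, JavaScript is almost certainly loading it.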

Website Speed and AI Abandonment

AI systems are impatient. If your website takes too long to load, they might give up and move on to a competitor’s site instead.

Think about your own browsing behavior. If a page doesn’t load within a few seconds, you hit the back button and try another result. AI crawlers do the same thing, but they’re even less patient than humans.

For B2B companies, this is particularly important because:

  1. Your service pages need to load quickly so AI systems can read your offerings
  2. Your blog posts need to load quickly so AI can access your thought leadership content
  3. Your contact and pricing information needs to load quickly so AI can include these details when recommending your business

What to do: Use free tools like Google PageSpeed Insights to test your site. If your pages take more than 3-4 seconds to load, you should investigate optimization options with your developer or web host.

Pop-ups, Banners, and Other Moving Parts

Have you ever tried to click a button on a website, but right before you clicked, an advertisement slid into place and you accidentally clicked that instead? Frustrating, right?

AI systems have the same problem. When elements on your page shift around—especially in those first few seconds as the page loads—AI crawlers can get confused about what to click or what content is most important.

This is measured by something called Cumulative Layout Shift (CLS). High CLS means lots of unexpected movement as your page loads.

Common culprits for B2B websites:

  • Email signup pop-ups that appear after a few seconds
  • Cookie consent banners that push content down
  • Promotional banners at the top of the page that load late
  • Images that don’t have set dimensions, causing text to jump when they load

Some of these elements are necessary (like cookie consent), but minimize unexpected movement wherever possible. Make sure images have width and height attributes so the browser reserves the right amount of space before the image loads.
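For example, an image tag with explicit dimensions (the file path, alt text, and sizes here are illustrative):

```html
<!-- The browser reserves an 800x450 box before the file downloads,
     so surrounding text doesn't jump when the image appears. -->
<img src="/images/team.jpg" alt="Our tax compliance team" width="800" height="450">
```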

The Future: Agentic AI

Here’s where things get really interesting. The next generation of AI doesn’t just read websites—it interacts with them.

“Agentic AI” refers to AI systems that can perform tasks on your behalf. Imagine an AI assistant that can actually fill out your contact form, navigate your product catalog, or even complete a purchase.

These systems can handle JavaScript and complex interactions, which is good news. But they’re also judging whether your website is easy to use. If your forms are confusing, if your checkout process has unnecessary steps, or if your site is slow, these AI agents might recommend a competitor instead.

For B2B companies, this means:

  • Simplify your contact forms. Every required field is another chance for an AI (or human) to give up.
  • Make your navigation intuitive. Can someone get from your homepage to your pricing in two clicks?
  • Test your conversion paths. From initial interest to demo request to purchase, eliminate friction points.

Structured Data: Teaching AI to Understand Your Business

Imagine if every business card in the world had information in random places. One card puts the phone number in the top left, another puts it in the bottom right, another doesn’t include a phone number at all. It would be chaos.

Structured data is like agreeing on a standard business card format. It’s a way of labeling information on your website so AI systems know exactly what they’re looking at.

Instead of AI having to guess “Is this text a product name or a blog post title?” structured data explicitly labels it: “This is a product. This is its name. This is its price. This is what it does.”

For B2B companies, particularly valuable structured data includes:

  • Organization information: Your company name, logo, contact details, and location
  • Service listings: What services you offer and who they’re for
  • Article markup: For blog posts, so AI knows the headline, author, publish date, and main content
  • FAQ markup: Questions and answers that might appear directly in search results
  • Review markup: If you have testimonials or reviews, this helps AI understand your reputation

Important note: While structured data helps AI search systems find and display your information correctly, there’s limited evidence it’s used when AI models are initially trained. Think of it this way: structured data helps AI recommend you when someone asks a question, but it doesn’t necessarily make AI “learn” about your industry from your content.

That said, it’s still worth implementing because modern AI search is essentially conducting multiple searches and combining results—and structured data helps your content get surfaced in those searches.
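As an illustration, here is Organization markup in JSON-LD (the format Google recommends), placed in your page’s HTML. The company details below are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Compliance Co.",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "contactPoint": {
    "@type": "ContactPoint",
    "telephone": "+1-555-0100",
    "contactType": "sales"
  }
}
</script>
```

The same pattern applies to Article, Service, and FAQ markup: a small labeled block that states plainly what the page is about.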

Server Logs and Crawler Monitoring

This section is a bit more technical, but it’s important for understanding what’s actually happening on your site.

Your web server keeps a record (a “log”) of every visit to your site. This includes regular visitors and crawler bots. Reviewing these logs can tell you:

  • Which AI crawlers are visiting your site
  • How often they’re visiting
  • Which pages they’re reading
  • Whether they’re encountering errors

Ask your web developer or hosting provider to show you crawler activity. You should see regular visits from legitimate bots such as Googlebot and GPTBot (ChatGPT’s crawler). Note that Google-Extended, which governs whether Gemini can use your content, is a robots.txt control token rather than a separate crawler, so it won’t appear as its own user agent in your logs.

Red flags include:

  • Unknown crawlers making thousands of requests per hour
  • Crawlers that ignore your robots.txt file
  • Unusual traffic patterns that slow down your site

If you see aggressive crawling behavior, you may want to implement rate limiting or blocking for specific user agents. This is technical work, but it’s important for protecting your server resources.
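To make the log review concrete, here is a small Python sketch that tallies known crawler user agents from access-log lines in the standard combined format. The sample log lines and bot list are made up for illustration; in practice you would read lines from your real access log.

```python
import re
from collections import Counter

# Hypothetical log lines in the common Apache/Nginx "combined" format.
SAMPLE_LOG = """\
66.249.66.1 - - [10/May/2025:10:00:01 +0000] "GET /blog/tax-software HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
52.230.152.1 - - [10/May/2025:10:00:05 +0000] "GET /services HTTP/1.1" 200 4210 "-" "Mozilla/5.0 AppleWebKit/537.36 (compatible; GPTBot/1.2; +https://openai.com/gptbot)"
10.0.0.7 - - [10/May/2025:10:00:09 +0000] "GET /pricing HTTP/1.1" 200 3321 "-" "Mozilla/5.0 (Windows NT 10.0) Chrome/124.0"
"""

# Crawler tokens to look for in the user-agent string (extend as needed).
KNOWN_BOTS = ["Googlebot", "GPTBot", "PerplexityBot", "ClaudeBot"]

def count_crawler_hits(log_text: str) -> Counter:
    """Tally requests per known crawler, matching substrings of the user agent."""
    counts = Counter()
    for line in log_text.splitlines():
        # The user agent is the last quoted field in combined log format.
        quoted = re.findall(r'"([^"]*)"', line)
        if not quoted:
            continue
        user_agent = quoted[-1]
        for bot in KNOWN_BOTS:
            if bot in user_agent:
                counts[bot] += 1
    return counts

print(count_crawler_hits(SAMPLE_LOG))  # Counter({'Googlebot': 1, 'GPTBot': 1})
```

A surge of requests from a user agent you don’t recognize, or from one that keeps hitting paths your robots.txt disallows, is exactly the kind of pattern worth escalating to your developer.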

Practical Action Steps

Let’s bring this all together with a specific checklist for B2B companies:

Immediate Actions (This Week)

  1. Review your service and product pages. Are they clear and specific? Replace vague marketing language with concrete descriptions.
  2. Check your blog post structure. Do your headlines clearly indicate what the post covers? Is your most important information in the first few paragraphs?
  3. Test your site speed. Go to PageSpeed Insights and run tests on your key pages. Anything slower than 3-4 seconds needs attention.
  4. Review pop-ups and banners. Can someone read your content without being interrupted? If not, consider less intrusive alternatives.

This Month

  1. Audit your contact information. Are your phone number, email, and address clearly visible on every page? AI systems look for this when recommending businesses.
  2. Talk to your developer about JavaScript. Ask them to test your site with JavaScript disabled. If critical content disappears, discuss options for making it more accessible.
  3. Review your robots.txt file. Make sure you’re not accidentally blocking legitimate crawlers from your public content.
  4. Simplify your contact forms. Remove any fields that aren’t absolutely necessary. Every extra field reduces conversion rates.

This Quarter

  1. Implement basic structured data. At minimum, add Organization and Article schema. Your developer can use Google’s Structured Data Markup Helper to get started.
  2. Review crawler activity in your server logs. Work with your hosting provider or developer to understand what crawlers are visiting and whether there are any concerns.
  3. Optimize images. Make sure every image has width and height attributes, proper alt text, and is compressed for web use.
  4. Create an FAQ page with structured data markup. This gives AI systems clear question-and-answer pairs to work with.

The Bottom Line

The fundamentals of good SEO haven’t changed—clear content, fast loading, easy navigation, and proper technical structure. What has changed is the importance of these factors.

In the past, if your site was a bit slow or used a lot of JavaScript, you might just rank a little lower on Google. Today, AI systems might not be able to read your site at all. That means you’re invisible when potential clients ask AI for recommendations in your industry.

The good news? Most of what makes your site work well for AI also makes it work better for humans. Clear headlines, fast loading, simple navigation, and straightforward descriptions benefit everyone.

You don’t need to rebuild your website from scratch. Start with the quick wins—improve your service page descriptions, test your site speed, and review your contact forms. Then gradually work through the more technical items with your developer.

The AI revolution isn’t coming—it’s already here. AI systems are crawling your website right now, deciding whether to recommend your business to their users. Make sure you’re ready.


Need help assessing your website’s readiness for AI search? Scribendi Digital specializes in technical SEO for professional services firms and B2B companies. Contact us to discuss a comprehensive audit tailored to your business needs.