Search engine optimization remains a pillar of digital marketing, a reality that endures. By 2025, however, a major shift is underway: artificial intelligences such as ChatGPT, Gemini, and Claude are reshaping the landscape of online search.
Websites are no longer measured only by their position on Google; they also aspire to be mentioned, or even paraphrased, by large language models (LLMs) in their instant answers.
For businesses in Quebec (and elsewhere), understanding these stakes and adapting to this transformation is not just an option: it is a necessity for maintaining visibility. By embracing these changes with insight and agility, they can continue to thrive in this ever-changing digital environment.
Why rethink SEO today for AI and LLMs?
LLMs (Large Language Models) do not read the web like a classic search engine. They synthesize. They evaluate the contextual quality, perceived authority, and conversational usefulness of a piece of content. As such, the historical SEO criteria (backlinks, keywords, meta titles) are no longer enough. This is where GEO (Generative Engine Optimization) comes in: a nascent discipline that adapts content to the way generative AI reads and restates it.
GEO or SEO for AI: An approach centered on understanding and citation
Unlike traditional SEO, GEO does not optimize for ranking, but for direct citation in AI answers. This requires a different structure, increased granularity of information, and a more visible proof-of-authority strategy.
Comparison Table: Classic SEO vs GEO (SEO for LLMs)
| Criterion | Classic SEO | GEO / SEO for AI |
|---|---|---|
| Main objective | Ranking in search results | Citation in AI answers (ChatGPT, etc.) |
| Success criteria | Keywords, backlinks, speed, UX | Clear, structured, contextualized content from a trusted source |
| Expected format | Long pages with rich markup | Concise summary paragraphs, easily quotable passages |
| Performance measurement | Google Analytics, Search Console | Snippets in AI answers, mentions in AI SEO tools |
When AI Cites You: Examples of Clients Embedded in AI Corpora
Some Quebec companies have already adapted their content to maximize their visibility in AI.
- BCRH Recruitment offers specialized content on headhunting strategies, with up-to-date, signed, and structured pages for queries such as “how to recruit in times of talent scarcity.”
- Respiart, which specializes in CPAP devices, is known for its technical guides on masks and devices, designed to be reused as-is in AI search answers.
By directly quoting these brands, AIs position them as references in their sector.
The technical fundamentals to be visible in LLMs
- No JavaScript-only content: AIs read raw HTML. The essential content must therefore be present in the page's source code.
- Server-side rendering (SSR): Ensure the full content is displayed even without dynamic loading.
- Schema.org markup: essential for structuring information (HowTo, FAQ, Article).
- Enriched data: reviews, authors, visible publication and update dates (a JSON-LD sketch follows this list).
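To make the markup and enriched-data points concrete, here is a minimal JSON-LD sketch of an Article block embedded in a page's HTML. The organization, author, and dates are placeholders to adapt to your own content.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to recruit in times of talent scarcity",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "publisher": { "@type": "Organization", "name": "Example Inc." },
  "datePublished": "2024-09-10",
  "dateModified": "2025-02-01"
}
</script>
```

Because the block sits directly in the HTML source, it stays readable even for crawlers that never execute JavaScript.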
Writing for AI: Language, Format, and Tone
AIs look for content that is easily paraphrased, factual, and adapted to conversational queries. Some good practices:
- Clearly answer questions such as “What is…?”, “How to do it…?”, “Why…?”
- Favor an informative, non-promotional style.
- Add quotable sentences that name the brand, e.g.:
“Companies like Ma Chérie Bleue benefit from increased awareness by providing useful guides for choosing simple and modern wedding dresses.”
Perceived freshness: an underestimated criterion
Unlike traditional engines, LLMs do not rely on sitemap or robots.txt files to judge a page; they infer the relevance of a piece of content from its date and its perceived usefulness. Content that has not been updated in three years, however good, risks being ignored.
Recommended update frequency:
- Strategy pages: every 6 months
- Expert blog posts: once a year
- Product guides: with each evolution or novelty
Platforms where you can plant seeds for AI
LLMs train on Reddit, Quora, Medium, StackExchange, but also local or professional forums. Participating in these spaces in a sincere and useful way increases the chances of being indexed.
- Reddit: prefer sector-specific subreddits (e.g. r/entrepreneuriatQC).
- Quora: Reply under your real name, add links to your articles.
- Medium: Republish your best content, formatted for engagement.
Example: The Clark Influence agency has strengthened its visibility by sharing client cases on Medium and LinkedIn, feeding the corpus useful for queries such as “creative Montreal influence agency”.
Think local to position yourself globally
LLMs love local specificity. Content that addresses a global topic (e.g., recruitment, cleaning, law) with a local entry is more likely to be cited for geographic queries.
- Azran answers specific legal questions about family law in Quebec.
- Nettoyage Experts offers guides on duct maintenance specific to Montreal’s climate.
- This is also why we regularly push local signals with very precise placements!
The role of internal linking and semantic density
While AIs read pages as a whole, they also take into account the overall context of the site. An isolated page with no incoming or outgoing links, without thematic neighborhoods, loses visibility. Make sure that:
- Each page is linked to other related content.
- The anchors are natural, not over-optimized.
- The vocabulary is varied but coherent.
Measuring your GEO success: not (yet) a job for Google Analytics
There are several ways to track your visibility in AI:
- Test queries in ChatGPT/Gemini with mentions of your industry.
- Use of tools such as ChatGPT Browse, SearchGPT, or Perplexity with source filtering.
- Citation monitoring (with or without link) via Mention, Talkwalker or Ahrefs.
To dig deeper, read the complete guide to SEO for ChatGPT published by BlackcatSEO, and get inspired by successful examples in their portfolio of GEO optimization applied in Quebec.
Engine by engine: Gemini, Perplexity, ChatGPT, Deepseek
Gemini (Google)
Objective: To be cited in AI Overviews and to be featured in rich results powered by Gemini.
Technical best practices:
- Full HTML, no dynamically injected content
- Mobile-first performance and Core Web Vitals (LCP < 2.5 s, CLS < 0.1); see the measurement snippet after this list
- Structuring via relevant Schema.org tags (Product, Article, FAQ, Event)
- Visible update date, revised content
- Highlighting the E-E-A-T: author, expertise, press mentions
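To check the Core Web Vitals thresholds mentioned above, a minimal sketch using Google's web-vitals npm package; the logging call is a placeholder for your own analytics:

```ts
// Minimal Core Web Vitals logging in the browser (npm install web-vitals).
import { onCLS, onLCP } from "web-vitals";

onLCP((metric) => console.log("LCP (ms):", metric.value)); // target: below 2500 ms
onCLS((metric) => console.log("CLS:", metric.value));      // target: below 0.1
```

In practice, replace console.log with a call to your analytics endpoint so these values are tracked on real traffic.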
Perplexity
Objective: To appear among the sources cited in Perplexity's summary answers.
Technical best practices:
- Pages served without JavaScript blocking
- Clear headings, short paragraphs, list structuring or FAQs
- Sourced, original, dated content
- Links to trusted external resources
- Schema.org on key elements (Organization, Person, WebPage)
ChatGPT (OpenAI)
Objective: To have your content included in Browse answers and in query completion modules.
Technical best practices:
- Full HTML rendering, no cloaking
- Content structured as direct answers (e.g., Q&A sections, tables)
- FAQPage, HowTo, and Article tags to guide extraction
- Allowing Common Crawl's crawler (CCBot) in robots.txt (see the sketch after this list)
- Citations to other articles, internal data, and case studies
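As an illustration, a minimal permissive robots.txt that explicitly allows Common Crawl's CCBot and OpenAI's GPTBot; the sitemap URL is a placeholder, and the exact list of user-agents you allow is a policy choice:

```txt
User-agent: CCBot
Allow: /

User-agent: GPTBot
Allow: /

User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

The empty Disallow line leaves the whole site open to crawlers that respect the file.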
Deepseek
Objective: Sustainable indexability and reliable reuse of content via mass ingestion.
Technical best practices:
- Up-to-date and comprehensive XML sitemap
- Permissive robots.txt, no noindex directives
- Logical semantic tags (Hn, clear sections)
- Use of structured data for all informational pages
- Strong internal linking with semantic consistency
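For reference, a minimal XML sitemap sketch following the sitemaps.org protocol; the URLs and dates are placeholders, and the lastmod values reinforce the freshness signal discussed earlier:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/guides/cpap-mask-maintenance</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/recruiting-in-talent-scarcity</loc>
    <lastmod>2024-11-02</lastmod>
  </url>
</urlset>
```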
Comparison Table of SEO Technical Requirements by LLM
| Technical requirement | Gemini (Google) | Perplexity | ChatGPT (OpenAI) | Deepseek |
|---|---|---|---|---|
| Hn structure and tags | Indispensable | Indispensable | Indispensable | Indispensable |
| Schema.org markup | Priority | Priority | Priority | Priority |
| Full HTML / SSR | Mandatory | Mandatory | Mandatory | Mandatory |
| HTTPS / performance | Priority | Important | Important | Important |
| Freshness / updates | Indispensable | Priority | Important | Priority |
| E-E-A-T / authorship | Indispensable | Important | Important | Important |
| External sources | Priority | Indispensable | Priority | Priority |
| Robots.txt / sitemaps | Open / up to date | Open | “ccbot” allowed | Open, full sitemaps |
Tailor the strategy to each AI engine
While the foundations remain similar, each LLM emphasizes slightly different aspects. Gemini values technical performance and E-E-A-T, Perplexity requires clear sources and structure, ChatGPT builds on conversational logic, and Deepseek demands maximum structural rigor. By understanding these differences, we can better position ourselves in the cognitive value chain of search AIs.
The technical backstage of SEO for LLMs
From bare HTML to full display: what AIs see
The content read by AIs is the content directly visible in the HTML. This excludes any element dynamically added by JavaScript. What the user sees is not always what an LLM sees.
- Server-side rendering (SSR): Prioritize SSR or pre-rendering. Frameworks like Next.js or Nuxt allow content to be exposed without JavaScript.
- Render check: Test your pages with curl, Google Search Console's preview, or Lynx's text-only mode to see what an AI can extract (see the commands below).
- Avoid: Content injected via JS widgets, asynchronous carousels, or CSS-masked tags.
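Two quick checks from the command line, assuming curl and lynx are installed; replace the URL with one of your own pages:

```bash
# Fetch the raw HTML exactly as a crawler receives it (follows redirects)
curl -sL https://www.example.com/page

# Dump the text-only rendering, a rough proxy for what an LLM can extract
lynx -dump https://www.example.com/page
```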
Table – Content accessibility by rendering method
| Rendering method | AI-readable | Recommended? | Comment |
|---|---|---|---|
| Server-side rendering (SSR) | Yes | Yes | Stable, complete display in the HTML |
| Static pre-rendering (SSG) | Yes | Yes | Ideal for fixed or informational pages |
| Client-side rendering (CSR/SPA) | No | No | Risk of content invisible to AIs |
| Partial / hybrid rendering | Partly | With caution | Depends on HTML structure and fallback |
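As a minimal server-side rendering sketch with Next.js (pages router), where the route, data source, and text are placeholders; the point is that the full content is already present in the HTML that crawlers receive:

```tsx
// pages/guides/[slug].tsx - rendered on the server, so the raw HTML
// already contains the full text that AIs and crawlers will read.
import type { GetServerSideProps } from "next";

type GuideProps = { title: string; body: string };

export default function Guide({ title, body }: GuideProps) {
  return (
    <article>
      <h1>{title}</h1>
      <p>{body}</p>
    </article>
  );
}

export const getServerSideProps: GetServerSideProps<GuideProps> = async ({ params }) => {
  // Placeholder: fetch the real guide from your CMS or database here.
  return {
    props: {
      title: `Guide: ${String(params?.slug ?? "")}`,
      body: "Full guide content rendered server-side.",
    },
  };
};
```

For pages that rarely change, static pre-rendering (getStaticProps) achieves the same result with better performance.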
Semantic structuring and intelligent markup
A page an AI can understand is a page whose structure makes sense: hierarchical, tagged, typed content. The more explicitly the content is tagged, the better an AI can interpret it.
- Schema.org / JSON-LD: Use the Article, FAQPage, Product, LocalBusiness, and HowTo types to define your pages.
- Microdata: Add the author, revision date, cited products, ratings, and informative links.
- Hierarchical headings: H1 to H4 tags segment ideas clearly, making them easy to reuse.
Example: A page on legal obligations in family law in Quebec with Article + FAQ markup can be included in a Gemini response.
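A minimal FAQPage sketch along those lines; the question and answer text are placeholders, not legal advice:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "How is child custody decided in Quebec?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Placeholder: a short, self-contained summary of the page's answer."
    }
  }]
}
</script>
```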
Optimization of technical performance
LLMs do not directly “reward” speed, but the platforms where they operate (e.g. Bing/Edge, Google/Gemini) filter the results according to the classic criteria:
- Compressing images in WebP or AVIF
- Intelligent lazy loading (main content loaded from the initial HTML)
- Compliance with Core Web Vitals (LCP, CLS, FID)
AIs prefer sites that display quickly and present the essentials as soon as they arrive on the page.
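For the image recommendations above, a minimal HTML sketch (file names and alt text are placeholders): serve modern formats with a fallback, and lazy-load only below-the-fold images so the essential content arrives with the first HTML.

```html
<!-- Hero image: loaded eagerly so the main (LCP) element is available immediately -->
<picture>
  <source srcset="hero.avif" type="image/avif">
  <source srcset="hero.webp" type="image/webp">
  <img src="hero.jpg" alt="Duct maintenance technician in Montreal" width="1200" height="630">
</picture>

<!-- Below-the-fold illustration: deferred with native lazy loading -->
<img src="maintenance-schedule.webp" alt="Seasonal maintenance schedule" loading="lazy" width="800" height="450">
```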
Crawler access: AIs sometimes ignore your rules
- robots.txt: Keep Disallow: empty for content you want ingested, but be aware that some AI crawlers bypass this file anyway.
- HTTP headers: Use clean indexing headers, with no cloaking.
- Avoid gated content: Login walls block access for AI crawlers, which will simply skip the content.
Organize content into thematic silos
AIs are looking for global consistency. A site that offers isolated content on a subject is not considered relevant. You have to create a network of related content, with a coherent and varied vocabulary.
- Create a topic cluster around each master topic.
- Use internal links to guide AI from one topic to the next.
- Embed synonyms, co-occurrences, named entities.
Summary Table of Advanced Technical Best Practices
| Technical aspect | Recommendation | Why it matters for LLMs |
|---|---|---|
| Content rendering | SSR or pre-rendering, clean HTML | Enables full indexing with no loss of content |
| Semantic structure | H1-H4 tags, short paragraphs | Facilitates the generation of summaries and citations |
| Structured data | Complete Schema.org / JSON-LD markup | Clarifies context for AIs, promotes understanding |
| Performance | Compressed images, minified CSS, fast loading | Indirect influence via the UX of the host engines |
| Internal linking | Contextual navigation, natural anchors | Strengthens thematic logic and content extraction |
| Update frequency | Visible timestamps, regular revisions | AIs favor current, maintained content |
| Thematic silos | Well-defined topic clusters | Demonstrates expertise in a given field |
| Original / proprietary content | Internal data, case studies | A unique source that AIs can cite unambiguously |
We leave you with this overview, and we will be happy to discuss it further for your SME, with concrete scenarios for your visibility in LLMs.