
How to turn Claude Code into your SEO analyst (with Semrush data)


Getting more data isn't the problem in SEO. The problem is that it lives in too many places, and turning it into useful insights involves a lot of jumping between tools, exporting CSVs, and piecing it all together manually.

Claude Code and the Semrush MCP let you see the full picture in one place, and in a way that lets you interact with the data in plain English. Here's an example of what you can do:

Claude Code analyzes GSC and Semrush data to find low-KD, high-impression SEO keyword opportunities for trafficthinktank.com

In this guide, I'll show you how to connect your first-party Google data and Semrush's competitive intelligence to Claude Code. I'll then show you how to build a live dashboard that brings it all together in one place.

I'll use one of our portfolio sites (TrafficThinkTank.com) as the running example. Traffic Think Tank is an SEO education community and blog that competes for keywords like "how to learn SEO," "best SEO books," and "SEO communities."

Every prompt and analysis in this guide was run against the live site, so you're seeing exactly how these workflows play out for a real use case.

Let's build it.

Step 1: Set up your project

In this step, you'll install Claude Code and set up the project files. This article shows what it looks like in the desktop app, but you'll still be able to follow every step in the terminal/command line.

Install Claude Code

If you're using the Claude desktop app, switch to the "Code" tab and you're good to go:

Claude sidebar menu with Code option highlighted to start a new coding session for SEO automation tasks

If you prefer the CLI, here are the installation commands for Mac, Windows PowerShell, and Windows CMD:

If you're using Mac, type in:

curl -fsSL https://claude.ai/install.sh | bash

If you're using Windows PowerShell, type in:

irm https://claude.ai/install.ps1 | iex

If you're using Windows CMD, enter:

curl -fsSL https://claude.ai/install.cmd -o install.cmd && install.cmd && del install.cmd

You'll need an Anthropic account with a Claude plan. Anthropic's getting started guide covers the basics, but all you have to do here is type "claude" and hit enter. Then follow the instructions to link your Claude account.

Create the project directory

Set up the file structure Claude Code will use to organize your data. Tell Claude Code to do it directly, pasting in your desired structure. Like this:

Claude Code creates SEO Dashboard project folder structure with data, fetchers, dashboard, and reports directories

Store it wherever makes most sense for you, and give the folder a suitable name. I created this in a folder called "experiments" and labeled the new subfolder "SEO Dashboard v1."

This might look a bit confusing at first, but here's everything we're setting up:

  • claude.md: A file that gives Claude Code automatic context about your site, competitors, and goals so you don't have to repeat yourself every session
  • Fetchers: Scripts that pull live data from your Google APIs and save it locally (we'll create these in the next step)
  • Data: This is where all your fetched data lives, organized by source
  • Exports: These are the live dashboard we'll build and the reports we'll generate

If you want to copy ours, here's what to paste into Claude Code:

Create the following file structure:

  • claude.md
  • fetchers/
  • data/
  • dashboard/
  • reports/

Create your claude.md file

Claude Code reads your claude.md file automatically and uses it as context for every session. This means you never have to repeat who you are, what site you're working on, or who your competitors are. You can ask Claude Code to create one for you, based on information about your business.

For Traffic Think Tank, the claude.md file looks like this:

Markdown project brief for Traffic Think Tank SEO analysis with site goals, competitors, data sources, and strategy context

Here's what to include (though you can give as much context as you want):

  • What your site is for
  • Who your main competitors are
  • The main topics your blog content targets
  • Any gated content/sections you have
  • The data sources you plan to connect

Step 2: Connect Google Search Console and Google Analytics

Google Search Console (GSC) and Google Analytics 4 (GA4) are your first-party data. These sources serve as the ground truth of what's actually happening on your site.

You have two options for connecting these data sources:

Option A: CSV exports

If you want to get up and running in five minutes, export and upload these reports to your Claude Code setup:

  • GSC: Go to Search Console, then "Performance" > "Export." Upload to Claude and tell it to save to data/gsc/.
  • GA4: Go to "Reports" > [any report you want to use] > "Share" > "Download." Upload to Claude and tell it to save to data/ga4/.

This is enough to run every analysis in this guide, but connecting live APIs gives you real-time data and makes the dashboards and reports more robust.

Option B: Live API connections

For live, up-to-date data that refreshes on demand, connect to Google's APIs using a service account. One service account covers both GSC and GA4.

There are quite a few steps involved in setting up a service account correctly. Google has documentation on the setup, but we've covered the most important steps below.

Claude Code can help you troubleshoot in real time. If you're unsure of what to do at any point, paste a screenshot of where you are directly into Claude Code and ask it what to do next. It can read the screen and walk you through the next step.

Just be aware that you'll be creating and sharing private keys here, and giving it access to your GSC and GA4 data. So if you're not sure, or if you're setting this up for clients, you may want to consult a developer.

Here's how to set up the API connections:

  1. Create a project in Google Cloud Console
  2. Enable the Search Console API and Google Analytics Data API (not the "Google Analytics API"). You can find this screen by searching for "enable APIs and services" at the top.
Google Cloud APIs dashboard showing Search Console and Analytics Data APIs enabled for SEO data collection
  3. Go to "IAM & Admin" > "Service Accounts" and create a new service account (you can choose "viewer" as the role here to limit permissions)
Google Cloud IAM Service Accounts page with Create Service Account button highlighted for API authentication setup
  4. Create and download the JSON key file via "Actions" > "Manage Keys" > "Add key" > "Create new key"
  5. Add the service account email (it looks like your-project@your-project-id.iam.gserviceaccount.com) as a user on your GSC property with read access
  6. Add the same email as a Viewer on your GA4 property
  7. Save the key file as service-account-key.json in your project root

Install Python dependencies

Python is the programming language that'll power much of what we're doing with this setup inside Claude Code. You don't need to know anything about how it works or how to use it. Just paste this line into your command line interface (CLI) or Claude desktop app:

pip install google-api-python-client google-auth google-analytics-data

This is the manual way to do it, but Claude Code may install this automatically.

This installs the libraries Claude Code can then use to build the scripts we need to extract data from our sources. These are called fetcher scripts, and we'll create them soon.

Create a config file

Create a config file Claude Code will use when running the fetcher scripts. This is where you add your brand name, domain, and IDs for each data source.

Do this for yourself if applicable, and create one for each client:

{
  "name": "Client Name",
  "domain": "example.com",
  "gsc_property": "https://www.example.com/",
  "ga4_property_id": "1234567890",
  "google_ads_customer_id": "1234567890",
  "industry": "Example Industry",
  "competitors": [
    "https://competitor1.com/",
    "https://competitor2.com/"
  ]
}
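Your fetcher scripts can read this config at startup so every data source uses the same domain and property IDs. Here's a minimal loader sketch of my own (not part of the article's setup), assuming keys named domain, gsc_property, and ga4_property_id as in the config above:

```python
import json

def load_config(path="config.json"):
    """Load a client config file and check the fields the fetchers rely on."""
    with open(path) as f:
        config = json.load(f)
    # Fail early if a fetcher would be missing its identifiers
    for field in ("domain", "gsc_property", "ga4_property_id"):
        if not config.get(field):
            raise ValueError(f"config is missing required field: {field}")
    return config
```

Failing early on a missing field saves you from half-finished fetch runs when you switch between client configs.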

Build the fetcher scripts

Fetcher scripts use the service account you created in Google Cloud to pull information from sources like Google Search Console and Google Analytics.

Just paste in the script below, which is ready to use:

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/webmasters.readonly']

def get_gsc_service():
    credentials = service_account.Credentials.from_service_account_file(
        'service-account-key.json', scopes=SCOPES
    )
    return build('searchconsole', 'v1', credentials=credentials)

def fetch_queries(service, site_url, start_date, end_date):
    response = service.searchanalytics().query(
        siteUrl=site_url,
        body={
            'startDate': start_date,
            'endDate': end_date,
            'dimensions': ['query'],
            'rowLimit': 1000
        }
    ).execute()
    return response.get('rows', [])

You can increase "rowLimit" for larger sites, but the GSC API has a daily limit of 50,000 rows per search type. Every time you call the API in this workflow, you use up a portion of that limit.

I recommend keeping the row limit to 1-5K. This gives you a reasonable amount of data while still letting you perform several powerful analyses per day.

Claude Code reads, runs, and iterates on this script for you. It already knows the GSC API, so you don't need to read a line of documentation.
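The rows fetch_queries returns are nested dicts (the query string sits inside a "keys" list), so it helps to flatten them before they land in data/gsc/. This helper is my own sketch, not part of the original script, and the file path is just an example:

```python
import csv
import os

def save_gsc_rows(rows, path="data/gsc/queries.csv"):
    """Flatten GSC API rows ({'keys': ['query'], 'clicks': ..., ...}) into a CSV."""
    if os.path.dirname(path):
        os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["query", "clicks", "impressions", "ctr", "position"]
        )
        writer.writeheader()
        for row in rows:
            writer.writerow({
                "query": row["keys"][0],  # our only dimension was 'query'
                "clicks": row.get("clicks", 0),
                "impressions": row.get("impressions", 0),
                "ctr": row.get("ctr", 0.0),
                "position": row.get("position", 0.0),
            })
```

A flat CSV like this is also the shape Claude Code handles most reliably when you ask it to summarize or chart the data later.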

Do the same for GA4 by pasting in this fetcher:

from google.oauth2 import service_account
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    RunReportRequest, DateRange, Metric, Dimension
)

def get_ga4_client():
    credentials = service_account.Credentials.from_service_account_file(
        'service-account-key.json',
        scopes=['https://www.googleapis.com/auth/analytics.readonly']
    )
    return BetaAnalyticsDataClient(credentials=credentials)

def fetch_traffic_by_channel(client, property_id, start_date, end_date):
    request = RunReportRequest(
        property=f"properties/{property_id}",
        date_ranges=[DateRange(start_date=start_date, end_date=end_date)],
        dimensions=[Dimension(name="sessionDefaultChannelGroup")],
        metrics=[
            Metric(name="sessions"),
            Metric(name="totalUsers"),
            Metric(name="bounceRate"),
        ]
    )
    return client.run_report(request)

Shout out to Will Scott for the code we used here, which we took from this Search Engine Land article.

Ask Claude Code to wrap the fetchers in a single "run_fetch.py" orchestrator. This will let you easily update the data for any dashboards or reports you create.
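The orchestrator Claude Code writes can be as simple as a dictionary mapping source names to fetcher functions plus a --sources flag. This is a rough sketch of that shape under my own assumptions, with print statements standing in for the real fetchers above so it runs on its own:

```python
import argparse

# Map source names to fetcher functions. In the real run_fetch.py these
# would be the GSC/GA4 fetchers defined earlier; the prints are stand-ins.
FETCHERS = {
    "gsc": lambda: print("fetching GSC data..."),
    "ga4": lambda: print("fetching GA4 data..."),
}

def run(argv=None):
    parser = argparse.ArgumentParser(description="Refresh cached SEO data")
    parser.add_argument("--sources", default="gsc,ga4",
                        help="comma-separated list of sources to refresh")
    args = parser.parse_args(argv)
    fetched = []
    for name in args.sources.split(","):
        name = name.strip()
        if name in FETCHERS:
            FETCHERS[name]()
            fetched.append(name)
        else:
            print(f"skipping unknown source: {name}")
    return fetched
```

Registering fetchers in a dict means adding Google Ads or Semrush later is one new entry, not a rewrite.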

Run the scripts and verify your data

Ask Claude Code to run the fetcher scripts. Then perform a quick sanity check on the data by asking Claude Code to:

Read the GSC and GA4 data in the data/ directory. Summarize what we have: how many queries, how many pages, what date range, and what metrics are available.

Claude Code summarizes imported GSC and GA4 data, showing query counts, landing pages, metrics, and SEO dataset coverage

If Claude Code can read and summarize the data, you're set. If you run into issues, quote the inconsistencies and ask Claude to explain and/or troubleshoot the discrepancies.

We'll cover how to verify findings before sharing them with clients in Step 7.

Step 3: Connect Google Ads (if applicable)

If you run Google Ads for your website, connecting the Google Ads interface to your Claude Code setup lets you perform some powerful analyses across both paid and organic data.

What you need

Google Ads uses a different authentication flow than GSC and GA4. Instead of a service account, you need OAuth 2.0 credentials and a developer token, which you can only get if you have a manager account.

To set up the Google Ads API connection:

  • Developer token: From the Google Ads API Center ("Tools & Settings" > "Setup" > "API Center"). For agency use, describe it as "automated reporting for marketing clients." Approval may not be instant.
  • OAuth 2.0 client: Create this in Google Cloud Console. It's a separate credential from your service account. See the Google documentation for more on the setup.
  • Refresh token: Generate this through a one-time browser authentication flow

If you have a Manager Account (MCC), for example if you run an agency, one developer token and one refresh token cover all your sub-accounts. You change the customer ID per client.

Install the dependency

As you did with the Python dependencies for Google Search Console and Analytics, paste the following into Claude Code to install the dependency for Google Ads:

pip install google-ads

Build the fetcher

Here's the code to paste in for the Google Ads fetcher (tell Claude Code to add it to fetchers/):

from google.ads.googleads.client import GoogleAdsClient

client = GoogleAdsClient.load_from_storage("google-ads.yaml")
ga_service = client.get_service("GoogleAdsService")

query = """
SELECT
    search_term_view.search_term,
    metrics.impressions,
    metrics.clicks,
    metrics.cost_micros,
    metrics.conversions
FROM search_term_view
WHERE segments.date DURING LAST_30_DAYS
ORDER BY metrics.impressions DESC
"""

response = ga_service.search(customer_id="1234567890", query=query)

Like with the first two fetchers, ask Claude Code to wrap this fetcher in the "run_fetch.py" orchestrator so you can easily request updated data when you need it.

If you're still getting API access set up or waiting on approval, download 90 days of search terms data as a CSV directly from the Google Ads UI and tell Claude to add it to the data/ads/ folder. You can upgrade to live API connections later, but this lets you start using this setup immediately.
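One detail worth knowing before the rows land in a report: cost_micros is reported in millionths of the account currency, so divide by 1,000,000 before anyone reads it as spend. A small conversion helper (my own sketch, not from the fetcher above):

```python
def ads_row_to_dict(row):
    """Flatten one search_term_view result row, converting cost_micros
    (millionths of the account currency) into whole currency units."""
    return {
        "search_term": row.search_term_view.search_term,
        "impressions": row.metrics.impressions,
        "clicks": row.metrics.clicks,
        "cost": row.metrics.cost_micros / 1_000_000,
        "conversions": row.metrics.conversions,
    }
```

So a row with cost_micros of 12,500,000 becomes 12.5 in your currency, which is what you want Claude Code comparing against organic numbers.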

Step 4: Add Semrush's competitive intelligence

Google's APIs tell you what's happening on your site. Semrush tells you what's happening across the entire market. This includes:

  • Competitor keywords
  • Backlink profiles
  • Keyword search volumes
  • Keyword difficulty scores
  • Traffic estimates

The connection uses MCP (Model Context Protocol), an open standard for connecting AI tools to external data sources. Essentially, this lets Claude Code and Semrush talk to each other and share data, which makes the kinds of reports and analyses you can perform much more powerful.

Here's how to connect Semrush MCP to Claude Code:

Check your eligibility

Semrush MCP is available on Semrush One (Starter and Pro+) and SEO Classic (Pro and Guru) plans, with 50,000 API units included each month. SEO Classic Business and Semrush One Advanced can also use Semrush MCP, but you'll need to add an API units package.

Check your plan and remaining API units in the "Subscription info" tab of your profile.

Semrush subscription dashboard highlighting API units balance and available credits for MCP-powered SEO data access

Connect Semrush via MCP

To connect the Semrush MCP if you're using the desktop app, click "+" > "Connectors" > "Manage connectors" > "+" > "Add custom connector."

Name it something like "Semrush MCP" and paste "https://mcp.semrush.com/v1/mcp" into the "Remote MCP Server URL" box:

Claude custom connector setup for Semrush MCP with server URL entered before adding the integration

To connect Semrush to your Claude Code setup in the terminal, paste in the following command:

claude mcp add semrush https://mcp.semrush.com/v1/mcp -t http

You might need to approve the connection.

Claude Code terminal command adding Semrush MCP server and requesting approval to connect Semrush tools

Once the connection is added, authenticate using the steps below.

Claude MCP server manager showing Semrush MCP successfully connected and authenticated

Authenticate your account

Like you did when you authenticated your Anthropic account at the start, you need to connect your Semrush account. To do this, type "/mcp" into Claude Code, select Semrush from the list, click "Authenticate," and follow the login instructions.

If you're using the desktop app, the authentication process should start automatically. Follow the on-screen instructions.

Semrush data is now accessible in every Claude Code session.

Claude Code confirms Semrush MCP server added to config and prompts restart to enable live Semrush access

Test it

Test that your Semrush connection is working by pasting in a prompt like:

Show me the top 10 organic keywords for [yourdomain.com] in the US. Include position, volume, keyword difficulty, and the ranking URL.

If you see a data table, the connection is working.

What's available through Semrush MCP

You can access the following data through the Semrush MCP:

  • Analytics API: Organic/paid keywords, keyword research (volume, KD, SERP features, intent), backlinks (referring domains, anchors, Authority Score), domain comparisons, and competitor analysis
  • Trends API: Traffic estimates, traffic sources, top pages, audience demographics (requires a Trends API subscription)
  • Projects API (read-only): Position Tracking and Site Audit results from your existing Semrush projects

For more on setting this up, check out the Semrush MCP documentation.

Step 5: Cache your Semrush data for the dashboard

Before we build the dashboard, we need to save Semrush data locally alongside the Google data. MCP pulls data live, which is great for ad hoc analysis. But a dashboard needs a stable dataset to render from.

Ask Claude Code to:

Run the following Semrush reports for [yourdomain.com] and save each as a JSON file in data/semrush/:

1. Top 200 organic keywords (US) with position, volume, KD, URL, and traffic estimate into organic_keywords.json

2. Top 50 referring domains by Authority Score into referring_domains.json

3. Top 20 organic pages by estimated traffic with keyword count into top_pages.json

4. Domain overview (organic traffic, keyword count, Authority Score) for [yourdomain.com] plus competitors [list 3-5 competitor domains] into competitive_overview.json

Claude Code will make several Semrush MCP calls and save structured JSON files. Now you have all four data sources (GSC, GA4, Ads if applicable, and Semrush) sitting in your data/ directory, ready to power a dashboard.
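Before wiring the dashboard to these files, it's worth a quick check that each one exists and parses. A small checker of my own, using the four file names from the caching prompt above:

```python
import json
from pathlib import Path

EXPECTED_FILES = [
    "organic_keywords.json",
    "referring_domains.json",
    "top_pages.json",
    "competitive_overview.json",
]

def check_semrush_cache(directory="data/semrush"):
    """Return {filename: record count}, with -1 for missing or invalid files."""
    results = {}
    for name in EXPECTED_FILES:
        path = Path(directory) / name
        try:
            data = json.loads(path.read_text())
        except (OSError, json.JSONDecodeError):
            results[name] = -1
            continue
        results[name] = len(data) if isinstance(data, list) else 1
    return results
```

A -1 next to a file name tells you which MCP pull to re-run before blaming the dashboard for an empty panel.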

Add AI visibility data to go a step further

Semrush's AI Visibility Toolkit isn't currently available via the MCP, but you can still add data from it to your Claude Code setup. Cross-checking your SEO metrics against your AI performance gives you a more complete picture of your overall online visibility.

Export whatever data you need from within Semrush and upload it to your Claude Code setup, asking Claude to include it in /data/semrush as a new CSV file.

For example, you could export your top-performing prompts and your topic opportunities from the "Visibility Overview" tool.

Semrush Topics and Sources dashboard with top-performing prompts export option for AI visibility research

When you're performing analyses or building your dashboard, ask Claude to identify easy-win prompt optimization opportunities that align with your keyword opportunities. You'll improve your SEO and your AI search optimization at the same time.

Step 6: Build the dashboard

Build an interactive dashboard based on the data and connections you've set up. Instead of a terminal output you read once and forget, you get a live dashboard that visualizes all your data sources in one place. You can open it in a browser, share it with your team, and update the data monthly.

What the dashboard will include

You'll see five panels in the dashboard, each pulling from different data sources:

  1. Organic Overview (GSC + Semrush): Total impressions, clicks, and top queries from GSC alongside organic keyword count, traffic estimate, and Authority Score from Semrush. Use this to baseline your overall organic performance against competitors.
  2. Striking Distance Keywords (GSC + Semrush): Keywords at positions 5-20 from GSC, enriched with KD and volume from Semrush, all sortable and filterable. These are keywords you may want to start optimizing first to see the fastest results.
  3. Competitive Gap Map (Semrush): A keyword overlap visualization showing your domain versus your top competitors. It highlights clusters where your domain has no presence, i.e., clear competitive gaps.
  4. Content Performance (GA4 + Semrush): Top pages by sessions from GA4, enriched with ranking keyword count and referring domains from Semrush. This flags potentially thin content that only ranks for a few terms.
  5. Backlink Intelligence (Semrush): Top referring domains, Authority Score distribution, and competitor comparison. Use this to inform your link building efforts.
Custom SEO dashboard for Traffic Think Tank showing GSC, Semrush, and competitor traffic comparison metrics

(I asked Claude to add some extra functionality, like a light/dark mode selector, a sixth tab for paid versus organic performance, and a date selector.)

Claude Code can build the entire dashboard as a single-page web app. Here's the prompt to build yours:

Build a dashboard web app in dashboard/index.html that reads JSON data from our data/ directory. Use Tailwind CSS for styling and Chart.js for visualizations. The dashboard should have five panels:

1. ORGANIC OVERVIEW: Show total GSC impressions, clicks, and average position for the period. Next to it, show Semrush's organic keyword count, estimated traffic, and Authority Score from competitive_overview.json. Include a comparison bar chart of estimated organic traffic for [yourdomain.com] vs. competitors.

2. STRIKING DISTANCE KEYWORDS: A sortable table of keywords from GSC where position is 5-20, enriched with volume and KD from Semrush organic_keywords.json. Color-code KD (green <30, yellow 30-50, red >50). Add a filter for KD range.

3. COMPETITIVE GAP MAP: A bar chart showing keyword count by topic cluster where competitors rank but [yourdomain.com] doesn't. Use the cached competitive data. Show the top 10 gap clusters by total volume.

4. CONTENT PERFORMANCE: A table of top blog pages from GA4, showing sessions and bounce rate, plus Semrush's ranking keyword count and referring domains per page. Highlight pages with <5 ranking keywords as "thin content."

5. BACKLINK INTELLIGENCE: A horizontal bar chart of the top 20 referring domains by Authority Score. Show total referring domain count and a comparison to competitors. Make it responsive. Use a dark sidebar with navigation. Include a header showing "[Your Brand] — SEO Dashboard" and the data freshness date. The dashboard should load data from relative paths to the data/ directory.

Claude Code will generate the complete HTML file with embedded JavaScript. It reads your actual JSON data files and builds charts from them.

Launch and iterate

Type this into Claude Code to launch your dashboard locally:

cd dashboard
python3 -m http.server 8080
# Open http://localhost:8080 in your browser

From here, you can iterate on the layout and the specific functionality you need. Just ask Claude Code to refine the dashboard to your needs with prompts like:

  • "Add a date picker to the header that filters the GSC data by date range."
  • "Add a sixth panel that shows the paid-organic overlap data from data/ads/."
  • "Make the striking distance table exportable as CSV."
  • "Add sparklines showing position trends for each keyword."
  • "Add month-over-month comparisons and other long-term tracking views."

Claude Code edits the existing dashboard file. Refresh your browser to see the result immediately.

You can ask Claude Code to make almost any change to the dashboard. If something doesn't work, keep iterating or simply say "revert to the previous version."

Treat the dashboard as yours to shape.

Share the dashboard

You can share the dashboard with colleagues or clients by deploying the single HTML file to a hosting service like Vercel. You can ask Claude Code for help here, or consult a developer.

Regularly update the data

Update the data in your dashboard regularly, and whenever you want to build a report.

Do this by entering this command into Claude Code when you want to update the data (leave out "ads" if you don't have that data source connected yet):

python3 run_fetch.py --sources gsc,ga4,ads,semrush

You can ask Claude Code to set up a scheduled task to refresh the data automatically (your computer will need to be on and awake). Anthropic also released [Claude Code Routines](https://code.claude.com/docs/en/routines), which run on Anthropic's cloud infrastructure and don't require your local machine to be on. Routines are currently in research preview.

Alternatively, you can set up automatic refreshes yourself using cron jobs if you're more technical. But a quick manual refresh will work for most people at this stage.

Step 7: Download your first report

You can use Claude Code to generate client-ready reports based on the data and connections you've set up.

Either ask Claude for reports based on specific data or actions you want to highlight, or use this prompt to instantly generate a report that covers some of the most important areas:

Generate a full SEO opportunity report for [yourdomain.com] as a Word doc. Use all data sources available: GSC (queries.csv), GA4 (traffic_by_channel.csv), Semrush keywords/pages/domains/competitive overview. Structure it as follows:

1. Executive Summary: 3-5 bullet "state of the site" findings backed by numbers

2. Quick Wins (next 30 days): GSC keywords at positions 5-20 enriched with Semrush KD/volume. For each, estimate the monthly click uplift if it moved to position 3 (use GSC average position and CTR data to estimate this). Prioritize by (uplift multiplied by the inverse of KD).

3. Content Gap Analysis: Topic clusters from Semrush keywords vs. actual GSC clicks. Which clusters have high search volume but low click capture? What content is missing or thin?

4. Top Pages Audit: Cross-reference Semrush top pages with GSC. Flag pages with high Semrush-estimated traffic but low GSC clicks (potential ranking drops or cannibalization). Flag thin content pages (<5 ranking keywords).

5. Competitive Benchmarking: Compare [yourdomain.com] vs. all four competitors across keywords, traffic, and domain metrics. Where is the biggest gap, and what type of content closes it?

6. Backlink Opportunities: Given the referring domain profile (Authority Scores, sources), what's the link acquisition strategy? Where are competitors getting links that [yourdomain.com] isn't?

7. Prioritized Action Plan: A scored table of all recommended actions: effort (Low/Med/High), impact (Low/Med/High), and suggested owner (writer / developer / outreach)
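To sanity-check the scoring in point 2 above (click uplift weighted by the inverse of KD), you can reproduce the math yourself. This sketch assumes a roughly 10% CTR at position 3, which is a common benchmark but my own assumption, not a figure from this article:

```python
def quick_win_score(impressions, clicks, kd, ctr_at_position_3=0.10):
    """Estimated monthly click uplift if the query moved to position 3,
    weighted by the inverse of keyword difficulty (KD)."""
    projected_clicks = impressions * ctr_at_position_3  # assumed CTR benchmark
    uplift = max(projected_clicks - clicks, 0)  # no negative uplift
    return uplift * (1 / max(kd, 1))  # guard against a KD of 0
```

For example, a query with 5,000 impressions, 120 current clicks, and KD 25 projects to 500 clicks at position 3, an uplift of 380, and a score of 15.2. If Claude's report ranks queries in a noticeably different order, ask it to show its working.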

Claude Code can generate a bunch of different file types to suit your needs, including:

  • PDFs
  • Word docs
  • CSVs
  • PowerPoints
  • Plain text

But you might need to install extra dependencies within Claude Code for it to work. I've found it's best to first ask Claude Code what it needs to generate the specific format you require.

When generating a docx, Claude Code asked for several permissions and folder accesses I didn't expect. Stopping the process and asking "what will you need to generate a docx file?" gave me two clear options. I chose one, and the report generated cleanly.

Claude Code generates SEO report document after analyzing GSC, GA4, and Semrush data with Python docx setup

Iterating with Claude Code is normal. Expect to ask follow-ups to land where you want.

Verify the data

The dashboards and reports Claude Code can generate are impressive, but they're not client-ready without you first checking everything over. LLMs can occasionally misinterpret data or generate a chart that doesn't match the underlying numbers.

Claude can also confidently combine numbers in ways that are mathematically correct but analytically wrong. It could misattribute channel data, for example, which could affect decisions you or your clients make.

Before you share the dashboard or act on anything, make sure you:

  • Cross-reference dashboard numbers against the raw JSON files (or even the raw data within the tools themselves)
  • If a finding looks too dramatic, open Claude Code and ask it to show you the raw data behind it
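That cross-referencing can itself be scripted: total a metric from the raw rows and compare it to the headline number the dashboard shows. A hedged sketch of my own (the field names and tolerance are illustrative):

```python
def totals_match(raw_rows, dashboard_total, key, tolerance=0.01):
    """Check a dashboard headline figure against a sum over the raw rows,
    allowing a small relative tolerance for rounding."""
    raw_total = sum(row.get(key, 0) for row in raw_rows)
    if raw_total == 0:
        return dashboard_total == 0
    return abs(raw_total - dashboard_total) / raw_total <= tolerance
```

If this returns False for a panel, that's your cue to open Claude Code and ask it to walk through how it produced the number.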

Step 8: Perform cross-source analyses

Running cross-source analyses is where you can leverage the connection between first-party data (from GSC, GA4, and Google Ads) and Semrush competitive data. This lets you interact with the data at any time and for just about any purpose you can imagine.

The conversational nature of working with an LLM allows for deep, iterative conversations. You ask Claude Code to perform an analysis, it performs it, you push back or ask for further clarification, and you get more insight than static numbers could give you.

Below are my five favorite ways to interact with Claude Code for SEO. For each analysis, I'll tell you the sources it pulls from, why the analysis is useful, and a prompt you can copy and paste to perform the analysis yourself, right now.

Analysis #1: Prioritize GSC queries with competitive difficulty

First-party source: GSC (real impressions and positions)

Third-party source: Semrush (keyword difficulty)

Prompt: Read the GSC data in data/gsc/. Find queries for [yourdomain.com] where position is between 5 and 15 with more than 500 impressions. For each, pull keyword difficulty and search volume from Semrush. Show only queries where KD is below 35. Sort by impressions descending. These are [yourdomain.com's] easiest wins.

Why this is useful: GSC tells you which queries are real (actual impressions from actual searchers). Semrush tells you how hard each one is to rank for. Together, they give you a prioritized optimization list that neither source can produce on its own.
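The filtering logic behind this prompt is simple enough to sanity-check by hand. A minimal sketch, assuming hypothetical rows where GSC metrics have already been joined with Semrush KD and volume (field names and figures are invented, not an actual API schema):

```python
# Hypothetical merged rows: GSC metrics joined with Semrush KD/volume.
queries = [
    {"query": "seo communities", "position": 7.2, "impressions": 4800, "kd": 28, "volume": 1300},
    {"query": "learn seo", "position": 12.1, "impressions": 9500, "kd": 62, "volume": 8100},
    {"query": "best seo books", "position": 6.4, "impressions": 2100, "kd": 31, "volume": 900},
    {"query": "seo tips", "position": 3.0, "impressions": 7000, "kd": 40, "volume": 5400},
]

easiest_wins = sorted(
    (q for q in queries
     if 5 <= q["position"] <= 15       # striking distance, not yet page-one leaders
     and q["impressions"] > 500        # proven real search demand
     and q["kd"] < 35),                # winnable difficulty
    key=lambda q: q["impressions"],
    reverse=True,
)

for q in easiest_wins:
    print(q["query"], q["impressions"], q["kd"])
# -> seo communities 4800 28
# -> best seo books 2100 31
```

Knowing the mechanics also makes it easier to push back on Claude's output: if a query you expected is missing from the list, you can ask which filter excluded it.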

Analysis #2: Competitive keyword gap

First-party source: GSC (what TTT actually ranks for)

Third-party source: Semrush (competitor keyword data)

Prompt: Read [yourdomain.com's] GSC query data. Then pull the top organic keywords for [competitor 1] and [competitor 2] from Semrush in the US. Find keywords where either [competitor 1] or [competitor 2] ranks in the top 10 but [yourdomain.com] has no GSC impressions at all, meaning we're completely invisible. Filter to volume above 300. Group results by topic cluster and rank clusters by total volume.

Why this is useful: Using GSC data instead of Semrush data means we're working with ground truth, not estimates. If a keyword has zero impressions in GSC, our site really isn't showing up. Combining that with Semrush's data for competitors still gives us an effective keyword gap analysis.

Follow-up prompt: For the top 3 gap clusters by volume, show all keywords in each cluster with volume, KD, and who ranks #1. Then check: does [yourdomain.com] have any existing blog content that could be expanded to target these?

The follow-up prompt takes it from keyword gap analysis to content optimization planning.
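At its core, the gap analysis is a set difference: competitor top-10 keywords minus anything your site already gets impressions for. A sketch with invented data (keyword names, positions, and volumes are illustrative):

```python
# Queries our site gets at least one GSC impression for.
our_gsc_queries = {"seo communities", "best seo books", "learn seo"}

# Hypothetical competitor rankings pulled from Semrush.
competitor_rankings = [
    {"keyword": "seo certifications", "position": 3, "volume": 2400},
    {"keyword": "seo courses", "position": 8, "volume": 6600},
    {"keyword": "best seo books", "position": 5, "volume": 900},            # we already show up
    {"keyword": "seo glossary", "position": 14, "volume": 1900},            # competitor not top 10
    {"keyword": "technical seo checklist", "position": 2, "volume": 260},   # volume too low
]

gaps = [
    k for k in competitor_rankings
    if k["position"] <= 10                    # competitor is in the top 10
    and k["volume"] > 300                     # meaningful search demand
    and k["keyword"] not in our_gsc_queries   # zero impressions = invisible
]

for k in sorted(gaps, key=lambda k: k["volume"], reverse=True):
    print(k["keyword"], k["volume"])
# -> seo courses 6600
# -> seo certifications 2400
```

Claude Code adds the part that's hard to script: grouping the surviving keywords into topic clusters and matching them against your existing content.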

Analysis #3: Content performance audit with authority data

First-party source: GA4 (sessions, bounce rate, engagement)

Third-party source: Semrush (ranking keywords, backlink profile per page)

Prompt: Read the GA4 top pages data in data/ga4/. For [yourdomain.com's] top 20 blog pages by sessions, pull from Semrush: number of ranking keywords, estimated organic traffic, and number of referring domains for each URL. Flag pages where the ratio of sessions to ranking keywords is low; these are pages with traffic but thin topical coverage that could be expanded.

Why this is useful: GA4 shows you which pages actually drive engagement. Semrush shows the competitive strength of each page (via keyword count and backlink data). A page with high sessions but only three ranking keywords and few backlinks is fragile: one algorithm update could tank it. These are the pages to prioritize for optimizing around more related keywords and building backlinks.
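The "fragile page" test is just a threshold check across the merged metrics. A sketch under stated assumptions (the thresholds, field names, and page data below are invented; tune the cutoffs to your site's scale):

```python
# Hypothetical GA4 sessions joined with Semrush keyword/backlink counts.
pages = [
    {"url": "/blog/seo-books", "sessions": 5200, "ranking_keywords": 3, "ref_domains": 4},
    {"url": "/blog/learn-seo", "sessions": 4100, "ranking_keywords": 85, "ref_domains": 60},
    {"url": "/blog/seo-tools", "sessions": 900, "ranking_keywords": 2, "ref_domains": 1},
]

def is_fragile(p, min_sessions=1000, max_keywords=10, max_domains=10):
    """High traffic resting on thin rankings and a weak link profile."""
    return (p["sessions"] >= min_sessions
            and p["ranking_keywords"] <= max_keywords
            and p["ref_domains"] <= max_domains)

fragile = [p["url"] for p in pages if is_fragile(p)]
print(fragile)  # -> ['/blog/seo-books']
```

Note that /blog/seo-tools is just as thin but falls below the traffic floor; there's little at stake, so it isn't flagged. The point of the check is to protect pages where you'd actually feel the loss.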

Analysis #4: High-impression, low-CTR title tag opportunities

First-party source: GSC (impressions, CTR, positions)

Third-party source: Semrush (SERP analysis, competitor title tags)

Prompt: From the GSC data, find [yourdomain.com's] pages with more than 2,000 impressions but CTR below 2%. For each, pull the Semrush keyword data to find the primary keyword driving impressions, and use web search to see what the SERP results look like for that keyword. Generate 3 improved title tag options for each page based on what's working for competitors.

Why this is useful: High impressions but low CTR means Google thinks your page is relevant, but searchers aren't clicking. That's usually a title tag or meta description problem. Your GSC data identifies the pages, Semrush shows which pages are winning for the same keywords, and Claude uses web search to find the corresponding titles. It then makes recommendations based on all of these data sources.
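To decide which of these pages to rewrite first, it helps to size the opportunity. A rough sketch, assuming conservative benchmark CTRs by position (the benchmark figures here are assumptions for illustration, not GSC data; real position-level CTRs vary widely by query type):

```python
# Assumed benchmark CTRs per rounded SERP position (illustrative only).
BENCHMARK_CTR = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def extra_clicks(impressions, current_ctr, position):
    """Estimated monthly clicks gained if CTR rose to the benchmark."""
    benchmark = BENCHMARK_CTR.get(round(position), 0.03)
    return max(0, int(impressions * (benchmark - current_ctr)))

# A page with 12,000 impressions at position ~3 but only 1.5% CTR:
print(extra_clicks(12000, 0.015, 3.2))  # -> 1020
```

Sorting your low-CTR pages by this estimate tells you where a better title tag buys the most clicks, so you can hand Claude the highest-leverage pages first.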

Analysis #5: Paid vs. organic overlap (for sites running Google Ads)

Note: This analysis requires Google Ads data, which isn't applicable to Traffic Think Tank. But for businesses that do run ads, this is one of the highest-value analyses in the entire setup.

First-party source: Google Ads (search terms, spend, CPC)

Third-party source: Semrush (organic rankings)

Prompt: Read the Google Ads search terms data in data/ads/. Cross-reference it with Semrush organic keyword data for [yourdomain.com]. Find keywords where we're paying for clicks but already rank in the top 3 organically. Show keyword, organic position, ad spend, CPC, and monthly paid clicks. Calculate total potential savings if we paused ads on terms where we rank in the top 3.

Why this is useful: Your Ads data shows what you're spending, and Semrush confirms where you already rank. The overlap is wasted budget. This kind of analysis can surface thousands of dollars in monthly savings for clients, money that can be reallocated to keywords where you (or they) have no organic presence.
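The savings math is straightforward: paid clicks times CPC, summed over every term with a top-3 organic position. A sketch with invented numbers (keywords, positions, clicks, and CPCs below are all for illustration):

```python
# Hypothetical Google Ads search terms joined with Semrush organic positions.
terms = [
    {"keyword": "seo community", "organic_position": 1, "monthly_paid_clicks": 400, "cpc": 2.10},
    {"keyword": "seo mentorship", "organic_position": 9, "monthly_paid_clicks": 150, "cpc": 3.40},
    {"keyword": "seo slack group", "organic_position": 2, "monthly_paid_clicks": 90, "cpc": 1.50},
]

# Overlap: terms we pay for while already ranking in the organic top 3.
overlap = [t for t in terms if t["organic_position"] <= 3]
savings = sum(t["monthly_paid_clicks"] * t["cpc"] for t in overlap)

for t in overlap:
    print(t["keyword"], f"${t['monthly_paid_clicks'] * t['cpc']:.2f}")
print(f"Total potential monthly savings: ${savings:.2f}")
# -> seo community $840.00
# -> seo slack group $135.00
# -> Total potential monthly savings: $975.00
```

One caveat worth keeping in mind before pausing anything: paid and organic listings can cannibalize each other, so treat the figure as an upper bound and test pausing a few terms before cutting the whole overlap.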

Start using Claude Code for SEO today

Here's how to fold this Claude Code setup into your regular SEO workflows:

  1. Once a month, open the dashboard. Ask Claude to refresh the data, then check the five panels. What has changed since last month?
  2. Regularly ask cross-source questions. Open Claude Code and ask about anything that looks interesting; the dashboard shows you the what, Claude Code tells you the why.
  3. Take action. Update title tags for low-CTR pages. Plan content for gap clusters. Brief your link building team on referring domain targets.

This process doesn't replace:

  • The Semrush UI: You still need Semrush itself for configuring Position Tracking, Site Audit, keyword lists, or client dashboards. The MCP is read-only for project data.
  • Strategic judgment: The dashboard and Claude Code surface insights fast. Whether you should act on them (by investing in content, shifting ad budget, or consolidating pages) is your call.
  • Ongoing monitoring: For automated alerts and daily tracking, you still want the Semrush platform. This setup is for monthly reviews and on-demand deep dives.

To start using the Semrush MCP inside Claude Code today, sign up for a Semrush One subscription.
