Finding local links is a challenge, but looking to the competition can help point you in the right direction.  In this post I'll show how you can quickly build a report that uncovers local link opportunities.

This post is geared toward SEOs with a little Python knowledge, but I'll provide all of the code you need to get started.

What you'll need to follow along

  • Persuaded.io API key. We'll use this API to find the top 20 GMB listings for a given area and search query. The API performs a geolocated grid search, so we get the highest-visibility listings.
  • Serpstat API key. We'll use Serpstat to find all of the backlinks for the GMB listings we found in the first step. API access is included starting with the $69 Lite plan.

What is the result going to look like?

The script uses both APIs to build a spreadsheet of the most common domains linking to the GMB websites in the area.

Each referring domain is counted once per GMB website that it links to, so multiple links to the same website don't inflate the count.
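
To make that concrete, here's a minimal sketch of the counting logic. The two law-firm websites are made up, and the referring domains are just examples; this isn't output from the script, only an illustration of how the tally works.

from collections import defaultdict

# Hypothetical example: referring domains found for two made-up GMB websites.
links_by_site = {
    "example-law-firm.com": ["yellowpages.com", "yellowpages.com", "justia.com"],
    "another-firm.com": ["yellowpages.com", "newjerseylawtv.com"],
}

link_counts = defaultdict(int)
for site, referring_domains in links_by_site.items():
    # set() collapses repeat links, so a referring domain scores at most
    # one point per GMB website it links to.
    for domain in set(referring_domains):
        link_counts[domain] += 1

# yellowpages.com -> 2, justia.com -> 1, newjerseylawtv.com -> 1
print(dict(link_counts))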

Common citation sites like yellowpages.com and justia.com will appear frequently, but you'll also find regional results like newjerseylawtv.com.

How to get a Persuaded.io API Key

API access to Persuaded.io is available for all accounts, including free trials, so it's easy to get started and try it out.

First, head to the app and sign in with an email address or a Google account.

Once you're signed in, you'll see an Account button at the bottom of the sidebar. Click it to open the Account tab, where you can copy your API key.

Running the Script

The script should be usable with minimal setup. The requests module is the only third-party package it uses, so make sure it's installed (pip install requests) before running the script.

The script may take a few minutes to run because the Serpstat Lite plan limits requests per second, but it's otherwise hands-off. When it finishes, you'll have a CSV file named linkintersect.csv that contains the referring domains summary previewed above.
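
Once the file exists, you'll probably want to skim past the big national directories and zero in on the regional opportunities. Here's a minimal sketch of one way to filter the output after the script below has run; the skip list is just an example, not something the script builds for you.

import csv

# Example skip list of national citation sites you already know about.
SKIP_DOMAINS = {"yellowpages.com", "justia.com"}

with open("linkintersect.csv", newline="") as csv_file:
    for row in csv.DictReader(csv_file):
        if row["Domain"] in SKIP_DOMAINS:
            continue
        print(row["Domain"], row["Count"])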

The entire script is available below.  Don't forget to plug your own API keys into the key variables at the top of the script!

from collections import defaultdict
import requests
import time
import csv

persuaded_key = "Your Persuaded.io API Key Here"
serpstat_key = "Your Serpstat API Key Here"

# Trigger a Google Maps scan to discover the top ranking GMB listings.
pending_task = requests.post(
    "https://app.persuaded.io/api/keywords/scans/",
    params={"api_key": persuaded_key},
    json={
        "keyword": "personal injury",
        "lat": 40.0583,
        "lng": -74.4057,
        "dimensions": 6,
        "distance": 1000,
    },
).json()

# Poll until Persuaded.io returns the scan results.
results = []
while True:
    pending_task = requests.get(
        "https://app.persuaded.io/api/task/{}/".format(pending_task["id"]),
        params={"api_key": persuaded_key},
    ).json()

    if pending_task["status"] == "COMPLETE":
        results = requests.get(
            "https://app.persuaded.io/api/keywords/{}/results/".format(
                pending_task["ref_id"]
            ),
            params={"api_key": persuaded_key},
        ).json()

        break
    time.sleep(1)

# Ask the Serpstat API for referring domain data.
link_intersect = defaultdict(int)
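# Look up backlinks for the top 20 listings from the scan.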
for row in results[:20]:
    if not row["website"]:
        continue

    links_result = requests.post(
        "https://api.serpstat.com/v4/",
        params={"token": serpstat_key},
        json={
            "id": 1,
            "method": "SerpstatBacklinksProcedure.getRefDomains",
            "params": {"query": row["website"], "sort": "domain_rank", "size": 20},
        },
    ).json()

    if "result" not in links_result:
        continue

    # Tally how many of the GMB websites each referring domain links to.
    for link_row in links_result["result"]["data"]:
        link_intersect[link_row["domain_from"]] += 1
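    # Brief pause between lookups to stay within the Serpstat plan's request limits.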
    time.sleep(1)

sorted_intersect = sorted(link_intersect.items(), key=lambda r: r[1], reverse=True)

# Write the results to a CSV file.
with open("linkintersect.csv", "w", newline="") as csv_file:
    writer = csv.writer(csv_file)
    writer.writerow(["Domain", "Count"])

    for domain, count in sorted_intersect:
        writer.writerow([domain, count])
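
One optional tweak: instead of pasting keys into the persuaded_key and serpstat_key variables, you could read them from environment variables so the script can be shared without exposing credentials. A minimal sketch, assuming you've set PERSUADED_API_KEY and SERPSTAT_API_KEY yourself (the variable names are my own choice, not something either API requires):

import os

# Read the API keys from environment variables instead of hardcoding them.
# Raises a KeyError if either variable hasn't been set.
persuaded_key = os.environ["PERSUADED_API_KEY"]
serpstat_key = os.environ["SERPSTAT_API_KEY"]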

Local SEO Software

You can use Persuaded.io for hyper-local rank tracking, spam busting, and much more. New users start with a credit, and you can try it without any signup required.