Free Keyword Rank Checker SEO Tool - Create Your Own
Don't want to pay a hefty price to online tools for checking keyword rankings? Now you can create your own! In this article we will look at how you can create a FREE keyword rank checker yourself and check rankings for an unlimited number of keywords.
You can use Python with the requests and BeautifulSoup libraries to build a keyword rank checker. Here's a step-by-step guide:
Set up Python environment
Ensure you have Python installed on your system. Install the required libraries by running:
pip install requests beautifulsoup4
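To confirm the install worked, you can run a quick one-line import check; it should print the installed version numbers of both libraries:
python -c "import requests, bs4; print(requests.__version__, bs4.__version__)"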
Free Keyword Rank Checker Tool
Create a new Python script with the following content to build your keyword rank checker SEO tool:
import requests
from bs4 import BeautifulSoup
import time
import csv
import random
from urllib.parse import quote_plus

def read_queries_from_csv(file_path):
    # Read search queries from the first column of the CSV file
    queries = []
    with open(file_path, 'r') as csvfile:
        reader = csv.reader(csvfile)
        for row in reader:
            if row:  # Ensure the row is not empty
                queries.append(row[0])
    return queries

def check_url_in_search_results(search_query, target_url):
    headers = {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36'
    }
    # quote_plus URL-encodes the query so spaces and special characters don't break the URL
    search_url = f'https://www.google.com/search?q={quote_plus(search_query)}&num=100'
    for attempt in range(3):  # Retry up to 3 times
        try:
            response = requests.get(search_url, headers=headers)
            if response.status_code == 200:
                soup = BeautifulSoup(response.text, 'html.parser')
                search_results = soup.find_all('div', class_='yuRUbf')
                for index, result in enumerate(search_results, start=1):
                    link = result.find('a')['href']
                    if target_url in link:
                        print(f'URL found at position {index}: {link}')
                        return True, index, link
                print(f'URL not found: {target_url}')
                return False, None, None
            else:
                print('Failed to retrieve search results. Status code:', response.status_code)
        except requests.RequestException as e:
            print('Error during request:', e)
        time.sleep(random.uniform(30, 60))  # Wait longer before retrying
    return False, None, None

def check_multiple_urls_with_queries(url_list, queries, delay=30):
    results = []
    check_count = 0
    for query in queries:
        for url in url_list:
            found, pos, actual_url = check_url_in_search_results(query, url)
            results.append([query, actual_url if found else url, pos if found else 'Not Found'])
            check_count += 1
            if check_count % 2 == 0:  # Pause after every 2 checks
                print('Pausing for 30 seconds...')
                time.sleep(30)
            wait_time = random.uniform(delay, delay * 1.5)  # Random delay between delay and delay*1.5 seconds
            print(f'Waiting for {wait_time:.2f} seconds before the next request...')
            time.sleep(wait_time)
    # Write results to CSV file
    with open('results.csv', 'w', newline='') as csvfile:
        writer = csv.writer(csvfile)
        writer.writerow(['Search Query', 'URL', 'Position'])
        writer.writerows(results)
    print('Results have been written to results.csv')

# File path to the CSV containing queries
queries_csv_path = r'C:\Users\myfolder\OneDrive\Desktop\rank-check-with-python\keywords.csv'  # Replace with your actual file path

# List of URLs to check
url_list = [
    'yourwebsite.com'
]

# Read queries from CSV file
queries = read_queries_from_csv(queries_csv_path)

# Check every query/URL pair with a base delay of 40 seconds between requests
check_multiple_urls_with_queries(url_list, queries, delay=40)
Save the script and run it using the following command:
python your_script_name.py
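While it runs, the console output will look roughly like this, based on the script's print statements (the positions and URLs below are placeholders):
URL found at position 12: https://yourwebsite.com/some-page
Waiting for 47.52 seconds before the next request...
URL not found: yourwebsite.com
Pausing for 30 seconds...
Waiting for 52.10 seconds before the next request...
Results have been written to results.csv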
Notes:
This script sends a request to Google and parses the search results page to check whether the target URL appears in the results. The User-Agent header mimics a request from a regular web browser. This approach gives you a straightforward way to check rankings without setting up APIs or dealing with quota limits. Note that the yuRUbf class name used to locate result links is specific to Google's current markup and may change over time, in which case the selector will need updating.
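For reference, the script reads only the first column of each row, so keywords.csv can simply list one keyword per line, for example (these keywords are placeholders):
best running shoes
keyword rank checker tool
how to check keyword rankings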
FAQs
How to ensure Python is installed on your system?
To ensure you have Python installed on your system and set up the environment for the script, follow these steps:
1. Check if Python is installed:
On Windows:
Open Command Prompt (search for "cmd" in the Start menu).
Type the following command and press Enter:
python --version
On macOS and Linux:
Open Terminal.
Type the following command and press Enter:
python3 --version
2. Install Python (if not installed):
On Windows:
Go to the official Python website.
Download the latest Python installer for Windows.
Run the installer. Make sure to check the box that says "Add Python to PATH" before clicking "Install Now".
Follow the prompts to complete the installation.
On macOS:
Open Terminal.
Use Homebrew to install Python (if Homebrew is not installed, follow the instructions on the Homebrew website):
brew install python
On Linux:
Open Terminal.
Use your package manager to install Python. For example, on Ubuntu, use:
sudo apt update
sudo apt install python3
3. Install Required Libraries:
Once Python is installed, install the required libraries (requests and beautifulsoup4) by running:
pip install requests beautifulsoup4
4. Create and Run the Script:
Open a text editor and copy the provided Python script into a new file. Save the file with a .py extension, for example, check_url_in_search.py.
Run the script:
On Windows:
Open Command Prompt.
Navigate to the directory where you saved the script using the cd command. For example:
cd path\to\your\script
Run the script:
python check_url_in_search.py
On macOS and Linux:
Open Terminal.
Navigate to the directory where you saved the script using the cd command. For example:
cd /path/to/your/script
Run the script:
python3 check_url_in_search.py
If the script doesn't run, check the following:
- File Name: Ensure that the file name matches what you type on the command line (for example, check_url_in_search.py) and has no hidden extension like .txt.
- File Path: Ensure that you are providing the correct path to the file if you are specifying it directly.
- Permissions: Ensure that you have the necessary permissions to read the file from the specified location.
Can we keep a time delay between each check?
Yes, adding a time delay between each request is good practice. The script uses Python's time module to introduce a delay between search queries.
Explanation
- Function check_url_in_search_results: Checks whether a specific URL appears in the Google search results for a given search query.
- Function check_multiple_urls_with_queries: Takes a list of URLs, a list of queries, and a delay time (in seconds). It calls check_url_in_search_results for each query/URL pair, then waits for a randomized delay before moving on to the next check.
- time.sleep(wait_time): Introduces the delay between requests. The base delay defaults to 30 seconds in the function signature and is set to 40 seconds in the example call; adjust it as needed. A standalone sketch of this pattern follows below.
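Here is the delay logic on its own (polite_wait is an illustrative name, not part of the script above):
import time
import random

def polite_wait(base_delay=40):
    # Sleep for a random duration between base_delay and 1.5x base_delay,
    # the same jitter pattern used in check_multiple_urls_with_queries
    wait_time = random.uniform(base_delay, base_delay * 1.5)
    print(f'Waiting for {wait_time:.2f} seconds...')
    time.sleep(wait_time)

polite_wait()  # waits somewhere between 40 and 60 seconds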
How to stop the execution of a Python script?
To stop the execution of a running Python script, you can use the following methods depending on your operating system and environment:
On Windows (Command Prompt or PowerShell):
Ctrl + C:
Open Command Prompt or PowerShell window where the script is running.
Press Ctrl + C on your keyboard. This sends an interrupt signal to the script and stops its execution.
On macOS or Linux (Terminal):
Ctrl + C:
Open Terminal window where the script is running.
Press Ctrl + C on your keyboard. This sends an interrupt signal to the script and stops its execution.
Using Task Manager (Windows) or Activity Monitor (macOS):
If for some reason Ctrl + C doesn't work, you can manually terminate the Python process:
On Windows:
Press Ctrl + Shift + Esc to open Task Manager.
Find the "python.exe" process in the "Processes" tab.
Select the process and click "End Task".
On macOS:
Open Activity Monitor (you can find it using Spotlight search).
Find the "Python" process.
Select the process and click the "X" button in the top-left corner, then click "Force Quit".
Using Terminal Commands:
On macOS or Linux:
Find the process ID (PID) of the running script by using:
ps aux | grep python
This will list all running Python processes. Note the PID of the process you want to stop.
Kill the process using:
kill -9 PID
Replace PID with the actual process ID.
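If you'd rather have Ctrl + C end the script cleanly instead of printing a traceback, one small addition (a sketch, not part of the script above) is to wrap the main call:
import sys

try:
    check_multiple_urls_with_queries(url_list, queries, delay=40)
except KeyboardInterrupt:
    # Ctrl + C raises KeyboardInterrupt inside the running script
    print('\nInterrupted by user. Exiting...')
    sys.exit(0)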
Can the output be saved to a CSV file?
Yes, the script already writes its results to a CSV file using Python's csv module. The output file, results.csv, has three columns: Search Query, URL, and Position (or 'Not Found'). After the script finishes, you'll find results.csv in the directory from which you ran the script, because the filename is hardcoded inside check_multiple_urls_with_queries. To save the file somewhere else, change the path passed to open(), or add an output_file parameter to the function, as sketched below. The usual path rules apply:
- Same Directory: results.csv is saved in the current working directory, alongside your script if you run it from there.
- Specified Path: If you provide an absolute path like C:/Users/myfolder/Documents/results.csv, the file will be saved in C:/Users/myfolder/Documents/.
- Relative Path: If you provide a relative path like output/results.csv, the file will be saved in the output sub-directory within the current directory (the sub-directory must already exist).
After running the script, check the relevant directory using your file explorer or terminal to verify that the results.csv file is present.
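A minimal way to make the output path configurable is to pull the CSV-writing step into a helper; write_results and its output_file parameter are suggested names, not part of the script above:
import csv

def write_results(results, output_file='results.csv'):
    # results is the list of [query, url, position] rows built by
    # check_multiple_urls_with_queries; relative paths are resolved
    # against the current working directory
    with open(output_file, 'w', newline='') as csvfile:
        writer = csv.writer(csvfile)
        writer.writerow(['Search Query', 'URL', 'Position'])
        writer.writerows(results)

# Example: save into an existing subdirectory
# write_results(results, output_file='output/results.csv')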
Summary
- keywords.csv: Add all the keywords you want to check to this file, one per row.
- In the above Python code, change "yourwebsite.com" to your website's domain.
- The results will be stored in "results.csv"
- All you need is a computer, an internet connection, and Python installed on your system to create your own keyword rank checker SEO tool.
- Open Notepad or any other text editor, paste the above code, and save the file with a .py extension instead of the default .txt extension.
- In search_url, the &num=100 parameter ensures that the keyword is checked against the top 100 Google search results.