The Reporting Problem
SEO reporting is tedious. Every week or month, you're pulling data from multiple sources, copying it into spreadsheets, formatting it for stakeholders, and writing the same analysis you wrote last time.
What if you could run one script and get a complete report? Keyword performance, SERP changes, discussion opportunities, competitor sentiment - all formatted and ready to share.
In this tutorial, we'll build exactly that. We're combining everything from the previous tutorials into an automated reporting system that:
- Pulls keyword metrics for your target keywords
- Analyzes SERPs to detect feature changes
- Discovers high-traffic discussions with sentiment
- Generates a professional HTML report
- Can run on a schedule (daily, weekly, monthly)
The Pipeline
Here's how data flows through our reporting system:
- Keywords → /keyword-metrics
- SERP Analysis → /serp-analysis
- Discovery → /discover-threads
- HTML Report → report.html
Each endpoint provides a different piece of intelligence. Combined, they give you a complete picture of your SEO landscape.
Building the Report Generator
Let's start with a class that handles all API calls and data collection:
""" SEO Report Generator Combines keyword metrics, SERP analysis, and discussion discovery into automated HTML reports. """ import requests import json import time from datetime import datetime class SEOReportGenerator: def __init__(self, api_key): self.api_key = api_key self.base_url = "https://reddit-traffic-and-intelligence-api.p.rapidapi.com" self.headers = { "Content-Type": "application/json", "x-rapidapi-host": "reddit-traffic-and-intelligence-api.p.rapidapi.com", "x-rapidapi-key": api_key } def get_keyword_metrics(self, keywords): """Fetch metrics for a list of keywords.""" all_results = [] # Process in batches of 10 for i in range(0, len(keywords), 10): batch = keywords[i:i+10] response = requests.post( f"{self.base_url}/api/v2/keyword-metrics", json={"keywords": batch, "include_related": False}, headers=self.headers ) data = response.json() if data.get('status') == 'success': all_results.extend(data['keywords']) time.sleep(0.5) return all_results def get_serp_analysis(self, keyword): """Analyze SERP for a single keyword.""" response = requests.post( f"{self.base_url}/api/v2/serp-analysis", json={"keyword": keyword, "include_features": True}, headers=self.headers ) return response.json() def discover_discussions(self, keyword, your_brand=None, competitors=None): """Discover discussions for a keyword with sentiment analysis.""" payload = { "keyword": keyword, "max_threads": 10, "max_comments_per_thread": 10 } if your_brand: payload["your_brand"] = your_brand if competitors: payload["competitors"] = competitors response = requests.post( f"{self.base_url}/api/v2/discover-threads", json=payload, headers=self.headers ) return response.json() def generate_report(self, config): """ Generate a complete SEO report. Args: config: Dict with keys: - keywords: List of keywords to analyze - primary_keyword: Main keyword for discussion discovery - your_brand: Your brand name (optional) - competitors: List of competitor names (optional) Returns: Dict with all report data """ print("Starting report generation...") report = { 'generated_at': datetime.now().isoformat(), 'config': config, 'sections': {} } # 1. Keyword Metrics print(" [1/3] Fetching keyword metrics...") report['sections']['keywords'] = self.get_keyword_metrics(config['keywords']) # 2. SERP Analysis for primary keyword print(" [2/3] Analyzing SERP...") report['sections']['serp'] = self.get_serp_analysis(config['primary_keyword']) # 3. Discussion Discovery print(" [3/3] Discovering discussions...") report['sections']['discussions'] = self.discover_discussions( config['primary_keyword'], config.get('your_brand'), config.get('competitors') ) print("Report data collected!") return report
Creating the HTML Template
Now we need to turn that data into a readable report. We'll build an HTML template that presents everything clearly:
```python
from datetime import datetime


def generate_html_report(report_data):
    """Generate an HTML report from the collected data."""
    config = report_data['config']
    keywords = report_data['sections']['keywords']
    serp = report_data['sections']['serp']
    discussions = report_data['sections']['discussions']

    # Calculate summary stats
    total_volume = sum(kw.get('volume', 0) for kw in keywords)
    avg_difficulty = sum(kw.get('keyword_difficulty', 0) for kw in keywords) / len(keywords) if keywords else 0
    discussion_count = discussions.get('threads_discovered', 0)
    discussion_traffic = discussions.get('summary', {}).get('estimated_monthly_traffic', 0)

    # Build keyword rows
    keyword_rows = ""
    for kw in sorted(keywords, key=lambda x: x.get('volume', 0), reverse=True):
        keyword_rows += f"""
        <tr>
            <td>{kw['keyword']}</td>
            <td>{kw.get('volume', 0):,}</td>
            <td>${kw.get('cpc', 0)}</td>
            <td>{kw.get('keyword_difficulty', 0)}</td>
            <td>{kw.get('search_intent', 'unknown')}</td>
        </tr>"""

    # Build SERP features list
    serp_features = serp.get('serp_features', {})
    active_features = [f.replace('has_', '').replace('_', ' ').title()
                       for f, v in serp_features.items() if v]
    features_html = ", ".join(active_features) if active_features else "None detected"

    # Build discussion rows
    discussion_rows = ""
    for thread in discussions.get('threads', [])[:5]:
        priority_class = thread.get('priority', 'LOW').lower()
        discussion_rows += f"""
        <tr class="{priority_class}">
            <td><a href="{thread['url']}">{thread['title'][:50]}...</a></td>
            <td>{thread.get('estimated_traffic', 0):,}</td>
            <td>{thread.get('source', 'organic')}</td>
            <td>{thread.get('priority', 'LOW')}</td>
        </tr>"""

    # Generate timestamp
    generated = datetime.fromisoformat(report_data['generated_at']).strftime("%B %d, %Y at %I:%M %p")

    html = f"""<!DOCTYPE html>
<html>
<head>
    <title>SEO Report: {config['primary_keyword']}</title>
    <style>
        body {{ font-family: -apple-system, sans-serif; max-width: 900px; margin: 0 auto; padding: 40px 20px; }}
        h1 {{ color: #1a1a2e; border-bottom: 3px solid #e94560; padding-bottom: 10px; }}
        h2 {{ color: #16213e; margin-top: 40px; }}
        .meta {{ color: #666; margin-bottom: 30px; }}
        .stats {{ display: grid; grid-template-columns: repeat(4, 1fr); gap: 15px; margin: 30px 0; }}
        .stat {{ background: #f8f9fa; padding: 20px; border-radius: 8px; text-align: center; }}
        .stat .number {{ font-size: 2em; font-weight: bold; color: #e94560; }}
        .stat .label {{ color: #666; font-size: 0.9em; }}
        table {{ width: 100%; border-collapse: collapse; margin: 20px 0; }}
        th, td {{ padding: 12px; text-align: left; border-bottom: 1px solid #eee; }}
        th {{ background: #f8f9fa; }}
        tr.high {{ background: #fff5f5; }}
        tr.medium {{ background: #fffbf0; }}
        a {{ color: #e94560; }}
        .features {{ background: #e8f4fd; padding: 15px 20px; border-radius: 8px; margin: 20px 0; }}
    </style>
</head>
<body>
    <h1>SEO Intelligence Report</h1>
    <p class="meta">Primary keyword: <strong>{config['primary_keyword']}</strong> | Generated: {generated}</p>

    <div class="stats">
        <div class="stat">
            <div class="number">{len(keywords)}</div>
            <div class="label">Keywords Tracked</div>
        </div>
        <div class="stat">
            <div class="number">{total_volume:,}</div>
            <div class="label">Total Search Volume</div>
        </div>
        <div class="stat">
            <div class="number">{discussion_count}</div>
            <div class="label">Discussions Found</div>
        </div>
        <div class="stat">
            <div class="number">{discussion_traffic:,}</div>
            <div class="label">Discussion Traffic/mo</div>
        </div>
    </div>

    <h2>Keyword Performance</h2>
    <table>
        <tr>
            <th>Keyword</th>
            <th>Volume</th>
            <th>CPC</th>
            <th>Difficulty</th>
            <th>Intent</th>
        </tr>
        {keyword_rows}
    </table>

    <h2>SERP Analysis</h2>
    <div class="features">
        <strong>Active SERP Features:</strong> {features_html}
    </div>
    <p>Discussions in top 10: <strong>{serp.get('discussion_count', 0)}</strong> at positions {serp.get('discussion_positions', [])}</p>

    <h2>Top Discussion Opportunities</h2>
    <table>
        <tr>
            <th>Thread</th>
            <th>Est. Traffic</th>
            <th>Source</th>
            <th>Priority</th>
        </tr>
        {discussion_rows}
    </table>

    <p style="margin-top: 50px; color: #999; font-size: 0.85em;">
        Generated by SEO Report Generator | Powered by RedRanks API
    </p>
</body>
</html>"""

    return html
```
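One caveat: the template interpolates API values straight into the markup, so a thread title containing < or & would break the HTML. If that matters for your data, the standard library's html.escape handles it. A minimal sketch (the safe helper is just an illustration, not part of the template above):

```python
import html


def safe(value):
    """Escape text from API responses before embedding it in the report HTML."""
    return html.escape(str(value))

# e.g. in the keyword loop: <td>{safe(kw['keyword'])}</td>
```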
Putting It Together
Here's the complete script that generates and saves your report:
```python
from seo_report_generator import SEOReportGenerator
from report_template import generate_html_report
from datetime import datetime
import json

# Configuration
API_KEY = "your_api_key_here"

config = {
    "keywords": [
        "crm software",
        "best crm for small business",
        "crm comparison",
        "free crm",
        "hubspot alternatives",
        "salesforce competitors"
    ],
    "primary_keyword": "crm software",
    "your_brand": "Acme CRM",
    "competitors": ["HubSpot", "Salesforce", "Pipedrive"]
}

# Generate report
generator = SEOReportGenerator(API_KEY)
report_data = generator.generate_report(config)

# Create HTML
html = generate_html_report(report_data)

# Save files
timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")

with open(f"seo_report_{timestamp}.html", "w") as f:
    f.write(html)

with open(f"seo_report_{timestamp}.json", "w") as f:
    json.dump(report_data, f, indent=2)

print(f"Report saved: seo_report_{timestamp}.html")
```
Console Output
```
Starting report generation...
  [1/3] Fetching keyword metrics...
  [2/3] Analyzing SERP...
  [3/3] Discovering discussions...
Report data collected!
Report saved: seo_report_20251226_143022.html
```
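Before you put this on a schedule, one small hardening step: rather than committing the API key inside the script, you can read it from an environment variable. A minimal sketch (REDRANKS_API_KEY is an arbitrary variable name for this example, not something the API requires):

```python
import os

# Read the key from the environment instead of hard-coding it in the script
API_KEY = os.environ.get("REDRANKS_API_KEY")
if not API_KEY:
    raise SystemExit("Set the REDRANKS_API_KEY environment variable before running.")
```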
Scheduling Your Reports
A report that runs manually is useful. A report that runs automatically is powerful. Here's how to schedule it on different systems:
Linux/Mac (Cron)
Open your crontab with crontab -e and add:
```
# Run every Monday at 9am
0 9 * * 1 cd /path/to/scripts && /usr/bin/python3 run_report.py

# Run daily at 6am
0 6 * * * cd /path/to/scripts && /usr/bin/python3 run_report.py
```
Windows (Task Scheduler)
- Open Task Scheduler
- Create Basic Task -> Name it "SEO Report"
- Set trigger (weekly, daily, etc.)
- Action: Start a program
- Program: python
- Arguments: C:\path\to\run_report.py
Python-based Scheduling
For simpler setups, you can use the schedule library (pip install schedule):
```python
import schedule
import time

from run_report import generate_and_save_report


def job():
    print("Running scheduled report...")
    generate_and_save_report()


# Schedule for every Monday at 9am
schedule.every().monday.at("09:00").do(job)

# Or every day
# schedule.every().day.at("06:00").do(job)

print("Scheduler running. Press Ctrl+C to exit.")
while True:
    schedule.run_pending()
    time.sleep(60)
```
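Note that this imports generate_and_save_report from run_report, which means the script from the "Putting It Together" section needs its steps wrapped in a function (otherwise the report runs once at import time and never again). A sketch of that wrapper, plus a try/except so one failed run doesn't stop the scheduler loop:

```python
# In run_report.py: wrap the top-level steps in a callable the scheduler can import
def generate_and_save_report():
    generator = SEOReportGenerator(API_KEY)
    report_data = generator.generate_report(config)
    html = generate_html_report(report_data)
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    with open(f"seo_report_{timestamp}.html", "w") as f:
        f.write(html)
    with open(f"seo_report_{timestamp}.json", "w") as f:
        json.dump(report_data, f, indent=2)
    print(f"Report saved: seo_report_{timestamp}.html")


# In the scheduler script: keep one failure from killing the loop
def job():
    try:
        print("Running scheduled report...")
        generate_and_save_report()
    except Exception as exc:
        print(f"Report run failed: {exc}")
```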
Email Delivery
Want reports delivered to your inbox? Add email sending with Python's smtplib or use a service like SendGrid. Attach the HTML file or embed the report directly in the email body.
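Here's a minimal sketch using smtplib and email.message from the standard library; the SMTP host, credentials, and addresses are placeholders you'd replace with your own:

```python
import smtplib
from email.message import EmailMessage


def email_report(html_path, recipient):
    """Send the generated HTML report as an attachment (placeholder SMTP settings)."""
    msg = EmailMessage()
    msg["Subject"] = "Weekly SEO Report"
    msg["From"] = "reports@example.com"   # placeholder sender
    msg["To"] = recipient
    msg.set_content("Your SEO report is attached.")

    with open(html_path, "rb") as f:
        msg.add_attachment(f.read(), maintype="text", subtype="html",
                           filename=html_path)

    # Placeholder SMTP server and credentials
    with smtplib.SMTP("smtp.example.com", 587) as server:
        server.starttls()
        server.login("reports@example.com", "app-password")
        server.send_message(msg)
```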
Complete Solution
Here's the full, production-ready script in one file:
""" Complete SEO Report Generator Combines all three API endpoints into automated HTML reports. Usage: python seo_report_complete.py Configure your settings in the CONFIG section below. """ import requests import json import time from datetime import datetime # ============================================================ # CONFIG - Edit these settings # ============================================================ API_KEY = "your_api_key_here" CONFIG = { "keywords": [ "crm software", "best crm for small business", "crm comparison", "free crm", "hubspot alternatives", ], "primary_keyword": "crm software", "your_brand": "Acme CRM", # Optional "competitors": ["HubSpot", "Salesforce"], # Optional } # ============================================================ # API CLIENT # ============================================================ class SEOReportGenerator: def __init__(self, api_key): self.api_key = api_key self.base_url = "https://reddit-traffic-and-intelligence-api.p.rapidapi.com" self.headers = { "Content-Type": "application/json", "x-rapidapi-host": "reddit-traffic-and-intelligence-api.p.rapidapi.com", "x-rapidapi-key": api_key } def _post(self, endpoint, payload): """Make a POST request to the API.""" response = requests.post( f"{self.base_url}{endpoint}", json=payload, headers=self.headers ) return response.json() def get_keyword_metrics(self, keywords): """Fetch metrics for keywords in batches.""" results = [] for i in range(0, len(keywords), 10): batch = keywords[i:i+10] data = self._post("/api/v2/keyword-metrics", {"keywords": batch}) if data.get('status') == 'success': results.extend(data['keywords']) time.sleep(0.5) return results def get_serp_analysis(self, keyword): """Analyze SERP for a keyword.""" return self._post("/api/v2/serp-analysis", {"keyword": keyword}) def discover_discussions(self, keyword, your_brand=None, competitors=None): """Discover discussions with sentiment analysis.""" payload = {"keyword": keyword, "max_threads": 10} if your_brand: payload["your_brand"] = your_brand if competitors: payload["competitors"] = competitors return self._post("/api/v2/discover-threads", payload) def generate_report(self, config): """Generate complete report data.""" print("Generating SEO report...") report = { 'generated_at': datetime.now().isoformat(), 'config': config, 'keywords': None, 'serp': None, 'discussions': None } print(" [1/3] Keywords...") report['keywords'] = self.get_keyword_metrics(config['keywords']) print(" [2/3] SERP...") report['serp'] = self.get_serp_analysis(config['primary_keyword']) print(" [3/3] Discussions...") report['discussions'] = self.discover_discussions( config['primary_keyword'], config.get('your_brand'), config.get('competitors') ) return report # ============================================================ # HTML TEMPLATE # ============================================================ def generate_html(report): """Convert report data to HTML.""" config = report['config'] keywords = report['keywords'] or [] serp = report['serp'] or {} discussions = report['discussions'] or {} # Stats total_vol = sum(k.get('volume', 0) for k in keywords) disc_count = discussions.get('threads_discovered', 0) disc_traffic = discussions.get('summary', {}).get('estimated_monthly_traffic', 0) # Keyword rows kw_rows = "" for k in sorted(keywords, key=lambda x: x.get('volume',0), reverse=True): kw_rows += f"<tr><td>{k['keyword']}</td><td>{k.get('volume',0):,}</td><td>${k.get('cpc',0)}</td><td>{k.get('keyword_difficulty',0)}</td><td>{k.get('search_intent','')}</td></tr>" # 
Features features = [f.replace('has_','').replace('_',' ').title() for f,v in serp.get('serp_features',{}).items() if v] # Discussion rows disc_rows = "" for t in discussions.get('threads', [])[:5]: disc_rows += f"<tr><td><a href='{t['url']}'>{t['title'][:45]}...</a></td><td>{t.get('estimated_traffic',0):,}</td><td>{t.get('priority','')}</td></tr>" gen_time = datetime.fromisoformat(report['generated_at']).strftime("%B %d, %Y %I:%M %p") return f"""<!DOCTYPE html> <html><head><title>SEO Report: {config['primary_keyword']}</title> <style> body{{font-family:-apple-system,sans-serif;max-width:900px;margin:0 auto;padding:40px 20px;color:#333}} h1{{color:#1a1a2e;border-bottom:3px solid #e94560;padding-bottom:10px}} h2{{color:#16213e;margin-top:40px}} .stats{{display:grid;grid-template-columns:repeat(4,1fr);gap:15px;margin:30px 0}} .stat{{background:#f8f9fa;padding:20px;border-radius:8px;text-align:center}} .stat .n{{font-size:2em;font-weight:bold;color:#e94560}} .stat .l{{color:#666;font-size:0.9em}} table{{width:100%;border-collapse:collapse;margin:20px 0}} th,td{{padding:12px;text-align:left;border-bottom:1px solid #eee}} th{{background:#f8f9fa}} a{{color:#e94560}} .feat{{background:#e8f4fd;padding:15px;border-radius:8px;margin:20px 0}} .meta{{color:#666;margin-bottom:30px}} </style></head><body> <h1>SEO Intelligence Report</h1> <p class="meta">Keyword: <strong>{config['primary_keyword']}</strong> | Generated: {gen_time}</p> <div class="stats"> <div class="stat"><div class="n">{len(keywords)}</div><div class="l">Keywords</div></div> <div class="stat"><div class="n">{total_vol:,}</div><div class="l">Total Volume</div></div> <div class="stat"><div class="n">{disc_count}</div><div class="l">Discussions</div></div> <div class="stat"><div class="n">{disc_traffic:,}</div><div class="l">Disc. Traffic/mo</div></div> </div> <h2>Keyword Metrics</h2> <table><tr><th>Keyword</th><th>Volume</th><th>CPC</th><th>Difficulty</th><th>Intent</th></tr>{kw_rows}</table> <h2>SERP Features</h2> <div class="feat"><strong>Active:</strong> {', '.join(features) or 'None'}</div> <p>Discussions in SERP: {serp.get('discussion_count',0)} at positions {serp.get('discussion_positions',[])}</p> <h2>Top Discussion Opportunities</h2> <table><tr><th>Thread</th><th>Est. Traffic</th><th>Priority</th></tr>{disc_rows}</table> <p style="margin-top:50px;color:#999;font-size:0.85em">Generated by SEO Report Generator | RedRanks API</p> </body></html>""" # ============================================================ # MAIN # ============================================================ if __name__ == "__main__": generator = SEOReportGenerator(API_KEY) report = generator.generate_report(CONFIG) html = generate_html(report) timestamp = datetime.now().strftime("%Y%m%d_%H%M%S") filename = f"seo_report_{timestamp}.html" with open(filename, "w") as f: f.write(html) with open(filename.replace('.html', '.json'), "w") as f: json.dump(report, f, indent=2) print(f"\nReport saved: {filename}") print(f"Data saved: {filename.replace('.html', '.json')}")
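If you'd rather edit the keyword list without touching code, one option is to load CONFIG from a JSON file when it exists. A minimal sketch (config.json is an arbitrary filename, and falling back to the in-script CONFIG is just one possible behavior):

```python
import json
from pathlib import Path


def load_config(path="config.json"):
    """Load report settings from a JSON file, falling back to the in-script CONFIG."""
    config_file = Path(path)
    if config_file.exists():
        return json.loads(config_file.read_text())
    return CONFIG

# Usage in the main block: report = generator.generate_report(load_config())
```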
Build Your First Report
The free tier includes 15 API calls - enough for several complete reports, since each run of the example configuration makes only three requests: one batched keyword-metrics call, one serp-analysis call, and one discover-threads call.
Get Free API Key

Tutorial Series Complete!
You've learned how to automate keyword research, analyze SERPs, discover high-traffic discussions, and generate professional reports. These building blocks can be combined and extended for any SEO automation workflow.
What's Next
You now have a complete SEO automation toolkit. Here are some ways to extend it: