Welcome to my 6th annual What’s in my SEO Toolbox post. If you’ve read this post in the past, you’ll note that there’s a lot of consistency with respect to my choices. There are two reasons for that. First, these are damn good tools and their developers keep improving them year after year. Second, as much as I love testing the newest “thing” on the market, I get attached to my tools and I’m reluctant to change.
My tools fall into several categories: website crawling, log file analysis, keyword research, analytics, competitive research, link development, website performance, and usability analysis. With so many tools available in each category, it can be hard to pick just one. So most of the time, I don’t. This year, I added a couple of items that don’t really qualify as tools, but they belong in my toolbox because, without them, I couldn’t do my job correctly. Sadly, this year I had to say goodbye to one of my favorite tools – Market Samurai. The tool was built on Adobe’s AIR runtime, and when Adobe ended support for AIR, the tool went away with it.
With so many of these tools being multi-functional, I decided to create a handy little chart. So without further ado, here is my SEO toolbox for 2022. Keep in mind as you read that these are not full product reviews – just high-level summaries of each tool’s capabilities.
|Screaming Frog Spider|✔|✔|✔|✔|
|Screaming Frog Log File Analyzer|✔|
|Bing Webmaster Tools|✔|✔|✔|
|Google Search Console|✔|✔|✔|
|Answer the Public|✔|✔|✔|
Screaming Frog Web Crawler
The #1 item in my SEO toolbox is Screaming Frog’s Web Crawler. Now at version 16.4, it just keeps getting better and better. For a basic SEO audit, the first thing I do is input the top-level domain, connect Google Analytics, Google Search Console, Majestic, and Ahrefs, and hit Submit. Screaming Frog will happily crawl every link it finds and report back its findings. For other projects, I can switch to list mode and paste in a list of URLs from my clipboard, or check the accuracy of an XML sitemap by pasting in the sitemap’s URL and starting a crawl. Oh, and I can also crawl SERP results to see who I’m competing against. Cool stuff.
Once I have my crawl data, I can find broken links, audit redirects, analyze page titles and metadata, discover duplicate content, review robots.txt files, create sitemaps, extract data with XPath, visualize site architecture, and create word clouds of on-page text or of the anchor text in incoming links. With custom extraction, I can look for specific text strings in the page copy or in the code. All of that data can be exported into an Excel file or a CSV for additional analysis. And I can produce reports. Not just a few. A lot.
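As a hedged illustration of the kind of follow-up analysis a CSV export makes possible (the "Address" and "Title 1" column names below are assumptions – check the headers in your own export), a few lines of Python can flag page titles that appear on more than one URL:

```python
import csv
from collections import defaultdict

def duplicate_titles(csv_path):
    """Group URLs by page title and return titles used on more than one URL."""
    titles = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Column names are assumed; adjust to match your crawler's export.
            titles[row["Title 1"].strip().lower()].append(row["Address"])
    return {title: urls for title, urls in titles.items() if len(urls) > 1}
```

The same pattern works for any column in the export – meta descriptions, H1s, status codes – which is why a plain CSV is often all you need for a quick duplicate-content pass.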
There’s a free version if you have a small budget or a small website (up to 500 URLs) to crawl. The paid version is about $195 USD per year for a single user, allows you to crawl unlimited URLs, and it’s worth every penny. I say “about” because the final price depends on the conversion rate between British Pounds and US Dollars. Download Screaming Frog Web Crawler here.
Screaming Frog Log File Analyzer
If there has to be a tie for #1 in my SEO toolbox, it’s Screaming Frog’s Log File Analyzer. Log file analysis is (in my opinion) a vastly underused technique in the SEO’s toolbox. The data is hard to come by – not every hosting company collects it for you by default. One reason? Since the log file records every interaction – every request for a file – it’s big and takes up a lot of storage space. But the biggest reason is that most web developers don’t know enough to ask the hosting companies for the logs.
The Screaming Frog Log File Analyzer (version 4.3) is the perfect companion tool for their web crawler. By itself, it does an amazing job of analyzing log file data in a way that even a novice SEO can understand. But when you combine it with data exported from Screaming Frog’s Web Crawler, you can get a list of every URL that is – or is not – included in the log file. You can also see which URLs are crawled how often, and that’s a good indicator of whether or not the search engines think they are important.
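To show why raw logs are so valuable, here’s a minimal Python sketch (not the Log File Analyzer’s actual method) that counts how often Googlebot requested each URL, assuming your host writes logs in the widely used combined log format:

```python
import re
from collections import Counter

# Combined log format: IP, identity, user, [timestamp], "request",
# status, bytes, "referrer", "user agent". This layout is an assumption;
# confirm the format your hosting company actually uses.
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Count requests per URL path made by a Googlebot user agent."""
    hits = Counter()
    for line in lines:
        match = LOG_RE.match(line)
        if match and "Googlebot" in match.group("agent"):
            hits[match.group("path")] += 1
    return hits
```

Diff the keys of that counter against your crawler’s URL list and you have exactly the report described above: which pages the search engines visit often, and which ones they never touch.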
As with the web crawler, there is a free version and a paid version, but at about $125 USD for the paid version, the product is a steal. Download Screaming Frog Log Analyzer here.
Majestic
Majestic is to links what Google is to search engines. It’s the biggest. Majestic’s Fresh index has crawled 453,548,169,678 URLs. That’s over 453 billion unique URLs. It has found a bit more than twice that many. Majestic’s Historic index, which goes back to October 2013, has over eight trillion – that’s 8,485,618,151,026 – links in it. So yeah, it’s pretty huge.
What can you do with all that data? Just about anything you want. I use it for understanding a website’s link profile (topicality) and the amount of authority it has and is able to pass. I also use it for research when I’m building out a link development campaign. With that much data, it’s easy to figure out who you want to send a link request to and who you want to avoid.
Majestic’s pricing starts at $49.99 a month (with a minimum quarterly commitment) or $79.99 a month if you want to go month-to-month. Both of those plans are for single-user access. Multi-user access plans start at $169.99 a month. My SEO toolbox wouldn’t be complete without it.
SEMRush
SEMRush has been in my SEO toolbox for years, and unless they do something stupid, it’s going to stay there. For me, it’s like Screaming Frog – it’s one of those tools I have to have in order to do my job effectively.
SEMRush is generally the first tool I run to when I’m talking to a potential client. A quick search and I can get an idea of what their traffic looks like (paid vs. organic) and who they compete against on the SERPs. From there I can do a backlink audit, analyze their content, perform a cursory site audit, and see what they’re doing on social media. And that’s without having access to their site. Once I’ve signed them, the data is more accurate and more actionable.
One of the most useful parts of SEMRush is the Sensor tool, which tracks SERP volatility over the last 30 days.
SEMRush is also one of those tools that keep getting better and more important with every release. The features that have been added this year alone are numerous, and they don’t show any signs of slowing down. It’s another must-have tool. Start with the Pro package ($99 a month) and go from there.
SpyFu
Personally, I don’t do a lot of paid search work – there are people on staff who do that – but when I dive into the pool, SpyFu is a lifesaver.
SpyFu is a competitive intelligence tool for search marketers. You can type in a competitor’s domain to see all of the keywords it ranks for (including the content that ranks), the ads it buys on Google, and an estimation of how they compare to its competitors in the marketplace. You can also type in a keyword to find the domains that buy it (which is kinda cool), the domains that rank for the keyword, and how that has changed over time. And you can see how your Client’s campaigns stack up against their competitors.
Another thing I like is that SpyFu combines an amazing amount of competitive data into an easy-to-use interface. You have access to almost everything your competitors are doing in terms of PPC, keywords, and SEO. It can identify competitors in your industry as well as recommend top Adwords to buy.
With deep insights into both SEO and PPC, SpyFu is a solid tool for people who are starting a campaign and those who are improving what is already in motion. Pricing is pretty reasonable at $38 a month for a Basic package.
Keyword Keg
I just stumbled on this tool about a month ago, but it’s already become indispensable. Like every third-party tool, Keyword Keg pulls its search volume data from Google’s AdWords API, so the volume numbers aren’t actually based on organic searches. Ah well – you can’t have everything.
On the plus side, there are 5 different keyword tools to choose from: the Find Keywords tool, where you can put in up to 30 seed keywords and pull data from 11 different APIs; the Import Keywords tool, where you can drop in your Google Search Console (or Bing Webmaster Tools) keywords; the Related Keywords tool; the People Also Ask For tool; and the Merge Keywords tool (which is way easier than using an Excel spreadsheet with a concatenate function). The onscreen reports are brilliant. The downloadable reports lose the “prettiness” of the onscreen versions, but the data is solid. You just have to format everything yourself.
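To illustrate the kind of busywork a keyword-merge tool replaces (this is a generic sketch, not Keyword Keg’s actual code), here’s the spreadsheet-concatenate job done in a few lines of Python:

```python
from itertools import product

def merge_keywords(*word_lists):
    """Build every combination of one word from each list, in order -
    the same job as chaining CONCATENATE formulas across spreadsheet columns."""
    return [" ".join(combo) for combo in product(*word_lists)]

# Three small lists already yield 2 x 1 x 2 = 4 keyword phrases;
# real modifier lists multiply out into hundreds very quickly.
phrases = merge_keywords(["best", "cheap"], ["seo"], ["tools", "software"])
```

With real modifier lists (locations, qualifiers, product names), the combinations explode fast, which is why doing this in a purpose-built tool beats dragging formulas down a spreadsheet.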
Still, this has become a must-have tool for me. Keyword Keg pricing starts at $40 USD a month and goes up from there.
BrightLocal
If your business serves a local market, whether you have a retail location, or you’re selling services in a specific geographic market, BrightLocal is worth looking at.
BrightLocal integrates a bunch of tools and functions you’ll need to be successful in a local SEO campaign into a single, reliable platform. It tracks organic traffic, your appearance in map results, rankings for specific keywords, competitors – even on-site SEO and off-site SEO. It integrates with your Google Analytics, Google My Business, Facebook, and Twitter accounts. It tracks your citations and even offers a citation builder utility for an extra cost.
There’s a 14-day free trial, and after that pricing starts at $29 a month. If you want Facebook and Twitter integration it’s $49 a month and if you’re an Agency, you really need the $79 a month package. Still, it’s a good deal.
WebPageTest
Whenever I’m doing a speed test on a website, I like to get multiple data points, since there are too many variables to work from just one. At least in my opinion. So there are 3 speed-testing tools in my toolbox.
A former Client who worked at AOL turned me on to this tool. WebPageTest was originally developed by AOL for internal use and was open-sourced in 2008 under a BSD license. The platform is under active development on GitHub and is also packaged up periodically for download if you’d like to run your own instance. For me, the online version is fine (although sometimes the queue gets quite long).
What I like about this tool is that it’s free – we LOVE free – and it’s got solid data. It runs every test 3 times so you can see any fluctuations or variations in the data. Test output is provided in the form of highly detailed charts and downloadable files.
And did I mention it’s free? Click here to try it yourself.
Google Analytics / Google Search Console
I put Google Analytics and Google Search Console together since you really can’t have one without the other. Google Search Console tells you what happens before someone gets to your website: how they found you, where they’re coming from (geographically), what pages they’re looking at, CTR, impressions, etc. Google Analytics provides insight into what happens after they’ve gotten to your website.
It would take a much longer article than this to get into all of the information provided and how you can use it. Suffice it to say that no SEO can do his or her job without these tools. Yes, there are other analytics platforms in the marketplace, but as long as you’re not bumping your head on the limits Google puts on the free version of Analytics, there is no better solution.
Bing Webmaster Tools
Bing Webmaster Tools is the red-headed stepchild of SEO. That’s a shame, because it’s a really, really good tool. In some respects, it’s actually better than Google Search Console.
First off, the setup is really simple. If you already have access to a website via Google Search Console you can use that to verify your access for Bing Webmaster Tools. That’s a whole lot easier than the old process.
Second, when you log into Bing Webmaster Tools, there’s a dashboard that shows all of the sites you’ve connected, and you can quickly see the improvements (or not) in your key metrics: the number of clicks from search, the number of appearances in search, the number of pages crawled, and the number of pages indexed.
Getting additional detail just requires clicking on a particular property. The site dashboard has a chart that shows clicks, appearances in search, pages crawled, and crawl errors. Below that are your Top 3 pages for traffic along with some metrics, then the Top 3 keywords for your site with their metrics, and then some SEO reports that show where you might want to make changes to better optimize the website. There are also some great tools for diagnostics, validating structured markup, and keyword research. What’s interesting about the keyword research tool is that its data isn’t tied to advertising – it’s actual organic search volume.
Pingdom
Pingdom is site monitoring on steroids. If you’ve got questions, they’ve got answers.
The product allows you to test website uptime, page speed, user transactions, servers, and real user monitoring – all of which come with alerts. The web-based application has a really clean, understandable user interface. Charts and data galore. What I use it for is page speed monitoring, because that’s such a critical factor for Google. Pingdom allows me to find poor-performing code and third-party assets that can negatively impact the performance of my Client websites.
If you want ongoing monitoring, Pingdom has a 14-day trial available, after which it costs $45 a month. If you’re happy with the occasional one-off report, you can do that for free.
GTmetrix
Much as I love Pingdom, there are times when I don’t need all of their services. Or their cost. For those times, I use GTmetrix.
To be clear, GTmetrix doesn’t do everything that Pingdom does. But when I’m looking for quick insights into why a website might be running slow, it’s awesome. Their Report Page neatly summarizes a site’s performance based on key indicators of page load speed: it analyzes the page against the Google PageSpeed and Yahoo! YSlow rulesets, reports the page’s Page Load Time, Total Page Size, and Total # of Requests, and shows the page’s performance relative to the average of all sites analyzed on GTmetrix. Kinda cool.
All that for free. If that’s not enough, there’s always the GTmetrix Pro version.
Deep Crawl
Deep Crawl does everything you could want from a crawler. Want to crawl just the top-level domain? Check. How about all of the subdomains? Yup. Compare your crawl data to your sitemaps, your Google Analytics data, and your Google Search Console data? Check, check, and check. What about backlinks? Absolutely. You can import backlink data from Google Search Console or Majestic or just about any tool that exports into a .csv format. Do you have 10 sites you need to crawl on the 1st of every month? No problem. Just set up the scheduling and push the Save button. The magic happens without you having to watch anything, and you’ll get an email and a browser alert when it’s done. Deep Crawl even handles server log files.
So why is this in my toolbox as well as Screaming Frog? Screaming Frog is a desktop application, and it doesn’t handle really large websites – 1 million pages or more – really well. Not unless you have lots and lots of memory. That’s where the cloud-based Deep Crawl shines – running out of memory is someone else’s problem.
Answer the Public
Ask any SEO what they consider to be the most important part of a solid SEO campaign, and they’ll tell you it’s content. And if you ask those same SEOs what tool they use to research content, they’ll give you one name: Answer the Public.
Suppose you want to write a blog post on New Year’s resolutions. The first thing you’d like to do is come up with a perspective for your article and a title. Just go to Answer the Public, type in “new years resolution”, press Search, and you’ll get back a visualization of the questions people are asking around that topic.
Click on any of those suggestions, and you’ll be taken to a SERP that shows who you’ll be competing with for eyeballs. There are visual results for questions, prepositions, and comparisons. The visuals can be downloaded as PNG files, and the data can be downloaded as a CSV file; from there you can develop an editorial calendar.
Another new-for-2021 feature was the introduction of Listening Alerts. Available only in the Pro Package, you can set up Search Listening Alerts and automatically get weekly email digests showing what new questions are being asked around any topic.
Answer the Public is available as a month-to-month subscription service for $99 USD per month.