JetOctopus: The Fastest Web-Based SEO Crawler
The WordPress and SEO industries are blessed with a huge number of available tools.
I think they are some of the most self-innovating industries out there, and I am always happy to come across new tools to play with.
This time I am reviewing a cool new SEO crawler: JetOctopus.
JetOctopus is probably one of the most efficient crawlers on the market. It’s fast and incredibly easy to use, even for a non-SEO.
Its most convincing selling point is that it has no crawl limits, no simultaneous-crawl limits and no project limits, giving you more data for less money. If you are working on a huge database-driven website, you’ll definitely find it a money- and time-saver.
Watch how it works here!
The best part of the tool is that it’s web-based which makes it perfect for collaboration: Your team doesn’t need any new software installed. All they need is a (universal) login.
Web-based tools keep teams on one page because when logging in they all see the same thing. Whenever I can, I use online tools for this exact reason: cross-team (and cross-device) co-working.
When it comes to SEO crawlers, the usual problem with web-based solutions is that they are not fast enough. You’ll be happy to find JetOctopus to be even faster than its desktop alternatives.
Your content team will appreciate its “Content” section that can generate all kinds of analyses thanks to the flexible filters, for example:
- “Thin” content, i.e. pages that need more unique content created for them
- Long-form content, i.e. content with most words
- Pages with largest images (those may need some image optimization)
- Pages with titles containing a certain term, (e.g. when you need to find all content you’ve ever written on a certain topic)
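To make the first of those filters concrete, here is a minimal sketch of what a “thin content” report boils down to. The field names (`url`, `word_count`) and the 300-word threshold are my own illustrative assumptions, not JetOctopus’s actual export schema:

```python
# Hypothetical sketch: filtering a crawl export for "thin" pages.
# The data shape and threshold here are illustrative assumptions.

THIN_THRESHOLD = 300  # words; pick a cutoff that suits your site


def find_thin_pages(pages, threshold=THIN_THRESHOLD):
    """Return pages whose word count falls below the threshold,
    thinnest first."""
    thin = [p for p in pages if p["word_count"] < threshold]
    return sorted(thin, key=lambda p: p["word_count"])


pages = [
    {"url": "/about", "word_count": 120},
    {"url": "/guide", "word_count": 2400},
    {"url": "/tag/misc", "word_count": 45},
]

for page in find_thin_pages(pages):
    print(page["url"], page["word_count"])
```

Sorting thinnest-first means your content team can work down the list from the pages that need the most new copy.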
Naturally, there are a lot of features targeting a more technically equipped user. JetOctopus helps dev teams diagnose all kinds of errors that hinder a smooth user experience or prevent search crawlers from accessing your site:
- Broken links
- Pages (accidentally) blocked by Robots.txt or Robots Meta tags
- Orphan pages
- Redirect chains
- Pages that are too large
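One check from that list is easy to reproduce by hand: pages (accidentally) blocked by robots.txt. Python’s standard library ships a robots.txt parser, so you can verify locally which URLs a given crawler may fetch. The robots.txt rules and URLs below are made-up sample data:

```python
# Minimal sketch: checking which URLs a crawler is allowed to fetch
# under a given robots.txt, using only the standard library.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /drafts/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

urls = [
    "https://example.com/blog/post",
    "https://example.com/drafts/wip",
    "https://example.com/search?q=seo",
]

for url in urls:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "OK" if allowed else "BLOCKED")
```

In a real audit you would point `RobotFileParser` at your live robots.txt (via `set_url()` and `read()`) and feed it the URL list from your crawl export.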
Internal Linking Analysis
We are all pretty sure (and anyone working with at least one site has seen the experimental evidence for it) that internal links help a page rank better in search. So how come we have so few tools analyzing internal links for each particular page?
We have a few powerful platforms analyzing incoming links from other domains, but to the best of my knowledge there’s no good solution for seeing how many internal in-links a web page has.
JetOctopus has just introduced a great feature our industry is missing: “Linking Explorer” lets you see how many pages within your site link to a particular page (or pages) and, more importantly, which anchor text those internal links have:
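To illustrate what a “Linking Explorer”-style report computes, here is a rough sketch that maps each internal URL to the anchor texts pointing at it, using only the standard library. The `page_html` mapping is made-up test data standing in for crawled pages; this is my own illustration, not JetOctopus’s implementation:

```python
# Rough sketch: counting internal in-links per page and collecting
# the anchor text of each link. Sample HTML is illustrative.
from collections import defaultdict
from html.parser import HTMLParser


class AnchorCollector(HTMLParser):
    """Collects (href, anchor_text) pairs from a page's HTML."""

    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None


def inlink_report(page_html):
    """Map each internal target URL to the anchor texts linking to it."""
    report = defaultdict(list)
    for source, html in page_html.items():
        collector = AnchorCollector()
        collector.feed(html)
        for href, text in collector.links:
            if href.startswith("/"):  # keep internal links only
                report[href].append(text)
    return report


page_html = {
    "/home": '<a href="/pricing">See pricing</a>',
    "/blog": '<a href="/pricing">our plans</a> <a href="/home">Home</a>',
}

for target, anchors in inlink_report(page_html).items():
    print(target, len(anchors), anchors)
```

The length of each anchor list is the in-link count for that page, and scanning the anchor texts themselves quickly shows whether your internal linking is descriptive or all “click here”.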
Takeaway: Dig as Deep as You Need / Can
The beauty of SEO crawlers is that everyone uses them differently. An SEO crawler isn’t supposed to show you the way: instead, you can play with the data in your own way to identify what matters to you based on your focus and specialty.
JetOctopus accomplishes this task in an almost perfect way: Its Data Table view gives you all the filters and options to find whatever it is you are looking for, be it canonical tags, redirects, load time metrics or almost anything else under the sun.
I’d probably argue with some of the things JetOctopus identifies as issues (e.g. too-short or too-long title tags), and sometimes I’ve seen it label pages with “multiple title tags” even though I could clearly see only one in the code. But I don’t expect to always agree with an SEO tool, as we don’t have clearly set industry standards in many cases.