SEO parsers

What tasks SEO parsers perform, what functionality they offer, and a selection of the most popular tools.

SEO specialists use parsers for comprehensive site analysis: internal, technical, and external optimization. Some have narrow functionality, while others combine a whole range of professional SEO tools.

The main tasks of an SEO parser (several of these checks are sketched in code after this list):

  • check that the main mirror (the primary version of the domain) is configured correctly;
  • analyze the contents of robots.txt and sitemap.xml;
  • check the presence, length, and content of the title and description meta tags, and the number and content of h1-h6 headings;
  • determine page response codes;
  • generate an XML sitemap;
  • determine page nesting depth and visualize the site structure;
  • check the presence or absence of alt attributes on images;
  • find broken links;
  • check rel="canonical" attributes;
  • provide data on internal linking and the external link profile;
  • display information about technical optimization: page load speed, code validity, mobile friendliness, etc.
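
To make these tasks concrete, here is a minimal sketch of a few of the on-page checks above, written in Python with the third-party requests and beautifulsoup4 packages. The URL is a placeholder; a real SEO parser crawls every page of the site and aggregates the results into a report.

```python
# Minimal sketch of basic on-page checks (response code, title,
# description, h1-h6 counts, canonical, image alt attributes).
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"  # placeholder; a real parser crawls the whole site
resp = requests.get(url, timeout=10)
print("Response code:", resp.status_code)

soup = BeautifulSoup(resp.text, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
print("Title ({} chars): {}".format(len(title), title))

desc_tag = soup.find("meta", attrs={"name": "description"})
desc = (desc_tag.get("content") or "").strip() if desc_tag else ""
print("Description ({} chars): {}".format(len(desc), desc))

canonical = soup.find("link", rel="canonical")
print("Canonical:", canonical.get("href") if canonical else "missing")

for level in range(1, 7):  # h1-h6 counts
    count = len(soup.find_all("h{}".format(level)))
    if count:
        print("h{}: {}".format(level, count))

# Images with a missing or empty alt attribute
no_alt = [img.get("src") for img in soup.find_all("img") if not img.get("alt")]
print("Images without alt:", len(no_alt))
```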

Popular SEO parsers and their functionality

  • Screaming Frog SEO Spider
  • Netpeak Spider
  • ComparseR
  • SiteAnalyzer by Majento
  • SE Ranking
  • A-Parser
  • PR-CY
  • Xenu's Link Sleuth

Screaming Frog SEO Spider

One of the most widely used SEO analyzers, from a British developer. It can be used to quickly and visually find out:

  • content, response code, indexing status of each page;
  • the length and content of title and description;
  • availability and content of h1 and h2 headers;
  • information about images on the site - format, size, indexing status;
  • information on setting up canonical links and pagination;
  • other important data.

The free version is limited to 500 URLs. In the paid version (the license is sold for a year), the number of pages to parse is unlimited, and many more features are available, including extracting prices, product names, and descriptions from any site (the kind of extraction involved is sketched below).
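
For illustration, here is a minimal Python sketch (using the requests and lxml packages) of the kind of XPath-based extraction behind such a feature. The URL and XPath expressions are hypothetical and must be adapted to the actual markup of the target site.

```python
# Hypothetical XPath extraction of product data from a single page.
# Requires: pip install requests lxml
import requests
from lxml import html

url = "https://example.com/product/123"  # placeholder product page
tree = html.fromstring(requests.get(url, timeout=10).content)

# The selectors below assume a typical product-page layout.
name = tree.xpath("string(//h1[@class='product-title'])").strip()
price = tree.xpath("string(//span[@class='price'])").strip()
description = tree.xpath("string(//div[@id='description'])").strip()

print(name, price, description, sep="\n")
```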

Netpeak Spider

A popular tool for comprehensive site analysis. It checks the resource for internal optimization errors and analyzes the relevant SEO parameters: broken links, duplicate pages and meta tags, response codes, redirects, and more. You can import data from Google Search Console and web analytics systems. Agencies can generate branded reports.

The tool is paid; basic features are available in all plans. Free trial period - 14 days.

ComparseR 

This program analyzes a resource for technical errors. Its special feature is that it also shows all of the site's pages in the Yandex and Google indexes. This function is useful for finding out which URLs are missing from the index and which ones are in the search results (and whether those are the pages the optimizer needs).

You can buy and install the program on a single computer. To learn how it works, download the demo version.

SiteAnalyzer by Majento

A free program for scanning all of a site's pages, scripts, documents, and images, used for technical SEO audits. It requires installation on a PC (Windows), but can also run from removable media. It extracts the following data: server response codes, the presence and content of meta tags and headings, rel="canonical" attributes, external and internal links for each page, duplicate pages, and more.

The report can be exported in CSV, XLS, and PDF formats.

SE Ranking Website Analysis

The tool analyzes key site optimization parameters: the existence of robots.txt and sitemap.xml, the main mirror configuration, duplicate pages, response codes, meta tags and headings, technical errors, page load speed, and internal links. After scanning, the site is rated on a 100-point scale. There is an option to create an XML sitemap. A useful option for agencies is the branded report, which can be downloaded in a convenient format or sent by email. Reports can be run manually or on a schedule.

Two payment options are available: per position check and a monthly subscription. Free trial period - 2 weeks.

A-Parser 

This service brings together more than 70 parsers for different purposes: parsing popular search engines, keywords, apps, social networks, Yandex and Google Maps, the largest online stores, content, and more. Besides the ready-made tools, you can program your own parsers based on regular expressions, XPath, or JavaScript. The developers also provide access via an API.
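
As a rough illustration of a regular-expression extraction rule (not A-Parser's own rule syntax, which is not shown here), the Python sketch below pulls a price and currency out of a made-up HTML fragment:

```python
# Illustrative regex extraction; the pattern and HTML fragment are made up.
import re

html_snippet = '<span class="price">1 299 USD</span>'  # hypothetical markup
pattern = re.compile(r'class="price">([\d\s]+?)\s*([A-Z]{3})<')

match = pattern.search(html_snippet)
if match:
    amount, currency = match.groups()
    print(amount, currency)  # -> 1 299 USD
```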

The price depends on the number of options and the period of free updates. The parser's features can be evaluated in the demo version, which is available for six hours after registration.

PR-CY Website Analysis 

An online tool that analyzes sites against more than 70 criteria. It points out optimization errors, suggests possible solutions, and generates an SEO checklist and recommendations for improving the resource. Based on the scan results, the site is given a percentage score.

Only general information is available free of charge: the number of pages in the index, the presence or absence of viruses and search engine filters, the link profile, and some other data. A more detailed analysis is paid; the price depends on the number of websites, their pages, and the checks on the account. Daily monitoring, comparison of indicators with competitors, and export of branded reports are also available. Free trial period - 7 days.

Also worth mentioning are parsers that solve narrowly focused tasks and can be useful for site owners, webmasters, and SEO specialists.

Xenu's Link Sleuth

A free program that parses all of a website's URLs: external and internal links, links to images and scripts, etc. It can be used for various tasks, including finding broken links on a site (a sketch of such a check follows below). The program must be downloaded and installed on your computer (Windows).

For each link, the program shows its status, type (e.g. text/plain or text/html), size, anchor text, and any error.
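
For comparison, here is a minimal Python sketch (using requests and beautifulsoup4) of the same kind of broken-link check, limited to the links found on a single start page; Xenu itself recurses through the entire site.

```python
# Minimal broken-link check for one page; a real crawler recurses site-wide.
# Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

start = "https://example.com/"  # placeholder start page
soup = BeautifulSoup(requests.get(start, timeout=10).text, "html.parser")

# Collect absolute URLs from every <a href="..."> on the page
links = {urljoin(start, a["href"]) for a in soup.find_all("a", href=True)}

for link in sorted(links):
    try:
        # HEAD is cheaper than GET; some servers reject it, so fall back
        r = requests.head(link, allow_redirects=True, timeout=10)
        if r.status_code >= 400:
            r = requests.get(link, timeout=10)
        status = r.status_code
    except requests.RequestException as exc:
        status = "error: {}".format(exc)
    print(status, link)
```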

PromoPult meta tags and headers parser

This tool parses the title, description, and keywords meta tags, as well as the h1-h6 headings. You can use it to analyze your own project or competitors' sites. In the first case, it is easy to identify empty, uninformative, overly long, or overly short meta tags and duplicate metadata; in the second, you can find out which keywords competitors use and work out the structure and logic behind their meta tag formation.

You can add the URL list manually, as an XLSX file, or as a link to the XML sitemap. Reports are exported in HTML and XLSX formats. The first 500 requests are free of charge.
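
The sitemap input mode is easy to reproduce in code. Below is a minimal Python sketch (standard library plus requests) that pulls the URL list out of a sitemap.xml as a starting point for meta-tag checks; the sitemap address is a placeholder.

```python
# Extract the URL list from a sitemap.xml.
# Requires: pip install requests
import requests
import xml.etree.ElementTree as ET

sitemap_url = "https://example.com/sitemap.xml"  # placeholder
root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)

# Standard sitemap namespace for <url><loc> entries
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

print(len(urls), "URLs found")
for url in urls[:10]:
    print(url)
```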

How to choose the right parser

  1. Determine the purpose of parsing: competitor monitoring, catalog filling, checking SEO parameters, or a combination of tasks.
  2. Find out what kind of data you need to get at the output, and in what volume.
  3. Think about how regularly you need to collect and process the data: once, monthly, daily?
  4. If you have a large resource with complex functionality, it makes sense to order a custom parser with flexible settings for your goals. For standard projects, ready-made solutions on the market are enough.
  5. Shortlist several tools and study the reviews. Pay special attention to the quality of technical support.
  6. Match the complexity of the tool to your own skill level (or that of the person responsible for parsing).
  7. Based on the above criteria, select the appropriate tool and plan. Free functionality or a trial period may well be enough for your needs.