What is a Log File?

Log files are raw, unfiltered records of the traffic to your website, generated by the requests made to your server. They let you see how search engines are crawling your website and what information they are finding. A log file contains a history of page requests made to a website, from both humans and robots.

What is Log File Analysis?

Every time a search engine interacts with your website, the server records the requesting IP address, the status code, a timestamp (the date and time of the request), the URL requested, and the user agent (e.g. Googlebot). The log files are then acquired from the server and analysed to provide advanced SEO recommendations as part of a technical SEO audit of your site.
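
To make that raw material concrete, here is a minimal parsing sketch in Python, assuming the widely used combined log format (the Apache/Nginx default); the sample line in the comment is invented for illustration:

```python
import re

# Matches the combined log format, the default in Apache and Nginx.
# A made-up sample line for illustration:
# 66.249.66.1 - - [10/Mar/2024:13:55:36 +0000] "GET /products HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<user_agent>[^"]*)"'
)

def parse_line(line: str) -> dict | None:
    """Extract the IP, timestamp, URL, status code, and user agent from one entry."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None
```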

Analysing your log files gives a better understanding of how your site is being accessed and crawled by search engines. Through log file analysis, we can identify the relationship between crawl frequency and page performance, status errors, uncrawled content, and more.
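
As an illustration of that first pass, the sketch below (building on the parser above) counts Googlebot hits per URL and per status code. Note that matching "Googlebot" in the user agent string alone can be spoofed, so production analysis should verify hits via reverse DNS:

```python
from collections import Counter

def summarise_bot_activity(log_path: str) -> None:
    """First-pass crawl summary: Googlebot hits per URL and per status code."""
    hits_per_url: Counter[str] = Counter()
    hits_per_status: Counter[str] = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            entry = parse_line(line)  # parser sketched above
            if entry and "Googlebot" in entry["user_agent"]:
                hits_per_url[entry["url"]] += 1
                hits_per_status[entry["status"]] += 1
    print("Most-crawled URLs:", hits_per_url.most_common(10))
    print("Status code breakdown:", dict(hits_per_status))
```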

How can Log File Analysis be useful for SEO?

Through log file analysis, we can understand how Google views your site and which pages the crawlers focus on. A noticeable change in bot activity can be a sign of algorithm updates or site changes that impact your SEO, and log file analysis lets us detect these changes before issues arise, so we can check the affected pages and fix any errors. We can also identify issues relating to site structure and usability which affect organic traffic to your site, and as a result your conversions.
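
One way to spot such changes early is to compare each day's bot hit count against a recent baseline; the sketch below uses an assumed seven-day trailing window and a 50% drop threshold, both illustrative defaults rather than fixed rules:

```python
from statistics import mean

def flag_crawl_anomalies(daily_hits: dict[str, int],
                         window: int = 7, threshold: float = 0.5) -> list[str]:
    """Flag days whose bot hits fall below `threshold` times the trailing average.

    `daily_hits` maps ISO dates such as "2024-03-10" to Googlebot hit counts.
    """
    days = sorted(daily_hits)
    flagged = []
    for i in range(window, len(days)):
        baseline = mean(daily_hits[d] for d in days[i - window:i])
        if baseline and daily_hits[days[i]] < baseline * threshold:
            flagged.append(days[i])
    return flagged
```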

It is essential to have enough data to establish a useful baseline – usually three months of raw data acquired from your server, covering all the hits from bots and users. We then apply a combination of spreadsheet and software analysis, in which we:

  • Examine how much crawl budget (the number of pages a search engine will crawl each time it visits your site) is being wasted and where.
  • Identify which pages search engines prioritise and consider important: log files show which sections or pages are being crawled and how frequently.
  • Determine crawl and indexing issues: Google might be ignoring (not crawling or indexing) crucial pages or sections of your website. Log file analysis shows whether this is the case and helps identify the reasons for it, ensuring the pages valuable to your site are crawled in the future.
  • Assess and improve page/accessibility errors: log files reveal pages which are not responding or return 3xx, 4xx, or 5xx response codes, and which need to be redirected and/or fixed before search bots can crawl them properly. By analysing these errors (the status-code summary sketched above is a starting point), we can measure their impact on bot hits and crawl frequency, and identify issues relating to website content and structure.
  • Identify slow pages, and determine orphaned pages which receive bot hits but do not show up in a site crawl (see the sketch after this list).
  • Determine if your website has switched to Google's mobile-first indexing (see the sketch after this list): your website should be optimised for mobile users, with a responsive design for improved viewing and faster loading speeds.
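
To illustrate the last two checks, the sketch below surfaces orphaned pages by comparing logged URLs against a crawl export, and splits Googlebot hits between its smartphone and desktop user agents; a sustained majority of smartphone hits is the usual sign that a site has moved to mobile-first indexing. The input sets and entry format are assumptions carried over from the earlier sketches:

```python
def find_orphan_pages(log_urls: set[str], crawled_urls: set[str]) -> set[str]:
    """URLs that receive bot hits in the logs but never appear in a site crawl.

    `crawled_urls` would come from your crawler's export; the set difference
    itself is the whole technique.
    """
    return log_urls - crawled_urls

def mobile_vs_desktop_split(entries: list[dict]) -> tuple[int, int]:
    """Count Googlebot smartphone vs desktop hits from parsed log entries.

    Googlebot's smartphone user agent contains "Mobile"; a large majority of
    smartphone hits suggests the site is on mobile-first indexing.
    """
    mobile = desktop = 0
    for entry in entries:
        ua = entry["user_agent"]
        if "Googlebot" in ua:
            if "Mobile" in ua:
                mobile += 1
            else:
                desktop += 1
    return mobile, desktop
```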

Think we make a good match? We’d love to hear from you! Get In Touch