What is a Log File?
Log files are a raw, unfiltered record of the traffic to your website, generated by the requests made to your server. They allow you to understand how search engines are crawling your website and what information they are finding. A log file contains a history of page requests for a website, from both humans and bots.
What is Log File Analysis?
Every time a search engine interacts with your website, the server records the requesting IP address, status code, timestamp (the date and time the request was made), requested URL, and user agent (e.g. Googlebot). Log files are then retrieved from the server and analysed to inform advanced SEO recommendations as part of a technical SEO audit of your site.
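As a rough illustration of the fields described above, here is a minimal Python sketch that parses a single access-log entry. It assumes the common Apache/NGINX "combined" log format; the sample log line, IP address, and URL are made up for demonstration, and a real server's format may differ.

```python
import re

# Regex for the Apache/NGINX "combined" log format (an assumption;
# check your own server's configured format before relying on this).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) \S+" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<user_agent>[^"]*)"'
)

def parse_log_line(line):
    """Extract IP, timestamp, URL, status code, and user agent from one entry."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

# A made-up sample request from Googlebot:
sample = ('66.249.66.1 - - [10/Mar/2024:10:15:32 +0000] '
          '"GET /products/widget HTTP/1.1" 200 5120 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

entry = parse_log_line(sample)
print(entry["status"])                      # "200"
print(entry["url"])                         # "/products/widget"
print("Googlebot" in entry["user_agent"])   # True
```

Checking the user-agent string this way is only a first pass; user agents can be spoofed, so a thorough audit would also verify bot IPs against the search engine's published ranges.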
Analysing your log files allows for a better understanding of how your site is being accessed and crawled by search engines. Through log file analysis, we can identify the relationship between search engine bot frequency and page performance, possible status errors, un-crawled content, and more.
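To make the idea of that analysis concrete, the short sketch below aggregates a handful of already-parsed log entries (hypothetical dicts with `url`, `status`, and `user_agent` keys) to surface two of the signals mentioned above: how often a bot crawls each URL, and which crawled URLs return errors.

```python
from collections import Counter

# Hypothetical parsed log entries, e.g. the output of a log-line parser.
entries = [
    {"url": "/", "status": "200", "user_agent": "Googlebot/2.1"},
    {"url": "/old-page", "status": "404", "user_agent": "Googlebot/2.1"},
    {"url": "/blog", "status": "200", "user_agent": "Mozilla/5.0"},
    {"url": "/old-page", "status": "404", "user_agent": "Googlebot/2.1"},
]

# Keep only search-engine bot requests, then count crawl frequency and errors.
bot_hits = [e for e in entries if "Googlebot" in e["user_agent"]]
crawl_frequency = Counter(e["url"] for e in bot_hits)
error_hits = Counter(
    e["url"] for e in bot_hits if e["status"].startswith(("4", "5"))
)

print(crawl_frequency.most_common())  # most-crawled URLs first
print(error_hits)                     # crawled URLs that returned 4xx/5xx
```

In this toy data, `/old-page` is crawled repeatedly but always returns a 404, exactly the kind of wasted crawl budget a log file analysis is meant to expose.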