To find out, you have to dig through your log files. At first glance, log files are a lot of unreadable text, but they are a world full of SEO input. Why should you evaluate log files? Google uses crawlers to access and index all the websites in the world. This is very complex and costs a lot of resources, so each domain is assigned an individual crawl budget. If the crawled pages include many unimportant, low-quality or even faulty pages, that is a bad signal, and the site will then be visited less often.
In Search Console you can see how many URLs Google has crawled, but not exactly which ones. The new coverage report shows part of this, but not everything. If you really want to know how healthy your site is, you have to evaluate your log files. What is a log file anyway? Log files are files that are automatically saved on a website's server. The access logs record every hit, i.e. every request to the server, including those from Googlebot. Depending on the number of page views, such a file can become quite large, so log files are often deleted automatically after just a few weeks. It is therefore important to download or back up the data in time, ideally on a regular basis.
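As an illustration of such a backup, here is a minimal sketch in Python, assuming an Apache-style access log. The paths used (/var/log/apache2/access.log, /backups/logs) are hypothetical placeholders; in practice a script like this would run as a daily cron job.

```python
import gzip
import shutil
from datetime import date
from pathlib import Path

# Hypothetical paths -- adjust to your server's log location and backup target.
LOG_FILE = Path("/var/log/apache2/access.log")
BACKUP_DIR = Path("/backups/logs")

def backup_log() -> Path:
    """Copy the current access log to a date-stamped, gzip-compressed backup."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    target = BACKUP_DIR / f"access-{date.today().isoformat()}.log.gz"
    with LOG_FILE.open("rb") as src, gzip.open(target, "wb") as dst:
        shutil.copyfileobj(src, dst)
    return target

if __name__ == "__main__":
    print(f"Backed up to {backup_log()}")
```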

What does a log file look like? A log file consists of many identically structured lines. Each line corresponds to one request and typically contains this information:

- IP address of the caller
- Time of the call
- URI, i.e. the path of the page accessed
- Protocol
- Status code, i.e. the server's response
- Bytes transferred
- User agent

A line looks like this, for example:

66.249.123.456 - - [28/Nov/2018:03:53:40 +0200] "GET /logfile-analyse HTTP/1.0" 200 25518 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

Because the IP address is stored in log files, they are also relevant in terms of data protection.
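To show how these fields can be extracted, here is a minimal parsing sketch in Python. It assumes the common "combined" log format from the example above; real server configurations can order or omit fields differently, so treat the regular expression as a starting point rather than a definitive parser.

```python
import re

# Regex for the "combined" log format shown above.
# Assumption: the field layout may vary depending on the server configuration.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) (?P<protocol>[^"]+)" '
    r'(?P<status>\d{3}) (?P<bytes>\d+|-) "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = (
    '66.249.123.456 - - [28/Nov/2018:03:53:40 +0200] '
    '"GET /logfile-analyse HTTP/1.0" 200 25518 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"'
)

match = LOG_PATTERN.match(line)
if match and "Googlebot" in match.group("agent"):
    # A Googlebot hit: for SEO analysis, the path and status code
    # are usually the most interesting fields.
    print(match.group("path"), match.group("status"))
```

Note that the user agent string can be spoofed; a thorough analysis would additionally verify Googlebot hits via a reverse DNS lookup on the IP address.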