Using Log File Analysis to Improve How Google Crawls My Website
Author: TM Blast
Uploaded: 2023-12-18
Views: 830
Description:
Understanding how Google crawls your website starts with log file analysis. Log file analysis takes the raw access logs from your server (for me, that's cPanel) and loads them into Screaming Frog to analyze the data. From there, I like to see the number of events Googlebot has on my site, which shows which pages it has crawled.
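As a rough sketch of what that analysis does under the hood (Screaming Frog handles this for you), the snippet below assumes a combined Apache/NGINX access-log format, which is what cPanel typically exposes, filters lines by Googlebot's user-agent string, and counts crawl events per top-level path. The sample log lines and section names are illustrative, not taken from the video:

```python
import re
from collections import Counter

# Matches the request path and user agent in a combined-format access log line.
LOG_PATTERN = re.compile(
    r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_events_by_section(log_lines):
    """Count Googlebot crawl events grouped by the first path segment."""
    counts = Counter()
    for line in log_lines:
        match = LOG_PATTERN.search(line)
        if match and "Googlebot" in match.group("agent"):
            path = match.group("path")
            section = "/" + path.lstrip("/").split("/", 1)[0]
            counts[section] += 1
    return counts

# Hypothetical log lines: two Googlebot hits and one regular visitor.
sample = [
    '66.249.66.1 - - [18/Dec/2023:10:00:00 +0000] "GET /blog/old-post HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [18/Dec/2023:10:01:00 +0000] "GET /services/seo HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [18/Dec/2023:10:02:00 +0000] "GET /blog/old-post HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_events_by_section(sample))  # Counter({'/blog': 1, '/services': 1})
```

Dividing each section's count by the total gives the crawl-share percentages discussed below.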
In my video, I had just under 900 unique events, which doesn't mean anything on the surface. However, I saw that my blog posts made up around 25% of that activity, which was alarming since most of them dated back to 2015 and had zero relevance to what I'm doing today. Another 15% (maybe closer to 20%) of my crawl budget was wasted on WordPress plugin paths that don't need to be crawled.
Therefore, by removing those blog posts (which I did on 12-18-2023) and blocking the plugin paths in my robots.txt file, I'm giving Google about 50% more guidance on where I want it to crawl on my site. In this case, I want Google to crawl my service pages, which it has only crawled once or twice in about a month, so they can get indexed.
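The exact paths to block will vary by site. A minimal sketch of what that robots.txt change might look like, assuming the plugin files live under WordPress's standard /wp-content/plugins/ directory (an assumption, not a path stated in the video):

```
# Keep crawlers out of plugin paths that waste crawl budget.
User-agent: *
Disallow: /wp-content/plugins/
```

One caveat worth weighing: some plugins serve CSS or JavaScript that Google uses to render pages, so it's worth checking in Search Console that a blanket Disallow doesn't block resources your important pages depend on.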
Are you interested in a Free SEO Audit for your website? https://www.tmblast.com/services/free...
Want to learn more about Greg Kristan from TM Blast? https://www.tmblast.com/about-this-si...
Are you interested in having me work on your SEO Strategy? If so, visit the site and fill out the form!
https://www.tmblast.com/