Is my site in Google's mobile-first index?

Your site's web server logs will answer the question

As we know, a few sites have already moved to Google's mobile-first index.

Is your site among those few?

How can we tell when a site has moved to Google's mobile-first index?

In the Hangout of 15/12/2017, John Mueller gave SEOs the following advice:

    “If you watch out for your log files probably you can notice [your site’s transition to the mobile-first index] fairly obviously.”

He explains that, on an average site, about 80% of the crawl comes from googlebot-desktop and roughly 20% from googlebot-smartphone.

Once your site has moved to the mobile-first index, that majority reverses: googlebot-smartphone will be crawling your site more than googlebot-desktop.
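
Before running the full analysis, a quick way to eyeball this split is to count Googlebot requests directly in the access logs. The snippet below is only a rough sketch: it assumes log files named like *2017*.log (as in the extraction command further down), assumes at least one Googlebot hit, and distinguishes the smartphone crawler by the "Android" token in its user-agent.

# Rough check of the googlebot-smartphone vs googlebot-desktop share (a sketch)
mobile=$(grep "66\.249.*Googlebot" *2017*.log | grep -c "Android")
desktop=$(grep "66\.249.*Googlebot" *2017*.log | grep -cv "Android")
total=$((mobile + desktop))
echo "googlebot-smartphone: $mobile ($((100 * mobile / total))%)"
echo "googlebot-desktop:    $desktop ($((100 * desktop / total))%)"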

For that reason, I decided to compare the crawls of the two user-agents, googlebot-desktop and googlebot-smartphone, on my site.

These are the web server logs of my website https://www.searchdatalogy.com/ for the period between 01/01/2017 and 18/12/2017: 352 days of web server log files, almost 12 months. (Not all of them are visible in the screenshot; there are too many to fit on the screen.)
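
The number of daily files can be double-checked with a one-liner (assuming, as here, one access log file per day with the year in its name):

ls -1 *2017*.log | wc -l    # should print 352 for the period analysed here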

My website runs on Nginx; below is the format of its access logs:

66.249.76.115 - - [18/Dec/2017:02:58:22 -0500] "GET / HTTP/1.1" 200 6105 "-" "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" "164734" "2"
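
For reference, this line looks like Nginx's default "combined" log_format with two extra quoted fields appended. The directive below is a hedged reconstruction, not the site's actual configuration; $custom_field_1 and $custom_field_2 are placeholders for whatever the two trailing values really are:

# Guess at the log_format behind the sample line above (trailing variables are placeholders)
log_format main '$remote_addr - $remote_user [$time_local] "$request" '
                '$status $body_bytes_sent "$http_referer" '
                '"$http_user_agent" "$custom_field_1" "$custom_field_2"';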

The commands used to extract data from the web server log files, and the graph built from that input, are as follows:

1) The commands used to extract input data from web server logs for the graph:

grep "66\.249.*Googlebot" *2017*.log |sed  's/\"Mozilla.*Linux.*Android.*Nexus.*AppleWebKit.*\"/0/g'|sed 's/\"\(Mozilla\|Googlebot\).*www.google.com\/bot.html.*/1/g'|awk -F'[' '{print $2}'|cut -c1-6,33-|awk -F' ' '{print $1,$2,$7}'| sort -u |uniq | awk '{print $1,$3}' |sort |uniq -c |awk '{print $2,$3,$1}'|awk '{my_dict[$1][$2] = $3} END { for (key in my_dict) { print key,my_dict[key][0],my_dict[key][1]}}'|awk  '{for(i=1; i<=3; i++) if(length($i)==0) $i=0; print }'|sed 's/\/2017//g'|awk -F' ' '{print $1,$2,$3,$2+$3}'|sed 's/\//-/g'|sort -t$'-' -k 2M -k1 >googlebotscrawl

The columns after the date show the number of Googlebot crawls per user-agent: the second column is googlebot-smartphone, the third is googlebot-desktop, and the fourth holds the total number of URLs crawled by both user-agents. Days where a user-agent crawled nothing get a 0 in the output file; this is needed for the graph that will be created with gnuplot later on.
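
As a quick sanity check, the overall smartphone/desktop split can be read back from this file with a one-liner like the following (a sketch, assuming the four columns described above):

awk '{m += $2; d += $3} END {printf "smartphone %.1f%%, desktop %.1f%%\n", 100*m/(m+d), 100*d/(m+d)}' googlebotscrawl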

2) The code in the file histogramgooglebotuseragent.p, which is run by gnuplot:

clear
reset
unset key
set terminal pngcairo  font "verdana,8" size 1200,400
# graph title
set title "URLS CRAWLED BY GOOGLEBOT USERAGENT BY DAY https://www.searchdatalogy.com"
set grid y
#y-axis label
set yrange [0:100]
set ylabel "% of total"
set key invert reverse Left outside
set output "GooglebotUserAgent.png"
set xtics rotate out
set ytics nomirror
# Select histogram data
set style data histogram
# Give the bars a plain fill pattern, and draw a solid line around them.
set boxwidth 0.75
set style fill solid 1.00 border -1
set style histogram rowstacked
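# label the x-axis only every 5th day; map data columns 2 and 3 to a bar colour and a legend title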
xticreduce(col) = (int(column(col))%5 ==0)?stringcolumn(1):""
colorfunc(x) = x == 2 ?  "blue" : x == 3 ?  "red"  : "blue"  
titlecol(x) = x == 2 ?  "Smartphone" : x == 3 ?  "Desktop" :  "2"  
plot for [i=2:3] 'googlebotscrawl'  using   (100.*column(i)/column(4)):xtic(xticreduce(1)) title titlecol(i) lt rgb  colorfunc(i)
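
The graph is then produced by running the script through gnuplot, which writes GooglebotUserAgent.png next to the input file:

gnuplot histogramgooglebotuseragent.p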

3) The graph showing 352 days of URLs crawled by Googlebot user-agent, per day:

Thanks for taking the time to read this post. I offer consulting, architecture and hands-on development services in web/digital to clients in Europe & North America. If you'd like to discuss how my offerings can help your business, please contact me via LinkedIn.

Have comments, questions or feedback about this article? Please do share them with us here.
