Debug JavaScript and CSS Errors with Fetch and Render in Google Webmaster Tools
Google recently announced on the Google Webmaster Central Blog that it is releasing a new tool that helps you debug your website's JavaScript and CSS issues. Now you can see how Googlebot renders a webpage.
Google is trying to improve its indexing: until now, Google indexed only content, but instead of showing raw HTML output, it will now show you a visual representation of what Googlebot sees on a webpage.
It will also show you which resources Googlebot cannot access while crawling your website, such as JavaScript, CSS, and other files blocked by robots.txt.
Google announced this option in Google Webmaster Tools as “Fetch and Render”, right next to “Fetch as Google”.
You can also tell Google which device to fetch your website as, such as a desktop, smartphone, or feature phone. This shows you an accurate preview of your webpages at that device's screen resolution.
Specifically, Google will show you whether it runs into issues while crawling and indexing your website because of JavaScript or CSS implementation mistakes.
Google said:
We have been gradually
improving how we do this for some time. In the past few months, our indexing
system has been rendering a substantial number of web pages more like an
average user’s browser with JavaScript turned on.
During this process, they’ve run into several frequent problems that may “negatively impact” your pages’ ranking in the search results. Google has listed some of those problems, so one would assume the new tool will highlight these issues to webmasters.
These are the highlighted
issues:
- If resources like JavaScript or CSS in separate files are blocked (by robots.txt) so that Googlebot can’t retrieve them, our indexing systems won’t be able to see your website like an average user. We recommend allowing Googlebot to retrieve JavaScript and CSS so that your content can be indexed better. This is especially important for mobile websites, where external resources like CSS and JavaScript help our algorithms understand that the pages are optimized for mobile.
- If your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on our capability to render your pages. If you’d like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources.
- It’s always a good idea to have your site degrade gracefully. This will help users enjoy your content even if their browser doesn’t have compatible JavaScript implementations. It will also help visitors with JavaScript disabled or off, as well as search engines that can’t execute JavaScript yet.
- Sometimes the JavaScript may be too complex or arcane for us to execute, in which case we can’t render the page fully and accurately.
- Some JavaScript removes content from the page rather than adding, which prevents us from indexing the content.
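As a sketch of the first point, a robots.txt along these lines lets Googlebot fetch your stylesheet and script folders instead of blocking them. The folder paths here are illustrative, not from the announcement; adjust them to where your own site serves its assets:

```
# Blocking an asset folder like this hides CSS/JS from Googlebot:
#   Disallow: /assets/
# Explicitly allowing the CSS and JS paths lets Googlebot render
# your pages more like a normal browser would:
User-agent: Googlebot
Allow: /assets/css/
Allow: /assets/js/
```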
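The graceful-degradation and "removing content" points above can be sketched in a few lines of JavaScript. The markup and function names here are hypothetical, just to contrast the two approaches: enhancement appends to server-rendered content, while wholesale replacement leaves nothing for visitors and crawlers that never run the script.

```javascript
// Server-rendered HTML with the essential content already baked in,
// so it is visible even when JavaScript never executes:
var baseHtml = '<p id="summary">Plans start at $10/month.</p>';

// Good: enhancement appends richer content and keeps the fallback intact.
function enhance(html, extra) {
  return html + '<p class="enhanced">' + extra + '</p>';
}

// Anti-pattern: replacing the markup wholesale, so anything that cannot
// execute this script (older browsers, some crawlers) sees no content.
function replaceAll(extra) {
  return '<p class="enhanced">' + extra + '</p>';
}

var enhanced = enhance(baseHtml, 'Live pricing loaded via JavaScript.');
var broken = replaceAll('Live pricing loaded via JavaScript.');
// enhanced still contains the server-rendered summary; broken does not.
```

The same idea applies whether you build strings on the server or append DOM nodes in the browser: ship the core content first, and let scripts add to it rather than replace it.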
Google says the new tool is coming to Google Webmaster Tools “in the coming days.” This is the tool Google was talking about. It helps Google index websites better, and it also shows you the blocked resources that Googlebot can’t retrieve, such as JavaScript, CSS, and so forth.
Anish Tiwari is an expert Search Engine Optimizer and blogger. A passionate learner, he keeps a keen eye on modern gadgets and IT news. Also known as Anish Tiwari SEO, he has a deep love for Search Engine Optimization, blogging, and Social Media Optimization.