Historically, search engine crawlers like Googlebot could only read static HTML source code and were unable to scan and index content generated dynamically with JavaScript. That changed with the rise of JavaScript-heavy websites and frameworks such as Angular, React, and Vue.js, along with single-page applications (SPAs) and progressive web applications (PWAs). To display web pages correctly before indexing them, Google deprecated its earlier AJAX crawling scheme in favor of rendering pages directly. While Google can generally crawl and index most JavaScript content, it still recommends against relying solely on client-side rendering, because JavaScript "is difficult to process, and not all search engine crawlers can process it correctly or quickly."
What is a Google crawler?
Google and other search engines use software called crawlers (also known as search bots or spiders) to scan the web. In other words, a crawler "crawls" the Internet from page to page, looking for fresh or updated content that isn't already in Google's database.
Each search engine has its own collection of crawlers. Google runs more than 15 different types of crawlers, with Googlebot being the main one. Since Googlebot performs both crawling and indexing, let's examine how it works in more detail.
How does the Google crawler work?
No search engine (including Google) maintains a central register of URLs that is updated every time a new page is created. This means Google has to search the Internet for new pages rather than being automatically alerted to them. Googlebot is constantly prowling the web, looking for new pages to add to Google's inventory of existing pages.
Once a new page is found, Googlebot renders (or "visualizes") it in a browser by loading all of its HTML, third-party code, JavaScript, and CSS. Search engines use this data, saved in their databases, to index and rank pages. If the page can be indexed, it is added to the Google index, another very large Google database.
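To see why the rendering step matters, consider a page whose raw HTML contains only an empty container that a script fills in at load time. The sketch below simulates this with a hypothetical page and a stand-in `render` function; the HTML and text are made-up illustrations, not a real site.

```javascript
// What a source-only crawler downloads: an empty container, no content.
const rawHtml = '<div id="app"></div>';

// Simulates the page's client-side script, which injects the real content
// only after JavaScript runs (as a rendering crawler would observe).
function render(html) {
  return html.replace(
    '<div id="app"></div>',
    '<div id="app">Product list loaded by JavaScript</div>'
  );
}

const renderedHtml = render(rawHtml);

console.log(rawHtml.includes('Product list'));      // false — nothing to index
console.log(renderedHtml.includes('Product list')); // true — indexable after rendering
```

A crawler that stops at the raw HTML sees nothing worth indexing; only after executing the script does the content exist.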
JavaScript and HTML Rendering
Lengthy or messy code can be difficult for Googlebot to process and render. If the code is not clean, the crawler may fail to render your site correctly, in which case the page may be treated as empty.
Regarding JavaScript rendering, keep in mind that the language is evolving rapidly and Googlebot may not yet support the latest features. Make sure your JavaScript is compatible with Googlebot to avoid having your site rendered incorrectly. Also ensure your JavaScript loads quickly: Googlebot will not render and index script-generated content if it takes longer than five seconds to load.
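One practical way to act on the five-second guideline above is to flag scripts whose load time exceeds that budget before they ship. The sketch below is a hypothetical build-step check; the script names and timings are placeholder values you would replace with real measurements (for example, from a lab test).

```javascript
// Render budget derived from the ~5-second window discussed above.
const RENDER_BUDGET_MS = 5000;

// Hypothetical measurements — substitute real lab-test data here.
const scriptTimings = [
  { src: 'main.bundle.js', loadMs: 1200 },
  { src: 'analytics.js', loadMs: 300 },
  { src: 'product-feed.js', loadMs: 6200 }, // too slow: content may be skipped
];

// Any script over budget risks its generated content not being indexed.
const overBudget = scriptTimings.filter(s => s.loadMs > RENDER_BUDGET_MS);

for (const s of overBudget) {
  console.log(`Warning: ${s.src} takes ${s.loadMs} ms, over the ${RENDER_BUDGET_MS} ms budget`);
}
```

A check like this catches slow, content-generating scripts early, before Googlebot silently drops their output.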
When should you use JavaScript crawling?
Although Google will typically render every page, we still recommend using JavaScript crawling selectively: when first analyzing how much a site relies on JavaScript, when auditing known client-side dependencies, and during deployments on large sites.
With JavaScript crawling, all resources (including JavaScript, CSS, and images) must be fetched to render each web page and build the DOM in a headless browser behind the scenes. This makes JavaScript crawling slower and more resource-intensive.
While this isn't a problem for smaller sites, it can have a significant impact on larger sites with hundreds of thousands or even millions of pages. If your website doesn't rely heavily on JavaScript to dynamically change its pages, there's no need to spend the extra time or resources.
When processing JavaScript and pages with dynamic content, the crawler must read and evaluate the Document Object Model (DOM). After all the code has loaded and executed, a fully rendered version of the page must be generated. Browsers are the easiest tool for viewing rendered pages; for this reason, JavaScript crawling is often described as using a "headless browser."
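To make "building the DOM" concrete, here is a toy model of the tree a crawler constructs before any text can be extracted. This is an illustrative sketch, not a real HTML parser or headless browser; the node structure and page content are invented for the example.

```javascript
// Minimal node factory for a toy DOM tree.
function makeNode(tag, children = [], text = '') {
  return { tag, children, text };
}

// Hand-built tree for: <body><h1>Title</h1><div><p>Body text</p></div></body>
const dom = makeNode('body', [
  makeNode('h1', [], 'Title'),
  makeNode('div', [makeNode('p', [], 'Body text')]),
]);

// Walk the rendered tree and collect visible text, the way an
// indexer extracts content from a fully built DOM.
function extractText(node) {
  const own = node.text ? [node.text] : [];
  return own.concat(...node.children.map(extractText));
}

console.log(extractText(dom).join(' ')); // "Title Body text"
```

The point is that the indexable text only exists once the tree is fully built, which is exactly the work a headless browser performs during JavaScript crawling.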
In conclusion
JavaScript is here to stay, and there will only be more of it in the coming years. JavaScript can coexist peacefully with SEO and crawlers as long as you involve SEO early when designing your website architecture. Crawlers are still only approximations of how real search engine bots behave. In addition to JavaScript crawlers, we strongly recommend using log file analysis, Google's URL Inspection tool, or the Mobile-Friendly Test to understand what Google can crawl, render, and index.
The above is the detailed content of Will Google crawl JavaScript that contains body content?. For more information, please follow other related articles on the PHP Chinese website!
