
javascript - Why is it said that separating the front end and back end is bad for SEO?

I recently built a multi-page Vue project, but I have seen many people online say that separating the front end and back end is bad for SEO. Why is that? Is it better for SEO to write static pages on the front end and hand them to the back end for server-side rendering? Also, should logical operations be handled on the front end or the back end? For example, when sorting air tickets, does the front end sort the data and render the page with JS, or does the back end do the sorting?

PHP中文网 · 2711 days ago

All replies (3)

  • phpcn_u1582 · 2017-05-19 10:13:53

    The essence of SEO is that one server sends a request to another server and parses the response. Generally speaking, though, search engines will not go on to execute the JavaScript in that response. In other words, with a single-page application no data is rendered into the HTML on the server side; the data is only rendered in the browser, so the HTML the search engine receives contains no rendered data. That makes the content very hard for search engines to index. Server-side rendering therefore tries to get the data into the page before the server sends it to the browser.
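
    A minimal sketch of that idea, assuming Vue 2 with the vue-server-renderer package installed (the flight list is made-up data): the server renders the component to HTML before responding, so the crawler receives markup that already contains the data.

    ```javascript
    // Sketch: server-side rendering with Vue 2 + vue-server-renderer.
    const Vue = require('vue');
    const { createRenderer } = require('vue-server-renderer');

    const renderer = createRenderer();

    const app = new Vue({
      // In a real app this data would be fetched on the server first.
      data: { flights: ['CA1501', 'MU5102', 'HU7801'] },
      // Render function, so this works with the runtime-only build.
      render(h) {
        return h('ul', this.flights.map(f => h('li', f)));
      }
    });

    renderer.renderToString(app, (err, html) => {
      if (err) throw err;
      // html already contains the <li> rows; a client-only build would
      // instead send an empty shell plus a JS bundle.
      console.log(html);
    });
    ```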

    As for the second question, data-logic operations generally belong on the back end. If there are only a few rows to sort, sorting on the front end or the back end makes no difference; but with 1,000 rows the front end would have to request all of the data just to sort it, which is clearly unreasonable. See the sketch below.
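
    A sketch of the back-end approach, using a hypothetical Express endpoint (/api/tickets, the page size, and the ticket data are all made up): the server sorts the full list and returns one page at a time, so the browser never downloads all 1,000 rows.

    ```javascript
    // Sketch: back-end sorting plus paging with Express.
    const express = require('express');
    const app = express();

    // Stand-in for rows that would really come from a database query
    // (e.g. ORDER BY price with LIMIT/OFFSET).
    const tickets = Array.from({ length: 1000 }, (_, i) => ({
      flight: 'CA' + (1000 + i),
      price: Math.round(Math.random() * 2000)
    }));

    app.get('/api/tickets', (req, res) => {
      const page = parseInt(req.query.page, 10) || 1;
      const pageSize = 20;
      // Sort a copy, then return only the requested page.
      const sorted = tickets.slice().sort((a, b) => a.price - b.price);
      res.json(sorted.slice((page - 1) * pageSize, page * pageSize));
    });

    app.listen(3000);
    ```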

  • 巴扎黑 · 2017-05-19 10:13:53

    Because search engines generally don't like executing JavaScript, back-end rendering is better. You can try it yourself: Google seems unable to find articles in Zhihu columns at all, and the Internet Archive cannot capture Zhihu columns either.

    My personal opinion: for content-focused websites, render the core content on the back end as much as possible; that also makes the site usable outside a browser. For software-like websites, such as various SPAs, you don't need to worry about any of this; caring about it gains you nothing.

  • 黄舟 · 2017-05-19 10:13:53

    The basic principle of a search-engine crawler is to crawl your URL, fetch your HTML source, and parse it. A page like yours typically uses the data-binding mechanism of a JS framework such as Vue to display its data, so the HTML the crawler obtains is your template skeleton rather than the page as finally rendered with data. Rendering data with JS is therefore unfriendly to SEO.
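
    For concreteness, this is roughly the skeleton such a crawler would receive from a client-rendered Vue app (a hypothetical example; app.js is a made-up bundle name). The data only appears after a browser executes the bundle:

    ```html
    <!-- What the crawler fetches: the template skeleton, with no data. -->
    <div id="app">
      <ul>
        <!-- <li> rows appear only after the browser runs app.js -->
      </ul>
    </div>
    <script src="/app.js"></script>
    ```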

    ps: Also, it's not true that no engine can crawl JS. Google, for example, has long been able to parse JS content, though a crowd of "dumb" domestic engines still can't. The reason Zhihu can't be found on Google is the site's robots.txt configuration: Zhihu blocks search engines by default. A rule-abiding engine will respect that configuration and not crawl, though some rogue crawlers may not.
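
    To illustrate the robots.txt point, a hypothetical file like this at the site root asks every crawler to stay out; compliant engines honor it, rogue ones may not:

    ```
    # Hypothetical robots.txt blocking the whole site for all crawlers.
    User-agent: *
    Disallow: /
    ```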
