How to write robots.txt is something every SEO practitioner must know (see "what is robots.txt"), but exactly what the file should contain, and which paths to disallow and which to allow, is something we have to decide for our own site.
Baidu Spider is a machine: it only recognizes numbers, letters and Chinese characters, and robots.txt is the first and most important "dialogue" your site has with Baidu.
While a website is still being built, we do not want Baidu to crawl it, and many people simply block Baidu in robots.txt during this stage. This is a bad approach, because once Baidu Spider has been shut out it will have a hard time coming back to your site. It is therefore better to build the site locally first and buy the domain name and hosting space only after everything is finished; otherwise, repeatedly modifying a live site will have a certain negative impact on it.
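For reference, a robots.txt that blocks the entire site consists of just these two lines; as explained above, this should never be left in place once the site goes live:
User-agent: *
Disallow: /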
The initial robots.txt of our website is written as follows:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
User-agent: * means that the rules below apply to all search engine crawlers.
Disallow: /wp-admin/ and Disallow: /wp-includes/ prohibit Baidu from crawling the private parts of a WordPress site, the admin backend and the core include files, which involve sensitive data such as user passwords and the database. Written this way, the file protects our privacy while still letting Baidu Spider crawl everything else as fully as possible.
If you want to prevent Baidu Spider from crawling a specific page, such as 123.html, add the line "Disallow: /123.html" (note that a single page takes no trailing slash).
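Putting it together, the complete file for this example would then read:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /123.html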
After writing robots.txt, all you need to do is upload it to the root directory of the website, so that it can be reached at yourdomain.com/robots.txt.
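Once the file is uploaded, a quick way to confirm that the rules behave as intended is Python's standard urllib.robotparser module; the sketch below uses example.com as a placeholder for your own domain:

from urllib.robotparser import RobotFileParser

# Point the parser at the live robots.txt in the site root
# (example.com is a placeholder; use your own domain).
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# "*" checks against the rules that apply to all crawlers.
print(rp.can_fetch("*", "https://example.com/wp-admin/"))    # expected: False
print(rp.can_fetch("*", "https://example.com/index.html"))   # expected: True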