
PHP Tutorial: Application Example 15

WBOY · Original · 2016-07-13 16:54:05

Search engine implementation based on Linux
A search engine gives users a tool for quickly locating web page information. Its core function is to take keywords entered by the user, search a back-end database of collected web pages, and return links to and summaries of the relevant pages. By scope, searches are generally divided into site search and global web search. With the rapid growth in the number of web pages, search engines have become an indispensable way to find information on the Internet. All large websites now offer web search, and a number of companies have emerged to provide professional search services to them, such as Google, which has provided search for Yahoo, and Baidu, which serves domestic sites such as Sina and 263. Professional search services are expensive, however, and free search engine software is mostly built for English, so neither suits the needs of an intranet environment such as a campus network.
A search engine generally consists of three parts: a web page collection program, back-end organization and storage of the page data, and retrieval of that data. The key factor determining the quality of a search engine is query response time, that is, how well a large volume of web page data is organized to support full-text retrieval.
GNU/Linux is an excellent network operating system, and its distributions bundle a large amount of network software, such as a web server (Apache + PHP), a directory server (OpenLDAP), a scripting language (Perl), and a web page collection tool (Wget). By putting these together, a simple and efficient search engine server can be built.
1. Basic composition and usage
1. Web page data collection
Wget is an excellent web page collection program. It can mirror a website's content to a local directory and gives flexible control over the types of pages collected, the recursion depth, directory restrictions, collection time, and so on. Delegating collection to a dedicated program both simplifies the design and improves the performance of the system. To keep the local data small, collect only HTML files, text files, and the default output of queryable ASP and PHP scripts, skipping images and other data files; a possible invocation is sketched below.
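As an illustration, the following sketch drives Wget from a small PHP script and restricts the mirror to indexable page types. The target site, mirror directory, recursion depth, and accepted suffixes are assumptions chosen for the example, not values from the article.

```php
<?php
// Hypothetical collection script: mirror an intranet site into a local
// directory, accepting only page types worth indexing.
$site = 'http://intranet.example.edu/';     // hypothetical site to collect
$dir  = '/var/spool/search/mirror';         // hypothetical local mirror directory

$cmd = 'wget --recursive --level=5'         // recursive collection with a depth limit
     . ' --accept html,htm,txt,php,asp'     // only text-like page types, no images
     . ' --wait=1'                          // be polite to the web server
     . ' --directory-prefix=' . escapeshellarg($dir)
     . ' ' . escapeshellarg($site);

exec($cmd, $output, $status);               // run the collector
if ($status !== 0) {
    fwrite(STDERR, "wget exited with status $status\n");
}
```

Run from cron, a script like this refreshes the local mirror on a schedule, so the filtering and indexing steps always work on reasonably fresh pages.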
2. Web page data filtering
HTML files contain a large number of markup tags that have no search value of their own, so the collected data must be filtered before it is added to the database. Perl, a widely used scripting language, has a very rich module library that makes this filtering straightforward. With the HTML-Parser library you can easily extract the text, title, and link data contained in a web page. The library can be downloaded from www.cpan.org, whose collection of Perl modules covers far more ground than is needed here.
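The article performs this step with Perl's HTML-Parser; purely for illustration, the sketch below does the same extraction (title, plain text, links) in PHP with DOMDocument. The file path is a hypothetical location inside the local mirror.

```php
<?php
// Hypothetical filtering step: strip markup from one mirrored page and keep
// only the data worth indexing (title, plain text, outgoing links).
$file = '/var/spool/search/mirror/index.html';

$doc = new DOMDocument();
@$doc->loadHTMLFile($file);                 // suppress warnings from sloppy real-world HTML

$titleNode = $doc->getElementsByTagName('title')->item(0);
$title = $titleNode ? trim($titleNode->textContent) : '';

$text = trim($doc->textContent);            // page text with all tags removed

$links = [];                                // outgoing links, useful for ranking or recursion
foreach ($doc->getElementsByTagName('a') as $a) {
    $links[] = $a->getAttribute('href');
}

// $title, $text and $links are now ready to be loaded into the directory server.
```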
3. Directory service
Directory service is a technology developed for retrieving large amounts of data. It first appeared in the X.500 protocol suite and was later adapted to TCP/IP, developing into LDAP (Lightweight Directory Access Protocol); the relevant standards are RFC 1777 (1995) and RFC 2251 (1997). LDAP has been widely adopted as an industry standard by Sun, Lotus, Microsoft, and other companies in their products, but dedicated directory servers for the Windows platform are rare. OpenLDAP is a free directory server that runs on Unix systems; it performs well, is included in many Linux distributions (Red Hat, Mandrake, etc.), and provides development interfaces for C, Perl, PHP, and other languages.
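Once page entries are stored in OpenLDAP, the PHP front end can answer a keyword query with the standard ldap_* functions. In the sketch below, the host, base DN, and attribute names (title, abstract, url) are assumptions; they depend on the schema actually defined for the page data.

```php
<?php
// Hypothetical query page: find web-page entries whose title or abstract
// contains the user's keyword and print each hit as a link plus its title.
$conn = ldap_connect('ldap://localhost');   // OpenLDAP assumed to run locally
ldap_set_option($conn, LDAP_OPT_PROTOCOL_VERSION, 3);
ldap_bind($conn);                           // anonymous bind is enough for read-only search

$keyword = ldap_escape($_GET['q'] ?? '', '', LDAP_ESCAPE_FILTER);
$filter  = "(|(title=*$keyword*)(abstract=*$keyword*))";

// One directory entry per collected web page is assumed under this subtree.
$result  = ldap_search($conn, 'ou=webpages,dc=example,dc=edu', $filter,
                       ['title', 'abstract', 'url']);
$entries = ldap_get_entries($conn, $result);

for ($i = 0; $i < $entries['count']; $i++) {
    printf("%s - %s\n", $entries[$i]['url'][0], $entries[$i]['title'][0]);
}
ldap_unbind($conn);
```

LDAP substring filters make this kind of keyword match simple to write, although a production index would normally pre-tokenize the page text rather than rely on wildcard scans alone.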
