
python - Chinese word-segmentation search problem with Flask-whooshsqlalchemyplus

伊谢尔伦 · asked 2807 days ago · 947 views

2 replies

  • 怪我咯 2017-04-18 10:08:25

    If you are using a PostgreSQL database, check whether its encoding is UTF-8. You can list the databases and their encodings with `\l` in the database shell:

    postgres=# \l
                                      List of databases
       Name    |  Owner   | Encoding  |   Collate   |    Ctype    |   Access privileges   
    -----------+----------+-----------+-------------+-------------+-----------------------
     db1  | owner | UTF8      | en_US.UTF-8 | en_US.UTF-8 | =Tc/owner         +
               |          |           |             |             | owner=CTc/owner
     db2     | owner   | SQL_ASCII | C           | C           | =Tc/owner           +

    You can check whether Chinese search works in the database shell with the following SQL:

    SELECT to_tsvector('我们') @@ to_tsquery('我:*');

    db1 above is UTF-8, so it supports Chinese search:

    postgres=# \c db1
    db1=#
    db1=# SELECT to_tsvector('我们') @@ to_tsquery('我:*');
     ?column? 
    ----------
     t
    (1 row)
    
    db1=#

    db2 is SQL_ASCII and does not support Chinese search:

    db1=# \c db2
    db2=#
    db2=# SELECT to_tsvector('我们') @@ to_tsquery('我:*');
    NOTICE:  text-search query contains only stop words or doesn't contain lexemes, ignored
     ?column? 
    ----------
     f
    (1 row)
    
    db2=#
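
    If your database turns out to be SQL_ASCII, its encoding cannot be changed in place; a common approach is to dump it, recreate it with UTF-8, and restore. A rough sketch (the database name `db2`, user `owner`, and locale are taken from the `\l` listing above; adjust to your setup):

    ```shell
    # Dump the old SQL_ASCII database
    pg_dump -U owner db2 > db2.sql

    # Recreate it with UTF-8 encoding; template0 is required
    # when overriding encoding/locale settings
    createdb -U owner -T template0 -E UTF8 \
      --lc-collate=en_US.UTF-8 --lc-ctype=en_US.UTF-8 db2_utf8

    # Restore the dump into the new database
    psql -U owner -d db2_utf8 -f db2.sql
    ```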

  • 天蓬老师 2017-04-18 10:08:25

    You can refer to this: https://www.v2ex.com/t/274600...

    I used flask-whooshalchemy before, but its Chinese word segmentation was poor. I then used jieba to build a pre-segmented table and index, and had whooshalchemy search that table instead; the results were acceptable.
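
    The idea of the pre-segmented table can be sketched as follows: segment the Chinese text first, store the space-joined tokens in a shadow column, and let a whitespace-based full-text engine index that column. In practice `jieba.cut()` would do the segmentation; the toy longest-match segmenter and mini dictionary below are stand-ins so the sketch is self-contained:

    ```python
    # Assumed mini word list; jieba ships a real dictionary instead.
    DICTIONARY = {"我们", "搜索", "中文"}

    def segment(text: str) -> list[str]:
        """Greedy longest-match segmentation over DICTIONARY."""
        tokens, i = [], 0
        while i < len(text):
            for j in range(len(text), i, -1):  # try longest match first
                if text[i:j] in DICTIONARY:
                    tokens.append(text[i:j])
                    i = j
                    break
            else:                              # unknown char: single token
                tokens.append(text[i])
                i += 1
        return tokens

    def to_indexable(text: str) -> str:
        """Space-join tokens so a whitespace tokenizer can index them."""
        return " ".join(segment(text))

    def matches(query: str, indexed: str) -> bool:
        """Naive stand-in for full-text search over the shadow column."""
        return all(tok in indexed.split() for tok in segment(query))

    row = to_indexable("我们搜索中文")  # -> "我们 搜索 中文"
    print(matches("中文", row))         # "中文" is now a separate token
    ```

    Because the stored text is token-per-word, a search engine that only splits on whitespace (as whoosh's default analyzer does for the most part) can match Chinese words it could not otherwise isolate.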
