
Find duplicate parts of data

WBOY · Original · 2016-08-31

  1. I have a batch of address book data: the address books of more than 10,000 people. I need to find the overlapping part of every two people's address books, that is, whose address book overlaps with whose, and how many entries they share. Every address book has to be compared against every other one, pair by pair.
    For example, given five address books A, B, C, D, E, find the number of duplicate entries for each of the pairs AB AC AD AE BC BD BE CD CE DE.

Two address books are considered to overlap whenever they contain the same mobile phone number.
The data is in one table holding the address books of more than 10,000 people:
[screenshot: the data table]

The JSON stored in the `list` field is the content of the address book.
Each person's address book contains between 100 and 1,000 entries:
[screenshot: sample JSON from the list field]

My current approach is to load everyone's address book at once, then compare the first person's address book against all the remaining ones (a foreach loop with another foreach nested inside), then compare the second person's against the remaining ones, and so on for everyone.
Part of the script:

[screenshots of the script code]
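The script itself survives only as screenshots, but the nested-foreach approach described above can be sketched roughly as follows. The data shape (owner id mapped to an array of phone numbers) and the sample values are assumptions for illustration, not the asker's actual schema:

```php
<?php
// Assumed shape: owner => array of phone numbers from the decoded JSON.
$books = array(
    'A' => array('13800000001', '13800000002'),
    'B' => array('13800000002', '13800000003'),
    'C' => array('13800000003', '13800000001'),
);
$owners = array_keys($books);
$overlap = array();
// Compare every pair of address books directly: O(n^2) pairs, and each
// comparison touches the full contents of both books.
for ($i = 0; $i < count($owners); $i++) {
    for ($j = $i + 1; $j < count($owners); $j++) {
        $a = $owners[$i];
        $b = $owners[$j];
        // Count phone numbers present in both address books.
        $overlap["$a-$b"] = count(array_intersect($books[$a], $books[$b]));
    }
}
print_r($overlap);
```

With 10,000 people this means roughly 50 million pairwise comparisons, each scanning books of 100 to 1,000 entries, which matches the long running time described below.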

I then ran the script. After more than 20 hours it had only finished about half the work, and memory and CPU usage were high. The script is far too inefficient.

Is there a better way to find the duplicate parts of this batch of data, or a way to optimize the script?
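One common way to avoid comparing every pair of books directly is an inverted index: scan all address books once, build a map from each phone number to the list of people whose book contains it, and then increment pair counts only for people who actually share a number. A minimal sketch, again assuming the owner-to-phone-array shape above rather than the real schema:

```php
<?php
$books = array(
    'A' => array('13800000001', '13800000002'),
    'B' => array('13800000002', '13800000003'),
    'C' => array('13800000003', '13800000001'),
);
// Step 1: inverted index, phone number => list of owners containing it.
$index = array();
foreach ($books as $owner => $phones) {
    foreach (array_unique($phones) as $phone) {
        $index[$phone][] = $owner;
    }
}
// Step 2: for every phone shared by two or more people, increment the
// overlap count of each pair. Pairs that share nothing are never touched.
$overlap = array();
foreach ($index as $phone => $holders) {
    $n = count($holders);
    for ($i = 0; $i < $n; $i++) {
        for ($j = $i + 1; $j < $n; $j++) {
            $key = $holders[$i] . '-' . $holders[$j];
            $overlap[$key] = isset($overlap[$key]) ? $overlap[$key] + 1 : 1;
        }
    }
}
print_r($overlap);
```

The cost is now proportional to the total number of entries plus the number of actual collisions, instead of all 50-million-odd pairs; if a single phone number is very popular, that one index bucket can still get expensive, but for typical data this is dramatically faster than pairwise intersection.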

Thank you everyone

Reply content:


<code class="php">$data = array(
    array('id'=>1,'name'=>1),
    array('id'=>2,'name'=>2),
    array('id'=>3,'name'=>3),
    array('id'=>1,'name'=>2)
);
$ret = array();
# Traverse the data once, using each id as a key in a new array:
# if the key already exists, increment its count; otherwise set it to 1
foreach($data as $v){
    $_id = $v['id'];
    if (array_key_exists($_id, $ret)) {
        $ret[$_id]++;
    }else{
        $ret[$_id] = 1;
    }
}
# Print the counts
foreach($ret as $k=>$v){
    echo "{$k} appears {$v} times\n";
}
</code>
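The counting loop in the reply can also be collapsed into PHP's built-in functions: `array_column` pulls out the ids and `array_count_values` does the tallying in one call (both are standard PHP functions, available since PHP 5.5):

```php
<?php
$data = array(
    array('id'=>1,'name'=>1),
    array('id'=>2,'name'=>2),
    array('id'=>3,'name'=>3),
    array('id'=>1,'name'=>2)
);
// Extract the ids, then let array_count_values tally the occurrences.
$ids = array_column($data, 'id');
$ret = array_count_values($ids);
print_r($ret); // Array ( [1] => 2 [2] => 1 [3] => 1 )
```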