
MySQL deduplication: why does each run remove only a few dozen or a few hundred duplicate rows, when there are clearly hundreds of thousands of duplicates?

Requirement: in the DRUG table, delete rows that share the same primaryid and drugname, keeping only the row with the smallest drug_seq.

Question: with each of the methods below, every execution removes only a few dozen or a few hundred duplicate rows, like squeezing toothpaste: each run deduplicates a little more. There are clearly hundreds of thousands of duplicates, yet I have tried all of these approaches and none of them deduplicates completely in a single run.
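For concreteness, the intended result ("keep the row with the smallest drug_seq per (primaryid, drugname) group") can be sketched on a tiny in-memory table. This is only an illustration: the sample rows are made up, and SQLite is used here instead of MySQL purely so the sketch is self-contained.

```python
import sqlite3

# Hypothetical miniature version of the DRUG table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE drug2022 (drug_seq INTEGER, primaryid INTEGER, drugname TEXT)")
con.executemany(
    "INSERT INTO drug2022 VALUES (?, ?, ?)",
    [(1, 100, "aspirin"), (2, 100, "aspirin"), (3, 100, "aspirin"),
     (4, 200, "ibuprofen"), (5, 200, "aspirin")],
)

# Keep only the row with the smallest drug_seq per (primaryid, drugname).
con.execute("""
    DELETE FROM drug2022
    WHERE drug_seq NOT IN (
        SELECT MIN(drug_seq) FROM drug2022 GROUP BY primaryid, drugname
    )
""")
# Rows (1, 100, 'aspirin'), (4, 200, 'ibuprofen'), (5, 200, 'aspirin') remain:
print(con.execute("SELECT * FROM drug2022 ORDER BY drug_seq").fetchall())
```

Note that MySQL (unlike SQLite) rejects a DELETE whose subquery selects from the same target table with error 1093, which is why the methods below wrap the subquery in a derived table.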

# Method 1:

DELETE FROM `drug2022` WHERE drug_seq IN (
    SELECT drug_seq FROM (
        SELECT drug_seq FROM `drug2022`
        WHERE (primaryid, drugname) IN (
                  SELECT primaryid, drugname FROM `drug2022`
                  GROUP BY primaryid, drugname HAVING COUNT(*) > 1)
          AND drug_seq NOT IN (
                  SELECT MIN(drug_seq) FROM `drug2022`
                  GROUP BY primaryid, drugname HAVING COUNT(*) > 1)
    ) AS a1
);

# Method 2:

DELETE
FROM `drug2022`
WHERE drug_seq NOT IN (
          SELECT t1.min_drug_seq
          FROM (SELECT MIN(drug_seq) AS min_drug_seq FROM `drug2022`
                GROUP BY primaryid, drugname) t1)
  AND (drugname, primaryid) IN (
          SELECT t2.drugname, t2.primaryid
          FROM (SELECT drugname, primaryid FROM `drug2022`
                GROUP BY drugname, primaryid HAVING COUNT(1) > 1) t2);

# Method 3:

DELETE t1
FROM `drug2022` t1,
     `drug2022` t2
WHERE t1.primaryid = t2.primaryid
  AND t1.drugname = t2.drugname
  AND t1.drug_seq > t2.drug_seq;
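Method 3's self-join logic can be checked with a portable EXISTS rewrite (SQLite here, with made-up sample rows): a row is deleted whenever a duplicate with a smaller drug_seq exists, so the minimum survives in a single pass.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE drug2022 (drug_seq INTEGER, primaryid INTEGER, drugname TEXT)")
con.executemany("INSERT INTO drug2022 VALUES (?, ?, ?)",
                [(1, 100, "aspirin"), (2, 100, "aspirin"), (3, 100, "aspirin")])

# Portable equivalent of the MySQL multi-table DELETE in Method 3:
# delete a row when another row shares the key but has a smaller drug_seq.
con.execute("""
    DELETE FROM drug2022
    WHERE EXISTS (
        SELECT 1 FROM drug2022 AS t2
        WHERE t2.primaryid = drug2022.primaryid
          AND t2.drugname  = drug2022.drugname
          AND t2.drug_seq  < drug2022.drug_seq
    )
""")
print(con.execute("SELECT drug_seq FROM drug2022").fetchall())  # [(1,)]
```

The MySQL `DELETE t1 FROM ... t1, ... t2` multi-table syntax expresses the same join condition; the EXISTS form is used here only because SQLite has no multi-table DELETE.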

  

# Method 4:

DELETE

FROM `drug2022`

WHERE drug_seq NOT IN (SELECT * FROM (SELECT MIN(drug_seq) FROM `drug2022` GROUP BY primaryid, drugname) t2);
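A quick way to verify whether a pass actually removed everything is to count how many (primaryid, drugname) groups still contain duplicates before and after the DELETE. A sketch with hypothetical rows (SQLite used for self-containment):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE drug2022 (drug_seq INTEGER, primaryid INTEGER, drugname TEXT)")
# Nine rows spread over three (primaryid, drugname) groups, all duplicated.
con.executemany("INSERT INTO drug2022 VALUES (?, ?, ?)",
                [(i, i % 3, "drugA") for i in range(9)])

def dup_groups(con):
    # Number of (primaryid, drugname) groups that still contain duplicates.
    return con.execute("""
        SELECT COUNT(*) FROM (
            SELECT 1 FROM drug2022
            GROUP BY primaryid, drugname HAVING COUNT(*) > 1
        )
    """).fetchone()[0]

print(dup_groups(con))  # 3 duplicate groups before
con.execute("""
    DELETE FROM drug2022
    WHERE drug_seq NOT IN (
        SELECT MIN(drug_seq) FROM drug2022 GROUP BY primaryid, drugname
    )
""")
print(dup_groups(con))  # 0 after one pass
```

If the same count stays above zero after each run on the real table, comparing it run over run narrows down which rows the DELETE is skipping.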

P粉029305743 · 821 days ago

All replies (0)

No replies yet