Downloaded a 200 MB SQL dump from the database for testing, and I need a way to import it.
Tips: importing with `source` from the CLI brings the server straight down, personally tested.
The SQLyog tool fails the same way.
Also, I set max_allowed_packet=500M in my.ini, but after restarting MySQL it had no effect, which is also strange.
天蓬老师2017-04-17 16:26:18
There is a tool that can split a large SQL dump into multiple files of a given size, such as 10 MB each, with the schema kept in a separate file. You can search for it online; it is called SQLDumpSplitter.
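The same splitting idea can be sketched in a few lines of Python. This is a minimal illustration, not SQLDumpSplitter itself: the function name `split_sql_dump` and the chunk naming are my own, and it assumes a dump where statements end with `;` at the end of a line, so it only cuts at statement boundaries.

```python
import os

def split_sql_dump(path, max_bytes=10 * 1024 * 1024, out_dir="chunks"):
    """Split a large SQL dump into pieces of roughly max_bytes,
    cutting only after lines that end a statement (trailing ';')."""
    os.makedirs(out_dir, exist_ok=True)
    part, size, buf, out_paths = 1, 0, [], []

    def flush():
        nonlocal part, size, buf
        if buf:
            out_path = os.path.join(out_dir, f"part_{part:04d}.sql")
            with open(out_path, "w", encoding="utf-8") as f:
                f.writelines(buf)
            out_paths.append(out_path)
            part += 1
            size, buf = 0, []

    with open(path, encoding="utf-8") as f:
        for line in f:
            buf.append(line)
            size += len(line.encode("utf-8"))
            # only cut once the chunk is full AND we are at a statement end
            if size >= max_bytes and line.rstrip().endswith(";"):
                flush()
    flush()  # write any remainder
    return out_paths
```

Each resulting `part_*.sql` can then be imported one at a time, so no single import exceeds the server's packet or timeout limits.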
ringa_lee2017-04-17 16:26:18
Can you re-export the data and import it again? Do it directly:

mysqldump -u <user> -p -P <port> <db_name> > dump.sql
mysql -u <user> -p -P <port> <db_name> < dump.sql

This is quite fast.
伊谢尔伦2017-04-17 16:26:18
Modify these settings:
wait_timeout = 2880000
interactive_timeout = 2880000
max_allowed_packet = 20M  -- do not set this parameter too large
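For reference, a sketch of where these settings live in my.ini (my.cnf on Linux), using the values from this answer. The section layout is the standard MySQL one; note that max_allowed_packet is also a client-side variable, so if imports still fail after changing the server, the mysql client's own limit may need raising too:

```ini
[mysqld]
wait_timeout        = 2880000
interactive_timeout = 2880000
max_allowed_packet  = 20M

[mysql]
# the client has its own packet limit; raise it as well if needed
max_allowed_packet  = 20M
```

Changes to the [mysqld] section only take effect after the server is restarted.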