I'm having trouble importing a large CSV file into MySQL on localhost.
The CSV is about 55 MB and has roughly 750,000 rows.
I rewrote the script so that it parses the CSV and inserts the rows one by one.
Here is the code:
$row = 1;
if (($handle = fopen("postal_codes.csv", "r")) !== FALSE)
{
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE)
    {
        $num = count($data);
        $row++;
        for ($c = 0; $c < $num; $c++)
        {
            // Each field is itself pipe-delimited; split out the columns we need.
            $arr = explode('|', $data[$c]);
            $postcode  = mysql_real_escape_string($arr[1]);
            $city_name = mysql_real_escape_string($arr[2]);
            $city_slug = mysql_real_escape_string(toAscii($city_name));
            $prov_name = mysql_real_escape_string($arr[3]);
            $prov_slug = mysql_real_escape_string(toAscii($prov_name));
            $prov_abbr = mysql_real_escape_string($arr[4]);
            $lat       = mysql_real_escape_string($arr[6]);
            $lng       = mysql_real_escape_string($arr[7]);
            // One INSERT per row -- this is the slow part.
            mysql_query("insert into cities (`postcode`, `city_name`, `city_slug`, `prov_name`, `prov_slug`, `prov_abbr`, `lat`, `lng`)
                         values ('$postcode', '$city_name', '$city_slug', '$prov_name', '$prov_slug', '$prov_abbr', '$lat', '$lng')") or die(mysql_error());
        }
    }
    fclose($handle);
}
The problem is that it takes forever to run. Any suggested solutions would be greatly appreciated.
You are reinventing the wheel. Check out the mysqlimport tool that ships with MySQL; it is an efficient tool for importing CSV data files.
mysqlimport is a command-line interface to the LOAD DATA LOCAL INFILE SQL statement.
Either one should be 10-20 times faster than executing INSERTs row by row.
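As a rough sketch of what that statement could look like for the table in the question: assuming the file is pipe-delimited with one record per line, and matching the column positions the PHP code reads (it skips fields 0 and 5), something like this should work. Note that the derived `city_slug`/`prov_slug` values come from a PHP `toAscii()` helper, so they would need to be filled in by a separate UPDATE pass afterwards.

```sql
-- Sketch only: column positions inferred from the PHP script above,
-- @skip0 and @skip5 are user variables used to discard unused fields.
LOAD DATA LOCAL INFILE 'postal_codes.csv'
INTO TABLE cities
FIELDS TERMINATED BY '|'
LINES TERMINATED BY '\n'
(@skip0, postcode, city_name, prov_name, prov_abbr, @skip5, lat, lng);
```

The server must allow LOCAL loads (`local_infile` enabled) for this to run; mysqlimport accepts the same options (`--fields-terminated-by='|'`, `--local`) from the command line.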