How to import a huge CSV file with 200,000 rows to MySQL (asynchronous and fast)?
Date : March 29 2020, 07:55 AM
Thanks to everyone who gave answers to this question. I have discovered a solution and wanted to share it, in case someone needs to create a PHP script that imports a huge CSV file into a MySQL database (asynchronously and fast!). I have tested my code with 400,000 rows and the import finishes in seconds. I believe it would work with larger files; you just have to raise the maximum upload file size. In this example, I will import a CSV file that contains two columns (name, contact_number) into a MySQL table with the same columns. CREATE TABLE `testdb`.`table_test`
( `id` INT NOT NULL AUTO_INCREMENT ,
`name` VARCHAR(100) NOT NULL ,
`contact_number` VARCHAR(100) NOT NULL ,
PRIMARY KEY (`id`)) ENGINE = InnoDB;
<form action="upload.php" method="post" enctype="multipart/form-data">
<input type="file" name="csv" value="" />
<input type="submit" name="submit" value="Save" /></form>
<?php
//modify your connections here
$servername = "localhost";
$username = "root";
$password = "";
$dbname = "testDB";
$conn = new mysqli($servername, $username, $password, $dbname);
if ($conn->connect_error) {
die("Connection failed: " . $conn->connect_error);
}
?>
<?php
include('connect.php');
$data = $_POST['file'];
$handle = fopen($data, "r");
if ($handle) {
$counter = 0;
//instead of executing queries one by one,
//prepare 1 SQL query that inserts all values from the batch
//(note: the values are concatenated raw, so this trusts the CSV contents)
$sql = "INSERT INTO table_test(name,contact_number) VALUES ";
while (($line = fgets($handle)) !== false) {
$sql .= "(" . trim($line) . "),";
$counter++;
}
//drop the trailing comma
$sql = rtrim($sql, ",");
if ($conn->query($sql) !== TRUE) {
echo "Error: " . $conn->error;
}
fclose($handle);
}
//unlink the CSV file once it is imported to the DB, to clear the directory
unlink($data);
?>
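The core speed-up in the script above is batching many rows into one multi-row INSERT statement instead of running one query per row. A minimal sketch of that idea in Python (the table and column names come from the example above; the `escape` helper is a hypothetical stand-in for the database driver's real escaping, which you should prefer in production):

```python
def build_batch_insert(rows):
    """Build one multi-row INSERT for a batch of (name, contact_number) tuples.

    In real code, use the driver's parameter binding (e.g. executemany)
    instead of string concatenation, to avoid SQL injection.
    """
    def escape(value):
        # hypothetical stand-in for proper driver-side escaping
        return str(value).replace("'", "''")

    values = ", ".join(
        "('{}', '{}')".format(escape(name), escape(number))
        for name, number in rows
    )
    return "INSERT INTO table_test(name,contact_number) VALUES " + values

sql = build_batch_insert([("Alice", "0917111"), ("Bob", "0917222")])
print(sql)
# -> INSERT INTO table_test(name,contact_number) VALUES ('Alice', '0917111'), ('Bob', '0917222')
```

One round trip to the server per batch is what makes the import finish in seconds rather than minutes.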
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.js"></script>
<script>
//Declaration of function that will insert data into database
function senddata(filename){
var file = filename;
$.ajax({
type: "POST",
url: "senddata.php",
data: {file},
async: true,
success: function(html){
$("#result").html(html);
}
})
}
</script>
<?php
$csv = array();
$batchsize = 1000; //split the huge CSV file into batches of 1,000 rows; you can modify this based on your needs
if($_FILES['csv']['error'] == 0){
$name = $_FILES['csv']['name'];
$ext = strtolower(pathinfo($name, PATHINFO_EXTENSION));
$tmpName = $_FILES['csv']['tmp_name'];
if($ext === 'csv'){ //check if the uploaded file is in CSV format
if(($handle = fopen($tmpName, 'r')) !== FALSE) {
set_time_limit(0);
$row = 0;
$file = null;
while(($data = fgetcsv($handle)) !== FALSE) {
//splitting of the CSV file: start a new batch file every $batchsize rows
if ($row % $batchsize == 0):
if ($file) fclose($file); //close the previous batch before opening a new one
$file = fopen("minpoints$row.csv","w");
endif;
$csv[$row]['col1'] = $data[0];
$csv[$row]['col2'] = $data[1];
$min = $data[0];
$points = $data[1];
$line = "'$min', '$points'";
fwrite($file, $line.PHP_EOL);
//sending the split CSV files, batch by batch...
if ($row % $batchsize == 0):
echo "<script> senddata('minpoints$row.csv'); </script>";
endif;
$row++;
}
fclose($file);
fclose($handle);
}
}
else
{
echo "Only CSV files are allowed.";
}
//alert once done.
echo "<script> alert('CSV imported!') </script>";
}
?>
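The upload handler's batching logic, opening a new chunk every `$batchsize` rows and naming it after the row it starts on, can be sketched as follows (a simplified in-memory version; the file names mirror the `minpoints$row.csv` pattern from the PHP above):

```python
def split_into_batches(rows, batchsize=1000):
    """Group rows into chunks of `batchsize`, keyed by the name of the
    chunk file they would be written to (minpoints<start_row>.csv)."""
    batches = {}
    current = None
    for row, data in enumerate(rows):
        # a new batch starts whenever the row index hits a multiple of batchsize
        if row % batchsize == 0:
            current = "minpoints%d.csv" % row
            batches[current] = []
        batches[current].append(data)
    return batches

batches = split_into_batches([("a", 1), ("b", 2), ("c", 3)], batchsize=2)
print(sorted(batches))
# -> ['minpoints0.csv', 'minpoints2.csv']
```

Each chunk is then handed to the asynchronous `senddata()` call, so the server inserts the batches in parallel with the split instead of waiting for one giant file.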
|
Crystal reports shows all the rows of table even when rows are filtered C#
Tag : chash , By : JulianCT
Date : March 29 2020, 07:55 AM
This fixed the issue. I found the answer after a lot of searching: I should use a DataSet instead of a DataTable. var command = "Select ID,Nam,Family From Info where ID=2";
var connection=new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=data.mdb");
var dt=new DataSet();//Here was the problem
using (var da = new OleDbDataAdapter(command, connection))
da.Fill(dt);
var report=new Report();//prebuilt report
report.SetDataSource(dt);
CrystalReportViewer.ReportSource=report;
|
Count rows and filter and then delete filtered rows with VBA
Date : March 29 2020, 07:55 AM
Hope that helps. The question was: can anyone tell me how to count the total number of rows in an Excel spreadsheet and then pass the value dynamically? The key line is: lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row
Option Explicit
Sub Test()
Dim ws As Worksheet
Dim lastRow As Long
Dim rng As Range
'set sheet reference
Set ws = ActiveSheet
'turn off autofilter
ws.AutoFilterMode = False
'get last row
lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row
'set range to filter
Set rng = ws.Range("A1:C" & lastRow)
'set filter
rng.AutoFilter Field:=3, Criteria1:="=ABC", Operator:=xlOr, Criteria2:="=XYZ"
'delete visible rows
rng.Offset(1, 0).SpecialCells(xlCellTypeVisible).EntireRow.Delete
'show remaining rows by removing autofilter
ws.AutoFilterMode = False
End Sub
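The macro's filter-then-delete step (keep the header, drop every data row whose third column is "ABC" or "XYZ") can be sketched outside Excel like this (hypothetical data; the third column corresponds to `Field:=3` with the `xlOr` pair of criteria above):

```python
def delete_filtered_rows(rows, criteria=("ABC", "XYZ")):
    """Keep the header row, then drop any data row whose third column
    matches one of the filter criteria (mirrors AutoFilter Field:=3, xlOr,
    followed by deleting the visible rows)."""
    header, data = rows[0], rows[1:]
    kept = [r for r in data if r[2] not in criteria]
    return [header] + kept

table = [
    ["ID", "Name", "Code"],   # header row (row 1 in the sheet)
    [1, "x", "ABC"],          # matches the filter -> deleted
    [2, "y", "KEEP"],
    [3, "z", "XYZ"],          # matches the filter -> deleted
]
print(delete_filtered_rows(table))
# -> [['ID', 'Name', 'Code'], [2, 'y', 'KEEP']]
```

The `rng.Offset(1, 0)` in the VBA plays the same role as skipping `rows[0]` here: the header is excluded from deletion.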
|
Excel VBA - Merge columns in X rows based on number of filtered rows in adjacent worksheet
Tag : excel , By : Magnus
Date : March 29 2020, 07:55 AM
This will help. The question: I have a worksheet (wsA) in which I need to merge columns B and C for a variable number of rows, based on a filtered range in another worksheet (wsB), both in the same workbook. To fix the issue, the code should be as follows: filteredLastRow = ActiveSheet.UsedRange.Rows.Count
filledRows = ActiveSheet.Range("A2:A" & filteredLastRow).SpecialCells(xlCellTypeVisible).Count
Range("B12:C" & filledRows + 12).Merge True
|