It seems to work fine. I would say there are a wide variety of benefits to using a database over a CSV for such large, structured data, so I would suggest learning enough to do so. However, based on your description, you might want to check out non-server/lighter-weight databases such as SQLite, or something similar to JavaDB/Derby, or, depending on the structure of your data, a non-relational (NoSQL) database. Obviously, you will need one with some kind of Python support.
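As a minimal sketch of the SQLite route, Python's standard-library sqlite3 module needs no server at all. The table name and columns below are made up for illustration; replace them with your own schema:

```python
import sqlite3

# In-memory database for demonstration; pass a file path for a persistent one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT, value REAL)")

# executemany is the bulk-insert workhorse, far faster than row-by-row inserts.
rows = [(1, "a", 1.5), (2, "b", 2.5)]
conn.executemany("INSERT INTO records VALUES (?, ?, ?)", rows)
conn.commit()

# Queries and indexes are where a database beats scanning a flat CSV.
total = conn.execute("SELECT SUM(value) FROM records").fetchone()[0]
print(total)  # 4.0
```

For real CSV data you would typically read rows with the csv module and feed them to executemany in batches.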
How do I delete repeated data from a txt file using MATLAB?
Did any of those help? I have a database in which one particular table has more than 4 million records. I tried downloading the whole database using MySQL Workbench as well as from the command line using the following command: , Try adding the lines below to my.cnf and restarting the server:
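For reference, a command-line dump of a single large table is usually done with mysqldump; the database, table, and user names below are placeholders:

```shell
# --single-transaction avoids locking an InnoDB table during the dump.
mysqldump -u youruser -p --single-transaction yourdb yourtable > yourtable.sql
```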
# Performance settings used for import.
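The original settings were not included; the fragment below shows the kind of my.cnf entries commonly suggested for bulk imports into InnoDB. The specific values are illustrative only and should be tuned to your hardware:

```
[mysqld]
# Performance settings used for import (illustrative values).
max_allowed_packet = 256M
innodb_buffer_pool_size = 2G
bulk_insert_buffer_size = 256M
# Trades a little durability for import speed; revert after the import.
innodb_flush_log_at_trx_commit = 2
```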
It helps sometimes. LOAD DATA INFILE is your solution. You can read the documentation on the MySQL website and generate the LOAD DATA query for your needs. Make sure you put the file in a place the MySQL process can read; it can only load files from certain locations (again, this is covered in the documentation). https://dev.mysql.com/doc/refman/8.0/en/load-data.html
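A typical LOAD DATA statement for a comma-separated file looks like the sketch below. The file path and table name are placeholders; note that when the secure_file_priv system variable is set, MySQL will only load files from that directory:

```sql
-- Hypothetical file and table names; adjust delimiters to match your file.
LOAD DATA INFILE '/var/lib/mysql-files/records.csv'
INTO TABLE records
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip the header row
```

Check the current restriction with `SHOW VARIABLES LIKE 'secure_file_priv';` before choosing where to place the file.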