Want to increase node export to CSV speed in Drupal 7

I am working on a site which has a huge database. Currently, when I run a search it returns about 8,000 records. When I click Export to put these records into a CSV file, it takes almost 3 minutes. How can I reduce this to around 1 minute? Thanks in advance.

Using the Views module together with Views data export, you should be able to export CSVs relatively quickly. Otherwise, try writing a MySQL query in phpMyAdmin or Sequel Pro and using their export feature.
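If Views data export is still too slow, streaming rows straight to the output buffer instead of building the whole file in memory usually speeds things up considerably. A minimal sketch for Drupal 7; the callback name, table columns, and the article content type are placeholders to adjust for your site:

<?php
// Hypothetical menu callback that streams node records as CSV.
// Adjust the query and columns to your content type.
function mymodule_export_csv() {
  drupal_add_http_header('Content-Type', 'text/csv');
  drupal_add_http_header('Content-Disposition', 'attachment; filename="export.csv"');

  $out = fopen('php://output', 'w');
  fputcsv($out, array('nid', 'title', 'created'));

  $result = db_query('SELECT nid, title, created FROM {node} WHERE type = :type',
    array(':type' => 'article'));
  foreach ($result as $row) {
    fputcsv($out, array($row->nid, $row->title, $row->created));
  }

  fclose($out);
  drupal_exit();
}
?>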

Related

Cannot export local databases using phpmyadmin

Environment:
Windows 10
XAMPP Win32 7.0.23
PHP: 7.0.23
MariaDB 10.1.28?
Magento 2.1.9
I cannot export any databases. I get the "Warning: a form on this page has more than 1000 fields" message, and then the export does not work. From what I've read, you're supposed to change max_input_vars in php.ini. I checked my only php.ini file; the line was commented out. I removed the semicolon and increased the value several times. Here is what it looks like now:
; How many GET/POST/COOKIE input variables may be accepted
max_input_vars = 10000
No matter what I increased it to, I still get the same 1,000 fields message when I try to export a database. I searched the entire XAMPP directory and sub-directories, but found no other php.ini except for the one in the xampp/php directory. I would not think it needs to be higher than 10,000 for a fairly new Magento database with only a few products in it. When I imported the database into XAMPP, it was less than 2 MB in total size. I tried exporting another Magento database for the unmodified demo site, and I get the same warning and result.
Can someone help me? Thanks.
UPDATE:
info.php created, and it verifies the current max_input_vars = 10000. I am selecting the database in phpMyAdmin and then clicking on the Export button at the top. The export, in simple mode, selects all tables of the given database.
Put an info.php into the document root with
<?php
phpinfo();
?>
Call the file in a browser and check the value for max_input_vars.
Also specify whether you are exporting a whole database or a query result, for further advice.
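If phpinfo() reports a lower value than the file you edited, the web server is probably loading a different php.ini, or still needs a restart after the change. A small diagnostic sketch you can drop into the document root alongside info.php:

<?php
// Print which php.ini the web server actually loads, plus the
// effective max_input_vars value it ended up with.
echo php_ini_loaded_file() . "\n";
echo ini_get('max_input_vars') . "\n";
?>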

Export Phabricator tasks to spreadsheet / csv

I spent more than 15 minutes in Phabricator looking for the Export to spreadsheet feature, without success. I have seen there that this feature exists. My goal is to migrate away from Phabricator.
You need to install PHPExcel and add it to your PHP include_path (see the sketch below).
Then you can use Export To Excel on each Maniphest page with the built-in queries and your custom queries.
If you need a custom format, you need to customize the existing ones. For that, see the files in ./t/phabricator/src/applications/maniphest/export/
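One way to add PHPExcel to the include path at runtime rather than in php.ini; /path/to/PHPExcel is a placeholder for wherever you unpacked the library:

<?php
// Append the PHPExcel Classes directory to the existing include path.
set_include_path(get_include_path() . PATH_SEPARATOR . '/path/to/PHPExcel/Classes');
?>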

Export a large amount of data?

I'm trying to export a large amount of data (~90,000 rows × 17 columns) to an Excel file. However, after executing the script, no file is created at the location I specified.
I have tried
$cacheSettings = array('memoryCacheSize' => '64MB');
and
ini_set('memory_limit', '64M');
but neither helps.
However, I managed to get the file created by reducing the number of columns.
I understand there are existing topics related to this question, and I have gone through them, but still couldn't find a solution to my problem.
Thank you.
Reading through issues filed against packages that use PHPExcel, it seems the best option is loading data in smaller chunks and appending them to the same sheet, as sketched below.
Further reading on one of the issues
Update, May 2017:
If you are using Laravel Excel, their 3.0 version is out with huge performance improvements. If a PHP 7.0 requirement is not a problem, it might be worth a look.
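A minimal sketch of the chunked approach with PHPExcel: cache cell data in compressed memory and append one block of rows at a time. The fetchChunk() helper is hypothetical; substitute your own paginated database query.

<?php
require_once 'PHPExcel.php';

// Keep the cell cache in gzip-compressed memory to cut the footprint.
PHPExcel_Settings::setCacheStorageMethod(
    PHPExcel_CachedObjectStorageFactory::cache_in_memory_gzip
);

$excel = new PHPExcel();
$sheet = $excel->getActiveSheet();
$row = 1;
$chunkSize = 5000;
$totalRows = 90000;

for ($offset = 0; $offset < $totalRows; $offset += $chunkSize) {
    // fetchChunk() is a hypothetical helper returning arrays of row values.
    foreach (fetchChunk($offset, $chunkSize) as $record) {
        $col = 0; // PHPExcel columns are 0-indexed
        foreach ($record as $value) {
            $sheet->setCellValueByColumnAndRow($col++, $row, $value);
        }
        $row++;
    }
}

$writer = PHPExcel_IOFactory::createWriter($excel, 'Excel2007');
$writer->save('/tmp/export.xlsx');
?>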

How do I export data from FileMaker?

This is my first time using FileMaker, and I'm in the process of converting a client's FileMaker project to a web app. Currently, I am trying to export the current data so that I can put it in a database.
I've found that tables don't appear for export unless they appear in the "Relationships" graph, and I've found that I can export the data using the "Export Records" script command.
This all works fine except that I can't seem to export all the records. In the database manager, one of the tables reports that it has 596 records, but when I export it I only get 119 records. The same is true for all other tables I've tried, they report more than 119 records, but the exported data only has 119.
Any ideas or help is welcome
Edit:
More progress: it seems the script runs in the context of the current layout, so the number of records exported depends on the number of results in the current layout. Is there a way to automate creating layouts for tables that don't have them, so that I can export all the data for each table?
The concept you were missing is the "found set". FileMaker exports the currently found records in the current layout. You can create a script that grabs the names of the layouts within the file, sticks them into a variable, goes to each layout in turn, performs a Show All Records, and then runs an open-ended export. However, I'm wondering how many base tables there actually are?
As to creating layouts: no, there's no programmatic way. However, it is possible to use an external key-macro type app (QuicKeys?) to run through the create-layout dialog and choose a new table each time.

Is it possible to re-export to CSV from a GAE bulkloader.py SQLite database?

I ran bulkloader with parameters to export my bigtable to a CSV file, and it spent a lot of time downloading the table (it's large). It failed to export due to a problem with one of the lambda expressions in the Exporter class. I fixed that; can I run bulkloader.py again without having to re-download all of the data from GAE? I'd like to point it back at the .sql file it downloads and just tell it to export to CSV again.
No. If you use the same progress database, it may pick up from where it left off, though.
