Postgres -- Simple batch file to export CSV from a database

Hi and thanks in advance.
I am currently exporting from my Postgres database via the psql shell with:
\COPY "Accounts" TO 'C:\Users\admin\Desktop\Accounts.csv' CSV HEADER;
This works fine, but I want to be able to double-click a batch file (.cmd or .bat) saved on my desktop that will 1) log into the database and 2) export the CSV.
That way I don't have to go into the psql shell every time. Please help; I did Google this, but Postgres resources are few.

Because the comments above are limited in their length and formatting, I am sharing some basic research that might get you started:
Using a .pgpass file
PowerShell connect to Postgres DB
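Building on those, here is a minimal sketch of such a batch file. The host, port, user, and database name (mydb) are placeholders, it assumes psql.exe is on your PATH (otherwise use its full path), and it expects the password to come from %APPDATA%\postgresql\pgpass.conf rather than being typed interactively; setting PGPASSWORD in the script also works but is less secure. The quoting of the "Accounts" identifier may need adjusting for your setup.
@echo off
rem export_accounts.bat -- sketch only; adjust connection details, paths and names.
rem The password is expected in %APPDATA%\postgresql\pgpass.conf; alternatively:
rem set PGPASSWORD=yourpassword
psql -h localhost -p 5432 -U postgres -d mydb -c "\copy \"Accounts\" TO 'C:\Users\admin\Desktop\Accounts.csv' CSV HEADER"
pause
Double-clicking the .bat then runs the export without opening the psql shell.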

Related

How can I read data by running a batch file and store it in a text file?

I am running a command in a command prompt; it is the NuoDB manager:
nuodbmgr --broker localhost --password bird123 --command "log database madhu categories sql-statements"
This gets all the running tasks in the NuoDB server. After it gathers more data it cleans up the old data, so now I want to store the whole output in one file, with a new file for every day. How can I write a batch file, or use any alternative, to do this? Please help me with this.
Try
nuodbmgr --broker localhost --password bird123 --command "log database madhu categories sql-statements" > sometextfile.txt
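If you want a new file for every day, one option (a sketch only; the log folder is made up, and %DATE% formatting is locale-dependent, so the character replacements may need adjusting) is to build the file name from the current date:
@echo off
rem daily_nuodb_log.bat -- sketch; adjust the folder and the date handling for your locale.
set "TODAY=%DATE:/=-%"
set "TODAY=%TODAY: =_%"
nuodbmgr --broker localhost --password bird123 --command "log database madhu categories sql-statements" > "C:\logs\nuodb-%TODAY%.txt"
Running that batch file once a day from Windows Task Scheduler then gives one file per day.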

Improperly importing a SQL file in the Postgres CLI

I have a question about importing a SQL file into the Postgres CLI. I may have been importing my file improperly, or I may have some user or database privilege issue; these are just my hunches. I am trying to pinpoint the cause of the message I get after importing a SQL file.
The message that I get is:
No relations found.
The steps I did to get into Postgres are:
I typed in:
sudo -i -u postgres
psql
Then I created a new role, altered the role's permissions, and created a new database as well.
I got all my commands from this site: http://blog.jasonmeridth.com/posts/postgresql-command-line-cheat-sheet/
The last step was to import a SQL file by typing:
psql -d db_name_dev -U username_dev -f /www/dbexport.sql
Now, when I go into the database I created, "db_name_dev", by typing
psql db_name_dev
and check whether any content was imported by typing \dt,
I get
No relations found.
Here is also a table and role list from my command line:
http://screencast.com/t/8ZMqBLNRb
I'm thinking my database might also have some access privilege issue.
Also, here is an additional issue I ran into; hope this helps:
http://screencast.com/t/BJy0ZjrALm6h
Thanks; any feedback would be appreciated.
OK, so after some research and reading, I found out my .sql file was empty. Here are some links I've read that taught me more about the pg_dump command: dbforums.com/showthread.php?1646161-Postgresql-Restores and "pg_dump vs pg_dumpall? which one to use to database backups?"
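For anyone landing here with the same symptom, a rough sketch of regenerating the dump and re-importing it (the source database name source_db is an assumption; the other names are from the question):
pg_dump -U username_dev -f /www/dbexport.sql source_db
head /www/dbexport.sql
psql -d db_name_dev -U username_dev -f /www/dbexport.sql
psql -d db_name_dev -U username_dev -c '\dt'
The head call is just a quick check that the file is not empty before importing; \dt should then list the restored tables.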

Write a .bat script to create a database for OrientDB and run it from PHP

OrientDB creates a database using a console command, run via the console.bat file in "..\orientdb\bin\". However, I need to create a separate .bat file that runs a command like this:
create database remote:localhost/test <root> <password> local graph
And run that .bat file from PHP.
Thanks in advance!
Based on the old documentation at https://code.google.com/p/orient/wiki/ImportFromRDBMS it seems this is possible and is similar to what I want to do as well:
console.bat database.sql
where all your SQL commands are in database.sql. The link above includes a sample for porting a full database over, but I don't see why it should not work for similar scripts as well.
That said, I have not tested this yet.
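As an untested sketch of that approach (the file names and the path are placeholders): put the console command in a small script file, for example create-db.txt containing the single line
create database remote:localhost/test <root> <password> local graph
and then wrap it in a .bat such as:
rem create-db.bat -- sketch; adjust the path and substitute your root user and password.
call "..\orientdb\bin\console.bat" create-db.txt
From PHP you could then invoke it with something like exec('C:\\path\\to\\create-db.bat'), though that path is also just an assumption here.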

Easy way to view postgresql dump files?

I have a ton of PostgreSQL dump files I need to peruse for data. Do I have to install PostgreSQL and "recover" each one of them into a new database, one by one? I'm hoping there's a PostgreSQL client that can simply open them up so I can peek at the data, and maybe even run a simple SQL query.
The dump files are all from a Postgresql v9.1.9 server.
Or maybe there's a tool that can easily make a database "connection" to the dump files?
UPDATE: These are not text files. They are binary. They come from Heroku's backup mechanism; this is what Heroku says about how they create their backups:
PG Backups uses the native pg_dump PostgreSQL tool to create its backup files, making it trivial to export to other PostgreSQL installations.
This was what I was looking for:
pg_restore db.bin > db.sql
Thanks @andrewtweber
Try opening the files with a text editor - the default dump format is plain text.
If the dump is not plain text - try the pg_restore -l your_db_dump.file command. It will list all objects in the database dump (tables, indexes, ...).
Another possible way (may not work, haven't tried it) is to grep through the output of the pg_restore your_db_dump.file command. If I understood the manual correctly, the output of pg_restore is just the sequence of SQL queries that will rebuild the db.
In newer versions you need to specify the -f flag with a filename or '-' for stdout
pg_restore -f - dump_file.bin
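Putting that together, a short sketch (the dump file name db.bin and the table name accounts are placeholders):
pg_restore -l db.bin
pg_restore -f db.sql db.bin
pg_restore -t accounts -f accounts.sql db.bin
The first command lists the objects in the dump, the second writes the whole dump out as plain SQL, and the third pulls out just the one table.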
I had this same problem and I ended up doing this:
Install Postgresql and PGAdmin3.
Open PGAdmin3 and create a database.
Right click the db and click restore.
Ignore file type.
Select the database dump file from Heroku.
Click Restore.
pg_restore -f - db.bin > db.sql
Dump files are usually text files, if not compressed, and you can open them with a text editor. Inside you will find all the queries that allow the reconstruction of the database.
If you use pgAdmin on Windows, you can just back up the database as plain text; there is an option for that in the backup dialog, instead of using pg_dump at the command prompt.
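For reference, the same plain-text-versus-binary choice exists with pg_dump itself (the database name mydb is a placeholder): plain format produces a readable SQL file, while the custom format needs pg_restore.
pg_dump -F p -f db_plain.sql mydb
pg_dump -F c -f db_custom.dump mydb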

I have an 18MB MySQL table backup. How can I restore such a large SQL file?

I use a WordPress plugin called 'Shopp'. It stores product images in the database rather than the filesystem as standard; I didn't think anything of this until now.
I have to move servers, so I made a backup, but restoring the backup is proving a horrible task. I need to restore one table called wp_shopp_assets, which is 18MB.
Any advice is hugely appreciated.
Thanks,
Henry.
For large operations like this it is better to go to the command line. phpMyAdmin gets tricky when lots of data is involved, because there are all sorts of timeouts in PHP that can trip it up.
If you can SSH into both servers, then you can do a sequence like the following:
Log in to server1 (your current server) and dump the table to a file using mysqldump:
mysqldump --add-drop-table -uSQLUSER -pPASSWORD -h SQLSERVERDOMAIN DBNAME TABLENAME > BACKUPFILE
Do a secure copy of that file from server1 to server2 using scp:
scp BACKUPFILE USER@SERVER2DOMAIN:FOLDERNAME
Log out of server 1
Log into server 2 (your new server) and import that file into the new DB using mysql:
mysql -uSQLUSER -pPASSWORD DBNAME < BACKUPFILE
You will need to replace the UPPERCASE text with your own info. Just ask in the comments if you don't know where to find any of these.
It is worthwhile getting to know some of these command line tricks if you will be doing this sort of admin from time to time.
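A concrete sketch with the table from the question (the MySQL user root, the database name wordpress, the server name, and the paths are placeholders):
mysqldump --add-drop-table -u root -p wordpress wp_shopp_assets > wp_shopp_assets.sql
scp wp_shopp_assets.sql user@newserver:/tmp/wp_shopp_assets.sql
mysql -u root -p wordpress < /tmp/wp_shopp_assets.sql
With -p and no password on the command line, each tool prompts for the password instead of exposing it in the shell history.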
Try HeidiSQL: http://www.heidisql.com/
Connect to your server and choose the database.
Go to the menu "Import > Load SQL file", or simply paste the SQL into the SQL tab.
Execute the SQL (F9).
HeidiSQL is an easy-to-use interface and a "working-horse" for web developers using the popular MySQL database. It allows you to manage and browse your databases and tables from an intuitive Windows® interface.
EDIT: Just to clarify: this is a desktop application; you will connect to your database server remotely. You won't be limited by the PHP script max runtime or the upload size limit.
Use BigDump.
Create a folder on your server which is not easy to guess, like "BigDump_D09ssS" or whatever.
Download the importer file from http://www.ozerov.de/bigdump.php and add it to that directory after reading the instructions and filling out your config information.
FTP the .sql file to that folder alongside the BigDump script, then go to your browser and navigate to that folder.
Selecting the file you uploaded will start importing the SQL in split chunks, which is a much faster method!
Or if this is an issue, I recommend the other answer about the SSH and mysql -u -p -n -f method!
Even though this is an old post, I would like to add that it is recommended not to use database storage for images when you have more than about 10 product images.
Instead of exporting and importing such a huge file, it would be better to switch the Shopp installation to file storage for images before transferring.
You can use this free plug-in to help you. Always back up your files and database before performing this action.
What I do is open the file in a code editor and copy and paste it into a SQL window within phpMyAdmin. Sounds silly, but I swear by it for large files.
