Programmatically doing what Postgres \copy does - npgsql

In a C# Npgsql client, can I do what the \copy command does (fast load from a CSV file)? I read that psql's \copy invokes COPY FROM STDIN, but I don't know how to pipe the data.

I found the answer right here:
https://www.npgsql.org/doc/copy.html
Npgsql exposes COPY through NpgsqlConnection.BeginTextImport and BeginBinaryImport, which give you a writer to stream the CSV data through.

Related

Cannot import new database from pg_dump, out of memory: psql cloned_db < db_backup.sql

I made a backup of a remote database with the following:
pg_dump dbname | gzip > filename.gz
Then I created a new database that I want to import it into:
CREATE DATABASE new_clone;
and tried importing into the new database with this command:
gunzip -c /path/to/backup/filename.sql.gz | psql new_clone
It works and imports parts of the database, but then runs into an error:
out of memory
No other information is given. Most tables and schemas have appeared, but I cannot get past this problem.
I've also tried a simple
pg_dump dbname > filename
and a
psql dbname < infile
i.e. without compression, and it fails as well.
My guess is that a blob inside the database is too large? Is there a way to skip the blobs that cause errors?
The database is 10 GB uncompressed, 2.2 GB compressed. I have 135 GB of free disk space and 32 GB of RAM, running Ubuntu 14.04 and Postgres 9.6.2.
I'd appreciate any help at all. Thanks.
I have the same issue. After a little digging I found that it is not a psql issue, but stdin: after changing COPY ... FROM stdin to COPY ... FROM '/path/to/file/with.dump', everything worked. So when you feed psql through stdin, IMHO it is the OS, not psql, that tries to load the whole dump into memory, and since the dump is bigger than the memory the process can have, you get the error.
PS I'm sorry for my bad English, it is not my native language.
To improve on k.o.'s answer: psql --no-readline can be used to avoid the stdin (readline) OOM issue.
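Putting the two workarounds above together, a minimal sketch: decompress the dump to a file and let psql read it with -f instead of stdin. Filenames mirror the question; the psql step needs a live server, so it is left commented out:

```shell
# Tiny stand-in for the real compressed dump (the question's filename.gz).
printf 'CREATE TABLE demo (id int);\n' | gzip > filename.gz

# Decompress to a plain SQL file rather than piping into psql's stdin.
gunzip -c filename.gz > filename.sql

# Restore from the file; psql reads it itself instead of from stdin:
# psql --no-readline -d new_clone -f filename.sql
```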
I had the same issue recently. I am working on Ubuntu 20.04 and the Postgres version is 12. Try reading the first line of the error output; the main error is written there. In my case, the postgis scripts were missing, which caused the OOM error. To fix it, run:
sudo apt-get install postgis postgresql-12-postgis-scripts
I had the same problem with a 30GB database dump, and splitting the file up worked well for me.
To back it up, dump the database and pipe it through split:
pg_dump dbname | split -b 1000m - backup.dump
Then restore it by piping cat's output into psql:
createdb dbname
cat backup.dump* | psql dbname
https://www.postgresql.org/docs/8.1/backup.html#BACKUP-DUMP-LARGE
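The split-and-cat round trip above can be sanity-checked on a small file; the chunk size and file names here are illustrative:

```shell
# Create a small stand-in dump, split it into 1 KB pieces, and reassemble.
seq 1 500 | sed 's/^/INSERT INTO t VALUES (/; s/$/);/' > sample.dump
split -b 1k sample.dump backup.dump

# Default split suffixes (aa, ab, ...) sort lexicographically,
# so the shell glob concatenates them in the right order.
cat backup.dump* > restored.dump
cmp sample.dump restored.dump && echo OK
```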

Postgres -- Simple batch file to export CSV

Hi and thanks in advance.
I am currently exporting from my Postgres database via the psql shell with:
\COPY "Accounts" TO 'C:\Users\admin\Desktop\Accounts.csv' CSV HEADER;
This works fine, but I want to be able to double-click a batch file (.cmd or .bat) saved on my desktop that will 1) log into the database and 2) export the CSV.
That way I don't have to go into the psql shell every time. Please help; I did Google this, but Postgres resources are few.
Because the comments above are limited in their length and formatting, I am sharing some basic research that might get you started:
Using a .pgpass file
PowerShell connect to Postgres DB
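As a concrete starting point, the double-clickable script only needs to run psql non-interactively with -c. Here is a sketch that writes such a .cmd file; the psql path, host, user, and database name are placeholders, and the password is expected to come from pgpass.conf as in the first link:

```shell
# Generate a double-clickable Windows script that runs the \copy export.
cat > export_accounts.cmd <<'EOF'
@echo off
rem Password is read from %APPDATA%\postgresql\pgpass.conf
"C:\Program Files\PostgreSQL\12\bin\psql.exe" -h localhost -U admin -d mydb -c "\copy \"Accounts\" TO 'C:\Users\admin\Desktop\Accounts.csv' CSV HEADER"
EOF
```

Double-clicking the resulting .cmd then performs both steps: psql authenticates via pgpass.conf and runs the \copy.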

How to import a file into an Oracle table

I have a .tbl file with data and I'm trying to import it into a table. I'm using SQL Developer with this command:
load data infile "C:\path\users.tbl"
insert into table users fields terminated by "|" lines terminated by "\r\n";
But nothing works: the data is not loaded and no errors are shown. Do you see why it's not working?
That looks like SQL*Loader syntax.
For that to work, you'd have to run SQL*Loader, which is a separate command-line program available in your ORACLE_HOME/bin directory.
If you don't have an ORACLE_HOME, you'll need to install the client. Then open a shell/cmd window, and run your command there.
OR, if you want to use SQL Developer, you can use our wizard to read the file and insert the data, row-by-row.
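For illustration, the question's control block could be saved as a control file and fed to SQL*Loader. A sketch assuming an installed client; the column names (username, email) and the scott/tiger credentials are hypothetical placeholders, and the sqlldr invocation needs a database, so it is commented out:

```shell
# Save a SQL*Loader control file (column list is a placeholder).
cat > users.ctl <<'EOF'
LOAD DATA
INFILE 'C:\path\users.tbl'
INSERT INTO TABLE users
FIELDS TERMINATED BY '|'
(username, email)
EOF

# Run from a shell/cmd window with ORACLE_HOME/bin on the PATH:
# sqlldr userid=scott/tiger control=users.ctl log=users.log
```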

A way to read (select) an image from a location

Is there a way in Sybase to select (read) an image or a file from the server or drive?
In Oracle there is BFILE, which lets me read an image directly from the drive; how can I do that in Sybase?
You can read/write text files located on the ASE server's host through a proxy table that is mapped to a file.
Unfortunately, there is no way to read a binary file like an image via such a proxy table or otherwise directly from SQL. Some kludges are possible though:
You can use BCP and a format file to read a binary file into an image column (see my Tips & Tricks book below), and you can run this from SQL via xp_cmdshell.
You can use the Java VM that is embedded in the ASE server to read files and move the content into a table; that will require combined Java and SQL programming. YMMV.
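The text-file proxy table mentioned above looks roughly like the following in ASE's Component Integration Services. This is a sketch from memory with placeholder path and column size; treat the exact clause as an assumption and check the CIS documentation before relying on it:

```shell
# Save a sketch of the ASE SQL for a file-backed proxy table.
cat > file_proxy.sql <<'EOF'
-- Map a text file on the ASE host to a one-column proxy table.
create existing table logfile_proxy (
    record varchar(255) null
)
external file at "/path/to/file.txt"
go
select record from logfile_proxy
go
EOF
```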

Easy way to view postgresql dump files?

I have a ton of PostgreSQL dump files I need to look through for data. Do I have to install PostgreSQL and "recover" each one of them into a new database, one by one? I'm hoping there's a PostgreSQL client that can simply open them up so I can peek at the data, maybe even run a simple SQL query?
The dump files are all from a Postgresql v9.1.9 server.
Or maybe there's a tool that can easily make a database "connection" to the dump files?
UPDATE: These are not text files. They are binary. They come from Heroku's backup mechanism, this is what Heroku says about how they create their backups:
PG Backups uses the native pg_dump PostgreSQL tool to create its backup files, making it trivial to export to other PostgreSQL installations.
This was what I was looking for:
pg_restore db.bin > db.sql
Thanks @andrewtweber
Try opening the files with a text editor; the default dump format is plain text.
If the dump is not plain text, try the pg_restore -l your_db_dump.file command. It will list all objects in the database dump (tables, indexes, ...).
Another possible way (may not work, I haven't tried it) is to grep through the output of the pg_restore your_db_dump.file command. If I understood the manual correctly, the output of pg_restore is just a sequence of SQL queries that will rebuild the db.
In newer versions you need to specify the -f flag with a filename, or '-' for stdout:
pg_restore -f - dump_file.bin
I had this same problem and I ended up doing this:
1. Install PostgreSQL and pgAdmin3.
2. Open pgAdmin3 and create a database.
3. Right-click the db and click Restore.
4. Ignore the file type.
5. Select the database dump file from Heroku.
6. Click Restore.
pg_restore -f - db.bin > db.sql
Dump files are usually text files, if not compressed, and you can open them with a text editor. Inside you will find all the queries that allow the reconstruction of the database.
If you use pgAdmin on Windows, you can just back up the file as plain text; there is an option for this in the backup dialog, instead of using pg_dump at the command prompt.
