Dump of last 30 days' records using pg_dump - database

I have a database which is approximately 50 GB in size. I am running a command to get a dump of the whole database. The command is something like this (the database address is hidden, obviously):
pg_dump -h databaseAddress databaseName > dump-name.sql
I want to get only the records of the last 30 days. Is there a filter option for this?
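pg_dump has no built-in row filter; it always dumps whole tables. A common workaround, sketched here against a hypothetical table my_table with a created_at timestamp column (both names are assumptions, not from the question), is to export just the filtered rows with psql's \copy:
psql -h databaseAddress databaseName -c "\copy (SELECT * FROM my_table WHERE created_at >= now() - interval '30 days') TO 'last_30_days.csv' WITH CSV HEADER"
This produces a plain data export rather than a pg_dump archive, so schema definitions are not included and you would repeat it per table.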

Related

Need to compare the SQL output date with the sysdate/current date in a batch file

I want to compare the SQL output (which will be a single-row date in the format YYYYMMDD) with the current date. If the dates are the same, the code should exit, but if they are not the same I need to trigger an email. Please help me with this. So far I can get the date output on the cmd console, but the cmd prompt then waits for user input.
sqlcmd -S 10.XXX.XX.XXX -d BE_MI_DEV -U user1 -P pass$ -q "select max(day_id) from table1"
The above command gives me this output on the console:
20180920
(1 rows affected)
Note: I will be scheduling this batch file and it will run daily.
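One cause of the "waits for user input" symptom is the lowercase -q flag, which runs the query and then leaves sqlcmd in interactive mode; uppercase -Q runs the query and exits. A minimal batch sketch along those lines, reusing the same connection details (the email step is a placeholder for whatever mailer you use, e.g. blat or PowerShell's Send-MailMessage):
@echo off
rem Grab the single-row date; -h -1 drops the header row, -W trims spaces,
rem and SET NOCOUNT ON suppresses the "(1 rows affected)" line
for /f %%d in ('sqlcmd -S 10.XXX.XX.XXX -d BE_MI_DEV -U user1 -P pass$ -h -1 -W -Q "SET NOCOUNT ON; select max(day_id) from table1"') do set DBDATE=%%d
rem Build today's date as YYYYMMDD, independent of regional settings
for /f %%t in ('wmic os get LocalDateTime ^| findstr "[0-9]"') do set TODAY=%%t
set TODAY=%TODAY:~0,8%
if "%DBDATE%"=="%TODAY%" exit /b 0
rem Dates differ: trigger your email here (placeholder)
echo Date mismatch: database=%DBDATE% today=%TODAY%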

Sybase IQ - BCP error

Trying to copy data from a Sybase IQ table to a .txt file using bcp.
Command:
C:\Windows\system32>bcp xxTable out \xx\xx\xxTable.txt -S HSPREP_15 -U xxUser -A 16384 -c -t\t$$ -P xxpwd -F -L 1>\xx\xx\xxTable_rpt.txt
The source table has about 132 million records.
bcp copies about 60 million records to the .txt file and then fails with an error.
There is no issue with disk space or user credentials.
Any clues? Where should I be looking?
bcp out does not create a file bigger than 2 GB, which might be the case here; please check (see the sketch below).
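If the 2 GB file limit is indeed the culprit, one hedged workaround is to split the export into row ranges with bcp's -F (first row) and -L (last row) options so each output file stays under the limit. Given that the export died around 60 million rows, chunks of 40 million (the boundaries below are illustrative) should be comfortably small:
bcp xxTable out \xx\xx\xxTable_part1.txt -S HSPREP_15 -U xxUser -P xxpwd -A 16384 -c -t\t$$ -F 1 -L 40000000
bcp xxTable out \xx\xx\xxTable_part2.txt -S HSPREP_15 -U xxUser -P xxpwd -A 16384 -c -t\t$$ -F 40000001 -L 80000000
bcp xxTable out \xx\xx\xxTable_part3.txt -S HSPREP_15 -U xxUser -P xxpwd -A 16384 -c -t\t$$ -F 80000001 -L 120000000
bcp xxTable out \xx\xx\xxTable_part4.txt -S HSPREP_15 -U xxUser -P xxpwd -A 16384 -c -t\t$$ -F 120000001 -L 132000000
Note that without a guaranteed row order the split is pragmatic rather than deterministic, but for a straight unload it keeps every file under 2 GB.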

PostgreSQL table backup restoration

I have a database which contains 50 tables (5 schemas, 5 tablespaces), and I tried to take a backup of a few tables (each table in a different tablespace) using the following command:
$ pg_dump -U my_db_user my_db_name -t my_table_1 -t my_table_2 -t my_table_3 > ttables.sql
The above command works fine for taking the *.sql backup, but some table columns contain null values. While restoring the dump using the following command, I get errors due to the null (\N) values in the backup file (ttables.sql):
$ cat ttables.sql | psql -d new_db -U new_db_user
Is there any way to avoid the \N characters in the backup dump file? Or is there something wrong with the backup/restore commands I have used?
(Postgres version 9.1)
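For what it's worth, \N is simply how pg_dump's default COPY format represents NULL, and psql normally accepts it on restore, so the real error may lie elsewhere. Still, if you want a dump without \N literals, one hedged option is pg_dump's --inserts flag (available in 9.1), which writes plain INSERT statements with explicit NULLs instead of COPY data; restores are slower, but the \N characters disappear:
$ pg_dump -U my_db_user my_db_name --inserts -t my_table_1 -t my_table_2 -t my_table_3 > ttables.sql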

SQL Server BCP export corrupted file?

I have a small problem with the BCP functionality in SQL Server 2012.
The thing is:
I'm loading a .jpg image (167 KB in size) using the command below:
INSERT [tabela_testowa] ( Data )
SELECT * FROM OPENROWSET (BULK N'C:\foty\ch6_MagicShop.jpg', SINGLE_BLOB) a;
and then I'm trying to export it back to disk using:
BCP "SELECT data FROM tabela_testowa WHERE ID = 1" queryout "C:\test\file.jpg" -T -n -d test
The file gets saved to disk without a problem, and its size is also 167 KB, but it can't be opened like the original copy.
I don't know whether some parameter is wrong in the BCP export, or maybe it gets corrupted at the import stage?
Has anyone had similar problems?
Thank god. Thanks to #user_0's answer, #user3494351's cryptic answer and comment, and an ancient forum post, I finally figured this out after several hours of banging my head against the wall.
The issue is that BCP likes to add an extra 8 bytes to the file by default. This corrupts the file and makes it impossible to open if you just use the native -n flag.
However, BCP allows you to specify a format file as output, which lets you tell it not to add the extra 8 bytes. So I created a table in SQL Server (to be used in a cursor) that has only ONE ROW and ONE COLUMN holding my binary data. The table must exist when you run the first command.
In the command line, first you need to run this:
bcp MyDatabase.MySchema.MyTempTable format nul -T -n -f formatfile.fmt
This creates formatfile.fmt in the directory you are in; I did it on the E:\ drive. Here's what it looks like:
10.0
1
1 SQLBINARY 8 0 "" 1 MyColumn ""
That 8 is the prefix length: the number of length-prefix bytes bcp writes in front of your data. It is the culprit that is corrupting your files. Change it to a 0:
10.0
1
1 SQLBINARY 0 0 "" 1 MyColumn ""
Now just run your BCP script, but drop the -n flag and include the -f flag:
bcp "SELECT MyColumn FROM MyDatabase.MySchema.MyTempTable" queryout "E:\MyOutputpath" -T -f E:\formatfile.fmt
BCP adds information to its output file. It is only a little extra data, but it means you are not exporting just a jpg file.
You say 167 KB, but check the exact byte counts, not the rounded size. There will be a difference.
You cannot export the image via BCP.
OK, so I solved the issue.
A format file has to be supplied using -f and the path to the file. The format file can be created by running bcp without any format flags and telling it to save the format file to disk. Then we can reuse this format file, so there is no longer any need to answer the interactive prompts, and the exported file itself has no additional data and can be opened without problems.

SSMS output delimited files with headers

Is there a way, using SSMS or another tool, to output about 600 tables from a SQL Server database? The catch is that they need to have column headers.
Basically I need to dump 600+ tables with a bar '|' delimiter, and they all need column names in the first row.
If I remember right, you should be able to use the command-line sqlcmd tool to export data together with headers. Something like this:
sqlcmd -S localhost -d YourDatabase -E -Q "SELECT * FROM YourTable" -o "CSVData.csv" -W -w 1024 -s"|"
You'll have to look into the options to get them right; a loop over all tables is sketched below.
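To cover all 600+ tables, the single sqlcmd call can be wrapped in a batch loop over the table list. A sketch under stated assumptions (trusted connection, an existing C:\export folder, and plain table names; localhost and YourDatabase are carried over from the answer above):
@echo off
rem Enumerate every user table as schema.table, then export each one pipe-delimited
for /f "usebackq delims=" %%t in (`sqlcmd -S localhost -d YourDatabase -E -h -1 -W -Q "SET NOCOUNT ON; SELECT s.name + '.' + t.name FROM sys.tables t JOIN sys.schemas s ON s.schema_id = t.schema_id"`) do (
  sqlcmd -S localhost -d YourDatabase -E -Q "SET NOCOUNT ON; SELECT * FROM %%t" -o "C:\export\%%t.txt" -W -w 8192 -s"|"
)
Be aware that sqlcmd still prints a dashed separator line under the header row, and table names containing spaces or special characters would need QUOTENAME handling, so treat this as a starting point rather than a finished exporter.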
