Firebird nbackup delta file - database

I want to make a backup of a Firebird database. According to the documentation I should do it with:
/opt/firebird/bin/nbackup -B 0 /home/server/daten/DB.fdb DB19082014.nbk
This works, and I get a file DB19082014.nbk. I copied it to my computer, and now I want to restore it with:
/opt/firebird/bin/nbackup -R /home/server/daten/DB.fdb db19082014.nbk
But now I get the error:
I/O error during "open" operation for file "/home/server/daten/DB.fdb.delta"
Error while trying to open file
null
But I don't have a .delta file, neither on my system nor on the system where I made the backup. Does anybody know where or how I can create an empty .delta file so that the database works again?
Thank You
Solution:
The database must be unlocked (fixed up) with:
nbackup -F <database>
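A minimal sketch of the whole cycle with the paths from the question (the restore target DB_restored.fdb is an assumed name; nbackup will not restore over an existing database file):
# level-0 backup on the server, as in the question
/opt/firebird/bin/nbackup -B 0 /home/server/daten/DB.fdb DB19082014.nbk
# restore into a database file that does not exist yet (hypothetical target name)
/opt/firebird/bin/nbackup -R /home/server/daten/DB_restored.fdb DB19082014.nbk
# if the database is still reported as locked (the .delta error),
# switch it back to normal state with the fixup option
/opt/firebird/bin/nbackup -F /home/server/daten/DB_restored.fdb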


Related

DB2 relocate with .dbf file on DB2 v. 10.5 LUW

Hi, I have a DB2 database at
/db2/ins/data/ins/dbtest
but its origin is
/db2/oldins/data/oldins/dbtest1
I copied the files to the folders as needed.
My relocate.cfg looks like:
DB_NAME=dbtest1,dbtest
DB_PATH=/db2/oldins/data/dbtest1/metalog/,/db2/ins/data/ins/dbtest/metalog
INSTANCE=oldins,ins
STORAGE_PATH=/db2/oldins/data/dbtest1/data/,/db2/ins/data/ins/dbtest/data/
LOG_DIR=/db2/oldins/data/dbtest1/metalog/oldins/NODE0000/SQL00001/LOGSTREAM0000/,/db2/ins/data/ins/dbtest/metalog/NODE0000/SQL00001/
LOGARCHMETH1=DISK:/db2/backup/ins/dbtest/archivlogfiles/
I get this error:
DBT1006N The "/db2/oldins/data/dbtest1/data/dbtest1_TS.dbf/SQLTAG.NAM" file or device could not be opened.
The system is DB2 v. 10.5 LUW.
The file does exist and the privileges are correct.
How do I add this to the relocate.cfg file or what do I need to do?
Thank you for any help.
Here is a simple test case showing how to use db2relocatedb:
[Db2] Simple test case shell script for db2relocatedb command
https://www.ibm.com/support/pages/node/1099185
It has a topic about:
- db2relocatedb for changing container path
It explains that we need to move the path with the 'mv' command before running the db2relocatedb command, as below:
# mv storage path manually and run db2relocatedb with relocate.cfg file
mv /home/db2inst1/db/stor1 /home/db2inst1/db/new1
mv /home/db2inst1/db/stor2 /home/db2inst1/db/new2
db2relocatedb -f relocate.cfg
It is recommended to review it.
Hope this helps.
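Applied to the paths in this question, a hedged sketch of that recipe could look like the following (this assumes the containers still live under the old /db2/oldins/... layout and that the target directories from relocate.cfg do not exist yet; it has not been verified against this setup):
# move the storage and metadata/log paths to the locations named in relocate.cfg
mv /db2/oldins/data/dbtest1/data /db2/ins/data/ins/dbtest/data
mv /db2/oldins/data/dbtest1/metalog /db2/ins/data/ins/dbtest/metalog
# then run the relocation against the edited configuration file
db2relocatedb -f relocate.cfg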

Sybase IQ - BCP error

Trying to copy data from a Sybase IQ table to a .txt file using bcp.
Command
C:\Windows\system32>bcp xxTable out \xx\xx\xxTable.txt -S HSPREP_15 -U xxUser -A 16384 -c -t\t$$ -P xxpwd -F -L 1>\xx\xx\xxTable_rpt.txt
The source table has about 132 million records.
bcp copies about 60 million records to the .txt file and then fails with the error below:
There is no issue with disk space or user credentials.
Any clues? Where should I be looking?
bcp out doesn't create a file bigger than 2 GB, which might be the case here; please check.
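If the 2 GB limit is the cause, one hedged workaround is to export the table in row ranges so each output file stays below the limit. The sketch below reuses the connection flags from the question and assumes the -F (first row) and -L (last row) options work this way for bcp out on your client version; the split points and _partN file names are made up:
bcp xxTable out \xx\xx\xxTable_part1.txt -S HSPREP_15 -U xxUser -P xxpwd -A 16384 -c -t\t$$ -F 1 -L 50000000
bcp xxTable out \xx\xx\xxTable_part2.txt -S HSPREP_15 -U xxUser -P xxpwd -A 16384 -c -t\t$$ -F 50000001 -L 100000000
bcp xxTable out \xx\xx\xxTable_part3.txt -S HSPREP_15 -U xxUser -P xxpwd -A 16384 -c -t\t$$ -F 100000001 -L 132000000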

Postgres pg_dump issue

Good Day,
I've been trying to restore a dump file using the psql client and I'm getting this error:
psql.bin:/home/user/Desktop/dump/dumpfile.sql:907:
ERROR: more than one function named "pg_catalog.avg"
CONTEXT: COPY pg_aggregate, line 1, column aggfnoid: "pg_catalog.avg"
I created the dump file from a different Postgres DB (version: 9.4.5) using the command:
pg_dump --username=pgroot ${tables} --no-owner --no-acl --no-security
--no-tablespaces --no-unlogged-table-data --data-only dbname > dumpfile.sql
Where ${tables} is a variable of the form:
-T table1 -T table2 -T table3 ...
This is because I'm dumping specific tables listed in a newline-delimited file. Hence it's not the entire database but specific tables that I want to dump.
I tried loading the dump file into another Postgres DB (9.6) using the following command:
psql -d dbname -U superuser -v "ON_ERROR_STOP=1" -f
${DUMP_DIR}dumpfile.sql -1 -a > ${LOG_ERR_DIR}dumpfile.log
2>${LOG_ERR_DIR}dumpfile.err
This gave the error mentioned above. It seems this error is occurring because the dump file tries to add the function "pg_catalog.avg" to the database and it gives an error because it already exists.
The SQL file generated by pg_dump does not create the pg_catalog.avg function anywhere, so I don't know why this is occurring.
So I tried dropping the database and creating it from template0, and still I got the error. It seems to me that it's a bug, based on the following post:
Re: BUG #6176: pg_dump dumps pg_catalog tables
I'm stuck trying to resolve this issue. If anyone can help me resolve it, please respond.
Thank you in advance,
j3rg
I found out what was causing this issue. There was an extra newline in the file containing the table listing. This caused an extra table argument with no table specified, and in turn pg_dump exported the system tables into the file. The file I was searching in for the avg function was also the wrong file.
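As a hedged illustration of that fix, the exclusion list can be built while skipping blank lines so an empty -T argument is never produced (tables.txt and the loop below are assumptions, not taken from the question):
# build "-T table1 -T table2 ..." from a newline-delimited list, ignoring blank lines
tables=""
while read -r t; do
  [ -n "$t" ] && tables="$tables -T $t"
done < tables.txt
# then run the same pg_dump command from the question, passing ${tables} as before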

db2exfmt unable to open output file

I am trying to create query explain tables using db2exfmt.
I am using the DB2 CLP and I am following the steps below:
Connect to sample
set current explain mode explain
My query = select * from staff where JOB = 'Sales'
db2 set current explain mode no
db2exfmt -d sample -# 0 -w -1 -g -TIC -n % -s % -o output.txt
After the last step, I am getting this output:
Connecting to the Database.
Connect to Database Successful.
Unable to open output file.
I am not sure why it is not able to open the output file. How should I resolve this issue?
It appears that you don't have write access to the C:\Program Files\IBM\SQLLIB\BIN directory, so db2exfmt can't open the output file for writing.
Change to a directory where you do have write permission, or specify a file name with a path for the -o option.
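For example, with a writable output path (C:\temp is an assumption; the rest of the command is unchanged from the question):
db2exfmt -d sample -# 0 -w -1 -g -TIC -n % -s % -o C:\temp\output.txt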

Overwrite file while backing up database

I want to back up a database using this command:
sqlcmd -S servername -Q "BACKUP DATABASE [DBName] TO DISK = 'C:\backup.bak'"
It works. But if the backup file already exists, the data gets appended to the file instead of replacing the file. Every time I call BACKUP DATABASE the file gets bigger.
Is there an option for BACKUP DATABASE to force a replace?
sqlcmd -S servername -Q "BACKUP DATABASE [DBName] TO DISK = 'C:\backup.bak' WITH INIT"
INIT does the trick. From MSDN:
INIT Specifies that all backup sets should be overwritten
WITH INIT alone is not enough; these days it should be WITH INIT, SKIP. See the docs.
Explanation: INIT overwrites only if certain conditions are met; SKIP instructs SQL Server to ignore those checks.
BACKUP DATABASE SQ_P TO DISK='D:\Data Backup\SQ_P.bak' with init;
where SQ_P is the Database Name
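Putting the answers together, the original sqlcmd call with the overwrite options would look like this:
sqlcmd -S servername -Q "BACKUP DATABASE [DBName] TO DISK = 'C:\backup.bak' WITH INIT, SKIP"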
