Some questions about backing up the kiwi-tcms database

I'm trying to back up my Kiwi TCMS data following the steps at http://kiwitcms.org/blog/atodorov/2018/07/30/how-to-backup-docker-volumes-for-kiwi-tcms/. I have some questions and need help.
1. Which types of data are stored in kiwi_uploads? Should I also run "docker volume rm kiwi_uploads" and then restore it, the same as for backing up the database?
2. The errors below occur when restoring kiwi_uploads with "cat uploads.tar | docker exec -i kiwi_web /bin/tar -x". But even with the errors, I can log in and find my previous data (plans, runs, test cases, ...). Of course, I restored kiwi_db_data successfully.
cat uploads.tar | docker exec -i kiwi_web /bin/tar -x
/bin/tar: This does not look like a tar archive
/bin/tar: Skipping to next header
/bin/tar: Exiting with failure status due to previous errors
3."cat database.json | docker exec -i kiwi_web /Kiwi/manage.py loaddata --format json -". No any parameter behind last -? missing or just as this.

1) kiwi_uploads is for all files that are uploaded (or attached) to documents like Test Plan, Test Case, etc.
The instructions in the blog should work for you. Usually there's no need to remove the volume, but if you are restoring everything it doesn't really matter.
2) For the errors you have
/bin/tar: This does not look like a tar archive
so whatever file you ended up with is not a tar archive and everything else fails.
3) The last - means to read the input data from stdin. You have to copy the backup and restore commands verbatim.
All commands are designed to be executed from a Linux host. I don't have access to a Windows or Mac OS box so I don't know if they will work there at all.
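For reference, a minimal sketch of the volume backup/restore pair the blog describes (assuming the web container is named kiwi_web and uploads live under /Kiwi/uploads inside it; check your own compose setup):
docker exec -i kiwi_web /bin/tar -cPf - /Kiwi/uploads > uploads.tar    # back up: archive the uploads to stdout on the host
cat uploads.tar | docker exec -i kiwi_web /bin/tar -xPf -              # restore: feed the archive back in via stdin
If the restore complains that the file "does not look like a tar archive", inspect the backup on the host with "file uploads.tar" before piping it back in.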

Related

Deploy the database to Docker Container microsoft/mssql-server-linux

I have a database running on SQL Server (13.01) on Windows. I'd like to deploy it to a Docker container on Linux using SSDT.
I can connect to the server running in Docker just fine, create/drop databases manually, and play with the data.
The problem is that I cannot publish the database. I'm executing the following script in PowerShell:
PS> SqlPackage.exe /Action:Publish /SourceFile:"d.dacpac" /TargetConnectionString:"server=containeraddress;database=thedatabase;user id=sa;password=thepassword;"
and getting the following error.
Unable to connect to master or target server 'the database'. You must have a user with the same password in master or target server 'the database'. (Microsoft.Data.Tools.Schema.Sql)
I have the same user and same password on target and source servers.
Has anybody had the same problem and knows how to solve it?
I'll post this here as most of the answers assume you have an existing compiled dacpac file, which may not always be possible. I haven't seen the solution I'm suggesting here posted elsewhere.
If you wish to compile your Visual Studio database project inside the container, be aware that with certain combinations of container base OS and image it may not be possible to create a dacpac file with msbuild.
You can work around this by restoring the database with a series of Unix commands, taking advantage of the fact that a Visual Studio database project is usually just a collection of SQL files. Below I show an example where I concatenate the SQL files into a single script and call sqlcmd to run it:
# Start from the official SQL Server image and bake the schema in at build time
FROM mcr.microsoft.com/mssql/server
WORKDIR /init
ENV ACCEPT_EULA=Y
ENV MSSQL_SA_PASSWORD=MyPassword
EXPOSE 1433
# dos2unix strips the Windows line endings that the Linux sqlcmd chokes on
RUN apt-get update && apt-get install -y dos2unix
COPY /solution_folder/database/Tables/*.sql /init/
WORKDIR /database
RUN echo "CREATE DATABASE [database_name];\nGO\nUSE [database_name];\n" >> /database/create.sql
RUN for f in /init/*.sql; do dos2unix "$f"; cat "$f" >> /database/create.sql; echo "\nGO\n" >> /database/create.sql; done
# Start sqlservr just long enough to run the generated script, then shut it down
RUN ( /opt/mssql/bin/sqlservr --accept-eula & ) | grep -q "Service Broker manager has started" && /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P 'MyPassword' -i /database/create.sql && pkill sqlservr
The reason for "dos2unix" is that the SQL files created within visual studio have unique hidden cr/lf (and other characters) which the linux version of sqlcmd won't interpret successfully and will cause errors (which is kind of bizarre and this is exactly the kind of thing you'd want a cross platform database to be able to cope with)
Also, within the final run command you have to start-up the sql server service temporarily otherwise you'll also get errors; it's a little bit of work-around, and a bit fiddly and I'm not sure entirely that the microsoft sql server linux container is well designed enough for the simple task of restoring a database like this, the nuances are the differences between building and running a container and needing some sort of happy middle ground of both concepts for it to work.
Given here isn't a complete solution to restore, it only deals with Tables from the project file, although it should be trivial to expand to scalar functions and stored procedures.
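A minimal sketch of building and running the resulting image (image and container names are illustrative):
docker build -t my-database-image .                              # runs the RUN steps above, seeding the database at build time
docker run -d -p 1433:1433 --name my-database my-database-image  # serve the pre-seeded database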
Which version of SqlPackage.exe are you using? Only the most recent release candidate versions of SqlPackage.exe support SQL Server vNext CTP. The SqlPackage release candidate can be downloaded here: https://www.microsoft.com/en-us/download/details.aspx?id=54273

Restore SQL Server database to Linux Docker

I need to restore a large SQL Server database on a Linux Docker instance (https://hub.docker.com/r/microsoft/mssql-server-linux/)
I'm moving my .bak file to the Docker container and executing this command in the mssql shell:
RESTORE DATABASE gIMM_Brag FROM DISK = '/var/opt/mssql/backup/BackupFull8H_gIMM.bak' WITH MOVE '[gIMM].Data' TO '/var/opt/mssql/data/gIMM.mdf', MOVE '[gIMM].Log' TO '/var/opt/mssql/data/gIMM.ldf', MOVE 'TraceabilityData' TO '/var/opt/mssql/data/gIMM.TraceData.mdf', MOVE 'TraceabilityIndexes' TO '/var/opt/mssql/data/gIMM.TraceIndex.mdf', MOVE 'KpiData' TO '/var/opt/mssql/data/gIMM.KpiData.mdf', MOVE 'KpiIndexes' TO '/var/opt/mssql/data/gIMM.KpiIndex.mdf'
I'm mapping every file that needs to be moved correctly, and I definitely have enough space on the Docker instance, but I'm getting this error:
Error: The backup or restore was aborted.
The same error actually occurs with the Windows version of this Docker image... And since it's not supposed to be an Express edition, the database size shouldn't be the issue here.
Does anyone have more information about what is causing this error?
Thanks,
#TOUDIdel
You have to use the actual file system paths on Linux rather than the virtual paths that are shown in the error.
RESTORE DATABASE Northwind FROM DISK='/var/opt/mssql/Northwind.bak' WITH MOVE 'Northwind' TO '/var/opt/mssql/data/NORTHWND.MDF', MOVE 'Northwind_log' TO '/var/opt/mssql/data/NORTHWND_log.ldf'
http://www.raditha.com/blog/archives/restoring-a-database-on-ms-sql-server-for-linux-docker/
You didn't mention it, but the thing that tripped me up was that I wasn't copying the BAK file to my Docker instance. In a terminal, with Docker and your mssql container running...
1) get container ID:
$ docker inspect -f '{{.Id}}' <container_name>
2) copy BAK file to docker instance:
docker exec -i <container_id> bash -c 'cat > /var/opt/mssql/backup.bak' < '/source/path/backup.bak'
3) log into mssql:
mssql -u sa -p 'myPassword'
4) restore db: (you can replace this with your own restore script, though this was sufficient for me)
RESTORE DATABASE [MyDatabase] FROM DISK = N'/var/opt/mssql/backup.bak' WITH FILE = 1, NOUNLOAD, REPLACE, STATS = 5
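As an aside, docker cp is a simpler way to get the backup file into the container (a sketch; paths are illustrative):
docker cp /source/path/backup.bak <container_id>:/var/opt/mssql/backup.bak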
When I had this problem, it was because the restore command was taking long enough for mssql to time out (with a totally unhelpful error message). Specifying a long timeout when connecting allowed the restore to complete, e.g.
mssql -s localhost -p "<sa_password>" -t 36000000 -T 36000000
I am not sure it is worth mentioning, but neither of the answers alone worked for me when moving a .bak made on Windows Server to Docker running the Linux version.
(Note that I am using the code from the two previous answers and thus any credit should go to the below-mentioned authors)
TabsNotSpaces' solution was good until step 3, where the restore crashed with a path mismatch (C:/path_to_mssql_server could not be found).
Vinicius Krauspenhar's answer was then necessary to remap the MDF and LOG files to fully complete the backup.
Thus the solution that worked for me when importing a windows-server-made .bak file into the Linux docker instance was:
In Terminal with docker and your SQL Server container running...
1) get container ID:
$ docker inspect -f '{{.Id}}' <container_name>
2) copy BAK file to docker instance:
docker exec -i <container_id> bash -c 'cat > /var/opt/mssql/backup.bak' < '/source/path/backup.bak'
3) log into mssql, or into any DB client, and run:
RESTORE DATABASE Northwind FROM DISK='/var/opt/mssql/Northwind.bak' WITH MOVE 'Northwind' TO '/var/opt/mssql/data/NORTHWND.MDF', MOVE 'Northwind_log' TO '/var/opt/mssql/data/NORTHWND_log.ldf'
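If you don't know the logical file names inside the .bak (the names used in the MOVE clauses), you can list them first. A sketch using the sqlcmd that ships in the container (container ID, password, and backup path are illustrative):
docker exec -it <container_id> /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P '<sa_password>' -Q "RESTORE FILELISTONLY FROM DISK = N'/var/opt/mssql/backup.bak'"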

Unable to restore SEC filings preloaded database from arelle.org, postgres pg_dump gzip file

I was trying to restore an SEC filings preloaded database from Arelle.org using Postgres. Here is the link:
http://arelle.org/documentation/xbrl-database/
It's the one towards the bottom of the page where it says "Preloaded Database".
I was able to download the file, but was unable to gunzip it at first. So I copied the file and renamed it with a .gz extension instead of .gzip. Then I was able to gunzip it, but I'm not sure if that affects the file.
After that I tried the following commands in Postgres to restore the database into the database that I created:
psql -U username -d mydb -f secfile.pg (no luck)
I also tried:
pg_restore -C -d mydb secfile.pg (also no luck)
I am not sure if the problem is that I copied and renamed the file. I'd really appreciate it if anyone could help.
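Renaming a gzip file does not change its contents, so that step by itself is harmless. One way to tell which restore tool the dump expects is to inspect the decompressed file first (a sketch; the filename comes from the question): a plain-text SQL script goes through psql, while a PostgreSQL custom-format archive goes through pg_restore.
file secfile.pg    # "ASCII text" -> psql -f; "PostgreSQL custom database dump" -> pg_restore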

PostgreSQL - Backup and Restore Database Tables with Partitions

I'm working on PostgreSQL 8.4 and I'd like to back up and restore (from Ubuntu 11.10 to Ubuntu 12.04).
I want to include all partitions, clusters, roles and stuff.
My commands:
Back up:
pg_dumpall > filename
Compress:
zip -f mybackup
Uncompress and restore:
sudo gunzip -c /home/ubuntu/Desktop/backupFile.zip | psql -U postgres
The issue is in the restore process, I got an error
invalid command \.
ERROR: syntax error at or near "2"
LINE 1: 2 2 1
^
invalid command \.
ERROR: syntax error at or near "1"
LINE 1: ...
^
out of memory
Plus, the tables with partitions were not restored, and some tables were restored without any data!
Please help!
EDIT
I used pgAdmin to do the back up, using the "backup server" option.
If you used zip to compress the output, then you should use unzip to uncompress it, not gunzip; they use different formats/algorithms.
I'd suggest using only gzip and gunzip. For instance, if you generated a backup named mybackup.sql, you can gzip it with:
gzip mybackup.sql
It will generate a file named mybackup.sql.gz. Then, to restore, you can use:
gunzip -c mybackup.sql.gz | psql -U postgres
Also, I'd suggest avoiding pgAdmin for taking the dump. Not that it can't do it; it's just that you can't automate it. You can easily use pg_dumpall the same way:
pg_dumpall -U postgres -f mybackup.sql
You can also dump and compress without an intermediate file by using a pipe:
pg_dumpall -U postgres | gzip -c > mybackup.sql.gz
BTW, I'd really suggest avoiding pg_dumpall and using pg_dump with the custom format for each database, since the result is already compressed and easier to work with later. But pg_dumpall is OK for small databases.
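A minimal sketch of the custom-format approach recommended above (database name is illustrative):
pg_dump -U postgres -Fc -f mybackup.dump mydb    # custom format, compressed by default
pg_restore -U postgres -d mydb mybackup.dump     # restore into an existing database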

Import SQL dump into PostgreSQL database

We are switching hosts and the old one provided a SQL dump of the PostgreSQL database of our site.
Now, I'm trying to set this up on a local WAMP server to test this.
The only problem is that I don't have an idea how to import this database in the PostgreSQL 9 that I have set up.
I tried pgAdmin III but I can't seem to find an 'import' function. So I just opened the SQL editor, pasted the contents of the dump there, and executed it; it creates the tables but it keeps giving me errors when it tries to put the data in them.
ERROR: syntax error at or near "t"
LINE 474: t 2011-05-24 16:45:01.768633 2011-05-24 16:45:01.768633 view...
The lines:
COPY tb_abilities (active, creation, modtime, id, lang, title, description) FROM stdin;
t 2011-05-24 16:45:01.768633 2011-05-24 16:45:01.768633 view nl ...
I've also tried to do this with the command prompt but I can't find the command that I need.
If I do
psql mydatabase < C:/database/db-backup.sql;
I get the error
ERROR: syntax error at or near "psql"
LINE 1: psql mydatabase < C:/database/db-backu...
^
What's the best way to import the database?
psql databasename < data_base_dump
That's the command you are looking for.
Beware: databasename must be created before importing.
Have a look at the PostgreSQL Docs Chapter 23. Backup and Restore.
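A sketch of creating the target database first, using createdb (the database name is illustrative):
createdb -U postgres databasename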
Here is the command you are looking for.
psql -h hostname -d databasename -U username -f file.sql
I believe that you want to run in psql:
\i C:/database/db-backup.sql
That worked for me:
sudo -u postgres psql db_name < 'file_path'
I'm not sure if this works for the OP's situation, but I found that running the following command in the interactive console was the most flexible solution for me:
\i 'path/to/file.sql'
Just make sure you're already connected to the correct database. This command executes all of the SQL commands in the specified file.
This works pretty well on the command line; all arguments are required. -W prompts for the password:
psql -h localhost -U user -W -d database_name -f path/to/file.sql
Just for funsies, if your dump is compressed you can do something like
gunzip -c filename.gz | psql dbname
As Jacob mentioned, the PostgreSQL docs describe all this quite well.
Make sure the database you want to import into has been created; then you can import the dump with:
sudo -u postgres -i psql testdatabase < db-structure.sql
If you want to overwrite the whole database, first drop the database
# be sure you drop the right database !!!
#sudo -u postgres -i psql -c "drop database testdatabase;"
and then recreate it with
sudo -u postgres -i psql -c "create database testdatabase;"
Follow the steps:
Go to the psql shell
\c db_name
\i path_of_dump [e.g. C:/db_name.pgsql]
I tried many different solutions for restoring my Postgres backup. I ran into permission-denied problems on macOS, and no solution seemed to work.
Here's how I got it to work:
Postgres comes with pgAdmin4. If you use macOS you can press CMD+SPACE and type pgadmin4 to run it. This will open a browser tab in Chrome.
If you run into errors getting pgadmin4 to work, try killall pgAdmin4 in your terminal, then try again.
Steps to getting pgadmin4 + backup/restore
1. Create the backup
Do this by right-clicking the database -> "Backup"
2. Give the file a name.
Like test12345. Click Backup. This creates a binary file dump; it's not in .sql format.
3. See where it downloaded
There should be a popup at the bottom right of your screen. Click the "more details" page to see where your backup was downloaded to.
4. Find the location of the downloaded file
In this case, it's /users/vincenttang
5. Restore the backup from pgadmin
Assuming you did steps 1 to 4 correctly, you'll have a binary restore file. There might come a time when a coworker wants to use your restore file on their local machine. Have said person go to pgadmin and restore.
Do this by right-clicking the database -> "Restore"
6. Select file finder
Make sure to select the file location manually; DO NOT drag and drop a file onto the uploader fields in pgadmin, because you will run into permission errors. Instead, find the file you just created:
7. Find said file
You might have to change the filter at the bottom right to "All files". Find the file from step 4, then hit the bottom-right "Select" button to confirm.
8. Restore said file
You'll see this page again, with the location of the file selected. Go ahead and restore it
9. Success
If all is good, the bottom right should show a popup indicating a successful restore. You can navigate to your tables to see whether the data has been restored properly in each table.
10. If it wasn't successful:
Should step 9 fail, try deleting the old public schema in your database. Go to the Query Tool and execute this code block:
DROP SCHEMA public CASCADE; CREATE SCHEMA public;
Now try steps 5 to 9 again; it should work out.
Summary
This is how I had to back up and restore my backup on Postgres when I had permission errors and could not log in as a superuser or set read/write credentials with chmod on folders. This workflow works for the default "Custom" binary dump format from pgadmin. I assume .sql works the same way, but I have not tested that yet.
I use:
cat /home/path/to/dump/file | psql -h localhost -U <user_name> -d <db_name>
Hope this will help someone.
If you are using a file with .dump extension use:
pg_restore -h hostname -d dbname -U username filename.dump
I noticed that many examples are overcomplicated for localhost, where in many cases just the postgres user with no password exists:
psql -d db_name -f dump.sql
You can do it in pgadmin3. Drop the schema(s) that your dump contains. Then right-click on the database and choose Restore. Then you can browse for the dump file.
I used this
psql -d dbName -U username -f /home/sample.sql
PostgreSQL 12
From an SQL file (plain-text dumps go through psql, not pg_restore):
psql -d database < file.sql
From a custom-format file:
pg_restore -Fc -d database < file.dump
I had more than 100 MB of data, so I could not restore the database using pgAdmin4.
I simply used the postgres client and ran the command below:
postgres@khan:/$ pg_restore -d database_name /home/khan/Downloads/dump.sql
It worked fine and took a few seconds. See the link below for more information:
https://www.postgresql.org/docs/8.1/app-pgrestore.html
