Why is SQLite3 automatically encrypting a database I create?

I am trying to get a thorough understanding of sqlite3 so that I can run some basic queries through DB Browser for SQLite (http://sqlitebrowser.org/).
To do so, I've taken one month of NYC Taxi data and tried (for many hours) to import it into sqlite3:
.mode csv <Table_Name>
.import <path/to/file/data.csv> <Table_Name>
Once that finishes, I issue the following commands:
.out <path/to/file/data.db>
select * from <table_name>;
Then, when I try to use DB Browser for SQLite to verify that the database has been populated with data, I get a prompt:
SQLCipher Encryption
Please enter the key used to encrypt the database
Why is it getting auto-encrypted? Is there another way to get my csv file into a database?

The message means that the file is not recognized as a database file. This can happen if the file is encrypted.
But in this case, the file written by .output contains exactly what would otherwise be printed on the screen, i.e. the query results as plain text. It is not a database file at all.
To get a copy of the entire database file, use .backup.
To get a copy of a single table, use .dump tablename, then execute those SQL statements in a new database:
sqlite3 data.db < file_generated_by_dump
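If the goal is simply to get the CSV into a database file that DB Browser for SQLite can open, you can also let the sqlite3 shell create the file directly instead of redirecting text output; a minimal sketch, reusing the placeholder paths from the question and a hypothetical table name taxi_trips:
sqlite3 <path/to/file/data.db>
.mode csv
.import <path/to/file/data.csv> taxi_trips
.quit
Opening the resulting data.db in DB Browser for SQLite should then work without any SQLCipher prompt.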

Related

How to generate Insert statement from PGAdmin4 Tool?

We are writing a new application, and while testing it we will need a bunch of dummy data. I've added that data by using MS Access to dump Excel files into the relevant tables in the Postgres database.
What should I do now to generate INSERT statements from the PGAdmin4 tool, similar to the way SQL Studio lets us generate INSERT statements for SQL Server? There are no options available to me. I can't use the closest one, which is to export and import the data via CSV.
I understand that you cannot import the CSV file into the actual DB, as this needs to be done through ASP.NET Core EF. However, you can create a test schema and import the CSV file into that test schema. Once you have the data imported into the test schema, you can use it to generate the SQL statements with the steps below:
Right click on the target table and select "Backup".
Select a file path to store the backup. You can save the file name as data.backup.
Choose "Plain" as the Format.
Open the "Options" tab and check "Use Column Inserts".
Click the Backup button.
Once the file is generated, you can open it with Notepad++ or VSCode to get the SQL INSERT statements.
You can use the generated statements and then delete the test schema you created.
Here is a resource that might help you load data from an Excel file into PostgreSQL if you still need to take this path: Transfer Data from Excel to PostgreSQL
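If you prefer the command line, the same plain-format, column-insert dump can be produced with pg_dump (which is what the pgAdmin backup dialog runs under the hood); a minimal sketch, where the database and table names (mydb, test_schema.users) are placeholders:
pg_dump -U postgres -d mydb -t test_schema.users --column-inserts -f data.backup
The resulting file contains one INSERT statement per row, just like the "Use Column Inserts" option in the GUI steps above.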

Load database into PostgreSQL from text file

ALL,
I have a text file which contains multiple SQL statements, like:
CREATE TABLE a();
CREATE TABLE b();
INSERT INTO a() VALUES();
INSERT INTO b() VALUES();
This file is generated from the SQLite database.
What I'd like to do is load this file into a PostgreSQL database. I have already created the database on the server, and now I want to populate it with the structure and the data.
The whole DB structure is contained in this one file.
Is it possible to just load this file into PostgreSQL? Or will I have to split the file and then manually create all the tables and issue a "LOAD" command?
Thank you.
pg_restore will not work unless the source file was produced by pg_dump.
Your best bet is to fire this off as a .sql file when connecting to the database.
e.g.
psql -d [database] -U [user] -f [file from SQLite].sql
As long as the commands in the file are executable and the syntax works with Postgres, this will create your objects and populate them.
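If you want the whole file to either apply cleanly or not at all, psql can also run it in a single transaction and stop at the first error; a minimal sketch, with the database, user and file names as placeholders:
psql -d mydb -U myuser -v ON_ERROR_STOP=1 -1 -f file_from_sqlite.sql
That way, any SQLite-specific syntax that Postgres rejects rolls the whole load back instead of leaving the database half populated.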

What is the best way to archive a postgres database?

Here are the details:
The database has to be archived such that records older than 6 months can be copied to a new database and deleted from the main (production) database. The complexity here is copying all rows across all tables that reference each other. After that, the copied rows from some of the tables (which are really huge and whose data is no longer needed) will be deleted.
The postgres database is an Amazon RDS instance.
What is the best way to achieve this?
I was thinking either a Spring Boot application
OR
have postgresql.conf invoke a shell script which invokes a SQL batch.
For the second approach, I am not sure how to edit an Amazon RDS postgresql.conf file and where to specify the shell script. Where would the SQL batch be written? This is a little new to me; I'd appreciate any pointers.
It will be much faster if you do everything server side instead of using a Spring Boot application. The problem is not the dump/restore itself, which you could easily do with the pg_dump utility or with
psql -d dbname -t -A -F";" -c "SELECT * FROM yourdata WHERE cutdate<=current_timestamp-interval '6 months'" > output.csv
The problem is that you have to guarantee that everything that is exported is loaded into the second database, and that you do not delete anything that has not been exported.
I would first SELECT a subset of primary keys into a temporary table. Then use the server-side COPY command to export the preselected rows (and all their dependencies):
COPY (SELECT d.* FROM yourdata d INNER JOIN temporal t ON d.pk = t.pk) TO '/tmp/yourdata.csv' WITH CSV DELIMITER ',';
After all the export files have been generated:
DELETE FROM yourdata WHERE pk IN (SELECT pk FROM temporal);
Then on the backup database do:
COPY yourdata(column1,column2,column3) FROM '/tmp/yourdata.csv' DELIMITER ',' CSV;
You can write a script that invokes all of those commands server side using the psql command-line tool and, as a last step, moves the CSV files to a permanent location, just in case something goes wrong and you need to process them again.
See Save PL/pgSQL output from PostgreSQL to a CSV file and How to import CSV file data into a PostgreSQL table?
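A minimal sketch of such a script, assuming the placeholder names used above (proddb, archivedb, yourdata, pk, cutdate) and that the server-side COPY is allowed to write to /tmp; on a managed instance like RDS you would swap COPY for psql's client-side \copy instead:

#!/bin/sh
# Export and delete on the production database in one psql session,
# stopping at the first error so nothing is deleted if the export fails.
psql -d proddb -v ON_ERROR_STOP=1 <<'SQL'
CREATE TEMP TABLE temporal AS
  SELECT pk FROM yourdata
  WHERE cutdate <= current_timestamp - interval '6 months';
COPY (SELECT d.* FROM yourdata d INNER JOIN temporal t ON d.pk = t.pk)
  TO '/tmp/yourdata.csv' WITH CSV DELIMITER ',';
DELETE FROM yourdata WHERE pk IN (SELECT pk FROM temporal);
SQL

# Load the exported rows into the archive database, then keep the CSV
# in a permanent location (the destination directory is a placeholder).
psql -d archivedb -c "COPY yourdata FROM '/tmp/yourdata.csv' WITH CSV DELIMITER ','"
mv /tmp/yourdata.csv /var/archive/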

How to import file into oracle table

I have a .tbl file with data and I'm trying to import this data into a table. I'm using SQL Developer for this with this command:
load data infile "C:\path\users.tbl"
insert into table users fields terminated by "|" lines terminated by "\r\n";
But nothing happens: the data is not loaded and no errors are shown... Do you see why it's not working?
That looks like SQL*Loader syntax.
For that to work, you'd have to run SQL*Loader, which is a separate command-line program available in your ORACLE_HOME/bin directory.
If you don't have an ORACLE_HOME, you'll need to install the client. Then open a shell/cmd window, and run your command there.
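For example, the directives from the question could go into a SQL*Loader control file, which you then run with sqlldr from that shell/cmd window; a minimal sketch, where the connect string and the column list are placeholders you would adjust to your own table:

-- users.ctl (the column names below are hypothetical)
LOAD DATA
INFILE 'C:\path\users.tbl'
INSERT INTO TABLE users
FIELDS TERMINATED BY '|'
(user_id, user_name, email)

Run it with:
sqlldr userid=scott/tiger@orcl control=users.ctl log=users.log
If the table already contains rows, use APPEND instead of INSERT.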
OR, if you want to use SQL Developer, you can use our wizard to read the file and insert the data, row-by-row.

.sql file dump returns empty tables

Someone has sent me a database dump as a .sql file, dumped using the phpMyAdmin interface. I am trying to restore the dump using the mysql command prompt, but I keep getting empty tables. The .sql file creates a database before creating tables and populating them. When the empty-tables message first showed up, I thought it was because the database had to be created before running the script, so I created the DB and ran the script again; however, the tables still show up as an empty set.
I tried these steps:
logged in as root.
create database x (this is the name of the db in the create db command in the .sql file)
mysql x -u root -p < my_x_db.sql
logged in as root
show databases
use x
show tables -- empty set
What should I do different and how can I troubleshoot this?
Thanks
There was a CREATE DATABASE statement in the .sql file. I solved this by simply commenting out that statement, since I had already created a database with the same name externally. (With the database already existing, that CREATE DATABASE most likely failed and stopped the rest of the script from running, which is why the tables stayed empty.)
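In other words, either let the dump create the database itself or skip its CREATE DATABASE statement; a minimal sketch of both options from the shell, assuming the file and database names from the question (and that the dump contains its own USE statement in the first case):

# Option 1: don't pre-create the database; let the dump's own
# CREATE DATABASE / USE statements take effect.
mysql -u root -p < my_x_db.sql

# Option 2: keep the pre-created database x and let mysql continue
# past the error raised by the duplicate CREATE DATABASE statement.
mysql -u root -p --force x < my_x_db.sql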
