I have an application that used to run on Snowflake but was migrated to a SQL Managed Instance. I used SnowHub to export all the DDL and then zipped it up in case I ever need to look up some source code for a deprecated report or other process.
But how do I dump the entire database to a compressed file? You never know when an audit might come up and I might need to restore the data to the state it was in a few months or years ago.
You have to use data unloading to export the data from your tables to an internal/external stage in your preferred format.
https://docs.snowflake.com/en/user-guide/data-unload-overview.html
Unfortunately, you cannot unload the whole database with one statement; you need to run one COPY INTO per table. As an alternative, you could write a procedure that loops over your table metadata and runs a COPY INTO per iteration.
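For reference, a rough sketch of such a procedure in Snowflake Scripting; the database name MY_DB and the internal stage @db_backup_stage are placeholders and would have to exist first:

CREATE OR REPLACE PROCEDURE dump_all_tables()
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  -- every base table in the (placeholder) database MY_DB
  c1 CURSOR FOR
    SELECT table_schema, table_name
    FROM my_db.information_schema.tables
    WHERE table_type = 'BASE TABLE';
BEGIN
  FOR rec IN c1 DO
    -- one compressed CSV export per table, written to a folder per table
    EXECUTE IMMEDIATE
      'COPY INTO @db_backup_stage/' || rec.table_schema || '/' || rec.table_name || '/'
      || ' FROM my_db.' || rec.table_schema || '.' || rec.table_name
      || ' FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP) HEADER = TRUE OVERWRITE = TRUE';
  END FOR;
  RETURN 'done';
END;
$$;

CALL dump_all_tables();

From there you can download the gzipped files from the internal stage with GET (e.g. via SnowSQL) and archive them wherever you keep your audit copies.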
Related
My company is looking to possibly migrate to Snowflake from SQL Server. From what I've read in the Snowflake documentation, flat files (CSV) can be uploaded to a stage and then loaded into a physical table with COPY INTO.
example: put file://c:\temp\employees0*.csv @sf_tuts.public.%emp_basic;
My question is, can this be automated via a job or script within Snowflake? This includes the COPY INTO command.
Yes, there are several ways to automate jobs in Snowflake, as others have already commented. Putting your code in a stored procedure and calling it via a scheduled Task is one option.
There is also a command-line client for Snowflake called SnowSQL.
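For example, a rough sketch of the Task approach, assuming a stored procedure LOAD_EMP_BASIC() and a warehouse LOAD_WH already exist (both names are made up). Note that PUT uploads local files from a client such as SnowSQL, so the upload step itself is usually scheduled outside Snowflake (e.g. a SnowSQL script run by the OS scheduler), while the COPY INTO can live in the procedure called by the Task:

-- warehouse, task and procedure names below are placeholders
CREATE OR REPLACE TASK load_emp_basic_task
  WAREHOUSE = load_wh
  SCHEDULE = 'USING CRON 0 2 * * * UTC'  -- every day at 02:00 UTC
AS
  CALL load_emp_basic();

-- tasks are created in a suspended state; resume to start the schedule
ALTER TASK load_emp_basic_task RESUME;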
Is there any clever way to get my data from a MySQL database into Snowflake?
I found two possible ways so far:
Option 1: Put a Snowpipe on top of the MySQL database and have the pipeline convert the data automatically.
Option 2: Convert the tables manually into CSV files, store them locally, and load them via a stage into Snowflake.
To me it seems strange to convert every table into a CSV first. Can I not just push a SQL dump file to Snowflake? Can I also schedule some reload task in Snowflake, so that either option 1 or 2 gets triggered automatically?
Best
NicBeC24
I found some very good information regarding MySQL-to-Snowflake migrations here: https://hevodata.com/blog/mysql-to-snowflake-data-migration-steps/
The main steps from the webpage above are:
Exporting the data from MySQL
Taking care of the data type mappings
Staging your files in Snowflake (internal/external stage)
Copying the staged files into the table
If the SQL dump is just a .sql file with ANSI-compatible statements, then yes, of course you can copy & paste it into your Snowflake worksheet and execute it there.
Regarding scheduling: yes, Snowflake has a feature called Tasks: https://docs.snowflake.com/en/user-guide/tasks-intro.html You can use them to schedule your COPY INTO command.
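A sketch of the stage-and-copy part, with made-up names for the file format, the stage and the table; the PUT line is run from a client such as SnowSQL rather than from the web UI:

-- file format, stage and table names are placeholders
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1;

CREATE OR REPLACE STAGE mysql_export_stage
  FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');

-- from SnowSQL, upload the CSV exported from MySQL (example path):
-- PUT file:///tmp/customers.csv @mysql_export_stage;

COPY INTO customers
  FROM @mysql_export_stage
  FILE_FORMAT = (FORMAT_NAME = 'my_csv_format')
  PATTERN = '.*customers.*';

The COPY INTO statement is also the part you would wrap in a Task if you want the reload to run on a schedule.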
During our SQL Server database deployments, we create a temporary table which contains the new desired state of data for a particular table. We then merge the temp table into the target table (we actually use individual insert, update and delete statements, but that's probably not relevant). The inserts/updates/deletes performed are captured and written out to a log.
We would like to be able to report on what changes would be applied by a deployment, without actually applying them. This is currently done by rolling back the transaction at the end of the above process. This doesn't feel particularly great though.
What we are now thinking of doing, instead of performing the changes and rolling them back, is generating a migration script for the table (some SQL code that performs the necessary inserts, updates and deletes). If we want to do the actual deployment, this code will be executed dynamically. If not, it will just be printed to a log.
It shouldn't take long to put together some code which can generate migration scripts for two specified tables, but I first wanted to verify that there isn't already an existing tool which can do this?
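For illustration, something along these lines is roughly what I have in mind, with a made-up key column Id and a single data column Value (the real code would be generated from the table's actual columns, and NULL handling is omitted):

-- #DesiredState and dbo.TargetTable are made-up names
-- rows to insert (in the temp table but not in the target)
SELECT 'INSERT INTO dbo.TargetTable (Id, Value) VALUES (' + CAST(s.Id AS varchar(20)) + ', ''' + REPLACE(s.Value, '''', '''''') + ''');'
FROM #DesiredState AS s
LEFT JOIN dbo.TargetTable AS t ON t.Id = s.Id
WHERE t.Id IS NULL
UNION ALL
-- rows to update (key matches but the value differs)
SELECT 'UPDATE dbo.TargetTable SET Value = ''' + REPLACE(s.Value, '''', '''''') + ''' WHERE Id = ' + CAST(s.Id AS varchar(20)) + ';'
FROM #DesiredState AS s
JOIN dbo.TargetTable AS t ON t.Id = s.Id
WHERE t.Value <> s.Value
UNION ALL
-- rows to delete (in the target but not in the temp table)
SELECT 'DELETE FROM dbo.TargetTable WHERE Id = ' + CAST(t.Id AS varchar(20)) + ';'
FROM dbo.TargetTable AS t
LEFT JOIN #DesiredState AS s ON s.Id = t.Id
WHERE s.Id IS NULL;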
Searching on Google, I can find lots of talk about migrating whole databases, but nothing about generating a data migration script to effectively merge one table into another.
So my question is, does anyone know of such a tool?
There are several data compare tools like:
SQL Data Compare from Red Gate
SQL Server Data Tools
dbForge Data Compare from Devart
Is that what you're looking for?
Is it possible to migrate data from one database table into another database table using Liquibase?
Right now we are running Liquibase changesets on two different databases, as we have two executions in the Maven POM file. But is it possible to write one changeset which selects data from one database table and copies it to another database table?
You could use your preferred scripting language to query the data from the table and generate INSERT statements from the result. Once you have the INSERT statements, put them in a Liquibase formatted SQL file and run it against your target database.
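A sketch of what such a file could look like; the lookup table and the changeset author/id are made up, and the --rollback line is optional:

--liquibase formatted sql

--changeset migration:copy-lookup-data
-- INSERT statements generated from the source database go here (lookup_codes is a made-up example table)
INSERT INTO lookup_codes (id, code, description) VALUES (1, 'A', 'Alpha');
INSERT INTO lookup_codes (id, code, description) VALUES (2, 'B', 'Beta');
--rollback DELETE FROM lookup_codes WHERE id IN (1, 2);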
Ideally these files would already have been created when the data was originally inserted. If your database already existed before you started using Liquibase, then it may be a good idea to restore from a backup taken on the day you started using Liquibase and sync it from there.
I have a database in SQL Server 2008 (not R2)
A third party has the job of replacing the database regularly by restoring the data in the live environment from a .bak file made in the development environment. This leads to the destruction of any user generated data in that database. I am restricted in the live environment and cannot have two databases there.
One solution I am thinking about is to write a stored procedure that could somehow save the user-generated data to some kind of local file, and then, once the development .bak is restored, a second stored procedure could write this data back from the local file.
I'm familiar with using Generate Scripts to produce a .sql file, so maybe something similar to that, but it needs to be generated from a SQL query that returns only the user-generated data (these are specific rows of certain tables joined together - not the best design, but it's what I have to work with).
Is it possible to generate a SQL script from a SQL query? Or is there some other kind of local file storage I could use? Something like a CSV file would be OK, but I'm not aware of an easy way to automate restoring it. It will need to be restored with some very specific SQL queries.
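For what it's worth, the restore side is where I'm fuzzy. Would something like this be a reasonable shape, assuming the user rows had been exported to a CSV beforehand (e.g. with bcp queryout)? All table, column and path names below are made up:

-- after the .bak restore: pull the saved CSV into a temp table...
CREATE TABLE #SavedUserData (OrderId int, Note nvarchar(4000));

BULK INSERT #SavedUserData
FROM 'C:\Backups\UserData.csv'   -- example path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- ...then reinsert with the specific queries, e.g. only rows whose parents still exist
INSERT INTO dbo.UserNotes (OrderId, Note)
SELECT s.OrderId, s.Note
FROM #SavedUserData AS s
JOIN dbo.Orders AS o ON o.OrderId = s.OrderId;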