I am able to read parameters from a local config file in Snowflake (using SnowSQL). But in the production environment, the SQL will run in an automated manner (using Snowflake Tasks).
I have created a task in Snowflake which calls a stored procedure. The stored procedure takes a few parameters which I want to read from a config file, so that the same stored procedure can be used for multiple similar use cases.
Please suggest if there is any workaround.
Reference link: https://docs.snowflake.net/manuals/user-guide/tasks-intro.html
It does say, however: "Note that a task does not support account or user parameters."
You can't read a config file from a task. The easiest way in my opinion is to put your configuration in a Snowflake table and have your Stored Procedure read any configuration from the table instead.
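For illustration, a minimal sketch of that approach using Snowflake Scripting; the table, column, and procedure names here are made up, not anything Snowflake-defined:

-- Illustrative only: table, column, and procedure names are assumptions.
CREATE TABLE IF NOT EXISTS sp_config (
    use_case    STRING,
    param_name  STRING,
    param_value STRING
);

INSERT INTO sp_config VALUES
    ('daily_load', 'target_table', 'SALES'),
    ('daily_load', 'batch_size',   '1000');

-- The stored procedure looks the values up in the table instead of reading a file.
CREATE OR REPLACE PROCEDURE run_load(p_use_case STRING)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
    target_table STRING;
BEGIN
    SELECT param_value INTO :target_table
    FROM sp_config
    WHERE use_case = :p_use_case
      AND param_name = 'target_table';
    RETURN 'Would load into ' || target_table;
END;
$$;

The task then just runs CALL run_load('daily_load'), and adding a new use case is only a few extra rows in the config table.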
I am not sure whether a stored procedure can read a config file, so I agree with the approach @SimonD suggested.
Another alternative (though a slightly more complex design) is to keep the config file in JSON format in an S3 bucket, which you can access via an external stage. Use the $1 notation together with the JSON property path to read the respective key-value pairs and inject them where needed in the stored procedure. This way, your configuration stays in JSON or text format outside Snowflake and can be managed via S3 (if you are using AWS).
I have not tried this approach, but it looks like it should work. This way, extra Snowflake access and accidental DB updates to the configuration can be avoided.
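A rough sketch of that idea, purely as an illustration: the stage, file format, bucket path, and property names below are assumptions, and the storage integration/credentials for the stage are omitted.

-- Illustrative only: stage, file, and property names are assumptions.
CREATE OR REPLACE FILE FORMAT json_ff TYPE = 'JSON';

CREATE OR REPLACE STAGE config_stage
    URL = 's3://my-bucket/config/'              -- assumed bucket/path
    FILE_FORMAT = (FORMAT_NAME = 'json_ff');    -- STORAGE_INTEGRATION / CREDENTIALS omitted

-- Query the staged file directly: $1 is the parsed JSON document,
-- and the :property path pulls out individual keys.
SELECT $1:target_table::STRING AS target_table,
       $1:batch_size::NUMBER   AS batch_size
FROM @config_stage/config.json (FILE_FORMAT => 'json_ff');

The stored procedure (or the SQL in the task) can run that query and use the returned values in the statements it builds.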
I hope this idea makes sense to you?
My company is looking at possibly migrating to Snowflake from SQL Server. From what I've read in the Snowflake documentation, flat files (CSV) can be uploaded into a stage, and then COPY INTO loads the data into a physical table.
example: put file://c:\temp\employees0*.csv @sf_tuts.public.%emp_basic;
My question is: can this be automated via a job or script within Snowflake? This includes the COPY INTO command.
Yes, there are several ways to automate jobs in Snowflake, as others have already commented. Putting your code in a stored procedure and calling it via a scheduled task is one option.
There is also a command line interface in Snowflake called SnowSQL.
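As a rough sketch of the stored-procedure-plus-task route (all object names below are assumptions), keeping in mind that the PUT upload of local files still has to run from a client such as SnowSQL, so only the COPY INTO step is scheduled inside Snowflake:

-- Illustrative only: stage, table, warehouse, and task names are assumptions.
CREATE OR REPLACE PROCEDURE load_emp_basic()
RETURNS STRING
LANGUAGE SQL
AS
$$
BEGIN
    -- Load whatever files are currently sitting in the stage into the target table.
    COPY INTO emp_basic
    FROM @emp_stage
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
    ON_ERROR = 'CONTINUE';
    RETURN 'load complete';
END;
$$;

-- Schedule it; tasks are created suspended, so resume afterwards.
CREATE OR REPLACE TASK load_emp_task
    WAREHOUSE = my_wh
    SCHEDULE  = 'USING CRON 0 2 * * * UTC'   -- every day at 02:00 UTC
AS
    CALL load_emp_basic();

ALTER TASK load_emp_task RESUME;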
I want to use the Snowflake task scheduler to clone one or all of the databases with a dynamic clone database name, something like below. Is it possible to do this without creating a stored procedure? As I have multiple databases under my account, I would prefer to clone all of them in one task.
create database xx_date
clone xx
I appreciate your response
Thanks,
"Is it possible to do it without creating a Stored Procedure"
The CREATE TASK statement syntax only allows for a single SQL statement to be specified, and the CREATE … CLONE statement syntax does not permit specifying more than one object at a time.
Given the above, this isn't possible currently. You will need to iterate over the database names from within a stored procedure call (see the sketch below). The same stored procedure can also be used to clean up older dated clones from previous task invocations.
For incorporating dates into a dynamically generated statement within the stored procedure, check out this question.
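A sketch of what such a stored procedure could look like in Snowflake Scripting; the procedure name, the date-suffix format, and the excluded databases are assumptions:

-- Illustrative only: names, suffix format, and exclusions are assumptions.
CREATE OR REPLACE PROCEDURE clone_all_databases()
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
    suffix STRING DEFAULT TO_CHAR(CURRENT_DATE(), 'YYYYMMDD');
    c1 CURSOR FOR
        SELECT database_name
        FROM information_schema.databases   -- needs a database in the session context
        WHERE database_name NOT IN ('SNOWFLAKE', 'SNOWFLAKE_SAMPLE_DATA');  -- in practice, also exclude previous clones
BEGIN
    FOR rec IN c1 DO
        LET stmt VARCHAR := 'CREATE DATABASE IF NOT EXISTS ' || rec.database_name || '_' || suffix ||
                            ' CLONE ' || rec.database_name;
        EXECUTE IMMEDIATE :stmt;
    END FOR;
    RETURN 'cloned with suffix _' || suffix;
END;
$$;

The task then only needs to run CALL clone_all_databases() on the desired schedule.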
P.S. If the underlying goal of the numerous clones is to maintain backups, also consider cross-account, cross-region and/or cross-cloud replication for better safety.
I have a requirement to load data from a .csv file into a SQL Server table inside a stored procedure. We even thought of implementing an SSIS package, but we had some limitations and are left with only a stored procedure.
I have even got this working using BULK INSERT, but it's not accepted in our project.
So I am requesting any alternate method for data loading.
Maybe something like:
https://www.toadworld.com/platforms/sql-server/b/weblog/archive/2015/02/09/t-sql-read-csv-files-using-openrowset
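Something along these lines, as a rough sketch of an OPENROWSET(BULK ...) load (the linked article may use a different OPENROWSET variant); the file path, format file, and table/column names are assumptions, and FORMAT = 'CSV' needs SQL Server 2017 or later:

-- Illustrative only: paths and names are assumptions; the column names
-- (EmployeeId, FirstName, LastName) come from the format file.
INSERT INTO dbo.Employees (EmployeeId, FirstName, LastName)
SELECT EmployeeId, FirstName, LastName
FROM OPENROWSET(
         BULK 'C:\temp\employees.csv',
         FORMATFILE = 'C:\temp\employees.fmt',  -- describes the CSV columns
         FORMAT = 'CSV',                        -- quoted-field handling (SQL Server 2017+)
         FIRSTROW = 2                           -- skip the header row
     ) AS src;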
Is this acceptable?
Due to an employee quitting, I've been given a project that is outside my area of expertise.
I have a product where each customer will have their own copy of a database. The UI for creating the database (licensing, basic info collection, etc.) is being outsourced, so I was hoping to just have a single stored procedure they can call, providing a few parameters, and have the SP create the database. I have a script for creating the database, but I'm not sure of the best way to actually execute the script.
From what I've found, this seems to be outside the scope of what an SP can easily do. Is there any sort of "best practice" for handling this sort of program flow?
Generally speaking, SQL scripts - both DML and DDL - are what you use for database creation and population. SQL Server has a command line interface called SQLCMD that these scripts can be run through - here's a link to the MSDN tutorial.
Assuming there's no customization to the tables or columns involved, you could get away with using either detach/attach or backup/restore. These would require that a baseline database exist - no customer data. Then you use either of the methods mentioned to capture the database as-is. Backup/restore is preferable because detach/attach requires the database to be taken offline. But users need to be synced before they can access the database.
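A hedged sketch of the backup/restore route driven from a stored procedure; the procedure name, backup path, logical file names, and data paths are all assumptions and would need to match your baseline database:

-- Illustrative only: names and paths are assumptions; validate @CustomerName
-- before splicing it into file paths.
CREATE PROCEDURE dbo.CreateCustomerDatabase
    @CustomerName SYSNAME
AS
BEGIN
    DECLARE @sql NVARCHAR(MAX),
            @mdf NVARCHAR(260) = N'C:\Data\' + @CustomerName + N'.mdf',
            @ldf NVARCHAR(260) = N'C:\Data\' + @CustomerName + N'_log.ldf';

    -- Restore the pre-built baseline backup under the customer's database name.
    SET @sql = N'RESTORE DATABASE ' + QUOTENAME(@CustomerName) +
               N' FROM DISK = N''C:\Backups\Baseline.bak''' +
               N' WITH MOVE N''Baseline'' TO N''' + @mdf + N''',' +
               N' MOVE N''Baseline_log'' TO N''' + @ldf + N''';';

    EXEC sys.sp_executesql @sql;
END;

The user sync mentioned above would then follow the restore.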
If you have the script to create the database, it is easy for them to use it within their program. If you have any specific prerequisites to create the database and set permissions accordingly, you can wrap up all the scripts within one script file to execute.
I need to test several different processes for the application we're building. Each process requires a particular table in our database to have data, and all of these tables have foreign key constraints from other tables as well.
I've written SQL scripts that populate the table I'm interested in as well as its dependencies, but it turns out that in a few of these scripts I've duplicated a lot of code when populating the dependency tables.
I would like to take out the duplicated code and put it in a separate script, but I don't know how, if it's even possible, to execute a SQL script from within another one.
An important part of all of this would also be to be able to get the @@IDENTITY value in the calling script from the called one.
Any help will be greatly appreciated.
Best regards.
Clarification: by script I mean a file saved on disk. I don't want to be creating and deleting temporary stored procedures for this.
When I hear the word "script", I think of a file containing a series of commands; if you're asking how to get SQL Server to load a file of commands from another file of commands, I'm not sure of an easy way to do that.
If you can save your duplicate code as a stored procedure, you can certainly call a stored procedure from another stored procedure within SQL Server. You could then pass in a parameter holding the @@IDENTITY value (and you may want to look at SCOPE_IDENTITY() instead).
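For example, a minimal sketch of that pattern; the table, column, and procedure names are invented for illustration:

-- Illustrative only: table, column, and procedure names are made up.
CREATE PROCEDURE dbo.InsertDependencyRow
    @Name  NVARCHAR(100),
    @NewId INT OUTPUT
AS
BEGIN
    INSERT INTO dbo.Dependency (Name) VALUES (@Name);
    SET @NewId = SCOPE_IDENTITY();  -- unlike @@IDENTITY, not affected by triggers
END;
GO

-- The calling script (or another procedure) captures the new id:
DECLARE @Id INT;
EXEC dbo.InsertDependencyRow @Name = N'shared setup row', @NewId = @Id OUTPUT;
INSERT INTO dbo.MainTable (DependencyId) VALUES (@Id);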
HTH,
Stu