I would like to drop all pipes in a Snowflake schema that match a pattern.
You can show pipes that match a pattern as shown here.
Example: show pipes like '%NAME_LIKE_THIS%' in MY_DB.MY_SCHEMA
However, it doesn't appear that a similar functionality exists for drop pipe.
I'm thinking of creating a stored procedure that takes pattern and schema parameters, iterates through each matching pipe, and drops it, but I'm hoping there's a better/easier way.
Thank you in advance.
You can use a SQL generator to do this.
show pipes like '%NAME_LIKE_THIS%' in MY_DB.MY_SCHEMA;
select 'drop pipe MY_DB.MY_SCHEMA.' || "name" || ';' as SQL_COMMAND from table(result_scan(last_query_id()));
If you want to automate dropping the pipes, you can write a stored procedure that loops through the generated statements, executing them one at a time.
If you don't want to write a custom stored procedure, here's a stored procedure I wrote to execute the commands from a SQL generator one at a time:
https://support.snowflake.net/s/article/Executing-Multiple-SQL-Statements-in-a-Stored-Procedure
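If you'd rather keep it all in one call, the loop can also be sketched as a Snowflake JavaScript stored procedure. This is a minimal sketch, not the linked article's code; the procedure name drop_pipes_like is made up, and it assumes the caller's role has privileges to show and drop the pipes:

```sql
-- Hypothetical helper: drops every pipe in a schema whose name matches a pattern.
create or replace procedure drop_pipes_like(pattern string, db_schema string)
  returns string
  language javascript
  execute as caller
as
$$
  var dropped = [];
  // SHOW PIPES accepts a LIKE pattern and an IN <db>.<schema> scope.
  snowflake.execute({sqlText: "show pipes like '" + PATTERN + "' in " + DB_SCHEMA});
  var rs = snowflake.execute({
    sqlText: 'select "name" from table(result_scan(last_query_id()))'
  });
  while (rs.next()) {
    var name = rs.getColumnValue(1);
    snowflake.execute({sqlText: "drop pipe " + DB_SCHEMA + "." + name});
    dropped.push(name);
  }
  return "Dropped: " + dropped.join(", ");
$$;

-- call drop_pipes_like('%NAME_LIKE_THIS%', 'MY_DB.MY_SCHEMA');
```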
Related
I need to change the collation for quite a lot of tables, including their columns. I already wrote a SQL statement which generates these ALTER statements for me. I wanted to create a stored procedure to automate this process. I created a table to store the generated statements and want to execute them row by row. How can I do that?
Thanks in advance
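One common approach is a cursor that feeds each stored statement to sp_executesql. This is a sketch assuming SQL Server; the table dbo.CollationStatements and its Stmt column are hypothetical names, so substitute your own:

```sql
-- Hypothetical table/column names; adjust to match your schema.
DECLARE @sql nvarchar(max);

DECLARE stmt_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT Stmt FROM dbo.CollationStatements;

OPEN stmt_cursor;
FETCH NEXT FROM stmt_cursor INTO @sql;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_executesql @sql;  -- run one generated ALTER statement
    FETCH NEXT FROM stmt_cursor INTO @sql;
END
CLOSE stmt_cursor;
DEALLOCATE stmt_cursor;
```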
I have a Pentaho transformation with three steps:
Table input. It works perfectly.
Execute SQL script, in which I execute a SQL Server procedure. This procedure returns four tables, but I am not able to capture them.
Text file output. Here I would like to save the contents of the tables returned above.
My main problem is that my partners have developed a procedure that returns some tables, when clearly this is a misunderstanding of the procedure concept. It should be a table valued function.
Despite this, I have to fix this somehow, so I am asking for advice. Does anyone know how to catch the tables?
Greetings.
I have tried this in the past and I believe it is not possible (although I can't seem to find the docs at the moment). Even if you could achieve this, how would you tell Pentaho what to do with each of the separate tables?
My solution at the time was to alter the proc so that only a single table was returned. Either by splitting the proc into several procs or combining the output into a single larger table with a parameter to split out in Pentaho.
We currently have a process that calls SQLCMD in a shell script and writes the results of a stored procedure to a text log file. The stored procedure does multiple UPDATE, INSERT, and SELECT statements, and we capture all the messages and results to a text file, partly because a SELECT statement shows the table both before and after it is updated. Now we are converting to SSIS and would like to capture both the results and messages in a text file.
I have 2 questions: Is there a way to do this in SSIS without calling SQLCMD, perhaps using an Execute SQL or Data Flow task? If not, what is the best practice for capturing changes? (I see that I need Enterprise Edition for Change Data Capture, so that doesn't work for us.)
Edit (more explanation):
I have a stored procedure that does 10 updates in a row. Before each update I want to see what the table looks like for that specific update query, by selecting the data with the same parameters as the update. Each update does something different, but one may do something to a record that I did not expect; this would let me pinpoint the exact problem. The best idea suggested is triggers: although they may be slow, they can be set up to capture the changes that I need.
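The trigger idea above can be sketched as a simple audit trigger that records before and after values on every UPDATE. All names here (dbo.MyTable, its Id key and SomeColumn, and the audit table) are hypothetical placeholders:

```sql
-- Hypothetical audit table to hold before/after values.
CREATE TABLE dbo.MyTable_Audit (
    AuditId   int IDENTITY PRIMARY KEY,
    Id        int,
    OldValue  nvarchar(100),
    NewValue  nvarchar(100),
    ChangedAt datetime2 DEFAULT SYSDATETIME()
);
GO
CREATE TRIGGER trg_MyTable_Update ON dbo.MyTable
AFTER UPDATE
AS
BEGIN
    -- deleted holds the rows as they were before the update; inserted, after.
    INSERT INTO dbo.MyTable_Audit (Id, OldValue, NewValue)
    SELECT d.Id, d.SomeColumn, i.SomeColumn
    FROM deleted d
    JOIN inserted i ON i.Id = d.Id;
END
GO
```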
I would like to be able to query with multiple arguments of the same type (for example, several IDs, just to keep the example simple) so I only have to execute a procedure once instead of once per individual ID. Where my single-instance proc returns, say, a name, my get-all proc would return a single-column table of names.
What I have now:
EXEC MyProc(123);
EXEC MyProc(456);
EXEC MyProc(789);
What I would like:
-- Square brackets aren't correct syntax;
-- they just represent a list that contains x number of IDs
EXEC MyProc([123, 456, 789]);
Can I do this, and if so, is there an easy mechanism for handling such a thing that doesn't involve cursors and various over-complicated things? Would this even be considered a good idea?
To execute the proc only once, you'll have to refactor it to work with multiple IDs, as there is no T-SQL function or syntactic sugar that will do this for you.
If this is to be variadic, in that there may be one or many IDs, you'll have to pass multiple IDs to your proc in one parameter. Passing an array of sorts like this is easier in more recent versions of SQL Server.
For example, you can try passing:
TVPs in SQL Server 2008+
delimited strings that are then split in the proc
xml which is then parsed in the proc
a table name which is then read by the proc dynamically
a table name which is known by both the proc and the caller beforehand
A quick search for passing arrays in SQL Server will yield more results; among the best of them is Arrays and Lists in SQL Server, as mentioned by @Andomar.
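Of the options above, the TVP is usually the cleanest on SQL Server 2008+. This is a sketch under assumed names (dbo.IdList, dbo.MyProcMulti, and the source table dbo.People are all hypothetical):

```sql
-- Define a reusable table type for the list of IDs.
CREATE TYPE dbo.IdList AS TABLE (Id int PRIMARY KEY);
GO
CREATE PROCEDURE dbo.MyProcMulti
    @Ids dbo.IdList READONLY   -- table-valued parameters must be READONLY
AS
BEGIN
    SELECT p.Name
    FROM dbo.People p          -- hypothetical source table
    JOIN @Ids i ON i.Id = p.Id;
END
GO
-- The caller fills the type and passes it in a single EXEC:
DECLARE @ids dbo.IdList;
INSERT INTO @ids (Id) VALUES (123), (456), (789);
EXEC dbo.MyProcMulti @Ids = @ids;
```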
Our team just experienced for the first time the hassle of not having version control for our DB. How can we add stored procedures at the very least to version control? The current system we're developing relies on SPs mainly.
Background: I develop a system that has almost 2000 stored procedures.
The critical thing I have found is to treat the database as an application. You would never open an EXE with a hex editor directly and edit it. The same with a database; just because you can edit the stored procedures from the database does not mean you should.
Treat the copy of the stored procedure in source control as the current version. It is your source code. Check it out, edit it, test it, install it, and check it back in. The next time it has to be changed, follow the same procedure. Just as an application requires a build and deploy process, so should the stored procedures.
The code below is a good stored procedure template for this process. It handles both cases of an update (ALTER) or new install (CREATE).
IF EXISTS(SELECT name
FROM sysobjects
WHERE name = 'MyProc' AND type = 'P' AND uid = '1')
DROP PROCEDURE dbo.MyProc
GO
CREATE PROCEDURE dbo.MyProc
AS
GO
However, the following sample is better in situations where you control access to the stored procedures, because the DROP-CREATE method loses GRANT information.
IF NOT EXISTS(SELECT name
FROM sysobjects
WHERE name = 'MyProc' AND type = 'P' AND uid = '1')
CREATE PROCEDURE dbo.MyProc
AS
PRINT 'No Op'
GO
ALTER PROCEDURE dbo.MyProc
AS
GO
In addition, creating a process to build the database completely from source control can help in keeping things controlled.
Create a new database from source control.
Use a tool like Red Gate SQL Compare to compare the two databases and identify differences.
Reconcile the differences.
A cheaper solution is to simply use the "Script As" functionality of SQL Server Management Studio and do a text compare. However, this method is very sensitive to the exact way SSMS formats the extracted SQL.
I'd definitely recommend a third-party tool that integrates into SSMS. Apart from SQL Source Control, mentioned above, you can also try SQL Version from Apex.
The important thing is to make this really easy for developers if you want them to use it, and the best way is to use a tool that integrates into SSMS.
The second solution from @Darryl didn't work, as pointed out by @Moe. I modified @Darryl's template and got it working, and thought it would be nice to share it with everybody.
IF NOT EXISTS(SELECT name FROM sysobjects
WHERE name = '<Stored Proc Name>' AND type = 'P' AND uid = '1')
EXEC sp_executesql N'CREATE PROCEDURE dbo.<Stored Proc Name>
AS
BEGIN
select ''Not Implemented''
END
'
GO
ALTER PROCEDURE dbo.<Stored Proc Name>
AS
BEGIN
--Stored Procedure Code
END
This is really nice because I don't lose my stored procedure permissions.
I think it's good to have each stored procedure scripted to a separate .sql file and then just commit those files into source control. Any time a sproc is changed, update the creation script - this gives you full version history on a sproc by sproc basis.
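If you want to seed source control from an existing database, the current definitions can be pulled from the catalog views; this is a sketch against SQL Server system views (writing each definition out to its own .sql file is left to whatever scripting you use):

```sql
-- One row per stored procedure: schema, name, and full CREATE text.
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS SchemaName,
       OBJECT_NAME(m.object_id)        AS ProcName,
       m.definition                    AS CreateScript
FROM sys.sql_modules AS m
JOIN sys.procedures  AS p ON p.object_id = m.object_id
ORDER BY SchemaName, ProcName;
```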
There are SQL Server source control tools that hook into SSMS, but I think they just script the db objects and commit those scripts. Red Gate looks to be due to release such a tool this year, for example.
We just add the CREATE statement to source control in a .sql file, e.g.:
-- p_my_sp.sql
CREATE PROCEDURE p_my_sp
AS
-- Procedure
Make sure that you only put one SP per file, and that the filename exactly matches the procedure name (this makes it much easier to find the procedure in source control).
You then just need to be disciplined about not applying a stored procedure to your database that hasn't come from source control.
An alternative would be to save the SP as an ALTER statement instead - this has the advantage of making it easier to update an existing database, but means you need to do some tweaking to create a new empty database.
I've been working on this tool http://timabell.github.com/sqlHawk/ for exactly that purpose.
The way to ensure no one forgets to check in their updated .sql files is to make your build server force the staging and live environments to match source control ;-) (which this tool will assist you with).