I have been writing a CLR stored procedure that moves data from one database to another. I went with a CLR stored procedure because I like the .NET Framework's ability to connect to remote servers better than linked servers or OPENROWSET, but I now find that my class is mostly embedded SQL strings. I was considering using the CLR stored procedure just to retrieve the data onto the local SQL Server, and then using a regular T-SQL stored procedure for the actual inserts and updates.
I'm not worried about pre-compilation of the procedure or performance, and I do like that the CLR procedure allows me to see all of the logic in one place, read from top to bottom.
Are there any reasons I should consider moving to a TSQL solution instead of CLR?
Thanks.
There are multiple reasons why you would stick to a regular stored procedure. I'll try to give you an overview of the ones that I know of:
Performance.
Memory issues. SQL Server only operates within its own max-memory settings; CLR allocations fall outside that bound, which could compromise other applications (and the OS) running on the server.
Updateability. You can update a stored procedure with a simple script; CLR assemblies are more complicated to update.
Security. CLR assemblies often require more security settings than regular T-SQL.
As a general rule you only want to use CLR for:
interaction with the OS, such as reading from a file or dropping a message in MSMQ
performing complex calculations, especially when you already have the code written in a .NET language to do the calculation.
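To make the updateability and security points concrete, here is a minimal T-SQL sketch; the procedure, assembly, and path names are hypothetical:

-- Updating a plain T-SQL procedure is a one-script change:
ALTER PROCEDURE dbo.usp_MoveData
AS
BEGIN
    SET NOCOUNT ON;
    -- ... inserts/updates would go here ...
    SELECT 1 AS Placeholder;
END;
GO

-- A CLR procedure first needs CLR enabled on the instance ...
EXEC sp_configure 'clr enabled', 1;
RECONFIGURE;
GO

-- ... and changing it means rebuilding and redeploying the whole assembly:
ALTER ASSEMBLY MoveDataAssembly FROM 'C:\build\MoveDataAssembly.dll';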
I have a stored procedure that is called from a c# application.
The transaction is started and commited/rolledback from this c# application.
The stored procedure can do some inserts/updates in various tables, which will all be commited or rolledback by the calling application.
The problem is that the stored procedure also insert records into a logtable, which must survive the rollback.
What is the best way of doing this ?
I think I remember that a company I worked for long ago had solved this by creating a stored procedure for the logging, and that stored procedure had some exotic statements that made it work outside the transaction, something like that. As I said, it was a long time ago, so I may be remembering this wrong.
Some time ago, I developed a tool that logged stored procedure execution to a database table. The tool was written as a C# assembly deployed into the database server, with various SQL procedures and functions linked to its C# entry points.
To allow a rollback without losing the events already logged, the C# assembly had to use a fully defined connection string to connect to its own database server (a SERVERNAME\INSTANCE server parameter instead of the local context connection), so that the logging ran on a separate connection outside the caller's transaction.
This is perhaps the solution used by your previous company.
However, there are some disadvantages:
those connections are qualified as "external", and the database's TRUSTWORTHY parameter must be set to ON to allow the code to execute if the assembly is not signed (see the sketch after this list)
this solution is not supported on cloud-hosted databases (AWS RDS or Azure)
a new connection is created by the C# methods
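As a rough illustration of the first point, deploying such an assembly typically looks something like this (the database, assembly, and path names are hypothetical):

USE MyDb;
GO
-- Required for unsigned assemblies that open their own connections:
ALTER DATABASE MyDb SET TRUSTWORTHY ON;
GO
CREATE ASSEMBLY LoggingToolbox
FROM 'C:\build\LoggingToolbox.dll'
WITH PERMISSION_SET = EXTERNAL_ACCESS;  -- needed for the non-context connection
GO
CREATE PROCEDURE dbo.usp_LogEvent
    @Message nvarchar(4000)
AS EXTERNAL NAME LoggingToolbox.[LoggingToolbox.StoredProcedures].LogEvent;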
For this last reason, and for a customer's needs, I rewrote the toolbox in 100% T-SQL.
I've just written a response that may be useful; see: https://stackoverflow.com/a/32988757/1183297
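For anyone looking for the pure T-SQL angle: one well-known trick is that table variables are not affected by ROLLBACK, so when the rollback happens inside T-SQL scope you can buffer log rows in a table variable and copy them to the real log table after rolling back. This does not help when the rollback is issued by the calling application, as in the original question, which is why the separate-connection approach exists. A minimal sketch, assuming a hypothetical dbo.LogTable:

DECLARE @log TABLE (LoggedAt datetime DEFAULT GETDATE(), Message nvarchar(4000));

BEGIN TRAN;
BEGIN TRY
    INSERT INTO @log (Message) VALUES (N'starting work');
    -- ... real inserts/updates here ...
    RAISERROR('forced failure', 16, 1);  -- simulate an error
    COMMIT;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK;         -- @log keeps its rows through this
    INSERT INTO dbo.LogTable (LoggedAt, Message)
    SELECT LoggedAt, Message FROM @log;
END CATCH;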
I have a database with many stored procedures that execute on the first of the month. They read from some tables based on rules and insert calculated results into another table. There is a huge number of queries. Is there a better solution than stored procedures?
SQL Server 2008
If you really wanted to, you could write a service (e.g. a Windows service) to do the work, but I would suggest stored procedures are probably best. This of course depends on what the procedures do.
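If you keep the stored procedures, SQL Server Agent can handle the first-of-month scheduling. A rough sketch, with hypothetical job, procedure, and database names:

EXEC msdb.dbo.sp_add_job @job_name = N'MonthlyCalculation';

EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'MonthlyCalculation',
    @step_name = N'Run calculation procs',
    @subsystem = N'TSQL',
    @database_name = N'MyDb',
    @command = N'EXEC dbo.usp_RunMonthlyCalculations;';

-- freq_type 16 = monthly; freq_interval 1 = day 1 of the month
EXEC msdb.dbo.sp_add_jobschedule
    @job_name = N'MonthlyCalculation',
    @name = N'FirstOfMonth',
    @freq_type = 16,
    @freq_interval = 1,
    @freq_recurrence_factor = 1,
    @active_start_time = 20000;  -- 02:00:00

EXEC msdb.dbo.sp_add_jobserver @job_name = N'MonthlyCalculation';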
We have a database server and it has about 10 databases.
I would like to create some functions / stored procedures which can be used in all databases.
For example, we can use sp_executesql in any database.
We have some requirements like that (getting the current academic year, financial year, etc.).
Is it doable?
As others have suggested, you could put objects into the master database, but Microsoft explicitly recommends that you should not do that. I find that solution to be rather risky anyway, because the master database is 'owned' by the system, not by you, so there are no guarantees that it will continue to behave in the same way in the future.
Instead, I would consider this to be primarily a deployment issue. There are (at least) two strategies you could use:
Deploy the objects to every database
Deploy them to one 'reference' database that is only used for shared objects and create synonyms in the other databases
The second option is perhaps the better one, because if your functions use tables (e.g. you use a calendar table to get the academic year, which is much easier than calculating it) then you would have to create the same tables in every database too. By using synonyms, you only have to maintain one set of tables.
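A minimal sketch of the synonym approach, with hypothetical database and function names (and assuming the academic year rolls over in September):

USE RefDb;  -- the shared 'reference' database
GO
CREATE FUNCTION dbo.fn_CurrentAcademicYear ()
RETURNS int
AS
BEGIN
    RETURN YEAR(GETDATE()) - CASE WHEN MONTH(GETDATE()) < 9 THEN 1 ELSE 0 END;
END;
GO

USE AppDb1;  -- repeat in each consumer database
GO
CREATE SYNONYM dbo.fn_CurrentAcademicYear FOR RefDb.dbo.fn_CurrentAcademicYear;
GO
SELECT dbo.fn_CurrentAcademicYear();  -- resolves through the synonym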
For the actual deployment, it's straightforward to use scripting to manage the objects, because you just need a list of databases to connect to and run each DDL script against. You can do that using batch files and SQLCMD (perhaps with SQLCMD variables in your .sql scripts), or drive it from PowerShell or any other language you prefer.
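For example, a deployment script might use a SQLCMD variable so the same .sql file can be pointed at each database in your list (file, server, and database names are hypothetical):

-- shared_objects.sql, run once per database, e.g.:
--   sqlcmd -S MyServer -i shared_objects.sql -v TargetDb=AppDb1
--   sqlcmd -S MyServer -i shared_objects.sql -v TargetDb=AppDb2
USE [$(TargetDb)];
GO
IF OBJECT_ID(N'dbo.fn_CurrentAcademicYear') IS NULL
    EXEC (N'CREATE SYNONYM dbo.fn_CurrentAcademicYear FOR RefDb.dbo.fn_CurrentAcademicYear;');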
Depending upon what the SP actually does, you may want to create the procedure in master, name it with an sp_ prefix, and mark it as a system procedure:
http://weblogs.sqlteam.com/mladenp/archive/2007/01/18/58287.aspx
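That approach looks roughly like this (the procedure name and body are hypothetical; sp_MS_marksystemobject is undocumented, as the next answer notes):

USE master;
GO
CREATE PROCEDURE dbo.sp_GetAcademicYear
AS
    -- placeholder body; marking it as a system object makes it run
    -- in the context of whichever database it is called from
    SELECT YEAR(GETDATE()) AS AcademicYear;
GO
EXEC sp_MS_marksystemobject 'sp_GetAcademicYear';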
A couple of options:
You can use a system stored procedure as Cade says. I've done this in the past and it works OK. One warning on this is that the sp_MS_marksystemobject procedure is undocumented, which may mean that it could vanish or change without warning in future SQL Server versions. Thinking back, I think there were other problems using this approach with functions, though.
Another approach is to use standardized procedures and functions, and roll them out across your databases using sp_MSforeachdb to run code against every database. If you need to run against only your 10 databases, you can copy the code in this procedure and modify it to check that a database matches your schema before running the code (or you can write your own version that does a similar thing).
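A rough sketch of that, filtering on a hypothetical marker table so only your own databases are touched:

EXEC sp_MSforeachdb N'
USE [?];
IF OBJECT_ID(N''dbo.SchemaVersion'') IS NOT NULL
BEGIN
    PRINT DB_NAME() + N'': deploying shared objects'';
    -- EXEC sp_executesql @Ddl;  -- run your DDL script here
END';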
I'm working on a desktop application that must support (currently) MS Access and SQL Server back ends. The application is under constant development and changes are frequently being made to the database, mostly the addition of tables and views to support new features (but also some DROPs and ALTER TABLEs to add new columns).
I have a system that compiles the DDL into the executable, checks the database to see if the executable has any new DDL that has to be executed, and executes it. This works fine for a single database.
My immediate problem is that SQL Server and Access support wildly different names for data types so a CREATE TABLE statement that executes against Access will not execute against SQL Server (or worse, will execute but create a table with different datatypes).
Is there a method that can be used to create DDL (especially CREATE TABLE commands) that can be executed through ADO against both of these databases without having to craft separate DDL for each provider?
Since you are already using ADO, you should look into Microsoft ADOX.
This allows you to manipulate structures in a data source using an ADO object model that is independent of the underlying data source type, i.e. without resorting to explicit DDL.
Support for ADOX is not guaranteed by any given ADO Provider, and the level of ADOX support may vary even when it is available. But for MS Access and MS SQL Server I think you will find all the capability you require (and quite possibly more!)
This can be done using DBX in Delphi.
The following link is to sample code showing how this can be done.
http://cc.embarcadero.com/item/26210
I had the same problem.
I resolved it by applying a C preprocessor to the SQL before executing it.
The preprocessor includes macros in order to handle the different dbs.
Stefano
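To illustrate Stefano's approach: a preprocessed DDL file might look like this (macro names are hypothetical; run it through cpp, with or without -DMSSQL, before handing the output to the database):

/* ddl.sql.in -- e.g.: cpp -P -DMSSQL ddl.sql.in > ddl.sql */
#ifdef MSSQL
#define AUTOINC_PK int IDENTITY(1,1) PRIMARY KEY
#define SHORT_TEXT nvarchar(255)
#else /* MS Access / Jet */
#define AUTOINC_PK COUNTER PRIMARY KEY
#define SHORT_TEXT TEXT(255)
#endif

CREATE TABLE Customer (
    Id   AUTOINC_PK,
    Name SHORT_TEXT
);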
Did you check
http://db.apache.org/ddlutils/
or
http://publib.boulder.ibm.com/infocenter/wtelecom/v6r1/index.jsp?topic=%2Fcom.ibm.twss.plan.doc%2Fdb_scripts.html
I'd like to add the "msdb.dbo.sp_help_job" system stored procedure to a LINQ to SQL object, but I can't figure out how to specify it. If I create a new Data Connection in Server Explorer and specify the "msdb" database of the server I want, and navigate to "Stored Procedures", that procedure is not listed. Am I looking in the wrong place?
I've added regular (user defined) stored procedures in the past with no problem. I know I could get there by executing it via "ExecuteCommand" on the data context, and I could also create a "wrapper" stored procedure that did nothing but call "sp_help_job", but I'd like to know how to hook it up directly to LINQ, or if it's even possible.
The system stored procedures are not actually sitting inside your database, but rather in the read-only Resource database.
http://msdn.microsoft.com/en-us/library/ms190940.aspx
However, here's how you can make it possible to find them:
Accessing System Databases/Tables using LINQ to SQL?
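If the direct route proves too awkward, the wrapper idea from the question is easy to map in LINQ to SQL, since the wrapper is an ordinary user-defined procedure with a single result set. A minimal sketch (the wrapper name is hypothetical):

CREATE PROCEDURE dbo.usp_HelpJobWrapper
AS
BEGIN
    SET NOCOUNT ON;
    EXEC msdb.dbo.sp_help_job;  -- no parameters: returns the list of jobs
END;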