I want to know if there are any provisions in SQL Server for storing constant variables, either in some file or something like CodeIgniter's constants.php, where we can define constants and call them wherever we need them.
I want to use this feature because I have multiple views and stored procedures that take static values, and whenever those values change in the database I could just change them in the constants definition and my job would be done. I am OK with creating an XML or other file and calling it in stored procedures and views wherever I need it.
Thank you in advance.
This is not possible at the server level, but you can do it if you have all the views in a single .SQL file (or, if you want, several batches in a single .SQL script). You will need to use SQLCMD mode in SSMS. To enable SQLCMD mode go to Query > SQLCMD Mode.
Once you have SQLCMD mode enabled you can define constants/variables at the top of your SQL script and use them throughout the different views in your .SQL file.
To define the variables/constants
:setvar cons1 10
:setvar cons2 20
And to use them
$(cons1)
$(cons2)
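For example, a minimal sketch of such a script (the view, table, and variable names here are made up for illustration) that defines a value once and reuses it across several views:
:setvar MinOrderAmount 100

CREATE VIEW dbo.BigOrders AS
SELECT OrderId, CustomerId, TotalAmount
FROM dbo.Orders
WHERE TotalAmount >= $(MinOrderAmount);
GO

CREATE VIEW dbo.BigOrderCustomers AS
SELECT DISTINCT CustomerId
FROM dbo.Orders
WHERE TotalAmount >= $(MinOrderAmount);
GO
Note that SQLCMD substitutes $(MinOrderAmount) as plain text before each batch is sent to the server, so the views are created with the literal value baked in; changing the :setvar line and re-running the script redeploys them all with the new value.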
A "constants file" is really a list of key/value pairs
const key = "value"
Translated into SQL, a list of key/value pairs is a table
CREATE TABLE SettingsTable (
    [Key]       varchar(50)   NOT NULL UNIQUE,
    IntValue    int           NULL,
    StringValue nvarchar(200) NULL
    -- etc. for the supported data types
);
Reference the values using
SELECT [ValueColumn] FROM [SettingsTable] WHERE [Key] = '[Key]'
Since the result is a NULLable scalar, you can use this expression to read a "constant" value in a sub-SELECT.
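For instance, with the table above (dbo.Orders and the 'MinOrderAmount' key are made-up examples):
SELECT o.OrderId,
       o.TotalAmount,
       (SELECT IntValue
        FROM SettingsTable
        WHERE [Key] = 'MinOrderAmount') AS MinOrderAmount
FROM dbo.Orders AS o;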
If your constants are really really constant, you can also define a view as
CREATE VIEW V_SystemConstants AS
SELECT 42 AS TheAnswer
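Any query can then read the value directly or cross join the view in (dbo.Orders and its Quantity column are again made-up examples):
SELECT TheAnswer FROM V_SystemConstants;

SELECT o.OrderId
FROM dbo.Orders AS o
CROSS JOIN V_SystemConstants AS c
WHERE o.Quantity = c.TheAnswer;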
Related
I currently have code in a script that looks something like:
SELECT * FROM [linkedServerName].database1.dbo.tableA;
SELECT * FROM [linkedServerName].database1.dbo.tableB;
SELECT * FROM [linkedServerName].database1.dbo.tableC;
except that it's not as neat as this and in reality there are references scattered all over a very long script.
I will need to run this script on a number of different test environments where the name of the linkedServer will need to change.
Is there a way that I can store these in a single variable so that I only need to make 1 adjustment?
(I tried to use a Synonym but it seems that I can only do that at a table level, so I would need multiples of them which defeats the purpose of what I am trying to achieve.)
It is possible by using SQLCMD variables in a database project:
https://learn.microsoft.com/en-us/sql/ssdt/add-database-reference-dialog-box?view=sql-server-ver15#:~:text=a%20composite%20project-,In%20Solution%20Explorer%2C%20right%20click%20References%20and%20select%20Add%20Database,and%20paste%20it%20into%20your%20.
The DACPAC, along with the SQLCMD variable, can be used to achieve your requirement.
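As a sketch of the same idea in a plain SQLCMD-mode script (the server name TestServer01 is a placeholder; in a database project the variable is declared in the project properties rather than with :setvar):
:setvar LinkedServer TestServer01

SELECT * FROM [$(LinkedServer)].database1.dbo.tableA;
SELECT * FROM [$(LinkedServer)].database1.dbo.tableB;
SELECT * FROM [$(LinkedServer)].database1.dbo.tableC;
Changing the single :setvar line (or the SQLCMD variable value per environment) updates every reference in the script.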
This seems ridiculously easy, but I can't find it anywhere...
I have a VERY simple sequence container with two tasks: Truncate a SQL table, and repopulate it from production. But this container will be repeated for about 50 tables. The container's name (entered manually) = the name of both the source and destination tables.
I have two variables:
"TableName" is entered manually.
"DelTable" is an expression that uses #[User::TableName] to generate a simple SQL statement.
I'm super-lazy and would like to use an expression to set "TableName" = the name of the current scope so I only have to enter it once.
Ideas???
THANK YOU!
If you are truncating all tables in a DB and replacing them with exactly the same structure, how about this approach:
Execute SQL:
select table_name
from INFORMATION_SCHEMA.TABLES --Add a where to limit the tables to the ones you want
Save results to an object variable called TABLES
Add a For Each Loop:
Loop through the ADO object, setting the value to a string variable called table.
Add an Execute SQL task to the For Each Loop: truncate table ? and map the parameter.
Add a 2nd Execute SQL statement:
INSERT INTO SERVER.DB.SCHEMA.?
select * from ?
Again map the parameters.
If you are having trouble mapping parameters, set up variables and use them to create the SQL statements to run, as in the sketch below.
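A minimal sketch of that fallback for the truncate-and-reload step, since OLE DB ? parameters cannot stand in for object names (the [ProdServer].[ProdDb] four-part name is a placeholder for your source):
DECLARE @table sysname = ?;   -- mapped to the loop's table variable
DECLARE @sql   nvarchar(max);

SET @sql = N'TRUNCATE TABLE dbo.' + QUOTENAME(@table) + N';'
         + N' INSERT INTO dbo.' + QUOTENAME(@table)
         + N' SELECT * FROM [ProdServer].[ProdDb].dbo.' + QUOTENAME(@table) + N';';

EXEC sys.sp_executesql @sql;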
@TomPhillips is correct; unfortunately I cannot comment or mark that answer as useful, hence commenting here.
There's no easy quick fix to loop/automate this unless all 50 tables have the same structure, which is rare by any stretch of the imagination.
BIML is the way to go if you are lazy :)
SSIS is not dynamic. Data Flows require fixed input and output at compile time, not runtime. You cannot simply change the table name and have it work.
If you have a list of 50 tables to do the same function on, you can use BIML to dynamically generate the SSIS package(s). But the DF itself cannot be dynamic.
I have an SSIS package that contains one data flow task that has several data sources as well as several destinations. The package takes data from one table and inserts it into another.
I want to transfer from Source table to destination table only records that belong to a particular CollectionID. I added a parameter "CollectionID" of type string to the project and added the parameter to the configuration file.
I select data from the source table via SQL command. How can I get the sql command to use the parameter I added to the configuration file? I understand I need to add a WHERE clause, but how do I point the where clause to a parameter in the config file?
You need to create a variable and map it to the configuration value.
Assuming you are using the OLE DB connection type, you then map the variable value to the SQL statement with the ? placeholder.
SELECT * from Table where columnvalue = ?
Finally, map the variable in the ExecuteSQL task.
If the parameter doesn't have a name you can just use 0, but make sure the data type is correct. If it is a text data type, you will need to give it the proper length, not -1.
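Adapted to the question (the source table name is a placeholder), the source query would look like this, with the ? mapped to the variable that receives CollectionID from the configuration file:
SELECT *
FROM dbo.SourceTable
WHERE CollectionID = ?;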
In MS SQL Server I have the script below to restore a database, but I would like to parameterize the database name, path and file name in an SSIS package. How do I do that?
Details:
The script, which works (I got it by right-clicking Restore in MS SQL Server):
USE [master]
RESTORE DATABASE [DataSystem2014] FROM DISK = N'F:\Folders\99_Temp\12_DataSystem\DataSystem_20140915_181216.BAK' WITH FILE = 1, MOVE N'DataSystem2014' TO N'F:\DatabaseFiles\DataSystem2014_Primary.mdf', MOVE N'DataSystem2014_log' TO N'F:\DatabaseLogFiles\DataSystem2014_Primary.ldf', NOUNLOAD, STATS = 5
GO
but I'd like to use the above as a SQL task in an SSIS package, and I couldn't properly parameterize the database name ( [DataSystem2014] ), the path ( F:\Folders\99_Temp\12_DataSystem\ ) or the file name ( DataSystem_20140915_181216.BAK ).
The database name will be fairly stable, but I would like to bring it into the SQL statement as a parameter; the path might change but is also stable enough; the file name always changes. I tried a few versions, used ? and parameter mapping, and used @[User::Variable] in the SQL statement, but couldn't get them working, always error messages.
Is this something I could get some help with, how to do this, please?
The task for issuing SQL statements is called the Execute SQL Task. Depending on your Connection Manager type, you will use different characters for placeholders; see Map Query Parameters.
Generally speaking, people use an OLE DB connection manager when working with SQL Server, so the replacement character is a ?. That is an ordinal-based replacement token, so even if you have the same string in there N times, you will have to add N ?s and make N variable mappings.
Looking at your query,
RESTORE DATABASE
[DataSystem2014]
FROM DISK = N'F:\Folders\99_Temp\12_DataSystem\DataSystem_20140915_181216.BAK' WITH FILE = 1
, MOVE N'DataSystem2014' TO N'F:\DatabaseFiles\DataSystem2014_Primary.mdf'
, MOVE N'DataSystem2014_log' TO N'F:\DatabaseLogFiles\DataSystem2014_Primary.ldf'
, NOUNLOAD
, STATS = 5;
it could be as heavily parameterized as this
RESTORE DATABASE
?
FROM
DISK = ? WITH FILE = 1
, MOVE ? TO ?
, MOVE ? TO ?
, NOUNLOAD
, STATS = 5;
Since those are all unique-ish values, I'd create a number of Variables within SSIS to hold their values. Actually, I'd create more Variables than I directly map.
For example, my restored database's name, DataSystem2014, might always match the logical name of the data and log files, so knowing one, I could derive the other values. The mechanism for deriving values is an Expression. Thus if I create a Variable called @[User::BaseDBName], I could then create @[User::DBLogName] by setting EvaluateAsExpression = true and then using a formula like
@[User::BaseDBName] + "_log"
You can see in the linked MSDN article how to actually map these Variables to their placeholders in the query.
Where this all falls apart though, at least in my mind, is when you have multiple data files. You're now looking at building your restore command dynamically.
I'm assuming you know what you are doing with parameterized Execute SQL tasks, so I'll skip right to the solution I used when doing this:
Declare variables in the SQL with QUOTENAME('?') and then build dynamic SQL to execute the restore statement.
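A rough sketch of that, assuming (as in the question's script) that the logical data and log file names are the database name and the database name plus '_log'; the four ? placeholders are mapped to SSIS variables for the database name, backup file, data file and log file paths:
DECLARE @db      sysname       = ?;
DECLARE @bakFile nvarchar(260) = ?;
DECLARE @mdfFile nvarchar(260) = ?;
DECLARE @ldfFile nvarchar(260) = ?;
DECLARE @sql     nvarchar(max);

-- NB: QUOTENAME returns NULL for input longer than 128 characters, so very long paths would need manual quoting
SET @sql = N'RESTORE DATABASE ' + QUOTENAME(@db)
         + N' FROM DISK = ' + QUOTENAME(@bakFile, '''')
         + N' WITH FILE = 1'
         + N', MOVE ' + QUOTENAME(@db, '''') + N' TO ' + QUOTENAME(@mdfFile, '''')
         + N', MOVE ' + QUOTENAME(@db + N'_log', '''') + N' TO ' + QUOTENAME(@ldfFile, '''')
         + N', NOUNLOAD, STATS = 5;';

EXEC sys.sp_executesql @sql;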
hope it helps.
Edit: I'm aware that SELECT * is bad practice, but it's used here just to focus the example SQL on the table statement rather than the rest of the query. Mentally exchange it for some column names if you prefer.
Given a database server MyServer (which we are presently connected to in SSMS), with several databases MyDb1, MyDb2, MyDb3 etc. and default schema dbo, are any of the following equivalent queries (they will all return exactly the same result set) more "optimal" than the others?
SELECT * FROM MyServer.MyDb1.dbo.MyTable
I was told that this method (explicitly providing the full database name including server name) treats MyServer as a linked server and causes the query to run slower. Is this true?
SELECT * FROM MyDb1.dbo.MyTable
The server name isn't required as we're already connected to it, but would this run 'faster' than the above?
USE MyDb1
GO
SELECT * FROM dbo.MyTable
State the database we're using initially. I can't imagine that this is any better than the previous for a single query, but would it be more optimal for subsequent queries on the same database (ie, if we had more SELECT statements in the same format below this)?
USE MyDb1
GO
SELECT * FROM MyTable
As above, but omitting the default schema. I don't think this makes any difference. Does it?
SQL Server will always look for the objects you specify within the current "context" if you do not specify a fully qualified name.
Is one faster than the other? Sure, in the same way that a file on your hard drive named "This is a really long name for a file but as long as it is under 254 it is ok.txt" takes up more hard-drive (TOC) space than "x.txt". Will you ever notice it? No!
As far as the "USE" keyword goes, it just sets the context for you, so you don't have to fully qualify object names. You cannot use it from another application (like a VB/C# app) or within a stored procedure, but it is like the "GO" keyword in that it tells SSMS to do something, namely change the context.