I have SQL code like the example below, but with more columns, and I also create a nonclustered index on this table within the SQL script.
IF NOT EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES
               WHERE TABLE_NAME = 'ModelOutput')
BEGIN
    CREATE TABLE ModelOutput
    (
        [OutputID] [bigint] IDENTITY(1,1) NOT NULL,
        [Prediction] [int] NOT NULL,
        [CreateDate] [datetime] DEFAULT CONVERT(DATETIME2(0), GETDATE())
    ) ON [PRIMARY]
END;
This code is saved in a file called CreateTable.sql which I want to run using RStudio.
Currently, my R code looks like:
#Read SQL file
sqlcode = read_file(file = "CreateTable.sql")
#Run SQL code
DBI::dbSendQuery(conn = con, statement = sqlcode)
However, when I tried to figure out why it didn't seem to be running, I found that dbSendQuery() only works with SELECT statements (https://cran.r-project.org/web/packages/DBI/vignettes/DBI-advanced.html). So then I found dbCreateTable(), but that only works if I define the table from R, which I don't want to do since, as far as I know, I can't set the same column defaults or create an index that way.
I have looked around and cannot find any way to run a CREATE TABLE SQL statement from RStudio. Are there any options for me, or do I need to rethink my process entirely and just use dbCreateTable()?
I tried many options, and the most reliable way I found to run any SQL code from RStudio Connect and RStudio Server Pro was to turn it into a stored procedure and then run DBI::dbSendQuery(con, "SET NOCOUNT ON\nEXEC YourProcedure").
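For illustration, here is a minimal sketch of wrapping the CreateTable.sql logic in a procedure that the call above can execute. The procedure name and the index definition are assumptions, since the actual index isn't shown in the question:
CREATE OR ALTER PROCEDURE dbo.usp_CreateModelOutput
AS
BEGIN
    SET NOCOUNT ON;

    IF NOT EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES
                   WHERE TABLE_NAME = 'ModelOutput')
    BEGIN
        CREATE TABLE ModelOutput
        (
            [OutputID] [bigint] IDENTITY(1,1) NOT NULL,
            [Prediction] [int] NOT NULL,
            [CreateDate] [datetime] DEFAULT CONVERT(DATETIME2(0), GETDATE())
        ) ON [PRIMARY];

        -- The nonclustered index can live inside the same guarded block;
        -- this index is only a placeholder for the real column list.
        CREATE NONCLUSTERED INDEX IX_ModelOutput_Prediction
            ON ModelOutput ([Prediction]);
    END
END
From R, DBI::dbExecute(con, "EXEC dbo.usp_CreateModelOutput") also works, since dbExecute() is the DBI verb for statements that do not return rows.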
I have hit a problem with SQL Server that results in it infinitely recompiling a function.
To reproduce, create a new database with the option Parameterization = Forced or execute the following on an existing DB:
ALTER DATABASE [DatabaseName] SET PARAMETERIZATION FORCED WITH NO_WAIT
Then execute the following script:
CREATE TABLE dbo.TestTable(
ID int IDENTITY(1,1) NOT NULL,
FullTextField varchar(100) NULL,
CONSTRAINT PK_TestTable PRIMARY KEY CLUSTERED
(ID ASC)
)
GO
IF NOT EXISTS(SELECT 1 FROM sysfulltextcatalogs WHERE name = 'FullTextCat')
CREATE FULLTEXT CATALOG FullTextCat;
GO
CREATE FULLTEXT INDEX ON dbo.TestTable (FullTextField) KEY INDEX PK_TestTable
ON FullTextCat
WITH
CHANGE_TRACKING AUTO
GO
CREATE OR ALTER FUNCTION dbo.fn_TestFullTextSearch(@Filter VARCHAR(8000))
RETURNS TABLE
AS
RETURN SELECT
    ID,
    FullTextField
FROM dbo.TestTable
WHERE CONTAINS(FullTextField, @Filter)
GO
SELECT * FROM dbo.fn_TestFullTextSearch('"a*"')
The query will never return. Running SQL Profiler to monitor SP:CacheInsert and SP:CacheRemove shows SQL Server doing this endlessly, and the SQL logs show countless "A possible infinite recompile was detected for SQLHANDLE" messages.
Setting the Parameterization = Simple works around the issue but we need this to be set to Forced for other reasons.
Has anyone come across this issue before and/or have a suggested solution?
Thanks,
Chuck
While I still experience the problem with the original code I provided, by following @Martin's approach of explicitly parameterizing the call to the function:
EXEC sys.sp_executesql N'SELECT * FROM dbo.fn_TestFullTextSearch(@Filter)', N'@Filter VARCHAR(4)', @Filter = '"a*"'
I have been able to successfully work around the problem.
I want to be able to run some ad-hoc queries to get some fast results. The following will return the number of rows in table foobar from two databases that have identical structures.
USE Master
GO
select count(*) from MyFirstDB.dbo.foobar;
select count(*) from MySecondDB.dbo.foobar;
This works fine for SQL Server, but SQL Azure returns errors. I read that with SQL Azure you cannot change the database context in the query window of SSMS. Is there a way to make this work? What happens if I want to create a join across two databases?
One needs to create an external data source and an external table that is linked to the other server. You can use the following code to create them.
CREATE DATABASE SCOPED CREDENTIAL credname
WITH IDENTITY = 'username',
SECRET = 'password';
CREATE EXTERNAL DATA SOURCE data_source_name
WITH
(
TYPE=RDBMS,
LOCATION='server.database.windows.net',
DATABASE_NAME='databasename',
CREDENTIAL= credname
);
CREATE EXTERNAL TABLE [dbo].[external_table_name](
-- Copy column definition
[Id] [uniqueidentifier] NOT NULL,
[Name] [nvarchar](200) NULL
)
WITH
(
DATA_SOURCE = data_source_name,
SCHEMA_NAME = 'dbo',
OBJECT_NAME = 'tablename'
)
select *
from dbo.[external_table_name]
DROP EXTERNAL TABLE [external_table_name]
DROP EXTERNAL DATA SOURCE [data_source_name]
DROP DATABASE SCOPED CREDENTIAL credname
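As for the join across two databases: once the external table exists it behaves like a local table, so you can join it directly. A rough sketch (dbo.local_table and its columns are only illustrative):
SELECT l.[Name] AS LocalName, r.[Name] AS RemoteName
FROM dbo.local_table AS l
INNER JOIN dbo.[external_table_name] AS r
    ON r.[Id] = l.[ExternalId];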
In my installer I have to make a minor change in the schema:
IF NOT EXISTS (SELECT * FROM sys.columns WHERE object_id = OBJECT_ID(N'[dbo].[UserProfiles]') AND name = 'AllCheckboxesChecked')
BEGIN
ALTER TABLE [dbo].[UserProfiles] ADD [AllCheckboxesChecked] [bit] CONSTRAINT [DF_UserProfiles_AllCheckboxesChecked] DEFAULT 0 NOT NULL
UPDATE [dbo].[UserProfiles] SET [AllCheckboxesChecked]=1 WHERE [CheckedBoxes] LIKE '%#ALL#%'
END
GO
In SSMS this works, but not in Advanced Installer, where it fails with the error message that the column AllCheckboxesChecked does not exist. So I tried:
IF NOT EXISTS (SELECT * FROM sys.columns WHERE object_id = OBJECT_ID(N'[dbo].[UserProfiles]') AND name = 'AllCheckboxesChecked')
BEGIN
ALTER TABLE [dbo].[UserProfiles] ADD [AllCheckboxesChecked] [bit] CONSTRAINT [DF_UserProfiles_AllCheckboxesChecked] DEFAULT 0 NOT NULL
GO
UPDATE [dbo].[UserProfiles] SET [AllCheckboxesChecked]=1 WHERE [CheckedBoxes] LIKE '%#ALL#%'
END
GO
but this throws syntax errors as well (not in SSMS, only in AdvInst), so I guess that GO is not allowed inside the BEGIN...END block. The connection is configured as follows:
Connection type: Microsoft SQL Server / MSDE
Connection mode: ODBC Driver
ODBC Driver: SQL Server
Use 64-bit ODBC resource: No
What steps can I take to get the column created and populated with correct values iff the installer runs on a DB where the column does not yet exist?
The "column doesn't exist" error is due to validation that occurs on existing objects. Since the table already exists, the parser/compiler verifies that the table contains all of the referenced columns.
To get around such timing issues with object verification, you can enclose the statement in an EXEC, which will not be verified until run time:
IF NOT EXISTS (SELECT * FROM sys.columns
               WHERE object_id = OBJECT_ID(N'[dbo].[UserProfiles]')
                 AND name = 'AllCheckboxesChecked')
BEGIN
    ALTER TABLE [dbo].[UserProfiles]
        ADD [AllCheckboxesChecked] [bit]
        CONSTRAINT [DF_UserProfiles_AllCheckboxesChecked] DEFAULT 0
        NOT NULL;

    EXEC(N'UPDATE [dbo].[UserProfiles]
           SET [AllCheckboxesChecked] = 1
           WHERE [CheckedBoxes] LIKE ''%#ALL#%''');
END;
The GO statement is a batch separator, and a batch is parsed and compiled as a whole before any of it is executed. In your first script the ALTER TABLE and the UPDATE are in the same batch, so the UPDATE is validated against the table before the new column has been added, which is why you get the error that the column does not exist. You will have to split your script into two batches.
GO is a batch terminator - it's specific to the tool you are using rather than to SQL Server itself - so if you put
Statement1
GO
Statement2
that will be sent to SQL Server as two separate executions (batches).
Basically, your second script is broken into two batches: the first batch opens a BEGIN block without closing it, and the second batch has an END without a matching BEGIN, which is why you get syntax errors.
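Put differently, the script has to be split so that each batch is valid on its own and the UPDATE lives in a later batch than the ALTER TABLE. A sketch of that split (assuming the installer splits batches on GO like SSMS does; otherwise the two batches can go into two separate script files):
IF NOT EXISTS (SELECT * FROM sys.columns
               WHERE object_id = OBJECT_ID(N'[dbo].[UserProfiles]')
                 AND name = 'AllCheckboxesChecked')
BEGIN
    ALTER TABLE [dbo].[UserProfiles]
        ADD [AllCheckboxesChecked] [bit]
        CONSTRAINT [DF_UserProfiles_AllCheckboxesChecked] DEFAULT 0 NOT NULL
END
GO

-- By the time this batch is compiled, the previous batch has already run,
-- so the column exists and the UPDATE validates correctly.
UPDATE [dbo].[UserProfiles]
SET [AllCheckboxesChecked] = 1
WHERE [CheckedBoxes] LIKE '%#ALL#%'
GO
Note that with this split the UPDATE runs on every install; if it must run only when the column is first added, the EXEC approach shown above keeps it inside the IF block.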
I have a migration script with the following statement:
ALTER TABLE [Tasks] ALTER COLUMN [SortOrder] int NOT NULL
What will happen if I run that twice? Will it change anything the second time? MS SQL Management Studio just reports "Command(s) completed successfully", but with no details on whether they actually did anything.
If it's not already idempotent, how do I make it so?
I would say that the second time, SQL Server checks the metadata and does nothing, because nothing has changed.
But if you don't like the possibility of multiple executions, you can add a simple condition to your script:
-- Demo setup (as in the SQL Fiddle below): the column starts out as a nullable VARCHAR
CREATE TABLE Tasks(SortOrder VARCHAR(100));
IF NOT EXISTS (SELECT 1
FROM INFORMATION_SCHEMA.COLUMNS
WHERE [TABLE_NAME] = 'Tasks'
AND [COLUMN_NAME] = 'SortOrder'
AND IS_NULLABLE = 'NO'
AND DATA_TYPE = 'INT')
BEGIN
ALTER TABLE [Tasks] ALTER COLUMN [SortOrder] INT NOT NULL
END
SqlFiddleDemo
When you execute it the second time, the query still runs, but since the table is already in the requested state it has no effect; nothing changes when the script executes twice.
Here is a good MSDN read on the subject: Inside ALTER TABLE
Let's look at what SQL Server does internally when performing an ALTER TABLE command. SQL Server can carry out an ALTER TABLE command in any of three ways:
SQL Server might need to change only metadata.
SQL Server might need to examine all the existing data to make sure it's compatible with the change but then change only metadata.
SQL Server might need to physically change every row.
I'm working on a data warehouse.
I wrote a script to load a dimension table by creating date entries (like the SSAS wizard does, but directly in my SSIS ETL process). It works well in SSMS and directly in a T-SQL task in SSIS (without using parameters).
This script doesn't return any result set; it is just a loop that inserts data into a table.
Here is a quick look at my query.
USE [MySQLServerDatabase]
GO
-- Some parameters used by the script.
DECLARE @PREFIX_YEAR_NAME [nvarchar](50) = 'Year ';
DECLARE @PREFIX_QUARTER_NAME [nvarchar](50) = 'Q';
DECLARE @PREFIX_WEEK_NAME [nvarchar](50) = 'W';
DECLARE @PREFIX_MONTH_NAME [nvarchar](50) = 'M';
DECLARE @DefaultBeginDate [datetime] = '01/01/2000';
DECLARE @Date [datetime];
SET @Date = @BeginDate
WHILE @Date <= @EndDate
BEGIN
-- ...
-- INSERT INTO ...
END
-- ...
However, to get something easier to use and maintain, I would like to use SSIS variables directly in the script.
Here are my parameters (Project.params file).
My script needs all of them to work:
Then I added an "Execute SQL Task" component containing my SQL query (OLE DB connection, direct input method). I created my 5 parameters:
How do I use the parameters in the SQL query?
I tried using the same names as the variables in the script; it doesn't work.
I tried using index names (0, 1, 2, etc.) and '?' placeholders in the script; that doesn't work either.
Here is the error when using '?' as parameters:
Error: 0xC002F210 at Execute SQL Task, Execute SQL Task: Executing the
query " DECLARE #PREFIX_YEAR_NAME nvarchar = ?; D..." failed
with the following error: "Multiple-step OLE DB operation generated
errors. Check each OLE DB status value, if available. No work was
done.". Possible failure reasons: Problems with the query, "ResultSet"
property not set correctly, parameters not set correctly, or
connection not established correctly. Task failed: Execute SQL Task
Any idea how to solve that?
Thanks.
You can put this query into a stored procedure. In the Execute SQL Task Editor, in SQLStatement, you can execute it with exec [sch].[sp_name] ?,?,?,...,? (each ? is one parameter).
In the 'Parameter Mapping' tab, use the numbers 0, 1, 2, ..., n in the Parameter Name column (matching the order of the parameters in your stored procedure).
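As a rough sketch (the procedure name is only illustrative, and the five parameters mirror the variables declared in the question's script):
CREATE OR ALTER PROCEDURE dbo.usp_LoadDateDimension
    @PREFIX_YEAR_NAME [nvarchar](50),
    @PREFIX_QUARTER_NAME [nvarchar](50),
    @PREFIX_WEEK_NAME [nvarchar](50),
    @PREFIX_MONTH_NAME [nvarchar](50),
    @DefaultBeginDate [datetime]
AS
BEGIN
    SET NOCOUNT ON;
    -- Body of the original date-dimension script goes here,
    -- using the parameters above instead of the DECLAREd variables.
END
GO
The SQLStatement in the Execute SQL Task then becomes EXEC dbo.usp_LoadDateDimension ?, ?, ?, ?, ? and, in Parameter Mapping, the five SSIS parameters are mapped to the names 0 through 4 in the same order as the procedure's parameters.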