Passing Parameter to SQL from Batch File - Garbage Value Appears

I am trying to pass parameters to a SQL script as mentioned here.
This is my batch file.
testUpdate.bat
@echo off
sqlplus myuser@mydb/mypassword @updateRundate 25042018
pause
This is my sql file
updateRundate.sql
update SB_ER_RUNDATE set rundate = TO_DATE( '&1', 'ddmmyyyy');
commit;
exit;
I am getting the following error
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
old 1: update SB_ER_RUNDATE set rundate = TO_DATE( '&1', 'ddmmyyyy')
new 1: update SB_ER_RUNDATE set rundate = TO_DATE( 'to', 'ddmmyyyy')
update SB_ER_RUNDATE set rundate = TO_DATE( 'to', 'ddmmyyyy')
*
ERROR at line 1:
ORA-01858: a non-numeric character was found where a numeric was expected
Commit complete.
Instead of the passed parameter, garbage value is appearing in the query. Please help me to resolve this issue.
Update
The path of my sqlplus installation is
C:\app\nt to de softwares\product\11.1.0\client_1
I have observed that if I output &3 instead of &1, I get the value
softwares\product\11.1.0\client_1\product\11.1.0\client_1\sqlplus\admin\glogin.sql
&1 outputs the value 'to', which is in fact part of the installation path.
Update 2
I have tried the second answer in this post. The strange thing is that the script does not work on my system and prints 'to' in place of the parameter, but the same script works on another PC with the same SQL*Plus installation.
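For reference, below is a minimal sketch of the standard way to pass a positional parameter from a batch file into a SQL*Plus script; the user, password, connect identifier, and script path are placeholders based on the question, not the asker's real values. Since the 'to' garbage value is part of the space-containing installation path, it is also worth quoting any paths that appear on the command line.
@echo off
rem testUpdate.bat - minimal sketch; credentials and path are placeholders
sqlplus myuser/mypassword@mydb @"C:\scripts\updateRundate.sql" 25042018
pause
The script then refers to the argument as &1, exactly as in updateRundate.sql above.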

Related

EF Core idempotent migration script fails despite check on applied migration

I am using EF Core (3.1.15). In a previous migration (also created in 3.1.15), a column was referenced that was dropped later on. The idempotent script does check whether the migration has already been applied to the database (it has, and the entry still shows in the __EFMigrationsHistory table). However, the check doesn't have the expected result and the script fails because of the nonexistent column.
Q: why is the nonexistent column tripping up the execution of the SQL script?
Script was created with
dotnet-ef migrations script -i -o migrations.sql
Relevant part of the automated script that fails, where ReferenceToLedgerId is the column dropped in later migration:
IF NOT EXISTS(SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20210612052003_CLedger')
BEGIN
UPDATE LedgerTable SET LedgerId = ReferenceToLedgerId
END;
Error:
Msg 207, Level 16, State 1, Line 3
Invalid column name 'ReferenceToLedgerId'
When running the following SQL query, the result comes back as expected:
SELECT *
FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20210612052003_CLedger'
MigrationId
ProductVersion
20210612052003_CLedger
3.1.15
The database is Azure SQL Database. The script doesn't fail on a local SQL dev database. A dozen migrations have been applied since then, and only now does the script fail.
Below was the call that created the specific script:
migrationBuilder.Sql("UPDATE LedgerTable set LedgerId = ReferenceToLedgerId", true);
I tried to place the table and column names in square brackets, but that made no difference (e.g. [ReferenceToLedgerId]). The script fails in an Azure DevOps release when using SQLCMD and also fails when using Azure Data Studio, both accessing the Azure SQL Database.
Additional check
I changed the script to do a quick check:
PRINT '#Before IF'
IF NOT EXISTS(SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20210612052003_CLedger')
BEGIN
PRINT '#Within IF'
--UPDATE LedgerTable SET LedgerId = ReferenceToLedgerId
END;
PRINT '#After IF'
To which I get the following result:
Started executing query at Line 1
#Before IF #After IF
Total execution time: 00:00:00.010
If I uncomment the UPDATE statement it fails again. So I can only conclude that the code path works as intended, but that the server still checks for the existence of the column. I am not familiar enough with SQL to understand why this would be, or why it only fails for this one line while the column itself is referenced in other lines of the SQL script without failing.
That batch will fail on every version of SQL Server, e.g.
use tempdb
go
create table __EFMigrationsHistory(MigrationId nvarchar(200))
create table LedgerTable(LedgerId int)
go
insert into __EFMigrationsHistory(MigrationId) values (N'20210612052003_CLedger')
go
IF NOT EXISTS(SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20210612052003_CLedger')
BEGIN
UPDATE LedgerTable SET LedgerId = ReferenceToLedgerId
END;
Fails with
Msg 207, Level 16, State 1, Line 8
Invalid column name 'ReferenceToLedgerId'.
Because the batch cannot be parsed and compiled; it is simply not legal to reference a non-existent table or column in a T-SQL batch.
You can work around this by using dynamic SQL, so that the batch referencing the non-existent column is not parsed and compiled unless the migration is being applied.
migrationBuilder.Sql("exec('UPDATE LedgerTable set LedgerId = ReferenceToLedgerId')", true);
This is documented here:
Tip
Use the EXEC function when a statement must be the first or only one in a SQL batch. It might also be needed to work around parser errors in idempotent migration scripts that can occur when referenced columns don't currently exist on a table.
https://learn.microsoft.com/en-us/ef/core/managing-schemas/migrations/operations
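For comparison, with the EXEC wrapper the failing block in the idempotent script becomes something like the sketch below; it parses even though ReferenceToLedgerId no longer exists, because the dynamic string is only compiled if the branch actually runs:
IF NOT EXISTS(SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20210612052003_CLedger')
BEGIN
    -- the statement is passed as a string, so it is not parsed until EXEC runs it
    EXEC(N'UPDATE LedgerTable SET LedgerId = ReferenceToLedgerId')
END;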

What is this SQL Server OUTPUT clause effect in SSIS package step?

I am editing this question because I was not clear enough.
I started maintaining an SSIS package; I haven't worked with SQL Server for years, and I see this code and don't know how SSIS is dealing with it:
There is a table called "employees", and we are running this query:
UPDATE t SET IsProcessed = 1
OUTPUT INSERTED.RecordId, INSERTED.SiteId, INSERTED.EmployeeId, INSERTED.FirstName, INSERTED.MiddleName,
INSERTED.LastName, INSERTED.IsProcessed
FROM [mydatabase].[dbo].[employees] AS t
WHERE IsProcessed = 0
So, does SSIS only show the output of the command, or does it take that output as input to the next step?
Can you help please?
This will list the values of all rows that were updated. The after-update (new) column values are listed; if you wanted the previous values, you could list DELETED.column-name instead. This is a form of auditing the changes.
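For illustration, a variant of the query that captures both the before and after values might look like this (a sketch based on the statement above, with hypothetical column aliases):
UPDATE t SET IsProcessed = 1
OUTPUT DELETED.IsProcessed  AS IsProcessed_Before,  -- value before the update
       INSERTED.IsProcessed AS IsProcessed_After,   -- value after the update
       INSERTED.RecordId, INSERTED.SiteId, INSERTED.EmployeeId
FROM [mydatabase].[dbo].[employees] AS t
WHERE IsProcessed = 0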

Backup SSAS via SSMS Job with Date Appended

I'd like to back up my SSAS database via a SQL Server Agent job in SSMS, and I'd also like to append the day number to the end of the file and allow overwrites.
This will ensure that I only ever have a month of backups (i.e. Jan 1st backup will be called backup_01.abf, Jan 2nd = backup_02.abf etc.)
I connected to the SSAS DB via SSMS and scripted out the backup procedure, which is as follows:
{
  "backup": {
    "database": "ExampleDB",
    "file": "Backup.abf",
    "allowOverwrite": true,
    "applyCompression": false
  }
}
I believe I can simply add this as a step in a SQL Server Agent job as a SQL Server Analysis Services Command.
But how can I then append the day of the month to the file?
I can't seem to find much about this online.
[EDIT]
Using a combination of what Vaishali responded with below and another article I found online, I've done the following:
Created a linked server (Link_SSAS)
Generated the following script:
Declare @XMLA nvarchar(1000),
        @DateSerial nvarchar(35)
-- Get Day Number from GETDATE()
Set @DateSerial = RIGHT('0' + RTRIM(DAY(GETDATE())),2)
-- Create the XMLA string
Set @XMLA = N'<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>ExampleDB</DatabaseID>
  </Object>
  <File>C:\bak\Backup_' + @DateSerial + '.abf</File>
</Backup>'
-- Execute the string across the linked server
Exec (@XMLA) AT Link_SSAS;
The above XMLA runs perfectly if I just execute it through a query window in SSMS.
However, when I try to put the XMLA into a job step, I get an error.
With Job Type set to T-SQL I get:
The specified '@server' is invalid (valid values are returned by sp_helpserver)
If I run sp_helpserver, it does show my linked server as being there.
I tried disconnecting/reconnecting to the server.
You can do this with a dynamic XMLA script executed via SQL; the link below will be useful:
https://www.mssqltips.com/sqlservertip/2790/dynamic-xmla-using-tsql-for-sql-server-analysis-services/
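For completeness, a linked server such as Link_SSAS pointing at the Analysis Services instance is typically created with sp_addlinkedserver and the MSOLAP provider; a minimal sketch (the instance name here is a placeholder, not from the question):
EXEC master.dbo.sp_addlinkedserver
    @server     = N'Link_SSAS',
    @srvproduct = N'',
    @provider   = N'MSOLAP',
    @datasrc    = N'MySSASServer',   -- placeholder: name of the SSAS instance
    @catalog    = N'ExampleDB';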

When was the last time ANSI_WARNINGS was set

In our production database there is an sp that was working fine till 2nd Nov 2014, when it suddenly started giving this warning:
Warning: Null value is eliminated by an aggregate or other SET operation
because we have ANSI_WARNINGS ON in our production database.
To resolve it we put SET ANSI_WARNINGS OFF at the beginning of the sp.
Can anybody tell me if there is any way I can check when, or by whom, ANSI_WARNINGS was last set?
You can see all your configuration changes by using this T-SQL:
DECLARE @sTracePath VARCHAR(1024)
SELECT @sTracePath = CONVERT(VARCHAR(500), value) FROM fn_trace_getinfo(DEFAULT)
WHERE property = 2
SELECT TEXTData, HostName, ApplicationName, DatabaseName, LoginName, SPID, StartTime, EventSequence
FROM fn_trace_gettable(@sTracePath, 1)
WHERE TEXTData LIKE '%configure%' AND SPID <> @@SPID
ORDER BY StartTime DESC
This, of course, assumes that you have the default trace enabled on the server.
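If you're not sure whether the default trace is enabled, a quick check (a minimal sketch) is:
-- returns 1 in value_in_use when the default trace is enabled
SELECT name, value_in_use
FROM sys.configurations
WHERE name = 'default trace enabled';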

Strange Issue in SSIS with WITH RESULT SETS returning wrong number of columns

So I have a stored procedure in SQL Server. I've simplified its code (for this question) to just this:
CREATE PROCEDURE dbo.DimensionLookup as
BEGIN
select DimensionID, DimensionField from DimensionTable
inner join Reference on Reference.ID = DimensionTable.ReferenceID
END
In SSIS on SQL Server 2012, I have a Lookup component with the following source command:
EXECUTE dbo.DimensionLookup WITH RESULT SETS (
(DimensionID int, DimensionField nvarchar(700) )
)
When I run this procedure in Preview mode in BIDS, it returns the two columns correctly. When I run the package in BIDS, it runs correctly.
But when I deploy it out to the SSIS catalog (the same server the database is on), point it to the same data sources, etc. - it fails with the message:
EXECUTE statement failed because its WITH RESULT SETS clause specified 2 column(s) for result set number 1, but the statement sent 3 column(s) at run time.
Steps Tried So Far:
Adding a third column to the result set - I get a different error, VS_NEEDSNEWMETADATA - which makes sense, kind of proof there's no third column.
SQL Profiler - I see this:
exec sp_prepare @p1 output,NULL,N'EXECUTE dbo.DimensionLookup WITH RESULT SETS ((
DimensionID int, DimensionField nvarchar(700)))',1
SET FMTONLY ON exec sp_execute 1 SET FMTONLY OFF
So it's trying to use FMTONLY to get the result set data ... needless to say, running SET FMTONLY ON and then running the command in SSMS myself yields .. just the two columns.
SET NOCOUNT ON - Nothing changed.
So, two other interesting things:
I deployed it out to my local SQL 2012 install and it worked fine, same connections, etc. So it may be a server / database configuration. Not sure what, if anything, it is; I didn't install the dev server, and my own install was pretty much click-through vanilla.
Perhaps the most interesting thing. If I remove the join from the procedure's statement so it just becomes
select DimensionID, DimensionField from DimensionTable
It goes back to just sending 2 columns in the result set! So adding a join, without adding any additional output columns, ups the result set to 3 columns. Even if I add 6 more joins, still just 3 columns. So one guess is it's some sort of metadata column that only gets activated when there's a join.
Anyway, as you can imagine, it's driving me kind of mad. I have a workaround to load the data into a temp table and just return that, but why won't this work? What extra column is being sent back? Why only when I add a join?
Gah!
So all credit to billinkc: the reason is the patch level.
In Version 11.0.2100.60, SSIS Lookup SQL command metadata is gathered using the old SET FMTONLY method. Unfortunately, this doesn't work in 2012, as the Books Online entry on SET FMTONLY helpfully notes:
Do not use this feature. This feature has been replaced by sp_describe_first_result_set.
Too bad they didn't follow their own advice!
This has been patched as of version 11.0.2218.0. Metadata is correctly gathered using the sp_describe_first_result_set system stored procedure.
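To see what metadata the newer mechanism reports for the procedure, you can call sp_describe_first_result_set against the same command yourself (a quick check outside SSIS, using the procedure from the question):
EXEC sp_describe_first_result_set
    @tsql = N'EXECUTE dbo.DimensionLookup',
    @params = NULL,
    @browse_information_mode = 0;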
This can happen if the WITH RESULT SETS clause specified in SSIS declares more columns than the stored procedure actually returns. Check your stored procedure and ensure that its output columns match the WITH RESULT SETS definition.
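For reference, the temp-table workaround the asker mentions would look roughly like the sketch below (based on the simplified procedure above, not the real code):
CREATE PROCEDURE dbo.DimensionLookup AS
BEGIN
    -- materialize the join into a temp table first
    SELECT DimensionID, DimensionField
    INTO #Results
    FROM DimensionTable
    INNER JOIN Reference ON Reference.ID = DimensionTable.ReferenceID;

    -- return a plain SELECT so the reported metadata is exactly two columns
    SELECT DimensionID, DimensionField FROM #Results;
END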
