I am trying to set MULTI_STATEMENT_COUNT=0 in Talend when making a tDBConnection.
Right now I have to add a separate tSnowflakeRow to set this parameter using ALTER SESSION.
Is there any way to do that while making the connection with tDBConnection?
A session parameter can be set at the user level (CREATE USER / ALTER USER), so the parameter will be in effect as soon as you initiate your connection.
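For example, in Snowflake the parameter can be set when altering the user; a minimal sketch, where the user name is hypothetical:

-- set at the user level, so every new session for this user inherits it
ALTER USER talend_user SET MULTI_STATEMENT_COUNT = 0;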
But as indicated by #hkandpal and noted in the Snowflake documentation, you should be cautious with the MULTI_STATEMENT_COUNT parameter, as it opens up the possibility of SQL injection.
Related
I am using Talend to load data from Oracle to Snowflake. I am able to set up the load pipeline, but I wanted to set the query tag as part of the load pipeline so that I can do some analysis based on the tag. However, I could not find any way to specify the query tag along with query statements (ALTER SESSION SET QUERY_TAG='TALENDLOAD') in the load pipeline.
Is it that Talend does not allow setting session parameters?
You need to first run ALTER SESSION SET MULTI_STATEMENT_COUNT=0;, as the default value is 1, which allows only one statement per request through the JDBC and ODBC connectors (see the Snowflake documentation for details).
Then you may pass ALTER SESSION SET QUERY_TAG='TALENDLOAD' along with your other query statements, as sketched below.
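A sketch of what the session setup SQL might look like (run once per session, e.g. from a tSnowflakeRow, before the batched statements):

-- 0 = allow any number of statements per request
ALTER SESSION SET MULTI_STATEMENT_COUNT = 0;
-- tag subsequent queries so they can be found in the query history
ALTER SESSION SET QUERY_TAG = 'TALENDLOAD';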
I'm migrating a classic ASP site to ASP.NET MVC. As part of the migration, we've moved the database from MS Access to SQL Server, and have set up basic trigger-level audit logging on the SQL end of things, for good measure.
What I want to do is track the currently logged in user of the classic ASP site for the purpose of the trigger-level auditing.
For the MVC end of things, I use SQL's SET CONTEXT_INFO (ref) in conjunction with Entity Framework and the "one data context per request" rule, which allows me to set the context info to the currently logged in MVC user's ID. All's well there.
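For reference, the trigger side of that setup reads the value back with CONTEXT_INFO(); a minimal sketch, where the table, columns, and audit table are all hypothetical:

-- audit trigger on a base table, attributing the change to the user ID
-- that was packed into the first 4 bytes of CONTEXT_INFO
CREATE TRIGGER trg_Orders_Audit ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @userId INT;
    SET @userId = CONVERT(INT, SUBSTRING(CONTEXT_INFO(), 1, 4));
    INSERT INTO dbo.AuditLog (TableName, ChangedBy, ChangedAt)
    VALUES ('dbo.Orders', @userId, GETDATE());
END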
I'd like to do the same with the classic ASP site, but am unsure how. Is there a way I can capture the "per request" scope to set up CONTEXT_INFO, as we can in MVC? I'm not familiar enough with how the classic ASP pipeline works to know if this can be done, or whether the database connection (implemented as a connection string in an include file and an ADODB connection) will be persisted in the app pool, which would mean I don't have a way of doing this. Does anyone know if this is possible?
Here are some facts.
CONTEXT_INFO is stored at session/batch scope, not on the connection.
From Using Session Context Information
Session context information enables applications to set binary values of up to 128 bytes that can be referenced in multiple batches, stored procedures, triggers, or user-defined functions operating on the same session.
When you take a connection from the pool, ADO (with a decent, preferably official data provider like SQLOLEDB or SQLNCLI, of course) executes sp_reset_connection, which indicates that the connection is being reused.
Starting from SQL Server 2005, sp_reset_connection resets CONTEXT_INFO.
From System Stored Procedures (Transact-SQL)
The sp_reset_connection stored procedure is used by SQL Server to support remote stored procedure calls in a transaction. This stored procedure also causes Audit Login and Audit Logout events to fire when a connection is reused from a connection pool.
In conclusion, it's safe to use CONTEXT_INFO during an open session.
So the following code is in accordance with the one-data-context-per-request rule, as long as you stick with the same connection object reference (adoCon) during the request.
<%
Dim adoCon ' global scope variable
Set adoCon = Server.CreateObject("ADODB.Connection")
adoCon.ConnectionString = "Provider=SQLNCLI10;Data Source=..."
adoCon.Open 'connection taken from the pool, a session "possibly recycled" started
'CONTEXT_INFO() is definitely NULL right now
adoCon.Execute "SET CONTEXT_INFO 0x01"
'all database operations through the adoCon ...
adoCon.Close 'connection closed, session released
%>
I am having a problem with a data flow task in an SSIS package I am trying to build. The objective of the package is to update tables on our local server using a connection to a distant server containing the source data, through a VPN connection.
There are no problems for tables which are re-downloaded entirely.
But some of the tables must be truly updated, meaning they're not re-downloaded. For each of those tables, I have to check the maximum value of the date column on our local server (an int in YYYYMMDD form) and ask the package to download only the data added after that date.
I thought about using a scalar variable (#MAXDATE, for example), but the issue is that I would have to declare this scalar in a session with our local server, and I cannot use it as a condition in an OLE DB Source task, because the latter implies a new session, this time with the distant server.
I can only view the database on the distant server and import it. So no way to create a table on it.
I hope it is clear enough. Would you have any tips to solve this problem?
Thank you in advance.
Ozgur
You can do this easily by using an Execute SQL Task, a Data Flow task, and one variable. I would probably add some error checking in case no value is found on the local system, but that depends very much on what might go wrong.
Assuming VS2008
Declare a package-level variable of type datetime. Give it an appropriate default value.
Create an Execute SQL Task with a query that returns the appropriate date value. On the first page of the properties window, make sure the Result Set is set to "Single Row." On the Result Set page, map the date column to the package variable.
Create a Data Flow task. In the OLE DB Source, write your query to include a question mark for the incoming date value, e.g. "and MaxDate > ?". Now when you click on the Parameters button, you should get a pop-up that allows you to map "Parameter0" to your package-level variable. A sketch of both queries follows.
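A sketch of the two queries (table and column names hypothetical; note that if your date column is an int in YYYYMMDD form, the variable type should match):

-- Execute SQL Task, run against the local server
-- (Result Set = Single Row, column mapped to the package variable):
SELECT MAX(DateColumn) AS MaxDate FROM dbo.LocalTable;

-- OLE DB Source, run against the distant server;
-- the ? is Parameter0, mapped to the same package variable:
SELECT *
FROM dbo.SourceTable
WHERE DateColumn > ?;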
Yesterday I added some indexes on a view in my MS SQL 2008 database. After that, it seems like all the stored procedures need to run with QUOTED_IDENTIFIER set to ON, even those that don't use the view in question.
Why is it so? Is this something I can configure on the db or do I have to update all my stored procedures to set the QUOTED_IDENTIFIER to ON? I think it is rather weird that this is required for the stored procedures not using the view.
Do these stored procedures relate to the base table(s) that the view is based upon? To quote from Creating Indexed Views:
After the clustered index is created, any connection that tries to modify the base data for the view must also have the same option settings required to create the index. SQL Server generates an error and rolls back any INSERT, UPDATE, or DELETE statement that will affect the result set of the view if the connection executing the statement does not have the correct option settings. For more information, see SET Options That Affect Results.
And it's kind of obvious, when you think about it - you're potentially going to be updating the contents of the view whenever you touch these base tables, and so you inherit the same responsibilities as when you created the index.
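If you do end up touching the procedures, note that QUOTED_IDENTIFIER is captured when a procedure is created or altered, so the fix is to re-create each one with the option ON; a minimal sketch, where the procedure name is hypothetical:

SET QUOTED_IDENTIFIER ON;
GO
ALTER PROCEDURE dbo.UpdateOrders
AS
BEGIN
    SET NOCOUNT ON;
    -- ... existing procedure body, unchanged ...
END
GO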
You can set the defaults at multiple levels:
Any application can explicitly override any default settings by executing a SET statement after it has connected to a server. The SET statement overrides all previous settings and can be used to turn options on and off dynamically as the application runs. The option settings are applicable only to the current connection session.
OLE DB and ODBC applications can specify the option settings that are in effect at connection time by specifying option settings in connection strings. The option settings are applicable only to the current connection session.
SET options can be specified for a SQL Server ODBC data source by using the ODBC application in Control Panel or the ODBC SQLConfigDataSource function.
Default settings for a database. You can specify these values by using ALTER DATABASE or the Object Explorer in SQL Server Management Studio (see the example after this list).
Default settings for a server. You can specify these values by using either sp_configure or Object Explorer in SQL Server Management Studio to set the server configuration option named user options.
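For example, at the database level (database name hypothetical; keep in mind that OLE DB and ODBC clients explicitly SET QUOTED_IDENTIFIER ON at connect time, which overrides the database default):

ALTER DATABASE MyDatabase SET QUOTED_IDENTIFIER ON;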
We have a classic ASP application that simply works, and we have been loath to modify the code lest we invoke the wrath of some long-dead Greek gods.
We recently had the requirement to add a feature to the application. The feature implementation is really just a database operation and requires minimal change to the UI.
I changed the UI and made the minor modification to submit a new data value to the sproc call (sproc1).
In sproc1 that is called directly from ASP, we added a new call to another sproc that happens to be located on another server, sproc2.
Somehow, this does not work via our ASP app, but works in SQL Management Studio.
Here's the technical details:
SQL 2005 on both database servers.
A SQL login is authenticating from the ASP application to SQL 2005 Server 1.
Linked server from Server 1 to Server 2 is working.
When executing sproc1 from SQL Management Studio - works fine. Even when credentialed as the same user our code uses (the application sql login).
sproc2 works when called independently of sproc1 from SQL Management Studio.
VBScript (ASP) captures an error, which is emitted in the XML back to the client. The error number is 0 and the error description is blank, both from the ADODB.Connection object and from whatever Err.Number/Err.Description yield in VBScript on the ASP side.
So without any errors, and without any way to reproduce it (i.e. through SQL Management Studio), does anyone know what the issue is?
Our current plan is to break down and dig into the code on the ASP side and make a completely separate call to Server2.sproc2 directly from ASP, rather than trying to piggy-back through sproc1.
Have you got SET NOCOUNT ON set in both stored procedures, as in the sketch below? I had a similar issue once, and whilst I can't remember exactly how I solved it at the moment, I know that had something to do with it!
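A minimal sketch of what that looks like (procedure name hypothetical); SET NOCOUNT ON suppresses the "N rows affected" messages, which classic ADO can interpret as extra resultsets:

ALTER PROCEDURE dbo.sproc1
AS
BEGIN
    -- stop "rows affected" messages from being sent to the client
    SET NOCOUNT ON;
    -- ... existing procedure body ...
END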
You could be suffering from the double-hop problem
The double-hop issue arises when the ASP/ASPX page tries to use resources that are located on a server different from the IIS server.
Windows NT Challenge/Response does not support double-hop impersonation (once passed to the IIS server, the same credentials cannot be passed on to a back-end server for authentication).
You should verify the attempted second connection using SQL Profiler.
Note that with your manual testing you are not authenticating via IIS. It's only when you initiate the sql via the ASP/X page that this problem manifests.
More resources:
http://support.microsoft.com/kb/910449
http://support.microsoft.com/kb/891031
http://support.microsoft.com/kb/810572
I had a similar problem, and I solved it by setting NOCOUNT ON and removing PRINT commands.
My first reaction is that this might not be an issue of calling cross-server, but one of calling a second proc from a first, and that this might be what's acting differently in the two different environments.
My first question is this: what happens if you remove the cross-server aspect from the equation? If you could set up a test system where your first proc calls your second proc, but the second proc is on the same server and/or in the same database (see the sketch below), do you still get the same problem?
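For example, a same-server version of the chain might look like this (all names hypothetical); call sproc1_test from the ASP page and see whether the failure follows the nesting or the linked server:

-- both procedures on Server 1, in the same database
CREATE PROCEDURE dbo.sproc2_test
AS
BEGIN
    SET NOCOUNT ON;
    -- ... copy of Server2's sproc2 body ...
END
GO
CREATE PROCEDURE dbo.sproc1_test
AS
BEGIN
    SET NOCOUNT ON;
    EXEC dbo.sproc2_test;
END
GO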
Along these same lines: In my experience, when the application and SSMS have gotten different results like that, it has often been an issue of the stored procedures' settings. It could be, as Luke says, NOCOUNT. I've had this sort of thing happen from extraneous PRINT statements in the code, although I seem to remember the PRINTed value becoming part of the error description (very counterintuitively).
If anything is returned in the Messages window when you run this in SSMS, find out where it is coming from and make it stop. I would have to look up the technical terms, but my recollection is that different querying environments have different sensitivities to "errors", and that a default connection via SSMS will not throw an error at certain times when an ADO connection from a scripting language will.
One final thought: in case it is an environment thing, try different settings on your ASP page's connection string. E.g., if you have an OLEDB connection, try ODBC. Try the native and non-native SQL Server drivers. Check out what connection string options your provider supports, and try any of them that seem like they might be worth trying.
Example code might help :) Are you trying to return two tables from the stored procedure? I don't think ADO 2.6 can handle multiple tables being returned.
I did consider that (double-hop), but what is the difference between a sproc-in-a-sproc call like I am referring to vs. a typical cross-server join via INNER JOIN? Both would be executed on Server1, using the Linked Server credentials, and authenticating to Server 2.
Can anyone confirm that calling a sproc cross-server is different than doing a join on data tables? And why?
If the linked server config uses a SQL account, is that still considered a double-hop (since what you refer to is an NTLM double-hop)?
In terms of whether multiple resultsets are coming back - no. Both Server1.Sproc1 and Server2.Sproc2 would be "ExecuteNonQuery()" in the .net world and return nothing (no resultsets and no return values).
Try checking the permissions to the database for the user specified in the connection string.
Use the same user name from the connection string to log in to the database while using SQL Management Studio.
Create a temporary table to write the intermediate values and exceptions to, since that can be an effective way of debugging your application; see the sketch below.
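A minimal sketch of that debugging approach (table and step names hypothetical):

-- inside sproc1, log progress so you can see how far it gets when called from ASP
CREATE TABLE dbo.DebugLog (
    LoggedAt DATETIME     NOT NULL DEFAULT GETDATE(),
    Step     VARCHAR(100) NOT NULL
);

INSERT INTO dbo.DebugLog (Step) VALUES ('before remote call');
-- EXEC Server2.RemoteDb.dbo.sproc2 ...;
INSERT INTO dbo.DebugLog (Step) VALUES ('after remote call');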
Can I just check: You made the addition of sproc2? Prior to that it was working fine for ages.
Could you not change where you call sproc2 from? Rather than calling it from inside sproc1, can you call it from the ASP? That way you control the authentication to SQL in the code, and don't have to rely on setting up any trusts or shared remote authentication on the servers.
How is your linked server set up? You generally have some options as to how it authenticates to the remote server, which include logging in as the currently logged in user or specifying a SQL login to always use. Have you tried setting it to always use a specific account? That should eliminate any possible permissions issues in calling the remote procedure...
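For example, a mapping that always uses a fixed SQL login on the remote server might look like this (server and login names hypothetical):

-- map all local logins to one SQL login on the linked server 'Server2'
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'Server2',
    @useself     = 'FALSE',   -- don't pass through the caller's credentials
    @locallogin  = NULL,      -- applies to all local logins
    @rmtuser     = N'app_login',
    @rmtpassword = 'strong_password_here';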