My event log is cluttered with "Package '<name>' finished successfully" messages; is there any way to stop these from being added to the log?
The packages in question run very frequently and are making the event log harder to use.
This is running on SQL Server 2008 R2 (Standard).
The job properties are set to write to the Windows Application event log only when the job fails (and to email an operator), and in the corresponding maintenance plan the "Reporting and Logging" settings all have nothing checked.
The SQL Server Agent properties are set only with the fail-safe operator, notified by email.
For the life of me, I cannot see anywhere in SQL Server where I can suppress the "success" messages, and would appreciate any help.
I have just encountered a very similar scenario.
I have a couple of packages that are scheduled frequently. Monitoring of these non-critical packages can be managed within SQL Server Management Studio itself, so I have no need to log events to the Windows Application log.
In fact, the logs are now "bloating", insofar as they are filling at a far greater pace than I am happy with.
It is possible to switch package logging on or off within the SSIS package itself.
From within BIDS (Business Intelligence Development Studio), right-click anywhere within the control flow and select "Logging..." from the menu that appears.
From what I have read, to set your own custom logging options you have to tick the option on in the "Providers and Logs" screen and then add the "SSIS log provider for Windows Event Log" provider.
Once you have done that, you can tick options on and off within the "Details" tab. The options are all unchecked by default.
Alternatively, you could set up logging with the "SSIS log provider for SQL Server" and select the items that you do want to monitor. This then logs activity to a table called dbo.sysssislog in whichever database you configure within the provider.
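As a rough illustration, assuming the provider is pointed at a database called, say, SSISLogs (the name is just a placeholder), recent package events can then be reviewed with a query along these lines:

-- sysssislog is created automatically by the SSIS log provider for SQL Server
SELECT TOP (100)
       source,     -- package or task name
       event,      -- PackageStart, PackageEnd, OnError, OnWarning, ...
       starttime,
       endtime,
       message
FROM   SSISLogs.dbo.sysssislog
ORDER BY starttime DESC;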
You can get details on SSIS package logging here: http://msdn.microsoft.com/en-gb/library/ms181205(v=sql.100).aspx
Here is some background information. Both the cube and the agent are located on the same server; let's call this x.
The database that is used by the cube is located on a different server; let's call this datasource.
The funny thing is that if we process the cube manually from Microsoft Analysis Server it works perfectly every time.
Note that we run all of this in Azure and the data source is SQL Server.
When we start the agent job it runs for quite a while, but then we get an error with this message:
Executed as user: NT Service\SQLSERVERAGENT.
<return xmlns="urn:schemas-microsoft-com:xml-analysis">
<root xmlns="urn:schemas-microsoft-com:xml-analysis:empty">
<Messages xmlns="urn:schemas-microsoft-com:xml-analysis:exception">
<Warning WarningCode="1092091904" Description="Note. Analysis Server has persisted any security information specified as part of the Connection string portion of the Datsource object definition." Source="Microsoft SQL Server 2017 Analysis Services Managed Code Module"
HelpFile="" />
</Messages>
</root>
</return>
When we click "Start Job at Step..." it shows Actions and status.
The first action, Start Job SsasLoad_Ramundberget_Axess, gives the status Success.
The second action, Execute Job SsasLoad_Ramundberget_Axess, first shows the status In progress for about 80 seconds and then the status Error.
I have been searching for a solution to this problem, and everything ends up pointing to a permission issue.
A few people have had this same problem, and all of them point in the permission direction.
We have tried the following.
We have logged in to Visual Studio with an admin account and deployed it, but we got the same error.
I have a theory.
I think the user that tries to connect to the data source is different when we process the cube manually and when we run the agent job.
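To check which account or proxy the agent job step actually runs under, a query along these lines against msdb should show it (just a sketch; the job name is the one from the steps above):

SELECT j.name AS job_name,
       s.step_id,
       s.step_name,
       s.subsystem,
       p.name AS proxy_name,      -- NULL means the step runs as the Agent service account
       c.credential_identity
FROM   msdb.dbo.sysjobs j
JOIN   msdb.dbo.sysjobsteps s ON s.job_id = j.job_id
LEFT JOIN msdb.dbo.sysproxies p ON p.proxy_id = s.proxy_id
LEFT JOIN sys.credentials c ON c.credential_id = p.credential_id
WHERE  j.name = N'SsasLoad_Ramundberget_Axess';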
I am the one that posted the initial question, and I now have additional information to share.

To see the flow of data that happens on the data source, I use Azure Data Studio (SQL Server Profiler). As I mentioned in my initial question, it works every time, with status Success, when I update the cube (tabular model) manually from Microsoft Analysis Server. Here is a screenshot of the Azure Data Studio log from when I process the cube manually with process mode Full: https://i.stack.imgur.com/FMcBJ.png

To automate this, we created a scheduled job that runs every hour. To make sure the agent job works, we start it manually, which kicks off the cube update; in reality it runs in the background and triggers the cube to update itself. We use SQL Server Agent to process the cube (tabular model). Here is the JSON used to set up the agent job step:

{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "Axess_Ramundberget"
      }
    ]
  }
}

I want to point out that when we use the agent to trigger the cube to update itself, it does not work 95% of the time. Here is a screenshot from the Azure Data Studio SQL Server profiler of how it looks when the status is Error: https://i.stack.imgur.com/Ms075.png

One really funny thing is that when we get the error status, the cube is still updated correctly despite the error. But we really do not want false information saying Error when the cube has in fact been updated successfully from the data source. In the initial question I thought it had to do with permissions, but as far as I understand it cannot be a permission issue. The error we get when starting the agent job is always the same one I mentioned in the initial question.

I have been looking really thoroughly for a solution to our problem, but very few people have had this kind of problem. If I compare the data stream in the Azure Data Studio SQL profiler between running the cube manually and running the agent job, they are very similar. Here is a screenshot of the SQL Server Agent job history: https://i.stack.imgur.com/keDVl.jpg

One more funny thing: when we use the Azure Data Studio SQL profiler to log, it sometimes works, but if we do not use the profiler it never succeeds; with the profiler running it sometimes succeeds, as the screenshot shows. Every success shown is from a run where the profiler was attached.
I'm using SSRS (SQL Server Reporting Services) to display reports; my data source is Snowflake.
I have installed the Snowflake ODBC driver and configured it properly.
Click here to view the ODBC configuration
I have created a shared data source on the SSRS server (via Report Manager), put in my own credentials, and the connection works fine.
Click here to view the connection on the SSRS Server
I'm able to build the SSRS report without any issues; when I run the report, everything works fine. I can publish the report to the server and it renders perfectly fine in the browser.
The issue is when I go back to the report the next day, I'm presented with an error:
An error has occurred during report processing. (rsProcessingAborted)
Query execution failed for dataset
'insert_name_of_my_dataset_here'. (rsErrorExecutingCommand)
ERROR [57P03] No active warehouse selected in the current session.
Select an active warehouse with the 'use warehouse' command.
So this also means that the following don't work either:
Subscriptions
Cache refresh
Snapshots
The only thing that works is to open my report in SSRS Report Builder, right-click EACH of my datasets ("each" is very important; it doesn't work if I don't do all of them), and run the queries manually for each of them. The "connection" or "session" is then "re-activated" and the report runs fine, both locally AND on the server. Note that I do not have to re-publish the report on the server for it to run.
Click here to view screenshots of my process
Steps I have taken to address the issue (that didn't yield any resolution):
I have tried putting the "use warehouse WAREHOUSE_NAME;" command before each dataset's SQL script, but Snowflake's API doesn't allow multiple SQL commands to be sent in a single request. I saw that this functionality is in the development pipeline for Snowflake and found this link: https://github.com/snowflakedb/snowflake-connector-net/issues/33 - the work was started in 2018, and the last update, from Apr 2019, says they are starting to address the JDBC driver; no mention of the ODBC driver yet.
I have set the Snowflake parameter CLIENT_SESSION_KEEP_ALIVE to true (https://docs.snowflake.com/en/sql-reference/parameters.html#client-session-keep-alive), but according to the community portal: "A similar 'keep alive' parameter is not currently available for the ODBC driver. Instead, you could issue a dummy query every few hours to keep the connection alive." (https://community.snowflake.com/s/article/faq-how-long-can-my-jdbcodbc-connection-remain-idle)
I have tried to create a cache refresh plan or a snapshot schedule that caches or snapshots the report every 3 hours; it works for the first scheduled run, but fails with the error for the subsequent ones.
The only thing I didn't try is to have Snowflake never close the connection and keep the warehouse in the "started" state indefinitely... but this would increase my cost, and I'm pretty sure it won't work anyway since the session would still end after 4 hours...
Any assistance is welcome!
Thanks
Specs:
SSRS 2014
Snowflake X-small
ODBC 64-bit driver, installed from the Snowflake driver repository (tested with 32-bit also, but 64-bit is the one that is visible to SSRS)
I faced the same kind of issue and fixed it by granting the corresponding role access to the warehouse.
On the warehouse, grant the role USAGE.
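In Snowflake terms that is roughly the following (the role and warehouse names here are just placeholders):

GRANT USAGE ON WAREHOUSE CONSUMER_WH TO ROLE REPORTING_ROLE;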
Could it be related to the warehouse name (in the ODBC settings)? Is there a typo: COSNUMER_WH or CONSUMER_WH?
I strongly recommend setting default "context" configurations for situations like this, setting default role, warehouse, database, and schema with commands such as this:
ALTER USER xyz SET DEFAULT_WAREHOUSE = 'WH_NAME_HERE' ;
https://docs.snowflake.com/en/sql-reference/sql/alter-user.html
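For completeness, a sketch covering all four defaults might look like this (the user, role, warehouse, database, and schema names are placeholders; DEFAULT_NAMESPACE sets the default database and schema):

-- Default role, warehouse, and namespace for the account SSRS connects with
ALTER USER SSRS_SERVICE_USER SET DEFAULT_ROLE      = 'REPORTING_ROLE';
ALTER USER SSRS_SERVICE_USER SET DEFAULT_WAREHOUSE = 'CONSUMER_WH';
ALTER USER SSRS_SERVICE_USER SET DEFAULT_NAMESPACE = 'REPORTING_DB.PUBLIC';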
I cannot find a similar question. I have an SSIS package that contains a Visual Basic script task with the following line in it - msgbox("some text"). It runs fine from BIDS and when manually executed from MSDB, but when I schedule it in SQL Server Agent the package seems to run fine until that point and then completes. The message box does not appear, and none of the tasks after that actually run. The scheduled job reports complete and success. Can you point me to the right solution? I believe it has something to do with the SSIS proxy account and its security, but I can't find anything. Does anyone know how to resolve this?
Here's a snapshot of my code. As you can see, I'm firing off lots of message boxes in an attempt to log which steps are working within my package.
' Open the CSV, clean up column B, and save the file as .xlsx
xworkbook = ExcelObject.Workbooks.Open("C:\xxx.csv")
xworksheet = DirectCast(xworkbook.Sheets(1), Excel.Worksheet)
MsgBox("csv")
xworksheet.Range("B:B").Replace(What:=",", Replacement:="")
MsgBox("replace 1")
xworksheet.Range("B:B").Replace(What:=".", Replacement:="")
MsgBox("replace 2")
xworkbook.SaveAs("C:\xxx.xlsx", FileFormat:=51) ' 51 = xlOpenXMLWorkbook (.xlsx)
MsgBox("saved")
I believe that the reason it won't work is that when you run the SSIS task as a scheduled job, it doesn't run in the context of your account but rather under the SQL Server Agent service account, so the message box won't show for you. The message box isn't valid for a non-interactive task.
#jpw hit the nail on the head for "The messagebox isn't valid for a non-interactive task."
To make it work you either need to strip the message boxes out of your code or inspect the value of the Boolean variable System::InteractiveMode.
The code is approximately:
If CBool(Dts.Variables("System::InteractiveMode").Value) = True Then
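' InteractiveMode is True only when the package runs interactively (e.g. in BIDS); it is False under SQL Agent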
....
End If
Assorted references
Microsoft SQL Server 2008 Integration Services Unleashed
How do I disable interactive message boxes from inside an SSIS package?
You overcome this by changing your process so that no user intervention is needed. There is no reason you should ever have a message box in SSIS except for debugging.
I have a package that I'm using to load records from a CSV file into a table. It has three elements in the control flow:
Truncate table
Load File into Table
Verify that there are records on the table after the load or raise an error
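For illustration, the first and third elements are plain Execute SQL tasks along these lines (the table name is made up):

-- Step 1: clear out the previous load
TRUNCATE TABLE dbo.StagingTable;

-- Step 3: fail the package if nothing was loaded
IF NOT EXISTS (SELECT 1 FROM dbo.StagingTable)
    RAISERROR('No rows were loaded from the CSV file.', 16, 1);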
The idea is to have a single transaction on the package, so that if the load fails or the file was empty, the transaction is rolled back and the table isn't truncated.
To enable the transaction I just go to the package properties and set TransactionOption=Required; then I try to execute the package and get this error while trying to execute the first element (the SQL task that tries to truncate the table):
[Execute SQL Task] Error: Failed to acquire connection "Database Connection". Connection may not be configured correctly or you may not have the right permissions on this connection.
If I just go back and change the TransactionOption property of the package to the default (Supported), then the package executes correctly, but if there's an error there's no rollback.
I am using ADO.NET to connect to a SQL Server DB.
Any idea of what am I doing wrong? Is this the correct way to use transactions or am I missing something?
Thanks!
I know this is an old topic, but I had the same problem as you - the package works fine until I set one of the containers' TransactionOption property to Required.
From what I understand, this might be related to the Microsoft Distributed Transaction Coordinator (MSDTC) service not being started on the SQL server.
When I had this issue I checked if MSDTC is started on the machine on which I was running the package - it was. Sadly, I couldn't access the SQL server to check the same thing.
But, following these steps on the machine running the package solved the problem:
On Windows Server 2008 and Windows Vista:
Click Start, click Run, and type dcomcnfg to launch the Component Services Management console.
Click to expand Component Services and click to expand Computers.
Click to expand My Computer, click to expand Distributed Transaction Coordinator, right-click Local DTC, and click Properties.
Click the Security tab of the Local DTC Properties dialog.
In that dialog box, I had to enable "Network DTC Access" and also "Allow Inbound" and "Allow Outbound".
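If you can run queries on the server, a quick rough check from the SQL side of whether MSDTC is reachable is to try starting a distributed transaction; when the service isn't available, this typically fails with an "MSDTC on server ... is unavailable" error:

-- Typically raises error 8501 if the Distributed Transaction Coordinator is not available
BEGIN DISTRIBUTED TRANSACTION;
ROLLBACK TRANSACTION;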
Sources:
msdn forum about this
msdn article about troubleshooting Problems with MSDTC
Just recently started having issues with an SQL Server Agent Job that contains an SSIS package to extract production data and summarize it into a separate reporting database.
I think that some of the Alerts/Notifications settings I tried playing with caused the problem as the job had been running to completion unattended for the previous two weeks.
So... Where's a good place to start reading up on SQL Agent Alerts and Notifications? I want to enable some sort of alert/notification so that I'm always informed:
That the job completes successfully (as a check to ensure that it's always executed), or
That the job ran into some sort of error, which should include enough info (such as error number) that I can diagnose the cause of the error
As always, any help will be greatly appreciated!
Books Online is probably a good place to start (or at least I like it and generally find it useful).
SQLMenace and bofe made some good points. Here's my additional two cents:
I'd recommend configuring Database Mail rather than SQL Mail (i.e. SMTP vs. MAPI, which I think is deprecated anyway). Once you get the mail profile configured, you'll have to also configure the SQL agent to use that mail profile (which is just a page of settings for the agent properties), or else your SSIS job notifications won't actually get sent, even though you can successfully send a test email from Management Studio.
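For reference, a quick test send from Management Studio looks something like this (the profile name and address are placeholders for whatever you configured):

EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'SQLAgentMailProfile',  -- placeholder profile name
     @recipients   = 'dba@example.com',
     @subject      = 'Database Mail test',
     @body         = 'If you can read this, the mail profile works.';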
I don't use alerts as often as job notifications, so the only tricky thing I can recall about them is that if you're raising an error and you want the alert to email you when that happens, you have to make sure that the raised error gets written to the log. I think that just boils down to "RAISERROR ... WITH LOG"; here's the BOL link for the syntax details.
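In other words, something along these lines (the message text and severity are just an example):

RAISERROR('Nightly load failed: staging table row count is zero.', 16, 1) WITH LOG;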
In each step of the job, click on Advanced; from there you can log to a file or to a table. This will have all the error codes and other details about why the job failed.
You should be able to see this also from the job history.
Right-click on the job --> View History, click on the + sign to expand, then click on each step and the details will be in the lower panel.
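If you prefer to query the history directly, the same information lives in msdb; a rough sketch (the job name is a placeholder):

SELECT TOP (50)
       j.name   AS job_name,
       h.step_id,
       h.step_name,
       h.run_status,   -- 0 = failed, 1 = succeeded, 2 = retry, 3 = canceled
       h.run_date,
       h.run_time,
       h.message        -- includes the error text for failed steps
FROM   msdb.dbo.sysjobhistory h
JOIN   msdb.dbo.sysjobs j ON j.job_id = h.job_id
WHERE  j.name = N'MyNightlyLoad'
ORDER BY h.instance_id DESC;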
To set up notifications you need to create an operator, and then in the job, on the Notifications tab, you pick that operator from the email dropdown.
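The same thing can be scripted, roughly like this (the operator name, email address, and job name are placeholders):

-- Create the operator that will receive the emails
EXEC msdb.dbo.sp_add_operator
     @name          = N'ETL Operator',
     @email_address = N'dba@example.com';

-- Email that operator whenever the job completes (success or failure)
EXEC msdb.dbo.sp_update_job
     @job_name                   = N'MyNightlyLoad',
     @notify_level_email         = 3,  -- 1 = on success, 2 = on failure, 3 = on completion
     @notify_email_operator_name = N'ETL Operator';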
You'll want to have "When the job completes" marked in your notifications page on the job's properties.
Just go to that dropdown and switch it to job completion instead of failure (which is on the screenshot).
You'll also want to make sure that your server has e-mail configured. I think it's under SQL Surface Area Configuration for Features.