SQL Server 2008 R2 Standard data-driven subscriptions workaround - sql-server

So after some research I figured out that Standard edition does not have the ability to manage data-driven subscriptions, so you have to write a custom script or stored procedure to work around it and get the same result.
My goal is to be able to edit our existing reports (most of them are done with SSRS, but we have a number of them created as SQL Server Agent jobs) so that they only e-mail the report if data is available; if no rows come up, I want it to cancel sending the e-mail.
Before tackling an existing report, I tried creating a simple test script to get a better understanding of Database Mail and stored procedures, so I came up with this script:
IF Exists ( Select cht_number, cht_itemcode, cht_description from chargetype where last_updatedate>'11/11/2014')
execute msdb.dbo.sp_send_dbmail
@profile_name=Null,
@recipients='email@company.com',
@subject='Test',
@execute_query_database='DB_Name',
@query='Select cht_number, cht_itemcode, cht_description from chargetype where last_updatedate>''11/11/2014''',
@attach_query_result_as_file=1,
@query_attachment_filename='TEST.csv',
@query_result_no_padding=1,
@query_result_header = 1,
@query_result_width = 256,
@query_result_separator=' '
IF @@ROWCOUNT = 0 raiserror ('No Data', 16, 1)
To test this script, I would edit something in that table so that only the most recent items would be sent in the report. If no data was available, it would just raise the error "No Data".
Can anyone suggest another way of getting this result, or how I could shorten this into something like a template I could adapt to fit an existing script?
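One way to turn this into a reusable pattern is to capture the row check in a variable and only call sp_send_dbmail when rows exist. This is a sketch only, reusing the table, date filter, recipients, and file name from the script above as placeholders:

-- Sketch: send the report only when the check query returns rows.
-- Table name, date filter, recipients, and file name are placeholders from the script above.
DECLARE @RowsFound int;

SELECT @RowsFound = COUNT(*)
FROM chargetype
WHERE last_updatedate > '11/11/2014';

IF @RowsFound > 0
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = NULL,   -- default Database Mail profile
        @recipients = 'email@company.com',
        @subject = 'Test',
        @execute_query_database = 'DB_Name',
        @query = 'Select cht_number, cht_itemcode, cht_description from chargetype where last_updatedate > ''11/11/2014''',
        @attach_query_result_as_file = 1,
        @query_attachment_filename = 'TEST.csv',
        @query_result_separator = ',';
END
ELSE
BEGIN
    PRINT 'No data - e-mail skipped.';
END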

Related

SQL Server send email when query result is not empty

I need to send an email from SQL Server when a query result set holds records. The query can be based on a lot of logic with joins between several tables.
Please point me in the right direction (views, triggers on views, SQL Server Agent job...?).
Use sp_send_dbmail as documented here (https://learn.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-send-dbmail-transact-sql?view=sql-server-2017) for all the parameter options:
declare @bodytext varchar(max)= '<b>Hey look I wrote something</b>'
if(Exists(select 1 from ....))
begin
EXEC msdb.dbo.sp_send_dbmail
@recipients='xyz@gmail.com',
@subject='ATTN! There are records',
@body=@bodytext,
@body_format='HTML',
@from_address='DBA <kl@domain.com>',
@reply_to='xyz@gmail.com'
end
Take a look at the Vsql-email app (you can search for it on Google; I think I'm not allowed to post the direct link here). It has an option to not send the email if the query returns 0 rows, and it can send the email as an HTML-formatted body and/or as an Excel attachment, with no need to enable and configure Database Mail in SQL Server or write code for HTML formatting.

Removing an object from a publication database: sys.sp_droparticle and sp_dropsubscription

SQL Server administration is not my forte, so please bear with me while I explain this.
A SQL Server 2012 cluster is involved in a change data capture (CDC) effort using a third-party CDC utility. For it to work, replication needs to be turned on; without replication, CDC will not work. The CDC taps some 2000+ tables from SQL Server in a database Db1. Out of these we found that some 200+ tables undergo truncate-and-load rather than incremental loads. So we removed those from our CDC lists, but since replication is turned on at the DB level we also need to remove these tables from the publication database, so that truncates happening to this exception list won't require replication to be switched off at the DB level (i.e., truncates to these tables and replication can co-exist; as is known, a published table cannot be truncated. The code is in prod, so replacing TRUNCATE with DELETE is not an option now, besides the fact that for billion-row tables deletes are going to be expensive and time consuming).
The above is the requirement. So, based on that, if a better solution can be conceived, do let me know.
What I tried:
EXEC sys.sp_droparticle @publication = 'pub', @article = 'art', @force_invalidate_snapshot = 1
Error I get
Msg 14013, Level 16, State 1, Procedure sp_MSrepl_droparticle, Line 104 [Batch Start Line 2]
This database is not enabled for publication.
Another SP
DECLARE @subscriber AS sysname;
EXEC sp_dropsubscription @publication = 'AR_PUBLICATION_00010', @article = 'BPA_BRGR_RUL_GRP_R', @subscriber = @subscriber
Msg 14013, Level 16, State 1, Procedure sp_MSrepl_dropsubscription, Line 55 [Batch Start Line 1]
This database is not enabled for publication.
But using the GUI I am able to uncheck the tables I don't want in that publication (right-click the publication --> Properties --> Articles --> check/uncheck whatever you want excluded). I don't have any subscriptions; there is just a publication.
Whatever the GUI ran above I can definitely run through T-SQL, but I don't know what code it actually executed. How do I get this done using a scripting approach? I have 200+ tables to deal with, and unchecking them one by one isn't practical.
Nearly four years late, but in case it helps anyone... I think you want sp_dropmergearticle not sp_droparticle.
EXEC sys.sp_dropmergearticle @publication = 'pub', @article = 'art', @force_invalidate_snapshot = 1
I was getting an identical error message using sp_droparticle, but sp_dropmergearticle removed the table from the publication and allowed me to delete it.
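To handle all 200+ tables without unchecking them one by one, a loop over the article names could work. This is only a sketch: the publication name and article list below are placeholders taken from the question, and you would substitute your own.

-- Sketch: drop a list of articles from a merge publication in one pass.
DECLARE @articles TABLE (article sysname);

-- Placeholder article names; load your 200+ tables here (or from replication metadata).
INSERT INTO @articles (article)
VALUES ('BPA_BRGR_RUL_GRP_R'), ('SomeOtherTable');

DECLARE @article sysname;

DECLARE article_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT article FROM @articles;

OPEN article_cursor;
FETCH NEXT FROM article_cursor INTO @article;

WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sys.sp_dropmergearticle
        @publication = 'AR_PUBLICATION_00010',
        @article = @article,
        @force_invalidate_snapshot = 1;

    FETCH NEXT FROM article_cursor INTO @article;
END

CLOSE article_cursor;
DEALLOCATE article_cursor;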
Whatever the GUI ran above I can definitely run through T-SQL, but I don't know what code it actually executed. How do I get this done using a scripting approach?
SSMS does not have a special API. Everything it does, it does through T-SQL. So use SQL Profiler to watch what SSMS does, and capture the script.

SSRS subscription only when data in report

I have setup a report that uses a stored procedure to create the dataset. I am sending this report to 4 users using an email subscription on a daily basis. More often than not, the report will not have any data. How do I get the subscription to only send the email when the report has data?
For anyone else who stumbles across this: you can try this little trick. It will throw an error, which will prevent the report from being sent.
IF @@ROWCOUNT = 0 RAISERROR('No Results', 16, 1);
Use a data-driven subscription if you're on Enterprise edition. If not, you'll need to check whether any rows were returned by the SP first, and only fire the subscription if there are. This is pseudo-SQL, in the absence of a lot of context, but something like:
IF EXISTS(SELECT 1 FROM YourTable JOIN YourOtherTable ON ... WHERE ...) BEGIN
EXEC ReportServer.dbo.AddEvent @EventType='TimedSubscription', @EventData='a6c151ca-ff47-46c0-b807-ad1ac8116769';
END
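The @EventData GUID is the subscription's ID. One way to look it up, assuming the default ReportServer catalog database and a hypothetical report name, is something like:

-- Sketch: find the SubscriptionID to pass as @EventData for a given report.
SELECT s.SubscriptionID, c.[Path], s.Description
FROM ReportServer.dbo.Subscriptions AS s
JOIN ReportServer.dbo.[Catalog] AS c ON c.ItemID = s.Report_OID
WHERE c.Name = 'YourReportName';   -- placeholder report name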

Strange issue in SSIS with WITH RESULT SETS returning wrong number of columns

So I have a stored procedure in SQL Server. I've simplified its code (for this question) to just this:
CREATE PROCEDURE dbo.DimensionLookup as
BEGIN
select DimensionID, DimensionField from DimensionTable
inner join Reference on Reference.ID = DimensionTable.ReferenceID
END
In SSIS on SQL Server 2012, I have a Lookup component with the following source command:
EXECUTE dbo.DimensionLookup WITH RESULT SETS (
(DimensionID int, DimensionField nvarchar(700) )
)
When I run this procedure in Preview mode in BIDS, it returns the two columns correctly. When I run the package in BIDS, it runs correctly.
But when I deploy it out to the SSIS catalog (the same server the database is on), point it to the same data sources, etc. - it fails with the message:
EXECUTE statement failed because its WITH RESULT SETS clause specified 2 column(s) for result set number 1, but the statement sent
3 column(s) at run time.
Steps Tried So Far:
Adding a third column to the result set - I get a different error, VS_NEEDSNEWMETADATA - which makes sense; it's kind of proof there's no third column.
SQL Profiler - I see this:
exec sp_prepare @p1 output,NULL,N'EXECUTE dbo.DimensionLookup WITH RESULT SETS ((
DimensionID int, DimensionField nvarchar(700)))',1
SET FMTONLY ON exec sp_execute 1 SET FMTONLY OFF
So it's trying to use FMTONLY to get the result set data ... needless to say, running SET FMTONLY ON and then running the command in SSMS myself yields .. just the two columns.
SET NOCOUNT ON - Nothing changed.
So, two other interesting things:
I deployed it out to my local SQL 2012 install and it worked fine, same connections, etc. So it may be a server / database configuration issue. Not sure what, if anything, it is; I didn't install the dev server, and my own install was pretty much click-through vanilla.
Perhaps the most interesting thing. If I remove the join from the procedure's statement so it just becomes
select DimensionID, DimensionField from DimensionTable
It goes back to just sending 2 columns in the result set! So adding a join, without adding any additional output columns, ups the result set to 3 columns. Even if I add 6 more joins, still just 3 columns. So one guess is it's some sort of metadata column that only gets activated when there's a join.
Anyway, as you can imagine, it's driving me kind of mad. I have a workaround to load the data into a temp table and just return that, but why won't this work? What extra column is being sent back? Why only when I add a join?
Gah!
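For reference, the temp-table workaround mentioned in the question could look roughly like this (a sketch based on the simplified procedure above, so the result SSIS sees comes from a plain two-column SELECT):

-- Sketch: materialize the join into a temp table, then return from the temp table.
ALTER PROCEDURE dbo.DimensionLookup AS
BEGIN
    SET NOCOUNT ON;

    SELECT d.DimensionID, d.DimensionField
    INTO #DimensionLookup
    FROM DimensionTable AS d
    INNER JOIN Reference AS r ON r.ID = d.ReferenceID;

    SELECT DimensionID, DimensionField
    FROM #DimensionLookup;
END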
So all credit to billinkc: the cause is a bug that has since been patched.
In Version 11.0.2100.60, SSIS Lookup SQL command metadata is gathered using the old SET FMTONLY method. Unfortunately, this doesn't work in 2012, as the Books Online entry on SET FMTONLY helpfully notes:
Do not use this feature. This feature has been replaced by sp_describe_first_result_set.
Too bad they didn't follow their own advice!
This has been patched as of version 11.0.2218.0. Metadata is correctly gathered using the sp_describe_first_result_set system stored procedure.
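If you want to check what metadata SQL Server itself reports for the procedure, you can call sp_describe_first_result_set directly; this is just an illustrative check, not part of the fix:

-- Sketch: ask SQL Server which columns it thinks the first result set contains.
EXEC sys.sp_describe_first_result_set
    @tsql = N'EXEC dbo.DimensionLookup';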
This can happen if the WITH RESULT SETS clause specified in SSIS declares a different number of columns than the stored proc being called actually returns. Check your stored proc and ensure that its output columns match the WITH RESULT SETS clause.

How to log in T-SQL

I'm using ADO.NET to access SQL Server 2005 and would like to be able to log from inside the T-SQL stored procedures that I'm calling. Is that somehow possible?
I'm unable to see output from the PRINT statement when using ADO.NET, and since I want to use logging just for debugging, the ideal solution would be to emit messages to DebugView from SysInternals.
I think writing to a log table would be my preference.
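A minimal version of that idea, with illustrative object names, might be:

-- Sketch: a simple log table plus a helper proc to write to it.
CREATE TABLE dbo.DebugLog (
    LogID    int IDENTITY(1,1) PRIMARY KEY,
    LoggedAt datetime NOT NULL DEFAULT GETDATE(),
    Message  nvarchar(4000) NOT NULL
);
GO
CREATE PROCEDURE dbo.LogDebug @message nvarchar(4000)
AS
    INSERT INTO dbo.DebugLog (Message) VALUES (@message);
GO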
Alternatively, as you are using 2005, you could write a simple SQLCLR procedure to wrap around the EventLog.
Or you could use xp_logevent if you wanted to write to the SQL Server log.
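For example (the error number just has to be a user-defined value of 50001 or higher; the message text here is illustrative):

-- Writes an informational message to the SQL Server error log and the Windows application event log.
EXEC master.dbo.xp_logevent 60000, 'usp_MyProc: reached step 3', 'informational';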
I solved this by writing a SQLCLR-procedure as Eric Z Beard suggested. The assembly must be signed with a strong name key file.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class StoredProcedures
{
    [Microsoft.SqlServer.Server.SqlProcedure]
    public static int Debug(string s)
    {
        System.Diagnostics.Debug.WriteLine(s);
        return 0;
    }
}
Created a key and a login:
USE [master]
CREATE ASYMMETRIC KEY DebugProcKey FROM EXECUTABLE FILE =
'C:\..\SqlServerProject1\bin\Debug\SqlServerProject1.dll'
CREATE LOGIN DebugProcLogin FROM ASYMMETRIC KEY DebugProcKey
GRANT UNSAFE ASSEMBLY TO DebugProcLogin
Imported it into SQL Server:
USE [mydb]
CREATE ASSEMBLY SqlServerProject1 FROM
'C:\..\SqlServerProject1\bin\Debug\SqlServerProject1.dll'
WITH PERMISSION_SET = unsafe
CREATE FUNCTION dbo.Debug( @message as nvarchar(200) )
RETURNS int
AS EXTERNAL NAME SqlServerProject1.[StoredProcedures].Debug
Then I was able to log in T-SQL procedures using
exec dbo.Debug @message = 'Hello World'
You can either log to a table, by simply inserting a new row, or you can implement a CLR stored procedure to write to a file.
Be careful with writing to a table, because if the action happens in a transaction and the transaction gets rolled back, your log entry will disappear.
Logging from inside a SQL sproc would be better done to the database itself. T-SQL can write to files but it's not really designed for it.
There's the PRINT command, but I prefer logging into a table so you can query it.
You can write rows to a log table from within a stored procedure. As others have indicated, you could go out of your way to write to some text file or other log with CLR or xp_logevent, but it seems like you need more volume than would be practical for such uses.
The tough cases occur (and it's these that you really need your log for) when transactions fail. Since any logging that occurs during these transactions will be rolled back along with the transaction that they are part of, it is best to have a logging API that your clients can use to log errors. This can be a simple DAL that either logs to the same database, or to a shared one.
For what it's worth, I've found that when I don't assign an InfoMessage handler to my SqlConnection:
sqlConnection.InfoMessage += new SqlInfoMessageEventHandler(MySqlConnectionInfoMessageHandler);
where the signature of the InfoMessageHandler looks like this:
MySqlConnectionInfoMessageHandler(object sender, SqlInfoMessageEventArgs e)
then my PRINT statements in my Stored Procs do not appear in DbgView.
You could use output variables for passing back messages, but that relies on the proc executing without errors.
create procedure usp_LoggableProc
@log varchar(max) OUTPUT
as
-- T-SQL statement here ...
select @log = @log + 'X is foo'
And then in your ADO code somewhere:
string log = (string)sqlCommand.Parameters["@log"].Value;
You could use raiserror to create your own custom errors with the information that you require and that will be available to you through the usual SqlException Errors collection in your ADO code:
RAISERROR('X is Foo', 10, 1)
Hmm, but yeah - I can't help feeling that just for debugging, and in your situation, it's simplest to insert varchar messages into an error table like the others have suggested, and SELECT * from it when you're debugging.
You may want to check Log4TSQL. It provides Database-Logging for Stored Procedures and Triggers in SQL Server 2005 - 2008. You have the possibility to set separate, independent log-levels on a per Procedure/Trigger basis.
Use cmd commands with xp_cmdshell.
I found this while searching for an answer to this question.
https://www.databasejournal.com/features/mssql/article.php/1467601/A-general-logging-t-sql-process-to-write-to-txt-files.htm
select @cmdtxt = 'echo ' + @logEntry + ' >> drive:\path\filename.txt'
exec master..xp_cmdshell @cmdtxt
I've been searching for a way to do this, as I am trying to debug some complicated, chained, stored procedures, all that are called by an external API, and which operate in the context of a transaction.
I'd been writing diagnostic messages into a logging table, but if the transaction rolls back, the new log entries disappear with the rollback. I found a way! And it works pretty well. And it has already saved me many, many hours of debugging time.
1. Create a linked server to the same SQL instance, using the login's security context. In my case, the simplest method was to use the localhost loopback address, 127.0.0.1.
2. Set the linked server to enable RPC, and to NOT "Enable Promotion of Distributed Transactions". This means that calls through that server will take place outside of your transaction context.
3. In your logging procedure (I have an example excerpted below), write to the log table through the loopback linked server if you are in a transaction; you can write to it the usual way if you are not. Writing through the linked server is considerably slower than direct DML. (A sketch of the linked-server setup follows this list.)
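The linked-server setup itself might look roughly like this; a sketch only, mirroring the steps above (the server name and option values are the loopback example from step 1):

-- Sketch: loopback linked server whose calls run outside the caller's transaction.
EXEC master.dbo.sp_addlinkedserver
    @server = N'127.0.0.1',
    @srvproduct = N'SQL Server';

-- Allow remote procedure calls through the linked server.
EXEC master.dbo.sp_serveroption N'127.0.0.1', N'rpc out', N'true';

-- Do NOT promote calls to a distributed transaction, so log writes survive a rollback.
EXEC master.dbo.sp_serveroption N'127.0.0.1', N'remote proc transaction promotion', N'false';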
Voila! My in-process logging survives the rollback, and I can find out what's happening internally when things are going south.
I can't claim credit for thinking of this--I found the approach after some time with Google, but I'm so pleased with the result I felt like I had to share it.
USE TX
GO
CREATE PROCEDURE dbo.LogError(@errorSource Varchar(32), @msg Varchar(400))
AS BEGIN
    SET NOCOUNT ON
    IF @@TRANCOUNT > 0
        EXEC [127.0.0.1].TX.dbo.LogError @errorSource, @msg
    ELSE
        INSERT INTO TX.dbo.ErrorLog(source_module, message)
        SELECT @errorSource, @msg
END
END
GO
