We have two SQL Server databases. The source database is populated from an external system, and the destination database lives on a remote server (used by a web app). An SSIS package maps columns from the source tables to the destination tables (the column names differ) and copies the data to keep them in sync.
To verify that both databases are in sync, we have a stored procedure that returns record counts and, for some parent-child relationships, the child count for each parent record (e.g. item count per brand). Someone has to log on to both servers, execute the SP manually, and then compare the results to confirm that the two databases are in sync.
Now, to automate this process, we've done the following:
Added the destination server as a linked server
Used EXEC msdb.dbo.sp_send_dbmail with @attach_query_result_as_file = 1
Created an SSIS job that executes the email SP against both servers
This gives us two emails with the query results attached, and comparing the two text files completes the db sync check.
I believe this can be made better, now that we're able to access the destination server as a linked server. This is my first time doing this, so I'd appreciate it if some experienced folks could share their approach, ideally something beyond a join query across the linked server.
Since you have access to the destination server as a linked server, you can run a query directly and compare the data.
You can then modify the SSIS jobs to send mail based on the query result.
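As a rough sketch of that idea (the linked-server name LinkedDest, the remote database name WebDB, and the mail profile name SyncCheck are all assumed placeholders, not names from the question), the job step could send mail only when the comparison actually finds differences:

```sql
-- Hypothetical sketch: mail only when source and destination differ.
-- LinkedDest, WebDB, and SyncCheck are assumed names; adjust to your setup.
IF EXISTS (
    SELECT 1 FROM (
        (SELECT s.Title, s.Description FROM ERPMasterBrand AS s
         EXCEPT
         SELECT d.Title, d.Description FROM LinkedDest.WebDB.dbo.MasterBrand AS d)
        UNION
        (SELECT d.Title, d.Description FROM LinkedDest.WebDB.dbo.MasterBrand AS d
         EXCEPT
         SELECT s.Title, s.Description FROM ERPMasterBrand AS s)
    ) AS diff
)
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'SyncCheck',
        @recipients   = 'dba@example.com',
        @subject      = 'DB sync check: differences found',
        @body         = 'Differences detected between source and destination.';
END
```

Running everything from the source side this way avoids attaching two files and diffing them by hand; you only get a mail when something is out of sync.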
I'm using the following query, which is a simplified version and gives me the differences between the two sides:
(SELECT s.Title, s.Description FROM ERPMasterBrand AS s
 EXCEPT
 SELECT d.Title, d.Description FROM MasterBrand AS d)
UNION
(SELECT s.Title, s.Description FROM MasterBrand AS s
 EXCEPT
 SELECT d.Title, d.Description FROM ERPMasterBrand AS d)
Any better suggestions? I've tested it and it gives the desired results - hope I'm not being misguided :-) by my own solution.
Session connection full information
I am currently running a query that captures session connection information: login, NT user, host, application/program, server, and database. But I am also looking for more granular detail: if a session is executing an SSIS package, which SSIS package; if the application/program is a Tableau report, which database objects the report is pulling; or which application/batch job is doing DML activity.
If I can't get all of the above details, can I at least find out, for example, which SSIS package a session is running?
I currently run a couple of different queries as a daily SQL job and store the data in a table.
SELECT @@SERVERNAME AS [Server],
       d.name,
       sp.login_time,
       sp.last_batch,
       GETDATE() AS [Date],
       sp.status,
       sp.hostname,
       sp.program_name,
       sp.nt_username,
       sp.loginame
FROM sys.databases d
LEFT JOIN sys.sysprocesses sp ON d.database_id = sp.dbid
WHERE d.database_id NOT BETWEEN 0 AND 4
  AND sp.loginame IS NOT NULL
Is there any other way I can get more detail? We will be doing a migration and I want to be sure.
For example, if SSIS is loading data into the database, we need to identify that SSIS package.
Currently I am able to identify all connections established to the instance and where they are coming from, but we need to know which of those connections are feeding data in and which are consuming data.
Thanks for all your help!
The simple answer is No.
When an application connects, it can have an ApplicationName passed in the connection details. You can retrieve it easily enough from the program_name column when you query sys.dm_exec_sessions. But it has to be configured in the connection string (in the SSIS data source) for you to be able to retrieve it.
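For illustration, a query along these lines lists each user session together with whatever the client reported as its application name (column names are standard, the filter is just a common convention for skipping system sessions):

```sql
-- Sessions and the ApplicationName each client passed in its connection string.
SELECT session_id,
       login_name,
       host_name,
       program_name   -- this is the ApplicationName, if the client set one
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;
```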
Adding an ApplicationName to your connection strings is a good habit, not just for this, but for many tuning/monitoring reasons.
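For instance, an OLE DB connection string in an SSIS data source might carry the package name like this (server, database, and application name below are made-up placeholders):

```
Provider=SQLNCLI11;Data Source=MyServer;Initial Catalog=MyDb;
Integrated Security=SSPI;Application Name=SSIS-LoadCustomerData;
```

With that in place, `SSIS-LoadCustomerData` shows up in the program_name column, telling you exactly which package owns the connection.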
You can also query the SSISDB catalog system view, catalog.executions, to find SSIS package executions on the server:
SELECT * FROM SSISDB.catalog.executions
I have a Microsoft SQL database that we have been using for several years. Starting this morning a single table in the database is throwing a time-out error whenever we attempt to insert or update any records.
I have tried to insert and update through:
Microsoft Access ODBC
a .Net Program via Entity Framework
a stored procedure run as an automatic job each morning
a custom query written this morning to test the database and executed through SQL Server Management Studio
Opening the table directly via 'Edit Top 200 Rows' and typing in the appropriate values
We have restarted the service, then restarted the entire server and continue to get the same problems. The remainder of the database appears to be working fine. All data can be read even from the affected table, and other tables allow updates and inserts to be run just fine.
Looking through the data in the table, I have not found anything that appears out of the ordinary.
I am at a loss as to the next steps on finding the cause or solution.
It's not a space issue, is it? Try:
SELECT volume_mount_point AS Drive,
       CAST(SUM(available_bytes) * 100 / SUM(total_bytes) AS INT) AS [Free%],
       AVG(available_bytes / 1024 / 1024 / 1024) AS FreeGB
FROM sys.master_files f
CROSS APPLY sys.dm_os_volume_stats(f.database_id, f.[file_id])
GROUP BY volume_mount_point
ORDER BY volume_mount_point;
I have been given an Access Database that I have to try to decipher what it is doing.
As a start I see that there is a Pass Through query with a command like:
EXEC RefreshGLTableLatestEntries
    @sourceDB = 'DB_NAME',
    @tablePrefix = 'TableName$',
    @logFile = 'C:\logDB.txt'
When I run it I will get something like:
Result
Success... 108 rows inserted with a total amount of $0.000000
What I don't understand is where the rows are being copied from and where they are being copied to.
In the MSSQL database I don't see a table, view, stored procedure, or function called 'TableName$'. There are quite a few tables and views called 'TableName$SomethingElse'. Is there a way to see more detail on where the data is coming from?
Similarly, how can I see where the rows are being inserted? I cannot find any file named 'logDB.txt' on my hard disk to check the log. I suspect it wouldn't say much more than '...108 rows inserted...' anyway.
I'm using:
Access 2016 from Office 365, Version 1609
MS SQL Server Management Studio v17.1
Any ideas on how to get more information on what the Pass-Through query does?
A Pass-Through query in Access is equivalent to running its SQL code in SQL Server Management Studio.
(In the database that is designated by the connection string of the Pass-Through query.)
The SQL is sent as-is to MSSQL and run there.
RefreshGLTableLatestEntries is the stored procedure that is executed here. You need to locate and analyze it in SQL Server.
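To locate and inspect it, something along these lines in SSMS should work (assuming the procedure lives in the dbo schema of the database the Pass-Through query connects to):

```sql
-- View the procedure's definition to see which tables it reads and writes.
SELECT OBJECT_DEFINITION(OBJECT_ID(N'dbo.RefreshGLTableLatestEntries'));

-- Or, equivalently:
EXEC sp_helptext 'dbo.RefreshGLTableLatestEntries';
```

The body of the procedure will show how @tablePrefix is used, which should explain why you only see objects named 'TableName$SomethingElse'.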
I have four servers (test1, test2, test3, test4), each with 5-6 databases. Each server has around 60 SQL Server Agent jobs scheduled.
Each job on these servers is a combination of SQL statements and SSIS packages. Within the SQL statements we have stored procedures that access tables on other servers.
Now I want to check whether server test2 is referenced in the SQL Server Agent jobs of the other servers.
I tried the following query but got no results:
SELECT Job.name AS JobName,
       Job.enabled AS ActiveStatus,
       JobStep.step_name AS JobStepName,
       JobStep.command AS JobCommand
FROM msdb.dbo.sysjobs Job
INNER JOIN msdb.dbo.sysjobsteps JobStep ON Job.job_id = JobStep.job_id
WHERE JobStep.command LIKE '%test2%'
But when I check manually on server test1, I can see server test2 being used in stored procedures and SSIS packages.
How can this be achieved?
This is a little tricky. You will need to search the procedures, file system and SQL Server separately. The job table only contains the entry point, not the definition for each step.
SSIS Packages, stored in SQL Server
SSIS packages are stored in the msdb table sysssispackages. The definition is stored in the packagedata column. You'll need to cast the data before you can read or search it.
-- How to read the package data column.
SELECT TOP 10
    [name],
    packagedata AS [Before],
    CAST(CAST(packagedata AS VARBINARY(MAX)) AS XML) AS [After]
FROM
    msdb.dbo.sysssispackages
;
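Building on that cast, here is a sketch of searching every msdb-stored package's XML for a server name (test2, as in the question); the double cast through XML lets the parser honor the package's own encoding before the text comparison:

```sql
-- Search all msdb-stored packages for a given string, e.g. a server name.
SELECT [name]
FROM msdb.dbo.sysssispackages
WHERE CAST(CAST(CAST(packagedata AS VARBINARY(MAX)) AS XML) AS NVARCHAR(MAX))
      LIKE N'%test2%';
```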
SSIS Packages, stored on the file system
These are XML text files. You can search them using your favorite method. If you want to build your own solution, .Net has some great file-handling abilities.
Stored Procedures
There are several methods, all of which return data from the system tables. One approach is to use the object definition function.
-- Returns definition of stored procedure.
SELECT
OBJECT_DEFINITION(OBJECT_ID(N'[Your-Schema].[Your-Procedure]'))
;
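Another of those methods searches every module's definition at once via sys.sql_modules, which is handy when you don't know which procedures to check:

```sql
-- Find all procedures/views/functions whose definition mentions test2.
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS SchemaName,
       OBJECT_NAME(m.object_id)        AS ObjectName
FROM sys.sql_modules AS m
WHERE m.definition LIKE '%test2%';
```

Run this in each database on each server, since sys.sql_modules is per-database.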
If you want to bring this all together
Using the table sysjobsteps, you could build a package/app that searches for a given server name. The final product would need to support all three search methods. You might be able to use the subsystem column, to choose which search method should be used for each step.
I'm using Crystal Reports 2008 with SQL Server 2014.
I read on the internet that it's possible to create a temporary table with Crystal Reports. This link (one of many examples) says so -> Click here
Yet when I go to the Database Expert, create a new command, and enter the following DDL
CREATE TABLE #temp_test (col1 VARCHAR(5))
I get this error
Translation:
database connector error : 'No error message from server'
Yet, when I'm doing that with SQL Server on my database, everything is fine.
Have you managed to do it? If yes, how?
It sounds like an urban legend to me but I might be wrong...
Cheers
When you create a "Command" table in Crystal, you're giving Crystal a set of text to send to the SQL server, and Crystal expects a data set in return. Everything in between is done on the SQL server. Crystal checks the command by sending it to the SQL server when you enter it to see if it works.
Given that, your temp table is actually created on the SQL server. Also, when you create a temp table, it is deleted after the command is finished running.
As a result, if you use only this code, the SQL server will create the table, but there is no data set to return. The command succeeds, so it doesn't return an error, but it also doesn't return any data, hence the message: "No error message from server".
For your next step, I would suggest using code like this:
CREATE TABLE #temp_test (col1 VARCHAR(5))
SELECT * FROM #temp_test
This will create an empty data set to return to Crystal, so that it's getting the response it needs. I say this so that you don't think anything is wrong when you don't see anything. You'll need to insert data into the temp table in order to get it from the select statement for visual confirmation.
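Putting that together, a command that returns visible rows to Crystal might look like this (the column values are made-up sample data):

```sql
-- Create the temp table, populate it, and return its rows to Crystal.
CREATE TABLE #temp_test (col1 VARCHAR(5));

INSERT INTO #temp_test (col1)
VALUES ('abc'), ('defgh');

SELECT * FROM #temp_test;
```

The final SELECT is what Crystal receives as the command's data set; everything before it runs server-side and is invisible to the report.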
I would also suggest not using a temp table unless you determine that you actually need one within the scope of the command - for example, to improve the performance of a particularly complex query or CTE. But I would write the query first and worry about optimization only after at least some of it is developed.