Database trigger that communicates with an external program - sql-server

I've got a SQL Server database and a .NET application that configures data in that database. On the other hand, I've got a VC++ application that reads that data from the database and must be 'informed' as soon as possible of any change in the data saved in the database.
I would like to receive a signal from the database engine when some data has been changed by the first app. Of course, I could code some kind of messaging between the two apps, but my question is whether such a mechanism (some kind of ultra-trigger that warns the other app) exists in SQL Server or in any API for accessing SQL Server (Native Client, OLE DB, or even ODBC).
PS: Apologies for my English.

If your requirements include guaranteed delivery (e.g. no lost events), you might want to consider an MSMQ-based strategy.
Accessing MSMQ from Microsoft SQL Server
http://www.codeproject.com/KB/database/SqlMSMQ.aspx
Microsoft Support Knowledge Base Article: 555070
Posting Message to MSMQ from SQL Server
http://support.microsoft.com/kb/555070
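For the app that needs to be informed, the receiving side can simply block on the queue. Below is a minimal C# sketch using System.Messaging; the queue path and the idea of putting the changed row's key in the message body are assumptions for illustration only (the same can be done from VC++ with the native MSMQ API).

    // Minimal sketch of the receiving side using System.Messaging (add a reference
    // to System.Messaging.dll). The queue path and the idea of sending the changed
    // row's key as the message body are assumptions for illustration.
    using System;
    using System.Messaging;

    class ChangeListener
    {
        static void Main()
        {
            const string queuePath = @".\Private$\DataChangeQueue";  // hypothetical queue

            using (var queue = new MessageQueue(queuePath))
            {
                queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });
                while (true)
                {
                    // Blocks until a message arrives; MSMQ keeps messages queued
                    // while this app is offline, so nothing is lost.
                    Message message = queue.Receive();
                    Console.WriteLine("Data changed: " + (string)message.Body);
                    // ... re-read the relevant data from SQL Server here ...
                }
            }
        }
    }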

You can use a SQL Server Agent job that notifies the other app, starting the job from a trigger with msdb.dbo.sp_start_job.
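A rough sketch of that idea, installing such a trigger from .NET; the trigger, job and table names are placeholders, and the login firing the trigger needs permission to start jobs in msdb.

    // Sketch: install a trigger that starts a SQL Server Agent job whenever
    // dbo.ConfigData changes; the job then notifies (or launches) the reader app.
    // Trigger, job and table names are placeholders.
    using System.Data.SqlClient;

    class NotifyTriggerInstaller
    {
        static void Install(string connectionString)
        {
            const string sql = @"
                CREATE TRIGGER dbo.trg_ConfigData_Notify
                ON dbo.ConfigData AFTER INSERT, UPDATE, DELETE
                AS
                BEGIN
                    SET NOCOUNT ON;
                    EXEC msdb.dbo.sp_start_job @job_name = N'NotifyReaderApp';
                END";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }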

Related

How to implement TADSEvent-like functionality for SQL Server

I'm using Delphi 10.3 and Advantage Database.
Here, I have used the sp_SignalEvent method together with TADSEvent, which gets triggered for all ADS-connected applications, and handled the operation based on the requests.
Is there any similar mechanism in SQL Server?
SQL Server has Query Notifications:
Built upon the Service Broker infrastructure, query notifications
allow applications to be notified when data has changed. This feature
is particularly useful for applications that provide a cache of
information from a database, such as a Web application, and need to be
notified when the source data is changed.
Query Notifications in SQL Server
Working with Query Notifications
Detecting Changes with SqlDependency
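A minimal SqlDependency sketch for the .NET side might look like the following; the connection string, table and columns are placeholders, and Service Broker must be enabled on the database.

    using System;
    using System.Data.SqlClient;

    class ConfigWatcher
    {
        static void Main()
        {
            string connectionString = "...";        // placeholder
            SqlDependency.Start(connectionString);  // requires Service Broker on the database

            using (var connection = new SqlConnection(connectionString))
            // Notification queries need two-part table names and an explicit
            // column list (no SELECT *).
            using (var command = new SqlCommand(
                "SELECT Id, Value FROM dbo.ConfigData", connection))
            {
                var dependency = new SqlDependency(command);
                dependency.OnChange += (sender, e) =>
                    Console.WriteLine("Change detected: " + e.Info);

                connection.Open();
                using (command.ExecuteReader()) { } // executing the command registers the subscription

                Console.ReadLine();                 // keep the process alive while listening
            }

            SqlDependency.Stop(connectionString);
        }
    }

Note that a notification fires only once: the OnChange handler has to re-run the query with a fresh SqlDependency to keep listening.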

One ASP.net WebAPI Application, Multiple Signal-R Backplanes (Sql Server databases)

Is it possible to create multiple backplanes within SignalR?
We're working on an ASP.NET Web API SaaS application and are looking to implement SignalR for "real-time" web functionality. Since we'll be hosting the application in a web farm, client connection state will be managed through a SQL Server backplane.
The application is multi-tenant, but the database is not: each tenant has its own database. The application determines which connection string to use, and all client requests talk to their appropriate database. Now the code for configuring the SignalR SQL Server backplane within Application_Start() is:
GlobalHost.DependencyResolver.UseSqlServer(connectionString);
Does anyone know if it's possible to create multiple backplanes with Signal-R, basically loop through each connection string and call the above code?
Thanks for checking this out!
If you need to eliminate the single point of failure, I suggest setting up a failover server in case the primary SQL Server machine goes down. Reference: http://technet.microsoft.com/en-us/library/hh231721.aspx
If you simply need more performance than a single SQL Server instance can provide, I suggest using Redis as the backplane.
In either case, I doubt attempting to use "multiple backplanes" will be helpful, unless you intend to map certain hubs to certain backplanes for load distribution.
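For reference, switching the backplane to Redis is a small change in the startup code. This is only a sketch, assuming the Microsoft.AspNet.SignalR.Redis package; the server, port, password and event key are placeholder values.

    // Sketch only: requires the Microsoft.AspNet.SignalR.Redis package; the
    // server, port, password and event key below are placeholder values.
    GlobalHost.DependencyResolver.UseRedis("redis-server", 6379, "password", "MyApp");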

Approaches for Database Synchronization

I have been assigned to develop a sync application for my company. We have SQL Server on our database server, which will be synced with the client databases. The client databases are not known in advance; they can be SQLite, MySQL, or whatever.
What this sync app does is detect changes that occur in the server and client databases, save these changes, and sync them. If changes occur in the server database, they will be synced to the client database, and vice versa.
I did some research and came across several solutions. One of them is to use the Microsoft Sync Framework, but I could hardly find a good implementation example for syncing with remote databases.
Then I came across Change Data Capture (CDC) in SQL Server 2008. CDC works by capturing changes to the source table (reading them from the transaction log) and putting them into a separate change table, which is then used for syncing.
Since I cannot use the CDC feature (I don't have sufficient database rights on my machine), I have started to develop my own solution that works the way CDC does: I create a separate sync_table for each source table and create triggers that detect data changes and put the data in the sync_table, roughly as sketched below.
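    // Simplified sketch of what the app installs for one source table (Customer);
    // table and column names are examples, and the real SQL is generated from metadata.
    using System.Data.SqlClient;

    class CaptureInstaller
    {
        static void InstallCapture(string connectionString)
        {
            const string sql = @"
                CREATE TABLE dbo.Customer_sync (
                    Id        INT       NOT NULL,
                    Operation CHAR(1)   NOT NULL,  -- I, U or D
                    ChangedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME());

                EXEC('
                CREATE TRIGGER dbo.trg_Customer_sync
                ON dbo.Customer AFTER INSERT, UPDATE, DELETE
                AS
                BEGIN
                    SET NOCOUNT ON;
                    -- inserts and updates
                    INSERT INTO dbo.Customer_sync (Id, Operation)
                    SELECT i.Id,
                           CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN ''U'' ELSE ''I'' END
                    FROM inserted i;
                    -- deletes
                    INSERT INTO dbo.Customer_sync (Id, Operation)
                    SELECT d.Id, ''D''
                    FROM deleted d
                    WHERE NOT EXISTS (SELECT 1 FROM inserted i WHERE i.Id = d.Id);
                END');";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }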
However, I have been advised to do some more research before choosing the best implementation approach.
I need to keep the following things in mind:
Databases may/may not be on the same network.
On server side, the user must be able to select which tables will take part in the sync process.
Devices that will sync with the server database need to be registered first. Meaning that all client devices will be registered by the user before they can start syncing.
As usual any help will be appreciated :)
There is an open source project called SymmetricDS with many of the same goals. Take a look at the documentation and data model to see how the problem was solved, and maybe you will get some ideas.
Instead of a separate shadow table for each source table, there is a single sym_data table where all the data is captured in comma-separated-value format. The advantage is one place to look for captured data and to retrieve changes that were part of the same transaction. The table is kept small by purging it often after data is transferred successfully.
It uses web protocols (HTTP) for data transfer. The advantage is leveraging existing web servers for performance, administration, and known filtering through firewalls. There is also a registration protocol used before clients are allowed to sync: the server admin "opens registration" for a client ID, which allows the client to connect for the first time. It supports many different databases, so you'll find examples of how to write triggers and retrieve unique transaction IDs on those systems.

Suitable method For synchronising online and offline Data

I have two applications, each with its own database.
1.) A desktop application with a VB.NET WinForms interface; it runs in an offline enterprise network and stores data in a central database [SQL Server].
**All data entry and other office operations are carried out and stored in the central database.
2.) The second application is built on PHP; it has HTML pages and runs as a website in an online environment. It stores all its data in a MySQL database.
**This application is accessed by registered members only, and it provides them with various reports on the data processed by the first application.
Now I have to synchronize data between the online and offline database servers. I am planning the following:
1.) Write a small program to export all the data from SQL Server [the offline server] to a file in CSV format.
2.) Log in to the admin section of the live server.
3.) Upload the exported CSV file to the server.
4.) Import the data from the CSV file into the MySQL database.
Is the method I am planning good, or can it be tuned to perform better? I would also appreciate other good ways to synchronise the data, other than changing the applications themselves (e.g. moving the network application to use the MySQL database).
What you are asking for does not actually sound like bidirectional sync (i.e. movement of data both ways, from SQL Server to MySQL and from MySQL to SQL Server), which is a good thing, as it really simplifies things for you. Although I suspect your method of using CSVs (which I assume you would produce with something like BCP) would work, one of the issues is that you are moving ALL of the data every time you run the process, and you are basically overwriting the whole MySQL database every time. This is obviously somewhat inefficient, not to mention that during that time the MySQL database would not be in a usable state.
One alternative (assuming you have SQL Server 2008 or higher) would be to look into using this technique along with Change Tracking or Change Data Capture. These are SQL Server features that let you determine which data has changed since a certain point in time. What you could do is create a process that extracts just the changes since the last time you checked to a CSV file and then applies those to MySQL. If you do this, don't forget to apply the deletes as well.
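As a sketch of what that incremental extract could look like with Change Tracking (assuming it has been enabled on the database and on the table; dbo.Orders and its columns are placeholders):

    // Rough sketch: pull only the rows changed since the last sync, using SQL Server
    // Change Tracking, and write them to a CSV file. Change tracking must already be
    // enabled on the database and on dbo.Orders; table and column names are examples.
    using System.Data.SqlClient;
    using System.IO;

    class ChangeExporter
    {
        static void ExportChanges(string connectionString, long lastSyncVersion, string csvPath)
        {
            const string sql = @"
                SELECT ct.SYS_CHANGE_OPERATION, ct.Id, o.CustomerId, o.Total
                FROM CHANGETABLE(CHANGES dbo.Orders, @lastVersion) AS ct
                LEFT JOIN dbo.Orders AS o ON o.Id = ct.Id;"; // LEFT JOIN so deletes still show up

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            using (var writer = new StreamWriter(csvPath))
            {
                command.Parameters.AddWithValue("@lastVersion", lastSyncVersion);
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // SYS_CHANGE_OPERATION is I, U or D; apply D as a DELETE on the MySQL side.
                        writer.WriteLine(string.Join(",",
                            reader["SYS_CHANGE_OPERATION"], reader["Id"],
                            reader["CustomerId"], reader["Total"]));
                    }
                }
            }
            // Read CHANGE_TRACKING_CURRENT_VERSION() before the extract and persist it
            // as lastSyncVersion for the next run.
        }
    }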
I don't think there's an off the shelf solution for what you want that you can use without customization - but the MS Sync framework (http://msdn.microsoft.com/en-us/sync/default) sounds close.
You will probably need to write a provider for MySQL to make it go - which may well be less work than writing the whole data synchronization logic from scratch. Voclare is right about the challenges you could face with writing your own synchronization mechanism...
Do look into SQL Server Integration Services (SSIS) as a good alternative.

Microsoft Sync Services - good solution for me?

We upload sales transactions from our stores to the headoffice server. At the moment, we use DTS (SQL Server Data Transformation Services), but we're planning on replacing that with Microsoft Sync Services for ADO.NET, as this seems to be Microsoft's preferred solution for this type of setup and we want to follow the standard (which will hopefully be around for a long time).
Here are the details of our setup and what we’re planning. I’m looking for some advice, especially about whether Sync Services fits into our solution.
Situation
Each store has a 3rd party EPOS system which stores sales in a Microsoft Access 2000 database, which we can access. Our headoffice database is SQL Server 2005, but will be upgraded to 2008. The headoffice is not on a VPN with all the stores, but we can open up our firewall to the stores’ IP addresses, so that they can send data directly to SQL Server. The stores are always connected to the internet via ADSL, although they do lose connection and we don’t want to lose sales data.
We are only uploading transactions from the store – definitions do not need to be downloaded.
Current solution
We have written a Windows service that runs on the store PC. This service downloads a DTS package from the server (which contains all the details of the upload) and runs it in the store – and this will upload sales to our server.
We chose DTS, because it is free when you install MSDE. We can’t use SSIS, because that would require a SQL Server licence at every store.
Another reason we chose DTS is that the details of the upload (i.e. which tables and fields to include) are stored on our headoffice server, so if we need to change things we can do that centrally and don’t need to install anything new at the stores. This isn’t a showstopper, but would be nice to have this ability in our new solution.
Potential solution - Microsoft Sync services for ADO.NET
We are currently building a proof of concept with Microsoft Sync services for ADO.NET. The idea is to put SQL CE (SQL Server Compact 3.5) in each store (client) and sync that to the headoffice SQL Server 2005 database (server). We’ll get the data into the SQL CE database either by (1) syncing it with the Access 2000 database or (2) getting the EPOS system developers to write sales straight into the SQL CE database – probably (2). But our main concern is getting the data from the store to the headoffice server. This method seems to be Microsoft’s preferred solution for occasionally connected systems and that is what made us look seriously at Sync Services.
I’m hoping that using this will mean that most of the work needed to upload the sales will be built into Sync Services and we won’t have to re-invent the wheel.
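For reference, the kind of upload-only configuration we are picturing looks roughly like this (only a sketch; a real DbServerSyncProvider still needs sync adapters, e.g. generated with SqlSyncAdapterBuilder, and the connection strings and table name are placeholders):

    // Sketch of an upload-only sync from SQL CE in the store to the headoffice SQL
    // Server with Sync Services for ADO.NET. A real DbServerSyncProvider still needs
    // sync adapters; the connection strings and the table name are placeholders.
    using Microsoft.Synchronization;
    using Microsoft.Synchronization.Data;
    using Microsoft.Synchronization.Data.Server;
    using Microsoft.Synchronization.Data.SqlServerCe;
    using System.Data.SqlClient;

    class StoreSync
    {
        static void UploadSales()
        {
            var agent = new SyncAgent
            {
                LocalProvider = new SqlCeClientSyncProvider("Data Source=store.sdf"),
                RemoteProvider = new DbServerSyncProvider
                {
                    Connection = new SqlConnection("...headoffice connection string...")
                }
            };

            // Upload-only: sales flow from the store to headoffice, nothing comes back.
            agent.Configuration.SyncTables.Add(
                new SyncTable("SalesTransaction") { SyncDirection = SyncDirection.UploadOnly });

            SyncStatistics stats = agent.Synchronize();
        }
    }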
Potential solution - Upload to a custom webservice
There is also the possibility of uploading the sales transactions to a custom web service on our headoffice server and then into our SQL Server database. This means that we would have to build our own mechanism for determining which rows are new, as well as caching for when the systems are disconnected. Also, we might be missing out on other functionality that comes built into Sync Services.
Please let me know if you have any advice that will help, especially: "Is Sync Services the right solution?". The problem we are trying to solve seems very generic (uploading sales from stores), and I'd like to solve it with a generic solution.
Microsoft Sync Services is more than you need, but it will certainly do what you want, and it was built with your type of application in mind.
As with most new technologies out of Microsoft (caution: generalization!), you may find that it's not as mature as you might like. It'll do what you need it to, but you may run into issues that aren't easily resolved because it hasn't been put through the wringer. As an early adopter, though, you may find that the Sync developers are eager to help you out when you get stuck, so this isn't as big a problem as it might seem.
Make sure you read through all the literature on it, some of which is here, or linked in the following sites:
http://msdn.microsoft.com/en-us/sync/default.aspx
http://msdn.microsoft.com/en-us/sync/bb887608.aspx
http://en.wikipedia.org/wiki/Microsoft_Sync_Framework
Given your one-way flow of information and centralized layout, though, I expect you should have few, if any, issues setting it up and using it.
Be sure to report your experience back here!
-Adam
