I want to send notifications to a Windows application from Oracle, based on the status of some activities running in Oracle.
Approaches that I could think of:
Polling from the application. But that could degrade the performance of the production server.
Using UTL_SMTP in Oracle, where the back-end process running in Oracle sends an email. The app would have an email client which would notify the user based on the mail received.
But I know these are not good solutions. Does Oracle support some standard event delegation model?
Recently Microsoft came up with SignalR ( http://goo.gl/F8Rcmu ), which allows web clients to get updates based on activities on the server.
Is there any way I can achieve this with Oracle and a Windows Forms application? Does Oracle have support for broadcasting information to Forms applications or services?
Thank You.
What is the desired transport? (pure TCP or a DB connection?)
Should it be synchronous or asynchronous? (i.e. should the notification be part of the DB transaction?)
You can use AQ (Advanced Queuing), SMTP, web services, Continuous Query Notification, or some more curious packages like DBMS_ALERT and DBMS_PIPE.
In the Java world you would most probably use AQ acting as a JMS (Java Message Service) provider.
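For illustration, here is a minimal sketch of the DBMS_ALERT route from a .NET client, assuming ODP.NET; the connection string and alert name are placeholders, and the signalling session would call DBMS_ALERT.SIGNAL('ORDER_DONE', '...') and commit:

```csharp
using System;
using System.Data;
using Oracle.DataAccess.Client; // ODP.NET

class AlertListener
{
    static void Main()
    {
        // Placeholder connection string -- substitute your own data source.
        using (var conn = new OracleConnection("User Id=app;Password=secret;Data Source=orcl"))
        {
            conn.Open();

            // Register interest in a named alert.
            using (var cmd = new OracleCommand(
                "BEGIN DBMS_ALERT.REGISTER('ORDER_DONE'); END;", conn))
            {
                cmd.ExecuteNonQuery();
            }

            // Block until the alert fires (or the 60-second timeout elapses).
            using (var cmd = new OracleCommand(
                "BEGIN DBMS_ALERT.WAITONE('ORDER_DONE', :msg, :status, 60); END;", conn))
            {
                cmd.Parameters.Add("msg", OracleDbType.Varchar2, 1800)
                   .Direction = ParameterDirection.Output;
                cmd.Parameters.Add("status", OracleDbType.Int32)
                   .Direction = ParameterDirection.Output;
                cmd.ExecuteNonQuery();

                // status 0 = alert received, 1 = timeout
                Console.WriteLine("status={0}, message={1}",
                    cmd.Parameters["status"].Value, cmd.Parameters["msg"].Value);
            }
        }
    }
}
```

Note that DBMS_ALERT is transactional (the alert is delivered only after the signalling transaction commits), which answers the synchronous/asynchronous question above; DBMS_PIPE, by contrast, is non-transactional.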
Related
I'm using Delphi 10.3 and Advantage DataBase.
Here, I have used the sp_SignalEvent method and then TADSEvent, which gets triggered for all the ADS-connected applications, and handled the operation based on the requests.
Is there any similar mechanism present in SQL Server?
SQL Server has Query Notifications:
Built upon the Service Broker infrastructure, query notifications allow applications to be notified when data has changed. This feature is particularly useful for applications that provide a cache of information from a database, such as a Web application, and need to be notified when the source data is changed.
Query Notifications in SQL Server
Working with Query Notifications
Detecting Changes with SqlDependency
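To sketch the idea, a minimal SqlDependency consumer (the connection string and table are placeholders; your database needs Service Broker enabled):

```csharp
using System;
using System.Data.SqlClient;

class ChangeWatcher
{
    // Placeholder connection string -- point it at your own database.
    const string ConnStr = "Data Source=.;Initial Catalog=Orders;Integrated Security=true";

    static void Main()
    {
        SqlDependency.Start(ConnStr);   // starts the notification listener

        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand(
            // Query notifications require a specific query shape:
            // two-part table names and an explicit column list (no SELECT *).
            "SELECT OrderId, Status FROM dbo.Orders", conn))
        {
            var dependency = new SqlDependency(cmd);
            dependency.OnChange += (sender, e) =>
                Console.WriteLine("Data changed: {0} / {1}", e.Type, e.Info);

            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read()) { /* populate the local cache */ }
            }
        }

        Console.ReadLine();             // keep the process alive for the callback
        SqlDependency.Stop(ConnStr);
    }
}
```

One caveat: a SqlDependency fires only once, so the OnChange handler must re-run the query and set up a new dependency if you want continuous notifications.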
Can anyone give some advice on C# frameworks/open-source projects for bi-directional database synchronisation?
I have an application that will be used by multiple users. Normally, the user will interact with the application's local database (MS Access), as we assume the network is not available on-site most of the time. When the user has network connectivity, the local database is to be synchronised with the centralised remote database (MS SQL). In the end, all users are able to read/write/commit each other's data. It is very much like an SVN repository, I think.
Has anyone tried http://msdn.microsoft.com/en-us/library/bb629326.aspx? How does it fare? I have not really looked into it; I am trying to look for more options first before evaluating each.
Thank you.
Have you looked at the Microsoft Sync Framework?
It was designed with scenarios like yours in mind.
Introduction to Microsoft Sync Framework
Sync Framework Samples
Walkthrough: Creating a Sync service
Walkthrough: Creating a Sync Service in Windows Azure
This question is an updated version of a previous question I have asked on here.
I am new to client-server model with SQL Server as the relational database. I have read that public access to SQL Server is not secure. If direct access to the database is not a good practice, then what kind of layer should be placed between the server and the client? Note that I have a desktop application that will serve as the client and a remote SQL Server database that will provide data to the client. The client will input their username and password in order to see their data. I have heard of terms like VPN, ISA, TMG, Terminal Services, proxy server, and so on. I need a fast and secure n-tier architecture.
P.S. I have heard of putting web services in front of the database. Can I use WCF to retrieve, update, and insert data? Would it be a good approach in terms of security and performance?
A web-service tier is pretty common for smart clients as a layer between the user client and the server. This allows:
- simple networking (HTTP only)
- you have an app layer in which to put validation etc. without upsetting the DB
- you can have security that isn't tied to the DB
- the DB can run under fewer accounts (app accounts), allowing greater connection pooling
- you can "scale out" the app layer
- you can cache etc. above the DB
- you can have a richer app layer, with more services than SQL Server provides
- the client has a known API and never knows about the DB (which is an implementation detail)
You can use WCF to talk to the app layer, but you shouldn't think in terms of "INSERT", "UPDATE", etc.; you should think in terms of operations that make sense to your domain model, such as a "CreateOrder" operation. ADO.NET Data Services allows an API more similar to your "INSERT" etc., but it isn't necessarily as controlled as you might like for a secure service.
Performance is really a factor of "what queries am I running?" and "how much data am I transferring?". As long as you keep the operations sane (i.e. don't fetch the entire "Orders" data over the wire just to find the most recent order-date), then you should be OK.
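To make the "domain operations, not SQL verbs" point concrete, here is a hypothetical WCF contract; the service name, operations and parameters are invented for illustration:

```csharp
using System;
using System.ServiceModel; // reference System.ServiceModel.dll

// Hypothetical contract -- expose operations that mean something
// in the domain, never raw INSERT/UPDATE access.
[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    int CreateOrder(string customerId, string productCode, int quantity);

    [OperationContract]
    DateTime GetMostRecentOrderDate(string customerId);
}

public class OrderService : IOrderService
{
    public int CreateOrder(string customerId, string productCode, int quantity)
    {
        // Validate and apply business rules here, then write to the database
        // under the app layer's own account; clients never touch the DB.
        return 0; // stub: would return the new order id
    }

    public DateTime GetMostRecentOrderDate(string customerId)
    {
        // Query only what the caller needs -- not the whole Orders table.
        return DateTime.UtcNow; // stub
    }
}
```

The client compiles against IOrderService and has no idea whether the data lives in SQL Server, a cache, or something else entirely.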
Just looking at the requirements of a new project and I wanted to make sure this use case was sound:
- The user fills in an InfoPath (2003) form locally on their PC.
- A button within the InfoPath form titled 'Submit' brings up a new Outlook (2003) email message with the InfoPath form attached. The user presses Send and the email goes to an Exchange mailbox.
- SQL Server periodically checks this mailbox, downloading any new submissions with the InfoPath form attached.
- SQL Server parses the attachment and the fields within the InfoPath form.
Is SQL Server capable of parsing mail attachments this way? Any caveats with this approach?
The attraction of using Outlook as the submission technology is that the process for the user is the same if they are offline. Outlook will then automatically sync when they come back online. It is essential that users have some way to fill in the forms offline, 'submit' them, and then have them synced automatically with the server when they next come online.
edit: to clarify, I am not looking for a way to cache form data from the server->client. I am looking to cache the completed form. Building a separate application to cache the completed reports on the client is not an option.
Later versions of SQL Server are capable of running .NET code within them, and as such you might be able to poll a mailbox from SQL Server and process an InfoPath form. However, I'm not sure I'd do it this way.
It might be better to consider writing a Windows Service that does this work. The Windows Service would start up, inspect the mail box of a "service account", read the mails, extract the attachments, process the xml and, eventually, write the data to SQL. It could also, presumably, respond to that mail with a confirmation or errors if business rules or validation errors occurred.
I'm not sure I'd put all of the above logic into SQL Server; for one thing, I suspect you'd have issues with accounts (the account SQL Server runs under would have to be able to access the Exchange mailbox).
Your mileage may vary, and you should prototype this to determine what works best for you, but I'd try to keep the code that uses Exchange as a "work queue" separate from SQL, and only put the code that writes data into tables in SQL.
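A rough shape for the service's processing loop; note that IMailbox, MailItem and the SQL write are hypothetical placeholders for whatever mail API (POP3, WebDAV, EWS) and data layer you end up choosing:

```csharp
using System;
using System.Collections.Generic;
using System.Xml;

// Hypothetical mail abstraction -- substitute your real mail API here.
public interface IMailbox
{
    IEnumerable<MailItem> FetchUnread();
    void Delete(MailItem item);
    void Reply(MailItem item, string body);
}

public class MailItem
{
    public string AttachmentXml; // the InfoPath form is plain XML
}

public class FormProcessor
{
    readonly IMailbox _mailbox;

    public FormProcessor(IMailbox mailbox) { _mailbox = mailbox; }

    // Called from the Windows Service's worker thread on a timer.
    public void Poll()
    {
        foreach (var mail in _mailbox.FetchUnread())
        {
            try
            {
                var doc = new XmlDocument();
                doc.LoadXml(mail.AttachmentXml);   // parse the InfoPath payload
                SaveToSql(doc);                    // write the fields to SQL
                _mailbox.Delete(mail);             // only after a successful write
                _mailbox.Reply(mail, "Form received.");
            }
            catch (Exception ex)
            {
                // Validation or business-rule failure: tell the sender.
                _mailbox.Reply(mail, "Form rejected: " + ex.Message);
            }
        }
    }

    void SaveToSql(XmlNode doc) { /* ADO.NET insert goes here */ }
}
```

Keeping the mail handling in this service means SQL Server only ever sees clean XML arriving through normal ADO.NET calls.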
I would not use the approach you outlined above. There are several approaches that appear to me to be more desirable than having SQL Server look at an Exchange mailbox. The major point that you make, and an important requirement, is that the InfoPath form be allowed to work in offline mode. I would think of the "offline mode" and the "data transfer" parts of your project as two distinct and separate pieces: 1) the form and the data should be stored on the client until a connection to the Internet is available, and 2) once the connection is available, the form and data should be transferred to the server.
You can set up your InfoPath form to submit directly to the SQL Server and bypass the Exchange "middleman" entirely. The setup in InfoPath when you are designing your form is pretty straightforward: 1) you enable "Submit data" for the connection and 2) you configure the submit options. This article has the details about how to do that. Furthermore, your connection to the SQL Server may be set up for offline use, as discussed in this article. The only caveat with this approach is that you may need to change your database schema to support it.
Another approach is to have your InfoPath form submit to a SQL Server 2005 HTTP endpoint. The InfoPath client is just a glorified XML editor, and an HTTP endpoint is basically a different name for a web service. You receive the form data at the HTTP endpoint into a staging table where the data is stored as XML, and then you can do your parsing of that data from the staging area. Still, you will have to set up the InfoPath connection for offline use. The major caveat with this approach is that Microsoft is deprecating HTTP endpoints in SQL Server 2008 in favor of WCF.
And the other approach I would like to suggest is to use WCF itself to receive the XML form data from the InfoPath client. This approach would require you to connect the form's data source to your WCF web service at design time and then also set up the form for offline use.
I hope that this will be useful to you and at the very least point you in the right direction.
I've seen similar projects that resorted to an Express edition on the client: save the InfoPath form (or app data) in Express and use Service Broker to deliver it to the center, because of the guaranteed-delivery semantics of SSB vs. mail. This gives you an all-SQL solution that is easier to sell to IT, and you don't need polling on the server. Also, you will not have to deal with MIME parsing; it is all straightforward XML processing. It is not for the faint of heart, though: getting SSB up and running is a challenge.
If you decide to go with mail delivery, an external service will arguably be easier to build, debug and troubleshoot. There are some finer points you should have an answer for:
- How will you keep the mail dequeue operations and the table write operations consistent? Your component must enlist the Exchange read/delete and the SQL insert in one distributed transaction.
- Is your logic prepared to deal with InfoPath docs arriving out of order? Mail transport makes absolutely no guarantee about the order of delivery, so you may see an 'order delete' doc before the 'order create' doc.
- How are you going to detect missing documents (not delivered by mail)? Are you going to implement a sender sequence number and end up reinventing TCP on top of mail?
- Does your processing allow for parallel processing of correlated documents? If thread 1 picks up doc 1 and thread 2 picks up doc 2 from the same sender, and doc 2 is correlated with doc 1 (i.e. they refer to the same business transaction), what will happen at the database write? Will it deadlock, will it lose an update, will one be rolled back?
There are many dragons under the bridge ahead...
I've got a SQL Server database and an application (.NET) that configures data in that database. On the other hand, I've got an application (VC++) that reads that data from the database and must be 'informed' as soon as possible of any change in the data saved in the database.
I would like to receive a signal from the database engine when some data has been changed by the first app. Of course, I can code some kind of message that communicates between both apps, but my question is whether such a mechanism (some kind of ultra-trigger that warns the other app) exists in SQL Server or in any API for accessing SQL Server (native, OLE DB, or even ODBC).
PS: Apologies for my English.
If your requirements include guaranteed delivery (e.g. no lost events), you might want to consider an MSMQ-based strategy.
Accessing MSMQ from Microsoft SQL Server
http://www.codeproject.com/KB/database/SqlMSMQ.aspx
Microsoft Support Knowledge Base Article: 555070
Posting Message to MSMQ from SQL Server
http://support.microsoft.com/kb/555070
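To sketch the receiving side, here is a minimal System.Messaging consumer; the queue path, label and message body are placeholders, and the database side would send via a CLR procedure or the techniques in the KB article above:

```csharp
using System;
using System.Messaging; // reference System.Messaging.dll

class QueueDemo
{
    // Placeholder: a private queue on the local machine.
    const string Path = @".\Private$\DataChangeEvents";

    static void Main()
    {
        if (!MessageQueue.Exists(Path))
            MessageQueue.Create(Path);

        using (var queue = new MessageQueue(Path))
        {
            // Tell MSMQ how to (de)serialize the message body.
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });

            // The database side would normally do this send; we send from
            // C# here purely for illustration.
            queue.Send("row 42 updated", "DataChanged");

            // The reading app blocks here until a message arrives
            // (or the 30-second timeout throws MessageQueueException).
            Message msg = queue.Receive(TimeSpan.FromSeconds(30));
            Console.WriteLine((string)msg.Body);
        }
    }
}
```

Because MSMQ persists messages, the reader can be offline when the change happens and still pick the event up later, which is the "no lost events" property.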
You can use a SQL Agent job that will notify the other app, starting it with sp_start_job. Check this out.
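For illustration, starting such a job from .NET is a single stored-procedure call; the connection string and job name here are hypothetical:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class JobStarter
{
    static void Main()
    {
        // Placeholder connection string and job name -- substitute your own.
        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=msdb;Integrated Security=true"))
        using (var cmd = new SqlCommand("msdb.dbo.sp_start_job", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@job_name", "NotifyClientApp");

            conn.Open();
            cmd.ExecuteNonQuery(); // the job then runs asynchronously on the server
        }
    }
}
```

Note that sp_start_job returns as soon as the job is queued, so the notification itself happens asynchronously, outside the caller's transaction.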