I have a use case where I need to replicate data for a specific duration from one Azure SQL DB (subscription1) to another Azure SQL DB (subscription2).
The source DB has live data, with continuous inserts, updates, and possibly deletes. Hence, we would miss updates while performing a DB copy operation.
For consuming the messages, I am considering using the Debezium SQL Server connector, but this component supports only the consumer functionality.
Which Camel component can I use to produce the same events to the destination DB?
-Srikant
You can use one of the available Camel Kafka Connector implementations:
https://camel.apache.org/camel-kafka-connector/latest/connectors/camel-jdbc-kafka-sink-connector.html
https://camel.apache.org/camel-kafka-connector/latest/connectors/camel-sql-kafka-sink-connector.html
If you don't use Kafka Connect but plain Camel, then just use the respective Camel components.
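If you go the plain-Camel route, a minimal sketch could look like the following. Everything here is an assumption for illustration: the Kafka topic name, the target table, the connection string, and the simplistic envelope handling (a real route would unwrap the Debezium envelope and branch on the operation type c/u/d).

```java
import com.microsoft.sqlserver.jdbc.SQLServerDataSource;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;

public class ReplicationRoute extends RouteBuilder {
    @Override
    public void configure() {
        // Consume Debezium change events from Kafka and write them to the
        // destination Azure SQL DB. Topic and table names are placeholders.
        from("kafka:sqlserver1.dbo.orders?brokers=localhost:9092")
            .unmarshal().json() // requires camel-jackson; yields a Map
            .to("sql:insert into orders_copy (id, payload) values (:#id, :#payload)"
                + "?dataSource=#dataSource");
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical destination DB; replace with your own server/database.
        SQLServerDataSource ds = new SQLServerDataSource();
        ds.setURL("jdbc:sqlserver://<server>.database.windows.net:1433;databaseName=<db>");

        Main main = new Main();
        main.bind("dataSource", ds); // looked up by the sql endpoint above
        main.configure().addRoutesBuilder(new ReplicationRoute());
        main.run(args);
    }
}
```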
I'm using Delphi 10.3 and Advantage Database.
Here, I have used the sp_SignalEvent method and then the TADSEvent, which gets triggered for all of the ADS-connected applications, and handled the operation based on the requests.
Is there any similar operation present in SQL Server?
SQL Server has Query Notifications:
Built upon the Service Broker infrastructure, query notifications
allow applications to be notified when data has changed. This feature
is particularly useful for applications that provide a cache of
information from a database, such as a Web application, and need to be
notified when the source data is changed.
Query Notifications in SQL Server
Working with Query Notifications
Detecting Changes with SqlDependency
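The SqlDependency API from the last link is .NET-specific; from other stacks, one option is to read the underlying Service Broker queue directly with T-SQL. A rough sketch in Java via JDBC, where the queue name dbo.ChangeQueue and the connection string are hypothetical, and the Service Broker objects are assumed to already exist and be populated (e.g. by a trigger):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ChangeListener {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:sqlserver://localhost:1433;databaseName=app;user=sa;password=<pwd>";
        try (Connection c = DriverManager.getConnection(url);
             Statement s = c.createStatement()) {
            while (true) {
                // Block up to 5 seconds waiting for a message on the broker queue;
                // an empty result set means the timeout expired with no change.
                try (ResultSet rs = s.executeQuery(
                        "WAITFOR (RECEIVE TOP(1) CAST(message_body AS NVARCHAR(MAX)) AS body "
                        + "FROM dbo.ChangeQueue), TIMEOUT 5000")) {
                    while (rs.next()) {
                        System.out.println("change event: " + rs.getString("body"));
                    }
                }
            }
        }
    }
}
```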
I have a project where there will be a master source database, a client Windows service, a client application, and a client database.
We need to have a client database because there are times when the client won't be able to reach out to the master database source due to connectivity issues.
I was hoping to get some of your expertise on what would be the best/most efficient way to sync certain tables from the client database to the master database, and other tables from the master database to the client database, in close to real-time (within minutes). I would also need to keep track of what was synced in the master database so that I can use it in a dashboard.
There could be up to 10,000+ clients trying to pull this information all at once.
Any suggestions would be helpful.
You may be interested in a scheme based on Apache Kafka technologies.
Apache Kafka has the ability to create connectors (Kafka Connect).
The architectural scheme will look like this:
Local DB - Connector - Apache Kafka - Connector - Server DB.
Connectors support different databases.
You can use connectors to connect to the database itself or to separate tables.
Your scheme is similar to the ETL scheme based on Apache Kafka technologies.
You can also develop an application that will track what was synced in the main database using Apache Kafka Streams.
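For that sync-tracking application, a minimal Kafka Streams sketch could look like this. The topic names are placeholders for whatever the connectors actually write; the resulting sync-audit topic could then feed the dashboard.

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Produced;

public class SyncTracker {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "sync-tracker");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Count change records per key (e.g. per client) as they flow through,
        // and publish the running totals to an audit topic for the dashboard.
        builder.<String, String>stream("client-db.changes")
               .groupByKey()
               .count()
               .toStream()
               .to("sync-audit", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        streams.start();
    }
}
```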
The other way, if you don't use Apache Kafka, is master-master replication.
Is there any way to send the entities from Orion Context Broker to SQL Server by using docker compose file?
Right now, Orion subscribes to the entities and Quantum Leap notifies the Crate DB when there is a change for those entities. However, Crate DB is not what I want and this is the reason: https://community.powerbi.com/t5/Desktop/Direct-Query-for-PostgreSQL/m-p/776979/highlight/true#M374297
So basically my question is whether there is any way to replace Crate DB with any of these supported ones: https://learn.microsoft.com/en-us/power-bi/desktop-directquery-data-sources.
(I randomly chose SQL Server, but any other is fine.)
I don't know Quantum Leap or CrateDB but, basically, you could create a Context Consumer to receive notifications from Orion (a subscription would be created for that) so each notification is persisted in your target DB (in this case SQL Server).
In fact, maybe you can take advantage of Cygnus instead of developing that piece of software from scratch. Cygnus is extensible and there is documentation about how to extend it, so new sinks can be developed. You would need to develop the sink for SQL Server. Note that there are already sinks for other SQL-based databases (MySQL and PostgreSQL in particular), so they could be a good starting point.
If in the end you develop a new sink, it would be a great contribution to the Cygnus catalog of sinks. Please don't hesitate to send us a pull request in that case.
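If instead you write a small Context Consumer yourself, a bare-bones sketch could look like this. The port, endpoint path, table name, and connection string are all assumptions; it simply stores the raw NGSIv2 notification JSON, whereas a real sink would flatten the entity attributes into columns.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class NotificationConsumer {
    static final String JDBC_URL =
        "jdbc:sqlserver://localhost:1433;databaseName=fiware;user=sa;password=<pwd>";

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(5050), 0);
        // Orion POSTs NGSIv2 notifications to the URL given in the subscription.
        server.createContext("/notify", exchange -> {
            try (InputStream in = exchange.getRequestBody()) {
                String body = new String(in.readAllBytes(), StandardCharsets.UTF_8);
                try (Connection c = DriverManager.getConnection(JDBC_URL);
                     PreparedStatement ps = c.prepareStatement(
                         "insert into notifications (payload) values (?)")) {
                    ps.setString(1, body); // store raw JSON for simplicity
                    ps.executeUpdate();
                } catch (Exception e) {
                    e.printStackTrace();
                }
                exchange.sendResponseHeaders(200, -1);
                exchange.close();
            }
        });
        server.start();
    }
}
```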
I'm relatively new to Azure and am having trouble finding what options are out there for connecting to an existing SQL database to push data into it.
The situation is that we have an external client who needs to connect to our Azure SQL database to push data into it, on an on-going basis. We can't give them permission to get into our database, so we're looking at what we can do allow data in. At this point the best option seems to be to create a web service deployed in Azure that will validate the data and then push it into our database.
The question I have is, are there other options to do this in an easier way? Are there Azure services or processes that can be set up to automatically process a file and pull the data into a database? Are there any other go-between options when each side has its own database and, for security reasons, can't just open up access to it?
Azure Data Factory works great for basic ETL. If neither party can grant direct access, you can use an intermediate repository like Blob Storage to drop CSV/XML/JSON files for ingestion. If they'll grant you access to pull, you can set up a linked service that more or less functions the same as a linked server in MSSQL. As of the latest release, ADF also supports Azure-hosted SSIS packages.
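For the intermediate-repository approach, the client side could be as simple as this sketch using the Azure Storage SDK for Java; the container and blob names are placeholders, and an ADF pipeline with a Blob event trigger would then validate and load the file.

```java
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobServiceClientBuilder;

public class DropFile {
    public static void main(String[] args) {
        // Container and blob names are illustrative placeholders.
        BlobContainerClient container = new BlobServiceClientBuilder()
            .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
            .buildClient()
            .getBlobContainerClient("ingest");

        BlobClient blob = container.getBlobClient("orders-2024-01-01.csv");
        blob.uploadFromFile("orders.csv", true); // overwrite if already present
        // An ADF pipeline triggered on blob creation can then validate the
        // file and copy the rows into the Azure SQL database.
    }
}
```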
I would do this via SSIS using SQL Server Management Studio (if it's a one-time operation). If you plan to do this repeatedly, you could schedule the SSIS job to execute on a schedule. SSIS will do bulk inserts in small batches, so you shouldn't have transaction log issues and it should be efficient (because of the bulk inserting). Before you do this insert, though, you will probably want to consider your performance tier so you don't get major throttling by Azure and possible timeouts.
I've got a SQL Server database and an application (.NET) that configures data in that database. On the other hand, I've got an application (VC++) that reads that data from the database and must be 'informed' as soon as possible of any change in the data saved in the database.
I would like to receive a signal from the database engine when some data has been changed by the first app. Of course, I could code some kind of message that communicates between both apps, but my question is whether that mechanism (some kind of ultra-trigger that warns the other app) exists in SQL Server or in any API for accessing SQL Server (Native, OLE DB, or even ODBC).
PS: Apologies for my English.
If your requirements include guaranteed delivery (e.g. no lost events) - you might want to consider an MSMQ based strategy.
Accessing MSMQ from Microsoft SQL Server
http://www.codeproject.com/KB/database/SqlMSMQ.aspx
Microsoft Support Knowledge Base Article: 555070
Posting Message to MSMQ from SQL Server
http://support.microsoft.com/kb/555070
You can use a SQL Server Agent job that will notify the other app, starting it with sp_start_job. Check this out.