I am using Change Data Capture (CDC) to capture data changes from a software application, and I am trying to generate the SQL statements (INSERT, UPDATE, DELETE) from the captured data.
Is there any proper way to get this done?
The approach I have worked with is: get all the change records from the CDC tables along with the action (insert/update/delete) and pass the batch of records to a stored procedure that accepts a table type as an input parameter. In the stored procedure you can use a cursor, or group by action, to perform the operations on the destination table. This way you don't need to generate dynamic SQL queries and run them on the database, and we have found this much more efficient than generating dynamic SQL and executing it on the DB.
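A minimal sketch of that pattern, assuming a made-up table type and destination table (dbo.ChangeBatch, dbo.Destination, and the column list are placeholders for your own schema):

-- Table type holding one row per captured change
CREATE TYPE dbo.ChangeBatch AS TABLE
(
    Id           INT           NOT NULL,
    Name         NVARCHAR(100) NULL,
    ChangeAction CHAR(1)       NOT NULL   -- 'I' = insert, 'U' = update, 'D' = delete
);
GO

CREATE PROCEDURE dbo.ApplyChangeBatch
    @Changes dbo.ChangeBatch READONLY
AS
BEGIN
    SET NOCOUNT ON;

    -- One set-based statement per action, instead of a cursor or dynamic SQL
    DELETE t
    FROM dbo.Destination AS t
    JOIN @Changes AS c ON c.Id = t.Id
    WHERE c.ChangeAction = 'D';

    UPDATE t
    SET t.Name = c.Name
    FROM dbo.Destination AS t
    JOIN @Changes AS c ON c.Id = t.Id
    WHERE c.ChangeAction = 'U';

    INSERT INTO dbo.Destination (Id, Name)
    SELECT c.Id, c.Name
    FROM @Changes AS c
    WHERE c.ChangeAction = 'I';
END
GO

The application passes the batch as a table-valued parameter, so a whole batch of changes is applied in three statements rather than one dynamically generated statement per change.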
I have an SSIS job and a relatively complex select that use the same data. I need to make it so that my client doesn't have to call them separately, but can use one thing to get the result of the select and run the job.
My original plan was to create a procedure, which would take the necessary input and then output a table variable with the select result.
However, after reading the Microsoft documentation, I found that table variables may not cope well with results of more than 100 rows, while I might want to select ~10,000 rows. And now I'm stumped. What is the best way to call a job and select data, from one component?
I have permissions to create views, procedures, and I can edit the SSIS job. The user will provide me with 2 parameters.
This is what I would suggest you do in this scenario, to take the complexity away from SSIS.
Create the SP that you wanted to, but instead of a table variable, push your output into a real table. This table can be added on the fly (dynamically, using a CREATE TABLE script) or can always exist in the DB as a buffer.
Call this SP in your control flow.
In the Data flow task, select from this buffer table.
After completing the SSIS work, flush the buffer table, i.e. truncate the table.
Caveat: you may face problems in concurrency scenarios. To eliminate them, add a BatchID or BatchStartTimeStamp column that stores a unique value for each run.
You can pass the value for BatchID or BatchStartTimeStamp from the SSIS package.
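A rough sketch of the buffer-table variant with a batch key (all object names here are illustrative, not from your system):

CREATE TABLE dbo.ResultBuffer
(
    BatchID UNIQUEIDENTIFIER NOT NULL,
    Col1    INT              NOT NULL,
    Col2    NVARCHAR(100)    NULL
);
GO

CREATE PROCEDURE dbo.FillResultBuffer
    @Param1  INT,
    @Param2  NVARCHAR(50),
    @BatchID UNIQUEIDENTIFIER
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.ResultBuffer (BatchID, Col1, Col2)
    SELECT @BatchID, s.Col1, s.Col2
    FROM dbo.SourceTable AS s      -- your "relatively complex select" goes here
    WHERE s.Col1 = @Param1
      AND s.Col2 LIKE @Param2;
END
GO

The package then generates one BatchID per run, calls the SP from an Execute SQL Task, reads SELECT Col1, Col2 FROM dbo.ResultBuffer WHERE BatchID = ? in the data flow source, and deletes that batch's rows at the end instead of truncating the whole table.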
I want to track each and every execution of all the stored procedures in a database. Is there any way, or any global event, where I can write SQL to insert a record into a table along with the stored procedure name or object id?
There are so many stored procedures in my database that I can't make changes to all the SPs and re-deploy them. I need a global event where I can write the SQL.
I know we have the sys.dm_exec_procedure_stats view (it shows the last execution time from the plan cache), but I want to track executions manually by inserting a record for each SP into a separate table.
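For reference, this is roughly what that DMV gives me (cache-based, so it is lost on restart or when the plan is evicted):

SELECT OBJECT_NAME(object_id, database_id) AS ProcName,
       execution_count,
       last_execution_time
FROM sys.dm_exec_procedure_stats
WHERE database_id = DB_ID();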
Answers will be greatly appreciated.
For that purpose you can create a separate audit table and write a trigger for insert, update and delete on each table in your database, so you can manually track all types of transactions. Or write only insert triggers on the tables that are used in your stored procedures.
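A minimal sketch of one such trigger, with a placeholder audit table and table name (you would create a trigger like this on each table you want tracked):

CREATE TABLE dbo.DmlAudit
(
    AuditID   INT IDENTITY(1,1) PRIMARY KEY,
    TableName SYSNAME    NOT NULL,
    DmlAction VARCHAR(6) NOT NULL,             -- INSERT / UPDATE / DELETE
    EventTime DATETIME   NOT NULL DEFAULT (GETDATE())
);
GO

CREATE TRIGGER dbo.trg_MyTable_Audit
ON dbo.MyTable
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @Action VARCHAR(6);

    -- UPDATE populates both pseudo-tables; INSERT only inserted; DELETE only deleted
    IF EXISTS (SELECT 1 FROM inserted) AND EXISTS (SELECT 1 FROM deleted)
        SET @Action = 'UPDATE';
    ELSE IF EXISTS (SELECT 1 FROM inserted)
        SET @Action = 'INSERT';
    ELSE
        SET @Action = 'DELETE';

    INSERT INTO dbo.DmlAudit (TableName, DmlAction)
    VALUES ('dbo.MyTable', @Action);
END
GO

Note that this records the DML against the tables, not the name of the stored procedure that issued it, so it is an indirect way of tracking procedure activity.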
Hi all, I have a requirement to create a web-based application using SQL Server 2005. The data comes from a third-party source in text format. This is my idea so far:
I have a file system watcher looking for a file in a directory
I loop through the file found, work out the columns, and insert the data row by row into a table
Once all data has been inserted, run a stored procedure against the table to do some more cleaning and create totals used within the web app
As you can see, there are mainly two steps involved in the import after the file has been found: storing the data in SQL Server, and then cleaning up values and doing some other work within my database. My question is: since I am looping through the values anyway, can I have a trigger (and yes, I do know that a trigger fires per statement, not for every row) do the cleaning for me as I insert the records into my table?
For example, I loop through the rows one by one, figure out the columns, and then insert them into the table. As that happens, a trigger fires and runs some script (possibly stored procedures) to do some other work on other tables. That way all my file system watcher needs to do is get the data and insert it into the table; the trigger will do all the other work. Is this advisable, and what will happen if a trigger is already running a script and it is called again by another insert to the table?
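Roughly the pattern I have in mind (all names here are made up):

-- Fires once per INSERT statement against the staging table
CREATE TRIGGER dbo.trg_Staging_Clean
ON dbo.StagingTable
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- Remember which rows this statement added, then run the cleanup proc
    INSERT INTO dbo.RowsToClean (Id)
    SELECT Id FROM inserted;

    EXEC dbo.CleanAndTotal;
END
GO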
Sorry for the long question
Thanks
Table A is imported from an Excel file into SQL Server. There is a series of UPDATE and DELETE operations performed on the table to replace certain columns with certain values. I have written a proc for this, but since I'm beginning to use SSIS, I need to know if this can be accomplished via an Execute SQL Task or by using any other transformation.
Yes, it is achievable with either the Slowly Changing Dimension component or Lookup components, but both of these require an RBAR (row by agonising row) update to be executed. If you have a small amount of data (< 50,000 records) this is probably OK, but it just doesn't work for larger datasets.
When using the Lookup or SCD components, the data flow is split into an insert and an update stream. The insert stream is reasonably fast, but the update stream has to be fed into an OLE DB Command transformation, which performs an update one data row at a time.
This is in contrast to the ELT method where you execute one UPDATE statement which performs all updates in one batch. This is generally much faster.
If you decide not to use the SSIS RBAR approach, you can stage your Excel data, then use an Execute SQL Task (in the control flow) to call your existing SP. In this way a batch update can be performed rather than an RBAR one.
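For example, once the Excel data is staged, the whole correction can be a couple of set-based statements rather than a per-row stream (table and column names are illustrative):

-- One batch UPDATE instead of an RBAR series of single-row updates
UPDATE t
SET    t.ColA = s.ColA,
       t.ColB = s.ColB
FROM   dbo.TableA   AS t
JOIN   dbo.StagingA AS s ON s.KeyCol = t.KeyCol;

-- Deletes can be batched the same way
DELETE t
FROM   dbo.TableA AS t
WHERE  NOT EXISTS (SELECT 1 FROM dbo.StagingA AS s WHERE s.KeyCol = t.KeyCol);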
I need to copy a large amount (~200,000) of records between two tables inside the same SQL Server 2000 database.
I can't change the original table to include the columns I would need, so the copy is the only solution.
I made a script with an INSERT ... SELECT statement. It works, but sometimes the .NET form that triggers the stored procedure catches an exception with a "timeout expired" error.
Is there a more effective way to copy this many records around?
Any tips on how to check where the timeout occurred in the database?
INSERT INTO your_destination_table (id, name)
SELECT id, name
FROM your_table
WHERE your_condition
And I'd suggest you run this on a separate thread so your form won't freeze. You can also increase the timeout; note that the timeout in your connection string only covers opening the connection, while the query timeout is set on the command itself.
If you can't avoid the multiple inserts, you can try splitting them into smaller batches, for instance sending only 50 queries at a time.
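One way to do that splitting on the database side, sketched for SQL Server 2000 (hence SET ROWCOUNT rather than a TOP variable; table names are placeholders, and id is assumed to uniquely identify a row):

-- Copy in chunks so no single statement runs long enough to hit the timeout
SET ROWCOUNT 50000;

DECLARE @rows INT;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    INSERT INTO dbo.TargetTable (id, name)
    SELECT s.id, s.name
    FROM dbo.SourceTable AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.id = s.id);

    SET @rows = @@ROWCOUNT;
END

SET ROWCOUNT 0;   -- restore normal behaviour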
Are you wanting to create an application to copy data between tables, or is this just a one-off? If you only need to do this once, create a script to execute on the database server itself to copy the data you need to transfer between tables.
Are you using a SqlCommand to execute the stored procedure?
If so, set the CommandTimeout:
myCmd.CommandTimeout = 360; //value is in seconds.
1. Compare the two databases with Redgate SQL Data Compare. Since the other table is empty, the script generated by the comparison will be all inserts; select the inserts for that table only.
2. Use SQL Multi Script from Redgate: add those scripts to Multi Script and execute them against the database. It will keep executing until complete, and then you can compare again to check that all the data came across correctly.
3. If you don't want to use Multi Script, create a command-line application to just insert the data.