How to update a table while using it at the same time? - sql-server

I have a local DB (I'm using SQL Server Express) named PCNAME\SQLEXPRESS. I need to load data from the main database at MAINDB\HYEAH, so I linked the main DB and I was able to insert data from the main DB into the local DB using a stored procedure.
The problem I have is that I can't figure out the correct way to do the following:
I'm constantly using the data imported from the main DB; that data is in the table Credits, and I'm always querying, inserting or updating records in that table.
But every 10 minutes I have to reload the data from the main DB into Credits, and I can't stop using the data while that happens. I need to find a way to keep using and manipulating this data while it is being reloaded from the main DB.
I'm not an expert in DB or SQL transactions so I thought about this solution:
The first time I load the data from the main DB I'll do it directly into the table Credits. On subsequent loads I'll load the data into a temporary table, and when the stored procedure finishes, I'll replace Credits with the data from the temporary table. But I think this is a poor approach, because if I delete all the data from Credits to replace it with the temporary table's data, I won't be able to keep using the data in the meantime, so I'm stuck.
Is there a way to properly achieve this?
Thank you!

One option would be to use synonyms.
BEGIN TRY
    -- drop the synonym if it already exists
    DROP SYNONYM working_table;
END TRY
BEGIN CATCH
    -- ignore the error if the synonym wasn't there
END CATCH;

CREATE SYNONYM working_table FOR import_table_a;
You can now do your selects and updates against working_table and they will go to import_table_a. When you need to reload the data (into import_table_b), you just drop the synonym and re-point it at the new version of the table.
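For illustration, a minimal sketch of the reload-and-swap cycle, assuming the two staging tables are import_table_a and import_table_b as above; the column names and the linked-server path to the main database are placeholders:

-- reload the table the synonym is NOT currently pointing at
TRUNCATE TABLE import_table_b;
INSERT INTO import_table_b (col1, col2)        -- placeholder columns
SELECT col1, col2
FROM [MAINDB\HYEAH].MainDb.dbo.Credits;        -- hypothetical linked-server path

-- swap the synonym so readers immediately start hitting the fresh copy
BEGIN TRY
    DROP SYNONYM working_table;
END TRY
BEGIN CATCH
END CATCH;
CREATE SYNONYM working_table FOR import_table_b;

Note the drop/create pair isn't atomic, so if readers can't tolerate a brief window where the synonym is missing, it's worth doing the swap inside a short transaction or at a quiet moment.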
But do take on board the other comments that imply that you might be fixing the wrong problem :)

Related

Backing up a table before deleting all the records and reloading it in SSIS

I have a table named abcTbl whose data is populated from tables in a
different database. Every time I load data into abcTbl, I delete
everything in it and then load the buffered data into it.
This package runs daily. My question is: how do I avoid losing the data
in abcTbl if the load into it fails? My first step is deleting all the
data in abcTbl, then selecting the data from various sources into a
buffer, and then loading the buffer data into abcTbl.
We can encounter issues like failed connections, the package stopping
prematurely, supernatural forces trying to stop/break my package from
running smoothly, etc., which would leave the package losing all the
data in the buffer after I have already deleted the data from abcTbl.
My first intuition was to save the data from abcTbl into a backup table
and then delete the data in abcTbl, but my DBAs wouldn't be too
thrilled about creating a backup table in every environment just for
this package, and giving me the rights to create backup tables on the
fly and then delete them again is out of the question too. This data is
not business critical and can be repopulated if lost.
But, what is the best approach here? What are the best practices for this issue?
For backing up your table, instead of loading data from one table (original) to another table (backup), you can just rename your original table to a backup name, create the original table again with the same structure as the backup, and drop the renamed table only when your data load is successful. This can save the time it takes to transfer data from one table to another. You may want to test which approach is faster for your data and table structure, but this is also one way to do it. If you have a lot of data in that table, the approach below may be faster.
EXEC sp_rename 'abcTbl', 'abcTbl_bkp';
CREATE TABLE abcTbl (...);   -- keep the same table structure as abcTbl_bkp
-- load your new data into abcTbl
DROP TABLE abcTbl_bkp;       -- only after the load has succeeded
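A slightly fuller sketch of the same sequence, with a made-up two-column structure purely for illustration (substitute the real abcTbl definition and data source):

-- 1. move the current data out of the way by renaming the table
EXEC sp_rename 'abcTbl', 'abcTbl_bkp';

-- 2. recreate abcTbl with the same structure as abcTbl_bkp (hypothetical columns)
CREATE TABLE abcTbl (
    id   INT          NOT NULL,
    name VARCHAR(50)  NULL
);

-- 3. load the new data (placeholder source)
INSERT INTO abcTbl (id, name)
SELECT id, name FROM otherDb.dbo.sourceTable;

-- 4. drop the backup only once the load has finished successfully
DROP TABLE abcTbl_bkp;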
I'm trying to figure this out, but I think what you are asking for is a method to capture the older data before loading the new data. I would agree with your DBAs that a separate table for every reload would be extremely messy and not very usable if you ever needed it.
Instead, create a table that copies your load table but adds a single DateTime field (say history_date). On each load you would flow all the data in your primary table into the backup table, using a Derived Column task in the Data Flow to add the history_date value.
Once the backup table is complete, either truncate or delete the contents of the current table, then load the new data.
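If you'd rather do the archive step in T-SQL (for example from an Execute SQL Task) instead of a Data Flow, the idea is roughly this; abcTbl_history and its columns are placeholders:

-- stamp the current contents with the load time and copy them to the history table
INSERT INTO abcTbl_history (id, name, history_date)   -- placeholder columns
SELECT id, name, GETDATE()
FROM abcTbl;

-- then clear the live table before the new load
TRUNCATE TABLE abcTbl;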
Instead of creating additional tables, you can set the package to execute as a single transaction. By doing this, if any component fails, all the tasks that have already executed will be rolled back and subsequent ones will not run. To do this, set the TransactionOption property to Required on the package, which makes the package begin a transaction. Then set the same property to Supported on all the components that you want to succeed or fail together. The Supported level makes those tasks join the transaction that is already in progress in the parent container, in this case the package. If there are other components in the package that you want to commit or roll back independently of these tasks, you can place the related objects in a Sequence container and apply the Required level to the Sequence instead. An important thing to note is that if anything performs a TRUNCATE, then all other components that access the truncated object will need to have the ValidateExternalMetadata option set to False to avoid the known blocking issue that results from this.

Stored procedure deleting temporary table

I have a stored procedure which makes use of a global temporary table, ##temp, created on the fly using select * into ##temp from tablename.
The problem I am having is that this stored procedure seems to drop the table, or make it available only for the moment the query is run, despite the ## prefix which, as far as I know, makes the table global and usable by other sessions.
I am using SSRS to run the stored procedure and drill through from one report to another: the first report only shows charts, and the second report, which uses the same stored procedure via an action link and a parameter, doesn't recognize the ##temp table.
Now that you have the background, is there a way around this or a better way of doing it? Keep in mind we don't have a data warehouse at the moment, so we're just using temporary tables as a workaround.
Thanks
From MSDN:
Global temporary tables are automatically dropped when the session that created the table ends and all other tasks have stopped referencing them. The association between a task and a table is maintained only for the life of a single Transact-SQL statement. This means that a global temporary table is dropped at the completion of the last Transact-SQL statement that was actively referencing the table when the creating session ended.
If you have admin access to the server, try this answer.
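To illustrate the behavior described in that quote (tablename is the placeholder from the question):

-- Session 1 (e.g. the first report execution) creates the global temp table
SELECT * INTO ##temp FROM tablename;

-- Session 2 (e.g. the drill-through report) can read it only while Session 1
-- is still open and something still references the table
SELECT * FROM ##temp;

-- once the creating session ends and nothing references ##temp any more,
-- SQL Server drops it automatically, which is why the second report can't find it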

Is it wise to use triggers as part of an import routine

Hi all, I have a requirement to create a web-based application using SQL Server 2005. The data comes from a third-party source in text format. This is my idea so far:
I have a file system watcher looking for a file in a directory
I loop through the file found, find the columns and insert the data row by row into a table
Once all the data has been inserted, run a stored procedure against the table to do some more cleaning and create the totals used within the web app
As you can see, there are mainly two steps involved in the import after the file has been found: storing the data in SQL Server, and then cleaning up values and doing some other work within my database. My question is: since I am looping through the values anyway, can I have a trigger (and yes, I do know that a trigger fires per statement, not for every row) do the cleaning for me as I insert the records into my table?
For example, I loop through the rows one by one, figure out the columns and then insert them into the table. As that happens, a trigger fires and runs some script (possibly stored procedures) to do some other work on other tables, something like the sketch below. That way all my file system watcher needs to do is get the data and insert it into the table; the trigger does all the other work. Is this advisable, and what will happen if a trigger is already running a script and is fired again by another insert to the table?
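Roughly what I have in mind; the table, column and procedure names below are just placeholders:

CREATE TRIGGER trg_ImportClean ON dbo.ImportTable
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- work off the inserted pseudo-table so the trigger copes with
    -- multi-row inserts as well as row-by-row ones
    INSERT INTO dbo.ImportTotals (SomeKey, SomeTotal)
    SELECT SomeKey, SUM(SomeValue)
    FROM inserted
    GROUP BY SomeKey;

    -- or hand the heavier clean-up off to a stored procedure
    EXEC dbo.usp_CleanImportedData;
END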
Sorry for the long question
Thanks

How to create a trigger to populate a table from another table in a different database

Basically what I'm trying to do is create a dynamic trigger so that if a table in database 1 has a new record entered, and it falls into the category of data that I need for database 2, it automatically populates the table in database 2 without me needing to update it manually.
Right now I am going into the table in database 1, sorting for the category I need and copying that data into the table in database 2.
I tried to make this process easier with a select query for the columns that I need from database 1 to database 2, which works fine, however it overwrites what I already have and I basically have to recreate it every time.
So after all that rambling, here is exactly what I need to know: is there a way to create a trigger so that if a new line item is entered in database 1 with a tag matching the type of material I need, it is transferred to database 2? On top of that, I only need to transfer 2 columns from database 1 to database 2.
I would try to post sample code, however I have no idea where to start on this.
I suggest you look into Service Broker messaging. We use it quite a bit and it works quite well. You can send messages to the other database with the data that needs to be inserted and allow the second database to do all the work. This will alleviate the worries about the second database being offline or causing an error which rolls back into your trigger. If the second database is unavailable the messages will queue up in your database until it can send them. This isn't the easiest thing to set up but is a way to keep the two databases from being so closely tied together.
Service Broker
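For orientation only, a very rough sketch of the sender-side objects involved; every name here is made up, the receiver-side queue, service and activation procedure are omitted, and Service Broker has to be enabled on both databases:

-- in database 1 (the sender)
CREATE MESSAGE TYPE [//Demo/NewRowMessage] VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT [//Demo/NewRowContract] ([//Demo/NewRowMessage] SENT BY INITIATOR);
CREATE QUEUE dbo.SenderQueue;
CREATE SERVICE [//Demo/SenderService] ON QUEUE dbo.SenderQueue ([//Demo/NewRowContract]);

-- inside the trigger, send the inserted values as a message instead of
-- writing directly into database 2
DECLARE @handle UNIQUEIDENTIFIER;
DECLARE @body XML;
SET @body = (SELECT col1, col2 FROM inserted FOR XML PATH('row'));

BEGIN DIALOG CONVERSATION @handle
    FROM SERVICE [//Demo/SenderService]
    TO SERVICE '//Demo/ReceiverService'
    ON CONTRACT [//Demo/NewRowContract]
    WITH ENCRYPTION = OFF;

SEND ON CONVERSATION @handle
    MESSAGE TYPE [//Demo/NewRowMessage] (@body);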
I am unclear about the logic in your selection but if you want to save a copy of what was just inserted into table1 into a table (table2) on another database, using a trigger, you can try this:
create trigger trig1 on dbo.table1
after insert as
    -- use the inserted pseudo-table; VALUES cannot reference it directly
    insert into database2.dbo.table2 (col1, col2)
    select col1, col2 from inserted;
You could use an AFTER INSERT Trigger like this:
-- run this in FirstDB; the trigger name and the table it is on cannot be database-qualified
CREATE TRIGGER [dbo].[YourTrigger]
ON [dbo].[Table]
AFTER INSERT
AS
BEGIN
    INSERT INTO [OtherDB].[dbo].[Table]
    SELECT ...   -- the columns to copy
    FROM inserted
END
I recommend you consider non-trigger alternatives as well though. Cross-DB triggers could be risky (what if the other db is offline, etc.)

SQL Server - Copy Data Between Tables

I need to copy a large amount (~200,000) of records between two tables inside the same SQL Server 2000 database.
I can't change the original table to include the columns I would need, so the copy is the only solution.
I made a script with an INSERT ... SELECT statement. It works, but sometimes the .NET form that triggers the stored procedure catches an exception with a "timeout expired" error.
Is there a more effective way to copy this many records around?
Any tips about how to check where the timeout occurred in the database?
INSERT INTO your_target_table (id, name)
SELECT id, name
FROM your_table
WHERE your_condition
And I'd suggest you run the copy on a different thread so the form won't freeze; you can also increase the timeout, which is in your connection string.
If you can't avoid the multiple inserts, you can try to split them into smaller batches, for instance sending only 50 queries at a time.
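A rough sketch of that batching in T-SQL; the question mentions SQL Server 2000, so SET ROWCOUNT is used instead of TOP (@n), and the table, column and condition names are the same placeholders as above:

SET ROWCOUNT 5000   -- copy at most 5000 rows per statement

WHILE 1 = 1
BEGIN
    INSERT INTO your_target_table (id, name)
    SELECT s.id, s.name
    FROM your_table s
    WHERE your_condition
      AND NOT EXISTS (SELECT 1 FROM your_target_table t WHERE t.id = s.id)   -- skip rows already copied

    IF @@ROWCOUNT = 0 BREAK   -- nothing left to copy
END

SET ROWCOUNT 0   -- reset so later statements are unaffected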
Are you wanting to create an application to copy data between tables or is this just a one-off solution? If you only need to do this once, you should create a script to execute on the database server itself to copy the data you need to transfer between tables.
Are you using a SqlCommand to execute the stored procedure?
If so, set the CommandTimeout:
myCmd.CommandTimeout = 360; //value is in seconds.
1. Compare the two databases with Redgate Data Compare; since the other table is empty, the script generated from the comparison will be all inserts. Select the inserts for that table only.
2. Use Redgate Multi Script: add those scripts to Multi Script and execute them against that database. It will keep executing until complete, and you can then compare again to check that all the data came across correctly.
3. If you don't want to use Multi Script, create a command-line application to just insert the data.
