I have one table named "Staff" in Access and also have this table (same name) in SQL Server 2008.
Both tables have thousands of records. I want to merge the records from the Access table into the SQL Server table without affecting the existing records in SQL Server. Normally I just export using the ODBC driver, and that works fine if the table doesn't already exist in SQL Server. Please advise. Thanks.
A simple append query from the local Access table to the linked SQL Server table should work just fine in this case.
So, just drop the first (from) table into the query builder. Then change the query type to append, and you are prompted for the append table name.
From that point on, just drop in the columns you want (do not drop in the PK column, as it need not be used or transferred in this case).
You can also type in the sql directly in the query builder. Either way, you will wind up with something like:
INSERT INTO dbo_custsql
( ADMINID, Amount, Notes, Status )
SELECT ADMINID, Amount, Notes, Status
FROM custsql1;
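If some of the rows may already exist on the SQL Server side, a hedged variation is an unmatched-records append that skips them. This is only a sketch and assumes ADMINID uniquely identifies a row in both tables:
INSERT INTO dbo_custsql
( ADMINID, Amount, Notes, Status )
SELECT c.ADMINID, c.Amount, c.Notes, c.Status
FROM custsql1 AS c
LEFT JOIN dbo_custsql AS s ON c.ADMINID = s.ADMINID
WHERE s.ADMINID IS NULL;
Rows whose ADMINID is already in the linked table are filtered out, so the existing SQL Server records are left untouched.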
This may help: http://www.red-gate.com/products/sql-development/sql-compare/
Or you could write a simple program to read from each data set and do the comparison, adding, updating, and deleting, etc.
I need to create a "ghost" table in SQL Server, which doesn't actually exist but is a result set of a SQL Query. Pseudo code is below:
SELECT genTbl_col1, genTbl_col2
FROM genTbl;
However, "genTbl" is actually:
SELECT table1.col AS genTbl_col1,
table2.col AS genTbl_col2
FROM table1 INNER JOIN table2 ON (...)
In other words, every time a query is run on the server that selects from "genTbl", I need it to simply build the result set from that query and treat it like a real table.
The situation is that I have software that runs queries on a database. I need to modify it, but I cannot change the software itself, so I need to trick it into thinking it can actually query "genTbl", when that table doesn't exist but is simply a query of other tables.
To clarify, the query would have to be a sort of procedure, available by default in the database (i.e. every time there is a query for "genTbl").
Use #TMP
SELECT genTbl_col1, genTbl_col2
INTO #TMP FROM genTbl;
It exists only in the current session. You can also use ##TMP to make it visible to all sessions.
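Since genTbl itself doesn't exist, the temp table has to be filled from the underlying join. A minimal sketch, where the id join column is only a placeholder for whatever the real join condition is:
SELECT table1.col AS genTbl_col1,
       table2.col AS genTbl_col2
INTO #TMP
FROM table1
INNER JOIN table2 ON table1.id = table2.id; -- id is a placeholder; use the real join condition
SELECT genTbl_col1, genTbl_col2 FROM #TMP;  -- later statements in the same session read from #TMP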
I'm trying to work out a way to copy all data from a particular table (let's call it opportunities) into a new table, with a timestamp of the date it was copied, for the sole purpose of generating historic data in a database hosted in Azure SQL Data Warehouse.
What's the best way to do this? So far I've created a duplicate table in the data warehouse with an additional column called datecopied.
The query I've started using is:
SELECT OppName, Oppvalue
INTO Hst_Opportunities
FROM dbo.opportunities
I am not really sure where to go from here!
SELECT INTO is not supported in Azure SQL Data Warehouse at this time. You should familiarise yourself with the CREATE TABLE AS SELECT (CTAS) syntax, which is the equivalent in Azure DW.
If you want to fix the copy date, simply assign it to a variable prior to the CTAS, something like this:
DECLARE #copyDate DATETIME2 = CURRENT_TIMESTAMP
CREATE TABLE dbo.Hst_Opportunities
WITH
(
CLUSTERED COLUMNSTORE INDEX,
DISTRIBUTION = ROUND_ROBIN
)
AS
SELECT OppName, Oppvalue, #copyDate AS copyDate
FROM dbo.opportunities;
I should also mention that the use case for Azure DW is millions and billions of rows with terabytes of data. It doesn't tend to do well at low volume, so consider whether you need this product, a traditional SQL Server 2016 install, or Azure SQL Database.
You can write an INSERT INTO ... SELECT query like the one below, which will work with SQL Server 2008+ and Azure SQL Data Warehouse:
INSERT INTO Hst_Opportunities
SELECT OppName, Oppvalue, DATEDIFF(SECOND,{d '1970-01-01'},current_timestamp)
FROM dbo.opportunities
OK, I have a database with a table LOOKUP (1st), and the same database on another server also with LOOKUP (2nd).
Is there a way I can insert into the 1st database from the 2nd, skipping duplicates if they exist, so that all other values present in the 2nd are inserted into the 1st? Basically I want the exact same database!
The thing that confuses me is that they are on different servers.
Could I export one to something like Excel and import it again to replace my database, or anything like that?
You will have to use two MERGE queries if you want to make both databases identical. This is because the first MERGE will only insert the records that are available in DB1 into DB2; DB1 will still not contain the records that are present in DB2 but not in DB1.
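A minimal sketch of one direction of that MERGE, assuming the remote copy is reachable through a linked server and that Id and Value are placeholders for the real LOOKUP columns (SERVER2 and MyDB are assumed names):
MERGE INTO dbo.LOOKUP AS tgt
USING SERVER2.MyDB.dbo.LOOKUP AS src  -- SERVER2 and MyDB are assumed linked server / database names
    ON tgt.Id = src.Id                -- Id is an assumed key column
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Value)                -- Value stands in for the remaining LOOKUP columns
    VALUES (src.Id, src.Value);
The mirror statement would then run on the other server, since the MERGE target needs to be a local table.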
I would suggest you do this task using SSIS.
You can use 2 sources DB1 and DB2 and a LOOKUP transformation on each source (LKP1 and LKP2).
Then you can insert the No Match output of LKP1 into DB2 as destination and No Match output of LKP2 into DB1 as destination.
This will solve the multi-server issue as well, because you can create a connection to any server in SSIS.
I want to create a copy of a table, say TestTable, with a new name, say TestTableNew, in the same database with the use of an SSIS package. I've created a "Transfer SQL Server Objects Task" for this with the source database specified as both the SourceDatabase and the DestinationDatabase. When I run this task, the original table TestTable is overwritten with a new, empty TestTable.
This might well be something really obvious that I've overlooked, but can I somehow specify another name for the destination table somewhere in this transfer task? Or should I solve this in another way?
You can't use the "Transfer SQL Server Objects Task" to copy a table to the same database because there isn't an option to specify the new table name. You would be copying table "TestTable" to table "TestTable", which will fail because they both have the same name.
You can set the "DropObjectsFirst" property to true, but that will make you lose your original table and its data, which I think is what happened in your test; otherwise you would have received a failure message.
The best option here is to use an "Execute SQL Task" to create the structure of your TestTableNew based on your TestTable and then do a simple OleDBSource -> OleDBDestination transformation to load all the data from one table to another.
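For the Execute SQL Task step, one simple way to create the empty structure is a sketch like the one below. Note that this copies column names and data types only, not indexes, keys or constraints, so script those separately if you need them:
SELECT TOP (0) *
INTO dbo.TestTableNew
FROM dbo.TestTable; -- creates an empty copy of the column structure
The OleDBSource -> OleDBDestination data flow can then load the rows from TestTable into TestTableNew.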
My knowledge of SSIS is very limited, but I assume you can run SQL commands passing in parameters, and therefore generate something like the following dynamically:
SELECT *
INTO TestTableNew
FROM TestTable
I want to update a static table on my local development database with current values from our server (accessed on a different network/domain via VPN). Using the Data Import/Export wizard would be my method of choice, however I typically run into one of two issues:
I get primary key violation errors and the whole thing quits. This is because it's trying to insert rows that I already have.
If I set the "delete from target" option in the wizard, I get foreign key violation errors because there are rows in other tables that are referencing the values.
What I want is the correct set of options that means the Import/Export wizard will update rows that exist and insert rows that do not (based on primary key or by asking me which columns to use as the key).
How can I make this work? This is on SQL Server 2005 and 2008 (I'm sure it used to work okay on the SQL Server 2000 DTS wizard, too).
I'm not sure you can do this in Management Studio. I have had some good experiences with RedGate SQL Data Compare in synchronising databases, but you do have to pay for it.
The SQL Server Database Publishing Wizard can export a set of sql insert scripts for the table that you are interested in. Just tell it to export just data and not schema. It'll also create the necessary drop statements.
One option is to download the data to a new table, then use commands similar to the following to update the target:
update t set
col1 = d.col1,
col2 = d.col2
from downloaded d
inner join target t on d.pk = t.pk

insert into target (col1, col2, ...)
select d.col1, d.col2, ... from downloaded d
where d.pk not in (select pk from target)
If you disable the FK constraints during the 2nd option and re-enable them after it finishes, it will work.
But if you are using an identity column to generate PK values that are involved in the FK, it will cause a problem, so it only works if the PK values remain the same.
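As a sketch of that constraint toggle, with dbo.Orders standing in for whichever table holds the FK that references the target:
ALTER TABLE dbo.Orders NOCHECK CONSTRAINT ALL;           -- stop checking FK constraints on the referencing table
-- ... run the insert/update against the target here ...
ALTER TABLE dbo.Orders WITH CHECK CHECK CONSTRAINT ALL;  -- re-enable the constraints and re-validate existing rows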