I'm doing it programmatically (I'm a newbie to SQL). I'm getting the data per table within the first DB using the query below, where the bracketed expression is a value from a list of table names that I need to make sure exist. If a table has corresponding values in the same table in DB X, I want to list every field that does not have the same value, reporting the table, field name, and row for each mismatch.
"SELECT * FROM [Dev.Chris21].[dbo].[" & PayrollTablemaskedarray(xxxxxx-2) & "]"
I can copy the whole thing into Excel, but I'm wondering: is there a way to do this using SQL?
Thanks
Since you mention that you're doing it programmatically I assume you're using Visual Studio. If so, you can take advantage of SQL Server Data Tools (SSDT) to do comparisons of two database schemas or two database data sets. You get this out of the box with VS2012 or VS2013 (and earlier versions too). Might be worth a look...
I am new to using SSIS and am on my third package. We are taking data from Oracle into SQL Server. In my Oracle table, the unique key is called recnum and is numeric(12,0). In this particular package, I am trying to take each record from Oracle, look it up in a SQL Server table to see if that unique key is found, and if not, add the record to the SQL Server table. My issue was that the lookup wouldn't find a match. After much testing, I came up with the following method that works, but I don't understand why I had to do this.
How I currently have it working:
I get the data from Oracle. In my next step, I added a derived column that uses the Oracle column. (The expression is just that field, no other formatting.) Then in the lookup I use the derived column instead of the column from Oracle.
We had already done this on another table where the unique key was numeric(8,0) and it worked ok without needing a derived column.
SSIS is very fussy about data types; lookups only work nicely if the data types match.
Double click on the Data Path lines between Data Flow objects to check data types. I use Data Conversion tasks or CAST statements to force matching data types when I use lookups.
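For example, when the lookup gets its reference data from a SQL query, you can force the type right there. A minimal sketch (the table name is a placeholder; numeric(12,0) matches the Oracle key from the question):
-- Use this as the lookup's source query so the reference column's
-- data type matches the incoming pipeline column exactly.
SELECT CAST(recnum AS numeric(12,0)) AS recnum
FROM dbo.SqlServerTable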
Hope this helps.
I have to compare tables on Server1, database A, dbo.X and Server2, database B, dbo.Y. Both table X and table Y should contain the same values.
So I need to validate that both tables contain the same values in every row and column. Is it possible to do this?
Thanks
If you do not want to use a tool like SSIS or Visual Studio, then a linked server will be required.
SELECT * FROM Server1.databaseA.dbo.X
EXCEPT
SELECT * FROM Server2.databaseB.dbo.Y
EXCEPT returns distinct rows from the left input query that aren’t output by the right input query.
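Note that this only checks one direction. To also catch rows that exist in Y but not in X, run the same query with the inputs swapped:
SELECT * FROM Server2.databaseB.dbo.Y
EXCEPT
SELECT * FROM Server1.databaseA.dbo.X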
Sure, you can do it by creating linked servers. Please follow this guide to create one:
Creating Linked Servers
After this you will be able to run SQL queries against the other server, like this:
SELECT name FROM [SRVR002\ACCTG].master.sys.databases ;
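If you prefer T-SQL to the GUI, a linked server pointing at another SQL Server instance can also be registered like this (the server name here is just the example from the query above):
EXEC sp_addlinkedserver
    @server = N'SRVR002\ACCTG',
    @srvproduct = N'SQL Server';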
There is an easier way if you have Visual Studio installed: there is an option to compare schema and data against any server, and it is very efficient, as you can update the target server from within the tool as well.
Visual Studio -> Tools -> SQL Server -> Data Comparison
I have a large table of information (around 11,000 rows, 4 columns) in Excel that uses a macro, and I need to import it into SQL Server (via Microsoft SQL Server Management Studio), where it will be used by another server to get the new information.
Example:
If I type into SQL:
INSERT INTO ENT_LINK_OBJECTS (OBJ_NAME, ENTITY_KEY, IDENTITY_KEY)
SELECT 'TDS-C1487-81236', ITEM_KEY, 1
FROM ENT_ITEM_MASTER AS M
WHERE M.ITEM_CODE = 'TL-123'
   OR M.ITEM_CODE = 'TL-456'
I can then open the program which holds all this information, called Matrix, which prompts me to enter an item key and/or code and/or type, etc. (with all possible files listed below it), and hit search (image 1). If I type TL-123 into the item code section (image 2), it narrows down the files to any containing TL-123 (image 3). When I double-click one, I can click on many tabs, one of which is "Links". In that tab, under document name, the value TDS-C1487-81236 appears (image 4). How would I go about making that happen?
(Images 1-4 omitted; ENTER is hit after entering the item code in image 2.)
The website below gives a good explanation of what I am getting at, but I do not know how to implement it. What would be the most efficient way to migrate the data from my Excel document to the SQL server?
http://sqlmag.com/business-intelligence/excel-macro-creates-insert-statements-easy-data-migration
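From what I can tell, the article's approach is to have an Excel macro emit one INSERT statement per worksheet row, so the generated script would look roughly like this (the values here are illustrative, not real data):
INSERT INTO ENT_LINK_OBJECTS (OBJ_NAME, ENTITY_KEY, IDENTITY_KEY)
VALUES ('TDS-C1487-81236', 12345, 1);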
Have you tried DTSWizard? It's a GUI-based tool for doing exactly this, and it ships with MS SQL Server.
Create a linked server, or use a statement like OPENROWSET, to access the Excel sheet. That would be the easiest and fastest method of accessing an Excel sheet via SQL.
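A minimal sketch of the OPENROWSET route (the file path and sheet name are placeholders; it requires the ACE OLE DB provider to be installed and the 'Ad Hoc Distributed Queries' server option to be enabled):
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\Data\Import.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');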
Does anyone know how the SchemaCompare in Visual Studio (using 2010 currently) determines how to handle [SQL Server 2008R2] database table updates (column data type, optionality, etc)?
The options are to:
Use separate ALTER TABLE statements
Create a new table, copy the old data into it, and rename the old table so that the new one can be renamed to assume the proper name
I'm asking because we have a situation involving a TIMESTAMP column (for optimistic locking). If SchemaCompare uses the new table approach, the TIMESTAMP column values will change & cause problems for anyone with the old TIMESTAMP values.
I believe Schema Compare employs the same CREATE-COPY-DROP-RENAME (CCDR) strategy as VSTSDB described here: link
Should be able to confirm this by running a compare and scripting out the deploy, no?
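For reference, the CCDR pattern in a generated deploy script looks roughly like this (a sketch; the table and columns are hypothetical). Note that a TIMESTAMP/rowversion column cannot be copied - SQL Server always generates fresh values on insert - which is exactly why the old values would break:
CREATE TABLE dbo.tmp_MyTable (Id int NOT NULL, Name varchar(50), RowVer rowversion);
-- The rowversion column cannot appear in the insert list; new values are generated.
INSERT INTO dbo.tmp_MyTable (Id, Name)
SELECT Id, Name FROM dbo.MyTable;
DROP TABLE dbo.MyTable;
EXEC sp_rename 'dbo.tmp_MyTable', 'MyTable';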
I'm trying to export some tables from SQL Server 2005 and then create those tables and populate them in Oracle.
I have about 10 tables, varying from 4 columns up to 25. I'm not using any constraints/keys, so this should be reasonably straightforward.
Firstly I generated scripts to get the table structure, then modified them to conform to Oracle syntax standards (i.e. changed nvarchar to varchar2).
Next I exported the data using SQL Server's export wizard, which created a CSV flat file. However, my main issue is that I can't find a way to force SQL Server to double-quote column names. One of my columns contains commas, so unless I can find a method for SQL Server to quote column names, I will have trouble when it comes to importing this.
Also, am I going the difficult route, or is there an easier way to do this?
Thanks
EDIT: By quoting I'm referring to quoting the column values in the CSV. For example, I have a column which contains addresses like
101 High Street, Sometown, Some
county, PO5TC053
Without changing it to the following, it causes issues when loading the CSV:
"101 High Street, Sometown, Some
county, PO5TC053"
After looking at some options with SQL Developer, and at manually trying to export/import, I found a utility in SQL Server Management Studio that gets the desired results and is easy to use. Do the following:
Go to the source schema on SQL Server
Right-click > Export data
Select source as current schema
Select destination as "Oracle OLE provider"
Select properties, then add the service name into the first box, then username and password, be sure to click "remember password"
Enter query to get desired results to be migrated
Enter table name, then click the "Edit" button
Alter the mappings: change nvarchars to varchar2, and INTEGER to NUMBER (see the sketch after these steps)
Run
Repeat process for remaining tables, save as jobs if you need to do this again in the future
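For reference, the kind of type change made in the "Alter mappings" step is sketched below (the table and columns are hypothetical):
-- SQL Server source definition:
--   CREATE TABLE demo (name nvarchar(100), qty int);
-- Oracle target after adjusting the mappings:
CREATE TABLE demo (name VARCHAR2(100), qty NUMBER(10));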
Use the SQLDeveloper migration tools
I think quoting column names in Oracle is something you should not use. It causes all sorts of problems.
As Robert has said, I'd strongly advise against quoting column names. The result is that you'd have to quote them not only when importing the data, but also whenever you want to reference that column in a SQL statement - and yes, that probably means in your program code as well. Building SQL statements becomes a total hassle!
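To illustrate (a hypothetical Oracle table):
CREATE TABLE demo ("My Column" VARCHAR2(50));
SELECT "My Column" FROM demo;  -- works, but the quotes and exact case are now mandatory
SELECT "MY COLUMN" FROM demo;  -- fails with ORA-00904: quoted identifiers are case-sensitive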
From what you're writing, I'm not sure if you are referring to the column names or the data in these columns. (Can SQL Server really have a comma in a column name? I'd be really surprised if there was a good reason for that!) Quoting the column content should be done for any string-like columns (although I've found that other characters usually work better, as the need to "escape" quotes becomes another issue). If you're exporting to CSV, that should be an option... but then I'm not familiar with the export wizard.
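If the wizard won't quote the values itself, one workaround is to pre-quote the string columns in the export query (a sketch; the table and column names are assumptions based on the question):
-- Wrap the value in double quotes and double any embedded quotes,
-- the standard CSV escaping convention.
SELECT '"' + REPLACE(Address, '"', '""') + '"' AS Address
FROM dbo.Addresses;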
Another idea for moving the data (depending on the scale of your project) would be to use an ETL/EAI tool. I've been playing around a bit with the Pentaho suite and their Kettle component. It offered a good range of options to move data from one place to another. It may be a bit oversized for a simple transfer, but if it's a big "migration" with the corresponding volume, it may be a good option.