In my work area we have 100+ MS SQL Server databases across the country from which we need to query data back into a single SQL database. We can do this either with OPENROWSET or with a linked server, but which one should be used?
OPENROWSET is ad hoc, so a connection has to be established every time OPENROWSET is called. That should add some overhead.
A linked server, on the other hand, is persistent, so there is no need to establish a connection every time (I assume). But the central server would have to have all of those linked servers added to it. Will adding so many linked servers have any negative impact on the server?
Bottom line: for connecting to a lot of servers on a monthly basis, which is the better approach, OPENROWSET or a linked server?
As I recall, OPENROWSET executes the query on the remote server, whereas a linked server performs the join locally. (That is much more resource-intensive than any 'connection' concerns.)
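For reference, a minimal sketch of the two approaches (the server name RemoteBranch01, the credentials, and the table names are all placeholders; the exact provider string depends on your environment):

```sql
-- Ad-hoc distributed query: the connection string is supplied on every
-- call. Requires the 'Ad Hoc Distributed Queries' server option.
SELECT t.CustomerID, t.Region
FROM OPENROWSET(
        'SQLNCLI',
        'Server=RemoteBranch01;Trusted_Connection=yes;',
        'SELECT CustomerID, Region FROM Sales.dbo.Customers'
     ) AS t;

-- Linked-server equivalent: the server is registered once with
-- sp_addlinkedserver and then referenced by four-part name.
SELECT CustomerID, Region
FROM RemoteBranch01.Sales.dbo.Customers;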
I'm working on a project where I need to automatically run an INSERT statement to insert a result set. The problem is that it needs to go from a SQL Server instance over to a DB2 server. I can't create a file or script and then import it or run it on the other side; I need to insert or update the DB2 side from the SQL Server side.
Is this possible? I need this to run all by itself as part of a stored procedure in SQL Server.
You're looking for the linked server feature.
Typically, linked servers are configured to enable the Database Engine to execute a Transact-SQL statement that includes tables in another instance of SQL Server, or in another database product such as Oracle. Many types of OLE DB data sources can be configured as linked servers, including Microsoft Access and Excel. Linked servers offer the following advantages:
The ability to access data from outside of SQL Server.
The ability to issue distributed queries, updates, commands, and transactions on heterogeneous data sources across the enterprise.
The ability to address diverse data sources similarly.
(I believe most of the major RDBMSs have a similar feature)
For the most part, this essentially allows you to treat tables or sources in the other database as if they were part of the SQL Server instance - an INSERT statement should just work "normally".
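To make that concrete, here is a minimal sketch (the linked server name DB2LINK, the data source, and the table names are placeholders; the provider and connection details depend on which DB2 driver you have installed):

```sql
-- Register the DB2 server once (provider and connection details
-- here are illustrative, not a working configuration).
EXEC sp_addlinkedserver
    @server     = N'DB2LINK',
    @srvproduct = N'DB2',
    @provider   = N'IBMDADB2',   -- IBM OLE DB provider for DB2
    @datasrc    = N'SAMPLE';

-- Then an ordinary INSERT ... SELECT can target the remote table by
-- four-part name from inside a SQL Server stored procedure.
INSERT INTO DB2LINK.SAMPLE.MYSCHEMA.ORDERS (ORDER_ID, AMOUNT)
SELECT OrderID, Amount
FROM dbo.StagedOrders;
```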
As mentioned, you can use a linked server on the SQL Server side to perform operations between the two servers. I haven't done much with running DML on DB2 from SQL Server, but in my experience SSIS performs far better than linked servers when pulling data from DB2 into SQL Server using an OLE DB connection. You can read more about OLE DB connections in SSIS here, and you'll want to reference the DB2 documentation for the specific DB2 type (Mainframe, LUW, etc.) that's used for details on setting up the connection there. If you set up the SSIS catalog, you can run packages using SQL Server stored procedures, which you can either call directly or execute from an existing user stored procedure.
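Running a catalog-deployed package from T-SQL looks roughly like this (the folder, project, and package names are placeholders):

```sql
-- Create an execution for a package deployed to the SSIS catalog,
-- then start it; both procedures live in the SSISDB database.
DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
    @folder_name  = N'ETL',
    @project_name = N'DB2Transfers',
    @package_name = N'LoadFromDB2.dtsx',
    @execution_id = @execution_id OUTPUT;

EXEC SSISDB.catalog.start_execution @execution_id;
```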
I am a SQL Server database developer. We have a current requirement to link our product with an existing application of a client's. Our product has a SQL Server 2012 database, while the client's existing application uses Oracle 11g. Since the timeline for project completion is limited, we cannot migrate our application to Oracle.
The requirement is that we have to get the customer details from the Oracle database to do billing activities in our system.
So I went through a few links and found that a SQL Server linked server can be used to do this. I have successfully created a view over the Customer table in the Oracle database using a linked server, and it solves our problem.
Now here are my questions:
Are there any better solutions to do this other than a linked server?
Are there any drawbacks to using a linked server for this?
Thanks in advance
One drawback to consider is that the filtering on your view may take place at "your" end rather than in Oracle. For example, if you have a linked server, a view based on an OPENQUERY statement over it, and you run:
select id from myView where id = 4711
expecting the result to come back very quickly (assuming id is indexed, etc.), you may be in for a shock, because what will actually happen is:
the entire contents of the Oracle table are passed to SQL Server
SQL Server then filters the data, i.e. the filtering cannot be "pushed down" into the view objects (as they are remote).
N.B.: there are two ways to query a linked server (OPENQUERY and four-part naming), so this may not always apply, but you should be aware of the potential performance hit.
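The difference is easiest to see side by side (the linked server name ORA and the Oracle table name are placeholders):

```sql
-- A view wrapping OPENQUERY fixes the remote query text, so this outer
-- filter is applied locally AFTER the full result set has been fetched:
-- CREATE VIEW myView AS
--     SELECT * FROM OPENQUERY(ORA, 'SELECT id FROM customers');
SELECT id FROM myView WHERE id = 4711;

-- Putting the filter inside the OPENQUERY text makes Oracle do the
-- filtering, so only the matching rows cross the wire:
SELECT id
FROM OPENQUERY(ORA, 'SELECT id FROM customers WHERE id = 4711');
```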
I have an MS Access application with its data in a separate Access MDB. We need to move the data to MS SQL Server. When I did this previously it worked very well, with dramatic speed improvements. In this new upgrade, however, we are seeing some problems with the translation of SQL from Access to SQL Server.
We have the solution separated into an APP MDB and a DATA MDB, joined with linked tables. After migrating the data to SQL Server, we created an ODBC link and re-pointed the linked tables at the SQL Server data source over ODBC. We expected the SQL command generated by Access to simply be passed through to SQL Server, for SQL Server to execute with its normal efficiency and power.
When I put a trace on SQL Profiler, instead of seeing, for example, a simple "Delete * from table", we see a series of delete statements, one for each record in the table.
We see the same thing for updates: where we would expect, for example, "insert into table1 select a, b, c from table2" to be sent as a single SQL string and executed on the server, we instead see a series of insert statements, one per row being inserted, sent to the server.
It seems that Access is trying to work out the logic for everything but SELECT statements on the client side, rather than letting SQL Server take care of things as one would expect.
Has anyone experienced this behaviour before, and can you offer a suggestion to resolve it?
thanks.
I need to extract data from two databases which are on two different servers, but I can't use a linked server or OPENQUERY. Is there any other way to extract the data?
If you cannot use a linked server or OPENQUERY, then you will have to migrate the data between environments; those are the only ways to do cross-server queries.
There are multiple different ETL tools available that can move the information between environments, such as SSIS or Informatica.
It's worth noting that even with linked-server connections you can encounter performance issues, and transactions held open from one end of the connection can result in blocking, even when selecting very small amounts of information through the linked connection.
I was told by our IT team that we cannot use the OPENQUERY command in MS SQL Server any more. They claimed it could slow down the server because every query requires a full table load, all queries slow it down, and so on.
I was somewhat puzzled by this, as I thought an OPENQUERY command was similar to a 'pass-through' query in Access: the query goes to the IBM server, which executes the command and sends only the results back to SQL Server. I have read about OPENQUERY on the internet, and nothing I've read makes me believe that it loads or sends a whole table for SQL Server to filter.
I assume it's possible for them to lock down the DB2 servers and prevent linked servers from SQL Server, but for my future knowledge, can someone explain any perils of using OPENQUERY when connecting to IBM DB2?
Thanks,
Please read this. Can you avoid OPENQUERY? The best alternative would be to use a stored procedure call, or at least to craft an EXEC statement with a stored procedure target.
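One way to sketch that (the linked server name DB2LINK and the procedure name are placeholders): execute the statement on the DB2 side with EXEC ... AT, so the work stays on the remote server and only the results come back.

```sql
-- EXECUTE ... AT sends the statement to the linked server for execution
-- there; ? is the parameter placeholder. The linked server must have
-- the 'RPC OUT' option enabled for this to work.
EXEC ('CALL MYSCHEMA.GET_CUSTOMER(?)', 4711) AT DB2LINK;
```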