SQL Server linked server performance on Oracle after upgrade 2008->2016 - sql-server

For years I have been running a SQL Server 2008 instance with several interfaces to an ora11 database, exchanging data day by day, and everything has been running correctly.
A few days ago we upgraded from SQL Server 2008 R2 to 2016 and imported the whole SQL Server 2008 database into our new SQL Server 2016 instance.
We re-established our interfaces and everything looked good. I created the linked server with an ora11 client connecting to the ora11 db, and the connection worked fine.
But since then, the stored procedures I use in my interfaces take a lot more time to run. On SQL Server 2008 R2, the linked server updates to ora11 took a few minutes. The exact same procedure now takes hours to update the same number of rows to Oracle.
My interface procedures look roughly like this:
collect local data using a cursor
iterate through the cursor, looking in the remote database using linked server if the record exists
insert the missing record using linked server in Oracle db
close everything and complete
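For illustration, a minimal T-SQL sketch of that pattern; the linked server name ORALNK, the local table dbo.LocalSource, and the remote table SCOTT.TARGET_TAB with columns ID and NAME are placeholders, not my real objects:

DECLARE @Id INT, @Name NVARCHAR(100), @Exists INT;

DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT Id, Name FROM dbo.LocalSource;   -- 1. collect local data

OPEN cur;
FETCH NEXT FROM cur INTO @Id, @Name;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- 2. check through the linked server whether the record already exists in Oracle
    SELECT @Exists = COUNT(*) FROM ORALNK..SCOTT.TARGET_TAB WHERE ID = @Id;

    -- 3. insert the missing record into Oracle through the linked server
    IF @Exists = 0
        INSERT INTO ORALNK..SCOTT.TARGET_TAB (ID, NAME) VALUES (@Id, @Name);

    FETCH NEXT FROM cur INTO @Id, @Name;
END;
CLOSE cur;
DEALLOCATE cur;

Every iteration makes at least one remote round trip over the linked server, so this style of procedure is very sensitive to any change in the provider, driver, or network path.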
Why is it suddenly so much slower when using exactly the same source? Is there anything new in SQL Server 2016 that needs to be configured properly?
What is wrong?

Related

SQL Server 2014 poor response

Premise / What I want to achieve
As part of a server replacement, the SQL Server version was upgraded.
Since then, response has been worse than in the old environment, and I have not been able to improve it.
Problems and error messages that have occurred
Event 1: Response deteriorated across the client-server system as a whole, including login.
Event 2: When accessing a linked server configured with the Oracle Provider for OLE DB, a simple SELECT of 30,000 records deteriorated from a few seconds to two and a half minutes.
Environment
"Old environment"
Built on VMware
Windows Server 2008 R2 Standard SP1
SQL Server 2014 32-bit (12.0.5000.0)
Linked server uses the Microsoft OLE DB Provider for Oracle
"New environment"
Built on VMware
Windows Server 2012 R2 Standard
SQL Server 2014 64-bit (12.0.5000.0)
Linked server uses the Oracle Provider for OLE DB
Things I tried
I suspected the network, but monitoring with Performance Monitor showed nothing noticeable.
The same goes for CPU, memory, disk, and so on.
Also, when the data is fetched from the external DB without going through the linked server, it comes back in the normal time, so I suspect the problem is on the SQL Server side.
I would appreciate advice on how to identify the cause. Thank you.
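One way to narrow it down (just a sketch; LINKEDORA and SCOTT.SOME_TABLE are placeholder names) is to time the same select through the four-part name, where the OLE DB provider fetches the rows on the SQL Server side, against OPENQUERY, where the statement is shipped to Oracle and executed there:

SET STATISTICS TIME ON;

-- four-part name: rows are pulled through the OLE DB provider on the SQL Server side
SELECT * FROM LINKEDORA..SCOTT.SOME_TABLE;

-- OPENQUERY: the query text is passed to Oracle and executed remotely
SELECT * FROM OPENQUERY(LINKEDORA, 'SELECT * FROM SCOTT.SOME_TABLE');

SET STATISTICS TIME OFF;

If only the four-part-name form is slow, the time is likely being spent in the provider's row fetching on the SQL Server side rather than in Oracle or the network.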

What are my options for accessing an SQL Server database through MS Access front-end while offline

I'm currently working on a project proposal which would require moving multiple Access databases into a new MS SQL Server database. The idea is to keep the front end program as MS Access so that the users are familiar with the process of inputting data and creating reports.
However, things get complicated in that the internet in the areas where the survey will be collected has poor connectivity and will be out from time to time. I had thought of a few ways of solving this issue but all of them are cumbersome:
1) Having a PC with a router that stores the SQL Server database in offline mode; the data-entry PCs connect to that PC through the router. The PC holding the SQL Server database can then back up the database to the server whenever it has an internet connection.
2) Adding the data to MS Access databases that can then be merged with the SQL Server database at specified intervals (this would probably cause some issues).
We've done option 1 before for similar projects but never for connecting to an SQL Server database in offline mode. However, it seems feasible.
My question is: Does anyone know of a way of using Access as a front end application for SQL Server and being able to update data during times without internet connectivity? The SQL Server database would automatically assign primary keys, so, duplicate unique values shouldn't be an issue while syncing the data.
Thanks for your help. I've been having a hard time finding an answer on Google, and syncing databases is complicated at the best of times. I'm really just looking for a starting point to see if there are easier ways of accomplishing this.
I would run the free edition of SQL Server Express on all laptops, so the Access database would be the front end to the local SQL Server Express instance. SQL Server Express can be a subscriber to the "main" SQL Server database; you thus use SQL Server replication to sync those local instances to the master server. Of course the main SQL Server can't be the free edition: you can't publish the database for replication from a free edition, but free editions can certainly be used as subscribers.
This approach would eliminate the need to build or write special sync software for the Access application. You do a traditional migration of the Access back end (data tables) to SQL Server, and then simply run the Access application locally with SQL Server Express installed on each laptop. You then fire off a sync to the main SQL Server instance when the laptops are back at the office.
The other possibility would be to adopt the Microsoft Sync Framework (.NET). This would also allow syncing and would eliminate the need to run SQL Server Express on each machine. I think the least amount of effort, though, is to sync the local SQL Server Express instances with the main SQL Server instance running at the office (but that office instance cannot be a free edition).
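For reference, a very rough sketch of the replication side, run at the publisher (which cannot be an Express edition). Names such as SurveyDB, SurveyPub, Responses, and LAPTOP01 are placeholders, and configuring the distributor and the Snapshot Agent is still required on top of this:

-- enable the database for merge publishing
EXEC sp_replicationdboption
    @dbname = N'SurveyDB', @optname = N'merge publish', @value = N'true';

USE SurveyDB;

-- create the merge publication
EXEC sp_addmergepublication @publication = N'SurveyPub';

-- publish the table(s) the Access front end writes to
EXEC sp_addmergearticle
    @publication = N'SurveyPub', @article = N'Responses',
    @source_owner = N'dbo', @source_object = N'Responses';

-- register a pull subscription for one laptop running SQL Server Express
-- (the laptop itself also needs a matching sp_addmergepullsubscription)
EXEC sp_addmergesubscription
    @publication = N'SurveyPub', @subscriber = N'LAPTOP01',
    @subscriber_db = N'SurveyDB', @subscription_type = N'pull';

The laptops then sync whenever they are back on the office network.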

SQL Server not getting rows from OraOLEDB linked server

I have two Windows Server 2008 64-bit machines: one running Oracle 10.2.x.x.x (Express Edition) and the other running SQL Server 2008 R2 with ODAC 12c (12.1.0.2.4).
I created the linked server and tested the connection, and it passes. I can see all the Oracle tables, but when I query them, for example:
SELECT *
FROM ORACLE..USER.PERSON
All I get back are the column headers with no rows. If I run the query from Oracle SQL Developer I get around 13,000 rows.
What could be the problem? I thought it was a problem with backward compatibility, but according to this link it is not. Could it be something with permissions/security?
Well, it turned out to have nothing to do with the linked server configuration. I didn't know that after every insert on the Oracle side I need to commit the changes:
SQL> commit;

How to view a list of databases under server in SQL Anywhere like we do in MSSQL

I am very new to SQL Anywhere. I have been working with MSSQL for a long time.
In MSSQL, if we need to see the list of databases under a server, we can see it under Server Explorer.
How can I do the same in SQL Anywhere?
I have just installed SQL Anywhere 16 and have no idea how to find the server name etc.
There should be a program called "SQL Central" (scjview.exe) that was installed with the SQL Anywhere server. Run that and you should be able to see the servers.
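If you prefer a query to the GUI, SQL Anywhere also has a system procedure for this; I believe the calls below exist in version 16, but please verify them against your documentation:

CALL sa_db_list();      -- lists the databases currently running on the connected server
SELECT db_name();       -- name of the database you are currently connected to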

SQL Server Reporting Services (SSRS) DB upgrade independent of source database

My client has two business databases running on SQL Server 2005, together with the SSRS databases (ReportServer and ReportServerTempDB), all on the same W2K3 server. My client is thinking about separating SSRS out from that server, moving it to a W2K12 R2 server and upgrading the SSRS databases to SQL Server 2014, while the business databases remain on SQL Server 2005/W2K3 (because of $$$).
Will that be possible, and is it likely to cause problems in the future?
Thanks in advance.
p.s. I should also mention there are two .NET 4.0 C# apps connected to the business and SSRS databases at the moment. The SSRS databases are full of overnight-generated report snapshots, plus ad-hoc reports generated via the apps. The SSRS database is about 80GB.
I do agree with BIDeveloper that the only way to really be sure of this is to test it. However, I can tell you from experience that as long as your report writers do not try to take advantage of new features found in the newer SQL Server version that your SSRS instance runs on, you should be fine.
We had a similar situation where our databases were 2008 R2, running on Windows 2008 R2 servers, but our SSRS was running on a Windows 2012 server, with the SQL Server backing SSRS running on SQL Server 2012. It was fine, as long as all the SSRS reports could run against the 2008 R2 SQL Servers. We had a few bad report writers that tried to use SQL Server 2012 functions that didn't exist on 2008, and those reports had errors. After the report writers fixed their mistakes, the reports worked fine.
An example off the top of my head would be if a report writer wrote the following query to be executed in the report:
SELECT
TRY_CONVERT(INT, 100) [SomeConvertedInt]
FROM
<SOMETABLE>
It would work fine on their local development machine running SQL Server 2012, but it would fail when the report was executed on the SSRS server against the 2008 R2 data source. To fix it, they would have to remove the new 2012 function - in this case, TRY_CONVERT().
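A cheap safeguard (my own suggestion, not part of the original answer) is for report writers to check which version the report's data source actually runs before reaching for newer functions:

-- run this against the report's data source, not the local development instance
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,   -- 10.50.x = 2008 R2, 11.0.x = 2012, 12.0.x = 2014
       SERVERPROPERTY('ProductLevel')   AS ProductLevel,
       SERVERPROPERTY('Edition')        AS Edition;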
You need to create a new SSRS instance at the version you are proposing - without touching the old instance. Then do full testing.
This is the only way to prove your plan will work.
