Getting a "Login timeout expired: A network-related or instance-specific..." error while trying to export data from one server to another - sql-server

I am trying to export data from a table on one server to another server that has the same table. The table contains about 500 million records. Generating a script requires more memory than our server seems to have, since it fails with an error midway. I was trying to use the Export Data wizard tool instead. I chose Native Client 11.0 as the source and specified the server, but when I specify the destination server it gives me an error:
Login timeout expired: A network-related or instance-specific error has occurred while establishing a connection to the server. Server is not found or accessible...
I am also worried I might have to deal with duplicate data if I use this tool.
Is there a way I can copy/backup/restore a specific table in SQL Server and simply restore it on the other server?
I am looking for the most efficient solution to this.
Thank you for your help.
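One option worth sketching for copying a single large table between servers is the bcp utility: export the table to a native-format file, then bulk load it on the other server. Below is a minimal Python sketch wrapping bcp; the server, database, table, and file names are placeholders, and the batch size is only a starting point.
import subprocess

# Placeholder names -- replace with the real servers, database, table, and path.
SOURCE_SERVER = "SourceServer"
TARGET_SERVER = "TargetServer"
TABLE = "MyDatabase.dbo.MyBigTable"
DATA_FILE = r"C:\temp\MyBigTable.bcp"

# Export the table to a native-format file (-n) using Windows authentication (-T).
subprocess.run(
    ["bcp", TABLE, "out", DATA_FILE, "-S", SOURCE_SERVER, "-T", "-n"],
    check=True,
)

# Bulk load the file into the same table on the target server, committing in
# batches (-b) so the transaction log stays manageable for a very large table.
subprocess.run(
    ["bcp", TABLE, "in", DATA_FILE, "-S", TARGET_SERVER, "-T", "-n", "-b", "100000"],
    check=True,
)
Because bcp in appends rows, truncating or otherwise reconciling the destination table first avoids the duplicate-data concern mentioned above.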

Related

Error on job that retrieves data from an external FTP server

I started in a large company as an IT consultant. One of my tasks is to manage an application that has a SQL database.
I have very limited knowledge of SSISDB and SQL Server Management Studio - but I am willing to learn.
The SQL database is updated with external data. This can be done by users directly in the application, but it also happens through a scheduled job. The job runs from SQL manager. The job has only two steps, one of which is to execute a dtsx package.
The dtsx package is set up to retrieve data from an external FTP server and merge the data into the database. The job was made by my predecessor and has run flawlessly for a very long time.
Now we are in a situation where the FTP server supplying the data has been changed.
I have therefore gone into the Connection Managers and changed the connection to the new FTP server.
When running the job, however, we still get the following error message:
Failed to configure a connection property that has the following path: \Package.Connections[FTPConnection].Propterties[ChunkSize]. An error occurred while setting the value of property “ChunkSize”. The error returned is 0x80020009 “The ValidateDates has been migrated. The package must be saved to retain migration changes.”
I have checked the Connection Managers, and the ChunkSize is unchanged from when the job was working correctly. ChunkSize is set to 1000, both in the Connection Manager and in the dtsx package itself.
When I searched for the problem, it was mentioned that it may have something to do with the connection to the FTP server. So I have checked the connection to the FTP server from the server where the SQL database is located - and there is a connection. In addition, I have made sure that there is a firewall rule that allows traffic between the two servers, across protocols and ports 20-22.
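To repeat that reachability check in a scriptable way, a minimal Python sketch is shown below; the FTP host name is a placeholder for the new server.
import socket

FTP_HOST = "ftp.example.com"  # placeholder for the new FTP server
FTP_PORT = 21                 # FTP control port

# Attempt a plain TCP connection from the database server; raises OSError
# (after the timeout) if the port is not reachable.
with socket.create_connection((FTP_HOST, FTP_PORT), timeout=10):
    print(f"TCP connection to {FTP_HOST}:{FTP_PORT} succeeded")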
When the job itself is run, however, no traffic leaves the server. So I believe the problem is with the job itself.
Edit: After validating the package, I got the following.
Failed to configure a connection property that has the following path: \Package.Connections[FTPConnection].Propterties[ChunkSize]. An error occurred while setting the value of property “ChunkSize”. The error returned is 0x80020009 “The ValidateDates has been migrated. The package must be saved to retain migration changes.”.
at Microsoft.SqlServer.IntegrationServices.Server.ISServerExec.ConnectionParametersManager.ConfigureProperties(String parameterName, Object parameterValue)
at Microsoft.SqlServer.IntegrationServices.Server.ISServerExec.ConnectionParametersManager.ConfigureProperties()
at Microsoft.SqlServer.IntegrationServices.Server.ISServerExec.ProjectOperator.ValidatePackageWithReference(Int64 validationId, Int64 infoId, Int64 projectId, String packageName, Int64 versionId, Nullable`1 referenceId, Project isserverProject, Boolean OfflineMode)
I hope my description is comprehensive enough - otherwise please do write follow-up questions.
PS: English is not my first language. Sorry if something didn't turn out too well.

Azure Function Database Connection

I have a Python package that I am able to run successfully on an Azure Data Science Virtual Machine. However, when I push it to Azure as a Function, I cannot successfully make a database connection. I was getting an error that the ODBC Driver 13 for SQL Server was not supported, so I changed the driver to ODBC Driver 17 for SQL Server and now I am NOT getting an error, but no data is being returned for a query that I know should return data.
Is there any other reason that data would not be returned? Firewall issues? Do I need to add a binding? Do I need to separate out the connection string to feed each part (e.g., Driver, UID, PWD) into pyodbc.connect() separately? Right now I am feeding it in like this:
# Read the full ODBC connection string from the Function's application settings
setting = os.environ["CONNECTIONSTRING"]
conn = pyodbc.connect(setting)
This query works fine returning data when I run it on the VM using this code, just not as a Function.
(Note, this is different from my previous post regarding reading the Azure App Setting. That problem has been solved).
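For reference, splitting the connection string into pyodbc.connect() keyword arguments, as asked above, might look like the minimal sketch below; the driver, server, database, and environment-variable names are placeholders.
import os
import pyodbc

# Placeholder values -- in the Function these would come from application settings.
conn = pyodbc.connect(
    driver="{ODBC Driver 17 for SQL Server}",
    server="myserver.database.windows.net",
    database="mydatabase",
    uid=os.environ["SQL_USER"],
    pwd=os.environ["SQL_PASSWORD"],
)

cursor = conn.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchone())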
There are many parts where this could be breaking.
I'd suggest starting with a Profiler or Extended Events trace on your SQL Server to verify whether a connection is even being established. If not, then you need to work through the various points of connectivity to find out where it breaks. The identity, firewall, NSGs, etc. might all come into play here.
Once you see a connection, you can work through permissions to ensure that your query returns your data.
Without a full picture of your infrastructure and settings it is hard to pin it down further.
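As a variation on the same idea that can run from the Function side rather than a server trace, the sketch below opens a connection with a distinctive application name and then queries sys.dm_exec_sessions to see what the server reports for it. The connection string is a placeholder, and viewing the session may require VIEW SERVER STATE (or VIEW DATABASE STATE on Azure SQL Database).
import pyodbc

# Placeholder connection string -- same shape as the one used by the Function.
conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=myserver.database.windows.net;"
    "Database=mydatabase;"
    "UID=myuser;PWD=mypassword;"
    "APP=FunctionConnectivityCheck;"
)

# Ask the server which sessions it sees for that application name.
cursor = conn.cursor()
cursor.execute(
    "SELECT session_id, host_name, program_name, login_name "
    "FROM sys.dm_exec_sessions WHERE program_name = ?",
    "FunctionConnectivityCheck",
)
for row in cursor.fetchall():
    print(row.session_id, row.host_name, row.program_name, row.login_name)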
Turns out it was not a database connectivity issue like I thought it was; it was a code error.

There is an error when I am trying to move a TFS report to a new server

I am trying to debug a problem related to a TFS report. I don't know why it happens and really need help.
Recently I exported a report from the old TFS server and deployed it to the new server. I also copied the stored procedures related to this report from the old server database to the new server database.
But the report just doesn't work. The error is as follows. The report .rdl file and stored procedures on the new TFS server are exactly the same as on the old server; they just don't work on the new one.
An error occurred during client rendering.
An error has occurred during report processing. (rsProcessingAborted)
Query execution failed for dataset 'DataSet1'.
(rsErrorExecutingCommand) Query (1, 16) Parser: The syntax for
'#TFS_Date' is incorrect.
First of all, I would suggest checking your Reporting Services log files at the path below:
C:\Program Files\Microsoft SQL Server\MSRS11.SQLEXPRESS\Reporting Services\LogFiles
The error messages in the log file usually point you to the solution.
In your case, check that the user has access to DataSet1's database (the server database) by going to the SQL Server Security tab.
Also, check your stored procedure. If it joins to a table in another database, the user must have access to that database too.
Note: You can find this type of error message in the log file, so read it to solve the issue.

Another ODBC Call Failed Topic

I am running Access 2010 FE and SQL Server 2005 BE.
I can execute pass-through queries to my SQL Server successfully by using DSN-less connections.
During my testing phase I sometimes need to restore my database to get back to my original records so I can rerun my pass-through queries. What I have found is that when I run a pass-through query, it creates an active connection on my SQL Server. I can see the connection via the SQL Server Management console under MANAGEMENT | SQL Server Logs | Activity Monitor, view processes. There I can see which process ID is being used, and by whom, when I run my pass-through query.
Now the only way for me to restore my database is to KILL the process, i.e. the active connection.
When I have my restored database in place and re-run the pass-through query, I receive an ODBC -- Call Failed message box. I have attempted to run a procedure to refresh my QueryDefs, but to no avail; I still get the ODBC -- Call Failed message box when I click on those objects.
There are two ways to work around this problem, neither of which I find user-friendly:
Restart my Access application
Wait approximately 5-10 minutes before rerunning the pass-through query
I created a function to trap my ODBC Errors and this is what appears:
ODBC Error Number: 0
Error Description: [Microsoft][ODBC SQL Server Driver]Communication link failure
ODBC Error Number: 3146
Error Description: ODBC--call failed.
So if for some reason I need to restart my SQL Server or kill a process (active connection) on my SQL Server while the Access application is connected via ODBC, the objects created via ODBC will not work properly until I apply one of the two workarounds stated above.
Can anyone shed some advice on a solution? I appreciate any insight.
I asked a similar question some time ago, and never got a satisfactory answer. My original question is here: Force SET IDENTITY_INSERT to take effect faster from MS Access
There is a registry setting documented here for ACE that controls the timeout behavior:
ConnectionTimeout: The number of seconds a cached connection can remain idle before timing out. The default is 600 (values are of type REG_DWORD).
So as a third workaround (in addition to the two you already listed) you can change that registry setting to a shorter timeout (like 10 seconds). This is the approach I took in my answer. One caveat is that shortening the timeout may cause performance or other issues. Your mileage may vary.
See my full answer to the original question for more info.
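If that registry change needs to be scripted, a sketch using Python's winreg module might look like the following. The key path is an assumption that depends on the Office/ACE version (and would sit under Wow6432Node for 32-bit Office on 64-bit Windows), so verify it against the linked documentation first, and run the script elevated.
import winreg

# Assumed key path for ACE 14.0 (Access 2010) -- verify against the linked
# documentation for your Office/ACE version before using it.
KEY_PATH = r"SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\ODBC"

with winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE
) as key:
    # Shorten the idle timeout for cached ODBC connections from the default
    # 600 seconds to 10 seconds.
    winreg.SetValueEx(key, "ConnectionTimeout", 0, winreg.REG_DWORD, 10)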

How can I properly link my SQL Server database to an Access form and distribute it to the network and have them input information?

When I link the Microsoft Access to SQL Server locally, everything works. The second I go to a computer that is on the network, I am not able to open the form I created on the Access database.
I found out that if I open the link, I get an error. If I choose where the database is through the configuration in Access, I get the error again. If I try a third time, it connects to the database and it is fully linked--I am able to type data into Access and it will store it in the SQL Server database as well.
My question is: how do I get it to connect to the server on the first try? I have to hit "connect" three times, which takes about 5 minutes to log in. That isn't very efficient when the people using this program know nothing about computers.
Linking Access to SQL Server uses ODBC. Make sure your ODBC settings are correct.
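One way to sanity-check the ODBC side from a networked workstation, independently of Access, is to time a few DSN-less connection attempts with pyodbc; the driver name and connection string below are placeholders and should match whatever the Access links use.
import time
import pyodbc

# Placeholder DSN-less connection string -- match the one used by the Access links.
CONN_STR = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=MyServer;Database=MyDatabase;Trusted_Connection=yes;"
)

# Try a few times and report how long each attempt takes, to see whether the
# delay is in name resolution / the connection itself or inside Access.
for attempt in range(3):
    start = time.monotonic()
    try:
        with pyodbc.connect(CONN_STR, timeout=15):
            print(f"attempt {attempt + 1}: connected in {time.monotonic() - start:.1f}s")
    except pyodbc.Error as exc:
        print(f"attempt {attempt + 1}: failed after {time.monotonic() - start:.1f}s: {exc}")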
