Bulk Insert command fails within the stored procedure - sql-server

System Error: Cannot bulk load because the file "XYZ.txt" could not be opened. Operating system error code 1311(There are currently no logon servers available to service the logon request.)
I have a stored procedure in SQL Server 2008 R2 which uses the Bulk Insert command to load data from txt files into SQL Server tables. The files are in a shared folder on a drive located in a different domain. I have full access to the drive; I tried copying files to a different directory on that drive, moving files, and deleting files, and everything works.
When I execute the stored procedure from an SSMS session on my local computer it works like a champ. It is able to open the files on the shared drive, read them, and load the data into the SQL Server tables without any issue. When I call the stored procedure from a SQL Server Agent job, it throws this error.
SQL Server Agent is using an account that is very powerful, with far more permissions than mine, yet the job fails.
To find a workaround I created an SSIS package which calls the stored procedure from an "Execute SQL Task". It uses Windows authentication to connect to the database. I tried executing the package and it ran successfully; it is able to load the data from the txt files into the tables.
So then I created a SQL Server credential with my account details, used it to create a proxy for the SSIS subsystem, and scheduled the job to run the step under the newly created proxy to see if it could load the data. But it failed with the same error.
I am confused about what I am doing wrong. I even added myself to the bulkadmin role and ran the job again, with no success.
I'd appreciate it if someone could help.
Thanks.

Just out of curiosity I tried replacing the Bulk Insert command with BCP. For some reason BCP worked: it is able to open the files on the network drive and read through them to insert the data into the SQL Server tables. I can even call the same stored procedure from the SQL Agent job and it works perfectly fine. I didn't need to use an SSIS package to solve this.
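For reference, a minimal sketch of what that BCP-based replacement might look like inside the procedure (the database, table, file path, and server names here are hypothetical; since bcp is a command-line tool, it is typically invoked from T-SQL through xp_cmdshell, which must be enabled):

DECLARE @cmd varchar(1000);
SET @cmd = 'bcp MyDatabase.dbo.TargetTable in "\\otherdomain\share\XYZ.txt"'
         + ' -S MYSQLSERVER -T -c -t"," -r"\n"';
EXEC master..xp_cmdshell @cmd;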

Related

Bulk Insert error in SQL Server stored procedure : Can not open the file

One of our stored procedures uses a BULK INSERT statement to load data from files on a NAS (Network Attached Storage). This stored procedure is executed by a SQL Server Agent job. The owner of the job (DEV\f29) has full control over the NAS.
Other SQL jobs copying files from one folder to another in the NAS succeed. However, the Bulk Insert fails with the error:
Can not open the file (\\FS\abc\abc.txt)
We are using Windows authentication.
Any help is appreciated.

SQL Server ogr2ogr batch unable to access ogr_MSSQLSpatial.dll

I have an ogr2ogr batch file that reprojects SQL data into a new SQL Server table.
It works fine when I run the bat file manually, but it fails if I run the bat file via a SQL Server stored procedure. I have given the SQL Server service account permissions on the GDAL folders, and xp_cmdshell is also enabled. I'm using
EXECUTE xp_CMDShell 'blah'
in the T-SQL script.
For some reason the ogr_MSSQLSpatial.dll causes it to fail.
ERROR 1: Can't load requested DLL: Z:\BroadSpectrumSQLTreeExtract\ogr2ogr\gdalplugins\ogr_MSSQLSpatial.dll
If I remove this DLL the script runs via SQL, but it means I need to add extra commands for things the DLL would otherwise take care of, such as setting the source coordinate system. I haven't managed to get it working 100%. The furthest I got was producing the reprojected table, but the geometry field is empty.
The DLL does issue SQL commands against the system tables. Could this be a SQL Server security issue stopping it from working?
I ran into this problem again with another ogr2ogr bat file executed from SQL. If I put the bat file in the same folder as the DLLs, it works fine.
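That observation suggests the plugin is only found when the process starts from the folder holding the GDAL DLLs. A hedged sketch of a workaround, using the plugin path from the error above but a hypothetical bat file name, is to change into that folder (and optionally point GDAL_DRIVER_PATH at the plugin directory) within the same xp_cmdshell command before launching the bat:

EXEC xp_cmdshell 'cd /d Z:\BroadSpectrumSQLTreeExtract\ogr2ogr && set GDAL_DRIVER_PATH=Z:\BroadSpectrumSQLTreeExtract\ogr2ogr\gdalplugins&& Z:\BroadSpectrumSQLTreeExtract\ReprojectTrees.bat';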

Get local copy of SQL Server hosted on Amazon RDS

I have a small (few hundred MB) SQL Server database running on RDS. I've spent several hours trying to get a copy of it onto my local SQL Server 2014 instance. All of the following fail. Any ideas what might work?
Task -> Backup fails because it doesn't give my admin account permission to back up to a local drive.
Copy Database fails during the create-package step with "While trying to find a folder on SQL an OLE DB error was encountered with error code 0x80040E4D".
From SSMS, while connected to the RDS server, running BACKUP DATABASE fails with the message "BACKUP DATABASE permission denied in database 'MyDB'", even after running EXEC sp_addrolemember 'db_backupoperator' for the connected user.
Generate Scripts produces a 700 MB .sql file. Running that with sqlcmd -i fails at some point after producing plausible .mdf and .ldf files that can't be mounted on the local server (probably because the sqlcmd run failed to complete and unlock them).
AWS has finally provided a reasonably easy means of doing this: It requires an S3 bucket.
After creating a bucket called rds-bak I ran the following stored procedure in the RDS instance:
exec msdb.dbo.rds_backup_database
    @source_db_name='MyDatabase',
    @s3_arn_to_backup_to='arn:aws:s3:::rds-bak/MyDatabase.bak',
    @overwrite_S3_backup_file=1;
The following stored procedure returns the status of the backup request:
exec msdb.dbo.rds_task_status @db_name='MyDatabase'
Once it finished I downloaded the .bak file from S3 and imported it into a local SQL Server instance using the SSMS Restore Database... wizard!
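If you prefer T-SQL over the wizard for the local restore step, here is a minimal sketch (the local paths and logical file names below are hypothetical; run RESTORE FILELISTONLY first to see the actual logical names inside the .bak):

RESTORE FILELISTONLY FROM DISK = 'C:\Temp\MyDatabase.bak';  -- lists the logical file names

RESTORE DATABASE MyDatabase
FROM DISK = 'C:\Temp\MyDatabase.bak'
WITH MOVE 'MyDatabase'     TO 'C:\SQLData\MyDatabase.mdf',
     MOVE 'MyDatabase_log' TO 'C:\SQLData\MyDatabase_log.ldf',
     REPLACE;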
The SSIS Import Export Wizard can generate a package to duplicate a whole set of tables. (It's not the sort of Copy Database function that relies on files - it makes a package with data flow components for each table.)
It's somewhat brittle but can be made to work :-)
The SSMS Generate Scripts feature can often fail with any large data set, as the script for all the data is just too large/verbose. The wizard method never scripts out the data.
Check this out: https://github.com/andrebonna/RDSDump
It is a C#.NET console application that searches for the latest snapshot of the origin database, restores it on a temporary RDS instance, generates a BACPAC file, uploads it to S3, and deletes the temporary RDS instance.
You can transform your RDS snapshot into a BACPAC file, that can be downloaded and imported onto your local SQL Server 2014 instance using the feature answered here (Azure SQL Database Bacpac Local Restore)
Redgate's SQL Compare and SQL Data Compare are invaluable for these types of things. They are not cheap (but worth every penny imo). But if this is a one-time thing, you could use the 14 day trial and see how it behaves for you.
http://www.red-gate.com/products/

Cannot bulk load because the file could not be opened. Operating System Error Code 3

I'm trying to set up a Stored Procedure as a SQL Server Agent Job and it's giving me the following error,
Cannot bulk load because the file "P:\file.csv" could not be opened. Operating system error code 3(failed to retrieve text for this error. Reason: 15105). [SQLSTATE 42000] (Error 4861)
Funny thing is the Stored Procedure works just fine when I execute it manually.
The P: drive is a share from a Linux machine, mapped on the Windows SQL Server via Samba, and it was set up by executing the following command,
EXEC xp_cmdshell 'net use P: "\\lnxusanfsd01\Data" Password /user:username /Persistent:Yes'
Any help on this would be highly appreciated
I do not know if you solved this issue, but I ran into the same one. If the instance is local, you must check the permissions to access the file; but if you are accessing a server from your computer (remote access), you have to specify a path on the server, which means placing the file in a directory on the server. That solved my case.
example:
BULK INSERT Table
FROM 'C:\bulk\usuarios_prueba.csv' -- This is server path not local
WITH
(
FIELDTERMINATOR =',',
ROWTERMINATOR ='\n'
);
To keep this simple, I just changed the directory from which I was importing the data to a local folder on the server.
The file had been located on a shared folder; I copied my files to "C:\TEMP\Reports" on my server (and updated the query to BULK INSERT from the new folder), and the Agent task completed successfully :)
Finally, after a long time, I'm able to BULK INSERT automatically via an agent job.
Best regards.
I have solved this issue: log in to the server computer where SQL Server is installed, put your csv file on that server computer, and execute your query; it will insert the records.
If you get a datatype compatibility issue, change the datatype for that column.
Using SQL connection via Windows Authentication:
A "Kerberos double hop" is happening: one hop is your client application connecting to the SQL Server, a second hop is the SQL Server connecting to the remote "\\NETWORK_MACHINE\". Such a double hop falls under the restrictions of Constrained Delegation and you end up accessing the share as Anonymous Login and hence the Access Denied.
To resolve the issue you need to enable constrained delegation for the SQL Server service account. See here for a good post that explains it quite well
SQL Server using SQL Authentication:
You need to create a credential for your SQL login and use that to access that particular network resource. See here
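As a rough sketch of that approach (the credential name, Windows account, and SQL login below are placeholders; the linked post has the details and caveats):

-- a credential holding Windows credentials that can read the network share
CREATE CREDENTIAL NetworkShareCred
    WITH IDENTITY = 'DOMAIN\FileShareUser', SECRET = 'StrongPasswordHere';

-- map the credential to the SQL login that runs the BULK INSERT
ALTER LOGIN MySqlLogin ADD CREDENTIAL NetworkShareCred;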
I would suggest the P: drive is not mapped for the account that SQL Server is running as.
It's probably a permissions issue but you need to make sure to try these steps to troubleshoot:
Put the file on a local drive and see if the job works (you don't necessarily need RDP access if you can map a drive letter on your local workstation to a directory on the database server)
Put the file on a remote directory that doesn't require a username and password (allows Everyone to read) and use the UNC path (\\server\directory\file.csv)
Configure the SQL job to run as your own username
Configure the SQL job to run as sa and add the net use and net use /delete commands before and after (see the sketch after these steps)
Remember to undo any changes (especially running as sa). If nothing else works, you can try to change the bulk load into a scheduled task, running on the database server or another server that has bcp installed.
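A minimal sketch of that last option, using the share, credentials, and file path from the question and a hypothetical target table, all inside the job step:

EXEC xp_cmdshell 'net use P: "\\lnxusanfsd01\Data" Password /user:username';

BULK INSERT dbo.TargetTable
FROM 'P:\file.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

EXEC xp_cmdshell 'net use P: /delete';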
I did try giving access to the folders, but that did not help.
My solution was to select the options highlighted in red below for the logged-in user.

Bulk Insert - Does the file need to be accessible to database server or local machine?

My question is about where I should put a file I need to use in a BULK INSERT command in MS SQL.
I have a database running on a server. I run queries on this through an ODBC connection from my machine on the same network. I want to create a stored procedure that will use Bulk Insert to import data from a .txt file and then execute this stored procedure from my machine (from clicking a button in an Excel sheet).
I'm no expert on how SQL Server actually works, to say the least, so I have what I imagine is a very basic question for someone who is. Does the .txt file used in the Bulk Insert need to be in a location that can be read by:
a) my machine, e.g. on its local hard disk
or
b) the database server, e.g. somewhere on the network that it can access
I'm not sure if my local machine or the server is actually opening the file. I would assume it's the server, but I'd like to be sure!
Many thanks in advance for your help!
If the file is not on the server computer, then you will need to make sure SQL Server has access to the file. This becomes a permissions issue. In particular, the permission you need to look at is the log-on account for the SQL Server service. Open the Services control panel, locate the SQL Server service, and check the log-on account in the Properties section.
As a general rule, it's not a good idea to give too many network permissions to the SQL Server service account because this can allow hackers access to resources outside the server computer.
If you mean the BULK INSERT command from T-SQL, then the 'data_file' name can be a local filename or a UNC path, local, that is, to the SQL Server machine running the query. You can put the file to import on a share hosted on the SQL Server machine or on any other share on the network that the SQL Server has access to.
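In other words, the path is resolved by the SQL Server machine, not by the client that submits the query. A minimal sketch with a hypothetical table and UNC path:

BULK INSERT dbo.TargetTable
FROM '\\fileserver\share\data.txt'   -- resolved on the SQL Server machine, not on the client
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');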
