I am getting the following error on BULK INSERT after the file location was changed to a remote share. Before, the file lived in a shared folder on the SQL Server's local drive and we never ran into this issue. I am running this BULK INSERT from my local PC, connecting to SQL Server via SSMS.
I have made sure both the SQL Server and file permissions are in place.
When I ran this command from SSMS before, the path was \\SQLServer\FTP, a shared folder on a local drive of that SQL Server. Now I have changed the file location to a network share, \\Fileshare\FTP, and I get the above error, even though both the SQL Server service account (a domain account) and I (a domain account) have elevated permissions on the new location.
Any help or suggestions!
Thanks,
I can identify three circumstances that might generate this issue:
The SQLAuthority blog has full detail on a related backup issue where there is a cross-domain link (in that case, from a workgroup to a full domain).
There are also two other possible answers in the question Cannot bulk load because the file could not be opened. Operating system error code 1326 (Logon failure: unknown user name or bad password.) here on Stack Overflow. We can discount the first one (login permissions) because you stated that you have permissions, but the other solution ("I fixed it by adding the SQL Server port number to the connection string in SSIS, forcing SSIS to access SQL Server through TCP/IP instead of Named Pipes.") could apply. Try forcing a connection to the server using TCP/IP.
All of these issues appear to involve attempted cross-domain communication. If that is what is happening in your case, one or more of these fixes should apply.
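If you want to check which protocol a given SSMS session is actually using before and after forcing TCP (for example by prefixing the server name with tcp:), a quick diagnostic query along these lines should tell you; this is only a sanity check, not part of the fix itself:
-- Shows the transport for the current connection (TCP, Named pipe, Shared memory, ...)
SELECT net_transport
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;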
It finally worked....
I had to configure Kerberos Authentication following the guide from this link https://thesqldude.com/2011/12/30/how-to-sql-server-bulk-insert-with-constrained-delegation-access-is-denied/.
Of course, I had to make adjustments to suit our environment, and I had to involve the Active Directory admin to create the SPNs and enable the delegation properties.
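For anyone verifying a similar setup: once the SPNs and delegation are in place, you can confirm from the client session that the connection is actually authenticating with Kerberos (NTLM will not delegate); a quick check along these lines should report KERBEROS:
-- Reports NTLM or KERBEROS for the current connection
SELECT auth_scheme
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;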
Thanks.
Related
Having an issue getting a SQL Server linked server to Oracle working while using a tnsnames.ora file on a network share.
If I copy the tnsnames.ora file to the local server, the linked servers work fine. However, we keep the file on a network share, and my SQL service accounts have read access to that share. When I point the TNS_ADMIN system variable at the network share, the linked servers no longer work and I get ORA-12154: could not resolve the connect identifier specified. tnsping and sqlplus both work on the server. When I use Process Monitor to investigate further, I see:
Operation: createFile
Result: ACCESS DENIED
...
Impersonating: domain\MyLogin
This seems like the issue, but maybe it is a false positive? If a process tries to impersonate my account and access a remote resource, it will fail, since we don't have Kerberos configured to handle the double hop.
SQLPlus and TNSPing work just fine with the network share configured.
I've looked at this post and tried the items that seemed relevant, but had no success.
Additional Info:
sqlnet.ora has this:
SQLNET.AUTHENTICATION_SERVICES= (NTS)
NAMES.DIRECTORY_PATH= (TNSNAMES, EZCONNECT)
I am able to open a file browser as a service account and open the tnsnames file.
I had this same issue while trying to connect to an Oracle 10g database from my WCF service developed on the .NET 4.0 framework.
I had multiple Oracle installations on my system, so I modified ORACLE_HOME to point to the Oracle 10g home and it worked.
Also check the following:
Your service name might have an alias, so make sure that your listener is listening for the same service name that you are using, and check both the local and global entries. Check:
$ORACLE_HOME/network/admin/tnsnames.ora
Check your global_name setting with this SQL:
select * from global_name;
Also, please make sure you add the TNS_ADMIN key in the registry and create an environment variable named TNS_ADMIN:
Regedit -> HKEY_LOCAL_MACHINE -> Software -> Oracle -> right-click -> New -> String Value, and name it TNS_ADMIN.
Specify the correct path to where Oracle is installed, for example:
X:\oracle\product\32bit\10.0.1.0.0\NETWORK\ADMIN
Edit
The below video also looks quite helpful. Please check.
https://www.youtube.com/watch?v=Sec8WG8gQPg
As an Oracle DBA I sometimes have to work with Windows. Maybe you can adapt something from my experience with Oracle on Windows.
Scenario:
An Oracle database runs under a domain user. I want to restore a database from a backup located on a Windows share (which sounds like a "read" operation, but apparently isn't). I (or, let's say, the Windows team) did not manage to find the proper way to grant the required permissions.
After many tries, the admins granted "everything" to the entire Oracle server.
Even though the Oracle process runs in a user context, we could not find a set of permissions that worked for the user alone. Only permissions granted to the entire server enabled the restore process to access the data.
From a security point of view this is a horrible solution! But maybe it helps you get closer to one (and if so, please share :-)).
I have SQL Server 2008R2 on my laptop, with a number of databases on it.
I detached one of my development databases today (I wanted to take a copy) using SSMS, and now I can't re-attach it. Whenever I try I get the following error
CREATE FILE encountered operating system error 5(failed to retrieve text for this error. Reason: 15105) while attempting to open or create the physical file
I have seen numerous similar questions on this, and they always seem to suggest the database is attached to another SQL Server instance. I have checked whether I have another SQL Server instance running and can't see anything even remotely like it in Task Manager. I even restarted the computer to clear any that might have been there - no joy.
I read somewhere that changing the account on the Log On tab of the SQL Server service properties to Local System Account should do it; it didn't.
I am at a complete loss as to where to go next.
Any ideas anyone?
Error 5 is Access Denied. It is a permissions issue: you do not have permission to write to that location. You probably need to be an administrator on that machine.
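If the permissions look right and the attach still fails, attaching with explicit file paths can at least make the failing location obvious; here is a minimal sketch, where the database name and paths are placeholders for your own:
-- FOR ATTACH re-creates the database from the existing files;
-- the SQL Server service account needs read/write access to both files
CREATE DATABASE MyDevDb
ON (FILENAME = 'C:\Data\MyDevDb.mdf'),
   (FILENAME = 'C:\Data\MyDevDb_log.ldf')
FOR ATTACH;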
I'm not an expert with T-SQL, so please have patience with me. Recently I was doing a project in T-SQL on my local server using SQL Server 2008 R2 Management Studio. I was reading files from a temp folder on my C: drive and bulk inserting them into tables.
Then I went and moved to a regular server instead of my local server on my machine.
It took me a bit to realize that I no longer had access to my local machine folders and files, and that is causing me issues.
I've read that one solution is to create a mapped drive on the server, but this is not an option for me.
So my question is what are other options for me? Could I use UNC paths to access my files or anything else?
The files I want to access are regular text files that are comma-delimited and newline terminated.
(I saw questions somewhat similar to mine, but theirs seemed server-specific or tied to their particular issues. Also, none of them were answered.)
Actually, a mapped drive won't work either, because the account SQL Server runs under by default (Local System, if I recall) will not have network access.
So, the more reliable way to do this is definitely with a UNC path BUT there is more! (I've done this several times when I've needed to move database backups and log backups across servers for mirroring).
How?
On the SQL Server machine AND the other server that will host the share, create a new user (same username and password on both machines) - assuming you're not using AD. The user need not be in any groups other than the Users group, but it must have the same name on both servers and the passwords must match.
On the SQL Server machine, change the account that SQL Server is running under. This is done in SQL Server Configuration Manager; do not try to do it yourself via Windows services. Choose the user you created in step 1 above (note that you have to enter the password). Restart SQL Server after you've changed it and verify it still runs fine. It should run just as before, but now as a particular user with all the permissions of that user (which are actually very limited anyhow, but at least the user can access network resources).
On the remote server, make sure the new user has NTFS permissions on the folders that will host your share. Read/write perhaps or just read if SQL is only reading data.
On the remote server, create a share pointing to the appropriate folder that you set permissions for above. Make sure if you're using share permissions that the new user also has permissions on the share (not just on NTFS on the drive).
Once all of this is set up, your SQL scripts simply use the UNC path that points to the remote share, and since SQL Server is running "as" a user with access to that share, it will see the files just fine!
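To make the last step concrete, here is a minimal sketch of what the script ends up looking like; the share, file, and table names are placeholders, and the key point is that the UNC path is opened by the SQL Server service account, not by your client login:
-- SQL Server (not the client PC) opens this path,
-- so the service account must be able to read the share
BULK INSERT dbo.MyStagingTable
FROM '\\remoteserver\sqlshare\data.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);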
I'm trying to set up a Stored Procedure as a SQL Server Agent Job and it's giving me the following error,
Cannot bulk load because the file "P:\file.csv" could not be opened. Operating system error code 3(failed to retrieve text for this error. Reason: 15105). [SQLSTATE 42000] (Error 4861)
Funny thing is the Stored Procedure works just fine when I execute it manually.
The P: drive is a share from a Linux server, mapped on the Windows SQL Server via Samba, and it was set up by executing the following command:
EXEC xp_cmdshell 'net use P: "\\lnxusanfsd01\Data" Password /user:username /Persistent:Yes'
Any help on this would be highly appreciated
I do not know if you solved this, but I ran into the same issue. If the instance is local, you must check the permissions to access the file; but if you are connecting from your computer to a server (remote access), you have to specify a path on the server, which means putting the file in a directory on the server. That solved my case.
example:
BULK INSERT Table
FROM 'C:\bulk\usuarios_prueba.csv' -- This is a path on the server, not on the client
WITH
(
FIELDTERMINATOR =',',
ROWTERMINATOR ='\n'
);
To keep this simple, I just changed the directory from which I was importing the data to a local folder on the server.
I had the file located on a shared folder; I just copied my files to "C:\TEMP\Reports" on my server and updated the query to BULK INSERT from the new folder. The Agent task completed successfully :)
Finally, after a long time, I'm able to BULK INSERT automatically via the Agent job.
Best regards.
I have solved this issue: log in to the server computer where SQL Server is installed, put your CSV file on that server, and execute the query there; it will insert the records.
If you get a data type compatibility issue, change the data type for that column.
Using SQL connection via Windows Authentication:
A "Kerberos double hop" is happening: one hop is your client application connecting to the SQL Server, a second hop is the SQL Server connecting to the remote "\\NETWORK_MACHINE\". Such a double hop falls under the restrictions of Constrained Delegation and you end up accessing the share as Anonymous Login and hence the Access Denied.
To resolve the issue you need to enable constrained delegation for the SQL Server service account. See here for a good post that explains it quite well
SQL Server using SQL Authentication
You need to create a credential for your SQL login and use that to access that particular network resource. See here.
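A rough sketch of that approach, with purely hypothetical names for the credential, Windows account, and SQL login (the linked post has the authoritative steps):
-- The credential wraps a Windows account that does have rights on the share
CREATE CREDENTIAL BulkLoadCred
WITH IDENTITY = 'DOMAIN\BulkLoadUser', SECRET = 'StrongPasswordHere';

-- Map the credential to the SQL login that runs the BULK INSERT
ALTER LOGIN MySqlLogin WITH CREDENTIAL = BulkLoadCred;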
I would suggest the P: drive is not mapped for the account that SQL Server is running as.
It's probably a permissions issue but you need to make sure to try these steps to troubleshoot:
Put the file on a local drive and see if the job works (you don't necessarily need RDP access if you can map a drive letter on your local workstation to a directory on the database server)
Put the file on a remote directory that doesn't require a username and password (allows Everyone to read) and use the UNC path (\\server\directory\file.csv)
Configure the SQL job to run as your own username
Configure the SQL job to run as sa and add the net use and net use /delete commands before and after the load (see the sketch after this list)
Remember to undo any changes (especially running as sa). If nothing else works, you can try to change the bulk load into a scheduled task, running on the database server or another server that has bcp installed.
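If you do go down the net use route from the job itself, a rough sketch (share, account, password, and table names are placeholders, and xp_cmdshell must be enabled) could look like this, mapping the drive only around the load:
-- Map the drive for the SQL Server process, load, then remove the mapping
EXEC xp_cmdshell 'net use P: "\\fileserver\share" Password /user:DOMAIN\account';

BULK INSERT dbo.MyTable
FROM 'P:\file.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

EXEC xp_cmdshell 'net use P: /delete';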
I did try giving access to the folders but that did not help.
My solution was to select the options highlighted in red below for the logged-in user.
For some weird reason I'm having problems executing a bulk insert.
BULK INSERT customer_stg
FROM 'C:\Users\Michael\workspace\pydb\data\andrew.out.txt'
WITH
(
FIRSTROW=0,
FIELDTERMINATOR='\t',
ROWTERMINATOR='\n'
)
I'm confident after reading this that I've set up my user role correctly, as it states...
Members of the bulkadmin fixed server role can run the BULK INSERT statement.
I have set the Login Properties for the Windows Authentication login correctly (as seen below) to grant server-wide permissions via bulkadmin.
And the command EXEC sp_helpsrvrolemember 'bulkadmin' tells me that the information above was successful, and the current user Michael-PC\Michael has bulkadmin permissions.
But even though I've set everything up correctly as far as I know, I'm still getting the following error when executing the bulk insert directly from SQL Server Management Studio.
Msg 4861, Level 16, State 1, Line 2
Cannot bulk load because the file "C:\Users\Michael\workspace\pydb\data\andrew.out.txt" could not be opened. Operating system error code 5(Access is denied.).
This doesn't make sense, because apparently bulkadmin members can run the statement. Am I meant to reconfigure how bulkadmin works? (I'm so lost.) Any ideas on how to fix it?
This error appears when you are using SQL Server Authentication and SQL Server is not allowed to access the bulk load folder.
So giving SQL server access to the folder will solve the issue.
Here is how to:
Go to the folder, right-click -> Properties -> Security tab -> Edit -> Add (in the new window) -> Advanced -> Find Now. In the user list in the search results, find something like SQLServerMSSQLUser$UserName$SQLExpress, select it, and click OK in all the open dialogs.
I don't think reinstalling SQL Server is going to fix this, it's just going to kill some time.
Confirm that your user account has read privileges to the folder in question.
Use a tool like Process Monitor to see what user is actually trying to access the file.
My guess is that it is not Michael-PC\Michael that is trying to access the file, but rather the SQL Server service account. If this is the case, then you have at least three options (but probably others):
a. Set the SQL Server service to run as you.
b. Grant the SQL Server service account explicit access to that folder.
c. Put the files somewhere more logical where SQL Server has access, or can be made to have access (e.g. C:\bulk\).
I suggest these things assuming that this is a contained, local workstation. There are definitely more serious security concerns around local filesystem access from SQL Server when we're talking about a production machine; of course, this can still be largely mitigated by using option c above and only giving the service account access to the folders you want it to be able to touch.
I had the same problem with SSIS 2012, and the solution was to use Windows Authentication. I had been using SQL authentication with the sa user.
Go to Start -> Run -> services.msc -> SQL Server (MSSQLSERVER) and stop the service.
Right-click SQL Server (MSSQLSERVER) -> Properties -> Log On tab -> Local System Account -> OK.
Restart SQL Server Management Studio.
Try giving the folder(s) containing the CSV and Format File read permissions for ‘MSSQLSERVER’ user (or whatever user the SQL Server service is set to Log On As in Windows Services)
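If you are not sure which account that is, a query along these lines (available on reasonably recent builds of SQL Server) lists the service accounts without leaving SSMS:
-- Lists the SQL Server and SQL Server Agent services and their logon accounts
SELECT servicename, service_account
FROM sys.dm_server_services;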
This is what worked for me:
Log on SSIS with Windows authentication.
1. Open Services, find the MSSQL NT service account name, and copy it.
2. Open the folder SQL Server should read from. On the Security tab, under Group or user names, click Add and paste the copied account.
You will probably get a "Multiple names found" error; just select the MSSQL user.
Your BULK INSERT query should run fine now.
If problem persists try adding SQL Server Agent account to folder permissions in same way.
Make sure you restart MSSQL server in services after you are done.
The way I resolved this problem is quite simple:
open SQL Server
right-click the database (the one you want to back up)
select Properties
select Permissions
select your database role (local or cloud)
at the bottom you will see the explicit permissions table
find the "backup database" permission and click Grant
Your problem is resolved (the equivalent T-SQL is sketched below).
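A minimal T-SQL sketch of the same grant, with hypothetical database and role names:
-- Grants the BACKUP DATABASE permission to a database principal
USE MyDatabase;
GRANT BACKUP DATABASE TO MyBackupRole;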
Sometimes this can be a bogus error message. Try opening the file with the same account that is running the process. I had the same issue in my environment, and when I did open the file (with the same credentials running the process), it said that it must be associated with a known program. After I did that, I was able to open it and run the process without any errors.
Make sure the file you're using ('C:\Users\Michael\workspace\pydb\data\andrew.out.txt') is on the SQL Server machine and not on the client machine running SSMS.
1) Open SQL Server.
2) In Task Manager, check which account is running SQL Server - it is probably not Michael-PC\Michael, as Jan wrote.
The account that runs SQL Server needs access to the shared folder.
I came to a similar question: when I executed the BULK INSERT in SSMS it worked, but it failed with "Operating system error code 5" after I converted the task into a SQL Server Agent job.
After browsing lots of solutions posted previously, what solved my problem was granting NT SERVICE\SQLSERVERAGENT "full control" access rights on the source folder.
Hope this brings some light to those who are still struggling with this error message.
In our case it ended up being a Kerberos issue. I followed the steps in this article to resolve the issue: https://techcommunity.microsoft.com/t5/SQL-Server-Support/Bulk-Insert-and-Kerberos/ba-p/317304.
It came down to configuring delegation on the machine account of the SQL Server where the BULK INSERT statement is running. The machine account needs to be able to delegate via the "cifs" service to the file server where the files are located. If you are using constrained delegation, make sure to specify "Use any authentication protocol".
If DFS is involved you can execute the following Powershell command to get the name of the file server:
Get-DfsnFolderTarget -Path "\\dfsnamespace\share"