Error when trying to access dbf files through SQL Server 2008 - sql-server

I am trying to run a single query against a dbf (FoxPro 9) file through SQL Server. The problem is that these files are located in another domain, so I configured a linked server with a valid remote user and remote password on the security page of the linked server. When I try to execute the query I get the error: "Invalid path or file name". But if I open Windows Explorer, browse to the location of the dbf files, close Explorer, and launch the query again, it works fine. I don't know why. Any idea?

It won't work if you need to connect using a username and password. If the remote location allows connecting without a username and password then it will work; that is why it works after you manually make the connection. Use a mapped drive as a workaround. OTOH, a linked server to VFP is not of much value; I doubt it is worth it.

If authorization is correct, it may be a problem with mapped drives. The query is executed on the server, so that machine needs to have access. Have you tried Windows Explorer on the server? Have you tried a UNC path?
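For what it's worth, here is a rough sketch of a linked-server definition that points the VFP OLE DB provider at a UNC path rather than a mapped drive; the server name, share, credentials, and table below are placeholders, not your actual setup:

EXEC sp_addlinkedserver
    @server = N'VFP_DBF',                        -- placeholder linked server name
    @srvproduct = N'Visual FoxPro',
    @provider = N'VFPOLEDB',                     -- Visual FoxPro OLE DB provider
    @datasrc = N'\\remoteserver\share\dbfdir';   -- UNC folder holding the .dbf files

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'VFP_DBF',
    @useself = 'FALSE',
    @locallogin = NULL,
    @rmtuser = N'remoteuser',                    -- placeholder remote credentials
    @rmtpassword = N'remotepassword';

SELECT * FROM OPENQUERY(VFP_DBF, 'SELECT * FROM sometable');

Note that, as the answers here point out, the file access itself still happens under the SQL Server service account, so that account (not the linked-server login) is what needs rights to the path.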

Related

SQL Server Linked Server with tnsnames.ora on network share - ORA: 12154

Having an issue getting a SQL Server linked server to Oracle working while using a tnsnames.ora file on a network share.
If I copy the tnsnames.ora file to the local server, the linked servers work fine. However, we keep the file on a network share. My SQL service accounts have read access to the share. When I configure the TNS_ADMIN system variable to point to the network share, the linked servers no longer work: I get ORA-12154: could not resolve the connect identifier specified. tnsping and sqlplus work on the server. When I use Process Monitor to investigate further, I see:
Operation: createFile
Result: ACCESS DENIED
...
Impersonating: domain\MyLogin
This seems like an issue, but maybe it's a false positive? If a process is trying to impersonate my account and access a remote resource, it will fail, since we don't have Kerberos configured to handle the double hop.
SQLPlus and TNSPing work just fine with the network share configured.
I've looked at this post and tried the items that seemed relevant, but had no success.
Additional Info:
sqlnet.ora has this:
SQLNET.AUTHENTICATION_SERVICES= (NTS)
NAMES.DIRECTORY_PATH= (TNSNAMES, EZCONNECT)
I am able to open a file browser as a service account and open the tnsnames file.
I had this same issue while trying to connect to an Oracle 10g database via my WCF service developed on the .NET 4.0 framework.
I had multiple instances of Oracle installed on my system, so I modified ORACLE_HOME to point to the Oracle 10g installation and it worked.
Also check the following:
Your service name might have an alias, so make sure that your listener is listening for the same service name that you are using, and check for both local and global entries. Check:
$ORACLE_HOME/network/admin/tnsnames.ora
Check your global_name setting with this SQL:
select * from global_name;
Also, please make sure you add the key TNS_ADMIN in the registry and create an environment variable named TNS_ADMIN:
Regedit -> HKEY_LOCAL_MACHINE -> Software -> Oracle -> right-click -> New -> String Value and name it TNS_ADMIN.
Specify the correct path to where Oracle is installed, for example:
X:\oracle\product\32bit\10.0.1.0.0\NETWORK\ADMIN
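If it helps, here is a rough command-line equivalent of that registry/environment step, using the example path above and the registry location described here; adjust the key and path for your installation and run from an elevated prompt:

rem Registry value under the Oracle key, as described above
reg add "HKLM\SOFTWARE\Oracle" /v TNS_ADMIN /t REG_SZ /d "X:\oracle\product\32bit\10.0.1.0.0\NETWORK\ADMIN"
rem Machine-level environment variable with the same name
setx TNS_ADMIN "X:\oracle\product\32bit\10.0.1.0.0\NETWORK\ADMIN" /M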
Edit
The below video also looks quite helpful. Please check.
https://www.youtube.com/watch?v=Sec8WG8gQPg
As an Oracle DBA I sometimes have to work with Windows. Maybe you can draw on my experiences with Oracle on Windows.
Scenario:
An Oracle DB runs under a domain user. I want to restore a database from a backup which is located on a Windows share (this sounds like a "read" operation, but it obviously isn't). I (or let's say the Windows team) did not manage to find the proper way to grant the required permissions.
After many tries, the admins granted "everything" to the entire Oracle server.
Even though the Oracle process runs in a user context, we did not find a set of permissions for the user alone; only permissions for the entire server enabled the restore process to access the data.
From a security point of view this is a horrible solution! But maybe it will help you come closer to a solution (and if so, please share :-)).

Check if a file exists on a web server using SQL Server

I want to check whether a file exists on a web server using SQL Server.
I have tried xp_cmdshell with DIR, but it only works for local files.
Please let me know how to achieve this.
Thanks in advance.
After searching for a few days, I found the following, which works well:
When you have to put a file, such as a BCP result or a backup, on a remote drive, just mapping the drive in Windows doesn't work; it must be mapped in SQL Server too. To do this, try something like:
exec xp_cmdshell 'net use P: \\Server\Folder\Folder\Folder Password /user:Domain\Login'
Reference :
https://social.msdn.microsoft.com/Forums/en-US/6eca2d62-eb86-4f23-9b86-6f917017f50c/bcp-utility-via-xpcmdshell-and-network-drive?forum=sqlsecurity
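Once the drive is mapped this way, one hedged way to test for a specific file from T-SQL is the undocumented xp_fileexist extended procedure (the path below is a placeholder):

DECLARE @exists INT;
-- Returns 1 in @exists if the file is visible to the SQL Server service account
EXEC master.dbo.xp_fileexist 'P:\somefolder\somefile.txt', @exists OUTPUT;
SELECT CASE WHEN @exists = 1 THEN 'File found' ELSE 'File not found' END AS Result;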
You cannot access the share as just any user: to access a directory from SQL Server you have to use a user that has rights on that directory, and the SQL Server service must be running as that user.
If you run as a normal user, or simply as the Network Service or Local Service account (which the SQL Server service runs as by default), you can access the resources of your own computer, even via a network-style path (i.e. \\yourname\sharedfolder), but you cannot access shared folders elsewhere on the network.
Either backup/restore: http://www.sqlservercentral.com/Forums/Topic631787-146-1.aspx
or execute a command:
http://www.sqlservercentral.com/Forums/Topic808580-359-1.aspx
How can I access a file/folder over the network through xp_cmdshell in SQL Server 2008?
https://www.simple-talk.com/community/forums/thread/72262.aspx
https://social.msdn.microsoft.com/forums/sqlserver/en-US/c5ce5e21-17e7-4763-ba68-d3bb7dad213f/access-denied-on-xpcmdshell-with-certain-folders

TSQL and UNC Paths

I'm not an expert with T-SQL, so please have patience with me. Recently I was doing a project in T-SQL on my local server using SQL Server 2008 R2 Management Studio. I was reading files from a temp folder on my C: drive and bulk inserting them into tables.
Then I went and moved to a regular server instead of my local server on my machine.
It took me a bit to realize that I no longer had access to my local machine folders and files, and that is causing me issues.
I've read that one solution is to create a mapped drive on the server, but this is not an option for me.
So my question is what are other options for me? Could I use UNC paths to access my files or anything else?
The files I want to access are regular text files that are comma-delimited and newline terminated.
(I saw questions somewhat similar to mine, but theirs seemed server-specific or specific to their particular issues. Also, none of their questions were answered.)
Actually, a mapped drive won't work either, because the account SQL Server runs under by default (Local System, if I recall) will not have network access.
So, the more reliable way to do this is definitely with a UNC path BUT there is more! (I've done this several times when I've needed to move database backups and log backups across servers for mirroring).
How?
1. On the SQL Server machine AND the other server that will host the share, create a new user (same username and password on both machines), assuming you're not using AD. The user need not be in any groups other than the Users group, but it must have the same name on both servers and the passwords must match.
2. On the SQL Server machine, change the account that SQL Server runs under. This is done in the SQL Server Configuration Manager; do not try to do it yourself via Windows services. Choose the user that you created in step 1 above. Note that you have to enter the password. Restart SQL Server after you've changed it and verify SQL Server still runs fine. It should run just as before, but now it runs as a particular user with all the permissions of that user (which are actually very limited anyhow, but at least the user can access network resources).
3. On the remote server, make sure the new user has NTFS permissions on the folders that will host your share: read/write, or just read if SQL Server is only reading data.
4. On the remote server, create a share pointing to the appropriate folder that you set permissions on above. If you're using share permissions, make sure the new user also has permissions on the share (not just in NTFS on the drive).
Once all of this is set up, your SQL scripts simply use the UNC path that points to the remote share, and since SQL Server is running "as" a user with access to that share, SQL Server will see the files just fine!
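To make the last step concrete, here is a minimal sketch of a bulk insert over a UNC path, assuming SQL Server now runs as the matching user described above (the table, server, and file names are placeholders):

BULK INSERT dbo.MyStagingTable                -- placeholder staging table
FROM '\\fileserver\sqlshare\data.csv'         -- UNC path to the share created in step 4
WITH
(
    FIELDTERMINATOR = ',',                    -- comma-delimited, as in the question
    ROWTERMINATOR = '\n'                      -- newline-terminated rows
);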

Cannot bulk load because the file could not be opened. Operating System Error Code 3

I'm trying to set up a Stored Procedure as a SQL Server Agent Job and it's giving me the following error,
Cannot bulk load because the file "P:\file.csv" could not be opened. Operating system error code 3(failed to retrieve text for this error. Reason: 15105). [SQLSTATE 42000] (Error 4861)
Funny thing is the Stored Procedure works just fine when I execute it manually.
The P: drive is mapped on the Windows SQL Server to a LINUX Samba share, and it was set up by executing the following command:
EXEC xp_cmdshell 'net use P: "\\lnxusanfsd01\Data" Password /user:username /Persistent:Yes'
Any help on this would be highly appreciated
I do not know if you solved this, but I ran into the same issue. If the instance is local you must check the permissions to access the file, but if you are connecting from your computer to a server (remote access), you have to specify a path on the server, which means placing the file in a directory on the server. That solved my case.
example:
BULK INSERT Table
FROM 'C:\bulk\usuarios_prueba.csv' -- This is server path not local
WITH
(
FIELDTERMINATOR =',',
ROWTERMINATOR ='\n'
);
To keep this simple, I just changed the directory from which I was importing the data to a local folder on the server.
I had the file located on a shared folder; I just copied my files to "C:\TEMP\Reports" on my server (and updated the query to BULK INSERT from the new folder). The Agent task completed successfully :)
Finally after a long time I'm able to BULK Insert automatically via agent job.
Best regards.
I have solved this issue: log in to the server computer where SQL Server is installed, get your csv file onto that server computer, and execute your query there; it will insert the records.
If you get a datatype compatibility issue, change the datatype for that column.
Using SQL connection via Windows Authentication:
A "Kerberos double hop" is happening: one hop is your client application connecting to the SQL Server, a second hop is the SQL Server connecting to the remote "\\NETWORK_MACHINE\". Such a double hop falls under the restrictions of Constrained Delegation and you end up accessing the share as Anonymous Login and hence the Access Denied.
To resolve the issue you need to enable constrained delegation for the SQL Server service account. See here for a good post that explains it quite well
SQL Server using SQL Authentication
You need to create a credential for your SQL login and use that to access that particular network resource. See here
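For illustration, a minimal sketch of that idea; every name and the secret below are placeholders, so treat this as a starting point rather than a known-good configuration:

-- A credential holding Windows credentials that do have rights on the share
CREATE CREDENTIAL NetworkShareCred
WITH IDENTITY = 'DOMAIN\FileShareUser',
     SECRET = 'StrongPasswordHere';

-- Map the credential to the SQL login that runs the bulk load
ALTER LOGIN MyBulkLoadLogin WITH CREDENTIAL = NetworkShareCred;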
I would suggest that the P: drive is not mapped for the account that SQL Server is started as.
It's probably a permissions issue, but you need to make sure to try these steps to troubleshoot:
1. Put the file on a local drive and see if the job works (you don't necessarily need RDP access if you can map a drive letter on your local workstation to a directory on the database server).
2. Put the file on a remote directory that doesn't require a username and password (allows Everyone to read) and use the UNC path (\\server\directory\file.csv).
3. Configure the SQL job to run as your own username.
4. Configure the SQL job to run as sa and add the net use and net use /delete commands before and after (sketched after this list).
Remember to undo any changes (especially running as sa). If nothing else works, you can try to change the bulk load into a scheduled task, running on the database server or another server that has bcp installed.
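A rough sketch of option 4 above, reusing the share from the question (the target table name is a placeholder):

EXEC xp_cmdshell 'net use P: \\lnxusanfsd01\Data Password /user:domain\username';

BULK INSERT dbo.TargetTable
FROM 'P:\file.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

EXEC xp_cmdshell 'net use P: /delete';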
I did try giving access to the folders, but that did not help.
My solution was to select the options highlighted in red in the screenshot below for the logged-in user.

Cannot bulk load. Operating system error code 5 (Access is denied.)

For some weird reason I'm having problems executing a bulk insert.
BULK INSERT customer_stg
FROM 'C:\Users\Michael\workspace\pydb\data\andrew.out.txt'
WITH
(
FIRSTROW=0,
FIELDTERMINATOR='\t',
ROWTERMINATOR='\n'
)
I'm confident after reading this that I've set up my user role correctly, as it states...
Members of the bulkadmin fixed server role can run the BULK INSERT statement.
I have set the Login Properties for Windows Authentication correctly (as seen in the screenshot) to grant server-wide permissions via bulkadmin.
And the command EXEC sp_helpsrvrolemember 'bulkadmin' tells me that the information above was successful, and the current user Michael-PC\Michael has bulkadmin permissions.
But even though I've set everything up correctly as far as I know, I'm still getting the error when executing the bulk insert directly from SQL Server Management Studio.
Msg 4861, Level 16, State 1, Line 2
Cannot bulk load because the file "C:\Users\Michael\workspace\pydb\data\andrew.out.txt" could not be opened. Operating system error code 5(Access is denied.).
This doesn't make sense, because apparently bulkadmin members can run the statement. Am I meant to reconfigure how bulkadmin works? (I'm so lost.) Any ideas on how to fix it?
This error appears when you are using SQL Server Authentication and SQL Server is not allowed to access the bulk load folder.
So giving SQL server access to the folder will solve the issue.
Here is how:
Go to the folder, right-click -> Properties -> Security tab -> Edit -> Add (in the new window) -> Advanced -> Find Now. In the list of users in the search results, find something like SQLServerMSSQLUser$UserName$SQLExpress, then click OK in all the open dialogs.
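A hedged command-line equivalent of those clicks, assuming a default instance whose per-service account is NT SERVICE\MSSQLSERVER (a SQLEXPRESS instance would use the SQLServerMSSQLUser$UserName$SQLExpress group mentioned above, and the folder path is a placeholder):

icacls "C:\bulk" /grant "NT SERVICE\MSSQLSERVER:(OI)(CI)RX"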
I don't think reinstalling SQL Server is going to fix this, it's just going to kill some time.
Confirm that your user account has read privileges to the folder in question.
Use a tool like Process Monitor to see what user is actually trying to access the file.
My guess is that it is not Michael-PC\Michael that is trying to access the file, but rather the SQL Server service account. If this is the case, then you have at least three options (but probably others):
a. Set the SQL Server service to run as you.
b. Grant the SQL Server service account explicit access to that folder.
c. Put the files somewhere more logical where SQL Server has access, or can be made to have access (e.g. C:\bulk\).
I suggest these things assuming that this is a contained, local workstation. There are definitely more serious security concerns around local filesystem access from SQL Server when we're talking about a production machine; of course, this can still be largely mitigated by using option c above and only giving the service account access to the folders you want it to be able to touch.
I had the same problem in SSIS 2012, and the solution was to use Windows Authentication. I had been using SQL authentication with the sa user.
Go to Start -> Run -> services.msc -> SQL Server (MSSQLSERVER) and stop the service.
Right-click SQL Server (MSSQLSERVER) -> Properties -> Log On tab -> Local System Account -> OK.
Restart SQL Server Management Studio.
Try giving the folder(s) containing the CSV and Format File read permissions for ‘MSSQLSERVER’ user (or whatever user the SQL Server service is set to Log On As in Windows Services)
This is what worked for me:
Log on to SSIS with Windows authentication.
1. Open Services, find the MSSQL NT service account name, and copy it.
2. Open the folder from which SQL Server should read. On the Security tab, under Group or user names, click Add and paste the copied account.
You will probably get a "Multiple names found" message; just select the MSSQL user.
Your BULK INSERT query should run fine now.
If the problem persists, try adding the SQL Server Agent account to the folder permissions in the same way.
Make sure you restart the MSSQL server in Services after you are done.
This is quite simply how I resolved this problem:
Open SQL Server Management Studio.
Right-click on the database (the one you want to back up).
Select Properties.
Select Permissions.
Select your database role (local or cloud).
At the bottom you will see the explicit permissions table.
Find the "Backup database" permission and check Grant.
Your problem is resolved.
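For reference, a hedged T-SQL equivalent of those GUI steps (database and principal names are placeholders):

USE YourDatabase;
GRANT BACKUP DATABASE TO [YourUserOrRole];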
Sometimes this can be a bogus error message. Try opening the file with the same account that is running the process. I had the same issue in my environment; when I did open the file (with the same credentials running the process), it said that it must be associated with a known program. After I did that, I was able to open it and run the process without any errors.
Make sure the file you're using ('C:\Users\Michael\workspace\pydb\data\andrew.out.txt') is on the SQL Server machine and not on the client machine running SSMS.
1) Open SQL Server.
2) In Task Manager, you can check which account is running SQL Server - it is probably not Michael-PC\Michael, as Jan wrote.
The account that runs SQL Server needs access to the shared folder.
I came to a similar question: when I execute the bulk insert in SSMS it works, but it fails with "Operating system error code 5" when the task is converted into a SQL Server Agent job.
After browsing lots of solutions posted previously, what solved my problem was granting NT SERVICE\SQLSERVERAGENT "full control" access to the source folder.
Hope it brings some light to those who are still struggling with this error message.
In our case it ended up being a Kerberos issue. I followed the steps in this article to resolve the issue: https://techcommunity.microsoft.com/t5/SQL-Server-Support/Bulk-Insert-and-Kerberos/ba-p/317304.
It came down to configuring delegation on the machine account of the SQL Server where the BULK INSERT statement is running. The machine account needs to be able to delegate via the "cifs" service to the file server where the files are located. If you are using constrained delegation, make sure to specify "Use any authentication protocol".
If DFS is involved, you can execute the following PowerShell command to get the name of the file server:
Get-DfsnFolderTarget -Path "\\dfsnamespace\share"
