I am able to run the xp_fileexist command successfully on the local drives of my SQL Server instance. If a file exists, I get an output of 1; if a file does not exist, the output is 0. When I run the same command against a network path that is mapped on the SQL Server machine, the output is always 0.
For example, if I have a file at the UNC path '\\10.188.20.5\myfolder\myfile.txt', I would run:
DECLARE @MYFILE VARCHAR(255), @MYOUTPUT INT
SET @MYFILE = '\\10.188.20.5\myfolder\myfile.txt'
EXEC MASTER.DBO.XP_FILEEXIST @MYFILE, @MYOUTPUT OUT
PRINT @MYOUTPUT
This returns 0.
If I run a BULK INSERT command against the same path, the file is imported successfully:
BULK INSERT #mytable
FROM '\\10.188.20.5\myfolder\myfile.txt'
What is causing xp_fileexist to malfunction on network drives?
The problem turned out to be the password of the admin account used to run SQL Server. The password had been changed, but SQL Server continued to run under the old credentials. It just happened that MASTER.DBO.XP_FILEEXIST stopped working because of the password mismatch, while other SQL Server functions did not. Once I updated the password for the account that runs the SQL Server service, the function worked as expected.
It depends on the credentials of the SQL Server service, which usually runs as Local System and therefore has no network access.
Go to SQL Server Configuration Manager >> SQL Server Services >> select your own instance (e.g. MyPc\Sqlexpress) and enter the account name, the username and the password.
Now it should work.
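As a quick way to verify which account the service is actually running under, you can query the sys.dm_server_services DMV (available in SQL Server 2008 R2 SP1 and later); this is only a verification sketch, separate from the fix above:
-- Show the account each SQL Server related service runs under
SELECT servicename, service_account, status_desc
FROM sys.dm_server_services;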
Please refer to the diagram below for my issue:
I have 3 servers in the same domain. On server A there is SQL Server instance A (its Windows service runs under domain\User1). In this instance we have a stored procedure that does a BULK INSERT of a text file from a network shared folder on server C; domain\User1 has full permissions on this folder.
My issue is: the stored procedure runs OK (green arrow) when I connect with SSMS on server A itself, but it fails when I connect from SSMS on server B (logging in with the same domain\User1 to the same instance A). The error is "Access denied" to the text file (red arrow). Does the client have a role in this? I thought the client should not matter, since the file is read by the server (by the account that runs the instance A service).
Note: if I connect to instance A from SSMS on server B with a SQL Server login (not a Windows account), the stored procedure works fine.
Could anyone give me some advice? And sorry for my bad English.
This is just a link answer but hopefully it helps.
BTW I commend you for taking the time to analyse the issue to the extent of drawing a diagram. This is far higher quality than most questions on here.
I believe you are running into a double hop issue. I searched everywhere for the BULK INSERT permission model and finally found this https://dba.stackexchange.com/questions/189676/why-is-bulk-insert-considered-dangerous
which says this about using BULK INSERT:
When accessing SQL Server via a Windows Login, that Windows account
will be impersonated (even if you switch the security context using
EXECUTE AS LOGIN='...') for doing the file system access
and this
when accessing SQL Server via a SQL Server Login, then the external
access is done in the context of the SQL Server service account
When you have issues with Windows authentication involving three servers and impersonation, it is often a double hop issue.
This may help you with that:
https://dba.stackexchange.com/questions/44524/bulk-insert-through-network
Which in turn references this:
https://thesqldude.com/2011/12/30/how-to-sql-server-bulk-insert-with-constrained-delegation-access-is-denied/
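As a rough illustration of the second quote (and of why your SQL login connection already works), one possible workaround is to do the bulk load under a SQL Server login, so the file access happens in the context of the instance A service account, which already has rights on server C. This is only a sketch with a hypothetical login name, not a substitute for fixing delegation:
-- Hypothetical SQL login used only for the bulk load (sketch)
CREATE LOGIN BulkLoadUser WITH PASSWORD = 'StrongPasswordHere!1';
-- BULK INSERT requires this server-level permission
GRANT ADMINISTER BULK OPERATIONS TO BulkLoadUser;
-- the login also needs INSERT permission on the target table in the relevant database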
We're very experienced with SQL Server as well as with R (as a standalone product). We've set up a SQL Server 2016 test server (production version from MSDN) with R also installed. The machine works fine, and we've tried some rudimentary R, which works fine as well (which also means we've followed this article from Microsoft: https://msdn.microsoft.com/en-us/library/mt696069.aspx).
So, the issue we’re having is trying to load an R script from a location on our network. For example:
source("\\\\MyServer\\MyRDirectory\\MyRScript.R");
Just in case the UNC path didn't work, we tried mapping a network drive to a drive letter, but received the same "No such file or directory" error message.
There seems to be a permissions issue accessing this file. If we copy that file to the local test server, it works fine. For example, we have no issue with this:
source("C:\\Temp\\MyRScript.R");
For this test, I am using SSMS 2016 and I am logged in as a Windows AD user with DBO permissions, and I have permissions to the remote folder. The SQL Server 2016 service accounts are also AD “users” with appropriate permissions. I read that R has its own user group (SQLRUserGroup) and 20 user accounts are in that group – all of this is assigned by SQL Server during the install; these are accounts that are local to the test machine. I suspect this is the issue: R/SQL Server must be trying to access that network folder/file as a local user – not an AD user with appropriate permissions.
Has anyone run into this and found a resolution that you can share?
FYI, as expected, running the following script in SQL Server 2016 (SSMS):
EXECUTE sp_execute_external_script
  @language = N'R'
, @script = N'OutputDataSet <- data.frame(USERNAME = Sys.getenv("USERNAME"), HOME = Sys.getenv("HOME"))'
, @input_data_1 = N''
WITH RESULT SETS ((USERNAME varchar(200), HOME varchar(200)))
The output reports that my script is running as "MSSQLSERVER01", with a local home directory and a GUID for that user. I'm sure that's the issue, but how do I change it to run as an AD user with the proper permissions?
I get the strong feeling that this is not going to be possible, but can anyone here verify?
SQL Server R Services always runs the scripts in the context of worker accounts that are local to the machine, for security and isolation purposes; it is not possible to run them in the context of an AD user.
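If the underlying goal is simply to execute an R script that lives on the share, one possible workaround (only a sketch, and it assumes that whichever account performs the bulk read has rights to the UNC path) is to load the file contents in T-SQL and hand them to sp_execute_external_script, so R itself never has to touch the network:
-- Sketch: read the .R file server-side and pass its text to R Services
DECLARE @rscript NVARCHAR(MAX);
SELECT @rscript = CONVERT(NVARCHAR(MAX), BulkColumn)
FROM OPENROWSET(BULK '\\MyServer\MyRDirectory\MyRScript.R', SINGLE_CLOB) AS f;
EXEC sp_execute_external_script
      @language = N'R'
    , @script   = @rscript;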
SQL Server 2008 in SSMS
I'm getting this error when running a job I just created using SQL Server Agent:
Executed as user: DNA\circsrv. Database 'DN' does not exist. Make sure that the name is entered correctly. [SQLSTATE 08004] (Error 911). The step failed.
DNA is the name of a network domain, and circsrv is a valid user in that domain.
The SQL Server Agent process is started by user DNA\circsrv, but the job itself is owned by a different user, dn-atcore1\syncronexadmin:
@owner_login_name = N'DN-ATCORE1\syncronexadmin'
(dn-atcore1 is the name of the system, and syncronexadmin is a local user on the box)
This seems like it should be simple, but I'm just not getting it.
Any ideas? Thanks for any help.
Barb
Do you have a database called 'DN'? The error states that the database does not exist. When you created the job did you set the database?
Does the database exist?
Run this code to check.
-- main database
use master;
go
-- does the db exist?
select *
from sys.sysdatabases
where name like 'DN%'
go
If it does not exist, you have bigger issues here!
Time to find a backup to restore from ...
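If the database does exist but under a slightly different name, or the job step simply points at the wrong one, the step's database can be corrected with msdb.dbo.sp_update_jobstep; the job name and step number below are placeholders:
-- Point the job step at the correct database (placeholder names)
EXEC msdb.dbo.sp_update_jobstep
      @job_name      = N'MyJob'
    , @step_id       = 1
    , @database_name = N'CorrectDatabaseName';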
We are now in the process of moving all our production databases from a SQL Server 2005 32 bit instance to a brand new SQL Server 2012 64 bit instance.
One of the main hardships our developers still suffer from is linked servers.
We have a lot of programs that need to get data from text, CSV or Excel files, and the way this is implemented is with a linked server to the text files, so you can easily run a SELECT statement against a text file and insert the result into a table.
The problem is that the 32 bit server used the Microsoft.Jet.OLEDB.4.0 driver, the files were on a shared directory that had full permissions for everyone, and we never ran into security issues.
On the new 64 bit server we added a linked server with the following syntax:
USE [master]
GO
EXEC master.dbo.sp_addlinkedserver
  @server = N'TEMP_FILES_1'
, @srvproduct = N''
, @provider = N'Microsoft.ACE.OLEDB.12.0'
, @datasrc = N'\\SERVER-APP01\BWA\TempFiles'
, @provstr = N'Text'
Note:
The data source is on a network share.
The MSSQL service runs as the domain administrator account.
I'm logged in remotely as the domain administrator which is of course a local administrator too.
The \\SERVER-APP01\BWA\TempFiles directory has full access set for everyone.
Now when I run EXEC sp_testlinkedserver [TEMP_FILES_1], I get the following error message:
OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "TEMP_FILES_1" returned message "'\\SERVER-APP01\BWA\TempFiles' is not a valid path. Make sure that the path name is spelled correctly and that you are connected to the server on which the file resides.".
This is definitely a security issue, but the funny part is that when I run xp_cmdshell 'dir \\SERVER-APP01\BWA\TempFiles' it returns records, so obviously the service has access to this folder...
On the other hand, on my local computer I also have a 64 bit instance with the same linked server, and it works like a charm!
I've been crawling around the internet looking for a solution, but it seems that linked servers to text files are rarely used, especially on 64 bit.
We had to copy the file to a drive on the destination SQL Server system before we could read the data, as below:
EXEC xp_cmdshell 'net use y: \\[source directory path] [pw] /USER:[active user]'
EXEC xp_cmdshell 'copy y:\[source file] [destination directory]'
EXEC xp_cmdshell 'net use y: /delete'
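Once the file is on a local drive, it can be read without running into the network-path security problem; a minimal sketch with placeholder file and table names:
-- Sketch: load the local copy with BULK INSERT (placeholder names)
BULK INSERT dbo.MyStagingTable
FROM 'D:\TempFiles\myfile.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');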
When I use the sp_send_dbmail stored procedure, I get a message saying that my mail was queued. However, it never seems to get delivered. I can see them in the queue if I run this SQL:
SELECT * FROM msdb..sysmail_allitems WHERE sent_status = 'unsent'
This SQL returns a 1:
SELECT is_broker_enabled FROM sys.databases WHERE name = 'msdb'
This stored procedure returns STARTED:
msdb.dbo.sysmail_help_status_sp
The appropriate accounts and profiles have been set up and the mail was functioning at one point. There are no errors in msdb.dbo.sysmail_event_log.
Have you tried
sysmail_stop_sp
then
sysmail_start_sp
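For reference, both procedures live in msdb, so restarting the Database Mail queue looks like this:
-- Stop and restart the Database Mail external queue
EXEC msdb.dbo.sysmail_stop_sp;
EXEC msdb.dbo.sysmail_start_sp;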
I had the same problem and this is how I was able to resolve it.
Go to SQL Server Agent >> Properties >> Alert System >> check the Enable box for Database Mail and add a profile.
Restart the Agent; it has worked ever since.
Hope this helps,
_Ub
Could be oodles of things. For example, I've seen (yes, actually seen) this happen after:
Domain controller reboot
Exchange server reboot
Router outage
Service account changes
SQL Server running out of disk space
So until it happens again, I wouldn't freak out over it.