Can't access shared folder of Cluster Server - sql-server

I have configured a Failover Cluster Instance (ClusterDB) with two nodes for SQL Server 2014 on Windows Server 2008 R2 SP1. It's my first cluster, so I'm not very sure about a lot of things; this is new for me, so keep that in mind.
I created a shared folder for my backups on the cluster and successfully accessed it from my test server (Windows Server 2012, in the same domain). But for a few days now I haven't been able to access this folder: when I log in to my test server and try to open \\ClusterDB in Windows Explorer, it throws an error saying it cannot find the machine.
The cluster has an internal IP and an external one for the server, and both are online. If I enter \\InternalIP on my test server, it can connect, but it does not show the shared folder (I suppose it should be there, but I'm not sure why it isn't).
From the nodes of the cluster (node 1 and node 2) I can connect to \\ClusterDB and see the shared folder.
I checked the permissions on this folder, and the main user (the user I log in with on my test server) has permission to read it.
Could anybody help me fix this?

I know this is a really old post and some people might get upset that I'm posting to it; however, this is the result that turns up in Google when searching for an answer to this issue.
In my case the problem was that the failover cluster had removed the drive letter of the shared drive. To see this, open Failover Cluster Manager, click on the File Server role, and select the Resources tab at the bottom. You should see a Storage category with the drive being used listed; right-click on it and select Change Drive Letter.
The drive letter needs to match the drive letter listed in the Shares tab, or you need to adjust the shares listed in the Shares tab to match the new drive letter you assign.
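If you prefer to check and fix this from PowerShell rather than the GUI, here is a minimal sketch. It assumes the FailoverClusters module is installed and the disk is owned by the node you run it on; the volume label 'Backups' and the letter E: are made-up examples, not values from the question:

# List cluster disk resources and confirm the disk is online
Import-Module FailoverClusters
Get-ClusterResource | Where-Object { $_.ResourceType -like "Physical Disk" }

# Find the volume by its label and re-assign the missing drive letter via WMI
# (this works on Windows Server 2008 R2, which predates Set-Partition)
$vol = Get-WmiObject Win32_Volume -Filter "Label = 'Backups'"
$vol.DriveLetter = "E:"
$vol.Put()   # commit the change; the share paths must then match E: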

Related

SQL Server Agent Job Can't Read Shared File

I've been stuck on this for some time now. I have an SSIS package that's supposed to read a file and populate a database. I need to run it from a SQL Server Agent job, and the source files to read are located in a folder on another server that is shared with this server.
The shared path to the folder looks like this: \\server\D\folder\folder
However, when I run the Agent job through a service account, it tells me: "File name property is not valid. Filename is a device or contains invalid characters".
The SQL Server Agent uses a service account to run this job. The job runs just fine if the source path is located somewhere on the machine where the database instance lives, but I can't get it to run from a shared folder. If I run the package myself by right-clicking it in the SSIS catalog, it runs just fine. I am aware that this is most likely a credentials issue, but all of these servers and accounts were not set up by me. Can someone explain how I should go about adding the appropriate permissions to the said SA account so it can read the files successfully? Some examples/references would be greatly appreciated!
Things I've tried: Going to the folder's Security tab and adding all permissions for Everyone, on both the server the folder lives on and the server it is shared to. I can confirm Everyone has the permissions with the Windows PowerShell Get-Acl command.
Switching the owner of the job in SQL Server Agent to my account (I don't think that's supposed to work to begin with) - this makes the Agent complain that it is "Unable to determine if user has server access". With the SA account it does have server access; it just can't read the folder.
I saw a post where someone suggested changing the SQL Agent job step's advanced option to "execute as user" and changing the user to one with appropriate credentials, but I don't even see that option in my MSSQL.
I stumbled upon this thread; it seems it was never really solved, but the three steps given look like they should help me:
Assume that we need to write \\serv\share\dir1\...\dirN\targetDir\somefile.txt using SSIS through a SQL Agent job and a non-admin proxy account MyDomain\TestAccount:
MyDomain\TestAccount needs read/write access to the share \\serv\share
MyDomain\TestAccount needs at least the FILE_READ_DATA permission on all folders in the path (share, dir1, ..., dirN)
MyDomain\TestAccount needs the CHANGE share rights plus the FILE_DELETE_CHILD permission on the folder targetDir
However, being new to this, I have no idea how to properly check whether all three of these conditions are true, or whether they are even relevant to the problem.
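A quick way to actually inspect those permissions is from PowerShell on a machine that can reach the share. This is only a sketch: the path and account below are the placeholder names from the quoted steps, not real values.

# NTFS ACL on each folder in the path (repeat for dir1 .. dirN)
Get-Acl '\\serv\share\dir1' | Format-List Owner, AccessToString

# icacls prints the same ACL compactly; look for (R), (M) or (F) entries
# for MyDomain\TestAccount or a group it belongs to
icacls \\serv\share\dir1

# Share-level permissions live on the server hosting the share;
# run this on \\serv itself (Get-SmbShareAccess needs Server 2012 or later)
Get-SmbShareAccess -Name share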
EDIT:
There is a project-level variable in SSIS that determines where to read from (in this case set to \\server\d\folder\folder).
This variable is passed into the Foreach File enumerator in a loop.
There is also a fileName variable used to check whether a file name was already loaded into the db, as I store them in a table. The check goes like this:
DECLARE @FileName VARCHAR(50)
SET @FileName = ''
IF EXISTS (SELECT 1 FROM FileLoadStatus WHERE fileName = @FileName)
BEGIN
    SELECT 1 AS FileExistsFlg
END
ELSE
BEGIN
    SELECT 0 AS FileExistsFlg
END
If the variables are at fault, I still don't know why it works when I execute it manually through the catalog myself, but SQL Server Agent is unable to execute it through the SA account.
EDIT 2: Exact errors say the following:
EDIT 3: Now that I have set up a Windows scheduled task to execute the SSIS package instead of a SQL Server Agent job, it just tells me that the "For Each File enumerator is empty", basically meaning it can't find any files to read in the destination, even though the files are there.
This might be a late response, but for all who come here looking for an answer to this issue:
The main thing is to be sure that the SQL Server Agent service account has the authority to read from the shared folder:
1- Hold down the Windows key and press R to open the Run dialog.
2- Type services.msc and press Enter.
3- Search for SQL Server Agent.
4- On the Log On tab of its properties you will find which user the agent is running as; be sure that this user has the authority to read from the shared folder,
or change the service to run as another user with the right credentials.
5- You can check which users have access to the shared folder by right-clicking it and choosing Properties --> Security. From this window you can change the users' permissions.
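If you'd rather skip the Services UI, a one-line PowerShell check shows the same thing. The WQL filter below is an assumption that covers the default service names (SQLSERVERAGENT for a default instance, SQLAgent$InstanceName for named instances):

# Show which account each SQL Server Agent service logs on as
Get-CimInstance Win32_Service -Filter "Name LIKE 'SQL%Agent%'" |
    Select-Object Name, StartName, State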

SSIS Foreach File Enumerator finds no files when executed from deployed package

I've seen this question asked several times but never with a satisfactory answer. I have read through all the posts I can find related to this and tried as much as I possibly can.
I have an SSIS package that loops through a network folder of Excel files; I won't explain what it does inside the loop container as that is not relevant.
I reference the folder via a UNC path: \\servername\folder
The package works fine from within Visual Studio.
I deployed the package to the Integration Services Catalogs on the server.
After I deploy, I connect to the server from my local PC via SSMS and then execute the package from SSMS via "Integration Services Catalogs"... Execute - this fails.
However, if I remote-desktop onto the SQL Server box, then start SSMS, connect to the SQL Server using my own credentials, and execute the package using exactly the same method as above, it works fine.
When I look in the logs of the failed attempts I see a warning that "The For Each File enumerator is empty". I'm not sure this is telling the full story because, if I rename the network folder, I get the same message (I expected to see an error that the folder was not found) - this may or may not be relevant.
I've sketched out a quick overview of what works and what doesn't.
NOTE: The "script execution" method is just a T-SQL script for running these packages. This is the method I will use eventually, but at the moment I'm focusing on the simple right-click Execute method as this essentially does the same thing.
Since I did this I have tried a few more things...
I've tried accessing the folder as a UNC path, as a mapped drive, and as a UNC path using the IP address instead of the server name.
I recreated the issue on our development server so I could change service accounts etc.
I tried the default accounts on both the SSIS and SQL Agent services.
I tried changing these to domain accounts and network service accounts.
I get the package to log the folder name, which is an expression - this always looks correct. I do the same with the user name, which always shows me as the executing user.
I can change most things, since I can test on the development server, with the exception of testing with a domain admin account, so any suggestions would be greatly appreciated.
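One low-tech check that can help in cases like this (a hedged suggestion, not something from the original post: the account MyDomain\ssisSvc and the path are hypothetical, and this only works if the account is allowed to log on interactively) is to open a session as the account the deployed package runs under and see whether that account can even enumerate the folder:

# Start PowerShell as the service account (prompts for its password)
runas /user:MyDomain\ssisSvc powershell

# Inside that session, enumerate the same UNC path the package uses
Get-ChildItem \\servername\folder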

BULK INSERT error when the file location changed to remote share

I am getting the following error on BULK INSERT after the file location was changed to a remote share. It used to be a shared folder on a local drive, and we never ran into this issue. I am running this BULK INSERT from my local PC, connecting to SQL Server via SSMS.
I have made sure both the SQL Server and file permissions are in place.
Before, when I ran this command from SSMS, the location was \\SQLServer\FTP, a shared folder on a local drive of that SQL Server. Now I have changed the file location to a network share, \\Fileshare\FTP, and get the above error, even though both the SQL Server service account (a domain account) and I (a domain account) have elevated permissions on the new location.
Any help or suggestions would be appreciated!
Thanks,
I can identify three circumstances that might generate this issue:
From the SQLAuthority blog, there is full detail on a related backup issue where there is a cross-domain link (in this case, from a workgroup to a full domain).
There are also two other possible answers in the question Cannot bulk load because the file could not be opened. Operating system error code 1326 (Logon failure: unknown user name or bad password.) here on Stack Overflow. We can discount the first one (login permissions) because you stated that you have permissions, but the other solution ("I fixed it by adding the SQL Server port number to the connection string in SSIS, forcing SSIS to access SQL Server through TCP/IP instead of Named Pipes.") could apply. Try forcing a connection to the server using TCP/IP.
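For reference, forcing TCP/IP usually just means prefixing the data source with tcp: (and optionally appending the port). A hedged example; the server name, database, and port are placeholders:

# Connection string that forces TCP/IP by naming the protocol and port
$connectionString = "Data Source=tcp:MySqlServer,1433;Initial Catalog=MyDb;Integrated Security=SSPI;"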
All of these issues appear to be related to attempted cross-domain communication. If that is what is happening in your case, one or more of these fixes should apply.
It finally worked....
I had to configure Kerberos authentication following the guide from this link: https://thesqldude.com/2011/12/30/how-to-sql-server-bulk-insert-with-constrained-delegation-access-is-denied/
Of course, I had to make adjustments to suit our environment and had to involve an Active Directory admin to create the SPNs and enable the delegation properties.
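If you are going down the same Kerberos route, you can verify the SPN side yourself before involving AD. setspn ships with Windows; the account and host names below are made-up examples:

# List SPNs registered for the SQL Server service account
setspn -L MyDomain\sqlSvcAccount

# Check that a CIFS SPN exists for the file server hosting the share
setspn -Q cifs/FileShareServer.MyDomain.com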
Thanks.

TSQL and UNC Paths

I'm not an expert with T-SQL, so please have patience with me. Recently I was doing a project in T-SQL on my local server using SQL Server 2008 R2 Management Studio. I was reading my files from a temp folder on my C: drive and bulk inserting them into tables.
Then I moved to a regular server instead of the local server on my machine.
It took me a bit to realize that I no longer had access to the folders and files on my local machine, and that is causing me issues.
I've read that one solution is to create a mapped drive on the server, but that is not an option for me.
So my question is: what are my other options? Could I use UNC paths to access my files, or anything else?
The files I want to access are regular text files that are comma-delimited and newline-terminated.
(I saw questions somewhat similar to mine, but theirs seemed server-specific or specific to their particular issues. Also, none of their questions were answered.)
Actually, a mapped drive won't work either, because the account SQL Server runs under by default (Local System, if I recall) will not have network access.
So the more reliable way to do this is definitely with a UNC path, BUT there is more! (I've done this several times when I've needed to move database backups and log backups across servers for mirroring.)
How?
On the SQL Server machine AND the other server that will host the share, create a new user (same username and password on both machines) - assuming you're not using AD. The user need not be in any groups at all other than the Users group, but it must have the same name on both servers and the passwords must match.
On the SQL Server machine, change the account that SQL Server runs under. This is done in the SQL Server Configuration Manager - do not try to do this yourself via Windows services. Choose the user that you created in step 1 above. Note that you have to enter the password. Restart SQL Server after you've changed it and verify that it still runs fine. It should run just as before, but now it is running as a particular user with all the permissions of that user (which are actually very limited anyhow, but at least the user can access network resources).
On the remote server, make sure the new user has NTFS permissions on the folders that will host your share: read/write, or just read if SQL Server is only reading data.
On the remote server, create a share pointing to the appropriate folder that you set permissions on above. If you're using share permissions, make sure the new user also has permissions on the share (not just in NTFS on the drive).
Once all of this is set up, your SQL scripts simply use the UNC path that points to the remote share, and since SQL Server is running "as" a user with access to that share, it will see the files just fine!
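To make that concrete, here is a minimal sketch of the setup and usage, run from an elevated PowerShell prompt on the remote server and then against SQL Server with sqlcmd. Every name in it (svcsql, the Data share, the paths, the table) is a placeholder for illustration, not a value from the question:

# On BOTH machines: create the matching local user (same name and password)
net user svcsql SomeStr0ngPw! /add

# On the remote server: grant NTFS read access, then share the folder
icacls C:\SqlImport /grant "svcsql:(OI)(CI)R"
net share Data=C:\SqlImport /grant:svcsql,READ

# Once SQL Server runs as svcsql, reading over the UNC path just works:
sqlcmd -S MySqlServer -Q "BULK INSERT dbo.MyTable FROM '\\remoteserver\Data\file.csv' WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')"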

Access 2007 - accdb; options in setting up a reliable multi-user environment across multiple servers?

I am having trouble sorting through all the information and options regarding Access 2007 used in a multi-user environment. Here is a brief description of my current situation. At work there is the "Business LAN", which I can log on to and use to monitor two other servers via Remote Desktop. The business LAN is strictly controlled by our IT department, and no one is permitted to install any software or drivers without their consent. I do have administrative privileges on both servers, though.
The two servers that I log on to using RD are used for essentially the same task, which is to monitor and control the heat to different process lines. Each server runs a different program to accomplish this task, but both programs use SQL Server as a back end.
I created two Access databases (one on each server, because they are currently behind separate firewalls) in order to query information from the back-end SQL side of these programs and combine it with related information I have compiled in tables, to add more detail to the data the programs are collecting. My program is still in the debug stage, but ultimately this information can be accessed by field techs and maintenance in order to make their jobs easier. Maintenance staff can also add even more information based on the status of repairs, etc. Last, I have created reports which can be run by managers and engineers who are looking for an overall status of their area.
Both Access DBs are split so that the back ends are separate from the forms, queries, etc. I use an ODBC data source to link to SQL Server. I am using VBA for user authentication, logging of record updates, and user/group access control. Everything works the way I intended, except for the fact that everyone who logs on to the server will be trying to run the same copy of the front end. For example, I had a co-worker log on to the server via RD to test the program while I logged on from my desk. After logging in, I could see the forms he had open - Access was already running. Without being able to install Access locally (or even the runtime, due to IT restrictions) on each individual's workstation, I'm not sure what approach to take to resolve this.
Additional info
Server 1
One of the servers is considered to be the "master server", which a number of client stations ("slave servers") all communicate with. The only way to access folders on the master server is to log on to a client station and run RD.
Server 2
This server is considered to be the "historian". It communicates with a terminal server; users log on to the terminal server using RD and run applications that use a SQL back end residing on the historian. I have been able to set up shares so that certain folders on the historian are visible from the terminal server.
Can anyone tell me what my best option is?
Thanks in advance.
CTN
It's really crazy the way some IT departments do everything possible to make it hard to do your job well.
You allude to users logging on via Terminal Server. If so, perhaps you can store the front ends in the user profiles of their Terminal Server logons? This assumes they're not just using the two default administrative Terminal Server logons, of course.
The other thing that's not clear to me is why you need a back end in Access/Jet/ACE at all: why not just link via ODBC to the SQL Server and use that data directly? The only reason to have an independent Jet/ACE file with data tables in it in that scenario is if there is data you're storing for your Access application that is not stored in the SQL Server. You might also have temp tables (e.g., for staging complicated reports), but those should be in a temp database on a per-user basis, not in a shared back end.
Here is a suggestion for how to implement what David Fenton wrote: write a simple batch script that copies your frontend from the installation path to %TEMP% (the temporary folder of the current user's session) and runs the frontend from there. Something along the lines of:
rem make sure the current directory is where the script is
cd /d %~dp0
rem assume frontend.mdb is in the same folder as the script;
rem copy it to this user's temp folder and launch that private copy
copy /y frontend.mdb "%temp%"
start "" "%temp%\frontend.mdb"
Tell your users not to run the frontend directly, only via the batch script; then everyone gets their own copy of the frontend. Alternatively, give your frontend a different suffix in the installation path and rename it to "frontend.mdb" when copying it to %temp%.
