Batch File Not Connecting to Database

I am trying to automatically execute a SQL script with a batch file. Here is what I have in my batch file:
@echo off
dbisql -c "Server=servername;DBN=databasename;UID=UserID;PWD=password" SqlFile.sql
pause
It says that the server was not found.

If the server is not on the same machine, you have to tell the client that. You can do that in one of two ways:
Add the hostname of the machine where the server is running to the connection string, like: Server=<servername>;...;host=<hostname>. If the server is not running on the default port (2638), you can add the port number too, using host=<hostname>:<port>. This is the preferred method (see the example below).
Add the links=tcpip parameter to the connection string. This is an older method and won't work if the server is on a different subnet.
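For example, with the preferred method the batch file might look like the following minimal sketch. The host name dbhost and the port 2638 are placeholders for your environment; keep your own server name and credentials.
@echo off
rem dbhost:2638 is a placeholder for the machine and port where the database server listens
dbisql -c "Server=servername;DBN=databasename;UID=UserID;PWD=password;host=dbhost:2638" SqlFile.sql
pause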

Related

VbScript or with Scripting

We need to execute scripts locally, but how can we trigger them from a remote machine, so as to avoid going to each machine and starting the process manually? Preferably with VBScript.
Generally in your VBScript you will specify the machine you want to query with the line
strComputer = "."
This can be changed to a computer name or IP address to query a machine remotely. However, it is difficult to provide anything further as you've not posted your script or what you're trying to achieve with it.
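As an illustration only (a sketch, not your script: the machine name RemotePC and the WMI query are assumptions), pointing strComputer at a remote machine looks like this:
' Hypothetical example: query a remote machine via WMI.
' "RemotePC" is a placeholder computer name or IP address.
strComputer = "RemotePC"
Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\cimv2")
Set colItems = objWMIService.ExecQuery("SELECT Caption FROM Win32_OperatingSystem")
For Each objItem In colItems
    WScript.Echo objItem.Caption
Next
The account running the script needs WMI/administrative rights on the remote machine for this to work.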
Write your .vbs scripts and deploy them to a remote server - then on the remote server schedule a task to run whenever you'd like the script to run.
The script will run locally on that server.

Check if a file exists on a web server using SQL Server

I want to check whether a file exists on a web server using SQL Server.
I have tried xp_cmdshell with DIR, but it works only for local files.
Please let me know how to achieve this.
Thanks in advance.
After searching for a few days, I found the following, which works well: when you have to put a file (such as a BCP result or a backup) on a remote drive, simply mapping the drive in Windows does not work; the drive must be mapped from within SQL Server as well. To do this, try something like:
exec xp_cmdshell 'net use p: \\Server\Folder\Folder\Folder Password /user:Domain\Login'
Reference:
https://social.msdn.microsoft.com/Forums/en-US/6eca2d62-eb86-4f23-9b86-6f917017f50c/bcp-utility-via-xpcmdshell-and-network-drive?forum=sqlsecurity
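Tying this back to the original question (checking that a file exists), a hedged sketch along these lines should work; the share path, drive letter, credentials, and file name are all placeholders:
-- Hypothetical example: map the share inside SQL Server, then check for the file.
EXEC xp_cmdshell 'net use P: \\Server\Share Password /user:Domain\Login';

-- DIR lists the file if it exists; "File Not Found" in the output means it does not.
EXEC xp_cmdshell 'dir P:\file.txt';

-- Drop the mapping again when you are done.
EXEC xp_cmdshell 'net use P: /delete';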
You cannot access the share as just any user: when SQL Server accesses a directory, it does so under the account its service runs as, so you must use an account that has rights on that directory and run the SQL Server service as that account.
If the service runs as an ordinary user, or simply as the default Network Service or 'Local Service' account, it can access all the resources of its own computer, including via a network path (i.e. \\yourname\sharablefolder), but it cannot access shared folders elsewhere on the network.
For backup/restore: http://www.sqlservercentral.com/Forums/Topic631787-146-1.aspx
For executing commands: http://www.sqlservercentral.com/Forums/Topic808580-359-1.aspx
How can I access a file/folder over the network through xp_cmdshell in SQL Server 2008?
https://www.simple-talk.com/community/forums/thread/72262.aspx
https://social.msdn.microsoft.com/forums/sqlserver/en-US/c5ce5e21-17e7-4763-ba68-d3bb7dad213f/access-denied-on-xpcmdshell-with-certain-folders

Transferring files with SFTP

So I'm trying to transfer files to a remote computer over SSH. I've used the sftp command, used lls to confirm the presence of the file on the local computer, and then ran the put filename command. However, I receive the same result each time:
stat filename: No such file or directory
I just don't know what's going wrong! Any help or troubleshooting tips would be appreciated.
If you're currently using Windows, you can download WinSCP and use that to transfer files. It has a nice graphical interface that is easy to interact with.
Well, supposing that you are in a Linux/Unix environment, you could use scp. Typically, the syntax for an scp command looks like this:
$ scp foobar.txt your_username@remotehost.net:/some/remote/directory
The above command copies the file foobar.txt, which resides on the local computer, to a specific directory on the remote machine, using a username (you will be asked for a password).
The sftp command line client uses the ssh transport and will tunnel your connections using your key. So if you have ssh access, you should also have sftp access. This is a secure option for people who are more comfortable with ftp. Most GUI ftp clients should also support sftp.
I was also facing this issue when trying to upload files from the local machine to the remote server. My commands were fine; the mistake I was making was this: I had logged into the remote server with ssh and then started sftp from there. Done that way, sftp treats the remote server as the local machine (since I had logged into it first via ssh) when running a command such as:
put /c/path/to/file.txt
So the thing to do is to log in to the server directly via sftp from your local machine and put your local files from there.
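For example, a minimal session run from the local machine might look like this; the username, host, and paths are placeholders. lls confirms the file is present locally, and put uploads it to the remote directory.
$ sftp your_username@remotehost.net
sftp> lls
sftp> cd /some/remote/directory
sftp> put foobar.txt
sftp> quit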

Cannot bulk load because the file could not be opened. Operating System Error Code 3

I'm trying to set up a Stored Procedure as a SQL Server Agent Job and it's giving me the following error,
Cannot bulk load because the file "P:\file.csv" could not be opened. Operating system error code 3(failed to retrieve text for this error. Reason: 15105). [SQLSTATE 42000] (Error 4861)
Funny thing is the Stored Procedure works just fine when I execute it manually.
The P: drive is a share on a Linux machine, mapped on the Windows SQL Server machine via Samba, and it was set up by executing the following command:
EXEC xp_cmdshell 'net use P: "\\lnxusanfsd01\Data" Password /user:username /Persistent:Yes'
Any help on this would be highly appreciated.
I do not know if you solved this, but I ran into the same issue. If the instance is local, you must check the permissions to access the file; but if you are connecting from your computer to a server (remote access), you have to specify a path on the server, which means placing the file in a directory on the server. That solved my case.
example:
BULK INSERT Table
FROM 'C:\bulk\usuarios_prueba.csv' -- This is server path not local
WITH
(
FIELDTERMINATOR =',',
ROWTERMINATOR ='\n'
);
To keep this simple, I just changed the directory from which I was importing the data to a local folder on the server.
I had the file on a shared folder; I simply copied my files to "C:\TEMP\Reports" on the server and updated the query to BULK INSERT from the new folder. The Agent job completed successfully :)
Finally, after a long time, I'm able to BULK INSERT automatically via an Agent job.
Best regards.
I have solved this issue: log in to the server computer where SQL Server is installed, get your CSV file onto that server, and execute your query; it will insert the records.
If you get a datatype compatibility issue, change the datatype for that column.
Using a SQL connection via Windows Authentication:
A "Kerberos double hop" is happening: one hop is your client application connecting to the SQL Server, a second hop is the SQL Server connecting to the remote "\\NETWORK_MACHINE\". Such a double hop falls under the restrictions of Constrained Delegation, and you end up accessing the share as Anonymous Login, hence the Access Denied.
To resolve the issue you need to enable constrained delegation for the SQL Server service account. See here for a good post that explains it quite well.
Using SQL Server with SQL Authentication:
You need to create a credential for your SQL login and use that to access that particular network resource. See here.
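A minimal sketch of creating such a credential follows; the credential name, identity, and secret are placeholders, and how the credential is then attached (for example to an Agent proxy) depends on your setup, as the linked post describes.
-- Hypothetical names and secret; use a Windows account that can read the share.
CREATE CREDENTIAL FileShareCredential
WITH IDENTITY = 'DOMAIN\FileShareUser',
     SECRET = 'file-share-password';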
I would suggest the P: drive is not mapped for the account that SQL Server is running as.
It's probably a permissions issue, but to troubleshoot, make sure you try these steps:
Put the file on a local drive and see if the job works (you don't necessarily need RDP access if you can map a drive letter on your local workstation to a directory on the database server)
Put the file on a remote directory that doesn't require a username and password (allows Everyone to read) and use the UNC path (\\server\directory\file.csv)
Configure the SQL job to run as your own username
Configure the SQL job to run as sa and add the net use and net use /delete commands before and after (a sketch follows after this list)
Remember to undo any changes (especially running as sa). If nothing else works, you can try to change the bulk load into a scheduled task, running on the database server or another server that has bcp installed.
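For that last option, a rough sketch of the job-step script follows. It reuses the share and file name from the question; the table name, drive letter, and credentials are placeholders.
-- Hypothetical job step: map the share, bulk load, then drop the mapping.
EXEC xp_cmdshell 'net use P: \\lnxusanfsd01\Data Password /user:username';

BULK INSERT dbo.TargetTable   -- placeholder table name
FROM 'P:\file.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

EXEC xp_cmdshell 'net use P: /delete';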
I did try giving access to the folders, but that did not help.
My solution was to select the options highlighted in red for the logged-in user.

How to copy file to remote server in Lotusscript

I want to create a Lotus Notes agent that will run on the server to generate a text file. Once the file is created, I need to send it to a remote server.
What is the best/easiest way to send the file to a remote server?
Thanks
If your "remote" server is on a local Windows network, you can simply copy the file from the server's file system to a UNC path (\\myserver\folder\file.txt) using the FileCopy statement. If not, you may want to look at using a Java agent, which would make more file transfer protocols easily accessible.
In either case, be sure to understand the security restrictions on Notes agents - for your agent to run on the server and create a file on the server's file system, the agent will need to be flagged with a runtime security level of 2 or 3, and signed by an appropriately authorized ID.
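For the FileCopy approach, a rough sketch of the server agent follows (assuming the security settings described above are in place); both paths are placeholders.
Sub Initialize
    ' Hypothetical paths: the file generated by the agent and the target UNC share.
    Dim sourceFile As String
    Dim targetFile As String
    sourceFile = "C:\Export\file.txt"
    targetFile = "\\myserver\folder\file.txt"
    ' Copy the file from the server's file system to the remote share.
    FileCopy sourceFile, targetFile
End Sub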
Sending or copying files to a remote server using OS-level commands requires that the destination server also be mapped as a drive on your source server. As Ed rightly said, security needs to allow you to save files onto the server before you try to copy them.
You can generate the file locally on the server and then use FTP commands in a script to send the file. Or if you're a Java guru, you can try using Java.FTP to send the file as well. I had some trouble with it, but it should be possible provided an FTP account is set up on the destination server. FTP-related material by a well-known Notes guy can be found here and here.
I have done it using a script, and it's clumsy but effective at simply pushing files around. Ideally, if the server at the other end is a Domino server as well, you could attach the file to an email and send it to a mail-in account on the destination server. I have done that before, and it's great as you can hand the whole problem of getting the file across off to the SMTP process.
