How to download a CSV file from an FTP location and update a table using a stored procedure - sql-server

I want to download a CSV file from an FTP location and update that data into tables using a stored procedure. I am not sure how to do that, or whether a stored procedure is the right approach. I have gone through many posts, but most of them talk about pushing data to an FTP location.
Any help much appreciated.
Thank you.

For this requirement you may have to write a shell script that connects to the required FTP server, downloads the file to the local system, and then uses the bulk-load facility supported by the respective database (BULK INSERT, in SQL Server's case).
If you are using a programming language, you can instead write a program that downloads the file from the FTP location, reads the CSV file, and inserts the rows as a batch.
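For the SQL Server side, a minimal sketch might look like the following; it assumes the CSV has already been downloaded (by the FTP script) to a path the SQL Server service account can read, and all table, column, and file names here are placeholders:

CREATE PROCEDURE dbo.LoadCsvIntoTarget
AS
BEGIN
    SET NOCOUNT ON;

    -- Load the raw file into a staging table first.
    TRUNCATE TABLE dbo.CsvStaging;

    BULK INSERT dbo.CsvStaging
    FROM 'D:\ftp\incoming\data.csv'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

    -- Then update/insert the real table from the staging table.
    MERGE dbo.TargetTable AS t
    USING dbo.CsvStaging AS s
        ON t.Id = s.Id
    WHEN MATCHED THEN
        UPDATE SET t.Value = s.Value
    WHEN NOT MATCHED THEN
        INSERT (Id, Value) VALUES (s.Id, s.Value);
END;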

Related

How to load data from UNIX to Snowflake

I have created CSV files on the UNIX server where Informatica resides. I want to load those CSV files directly from the UNIX box to Snowflake using SnowSQL; can someone help me with how to do that?
1. Log into SnowSQL:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-log-in.html
2. Create a database, table, and virtual warehouse, if not done so already:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-create-objects.html
3. Stage the CSV files, using PUT:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-stage-data-files.html
4. Copy the files into the target table, using COPY INTO:
https://docs.snowflake.com/en/user-guide/getting-started-tutorial-copy-into.html
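Condensed, the SnowSQL session might look like this (the database, warehouse, table, and file paths below are placeholders):

-- Point the session at the right context.
USE DATABASE mydb;
USE WAREHOUSE mywh;

-- Upload the CSV files from the UNIX box to the table's internal stage.
PUT file:///data/informatica/out/*.csv @%mytable;

-- Load the staged files into the table.
COPY INTO mytable
  FROM @%mytable
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' SKIP_HEADER = 1);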

Can anyone suggest how I can specify the path for the output file that my stored procedure writes in my Oracle DB?

Suppose the IP address of my FTP server is xx.xxx.xx.xx and I need the output file to be stored in D:/example. I need to ensure that the path I give is on my FTP server. How can I include that in my FOPEN call, i.e. as a path which points to the example directory on my FTP server?
Generally speaking, this is how it goes:
there's a database server
there is a directory on one of its disks
that directory is referenced in a CREATE DIRECTORY command, which creates a directory, an Oracle object
that object is then used as the target for your file-related operations. For example:
it'll contain CSV files which are the source of external tables
.dmp files, the result of Data Pump export, will be stored there (the same goes for import)
UTL_FILE will create files in that directory
All of that means that your idea of creating a file on an FTP server might not work quite that easily.
However, there is a way: if you create the directory (the Oracle object) using a UNC (Universal Naming Convention) path which points to a directory on the FTP server, the file might be created there. Do some research about it; I know I once did that (put files onto an application server), but that was a long time ago and I don't remember everything I did.
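As a rough sketch of the directory-object approach (the directory name, UNC path, grantee, and file contents here are all illustrative):

-- The path must be reachable by the database server itself; per the
-- note above, a UNC path pointing at the FTP server's share might work.
CREATE OR REPLACE DIRECTORY out_dir AS '\\xx.xxx.xx.xx\example';
GRANT READ, WRITE ON DIRECTORY out_dir TO your_user;

DECLARE
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  -- FOPEN takes the directory OBJECT name, not the OS path.
  l_file := UTL_FILE.FOPEN('OUT_DIR', 'output.csv', 'w');
  UTL_FILE.PUT_LINE(l_file, 'col1,col2');
  UTL_FILE.FCLOSE(l_file);
END;
/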
Another option you might consider is the DBMS_SCHEDULER package. Suppose you create the file on the database server (which is the simplest option; if you do it right, it is more or less trivial). Once the procedure which creates the file is done, call DBMS_SCHEDULER.CREATE_JOB using the EXECUTABLE job type to run an operating system batch file that copies the file from the database server to the FTP server.
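A minimal sketch of that scheduler call (the job name and batch file are hypothetical):

BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name   => 'COPY_FILE_TO_FTP',
    job_type   => 'EXECUTABLE',
    job_action => 'C:\scripts\copy_to_ftp.bat',  -- your OS copy script
    enabled    => TRUE,
    auto_drop  => TRUE);
END;
/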
That's all I can say about it; at least, you have something to research & think about.

Using SQL Server to Zip Files

I have a table that stores users' FileData with the data type varbinary(MAX) FILESTREAM NULL.
In my web application, the user selects multiple file IDs and eventually wants a zip file of those selected FileIds.
Currently my solution is to bring the FileData into C# and call some C# function/library that zips the files and returns the result to the user. The problem with this is that the user could potentially select a ton of files, causing a lot of temporary data to exist in C#.
Is there a way that I can zip these files in SQL Server and then return the zipped result to C# without having to bring the selected FileDatas into C# memory?
You can certainly do this through a stored procedure, which would write the files and zip them, but you would be writing SQL that writes files to disk and executes Windows system commands. You can read up on xp_cmdshell; personally, I would advise against this.
You are still going to have a large zip-file blob coming back to your server in that model. Couldn't your users still overload your system? You would get around this using streaming, which can be combined with your zipping.
Are you using the most recent ZipArchive? It provides streaming access both in and out if used properly. Basically, you write your code to use an output stream so that data doesn't build up in memory: new ZipArchive(myOutputStream, ZipArchiveMode.Create, leaveOpen: true). (Create is the mode that writes forward-only to the stream; Update requires a seekable stream and buffers entries.)

SSIS Permission Issue Flat Files

I recently had to move my files to a new SSIS server. Everything seems to be working except when I try to execute a bulk insert, it tells me:
Cannot bulk load because the file "E:\FlatFiles\SSG\apmast.txt" could not be opened. Operating system error code 21 (The device is not ready.).
It does this for all my flat files. I found an article saying you need to give the MSSQLSERVER user full control of the files, which I did, but this does not seem to fix it. Any other ideas? Do I need to give other files the same permissions? I really don't want to just throw full control around if I don't have to. Thanks
I figured it out: it turns out that BULK INSERT tells the server to look for the text file locally. I was trying to get SSIS to do a bulk insert of flat files from one server into another SQL Server on the network. As soon as I put the flat files on the remote server, it grabbed and used them. This seems like a very odd way for it to work; I would expect it to push the files to the SQL Server instead of asking the SQL Server to look for the files locally via a hard-coded path.
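In other words, the path in a BULK INSERT statement is resolved on the SQL Server that executes it, not on the machine running SSIS. Illustratively (the table name, share name, and terminators here are assumptions):

-- Either the file exists on the target server's own disk, or you point
-- at a UNC share the server's service account can reach:
BULK INSERT dbo.apmast
FROM '\\ssis-server\FlatFiles\SSG\apmast.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');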

SQL Server Bulk Insert fails (Network related)

I'm fairly new to SQL Server.
I'm trying to bulk insert into a table, using the command in SQL Server Management Studio (2005):
BULK INSERT Table1 FROM 'c:\text.txt' WITH (FIELDTERMINATOR = '|')
I get the error:
Msg 4860, Level 16, State 1, Line 1
Cannot bulk load. The file "c:\text.txt" does not exist.
I'm positive the file actually exists.
I get the feeling that it is looking for the file on the local hard drive of wherever the server is. Is that the case? If so, how do you generally solve this problem? (To note, I've tried specifying the network address of my PC when entering the location of the text file, but I get a permission error. Also, I know in advance that my company doesn't allow files to be placed on a server.)
SQL Server does not have an SQL statement that reads data from the client end (as the other posters have pointed out). Other RDBMS products do implement this; e.g. the Postgres COPY statement lets you specify a file on the server, or a file on the client that is read by the db connectivity library on the client side.
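For contrast, the client-side Postgres variant is spelled as follows; \copy is a psql meta-command that reads the file on the client and streams it through the connection (the table name and path are illustrative):

\copy table1 from 'c:\text.txt' with (delimiter '|')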
You can achieve moving data from a file on the client to a table on SQL Server using the bcp command line program.
bcp lets you copy data from a local file to a table on the server, or from a table (or select query) on the server into a local file. For example:
bcp dbname.dbo.tablename in c:\temp.txt -S servername -T -c
will copy a tab-delimited file (temp.txt) into the specified table on the given server (assuming the file contains the right number of columns).
I am not sure if this helps, but it is the only way to move data from a client file to a server table without giving the server some sort of access across the network to the data file on the client.
I'd agree that it's a problem with the file being on your C drive, and not the server's drive.
If it's a permissions issue, have you tried creating a file share on your workstation that the server does have permission to read from? Maybe something like \\YourWorkstation\SQLFile, and then granting Everyone (or Guest, depending on how your network permissions are set up) read access on it?
If you can't create the share on your laptop, or you can't grant rights to it for some reason, is there a file share somewhere in the office that you do have rights to, and that SQL can also read from? Maybe a NAS or a "Common" network folder?
Have you created a shared drive on your machine that the server can see? If so, then you just need to refer to the path including your machine name instead of C:.
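For example, reusing the share suggested above:

-- The server resolves this UNC path over the network instead of
-- looking on its own C: drive.
BULK INSERT Table1
FROM '\\YourWorkstation\SQLFile\text.txt'
WITH (FIELDTERMINATOR = '|');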
Yes, it will look for the file on the SQL server itself.
If you can map a network drive to the C drive of your SQL server, then you can just copy the file over before running the bulk insert.
If you absolutely can't get any access to the server's file system, then you can look at doing something like this (a T-SQL sketch follows the list):
write a program that reads your text file and inserts the contents into a single record in a temporary table that has a text field, perhaps using a stored procedure
have the program execute the bcp command to export the data from the temporary table into a text file on the SQL server's local file system, in a folder to which the account running the SQL service has write permission
have the program run a bulk insert command specifying the path to the text file on the server
delete the text file and the temporary table
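A rough T-SQL sketch of the last three steps; it compresses "have the program execute" into server-side xp_cmdshell calls (which must be enabled), and every name and path below is a placeholder:

-- Step 2: export the staged contents to a file on the server's own disk.
EXEC master..xp_cmdshell
    'bcp "SELECT Contents FROM StagingDb.dbo.UploadedFile" queryout D:\staging\text.txt -S localhost -T -c';

-- Step 3: bulk insert from the server-local copy.
BULK INSERT Table1
FROM 'D:\staging\text.txt'
WITH (FIELDTERMINATOR = '|');

-- Step 4: clean up.
EXEC master..xp_cmdshell 'del D:\staging\text.txt';
DROP TABLE StagingDb.dbo.UploadedFile;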
