Is there a way in Sybase to select (read) an image or a file from the server or from a drive?
In Oracle there is BFILE, which lets me read an image directly from a drive; how can I do that in Sybase?
You can read/write text files located on the ASE server's host through a proxy table that is mapped to a file.
Unfortunately, there is no way to read a binary file like an image via such a proxy table or otherwise directly from SQL. Some kludges are possible though:
you can use BCP and a format file to read a binary file into an image column (see my Tips & Tricks book below), and you can run this from SQL via xp_cmdshell; a sketch follows below.
you can use the JVM that is embedded in the ASE server to read files and move the content into a table; that will require combined Java and SQL programming. YMMV.
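A minimal sketch of the first kludge, assuming xp_cmdshell is enabled on the ASE server; the table, login, server name, and file paths are all placeholders, and img.fmt stands for a bcp format file that maps the entire input file to the image column:
-- assumes a table like: create table img_tab (img image)
exec xp_cmdshell 'bcp mydb..img_tab in /tmp/picture.jpg -Uloginname -Ppassword -SMYASE -f /tmp/img.fmt'
go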
Related
I am trying to load data in Oracle database from BCP files.
The Oracle database server is located on a remote machine. In my control file I have added the path to the BCP file as load data INFILE 'C:\path\to\bcpFile.txt'. This does not work if the BCP files are not on the same machine as the DB server. One option I found is to create a network mapping from my Windows machine to the Linux DB server, but that has manual overhead. I learned we can use LOAD DATA LOCAL INFILE to fetch data files; I would like to see an example of this.
Install the full Oracle Client, which includes SQL*Loader, on your Windows computer.
Once it is installed, make sure the INFILE section of your control file references your datafile with the correct syntax.
With the path you provided, that section would look like this:
INFILE 'C:\path\to\bcpFile.txt'
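For context, a minimal complete control file (say loader.ctl) around that line might look like the sketch below; the table name, column list, and comma delimiter are assumptions. Because sqlldr runs on your Windows client, it reads the local data file and sends the rows to the remote database over the Oracle Net connect string:
-- loader.ctl (sketch; adjust table, columns, and delimiter to your data)
LOAD DATA
INFILE 'C:\path\to\bcpFile.txt'
APPEND
INTO TABLE target_table
FIELDS TERMINATED BY ','
(col1, col2, col3)
-- run from the Windows client:
--   sqlldr userid=myuser/mypassword@remotedb control=loader.ctl log=loader.log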
I want to download a CSV file from an FTP location and load that data into tables using a stored procedure. I am not sure how to do that, or whether a stored procedure is the right approach. I have gone through many posts, but most of them talk about pushing data to an FTP location.
Any help is much appreciated.
Thank you.
For this requirement you may have to write a shell script that connects to the FTP server, downloads the file to the local system, and then uses the SQL loader facility supported by the respective database.
Alternatively, if you are using a programming language, you can write a program that downloads the file from the FTP location, reads the CSV, and inserts the rows as a batch.
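As a rough sketch of that approach, assuming the target database is SQL Server, that xp_cmdshell is enabled, and that curl is available on the server; the FTP URL, paths, and table names are all placeholders:
-- 1. pull the CSV from the FTP server to a folder the SQL Server service can read
EXEC xp_cmdshell 'curl -o C:\staging\data.csv ftp://user:password@ftp.example.com/data.csv';
-- 2. load it into a staging table (FIRSTROW = 2 skips a header row);
--    a stored procedure can then merge the staged rows into the real tables
BULK INSERT dbo.StagingTable
FROM 'C:\staging\data.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);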
I am trying to bulk insert a CSV file located on a remote web server, but I am getting the following error.
Cannot bulk load because the file "http://34.34.32.34/test.csv" could
not be opened. Operating system error code 123(The filename, directory
name, or volume label syntax is incorrect.).
Is there any way to accomplish this?
The documentation for BULK INSERT says nothing about SQL Server being able to connect to web servers.
http://msdn.microsoft.com/en-us/library/ms188365.aspx
'data_file' is the full path of the data file that contains data to import into the specified table or view. BULK INSERT can import data from a disk (including network, floppy disk, hard disk, and so on). data_file must specify a valid path from the server on which SQL Server is running. If data_file is a remote file, specify the Universal Naming Convention (UNC) name. A UNC name has the form \\SystemName\ShareName\Path\FileName. For example, \\SystemX\DiskZ\Sales\update.txt.
If you must import a file from HTTP, consider writing a CLR stored procedure or using SSIS' external connectivity capabilities.
http://34.34.32.34/test.csv is, exactly as the error message says, an incorrect file name. Correct filenames look like c:\somefolder\test.csv. Something that starts with http: is a URL, not a file.
BULK INSERT does not support URLs as a source. You should download the file first locally (using wget, curl, or any other program that can download HTTP content), then bulk insert the downloaded file.
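For example, a sketch of that two-step workaround, assuming xp_cmdshell (or an external scheduler) is available to run the download, and with the local path and destination table as placeholders:
-- 1. download the file onto a disk SQL Server can read from
EXEC xp_cmdshell 'curl -o C:\temp\test.csv http://34.34.32.34/test.csv';
-- 2. bulk insert the local copy
BULK INSERT dbo.TestTable
FROM 'C:\temp\test.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');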
I wish to write a query that inserts a file that resides on the client (C# web server) into a column in the database server (SQL Server), something like INSERT … SELECT * FROM OPENROWSET(BULK…), but without having to save the file on the server machine first.
Is this even possible in SQL?
Although your context is unstated, I'm assuming that you're intending to run this from SSMS rather than from OSQL, a PowerShell script, or through some other means.
The file doesn't need to reside on the physical box running SQL Server, but SQL Server does need access to it. The typical approach, I believe, would be for an application server to copy the file to a shared repository and then pass it off to SQL Server through a UNC reference. The syntax to do so is relatively trivial and can be found in Importing Bulk Data by Using BULK INSERT or OPENROWSET(BULK...).
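For example, a sketch of the shared-repository pattern, with the share, file, and table names all hypothetical, once the application server has copied the upload to the share:
INSERT INTO dbo.MyFiles (Filename, Data)
SELECT 'report.pdf', f.BulkColumn
FROM OPENROWSET(BULK '\\fileserver\uploads\report.pdf', SINGLE_BLOB) AS f;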
If instead you're interested in providing a mechanism for the SQL Server to save a file from some type of stream operation where the client is directly transmitting a file and there is no shared repository, I'm not aware of a way to do that. Even if you use an SQL FILESTREAM object you still need an accessible NTFS location to stream from. See Saving and Retrieving File Using FileStream SQL Server 2008.
At some point, the server will have to have a hand on the file. That does not mean that the server has to keep the file, but the file has to get to the server in order to be read and inserted into the db. Typically, this is achieved with a form and a file-type input. On the server, you can use the uploaded file to create your query, then delete it.
That said, storing files in a database is a debatable practice. Depending on the type and size of files you're storing, your database can quickly balloon in size. For starters, this makes backups slower and more prone to failure, along with a laundry list of other potential pitfalls. Check out this question on SO: Storing Images in DB - Yea or Nay? As you can see from the answers, there are a number of considerations to be made, but a good rule of thumb is to not do this unless you have compelling reasons to do so.
In SQL Server BLOB Data in .NET: Tutorial, Mohammad Elsheimy explains how it can be done:
using (SqlConnection con = new SqlConnection(conStr))
using (SqlCommand command = new SqlCommand("INSERT INTO MyFiles VALUES (@Filename, @Data)", con))
{
    // assumes a MyFiles table with a filename column and a varbinary(max)/image data column
    command.Parameters.AddWithValue("@Filename", Path.GetFileName(filename));
    command.Parameters.AddWithValue("@Data", File.ReadAllBytes(filename));
    con.Open();
    command.ExecuteNonQuery();
}
Basically, this way the file is read on the client and sent to the database server without a need for a temporary file on the server machine.
I'm fairly new to SQL Server.
I'm trying to bulk insert into a table, using the command in SQL Server Management Studio (2005):
BULK INSERT Table1 FROM 'c:\text.txt' WITH (FIELDTERMINATOR = '|')
I get the error:
Msg 4860, Level 16, State 1, Line 1
Cannot bulk load. The file "c:\text.txt" does not exist.
I'm positive the file actually exists.
I get the feeling that it is looking for the file on the local hard drive of whatever machine the server is on. Is that the case? If so, how do you generally solve this problem? (To note, I've tried specifying the network address of my PC when entering the location of the text file, but I get a permission error. Also, I know in advance that my company doesn't allow files to be placed on the server.)
SQL Server does not have an SQL statement that reads data from the client end (as the other posters have pointed out). Other RDBMS products do implement this (e.g. the Postgres COPY statement lets you specify either a file on the server or a file on the client that is read by the db connectivity library on the client side).
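For comparison, a rough sketch of that Postgres behaviour, with a placeholder table and file paths: COPY with a file path is opened by the server process, while psql's \copy meta-command opens the file on the client and streams it over the connection.
-- server-side file, opened by the postgres server process:
COPY mytable FROM '/var/lib/postgresql/import/data.txt';
-- client-side file, opened by psql and streamed to the server:
\copy mytable from 'c:\temp\data.txt'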
You can achieve moving data from a file on the client to a table on SQL Server using the bcp command line program.
bcp lets you copy data from a local file to a table on the server, or from a table (or select query) on the server into a local file. For example:
bcp dbname.dbo.tablename in c:\temp.txt -S servername -T -c
will copy a tab-delimited file (temp.txt) into the specified table (assuming the file contains the right number of columns).
I am not sure if this helps, but it is the only way to move data from a client file to a server table without giving the server some sort of access across the network to the data file on the client.
I'd agree that it's a problem with the file being on your C drive, and not the server's drive.
If it's a permissions issue, have you tried creating a file share on your workstation that the server does have permissions to read from? Maybe something like \\YourWorkstation\SQLFile, and then granting everyone (or Guest, depending on how your network permissions are set up) read access on it?
If you can't create the share on your laptop, or you can't grant rights to it for some reason, is there a file share somewhere in the office that you do have rights to, and that SQL can also read from? Maybe a NAS or a "Common" network folder?
Have you created a shared drive on your machine that the server can see? If so, then you just need to refer to the path including your machine name instead of C:.
Yes, it will look for the file on the SQL server itself.
If you can map a network drive to the C drive of your SQL server, then you can just copy the file over before running the bulk insert.
If you absolutely can't get any access to the server's file system, then you can look at doing something like this:
write a program that reads your text file and inserts the contents into a single record in a temporary table that has a TEXT field, perhaps using a stored procedure
have the program execute the bcp command to export the data from the temporary table into a text file on the SQL Server's local file system, in a folder where the account under which the SQL Server service runs has write permission (see the sketch after this list)
have the program run a bulk insert command specifying the path to the text file on the server
delete the text file and the temporary table
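A rough sketch of steps 2 and 3, with the server, database, folder, and table names all assumed, and assuming the bcp export is run on the server itself (for example via xp_cmdshell) so that the exported file lands on the server's local disk:
-- step 2: export the staged contents to a file on the SQL Server host
EXEC xp_cmdshell 'bcp tempdb.dbo.StagedText out D:\ImportDrop\text.txt -S SQLSERVER01 -T -c';
-- step 3: bulk insert that server-local file into the real table
BULK INSERT Table1 FROM 'D:\ImportDrop\text.txt' WITH (FIELDTERMINATOR = '|');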