Is there a way to get a file's contents from the Windows XP command prompt? I tried running xp_cmdshell 'type [path to file]', but when I insert that output into another file and rename it to file.exe (which is an executable), it does not work. Any suggestions on how to get the file contents in a form I can actually use?
You could use BULK INSERT on the file and treat it as a table with one row and one column. This should allow you to read the file directly into a VARBINARY field, like this:
CREATE TABLE FileRead
(
    content VARBINARY(MAX)
)
BULK INSERT FileRead FROM '<FilePath>'
This requires SQL Server to have access to the file you are trying to read. It sounds like you are trying to "acquire" executables from a server you do not have access to? :-)
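If the goal is to pull the whole file back as a single binary value, OPENROWSET(BULK ..., SINGLE_BLOB) is a closely related option that avoids any row/field parsing. A minimal sketch, assuming the hypothetical path C:\temp\file.exe and that the login has bulk-load permissions:
-- read the whole file as one VARBINARY(MAX) value
INSERT INTO FileRead (content)
SELECT BulkColumn
FROM OPENROWSET (BULK 'C:\temp\file.exe', SINGLE_BLOB) AS f;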
I often want to quickly load a CSV into an Oracle database. The CSV (Unicode) is on a machine with Oracle Instant Client version 19.5; the Oracle database is version 18c.
I am looking for a command-line tool that uploads the rows without me having to specify a column structure.
I know I can use sqlldr with a .ctl file, but then I need to define column types, etc. I am interested in a tool that figures out the column attributes itself from the data in the CSV (or uses a generic default for all columns).
The CSVs I have to ingest always contain a header row, which the tool in question could use to determine appropriate columns for the table.
Starting with Oracle 12c, you can use sqlldr in express mode, so you don't need any control file.
In Oracle Database 12c onwards, SQL*Loader has a new feature called express mode that makes loading CSV files faster and easier. With express mode, there is no need to write a control file for most CSV files you load. Instead, you can load the CSV file with just a few parameters on the SQL*Loader command line.
An example
Imagine I have a table like this
CREATE TABLE EMP
(EMPNO number(4) not null,
ENAME varchar2(10),
HIREDATE date,
DEPTNO number(2));
Then a CSV file that looks like this:
7782,Clark,09-Jun-81,10
7839,King,17-Nov-81,12
I can use sqlldr in express mode:
sqlldr userid=xxx table=emp
You can read more about express mode in this white paper
Express Mode in SQLLDR
Forget about using sqlldr in a script file. Your best bet is to use an external table. This is a CREATE TABLE statement with sqlldr-style access parameters that reads a file from a directory and presents it as a table. Super easy, really convenient.
Here is an example:
create table thisTable (
   "field1" varchar2(10)
  ,"field2" varchar2(100)
  ,"field3" varchar2(100)
  ,"dateField" date
) organization external (
  type oracle_loader
  default directory <createDirectoryWithYourPath>
  access parameters (
    records delimited by newline
    load when (fieldname != BLANKS)
    skip 9
    fields terminated by ',' optionally ENCLOSED BY '"' ltrim
    missing field values are null
    (
       "field1"
      ,"field2"
      ,"field3"
      ,"dateField" date 'mm/dd/yyyy'
    )
  )
  location ('filename.csv')
);
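The external table only works once the directory object referenced by default directory exists, and you usually finish by copying the rows into an ordinary table. A rough sketch, where the path /data/csv, the grantee myUser, and the target table realTable are all hypothetical:
-- create the directory object to use in place of <createDirectoryWithYourPath>
-- (requires a suitably privileged user; the path is an assumption)
create directory myCsvDir as '/data/csv';
grant read, write on directory myCsvDir to myUser;
-- copy the externally read rows into a regular table
insert into realTable
select "field1", "field2", "field3", "dateField"
from thisTable;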
I have a folder containing thousands of text files (.txt) that I want to import into a single SQL Server table with the following 2 fields:
Filename
Content
Filename = filename of the text file
Content = the text within the text file
I'm hoping there is an easy(ish) way to do this using maybe a command line utility or through SSIS or just through T-SQL as I ultimately want to add this into a SQL job to run on a schedule.
Would somebody please point me in the right direction and possibly provide an example?
Many thanks
Try something like this:
INSERT INTO YourTable (FileName, Content)
SELECT 'mytxtfile.txt', BulkColumn
FROM OPENROWSET (BULK 'c:\temp\mytxtfile.txt', SINGLE_CLOB) MyFile
For more detail, refer to the documentation for BULK INSERT and OPENROWSET(BULK…).
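Since the question is about thousands of files, one way to cover a whole folder in plain T-SQL is to list the files with xp_dirtree and build the OPENROWSET statement dynamically. This is only a sketch: xp_dirtree is undocumented, the folder C:\temp\ and table YourTable are assumptions, and an SSIS Foreach Loop is the fully supported alternative.
DECLARE @folder nvarchar(260) = N'C:\temp\';   -- hypothetical folder
DECLARE @files TABLE (name nvarchar(260), depth int, isfile bit);
-- list the folder contents (depth 1, files included)
INSERT INTO @files
EXEC master.sys.xp_dirtree @folder, 1, 1;
DECLARE @name nvarchar(260), @sql nvarchar(max);
DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM @files WHERE isfile = 1 AND name LIKE N'%.txt';
OPEN c;
FETCH NEXT FROM c INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- OPENROWSET(BULK ...) needs a literal path, hence the dynamic SQL
    SET @sql = N'INSERT INTO YourTable (FileName, Content)
                 SELECT ''' + @name + N''', BulkColumn
                 FROM OPENROWSET (BULK ''' + @folder + @name + N''', SINGLE_CLOB) AS f;';
    EXEC sys.sp_executesql @sql;
    FETCH NEXT FROM c INTO @name;
END
CLOSE c;
DEALLOCATE c;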
We have a post-deployment script in our SQL Server project which essentially performs a bulk-insert to populate tables after they're created. This is done by reading several .csv files:
BULK INSERT
[dbo].[Table1]
FROM '.\SubFolder\TestData\Data1.csv'
WITH
(
ROWTERMINATOR = '0x0a',
FIELDTERMINATOR = ','
)
BULK INSERT
[dbo].[Table2]
FROM '.\SubFolder\TestData\Data2.csv'
WITH
(
ROWTERMINATOR = '0x0a',
FIELDTERMINATOR = ','
)
The problem is Visual Studio is having a hard time finding the files:
Cannot bulk load because the file ".\SubFolder\TestData\Data1.csv" could not be opened.
Operating system error code 3(The system cannot find the path specified.).
The .csv files are checked in to source control, and I do see them when I go to the folder they're mapped to on my machine. I assume the problem is that . isn't resolving to the path of the .sql file being executed. Is there a way to get that relative path? Is there a macro (or maybe a SQLCMD variable) that would give me the current path of the file?
The problem you have is that the .csv files are in your VS project, but the script will be executed on the SQL Server, so the files need to be in a location the server has access to. Maybe you could add a Pre-Build event that copies the .csv files to a shared drive the server can reach, and then use a static path in your script that reads the files from that shared location.
I know this question is very old but it is still relevant. I found a working solution to this problem under the following conditions (which are optional, because there are ways to work around them):
You will use the "Publish" option within the Visual Studio IDE to deploy your app.
The .csv file is part of your project and configured to be copied to the output folder.
Here are the steps:
Open Project Properties and go to the SQLCMD Variables section.
Add a new variable (for example $(CurrentPath)).
As its default value, put: $(ProjectDir)$(OutputPath)
Change your BULK INSERT code to:
BULK INSERT
[dbo].[Table1]
FROM '$(CurrentPath)\PathToFolderInsideOutputDirectory\Data1.csv'
WITH
(
ROWTERMINATOR = '0x0a',
FIELDTERMINATOR = ','
)
Save all and compile.
Test your deployment using Publish: make sure the $(CurrentPath) variable shows the right path (or press the "Load Values" button), then press the Publish button and everything should work.
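If you ever run the same script directly in SQLCMD mode instead of through Publish, you can supply the variable yourself; a one-line sketch with a hypothetical path (in SSMS this requires Query > SQLCMD Mode):
:setvar CurrentPath "C:\Projects\MyDatabase\bin\Debug"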
You can create an SSIS package and use a Foreach Loop container (with a Foreach File enumerator pointed at the folder) to loop through all the .csv files in a given path.
I think my question is simple.
How can I find out where my query is running from (i.e., where the script file itself is located)?
Edit:
Thank you for your answer.
I need to import an XML file using my T-SQL script file, and I want to keep them together, so wherever someone tries to run the T-SQL script, it must know its own current directory so it can find the XML file and import it. Thanks again!
You need a well known location where you can place XML files for the server to load. This could be a share on the SQL Server machine, or on a file server which the SQL Server service account has permissions to read from.
You then need a comment like this at the top of your script:
--Make sure you've placed the accompanying XML file on \\RemoteMachine\UploadShare
--Otherwise, expect this script to produce errors
Change \\RemoteMachine\UploadShare to match the well known location you've selected. Optionally, have the comment followed by 30-40 blank lines (or more comments), so that it's obvious to anyone running it that they might need to read what's there.
Then, write the rest of your script based on that presumption.
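Once the file is sitting on that share, the import itself can be a single OPENROWSET(BULK ..., SINGLE_BLOB) cast to XML. A minimal sketch, where the file name Import.xml is hypothetical and the login is assumed to have bulk-load permissions:
DECLARE @doc xml;
-- read the whole file from the well known share and cast it to XML
SELECT @doc = CAST(BulkColumn AS xml)
FROM OPENROWSET (BULK '\\RemoteMachine\UploadShare\Import.xml', SINGLE_BLOB) AS x;
-- @doc can now be shredded into tables with .nodes()/.value()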
I found a simpler solution to my problem!
I just import my XML file into a temp table once.
Then I write a SELECT query against that temp table, which contains my imported data, like this:
SELECT 'INSERT INTO MyTable VALUES (' + Col1 + ', ' + Col2 + ')' FROM MyImportedTable
Now I have an INSERT command for each of my imported records.
I save all of the INSERT commands in my script, so I only need my script file wherever I go.
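One caveat with generating INSERTs that way: character values need their own quoting inside the generated statement, or an embedded apostrophe will break it. A hedged refinement, assuming Col1 is numeric and Col2 is character data (both assumptions about the imported table):
SELECT 'INSERT INTO MyTable VALUES ('
       + CONVERT(varchar(20), Col1) + ', '   -- assumes Col1 is numeric
       + QUOTENAME(Col2, '''') + ');'        -- wraps Col2 in quotes and doubles embedded apostrophes
FROM MyImportedTable;
Note that QUOTENAME returns NULL for inputs longer than 128 characters, so longer text needs manual REPLACE-based escaping.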
I have a sql command as follows:
INSERT [dbo].[Currency] ([CurrencyID], [Description], [Symbol])
VALUES (N'7418fe34-1abc-4189-b5f1-e638a34af1a1', N'GBP', N'£')
When I run this against the database, it inputs the last column as '£' rather than '£'. I have come across this before but can't for the life of me remember how to fix it!
Any ideas?
Thanks.
UPDATE
Funnily enough, if I copy and paste that line from my SQL file into SQL Server Management Studio, it inserts fine. So I think there is something wrong with my SQL file, possibly a character in it that I can't see?
UPDATE
The SQL script has the following to insert the euro symbol:
INSERT [dbo].[Currency] ([CurrencyID], [Description], [Symbol])
VALUES (N'c60b1e0c-289a-4a0a-8c7d-30a490cbb7a8', N'EUR', N'€')
And it outputs "€" in the database for the last column
UPDATE
OK, I have now copied and pasted my full SQL file into SQL Server Management Studio and run it, and it inserts everything fine. So why does this issue arise only when I run the ".sql" file?
UPDATE
Another update! If I view the ".sql" file in Visual Studio it looks fine; however, if I open it in Notepad, the bogus characters appear!
(From the comments)
The file is saved as UTF-8, but sqlcmd is reading it using the wrong code page. Adding -f 65001 to the options tells sqlcmd to read it as a UTF-8 file.
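For completeness, the sqlcmd call would then look something like this (the server, database, and file names are placeholders):
sqlcmd -S .\SQLEXPRESS -d MyDatabase -i insert_currencies.sql -f 65001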