Oracle SQL Developer: Keep CR LF on insert - export

I am trying to export and import some data in my Oracle DB.
Using Export, I choose the Insert format and default settings. This gives me a Windows file with correct CR LF line endings within cells.
However, when I open this file in SQL Developer and run the inserts, everything is stored in the DB with LF only.
How can I keep the CR LF?
For scripting reasons I require the insert statements.

Related

Easy way to load a CSV file from the command line into a new table of an Oracle database without specifying the column details

I often want to quickly load a CSV into an Oracle database. The CSV (Unicode) is on a machine with Oracle Instant Client version 19.5; the Oracle database is version 18c.
I am looking for a command-line tool which uploads the rows without me specifying a column structure.
I know I can use sqlldr with a .ctl file, but then I need to define column types, etc. I am interested in a tool which figures out the column attributes itself from the data in the CSV (or uses a generic default for all columns).
The CSVs I have to ingest always contain a header row, which the tool in question could use to determine appropriate columns for the table.
Starting with Oracle 12c, you can use sqlldr in express mode, so you don't need any control file.
In Oracle Database 12c onwards, SQL*Loader has a new feature called express mode that makes loading CSV files faster and easier. With express mode, there is no need to write a control file for most CSV files you load. Instead, you can load the CSV file with just a few parameters on the SQL*Loader command line.
An example
Imagine I have a table like this
CREATE TABLE EMP
(EMPNO number(4) not null,
ENAME varchar2(10),
HIREDATE date,
DEPTNO number(2));
Then a CSV file that looks like this
7782,Clark,09-Jun-81,10
7839,King,17-Nov-81,12
I can use sqlldr in express mode:
sqlldr userid=xxx table=emp
You can read more about express mode in this white paper
Express Mode in SQLLDR
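By default, express mode looks for a data file named after the table (emp.dat here) in the current directory; most other settings can be overridden with command-line parameters. A slightly fuller sketch, where the data file name and the date mask are assumptions chosen to match the sample CSV above:
sqlldr userid=xxx table=emp data=emp.csv date_format="DD-Mon-RR"
Express mode also writes the control file it derived into the log file, so you can keep and customize it if you later outgrow the defaults.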
Forget about using sqlldr in a script file. Your best bet is to use an external table. This is a CREATE TABLE statement with SQL*Loader-style access parameters that reads a file from a directory and exposes it as a table. Super easy, really convenient.
Here is an example:
create table thisTable (
  "field1"    varchar2(10)
  ,"field2"    varchar2(100)
  ,"field3"    varchar2(100)
  ,"dateField" date
)
organization external (
  type oracle_loader
  default directory <createDirectoryWithYourPath>
  access parameters (
    records delimited by newline
    -- only load records whose first field is not blank
    load when ("field1" != blanks)
    -- skip a fixed number of header/preamble lines
    skip 9
    fields terminated by ',' optionally enclosed by '"' ltrim
    missing field values are null
    (
      "field1"
      ,"field2"
      ,"field3"
      ,"dateField" date 'mm/dd/yyyy'
    )
  )
  location ('filename.csv')
);
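Note that the default directory clause needs an Oracle directory object pointing at the folder that holds the CSV. A minimal sketch, where the directory name, path, and grantee are assumptions:
create directory ext_csv_dir as '/data/csv';
grant read, write on directory ext_csv_dir to your_user;
Once created, the external table can be queried like any other table, e.g. select * from thisTable;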

Use first row of CSV file as column name SQL

I am using Microsoft SQL Server Management Studio and I am currently importing some CSV files into a database. I am importing the CSV files using the BULK INSERT command into already existing tables, using the following query.
BULK INSERT myTable
FROM 'D:\myfolder\file.csv'
WITH
(FIRSTROW = 2,
FIELDTERMINATOR = ';', --CSV field delimiter
ROWTERMINATOR = '\n', --used to shift to the next row
ERRORFILE = 'D:\myfolder\Error Files\myErrrorFile.csv',
TABLOCK
)
This works fine for me thus far, but I would like to automate the process of naming columns in tables. More specifically I would like to create a table and use as column names, the contents of the first row of the CSV file. Is that possible?
The easiest way I can think of is:
right-click on the database, select: Tasks -> Import Data...
After that, the SQL Server Import and Export Wizard will be displayed. There you can specify and customize all the settings for importing data from any source (such as taking column names from the first row of a file).
In your case, your data source will be Flat File Source.
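If you would rather stay in plain T-SQL than use the wizard, one approach is to bulk insert just the header line into a one-column table and build the CREATE TABLE statement from it with dynamic SQL. A rough sketch based on the semicolon-delimited file from the question; the temp table, the nvarchar(255) column type, and the target table name are all assumptions:
-- read only the first line; the default field terminator (tab) keeps
-- the whole line in one column as long as the header contains no tabs
CREATE TABLE #header (line nvarchar(4000));

BULK INSERT #header
FROM 'D:\myfolder\file.csv'
WITH (LASTROW = 1, ROWTERMINATOR = '\n');

-- strip the trailing CR (Windows line endings), then turn
-- 'col1;col2;col3' into a bracketed column list
DECLARE @cols nvarchar(max), @sql nvarchar(max);
SELECT @cols = '[' + REPLACE(REPLACE(line, CHAR(13), ''), ';', '] nvarchar(255), [') + '] nvarchar(255)'
FROM #header;

SET @sql = 'CREATE TABLE myImportedTable (' + @cols + ')';
EXEC (@sql);

DROP TABLE #header;
After that, your existing BULK INSERT with FIRSTROW = 2 can load the data into the new table.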

VB6 - Bulk insert objects from memory into SQL

I know it's possible to do a bulk insert from a file like this:
strSQL = "BULK INSERT Northwind.dbo.[Order Details]
FROM 'e:\My Documents\TextFiles\OrderDetails.txt' " & _
"WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )"
But I can't seem to find a way to insert an object that's in memory instead. Is this possible?
The file must be visible from the SQL Server itself and its account must have access rights (if you are using SQL Server):
From http://technet.microsoft.com/en-us/library/ms188365.aspx
BULK INSERT can import data from a disk (including network, floppy disk, hard disk, and so on). 'data_file' must specify a valid path from the server on which SQL Server is running. If data_file is a remote file, specify the Universal Naming Convention (UNC) name. A UNC name has the form \\Systemname\ShareName\Path\FileName. For example, \\SystemX\DiskZ\Sales\update.txt.
You can populate a table variable or temporary table from memory, then use a MERGE statement to process it into the target table.
T-SQL Merge statement docs here http://msdn.microsoft.com/en-us/library/bb510625.aspx
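A sketch of that idea, assuming SQL Server 2008 or later (MERGE does not exist in earlier versions) and the standard Northwind schema for the column list:
-- temp table standing in for the in-memory object
CREATE TABLE #incoming (
    OrderID int,
    ProductID int,
    UnitPrice money,
    Quantity smallint,
    Discount real
);
-- ... populate #incoming from the client, e.g. with batched INSERTs ...

MERGE Northwind.dbo.[Order Details] AS target
USING #incoming AS src
    ON target.OrderID = src.OrderID
   AND target.ProductID = src.ProductID
WHEN MATCHED THEN
    UPDATE SET target.UnitPrice = src.UnitPrice,
               target.Quantity  = src.Quantity,
               target.Discount  = src.Discount
WHEN NOT MATCHED THEN
    INSERT (OrderID, ProductID, UnitPrice, Quantity, Discount)
    VALUES (src.OrderID, src.ProductID, src.UnitPrice, src.Quantity, src.Discount);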
The solution ended up being to build a COM object in C# that does the bulk insert and then leveraging that COM object in the VB6 project.

SQL Server - Bulk insert without losing CR or LF characters

I am trying to import email communication into a database table using Bulk Insert but I can't seem to be able to preserve the CR and LF characters. Let's consider the following:
CREATE TABLE myTable (
Email_Id int,
Email_subject varchar(200) NULL,
Email_Body TEXT NULL
)
The bulk insert statement has the following:
codepage = '1250',
fieldterminator = '<3P4>',
rowterminator = '<3ND>',
datafiletype = 'char'
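Assembled into a complete statement, that would look something like the following; the file path is an assumption:
BULK INSERT myTable
FROM 'D:\data\emails.txt'
WITH
(codepage = '1250',
fieldterminator = '<3P4>',
rowterminator = '<3ND>',
datafiletype = 'char'
)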
The file contains full emails (including CR and LF characters). I would like to import the data and include the CR and LF characters. I have read that BULK INSERT treats each entry as a single row but does that mean it strips out the CR and LF characters? If so, what can I use to import this CSV file? I don't have access to SSIS and I would prefer to use SQL code to do it.
Example data:
11324<3P4>Read this email because it's urgent<3P4>Haha John,
I lied, the email was just to mess with you!
Your Nemesis,
Steve
P.S. I still hate you!
<3ND>
11355<3P4>THIS IS THE LAST STRAW<3P4>Steve,
I have had it with you stupid jokes, this email is going to the manager.
Good day,
John
<3ND>
It should import with the carriage returns and line feeds intact, even if you don't see them in some tools. We would import XSL this way and it would preserve all of the line formatting.
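One way to verify that the characters survived the import is to search for them explicitly, since many client tools render CR LF and LF identically. A quick check against the table from the question (CHAR(13) is CR, CHAR(10) is LF; the CAST is needed because CHARINDEX does not accept the legacy TEXT type directly):
SELECT Email_Id,
       CHARINDEX(CHAR(13), CAST(Email_Body AS varchar(max))) AS first_cr_pos,
       CHARINDEX(CHAR(10), CAST(Email_Body AS varchar(max))) AS first_lf_pos
FROM myTable;
A nonzero first_cr_pos confirms the carriage returns made it into the column.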

How to import csv files

How can I import CSV file data into a SQL Server 2000 table? I need to insert data from a CSV file into a table twice a day. The table has more than 20 fields but I only need to insert values into 6 of them.
I faced the same problem before; I can suggest you start reading here. The author covers: "This is very common request recently – How to import CSV file into SQL Server? How to load CSV file into SQL Server Database Table? How to load comma delimited file into SQL Server? Let us see the solution in quick steps."
I need to insert data from CSV file to table twice a day.
Use DTS to perform the import, then schedule it.
For SQL 2000, I would use DTS. You can then schedule this as a job when you're happy with it.
Below is a good Microsoft link explaining how to use it.
Data Transformation Services (DTS)
You describe two distinct problems:
the CSV import, and
the extraction of data into only those 6 fields.
So break your solution down into two steps:
import the CSV into a raw staging table, and
then insert into your six 'live' fields from that staging table.
There is a T-SQL statement for the first part, called BULK INSERT; the syntax looks like this:
BULK INSERT target_staging_table_in_database
FROM 'C:\Path_to\CSV_file.csv'
WITH
(
DATAFILETYPE = 'CHAR'
,FIRSTROW = 2
,FIELDTERMINATOR = ','
,ROWTERMINATOR = '\n'
);
Adjust to taste, and consult the docs for more options. You might also want to TRUNCATE or DELETE FROM your staging table before doing the bulk insert so you don't have any old data in there.
Once you get the information into the database, doing an UPDATE or INSERT into those six fields should be straightforward.
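Step two is then a plain INSERT ... SELECT from the staging table; the live table and column names below are illustrative:
-- pick just the 6 fields you need out of the 20+ staged columns
INSERT INTO live_table (field1, field2, field3, field4, field5, field6)
SELECT staged1, staged2, staged3, staged4, staged5, staged6
FROM target_staging_table_in_database;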
You can make use of SQL Server Integration Services (SSIS). It's just a one-time task to create the package; from then on, just run that package.
You can also try BULK INSERT as Daniel explained.
You can also try the Import/Export wizard in SQL Server 2000.
