I have a CSV file that I need to import using CSVREAD. The issue is that its date-time values are in a different format, so I need to parse them. Can someone give me an example of how I am supposed to do it?
I tried: merge into MESSAGE (MESG_DATE_FROM,MESG_DATE_TO,MESG_DISPLAY_SEQ,MESG_TIME_DELAY,MESG_ID,REASONTYPE_MAJOR) SELECT * FROM CSVREAD('MESSAGE_0.csv');
These queries are generated programmatically, so they cannot be handwritten. The problem is that some columns are of datetime type and use a different datetime format from the one H2 expects. At the moment of parsing I have no precise way of determining which columns will be datetime, so I cannot easily add a PARSEDATETIME, and the CSV file doesn't contain any column names or other info, just the values. Like this:
2011-11-18 00.00.00.00,2030-12-31 00.00.00.00,1,20000,1,0,
...
...
An SQL file will be generated to load this CSV into each table, but it seems I need to know whether a column is of TIMESTAMP type in order to add PARSEDATETIME(MESG_DATE_FROM,'yyyy-mm-dd hh.mm.ss.uu') AS MESG_DATE_FROM to the SQL.
In DB2 we could use timestampformat=YYYY-MM-DD HH.MM.SS.UU in the merge query, so this was handled in a default way for all tables. Is there anything similar in H2?
I think what you want is not possible. There is no way to change the "global" timestamp format in H2 (as you can do in DB2).
I think you will need to either
construct the query based on each column's type, using PARSEDATETIME where appropriate (see the sketch after this list), or
modify the CSV file to use the same timestamp format as H2 uses, which is the format defined in the JDBC specification, under SQL escape sequence: yyyy-mm-dd hh:mm:ss[.f...], or
provide a patch for H2 to support the "global" timestamp format.
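For the first option, here is a minimal sketch (untested, and assuming the generating code has already looked up which columns are TIMESTAMPs, e.g. from INFORMATION_SCHEMA.COLUMNS). Note that PARSEDATETIME takes a java.text.SimpleDateFormat pattern, so months are MM, 24-hour hours are HH, and fractional seconds are SS; the second argument to CSVREAD names the columns, since the file has no header line:
MERGE INTO MESSAGE (MESG_DATE_FROM, MESG_DATE_TO, MESG_DISPLAY_SEQ,
                    MESG_TIME_DELAY, MESG_ID, REASONTYPE_MAJOR)
SELECT
    PARSEDATETIME(C1, 'yyyy-MM-dd HH.mm.ss.SS') AS MESG_DATE_FROM,
    PARSEDATETIME(C2, 'yyyy-MM-dd HH.mm.ss.SS') AS MESG_DATE_TO,
    C3, C4, C5, C6
FROM CSVREAD('MESSAGE_0.csv', 'C1,C2,C3,C4,C5,C6');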
Related
So I need a way to import CSVs that vary in column names, column order, and number of columns. They will always be CSV and of course comma-delimited.
Is it possible to generate both FMT and a temp table creation script of a CSV file?
From what I can gather, you need one or the other. For example, you need the table to generate the FMT file using the bcp utility. And you need the FMT file to dynamically build a create script for a table.
Using just SQL to dynamically load text files, there is no quick way to do this. I see one option:
1. Get the data into SQL Server as a single column (bcp it in, or use T-SQL and OPENROWSET to load it, SSIS, etc.). Be sure to include in this table a second column that is an identity (I'll call it "row_nbr"). You will need this to find the first row, to get the column names from the header in the file.
2. Parse the first record ("where row_nbr = 1") to get the header record. You will need a string-parsing function (find one online, or create your own) to substring out each column name.
3. Build a dynamic SQL statement to create a new table with the number of fields you just parsed out (sketched below). You must calculate lengths and use a generic "varchar" data type, since you won't know how to type the data. Use the column names found above.
4. Once you have a table created with the correct number of adequately sized columns, you can create the format file.
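A rough T-SQL sketch of steps 2 and 3 (untested; it assumes a staging table raw_lines(row_nbr, line) as described in step 1, and SQL Server 2022+ for STRING_SPLIT's ordinal argument -- on older versions you would need a splitter function that preserves order):
DECLARE @sql nvarchar(max);

-- parse the header record (row_nbr = 1) and build a CREATE TABLE statement
SELECT @sql = N'CREATE TABLE staging_typed ('
            + STRING_AGG(QUOTENAME(LTRIM(RTRIM(s.value))) + N' varchar(255)', N', ')
                  WITHIN GROUP (ORDER BY s.ordinal)
            + N');'
FROM raw_lines AS r
CROSS APPLY STRING_SPLIT(r.line, ',', 1) AS s  -- 1 enables the ordinal column
WHERE r.row_nbr = 1;

EXEC sp_executesql @sql;  -- create the generic, all-varchar table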
I assumed in my answer that you are comfortable doing all these things; I just shared the logical flow at a high level. I can add more detail if you need it.
When I use T-SQL to convert a datetime into dd.mm.yyyy for a CSV output using SSIS, the file is produced with dd-mm-yyyy hh:mm:ss, which is not what I need.
I am using:
convert(varchar,dbo.[RE-TENANCY].[TNCY-START],104)
which appears correct in SSMS.
What is the best way to handle the conversion for output from SSIS?
It's not as simple as I thought it would be.
It works for me.
Using your query as a framework for driving the package:
SELECT
    CONVERT(char(10), CURRENT_TIMESTAMP, 104) AS DayMonthYearDate
I explicitly declared a length for our dd.mm.yyyy value, and since it's always going to be 10 characters, let's use a data type that reflects that.
Run the query and you can see it correctly produces 13.02.2019.
In SSIS, I added an OLE DB Source to the data flow and pasted in my query.
I wired up a Flat File Destination and ran the package. As expected, the string generated by the query entered the data flow and landed in the output file unchanged.
If you're experiencing otherwise, the first place I'd check is to double-click the line between your source and the next component and choose Metadata. Look at what is reported for the tenancy-start column. If it doesn't indicate DT_STR/DT_WSTR, then SSIS thinks the data type is a date variant and is applying locale-specific rules to the format. You might also need to check how the column is defined in the flat file connection manager.
The most precise control over the output format of a date can be achieved with T-SQL's FORMAT(), available since SQL Server 2012.
It is slightly slower than CONVERT() but gives the desired flexibility.
An example:
SELECT TOP 4
name,
FORMAT(create_date, 'dd.MM.yyyy') AS create_date
FROM sys.databases;
name     create_date
-------  -----------
master   08.04.2003
tempdb   12.02.2019
model    08.04.2003
msdb     30.04.2016
P.S. Take into account that FORMAT() produces NVARCHAR output, which is different from your initial conversion logic, so an extra cast to VARCHAR(10) may be necessary.
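For example, a minimal sketch of that extra cast:
SELECT TOP 4
    name,
    CAST(FORMAT(create_date, 'dd.MM.yyyy') AS varchar(10)) AS create_date
FROM sys.databases;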
Our business would be providing us a .csv file. One of the columns in the file would be in a date format. Now, as we know, there are many date formats in Excel. The problem is that we need to check whether the date provided is a correct date. It could be in any format, like ddmmyyyy, yyyymmdd, dd-mon-yyyy, etc., basically any format that Excel supports.
We are planning to first load the data in a staging area and the date field would be defined as varchar so that it can accept any data.
Now either using SSIS or via T-SQL, I need to check whether the date provided is actually a date and if it is I need to load it into a different table in YYYYMMDD format.
How do I go about doing the above?
Considering you have your Excel data already loaded into a SQL Server table as varchar (you can easily do this using SSIS), something like this would work:
SELECT
    CASE WHEN ISDATE(YOUR_DATE) = 1
         -- style 112 = yyyymmdd; it only applies when converting from a
         -- datetime, so convert the varchar to datetime first
         THEN CONVERT(char(8), CONVERT(datetime, YOUR_DATE), 112)
         ELSE NULL END AS MyDate
FROM
    YOUR_TABLE
I don't have access to a SQL Server instance at the moment and can't test the code above, so you may need to adapt to your needs, but this is the general idea.
You can also do further research on the ISDATE and CONVERT functions in SQL Server. You should be able to achieve what you need by combining them.
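If you are on SQL Server 2012 or later, a sketch of an alternative (table and column names are placeholders, as above) is TRY_CONVERT, which returns NULL for any value that will not convert, instead of relying on ISDATE:
SELECT
    -- style 112 = yyyymmdd; NULL if not a valid date
    CONVERT(char(8), TRY_CONVERT(date, YOUR_DATE), 112) AS MyDate
FROM
    YOUR_TABLE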
I have a CSV file (suppose it's an RFC-compliant CSV with no syntax errors) with a header line containing column names. However, I don't have the intended data type of each column.
What would be a quick-and-not-so-dirty way to guess those column types?
Notes:
Motivation: I want to load the CSV into a DB table, but I have to create the table first.
I'm interested in a shell-script'y solution, but other alternatives might be relevant.
You can use SQL*Loader to load a CSV file in an accurate way; the control file is where you declare how each column should be typed.
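For reference, a minimal SQL*Loader control file sketch (all file, table, and column names here are hypothetical) that skips the header line and declares each column's type explicitly, including the date format:
OPTIONS (SKIP=1)
LOAD DATA
INFILE 'input.csv'
INTO TABLE target_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  id         INTEGER EXTERNAL,
  name       CHAR(100),
  created_at DATE "YYYY-MM-DD HH24:MI:SS"
)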
I have quite a few tables and I'm using SSIS to bring the data from Oracle to SQL Server, in the process I'd like to convert all varchar fields to nvarchar. I know I can use the Data Conversion transformer but it seems the only way to do this is to set each field one by one, then I'll have to manually set the mapping in the destination component to map to the "Copy of" field. I've got thousands of fields and it would be tedious to set it on each one... is there a way to say "if field is DT_STR convert to DT_WSTR"?
What you can do, instead of replacing varchar with nvarchar manually before running the script, is copy and save all the CREATE TABLE scripts generated by SSIS to a document. Then you can do a global replace of varchar with nvarchar in the document.
Then use the amended script as a step in your SSIS package to create the tables before populating them with the data from Oracle.
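For example, with a hypothetical line from a generated script:
-- before the global replace:
CREATE TABLE dbo.SomeTable (col1 varchar(50), col2 varchar(200));
-- after replacing varchar with nvarchar:
CREATE TABLE dbo.SomeTable (col1 nvarchar(50), col2 nvarchar(200));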
The proper way is to use the data conversion step...
That said, it appears that if you disable external metadata validation in SSIS, you can bypass this error. SQL will then use an implicit conversion to the destination type.
See this SO post for a quick explanation.