I have failed to load a text file's data into my database table, with the persistent error
invalid time format.
I have changed the time format to include a T between the date and time, to no avail. I also changed the year/month/day delimiter from . to -, but the error persisted.
This is how I am attempting to load from a txt file
bcp mydb.dbo.mytable in c:\data.txt -T -S myserver\instance
I then proceed to confirm the data types of the fields, the prefix length (this is the length of the delimiter before the field... right?), and the field terminator. I create a format file, but using it still yields the same error. The datetime field I am importing is one of four fields in a typical row.
What am I missing?
EDIT
Here is a typical row
14,1999-01-04T08:08:24.000,1.36000,1.36000
I have also failed when using SSIS, and my prior workaround of first going through Access is not applicable here. It has to be either bcp or SSIS.
I believe you are using SQL Server 2008 (R2)? If so, please add the appropriate tag to your question.
This problem has been reported here at dba.stackexchange.
The following hints were given:
You need to use a generated format file and use -f for both import and export.
Ensure your datetimes are formatted like yyyy-mm-dd hh:mm:ss, i.e. with a space instead of a T between the date and time.
Note that I read here the claim that this is a bug in the BCP that ships with SQL Server 2008 (R2). Apparently this was solved in SQL Server 2012+.
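If it helps, here is a minimal sketch of the format-file approach from the first hint, reusing the server and paths from the question (the format-file path is a placeholder):
bcp mydb.dbo.mytable format nul -c -f c:\mytable.fmt -T -S myserver\instance
bcp mydb.dbo.mytable in c:\data.txt -f c:\mytable.fmt -T -S myserver\instance
The first command generates a character-mode format file from the table definition; the second uses it for the import. The generated file can be edited afterwards if the field terminators or column order need adjusting.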
Related
When I use T-SQL to convert a datetime into dd.mm.yyyy for a CSV output using SSIS, the file is produced with dd-mm-yyyy hh:mm:ss, which is not what I need.
I am using:
convert(varchar,dbo.[RE-TENANCY].[TNCY-START],104)
which appears correct in SSMS.
Which is the best way to handle the conversion to be output from SSIS?
Not as simple as I thought it would be.
It works for me.
Using your query as a framework for driving the package
SELECT
CONVERT(char(10),CURRENT_TIMESTAMP,104) AS DayMonthYearDate
I explicitly declared a length for the dd.mm.yyyy value, and since it's always going to be 10 characters, I used a data type that reflects that.
Running the query, you can see it correctly produces 13.02.2019
In SSIS, I added an OLE DB Source to the data flow and pasted in my query
I wired up a flat file destination and ran the package. The string generated by the query entered the data flow and landed in the output file as expected.
If you're experiencing otherwise, the first place I'd check is to double-click the line between your source and the next component and choose Metadata. Look at what is reported for the tenancy start column. If it doesn't indicate DT_STR/DT_WSTR, then SSIS thinks the data type is a date variant and is applying locale-specific rules to the format. You might also need to check how the column is defined in the flat file connection manager.
The most precise control on output format of the date can be achieved by T-SQL FORMAT(). It is available since SQL Server 2012.
It is slightly slower than CONVERT() but gives the desired flexibility.
An example:
SELECT TOP 4
name,
FORMAT(create_date, 'dd.MM.yyyy') AS create_date
FROM sys.databases;
name     create_date
-------- -----------
master   08.04.2003
tempdb   12.02.2019
model    08.04.2003
msdb     30.04.2016
P.S. Take into account that FORMAT() produces NVARCHAR output, which is different from your initial conversion logic, so an extra cast to VARCHAR(10) will perhaps be necessary.
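For example, a hedged sketch of the same query with that cast added:
SELECT TOP 4
    name,
    CAST(FORMAT(create_date, 'dd.MM.yyyy') AS varchar(10)) AS create_date
FROM sys.databases;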
Our business would be providing us a .csv file. One of the columns in the file would be in a date format. Now, as we know, there are many date formats in Excel. The problem is that we need to check whether the date provided is a correct date. It could be in any format that Excel supports, like ddmmyyyy, yyyymmdd, dd-mon-yyyy, etc.
We are planning to first load the data into a staging area, where the date field would be defined as varchar so that it can accept any data.
Now either using SSIS or via T-SQL, I need to check whether the date provided is actually a date and if it is I need to load it into a different table in YYYYMMDD format.
How do I go about doing the above?
Considering you have your Excel data already loaded into a SQL Server table as varchar (you can easily do this using SSIS), something like this would work:
SELECT
    CASE WHEN ISDATE(YOUR_DATE) = 1
         THEN CONVERT(char(8), CONVERT(datetime, YOUR_DATE), 112)  -- style 112 = YYYYMMDD
         ELSE NULL
    END AS MyDate
FROM
    YOUR_TABLE
I don't have access to a SQL Server instance at the moment and can't test the code above, so you may need to adapt to your needs, but this is the general idea.
You can also do further research on the ISDATE and CONVERT functions in SQL Server. You should be able to achieve what you need by combining them.
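As a minimal sketch of the full staging-to-target load, with hypothetical table and column names (note that ISDATE's answer depends on the session's language/DATEFORMAT settings, so set those to match the incoming data):
SET DATEFORMAT dmy;  -- assumption: the file uses day-first dates; adjust to match
INSERT INTO dbo.FinalTable (DateYYYYMMDD)
SELECT CONVERT(char(8), CONVERT(datetime, RawDate), 112)
FROM dbo.StagingTable
WHERE ISDATE(RawDate) = 1;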
Using SQL Server 2014 SSIS to import a vendor-supplied Excel file through the Excel Source Data Flow. There are two issues I'm having related to data conversion to the SQL table.
In the file is a text column that has prices (numeric values) in it, and I can't get it to transform into a numeric field (decimal(8,2)) in SQL. I have used the Data Conversion data flow task, converting it to DT_NUMERIC, and it fails to process the field. I have also tried letting it pass through the Data Conversion task and converting it through a Derived Column, casting the field to numeric. Both fail, and I'm at a loss as to how to get this into the database in a decimal/numeric format.
In the same file are three date fields with dates that look like 07/18/2015 in Excel. I have tried similarly with the Data Conversion and Derived Column to get the dates into the database as SQL date types. I have cast the dates as DT_DBDATE and DT_DBTIMESTAMP and neither has worked. I have also tried taking the month, day and year and rearranging them into the SQL date format with SUBSTRING/LEFT/RIGHT functions to split the string, also to no avail.
Here is what I tried:
Excel Source ---> Data Conversion ---> Derived Column ---> OLE DB Destination
The Excel source recognized the date as text; I leave that alone in the Data Conversion and deal with it in the Derived Column, where I have tried:
a. (DT_DBDATE)("20" + RIGHT(TRIM(sale_start),2) + "-" + LEFT(TRIM(sale_start),2) + "-" + SUBSTRING(TRIM(sale_start),4,2)) - I have done this with and without the TRIM, with the same results. I have also used RIGHT(sale_start,4).
b. (DT_DBDATE) sale_start
The SQL table column is data type DATE. I have also changed it to DATETIME and used DT_DBTIMESTAMP in place of DT_DBDATE above.
I can't change the file; it needs to be processed into the database the way it comes from the vendor. Looking at the data in Excel, there seems to be no reason it wouldn't be OK.
Any direction on bringing this data in would be much appreciated.
For issue 2 (the dates): I was able to figure this out, although I don't completely understand what the connection setting is doing. Similarly, with an XML file this connection setting wasn't necessary, although I believe some version of a derived column like the one above was needed in my XML import.
For the EXCEL Solution:
1) In the Excel file connection, I added IMEX=1 to the end of the connection string under Properties, so the connection string looked like this:
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\SSIS\Test.xls;Extended Properties="EXCEL 8.0;HDR=YES;IMEX=1";
2) Used the following script in the derived column:
ISNULL([Copy of expected_date]) ? NULL(DT_DATE) : (LEN(TRIM([Copy of expected_date])) == 0 ? NULL(DT_DATE) : (DT_DATE)((DT_DBDATE)TRIM([Copy of expected_date])))
Thanks for taking the time to respond.
I am connecting to an Oracle DB and the connection works, but I get the following error for some of the columns:
Description: Column "RESOURCE_NAME" cannot convert between unicode
and non-unicode string data types.
Value for RESOURCE_NAME:
For Oracle: VARCHAR2(200 BYTE)
For SQL Server: VARCHAR(200)
I can connect to the Oracle DB via Oracle SQL Developer without any issues. Also, I have the SSIS package setting Run64BitRuntime = False.
The Oracle data type VARCHAR2 appears to be equivalent to NVARCHAR in SQL Server, or DT_WSTR in SSIS. Reference
You will have to convert using the Data Conversion Transformation, or CAST or CONVERT functions in SQL Server.
If the package works on one machine and doesn't on another, try setting NLS_LANG to the right language, territory, and character set, and test the package.
[Command Prompt]> set NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
The easiest way around this is to open the SSIS package in Notepad (the .dtsx file) and do a global find-and-replace of all instances of validateExternalMetadata="True" with validateExternalMetadata="False".
Note: we encountered this issue when connecting to an Oracle 11g database on Linux through SSIS.
On the OLE DB source, open the Advanced Editor, go to Input/Output Columns > Output Columns, select the RESOURCE_NAME column, and change its data type to DT_WSTR; you can also change the length as required.
You can use a SQL command in SSIS and use CONVERT or CAST. If SSIS still gives you an error, it's because of the metadata. Here is how you can fix it:
Open the Advanced Editor.
Under the Input and Output Properties, expand Source Output.
Expand Output Columns.
Select the column which is causing the issue.
Go to Data Type Properties and change the DataType to your desired type: DT_STR, DT_TEXT, etc.
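For example, a hedged sketch of the CAST approach in the source's SQL command when reading from SQL Server (the table name is a placeholder; an Oracle source would need Oracle's own CAST syntax):
SELECT CAST(RESOURCE_NAME AS nvarchar(200)) AS RESOURCE_NAME
FROM dbo.MyResources;  -- placeholder table
Casting to nvarchar makes the column arrive in SSIS as DT_WSTR; cast to varchar instead if the destination column is non-Unicode (DT_STR).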
You can just double-click on the "Data Conversion" block in the Data Flow and for every item change it to: "Unicode String [DT_WSTR]"
Works
If everything above failed: create a table variable and insert the data into it, then select all records from it as the source. Use SET NOCOUNT ON in the script.
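A minimal sketch of that workaround, with placeholder names throughout:
SET NOCOUNT ON;  -- keep row-count messages from interfering with the result set
DECLARE @resources TABLE (RESOURCE_NAME varchar(200));
INSERT INTO @resources (RESOURCE_NAME)
SELECT CAST(RESOURCE_NAME AS varchar(200))
FROM dbo.SourceTable;  -- placeholder source
SELECT RESOURCE_NAME FROM @resources;
Because the table variable declares the column type explicitly, SSIS picks up unambiguous metadata from the final SELECT.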
I encountered a very similar problem even using SQL Server rather than Oracle. In my case, I was using a Flat File as a data source, so I just went into the Flat File Connection Manager and manually changed the column type to be a Unicode string.
I don't know if this would fix your problem or not, but it helped me - hopefully someone else will be helped too. (I was inspired to try that by this previous answer to this question BTW, just to give credit where credit's due).
I've spent so many hours just trying to import CSV and Excel data into SQL Server 2003 using SSMS (2012). I've tried importing as Excel, as CSV, and as a text file, and all options have presented problems of their own.
The biggest frustration now is when importing a CSV under the Flat File option. In the Advanced tab I've set my source date column to [DT_DBTIMESTAMP], which matches my destination date column's type [DT_DBTIMESTAMP], yet despite all this, when I run my import, SQL Server errors out and says:
• Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data conversion for column returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
(SQL Server Import and Export Wizard).
How can this fail if BOTH columns are exactly the same type?
Thanks in advance for the help.
It might not be that SQL Server is complaining about losing data when importing, but when reading. For instance, the DT_DBTIMESTAMP type expects the format yyyy-mm-dd hh:mm:ss[.fff]. If your data is not formatted correctly in any dimension (say your mm > 12 or dd > 31), that may be the problem. Like @Pondlife suggested, I'd bring everything into a varchar(max) field and then run T-SQL to see if all rows can convert to the DT_DBTIMESTAMP data type; a simple CONVERT statement will show which rows fail.
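A minimal sketch of that check, assuming a hypothetical staging table dbo.Staging with the raw text in a RawDate column (TRY_CONVERT needs SQL Server 2012+; on older versions ISDATE is the rough equivalent):
-- Rows that cannot be read as datetime:
SELECT RawDate
FROM dbo.Staging
WHERE RawDate IS NOT NULL
  AND TRY_CONVERT(datetime, RawDate) IS NULL;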
I saw many problems with the SQL Server Import Wizard (I'm on version 2008 R2) reading date fields from CSV files when the format is Latin, i.e. the day comes before the month, like 25/12/2013 (Christmas). It looks like it always assumes MM/DD/YYYY, and there is no clear way to tell it to read DD/MM/YYYY (or I did not find one). If I solve it, I will post. Thanks.
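One possible T-SQL-side workaround, sketched under the assumption that the dates are first staged as varchar: SET DATEFORMAT changes how the session parses date strings.
SET DATEFORMAT dmy;
SELECT CONVERT(datetime, '25/12/2013');  -- now parsed as 25 December 2013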