I have a CSV file that I'm trying to import using SQL Server Management Studio.
In Excel, the column giving me trouble contains amounts formatted like XX,XXX.XX.
Tasks > Import Data > Flat File Source > select file
I set the data type for this column to DT_NUMERIC and adjust DataScale to 2 in order to get 2 decimal places, but when I click over to Preview, I can see that it's clearly not recognizing the numbers appropriately.
The column mapping for this column is set to type = decimal; precision 18; scale 2.
Error message: Data Flow Task 1: Data conversion failed. The data conversion for column "Amount" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
(SQL Server Import and Export Wizard)
Can someone identify where I'm going wrong here? Thanks!
I believe I figured it out... the CSV Amount column was formatted such that the numbers still contained commas as thousands separators. I adjusted XX,XXX.XX to XXXXX.XX and it seems to have worked.
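If cleaning the file itself isn't convenient, another option is to land the column as a string and strip the separators on the SQL side. A minimal T-SQL sketch, assuming a hypothetical staging table dbo.AmountStaging with Amount imported as varchar(50):

-- Remove the thousands separators, then convert to the target decimal(18,2) type
SELECT CONVERT(decimal(18, 2), REPLACE(Amount, ',', '')) AS Amount
FROM dbo.AmountStaging;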
I'm loading data from an Excel file (.xlsx) into a SQL table using an SSIS package. One column is coming through in scientific notation; it already displays that way in the Excel file, but its actual value is not being loaded into the SQL table. I tried multiple options with derived columns, expressions, etc., but I couldn't get the proper value.
This column contains a mix of numeric and nvarchar values. Below is an example of the column.
ApplicationNumber
1.43E+15
923576663
25388447
TXY020732087
18794588
TXAP0000140343
**Actual Values -**
ApplicationNumber
1425600000000000
923576663
25388447
TXY020732087
18794588
TXAP0000140343
There is no issue with the data coming from the business into Excel. But how can we handle this scenario in SSIS?
I also tried (DT_I8)ApplicationNumber==(DT_I8)ApplicationNumber, but it gives the following for the value above:
1.43E+15 -> 1.430000000000000 and not the 1425600000000000
One thing you can do is set the output column, in the Advanced Editor of the Excel source, to decimal with a large precision, 20 digits for example.
UPDATE
To also handle strings in the same column, you may need to redirect the error output, as those values will throw a conversion error. In the Advanced Editor, this involves both the default output and the error output.
Then you can update your database from both the default and the error output.
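For example, if the default output lands in one staging table (as decimal) and the redirected error output lands in another (as the raw text), a T-SQL step along these lines could merge them back into the destination column; all table and column names here are made up for illustration:

-- Numeric values come from the default output, already widened to decimal(20,0),
-- so converting them to text yields the full digits rather than 1.43E+15.
INSERT INTO dbo.Applications (ApplicationNumber)
SELECT CONVERT(nvarchar(50), CONVERT(decimal(20, 0), ApplicationNumber))
FROM dbo.Staging_DefaultOutput
UNION ALL
-- Text values such as TXY020732087 come from the redirected error output.
SELECT ApplicationNumber
FROM dbo.Staging_ErrorOutput;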
I faced this problem recently using SSIS too.
1- Change the column type in Excel to "Number"
2- Remove the decimal positions.
3- Upload the file using SSIS
I am trying to simply import a .tsv file (200 columns, 400,000 rows) into SQL Server.
I get this error all the time (always with a different column):
Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data conversion for column "Column 93" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
Even though I explicitly set the column sizes up front, I found myself going back and changing the OutputColumnWidth (500 in this case) whenever a column failed.
Is there a way to change all of the OutputColumnWidth values to something like 'max' at once? I have 200 columns; I can't wait for it to fail and then go back and change it for each failed column... (I do not care about performance; any data type is the same to me.)
You could try opening the code view of your SSIS package and doing a Ctrl-H replace of all "50" with "500". If you have 50s that you don't want changed to 500, look at the code and make the replacement more context-specific.
I have a very simple (but big) CSV file and I want to import it into my database in Microsoft SQL Server 2014 (Database/Tasks/Import Data). But I receive the following error:
The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data".
Here is a sample of my CSV file (containing ~9 million rows):
1393013,297884,'20150414 15:46:25'
1393010,301242,'20150414 15:46:58'
Ideally, my first and second columns are bigint and the third is datetime. In the wizard, I choose 'unsigned 8 byte integer' for the first two and 'timestamp' for the third, and I receive the error. Even if I use string as the data type for all three columns, I still receive the same error.
I also tried using the bcp command from the command line. It reports no errors and inserts nothing! Using the "bulk insert" command instead gives me this error:
the column is too long! verify your terminators
But they are specified correctly!
I appreciate any idea you have as a solution to this simple-looking problem.
You are trying to change the input types: unsigned 8 byte integer is a setting on the source.
You don't need to change the source settings at all. 'string [DT_STR]' and the default length of 50 will work.
'timestamp' is a binary type. I believe the type you are after is datetime, but that is set on the destination, not the source. The source is still a string regardless.
You still will not be able to import your date value as a datetime data type.
This would work though (added dashes) -> 2015-04-14 15:46:25. Import what you have as string and fix it after import unless you can get your text file changed.
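As a sketch of that "fix it after import" step, assuming the three columns were landed as strings into a hypothetical staging table dbo.ImportStaging, the quotes can be stripped and the dashes added in T-SQL:

-- Raw value looks like '20150414 15:46:25' (single quotes included in the file).
-- Strip the quotes, insert the dashes, then convert to datetime.
SELECT CONVERT(datetime,
       STUFF(STUFF(REPLACE(RawDate, '''', ''), 5, 0, '-'), 8, 0, '-')) AS EventDate
FROM dbo.ImportStaging;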
I'm using the Import/Export Wizard to import some data to a table. Total of 2 rows, so I've just been working around this, but I would like to know the answer.
The issue with the Import/Export Wizard is the dates. No matter what I do, they fail. The date looks pretty straightforward to me: 2009-12-05 11:40:00. I also tried: 2010-03-01 12:00 PM. I tried DT_DATE and DT_DBTIMESTAMP as the source data type. The target column type is datetime.
The message that I get is:
The data conversion for column "Start_Date" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
How do I fix this? Why's the Import/Export Wizard so bad at parsing dates (or is that in my imagination)?
The truly obnoxious thing here is that when you select a date column from a table and save it as a CSV you get a date like '2009-12-05 11:40 AM'. So the import wizard isn't even capable of parsing dates that come from SQL Server. Really? Really?
Added details (realized my description wasn't correct after revisiting the package I had issues with):
The import thing IS pretty bad.
In my case I had incoming data in a form matching SQL Server style 126 / ISO 8601. That is, in T-SQL, this form:
select convert ( varchar(100), getdate(), 126 )
--> 2009-12-22T16:29:22.123
I was able to import with SSIS using two steps:
Replace the "T" with a space " ", using SSIS Derived Column with expression:
REPLACE(DateColumn,"T"," ")
Cast the result to database timestamp [DT_DBTIMESTAMP] using the data conversion transform
Apologies if I caused any confusion.
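For what it's worth, if the column is landed as a string and fixed on the database side instead, T-SQL will convert the style 126 / ISO 8601 form directly, so the "T" does not need to be replaced there. A small sketch with a made-up staging table and column:

-- '2009-12-22T16:29:22.123' converts as-is; 126 is the ISO 8601 style.
SELECT CONVERT(datetime, StartDateText, 126) AS Start_Date
FROM dbo.DateStaging;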
I have created a very simple Data Flow in SSIS that is run inside a loop.
IMAGE 1 http://img407.imageshack.us/img407/1553/step1f.jpg
I have a simple OLE DB Source control which is connecting to a SQL Server and running quite a complex query to split daily data by 30 minute intervals as shown below.
IMAGE 2 http://img168.imageshack.us/img168/857/step2vs.jpg
I then have a Flat File Destination control which takes the output from the OLE DB Source control and saves it as a comma-delimited CSV file. As you can see above, the numbers are decimals to two decimal places, but in the CSV file below they show up as ones and zeros.
IMAGE 3 http://img341.imageshack.us/img341/5494/step3w.jpg
What can I do to get the CSV output to match the query results? I have tried converting the numbers to varchar in the query, but I got the same result. I also tried changing the column types in the Connection Manager, but got the same result.
I managed to resolve this issue by changing the DataType property for each column of data I was importing. I had to change them to 'double-precision float [DT_R8]', and then it saved the CSV with the proper decimal values.
Very annoying, I hope that helps someone.
IMAGE 4 http://img687.imageshack.us/img687/3749/step4dp.jpg