Failed to insert data when using TDengine - tdengine

I encountered an error when inserting data into TDengine.
I can see the error report in its log.
ERROR vgId:5, msg:0x7fea18137cb0 failed to put into vnode queue since Invalid Vgroup ID, type:submit qtype:3 contLen:455, gtid:0x11ba042c4fef0043:0x66040e62c4f0003a
I googled this error message but found nothing useful.

Related

Error handling with Dapper Plus BulkInsert

I am using BulkInsert API from Z.Dapper.Plus and it is failing due to bad data and the message is Parameter value '7900000000000000.000000' is out of range.
Is there a way to improve this message so I can readily identify the specific row causing the issue?
There is no way to improve the message in Dapper Plus. That's an error message from SQL Server indicating that your value cannot be stored in the column's data type (pretty similar to this issue: https://stackoverflow.com/a/56349937/5619143)
You already have the value in error, so the easiest way to find the offending row is probably to look at your data source to see which entity has this value. Or add a validation that no value is out of bounds for your data type.
Another way is to set BatchSize = 1 and watch with SQL Profiler for when the error is thrown. The SQL statement will give you all the information about the row in error.
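The pre-validation idea can be sketched like this (a minimal sketch in Python, not Dapper Plus itself; the column names and the decimal(18,6) precision are assumptions for illustration):

```python
from decimal import Decimal

def find_out_of_range(rows, column, precision=18, scale=6):
    """Return (index, value) pairs whose value cannot fit in
    a SQL Server decimal(precision, scale) column."""
    limit = Decimal(10) ** (precision - scale)  # e.g. 10^12 for decimal(18,6)
    return [(i, row[column]) for i, row in enumerate(rows)
            if abs(Decimal(str(row[column]))) >= limit]

rows = [{"amount": 12.5}, {"amount": 7.9e15}, {"amount": -3.0}]
# the second row is flagged as out of range for decimal(18,6)
flagged = find_out_of_range(rows, "amount")
```

Running a pass like this over the data source before calling BulkInsert points you straight at the entity with the bad value, instead of leaving you to decode the SQL Server message.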

Constant "potential loss of data" errors when importing Salesforce Dataloader CSV using SQL Import Tool

I have exported a list of accounts from Salesforce using their Dataloader tool. The output file is a CSV file. I have the table I want it imported into already created. I was using nvarchar(255) for all fields, but after I kept getting truncation errors I changed to nvarchar(max).
I am using the SQL Import Tool, and importing a flat file. I set it up with " for text qualifier, and comma separated. Everything looks good. Then when I go to import I kept getting truncation errors on nearly every field.
I went back and had it suggest type, and had it read the entire file.
I kept getting the same errors.
I went back and changed everything to DT_STR with length 255, and then instead of truncation errors, I get the following:
- Executing (Error)
Messages
Error 0xc02020c5: Data Flow Task 1: Data conversion failed while converting column "BILLINGSTREET" (86) to column "BILLINGSTREET" (636). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
(SQL Server Import and Export Wizard)
Error 0xc0209029: Data Flow Task 1: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Data Conversion 0 - 0.Outputs[Data Conversion Output].Columns[BILLINGSTREET]" failed because error code 0xC020907F occurred, and the error row disposition on "Data Conversion 0 - 0.Outputs[Data Conversion Output].Columns[BILLINGSTREET]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
Error 0xc0047022: Data Flow Task 1: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Data Conversion 0 - 0" (552) failed with error code 0xC0209029 while processing input "Data Conversion Input" (553). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
(SQL Server Import and Export Wizard)
I went back AGAIN and changed everything to Stream Text. It's now working, but it's running slowly. What took less than a minute before will now probably take 2 hours.
FYI, I tried to import the CSV into Excel, but it either cuts off leading zeros or completely screws up the parsing.
What I ended up doing is importing the .csv as a Flat File, not the .xls file.
In the Advanced area I highlighted all of the columns on the right side and selected DT_STR(255).
The few fields I had that were longer than 255 characters I changed to DT_TEXT.
This is a workaround, not the "proper" way to do it, but the proper way just wasn't working due to bad data in the Salesforce export. Once I got the data into the database, I could review it much more easily and identify the bad data.
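Before choosing between DT_STR(255) and DT_TEXT, it helps to know the longest value in each column. A minimal sketch (Python's stdlib csv module, assuming a header row, comma separators, and `"` as the text qualifier, as in the wizard setup above):

```python
import csv

def max_column_lengths(csvfile):
    """Report the longest value seen per column, so you know which
    columns exceed 255 characters and need DT_TEXT instead of DT_STR(255).
    csvfile is any iterable of CSV lines (an open file, for instance)."""
    reader = csv.DictReader(csvfile)  # header row, comma-separated, " qualifier
    lengths = {}
    for row in reader:
        for col, val in row.items():
            lengths[col] = max(lengths.get(col, 0), len(val or ""))
    return lengths
```

Usage, with a hypothetical export file name: `with open("accounts.csv", newline="") as f: over = {c: n for c, n in max_column_lengths(f).items() if n > 255}` lists exactly the columns that will truncate at 255.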

SSIS Destination Insert Error

PACKAGE DESCRIPTION: I use a source from one database, do a lookup to get a surrogate geography key, then another lookup to check whether the customer exists; if not, I insert the row, otherwise I update it...
PROBLEM: I am unable to insert approximately 700,000 rows.
PROBLEM DESCRIPTION: I have looked at this for a long, long time now, using the data viewer and outputting to flat files, and cannot find the cause of my issues.
From the errors below, my research, and checking through SSIS, I have drawn a blank.
Error: 0xC0209029 at DimCustomer, Dw_DimCustomer [2]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Dw_DimCustomer.Inputs[OLE DB Destination Input]" failed because error code 0xC020907B occurred, and the error row disposition on "Dw_DimCustomer.Inputs[OLE DB Destination Input]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at DimCustomer, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Dw_DimCustomer" (2) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (15). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error: 0xC02020C4 at DimCustomer, SQL_Customer [154]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047038 at DimCustomer, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on SQL_Customer returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
WORK DONE :
I've checked all formats match and that no truncation will occur.
Checked all lookups are working correctly.
Dropped my destination database (I'm in a test environment don't worry) and recreated it.
I've checked that all the correct columns are showing the correct data in the correct format between tasks.
I've checked the only error is coming from the final OLE DB destination output (which it is).
I am unsure where to go; as ever, the answer is Stack Overflow.
Any pointers, ideas, or help would be welcomed with open arms.
From the image you attached, I think you are using SSIS 2012 or higher. After searching, there are many issues that can cause this problem.
You can test the following things:
First
If your server operating system is 64-bit, try running your SSIS package in the 64-bit runtime. You can follow this link for more details.
Second
Try replacing the OLE DB Destination with a SQL Server Destination and set the timeout to a higher value (e.g. 500).
Third
On your OLE DB Destination, try unchecking the Check Constraints checkbox (the error may be caused by a constraint).
References
First workaround
SSIS ERROR: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020
Second Workaround
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/8a3558df-221a-45e6-8e08-c23c987c12a9/error-the-attempt-to-add-a-row-to-the-data-flow-task-buffer-failed-with-error-code-0xc0047020?forum=sqlintegrationservices
Additional Info
If my answer didn't solve it, I think this is the main error message to search for (when no other error message is received from your package):
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020
Hope it helps
Go to the DimCustomer properties:
Set DefaultBufferMaxRows to 100. This might increase the execution time of the package, as it will transfer only 100 rows at a time, using less RAM at any instant.
If this doesn't work:
You can go for a For Loop container instead: put the same data flow inside a For Loop that appends 100 rows on each iteration. You can work out the number of loop iterations by dividing the total row count by 100. This will definitely help you!
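The batching idea above is generic: load in fixed-size chunks so a failure narrows the bad data down to one small window. A minimal sketch (Python; `insert_batch` is a placeholder for whatever actually writes to the destination):

```python
def insert_in_batches(rows, insert_batch, batch_size=100):
    """Insert rows in fixed-size batches; report which batch fails so the
    bad rows can be narrowed down to a window of batch_size rows."""
    total = (len(rows) + batch_size - 1) // batch_size  # number of iterations
    for i in range(total):
        batch = rows[i * batch_size:(i + 1) * batch_size]
        try:
            insert_batch(batch)
        except Exception as e:
            raise RuntimeError(f"batch {i + 1} of {total} failed: {e}") from e
```

With ~700,000 rows and a batch size of 100 that's 7,000 iterations, so it's a diagnostic tool rather than a production load pattern.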
This error occurs when data is damaged before it reaches the data flow destination. If the data flow unexpectedly finds null, blank, or empty values, you should recreate your source database from a healthy backup.

SSIS: data types just don't get converted from string

This is my first experience with SSIS and I'm just going nuts: NOTHING WORKS.
(Don't be afraid of the big post: most of it is just error output.)
I've got two MS SQL DBs with the same fields, and I have to transfer from the first one, where everything is nvarchar(32) aka DT_WSTR, into the second one, where the types are different.
Data and its examples:
"Timestamp" is "datetime2(7)" in destination, and source looks like ISO 8601: 2013-12-19T00:00:00.000
"Value" is "real" numbers with scientific notation, test examples are: 17e+10, 17.14, 17.14e+5, 1715E+4, 1714
And four columns with just different ints (bigint, bigint, tinyint, int).
Now for what I've tried (warning for lots of quotations):
Derived Column. I used casts like "(DT_DBTIMESTAMP2,7)time" and "(DT_R4)value". Perhaps I'm using the wrong types, but I strongly doubt it: I googled a lot and most articles (like this one) say I'm right.
Error: 0xC0049064 at Import from ODS to DWH, Derived Column [2]: An error occurred while attempting to perform a type cast.
Error: 0xC0209029 at Import from ODS to DWH, Derived Column [2]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Derived Column" failed because error code 0xC0049064 occurred, and the error row disposition on "Derived Column.Outputs[Derived Column Output].Columns[timestamp]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at Import from ODS to DWH, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Derived Column" (2) failed with error code 0xC0209029 while processing input "Derived Column Input" (3). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error: 0xC02020C4 at Import from ODS to DWH, OLE DB Source [62]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047038 at Import from ODS to DWH, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on OLE DB Source returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Explicitly changing the types in the source (so they match the destination) and connecting directly to the destination.
Error: 0xC020901C at Direct, OLE DB Source [32]: There was an error with OLE DB Source.Outputs[OLE DB Source Output].Columns[time] on OLE DB Source.Outputs[OLE DB Source Output]. The column status returned was: "The value could not be converted because of a potential loss of data.".
Error: 0xC0209029 at Direct, OLE DB Source [32]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "OLE DB Source.Outputs[OLE DB Source Output].Columns[time]" failed because error code 0xC0209072 occurred, and the error row disposition on "OLE DB Source.Outputs[OLE DB Source Output].Columns[time]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047038 at Direct, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on OLE DB Source returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Data Conversion. Same result: failure.
Error: 0xC02020C5 at Conversion, Data Conversion [2]: Data conversion failed while converting column "time" (74) to column "Copy of time" (11). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
Error: 0xC0209029 at Conversion, Data Conversion [2]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Data Conversion.Outputs[Data Conversion Output].Columns[Copy of time]" failed because error code 0xC020907F occurred, and the error row disposition on "Data Conversion.Outputs[Data Conversion Output].Columns[Copy of time]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at Conversion, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Data Conversion" (2) failed with error code 0xC0209029 while processing input "Data Conversion Input" (3). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error: 0xC02020C4 at Conversion, OLE DB Source [62]: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
Error: 0xC0047038 at Conversion, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on OLE DB Source returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
So now I've just run out of tools and ideas for how to do this: most instructions just say something like "map this to that and it will work". I've struggled with this so hard and for so long that I even created a Stack Exchange account for it. Any help appreciated. Cheers.
This message
The value could not be converted because of a potential loss of data.
tells you pretty much everything you need to know. Your problem isn't with the types you're using, it's with the data in your source.
The most likely cause is that at least one row of data in your source ODS has values in a format that cannot be recast into (for example) a DT_DBTIMESTAMP.
My recommended approach would be to cast all your source columns explicitly in your SQL source statement.
So instead of
select time, value from source
do
select cast(time as datetime) as time, cast(value as real) as value from source
In fact, I would run this query against your source table to make sure all the values can be correctly cast into the final types. (My guess is they can't, or you wouldn't be getting casting errors.)
Another thing you could do is change your task components from "Fail" on error to "Redirect error rows" and push the error rows into a file destination so you can see which rows are getting kicked out by the transformation component.
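Outside SSIS, the same "find the uncastable rows" check can be sketched quickly (Python, using the formats from the question above: ISO 8601 timestamps like 2013-12-19T00:00:00.000 and real numbers with scientific notation):

```python
from datetime import datetime

def bad_rows(rows):
    """Flag source rows whose string fields won't survive the cast:
    'time' must parse as ISO 8601 with milliseconds, 'value' as a float."""
    bad = []
    for i, row in enumerate(rows):
        try:
            datetime.strptime(row["time"], "%Y-%m-%dT%H:%M:%S.%f")
            float(row["value"])
        except ValueError:
            bad.append((i, row))
    return bad
```

Redirecting error rows in SSIS gives you the same information, but a script like this lets you inspect the offenders before the package ever runs.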
New answer to an old question, but I just pulled all of my hair out trying to solve this error. Multiple error paths, data conversion, derived columns, etc. were not working. Two things solved this error:
First, I noticed that columns with null values were causing errors when dealing with Currency (DT_CY).
I finally solved this by checking a little box, "Retain null values from the source as null values in the data flow", within the Flat File Source node. (Why is this not selected by default!?)
Second, I was having a hard time converting dates to strings; an easy solution is to do an UPDATE statement within an Execute SQL Task in the control flow.
For example: your database column is varchar or nvarchar, and your source has data coming in as mm/dd/yyyy. No matter how many times you wrestle with the above error, converting this to a string fails.
I solved this by loading the data (you can use a temp table if you'd like) and then converting this column using a SQL Query. I personally used:
update MY_TABLE_NAME
set [MY_DATE_COLUMN] = convert(varchar(8),CONVERT (datetime,[MY_DATE_COLUMN] ), 112)
where (FILTER_STATEMENT_TO_UPDATE_ONLY_WHAT_YOU_JUST_LOADED)
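For reference, style 112 in that CONVERT call means yyyymmdd. The same transformation can be sketched outside SQL (Python; the mm/dd/yyyy input format is the one from the example above):

```python
from datetime import datetime

def to_style_112(us_date):
    """Mimic CONVERT(varchar(8), CONVERT(datetime, col), 112):
    turn an mm/dd/yyyy string into a yyyymmdd string."""
    return datetime.strptime(us_date, "%m/%d/%Y").strftime("%Y%m%d")
```

Useful for spot-checking a few source values before deciding the UPDATE statement is safe to run.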
Hopefully this can assist others who stumble upon this post.
(DT_R8)(REPLACE(Knowledgeable,"\"","") == "" ? NULL(DT_R8) : (DT_R8)REPLACE(Knowledgeable,"\"",""))
Try the expression above.
What's happening is that you are trying to convert a value that contains double quotes, and that is why you get the error. Replacing the double quote with an empty string solves the problem.
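The logic of that SSIS expression, in plainer form (a Python sketch of the same idea, not the expression language itself):

```python
def to_float_or_none(raw):
    """Mirror the SSIS expression above: strip embedded double quotes,
    map an empty result to None (NULL), otherwise cast to float (DT_R8)."""
    cleaned = raw.replace('"', "")
    return None if cleaned == "" else float(cleaned)
```

So a quoted value like `"17.14"` converts cleanly, and a value that is nothing but quotes becomes NULL instead of failing the cast.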

Combining Multiple Tables Into 1 In SQL Server

I have about 40 tables' worth of data that I need to combine into one large table in SQL Server. The data is currently in text files. I tried combining them all into an Access DB and then uploading to SQL Server that way, but the data types, nvarchar(255), are far too large and I need them smaller. I cannot edit data types once the table is uploaded, so I need to create a new table and then load the data into it one file at a time. I cannot figure out the process for importing data into an already-created table, though. Any help would be greatly appreciated.
I tried the regular way of importing but I keep getting the following error messages
Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data conversion for column ""Description"" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".
Error 0xc020902a: Data Flow Task 1: The "output column ""Description"" (26)" failed because truncation occurred, and the truncation row disposition on "output column ""Description"" (26)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
Error 0xc0202092: Data Flow Task 1: An error occurred while processing file "C:\Users\vzv7kqm\Documents\Queries & Reports\UPSU Usage\UpTo1999.CSV" on data row 9104.
Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "Source - UpTo1999_CSV" (1) returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Why not export the data from MS Access to MS SQL Server? nvarchar(255) just means the value is variable length, up to 255 characters; since it stores Unicode, it uses 2 bytes per character plus 2 bytes of overhead. If you don't need Unicode, why not use varchar(255)?
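The storage arithmetic behind that suggestion (rough per-value sizes for SQL Server variable-length strings; row-level overheads are ignored in this sketch):

```python
def nvarchar_storage_bytes(n_chars):
    """Approximate size of an nvarchar value: 2 bytes per character
    (UCS-2) plus 2 bytes of length overhead."""
    return 2 * n_chars + 2

def varchar_storage_bytes(n_chars):
    """varchar: 1 byte per character plus 2 bytes of length overhead."""
    return n_chars + 2
```

So a full 255-character value costs roughly 512 bytes as nvarchar versus 257 bytes as varchar, which is the whole argument for dropping the N when the data is plain ASCII.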
