What type of data can be considered the state in a state window - TDengine

About the state window in the TDengine DOC, here are two questions:
1. Does "字符串" (string) include NCHAR or BINARY?
2. If "字符串" includes NCHAR, is a variable with the string value "1.1" valid in a state window?
In my TDengine SQL, the value_str value is "1.0".
The TDengine version is 2.6.0.32.

We suggest you upgrade TDengine to 3.0.1.8; it supports a more complete SQL syntax.

Summary:
"字符串" (string) includes both NCHAR and BINARY.
A variable with the string value "1.1" is valid in a state window.
In fact, any value works: a state window simply groups consecutive rows that share the same value in the state column.
Upgrade TDengine to 3.0.1.8, and the state window will work.
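For example, assuming a hypothetical table d1001 with an NCHAR column value_str, a state-window query in 3.x could look like the sketch below; consecutive rows sharing the same value_str ("1.0", "1.1", ...) fall into the same window:

-- d1001 and value_str are made-up names for illustration.
SELECT _wstart, _wend, COUNT(*), FIRST(value_str)
FROM d1001
STATE_WINDOW(value_str);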
Thanks to Yu Chen.

Related

Add TDengine as a data source in IDEA

I added TDengine as an IDEA data source, but I found some problems.
1. When I execute a query, IDEA keeps running it even after I get the result. With MySQL, by comparison, IDEA terminates execution as soon as the result is returned.
2. Also compared with MySQL, I can only see information about databases and tables; there is no information about the columns.
The exception behind the second problem is:
<failed to load>
java.sql.SQLException: ERROR (2315): unknown taos type in tdengine
at com.taosdata.jdbc.TSDBError.createSQLException(TSDBError.java:69)
at com.taosdata.jdbc.TSDBError.createSQLException(TSDBError.java:56)
at com.taosdata.jdbc.TSDBConstants.taosType2JdbcType(TSDBConstants.java:131)
at com.taosdata.jdbc.TSDBResultSetMetaData.getColumnType(TSDBResultSetMetaData.java:151)
in RemoteResultSetMetaDataImpl.getFixedColumnType(RemoteResultSetMetaDataImpl.java:105)
Use taos-jdbcdriver 2.0.34 (published on Maven Central as com.taosdata.jdbc:taos-jdbcdriver) to solve this problem.

Is there a way to find out details of a data type error in Snowflake?

I am pretty new to the Snowflake Cloud offering and was just trying to load a simple .csv file from an AWS S3 staging area into a table in Snowflake using the COPY command.
Here is the command I used:
copy into "database name"."schema"."table name"
from @S3_ACCESS
file_format = (format_name = format name);
When I run the above command, I get the following error: Numeric value '63' is not recognized
Please see the attached image. I am not sure what this error means, and I cannot find any lead in the Snowflake UI itself as to what could be wrong with the value.
Thanks in Advance!
The error says it was expecting a numeric value but got '63', and that value cannot be converted to a numeric value.
From the image you shared, I can see that there are some weird characters around the 6 and the 3. There could be an issue with the file encoding, or the data is corrupted.
Please check the ENCODING option for the file format:
https://docs.snowflake.com/en/sql-reference/sql/create-file-format.html#format-type-options-formattypeoptions
By the way, I recommend you always use UTF-8.
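As a sketch (my_csv_format and the CSV options are assumptions, not taken from the question), you could recreate the file format with an explicit encoding, and also use COPY's validation mode to list the failing rows before actually loading:

-- my_csv_format is a hypothetical name; adjust the options to your file.
CREATE OR REPLACE FILE FORMAT my_csv_format
  TYPE = 'CSV'
  ENCODING = 'UTF8'
  SKIP_HEADER = 1;

-- Dry run: reports conversion errors without loading any data.
COPY INTO "database name"."schema"."table name"
FROM @S3_ACCESS
FILE_FORMAT = (FORMAT_NAME = my_csv_format)
VALIDATION_MODE = RETURN_ERRORS;

The RETURN_ERRORS output includes the file, row number, column name, and the offending character, which is exactly the detail the question asks for.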

Value being loaded as NULL to SQL Server when it is zero, but working fine for non-zero values, via Informatica PowerCenter 10.1

I have come across a scenario where a field is loaded as NULL when the actual value in the file is '0', but the correct value is loaded if the file has a non-zero value. I have used the debugger and read the session log; everywhere the value shows as '0', but in the table it is loaded as NULL. Is this a known issue? Can anyone please help me overcome this discrepancy?
Did you try writing the output to a file and checking it there? Please give details of the transformation.
In addition, Informatica has a special property which you can set for when the value is 0 or when the value is null; check that session property too.
Check whether anything else is present when the value is 0.
Also, what is the datatype of the column you are trying to populate? Does that column have any constraint that stops it from accepting "0"? Check that too.
I agree (as @buzyjess says), writing the output to a text file will make debugging easy. So please let us know what it looks like when you output to a file.

Burmese language is shown as "boxes" in SQL Server 2012

I have observed that Burmese text is shown as "boxes" at record level in SQL Server 2012. Both fields shown in the screenshot are of nvarchar type with more than the required length. Is this expected? If so, why?
If you are storing it in nvarchar, then it is OK.
You can test this by copying and pasting one row's data into Google Translate with Burmese selected as the source language; if you see the text rendered in Burmese characters, then the data is OK.
The boxes are related to the editor: you have to install a correct Burmese font, such as Zawgyi or Myanmar-1.
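If you want to check inside SQL Server itself, a quick sanity test (my_table and name_col are placeholder names) is to inspect the stored code points; values in the Myanmar Unicode block, U+1000 to U+109F (4096 to 4255 decimal), mean the data is stored correctly and only the display font is missing:

-- my_table / name_col are hypothetical; point this at your own table.
SELECT name_col,
       UNICODE(SUBSTRING(name_col, 1, 1)) AS first_code_point
FROM my_table;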

Connecting to SQL Server with CL-SQL via unixODBC/FreeTDS

I've managed to connect from SBCL running on Debian to a SQL Server 2000 instance over the network using FreeTDS/unixODBC.
I can actually get data back from the server, so everything is working.
However, many of the columns trigger what seem to be unsupported data types, e.g.:
The value 2147483647 is not of type FIXNUM.
and
-11 fell through ECASE expression.
Wanted one of (-7 -6 -2 -3 -4 93 92 91 11 10 ...).
Does anyone with experience using CLSQL with SQL Server know how to help out?
This error (the one with 2147483647) occurs because the FreeTDS driver doesn't handle OLEDB BLOBs very well.
You have to issue the following SQL command to make it work:
set textsize 102400
You can see the FreeTDS FAQ entry here. Excerpt:
The text data type is different from char and varchar types. The maximum data length of a text column is governed by the textsize connection option. Microsoft claims in their documentation to use a default textsize of 4000 characters, but in fact their implementation is inconsistent. Sometimes text columns are returned with a size of 4 GB!
The best solution is to make sure you set the textsize option to a reasonable value when establishing a connection.
As for the ECASE expression error, I haven't really solved it, but I have worked around it by converting the timestamp columns to a binary value and the uniqueidentifier columns to a varchar(36).
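A sketch of that workaround in T-SQL (my_table and the column names are invented for illustration):

-- timestamp/rowversion is 8 bytes; uniqueidentifier renders as 36 chars.
SET TEXTSIZE 102400;
SELECT CAST(ts_col AS binary(8)) AS ts_bin,
       CAST(guid_col AS varchar(36)) AS guid_str
FROM my_table;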
