Add Snowflake calculated geography column - snowflake-cloud-data-platform

I've got a block of data with lat/longs and I'm trying to add a point to a Snowflake table from this data.
First I tried to accomplish it when I created the table with:
create or replace table geo (
best_lat double,
best_lon double,
geography geography as (ST_POINT(best_lon, best_lat)));
This errored out with:
SQL compilation error: error line 4 at position 2 Data type of virtual column does not match the data type of its expression for column 'GEOGRAPHY'. Expected data type 'GEOGRAPHY', found 'VARIANT'
Then I tried to add the column with:
alter table geo
add column geom geography as (select st_makepoint(best_lon, best_lat) from geo)
This errored out with:
SQL compilation error: Invalid virtual column expression [(SELECT ST_MAKEPOINT(GEO.BEST_LON, GEO.BEST_LAT) AS "ST_MAKEPOINT(BEST_LON, BEST_LAT)" FROM GEO AS GEO)]
Clearly I'm doing something wrong here. Can anyone provide some insight?

Snowflake doesn't really support calculated columns; what it does allow is setting a default value for a column, and that default can be a simple SQL expression. The syntax is documented here.
Because this isn't a pure calculated column, you can still insert values directly into the column, which will override the defined default.
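If the goal is simply to persist the point at load time, a plain GEOGRAPHY column populated on insert sidesteps the virtual-column restriction entirely. A sketch (the table and lat/lon column names come from the question; staged_latlongs is a hypothetical source of the raw coordinates):

```sql
create or replace table geo (
    best_lat double,
    best_lon double,
    geom     geography
);

-- build the point explicitly while loading the lat/long data
insert into geo (best_lat, best_lon, geom)
select lat, lon, st_makepoint(lon, lat)
from staged_latlongs;
```

This keeps the point in sync only at load time, so any later update to best_lat/best_lon would need to update geom as well.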

Related

How to Change Hive External Table Data Type from Double to Decimal

I am trying to alter several columns of a Hive external table from double to decimal. I have dropped and recreated the table and run an msck repair statement. However, I am unable to select from the table in either Hive or Impala; both return essentially the same error:
Hive: ERROR processing query/statement. Error Code: 0, SQL state: File 'hdfs://ns-bigdata/user/warehouse/fact/TEST_FACT/key=2458773/000000_0' has an incompatible Parquet schema for column 'testing.fact_table.tot_amt'. Column type: DECIMAL(28,7), Parquet schema:
optional double tot_amt [i:29 d:1 r:0]
Impala: ERROR processing query/statement. Error Code: 0, SQL state: File 'hdfs://ns-bigdata/user/warehouse/fact/TEST_FACT/key=2458773/000000_0' has an incompatible Parquet schema for column 'testing.fact_table.tot_amt'. Column type: DECIMAL(28,7), Parquet schema:
optional double tot_amt [i:29 d:1 r:0]
Is it possible to change the datatype from double to decimal?
Also, what's the difference between dropping and recreating the table versus altering it?
You can use ALTER TABLE as below to convert from double to decimal. Make sure the decimal type can hold all of your double data; this works in both Impala and Hive.
alter table table_name change col col decimal(5,4); -- the column name appears twice: old name, then new name
Alter table - useful when you want to add a new column at the end of a table without wiping out the data. Easier and faster, but with limited capability.
Drop and recreate - useful when you want to restructure columns, change the file format, or convert to a partitioned table.
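Applied to the partitioned table from the question, the change may also need to cascade to existing partition metadata; a sketch using Hive's CASCADE clause (table and column names are taken from the error message):

```sql
-- change the column type on the table and propagate it to all partitions
alter table testing.fact_table
change tot_amt tot_amt decimal(28,7) cascade;
```

Note that this changes only the table metadata; the Parquet files already written still store tot_amt as a double, so the reader must support that schema difference when scanning old files.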

SQL compilation error: invalid identifier for TIMESTAMP_NTZ(9) fields in Snowflake table

I have a table with 2 TIMESTAMP_NTZ(9) fields. When I try to add either of these to a WHERE clause I get a SQL compilation error: invalid identifier. If I name either of these fields in my SELECT statement I also get this error. If I perform a SELECT * the fields are available and return data.
Relatively new to Snowflake, what am I missing here?
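One common cause of exactly this symptom (an assumption here, since no answer is included in this thread) is that the columns were created with double-quoted, mixed-case identifiers. Snowflake folds unquoted names to upper case, so a case-sensitive column must be quoted exactly as it was created:

```sql
-- a column created with a quoted, case-sensitive identifier:
create or replace table t ("createdAt" timestamp_ntz(9));

-- fails with "invalid identifier": unquoted names are folded to CREATEDAT
-- select createdAt from t;

-- works: the identifier is quoted exactly as created
select "createdAt"
from t
where "createdAt" >= '2020-01-01';
```

SELECT * still works in that scenario because it never resolves the name you typed, which matches the behaviour described in the question.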

SQL Server: Error converting data type varchar to numeric (Strange Behaviour)

I'm working on a legacy system using SQL Server in 2000 compatibility mode. There's a stored procedure that selects from a query into a virtual table.
When I run the query, I get the following error:
Error converting data type varchar to numeric
which initially tells me that something stringy is trying to make its way into a numeric column.
To debug, I created the virtual table as a physical table and started eliminating each column.
The culprit is a column called accnum, which stores a bank account number and has a source data type of varchar(21). I'm trying to insert it into a numeric(16,0) column, which obviously could cause issues.
So I made the accnum column varchar(21) as well in the physical table I created and it imports 100%. I also added an additional column called accnum2 and made it numeric(16,0).
After the data was imported, I updated accnum2 to the value of accnum. Lo and behold, it updates without an error, yet the same data wouldn't go in via an insert into...select query.
I have to work with the data types provided. Any ideas how I can get around this?
Can you try using a conversion in your insert statement, like this:
SELECT [accnum] = CASE ISNUMERIC(accnum)
WHEN 0 THEN NULL
ELSE CAST(accnum AS NUMERIC(16, 0))
END
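Wired into the original insert, the guard might look like this (a sketch; the source and target table names are hypothetical, only accnum and accnum2 come from the question):

```sql
-- only cast values that at least pass ISNUMERIC; everything else becomes NULL
insert into dbo.TargetTable (accnum2)
select case isnumeric(accnum)
           when 0 then null
           else cast(accnum as numeric(16, 0))
       end
from dbo.SourceTable;
```

Be aware that ISNUMERIC returns 1 for values such as '.', '$', and scientific notation like '1e5', which can still fail a cast to numeric(16,0), so it is worth spot-checking both the rejected and the accepted rows.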

How to alter Image column on Sybase to NOT NULL

On a Sybase ASE 15.7 database I'm trying to modify a column type Image from NULL to NOT NULL (I'm using SQSH so ; is a valid terminator):
create table LOB_TEST (XML image NULL);
alter table LOB_TEST modify XML image NOT NULL;
Error message:
Msg 13907, Level 16, State 1
Server 'MYSERVER', Line 1
ALTER TABLE 'LOB_TEST' failed. You cannot modify column 'XML' to TEXT/IMAGE/UNITEXT type.
This works on an int type column:
create table NON_LOB_TEST (XML_ID int NULL);
alter table NON_LOB_TEST modify XML_ID int NOT NULL;
(0 rows affected)
Any clues why? I cannot find anything online. Thanks.
Text/image datatypes are stored very differently internally from the other datatypes, so it is not a surprise that operations that work on an INT column do not work on a text/image column.
The documentation is not terribly clear on this point, but it implicitly says that you cannot modify the nullability of a text/image column: http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.infocenter.dc36272.1600/doc/html/san1393050903443.html
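Since the column cannot be modified in place, one possible workaround is to rebuild the table with the desired nullability and copy the data across. An untested sketch (table names follow the question; verify sp_rename behaviour on your ASE version first):

```sql
-- recreate the table with the image column declared NOT NULL
create table LOB_TEST_NEW (XML image not null);

-- copy across only the rows that satisfy the new constraint
insert into LOB_TEST_NEW
select XML from LOB_TEST where XML is not null;

-- swap the tables
drop table LOB_TEST;
exec sp_rename LOB_TEST_NEW, LOB_TEST;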

Date Conversion Issue MS Access to SQL Server

I'm creating a table B from an existing table A. In table A I have a column ValDate which is varchar and contains dates. When I try to create table B, the query uses the function given below and I get a conversion error. Table A contains null values as well.
MS Access:
((DateDiff("d",Date(),format(Replace(Replace([Table A].ValDate,".","/"),"00/00/0000","00:00:00"),"dd/mm/yyyy")))>0)).
Tables were in MS Access and are being migrated to SQL Server 2012.
SQL Server:
((DATEDIFF(day,FORMAT( GETDATE(), 'dd-MM-yyyy', 'en-US' ),FORMAT( ValDate, 'dd-MM-yyyy', 'en-US' ))>0))
or
((DateDiff(day,GETDATE(),Format(Replace(Replace([TableA].[ValidFrom],'.','/'),'00/00/0000','00:00:00'),'dd/mm/yyyy')))
I tried converting the date using several approaches such as Convert, Format and Cast, but I end up getting the error below.
Msg 8116, Level 16, State 1, Line 1
Argument data type date is invalid for argument 1 of isdate function.
I would really appreciate someone telling me what I'm missing here.
Since you have date data in a string field, it is very likely that some value is not valid against your expected date format.
Copy the data into a SQL Server table and then check and validate the content of the string field.
Have a look at the TRY_CONVERT function, which is helpful when checking the content of a string field containing date values.
Once the bad data is ruled out, you can apply your formula again with (hopefully) a different result.
A better solution would be to create a separate field with an appropriate datatype to store the date values converted from the string field, and apply your logic to that field.
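The validation step can be done with a single query; a sketch (TableA and ValDate come from the question; style 103 is SQL Server's dd/mm/yyyy format, assumed from the Access expression):

```sql
-- list the values that do not parse as dd/mm/yyyy dates after
-- normalising the '.' separators, mirroring the Access Replace() calls
select ValDate
from TableA
where ValDate is not null
  and try_convert(date, replace(ValDate, '.', '/'), 103) is null;
```

Any rows this returns are the values breaking the DATEDIFF expression; they can be cleaned up or excluded before reapplying the original formula.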