Extracting elements from a varchar column in snowflake - snowflake-cloud-data-platform

I have sample data, as shown below:
NAME | Values | Typeof(Values)
------------------------------
BNL  | [1,2]  | VARCHAR
As seen, the data type of the second column is VARCHAR.
Expected Output
Name | Values
-------------
BNL  | 1
BNL  | 2
In a way, we want to split the row into two rows depending on how many elements there are in the Values column. I was looking into FLATTEN as an option:
select temp.*,
t.value
from temp,
table(flatten(temp.values)) as t;
But since it is a VARCHAR, we get the following error message:
invalid type [VARCHAR(16777216)] for parameter '1'
Can I please get some help here? Thanks.

One way is to use TRY_PARSE_JSON:
select temp.*,
t.value
from temp,
table(flatten(TRY_PARSE_JSON(temp.values))) as t;
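For example, with inline sample data (a minimal sketch; the column is named vals here to sidestep the reserved word VALUES):
with temp as (
  select 'BNL' as name, '[1,2]' as vals
)
select temp.name,
       t.value
from temp,
     table(flatten(try_parse_json(temp.vals))) as t;
-- returns two rows: (BNL, 1) and (BNL, 2)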

Related

Snowflake float type column out of range error

I got a "Numeric value out of range" error when trying to insert two values into a FLOAT type column.
create or replace table num_test(float_num float); -- create table and column
INSERT INTO num_test (float_num) VALUES (1.0528618730874378E10), (-3.694822225952521E-13);
The error I got shows: "Numeric value '10528618730.874378' is out of range."
But when I try to insert these two values separately, it works fine.
INSERT INTO num_test (float_num) VALUES (1.0528618730874378E10); -- ok
INSERT INTO num_test (float_num) VALUES (-3.694822225952521E-13); -- ok
I couldn't see any out-of-range issues in the Snowflake documentation for the values I tried to insert.
If I had to guess what the problem was, it would be that the type guessed from the first value in the VALUES clause is narrower than the type needed for the second value.
So if we just try to select those values, with no target table involved:
select column1, system$typeof(column1)
from VALUES
(1.0528618730874378E10),
(-3.694822225952521E-13);
this triggers:
Numeric value '10528618730.874378' is out of range
One at a time we get:
COLUMN1 | SYSTEM$TYPEOF(COLUMN1)
--------------------------------
-0.0000000000003695 | NUMBER(29,28)[SB16]

COLUMN1 | SYSTEM$TYPEOF(COLUMN1)
--------------------------------
10,528,618,730.874378 | NUMBER(17,6)[SB8]
So sure enough the two random "numbers" are cast to two different types, and these are deemed "too different". Thus my guess was correct.
What to do about this:
So we inline cast them:
select column1, system$typeof(column1)
from VALUES
(1.0528618730874378E10::double),
(-3.694822225952521E-13::double)
;
we get
COLUMN1 | SYSTEM$TYPEOF(COLUMN1)
--------------------------------
10,528,618,730.8744 | FLOAT[DOUBLE]
-0.0000000000003695 | FLOAT[DOUBLE]
So the values are safe if we first tell the DB what they are and avoid the auto-guessing code.
Thus, in your context:
INSERT INTO num_test (float_num) VALUES
(1.0528618730874378E10::double),
(-3.694822225952521E-13::double)
;
number of rows inserted
-----------------------
2
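To double-check what was stored, a quick follow-up query (a sketch against the num_test table from above):
select float_num, system$typeof(float_num)
from num_test;
Both rows should now report FLOAT[DOUBLE].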

How to replace string column with number 0 if the values in that column is null

Hope this is the right place to ask a question related to the Snowflake database.
I would like to know how to replace a string column value with 0 if it is null.
So far, I tried the NVL function, but that didn't work.
CREATE TABLE EMP(ENAME VARCHAR(10));
INSERT INTO EMP VALUES('JACK');
INSERT INTO EMP VALUES(null);
SELECT NVL(ENAME,0) FROM EMP;
Error: Numeric value 'JACK' is not recognized
Any help would be appreciated.
Thanks,
Nawaz
SQL is strongly typed. The output type of NVL is being inferred to be a NUMBER, so the actual query looks something like
SELECT NVL(ENAME::NUMBER, 0) FROM EMP;
You should decide what type your output should be. If you want strings, then you will need to pass NVL a string, like
SELECT NVL(ENAME, '0') FROM EMP;
If you want integers, you will need to convert the strings to integers safely. For example, if you want non-numeric strings to become NULL and then 0, you can use
SELECT NVL(TRY_TO_NUMBER(ENAME), 0) FROM EMP;
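Putting those together against the sample EMP table above (a minimal sketch; the column aliases are only for illustration):
SELECT ENAME,
       NVL(ENAME, '0')              AS as_string,
       NVL(TRY_TO_NUMBER(ENAME), 0) AS as_number
FROM EMP;
The 'JACK' row comes back as ('JACK', 'JACK', 0) and the NULL row as (NULL, '0', 0).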

How to parse a table with a JSON array field in PostgreSQL into rows?

I have a table that contains a json array. Here is a sample of the contents of the field from:
SELECT json_array FROM table LIMIT 5;
Result:
[{"key1":"value1"}, {"key1":"value2"}, ..., {"key2":"value3"}]
[]
[]
[]{"key1":"value1"}
[]
How can I retrieve all the values and count how many of each value was found?
I am using PostgreSQL 9.5.14, and I have tried the solutions here Querying a JSON array of objects in Postgres
and the ones suggested to me by another generous stackoverflow user in my last question: How can I parse JSON arrays in postgresql?
I tried:
SELECT
value -> 'key1'
FROM
table,
json_array_elements(json_array);
which sadly does not work for me due to receiving the error: cannot call json_array_elements on a scalar
This error happens when using a query that returns more than one row or more than one column as a scalar subquery.
Another solution I tried was:
SELECT json_array as json, (json_array->0),
coalesce(
case
when (json_array->0) IS NULL then null
else (json_array->0->>'key1')
end,
'No value') AS "Value"
FROM table;
which only returned null values for the "Value" column.
Referencing Querying a JSON array of objects in Postgres I attempted to use this solution as well:
WITH json_test (col) AS (
values (json_arrays)
)
SELECT
y.x->'key1' "key1"
FROM json_test jt,
LATERAL (SELECT json_array_elements(jt.col) x) y;
But I would need to be able to fit all the elements of the json_arrays into json_test.
So far I have only attempted to list all the values in all the JSON arrays, but my ideal end result for the query resembles this:
Value | Amount
---------------
value1 | 48
value2 | 112
value3 | 93
value4 | 0
Yet again I am grateful for any help with this, thank you in advance.
step-by-step demo: db<>fiddle
SELECT
each.value,
COUNT(*)
FROM
data,
json_array_elements(json_array) elems, -- 1
json_each_text(elems) each -- 2
GROUP BY each.value -- 3
1. Expand the array into one row for each array element
2. Split the key/value pairs into two columns
3. Group by the new value column and count
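If only a specific key (say key1) should be counted, a variation of the above (a sketch assuming the same data table and json_array column as in the demo):
SELECT
    elems ->> 'key1' AS value,
    COUNT(*)         AS amount
FROM data,
     json_array_elements(json_array) AS elems
WHERE elems ->> 'key1' IS NOT NULL
GROUP BY 1
ORDER BY amount DESC;
Producing a zero-count row like "value4 | 0" would additionally need a list of expected values LEFT JOINed against this result.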

PostgreSQL - counting the elements in the JSON

I have a JSON type column called "log_data" and the data stored in it is in the format [{"key":"test123123","identity":"user#test.it","identity_type":"email"}].
I want to count how many records there are for a given value of a given key in the JSON:
Doesn't work
SELECT count (distinct esas_logs.log_id) AS "count" FROM "esas_logs" WHERE log_data->0->>'identity' = 'user#test.it'
[2016-06-30 13:59:18] [42883] ERROR: operator does not exist: json = unknown
HINT: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Use json_array_length():
test=# select json_array_length('[{"key":"test123123","identity":"user#test.it","identity_type":"email"}]');
json_array_length
-------------------
1
(1 row)
According to the doc, you should use the ? operator.
If your column type is JSON:
SELECT COUNT(esas_logs ? 'log_id') FROM esas_logs WHERE ...
If your column is a TEXT or VARCHAR:
SELECT COUNT(esas_logs::jsonb ? 'log_id') FROM esas_logs WHERE ...
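If the goal is to count matching log entries rather than to test for a key, a sketch (assuming, as in the sample, that log_data holds a one-element array; the ::jsonb cast works whether the column is json or text and keeps the comparison in text, avoiding the json = unknown mismatch):
SELECT COUNT(DISTINCT log_id) AS "count"
FROM esas_logs
WHERE log_data::jsonb -> 0 ->> 'identity' = 'user#test.it';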

'Converting varchar to data type numeric' error after successful conversion to decimal(18,2)

I have a temporary table I'm using for parsing, #rp.
#rp contains an nvarchar(max) column, #rp.col8, which holds positive and negative numbers to two decimal places of precision, e.g. '1234.26'.
I'm able to run the following query and get a set of converted values out:
select * from
(
select CONVERT(decimal(18,2), col8) as PARSEAMT
from #rp
where
--#rp filtering criteria
)q
However, when I try to query for PARSEAMT = 0 in the following manner, I get the standard '8114, Error converting data type varchar to numeric.':
select * from
(
select CONVERT(decimal(18,2),col8) as PARSEAMT
from #rp
where
--#rp filtering criteria
)q
where q.PARSEAMT = 0
Without that where clause, the query runs fine and generates the expected values.
I've also tried other clauses like where q.PARSEAMT = 0.00 and where q.PARSEAMT = convert(decimal(18,2),0).
What am I doing wrong in my comparison?
I was going to suggest you select PARSEAMT into another temp-table/table-variable but I can see you've already done that from your comments.
Out of interest, what does the following yield?
select
col8
from
#rp
where
-- ISNUMERIC returns 1 when the input expression evaluates to a valid
-- numeric data type; otherwise it returns 0. Valid numeric data types
-- include the following:
isnumeric(col8) <> 1
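If that does surface non-numeric rows, one way around error 8114 (assuming SQL Server 2012 or later, where TRY_CONVERT is available) is to let failed conversions become NULL instead of raising an error:
select *
from
(
    -- TRY_CONVERT returns NULL when col8 cannot be converted,
    -- so the outer filter never sees an unconvertible value
    select TRY_CONVERT(decimal(18,2), col8) as PARSEAMT
    from #rp
    -- original #rp filtering criteria would go here
) q
where q.PARSEAMT = 0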
