SQL Server: How to remove a key from a JSON object

I have a query like (simplified):
SELECT
JSON_QUERY(r.SerializedData, '$.Values') AS [Values]
FROM
<TABLE> r
WHERE ...
The result is like this:
{ "2019":120, "20191":120, "201902":121, "201903":134, "201904":513 }
How can I remove the entries whose key length is less than 6?
Result:
{ "201902":121, "201903":134, "201904":513 }

One possible solution is to parse the JSON and rebuild it with string manipulation, keeping only the keys of the desired length:
Table:
CREATE TABLE Data (SerializedData nvarchar(max))
INSERT INTO Data (SerializedData)
VALUES (N'{"Values": { "2019":120, "20191":120, "201902":121, "201903":134, "201904":513 }}')
Statement (for SQL Server 2017+):
UPDATE Data
SET SerializedData = JSON_MODIFY(
    SerializedData,
    '$.Values',
    JSON_QUERY(
        (
            SELECT CONCAT('{', STRING_AGG(CONCAT('"', [key], '":', [value]), ','), '}')
            FROM OPENJSON(SerializedData, '$.Values') j
            WHERE LEN([key]) >= 6
        )
    )
)
SELECT JSON_QUERY(d.SerializedData, '$.Values') AS [Values]
FROM Data d
Result:
Values
{"201902":121,"201903":134,"201904":513}
Notes:
It's important to note that JSON_MODIFY() in lax mode deletes the specified key when the new value is NULL and the path points to a key in a JSON object. But in this specific case (a JSON object with variable key names), I prefer the solution above.
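For a single, known key, that lax-mode deletion looks like this (a minimal sketch; the key is quoted inside the JSON path because it starts with a digit):
-- lax mode is the default: passing NULL as the new value removes the "2019" pair
DECLARE @j nvarchar(max) = N'{ "2019":120, "201902":121 }';
SELECT JSON_MODIFY(@j, '$."2019"', NULL);  -- expected: { "201902":121 }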

Related

Return Parts of an Array in Postgres

I have a column (text) in my Postgres DB (v.10) with a JSON format.
As far as I know, it has an array format.
Here is a fiddle example: Fiddle
If table1 = Persons and change_type = Create, then I only want to return name and firstname concatenated as one field and clear the rest of the text.
Output should be like this:
id | table1 | did | execution_date | change_type | attr | context_data
1 | Persons | 1 | 2021-01-01 | Create | Name | [["+","name","Leon Bill"]]
1 | Persons | 2 | 2021-01-01 | Update | Firt_name | [["+","cur_nr","12345"],["+","art_cd","1"],["+","name","Leon"],["+","versand_art",null],["+","email",null],["+","firstname","Bill"],["+","code_cd",null]]
1 | Users | 3 | 2021-01-01 | Create | Street | [["+","cur_nr","12345"],["+","art_cd","1"],["+","name","Leon"],["+","versand_art",null],["+","email",null],["+","firstname","Bill"],["+","code_cd",null]]
Disassemble the JSON array into a SETOF using the json_array_elements function, then assemble it back into the structure you want.
select m.*
     , case
         when m.table1 = 'Persons' and m.change_type = 'Create'
         then (
           select '[["+","name",' || to_json(string_agg(a.value->>2, ' ' order by a.value->>1 desc))::text || ']]'
           from json_array_elements(m.context_data::json) a
           where a.value->>1 in ('name','firstname')
         )
         else m.context_data
       end as context_data
from mutations m
modified fiddle
(Notes:
Relying on the alphabetical ordering of the required field names is a little dirty; an explicit ORDER BY CASE would improve readability, as sketched below.
The resulting JSON is assembled from string literals as much as possible, since you didn't specify whether the "+" should be taken from any of the original array elements.
The to_json()::text cast is just for safety against injection.
)
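A sketch of that explicit-order variant (only the string_agg ordering changes; the rest of the query stays as above):
-- order name before firstname explicitly instead of relying on descending alphabetical order
select '[["+","name",'
       || to_json(string_agg(a.value->>2, ' '
            order by case a.value->>1 when 'name' then 1 when 'firstname' then 2 end))::text
       || ']]'
from json_array_elements(m.context_data::json) a
where a.value->>1 in ('name','firstname')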

BigQuery: extract keys from json object, convert json from object to key-value array

I have a table with a column that contains a JSON object; the value type is always a string.
I need 2 kinds of information:
a list of the JSON keys
the JSON converted into an array of key-value pairs
This is what I got so far, which is working:
CREATE TEMP FUNCTION jsonObjectKeys(input STRING)
RETURNS Array<String>
LANGUAGE js AS """
return Object.keys(JSON.parse(input));
""";
CREATE TEMP FUNCTION jsonToKeyValueArray(input STRING)
RETURNS Array<Struct<key String, value String>>
LANGUAGE js AS """
let json = JSON.parse(input);
return Object.keys(json).map(e => {
return { "key" : e, "value" : json[e] }
});
""";
WITH input AS (
SELECT "{\"key1\": \"value1\", \"key2\": \"value2\"}" AS json_column
UNION ALL
SELECT "{\"key1\": \"value1\", \"key3\": \"value3\"}" AS json_column
UNION ALL
SELECT "{\"key5\": \"value5\"}" AS json_column
)
SELECT
json_column,
jsonObjectKeys(json_column) AS keys,
jsonToKeyValueArray(json_column) AS key_value
FROM input
The problem is that a JavaScript UDF is not the best in terms of compute optimization, so I'm trying to understand whether there is a way to achieve these 2 needs (or at least 1 of them) using plain SQL without functions.
Below is for BigQuery Standard SQL
#standardSQL
select
  json_column,
  array(select trim(split(kv, ':')[offset(0)]) from t.kv kv) as keys,
  array(
    select as struct
      trim(split(kv, ':')[offset(0)]) as key,
      trim(split(kv, ':')[offset(1)]) as value
    from t.kv kv
  ) as key_value
from input,
unnest([struct(split(translate(json_column, '{}"', '')) as kv)]) t
Applied to the sample data from your question, this produces the same keys and key_value output as the UDF-based version.
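If the translate/split trick is hard to follow, here is a small illustrative query with a hard-coded JSON string (not part of the original answer) that shows the intermediate steps:
-- translate removes the structural characters { } ", leaving 'key1: value1, key2: value2';
-- split with its default ',' delimiter then yields the individual key:value pairs
select
  translate('{"key1": "value1", "key2": "value2"}', '{}"', '') as stripped,
  split(translate('{"key1": "value1", "key2": "value2"}', '{}"', '')) as kv_pairs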

Postgresql: Append object to a list

I have data where the key is a string and the value is a list of JSON objects; it looks like:
ID|data
A1|{key1:[{k1:v1,k2:v2},{k3:v3,k4:v4}]}
I want to append a JSON object, say {k9:v9,k7:v6}, to this list under key1, something like:
ID|data
A1|{key1:[{k1:v1,k2:v2},{k3:v3,k4:v4},{k9:v9,k7:v6}]}
I have tried jsonb_set and other functions but they were of no use; for example:
UPDATE tbl_name
SET data = jsonb_set(data,'{key1,1}','{k9:v9,k7:v6}'::jsonb) where ID = 'A1'
You need to use the jsonb_insert() function to append that part, after fixing the format of the JSONB value (otherwise you'd get "ERROR: invalid input syntax for type json"):
UPDATE tbl_name
SET data = jsonb_insert(data,'{key1,1}','{"k9":"v9","k7":"v6"}'::jsonb)
WHERE ID = 'A1'
Demo
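If the array can grow beyond two elements and you always want the new object at the very end, a variant (a sketch using the same table and values as above) is to target the last element and pass jsonb_insert's optional insert_after flag:
-- '{key1,-1}' points at the last array element; insert_after = true places the new value
-- after it, i.e. appends it regardless of how many elements the array already has
UPDATE tbl_name
SET data = jsonb_insert(data, '{key1,-1}', '{"k9":"v9","k7":"v6"}'::jsonb, true)
WHERE ID = 'A1'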

Snowflake: OBJECT_CONSTRUCT dropping null values when using the COPY command to produce a JSON output file

I use Snowflake's COPY command, shown below, which produces a JSON file:
copy into @elasticsearch/product/sf_index
from (select object_construct('id', id, 'alpha', alpha) from table limit 1)
file_format = (type = json COMPRESSION = NONE) overwrite = TRUE single = TRUE max_file_size = 5368709120;
The data is:
id | alpha
1  | null
The output file is:
{
"id" :1
}
but I need to keep the null values:
{
"id" : 1,
"alpha" : null
}
You can use the function OBJECT_CONSTRUCT_KEEP_NULL.
Documentation: https://docs.snowflake.com/en/sql-reference/functions/object_construct_keep_null.html
Example:
select OBJECT_CONSTRUCT_KEEP_NULL('id', id, 'alpha', alpha)
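Slotted into the COPY command from the question, that would look roughly like this (same stage path and placeholder table name as above):
copy into @elasticsearch/product/sf_index
from (
    -- object_construct_keep_null keeps pairs whose value is SQL NULL, so "alpha": null survives
    select object_construct_keep_null('id', id, 'alpha', alpha) from table limit 1
)
file_format = (type = json COMPRESSION = NONE) overwrite = TRUE single = TRUE max_file_size = 5368709120;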
Would it be possible for you to check programmatically whether the value is null, and if it is, use the below?
select object_construct('id',1,'alpha',parse_json('null'));
Per the Snowflake documentation:
If the key or value is NULL (i.e. SQL NULL), the key-value pair will be omitted from the resulting object. A key-value pair consisting of a not-null string as key and a JSON NULL as value (i.e. PARSE_JSON(‘NULL’)) will not be omitted.
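A quick side-by-side of the two behaviours described in that quote:
-- SQL NULL drops the pair, JSON null keeps it
select object_construct('alpha', null)               as dropped_pair,  -- {}
       object_construct('alpha', parse_json('null')) as kept_pair;     -- { "alpha": null }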
The other option is to just send it without the null attribute to Elastic, and then take care of it on retrieval from Elastic.
How about this:
select object_construct('id',id, 'alpha', case when alpha is not null then alpha else 'null' end )from table limit 1;
CASE should be supported by the COPY command.
"null" is valid in a JSON document, as per this SO question:
Is null valid JSON (4 bytes, nothing else)
OK, another possible way is this, using UNION:
select object_construct('id',id, 'alpha', parse_json('NULL') )from table where alpha is null
union
select object_construct('id',id, 'alpha', alpha )from table where alpha is not null;
select object_construct('id', id,'alpha', IFNULL(alpha, PARSE_JSON('null'))) from table limit 1
Use IFNULL to check whether the value is null and replace it with a JSON null.

MSSQL JSON_VALUE to match ANY Object in Array

I have a table with a JSON text field:
create table breaches(breach_id int, detail text);
insert into breaches values
( 1,'[{"breachedState": null},
{"breachedState": "PROCESS_APPLICATION",}]')
I'm trying to use MSSQL's built-in JSON parsing functions to test whether ANY object in a JSON array has a matching member value.
If the detail field was a single JSON object, I could use:
select * from breaches
where JSON_VALUE(detail,'$.breachedState') = 'PROCESS_APPLICATION'
but it's an array, and I want to know whether ANY object has breachedState = 'PROCESS_APPLICATION'.
Is this possible using MSSQL's JSON functions?
You can use the OPENJSON function to check each object; try this query:
select * from breaches
where exists
(
    select *
    from OPENJSON(detail) d
    where JSON_VALUE(value, '$.breachedState') = 'PROCESS_APPLICATION'
)
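For completeness, an equivalent sketch that projects breachedState through OPENJSON's WITH clause instead of calling JSON_VALUE per row (the cast is only needed because detail is declared with the legacy text type):
select b.*
from breaches b
where exists
(
    -- the WITH clause maps each array element's breachedState member to a typed column
    select 1
    from OPENJSON(cast(b.detail as nvarchar(max)))
         with (breachedState nvarchar(100) '$.breachedState') d
    where d.breachedState = 'PROCESS_APPLICATION'
)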
By the way, there is an extra "," in your insert query; it should be:
insert into breaches values
( 1,'[{"breachedState": null},
{"breachedState": "PROCESS_APPLICATION"}]')
