All of the examples I see dealing with JSON arrays have the array nested under a top-level object. I have a JSON array in a column:
[{"key": "value1"}, {"key": "value2"}]
I would like to run a SQL script to add/update a key for each element in the array, resulting in:
[{"key": "value1", "otherKey": "otherValue"}, {"key": "value", "otherKey": "otherValue"}]
Yes, in my case I want the same value set for each array member. I've tried:
declare @info nvarchar(max)
SET @info = '[{"key": "value1"}, {"key": "value2"}]'
print JSON_MODIFY(@info, '[0].otherKey', '""')
and fails with "JSON path is not properly formatted. Unexpected character '[' is found at position 0."
This is in MSSQL 2017.
The approach depends on the JSON structure (I assume the number of items in the JSON array is not fixed):
If the input JSON array has items (JSON objects) with a fixed key or keys, you may use a combination of OPENJSON() with an explicit schema (to parse the JSON as a table) and FOR JSON (to modify the rows and return them as JSON).
If the input JSON has items with different structures, you may use a combination of OPENJSON() with the default schema, JSON_MODIFY() and STRING_AGG().
JSON:
declare @info nvarchar(max)
SET @info = '[{"key": "value1"}, {"key": "value2"}]'
Statement for fixed structure:
SELECT @info = (
SELECT [key], 'OtherValue' AS OtherKey
FROM OPENJSON(@info) WITH ([key] varchar(100) '$.key')
FOR JSON PATH
)
Statement for variable structure:
SELECT @info = CONCAT('[', STRING_AGG(JSON_MODIFY([value], '$.OtherKey', 'OtherValue'), ','), ']')
FROM OPENJSON(@info)
SELECT @info
Result:
[{"key":"value1","OtherKey":"OtherValue"},{"key":"value2","OtherKey":"OtherValue"}]
Note that the reason for the error is that the value of the path parameter ([0].otherKey) is wrong. The correct path expression for the first item in the JSON array is $[0].otherKey.
Have you tried adding $ before the indexer sign? Like:
declare @info nvarchar(max)
SET @info = '[{"key": "value1"}, {"key": "value2"}]'
SELECT JSON_MODIFY(@info, '$[0].otherKey', '""')
This gives the following output:
[{"key": "value1","otherKey":"\"\""}, {"key": "value2"}]
I have a query like (simplified):
SELECT
JSON_QUERY(r.SerializedData, '$.Values') AS [Values]
FROM
<TABLE> r
WHERE ...
The result is like this:
{ "2019":120, "20191":120, "201902":121, "201903":134, "201904":513 }
How can I remove the entries with a key length less than 6?
Result:
{ "201902":121, "201903":134, "201904":513 }
One possible solution is to parse the JSON and generate it again using string manipulation, keeping only the keys with the desired length:
Table:
CREATE TABLE Data (SerializedData nvarchar(max))
INSERT INTO Data (SerializedData)
VALUES (N'{"Values": { "2019":120, "20191":120, "201902":121, "201903":134, "201904":513 }}')
Statement (for SQL Server 2017+):
UPDATE Data
SET SerializedData = JSON_MODIFY(
SerializedData,
'$.Values',
JSON_QUERY(
(
SELECT CONCAT('{', STRING_AGG(CONCAT('"', [key] ,'":', [value]), ','), '}')
FROM OPENJSON(SerializedData, '$.Values') j
WHERE LEN([key]) >= 6
)
)
)
SELECT JSON_QUERY(d.SerializedData, '$.Values') AS [Values]
FROM Data d
Result:
Values
{"201902":121,"201903":134,"201904":513}
Notes:
It's important to note that JSON_MODIFY() in lax mode deletes the specified key if the new value is NULL and the path points to a JSON object. But in this specific case (a JSON object with variable key names), I prefer the above solution.
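For illustration, a minimal (generic) example of that lax-mode behaviour:
SELECT JSON_MODIFY(N'{"a":1,"b":2}', '$.b', NULL)
-- {"a":1}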
I am receiving quoted string values in an array instead of an array of objects when I use JSON_ARRAY_APPEND() to insert a string that represents an object.
I need a way to force the value inserted into the array to be the object instead of its string representation.
Server:
10.2.18-MariaDB-log
MariaDB Server
Linux x86_64
Here is a sample I am trying to get work:
set @NewArrayItem = '{"item2": "value2"}';
SELECT JSON_ARRAY_APPEND('{"SomeData": "SomeValue", "AnArray": [{"item1": "value1"}]}', '$.AnArray', @NewArrayItem ) as outval;
The second element in the array ($.AnArray[1]) is a string instead of an object.
I am expecting:
{"SomeData": "SomeValue", "AnArray": [{"item1": "value1"}, {"item2": "value2"}]}
But I actually get:
{"SomeData": "SomeValue", "AnArray": [{"item1": "value1"}, "{\"item2\": \"value2\"}"]}
I see that the following works, but my constraint is that the value @NewArrayItem is a properly formatted JSON string from another application:
SELECT JSON_ARRAY_APPEND('{"SomeData": "SomeValue", "AnArray": [{"item1": "value1"}]}', '$.AnArray', JSON_OBJECT('item2','value2') ) as outval;
I solved this with a combination of JSON_SET, JSON_MERGE, and JSON_QUERY:
set @ExistingData = '{"SomeData": "SomeValue", "AnArray": [{"item1": "value1"}]}';
set @NewArrayItem = '{"item2": "value2"}';
SELECT JSON_SET(@ExistingData, '$.AnArray', JSON_MERGE(ifnull(JSON_QUERY(@ExistingData, '$.AnArray'),'[]'),@NewArrayItem) ) as outval;
As a bonus, it also works for the case where the array does not exist:
set @ExistingData = '{"SomeData": "SomeValue"}';
set @NewArrayItem = '{"item2": "value2"}';
SELECT JSON_SET(@ExistingData, '$.AnArray', JSON_MERGE(ifnull(JSON_QUERY(@ExistingData, '$.AnArray'),'[]'),@NewArrayItem) ) as outval;
Still looking for a simpler answer.
Try this example:
SET @json = '[]';
SELECT @json; -- []
SET @json = JSON_ARRAY_APPEND(@json, '$', JSON_OBJECT("id", 1, "name", "Month1"));
SELECT @json; -- [{"id": 1, "name": "Month1"}]
SET @json = JSON_ARRAY_APPEND(@json, '$', JSON_OBJECT("id", 2, "name", "Month2"));
SELECT @json; -- [{"id": 1, "name": "Month1"}, {"id": 2, "name": "Month2"}]
I have the below array from an HTTP request, in JSON format:
[
{
"Code":"856956645",
"Type":"Colet",
"MeasuredWeight":0.0,
"VolumetricWeight":0.0,
"ConfirmationName":null,
"Observation":" 100 DE SFATURI OASELE",
"ResponseCode":null,
"Event":
[
{
"Date":"2018-11-16T16:22:29.397",
"EventId":73,
"Description":"Ridicare din comanda client",
"LocalityName":"BUCURESTI"
},
{
"Date":"2018-11-17T08:55:06.14",
"EventId":5,
"Description":"Spre destinatar ",
"LocalityName":"BUCURESTI"
}
]
}
]
How can I extract the value of the Description element within the second set of values? I tried with OPENJSON but couldn't get it to work:
SELECT *
FROM OPENJSON(@json)
WITH (
Description nvarchar(100) '$.Event.Description'
);
Try nesting instead. Not sure why your attempt didn't work; I've not had much use for OPENJSON as yet apart from when playing around, however, this works:
SELECT J.Code, J.[Type], E.[Description]
FROM OPENJSON(@json)
WITH (Code bigint '$.Code',
[Type] varchar(10) '$.Type',
[Event] nvarchar(MAX) AS JSON) J
CROSS APPLY OPENJSON([Event])
WITH ([Description] varchar(100) '$.Description',
EventID int '$.EventId') E
WHERE E.EventID = 5;
Edit: Worked out why your attempt wasn't working. The JSON you have has a nested JSON array in the Event node; the events are not simply properties, like in the documentation's second example here. The entities are wrapped in further brackets ([]), not just braces ({}), which is why you have to parse the next layer again as a separate JSON object.
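As an aside, if you only need that single value and the positions are known up front, a path-based sketch also works, since the outer array is addressed with $[0] and the second event with Event[1] (assuming the JSON is in @json):
SELECT JSON_VALUE(@json, '$[0].Event[1].Description') AS [Description];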
I have a table with a JSON text field:
create table breaches(breach_id int, detail text);
insert into breaches values
( 1,'[{"breachedState": null},
{"breachedState": "PROCESS_APPLICATION",}]')
I'm trying to use MSSQL's built-in JSON parsing functions to test whether ANY object in a JSON array has a matching member value.
If the detail field was a single JSON object, I could use:
select * from breaches
where JSON_VALUE(detail,'$.breachedState') = 'PROCESS_APPLICATION'
but it's an Array, and I want to know if ANY Object has breachedState = 'PROCESS_APPLICATION'
Is this possible using MSSQL's JSON functions?
You can use the OPENJSON function to check each object; try this query:
select * from breaches
where exists
(
select *
from
OPENJSON (detail) d
where JSON_VALUE(value,'$.breachedState') = 'PROCESS_APPLICATION'
)
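If you prefer named columns, the same check can also be written with an explicit schema (just a sketch; the query above is enough on its own):
select * from breaches
where exists
(
select 1
from OPENJSON(detail)
with (breachedState varchar(50) '$.breachedState')
where breachedState = 'PROCESS_APPLICATION'
)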
Btw, there is an extra "," in your insert query; it should be:
insert into breaches values
( 1,'[{"breachedState": null},
{"breachedState": "PROCESS_APPLICATION"}]')
I'm trying to build a JSON object that contains an array, using SQL Server 2016.
The source data for the array is itself JSON, so I'm using the JSON_QUERY inside a select statement, with the FOR JSON clause applied to the select statement.
Everything works beautifully until I wrap the JSON_QUERY clause in a CASE statement (in certain cases the array must not be included, i.e. must be null).
The following code illustrates the problem:
declare @projects nvarchar(max) = '{"projects": [23439658267415,166584258534050]}'
declare @id bigint = 123
SELECT
[data.array1] = JSON_QUERY(@projects, '$.projects') -- returns an array - perfect.
, [data.array2] = CASE WHEN 1 is NOT NULL
THEN JSON_QUERY(@projects, '$.projects')
ELSE NULL END -- returns an array - still good!
, [data.array3] = CASE WHEN @id is NOT NULL
THEN JSON_QUERY(@projects, '$.projects')
ELSE NULL END -- why do I end up with a string in the JSON when I do this?
FOR JSON PATH, without_array_wrapper
This code returns the following JSON:
{
"data":{
"array1": [23439658267415,166584258534050],
"array2": [23439658267415,166584258534050],
"array3":"[23439658267415,166584258534050]"
}
}
The problem is that the third 'array' is returned as a string object.
I would expect it to return the following JSON:
{
"data":{
"array1": [23439658267415,166584258534050],
"array2": [23439658267415,166584258534050],
"array3": [23439658267415,166584258534050]
}
}
If I remove the FOR JSON PATH... clause, all columns returned by the query are identical (i.e. all three nvarchar values returned by the JSON_QUERY function are identical).
Why is this happening, and how do I make it output an array in the final JSON?
Wrap the result of the CASE expression in a call to JSON_QUERY:
, [data.array3] = JSON_QUERY(
CASE WHEN @id is NOT NULL
THEN JSON_QUERY(@projects, '$.projects')
ELSE NULL END
)
According to the documentation, JSON_QUERY "Extracts an object or an array from a JSON string". Further down it says "Returns a JSON fragment of type nvarchar(max)". A bit confusing.
Doing a FOR JSON on a string value will give you a string value in the returned JSON, and when you do it on a JSON object you get the JSON object inlined in the resulting string.
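A minimal way to see this difference (hypothetical literal values, not part of the original query):
SELECT '[1,2,3]' AS arr FOR JSON PATH
-- [{"arr":"[1,2,3]"}]   (a plain string is quoted and escaped)
SELECT JSON_QUERY('[1,2,3]') AS arr FOR JSON PATH
-- [{"arr":[1,2,3]}]     (JSON_QUERY marks it as JSON, so it is inlined)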
You can look at CASE as a function call with a return type automatically determined for you by looking at what values you are returning from the CASE. And since JSON_QUERY returns a string, the CASE will return a string, and the returned value will be a string value in the JSON.
The case statement in the query plan looks like this.
<ScalarOperator ScalarString="CASE WHEN [#id] IS NOT NULL THEN json_query([#projects],N'$.projects') ELSE NULL END">
When you wrap the case in a call to JSON_QUERY it looks like this instead.
<ScalarOperator ScalarString="json_query(CASE WHEN [#id] IS NOT NULL THEN json_query([#projects],N'$.projects') ELSE NULL END)">
<Intrinsic FunctionName="json_query">
By some kind of internal magic SQL Server recognizes this as a JSON object instead of a string and inserts it into the resulting JSON string as a JSON value instead.
CASE WHEN 1 is NOT NULL works because SQL Server is smart enough to see that the CASE expression will always be true, so it is optimized away.