I have this code in my VIEW:
SELECT json_object_agg(code,value) FROM table1
It generates this in my postgREST api:
[{"json_object_agg":"{code1: value1, code2: value2, ...}"]
I want to remove the json_object_agg to make it like this:
[{code1: value1, code2: value2, ...}]
How do I do that?
You can't have an empty result set column name. I assume this is related to your previous question, where you were getting:
t=# select json_object_agg(code,value) from tt;
json_object_agg
----------------------------------------------------
{ "ALLOW_MAC_ADDR" : "1", "USER_ALIAS" : "James" }
So if you try to give the column an empty name (the closest I can get to your requirement), you get this error:
t=# select json_object_agg(code,value) "" from tt;
ERROR: zero-length delimited identifier at or near """"
LINE 1: select json_object_agg(code,value) "" from tt;
^
Of course you can monkey-hack it with a space character, like:
t=# select json_object_agg(code,value) " " from tt;
----------------------------------------------------
{ "ALLOW_MAC_ADDR" : "1", "USER_ALIAS" : "James" }
which is ALMOST no name. To do it properly, though, I assume you would have to modify your "postgREST api" side so that it ignores the result set attribute name.
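If a predictable (rather than absent) key is acceptable, one hedged option is to alias the aggregate inside the view so the client knows which key to unwrap. The view and column names below (table1_json, data) are my assumptions, not anything PostgREST requires:
CREATE OR REPLACE VIEW table1_json AS
  SELECT json_object_agg(code, value) AS data   -- "data" is an assumed alias
  FROM table1;
-- PostgREST would then presumably return [{"data": {"code1": "value1", ...}}]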
I have a stored procedure that accepts a JSON string as input parameter. The input JSON string is like this:
[
{
"name":"Jhon",
"surname":"Smith",
"skills":["C#","VB.NET"]
},
{
"name":"Robert",
"surname":"Jhonson",
"skills":["T-SQL","Pascal"]
}
]
How can I add a unique GUID property to each principal object automatically?
Looking at your example data, you already discovered this page of the documentation, which tells you how to insert values with the json_modify() function. The examples on that page are written for a single "principal object".
If I interpret this correctly, your sample has 2 principal objects. Another page of the documentation shows how you can split that sample into rows with the openjson() function. You can then apply json_modify() from the first documentation page to each row.
declare @var nvarchar(max) =
'[
{
"name":"Jhon",
"surname":"Smith",
"skills":["C#","VB.NET"]
},
{
"name":"Robert",
"surname":"Jhonson",
"skills":["T-SQL","Pascal"]
}
]';
select row_number() over(order by (select null)) as ObjNumber,
json_modify(j.value, '$.guid', convert(nvarchar(100), newid())) as ObjValue
from openjson(@var, '$') j
The result looks like this:
ObjNumber ObjValue
----------- ----------------------------------------------------
1 {
"name":"Jhon",
"surname":"Smith",
"skills":["C#","VB.NET"]
,"guid":"154C5581-588C-41AA-B292-BB6459F8F4DC"}
2 {
"name":"Robert",
"surname":"Jhonson",
"skills":["T-SQL","Pascal"]
,"guid":"46ACFDD6-58DE-4DB0-8D7A-9B1CCABFF8D8"}
Fiddle
To add the rows back together, just add for json path. This does, however, require a field alias (here MyObjects) that ends up in the output.
select json_modify(j.value, '$.guid', convert(nvarchar(100), newid())) as MyObjects
from openjson(@var, '$') j
for json path;
Output:
[{"MyObjects":{
"name":"Jhon",
"surname":"Smith",
"skills":["C#","VB.NET"]
,"guid":"FCED4D30-B2B0-460B-97FA-EDA820039572"}},{"MyObjects":{
"name":"Robert",
"surname":"Jhonson",
"skills":["T-SQL","Pascal"]
,"guid":"9FF02A70-0455-4E5C-8C11-27BB2688929D"}}]
Fiddle
To update the variable, use the following code. As a bonus, replace() removes the previously added field alias.
set @var = replace(
( select json_modify(j.value, '$.guid', convert(nvarchar(100), newid())) as MyObjects
from openjson(@var, '$') j
for json path ),
'"MyObjects":', '');
Final output for select @var:
[{{
"name":"Jhon",
"surname":"Smith",
"skills":["C#","VB.NET"]
,"guid":"66CB37D3-FAEF-4186-94D8-8AC0CF6EB1AC"}},{{
"name":"Robert",
"surname":"Jhonson",
"skills":["T-SQL","Pascal"]
,"guid":"564D6904-D981-40AC-BA9C-8B06015ACE50"}}]
Fiddle
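If the doubled braces in that final output are a problem, then on SQL Server 2017 or later you could presumably rebuild the array yourself with string_agg() and avoid the field alias altogether. This is an untested sketch along the lines of the queries above:
select '[' + string_agg(
         json_modify(j.value, '$.guid', convert(nvarchar(100), newid())),
         ',') + ']'
from openjson(@var, '$') j;
-- should yield [{...,"guid":"..."},{...,"guid":"..."}] without any wrapper key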
I am receiving quoted string values in an array instead of an array of objects when I use JSON_ARRAY_APPEND() to insert a string that represents an object.
I need a way to force the value inserted into the array to be the object instead of its string representation.
Server:
10.2.18-MariaDB-log
MariaDB Server
Linux x86_64
Here is a sample I am trying to get to work:
set @NewArrayItem = '{"item2": "value2"}';
SELECT JSON_ARRAY_APPEND('{"SomeData": "SomeValue", "AnArray": [{"item1": "value1"}]}', '$.AnArray', @NewArrayItem ) as outval;
The second element in the array ($.AnArray[1]) is a string instead of an object.
I am expecting:
{"SomeData": "SomeValue", "AnArray": [{"item1": "value1"}, {"item2": "value2"}]}
But I actually get:
{"SomeData": "SomeValue", "AnArray": [{"item1": "value1"}, "{\"item2\": \"value2\"}"]}
I see that the following works, but my constraint is that the value of @NewArrayItem is a properly formatted JSON string coming from another application:
SELECT JSON_ARRAY_APPEND('{"SomeData": "SomeValue", "AnArray": [{"item1": "value1"}]}', '$.AnArray', JSON_OBJECT('item2','value2') ) as outval;
I solved this with a combination of JSON_SET, JSON_MERGE, and JSON_QUERY:
set @ExistingData = '{"SomeData": "SomeValue", "AnArray": [{"item1": "value1"}]}';
set @NewArrayItem = '{"item2": "value2"}';
SELECT JSON_SET(@ExistingData, '$.AnArray', JSON_MERGE(ifnull(JSON_QUERY(@ExistingData, '$.AnArray'),'[]'),@NewArrayItem) ) as outval;
As a bonus, it also works for the case where the array does not exist:
set @ExistingData = '{"SomeData": "SomeValue"}';
set @NewArrayItem = '{"item2": "value2"}';
SELECT JSON_SET(@ExistingData, '$.AnArray', JSON_MERGE(ifnull(JSON_QUERY(@ExistingData, '$.AnArray'),'[]'),@NewArrayItem) ) as outval;
Still looking for a simpler answer.
I tried this example:
SET @json = '[]';
SELECT @json; -- []
SET @json = JSON_ARRAY_APPEND(@json, '$', JSON_OBJECT("id", 1, "name", "Month1"));
SELECT @json; -- [{"id": 1, "name": "Month1"}]
SET @json = JSON_ARRAY_APPEND(@json, '$', JSON_OBJECT("id", 2, "name", "Month2"));
SELECT @json; -- [{"id": 1, "name": "Month1"}, {"id": 2, "name": "Month2"}]
So here is my code block
val cols = df.columns
val w = cols(0)
val query1 = s"select $cols(0), square(age) as age, age as age2, first_name, last_name from test"
val query2 = s"select $w, square(age) as age, age as age2, first_name, last_name from test"
Query 2 works just fine; query 1 throws the following error:
no viable alternative at input 'select ['(line 1, pos 7)
== SQL ==
select [Ljava.lang.String;@7988d54(0), square(age) as age, age as age2, first_name, last_name from test
Is there any way to accomplish query1 without creating a temp variable? I want to access the indexed element of cols directly.
You can do it by nesting an expression within the string you're interpolating:
s"select ${col(0)} and then..."
s"select ${df.columns(0)} and then..."
You use ${expr} to embed any expr that is valid code. The single-variable form $foo is actually shorthand for ${foo}.
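To see why the first form breaks, here is a tiny standalone sketch (plain Scala, not the Spark session from the question): with $cols only the array itself is interpolated, so the following (0) stays literal text, which is exactly where the [Ljava.lang.String;@... in the error comes from.
val cols = Array("id", "age")                 // stand-in for df.columns
println(s"select $cols(0) from test")         // prints: select [Ljava.lang.String;@<hash>(0) from test
println(s"select ${cols(0)} from test")       // prints: select id from test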
I have a table with a string column containing values like this: 'ID: 1, name: john doe, occupation: salesmen'. I want to convert this into a column of JSON objects like this: {"ID" : "1", "name" : "john doe", "occupation" : "salesmen"}
For now my solution is:
WITH
lv1 as(SELECT regexp_split_to_table('ID: 1, name: john doe, occupation: salesmen', ', ') record)
, lv2 as (SELECT regexp_split_to_array(record, ': ') arr from lv1)
SELECT
json_object(
array_agg(arr[1])
, array_agg(arr[2])
)
FROM lv2
The problem is that the string actually contains nearly 100 key-value pairs and the table has millions of rows, so using regexp_split_to_table will make this table explode. Is there any efficient way to do this in PostgreSQL?
You don't necessarily need regular expression functions here, e.g.:
db=# with c as (select unnest('{ID: 1, name: john doe, occupation: salesmen}'::text[]))
select string_to_array(unnest,': ') from c;
string_to_array
-----------------------
{ID,1}
{name,"john doe"}
{occupation,salesmen}
(3 rows)
Not sure which will be faster, though.
Regarding built-in JSON formatting - I think you HAVE to provide either a row or already formatted JSON - no such parsers are currently available...
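For completeness, the query from the question could be rewritten the same way, using only string_to_array() and unnest(); whether it is actually faster than regexp_split_to_table() on your data is something you would have to measure:
WITH lv1 AS (
  SELECT unnest(string_to_array('ID: 1, name: john doe, occupation: salesmen', ', ')) AS record
), lv2 AS (
  SELECT string_to_array(record, ': ') AS arr FROM lv1
)
SELECT json_object(array_agg(arr[1]), array_agg(arr[2]))
FROM lv2;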
I'm looking for an answer to the following question:
Is it possible to 'create a table' and to 'insert values' using several arrays?
My problem is that I have a large number of values which I'd like to insert into the table. Because of that, it's not really practical for me to do the standard 'CREATE TABLE' stuff like c.execute('''CREATE TABLE calc_matrix (date text, trans text, symbol text, qty real, price real)''')
I have the following arrays which I'd like to insert:
1. Array: column_name [200] = [ColName1 string, ColName2 string, ...., ColName200 string]
2. Array: Result_1 [200] = [ Value1 int, Value2 double, Date date, Value3 float, ...., Value185 int, Value 186 float]
3. Array: Result_2 [200] = [ Value1 int, Value2 double, Date date, Value3 float, ...., Value185 int, Value 186 float]
4. Array: Result_3 [200] = [ Value1 int, Value2 double, Date date, Value3 float, ...., Value185 int, Value 186 float]
Final table example:
[0]|ColName1|ColName2|ColName3|ColName4|...|ColName199|ColName200|
______________________________________________________________
[1]|Value1|Value2|Date|Value3|....|Value198|Value199|
[2]|Value1|Value2|Date|Value3|....|Value198|Value199|
[3]|Value1|Value2|Date|Value3|....|Value198|Value199|
[4]|Value1|Value2|Date|Value3|....|Value198|Value199|
.
.
.
[150]|Value1|Value2|Date|Value3|....|Value198|Value199|
Therefore I don't know how to accomplish that.
I've tried things like:
c.execute('''CREATE TABLE calc_matrix %s''', column_name)
Or in my loop:
c.execute('''CREATE TABLE calc_matrix ?''', column_name[i])
but nothing worked ......
Try creating the full SQL command in a string first and then passing the string to c.execute:
string = "CREATE TABLE calc_matrix ( "
your for loop
string += column_name[i] + ", "
string += ")"
c.execute(string)
Your for loop wasn't working before because your program was trying to execute many CREATE TABLE commands that all had the same table name calc_matrix. You need the entire command in one c.execute statement.
The same pattern can be used for inserting many rows into a table.
string = "INSERT INTO calc_matrix VALUES ("
your for loop
string += result_1[i] + ", "
string += ")"
c.execute(string)
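If result_1, result_2 and result_3 all have the same shape, sqlite3's executemany() can presumably insert them in one call; a sketch, assuming the lists line up with column_name:
placeholders = ", ".join("?" for _ in column_name)     # one "?" per column
c.executemany("INSERT INTO calc_matrix VALUES (" + placeholders + ")",
              [result_1, result_2, result_3])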