Hi folks! I have a table Template with a single column of type JSON (called column below). A sample value looks like this:
{
    "name": "any name",
    "headers": {
        "Accept": ["application/json"],
        "ApiSecret": ["keySecret==="],
        "ApiSecretKey": ["key==="],
        "X-Auth-Token": ["token=="],
        "OAUTH-Token": ["token2=="],
        "Authorization": ["basicAuth"]
    },
    "description": "any desc"
}
I have to remove all headers whose names match %authorization%, %token%, %apiSecret%.
I created the following query:
UPDATE template as w
SET
column = column::jsonb - ARRAY(select wf.elements
from (select jsonb_object_keys(column->'headers') as elements
from template) as wf
where LOWER(wf.elements) LIKE ANY(ARRAY['%authorization%','%token%','%apikey%']));
Unfortunately my query does not work. Where could be the problem?
(a) You can't delete the nested keys with column::jsonb - ARRAY(...): the - operator only removes top-level keys, so it never reaches the entries inside 'headers'.
(b) If you want to remove several key/value pairs from the same jsonb value, you have to create a user-defined aggregate based on the jsonb #- operator:
CREATE OR REPLACE FUNCTION jsonb_remove(x jsonb, y jsonb, p text[])
RETURNS jsonb LANGUAGE sql IMMUTABLE AS
$$ SELECT COALESCE(x, y) #- p ; $$ ; -- x: running state (NULL on the first call), y: the incoming jsonb value, p: path of the key to delete
DROP AGGREGATE IF EXISTS jsonb_remove_agg(jsonb, text[]) ;
CREATE AGGREGATE jsonb_remove_agg(jsonb, text[])
( sfunc = jsonb_remove
, stype = jsonb
) ;
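To see what a single aggregation step does, here is the #- operator applied to a literal built from the sample data (a minimal hand-run sketch):
SELECT '{"name": "any name", "headers": {"Accept": ["application/json"], "X-Auth-Token": ["token=="]}}'::jsonb
       #- '{headers,X-Auth-Token}';
-- {"name": "any name", "headers": {"Accept": ["application/json"]}}
The aggregate simply repeats this step for every matching path.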
Then you can apply the new aggregate within a query:
UPDATE template as w
SET
column = l.elements
FROM
( SELECT id -- id is to be replaced by the primary key of table template
, jsonb_remove_agg(column, array['headers', wf.elements]) as elements
FROM template
CROSS JOIN LATERAL jsonb_object_keys(column->'headers') as wf(elements)
WHERE LOWER(wf.elements) LIKE ANY(ARRAY['%authorization%','%token%','%apikey%'])
GROUP BY id -- id is to be replaced by the primary key of table template
) AS l
WHERE w.id = l.id ; -- id is to be replaced by the primary key of table template
See the test result in dbfiddle.
Related
I have the following table:
CREATE TABLE fun (
id uuid not null,
tag varchar[] NOT NULL,
CONSTRAINT fun_pkey PRIMARY KEY(id, tag)
);
CREATE UNIQUE INDEX idx_fun_id ON fun USING btree (id);
Then I inserted data into the table:
insert into fun (id, tag)
values('d7f17de9-c1e9-47ba-9e3d-cd1021c644d2', array['123','234'])
So currently, the value of my tag is ["123", "234"]
How can I append values to the array while ignoring any that already exist, adding only the non-existing ones?
Currently, this is how I approach it:
update fun
set tag = tag || array['234','345']
where id = 'd7f17de9-c1e9-47ba-9e3d-cd1021c644d2'
but my tag becomes ["123", "234", "234", "345"]; the value 234 is duplicated. What I need is for tag to become ["123", "234", "345"].
There is no built-in function to only append unique elements, but it's easy to write one:
create function append_unique(p_one text[], p_two text[])
returns text[]
as
$$
select array(select *
from unnest(p_one)
union
select *
from unnest(p_two));
$$
language sql
immutable;
Then you can use it like this:
update fun
set tag = append_unique(tag,array['234','345'])
where id = 'd7f17de9-c1e9-47ba-9e3d-cd1021c644d2'
Note that this does not preserve the order of the items.
A function that preserves the order of the elements of the existing array and appends the elements of the second one in the order provided would be:
create function append_unique(p_one text[], p_two text[])
returns text[]
as
$$
select p_one||array(select x.item
from unnest(p_two) with ordinality as x(item,idx)
where x.item <> all (p_one)
order by x.idx);
$$
language sql
immutable;
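Called with the arrays from the question, the order-preserving variant behaves like this (expected output shown as a comment):
select append_unique(array['123','234'], array['234','345']);
-- {123,234,345} : the existing order is kept and only the missing element is appended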
I have used the following query to parse JSON elements and store them in table 'pl'. The 'test' table stores the raw JSON.
select
each_attribute ->> 'id' id,
each_attribute ->> 'sd' sd,
each_attribute ->> 'v' v
from test
cross join json_array_elements(json_array) each_section
cross join json_array_elements(each_section -> 'firstt') each_attribute
I am able to view the following JSON values using the above query, but I am not able to insert them into another table using json_populate_recordset.
Table definition I need to insert the nested JSON into:
id integer, character varying(6666), character varying(99999)
Table1 (with the above definition) should store the values for key firstt.
Table2 (with the above definition) should store the values for key secondt.
JSON format:
{
    "firstt": [
        { "id": 1, "sd": "test3", "v": "2223" },
        { "id": 2, "sd": "test2", "v": "2222" }
    ],
    "secondt": [
        { "id": 1, "sd": "test3", "v": "2223" },
        { "id": 2, "sd": "test2", "v": "2222" }
    ]
}
Please assist. I have tried every possible StackOverflow solution, but nothing covers inserting a nested array like this.
Adding my code for the dynamic query below. It does not work; the error is 'too few arguments for format'.
do $$
DECLARE
my record;
tb_n varchar(50);
BEGIN
FOR my IN
SELECT json_object_keys(json_array) as t FROM test
LOOP
tb_n := my.t;
EXECUTE format($$ WITH tbl_record_arrays as(
SELECT
entries.*
FROM
test
JOIN LATERAL json_each(json_array) as entries(tbl_name,tbl_data_arr) ON TRUE
)
INSERT INTO %I
SELECT
records.*
FROM
tbl_record_arrays
JOIN LATERAL json_populate_recordset(null::%I,tbl_data_arr) records ON TRUE
WHERE
tbl_name = %I$$,tb_n);
END LOOP;
END;
$$;
To create a plpgsql function that dynamically inserts a json array for a specified key into a specified table, you can do:
CREATE OR REPLACE FUNCTION dynamic_json_insert(key_name text,tbl text) RETURNS VOID AS $$
BEGIN
-- the $<tag>$ syntax allows for generating a multiline string
EXECUTE format($sql$
INSERT INTO %1$I
SELECT
entries.*
FROM test
JOIN LATERAL json_populate_recordset(null::%1$I,json_data -> $1) as entries ON TRUE;
$sql$::text,tbl) USING dynamic_json_insert.key_name;
END;
$$ LANGUAGE plpgsql
VOLATILE --modifies data
STRICT -- Returns NULL if any arguments are NULL
SECURITY INVOKER; --Execute this function with the Role of the caller, rather than the Role that defined the function;
and call it like
SELECT dynamic_json_insert('firstt','table_1')
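For that call, the statement built by format() and run by EXECUTE is the following, with $1 bound to 'firstt' through the USING clause:
INSERT INTO table_1
SELECT
    entries.*
FROM test
JOIN LATERAL json_populate_recordset(null::table_1,json_data -> $1) as entries ON TRUE;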
If you want to insert into multiple tables using multiple key/table pairs, you can write a plpgsql function that takes a variadic array of (key, table) pairs and generates a single Common Table Expression (CTE) performing all of the INSERTs in one atomic statement.
First create a custom type:
CREATE TYPE table_key as (
tbl_key text,
relation regclass -- special type that refers to a Postgresql relation
);
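You can verify that a plain row literal coerces to the new type (this assumes table_1 already exists, because regclass resolves the name against the system catalog):
SELECT ('firstt', 'table_1')::table_key;
-- (firstt,table_1)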
Then define the function:
CREATE OR REPLACE FUNCTION dynamic_json_insert(variadic table_keys table_key[]) RETURNS VOID AS $$
DECLARE
tbl_key_len integer = array_length(dynamic_json_insert.table_keys,1);
BEGIN
IF tbl_key_len > 0 THEN
EXECUTE (
--generates a single atomic insert CTE when there are multiple table_keys OR a single insert statement otherwise
--the SELECT is enclosed in parenthesis because it generates a single text value which EXECUTE receives.
SELECT
--append WITH if we have more than 1 table_key (for CTE)
CASE WHEN tbl_key_len > 1 THEN 'WITH ' ELSE '' END
|| string_agg(
CASE
WHEN
--name the auxiliary statement and put it in parentheses.
is_aux THEN format('%1$I as (%2$s)','ins_' || tk.tbl_key,stmt) || end_char
ELSE stmt
END,E'\n' ORDER BY tk.rn) || ';' -- ORDER BY keeps the INSERTs in argument order, auxiliary statements first
FROM
--unnest the table_keys argument and get its index (rn)
unnest(dynamic_json_insert.table_keys) WITH ORDINALITY AS tk(tbl_key,relation,rn)
-- the JOIN LATERAL here means "for each unnested table_key, generate the rows of the following subquery"
JOIN LATERAL (
SELECT
rn < tbl_key_len is_aux,
--we need a comma between auxiliary statements
CASE WHEN rn = tbl_key_len - 1 THEN '' ELSE ',' END end_char,
--dynamically generate INSERT statement
format($sql$
INSERT INTO %1$I
SELECT
entries.*
FROM test
JOIN LATERAL json_populate_recordset(null::%1$I,json_data -> %2$L) as entries ON TRUE
$sql$::text,tk.relation,tk.tbl_key) stmt
) stmts ON TRUE
);
END IF;
END;
$$ LANGUAGE plpgsql
VOLATILE --modifies data
STRICT -- Returns NULL if any arguments are NULL
SECURITY INVOKER; --Execute this function with the Role of the caller, rather than the Role that defined the function;
Then call the function like:
SELECT dynamic_json_insert(
('firstt','table_1'),
('secondt','table_2')
);
Because of the use of the variadic keyword, you can pass in each element of the array as an individual argument and Postgres will cast to the appropriate types automatically.
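Equivalently, you can hand over a ready-made array by marking it with the VARIADIC keyword in the call:
SELECT dynamic_json_insert(VARIADIC ARRAY[('firstt','table_1'),('secondt','table_2')]::table_key[]);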
The generated/executed SQL for the above function call will be:
WITH ins_firstt as (
INSERT INTO table_1
SELECT
entries.*
FROM test
JOIN LATERAL json_populate_recordset(null::table_1,json_data -> 'firstt') as entries ON TRUE
)
INSERT INTO table_2
SELECT
entries.*
FROM test
JOIN LATERAL json_populate_recordset(null::table_2,json_data -> 'secondt') as entries ON TRUE
;
I have a column that contains a JSON array of strings, which I would like to transform into boolean columns. These columns are true if the value was present in the array.
Let's say I have the following column in Postgres:
| countries    |
----------------
| ["NL", "BE"] |
| ["UK"]       |
I would like to transform this into boolean columns per market, e.g.:
|  BE   |  NL   |  UK   |
-------------------------
| True  | True  | False |
| False | False | True  |
I know I can manually expand it using case statements for each country code, but there are 200+ countries.
Is there a more elegant solution?
Displaying a variable list of columns whose labels are known only at runtime is not so obvious with Postgres; you need some dynamic SQL. Here is a fully dynamic solution whose result is close to your expected output. It relies on a user-defined composite type and on the standard functions jsonb_populate_record and jsonb_object_agg.
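As a quick look at one building block, jsonb_object_agg turns the unnested country codes of a row into a jsonb object whose keys jsonb_populate_record can later map onto record columns (a hand-run sketch on the first sample row):
SELECT jsonb_object_agg(lower(c.country), true)
FROM jsonb_array_elements_text('["NL", "BE"]'::jsonb) AS c(country);
-- {"be": true, "nl": true}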
First, create the list of countries as a new composite type:
CREATE TYPE country_list AS () ;
CREATE OR REPLACE PROCEDURE country_list () LANGUAGE plpgsql AS
$$
DECLARE country_list text ;
BEGIN
SELECT string_agg(DISTINCT c.country || ' text', ',')
INTO country_list
FROM your_table
CROSS JOIN LATERAL jsonb_array_elements_text(countries) AS c(country) ;
EXECUTE 'DROP TYPE IF EXISTS country_list' ;
EXECUTE 'CREATE TYPE country_list AS (' || country_list || ')' ;
END ;
$$ ;
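With the two sample rows from the question, the procedure would generate something equivalent to the statement below (the attribute order produced by string_agg with DISTINCT is not guaranteed):
CREATE TYPE country_list AS (BE text, NL text, UK text);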
Then call the procedure country_list() just before executing the final query:
CALL country_list () ;
or, even better, call the procedure from triggers, so that it runs whenever the list of countries may change:
CREATE OR REPLACE FUNCTION your_table_insert_update()
RETURNS trigger LANGUAGE plpgsql VOLATILE AS
$$
BEGIN
IF EXISTS ( SELECT 1
FROM (SELECT jsonb_object_keys(to_jsonb(a.*)) FROM (SELECT(null :: country_list).*) AS a) AS b(key)
RIGHT JOIN jsonb_array_elements_text(NEW.countries) AS c(country)
ON c.country = b.key
WHERE b.key IS NULL
)
THEN CALL country_list () ;
END IF ;
RETURN NEW ;
END ;
$$ ;
CREATE OR REPLACE TRIGGER your_table_insert_update AFTER INSERT OR UPDATE OF countries ON your_table
FOR EACH ROW EXECUTE FUNCTION your_table_insert_update() ;
CREATE OR REPLACE FUNCTION your_table_delete()
RETURNS trigger LANGUAGE plpgsql VOLATILE AS
$$
BEGIN
CALL country_list () ;
RETURN OLD ;
END ;
$$ ;
CREATE OR REPLACE TRIGGER your_table_delete AFTER DELETE ON your_table
FOR EACH ROW EXECUTE FUNCTION your_table_delete() ;
Finally, you should get the expected result with the following query, except that the column labels are lower case and NULL appears instead of false:
SELECT (jsonb_populate_record(NULL :: country_list, jsonb_object_agg(lower(c.country), true))).*
FROM your_table AS t
CROSS JOIN LATERAL jsonb_array_elements_text(t.countries) AS c(country)
GROUP BY t
Full test result in dbfiddle.
How can I enforce a NOT NULL constraint on keys inside a jsonb column in Postgres?
I have created a Postgres table with just one column, called id, like this:
create table tablea (
id jsonb,
check
((id->>'test1', id->>'test2') != (null, null))
);
The caller will insert data into the table in the following JSON format:
[
    {
        "test1": "",
        "test2": "",
        "test3": ""
    },
    {
        "test1": "",
        "test2": "",
        "test3": ""
    }
]
My goal is that when a caller inserts data into the id column, the keys test1 and test2 must not be null. How can I achieve that? My table creation logic is shown above. I am trying to insert data like this:
insert into tablea(id) values
('[{"test1":null,"test2":"a","test3":""}]');
Ideally this insert statement should throw an error, but it inserts the data into the table. Can anyone help me out?
You will need to create a function that iterates through your array and validates every array element.
Something like this:
create or replace function validate_json(p_input jsonb)
returns boolean
as
$$
select not exists (select *
from jsonb_array_elements(p_input) as t(element)
where nullif(element ->> 'test1', '') is null
or nullif(element ->> 'test2', '') is null);
$$
language sql
stable;
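You can check the function directly against the payload from the question:
select validate_json('[{"test1":null,"test2":"a","test3":""}]'::jsonb); -- false (test1 is null)
select validate_json('[{"test1":"x","test2":"a","test3":""}]'::jsonb); -- true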
Note that you can't compare null with = or <>; you need IS NOT NULL for that. It also seems you want to treat an empty string the same way as null, which is what the nullif() calls do. You can then use the function to define a check constraint:
create table tablea
(
id jsonb,
constraint check_json check ( validate_json(id) )
);
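With the constraint in place, the insert from the question is rejected with an error along these lines:
insert into tablea(id) values
('[{"test1":null,"test2":"a","test3":""}]');
-- ERROR: new row for relation "tablea" violates check constraint "check_json"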
Table:
CREATE TABLE appointment
(
id bigserial NOT NULL,
date_of_visit timestamp without time zone NOT NULL,
symptoms text[],
diseases text[],
lab_tests text[],
prescription_id bigint NOT NULL,
medicines jsonb,
CONSTRAINT appointment_pkey PRIMARY KEY (id),
CONSTRAINT appointment_prescription_id_fkey FOREIGN KEY (prescription_id)
REFERENCES prescription (id) MATCH SIMPLE
ON UPDATE NO ACTION ON DELETE NO ACTION
)
WITH (
OIDS=FALSE
);
Insert statement:
INSERT INTO appointment values(
1,
now(),
'{"abc","def","ghi"}',
'{"abc","def","ghi"}',
'{"abc","def","ghi"}',
1,
'[{"sku_id": 1, "company": "Magnafone"}, {"sku_id": 2, "company": "Magnafone"}]')
I am trying to query against a jsonb array column in Postgres. I had a solution in hand, shown below, but somehow it is not working. The error is: Cannot extract elements from a scalar.
SELECT distinct(prescription_id)
FROM appointment
WHERE to_json(array(SELECT jsonb_array_elements(medicines) ->>'sku_id'))::jsonb ?|array['1']
LIMIT 2;
Update:
The query runs just fine. There were some unwanted values in the column for other rows, which is why it was failing.
There are rows in the table containing a scalar value in the medicines column instead of an array.
You should inspect and properly update that data. You can find the offending rows with this query:
select id, medicines
from appointment
where jsonb_typeof(medicines) <> 'array';
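If those scalars are really single medicine objects that should have been one-element arrays, one possible cleanup is to wrap them. This is only a sketch under that assumption; inspect the data before running it:
-- assumption: every non-array value should become a one-element array
update appointment
set medicines = jsonb_build_array(medicines)
where jsonb_typeof(medicines) <> 'array';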
Alternatively, you can guard the type check inside the query itself. Note that a set-returning function such as jsonb_array_elements is not allowed inside a CASE expression (as of Postgres 10), so the check has to move into the function's argument via a lateral join:
select prescription_id
from (
    select distinct on (prescription_id)
        prescription_id,
        e.elem ->> 'sku_id' as sku_id
    from appointment
    left join lateral jsonb_array_elements(
        case when jsonb_typeof(medicines) = 'array' then medicines end
    ) as e(elem) on true
) alias
where sku_id = '1'
limit 2;
or simply exclude non-array values in the WHERE clause:
select prescription_id
from (
select distinct on (prescription_id)
prescription_id,
jsonb_array_elements(medicines) ->>'sku_id' as sku_id
from appointment
where jsonb_typeof(medicines) = 'array'
) alias
where sku_id = '1'
limit 2;