I have the following table:
CREATE TABLE fun (
id uuid not null,
tag varchar[] NOT NULL,
CONSTRAINT fun_pkey PRIMARY KEY(id, tag)
);
CREATE UNIQUE INDEX idx_fun_id ON fun USING btree (id);
Then I inserted a row into the table:
insert into fun (id, tag)
values('d7f17de9-c1e9-47ba-9e3d-cd1021c644d2', array['123','234'])
So currently, the value of my tag is ["123", "234"]
How can I append values to the array while ignoring any that already exist, so that only the new ones are added?
Currently, this is how I approach it:
update fun
set tag = tag || array['234','345']
where id = 'd7f17de9-c1e9-47ba-9e3d-cd1021c644d2'
but my tag will become ["123", "234", "234", "345"]; the value 234 is duplicated. What I need to achieve is for tag to become ["123", "234", "345"].
There is no built-in function to only append unique elements, but it's easy to write one:
create function append_unique(p_one text[], p_two text[])
returns text[]
as
$$
select array(select *
from unnest(p_one)
union
select *
from unnest(p_two));
$$
language sql
immutable;
Then you can use it like this:
update fun
set tag = append_unique(tag,array['234','345'])
where id = 'd7f17de9-c1e9-47ba-9e3d-cd1021c644d2'
Note that this does not preserve the order of the items.
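A quick check (a sketch, assuming the function above has been created):
select append_unique(array['123','234'], array['234','345']);
-- e.g. {123,345,234}: every value appears exactly once, but the order is not guaranteed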
A function that preserves the order of the elements of the existing array and appends the elements of the second one in the order provided would be:
create function append_unique(p_one text[], p_two text[])
returns text[]
as
$$
select p_one||array(select x.item
from unnest(p_two) with ordinality as x(item,idx)
where x.item <> all (p_one)
order by x.idx);
$$
language sql
immutable;
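With this variant (assuming it replaces the first version, since both share the same name and signature), the example from the question yields the desired result:
select append_unique(array['123','234'], array['234','345']);
-- {123,234,345}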
Folks! I have a table Template with only one column, of type JSON.
column
{
"name": "any name",
"headers": {
"Accept": ["application/json"],
"ApiSecret": ["keySecret==="],
"ApiSecretKey": ["key==="],
"X-Auth-Token": ["token=="],
"OAUTH-Token": ["token2=="],
"Authorization": ["basicAuth"]
},
"description": "any desc"
}
I have to remove all headers like %authorization%, %token%, %apiSecret%.
I created the following query:
UPDATE template as w
SET
column = column::jsonb - ARRAY(select wf.elements
from (select jsonb_object_keys(column->'headers') as elements
from template) as wf
where LOWER(wf.elements) LIKE ANY(ARRAY['%authorization%','%token%','%apikey%']));
Unfortunately my query does not work. Where could the problem be?
(a) You can't mix the jsonb type with the array type: column::jsonb - ARRAY(...) is not allowed.
(b) If you want to remove several key/value pairs from the same jsonb data, then you have to create a user-defined aggregate based on the jsonb #- operator:
CREATE OR REPLACE FUNCTION jsonb_remove(x jsonb, y jsonb, p text[])
RETURNS jsonb LANGUAGE sql IMMUTABLE AS
$$ SELECT COALESCE(x, y) #- p ; $$ ;
DROP AGGREGATE IF EXISTS jsonb_remove_agg(jsonb, text[]) ;
CREATE AGGREGATE jsonb_remove_agg(jsonb, text[])
( sfunc = jsonb_remove
, stype = jsonb
) ;
Then you can iterate on the new aggregate within a query:
UPDATE template as w
SET
column = l.elements
FROM
( SELECT id -- id is to be replaced by the primary key of table template
, jsonb_remove_agg(column, array['headers', wf.elements]) as elements
FROM template
CROSS JOIN LATERAL jsonb_object_keys(column->'headers') as wf(elements)
WHERE LOWER(wf.elements) LIKE ANY(ARRAY['%authorization%','%token%','%apikey%'])
GROUP BY id -- id is to be replaced by the primary key of table template
) AS l
WHERE w.id = l.id ; -- id is to be replaced by the primary key of table template
See the test result in dbfiddle.
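For a quick standalone check of the aggregate (a sketch that uses a trimmed copy of the sample document above instead of the template table, and the %apisecret% pattern from the question's requirement):
SELECT jsonb_remove_agg(d.doc, ARRAY['headers', k.key]) AS cleaned
FROM (VALUES
        ('{"name": "any name",
           "headers": {"Accept": ["application/json"],
                       "ApiSecret": ["keySecret==="],
                       "Authorization": ["basicAuth"]},
           "description": "any desc"}'::jsonb)
     ) AS d(doc)
CROSS JOIN LATERAL jsonb_object_keys(d.doc -> 'headers') AS k(key)
WHERE LOWER(k.key) LIKE ANY (ARRAY['%authorization%', '%token%', '%apisecret%']);
-- cleaned keeps only the "Accept" header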
How to have a not-null constraint inside a jsonb column in Postgres?
I have created a Postgres table with just one column called id, like this:
create table tablea (
id jsonb,
check
((id->>'test1', id->>'test2') != (null, null))
);
The caller will insert data into the table in the below json format:-
[
{
"test1":"",
"test2":"",
"test3":""
},
{
"test1":"",
"test2":"",
"test3":""
}
]
My goal is that when a caller inserts data into the id column, the keys test1 and test2 must not be null. How can I achieve that? My table creation logic is shown above. I am trying to insert data like this:
insert into tablea(id) values
('[{"test1":null,"test2":"a","test3":""}]');
Ideally this insert statement should throw an error, but it inserts the data into the table. Can anyone help me out?
You will need to create a function that iterates through your array and validates every array element.
Something like this:
create or replace function validate_json(p_input jsonb)
returns boolean
as
$$
select not exists (select *
from jsonb_array_elements(p_input) as t(element)
where nullif(element ->> 'test1', '') is null
or nullif(element ->> 'test2', '') is null);
$$
language sql
stable;
Note that you can't compare against null with = or <>; you need IS NULL / IS NOT NULL for that. It also seems you want to treat an empty string the same way as null, which is what the nullif calls above do.
Then you can use the function to define a check constraint:
create table tablea
(
id jsonb,
constraint check_json check ( validate_json(id) )
);
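A quick test (a sketch; the first insert should now be rejected by check_json, the second accepted):
insert into tablea (id) values ('[{"test1": null, "test2": "a", "test3": ""}]'); -- rejected: test1 is null
insert into tablea (id) values ('[{"test1": "x", "test2": "a", "test3": ""}]');  -- accepted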
I have a table called student, with id and name as fields in PostgreSQL:
Create table student (id int, name text[]);
I need to add a constraint for the name field, meaning it has to accept only alphabetic characters. But the name field is a text array.
I tried this check constraint:
Alter table student
add constraint stud_const check (ALL(name) NOT LIKE '%[^a-zA-Z]%');
But it throws this error:
ERROR: syntax error at or near "all"
LINE 1: ... student add constraint stud_const check (all(name) ...
How could I solve this problem? The constraint should apply to the whole array.
It is necessary to unnest the array to match it to a regular expression:
select bool_and (n ~ '^[a-zA-Z]*$')
from unnest(array['John','Mary']) a(n)
;
bool_and
----------
t
bool_and aggregates the per-element results into a single boolean. Since it is not possible to use a subquery in a check constraint, wrap the query in a function:
create function check_text_array_regex (
a text[], regex text
) returns boolean as $$
select bool_and (n ~ regex)
from unnest(a) s(n);
$$ language sql immutable;
and use the function in the check constraint:
create table student (
id serial,
name text[] check (check_text_array_regex (name, '^[a-zA-Z]*$'))
);
Test it:
insert into student (name) values (array['John', 'Mary']);
INSERT 0 1
insert into student (name) values (array['John', 'Mary2']);
ERROR: new row for relation "student" violates check constraint "student_name_check"
DETAIL: Failing row contains (2, {John,Mary2}).
Let's say I have a table called SomeTable with a primary key row_id and a json array field called some_json.
some_json would look something like this:
[{"some_key": "some_value_1"}, {"some_key": "some_value_2"},
 {"some_key": "some_value_3"}, {"some_key": "some_value_4"}]
and I have a function that takes a text array parameter called values_to_remove like this:
['some_value_2', 'some_value_3', etc]
I want to update some_json by removing all of its json objects that contain a text value that's also in the values_to_remove array. I have a rough idea of what I need to do but don't know how to piece it all together with the correct syntax, so forgive me if it doesn't make sense.
Here's what I have so far (forgive me for butchering the syntax):
CREATE OR REPLACE FUNCTION remove_json_items (removal_id smallint,
values_to_remove text[])
RETURNS void AS $$
BEGIN
UPDATE SomeTable
SET some_json = array_to_json(array_remove(ARRAY(SELECT *
FROM json_array_elements(some_json)),
(SELECT *
FROM json_array_elements(some_json),
unnest(values_to_remove)
WHERE some_json->>some_key = values_to_remove.value
)))
WHERE row_id = removal_id;
END;
$$ LANGUAGE plpgsql;
What's the correct solution for achieving this? Thanks.
Here is the function written in plain SQL:
CREATE OR REPLACE FUNCTION remove_json_items (removal_id smallint,
values_to_remove text[])
RETURNS void AS $$
UPDATE sometable
SET some_json =
(SELECT json_agg(ae)
FROM (
SELECT json_array_elements(some_json) AS ae from sometable WHERE row_id = removal_id
) sq1
WHERE NOT ARRAY[(SELECT * FROM json_to_record(ae) AS x(some_key text))] <@ values_to_remove)
WHERE row_id = removal_id;
$$ LANGUAGE SQL;
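A usage sketch (assuming a row with row_id = 1 exists; the cast is needed because the first parameter is declared as smallint):
SELECT remove_json_items(1::smallint, ARRAY['some_value_2', 'some_value_3']);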
In my SQL code I'm passing around a bunch of magic numbers:
AnimalType TINYINT
/*
AnimalType can be one of the following :-
1. Cat
2. Dog
3. Bird
....
*/
Is there any way I could make this a custom type / enumeration?
eg.
AnimalType ANIMAL
and it's constrained to contain a number between 1 and whatever (e.g. 3 in my example above), or constrained to strings, e.g. AnimalType = 'Cat', etc.?
Cheers!
Edit
I know what lookup tables are. This is not for a lookup table, but for some data passed to a number of stored procedures. Instead of passing in magic numbers, I wish to pass in an enumeration OR at least some constrained number (e.g. numbers 1 to 5), etc.
There are no enumeration types. However, you can create user-defined functions to translate back and forth between INTs and the enumerated values they map to.
To generate friendly names for an AnimalType based on its INT value, you could do something like this:
UDF to generate friendly names:
CREATE FUNCTION ihAnimalTypeDesc
(
    @AnimalType INT
)
RETURNS VARCHAR(20)
AS
BEGIN
    IF @AnimalType IS NULL
        RETURN NULL
    DECLARE @Temp AS VARCHAR(20)
    SET @Temp = CASE @AnimalType
                    WHEN 1 THEN 'Cat'
                    WHEN 2 THEN 'Dog'
                    WHEN 3 THEN 'Snake'
                END
    RETURN @Temp
END
A SELECT statement could use the UDF like so:
SELECT A.AnimalName, dbo.ihAnimalTypeDesc(A.AnimalType)
FROM Animals A
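The mapping in the other direction (friendly name back to INT) is symmetric; a sketch, where ihAnimalTypeId is a made-up name following the same pattern:
CREATE FUNCTION ihAnimalTypeId
(
    @AnimalTypeName VARCHAR(20)
)
RETURNS INT
AS
BEGIN
    RETURN CASE @AnimalTypeName
               WHEN 'Cat'   THEN 1
               WHEN 'Dog'   THEN 2
               WHEN 'Snake' THEN 3
           END
END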
Here is a UDF to return true or false if an animal is of a particular type:
CREATE FUNCTION IsCat
(
    @AnimalType INT
)
RETURNS BIT
AS
BEGIN
    IF @AnimalType IS NULL
        RETURN NULL
    IF @AnimalType = 1
        RETURN 1
    RETURN 0
END
Here is an example using the above UDF. NOTE: you have to be careful about performance issues when doing this in the WHERE clause:
SELECT AnimalName
FROM Animals
WHERE dbo.IsCat(AnimalType) = 1
Enumeration is like a FOREIGN KEY to a table, but without a table.
Create a table and make a FOREIGN KEY constraint:
CREATE TABLE AnimalType (id INT NOT NULL PRIMARY KEY, name VARCHAR(50) NOT NULL)
CREATE TABLE Animal (
id INT NOT NULL PRIMARY KEY,
type INT NOT NULL,
name VARCHAR(50) NOT NULL,
CONSTRAINT FK_animal_type FOREIGN KEY (type) REFERENCES AnimalType(id)
)
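A short sketch of how the constraint behaves (the row values here are made up for illustration):
INSERT INTO AnimalType (id, name) VALUES (1, 'Cat'), (2, 'Dog'), (3, 'Bird')
INSERT INTO Animal (id, type, name) VALUES (1, 1, 'Whiskers') -- OK: AnimalType 1 exists
INSERT INTO Animal (id, type, name) VALUES (2, 99, 'Rex')     -- fails: no AnimalType row with id 99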
You can try to add something like this:
CREATE VIEW AnimalType AS
SELECT
10 AS Cat
,20 AS Dog
,30 AS Bird
to use it:
DECLARE @animal INT
SELECT @animal = Bird FROM AnimalType
As per your latest edit about needing to pass in parameters: T-SQL has the basics to do simple stuff in addition to heavy-lifting SQL. There are no enumeration data types. If that is what you want, you can handle it in the following way:
create procedure YourProcedure
( @param1 int
, @param2 varchar(5)
, @param3 varchar(5)
)
as
DECLARE @Value3 int
IF @param1 IS NULL OR @param1 < 1 OR @param1 > 5
BEGIN
    return 1 --error: out-of-range param
END
IF NOT EXISTS (SELECT Value FROM LookUpTable WHERE Value = @param2)
BEGIN
    return 2 --error: out-of-range param
END
SELECT @Value3 = IntValue FROM OtherLookupTable WHERE StrValue = @param3
IF @Value3 IS NULL
BEGIN
    return 3 --error: out-of-range param
END
--do work here
go
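Calling the procedure then looks like this (a sketch; the lookup-table contents are whatever your schema actually holds, and a non-zero return code signals an out-of-range parameter):
DECLARE @rc int
EXEC @rc = YourProcedure @param1 = 3, @param2 = 'Cat', @param3 = 'Dog'
IF @rc <> 0
    PRINT 'parameter validation failed, code ' + CAST(@rc AS varchar(10))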