I have a table called student in PostgreSQL, with id and name as fields:
Create table student (id int, name text[]);
I need to add a constraint for the name field, meaning it should accept only alphabetic characters. But the name field is a text array.
I tried this check constraint:
Alter table student
add constraint stud_const check (ALL(name) NOT LIKE '%[^a-zA-Z]%');
But it throws this error:
ERROR: syntax error at or near "all"
LINE 1: ... student add constraint stud_const check (all(name) ...
How can I solve this problem? The constraint should apply to the whole array.
It is necessary to unnest the array to match each element against a regular expression:
select bool_and (n ~ '^[a-zA-Z]*$')
from unnest(array['John','Mary']) a(n)
;
bool_and
----------
t
The bool_and aggregate does the per-element check. Since it is not possible to use a subquery in a check constraint, wrap the query in a function:
create function check_text_array_regex (
a text[], regex text
) returns boolean as $$
select bool_and (n ~ regex)
from unnest(a) s(n);
$$ language sql immutable;
and use the function in the check constraint:
create table student (
id serial,
name text[] check (check_text_array_regex (name, '^[a-zA-Z]*$'))
);
Test it:
insert into student (name) values (array['John', 'Mary']);
INSERT 0 1
insert into student (name) values (array['John', 'Mary2']);
ERROR: new row for relation "student" violates check constraint "student_name_check"
DETAIL: Failing row contains (2, {John,Mary2}).
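Note that a NULL name or an empty array still passes, because bool_and over zero rows returns NULL and a check constraint treats NULL as satisfied:
insert into student (name) values (null);              -- accepted
insert into student (name) values (array[]::text[]);   -- accepted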
I have the following table:
CREATE TABLE fun (
id uuid not null,
tag varchar[] NOT NULL,
CONSTRAINT fun_pkey PRIMARY KEY(id, tag)
);
CREATE UNIQUE INDEX idx_fun_id ON fun USING btree (id);
Then I inserted data into the table:
insert into fun (id, tag)
values('d7f17de9-c1e9-47ba-9e3d-cd1021c644d2', array['123','234'])
So currently, the value of my tag is ["123", "234"].
How can I append new values to the array, ignoring any that already exist and adding only the ones that don't?
Currently, this is how I approach it:
update fun
set tag = tag || array['234','345']
where id = 'd7f17de9-c1e9-47ba-9e3d-cd1021c644d2'
but my tag becomes ["123", "234", "234", "345"]: the value 234 is duplicated. What I need is for the tag to become ["123", "234", "345"].
There is no built-in function to only append unique elements, but it's easy to write one:
create function append_unique(p_one text[], p_two text[])
returns text[]
as
$$
select array(select *
from unnest(p_one)
union
select *
from unnest(p_two));
$$
language sql
immutable;
Then you can use it like this:
update fun
set tag = append_unique(tag,array['234','345'])
where id = 'd7f17de9-c1e9-47ba-9e3d-cd1021c644d2'
Note that this does not preserve the order of the items.
A function that preserves the order of the elements of the existing array and appends the elements of the second one in the order provided would be:
create or replace function append_unique(p_one text[], p_two text[])
returns text[]
as
$$
select p_one||array(select x.item
from unnest(p_two) with ordinality as x(item,idx)
where x.item <> all (p_one)
order by x.idx);
$$
language sql
immutable;
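A quick check of the order-preserving version (expected output shown as a comment, assuming the function above has been created):
select append_unique(array['123','234'], array['234','345']);
-- returns {123,234,345}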
How can I have a not-null constraint on keys inside a jsonb column in Postgres?
I have created a Postgres table with just one column called id, like this:
create table tablea (
id jsonb,
check
((id->>'test1', id->>'test2') != (null, null))
);
The caller will insert data into the table in the following JSON format:
[
{
"test1":"",
"test2":"",
"test3":""
},
{
"test1":"",
"test2":"",
"test3":""
}
]
My goal is that when a caller inserts data into the id column, the keys test1 and test2 must not be null. How can I achieve that? My table creation logic is shown above. I am trying to insert data like this:
insert into tablea(id) values
('[{"test1":null,"test2":"a","test3":""}]');
Ideally this insert statement should throw an error, but it inserts the data into the table. Can anyone help me out?
You will need to create a function that iterates through your array and validates every array element.
Something like this:
create or replace function validate_json(p_input jsonb)
returns boolean
as
$$
select not exists (select *
from jsonb_array_elements(p_input) as t(element)
where nullif(element ->> 'test1', '') is null
or nullif(element ->> 'test2', '') is null);
$$
language sql
stable;
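You can test the function on its own first (expected results shown as comments):
select validate_json('[{"test1":null,"test2":"a","test3":""}]');  -- f
select validate_json('[{"test1":"x","test2":"a","test3":""}]');   -- t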
Note that you can't compare null with = or <>; you need IS NOT NULL for that. It also seems you want to treat an empty string the same way as null, which the nullif() calls above handle.
Then you can use the function to define a check constraint:
create table tablea
(
id jsonb,
constraint check_json check ( validate_json(id) )
);
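With that constraint in place, the insert from the question is rejected (a quick test, assuming the definitions above):
insert into tablea (id) values ('[{"test1":null,"test2":"a","test3":""}]');
-- ERROR:  new row for relation "tablea" violates check constraint "check_json"
insert into tablea (id) values ('[{"test1":"x","test2":"a","test3":""}]');
-- INSERT 0 1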
I have this function:
CREATE FUNCTION CheckAkvNames (@Name VARCHAR(20))
RETURNS INT
AS
BEGIN
    DECLARE @NoTexist int = 1
    SELECT
        @NoTexist = CASE WHEN COUNT(*) > 0 THEN 0 ELSE 1 END
    FROM
        [dbo].[Names]
    WHERE
        [Name] = @Name
    RETURN @NoTexist
END
GO
ALTER TABLE [dbo].[Names]
ADD CONSTRAINT chkNames CHECK(dbo.CheckAkvNames([Name]) = 1);
GO
The problem is, when I run this on an empty table I can't insert ...
So this change works:
CASE WHEN (COUNT(*) - 1) > 0 THEN 0 ELSE 1 END
WHY? Any ideas?
Edit:
The aim is to insert only names that are not already in the table. I know it would be better to use a key; the point of the question is not to find a better solution but to understand why this one does not work.
The constraint you added actually means that you can't insert any name into the table: the check is evaluated after the new row has been added, so the function always sees at least the row being inserted, COUNT(*) is at least 1, and the function returns 0, which violates the constraint.
This is why count(*) - 1 works: subtracting one discounts the row being inserted, so the constraint is only violated when another row with the same name already exists.
If you want unique names in a table, do not use a check constraint, use a unique constraint (or equivalently a unique index):
ALTER TABLE [dbo].[Names]
ADD CONSTRAINT unq_names_name UNIQUE (Name);
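With the unique constraint in place, a duplicate name is rejected (a minimal sketch, assuming the chkNames check constraint has been dropped and Name is the only column that needs a value):
INSERT INTO [dbo].[Names] ([Name]) VALUES ('Alice');  -- succeeds
INSERT INTO [dbo].[Names] ([Name]) VALUES ('Alice');  -- fails with a UNIQUE KEY constraint violation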
Now I have an array of varchar (nullable, no default value),
and I want to convert it into an array of pair(varchar, integer).
For the integer value, just set 0 for now.
For example, ["a", "b"] should become [("a", 0), ("b", 0)].
I already know that I can create a pair type via:
CREATE TYPE keycount AS (key VARCHAR, count INTEGER);
but I have no idea how to use SET DATA TYPE to alter the column.
Thank you for any advice!
First create the type
CREATE TYPE keycount AS (key VARCHAR, count INTEGER);
Now you need to create the cast:
CREATE FUNCTION text_to_keycount(a text)
RETURNS keycount AS
$$
SELECT ($1, 0)::keycount
$$ LANGUAGE sql;
CREATE CAST (text AS keycount) WITH FUNCTION text_to_keycount(text);
SELECT ARRAY['asdf','asdf']::text[]::keycount[];
Now you can create the table and convert the column type with a USING clause:
CREATE TABLE foo ( a text[] );
INSERT INTO foo (a) VALUES
( ARRAY['1','2','3'] );
ALTER TABLE foo
ALTER COLUMN a
SET DATA TYPE keycount[]
USING CAST (a AS keycount[]);
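If the cast works as shown above, the column afterwards holds pairs (expected output shown as a comment):
SELECT a FROM foo;
-- {"(1,0)","(2,0)","(3,0)"}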
If all the rows in the existing varchar[] column have valid data, i.e. there are no rows like '{"abc", ""}', then the following will work. If there is any invalid data in the column, it has to be cleaned first, before changing the column type.
The trick is to create a temporary pair column, populate it with data, drop the old column, and rename the temporary column back to the original name.
This is needed because there is no way for PostgreSQL to figure out how to convert from a VARCHAR[] to a keycount automatically.
example:
-- setup a test table
BEGIN;
CREATE TABLE test (col VARCHAR[]);
INSERT INTO test (col) VALUES ('{"abc", "1"}'::varchar[]);
SELECT col FROM test;
-- returns {abc,1}
CREATE TYPE keycount AS (key VARCHAR, count INTEGER);
ALTER TABLE test ADD COLUMN col2 keycount;
UPDATE test SET col2 = (col[1], col[2]::int)::keycount;
ALTER TABLE test DROP COLUMN col;
ALTER TABLE test RENAME COLUMN col2 TO col;
SELECT col FROM test;
-- returns (abc,1)
COMMIT;
EDIT
The above isn't required. Although it works, it is overly complicated.
It is possible to specify the type conversion directly in the ALTER statement via a USING clause:
ALTER TABLE test ALTER COLUMN col TYPE keycount USING (col[1], col[2]::INTEGER)::keycount;
There's no need to stage the data & drop & rename.
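Running the same check as before (assuming a fresh test table populated as at the top of the example) gives the same result as the longer approach:
SELECT col FROM test;
-- returns (abc,1)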
Trying to add a check constraint to the following table:
CREATE TABLE TEST_A
(
NAME VARCHAR(55),
Country VARCHAR(50)
)
ALTER TABLE TEST_A
ADD CONSTRAINT CK_GBR_TO_IND
CHECK (Country = CASE WHEN 'GBR' THEN 'IND' ELSE COUNTRY END);
I am getting the following error:
Msg 4145, Level 15, State 1, Line 2
An expression of non-boolean type specified in a context where a condition is expected, near 'THEN'.
Try a trigger instead. You can't use a check constraint to change values:
create trigger trg_test
on yourtable
instead of insert
as
begin
    insert into yourtable  -- assuming it has only a country column
    select case when country = 'GBR' then 'IND'
                else country end
    from inserted
end
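A minimal illustration of the trigger (assuming, as in the answer, a single-column table called yourtable):
insert into yourtable (country) values ('GBR');
select country from yourtable;
-- returns IND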
ALTER TABLE TEST_A ADD CONSTRAINT CK_GBR_TO_IND
CHECK (Country IN('GBR', 'IND'));
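Note that this constraint only restricts the allowed values; it does not rewrite 'GBR' to 'IND'. A quick illustration against TEST_A:
INSERT INTO TEST_A (NAME, Country) VALUES ('a', 'GBR');  -- accepted
INSERT INTO TEST_A (NAME, Country) VALUES ('b', 'USA');  -- violates CK_GBR_TO_IND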