ERROR: operator does not exist: jsonb[] -> integer - arrays

select id,rules from links where id=2;
id | rules
----+---------------------------------------------------------------------------------------------------------------------------------------------------------------------------
2 | {"{\"id\": \"61979e81-823b-419b-a577-e2acb34a2f40\", \"url\": \"https://www.wikijob.co.uk/about-us\", \"what\": \"country\", \"matches\": \"GB\", \"percentage\": null}"}
I'm trying to get the elements of the jsonb using the operators here https://www.postgresql.org/docs/9.6/functions-json.html
Whether I use 'url', or an integer as below, I get a similar result.
select id,rules->1 from links where id=2;
ERROR: operator does not exist: jsonb[] -> integer
LINE 1: select id,rules->1 from links where id=2;
^
HINT: No operator matches the given name and argument type(s). You might need to add explicit type casts.
What am I doing wrong?
PS Postgres version 9.6.12.

The column is an array; you can access its first element using an index:
select id, rules[1]
from links
where id = 2
Use jsonb_each() in a lateral join to see all rules in separate rows:
select id, key, value
from links
cross join jsonb_each(rules[1]) as rule(key, value)
where id = 2
You can get a single rule in this way:
select id, value as url
from links
cross join jsonb_each(rules[1]) as rule(key, value)
where id = 2 and key = 'url'
Use unnest() to find an url in all elements of the array, e.g.:
select id, unnest(rules)->'url' as url
from links
where id = 2;
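If you only need the url of the first rule, a shorter sketch (assuming rules really is a jsonb[] column, as the error message suggests) is to index the array and then apply the jsonb operators to that element:
select id, rules[1]->>'url' as url
from links
where id = 2;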

Related

Return Parts of an Array in Postgres

I have a column (text) in my Postgres DB (v.10) with a JSON format.
As far as I know, it has an array format.
Here is a fiddle example: Fiddle
If table1 = persons and change_type = create, then I only want to return the name and firstname concatenated as one field and clear the rest of the text.
Output should be like this:
id table1 did execution_date change_type attr context_data
1 Persons 1 2021-01-01 Create Name [["+","name","Leon Bill"]]
1 Persons 2 2021-01-01 Update Firt_name [["+","cur_nr","12345"],["+","art_cd","1"],["+","name","Leon"],["+","versand_art",null],["+","email",null],["+","firstname","Bill"],["+","code_cd",null]]
1 Users 3 2021-01-01 Create Street [["+","cur_nr","12345"],["+","art_cd","1"],["+","name","Leon"],["+","versand_art",null],["+","email",null],["+","firstname","Bill"],["+","code_cd",null]]
Disassemble the json array into a SETOF using the json_array_elements function, then assemble it back into the structure you want.
select m.*
     , case
         when m.table1 = 'Persons' and m.change_type = 'Create'
           then (
             select '[["+","name",' || to_json(string_agg(a.value->>2, ' ' order by a.value->>1 desc))::text || ']]'
             from json_array_elements(m.context_data::json) a
             where a.value->>1 in ('name','firstname')
           )
         else m.context_data
       end as context_data
from mutations m
modified fiddle
(Notes:
- Relying on the alphabetical ordering of the required field names is a little dirty; an explicit ORDER BY CASE would improve readability (see the sketch below).
- The resulting json is assembled from string literals as much as possible, since you didn't specify whether the "+" should be taken from any of the original array elements.
- The to_json()::text is just for safety against injection.)
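As a sketch of that explicit ordering (the json literal here is just illustrative data in the same [op, field, value] shape as context_data):
select string_agg(a.value->>2, ' '
       order by case a.value->>1 when 'name' then 1 when 'firstname' then 2 end) as name
from json_array_elements('[["+","firstname","Bill"],["+","name","Leon"]]'::json) a
where a.value->>1 in ('name','firstname');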

Case insensitive Postgres query with array contains

I have records which contain an array of tags like these:
id | title | tags
--------+-----------------------------------------------+----------------------
124009 | bridge photo | {bridge,photo,Colors}
124018 | Zoom 5 | {Recorder,zoom}
123570 | Sint et | {Reiciendis,praesentium}
119479 | Architecto consectetur | {quia}
I'm using the following SQL query to fetch a specific record by tags ('bridge', 'photo', 'Colors'):
SELECT "listings".* FROM "listings" WHERE (tags #> ARRAY['bridge', 'photo', 'Colors']::varchar[]) ORDER BY "listings"."id" ASC LIMIT $1 [["LIMIT", 1]]
And this returns a first record in this table.
The problem is that the tags have mixed case, and I would like the query to return the same result if I search for: bridge, photo, colors. Essentially I need to make this search case-insensitive, but I can't find a way to do so with Postgres.
This is the SQL query I've tried which is throwing errors:
SELECT "listings".* FROM "listings" WHERE (LOWER(tags) #> ARRAY['bridge', 'photo', 'colors']::varchar[]) ORDER BY "listings"."id" ASC LIMIT $1
This is the error:
PG::UndefinedFunction: ERROR: function lower(character varying[]) does not exist
HINT: No function matches the given name and argument types. You might need to add explicit type casts.
You can convert the text array elements to lower case like this:
select lower(tags::text)::text[]
from listings;
lower
--------------------------
{bridge,photo,colors}
{recorder,zoom}
{reiciendis,praesentium}
{quia}
(4 rows)
Use this in your query:
SELECT *
FROM listings
WHERE lower(tags::text)::text[] @> ARRAY['bridge', 'photo', 'colors']
ORDER BY id ASC;
id | title | tags
--------+--------------+-----------------------
124009 | bridge photo | {bridge,photo,Colors}
(1 row)
You can't apply LOWER() to an array directly, but you can unpack the array, apply it to each element, and reassemble it when you're done:
... WHERE ARRAY(SELECT LOWER(UNNEST(tags))) @> ARRAY['bridge', 'photo', 'colors']
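Dropped into the query from the question, that fragment would look roughly like this (a sketch against the same listings table):
SELECT listings.*
FROM listings
WHERE ARRAY(SELECT LOWER(UNNEST(tags))) @> ARRAY['bridge', 'photo', 'colors']
ORDER BY listings.id ASC
LIMIT 1;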
You could also install the citext (case-insensitive text) module; if you declare listings.tags as type citext[], your query should work as-is.
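A minimal sketch of that citext route, assuming the extension is available on the server and that listings.tags is currently varchar[]:
CREATE EXTENSION IF NOT EXISTS citext;
-- convert the column; the USING clause casts the existing values element by element
ALTER TABLE listings ALTER COLUMN tags TYPE citext[] USING tags::text[]::citext[];
-- citext compares case-insensitively, so the containment query works without lower()
SELECT * FROM listings WHERE tags @> ARRAY['bridge', 'photo', 'colors']::citext[];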

How to delete array element in JSONB column based on nested key value?

How can I remove an object from an array, based on the value of one of the object's keys?
The array is nested within a parent object.
Here's a sample structure:
{
"foo1": [ { "bar1": 123, "bar2": 456 }, { "bar1": 789, "bar2": 42 } ],
"foo2": [ "some other stuff" ]
}
Can I remove an array element based on the value of bar1?
I can query based on the bar1 value using: columnname @> '{ "foo1": [ { "bar1": 123 } ]}', but I've had no luck finding a way to remove { "bar1": 123, "bar2": 456 } from foo1 while keeping everything else intact.
Thanks
Running PostgreSQL 9.6
Assuming that you want to search for a specific object with an inner object of a certain value, and that this specific object can appear anywhere in the array, you need to unpack the document and each of the arrays, test the inner sub-documents for containment and delete as appropriate, then re-assemble the array and the JSON document (untested):
SELECT id, jsonb_object_agg(key, jarray)
FROM (
    SELECT foo.id, foo.key, jsonb_agg(bar.value) AS jarray
    FROM ( SELECT id, key, value
           FROM my_table, jsonb_each(jdoc) ) foo,
         jsonb_array_elements(foo.value) AS bar (value)
    WHERE NOT bar.value @> '{"bar1": 123}'::jsonb
    GROUP BY 1, 2 ) x
GROUP BY 1;
Now, this may seem a little dense, so picked apart you get:
SELECT id, key, value
FROM my_table, jsonb_each(jdoc)
This uses a lateral join on your table to take the JSON document jdoc and turn it into a set of rows foo(id, key, value) where the value contains the array. The id is the primary key of your table.
Then we get:
SELECT foo.id, foo.key, jsonb_agg(bar.value) AS jarray
FROM foo, -- abbreviated from above
     jsonb_array_elements(foo.value) AS bar (value)
WHERE NOT bar.value @> '{"bar1": 123}'::jsonb
GROUP BY 1, 2
This uses another lateral join to unpack the arrays into bar(value) rows. These objects can now be tested with the containment operator to remove the matching objects from the result set: WHERE NOT bar.value @> '{"bar1": 123}'::jsonb. In the select list the arrays are re-assembled with jsonb_agg() by id and key, but now without the offending sub-documents.
Finally, in the main query the JSON documents are re-assembled:
SELECT id, jsonb_object_agg(key, jarray)
FROM x -- from above
GROUP BY 1;
The important thing to understand is that PostgreSQL JSON functions only operate on the level of the JSON document that you can explicitly indicate. Usually that is the top level of the document, unless you have an explicit path to some level in the document (like {foo1, 0, bar1}, but you don't have that). At that level of operation you can then unpack to do your processing such as removing objects.
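If the array in your document always sits under the fixed key foo1, as in the sample structure, a simpler sketch is to rebuild just that array with jsonb_set() and an aggregate (table and column names here are assumptions):
UPDATE my_table
SET jdoc = jsonb_set(jdoc, '{foo1}',
        (SELECT COALESCE(jsonb_agg(elem), '[]'::jsonb)
         FROM jsonb_array_elements(jdoc -> 'foo1') AS elem
         WHERE NOT elem @> '{"bar1": 123}'::jsonb))
WHERE jdoc -> 'foo1' @> '[{"bar1": 123}]'::jsonb;
The final WHERE limits the update to rows that actually contain the offending element.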

Double-JOIN in web2py. How to apply this SQL in web2py?

In my image-labeling app I want to display a list of the labels assigned to an image, along with the label type and the user who assigned each label.
The main table is imageLabel. The tables auth_user and labelType contain additional information to the label.
Could you help me convert this SQL statement to web2py:
SELECT labelType.name, imageLabel.labelValue, auth_user.first_name, auth_user.last_name from imageLabel
LEFT JOIN labelType
ON imageLabel.labelId = labelType.id
LEFT JOIN auth_user
ON imageLabel.userId = auth_user.id
WHERE imageLabel.imageId = 581
ORDER BY labelType.name
The data model is like this:
db.define_table('labelType',
    Field('name', 'string'),
)
db.define_table('imageLabel',
    Field('imageId', 'string'),
    Field('labelId', 'integer'),
    Field('userId', 'integer'),
    Field('labelValue', 'text'),
)
db.define_table('image',
    Field('imageId', 'string')
)
# + built-in auth tables
My attempt was:
labels = db((db.imageLabel.id == db.labelType.id)).select(
    db.imageLabel.labelValue, db.labelType.name, db.auth_user.first_name,
    db.auth_user.last_name, db.imageLabel.labelTimeStamp,
    left=db.auth_user.on(db.imageLabel.userId == db.auth_user.id)
)
Which obviously doesn't work, as there is no WHERE imageLabel.imageId = 581 part in the code. And I can't figure out how to use WHERE along with two "JOINs" in web2py :-(
Thank you very much in advance for any help!
EDIT: SOLUTION, after reading the answer from Anthony:
labels = db(
    db.imageLabel.imageId == img.id
).select(
    db.imageLabel.labelValue,
    db.labelType.name,
    db.auth_user.first_name,
    db.auth_user.last_name,
    db.imageLabel.labelTimeStamp,
    left=[
        db.labelType.on(db.imageLabel.labelId == db.labelType.id),
        db.auth_user.on(db.imageLabel.userId == db.auth_user.id)
    ],
    orderby=~db.labelType.name
)
In general, a select looks like db(query).select(...). The query part of that represents the WHERE clauses. If you have multiple WHERE clauses, you just use the & operator:
db((condition 1) & (condition 2) & ...)
As for left joins, the left argument of the .select() method can be a list if you need to specify multiple left joins:
left=[db.auth_user.on(db.imageLabel.userId == db.auth_user.id),
      db.labelType.on(db.imageLabel.labelId == db.labelType.id)]
However, it's not clear that you really want left joins here -- you might just want inner joins (which can be specified using the join argument to the .select() method, or more simply as conditions in the query):
db((db.imageLabel.labelId == db.labelType.id) &  # joins imageLabel and labelType
   (db.imageLabel.userId == db.auth_user.id) &   # joins imageLabel and auth_user
   (db.imageLabel.imageId == 581))\
  .select(db.imageLabel.labelValue, db.labelType.name, db.auth_user.first_name,
          db.auth_user.last_name, db.imageLabel.labelTimeStamp)
Also, you should probably specify the three "Id" fields as reference type fields:
db.define_table('imageLabel',
    Field('imageId', 'reference image'),
    Field('labelId', 'reference labelType'),
    Field('userId', 'reference auth_user'),
    Field('labelValue', 'text'),
)
Finally, why do you need an imageId string field? The db.image table will already have an auto-incrementing integer "id" field to serve as a primary key for the table and uniquely identify each image.

Query with integers not working

I've been searching here on Stack Overflow and other sources but have not found a solution to this.
The query below works as expected, except when either custinfo.cust_cntct_id or custinfo.cust_corrcntct_id = '' (blank, not NULL); then I get no results. Both are integer fields, and if both have an integer value then I get results. I still want a value returned for either cntct_email or corrcntct_email even if custinfo.cust_cntct_id or custinfo.cust_corrcntct_id is blank.
Can someone help me out in making this work? The database is PostgreSQL.
SELECT
    cntct.cntct_email AS cntct_email,
    corrcntct.cntct_email AS corrcntct_email
FROM
    public.custinfo,
    public.invchead,
    public.cntct,
    public.cntct corrcntct
WHERE
    invchead.invchead_cust_id = custinfo.cust_id AND
    cntct.cntct_id = custinfo.cust_cntct_id AND
    corrcntct.cntct_id = custinfo.cust_corrcntct_id;
PostgreSQL won't actually let you test an integer field for a blank value (unless you're using a truly ancient version - 8.2 or older), so you must be using a query generator that's "helpfully" transforming '' to NULL or a tool that's ignoring errors.
Observe this, on Pg 9.2:
regress=> CREATE TABLE test ( a integer );
CREATE TABLE
regress=> insert into test (a) values (1),(2),(3);
INSERT 0 3
regress=> SELECT a FROM test WHERE a = '';
ERROR: invalid input syntax for integer: ""
LINE 1: SELECT a FROM test WHERE a = '';
If you are attempting to test for = NULL, this is not correct. You must use IS NULL (or IS NOT DISTINCT FROM NULL) instead. Testing for = NULL always results in NULL, which is treated as false in a WHERE clause.
Example:
regress=> insert into test (a) values (null);
INSERT 0 1
regress=> SELECT a FROM test WHERE a = NULL;
a
---
(0 rows)
regress=> SELECT a FROM test WHERE a IS NULL;
a
---
(1 row)
regress=> SELECT NULL = NULL as wrong, NULL IS NULL AS right;
wrong | right
-------+-------
| t
(1 row)
By the way, you should really be using ANSI JOIN syntax. It's more readable, and it's much harder to forget to put a join condition in and get a cartesian product by accident. I'd rewrite your query for identical functionality and performance but better readability as:
SELECT
    cntct.cntct_email AS cntct_email,
    corrcntct.cntct_email AS corrcntct_email
FROM
    public.custinfo ci
    INNER JOIN public.invchead
        ON (invchead.invchead_cust_id = ci.cust_id)
    INNER JOIN public.cntct
        ON (cntct.cntct_id = ci.cust_cntct_id)
    INNER JOIN public.cntct corrcntct
        ON (corrcntct.cntct_id = ci.cust_corrcntct_id);
Use of table aliases usually keeps it cleaner; here I've aliased the longer name custinfo to ci for brevity.
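Since you said you still want a row even when one of the contact ids is missing, a hedged sketch of the same query with LEFT JOINs on the two contact lookups (same schema as above) would be:
SELECT
    cntct.cntct_email AS cntct_email,
    corrcntct.cntct_email AS corrcntct_email
FROM
    public.custinfo ci
    INNER JOIN public.invchead
        ON (invchead.invchead_cust_id = ci.cust_id)
    LEFT JOIN public.cntct
        ON (cntct.cntct_id = ci.cust_cntct_id)
    LEFT JOIN public.cntct corrcntct
        ON (corrcntct.cntct_id = ci.cust_corrcntct_id);
With LEFT JOINs the row is kept and the corresponding email simply comes back as NULL when the contact id is not set.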
