I have sample data as follows:

team_id | mode
--------+-------
123     | [1,2]

Here mode is an array. The goal is to replace the values in the mode column with literal values, such that 1 stands for Ocean and 2 stands for Air.
Expected Output

team_id | mode
--------+--------------
123     | [Ocean,Air]
Present Approach

As an attempt, I tried to first flatten the data into multiple rows:

team_id | mode
--------+------
123     | 1
123     | 2
Then we can define a new column assigning literal values to the mode column using a case statement, followed by aggregating the values into an array to get the desired output.
Can I get some help here to do the replacement directly in the array? Thanks in advance.
Using FLATTEN and ARRAY_AGG:
CREATE OR REPLACE TABLE tab(team_id INT, mode ARRAY) AS SELECT 123, [1,2];
SELECT TEAM_ID,
       ARRAY_AGG(CASE f.value::TEXT
                   WHEN '1' THEN 'Ocean'
                   WHEN '2' THEN 'Air'
                   ELSE 'Unknown'
                 END) WITHIN GROUP(ORDER BY f.index) AS new_mode
FROM tab
    ,LATERAL FLATTEN(tab.mode) AS f
GROUP BY TEAM_ID;
Output:
TEAM_ID | NEW_MODE
--------+---------------------
123     | [ "Ocean", "Air" ]
For an alternative solution with easy array manipulation, you could create a JS UDF:
create or replace function replace_vals_in_array(A variant)
returns variant
language javascript
as $$
  // lookup table: old value -> new value
  var dict = {1:'a', 2:'b', 3:'c', 4:'d'};
  return A.map(x => dict[x]);
$$;
Then to update your table:
update arrs
set arr = replace_vals_in_array(arr);
Example setup:
create or replace temp table arrs as (
select 1 id, [1,2,3] arr
union all select 2, [2,4]
);
select *, replace_vals_in_array(arr)
from arrs;
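Adapted to the question's Ocean/Air data, only the dictionary changes. A minimal sketch (the name replace_modes_in_array and the 'Unknown' fallback are my own assumptions, not part of the answer above):

create or replace function replace_modes_in_array(A variant)
returns variant
language javascript
as $$
  // hypothetical mapping for the question's mode codes
  var dict = {1: 'Ocean', 2: 'Air'};
  return A.map(x => dict[x] || 'Unknown');
$$;

select team_id, replace_modes_in_array(mode) as new_mode
from tab;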
I have a table that contains a json array. Here is a sample of the contents of the field, from:
SELECT json_array FROM table LIMIT 5;
Result:
[{"key1":"value1"}, {"key1":"value2"}, ..., {"key2":"value3"}]
[]
[]
[]{"key1":"value1"}
[]
How can I retrieve all the values and count how many of each value was found?
I am using PostgreSQL 9.5.14, and I have tried the solutions here Querying a JSON array of objects in Postgres
and the ones suggested to me by another generous stackoverflow user in my last question: How can I parse JSON arrays in postgresql?
I tried:
SELECT
value -> 'key1'
FROM
table,
json_array_elements(json_array);
which sadly does not work for me due to receiving the error: cannot call json_array_elements on a scalar
This error happens when using a query that returns more than one row or more than one column as a scalar subquery.
Another solution I tried was:
SELECT json_array as json, (json_array->0),
coalesce(
case
when (json_array->0) IS NULL then null
else (json_array->0->>'key1')
end,
'No value') AS "Value"
FROM table;
which only returned null values for "Value".
Referencing Querying a JSON array of objects in Postgres I attempted to use this solution as well:
WITH json_test (col) AS (
values (json_arrays)
)
SELECT
y.x->'key1' "key1"
FROM json_test jt,
LATERAL (SELECT json_array_elements(jt.col) x) y;
But I would need to be able to fit all the elements of the json_arrays into json_test.
So far I have only attempted to list all the values in all the json arrays, but my ideal end-result for the query resembles this:
Value | Amount
---------------
value1 | 48
value2 | 112
value3 | 93
value4 | 0
Yet again I am grateful for any help with this, thank you in advance.
step-by-step demo: db<>fiddle
SELECT
each.value,
COUNT(*)
FROM
data,
json_array_elements(json_array) elems, -- 1
json_each_text(elems) each -- 2
GROUP BY each.value -- 3
1. Expand the array into one row for each array element
2. Split the key/value pairs into two columns
3. Group by the new value column and count
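The expected output above also lists value4 with a count of 0. A grouped count can only report values that actually occur, so zero rows require enumerating the candidate values yourself and joining against them. A sketch, reusing the data table and json_array column from the query above (the VALUES list is just an assumption):

SELECT v.val AS value,
       COUNT(found.value) AS amount
FROM (VALUES ('value1'), ('value2'), ('value3'), ('value4')) v(val)
LEFT JOIN (
    SELECT each.value
    FROM data,
         json_array_elements(json_array) elems,
         json_each_text(elems) each
) found ON found.value = v.val
GROUP BY v.val;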
Datatype:
id: int4
keywords: text
objectivable_id: int4
Postgres version: PostgreSQL 9.5.3
Business_objectives table:
id keywords objectivable_id
1 keyword1a,keyword1b,keyword1c 6
2 keyword2a 6
3 testing 5
Currently the query I'm using is:
select array(select b.keywords from business_objectives b where b.objectivable_id = 6)
It selects the keywords of matched objectivable_id as:
{"keyword1a,keyword1b,keyword1c","keyword2a"}
Here I wanted the result to be:
{"keyword1a","keyword1b","keyword1c","keyword2a"}
I tried using "string_agg(text, delimiter)", but it just combines all the keywords into one single element of an array.
You can simply (and cheaply!) use:
SELECT string_to_array(string_agg(keywords, ','), ',')
FROM business_objectives
WHERE objectivable_id = 6;
Concatenate your comma-separated lists with string_agg(), and then convert the complete text to an array with string_to_array().
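To make the two steps concrete with the sample rows for objectivable_id = 6 (intermediate value shown for illustration only):

-- step 1: string_agg() glues the two keyword lists into one text value:
--   'keyword1a,keyword1b,keyword1c,keyword2a'
-- step 2: string_to_array() splits that text on the commas:
SELECT string_to_array('keyword1a,keyword1b,keyword1c,keyword2a', ',');
-- {keyword1a,keyword1b,keyword1c,keyword2a}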
So something like this can give you the expected result:
SELECT array_agg( j.keys )
FROM business_objectives b,
LATERAL ( SELECT k
FROM unnest ( string_to_array( b.keywords, ',' ) ) u( k )
) j( keys )
WHERE b.objectivable_id = 6;
array_agg
-------------------------------------------
{keyword1a,keyword1b,keyword1c,keyword2a}
(1 row)
With the LATERAL part, the subquery can reference the preceding FROM item. Put simply, it splits your keywords into a set of rows, which you can then feed into the array_agg() function.
See more about LATERAL: https://www.postgresql.org/docs/9.6/static/queries-table-expressions.html#QUERIES-LATERAL
I have a scenario where I need to convert a JSON array into a Postgres int array and query it for the result. Below is my data:
ID DATA
1 {"bookIds" : [1,2,3,5], "storeIds": [2,3]}
2 {"bookIds" : [4,5,6,7], "storeIds": [1,3]}
3 {"bookIds" : [11,12,10,9], "storeIds": [4,3]}
I want to convert the bookIds array into an int array and query it later. Is it possible in Postgres 9.3? I know 9.4+ provides much more JSON support, but I can't upgrade my db at the moment.
The below query gives me an error:
Select data::json->>'bookIds' :: int[] from table
ERROR: malformed array literal: "bookIds"
LINE 1: Select data::json->>'bookIds' :: int[] from table
Is it possible to query elements inside a JSON array in Postgres 9.3? Thanks in advance.
The setup in the question should look like this:
create table a_table (id int, data json);
insert into a_table values
(1, '{"bookIds": [1,2,3,5], "storeIds": [2,3]}'),
(2, '{"bookIds": [4,5,6,7], "storeIds": [1,3]}'),
(3, '{"bookIds": [11,12,10,9], "storeIds": [4,3]}');
Note the proper syntax of json values.
You can use the function json_array_elements()
select id, array_agg(e::text::int)
from a_table, json_array_elements(data->'bookIds') e
group by 1
order by 1;
id | array_agg
----+--------------
1 | {1,2,3,5}
2 | {4,5,6,7}
3 | {11,12,10,9}
(3 rows)
Use any() to search for an element in the arrays, e.g.:
select *
from (
select id, array_agg(e::text::int) arr
from a_table, json_array_elements(data->'bookIds') e
group by 1
) s
where
1 = any(arr) or
11 = any(arr);
id | arr
----+--------------
1 | {1,2,3,5}
3 | {11,12,10,9}
(2 rows)
Read also about the <@ operator.
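For example, the filter above can be written with <@ ("is contained by") instead of any(); a sketch against the same derived table:

select *
from (
    select id, array_agg(e::text::int) arr
    from a_table, json_array_elements(data->'bookIds') e
    group by 1
) s
where array[1] <@ arr
   or array[11] <@ arr;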
You can also search in the json array (without converting it to an int array) by examining its elements, e.g.:
select t.*
from a_table t, json_array_elements(data->'bookIds') e
where e::text::int in (1, 11);
id | data
----+-----------------------------------------------
1 | {"bookIds" : [1,2,3,5], "storeIds": [2,3]}
3 | {"bookIds" : [11,12,10,9], "storeIds": [4,3]}
(2 rows)
These two functions (for json/jsonb), modified from a fantastic answer to this question, work perfectly:
CREATE OR REPLACE FUNCTION json_array_castint(json) RETURNS int[] AS $f$
SELECT array_agg(x)::int[] || ARRAY[]::int[] FROM json_array_elements_text($1) t(x);
$f$ LANGUAGE sql IMMUTABLE;
CREATE OR REPLACE FUNCTION jsonb_array_castint(jsonb) RETURNS int[] AS $f$
SELECT array_agg(x)::int[] || ARRAY[]::int[] FROM jsonb_array_elements_text($1) t(x);
$f$ LANGUAGE sql IMMUTABLE;
You can use them as follows:
SELECT json_array_castint('[1,2,3]')
Which gives the expected return {1,2,3} as an integer[]. If you wonder why I'm concatenating with an empty array in each SELECT statement, it's because the cast is lossy: without it, if you try to cast an empty json/jsonb array to an integer[], you'll get no return (not desired) instead of an empty array (as expected). With the above method, when you do
SELECT json_array_castint('[]')
You'll get {} instead of nothing. See here for more on why I added that.
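A quick way to see the difference (the first query is the function body without the concatenation, the second shows why appending an empty array repairs it):

-- aggregating over zero rows yields NULL, not an empty array:
SELECT array_agg(x)::int[] FROM json_array_elements_text('[]'::json) t(x);
-- concatenating NULL with an empty array yields an empty array:
SELECT NULL::int[] || ARRAY[]::int[];   -- {}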
In my case I had to cast JSON data stored in a table column to a pg array format, and this was handy:
-- username is the table column, which has values like ["john","pete","kat"]
select id, ARRAY(SELECT json_array_elements_text((username)::json)) usernames
from public.table-name;
-- this produces : {john,pete,kat}
I would go a bit simpler:
select * from
(
select t.id, value::text::int as bookvalue
from testjson t, json_array_elements(t.data->'bookIds')
) as t
where bookvalue in (1,11)
See it working here: http://sqlfiddle.com/#!15/e69aa/37
I have data like below.
id col1[]
--- ------
1 {1,2,3}
2 {3,4,5}
My question is how to use the replace function on arrays.
select array_replace(col1, 1, 100) where id = 1;
but it gives an error like:
function array_replace(integer[], integer, integer) does not exist
Can anyone suggest how to use it?
Your statement (augmented with the missing FROM clause):
SELECT array_replace(col1, 1, 100) FROM tbl WHERE id = 1;
As commented by @mu, array_replace() was introduced with pg 9.3. I see 3 options for older versions:
1. intarray
As long as ...
- we are dealing with integer arrays,
- elements are unique,
- and the order of elements is irrelevant.
A simple and fast option would be to install the additional module intarray, which (among other things) provides operators to subtract and add elements (or whole arrays) from/to integer arrays:
SELECT CASE WHEN col1 && '{1}'::int[] THEN (col1 - 1) + 100 ELSE col1 END AS col1
FROM tbl WHERE id = 1;
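intarray ships with the standard contrib packages; on 9.1 or later, installing it into the database is a single statement (assuming sufficient privileges):

CREATE EXTENSION IF NOT EXISTS intarray;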
2. Emulate with SQL functions
A (slower) drop-in replacement for array_replace() using polymorphic types, so it works for any base type:
CREATE OR REPLACE FUNCTION f_array_replace(anyarray, anyelement, anyelement)
RETURNS anyarray LANGUAGE SQL IMMUTABLE AS
'SELECT ARRAY (SELECT CASE WHEN x = $2 THEN $3 ELSE x END FROM unnest($1) x)';
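Usage then mirrors the built-in call from the top of this answer, e.g.:

SELECT f_array_replace(col1, 1, 100) FROM tbl WHERE id = 1;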
Does not replace NULL values. Related:
Replace NULL values in an array in PostgreSQL
If you need to guarantee the order of elements:
PostgreSQL unnest() with element number
3. Apply patch to source and recompile
Get the patch "Add array_remove() and array_replace() functions" from the git repo, apply it to the source of your version and recompile. May or may not apply cleanly. The older your version the worse are your chances. I have not tried that, I would rather upgrade to the current version.
You can create your own, based on this source:
CREATE TABLE arr(id int, col1 int[]);
INSERT INTO arr VALUES (1, '{1,2,3}');
INSERT INTO arr VALUES (2, '{3,4,5}');
SELECT array(
SELECT CASE WHEN q = 1 THEN 100 ELSE q END
FROM UNNEST(col1::int[]) q)
FROM arr;
array
-----------
{100,2,3}
{3,4,5}
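To actually persist the replacement, the same expression can be written back with an UPDATE; a sketch against the arr table created above:

UPDATE arr
SET col1 = array(
    SELECT CASE WHEN q = 1 THEN 100 ELSE q END
    FROM UNNEST(col1::int[]) q)
WHERE id = 1;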
You can create your own function and put it in your public schema if you still want to call it as a function, though it will be slightly different from the original.
UPDATE tbl SET col1 = array_replace(col1, 1, 100) WHERE id = 1;
Here is the sample query for a test-array:
SELECT test_id,
test_array,
(array (
-- Replace existing value 'int' of an array with given value 'Text'
SELECT CASE WHEN a = '0' THEN 'MyEntry'
WHEN a = '1' THEN 'Apple'
WHEN a = '2' THEN 'Banana'
WHEN a = '3' THEN 'ChErRiEs'
WHEN a = '4' THEN 'Dragon Fruit'
WHEN a = '5' THEN 'Eat a Fruit in a Day'
ELSE 'NONE' END
FROM UNNEST(test_array::TEXT[]) a) ::TEXT
-- UNNEST : Lists out values of my_test_array
) test_result
FROM (
--my_test_array
SELECT 1 test_id, '{0,1,2,3,4,5,6,7,8,9}'::TEXT[][] test_array
) test;
In PostgreSQL 9.3, I have a table like this
id | array_json
---+----------------------------
1 | ["{123: 456}", "{789: 987}", "{111: 222}"]
2 | ["{4322: 54662}", "{123: 5121}", "{1: 5345}" ... ]
3 | ["{3232: 413}", "{5235: 22}", "{2: 5453}" ... ]
4 | ["{22: 44}", "{12: 4324}", "{234: 4235}" ... ]
...
I want to get the sum of all values in array_json column. So, for example, for first row, I want:
id | total
---+-------
1 | 1665
Where 1665 = 456 + 987 + 222 (the values of all the elements of the json array). I have no previous information about the keys of the json elements (they are just random numbers).
I'm reading the documentation page about JSON functions in PostgreSQL 9.3, and I think I should use json_each, but can't find the right query. Could you please help me with it?
Many thanks in advance
You started looking at the right place (going to the docs is always the right place).
Since your values are JSON arrays, I would suggest using json_array_elements(json).
And since it's a json array that you have to explode into several rows and then combine back by running sum() over json_each_text(json), it would be best to create your own function (Postgres allows it).
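If the array elements were valid JSON objects (e.g. [{"123": 456}, {"789": 987}]), no string surgery would be needed; a sketch along those lines, assuming a table json_test_table with columns id and json_array of type json:

SELECT t.id, sum(kv.value::int) AS total
FROM json_test_table t,
     json_array_elements(t.json_array) elem,
     json_each_text(elem) kv
GROUP BY t.id;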
As for your specific case, assuming the structure you provided is correct, some string parsing + JSON-heavy wizardry can be used (let's say your table name is "json_test_table" and the columns are "id" and "json_array"); here is the query that does your "thing":
select id, sum(val) from
(select id,
substring(
json_each_text(
replace(
replace(
replace(
replace(
replace(json_array,':','":"')
,'{',''),
'}','')
,']','}')
,'[','{')::json)::varchar
from '\"(.*)\"')::int as val
from json_test_table) j group by id ;
If you plan to run it on a huge dataset, keep in mind that string manipulations are expensive in terms of performance.
You can get it using this:
/*
Sorry, sqlfiddle is busy :p
CREATE TABLE my_table
(
id bigserial NOT NULL,
array_json json[]
--,CONSTRAINT my_table_pkey PRIMARY KEY (id)
)
INSERT INTO my_table(array_json)
values (array['{"123": 456}'::json, '{"789": 987}'::json, '{"111": 222}'::json]);
*/
select id, sum(json_value::integer)
from
(
select id, json_data->>json_object_keys(json_data) as json_value from
(
select id, unnest(array_json) as json_data from my_table
) A
) B
group by id
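With the sample row inserted above, this should return id 1 with a sum of 1665 (456 + 987 + 222), matching the expected output in the question:

 id | sum
----+------
  1 | 1665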