Transforming HSTORE array values to a column in JSONB - arrays

I am trying to transform data stored in an HSTORE column ('data') in Postgres.
My row values have a key "entity" whose value is an array:
"entity"=>"[{'id': .............}]
I used the following code:
ALTER TABLE my_table
ALTER COLUMN h_store_column TYPE jsonb
USING hstore_to_jsonb_loose(data -> 'entity');
which resulted in the following value in the new column:
"[{'id': .............}]"
but with surrounding quotes "". This makes the value a scalar string in the JSONB column, which prevents me from querying into it.
How can I change the value of every row into a new JSONB column named 'entity', without the quotes?
[{'id': .............}]
SAMPLE CODE TO GENERATE SIMILAR DATA:
"key" => "[json_text_array]"
stored in hstore data type column.
When converted to JSONB, I get {'key':'[array]'}, whereas I am after {'key': [array]} with no quotes. I tried the *_loose functions in Postgres, with no luck. A sketch to reproduce the data follows below.
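For reproduction, a minimal sketch that creates comparable data (table and column names are made up for illustration):

CREATE EXTENSION IF NOT EXISTS hstore;
CREATE TABLE my_table (h_store_column hstore);
INSERT INTO my_table (h_store_column)
VALUES ('"entity"=>"[{''id'': 1, ''type'': ''demo''}]"');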

From your question, what I understand is that you have an hstore column with a key named entity whose value is a JSON array. Here is an explanation of your problem and its solution:
The ALTER query in your question will throw an error because hstore_to_jsonb_loose accepts an hstore value, but you are passing it text (data -> 'entity' returns text). The correct statement is:
ALTER TABLE my_table
ALTER COLUMN h_store_column TYPE jsonb
USING hstore_to_jsonb_loose(data) -> 'entity';
The query above converts the hstore key/value pairs into a jsonb object, extracts the entity value, and stores it in the column h_store_column.
That is, hstore_to_jsonb_loose converts the data to { "entity": "[{'id':..........}]" }, and -> 'entity' then extracts the value of the key 'entity', which is "[{'id':..........}]".
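You can see this with a quick check (a sketch against the sample data above; jsonb_typeof reports the extracted value as a string, not an array):

SELECT jsonb_typeof(hstore_to_jsonb_loose(h_store_column) -> 'entity') AS value_type
FROM my_table;
-- value_type is 'string'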
You want the value fetched by hstore_to_jsonb_loose(data) -> 'entity' to be stored as a full JSON array. However, the value stored in your hstore column looks like JSON but is not valid JSON: in JSON, keys and string values are surrounded by ", whereas in your string they are surrounded by '. It therefore cannot be stored as an array in a JSONB column as-is.
Assuming there is no other structural problem with the values (other than the '), we can replace ' with " and store the result as JSONB. Try this query:
ALTER TABLE test
ALTER COLUMN h_store_column TYPE jsonb
USING replace(hstore_to_jsonb_loose(data)->>'entity','''','"')::jsonb;
DEMO1
In fact, hstore_to_jsonb_loose is not even required in your case. You can write your ALTER statement as below:
ALTER TABLE test
ALTER COLUMN h_store_column TYPE jsonb
USING replace((data)->'entity','''','"')::jsonb;
DEMO2
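One caveat (my assumption, not something visible in your sample): the simple replace() breaks if any string value legitimately contains an apostrophe. After the conversion you can verify that the column now behaves as a real array, for example:

SELECT elem ->> 'id'
FROM test, jsonb_array_elements(h_store_column) AS elem;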

Related

Postgresql: How to convert JSONb keys into an ENUM list?

I'm currently cleaning up some tables in our application. Some of them have JSONB columns whose values are just true or false for a given key. Given that, I want to get rid of the JSONB and instead have an enum array containing the former keys of my JSONB whose value was true.
For instance, I want to transform this table
CREATE TABLE my_table (
...
my_field JSONB
...
);
to
CREATE TYPE field_enum AS ENUM (
'key1',
'key2',
'key3',
'key4',
'key5',
'key6'
);
CREATE TABLE my_table (
...
my_field field_enum[] DEFAULT NULL,
...
);
And I want the data to be migrated from (for this example)
{"key1": true, "key2": null, "key3": false, "key4": true}
to
['key1','key4']
I'm trying to alter my column type and do the data migration in a single command, which I think should look something like:
ALTER TABLE
my_table
ALTER COLUMN
my_field
TYPE
field_enum
USING __XXXXXX___;
The USING part is where I'm having difficulties. Does anybody have an idea of how I should alter my column without losing data?
I'm also open to creating a new field, populating it with an UPDATE on the table, and renaming it afterwards.
I was thinking of using the jsonb_object_keys function, but it gives me all keys, not just those whose values are true, and it returns a set of records that I don't manage to cast to an enum[].
Going a step further (not mandatory, as I can handle it as a post-processing step), it may happen that a JSON key has to be mapped to an enum key with a different name, say key1 should be converted to NEW_KEY (upper case and renamed). Do you think it's possible to include that in the same PostgreSQL command?
If anybody has an idea of what I could do, I would appreciate any help.
Thanks!
X.
You need a function to convert a JSONB object to your enum array:
create or replace function jsonb_to_field_enums(jsonb)
returns field_enum[] language sql immutable
as $$
select array_agg(key)::field_enum[]
from jsonb_each_text($1)
where value::bool
$$;
db<>fiddle.
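With that function in place, the type change itself should look roughly like this sketch (reusing the table, column and type names from the question):

ALTER TABLE my_table
ALTER COLUMN my_field TYPE field_enum[]
USING jsonb_to_field_enums(my_field);

For the key-renaming part, a CASE expression over key inside the function (for example, mapping 'key1' to its new label) could translate the names before the cast to field_enum[].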

How to change the data type from bytea[] to bytea of column in PostgreSQL?

I am new to PostgreSQL. I want to change the data type of one of my columns from bytea[] to bytea. I tried the following query, but it gives the error cannot cast type bytea[] to bytea:
ALTER TABLE table_name
ALTER COLUMN COLUMN_NAME TYPE bytea USING COLUMN_NAME::bytea;
Does anyone know how to do this?
It depends on how many elements of the array you want to keep. If saving the first element is enough:
alter table table_name alter column_name type bytea using column_name[1];
or you can concatenate the first two elements:
alter table table_name alter column_name type bytea using column_name[1] || column_name[2];
To convert all elements of the original column, you need to create a function that concatenates an unknown number of array elements; a sketch of such a helper follows.
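A possible sketch (the function name and IMMUTABLE marking are my own choices; string_agg has a bytea variant that concatenates the elements in order):

CREATE OR REPLACE FUNCTION bytea_array_concat(bytea[])
RETURNS bytea LANGUAGE sql IMMUTABLE AS $$
  SELECT string_agg(elem, ''::bytea ORDER BY ord)
  FROM unnest($1) WITH ORDINALITY AS t(elem, ord)
$$;

ALTER TABLE table_name
ALTER COLUMN column_name TYPE bytea
USING bytea_array_concat(column_name);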

nullable column with Postgres API

With libpq and the C API, is there a way to get the nullable metadata of a column from a query result?
I'm talking about the property of the column itself, not NULL values in a result set.
For instance, with MySQL:
mysql_fetch_field_direct()
does the job.
I tried with PQfmod, but without success.
Information about the table definition is not part of the result set data.
You can find out whether a value is NULL, but not whether the column is nullable.
A column in a result set need not be related to any particular table column at all!
To get information about a column's definition, query information_schema.columns. Its is_nullable column contains that information.
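For example (schema, table and column names are placeholders):

SELECT is_nullable
FROM information_schema.columns
WHERE table_schema = 'public'
  AND table_name = 'my_table'
  AND column_name = 'my_column';

If you first need to map a result column back to its source table, libpq's PQftable and PQftablecol return the table's OID and the column number within it, which you can then look up in pg_attribute (attnotnull).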

Change Column datatype with existing enum col in laravel

I have a users table in my project. It already has a Role column (enum ['admin', 'student', 'teacher']), and I want to change the designation column ('string'), but as per the Laravel docs we cannot modify columns in a table that has an enum column. What should I do in this case? I am new to Laravel.
Add a new column with data type string, then write a simple query to copy the enum value into the string column:
UPDATE tableName SET columnA = columnB;
Then drop the old column with the enum type.
If you want to use migrations, first create a migration to add the string column, then run a query to copy the data into the string column. Then write another migration to drop the column with the enum type.
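As a sketch in raw SQL, the whole sequence looks roughly like this (table and column names are illustrative, and the rename syntax depends on your database version):

ALTER TABLE users ADD COLUMN role_text VARCHAR(50);
UPDATE users SET role_text = Role;
ALTER TABLE users DROP COLUMN Role;
ALTER TABLE users RENAME COLUMN role_text TO Role;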

Create Null-able numeric column in Derived column component

I use SSIS in SQL Server 2016. I have to create a new column in my Data Flow, and I use the Derived Column component to create it. In this case I need a nullable numeric column, as in the image below:
As you can see, I get an error in this case.
How can I create null-able numeric in Derived column component?
In the derived column task there are NULL Functions that you can use if you would like to set a value to NULL.
Select the appropriate NULL function that corresponds to the datatype of your column and plug that into your equation.
You can use NULL(DT_CY) if your column is defined as currency.
Example: [currencyAmount] > 0 ? [currencyAmount] : NULL(DT_CY)
In the Derived Column editor, the NULL Functions are listed in the function tree of the expression builder.
