Postgres SQL: query a field with JSON data - arrays

We have a field in a table that contains a JSON object. There are array objects inside the JSON structure, and we need to extract values specifically from the arrays. Below is sample data from the JSON field:
children:[{server:,children:[{server:,newPage:,menuType:3,label:File Tax Forms,url:\/dashboard\/fileTaxForm},{server:,newPage:,menuType:3,label:View Filed Forms,url:dashboard\/viewFiledForms}],
Can you please shed some light on how to extract the values from the above JSON object via a SQL query?
Any help on this is much appreciated.
Thanks in advance

When the JSON was added to your question, it seems to have lost its quotes.
If the JSON is inserted as follows
INSERT INTO test VALUES ('{"children":[{"server":"","children":[{"server":"","newPage":"","menuType":3,"label":"File Tax Forms","url":"dashboard/fileTaxForm"},{"server":"","newPage":"","menuType":3,"label":"View Filed Forms","url":"dashboard/viewFiledForms"}]}]}');
into a table defined as follows:
CREATE TABLE test (somejson json);
then you could use a query similar to the following (the -> operator returns a JSON value; using ->> for the final key would return unquoted text instead):
SELECT somejson::json->'children'->0->'children'->0->'label' AS label,
       somejson::json->'children'->0->'children'->0->'url'   AS url
FROM test;
to get a result:
label | url
------------------+-------------------------
"File Tax Forms" | "dashboard/fileTaxForm"

Related

Big query nested json strings to arrays then new tables

I've got a webhook that's pushing to my BigQuery table. The problem is that it has lots of nested JSON strings, which are ingested as strings. I ultimately want to turn each column holding these JSON strings into its own table, but I'm stuck because I can't figure out how to unnest them into an array.
[{"id":"63bddc8cfe21ec002d26b7f4","description":"General Admission", "src_currency":"USD","src_price":50.0,"src_fee":0.0,"src_commission":1.79,"src_discount":0.0,"applicable_pass_id":null,"seats_label":null,"seats_section_label":null,"seats_parent_type":null,"seats_parent_label":null,"seats_self_type":null,
"seats_self_label":null,"rate_type":"Rate","option_name":null,"price_level_id":null,"src_discount_price":50.0,"rate_id":"636d6d5cea8c6000222c640d","cost_item_id":"63bddc8cfe21ec002d26b7f4"}]
Here's the sample return from the original source; the original question also included a screenshot of the current database table.
I've tried a number of things, but the multiple nestings and the string-to-array issue are really hampering everything I've tried.
I'm honestly not sure exactly what output/structure is best for this data set. I assume each of the JSON returns probably just needs to be its own table, which I can reference or join based on that first "id" value in the JSON strings, but I'm wide open to suggestions.
You can use a combination of JSON functions and array functions to manipulate this kind of data.
JSON_EXTRACT_ARRAY converts the JSON-formatted string into an array, UNNEST then turns each entry into a row, and finally JSON_EXTRACT_SCALAR pulls out individual columns.
So here's an example of what I think you're trying to accomplish:
with sampledata as (
select """[{"id":"63bddc8cfe21ec002d26b7f4","description":"General Admission", "src_currency":"USD","src_price":50.0,"src_fee":0.0,"src_commission":1.79,"src_discount":0.0,"applicable_pass_id":null,"seats_label":null,"seats_section_label":null,"seats_parent_type":null,"seats_parent_label":null,"seats_self_type":null,"seats_self_label":null,"rate_type":"Rate","option_name":null,"price_level_id":null,"src_discount_price":50.0,"rate_id":"636d6d5cea8c6000222c640d","cost_item_id":"63bddc8cfe21ec002d26b7f4"},{"id":"63bddc8cfe21ec002d26b7f4","description":"General Admission", "src_currency":"USD","src_price":50.0,"src_fee":0.0,"src_commission":1.79,"src_discount":0.0,"applicable_pass_id":null,"seats_label":null,"seats_section_label":null,"seats_parent_type":null,"seats_parent_label":null,"seats_self_type":null,"seats_self_label":null,"rate_type":"Rate","option_name":null,"price_level_id":null,"src_discount_price":50.0,"rate_id":"636d6d5cea8c6000222c640d","cost_item_id":"63bddc8cfe21ec002d26b7f4"}]""" as my_json_string
)
select
  JSON_EXTRACT_SCALAR(f, '$.id') as id,
  JSON_EXTRACT_SCALAR(f, '$.rate_type') as rate_type,
  JSON_EXTRACT_SCALAR(f, '$.cost_item_id') as cost_item_id
from sampledata, UNNEST(JSON_EXTRACT_ARRAY(my_json_string)) as f
Which creates rows with specific columns from that data (the original answer showed a screenshot of the result).
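For intuition, the same flatten-and-extract can be sketched outside BigQuery with Python's stdlib json module (using a trimmed version of the sample payload; the BigQuery functions above are the real answer):

```python
import json

# Trimmed version of the webhook payload: a JSON-formatted string holding an array.
sample = ('[{"id":"63bddc8cfe21ec002d26b7f4","description":"General Admission",'
          '"rate_type":"Rate","cost_item_id":"63bddc8cfe21ec002d26b7f4"}]')

# json.loads plays the role of JSON_EXTRACT_ARRAY + UNNEST: one dict per element.
rows = [
    {"id": r["id"], "rate_type": r["rate_type"], "cost_item_id": r["cost_item_id"]}
    for r in json.loads(sample)
]
print(rows)
```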

Change DateFormat in Logic Apps

There is a date field in my Logic App that I am getting from the FinOps connector.
The field is assigned in JSON.
After parsing the JSON and creating a CSV table, it is assigned like this (the original question showed screenshots of these steps).
Is there a way for me to format the date using formatDateTime in any of the steps above?
Thanks,
Vivek
I suspect you may have an issue with null values in your array.
You need to check every item and make sure the invoiceDate field contains a valid value.
Something like this will help you if you don't filter them out ...
if(equals(item()['invoiceDate'],null),'',formatDateTime(item()['invoiceDate'], 'dd.MM.yyyy'))
... but you will need to decide on the business logic with those items that do have a null invoice date.
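The logic of that expression can be sketched in Python for clarity. The field name comes from the expression above; the sample date is an invented illustration, and the Logic Apps expression is what you would actually use:

```python
from datetime import datetime

def format_invoice_date(item):
    # Mirrors: if(equals(item()['invoiceDate'], null), '', formatDateTime(..., 'dd.MM.yyyy'))
    raw = item.get("invoiceDate")
    if raw is None:
        return ""
    return datetime.fromisoformat(raw).strftime("%d.%m.%Y")

print(format_invoice_date({"invoiceDate": "2023-01-15T00:00:00"}))  # 15.01.2023
print(format_invoice_date({"invoiceDate": None}))                   # empty string
```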

Best data format to load into snowflake table from MySQL (JSON or XML for a column)

I need to migrate an Aurora MySQL table that has 3 billion rows and 7 TB of storage.
I am new to Snowflake, so I need some suggestions about the file format.
The data types are simple except for two columns, which are longtext and store XML data in the MySQL table.
This is my MySQL table structure
CREATE TABLE `app_event` (
`ID` varchar(255) NOT NULL,
`DETAILS` longtext,
`OBJECT` varchar(255) NOT NULL,
`DATE_TIME` datetime(6) DEFAULT NULL,
`SUMMARY` varchar(4000) DEFAULT NULL
);
The DETAILS and SUMMARY columns currently store XML.
A sample row is:
6tgbcr45345o82mcrzz,<?xml version="1.0" encoding="UTF-8" standalone="yes"?><CasePayload><caseId>5f5475-21cf-4c7e-8071-574a1ef78981</caseId><providerTypes>WATCHLIST</providerTypes> ,9/16/2020 9:45,<?xml version="1.0" encoding="UTF-8" standalone="yes"?><CaseEventSummary><providerTypes>WATCHLIST</providerTypes>
This is just sample data; while moving it, we need to convert the XML to JSON and store it in the Snowflake table.
We are thinking of using Kinesis for this, and when Kinesis delivers data into S3 it converts it to JSON.
So the question is: what is the best data format to load into Snowflake?
Shall we convert to CSV, turning the XML into JSON on the fly and storing it in the CSV, or is JSON good?
Like this:
{
"ID":"6tgbcr45345o82mcrzz",
"DETAILS":{
"CasePayload":{
"caseId":"5f5475-21cf-4c7e-8071-574a1ef78981",
"providerTypes":"WATCHLIST"
}
},
"OBJECT":"TEST",
"DATE_TIME":"9/16/2020 9:45",
"SUMMARY":{
"screenCaseEventSummary":{
"providerTypes":"WATCHLIST"
}
}
}
Which is better in terms of SELECT?
I need to query elements of the JSON, and that is one of the main reasons to convert the XML to JSON; the rest of the columns we want to store as regular columns.
Snowflake understands JSON and XML, you can work with both.
If I had to choose, I would choose JSON - as XML support is still in preview (https://docs.snowflake.com/en/sql-reference/functions/xmlget.html).
I'd also consider how you are currently querying the XML strings, in case you want to minimize the effort to translate queries (in that case, I would keep working with the XML objects).
More to Felipe's point: if you can, use JSON, and make sure you compress your files.
For both, FLATTEN will be your best friend for unrolling arrays. The syntax for JSON is like
json_column:Object_Name:Sub_Obj_Name::int
which can also be done via
json_get(json_column, 'Object_Name.Sub_Obj_Name')
but the former looks much nicer, while XML is a dog's breakfast (IMHO):
get(xmlget(xmlget(xmlget(xml,'header',0), 'companyinfo', 0), 'companyid'),'$') as companyid
see this XML answer versus this JSON answer
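If you do convert the XML to JSON before loading, the simple payloads in the question need nothing beyond a scripting language's standard library. A minimal Python sketch (element names taken from the sample; it assumes a single level of child elements with text content, so deeper documents would need a recursive version):

```python
import json
import xml.etree.ElementTree as ET

def xml_to_dict(xml_str):
    # Assumes one level of child elements with text content,
    # as in the CasePayload / CaseEventSummary samples above.
    root = ET.fromstring(xml_str)
    return {root.tag: {child.tag: child.text for child in root}}

details = ('<CasePayload><caseId>5f5475-21cf-4c7e-8071-574a1ef78981</caseId>'
           '<providerTypes>WATCHLIST</providerTypes></CasePayload>')
print(json.dumps(xml_to_dict(details)))
```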

Presto query Array of Rows

So I have a Hive external table with a schema that looks like this:
{
.
.
`x` string,
`y` ARRAY<struct<age:string,cId:string,dmt:string>>,
`z` string
}
So basically I need to query a column (column "y") which is an array of nested JSON.
I can see the data of column "y" from Hive, but the data in that column seems invisible to Presto, even though Presto knows the schema of this field:
array(row(age varchar,cid varchar,dmt varchar))
As you can see, Presto already knows this field is an array of rows.
Notes:
1. The table is a Hive external table.
2. I got the schema of field "y" using the ODBC driver, but the data is just empty; however, I can see something like this in Hive:
[{"age":"12","cId":"bx21hdg","dmt":"120"}]
3. Presto queries the Hive metastore for the schema.
4. The table is stored in Parquet format.
So how can I see my data in field "y" please?
Please try the below. This should work in Presto.
"If the array element is a row data type, the result is a table with one column for each row field in the element data type. The result table column data types match the corresponding array element row field data types"
select
  y, age, cid, dmt
from
  table
cross join UNNEST(y) AS nested_data(age, cid, dmt)
Reference: https://www.ibm.com/support/knowledgecenter/en/SSEPGG_10.5.0/com.ibm.db2.luw.sql.ref.doc/doc/r0055064.html
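For intuition, CROSS JOIN UNNEST produces one output row per array element, with the row fields promoted to columns. A Python sketch of the same reshaping (sample values taken from the question):

```python
# One input row whose "y" column is an array of row-typed elements.
rows = [{"x": "a", "y": [{"age": "12", "cId": "bx21hdg", "dmt": "120"}]}]

# CROSS JOIN UNNEST(y): emit one flat row per array element.
flattened = [
    {"x": r["x"], **elem}
    for r in rows
    for elem in r["y"]
]
print(flattened)  # [{'x': 'a', 'age': '12', 'cId': 'bx21hdg', 'dmt': '120'}]
```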

How to save json_encoded data in the database in Magento?

I am working on Magento version 1.7. I have data in an array; I encoded it as JSON using json_encode and inserted it into the database, but I am getting an error like
connection was reset.
If I insert a normal value then it works fine.
In the database the field type is longtext. I tried mysql_real_escape_string(), base64_encode(), and serialize(), but without success.
I am using the following code:
$table = Mage::getSingleton('core/resource')->getTableName('checkout_prescription_details');
$write = Mage::getSingleton('core/resource')->getConnection('core_write');
$custom = json_encode($customoptions);
$query = "insert into {$table} set `data`='$custom'";
$write->query($query);
When I echo $query it shows the encoded data, and when I run that query from phpMyAdmin it inserts into the database, but $write->query($query); does not insert it.
Please advise.
See Magento’s Core JSON Encoding and Decoding Functions
Encode an array
Mage::helper('core')->jsonEncode($array);
Decode an array
Mage::helper('core')->jsonDecode($jsonData);
Mage::getModel('checkout/prescription_details')
->setData('data', Mage::helper('core')->jsonEncode($customoptions))
->save();
You did not escape the query parameter, and (SQL injection aside) JSON data often contains single quotes, which break the query.
You should not have done this with a raw SQL query in the first place; let Magento do the work for you. Assuming there is a model for this table with the alias checkout/prescription_details:
Mage::getModel('checkout/prescription_details')
->setData('data', json_encode($customoptions))
->save();
If there is no model, go ahead and create one. You should not have database tables without an according model.
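To see why the raw query breaks, here is a small demonstration using Python's stdlib sqlite3 (not MySQL or Magento; the table name just mirrors the question, and the JSON value is invented). A value containing an apostrophe breaks a string-interpolated query, while a parameterized query, which is effectively what Magento's model layer gives you, succeeds:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE checkout_prescription_details (data TEXT)")

# JSON whose value contains a single quote, as user-entered data often does.
custom = json.dumps({"note": "patient's left eye"})

# Naive string interpolation breaks: the apostrophe terminates the SQL literal.
try:
    conn.execute(
        f"INSERT INTO checkout_prescription_details (data) VALUES ('{custom}')"
    )
except sqlite3.OperationalError as err:
    print("interpolated query failed:", err)

# A parameterized query handles the quoting for you.
conn.execute("INSERT INTO checkout_prescription_details (data) VALUES (?)", (custom,))
count = conn.execute("SELECT COUNT(*) FROM checkout_prescription_details").fetchone()[0]
print(count)  # 1
```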
