I have to create a DataFrame through a table-valued function: we have data in signals, and we have to convert that data into a table through the table-valued function and then transfer it into the DataFrame.
I am trying to do this with the help of Python.
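Since the question does not say which RDBMS or driver is in use, here is a minimal, hypothetical sketch of the pattern: run a query (with a real driver such as pyodbc this would be a SELECT from the table-valued function) and load the result set into a pandas DataFrame. The table name signals and its columns are made up for illustration; sqlite3 from the standard library stands in for the actual database.

```python
# Hypothetical sketch: sqlite3 (stdlib) stands in for the real database,
# and a plain SELECT stands in for the table-valued function's result set.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE signals (id INTEGER, value REAL);
    INSERT INTO signals VALUES (1, 0.5), (2, 1.25), (3, -3.0);
""")

# With a real driver (e.g. pyodbc) the query would look like
# "SELECT * FROM dbo.YourTableValuedFunction(?)".
df = pd.read_sql_query("SELECT id, value FROM signals", conn)
print(df.shape)  # (3, 2)
```

The driver-specific part is only the connection object; pandas read_sql_query accepts any DB-API connection, so the same two lines work once you can connect to your actual database.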
I am trying to transform my data stored in HSTORE column ('data') of Postgres.
My row values have the key "entity" and the value is an array:
"entity"=>"[{'id': .............}]"
I used the following code:
ALTER TABLE my_table
ALTER COLUMN h_store_column TYPE jsonb
USING hstore_to_jsonb_loose(data -> 'entity');
which resulted in value as output in a new column as below:
"[{'id': .............}]"
but with quotes "". That makes the value a scalar string in the JSONB column and prevents me from querying inside it.
How can I change the value of every row in a new column named 'entity' with JSONB, without quotes?
[{'id': .............}]
SAMPLE CODE TO GENERATE SIMILAR DATA:
"key" => "[json_text_array]"
stored in hstore data type column.
When changed to JSONB, I get {'key':'[array]'}, whereas I am after {'key': [array]} with no quotes. I tried the loose functions in Postgres; they did not help.
From what I understand of your question, you have a column of type hstore with a key named entity whose value is a JSON array. An explanation of your problem and its solution follows.
The ALTER query in your question will throw an error, because the hstore_to_jsonb_loose function accepts an hstore value but you are passing it text. The correct statement for the query is:
ALTER TABLE my_table
ALTER COLUMN h_store_column TYPE jsonb
USING hstore_to_jsonb_loose(data) -> 'entity';
The query above converts the hstore key-value pairs into JSONB and stores the result in the column h_store_column.
So hstore_to_jsonb_loose converts the data to { "entity": "[{'id':..........}]" }, from which you are extracting the JSON value of the key 'entity', which is "[{'id':..........}]".
You want the value fetched by hstore_to_jsonb_loose(data) -> 'entity' stored as a full JSON array. The value stored in your hstore column looks like JSON, but it is not: in JSON, keys and values (other than numeric and boolean) are surrounded by ", while in your string they are surrounded by '. So it cannot be stored as JSON in a JSONB column.
Assuming there is no other problem with the structure of the values (other than the '), we can replace ' with " and store the result as JSONB. Try this query:
ALTER TABLE test
ALTER COLUMN h_store_column TYPE jsonb
USING replace(hstore_to_jsonb_loose(data)->>'entity','''','"')::jsonb;
DEMO1
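The quoting problem can also be reproduced outside Postgres. This Python sketch mirrors what the replace(...,'''','"')::jsonb step does, under the same assumption the answer makes: ' appears only as a quote character, never inside a value.

```python
import json

# The value as it comes out of hstore: it looks like JSON,
# but it quotes keys and strings with ' instead of ".
entity = "[{'id': 'a1'}]"

try:
    json.loads(entity)            # strict JSON parsing rejects this
except json.JSONDecodeError:
    pass                          # same reason the ::jsonb cast fails

# Mirror of replace(..., '''', '"') in the ALTER statement.
fixed = entity.replace("'", '"')
parsed = json.loads(fixed)
print(parsed)  # [{'id': 'a1'}]
```

If any value could legitimately contain an apostrophe, this blanket replace would corrupt it; in that case the data needs to be re-serialized properly rather than patched with replace.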
In fact, hstore_to_jsonb_loose is not even required in your case; data -> 'entity' already yields text. You can write the ALTER statement as below:
ALTER TABLE test
ALTER COLUMN h_store_column TYPE jsonb
USING replace((data)->'entity','''','"')::jsonb;
DEMO2
Requirement:
1. I am using inline SQL to insert data into the DB table.
2. I have hierarchical lists which I have to pass to the inline SQL to insert the data into the DB table.
How can I pass a List / DataTable to inline SQL to insert data into the DB table without using a UDT (user-defined table type)?
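One common way to avoid a user-defined table type is to flatten the hierarchical list client-side into parameter tuples and run a batched, parameterized insert. A minimal sketch of that idea, with Python's standard-library sqlite3 standing in for the target database (the orders/order_items names are invented for illustration):

```python
# Hypothetical sketch: sqlite3 (stdlib) stands in for the target DB,
# and the orders/order_items names are invented for illustration.
import sqlite3

# Hierarchical input: each order carries a nested list of items.
orders = [
    {"order_id": 1, "items": ["pen", "pad"]},
    {"order_id": 2, "items": ["ink"]},
]

# Flatten the hierarchy into one (parent_key, child_value) tuple per row.
rows = [(o["order_id"], item) for o in orders for item in o["items"]]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE order_items (order_id INTEGER, item TEXT)")
conn.executemany("INSERT INTO order_items VALUES (?, ?)", rows)

print(conn.execute("SELECT COUNT(*) FROM order_items").fetchone()[0])  # 3
```

The flattening step is the whole trick: each child row carries its parent's key, so the hierarchy survives as an ordinary parameterized batch without any table-typed parameter.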
I have a table where one column holds a stored procedure to be executed for each row, in a cursor or loop. That procedure takes a table-valued parameter as input. For each row, the procedure being called gets its values from the other columns of the same table, which are populated by an UPDATE statement using dynamic SQL. One of the columns supplying values to the procedure holds a string that originally consisted of many rows; FOR XML PATH ('') was used to collapse it into a single one-line string. This string needs to be passed into the user-defined table type input parameter of the procedure.
How can I insert this string value when it sometimes represents multiple rows, meaning I need to make multiple inserts into the user-defined table type variable for that row of execution?
You can try this:
INSERT INTO tablename (col1, col2, ..., coln)
EXEC storedprocedurename arg1, arg2, ..., argn
Note: the number of columns in the INSERT must equal the number of columns the stored procedure returns, and they must be in the same order.
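The count-and-order rule can be illustrated with any driver. In this hypothetical Python sketch, fake_procedure stands in for the stored procedure's result set and sqlite3 (standard library) stands in for the database:

```python
import sqlite3

def fake_procedure():
    # Stand-in for the stored procedure: returns a two-column result set.
    return [(1, "alpha"), (2, "beta")]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (col1 INTEGER, col2 TEXT)")

# The INSERT lists exactly as many columns as the procedure returns,
# in the same order -- the same rule as INSERT ... EXEC.
conn.executemany("INSERT INTO target (col1, col2) VALUES (?, ?)",
                 fake_procedure())

print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 2
```

If the column lists disagreed in count or order, the insert would fail or silently put values in the wrong columns, which is exactly what the note warns about.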
I want to extract data from a table built with joins in one database (db_name_1) and insert the data into the same table, with the same fields and the same joins, in another database (db_name_2), using SSIS.
I have a stored procedure that I want to use to create one row in a parent table [Test]
and multiple rows in a child table [TestQuestion]. The parent and child tables both have identity primary keys. Here is what the child table looks like, with some irrelevant columns removed:
CREATE TABLE [dbo].[TestQuestion] (
[TestQuestionId] INT IDENTITY (1, 1) NOT NULL,
[TestId] INT NOT NULL,
[QuestionNumber] INT NOT NULL
);
Inserting into the parent table is easy, as all parameters are supplied to the SP and I just map them to an insert and perform it. But the child table ids are given as a parameter @qidsJSON containing ids in JSON form like this:
parameterList.Add(new SqlParameter("@qidsJSON", qids.ToJSONString()));
["3CEFF956-BF61-419E-8FB2-9D6A1B75B909","63E75A2D-9F45-43CC-B706-D9890A22577D"]
Is there a way that I can use Transact-SQL to take the data from my @qidsJSON and
have it insert a row into the TestQuestion table for every GUID that appears in the parameter?
Alternatively, is there another way I could pass a parameter that contains multiple GUIDs? I am using C# to build the input data from a C# List, so I could create the input parameter in whatever form would be easiest for the stored procedure to use.
You can use a table-valued parameter for your stored procedure:
CREATE TYPE GuidList AS TABLE (Id UNIQUEIDENTIFIER)
CREATE PROCEDURE test
@Ids dbo.GuidList READONLY
AS
Use the following references to pass a table-valued parameter from .NET code:
How to pass table value parameters to stored procedure from .net code
C# and Table Value Parameters
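For completeness, the per-GUID insert logic the question asks about can be sketched end to end. This is a hypothetical illustration: Python's standard-library sqlite3 stands in for SQL Server, and mapping each GUID to a sequential QuestionNumber is an assumption, since the question does not say how the GUIDs relate to question numbers.

```python
# Hypothetical sketch: sqlite3 (stdlib) stands in for SQL Server.
import json
import sqlite3

qids_json = ('["3CEFF956-BF61-419E-8FB2-9D6A1B75B909",'
             '"63E75A2D-9F45-43CC-B706-D9890A22577D"]')
test_id = 42  # identity value of the freshly inserted parent Test row

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE TestQuestion (
    TestQuestionId INTEGER PRIMARY KEY AUTOINCREMENT,
    TestId INTEGER NOT NULL,
    QuestionNumber INTEGER NOT NULL)""")

# One child row per GUID; numbering them in array order is an assumption.
rows = [(test_id, n) for n, _qid in enumerate(json.loads(qids_json), start=1)]
conn.executemany(
    "INSERT INTO TestQuestion (TestId, QuestionNumber) VALUES (?, ?)", rows)

print(conn.execute("SELECT COUNT(*) FROM TestQuestion").fetchone()[0])  # 2
```

Inside the actual stored procedure the same loop over the array would be done in T-SQL rather than Python, or avoided entirely by using the table-valued parameter the answer above describes.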