The Logic App I am working with is intended to quickly update a JSON file that is based on a SQL Server table (1000 rows, 6 columns).
The SQL statement resembles this:
SELECT ID, NAME, FIELD1, FIELD2, FIELD3, FIELD4 FROM TABLENAME FOR JSON PATH;
There are ~1000 rows in the table, with little variance or changes.
When I run this SQL in SSMS or locally, my output is a single, consolidated JSON result; when I run the same SQL via the Logic App, it batches the output into groups of 10 JSON rows.
[Screenshot of the output from the stored procedure / Execute SQL action]
If I use a stored procedure with SET NOCOUNT ON, the same behavior results.
Does anyone know a method to force the Execute SQL action in Logic Apps NOT to chunk / batch the return into different result sets?
I've since learned that the Execute SQL action automatically casts its output to JSON.
To fix this, I changed my SQL to remove the FOR JSON PATH clause and used ResultSet.Table1 as the source for a Compose action. This wraps the array in the JSON square brackets, and the output is now as expected.
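For reference, the revised query is presumably just the original statement without the FOR JSON PATH clause (a sketch using the same table and column names as above), with the Compose action then taking ResultSet.Table1 as its input:

SELECT ID, NAME, FIELD1, FIELD2, FIELD3, FIELD4
FROM TABLENAME;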
I want to get the count of all records stored in an Extended Events file, but that file is huge and running a query on it takes several minutes, which does not meet my need.
I wanted to know: is there any place in SQL Server where I can get this number? I mean something like sys.traces or an event_count value.
I wrote this query, but it does not work for me:
SELECT COUNT(timestamp_utc)
FROM sys.fn_xe_file_target_read_file(N'D:\Extended Event\ErrorReport\ex_*.xel', NULL, NULL, NULL);
You can export it to a CSV file, as explained in this link.
Then you can import the CSV file into SQL Server using BULK INSERT.
BULK INSERT execution is very fast as long as you don't have indexes on the target table; see here.
Then you can query the SQL Server table into which you imported the data.
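A minimal sketch of the import and the final count, assuming the export landed in a hypothetical file D:\Extended Event\ErrorReport\export.csv and a staging table dbo.XEventErrors with matching columns already exists:

-- load the exported CSV into the staging table (file path and table name are assumptions)
BULK INSERT dbo.XEventErrors
FROM 'D:\Extended Event\ErrorReport\export.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- counting against the imported table is then a quick query
SELECT COUNT(*) FROM dbo.XEventErrors;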
I need to extract data from a DB2 database into SQL Server. I need to build my queries based on an Excel file: it has 176 records, and for each record I need to run a query and put the results into SQL Server.
So, for example:
I have an Excel file with a Number, a From date, a To date, and a Country.
The query should use this information from each record:
SELECT *
FROM dbo.Test
WHERE Number = excel.Number1 AND Date BETWEEN excel.fromDate1 AND excel.toDate1 AND Country = excel.country1
And then another query with
SELECT *
FROM dbo.Test
WHERE Number = excel.Number2 AND Date BETWEEN excel.fromDate2 AND excel.toDate2 AND Country = excel.country2
Etc...
How should I do something like this in SSIS?
If needed I can put the DB2 and Excel data in MS SQL
You can proceed with the following approach:
Extract the data rows from Excel and put them into an SSIS Object variable
Use a Foreach Loop to get each row from the Object variable, parsing it out into separate per-column variables
Inject the variable values into the SQL SELECT command with expressions (or map them as parameters; see the sketch after this list)
Run a Data Flow Task based on that SQL command, transform the data, and load it into the target
Overall, your task seems feasible, but it requires some knowledge of parsing an Object variable in a Foreach Loop and writing variable expressions.
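As a sketch of step 3, one common alternative to building the whole command string with an expression is to map the loop variables as parameters. This is only a sketch: it assumes the per-row values have been parsed into hypothetical variables User::Number, User::FromDate, User::ToDate and User::Country, and it uses OLE DB parameter markers:

SELECT *
FROM dbo.Test
WHERE Number = ?             -- mapped to User::Number
  AND Date BETWEEN ? AND ?   -- mapped to User::FromDate / User::ToDate
  AND Country = ?;           -- mapped to User::Country

Each ? is bound to the corresponding variable on the Parameter Mapping page of the Execute SQL Task (or the Parameters dialog of an OLE DB Source in the Data Flow).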
I'm using Visual Studio 2015 / SSIS to run a set of SQL statements in an Execute SQL Task and then do a data transfer between SQL Server tables by executing the package. When we run a series of SQL statements in SSMS, we get the rows affected for every successful statement. Now I want to automate the process using SSIS to reduce the turnaround time, and I would like to get the rows affected for every SQL statement (select, insert, delete) inside the Execute SQL Task. How can that be done in SSIS?
I don't have db_owner permission to create stored procedures, so I'm thinking SSIS would be a quick way. But it is very important for me to keep a log of rows affected to validate the data, as it is financial data. I have nearly 10 SQL statements in each SQL task, such as selects and deletes, but the output is only one table.
For example, my SQL task is like the one below:
select * from dbo.table1;
select * from dbo.table2 where city = 'Chicago';
create table dbo.table3 (id int, name varchar(50));
insert into dbo.table3 values (1, 'a');
select * from dbo.table3;
If I execute this in SSMS, I get the rows affected for each statement and the table also gets created. If I execute the same thing through a package in SSIS, how will I get those messages for each of them?
I assume your data lies in SQL Server. For the selects, you could use Data Flow Tasks and Row Count transformations instead of Execute SQL Tasks.
For inserts and updates there are a few ways to get the affected row count, like this: https://stackoverflow.com/a/1834264/5605866
or like this: http://microsoft-ssis.blogspot.fi/2011/03/rowcount-for-execute-sql-statement.html
Basically the same thing, but with slightly different syntax.
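As a minimal T-SQL sketch of that idea, each statement in the Execute SQL Task can capture @@ROWCOUNT right after it runs and write it to a logging table (dbo.EtlRowLog is a hypothetical table; the DELETE is just an illustrative statement reusing table names from the question):

DECLARE @rows int;

DELETE FROM dbo.table2 WHERE city = 'Chicago';
SET @rows = @@ROWCOUNT;   -- capture immediately, before any other statement resets it

INSERT INTO dbo.EtlRowLog (step_name, rows_affected, logged_at)
VALUES ('delete dbo.table2', @rows, SYSUTCDATETIME());

Alternatively, end the batch with SELECT @@ROWCOUNT and map it to an SSIS variable through the task's single-row Result Set.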
You can use the Row Count transformation after the data source and save the count to a variable. You can refer to this to get the number of rows returned from the source that SHOULD be processed.
Hope this helps.
ETL script to dynamically map multiple Execute SQL result sets to multiple tables (table name based on the SQL file provided)
I have a source folder with SQL files (I could set them up as stored procedures as well). I know how to loop over and execute SQL tasks in a Foreach container. The part where I'm stuck is that I need to take the final result set of each SQL query and shove it into a table with the same name as the SQL file.
So: Folder -> script1.sql, script2.sql, etc. -> ETL -> goes to table script1, table script2, etc.
EDIT: Based on the comment made by Joe, I just want to say that I'm aware of using an INSERT within the script, but I need to insert into a table on a different server, and linked servers are not the ideal solution.
Any pseudocode or links to tutorials would be extremely helpful. Thanks!
I would add the table creation to the script. It is probably the simplest way to do this. If your script is Select SomeField From Table1, you could change it to Select SomeField Into script1 From Table1. Then there is no need to do the mapping in SSIS, which in my experience is not easy.
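A minimal sketch of that change, using a hypothetical script1.sql; the INTO clause creates the target table named after the script file:

-- script1.sql (hypothetical), rewritten so the query creates its own target table
SELECT SomeField
INTO dbo.script1    -- table named after the script file
FROM dbo.Table1;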
I have done several SSIS packages over the past few months to move data from a legacy database to a SQL Server database. It normally takes 10-20 minutes to process around 5 million records, depending on the transformations.
The issue I am experiencing with one of my packages is very poor performance, because one of the columns in my destination is of the SQL Server XML data type.
Data comes in like this: 5
A script creates a Unicode string like this: <XmlData><Value>5</Value></XmlData>
Destination is simply a column with XML data type
This is really slow. Any advice?
I did a SQL trace and noticed that behind the scenes SSIS is executing a convert on each row before the insert:
declare @p as xml
set @p = convert(xml, N'<XmlData><Value>5</Value></XmlData>')
Try using a temporary table to store the resulting 5 million records without the XML transformation, and then use SQL Server itself to move them from tempdb to the final destination:
INSERT INTO final_destination (...)
SELECT cast(N'<XmlData><Value>5</Value></XmlData>' AS XML) AS batch_converted_xml, col1, col2, colX
FROM #tempTable
If 5,000,000 rows turn out to be too much data for a single batch, you can do it in smaller batches (100k rows should work like a charm).
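A minimal batching sketch, assuming the staging table has an integer key column id and holds the raw value in an nvarchar column value_text (both hypothetical names):

DECLARE @batchSize int = 100000;
DECLARE @minId int, @maxId int;
SELECT @minId = MIN(id), @maxId = MAX(id) FROM #tempTable;

WHILE @minId <= @maxId
BEGIN
    -- convert and move one id range at a time to keep each insert small
    INSERT INTO final_destination (batch_converted_xml, col1, col2)
    SELECT CAST(N'<XmlData><Value>' + t.value_text + N'</Value></XmlData>' AS XML),
           t.col1, t.col2
    FROM #tempTable AS t
    WHERE t.id >= @minId AND t.id < @minId + @batchSize;

    SET @minId = @minId + @batchSize;
END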
The records captured by the profiler look like an OLE DB transformation issuing one command per row.