Count of all records stored in an Extended Events file - sql-server

I want to get the count of all records stored in an Extended Events file, but the file is huge, and running a query over it takes several minutes, which does not meet my needs.
Is there a place in SQL Server where this count is already available? I mean something like sys.traces or an event_count column.
I wrote this query, but it does not work:
SELECT COUNT(timestamp_utc)
FROM sys.fn_xe_file_target_read_file(N'D:\Extended Event\ErrorReport\ex_*.xel', NULL, NULL, NULL);

You can export it to a CSV file, as explained in this link.
Then you can import the CSV file into SQL Server using BULK INSERT.
BULK INSERT is very fast as long as the target table has no indexes; see here.
Then you can query the SQL Server table into which you imported the data.
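If you go that route, a minimal sketch of the import-and-count step might look like this (the staging table, column list, and exported file path are assumptions for illustration, not part of the original answer):
-- Hypothetical staging table for the exported event rows; no indexes, so the load stays fast.
CREATE TABLE dbo.XEventStaging
(
    timestamp_utc DATETIME2,
    event_name    NVARCHAR(256),
    event_data    NVARCHAR(MAX)
);

-- Load the exported CSV file.
BULK INSERT dbo.XEventStaging
FROM 'D:\Extended Event\ErrorReport\export.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Counting rows in a table is far cheaper than re-reading the .xel files.
SELECT COUNT(*) AS event_count FROM dbo.XEventStaging;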

Related

Logic App Execute SQL TO JSON automatically chunks output

The logic app I am working with is intended to quickly update a json file, which is based on a SQL Server table (1000 rows, 6 columns).
The SQL statement resembles this:
SELECT ID, NAME, FIELD1, FIELD2, FIELD3, FIELD4 FROM TABLENAME FOR JSON PATH;
There are ~1000 rows in the table, with little variance or changes.
When I run this SQL in SSMS or locally, my output is a single, consolidated JSON row; when I run the same SQL via the Logic App, it batches the output into groups of 10 JSON rows.
[screenshot of output from stored procedure / Execute SQL]
If I use a stored procedure with SET NOCOUNT ON, the same behavior results.
Does anyone know a method to force the Execute SQL action in Logic Apps NOT to chunk/batch the return into different result sets?
I've since learned that the Execute SQL action automatically casts its output to JSON.
To fix this, I changed my SQL to remove the FOR JSON PATH and used ResultSet.Table1 as the source for a Compose action. This wraps the array in the JSON square brackets, and the output is now as expected.
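For reference, a minimal sketch of that change, reusing the table and column names from the question:
-- Return a plain result set; the connector serializes it to JSON itself,
-- so FOR JSON PATH is no longer needed.
SELECT ID, NAME, FIELD1, FIELD2, FIELD3, FIELD4
FROM TABLENAME;
The Compose action then takes the ResultSet.Table1 output as its input, which produces the single bracketed JSON array.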

Executing query from SSDT

I'm using Visual Studio 2015 (SSIS) to run a set of SQL statements in an Execute SQL Task and then transfer data between tables by executing the package. When we run a series of SQL statements in SSMS, we get a "rows affected" message for every successful statement. Now I want to automate the process using SSIS to reduce the turnaround time, and I would like to get the rows affected for every SQL query (select, insert, delete) that runs inside the Execute SQL Task. How can this be done in SSIS? I don't have db_owner permission for stored procedures, so I'm thinking SSIS would be a quick way, but it is very important for me to log the rows affected to validate the data, as it is financial data. I have nearly 10 SQL statements in each SQL task, such as selects and deletes, but the output is only one table.
For example, my SQL task looks like this:
select * from dbo.table1;
select * from dbo.table2 where city = 'Chicago';
create table dbo.table3(id int, name varchar(50));
insert into dbo.table3 values(1, 'a');
select * from dbo.table3;
If I execute this in SSMS, I get the rows affected for each statement and the table is created. If I execute the same statements through a package in SSIS, how will I get those messages for each of them?
I assume your data lies in SQL Server. For the selects, you could use Data Flow Tasks and Row Count transformations instead of Execute SQL Tasks.
For inserts and updates there are a few ways to get the affected row count, for example: https://stackoverflow.com/a/1834264/5605866
or like this: http://microsoft-ssis.blogspot.fi/2011/03/rowcount-for-execute-sql-statement.html
Both are basically the same approach with slightly different syntax.
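One common approach along those lines is to have the statement return @@ROWCOUNT as a single-row result set that the Execute SQL Task maps to a package variable; a rough sketch, using one of the question's example tables:
-- Run the statement, then hand back its affected-row count.
-- In the Execute SQL Task, set ResultSet to "Single row" and map
-- the RowsAffected column to an SSIS variable for logging.
DELETE FROM dbo.table3
WHERE id = 1;

SELECT @@ROWCOUNT AS RowsAffected;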
You can use the Row Count transformation after the data source and save the count into a variable. You can then refer to this variable to get the number of rows returned from the source that should be processed.
Hope this helps.

ETL Script to dynamically map multiple EXECUTE SQL resultset to multiple tables (table name based on sql file provided)

I have a source folder with SQL files (I could set them up as stored procedures as well). I know how to loop over and execute SQL tasks in a Foreach Loop container. The part where I'm stuck is that I need to take the final result set of each SQL query and load it into a table with the same name as the SQL file.
So, Folder -> script1.sql, script2.sql, etc. -> ETL -> goes to table script1, table script2, etc.
EDIT: Based on the comment made by Joe, I just want to say that I'm aware of using an insert within the script, but I need to insert into a table on a different server, and linked servers are not an ideal solution.
Any pseudocode or links to tutorials would be extremely helpful. Thanks!
I would add the table creation to the script; it is probably the simplest way to do this. If your script is SELECT SomeField FROM Table1, you could change it to SELECT SomeField INTO script1 FROM Table1. Then there is no need to map columns in SSIS, which is not easy to do in my experience.
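A hedged sketch of what a script like script1.sql might then contain (the OBJECT_ID guard is just one way to keep the script rerunnable; the names are illustrative):
-- script1.sql: materialize the query's final result set into a table
-- named after the script file.
IF OBJECT_ID('dbo.script1', 'U') IS NOT NULL
    DROP TABLE dbo.script1;

SELECT SomeField
INTO dbo.script1
FROM Table1;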

How to merge a table from Access into SQL Express?

I have one table named "Staff" in Access and also have this table (same name) in SQL Server 2008.
Both tables have thousands of records. I want to merge records from the Access table into the SQL Server table without affecting the existing records in SQL Server. Normally I just export using the ODBC driver, and that works fine when the table doesn't already exist in SQL Server. Please advise. Thanks.
A simple append query from the local Access table to the linked SQL Server table should work just fine in this case.
So, just drop the first (from) table into the query builder. Then change the query type to append, and you are prompted for the append table name.
From that point on, just drop in the columns you want (do not drop in the PK column, as it need not be used or transferred in this case).
You can also type in the sql directly in the query builder. Either way, you will wind up with something like:
INSERT INTO dbo_custsql
( ADMINID, Amount, Notes, Status )
SELECT ADMINID, Amount, Notes, Status
FROM custsql1;
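If some records already exist on the SQL Server side and you only want to append the missing ones, a hedged variation (assuming ADMINID uniquely identifies a record, which the original answer does not state) is an unmatched-rows append:
INSERT INTO dbo_custsql
( ADMINID, Amount, Notes, Status )
SELECT s.ADMINID, s.Amount, s.Notes, s.Status
FROM custsql1 AS s
LEFT JOIN dbo_custsql AS t ON s.ADMINID = t.ADMINID
WHERE t.ADMINID IS NULL;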
This may help: http://www.red-gate.com/products/sql-development/sql-compare/
Or you could write a simple program to read from each data set and do the comparison, adding, updating, and deleting, etc.

How to import CSV files

How can I import CSV file data into a SQL Server 2000 table? I need to insert data from the CSV file into the table twice a day. The table has more than 20 fields, but I only need to insert values into 6 of them.
I faced the same problem before; I can suggest you start reading here. The author covers: "This is very common request recently – How to import CSV file into SQL Server? How to load CSV file into SQL Server Database Table? How to load comma delimited file into SQL Server? Let us see the solution in quick steps."
I need to insert data from CSV file to table twice a day.
Use DTS to perform the import, then schedule it.
For SQL 2000, I would use DTS. You can then schedule this as a job once you're happy with it.
Below is a good Microsoft link explaining how to use it.
Data Transformation Services (DTS)
You describe two distinct problems:
the CSV import, and
the extraction of data into only those 6 fields.
So break your solution down into two steps:
import the CSV into a raw staging table, and
then insert into your six 'live' fields from that staging table.
There is a statement for the first part, called BULK INSERT; its syntax looks like this:
BULK INSERT target_staging_table_in_database
FROM 'C:\Path_to\CSV_file.csv'
WITH
(
DATAFILETYPE = 'CHAR'
,FIRSTROW = 2
,FIELDTERMINATOR = ','
,ROWTERMINATOR = '\n'
);
Adjust to taste, and consult the docs for more options. You might also want to TRUNCATE or DELETE FROM your staging table before doing the bulk insert so you don't have any old data in there.
Once you get the information into the database, doing an UPDATE or INSERT into those six fields should be straightforward.
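A minimal sketch of that second step, with purely illustrative table and column names:
-- Copy only the six fields you care about from the staging table into
-- the live table; the other staging columns are simply ignored.
INSERT INTO dbo.live_table (field1, field2, field3, field4, field5, field6)
SELECT field1, field2, field3, field4, field5, field6
FROM target_staging_table_in_database;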
You can make use of SQL Server Integration Services (SSIS). It's just a one-time task to create the package; from then on, you just run that package.
You can also try BULK INSERT as daniel explained.
You can also try the Import/Export Wizard in SQL Server 2000.
