I managed to create a server-side trace on my Analysis Services server which runs in the background.
Somehow it has too many records for each entry. In the screenshot you can see about 30 records that all refer to the same entry; there should have been only three: two for entry (Session Initialize and Audit Login) and one for exit (Audit Logout). Why are there so many, and how can I filter them?
screenshot from the Profiler
This is the code I used to create the server-side trace:
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
<ObjectDefinition>
<Trace>
<ID>MicrosoftProfilerTrace1512302999</ID>
<Name>MicrosoftProfilerTrace1512302999</Name>
<LogFileName>D:\OLAP_Recorder1410.trc</LogFileName>
<LogFileAppend>1</LogFileAppend>
<AutoRestart>1</AutoRestart>
<LogFileSize>5000</LogFileSize>
<LogFileRollover>1</LogFileRollover>
<Events>
<Event>
<EventID>1</EventID>
<Columns>
<ColumnID>24</ColumnID>
<ColumnID>32</ColumnID>
<ColumnID>2</ColumnID>
<ColumnID>3</ColumnID>
<ColumnID>25</ColumnID>
<ColumnID>33</ColumnID>
<ColumnID>36</ColumnID>
<ColumnID>37</ColumnID>
</Columns>
</Event>
<Event>
<EventID>2</EventID>
<Columns>
<ColumnID>32</ColumnID>
<ColumnID>2</ColumnID>
<ColumnID>5</ColumnID>
<ColumnID>6</ColumnID>
<ColumnID>25</ColumnID>
<ColumnID>33</ColumnID>
<ColumnID>36</ColumnID>
<ColumnID>37</ColumnID>
</Columns>
</Event>
<Event>
<EventID>43</EventID>
<Columns>
<ColumnID>2</ColumnID>
<ColumnID>3</ColumnID>
<ColumnID>25</ColumnID>
<ColumnID>33</ColumnID>
<ColumnID>28</ColumnID>
<ColumnID>36</ColumnID>
<ColumnID>32</ColumnID>
<ColumnID>37</ColumnID>
<ColumnID>41</ColumnID>
<ColumnID>42</ColumnID>
<ColumnID>45</ColumnID>
</Columns>
</Event>
</Events>
<Filter>
<NotLike>
<ColumnID>37</ColumnID>
<Value>SQL Server Profiler - beed891e-04cd-4afb-ac37-9dc964567a1b</Value>
</NotLike>
</Filter>
</Trace>
</ObjectDefinition>
</Create>
</Batch>
A large number of entries may indicate that many users log in and out of the server, which hosts many databases. You can filter your entries by database name: configure it on the Events Selection tab and then use the Column Filters option.
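In a server-side trace definition, the same filtering can be expressed in the XMLA `Filter` element. A sketch, assuming a database named MyDatabase (a placeholder) and assuming ColumnID 28 corresponds to the DatabaseName column, as it appears to in the trace definition above; verify the ColumnID against your server's trace schema before using it:

```xml
<!-- Sketch: keep only events whose DatabaseName (assumed ColumnID 28)
     equals "MyDatabase", combined with the existing Profiler exclusion. -->
<Filter>
  <And>
    <Equal>
      <ColumnID>28</ColumnID>
      <Value>MyDatabase</Value>
    </Equal>
    <NotLike>
      <ColumnID>37</ColumnID>
      <Value>SQL Server Profiler - beed891e-04cd-4afb-ac37-9dc964567a1b</Value>
    </NotLike>
  </And>
</Filter>
```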
I have a CSV file with the two columns :
Employee_Name,EmpID
"Harry, Watson",1
"Amy, Black",2
"O'hare, Lynn",3
Jeremy Prater,4
,,
,,
I am using Biml to generate my package:
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<FileFormats>
<FlatFileFormat Name="FlatFile" CodePage="1252" TextQualifier="&quot;"
ColumnNamesInFirstDataRow="true" IsUnicode="false">
<Columns>
<Column Name="Employee_Name" DataType="AnsiString" Length="255" Delimiter="Comma" />
<Column Name="EmpID" DataType="Int64" Delimiter="Comma" />
</Columns>
</FlatFileFormat>
</FileFormats>
<Connections>
<FlatFileConnection Name="importexcel"
FilePath="HR.csv"
FileFormat="FlatFile" />
<Connection Name="AppSTG"
ConnectionString="XXXX"></Connection>
</Connections>
<Databases>
<Database Name="STG_App" ConnectionName="AppSTG"></Database>
</Databases>
<Schemas>
<Schema Name="HR" DatabaseName="STG_App"></Schema>
</Schemas>
<Tables>
<Table Name="Employee" SchemaName="STG_App.HR">
<Columns>
<Column Name="Employee_Name" DataType="AnsiString" Length="255" />
<Column Name="EmpID" DataType="Int64" />
</Columns>
</Table>
</Tables>
<Packages>
<Package Name="Load Flat File Data" >
<Tasks>
<Dataflow Name="Load Flat File Data">
<Transformations>
<FlatFileSource ConnectionName="importexcel" Name="FlatFile"/>
<OleDbDestination Name="Target" ConnectionName="AppSTG">
<TableOutput TableName="STG_App.HR.Employee" />
</OleDbDestination>
</Transformations>
</Dataflow>
</Tasks>
</Package>
</Packages>
</Biml>
To create the following data flow :
When I try to execute the package, I have the following error :
"Data conversion failed. The data conversion for column "EmpID"
returned status value 2 and status text "The value could not be
converted because of a potential loss of data.""
You have defined your EmpID field as Int64, which works great when you have digits there, but when there is no data (and a row is still present), SSIS is going to try to convert the empty string to a number, and that will fail.
If you add an error pathway from the Flat File Source for truncation/error conditions, you'd see rows 5+ going down that path. For this data, I'd define everything as string, since you need to get the data into the pipeline first; then you can act on it based on whatever business rules make sense (no name/ID: trash it).
As @alex points out in the comments, the final rows indicate there are three columns of data whereas you've defined two, so when the flat file source gets to them, it will blow up. SSIS won't be able to handle inconsistent file formats like that.
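A minimal sketch of the everything-as-string approach, keeping the same file format name; EmpID becomes AnsiString (length 20 is an arbitrary assumption) so blank values flow through, and you convert and validate downstream in the data flow:

```xml
<!-- Sketch: load every column as a string so blank EmpID values
     don't fail conversion; convert to Int64 later in the data flow. -->
<FlatFileFormat Name="FlatFile" CodePage="1252" TextQualifier="&quot;"
    ColumnNamesInFirstDataRow="true" IsUnicode="false">
  <Columns>
    <Column Name="Employee_Name" DataType="AnsiString" Length="255" Delimiter="Comma" />
    <Column Name="EmpID" DataType="AnsiString" Length="20" Delimiter="Comma" />
  </Columns>
</FlatFileFormat>
```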
Is there an option to create a database table (SQL Server) automatically based on a large XML file, instead of creating the table manually?
Let's say the XML has 50 nodes. I need a pattern to create a table with those 50 columns automatically, one time, because I have a huge XML tree. My goal is to save time.
You can use Liquibase for XML based Database management on SQL Server.
First, you need to integrate your code base with Liquibase and convert your existing XML into a Liquibase changelog XML like the one below. Then you can manage your database changes (including DDL and DML operations) via a DBMS tool.
<databaseChangeLog
xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.8.xsd">
<changeSet id="1" author="bob">
<createTable tableName="department">
<column name="id" type="int">
<constraints primaryKey="true" nullable="false"/>
</column>
<column name="name" type="varchar(50)">
<constraints nullable="false"/>
</column>
<column name="active" type="boolean"
defaultValueBoolean="true"/>
</createTable>
</changeSet>
</databaseChangeLog>
https://www.liquibase.org/documentation/xml_format.html
https://www.liquibase.org/documentation/tutorials/mssql.html
To keep it short, I have this ".xes" (Extensible Event Stream) file, which is in fact an XML, and looks like this (this code only shows an example of an event - the file contains multiple events similar to this one):
<?xml version="1.0" encoding="UTF-8" ?>
<log xes.version="1.0" xes.features="nested-attributes" openxes.version="1.0RC7" xmlns="http://www.xes-standard.org/">
<trace>
<string key="concept:name" value="0"/>
<event>
<string key="org:resource" value="Call Centre Agent"/>
<date key="time:timestamp" value="2006-01-01T00:00:00.000+01:00"/>
<string key="concept:name" value="check if sufficient information is available"/>
<string key="lifecycle:transition" value="start"/>
</event>
</trace>
...
This file represents in fact a business process event log which contains the events of different activities of a process with timestamp and further information.
I need to extract the information from this data and prepare some SQL queries as well.
I am currently using a SQL Server 2014 Express database, and having trouble importing the data and querying it.
This is a general approach to get a file's content into a variable:
DECLARE @xml XML=
(SELECT * FROM OPENROWSET(BULK 'C:\YourPath\XMLFile.xml',SINGLE_CLOB) AS x);
SELECT @xml;
As this is nested data (with unclear level of nesting...) this is my suggestion:
DECLARE @log XML=
'<log xmlns="http://www.xes-standard.org/" xes.version="1.0" xes.features="nested-attributes" openxes.version="1.0RC7">
<trace>
<string key="concept:name" value="0" />
<event>
<string key="org:resource" value="Call Centre Agent" />
<date key="time:timestamp" value="2006-01-01T00:00:00.000+01:00" />
<string key="concept:name" value="check if sufficient information is available" />
<string key="lifecycle:transition" value="start" />
</event>
<event>
<string key="second-resouce" value="Call Centre Agent" />
<date key="second:timestamp" value="2006-01-01T00:00:00.000+01:00" />
<string key="second:name" value="check if sufficient information is available" />
<string key="second:transition" value="start" />
</event>
</trace>
</log>';
WITH XMLNAMESPACES(DEFAULT 'http://www.xes-standard.org/')
SELECT TraceNode.value('string[1]/@key','varchar(max)') AS Trace_String_Key
,TraceNode.value('string[1]/@value','int') AS Trace_String_Value
,EventNode.value('date[1]/@key','varchar(max)') AS Trace_Event_Date_Key
,EventNode.value('date[1]/@value','datetime') AS Trace_Event_Date_Value
,EventStringNode.value('@key','varchar(max)') AS Trace_Event_String_Key
,EventStringNode.value('@value','varchar(max)') AS Trace_Event_String_Value
FROM @log.nodes('/log/trace') AS a(TraceNode)
OUTER APPLY TraceNode.nodes('event') AS b(EventNode)
OUTER APPLY EventNode.nodes('string') AS c(EventStringNode)
Do you have any suggestions on how, and for what, I could query this
data? Some practical examples would be useful.
Well, that's really up to you... If you ask such a question, you should know what you need it for :-)
One idea:
Create a relational table structure
Table "Log" (Each log file and side data)
Table "Event" (Child data to "Log")
Table "EventData" (Child data to "Event")
You can use the query above to retrieve your data and insert it into these tables...
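A minimal sketch of that structure; the table and column names are illustrative, not prescribed by the XES standard, and you would widen the EventData value column or add typed columns as your queries require:

```sql
-- Sketch: one row per log file, per event, and per event attribute.
CREATE TABLE dbo.[Log] (
    LogID    int IDENTITY PRIMARY KEY,
    FileName nvarchar(260) NOT NULL
);
CREATE TABLE dbo.[Event] (
    EventID   int IDENTITY PRIMARY KEY,
    LogID     int NOT NULL REFERENCES dbo.[Log](LogID),
    EventTime datetime NULL
);
CREATE TABLE dbo.EventData (
    EventDataID int IDENTITY PRIMARY KEY,
    EventID     int NOT NULL REFERENCES dbo.[Event](EventID),
    [Key]       varchar(200) NOT NULL,
    [Value]     nvarchar(max) NULL
);
```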
I need to export data from SQL Server 2012 based on a view. While testing the export for a downstream system, I was manually extracting the data out of the table that the view is based on and the BIT data type columns were reporting as 1/0.
However, once I setup the view against the table, I noticed that the BIT data type columns reported as TRUE/FALSE. This happens whether I perform a select against the view or export from it.
Why does this happen and how can I maintain the same results in the view as the data table (1/0)?
The bit data type is interpreted differently by different clients. SSMS will report back a 1 or 0 for a bit, while the same 1/0 is interpreted by SSIS's Data Flow as True or False.
Whether the source is a table or a view makes no matter for SSIS unless you explicitly change the data type.
For setup, I created 2 tables and a view
CREATE TABLE dbo.BaseTable
(
SomeBit bit NOT NULL
, RowDescription varchar(50) NOT NULL
);
CREATE TABLE dbo.TargetTable
(
SomeBit bit NOT NULL
, RowDescription varchar(50) NOT NULL
, SourcePackage nvarchar(100) NOT NULL
);
GO
CREATE VIEW dbo.MyView
AS
SELECT
BT.SomeBit
, BT.RowDescription
FROM
dbo.BaseTable AS BT;
GO
INSERT INTO
dbo.BaseTable
(
SomeBit
, RowDescription
)
VALUES
(CAST(0 AS bit), 'Falsification')
, (CAST(1 AS bit), 'True dat');
GO
At this point, if I use SSMS and query either dbo.BaseTable or dbo.MyView, I will get back a 1 and 0. But again, these are just artifacts of presentation. In C, 0 is false and any numeric value that isn't 0 is true. Excel will present it as FALSE and TRUE. Every client will interpret the value into whatever the local representation of a boolean value is. SSIS chose True and False.
I built out a simple package that pulls data from BaseTable or MyView and writes it to a text file and a table.
The basic control flow looks thus
The data flow looks complex but it's not.
I select from either my table or view, add a description for my target table, use a multicast so I can send the same data to multiple destinations and then write to a file and table.
If I query SSMS for my sources and destinations, you'll see that the destination libraries handle the translation between the local and foreign representation of the data type.
There is no such translation available for a flat file because there's no "standard" for the representation of a boolean. I might like Y/N.
I tried a number of things to coerce a 1/0 to be written to the flat file. I set my data types to
Boolean DT_BOOL
Single byte signed int DT_I1
Four byte signed int DT_I4
String DT_STR
but it never mattered (which actually seems odd given how persnickety SSIS is about data types); my output was always the same:
False,Falsification
True,True dat
Ultimately, if I wanted a 0 or a 1 in that output file, I needed to change my data type: either in the source query with an explicit cast, or through a Derived Column component using the ternary operator SomeBit ? (DT_I1)1 : (DT_I1)0. Use DT_I1/I2/I4/I8 as you see fit.
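The source-query variant might look like this, a sketch against the view defined above (tinyint is one reasonable choice; any integer type works):

```sql
-- Sketch: cast the bit explicitly so SSIS sees an integer, not a boolean.
SELECT
    CAST(MV.SomeBit AS tinyint) AS SomeBit
  , MV.RowDescription
FROM dbo.MyView AS MV;
```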
Fun trivia note: if you choose to use the Data Conversion component, you're going to get 0 for False and -1 for True; the same happens if you use a lazy cast in the Derived Column component, (DT_I1)SomeBit. It seems they follow the C interpretation of boolean values.
Biml it
No need to take my word for it. Using the above table definitions and populated values, if you install the free add-on BIDS Helper, you can generate the same package for any version of SSIS.
After installing BIDS Helper, right click on an SSIS project and in the context menu, select Add Biml file. Replace the contents of that file with the below code; save and then right-click to generate a new package.
You will need to edit the values for the Flat File Connection to point to valid locations as well as point the ole db connection string to wherever you spin up your tables.
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<Connections>
<FlatFileConnection FilePath="C:\ssisdata\so_29244868.table.csv" FileFormat="FFF_table" Name="FF_Table" />
<FlatFileConnection FilePath="C:\ssisdata\so_29244868.view.csv" FileFormat="FFF_table" Name="FF_View" />
<OleDbConnection Name="CM_OLE" ConnectionString="Data Source=localhost\dev2014;Initial Catalog=tempdb;Provider=SQLNCLI11.0;Integrated Security=SSPI;" />
</Connections>
<FileFormats>
<FlatFileFormat
Name="FFF_table" IsUnicode="false" CodePage="1252"
FlatFileType="RaggedRight">
<Columns>
<Column Name="SomeBit" DataType="Boolean" Delimiter="," />
<Column Name="RowDescription" DataType="AnsiString" Length="50" Delimiter="CRLF"/>
</Columns>
</FlatFileFormat>
</FileFormats>
<Packages>
<Package ConstraintMode="Parallel" Name="so_29244868">
<Tasks>
<Dataflow Name="DFT Table example">
<Transformations>
<OleDbSource ConnectionName="CM_OLE" Name="OLE_SRC dbo_BaseTable">
<ExternalTableInput Table="dbo.BaseTable" />
</OleDbSource>
<DerivedColumns Name="DER Package name">
<Columns>
<Column DataType="String" Name="SourcePackage" Length="100">"DFT Table example"</Column>
</Columns>
</DerivedColumns>
<Multicast Name="MC Dupe">
<OutputPaths>
<OutputPath Name="FF" />
<OutputPath Name="Table" />
</OutputPaths>
</Multicast>
<FlatFileDestination ConnectionName="FF_Table" Name="FF_DST table">
<InputPath OutputPathName="MC Dupe.FF" />
</FlatFileDestination>
<OleDbDestination
ConnectionName="CM_OLE"
Name="OLE_DST Table"
TableLock="false">
<InputPath OutputPathName="MC Dupe.Table" />
<ExternalTableOutput Table="[dbo].[TargetTable]"></ExternalTableOutput>
</OleDbDestination>
</Transformations>
</Dataflow>
<Dataflow Name="DFT View example">
<Transformations>
<OleDbSource ConnectionName="CM_OLE" Name="OLE_SRC dbo_MyView">
<ExternalTableInput Table="dbo.MyView" />
</OleDbSource>
<DerivedColumns Name="DER Package name">
<Columns>
<Column DataType="String" Name="SourcePackage" Length="100">"DFT View example"</Column>
</Columns>
</DerivedColumns>
<Multicast Name="MC Dupe">
<OutputPaths>
<OutputPath Name="FF" />
<OutputPath Name="Table" />
</OutputPaths>
</Multicast>
<FlatFileDestination ConnectionName="FF_View" Name="FF_DST view">
<InputPath OutputPathName="MC Dupe.FF" />
</FlatFileDestination>
<OleDbDestination
ConnectionName="CM_OLE"
Name="OLE_DST view"
TableLock="false"
>
<InputPath OutputPathName="MC Dupe.Table" />
<ExternalTableOutput Table="[dbo].[TargetTable]"></ExternalTableOutput>
</OleDbDestination>
</Transformations>
</Dataflow>
</Tasks>
</Package>
</Packages>
</Biml>
I've run into the same problem using Entity Framework.
Try casting the bit field to a bit.
I have been able to execute a Stored Procedure in hibernate to map the result query into a Java Bean.
For example, I have this file called Profesor.hbm.xml:
<hibernate-mapping>
<class name="model.Profesor" table="Profesor" catalog="dbo">
<id name="idProfesor" type="int">
<column name="idProfesor" />
<generator class="assigned" />
</id>
<property name="Nombre" type="string">
<column name="Nombre" length="25" not-null="false" />
</property>
<property name="ApellidoP" type="string">
<column name="ApellidoP" length="20" not-null="false" />
</property>
<property name="ApellidoM" type="string">
<column name="ApellidoM" length="20" not-null="false" />
</property>
</class>
<sql-query name="getProfesors" callable="true">
<return alias="getProfesors" class="model.Profesor">
<return-property name="idProfesor" column="idProfesor"/>
<return-property name="Nombre" column="Nombre"/>
<return-property name="ApellidoP" column="ApellidoP"/>
<return-property name="ApellidoM" column="ApellidoM"/>
</return>
exec getProfesors :idTT
</sql-query>
</hibernate-mapping>
And then in my Profesor class I have the following code:
SessionFactory sf = new Configuration().configure().buildSessionFactory();
Session session = sf.getCurrentSession();

public List<Profesor> getProfesors(String idTT){
    session.beginTransaction();
    Query query = session.getNamedQuery("getProfesors");
    query.setString("idTT", idTT);
    return query.list();
}
This works perfectly, I have no problem, the stored procedure executes and it populates my model Class named Profesor with the results.
Now I have another stored procedure which INSERTs data to create a new Profesor in the Database.
I tried something like this with no success:
session.beginTransaction();
Connection c = session.connection();
CallableStatement cs = c.prepareCall( "{call addProfesor(?,?,?)}" ) ;
cs.setString(1, "George");
cs.setString(2, "Williams");
cs.setString(3, "Mathematics");
cs.executeUpdate();
It doesn't even show me an error message, the data just won't insert.
I was also reading that, just as there is a <sql-query> tag, there is a <sql-insert> tag,
but I can't see a way to call that <sql-insert>, because it doesn't have a "name" attribute like sql-query does.
With <sql-query> we could do:
Query query =session.getNamedQuery("getProfesors");
Just like I showed before; but since sql-insert doesn't have that name attribute, I don't know how to do it. I'm also forced to use stored procedures since it's a very specific requirement; otherwise I would have used other Hibernate persistence features.
Any help would be really appreciated.
You can specify "custom sql" for insert, update and delete. See the documentation about it.
These are not named queries. They are always used when Hibernate inserts, updates or deletes this entity.
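A sketch of such a custom-SQL insert mapping, assuming your addProfesor procedure accepts the mapped properties; note that the positional parameters must match the order in which Hibernate binds the columns, so verify that against your mapping before relying on it:

```xml
<!-- Sketch: inside the model.Profesor class mapping.
     Hibernate will call this procedure instead of its generated INSERT. -->
<class name="model.Profesor" table="Profesor">
  <!-- id and property elements as before ... -->
  <sql-insert callable="true">{call addProfesor(?,?,?)}</sql-insert>
</class>
```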
Another way to do it is call the stored proc from a prepared statement. This is potentially useful if the stored procedure updates more than one record (batch updates).
PreparedStatement ps = getSession().connection().prepareStatement("{exec sp_batchRateUpdate}");
ps.execute();
Please note, though: the Hibernate Session's connection() method has been deprecated since Hibernate 3.2.4 (https://hibernate.onjira.com/browse/HHH-2603).
As a temporary alternative to the deprecated method, we can use:
PreparedStatement ps = ((SessionFactoryImplementor)sessionFactory).getConnectionProvider()
.getConnection().prepareStatement("{exec sp_batchRateUpdate}");
ps.execute();