Serilog SQL Server sink with custom table

I am using the Serilog SQL Server sink, and I read the Serilog configuration from appsettings.json.
I have a custom log table with custom columns (Id, Info), and I need to save the Serilog message into the Info column. Is this possible using only the app settings configuration, without changing the source code?
I know this is possible by pushing a property to the log context, but I would rather not change the source code, since that would mean reviewing the 500 lines of code where logging is used.
If it is only possible by adding a property to the LogContext, then is it possible to map Message => Info column globally in the source code?
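For reference, a sketch of what the appsettings.json could look like, assuming the version of Serilog.Sinks.MSSqlServer in use supports renaming standard columns through columnOptionsSection (check the sink's README for your version; the Info column name comes from the question, while the connection string, table name, and removed columns are placeholders):

```json
{
  "Serilog": {
    "Using": [ "Serilog.Sinks.MSSqlServer" ],
    "WriteTo": [
      {
        "Name": "MSSqlServer",
        "Args": {
          "connectionString": "Server=.;Database=MyDb;Trusted_Connection=True;",
          "tableName": "Logs",
          "columnOptionsSection": {
            "removeStandardColumns": [ "MessageTemplate", "Properties" ],
            "message": { "columnName": "Info" }
          }
        }
      }
    ]
  }
}
```

If renaming is supported, this would route the rendered message into the Info column with no changes to the logging call sites.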


When Data source is created using ODBC Driver 17, SQLBulkOperations is not updating the database

This particular application is used to create users, and it has a UI that displays the list of all users created. When we create a user using the New and Accept buttons, it updates neither the list nor the database. We have enabled TLS 1.2 and are therefore using ODBC Driver 17.
When I changed the driver to plain "SQL Server", all the previously non-displayed users were added to the list and to the database as well.
When I debugged the code, it was failing at one particular line, the SQLBulkOperations call.
As an alternative, I changed the code to
result = SQLSetPos(m_hStatement, 1, SQL_ADD, SQL_LOCK_NO_CHANGE);
With this it is able to add the users to the database, but the problem is that the users' data is not syncing to the server.
Note: We have restarted the NodeSynchronizer service. All the configurations are correct.
Please help.

IntelliJ embedded H2 database tables do not appear

I'm creating a Spring Boot application and I'm using IntelliJ's embedded H2 database.
I have added the following lines in my application.properties file:
spring.datasource.url=jdbc:h2:~/testdb;MV_STORE=false;AUTO_SERVER=TRUE
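For completeness, a minimal application.properties sketch around that URL (the username/password values are assumptions; H2's default user is sa with an empty password):

```properties
spring.datasource.url=jdbc:h2:~/testdb;MV_STORE=false;AUTO_SERVER=TRUE
spring.datasource.driver-class-name=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=
```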
Although the connection is successful and I can query the database using IntelliJ's query console, the tables do not appear in the Database tab. The connection test reports:
Succeeded
DBMS: H2 (ver. 2.1.210 (2022-01-17))
Case sensitivity: plain=upper, delimited=exact
Driver: H2 JDBC Driver (ver. 2.1.210 (2022-01-17), JDBC4.2)
Ping: 16 ms
When I refresh the connection or go to the schemas tab of the data source configuration, I get the following error:
[42S02][42102] org.h2.jdbc.JdbcSQLSyntaxErrorException: Table "INFORMATION_SCHEMA_CATALOG_NAME" not found; SQL statement: select CATALOG_NAME from INFORMATION_SCHEMA.INFORMATION_SCHEMA_CATALOG_NAME [42102-210].
By going to the advanced tab of the data source and clicking on expert options, we are presented with a checkbox labeled "Introspect using JDBC metadata"
By checking that box, the tables successfully appear in the Database tab
Regarding why this works, this is taken from the official documentation:
https://www.jetbrains.com/help/datagrip/data-sources-and-drivers-dialog.html
Introspect using JDBC metadata
Switch to the JDBC-based introspector. Available for all the databases.
To retrieve information about database objects (DB metadata), DataGrip uses the following introspectors:
A native introspector (might be unavailable for certain DBMS). The native introspector uses DBMS-specific tables and views as a source of metadata. It can retrieve DBMS-specific details and produce a more precise picture of database objects.
A JDBC-based introspector (available for all the DBMS). The JDBC-based introspector uses the metadata provided by the JDBC driver. It can retrieve only standard information about database objects and their properties.
Consider using the JDBC-based introspector when the native introspector fails or is not available.
The native introspector can fail when your database server version is older than the minimum version supported by DataGrip.
You can try to switch to the JDBC-based introspector to fix problems with retrieving the database structure information from your database. For example, when the schemas that exist in your database or database objects below the schema level are not shown in the Database tool window.

How to display an image in a SharePoint list from SQL Server

I have the following problem: I want to display an image in a column of type "Hyperlink or Picture" in a SharePoint list, where the image comes from a column of data type "image" in a SQL Server table. To give more context, I will explain the process in detail.
I have an application developed in PowerApps which is connected to SQL Server. The application stores information in a table, and among that information is an image, which is the one I mentioned above and which I save in the SQL Server table.
As I mentioned, the image is saved in an "image" data type column with content like "0xFFD8FFE000104A46494600010101004800480000FFE…" (the value is very long, so I do not include it completely).
In the same application I have a display screen of the image which I can see without any problem after saving the record.
In addition to this application, I have a workflow developed with Power Automate which is a scheduled flow that runs at a certain time of the day. What this flow does is get the rows from my table in SQL Server and then create an element in a Sharepoint list, I do this to export the information from SQL Server to a Sharepoint list.
Among the information that I export, the "Image" column does not appear in the flow's output.
For this reason, after the flow ends, the image is not stored in the SharePoint list.
It is for this reason that I request help to correctly display the image from SQL Server in the SharePoint list.
Is there something wrong that I'm doing in my workflow in Power Automate?
Why is the image not displayed in the SharePoint list?
Is there any other solution to store the image so that it is then displayed correctly in the SharePoint list?
Or do you think I have to change the data type with which I am storing the image in the SQL Server table?
Conditions to take into account:
It is not possible to connect the PowerApps application to the SharePoint list directly, since the SQL Server table will be used by other external applications, and that information is required to be in a SQL Server table.
Update 1:
I am currently getting an error when running the flow.
The error occurs on one particular record, one of the records which has an image in SQL Server; this image must be stored in the SharePoint list. The flow throws the following error:
OpenApiOperationParameterTypeConversionFailed. The 'inputs.parameters' of workflow operation 'Create_item' of type 'OpenApiConnection' is not valid. Error details: Input parameter 'item/Image' is required to be of type 'String/uri'.
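The error says the SharePoint "Hyperlink or Picture" field expects a URL string, while the flow is passing raw binary. One possible transformation is to turn the hex varbinary value into a base64 data URI; the sketch below (Python, purely to illustrate the conversion) uses the truncated sample value from the question. Note that SharePoint picture columns may not render data URIs, in which case uploading the image to a document library and passing its real URL is the safer route:

```python
import base64

# Truncated sample varbinary value from the question (a JPEG header)
hex_value = "0xFFD8FFE000104A46494600010101004800480000"

raw = bytes.fromhex(hex_value[2:])  # strip the "0x" prefix, decode hex to bytes
data_uri = "data:image/jpeg;base64," + base64.b64encode(raw).decode("ascii")

print(data_uri)  # "/9j/" after the comma is the base64 signature of a JPEG
```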

Error when copying CSV files from Windows directory into SQL Server DB by using Apache NiFi

I am trying to copy CSV files from my local directory into a SQL Server database running in my local machine by using Apache NiFi.
I am new to the tool and I have spent a few days googling and building my flow. I managed to connect to the source and the destination, but I am still not able to populate the database, since I get the following error: "None of the fields in the record map to the columns defined by the tablename table."
I have been struggling with this for a while and I have not been able to find a solution in the Web. Any hint would be highly appreciated.
Here are further details.
I have built a simple flow using the GetFile and PutDatabaseRecord processors.
My input is a simple table with 8 columns.
For the GetFile processor, I have added the input directory and left the rest as default.
For the PutDatabaseRecord processor, I have referenced the CSVReader and DBCPConnectionPool controller services, used the MS SQL 2012+ database type (I have the 2019 version), configured the INSERT statement type, entered the schema and correct table name, and left everything else as default.
The CSVReader is configured with Schema Access Strategy = Use String Fields From Header and CSV Format = Microsoft Excel.
For the DBCPConnectionPool, I have added the correct URL, DB driver class name, driver location, DB user and password.
Finally, I have created a table in the database to host the content.
Many thanks in advance!
The warning "None of the fields in the record map to the columns defined by the tablename table." is also raised when the processor cannot find the table at all, which can happen even when the table name is configured correctly in PutDatabaseRecord but there is an issue with the user's access rights (which turned out to be the actual cause of my error).
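As a quick sanity check for the mapping side of that warning, the CSV header names must match the table's column names (subject to the reader's and the processor's name-translation settings). A minimal sketch of that check, with hypothetical column names:

```python
import csv
import io

# Hypothetical CSV content and table columns, for illustration only
csv_data = "id,first_name,last_name\n1,Ada,Lovelace\n"
table_columns = {"id", "first_name", "last_name"}  # as defined in SQL Server

header = next(csv.reader(io.StringIO(csv_data)))
unmatched = [field for field in header if field not in table_columns]
print(unmatched)  # an empty list means every CSV field maps to a column
```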

Find Which applications access my server using Profiler Trace with Application Name column

I need to find out which applications use my SQL Server.
I'm using a Profiler trace to do this (if there's another way to do it, I would appreciate it).
In Profiler I'm using a Replay template, and after looking at the trace result I see there's a column called Application Name. I'm wondering if there's a way to get the distinct values (the trace is in a .trc file).
(By the way, is this supposed to be posted on Stack Overflow or Server Fault?)
Thanks,
Gabriel
Try this:
SELECT DISTINCT ApplicationName
FROM ::fn_trace_gettable('C:\YourFolder\YourTraceFile.trc', DEFAULT) t
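Since the question also asks about alternatives to Profiler, a sketch that queries the current sessions directly (this only shows applications connected at the moment you run it, not historical ones):

```sql
SELECT DISTINCT program_name
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;
```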
You can actually do this right from within Profiler in SQL Server 2008.
Create a trace with the following two events:
Security Audit : Audit Login
Security Audit : Existing Connection
For those two events, capture the following columns:
Event Class
Application Name
SPID (required)
Event Sub Class
Add a filter to Event Subclass to restrict it to values of 1. This filter will only capture non-pooled logins. This should give you all your existing connections and any new logins that occur during the time you are running your trace.
Next, in the organize columns, move Application Name up to the "Groups" section. This will now group all the results by the Application Name.
This is a pretty lightweight trace and shouldn't put much (if any) load on the server if you restrict it to just those events and apply the filter.
(I'm pretty sure previous versions work the same way. I just don't have one in front of me to test.)
