I have to implement a scenario where a user logs in to the application, and I have to validate the queries that were executed during the load run of the login scenario. I know that the JDBC sampler allows running specific queries and returns their responses, but that is not what I need here. I want to check exactly which queries are initiated when a number of users log in to the application. A road map or an associated tool would be very helpful in this regard.
First of all, check whether there is an APM tool in place, as well-behaved APM tools can show the SQL queries associated with the HTTP requests.
If there isn't one, your only option is the query log on the database side. Depending on the database type, you can either use the aforementioned JDBC Request sampler (if the database exposes its query log via SQL) or, if it doesn't, go to the database server directly and fetch the query log from the command line using the OS Process Sampler.
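If the application under test happens to run on SQL Server, one rough option is to read the plan-cache DMVs after the load run. Below is a minimal C# sketch; the server name is a placeholder, the same SELECT can be pasted straight into a JDBC Request sampler instead, and note that the plan cache only approximates a true query log (it needs VIEW SERVER STATE permission, and entries can be evicted):

    using System;
    using System.Data.SqlClient;

    class QueryLogDump
    {
        static void Main()
        {
            // Most recently executed statements according to the plan cache.
            // This approximates "what ran during the test"; a complete audit
            // needs Extended Events or a server-side trace instead.
            const string sql = @"
                SELECT TOP (100) qs.last_execution_time, st.text
                FROM sys.dm_exec_query_stats AS qs
                CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
                ORDER BY qs.last_execution_time DESC";

            // Hypothetical test-environment server name.
            using (var conn = new SqlConnection("Server=testdb;Integrated Security=true"))
            using (var cmd = new SqlCommand(sql, conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                    while (reader.Read())
                        Console.WriteLine($"{reader["last_execution_time"]}  {reader["text"]}");
            }
        }
    }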
So I have an Azure SQL Database instance that I need to run a nightly data import on, and I was going to schedule a stored procedure to make a basic GET request against an API endpoint, but it seems like the OLE object isn't present in the Azure version of SQL Server. Is there any other way to make an API call available in Azure SQL Database, or do I need to put in place something outside of the database to accomplish this?
There are several options. I do not know whether a PowerShell job, as suggested in the first comment on your question, can execute HTTP requests, but I do know of at least a couple of options:
Azure Data Factory allows you to create scheduled pipelines to copy/transform data from a variety of sources (like HTTP endpoints) to a variety of destinations (like Azure SQL databases). This involves little or no scripting.
Azure Logic Apps allows you to do the same:
With Azure Logic Apps, you can integrate (cloud) data into (on-premises) data storage. For instance, a logic app can store HTTP request data in a SQL Server database.
Logic apps can be triggered on a schedule as well, and involve little or no scripting.
You could also write an Azure Function that runs on a schedule, calls the HTTP endpoint, and writes the result to the database (see the sketch below). Multiple languages are supported for writing functions, such as C# and PowerShell.
All of these options also let you force an execution outside the schedule.
In my opinion, Azure Data Factory (no coding) or an Azure Function (code only) is the best option given the need to parse a lot of JSON data. Do bear in mind, though, that Azure Functions on a Consumption Plan have a maximum execution time of 10 minutes per invocation.
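To make the Azure Function option concrete, here is a minimal sketch assuming the in-process C# model with a timer trigger. The endpoint URL, the dbo.ImportStaging table, and the SqlConnectionString app setting are placeholders, not something your environment necessarily has:

    using System;
    using System.Data.SqlClient;
    using System.Net.Http;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class NightlyImport
    {
        private static readonly HttpClient Http = new HttpClient();

        // NCRONTAB "sec min hour day month day-of-week": every night at 02:00 UTC.
        [FunctionName("NightlyImport")]
        public static async Task Run(
            [TimerTrigger("0 0 2 * * *")] TimerInfo timer, ILogger log)
        {
            // Hypothetical API endpoint returning the nightly JSON payload.
            string json = await Http.GetStringAsync("https://api.example.com/export");

            var connStr = Environment.GetEnvironmentVariable("SqlConnectionString");
            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(
                "INSERT INTO dbo.ImportStaging (Payload, LoadedAtUtc) " +
                "VALUES (@p, SYSUTCDATETIME())", conn))
            {
                // Stage the raw payload; shred it with OPENJSON in a second step.
                cmd.Parameters.AddWithValue("@p", json);
                await conn.OpenAsync();
                await cmd.ExecuteNonQueryAsync();
            }

            log.LogInformation("Nightly import finished at {Time}", DateTime.UtcNow);
        }
    }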
I'm using Delphi 10.3 and Advantage Database.
Here, I have used the sp_SignalEvent method and then the TADSEvent, which gets triggered for all ADS-connected applications, and handled the operation based on the requests.
Is there a similar mechanism present in SQL Server?
SQL Server has Query Notifications:
Built upon the Service Broker infrastructure, query notifications allow applications to be notified when data has changed. This feature is particularly useful for applications that provide a cache of information from a database, such as a Web application, and need to be notified when the source data is changed.
Query Notifications in SQL Server
Working with Query Notifications
Detecting Changes with SqlDependency
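To sketch what that looks like from .NET: a minimal SqlDependency example in C#. The connection string and the dbo.Items table are placeholders, and Service Broker must be enabled on the database first:

    using System;
    using System.Data.SqlClient;

    class ChangeListener
    {
        // Placeholder connection string; the database needs
        // ALTER DATABASE MyDb SET ENABLE_BROKER first.
        const string ConnStr = "Server=.;Database=MyDb;Integrated Security=true";

        static void Main()
        {
            SqlDependency.Start(ConnStr);
            Subscribe();
            Console.ReadLine(); // keep the process alive while listening
            SqlDependency.Stop(ConnStr);
        }

        static void Subscribe()
        {
            using (var conn = new SqlConnection(ConnStr))
            // Notification queries have rules: explicit column list,
            // two-part table names, no SELECT *.
            using (var cmd = new SqlCommand("SELECT Id, Name FROM dbo.Items", conn))
            {
                var dependency = new SqlDependency(cmd);
                dependency.OnChange += (sender, e) =>
                {
                    Console.WriteLine($"Change detected: {e.Info}");
                    Subscribe(); // a notification fires only once; re-subscribe
                };
                conn.Open();
                using (cmd.ExecuteReader()) { } // executing registers the subscription
            }
        }
    }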
I am currently doing performance testing with JMeter for an application in my company. The application requires a login, and I have to use multiple users. The way this is usually done in JMeter seems to be via a CSV dataset; however, I have access to the database and can read all the credentials from there directly.
Since it is a test environment, all users have the same password, so I can hardcode it, but I need the list of usernames. JMeter can already do JDBC requests, but I was wondering if there was any way to use the results from such a request as a dataset.
The ideal way would be for me to query the database for the usernames and use those as input for the login test.
Does anyone know if this is possible?
Sure, it is possible.
1. Follow the steps from The Real Secret to Building a Database Test Plan With JMeter to establish the JDBC connection and execute your query.
2. Define a variable in the "Variable Name" input of the JDBC Request sampler, e.g. actor.
3. The query results will then be available as numbered variables:
actor_1=John
actor_2=Doe
etc.
If you need to deal with several columns, the approach is the same: provide a comma-separated list of variable names, one per column, and each column gets its own numbered set of variables. You can then feed the values into your Login request, for example with a ForEach Controller iterating over the username prefix. See the official documentation on the JDBC Request sampler for an example.
There are 5 different types of logging in SSIS:
Event Log
Text File
XML File
SQL Server
SQL Server Profiler
I am in a production environment where developers do not have access to production systems.
Which logging method should be my poison of choice, and why?
If you're not going to have access to the production server, then SQL Server logging is your best bet by far. You'll have plenty of ways of viewing the logged information, for example via custom SSRS reports or web pages, or direct access to the tables if your DBA allows it. Also, the logs will be easier to search and filter when in a table.
Personally I prefer logging to SQL Server.
I think this is because it puts the data in a form which I can immediately access and process. For example, I can then slice and dice the data, export it to another server, set up Agent jobs to monitor the logs and send email alerts, etc.
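For example, once packages log to SQL Server, pulling the latest errors and warnings is a plain query. A hedged C# sketch, assuming the default dbo.sysssislog table (dbo.sysdtslog90 on SQL Server 2005) and placeholder server/database names:

    using System;
    using System.Data.SqlClient;

    class SsisLogReader
    {
        static void Main()
        {
            // dbo.sysssislog is created when the "SQL Server" log provider is
            // used (dbo.sysdtslog90 on SQL Server 2005). Names are placeholders.
            const string connStr =
                "Server=prodsql;Database=SSISLogs;Integrated Security=true";
            const string sql = @"
                SELECT TOP (50) [event], source, starttime, message
                FROM dbo.sysssislog
                WHERE [event] IN ('OnError', 'OnWarning')
                ORDER BY starttime DESC";

            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(sql, conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                    while (reader.Read())
                        Console.WriteLine($"{reader["starttime"]} [{reader["event"]}] " +
                                          $"{reader["source"]}: {reader["message"]}");
            }
        }
    }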
Have you looked at BI xPress from Pragmaticworks? They have serious auditing features for SSIS:
SSIS Logging And Auditing, Notification, Deployment using BI xPress
Just looking at the requirements of a new project and I wanted to make sure this use case was sound:
user fills in InfoPath (2003) form locally on their PC
a button within the InfoPath form titled 'Submit' brings up a new Outlook (2003) email message with the InfoPath form attached. The user presses Send and the email is sent to an Exchange mailbox.
SQL Server periodically checks this mailbox, downloading any new submissions with the InfoPath form attached
SQL Server parses the attachment and the fields within the InfoPath form.
Is SQL Server capable of parsing mail attachments this way? Any caveats with this approach?
The attraction of using Outlook as the submission technology is that the process for the user is the same if they are offline. Outlook will then automatically sync when they come back online. It is essential that users have some way to fill the forms in offline, 'submit' them, and then have them synced automatically with the server when they next come online.
edit: to clarify, I am not looking for a way to cache form data from the server->client. I am looking to cache the completed form. Building a separate application to cache the completed reports on the client is not an option.
Later versions of SQL Server are capable of running .NET code within them, and as such you might be able to poll a mailbox from SQL Server and process an InfoPath form. However, I'm not sure I'd do it this way.
It might be better to consider writing a Windows Service that does this work. The Windows Service would start up, inspect the mail box of a "service account", read the mails, extract the attachments, process the xml and, eventually, write the data to SQL. It could also, presumably, respond to that mail with a confirmation or errors if business rules or validation errors occurred.
I'm not sure I'd put all of the above logic into SQL - for one thing, I suspect you'd have issues with accounts (having to have the account SQL was running under be able to access the Exchange mailbox account).
Your mileage may vary, and you should prototype this to determine what works best for you, but I'd try to keep the code that uses Exchange as a "work queue" separate from SQL, and only put the code that deals with writing data into tables in SQL.
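As a rough sketch of that work-queue service, assuming the Exchange Web Services Managed API is available (Exchange 2007 or later; the era of this question might require WebDAV or MAPI instead) and with a hypothetical service account and URL:

    using System;
    using System.Text;
    using Microsoft.Exchange.WebServices.Data; // EWS Managed API

    class FormMailPoller
    {
        static void Main()
        {
            var service = new ExchangeService(ExchangeVersion.Exchange2007_SP1)
            {
                // Hypothetical service account and EWS URL.
                Credentials = new WebCredentials("svc-forms", "password", "CORP"),
                Url = new Uri("https://mail.example.com/EWS/Exchange.asmx")
            };

            // Pull a batch of mails from the service account's inbox.
            var found = service.FindItems(WellKnownFolderName.Inbox, new ItemView(25));
            foreach (Item item in found)
            {
                var message = EmailMessage.Bind(service, item.Id,
                    new PropertySet(ItemSchema.Attachments));
                foreach (Attachment att in message.Attachments)
                {
                    if (att is FileAttachment file && file.Name.EndsWith(".xml"))
                    {
                        file.Load(); // downloads the attachment content
                        var formXml = Encoding.UTF8.GetString(file.Content);
                        SaveToStaging(formXml); // parse + insert via SqlClient (not shown)
                    }
                }
                item.Delete(DeleteMode.MoveToDeletedItems);
            }
        }

        static void SaveToStaging(string formXml) { /* SqlClient insert goes here */ }
    }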
I would not use the approach you outlined above. There are several approaches that appear to me to be more desirable than having SQL Server looking at an Exchange Mailbox. The major point that you make and an important requirement is that the InfoPath form be allowed to work in offline mode. I would think of the "offline mode" and the "data transfer" parts of your project as two distinct and separate pieces: 1) The form and the data should be stored on the client until a connection to the Internet is available and 2) once the connection is available the form and data should be transferred to the server.
You can set up your InfoPath form to submit directly to the SQL Server and bypass the Exchange "middleman" entirely. The setup in InfoPath when you are designing your form is pretty straightforward: 1) you enable "Submit data" for the connection and 2) you configure the submit options. This article has the details about how to do that. Furthermore, your connection to the SQL Server may be set up for offline use, as discussed in this article. The only caveat with this approach is that you may need to change your database schema to support it.
Another approach is to have your InfoPath form submit to a SQL Server 2005 HTTP Endpoint. The InfoPath client is just a glorified XML editor, and the HTTP Endpoint is basically a different name for a web service. You receive the form data at the HTTP Endpoint into a staging table, where the data is stored as XML, and then you can do your parsing of that data from that staging area. You will still have to set up the InfoPath connection for offline use. The major caveat with this approach is that Microsoft is deprecating HTTP Endpoints in SQL Server 2008 in favor of WCF.
And the other approach I would like to suggest is to use WCF itself to receive the XML form data from the InfoPath client. This approach requires you to connect the form's data source to your WCF web service at design time and then also set up the form for offline use.
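A minimal sketch of such a WCF service; the contract name and the dbo.FormStaging table are hypothetical, and the InfoPath data connection would be pointed at this service's endpoint:

    using System.Data.SqlClient;
    using System.ServiceModel;

    [ServiceContract]
    public interface IFormIntake
    {
        // InfoPath submits the form's XML payload to this operation.
        [OperationContract]
        void SubmitForm(string formXml);
    }

    public class FormIntake : IFormIntake
    {
        public void SubmitForm(string formXml)
        {
            // Stage the raw XML first; shred it into relational tables in a
            // second step (e.g. with OPENXML or the xml data type's methods).
            using (var conn = new SqlConnection(
                "Server=.;Database=Forms;Integrated Security=true"))
            using (var cmd = new SqlCommand(
                "INSERT INTO dbo.FormStaging (FormXml) VALUES (@xml)", conn))
            {
                cmd.Parameters.AddWithValue("@xml", formXml);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }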
I hope that this will be useful to you and at the very least point you in the right direction.
I've seen similar projects that resorted to an Express edition on the client: save the InfoPath form (or app data) in Express and use Service Broker to deliver it to the center, because of the guaranteed delivery semantics of SSB vs. mail. This gives you an all-SQL solution that is easier to sell to IT, and you don't need polling on the server. Also, you will not have to deal with MIME parsing; it is all straightforward XML processing. It is not for the faint of heart though; getting SSB up and running is a challenge.
If you decide to go with mail delivery, an external service will arguably be easier to build, debug, and troubleshoot. There are some finer points you should have an answer for:
- How will you keep the mail dequeue operations and the table write operations consistent? Your component must engage the Exchange read/delete and the SQL insert in one distributed transaction (see the sketch after this list).
- Is your logic prepared to deal with InfoPath docs coming out of order? Mail transport makes absolutely no guarantee about the order of delivery, so you may see an 'order delete' doc before the 'order create' doc.
- How are you going to detect missing documents (not delivered by mail)? Are you going to implement a sender sequence number and end up reinventing TCP on top of mail?
- Does your processing allow for parallel processing of correlated documents? If thread 1 picks up doc 1 and thread 2 picks up doc 2 from the same sender, and doc 2 is correlated with doc 1 (i.e. they refer to the same business transaction), what will happen at the database write? Will it deadlock, will it lose an update, will one be rolled back?
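On the first point, a hedged C# sketch of the shape this takes. Exchange will not enlist in a distributed transaction on its own, so in practice the pattern degrades to committing the SQL write first, deleting the mail second, and detecting re-reads; all names below are placeholders:

    using System;
    using System.Data.SqlClient;
    using System.Transactions;

    class DequeueStep
    {
        static void ProcessOne(string formXml, Guid messageId)
        {
            using (var scope = new TransactionScope())
            using (var conn = new SqlConnection(
                "Server=.;Database=Forms;Integrated Security=true"))
            {
                conn.Open();
                // A unique key on MessageId lets a re-read be detected as a
                // duplicate-key error instead of silently double-inserting.
                var cmd = new SqlCommand(
                    "INSERT INTO dbo.FormStaging (MessageId, FormXml) " +
                    "VALUES (@id, @xml)", conn);
                cmd.Parameters.AddWithValue("@id", messageId);
                cmd.Parameters.AddWithValue("@xml", formXml);
                cmd.ExecuteNonQuery();

                scope.Complete(); // commit the SQL side first...
            }
            // ...then delete the mail outside the transaction; the mail store
            // is not a transactional resource manager here, so a crash between
            // commit and delete just causes a detectable re-read.
        }
    }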
There are many dragons under the bridge ahead...