LightSwitch: can it create a real-time SQL Server DB monitoring application?

Can LightSwitch Be Used To Create A Web-Based, Real-Time SQL Server Database Monitoring Application?
In other words, if I have one or more queries that I run in SQL Server Management Studio's query tool to get various pieces of information, can I use LightSwitch to create an IE-based version that executes the same queries against the same SQL database and re-executes them on a timed interval, so that I effectively have a real-time monitoring application, or live report, that shows the information I choose?
SQL Server Management Studio has a great tool called the Activity Monitor that, on a fixed interval (a value the user can change), re-queries a number of system views and other sources to provide a monitoring-style interface that is effectively a live report. It's live because it continually re-queries the data source without the user having to do anything.
For a long time I've been using pre-defined queries in SSMS's query tool to continually check on data I've defined (as opposed to system views created by someone at Microsoft), and I would love a way to do this without having to use SSMS, in a way that auto-executes the queries on a specific interval so I don't have to continually press F5.
If there is another solution aside from LightSwitch that can do this and doesn't cost an arm and a leg, I'd love to hear about it.
Thanks

You would need to attach the database to LightSwitch and recreate the queries there, then create a screen to display the relevant data. But yes, LightSwitch can do what you want; you just need to implement a timer to refresh the screen on the interval you define. I do something similar in my LightSwitch app. I followed this guide:
http://lightswitchspecial.blogspot.in/2012/02/autorefresh-lightswitch-screen.html
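For concreteness, here is a minimal sketch of the kind of self-refreshing query this replaces, built on the standard DMVs (the 10-second interval and the column choice are assumptions, not anything from the guide above). In LightSwitch the SELECT alone would become a query behind the timer-refreshed screen; run as-is in SSMS, the WHILE/WAITFOR loop already removes the need to keep pressing F5:

    -- Minimal sketch: re-run an Activity Monitor-style query on a fixed interval.
    WHILE 1 = 1
    BEGIN
        SELECT r.session_id,
               r.status,
               r.command,
               r.wait_type,
               r.total_elapsed_time AS elapsed_ms,
               t.text               AS query_text
        FROM sys.dm_exec_requests AS r
        CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
        WHERE r.session_id <> @@SPID;   -- exclude the monitoring session itself

        WAITFOR DELAY '00:00:10';       -- assumed refresh interval
    END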

Related

Automated SQL Server slow query report?

I am a developer and performance tester but not a DBA. My team is working on a performance testing tool that is specific to our software. One of the features we want it to have is the ability to generate a database report immediately after the test. Our software is database agnostic. For Oracle, I can easily create a snapshot id before and after the test and programmatically create an AWR report for those snapshots, write to a file and save with other artifacts we gather. Works great.
For SQL Server, however, there is no AWR equivalent (that I know of). I know the MDW (Management Data Warehouse) in SSMS has a UI for getting things like the top 10 slow SQL statements. But I have not yet found a way to programmatically create and extract a SQL performance report (preferably similar to Oracle's AWR) for SQL Server.
I am even willing to create the report myself if I can find a way to extract the raw data.
Any ideas would be greatly appreciated because searching online is not getting me anywhere.
P.S. I'm trying to do this in Java, by the way, but will accept help in any language. Thanks again!
Good news! In SQL Server 2016 you can use the Query Store. It acts like a flight recorder black box for your database, capturing long-running queries and waits, with baseline capture built into SQL Server. You can compare how queries behave before and after hardware changes and/or upgrades. It is probably the closest thing to Oracle's AWR.
Note that it is only available in SQL Server 2016 and up.
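As a sketch of how that raw data can be pulled programmatically (e.g. over JDBC from Java, per the question), the Query Store catalog views can be queried directly once the feature is enabled with ALTER DATABASE ... SET QUERY_STORE = ON. The metric and the TOP (10) cut-off below are assumptions; adjust to taste:

    -- Top 10 queries by average duration from the Query Store (SQL Server 2016+).
    -- avg_duration is reported in microseconds.
    SELECT TOP (10)
           q.query_id,
           qt.query_sql_text,
           SUM(rs.count_executions)      AS executions,
           AVG(rs.avg_duration) / 1000.0 AS avg_duration_ms
    FROM sys.query_store_query AS q
    JOIN sys.query_store_query_text    AS qt ON qt.query_text_id = q.query_text_id
    JOIN sys.query_store_plan          AS p  ON p.query_id = q.query_id
    JOIN sys.query_store_runtime_stats AS rs ON rs.plan_id = p.plan_id
    GROUP BY q.query_id, qt.query_sql_text
    ORDER BY avg_duration_ms DESC;

Running this before and after a test and diffing the two result sets approximates the AWR-style before/after workflow.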

SQL Server Reporting Services - Ability to trigger report generation from stored procedure

I'm just starting to explore the Visual Studio Reporting Services and, before I dive deep into the details, I need to know if the following is feasible.
I have a web application that, through a servlet, interacts with a SQL Server database (2012).
Now, I need to realize a scenario in which a user (web client) clicks a button, a report is created on the server side, and a link to the report is returned (so that the user may download it).
The reports could be either in Excel or PDF format.
Since I have an active and perfectly working interface with the database, I would like to realize this scenario using the same mechanisms, meaning invoke a stored procedure that triggers the report generation (with the report stored in a pre-defined folder under a pre-defined name).
My questions are:
Is this at all possible?
If yes, could anyone provide some guidelines on how to tackle this in the right way?
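For what it's worth, a commonly cited approach to exactly this scenario is to pre-create a file-share subscription for the report (SSRS subscriptions can render to Excel or PDF and write to a pre-defined folder with a pre-defined name) and then fire that subscription from a stored procedure via the ReportServer catalog. A hedged sketch, where the report path is hypothetical and the ReportServer internals are undocumented and version-dependent:

    -- Fire an existing SSRS file-share subscription from T-SQL.
    DECLARE @SubscriptionID nvarchar(36);

    SELECT @SubscriptionID = CONVERT(nvarchar(36), s.SubscriptionID)
    FROM ReportServer.dbo.Subscriptions AS s
    JOIN ReportServer.dbo.[Catalog]     AS c ON c.ItemID = s.Report_OID
    WHERE c.[Path] = '/Reports/MyProjectReport';   -- hypothetical report path

    EXEC ReportServer.dbo.AddEvent
         @EventType = N'TimedSubscription',
         @EventData = @SubscriptionID;

The servlet can then poll the pre-defined folder for the rendered file and hand its link back to the web client.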

Visual Studio Load Test w/o Agents - Manually Executing and Aggregating Results

My team recently adopted Visual Studio's Web Performance Test/Load Testing solution. Our test plans are developed, and we are preparing to begin collecting baseline performance and stress test results against a corporate MVC application.
Due to corporate network security "features", Microsoft's Agents/Controller on-premises test distribution solution is not an option. Furthermore, the TFS Virtual Lab and Azure Virtual Lab load test distribution solutions are also not viable options due to security infrastructure and resource limitations.
Because of these constraints, it seems our only option is to run a Visual Studio Load Test from each developer machine (at a coordinated time, through different internet connections). *If anyone has another solution, I'm certainly receptive.
Assuming we take this approach, I'm concerned the results Visual Studio stores in the "LoadTest2010" SQL repository will not accurately reflect the combined results of all the developer machines' load tests.
My questions are:
Is this approach even viable?
If so, what is the best way to combine the separate Load Test SQL repositories into a single SQL Database (keeping in mind connecting to a central SQL Server during test execution is not an option)?
Assuming we import all the testers' results into a central database, does anyone have an idea of how to report on composite test results? I'm assuming they'll all have different TestRunIds, which seems like it would break Microsoft's built-in views and stored procedures for analyzing test results.
Putting all the test runs into one database can be done by exporting the results from all the secondary places and importing them into one database. Use the Open and manage load test results commands. See https://sqa.stackexchange.com/a/14503/6752 for more details.
Combining the results from several runs cannot be done, as far as I know, within Visual Studio. However each "graph" can be exported to Excel where you can manually merge the results. The rows of each "table" (but, unfortunately, not the headers) can be copied and pasted into Excel.
I prefer the Export graph data to Excel and Export graph data to text (.csv) commands over the Create Excel report. (The two Export... commands are not available for tables.) The reason being that the "Create Excel report" requires Visual Studio to be run as an Administrator and I have not found a sensible way of letting the Administrator user have access to my non-Administrator load test database.
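Once all of the runs are imported into one repository, a starting point for composite reporting is to enumerate the runs and do the per-run grouping yourself rather than relying on the built-in views. The sketch below is only a rough guess at the LoadTest2010 schema; the table and column names are assumptions to verify against your own repository:

    -- List the imported runs; downstream composite reports can then
    -- aggregate per LoadTestRunId across all developer machines' runs.
    SELECT LoadTestRunId, LoadTestName, StartTime, EndTime
    FROM LoadTest2010.dbo.LoadTestRun
    ORDER BY StartTime;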

Best approach to move data between SQL Servers

Ok, let me explain the environment we are facing here:
We have an ASP.NET MVC 4 app that uses a SQL Server database.
This app isolates data in "projects", so any user who connects to it can only work on the data of one of these projects.
Sometimes... a group of users has to travel to remote regions for some days to retrieve data for a single project, and quite often they won't have an internet connection (even mobile or satellite solutions are often out of reach).
While the displaced team works on a project, people at the office can still work on the rest of the projects (but not on the one that is abroad).
So... we are pondering the possibility of using a laptop as a "mobile server", where users can download the data for a specific project before travelling. While abroad, they can work against this "mobile server" and update any data in their project, and when they come back they can upload their updated data to the main server.
Our idea is to create stored procedures on both servers (main and mobile) that execute the queries needed to move a project's data between them, passing the project identifier as a parameter, probably using linked servers so that main and mobile can see each other during update operations. A sketch of such a procedure follows the questions below.
Our questions here are:
Is this a good approach?
Is there any other better approach that we're not seeing?
Are there any risks we should pay attention to with this or other approaches?
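For concreteness, here is a rough sketch of the kind of cross-server stored procedure the question envisions, assuming a linked server named [MobileServer] and a hypothetical dbo.ProjectData table keyed on RowId. Note that it handles neither deletes nor conflicts, which the answer below touches on:

    -- Pull one project's rows back from the mobile server after a trip.
    -- All object names are hypothetical; the linked server must already exist.
    CREATE PROCEDURE dbo.PullProjectFromMobile
        @ProjectId int
    AS
    BEGIN
        SET NOCOUNT ON;

        MERGE dbo.ProjectData AS target
        USING (SELECT RowId, ProjectId, Payload, ModifiedUtc
               FROM [MobileServer].FieldDb.dbo.ProjectData
               WHERE ProjectId = @ProjectId) AS source
           ON target.RowId = source.RowId
        WHEN MATCHED THEN
            UPDATE SET target.Payload     = source.Payload,
                       target.ModifiedUtc = source.ModifiedUtc
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (RowId, ProjectId, Payload, ModifiedUtc)
            VALUES (source.RowId, source.ProjectId, source.Payload, source.ModifiedUtc);
    END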
I've never used bidirectional transactional replication, so if that works for you, problem solved. I do have quite a bit of experience with data migration, including merging large data sets into software-driven systems, and from that experience replication has hurt us more than it has helped us (from a migration/merge point of view).
The biggest challenge, in my opinion, is going to be conflict resolution. I know you say that all of the data is in project-specific databases, but is there really no shared data at all? What about multiple remote users updating the same data? In those cases you're going to need a little more than just replication.
Instead of maintaining two databases at all times (one for mobile, one as the regular in-house DB), why not a system where a job is run on your main system indicating that a project needs to be prepared for "offline mode" (the job could be stored procedures, SSIS packages, or straight T-SQL)? Whatever the technology used, this job would copy all of the requested project data to a new database on the remote server/laptop and mark that data somehow in the main database as read-only, to prevent users in the office from updating it.
Once the data is in offline mode on the remote server, the users can update and use the data as much as they want from that remote server. Then when the users get an internet connection or they are back in the office they can kick off another job that syncs the data to the main server, removes offline mode, and deletes/archives the remote database. Almost like a temporary project database.
Seriously, it sounds like a fun project.
Technologies to look at:
SSIS (SQL Server Integration Services) - In my experience, this is extremely fast at moving data and gives you the ability to add logic for conflict resolution, error handling, etc. It's free (with certain SQL Server editions) and the community is huge, so supporting it should be easy. SSIS is not as dynamic as some of the specialized solutions out there.
A data migration suite like Pervasive's Data Integrator - I loved this, but it's expensive. You could write an entire solution in this product that handles the processing of your data bidirectionally, and like SSIS it allows for complex programming logic.
T-SQL - With a linked server you could just write straight queries (using stored procedures if you wanted). The problem here is security on the linked server; we don't use them because of this issue. See: Linked Servers: Good or Bad?
Start using some of Microsoft's built-in change detection technologies right off the bat; they're harder to implement when you're already using the system. Change Data Capture (CDC) will give you a full history of the records updated, while Change Tracking will give you a lightweight summary of your changes. Using either technology will make syncing the data many times easier; a setup sketch follows the links below.
Change Tracking: http://msdn.microsoft.com/en-us/library/bb933874.aspx
Change Data Capture: http://msdn.microsoft.com/en-us/library/cc645937.aspx
SSIS: http://msdn.microsoft.com/en-us/library/ms169917.aspx
SQL Server Agent Jobs: http://msdn.microsoft.com/en-us/library/ms189237.aspx
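As mentioned above, here is a minimal Change Tracking setup sketch. The database name, the table name, and the assumption that RowId is the table's primary key (change tracking requires one) are all hypothetical:

    -- Enable change tracking on the database and on one table.
    ALTER DATABASE ProjectDb
    SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 14 DAYS, AUTO_CLEANUP = ON);

    ALTER TABLE dbo.ProjectData
    ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = ON);

    -- During a sync, pull everything changed since the last synced version.
    DECLARE @last_sync_version bigint = 0;   -- persisted from the previous sync

    SELECT ct.SYS_CHANGE_OPERATION,          -- I = insert, U = update, D = delete
           ct.RowId,                         -- CHANGETABLE exposes the primary key
           p.Payload
    FROM CHANGETABLE(CHANGES dbo.ProjectData, @last_sync_version) AS ct
    LEFT JOIN dbo.ProjectData AS p ON p.RowId = ct.RowId;

    SELECT CHANGE_TRACKING_CURRENT_VERSION();  -- store this as the new sync version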

Migrating a Windows Forms Application with SQL Server back end to Silverlight

Presently I have a Windows Forms application that obtains data from a SQL Server database on a separate server in our LAN. Basically, we want to re-use as much as possible of the source code that interacts with the SQL Server database and change the forms portion to a thin-client Silverlight solution. The problem is that our Windows Forms application is a fat-client application, and our company recently added employees working several thousand miles away, so they experience long delays as the application retrieves data from the database server, which is a long way from their client machines.
The ideal solution would allow the developer to display data based on various database tables or views dynamically at runtime, say based on which TreeView item a user clicks, without having to hard-code the database schema at design time. This is the way our Windows Forms application presently works.
One aspect of Silverlight I am wrestling with right now is that if you want to access data from a SQL Server database on the web server side, you have to use web services or WCF RIA Services, which involves creating a design-time EDMX file or generating LINQ to SQL classes. The problem is that our database schema changes quite frequently, which means I would have to keep manually re-updating the web services along with the EDMX and/or LINQ to SQL classes. What I would really like to do is just connect to the SQL Server database using ADO.NET to populate the various Silverlight datagrids, without having to deal with web services. Please note that I am pretty new to Silverlight, so perhaps I am missing something obvious.
Here is one of the many links I have checked while working on this solution; however, it just migrates a Windows Forms application that already has a web service to a Silverlight application with a similar web service, so it doesn't seem to apply to my situation:
http://www.silverlight.net/learn/advanced-techniques/moving-from-windows-forms/migrating-a-windows-forms-application-to-silverlight
Here is another website that I have been looking at closely. However, the database I am working with is so huge and has such a large schema that whenever I attempt to open or work with the Data->Show Data Sources or Data->Add Data Source window in Visual Studio, the CPU runs at full throttle for about an hour before the values are displayed. Of course, this makes Visual Studio almost unusable if this hour-long wait happens every time I try to make a change in the Silverlight XAML designer:
http://msdn.microsoft.com/en-us/gg315272
Also, the website example above is not an acceptable solution because we want the web server and the database server to be two separate machines, so we would not be able to put the database into the App_Data folder of the Silverlight solution.
If anyone has any suggestions or guidance in terms of migrating this application, they would be most appreciated. TIA.
Roger
Basically, I found out that since I'm used to writing desktop applications that communicate directly with a database (such as SQL Server), I was surprised to discover that there is no object in Silverlight that enables you to do this (no SqlConnection, OdbcConnection, etc.). This is because Silverlight is a client platform designed to run within a browser anywhere in the world, so it does not make sense for it to access databases directly, because databases are generally hidden behind a firewall. The purpose of a service is to provide an interface for exposing data publicly from the server, acting as a conduit between the data in the database and external applications. (Source: "Pro Business Applications with Silverlight 5" by Chris Anderson.)
Please note that I also made a mistake when I created a DomainService and assigned to it all the tables, queries, and stored procedures in the entire database, when in fact one needs to create a separate domain service for each individual table or query. This explains the earlier problem where I had to wait about an hour for the Data Sources window to populate.
