I am working on a project where I need to display performance metrics for Microsoft SQL Server, for example memory consumed/free, storage free space, etc. While researching this, one thing that came up was DogStatsD.
Datadog provides a library for .NET projects to send custom metrics, but that was not the solution for me, because those metrics are displayed on the Datadog website. I have to display all the data received from SQL Server myself (in graphs or whatever is suitable), and there will be multiple servers/instances.
Is there a way to do that, i.e. have our web app connect to multiple databases and receive/display the information?
I cannot use off-the-shelf tools for these insights.
You can easily get all the data you need by querying DMVs (dynamic management views) and other resources inside SQL Server. A good start is here.
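For example, memory and volume free space can be read like this (a sketch; the login needs VIEW SERVER STATE, and sys.dm_os_volume_stats requires SQL Server 2008 R2 SP1 or later):

    -- Memory: total vs. available physical memory on the host
    SELECT total_physical_memory_kb / 1024     AS total_mb,
           available_physical_memory_kb / 1024 AS available_mb,
           system_memory_state_desc
    FROM   sys.dm_os_sys_memory;

    -- Storage: free space on every volume that hosts a database file
    SELECT DISTINCT
           vs.volume_mount_point,
           vs.total_bytes / 1048576     AS total_mb,
           vs.available_bytes / 1048576 AS free_mb
    FROM   sys.master_files AS mf
    CROSS APPLY sys.dm_os_volume_stats(mf.database_id, mf.file_id) AS vs;

Your web app can run these against each server/instance on a schedule and chart the results itself, with no third-party service involved.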
I have data in Microsoft's Common Data Service (from Microsoft Dynamics for Talent). I can't use the Data Management Framework as the data in question is in entities that are not available through the DMF.
How do I replicate the data in the CDS back to a SQL database?
What I've tried so far is to create a logic app (and flow, neither worked) that grabs data using the CDS connector and pushes it into an SQL database, but there are several problems with this:
It's a maintenance burden
It's extremely tedious and error-prone to add new tables, etc. I have written a somewhat horrendous stored proc that tries to create a table based on the JSON-ified data given to it from the flow, but this is very error prone.
It doesn't work at all, since the size of the data exceeds some kind of limitation in the SQL connector and I get spurious errors.
Rather than trying to push through with these issues, I'd rather ask whether there's a better way to achieve this. With the Data Management Framework in Dynamics it was simply a matter of scheduling these sync jobs, which worked pretty well. Is there something similar with CDS?
I've also tried looking at the Data Integration projects in Powerapps, but these only seem to allow me to get data into Powerapps/CDS, not back out...
Common Data Service for Apps provides access to the data using the user interfaces or API, there is no direct access to the underlying database. This architecture has certain limitations when it comes to processing large volumes of data, for example for the purposes of data warehousing, reporting, or using Azure machine learning and analytics tools. Replicating CDS data using Extract, Transform, Load (ETL) tools is possible but inherently complex to maintain.
Data Export Service is a service made available on Microsoft AppSource that adds the ability to replicate Dynamics 365 for Customer Engagement apps data to an Azure SQL Database store in a customer-owned Azure subscription.
Note: The Data Export Service requires a Dynamics 365 for Customer Engagement apps subscription; it is not available on Common Data Service for Apps plans.
We are planning to implement a project in the Azure cloud where the data store will be Azure Data Lake for now; in the future HDP will be implemented, with ADLS as the extended datanode. From ADLS we want to expose data for dashboard creation using Tableau. The initial plan was to use Hive, with Tableau connecting to the data through Hive. But here a performance concern arises:
There will be multiple users (100+) who will have access to the data through Tableau.
We will also have to expose the data to a different portal via API calls.
This means multiple connections will be established at the same time, all hitting Hive. My questions are:
Can Hive serve this purpose with minimal response time?
How can I measure the performance?
I don't want my users to run a query in Tableau and then sit waiting a long time to see the dashboard.
Would you please share your experience with this design issue? Should we use Hive, or some other tool that performs better with Tableau and HDFS storage? Someone suggested using Azure SQL Database and connecting Tableau to it, but that is again the old-fashioned way, and also a matter of cost, since the price is tied to the execution of each query.
If you have experience with a better solution, please share; it would be greatly appreciated.
Thanks in advance.
Hive LLAP could work, if you can get it installed.
Otherwise, at my work, we've had good experience with PrestoDB and Tableau on S3 data.
Some teams use Spark SQL, and you can set up a Spark Thrift Server, which should be compatible with the Hive JDBC/ODBC drivers.
Ok, let me explain the environment we are facing here:
We have an ASP.NET MVC 4 app that uses a SQL Server database.
This app isolates data in "projects", so when any user connects, they can only work on the data of one of these projects.
Sometimes... a group of users have to travel to remote regions for some days to retrieve data for a single project, and quite often they won't be able to have an internet connection (even mobile or satellite solutions are often out of reach).
While the displaced team works on a project, people at the office can still work on the rest of the projects (but not on the one that is abroad).
So... we are pondering the possibility of using a laptop to act as a "mobile server", where users can download the data from a specific project before travelling. While abroad, they can work against this "mobile server", update any data on their project and, when they come back, they could upload their updated data to the main server.
Our idea is to create stored procedures on both servers (main and mobile) that execute queries to sync a project's data between them, passing the project identifier as a parameter, and probably using linked servers to allow main and mobile to see each other during update operations.
Our questions here are:
Is this a good approach?
Is there any other better approach that we're not seeing?
Are there any risks we should pay attention to in this or other approaches?
I've never used bidirectional transactional replication, so if that works for you, problem solved. I do have quite a bit of experience with data migration, including merging large data sets into software-driven systems, and from that experience, replication has hurt us more than it has helped us (from a migration/merge point of view).
The biggest challenge, in my opinion, is going to be conflict resolution. I know you say that all of the data is in project-specific databases, but is there no shared data at all? What about multiple remote users updating the same data? In that case you're going to need a little more than just replication.
Instead of maintaining two databases at all times (one for the mobile server, one as the regular in-house DB), why not a system where a job is run on your main system indicating that a project needs to be prepared for "offline mode"? The job could be stored procedures, SSIS packages, or straight T-SQL. Whatever the technology, this job would copy all of the requested project data to a new database on the remote server/laptop and somehow mark that data in the main database as read-only, to prevent users in the office from updating it.
Once the data is in offline mode on the remote server, the users can update and use the data as much as they want from that remote server. Then when the users get an internet connection or they are back in the office they can kick off another job that syncs the data to the main server, removes offline mode, and deletes/archives the remote database. Almost like a temporary project database.
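A minimal T-SQL sketch of the two jobs (all object names, the MOBILE linked server, and the IsOffline flag are hypothetical placeholders for whatever your schema actually uses):

    -- Job 1: put a project into offline mode and ship it to the laptop
    CREATE PROCEDURE dbo.usp_TakeProjectOffline
        @ProjectId int
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Lock the project so office users can't edit it while it's in the field
        UPDATE dbo.Projects SET IsOffline = 1 WHERE ProjectId = @ProjectId;

        -- Copy the project's rows out over the linked server
        INSERT INTO MOBILE.FieldWork.dbo.Samples (SampleId, ProjectId, Reading)
        SELECT SampleId, ProjectId, Reading
        FROM   dbo.Samples
        WHERE  ProjectId = @ProjectId;
    END;
    GO

    -- Job 2: bring the project back and release the lock
    CREATE PROCEDURE dbo.usp_SyncProjectBack
        @ProjectId int
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Overwrite the main copy with whatever the field team changed
        MERGE dbo.Samples AS tgt
        USING (SELECT SampleId, ProjectId, Reading
               FROM   MOBILE.FieldWork.dbo.Samples
               WHERE  ProjectId = @ProjectId) AS src
           ON tgt.SampleId = src.SampleId
        WHEN MATCHED THEN
            UPDATE SET tgt.Reading = src.Reading
        WHEN NOT MATCHED THEN
            INSERT (SampleId, ProjectId, Reading)
            VALUES (src.SampleId, src.ProjectId, src.Reading);

        UPDATE dbo.Projects SET IsOffline = 0 WHERE ProjectId = @ProjectId;
    END;

In real life you would wrap both in transactions and add conflict handling, but the shape of the solution is that simple.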
Seriously, it sounds like a fun project.
Technologies to look at:
SSIS (SQL Server Integration Services) - In my experience, this is extremely fast at moving data and lets you add logic for conflict resolution, error handling, etc. It's free (with certain SQL Server editions) and the community is huge, so getting support should be easy. SSIS is not as dynamic as some of the specialized solutions out there.
A data migration suite like Pervasive's Data Integrator - I loved this, but it's expensive. You could write an entire solution in this product to handle the processing of your data bidirectionally, and like SSIS it allows for complex programming logic.
T-SQL - With a linked server you could just write straight queries (using stored procedures if you wanted). The problem here is security on the linked server; we don't use them because of this issue. See: Linked Servers: Good or Bad?
Start using some of Microsoft's built-in change detection technologies right off the bat; they are harder to implement once you're already using the system. Change Data Capture (CDC) will give you a full history of the records updated, while Change Tracking will give you a lightweight summary of your changes. Using either technology will make syncing the data many times easier (see the sketch after the links below).
Change Tracking: http://msdn.microsoft.com/en-us/library/bb933874.aspx
Change Data Capture: http://msdn.microsoft.com/en-us/library/cc645937.aspx
SSIS: http://msdn.microsoft.com/en-us/library/ms169917.aspx
SQL Server Agent Jobs: http://msdn.microsoft.com/en-us/library/ms189237.aspx
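For instance, turning on Change Tracking is just two statements (a sketch; the database and table names are placeholders for your own):

    -- Enable Change Tracking at the database level...
    ALTER DATABASE FieldWork
    SET CHANGE_TRACKING = ON
        (CHANGE_RETENTION = 14 DAYS, AUTO_CLEANUP = ON);

    -- ...then per table (the table must have a primary key)
    ALTER TABLE dbo.Samples
    ENABLE CHANGE_TRACKING
        WITH (TRACK_COLUMNS_UPDATED = ON);

    -- At sync time, ask only for rows changed since the last saved version
    DECLARE @last_sync bigint = 0;  -- persist this value between syncs
    SELECT ct.SampleId, ct.SYS_CHANGE_OPERATION
    FROM   CHANGETABLE(CHANGES dbo.Samples, @last_sync) AS ct;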
The project I'm on has a single server running both the SQL Server 2008 R2 database and the SSRS Reporting Services. Can I utilize SSRS to host a simple intranet site that would allow users to modify reference table data via a web app that I created?
I know SSRS is no longer tied directly to IIS. Is it possible to use SSRS's hosting facilities or will I be forced to install IIS?
I have read articles on how to utilize report parameters to do simple CRUD operations (http://www.sqlservergeeks.com/articles/sql-server-bi/26/using-sql-server-reporting-services-to-manage-data), but I'd prefer another solution, because at some point the data entry piece will be used to do more than manage lookup table data.
Any suggestions? Thanks!
Can I utilize SSRS to host a simple intranet site that would allow users to modify reference table data via a web app that I created?
Yes, you could do that. From the different elements in an SSRS report you can link to web pages, so the report could link to your web app, and the web app could link back to reports.
Also, one report can link to another report, so you could build an intranet with a dashboard page that links off to dozens of other reports.
I know SSRS is no longer tied directly to IIS. Is it possible to use SSRS's hosting facilities or will I be forced to install IIS?
Just to run reports you do not need an install of IIS; since SQL Server 2008, Reporting Services hosts its own web service. But in order to run other web applications you will need a web server, and IIS will do the job.
As far as the data entry goes, you are probably best off using your web application rather than trying to implement data entry in the report. Technically you can do data entry using parameters on the reports, but it is a very ugly solution.
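To give a flavor of why it's ugly: the usual hack is to point a report dataset at a stored procedure that performs the write as a side effect of rendering, something like this (purely hypothetical names):

    -- Rendering the report with @NewName filled in performs the UPDATE.
    CREATE PROCEDURE dbo.usp_UpdateLookup
        @LookupId int,
        @NewName  nvarchar(100) = NULL
    AS
    BEGIN
        SET NOCOUNT ON;

        IF @NewName IS NOT NULL
            UPDATE dbo.LookupTable
            SET    Name = @NewName
            WHERE  LookupId = @LookupId;

        -- Return the current rows so the report has something to display
        SELECT LookupId, Name FROM dbo.LookupTable;
    END;

Every render risks re-running the write, there is no real validation, and errors surface as report failures, which is why a proper web form is the better path.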
Still, using SharePoint with SSRS might give you what you want. Another option would be to use OneNote for some of the intranet pages and put the OneNote files on a shared network location. You will still need your web app to enter data.
I hope this answers your question.
So, I'm inexperienced in hosting DBs; I've always had the luxury of someone else setting up the DB.
I was going to help a friend out with getting a web page set up. I've got experience in ASP.NET MVC, so I'm going with that. They want a search page that queries a DB and displays the results. The question I have is about getting the DB set up and hosted. They currently just have the Access DB on a local computer, and there is basically only one table that would need to be queried for the search.
What is the best approach to making this table/DB accessible? They would like to keep the main copy of the DB on the local machine, and copying the entire DB over to the hosted site would be time consuming; could just the lone table needed be copied to the host? Should I try to convince them to make changes on the hosted DB and just make copies of that for their local machines? Any suggestions are welcome; again, I'm a total noob when it comes to hosting databases.
Thanks
Added: They are using MS Access 2000, and the page will have access restrictions. Thanks for the responses.
How about SQL Server Express? I think you can do a remote connect from Access and just push the data over from Access.
I wouldn't use Access on a web server in any case.
I would strongly recommend against Access for web work; it's just not designed for it, and given that SQL Server Express is free, there is no reason not to give it a go.
You can migrate the data over using the SQL Server Upsizing Wizard; here is a link with help on using that feature:
http://support.microsoft.com/kb/237980
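If the wizard isn't an option, a one-off T-SQL import from the .mdb file also works. A sketch, assuming the ACE OLE DB provider is installed on the SQL Server machine and you are allowed to enable ad hoc distributed queries (the file path and table name are placeholders):

    -- Allow ad hoc OPENROWSET queries (off by default)
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'ad hoc distributed queries', 1;
    RECONFIGURE;

    -- Pull the lone search table out of the Access file into SQL Server
    SELECT *
    INTO   dbo.SearchTable
    FROM   OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                      'C:\Data\clientdb.mdb';'Admin';'',
                      'SELECT * FROM SearchTable');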
It depends on what you mean by web work. Access 2010 can build scalable, browser-neutral web applications that can serve thousands of users. In fact, you can even park the web sites on Microsoft's new cloud hosting options and scale out to as many users as you need.
Here is a video of an application I wrote in Access 2010. Note how at the halfway point I run the same application, including the Access forms, in a standard web browser. This application was built 100% inside of the Access client, and the end result needs no ActiveX or Silverlight to run.
http://www.youtube.com/watch?v=AU4mH0jPntI
So, the above shows that Access can now be used to build scalable web sites (you can ignore the confusing answers from the other two posters here; they are not quite up to speed on how Access works).
However, for your case, I would continue to keep the Access database on the desktop. You can simply link to tables that are hosted on the web server; those tables can live in MySQL or SQL Server. As long as the web host supports external ODBC connections (many do), the desktop application can work against the live data on the web server. If a constant connection to the live data is an issue, you could set something up to push new records (or the whole table) up on some interval, or the reverse, pull new records down from the web site on an interval (depending on which way you need to go). Connecting to MySQL or SQL Server is quite easy as long as the web hosting and site permit external ODBC connections. I do this all the time, and it works quite well.
As mentioned, new for Access 2010 is the ability to build web sites, but that does require Access Services running on SharePoint.
You don't need to upgrade to Access 2010. One option is to use the EQL Data plugin to sync the database up to the server. Then you can write an ASP.NET, PHP, or whatever application that queries the table using the EQL API and prints the results however you want. This kb article describes how to use the EQL API from a web app.
The nice thing is that the database is still totally usable (and at full speed) even when you're not online, and then you can sync the new data up to the web occasionally. It only uploads the changes, not the entire database every time, so it's fast.
Disclaimer: I work at EQL Data so I'm a bit biased. But this kind of use case is the whole reason the company exists.