Exposing SQL Server data to clients

We have an internal SQL Server 2008 R2 database that we'd like to expose partially (only some tables) to our clients over the Internet, so they can feed their Excel reports. What are our best options? How should we provide security (i.e. should we create a separate staging DB server in a DMZ for this)? The quantity of data to transfer is very small (< 100 records).

Here is one simple way to start if they need live, real-time access:
1) Create a custom SQL user account for web access, locked down with read-only access to the relevant tables or stored procedures.
2) Create a REST web service that connects to the database using the SQL account above. Expose a method for each set of data that can be retrieved.
3) Make sure the web service runs over SSL (HTTPS) and requires username/password authentication, for example via BASIC auth with a custom account per client.
Then when the clients need to retrieve data, they can access a specific URL and receive the data in CSV format or whatever is convenient for their reports. REST web services are also easy to call via the XMLHTTP object if you have technically savvy clients who can write VBA macros.
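For illustration, a minimal sketch of such a service in C# (an ASP.NET Core minimal API using Microsoft.Data.SqlClient). The server name, the reporting_reader login, and the /sales endpoint are invented for the example, and the BASIC auth check is only indicated by a comment:

using Microsoft.Data.SqlClient;
using System.Text;

var app = WebApplication.CreateBuilder(args).Build();

// SQL login assumed to have SELECT rights on the exposed tables only.
const string connStr = "Server=internal-sql;Database=Reports;User Id=reporting_reader;Password=...;Encrypt=True;";

app.MapGet("/sales", async () =>
{
    // Validate the per-client BASIC auth credentials here before querying.
    var csv = new StringBuilder("OrderId,Amount\r\n");
    using var conn = new SqlConnection(connStr);
    await conn.OpenAsync();
    using var cmd = new SqlCommand("SELECT OrderId, Amount FROM dbo.SalesSummary", conn);
    using var reader = await cmd.ExecuteReaderAsync();
    while (await reader.ReadAsync())
        csv.Append(reader.GetInt32(0)).Append(',').Append(reader.GetDecimal(1)).Append("\r\n");
    return Results.Text(csv.ToString(), "text/csv");
});

app.Run(); // host behind HTTPS only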
If the data is not needed in real time - for instance, if once a day is often enough - you could probably just generate .csv output files and host them somewhere the clients can download them manually through their web browser, for instance an FTP site or a simple IIS website with BASIC authentication.
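A rough sketch of that export as a small C# console program (table, login, and output path are made up; schedule it with Windows Task Scheduler or SQL Agent):

using System.IO;
using System.Text;
using Microsoft.Data.SqlClient;

var csv = new StringBuilder("OrderId,Amount\r\n");
using var conn = new SqlConnection("Server=internal-sql;Database=Reports;User Id=reporting_reader;Password=...;Encrypt=True;");
conn.Open();
using var cmd = new SqlCommand("SELECT OrderId, Amount FROM dbo.SalesSummary", conn);
using var reader = cmd.ExecuteReader();
while (reader.Read())
    csv.Append(reader.GetInt32(0)).Append(',').Append(reader.GetDecimal(1)).Append("\r\n");

// Drop the file where the IIS/FTP site with BASIC auth can serve it.
File.WriteAllText(@"C:\inetpub\reports\sales.csv", csv.ToString());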

If the data is not needed in real time, another alternative is to use SSIS or SSRS to export an Excel file and email it to your clients.

Related

Why does Excel require less information for querying a SQL Server than Power Apps?

In Excel 365 desktop, I can:
Open a blank workbook
Click on the 'Data' ribbon
Click 'Get Data'
Click 'From Database'
Click 'From SQL Server Database'
Fill in the 'Server' field
Click OK
and that's all that I need to query my SQL server. Conversely, in the web version of Power Apps, it appears that I absolutely must set up something called a "gateway" (or sometimes, an "on-premises data gateway"). This appears to be non-trivial and looks like it may even cost money.
Is there any technical reason for this restriction? I find it very surprising that Excel appears to be more powerful than Power Apps. Am I profoundly ignorant in some way?
To answer your last question: Yes, but that can be changed.
PowerApps is a cloud service hosted on Microsoft's servers, outside your network, whereas Excel runs on a machine inside your network that can reach the SQL Server directly. PowerApps can query all kinds of data, but you need so-called "connectors" to do that.
If the data source is on your company's internal network, then you need a way to connect to that internal data securely and safely. You wouldn't want to expose your company's SQL Server data for all the world to see.
To create that secure connection from a cloud-hosted service like PowerApps (or Power BI, or Power Automate), you install the data gateway on a machine in your internal network. That gateway is then the, ehm..., gateway from the cloud-hosted system into your company's SQL Server or other on-premises data.
If your SQL Server database is hosted in the cloud, e.g. in Azure, then you would not need the gateway and could use a different connector in PowerApps that targets Azure-hosted SQL.
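To make the difference concrete, compare the two connection strings below (all names invented for the illustration): the first points at a host only machines inside the network can resolve and reach, while the second points at a public Azure SQL endpoint that a cloud service can reach without a gateway.

// Works from Excel or code running inside the corporate network,
// but a cloud service like PowerApps cannot reach this host directly:
var onPrem = "Server=INTERNAL-SQL01;Database=Sales;Integrated Security=True;";

// Publicly resolvable Azure SQL endpoint; cloud connectors can use it
// without an on-premises data gateway:
var azure = "Server=mycompany.database.windows.net;Database=Sales;User Id=app_user;Password=...;Encrypt=True;";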

Calling API from Azure SQL Database (as opposed to SQL Server)

So I have an Azure SQL Database instance that I need to run a nightly data import on, and I was going to schedule a stored procedure to make a basic GET request against an API endpoint, but it seems the OLE Automation objects aren't available in the Azure version of SQL Server. Is there another way to make an API call from Azure SQL Database, or do I need to put something in place outside the database to accomplish this?
There are several options. I do not know whether a PowerShell job, as suggested in the first comment on your question, can execute HTTP requests, but I do know of at least a couple of options:
Azure Data Factory allows you to create scheduled pipelines to copy/transform data from a variety of sources (like HTTP endpoints) to a variety of destinations (like Azure SQL databases). This involves little or no scripting.
Azure Logic Apps allows you to do the same:
With Azure Logic Apps, you can integrate (cloud) data into (on-premises) data storage. For instance, a logic app can store HTTP request data in a SQL Server database.
Logic apps can be triggered on a schedule as well and involve little or no scripting.
You could also write an Azure Function that is executed on a schedule, calls the HTTP endpoint, and writes the result to the database; a minimal sketch follows at the end of this answer. Multiple languages are supported for writing functions, for example C# and PowerShell.
All of these options also allow you to force an execution outside the schedule.
In my opinion, Azure Data Factory (no coding) or an Azure Function (code only) are the best options given the need to parse a lot of JSON data. But do bear in mind that Azure Functions on a Consumption Plan have a maximum allowed execution time of 10 minutes per invocation.
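Here is the promised minimal sketch of that Azure Function in C# (in-process model; the endpoint URL, staging table, and connection-string setting name are placeholders):

// Timer-triggered function: runs nightly at 02:00, calls an HTTP endpoint,
// and writes the raw result into an Azure SQL staging table.
using System;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Data.SqlClient;

public static class NightlyImport
{
    private static readonly HttpClient http = new HttpClient();

    [FunctionName("NightlyImport")]
    public static async Task Run([TimerTrigger("0 0 2 * * *")] TimerInfo timer)
    {
        // Hypothetical API endpoint; replace with the real one.
        string json = await http.GetStringAsync("https://api.example.com/data");

        using var conn = new SqlConnection(
            Environment.GetEnvironmentVariable("SqlConnectionString"));
        await conn.OpenAsync();

        // Store the raw payload; parsing/shaping could happen here or in T-SQL.
        using var cmd = new SqlCommand(
            "INSERT INTO dbo.ImportStaging (Payload, LoadedAt) VALUES (@p, SYSUTCDATETIME())", conn);
        cmd.Parameters.AddWithValue("@p", json);
        await cmd.ExecuteNonQueryAsync();
    }
}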

How do I expose my enterprise SQL Server to web services/mobile apps?

My enterprise SQL Server deployment is currently local to our extranet. Now I would like to expose some of this data, not only to be consumed by web services and mobile apps, but also to allow those apps to create new records in the DB.
Also, one of my main hesitations is security. From a conceptual standpoint, what is involved in exposing data via a web service and ensuring that both the data and connection remain secure?
Are REST/OAuth or SOAP the only feasible options?
Speaking from the SQL side: We control the flow of data through stored procedures.
The web service/mobile app uses a SQL login to access data in the database. The SQL login ONLY has permission to execute stored procedures. Those stored procedures use parameters to select/update/insert/delete only the records specified.
This prevents the app/web service from seeing any of the base tables or schema.
However, I can't speak to the security of the connection; that is an issue our developers and architects deal with.
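For illustration, the calling side of the pattern described above looks roughly like this in C#/ADO.NET (procedure, parameter, and login names are invented; the login would hold EXECUTE permission only):

using System;
using System.Data;
using Microsoft.Data.SqlClient;

using var conn = new SqlConnection("Server=sql01;Database=Orders;User Id=app_login;Password=...;Encrypt=True;");
conn.Open();

// CommandType.StoredProcedure + parameters: the app never sends ad-hoc SQL
// and never touches the base tables directly.
using var cmd = new SqlCommand("dbo.usp_GetOrdersForCustomer", conn);
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add("@CustomerId", SqlDbType.Int).Value = 42;

using var reader = cmd.ExecuteReader();
while (reader.Read())
    Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");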

WPF with arbitrary, unknown databases - Client/Server or Desktop app?

My company is planning to turn an older Winforms application into a WPF/Silverlight Client/Server app.
The idea of having a small server app is to keep a list of the accessible databases, combined with the user types that may access each database, instead of having to manage databases in each client's admin control. Additionally, it would be great if the SQL requests were handled by the server, which would then return the results.
The app is supposed to work on an arbitrary set of databases which will be "registered" with the server, and users get a list of databases according to their authentication rights. They can then do practically anything one can imagine on those databases. The system should be able to handle up to 2 million rows.
The databases are very different; there can be many of them, and they can be MS Access, Oracle, SQL Server, etc., so there is no way for me to specify them all beforehand. On top of that, communication with a SQLite cache is needed.
I already have everything I need for the SQL queries from the Winforms app.
I was thinking:
1) A simple WCF server specifying in a config file the available databases per user type.
2) Interface that specifies all necessary SQL queries that can be made to the server.
3) Client...
The idea is:
a client-server application, where the client uses WCF services to execute SQL queries (INSERT, UPDATE, SELECT, etc.) on tables by invoking service methods.
The service should ideally be consumable by both the WPF and the Silverlight app.
Is that the way to go? Which existing technologies might I want to make use of regarding formats, communication, services, etc.?
If that is problematic, I would consider going back to a desktop app, but then how to ease the user type/database access problem for each client?
I would stick with ADO.NET and start with the DbProviderFactory class. This lets you determine the proper database access based on information supplied by the provider, using the Factory design pattern. So instead of having to create specialized objects for each database type and database, you can abstract that logic with the DbProviderFactory.
Here's a link that shows some examples: http://msdn.microsoft.com/en-us/library/wda6c36e(v=VS.100).aspx
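A short sketch of that pattern (the provider invariant name would come from your per-database configuration; the connection string and query are placeholders):

using System;
using System.Data.Common;

// Picked at runtime from configuration, e.g. "System.Data.SqlClient",
// "System.Data.OleDb" (Access), or "Oracle.ManagedDataAccess.Client":
string providerName = "System.Data.SqlClient";

DbProviderFactory factory = DbProviderFactories.GetFactory(providerName);

// From here on the code is provider-agnostic: only DbConnection/DbCommand.
using DbConnection conn = factory.CreateConnection();
conn.ConnectionString = "Server=sql01;Database=Sales;Integrated Security=True;";
conn.Open();

using DbCommand cmd = conn.CreateCommand();
cmd.CommandText = "SELECT COUNT(*) FROM Customers";
Console.WriteLine(cmd.ExecuteScalar());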

How should I access a SQL Server DB?

I have been reading that direct access to a SQL Server database over the Internet is insecure, so I am wondering what intermediary I can and should use between the client and the server. What are the best practices in terms of security and performance?
For direct access, you would have to use SSL on your connections, but generally, I wouldn't expose a database server to the internet. I would design my way around it, for example by creating web services in front of the db server.
Use an API (Application Programming Interface). This is the front door to the data you wish to expose; it means you will need to define what you expose and how.
For example, Stack Overflow does not allow anyone to access their database directly. BUT they do allow people to access certain parts of it via their Stack Apps API. What parts? The parts they have chosen to expose through their own API: web URLs that spit back data based on what you request. The results are in JSON format only (at the time of posting this answer).
Here is a sample API method that exposes some of their database. (EDIT: hmm, none of the API links work ... the link I was trying to show was http://api.stackoverflow.com/0.8/help/method?method=answers/{id}/ )
Now, if you don't want to think about exactly which data to expose (e.g. which DB tables, if you're using a relational database like Microsoft SQL Server or Oracle) but want to expose the ENTIRE database via the web, then maybe you could look at putting OData in front of your DB.
Another edit: I was assuming you meant allowing the public to access your DB, not private access. Otherwise, this should be on ServerFault.
I'd written this lovely reply pertaining to web access to a SQL server, and then you go and update it stating you have a desktop app in place.
With that, as was said above, the best idea is to not expose a database server to the internet. If you absolutely have to, then there are a few possible solutions.
Implement some sort of VPN connection into the network. I had one instance where we had a large number of sites all connecting to a database server (and company network) via VPN. This kept the database server off the internet while still allowing half-decent access times to the information. This was for a retail environment without a great deal of data throughput.
Properly set up your firewalls and permissions on the server. This should be done anyway. You could put the server behind a firewall, allowing access only on port 1433 and only from a specific IP range (which I assume would be possible). This way, you at least reduce the number of locations a possible attack could come from.
This could all be employed in addition to the APIs and services mentioned above.
You can use a config.php file. Put the DB name, DB user, DB password, and host in config.php, then include it in your page with:
<?php require("config.php"); ?>
You could just have a page in your website's language (e.g. PHP, JSP, ASP, etc.) that queries the DB and returns the data you need in whatever format you need. For example, if you're using jQuery, from the client side:
$.ajax({
    url: 'ajax/test.php',          // server-side page that queries the DB
    success: function(data) {
        $('.result').html(data);   // inject the returned data into the page
        alert('Load was performed.');
    }
});
Here, test.php would connect to the DB, run the query, and its output would be returned in the data variable.
