Currently I have a web app that imports Excel files, parses them (using OpenXML), and then imports the data into SQL Server. The user does not need to predefine an Excel template, as long as the file's columns are in the same order as the columns of the database table they want to insert into. While parsing the file, I check whether each value is compatible with the corresponding column type; if a value is not, the system generates an error log file with entries like: "Excel Address -> A1, Value -> XXXXX (string), Expected -> (Date yyyy/mm/dd)".
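The check itself is essentially just a type-conversion attempt per cell, something like this (a simplified sketch, not the actual code; all names are made up):

    using System;
    using System.Collections.Generic;
    using System.Globalization;

    static bool TryValidateCell(string cellAddress, string rawValue, Type expectedType,
                                List<string> errorLog)
    {
        try
        {
            // Compatible if the raw text converts cleanly to the column's CLR type.
            Convert.ChangeType(rawValue, expectedType, CultureInfo.InvariantCulture);
            return true;
        }
        catch (Exception ex) when (ex is FormatException || ex is InvalidCastException)
        {
            // Otherwise emit a log entry in the format described above.
            errorLog.Add($"Excel Address -> {cellAddress}, Value -> {rawValue} (string), " +
                         $"Expected -> ({expectedType.Name})");
            return false;
        }
    }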
I want to migrate this service to the cloud, specifically Azure, since I use Visual Studio for development and the integrations make the job easier. However, I'm a little lost among the many services, and I'm not sure whether this kind of work is even possible there.
I would like to store the files in Azure Blob Storage and then send them to Data Lake(?) to import the data into a SQL Server database. Is this possible? Are these the services I should use?
The simplest (and probably cheapest) solution I can think of, given these constraints, would be to upload the file to an Azure Function. Your Azure Function can then perform the parsing and insert the records into the SQL database. It can also return an HTTP error code and message if the parsing cannot complete due to data errors.
The downside of Azure Functions is their runtime constraints. If you are pushing so much data that you run into those constraints, you could spin up an Azure Web App to do the same thing.
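To make that concrete, here is a rough sketch of a blob-triggered function (C#, in-process model). The container name, target table, and the ParseAndValidate placeholder are all assumptions; plug in your existing OpenXML code:

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.IO;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class ImportExcel
    {
        // Fires whenever a file lands in the "excel-uploads" container (name assumed).
        [FunctionName("ImportExcel")]
        public static void Run(
            [BlobTrigger("excel-uploads/{name}")] Stream excelFile,
            string name,
            ILogger log)
        {
            var connStr = Environment.GetEnvironmentVariable("SqlConnectionString");
            using (var conn = new SqlConnection(connStr))
            {
                conn.Open();
                foreach (var row in ParseAndValidate(excelFile, log))
                {
                    // Parameterized insert; table and column names depend on your schema.
                    using (var cmd = new SqlCommand(
                        "INSERT INTO dbo.TargetTable (Col1, Col2) VALUES (@c1, @c2)", conn))
                    {
                        cmd.Parameters.AddWithValue("@c1", row.Col1);
                        cmd.Parameters.AddWithValue("@c2", row.Col2);
                        cmd.ExecuteNonQuery();
                    }
                }
            }
        }

        // Placeholder: reuse your existing OpenXML parsing/validation code here.
        private static IEnumerable<(string Col1, string Col2)> ParseAndValidate(
            Stream excelFile, ILogger log)
        {
            yield break;
        }
    }

Your existing validation and error-log generation can run inside ParseAndValidate, writing the log file back to Blob Storage when a value doesn't match the expected column type.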
Can't you just create your tables in Azure SQL and then change your current web application to connect to Azure SQL instead of your on-premise SQL Server?
Azure SQL is basically SQL Server in the cloud, with a few differences (and nowadays the on-premises versions also contain most of the Azure features).
If you are lucky, you would just need to change your connection string :)
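For example (server name and credentials are made up), the change could be as small as:

    On-premises: Server=MYSERVER;Database=MyDb;Integrated Security=True;
    Azure SQL:   Server=tcp:myserver.database.windows.net,1433;Database=MyDb;User ID=myuser;Password=...;Encrypt=True;

Note that Azure SQL requires encrypted connections and, out of the box, SQL authentication rather than Windows authentication.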
In Excel 365 desktop, I can:
Open a blank workbook
Click on the 'Data' ribbon
Click 'Get Data'
Click 'From Database'
Click 'From SQL Server Database'
Fill in the 'Server' field
Click OK
and that's all that I need to query my SQL Server. By contrast, in the web version of Power Apps, it appears that I absolutely must set up something called a "gateway" (or sometimes an "on-premises data gateway"). This appears to be non-trivial and looks like it may even cost money.
Is there any technical reason for this restriction? I find it very surprising that Excel appears to be more powerful than Power Apps. Am I profoundly ignorant in some way?
To answer your last question: Yes, but that can be changed.
PowerApps is a cloud service. It is hosted on Microsoft servers. You can query all kinds of data, but you need so-called "connectors" to do that.
If the data source is on your company's internal network, then you need a way to connect to that internal data securely and safely. You wouldn't want to expose your company's SQL Server data for all the world to see.
To create that secure connection from a cloud-hosted service like PowerApps (or Power BI, or Power Automate), you install the data gateway on a machine in your internal network. That gateway is then the, ehm..., gateway from the cloud-hosted system into your company's SQL Server or other on-premises data.
If your SQL Server database is hosted in the cloud, e.g. in Azure, then you would not need the gateway and could use a different connector in PowerApps that targets Azure-hosted SQL.
I'm relatively new to Azure and am having trouble finding what options are out there for connecting to an existing SQL database to push data into it.
The situation is that we have an external client who needs to connect to our Azure SQL database to push data into it on an ongoing basis. We can't give them permission to get into our database, so we're looking at what we can do to allow data in. At this point the best option seems to be to create a web service deployed in Azure that will validate the data and then push it into our database.
The question I have is, are there other options to do this in an easier way? Are there Azure services or processes that can be set up to automatically process a file and pull the data into a database? Any other go-between options when each side has their own database and for security reasons can't just open up access to it?
Azure Data Factory works great for basic ETL. If neither party can grant direct access, you can use an intermediate repository like Blob Storage to drop CSV/XML/JSON files for ingestion, as sketched below. If they'll grant you access to pull, you can set up a linked service that functions more or less like a linked server in MSSQL. As of the latest release, ADF also supports Azure-hosted SSIS packages.
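For the blob-drop approach, the external party only needs a storage connection string (or, better, a SAS token scoped to one container) to write files. A minimal sketch of what their side could run (container and file names are made up):

    using System.IO;
    using Azure.Storage.Blobs;

    class DropFile
    {
        static void Main()
        {
            // Credentials scoped to a single inbound container (assumes it already exists).
            var container = new BlobContainerClient(
                "<storage-connection-string>", "inbound");
            using (var file = File.OpenRead(@"C:\exports\data.csv"))
            {
                // ADF picks the file up from here via a scheduled or event-based pipeline.
                container.UploadBlob("data.csv", file);
            }
        }
    }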
I would do this via SSIS using SQL Server Management Studio (if it's a one-time operation). If you plan to do this repeatedly, you could schedule the SSIS job to execute on a schedule. SSIS does bulk inserts in small batches, so you shouldn't have transaction log issues, and it should be efficient (because of the bulk inserting). Before you do this insert, though, you will probably want to consider your performance tier so you don't get heavily throttled by Azure, with possible timeouts.
I'm new to Azure development, and I'm having trouble finding examples of what I want to do.
I have an XML file in Azure file storage and I want to use a Logic App to get that XML data into a SQL database.
I guess I will need to create a "SQL Database" in Azure before the Logic App can be written (correct?).
Assuming that I have some destination SQL database, are there Logic App connectors/triggers/whatever that I can use to: 1) recognize that a file has been uploaded to Azure, and 2) process that XML to go into a database?
If so, can such connectors/triggers/whatevers be configured/written so that any business rules I have, for massaging the data between the XML and the database, can be specified?
Thanks!
Yes, you are right: you need to create the database first and then write Logic Apps to perform the necessary functionality.
There are a lot of connectors with triggers, like the Blob Storage connector, the SQL connector, etc.
You can perform your processing with the help of "Enterprise Connectors", or you can do custom processing using Azure Functions, which integrate with Logic Apps.
In order to perform CRUD operations on an Azure SQL Database, you can use the SQL Connector. Documentation on the connector can be found here:
Logic App SQL Connector
Adding SQL Connector to a Logic App
I've also written a blog post on how to use the SQL Connector to perform bulk operations using a stored procedure and OPENJSON: Bulk insert into SQL
This might help you in designing your Logic App if you choose to use a stored procedure.
We have an internal SQL Server 2008 R2 database that we'd like to expose (partially, only some tables) to our clients via the Internet, so they can feed their Excel reports. What are our best options? How should we provide security (i.e., should we create another, staging DB server in the DMZ for this)? As far as the quantity of data to transfer goes, it's very small (< 100 records).
Here is one simple way to start if they need live, real-time access:
Create a custom SQL user account for web access, locked down with read-only access to the relevant tables or stored procedures.
Create a REST web service that connects to the database using the SQL Account above. Expose methods for each set of data that can be retrieved.
Make sure the web service runs over SSL (HTTPS) and requires username/password authentication, for example via BASIC auth with a custom hard-coded account per client.
Then when the clients need to retrieve data, they can access a specific URL and receive data in CSV format or whatever is convenient for their reports. Also, REST web services are easily accessed via the XMLHTTP object if you have clients that are technically savvy and can write VBA macros.
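As a sketch of what one such endpoint could look like (ASP.NET Web API; the view, account, and column names are made up, and the BASIC auth/SSL parts are left to hosting configuration):

    using System.Data.SqlClient;
    using System.Text;
    using System.Web.Http;

    public class ReportDataController : ApiController
    {
        // GET /api/reportdata - returns rows as CSV text.
        // A real implementation would also set a text/csv content type.
        [HttpGet]
        public IHttpActionResult Get()
        {
            var csv = new StringBuilder("Id,Name,Amount\r\n");
            // Connection uses the locked-down, read-only SQL account described above.
            using (var conn = new SqlConnection(
                "Server=...;Database=...;User ID=readonly_user;Password=...;"))
            using (var cmd = new SqlCommand(
                "SELECT Id, Name, Amount FROM dbo.ClientReportView", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        csv.AppendLine($"{reader["Id"]},{reader["Name"]},{reader["Amount"]}");
                }
            }
            return Ok(csv.ToString());
        }
    }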
If the data is not needed real-time - for instance, if once a day is often enough, you could probably just generate .csv output files and host them somewhere the clients can download manually through their web browser. For instance, host on an FTP site or simple IIS website with BASIC authentication.
If the data is not needed in real time, the other alternative is to use SSIS or SSRS to export an Excel file and email it to your clients.
This should be simple. I'm trying to import data from Access into SQL Server. I don't have direct access to the SQL Server database - it's on GoDaddy and they only allow web access. So I can't use the Management Studio tools, or other third-party Access upsizing programs that require remote access to the database.
I wrote a query on the Access database and I'm trying to loop through and insert each record into the corresponding SQL Server table. But it keeps erroring out. I'm fairly certain it's because of the HTML and God knows what other weird characters are in one of the Access text fields. I tried using CFQUERYPARAM but that doesn't seem to help either.
Any ideas would be helpful. Thanks.
Try using the GoDaddy SQL backup/restore tool to get a local copy of the database. At that point, use the SQL Server DTS tool to import the data. It's an easy-to-use, drag-and-drop graphical interface.
What error(s) get(s) thrown? What odd characters are you using? Are you referring to HTML markup, or extended (eg UTF-8) characters?
If possible, turn on Robust Error Reporting.
If the problem is the page timing out, you can either increase the timeout (in the ColdFusion Administrator or with the cfsetting tag), or rewrite your script to process a certain number of rows and then forward to itself at the next starting point.
You should be able to execute saved DTS packages in MS SQL Server from the application server's command line. Since this is the case, you can use <cfexecute> to issue a request to DTSRUN.EXE, the command-line DTS package runner. (See example) This is of course assuming you are on a server where the command is available.
It's never advisable to loop through records one at a time when a single set-based SQL statement can do the work.
It's not clear from your question which database interface layer you are using, but with the right interface it is possible to insert data from a source outside the target database. This can be done in the FROM clause of your SQL statement by specifying not just the table name, but also the connect string for the source database. Assuming that your web host has ODBC drivers for Jet data (you're not actually using Access, which is the application development part; you're only using the Jet database engine), the connect string should be sufficient.
EDIT: If you use the Jet database engine to do this, you should be able to specify the source table something like this (where tblSQLServer is a table in your Jet MDB that is linked via ODBC to your SQL Server, and tblLocalData is an illustrative name for the local Jet table that holds the data you want to push):

    INSERT INTO tblSQLServer (ID, OtherField)
    SELECT ID, OtherField
    FROM [c:\MyDBs\Access.mdb].tblLocalData
The key point is that you are leveraging the Jet db engine here to do all the heavy lifting for you.