We are developing an SSIS service to import data from Excel and CSV files into Azure. For uploading the files we have chosen Azure File Storage, and we are running the SSIS packages on a VM. To pick up the files from File Storage, we have mapped it as a network drive on the VM. This works fine when we manually trigger the SSIS jobs. However, it fails when running as a SQL Server Agent job. As far as I understand, mapped drives are per user and do not work for the service account used by SQL Server Agent. Is there a way we can access the file storage in SSIS packages run as SQL Agent jobs?
I found this page, but it covers basic Windows network file sharing. It does not work for us because we also need to use the Shared Access Key for Azure File Storage.
I solved the problem using this solution: add the credential via cmd, not via the GUI.
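For reference, a minimal sketch of what that can look like from a command prompt, with <storageaccount>, <sharename>, and <storage-account-key> as placeholders (not values from the question); since credentials and mapped drives are per user, the commands need to run under the account the packages will execute as:

    rem Store the Azure File Storage credential from the command line
    cmdkey /add:<storageaccount>.file.core.windows.net /user:AZURE\<storageaccount> /pass:<storage-account-key>

    rem Map the share using the stored credential
    net use Z: \\<storageaccount>.file.core.windows.net\<sharename> /persistent:yes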
I am testing the Azure file share with SSIS hosted on an Azure VM. However, I am facing an issue while running the job.
Package execution through Visual Studio is successful
Package execution through the catalog is also successful
Package execution through the job is FAILING with the error: The File Name "\.windows.net<FolderName>\file.csv" specified in the connection was not valid
I tried running the job with a proxy account that has access to the Azure share folder (the same account used to run the package); however, it failed with the same error.
Both the UNC path and a mapped drive were tested.
Any suggestion or advice on this will be highly appreciated.
Thanks in advance.
Try giving read/write permissions (for the Azure share folder) to the SQL Server Database Engine service account NT SERVICE\MSSQL$<Instance Name> and to NT SERVICE\SQLSERVERAGENT (where <Instance Name> should be replaced by the installed instance name):
Configure File System Permissions for Database Engine Access
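If the share (or a local staging folder) honors Windows ACLs, the grants might look roughly like this from an elevated command prompt; the path and instance name are placeholders:

    rem Grant modify rights on the folder to the Database Engine and Agent service accounts
    icacls "Z:\<FolderName>" /grant "NT SERVICE\MSSQL$<InstanceName>":(OI)(CI)M
    icacls "Z:\<FolderName>" /grant "NT SERVICE\SQLSERVERAGENT":(OI)(CI)M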
Also, feel free to read more about SQL Server service accounts in the following documentation:
Configure Windows Service Accounts and Permissions
This year we moved from hosted servers to Azure VMs; we run two production servers (SQL and IIS). A vital component of our business is the bulk transfer of data files. We take customers' data from our SQL Server, write it out to a file (XLS, CSV, XML, PDF, Word, etc.), and then either email these files to customers or, in most cases, push them to their FTP server. We also have a few import procedures where we retrieve data files. All of this is currently done with SSIS packages.
We're examining a move to Azure Data Factory as a replacement for SSIS so that we can possibly move to either Azure SQL (if we can work around the Service Broker limitations) or an Azure SQL Managed Instance.
I've done some preliminary work with ADF but I saw a couple of posts about lack of FTP support. Is it possible to create/deliver files to FTP and retrieve/consume files from FTP using ADF? Also, almost all of these jobs are automated and we use SQL Agent to run the packages. What is the Azure equivalent for scheduling these jobs to run?
There is automation in ADF, but the scheduler is per pipeline. Azure Automation is more powerful and can automate more than one pipeline (Azure Data Factory v2) if needed.
Automation with Azure Data Factory (ADF)
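As a sketch of what a scheduled run could look like outside the per-pipeline triggers (for example from an Azure Automation runbook or any scheduled script), the Azure CLI can start a pipeline run; the resource group, factory, and pipeline names below are placeholders, and the datafactory commands come from a CLI extension:

    rem Add the Data Factory extension for the Azure CLI (one-time step)
    az extension add --name datafactory

    rem Start a run of an existing pipeline; all names are placeholders
    az datafactory pipeline create-run --resource-group MyResourceGroup --factory-name MyDataFactory --name MyPipeline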
You can receive files from FTP into an Azure Data Factory pipeline: Copy data from FTP server by using Azure Data Factory. The idea is that a file received via FTP feeds a particular pipeline activity, and that activity pushes the data to an Azure data store. It might be possible to reverse the flow and send data out.
Azure SQL Database Managed Instance is the most on-premises-like (PaaS) database service, but SQL Server deployed on an Azure VM still has more functionality.
We have a Planned Maintenance System which stores and retrieves documents in a SQL database from an Access front end using ADODB Stream and SQL Server authentication. All works happily with a local database, but with SQL Azure the download fails. The connection opens OK but I get "Write To File failed, run-time error 3004".
The same code works to the same location from a local server, so it's a permissions issue.
I've tried various locations for the file on the C drive, have given the folder all the permissions I can find and turned off the Firewall and Virus scanner.
Despite searching the internet for ages, I can't find what I need to do to allow the file to be streamed from a SQL Azure server that isn't using Active Directory.
Inevitably there was a tiny difference in the code between the local and Azure versions revealed by ProcMon. Fixed.
What is the alternative to xp_cmdshell? It works fine in SQL Server, but it is not supported in SQL Azure, so what is the alternative for SQL Azure?
I assume you need command-line access in order to use BCP, since you are trying to create and upload/download data in a flat or XML file.
SQL Azure does not provide access to a command prompt. Think of SQL Azure as just a virtualized service, without host infrastructure (virtualized or physical). Since BCP is a command-line utility, you won't have access to it from Azure. You'll want to set up BCP, which is part of the command-line utilities, on a machine that does have a command prompt. That machine might be either on-premises or an Azure IaaS VM. You can install just the BCP component.
http://www.microsoft.com/en-us/download/details.aspx?id=36433
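As a rough sketch of what BCP usage looks like once it is installed on a machine with a command prompt (server, database, table, paths, and credentials below are all placeholders):

    rem Export a table from the SQL Azure database to a flat file
    bcp MyDatabase.dbo.MyTable out C:\exports\MyTable.csv -c -t, -S myserver.database.windows.net -U myuser -P mypassword

    rem Import a flat file back into the table
    bcp MyDatabase.dbo.MyTable in C:\imports\MyTable.csv -c -t, -S myserver.database.windows.net -U myuser -P mypassword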
Alternatively, if you don't want to set up infrastructure for this, Azure Data Factory can handle the task of moving data between flat files and databases.
I am running a desktop app that uses an .mdf file on a local path. What I want is to place this .mdf file in a network shared folder, but the network uses domains and we need a password to connect to that folder. The server is running Windows Server and I don't know whether it has SQL Server Express installed.
Questions:
1 - Does the server need to have SQL Server Express installed?
2 - If I publish the project and use it on multiple clients that may not have Visual Studio or SQL Server Express but do have the .NET Framework, will the database application still work?
1 - You don't have to use SQL Server Express, but it is better and more reliable than sharing a folder and using an .mdf file over the network via file sharing.
If your application will be used by only one user at a time, you can share a folder on the remote server, put the .mdf file there, and give the user running your desktop app read/write permission on the share.
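A rough sketch of how that could look when run on the server, assuming a placeholder folder C:\AppData, share name AppData, and domain user DOMAIN\appuser:

    rem Share the folder and grant change access to the user running the desktop app
    net share AppData=C:\AppData /grant:DOMAIN\appuser,CHANGE

    rem Grant matching read/write NTFS permissions on the folder itself
    icacls C:\AppData /grant DOMAIN\appuser:(OI)(CI)M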
If several users will run the app at the same time and access the database, it won't work, because Windows will probably lock the .mdf file (and if it doesn't, your database will become corrupted). In that case, you will have to use SQL Server Express and no folder sharing at all.
2 - It will work as long as your clients have SQL Server Compact Edition, which is installed with the .NET Framework by default, so you won't have any problems. By the way, if you are planning to have all your customers use the same database, all the concerns I answered in question 1 apply here. If multiple users have to connect to the same database file at the same time, you'll have to either: 1 - set up SQL Server Express on a client, or 2 - publish your SQL Server Express instance so it can be accessed from outside your network and configure your customers' desktop apps to access that server.