Microsoft OneDrive to SQL Server using SSIS - sql-server

Can I use Microsoft OneDrive to store a flat file, and can I load the files from OneDrive into a SQL Server database using SSIS? Can anyone share thoughts on an alternative? The goal is to keep the master data flat file in the cloud and make it available to SQL Server. We do not want to download the files stored in the cloud; we would like to create an SSIS package that reads the files directly from the cloud.

Related

Using SharePoint Online Excel file as a data source in SSIS

The use case is that we have a file in SharePoint Online that I need to load into our on-prem SQL Server using SSIS 2019 (64-bit).
This question is about how to use SSIS to copy the file from the SharePoint Online document library to a file location on an on-prem server; from the on-prem server I intend to use SSIS to load the required file data into the on-prem SQL Server.
I have read a number of questions, including:
Opening Excel file stored on SharePoint as data source using ADODB Connection
Import Excel file located in SharePoint Server into SSIS
However, I am mainly struggling with the "Show in File Explorer" option, even when I try to use IE11. Can anyone suggest an easier way to convert the web file path to a correctly formatted UNC path?
I am aware that I could use Power Automate to move the file and achieve the same thing, but I find it strange that SSIS cannot do this.
Can anyone help, please?

Google Cloud SQL (SQL Server) restore external backup files into current database

I'm working on a project where I need access to one of our client's databases, which happens to be SQL Server. I cannot get direct access to the DB; instead, the client regularly shares .bak files of his SQL Server database with me so that I can create a sister version of his DB on my infrastructure.
I'm developing on Google Cloud Platform and my question is: Is it doable to have a process where I restore external (differential) .bak files into Cloud SQL for SQL Server, so that the restore merges with my existing database? (The first backup I get from them would be full, but the next ones will be differential.) I cannot find any info in the GCP documentation on restoring externally created backup files and merging them into an existing DB.
Cloud SQL for SQL Server supports importing databases using BAK and SQL files. You can read about importing data from a BAK file here [1].
Currently this is only possible for full backups and not for incremental/differential ones. I suggest following the recently created Issue Tracker issue [2] and indicating that you are affected by it in order to increase its visibility.
[1] - https://cloud.google.com/sql/docs/sqlserver/import-export/import-export-bak#import_data_from_a_bak_file
[2] - https://issuetracker.google.com/200782933

SQL Server Integration Services - programmatically choose what files to import

I'm not new to SQL Server but am new to Integration Services, so I want to understand if the following requirement is fully achievable in SSIS, or if I will need to consider some level of C# development to supplement:
We have 25 Azure VMs running Windows Server 2016 Datacenter - on each VM we have thousands of log files in different folders. We need all these folders monitored, and upon creation of any new file with a certain string in its name, we want the contents of the file exported to a table in our Azure SQL Server 2017 database.
Is this kind of custom logic configurable in an SSIS project, or is SSIS more geared toward static definitions of file folders/filenames?
I don't think you can implement a folder watcher within an SSIS package. But you can achieve this in different ways:
(1) - using a folder watcher
Develop the SSIS packages you need to import the data, then build a folder watcher in C# that reacts to changes in the folders; if a file name meets the requirements, launch the dtexec utility (via the shell) to run the relevant SSIS package.
(2) - using a SQL Agent job
You can configure a SQL Server Agent job to check for file changes periodically and run the SSIS package when required.
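For option (2), a minimal sketch of such a job created through the msdb stored procedures could look like the following; the job name, package path, and schedule are hypothetical placeholders to adapt to your environment.

USE msdb;
GO
-- Hypothetical job that runs the package every 15 minutes.
EXEC dbo.sp_add_job @job_name = N'LoadLogFiles';

EXEC dbo.sp_add_jobstep
    @job_name  = N'LoadLogFiles',
    @step_name = N'Run SSIS package',
    @subsystem = N'CmdExec',
    @command   = N'dtexec /File "D:\SSIS\LoadLogFiles.dtsx"';

-- freq_type 4 = daily, freq_subday_type 4 = minutes.
EXEC dbo.sp_add_jobschedule
    @job_name = N'LoadLogFiles',
    @name     = N'Every15Minutes',
    @freq_type = 4,
    @freq_interval = 1,
    @freq_subday_type = 4,
    @freq_subday_interval = 15;

-- Register the job against the local server.
EXEC dbo.sp_add_jobserver @job_name = N'LoadLogFiles';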

Transfer json files from dropbox to SQL Server 2016 database tables

I have a requirement to automatically copy/transfer .json files that are located in Dropbox to a SQL Server 2016 database. How can this be done (code examples welcome)? I have SQL Server 2016, but I am new to all of this.
If the .json files are in Dropbox, can SQL Server get them and parse them into the database automatically, or do I need something to copy the files down from Dropbox first and then either import them directly or parse them into some other format (e.g. .csv) to populate tables in the SQL Server database?
SQL Server cannot directly access a Dropbox account. However, if you synchronize Dropbox to a local folder or network path that is accessible to SQL Server, you can use OPENROWSET to read the file and parse its content with OPENJSON; see the example at:
https://blogs.msdn.microsoft.com/sqlserverstorageengine/2015/10/07/bulk-importing-json-files-into-sql-server/
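A minimal sketch of that approach, assuming the Dropbox client syncs to a local folder the SQL Server service account can read; the file path, JSON property names, and target table below are hypothetical.

-- Read the whole file as a single string, then shred it with OPENJSON.
INSERT INTO dbo.Orders (OrderId, Customer, Amount)
SELECT j.OrderId, j.Customer, j.Amount
FROM OPENROWSET(BULK 'C:\DropboxSync\orders.json', SINGLE_CLOB) AS src
CROSS APPLY OPENJSON(src.BulkColumn)
    WITH (
        OrderId  INT           '$.orderId',
        Customer NVARCHAR(100) '$.customer',
        Amount   DECIMAL(10,2) '$.amount'
    ) AS j;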

Is SSIS able to query flat files from another Windows Server?

I am a pretty new SQL Server Integration Services (SSIS) user. Is SSIS able to query data from text files located on another Windows Server? I mean, when SSIS is installed on Windows Server A, can SSIS query data from, e.g., a folder containing text files on Windows Server B (under the same domain)? I have only used the SAP BO Data Integrator ETL tool, and it cannot query flat files from another server: during execution, all files must be located on the Job Server machine that executes the job.
Yes, you can access text files on another server from SSIS by using a file share and addressing the files by the share name, e.g. \\ServerB\MySSISFiles\.
You need to make sure that the account SSIS is running under has access to the file share.
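As a quick sanity check that the share path itself is reachable from T-SQL, a BULK INSERT over the UNC path works too; note this exercises the database engine's service account, which can be a different account from the one executing the SSIS package, and the server, file, and staging table names below are hypothetical.

-- Hypothetical staging table and UNC path; adjust terminators to the file format.
BULK INSERT dbo.StagingText
FROM '\\ServerB\MySSISFiles\data.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);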
