I want to ask something about SSIS. I have an SSIS solution with a task whose source is Azure Data Lake and whose destination is an Azure SQL Database.
In the Azure SQL Database table, I have added a DATEMODIFIED column that should be populated from the Last Modified date of the file in Azure Data Lake ...
The question is: can I get the Last Modified date of the CSV in Azure Data Lake into the Azure SQL DB table?
Thanks in advance
You will need a Script Task in your package; use the System.IO.FileInfo object in the script to get the file's properties, including its last modified date.
You could also use a REST API call or the SDK, as suggested in this SO thread.
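As a rough sketch (not a drop-in solution), the script task's Main method might look like the following, assuming the CSV is reachable through a local or UNC path and that the package defines two hypothetical variables, User::FilePath (ReadOnly) and User::DateModified (ReadWrite):

```csharp
// Inside the SSIS Script Task's ScriptMain class (C#).
// Assumes User::FilePath (String) is listed under ReadOnlyVariables and
// User::DateModified (DateTime) under ReadWriteVariables on the task.
using System.IO;

public void Main()
{
    string filePath = Dts.Variables["User::FilePath"].Value.ToString();

    // FileInfo exposes the file's properties, including its last
    // modified timestamp.
    FileInfo fileInfo = new FileInfo(filePath);
    Dts.Variables["User::DateModified"].Value = fileInfo.LastWriteTimeUtc;

    Dts.TaskResult = (int)ScriptResults.Success;
}
```

The variable can then feed the DATEMODIFIED column, for example via a Derived Column transformation or as a parameter of an Execute SQL Task. Note that FileInfo only works against local or UNC paths; for files that exist only in Azure Data Lake, use the REST API or SDK route instead.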
I created an empty SQL database using Azure portal.
I also added some sample data to a data lake in Data Lake Storage Gen 1.
I downloaded SSMS, linked it to the server containing the SQL database, and used SSMS to add a new table to serve as the target for importing the data into the SQL database.
Questions: 1. Will the new table I added in SSMS be recognized in Azure?; 2. How do I get the sample data from the data lake I created into the new table I created in the Azure SQL database?
An article suggested using Azure HDInsight to transfer the data, but it's not part of my free subscription and I don't know what charges I would incur by using it.
Will the new table I added in SSMS be recognized in Azure?
Yes. When we connect to the Azure SQL database with SSMS, all operations in the query editor run directly against the Azure SQL database. Once the SQL statements have executed, we can refresh Object Explorer in SSMS and the new table will be there.
How do I get the sample data from the data lake I created into the new table I created in the Azure SQL database?
Which sample data did you add to Data Lake Storage Gen1, CSV files? If the files are not very large, I would suggest downloading them to your local computer and then using the SSMS Import and Export Wizard to load the data into the Azure SQL database. It may take some time, but it's free.
Or you may follow the tutorial Copy data between Data Lake Storage Gen1 and Azure SQL Database using Sqoop. I couldn't find the price of Azure HDInsight when creating it, so I'm not sure whether it's free; you could try it.
Hope this helps.
I'm moving an application from an Access database to a SQL Server database.
The current Access database contains 5 'linked' Excel files (reports that come from SAP) which are refreshed daily by overwriting each file with the new SAP report. In this way, through a bunch of transformation/queries, the data ends up in the appropriate table in the way we want to store the data.
Is a setup similar to this possible using SSIS? I've watched tutorials about uploading Excel files into a table, but essentially I need to do this:
SAP Export > Save (overwrite) to Network File > MS Access link exists and new data is transformed through many 'stored procedures (action queries)' > Data moved to appropriate table.
Appreciate any YouTube/Google/Links/Reading about doing this in SSIS. Regards!
Architectural/perf question here.
I have an on-premises SQL Server database with ~200 tables totaling ~10 TB.
I need to make this data available in Azure in Parquet format for Data Science analysis via HDInsight Spark.
What is the optimal way to copy/convert this data to Azure (Blob storage or Data Lake) in Parquet format?
Given the manageability aspect of the task (~200 tables), my best approach so far is: extract the data locally to a file share via sqlcmd, compress it as csv.bz2, and use Data Factory to copy the file share (with 'PreserveHierarchy') to Azure. Finally, run PySpark to load the data and save it as .parquet.
Given the table schemas, I can auto-generate the SQL data-extract and Python scripts from the SQL database via T-SQL.
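For illustration, the generation step could be sketched in C# as below (in practice I do it via T-SQL; the server names, sqlcmd flags, and output paths are illustrative placeholders):

```csharp
// Sketch: enumerate user tables and emit one sqlcmd extract command per
// table. Connection string, flags, and paths are placeholders, not a
// tested pipeline.
using System;
using Microsoft.Data.SqlClient;

class ExtractScriptGenerator
{
    static void Main()
    {
        const string connectionString =
            "Server=localhost;Database=MyDb;Integrated Security=true;TrustServerCertificate=true";

        using var connection = new SqlConnection(connectionString);
        connection.Open();

        const string tableQuery =
            "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES " +
            "WHERE TABLE_TYPE = 'BASE TABLE'";

        using var command = new SqlCommand(tableQuery, connection);
        using var reader = command.ExecuteReader();
        while (reader.Read())
        {
            string schema = reader.GetString(0);
            string table = reader.GetString(1);

            // -s"," sets a comma separator, -W trims trailing whitespace,
            // -o writes the result to the file share.
            Console.WriteLine(
                $"sqlcmd -S localhost -d MyDb -E " +
                $"-Q \"SET NOCOUNT ON; SELECT * FROM [{schema}].[{table}]\" " +
                $"-s\",\" -W -o \"\\\\fileshare\\extract\\{schema}.{table}.csv\"");
        }
    }
}
```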
Are there faster and/or more manageable ways to accomplish this?
ADF matches your requirement perfectly, supporting both one-time and schedule-based data movement.
Try the Copy Wizard of ADF. With it, you can move on-prem SQL Server data directly to Blob/ADLS in Parquet format with just a couple of clicks.
Copy Activity Overview
Is there any way to transfer a large volume of data from Azure SQL to an on-premises SQL Server 2016 Enterprise/Standard instance? The requirements are as follows:
Weekly full database transfer
Daily delta transfer before midnight
I read about SSIS for Azure Blob Storage but am not sure whether it is applicable to this context.
Updated: I found an article on Azure Data Sync; according to that article, it seems doable. Please share your experiences. That would be extremely helpful.
https://www.mssqltips.com/sqlservertip/3062/understanding-sql-data-sync-for-sql-server/
Weekly full database transfer
SSIS doesn't provide a way to do a full transfer of the data (i.e., a backup), unless you want to truncate the destination and re-insert everything from the source.
For the weekly full database transfer, I would go with the SQL Azure Export/Import functionality.
Refer to the links below for more details:
1. https://github.com/richorama/SQLDatabaseBackup
2. I need to automate SQL Azure database backup in SQL Script files. How can I do so?
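If you want to script that weekly export yourself, one option (a sketch under assumptions, not taken from the links above) is the DacFx API; this assumes the Microsoft.SqlServer.DacFx NuGet package and placeholder connection details:

```csharp
// Sketch: weekly full export of an Azure SQL database to a .bacpac file
// via the DacFx API (Microsoft.SqlServer.DacFx NuGet package).
// Server, database, user, and file names are placeholders.
using System;
using Microsoft.SqlServer.Dac;

class WeeklyExport
{
    static void Main()
    {
        var services = new DacServices(
            "Server=myserver.database.windows.net;Database=MyDb;" +
            "User ID=myuser;Password=mypassword;Encrypt=true");

        // Progress messages are handy when this runs as a scheduled job.
        services.Message += (s, e) => Console.WriteLine(e.Message);

        string bacpacPath = $"MyDb-{DateTime.UtcNow:yyyyMMdd}.bacpac";
        services.ExportBacpac(bacpacPath, "MyDb");
    }
}
```

The resulting .bacpac can be imported into the on-premises SQL Server 2016 instance with DacServices.ImportBacpac or through the Import Data-tier Application wizard in SSMS.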
Daily delta transfer before midnight
You will need a way to identify the delta, so create a tracking table holding every table name and its last run time.
Then create a console application that uses bulk-insert functionality, drives off that tracking table, and inserts the changed rows into the on-premises database, as sketched below.
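A minimal sketch of that console application, assuming a hypothetical tracking table dbo.DeltaTracking(TableName, LastRunTime) and a ModifiedDate column on each source table (adjust to however you detect changes):

```csharp
// Sketch: copy the daily delta of one table from Azure SQL to an
// on-premises SQL Server using SqlBulkCopy. The DeltaTracking table,
// dbo.Orders, and the ModifiedDate column are assumed schema.
using System;
using Microsoft.Data.SqlClient;

class DeltaCopy
{
    static void Main()
    {
        const string azureConn =
            "Server=myserver.database.windows.net;Database=MyDb;" +
            "User ID=myuser;Password=mypassword;Encrypt=true";
        const string onPremConn =
            "Server=onprem-sql;Database=MyDb;Integrated Security=true;" +
            "TrustServerCertificate=true";

        using var source = new SqlConnection(azureConn);
        using var target = new SqlConnection(onPremConn);
        source.Open();
        target.Open();

        // Read the high-water mark for this table from the tracking table.
        var markCmd = new SqlCommand(
            "SELECT LastRunTime FROM dbo.DeltaTracking WHERE TableName = 'dbo.Orders'",
            target);
        var lastRun = (DateTime)markCmd.ExecuteScalar();

        // Select only the rows changed since the last run.
        var deltaCmd = new SqlCommand(
            "SELECT * FROM dbo.Orders WHERE ModifiedDate > @lastRun", source);
        deltaCmd.Parameters.AddWithValue("@lastRun", lastRun);

        using var reader = deltaCmd.ExecuteReader();
        using var bulk = new SqlBulkCopy(target) { DestinationTableName = "dbo.Orders" };
        bulk.WriteToServer(reader);

        // Advance the high-water mark.
        new SqlCommand(
            "UPDATE dbo.DeltaTracking SET LastRunTime = SYSUTCDATETIME() " +
            "WHERE TableName = 'dbo.Orders'", target).ExecuteNonQuery();
    }
}
```

Note that SqlBulkCopy only inserts; if your delta can contain updates, bulk-load into a staging table and MERGE into the target instead.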
We track IPs that attack our site. On the first attack, we temporarily block them. If they ever attack again, we permanently blacklist them. Information for each attack by each IP is stored in perpetuity. Twice daily, reports with an Excel spreadsheet containing all pertinent information are emailed to various people, and the information is then manually added to a massive spreadsheet. We've recently spun up a new box with SQL Server, and I've added all of the existing information to a table in the new database.
As I'm new to this, I would like to know if there is a way to send the daily spreadsheets to this new SQL Server and have it parse the Excel attachment and update our master tracking table. The spreadsheet will always have the same structure (15 columns plus header and footer rows) with varying row counts, and of course it matches the existing table structure.
I've been googling it and am only able to find queries (ba dum tish) on how to make SQL export to Excel and send an email with Database Mail. I can't find anything on sending an email TO SQL Server and having it process an attachment.
You can make use of SQL Server Integration Services (SSIS). Write an SSIS package that imports the data from the given Excel spreadsheet into a staging table, and then from that table write insert or update statements against your production table. Use a Data Flow Task to import the data from the Excel file, followed by an Execute SQL Task that updates the values in the production table. Remember that you will have to keep the Excel file in the same folder at all times (or else set the file name dynamically using variables, as sketched below). Once the package is complete, you can schedule it as a SQL Server Agent job that runs periodically, so the data will be updated automatically.
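For the dynamic file name mentioned above, a small Script Task can pick the newest spreadsheet from the drop folder; the folder and variable names here are hypothetical:

```csharp
// Inside an SSIS Script Task (C#): find the most recently written .xlsx
// file in the drop folder and store its full path in a package variable.
// User::DropFolder and User::ExcelFilePath are hypothetical names; the
// folder is assumed to contain at least one .xlsx file.
using System.IO;
using System.Linq;

public void Main()
{
    string dropFolder = Dts.Variables["User::DropFolder"].Value.ToString();

    // Pick the newest spreadsheet by last-write time.
    string newest = new DirectoryInfo(dropFolder)
        .GetFiles("*.xlsx")
        .OrderByDescending(f => f.LastWriteTimeUtc)
        .First()
        .FullName;

    Dts.Variables["User::ExcelFilePath"].Value = newest;
    Dts.TaskResult = (int)ScriptResults.Success;
}
```

An expression on the Excel connection manager's ExcelFilePath property can then point it at User::ExcelFilePath.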
Please refer to this video for a basic idea about SSIS:
Import Data From Excel to SQL Server Using SSIS
Twice daily, reports with an Excel spreadsheet with all pertinent information are emailed to various people,
Try saving the file to a fixed location, then use the SSMS Import/Export Wizard. The resulting package can be saved and scheduled to run daily.
Here is a step-by-step tutorial covering the same:
https://www.mssqltips.com/sqlservertutorial/203/simple-way-to-import-data-into-sql-server/