How do you automate data loads in visual studio? [closed] - sql-server

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 3 years ago.
I am pretty new to SSIS and SQL in general, and I am still learning the ropes. I am trying to create a few integration packages. The first step is to retrieve the data from one database and load it into another; I've been using the Data Flow Task for this. The next step is to run this package once daily at a given time (e.g. 1 AM). The landing database is continuously updated, just like the source.
In a separate package, I'm looking for a way to copy the data from my landing database and dump it all into another table. This needs to happen automatically, so that the package retrieves the rows from the landing database before it is overwritten each day.
Any suggestions?
Thanks!
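For the second package described above, the archive step itself can be a single INSERT ... SELECT. A minimal sketch, assuming hypothetical table names (LandingDb.dbo.Customers as the landing table, ArchiveDb.dbo.Customers_History as the archive):

```sql
-- Hypothetical sketch: copy the landing table's rows into an archive table
-- before the nightly reload overwrites them. All database, table, and
-- column names below are placeholders for your own schema.
INSERT INTO ArchiveDb.dbo.Customers_History (CustomerId, Name, LoadedAt)
SELECT CustomerId, Name, SYSDATETIME()  -- stamp each batch with its load time
FROM LandingDb.dbo.Customers;
```

This could run from an Execute SQL Task inside the second package, or as its own SQL Agent job step scheduled to finish before the nightly reload starts.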

Regarding scheduling: both of the SSIS packages you created can be scheduled as SQL Server Agent jobs. SQL Server Agent is part of all SQL Server editions except Express Edition. For details on how to create an automated job that runs SSIS packages, please refer to Schedule SSIS.
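A minimal T-SQL sketch of such an Agent job, assuming hypothetical job, schedule, and package names (adjust the package path and times to your environment):

```sql
USE msdb;
GO
-- Hypothetical sketch: run an SSIS package daily at 1 AM via SQL Server Agent.
-- The job name, schedule name, and package path are placeholder assumptions.
EXEC dbo.sp_add_job @job_name = N'Nightly Load';

EXEC dbo.sp_add_jobstep
    @job_name  = N'Nightly Load',
    @step_name = N'Run package',
    @subsystem = N'SSIS',
    @command   = N'/FILE "C:\Packages\LoadLanding.dtsx"';

EXEC dbo.sp_add_schedule
    @schedule_name     = N'Daily 1 AM',
    @freq_type         = 4,       -- daily
    @freq_interval     = 1,       -- every 1 day
    @active_start_time = 10000;   -- HHMMSS encoding: 01:00:00

EXEC dbo.sp_attach_schedule
    @job_name      = N'Nightly Load',
    @schedule_name = N'Daily 1 AM';

EXEC dbo.sp_add_jobserver @job_name = N'Nightly Load';
```

Note that `@active_start_time` is an integer in HHMMSS form, so 1 AM is 10000, not a time literal.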

Related

Is there a any benefit at all of using DB Project in SQL Server? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 3 years ago.
Our DBA recently introduced the whole idea of a DB Project for maintaining SQL (DDL and DML) check-in; however, I feel it's of no benefit at all. Does anyone know any benefit of using a DB project? I would like to know the real benefit.
Does anyone know any benefit of using a DB project?
1) It tracks the change history of your database schema, and stores your schema in a version control system.
2) It integrates with your DevOps workflow, and enables you to track what version of your schema is deployed in what environment.
3) It manages the creation of DDL change scripts for upgrading a target environment to a specific version of the schema.
4) It prevents schema drift by detecting and fixing changes made directly in environments.
An easy way to get started with database projects is to continue with your connected database development workflow, and use the Schema Compare tool to update your Database Project and check it in to source control.

SSIS, Change Tracking, and Snapshot Isolation [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I'm currently using SQL Server 2008 R2 Change Tracking (not Change Data Capture) and SSIS to extract incremental changes from several source databases.
Until now, I'd been using restored backups to do this so I didn't need to worry about Snapshot Isolation. However, I now need to point these packages at the production databases.
I know that enabling snapshot isolation on the tracked databases is recommended to ensure consistency of the ETL extracts. I'm reluctant to do this because of possible performance degradation.
Since I'm extracting late at night, is there some reason I can't use the following process?
1. Create a database snapshot for temporary use.
2. Get the Change Tracking current version of the production database.
3. Compare it with the previous successful run's version.
4. Extract from the database snapshot instead of the production database.
5. After a successful load, drop the database snapshot.
We're using 2008 R2 Enterprise Edition. Is there any downside to this? Am I missing something?
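A rough T-SQL sketch of the five steps above, assuming hypothetical database and table names (Prod, Prod_Snap, dbo.MyTable) and that the previous run's version is read from your own ETL log:

```sql
-- 1. Create a database snapshot for temporary use.
--    All names and the sparse-file path are placeholder assumptions.
CREATE DATABASE Prod_Snap ON
    (NAME = Prod_Data, FILENAME = 'C:\Snapshots\Prod_Snap.ss')
AS SNAPSHOT OF Prod;
GO

-- 2. Get the Change Tracking current version of the production database.
USE Prod;
DECLARE @current_version bigint = CHANGE_TRACKING_CURRENT_VERSION();

-- 3. Compare with the previous successful run's version.
DECLARE @last_version bigint = 0;  -- placeholder: read from your ETL log

-- 4. List the changed rows, but read the row data from the snapshot
--    instead of the production tables.
SELECT ct.Id, ct.SYS_CHANGE_OPERATION, t.*
FROM CHANGETABLE(CHANGES dbo.MyTable, @last_version) AS ct
LEFT JOIN Prod_Snap.dbo.MyTable AS t ON t.Id = ct.Id;
GO

-- 5. After a successful load, drop the database snapshot.
DROP DATABASE Prod_Snap;
```

This is only an illustration of the sequence described in the question, not a tested pattern; in particular, the gap between reading the current version and creating the snapshot deserves thought, since changes can land in between.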

SQL stored procedure to insert data into Mongo DB [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
At present we have a Windows service that runs a stored procedure to transfer data from one table to another. Now we need to change the stored procedure to transfer data from a SQL table to a Mongo collection. Could someone with MongoDB experience point me in the right direction?
You will probably need to get the MongoDB .NET Driver and do one of the following:
Create a new application that connects to SQL Server, gets data from the desired table, and then connects to MongoDB to insert that data.
Create a SQLCLR Stored Procedure that reads from the desired table and connects to MongoDB to insert the data.
Option 2 is more of a drop-in replacement as it doesn't interfere with the current structure of the Windows Service, but it might take a little bit of work to get the MongoDB .NET driver working correctly inside of SQL Server (I have not tried it so I am not sure what requirements that particular code has).

Scheduler Get Data From SQL Server to Oracle [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 9 years ago.
I want to create a scheduled task to get data from SQL Server 2000 into Oracle.
How can I do it? Thanks.
This kind of process is called ETL automation.
You should first write the code that accomplishes the extraction and transformation of the data prior to loading it into the other system. You have several options. One of them is using SQL: you can either use Oracle to pull data from SQL Server, or vice versa.
It is better to convert this code to a batch executable so it can be run from the command line.
Choose a scheduler to execute the job according to your business requirements.
Alternatives are cron for Unix platforms, Windows Task Scheduler for Microsoft platforms, open source frameworks for implementing your own (like Quartz), open source products if you do not have a budget, or more professional commercial ones like TlosLite.
In SQL Server you can create a DTS job to transfer the data from a SQL Server table to an Oracle table. Later, using SQL Server Agent, you can schedule this job.
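If a linked server to Oracle is configured, the transfer step of such a job can be a single INSERT over the link. A sketch, assuming a hypothetical linked server named ORA_LINK and placeholder table and column names:

```sql
-- Hypothetical sketch: push rows to Oracle over a linked server.
-- 'ORA_LINK' and every table/column name are placeholder assumptions;
-- the linked server would be set up with an Oracle OLE DB provider.
-- Oracle linked servers use a four-part name with the catalog omitted.
INSERT INTO ORA_LINK..SCOTT.EMP_COPY (EMPNO, ENAME)
SELECT EmpNo, EmpName
FROM dbo.Employees;
```

SQL Server Agent can then run this as an ordinary T-SQL job step on whatever schedule you need.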

How to send the reports through E-mail and Windows file simultaneously in SQL server 2008 report services [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
I'm using the SQL server 2008.
The scenario is:
We have about two hundred reports to generate daily, and it takes plenty of time to query the DB and refine the data. We would like the reports to be sent by e-mail and saved to the Windows file system at the same time, so a method that needs two different subscriptions for the two delivery channels is not acceptable.
We are wondering if we can make it work by customizing the subscription directly, or by writing a DLL to extend it (e.g. allowing us to select a new custom delivery method when editing the subscription). Could you please give me some ideas? If not, we will have to write a new program to do it, but that takes effort.
One option would be to cache the reports so that they run quickly the second time.
Another option would be to put the files in a Windows file share and use SSIS SendMail to pick up the files and email them out.
One more option is to write your own report runner that runs the reports and deals with them however you want. I was unsatisfied with the Reporting Services scheduler options and did it this way. It isn't as hard as you might think, and Microsoft has some code that will get you started.
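To illustrate the e-mail half of the file-share approach above, here is a sketch using Database Mail's sp_send_dbmail, a server-side alternative to the SSIS Send Mail task; the profile name, recipient, and UNC path are placeholder assumptions:

```sql
-- Hypothetical sketch: e-mail a report file that a file-share
-- subscription already wrote to disk. The Database Mail profile,
-- recipient, and attachment path are placeholders.
EXEC msdb.dbo.sp_send_dbmail
    @profile_name     = N'ReportsProfile',
    @recipients       = N'team@example.com',
    @subject          = N'Daily report',
    @file_attachments = N'\\fileserver\reports\DailyReport.pdf';
```

A SQL Agent job step could loop this over each file the subscriptions drop into the share, giving one subscription plus one job instead of two subscriptions per report.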
