Best way of logging in SSIS

There are five different types of log providers in SSIS:
Event Log
Text File
XML File
SQL Server
SQL Server Profiler
I am in a production environment where developers do not have access to production systems.
Which logging method should be my poison of choice, and why?

If you're not going to have access to the production server, then SQL Server logging is your best bet by far. You'll have plenty of ways of viewing the logged information, for example via custom SSRS reports or web pages, or direct access to the tables if your DBA allows it. Also, the logs will be easier to search and filter when in a table.

Personally I prefer logging to SQL Server.
I think this is because it puts the data in a form which I can immediately access and process. For example, I can then slice and dice the data, export it to another server, setup agent jobs to monitor the logs and email alerts etc.
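Once the logs are in a table, ad-hoc analysis is a single query away. A minimal sketch, assuming the default SQL Server log provider on SSIS 2008+ (which writes to `dbo.sysssislog`; on 2005 the table is `sysdtslog90`):

```sql
-- Pull the last 24 hours of errors and warnings from the SSIS log table.
-- Table and column names assume the default SQL Server log provider.
SELECT  starttime,
        source      AS task_name,
        event,
        message
FROM    dbo.sysssislog
WHERE   event IN ('OnError', 'OnWarning')
  AND   starttime >= DATEADD(DAY, -1, GETDATE())
ORDER BY starttime DESC;
```

The same query can feed an SSRS report or an Agent job that emails alerts when the row count is nonzero.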

Have you looked at BI xPress from Pragmatic Works? It has serious auditing features for SSIS:
SSIS Logging And Auditing, Notification, Deployment using BI xPress

Allow Data Push into an Azure SQL Database?

I'm relatively new to Azure and am having trouble finding what options are out there for connecting to an existing SQL database to push data into it.
The situation is that we have an external client who needs to connect to our Azure SQL database to push data into it on an ongoing basis. We can't give them permission to get into our database, so we're looking at what we can do to allow data in. At this point the best option seems to be to create a web service deployed in Azure that will validate the data and then push it into our database.
The question I have is, are there other options to do this in an easier way? Are there Azure services or processes that can be set up to automatically process a file and pull the data into a database? Any other go-between options when each side has their own database and for security reasons can't just open up access to it?
Azure Data Factory works great for basic ETL. If neither party can grant direct access, you can use an intermediate repository like Blob Storage to drop csv/xml/json files for ingestion. If they'll grant you access to pull, you can set up a linked service that more or less functions the same as a linked server in MSSQL. As of the last release, ADF now supports Azure-hosted SSIS packages too.
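The Blob Storage drop-zone approach can also be consumed directly from Azure SQL with `BULK INSERT` (SQL Server 2017+ / Azure SQL Database). A hedged sketch; the data source, credential, container URL, and staging table names are all hypothetical, and the database-scoped SAS credential must exist first:

```sql
-- Register the blob container as an external data source for BULK INSERT.
-- BlobSasCredential is a DATABASE SCOPED CREDENTIAL holding a SAS token.
CREATE EXTERNAL DATA SOURCE BlobStage
WITH ( TYPE = BLOB_STORAGE,
       LOCATION = 'https://myaccount.blob.core.windows.net/dropzone',
       CREDENTIAL = BlobSasCredential );

-- Ingest a dropped CSV file into a staging table, skipping the header row.
BULK INSERT dbo.StagingOrders
FROM 'orders.csv'
WITH ( DATA_SOURCE = 'BlobStage',
       FORMAT = 'CSV',
       FIRSTROW = 2 );
```

This keeps the client writing only to the container while a scheduled job on your side validates and loads the staged rows.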
I would do this via SSIS using SQL Server Management Studio (if it's a one-time operation). If you plan to do this repeatedly, you could schedule the SSIS job to execute on a schedule. SSIS will do bulk inserts using small batches, so you shouldn't have transaction log issues and it should be efficient (because of bulk inserting). Before you do this insert, though, you will probably want to consider your performance tier so you don't get major throttling by Azure and possible timeouts.

Azure Database Installation Error "Invalid Object name 'Categories' "

I am completely ignorant in relation to databases and servers etc. Please bear with me.
I am trying to install a program called RealProspect 2009 which allows both local and remote sql database installation. Both types are done using the program installation .exe.
I have an azure account on which I have set up a server, and a database. During the program installation I am asked to provide the SQL server address, SQL server name, SQL username and SQL password. Using the information provided in the Azure online tools, I input all of this information into the fields and the program commences installing the database on the remote location. If I use incorrect information in these fields the installation returns an error and tells me it cannot log in, or the IP is not allowed etc., so I know it's actually attempting to connect and verifying the connection credentials.
When I use the correct server and login information the program proceeds. It spends several minutes "Creating the Tables". When it finishes doing that it attempts to begin "Installing Default Data (Categories)". At this point the program stops and I get the error in the subject line of this post "Invalid Object name 'Categories' "
I don't know enough to tell you what I don't know about this process.
I just signed up for Azure specifically because hosting the database with Azure is like $5-10 per month and I want myself and several other participants to be able to use the software with a common database. I created the server and database using the gui "tools/how to" from within the online Azure portal and I have never written a script, or accessed the server/database using anything other than the online GUI.
Thank you in advance for any help you may be able to provide. I hope I'm not too much of a speed bump to your day.
P.S. - For what it's worth you can download a free trial of the software from realinvestorsoftware.com and see if you could install it on a remote server. Maybe you can better see what I see and tell me how to do it on my own?
SQL Azure is VERY similar to SQL Server but there are a few features that SQL Azure doesn't support. That said, I'd be surprised if the app's installer is using any of the features that are unsupported by SQL Azure. My guess is that there's a bug in their installation scripts that might fail on more modern versions of SQL Server (note, their app installs on SQL Express 2005 which is no longer in mainstream support).
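One quick sanity check you can run in the Azure database (via the portal's query tool): "Invalid object name" usually means the object either wasn't created at all or was created under a schema the app isn't expecting. This query is a hypothetical diagnostic, not part of the app:

```sql
-- Did the installer actually create Categories, and under which schema?
SELECT  s.name AS schema_name,
        t.name AS table_name
FROM    sys.tables  t
JOIN    sys.schemas s ON s.schema_id = t.schema_id
WHERE   t.name = 'Categories';
```

If the table exists but under a schema other than `dbo`, that would point at exactly the kind of installer-script bug described above.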
Just a couple of other thoughts for you: You get keys to install the app on two machines but:
"If you would like to install on more than two computers, then after you order your copy of RealProspect you can login to your customer account on this website and order additional activation keys for only $97 each."
Because you're going to be paying several hundred dollars anyway, and because (you yourself admit) you're not a database expert, it may be less cost, stress and hard-work to use their $27 per month database hosting service. That way you can concentrate on building your business while they take care of the technology.
[Update: 3/27/2013 @ 23:05]
Another option Chris presented was to install the app and database locally and then migrate the database to Azure.
While this is potentially feasible, it requires some finesse to execute.
Microsoft provides a DB migration guide presenting several (pretty manual) options.
You might also want to read this thread which discusses how to migrate your DB via a DACPAC.
Another option is to download and use the SQL Azure Migration Wizard which should do most of the heavy-lifting for you and make your DB migration simpler.
However, note that it is possible that the DB the app uses may use features of SQL Server that are not supported on SQL Azure. Hopefully this isn't the case, but be aware that this may be an issue.
Good luck :)
Chris,
I think the SQL Database Migration Wizard (v3.9.10 & v4.0.13) will solve your problem. I have used this tool several times to migrate databases from a local machine to SQL Azure. The best part of this tool is that it also highlights the errors or SQL that couldn't be migrated to Azure, so you can easily find alternate syntax for such queries.

Should SSIS Packages run on a database SQL Server or a separate app server?

It seems to be a policy in my company to place application code on a separate server than the database.
We are a data warehouse team that mostly uses SSIS and T-SQL to transform and load data into SQL Server. We would like to use a C# .NET app to do some of the steps, for example, to iterate through a file and call a web service.
Given that SQL Server 2008 now automatically installs .NET 3.5 and supports .NET stored procedures, is it justified to prohibit ETL code written in .NET from running on the database server? Having both SSIS and .NET code running on the same box will help simplify our workflow, so we don't have to worry about a scheduling app having to control flow across servers.
I do understand that it would be appropriate, for example in a web app, to separate the business logic tier from the db tier.
One possible snag: in my company, the DBAs are not admins of the app servers and do not have rights to install the DB client tools on the app server, and the app server admins probably should not have anything to do with installing database client tools. If they did, there would have to be coordination between the app server admins and the DB server admins. Perhaps the DBAs could share ownership of the app server. How do companies usually handle this?
Best practice is to leave SQL Server completely alone on the box. SQL Server uses user-mode cooperative multitasking and resource control that assumes 100% ownership of the system, and it does not play well if other processes are stealing memory or CPU. See Dynamic Memory Management and SQL Server Batch or Task Scheduling.
As for .NET doing ETL web calls from inside SQLCLR: don't even think about it. Never do any sort of blocking operations from SQLCLR; you'll starve the SQL scheduler.
As a general rule, you should have your SSIS server be separate from the SQL Server as SSIS is an app layer. Next, your application code should run on a separate server (can be the same as the SSIS server).
Keep your scaling concerns separate.
It depends on the code, the availability needed of the database, and the size of the box. If you're doing a lot in memory on the SSIS pipe or in your C# app, I'm a proponent of putting it on a separate box (all of the ETL, not just some of it). If you're just using SSIS to call stored procs on the database, it's fine to leave it on the same system.
That being said, I'd avoid splintering the ETL across boxes unless there's an overwhelming reason to do so. It adds a lot of complexity for not much benefit (usually).
That being said, if you need C# stuff to run, you could always use the script tasks in SSIS to control its execution.
We implemented SSIS by placing the DTS packages on the same server as the SSIS server, and we have a Windows service that lives on another server and executes the DTS packages remotely.

System for automating admin creation for any database

For example, I have a database and want a good, automatically generated admin for it. Is there software for this?
Not sure what your question is. What kind of admin tasks are you trying to automate?
The SQL Server Agent service can handle a lot of the scheduled aspect, as well as alerts for certain conditions. Policy Based Management can log and restrict certain database params across a single instance or multiple ones.
What more are you looking for? SSMS is the GUI that sits on top of SQL Server for easy administration.
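If scheduled maintenance is what you're after, SQL Server Agent jobs can be scripted entirely in T-SQL. A minimal sketch, run against `msdb`; the job, step, schedule, and procedure names are all hypothetical:

```sql
-- Create a job with one T-SQL step and a daily 2 AM schedule,
-- then register it on the local server.
EXEC msdb.dbo.sp_add_job         @job_name = N'NightlyIndexMaintenance';

EXEC msdb.dbo.sp_add_jobstep     @job_name  = N'NightlyIndexMaintenance',
                                 @step_name = N'Rebuild indexes',
                                 @subsystem = N'TSQL',
                                 @command   = N'EXEC dbo.IndexMaintenance;',
                                 @database_name = N'MyDb';

EXEC msdb.dbo.sp_add_schedule    @schedule_name = N'Nightly2am',
                                 @freq_type     = 4,   -- daily
                                 @freq_interval = 1,
                                 @active_start_time = 020000;

EXEC msdb.dbo.sp_attach_schedule @job_name      = N'NightlyIndexMaintenance',
                                 @schedule_name = N'Nightly2am';

EXEC msdb.dbo.sp_add_jobserver   @job_name = N'NightlyIndexMaintenance';
```

Scripting jobs this way also makes the "admin" reproducible across instances, which is as close to automatic generation as the built-in tooling gets.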

Updating database on website from another data store

I have a client who owns a business with a handful of employees. He has a product website that has several hundred static product pages that are updated periodically via FTP.
We want to change this to a data-driven website, but the database (which will be hosted at an ISP) will have to be updated from data on my client's servers.
How best to do this on a shoestring? Can the database be hot-swapped via FTP, or do we need to build a web service we can push changes to?
Ask the ISP about the options. Some ISPs allow you to FTP upload the .mdf (database file).
Some will allow you to connect with SQL Server Management Studio.
Some will allow both.
You've got to ask the ISP.
Last time I did this we created XML documents that were ftp'd to the website. We had an admin page that would clear out the old data by running some stored procs to truncate tables then import the xml docs to the sql tables.
Since we didn't have the whole server to ourselves, there was no access to SQL Server DTS to schedule this stuff.
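The truncate-then-import step above can be done in plain T-SQL if the host allows file access. A hedged sketch of that pattern; the file path, table, and XML element names are hypothetical:

```sql
-- Clear out the old data first.
TRUNCATE TABLE dbo.Products;

-- Load the uploaded XML document into an XML variable.
DECLARE @doc XML;
SELECT @doc = BulkColumn
FROM   OPENROWSET(BULK 'D:\uploads\products.xml', SINGLE_BLOB) AS x;

-- Shred the XML into rows and insert into the target table.
INSERT INTO dbo.Products (Sku, Name, Price)
SELECT  p.value('(Sku)[1]',   'NVARCHAR(50)'),
        p.value('(Name)[1]',  'NVARCHAR(200)'),
        p.value('(Price)[1]', 'DECIMAL(10,2)')
FROM    @doc.nodes('/Products/Product') AS t(p);
```

Wrapping this in a stored procedure is what lets an admin page (or later a web service) trigger the whole refresh in one call.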
There is a Database Publishing Wizard from MS which will take all your data and create a SQL file that can then be run on the ISP. It will also, though I've never tried it, go directly to an ISP database. There is an option button on one of the wizard screens that does it.
It does require the user to have a little training, and it's still a manual process, so maybe not what you're after, but I think it will do the job.
Long-term, building a service to upload the data is probably the cleanest solution, as the app can then control its import procedures. You could go grossly simple with this and just have the local copy dump some sort of XML that the app could read, making it not much harder than uploading the file while still in the automatable category. Having this import procedure would also help with development, as you now have an automated and repeatable way to sync data.
This is what I usually do:
You could use a tool like Red Gate's SQL Data Compare to do this. The tool compares data between two catalogs (on the same or different servers) and generates a script for syncing them.
