I am trying to build a tool to facilitate some repetitive importing of data into a SQL Server database. The flat text files we get are mostly static, but there is often about a 5-10% variance in field names, and sometimes extra fields are added (in which case we will add columns to the table in the database before importing).
I'd like to build a front-end interface for an SSIS package so that the field mapping is the only real work for the user, since I don't think we can fully automate it. Is there anything out there that would allow this? Should I consider something other than SSIS? I appreciate any input, thanks!
SSIS packages are generally headless because they typically run as scheduled jobs somewhere on a database server. That said, there are definitely ways to do this.
One option that I have used is SQL Management Objects (SMO) to connect to the SQL Server Agent where the job is hosted. A client can interactively run such a job and even update the user on execution status. The same client could ask the user for input prior to kicking off the job, and you could store such input in a place where the package can access it.
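The "store the user's input where the package can access it" step can be as simple as a small mapping file. Here is a minimal Python sketch, assuming a JSON file and a hypothetical Agent job name (`ImportFlatFiles`); both names are illustrative, not from the original setup:

```python
import json
from pathlib import Path

def save_field_mapping(mapping: dict, path: str) -> None:
    """Persist the user's source-to-destination field mapping as JSON.
    The SSIS package can read this file (e.g. via a Script Task) at run time."""
    Path(path).write_text(json.dumps(mapping, indent=2))

def load_field_mapping(path: str) -> dict:
    """Read the mapping back, as the package (or a test) would."""
    return json.loads(Path(path).read_text())

# Kicking off the hosted job afterwards could be as simple as executing
#   EXEC msdb.dbo.sp_start_job @job_name = N'ImportFlatFiles';
# through any SQL Server connection (SMO, pyodbc, etc.).
```

The client collects the mapping from the user, writes the file, then starts the job; the package reads the same file before doing the import.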
Related
We have SQL Server Management Studio, and we have written several stored procedures in it. Currently we take the output as HTML and mail it to the desired email IDs. Our requirement now is to take the output as Excel instead of HTML and mail it to the desired IDs.
I would use SQL Server Reporting Services and add subscriptions that send the created result by email as an Excel or CSV file.
Excellent question.
As Michael mentioned, you may use SQL Server Reporting Services (SSRS) to create a report that automatically sends the Excel file to your chosen subscribers.
This might be an ideal solution if:
Your business unit would like the report to have specific fonts, color schemes, and column formatting, as this is a user-friendly way to format the report and test it as needed before adding the email subscriptions. Of course, this depends on your audience and the way the Excel file might be used.
You have support analysts or specialists on your team who have been granted access to SSRS, but not necessarily SQL Server Management Studio. In my experience, granting access to one but not the other may lessen the risk of stored procedures or tables being overwritten, deleted, executed improperly, etc.
Your business unit frequently changes the subscription list, since you could hand responsibility for editing the list to designated user-support team members rather than bogging down your SQL developers.
However, if you'd prefer to create a stored procedure to send the emails or don’t have access to SSRS, then you should be able to use the Bulk Copy Program (BCP) command-line utility to generate a simple CSV file. Here are a couple of resources that provide further detail on this option:
https://www.red-gate.com/simple-talk/sql/database-administration/creating-csv-files-using-bcp-and-stored-procedures/
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/453c9593-a689-4f7e-8364-fa998e266363/how-to-export-sql-data-to-excel-spreadsheet-using-sql-query?forum=transactsql
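As a rough illustration of the BCP route, here is a Python sketch that just assembles a bcp "queryout" invocation. The server name, query, and output path are hypothetical; the flags are standard bcp options (-T trusted connection, -c character mode, -t, comma field terminator):

```python
import subprocess

def build_bcp_command(query: str, out_file: str, server: str) -> list:
    """Assemble a bcp invocation that writes a query's result set to a
    comma-separated file using a trusted (Windows) connection."""
    return ["bcp", query, "queryout", out_file,
            "-S", server, "-T",   # -T = trusted connection
            "-c", "-t,"]          # -c = character mode, -t, = comma delimiter

cmd = build_bcp_command(
    "SELECT * FROM SalesDb.dbo.DailyTotals",   # hypothetical query
    r"C:\exports\daily_totals.csv", "MYSERVER")
# subprocess.run(cmd, check=True)  # uncomment on a machine where bcp is installed
```

From there the CSV can be attached to an email by sp_send_dbmail or any scheduler of your choice, as the linked articles describe in more detail.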
If you have any further questions, please don’t hesitate to reach out! I’m always happy to help whenever I’m able.
The overall goal is to have data from an automated daily Cognos report stored in a database so that I am able to report not only on that day but also historical data if I so choose. My general thought is that if I can find a way to automatically add the new daily data to an existing Excel file, I can then use that as my data source and create a dashboard in Tableau. However, I don't have any programming experience, so I'm floundering here.
I'm committed to using Tableau, but I chose Excel only because I'm more familiar with that program than others, along with the fact that an Excel output file is an option in Cognos. If you have better ideas, please don't hesitate to suggest them along with why you believe it's a better idea.
Update: I'm still jumping through hoops to try to get read-only access to the backend database to make this process a lot more efficient, but in the meantime I've moved forward with the long method utilizing Cognos.
I was able to leverage a coworker to create a system file folder to automatically save the Cognos reports to, and then I scheduled a job to run the reports I need. Each of those now saves into a folder in a shared network drive (so my entire team has access to the files), and I wrote a series of macros to append the data each day from those feeder files in the shared drive to a Master File. Now all that's left is to create a Tableau dashboard using the Master File as the data source and I'll have what I need.
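The append-macros step could equally be done with a short script instead of VBA. A sketch in Python, assuming the feeder files are CSVs in one folder and share the master file's header (folder and file names are illustrative):

```python
import csv
from pathlib import Path

def append_feeder_files(feeder_dir: str, master_file: str) -> int:
    """Append every data row from the day's feeder CSVs onto the master file,
    writing the header only if the master file does not exist yet."""
    master = Path(master_file)
    write_header = not master.exists()
    rows_added = 0
    with master.open("a", newline="") as out:
        writer = None
        for feeder in sorted(Path(feeder_dir).glob("*.csv")):
            with feeder.open(newline="") as f:
                reader = csv.reader(f)
                header = next(reader)          # skip each feeder's header row
                if writer is None:
                    writer = csv.writer(out)
                    if write_header:
                        writer.writerow(header)
                for row in reader:
                    writer.writerow(row)
                    rows_added += 1
    return rows_added
```

Scheduled daily (Task Scheduler, cron, or an Agent job), this keeps the Master File growing with each day's Cognos output.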
Thanks for all your help!
I'm posting this as an answer because it's just too much to leave as a comment.
What you need are three things.

1. Figure out how to have Cognos run your report and download your Excel file.

2. Use Visual Studio with BIDS (Business Intelligence Development Studio, the suite of SQL Server analysis, reporting, and integration services) to automate all the steps needed to append your Excel files, etc. Then you can use the same tools to import that data into your SQL Server. In fact, if all you're doing is trying to get this data into SQL Server, you can skip the Excel append step and append the data directly to your SQL table. Once your package is built, you can save it as an automated job on your SQL Server to run whenever you wish.

3. Tableau can use your SQL Server as a data source. Once you have that updated, you can run your reports.
I'm looking for the best approach (or a couple of good ones to choose from) for extracting from a Progress database (v10.2b). The eventual target will be SQL Server (v2008). I say "eventual target", because I don't necessarily have to connect directly to Progress from within SQL Server, i.e. I'm not averse to extracting from Progress to a text file, and then importing that into SQL Server.
My research turned up scenarios that don't match mine:
Migrating an entire Progress DB to SQL Server
Exporting entire tables from Progress to SQL Server
Using Progress-specific tools, something to which I do not have access
I am able to connect to Progress using ODBC, and have written some queries from within Visual Studio (v2010). I've also done a bit of custom programming against the Progress database, building a simple web interface to prove out a few things.
So, my requirement is to use ODBC and build a routine that runs a specific query on a daily basis. The results of this query will then be imported into a SQL Server database. Thanks in advance for your help.
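For what it's worth, a daily routine like that can be sketched in Python over ODBC. The DSN names, query, and staging table below are illustrative, and the actual transfer is left commented out since it needs both connections; only the batching helper, which keeps each INSERT round trip small, is concrete:

```python
def batches(rows, size=500):
    """Yield rows in fixed-size chunks so each INSERT round trip stays small."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# Sketch of the daily transfer (requires pyodbc and both DSNs to exist):
# import pyodbc
# src = pyodbc.connect("DSN=ProgressDSN")       # Progress ODBC DSN (assumed name)
# dst = pyodbc.connect("DSN=SqlServerDSN")
# rows = src.cursor().execute("SELECT ... FROM pub.orders").fetchall()
# cur = dst.cursor()
# for chunk in batches(rows):
#     cur.executemany("INSERT INTO dbo.OrdersStaging VALUES (?, ?, ?)", chunk)
# dst.commit()
```

A script like this can then be scheduled with Task Scheduler or a SQL Server Agent job for the daily run.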
Update
After some additional research, I found that a linked server is what I'm looking for. Some notes for others working with SQL Server Express:
If it's SQL Server Express that you are working with, you may not see a program on your desktop or in the Start Menu for DTS. I found DTSWizard.exe nested in my SQL Server Program Files (for me, C:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn), and was able to simply create a shortcut.
Also, because I'm using SQL Server Express, I wasn't able to save the package I'd created. So, after creating the package and running it once, I simply re-ran it and saved off my SQL for use in the future.
Bit of a late answer, but in case anyone else was looking to do this...
You can use a linked server, but you will find that the performance won't be as good as connecting directly via the ODBC drivers; the translation of data types may also mean that you cannot access some tables. The linked server can still be handy for exploring the data, though.
If you use SSIS with the ODBC drivers (you will have to use ADO.NET data sources), this will perform most efficiently, and you should also get more accurate data types (remember that the data types within Progress can change dynamically).
If you have to extract a lot of tables, I would look at BIML to help you achieve this. BIML (Business Intelligence Markup Language) can help you generate many SSIS packages dynamically, which can then be called from a master package. This master package can be scheduled or run ad hoc, and so can any of the child packages as needed.
Can you connect to the Progress DB using OLE? If so, you could use SQL Server Linked Server to bypass the need for extracting to a file which would then be loaded into SQL Server. Alternately, you could extract to Excel and then import from Excel to SQL Server.
I'm used to working with SQL Server and when I want to copy a DB there, I just need a handful of clicks in the wizard and voila...a complete copy of the DB, without taking the source DB offline.
We now also have an Oracle 11g because some machines require it, and I want to make a copy of the database. Just a copy on the same server, to use as a test DB for my software development.
All the instructions I find are pages full of steps, with RMAN or without; you have to write scripts and use command-line tools... I'm amazed at how inefficient such a common task is in Oracle.
Aren't there any easy ways of copying a DB? Maybe just exporting everything to a SQL file, then editing it to use another DB name, and then executing it again?
I see that in SQL Developer you can choose 'Database Copy...' from the Tools menu, but it asks a destination connection. How can I select a destination when creating the destination DB is the whole point of running the wizard? Or is a connection not the same as a DB?
Thanks for helping me out here!
You're generally going to need a new database to copy the data to, and the data could be copied with Data Pump export/import. There aren't many ways of getting around that, I'm afraid, but one option you might consider is making more use of VMs such as Oracle's own VirtualBox, as they can be cloned very easily with absolute certainty of byte-by-byte fidelity.
Incidentally, one problem in making logical copies (via export/import) of a database is that it's easy to end up with a different physical pattern to the table and indexes, which can lead to unexpected differences in query optimisation.
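For completeness, here is a sketch of what the Data Pump pair might look like, built as command strings in Python. The schema names, dump file, and the DATA_PUMP_DIR directory object are illustrative, and credentials/connect strings are omitted; REMAP_SCHEMA is the standard impdp parameter for importing one schema's objects under another name:

```python
def datapump_commands(schema: str, test_schema: str, dumpfile: str) -> tuple:
    """Build the expdp/impdp command lines for cloning one schema into a
    test schema on the same database (connect strings omitted)."""
    export_cmd = (f"expdp system schemas={schema} "
                  f"directory=DATA_PUMP_DIR dumpfile={dumpfile}")
    import_cmd = (f"impdp system directory=DATA_PUMP_DIR dumpfile={dumpfile} "
                  f"remap_schema={schema}:{test_schema}")
    return export_cmd, import_cmd

exp, imp = datapump_commands("HR", "HR_TEST", "hr_clone.dmp")
```

Two commands rather than a wizard, but once saved in a script it is a repeatable one-liner for refreshing the test copy.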
We have several SQL Server databases containing measurements from generators that we build. However, this useful data is only accessible to a few engineers, since most are unfamiliar with SQL (including me). Are there any tools that would allow an engineer to extract chosen subsets of the data in order to analyze it in Excel or another environment? The ideal tool would
protect the database from any accidental changes,
require no SQL knowledge to extract data,
be very easy to use, for example with a GUI to select fields and the chosen time range,
allow export of the data values into a file that could be read by Excel,
require no participation/input from the database manager for the extraction task to run, and
be easy for a newbie database manager to set up.
Thanks for any recommendations or suggestions.
First off, I would never let users run their own queries on a production machine. They could run table scans or some other performance killer all day.
We have a similar situation, and we generally create custom stored procedures for the users to "call", and only allow access to a backup server running "almost live" data.
Our users are familiar with Excel, so I create a stored procedure with ample parameters for filtering/customization, and they can easily call it using something like:
EXEC YourProcedureName '01/01/2010','12/31/2010','Y',null,1234
I document exactly what the parameters do, and they generally are good to go from there.
To set up an Excel query you'll need to set up the data source on the user's PC (Control Panel - Data Sources - ODBC), which will vary slightly depending on your version of Windows.
From within Excel, you need to set up the "query", which is just the EXEC command from above. Depending on the version of Excel, it should be something like: menu - Data - Import External Data - New Database Query. Then choose the data source, connect, skip the table diagram maker, and enter the above SQL. Also, don't try to make one procedure do everything; make different ones based on what they do.
Once the data is on the Excel sheet, our users pull it into other sheets and manipulate it at will.
Some users are a little advanced and "try" to write their own SQL, but that is a pain: I end up debugging and fixing their incorrect queries, and once I do correct a query, they always tinker with it and break it again. Using a stored procedure means that they can't change it, and I can keep it with our other procedures in the source code repository.
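One way to keep users away from raw SQL entirely is a thin wrapper around the procedure call, so the only editable surface is a few parameters. A Python sketch using an assumed ODBC-style cursor; the procedure name and parameter list simply mirror the EXEC example above and are illustrative:

```python
from datetime import date

def run_report(cursor, start: date, end: date, include_closed: str = "N",
               region=None, account_id=None):
    """Call the reporting procedure with named, validated parameters so
    users never edit the SQL itself.  Procedure name and parameters are
    hypothetical, mirroring the EXEC example above."""
    if end < start:
        raise ValueError("end date must not precede start date")
    sql = "EXEC YourProcedureName ?, ?, ?, ?, ?"
    return cursor.execute(sql, start, end, include_closed, region, account_id)
```

The same documented parameters then map one-to-one onto the Excel query's prompts, and the validation catches the most common user mistakes before anything hits the server.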
I would recommend you build your own in Excel. Excel can query your SQL Server database through an ODBC connection. If you do it right, the end user has to do little more than click a "get data" button. Then they have access to all the GUI power of Excel to view the data.
Excel allows you to load the output of stored procedures directly into a tab. That, IMO, is the best way: users need no knowledge of SQL, they just invoke a procedure, and there are no extra moving parts besides Excel and your database.
Depending on your version of SQL Server, I would be looking at some of the excellent self-service BI tools that come with the later editions, such as Report Builder. This is like a stripped-down version of Visual Studio with all the complex bits taken out and just the simple reporting bits left in.
If you set up a shared data source that logs into the server with quite low access rights, then the users can build reports but not edit anything.
I would echo KM's comments that letting the great unwashed run queries on a production system can lead to some interesting results, with either the wrong query being used, massive table scans, Cartesian joins, etc.