Can I automatically export data from a Cognos report into a database?

The overall goal is to have data from an automated daily Cognos report stored in a database so that I am able to report not only on that day but also historical data if I so choose. My general thought is that if I can find a way to automatically add the new daily data to an existing Excel file, I can then use that as my data source and create a dashboard in Tableau. However, I don't have any programming experience, so I'm floundering here.
I'm committed to using Tableau, but I chose Excel only because I'm more familiar with that program than others, along with the fact that an Excel output file is an option in Cognos. If you have better ideas, please don't hesitate to suggest them along with why you believe it's a better idea.

Update: I'm still jumping through hoops to try to get read-only access to the backend database to make this process a lot more efficient, but in the meantime I've moved forward with the long method utilizing Cognos.
I was able to get a coworker to create a file folder that the Cognos reports save to automatically, and then I scheduled a job to run the reports I need. Each report now saves into a folder on a shared network drive (so my entire team has access to the files), and I wrote a series of macros that append each day's data from those feeder files on the shared drive to a Master File. Now all that's left is to create a Tableau dashboard using the Master File as the data source, and I'll have what I need.
Thanks for all your help!

I'm posting this as an answer because it's just too much to leave as a comment.
You need three things:
1. Figure out how to have Cognos run your report and download your Excel file.
2. Use Visual Studio with BIDS (the suite of SQL Server analysis, reporting, and integration services) to automate appending your Excel files, and then use the same tools to import that data into your SQL Server. In fact, if all you're doing is getting this data into SQL, you can skip the Excel-append step and append the data directly to your SQL table (see the sketch after this list). Once your package is built, you can save it as an automated job on your SQL Server to run whenever you wish.
3. Tableau can use your SQL Server as a data source. Once that data is updated, you can run your reports.
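If you go the direct-to-SQL route, here's a minimal T-SQL sketch of the append step, assuming ad hoc distributed queries are enabled, the ACE OLE DB provider is installed on the server, and the daily file lands at a fixed path (the table, column, and file names are hypothetical):

-- Append today's Cognos output straight into a history table.
INSERT INTO dbo.DailyReportHistory (ReportDate, Region, Amount)
SELECT ReportDate, Region, Amount
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=\\SharedDrive\CognosReports\daily.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');

An SSIS data flow does the same job with better error handling; this is just the quickest way to prove the pipeline end to end.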

Related

Output Export to Excel

We have SQL Server Management Studio, and we have written several stored procedures in it. Currently we take the output as HTML and mail it to the desired email IDs. Our requirement now is to take the output in Excel instead of HTML and mail it to the desired IDs.
I would use SQL Server Reporting Services, and add subscriptions that send the created result by email as an Excel or CSV file.
Excellent question.
As Michael mentioned, you may use SQL Server Reporting Services (SSRS) to create a report that automatically sends the Excel file to your chosen subscribers.
This might be an ideal solution if:
Your business unit would like the report to have specific fonts, color schemes, and column formatting, as this is a user-friendly way to format the report and test as needed prior to adding the email subscriptions. Of course, this depends on your audience and the way the Excel file might be used.
You have support analysts or specialists on your team who have been granted access to SSRS, but not necessarily SQL Server Management Studio. In my experience, granting access to one but not the other lessens the risk of stored procedures or tables being overwritten, deleted, executed improperly, etc.
Your business unit frequently has changes to the subscription list, as you would be able to hand the responsibility of editing the list over to designated user-support team members rather than bogging down your SQL Developers.
However, if you'd prefer to create a stored procedure to send the emails or don’t have access to SSRS, then you should be able to use the Bulk Copy Program (BCP) command-line utility to generate a simple CSV file. Here are a couple of resources that provide further detail on this option:
https://www.red-gate.com/simple-talk/sql/database-administration/creating-csv-files-using-bcp-and-stored-procedures/
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/453c9593-a689-4f7e-8364-fa998e266363/how-to-export-sql-data-to-excel-spreadsheet-using-sql-query?forum=transactsql
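For the BCP route, here is a minimal sketch, assuming xp_cmdshell is enabled and the procedure returns a single result set (the server, database, procedure, and path names are placeholders):

-- -c = character data, -t, = comma field terminator, -T = trusted connection
EXEC master..xp_cmdshell
    'bcp "EXEC YourDatabase.dbo.usp_DailyReport" queryout "C:\exports\daily_report.csv" -c -t, -T -S YOURSERVER';

The resulting CSV can then be attached to the email your sending procedure builds.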
If you have any further questions, please don’t hesitate to reach out! I’m always happy to help whenever I’m able.

Versioning SQL Server Database with data using SSDT

I have a SQL Server database that already contains data. I want to start versioning it. I know I can use a Database project in Visual Studio, and by importing the database I can generate SQL scripts.
But what about data in the database? I tried to make some Data-Tier Application Files, but when I try to import it in my DB project in Visual Studio I am getting this error:
Import Data-Tier Application File - This operation is not supported for packages containing data
So how do I import the data? There has to be some way, because when I extract a DAC file there is an option to Extract Schema and Data, so there must be a way to use this data afterwards.
Or maybe post-deployment scripts are the only option?
Greetings
Your only option for this at this time is to use post-deploy scripts to populate those tables, taking into account the fact that the scripts need to be able to run multiple times without re-inserting data. A temp table/table variable and a MERGE statement are probably your best bets if you might have changes to the reference data, otherwise a left join might suffice.
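For example, a minimal idempotent post-deploy sketch for a hypothetical dbo.Status lookup table might look like this:

MERGE dbo.Status AS target
USING (VALUES
    (1, N'Open'),
    (2, N'Closed')
) AS source (StatusId, StatusName)
ON target.StatusId = source.StatusId
WHEN MATCHED AND target.StatusName <> source.StatusName THEN
    UPDATE SET StatusName = source.StatusName
WHEN NOT MATCHED BY TARGET THEN
    INSERT (StatusId, StatusName) VALUES (source.StatusId, source.StatusName);

Because MERGE only inserts the missing rows and updates the changed ones, the script can safely run on every deploy.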
Others have tried to include reference data, but it's a pretty hard problem to solve in a manner that works well for everyone. I know others like Ed Elliott have written some stuff that can turn those on/off as needed so you're not always including all reference data every time. You could also look into a post-post-deploy scenario where after your publish and post-deploy, you run a separate script that updates the data from static files. They'd still be in source control, but not necessarily part of your SSDT project. You'd have to remember to run that script in your builds, though.
I know for a while we had a database that solely had the lookup tables populated so we could reference that and do data compares if needed, but that still requires someone to maintain those values in an ongoing manner.

Does anyone know a good database setup tool for SQL Server?

I have a master database where we define all the information for our software.
It contains
tables
queries
triggers
stored procedures
stored functions
metadata in the tables (content)
At the moment, with every change I manually (with some support from SQL Server Management Studio) edit files containing all the CREATE, UPDATE, and INSERT statements for the items mentioned above. When I have to create a new database I fire up all the xyz.sql files, which contain my SQL statements.
I know there is a database creation script wizard in Management Studio, but it doesn't include the content data, for example. I also need to make sure everything is executed in the right order (e.g. queries, functions, etc. last, so that the underlying tables already exist).
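For illustration, such a run can be driven by a single master script executed in SQLCMD mode, where :r pulls in each file, so the order below controls the creation order (the file names are just examples):

-- master.sql, run with: sqlcmd -S YOURSERVER -d YourDatabase -i master.sql
:r .\01_tables.sql
:r .\02_functions.sql
:r .\03_views_and_queries.sql
:r .\04_procedures.sql
:r .\05_content_data.sql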
At the moment I am thinking about a .NET project where I read all the schema tables and then create the files automatically. In Ruby on Rails the system creates a schema.rb and, for the data, YAML files. I tried working with this, but since many tables are not created by ActiveRecord (old C++ stuff is also running), this won't work for me.
So does anyone have a hint on how best to do this, or a tool that fits my needs?
You can do this very easily in .NET using the SMO framework.
There are integrated tools for scripting out in dependency order, and you can script out data as well if you desire.
See my answer here for some info and links.
SQL Compare Pro should be able to load up your DDL creation scripts and deploy them to a target in the correct order. In the Edit Project dialog make sure you load your scripts as a Scripts Folder. For the data you'll need to use SQL Data Compare Pro. If you have any trouble or have questions, let me know as I work for Red Gate so will be able to help you with these tools.
I'm a little confused about why you've got UPDATEs given that these scripts create a database from scratch. Shouldn't they all be INSERTs?
SSMS does have the ability to create data scripts as well. You need SSMS 2008 and you need to go to Tasks/Generate Scripts and in the Choose Script Options pane you have to make sure Script Data is set to True.
If you're looking to maintain these scripts as a sensible way to source control your SQL Server objects, you might want to consider SQL Source Control. This will maintain your schema objects AND static data tables as individual .sql files.
"I know there is a database creation script wizard in management studio, but this for example doesn't include the content data."
You have to look carefully! Of course this built-in script engine can include the content data. You just have to click the button labeled "Properties" (or something like that), and there you can change all the SMO script options, including a full data dump.
This results in a script with many INSERT INTO ... statements.
In-depth description
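The data portion of the generated script ends up looking roughly like this (the table and values are hypothetical):

SET IDENTITY_INSERT [dbo].[Status] ON
INSERT [dbo].[Status] ([StatusId], [StatusName]) VALUES (1, N'Open')
INSERT [dbo].[Status] ([StatusId], [StatusName]) VALUES (2, N'Closed')
SET IDENTITY_INSERT [dbo].[Status] OFF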
Try DbSourceTools.
It is a SQL management tool designed specifically to script SQL databases to disk (including data), and then re-create them using "Deployment Targets".
We are using it for database source control in an agile project.

User-friendly tools for retrieving data from a SQL Server database

We have several SQL Server databases containing measurements from generators that we build. However, this useful data is only accessible to a few engineers, since most are unfamiliar with SQL (including me). Are there any tools that would allow an engineer to extract chosen subsets of the data in order to analyze it in Excel or another environment? The ideal tool would
protect the database from any accidental changes,
require no SQL knowledge to extract data,
be very easy to use, for example with a GUI to select fields and the chosen time range,
allow export of the data values into a file that could be read by Excel,
require no participation/input from the database manager for the extraction task to run, and
be easy for a newbie database manager to set up.
Thanks for any recommendations or suggestions.
First off, I would never let users run their own queries on a production machine. They could run table scans or some other performance killer all day.
We have a similar situation, and we generally create custom stored procedures for the users to "call", and only allow access to a backup server running "almost live" data.
Our users are familiar with excel, so I create a stored procedure with ample parameters for filtering/customizations and they can easily call it by using something like:
EXEC YourProcedureName '01/01/2010','12/31/2010','Y',null,1234
I document exactly what the parameters do, and they generally are good to go from there.
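To give an idea of the shape of such a procedure, here is a hypothetical sketch (all table, column, and parameter names are invented; NULL parameters mean "don't filter on this"):

CREATE PROCEDURE dbo.YourProcedureName
    @StartDate   date,
    @EndDate     date,
    @ActiveOnly  char(1)     = 'N',
    @Region      varchar(50) = NULL,
    @GeneratorId int         = NULL
AS
BEGIN
    SET NOCOUNT ON;
    SELECT m.GeneratorId, m.MeasuredAt, m.Value
    FROM dbo.Measurements AS m
    WHERE m.MeasuredAt >= @StartDate
      AND m.MeasuredAt <  DATEADD(DAY, 1, @EndDate)        -- make the end date inclusive
      AND (@ActiveOnly = 'N' OR m.IsActive = 1)
      AND (@Region      IS NULL OR m.Region      = @Region)
      AND (@GeneratorId IS NULL OR m.GeneratorId = @GeneratorId);
END;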
To set up an Excel query you'll need to set up the data source on the user's PC (Control Panel - Data Sources - ODBC), which will vary slightly depending on your version of Windows.
From within Excel, you need to set up the "query", which is just the EXEC command from above. Depending on the version of Excel, it should be something like: menu - Data - Import External Data - New Database Query. Then choose the data source, connect, skip the table diagram maker, and enter the above SQL. Also, don't try to make one procedure do everything; make different ones based on what they do.
Once the data is on the Excel sheet, our users pull it into other sheets and manipulate it at will.
Some users are a little advanced and "try" to write their own SQL, but that is a pain. I end up debugging and fixing their incorrect queries. Also, once you do correct the query, they always tinker with it and break it again. Using a stored procedure means that they can't change it, and I can keep it with our other procedures in the source code repository.
I would recommend you build your own in Excel. Excel can make queries to your SQL Server database through an ODBC connection. If you do it right, the end user has to do little more than click a "get data" button. Then they have access to all the GUI power of Excel to view the data.
Excel allows you to load the output of stored procedures directly into a tab. That, IMO, is the best way: users need no knowledge of SQL, they just invoke a procedure, and there are no extra moving parts besides Excel and your database.
Depending on your version of SQL Server, I would look at some of the excellent self-service BI tools available with the later editions, such as Report Builder. This is like a stripped-down version of Visual Studio with all the complex bits taken out and just the simple reporting bits left in.
If you set up a shared data source that logs into the server with quite low access rights, then the users can build reports but not edit anything.
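Setting up that low-rights account takes only a few lines of T-SQL (the login and procedure names here are placeholders):

-- Read-only account for the shared report data source.
CREATE LOGIN ReportReader WITH PASSWORD = 'ChangeMe!2345';
CREATE USER ReportReader FOR LOGIN ReportReader;   -- run inside the reporting database
ALTER ROLE db_datareader ADD MEMBER ReportReader;
-- Or, tighter still, grant EXECUTE on specific report procedures only:
-- GRANT EXECUTE ON dbo.usp_DailyReport TO ReportReader;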
I would echo the comments by KM that letting the great unwashed run queries on a production system can lead to some interesting results, with either the wrong query being used, or massive table scans, or Cartesian joins, etc.

Transfer data from SQL Server table using query to Excel and vice-versa

I want to transfer data from a SQL Server table to an Excel sheet using a 3-tier architecture in ASP.NET 3.5. After the user has made the required changes in the Excel sheet, I want the Excel sheet to be uploaded and the data in the table updated, with validation for proper data.
You could set up an SSIS package to import the data from Excel.
You could either look at libraries (tons of them out there) to programmatically read and write Excel sheets, and handle it all manually.
OR: check out the SQL Server Integration Services (SSIS) - they offer neat ways to export SQL Server data into a multitude of formats (including Excel), and they also offer the route back. You can easily control and execute SSIS packages from a .NET application, too.
I can think of two methods. The first is to make an ADO connection to your SQL Server from Excel VBA (which can be password protected). When you press a button in Excel, it either reads your spreadsheet data one record at a time and applies the validation logic, OR it simply uploads it with an insert query to a holding (staging) table, and then a trigger sees the data and processes it. That way you don't have to upload any files.
I have used ADO commands in VBA against SQL Server and they are amazingly easy, reliable, and exceptionally fast. There are plenty of examples out there via a Google search. It's great because you can use all kinds of Excel-specific VBA commands to build a record, then update or insert it or whatever you need to do.
For security, you can limit the user (the connection string and password are hidden in the VBA) to just inserting data into a certain database, so even if the password is somehow hacked from the VBA it won't do anyone any good, as they can only insert.
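A sketch of the staging-table half of that first method (all names hypothetical): the upload account gets INSERT only, and a trigger validates and moves the rows.

-- Staging table the spreadsheet inserts into.
CREATE TABLE dbo.UploadStaging (
    RowId      int IDENTITY(1,1) PRIMARY KEY,
    CustomerId int,
    Amount     decimal(18,2)
);
GRANT INSERT ON dbo.UploadStaging TO ExcelUploadUser;

-- Trigger validates incoming rows and moves the good ones to the real table.
CREATE TRIGGER trg_UploadStaging_Process
ON dbo.UploadStaging
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.Orders (CustomerId, Amount)
    SELECT i.CustomerId, i.Amount
    FROM inserted AS i
    WHERE i.CustomerId IS NOT NULL
      AND i.Amount >= 0;   -- whatever validation logic applies
END;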
The second method is to create an ordinary ASP upload control that accepts your Excel file when it is done. There is an event in ASP.NET where you can run code when the file upload is complete. That would see the uploaded Excel file, read it through ordinary .NET commands for reading Excel files (Excel automation), process it, then I guess either refresh it or discard it. I do not know if there are complications running Excel automation on a server -- probably, because in essence it is running a version of Excel.exe on your web server (not really a good idea).
I believe you can make an ADO connection from ASP to an Excel file and run SQL queries on it. I have done that successfully, but unfortunately it decides the type of each field based on the first few records, and this can sometimes cause a lot of problems when reading an Excel file as a database. I suppose you could write some quick VBA to output the Excel data to CSV and upload that CSV file instead, so that nothing on the web server has to try to read an Excel file. In VBA, you can even automate the upload through SendKeys and Internet Explorer automation. I've done that and it works amazingly well. SendKeys is the only way to populate the file upload text box, for security reasons.
As you can see, the first method is the better one. That is how I would do it, because that way you can also refresh your spreadsheet with new data.
I actually think you posted a very interesting question here. It's a lot easier to edit data in an Excel spreadsheet and send it back up. I have replicated a lot of that functionality using the Excel-style grid control from EssentialObjects -- great software, but emulating a spreadsheet takes a lot of coding, and still it's just an Excel-like form, not a full spreadsheet.
If you are willing to put MS-Access in the middle, that can get you around a lot of these complications, but is itself an extra layer.
