Substituting Column Values From Other Columns On The Fly - sql-server

I want the output of a query to land in an Excel sheet; the data flow is already set up in SSIS.
The query needs to produce one column whose values are taken from other columns in the same table.
Example:
CASE WHEN col_date1 > '1900' THEN col_date1
     WHEN col_date1 = '' AND col_date2 LIKE '18%' THEN col_date3
     ELSE col_date4
END
Usually the values in a CASE expression like this are hard-coded strings, but here I want them to come from the other columns, applied dynamically as the query runs and stores its output in Excel.
I don't think an UPDATE statement works here, because that would permanently change the contents of the table, which I'm trying to avoid. The changes should take effect on the fly and land in Excel while leaving the SQL Server table unchanged.
Perhaps a LEFT JOIN could do it somehow, but I'm not sure that works either.
Any ideas?
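For what it's worth, no UPDATE or join is needed: a CASE expression inside a SELECT is evaluated per row at query time, so using that SELECT as the source query of the SSIS data flow sends the substituted values to Excel while leaving the table untouched. A minimal sketch, where dbo.MyDates is a placeholder table name (not from the original post):

SELECT
    CASE
        WHEN col_date1 > '1900'                      THEN col_date1
        WHEN col_date1 = '' AND col_date2 LIKE '18%' THEN col_date3
        ELSE col_date4
    END AS effective_date,  -- derived on the fly; the table itself is never modified
    col_date1, col_date2, col_date3, col_date4
FROM dbo.MyDates;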

Related

SSRS - Multi Value Parameter Report Problem

I have a quite strange/unusual request for an SSRS report. The client wants to be able to paste large lists of ID numbers from an Excel sheet column (normally < 100 values, but potentially 20,000+ values) as a search parameter in the report.
Now, I know how to pass multi value parameters from SSRS to a stored proc etc, that's not the issue. The issue is with this requirement to literally paste in a list of IDs into the multi value parameter input box and then limit the dataset based on that list (rather than pre-populate the Multi Value parameter with a list of values based on a query / SP as you normally would)
My question is what would be the best method/approach to this problem, as I have never had a similar ask in many years of SSIS development? I need to make the solution as "self-service" as possible too, ideally as easy as running an SSRS report from Report Manager. I know I could just import the Excel data into a table in the database and join to that, but ideally I'd like something the user can run without any tech input to import data or run SQL through SSMS to get the dataset.
When you copy/paste from Excel, it's just a tab-delimited string. You can configure a string parameter to allow multiple values, and then in the expression editor split the string by tab characters or line breaks.
Here's how:
1. In the parameter properties, check the 'Allow multiple values' checkbox on the General tab.
2. In the dataset properties, click the 'fx' (expression) button next to the parameter on the Parameters tab.
3. In the expression editor that appears, type the following:
=Split(Replace(Parameters!MyParameter.Value, vbTab, ""), vbCrLf)
This strips out the tab characters and splits the string on line breaks. After that you can treat it like any multi-valued parameter.
If you have a typical multi-value parameter setup, you can just copy/paste the column from Excel directly in; each cell copied from Excel is automatically placed on a new line.
So if your query looked something like
SELECT * FROM myTable WHERE EmployeeName IN (@empName)
and empName was your report parameter: if no available values are configured, you'll get an empty list when you click in the parameter field; just copy/paste directly from Excel and it will work.
I'm not sure if there would be any limits, or how good performance would be when copying thousands of values, but it will certainly work for a reasonably small number of items.
The only other way I can think of that means no real extra work for the user would be to have them drop the workbook into a specified folder (maybe with a subfolder based on their SSRS username), then use OPENROWSET to read the contents either directly into a dataset or, better still, into a permanent table with their username and the parameter value on each row.
The OPENROWSET statement could sit at the top of your main dataset query.
Then your query could do something simple like
DELETE FROM myParamValueTable WHERE UserName = @UserID

INSERT INTO myParamValueTable
SELECT * FROM OPENROWSET(.....) xl

SELECT * FROM myTable t
JOIN myParamValueTable p
    ON t.EmployeeName = p.EmployeeName
WHERE p.UserName = @UserID
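For reference, one plausible shape for the elided OPENROWSET call, assuming the ACE OLE DB provider is installed and 'Ad Hoc Distributed Queries' is enabled on the server; the file path and sheet name below are placeholders:

SELECT *
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',                            -- Excel OLE DB provider
    'Excel 12.0;Database=\\server\drop\jsmith\ids.xlsx;HDR=YES',
    'SELECT * FROM [Sheet1$]'                              -- the sheet, queried as a table
) AS xl;

The per-user subfolder in the path would be derived from the SSRS username, per the scheme described above.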

SSIS - Excel to SQL Server with changing column names

I have an Excel sheet that changes column names based on the year and current week of the year, so for example 201901 would be the first week of 2019.
The Excel sheet, which is sent to us daily, automatically adjusts the column names based on the current date (up to 6 months out), so currently (31/07/2019) the year-week columns run from 201931 to 202011:
So next week column N will be 201932 (the columns basically shift left).
I have tried aliasing the Excel source columns to just 1, 2, 3, 4, etc., hoping to get the data into SQL Server as-is and then rename the columns from a script or trigger in SQL Server, but that doesn't work because of the fixed column mapping SSIS requires.
It works fine until the columns change the next week.
A simple method would be to drop the table and dump the file into a new table with the same name, but I can't see how to set that up in SSIS, as you need to map the column names (which unfortunately change).
Here is how the data flow looks:
Ideally, for me, something like this would be perfect:
But I'm not sure how to achieve this outcome in SSIS?
I would suggest transforming the data. Currently, you have a crosstab ("cross-table") format.
How about putting the Excel data in the form (RAG_week; CalendarWeek; Value_of_CalendarWeek)? For this you can use an Excel macro that fills a new sheet in the Excel file, turning each cell into a row of its own. Next, you create a matching table on the SQL Server, as sketched below. Then you can create an SSIS package with a constant column assignment that simply appends the new data each week.
This impacts the further evaluation of your data, but it seems a far more stable approach.
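A sketch of what that long-format landing table might look like; all names and types below are assumptions, not from the original post:

CREATE TABLE dbo.WeeklyValues (
    RAG_week     varchar(6)    NOT NULL,  -- the week the row was loaded, e.g. '201931'
    CalendarWeek varchar(6)    NOT NULL,  -- the former Excel column header, e.g. '201932'
    WeekValue    decimal(18,4) NULL       -- NULL where the sheet cell was empty
);

Because these three destination columns never change, the SSIS column mapping stays constant no matter how the Excel headers shift each week.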

Iterative UPDATE loop in SQL Server

I would really like to find some kind of automation for this issue I am facing.
A client has had a database attached to their front-end site for a few years now and, until this date, has been inputting certain location information as a numeric code (i.e. county/state data).
They now would like to replace these values with their corresponding nvarchar values (e.g. instead of having '8' in their County column, they want it to read 'Clermont County', and so on, for upwards of 90 separate entries).
I have been provided with a 2-column Excel sheet: one column with the old numeric county code and one with the requested text equivalent. I have imported this into a temp table but cannot find a fast way of matching and updating these values.
I don't really want to write a 90-line CASE WHEN block and type out each county name manually; that opens the door to human error, etc.
Is there something much simpler I don't know about what I can do here?
I realize that it might be a bit late, but just in case someone else is searching and comes across this answer...
There are two ways to handle this: In Excel, or in SQL Server.
1. In Excel
Create a concatenated string in one of the available columns that meets your criteria, i.e.
=CONCATENATE("UPDATE some_table SET some_field = '",B2,"' WHERE some_field = ",A2)
You can then auto-fill this column all the way down the list, and thus get 90 different update statements which you can then copy and paste into a query window and run. Each one will say
UPDATE some_table SET some_field = 'MyCounty' WHERE some_field = X
Each one will be specific to a case; therefore, you can run them sequentially and get the desired result, or...
2. In SQL Server
If you can import the data to a table then all you need to do is write a simple query with a JOIN which handles the case, i.e.
UPDATE T1
SET T1.County_Name = T2.Name
FROM Some_Table T1 -- The original table to be updated
INNER JOIN List_Table T2 -- The imported table from an Excel spreadsheet
ON T1.CountyCode = T2.Code
;
In this case, each row of your original Some_Table is joined to the imported data on the county code, and its name field is updated with the name recorded for that code in the imported data, which gives you the same result as the Excel option, minus a bit of typing.
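One optional safety step before running the UPDATE: a quick LEFT JOIN (same assumed table names as above) surfaces any codes that have no match in the imported list, so nothing is silently left untouched:

SELECT DISTINCT T1.CountyCode        -- codes the UPDATE would not translate
FROM Some_Table T1
LEFT JOIN List_Table T2
    ON T1.CountyCode = T2.Code
WHERE T2.Code IS NULL;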

Making output more readable for others

I'm trying to make a SQL query output more readable for our staff. It is a kind of warehouse delivery note. It does not matter whether I use Word or Excel or something else.
So I made a SQL query (Microsoft SQL Server 2000), and it is working fine. The output, obviously, is a table. The first 7 columns contain data that needs to appear once on the printout as a heading; the next columns contain data that needs to appear as a list on the printout. I have already used PowerPivot to put it in Excel.
To make it clearer I made some pictures in Excel. The first column contains a "warehouse" number, and I need a separate printout per warehouse. As you can see, the first 7 columns contain data concerning the warehouse; the last 3 columns are the products.
I have to turn the first layout into the second, where Column 1 contains the same value throughout a page; the next value of Column 1 goes to the next page, and so on.
Return two datasets into your Excel workbook: one for the heading values and the other for the details. I'm assuming your first dataset returns only one row, which means you can reference its values in the data table directly without issue.
Your second dataset can then be inserted directly into the details section of your spreadsheet and formatted accordingly, as in the sketch below.
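A sketch of the pair of queries, using hypothetical table and column names (the real heading has 7 columns; the idea is the same):

-- Dataset 1: heading fields, one row for the selected warehouse
SELECT WarehouseNo, WarehouseName, Address, DeliveryDate
FROM dbo.DeliveryHead
WHERE WarehouseNo = @WarehouseNo;

-- Dataset 2: the product lines for that warehouse
SELECT ProductCode, ProductName, Quantity
FROM dbo.DeliveryLines
WHERE WarehouseNo = @WarehouseNo;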
Well, so far it is working now. I used the grouped serial-letter (mail merge) function in Word, fed from a single Excel table. I also tried the version using two Excel tables as the mail-merge database, but somehow it did not work; I don't know exactly what failed, but when I run the database version I only get gibberish output and errors.
So now I have it set up so that I run a SQL query that produces one Excel table; after that I run Word and print a mail-merge letter.

Pivot or Unpivot a dataset using Excel, SSIS or SQL Server

I have an Excel file with various market indexes, dates, and values. A single column on the left holds the market index names, followed by many date columns to the right. I can perform this pivot in Excel, SSIS, or SQL Server, and I have the most recent version of each program, but I don't want to simply copy and paste-special-transpose in Excel; I would like the solution to be as automated as possible. I suspect loading this data into SSMS and using SQL would be the easiest. I can change the format of the dates if needed.
I have used pivot in both SSIS and SSMS, but always with fewer columns than this task requires, and I am not sure how to approach it in a way that allows for the large number of columns and for the number of date columns to vary; perhaps this requires dynamic SQL. The dates, which make up the bulk of the columns, can number 100 or more. Note there may be null values where an index didn't have a value on a given day.
Input data
Here is the desired output format.
Here is the data loaded into SSMS 2012. The dates become the column headers. Same goal of transposing the dates and index names.
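For the SQL route the question leans toward, a static T-SQL UNPIVOT of a few date columns would look like the sketch below (table and column names are assumptions). Since the date columns vary, a production version would build the IN list with dynamic SQL from sys.columns; note also that UNPIVOT silently drops NULL cells, so a CROSS APPLY (VALUES ...) rewrite is the usual choice when the nulls must be kept:

-- Each date column becomes one (Name, TradeDate, IndexValue) row.
SELECT Name, TradeDate, IndexValue
FROM dbo.MarketIndexes
UNPIVOT (IndexValue FOR TradeDate IN ([20190701], [20190702], [20190703])) AS u;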
I would use the Excel Power Query Add-In for this. It has Pivot and Unpivot commands that you can use. The Power Query implementations of both commands are dynamic i.e. they adjust to variations in the input data.
For your scenario I would first select Name and Unpivot all other columns. That will transform each date column and value into a row. Then I would Pivot on the Name column, which will generate columns for each of your Name values.
To automate this process, you just need a few lines of VBA code or script code to open, refresh and save the Excel file (including Power Queries). There are lots of options for this, e.g.
http://southbaydba.com/2013/09/10/part-5-power-query-api-refreshing-data-indeed/
