Monthly report automated with VBA + SQL Server Stored Procs? - sql-server

I am trying to completely automate this process, and I'm wondering if it's viable or efficient to do in VBA.
The report process involves two files: one SQL file and one Excel file.
The SQL file has the algorithm, and the final step is a query whose result is then pasted into the Excel file.
The algorithm is simpler (than what the audience might be used to) but has two INTO commands and several UPDATE commands.
Of the two INTO commands, the first grabs a small portion (constrained to the first and last day of the previous month) of a 500m+ record table. The second joins the first table with an eligibility-type table.
After the second table is created, there is a series of UPDATE commands that change existing data in existing columns.
Then there is a series of ALTER and UPDATE commands that add new columns to the [second] table and update them with the desired data.
The final step is a query whose results are copy-pasted into Excel (as is, no formatting changes necessary).
I'm not too well-versed in VBA/VB.NET, nor in T-SQL stored procedures and dynamic SQL. If the SQL algorithm were a simple pull query with no table creation, I could build something to automate that. But the SQL has two table creations and about a dozen ALTER and UPDATE commands.
Am I stirring up the wrong nest? Should I run it manually as is?

You can definitely automate this. I created a report that ran two stored procedures and numerous queries with temp tables, including both UPDATE and ALTER commands, then used VBA to execute these and aggregate the data in the final summary sheet.
There is a ton of documentation out there. You can even pass your values to the stored procedure after the user inputs them.
I would add this as a comment but I do not have enough reputation to comment yet (need 50).
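One way to structure it, sketched below with made-up table and column names (dbo.BigSourceTable, dbo.EligibilityType and the rest are placeholders, not from the question; EOMONTH and FORMAT need SQL Server 2012 or later): wrap the whole SQL algorithm in a stored procedure so the two INTO steps, the UPDATEs and the ALTERs all run server-side, and the procedure's final SELECT is the only thing VBA has to touch.

    CREATE PROCEDURE dbo.usp_BuildMonthlyReport
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Boundaries of the previous month
        DECLARE @FirstDay date = DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()) - 1, 0);
        DECLARE @LastDay  date = EOMONTH(@FirstDay);

        -- First INTO: grab the small slice of the 500m+ row table
        SELECT s.MemberId, s.ClaimDate, s.Amount          -- placeholder columns
        INTO   #Slice
        FROM   dbo.BigSourceTable AS s                    -- placeholder table
        WHERE  s.ClaimDate BETWEEN @FirstDay AND @LastDay;

        -- Second INTO: join the slice to the eligibility-type table
        SELECT sl.*, e.EligibilityType
        INTO   #Report
        FROM   #Slice AS sl
        JOIN   dbo.EligibilityType AS e ON e.MemberId = sl.MemberId;

        -- Series of UPDATEs against existing columns
        UPDATE #Report SET EligibilityType = 'Unknown' WHERE EligibilityType IS NULL;

        -- ALTER + UPDATE pattern for the new columns
        ALTER TABLE #Report ADD ReportMonth char(7) NULL;
        UPDATE #Report SET ReportMonth = FORMAT(@FirstDay, 'yyyy-MM');

        -- Final query; this result set is what gets pasted into Excel
        SELECT * FROM #Report;
    END;

On the VBA side you would then only need an ADODB.Connection, a Recordset opened with EXEC dbo.usp_BuildMonthlyReport, and Range.CopyFromRecordset to drop the results onto the sheet; no dynamic SQL is required in the workbook itself.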

Related

MSSQL Executing table filled with instructions

I need to change the collation for quite a lot of tables, including their columns. I already wrote a SQL statement which generates these statements for me. I wanted to create a stored procedure to automate this process. I created a table to store these generated statements and want to execute them row by row. How can I do that?
Thanks in advance
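One rough sketch, assuming the generated statements live in a table called dbo.CollationStatements with Id and Stmt columns (both names invented here): a cursor that walks the table and hands each row to sp_executesql.

    DECLARE @Id int, @Stmt nvarchar(max);

    DECLARE stmt_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT Id, Stmt FROM dbo.CollationStatements ORDER BY Id;

    OPEN stmt_cursor;
    FETCH NEXT FROM stmt_cursor INTO @Id, @Stmt;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC sys.sp_executesql @Stmt;   -- run one generated statement
        FETCH NEXT FROM stmt_cursor INTO @Id, @Stmt;
    END

    CLOSE stmt_cursor;
    DEALLOCATE stmt_cursor;

Wrapping the loop in a stored procedure, and perhaps a TRY/CATCH that logs @Id when a statement fails, gives you the automated version you are after.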

Take data from different datasets and inserting them into a SQL table from SSRS report

I have an SSRS report with 20 different datasets, with some calculated columns in each.
I want to take a few fields from all datasets, including some calculated columns, and insert them into a SQL table.
I want to do this for each month so that I can see the trends over a period. Is there any way to do that without editing the datasets?
Can I reference the fields that I need by referring to Textbox4 and insert them into a SQL table? What is the easiest way to do this without touching the datasets?
There is most likely a much better solution than using SSRS to update a SQL database. I am not proposing this as the best solution, but rather as a way to achieve what was asked.
You could create a dataset that runs a stored procedure you can pass the data to as parameters. The sproc would do the insert into your chosen table, and you can pass in the parameters from your original dataset however you see fit. You could even set up a second report with the stored-procedure dataset that you can call on command by putting an action event on an item to call that report. (For instance, I had a subreport embedded in a column of a tablix, configured so it would only update with values from that row.)
To clarify:
Create a subreport that accepts the data you want to insert as a parameter for each column
Instead of adding a normal Dataset, have it call a stored procedure that inserts as you require
Add the subreport to your main report to be called once it's run, and configure the required parameters to be passed through.
There will be better, more efficient, cleaner ways to do this, but I found the above to work for my purposes since I was limited by time and resources. But I would still recommend you seek other solutions if possible.
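As a rough illustration of the stored-procedure dataset (table and parameter names here are invented, not taken from the report): the subreport's dataset is nothing more than a call to an insert procedure, with the report parameters mapped onto the procedure's parameters.

    CREATE PROCEDURE dbo.usp_InsertMonthlyTrend
        @ReportMonth date,
        @Region      nvarchar(50),
        @TotalSales  decimal(18, 2)
    AS
    BEGIN
        SET NOCOUNT ON;

        -- dbo.MonthlyTrend is a placeholder for whatever table holds the trend rows
        INSERT INTO dbo.MonthlyTrend (ReportMonth, Region, TotalSales)
        VALUES (@ReportMonth, @Region, @TotalSales);
    END;

The subreport then needs one report parameter per column and a dataset whose query is simply the EXEC of this procedure.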

SQL Server Create Dynamic Table with different table names based on a template or an existing table

My team is creating a high-volume data processing tool. The idea is to take a 30,000-line batch file, bulk load it into a table, and then process the records using parallel processing.
The part I'm stuck on is creating the dynamic tables. We want to create a new physical table, with a unique table name, for each batch file that we receive. The tables will be purged from our system by a separate process after they are completed.
I have the base structure for the table and I intend to create unique table names using a combination of date/time stamp and a guid (dashes converted to underscore characters).
I could do this easily enough in a stored procedure but I'm wondering if there is a better way.
Here is what I have considered...
Templates in SQL Server Management Studio. This is a GUI tool built into Management Studio (opened with Ctrl+Alt+T) that allows you to define different SQL objects, including tables, and to specify parameters. This seems like it would work; however, it appears to be a GUI-only tool and not something that I could call from a stored procedure.
Stored procedure. I could put everything into a stored procedure, build my table name and schema into an nvarchar(max) string, and use sp_executesql to create the table. This might be the way to accomplish my goal, but I wonder if there is a better way.
Stored procedure with an existing table as a template. I could define a base table and then query sys.columns and sys.types to create a string representing the new table. This would allow me to add columns to the base table without having to update my stored procedure. I'm not sure if this is a better approach.
I'm wondering if any Stack Overflow folks have solved a similar requirement. What are your recommendations?
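A sketch of the sp_executesql route, combining the name scheme from the question with an existing table used as the template (dbo.BatchTemplate is an invented name):

    -- Unique name: date/time stamp plus a GUID with dashes converted to underscores
    DECLARE @TableName sysname =
        N'Batch_' + FORMAT(SYSUTCDATETIME(), 'yyyyMMdd_HHmmss') + N'_' +
        REPLACE(CAST(NEWID() AS nvarchar(36)), N'-', N'_');

    -- Clone the template's column structure; WHERE 1 = 0 copies no rows
    DECLARE @Sql nvarchar(max) =
        N'SELECT * INTO dbo.' + QUOTENAME(@TableName) +
        N' FROM dbo.BatchTemplate WHERE 1 = 0;';

    EXEC sys.sp_executesql @Sql;

    SELECT @TableName AS CreatedTable;   -- hand the generated name back to the caller

SELECT ... INTO copies column names, types and nullability but not indexes, constraints or defaults, so anything the batch processing relies on beyond the columns would have to be added in the same procedure.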

How to run SSIS packages dynamically?

We have a large production MSSQL database (mdf approx. 400 GB) and I have a test database. All the tables, indexes, views, etc. are the same in both. I need to make sure that the data in the tables of these two databases stays consistent, so I need to insert all the new rows and update all the updated rows into the test DB from production every night.
I came up with the idea of using SSIS packages to make the data consistent by checking for updated rows and new rows in all the tables. My SSIS flow is as follows.
I have a separate package in SSIS for each table. In order:
1. I get the timestamp value in the table in order to pull only the last day's rows instead of the whole table.
2. I get those rows from the production table.
3. Then I use the Lookup tool to compare this data with the test database table data.
4. Then I use a Conditional Split to get a clue whether the data is new or updated.
5.1. If the data is new, I insert it into the destination.
5.2. If the data is updated, I update the data in the destination table.
The data flow is in the MTRule and STBranch packages in the picture.
The problem is, I'm recreating this single flow for each table, and I have more than 300 tables like this. It takes hours and hours :(
What I'm asking is: is there any way in SSIS to do this dynamically?
PS: Every single table has its own columns and PK values, but my data flow schema is always the same (below).
You can look into BimlScript, which lets you create packages dynamically based on metadata.
I believe the best way to achieve this is to use Expressions. They let you dynamically set the source and destination.
One possible solution might be as follows:
Create a table which stores all your table names and PK columns
Define a package which loops through this table and builds a SQL statement for each entry
Call your main package and pass the statement to it
Use the statement as the data source for your Data Flow
If applicable, pass the destination table as a parameter as well (another column in your config table)
This is how I processed several really huge tables: the data had to be fetched from 20 tables and moved to one single table.
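For illustration, the config table and the generated source statement could look something like this (the table names, the LastModified column and the one-day window are all assumptions, not from the question):

    CREATE TABLE dbo.SyncConfig
    (
        TableName sysname NOT NULL,   -- source table in production
        PKColumn  sysname NOT NULL,   -- its primary key column
        DestTable sysname NOT NULL    -- matching table in the test database
    );

    INSERT INTO dbo.SyncConfig (TableName, PKColumn, DestTable)
    VALUES (N'dbo.Orders',    N'OrderId',    N'dbo.Orders'),
           (N'dbo.Customers', N'CustomerId', N'dbo.Customers');

    -- One generated source statement per table (last day's rows only),
    -- which the looping package passes to the parameterised data flow
    SELECT TableName,
           DestTable,
           N'SELECT * FROM ' + TableName +
           N' WHERE LastModified >= DATEADD(DAY, -1, SYSDATETIME());' AS SourceStmt
    FROM dbo.SyncConfig;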
Why do you need to use SSIS?
You are better off writing a stored procedure that takes the table name as a parameter and does your CRUD there. Then call the stored procedure in a Foreach Loop container in SSIS.
In fact, you might be able to do everything using a stored procedure and scheduling it in a SQL Agent job.
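A very rough sketch of that per-table procedure (the database names, the LastModified column and the delete-then-reinsert approach are all assumptions; a dynamic MERGE would be the tidier but longer alternative):

    CREATE PROCEDURE dbo.usp_SyncTable
        @TableName sysname,   -- e.g. N'dbo.Orders', schema-qualified
        @KeyColumn sysname    -- e.g. N'OrderId'
    AS
    BEGIN
        SET NOCOUNT ON;

        DECLARE @Sql nvarchar(max) = N'
            -- remove the test copies of rows that changed in the last day
            DELETE t
            FROM   TestDb.' + @TableName + N' AS t
            JOIN   ProdDb.' + @TableName + N' AS p
                   ON p.' + QUOTENAME(@KeyColumn) + N' = t.' + QUOTENAME(@KeyColumn) + N'
            WHERE  p.LastModified >= DATEADD(DAY, -1, SYSDATETIME());

            -- re-insert the changed rows plus anything missing from test
            INSERT INTO TestDb.' + @TableName + N'
            SELECT p.*
            FROM   ProdDb.' + @TableName + N' AS p
            WHERE  p.LastModified >= DATEADD(DAY, -1, SYSDATETIME())
               OR  NOT EXISTS (SELECT 1 FROM TestDb.' + @TableName + N' AS t
                               WHERE  t.' + QUOTENAME(@KeyColumn) + N' = p.' + QUOTENAME(@KeyColumn) + N');';

        EXEC sys.sp_executesql @Sql;
    END;

A Foreach Loop (or a cursor in another procedure) over a list of table/key pairs then calls this once per table, and the whole thing can be scheduled as a SQL Agent job with no SSIS at all.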

SSIS, splitting a single row into multiple rows

My problem is as follows. I have a CSV file (~100k rows) containing history information with the column format of:
ID1,History1,ID2,History2...ID110,History110
Each row may have anywhere between 0 and 110 history entries. Each separate entry requires a stored procedure to be called.
If there were a small number of possible entries per row, I imagine the way to do this would be to transform the data using a script, and send it to a unique path. Creating 110 paths would probably work, but isn't very elegant (and quite time consuming).
What would the best way to approach this be?
Just load the data (raw CSV unchanged, one row per file line) into a staging table. Then call a stored procedure that uses a string splitter to break up and loop over the staging table rows, calling your other procedure for each history entry.
see: Arrays and Lists in SQL Server 2005 and Beyond
also see this previous answer: SQL comma delimted column => to rows then sum totals?
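A rough sketch of that approach, assuming a staging table dbo.HistoryStaging(RawLine nvarchar(max)), a hypothetical dbo.usp_ProcessHistory procedure, and an ordinal-aware splitter (STRING_SPLIT with its third argument needs SQL Server 2022; on older versions substitute a splitter from the article linked above):

    DECLARE @RawLine nvarchar(max), @Id nvarchar(100), @History nvarchar(4000);

    DECLARE line_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT RawLine FROM dbo.HistoryStaging;

    OPEN line_cursor;
    FETCH NEXT FROM line_cursor INTO @RawLine;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- pair each odd position (ID) with the even position that follows it (History)
        DECLARE pair_cursor CURSOR LOCAL FAST_FORWARD FOR
            SELECT i.value, h.value
            FROM   STRING_SPLIT(@RawLine, ',', 1) AS i
            JOIN   STRING_SPLIT(@RawLine, ',', 1) AS h
                   ON h.ordinal = i.ordinal + 1
            WHERE  i.ordinal % 2 = 1;

        OPEN pair_cursor;
        FETCH NEXT FROM pair_cursor INTO @Id, @History;
        WHILE @@FETCH_STATUS = 0
        BEGIN
            EXEC dbo.usp_ProcessHistory @Id = @Id, @History = @History;
            FETCH NEXT FROM pair_cursor INTO @Id, @History;
        END
        CLOSE pair_cursor;
        DEALLOCATE pair_cursor;

        FETCH NEXT FROM line_cursor INTO @RawLine;
    END

    CLOSE line_cursor;
    DEALLOCATE line_cursor;

Rows with fewer than 110 entries simply produce fewer pairs, so no per-column paths are needed.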
If you want to solve this in SSIS without the staging tables, you could create a destination script component. You could use a switch statement or a hashtable to look up the right sproc to execute for each data row.
It is unclear whether this is a better solution than the staging table approach above, but it is an alternative.
I know you already accepted an answer, but couldn't you use an Unpivot task to achieve what you wanted to do here?
