Retrieving the number of rows written to a CSV file - sql-server

I have a task where I need to generate a CSV file from data coming out of two views, including a header with hard-coded values and a trailer at the bottom of the file with these fields: Record_Type = E99, Row_count, and a blank field 190 characters long.
I'm able to get the desired output file, but I can't figure out how to retrieve the number of rows coming out of the two views and write it between the record type and the blank field at the bottom of the CSV, since the whole trailer line is | delimited.
Please help me figure this out.
Thanks.

My suggestion:
I assume you are using an SSIS package to solve this problem.
Create a SQL staging table to store the content you want to export to the CSV file. You can use a stored procedure to truncate and refill this staging table, and run it through an Execute SQL Task in the SSIS package.
Use a Data Flow Task to export the data from the staging table to the CSV file. The input will be the SQL staging table and the output will be a flat file with a comma (,) delimiter.
I hope this helps.
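As a rough sketch of the trailer step (in Python rather than SSIS, and assuming the data rows have already been written by the Data Flow Task; in the package the count would come from something like a `SELECT COUNT(*)` on the staging table, name hypothetical):

```python
import io

def build_trailer(row_count: int) -> str:
    """Trailer = record type E99, the row count, and a 190-character
    blank field, pipe-delimited as described in the question."""
    return "|".join(["E99", str(row_count), " " * 190])

def append_trailer(stream: io.TextIOBase, row_count: int) -> None:
    """Append the trailer as the last line of an already-written CSV stream."""
    stream.write(build_trailer(row_count) + "\n")
```

In SSIS you could hold the count in a package variable populated by an Execute SQL Task and write the trailer with a second, append-mode flat file destination or a Script Task.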

Related

SSIS: Multiple Sources to One Destination table

I am new to SSIS & ETL, and I am trying to extract & load data into a single destination table in SQL Server.
I have 4 sources: a text file, a CSV file, an Excel file, and some data in SQL Server. Please find the pictures attached showing what I have done so far. In one package, I have created 2 data flows, not connected (highlighted in the red boxes): one for the .txt & .csv files and another for the .xls file & the data in SQL Server.
Data is getting inserted, but not correctly. Here are the screenshots attached:
And the screenshot of the output in the destination table is shown below; Customer_ID is the auto-increment column.
Can anyone please let me know what I am missing and how to do this correctly?
Thanks in advance
Raj
From the example, it seems that the merge joins are not working.
Are you sure that the joining columns are OK?
If not, you can try adding derived columns to format the data for each source. SSIS is case sensitive, and if you work with varchar columns, be careful of: leading and trailing spaces, Unicode vs. non-Unicode columns, accents, ...
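To illustrate the kind of cleanup a derived column would do before the join, here is a small Python sketch (the real fix would use TRIM/UPPER expressions in an SSIS Derived Column; the function name here is just for illustration):

```python
import unicodedata

def normalize_key(value: str) -> str:
    """Trim, uppercase, and strip accents so that keys from different
    sources compare equal under a case-sensitive merge join."""
    value = value.strip().upper()
    # Decompose accented characters (e.g. "É" -> "E" + combining accent)
    # and drop the combining marks.
    decomposed = unicodedata.normalize("NFD", value)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))
```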

Creating a blank CSV file Snowflake Staging area

I was working on creating a staging area in Snowflake and creating CSV files inside it. I am stuck on an issue; I hope someone here is experienced enough to help me with this case.
I have created a job in Unix to create a CSV file in a staging area from a table, with WHERE conditions added to filter the data. But at times, when the SELECT statement returns no rows, the CSV file is not created in the staging area at all. Is there any way, in such cases, to create a CSV file with only the column names and no data rows?
Thanks in advance.
Rahul
Have you tried the HEADER = TRUE option in your COPY INTO?
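The underlying idea (write the header unconditionally, then the data rows, however few) can be sketched in Python; this is not the Snowflake mechanism itself, just the behavior you are after:

```python
import csv
import io

def export_with_header(stream, columns, rows):
    """Write a CSV that always contains the header line,
    even when the filtered query returned zero rows."""
    writer = csv.writer(stream)
    writer.writerow(columns)   # header is emitted unconditionally
    writer.writerows(rows)     # no-op when rows is empty
```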

I want to fill a database table from an xlsx file

I have a table (Exchange) in my SQL database: Exchange(ExchangeDateStart, ExchangeDateEnd, Value, Code); the period is monthly. And my CSV file:
Now, my problem is how to use this file to fill my table. I thought about a stored procedure or a bash script, but in the end I used SQL Server to import the data. However, I can't do the mapping between source and destination, because I have just one DateStart and one DateEnd, while in the CSV file I have multiple dates, and the same goes for the A columns.
To be clearer, I want to show them like this:
If someone could help me, please.
I suggest using Python and one of its packages that can import Excel data.
If you google "import excel data into database with python" you'll find code that shows you how to do it. Here's one how-to with code: https://mherman.org/blog/import-data-from-excel-into-mysql-using-python/
In the end I used 'Import Data' from SQL Server Management Studio into a temporary table, then updated this table with a stored procedure. [https://support.discountasp.net/kb/a1179/how-to-import-a-csv-file-into-a-database-using-sql-server-management-studio.aspx]
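The mapping problem here is an unpivot: each wide source row (one value per month) must become several monthly (DateStart, DateEnd, Value, Code) rows. A minimal Python sketch, assuming a hypothetical layout where each row carries a currency code, a year, and twelve-or-fewer monthly values:

```python
import calendar
from datetime import date

def unpivot_row(code, year, monthly_values):
    """Turn one wide row (one value per month) into monthly
    (DateStart, DateEnd, Value, Code) rows for the Exchange table."""
    rows = []
    for month, value in enumerate(monthly_values, start=1):
        last_day = calendar.monthrange(year, month)[1]  # month-end, leap-aware
        rows.append((date(year, month, 1), date(year, month, last_day),
                     value, code))
    return rows
```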

Can SSIS support loading of files with varying column lengths in each row?

Currently I receive a daily file of around 750k rows and each row has a 3 character identifier at the start.
For each identifier, the number of columns can change but are specific to the identifier (e.g. SRH will always have 6 columns, AAA will always have 10 and so on).
I would like to be able to automate this file into an SQL table through SSIS.
This solution is currently built in MS Access using VBA, just looping through recordsets with a CASE statement; it then writes each record to the relevant table.
I have been reading up on BULK INSERT, BCP (w/Format File) and Conditional Split in SSIS, however I always seem to get stuck at the first hurdle of even loading the file in, as SSIS errors due to the variable column layouts.
The data file is pipe delimited and looks similar to the below.
AAA|20180910|POOL|OPER|X|C
SRH|TRANS|TAB|BARKING|FORM|C|1.026
BHP|1
*BPI|10|16|18|Z
BHP|2
*BPI|18|21|24|A
(* I have added the * to show that these are child records of the parent record; in this case BHP can have multiple BPI records underneath it)
I would like to be able to load the TXT file into a staging table, and then I can write the T-SQL to loop through the records and parse them into their relevant tables (AAA -> tblAAA, SRH -> tblSRH, ...).
I think you should read each row as one column of type DT_WSTR with length = 4000, then implement the same logic you wrote in VBA within a Script Component (VB.NET / C#). There are similar posts that can give you some insights:
SSIS ragged file not recognized CRLF
SSIS reading LF as terminator when its set as CRLF
How to load mixed record type fixed width file? And also file contain two header
SSIS Flat File - CSV formatting not working for multi-line fields
how to skip a bad row in ssis flat file source
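The routing logic the Script Component would implement can be sketched in Python: read each line whole, split on the pipe, and bucket by the 3-character identifier, which is the same branching the VBA CASE statement (or an SSIS Conditional Split) performs before writing to the per-type tables:

```python
from collections import defaultdict

def split_by_record_type(lines):
    """Route each pipe-delimited line into a bucket keyed by its
    3-character identifier; each bucket maps to one staging table."""
    buckets = defaultdict(list)
    for line in lines:
        line = line.strip()
        if line:
            fields = line.split("|")
            buckets[fields[0]].append(fields[1:])
    return buckets

# Sample lines from the question (without the illustrative *)
sample = [
    "AAA|20180910|POOL|OPER|X|C",
    "SRH|TRANS|TAB|BARKING|FORM|C|1.026",
    "BHP|1",
    "BPI|10|16|18|Z",
]
```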

Use SSIS to import multiple .csv files that each have unique columns

I keep running into issues creating a SSIS project that does the following:
inspect a folder for .csv files -> for each CSV file -> insert into [db].[each CSV file's name]
Each CSV and its corresponding table in the database have their own unique columns.
I've tried the foreach loop found in many write-ups, but the issue comes down to the flat file connection: it expects each CSV file to have the same columns as the file before it, and errors out when not presented with those column names.
Is anyone aware of a workaround for this?
Every flat file format would have to have its own connection, because the connection is what tells SSIS how to interpret the data set contained in the file. If it didn't exist, it would be the same as telling SQL Server you want data out of a database without specifying a table or its columns.
I guess the thing you have to consider is: how are you going to tell a Data Flow Task which column in a source component maps to which column in a destination component? Will it always be the same column name? Without a connection manager there is no way to map the columns unless you do it dynamically.
There are still a few ways you can do what you want and you just need to search around because I know there are answers on this subject.
You could create a Script Task and do the import in .NET.
You could create a SQL Script Task and use BULK INSERT or OPENROWSET into a temporary staging table, then use dynamic SQL to map and import into the final table.
Try keeping a mapping table with the columns below:
FileLocation
FileName
TableName
Add all the details to this table.
Create user variables for all the column names & one for the result set.
Read the data from the table using an Execute SQL Task & store it in the single result-set variable.
In the For Each Loop container's variable mappings, map all the columns to the user variables.
Create two connection managers, one for Excel & the other for the CSV file.
Pass the CSV file connection string as @[User::FileLocation] + @[User::FileName]
Inside the For Each Loop container, use BULK INSERT & assign the source & destination connections, as well as the table name from the @[User::TableName] parameter.
If you need any details, please post and I will try to help.
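The mapping-table-driven loop above can be sketched in Python: each entry yields one BULK INSERT statement for the Execute SQL task to run. The file paths and table names are hypothetical, and `FORMAT = 'CSV'` assumes SQL Server 2017 or later:

```python
# Hypothetical contents of the mapping table described above.
mapping = [
    {"FileLocation": "C:\\imports\\", "FileName": "customers.csv",
     "TableName": "dbo.Customers"},
    {"FileLocation": "C:\\imports\\", "FileName": "orders.csv",
     "TableName": "dbo.Orders"},
]

def bulk_insert_sql(entry):
    """Build the BULK INSERT statement for one mapping-table entry."""
    path = entry["FileLocation"] + entry["FileName"]
    return (f"BULK INSERT {entry['TableName']} FROM '{path}' "
            "WITH (FORMAT = 'CSV', FIRSTROW = 2);")
```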
You could look into Biml script, which dynamically creates and executes a package based on available metadata.
I have 2 options for you here:
1) A Script Component, to dynamically create the table structures in SQL Server.
2) Within a For Each Loop container, use an Execute SQL Task with the OPENROWSET clause.
