I have a table in a SQL Server database that is loaded from a CSV file uploaded to an FTP server each night. The table shows data for the past 30 days, but it suddenly stopped showing entries past a certain date.
I've checked the FTP dump and the CSV file - everything looks fine (nothing has changed). The table itself is created using SSIS, and I've found various scripts for extract, load and transform. However, I'm unsure of how and where to start troubleshooting.
I realize that this is a somewhat broad question, so I'm looking for a way of narrowing down the problem.
Every time I've had an SSIS issue like this, I would open the package and run it manually from within SSIS, because I've found the logging at that level better than when it runs as a SQL Agent job.
For large and complicated packages I would have to select parts of the package and run it piece by piece.
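If the package is deployed to the SSIS catalog, another option is to start an execution with verbose logging from T-SQL and compare its messages with what the nightly job produces. A rough sketch, assuming a catalog-deployed package; the folder, project and package names below are placeholders:

    DECLARE @execution_id BIGINT;

    -- Create an execution for the deployed package (names are hypothetical)
    EXEC SSISDB.catalog.create_execution
        @folder_name     = N'ETL',
        @project_name    = N'NightlyImports',
        @package_name    = N'LoadDailyCsv.dtsx',
        @reference_id    = NULL,
        @use32bitruntime = 0,
        @execution_id    = @execution_id OUTPUT;

    -- Logging level 3 = Verbose, so row counts and warnings are captured
    EXEC SSISDB.catalog.set_execution_parameter_value
        @execution_id,
        @object_type     = 50,
        @parameter_name  = N'LOGGING_LEVEL',
        @parameter_value = 3;

    EXEC SSISDB.catalog.start_execution @execution_id;

    -- Afterwards, read the messages recorded for that execution
    SELECT message_time, message_type, message
    FROM SSISDB.catalog.event_messages
    WHERE operation_id = @execution_id
    ORDER BY message_time;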
Related
I would like to ask what may be a basic question. My SSIS package runs a SQL query and exports the result to a *.csv file. Everything had worked fine for months (the SQL select runs and the records end up in the CSV export), but yesterday I had an issue: SSIS ran without error, but in the *.csv export one column had no records at all (the column header is in the CSV file, but no values), while the other columns did have records, even though they come from the same table as the missing column. When I ran the SSIS package manually again (about 4 minutes later), the data were in the column. SSIS is deployed and runs on the same server as the database. My question is: given that there is no error in the SSIS execution report, the data flow task ran correctly, and the query now returns all data without any editing, is there any chance of finding out what caused the data loss? Maybe in a server log?
Thank you.
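If the package is deployed to the SSIS catalog and the execution ran with the Performance or Verbose logging level, the catalog keeps per-path row counts, which would show whether the source actually returned rows for that column's table on the night in question. A rough sketch against SSISDB; the package name is a placeholder:

    SELECT e.execution_id,
           e.start_time,
           s.task_name,
           s.source_component_name,
           s.destination_component_name,
           s.rows_sent
    FROM SSISDB.catalog.executions AS e
    JOIN SSISDB.catalog.execution_data_statistics AS s
        ON s.execution_id = e.execution_id
    WHERE e.package_name = N'ExportToCsv.dtsx'   -- hypothetical package name
    ORDER BY e.start_time DESC;

If the job only ran with the default Basic logging level, these statistics won't be there, but catalog.event_messages for that execution may still contain warnings worth checking.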
I have TPC-DS data generated with dsdgen, about 100 GB in total, and each .dat file is around 20 GB or more.
I have exhaustively searched for how to load flat files into SQL Server 2016, and I tried importing them using an SSIS package with fast parse enabled on all the decimal fields.
It still takes a very long time to load into a table: it has been running for 24 hours and counting, and has only loaded about 9,700,000 records of a single table.
Is this the ideal way to do it, or what could the problem be?
Please help; I am stuck with this problem and also new to the MSSQL world.
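For comparison, it may be worth loading one of the .dat files with a plain BULK INSERT from T-SQL; with an empty heap target and TABLOCK it can take advantage of minimally logged bulk loading, which often matters far more for throughput than fast parse. A rough sketch, assuming dsdgen's default pipe delimiter and a pre-created target table; the table name and file path are placeholders:

    BULK INSERT dbo.store_sales             -- hypothetical target table
    FROM 'D:\tpcds\store_sales.dat'         -- hypothetical path to one .dat file
    WITH (
        FIELDTERMINATOR = '|',
        ROWTERMINATOR   = '\n',
        TABLOCK,                            -- allows a minimally logged bulk load
        BATCHSIZE       = 100000
    );

If the timings are similar, the bottleneck is more likely the target table (indexes, triggers, or the transaction log under the full recovery model) than the SSIS package itself.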
I was trying to build a smaller integration using Logic Apps in Azure.
I have a stored procedure that selects data from a database and outputs XML as the result.
The thing is that the XML result is about 50k rows and pretty large.
I made an on-premises data gateway connection to run the stored procedure through Logic Apps, but when I get the result, not only does it split the big XML, it also cuts the whole result off after about 15k rows.
I know I could use blobs, which means I would need to export the SQL XML to files first, which in turn means using BCP with something like PowerShell to export the XML to a file. But I'm trying to skip most of the on-premises steps; I want this solution to be as cloud-based as possible.
Anyone have a solution for this?
OK, so...
I have boiled it down to two possible explanations for why this problem occurs.
The first one is that I noticed I got this error when trying to open the XML in SQL Server:
'~vs8D51.xml' is too large to open with XML editor. The maximum file size is '10' MB. Please update the registry key 'HKCU\Software\Microsoft\SQL Server Management Studio\13.0_Config\XmlEditor\MaxFileSizeSupportedByLanguageService' to change the maximum size.
This makes me think that the stored procedure call in Azure Logic Apps doesn't fetch a result larger than 10 MB because of this restriction in SQL Server.
I have tried to change it in regedit, but every time I restart SQL Server Management Studio it resets to 10 MB.
I have no idea if this is a correct assessment of the problem, but it's a thought...
Second, a colleague told me he had a similar problem with a file from an FTP.
He said that in some weird way the logic app doesn't get all the data because of some kind of timeout that happens in the background...
He had to fetch the file content in split pieces and somehow stream it through the workflow of the logic app and then recreate the whole thing and save it to a file on the other end of the integration.
That made me think of trying out this: SQL Pagination for bulk data transfer with Logic Apps
It works, but not quite how I want it to. I can stream the data and save it to a blob, but what gets saved is the result rows from the table itself, not split pieces of the whole XML of that same data...
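As a rough illustration of that pagination idea (the table and column names below are made up), each iteration of the Logic App loop asks SQL for one page of rows as its own XML fragment, and the pieces have to be stitched together afterwards:

    -- @PageNumber is supplied by the Logic App on each iteration
    DECLARE @PageNumber INT = 0;
    DECLARE @PageSize   INT = 5000;

    SELECT OrderId, CustomerId, OrderDate
    FROM dbo.Orders                        -- hypothetical source table
    ORDER BY OrderId                       -- needs a stable, unique ordering
    OFFSET @PageNumber * @PageSize ROWS
    FETCH NEXT @PageSize ROWS ONLY
    FOR XML PATH('Order'), ROOT('Orders');

Each page comes back as its own <Orders> fragment, which is exactly why the output is pieces of the table rather than split pieces of one contiguous XML document.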
Anyone know of a way to maybe iterate/paginate through a whole XML result in SQL in a good way, with root tags and all?
In SSMS 18 in order to have the MaxFileSizeSupportedByLanguageService value persist I needed to edit that key's value in the C:\Program Files (x86)\Microsoft SQL Server Management Studio 18\Common7\IDE\CommonExtensions\Platform\Shell\Microsoft.XmlEditor.pkgdef file.
I have a stored procedure that I am going to run every weekend; it produces a result set that I need to export into an Excel file.
I want to automate this process, so I am going to create a SQL Agent job that runs this stored procedure every weekend so that the generated Excel file is sent to my reporter.
For this I need the steps to export the result set data to an Excel file.
Also, is it possible to send that Excel file to a specific email address while running the job itself?
So, you might try your luck on https://dba.stackexchange.com/, but in my experience a SQL Agent job running a stored procedure can be coaxed into returning CSV or XML - and those can end up in Excel, but there are missing links. I think the missing links would involve programming and potentially 3rd-party tools to avoid using Excel's COM API.
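If plain CSV attached to an email is good enough, Database Mail can do that directly from a job step without any external tooling. A rough sketch, assuming Database Mail is already configured; the profile name, recipient and procedure name are placeholders:

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'Reports',                                -- hypothetical Database Mail profile
        @recipients   = 'reporter@example.com',                   -- hypothetical recipient
        @subject      = 'Weekly report',
        @query        = 'SET NOCOUNT ON; EXEC dbo.WeeklyReport;', -- hypothetical stored procedure
        @attach_query_result_as_file = 1,
        @query_attachment_filename   = 'WeeklyReport.csv',
        @query_result_separator      = ',',                       -- comma-separated so Excel opens it
        @query_result_no_padding     = 1;

The attachment is really just delimited text rather than a true .xlsx file, which is one of the missing links mentioned above.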
I'd strongly recommend pursuing SQL Server Reporting Services instead. It is included free with your edition of SQL Server and includes the ability to:
run reports on a schedule (subscriptions),
format the results as an Excel file
distribute the results via email
You'd take your query and use it as the data source for a "report" and use the report wizard to create a very simple table with the results.
Avoid page headers (or footers) that span columns - this will keep the Excel output cleaner.
References
Stack Overflow: reporting-services-export-to-excel-with-multiple-worksheets
Technet: Reporting Services
I understand this may be a little far-fetched, but is there a way to take an existing SSIS package and get an output of the work it's doing as T-SQL? I mean, that's basically what it is, right? Transferring data from one database to another can be done with T-SQL as well.
I'm wondering this because I'm trying to get away from using SSIS packages for data transfer and instead use EF/LINQ to do this on the fly in my application. My thought process is that currently I have an SSIS package that transfers and formats data from one database to another in preparation to be spit out to an Excel file. This SSIS package runs nightly and helps speed up the generation of the Excel file: once the data is transferred to the second db, it's already nice and formatted correctly.
However, if I could leverage EF and maybe some LINQ to SQL to format the data from the first database on the fly and spit it out to Excel quickly without having to use this second db, that would be great. So can my original question be done - can I extract the T-SQL representation of an SSIS package somehow?
SSIS packages are not exclusively T-SQL. They can consist of custom back-end code, file system changes, Office document creation steps, etc., to name only a few. As a result, generating the entirety of an SSIS package's work as T-SQL isn't possible, because the full breadth of its work isn't limited to SQL Server.