I have a folder in the cloud with some files, and I have an SSIS package that uploads these files to SQL Server. I also have two variables in my SSIS package: start_date and end_date.
Basically, I would like to do the following:
If start_date is empty, the package should upload the files added since the last upload.
If start_date is not empty, the package should upload the files from start_date to end_date.
I have a variable that configures which files SSIS needs to upload, and I have a problem here:
I don't know how to set up this step, because when start_date is empty, SSIS gives an error message.
You should use the following syntax:
IF '" + (DT_WSTR,10)REPLACENULL(#[$Package::start_date],"") + "' <> ''
Pretty new to BI and SQL in general, but a few months ago I didn't even know what a model was, and now here I am, trying to build a package that runs daily.
Currently I'm running this in Excel via Power Query, but because there is so much data, I have to manually change the query every month, so I decided to move it into SSIS.
Required outcome: pull the last date in my database and use it as a variable in the package (as I have millions of rows, I only want to load rows with dates greater than what is already in my table).
Here is my setup: I set up a variable for the SQL query and I'm trying to use it in my OLE DB Source, as shown below.
Execute SQL Task - the results are fine; it returns the date as "dd/mm/yyyy hh24:mi:ss":
SELECT MAX (CONVACCT_CREATE_DATE) AS Expr1 FROM GOMSDailySales
Variable for OLE DB SQL Query:
"SELECT fin_booking_code, FIN_DEPT_CODE, FIN_ACCT_NO, FIN_PROD_CODE, FIN_PROG_CODE, FIN_OPEN_CODE, DEBIT_AMT, CREDIT_AMT, CURRENCY_CODE, PART_NO, FIN_DOC_NO, CREATE_DATE
FROM cuown.converted_accounts
WHERE (CREATE_DATE > TO_DATE(@[User::GetMaxDate],'yyyy/mm/dd hh24:mi:ss'))
AND (FIN_ACCT_NO LIKE '1%')"
Currently I'm getting a "missing expression" error, and if I add single quotes around @[User::GetMaxDate], I get a "year must be between 0 and xxxx" error.
What am I doing wrong, and is there a cleaner way to get this done?
In the OLE DB Source, change the data access mode to SQL command and use the following command:
SELECT fin_booking_code, FIN_DEPT_CODE, FIN_ACCT_NO, FIN_PROD_CODE, FIN_PROG_CODE, FIN_OPEN_CODE, DEBIT_AMT, CREDIT_AMT, CURRENCY_CODE, PART_NO, FIN_DOC_NO, CREATE_DATE
FROM cuown.converted_accounts
WHERE (CREATE_DATE > TO_DATE(?,'yyyy/mm/dd hh24:mi:ss'))
AND (FIN_ACCT_NO LIKE '1%')
Then click the Parameters... button and map the variable @[User::GetMaxDate] to the first parameter.
For more information, check the following answer: Parameterized OLEDB source query
Alternative method
If parameters are not supported in the OLE DB provider you are using, create a variable of type string and evaluate this variable as the following expression:
"SELECT fin_booking_code, FIN_DEPT_CODE, FIN_ACCT_NO, FIN_PROD_CODE, FIN_PROG_CODE, FIN_OPEN_CODE, DEBIT_AMT, CREDIT_AMT, CURRENCY_CODE, PART_NO, FIN_DOC_NO, CREATE_DATE
FROM cuown.converted_accounts
WHERE CREATE_DATE > TO_DATE('" + (DT_WSTR, 50)@[User::GetMaxDate] +
"' ,'yyyy/mm/dd hh24:mi:ss') AND FIN_ACCT_NO LIKE '1%'"
Then, in the OLE DB Source, change the data access mode to SQL command from variable and select the string variable you created.
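Whichever method you use, make sure the string stored in GetMaxDate actually matches the 'yyyy/mm/dd hh24:mi:ss' mask passed to TO_DATE; the Execute SQL Task above returns the date as dd/mm/yyyy, which is exactly what triggers Oracle's "year must be between" error. One option, assuming the GOMSDailySales watermark table sits in SQL Server 2012 or later, is to return the value already formatted and map the single-row result set to the User::GetMaxDate string variable:

SELECT FORMAT(MAX(CONVACCT_CREATE_DATE), 'yyyy/MM/dd HH:mm:ss', 'en-US') AS Expr1
FROM GOMSDailySales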
You're trying to use the SSIS variable as if it were a variable inside the query. When constructing a SQL query in a string variable, you simply concatenate the strings together. The expression for your query-string variable should look like this:
"SELECT fin_booking_code, FIN_DEPT_CODE, FIN_ACCT_NO, FIN_PROD_CODE, FIN_PROG_CODE, FIN_OPEN_CODE, DEBIT_AMT, CREDIT_AMT, CURRENCY_CODE, PART_NO, FIN_DOC_NO, CREATE_DATE
FROM cuown.converted_accounts
WHERE CREATE_DATE > " + #[User::GetMaxDate] +
"AND (FIN_ACCT_NO LIKE '1%')"
I am using SQL Server 2012. I need to create an FTP task that will pick up an Excel file (or a CSV file if needed) and upload it to an FTP site on a daily basis. The basic idea is to automate the whole process, from the creation of the Excel file to uploading it to the FTP site.
The content of that Excel file will be the result of a T-SQL query that runs daily and overwrites the existing file. My idea is to use a SQL Server Agent job for this part. However, I am having a really hard time figuring out how to write the T-SQL so that it outputs the results into an Excel file.
Creating the FTP task is not really an issue right now, as I think I can handle that with an SSIS package.
For simplicity, let us assume my SQL query is as follows:
Select *
From TravelAgency
and let us also assume that I want to save that file as 'myfile.xlsx' in the 'C:\Test' folder.
Can it be done in a T-SQL query?
Here is an example (1251 is the Russian encoding; you do not need it):
declare @query nvarchar(max) = N'SELECT * FROM TravelAgency'
declare @SourceFileName nvarchar(260) = N'C:\Test\myfile.csv' -- bcp -c writes plain character data, not a true .xlsx
declare @cmd nvarchar(4000), @ret int
set @cmd = 'bcp "' + @query + '" queryout "' + @SourceFileName + '" -T -c -C 1251'
--print @cmd
exec @ret = master..xp_cmdshell @cmd;
if @ret <> 0 throw 50000, 'shell error while saving file', 1;
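Note that xp_cmdshell is disabled by default; a sysadmin needs to enable it once before the Agent job can shell out to bcp:

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;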
I am very new to SSIS. I keep getting a syntax error when using the "SQL command from variable" data access mode in the OLE DB Source editor. I created two variables: one of type String to store a SQL statement, and another to store an Int value. Please help me correct the syntax.
1) For "SQL" variable, you need to specify an expression as below example:
"SELECT TOP " + (DT_STR, 10, 1252) #[User::NumberOfRecords] + " * FROM YourTable"
2) Then use that "SQL" variable in the data source (SQL command from variable).
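Make sure the variable's EvaluateAsExpression property is set to True. If NumberOfRecords is 100, for example, the variable then evaluates to the following command at run time (YourTable is just the placeholder name from the expression above):

SELECT TOP 100 * FROM YourTable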
I think my question is simple:
How can I find out where my query is running from (i.e., what is the location of the script file itself)?
Edit:
Thank you for your answer.
I need to import an XML file using my T-SQL script file, and I want to keep them together, so wherever someone runs the T-SQL script, it has to know its own current directory in order to find the XML file and import it. Thanks again!
You need a well known location where you can place XML files for the server to load. This could be a share on the SQL Server machine, or on a file server which the SQL Server service account has permissions to read from.
You then need a comment like this at the top of your script:
--Make sure you've placed the accompanying XML file on \\RemoteMachine\UploadShare
--Otherwise, expect this script to produce errors
Change \\RemoteMachine\UploadShare to match the well known location you've selected. Optionally, have the comment followed by 30-40 blank lines (or more comments), so that it's obvious to anyone running it that they might need to read what's there.
Then, write the rest of your script based on that presumption.
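Once the file is on that share, the import itself can read it from the fixed path. A minimal sketch, assuming a hypothetical file name and a simple two-column element layout:

DECLARE @xml xml;

SELECT @xml = CONVERT(xml, BulkColumn)
FROM OPENROWSET(BULK '\\RemoteMachine\UploadShare\MyData.xml', SINGLE_BLOB) AS src;

--Shred the XML into rows; /Root/Row, Col1 and Col2 are placeholder names
SELECT n.value('(Col1)[1]', 'varchar(50)') AS Col1,
       n.value('(Col2)[1]', 'varchar(50)') AS Col2
FROM @xml.nodes('/Root/Row') AS t(n);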
I found a simpler solution to my problem!
I just import my XML file into a temp table once.
Then I write a SELECT query against the temp table that contains my imported data, like this:
SELECT 'INSERT INTO MyTable VALUES (' + Col1 + ', ' + Col2 + ')' FROM MyImportedTable
Now I have an INSERT command for each of my imported records.
I save all of the INSERT commands in my script, so I only need my script file wherever I go.
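As an illustration with made-up numeric values (string columns would need extra quoting in the concatenation), that SELECT produces one ready-to-run statement per imported row, which is what ends up saved in the script:

INSERT INTO MyTable VALUES (101, 202)
INSERT INTO MyTable VALUES (102, 203)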