We recently migrated our data servers from Netezza to Snowflake, and all SQL queries that originally ran in Netezza need to be translated to be compatible with Snowflake. I'm not able to translate the following syntax from Netezza to Snowflake. Can someone please help? Thanks
,trim(trailing ',' from replace(replace (XMLserialize(XMLagg(XMLElement('X',date(orderdate)))), '<X>','' ),'</X>' ,',' )) as orderdate
I tried using the REGEX_REPLACE function in Snowflake, but that didn't work.
In general, when translating code, it is better to focus on the behavior rather than trying to translate it literally.
Based on how the code is written (generating an XML element per row, aggregating the elements, replacing <X> and </X>, and trimming the trailing comma), it appears to be building a comma-separated string of dates.
In Snowflake:
LISTAGG(TO_VARCHAR(orderdate, 'YYYY-MM-DD'), ',') AS orderdate
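For context, here is a minimal sketch of how that expression might sit in a full query. The orders table and the customer_id grouping column are assumptions (your real query may group differently), and the WITHIN GROUP clause is optional if the order of the dates does not matter:

SELECT
    customer_id,
    LISTAGG(TO_VARCHAR(orderdate, 'YYYY-MM-DD'), ',')
        WITHIN GROUP (ORDER BY orderdate) AS orderdate  -- comma-separated list per group
FROM orders
GROUP BY customer_id;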
I have found several posts on using the GETDATE() function for a SQL Server linked table from an Access front-end VBA procedure. Those posts focus on the WHERE clause of the query, but I have been unable to find corresponding information on using GETDATE() for column assignment.
For example, I understand that in the WHERE clause, I would use something like this:
WHERE MyDate = CAST(GETDATE() AS DATE)
However, I am getting syntax errors in VBA when I try to assign the current date to a column, like this:
INSERT INTO MyTable ( SomeValue, TheDate ) SELECT 'Widget' AS Expr1, CAST(GETDATE() AS DATE) AS Expr2;
In this example, TheDate is defined as DateTime in SQL Server. Written like this, VBA reports "Syntax error (missing operator) in query expression 'CAST(GETDATE() AS DATE)'." I tried to surround the expression with Access-friendly # date delimiters, but no luck there.
After spending about 30 minutes searching stackexchange.com various ways for MS Access Date() in SQL, I have been unable to find this. However, it is so simple that I am sure it has already been answered somewhere.
In MS Access you should likely use the Now() and Date() functions (not 100% sure for linked SQL tables, you will have to experiment). The first is the equivalent of GETDATE() in SQL Server; the second returns the current date without the time.
If you run this in Access on a linked table (not a PT query), it should read:
INSERT INTO MyTable ( SomeValue, TheDate )
VALUES ('Widget', Date());
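If you instead send the statement to SQL Server as a pass-through query (so Access does not parse the SQL), the original T-SQL should work as written; a sketch, assuming the same table and columns:

INSERT INTO MyTable (SomeValue, TheDate)
VALUES ('Widget', CAST(GETDATE() AS DATE));  -- stored as midnight in the datetime column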
There seems to be some confusion here. If you are building an Access query, then ZERO of the SQL Server date functions and syntax matter. Your SQL MUST continue to be written to Access standards unless you are using a pass-through query.
However, I have seen this 100x times here.
What is the data type on the SQL Server side?
Is it datetime, or datetime2?
And double, triple (and more) check the linked table in design mode.
If you linked to SQL Server using the standard legacy "SQL Server" driver (the one that has shipped for 20+ years, since Windows 98SE), then
you MUST check whether Access is seeing those columns as text or as date columns (which in Access always allow a time part if you want).
Access code, queries, forms, and EVERYTHING should require ZERO changes if you migrate that data from Access to SQL Server and link the tables. Again: ZERO changes.
However, if you used datetime2 on the SQL Server side, then you CANNOT use the legacy "SQL Server" driver when linking the tables, because it does not support the newer datetime2 format. As a result, Access will actually see, use, and process that column as a text column. You REALLY do not want that to occur.
Why?
Because then you spend the next week asking questions on SO about why some date code or column or query does not work.
Again:
ZERO changes are required in Access. If your dates are starting to break, then the issue is not date formats, but that the column is now being seen by Access as a TEXT data type.
Solution:
Either change the SQL-side datetime2 columns to datetime and re-link (a one-line ALTER TABLE per column; see the sketch at the end of this answer),
or
re-link your tables using a newer driver (Native Client 11 or later; the ODBC drivers are up to version 18 now). That way, Access will see, use, and process the datetime2 columns as a proper date type in Access.
So, before you do anything, open one of the Access tables linked to SQL Server in design mode (ignore the read-only prompt). Now, look at the data type assigned to the date columns. If they are text, then you have a royal mess.
You need to re-link using the newer ODBC drivers.
Zero of your existing code, SQL, and queries should be touched or even changed if you are using a linked table to SQL Server. But then again, if you linked using the wrong SQL ODBC driver, then Access cannot see nor process those datetime2 columns as dates; it will be using text, and you really do not want to allow that to occur.
In summary:
Any date code, SQL updates, sorting, queries, VBA code, form code, and reports should continue to work with ZERO changes. If you are making changes to dates after a migration, then something has gone wrong, and those date columns are not being seen by Access as date columns.
Either get rid of all datetime2 columns and then re-link (change them server side to datetime), or re-link the tables using a Native Client 11 or later ODBC driver. Either of these choices will fix this issue.
This is a fix that requires ZERO code and zero changes to how Access deals with dates.
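If you go the first route and change the column type on the server, it is a one-line T-SQL statement per column (dbo.MyTable and TheDate are just placeholder names here), followed by re-linking the table in Access:

ALTER TABLE dbo.MyTable
    ALTER COLUMN TheDate datetime;  -- was datetime2; re-link the table in Access afterwards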
When I use T-SQL to convert a datetime into dd.mm.yyyy for a CSV output using SSIS, the file is produced with dd-mm-yyyy hh:mm:ss, which is not what I need.
I am using:
convert(varchar,dbo.[RE-TENANCY].[TNCY-START],104)
which appears correct in SSMS.
Which is the best way to handle the conversion to be output from SSIS?
Not as simple as I thought it would be.
It works for me.
Using your query as a framework for driving the package
SELECT
CONVERT(char(10),CURRENT_TIMESTAMP,104) AS DayMonthYearDate
I explicitly declared a length for our dd.mm.yyyy value and since it's always going to be 10 characters, let's use a data type that reflects that.
Run the query and you can see it correctly produces 13.02.2019
In SSIS, I added an OLE DB Source to the data flow and pasted in my query
I wired up a flat file destination and ran the package. As expected, the string generated by the query entered the data flow and landed in the output file.
If you're experiencing otherwise, the first place I'd check is the metadata: double-click the line between your source and the next component and choose Metadata. Look at what is reported for the tenancy start column. If it doesn't indicate DT_STR/DT_WSTR, then SSIS thinks the data type is a date variant and is applying locale-specific rules to the format. You might also need to check how the column is defined in the flat file connection manager.
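To apply this to the column from the question, something along these lines (assuming the same dbo.[RE-TENANCY] table) should send a fixed-length string into the data flow rather than a date:

SELECT CONVERT(char(10), dbo.[RE-TENANCY].[TNCY-START], 104) AS TncyStartDate
FROM dbo.[RE-TENANCY];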
The most precise control over the output format of the date can be achieved with T-SQL FORMAT(), available since SQL Server 2012.
It is slightly slower than CONVERT() but gives the desired flexibility.
An example:
SELECT TOP 4
name,
FORMAT(create_date, 'dd.MM.yyyy') AS create_date
FROM sys.databases;
name create_date
--------------------
master 08.04.2003
tempdb 12.02.2019
model 08.04.2003
msdb 30.04.2016
P.S. Take into account that FORMAT() produces NVARCHAR output, which differs from your initial conversion logic, so an extra cast to VARCHAR(10) may be necessary.
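For example, wrapping the FORMAT() call in a CAST keeps the output as VARCHAR(10):

SELECT CAST(FORMAT(CURRENT_TIMESTAMP, 'dd.MM.yyyy') AS varchar(10)) AS DayMonthYearDate;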
I am trying to see if there is an easy answer for this. I have done something similar using multi-pick dropdown parameters in SSRS, but this appears to be different.
My scenario is this, so maybe there is an even better answer.
I have a production server that I do not want to make any changes to including temp tables or functions. The production server has a table of clients with about 1600 records. I have set up an SSIS package that will allow transfer of data from production to dev based on a clientid. So my sources would have a query similar to Select Field From Table Where ClientId = ?
This works fine. Now I want to load more than one client, based on data in the Clients table. It may be Select ClientId From Clients Where Field = A, and it returns multiple ClientIds.
I am able to populate a comma-delimited list from an Execute SQL Task into an SSIS variable, so it may be 1,4,8.
If I change my source query to use ClientId IN (?), I get a conversion error.
I have looked at many posts that advocate a temp table or a function, which I want to avoid: Select IN using varchar string with comma delimited values
I have contemplated building the entire SQL statement in a variable, but that doesn't seem like the right path, as I have many tables to query and transfer, and ClientId = ? works well without having to build each individual SQL statement in a variable.
Is there an easy fix I am missing? I will now turn my research to how I did this in SSRS, but I thought I should post here to see if someone has accomplished this before.
I appreciate any info on this, thank you.
EDIT: A key note is that the filtering column on Clients exists on the dev server, so I cannot just use a subquery in the WHERE clause, as the column does not exist on the production server.
EDIT: I did not mention that I am specifically looking at OLE DB sources, mapping a parameter to ? in the SQL statement.
EDIT: Narrowing down on this but having trouble relating the SSRS and SSIS functionality. In SSRS it's called a multi-value parameter; in the following link the key line is
WHERE Production.ProductInventory.ProductID IN (@ProductID)
https://msdn.microsoft.com/en-us/library/dn385719(v=sql.110).aspx
This one looks good as well
https://sqlblogcasts.com/blogs/simons/archive/2007/11/22/RS-HowTo---Pass-a-multivalue-parameter-to-a-query-using-IN.aspx
I will keep researching and thank you for the help so far.
I think this sums it up best
This functionality is limited to strictly using embedded SQL.
What SSRS does is transform your SQL column IN (@value) to column IN
(@selectedvalue1, @selectedvalue2) etc.
You need to forget anything you know about the other ways of passing
lists to SQL (i.e. building strings etc.) and make sure the data types
you declare are correct for the value of your parameter.
You do not need to use the Join(parameters!,",") trick UNLESS
you are passing the list to a stored procedure.
In which case you then need to use some function to turn the delimited
list into a rowset as you have done.
I hope that helps
The core question is whether I can get the same functionality in SSIS as in SSRS. It reminds me of macro substitution.
If you don't want to create a function, you can use the following in your T-SQL statement.
Declare @ClientIds nvarchar(50) = '123,456'; --<-- Comma delimited list of Client Ids
Select Field
From Table
Where ClientId IN (
SELECT CAST(RTRIM(LTRIM(Split.a.value('.', 'VARCHAR(100)'))) AS INT) ClientIDs
FROM (
SELECT Cast ('<X>'
+ Replace(@ClientIds, ',', '</X><X>')
+ '</X>' AS XML) AS Data
) AS t CROSS APPLY Data.nodes ('/X') AS Split(a)
)
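If the dev server happens to be SQL Server 2016 or later, STRING_SPLIT() is a simpler alternative to the XML trick; a sketch using the same placeholder table and column names from the question:

DECLARE @ClientIds nvarchar(50) = '123,456';  -- comma-delimited list of client ids

SELECT Field
FROM Table
WHERE ClientId IN (
    SELECT CAST(value AS int)           -- STRING_SPLIT returns one row per delimited item
    FROM STRING_SPLIT(@ClientIds, ',')
);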
I need to export XML data from a Sql Server table using a known schema.
The schema takes the form
<myField>value</myField>
As far as I can make out, I need to use the FOR XML clause in the SELECT statement.
All options for that clause seem to generate XML in the form of:
<element myField="value" />
i.e. data is inserted as attributes of elements instead of as element values.
It seems that the best way to deal with this need is to handle the XML file generation inside of an application.
Am I correct in this conclusion, or is it a lack of understanding of T-SQL XML capabilities on my part?
thanks
Bob
I don't know if you can get the exact format you are looking for, but this comes pretty close. The Point and DateTimeFormat columns will be repeated for each row.
SELECT Point
,DateTimeFormat
,DT AS "S/DT"
,V AS "S/V"
,Q AS "S/Q"
FROM mytable
FOR XML PATH(''), TYPE, ELEMENTS, ROOT('Data')
Check out the SQL Fiddle
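For a sense of the shape this produces (values replaced with ... as placeholders), the output for each row repeats the top-level elements and nests DT, V and Q under S:

<Data>
  <Point>...</Point>
  <DateTimeFormat>...</DateTimeFormat>
  <S>
    <DT>...</DT>
    <V>...</V>
    <Q>...</Q>
  </S>
  <Point>...</Point>
  <DateTimeFormat>...</DateTimeFormat>
  <S>
    <DT>...</DT>
    <V>...</V>
    <Q>...</Q>
  </S>
</Data>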
Is there a direct route that is pretty straightforward? (i.e. can SQL Server read XML?)
Or is it best to parse the XML and just transfer it in the usual way via ADO.NET, either as individual rows or perhaps as a batch update?
I realize there may be solutions that involve large, complex stored procs; while I'm not entirely opposed to this, I tend to prefer to keep most of my business logic in the C# code. I have seen a solution using SQLXMLBulkLoad, but it seemed to require fairly complex SQL code.
For reference, I'll be working with about 100 rows at a time with about 50 small pieces of data for each (strings and ints). This will eventually become a daily batch job.
Any code snippets you can provide would be very much appreciated.
SQL Server 2005 and up have a datatype called XML, in which you can store XML either untyped or typed with an XSD schema.
You can basically fill columns of type XML from an XML literal string, so you can easily just use a normal INSERT statement and fill the XML contents into that field.
Marc
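A sketch of what that INSERT can look like, with a hypothetical staging table and a placeholder document; the xml column accepts the literal string directly:

-- hypothetical table for the imported documents
CREATE TABLE dbo.ImportedDocs (Id int IDENTITY(1,1) PRIMARY KEY, Payload xml);

INSERT INTO dbo.ImportedDocs (Payload)
VALUES (N'<order><id>1</id><amount>9.99</amount></order>');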
You can use the function OPENXML and stored procedure sp_xml_preparedocument to easily convert your XML into rowsets.
If you are using SQL Server 2008 (or 2005), it has a native xml datatype. You can associate an XSD schema with xml variables, and insert directly into columns of type xml.
Yes, SQL Server 2005 and above can parse XML out of the box.
You use the nodes(), value() and query() methods to break it down however you want, whether into values or attributes.
Some shameless plugging:
Importing XML into SQL Server
Search XML Column in SQL
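As a minimal sketch of the nodes()/value() approach mentioned above (the Orders/Order element names and columns here are made up, since the real schema isn't shown):

DECLARE @x xml = N'<Orders><Order Id="1"><Amount>9.99</Amount></Order></Orders>';

SELECT
    o.n.value('@Id', 'int')                  AS OrderId,  -- attribute
    o.n.value('(Amount)[1]', 'decimal(9,2)') AS Amount    -- child element
FROM @x.nodes('/Orders/Order') AS o(n);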
XML data and an XML document can have different meanings.
While the xml type is good for data, it does not preserve formatting (insignificant whitespace is removed), so in some cases (e.g. configuration files) the best option is nvarchar.
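A quick way to see the whitespace behaviour: assigning a formatted literal to an xml variable and casting it back shows that the insignificant whitespace has been dropped.

DECLARE @doc xml = N'<config>
    <setting name="a">1</setting>
</config>';

SELECT CAST(@doc AS nvarchar(max)) AS RoundTripped;
-- returns <config><setting name="a">1</setting></config>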