I want to look up field values from a secondary table, where the secondary table's field names are stored as strings in a primary table.
I am querying the primary table itself to get the names of the fields I want from the secondary table.
How can I use that string containing the field name to select data from table2, given the id of a specific row?
The use case here is that table1 also contains a note field. I want to populate that note field with the contents of a specific record in table2, and I want to do this for all the records in table1: each record references a different field by field name.
Another way to look at this:
Table2 has one row we want data from, and we know the ID.
Table1 stores the list of fields we need data for from table2.
How do I get both pieces of data (field name and field value) under these constraints?
All data is varchar.
table1
fieldname   externalID
---------   ----------
myField1    001
myField2    001
table2
id    myField1                myField2                myField3
---   ---------------------   ---------------------   ---------------------
001   myField1ValueForID001   myField2ValueForID001   myField3ValueForID001
002   moredata1               moredata2               moredata3
select fieldname,
       (select [fieldname] from table2 where id = ) as fieldData
from table1
result
fieldname   fieldData
---------   ---------
myField1    myField1
myField2    myField2
desired result
fieldname   fieldData
---------   ---------------------
myField1    myField1ValueForID001
myField2    myField2ValueForID001
You can do this with a CASE expression, though I do recommend you rethink your design here:
SELECT *
INTO dbo.Table1
FROM (VALUES('myField1','001'),
('myField2','001'))V(FieldName,ExternalID);
SELECT *
INTO dbo.Table2
FROM (VALUES('001','myField1ValueForID001','myField2ValueForID001','myField3ValueForID001'),
('002','moredata1','moredata2','moredata3'))V(id,myField1,myField2,myField3);
GO
SELECT T1.FieldName,
CASE T1.FieldName WHEN 'myField1' THEN T2.myField1
WHEN 'myField2' THEN T2.myField2
WHEN 'myField3' THEN T2.myField3
END AS FieldData
FROM dbo.Table1 T1
JOIN dbo.Table2 T2 ON T1.ExternalID = T2.id;
GO
DROP TABLE dbo.Table1;
DROP TABLE dbo.Table2;
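For anyone who wants to try the lookup without a SQL Server instance, here is a minimal sketch of the same CASE technique using Python's sqlite3 (the table and column names come from the question; only the dialect differs):

```python
import sqlite3

# Rebuild the question's sample tables in an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE table1 (fieldname TEXT, externalID TEXT);
INSERT INTO table1 VALUES ('myField1', '001'), ('myField2', '001');
CREATE TABLE table2 (id TEXT, myField1 TEXT, myField2 TEXT, myField3 TEXT);
INSERT INTO table2 VALUES
  ('001', 'myField1ValueForID001', 'myField2ValueForID001', 'myField3ValueForID001'),
  ('002', 'moredata1', 'moredata2', 'moredata3');
""")

# The CASE expression picks the column whose name matches the stored string.
rows = conn.execute("""
SELECT t1.fieldname,
       CASE t1.fieldname WHEN 'myField1' THEN t2.myField1
                         WHEN 'myField2' THEN t2.myField2
                         WHEN 'myField3' THEN t2.myField3
       END AS fieldData
FROM table1 t1
JOIN table2 t2 ON t1.externalID = t2.id
""").fetchall()
```

Note the CASE must enumerate every column that could be referenced; a new column in table2 means another WHEN branch, which is part of why rethinking the design is worth considering.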
Related
I am looking for a replacement for the OUTPUT/RETURNING clause in Snowflake.
Test scenario:
I have a metadata table in which I store all the tables and columns that should be updated:
tblToUpdate   FieldToUpdate
-----------   -------------
tblA          name
tblA          lastname
tblB          middlename
tblC          address
tblC          phone
tblC          zipcode
I dynamically generate the update statements based on this table; they look like:
update tblA set name = '#tst', lastname = '#tst';
update tblB set middlename = '#tst';
update tblC set address = '#tst', phone ='#tst', zipcode = '#tst';
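For reference, the generation step can be sketched like this (a hypothetical Python helper, not the asker's actual code; the metadata rows and the '#tst' literal are taken from the question):

```python
# Rows of the metadata table: (tblToUpdate, FieldToUpdate).
metadata = [
    ("tblA", "name"), ("tblA", "lastname"),
    ("tblB", "middlename"),
    ("tblC", "address"), ("tblC", "phone"), ("tblC", "zipcode"),
]

# Group the columns by table, preserving first-seen order.
cols_by_table = {}
for tbl, col in metadata:
    cols_by_table.setdefault(tbl, []).append(col)

# One UPDATE statement per table, setting every listed column.
stmts = [
    "update {} set {};".format(tbl, ", ".join("{} = '#tst'".format(c) for c in cols))
    for tbl, cols in cols_by_table.items()
]
```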
The next step is to create a log table to store the names of the updated tables plus the ids of the updated rows.
How can I do this without creating a STREAM table for each table to be updated? The metadata table can contain from 1 to n tables, and its rows can change over time. I need a way to build the log table so it keeps track of every changed table and its rows.
Thanks!
I am currently building a SQL database to monitor access to a server room. I have table1 with the employees' details. The primary key is the employeeID field.
I have table2, which holds the transactions produced by the door reader. When a new row is inserted into table2, the RFID reader provides the time/date and employeeID. I would like table2 to auto-populate the employee name fields by matching the employeeIDs in table1 and table2.
Should I be using a SQL view to complete this task?
Table 1
EmployeeID, FirstName, LastName
Table2
Time/date, EmployeeID, FirstName, LastName
I would do something like this:
Table1
EmployeeID, FirstName, LastName
Table2
Id, Time/date, EmployeeID
When you want to view the result:
Select Table2.Time/date, Table1.EmployeeId, Table1.FirstName, Table1.LastName
From Table2
Left Join Table1 On Table2.EmployeeId = Table1.EmployeeId
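If you want that lookup available as a named object, the same join can live in a view, which answers the "should I use a view" question. A minimal sketch using Python's sqlite3 (the AccessLog view name, the TimeDate column, and the sample employee are illustrative assumptions; the column is renamed because "/" is not legal in a plain identifier):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Table1 (EmployeeID TEXT PRIMARY KEY, FirstName TEXT, LastName TEXT);
CREATE TABLE Table2 (Id INTEGER PRIMARY KEY, TimeDate TEXT, EmployeeID TEXT);
INSERT INTO Table1 VALUES ('E1', 'Ada', 'Lovelace');
INSERT INTO Table2 (TimeDate, EmployeeID) VALUES ('2024-01-01 09:00', 'E1');

-- The view resolves the name at query time, so Table2 never stores it.
CREATE VIEW AccessLog AS
SELECT t2.TimeDate, t1.EmployeeID, t1.FirstName, t1.LastName
FROM Table2 t2
LEFT JOIN Table1 t1 ON t2.EmployeeID = t1.EmployeeID;
""")

rows = conn.execute("SELECT * FROM AccessLog").fetchall()
```

Storing only the EmployeeID in Table2 avoids duplicating names, and the view recomputes the join on every read, so a later name change in Table1 shows up automatically.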
Basically I have two tables. Table1 has millions of rows; Table2 has very few.
Table1 has field1, which is a product ID (not unique). Table2 has field2, which is just the list of product IDs from Table1 that should be included in the select statement.
SELECT Table1.*
FROM Table1
JOIN Table2 ON Table2.field2 = Table1.field1
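One caveat: if field2 can itself contain the same product ID more than once, the join repeats the matching Table1 rows, whereas an EXISTS semi-join returns each Table1 row at most once. A small sqlite3 sketch of the difference (the table shapes and sample values are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Table1 (field1 TEXT, descr TEXT);
INSERT INTO Table1 VALUES ('P1', 'row a'), ('P2', 'row b'), ('P3', 'row c');
CREATE TABLE Table2 (field2 TEXT);
INSERT INTO Table2 VALUES ('P1'), ('P1'), ('P3');  -- note the duplicate P1
""")

# Plain join: the duplicate P1 in Table2 makes the P1 row appear twice.
join_rows = conn.execute("""
SELECT Table1.* FROM Table1
JOIN Table2 ON Table2.field2 = Table1.field1
""").fetchall()

# EXISTS semi-join: each qualifying Table1 row is returned once.
exists_rows = conn.execute("""
SELECT * FROM Table1
WHERE EXISTS (SELECT 1 FROM Table2 WHERE Table2.field2 = Table1.field1)
""").fetchall()
```

If field2 is guaranteed unique, the two queries return the same rows and the plain join is fine.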
I have this scenario:
Source node:
Schema1:
Table1: id, field1, field2
Table2: id, table1_id, table3_id
Table3: id
Table4: id, table1_id
Schema2:
Table5: id, table3_id, table6_id, field3, field4
Table6: id
Target Node:
Schema1:
Table7: id, field1, field2, field3, field4.
To solve this, I have created one router of "default" type, triggers for each table, and the corresponding rows in the trigger_router table. After that, I created one row in transform_table to manage transformations from Table1 in the source node to Table7 in the target node.
Here is my problem: at first I tried to create one row in transform_table for getting data from Table6 to Table7, but there is no primary key that links Table1 and Table6 directly in the source. Now I'm trying to use a lookup transformation to get field3 and field4. To achieve that, I have created a row in the transform_column table like this:
TARGET_COLUMN_NAME: field3
SOURCE_COLUMN_NAME: null
PK: 0
TRANSFORM_TYPE:lookup
TRANSFORM_EXPRESSION:
SELECT field3
FROM schema1.table1 s1t1
INNER JOIN schema1.table2 s1t2 ON s1t2.table1_id = s1t1.id
LEFT JOIN schema1.table3 s1t3 ON s1t3.id = s1t2.table3_id
LEFT JOIN schema2.table5 s2t5 ON s2t5.table3_id = s1t3.id
LEFT JOIN schema2.table6 s2t6 ON s2t6.id = s2t5.table6_id
WHERE s1t1.id = :ID
I understand that, when the transformation takes place, the :ID variable will be replaced with the id of the table1 row being processed. The problem I'm having is that field3 and field4 can be NULL in some table7 rows (as you can imagine from the LEFT JOINs in the query). So I'm getting the error
Expected a single row, but returned no rows from lookup for target
column field3 on transform source_2_target_table7_table1_schema1
Is there any way to force SymmetricDS to copy a NULL value in this column when lookup expression returns no rows? Or is there any other way to achieve this kind of synchronization?
Thanks in advance
Solved by using a BSH transformation and sqlTemplate: just return null when the sqlTemplate query returns no rows.
I am trying to find out whether I can do the following using joins instead of looping through each record.
Table1
------------
LastName
FirstName
Table2
-------------
UniqueId
LastName (full text indexed)
FirstName (full text indexed)
For each record in table1, I am trying to find out whether there are any matching records in table2.
Thanks,
sridhar
Need more info, like what you are joining on.
If you are joining on those fields, you could do:
SELECT Table2.LastName, Table2.FirstName
FROM Table2 INNER JOIN Table1 t ON t.FirstName=Table2.FirstName
AND t.LastName = Table2.LastName
This should return all rows where data matches up in both tables.
Is this what you need?
EDIT
If you want that, try this:
SELECT *
FROM Table_2 t2
INNER JOIN Table_1 t1 ON t2.lastname LIKE t1.lastname + '%'
Modify to fit your needs.
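A small runnable sketch of that prefix-match join using Python's sqlite3 (the sample names are made up; note SQLite concatenates with || where SQL Server uses +):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Table_1 (LastName TEXT, FirstName TEXT);
INSERT INTO Table_1 VALUES ('Smi', 'Jo');
CREATE TABLE Table_2 (UniqueId INTEGER, LastName TEXT, FirstName TEXT);
INSERT INTO Table_2 VALUES (1, 'Smith', 'John'), (2, 'Jones', 'Mary');
""")

# For each Table_1 row, find Table_2 rows whose last name starts with it.
rows = conn.execute("""
SELECT t2.UniqueId, t2.LastName, t2.FirstName
FROM Table_2 t2
INNER JOIN Table_1 t1 ON t2.LastName LIKE t1.LastName || '%'
""").fetchall()
```

Be aware that a leading-wildcard-free LIKE can still prevent index use in some engines, which matters if Table_2 is large; the full-text indexes the question mentions would need CONTAINS-style predicates instead.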