I have a string of email addresses. For example, "a@a.com; b@a.com; c@a.com"
My database is:
record | flag1 | flag2 | emailaddress
--------------------------------------------------------
1 | 0 | 0 | a@a.com
2 | 0 | 0 | b@a.com
3 | 0 | 0 | c@a.com
What I need to do is parse the string, and if the address is not in the database, add it.
Then, return a string of just the record numbers that correspond to the email addresses.
So, if the call is made with "A@a.com; c@a.com; d@a.com", the routine would add "d@a.com", then return "1, 3, 4", corresponding to the records that match the email addresses.
What I am doing now is calling the database once per email address to look it up and confirm it exists (adding it if it doesn't), then looping through the addresses again, one by one, from my PowerShell app to collect the record numbers.
There has to be a way to just pass all of the addresses to SQL at the same time, right?
I have it working in PowerShell, but slowly.
I'd love a response from SQL as shown above of just the record number for each email address in a single response. That is, "1,2,4" etc.
My PowerShell code is:
$EmailList2 = $EmailList.split(";")
# let's get the ID # for each email address.
foreach($x in $EmailList2)
{
$data = exec-query "select Record from emailaddresses where emailAddress = @email" -parameter @{email=$x.trim()} -conn $connection
if ($($data.Tables.record) -gt 0)
{
$ResponseNumbers = $ResponseNumbers + "$($data.Tables.record), "
}
}
$ResponseNumbers = $($ResponseNumbers+"XX").replace(", XX","")
return $ResponseNumbers
You'd have to do this in 2 steps: first INSERT the new values, then use a SELECT to get the values back. This answer uses DelimitedSplit8K (not DelimitedSplit8K_LEAD) as you're still using SQL Server 2008. On the note of 2008, I strongly suggest looking at upgrade paths soon, as you have about 6 weeks of support left.
You can use the function to split the values and then INSERT/SELECT appropriately:
DECLARE @Emails varchar(8000) = 'a@a.com;b@a.com;c@a.com';

WITH Emails AS(
    SELECT DS.Item AS Email
    FROM dbo.DelimitedSplit8K(@Emails,';') DS)
INSERT INTO dbo.YourTable (emailaddress) --I don't know what the other columns' values should be, so have excluded them
SELECT E.Email
FROM Emails E
LEFT JOIN dbo.YourTable YT ON YT.emailaddress = E.Email
WHERE YT.emailaddress IS NULL;

SELECT YT.record
FROM dbo.YourTable YT
JOIN dbo.DelimitedSplit8K(@Emails,';') DS ON DS.Item = YT.emailaddress;
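If you want the whole thing in a single round trip from PowerShell, something like the sketch below should work, assuming your exec-query helper can run a multi-statement batch and returns the results the same way it does in your loop (the @Emails parameter and the REPLACE that strips the spaces after each semicolon are my additions):
$sql = @"
DECLARE @Clean varchar(8000) = REPLACE(@Emails, ' ', '');

WITH Emails AS(
    SELECT DS.Item AS Email
    FROM dbo.DelimitedSplit8K(@Clean,';') DS)
INSERT INTO dbo.YourTable (emailaddress)
SELECT E.Email
FROM Emails E
LEFT JOIN dbo.YourTable YT ON YT.emailaddress = E.Email
WHERE YT.emailaddress IS NULL;

SELECT YT.record
FROM dbo.YourTable YT
JOIN dbo.DelimitedSplit8K(@Clean,';') DS ON DS.Item = YT.emailaddress;
"@

# One call instead of one per address; join the returned record numbers into "1, 3, 4"
$data = exec-query $sql -parameter @{Emails = $EmailList} -conn $connection
return ($data.Tables.record) -join ", "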
I have a flat file that has 6 columns: NoteID, Sequence, FileNumber, EntryDte, NoteType, and NoteText. The NoteText column holds 200 characters, and if a note is longer than 200 characters then a second row in the file contains the continuation of the note. It looks something like this:
|NoteID | Sequence | NoteText |
---------------------------------------------
|1234 | 1 | start of note text... |
|1234 | 2 | continue of note.... |
|1234 | 3 | more continuation of first note... |
|1235 | 1 | start of new note.... |
How can I, in SSIS, combine the multiple rows of NoteText into one row so the result would look like this:
| NoteID | Sequence | NoteText |
---------------------------------------------------
|1234 | 1 | start of note text... continue of note... more continuation of first note... |
|1235 | 1 | start of new note.... |
Greatly appreciate any help!
Update: Changing the SynchronousInputID to None exposed the Output0Buffer and I was able to use it. Below is what I have in place now.
Dim NoteID As String = "-1"
Dim NoteString As String = ""
Dim IsFirstRow As Boolean = True
Dim NoteBlob As Byte()
Dim enc As New System.Text.ASCIIEncoding()
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
If Row.NoteID.ToString() = NoteID Then
NoteString += Row.NoteHTML
IsFirstRow = True
Else
If IsFirstRow Then
Output0Buffer.AddRow()
IsFirstRow = False
End If
NoteID = Row.NoteID.ToString()
NoteString = Row.NoteHTML.ToString()
End If
NoteBlob = enc.GetBytes(NoteString)
Output0Buffer.SingleNoteHTML.AddBlobData(NoteBlob)
Output0Buffer.ClaimID = Row.ClaimID
Output0Buffer.UserID = Row.UserID
Output0Buffer.NoteTypeLookupID = Row.NoteTypeLookupID
Output0Buffer.DateCreatedUTC = Row.DateCreated
Output0Buffer.ActivityDateUTC = Row.ActivityDate
Output0Buffer.IsPublic = Row.IsPublic
End Sub
My problem now is that I had to convert the output column from WSTR(4000) to NTEXT because some of the notes are so long. When it imports into my SQL table, it is just gibberish characters and not the actual notes.
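One thing worth checking on the gibberish: a DT_NTEXT blob expects Unicode (UTF-16) bytes, so ASCII-encoded bytes will show up as garbage once they land in the table. A minimal tweak to the script above, assuming the output column really is DT_NTEXT, would be to swap the encoder:
' Use a Unicode encoder when the output column is DT_NTEXT
Dim enc As New System.Text.UnicodeEncoding()

' ...the rest stays the same:
NoteBlob = enc.GetBytes(NoteString)
Output0Buffer.SingleNoteHTML.AddBlobData(NoteBlob)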
In SQL Server Management Studio (using SQL), you could easily combine your NoteText field using the STUFF function with FOR XML PATH to combine your row values into a single column, like this:
select distinct
       n.noteid,
       min(n.sequence) over (partition by n.noteid) as sequence,
       stuff((select ' ' + n1.NoteText
              from notes n1
              where n1.noteid = n.noteid
              order by n1.sequence
              for xml path ('')
             ), 1, 1, '') as NoteText
from notes n;
You will probably want to look into something along those lines that does a similar thing in SSIS. Check out this link on how to create a script component in SSIS to do something similar: SSIS Script Component - concat rows
SQL Fiddle Demo
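If you want to try it outside the Fiddle, here's a self-contained version against a temp table holding the sample rows from the question (names are illustrative); note the ORDER BY n1.sequence, which keeps the fragments in their original order:
CREATE TABLE #notes (noteid int, sequence int, NoteText varchar(200));
INSERT INTO #notes VALUES
    (1234, 1, 'start of note text...'),
    (1234, 2, 'continue of note....'),
    (1234, 3, 'more continuation of first note...'),
    (1235, 1, 'start of new note....');

SELECT DISTINCT
       n.noteid,
       MIN(n.sequence) OVER (PARTITION BY n.noteid) AS sequence,
       STUFF((SELECT ' ' + n1.NoteText
              FROM #notes n1
              WHERE n1.noteid = n.noteid
              ORDER BY n1.sequence
              FOR XML PATH('')
             ), 1, 1, '') AS NoteText
FROM #notes n;   -- one row per noteid, NoteText concatenated in sequence order

DROP TABLE #notes;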
Please let me know if there is any query with which I can remove the repeating entries in a row.
For example, I have a table which has a name with 9 telephone numbers:
Name Tel0 Tel1 Tel2 Tel3 Tel4 Tel5 Tel6 Tel7 Tel8
John 1 2 2 2 3 3 4 5 1
The final result should be as shown below:
Name Tel0 Tel1 Tel2 Tel3 Tel4 Tel5 Tel6 Tel7 Tel8
John 1 2 3 4 5
Regards,
Maddy
I fear that it will be more complicated to keep this format than to split the table in two as I suggested. If you insist on keeping the current schema then I would suggest that you query the row, organise the fields in application code and then perform an update on the database.
You could also try to use SQL UNION operator to give you a list of the numbers, a UNION by default will remove all duplicate rows:
SELECT Name, Tel FROM
(SELECT Name, Tel0 AS Tel FROM Person UNION
SELECT Name, Tel1 FROM Person UNION
SELECT Name, Tel2 FROM Person) Tels ORDER BY Name;
Which should give you a result set like this:
John|1
John|2
You will then have to step through the result set, saving each number into a separate variable (skipping those variables that do not exist) until the "Name" field changes.
Tel1 := Null; Tel2 := Null;
Name := ResultSet['Name'];
Tel0 := ResultSet['Tel'];
ResultSet.Next();
if (Name == ResultSet['Name']) {
Tel1 := ResultSet['Tel'];
} else {
UPDATE here.
StartAgain;
}
ResultSet.Next();
if (Name == ResultSet['Name']) {
Tel2 := ResultSet['Tel'];
} else {
UPDATE here.
StartAgain;
}
I am not recommending you do this, it is very bad use of a relational database but once implemented in a real language and debugged that should work.
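For what it's worth, here's a small Python sketch of that "query the row, organise the fields in application code, then UPDATE" approach; the Person table and Tel0..Tel8 columns follow the example above, while sqlite3 and the function name are purely illustrative:
import sqlite3  # stand-in for whatever database driver you actually use

TEL_COLS = ["Tel0", "Tel1", "Tel2", "Tel3", "Tel4", "Tel5", "Tel6", "Tel7", "Tel8"]

def dedupe_person(conn, name):
    # Fetch the nine Tel columns for one person
    row = conn.execute(
        "SELECT {} FROM Person WHERE Name = ?".format(", ".join(TEL_COLS)),
        (name,),
    ).fetchone()
    if row is None:
        return

    # Keep the first occurrence of each number, preserving order and dropping NULLs
    seen, unique = set(), []
    for tel in row:
        if tel is not None and tel not in seen:
            seen.add(tel)
            unique.append(tel)

    # Pad back out to nine columns and write the row back in one UPDATE
    unique += [None] * (len(TEL_COLS) - len(unique))
    assignments = ", ".join("{} = ?".format(col) for col in TEL_COLS)
    conn.execute(
        "UPDATE Person SET {} WHERE Name = ?".format(assignments),
        unique + [name],
    )
    conn.commit()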
I have a table in Google Datastore that holds n values in n columns, and one of them is a timestamp.
The timestamp property is defined like this, inside the table class (Java):
@Persistent
private Date timestamp;
The table is like this:
id | value | timestamp
----------------------------------------------------------
1 | ABC | 2014-02-02 21:07:40.822000
2 | CDE | 2014-02-02 22:07:40.000000
3 | EFG |
4 | GHI | 2014-02-02 21:07:40.822000
5 | IJK |
6 | KLM | 2014-01-02 21:07:40.822000
The timestamp column was added later to the table, so some rows do not have a corresponding timestamp value.
I'm trying, using Python on Google App Engine, to build an API that returns the total number of rows that have a timestamp >= some value.
For example:
-- This is just an example
SELECT * FROM myTable WHERE timestamp >= '2014-02-02 21:07:40.822000'
I've made this class, in python:
import sys
...
import webapp2
from google.appengine.ext import db
class myTable(db.Model):
value = db.StringProperty()
timestamp = datetime.datetime
class countHandler(webapp2.RequestHandler):
def get(self, tablename, timestamp):
table = db.GqlQuery("SELECT __key__ FROM " + tablename + " WHERE timestamp >= :1", timestamp )
recordsCount = 0
for p in table:
recordsCount += 1
self.response.out.write("Records count for table " + tablename + ": " + str(recordsCount))
app = webapp2.WSGIApplication([
('/count/(.*)/(.*)', countHandler)
], debug=True)
I've successfully deployed it and I'm able to call it, but for some reason I don't understand, it always says
Records count for table myTable: 0
I'm struggling with the data type for the timestamp; I think the issue is there. Any idea which type it should be declared as?
Thank you!
Your problem (as discussed in the comments as well) seems to be that you are (probably) passing a string to the GqlQuery parameters.
In order to filter your query by datetime you need to pass a datetime object into the query params. For that, take a look here at how to convert it.
Small example:
# not sure how your timestamps are formatted but supposing they are strings
# of eg 2014-02-02 21:07:40.822000
timestamp = datetime.datetime.strptime(timestamp, "%Y-%m-%d %H:%M:%S.%f" )
table = db.GqlQuery("SELECT __key__ FROM " + tablename + " WHERE timestamp >= :1", timestamp)
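To tie it together, here's a minimal sketch of how the model and handler might look once the property is declared as a db.DateTimeProperty; the strptime format assumes timestamps like 2014-02-02 21:07:40.822000, so adjust it to whatever you actually pass in the URL:
import datetime
import webapp2
from google.appengine.ext import db

class myTable(db.Model):
    value = db.StringProperty()
    timestamp = db.DateTimeProperty()   # a real datetime property, not datetime.datetime

class countHandler(webapp2.RequestHandler):
    def get(self, tablename, timestamp):
        # convert the URL string into a datetime object before querying
        ts = datetime.datetime.strptime(timestamp, "%Y-%m-%d %H:%M:%S.%f")
        table = db.GqlQuery("SELECT __key__ FROM " + tablename +
                            " WHERE timestamp >= :1", ts)
        recordsCount = 0
        for p in table:
            recordsCount += 1
        self.response.out.write("Records count for table " + tablename +
                                ": " + str(recordsCount))

app = webapp2.WSGIApplication([
    ('/count/(.*)/(.*)', countHandler)
], debug=True)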
Maybe today's been a long week, but I'm starting to run in circles with trying to figure out the logic on how to solve this.
To use the classic Orders and Items example, we have a webform that tabulates the data of an EXISTING Order e.g. saved in the db.
Now this form needs the ability to add/"mark as removed" ItemIDs from an order after the order has been 'saved'. When the form is submitted, I have two arrays:
$ogList = The original ItemIDs for the OrderID in question. (ex. [123, 456, 789])
$_POST['items'] = The modifications of ItemIDs, if any (ex. [123, 789, 1240, 944])
The intent is to compare the two arrays and:
1) Add new ItemIDs (never have been related to this OrderID before)
2) Mark as removed (set the dateRemoved) those ItemIDs that weren't $_POSTed.
The simple approach of just removing the existing ItemIDs from the Order, and adding the $_POSTed list won't work for business reasons.
PHP's array_diff() doesn't really tell me which ones are "new".
So what's the best way to do this? It appears that I'm looking at a nested foreach() loop ala:
foreach($_POST['items'] as $posted){
foreach($ogList as $ogItem){
if($posted == $ogItem){
Open to ideas here.
}
}
}
...with maybe a conditional break(1) in there? Maybe there's a better way?
Another way to perhaps explain is to show the db records. The original in this example would be:
+--------+-------------+
| itemID | dateRemoved |
+--------+-------------+
| 123 | 0 |
| 456 | 0 |
| 789 | 0 |
After the POST, the ItemIDs in this OrderID would look something like:
+--------+-------------+
| itemID | dateRemoved |
+--------+-------------+
| 123 | 0 |
| 456 | 1368029148 |
| 789 | 0 |
| 1240 | 0 |
| 944 | 0 |
Does this make sense? Any suggestions would be appreciated!
EDIT: I found JavaScript sync two arrays (of objects) / find delta, but I'm not nearly proficient enough to translate Javascript and maps. Though it gets me almost there.
Well, when you have items in the database and you receive a new document with a changed set of items, you obviously must update the database. One of the easiest ways is to delete all previous items and insert the new list. But when the items are related to other data, you can't do that. So you must apply this technique of comparing two lists: you have the items in the database, you have the items from the client after the document change, and with the commands below you can separate what needs to be inserted into the database, what just needs to be updated, and what needs to be deleted from the database. Below is a simplified example.
$a = [1,2,3,4,5]; //elements in database
$b = [2,3,4,6]; //elements after document save
$del = array_diff($a, $b);
//for delete 1, 5
$ins = array_diff($b, $a);
//for insert 6
$upd = array_intersect($a, $b);
//for update 2, 3, 4
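Applied to the arrays from the question, it looks like this (a quick sketch; the argument order is what decides whether you get the "new" or the "removed" side):
$ogList = [123, 456, 789];               // ItemIDs already attached to the order
$posted = [123, 789, 1240, 944];         // $_POST['items']

$new     = array_diff($posted, $ogList);      // [1240, 944] -> INSERT these
$removed = array_diff($ogList, $posted);      // [456]       -> stamp dateRemoved
$kept    = array_intersect($ogList, $posted); // [123, 789]  -> leave untouched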
As you can see, this simply compares elements, but if you want to insert real data, you must switch to associative arrays and compare keys. You need the array to look something like this:
{
"15":{"employee_id":"1","barcode":"444","is_active":"1"},
"16":{"employee_id":"1","barcode":"555","is_active":"1"},
"17":{"employee_id":"1","barcode":"666","is_active":"1"},
"18":{"employee_id":"1","barcode":"777","is_active":"1"}
}
Here, you have the ID extracted from the data and placed in the array key position.
On the server side, that array can easily be fetched with PDO:
$sth->fetchAll(PDO::FETCH_UNIQUE|PDO::FETCH_ASSOC);
Beware that this will strip the first column from the result set. So if you want the ID to be in both places, use: "SELECT id AS arrkey, id, ... FROM ...".
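For example, a fetch along these lines (sketched against the card table used later in this answer, with $db standing in for your PDO connection) gives you an array keyed by card_id:
$sth = $db->prepare('SELECT card_id AS arrkey, card_id, employee_id, barcode, is_active
                     FROM card WHERE employee_id = ?');
$sth->execute([$employee_id]);
$array_database = $sth->fetchAll(PDO::FETCH_UNIQUE | PDO::FETCH_ASSOC);

// $array_database now looks like:
// [
//     15 => ['card_id' => 15, 'employee_id' => 1, 'barcode' => 444, 'is_active' => 1],
//     16 => ['card_id' => 16, 'employee_id' => 1, 'barcode' => 555, 'is_active' => 1],
//     ...
// ]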
On the server side, when you build the comparison array, you don't need all the data, just fetch the IDs. That is all you need for the comparison, because only the keys matter:
[16=>16, 17=>17, 18=>18, 19=>19, 20=>20];
On the client side, with JavaScript, you have to build the same structured array of objects so it can be compared on the server. When you have new items on the client, simply use Math.random() to generate a random key, because the key doesn't matter there.
let cards = {};
cards[Math.random()] = {
employee_id: something,
barcode: something,
is_active: something
}
After that, from the client you will get an array like this:
{
"16":{"employee_id":"1","barcode":"555","is_active":"1"},
"17":{"employee_id":"1","barcode":"666","is_active":"1"},
"18":{"employee_id":"1","barcode":"777","is_active":"1"},
"0.234456523454":{"employee_id":"1","barcode":"888","is_active":"1"}
}
The item with ID 15 has been deleted, 16, 17 and 18 are changed (or not, but you will update them anyway), and a new item has been added. On the server you can apply a three-way comparison with array_diff_key and array_intersect_key.
So, to finish, here is how this server-side code looks in my case. The outer loops run over the array keys and the inner loops dynamically build the UPDATE/INSERT statements.
//card differences
$sql = 'SELECT card_id AS arrkey, card_id FROM card WHERE (employee_id = ?)';
$sth = $this->db->prepare($sql);
$sth->execute([$employee_id]);
$array_database = $sth->fetchAll(PDO::FETCH_UNIQUE|PDO::FETCH_ASSOC);
$array_client = json_decode($_POST['...'], true);
//cards update
foreach (array_intersect_key($array_database, $array_client) as $id => $data) {
$query = 'UPDATE card SET';
$updates = array_filter($array_client[$id], function ($value) {return null !== $value;}); //clear nulls
$values = [];
foreach ($updates as $name => $value) {
$query .= ' '.$name.' = :'.$name.',';
$values[':'.$name] = $value;
}
$query = substr($query, 0, -1); // remove last comma
$sth = $this->db->prepare($query . ' WHERE (card_id = ' . $id . ');');
$sth->execute($values);
}
//cards insert
foreach (array_diff_key($array_client, $array_database) as $id => $card) {
$prep = array();
foreach($card as $k => $v ) {
$prep[':'.$k] = $v;
}
$sth = $this->db->prepare("INSERT INTO card ( " . implode(', ',array_keys($card)) . ") VALUES (" . implode(', ',array_keys($prep)) . ")");
$sth->execute($prep);
}
//cards delete
foreach (array_diff_key($array_database, $array_client) as $id => $data) {
$sth = $this->db->prepare('DELETE FROM card WHERE card_id = ?');
$sth->execute([$id]);
}