loop through and compare all results of two foreach loops?

I have 2 foreach loops:
Loop 1: goes through a JSON file
Loop 2: goes through a database
I want to compare the results from the JSON to the results in the database.
// TRANSACTIONS
$transaction = json_decode(transaction_list($wallet_id), true);
// DATABASE ROWS
$rows = $db->run('SELECT * FROM transactions');
// loop through transactions
foreach ($transaction as $tx) {
    $tx_id = $tx['id'];
}
// loop through database
foreach ($rows as $row) {
    $id = $row['id'];
}
echo $id . " | " . $tx_id . "\n";
The results are only one line; I'd like to get results for all the lines.
24d418b322e889e39d8e4bf3b8d6060e479d40032658cb9b080ff6d615eee9cf |
7c6c161695a21ad9143b1f3e242d176880b3484ebb1f6c820772c92ece916bdb
How do I get all the results from the database?
I tried a for loop to count the rows and produce one result for each one, but I just got the above line 12 times instead of 12 different results.
The goal is to be able to compare the JSON file's results to the database and, if they're different, to insert the JSON that's not already in the database into the database. For example, if the $tx_id doesn't exist, add the new transaction to the database. Maybe I'm doing this wrong?
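For what it's worth, a minimal sketch of one way to do this (assuming, as above, that each JSON transaction carries an id field, and assuming your $db->run() wrapper also accepts bound parameters, which is a guess about your wrapper):
// collect every transaction id already in the database
$existing = [];
foreach ($rows as $row) {
    $existing[] = $row['id'];
}
// insert any JSON transaction whose id is not in the database yet
foreach ($transaction as $tx) {
    if (!in_array($tx['id'], $existing, true)) {
        // hypothetical insert; adjust the column list to your schema
        $db->run('INSERT INTO transactions (id) VALUES (?)', [$tx['id']]);
    }
}
The echo in the original runs after both loops have finished, so $id and $tx_id only ever hold the last values; collecting the database IDs into an array first is what makes the per-transaction comparison possible.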

Related

Perl performance is slow, file I/O issue or due to while loop

I have the following code in my while loop and it is significantly slow; any suggestions on how to improve this?
open IN, "<$FileDir/$file" || Err( "Failed to open $file at location: $FileDir" );
my $linenum = 0;
while ( $line = <IN> ) {
    if ( $linenum == 0 ) {
        Log(" This is header line : $line");
        $linenum++;
    } else {
        $linenum++;
        my $csv = Text::CSV_XS->new();
        my $status = $csv->parse($line);
        my @val = $csv->fields();
        $index = 0;
        Log("number of parameters for this file is: $sth->{NUM_OF_PARAMS}");
        for ( $index = 0; $index <= $#val; $index++ ) {
            if ( $index < $sth->{NUM_OF_PARAMS} ) {
                $sth->bind_param( $index + 1, $val[$index] );
            }
        }
        if ( $sth->execute() ) {
            $ifa_dbh->commit();
        } else {
            Log("line $linenum insert failed");
            $ifa_dbh->rollback();
            exit(1);
        }
    }
}
By far the most expensive operation there is accessing the database server; it's a network round trip, hundreds of milliseconds or some such, each time.
Are those DB operations inserts, as they appear? If so, instead of inserting row by row, construct a string for an insert statement with multiple rows, in principle as many as there are, in that loop. Then run that one transaction.
Test and scale down as needed if that adds up to too many rows. You can keep adding rows to the string for the insert statement up to a decided maximum number, insert that batch, then keep going.†
A few more readily seen inefficiencies:
Don't construct an object every time through the loop. Build it once before the loop, then use/repopulate it as needed inside the loop. Then there is no need for parse+fields here, while getline is also a bit faster.
You don't need that if statement for every read. First read one line of data, and that's your header. Then enter the loop, without ifs.
Altogether, without placeholders which now may not be needed, something like
my $csv = Text::CSV_XS->new({ binary => 1, auto_diag => 1 });
# There's a $table earlier, with its @fields to populate
my $qry = "INSERT into $table (" . join(',', @fields) . ") VALUES ";
open my $IN, '<', "$FileDir/$file"
    or Err( "Failed to open $file at location: $FileDir" );
my $header_arrayref = $csv->getline($IN);
Log( "This is header line : @$header_arrayref" );
my @sql_values;
while ( my $row = $csv->getline($IN) ) {
    # Use as many elements in the row (@$row) as there are @fields
    push @sql_values, '(' .
        join(',', map { $dbh->quote($_) } @$row[0..$#fields]) . ')';
    # May want to do more to sanitize input further
}
$qry .= join ', ', @sql_values;
# Now $qry is ready. It is
# INSERT into table_name (f1,f2,...) VALUES (v11,v12...), (v21,v22...),...
$dbh->do($qry) or die $DBI::errstr;
I've also corrected the error handling when opening the file, since that || in the question binds too tightly in this case, so there's effectively open IN, ( "<$FileDir/$file" || Err(...) ). We need or instead of || there. Then, the three-argument open is better. See perlopentut.
If you do need the placeholders, perhaps because you can't have a single insert but it must be broken into many or for security reasons, then you need to generate the exact ?-tuples for each row to be inserted, and later supply the right number of values for them.
You can assemble the data first and then build the ?-tuples based on it:
my $qry = "INSERT into $table (", join(',', #fields), ") VALUES ";
...
my #data;
while ( my $row = $csv->getline($IN) ) {
push #data, [ #$row[0..$#fields] ];
}
# Append the right number of (?,?...),... with the right number of ? in each
$qry .= join ', ', map { '(' . join(',', ('?')x#$_) . ')' } #data;
# Now $qry is ready to bind and execute
# INSERT into table_name (f1,f2,...) VALUES (?,?,...), (?,?,...), ...
$dbh->do($qry, undef, map { #$_ } #data) or die $DBI::errstr;
This may generate a very large string, which may push the limits of your RDBMS or some other resource. In that case break @data into smaller batches. Then prepare the statement with the right number of (?,?,...) row-values for a batch, and execute in the loop over the batches.‡
Finally, another way altogether is to load data directly from a file using the database's tool for that particular purpose. This will be far faster than going through DBI, probably even accounting for the need to process your input CSV into another file containing only the needed data.
Since you don't need all data from your input CSV file, first read and process the file as above and write out a file with only the needed data (@data above). Then there are two possible ways:
Either use an SQL command for this – COPY in PostgreSQL, LOAD DATA [LOCAL] INFILE in MySQL and Oracle (etc); or,
Use a dedicated tool for importing/loading files from your RDBMS – mysqlimport (MySQL), SQL*Loader/sqlldr (Oracle), etc. I'd expect this to be the fastest way
The second of these options can also be done from within a program, by running the appropriate tool as an external command via system (or, better yet, via the suitable libraries).
† In one application I've put together as much as millions of rows in the initial insert -- the string itself for that statement was in the high tens of MB -- and that keeps running with ~100k rows inserted in a single statement daily, for a few years by now. This is PostgreSQL on good servers, and of course YMMV.
‡ Some RDBMSs do not support a multi-row (batch) insert query like the one used here; in particular, Oracle seems not to. (We were informed in the end that that's the database used here.) But there are other ways to do it in Oracle; please see links in comments, and search for more. Then the script will need to construct a different query, but the principle of operation is the same.

PHPSpreadsheet to update an existing file writes only the last record of the query

Hello, I am trying to write to an existing xlsx file using PhpSpreadsheet with setActiveSheetIndexByName(sheetname) and setCellValue with a reference and value, but it updates only the last record. I have spent more than 12 hours on this.
I tried foreach instead of while and used a counter to increment, but neither worked.
<?php
include_once('db.php');
$prospect = $_REQUEST['prospect'];
require 'vendor/autoload.php';
use PhpOffice\PhpSpreadsheet\IOFactory;
use PhpOffice\PhpSpreadsheet\Spreadsheet;
use PhpOffice\PhpSpreadsheet\Writer\Xlsx;
$sql1 = mysqli_query($db,"select filename,sheetname, row, responsecol,compliancecol,response, compliance from spreadsheet where `prospect`='$prospect' and response <>'' order by row");
//$row=1;
while($row1 = mysqli_fetch_assoc($sql1))
{
    $filename = $row1['filename'];   // test.xlsx
    $sheetname = $row1['sheetname']; // mysheet
    $responsecol = $row1['responsecol'].$row1['row']; // D1
    $response = $row1['response'];   // response
    $compliancecol = $row1['compliancecol'].$row1['row']; // C1
    $compliance = $row1['compliance']; // compliance
    $spreadsheet = \PhpOffice\PhpSpreadsheet\IOFactory::load($filename);
    $spreadsheet->setActiveSheetIndexByName($sheetname)
        ->setCellValue($compliancecol,$compliance)
        ->setCellValue($responsecol,$response);
    //$row++;
}
$writer = IOFactory::createWriter($spreadsheet, 'Xlsx');
$writer->save("newfile.xlsx");
exit;
?>
I want each row from the mysqli result to update its referenced cell with the value.
The easiest way is to set a LIMIT of 1 on your MySQL query. That takes only one value from your data. If you want the last one, you should sort DESC.
$sql1 = mysqli_query($db,"select filename,sheetname, row, responsecol,compliancecol,response, compliance from spreadsheet where `prospect`='$prospect' and response <>'' order by row DESC LIMIT 1");

Splitting column with XML data

I have a SQL column named "details" and it contains the following data:
<changes><RoundID><new>8394</new></RoundID><RoundLeg><new>JAYS CLOSE AL6 Odds(1 - 5)</new></RoundLeg><SortType><new>1</new></SortType><SortOrder><new>230</new></SortOrder><StartDate><new>01/01/2009</new></StartDate><EndDate><new>01/01/2021</new></EndDate><RoundLegTypeID><new>1</new></RoundLegTypeID></changes>
<changes><RoundID><new>8404</new></RoundID><RoundLeg><new>HOLLY AREA AL6 (1 - 9)</new></RoundLeg><SortType><new>1</new></SortType><SortOrder><new>730</new></SortOrder><StartDate><new>01/01/2009</new></StartDate><EndDate><new>01/01/2021</new></EndDate><RoundLegTypeID><new>1</new></RoundLegTypeID></changes>
<changes><RoundID><new>8379</new></RoundID><RoundLeg><new>PRI PARK AL6 (1 - 42)</new></RoundLeg><SortType><new>1</new></SortType><SortOrder><new>300</new></SortOrder><StartDate><new>01/01/2009</new></StartDate><EndDate><new>01/01/2021</new></EndDate><RoundLegTypeID><new>1</new></RoundLegTypeID></changes>
What is the easiest way to separate this data out into individual columns? (It is all in one column.)
Try this:
SELECT DATA.query('/changes/RoundID/new/text()') AS RoundID
,DATA.query('/changes/RoundLeg/new/text()') AS RoundLeg
,DATA.query('/changes/SortType/new/text()') AS SortType
-- And so on and so forth
FROM (SELECT CONVERT(XML, Details) AS DATA
FROM YourTable) AS T
Once you get your result set from the SQL (MySQL or whatever), you will probably have an array of strings. As I understand your question, you want to know how to extract each of the XML nodes contained in the string stored in the column in question. You could loop through the results from the SQL query and extract the data that you want. In PHP it would look like this:
// Set a counter variable for the first dimension of the array; this will
// number the result sets, so for each row in the table you will have a
// numeric identifier in the corresponding array.
$i = 0;
$output = array();
foreach ($results as $result) {
    $xml = simplexml_load_string($result);
    // Use SimpleXML to extract the node data just by using the names of the
    // XML nodes, and give each the same name in the array's second dimension.
    $output[$i]['RoundID'] = $xml->RoundID->new;
    $output[$i]['RoundLeg'] = $xml->RoundLeg->new;
    // Simply create more array items here for each of the elements you want
    $i++;
}
foreach ($output as $out) {
    // Step through the created array and do what you like with it.
    echo $out['RoundID']."\n";
    var_dump($out);
}

Compare two arrays to add or remove records from database

Maybe today's been a long week, but I'm starting to run in circles with trying to figure out the logic on how to solve this.
To use the classic Orders and Items example, we have a webform that tabulates the data of an EXISTING Order, i.e. one already saved in the db.
Now this form needs the ability to add/"mark as removed" ItemIDs from an order after the order has been 'saved'. When the form is submitted, I have two arrays:
$ogList = The original ItemIDs for the OrderID in question. (ex. [123, 456, 789])
$_POST['items'] = The modifications of ItemIDs, if any (ex. [123, 789, 1240, 944])
The intent is to compare the two arrays and:
1) Add new ItemIDs (ones that have never been related to this OrderID before).
2) Mark as removed (set dateRemoved on) those ItemIDs that weren't $_POSTed.
The simple approach of just removing the existing ItemIDs from the Order, and adding the $_POSTed list won't work for business reasons.
PHP's array_diff() doesn't really tell me which ones are "new".
So what's the best way to do this? It appears that I'm looking at a nested foreach() loop a la:
foreach ($_POST['items'] as $posted) {
    foreach ($ogList as $ogItem) {
        if ($posted == $ogItem) {
            // Open to ideas here.
        }
    }
}
...with maybe a conditional break in there? Maybe there's a better way?
Another way to perhaps explain is to show the db records. The original in this example would be:
+--------+-------------+
| itemID | dateRemoved |
+--------+-------------+
|    123 |           0 |
|    456 |           0 |
|    789 |           0 |
+--------+-------------+
After the POST, the ItemIDs in this OrderID would look something like:
+--------+-------------+
| itemID | dateRemoved |
+--------+-------------+
|    123 |           0 |
|    456 |  1368029148 |
|    789 |           0 |
|   1240 |           0 |
|    944 |           0 |
+--------+-------------+
Does this make sense? Any suggestions would be appreciated!
EDIT: I found JavaScript sync two arrays (of objects) / find delta, but I'm not nearly proficient enough to translate the JavaScript and its maps. Though it gets me almost there.
Well, when you have items in the database and you receive a new document with a changed set of items, you obviously must update the database. One of the easiest ways is to delete all previous items and insert the new list. But in case the items are related to other data, you can't do that. So you must apply this technique of comparing two lists. You have items in the database, you have items from the client after the document change, and with the commands below you can separate what needs to be inserted into the database, what just needs to be updated, and what needs to be deleted from the database. Below is a simplified example.
$a = [1,2,3,4,5]; // elements in database
$b = [2,3,4,6];   // elements after document save
$del = array_diff($a, $b);       // to delete: 1, 5
$ins = array_diff($b, $a);       // to insert: 6
$upd = array_intersect($a, $b);  // to update: 2, 3, 4
As you can see, this simply compares elements, but if you want to insert real data, you must switch to associative arrays and compare keys. You need an array that looks something like this:
{
"15":{"employee_id":"1","barcode":"444","is_active":"1"},
"16":{"employee_id":"1","barcode":"555","is_active":"1"},
"17":{"employee_id":"1","barcode":"666","is_active":"1"},
"18":{"employee_id":"1","barcode":"777","is_active":"1"}
}
Here, the ID is extracted from the data and placed in the array key position.
On the server side, such an array can be easily fetched with PDO:
$sth->fetchAll(PDO::FETCH_UNIQUE|PDO::FETCH_ASSOC);
Beware that this will strip the first column from the result set, so if you want the ID in both places, use: "SELECT id AS arrkey, id, ... FROM ...".
On the server side, when you build the array for the comparison, you don't need all the data, just the IDs. That is all the comparison needs, because only the keys matter:
[16=>16, 17=>17, 18=>18, 19=>19, 20=>20];
On the client side, with JavaScript, you have to build the same structured array of objects so it can be compared on the server. When you have new items on the client, simply use Math.random() to generate a random key, because its value doesn't matter.
let cards = {};
cards[Math.random()] = {
    employee_id: something,
    barcode: something,
    is_active: something
};
After that, from the client you will get an array like this:
{
"16":{"employee_id":"1","barcode":"555","is_active":"1"},
"17":{"employee_id":"1","barcode":"666","is_active":"1"},
"18":{"employee_id":"1","barcode":"777","is_active":"1"},
"0.234456523454":{"employee_id":"1","barcode":"888","is_active":"1"}
}
The item with ID 15 is deleted; 16, 17, and 18 are changed (or not, you will update them anyway); and a new item is added. On the server you can apply a three-way comparison with array_diff_key and array_intersect_key.
So, to finish, here is how this server-side code looks in my case. The first loop is over the array keys and the second dynamically builds the UPDATE/INSERT statement.
//card differences
$sql = 'SELECT card_id AS arrkey, card_id FROM card WHERE (employee_id = ?)';
$sth = $this->db->prepare($sql);
$sth->execute([$employee_id]);
$array_database = $sth->fetchAll(PDO::FETCH_UNIQUE|PDO::FETCH_ASSOC);
$array_client = json_decode($_POST['...'], true);
//cards update
foreach (array_intersect_key($array_database, $array_client) as $id => $data) {
    $query = 'UPDATE card SET';
    $updates = array_filter($array_client[$id], function ($value) { return null !== $value; }); // clear nulls
    $values = [];
    foreach ($updates as $name => $value) {
        $query .= ' ' . $name . ' = :' . $name . ',';
        $values[':' . $name] = $value;
    }
    $query = substr($query, 0, -1); // remove last comma
    $sth = $this->db->prepare($query . ' WHERE (card_id = ' . $id . ');');
    $sth->execute($values);
}
//cards insert
foreach (array_diff_key($array_client, $array_database) as $id => $card) {
    $prep = array();
    foreach ($card as $k => $v) {
        $prep[':' . $k] = $v;
    }
    $sth = $this->db->prepare("INSERT INTO card (" . implode(', ', array_keys($card)) . ") VALUES (" . implode(', ', array_keys($prep)) . ")");
    $sth->execute($prep);
}
//cards delete
foreach (array_diff_key($array_database, $array_client) as $id => $data) {
    $sth = $this->db->prepare('DELETE FROM card WHERE card_id = ?');
    $sth->execute([$id]);
}

array declaration

I have written code in PHP to connect to and insert into an MSSQL database. I don't know much PHP because I am new to this. I used ODBC to connect to the database. The user can enter his details through the form; after submitting the form, the details get stored in the database.
While inserting rows into the database, I am trying not to insert duplicate values. For this I have written if conditions. These conditions are able to notice that the cname and name already exist in the database, but the else part after these conditions is not working, i.e. rows are not getting inserted. I put everything inside the while loop. How can I correct it?
This is my code written in PHP:
$connect = odbc_connect('ServerDB','sa', 'pwd'); // connects to the database
$query2 = "select count(*) from company"; // this is needed for the loop
$result2 = odbc_exec($connect,$query2);
while(odbc_fetch_row($result2));
{
    $count = odbc_result($result2,1);
    echo "</br>","$count";
}
$query1 = "select * from company";
$result1 = odbc_exec($connect, $query1);
# fetch the data from the database
$index = 0;
while(odbc_fetch_row($result1))
{
    $compar[$count] = odbc_result($result1, 1);
    $namearray[$count] = odbc_result($result1, 2);
    if($compar[$count] == $_POST['cname'])
    {
        echo "<script> alert(\"cname Exists\") </script>";
    }
    else if($namearray[$count] == $_POST['name'])
    {
        echo "<script> alert(\"Name Exists\") </script>";
    }
    else {
        $query = ("INSERT INTO company(cname,name) VALUES ('$_POST[cname]','$_POST[name]') ");
        $result = odbc_exec($connect, $query);
        echo "<script> alert(\"Row Inserted\") </script>";
    }
}
You generally don't allocate memory in PHP. The PHP interpreter handles it for you. Just go ahead and assign elements to your array and all the memory allocation is taken care of for you.
$index = 0;
Then you do $index++ until the last index is reached, so afterwards $index stays at e.g. 12, and everywhere you can then only access index [12].
In PHP, arrays are implemented as what would be a HashMap in Java. They are sized automatically depending on what you put into them, and there is no way to give an initial capacity.
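For example, a minimal sketch (with made-up values) showing that a PHP array grows as you assign to it, with no declaration or sizing up front:
$compar = array();        // empty array; no capacity needed
$compar[] = 'Acme Ltd';   // appended at index 0
$compar[] = 'Globex';     // appended at index 1
$compar[10] = 'Initech';  // sparse keys are fine too
echo count($compar);      // prints 3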
