Saving array to session in CodeIgniter

I have a problem when saving an array into session data in CodeIgniter.
var_dump($this->session->userdata('data')); // output is boolean false
$array = array(0 => 'abc', 1 => 'def', 2 => 'ghi');
$this->session->set_userdata(array('data' => $array, 'name' => 'my_name'));
var_dump($this->session->userdata('data')); // output is 0 => 'abc', 1 => 'def', 2 => 'ghi'
Every time the page is loaded, userdata('data') is lost, but the other userdata is fine; only this array is lost. I'm 100% sure it can work. It worked for me before I made a lot of modifications, and now I can't find the solution.
Thanks.

I have found the problem: CodeIgniter has limits on session size, and my array was too big.
More info here
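If you want to sanity-check this yourself, you can measure how big the serialized userdata actually is; a minimal sketch (the ~4KB figure is the cookie-session ceiling discussed in the answer below):
$array = array(0 => 'abc', 1 => 'def', 2 => 'ghi');
// CodeIgniter's cookie sessions serialize all userdata into the session
// cookie, so the combined size has to stay under roughly 4KB.
$size = strlen(serialize($array));
echo "Serialized size: {$size} bytes\n";
if ($size > 3000) { // leave headroom for CI's own session fields
    echo "Too big for a cookie-based session - store it in the database instead.\n";
}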

It seems that cookies are disabled in your browser.

You need to use a database. The 4KB limit is a browser limit on cookie size. It's generally good practice to keep cookies and sessions small, since every request to the server (for the same domain) sends the cookie along in its headers.
Also, a good tip for CI concerning the database session table: set the storage engine to MEMORY, so that sessions are stored in RAM instead of on disk, which makes your site quicker.
The SQL:
CREATE TABLE IF NOT EXISTS `ci_sessions` (
    `session_id` varchar(40) DEFAULT '0' NOT NULL,
    `ip_address` varchar(16) DEFAULT '0' NOT NULL,
    `user_agent` varchar(50) NOT NULL,
    `last_activity` int(10) unsigned DEFAULT 0 NOT NULL,
    `user_data` text NOT NULL,
    PRIMARY KEY (`session_id`)
);
CI Configuration (in application/config/config.php):
$config['sess_cookie_name'] = 'ci_session';
$config['sess_expiration'] = 7200;
$config['sess_encrypt_cookie'] = FALSE;
$config['sess_use_database'] = TRUE;
$config['sess_table_name'] = 'ci_sessions';
$config['sess_match_ip'] = FALSE;
$config['sess_match_useragent'] = TRUE;
$config['sess_time_to_update'] = 300;

Related

Importing Large XML file into SQL 2.5Gb

Hi, I am trying to import a large XML file into a table on my SQL Server (2014).
I have used the code below for smaller files and thought it would be OK, as this is a one-off. I kicked it off yesterday and the query was still running when I came into work today, so this is obviously the wrong route.
Here is the code:
CREATE TABLE files_index_bulk
(
    Id INT IDENTITY PRIMARY KEY,
    XMLData XML,
    LoadedDateTime DATETIME
)
INSERT INTO files_index_bulk(XMLData, LoadedDateTime)
SELECT CONVERT(XML, BulkColumn, 2) AS BulkColumn, GETDATE()
FROM OPENROWSET(BULK 'c:\scripts\icecat\files.index.xml', SINGLE_BLOB) AS x;
SELECT * FROM files_index_bulk
Can anyone point out another way of doing this, please? I've looked around at importing large files and it keeps coming back to using bulk, which I already am.
Thanks in advance.
Here is the table I want to pull all the data into:
USE [ICECATtesting]
GO
/****** Object: Table [dbo].[files_index] Script Date: 28/04/2017 20:10:44 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[files_index](
[Product_ID] [int] NULL,
[path] [varchar](100) NULL,
[Updated] [varchar](50) NULL,
[Quality] [varchar](50) NULL,
[Supplier_id] [int] NULL,
[Prod_ID] [varchar](1) NULL,
[Catid] [int] NULL,
[On_Market] [int] NULL,
[Model_Name] [varchar](250) NULL,
[Product_View] [int] NULL,
[HighPic] [varchar](1) NULL,
[HighPicSize] [int] NULL,
[HighPicWidth] [int] NULL,
[HighPicHeight] [int] NULL,
[Date_Added] [varchar](150) NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
And here is a snippet of the XML file:
<ICECAT-interface xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://data.icecat.biz/xsd/files.index.xsd">
<files.index Generated="20170427010009">
<file path="export/level4/EN/11.xml" Product_ID="11" Updated="20170329110432" Quality="SUPPLIER" Supplier_id="2" Prod_ID="PS300E-03YNL-DU" Catid="151" On_Market="0" Model_Name="Satellite 3000-400" Product_View="587591" HighPic="" HighPicSize="0" HighPicWidth="0" HighPicHeight="0" Date_Added="20050627000000">
</file>
<file path="export/level4/EN/12.xml" Product_ID="12" Updated="20170329110432" Quality="ICECAT" Supplier_id="7" Prod_ID="91.42R01.32H" Catid="151" On_Market="0" Model_Name="TravelMate 740LF" Product_View="40042" HighPic="http://images.icecat.biz/img/norm/high/12-31699.jpg" HighPicSize="19384" HighPicWidth="170" HighPicHeight="192" Date_Added="20050627000000">
</file>
<file path="export/level4/EN/13.xml" Product_ID="13" Updated="20170329110432" Quality="SUPPLIER" Supplier_id="2" Prod_ID="PP722E-H390W-NL" Catid="151" On_Market="0" Model_Name="Portégé 7220CT / NW2" Product_View="37021" HighPic="http://images.icecat.biz/img/norm/high/13-31699.jpg" HighPicSize="27152" HighPicWidth="280" HighPicHeight="280" Date_Added="20050627000000">
</file>
The max size of an XML column value in SQL Server is 2GB. It will not be possible to import a 2.5GB file into a single XML column.
UPDATE
Since your underlying objective is to transform XML elements within the file into table rows, you don't need to stage the entire file contents into a single XML column. You can avoid the 2GB limitation, reduce memory requirements, and improve performance by shredding the XML in client code and using a bulk insert technique to insert batches of multiple rows.
The example PowerShell script below uses an XmlTextReader to avoid reading the entire XML into a DOM, and uses SqlBulkCopy to insert batches of many rows at once. The combination of these techniques should allow you to insert millions of rows in minutes rather than hours. The same techniques can be implemented in a custom app or an SSIS script task.
I noticed a couple of the table columns specify varchar(1), yet the XML attribute values contain many characters. You'll need to either expand the length of those columns or transform the source values.
[String]$global:connectionString = "Data Source=YourServer;Initial Catalog=YourDatabase;Integrated Security=SSPI";
[System.Data.DataTable]$global:dt = New-Object System.Data.DataTable;
[System.Xml.XmlTextReader]$global:xmlReader = New-Object System.Xml.XmlTextReader("C:\FilesToImport\files.xml");
[Int32]$global:batchSize = 10000;

# Copies the attributes of the current <file> element into a new DataTable row.
Function Add-FileRow() {
    $newRow = $dt.NewRow();
    $null = $dt.Rows.Add($newRow);
    $newRow["Product_ID"] = $global:xmlReader.GetAttribute("Product_ID");
    $newRow["path"] = $global:xmlReader.GetAttribute("path");
    $newRow["Updated"] = $global:xmlReader.GetAttribute("Updated");
    $newRow["Quality"] = $global:xmlReader.GetAttribute("Quality");
    $newRow["Supplier_id"] = $global:xmlReader.GetAttribute("Supplier_id");
    $newRow["Prod_ID"] = $global:xmlReader.GetAttribute("Prod_ID");
    $newRow["Catid"] = $global:xmlReader.GetAttribute("Catid");
    $newRow["On_Market"] = $global:xmlReader.GetAttribute("On_Market");
    $newRow["Model_Name"] = $global:xmlReader.GetAttribute("Model_Name");
    $newRow["Product_View"] = $global:xmlReader.GetAttribute("Product_View");
    $newRow["HighPic"] = $global:xmlReader.GetAttribute("HighPic");
    $newRow["HighPicSize"] = $global:xmlReader.GetAttribute("HighPicSize");
    $newRow["HighPicWidth"] = $global:xmlReader.GetAttribute("HighPicWidth");
    $newRow["HighPicHeight"] = $global:xmlReader.GetAttribute("HighPicHeight");
    $newRow["Date_Added"] = $global:xmlReader.GetAttribute("Date_Added");
}

try
{
    # init data table schema from the target table (WHERE 0 = 1 returns no rows)
    $da = New-Object System.Data.SqlClient.SqlDataAdapter("SELECT * FROM dbo.files_index WHERE 0 = 1;", $global:connectionString);
    $null = $da.Fill($global:dt);
    $bcp = New-Object System.Data.SqlClient.SqlBulkCopy($global:connectionString);
    $bcp.DestinationTableName = "dbo.files_index";
    $recordCount = 0;
    # stream through the document without loading it all into memory
    while($xmlReader.Read() -eq $true)
    {
        if(($xmlReader.NodeType -eq [System.Xml.XmlNodeType]::Element) -and ($xmlReader.Name -eq "file"))
        {
            Add-FileRow -xmlReader $xmlReader;
            $recordCount += 1;
            if(($recordCount % $global:batchSize) -eq 0)
            {
                # flush a full batch to the server, then reuse the DataTable
                $bcp.WriteToServer($dt);
                $dt.Rows.Clear();
                Write-Host "$recordCount file elements processed so far";
            }
        }
    }
    # write the final partial batch
    if($dt.Rows.Count -gt 0)
    {
        $bcp.WriteToServer($dt);
    }
    $bcp.Close();
    $xmlReader.Close();
    Write-Host "$recordCount file elements imported";
}
catch
{
    throw;
}
Try this. It's just another method that I have used for some time. It's pretty fast (and could be faster). I pull a huge XML DB from a gaming company every night; this is how I get it and import it.
$xml = new XMLReader();
$xml->open($xml_file); // file is your xml file you want to parse
while($xml->read() && $xml->name != 'game') { ; } // get past the header to your first record (game in my case)
while($xml->name == 'game') { // now while we are in this record
    $element = new SimpleXMLElement($xml->readOuterXML());
    $gameRec = $this->createGameRecord($element, $os); // this is my function to reduce some clutter - and I use it elsewhere too
    /* this looks confusing, but it is not. There are over 20 fields, and instead of typing them all out, I just build the string. */
    $sql = "INSERT INTO $table (";
    foreach($gameRec as $field=>$game){
        $sql .= " $field,";
    }
    $sql = rtrim($sql, ",");
    $sql .= ") values (";
    foreach($gameRec as $field=>$game) {
        $sql .= " :$field,";
    }
    $sql = rtrim($sql,",");
    $sql .= ") ON DUPLICATE KEY UPDATE "; // online game doesn't have a gamerank - not my choice LOL, so I adjust that for here
    switch ($os) {
        case 'pc' : $sql .= "gamerank = ".$gameRec['gamerank'] ; break;
        case 'mac': $sql .= "gamerank = ".$gameRec['gamerank'] ; break;
        case 'pl' : $sql .= "playercount = ".$gameRec['playercount'] ; break;
        case 'og' :
            $playercount = $this->getPlayerCount($gameRec['gameid']);
            $sql .= "playercount = ".$playercount['playercount'] ;
            break;
    }
    try {
        $stmt = $this->connect()->prepare($sql);
        $stmt->execute($gameRec);
    } catch (PDOException $e) { // Kludge
        echo 'os: '.$os.'<br/>table: '.$table.'<br/>XML LINK: '.$comprehensive_xml.'<br/>Current Record:<br/><pre>'.print_r($gameRec, true).'</pre><br/>'.
             'SQL: '.$sql.'<br/>';
        die('Line:33<br/>Function: pullBFG()<BR/>Cannot add game record <br/>'.$e->getMessage());
    }
    /// VERY VERY VERY IMPORTANT: do not forget these 2 lines, or it will go into an endless loop - I know, I've done it; it locks up your system after a bit.
    $xml->next('game');
    unset($element);
}// while there are games
This should get you started. Obviously, adjust 'game' to match your XML records, and trim out the fat I have in here.
Here is createGameRecord($element, $type='pc').
Basically, it turns the record into an array to use elsewhere, and makes it easier to add to the DB with a single line, as seen above: $stmt->execute($gameRec);, where $gameRec was returned from this function. PDO knows $gameRec is an array and will parse it out as you INSERT it. The delHardReturns() is another of my functions that gets rid of those hard returns (\r, \n, etc.), which seem to mess up the SQL. I think SQL has a function for that, but I have not pursued it.
Hope you find this useful.
private function createGameRecord($element, $type='pc') {
    if( ($type == 'pc') || ($type == 'og') ) { // player count is handled separately
        $game = array(
            'gamename' => strval($element->gamename),
            'gameid' => strval($element->gameid),
            'genreid' => strval($element->genreid),
            'allgenreid' => strval($element->allgenreid),
            'shortdesc' => $this->delHardReturns(strval($element->shortdesc)),
            'meddesc' => $this->delHardReturns(strval($element->meddesc)),
            'bullet1' => $this->delHardReturns(strval($element->bullet1)),
            'bullet2' => $this->delHardReturns(strval($element->bullet2)),
            'bullet3' => $this->delHardReturns(strval($element->bullet3)),
            'bullet4' => $this->delHardReturns(strval($element->bullet4)),
            'bullet5' => $this->delHardReturns(strval($element->bullet5)),
            'longdesc' => $this->delHardReturns(strval($element->longdesc)),
            'foldername' => strval($element->foldername),
            'hasdownload' => strval($element->hasdownload),
            'hasdwfeature' => strval($element->hasdwfeature),
            'releasedate' => strval($element->releasedate)
        );
        if($type === 'pc') {
            $game['hasvideo'] = strval($element->hasvideo);
            $game['hasflash'] = strval($element->hasflash);
            $game['price'] = strval($element->price);
            $game['gamerank'] = strval($element->gamerank);
            $game['gamesize'] = strval($element->gamesize);
            $game['macgameid'] = strval($element->macgameid);
            $game['family'] = strval($element->family);
            $game['familyid'] = strval($element->familyid);
            $game['productid'] = strval($element->productid);
            $game['pc_sysreqos'] = strval($element->systemreq->pc->sysreqos);
            $game['pc_sysreqmhz'] = strval($element->systemreq->pc->sysreqmhz);
            $game['pc_sysreqmem'] = strval($element->systemreq->pc->sysreqmem);
            $game['pc_sysreqhd'] = strval($element->systemreq->pc->sysreqhd);
            if(empty($game['gamerank'])) $game['gamerank'] = 99999;
            $game['gamesize'] = $this->readableBytes((int)$game['gamesize']);
        }// dealing with PC type
        if($type === 'og') {
            $game['onlineiframeheight'] = strval($element->onlineiframeheight);
            $game['onlineiframewidth'] = strval($element->onlineiframewidth);
        }
        $game['releasedate'] = substr($game['releasedate'],0,10);
    } else {// type = pl
        $game['playercount'] = strval($element->playercount);
        $game['gameid'] = strval($element->gameid);
    }// end type = pl else
    return $game;
}
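The delHardReturns() helper itself isn't shown above, so here is a minimal sketch of what it might look like (the name comes from the answer; the implementation is a guess at the described behavior):
// Hypothetical sketch of the delHardReturns() helper (a private method
// in the author's class): collapse hard returns (\r, \n) into single
// spaces so they don't break the generated SQL string.
function delHardReturns($str) {
    return trim(preg_replace('/[\r\n]+/', ' ', $str));
}
echo delHardReturns("line one\r\nline two\n"); // "line one line two"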
Updated: much faster. I did some research, and while my post above shows one (slow) method, I was able to find one that works even faster, at least for me.
I put this as a new answer due to the complete difference from my previous post.
LOAD XML LOCAL INFILE 'path/to/file.xml' INTO TABLE tablename ROWS IDENTIFIED BY '<xml-identifier>'
Example
<students>
<student>
<name>john doe</name>
<boringfields>bla bla bla......</boringfields>
</student>
</students>
Then, the MySQL command would be:
LOAD XML LOCAL INFILE 'path/to/students.xml' INTO TABLE tablename ROWS IDENTIFIED BY '<student>'
The ROWS IDENTIFIED BY value must be wrapped in single quotes and include the angle brackets.
When I switched to this method, I went from roughly 12 minutes to roughly 30 seconds!
One tip that worked for me: run
DELETE FROM tablename
first, otherwise LOAD XML will just append to your table.
Ref: https://dev.mysql.com/doc/refman/5.5/en/load-xml.html
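If you are driving LOAD XML from PHP rather than the mysql client, note that LOCAL INFILE has to be enabled on the connection. A minimal PDO sketch (connection details and table/file names are placeholders):
// Minimal sketch: running LOAD XML LOCAL INFILE through PDO. The MySQL
// server must also permit it (local_infile = 1).
$pdo = new PDO(
    'mysql:host=localhost;dbname=mydb;charset=utf8',
    'user',
    'password',
    array(PDO::MYSQL_ATTR_LOCAL_INFILE => true)
);
$pdo->exec("DELETE FROM students"); // LOAD XML appends, so clear the table first
$pdo->exec("LOAD XML LOCAL INFILE 'path/to/students.xml'
            INTO TABLE students
            ROWS IDENTIFIED BY '<student>'");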

database not inserting using PDO prepared statements

I have a PayPal form which is submitted and working. This is part of the IPN, which means I cannot see the error reporting on the page, so I am a little blind as to what the issue is.
Everything looks fine, I think.
Here is the code:
$newcustom = rtrim($_POST['custom'], ',');
$buyingarray = explode('~', $newcustom);
$name = $buyingarray[1];
$phone = $buyingarray[2];
$email = $buyingarray[3];
$comments = $buyingarray[4];
$date = $buyingarray[5];
$time = explode(",",$tt[6]);
$person = explode(",",$tt[7]);
$orderID = $tt[0];
$booking_date = strtotime($date);
$nowitsdate = date('Y-m-d G:i:s', strtotime("now"));
$addreservation = $pdo->prepare("INSERT INTO reservations (id, dateCreated, name, email, phone, comments, status, eventID, voucherCode, voucherplace, OrderID) VALUES ('', :dateCreated, :name, :email, :phone, :comments, 1, NULL, '', 'PayPal Purchase', :OrderID)");
$addreservation->execute(array(':dateCreated' => $nowitsdate,':name' => $name,':email' => $email,':phone' => $phone,':comments' => $comments,':OrderID' => $orderID));
$addreservation_num = $addreservation->rowCount();
if($addreservation_num == 0){ $inserterror = $pdo->query("INSERT INTO testipn (id, testing, testing2) VALUES ('','Input into reservations died','')"); exit();}
Now, I know that everything is OK with the details coming in, as I have written each of the variables into the testipn table, which tells me whether an error occurred while the IPN is being used.
As far as I can see everything looks fine, yet it still is not finding the problem.
The only thing I can think it can be is the NULL within eventID; maybe that is causing an issue?
If an ID is not present, which it won't be within this table, then it needs to be NULL.
The rows affected is 0, which means my error row in the database says: Input into reservations died
Thanks for any input into finding a solution :)
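Since the IPN runs headless, one way to stop flying blind is to make PDO throw exceptions and log them, rather than inferring failure from rowCount(). A minimal sketch against the code above (assuming the same $pdo connection and variables; omitting the auto-increment id column is a guess at the likely culprit, since inserting '' into an INT column fails under MySQL strict mode):
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
try {
    // Leave the auto-increment id out of the column list instead of
    // passing '', and keep eventID as a literal NULL.
    $addreservation = $pdo->prepare(
        "INSERT INTO reservations
            (dateCreated, name, email, phone, comments, status,
             eventID, voucherCode, voucherplace, OrderID)
         VALUES
            (:dateCreated, :name, :email, :phone, :comments, 1,
             NULL, '', 'PayPal Purchase', :OrderID)"
    );
    $addreservation->execute(array(
        ':dateCreated' => $nowitsdate,
        ':name'        => $name,
        ':email'       => $email,
        ':phone'       => $phone,
        ':comments'    => $comments,
        ':OrderID'     => $orderID,
    ));
} catch (PDOException $e) {
    // Write the real failure somewhere readable, since there is no page output.
    error_log('IPN insert failed: ' . $e->getMessage());
}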

Cannot understand how Entity Framework will generate a SQL statement for an Update operation using timestamp?

I have the following method inside my ASP.NET MVC web application:
var rack = IT.ITRacks.Where(a => !a.Technology.IsDeleted && a.Technology.IsCompleted);
foreach (var r in rack)
{
    long? it360id = technology[r.ITRackID];
    if (it360resource.ContainsKey(it360id.Value))
    {
        long? CurrentIT360siteid = it360resource[it360id.Value];
        if (CurrentIT360siteid != r.IT360SiteID)
        {
            r.IT360SiteID = CurrentIT360siteid.Value;
            IT.Entry(r).State = EntityState.Modified;
            count = count + 1;
        }
    }
    IT.SaveChanges();
}
When I checked SQL Server Profiler, I noted that EF generated the following SQL statement:
exec sp_executesql N'update [dbo].[ITSwitches]
set [ModelID] = @0, [Spec] = null, [RackID] = @1, [ConsoleServerID] = null, [Description] = null, [IT360SiteID] = @2, [ConsoleServerPort] = null
where (([SwitchID] = @3) and ([timestamp] = @4))
select [timestamp]
from [dbo].[ITSwitches]
where @@ROWCOUNT > 0 and [SwitchID] = @3',N'@0 int,@1 int,@2 bigint,@3 int,@4 binary(8)',@0=1,@1=539,@2=1502,@3=1484,@4=0x00000000000EDCB2
I cannot understand the purpose of having the following section:
select [timestamp]
from [dbo].[ITSwitches]
where @@ROWCOUNT > 0 and [SwitchID] = @3',N'@0 int,@1 int,@2 bigint,@3 int,@4 binary(8)',@0=1,@1=539,@2=1502,@3=1484,@4=0x00000000000EDCB2
Can anyone advise?
Entity Framework uses timestamps to check whether a row has changed. If the row has changed since the last time EF retrieved it, then it knows it has a concurrency problem.
Here's an explanation:
http://www.remondo.net/entity-framework-concurrency-checking-with-timestamp/
This is because EF (and you) want to update the client-side object with the newly generated rowversion value.
First the update is executed. If it succeeds (because the rowversion is still the one you had in the client), a new rowversion is generated by the database and EF retrieves that value. Suppose you immediately wanted to make a second update: that would be impossible if you didn't have the new rowversion.
This happens with all properties that are marked as identity or computed (by DatabaseGeneratedOption).

Yii MANY_MANY on clause

I have the following tables in my database.
subscriber(
    subscriber_id int,
    status char(1),
    ...
)
sendlist(
    sendlist_id int,
    ...
)
subscriber_list(
    subscriber_id int,
    sendlist_id int,
    status char(1)
)
In my SendList model I have a relation defined as follows:
'Subscribers'=>array(self::MANY_MANY, 'Subscriber', 'subscriber_list(sendlist_id, subscriber_id)',"on"=>"Subscribers_Subscribers.status='a' and Subscribers.status='a'")
I have tried setting both the on and condition clauses, and one or the other; however, the sole Subscriber in the table at present (linked to this sendlist) is ALWAYS returned, regardless of whether its status in the subscriber table or the subscriber_list table is set to 's'.
When I check the DB, the query that the join supposedly generates is something along the lines of:
SELECT `Subscribers`.`subscriber_id` AS `t1_c0`, `Subscribers`.`full_phone` AS `t1_c1`, `Subscribers`.`status` AS `t1_c2`, `Subscribers`.`contact` AS `t1_c3`, `Subscribers`.`client_id` AS `t1_c4`, `Subscribers`.`date_joined` AS `t1_c5` FROM `subscriber` `Subscribers`
INNER JOIN `subscriber_list` `Subscribers_Subscribers` ON (`Subscribers_Subscribers`.`sendlist_id`=7075) AND (`Subscribers`.`subscriber_id`=`Subscribers_Subscribers`.`subscriber_id`) AND (Subscribers.status='a' and Subscribers_Subscribers.status='a')
WHERE (Subscribers.status='a' and Subscribers_Subscribers.status='a');
returns an empty set (as it should). But when I print_r the $list->Subscribers array, I get the following:
Array(
[0] => Subscriber Object
(
...
[_attributes:CActiveRecord:private] => Array
(
[subscriber_id] => 2043221
[full_phone] => 447944426885
[status] => s
[contact] => 0
[client_id] => 14002
[date_joined] =>
)
so it recognises that the subscriber is status 's', but still loads it!
The code that causes the issue is here:
$client = new Client();
$client->save(false);
//ensure that we are working with a clean subscriber set.
$subscribers = Subscriber::model()->findAll();
foreach($subscribers as $subscriber){
    $subscriber->delete();
}
//create a new subscriber with a set of filter options.
$subscriber = new Subscriber();
$subscriber->client_id = $client->client_id;
$subscriber->full_phone = "44712345678";
$subscriber->country = "England";
$subscriber->gender = "Male";
$subscriber->save(false);
//create a new list with a set of matching options
$list = new SendList();
$list->client_id = $client->client_id;
$list->gender = "=Male";
$list->save(false);
//confirm that the subscriber can be added to the list.
self::assertTrue($list->canImport($subscriber));
$list->import();
//confirm that the subscriber has been added to the list.
self::assertEquals(1, $list->active_subscriber_count);
self::assertEquals(1, sizeof($list->Subscribers));
//remove all subscribers (set status to 's')
$list->remove();
//check that the list no longer records the subscribers presence.
self::assertEquals(0, $list->active_subscriber_count);
self::assertEquals(1, $list->stopped_subscriber_count);
//check that the subscriber has no lists associated with it.
$subscriber->refresh();
self::assertEquals('s', $subscriber->status, sizeof($subscriber->SendLists));
//attempts to refresh the list. Tested without these with the same result.
$list_id = $list->sendlist_id;
unset($list);
unset($subscriber);
$list2 = SendList::model()->findByPk($list_id);
$list2->refresh();
//check that the list has no subscribers - this fails.
self::assertEquals(0, sizeof($list2->Subscribers));
Have I missed something silly?

Grails keeps deleting my tables

I have my table structure like:
CREATE TABLE test_two_tabel.T1 ( T1_ID INT NOT NULL AUTO_INCREMENT , A1 INT NULL , B1 VARCHAR(45) NULL , C1 VARCHAR(45) NULL , D1 DATETIME NULL , PRIMARY KEY (T1_ID) );
In Grails:
package twotables

class T1 {
    Integer a
    String b
    String c
    Date d

    static mapping = {
        table "T1"
        version false
        id column:"T1_ID"
        a1 column:"a1"
        b1 column:"b1"
        c1 column:"c1"
        d1 column:"d1"
    }
    static constraints = {
        id()
        a1()
        b1()
        c1()
        d1()
    }
}
Every time I execute my program, Grails deletes my tables in the DB. Does anyone know what's happening?
You need to change the value of dbCreate from 'create-drop' to 'update' in grails-app/conf/DataSource.groovy.
Your current value is probably:
development {
    dataSource {
        dbCreate = "create-drop" // one of 'create', 'create-drop', 'update'
        url = "***"
    }
}
This means that Grails will recreate all tables on every restart. If you set this to 'update', Grails will instead try to update the table structure according to your domain model classes.
You can read more about Grails DB configuration at http://www.grails.org/doc/latest/guide/3.%20Configuration.html#3.3%20The%20DataSource
It could be a few things. As @splix mentioned, it could be the 'create-drop' setting.
Also, if you never changed your datasource, Grails uses an in-memory database by default, so it only lasts as long as the program runs. You can tell HSQLDB to persist to a file instead of staying in memory, or you can change it to point to something like MySQL. Look here.
