Laravel insert millions of database rows from models

I have a text file containing comma-delimited values, with each line representing a record in a data set. There are about 2 million of them, and I want to parse the file, create Laravel models from the rows, and store each as a row in my database.
At this time, I have a class that parses the file line by line and creates a model for each as follows:
class LargeFileParser {

    // File reference
    protected $file;

    // Check that the file exists and create the SplFileObject
    public function __construct($filename, $mode = "r")
    {
        if (!file_exists($filename)) {
            throw new Exception("File not found");
        }
        $this->file = new \SplFileObject($filename, $mode);
    }

    // Iterate through the text or binary document
    public function iterate($type = "Text", $bytes = NULL)
    {
        if ($type == "Text") {
            return new \NoRewindIterator($this->iterateText());
        } else {
            return new \NoRewindIterator($this->iterateBinary($bytes));
        }
    }

    // Handle text iterations
    protected function iterateText()
    {
        $count = 0;
        while (!$this->file->eof()) {
            yield $this->file->fgets();
            $count++;
        }
        return $count;
    }

    // Handle binary iterations
    protected function iterateBinary($bytes)
    {
        $count = 0;
        while (!$this->file->eof()) {
            yield $this->file->fread($bytes);
            $count++;
        }
    }
}
I then have a controller (I want to be able to run this migration via a route occasionally) that handles creating and inserting the models into the database:
class CarrierDataController extends Controller
{
    // Store the data keys for a carrier model
    protected $keys;

    // Update the Carrier database with the census info
    public function updateData()
    {
        // File reference
        $file = new LargeFileParser('../storage/app/CENSUS.txt');

        // Get an iterator for the file
        $iterator = $file->iterate("Text");

        // For each line, store the data object as a carrier in the database
        foreach ($iterator as $index => $line) {
            // The first line sets the keys specified in the file
            if ($index == 0) {
                $this->keys = str_getcsv(strtolower($line), ",", '"');
            }
            // The remaining lines hold the data for each model
            else {
                if ($index <= 100) {
                    // Parse the line into an array
                    $dataArray = str_getcsv($line, ",", '"');
                    // Get a data model
                    $dataModel = $this->createCarrierModel(array_combine($this->keys, $dataArray));
                    // Store the data
                    $this->storeData($dataModel);
                } else {
                    break;
                }
            }
        }
    }

    // Return a model for the data
    protected function createCarrierModel($dataArray)
    {
        $carrier = Carrier::firstOrNew($dataArray);
        return $carrier;
    }

    // Store the carrier data in the database
    protected function storeData($data)
    {
        $data->save();
    }
}
This works perfectly... that is, while I'm limiting the function to 100 inserts. If I remove that check and allow it to run over the entire 2 million records, it no longer works. Either there is a timeout, or, if I remove the timeout via something like ini_set('max_execution_time', 6000);, I eventually get a "failed to respond" message from the browser.
My assumption is that there needs to be some sort of chunking in place, but I'm honestly not sure of the best approach for handling this volume.
Thank you in advance for any suggestions you may have.

I would create an artisan command that handles the import rather than doing this via the browser. Do you really want to make the user wait until this big file has been imported? What happens if they use the back button or close the page?
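A minimal sketch of such a command, reusing the LargeFileParser and Carrier model from the question (the command name, chunk size, namespaces and default path below are illustrative assumptions, not from the original post), buffers the parsed rows and writes each chunk with a single multi-row INSERT instead of one save() per line:

<?php

namespace App\Console\Commands;

use App\Carrier;
use App\Helpers\LargeFileParser; // or wherever the parser class lives in your app
use Illuminate\Console\Command;

class ImportCarrierData extends Command
{
    // Run with: php artisan carriers:import
    protected $signature = 'carriers:import {path=storage/app/CENSUS.txt}';
    protected $description = 'Import the census file into the carriers table in chunks';

    public function handle()
    {
        $file = new LargeFileParser(base_path($this->argument('path')));
        $iterator = $file->iterate("Text");

        $keys = [];
        $buffer = [];

        foreach ($iterator as $index => $line) {
            $line = trim($line);
            if ($line === '') {
                continue; // skip blank lines (e.g. the trailing newline at EOF)
            }

            if ($index == 0) {
                // The first line holds the column names
                $keys = str_getcsv(strtolower($line), ",", '"');
                continue;
            }

            $buffer[] = array_combine($keys, str_getcsv($line, ",", '"'));

            // Flush every 1,000 rows as a single multi-row INSERT
            if (count($buffer) >= 1000) {
                Carrier::insert($buffer);
                $buffer = [];
            }
        }

        // Insert whatever is left in the last partial chunk
        if (!empty($buffer)) {
            Carrier::insert($buffer);
        }
    }
}

Note that Carrier::insert() goes straight to the query builder, so it skips Eloquent events and automatic timestamps and does not de-duplicate the way firstOrNew() does; if you need that, add a unique index and handle conflicts separately. The important part is that memory stays bounded (the generator yields one line at a time) and the database sees a few thousand statements instead of two million.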
If you want or need some kind of user interaction, like the user uploading the file and clicking an Import button, push the import onto a job queue using e.g. Beanstalk. The aforementioned artisan command will run, import the data, and when it's done you can send the user an e-mail or a Slack notification. If you need some UI interaction, you can make the request via AJAX, and that script can poll an API endpoint for the status of the import or, since it's asynchronous, wait for completion and then show a UI notification, stop a spinner, or, in the error case, show an error message.
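If you go that route, the dispatch side can stay small. A rough sketch, assuming a hypothetical ImportCarriers job class and whatever queue driver you have configured (Beanstalkd works via the pda/pheanstalk package):

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Artisan;

class ImportCarriers implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $path;

    public function __construct($path)
    {
        $this->path = $path;
    }

    public function handle()
    {
        // Reuse the chunked artisan command sketched above, then tell the user it's done.
        Artisan::call('carriers:import', ['path' => $this->path]);
        // Mail::to($user)->send(new CarrierImportFinished()); // or a Slack notification
    }
}

The controller or route the user hits then only queues the work and returns immediately:

// In the controller
dispatch(new ImportCarriers('storage/app/CENSUS.txt'));
return response()->json(['status' => 'import queued']);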

Related

Data Logging from Cogent Datahub to a Database using gamma scripting based on a timer

Can you please help me with writing a Gamma script for logging data into a database table? The tags are inside a domain created in OPC DA of Cogent DataHub. The only condition that needs to be satisfied is that the script should log all points in the domain every second, along with their value and timestamp.
require ("Application");
require ("ODBCThreadSupport");
require ("Time");
require ("Quality");
class LogData Application
{
    DSN = "your ODBC DSN name";                // The DSN name to use for the database connection
    username = "Database_User";                // The user name for connecting to the database
    password = "*****";                        // The password for connecting to the database
    tablename = "table1";                      // The name of the database table
    cachefile = "C:\Users\AppData\cache.txt";  // Base name for the disk cache file
    domain = "Domain name";                    // The domain in which to log all points
    tableclass;
    thread;
    mappedPoints = new Dictionary();
    prevcount = 0;
}

/* If there is something we only want to perform on the first connection, we can test
 * is_first_connect to perform the code only once.
 */
method LogData.onConnect()
{
    princ ("Connection succeeded\n");
    if (.thread.is_first_connect)
    {
        // Start the sequence defined by the AddInitStage calls in the constructor
        .thread.BeginAsyncInit();
    }
}

/* If the connection fails after having been
 * connected, this method is called.
 */
method LogData.onConnectFail()
{
    princ ("Connection closed: ", SQLResult.Description, "\n");
}

/* Map the table in the set of table definitions that matches the name in .tablename
 * into a Gamma class. This lets us easily convert between class instances and rows
 * in the table.
 */
method LogData.mapTable(name, tabledefinitions)
{
    //princ("Mapping table\n");
    .tableclass = .thread.ClassFromTable(name, tabledefinitions);
    //princ("Table class = ", .tableclass, "\n");
}

method LogData.startLogging()
{
    .registerPoints();
}

/* Set up the timer or event handler functions to write to the table. */
method LogData.registerPoints()
{
    /* Find all points in the domain */
    local info = datahub_domaininfo(.domain)[0];
    if (info.n_points != .prevcount)
    {
        local points = datahub_points(.domain, nil, nil);
        local pointsym;
        princ(info.n_points - .prevcount, " new points are being added to logging\n");
        with point in points do
        {
            // Filter out branch points
            if ((point.flags & 0x30) == 0)
            {
                pointsym = symbol(string(point.domain, ":", point.name));
                if (!.mappedPoints.contains(pointsym))
                {
                    local PointName = string(pointsym);
                    .TimerEvery(01, `(#self).writeData(#PointName));
                    //.mappedPoints.add(pointsym, pointsym);
                }
            }
        }
        .prevcount = info.n_points;
    }
}

method LogData.writeData(pointsymbol)
{
    local row = new (.tableclass);
    local pttime, ptltime;
    local timestring;
    local point;
    // Generate a timestamp in database-independent format to the millisecond.
    // Many databases strip the milliseconds from a timestamp, but it is harmless
    // to provide them in case the database can store them.
    point = PointMetadata(pointsymbol);
    //princ(point,"\n");
    if (point && number_p(point.value))
    {
        pttime = WindowsTimeToUnixTime(point.timestamp);
        //princ(point,"\n");
        ptltime = localtime(pttime);
        //princ(ptltime,"\n");
        if (!ptltime)
            ptltime = localtime(0);
        timestring = format("{'%04d-%02d-%02d %02d:%02d:%02d'}",
                            ptltime.year+1900, ptltime.mon+1, ptltime.mday,
                            ptltime.hour, ptltime.min, ptltime.sec);
        //princ(timestring,"\n");
        // Fill the row. Since we mapped the table into a Gamma class, we can access
        // the rows in the column as member variables of the mapped class.
        row.ptname = string(pointsymbol);
        row.ptvalue = point.value;
        row.pttime = timestring;
        // Perform the insertion. In this case we are providing no callback on completion.
        .thread.Insert(row, nil);
    }
}

/* Write the 'main line' of the program here. */
method LogData.constructor ()
{
    // Create and configure the database connection object
    .thread = new ODBCThread();
    .thread.Configure(.DSN, .username, .password, STORE_AND_FORWARD, .cachefile, 0);
    // Query the table and map it to a class for each insertion. We want to run an asynchronous event
    // within the asynchronous initialization stage, so to do that we specify the special method
    // cbInitStage as the callback function of our asynchronous event (GetTableInfo). We deal with
    // the return from the GetTableInfo in the onSuccess argument of the init stage.
    .thread.AddInitStage(`(#.thread).GetTableInfo("", "", (#.tablename), "TABLE,VIEW",
                                                  `(#.thread).cbInitStage()),
                         `(#self).mapTable(#.tablename, SQLTables), nil);
    //.thread.AddInitStage(`(#.thread).GetTableInfo("", "", (#.tablename), "TABLE,VIEW",
    //                                              `(#self).mapTable(#.tablename, SQLTables)),
    //                     `(#.thread).cbInitStage(), nil);
    // Do not start writing data to the table until we have successfully created and mapped
    // the table to a class. If we wanted to start writing data immediately, then we would
    // create the table class beforehand instead of querying the database for the table
    // definition. Then, even if the database were unavailable we could still cache to the
    // local disk until the database was ready.
    .thread.AddInitStage(nil, `(#self).startLogging(), nil);
    // Set up the callback functions for various events from the database thread
    .thread.OnConnectionSucceeded = `(#self).onConnect();
    .thread.OnConnectionFailed = `(#self).onConnectFail();
    .thread.OnFileSystemError = `princ("File System Error: ", SQLResult, "\n");
    .thread.OnODBCError = `princ("ODBC Error: ", SQLResult, "\n");
    .thread.OnExecuteStored = nil;
    .thread.Start();
    // Create a menu item in the system tray that allows us to open a window to monitor
    // the performance of the ODBC thread. The menu strings can be edited as desired.
    .AddCustomSubMenu("ODBC Logging");
    .AddCustomMenuItem("Monitor Performance",
                       `(#.thread).CreateMonitorWindow((#self), "ODBC Monitor"));
    // Automatically update the point list every second in case new points are added
    // to the domain.
    //.TimerEvery(01, `(#self).registerPoints());
}

/* Any code to be run when the program gets shut down. */
method LogData.destructor ()
{
    if (instance_p(.thread))
        destroy(.thread);
}
/* Start the program by instantiating the class. */
ApplicationSingleton (LogData);
The major parts of this Gamma script are the constructor, the destructor, the class definition and the methods. The program first initializes an ODBC connection using the provided details and then writes each row of data using the registerPoints and writeData methods. Additional details for each line can be found in the comments within the program.

Google Apps Script - Optimizing File Creation/Modification

I reviewed the following documentation from Google on how to optimize existing Google scripts here:
https://developers.google.com/apps-script/guides/support/best-practices
In particular, the 'Use batch operations' section seems most appropriate for my use case, where the optimal strategy is to batch all the reading into one operation and then do the writing in a separate operation, rather than alternating between read and write calls.
Here is an example of inefficient code, as given by the url above:
// DO NOT USE THIS CODE. It is an example of SLOW, INEFFICIENT code.
// FOR DEMONSTRATION ONLY
var cell = sheet.getRange('a1');
for (var y = 0; y < 100; y++) {
  xcoord = xmin;
  for (var x = 0; x < 100; x++) {
    var c = getColorFromCoordinates(xcoord, ycoord);
    cell.offset(y, x).setBackgroundColor(c);
    xcoord += xincrement;
  }
  ycoord -= yincrement;
  SpreadsheetApp.flush();
}
Here is an example of efficient and improved code:
// OKAY TO USE THIS EXAMPLE or code based on it.
var cell = sheet.getRange('a1');
var colors = new Array(100);
for (var y = 0; y < 100; y++) {
  xcoord = xmin;
  colors[y] = new Array(100);
  for (var x = 0; x < 100; x++) {
    colors[y][x] = getColorFromCoordinates(xcoord, ycoord);
    xcoord += xincrement;
  }
  ycoord -= yincrement;
}
sheet.getRange(1, 1, 100, 100).setBackgroundColors(colors);
Now, for my particular use case:
Instead of storing values in an array and then writing/modifying them as a separate operation from reading them, I want to create multiple Google documents and replace placeholder text within each document.
For context:
I'm writing a script that reads a spreadsheet of students, with files to modify for each student, which are later sent as a mail merge. For example, there are 3 master files; each student gets a copy of the 3 master files, in which .replaceText is used to fill in the placeholder fields.
Here are my relevant snippets of code below:
function filesAndEmails() {
  // Import the Spreadsheet application library.
  const UI = SpreadsheetApp.getUi();
  // Try calling the functions below; catch any error messages that occur to display as alert window.
  try {
    // Prompt and record user's email draft template.
    // var emailLinkID = connectDocument(
    //   UI,
    //   title="Step 1/2: Google Document (Email Draft) Connection",
    //   dialog=`What email draft template are you referring to?
    //     This file should contain the subject line, name and body.
    //     Copy and paste the direct URL link to the Google Docs:`,
    //   isFile=true
    // );
    // TEST
    var emailLinkID = "REMOVED FOR PRIVACY";
    if (emailLinkID != -1) {
      // Prompt and record user's desired folder location to store generated files.
      // var fldrID = connectDocument(
      //   UI,
      //   title="Step 2/2: Google Folder (Storage) Connection",
      //   dialog=`Which folder would you like all the generated file(s) to be stored at?
      //     Copy and paste the direct URL link to the Google folder:`,
      //   isFile=false
      // );
      // TEST
      var fldrID = DriveApp.getFolderById("REMOVED FOR PRIVACY");
      // Retrieve data set from database.
      var sheet = SpreadsheetApp.getActive().getSheetByName(SHEET_1);
      // Range of data must include header row for proper key mapping.
      var arrayOfStudentObj = objectify(sheet.getRange(3, 1, sheet.getLastRow()-2, 11).getValues());
      // Establish array of attachment objects for filename and file url.
      var arrayOfAttachObj = getAttachments();
      // Opportunities for optimization begin here.
      // Iterate through the array of student objects to extract each mapped key value for Google document insertion and emailing.
      // Time Complexity: O(n^3)
      arrayOfStudentObj.forEach(function(student) {
        if (student[EMAIL_SENT_COL] == '') {
          try {
            arrayOfAttachObj.forEach(function(attachment) {
              // All generated files will contain this filename format, followed by the attachment filename/description.
              var filename = `${student[RYE_ID_COL]} ${student[FNAME_COL]} ${student[LNAME_COL]} ${attachment[ATTACH_FILENAME_COL]}`;
              // Create a copy of the current iteration/file for the given student.
              var file = DocumentApp.openById(DriveApp.getFileById(getID(attachment[ATTACH_FILEURL_COL], isFile=false)).makeCopy(filename, fldrID).getId());
              // Replace and save all custom fields for the given student at this current iteration/file.
              replaceCustomFields(file, student);
            });
          } catch(e) {
          }
        }
      });
      UI.alert("Script successfully completed!");
    }
  } catch(e) {
    UI.alert("Error Detected", e.message + "\n\nContact a developer for help.", UI.ButtonSet.OK);
  }
}
/**
 * Replaces all fields specified by 'attributesArray' given a student's file.
 * @param {Object} file A single file object used to replace all custom fields with.
 * @param {Object} student A single student object that contains all custom field attributes.
 */
function replaceCustomFields(file, student) {
  // Iterate through each student's attribute (first name, last name, etc.) to change each field.
  attributesArray.forEach(function(attribute) {
    file.getBody()
        .replaceText(attribute, student[attribute]);
  });
  // Must save and close the file to finalize changes prior to moving onto the next student object.
  file.saveAndClose();
}

/**
 * Processes the attachments sheet for filename and file ID.
 * @return {Array} An array of attachment file objects.
 */
function getAttachments() {
  var files = SpreadsheetApp.getActive().getSheetByName(SHEET_2);
  return objectify(files.getRange(1, 1, files.getLastRow(), 2).getValues());
}

/**
 * Creates student objects to contain the object attributes for each student based on
 * the header row.
 * @param {Array} array A 2D heterogeneous array that includes the header row for attribute key mapping.
 * @return {Array} An array of student objects.
 */
function objectify(array) {
  var keys = array.shift();
  var objects = array.map(function (values) {
    return keys.reduce(function (o, k, i) {
      o[k] = values[i];
      return o;
    }, {});
  });
  return objects;
}
To summarize my code, I read the Google spreadsheet of students as an array of objects, so each student has attributes like their first name, last name, email, etc. I have done the same for the file attachments that would be included for each student. Currently, the forEach loop iterates through each student object, creates copies of the master file(s), replaces placeholder text in each file, then saves them in a folder. Eventually, I will be sending these file(s) to each student with the MailApp. However, due to the repetitive external calls via creating file copies for each student, the execution time is understandably very slow...
TLDR
Is it still possible to optimize my code using "batch operations" when it is necessary for my use case to have multiple DriveApp calls to create said copies of the files for modification purposes? As opposed to reading raw values into an array and modifying them at a later operation, I don't think I could simply just store document objects into an array, then modify them at a later stage. Thoughts?
You could use the batchUpdate method of the Google Docs API.
See Tanaike's answer to get an idea of what the request object looks like.
All you need to do in your Apps Script is build that object for multiple files.
Note:
You can also further optimize your code by updating:
var arrayOfStudentObj = objectify(sheet.getRange(3, 1, sheet.getLastRow()-2, 11).getValues());
into:
// what column your email confirmation is which is 0-index
// assuming column K contains the email confirmation (11 - 1 = 10)
var emailSentColumn = 10;
// filter data, don't include rows with blank values in column K
var arrayOfStudentObj = objectify(sheet.getRange(3, 1, sheet.getLastRow()-2, 11).getValues().filter(row=>row[emailSentColumn]));
This way, you can remove the if (student[EMAIL_SENT_COL] == '') condition below and reduce the number of loop iterations.
Resource:
Google Docs Apps Script Quickstart
Google Docs REST API

Dart - HTTPClient download file to string

In the Flutter/Dart app that I am currently working on, I need to download large files from my servers. However, instead of storing the file in local storage, I need to parse its contents and consume it in one go. I thought the best way to accomplish this was by implementing my own StreamConsumer and overriding the relevant methods. Here is what I have done thus far:
import 'dart:io';
import 'dart:async';

class Accumulator extends StreamConsumer<List<int>>
{
  String text = '';

  @override
  Future<void> addStream(Stream<List<int>> s) async
  {
    print('Adding');
    //print(s.length);
    return;
  }

  @override
  Future<dynamic> close() async
  {
    print('closed');
    return Future.value(text);
  }
}

Future<String> fileFetch() async
{
  String url = 'https://file.io/bse4moAYc7gW';
  final HttpClientRequest request = await HttpClient().getUrl(Uri.parse(url));
  final HttpClientResponse response = await request.close();
  return await response.pipe(Accumulator());
}

Future<void> simpleFetch() async
{
  String url = 'https://file.io/bse4moAYc7gW';
  final HttpClientRequest request = await HttpClient().getUrl(Uri.parse(url));
  final HttpClientResponse response = await request.close();
  await response.pipe(File('sample.txt').openWrite());
  print('Simple done!!');
}

void main() async
{
  print('Starting');
  await simpleFetch();
  String text = await fileFetch();
  print('Finished! $text');
}
When I run this in VSCode here is the output I get
Starting
Simple done!! //the contents of the file at https://file.io/bse4moAYc7gW are duly saved in the file sample.txt
Adding //clearly addStream is being called
Instance of 'Future<int>' //I had expected to see the length of the available data here
closed //close is clearly being called BUT
Finished! //back in main()
My understanding of the underlying issues here is still rather limited. My expectation was that I would use addStream to accumulate the contents being downloaded until there is nothing more to download, at which point close would be called and the program would exit.
Why is addStream showing Instance of 'Future<int>' rather than the length of the available content?
Although the VSCode debug console does display "exited", this happens several seconds after "closed" is displayed. I thought this might be an issue with having to call super.close(), but that is not it. What am I doing wrong here?
I was going to delete this question but decided to leave it here with an answer for the benefit of anyone else trying to do something similar.
The key point to note is that the call to Accumulator.addStream does just that: it furnishes a stream to be listened to, not actual data to be read. What you do next is this:
void whenData(List<int> data)
{
  // You will typically get a sequence of one or more bytes here.
  for (int value in data)
  {
    // Accumulate the incoming data here.
  }
  return;
}

void whenDone()
{
  // Now that you have all the file data *accumulated*, do what you like with it.
}

@override
Future<void> addStream(Stream<List<int>> s) async
{
  s.listen(whenData, onDone: whenDone);
  // You can optionally add a handler for `onError`.
}

Firebase Unity - Get all children into an array/generic list after GetValueAsync?

How can I get data back as an array or generic list from a Firebase database in Unity3D without knowing ahead of time what the name (key) of the children are?
I have been trying out the new Unity Firebase plugin, and I am having an issue figuring out how to get all the children at a specific location and put the names (keys) and values into arrays or generic lists so that I can work on the data locally. Forgive me for being new to Firebase and probably using bad techniques to do this; the plugin is so new that it's pretty hard to get much outside help, as there are not a lot of docs and tutorials out there on Firebase Unity.
In this particular case I am trying to create "instant messaging" like functionality, without the use of Firebase messaging, and just using regular Firebase database stuff instead. It might have been easier to use Firebase messaging, but mostly for the sake of learning and customization I want to do this on my own with just the Firebase database.
I insert data into the database like this:
public void SendMessage(string toUser, string msg)
{
    Debug.Log(String.Format("Attempting to send message from {0} to {1}", username, toUser));
    DatabaseReference reference = FirebaseDatabase.DefaultInstance.GetReference("Msgs");
    string date = Magnet.M.GetCurrentDate();
    // send data to the DB
    reference.Child(toUser).Child(username).Child(date).SetValueAsync(msg);
    // user receiving message / user sending message > VALUE = "hello dude|20170119111325"
    UpdateUsers();
}
And then I try and get it back like this:
public string[] GetConversation(string userA, string userB)
{
    // get a conversation between two users
    string[] convo = new string[0];
    FirebaseDatabase.DefaultInstance.GetReference("Msgs").GetValueAsync().ContinueWith(task =>
    {
        Debug.Log("Getting Conversation...");
        if (task.IsFaulted || task.IsCanceled)
        {
            Debug.LogError("ERROR: Task error in GetConversation(): " + task.Exception);
        }
        else if (task.IsCompleted)
        {
            DataSnapshot snapshot = task.Result;
            string[] messagesA = new string[0], messagesB = new string[0];
            if (snapshot.HasChild(userA))
            {
                // userA has a record of a conversation with other users
                if (snapshot.Child(userA).HasChild(userB)) // userB has sent messages to userA before
                {
                    Debug.Log("Found childA");
                    long count = snapshot.Child(userA).Child(userB).ChildrenCount;
                    messagesA = new string[count];
                    var kids = snapshot.Child(userA).Child(userB).Children;
                    Debug.Log(kids);
                    for (int i = 0; i < count; i++)
                    {
                        // this won't work, but is how I would like to access the data
                        messagesA[i] = kids[i].Value.ToString(); // AGAIN.... will not work...
                    }
                }
            }
            if (snapshot.HasChild(userB))
            {
                if (snapshot.Child(userB).HasChild(userA)) // userA sent a message to userB before
                {
                    Debug.Log("Found childB");
                    long count = snapshot.Child(userB).Child(userA).ChildrenCount;
                    messagesA = new string[count];
                    var kids = snapshot.Child(userB).Child(userA).Children;
                    Debug.Log(kids);
                    // messy incomplete testing code...
                }
            }
            // HERE I WOULD ASSIGN ALL THE MESSAGES BETWEEN A AND B AS 'convo'...
        }
        Debug.Log("Done Getting Conversation.");
    });
    return convo;
}
But obviously this won't work, because DataSnapshot won't let me access its children like an array or generic list using indices, and I can't figure out how to treat the data when I don't know the names (keys) of all the children and just want to get them out one by one in any order. Since they are named by the date/time they are entered into the DB, I won't know ahead of time what the children's names (keys) are, and I can't just say "GetChild("20170101010101")" because that number is generated when it's sent to the DB from any client.
FYI here is what the DB looks like:
Figured out the answer to your question. Here's my code snippet. Hope this would help!
void InitializeFirebase() {
    FirebaseApp app = FirebaseApp.DefaultInstance;
    app.SetEditorDatabaseUrl ("https://slol.firebaseio.com/");
    FirebaseDatabase.DefaultInstance
        .GetReference ("Products").OrderByChild ("category").EqualTo ("livingroom")
        .ValueChanged += (object sender2, ValueChangedEventArgs e2) => {
            if (e2.DatabaseError != null) {
                Debug.LogError (e2.DatabaseError.Message);
            }
            if (e2.Snapshot != null && e2.Snapshot.ChildrenCount > 0) {
                foreach (var childSnapshot in e2.Snapshot.Children) {
                    var name = childSnapshot.Child ("name").Value.ToString ();
                    text.text = name.ToString();
                    Debug.Log(name.ToString());
                    //text.text = childSnapshot.ToString();
                }
            }
        };
}
Firebase developer here.
Have you tried to use Value at the top level Snapshot? It should return to you an IDictionary where the values can also be lists or nested dictionaries. You will have to use some dynamic inspection to figure out what the values are.

SharedObject for Arrays of Object. Can't get correct data when restart flash

I have an array that stores some Objects with their data, and I am trying to save it on my computer.
If I load the data right after saving it, I get the correct data, for example: [Object Player].
But if I restart the Flash application, the data seems to be gone.
What is the problem?
private var sharedObject:SharedObject = SharedObject.getLocal("aquarium", "/");

public function save(n:String):void
{
    /* player list will only handle the list of all the Players
     * each player data will handle by Player class itself.
     */
    registerClassAlias("Player", Player)
    player = new Player()
    player.newPlayer(n, LATEST_VERSION)
    playerArray.push(player)
    //saving as shared object
    sharedObject.data.aquariumData = playerArray
    sharedObject.flush()
    load()
}

public function load():void
{
    if (sharedObject.size > 0)
    {
        trace("loading player info")
        playerArray = sharedObject.data.aquariumData
        trace(playerArray)
    }
    else
    {
        trace("there's no record")
    }
}
Can you please provide the code showing how you obtain the shared object?
Do you use var sharedObject:SharedObject = SharedObject.getLocal("sharedObject"); or something like this?
Apart from that, when calling registerClassAlias("Player", Player) before serialization, keep in mind that it must also be called before extracting the data, so that de-serialization works correctly and returns an array of Player objects rather than an array of generic Object objects.
And of course, closing the sharedObject after flushing is very good practice :)
P.S. Your code works as far as I've tested it, replacing your Player class with another custom class.
