I have a classifieds website similar to Craigslist built with Laravel.
There's a status column in my MySQL posts table, and I want to automatically change the status of all posts that are more than 30 days old to "deleted".
How do I do that in Laravel?
There's no code here because I don't know how to go about it.
You have to use the scheduler (docs: https://laravel.com/docs/8.x/scheduling).
It allows you to perform cron tasks.
For instance, you could do something like this:
You should read the documentation first, because it contains a lot of useful information for understanding the following code example.
<?php

namespace App\Console;

use App\Models\YourModel; // placeholder: adjust to your actual model's namespace
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    /**
     * The Artisan commands provided by your application.
     *
     * @var array
     */
    protected $commands = [
        //
    ];

    /**
     * Define the application's command schedule.
     *
     * @param  \Illuminate\Console\Scheduling\Schedule  $schedule
     * @return void
     */
    protected function schedule(Schedule $schedule)
    {
        $schedule->call(function () {
            YourModel::query()
                ->where('created_at', '<', today()->subDays(30))
                ->update(['your_status_column' => 'deleted']);
        })->daily();
    }
}
Here we are updating the status of all YourModel records created more than 30 days ago to deleted (and we are doing it every day at midnight).
Of course you have to adapt the code to your model and column.
You can create a command or a job, and use task scheduling to automatically run the command or job you just created; a sketch follows the link below.
https://laravel.com/docs/8.x/scheduling
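For example, here is a minimal sketch of that approach, assuming a hypothetical posts:mark-deleted Artisan command and a Post model with status and created_at columns (these names are placeholders, adapt them to your schema):
<?php

namespace App\Console\Commands;

use App\Models\Post; // hypothetical model, adjust to your application
use Illuminate\Console\Command;

class MarkOldPostsDeleted extends Command
{
    /** The name and signature of the console command. */
    protected $signature = 'posts:mark-deleted';

    /** The console command description. */
    protected $description = 'Mark posts older than 30 days as deleted';

    public function handle()
    {
        // Flag every post created more than 30 days ago as deleted.
        $count = Post::query()
            ->where('created_at', '<', now()->subDays(30))
            ->where('status', '!=', 'deleted')
            ->update(['status' => 'deleted']);

        $this->info("Marked {$count} post(s) as deleted.");

        return 0;
    }
}
Then register it in the schedule (App\Console\Kernel::schedule):
$schedule->command('posts:mark-deleted')->daily();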
I need variables to be re-used (shared) across scenarios in the same feature file.
Please find below the working way that I'm currently using.
The problem here is that I have to outsource the shared variables to another feature file, which seems to be quite cumbersome for such a silly task.
I was wondering if I could define the re-usable variables as an ignored scenario in the same feature file that I can callonce from "myself" (the same feature file) as follows:
File my.feature:
Feature: My
Background:
* url myUrl
# call once explicitly the scenario tagged with '@init'
* def vars = callonce read('my.feature@init')
@ignore @init
Scenario: Return shared variables for all scenarios
* def id = uuid()
# the non-ignored scenarios follow below this line...
Problem: Unfortunately this leads to an endless loop with many errors. It seems that a callonce on myself (the same file that invokes the callonce) runs the Background, including the callonce, again.
Is the idea shown above possible and if yes, where's my mistake?
Or could you callonce without processing the Background again? Something like adding an argument to callonce or use karate.callSingle(file, dontProcessBackground=true)?
Many thanks.
--
The following works (but is cumbersome):
File my.feature:
Feature: My
Background:
* url myUrl
* def vars = callonce read('my.init.feature')
@one
Scenario: One
* def payload = `{ "id" : "${vars.id}" }`
* request payload
* method post
* status 200
* match $.value == 'one'
@two
Scenario: Two
* def payload = `{ "id" : "${vars.id}" }`
* request payload
* method post
* status 200
* match $.value == 'two'
File my.init.feature:
@ignore
Feature: Create variables to be used across multiple scenarios
Scenario: Return shared variables for all scenarios
* def id = uuid()
... where uuid() is shared in karate-config.js:
function fn() {
  var uuid = () => { return String(java.util.UUID.randomUUID().toString()) };
  // ...
  var config = { uuid: uuid };
  return config;
}
I have to outsource the shared variables to another feature file
There is nothing wrong with using a second file for re-usable stuff. All programming languages work this way.
If this is such an inconvenience, kindly contribute code to Karate, it is an open-source project.
As my CakePHP 2.4 app gets bigger, I'm noticing I'm passing a lot of arrays around in the model layer. Cake has kinda led me down this path because it returns arrays, not objects, from its find calls. But more and more, it feels like terrible practice.
For example, in my Job model, I've got a method like this:
public function durationInSeconds($job) {
    return $job['Job']['estimated_hours'] * 3600; // convert to seconds
}
Whereas I imagine that, using the active record pattern, it should look more like this:
public function durationInSeconds() {
    return $this->data['Job']['estimated_hours'] * 3600; // convert to seconds
}
(ie, take no parameter, and assume the current instance represents the Job you want to work with)
Is that second way better?
And if so, how do I use it when, for example, I'm looping through the results of a find('all') call? Cake returns an array - do I loop through that array and do a read for every single row? (seems a waste to re-fetch the info from the database)
Or should I implement a kind of setActiveRecord method that emulates read, like this:
function setActiveRecord($row) {
    $this->id = $row['Job']['id'];
    $this->data = $row;
}
Or is there a better way?
EDIT: The durationInSeconds method was just a simplest possible example. I know for that particular case, I could use virtual fields. But in other cases I've got methods that are somewhat complex, where virtual fields won't do.
The best solution depends on the issue you need to solve. But if you have to call a function for each result row, perhaps you should redesign the query so that it retrieves all the necessary data.
In the case you have shown, you can simply use a virtual field on the Job model:
$this->virtualFields = array(
    'duration_in_seconds' => 'Job.estimated_hours * 3600',
);
...and/or you can use a method like this:
public function durationInSeconds($id = null) {
    if (!empty($id)) {
        $this->id = $id;
    }
    return $this->field('estimated_hours') * 3600; // convert to seconds
}
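To address the looping part of the question: a minimal sketch of using that method over find('all') results (assuming it runs in a controller where the Job model is loaded; note that Model::field() issues one extra query per row, so prefer the virtual field for large result sets):
$jobs = $this->Job->find('all');

foreach ($jobs as $row) {
    // One extra query per row via Model::field(); acceptable for small result sets.
    $seconds = $this->Job->durationInSeconds($row['Job']['id']);
    debug($seconds);
}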
I got a question. I have a db table with settings (id, name).
If I read them from the db
$settings = $this->Setting->find('list');
How can I do this in the AppController, or somewhere similar, so that the settings are accessible from every controller and model?
Hope someone can help me.
Thanks
Explanation:
I would assume you're looking for something like the code below (obviously you'll want to tweak it for your own application, but it's the idea).
In the AppController, it:
finds the settings from the table
loops through each one and writes it into a Configure variable
Code:
/**
 * Read settings from the DB and populate them into Configure variables.
 */
function fetchSettings() {
    $this->loadModel('Setting');
    $settings = $this->Setting->find('all');
    foreach ($settings as $settingsData) {
        $value = $settingsData['Setting']['default_value'];
        // note: can't check for !empty because some values are 0 (zero)
        if (isset($settingsData['Setting']['value'])
            && $settingsData['Setting']['value'] !== null
            && $settingsData['Setting']['value'] !== '') {
            $value = $settingsData['Setting']['value'];
        }
        Configure::write($settingsData['Setting']['key'], $value);
    }
}
Then, you can access them anywhere in your app via Configure::read('myVar');
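The snippet above doesn't show where to call fetchSettings(); here is a minimal sketch of wiring it up, assuming you want the settings loaded on every request (the beforeFilter hook is the usual place, and site_name below is just an example key):
<?php
// app/Controller/AppController.php
App::uses('Controller', 'Controller');

class AppController extends Controller {

    public function beforeFilter() {
        parent::beforeFilter();
        // Load the settings once per request so every controller and view can read them.
        $this->fetchSettings();
    }

    // fetchSettings() from above goes here...
}
Anywhere else in the app you could then call, for example, Configure::read('site_name');.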
A warning from the CakePHP book about Configure variables (I think they're fine to use in this case, but it's something to keep in mind):
CakePHP’s Configure class can be used to store and retrieve
application or runtime specific values. Be careful, this class allows
you to store anything in it, then use it in any other part of your
code: a sure temptation to break the MVC pattern CakePHP was designed
for. The main goal of Configure class is to keep centralized variables
that can be shared between many objects. Remember to try to live by
“convention over configuration” and you won’t end up breaking the MVC
structure we’ve set in place.
When I read ibatis-sqlmap-2.3.4, I found that SqlMapClientImpl and SqlMapSessionImpl both implement SqlMapExecutor.
SqlMapClientImpl does inserts with its localSqlMapSession, which provides thread safety.
But in spring2.5.6, the execute method of SqlMapClientTemplate use SqlMapClientImpl like this:
SqlMapSession session = this.sqlMapClient.openSession();
...
return action.doInSqlMapClient(session);
The openSession method returns a new SqlMapSessionImpl each time.
My questions are:
Why does SqlMapClientTemplate use a SqlMapSession instead of the SqlMapClient?
Why is the localSqlMapSession of the sqlMapClient not used in SqlMapClientTemplate, like this:
return action.doInSqlMapClient(this.sqlMapClient);
What's the difference between SqlMapClient and SqlMapSession?
For your first question, spring-orm explains it in a comment:
// We always need to use a SqlMapSession, as we need to pass a Spring-managed
// Connection (potentially transactional) in. This shouldn't be necessary if
// we run against a TransactionAwareDataSourceProxy underneath, but unfortunately
// we still need it to make iBATIS batch execution work properly: If iBATIS
// doesn't recognize an existing transaction, it automatically executes the
// batch for every single statement...
The answer to the difference between iBATIS' SqlMapClient and SqlMapSession can be found in the SqlMapClient interface's Javadoc:
/**
* Returns a single threaded SqlMapSession implementation for use by
* one user. Remember though, that SqlMapClient itself is a thread safe SqlMapSession
* implementation, so you can also just work directly with it. If you do get a session
* explicitly using this method <b>be sure to close it!</b> You can close a session using
* the sqlMapSession.close() method.
* <p/>
*
* @return An SqlMapSession instance.
*/
public SqlMapSession openSession();
I'm testing a Factory that simply retrieves all the "post" of a news system. I'll cut the example to something as simple as possible:
$newsFactory->getAllNews();
The table looks like this:
+---------+---------------------+-------------+
| news_id | news_publishedDate | news_active |
+---------+---------------------+-------------+
| 1 | 2010-03-22 13:20:22 | 1 |
| 2 | 2010-03-23 13:20:22 | 1 |
| 14 | 2010-03-23 13:20:22 | 0 |
| 15 | 2010-03-23 13:20:22 | 1 |
+---------+---------------------+-------------+
I want to test that behaviour; for now, we'll focus only on the first one:
Make sure the query will return only news_active=1
Make sure the query will return the elements ordered by news_publishedDate, from newest to oldest.
So I've made a dbData.xml dataset of what I consider good testing data:
<?xml version="1.0" encoding="UTF-8" ?>
<dataset>
    <table name="news">
        <column>news_id</column>
        <column>news_publishedDate</column>
        <column>news_active</column>
        <row>
            <value>1</value>
            <value>2010-03-20 08:55:05</value>
            <value>1</value>
        </row>
        <row>
            <value>2</value>
            <value>2010-03-20 08:55:05</value>
            <value>0</value>
        </row>
        <row>
            <value>3</value>
            <value>2011-03-20 08:55:05</value>
            <value>1</value>
        </row>
    </table>
</dataset>
Ok, so let's just check the first test (not returning the news_id #2 from the XML data set)
I must extend the PHPUnit_Extensions_Database_TestCase class to make my NewsFactoryTest class:
<?php

require_once 'PHPUnit/Extensions/Database/TestCase.php';

class NewsFactoryTest extends PHPUnit_Extensions_Database_TestCase
{
    protected $db;

    protected function getConnection()
    {
        $this->db = new PDO('mysql:host=localhost;dbname=testdb', 'root', '');
        return $this->createDefaultDBConnection($this->db, 'testdb');
    }

    protected function getDataSet()
    {
        return $this->createXMLDataSet(dirname(__FILE__) . DIRECTORY_SEPARATOR . 'dbData.xml');
    }

    public function testGetNewsById()
    {
        $newsFactory = new NewsFactory($this->db);
        $news = $newsFactory->getNewsById();
        // ???
        $this->assertEquals(2, count($news), "Should return only 2 results");
    }
}
My main question would be: how do I set up that test?
In details, I try to understand:
Should I create a testdb database or is that all emulated/virtual?
I've seen many examples using sqlite::memory:. Is it a good idea to test MySQL-based queries with SQLite? Can I use mysql::memory: instead?
If it's a real DB, how do I restore all the data from dbData.xml in the DB before each test run ?
Where am I supposed to call getConnection() and getDataSet()?
Thanks for reading & sharing your knowledge!
I setup database testing in our project and here are some answers and lessons that worked for us:
Should I create a testdb database or is that all emulated/virtual ?
I was asking the same question at the beginning, and we both learned that it is indeed a real database running.
I've seen many examples using sqlite::memory:. Is it a good idea to test MySQL-based queries with SQLite? Can I use mysql::memory: instead?
I tried to use SQLite for performance, but found that the SQL would be different enough not to be usable everywhere with our existing code. I was able to use the MySQL MEMORY engine for most tables though (it's not possible for some tables, such as those with BLOB columns).
If it's a real DB, how do I restore all the data from dbData.xml in the DB before each test run ?
I wrote a script to call mysqldump for the schemas and all their tables from our remote test server, insert them into the local server, and convert all possible table engines to MEMORY (a rough sketch of that conversion step is shown after the next paragraph). This does take time, but as the schemas don't change between tests, it is only run once at the topmost TestSuite or separately, as needed, on a developer's local system.
The datasets are loaded at the beginning of each test and since the table already exists and is in memory, the inserting and truncating between tests is fast.
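Here is a rough sketch of that engine-conversion step, assuming a plain PDO connection to the local test server (the actual script isn't shown in the answer, so host, schema and credentials below are placeholders):
<?php
// Convert every table of the local test schema to the MEMORY engine.
$pdo = new PDO('mysql:host=localhost;dbname=testdb', 'root', '', array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
));

$tables = $pdo->query('SHOW TABLES')->fetchAll(PDO::FETCH_COLUMN);

foreach ($tables as $table) {
    try {
        // Fails for tables the MEMORY engine cannot hold (e.g. BLOB/TEXT columns);
        // those simply keep their original engine.
        $pdo->exec("ALTER TABLE `{$table}` ENGINE=MEMORY");
    } catch (PDOException $e) {
        echo "Skipping {$table}: " . $e->getMessage() . PHP_EOL;
    }
}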
Where am I supposed to call getConnection() and getDataSet() ?
We already had a helper class that extended TestCase, so I couldn't use PHPUnit_Extensions_Database_TestCase. I added setup functions to that helper class and never called or had to implement getDataSet(). I did use getConnection() to create datasets from modified data in an assert function.
/**
 * @param PHPUnit_Extensions_Database_DataSet_IDataSet $expected_data_fixture
 * @param string|array $tables
 */
protected function assertDataFixturesEqual($expected_data_fixture, $tables) {
    if (!is_array($tables)) {
        $tables = array($tables);
    }
    PHPUnit_Extensions_Database_TestCase::assertDataSetsEqual(
        $expected_data_fixture,
        $this->DbTester->getConnection()->createDataSet($tables)
    );
}
EDIT:
I found some bookmarks of resources I used as the PHPUnit documentation is a little lacking:
http://www.ds-o.com/archives/63-PHPUnit-Database-Extension-DBUnit-Port.html
http://www.ds-o.com/archives/64-Adding-Database-Tests-to-Existing-PHPUnit-Test-Cases.html
I haven't used PHPUnit's database test case, so I must confine my answer to the assertion. You can either assert that ID 2 is not present in $news, or you can assert that every object in $news is active. The latter is more flexible, as you won't need to change your test when you add data to the test dataset.
$news = $newsFactory->getNewsById();
foreach ($news as $item) {
    self::assertTrue($item->isActive());
}
BTW, some of the published dates in your dataset are identical. This will make testing the ordering harder. ;)
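Once the active rows have distinct dates, a minimal sketch of an ordering assertion could look like this (it assumes each returned item exposes a getPublishedDate() accessor returning a 'Y-m-d H:i:s' string, which is an assumption about your NewsFactory):
$news = $newsFactory->getAllNews();

// Verify the results are ordered from newest to oldest.
$previous = null;
foreach ($news as $item) {
    $current = strtotime($item->getPublishedDate()); // hypothetical accessor
    if ($previous !== null) {
        self::assertLessThanOrEqual($previous, $current, 'News should be ordered newest first');
    }
    $previous = $current;
}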
So far I've understood that:
Should I create a testdb database or is that all emulated/virtual ?
It creates a real database, and using the getSetUpOperation method is really slow, as the tables are truncated and re-imported for each test; it puts quite a load on the hard drive even for a small amount of data (~1 sec/test).
I've seen many examples using sqlite::memory: , is it a good idea to test MySQL based query with sqlite ? Can I use mysql::memory: instead ?
I still don't know. I think it's not really possible with MySQL.
If it's a real DB, how do I restore all the data from dbData.xml in the DB before each test run ?
There are getSetUpOperation and getTearDownOperation, which act like the setUp and tearDown methods. Adding this will truncate the tables mentioned in the dataSet and re-insert all the data from that XML file:
/**
 * Executed before each test.
 *
 * @return PHPUnit_Extensions_Database_Operation_DatabaseOperation
 */
protected function getSetUpOperation()
{
    return PHPUnit_Extensions_Database_Operation_Factory::CLEAN_INSERT();
}
Where am I supposed to call getConnection() and getDataSet() ?
Nowhere. These are magic methods that are called automatically. getConnection is called before the tests (a bit like __construct would be, but I'm not sure about the order) and getDataSet will be called when a dataSet is needed. I think that in my case, only getSetUpOperation has a dependency on a dataSet... so in the background it calls the getDataSet method before each test to perform the CLEAN_INSERT operation.
Also, I discovered that we need to create the table structure ourselves (the dataset doesn't handle that), so my full --slow-- working code is:
<?php

require_once 'PHPUnit/Extensions/Database/TestCase.php';

class NewsFactoryTest extends PHPUnit_Extensions_Database_TestCase
{
    /**
     * Custom PDO instance required by the SUT.
     *
     * @var Core_Db_Driver_iConnector
     */
    protected $db;

    /**
     * Create a connection.
     * Note: the connection constants are defined in bootstrap.php.
     *
     * @return PHPUnit_Extensions_Database_DB_IDatabaseConnection
     */
    protected function getConnection()
    {
        // Instantiate the connection required by the system under test.
        $this->db = new Core_Db_Driver_PDO('mysql:host=' . TEST_DB_HOST . ';dbname=' . TEST_DB_BASE, TEST_DB_USER, TEST_DB_PASS, array());

        // Create a genuine PDO connection, required for PHPUnit_Extensions_Database_TestCase.
        $db = new PDO('mysql:host=' . TEST_DB_HOST . ';dbname=' . TEST_DB_BASE, TEST_DB_USER, TEST_DB_PASS);
        $this->createTableSchema($db);
        return $this->createDefaultDBConnection($db, TEST_DB_BASE);
    }

    /**
     * Load the required table schemas.
     *
     * @param PDO $db
     * @return void
     */
    protected function createTableSchema(PDO $db)
    {
        $schemaPath = dirname(__FILE__) . DIRECTORY_SEPARATOR . 'sql_schema' . DIRECTORY_SEPARATOR;
        $query = file_get_contents($schemaPath . 'news.sql');
        $db->exec($query);
        $query = file_get_contents($schemaPath . 'news_locale.sql');
        $db->exec($query);
    }

    /**
     * Load the dataSet in memory.
     *
     * @return PHPUnit_Extensions_Database_DataSet_IDataSet
     */
    protected function getDataSet()
    {
        return $this->createXMLDataSet(dirname(__FILE__) . DIRECTORY_SEPARATOR . 'newsFactory_dataSet.xml');
    }

    /**
     * Method executed before each test.
     *
     * @return PHPUnit_Extensions_Database_Operation_DatabaseOperation
     */
    protected function getSetUpOperation()
    {
        // TRUNCATE the tables mentioned in the dataSet, then re-insert the content of the dataset.
        return PHPUnit_Extensions_Database_Operation_Factory::CLEAN_INSERT();
    }

    /**
     * Method executed after each test.
     *
     * @return PHPUnit_Extensions_Database_Operation_DatabaseOperation
     */
    protected function getTearDownOperation()
    {
        // Do nothing (yup, there's a code for that).
        return PHPUnit_Extensions_Database_Operation_Factory::NONE();
    }

    /**
     * @covers NewsFactory::getNewsById
     */
    public function testGetNewsById()
    {
        $newsFactory = new NewsFactory($this->db);
        $news = $newsFactory->getNewsById(999);
        $this->assertFalse($news);
    }
}
Hope that will help other people who needed some extra explanation.
If you have any comment, suggestion or idea, your input is welcome, as I don't consider this a fully efficient solution (slow, long to set up, and it needs a double connection).