I was trying to schedule a batch Apex class using a schedulable Apex class in the Developer Console, but I am not able to schedule it.
Here is my code:
schedulefieldupdtebatch sc = new schedulefieldupdtebatch();
System.schedule('Job1', '0 * * * * ?', new scjob());
System.schedule('Job2', '0 15 * * * ?', new scjob());
I am getting the error:
Invalid type: scjob
The signature of the System.schedule method is given below; please check the documentation for more details.
public static String schedule(String jobName, String cronExpression, Object schedulableClass)
Assuming schedulefieldupdtebatch implements the Schedulable interface, you can execute the code snippet below in the Developer Console.
/*Apex code snippet*/
//Instantiate the schedulable class schedulefieldupdtebatch
schedulefieldupdtebatch sc = new schedulefieldupdtebatch();
//Set the schedule time: Oct 20, 10:10 AM
String schTime = '0 10 10 20 10 ?';
//This will execute the batch on the coming Oct 20 at 10:10 AM. Check Scheduled Jobs in Setup for details.
System.schedule('Schedule Dup Batch', schTime, sc);
/*Apex code snippet*/
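For context, the "Invalid type" error means scjob is not a class the compiler can see; whatever you pass to System.schedule must be a class that implements the Schedulable interface. A minimal sketch of what that class has to look like (FieldUpdateBatch is a hypothetical placeholder for your batch class, not a name from the question):
global class schedulefieldupdtebatch implements Schedulable {
    global void execute(SchedulableContext ctx) {
        // Launch the batch job from the scheduled context
        Database.executeBatch(new FieldUpdateBatch());
    }
}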
I have a classifieds website similar to Craigslist built with Laravel.
There's a status column in my MySQL posts table, and I want to automatically change the status of all posts that are more than 30 days old to "deleted".
So how do I do that in Laravel?
There's no code here because I don't know how to go about it.
You have to use the scheduler (docs: https://laravel.com/docs/8.x/scheduling).
It allows you to perform cron tasks.
For instance, you could do something like this:
You should read the documentation first because it contains a lot of useful information for understanding the following code example.
<?php
namespace App\Console;
use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;
use Illuminate\Support\Facades\DB;
class Kernel extends ConsoleKernel
{
    /**
     * The Artisan commands provided by your application.
     *
     * @var array
     */
    protected $commands = [
        //
    ];
    /**
     * Define the application's command schedule.
     *
     * @param \Illuminate\Console\Scheduling\Schedule $schedule
     * @return void
     */
    protected function schedule(Schedule $schedule)
    {
        $schedule->call(function () {
            YourModel::query()
                ->where('created_at', '<', today()->subDays(30))
                ->update(['your_status_column' => 'deleted']);
        })->daily();
    }
}
Here we are updating the status of every YourModel row created more than 30 days ago to deleted (and we are doing it every day at midnight).
Of course, you have to adapt the code to your model and column.
You can also create a command or job, and use task scheduling to run the command or job you just created automatically.
https://laravel.com/docs/8.x/scheduling
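As a rough sketch of that command-based approach (the Post model, status column, and command name are assumptions to adapt), the command could look like this, scheduled in the kernel with $schedule->command('posts:mark-deleted')->daily();
<?php
namespace App\Console\Commands;
use App\Models\Post; // assumed model name
use Illuminate\Console\Command;
class MarkOldPostsDeleted extends Command
{
    /** The name used to schedule or run the command. */
    protected $signature = 'posts:mark-deleted';
    protected $description = 'Mark posts older than 30 days as deleted';
    public function handle()
    {
        // Same query as the closure-based example, wrapped in a reusable command.
        Post::query()
            ->where('created_at', '<', now()->subDays(30))
            ->update(['status' => 'deleted']);
        return 0;
    }
}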
First and foremost:
I'm kind of new to Flink (I understand the principles and am able to create any basic streaming job I need to).
I'm using Kinesis Analytics to run my Flink job and by default it's using incremental checkpointing with a 1 minute interval.
The Flink job is reading events from a Kinesis stream using a FlinkKinesisConsumer and a custom deserializer (it deserializes the bytes into a simple Java object which is used throughout the job).
What I would like to achieve is simply counting how many events of ENTITY_ID/FOO and ENTITY_ID/BAR there are for the past 24 hours. It is important that this count is as accurate as possible, and this is why I'm using this Flink feature instead of doing a running sum myself on a 5-minute tumbling window.
I also want to have a count of TOTAL events from the start (and not just for the past 24 h), so I also output in the result the count of events for the past 5 minutes, so that the post-processing app can simply take these 5 minutes of data and do a running sum. (This count doesn't have to be accurate, and it's OK if there is an outage and I lose some counts.)
Now, this job was working pretty well up until last week, when we had a surge (10 times more) in traffic. From that point on, Flink went bananas.
Checkpoint size slowly started to grow from ~500 MB to 20 GB, and checkpoint times took around 1 minute and grew over time.
The application started failing and was never able to fully recover, and the event iterator age shot up and never went back down, so no new events were being consumed.
Since I'm new to Flink, I'm not entirely sure whether the way I'm doing the sliding count is completely unoptimized or plain wrong.
This is a small snippet of the key part of the code:
The source (MyJsonDeserializationSchema extends AbstractDeserializationSchema and simply reads the bytes and creates the Event object):
SourceFunction<Event> source =
new FlinkKinesisConsumer<>("input-kinesis-stream", new MyJsonDeserializationSchema(), kinesisConsumerConfig);
The input event, a simple Java POJO, which will be used in the Flink operators:
public class Event implements Serializable {
public String entityId;
public String entityType;
public String entityName;
public long eventTimestamp = System.currentTimeMillis();
}
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
DataStream<Event> eventsStream = kinesis
    .assignTimestampsAndWatermarks(new BoundedOutOfOrdernessTimestampExtractor<Event>(Time.seconds(30)) {
        @Override
        public long extractTimestamp(Event event) {
            return event.eventTimestamp;
        }
    });
DataStream<Event> fooStream = eventsStream
    .filter(new FilterFunction<Event>() {
        @Override
        public boolean filter(Event event) throws Exception {
            return "foo".equalsIgnoreCase(event.entityType);
        }
    });
DataStream<Event> barStream = eventsStream
    .filter(new FilterFunction<Event>() {
        @Override
        public boolean filter(Event event) throws Exception {
            return "bar".equalsIgnoreCase(event.entityType);
        }
    });
StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);
Table fooTable = tEnv.fromDataStream(fooStream, "entityId, entityName, entityType, eventTimestamp.rowtime");
tEnv.registerTable("Foo", fooTable);
Table barTable = tEnv.fromDataStream(barStream, "entityId, entityName, entityType, eventTimestamp.rowtime");
tEnv.registerTable("Bar", barTable);
Table slidingFooCountTable = fooTable
    .window(Slide.over("24.hour").every("5.minute").on("eventTimestamp").as("minuteWindow"))
    .groupBy("entityId, entityName, minuteWindow")
    .select("concat(concat(entityId,'_'), entityName) as slidingFooId, entityId as slidingFooEntityId, entityName as slidingFooEntityName, entityType.count as slidingFooCount, minuteWindow.rowtime as slidingFooMinute");
Table slidingBarCountTable = barTable
    .window(Slide.over("24.hour").every("5.minute").on("eventTimestamp").as("minuteWindow"))
    .groupBy("entityId, entityName, minuteWindow")
    .select("concat(concat(entityId,'_'), entityName) as slidingBarId, entityId as slidingBarEntityId, entityName as slidingBarEntityName, entityType.count as slidingBarCount, minuteWindow.rowtime as slidingBarMinute");
Table tumblingFooCountTable = fooTable
    .window(Tumble.over(tumblingWindowTime).on("eventTimestamp").as("minuteWindow"))
    .groupBy("entityId, entityName, minuteWindow")
    .select("concat(concat(entityId,'_'), entityName) as tumblingFooId, entityId as tumblingFooEntityId, entityName as tumblingFooEntityName, entityType.count as tumblingFooCount, minuteWindow.rowtime as tumblingFooMinute");
Table tumblingBarCountTable = barTable
    .window(Tumble.over(tumblingWindowTime).on("eventTimestamp").as("minuteWindow"))
    .groupBy("entityId, entityName, minuteWindow")
    .select("concat(concat(entityId,'_'), entityName) as tumblingBarId, entityId as tumblingBarEntityId, entityName as tumblingBarEntityName, entityType.count as tumblingBarCount, minuteWindow.rowtime as tumblingBarMinute");
Table aggregatedTable = slidingFooCountTable
    .leftOuterJoin(slidingBarCountTable, "slidingFooId = slidingBarId && slidingFooMinute = slidingBarMinute")
    .leftOuterJoin(tumblingBarCountTable, "slidingFooId = tumblingBarId && slidingFooMinute = tumblingBarMinute")
    .leftOuterJoin(tumblingFooCountTable, "slidingFooId = tumblingFooId && slidingFooMinute = tumblingFooMinute")
    .select("slidingFooMinute as timestamp, slidingFooEntityId as entityId, slidingFooEntityName as entityName, slidingFooCount, slidingBarCount, tumblingFooCount, tumblingBarCount");
DataStream<Result> result = tEnv.toAppendStream(aggregatedTable, Result.class);
result.addSink(sink); // write to an output stream to be picked up by a lambda function
I would greatly appreciate it if someone with more experience with Flink could comment on the way I have done my counting. Is my code completely over-engineered? Is there a better and more efficient way of counting events over a 24-hour period?
I have read somewhere on Stack Overflow @DavidAnderson suggesting to create our own sliding window using MapState and slicing the events by timestamp.
However, I'm not exactly sure what this means, and I didn't find any code example showing it (a sketch of the idea is included after the answer below).
You are creating quite a few windows there. If you create a sliding window with a size of 24 hours and a slide of 5 minutes, there will be a lot of open windows (each event belongs to 24 h / 5 min = 288 overlapping windows), so you may expect that all the data you have received in a given day will be checkpointed in at least one window. So it's certain that the size and duration of the checkpoints will grow as the data itself grows.
To be able to answer whether the code can be rewritten, you would need to provide more details on what exactly you are trying to achieve here.
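For completeness, since the question mentions the MapState suggestion without a code example: below is a minimal sketch of that idea, keyed by entityId. Events are counted into 5-minute slices held in a MapState, and when a slice closes, the slices covering the last 24 hours are summed. All class and variable names here are illustrative, not from the original job, and error handling is omitted:
import java.time.Duration;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import org.apache.flink.api.common.state.MapState;
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

public class SlidingCount extends KeyedProcessFunction<String, Event, Tuple2<String, Long>> {

    private static final long SLICE_MS = Duration.ofMinutes(5).toMillis();
    private static final long WINDOW_MS = Duration.ofHours(24).toMillis();

    // slice start timestamp -> number of events seen in that slice
    private transient MapState<Long, Long> sliceCounts;

    @Override
    public void open(Configuration parameters) {
        sliceCounts = getRuntimeContext().getMapState(
                new MapStateDescriptor<>("sliceCounts", Long.class, Long.class));
    }

    @Override
    public void processElement(Event event, Context ctx, Collector<Tuple2<String, Long>> out) throws Exception {
        long slice = event.eventTimestamp - (event.eventTimestamp % SLICE_MS);
        Long count = sliceCounts.get(slice);
        sliceCounts.put(slice, count == null ? 1L : count + 1);
        // Emit a result when this slice closes (event time).
        ctx.timerService().registerEventTimeTimer(slice + SLICE_MS - 1);
    }

    @Override
    public void onTimer(long timestamp, OnTimerContext ctx, Collector<Tuple2<String, Long>> out) throws Exception {
        long total = 0;
        List<Long> expired = new ArrayList<>();
        for (Map.Entry<Long, Long> entry : sliceCounts.entries()) {
            if (entry.getKey() <= timestamp - WINDOW_MS) {
                expired.add(entry.getKey()); // slice has fallen out of the 24 h window
            } else {
                total += entry.getValue();
            }
        }
        for (Long key : expired) {
            sliceCounts.remove(key);
        }
        out.collect(Tuple2.of(ctx.getCurrentKey(), total));
    }
}
It would be applied with something like eventsStream.keyBy(e -> e.entityId).process(new SlidingCount()). The state per key is then just one counter per 5-minute slice of the last 24 hours, and expired slices are removed explicitly, which keeps the state (and therefore the checkpoint size) bounded.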
public class AutoConvertLeads
{
    @InvocableMethod
    public static void LeadAssign(List<Id> LeadIds)
    {
        Database.LeadConvert Leadconvert = new Database.LeadConvert();
        Leadconvert.setLeadId(LeadIds[0]);
        Lead l = [SELECT Id, Email FROM Lead WHERE Id = :LeadIds[0]];
        LeadStatus Leads = [SELECT Id, MasterLabel FROM LeadStatus WHERE IsConverted = true LIMIT 1];
        Contact[] clist = [SELECT Id, Name, Session__c FROM Contact WHERE Email = :l.Email LIMIT 1];
        if (clist.size() > 0) {
            Contact c = clist[0];
            c.Session__c = 'PUT_THE_VALUE_YOU_WANT_TO_UPDATE_THE_FIELD_WITH'; // Make sure the value you insert matches the field type.
            update c;
        }
        else {
            Leadconvert.setConvertedStatus(Leads.MasterLabel);
            Leadconvert.setDoNotCreateOpportunity(true); // Remove this line if you want to create an opportunity from the lead conversion
            Database.LeadConvertResult Leadconverts = Database.convertLead(Leadconvert);
            System.assert(Leadconverts.isSuccess());
        }
    }
}
The session value comes from a web-to-lead form, depending on the person signing up. It's a picklist. The values are dates, as mentioned: May 24; 2 PM - 4 PM, June 28; 9 AM - 12 PM, May 24; 10 AM - 12 PM, June 28; 4:30 PM - 7:30 PM, July 26; 9 AM - 12 PM, July 26; 4:30 PM - 7:30 PM.
How can I pass these values into this trigger code for c.Session__c?
In trigger code, all the fields of the object are available, so in this case all the fields of the Lead will be available. So if you have a trigger like this:
trigger LeadTrigger on Lead (before insert){
for(Lead l : Trigger.new){
System.debug(l.Session__c);
}
}
That would print out the Session__c field from the lead. If you want to pass the value as a function parameter, you could do it like:
SomeClass.someStaticMethod(l.Session__c);
//or
new SomeClass().someMethod(l.Session__c);
However, it might be easier to pass the entire Lead record from the trigger, or, if you pass the Ids, to query the exact fields you need from the Lead inside the method.
But from your code, it looks like what you want to do is:
contact c = clist[0];
c.Session__c = l.Session__c; // requires Session__c to be included in the Lead query
update c;
Or whatever field on the Lead you want. But since you want to map a field on Lead to a field on Contact, why not use the lead field mapping functionality?
EDIT
I think the title is misleading; I think you meant to ask how to pass params from a trigger into this invocable method. From the documentation:
Triggers can’t reference invocable methods
However this can be done via flows and the process builder, so that might be worth looking into.
On a side note, your code is not bulkified at all and would lead to unexpected results if multiple leads were passed in with different session codes. You should probably build a Map<String, List<Lead>> for the converted leads, where the key is the (normalized) email and the value is the list of converted leads; query the contacts using the map's keySet(); build a Map<String, Contact> keyed by (normalized) email; copy the lead's value onto each matching contact; and update the contacts in a single DML statement, as in the sketch below.
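A minimal sketch of that bulkified shape (it assumes Session__c exists on both Lead and Contact, and it leaves the conversion logic aside):
public static void leadAssign(List<Id> leadIds) {
    // Group the leads by normalized email.
    Map<String, List<Lead>> leadsByEmail = new Map<String, List<Lead>>();
    for (Lead l : [SELECT Id, Email, Session__c FROM Lead WHERE Id IN :leadIds]) {
        String key = l.Email == null ? null : l.Email.toLowerCase();
        if (!leadsByEmail.containsKey(key)) {
            leadsByEmail.put(key, new List<Lead>());
        }
        leadsByEmail.get(key).add(l);
    }
    // One query for all matching contacts, one DML statement for all updates.
    Set<String> emails = leadsByEmail.keySet();
    List<Contact> toUpdate = new List<Contact>();
    for (Contact c : [SELECT Id, Email, Session__c FROM Contact WHERE Email IN :emails]) {
        List<Lead> matches = leadsByEmail.get(c.Email == null ? null : c.Email.toLowerCase());
        if (matches != null && !matches.isEmpty()) {
            c.Session__c = matches[0].Session__c;
            toUpdate.add(c);
        }
    }
    update toUpdate;
}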
To pass values to the method you listed you would do
List<Id> leadIds = new List<Id>();
for(Lead l : [SELECT Id FROM Lead]){ // the query here is an example; you can have a list already available
leadIds.add(l.Id);
}
AutoConvertLeads.LeadAssign(leadIds);
But you can't do this from a trigger, at least not directly, because the docs indicate it isn't permitted. You could test whether you can do it indirectly from another class, but I haven't tried that.
I want to create a timer in an Odoo view that supports run/pause, something like a soccer match clock, in min:sec format.
I tried the code below, but it generated an error:
from threading import Thread
import time

@api.one
def timer_th(self):
    timer_thread = Thread(target=self.timer)
    timer_thread.start()

def timer(self):
    while self.current_time <= self.duration:
        time.sleep(1)
        self.current_time += 1
It gave me an AttributeError: environments error.
When I used the code without the thread, it worked, but the GUI wasn't responsive.
If you want to schedule a piece of code, you can use the ir.cron model.
It is meant for automating actions, but I don't know if you can do the start/pause thing.
More details in the docs:
http://odoo-development.readthedocs.io/en/latest/odoo/models/ir.cron.html
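For illustration, a scheduled action is declared as a data record roughly like this (the model and method names here are made-up placeholders, not from the question):
<record id="ir_cron_match_timer" model="ir.cron">
    <field name="name">Match timer tick</field>
    <field name="model">my.match.timer</field>
    <field name="function">_tick</field>
    <field name="interval_number">1</field>
    <field name="interval_type">minutes</field>
    <field name="numbercall">-1</field>
    <field name="active" eval="True"/>
</record>
Note that the smallest interval_type is minutes, which is another reason a cron may not be a good fit for a seconds-resolution match clock.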
I'm testing a factory that simply retrieves all the "posts" of a news system. I'll cut the example down to something as simple as possible:
$newsFactory->getAllNews();
The table looks like this:
+---------+---------------------+-------------+
| news_id | news_publishedDate | news_active |
+---------+---------------------+-------------+
| 1 | 2010-03-22 13:20:22 | 1 |
| 2 | 2010-03-23 13:20:22 | 1 |
| 14 | 2010-03-23 13:20:22 | 0 |
| 15 | 2010-03-23 13:20:22 | 1 |
+---------+---------------------+-------------+
I want to test that behaviour; for now, we'll focus only on the first point:
Make sure the query returns only rows with news_active = 1.
Make sure the query returns the elements ordered by news_publishedDate, from newest to oldest.
So I've made a dbData.xml data set of what I consider good testing data:
<?xml version="1.0" encoding="UTF-8" ?>
<dataset>
<table name="news">
<column>news_id</column>
<column>news_publishedDate</column>
<column>news_active</column>
<row>
<value>1</value>
<value>2010-03-20 08:55:05</value>
<value>1</value>
</row>
<row>
<value>2</value>
<value>2010-03-20 08:55:05</value>
<value>0</value>
</row>
<row>
<value>3</value>
<value>2011-03-20 08:55:05</value>
<value>1</value>
</row>
</table>
</dataset>
OK, so let's just check the first test (that news_id 2 from the XML data set is not returned).
I must extend the PHPUnit_Extensions_Database_TestCase class to make my NewsFactoryTest class:
<?php
require_once 'PHPUnit/Extensions/Database/TestCase.php';
class NewsFactoryTest extends PHPUnit_Extensions_Database_TestCase
{
protected $db;
protected function getConnection()
{
$this->db = new PDO('mysql:host=localhost;dbname=testdb', 'root', '');
return $this->createDefaultDBConnection($this->db, 'testdb');
}
protected function getDataSet()
{
return $this->createXMLDataSet(dirname(__FILE__) . DIRECTORY_SEPARATOR . 'dbData.xml');
}
public function testGetNewsById()
{
$newsFactory = new NewsFactory($this->db);
$news = $newsFactory->getNewsById();
// ???
$this->assertEquals(2, count($news), "Should return only 2 results");
}
}
My main question would be: how do I set up that test?
In detail, I'm trying to understand:
Should I create a testdb database, or is that all emulated/virtual?
I've seen many examples using sqlite::memory:; is it a good idea to test MySQL-based queries with SQLite? Can I use mysql::memory: instead?
If it's a real DB, how do I restore all the data from dbData.xml in the DB before each test run?
Where am I supposed to call getConnection() and getDataSet()?
Thanks for reading & sharing your knowledge!
I set up database testing in our project, and here are some answers and lessons that worked for us:
Should I create a testdb database or is that all emulated/virtual?
I was asking the same question at the beginning, and we both learned that it is indeed a real database running.
I've seen many examples using sqlite::memory:; is it a good idea to test MySQL based queries with SQLite? Can I use mysql::memory: instead?
I tried to use SQLite for performance, but found that the SQL would be different enough not to be usable everywhere with our existing code. I was able to use the MySQL MEMORY engine for most tables though (it is not possible for some tables, such as those with BLOB columns).
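For reference, converting a table is a single statement per table (using the news table from the question as an example):
ALTER TABLE news ENGINE=MEMORY;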
If it's a real DB, how do I restore all the data from dbData.xml in the DB before each test run?
I wrote a script to call mysqldump for the schemas and all their tables from our remote test server, insert them into the local server, and convert all possible table engines to MEMORY. This does take time, but since the schemas don't change between tests, it is only run once at the top-most TestSuite, or separately as a developer needs on their local system.
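A rough sketch of what such a script can look like (host, user, and schema names are placeholders; it dumps the schema only, rewrites the engine, and loads the result into the local server; tables that cannot use MEMORY, such as those with BLOB columns, would still need to be skipped or left as-is):
mysqldump --no-data -h remote-test-server -u user -p mydb \
  | sed 's/ENGINE=InnoDB/ENGINE=MEMORY/g' \
  | mysql -h localhost -u root testdb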
The data sets are loaded at the beginning of each test, and since the tables already exist and are in memory, the inserting and truncating between tests is fast.
Where am I supposed to call getConnection() and getDataSet()?
We already had a helper class that extended TestCase, so I couldn't use PHPUnit_Extensions_Database_TestCase. I added setup functions to that helper class and never called, or had to implement, getDataSet(). I did use getConnection() to create data sets from modified data in an assert function.
/**
 * @param PHPUnit_Extensions_Database_DataSet_IDataSet $expected_data_fixture
 * @param string|array $tables
 */
protected function assertDataFixturesEqual($expected_data_fixture, $tables){
    if(!is_array($tables)){
        $tables = array($tables);
    }
    PHPUnit_Extensions_Database_TestCase::assertDataSetsEqual($expected_data_fixture, $this->DbTester->getConnection()->createDataSet($tables));
}
EDIT:
I found some bookmarks of resources I used, as the PHPUnit documentation is a little lacking:
http://www.ds-o.com/archives/63-PHPUnit-Database-Extension-DBUnit-Port.html
http://www.ds-o.com/archives/64-Adding-Database-Tests-to-Existing-PHPUnit-Test-Cases.html
I haven't used PHPUnit's database test case, so I must confine my answer to the assertion. You can either assert that ID 2 is not present in $news, or you can assert that every object in $news is active. The latter is more flexible, as you won't need to change your test as you add data to the test data set.
$news = $newsFactory->getNewsById();
foreach ($news as $item) {
    self::assertTrue($item->isActive());
}
BTW, most of the published dates in your data set are identical. This will make testing the ordering difficult. ;)
So far, I've understood the following:
Should I create a testdb database or is that all emulated/virtual?
It creates a real database, and with the getSetUpOperation method it's really slow, as the tables are truncated and re-imported for each test; that is quite demanding on the hard drive, even for a small amount of data (~1 sec/test).
I've seen many examples using sqlite::memory:; is it a good idea to test MySQL based queries with SQLite? Can I use mysql::memory: instead?
I still don't know. I think it's not really possible with MySQL.
If it's a real DB, how do I restore all the data from dbData.xml in the DB before each test run?
There are getSetUpOperation and getTearDownOperation, which act like the setUp and tearDown methods. Adding the following will truncate the tables mentioned in the data set and re-insert all the data from that XML file:
/**
 * Executed before each test.
 *
 * @return PHPUnit_Extensions_Database_Operation_DatabaseOperation
 */
protected function getSetUpOperation()
{
return PHPUnit_Extensions_Database_Operation_Factory::CLEAN_INSERT();
}
Where am I supposed to call getConnection() and getDataSet()?
Nowhere. These are template methods that are called automatically. getConnection() is called before the tests (a bit like __construct would be, but I'm not sure about the order), and getDataSet() is called whenever a data set is needed. I think that in my case only getSetUpOperation() has a dependency on a data set, so in the background it calls getDataSet() before each test to perform the CLEAN_INSERT operation.
Also, I discovered that we need to create the table structure ourselves (the data set doesn't handle that), so my full, slow, working code is:
<?php
require_once 'PHPUnit/Extensions/Database/TestCase.php';
class NewsFactoryTest extends PHPUnit_Extensions_Database_TestCase
{
/**
 * Custom PDO instance required by the SUT.
 *
 * @var Core_Db_Driver_iConnector
 */
protected $db;
/**
 * Create a connection.
 * Note: the connection constants are defined in bootstrap.php.
 *
 * @return PHPUnit_Extensions_Database_DB_IDatabaseConnection
 */
protected function getConnection()
{
    // Instantiate the connection required by the system under test.
    $this->db = new Core_Db_Driver_PDO('mysql:host=' . TEST_DB_HOST . ';dbname=' . TEST_DB_BASE, TEST_DB_USER, TEST_DB_PASS, array());
    // Create a genuine PDO connection, required by PHPUnit_Extensions_Database_TestCase.
    $db = new PDO('mysql:host=' . TEST_DB_HOST . ';dbname=' . TEST_DB_BASE, TEST_DB_USER, TEST_DB_PASS);
    $this->createTableSchema($db);
    return $this->createDefaultDBConnection($db, TEST_DB_BASE);
}
/**
 * Load the required table schemas.
 *
 * @param PDO $db
 * @return void
 */
protected function createTableSchema(PDO $db)
{
$schemaPath = dirname(__FILE__) . DIRECTORY_SEPARATOR . 'sql_schema' . DIRECTORY_SEPARATOR;
$query = file_get_contents($schemaPath . 'news.sql');
$db->exec($query);
$query = file_get_contents($schemaPath . 'news_locale.sql');
$db->exec($query);
}
/**
 * Load the data set in memory.
 *
 * @return PHPUnit_Extensions_Database_DataSet_IDataSet
 */
protected function getDataSet()
{
return $this->createXMLDataSet(dirname(__FILE__) . DIRECTORY_SEPARATOR . 'newsFactory_dataSet.xml');
}
/**
 * Method executed before each test.
 *
 * @return PHPUnit_Extensions_Database_Operation_DatabaseOperation
 */
protected function getSetUpOperation()
{
// TRUNCATE the tables mentioned in the data set, then re-insert the content of the data set.
return PHPUnit_Extensions_Database_Operation_Factory::CLEAN_INSERT();
}
/**
 * Method executed after each test.
 *
 * @return PHPUnit_Extensions_Database_Operation_DatabaseOperation
 */
protected function getTearDownOperation()
{
// Do nothing (yes, there's an operation for that).
return PHPUnit_Extensions_Database_Operation_Factory::NONE();
}
/**
 * @covers NewsFactory::getNewsById
 */
public function testGetNewsById()
{
$newsFactory = new NewsFactory($this->db);
$news = $newsFactory->getNewsById(999);
$this->assertFalse($news);
}
}
Hope that helps other people who need some extra explanation.
If you have any comments, suggestions, or ideas, your input is welcome, as I don't consider this a fully efficient solution (it's slow, long to set up, and needs a double connection).