Unable to Correctly Serialize RangeSet<Instant> with Flink Serialization System - apache-flink

I've implemented a RichFunction with the following type:
RichMapFunction<GeofenceEvent, OutputRangeSet>
the class OutputRangeSet has a field of type:
com.google.common.collect.RangeSet<Instant>
When this POJO is serialized using Kryo, I get null fields!
So far, I tried using a TypeInfoFactory<RangeSet>:
public class InstantRangeSetTypeInfo extends TypeInfoFactory<RangeSet<Instant>> {

    @Override
    public TypeInformation<RangeSet<Instant>> createTypeInfo(Type t, Map<String, TypeInformation<?>> genericParameters) {
        TypeInformation<RangeSet<Instant>> info = TypeInformation.of(new TypeHint<RangeSet<Instant>>() {});
        return info;
    }
}
which annotates my field:
public class OutputRangeSet implements Serializable {

    private String key;

    @TypeInfo(InstantRangeSetTypeInfo.class)
    private RangeSet<Instant> rangeSet;
}
Another solution (that doesn't work either) is registering a third-party serializer:
env.getConfig().registerTypeWithKryoSerializer(RangeSet.class, ProtobufSerializer.class);
You can get the github project here:
https://github.com/elarbikonta/tonl-events
When you run the test you can see (in debug) that the rangeSet beans I get from my RichFunction have null fields; see the test method com.tonl.apps.events.IsVehicleInZoneTest#operatorChronograph:
final RangeSet<Instant> rangeSet = resultList.get(0).getRangeSet(); // rangeSet.ranges == null!
Thanks for your help
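In case it helps anyone hitting the same problem: one way to take Kryo's generic handling out of the picture is to register a custom Kryo serializer for the range set. The sketch below is not from the project above; the class name is made up and it assumes every range is bounded and treated as closed-open, so take it as a starting point rather than a drop-in fix.
import java.time.Instant;
import com.esotericsoftware.kryo.Kryo;
import com.esotericsoftware.kryo.Serializer;
import com.esotericsoftware.kryo.io.Input;
import com.esotericsoftware.kryo.io.Output;
import com.google.common.collect.Range;
import com.google.common.collect.RangeSet;
import com.google.common.collect.TreeRangeSet;

public class InstantRangeSetKryoSerializer extends Serializer<RangeSet<Instant>> {

    @Override
    public void write(Kryo kryo, Output output, RangeSet<Instant> rangeSet) {
        // Assumes every range is bounded; store each range as two epoch-millis longs.
        output.writeInt(rangeSet.asRanges().size());
        for (Range<Instant> r : rangeSet.asRanges()) {
            output.writeLong(r.lowerEndpoint().toEpochMilli());
            output.writeLong(r.upperEndpoint().toEpochMilli());
        }
    }

    @Override
    public RangeSet<Instant> read(Kryo kryo, Input input, Class<RangeSet<Instant>> type) {
        // Rebuilds the set, treating every stored range as closed-open.
        TreeRangeSet<Instant> result = TreeRangeSet.create();
        int count = input.readInt();
        for (int i = 0; i < count; i++) {
            result.add(Range.closedOpen(
                    Instant.ofEpochMilli(input.readLong()),
                    Instant.ofEpochMilli(input.readLong())));
        }
        return result;
    }
}
It would then be registered with
env.getConfig().addDefaultKryoSerializer(RangeSet.class, InstantRangeSetKryoSerializer.class);
which, unlike registerTypeWithKryoSerializer, should also apply to subclasses such as TreeRangeSet and ImmutableRangeSet.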

Related

Salesforce: Apex test class for getting trending Knowledge articles from a Community

I need to get trending articles from the community. I created an Apex class for that using ConnectApi.Knowledge.getTrendingArticles(communityId, maxResults).
I need to create a test class for it. I am using the test method provided by Salesforce for that, setTestGetTrendingArticles(communityId, maxResults, result), but I am getting this error: "System.AssertException: Assertion Failed: No matching test result found for Knowledge.getTrendingArticles(String communityId, Integer maxResults). Before calling this, call Knowledge.setTestGetTrendingArticles(String communityId, Integer maxResults, ConnectApi.KnowledgeArticleVersionCollection result) to set the expected test result."
public without sharing class ConnectTopicCatalogController {

    @AuraEnabled(cacheable=true)
    public static List<ConnectApi.KnowledgeArticleVersion> getAllTrendingArticles(){
        String commId = [SELECT Id FROM Network WHERE Name = 'Customer Community v5'].Id;
        ConnectApi.KnowledgeArticleVersionCollection mtCollection = ConnectApi.Knowledge.getTrendingArticles(commId, 12);
        System.debug('getAllTrendingTopics ' + JSON.serializePretty(mtCollection.items));
        List<ConnectApi.KnowledgeArticleVersion> topicList = new List<ConnectApi.KnowledgeArticleVersion>();
        for (ConnectApi.KnowledgeArticleVersion mtopic : mtCollection.items) {
            topicList.add(mtopic);
        }
        return topicList;
    }
}
The test class I am using for this:
public class ConnectTopicCatalogControllerTest {

    public static final String communityId = [SELECT Id FROM Network WHERE Name = 'Customer Community v5'].Id;

    @isTest
    static void getTrendingArticles(){
        ConnectApi.KnowledgeArticleVersionCollection knowledgeResult = new ConnectApi.KnowledgeArticleVersionCollection();
        List<ConnectApi.KnowledgeArticleVersion> know = new List<ConnectApi.KnowledgeArticleVersion>();
        know.add(new ConnectApi.KnowledgeArticleVersion());
        know.add(new ConnectApi.KnowledgeArticleVersion());
        System.debug('know ' + know);
        knowledgeResult.items = know;
        // Set the test data
        ConnectApi.Knowledge.setTestGetTrendingArticles(null, 12, knowledgeResult);
        List<ConnectApi.KnowledgeArticleVersion> res = ConnectTopicCatalogController.getAllTrendingArticles();
        // The method returns the test page, which we know has two items in it.
        Test.startTest();
        System.assertEquals(12, res.size());
        Test.stopTest();
    }
}
I need help fixing this test class.
Thanks.
Your controller expects the articles to be inside the 'Customer Community v5' community, but you are passing the communityId parameter as null to the setTestGetTrendingArticles method.
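A sketch of that change, assuming the 'Customer Community v5' Network record is visible from the test context:
// Resolve the same community Id the controller will look up,
// and pass it to the ConnectApi test-data method instead of null.
String commId = [SELECT Id FROM Network WHERE Name = 'Customer Community v5'].Id;
ConnectApi.Knowledge.setTestGetTrendingArticles(commId, 12, knowledgeResult);
Note also that the mocked collection above contains two items, so the assertion would presumably need to expect 2 rather than 12.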

Is JSONDeserializationSchema() deprecated in Flink?

I am new to Flink and am doing something very similar to the question linked below:
Cannot see message while sinking kafka stream and cannot see print message in flink 1.2
I am also trying to add JSONDeserializationSchema() as the deserializer for my Kafka input, which is a JSON message without a key.
But I found that JSONDeserializationSchema() is not present.
Please let me know if I am doing anything wrong.
JSONDeserializationSchema was removed in Flink 1.8, after having been deprecated earlier.
The recommended approach is to write a deserializer that implements DeserializationSchema<T>. Here's an example, which I've copied from the Flink Operations Playground:
import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;

import java.io.IOException;

/**
 * A Kafka {@link DeserializationSchema} to deserialize {@link ClickEvent}s from JSON.
 */
public class ClickEventDeserializationSchema implements DeserializationSchema<ClickEvent> {

    private static final long serialVersionUID = 1L;

    private static final ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public ClickEvent deserialize(byte[] message) throws IOException {
        return objectMapper.readValue(message, ClickEvent.class);
    }

    @Override
    public boolean isEndOfStream(ClickEvent nextElement) {
        return false;
    }

    @Override
    public TypeInformation<ClickEvent> getProducedType() {
        return TypeInformation.of(ClickEvent.class);
    }
}
For a Kafka producer you'll want to implement KafkaSerializationSchema<T>, and you'll find examples of that in that same project.
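A minimal sketch of such a schema for the same ClickEvent type might look like the following (the class name and the topic handling are illustrative, not copied from the playground):
import javax.annotation.Nullable;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.flink.streaming.connectors.kafka.KafkaSerializationSchema;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ClickEventKafkaSerializationSchema implements KafkaSerializationSchema<ClickEvent> {

    private static final ObjectMapper objectMapper = new ObjectMapper();
    private final String topic;

    public ClickEventKafkaSerializationSchema(String topic) {
        this.topic = topic;
    }

    @Override
    public ProducerRecord<byte[], byte[]> serialize(ClickEvent element, @Nullable Long timestamp) {
        try {
            // Write the event as a JSON value; the record is produced without a key.
            return new ProducerRecord<>(topic, objectMapper.writeValueAsBytes(element));
        } catch (Exception e) {
            throw new IllegalArgumentException("Could not serialize record: " + element, e);
        }
    }
}
The instance is then passed to the FlinkKafkaProducer constructor together with the default topic and the producer properties.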
To solve the problem of reading keyless JSON messages from Kafka, I used a case class and a JSON parser.
The following code defines a case class and parses the JSON fields using the Play API.
import java.util.Properties
import scala.util.Try
import play.api.libs.json.{JsValue, Json}
import org.apache.flink.api.common.serialization.SimpleStringSchema
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer

object CustomerModel {
  case class Customer(id: Int, name: String)

  // Builds a Customer from one parsed JSON element.
  def readElement(jsonElement: JsValue): Customer = {
    val id = (jsonElement \ "id").get.toString().toInt
    val name = (jsonElement \ "name").get.toString()
    Customer(id, name)
  }
}

// Wrapper object (the name is mine) so that main can actually run.
object KafkaJsonJob {
  import CustomerModel.Customer

  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    val properties = new Properties()
    properties.setProperty("bootstrap.servers", "xxx.xxx.0.114:9092")
    properties.setProperty("group.id", "test-grp")
    val consumer = new FlinkKafkaConsumer[String]("customer", new SimpleStringSchema(), properties)
    val stream1 = env.addSource(consumer).rebalance
    // Parse each record once; on failure fall back to a Customer carrying the error text.
    val stream2: DataStream[Customer] = stream1.map { str =>
      val parsed = Try(CustomerModel.readElement(Json.parse(str)))
      parsed.getOrElse(Customer(0, parsed.toString))
    }
    stream2.print("stream2")
    env.execute("This is Kafka+Flink")
  }
}
Wrapping the parse in Try lets you recover from the exception thrown while parsing the data:
it can return the exception text in one of the fields (if we want), or it can simply return a case class instance with given or default field values.
The sample output of the Code is:
stream2:1> Customer(1,"Thanh")
stream2:1> Customer(5,"Huy")
stream2:3> Customer(0,Failure(com.fasterxml.jackson.databind.JsonMappingException: No content to map due to end-of-input
at [Source: ; line: 1, column: 0]))
I am not sure if it is the best approach but it is working for me as of now.

Populating a table from a file only last column is populated JavaFX [duplicate]

This has baffled me for a while now and I cannot seem to get the grasp of it. I'm using a cell value factory to populate a simple one-column table, and the values do not show up in the table.
The rows are added and I can click them, but I do not see any values in them, in this case String values. [I just edited this to make it clearer]
I have a different project in which this works with the same kind of data model. What am I doing wrong?
Here's the code. The commented-out code at the end seems to work, though. I've checked for the usual mistakes, such as creating a new column instance or a new TableView instance. Nothing. Please help!
//Simple Data Model
Stock.java
public class Stock {
private SimpleStringProperty stockTicker;
public Stock(String stockTicker) {
this.stockTicker = new SimpleStringProperty(stockTicker);
}
public String getstockTicker() {
return stockTicker.get();
}
public void setstockTicker(String stockticker) {
stockTicker.set(stockticker);
}
}
//Controller class
MainGuiController.java
private ObservableList<Stock> data;
@FXML
private TableView<Stock> stockTableView;// = new TableView<>(data);
@FXML
private TableColumn<Stock, String> tickerCol;
private void setTickersToCol() {
try {
Statement stmt = conn.createStatement();//conn is defined and works
ResultSet rsltset = stmt.executeQuery("SELECT ticker FROM tickerlist order by ticker");
data = FXCollections.observableArrayList();
Stock stockInstance;
while (rsltset.next()) {
stockInstance = new Stock(rsltset.getString(1).toUpperCase());
data.add(stockInstance);
}
} catch (SQLException ex) {
Logger.getLogger(WriteToFile.class.getName()).log(Level.SEVERE, null, ex);
System.out.println("Connection Failed! Check output console");
}
tickerCol.setCellValueFactory(new PropertyValueFactory<Stock,String>("stockTicker"));
stockTableView.setItems(data);
}
/*THIS, ON THE OTHER HAND, WORKS*/
/*Callback<CellDataFeatures<Stock, String>, ObservableValue<String>> cellDataFeat =
new Callback<CellDataFeatures<Stock, String>, ObservableValue<String>>() {
@Override
public ObservableValue<String> call(CellDataFeatures<Stock, String> p) {
return new SimpleStringProperty(p.getValue().getstockTicker());
}
};*/
Suggested solution (use a Lambda, not a PropertyValueFactory)
Instead of:
aColumn.setCellValueFactory(new PropertyValueFactory<Appointment,LocalDate>("date"));
Write:
aColumn.setCellValueFactory(cellData -> cellData.getValue().dateProperty());
For more information, see this answer:
Java: setCellValuefactory; Lambda vs. PropertyValueFactory; advantages/disadvantages
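Applied to the Stock class from the question, and assuming you add the stockTickerProperty() accessor shown further below, that would be:
tickerCol.setCellValueFactory(cellData -> cellData.getValue().stockTickerProperty());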
Solution using PropertyValueFactory
The lambda solution outlined above is preferred, but if you wish to use PropertyValueFactory, this alternate solution provides information on that.
How to Fix It
The case of your getter and setter methods is wrong:
getstockTicker should be getStockTicker
setstockTicker should be setStockTicker
Some Background Information
Your PropertyValueFactory remains the same with:
new PropertyValueFactory<Stock,String>("stockTicker")
The naming convention will seem more obvious when you also add a property accessor to your Stock class:
public class Stock {
private SimpleStringProperty stockTicker;
public Stock(String stockTicker) {
this.stockTicker = new SimpleStringProperty(stockTicker);
}
public String getStockTicker() {
return stockTicker.get();
}
public void setStockTicker(String stockticker) {
stockTicker.set(stockticker);
}
public StringProperty stockTickerProperty() {
return stockTicker;
}
}
The PropertyValueFactory uses reflection to find the relevant accessors (these should be public). First, it will try to use the stockTickerProperty accessor and, if that is not present, it will fall back to the getters and setters. Providing a property accessor is recommended, as it automatically enables the table to observe the property in the underlying model and update its cells as the model changes.
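For example, with the property accessor in place, a change to the model is picked up by the table without any manual refresh (a small illustrative snippet):
Stock first = stockTableView.getItems().get(0);
first.setStockTicker("MSFT"); // the corresponding cell updates automatically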
Put getter and setter methods in your data class for all the elements.

Retrieving every field of a database row as object in zend framework 2

I know we have a result set to get a row as an object, but how can I get every field as a separate object? Consider this database row:
user_id address_id product_id shop_id
5 3 134 2
I want to retrieve and save the row as follows:
UserEntity AddressEntity ProductEntity ShopEntity
This is not how the TableDataGateway is supposed to be used, since what you are looking for are more complex features, such as those of Doctrine 2 ORM and similar data mappers.
Here is one possible solution to the problem, which involves using a custom hydrator (docs). My example is simplified, but I hope it clarifies how you are supposed to build your resultset.
First, define your entities (I'm simplifying the example assuming that UserEntity is the root of your hydration):
class UserEntity {
/* fields public for simplicity of the example */
public $address;
public $product;
public $shop;
}
class AddressEntity { /* add public fields here for simplicity */ }
class ProductEntity { /* add public fields here for simplicity */ }
class ShopEntity { /* add public fields here for simplicity */ }
Then, build hydrators specific for the single entities:
use Zend\Stdlib\Hydrator\HydratorInterface as Hydrator;
class AddressHydrator implements Hydrator {
    // @TODO: implementation up to you
}

class ProductHydrator implements Hydrator {
    // @TODO: implementation up to you
}

class ShopHydrator implements Hydrator {
    // @TODO: implementation up to you
}
Then we aggregate these hydrators into one that is specifically built to hydrate a UserEntity:
class UserHydrator extends \Zend\Stdlib\Hydrator\ObjectProperty {
public function __construct(
Hydrator $addressHydrator,
Hydrator $productHydrator,
Hydrator $shopHydrator
) {
$this->addressHydrator = $addressHydrator;
$this->productHydrator = $productHydrator;
$this->shopHydrator = $shopHydrator;
}
public function hydrate(array $data, $object)
{
if (isset($data['address_id'])) {
$data['address'] = $this->addressHydrator->hydrate($data, new AddressEntity());
}
if (isset($data['product_id'])) {
$data['product'] = $this->productHydrator->hydrate($data, new ProductEntity());
}
if (isset($data['shop_id'])) {
$data['shop'] = $this->shopHydrator->hydrate($data, new ShopEntity());
}
return parent::hydrate($data, $object);
}
}
Now you can use it to work with your resultset. Let's define the service for your UserEntityTableGateway:
'UserEntityTableGateway' => function ($sm) {
$dbAdapter = $sm->get('Zend\Db\Adapter\Adapter');
$resultSetPrototype = new ResultSet();
$resultSetPrototype->setArrayObjectPrototype(new UserHydrator());
return new TableGateway('user', $dbAdapter, null, $resultSetPrototype);
},
These are all simplified examples, but they should help you understand how powerful hydrators can be and how you can compose them to solve complex problems.
You may also check the chapters in the documentation about the Aggregate Hydrator and Hydration Strategies, which were designed specifically to solve your problem.

GAE/JPA/DataNucleus: Strange exception while trying to persist entity (IllegalArgumentException: out of field index :-1)

I'm getting an exception after I added this embedded field in my entity:
@Entity
public class Team extends DataObject
{
    @Embedded
    private TeamEvolution teamEvolution = new TeamEvolution();

    // NEW FIELD:
    @Embedded
    // @AttributeOverrides({ @AttributeOverride(name = "buffer", column = @Column) })
    // @Enumerated
    private ScoutBuffer scoutBuffer;
    ...
This guy is very simple:
@Embeddable
public class ScoutBuffer
{
    private static final int BUFFER_SIZE = 150;

    @Basic
    private List<String> buffer;

    ... // from here on there are only methods...
When I try to merge my modifications I get the following exception:
java.lang.IllegalArgumentException: out of field index :-1
at com.olympya.futweb.datamodel.model.ScoutBuffer.jdoProvideField(ScoutBuffer.java)
at org.datanucleus.state.JDOStateManagerImpl.provideField(JDOStateManagerImpl.java:2585)
at org.datanucleus.state.JDOStateManagerImpl.provideField(JDOStateManagerImpl.java:2555)
at org.datanucleus.store.mapped.mapping.CollectionMapping.postUpdate(CollectionMapping.java:185)
at org.datanucleus.store.mapped.mapping.EmbeddedPCMapping.postUpdate(EmbeddedPCMapping.java:133)
// etc, etc...
I don't think it should be necessary, but I had to call JDOHelper.makeDirty before merging the entity so that it would notice that I had modified scoutBuffer:
team.getScoutBuffer().add(playerIds);
JDOHelper.makeDirty(team, "scoutBuffer");
em.merge(team);
As you can see commented out in the code, I tried the workaround described here, without success. The strange thing is that it is from 2009... I'm using GAE 1.7.0, by the way. I also tried cleaning and re-enhancing the data model.
