Apache Flink 1.3 Table API rowtime strange behavior

The following code sample does not work in Flink 1.3:
public class TumblingWindow {
public static void main(String[] args) throws Exception {
List<Content> data = new ArrayList<Content>();
data.add(new Content(1L, "Hi"));
data.add(new Content(2L, "Hallo"));
data.add(new Content(3L, "Hello"));
data.add(new Content(4L, "Hello"));
data.add(new Content(7L, "Hello"));
data.add(new Content(8L, "Hello world"));
data.add(new Content(16L, "Hello world"));
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
final StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);
DataStream<Content> stream = env.fromCollection(data);
DataStream<Content> stream2 = stream.assignTimestampsAndWatermarks(
new BoundedOutOfOrdernessTimestampExtractor<Content>(Time.milliseconds(1)) {
private static final long serialVersionUID = 410512296011057717L;
@Override
public long extractTimestamp(Content element) {
return element.getRecordTime();
}
});
Table table = tableEnv.fromDataStream(stream2,
"urlKey,httpGetMessageCount,httpPostMessageCount" + ",uplink,downlink,statusCode,statusCodeCount,rowtime.rowtime");
table.window(Tumble.over("1.hours").on("rowtime").as("w")).groupBy("w, urlKey")
.select("w.start,urlKey,uplink.sum,downlink.sum,httpGetMessageCount.sum,httpPostMessageCount.sum ");
env.execute();
}
public static class Content implements Serializable {
private String urlKey;
private long recordTime;
// private String recordTimeStr;
private long httpGetMessageCount;
private long httpPostMessageCount;
private long uplink;
private long downlink;
private long statusCode;
private long statusCodeCount;
public Content() {
super();
}
public Content(long recordTime, String urlKey) {
super();
this.recordTime = recordTime;
this.urlKey = urlKey;
}
public String getUrlKey() {
return urlKey;
}
public void setUrlKey(String urlKey) {
this.urlKey = urlKey;
}
public long getRecordTime() {
return recordTime;
}
public void setRecordTime(long recordTime) {
this.recordTime = recordTime;
}
public long getHttpGetMessageCount() {
return httpGetMessageCount;
}
public void setHttpGetMessageCount(long httpGetMessageCount) {
this.httpGetMessageCount = httpGetMessageCount;
}
public long getHttpPostMessageCount() {
return httpPostMessageCount;
}
public void setHttpPostMessageCount(long httpPostMessageCount) {
this.httpPostMessageCount = httpPostMessageCount;
}
public long getUplink() {
return uplink;
}
public void setUplink(long uplink) {
this.uplink = uplink;
}
public long getDownlink() {
return downlink;
}
public void setDownlink(long downlink) {
this.downlink = downlink;
}
public long getStatusCode() {
return statusCode;
}
public void setStatusCode(long statusCode) {
this.statusCode = statusCode;
}
public long getStatusCodeCount() {
return statusCodeCount;
}
public void setStatusCodeCount(long statusCodeCount) {
this.statusCodeCount = statusCodeCount;
}
}
private class TimestampWithEqualWatermark implements AssignerWithPunctuatedWatermarks<Object[]> {
private static final long serialVersionUID = 1L;
@Override
public long extractTimestamp(Object[] element, long previousElementTimestamp) {
// TODO Auto-generated method stub
return (long) element[0];
}
@Override
public Watermark checkAndGetNextWatermark(Object[] lastElement, long extractedTimestamp) {
return new Watermark(extractedTimestamp);
}
}
}
It raises the following exception:
Exception in thread "main" org.apache.flink.table.api.TableException: The rowtime attribute can only be replace a field with a valid time type, such as Timestamp or Long.
at org.apache.flink.table.api.StreamTableEnvironment$$anonfun$validateAndExtractTimeAttributes$1.apply(StreamTableEnvironment.scala:450)
at org.apache.flink.table.api.StreamTableEnvironment$$anonfun$validateAndExtractTimeAttributes$1.apply(StreamTableEnvironment.scala:440)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at org.apache.flink.table.api.StreamTableEnvironment.validateAndExtractTimeAttributes(StreamTableEnvironment.scala:440)
at org.apache.flink.table.api.StreamTableEnvironment.registerDataStreamInternal(StreamTableEnvironment.scala:401)
at org.apache.flink.table.api.java.StreamTableEnvironment.fromDataStream(StreamTableEnvironment.scala:88)
at com.taiwanmobile.cep.noc.TumblingWindow.main(TumblingWindow.java:53)
But if I delete statusCodeCount in fromDataStream, this sample runs successfully without an exception:
Table table = tableEnv.fromDataStream(stream2,
"urlKey,httpGetMessageCount,httpPostMessageCount" + ",uplink,downlink,statusCode,rowtime.rowtime");
table.window(Tumble.over("1.hours").on("rowtime").as("w")).groupBy("w, urlKey")
.select("w.start,urlKey,uplink.sum,downlink.sum,httpGetMessageCount.sum,httpPostMessageCount.sum ");
Any suggestion?

This is a bug that is filed as FLINK-6881. As a workaround you could define your own StreamTableSource that implements DefinedRowtimeAttribute (see also this documentation draft). A table source also nicely hides the underlying DataStream API, which makes table programs more compact.
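For illustration, here is a minimal sketch of such a table source, assuming the 1.3-era interfaces StreamTableSource and DefinedRowtimeAttribute and reusing the Content class from the question. The class name ContentTableSource is hypothetical and the sketch is untested, so verify the exact method set against your Flink version:
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.table.sources.DefinedRowtimeAttribute;
import org.apache.flink.table.sources.StreamTableSource;

// Hypothetical sketch: the source itself declares the event-time attribute,
// so no "rowtime.rowtime" expression is needed in fromDataStream().
// Serializable so the anonymous timestamp extractor below can be shipped.
public class ContentTableSource implements StreamTableSource<Content>, DefinedRowtimeAttribute, Serializable {
    @Override
    public DataStream<Content> getDataStream(StreamExecutionEnvironment env) {
        List<Content> data = new ArrayList<>();
        data.add(new Content(1L, "Hi"));
        data.add(new Content(2L, "Hallo"));
        // Same watermark assignment as in the question.
        return env.fromCollection(data).assignTimestampsAndWatermarks(
                new BoundedOutOfOrdernessTimestampExtractor<Content>(Time.milliseconds(1)) {
                    @Override
                    public long extractTimestamp(Content element) {
                        return element.getRecordTime();
                    }
                });
    }
    @Override
    public String getRowtimeAttribute() {
        // Name under which the Table API exposes the event-time attribute.
        return "rowtime";
    }
    @Override
    public TypeInformation<Content> getReturnType() {
        return TypeInformation.of(Content.class);
    }
}
The source would then be registered and scanned, e.g. tableEnv.registerTableSource("content", new ContentTableSource()) followed by tableEnv.scan("content"), instead of calling fromDataStream directly.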

Related

spring cloud gcp pub/sub Jackson messageConverter deserialize fail

I'm trying to receive and process messages through GCP Pub/Sub.
I tried to convert and receive the payload part of the message through JacksonPubSubMessageConverter, but it failed.
It seems that I am not handling byte[] properly inside JacksonPubSubMessageConverter. Do I need to change ObjectMapper settings or override JacksonPubSubMessageConverter?
Below is a code example.
@Slf4j
@Configuration
public class PubSubConfig {
@Bean
public PubSubMessageConverter pubSubMessageConverter(ObjectMapper objectMapper) {
return new JacksonPubSubMessageConverter(objectMapper);
}
}
// ...
@Getter
@Setter
@ToString
@NoArgsConstructor(access = AccessLevel.PROTECTED)
public class MessageDTO {
private PubSubAction action;
@JsonFormat(pattern = "yyyy-MM-dd")
private LocalDate startedAt;
private Boolean dryRun;
}
// ...
public enum PubSubAction {
MY_ACTION("my action"),
ETC("etc action");
private final String description;
PubSubAction(String description) {
this.description = description;
}
@JsonCreator
public static PubSubAction create(String name) {
return Stream.of(PubSubAction.values())
.filter(pubSubAction -> pubSubAction.name().equals(name))
.findAny()
.orElse(null);
}
}
// ...
class MyConsumer {
private final String subscriptionName;
private final PubSubTemplate pubSubTemplate;
public MyConsumer(
String subscriptionName,
PubSubTemplate pubSubTemplate
) {
this.subscriptionName = subscriptionName;
this.pubSubTemplate = pubSubTemplate;
}
private void consume(
ConvertedBasicAcknowledgeablePubsubMessage<MessageDTO> convertedMessage) {
try {
MessageDTO payload = convertedMessage.getPayload();
log.debug("payload {}", payload);
// payload MessageDTO(action=MY_ACTION, startedAt=null, dryRun=null)
convertedMessage.ack();
} catch (Exception e) {
log.error("Unknown Exception {} {}", e.getMessage(), this.subscriptionName, e);
}
}
private Consumer<ConvertedBasicAcknowledgeablePubsubMessage<MessageDTO>> convertConsumer() {
return this::consume;
}
public void subscribe() {
log.info("Subscribing to {}", subscriptionName);
pubSubTemplate.subscribeAndConvert(subscriptionName, this.convertConsumer(),
MessageDTO.class);
}
}
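On the question's own idea of changing the ObjectMapper settings: below is a minimal sketch, assuming the LocalDate field is part of the problem (an assumption, since the question does not confirm the root cause). Jackson cannot handle java.time types such as LocalDate without the jackson-datatype-jsr310 module, so one common adjustment is registering it on the mapper passed to the converter. Imports for PubSubMessageConverter and JacksonPubSubMessageConverter are as in the question; their package name differs across spring-cloud-gcp versions.
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class PubSubConfig {
    @Bean
    public PubSubMessageConverter pubSubMessageConverter() {
        // Sketch: a dedicated mapper with java.time support so fields like
        // startedAt can be deserialized from "yyyy-MM-dd" strings.
        ObjectMapper mapper = new ObjectMapper().registerModule(new JavaTimeModule());
        return new JacksonPubSubMessageConverter(mapper);
    }
}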

Apache Flink Avro FileSink is stuck in in-progress state for a long time

I have the below Avro schema, User.avsc:
{
"type": "record",
"namespace": "com.myorg",
"name": "User",
"fields": [
{
"name": "id",
"type": "long"
},
{
"name": "name",
"type": "string"
}
]
}
The User.java class below is generated from the above User.avsc using the avro-maven-plugin.
package com.myorg;
import java.io.IOException;
import java.io.ObjectInput;
import java.io.ObjectOutput;
import java.nio.ByteBuffer;
import org.apache.avro.AvroRuntimeException;
import org.apache.avro.Schema;
import org.apache.avro.Schema.Parser;
import org.apache.avro.data.RecordBuilder;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.message.BinaryMessageDecoder;
import org.apache.avro.message.BinaryMessageEncoder;
import org.apache.avro.message.SchemaStore;
import org.apache.avro.specific.AvroGenerated;
import org.apache.avro.specific.SpecificData;
import org.apache.avro.specific.SpecificRecord;
import org.apache.avro.specific.SpecificRecordBase;
import org.apache.avro.specific.SpecificRecordBuilderBase;
@AvroGenerated
public class User extends SpecificRecordBase implements SpecificRecord {
private static final long serialVersionUID = 8699049231783654635L;
public static final Schema SCHEMA$ = (new Parser()).parse("{\"type\":\"record\",\"name\":\"User\",\"namespace\":\"com.myorg\",\"fields\":[{\"name\":\"id\",\"type\":\"long\"},{\"name\":\"name\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"}}]}");
private static SpecificData MODEL$ = new SpecificData();
private static final BinaryMessageEncoder<User> ENCODER;
private static final BinaryMessageDecoder<User> DECODER;
/** @deprecated */
@Deprecated
public long id;
/** @deprecated */
@Deprecated
public String name;
private static final DatumWriter<User> WRITER$;
private static final DatumReader<User> READER$;
public static Schema getClassSchema() {
return SCHEMA$;
}
public static BinaryMessageDecoder<User> getDecoder() {
return DECODER;
}
public static BinaryMessageDecoder<User> createDecoder(SchemaStore resolver) {
return new BinaryMessageDecoder(MODEL$, SCHEMA$, resolver);
}
public ByteBuffer toByteBuffer() throws IOException {
return ENCODER.encode(this);
}
public static User fromByteBuffer(ByteBuffer b) throws IOException {
return (User)DECODER.decode(b);
}
public User() {
}
public User(Long id, String name) {
this.id = id;
this.name = name;
}
public Schema getSchema() {
return SCHEMA$;
}
public Object get(int field$) {
switch(field$) {
case 0:
return this.id;
case 1:
return this.name;
default:
throw new AvroRuntimeException("Bad index");
}
}
public void put(int field$, Object value$) {
switch(field$) {
case 0:
this.id = (Long)value$;
break;
case 1:
this.name = (String)value$;
break;
default:
throw new AvroRuntimeException("Bad index");
}
}
public Long getId() {
return this.id;
}
public void setId(Long value) {
this.id = value;
}
public String getName() {
return this.name;
}
public void setName(String value) {
this.name = value;
}
public void writeExternal(ObjectOutput out) throws IOException {
WRITER$.write(this, SpecificData.getEncoder(out));
}
public void readExternal(ObjectInput in) throws IOException {
READER$.read(this, SpecificData.getDecoder(in));
}
static {
ENCODER = new BinaryMessageEncoder(MODEL$, SCHEMA$);
DECODER = new BinaryMessageDecoder(MODEL$, SCHEMA$);
WRITER$ = MODEL$.createDatumWriter(SCHEMA$);
READER$ = MODEL$.createDatumReader(SCHEMA$);
}
}
I want to write instances of the User SpecificRecord to files using Apache Flink's FileSink.
Below is the program that I wrote:
import org.apache.flink.connector.file.sink.FileSink;
import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.avro.AvroWriters;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import com.myorg.User;
import org.apache.flink.streaming.api.functions.sink.filesystem.OutputFileConfig;
import org.apache.flink.streaming.api.functions.sink.filesystem.bucketassigners.DateTimeBucketAssigner;
import org.apache.flink.streaming.api.functions.sink.filesystem.rollingpolicies.OnCheckpointRollingPolicy;
import java.util.Arrays;
public class AvroFileSinkApp {
private static final String OUTPUT_PATH = "./il/";
public static void main(String[] args) throws Exception {
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment().enableCheckpointing(5000);
env.getCheckpointConfig().setCheckpointingMode(CheckpointingMode.EXACTLY_ONCE);
env.setParallelism(4);
OutputFileConfig config = OutputFileConfig
.builder()
.withPartPrefix("il")
.withPartSuffix(".avro")
.build();
DataStream<User> source = env.fromCollection(Arrays.asList(getUser(), getUser(), getUser(), getUser(), getUser(), getUser()));
source.sinkTo(FileSink.forBulkFormat(new Path(OUTPUT_PATH), AvroWriters.forSpecificRecord(User.class)).withBucketCheckInterval(5000).withRollingPolicy(OnCheckpointRollingPolicy.build())
.withOutputFileConfig(config).withBucketAssigner(new DateTimeBucketAssigner<>("yyyy/MM/dd/HH")).build());
env.execute("FileSinkProgram");
Thread.sleep(300000);
}
public static User getUser() {
User u = new User();
u.setId(1L);
u.setName("raj");
return u;
}
}
I wrote this program using this and this as reference. The project is on github here.
When I run the program, the in-progress files are created, but the temp files are never checkpointed and committed. I added Thread.sleep(300000); but the in-progress files never turn into finished .avro files.
I have also waited on the main thread for an hour, but no luck.
Any idea what is stopping the in-progress files from moving to the finished state?
This problem is mainly because the source is a BOUNDED source: the execution of the entire Flink job is over before a checkpoint has been executed, and with OnCheckpointRollingPolicy the in-progress files are only finalized on checkpoints.
You can refer to the following example to generate records instead of using fromCollection:
/** Data-generating source function. */
public static final class Generator
implements SourceFunction<Tuple2<Integer, Integer>>, CheckpointedFunction {
private static final long serialVersionUID = -2819385275681175792L;
private final int numKeys;
private final int idlenessMs;
private final int recordsToEmit;
private volatile int numRecordsEmitted = 0;
private volatile boolean canceled = false;
private ListState<Integer> state = null;
Generator(final int numKeys, final int idlenessMs, final int durationSeconds) {
this.numKeys = numKeys;
this.idlenessMs = idlenessMs;
this.recordsToEmit = ((durationSeconds * 1000) / idlenessMs) * numKeys;
}
@Override
public void run(final SourceContext<Tuple2<Integer, Integer>> ctx) throws Exception {
while (numRecordsEmitted < recordsToEmit) {
synchronized (ctx.getCheckpointLock()) {
for (int i = 0; i < numKeys; i++) {
ctx.collect(Tuple2.of(i, numRecordsEmitted));
numRecordsEmitted++;
}
}
Thread.sleep(idlenessMs);
}
while (!canceled) {
Thread.sleep(50);
}
}
@Override
public void cancel() {
canceled = true;
}
@Override
public void initializeState(FunctionInitializationContext context) throws Exception {
state =
context.getOperatorStateStore()
.getListState(
new ListStateDescriptor<Integer>(
"state", IntSerializer.INSTANCE));
for (Integer i : state.get()) {
numRecordsEmitted += i;
}
}
@Override
public void snapshotState(FunctionSnapshotContext context) throws Exception {
state.clear();
state.add(numRecordsEmitted);
}
}
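To slot this into the job from the question, the bounded fromCollection source could be replaced along these lines (a sketch: the constructor arguments are arbitrary, and since the Generator emits Tuple2<Integer, Integer> the tuples are mapped to User records here):
// Sketch: replaces env.fromCollection(...) inside AvroFileSinkApp.main().
DataStream<User> source = env
        .addSource(new Generator(4, 10, 60)) // 4 keys, 10 ms between emits, ~60 s of data
        .map(t -> new User(t.f0.longValue(), "user-" + t.f1))
        .returns(User.class);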

Data not inserting into the Room database (Android)

I'm new to Android and this is the first time I'm using Room in my application. Either the insert operation is not performed, the database is not created, or there is some other error.
I don't know what I'm doing wrong, so I need your help.
The program runs, but no result is displayed; nothing shows on the screen.
Here is my code.
Please let me know what is wrong with this code and what I should do to correct it.
Car_details.java
@PrimaryKey
@NonNull
@SerializedName("id")
@Expose
private String id;
@SerializedName("name")
@Expose
private String name;
@SerializedName("desc")
@Expose
private String desc;
@SerializedName("image")
@Expose
private String image;
CarDao.java-
@Insert(onConflict = OnConflictStrategy.REPLACE)
void insert(Car_Details car_details);
@Query("Select * from car_table")
LiveData<List<Car_Details>> selectAll();
CarListDatabase.java
private static CarListDatabase instance;
public abstract CarDao carDao();
public static synchronized CarListDatabase getInstance(Context context){
if(instance==null)
{
instance= Room.databaseBuilder(context.getApplicationContext(),
CarListDatabase.class,"Car_database").fallbackToDestructiveMigration()
.build();
}
return instance;
}
CarRepository.java
public void getCarList(){
CarlistInterface carlistInterface= retrofit.create(CarlistInterface.class);
Call<List<Car_Details>> carList= carlistInterface.carList();
carList.enqueue(new Callback<List<Car_Details>>() {
@Override
public void onResponse(Call<List<Car_Details>> call, final Response<List<Car_Details>> response) {
if(response.body() != null){
List<Car_Details> car_details = response.body();
for (int i = 0; i < car_details.size(); i++) {
String id=car_details.get(i).getId();
String names = car_details.get(i).getName();
String desc=car_details.get(i).getDesc();
String image= car_details.get(i).getImage();
Car_Details car = new Car_Details();
car .setId(id);
car .setName(names);
car .setDesc(desc);
car .setImage(image);
new InsertNoteAsyncTask(carDao).execute(car);
}
}
}
});
}
public LiveData<List<Car_Details>> getCarLists(){
return allCarList;
}
private static class InsertNoteAsyncTask extends AsyncTask<Car_Details,Void,Void> {
private CarDao carDao;
private InsertNoteAsyncTask(CarDao carDao){
this.carDao= carDao;
}
@Override
protected Void doInBackground(Car_Details... car_details) {
carDao.insert(car_details[0]);
return null;
}
CarViewModel.java
public CarViewModel(#NonNull Application application) {
super(application);
repository= new CarRepository(application);
carList= repository.getCarLists();
}
public LiveData<List<Car_Details>> getListLiveData() {
return carList;
MainActivity.java
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
repository = new CarRepository(this);
carViewModel = ViewModelProviders.of(this).get(CarViewModel.class);
recyclerView= findViewById(R.id.cars_recyclerView);
recyclerView.setHasFixedSize(true);
recyclerView.setLayoutManager(new LinearLayoutManager(this));
List = new ArrayList<>();
recyclerAdapter = new RecyclerAdapter(List);
recyclerView.setAdapter(recyclerAdapter);
carViewModel.getListLiveData().observe(this, new
Observer<java.util.List<Car_Details>>() {
@Override
public void onChanged(java.util.List<Car_Details> car_details) {
recyclerAdapter.setUserList(List);
}
});
repository.getCarList();
}
RecyclerAdapter.java
public class RecyclerAdapter extends RecyclerView.Adapter {
List<Car_Details> carList= new ArrayList<>();
public RecyclerAdapter(List<Car_Details> carList) {
this.carList = carList;
}
@NonNull
@Override
public ViewHolder onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
LayoutInflater layoutInflater= LayoutInflater.from(parent.getContext());
View view= layoutInflater.inflate(R.layout.row_item,parent,false);
return new RecyclerAdapter.ViewHolder(view);
}
@Override
public void onBindViewHolder(@NonNull ViewHolder holder, int position) {
holder.car_name.setText(carList.get(position).getName());
holder.car_desc.setText(carList.get(position).getDesc());
}
public void setUserList(List<Car_Details> userList) {
this.carList = userList;
notifyDataSetChanged();
}
@Override
public int getItemCount() {
return carList.size();
}
class ViewHolder extends RecyclerView.ViewHolder {
private TextView car_name,car_desc;
public ViewHolder(@NonNull View itemView) {
super(itemView);
car_name= itemView.findViewById(R.id.car_name);
car_desc= itemView.findViewById(R.id.car_desc);
}
}
}
There is nothing wrong with your Room insert operation.
The way you use LiveData in your application looks wrong; that's why the program runs but no result appears.
Check the part where you observe the LiveData: in onChanged you hand the adapter the empty List field created in onCreate instead of the car_details list that the observer receives.
Hope this helps you out.
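Concretely, a minimal sketch of the corrected observer in MainActivity, with everything else left as in the question:
carViewModel.getListLiveData().observe(this, new Observer<java.util.List<Car_Details>>() {
    @Override
    public void onChanged(java.util.List<Car_Details> car_details) {
        // Hand the adapter the list emitted by Room, not the empty
        // placeholder list created in onCreate().
        recyclerAdapter.setUserList(car_details);
    }
});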

My application's memory usage remains high even after clearing my object list

So I have this base class:
public abstract class WiresharkFile : IDisposable
{
private string _fileName;
private int _packets;
private int _packetsSent;
private string _duration;
private double _speed;
private int _progress;
protected abstract WiresharkFilePacket ReadPacket();
public abstract IEnumerator<WiresharkFilePacket> GetEnumerator();
public abstract void Rewind();
public string FileName
{
get { return _fileName; }
set { _fileName = value; }
}
public int Packets
{
get { return _packets; }
set { _packets = value; }
}
public void Dispose()
{
// implemented inside sub class.
}
}
And a specific Wireshark format (libpcap):
public class Libpcap : WiresharkFile, IDisposable, IEnumerable<WiresharkFilePacket>
{
private BinaryReader binaryReader;
private Version version;
private uint snaplen;
private int thiszone;
private uint sigfigs;
private LibpcapLinkType linktype;
private long basePos;
private bool byteSwap;
private static uint MAGIC = 0xa1b2c3d4;
private static uint MAGIC_ENDIAN = 0xd4c3b2a1;
public Libpcap(string path)
: this(new FileStream(path, FileMode.Open, FileAccess.Read))
{
FileName = path;
}
private Libpcap(Stream fileStream)
{
...
}
public override void Rewind()
{
binaryReader = new BinaryReader(new FileStream(FileName, FileMode.Open, FileAccess.Read));
binaryReader.BaseStream.Position = basePos;
}
public void Dispose()
{
if (binaryReader != null)
binaryReader.Close();
}
I removed almost all of the parts that read the file.
Adding files into my application
I have this object list:
public ObservableCollection<WiresharkFile> wiresharkFiles { get; set; }
This list is binding into my ListView.
When the user chooses files to add to my application:
string[] files = openFileDialog.FileNames;
I check these files via another class:
public class FileValidation
{
public static void DoWork(IEnumerable<string> files)
{
CancellationTokenSource tokenSource = new CancellationTokenSource();
CancellationToken token = tokenSource.Token;
Task task = Task.Factory.StartNew(() =>
{
try
{
Parallel.ForEach(files,
new ParallelOptions
{
MaxDegreeOfParallelism = 3
},
file =>
{
ProcessFile(file);
});
}
catch (Exception)
{ }
}, tokenSource.Token,
TaskCreationOptions.None,
TaskScheduler.Default).ContinueWith
(t =>
{
if (FinishValidationEventHandler != null)
FinishValidationEventHandler();
}
, TaskScheduler.FromCurrentSynchronizationContext()
);
}
private static void ProcessFile(string file)
{
ReadWiresharkFormat(file);
using (WiresharkFile wiresharkFile = new Libpcap(file))
{
WiresharkFileInfo.ReadInfo(wiresharkFile);
// Add file into my list.
}
}
private static WiresharkFileFormat ReadWiresharkFormat(string file)
{
using (BinaryReader binaryReader = new BinaryReader(File.Open(file, FileMode.Open, FileAccess.Read)))
{
// Open file and read first 4 bytes in order to verify file type.
}
}
private static void ReadInfo(WiresharkFile wiresharkFile)
{
foreach (WiresharkFilePacket packet in wiresharkFile)
{
// Collect file information (number of packets...)
}
}
}
OK, so up to here all is good.
Now, when I add many files, let's say about 1000, I can see my memory usage grow by roughly 200 MB, but after clearing the list the memory usage does not change.
Any idea what could cause this?

Java Google App Engine + Facebook API + GSON = trouble with a JavaBean

I am trying to get the user's friends list from Facebook.
The problem seems to be the JavaBean...
FBUser fbuser = new Gson().fromJson(jsonStr, FBUser.class);
public class FBUser implements Serializable {
private static final long serialVersionUID = -3154429420153433117L;
private String id;
private String name;
private String email;
private Friends friendsList = new Friends();
private FBUser() { }
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public String getEmail() {
return email;
}
public void setEmail(String email) {
this.email = email;
}
public List<Data> getFriendsList() {
return friendsList.getData();
}
public static class Friends implements Serializable {
private static final long serialVersionUID = 6991758772193514527L;
private List<Data> data;
private Friends() { }
public List<Data> getData() {
return data;
}
public void setData(List<Data> data) {
this.data = data;
}
public class Paging implements Serializable {
private static final long serialVersionUID = 1689816298710621080L;
private String next;
private Paging() { }
public String getNext() {
return next;
}
public void setNext(String next) {
this.next = next;
}
}
}
public class Data implements Serializable {
private static final long serialVersionUID = -5008541658519841090L;
private String id;
private String name;
private Data() { }
public String getId() {
return id;
}
public void setId(String id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}
}
Json:
json: {"id":"10861234","name":"Whatever","email":"whatever\u0040gmail.com","friends":{"data":[{"name":"Someone","id":"10861234"},{"name" ...43"}],"paging":{"next":"https:\/\/graph.facebook.com\/10861234\/friends..."}}}
I can retrieve the id, name, and email fields successfully... but friendsList is null... =(
Maybe it is the way I am trying to get it from the nested class; any suggestions on that?
There is no friendsList in your JSON (or, there's no friends in your Java class - whichever way you'd like to look at it). Gson silently ignores anything in the JSON that is not present in your classes.
You have a field friends whose value is an object. That object has a field data which is an array of objects and a field paging which is another object.
You need to write Java classes that match that structure. You're ... close.
In your FBUser class change:
private Friends friendsList = new Friends();
to:
private Friends friends = new Friends();
or:
#SerializedName("friends")
private Friends friendsList = new Friends();
Then in your Friends class you need to add:
private Paging paging = new Paging();
Also note that you don't have to initialize these values unless you specifically don't want them to be null when using these classes elsewhere.
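Putting the pieces together, here is a sketch of the adjusted fields (getters, setters, serialVersionUID values, and the Data and Paging classes as in the question):
import java.io.Serializable;
import java.util.List;

public class FBUser implements Serializable {
    private String id;
    private String name;
    private String email;
    // Field name now matches the "friends" key in the JSON.
    private Friends friends = new Friends();

    public static class Friends implements Serializable {
        private List<Data> data;
        // Added so the "paging" object inside "friends" is mapped too.
        private Paging paging = new Paging();
    }
}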
