I am trying to create a program for my college marathon.
I want to be able to start a stopwatch (timer) for everyone running the marathon, and as each person finishes the marathon I want to type in their ID number, stop their stopwatch (timer), and print out a statement saying ("You finished the marathon in " + time).
Just wondering, is this possible to create and what way would I go about it?
Any help would be greatly appreciated.
Regards
Niall
See this answer for creating a digital clock in Swing.
Though that answer is only a clock, it could easily be turned into a stopwatch. Make use of System.currentTimeMillis(). Have a single start time that is stored in a data structure of Runner objects, each holding a name, startTime, and endTime. When each runner finishes, they get their own endTime.
See this question for elapsed time formatting.
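As a rough illustration of the formatting part, here is a minimal sketch that turns an elapsed duration in milliseconds into an hours:minutes:seconds string (the exact layout is just an assumption, adjust as needed):
// Sketch: format an elapsed duration (endTime - startTime, in milliseconds)
// as hours:minutes:seconds.
public static String formatElapsed(long elapsedMillis) {
    long seconds = (elapsedMillis / 1000) % 60;
    long minutes = (elapsedMillis / (1000 * 60)) % 60;
    long hours = elapsedMillis / (1000 * 60 * 60);
    return String.format("%02d:%02d:%02d", hours, minutes, seconds);
}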
UPDATE
There are a number of ways you can handle this. One solution is to store the Runners in a HashMap:
import java.util.HashMap;
import java.util.Map;

public class Runner {
    private final Integer id;
    private long startTime;
    private long endTime;

    public Runner(Integer id) {
        this.id = id;
    }
    public Integer getId() {
        return id;
    }
    public void setEndTime(long endTime) {
        this.endTime = endTime;
    }
}

public class GUI {
    Map<Integer, Runner> runners = new HashMap<>();

    public GUI() {
        Runner runner = new Runner(12334); // id
        runners.put(runner.getId(), runner);
    }
}
Like I said, there are a number of ways to set the end time. One way is to also have an endTime variable in your GUI class. When you click a button, that variable is assigned. Then in a text field you can type in the runner id and assign the endTime to the runner in the map that matches the id. Every time the button is pressed, a new end time is assigned to the endTime variable, so each runner gets their own endTime.
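To make that concrete, here is a minimal sketch of such a button handler; stopButton, idField, startTime and the runners map are assumed to exist as in the example above, and formatElapsed is the helper sketched earlier:
// Sketch only: on button click, read the id typed into the text field,
// look the runner up in the map and record / report their finish time.
stopButton.addActionListener(e -> {
    long endTime = System.currentTimeMillis();
    Integer id = Integer.valueOf(idField.getText().trim());
    Runner runner = runners.get(id);
    if (runner != null) {
        runner.setEndTime(endTime);
        // startTime is the shared race start recorded when the race began
        System.out.println("You finished the marathon in "
                + formatElapsed(endTime - startTime));
    }
});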
First and foremost:
I'm kind of new to Flink (I understand the principles and am able to create any basic streaming job I need).
I'm using Kinesis Analytics to run my Flink job and by default it's using incremental checkpointing with a 1 minute interval.
The Flink job is reading events from a Kinesis stream using a FlinkKinesisConsumer and a custom deserializer (it deserializes the bytes into a simple Java object which is used throughout the job).
What I would like to achieve is simply counting how many events of ENTITY_ID/FOO and ENTITY_ID/BAR there are for the past 24 hours. It is important that this count is as accurate as possible, which is why I'm using this Flink feature instead of doing a running sum myself on a 5-minute tumbling window.
I also want to have a count of 'TOTAL' events from the start (and not just for the past 24h), so I also output the count of events for the past 5 minutes in the result, so that the post-processing app can simply take these 5-minute counts and do a running sum. (This count doesn't have to be accurate, and it's OK if there is an outage and I lose some counts.)
Now, this job was working pretty well up until last week, when we had a surge (10 times more) in traffic. From that point on, Flink went bananas.
The checkpoint size started to slowly grow from ~500MB to 20GB, and checkpoint times were around 1 minute and growing over time.
The application started failing and was never able to fully recover, and the event iterator age shot up and never went back down, so no new events were being consumed.
Since I'm new to Flink, I'm not entirely sure whether the way I'm doing the sliding count is completely unoptimised or plain wrong.
This is a small snippet of the key part of the code:
The source (MyJsonDeserializationSchema extends AbstractDeserializationSchema and simply reads the bytes and creates the Event object):
SourceFunction<Event> source =
new FlinkKinesisConsumer<>("input-kinesis-stream", new MyJsonDeserializationSchema(), kinesisConsumerConfig);
The input event, a simple Java POJO which will be used in the Flink operators:
public class Event implements Serializable {
public String entityId;
public String entityType;
public String entityName;
public long eventTimestamp = System.currentTimeMillis();
}
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
DataStream<Event> eventsStream = kinesis
.assignTimestampsAndWatermarks(new BoundedOutOfOrdernessTimestampExtractor<Event>(Time.seconds(30)) {
@Override
public long extractTimestamp(Event event) {
return event.eventTimestamp;
}
});
DataStream<Event> fooStream = eventsStream
.filter(new FilterFunction<Event>() {
@Override
public boolean filter(Event event) throws Exception {
return "foo".equalsIgnoreCase(event.entityType);
}
});
DataStream<Event> barStream = eventsStream
.filter(new FilterFunction<Event>() {
@Override
public boolean filter(Event event) throws Exception {
return "bar".equalsIgnoreCase(event.entityType);
}
});
StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);
Table fooTable = tEnv.fromDataStream(fooStream, "entityId, entityName, entityType, eventTimestamp.rowtime");
tEnv.registerTable("Foo", fooTable);
Table barTable = tEnv.fromDataStream(barStream, "entityId, entityName, entityType, eventTimestamp.rowtime");
tEnv.registerTable("Bar", barTable);
Table slidingFooCountTable = fooTable
.window(Slide.over("24.hour").every("5.minute").on("eventTimestamp").as("minuteWindow"))
.groupBy("entityId, entityName, minuteWindow")
.select("concat(concat(entityId,'_'), entityName) as slidingFooId, entityid as slidingFooEntityid, entityName as slidingFooEntityName, entityType.count as slidingFooCount, minuteWindow.rowtime as slidingFooMinute");
Table slidingBarCountTable = barTable
.window(Slide.over("24.hout").every("5.minute").on("eventTimestamp").as("minuteWindow"))
.groupBy("entityId, entityName, minuteWindow")
.select("concat(concat(entityId,'_'), entityName) as slidingBarId, entityid as slidingBarEntityid, entityName as slidingBarEntityName, entityType.count as slidingBarCount, minuteWindow.rowtime as slidingBarMinute");
Table tumblingFooCountTable = fooTable
.window(Tumble.over(tumblingWindowTime).on("eventTimestamp").as("minuteWindow"))
.groupBy("entityid, entityName, minuteWindow")
.select("concat(concat(entityName,'_'), entityName) as tumblingFooId, entityId as tumblingFooEntityId, entityNamae as tumblingFooEntityName, entityType.count as tumblingFooCount, minuteWindow.rowtime as tumblingFooMinute");
Table tumblingBarCountTable = barTable
.window(Tumble.over(tumblingWindowTime).on("eventTimestamp").as("minuteWindow"))
.groupBy("entityid, entityName, minuteWindow")
.select("concat(concat(entityName,'_'), entityName) as tumblingBarId, entityId as tumblingBarEntityId, entityNamae as tumblingBarEntityName, entityType.count as tumblingBarCount, minuteWindow.rowtime as tumblingBarMinute");
Table aggregatedTable = slidingFooCountTable
.leftOuterJoin(slidingBarCountTable, "slidingFooId = slidingBarId && slidingFooMinute = slidingBarMinute")
.leftOuterJoin(tumblingBarCountTable, "slidingFooId = tumblingBarId && slidingFooMinute = tumblingBarMinute")
.leftOuterJoin(tumblingFooCountTable, "slidingFooId = tumblingFooId && slidingFooMinute = tumblingFooMinute")
.select("slidingFooMinute as timestamp, slidingFooCreativeId as entityId, slidingFooEntityName as entityName, slidingFooCount, slidingBarCount, tumblingFooCount, tumblingBarCount");
DataStream<Result> result = tEnv.toAppendStream(aggregatedTable, Result.class);
result.addSink(sink); // write to an output stream to be picked up by a lambda function
I would greatly appreciate it if someone with more experience working with Flink could comment on the way I have done my counting. Is my code completely over-engineered? Is there a better and more efficient way of counting events over a 24h period?
I have read somewhere on Stack Overflow that @DavidAnderson suggested creating our own sliding window using map state and slicing the events by timestamp.
However, I'm not exactly sure what this means, and I didn't find any code example showing it.
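For what it's worth, my reading of that suggestion is roughly the sketch below: a KeyedProcessFunction keyed by entity that keeps one counter per 5-minute slice in MapState and sums the slices covering the last 24 hours whenever a slice closes. This is only an interpretation under those assumptions, not the referenced code, and the class and variable names are made up:
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import org.apache.flink.api.common.state.MapState;
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.util.Collector;

// Sketch of a hand-rolled sliding count: one counter per 5-minute slice kept in
// MapState, summed over the last 24 hours whenever a slice closes.
public class SlidingSliceCount extends KeyedProcessFunction<String, Event, Tuple2<String, Long>> {

    private static final long SLICE_MS = 5 * 60 * 1000L;          // 5-minute slices
    private static final long WINDOW_MS = 24 * 60 * 60 * 1000L;   // 24-hour window

    private transient MapState<Long, Long> sliceCounts;           // slice start -> count

    @Override
    public void open(Configuration parameters) {
        sliceCounts = getRuntimeContext().getMapState(
                new MapStateDescriptor<>("sliceCounts", Long.class, Long.class));
    }

    @Override
    public void processElement(Event event, Context ctx, Collector<Tuple2<String, Long>> out) throws Exception {
        long slice = event.eventTimestamp - (event.eventTimestamp % SLICE_MS);
        Long current = sliceCounts.get(slice);
        sliceCounts.put(slice, current == null ? 1L : current + 1L);
        // Emit (and expire old slices) once the watermark passes the end of this slice.
        ctx.timerService().registerEventTimeTimer(slice + SLICE_MS);
    }

    @Override
    public void onTimer(long timestamp, OnTimerContext ctx, Collector<Tuple2<String, Long>> out) throws Exception {
        long total = 0;
        List<Long> expired = new ArrayList<>();
        for (Map.Entry<Long, Long> entry : sliceCounts.entries()) {
            if (entry.getKey() + WINDOW_MS < timestamp) {
                expired.add(entry.getKey());      // slice fell out of the 24-hour window
            } else {
                total += entry.getValue();
            }
        }
        for (Long key : expired) {
            sliceCounts.remove(key);
        }
        out.collect(Tuple2.of(ctx.getCurrentKey(), total));
    }
}
It would be used as something like fooStream.keyBy(e -> e.entityId).process(new SlidingSliceCount()), and the state stays bounded at roughly 288 slices per key instead of hundreds of overlapping windows.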
You are creating quite a few windows there. A sliding window with a size of 24h and a slide of 5 minutes means there will be a lot of open windows, so you can expect that all the data you have received in a given day will be checkpointed in at least one window, if you think about it. So it's certain that the size and time of the checkpoints will grow as the data itself grows.
To get an answer on whether the code can be rewritten, you would need to provide more details on what exactly you are trying to achieve here.
My requirement is to hold 30 days of data in the stream so that any given day is available for processing. So on the first day, when the Flink application starts, it will fetch 30 days of data from the database and merge it with the current stream data.
My challenge is managing the 30-day data window. If I create a sliding window of 30 days with a slide of 1 day, something like:
WatermarkStrategy<EventResponse> wmStrategy = WatermarkStrategy.<EventResponse>forBoundedOutOfOrderness(Duration.ofMillis(1))
.withTimestampAssigner((eventResponse, l) -> eventResponse.getLocalDateTime().toEpochSecond(ZoneOffset.MAX));
ds.assignTimestampsAndWatermarks(wmStrategy)
.windowAll(SlidingEventTimeWindows.of(Time.days(30), Time.days(1)))
.process(new ProcessAllWindowFunction<EventResponse, Object, TimeWindow>() {
@Override
public void process(Context context, Iterable<EventResponse> iterable, Collector<Object> collector) throws Exception {
// processing logic
}
});
In this case process() does not start processing immediately when the first element of historical data is added. My assumptions are: a) by default the first event will be part of the first window and will be available for processing immediately; b) the next day the job will remove the oldest day's data (from 29 days back) from the window. Are my assumptions correct for that piece of code? Thank you for your help on this.
I don't think that your assumptions are correct in this case. When you use a TimeWindow with a process function, the function only gets to process the data when the window is closed (in your case after 30 days). The slide of the time window means that the second window will contain 29 days of the first window plus the 31st day, which was not part of the first window.
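If it helps to see this concretely, the window bounds can be inspected inside the process function; below is a sketch reusing the ProcessAllWindowFunction shape from the question (the logging is just illustrative):
// Sketch: print the bounds of the window being processed, to confirm that
// process() only runs once a 30-day window closes and which range it covers.
@Override
public void process(Context context, Iterable<EventResponse> iterable, Collector<Object> collector) throws Exception {
    long windowStart = context.window().getStart(); // inclusive, epoch millis
    long windowEnd = context.window().getEnd();     // exclusive, epoch millis
    System.out.println("Processing window [" + windowStart + ", " + windowEnd + ")");
    // processing logic
}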
I am using an FSM to model the behaviour of an embedded system I am developing. After some initial tests I have an FSM coded in C working quite well. The approach I used to code the FSM was a pointer-to-function table, where I define a table with all the functions that execute in each state.
void(*state_table[])(void) = {
conection,
Idle,
cal,
bi,
pot,
gal,
eis,
PrepE,
Pret,
Meas,
FS_ch,
Ending,
Error };
Now the next step is to model some complexities of the requirements that I could not solve with the first approach. In this new model I would like to use some concurrency to model two FSMs that should work at the "same time", and I also used some hierarchy to put those FSMs inside a superstate.
The problem I have now is that I am not sure how to code this concurrency and hierarchy in C. I was taking a look at the QP framework, but I think my FSM is not yet complex enough to start using that kind of framework.
This is the FSM I designed.
States 71 and 72 are the concurrent states inside superstate 7. Their execution is actually independent; they don't share any variables.
How could I implement this concurrency and hierarchy using C?
The outer state machine runs in a continuous loop, while sub-states and sub-state machines must run to completion. Concurrency in the absence of a preemptive scheduler must be implemented cooperatively - the "concurrent" states must be executed sequentially, but each should perform a deterministic, guaranteed-to-complete operation on each invocation, i.e. no indefinite busy-waits, delays, or processing that takes longer than your real-time constraints allow (this is a fundamental state-machine constraint in any case).
The sub-state machines can be implemented identically to the main state machine but without the "big loop". For example:
// Main state machine
static int current_state0 = 0 ;
int main( void )
{
static void (* const state_table[])(void) =
{
conection,
Idle,
cal,
bi,
pot,
gal,
eis,
PrepE,
Pret,
Meas,
FS_ch,
Ending,
Error
} ;
// Main loop - execute the current state
for(;;)
{
state_table[current_state0]() ;
}
}
void superState7()
{
// Execute concurrent sub-statemachines
subStateMachine71() ;
subStateMachine72() ;
}
// Sub-state machine
static int current_state71 = 0 ;
void subStateMachine71()
{
static void (* const state_table[])(void) =
{
state711,
state712,
state713,
state714
} ;
// Execute current substate
state_table[current_state71]() ;
}
...
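To round this out, here is one possible shape for an individual sub-state handler, showing how it might advance the sub-state machine; the helper function, the condition and the target index are placeholders of mine, not part of the original design:
#include <stdbool.h>

// Placeholder for the real work of state 7.1.1 - it must be short and bounded,
// with no indefinite waits, so the cooperative scheme above keeps working.
extern bool do_bounded_work_711( void ) ;

// Sketch of one sub-state handler: do one bounded piece of work, then decide
// whether to move the sub-state machine on to the next state.
void state711( void )
{
    if( do_bounded_work_711() )
    {
        current_state71 = 1 ;   // next invocation of subStateMachine71() runs state712
    }
}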
SqlDatasource1 : Table1
    Result_1
    Result_2
I have a query, and I want to get the Result_2 value from the query in the code-behind, in the BeforePrint event. I don't want to put a label in the report. How can I do this?
Thanks
If you want to manually run the Select command on your data source, you could try something like (C#):
DataView dv = Sqldatasource1.Select(DataSourceSelectArguments.Empty) as DataView;
if (dv != null && dv.Count > 0)
{
    foreach (DataRowView rv in dv)
    {
        string result2Val = rv["Result_2"].ToString();
        // do something with the result...
    }
}
I'm sure VB will follow a similar pattern.
Alternatively, try and capture the OnSelected event, if it's already being triggered by another control.
I'm making a Windows Phone 7.1 application, and I'm having a lot of trouble submitting changes to my database. Here is the structure of the tables in my database:
Day <-1-----*-> TrainingSession <-many-----1-> Sport
So, a single day can have many training sessions, and a training session has one sport. A single sport can naturally be in many different training sessions.
The primary keys look like this:
Day - DateTime
TrainingSession - int (DB generated)
Sport - nvarchar(200)
Sports simply have the attributes sportName and iconFileName.
I've set up associations by putting an EntitySet<TrainingSession> in both Day and Sport, and TrainingSession has an EntityRef<DayItem> and an EntityRef<SportArt>. I'm not 100% sure if Sport needs the EntitySet, so please correct me if I'm wrong. For the moment, I just hard-coded some sports in my Sport class for testing, and you'll see me retrieving an ObservableCollection<SportArt> to get those out.
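For reference, here is a minimal sketch of how those associations might be declared with LINQ to SQL attributes. The class names match the code below (DayItem, TrainingSession), but the column names, storage fields and keys are assumptions of mine, not taken from the actual project:
using System;
using System.Data.Linq;
using System.Data.Linq.Mapping;

[Table]
public class DayItem
{
    [Column(IsPrimaryKey = true)]
    public DateTime DateTime { get; set; }

    // One day has many training sessions.
    private EntitySet<TrainingSession> _trainingSessions = new EntitySet<TrainingSession>();

    [Association(Storage = "_trainingSessions", ThisKey = "DateTime", OtherKey = "DayItemId")]
    public EntitySet<TrainingSession> TrainingSessions
    {
        get { return _trainingSessions; }
        set { _trainingSessions.Assign(value); }
    }
}

[Table]
public class TrainingSession
{
    [Column(IsPrimaryKey = true, IsDbGenerated = true)]
    public int Id { get; set; }

    [Column]
    public DateTime DayItemId { get; set; }   // foreign key to DayItem (assumed name)

    // Each training session belongs to one day.
    private EntityRef<DayItem> _dayItem;

    [Association(Storage = "_dayItem", ThisKey = "DayItemId", OtherKey = "DateTime", IsForeignKey = true)]
    public DayItem DayItem
    {
        get { return _dayItem.Entity; }
        set
        {
            _dayItem.Entity = value;
            if (value != null) { DayItemId = value.DateTime; } // keep FK in sync
        }
    }

    // An EntityRef<SportArt> association would follow the same pattern,
    // and SportArt would expose an EntitySet<TrainingSession> like DayItem does.
}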
Here is how I am trying to create a collection of days with training sessions, each training session having different sports:
public void CreateDay(DateTime date)
{
FitPlanDataContext calendarDatabase = new FitPlanDataContext(FitPlanDataContext.ConnectionString);
DateTime firstDate = new DateTime(date.Year, date.Month, 1);
DayItem dayItem = new DayItem();
dayItem.DateTime = firstDate;
fillTestDayItemWithRandomData(dayItem);
calendarDatabase.DayItems.InsertOnSubmit(dayItem);
calendarDatabase.SubmitChanges();
}
private void fillTestDayItemWithRandomData(DayItem dayItem)
{
ObservableCollection<SportArt> sportArtCollection = SportArtController.GetAllSports();
dayItem.TrainingSessions = new EntitySet<TrainingSession>();
ObservableCollection<TrainingSession> trainingSessionCollection = new ObservableCollection<TrainingSession>();
TrainingSession trainingSession1 = new TrainingSession();
trainingSession1.DayItem = dayItem;
trainingSession1.SportArt = sportArtCollection[1];
trainingSessionCollection.Add(trainingSession1);
TrainingSession trainingSession2 = new TrainingSession();
trainingSession2.DayItem = dayItem;
trainingSession2.SportArt = sportArtCollection[2];
trainingSessionCollection.Add(trainingSession2);
FitPlanDataContext calendarDatabase = new FitPlanDataContext(FitPlanDataContext.ConnectionString);
calendarDatabase.TrainingSessions.InsertAllOnSubmit<TrainingSession>(trainingSessionCollection);
}
This code is not working for me, and it is giving me the following error:
NotSupportedException was Unhandled:
An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext. This is not supported.
Before I got this error, I was also getting NullReferenceExceptions.
I've been looking around for a solution, and I saw that some people used Detach or workarounds with Attach, but I haven't figured out how I could apply that to my code. Could anyone give me a helping hand with this?
Also, I thought the NullReferenceException could be coming from the fact that I'm not saving any sports to the database. Could this be so?
So I messed around with it a lot, and today I finally found the solution I was looking for.
It seems I asked the question wrong. I didn't include the query from the database, which is probably important to add. I actually omitted a lot of the code to keep things simple in my question, but looks like I omitted too much.
Anyways, it turned out the way I set up the database structure was correct, and nothing had to be changed there.
So here's what I did to get it working:
-The call to the method that fills the day with training sessions needed to go after submitting the changes for the day. This is because days have training sessions, and I can't save training sessions without the day already being in the database.
-I added using statements around the places where I need to use the DataContext, instead of just creating an instance of the DataContext in a local variable. This ensures that the DataContext lives only in the scope of the using statement.
(I changed the DateTime of the day to be the date given as the parameter to the method)
public void CreateDay(DateTime date)
{
DayItem dayItem = new DayItem();
dayItem.DateTime = date;
using (FitPlanDataContext calendarDatabase = new FitPlanDataContext(FitPlanDataContext.ConnectionString))
{
calendarDatabase.DayItems.InsertOnSubmit(dayItem);
calendarDatabase.SubmitChanges();
}
fillTestDayItemWithRandomData(dayItem);
}
Then, the changes to the method that fills the day with training sessions go like this:
-I open a using statement where I instantiate a new datacontext. Then I access the database to retrieve a list of all the sports, and also the day that I need to update. I find the day I need to update by dayItemParameter. (Remember that retrieving from the database will give you a collection.)
-I create my new training sessions and fill their properties. Note that the day I retrieved from the database is the value of a training session's property because the training session is a child of day, and needs to know who its parent day is.
-I removed the instantiation of EntitySet because I realized that I already instantiate it in the constructor of the DayItem class.
-Lastly, I add all the new training sessions into a collection, and save them all to the database at once using InsertAllOnSubmit(collection).
private void fillTestDayItemWithRandomData(DayItem dayItemParameter)
{
using (FitPlanDataContext calendarDatabase = new FitPlanDataContext(FitPlanDataContext.ConnectionString))
{
ObservableCollection<SportArt> sportArtCollection;
var sportArts = (from SportArt sportArt in calendarDatabase.SportArts
select sportArt);
sportArtCollection = new ObservableCollection<SportArt>(sportArts);
ObservableCollection<DayItem> dayItemCollection;
var dayItems = (from DayItem dayItem in calendarDatabase.DayItems
where dayItem.DateTime == dayItemParameter.DateTime
select dayItem);
dayItemCollection = new ObservableCollection<DayItem>(dayItems);
DayItem foundDayItem = dayItemCollection[0];
ObservableCollection<TrainingSession> trainingSessionCollection = new ObservableCollection<TrainingSession>();
TrainingSession trainingSession1 = new TrainingSession();
trainingSession1.DayItem = foundDayItem;
trainingSession1.SportArt = sportArtCollection[1];
trainingSessionCollection.Add(trainingSession1);
TrainingSession trainingSession2 = new TrainingSession();
trainingSession2.DayItem = foundDayItem;
trainingSession2.SportArt = sportArtCollection[2];
trainingSessionCollection.Add(trainingSession2);
calendarDatabase.TrainingSessions.InsertAllOnSubmit<TrainingSession>(trainingSessionCollection);
calendarDatabase.SubmitChanges();
}
}
Conclusion:
The main problem I was having was that I was trying to save training sessions to a day that wasn't submitted to the database. The next big problem (that I think many others have) is that reading and updating of an entity has to be in the same datacontext. So, you can't create a datacontext to retrieve a day, then use another datacontext to add a training session to that day (even if you saved the value of the day to a local variable). You need to retrieve the day and save training sessions to it all in the same data context.
At the moment, my application is working, but it is quite sluggish. In this question I'm asking about just one day, but in my actual program I'm creating hundreds of days, which means a lot of opening and closing of the database. If anyone has suggestions for how I can optimize the process, I'm all ears.
I realize and apologize that this post got so long, but writing it helped me to understand the situation with more depth, and I really hope that it'll help others too.