Google App Engine GREATER_THAN and LESS_THAN comparison operator not working - google-app-engine

I am trying to pull data from an entity kind called latLongInfo, fetching all the results whose latitude falls within a certain range. The code below should work, but it doesn't:
Filter lowerLatF = new FilterPredicate("lat", FilterOperator.GREATER_THAN, botLat);
Filter topLatF = new FilterPredicate("lat", FilterOperator.LESS_THAN, topLat);
Filter twoFilter = CompositeFilterOperator.and(lowerLatF, topLatF);
Query rip = new Query("latLongInfo").setFilter(twoFilter);
PreparedQuery ripQ = datastore.prepare(rip);
List<Entity> llResult = ripQ.asList(FetchOptions.Builder.withLimit(15));
int sizeOfList=llResult.size();
The value of botLat is 40.94495459565217 and the value of topLat is 41.3797372043.
In the Datastore I am pulling the data from, there is a result with lat = 41.1623459; however, the code above doesn't find it and keeps giving me sizeOfList = 0.
I should be getting at least one result, but none is returned. Is there something simple I am missing?

I figured this out. I was saving lat as a String rather than as a numerical float or double. Once I changed it to save doubles (these were actually converted to floats in the Datastore), I was able to do the comparison without issue.
Decided to answer my own question...in case this helps anyone else.
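In case it helps, a minimal sketch of the difference (same kind and property names as above, value illustrative): Datastore indexes a property under the type it was saved with, so a numeric inequality filter never matches a String value.
// Saved as a String: indexed lexicographically, so the
// GREATER_THAN / LESS_THAN filters above never match it.
Entity wrong = new Entity("latLongInfo");
wrong.setProperty("lat", "41.1623459");
// Saved as a double: indexed numerically, so the range filter works.
Entity right = new Entity("latLongInfo");
right.setProperty("lat", 41.1623459);
datastore.put(right);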

Related

EF Core 3.1.9 - FromSqlRaw using stored procedures stopped working - 'The underlying reader doesn't have as many fields as expected.'

At one point, using FromSqlRaw to call stored procedures worked for me. I did not change anything in the project, but now calling any stored procedure using FromSqlRaw returns:
The underlying reader doesn't have as many fields as expected
I removed the model from the project and performed a build, then added the model back: no luck. I reduced the model and stored procedure to return a single column: no luck.
I tried adding Microsoft.EntityFrameworkCore.Relational as a dependency: no luck. All my unit tests that use FromSqlRaw to call a stored procedure return the same error, and at one time they all worked.
I have received Windows updates, but nothing I know of that would have affected EF Core. I have run through all the internet problem-solving I can find. I am starting to think I will need to use ADO as a workaround, but I do not want a workaround when this worked for me at one point. Something changed on my machine, but I am not sure what could have caused this problem.
Here is my test method, in case my code is messed up. It is very straightforward; there is not much to mess up. I tried "var" out of desperation.
[TestMethod]
public void WorkOrderBOMGridABS()
{
    List<WorkOrderBOMGridABS> baseList = new List<WorkOrderBOMGridABS>();
    using (WorkOrderDataContext context = new WorkOrderDataContext())
    {
        var param = new SqlParameter[] {
            new SqlParameter() {
                ParameterName = "@WorkOrderId",
                SqlDbType = System.Data.SqlDbType.Int,
                Direction = System.Data.ParameterDirection.Input,
                Value = 38385
            }
        };
        baseList = context.WorkOrderBOMGridABS.FromSqlRaw("[dbo].[WorkOrderBOMGridABS] @WorkOrderId", param).ToList();
        //var results = context.WorkOrderBOMGridABS.FromSqlRaw("[dbo].[WorkOrderBOMGridABS] @WorkOrderId", param).ToList();
        Assert.IsNotNull(baseList);
    }
}
I was using an old table with an integer ID to get the Unit Of Measure value. I switched to a new table with a VARCHAR ID. Making this change to the stored procedure and the model code allowed FromSqlRaw to work. I am not sure why, because the integer ID column always returned a valid value for the model (either 0 or a non-zero number), and no error message I received ever mentioned this UnitId field. It was a pain, but I am glad it is resolved. At least until the next error I run into; that much is guaranteed.
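For what it's worth, my best guess at the mechanism: FromSqlRaw in EF Core 3.x materializes every mapped property of the entity from the result set, so the stored procedure's SELECT list has to stay in sync with the model. A hedged sketch (this model is hypothetical, not the real one from the project):
// Every mapped property must come back from the procedure with a
// matching column name and a compatible type.
public class WorkOrderBOMGridABS
{
    public int WorkOrderId { get; set; }
    public string UnitId { get; set; }   // was int; the new table uses VARCHAR
    public decimal Quantity { get; set; }
}
// If the result set drops, renames, or re-types a mapped column, EF Core 3.x
// fails with "The underlying reader doesn't have as many fields as expected."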

Appending values to DataSet in Apache Flink

I am currently writing a (simple) analysis job to sum time-connected power readings. Since the data is presumably raw (e.g. disturbances from the measuring device have not been filtered out), I have to account for disturbances by calculating the mean of the first one thousand samples. The calculation of the mean itself is not a problem; I am only unsure of how to generate the appropriate DataSet.
For now it looks about like this:
DataSet<Tupel2<long,double>>Gyrotron_1=ECRH.includeFields('11000000000'); // obviously the line to declare the first gyrotron, continues for the next ten lines, assuming separattion of not occupied space
DataSet<Tupel2<long,double>>Gyrotron_2=ECRH.includeFields('10100000000');
DataSet<Tupel2<long,double>>Gyrotron_3=ECRH.includeFields('10010000000');
DataSet<Tupel2<long,double>>Gyrotron_4=ECRH.includeFields('10001000000');
DataSet<Tupel2<long,double>>Gyrotron_5=ECRH.includeFields('10000100000');
DataSet<Tupel2<long,double>>Gyrotron_6=ECRH.includeFields('10000010000');
DataSet<Tupel2<long,double>>Gyrotron_7=ECRH.includeFields('10000001000');
DataSet<Tupel2<long,double>>Gyrotron_8=ECRH.includeFields('10000000100');
DataSet<Tupel2<long,double>>Gyrotron_9=ECRH.includeFields('10000000010');
DataSet<Tupel2<long,double>>Gyrotron_10=ECRH.includeFields('10000000001');
for (int=1,i<=10;i++) {
DataSet<double> offset=Gyroton_'+i+'.groupBy(1).first(1000).sum()/1000;
}
It's the part in the for-loop I'm unsure of. Does anybody know if it is possible to append values to DataSets and if so how?
In case of doubt, I could always put the values into an array but I do not know if that is the wise thing to do.
This code will not work, for many reasons. I'd recommend looking into the fundamentals of Java and its basic data structures, and also into Flink.
It's really hard to tell what you are actually trying to achieve, but this is the closest I came up with:
String[] codes = { "11000000000", ..., "10000000001" };
DataSet<Tuple2<Long, Double>> result = null;
for (final String code : codes) {
    DataSet<Tuple2<Long, Double>> codeResult = ECRH.includeFields(code)
        .groupBy(1)
        .first(1000)
        .sum(1)  // sum the readings (field 1), not the timestamps
        .map(sum -> new Tuple2<>(sum.f0, sum.f1 / 1000d))
        .returns(Types.TUPLE(Types.LONG, Types.DOUBLE)); // type hint: tuple types are erased in lambdas
    result = result == null ? codeResult : codeResult.union(result);
}
result.print();
But please take the time to understand the basics before delving deeper. I also recommend using an IDE like IntelliJ, which would point out at least 6 issues in your code.

Why does this get method stop returning correctly?

I am trying to write an App Engine application for my university. What I am trying to achieve right now is a method which takes in a Course name and returns a list of all the CourseYears (think of that as being like a link table: e.g. if Maths is the course and it has Year 1, Year 2 and Year 3, then MathsYear1, MathsYear2 and MathsYear3 would be the names of the CourseYears).
This is the code for the module (WARNING: super dirty code below!):
@ApiMethod(name = "courseYears")
public ArrayList<CourseYear> courseYears(@Named("name") String name){
    DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();
    Query.Filter keyFilter = new Query.FilterPredicate("name", Query.FilterOperator.EQUAL, name);
    Query query = new Query("Course").setFilter(keyFilter);
    PreparedQuery preparedQuery = datastore.prepare(query);
    List<Entity> resultList = preparedQuery.asList(FetchOptions.Builder.withLimit(1));
    Course course = ofy().load().type(Course.class).id(resultList.get(0).getKey().getId()).now();
    ArrayList<String> courseYearNames = course.getAllCourseYearNames();
    System.out.println(course.getName());
    ArrayList<CourseYear> courseYears = new ArrayList<CourseYear>();
    for(String courseYearName: courseYearNames){
        Query.Filter courseNameFilter = new Query.FilterPredicate("name", Query.FilterOperator.EQUAL, courseYearName);
        Query query2 = new Query("CourseYear").setFilter(courseNameFilter);
        List<Entity> resL = preparedQuery.asList(FetchOptions.Builder.withLimit(1));
        System.out.println("test");
        CourseYear courseYear = ofy().load().type(CourseYear.class).id(resL.get(0).getKey().getId()).now();
        courseYears.add(courseYear);
    }
    return courseYears;
}
It basically takes a Course name in, applies a filter on all courses to get the corresponding Course object, and then calls getAllCourseYearNames() on the course to get an array list containing all its CourseYears' names. (I would have loved to do this using Keys, but parameterised Objectify keys don't seem to be supported in this version of App Engine).
I then try and get the CourseYears by looping through the arraylist of names and applying the filter for each name. I print "test" each time to see how many times it is looping. Like I said, a super dirty way of doing it.
When I try passing a few course names as parameters, it loops the correct number of times only once or twice, and after that does not loop at all (doesn't print "test"). I could understand if it never looped, but not it working once or twice and then never again. Even when it does loop, it doesn't successfully return a list of CourseYears, but rather the relevant number of NULLs - I don't know if this is relevant. I believe it successfully retrieves the course every time, as I print the name of the course after loading and that never fails.
If anyone has ANY suggestions for why this may be happening, I would be incredibly grateful to hear them!
Thanks
query2 is never used in your code. You reuse preparedQuery from your previous query, which runs on a different entity kind.
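A minimal sketch of the fix, preparing query2 inside the loop instead of reusing the old PreparedQuery (names as in the question):
Query query2 = new Query("CourseYear").setFilter(courseNameFilter);
PreparedQuery preparedQuery2 = datastore.prepare(query2); // prepare the new query
List<Entity> resL = preparedQuery2.asList(FetchOptions.Builder.withLimit(1));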

Implementing get_multi on app engine memcache

I was wondering if somebody could help. I'm using the blobcache module outlined in this post here.
This works fine, but I'm looking to speed up retrieval from memcache by using the get_multi()
function; however, my current code cannot find the keys when using get_multi.
My current get def looks like this:
def get(key):
    chunk_keys = memcache.get(key)
    if chunk_keys is None:
        return None
    chunk_keys = ",".join(chunk_keys)
    str(chunk_keys)
    chunk = memcache.get_multi(chunk_keys)
    if chunk is None:
        return None
    try:
        return chunk
    except Exception:
        return None
My understanding from the documentation is that you only need to pass a string of keys to get_multi.
However, this is not returning anything at the moment.
Can someone point out what I'm doing wrong here?
Pass it a list of strings (keys), instead of a single string with commas in it:
get_multi(keys, key_prefix='', namespace=None, for_cas=False)
keys = List of keys to look up. A Key can be a string or a tuple of
(hash_value, string), where the hash_value, normally used for sharding
onto a memcache instance, is instead ignored, as Google App Engine
deals with the sharding transparently.
Multi Get Documentation
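Here is a minimal sketch of get() rewritten that way, assuming the value stored under key is a list of chunk-key strings (as in the blobcache recipe):
def get(key):
    chunk_keys = memcache.get(key)  # a list of key strings
    if chunk_keys is None:
        return None
    # get_multi takes the list directly and returns a dict mapping
    # each key that was found to its cached value.
    chunks = memcache.get_multi(chunk_keys)
    if len(chunks) != len(chunk_keys):
        return None  # at least one chunk has been evicted
    return [chunks[k] for k in chunk_keys]  # preserve chunk order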

Google Datastore problem with query on *User* type

On this question I solved the problem of querying the Google Datastore to retrieve stuff by user (com.google.appengine.api.users.User), like this:
User user = userService.getCurrentUser();
String select_query = "select from " + Greeting.class.getName();
Query query = pm.newQuery(select_query);
query.setFilter("author == paramAuthor");
query.declareParameters("java.lang.String paramAuthor");
greetings = (List<Greeting>) query.execute(user);
The above works fine, but after a bit of messing around I realized this syntax is not very practical as the need to build more complicated queries arises, so I decided to build my filters manually. Now I have, for example, something like the following (where the filter is usually passed in as a string variable, but is built inline here for simplicity):
User user = userService.getCurrentUser();
String select_query = "select from " + Greeting.class.getName();
Query query = pm.newQuery(select_query);
query.setFilter("author == '"+ user.getEmail() +"'");
greetings = (List<Greeting>) query.execute();
Obviously this won't work, even though the field = 'value' syntax is supported by JDOQL and works fine on other fields (String types and enums). The other strange thing is that, looking at the Data viewer in the App Engine dashboard, the 'author' field is stored as type User but the value shown is 'user#gmail.com'. And in the working case above, I declare the parameter as a String and pass in an instance of User, which (I guess) gets serialized with a simple toString().
Anyone have any idea?
Using string substitution in query languages is always a bad idea. It's far too easy for a user to break out and mess with your environment, and it introduces a whole collection of encoding issues, etc.
What was wrong with your earlier parameter-substitution approach? As far as I'm aware, it supports everything, and it sidesteps any parsing issues. As for the problem of knowing how many arguments to pass, you can use Query.executeWithMap or Query.executeWithArray to execute a query with an unknown number of arguments.
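A minimal sketch with executeWithArray, declaring the parameter with its actual type instead of String (the extra date parameter is purely illustrative):
Query query = pm.newQuery("select from " + Greeting.class.getName());
query.setFilter("author == paramAuthor && date > paramDate");
query.declareParameters("com.google.appengine.api.users.User paramAuthor, java.util.Date paramDate");
// Arguments are passed positionally, however many the filter needs.
List<Greeting> greetings = (List<Greeting>) query.executeWithArray(user, someDate);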
