How to parse a JSON-LD feed using dotNetRDF in C#

I'm trying to consume a json-ld formatted endpoint in a dotnet app.
I've not come across this format before but most of the examples are for JavaScript.
I've tried a couple of libraries and am simply failing because there is so little reference material.
I've loaded the contents of the endpoint into memory and now want to see how best to traverse the nodes, but I can't do anything useful with the contents.
The simplest example OUGHT to look something like:
JsonLdParser parser = new JsonLdParser();
parser.Load(contentsfromuri);
However, the above requires you to have an IRdfReader declared, which cannot be instantiated as it's an abstract class.

You'll find the dotNetRDF documentation at https://github.com/dotnetrdf/dotnetrdf/wiki. There are examples of parsing RDF data from various sources at https://github.com/dotnetrdf/dotnetrdf/wiki/UserGuide-Reading-RDF. In RDF there are syntaxes that only ever serialize a single graph and syntaxes that can serialize multiple graphs; JSON-LD is one of the latter, so you also need to read the section on Store Readers.
The following examples all show loading the data into an in-memory store.
If your content source is "well-behaved" (it sends back the right sort of Content-Type header, or has the expected file name suffix if it is a local file), then loading the data can be as simple as creating a new in-memory triple store and calling its LoadFromUri method:
var store = new TripleStore();
// This is a convenience wrapper that simply invokes UriLoader.Load()
store.LoadFromUri(contentSourceUri);
NOTE: This uses an extension method (as described at https://github.com/dotnetrdf/dotnetrdf/wiki/UserGuide-Extension-Methods) which is just a convenience wrapper around:
// Create the store
var store = new TripleStore();
// UriLoader will make an HTTP request and parse the response,
// selecting the parser to use based on the Content-Type header returned.
UriLoader.Load(store, contentSourceUri);
If you are parsing from a string that you have already retrieved, or if you need to be explicit about the parser instance to use (for example, because you want to pass some options to the parser when you create it), then you need a slightly more verbose approach:
var store = new TripleStore();
// Create the parser (we can pass in options here if needed)
var parser = new JsonLdParser();
// Wrap the string content in a StringReader and pass in the target store and the reader
parser.Load(store, new StringReader(contentsFromUri));
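To then traverse the nodes, a minimal sketch (assuming the store loaded above; the Console output is purely illustrative) is to iterate each graph in the store and each triple within it:
using System;
using VDS.RDF;

// Walk every graph in the store; Subject, Predicate and Object are INode instances
foreach (IGraph g in store.Graphs)
{
    foreach (Triple t in g.Triples)
    {
        Console.WriteLine($"{t.Subject} {t.Predicate} {t.Object}");
    }
}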
One final thing to note, as you specifically refer to JSON-LD: the parser is a conformant JSON-LD 1.0 parser, but its JSON-LD 1.1 support is based on an earlier draft of the spec. I'm currently working on updating the implementation and hope to have a new release that supports the JSON-LD 1.1 Proposed Recommendation in a few weeks.

Related

Representing QVT-Operational transformations in XML, JSON or any other serialized format

I have a requirement where I need to parse the transformations defined in the QVT-Operational file.
I need some way to represent the QVT-Operational transformations in a json, xml or any other serialized format.
In model-to-model transformation performed using operational QVT in Eclipse, I am able to generate a trace file in XML format. The trace file provides details on which element in the source model is mapped to which element in the target model, but I also require the transformation logic. So is there any way to either convert the QVT-Operational file to XML (or any serialized format) or get the transformation details in the trace file?
Interactively there is no support since use of the unstable internal *.qvtox representation is not encouraged.
However programmatically you may save the compiled Resource to a *.qvtox XMI file and load it again later, provided you use a compatible OCL+QVTo release.
See also https://www.eclipse.org/forums/index.php/mv/msg/1109554/1848331/#msg_1848331
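A minimal sketch of that programmatic round trip, assuming you already hold the compiled transformation as an EMF Resource (only standard EMF calls are used; the class and path names are illustrative):
import java.util.Collections;
import org.eclipse.emf.common.util.URI;
import org.eclipse.emf.ecore.resource.Resource;
import org.eclipse.emf.ecore.resource.ResourceSet;
import org.eclipse.emf.ecore.resource.impl.ResourceSetImpl;
import org.eclipse.emf.ecore.xmi.impl.XMIResourceFactoryImpl;

public class QvtoxRoundTrip {
    // Persist the compiled transformation as XMI under the *.qvtox extension
    public static void save(Resource compiled, String path) throws Exception {
        compiled.setURI(URI.createFileURI(path));
        compiled.save(Collections.emptyMap());
    }

    // Load the previously saved compiled transformation (use a compatible OCL+QVTo release)
    public static Resource load(String path) {
        ResourceSet rs = new ResourceSetImpl();
        rs.getResourceFactoryRegistry().getExtensionToFactoryMap()
          .put("qvtox", new XMIResourceFactoryImpl());
        return rs.getResource(URI.createFileURI(path), true);
    }
}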

Need help parsing through a JSON object in JMeter

I'm testing an application that calls one API, gets a bunch of work orders, then passes only the work order IDs to another API to display on the page.
The format they need to be in is: {"workOrderIds":["12345","123456"]}
I'm using the JSON Extractor with the following Path Expressions:
$..workOrderNumber
then I'm using the JSR223 PostProcessor and using the following script:
props.put("workOrderNumber", "${workOrderNumber}");
The problem is that it's creating the object like so when I add the variable into the POST request body of the second request:
{"workOrderIds":["12345, 123456"]}
Essentially, I just need to make sure that each value has quotation marks, but I'm not sure how to make this happen. Sorry if this seems simple; I'm fairly new to QA and have spent several hours trying to figure this out.
We cannot provide a comprehensive answer without seeing the source JSON; maybe it's worth trying to explicitly cast the filtering result to an Integer, like:
vars.put('workOrderIds', new groovy.json.JsonBuilder(new groovy.json.JsonSlurper().parse(prev.getResponseData()).findResults { entry -> entry.workOrderNumber as int }).toPrettyString())
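As a usage sketch (the variable name comes from the script above, the body shape from your question): since the script stores the result in a JMeter variable, the body of the second request can simply reference it as:
{"workOrderIds":${workOrderIds}}
If the target API really does require the IDs as quoted strings, as in your expected payload, then using as String instead of as int in the findResults closure keeps the quotes in the generated array.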
More information:
Apache Groovy - Parsing and producing JSON
Apache Groovy - Why and How You Should Use It

How to change game data on the fly in a packaged UE4 project?

My question seems to be pretty straightforward, but I haven't been able to find any solutions to this online. I've looked at a number of different types of objects like DataTables and DataAssets, only to realize they are for static data alone.
The goal of my project is to have data-driven configurable assets where we can choose different configurations for our different objects. I have been able to successfully pull JSON data down from the database at run-time, but I would like to save that data to something like a Data Asset or something similar that I can read and write to, so that when we pull from the database later we only pull updates to our different configurations and not the entire database every time at start-up.
On a side note: would this be possible/feasible using an .ini file, or is this kind of thing considered too big for something like that (i.e. 1000+ JSON objects)?
Any solutions to this problem would be greatly appreciated.
Like you say, DataTable isn't really usable here. You'll need to use UE4's various File IO API utilities.
Obtaining a Local Path
This function converts a path relative to your intended save directory into one that's relative to the UE4 executable, which is the format expected throughout UE4's File IO.
//DataUtilities.cpp
FString DataUtilities::FullSavePath(const FString& SavePath) {
return FPaths::Combine(FPaths::ProjectSavedDir(), SavePath);
}
"Campaign/profile1.json" as input would result in something like:
"<game.exe>/game/Saved/Campaign/profile1.json".
Before you write anything locally, you should find the appropriate place to do it. Using ProjectSavedDir() results in saving files to <your_game.exe>/your_game/Saved/ in packaged builds, or in your project's Saved folder in development builds. Additionally, FPaths has other named Dir functions if ProjectSavedDir() doesn't suit your purpose.
Using FPaths::Combine to concatenate paths is less error-prone than trying to append strings with '/'.
Storing generated JSON Text Data on Disk
I'll assume you have a valid JSON-filled FString (as opposed to an FJsonObject), since generating valid JSON is fairly trivial.
You could just try to write directly to the location of the full path given by the above function, but if the directory tree to it doesn't exist (i.e., first-run), it'll fail. So, to generate that path tree, there's some path processing and PlatformFile usage.
//DataUtilities.cpp
void DataUtilities::WriteSaveFile(const FString& SavePath, const FString& Data) {
    auto FullPath = FullSavePath(SavePath);
    FString PathPart, Disregard;
    FPaths::Split(FullPath, PathPart, Disregard, Disregard);
    IPlatformFile& PlatformFile = FPlatformFileManager::Get().GetPlatformFile();
    if (PlatformFile.CreateDirectoryTree(*PathPart)) {
        FFileHelper::SaveStringToFile(Data, *FullPath);
    }
}
If you're unsure what any of this does, read up on FPaths and FPlatformFileManager in the documentation section below.
As for generating a JSON string: Instead of using the Json module's DOM, I generate JSON strings directly from my FStructs when needed, so I don't have experience with using the Json module's serialization functionality. This answer seems to cover that pretty well, however, if you go that route.
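If you do go that route, a minimal sketch of building and serializing a JSON object with the Json module (the field names here are purely illustrative, and the "Json" module must be listed in your Build.cs dependencies) looks roughly like this:
#include "Dom/JsonObject.h"
#include "Serialization/JsonWriter.h"
#include "Serialization/JsonSerializer.h"

FString MakeConfigJson()
{
    // Build an object with a couple of illustrative fields
    TSharedRef<FJsonObject> Root = MakeShared<FJsonObject>();
    Root->SetStringField(TEXT("name"), TEXT("profile1"));
    Root->SetNumberField(TEXT("version"), 1);

    // Serialize it into an FString ready to hand to WriteSaveFile()
    FString Output;
    TSharedRef<TJsonWriter<>> Writer = TJsonWriterFactory<>::Create(&Output);
    FJsonSerializer::Serialize(Root, Writer);
    return Output;
}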
Pulling Textual Data off the Disk
// DataUtilities.cpp
bool DataUtilities::SaveFileExists(const FString& SavePath) {
    return IFileManager::Get().FileExists(*FullSavePath(SavePath));
}

FString DataUtilities::ReadSaveFile(const FString& SavePath) {
    FString Contents;
    if (SaveFileExists(SavePath)) {
        FFileHelper::LoadFileToString(Contents, *FullSavePath(SavePath));
    }
    return Contents;
}
As is fairly obvious, this only works for string or string-like data, of which JSON qualifies.
You could consolidate SaveFileExists into ReadSaveFile, but I found benefit in having a simple "does-this-exist" probe for other methods. YMMV.
I assume if you're already pulling JSON off a server, you have a means of deserializing it into some form of traversable container. If you don't, this is an example from the UE4 Answer Hub of using the Json module to do so.
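For reference, a minimal deserialization sketch with the same Json module (again, the field name is illustrative) could be:
#include "Dom/JsonObject.h"
#include "Serialization/JsonReader.h"
#include "Serialization/JsonSerializer.h"

TSharedPtr<FJsonObject> ParseConfigJson(const FString& JsonText)
{
    // Parse the raw JSON text (e.g. the output of ReadSaveFile) into a DOM object
    TSharedPtr<FJsonObject> Root;
    TSharedRef<TJsonReader<>> Reader = TJsonReaderFactory<>::Create(JsonText);
    if (FJsonSerializer::Deserialize(Reader, Root) && Root.IsValid())
    {
        // Pull out a single illustrative field
        const FString Name = Root->GetStringField(TEXT("name"));
    }
    return Root;
}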
Relevant Documentation
FFileHelper
FFileHelper::LoadFileToString
FFileHelper::SaveStringToFile
IFileManager
FPlatformFileManager
FPaths
UE4 Json.h (which you may already be using)
To address your side note: I would suggest using an extension that matches the type of content saved, if for nothing other than clarity of intention, i.e. descriptive_name.json for files containing JSON. If you know ahead of time that you will be reading/needing all of those hundreds or thousands of JSON objects at once, it would likely be better to group as many as possible into fewer files to minimize overhead.

Regex to extract data in a cell and pass it to a subsequent request where it is needed in JMeter

I am using this regex to extract data in a cell and pass it to a subsequent request where it is needed in JMeter.
Using a regex to extract the data in the cell:
"cell":\["","(.*?)","(.*?)","(.*?)","(.*?)","(.*?)","(.*?)","(.*?)","(.*?)","(.*?)","","(.*?)","(.*?)","(.*?)"]}]}
Can someone help to enhance it using Beanshell, or store the result in an array and then pass it to the subsequent request?
It seems you are trying to extract something from a JSON response. Using regular expressions for this is not recommended.
Be aware that starting from JMeter 3.0 there is a JSON Extractor which can be used for fetching data from JSON responses using the JsonPath language.
The relevant JsonPath expression to get the content of cell element will be as simple as:
$..cell
Going forward, please try to include at least the essential parts of the response in your question.

media files converter plugin/component in CakePHP

I am trying to develop a plugin/component that can convert media files from one format to another. Specifically, I need it to convert a TIFF file into one or more JPG image files.
Kindly guide me on how I can implement this, or point me to a tutorial from which I can either download such a component or get some help developing it. Thanks in advance.
We did this in our CMS (built on CakePHP 1.2; sorry if there are any significant discrepancies I'm not aware of) using a behaviour. That makes the controller logic very easy (in fact we use a baked controller without any modification at all).
Unfortunately TIFF isn't a supported file format in GD (the default image manipulation library in PHP). You'll need to use ImageMagick or an equivalent tool to do the actual conversion itself, but the logic for implementing it in your CakePHP project won't be any different to what I describe here.
The behaviour (in our case) was used to generate thumbnail and page-resolution images, as well as to convert the uploaded file format to JPEG.
In its beforeSave() method it checked that data was specified (and that there was no error), and then pulled the tmp_name value from the posted data (and removed the posted data object).
In its afterSave() method, it actually performed the image conversion task itself (putting the generated images in the expected location on disk), then updated any foreign keys on extended models with the uploaded image's ID. We do this in the afterSave() operation so we have a database ID to use to name the files on disk.
In its afterDelete() method we unlink the files on disk.
Using the behaviour in the model is as simple as telling the model (where ContentImage is the name of the behaviour):
var $actsAs = array('ContentImage');
We also use the model to define the output directory, since a few models implemented the behaviour and it felt like the right place for it, e.g. in the model:
function getThumbnailDir() {
    return WWW_ROOT.'img'.DS.'upload'.DS.'thumb';
}
and in the behaviour itself the output path becomes:
$Model->getThumbnailDir().DS.$Model->id.'.jpg'
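For the conversion step itself, a minimal afterSave() sketch using the Imagick extension (the behaviour name matches the example above, but the temporary-path bookkeeping is illustrative rather than our production code) would be along these lines:
<?php
class ContentImageBehavior extends ModelBehavior {

    function afterSave(&$Model, $created) {
        // Path to the uploaded TIFF, stashed in beforeSave() (illustrative)
        $source = $this->tmpPaths[$Model->alias];
        // Output path built exactly as described above
        $target = $Model->getThumbnailDir().DS.$Model->id.'.jpg';

        // Imagick (rather than GD) handles TIFF input
        $image = new Imagick($source);
        $image->setImageFormat('jpeg');
        $image->writeImage($target);
        $image->destroy();

        return true;
    }
}
?>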
