How secure is str(BlobKey)? - google-app-engine

I have implemented a generic blob serving handler as mentioned in the appengine docs. The handler will serve any blob to you, as long as you know that blob's key string. I am using it to easily compose URLs that clients can use to download their files. If client A inspects the URL to download their file and finds their blob key (i.e. 1CX2kh468IDYKGcDUiq5c69u8BRXBtKBYcIaJkmSbSa4QY096gGVaYCZJjGZUpDz == str(BlobKey)), can they somehow reverse-engineer this key and easily construct another key that can be used to download client B's files? Or does the key have a random component added?
For reference, there is this note about str(db.Key), which is what raises my question:
Note: The string representation of a key looks cryptic, but is not
encrypted! It can be converted back to the raw key data, both kind and
identifier. If you don't want to expose this data to your users (and
allow them to easily guess other entities' keys), then encrypt these
strings or use something else.
I am creating the files like this, which does not specify a filename parameter, so I think the question boils down to: how does create() "pick" a filename when one is not specified? I suppose I could generate a random filename and pass it in here to be doubly sure this is secure.
file_name = files.blobstore.create(mime_type='application/octet-stream')

BlobKeys are non-guessable. If a user has one key, that in no way enables them to guess another key. Unlike datastore keys, which contain full path information, BlobKeys do not encode any such data. You can share them safely without risk of a user carrying out the attack you describe.
(I could not locate docs for these claims - this is based on my recollection.)

Assign a filename when creating a blob:
name = .....
file_name = files.blobstore.create(mime_type='application/octet-stream', _blobinfo_uploaded_filename=name)
And you do not need to use str(BlobKey): the BlobKey itself can be part of your serving URL.

Related

How to change game data on the fly in a packaged UE4 project?

My question seems pretty straightforward, but I haven't been able to find any solutions to it online. I've looked at a number of different types of objects, like DataTables and DataAssets, only to realize they are for static data alone.
The goal of my project is to have data-driven configurable assets where we can choose different configurations for our different objects. I have been able to successfully pull JSON data down from the database at run-time, but I would like to save that data to something like a Data Asset or similar that I can read from and write to, so that when we pull from the database later we only pull updates to our different configurations and not the entire database (every time at start-up).
On a side note: would this be possible/feasible using an .ini file, or is this kind of thing considered too big for something like that (i.e., 1000+ JSON objects)?
Any solutions to this problem would be greatly appreciated.
Like you say, DataTable isn't really usable here. You'll need to utilize UE4's various File IO API utilities.
Obtaining a Local Path
This function converts a path relative to your intended save directory into one that's relative to the UE4 executable, which is the format expected throughout UE4's File IO.
//DataUtilities.cpp
FString DataUtilities::FullSavePath(const FString& SavePath) {
    return FPaths::Combine(FPaths::ProjectSavedDir(), SavePath);
}
"Campaign/profile1.json" as input would result in something like:
"<game.exe>/game/Saved/Campaign/profile1.json".
Before you write anything locally, you should find the appropriate place to do it. Using ProjectSavedDir() results in saving files to <your_game.exe>/your_game/Saved/ in packaged builds, or in your project's Saved folder in development builds. Additionally, FPaths has other named Dir functions if ProjectSavedDir() doesn't suit your purpose.
Using FPaths::Combine to concatenate paths is less error-prone than trying to append strings with '/'.
Storing generated JSON Text Data on Disk
I'll assume you have a valid JSON-filled FString (as opposed to an FJsonObject), since generating valid JSON is fairly trivial.
You could just try to write directly to the full path given by the above function, but if the directory tree leading to it doesn't exist (e.g., on first run), the write will fail. So, to generate that path tree, there's some path processing and PlatformFile usage.
//DataUtilities.cpp
void DataUtilities::WriteSaveFile(const FString& SavePath, const FString& Data) {
    auto FullPath = FullSavePath(SavePath);
    FString PathPart, Disregard;
    FPaths::Split(FullPath, PathPart, Disregard, Disregard);
    IPlatformFile& PlatformFile = FPlatformFileManager::Get().GetPlatformFile();
    if (PlatformFile.CreateDirectoryTree(*PathPart)) {
        FFileHelper::SaveStringToFile(Data, *FullPath);
    }
}
If you're unsure what any of this does, read up on FPaths and FPlatformFileManager in the documentation section below.
As for generating a JSON string: I generate JSON strings directly from my FStructs when needed, rather than using the Json module's DOM, so I don't have experience with the Json module's serialization functionality. If you go that route, however, this answer seems to cover it pretty well.
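If you do go the Json-module route, a minimal sketch might look like this; it's untested here, and FMyConfigRow is just a stand-in for whatever reflected USTRUCT holds your configuration:
#include "JsonObjectConverter.h"   // from the "JsonUtilities" module

// FMyConfigRow is a placeholder for your own USTRUCT; only UPROPERTY()
// fields are picked up by the converter.
FString ConfigToJsonString(const FMyConfigRow& Row)
{
    FString OutJson;
    FJsonObjectConverter::UStructToJsonObjectString(Row, OutJson);
    return OutJson;
}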
Pulling Textual Data off the Disk
// DataUtilities.cpp
bool DataUtilities::SaveFileExists(const FString& SavePath) {
    return IFileManager::Get().FileExists(*FullSavePath(SavePath));
}

FString DataUtilities::ReadSaveFile(const FString& SavePath) {
    FString Contents;
    if (SaveFileExists(SavePath)) {
        FFileHelper::LoadFileToString(Contents, *FullSavePath(SavePath));
    }
    return Contents;
}
As is fairly obvious, this only works for string or string-like data, which JSON is.
You could consolidate SaveFileExists into ReadSaveFile, but I found benefit in having a simple "does-this-exist" probe for other methods. YMMV.
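For illustration, hypothetical call sites for the helpers above might look like the following (the path is the same example as earlier, and the JSON payload is a placeholder):
const FString ProfilePath = TEXT("Campaign/profile1.json");
DataUtilities::WriteSaveFile(ProfilePath, TEXT("{\"Difficulty\": 2}"));

if (DataUtilities::SaveFileExists(ProfilePath))
{
    const FString Json = DataUtilities::ReadSaveFile(ProfilePath);
    // hand Json to your deserialization step (see below)
}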
I assume if you're already pulling JSON off a server, you have a means of deserializing it into some form of traversable container. If you don't, this is an example from the UE4 Answer Hub of using the Json module to do so.
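If it helps, a minimal sketch of that deserialization with the Json module might look like this (the "Difficulty" field is illustrative):
#include "Dom/JsonObject.h"
#include "Serialization/JsonReader.h"
#include "Serialization/JsonSerializer.h"

// Parse a JSON string (e.g. one returned by ReadSaveFile) into a
// traversable FJsonObject.
TSharedPtr<FJsonObject> ParseJsonString(const FString& JsonString)
{
    TSharedPtr<FJsonObject> Root;
    TSharedRef<TJsonReader<>> Reader = TJsonReaderFactory<>::Create(JsonString);

    if (FJsonSerializer::Deserialize(Reader, Root) && Root.IsValid())
    {
        int32 Difficulty = 0;
        Root->TryGetNumberField(TEXT("Difficulty"), Difficulty);
    }
    return Root;
}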
Relevant Documentation
FFileHelper
FFileHelper::LoadFileToString
FFileHelper::SaveStringToFile
IFileManager
FPlatformFileManager
FPaths
UE4 Json.h (which you may already be using)
To address your side note: I would suggest using an extension that matches the type of content saved, if for nothing other than clarity of intention, e.g., descriptive_name.json for files containing JSON. If you know ahead of time that you will be reading/needing all hundreds or thousands of JSON objects at once, it would likely be better to group as many as possible into fewer files, to minimize overhead.

LogicApps scenario log on, download, zip

I access a 3rd party website using forms authentication (username and password).
Once logged on, I make a call to an HTTP endpoint and receive XML in the body. The XML contains 1000 elements. Within each element there is a text value, a code.
For each of these codes I make a further call to a different HTTP endpoint. The response is more XML.
When all 1000 responses have been received I would like to add all the XML responses as files to a zip container and make it available for download.
I would like to see how LogicApps could do this as quickly as possible.
1. Make the call to the first HTTP endpoint (auth set to basic auth with the user/pass inputted).
2. Use the xpath(xml(<body var here>), '//elementNameHere') expression on the Body of the result from the call to get all the elements of the return value that have the code in it.
3. Foreach over this return value and:
   3.1. make the HTTP call
   3.2. append the result to an array variable, or concat it onto a string variable.
4. Submit this value to blob storage.
Because you're messing with variables in the foreach loop, though, you'll have to do it sequentially (set concurrency control on the Foreach loop to 'on' and '1'), or else you could end up with a bad result.
I don't know of a way to zip contents here, so you may have to send the result to an Azure Function that uses a .NET zip library to do the work (or a JS zip library, whatever your flavor) and does the put to blob storage for you.
This would also all be much easier in Durable Functions land; I encourage you to look into that if you're so inclined.
One mild alternative you might consider: for step 3.2, instead upload each result to a blob storage container, then make the entire container available for download via an Azure Function that gets the container and zips up its contents (or does the Blob Storage URL for a container already do this for you? I'm not sure).

How do you modify an extension from an X509?

I am creating an API for modifying X509 certificates in C and I want to add a way to modify an extension. For example, add another DNS entry to subjectAltName so that it would be DNS:example.com,DNS:example2.com instead of just DNS:example.com. The reason that deleting and re-adding is bad is that I then have to reparse the extension (which is difficult); I would rather just add a piece of information. How would I do this via the OpenSSL API?
I tried to simply reuse the add code:
ex = X509V3_EXT_conf_nid(NULL, &ctx, NID_subject_alt_name, "DNS:new.dns.example");
if (!ex)
    return;
X509_add_ext(cert, ex, -1);
X509_EXTENSION_free(ex);
But after running that, the extension isn't found at all (even if I try to add another new one).
Have a look at
demos/x509/mkcert.c
demos/x509/selfsign.c
demos/x509/mkreq.c
in your openssl distribution. I suspect that you are not setting up your context, or that you are assuming some i2d/etc. call happens behind the scenes when you say it 'is not found at all'. Note that X509_sign() and its ilk do create a lot of this. If you are not calling anything like that, then do not expect any of those structures to be created.
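For reference, the add_ext() helper in those demos sets up the context before building the extension, roughly like this (a sketch only; 'cert' stands in for your X509*, and you would still need to re-sign the certificate after modifying it):
#include <openssl/x509v3.h>

/* Based on the add_ext() pattern in demos/x509/mkcert.c. */
static int add_san(X509 *cert, const char *value)
{
    X509_EXTENSION *ex;
    X509V3_CTX ctx;

    X509V3_set_ctx_nodb(&ctx);                        /* no config database */
    X509V3_set_ctx(&ctx, cert, cert, NULL, NULL, 0);  /* issuer and subject */

    ex = X509V3_EXT_conf_nid(NULL, &ctx, NID_subject_alt_name, value);
    if (!ex)
        return 0;

    X509_add_ext(cert, ex, -1);
    X509_EXTENSION_free(ex);
    return 1;
}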

Getting a URL with GtkClipboard

I'm trying to check if a url is on the system clipboard and if so get it from the clipboard. While reading the GTK API docs I came across gtk_clipboard_wait_for_uris but it seems to always return NULL:
g_print("%s", gtk_clipboard_wait_for_uris(gtk_clipboard_get(GDK_SELECTION_PRIMARY)));
What would be the correct/best way of getting a url from the clipboard?
wait_for_uris only works if the clipboard contains data tagged as a URI list (text/uri-list), e.g. when you perform a copy action in a file manager. It won't work as you expect if you copy a piece of text that just happens to contain a URI, e.g. "http://stackoverflow.com/". When you copy that string, it's most likely tagged as plaintext (text/plain).
The solution is to use wait_for_text and check whether it's a URI.
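A minimal sketch of that approach (the function name is illustrative, and the prefix check is a crude stand-in for proper URI validation):
#include <gtk/gtk.h>

static gchar *
get_clipboard_url (void)
{
  /* GDK_SELECTION_CLIPBOARD is the explicit copy/paste clipboard;
   * GDK_SELECTION_PRIMARY is the select-to-paste buffer. */
  GtkClipboard *clipboard = gtk_clipboard_get (GDK_SELECTION_CLIPBOARD);
  gchar *text = gtk_clipboard_wait_for_text (clipboard);

  if (text != NULL &&
      (g_str_has_prefix (text, "http://") || g_str_has_prefix (text, "https://")))
    return text;            /* caller frees with g_free() */

  g_free (text);            /* empty clipboard or not a URL */
  return NULL;
}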

Torrent file protocol - custom field

I am wondering if there is any available field in .torrent files that could be used for some custom functionality in someone's implementation of a torrent client. For example, one might want to encode a URL to the file owner's website, someone else a custom message to be displayed when opening the files, etc. Is something like this feasible in the current implementation of .torrent files?
Yes. .torrent files are just bencoded dictionaries and can hold arbitrary key-value pairs.
The main consideration when adding a custom field is to determine whether it should go into the root of the .torrent or inside the info dictionary.
If it goes into the root, it will not affect the info hash (which is the unique identifier of the torrent), and it will also not be available when downloading magnet links.
If it goes into the info dictionary, it is sort of locked down to the info-hash, in the sense that the info-hash depends on it. It will be transferred as part of the metadata when downloading magnet links and it cannot be changed (without changing the info-hash and thus creating a separate swarm).
So, if it's something you want third parties to be able to change after the torrent is created, it should go in the root; if you want it to be entered once when the torrent is created and never change, it should go in the info dict.
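To make the two placements concrete, here is an illustrative (not byte-exact) sketch of a bencoded root dictionary carrying a custom key; the tracker URL and the "x-website" key are made up:
/* Keys in a bencoded dictionary appear in sorted order. Editing the
 * root-level "x-website" value leaves "info" byte-for-byte identical,
 * so the info-hash (and the swarm) is unchanged; a key inside "info"
 * would not have that property. */
const char *torrent_sketch =
    "d"
      "8:announce"  "25:udp://tracker.example.com"
      "4:info"      "d" /* name, piece length, pieces, ... */ "e"
      "9:x-website" "19:https://example.com"
    "e";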
