Saving Documents to CouchDB URLs with Multiple Slashes

My first exposure to NoSQL databases was through Firebase, where I would typically store JSON data at a URL like category, and later store something else at a URL like category/subcategory.
Trying to do the same in CouchDB, I ran into a problem.
For example, I saved a simple object like:
{"_id":"one"}
to
database/category
which works as expected. Then I try saving the following
{"_id":"two"}
to
database/category/subcategory
I get this error message:
{"error":"not_found","reason":"Document is missing attachment"}
Apparently, when you use multiple slashes in a URL, CouchDB understands the resource as an attachment. If this is so, how does one build databases where the data has multiple levels, like Geography/Continents/Africa/Egypt, for example?

CouchDB is not suitable for the usage you describe: it is a flat document store, and everything after the database name and document id in the URL is treated as an attachment name, which is why you get that error.
You should flatten your structure in order to store it in CouchDB.
{"_id":"country-es",
"type":"geography",
"country":"Spain",
"continent":"Europe"
}
{"_id":"country-fr",
"type":"geography",
"country":"France",
"continent":"Europe"
}
Then use a view to query it hierarchically.
function (doc) {
  if (doc.type == "geography") {
    emit([doc.continent, doc.country], doc._id);
  }
}
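To see what the hierarchical query could look like, here is a minimal sketch. It assumes the view above is saved in a design document named geo under the view name by_location (both hypothetical names); a startkey/endkey range then returns every country under a given continent:

// Fetch all rows whose key starts with ["Europe"], i.e. every country in Europe.
// ({} sorts after all strings in CouchDB's collation, so it acts as a high sentinel.)
const params = new URLSearchParams({
  startkey: JSON.stringify(["Europe"]),
  endkey: JSON.stringify(["Europe", {}])
});

fetch(`http://localhost:5984/database/_design/geo/_view/by_location?${params}`)
  .then(res => res.json())
  .then(body => {
    // Each row carries the [continent, country] key and the emitted doc._id as its value.
    body.rows.forEach(row => console.log(row.key, row.value));
  });

Deeper hierarchies work the same way: emit a longer array key, e.g. [doc.continent, doc.region, doc.country], and query with a correspondingly longer startkey prefix.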

Related

Querying relational data in GraphQL (NoSQL v RDS)

I'm writing an application that contains an overall data model with some obvious relations. I started writing the application using MongoDB but decided to try and transition over to Postgres since my data has tons of "foreign keys". For simplicity, let's consider the following models:
class GameBase {
  id: string
  entryIds: string[]
  submissionIds: string[]
}
class EntryBase {
  id: string
  description: string
  gameId: string
  userId: string // id of user who supplied entry
  submissionIds: string[] // submissions where entry is covered
}
class SubmissionBase {
  id: string
  url: string
  gameId: string
  userId: string // id of user who submitted
  entryIds: string[] // entries covered by submission
}
Now I understand that if I use a tool like TypeORM, I could retrieve these relations with something along the lines of:
const games = await gameRepository.find({ relations: ["entryIds", "submissionIds"] });
But I'm not really sure how that relates to GraphQL. What I've been doing up until now is adding @ResolveField inside my resolvers and writing something like
// game.resolver.ts
@ResolveField(() => [SubmissionBase], { nullable: true })
submissions(@Parent() game: GameBase) {
  return this.submissionService.getManySubmissions(game.submissionIds)
}
and in the service
// game.service.ts
async getManySubmissions(submissionIds: string[]): Promise<SubmissionBase[]> {
  if (!submissionIds) return []
  return await this.submissionRepository.find({
    where: {
      id: { $in: submissionIds },
    },
  })
}
So this makes sense to me and has been working great; I'm just curious whether I would see tangible speed/performance improvements if I switched to a relational database. For example, if the same .find method you see in my service were backed by Postgres instead of MongoDB, with the appropriate foreign key relationships established, could I reasonably expect speed improvements? I imagine I wouldn't, since it's just a simple get with no joins. Also, although submissionIds is a pseudo foreign key (because of MongoDB), it still acts as one in this setup. I guess I'm failing to see why MongoDB is inherently the wrong choice for relational data if you can use GraphQL and something like @ResolveField to grab whatever you need. What would a successful implementation of an RDS backed by GraphQL look like given this context?
This is a good question, although I think it's going to get some opinionated and non-definitive answers. My personal experience and advice after working on multiple production GraphQL servers talking to both SQL and NoSQL databases is this:
If you're going to expose GraphQL over a relational DB like Postgres and you're using NestJS, do not write the GraphQL layer by hand.
It's extremely time-consuming and error-prone, plus you'll run into all kinds of problems related to N+1 queries and performance while sacrificing much of the functionality gained by using an RDS in the first place (like the joins you're talking about). As someone who has gone down this path before: just please don't.
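To make the N+1 point concrete with the question's own resolver (a rough sketch reusing the asker's gameRepository and submissionService names, which are assumptions taken from the question): a single query that asks for many games and their submissions runs the submissions field resolver once per game, so the database is hit once for the game list and then once more per returned game.

// Hypothetical client query: { games { id submissions { id url } } }
// With the hand-written @ResolveField above, execution is roughly equivalent to:
async function resolveGamesWithSubmissions() {
  const games = await gameRepository.find();        // 1 query for N games
  for (const game of games) {
    // The "submissions" field resolver fires once per parent game,
    // adding N more round trips: the classic N+1 pattern.
    game.submissions = await submissionService.getManySubmissions(game.submissionIds);
  }
  return games;
}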
There is a plethora of extremely powerful technologies that allow you to instead generate a GraphQL API on top of your RDS. These technologies then parse the GraphQL AST and convert it into a single, optimized SQL query. I highly recommend that you look into Hasura and PostGraphile. Both of these will blow you away with how productive you can be. Manually writing resolvers on top of SQL relations is just a waste of time.
These tools can then be integrated with your NestJS application. If you're interested in Hasura specifically, I maintain open source packages that can help you integrate it nicely with NestJS.

Filter mongoose documents based on specific fields and attributes

I'm developing a website using the MEAN stack (MongoDB/Express/Angular/Node).
I have a product schema with 12 different fields/properties, including size, color, brand, model, etc. What is the best and most efficient way to filter products: in Angular or on the server side? And how can I chain the results if the client has selected more than one property? What would that look like?
Assuming there will be a lot of products, downloading them all to the client just to filter them in Angular won't scale: as the list of products grows, it becomes less and less performant. Generally, the better way is to let MongoDB do the filtering for you; it's very fast.
But you can still control the filtering from Angular by posting the filter term to the endpoint for that method of filtering, for example using Angular's HttpClient:
http.post('/api/filter/' + methodOfFiltering, { term: termToFilterBy })
  .subscribe(dataReturned => {
    // use dataReturned to do something with the data
  });
Put this in an Angular service method, so you can inject it into any of your controllers/components.
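A minimal sketch of such a service (assuming Angular's HttpClient; the FilterService name is just illustrative):

import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';

@Injectable({ providedIn: 'root' })
export class FilterService {
  constructor(private http: HttpClient) {}

  // methodOfFiltering is the field to filter on (e.g. 'brand'),
  // termToFilterBy is the value the user selected (e.g. 'nike').
  filterProducts(methodOfFiltering: string, termToFilterBy: string) {
    return this.http.post('/api/filter/' + methodOfFiltering, { term: termToFilterBy });
  }
}

A component can then inject FilterService and subscribe to filterProducts(...) whenever the user changes a filter.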
Create an endpoint that will use the method and the keyword in the mongoose query. I'm assuming that you're using Express for your server routes.
app.post('/api/filter/:method', function(req, res) {
  var method = req.params.method;
  var termToFilterBy = req.body.term;
  // Use the method name as the field to filter on, e.g. { brand: 'nike' }
  productSchema.find({ [method]: termToFilterBy }, function(err, products) {
    if (err) return res.status(500).send(err);
    res.send(products);
  });
});
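The question also asks how to chain several selected properties. One way (a sketch, assuming the client posts all of its selections as a single object to a hypothetical /api/filter endpoint) is to pass that object straight into find(), since the fields of a MongoDB filter object are ANDed together:

// Client posts e.g. { filters: { brand: 'nike', color: 'black', size: '42' } }
app.post('/api/filter', function(req, res) {
  var filters = req.body.filters || {};
  // Each key/value pair becomes one condition; MongoDB ANDs them together,
  // so this returns only products matching every selected property.
  productSchema.find(filters, function(err, products) {
    if (err) return res.status(500).send(err);
    res.send(products);
  });
});

In a real application you would whitelist the allowed field names before passing client input into the query.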
Let me know if this helps.

node.js, restify - handle a parameter of type array

I have a node.js server with restify. I want to send it a GET request that has an array of names in it. I think the request should look like this (but I am not sure about it):
/users?names=bob,joe,michael,joey
Is this query correct?
How do I get the names I send on the node.js server?
The W3C recommendation is that a key can be repeated, once for each value:
GET /users?names=bob&names=joe&names=michael&names=joey
Well-designed systems handle this format and recognize the repeated key, grouping its values into an array.
You do not need to specify query variables in your route:
// perform: GET /users?names=bob&names=joe&names=michael&names=joey
server.use(restify.plugins.queryParser()); // restify.queryParser() on older versions; needed so req.query is populated
server.get('/users', function (req, res, next) {
  // All your query vars from the GET request are in req.query;
  // a repeated key arrives as an array, e.g. req.query.names = ['bob', 'joe', 'michael', 'joey']
  res.json(req.query.names);
  return next();
});
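If you would rather keep the comma-separated format from the question (/users?names=bob,joe,michael,joey), that works too; the value arrives as one string and you split it yourself (a sketch):

// perform: GET /users?names=bob,joe,michael,joey
server.get('/users', function (req, res, next) {
  // req.query.names is the single string 'bob,joe,michael,joey'
  var names = (req.query.names || '').split(',');
  res.json(names);
  return next();
});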

AppEngine - Optimize read/write count on POST request

I need to optimize the read/write count for a POST request that I'm using.
Some info about the request:
The user sends a JSON array of ~100 items
The servlet needs to check if any of the received items is newer than its counterpart in the datastore, using a single long attribute
I'm using JDO
What I currently do is (pseudo code):
foreach (item : json.items) {
  storedItem = persistenceManager.getObjectById(item.key);
  if (item.long > storedItem.long) {
    // Update storedItem
  }
}
Which obviously results in ~100 read requests per request.
What is the best way to reduce the read count for this logic? Using a JDO query? I read that "IN" queries simply result in multiple queries executed one after another, so I don't think that would help me :(
There is also PersistenceManager.getObjectsById(Collection). Does that help in any way? I can't find any documentation on how many requests this will issue.
I think you can use a query like the one below to do a batch get:
Query q = pm.newQuery("select from " + Content.class.getName() + " where :contentKeys.contains(contentKey)");
Something like the query above would return all the objects you need.
And you can handle all the rest from there.
Your best bet is
pm.getObjectsById(ids);
since that is intended for getting multiple objects in one call (particularly since you have the ids, and hence the keys). Current code (2.0.1 and later) certainly ought to do a single datastore call for getEntities(). See this issue.

Salesforce Metadata APIs

I want to retrieve a list of metadata components, such as ApexClass, using the Salesforce Metadata API.
I'm getting a list of all the Apex classes (2246 in total) in Salesforce using the following code, and it's taking too much time to retrieve the file names:
ListMetadataQuery query = new ListMetadataQuery();
query.type = "ApexClass";
double asOfVersion = 23.0;
// Assume that the SOAP binding has already been established.
FileProperties[] lmr = metadataService.listMetadata(
    new ListMetadataQuery[] { query }, asOfVersion);
if (lmr != null)
{
    foreach (FileProperties n in lmr)
    {
        string filename = n.fileName;
    }
}
My requirement is to get only the metadata components (Apex classes) developed by my organization, so that I fetch just the components relevant to me and can save time by not retrieving all the classes.
How can I achieve this?
Thanks in advance.
I've not used the Metadata API directly, but I'd suggest either trying to filter on the created-by field, or using a prefixed name on your classes so you can filter on that.
I'm not sure if filters are possible, though! As for speed, my experience of using the Metadata API via Eclipse is that it's always pretty slow and there's not much you can do about it!
