We have a GraphQL server setup that uses the exact same set of GraphQL calls on different client systems. These clients have exactly the same tables/structures, which do not change.
I'd like to make GraphQL calls that indicate which DB connection to use, or, if there is a similar alternative with GraphQL, I'm all ears.
I've been Googling, forum diving, and document scanning for a while...
One possible solution is to create a directive.
const typeDefs = gql`
  directive @db(client: DbClient) on FIELD_DEFINITION

  enum DbClient {
    DB_CLIENT_1
    DB_CLIENT_2
  }

  type Query {
    users: [User] @db
  }
`;
// Directive
// DB client connection config
const dbConnectionConfig = {
  DEFAULT: { /* ... */ },
  DB_CLIENT_1: { /* ... */ },
  DB_CLIENT_2: { /* ... */ }
};
class DbClientDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field;
    // Optional default client set on the directive itself, e.g. @db(client: DB_CLIENT_1).
    const { client: defaultClient } = this.args;
    // Add a `client` argument to the field so callers can pick a connection.
    field.args.push({
      name: 'client',
      type: GraphQLString
    });
    field.resolve = async function (
      source,
      { client = defaultClient, ...otherArgs },
      context,
      info,
    ) {
      let connectionConfig = dbConnectionConfig.DEFAULT;
      if (Object.prototype.hasOwnProperty.call(dbConnectionConfig, client)) {
        connectionConfig = dbConnectionConfig[client];
      }
      // Create a DB client with the given config. You can also persist DB
      // connections in a hash map so you don't establish a new connection every time.
      const dbSession = new DbClient(connectionConfig);
      // Put the dbSession on the context so it can be used by the resolver.
      context.dbSession = dbSession;
      return resolve.call(this, source, otherArgs, context, info);
    };
  }
}
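The comment about persisting connections can be sketched as a small cache helper. This is an illustrative sketch, not part of the original code; `createClient` is a hypothetical factory standing in for `cfg => new DbClient(cfg)`:

```javascript
// Hypothetical connection cache: lazily create one session per client key
// and reuse it on subsequent requests instead of reconnecting every time.
const connectionCache = new Map();

function getDbSession(clientKey, configs, createClient) {
  // Fall back to DEFAULT when the key is unknown, mirroring the directive above.
  const key = Object.prototype.hasOwnProperty.call(configs, clientKey)
    ? clientKey
    : 'DEFAULT';
  if (!connectionCache.has(key)) {
    connectionCache.set(key, createClient(configs[key]));
  }
  return connectionCache.get(key);
}
```

The directive's resolver could then call `getDbSession(client, dbConnectionConfig, cfg => new DbClient(cfg))` instead of constructing a new `DbClient` on every request.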
// UserResolver.js
const UserResolver = {
  Query: {
    users: async (parent, args, context, info) => {
      // Use the dbSession to fetch the results. Note that the resolver
      // does not need to know any DB client details.
      const dbSession = context.dbSession;
      return dbSession.findAll();
    }
  }
};
const resolvers = merge(UserResolver);
const server = new ApolloServer({
  typeDefs,
  resolvers,
  schemaDirectives: {
    db: DbClientDirective
  }
});
server.listen().then(({ url }) => {
  console.log(`🚀 Server ready at ${url}`);
});
You can now pass a client argument to the users query to indicate which DB connection to use.
Example 1 - this will use the DEFAULT DB client:
query {
  users {
    name
    userId
  }
}
Example 2 - this will use DB_CLIENT_1 as the DB client:
query {
  users(client: DB_CLIENT_1) {
    name
    userId
  }
}
In our React project, we use the npm package "@elastic/elasticsearch".
Since migrating to 8.0, the delete function is not working. We implement it this way:
// 1) Declaration of the client instance
import { Client } from '@elastic/elasticsearch';
import Config from 'src/config';

const client = new Client(Config.get('/authentication/elastic'));
export default client;
// 2) Use of the delete function
public async delete(id: string): Promise<any> {
  try {
    return await this.client.delete({
      index: this.indexName,
      id: id,
    });
  } catch (e) {
    return null;
  }
}
The promise does not return an error, it sends this:
{
  _index: 'visitor_0',
  _id: 'RN-PdzFW-Yfr0ahMp',
  _version: 3,
  result: 'deleted',
  _shards: { total: 2, successful: 1, failed: 0 },
  _seq_no: 396,
  _primary_term: 22
}
Problem: it does not delete the object; it updates it with empty content.
I tried deleting manually on the Elastic dashboard, and it works correctly.
I tried a small script entering the id by hand, and it also works.
// my small script
'use strict';
require('dotenv').config();
const util = require('util');
const elastic = require('./services/elastic/client').default;

const debug = o => console.log(util.inspect(o, false, null, true));

(async () => {
  debug('Starting...');
  const id = 'ChmG-wAL-YpjZAdGp';
  try {
    const result = await elastic.delete({ index: 'visitor', id });
    debug(result);
  } catch (e) {
    debug(e);
  }
})();
Do any of you have any idea where my problem could come from?
While developing a small project, there is absolutely no need for a MongoDB persistence layer, but I would like the benefit of publishing and subscribing for client synchronization.
From a related question, I implemented a very crude interface (untested):
// server
const _cache = new Map();
const rooms = {
  registerObserver: (roomHash, observer) => {
    if (!_cache.has(roomHash)) {
      _cache.set(roomHash, { messages: [], observers: new Set() });
    }
    const room = _cache.get(roomHash);
    room.observers.add(observer);
    observer.added("rooms", roomHash, { messages: room.messages });
    observer.onStop(() => room.observers.delete(observer));
    observer.ready();
  }
};
Meteor.publish('chatroom', function (roomHash) {
  check(roomHash, String);
  rooms.registerObserver(roomHash, this);
});

Meteor.methods({
  pushMessage: function (roomHash, message) {
    check(roomHash, String);
    check(message, String);
    const room = _cache.get(roomHash);
    room.messages.push(message);
    room.observers.forEach(observer =>
      observer.changed("rooms", roomHash, { messages: room.messages })
    );
  }
});
But now I need to fetch the messages from the given room, so I added:
// client, React hook
const useChatMessages = roomHash => {
  const loading = useTracker(() => {
    const handle = Meteor.subscribe("chatroom", roomHash);
    return !handle.ready();
  }, [roomHash]);
  const pushMessage = useCallback(message => {
    Meteor.call('pushMessage', roomHash, message);
  }, [roomHash]);
  const messages = []; // .... ???
  return { loading, messages, pushMessage };
};
I have no idea how to fetch the messages. Since I removed the MongoDB dependencies, I do not have access to Mongo.Collection, and it seems that Meteor.Collection is also unavailable (i.e. Meteor.Collection === undefined).
So, I publish, and also subscribe, but how do I fetch the published messages?
(Note: the above code compiles, but it is mostly untested as explained in the question.)
My React code creates a WebSocket connection to our company's ActiveMQ 5.15.5 server, and then subscribes to the following two topics: salary and decoding. The problem is that the code is only able to subscribe to one of the topics. It cannot subscribe to both.
const client = window.Stomp.client(`ws://${ipAddress}:61614`, 'aj6.stomp');
const headers = { id: 'username' };
client.debug = null;
client.connect('user', 'pass', () => {
  client.subscribe(
    '/topic/salary', // BREAKPOINT was set here
    message => {
      const body = JSON.parse(message.body);
      if (body && body.pcId) {
        salaries[body.pcId] = body;
        setState({ salaries });
      }
    },
    headers,
  );
  client.subscribe(
    '/topic/decoding', // BREAKPOINT was set here
    message => {
      const newBody = JSON.parse(message.body);
      if (newBody && newBody.PcID) {
        consoleMessage[newBody.PcID] = newBody;
        setState({ consoleMessage });
      }
    },
    headers,
  );
});
So in the code above I put break-points at client.subscribe('/topic/decoding'... and client.subscribe('/topic/salary'.... I saw that it only subscribes to /topic/decoding but not /topic/salary.
Does anyone know how I can fix this issue so that it subscribes to both topics?
From the STOMP documentation:
Since a single connection can have multiple open subscriptions with a server, an id header MUST be included in the frame to uniquely identify the subscription. The id header allows the client and server to relate subsequent MESSAGE or UNSUBSCRIBE frames to the original subscription.
Within the same connection, different subscriptions MUST use different subscription identifiers.
Stomp.js API definition:
subscribe(destination, callback, headers = {})
So, to my understanding, you can't use the same id ('username') for both of your subscriptions.
Try creating 2 clients, e.g.:
const salaryClient = window.Stomp.client(`ws://${ipAddress}:61614`, 'aj6.stomp');
const salaryHeaders = { id: 'salary' };
salaryClient.debug = null;
salaryClient.connect('user', 'pass', () => {
  salaryClient.subscribe(
    '/topic/salary',
    message => {
      const body = JSON.parse(message.body);
      if (body && body.pcId) {
        salaries[body.pcId] = body;
        setState({ salaries });
      }
    },
    salaryHeaders,
  );
});

const decodingClient = window.Stomp.client(`ws://${ipAddress}:61614`, 'aj7.stomp');
const decodingHeaders = { id: 'decoding' };
decodingClient.debug = null;
decodingClient.connect('user', 'pass', () => {
  decodingClient.subscribe(
    '/topic/decoding',
    message => {
      const newBody = JSON.parse(message.body);
      if (newBody && newBody.PcID) {
        consoleMessage[newBody.PcID] = newBody;
        setState({ consoleMessage });
      }
    },
    decodingHeaders,
  );
});
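Alternatively, since the spec quoted above only requires ids to be unique within a connection, a single connection with a distinct id header per subscription may also work. This is a rough sketch, not a tested fix, assuming the same subscribe(destination, callback, headers) API as the code above; the sub-* ids are made up:

```javascript
// One connection, one unique subscription id per destination
// (the STOMP spec requires ids to be unique within a connection).
const subscriptions = [
  { destination: '/topic/salary', id: 'sub-salary' },
  { destination: '/topic/decoding', id: 'sub-decoding' },
];

function subscribeAll(client, subs, onMessage) {
  subs.forEach(({ destination, id }) => {
    // Each subscribe call gets its own id header instead of a shared one.
    client.subscribe(destination, message => onMessage(destination, message), { id });
  });
}
```

Inside `client.connect('user', 'pass', () => { subscribeAll(client, subscriptions, handler); })` this would register both topics on the single existing connection.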
I am using Amplify and AppSync for a small website. I am trying to create a contact form, and I need to send an email after the mutation. Can anyone suggest the best way to approach this?
It is actually quite simple. When you run a mutation you can invoke a Lambda and execute the following code using SES from the aws-sdk.
You trigger the Lambda within AppSync by choosing the function as the mutation's data source (don't forget to set the proper IAM permissions for this). Then you need two mapping templates, one for the request and one for the response. With the request template you can pass the input arguments of the mutation to the Lambda.
It could look like this for the mappingTemplate.request.vtl
{
  "version": "2018-05-29",
  "operation": "Invoke",
  "payload": {
    "field": "fieldVariable",
    "arguments": $utils.toJson($context.arguments)
  }
}
And for the mappingTemplate.response.vtl
#if( $context.result && $context.result.error )
$utils.error($context.result.error)
#else
$utils.toJson($context.result)
#end
This will execute your Lambda, and you have your passed arguments within event.arguments:
import { SES } from 'aws-sdk';
...
const ses = new SES();

exports.handler = async event => {
  const bccEmailAddresses = [];
  const ccEmailAddresses = [];
  const toEmailAddresses = [];
  const bodyData = '';
  const bodyCharset = 'UTF-8';
  const subjectData = '';
  const subjectCharset = 'UTF-8';
  const sourceEmail = '';
  const replyToAddresses = [];
  const emailParams = {
    Destination: {
      BccAddresses: bccEmailAddresses,
      CcAddresses: ccEmailAddresses,
      ToAddresses: toEmailAddresses
    },
    Message: {
      Body: {
        Text: {
          Data: bodyData,
          Charset: bodyCharset
        }
      },
      Subject: {
        Data: subjectData,
        Charset: subjectCharset
      }
    },
    Source: sourceEmail,
    ReplyToAddresses: replyToAddresses
  };
  await ses.sendEmail(emailParams).promise();
};
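For illustration, with the request template above the Lambda's event would look roughly like this; the field value and argument names here are hypothetical, assuming a contact-form mutation:

```javascript
// Illustrative shape of the Lambda event produced by the Invoke
// request mapping template above (names are hypothetical).
const event = {
  field: 'fieldVariable',
  arguments: { email: 'jane@example.com', message: 'Hello!' }
};

// The handler reads the mutation input from event.arguments:
const { email, message } = event.arguments;
```

Those values can then be plugged into the emailParams above (e.g. as the body data and reply-to address).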
In this case, type "X" is Application and type "Y" is Node - I can see why this is happening, but my understanding of Relay isn't enough to know how to fix it. The query generated by Relay is
query {
  node(id: $some_id) {
    ...F0
  }
}

fragment F0 on Application {
  ...
}
I have a schema that looks like
query {
  application {
    # kind of a generic endpoint for fetching lists, etc.
    invites(token: $token) {
      name
    }
  }
  viewer {
    # the current user
  }
}
I'm trying to fetch a specific invite from outside a session (viewer is null).
I've tried
const application = Relay.QL`query { application }`;
...
<Route ... queries={{ application }} />
...
Relay.createContainer(Component, {
  initialVariables: { token: null },
  fragments: {
    application: () => Relay.QL`
      fragment on Application {
        invites(token: $token) {
          ...
        }
      }
    `
  }
})
which gives me the error
fragment "F0" cannot be spread here as objects of type "Node" can never be of type "Application" - or something to that effect.
I'm a little confused, because if I were to write a raw query and run it through GraphQL directly
query {
  application {
    invites(token: "asdasdasd") {
      edges {
        node {
          name
        }
      }
    }
  }
}
it gives me what I'm looking for...
In the backend, my graph is defined like
export const Application = new GraphQLObjectType({
  name: 'Application',
  fields: () => ({
    id: {
      type: GraphQLString,
      resolve: () => 'APPLICATION_ID'
    },
    invites: {
      type: InviteConnectionType,
      args: connectionArgs,
      resolve: (application, args) => {
        ...
      }
    }
  })
})
export default new GraphQLSchema({
  query: new GraphQLObjectType({
    name: 'query',
    fields: {
      node: nodeField,
      application: {
        type: Application,
        resolve: (root, args, ctx) => {
          return Promise.resolve({})
        }
      }
    }
  })
})
I've been looking at questions like this and some issues on the Relay GitHub, but it's not clear to me how I should implement nodeInterface.
Edit: the long and short of the current nodeInterface code is
export const {
  nodeInterface,
  nodeField
} = nodeDefinitions(
  (globalId) => {
    const { type, id } = fromGlobalId(globalId)
    return db[type].findById(id)
  },
  (obj) => {
    const name = obj.$modelOptions.name.singular
    return types[name]
  }
)
Application is not a db model, however; it's just a generic interface to fetch data through. I've tried checking whether type === 'Application' and returning null (although I see why that doesn't work), and returning Application (the GraphQLObject), but that doesn't work either... not really sure where to go from there.
You need to automatically generate a unique global id for every GraphQL type that you want to refetch.
In nodeInterface you tell GraphQL how to map an id to the corresponding GraphQL object, and, given a server-side object, nodeInterface identifies its GraphQL type.
Below is a simplified example of how it may look with Application:
// nodeInterface.
var { nodeInterface, nodeField } = nodeDefinitions(
  (globalId) => {
    var { type, id } = fromGlobalId(globalId);
    // The mapping from globalId to the actual object id and type.
    console.log('id:', id);
    console.log('type:', type);
    if (type === 'Application') {
      // getApplication is your db method to retrieve an Application object.
      // With id you could also retrieve a specific db object.
      return getApplication();
    } else {
      return null;
    }
  },
  (obj) => {
    // Note that instanceof does an identity check on the prototype object, so it can be easily fooled.
    if (obj instanceof Application) {
      return ApplicationType;
    } else {
      return null;
    }
  },
);
// Application.
export const ApplicationType = new GraphQLObjectType({
  name: 'Application',
  fields: () => ({
    // Auto-generated, globally defined id.
    id: globalIdField('Application'),
    _id: {
      type: GraphQLString,
      resolve: () => 'APPLICATION_ID'
    },
    invites: {
      type: InviteConnectionType,
      args: connectionArgs,
      resolve: (application, args) => {
        ...
      }
    }
  }),
  // Declaring nodeInterface.
  interfaces: [nodeInterface]
});
Note that during the initial fetch nodeInterface is not even executed, so if nodeInterface returns nothing there won’t be errors during the initial fetch. If that doesn’t make sense or you’re still struggling, you can post a link to the repo and I’ll look into it.
To give an update on this, I was on the right path.
The current nodeDefinitions I had just needed a little extra:
nodeDefinitions(
  (globalId) => {
    const { type, id } = fromGlobalId(globalId)
    if (type === 'Application') {
      return Promise.resolve(Application)
    }
    return db[type].findById(id)
  },
  (obj) => {
    if (obj.$modelOptions) {
      /* sequelize object */
      const name = obj.$modelOptions.name.singular
      return types[name]
    } else if (obj.name === 'Application') {
      return Application
    }
    return null
  }
)
I'm not sure if this is the best way to do it, but it seems to do the trick. The gist is that, if the type of node I want returned is Application, I return the GraphQL object ({ ... name: "Application" ... }). The name field from that object is then used in the second callback of nodeDefinitions to simply re-return Application. I think you could return a "custom" object or something else instead; it doesn't matter, as long as you return something unique that you can map to a GraphQLObjectType (which is what the second callback requires).
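The round trip described above can be illustrated with the two nodeDefinitions callbacks written as plain functions, decoupled from Relay. The db, types, and model shapes below are hypothetical stand-ins for the real ones in the answer:

```javascript
// Hypothetical stand-ins for the real GraphQL types and db models.
const ApplicationType = { name: 'Application' }; // plays the role of the GraphQL object
const types = { user: 'UserType' };
const db = {
  user: {
    // A sequelize-like model: returned objects carry $modelOptions.
    findById: id => ({ id, $modelOptions: { name: { singular: 'user' } } })
  }
};

// First nodeDefinitions callback: (type, id) from the globalId -> server-side object.
function fetchById(type, id) {
  if (type === 'Application') return ApplicationType;
  return db[type].findById(id);
}

// Second callback: server-side object -> GraphQL type.
function resolveType(obj) {
  if (obj.$modelOptions) return types[obj.$modelOptions.name.singular];
  if (obj.name === 'Application') return ApplicationType;
  return null;
}
```

The key property is that whatever fetchById returns for 'Application' is distinctive enough for resolveType to map it back to the Application GraphQL type.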