I am moving my app from a Svelte SPA (original) to a SvelteKit multi-page app (new).
In the original app, I configure an HTTP client up top and put it in context using:
setContext(HTTP_CLIENT, httpClient)
Now the entire app can get that HTTP client using
const httpClient = getContext(HTTP_CLIENT)
I do this because my app can be started with debug parameters that turn on HTTP request logging.
I'm not clear how to do something similar in SvelteKit, because it seems that pages do not share a context.
I tried sticking the HTTP client in the session like this:
import { session } from "$app/stores";
$session.httpClient = httpClient
and I got:
Error: Failed to serialize session data: Cannot stringify arbitrary non-POJOs
So $session is meant to be serialized, OK. Does that mean that I need to put whatever debug parameters a user supplied in $session, and that each page needs to freshly instantiate its own HTTP client? Or is there some other idiomatic SvelteKit way of doing this?
PS: I know SvelteKit has its own fetch, so you might want to say "don't use your own HTTP client", but my app uses many different service objects (a GraphQL client, for example) that can be configured in debug (and other) modes, so please don't zero in on the fact that my example is an HTTP client.
One way around this could be to send down the configuration in the top __layout file, create the HTTP client there, and put it in a store. Since stores are shared across all pages, every page can then freely use this store.
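For example, a minimal sketch of that approach (this assumes the pre-1.0 SvelteKit layout convention with src/routes/__layout.svelte and a module-level load, i.e. the version that still has $app/stores session; the $lib/http store module, the /config.json endpoint, and the createHttpClient factory are placeholders, not part of the original answer):

// src/lib/http.js — a plain writable store that will hold the configured client
import { writable } from "svelte/store";
export const httpClient = writable(null);

// src/routes/__layout.svelte
<script context="module">
  // load runs before the layout renders; fetch (or compute) the debug configuration here
  export async function load({ fetch }) {
    const config = await fetch("/config.json").then((r) => r.json());
    return { props: { config } };
  }
</script>

<script>
  import { httpClient } from "$lib/http";
  import { createHttpClient } from "$lib/client"; // your own factory
  export let config;

  // configure the client once; every page can read it via $httpClient
  httpClient.set(createHttpClient(config));
</script>

<slot />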
I am currently trying to use Camunda Platform, and as part of this I am building a React application to call a GraphQL API and perform some actions. So far, I have used the API with Postman and it does what I want. The GraphQL mutation is the following:
mutation claimTask ($taskId: String!, $assignee: String) {
  claimTask (taskId: $taskId, assignee: $assignee) {
    id
    name
    taskDefinitionId
    processName
    creationTime
    completionTime
    assignee
    variables {
      id
      name
      value
      previewValue
      isValueTruncated
    }
    taskState
    sortValues
    isFirst
    formKey
    processDefinitionId
    candidateGroups
  }
}
And the endpoint is
http://{my_ip}:8082/graphql
which is hosted on a personal VM server. What I am trying to do now is make the same request through the React app (Apollo Client). So far, I am getting a CORS policy error:
Access to fetch at 'http://{my_ip}:8082/graphql' from origin 'http://localhost:3000' has been blocked by CORS policy
I understand that I somehow have to configure which origins the server will accept. My question is: since I am using an existing API, should I do this from the Express server (Apollo Server) configuration? So far, every solution I have found talks about implementing the API from scratch, including defining the schemas.
I have concluded that I should use the Express server to create a kind of proxy, so that the React app hits the API through it, but I cannot figure out how exactly this is implemented.
I know that this is a vague question, but any suggestion could be very useful.
Thank you!!
It is a best practice not to hit the GraphQL API directly, but to create your own facade, which exposes the functionality your front-end needs, possibly in a more use-case-specific way. This means connectivity only needs to be allowed server-to-server between the back-ends. It is more secure, as you don't need to open the API to the public, and it also solves the cross-domain challenge you have. Your facade will be exposed under your domain.
Here is an example NestJS client, "Generating the Tasklist service":
https://docs.camunda.io/docs/apis-clients/tasklist-api/tasklist-api-tutorial/#generating-the-tasklist-service
On your Express back-end you would do something similar (a sketch follows below).
(This example uses a Java back-end with React, but I am guessing you want JS: https://github.com/camunda-community-hub/camunda-8-lowcode-ui-template/blob/main/src/main/java/org/example/camunda/process/solution/facade/TaskController.java .)
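As a rough sketch of what such a facade could look like on an Express back-end (hedged: the /api/tasks/:taskId/claim route, the trimmed field selection, and the port are assumptions; authentication against the Tasklist API is omitted, and the global fetch assumes Node 18+, otherwise substitute node-fetch or axios):

// Express facade: the React app calls this route; only the back-end talks to the GraphQL API
const express = require("express");
const app = express();
app.use(express.json());

const TASKLIST_URL = "http://{my_ip}:8082/graphql"; // reachable server-to-server only

app.post("/api/tasks/:taskId/claim", async (req, res) => {
  const query = `
    mutation claimTask($taskId: String!, $assignee: String) {
      claimTask(taskId: $taskId, assignee: $assignee) { id name assignee taskState }
    }`;
  const variables = { taskId: req.params.taskId, assignee: req.body.assignee };

  const response = await fetch(TASKLIST_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, variables }),
  });
  res.json(await response.json());
});

app.listen(8080);

From the React side you would then point Apollo Client (or a plain fetch) at /api/tasks/... on your own domain, which avoids the CORS problem entirely.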
I am planning to migrate some parts of my website from Ruby to React, while others still need to be supported with Ruby as the front-end.
The main issue I am facing concerns the login service.
In the case of Ruby, after login, the cookie that gets generated is defined as
Rails.application.config.session_store :cookie_store, key: '_my_session'
So, even if I create a new login page using ReactJS, I need to create a similar cookie after successful authentication (as I need to support some old Ruby pages that use this cookie for authorization).
So is there any way to create this '_my_session' cookie from ReactJS? Or how can I decode the '_my_session' cookie?
It is possible to read and write that cookie without Rails, but the implementation is specific to the version of Rails you're using and the way your Rails application is configured. More importantly, that cookie is likely marked HTTP-only, and in that case it isn't possible to read or write it from a client-side app (not in React or any other client-side library). The login scenario is typically something you do on the server side, and the server (Rails in your case) uses HTTP headers to read and write that session cookie: Set-Cookie to write it to your user agent via the HTTP response, and Cookie when reading it from the request sent by your user agent.
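In other words, the React login page would typically just POST the credentials to the existing Rails login endpoint and let Rails set _my_session itself. A minimal sketch (the /login path, the field names, and the domain are assumptions):

// React side: Rails creates and returns the _my_session cookie via Set-Cookie
const email = "user@example.com"; // placeholder credentials
const password = "secret";

await fetch("https://your-rails-app.example.com/login", {
  method: "POST",
  credentials: "include", // required so the browser stores/sends the session cookie cross-origin
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ email, password }),
});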
We want to migrate our existing web application (based on an HTTP API) to a REST service model with ReactJS for the UI. We have used the Session object heavily (to hold data and process) in our current application. Is it possible to use the same Session object to hold data, and the Session ID for the authentication process, with REST API + ReactJS?
Yes, and no.
A session is held for a specific HTTP client (say, your web browser) based on a cookie that's sent with every browser request. It doesn't matter if that browser request is for an HTML web page (your current web app) or to a URL that returns JSON (such as an API). As such, you can refactor parts of your application front end to use the same session-based auth (assuming things like the domains and paths for your session cookie allow it, etc.).
Your refactored front end can therefore simply make an HTTP call to retrieve data and your backend can respond accordingly, using the data stored in the session on the server.
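For example, from the refactored front end (a sketch; the /api/profile path is an assumption):

// Same-origin calls carry the session cookie automatically; for a different
// subdomain you also need credentials: "include" plus CORS-with-credentials on the server.
const res = await fetch("/api/profile", { credentials: "include" });
const profile = await res.json();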
This does imply that you'll need to think about your resource abstraction in your API carefully because you cannot simply access your server session data in your JavaScript.
As time goes on you may find you want to refactor your authentication/session layer away from sessions with cookies and look at a proper identity server with JWTs in local storage, but that's well beyond the scope of "can I do it this way".
I found an interesting case in one of the react-scripts apps where the proxy is configured to localhost:3001 (the front end is running on localhost:3000).
From the React code we make a request via axios to localhost:3000/api/results and that loads a bunch of JSON data, but if I open localhost:3000/api/results in a new browser tab, it does not display the JSON but loads the HTML instead.
Why is that happening?
The real problem is that we have endpoints to download files from, like localhost:3001/api/downloads/csv/file.csv, but they won't work, because localhost:3000/api/downloads/csv/file.csv is not proxied to localhost:3001/api/downloads/csv/file.csv, and we cannot simply call this via axios because it should be a direct call from the browser.
However, the strange thing is: why does it work via axios and curl?
By doing curl localhost:3000/api/downloads/csv/file.csv (or 3001), we get the right content back.
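(For reference, the dev proxy in a react-scripts app like the one described is usually either a "proxy": "http://localhost:3001" field in package.json or a src/setupProxy.js along these lines; the sketch below is an assumption about the setup, not taken from the original question.)

// src/setupProxy.js — forwards /api/* from localhost:3000 to the backend on localhost:3001
const { createProxyMiddleware } = require("http-proxy-middleware");

module.exports = function (app) {
  app.use("/api", createProxyMiddleware({ target: "http://localhost:3001", changeOrigin: true }));
};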
If you are using react-scripts, then this is facilitated by Service Workers. Quoting from Service Workers - By Ankita Masand...
Service Worker acts as a proxy server that intercepts the network requests sent by your web application to the server. In the sense, requests to fetch Javascript or CSS files, images go through service worker to the server. Service Worker has the ability to modify this request or send a custom response back to the client.
Here, Service Workers are acting as a client-side proxy, tapping into your HTTP requests, and that is why you get the right content back.
I am implementing Cloud Endpoints with a Python app that uses custom authentication (GAE Sessions) instead of Google Accounts. I need to authenticate the requests coming from the JavaScript client, so I would like to have access to the cookie information.
Reading this other question leads me to believe that it is possible, but perhaps not documented. I'm not familiar with the Java side of App Engine, so I'm not quite sure how to translate that snippet into Python. Here is an example of one of my methods:
class EndpointsAPI(remote.Service):

    @endpoints.method(Query_In, Donations_Out, path='get/donations',
                      http_method='GET', name='get.donations')
    def get_donations(self, req):
        # Authenticate request via cookie
where Query_In and Donations_Out are both ProtoRPC messages (messages.Message). The parameter req in the function is just an instance of Query_In, and I didn't find any properties related to HTTP data; however, I could be wrong.
First, I would encourage you to try to use OAuth 2.0 from your client as is done in the Tic Tac Toe sample.
Cookies are sent to the server in the Cookie header, and these values are typically set in the WSGI environment with keys of the form 'HTTP_...', where ... corresponds to the header name:
import os

http = {key: value for key, value in os.environ.iteritems()
        if key.lower().startswith('http')}  # the HTTP_* keys hold the request headers
For cookies, os.getenv('HTTP_COOKIE') will give you the header value you seek. Unfortunately, this doesn't get passed along through Google's API Infrastructure by default.
UPDATE: This has been enabled for Python applications as of version 1.8.0. To send cookies through, specify the following:
from google.appengine.ext.endpoints import api_config

AUTH_CONFIG = api_config.ApiAuth(allow_cookie_auth=True)

@endpoints.api(name='myapi', version='v1', auth=AUTH_CONFIG, ...)
class MyApi(remote.Service):
    ...
This is a (not necessarily comprehensive) list of headers that make it through:
HTTP_AUTHORIZATION
HTTP_REFERER
HTTP_X_APPENGINE_COUNTRY
HTTP_X_APPENGINE_CITYLATLONG
HTTP_ORIGIN
HTTP_ACCEPT_CHARSET
HTTP_ORIGINALMETHOD
HTTP_X_APPENGINE_REGION
HTTP_X_ORIGIN
HTTP_X_REFERER
HTTP_X_JAVASCRIPT_USER_AGENT
HTTP_METHOD
HTTP_HOST
HTTP_CONTENT_TYPE
HTTP_CONTENT_LENGTH
HTTP_X_APPENGINE_PEER
HTTP_ACCEPT
HTTP_USER_AGENT
HTTP_X_APPENGINE_CITY
HTTP_X_CLIENTDETAILS
HTTP_ACCEPT_LANGUAGE
For the Java people who land here: you need to add the following annotation in order to use cookies in Endpoints:
@Api(auth = @ApiAuth(allowCookieAuth = AnnotationBoolean.TRUE))
(Without that it will work on the local dev server but not on the real GAE instance.)