Is it recommended to use the parse.com API for a mobile application while using other databases such as amazon simpleDB for most of the app back-end data?
Basically, parse.com would only be used for its login/register user system.
That leads to some questions such as "can our own server check a parse.com sessionToken validity before returning data?"
Anyway, has anyone used parse.com in combination with a bigger cloud service such as AWS?
Many thanks!
can our own server check a parse.com sessionToken validity before returning data?
I think you can achieve this using Parse's Cloud Code.
You could try something like:
Parse.Cloud.define("validateUser", function(request, response) {
  var myname = request.params.myname;
  var mypass = request.params.mypass;
  Parse.User.logIn(myname, mypass, {
    success: function(user) {
      // The login succeeded; return something useful to the caller.
      response.success(user.getSessionToken());
    },
    error: function(user, error) {
      // The login failed. Check error to see why.
      response.error(error);
    }
  });
});
And test it with the following curl (be sure to replace the App ID and Rest API Key):
curl -X POST \
-H "X-Parse-Application-Id: <YOUR_PARSE_APP_ID_HERE>" \
-H "X-Parse-REST-API-Key: <YOUR_PARSE_REST_API_KEY_HERE>" \
-H "Content-Type: application/json" \
-d '{"myname":"johnPooter", "mypass":"pooter123"}' \
https://api.parse.com/1/functions/validateUser
As to whether it is recommended or not to use other back-ends with Parse: I think it is fine but it defeats the purpose of using Parse. Parse is good when you are trying to throw something together quickly and don't want to invest resources towards solving common back-end problems.
I am trying to use the Aurora Serverless Data API feature to reduce the DB connection time in my serverless application, but building the client is taking time.
I would like to call the RDS HTTP service via Lambda to get/post data.
I came across some posts, but I am still getting the error "missing authentication token".
https://docs.aws.amazon.com/rdsdataservice/latest/APIReference/API_ExecuteStatement.html
My sample request is below for MySQL. I have run this via AWS cloud shell.
curl --location --request POST 'https://rds-data.us-west-2.amazonaws.com/Execute' \
--header 'Content-Type: application/json' \
--data-raw '{
    "continueAfterTimeout": false,
    "database": "demo_data",
    "includeResultMetadata": true,
    "parameters": [],
    "resourceArn": "arn:aws:rds:us-west-2:*******:cluster:rds-serverless",
    "schema": "demo_data",
    "secretArn": "arn:aws:secretsmanager:us-west-2:******:secret:serverless/user_u-cMt2Q4",
    "sql": "select now()"
}'
If I can recommend a different approach: AWS has already done a lot of the work for you, so you shouldn't need to construct your own access mechanism for the Data API.
For instance, from the command line you can access the data api using the aws cli application (available on mac, windows, linux, etc) like this:
aws rds-data execute-statement --resource-arn "arn:aws:rds:us-east-1:123456789012:cluster:mydbcluster" \
--database "mydb" --secret-arn "arn:aws:secretsmanager:us-east-1:123456789012:secret:mysecret" \
--sql "select * from mytable"
For access to the data api from within a lambda, you would usually want to pull in the sdk for whatever language your lambda is setup to run and then to use their libraries to access the data api and other services.
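As a sketch of what that looks like with the AWS SDK for JavaScript: the SDK client signs every request with SigV4 for you, which is exactly what the raw curl request was missing (hence the "missing authentication token" error). The ARNs below are placeholders modeled on the question:

```javascript
// Sketch: the parameter shape the RDS Data API's ExecuteStatement expects,
// as it would be passed to an SDK client. The SDK handles SigV4 signing,
// which a raw curl request lacks. ARNs and names below are placeholders.
function buildExecuteStatement(resourceArn, secretArn, database, sql) {
  return {
    resourceArn: resourceArn,
    secretArn: secretArn,
    database: database,
    includeResultMetadata: true,
    sql: sql
  };
}

const params = buildExecuteStatement(
  "arn:aws:rds:us-west-2:123456789012:cluster:rds-serverless",
  "arn:aws:secretsmanager:us-west-2:123456789012:secret:serverless/user",
  "demo_data",
  "select now()"
);
console.log(params.sql); // "select now()"

// Inside a Lambda, the call itself would look roughly like:
// const rdsData = new AWS.RDSDataService();
// rdsData.executeStatement(params).promise().then(console.log);
```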
If you are dead-set on using the command line from within lambda you could even create a lambda layer that includes the aws cli mentioned above and then use a system command to call the api that way, though I wouldn't recommend it.
Hopefully one of these easier solutions will work better for you than curl!
I wish to use the Google Cloud Platform (GCP) REST API locally, starting with the apps.services.versions.instances.list method.
The route works when I use "Try this API" here, but how would I use this method locally with curl?
curl "https://appengine.googleapis.com/v1/apps/$APPSID/services/$SERVICESID/versions/$VERSIONSID/instances?key=$YOUR_API_KEY" \
--compressed \
--header 'Accept: application/json' \
--header "Authorization: Bearer $YOUR_ACCESS_TOKEN"
#=>
{
  "error": {
    "code": 401,
    "message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
    "status": "UNAUTHENTICATED"
  }
}
How do I access $YOUR_API_KEY and $YOUR_ACCESS_TOKEN? I have been unable to find either in the official GCP docs.
The fastest way is to use Cloud Shell:
List projects to get project id
gcloud projects list
# save your project id
PROJECT_ID="YOURS_PROJECT_ID"
Get ACCESS_TOKEN
ACCESS_TOKEN=$(gcloud auth print-access-token)
Get API_KEY
API_KEY=$(curl -X POST https://apikeys.googleapis.com/v1/projects/$PROJECT_ID/apiKeys?access_token=$ACCESS_TOKEN | jq -r ".currentKey")
Print API_KEY and ACCESS_TOKEN
echo $ACCESS_TOKEN
echo $API_KEY
To run the above commands on a local machine, you first need to authenticate using the command gcloud auth login and follow the instructions.
Alternatively, an API key can be read or created from the console: go to Navigation Menu -> APIs & Services -> Credentials and then click CREATE CREDENTIALS -> API Key.
By reading the documentation (clicking on question mark next to Credentials) we can read:
[YOUR_API_KEY] - "Include an API Key to identify your project, used to verify enablement and track request quotas."
[YOUR_ACCESS_TOKEN] - "Include an access (bearer) token to identify the user which completed the OAuth flow with your Client ID."
You no longer need an API key. It's a legacy feature of Google APIs; providing only an access token is enough.
In command line you can do this
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" https://....
All the Google Cloud APIs are compliant with access token authentication. Only a few are still compliant with API keys.
About APIKeys API
This API was published in beta and is now closed (at least the documentation part). I don't know if this API is stable or subject to change. You can create an API key per API like this (very similar to Bartosz Pelikan's answer):
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
-X POST https://apikeys.googleapis.com/v1/projects/PROJECT_ID/apiKeys
As you can see, I reuse the access token authentication mode
The above answers are using an API that isn't publicly available (I reached out to GCP support and confirmed).
I recommend using the CLI tool like so:
gcloud app instances list --service core-api --project my-project-name
docs: https://cloud.google.com/sdk/gcloud/reference/app/instances/list
You'll have to run gcloud auth login first and probably set your project.
I am making an API call from the browser (React with Firebase auth) to the Google Vision API in order to get the content of a .pdf file which is stored in the Firestore database. The result should be stored as .json in Firestore. A service account was created, and it has system-wide access. The expected response looks like this:
{
  "name": "projects/usable-auth-library/operations/1efec2285bd442df"
}
The response I get is a 403, which indicates that something in the authentication process went wrong. In Firestore, no .json with the text content is created.
The function for the call looks like this:
const test = () => {
  fetch("https://vision.googleapis.com/v1/files:asyncBatchAnnotate", {
    method: "post",
    requests: [
      {
        inputConfig: {
          gcsSource: {
            uri: "gs://XXXX.appspot.com/images/XXXX.pdf"
          },
          mimeType: "application/pdf"
        },
        features: [
          {
            type: "DOCUMENT_TEXT_DETECTION"
          }
        ],
        outputConfig: {
          gcsDestination: {
            uri: "gs://XXXX.appspot.com/images/output"
          },
          batchSize: 1
        }
      }
    ]
  }).then(res => console.log(res))
}
Any idea what I am doing wrong? Or is there a React library which handles that process out of the box, or a more detailed step-by-step guide for making such client-side calls to the API? I had a look at the npm package @google-cloud/vision, but it does not seem to work on the client side yet.
You can find an example here on how to use the API with curl.
curl -X POST \
-H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
-H "Content-Type: application/json; charset=utf-8" \
https://vision.googleapis.com/v1/images:annotate -d @request.json
Therefore you should use your service account to sign a jwt token and receive an access token from Google Cloud. Then you should use your access token in the post request.
Your application prepares to make authorized API calls by using the service account's credentials to request an access token from the OAuth 2.0 auth server.
Finally, your application can use the access token to call Google APIs.
Using OAuth 2.0 for Server to Server Applications
Please follow Firebase Authenticate REST Requests for an example on how to generate an access token with node.js
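To make the flow above concrete, here is a sketch of the claim set a service account signs (RS256, with its private key) and then exchanges for an access token, per the server-to-server OAuth document referenced above. The email, scope, and timestamps are illustrative:

```javascript
// Sketch of the JWT claim set used in Google's service-account OAuth flow.
// The signed JWT is POSTed to the `aud` URL with
// grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer to get a token.
function buildJwtClaims(serviceAccountEmail, scope, nowSeconds) {
  return {
    iss: serviceAccountEmail,           // the service account's email
    scope: scope,                       // space-delimited list of scopes
    aud: "https://oauth2.googleapis.com/token",
    iat: nowSeconds,
    exp: nowSeconds + 3600              // at most one hour in the future
  };
}

const claims = buildJwtClaims(
  "my-sa@my-project.iam.gserviceaccount.com",
  "https://www.googleapis.com/auth/cloud-platform",
  Math.floor(Date.now() / 1000)
);
console.log(claims.aud);
```

In practice a library (such as the one in the Firebase example above) builds and signs this for you; the sketch only shows what is being exchanged.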
The service account only needs to have access to the GCS bucket you're interacting with. It seems like you aren't using your service account, though.
Typically, you wouldn't have the service account or api key in your frontend because then someone could steal/use your service account/api key and you have to pay for the charges. Better to call Vision API from your backend.
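For reference, here is a sketch of how the fetch call from the question would need to be shaped (ideally issued from your backend, per the advice above): the Vision payload must be JSON-encoded under a `requests` key inside `body`, not passed as a top-level fetch option, and the access token goes in an Authorization header. URIs and the token below are placeholders:

```javascript
// Sketch: building a correct fetch init for files:asyncBatchAnnotate.
// The question's code put `requests` at the top level of the fetch options,
// where fetch ignores it; it belongs inside a JSON-encoded `body`.
function buildVisionRequest(accessToken, sourceUri, destinationUri) {
  return {
    method: "POST",
    headers: {
      "Authorization": "Bearer " + accessToken,
      "Content-Type": "application/json; charset=utf-8"
    },
    body: JSON.stringify({
      requests: [{
        inputConfig: {
          gcsSource: { uri: sourceUri },
          mimeType: "application/pdf"
        },
        features: [{ type: "DOCUMENT_TEXT_DETECTION" }],
        outputConfig: {
          gcsDestination: { uri: destinationUri },
          batchSize: 1
        }
      }]
    })
  };
}

const init = buildVisionRequest(
  "ACCESS_TOKEN", "gs://bucket/in.pdf", "gs://bucket/out/"
);
// fetch("https://vision.googleapis.com/v1/files:asyncBatchAnnotate", init)
//   .then(res => res.json()).then(console.log);
```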
Let us assume that there is an app engine standard python app hosted at https://xyz.appspot.com and that its URLs are protected with:
login: admin
secure: always
How can I exercise the APIs using curl? I guess the real question is how can I authenticate to the app using curl. If the app is used from a browser, one is redirected to Google login but I am wondering how I can simulate the same from curl.
Any help is greatly appreciated.
Thanks,
Raghu
One way would be to do the authentication in browser first, and then copy the cookie from there to curl. For example in Chrome, you can open the devtools (F12) and select the Network tab.
When you access your secure resource it will appear there. Then you can right click -> Copy -> Copy as cURL (bash).
This will give you a cURL command that is authorized to call your secure resource.
Based on the suggestion from @Erfa, I visited the site in Chrome while keeping the dev tools open.
The browser takes you through login procedure and the site appears. At this point, right click on the GET request in "Network" tab and select "Save as HAR with Content" which saves the API information in a text file.
In the file, you will find a cookie that is being sent with the GET request. You can now use this same cookie with curl as follows:
$ curl --cookie "NAME=VALUE" <URL>
You can use a combination of Cloud Endpoints and API Key.
In this article https://cloud.google.com/endpoints/docs/frameworks/python/restricting-api-access-with-api-keys-frameworks from Google Cloud Platform you have an example of how to use curl authentication with this combination:
If an API or API method requires an API key, supply the key using a
query parameter named key, as shown in this cURL example:
curl \
-H "Content-Type: application/json" \
-X POST \
-d '{"message": "echo"}' \
"${HOST}/_ah/api/echo/v1/echo_api_key?key=${API_KEY}"
where HOST and API_KEY are variables containing your API host name and API key,
respectively. Replace echo with the name of your API, and v1 with the
version of your API.
I'm trying to interrogate a rest service. If I use CURL I can get the data back without issue.
curl -k -X GET -H "Accept: application/json" -u username:password "https://example.com/v1/apps"
However, I've tried several methods for pulling data back using Angular's $http and $resource services and nothing seems to work.
Does anyone have a good example of how to do this with Angular 1.4.6?
It's both a Cross-Domain and Basic Authentication call.
There are so many "like" examples out there, but nothing seems to work.
Hopefully this will save someone else some frustration. A few things were causing the issue:
1. The certificate on the target site was invalid. They were using a self-signed cert with example.com as the domain.
2. The call I was using had a capital 'GET' in it; it should be all lower-case.
To solve issue 1, I had to add the server's IP to my hosts file with an alias of example.com. It's a major hack, but it was necessary to get around the invalid domain issue. I'm working with the server's owners to have a correct cert put in place.
For the call itself, here's an example that works. I'm using another service for the auth header; it just base64s the username and password. Replace AuthorizationService.getCredentials() with your favorite base64(username:password) tool. There are several of them out there.
.factory(
  'ApplicationDetailService',
  [
    '$resource', 'AuthorizationService',
    function($resource, AuthorizationService) {
      return $resource('https://example.com/v1/apps/:id', { id: '@id' },
        {
          'get' : {
            headers: {
              Authorization: AuthorizationService.getCredentials()
            }
          }
        }
      );
    }
  ]
)
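As an illustration of what AuthorizationService.getCredentials() might return (a hypothetical implementation, not the exact one used above): an HTTP Basic auth header is just "Basic " followed by base64("username:password"):

```javascript
// Hypothetical sketch of the credentials helper used in the factory above.
// Basic auth header = "Basic " + base64("username:password").
function getCredentials(username, password) {
  var raw = username + ":" + password;
  // btoa in the browser; Buffer gives the same result in Node.
  var encoded = (typeof btoa === "function")
    ? btoa(raw)
    : Buffer.from(raw).toString("base64");
  return "Basic " + encoded;
}

console.log(getCredentials("username", "password"));
// "Basic dXNlcm5hbWU6cGFzc3dvcmQ="
```

Note that base64 is an encoding, not encryption, so this only makes sense over HTTPS.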
Finally, the target server wasn't passing back the Access-Control-Allow-Origin: * header in its response, so all modern browsers flat out reject the AJAX response.
I was under the assumption that it was an Angular thing; I was wrong. There's a plug-in for Chrome called Access-Control-Allow-Origin: *; I tried that, and it failed because CORS needs your domain in the header.
I found a flag for Chrome, --disable-web-security, that tells Chrome to ignore all security. FAIR WARNING: this makes your session very insecure; only do this in a test environment. Just modify your shortcut to pass the flag in as a parameter to chrome.exe.
With all of that, I was able to get past my issue and continue developing until the server team fixes things on their end.
Best of luck!