I am trying to use the SearchkitManager inside react-admin. I provided the parameters according to the docs, but when I run the code it throws errors. Here is how the setup works:
React Admin is running on http://localhost:3000
Golang backend is running on http://localhost:3006
ElasticSearch is running on http://localhost:9200
When data is inserted into the MySQL database by the Golang code, it is also inserted into Elasticsearch. Later, in one of my display components, I call the SearchkitManager as follows:
let apiUrl = 'http://localhost:9200/donate'; // what link should I pass, the URL to Elasticsearch or the URL to my backend?
const searchkit = new SearchkitManager('/', {
  searchUrlPath: `${apiUrl}/_search`,
});
This code throws a 404 Not Found or 400 Bad Request error, but the API works in Postman.
If I change the above link to:
let apiUrl = 'http://localhost:9200/donate'; // what link should I pass, the URL to Elasticsearch or the URL to my backend?
const searchkit = new SearchkitManager('/', {
  searchUrlPath: `${apiUrl}/_doc/`,
});
I am not getting anything at all: sometimes there is no error in the console, and sometimes I get a 400 Bad Request or 405 Post Not Allowed.
One last thing: should the link I am providing as searchUrlPath look like that or not? Or should I pass the apiUrl in place of '/'? I tried that as well, but just want to make sure.
Any kind of help will be really appreciated.
Try doing this:
const elasticSearchUrl = 'http://localhost:9200/<your_index>';
const searchkit = new SearchkitManager(elasticSearchUrl);
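To explain why, as I understand the legacy SearchkitManager API: the first argument is the base URL of your index, and searchUrlPath (which defaults to _search) is resolved relative to it. Passing '/' as the host with an absolute URL in searchUrlPath builds a malformed request URL, and pointing searchUrlPath at _doc/ sends search POSTs to a document endpoint, hence the 400/405 responses. A minimal sketch, assuming the index from the question is named donate:

const searchkit = new SearchkitManager('http://localhost:9200/donate');
// Queries are POSTed to http://localhost:9200/donate/_search

Note this has the browser talk to Elasticsearch directly; if you don't want to expose port 9200, proxy the _search endpoint through your Golang backend and pass that URL instead.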
I am using forcejs in my Angular app, which is working fine and gives me an accessToken. However, I am not able to get a refreshToken to be able to renew the accessToken as needed. The code is below:
import { OAuth, DataService } from 'forcejs';

async loginSFDC() {
  let callbackUrl = 'https://my.callback.url';
  let oauth = OAuth.createInstance('client key', '', callbackUrl);
  oauth.login().then(async (oauthResult) => {
    DataService.createInstance(oauthResult);
    console.log("Logged Into Salesforce Successfully:::" + JSON.stringify(oauthResult));
  });
}
The above code prints the accessToken but no refreshToken. Please advise.
I have also tried passing the 2nd parameter in createInstance as http://login.salesforce.com?scope=full+refresh_token, but that does not work because the URL gets constructed incorrectly when scope=full+refresh_token is added.
From looking at the source code of forcejs, you can use the refreshAccessToken() method on the DataService instance you created.
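A minimal sketch of what that could look like; refreshAccessToken() and its promise-based signature are assumptions based on that reading of the source, not documented API:

let service = DataService.getInstance();
// Assumed: refreshAccessToken() returns a promise that resolves once a new access token is stored
service.refreshAccessToken()
  .then(result => console.log('Access token refreshed:', JSON.stringify(result)))
  .catch(err => console.error('Refresh failed:', err));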
After some more debugging, I discovered that the refresh token shows up when my code is running on localhost but not when it is deployed to the webserver. I don't know how to debug further or fix it, but I have verified that this behavior is consistently reproducible.
The front end enables people to upload their photos, so initially I was sending the base64 to the server and working with it there, but there are problems with a firewall that blocks any request containing base64. As an alternative solution, I am uploading the image to an Azure blob, getting the file name, and then sending that to the server for processing, where I generate a SAS token for the blob validation and processing.
This works perfectly fine when I work locally; the front end connects with @azure/storage-blob and uploadBrowserData(), passing the arrayBuffer as the param:
import { AnonymousCredential, BlobServiceClient } from '@azure/storage-blob';

// accountName, sasString and containerName are configured elsewhere
const anonymousCredential = new AnonymousCredential();

export const uploadSelfieToBlob = async arrayBuffer => {
  try {
    const blobURL = `https://${accountName}.blob.core.windows.net${sasString}`;
    const blobServiceClient = new BlobServiceClient(blobURL, anonymousCredential);
    const containerClient = blobServiceClient.getContainerClient(containerName);
    // Random file name so concurrent uploads don't collide
    let randomString = Math.random().toString(36).substring(7);
    const blobName = `${randomString}_${new Date().getTime()}.jpg`;
    const blockBlobClient = containerClient.getBlockBlobClient(blobName);
    const uploadBlobResponse = await blockBlobClient.uploadBrowserData(arrayBuffer);
    return { blobName, blobId: uploadBlobResponse.requestId };
  } catch (error) {
    console.log('error when uploading to blob', error);
    throw new Error('Error Uploading the selfie to blob');
  }
};
When I deploy, this does not work. The front end is deployed in the EastUS2 location, and my local development location is different.
I thought the sasString generated for anonymous access had a timezone component, so I generated two different ones, one for local and one for the hosted server, with the same location selected.
Failed to send request to https://xxxx.blob.core.windows.net/contanainer-name/26pcie_1582087489288.jpg?sv=2019-02-02&ss=b&srt=c&sp=rwdlac&se=2023-09-11T07:57:29Z&st=2020-02-18T00:57:29Z&spr=https&sig=9IWhXo5i%2B951%2F8%2BTDqIY5MRXbumQasOnY4%2Bju%2BqF3gw%3D
What am I missing? Any lead would be helpful, thanks.
First, as mentioned in the comments, there was an issue with the CORS settings, which is why you were getting the initial error:
AuthorizationResourceTypeMismatch
This request is not authorized to perform this operation using this resource type.
RequestId:7ec96c83-101e-0001-4ef1-e63864000000
Time:2020-02-19T06:57:31.2867563Z
I looked up this error code here and then closely looked at your SAS URL.
One thing I noticed in your SAS URL is that you have set the signed resource type (srt) to c (container) while trying to upload a blob. If you look at the description of the kinds of operations you can do using srt=c here, you will notice that blob-related operations are not supported.
In order to perform blob related operations (like blob upload), you would need to set signed resource type value to o (for object).
Please regenerate your SAS token and include signed resource type object (you can include container and/or service as well); then your request should work. Essentially, the srt in your SAS URL should be something like srt=o, srt=co or srt=sco.
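For illustration, a rough server-side sketch of generating such an account SAS with @azure/storage-blob v12; accountName and accountKey are placeholders, and you should trim the permissions to what you actually need:

const {
  StorageSharedKeyCredential,
  generateAccountSASQueryParameters,
  AccountSASPermissions,
  AccountSASServices,
  AccountSASResourceTypes,
} = require('@azure/storage-blob');

// Placeholders: your storage account name and key (keep the key server-side only)
const credential = new StorageSharedKeyCredential(accountName, accountKey);

const sasString = '?' + generateAccountSASQueryParameters(
  {
    services: AccountSASServices.parse('b').toString(),            // blob service (ss=b)
    resourceTypes: AccountSASResourceTypes.parse('co').toString(), // container + object (srt=co)
    permissions: AccountSASPermissions.parse('rwdlac'),
    protocol: 'https',
    startsOn: new Date(),
    expiresOn: new Date(Date.now() + 60 * 60 * 1000),              // valid for one hour
  },
  credential
).toString();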
I couldn't see anything wrong with the code you mentioned above, but I have been using a different method to upload files to Azure Blob Storage using React; the method is exactly the same as in this blog article, which works perfectly for me:
https://medium.com/@stuarttottle/upload-to-azure-blob-storage-with-react-34f37805fdfc
I have uploaded the model.json file of my TensorFlow graph to a private repository on an AWS S3 bucket, and am now trying to load the graph with loadGraphModel (along with the binary weight manifest files, group1-shard1of1). Here's my code, which I run with node (I've kept the bucket path and signature keys private):
const TFJSConverter = require('@tensorflow/tfjs-converter');
global.fetch = require('node-fetch');

const MODEL_URL = "https://[BucketName].s3.amazonaws.com/[PathToModel]/model.json?[credentials]&[securitykey]";
TFJSConverter.loadGraphModel(MODEL_URL);
However, the loadGraphModel function looks for a model URL ending with '.json'. If it doesn't find one, it takes the full model URL and also requests a weight manifest file called weights_manifest.json, with no signature. A failed request then follows:
UnhandledPromiseRejectionWarning: Error: Request to https://[BucketName].s3.amazonaws.com/[PathToModel]/model.json?[credentials]&[securitykey],https://[BucketName].s3.amazonaws.com/[PathToModel]/weights_manifest.json failed with status code 403. Please verify this URL points to the model JSON of the model to load.
I've checked that the signed URL actually works. Is there a solution for signed URLs?
Installed versions:
@tensorflow/tfjs-converter@1.1.2
node v10.15.3
Many thanks!
The correct library to use to load the model is tfjs, not tfjs-converter:
let tf = require('@tensorflow/tfjs');
tf.loadGraphModel(MODEL_URL);
A 403 error is an authorization error response. Try setting the credentials on the request using the requestInit property of the options object passed as the second parameter of loadGraphModel.
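For example, a sketch of that option; the header name and value are placeholders for whatever auth your bucket expects:

tf.loadGraphModel(MODEL_URL, {
  requestInit: {
    // requestInit is merged into each fetch call made while loading the model
    headers: { Authorization: '<your-auth-header>' },
  },
});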
This worked for me:
const fetch = require('node-fetch')
global.fetch = fetch
but you can also try:
const fetch = require('node-fetch')
tf.loadGraphModel(MODEL_URL, { fetchFunc: fetch });
as described in the documentation:
https://js.tensorflow.org/api/latest/#loadGraphModel
I'm currently using Solr 4.3.1. I have configured the DataImportHandler (DIH) for my Solr, and I would like to do a full import through the command prompt. I know the URL will be something like this: http://localhost:8983/solr/corename/dataimport?command=full-import&clean=true&commit=true. Is there any method to do this without using curl?
Thanks
Edit
string Text = "http://localhost:8983/solr/Latest_TextBox/dataimport?command=full-import&clean=true&commit=true";
var wc = new WebClient();
var Import = wc.DownloadString(Text);
I am currently using the above code.
Call it like a normal REST URL, that's it! I am using it in my application for importing and indexing data from my local drive and it works just fine. Make an HTTP request (HttpURLConnection in Java, for example) and capture the response to see whether the import was successful or not; you don't need any specific API to do that. Below is sample code that makes a GET request correctly in C#. Try your data import handler URL with it; it may work!
Console.WriteLine("Making API Call...");
using (var client = new HttpClient(new HttpClientHandler { AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate }))
{
client.BaseAddress = new Uri("https://api.stackexchange.com/2.2/");
HttpResponseMessage response = client.GetAsync("answers?order=desc&sort=activity&site=stackoverflow").Result;
response.EnsureSuccessStatusCode();
string result = response.Content.ReadAsStringAsync().Result;
Console.WriteLine("Result: " + result);
}
Console.ReadLine();
}
}
}
You'll have to call the URL in some way - Solr only operates through a REST API. There is no command line API (the command line tools available just talk to the API). So use your preferred way to talk to an HTTP endpoint, be it curl, wget, GET or whatever is available for your programming language of choice.
The bundled Solr CLI application does not have any existing command for triggering a full-import, as far as I was able to see (such a command would just talk to the REST API by calling the URL you've already referenced).
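For example, a minimal sketch in Node, using the core name and handler path from the question (adjust both to your setup):

const http = require('http');

const url = 'http://localhost:8983/solr/corename/dataimport?command=full-import&clean=true&commit=true';
http.get(url, res => {
  let body = '';
  res.on('data', chunk => { body += chunk; });
  // Solr answers with a status document describing the import request
  res.on('end', () => console.log(res.statusCode, body));
}).on('error', err => console.error('Import request failed:', err));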
In the dev environment for my React app, I have a set of public/private keys that I need to access an API. I'd ideally like to put these keys into their own file for gitignore purposes, but I'm not having luck with my code, as shown below.
My helpers.jsx file is where the API data is called via a lightweight AJAX add-on, and I require the keys in the declarations area:
var API_KEY = require('./keys.jsx');
var PRIV_KEY = require('./keys.jsx');
In summary, my keys.jsx file (stored in the same subfolder as helpers.jsx) consists of the following:
module.exports = {
API_KEY:'myactualpublickey',
PRIV_KEY:'myactualprivatekey'
};
However, my app does not like this setup: I get a "Failed to load resource: the server responded with a status of 401 (Unauthorized)" error message, and the API call isn't successful because the necessary keys are not included.
When I replace the require('./keys.jsx'); in the helpers.jsx file with the actual keys, the API call works fine.
Any help or guidance would be most appreciated. Thanks.
You're exporting an object with properties called API_KEY and PRIV_KEY, and require('./keys.jsx') returns that whole object, so your variables hold the module object rather than the key strings. Try this:
var API_KEY = require('./keys.jsx').API_KEY;
var PRIV_KEY = require('./keys.jsx').PRIV_KEY;
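Equivalently, you could require the module once and destructure it; purely a style choice:

var { API_KEY, PRIV_KEY } = require('./keys.jsx');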