Getting "Data Not Available" error while importing BCFKS certificate into Salesforce

I am getting this error while importing the BCFKS file into the Salesforce system:
Data Not Available
The data you were trying to access could not be found. It may be due to another user deleting the data or a system error. If you know the data is not deleted but cannot access it, please look at our support page.

Related

Log Logic Apps error messages to Azure storage file

I have some steps in my Logic App, e.g. Parse JSON. If a step fails, I can see the reason for the failure when I open the step in Logic Apps, e.g. a string instead of an integer.
How can I log these error messages to my Azure storage account?
The dynamic content dialog box doesn't offer the error messages.
I have created a storage account, created files, populated them with a string and put them into the storage account. I just need to get hold of the error message.
I will be processing JSON from HTTP requests. If the JSON is invalid, i.e. it does not conform to the expected schema, I need the error logged so people can raise it with the provider of the data.
If you just want to log the run's error message, it doesn't have to be that involved; you can implement it with the outputs() expression.
Set the Create blob action to run after the Parse JSON action has failed. The blob content can then be outputs('Parse_JSON')['errors'], or, if you only want the message itself, outputs('Parse_JSON')['errors'][0]['message']. The sketch below shows the wiring.
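A minimal code-view sketch of that arrangement; the connection, folder and file names here are hypothetical, and only the runAfter status and the body expression matter:

    "Create_blob": {
        "type": "ApiConnection",
        "runAfter": {
            "Parse_JSON": ["Failed"]
        },
        "inputs": {
            "host": {
                "connection": {
                    "name": "@parameters('$connections')['azureblob']['connectionId']"
                }
            },
            "method": "post",
            "path": "/datasets/default/files",
            "queries": {
                "folderPath": "/errors",
                "name": "parse-error.txt"
            },
            "body": "@outputs('Parse_JSON')['errors'][0]['message']"
        }
    }

Because Create blob runs only when Parse_JSON reports Failed, the expression is evaluated exactly when the errors array is populated.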

Creating a Message-Hub Bridge for IBM Cloud Object Storage

I'm trying to create a "Bridge" from Message Hub to S3 Object Storage, copying the information from the credentials that I created, but I always get an error that says "Please try refreshing the page, or logging back into Bluemix."
I have already created an access policy for these credentials and the Bucket I want to use as destination.
Also tried with private and public end-points.
I wasn't able to find documentation that explains how to accomplish this. Nothing seems to work.
Thanks!
Apologies, this is an internal error caused by the S3 Object Storage bridge capability being made available in the UI but not in the backend.
An update to the Message Hub service will be made this week to correct this.

Azure Search - Error on detecting index schema from data source

I created a data source on Azure Search via the REST API. I used the API instead of the portal, as I have a rowversion data type that isn't handled yet in the portal. I am able to view the data source in the portal.
When I try to import the data source into an index, I get the following error:
"Error detecting index schema from data source: 'Data source payload should specify at least one of datasource name and type'"
What am I missing here?
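The error text points at the data source definition itself: the payload must carry both a name and a type. For reference, a minimal sketch of creating such a data source over the REST API, assuming an Azure SQL source; the service URL, key, API version and connection details are all placeholders:

    import requests

    # All of these values are placeholders for illustration.
    SERVICE = "https://<your-service>.search.windows.net"
    API_KEY = "<admin-api-key>"
    API_VERSION = "2017-11-11"

    # The data source definition must carry both "name" and "type";
    # "azuresql" is the type for an Azure SQL source.
    datasource = {
        "name": "my-sql-datasource",
        "type": "azuresql",
        "credentials": {"connectionString": "<connection-string>"},
        "container": {"name": "MyTable"},
    }

    resp = requests.post(
        f"{SERVICE}/datasources?api-version={API_VERSION}",
        headers={"api-key": API_KEY, "Content-Type": "application/json"},
        json=datasource,
    )
    resp.raise_for_status()
    print(resp.json())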

Realtime document permanently unable to be loaded due to server error

Earlier today we started to see instances of server errors popping up on an old realtime document. This is a persistent error, and the end result appears to be that the document is completely inaccessible using the gapi.drive.realtime.load endpoint. Not great.
However, the same document is accessible through the gapi.client.drive.realtime.get endpoint, which is great for data recovery, but not so great for actually using the document. It's possible I can 'fix' the document by doing a 'drive.realtime.update', but I haven't tried, as hopefully the doc can be used to track down the bug.
Document ID: 0B9I5WUIeAEJ1Y3NLQnpqQWVlX1U
App ID: 597847337936
500 Error Message: "Document was not successfully migrated to new UserKey format"
Anyone else seeing this issue? Can I provide any additional information?

Liferay document checkin issue

I'm still new to Liferay and am using Liferay 6.2.
What I'm doing:
I am trying to add a document manually into my database using insert statements.
I inserted into DLFileEntry, DLFileVersion and AssetEntry.
Also, I created a folder with a valid name and file.
The issue:
Upon entering the Documents and Media portlet, I can see the document name there, but when I click on checkout, it prompts an error saying that Documents and Media is temporarily unavailable. However, I am still able to download the valid document.
Am I doing something wrong? Personally, I feel that I am missing one more table in the database, but I'm not sure.
Thanks!
Yes, you're doing something wrong: You should never write to Liferay's database with SQL, as there might be more data required than what's directly visible to you. Obviously, you're running into exactly such an issue.
Liferay has an API which you can use locally, from within the same application server, or remotely as a JSON or SOAP service. You should use this exclusively for write access to the database.
Alternatively, you might consider WebDAV access to your document repository as the way to add more documents to the document library.
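To make the API route concrete, a minimal sketch using Liferay 6.2's remote JSON web services from Python; the host, credentials, IDs and the multipart field name are assumptions to adapt to your instance:

    import requests

    # Hypothetical host, credentials and IDs: a sketch of the remote JSON web
    # service route rather than direct SQL inserts.
    BASE = "http://localhost:8080/api/jsonws"
    AUTH = ("test@liferay.com", "test")  # a user with Documents and Media rights

    params = {
        "repositoryId": 10182,   # for the default repository this is the site's groupId
        "folderId": 0,           # 0 = the root folder of the document library
        "sourceFileName": "report.pdf",
        "mimeType": "application/pdf",
        "title": "Report",
        "description": "Added through the API",
        "changeLog": "",
    }

    with open("report.pdf", "rb") as f:
        resp = requests.post(
            f"{BASE}/dlapp/add-file-entry",
            auth=AUTH,
            data=params,
            # The multipart field name is assumed to match the method's byte[]
            # parameter; adjust to your instance if the call complains.
            files={"bytes": f},
        )
    resp.raise_for_status()
    print(resp.json())  # the new file entry

Going through dlapp/add-file-entry lets Liferay create the DLFileEntry, DLFileVersion and asset rows itself, which is exactly the bookkeeping the manual inserts were missing.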
