Error while executing query for custom object Work Order - salesforce

I am executing a query against my custom object created in SFDC, but I am getting the following error:
{'[{"message":"\nSELECT FS_Account_Name__c from FS_Work_Order__c\nERROR at Row:1:Column:34\nsObject type 'FS_Work_Order__c' is not supported. If you are attempting to use a custom object, be sure to append the '__c' after the entity name. Please reference your WSDL or the describe call for the appropriate names.","errorCode":"INVALID_TYPE"}]'}
Though I have written the correct table name as given when I created the custom object. Please help.

First thing to try: does it work properly when run as the System Administrator profile? If so, then it's certainly a permissions issue. Things to check:
The object is deployed (Setup > Create > Objects > Edit > Deployment Status)
The profile has permission to query the object.
If not, does that same query work from inside the Developer Console? If so, I can't think of what it might be, except connecting to production instead of a sandbox or vice versa.
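For reference, here is a minimal sketch of running the same query through the simple_salesforce Python client (the credentials are placeholders; the object and field names come from the question). If the query succeeds here but not in your original tool, that points back at the connection or profile rather than the object name:

```python
from simple_salesforce import Salesforce

# Placeholder credentials; pass domain="test" if you are connecting
# to a sandbox rather than production.
sf = Salesforce(
    username="user@example.com",
    password="password",
    security_token="token",
)

# Custom objects and fields keep the __c suffix, exactly as the error text says.
result = sf.query("SELECT FS_Account_Name__c FROM FS_Work_Order__c")
print(result["totalSize"])
```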

Related

How do you resolve an "Access Denied" error when invoking `image_uris.retrieve()` in AWS Sagemaker JumpStart?

I am working in a SageMaker environment that is locked down. For example, my user account is prevented from creating S3 buckets. But, I can successfully run vanilla ML training jobs by passing in role=get_execution_role to an instance of the Estimator class when using an out-of-the-box algorithm such as XGBoost.
Now, I'm trying to use an algorithm (LightGBM) that is only available via the JumpStart feature in SageMaker, but I can't get it to work. When I try to retrieve an image URI via image_uris.retrieve(), it returns the following error:
ClientError: An error occurred (AccessDenied) when calling the GetObject operation: Access Denied.
This makes some sense to me if my user permissions are being used when creating an object. But what I want to do is specify another role - like the one returned from get_execution_role - to perform these tasks.
Is that possible? Is there another work-around available? How can I see which role is being used?
Thanks,
When I encountered this issue, it was a permissions issue with a bucket that had changed.
In the SageMaker Python SDK source code, there is a cache located in an AWS-owned bucket, jumpstart-cache-prod-{region}, along with a manifest.json that resolves the ECR path for the image for you.
If you look at the stack trace, it could be erroring out at the code that is looking for the manifest.
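It also helps to confirm which identity AWS actually sees for the failing call. A quick check with boto3 (assuming boto3 is available in your SageMaker environment):

```python
import boto3

# Prints the account, ARN, and user ID of the credentials currently in
# effect; the ARN tells you whether your own user or an assumed role is
# making the S3 calls.
print(boto3.client("sts").get_caller_identity())
```

If the ARN is your locked-down user rather than the execution role, the GetObject call is running with your permissions, which would match the error above.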
One place to look is whether new restrictions have been placed in IAM; at a minimum, you need a policy that grants read access to the JumpStart (pretrained) model artifacts.
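As an illustration only, a minimal sketch of such a policy, based on the cache bucket named above (written as a Python dict; treat the ARN patterns as assumptions to verify against current AWS documentation):

```python
import json

# Sketch of a minimal read policy for the JumpStart cache bucket
# jumpstart-cache-prod-{region}; the exact ARN patterns are assumptions.
jumpstart_read_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::jumpstart-cache-prod-*",
                "arn:aws:s3:::jumpstart-cache-prod-*/*",
            ],
        }
    ],
}

print(json.dumps(jumpstart_read_policy, indent=2))
```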

Upload file to s3 browser through batch script

I am trying to upload a JSON file to S3 via S3 Browser from a batch script, using the command:
s3browser-con.exe upload <account_name> <local directory\json file> <s3 bucket name and path>
(I referred to the CLI documentation.) However, I get this error:
:AccountManager::CurrentAccount::get::failed - unable to show the Add New Account dialog.
The batch script runs fine when I run it on its own; however, when I run it through a command task in Informatica Cloud, it gives me this error.
I suspect it is trying to create a new account at runtime, but we can only add two accounts since it is the free version. I'm not sure, though, as I am new to S3 and batch scripts.
Also, is there any way to avoid giving the account name, since users might have different account names for a particular bucket? Any help and guidance would be appreciated.
EDIT:
Note: this is the detailed error:
Unhandled Exception: System.NullReferenceException: Object reference not set to an instance of an object. at mg.b(String aty) at mk.a(String[] avx) at mg.Main(String[] args)
<account_name> is held in the user profile of whoever set it up in s3browser-con. So if you are not running Informatica (the Secure Agent) on the same machine under the same user, it's not going to work.
However, why are you using a third-party tool to upload files to S3 from within Informatica? Why not use Informatica's built-in capabilities? Unless there is a very specific reason for doing this, your solution appears to be overcomplicated.
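If a scripted upload is still needed outside Informatica, going through the official AWS SDK avoids the per-user account configuration that s3browser-con.exe depends on. A minimal sketch with boto3 (file, bucket, and key names are placeholders):

```python
import boto3

# Credentials come from the standard AWS credential chain (environment
# variables, ~/.aws/credentials, or an instance profile), so no per-user
# S3 Browser account is involved.
s3 = boto3.client("s3")

# Placeholder names; replace with your actual file, bucket, and key.
s3.upload_file("payload.json", "my-bucket", "incoming/payload.json")
```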

ds.addToCatalog() for Angular-Wakanda

Is there anything like ds.addToCatalog() in Angular-Wakanda? The problem is that once the DataStore is loaded (e.g. init("Employee")), it is not possible to add further DataClasses if needed.
I'm trying to load only the DataClasses needed for each Angular controller.
The reason is different access rights for different Angular controllers: if a user with limited access rights logs in, the .init() method throws the following error, because the user does not have access to all DataClasses:
GET http://127.0.0.1:8081/rest/$catalog/$all/ 401 (Unauthorized)
Loading only the DataClasses with access rights works fine:
http://127.0.0.1:8081/rest/$catalog/Page,%20Employee,%20News/
There is actually a bug in the Angular-Wakanda connector that is causing this issue.
When $wakanda.init() is called the first time (whatever parameter is given to the function), the returned dataStore is cached connector-side and returned directly on subsequent calls to $wakanda.init().
The Wakanda team is aware of this issue and it will be fixed soon.
There is no workaround at this time, other than calling $wakanda.init() without a parameter to retrieve the whole catalog. But that won't work properly with the access rights you have set on your dataClasses.

Liferay document checkin issue

I'm still new to Liferay and am using Liferay 6.2.
What I'm doing:
I am trying to add a document manually into my database using INSERT statements.
I inserted into DLFileEntry, DLFileVersion and AssetEntry.
Also, I created a folder with the valid name and file.
The issue:
Upon entering the Documents and Media portlet, I can see the document name there, but when I click on checkout, it prompts an error saying that Documents and Media is temporarily unavailable. However, I am still able to download the valid document.
Am I doing something wrong? Personally, I feel that I am missing one more table in the database, but I'm not sure.
Thanks!
Yes, you're doing something wrong: You should never write to Liferay's database with SQL, as there might be more data required than what's directly visible to you. Obviously, you're running into exactly such an issue.
Liferay has an API which you can use locally, from within the same application server, or remotely as a JSON or SOAP service. You should use this exclusively for write access to the database.
Alternatively, you might consider WebDAV access to your document repository as the way to add more documents to the document library.
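To make the remote route concrete, here is a rough sketch of adding a document through the JSON web services in Liferay 6.2, which expose DLAppService at /api/jsonws/dlapp/add-file-entry. The URL, credentials, and IDs below are placeholder assumptions; check /api/jsonws on your portal for the exact signature:

```python
import requests

portal = "http://localhost:8080"      # placeholder portal URL
auth = ("test@liferay.com", "test")   # an account allowed to add documents

# Placeholder IDs; look up your repository (group) and folder IDs in the portal.
with open("report.pdf", "rb") as f:
    resp = requests.post(
        f"{portal}/api/jsonws/dlapp/add-file-entry",
        auth=auth,
        data={
            "repositoryId": 10182,  # placeholder group/repository id
            "folderId": 0,          # 0 = root folder of the repository
            "sourceFileName": "report.pdf",
            "mimeType": "application/pdf",
            "title": "report.pdf",
            "description": "",
            "changeLog": "",
        },
        files={"file": f},
    )

print(resp.status_code, resp.text)
```

Because this goes through the API, all the related rows (file version, asset entry, and anything else Liferay needs) are created consistently, which is exactly what the manual INSERTs were missing.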

Parser: The syntax for the ImpersonationInfo object is incorrect, If the ImpersonateAccount value is used for ImpersonationInfo

I am trying to deploy a cube for the first time on my PC. I have run SSMS as an admin and made myself (Gary-pc\gary, using Windows authentication) an admin. Doing this got me past the error message "user does not have permission to create a new object in 'GARY-PC', or the object does not exist".
When I deploy the cube now, I get the error: the syntax for the ImpersonationInfo object is incorrect. If the ImpersonateAccount value is used for ImpersonationInfo, then the Account property cannot be empty.
I have not (knowingly! LOL) set up anything related to impersonation.
I've resolved the problem with this:
Double click on data source
Select impersonation
Choose use credentials of user
As #user1335419 says.
I tried changing the impersonation to "Credentials of User" and although I was successful in deploying the cube, I could not process it. I ended up getting an error that said:
"The datasource contains an ImpersonationMode that is not supported
for processing operations"
So I changed the impersonation from "Credentials of User" to "Inherit" and was able to process. I don't know if the first deployment would have worked with "Inherit", but thought I would share my experience.
