SFDC Bulk API option in informatica session not working - salesforce

When I load data from an Oracle database to Salesforce in Informatica with the SFDC Bulk API option checked, no data is inserted into Salesforce. Workflow Monitor shows the records as successful, but when I check in Salesforce nothing has been inserted. How do I bulk load to Salesforce?

There are most likely errors in your data; that is why you do not see any records in your Salesforce target.
When the SFDC Bulk API option is used, rejected data is not written to the session reject file by default. To find out whether there are errors in your data, apply the steps below in order.
In the target session properties, enable the following options:
Use SFDC Error File
Monitor Bulk jobs Until all Batches Processed
Set the Location of the BULK Error Files (provide a directory path)
After making these changes, run the workflow again. Any rows that fail will be written to the error file, along with the error message, in the location you provided in step 3.

Related

How to display image in Sharepoint List from SQL Server

I have a problem displaying an image in a column of type "Hyperlink or Picture" in a SharePoint list; the image comes from a column in a SQL Server table with the "Image" data type. To give more context, I will explain the process in detail.
I have an application developed in PowerApps that is connected to SQL Server. The application stores information in a table, and among that information is the image mentioned above, which is saved in the SQL Server table.
As I mentioned, the image is saved in an "Image" column and the value looks like "0xFFD8FFE000104A46494600010101004800480000FFE…"; the value is very long, so I do not include it completely.
In the same application I have a screen that displays the image, and I can see it without any problem after saving the record.
In addition to this application, I have a scheduled flow built with Power Automate that runs at a certain time of day. The flow gets the rows from my SQL Server table and then creates an item in a SharePoint list; I do this to export the information from SQL Server to the SharePoint list.
Among the information that I export, the "Image" column does not appear, as shown in the flow below.
After the flow finishes, I check the SharePoint list and the image is not stored.
For this reason I am asking for help to correctly display the image from SQL Server in the SharePoint list.
Is there something wrong that I'm doing in my flow in Power Automate?
Why is the image not displayed in the SharePoint list?
Is there another way to store the image so that it is displayed correctly in the SharePoint list?
Or do you think I have to change the data type with which I am storing the image in the SQL Server table?
Conditions to take into account:
It is not possible to connect the PowerApps application to the SharePoint list directly, since the SQL Server table will be used by other external applications and the information is required to be in a SQL Server table.
Update 1:
I am currently getting an error when running the flow.
The error occurs on a particular record, one of the records that has an image in SQL Server which should be stored in the SharePoint list. The flow throws the following error:
OpenApiOperationParameterTypeConversionFailed. The 'inputs.parameters' of workflow operation 'Create_item' of type 'OpenApiConnection' is not valid. Error details: Input parameter 'item/Image' is required to be of type 'String/uri'.
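One possible way around that type error, sketched below with made-up object names (dbo.Photos, ImageData, vw_PhotosForFlow), is to expose the binary column to the flow as a base64 string built in SQL Server, for example through a view, and map that string to the SharePoint column; image/jpeg is assumed here because the 0xFFD8FF prefix shown above is the JPEG signature. Note that a "Hyperlink or Picture" column ultimately expects a URL, so an http(s) link (for example, a file uploaded to a SharePoint library) may still be needed for the picture to render.
-- Hypothetical view: converts the binary Image column to a base64 data-URI string
CREATE VIEW dbo.vw_PhotosForFlow AS
SELECT src.Id,
       'data:image/jpeg;base64,'
       + CAST('' AS XML).value(
           'xs:base64Binary(sql:column("src.ImageBytes"))', 'VARCHAR(MAX)') AS ImageUri
FROM (SELECT Id, CAST(ImageData AS VARBINARY(MAX)) AS ImageBytes  -- Image type cast to varbinary first
      FROM dbo.Photos) AS src;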

Matillion to Snowflake connection is showing error 'Default Database must not be empty'

I am trying to connect Snowflake to Matillion ETL using key-pair authentication.
I'm facing the error 'Default Database must not be empty'.
We have also provided a default database on the Snowflake side.
In this environment, we cannot see the dropdown for the default database. We have even tried passing the default values manually, but then it shows the error 'invalid JWT token'; our key pair is correct, since we have tested it in another environment where it works.
When we try to establish connectivity from another of our environments, it succeeds, and we can see a list of options to select from in the default database dropdown (which is how it should ideally be).
The Default Database, Warehouse and Schema are all set in the 3rd dialog in Matillion's Environment setup window.
According to this Matillion guide to Connecting Matillion ETL to Snowflake, these dropdowns can show no selectable values if there are any problems in the previous (2nd) dialog.
If it is an account level problem (account not correct, or no network access) then the default database dropdown list will show a "Loading..." message for several seconds before rendering an empty list. I guess it tries to make network contact in the background, and eventually times out. You will see this if you go backwards and forwards between the 2nd and 3rd dialogs.
In contrast, if it is a user level problem (bad credentials, or not enough privileges) then the default database dropdown will be blank immediately when you enter the 3rd dialog.
According to this Matillion document on environments, you can use either a password or a private key to authenticate to Snowflake. So, as per the comments, I agree that if you can connect to Snowflake with the same account, username, and private key from a different SQL client, it should also work in Matillion ETL.
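One more thing worth checking for the 'invalid JWT token' error is whether the public half of the key pair is actually registered on the Snowflake user that Matillion logs in as, since a missing or mismatched RSA_PUBLIC_KEY is a common cause of JWT rejection. A minimal sketch, with MATILLION_SVC as a placeholder user name:
-- Show the registered public-key fingerprint (RSA_PUBLIC_KEY_FP)
DESC USER MATILLION_SVC;
-- Register (or re-register) the public key: paste only the key body,
-- without the BEGIN/END PUBLIC KEY header and footer lines
ALTER USER MATILLION_SVC SET RSA_PUBLIC_KEY = 'MIIBIjANBgkqh...';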

Snowflake task failure notification

I have Snowflake tasks that run every 30 minutes. Currently, when a task fails due to an underlying data issue in the stored procedure that the task calls, there is no way to notify users of the failure.
SELECT *
FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY());
How can notifications be set up for a Snowflake task failure? The design I have in mind is to build a Python application that runs every 30 minutes and looks for errors in the TASK_HISTORY output. Please advise if there are better approaches to handling failure notifications.
I think a Python script is currently the best way to address this.
You can use this SQL to query the most recent runs, read the results into a data frame, and filter for errors:
select *
from table(information_schema.task_history(scheduled_time_range_start=>dateadd(minutes, -30,current_timestamp())))
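To filter for errors in the query itself rather than in the data frame, the same function call can be restricted to failed runs; a minimal sketch (the FAILED filter is an addition to the query above):
select name, state, error_code, error_message, scheduled_time
from table(information_schema.task_history(
       scheduled_time_range_start => dateadd(minutes, -30, current_timestamp())))
where state = 'FAILED'
order by scheduled_time desc;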
It is possible to create a notification integration and send a message when an error occurs. As of May 2022 this feature is in preview, supported for accounts on Amazon Web Services.
Enabling Error Notifications for Tasks
This topic provides instructions for configuring error notification support for tasks using cloud messaging. This feature triggers a notification describing the errors encountered when a task executes SQL code.
Currently, error notifications rely on cloud messaging provided by the Amazon Simple Notification Service (SNS); support for Google Cloud Pub/Sub queues and Microsoft Azure Event Grid is planned.
New Tasks
Create a new task using CREATE TASK. For descriptions of all available task parameters, see the SQL command topic:
CREATE TASK <name>
[...]
ERROR_INTEGRATION = <integration_name>
AS <sql>
Existing tasks:
ALTER TASK <name> SET ERROR_INTEGRATION = <integration_name>;
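The <integration_name> referenced above is a notification integration. A rough sketch of creating one for the AWS SNS setup described in the linked documentation, with the topic and role ARNs as placeholders (the SNS topic and IAM role must already exist on the AWS side):
-- Hypothetical notification integration for task error notifications via AWS SNS
CREATE NOTIFICATION INTEGRATION task_error_int
  ENABLED = TRUE
  TYPE = QUEUE
  NOTIFICATION_PROVIDER = AWS_SNS
  DIRECTION = OUTBOUND
  AWS_SNS_TOPIC_ARN = 'arn:aws:sns:us-east-1:123456789012:snowflake-task-errors'
  AWS_SNS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-task-errors-role';
-- Then attach it to the task, e.g.
-- ALTER TASK my_task SET ERROR_INTEGRATION = task_error_int;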
A new Snowflake feature was announced for Task Error Notifications on AWS via SNS. This doc walks through how to set this up for task failures.
https://docs.snowflake.com/en/user-guide/tasks-errors.html

How to configure Snowflake to send logs to Azure Log Analytics

Expert,
How can we configure Azure/Snowflake so that all Snowflake logs can be accessed from Azure Log Analytics, and use Kusto queries and alerting to create alerts?
Rana
It depends what data you want to unload from Snowflake to log files, as there is a lot of information available in ACCOUNT_USAGE and the information schema. But it's easy enough to write that data out to files on Azure storage, for ingestion and use in Azure Log Analytics. Here's an example, pushing errors recorded in the LOGIN_HISTORY view to JSON files:
copy into @~/json_error_log.json from
(select object_construct(*) from (
  select event_timestamp, event_type, user_name, reported_client_type, error_code, error_message
  from table(information_schema.login_history(dateadd('days', -7, current_timestamp()), current_timestamp()))
  where error_code is not null
  order by event_timestamp))
file_format = (type = 'JSON');
And you can find more information here:
https://docs.snowflake.com/en/user-guide/data-unload-azure.html
I can't comment on the Azure Log Analytics side of things, but hopefully this gives you some idea of what to do on the Snowflake side.
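To land those files directly in Azure storage rather than the user stage used above, the COPY can target an external Azure stage instead; a minimal sketch, with the storage account, container name and SAS token as placeholders:
-- Hypothetical external stage over an Azure blob container
CREATE STAGE azure_log_stage
  URL = 'azure://mystorageaccount.blob.core.windows.net/snowflake-logs'
  CREDENTIALS = (AZURE_SAS_TOKEN = '?sv=...')
  FILE_FORMAT = (TYPE = 'JSON');
-- Same unload as above, pointed at the external stage instead of @~
copy into @azure_log_stage/login_errors/ from
(select object_construct(*) from (
  select event_timestamp, event_type, user_name, reported_client_type, error_code, error_message
  from table(information_schema.login_history(dateadd('days', -7, current_timestamp()), current_timestamp()))
  where error_code is not null
  order by event_timestamp));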

SSRS Error Invalid Source credential setting

I have created a report in SSRS and it works in design view. When I publish it to the report server, I get the following error:
The current action cannot be completed. The user data source credentials do not meet the requirements to run this report or shared dataset. Either the user data source credentials are not stored in the report server database, or the user data source is configured not to require credentials but the unattended execution account is not specified. (rsInvalidDataSourceCredentialSetting)
I am using a shared data source with a SQL Server database connection as the authentication type. Where else could I have gone wrong? Also, please let me know which settings I might have missed updating.
