Error executing API while fetching folder list - azure-logic-apps

I am creating a logic app to retrieve attachments from email and copy them to an SFTP folder. While setting up the connection to the SFTP folder, when I try to select a folder I get the error shown in the screenshot:
[screenshot: logic app error]
Another user in my company can connect to the SFTP folder using FileZilla; however, I get an error even with FileZilla. We have confirmed that IP whitelisting is not enabled and that no ports are blocked anywhere.
Any idea how to troubleshoot this issue? Thanks in advance
harry

When you created the API connection for the SFTP server, did you try it with 'Disable SSH Host Key Validation' turned on? If not, try it. You can update the API connection from the Azure portal.
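If you want to find the connection to edit, one quick way (a sketch assuming the Azure CLI is installed and you are signed in; the resource group name is a placeholder) is to list the API connection resources in the logic app's resource group:

```
# List API connection resources (the SFTP connection will be among them)
az resource list \
  --resource-group MyLogicAppRG \
  --resource-type Microsoft.Web/connections \
  --output table
```

Open the SFTP connection from that list in the portal (Edit API connection), tick 'Disable SSH Host Key Validation', save (you may need to re-enter the credentials), and then retry the folder picker in the designer.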

Related

Error Establishing Database Connection with WordPress in MAMP

I have a few local WordPress sites already running in my MAMP folder, but when I try to create a new one and go through the install process, it throws "Error Establishing Database Connection".
I know there is no wp-config.php file set up yet, but I thought this first step was going to create one?
I have tried both WordPress 5 and 4 to see whether the version isolates the problem.
I had the same problem but solved it by renaming wp-config-sample.php to wp-config.php, then changing the file contents to the database name I set up in MAMP and the username and password to root. I was then able to open the project.
For reference, I used this video: https://www.youtube.com/watch?v=iIxbv4n5jgQ
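For anyone who wants to see what that edit looks like, this is roughly the relevant block of wp-config.php on a default MAMP install (the database name is whatever you created in phpMyAdmin, and the port is MAMP's default, which may differ on your setup):

```php
<?php
// MySQL settings for a local MAMP install
define( 'DB_NAME', 'my_local_site' );   // database created in phpMyAdmin (placeholder name)
define( 'DB_USER', 'root' );            // MAMP's default MySQL user
define( 'DB_PASSWORD', 'root' );        // MAMP's default MySQL password
define( 'DB_HOST', 'localhost:8889' );  // MAMP serves MySQL on 8889 by default; drop the port if you use 3306
```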

BULK INSERT error when the file location changed to remote share

I am getting the following error on BULK INSERT after the file location was changed to a remote share. Before, it was a shared folder on a local drive and we never ran into this issue. I am running this BULK INSERT from my local PC, connecting to SQL Server via SSMS.
I have made sure that both the SQL Server and file permissions are in place.
Previously, when I ran this command from SSMS, the location was \\SQLServer\FTP, a shared folder on a local drive of that SQL Server. Now that I have changed the file location to a network share, \\Fileshare\FTP, I get the above error, even though both the SQL Server service account (a domain account) and I (a domain account) have elevated permissions on the new location.
Any help or suggestions would be appreciated!
Thanks,
I can identify three circumstances that might generate this issue:
The SQLAuthority blog has full detail on a related backup issue where there is a cross-domain link (in this case, from a workgroup to a full domain).
There are also two other possible answers in the question "Cannot bulk load because the file could not be opened. Operating system error code 1326 (Logon failure: unknown user name or bad password.)" here on Stack Overflow. We can discount the first one (login permissions) because you stated that you had permissions, but the other solution ("I fixed it by adding the SQL Server port number to the connection string in SSIS, forcing SSIS to access SQL Server through TCP/IP instead of Named Pipes.") could apply. Try forcing a connection to the server through TCP/IP, as in the sketch below.
All of these issues appear to be related to attempted cross-domain communication. If that is what is happening in your case, one or more of these fixes should apply.
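A quick way to test the TCP/IP suggestion from SSMS is to reconnect with an explicit tcp: prefix and port in the server name, then re-run the load (the server, table and file names below are placeholders):

```sql
-- In the SSMS connection dialog (or a connection string), force TCP/IP:
--   Server name: tcp:SQLServer.yourdomain.com,1433
-- Then re-run the load against the UNC path:
BULK INSERT dbo.MyStagingTable
FROM '\\Fileshare\FTP\myfile.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);
```

Keep in mind that BULK INSERT opens the file on the server side, as either the SQL Server service account or (with Kerberos delegation) your own account, which is why the cross-domain and delegation issues matter here.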
-
It finally worked....
I had to configure Kerberos authentication following the guide at this link: https://thesqldude.com/2011/12/30/how-to-sql-server-bulk-insert-with-constrained-delegation-access-is-denied/.
Of course, I had to make adjustments to suit our environment and had to involve an Active Directory admin to create the SPNs and enable the delegation properties.
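For anyone following the same guide, the SPN piece looked roughly like this (a sketch with hypothetical host and account names; it needs Domain Admin rights, and the constrained-delegation settings themselves are made on the service account's Delegation tab in Active Directory Users and Computers):

```
:: Register SPNs for the SQL Server service account (-S checks for duplicates first)
setspn -S MSSQLSvc/SQLServer.yourdomain.com DOMAIN\sqlsvc
setspn -S MSSQLSvc/SQLServer.yourdomain.com:1433 DOMAIN\sqlsvc

:: Verify what is registered
setspn -L DOMAIN\sqlsvc
```

Delegation is then constrained from that account to the cifs service of the file server that hosts the share, which is the step the guide walks through.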
Thanks.

Apache SOLR index on remote server

I want to be able to run a Solr instance on my local computer but have the index directory on a remote server. Is this possible?
I've been looking for a solution for days. Please help.
Update: We have a business/legal requirement that we are not allowed to store client data on our servers; we can only read, insert, update and delete it on client request via our website, and the data has to be stored on the client's servers. So each client will have their own index, and we cannot run Solr or any other web application on a client's server. Some of the clients have a Dropbox Business account, so we thought that simply having the Solr index files uploaded to Dropbox might work.
Enable remote streaming in solrconfig.xml and configure the remote file location in it.
It's working.
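A minimal sketch of that change (the upload limits are illustrative, and in newer Solr releases remote streaming may additionally have to be allowed at startup, so check the documentation for your version):

```xml
<!-- solrconfig.xml: let update handlers pull content from a remote location -->
<requestDispatcher>
  <requestParsers enableRemoteStreaming="true"
                  multipartUploadLimitInKB="2048000"
                  formdataUploadLimitInKB="2048"/>
</requestDispatcher>
```

With that enabled you can point an update request at a remote file, e.g. `stream.url=https://remote.example.com/docs.xml` on the /update handler; Solr then fetches the source documents from the remote location at indexing time.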

Creating a virtual directory in IIS for a DNN website which uses a new database

I want to create a virtual directory for my DNN website that uses a new database. To do this, I followed these steps:
Go to IIS Manager, right-click the site and click "Add Virtual Directory".
Then I copied all the files from the DNN site into the virtual directory folder and browsed it; it runs fine.
After that I created a new database and restored the backup of the old database (the one used by the DNN website) onto it, and changed the connection string in the virtual directory's web.config. It showed an "Invalid login" error, so I created a new login in SQL Server and gave it db_owner on the new database.
But when I browse the site, I now get "Error 310 (net::ERR_TOO_MANY_REDIRECTS): There were too many redirects."
So I want to know: are these steps right? If so, how can I resolve this problem?
If anyone has an idea, please guide me.
Thanks
You will need to update the PortalAlias entries in the restored database in order to get the site working. The portal alias should match the URL of the virtual directory.
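A sketch of what that looks like against the restored database (the table and column names are from a standard DNN install; the alias value and PortalID are placeholders for your site and virtual directory):

```sql
-- See which aliases the restored portal currently has
SELECT PortalAliasID, PortalID, HTTPAlias FROM dbo.PortalAlias;

-- Point the alias at the virtual directory URL (DNN stores it without the http:// prefix)
UPDATE dbo.PortalAlias
SET HTTPAlias = 'localhost/MyVirtualDir'
WHERE PortalID = 0;
```

You may need to recycle the application pool in IIS afterwards so DNN picks up the new alias instead of redirecting to the old one.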

TFS InvokeMethod Delete Access to the path is denied

I am trying to call InvokeMethod to delete a file in the drop folder, but I am receiving an "Access to the path is denied" message. I am also copying directories and deleting a directory before using InvokeMethod, and both the copy and the delete work with no permission issue.
I am guessing InvokeMethod uses a different user account. Any help is very much appreciated.
By default the build agent runs with the Network Service build service account. You may want to change it to a domain account such as TFSBuild so you can assign write permissions on the paths where you need your build artifacts.
Then make sure TFSBuild is running the build service and is a member of the Project Collection Build Service Accounts group, and grant it rights on the drop share as sketched below.
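Granting those rights might look something like this (the share path and account are placeholders; run it with admin rights on the file server, and remember the share-level permissions need to allow Change as well as the NTFS ACL):

```
:: Give the build account modify rights on the drop folder, inherited by subfolders and files
icacls "\\Fileserver\Drops" /grant "DOMAIN\TFSBuild:(OI)(CI)M"
```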
You will also get an access denied error when the file is in use.
