Search in all project files on remote host in PhpStorm

I have lots of files in a project on a remote host, and I want to find out which file another PHP file is called from. Is it possible to use Ctrl+Shift+F search on a remote-host project?

Is it possible to use Ctrl+Shift+F search on a remote host project?
Currently it's not possible. (Edit 2022-06-09: it is now possible with remote development via JetBrains Gateway; see the end of this answer.)
To search file content in a locally running IDE, the file must be read first. For that, the IDE must download it... which can be quite a time- and connection-consuming task over (S)FTP connections (depending on how far away the server is, how fast your connection is, bandwidth limits, etc.).
Even if the IDE could do this transparently for search, as it does with the Remote Edit functionality (where it downloads a remote file but, instead of placing it in the actual project, stores it in a temp location), it would still need to download the files.
If you execute one search (one term) and then need to do another (a slightly modified term, or a completely different search string), the IDE would need to re-download those files again (a waste of time and bandwidth).
Therefore it makes much more sense to download your project (all of it, or only the desired files) locally and then run such searches on the local files.
If it has to be a purely remote search (where nothing gets downloaded locally)... then just establish an SSH/RDP/etc. connection to that remote host (BTW: PhpStorm has built-in SSH Console functionality) and run the search directly on the remote server with native OS tools (find/grep and the like) or some remote software (e.g. mc or notepad++).
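For example, a minimal sketch of such a remote search over SSH (the login, host, and project path here are hypothetical):
ssh user@example.com "grep -rn --include='*.php' 'myfile.php' /var/www/project"
This prints every PHP file (with line numbers) under /var/www/project that references myfile.php, without downloading anything to your machine.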
P.S. (on a related note)
Some of the disadvantages when doing Remote Edit: https://stackoverflow.com/a/36850634/783119
EDIT 2022-06-09:
BTW, JetBrains now has JetBrains Gateway for remote development, where you run the IDE core on a remote server and connect to it via SSH using a dedicated local app or an IDE plugin (PhpStorm has come bundled with such a plugin since version 2021.3).
To learn more about JetBrains Gateway:
https://www.jetbrains.com/remote-development/gateway/
https://blog.jetbrains.com/blog/2021/11/29/introducing-remote-development-for-jetbrains-ides/

Related

How to load a file with a local path vs. a path in an Azure DevOps build

I am trying to load a file in my PostDeployment script in my DB project:
FROM OPENROWSET(BULK 'C:\Development\MyProject.Db\Scripts\My_Script.sql', SINGLE_CLOB)
Of course, my build pipeline in Azure DevOps complains that it cannot find this script. But when I change it to the following, my local build complains:
FROM OPENROWSET(BULK '$(Build.ArtifactStagingDirectory)\MyProject.Db\Scripts\My_Script.sql', SINGLE_CLOB)
What is the right way here? I would like to keep my local path locally and switch to the build one (hoping that one is correct; I didn't even get the chance to try it because the build complains) when the Azure Pipeline build runs. How can I achieve this?
Generally, post-deployment scripts run on the servers once the databases have been deployed successfully to them. So you can't directly use absolute (or relative) paths in the scripts to access local machines.
In Azure Pipelines, if we want to access remote services from local machines, there are many existing methods and tasks to create and use service connections to access the services.
However, for the reverse (accessing local machines from remote services), there is no existing, easy way. As far as I know, you may need to map the IP address and port of the local machine on the remote services, and you also may need to configure proxy settings and firewall settings on the local machines.
If the files you want to access in the post-deployment scripts have been deployed to the servers, you can directly use those file paths on the servers instead of the ones on the local machines.
[UPDATE]
Each pipeline is assigned a working directory ($(Pipeline.Workspace)) inside the agent's working directory ($(Agent.WorkFolder)).
If the files you want to access are the source files in the source repository, and the repository has been checked out to the working directory for the pipeline, you can access the source files in the directory $(Build.SourcesDirectory).
$(Build.SourcesDirectory) is the default working directory of the job, so you can access the source files using either relative paths or absolute paths.
For example:
Using a relative path:
MyProject.Db/Scripts/My_Script.sql
OR
./MyProject.Db/Scripts/My_Script.sql
Using an absolute path:
$(Build.SourcesDirectory)/MyProject.Db/Scripts/My_Script.sql
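Applied to the script from the question, that would give a sketch like the following (whether the variable actually gets substituted depends on how the script is processed at deployment time, since OPENROWSET itself needs a resolved, literal path):
FROM OPENROWSET(BULK '$(Build.SourcesDirectory)\MyProject.Db\Scripts\My_Script.sql', SINGLE_CLOB)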

Access files from one computer to another using remote extensions

I have some code on my own computer, but while I'm on the move I'd like to access the files on this computer from my laptop.
I thought about using the VS Code remote extensions, but I'm having trouble connecting to my PC.
I think I should expose my PC to SSH access, but I can't find any resources on that.
Is what I want even possible?
Thanks
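For reference, the usual setup looks roughly like this minimal sketch (it assumes a Linux desktop and the Remote - SSH extension; the user name and address are hypothetical):
# on the desktop: install and start an SSH server (Debian/Ubuntu shown)
sudo apt install openssh-server
sudo systemctl enable --now ssh
# on the laptop: add an entry to ~/.ssh/config
Host home-desktop
    HostName 192.168.1.50
    User myuser
With that in place, the "Remote-SSH: Connect to Host..." command in VS Code can open home-desktop. Note that from outside your home network you would additionally need port forwarding on your router or a VPN.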

Attempting to Distribute an Access Application with SQL Server Backend

I am attempting to use a Packaged Solution for my Access 2010 application, which has its backend linked to SQL Server. At the moment, I'm using the .accdb file as the frontend, and I would like to distribute my application to some other Windows computers, but the Packaged Solution does not work. I had the package include the Access Runtime, so their version of the frontend runs on the Runtime and not full Access. However, once the application makes a request to the backend, it does nothing; I am not even prompted for the SQL password as per usual with the full version. I've read that including a .dsn file in the package can secure the SQL connection (see here), but going through the steps of other tutorials to create .dsn files hasn't led to any results. Would anyone know how to correctly generate the .dsn file, or whether I've done something else wrong at this point?
(And yes, I understand that using Access 2010 in the year 2019 is almost a joke at this point, but I'm doing this for testing purposes. I plan to completely remake the frontend in Angular in the future.)
One other unrelated note: would it be a better idea to have the frontend hosted as a .html file, such as through the "Publish to Access Services" process? I did read that Access Services was discontinued last year, so would that not be possible?
Edit: This is not a duplicate of "DSN Less Connection (MS Access to SQL2016)" because A) I want to utilize a DSN Connection, not DSN-less and B) I am not using connection strings in my code to hook up with SQL.
You should be able to just create a FILE DSN, link your tables, and then distribute the compiled accDE to each desktop.
However, which SQL ODBC provider did you use? If you used the SQL Server ODBC provider, then that is installed on each computer by default.
However, if you linked using Native 11 (or later), then that driver is NOT installed on each workstation by default. So, I HIGHLY recommend you create a FILE DSN (not a user or system DSN) and link the tables using that. (Access will create DSN-less links for you.)
And you should NOT be seeing a logon prompt with your application. This suggests you forgot or missed the save password option.
So, I would re-link your tables, creating a new FILE DSN. And if you're using the Linked Table Manager, make sure you choose the prompt for a new location to force creation of a NEW DSN. If you just refresh, then you DO NOT get a chance to click on the save-password option during the linking process.
So, which ODBC driver are you using? Native 11 or later are better, but they are not installed by default on each workstation. However, CAUTION is required here, since the older SQL driver does NOT support the newer datetime2 formats. If you used these newer SQL column types, they will be returned as string data types in Access and create a mess of issues.
So, first, I would re-link using a FILE dsn.
Make sure you check the save password during the re-link.
You then compile your accDB into an accDE and distribute that. You don't really need to use the package wizard: once each workstation has the runtime installed, a simple copy of the accDE to each person's computer will work fine. There is NO special connection between your accDE and the package wizard. Once the runtime is installed, any and all mdb, accDB, and accDE files can simply be clicked on to launch + run. So for testing, you can skip the package wizard, just copy the accDE to the target machine, click on it, and see if it works.
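For reference, a FILE DSN is just a small text file with an [ODBC] section; a minimal sketch (the server and database names here are hypothetical):
[ODBC]
DRIVER=SQL Server
SERVER=MyServer
DATABASE=MyDatabase
UID=MyUser
The password normally does not go in the file; you enter it once while linking, and checking the save-password option stores it in the link's connect string.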
Edit
The prompt during the linking process includes a check box to save the password (screenshot omitted). You have to check that box to save the password. Note that you ONLY get this dialog WHEN you create a new FILE DSN.

Informatica Cloud - Picking up files from SFTP and inserting records in Salesforce

Our objective is as follows:
a) Pick up a file "Test.csv" from a Secure FTP location.
b) After picking up the file we need to insert the contents of the file into an object in Salesforce.
Step 1: I created a connection for the remote SFTP location, the one that will contain "Test.csv" (screenshot omitted).
Step 2: Then I started to build a Data Synchronization Task (screenshot omitted).
What we want is for Informatica Cloud to connect to the secure FTP location and extract the contents of a .csv file at that location into our object in Salesforce.
But as you can see in Step 2, it does not allow me to choose a .csv from that remote location.
Instead, the wizard prompts me to choose a file from a local directory (which is my machine ...where the secure agent is running), and this is not what I want.
What should I do in this scenario ?
Can someone help ?
You can write a UNIX script to transfer the file to your secure agent and then use Informatica to read the file. Although I have never tried using SFTP in the cloud, I have used the cloud, and I do know that all files are tied to the location of the secure agent (either a server or a local computer).
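A minimal sketch of such a script, using a non-interactive sftp batch (the host, user, and paths here are hypothetical, and key-based authentication is assumed):
#!/bin/sh
# pull Test.csv from the SFTP server into a directory the secure agent can read
echo "get /outbound/Test.csv /opt/infaagent/files/Test.csv" | sftp -b - user@sftp.example.com
You could schedule this (e.g. via cron) to run just before the Data Synchronization Task picks the file up.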
The local directory is used for template files. The idea is that you set up the task using a local template and then IC will connect to the FTP site when you actually run the task.
The Informatica video below shows how this works at around 1:10:
http://videos.informaticacloud.com/2FQjj/secure-ftp-and-salesforececom-using-informatica-cloud/
Can you clarify the secure agent OS: Windows or Linux?
For a Windows environment, you will have to call the script using the WinSCP or Cygwin utility; I recommend the former.
For Linux, the basic commands in the script should work.
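For the Windows case, a minimal WinSCP sketch (the host, credentials, and paths are hypothetical; -hostkey=* blindly trusts the server and should be replaced with the real host key outside of testing):
winscp.com /command "open sftp://user:password@sftp.example.com/ -hostkey=*" "get /outbound/Test.csv C:\InfaAgent\Files\" "exit"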

Running batch file remotely using Hudson

What is the simplest way to schedule a batch file to run on a remote machine using Hudson (latest and greatest version)? I was exploring the master-slave setup. I created a dumb slave, but I am not sure what the parameters should be so that I can trigger the batch file on the remote slave machine.
Basically, I am trying to run two different batch files on two different remote machines sequentially, triggered from my machine (the master). The step-by-step guide on the Hudson website is a dead link. There are similar questions posted on SO, but it does not quite work for me when I use the parameters they mention.
If anyone has done something similar please suggest ways to make this work.
(I know how to set up jobs and add a step to run a batch file, etc.; what I am having trouble with is configuring this on a remote machine using Hudson's built-in features.)
UPDATE
Thank you all for the suggestions. Quick update on this:
What I wanted to get done is partially working; below are the steps I followed to get there:
Created a new node from Manage Nodes -> New Node -> set # of Executors to 1, set Remote FS root to '/var/hudson', set Launch method to JNLP, set the slave name, and saved.
Once the slave was set up (from the master machine), I logged into the slave's physical machine, downloaded _slave.jar from http://masterserver:port/jnlpJars/slave.jar, and ran the following from the command line at the download location: java -jar _slave.jar -jnlpUrl http://masterserver:port/computer/slavename/slave-agent.jnlp. The connection was made successfully.
Checked 'Restrict where this project can be run' in the master job configuration, and set the parameter to the slave name.
Used "Add Build Step" to add my batch job script.
What I am still missing is a way to connect to two slaves from one job in sequence. Is that possible?
It is fairly easy and straightforward. Let's assume you already have a slave running. Then you configure the job as if you were working locally on the target box. The setting for 'Restrict where this project can be run' needs to be the node that you want to run on. That is all for the job configuration.
For the slave configuration read the following pages.
Installing Hudson as a Windows service
Distributed builds
On Windows, I prefer to run the slave as a service and let the remote machine manage the startup and shutdown of the slave. The only disadvantage with this is that you need to upgrade the client every time you update the server. Just get the new client.jar from the server after the upgrade and put it on the slave. Then restart the slave and you are done.
I had trouble using the install-as-a-service option for the slave even though I did it as a local administrator. I then used srvany to wrap the jar into a service. Here is a blog about it. The command that you need to wrap you will get from your Hudson server, on the slave page. For all of this to work, you should set up the slave management as JNLP.
If you have an SSH server on your target machine, you can use the SSH slave settings. These work like a charm for me; I use them with my Unix slaves. So far the SSH option with Unix is less of a hassle than the Windows service clients.
I had some similar trouble with slave setup and wrote up this blog post - I was running on Linux rather than Windows, but hopefully this will help.
I don't know how to use built-in Hudson features for this job, but in one of my project builds I run a batch file that in turn uses PsTools to run the job on a remote server. I found PsTools extremely easy to use (download, unpack, and run the command with the right parameters), hence I opted to use it.
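For reference, a minimal sketch of such a PsExec call (the host, account, and script path are hypothetical):
psexec \\remoteserver -u DOMAIN\builduser -p secret cmd /c "C:\build\run_build.bat"
PsExec returns the remote process's exit code, so the Hudson build step can still fail when the batch file fails.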
