All client machines are connected to the server via OpenVPN. Each client machine also has a custom Winlogon shell configured so that it runs only myapp.exe.
So the desktop and Explorer are not visible unless Task Manager is opened with Ctrl+Shift+Esc.
On one of the client machines myapp.exe has stopped, and the machine needs to be restarted. From the server I opened an RDP session using the OpenVPN IP, but unfortunately Ctrl+Shift+Esc does not start Task Manager there.
Is there any way to restart this client machine from the server machine? No other tool is available on the server to restart this machine; they are connected only via OpenVPN.
Regards
If PowerShell is available, you can reboot the target machine remotely with the Restart-Computer cmdlet.
PS C:\> Restart-Computer <hostname or IP> -WhatIf
You can also restart multiple computers with a single command:
PS C:\> Restart-Computer "hostname1", "hostname2" -WhatIf
If someone is logged on to the target computer, you can use the -Force parameter to force the reboot.
The -WhatIf parameter only verifies what the command would do; remove it to actually restart the machine.
Please have a look at this link: http://technet.microsoft.com/en-us/library/hh849837.aspx
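Putting it together for this case, a sketch of a forced reboot over the VPN (10.8.0.6 is only a placeholder for the client's OpenVPN IP, and the command assumes the remote machine accepts the WMI/WinRM traffic it relies on):
PS C:\> Restart-Computer -ComputerName 10.8.0.6 -Force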
A less sophisticated yet time-proven solution is the free, handy utility Wizmo by GRC (the same guy who made ShieldsUP).
Once downloaded and deployed, you can trigger a reboot by calling:
wizmo reboot
Note: it is wizmo reboot, not wizmo restart. Because it's so easy, you can create a shortcut, or add it to PATH and call it from the terminal. Too easy!
I want to create a batch script that outputs all of my current RDP connections. I am connected to multiple machines from my desktop, but lose connection to them every so often. Eventually, I'd like to schedule this task to run regularly and notify me so that I can go in and reconnect. For reference, I am using a non-admin account.
Update:
With the command posted below (found in another Stack Overflow post), I am able to get a list of all connections on port 3389. In a limited-use environment, these should only be RDP connections.
netstat -n -a | findstr 3389 >"C:\Users\Public\log.txt"
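A slightly richer sketch in PowerShell, assuming the default RDP port 3389 and that Get-NetTCPConnection is available (Windows 8 / Server 2012 and later), keeps only established outbound connections:
Get-NetTCPConnection -RemotePort 3389 -State Established |
    Select-Object LocalAddress, RemoteAddress, State |
    Out-File "C:\Users\Public\log.txt"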
Edit 1: Reworded question.
Edit 2: Found temporary solution.
Original question:
The goal is to schedule a task that checks a PC every XX minutes to see if it is connected to some VMs via Remote Desktop Connection. If it isn't connected to the right VM, then attempt to reconnect. I understand that I will need to schedule a batch file to run every XX minutes, then have the batch file check the connections (the hard part). I have looked around and I cannot find a clear answer as to whether this is even possible.
While not a batch solution, PowerShell is available on most Windows machines.

Using PowerShell
================
import-module remotedesktopservices
Get-RDUserSession
Results
CollectionName DomainName Username HostServer UnifiedSessionId
============== ========== ======== ========== ================
Session Coll.. LocalHost joe d103joe 14
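To capture that list from a scheduled task, a sketch reusing the module from above (Out-File simply writes the results to a log file; the path is only an example):
import-module remotedesktopservices
Get-RDUserSession | Out-File "C:\Users\Public\rd_sessions.txt"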
So I'm trying to transfer files to a remote computer over SSH. I've used the sftp command, used lls to confirm the presence of the file on the local computer, and then ran the put filename command. However, I receive the same result each time:
stat filename: No such file or directory
I just don't know what's going wrong! Any help or troubleshooting tips would be appreciated.
If you're currently using Windows you can download WinSCP and use that to transfer files. It has a nice graphical interface that is easy to interact with.
Well, supposing that you are in a Linux/Unix environment, you could use scp. Typically, the syntax of an scp command looks like this:
$ scp foobar.txt your_username@remotehost.net:/some/remote/directory
The above command copies the file foobar.txt, which resides on the local computer, to a specific directory on the remote machine, as the given username (you will be asked for the password).
The sftp command line client uses the ssh transport and will tunnel your connections using your key. So if you have ssh access, you should also have sftp access. This is a secure option for people who are more comfortable with ftp. Most GUI ftp clients should also support sftp.
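As a sketch of a typical session (the hostname and filename are placeholders), run from the local machine:
$ sftp your_username@remotehost.net
sftp> lls
sftp> put foobar.txt
sftp> bye
If put reports "stat filename: No such file or directory", the file is not present under that exact name in the local directory that lls shows.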
I was also facing this issue when trying to upload files from my local machine to the remote server. My commands were fine, but the mistake I was making was this: I had logged into the remote server with ssh and then started sftp from there. That way, sftp treats the remote server as the local side (since I had logged into it first via ssh) when using the command below:
put /c/path/to/file.txt
So, the thing to do is to log in to the server directly via sftp and put your local files there.
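In other words (a sketch, reusing the hostname from the scp example above), start sftp from your own machine and then upload:
$ sftp your_username@remotehost.net
sftp> put /c/path/to/file.txt /some/remote/directory/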
I'm trying to execute a batch file (shutdown.bat and startup.bat of Tomcat 7) on a remote machine (Windows Server 2008) using PsTools, but I haven't had any luck so far.
Below are the steps I used:
c:\>psexec \\129.12.3.1 -u Admin -p admin90 C:\>Hyp\tom7_50080\bin\shutdown.bat
and in my cmd I got:
PsExec v2.0 - Execute processes remotely
Copyright (C) 2001-2013 Mark Russinovich
Sysinternals - www.sysinternals.com
PsExec could not start cmd on 129.12.3.1:
There are currently no logon servers available to service the logon request.
Can anyone help with the above output, and with running the shutdown and startup batch files on the remote machine?
Is PsTools the only option to execute a service/batch file on a remote machine, or could we use some other utility provided by Microsoft?
In your example, as @David Candy pointed out, even if the connection had gone through it would not have worked, because you have 'c:>hyp\' instead of C:\hyp\tom7_* in the path.
You seem to be using an IP address, but the message you got looks name-resolution related, so I'm not sure what's happening there. Maybe you should upgrade to the latest PsExec version.
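As a sketch (assuming the credentials are valid and the path exists on the target), fixing the path and wrapping the batch file in cmd /c would look like:
psexec \\129.12.3.1 -u Admin -p admin90 cmd /c "C:\Hyp\tom7_50080\bin\shutdown.bat"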
If you want to use PowerShell, you would use Invoke-Command -ComputerName {NameOfPC} -ScriptBlock {C:\Hyp\tom7_50080\bin\shutdown.bat}
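A slightly fuller sketch, assuming PowerShell Remoting (WinRM) is enabled on the target, that also passes explicit credentials (which matters given the logon error above):
Invoke-Command -ComputerName 129.12.3.1 -Credential (Get-Credential) -ScriptBlock { & "C:\Hyp\tom7_50080\bin\shutdown.bat" }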
If you want to execute a program on another server, you can use a stored procedure on that server to invoke the command, and call that stored procedure from the local machine.
You could also create a web service on the remote server that invokes the command you want to execute.
In either case, be very careful that you don't open a security hole, either by allowing more users to execute commands through the mechanism you implement, or by letting some user execute commands other than the one you intend.
I want to copy some files from a network shared drive (mounted on my local machine as drive Z). I have written a batch file to copy the contents of the Z drive to my local drive. This batch file runs successfully from cmd, but I am having an issue when I trigger it through Jenkins. Jenkins gives the following error:
"The system cannot find the specified drive"
Any help regarding this, will be greatly appreciated.
Thanks,
Nouman.
If you don't want to use Jenkins plugins or scheduled tasks, here is a "groovy" way:
By Hand:
You can use the Groovy script console provided under Jenkins > Manage Jenkins > Script Console and execute the command to map the network drive within the Jenkins service. (This must be repeated whenever the Jenkins service is restarted.)
Automation:
Write your Groovy commands to a file named "init.groovy" and place it in your JENKINS_HOME directory, so the network drive gets mapped on Jenkins startup.
Groovy Commands - Windows:
Check available network drives using the Script-Console:
println "net use".execute().getText()
Your init.groovy would look like this:
def mapdrive = "net use z: \\\\YOUR_REMOTE_MACHINE\\SHARED_FOLDERNAME"
mapdrive.execute()
Yes, Jenkins uses different login credentials. To map a drive through Jenkins, use the command below in a Jenkins command prompt:
Subst U: \drive\folder
then run your commands after that.
You might be running into permission issues. Jenkins might be executing with different user credentials, so it does not know the configured drive for the Windows share. Instead of using shell scripts, I suggest using a plugin. There is a set of Publish Over plugins that allow deployments to remote systems via a couple of protocols (SSH, CIFS, etc.). Have a look at the Publish Over CIFS plugin, which allows sending artifacts to a Windows share. Once the plugin is configured (i.e. the host is specified in the Manage Jenkins section), you can add to the post-build steps "Send files to a windows share", where you can specify which file(s) shall be sent to which location.
I had this issue where my Jenkins job was unable to read files present on the network drive.
I resolved it by adding a "net use" command as a pre-build step, i.e.:
Open your job.
Go to Pre Steps
From the drop down, select Execute Windows Batch Command
Enter the following command:
net use E: \\[server name]\[Folder name] "[password]" /user:"[userid]"
Click Save
Execute the job
I was able to read files from my network drive by following the steps mentioned above.
It seemed to be a one-time activity: after the initial run I removed the batch command from my job, and it seemed to remember the mapped drive.
Try adding debugging commands to that bat file, or as a separate build step, such as net use, set (pay attention to variables like HOMEPATH and USERNAME) and a plain dir Z:\.
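For example, a throwaway "Execute Windows batch command" build step along these lines (Z: is assumed to be the drive letter of the share, as in the question):
rem show current drive mappings, the environment the job runs with, and the share contents
net use
set
dir Z:\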
As said in another answer, the most likely reason is that Jenkins runs as the SYSTEM user, which has different permissions. One way around that: go to Services (for example, open Task Manager, go to its Services tab and click the Services button at the lower right corner of that tab), find the Jenkins service, open its properties, go to the "Log on" tab and set your normal user account as the one that runs Jenkins.
Basically, you can access your network shared drive (Z) by server name or IP from a Jenkins command. Write \\192.168.x.xxx\Your_Folder instead of z:\Your_Folder.
For example:
mkdir \\192.168.x.xxx\Your_Folder
I was trying to copy files from one remote computer to another; the easy solution which worked for me is COPY iphone.exe \\192.xx.xx.xx\dev (dev is the shared folder name on the C drive at that IP address).
A similar issue showed up for us on Jenkins slaves set up on Windows Server 2008 following this documentation. The Jenkins agent failed to access the mounted network drives even after configuring the agent service with the correct user credentials.
Troubleshooting:
Jenkins could access the mounted network drives by their drive letters when connected via the JNLP agent (Launch agent via Java Web Start).
It stops recognizing the drive letters soon after we install the agent as a Windows service. Configuring the correct user credentials and restarting the agent does not help.
We could still access the drives via the command line while logged in to the machine with the above user.
Stop the agent service from services.msc and then uninstall it by running the command jenkins-slave.exe uninstall. The slave is disconnected at this point.
Reconnect the slave by launching the JNLP agent via Java Web Start. The agent can now access the network drives again.
Synopsis:
Do not install the slave agent as a Windows service if you want to keep accessing your mounted network drives by drive letter. But this is highly unreliable, as the agent might fail to restart after a machine reboot. Alternatively, see if Jenkins can access the drives via a UNC path such as \\<ip_address>\<share>.
In order to access your remote drive, just use these commands in the cmd prompt:
pushd "\\sharedDrive\Folder1\DestinationFolder"
mkdir FolderName
popd
pushd >> navigates to the shared drive by creating a virtual drive.
popd >> gets you back to the local directory.
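For the copy case from the question, a sketch (both paths are placeholders):
pushd "\\sharedDrive\Folder1"
copy somefile.txt C:\local\destination\
popd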
What is the simplest way to schedule a batch file to run on a remote machine using Hudson (latest and greatest version)? I was exploring the master/slave setup. I created a dumb slave, but I am not sure what the parameters should be so that I can trigger the batch file on the remote slave machine.
Basically, I am trying to run 2 different batch files on two different remote machines sequentially, triggered from my machine (the master). The step-by-step guide on the Hudson website is a dead link. There are similar questions posted on SO, but it does not quite work for me when I use the parameters they mention.
If anyone has done something similar please suggest ways to make this work.
(I know how to set up jobs and add a step to run a batch file, etc. What I am having trouble with is configuring this to run on a remote machine using Hudson's built-in features.)
UPDATE
Thank you all for the suggestions. Quick update on this:
What I wanted to get done is partially working; below are the steps I followed to get there:
Created new Node from Manage Nodes -> New Node -> set # of Executors as 1, Remote FS root set as '/var/hudson', set Launch method as using JNLP, set slavename and saved.
Once the slave was set up (from the master machine), I logged into the slave's physical machine, downloaded the _slave.jar from http://masterserver:port/jnlpJars/slave.jar, and ran the following from the command line at the download location -> java -jar _slave.jar -jnlpUrl http://masterserver:port/computer/slavename/slave-agent.jnlp. The connection was made successfully.
Checked 'Restrict where this project can be run' in the master job configuration, and set the parameter to slavename.
Used "Add Build Step" to add my batch job script.
What I am still missing now is a way to connect to 2 slaves from one job in sequence; is that possible?
It is fairly easy and straightforward. Let's assume you already have a slave running. Then you configure the job as if you were working locally on the target box. The setting for "Restrict where this project can be run" needs to be the node that you want the job to run on. That is all for the job configuration.
For the slave configuration read the following pages.
Installing Hudson as a Windows service
Distributed builds
On Windows I prefer to run the slave as a service and let the remote machine manage the startup and shutdown of the slave. The only disadvantage with this is that you need to upgrade the client every time you update the server. Just get the new client.jar from the server after the upgrade and put it on the slave. Then restart the slave and you are done.
I had trouble using the install-as-a-service option for the slave, even though I did it as a local administrator. I then used srvany to wrap the jar into a service. Here is a blog about it. The command that you need to wrap you will get from your Hudson server, on the slave page. For all of this to work, you should set up the slave launch method as JNLP.
If you have an SSH server on your target machine, you can use the SSH slave settings. These work for me like a charm. I use them with my Unix slaves. So far the SSH option with Unix is less of a hassle than the Windows service clients.
I had some similar trouble with slave setup and wrote up this blog post - I was running on Linux rather than Windows, but hopefully this will help.
I don't know how to use built-in Hudson features for this job, but in one of my project builds I run a batch file that in turn uses PsTools to run the job on a remote server. I found PsTools extremely easy to use: download, unpack and run the command with the right parameters, hence I opted to use this.
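The call in that batch file is along these lines (a sketch; the server name, credentials and path are placeholders, not real values):
psexec \\remoteserver -u DOMAIN\builduser -p password cmd /c "D:\jobs\run_job.bat"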