Is there any way to know whether a Protractor script is currently running? - batch-file

I need to execute a Protractor script automatically from the Windows Task Scheduler at particular time intervals. I can do that by putting the command (protractor conf.js) in a batch file, but I first need to check whether the Protractor script is currently running: only if it is not already running should the batch file execute it. How can I do this? Can anyone help me?
Thanks in advance.

I can think of three options:
Run %windir%\system32\taskschd.msc /s, drill down to your task entry, right-click it and select Properties from the context menu. On the Settings tab, select Do not start a new instance in the drop-down selection box under If the task is already running, then the following rule applies:.
Add the <MultipleInstancesPolicy>IgnoreNew</MultipleInstancesPolicy> element to the <Settings> group in your task XML file and import that.
Set the MultipleInstancesPolicy setting via whatever script/code you're using to create the task. See ITaskSettings::put_MultipleInstances.
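If you would rather do the check inside the batch file itself, here is a minimal sketch. It assumes Protractor runs under node.exe and that no other node process mentions "protractor" on its command line; adjust the filter to your setup:

    @echo off
    rem Look for an existing node.exe process whose command line mentions protractor.
    rem wmic prints "No Instance(s) Available." to stderr when nothing matches.
    wmic process where "name='node.exe'" get commandline 2>nul | findstr /i "protractor" >nul
    if %errorlevel%==0 (
        echo Protractor is already running - skipping this run.
        exit /b 0
    )
    protractor conf.js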

Related

SSIS, avoid failure if source file isn't available

I have an SSIS job that is scheduled to run every 5 minutes via SQL Agent. The job imports the contents of an Excel file into a SQL table. That all works great, but the files get placed there sporadically, and often when the job runs there is no file there at all. The issue is that this causes the job to fail and send a notification email, but I only want to be notified if the job failed while processing a file, not because there was no file there in the first place. From what I have gathered, I could fix this with a Script Task that checks whether the file is there before the job continues, but I haven't been able to get that to work. Can someone break down how the Script Task works and what sort of script I need to check if a file exists? Or, if there is some better way to accomplish what I am trying to do, I am open to that as well!
The errors I get when I tried the Foreach Loop approach are shown in an image attached to the question (Excel Source validation failures).
This can be done easily with a Foreach Loop Container in SSIS.
Put simply, the container will check the directory you point it at and perform the tasks within the container for each file found. If no files are found, the contents of the container are never executed, and your job will not fail; it will complete, reporting success.
Check out this great intro blog post for more info.
In the image attached to the question, the specific errors are related to the Excel Source failing validation. When SSIS opens a package for editing or running, the first thing it does is validate that all of the artifacts needed for a successful run are available and conform to the expected shape/API. Since the expected file may not be present, right-click on the Excel Connection Manager and, in the Properties window, find the DelayValidation setting and change it to True. This ensures the connection manager only validates that the resource is available if the package is actually going to use it, i.e. if execution passes into the Foreach Loop Container. You will also need to set DelayValidation to True on your Data Flow Task.
You did not mention what scripting approach you're applying to search for your file. While C# and VB.NET are the typical languages used in a Script Task of this nature, you can also use T-SQL that simply returns a boolean value saved to a user variable (some environments limit the use of C# and VB.NET). You then apply that user variable in the control flow to determine whether to import (boolean = 1) or not (boolean = 0).
Take a look at the following link, which shows in detail how to set up the T-SQL script that checks whether or not a file exists:
Check for file exists or not in sql server?
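For reference, a minimal sketch of that T-SQL approach (the file path is a made-up example, and xp_fileexist must be callable by the account running the package):

    DECLARE @FileExists INT;
    -- master.dbo.xp_fileexist sets the output parameter to 1 if the file is found
    EXEC master.dbo.xp_fileexist 'C:\Imports\ClientData.xlsx', @FileExists OUTPUT;
    SELECT @FileExists AS FileExists;  -- bind this result to an SSIS user variable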
Take a look at the following link, which shows how to apply a conditional check based on a boolean user variable. This example also shows how to apply VB.NET in a Script Task to determine whether the file exists (as an alternative to the aforementioned T-SQL approach).
http://sql-articles.com/articles/bi/file-exists-check-in-ssis/
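If your environment does allow it, the equivalent check as a C# Script Task body (using the standard script template's Main method) is only a few lines. This is a sketch: the variable names User::FilePath and User::FileExists are assumptions, and User::FileExists must be listed in the task's ReadWriteVariables:

    public void Main()
    {
        // Read the candidate path and record whether the file exists.
        string path = Dts.Variables["User::FilePath"].Value.ToString();
        Dts.Variables["User::FileExists"].Value = System.IO.File.Exists(path);
        Dts.TaskResult = (int)ScriptResults.Success;
    }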
Hope this helps.

Jenkins pipeline - Extended Choice Parameter

I am using Jenkins v2.73.2 with Jenkins Pipeline 2.5 and want a multi-select parameter with the options below. Users should be able to choose more than one option.
Build
Deploy
Analysis
Test
For example, if the user selects the options Build, Analysis and Test, the Build job should be executed first, then Analysis and Test. If the user selects the 'Analysis' option, sub-options need to be displayed, like 'Choose Instance': Dev, QA, PreProd and Prod.
Right now I am able to create the multi-select options for Build, Deploy, Analysis and Test using the 'Extended Choice Parameter' plugin, and now I want to add a sub-list option if 'Analysis' is selected. Please share your inputs on how I can achieve this scenario.
This is not possible on the Jenkins build-with-parameters page. Parameters are loaded all at once and cannot have logic that depends on the value of another one.
If you truly need this, you will need to use an input step after job kickoff to evaluate the first provided parameter and present a list for the user to choose from.
This is because params are 'post-processed': they are evaluated/executed during job execution. This is also why you need a single run of a pipeline job before 'Build' becomes 'Build with parameters'.
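A minimal declarative-pipeline sketch of that approach. The parameter name OPTIONS and the instance list are assumptions, and the Extended Choice Parameter plugin is assumed to deliver the selections as a comma-separated string:

    pipeline {
        agent any
        stages {
            stage('Choose Instance') {
                // Only ask for an instance when 'Analysis' was among the selections
                when { expression { params.OPTIONS?.tokenize(',')?.contains('Analysis') } }
                steps {
                    script {
                        def instance = input(
                            message: 'Analysis selected - choose an instance',
                            parameters: [choice(name: 'INSTANCE',
                                                choices: 'Dev\nQA\nPreProd\nProd',
                                                description: 'Target instance')]
                        )
                        echo "Running Analysis against ${instance}"
                    }
                }
            }
        }
    }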

Open multiple copies of same file in SSMS

We have a script file called CreateClientDatabase.SQL; double-clicking it opens it in SSMS, where we can change a few parameters and execute it.
Problem:
A few hours or days later we may need to do the same again for another client, but if the original tab in SSMS has not been closed, double-clicking the file will simply bring that tab to the fore and not actually open the file.
So it's easy to assume the script you are now looking at is the same as the file when it is not, and this can lead to all sorts of issues.
Is there a way round this?
Can SSMS open a second copy of the file, or warn the user that it hasn't actually opened it, much like Excel does?
What you need, I think, is something similar to Excel or Word template files: whenever you open such a file by double-clicking, a new document with the contents of the template is created.
The SSMSBoost add-in (which I develop) has an “Autoreplacements” feature: you can define a “magic token” that is replaced by your script whenever that token is typed. For example, we have the pre-defined token “sel”, which is replaced by “select * from” whenever you type “sel” and press space.
You could associate your script with any word, like “doit”; then, whenever you visit the next customer, you just open a new query window, type that word plus space, and you have your script in the window immediately.
Just to mention: SSMSBoost also allows you to define “favorite” connections, so you can save all your customers' servers in one list and quickly switch between them.
Alternative:
Have a look at SSMS Templates (View -> Template Explorer). SSMS allows creating your own templates and opening them by double-clicking their name in Template Explorer. In combination with the SSMSBoost “Preferred connections” list, you have a good setup to start your work quickly.
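If you prefer a script-only workaround in the same template spirit, a small batch file (the paths are assumptions) can copy the master script to a uniquely named file and open the copy, so the SSMS tab can never be mistaken for the master:

    @echo off
    rem Copy the master script to a uniquely named file and open that copy.
    set "COPY=%TEMP%\CreateClientDatabase_%RANDOM%.sql"
    copy "C:\Scripts\CreateClientDatabase.SQL" "%COPY%" >nul
    start "" "%COPY%"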
Hope this helps.
If the file is opened from Windows Explorer, it opens in another instance of SSMS.
I think what you need is to detect when the SQL script file is changed outside of the SSMS environment:
Make sure that Tools -> Options -> Environment -> Documents -> Detect when file is changed outside the environment is checked.
More details can be found here.

Have to configure my batch script to run on Windows every day at 8 AM

I have a batch script that will take an update of the code base and build all the related projects. I want to run this script automatically every day at 8 AM without running it manually.
Create the batch file you wish to run and place it in a folder where you have sufficient permissions, for example under the C drive.
Click on Start, type Task in the search box, and open Task Scheduler.
Select Create Basic Task from the Action pane on the right of the window.
Under Create Basic Task, type in the name you like and click Next.
From the Trigger options select Daily and click Next, then set the start date and time (8:00 AM).
Now click on Browse and select the batch file you would like to run.
Finally, click on Finish to create the Task.
Now that you have created the task, you have to make sure it runs with the highest privileges; because of UAC, the task could otherwise fail when it cannot get past the UAC prompt.
So click on Task Scheduler Library.
Then double-click on the task you just created.
Check Run with highest privileges, then click OK.
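If you prefer to script the task creation instead of clicking through the UI, the built-in schtasks command can create the same task in one line (the task name and script path are assumptions):

    rem Create (or overwrite, via /F) a daily 8 AM task that runs elevated
    schtasks /Create /TN "DailyBuild" /TR "C:\Scripts\build.bat" /SC DAILY /ST 08:00 /RL HIGHEST /F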

How to index data in Solr from a database automatically

I have a MySQL database for my application. I implemented Solr search and used the DataImportHandler (DIH) to index data from the database into Solr. My question is: is there any way that, when the database gets updated, my Solr index automatically gets updated with the new data added to the database? That is, I should not need to run the index process manually every time the database tables change. If yes, then please tell me how I can achieve this.
I don't think there is a feature in Solr which lets you index the data whenever an update happens in the DB.
But there are possibilities: for example, with the help of triggers, you can run an external application from the database.
Write a cron job that triggers a PHP script which reads from the DB and indexes the data in Solr, or write a trigger (which calls this script) for CRUD operations, so that whenever something happens in the DB, the trigger calls the script and indexing happens.
Please see:
Invoking a PHP script from a MySQL trigger
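For illustration only, here is a hedged sketch of such a trigger. It assumes the lib_mysqludf_sys UDF (which provides sys_exec) is installed; the table name and script path are made up, and note that shelling out from a trigger blocks the writing transaction, so a queue or cron job is usually safer:

    -- Fire an external indexing script after inserts on the products table.
    CREATE TRIGGER products_after_insert
    AFTER INSERT ON products
    FOR EACH ROW
      SET @ignored = sys_exec('php /opt/scripts/index_to_solr.php');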
Automatic Scheduling:
Please see this post, How can I Schedule data imports in Solr, for more information on scheduling. The second answer explains how to import using cron.
Since you used a DataImportHandler to initially load your data into Solr, you could use a delta import, executed with curl from a cron job, to periodically add changes in the database to the index. Also, if you need more real-time updates, as @Rakesh suggested, you could use a trigger in your database and have it kick off the curl call to the delta DIH.
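A sketch of what that cron entry might look like (the host, port, and core name are assumptions):

    # Run a delta-import every 15 minutes to pick up database changes
    */15 * * * * curl -s "http://localhost:8983/solr/collection1/dataimport?command=delta-import&commit=true" >/dev/null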
You can also run the import from a browser via the Windows Task Scheduler.
Do the following steps on a Windows server:
Go to Administrative Tools => Task Scheduler.
Click "Create Task".
Now the Create Task screen will open, with the tabs General, Triggers, Actions, Conditions and Settings.
In the General tab enter the task name "Solrdataimport" and in the description enter "Import mysql data".
Now go to the Triggers tab and click New. Under Settings check Daily; under Advanced settings check Repeat task every and enter whatever interval you want, then click OK.
Now go to the Actions tab and click the New button. Under Settings put Program/Script "C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" (the installation path of the Chrome browser). In Add arguments enter http://localhost:8983/solr/#/collection1/dataimport//dataimport?command=full-import&clean=true and click OK.
With all of the above in place, the data import will run automatically. To stop the import process, follow the same steps but put "taskkill" as the Program/Script in place of "C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" under the Actions tab, and in the arguments enter "/f /im chrome.exe".
Set the trigger timing according to your requirements.
What you're looking for is a "delta-import", and a lot of the other posts here have that covered. I created a Windows WPF application and service to issue commands to Solr on a recurring schedule, since using cron jobs and the Task Scheduler is a bit difficult to maintain if you have a lot of cores/environments.
https://github.com/systemidx/SolrScheduler
You basically just drop a JSON file into a specified folder and it will use a REST client to issue the commands to Solr.
