Is it possible to update .properties files from a batch script?
We have an external system that reads a configuration properties file and performs its tasks based on it. Our plan is to accept some dynamic values (such as a version number) from user input (a Jenkins parameterized build) and update the existing config.properties file with those new values.
Please help!
If I'm understanding correctly, yes, I do something similar. JobA feeds properties to JobB.
JobA runs a task that creates an Amazon EC2 instance. The public URL is detected after instance creation, and I write it to an output.properties file. That file is saved on the Jenkins node running JobA.
In Post-build Actions, use "Trigger parameterized build on other projects" (from the Parameterized Trigger Plugin: https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Trigger+Plugin).
"Projects to build" is JobB.
In "Parameters from properties file," point it at the output.properties file (relative to the workspace).
When JobB is triggered, it will read that file just like any other properties file, so you have a downstream job that can consume fairly arbitrary output from an upstream job.
[jenkins#burl-aet-jenk01 poc]$ cat output.properties
INSTANCE_HOSTNAME = ec2-54-84-37-183.compute-1.amazonaws.com
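To come back to the original question of updating an existing config.properties with dynamic values, here is a minimal sketch of the idea in Java. Jenkins exposes build parameters as environment variables; the VERSION variable and the app.version key below are just placeholder names, not anything your setup necessarily uses.

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.util.Properties;

public class UpdateConfig {
    public static void main(String[] args) throws Exception {
        // Load the existing file first so keys we do not touch are preserved.
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("config.properties")) {
            props.load(in);
        }

        // VERSION is a hypothetical Jenkins build parameter, seen here as an env var.
        String version = System.getenv("VERSION");
        if (version != null) {
            props.setProperty("app.version", version);
        }

        // Write the updated file back for the external system (or a downstream job) to read.
        try (FileOutputStream out = new FileOutputStream("config.properties")) {
            props.store(out, "Updated by Jenkins build");
        }
    }
}

If you only need to write a couple of keys and do not care about preserving existing contents, a bare-bones batch line such as echo INSTANCE_HOSTNAME=%MY_VALUE%> output.properties (where MY_VALUE is whatever variable you have) works too; the Java version just makes it easier to merge new values into an existing file.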
I'm creating an app using Grails that generates multiple files as process output and puts them in folders that other people will later access over FTP.
Everything works great, except that in production the newly created files are accessible only to the user that runs Tomcat, meaning that when somebody connects to the folder over FTP the files can't be opened because they don't have permission.
Is there any way to set permissions from Grails, or to configure Tomcat so that every output file can be accessed by other users?
This might help. You can also look into executing shell commands but that may not be the best option.
I found out that the File class actually has a method that changes the permissions of a file instance. When I first tried it, I noticed it only changed the permissions for the owner, but with a slight change to the parameters you can tell it to apply to other users too.
File.setReadable(boolean readable)
File.setReadable(boolean readable, boolean ownerOnly)
So in my case, file.setReadable(true, false) did the trick.
Check out the java.io.File class methods for more info.
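For completeness, here is a short Java sketch of relaxing the permissions on a freshly written output file. The path and the extra setWritable call are just for illustration; in the Grails app this would be wherever the generated files land.

import java.io.File;
import java.io.FileWriter;

public class RelaxPermissions {
    public static void main(String[] args) throws Exception {
        // Hypothetical output location read later by the FTP users.
        File out = new File("/srv/ftp/output/report.txt");
        try (FileWriter writer = new FileWriter(out)) {
            writer.write("generated content\n");
        }

        // Second argument false = apply to everyone, not just the owner.
        out.setReadable(true, false);
        // Optional: also let other users overwrite or delete the file.
        out.setWritable(true, false);
    }
}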
So, I have a script that uses BULK INSERT to pull text from files and insert their contents into a table. I am loading from text files because the text may be large, and this way I do not need to worry about escaping. I have the script working locally with a hard-coded directory, e.g. 'C:\Users\me\Files\File.txt'. But I need to run this script in a post-deployment script, and the text files I am reading from are in the same database project. I cannot use a hard-coded directory because it may differ across the environments the project is published to. Is there a way to get a relative path, or to find out the solution/project's directory after deployment?
So: BULK INSERT needs an absolute path, scripts have no concept of relative paths, and this will be deployed to multiple environments where I do not know the absolute path. I decided to use PowerShell together with BULK INSERT. In the database project's pre-build step, I call my PowerShell script, which can figure out its current directory and builds a SQL file that is called from the post-deployment script. In that SQL file, I BULK INSERT using the current directory.
Why not use BCP: http://msdn.microsoft.com/en-us/library/ms162802.aspx ? It can handle relative paths. And if you are able to call PowerShell, I don't see why you wouldn't be able to call BCP.EXE. And it is essentially the same API as BULK INSERT.
Have you considered using a standard location on the file system? When I need to write DOS/CMD scripts that are portable (including install stuff for later consumption via T-SQL, such as CREATE ASSEMBLY FROM), I do something like:
IF NOT EXIST C:\TEMP\MyInstallFolder (
MKDIR C:\TEMP\MyInstallFolder
)
REM put stuff into C:\TEMP\MyInstallFolder now that it is certain to be there
REM CALL some process that looks in C:\TEMP\MyInstallFolder
The MKDIR will create all missing parent folders. So C:\TEMP, which used to be standard on PCs running Windows but is typically not there anymore since Windows moved to per-user temp folders, gets created first, and then MyInstallFolder is created inside it, with no errors. The IF NOT EXIST check makes sure that re-running the script also does not error after the first run.
Can anyone help me build a table that lists all files in a specified folder, so that whenever a file is copied to that folder the table updates and keeps a log of the files?
I need the list to retain the names even if a file is later moved from that folder or deleted. The logged data would be deleted later by a scheduler.
I also need the table to record exactly when the file was copied into that folder, not its modification or creation time.
I am using Windows 7; how can I build a system with this behaviour?
Just turn on Windows file auditing for that folder; the YouTube video takes you through the process.
Microsoft provides information on its TechNet site about how to use the LogParser tool to extract Security events from the event log.
Note: Admin questions should really be posted to the SuperUser site.
I have a few PowerDesigner 15 vbs scripts that perform various cleansing/transformation tasks on physical data models.
I'd like to be able to run all of them at once rather than one at a time (there are 10 scripts now, possibly more in the future).
Also, I'd like to avoid copying all the code into one big, ugly script.
Is there a way to have a script that runs all other scripts?
Note: I've tried ExecuteCommand, but it doesn't work; it executes the .vbs outside the context of PowerDesigner, so it cannot access the Model.
You could store your scripts in an extension (.xem).
The main script would attach the extension to the model, use the scripts (as custom methods on the model object for example), and detach the extension at the end?
Or you could even always attach the extension to your model (which would allow defining the cleansing scripts as popup menu options).
I have a MySQL database for my application. I implemented Solr search and used the DataImportHandler (DIH) to index data from the database into Solr. My question is: is there any way for my Solr index to be updated automatically whenever new data is added to the database? That way I would not need to run the indexing process manually every time the database tables change. If yes, please tell me how I can achieve this.
I don't think Solr has a built-in way to index data whenever updates happen to the DB.
But there are possibilities: with the help of triggers, it is possible to run an external application from the database.
Write a cron job that triggers a PHP script which reads from the DB and indexes the data in Solr. Alternatively, write a database trigger (which calls this script) on the relevant CRUD operations, so that whenever something changes in the DB the trigger calls the script and indexing happens.
Please see:
Invoking a PHP script from a MySQL trigger
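The linked approach uses PHP; if you would rather stay on the JVM, the same "read from MySQL, push to Solr" step could be a small SolrJ program along these lines. This is only a sketch: the products table, its columns, the mydb connection details, and the collection1 core are all made-up examples, and it assumes SolrJ and the MySQL JDBC driver are on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class ReindexJob {
    public static void main(String[] args) throws Exception {
        try (Connection db = DriverManager.getConnection(
                     "jdbc:mysql://localhost:3306/mydb", "user", "password");
             SolrClient solr = new HttpSolrClient.Builder(
                     "http://localhost:8983/solr/collection1").build();
             Statement stmt = db.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT id, name FROM products")) {

            // Copy each row into a Solr document.
            while (rs.next()) {
                SolrInputDocument doc = new SolrInputDocument();
                doc.addField("id", rs.getString("id"));
                doc.addField("name", rs.getString("name"));
                solr.add(doc);
            }
            // Make the new documents visible to searches.
            solr.commit();
        }
    }
}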
Automatic Scheduling:
Please see the post "How can I Schedule data imports in Solr" for more information on scheduling. The second answer explains how to import using cron.
Since you used a DataImportHandler to initially load your data into Solr, you could set up a delta import that is executed using curl from a cron job to periodically add database changes to the index. Also, if you need more real-time updates, as @Rakesh suggested, you could use a trigger in your database and have it kick off the curl call to the delta import.
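If curl is not available (for example on a Windows host), the same delta-import request can be issued from a few lines of Java that a cron job or the Task Scheduler runs. The core name collection1 below is just an example.

import java.net.HttpURLConnection;
import java.net.URL;

public class TriggerDeltaImport {
    public static void main(String[] args) throws Exception {
        // Equivalent of: curl "http://localhost:8983/solr/collection1/dataimport?command=delta-import&commit=true"
        URL url = new URL(
            "http://localhost:8983/solr/collection1/dataimport?command=delta-import&commit=true");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        // DIH runs the import asynchronously; the response code only tells us
        // that the request was accepted.
        System.out.println("Solr responded with HTTP " + conn.getResponseCode());
        conn.disconnect();
    }
}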
You can also trigger the data import using your browser and the Windows Task Scheduler.
Do the following steps on the Windows server:
Go to Administrative Tools => Task Scheduler.
Click "Create Task".
A Create Task screen will open with the tabs General, Triggers, Actions, Conditions, and Settings.
In the General tab, enter the task name "Solrdataimport" and in the description enter "Import mysql data".
Go to the Triggers tab, click New, and under Settings check Daily. Under Advanced settings, check "Repeat task every" and set whatever interval you want, then click OK.
Go to the Actions tab and click New. Under Settings, set Program/script to "C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" (the installation path of the Chrome browser). In "Add arguments" enter http://localhost:8983/solr/#/collection1/dataimport//dataimport?command=full-import&clean=true and click OK.
With all of the above in place, the data import will run automatically. To stop the import process, follow the same steps but set Program/script to "taskkill" instead of the Chrome path, and in the arguments enter "/f /im chrome.exe".
Set the trigger's timing according to your requirements.
What you're looking for is a "delta-import", and a lot of the other answers here cover that. I created a Windows WPF application and service to issue commands to Solr on a recurring schedule, since cron jobs and Task Scheduler are a bit difficult to maintain if you have a lot of cores / environments.
https://github.com/systemidx/SolrScheduler
You basically just drop a JSON file into a specified folder, and it uses a REST client to issue the commands to Solr.