How can I deactivate or delete a flow through the Workbench API or Apex? I have tried every approach I could find. Could anyone help out here?
You can deactivate a flow through its metadata using Workbench. This is useful when a process is installed via a managed package and references a custom object that doesn't exist in the target org: in that case the process is active but can't be edited in Process Builder.
Resolution
Follow these steps to deactivate a process:
1. Log in to Workbench (see the Workbench documentation for background).
2. Use Workbench to retrieve the relevant metadata components and download them as a .zip file (see the Metadata API documentation for details).
3. Modify the flow definition file contained in the .zip you downloaded by setting activeVersionNumber to 0, as shown in the sketch below.
4. Deploy the modified .zip back to the org through Workbench.
Note that the flow definition file is available only in API version 34 and later.
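For reference, a minimal sketch of the two files involved; the flow name (My_Process) and the API version are placeholders for whatever your org uses:

<!-- package.xml used for the Workbench retrieve. -->
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
  <types>
    <members>My_Process</members>
    <name>FlowDefinition</name>
  </types>
  <version>45.0</version>
</Package>

<!-- flowDefinitions/My_Process.flowDefinition inside the downloaded .zip:
     setting activeVersionNumber to 0 deactivates the flow when deployed. -->
<FlowDefinition xmlns="http://soap.sforce.com/2006/04/metadata">
  <activeVersionNumber>0</activeVersionNumber>
</FlowDefinition>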
For the full procedure, see https://help.salesforce.com/s/articleView?id=000338777&type=1
As mentioned in the Flink documentation:
For example a data pipeline might monitor a file system directory for new files and write their data into an event log. Another application might materialize an event stream to a database or incrementally build and refine a search index.
So, how can I monitor a local file-system file for updates with Flink?
The documentation also mentions:
File system sources for streaming is still under development. In the future, the community will add support for common streaming use cases, i.e., partition and directory monitoring.
Does this mean I can already use the API for some streaming use cases? If you know how to use a streaming file-system source, please share. Thanks!
I have zero experience with ETL.
Whenever a file (a .csv) is moved into a specific folder, it should be uploaded to Salesforce. I don't know how to set up this automated flow.
I hope I was clear enough.
I have to use the open-source version; any helpful links or resources will be appreciated.
Thank you in advance
You could definitely use Talend Open Studio for ESB: it includes 'Routes' functionality. You'll be able to use a cFile component, which watches your folder for new files and raises an event that propagates through the route to a designated endpoint (for example, the Salesforce API). Talend ESB maps to Apache Camel components, which are well documented.
Check out Routes with Talend for ESB; it should do the trick. A rough sketch of the underlying route follows below.
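Here is a hedged sketch of the equivalent Apache Camel route in Spring XML; the folder path, the sObject name, and the CSV handling are assumptions, and a real route would first unmarshal the CSV (e.g. with camel-csv or camel-bindy) and map each row to a record:

<route>
  <!-- Poll the watched folder for new .csv files; processed files are moved
       to a .done subfolder so they are not picked up again. -->
  <from uri="file:/data/inbox?include=.*\.csv&amp;move=.done"/>
  <!-- Hand the data off to Salesforce via the camel-salesforce component. -->
  <to uri="salesforce:createSObject?sObjectName=Contact"/>
</route>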
There is the tFileExists component; you can use it and configure it to check for the file.
You also have the tFileWait component, where you can define the time frame for the files' arrival and the number of iterations it should check for the file.
But if you have any scheduling tool, I would suggest using its file-watcher concept and then a Talend job to upload the file to the target location.
Using Talend itself to poll for the file's arrival is not feasible, as the job has to stay running continuously, which consumes more Java resources.
I have created a Web Content Management library for use in WebSphere Portal. At the moment I'm using import-wcm-data to import the library; then I need to add some additional properties to 2-3 files on the server under Resource Environment Providers, and then restart particular services so those changes are detected.
Can anyone explain the benefits of using a PAA over writing a simple bash (or similar) script to automate this process?
I don't see what advantages a PAA would give me. Is a PAA even capable of updating properties files and restarting services?
I have been working intensively with PAA files, and I must say it is a very stable way of deploying an app that requires multiple deployment steps and components.
It does need a startup process, but it is well worth it in a multi-server environment.
You can do all the tasks that you can do in an Ant file, as well as use the wsadmin scripting interface. I only update resource environment settings and the like in WAS, and I do not touch any properties files, for that reason: all the settings are stored in WAS.
In my experience, a PAA is not a good method if you're merely importing a content library.
I don't quite understand why you are importing manually rather than syndicating, but even if there's a good reason not to syndicate, the PAA process was too involved and required too many precursor actions (deleting libraries, removing the PAA, deploying the PAA, and then activating the portlets) to be a viable option for something as simple as importing a WCM library.
Since activating the portlets imported with the PAA was an extra step, I don't believe it can restart applications either.
I've come across only a few examples of how to do this, and they didn't work for me, mainly because I've only ever used an Ant script to auto-build jar files through Jenkins. Now, though, I need to build those files in Jenkins and then upload them to a third-party file-hosting site like SourceForge. This is both to save hard-drive space on the server (since I don't own it) and to allow external downloads. Any help is welcome, but please no comments on the fact that I don't know much about Ant scripts.
Also, something related but a bit separate: the jar file I'm building depends on another jar file with its own version. I'd also like to create a new folder for each upload, named after the dependency version. That way, users who download the file can easily tell which main jar version it goes with, while I can upload 20+ sub-builds.
There are several ways to upload files, so there are several kinds of Ant tasks for the job.
For instance, if you want to upload to SourceForge, you can use the Ant scp task. It also seems possible to upload there via FTP: for that, there is the ftp task.
Maybe you'll find some other service that requires you to upload via HTTP: ant-contrib has the post task.
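To illustrate, hedged sketches of both approaches; the host, paths, and credentials are placeholders, the scp task needs jsch.jar on Ant's classpath, and the ftp task needs commons-net:

<!-- Upload over SSH with the optional scp task. -->
<scp file="dist/myapp.jar"
     todir="user@frs.sourceforge.net:/home/frs/project/myproject"
     keyfile="${user.home}/.ssh/id_rsa"
     passphrase=""
     trust="true"/>

<!-- Or upload over FTP with the optional ftp task. -->
<ftp server="ftp.example.org"
     userid="${ftp.user}"
     password="${ftp.password}"
     remotedir="/uploads">
  <fileset dir="dist">
    <include name="*.jar"/>
  </fileset>
</ftp>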
I used to do publication as part of my Ant build logic, creating a special "publish" target that issued the scp or ftp command. Now I'm more inclined to leverage one of the "publish over" plugins for Jenkins.
The main reason for this shift is the management of access credentials. With the Ant-based approach, I was forced to run my build on a Jenkins slave that was pre-configured with the correct SSH key for the remote server. The Jenkins plugin manages private keys centrally and ensures all slaves are properly configured.
Finally, if your build depends on 3rd-party jars, use a dependency manager like Ivy to download them and include them in your project. It then becomes trivial to include their upload as part of your publish step.
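As a hedged sketch, an ivy.xml along these lines would declare the dependency (the module names and revision are placeholders):

<ivy-module version="2.0">
  <info organisation="com.example" module="myapp"/>
  <dependencies>
    <!-- The jar your build depends on; Ivy resolves and downloads it. -->
    <dependency org="com.example" name="dependency-lib" rev="2.3.1"/>
  </dependencies>
</ivy-module>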
I'm using TeamCity 6.5.4 and I need to have 3 build configurations for the same deployment package. I'd like to persist the version number across all three build configurations and be able to use that number to version the assembly, tag vcs, version the nuspec file, etc.
Here are the configurations and desired version numbers:
Configuration    | Version
-----------------|--------
CI/Nightly Build | 1.1.*
Minor Release    | 1.*.0
Major Release    | *.0.0
It seems that TeamCity uses a separate build incrementer for each configuration. This means that every time we have a major or minor release, I'd have to manually update the persisted values (the 1s above) in all of the subsequent configurations. I'm a programmer and I'm lazy; I want a single button to do everything for me.
I've seen examples of persisting the build number across the build steps of a configuration using snapshot dependencies, but that only works within a single configuration.
The Autoincrementer plugin bumps the number every time you reference its ID. This is fine for the changing digits (*), but not so good for the persisted values (1).
Is there a way for TeamCity, either natively or via plugin, to allow me to read and write that version to a file or variable that can be persisted across build configurations?
You can reference the build number of the dependent (artifact/snapshot) configuration using dep.btx.build.number, where btx is the build configuration ID of the latter. Once you have that build number, pass it into the script running in your configuration, parse it there, and send service messages from the script back to TeamCity to set the build number the way you want. Do this parsing and number-setting in the very first of your build steps.
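For example, an MSBuild step can set the number just by printing a service message, since TeamCity scans build output for them. A minimal sketch; the property name DependencyBuildNumber is illustrative and would be passed in from %dep.btx.build.number%:

<Target Name="SetBuildNumber">
  <!-- TeamCity picks this service message out of the build log and
       adopts it as the build number for the current build. -->
  <Message Text="##teamcity[buildNumber '$(DependencyBuildNumber)']" Importance="high"/>
</Target>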
Thanks for the suggestions. I opted to write a set of custom targets for use with my MSBuild script, which maintains assembly metadata in a remote XML "manifest" file. When a new TeamCity project is created, my build script calls an Init target that creates a new manifest file from an unpopulated template.
<Copy SourceFiles="#(ManifestTemplate)" DestinationFiles="#(ManifestTemplate->'$(ManifestFile)')" Condition="!Exists('$(ManifestFile)')" />
I'm using the MSBuild Extension Pack to read attributes like version information from the manifest file.
<MSBuild.ExtensionPack.Xml.XmlFile TaskAction="ReadElementText" File="$(ManifestFile)" XPath="/Package/Version/Major">
<Output PropertyName="PackageVersionMajor" TaskParameter="Value"/>
</MSBuild.ExtensionPack.Xml.XmlFile>
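For context, the manifest these XPaths read from looks roughly like this; only /Package/Version/* is implied by the script, so the remaining elements are assumptions:

<Package>
  <Version>
    <Major>1</Major>
    <Minor>2</Minor>
    <Build>34</Build>
  </Version>
  <!-- Additional metadata (copyright, branding, etc.) lives here too. -->
</Package>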
I have my TeamCity build configurations separated into CI, Test, Minor Release, and Major Release, with different events triggering each. In the corresponding target of my project's build script, I add the matching custom target to the DependsOnTargets attribute so that it updates the appropriate version number and saves it to the manifest file.
<Target Name="Test" DependsOnTargets="IntializeBuildProject;Build-UpdateVersion-Build">
<MSBuild Projects="$(SolutionFile)" Targets="Rebuild" Properties="Configuration=$(Configuration)" />
<TeamCitySetBuildNumber BuildNumber="$(PackageVersion)" />
The code in the custom target to handle the version update:
<!-- Increment the persisted build number by one... -->
<MSBuild.ExtensionPack.Science.Maths TaskAction="Add" Numbers="$(PackageVersionBuild);1">
  <Output PropertyName="PackageVersionBuild" TaskParameter="Result"/>
</MSBuild.ExtensionPack.Science.Maths>
<!-- ...and write it back to the manifest so it persists across builds. -->
<MSBuild.ExtensionPack.Xml.XmlFile TaskAction="UpdateElement" File="$(ManifestFile)" XPath="/Package/Version/Build" InnerText="$(PackageVersionBuild)"/>
This file handles persistence of the version and other metadata, so the TeamCity build number is effectively ignored. Since the XML metadata file is centralized, I can use its values to populate my Nuspec, AssemblyInfo, and WiX installer metadata, as well as pass the version and other pertinent information back to TeamCity through service messages.
I added a simple MVC web interface to allow my team to edit the file contents remotely if package details change. Now we have one single place to update things like Copyright information and any other metadata for a given build project. I can also give non-dev folks access to the MVC site to update branding information without allowing them access to my TeamCity build configurations.
With the exception of the service messages used to relay version to TeamCity, there's very little here that's coupled with TeamCity. I like having the functionality in custom targets and build scripts removed from TeamCity on the off chance we move to another build management solution. For that reason, I don't envision taking time to build a TeamCity plugin, but there could be a blog series coming soon.
I'll be happy to provide more code and further explanation to anyone interested.
Yes, you can easily create a plugin to do this. You can take my auto-increment-build-number (across configurations) plugin and modify it to fit your needs. The build number is saved in a text file that is configurable from the admin screen in TeamCity.
http://github.com/ornatwork/tc_plugins/tree/master/unique
Feel free to hit me up for input on how to change it if you need.