I'm trying to change the disk storage profile of one VM in vCloud. I've found this link. I use the following XML document as the body of my REST request:
<?xml version="1.0" encoding="UTF-8"?>
<RasdItemsList xmlns="http://www.vmware.com/vcloud/v1.5" xmlns:rasd="http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_ResourceAllocationSettingData" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" href="https://192.168.141.100/api/vAppTemplate/vm-4ec8cce7-0b48-483c-ac0c-14ff39d0aa7c/virtualHardwareSection/disks" type="application/vnd.vmware.vcloud.rasdItemsList+xml" xsi:schemaLocation="http://www.vmware.com/vcloud/v1.5 http://192.168.141.100/api/v1.5/schema/master.xsd http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_ResourceAllocationSettingData http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2.22.0/CIM_ResourceAllocationSettingData.xsd">
<Link rel="edit" href="https://192.168.141.100/api/vAppTemplate/vm-4ec8cce7-0b48-483c-ac0c-14ff39d0aa7c/virtualHardwareSection/disks" type="application/vnd.vmware.vcloud.rasdItemsList+xml"/>
<Item>
<rasd:Address>0</rasd:Address>
<rasd:Description>SCSI Controller</rasd:Description>
<rasd:ElementName>SCSI Controller 0</rasd:ElementName>
<rasd:InstanceID>2</rasd:InstanceID>
<rasd:ResourceSubType>lsilogic</rasd:ResourceSubType>
<rasd:ResourceType>6</rasd:ResourceType>
</Item>
<Item>
<rasd:AddressOnParent>0</rasd:AddressOnParent>
<rasd:Description>Hard disk</rasd:Description>
<rasd:ElementName>Hard disk 1</rasd:ElementName>
<rasd:HostResource
xmlns:vcloud="http://www.vmware.com/vcloud/v1.5"
vcloud:capacity="10240"
vcloud:busSubType="lsilogic"
vcloud:busType="6"
vcloud:storageProfileOverrideVmDefault="true"
vcloud:storageProfileHref="https://192.168.141.100/api/vdcStorageProfile/3235c8c2-7489-4e32-b73c-cd8a9d10c4e4">
</rasd:HostResource>
<rasd:InstanceID>2000</rasd:InstanceID>
<rasd:Parent>2</rasd:Parent>
<rasd:ResourceType>17</rasd:ResourceType>
</Item>
<Item>
<rasd:AddressOnParent>1</rasd:AddressOnParent>
<rasd:Description>Hard disk</rasd:Description>
<rasd:ElementName>Hard disk 2</rasd:ElementName>
<rasd:HostResource xmlns:vcloud="http://www.vmware.com/vcloud/v1.5" vcloud:capacity="1024" vcloud:busSubType="lsilogic" vcloud:busType="6"/>
<rasd:InstanceID>2001</rasd:InstanceID>
<rasd:Parent>2</rasd:Parent>
<rasd:ResourceType>17</rasd:ResourceType>
</Item>
<Item>
<rasd:Address>0</rasd:Address>
<rasd:Description>IDE Controller</rasd:Description>
<rasd:ElementName>IDE Controller 0</rasd:ElementName>
<rasd:InstanceID>3</rasd:InstanceID>
<rasd:ResourceType>5</rasd:ResourceType>
</Item>
</RasdItemsList>
For some reason, after the reconfiguration task completes I see no changes in the storage profile section:
PS C:\Windows\system32> $vmdisks.RasdItemsList.Item[1].HostResource
vcloud capacity busSubType busType
------ -------- ---------- -------
http://www.vmware.com/vcloud/v1.5 10240 lsilogic 6
PS C:\Windows\system32> $vmdisks.RasdItemsList.Item[2].HostResource
vcloud capacity busSubType busType
------ -------- ---------- -------
http://www.vmware.com/vcloud/v1.5 1024 lsilogic 6
The interesting thing is that re-configuring the disk size succeeds; it looks like the reconfiguration task simply skips the storage profile changes. Can anyone advise me on this matter?
I've double-checked fast provisioning in my organisational vDC and it is off.
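For reference, the request is sent from PowerShell roughly like this (a sketch only; the session token and the file path are placeholders, and the URI is the edit link from the document above):
$headers = @{
    'x-vcloud-authorization' = $token                           # session token returned by the login call
    'Accept'                 = 'application/*+xml;version=5.1'  # match the API version you logged in with
}
$uri = 'https://192.168.141.100/api/vAppTemplate/vm-4ec8cce7-0b48-483c-ac0c-14ff39d0aa7c/virtualHardwareSection/disks'
Invoke-RestMethod -Method Put -Uri $uri -Headers $headers `
    -ContentType 'application/vnd.vmware.vcloud.rasdItemsList+xml' `
    -Body (Get-Content 'C:\temp\disks.xml' -Raw)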
What you're looking for is the ProviderVdcStorageProfile element inside VdcStorageProfile.
I'm using PowerShell to post the XML document that I build in Notepad++, based on the document shown below (a rough sketch of the PowerShell call follows the XML). The link you post to has a rel of edit and a content type of orgVdc+xml, and its href will be your vDC ID.
<?xml version="1.0" encoding="UTF-8"?>
<CreateVdcParams name="{0}" xmlns="http://www.vmware.com/vcloud/v1.5">
<Description>API VDC</Description>
<AllocationModel>AllocationVApp</AllocationModel>
<ComputeCapacity>
<Cpu>
<Units>MHz</Units>
<Allocated>3000</Allocated>
<Limit>0</Limit>
</Cpu>
<Memory>
<Units>MB</Units>
<Allocated>0</Allocated>
<Limit>0</Limit>
</Memory>
</ComputeCapacity>
<NicQuota>0</NicQuota>
<NetworkQuota>10</NetworkQuota>
<VdcStorageProfile>
<Enabled>true</Enabled>
<Units>MB</Units>
<Limit>0</Limit>
<Default>true</Default>
<ProviderVdcStorageProfile
href="https://vcloud.example.com/api/admin/pvdcStorageProfile/0b6fe60b-e70b-4529-bbaa-fd82ff59125f" />
</VdcStorageProfile>
<ResourceGuaranteedMemory>0.01</ResourceGuaranteedMemory>
<ResourceGuaranteedCpu>0.01</ResourceGuaranteedCpu>
<VCpuInMhz>3000</VCpuInMhz>
<IsThinProvision>true</IsThinProvision>
<NetworkPoolReference
href="https://vcloud.example.com/api/admin/extension/externalnet/4444"/>
<ProviderVdcReference
name="Provider1"
href="https://vcloud.example.com/api/admin/extension/providervdc/242424" />
<UsesFastProvisioning>true</UsesFastProvisioning>
</CreateVdcParams>
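A hedged sketch of the PowerShell side (untested as written; $vdcHref, the token and the file path are placeholders, and the Content-Type is taken from whatever the rel="edit" link reports):
$headers = @{
    'x-vcloud-authorization' = $token                           # session token from the login call
    'Accept'                 = 'application/*+xml;version=5.1'
}
# GET the vDC first, pick out its rel="edit" link, then PUT the new document back to that link
$vdc  = Invoke-RestMethod -Method Get -Uri $vdcHref -Headers $headers
$edit = $vdc.DocumentElement.Link | Where-Object { $_.rel -eq 'edit' }
Invoke-RestMethod -Method Put -Uri $edit.href -Headers $headers -ContentType $edit.type `
    -Body (Get-Content 'C:\build\vdc.xml' -Raw)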
Related
I am using React and would like to take data from my .env file and insert it into my .xml file.
Is this achievable somehow? I could not find anything useful on the net.
The file looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<OfficeApp xmlns="http://schemas.microsoft.com/office/appforoffice/1.1" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:type="TaskPaneApp">
<!--IMPORTANT! Id must be unique for each add-in. If you copy this manifest ensure that you change this id to your own GUID. -->
<Id>c6890c26-5bbb-40ed-a321-37f07909a2f0</Id>
<Version>1.0</Version>
<ProviderName>Contoso, Ltd</ProviderName>
<DefaultLocale>en-US</DefaultLocale>
<DisplayName DefaultValue="Northwind Traders Excel" />
<Description DefaultValue="Search Northwind Traders data from Excel"/>
<SupportUrl DefaultValue="[Insert the URL of a page that provides support information for the app]" />
<AppDomains>
<AppDomain>https://www.northwindtraders.com</AppDomain>
</AppDomains>
<DefaultSettings>
<SourceLocation DefaultValue="https://www.contoso.com/search_app/Default.aspx" />
</DefaultSettings>
<Permissions>ReadWriteDocument</Permissions>
</OfficeApp>
And instead of, let's say, the Id property, I would like to have some preset value from the .env file.
I am trying to create records in Salesforce using a CSV file in Mule 4. However, after the first batch is created successfully, instead of ending the program, Mule goes back and creates the same batches an infinite number of times.
How do I get Mule to stop after the first batch is created? Here is my code:
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:batch="http://www.mulesoft.org/schema/mule/batch" xmlns:db="http://www.mulesoft.org/schema/mule/db"
xmlns:salesforce="http://www.mulesoft.org/schema/mule/salesforce"
xmlns:ee="http://www.mulesoft.org/schema/mule/ee/core" xmlns:file="http://www.mulesoft.org/schema/mule/file" xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.mulesoft.org/schema/mule/core
http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
http://www.mulesoft.org/schema/mule/ee/core http://www.mulesoft.org/schema/mule/ee/core/current/mule-ee.xsd
http://www.mulesoft.org/schema/mule/salesforce
http://www.mulesoft.org/schema/mule/salesforce/current/mule-salesforce.xsd
http://www.mulesoft.org/schema/mule/db http://www.mulesoft.org/schema/mule/db/current/mule-db.xsd
http://www.mulesoft.org/schema/mule/batch http://www.mulesoft.org/schema/mule/batch/current/mule-batch.xsd">
<salesforce:sfdc-config name="Salesforce_Config" doc:name="Salesforce Config" doc:id="8e29192b-953e-499f-a34b-584b34bd6e9c" >
<salesforce:basic-connection username="gregory.palios#ohrsdev2.com" password="1luvL!zzy" securityToken="6QgMUnwuaNM9rgX5HQbiwIdv" url="https://vha-gov--ohrsdev2.my.salesforce.com/services/Soap/u/48.0" />
</salesforce:sfdc-config>
<flow name="addendumFlow" doc:id="4df7a2c4-3d56-4e00-a65a-62f579f43880" >
<file:listener doc:name="On New or Updated File" doc:id="7ea260b4-02a9-429c-82a9-87cf924c6ef1" directory="C:\Users\GregoryPalios\AnypointStudio\studio-workspace\legacyohrstestmigration\Table Files\ADDENDUM">
<scheduling-strategy >
<fixed-frequency />
</scheduling-strategy>
</file:listener>
<ee:transform doc:name="Transform Message" doc:id="93744a3d-5945-4688-af87-afaf90677469">
<ee:message>
<ee:set-payload><![CDATA[%dw 2.0
output application/java
---
[payload map{
ADDENDUM_ID__c: $.ADDENDUM_ID as Number,
ENCOUNTER_ID__c: $.ENCOUNTER_ID as Number,
OH_STD_ENCOUNTER_STATUS_ID__c: $.OH_STD_ENCOUNTER_STATUS_ID as Number,
NOTE__c: $.NOTE,
ADDENDUM_DESCRIPTION__c: $.ADDENDUM_DESCRIPTION,
VIEWED_DATE__c: $.VIEWED_DATE,
VIEWED_DATE_TZ__c: $.VIEWED_DATE_TZ,
DELETED_DATE__c: $.DELETED_DATE,
DELETED_DATE_TZ__c: $.DELETED_DATE_TZ,
DELETED_BY__c: $.DELETED_BY,
CO_SIGNATURE_REQUIRED_IND__c: $.CO_SIGNATURE_REQUIRED_IND,
CO_SIGNER_ACTIVE_DIRECTORY_NAME__c: $.CO_SIGNER_ACTIVE_DIRECTORY_NAME,
ADMIN_LEVEL__c: $.ADMIN_LEVEL,
RECORD_CREATED_BY__c: $.RECORD_CREATED_BY,
RECORD_CREATED_DATE__c: $.RECORD_CREATED_DATE,
RECORD_MODIFIED_BY__c: $.RECORD_MODIFIED_BY,
RECORD_MODIFIED_DATE__c: $.RECORD_MODIFIED_DATE,
RECORD_MODIFIED_COUNT__c: $.RECORD_MODIFIED_COUNT as Number,
RECORD_CREATED_DATE_TZ__c: $.RECORD_CREATED_DATE_TZ,
RECORD_MODIFIED_DATE_TZ__c: $.RECORD_MODIFIED_DATE_TZ,
OH_STD_INACTIVE_ACTIVITY_REASON_ID__c: $.OH_STD_INACTIVE_ACTIVITY_REASON_ID as Number
}]]]></ee:set-payload>
</ee:message>
</ee:transform>
<foreach doc:name="For Each" doc:id="9ab8acaf-988e-4a02-bb72-3150f9688a4a" >
<salesforce:create doc:name="Create" doc:id="2865a2fd-2559-401e-a365-93950717d3a7" config-ref="Salesforce_Config" type="ADDENDUM__c" />
</foreach>
</flow>
</mule>
In your file listener, you need to set autoDelete to "true", or move the file to a backup directory other than the directory you are listening to.
You can also enable watermarking so the file will not be picked up again and again.
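For example, a sketch of the listener from the question with those options added (to the best of my recollection of the Mule 4 File connector attributes; adjust to your version and paths):
<file:listener doc:name="On New or Updated File"
        directory="C:\Users\GregoryPalios\AnypointStudio\studio-workspace\legacyohrstestmigration\Table Files\ADDENDUM"
        autoDelete="true"
        watermarkEnabled="true">
    <!-- alternatively, keep the file but move it out of the watched folder with moveToDirectory="some-other-directory" -->
    <scheduling-strategy>
        <fixed-frequency frequency="10" timeUnit="SECONDS"/>
    </scheduling-strategy>
</file:listener>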
I must create a new cartridge for an integration in BM, but I don't want to use pipelines. Can I use controllers for this? If yes, please provide information on how to do it.
Yes, you can. You need to create the bm_extensions.xml and add all the actions/entries.
Note: the file mentions pipelines, but each entry can actually be a controller, as you can see in the example linked below.
<?xml version="1.0" encoding="ISO-8859-1" ?>
<extensions xmlns="http://www.demandware.com/xml/bmmodules/2007-12-11"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.demandware.com/xml/bmmodules/2007-12-11 bmext.xsd">
<menuaction id="paypal_transactions_manager" menupath="orders" position="200" site="true">
<name xml:lang="x-default">PayPal Transactions</name>
<short_description xml:lang="x-default">Manage the PayPal transactions related with this site orders.</short_description>
<description xml:lang="x-default">Manage the PayPal transactions related with this site orders.</description>
<exec pipeline="PaypalAdmin" node="Orders" />
<sub-pipelines>
<pipeline name="PaypalAdmin-Orders" />
<pipeline name="PaypalAdmin-OrderTransaction" />
<pipeline name="PaypalAdmin-Action" />
<pipeline name="PaypalAdmin-CreateNewTransaction" />
</sub-pipelines>
<icon>paypalbm/images/icon_transactions.png</icon>
</menuaction>
</extensions>
The PayPal cartridge bm_paypal is a good example of how this is done: https://github.com/SalesforceCommerceCloud/link_paypal/tree/master/cartridges/bm_paypal/cartridge
PS: let me know if you cannot access the link.
Below is the configuration XML of my small program in Anypoint Studio. What I am trying to do is copy the data of one text file (pipe delimited) to another text file. The deployment goes well, but the application just sits at the "Deployed" status. I have tried other transformations as well, but the result is the same. Help is highly appreciated. Thanks in advance.
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:db="http://www.mulesoft.org/schema/mule/db" xmlns:ee="http://www.mulesoft.org/schema/mule/ee/core"
xmlns:file="http://www.mulesoft.org/schema/mule/file"
xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
http://www.mulesoft.org/schema/mule/ee/core http://www.mulesoft.org/schema/mule/ee/core/current/mule-ee.xsd
http://www.mulesoft.org/schema/mule/db http://www.mulesoft.org/schema/mule/db/current/mule-db.xsd">
<flow name="texttoexcelFlow" doc:id="42aaa83a-e26a-4f6d-8d2f-da3613a8d232" initialState="started">
<file:read doc:name="Read" doc:id="89fa46c9-aa14-4a79-b7ab-e609b9fad501" path="D:\Mulesoft Input\Name.txt" outputMimeType="application/json" outputEncoding="UTF-8">
<repeatable-in-memory-stream />
</file:read>
<ee:transform doc:name="Transform Message" doc:id="86dc86b8-99ed-4bee-b5bc-e07616e44431" >
<ee:message >
<ee:set-payload ><![CDATA[%dw 2.0
output application/csv headerLineNumber = 0 , header = false , separator = "|"
---
payload map ( payload01 , indexOfPayload01 ) -> {
FirstName: payload01.FirstName ,
LastName: payload01.LastName
}]]></ee:set-payload>
</ee:message>
</ee:transform>
<file:write doc:name="Write" doc:id="3884725e-3870-4ef1-9e05-b10a2274dfa6" path="C:\Users\aseem\Desktop\Mulesoft Output\Excel.txt">
</file:write>
</flow>
</mule>
You need something to trigger the flow to run. file:read doesn’t do this automatically.
All flows need a ‘Source’ to trigger them unless you are calling them from other flows using flow-ref (or from dataweave using a lookup()).
If you know the exact file you want then you can put a scheduler before your file:read to trigger the flow:
<scheduler>
<scheduling-strategy>
<fixed-frequency startDelay="5" frequency="10" timeUnit="SECONDS"/>
</scheduling-strategy>
</scheduler>
Or you can use a file:listener to listen for new files in a directory etc as the source directly:
<flow name="onNewFile">
<file:listener config-ref="file" directory="test-data/in" autoDelete="true">
<scheduling-strategy>
<fixed-frequency frequency="1000"/>
</scheduling-strategy>
</file:listener>
...
</flow>
You can use fixed frequency or cron. More details here: https://docs.mulesoft.com/mule-runtime/4.1/scheduler-xml-reference
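A cron-based strategy looks like this (the expression below is just an example that fires at the start of every minute):
<scheduler>
    <scheduling-strategy>
        <cron expression="0 * * * * ?"/>
    </scheduling-strategy>
</scheduler>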
I followed YouTube videos and articles on the net and implemented this, but it never writes to my log file. I have tried all the suggestions from various forums with no luck.
I'm not sure where I went wrong. I have this inside a class library.
app.config file:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<configSections>
<section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler,log4net"/>
</configSections>
<log4net>
  <appender name="myLogAppender" type="log4net.Appender.RollingFileAppender">
    <file value="D:\\Log4NetLog.txt" />
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%date %level - %message%n" />
    </layout>
  </appender>
  <logger name="myLog">
    <level value="ALL" />
    <appender-ref ref="myLogAppender" />
  </logger>
</log4net>
</configuration>
and in AssemblyInfo.cs:
[assembly: log4net.Config.XmlConfigurator(ConfigFile = "app.config", Watch = true)]
and in the class file:
ILog mylog = LogManager.GetLogger("myLog");
string xx = "tokensalt";
mylog.Info(xx);
If your configuration is in your app.config, you do not need to specify the file in the Configurator:
[assembly: log4net.Config.XmlConfigurator()]
Also, Watch is not very useful here: when you change the app.config file, the application will restart anyway and the file will be reloaded.
If that is not working, I would guess that the path you are logging to is not accessible to the user the application runs as.
->>> <file value="D:\\Log4NetLog.txt"
Make sure you choose a path where you have access.
Use the FileAppender and not the RollingFileAppender; the rolling appender was made for rollover/backup purposes. For example, if your file exceeds 10 MB it rolls over to a new file, and you can decide how many 10 MB files to keep (a size-based example follows the quote). From the log4net site:
RollingFileAppender can roll log files based on size or date or both
depending on the setting of the RollingStyle property. When set to
Size the log file will be rolled once its size exceeds the
MaximumFileSize. When set to Date the log file will be rolled once the
date boundary specified in the DatePattern property is crossed. When
set to Composite the log file will be rolled once the date boundary
specified in the DatePattern property is crossed, but within a date
boundary the file will also be rolled once its size exceeds the
MaximumFileSize. When set to Once the log file will be rolled when the
appender is configured. This effectively means that the log file can
be rolled once per program execution.
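If you do stick with RollingFileAppender, a typical size-based setup looks roughly like this (the values are examples):
<appender name="RollingFileAppender" type="log4net.Appender.RollingFileAppender">
    <file value="D:\Log4NetLog.txt"/>
    <appendToFile value="true"/>
    <rollingStyle value="Size"/>
    <maximumFileSize value="10MB"/>
    <maxSizeRollBackups value="5"/>
    <staticLogFileName value="true"/>
    <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%date %level - %message%newline"/>
    </layout>
</appender>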
Here is a working web.config example that should work for you:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
<configSections>
<section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler, log4net"/>
</configSections>
<log4net>
<appender name="FileAppender" type="log4net.Appender.FileAppender">
<file value="C:\MyLogs\MyLogFile.txt"/>
<appendToFile value="true"/>
<lockingModel type="log4net.Appender.FileAppender+MinimalLock"/>
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="%date{ABSOLUTE} [%logger] %level - %message%newline%exception"/>
</layout>
</appender>
<root>
<level value="DEBUG"></level>
<appender-ref ref="FileAppender"></appender-ref>
</root>
</log4net>
</configuration>
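On the calling side, any logger now routes through the root logger above. A minimal C# sketch (the class name is just an example), assuming the XmlConfigurator assembly attribute shown earlier is in place:
using log4net;

public class TokenService   // hypothetical example class
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(TokenService));

    public void Run()
    {
        Log.Info("tokensalt");   // written to C:\MyLogs\MyLogFile.txt via the FileAppender
    }
}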