I have tried to change the Resource.resx file at run time, but it's impossible because at compile time all the .resx files for an app are packed into a single PRI file by the MakePri utility and included with the app's deployment package.
For the time being I am trying to delete the .resx files and find another way to create and retrieve the localization string files.
Can you please help me by providing a solution for this problem? Thank you.
In fact, you should not delete your .resw files; instead, keep them and build your own localization mechanism on top of them.
I recently built a tool that updates .resw files (they are just XML files) at runtime and can also read them. You have to ensure that your .resw files are not compiled into the PRI file: change their build action from PRIResource to Content and select Copy Always, so that the files are available in your output directory.
Unfortunately, you can't reuse the ResXResourceReader class that is available for Windows Forms applications, but you can easily write a class that reads and modifies these entries. The format for localized data in a .resw file is the following:
<data name="Name1">
<value>this is my long string</value>
<comment>this is a comment</comment>
</data>
Then you add a class that opens the expected localization file based on the user's locale and passes the stream to the parser, which extracts the localized string by its identifier.
The MSI database contains a set of tables, and I can successfully enumerate the File table, which has all the deployable files' metadata. What I need to extract is the actual contents of those files. msiexec, lessmsi and 7-Zip can all do it, but I couldn't find any source/API to do it.
What I've discovered is that all other (resource) files are in the Binary table, and its Data field can be used to get the content of those files (icons, custom DLLs, etc.).
Further, I found that the Media table contains information about the .cab file (the MSI has all content embedded via <MediaTemplate EmbedCab="yes"/>), which simply means the CAB file contains the actual content. I probably need to read it from the "structured storage" of the .msi file.
How can I extract the contents of the CAB/MSI file using the native C Msi* functions?
Phil has given you the easy/simple answer, but I thought I might give you a little more information since you've done some research. Check out:
https://msdn.microsoft.com/en-us/library/windows/desktop/aa372919(v=vs.85).aspx
This is where the structured storage is. You'll see something like Disk1.cab as the Name (primary key) and binary data. The data is a CAB file, with each file entry in the cab matching the File.File column. From there you can use the File.FileName column to get the short name and long name (you'll want the long name, no doubt) and do a join to the Component table to get the Directory table ID.
You'll also need to recurse the Directory table to build the tree of directories and know where to put the files.
Fun stuff. There are some libraries in C# that make this WAY simpler. Or just call msiexec /a as Phil says. :)
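If you do want to stay fully native, here is a minimal sketch of that first step: reading the cabinet name from the Media table and copying the embedded cabinet out of the MSI's structured storage via the _Streams table. The .msi and output .cab paths are placeholders, error handling is mostly omitted, and expanding the extracted cabinet (Cabinet/FDI API) and mapping its entries back to the File/Directory tables is not shown.

/* Sketch: extract the embedded cabinet from an MSI using the Win32 MSI database API. */
#include <windows.h>
#include <msi.h>
#include <msiquery.h>
#include <stdio.h>

#pragma comment(lib, "msi.lib")

int main(void)
{
    MSIHANDLE hDb = 0, hView = 0, hRec = 0;
    wchar_t cabinet[MAX_PATH];
    DWORD cch = MAX_PATH;

    if (MsiOpenDatabaseW(L"C:\\temp\\product.msi",
                         (LPCWSTR)MSIDBOPEN_READONLY, &hDb) != ERROR_SUCCESS)
        return 1;

    /* The Media table names the cabinet; a leading '#' means it is embedded. */
    MsiDatabaseOpenViewW(hDb, L"SELECT `Cabinet` FROM `Media`", &hView);
    MsiViewExecute(hView, 0);
    if (MsiViewFetch(hView, &hRec) != ERROR_SUCCESS)
        return 1;
    MsiRecordGetStringW(hRec, 1, cabinet, &cch);
    MsiCloseHandle(hRec);
    MsiCloseHandle(hView);
    if (cabinet[0] != L'#')
        return 1;   /* external cabinet - it already sits next to the MSI */

    /* Embedded cabinets live in the _Streams table under the name without '#'. */
    MsiDatabaseOpenViewW(hDb, L"SELECT `Name`, `Data` FROM `_Streams`", &hView);
    MsiViewExecute(hView, 0);
    while (MsiViewFetch(hView, &hRec) == ERROR_SUCCESS)
    {
        wchar_t name[MAX_PATH];
        DWORD cchName = MAX_PATH;
        MsiRecordGetStringW(hRec, 1, name, &cchName);
        if (wcscmp(name, cabinet + 1) == 0)
        {
            FILE *out = _wfopen(L"C:\\temp\\Disk1.cab", L"wb");
            char buf[4096];
            DWORD cb;
            do {
                cb = sizeof(buf);                      /* in: buffer size, out: bytes read */
                MsiRecordReadStream(hRec, 2, buf, &cb);
                fwrite(buf, 1, cb, out);
            } while (cb == sizeof(buf));
            fclose(out);
        }
        MsiCloseHandle(hRec);
    }
    MsiCloseHandle(hView);
    MsiCloseHandle(hDb);
    return 0;
}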
The most straightforward way to extract all the files to some location is to perform an administrative installation. If you do a:
msiexec /a [path to msi] TARGETDIR=[some folder]
you'll see what happens.
In C or C++, call MsiInstallProduct() with that command line.
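For reference, here is a minimal native sketch of that call; the paths are placeholders, and ACTION=ADMIN is what makes it an administrative installation (the programmatic equivalent of msiexec /a):

#include <windows.h>
#include <msi.h>

#pragma comment(lib, "msi.lib")

int main(void)
{
    UINT rc;

    /* Suppress the installer UI for an unattended extraction. */
    MsiSetInternalUI(INSTALLUILEVEL_NONE, NULL);

    /* ACTION=ADMIN performs the administrative ("extract") installation. */
    rc = MsiInstallProductW(L"C:\\temp\\product.msi",
                            L"ACTION=ADMIN TARGETDIR=C:\\temp\\extracted");
    return (rc == ERROR_SUCCESS) ? 0 : (int)rc;
}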
You have gotten many good answers already, including the use of dark.exe from the WiX toolkit. By downloading the WiX source code you should be able to get the code you need ready-made from there. I assume you may already have done this.
Chris has already linked to the DTF code you can check, but here is a link directly to dark.exe as well: https://github.com/wixtoolset/wix3/tree/develop/src/tools/dark. I would try both, although this is C# and you seem to want native code.
UPDATE: Before I get to the Win32 features you can use, check out this little summary of the C# DTF features: How to programmatically read the properties inside an MSI file?
Native Win32 functions: The database functions to deal with an MSI file can be found on MSDN (this is to deal with the MSI file as a database). There are also MSI Installer Functions (used to deal with the MSI file as an actual installer).
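As a small, hedged illustration of the second group (treating the .msi as an installer package rather than a raw database), the sketch below opens the package and reads one of its properties; the path is a placeholder:

#include <windows.h>
#include <msi.h>
#include <stdio.h>

#pragma comment(lib, "msi.lib")

int main(void)
{
    MSIHANDLE hProduct = 0;
    wchar_t name[256];
    DWORD cch = 256;

    MsiSetInternalUI(INSTALLUILEVEL_NONE, NULL);    /* no UI while we peek inside */

    if (MsiOpenPackageW(L"C:\\temp\\product.msi", &hProduct) != ERROR_SUCCESS)
        return 1;

    if (MsiGetProductPropertyW(hProduct, L"ProductName", name, &cch) == ERROR_SUCCESS)
        wprintf(L"ProductName: %s\n", name);

    MsiCloseHandle(hProduct);
    return 0;
}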
You can certainly find good examples of native code for this with a good Google search. Have fun!
BTW: It would help to have a description of the actual problem you are trying to solve, as well as what you need technically. There could, as always, be less involved ways to achieve what you need, unless you are writing security software, a malware scanner or something super-involved.
And so it is clear: WiX's dark.exe fully decompiles MSI files into WiX source files and the resource files used to build them - you can then text and binary compare the various types of content (text compare for tables, binary compare for binaries, etc...). The process to do so via command line is described in the following answer: How can I compare the content of two (or more) MSI files? (this is about comparing MSI files, but one option to do so is to decompile them - see section on dark.exe - just for reference for others who find your question).
I like to link things together so we can find content easily at a later point in time. Strictly speaking it doesn't seem necessary here; you have what you need, I think, but others could perhaps benefit from some further links. Here are some related links:
Extract MSI from EXE.
What is the purpose of administrative installation initiated using msiexec /a?
How do I extract files from an MSI package? (explains why you should not use 7-Zip to extract).
I am currently trying to create a plugin-like library for my company.
I need to check whether four directories exist within the project structure. As java.io.File is not available, I am pretty confused about how to check for the existence of a file that needs to exist within the project structure.
The concrete use case:
There will be four directories:
/entities
/converter
/attributes
/caches
Now if the developer uses this library and wants to access all, let's say, "Person" entities from the server, he should be able to call
RestGet.getAll("Person");
and the library looks in the source directory of the project to check whether these files exist:
/entities/PersonEntity.java //<-- Stores the actual data
/converter/PersonConverter.java //<-- Converts the JSON answer of the server to the Object
/attributes/PersonAttributes.java //<-- An enum that is used to set the attributes of the object
/caches/PersonCache.java //<-- A simple Cache
How can I do this? I tried with FileSystemStorage, but it only tells me that I should use getAppHome()...
I don't quite understand the usage of the source directory which obviously won't exist on the device where your application is running.
You can get access to files in the root of your SRC directory which get packaged into the JAR using Display.getInstance().getResourceAsStream(...).
The replacement to java.io.File is FileSystemStorage which is covered in the developer guide.
I used Sphinx4 for some time, and it really fit my needs. I load a recognizer, pass the audio data to it and use the recognized string in my application.
Right now I'm working on a C application (C++ is unfortunately not an option) where I need something similar, and I thought I could use Sphinx3, which is written in C.
The problem is that I don't really know how it is used inside an application, and there is no "Hello World" example like the one Sphinx4 provides.
I already compiled and installed sphinxbase and sphinx3 and now I can include the sphinx header files in my application.
Now to my questions:
Is there a "simple" and well documented example application that uses sphinx3 from a C environment?
How can I load up the sphinx3 engine and call a recognizer with my binary audio data?
OR: Do I need to start an application like "sphinx3_decode" and call it from my own application? If so, is there an example application for that?
Thank you in advance!
Best regards,
Robert
It's not recommended to use Sphinx3. From the website:
Sphinx-3 is CMU’s large vocabulary speech recognition system. It’s
older C based decoder that we continue to maintain. It’s planned to
make it obsolete in the future, it’s still most accurate decoder for
large vocabulary tasks. We are using it as a baseline to check the
recognizer accuracy. This decoder is only intended for researchers who
want to evaluate bleeding edge methods in ASR like tree search method.
If you need to use a decoder you should use pocketsphinx. You can find the tutorial and the API documentation on the website
http://cmusphinx.sourceforge.net/wiki/tutorialpocketsphinx
http://cmusphinx.sourceforge.net/api/pocketsphinx/pocketsphinx_8h.html
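For completeness, here is a minimal sketch along the lines of that tutorial: it decodes a single utterance from a raw 16 kHz, 16-bit mono file. The model, dictionary and audio paths are placeholders (point them at your installation), and the exact signatures of ps_start_utt and ps_get_hyp differ slightly between pocketsphinx releases:

/* Minimal pocketsphinx "hello world" sketch.
   Build (roughly): gcc hello_ps.c $(pkg-config --cflags --libs pocketsphinx sphinxbase) */
#include <pocketsphinx.h>
#include <stdio.h>

int main(void)
{
    cmd_ln_t *config;
    ps_decoder_t *ps;
    FILE *fh;
    int16 buf[512];
    size_t nread;
    int32 score;
    const char *hyp;

    config = cmd_ln_init(NULL, ps_args(), TRUE,
                         "-hmm",  "/usr/local/share/pocketsphinx/model/en-us/en-us",
                         "-lm",   "/usr/local/share/pocketsphinx/model/en-us/en-us.lm.bin",
                         "-dict", "/usr/local/share/pocketsphinx/model/en-us/cmudict-en-us.dict",
                         NULL);
    ps = ps_init(config);

    fh = fopen("utterance.raw", "rb");
    ps_start_utt(ps);                                  /* older releases also take an utterance id */
    while ((nread = fread(buf, sizeof(int16), 512, fh)) > 0)
        ps_process_raw(ps, buf, nread, FALSE, FALSE);  /* feed the binary audio data */
    ps_end_utt(ps);

    hyp = ps_get_hyp(ps, &score);                      /* older releases have a third out parameter */
    printf("Recognized: %s\n", hyp ? hyp : "(nothing)");

    fclose(fh);
    ps_free(ps);
    cmd_ln_free_r(config);
    return 0;
}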
I recently worked on an integrated project for the Punjabi language.
Here are the steps that we used:
First we recorded the Punjabi audio data in a sound-isolated room at a 16,000 Hz sample rate.
Then we took the recorded data and segmented it with the Praat software into small .wav and .raw files of 2 to 30 seconds, and saved them in a folder named train.
Then, on a Linux system (Ubuntu), we installed the required tools such as autoconf, automake, etc., and untarred Sphinx 3 along with four packages: cmuclmtk, pocketsphinx, sphinxbase and sphinxtrain.
Then, based on the small .wav files, we created the various control files: transcription, dic, phone, filler, file ids, ccs, etc.
Then we opened a terminal and ran "sphinx_fe" to check whether Sphinx was functional.
Then we created a folder named "man" and changed to its path in the terminal.
Then we ran the command "sphinxtrain -t man setup". Running this command creates a folder named "etc" inside the "man" folder, containing the files "feat.params" and "config".
Changes were made in the config file according to our data.
Then we moved all the files we had created earlier (transcription, dic, ...) into the etc folder located in the man folder.
Then we placed the "lang1.sh" script in the etc folder and the remaining four scripts in the man folder.
Then we opened the etc folder's path in the terminal and ran "lang1.sh".
Then we ran a series of commands in the terminal: "mfcgen2.sh", then "verify3.sh", then "hmm4.sh", and finally "end-test.sh" to get the final result.
If you have worked with Sphinx 4, you may already know about the files mentioned in the steps above. I hope this helps.
I have some XML files marked as "Content" that should be copied into the application's XAP file. How can I read these files from within Silverlight?
I know how to read files normally in .NET. So what I'm looking for is a way to find where Windows put the files, plus any relevant security issues.
This is not exactly reading the files from within the XAP, but if you just want to be able to load the files at runtime, this will work.
You can set the files to be embedded resources. You then get access to them using the following:
GetType().Assembly.GetManifestResourceStream(resourceName)
resourceName is the name of the file with the full namespace. E.g., if your assembly's default namespace (set in project settings) is "foo" and your file is in a folder called "bar", then resourceName would be something like "foo.bar.MyFile.xml".
You should just be able to do an XDocument.Load and pass in the name of your file.
I can't find a way to use the source server tools from the Debugging Tools for Windows on a static library project,
which is built separately from the solutions actually using that library:
The output of "ssindex.cmd" always displays "zero source files found" for the PDB file generated for the library
(using compiler options /ZI and /Fd).
Running "srctool.exe -r" on this PDB displays nothing, which probably means that the PDB file does not contain any source file information.
Running the same command on the PDB file of a test application, which is also built as part of the same solution,
yields a list of all expected source files.
Is there a way to use source indexing for a static library project when it is built separately from the solutions using it?
Thanks for any suggestions!
You can use the "/Save" and "/Load" options to store and load source information for a static library, respectively. Using these options allows you to store information for your library and then later import it when indexing a project that links against your library.
When indexing your library solution, you specify the "/Save" flag with a directory in which to store index information about the library's source files. For example (assuming you are using Subversion for source control),
ssindex.cmd /System=SVN /Save=c:\source\libproj\srcinfo /Source=c:\source\libproj /Symbols=c:\source\libproj\Release\*.pdb
When later indexing your project that includes your library, you specify the "/Load" flag with the directory containing the library's source file information. For example,
ssindex.cmd /System=SVN /Load=c:\source\libproj\srcinfo /Source=c:\source\binproj /Symbols=c:\source\binproj\Release\*.pdb
There are two potential issues that may affect your ability to use this technique. First, it appears that some source control providers may not support saving and loading source control information. I know that the Subversion provider does and it looks like the SourceSafe provider does, but I haven't checked any others.
Second, this technique appears to only work for one external static library out-of-the-box. There does not seem to be a way to load information from multiple directories and the scripts currently overwrite the contents of the directory each time you use the "/Save" option. You could probably edit the source control provider module to append to the files in the save directory rather than overwrite them, but I have not tried it.
Also, note as you mentioned above that you only need to do this if your library is being built as part of a separate solution. If the static library is part of the solution you are indexing, its source files will be included if they are in the path specified by the "/Source" option.
It probably means you haven't entered the correct directories when running "ssindex". For ssindex you need to have: /source=C:/SourceCode/ /symbols=C:/SourceCode/bin/Debug. I'm not sure whether "source" has an upper-case S or not, but that should be it!
When you run svnindex.cmd, it always tells you "zero source files found".
After some painful digging into svn.pm (the Perl module that deals with SVN), I found that:
First, svn.pm invokes "svn info -R $SourceRoot" to get the version info of all files in $SourceRoot (passed via the /source option).
Then svn.pm stores all the files in a dictionary, using the local file path as the key.
Then svnindex.cmd calls "srctool -r" to get all source file info from the *.pdb, and uses the source file name as a key to query the info saved in step 2.
The problem is:
svn.pm uses relative paths, but the *.pdb uses absolute paths, so no SVN info is ever found for any file, hence "zero source files found".
fixup:
change svn.pm line 162:
$LocalFile = lc $1;
to
$LocalFile = $SourceRoot . "\\" . lc $1; # make the path absolute