By default, bzr diff does not show differences in binary files. However, ODT files (OpenOffice, LibreOffice) are largely XML under the surface: an ODT file is a ZIP archive containing XML data.
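You can verify this yourself (the file name is just an example):

# an ODT file is an ordinary ZIP archive; listing it shows the XML members
unzip -l document.odt
# typically lists content.xml, styles.xml, meta.xml, among others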
Is there a plugin that allows Bazaar to look inside archives? Or are there any special plugins for ODT documents?
The "official" branch of the oodiff plugin is out of date, and doesn't work with the latest version of Bazaar.
I created a new branch with bug fixes; it should work better. You can install it with:
bzr branch lp:~janos-gyerik/bzr-oodiff/fixes-for-bzr2.5 ~/.bazaar/plugins/oodiff
I tested it with a few ODT files and it works for differences in the working tree and differences in past revisions.
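For example, once the plugin is installed, diffing should work the usual way (file name hypothetical):

# with the oodiff plugin installed, bzr diff should show text changes in ODT files
bzr diff report.odt
# comparing against an earlier revision works as usual
bzr diff -r -2 report.odt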
From the description, the following plugin should do the job:
http://doc.bazaar.canonical.com/plugins/en/oodiff.html
I have been searching for a way to obtain a timestamp for when a package was released for general use, or possibly when it was first loaded onto a local repository. Something in either Shell or Python would be ideal, but I'm open to other options at this point. I know packages support a changelog, but it looks like not all packages include a release date.
Thanks!
The answer depends on what exactly you are looking for, and that's not clear from the question. Before reproducible builds were introduced, the date a package was built could be retrieved from the raw ar archive members, for example:
ar tv pkgname_version_arch.deb
If you are looking for the date the package was accepted/uploaded into a specific repository, then the answer will depend on the repository and the software used to manage it. For Debian you can get the information from UDD, from the debian-devel-changes mailing list for maintainer uploads (but not the buildd uploads), or from the package tracker; other derivatives and distributions might have different interfaces, or none at all. For private repositories, there may be publicly accessible logs available.
As you mention, the changelog can be used to find when the source package was prepared, but that might be wildly different from when it was built or even uploaded.
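As a rough sketch of the changelog route for an installed package (the package name is just an example; some packages ship changelog.gz instead of changelog.Debian.gz):

# the trailer line of each changelog entry ends with that entry's date
pkg=hello
zcat "/usr/share/doc/$pkg/changelog.Debian.gz" | grep -m1 '^ -- ' | sed 's/.*>  //'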
In our company there are many projects, each containing several kinds of information, e.g. source code, project documentation, bug reports, or emails. This information is not kept in one central place, so if you want to look up how a problem was solved in a past project, you have to search each system yourself.
The idea now is to build a project archive that you can search through. We want to use Apache Solr to create a web app for searching all of this information.
Indexing PDF, Word, or Java files is not the problem in this case. The question is what the best way is to gather all the files from the different systems. The documents live in systems like MS SharePoint, Atlassian Confluence, Jira, SVN, or Git.
What is the best strategy to export all the information from these different systems and gather it in a central place, where the indexing can easily be done, ideally automatically?
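To illustrate why the indexing itself is not the problem: once a file has been gathered, feeding it to Solr is a one-liner. A minimal sketch, assuming a running Solr instance with a core named "projects" and the extracting request handler (Solr Cell) enabled; the id and file name are made up:

# index a PDF through Solr's /update/extract handler
curl "http://localhost:8983/solr/projects/update/extract?literal.id=doc1&commit=true" -F "myfile=@design-spec.pdf"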
For example, we have a 20 MB bzip2-compressed SQL file of development data that we'd like to have versioned along with our development code.
However, we don't want this file pulled down from the repo by default with every fresh clone/fetch.
One solution seems to be storing this large file in a separate repo and linking to it with a submodule. A developer would then fetch the DB file only when they need to retrieve and reset their development database. When there's a schema change, the database file would be updated, committed to the external repo, and the submodule reference updated.
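Concretely, the workflow I have in mind looks something like this (the repo URL, paths, and database name are made up):

# link the dump repo as a submodule; a plain clone of the main repo
# does not fetch submodule contents by default
git submodule add git@example.com:dev-data.git dev-data
# a developer pulls the dump only when they actually need it
git submodule update --init dev-data
# load the dump to reset the development database (MySQL assumed)
bunzip2 -c dev-data/dev.sql.bz2 | mysql dev_db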
Is this a good development workflow? Or is there a better way of doing this?
EDIT: The uncompressed SQL dump is 360MB.
EDIT: GitHub says "no", don't do this:
Database dumps

Large SQL files do not play well with version control systems such as Git. If you are looking to provide your developers with the most recent production dataset, we recommend using Dropbox for sharing files like these among your developers.
I ended up making a simple web server serve the schema dump directory out of the repo where the dumps are stored. The repo grew really quickly because the dumps are large, and just cloning it was slowing people down whenever they had to bring up new nodes.
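Something as simple as this is enough to get started (the directory, port, and host are made up):

# serve the dump directory over plain HTTP so nobody has to clone the big repo
cd /srv/schema-dumps
python3 -m http.server 8000
# developers then fetch a dump on demand, e.g.:
# curl -O http://dumps.example.com:8000/dev.sql.bz2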
Recently our development team has wanted to use Flyway as a database deployment tool. Flyway requires version numbers to be prepended to the migration file names. Developers want to check the version-prefixed files into ClearCase. Our argument has been: why do you need to version the files inside the versioning tool?
Has anyone used Flyway with ClearCase? If so, how are you doing it?
Thanks in advance.
You would need to version those files if you cannot generate them.
But you shouldn't version them with the version number in the file name; rather, put that information in an extra file, also versioned, which you can use to copy the other files into a separate folder with the right naming convention.
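A small sketch of that copy step (versions.txt and its format are made up for illustration; Flyway expects migrations named V&lt;version&gt;__&lt;description&gt;.sql):

# versions.txt maps a version to a script, one mapping per line, e.g.:
#   1.2 add_users_table.sql
mkdir -p build/sql
while read -r version script; do
  cp "sql/$script" "build/sql/V${version}__${script%.sql}.sql"
done < versions.txt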
I am running Sitecore 6.5
I have two installations of Sitecore and want to transfer a whole site from one installation to another.
I have found a few articles that go into serialization and creating a package, although they don't go into detail about how these two fit together.
How do I transfer a site from one installation to another?
Thanks.
Create a package with the package designer.
Include the following items and their children with the "Items statically" button. If you have placed your solution-specific items in folders, you only need to include those folders:
/sitecore/content
/sitecore/layout
/sitecore/media library
/sitecore/templates (only take the templates you have created, e.g. the "User Defined" folder)
Using the "Files statically" button, include the folders where you have made solution-specific changes, such as:
/bin
/layouts
/App_Config/Include (only take the files changed in the solution, compared to a default Sitecore installation)
web.config (if you have made changes to this, compared to the default Sitecore web.config)
If you have any user accounts you want to transfer, you can include them with "Security accounts".
Then generate the zip file, install it on the empty Sitecore instance, and do a full publish :)
If your systems are similar enough, you may want to consider moving the Sitecore DBs via backup/restore (in SQL) and copying over filesystem assets. Generally I find this faster and less prone to user error than creating/installing very large packages. (Just remember to take back-ups first.)
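A sketch of the backup/restore route, assuming SQL Server, default Sitecore database names, and made-up paths (a real restore may need WITH MOVE or REPLACE depending on file locations):

# on the source server: back up each Sitecore database (Core, Master, Web)
sqlcmd -S . -Q "BACKUP DATABASE [Sitecore_Master] TO DISK='C:\backups\Sitecore_Master.bak'"
# copy the .bak files to the target server, then restore them there
sqlcmd -S . -Q "RESTORE DATABASE [Sitecore_Master] FROM DISK='C:\backups\Sitecore_Master.bak'"
# finally copy filesystem assets (e.g. /layouts, /App_Config) across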
Large packages have a tendency to break; one option would be to look into this:
http://www.hhogdev.com/Products/Team-Development-for-Sitecore/Overview.aspx
TDS can sync all your items to XML on your dev box, and from that you can create a different sort of installation package which is significantly more robust than a regular package created through the Sitecore desktop. It's the same sort of package that Sitecore uses when you upgrade versions.
I believe there is a 60 day trial on this product so plenty of time to try it out.
Note: when transferring user accounts, passwords will not be migrated when using either packages or serialization.
The solution is here: cowboy-aspx from Sitecore :)
https://kb.sitecore.net/articles/242631