Orchard CMS complete site data backup / export - sql-server

I am running a website on Orchard 1.6. The (shared) hosting company I use is not great and I am looking to move to somewhere new, possibly Azure.
The question is: having chosen to use SQL Server 2008, is there a way I can export ALL of my site's data through the admin UI? Otherwise I will need to back up the data from the database, to which I only have limited access.
Obviously I would then want to re-import it later on elsewhere.
Many thanks.

Check out the Import/Export module.
This will work for a lot of content items (pages, menus, widgets, etc.). If you have custom content parts, you may need to add import and export support to your content part drivers, as shown in the sketch below.
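As a rough illustration, here is a minimal sketch of what that driver support can look like in Orchard 1.x; MyCustomPart and its Name field are hypothetical placeholders for your own part:

```csharp
using Orchard.ContentManagement;
using Orchard.ContentManagement.Drivers;
using Orchard.ContentManagement.Handlers;

// Hypothetical part; real parts usually back properties with a record.
public class MyCustomPart : ContentPart
{
    public string Name { get; set; }
}

public class MyCustomPartDriver : ContentPartDriver<MyCustomPart>
{
    protected override void Exporting(MyCustomPart part, ExportContentContext context)
    {
        // Write the part's data as attributes on its element in the export XML.
        context.Element(part.PartDefinition.Name)
               .SetAttributeValue("Name", part.Name);
    }

    protected override void Importing(MyCustomPart part, ImportContentContext context)
    {
        // Read the attribute back; it is null when absent from the import file,
        // so older export files still import without errors.
        var name = context.Attribute(part.PartDefinition.Name, "Name");
        if (name != null)
        {
            part.Name = name;
        }
    }
}
```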

You should ask your hosting company to make a backup of the database for you, or to tell you how to do it. It's your data; they have to comply, if only for disaster recovery.
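If the host does grant you enough access to run T-SQL yourself, a full backup is a one-liner; a minimal sketch, assuming a database name and a path your account can write to:

```sql
-- Full backup; the database name and target path are assumptions.
BACKUP DATABASE MyOrchardDb
TO DISK = 'D:\Backups\MyOrchardDb.bak'
WITH INIT, NAME = 'Full backup of MyOrchardDb';

-- Later, on the new host (logical file names and paths differ per server):
RESTORE DATABASE MyOrchardDb
FROM DISK = 'D:\Backups\MyOrchardDb.bak'
WITH MOVE 'MyOrchardDb' TO 'D:\Data\MyOrchardDb.mdf',
     MOVE 'MyOrchardDb_log' TO 'D:\Data\MyOrchardDb_log.ldf';
```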

Related

Episerver - How to Manage Media Items on Multiple Environments

Hi guys, I'm working on an existing Episerver project (my first one).
One of the issues we are having is that we have three environments for our Episerver website: Development / Staging / Live.
All have separate DBs. At the moment, lots of media items have been added to our live environment via the CMS, and we want to sync these with our staging environment.
However, when we use the export data feature from the live admin section and try to restore it to our staging environment, we end up with missing media, duplicate folders, etc.
Is there a tool/plugin available to manage content/media across multiple environments? Umbraco (another CMS I have used in the past) has something called "Courier"; I'm looking for the Episerver equivalent.
Or is the best way to do this to export the live SQL database and overwrite my staging one? We have different user permissions set in these environments; how can we manage that?
How is this generally done in the world of Episerver?
Unfortunately, the most common way to handle this is, as you say, to do it manually: restore the DB, copy the fileshare, and set up the access rights on the staging environment after the restore.
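The restore step, as a minimal T-SQL sketch; the database names, logical file names and paths are all assumptions for illustration:

```sql
-- Overwrite staging with the live backup; adjust names and paths to your setup.
RESTORE DATABASE EpiStaging
FROM DISK = 'D:\Backups\EpiLive.bak'
WITH REPLACE,
     MOVE 'EpiLive_Data' TO 'D:\Data\EpiStaging.mdf',
     MOVE 'EpiLive_Log'  TO 'D:\Data\EpiStaging_log.ldf';
```

After the restore, re-apply the staging-specific user permissions, since the live ones come over with the database.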
Luc made a nice provider for keeping your local environment in sync. https://devblog.gosso.se/2017/09/downloadifmissingfileblob-provider-version-1-6-for-episerver/

wso2am deployment overrides database, APIs are lost

I am using WSO2 API Manager 02.01.00 on a Linux system. The API Manager is deployed in folder A. The databases (H2) are deployed in folder B, which is not inside folder A. The datasources in /repository/conf/datasources/master-datasources.xml point correctly to the databases in folder B. I configured it like that because I want to preserve the databases across deployments (a few developers are using the API Manager, and they don't want to lose their data).
But it seems that WSO2AM_DB.h2.db is created anew whenever there is an API Manager deployment. I think this because I had a look at the DB size: I started with a size of 1750 KB for WSO2AM_DB.h2.db, published a few APIs in the Manager, and the size increased to 2774 KB. Then I did a deployment and the size returned to 1750 KB.
The effect is that the API Store/Publisher says "There are no APIs published yet".
But I can still see the APIs under Application Subscriptions and in the Carbon resources at /_system/governance/apimgt/applicationdata/provider/admin.
I tried to force re-indexing with this, but it doesn't change anything.
Can I configure somewhere that the database should not be created/manipulated at start?
Meanwhile I'm getting really desperate about this problem.
Maybe you could help me.
Thank you for your time.
WSO2 does not recommend running on the H2 database. You need to use a production database such as MySQL, Oracle, etc.; H2 is only for tryouts.
Basically, WSO2 servers store data in databases as well as on the file system. For this kind of deployment, you need to do the following.
Point to an external database (a datasource sketch follows this list). If you are using this for demo purposes, you can still go with the current mode (H2 database).
Use dep-sync. The content under WSO2_HOME/repository/deployment/server needs to be preserved; you can use SVN-based dep-sync or rsync. The basic idea is that a new deployment needs to carry over the data of the previous deployment.
Preserve the Solr index. If you have hundreds or thousands of APIs in the system, re-indexing would take time. To avoid that, you can copy the contents of WSO2_HOME/solr to the new deployment.
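For the first point, a minimal sketch of an external datasource entry in repository/conf/datasources/master-datasources.xml, assuming MySQL; the host, database name and credentials are placeholders:

```xml
<datasource>
    <name>WSO2AM_DB</name>
    <description>External datasource for the API Manager database</description>
    <jndiConfig>
        <name>jdbc/WSO2AM_DB</name>
    </jndiConfig>
    <definition type="RDBMS">
        <configuration>
            <!-- Placeholder connection details; point these at your own DB server -->
            <url>jdbc:mysql://db-host:3306/apim_db?autoReconnect=true</url>
            <username>apimuser</username>
            <password>changeme</password>
            <driverClassName>com.mysql.jdbc.Driver</driverClassName>
            <maxActive>50</maxActive>
            <maxWait>60000</maxWait>
            <testOnBorrow>true</testOnBorrow>
            <validationQuery>SELECT 1</validationQuery>
        </configuration>
    </definition>
</datasource>
```

Because the database then lives entirely outside the product directory, redeploying the server no longer recreates it.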

Multiple-domain security with SSDT .sqlproj projects?

I'm doing a small pilot project trying to implement SQL Server Data Tools (.sqlproj) projects in order to bring our databases under better source control. In my organization, we have separate no-trust domains for test environments of various purposes, so these domains of course have their own isolated Active Directory accounts.
The documentation is still somewhat sparse and I don't really know where to go for more information on this toolset, especially considering the extraordinary amount of churn in Visual Studio's history of database assets.
So far, the only idea I've really had is to make separate sqlproj projects specifically for the security objects of each separate domain, separate from the other schema objects. My hope is that I can somehow tie my actual database schema to those at deploy time, and also somehow switch which security project I'm using in the build. I have no idea if that's feasible, though.
Has anyone that uses Visual Studio sqlproj projects had to deal with this? Is there a best practice for this kind of thing?
If you have different settings for each environment, then the easiest approach is either to leave them out of the project and not delete them when you deploy, or to have a post-deploy script that sets them up manually.
Normally, for handling different configurations, I would suggest using SQLCMD variables (the project's properties have a page for setting these up), but you cannot use a variable when creating a login, so that falls over!
There is an example of how to set up a post-deploy wrapper for just this case:
http://schottsql.blogspot.co.uk/2013/05/ssdt-setting-different-permissions-per.html
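As a rough illustration of the idea (not the linked article's exact script), a post-deployment sketch keyed off a SQLCMD variable; $(Environment), the domain names and the role are all placeholder assumptions:

```sql
-- Post-Deployment script sketch; assumes a SQLCMD variable named Environment
-- defined on the project's properties page, and that the Windows logins
-- already exist at server level. All names below are placeholders.
IF '$(Environment)' = 'TEST'
BEGIN
    IF NOT EXISTS (SELECT 1 FROM sys.database_principals
                   WHERE name = N'TESTDOMAIN\AppUsers')
        CREATE USER [TESTDOMAIN\AppUsers] FOR LOGIN [TESTDOMAIN\AppUsers];
    EXEC sp_addrolemember N'db_datareader', N'TESTDOMAIN\AppUsers';
END
ELSE IF '$(Environment)' = 'PROD'
BEGIN
    IF NOT EXISTS (SELECT 1 FROM sys.database_principals
                   WHERE name = N'PRODDOMAIN\AppUsers')
        CREATE USER [PRODDOMAIN\AppUsers] FOR LOGIN [PRODDOMAIN\AppUsers];
    EXEC sp_addrolemember N'db_datareader', N'PRODDOMAIN\AppUsers';
END
```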
Good luck with SSDT; there are some strange quirks, but it enables so much!

Export Tableau notebook to standalone interactive output?

I have just started using Tableau and I love creating visualizations with it. However, I am trying to export the visualizations in some standalone format, but I do not know how. I see that I can export as an image / PDF / Excel crosstab, but all of these kill the interactivity of the visualizations. I can export as a Tableau packaged workbook, but the client (my intended audience) will need Tableau to see it. Is there any way to export a visualization in a standalone, offline, interactive format? I can assume the client will have Microsoft Office installed, but cannot assume / ask him to install Tableau to view my output.
Please suggest if there is any way possible.
Thanks a lot!
Siva.
I'm guessing you don't have Tableau Server (with which you could generate reports viewable in a web browser).
So the solution is to have Tableau Reader: https://www.tableau.com/products/reader
It's free, and it will allow your client to open your twbx files. The only thing is, if you have a large database, the file size will be huge (as the packaged workbook contains all the data), so it's a good idea to filter the tables down to the minimum necessary so you don't have to share huge files.
Another option, if the data is not particularly private, is to publish the packaged workbook to Tableau Public. That's free as well. Just be aware that anyone can view workbooks on Tableau Public, so it's great for blogs, newspaper interactives, and public demos, but not so good for private financial data. Even for business users, you can post examples with fake data to Tableau Public.
Also, if your customer doesn't want to purchase Tableau Server but wants more than the free Tableau Reader, there is a third way: the cloud-hosted Tableau Online service, which makes financial sense for smaller organizations.

Update a local/client Microsoft Access Database from a server (MS SQL Server 2005)

I've got a website that runs on a shared hosting environment, using ASP.net 2.0 (C#) and MS SQL Server 2005. I've recently been asked if I can integrate my website with a piece of third party desktop software that uses the Access runtime as its database (transparent to the end user).
Primarily, I want to be able to offer users of my website the option of exporting their data into the Access database on their local machine. The data schemas match sufficiently; the question is how to actually do this, in the simplest way possible for the user.
Simply having a webpage update the local Access database isn't possible due to the obvious security restrictions. I've considered asking them to upload the Access database to the server so I can migrate the data and then let them download it again; however, the competency of the users of this software is such that even locating the Access database, let alone uploading and downloading it from the website, might be too complicated.
I've also considered whether Adobe AIR or Silverlight could help here, but don't know them well enough to be sure. Similarly, I assume another exe could be written to perform this task that the user could simply download and run; however, my experience is in web development, not program development, so this isn't a certainty for me, or an ideal development option.
So, can this be done? If so, what technique can achieve it, with the stated aims being ease of use for the end user, followed by ease of development by someone with web development as their main skill? Many thanks!
You may find this answer of interest: Best way to stream files in ASP.NET
It is about transferring a file from the server. You could export to Excel or CSV and use that to update Access.
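For the CSV route, a minimal sketch of an ASP.NET 2.0 (C#) handler; the connection string name, table and columns are assumptions standing in for your real schema:

```csharp
using System;
using System.Data.SqlClient;
using System.Text;
using System.Web;
using System.Web.Configuration;

public class ExportCsvHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Assumed connection string name in web.config.
        string connStr = WebConfigurationManager
            .ConnectionStrings["SiteDb"].ConnectionString;

        StringBuilder csv = new StringBuilder();
        csv.AppendLine("Id,Name,CreatedOn"); // header row matching the Access schema

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT Id, Name, CreatedOn FROM dbo.CustomerData", conn)) // hypothetical table
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    csv.AppendFormat("{0},{1},{2}",
                        reader.GetInt32(0),
                        reader.GetString(1).Replace(",", " "), // naive escaping for the sketch
                        reader.GetDateTime(2).ToString("yyyy-MM-dd"));
                    csv.AppendLine();
                }
            }
        }

        // Stream the result as a download instead of rendering a page.
        context.Response.ContentType = "text/csv";
        context.Response.AddHeader("Content-Disposition",
            "attachment; filename=export.csv");
        context.Response.Write(csv.ToString());
    }

    public bool IsReusable { get { return false; } }
}
```

The user downloads export.csv and pulls it into Access with the built-in text import; for real data you would want proper CSV quoting rather than the naive comma stripping above.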
Instead of trying to do this in a web page, you might just expose some views from your SQL Server to some client-specific logins.
Then, within the Access application, let them link to your SQL Server. You might even provide an Access application that gets the data from your site and stuffs it into their local Access database.
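A minimal sketch of that server side, with placeholder names and a hypothetical per-client filter:

```sql
-- Placeholder names throughout; one login/view pair per client.
-- CREATE LOGIN runs at server level, the rest in your database.
CREATE LOGIN ClientA_Login WITH PASSWORD = 'use-a-strong-password-here';
CREATE USER ClientA_User FOR LOGIN ClientA_Login;
GO
CREATE VIEW dbo.vw_ClientA_Data AS
    SELECT Id, Name, CreatedOn
    FROM dbo.CustomerData          -- hypothetical source table
    WHERE ClientId = 42;           -- hypothetical per-client filter
GO
GRANT SELECT ON dbo.vw_ClientA_Data TO ClientA_User;
```

The client then pulls the view into Access as a linked table over ODBC (External Data > ODBC Database), which keeps the data live without any upload/download round trip.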
In my work we have done something similar, transparent to the user, by creating an ActiveX control. The problem is that you are limiting the users to Internet Explorer only.
I think the best way to achieve what you are trying to do is by installing a service on the client's computer. If creating a service is beyond your experience, you can post a project on a site like oDesk and find somebody who can help you with the development for the money you are willing to pay to complete your project.
Good Luck.
