My Strapi instance is hosted on AWS and uses Postgres as its database. Data is often lost overnight: everything defined in the Content-Type Builder is preserved, but the entries I added via the Content Manager are suddenly gone.
I have already updated the dependencies in my Strapi project for the AWS deployment and installed them on the server.
My colleague and I are working on the project in parallel: I work directly on the server and he works locally. We have already tested whether this has an impact on the data loss, but haven't found anything so far.
I read about a known problem with SQLite, but since I already use Postgres, that shouldn't be the issue here.
Does anyone have any idea what the problem could be?
System Information
Strapi Version: 4.3.3
Database: postgres
Node Version: 16.13
NPM Version: 8.1.0
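One thing worth double-checking in a setup like this is whether production is really connecting to Postgres: if config/database.js falls back to SQLite (Strapi's quickstart default), the data lives in a local .db file that is wiped whenever the instance or container is recreated, which would match the symptom of content types surviving (they live in source files) while entries vanish. A minimal sketch of a Postgres-only config/database.js for Strapi v4, assuming the conventional environment variable names, which may differ in your project:

    // config/database.js - force the Postgres client instead of any
    // SQLite fallback; connection values come from environment variables.
    module.exports = ({ env }) => ({
      connection: {
        client: 'postgres',
        connection: {
          host: env('DATABASE_HOST', '127.0.0.1'),
          port: env.int('DATABASE_PORT', 5432),
          database: env('DATABASE_NAME', 'strapi'),
          user: env('DATABASE_USERNAME', 'strapi'),
          password: env('DATABASE_PASSWORD', ''),
          ssl: env.bool('DATABASE_SSL', false),
        },
      },
    });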
Related
I am a newbie to the cloud and am trying to understand a few common issues that people in the field have already solved.
As of now I have created a Docker image for my Java-based web application. I have also created an Oracle Database 11g XE image with a database imported by default. Finally, I pushed these Docker images to the AWS repository and deployed them to EC2 instances as Docker containers. I am able to access my web application using the public IP and everything looks good except for one thing: when the EC2 instance goes down or is recreated for some reason, the database container is recreated with the original database, and I lose all the data created after setting up my application for the first time.
I know this is a common issue in the container world; I just want to know how people solve it.
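The usual fix is to keep the database's data files outside the container in a Docker volume (or a host directory), so the container can be destroyed and recreated without losing data. A hedged sketch, assuming an Oracle XE 11g image whose datafiles live under /u01/app/oracle/oradata (the image name is a placeholder, and the path should be verified against your image):

    # Create a named volume once; it survives container recreation.
    docker volume create oracle-data

    # Mount it over the datafile directory inside the container.
    docker run -d --name oracle-xe \
      -p 1521:1521 \
      -v oracle-data:/u01/app/oracle/oradata \
      myrepo/oracle-11g-xe

On EC2 you can also bind-mount a directory on an EBS-backed disk (e.g. -v /data/oracle:/u01/app/oracle/oradata); that survives container and instance restarts, though not instance termination unless the EBS volume is retained.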
We have a Java EE web application built with Maven, using JSF 2.2, Tomcat 7 as our server and MySQL 5.5 as our database. With the development of new features, we sometimes need to change our database structure. At the moment we do all of this manually:
Wait until we have no clients online (around midnight)
Go to Tomcat manager
Undeploy context
Deploy new context
Go to phpMyAdmin and execute the SQL scripts
While our application is still "small", this process remains viable, but we are looking to automate it. We already know about Jenkins, which can read our Git repository, build the .war using Maven and (we are not sure yet) deploy it to Tomcat.
But I am not sure how we will automate our SQL scripts to run when we deploy a new version. It needs to be robust so that it doesn't mess up our database by, for example, running a script twice.
My question is whether there is a better deployment process, focused on database changes, that can help me.
Just to append to the previous answer about Liquibase: you could use Flyway too.
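For reference, Flyway addresses exactly the "don't run it twice" worry: migrations are plain SQL files named by version, and Flyway records each applied version in its own metadata table, skipping anything already run. A sketch (the file path follows Flyway's V<version>__<description>.sql convention; the table and column are made up):

    -- src/main/resources/db/migration/V2__add_customer_email.sql
    ALTER TABLE customer ADD COLUMN email VARCHAR(255);

With the Flyway Maven plugin configured, a Jenkins build step can then run mvn flyway:migrate before redeploying the .war.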
There are solutions for this out there. One of them is called Liquibase.
You can use Liquibase to apply incremental database changes, with Jenkins automating the build process.
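To make that concrete: Liquibase records every applied changeSet (keyed by id, author and file) in a DATABASECHANGELOG table it creates in your schema, so running the same changelog at every deploy is safe; already-applied changes are skipped. A minimal changelog sketch (table and column names are hypothetical):

    <databaseChangeLog
        xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
            http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.1.xsd">
      <changeSet id="add-customer-email" author="dev">
        <addColumn tableName="customer">
          <column name="email" type="varchar(255)"/>
        </addColumn>
      </changeSet>
    </databaseChangeLog>

Jenkins can then run the Liquibase Maven plugin's liquibase:update goal (or the CLI's update command) as a build step before the Tomcat deploy.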
We have been planning to migrate from a shared development database to a local database for each developer. Installing the database, schema and initial data should be automated and platform independent, and each developer would point his application server and DBMS to this local database instead of the shared one, to freely experiment with the schema without fear of breaking others' work. The database in question is Oracle.
Database stuff is of course source controlled, and each developer should be able to easily upgrade to the latest version. Ideally, each developer runs some kind of platform-independent container, which on boot is configured to mirror the QA database by fetching the latest schema and scripts from source control. It should be easy to reset to the last stable state, but also to preserve local changes in some persistent storage in case of container failure.
I have been considering technologies like Vagrant, Docker and/or Ansible to ship and automate the local database setup and configuration in a platform-independent way. However, I read that Oracle Database doesn't officially support Docker. What does that mean? Can't I build a custom Docker image with the Oracle Database binary?
Would it be better to install Oracle Database using Vagrant's Ansible provisioner, given the uncertain Docker support? Would Docker just add an unnecessary layer of complexity, since Vagrant already provides the virtualization and Ansible could handle the setup and configuration?
I would like to hear some real-life war stories about implementing this platform-independent database-per-developer pattern.
"Oracle Database doesn't officially support Docker" just means that there is no official Docker image for Oracle Database right now. But you can always pull a base image like Ubuntu and install your database on top of it.
Once you have set up the whole environment on top of the base image, you can push the resulting image to a private repository and share it.
Private repository services with version control are provided by Docker Hub, GCP, AWS, etc.
Once everyone has the Docker daemon running on their systems, they can just pull the image and deploy it as a container.
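A hedged sketch of that approach follows. The installer file name is illustrative: Oracle's XE installer cannot be redistributed, so you download it yourself, and 11g ships as an RPM that is typically converted for Ubuntu with alien.

    FROM ubuntu:16.04
    # Copy in your own Oracle XE installer; the name is a placeholder.
    COPY oracle-xe.deb /tmp/
    RUN apt-get update && apt-get install -y libaio1 bc net-tools \
     && dpkg -i /tmp/oracle-xe.deb \
     && rm /tmp/oracle-xe.deb
    EXPOSE 1521
    # A real image would also run XE's one-time configuration step
    # (/etc/init.d/oracle-xe configure) and start the listener on boot.

Then build, tag and push to your private registry:

    docker build -t registry.example.com/oracle-xe:11g .
    docker push registry.example.com/oracle-xe:11g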
I have connected my WordPress site to Bitbucket: when I push changes through SourceTree, they are reflected on the Azure Web App service where my code lives. The problem is the database: the team works locally and uses an online database, but when the changes are pushed, the database is not updated. How can I resolve this problem?
Databases are generally not stored in your Git repository, nor would that be desirable. If you'd like to easily sync your WordPress database between a local and a remote install, consider a tool like wordmove or a plugin like WP Migrate DB Pro.
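For example, wordmove is driven by a Movefile that describes each environment; it can then push or pull just the database, rewriting URLs between the two hosts as it goes. A sketch with placeholder hosts and credentials:

    # Movefile (YAML) - all values below are placeholders
    local:
      vhost: "http://example.local"
      wordpress_path: "/var/www/example"
      database:
        name: "wp_local"
        user: "root"
        password: ""
        host: "127.0.0.1"
    production:
      vhost: "http://example.com"
      wordpress_path: "/var/www/example"
      database:
        name: "wp_prod"
        user: "wp"
        password: "secret"
        host: "127.0.0.1"
      ssh:
        host: "example.com"
        user: "deploy"

After that, wordmove push -d sends the local database to production and wordmove pull -d does the reverse.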
I've recently inherited a database driven e-commerce site written in C# ASP.Net, with an MS SQL database.
I have had little or no experience with this exact type of application up to this point, although I am comfortable exploring code, and am familiar with SQL query structure and C# (and web mark-up languages too).
So far I've been able to make all the adjustments I've wanted to the application, have debugged some things, removed some compiler errors, added a few simple new functions, and am rather enjoying myself.
I am experiencing some problems with displaying the information from the database within Visual Web Developer 2008 Express Edition.
Having faced initial setup problems with the web.config file, I'm a little wary about the next steps to take!
I currently have a local copy of web.config, which connects to a local copy of the database during development.
When I compile and upload any new version of the application, I exclude the local web.config, so that the remote version uses its own web.config file to connect to the remote database.
In order to see any of the database information on the web pages during development, I have to run the website in the browser.
Should I be able to see this info in Design View in VS by creating a connection to the database in the database explorer? Will this affect the application when it is running remotely on the webserver? (as the connection would have been made to the local database and not the remote one, and hence the connection string would be different)
All of the DataGrids are blank in VS design view. If I choose a Data Source for them using the Smart Tags in design view, will they use the right Data Source when running remotely? Should I drop the local copy of the database altogether? Connecting to the remote database during development seems rather dangerous to me!
I hope this is clear, any and all help/links/pointers welcome!
See "Using different Web.config in development and production environment" to learn how you can use different configs.
Also check Scott's tip, http://weblogs.asp.net/scottgu/archive/2007/09/21/tip-trick-automating-dev-qa-staging-and-production-web-config-settings-with-vs-2005.aspx (Not sure if it applies to Visual Web Developer)
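One concrete way to keep a single checked-in web.config while varying only the connection per environment is the configSource attribute, which moves the connectionStrings section into a small file you deliberately don't deploy. A sketch (the ShopDb name and connection details are placeholders):

    <!-- web.config: identical in development and production -->
    <connectionStrings configSource="ConnectionStrings.config" />

    <!-- ConnectionStrings.config: one per environment, excluded from the upload -->
    <connectionStrings>
      <add name="ShopDb"
           connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=Shop;Integrated Security=True"
           providerName="System.Data.SqlClient" />
    </connectionStrings>

Data source controls reference the connection string by name (e.g. <%$ ConnectionStrings:ShopDb %>), so the same markup binds to whichever database each environment's config points at.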