Error when I deploy: "attempt to write a readonly database"

I created a social media site following this video. Users can create an account and make posts. This works on my local machine, but after deploying via Google Cloud, users are not able to create an account. The error message is:
OperationalError: attempt to write a readonly database
How can I fix this? Here is the config for the app:
class Config:
    SECRET_KEY = os.environ.get('SECRET_KEY')
    if SECRET_KEY is None:
        SECRET_KEY = '447cc3d7257853eafabc30fd9c373ff8'
    SQLALCHEMY_DATABASE_URI = os.environ.get('SQLALCHEMY_DATABASE_URL')
    if SQLALCHEMY_DATABASE_URI is None:
        SQLALCHEMY_DATABASE_URI = 'sqlite:///site.db'
    MAIL_SERVER = 'smtp.gmail.com'
    MAIL_PORT = 587
    MAIL_USE_TLS = True
    MAIL_USERNAME = 'noreplyb77988443@gmail.com'
    MAIL_PASSWORD = os.environ.get('EMAIL_PASS')
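For context: if the deployment target is App Engine standard, the directory the app is deployed to is read-only, so SQLite cannot write to a bundled site.db; only /tmp is writable, and it is wiped whenever the instance restarts. A minimal sketch of a stopgap under that assumption (the helper name is made up for illustration; a managed database such as Cloud SQL is the durable fix):
import os
import shutil

# Hypothetical helper: copy the bundled database to /tmp, the only
# writable path on App Engine standard, and point SQLAlchemy at it.
# Anything written here is lost when the instance restarts.
def writable_sqlite_uri(bundled_path='site.db', tmp_path='/tmp/site.db'):
    if not os.path.exists(tmp_path):
        shutil.copyfile(bundled_path, tmp_path)
    return 'sqlite:///' + tmp_path  # absolute path => sqlite:////tmp/site.db

class Config:
    SQLALCHEMY_DATABASE_URI = (
        os.environ.get('SQLALCHEMY_DATABASE_URL') or writable_sqlite_uri())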

Related

What is the correct way to deploy an nginx container on azure app service?

I built a React app and created a Dockerfile that uses the nginx image. I can build and run the Docker container locally, and it serves my React app successfully. However, when I try to deploy it to Azure App Service, it fails: the Log Stream logs just say "Your container failed to start up". What confuses me is that I am deploying this in much the same way I have successfully deployed hundreds of Java containers running Spring Boot apps. Those give me no trouble, yet it has taken me days to get this nginx/React app working on App Service.
I am provisioning the App Service with Terraform. The code looks like this:
resource "azurerm_app_service_plan" "dpt_appservice_plan" {
name = "${var.ENVIRONMENT}-dpt-app-service-plan"
location = azurerm_resource_group.dpt_rg.location
resource_group_name = azurerm_resource_group.dpt_rg.name
kind = "Linux"
reserved = true
sku {
tier = "Basic"
size = "S1"
}
}
resource "azurerm_app_service" "dpt_my_website_app_service" {
name = "${var.ENVIRONMENT}-dpt-my-website-app-service"
location = azurerm_resource_group.dpt_rg.location
resource_group_name = azurerm_resource_group.dpt_rg.name
app_service_plan_id = azurerm_app_service_plan.dpt_appservice_plan.id
site_config {
always_on = false
linux_fx_version = "DOCKER|${var.ENVIRONMENT}<acr>.azurecr.io/<registry>:latest"
}
app_settings = {
DOCKER_REGISTRY_SERVER_URL = "https:# acr.azurecr.io"
DOCKER_REGISTRY_SERVER_USERNAME = azurerm_container_registry.dpt_acr.admin_username
DOCKER_REGISTRY_SERVER_PASSWORD = azurerm_container_registry.dpt_acr.admin_password
WEBSITES_PORT = 8080
WEBSITES_CONTAINER_START_TIME_LIMIT = 300
}
}
I build the React app and copy the contents of the build directory into the nginx container:
FROM nginx
COPY build /usr/share/nginx/html
COPY docker/default.conf /etc/nginx/conf.d/default.conf
EXPOSE 8080 80
ENTRYPOINT ["nginx", "-g", "daemon off;"]
Does anybody know the correct way to deploy an nginx container with a React app to Azure App Service?
All of the code posted above is correct. I solved this issue by changing the azure-pipelines file. Previously the containerCommand was uncommented. For some reason, with the nginx container the containerCommand should not be included: the container just starts up on its own, but adding the command as I had makes it fail. I commented it out by prefixing it with a # sign.
inputs:
  azureSubscription: 'dpt-service-1'
  appName: 'dev-dpt-my-website-app-service'
  deployToSlotOrASE: true
  resourceGroupName: 'dev-dpt-rg'
  # Do not set this to production, only use slot<number>
  slotName: 'production'
  imageName: 'devdptAcr.azurecr.io/dpt-images:$(smallTag)'
  # containerCommand: 'docker run -p 80:80 $(smallTag)'

Google AppEngine Getting 403 forbidden trying to update cron.yaml

I am following the docs on how to back up Datastore using App Engine.
I am running gcloud app deploy cron.yaml on a GCE VM, which is meant to update a cron job in App Engine. The GCE VM and the App Engine cron are in the same project, and I have granted App Engine Admin to the GCE VM via a default service account. When I run this on my local machine it updates fine; on the GCE instance, however, is where the issues arise.
Here are the files:
app.yaml
runtime: python27
api_version: 1
threadsafe: true
service: cloud-datastore-admin

libraries:
- name: webapp2
  version: "latest"

handlers:
- url: /cloud-datastore-export
  script: cloud_datastore_admin.app
  login: admin
cron.yaml
cron:
- description: "Daily Cloud Datastore Export"
  url: /cloud-datastore-export?namespace_id=&output_url_prefix=gs://<my-project-id>-bucket
  target: cloud-datastore-admin
  schedule: every 24 hours
cloud_datastore_admin.py
import datetime
import httplib
import json
import logging
import webapp2
from google.appengine.api import app_identity
from google.appengine.api import urlfetch
class Export(webapp2.RequestHandler):
    def get(self):
        access_token, _ = app_identity.get_access_token(
            'https://www.googleapis.com/auth/datastore')
        app_id = app_identity.get_application_id()
        timestamp = datetime.datetime.now().strftime('%Y%m%d-%H%M%S')

        output_url_prefix = self.request.get('output_url_prefix')
        assert output_url_prefix and output_url_prefix.startswith('gs://')
        if '/' not in output_url_prefix[5:]:
            # Only a bucket name has been provided - no prefix or trailing slash
            output_url_prefix += '/' + timestamp
        else:
            output_url_prefix += timestamp

        entity_filter = {
            'kinds': self.request.get_all('kind'),
            'namespace_ids': self.request.get_all('namespace_id')
        }
        request = {
            'project_id': app_id,
            'output_url_prefix': output_url_prefix,
            'entity_filter': entity_filter
        }
        headers = {
            'Content-Type': 'application/json',
            'Authorization': 'Bearer ' + access_token
        }
        url = 'https://datastore.googleapis.com/v1/projects/%s:export' % app_id
        try:
            result = urlfetch.fetch(
                url=url,
                payload=json.dumps(request),
                method=urlfetch.POST,
                deadline=60,
                headers=headers)
            if result.status_code == httplib.OK:
                logging.info(result.content)
            elif result.status_code >= 500:
                logging.error(result.content)
            else:
                logging.warning(result.content)
            self.response.status_int = result.status_code
        except urlfetch.Error:
            logging.exception('Failed to initiate export.')
            self.response.status_int = httplib.INTERNAL_SERVER_ERROR


app = webapp2.WSGIApplication(
    [
        ('/cloud-datastore-export', Export),
    ], debug=True)
The error I'm getting is:
Configurations to update:
descriptor: [/usr/local/sbin/pluto/<my-project-id>/datastore/cron.yaml]
type: [cron jobs]
target project: [<my-project-id>]
Do you want to continue (Y/n)?
Updating config [cron]...
failed.
ERROR: (gcloud.app.deploy) Server responded with code [403]:
Forbidden Unexpected HTTP status 403.
You do not have permission to modify this app (app_id=u'e~<my-project-id>').
I have checked other posts related to this; however, they seem to deal with an older version/deployment of App Engine.
Service accounts!
From Deploying using IAM roles:

To grant a user account the ability to deploy to App Engine:

1. Click Add member to add the user account to the project, then select all of the roles for that account using the dropdown menu.
   Required roles to allow an account to deploy to App Engine:
   a. Set one of the following roles:
      - Use the App Engine > App Engine Deployer role to allow the account to deploy a version of an app.
      - To also allow the dos.yaml or dispatch.yaml files to be deployed with an app, use the App Engine > App Engine Admin role instead.
      The user account now has adequate permission to use the Admin API to deploy apps.
   b. To allow use of App Engine tooling to deploy apps, you must also give the user account the Storage > Storage Admin role so that the tooling has permission to upload to Cloud Storage.
2. Optional. Give the user account the following roles to grant permission for uploading additional configuration files:
   - Cloud Scheduler > Cloud Scheduler Admin role: permissions for uploading cron.yaml files.

Potentially of interest:
- Deployments with predefined roles
- Predefined roles comparison matrix
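For reference, a rough command-line sketch of granting those roles to the VM's service account (the project ID and service-account address are placeholders for illustration):
# Deploy App Engine versions
gcloud projects add-iam-policy-binding <my-project-id> \
    --member="serviceAccount:<project-number>-compute@developer.gserviceaccount.com" \
    --role="roles/appengine.deployer"
# Upload cron.yaml
gcloud projects add-iam-policy-binding <my-project-id> \
    --member="serviceAccount:<project-number>-compute@developer.gserviceaccount.com" \
    --role="roles/cloudscheduler.admin"
# Let the tooling stage files in Cloud Storage
gcloud projects add-iam-policy-binding <my-project-id> \
    --member="serviceAccount:<project-number>-compute@developer.gserviceaccount.com" \
    --role="roles/storage.admin"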
Okay, after some tinkering I added the Project Editor role to the service account linked to the GCE instance running my server. I am not fully sure whether this is the least-privileged role that makes this work.

Spring SAML Error: javax.net.ssl.SSLPeerUnverifiedException: SSL peer failed hostname validation for name: null

I am getting the error "javax.net.ssl.SSLPeerUnverifiedException: SSL peer failed hostname validation for name: null" while running the Java application on my local machine.
I have created the key store as follows and added the JKS file to the classpath. Still, the error is not resolved.
@Bean
public KeyManager keyManager() {
    DefaultResourceLoader loader = new DefaultResourceLoader();
    Resource storeFile = loader.getResource("classpath:samlKeystore.jks");
    String storePass = "password";
    Map<String, String> passwords = new HashMap<String, String>();
    passwords.put("username", "password");
    String defaultKey = "username";
    return new JKSKeyManager(storeFile, storePass, passwords, defaultKey);
}
Can anyone please help me with it? I am using Spring SAML as the service provider and Salesforce as the IdP.
This is probably because the certificate is self-signed. For test purposes you can add your local CA to the JRE's trusted authorities; the default trust store can be inspected with:
keytool -list -keystore [...]/jre/lib/security/cacerts
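The -list command above only inspects the trust store. To actually import a certificate, something along these lines should work (the alias and file name are made up for illustration; changeit is the default cacerts password):
keytool -importcert -alias my-local-ca -file my-local-ca.pem \
    -keystore [...]/jre/lib/security/cacerts -storepass changeit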
For production the certificate should be signed by a recognised authority.

Google App Engine App failed to access Google Cloud Storage bucket

I am unable to access the default Google Cloud Storage bucket from my App Engine project. The project was created with an App Engine SDK version prior to 1.9.0. I created the bucket manually; per the GCS documentation, the default bucket should be accessible to App Engine projects, but it is not accessible in my case. This is the code snippet that tries to create a file:
...
GcsService gcsService = GcsServiceFactory.createGcsService();
GcsFilename file = new GcsFilename(getGcsDefaultBucketName(), fileName);
GcsFileOptions.Builder builder = new GcsFileOptions.Builder();
GcsFileOptions options = builder.mimeType(mimeType).build();
GcsOutputChannel channel = gcsService.createOrReplace(file, options); //erroring in this line
...
Error found in Logs:
: com.google.appengine.tools.cloudstorage.NonRetriableException: java.lang.RuntimeException: Server replied with 403, verify ACLs are set correctly on the object and bucket: Request: POST https://storage.googleapis.com/1-ebilly.appspot.com/SERVICESTAGEREPORT-DEVICENAME-LYF2-CREATEDDATE-01012017-CREATEDDATE-19022017-.ZIP
: User-Agent: AppEngine-Java-GCS
: Content-Length: 0
: x-goog-resumable: start
: Content-Type: application/zip
:
: no content: Response: 403 with 212 bytes of content
: X-GUploader-UploadID: AEnB2Upq0Lhtfy5pbt06pVib8J0-L0XiGqW4JpB0G9PL87keY3WV7RCMVLCPeclD-D4UATEddvvwpAG2qeeIxUJx--brKxdQFw
: Content-Type: application/xml; charset=UTF-8
: Content-Length: 212
: Vary: Origin
: <?xml version='1.0' encoding='UTF-8'?><Error><Code>AccessDenied</Code><Message>Access denied.</Message><Details>Caller does not have storage.objects.create access to bucket myprojectID.appspot.com.</Details></Error>
:
: at com.google.appengine.tools.cloudstorage.RetryHelper.doRetry(RetryHelper.java:120)
: at com.google.appengine.tools.cloudstorage.RetryHelper.runWithRetries(RetryHelper.java:166)
: at com.google.appengine.tools.cloudstorage.RetryHelper.runWithRetries(RetryHelper.java:156)
: at com.google.appengine.tools.cloudstorage.GcsServiceImpl.createOrReplace(GcsServiceImpl.java:70)
PS: I tried creating a new App Engine project and deployed the app in it. The new project was automatically created with a default GCS bucket, and the same code works there without any error. My old project has lots of DB data that I want to retain; I'd like to keep using the same project rather than disposing of it.
Please share your thoughts on how to make the GCS bucket accessible in the old project.
Resolved the issue by adding an IAM permission for the App Engine project. After reading the IAM "Access Control at the Project Level" document and comparing the old and new projects' permissions, I realized that the App Engine project-level permission was missing from the old project. After adding the permission, the same code started to access the default bucket.
[Screenshot: IAM permissions before the fix]
[Screenshot: IAM permissions after the fix]
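For reference, a command-line sketch of granting the equivalent access, assuming the App Engine default service account and a placeholder project ID (roles/storage.objectCreator would be a narrower alternative):
gcloud projects add-iam-policy-binding <my-project-id> \
    --member="serviceAccount:<my-project-id>@appspot.gserviceaccount.com" \
    --role="roles/storage.objectAdmin"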
Try this:
GcsService gcsService = GcsServiceFactory.createGcsService(RetryParams.getDefaultInstance());
GcsFileOptions options = new GcsFileOptions.Builder().mimeType(mime).build();
GcsFilename gcsfilename = new GcsFilename(BUCKET_NAME, fileName);
GcsOutputChannel outputChannel = gcsService.createOrReplace(gcsfilename, options);

Google App Engine Admin SDK Reports API returning 403 Insufficient Permission error

I started with some sample code for App Engine from Google.
My app needs to use the Directory API and the Reports API from the Google Admin SDK.
I have created a project in the API Console and turned on the Admin SDK in Services.
I added the scopes (the same ones as used in the code below) to the "Manage API client access" section of Advanced Tools in my domain's Google cpanel.
The call to the Directory API works.
After that, the call to the Reports API fails with the error message:
"HttpError: https://www.googleapis.com/admin/reports/v1/activity/users/all/applications/admin?alt=json returned "Insufficient Permission">"
Thanks much for the assistance.
import webapp2
import os
from apiclient.discovery import build
from oauth2client.appengine import OAuth2Decorator
from oauth2client.appengine import OAuth2DecoratorFromClientSecrets
from apiclient import errors
import logging
import json
decorator = OAuth2DecoratorFromClientSecrets(
    os.path.join(os.path.dirname(__file__), 'client_secrets.json'),
    'https://www.googleapis.com/auth/admin.directory.user.readonly')

directoryauthdecorator = OAuth2Decorator(
    client_id='123.apps.googleusercontent.com',
    client_secret='456-abc',
    callback_path='/oauth2callback',
    scope='https://www.googleapis.com/auth/admin.directory.user.readonly '
          'https://www.googleapis.com/auth/admin.reports.audit.readonly '
          'https://www.googleapis.com/auth/admin.reports.usage.readonly'
)


class MainHandler(webapp2.RequestHandler):
    def get(self):
        self.response.write('Hello world!')


class OAuthHandler(webapp2.RequestHandler):
    @directoryauthdecorator.oauth_required
    def get(self):
        users = []
        # Get the authorized Http object created by the decorator.
        auth_http = directoryauthdecorator.http()
        # Get the directory service
        service = build("admin", "directory_v1", http=auth_http)
        result = []
        page_token = None
        while True:
            try:
                param = {}
                param['domain'] = 'mydomain.com'
                if page_token:
                    param['pageToken'] = page_token
                files = service.users().list(**param).execute()
                result.extend(files['users'])
                page_token = files.get('nextPageToken')
                if not page_token:
                    break
            except errors.HttpError, error:
                print 'An error occurred: %s' % error
                break
        users = []
        for user in result:
            logging.info(user['primaryEmail'])
            users.append(user['primaryEmail'])
        param = {}
        param['userKey'] = 'all'
        param['applicationName'] = 'admin'
        service = build('admin', 'reports_v1', http=auth_http)
        # this call fails with the 403 Insufficient Permissions error
        results = service.activities().list(**param).execute()
        logging.info(results)


app = webapp2.WSGIApplication([
    ('/', MainHandler),
    ('/users', OAuthHandler),
    (directoryauthdecorator.callback_path, directoryauthdecorator.callback_handler()),
], debug=True)
I read this post and cleared the credentials from the datastore.
Hitting the /users URL again, I got the redirect_uri error message.
I went back to the API project, fixed the redirect URIs, and downloaded the client_secrets.json file again.
Now both calls work (one to the Directory API, the other to the Reports API).
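In case it helps others: when the scope list changes, oauth2client's cached credentials still carry the old grant, so the stored tokens must be cleared before the decorator re-runs the authorization flow with the new scopes. A minimal sketch of clearing them, assuming the decorator's default storage in the CredentialsModel kind:
from google.appengine.ext import db
from oauth2client.appengine import CredentialsModel

# Delete every stored OAuth credential so each user is sent through
# the authorization flow again with the new, wider scope list.
db.delete(CredentialsModel.all(keys_only=True))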
