I'm using Celery + Redis with a Django REST API on localhost to run a classification task that gets its data from an Axios POST.
Right now I'm trying to deploy it to Google Cloud, and I haven't found a clear way to run Redis and Celery on App Engine. I heard about Google Cloud Tasks, but I haven't found a way to add a task to a view and trigger it when the view is called. So how can I create a function to call a Google Cloud Task for the work I currently have on Celery, or does anyone have an idea of how to do it?
This is my code:
from celery import shared_task
from celery_progress.backend import ProgressRecorder
from snakeimage.models import Prediction,UploadedSnake,SnakeClass
from snakeimage.classification_codes.classification_codes.prediction_func import predict_classes
#import json
#import time
#from django.conf import settings
#from google.cloud import tasks_v2beta3
#from google.protobuf import timestamp_pb2
@shared_task(bind=True)
def image_progress(self, image_path, X, Y, metadata, image_id):
    progress_recorder = ProgressRecorder(self)
    predictions = predict_classes(image_path, X, Y, metadata)
    print(predictions)
    for prediction in predictions:
        print(prediction[0])
        image = UploadedSnake.objects.get(id=image_id)
        class_name = SnakeClass.objects.get(index=(prediction[0] + 1))
        print('>>>>>>>>>>>>>>>>>>>>>', prediction[1])
        Prediction.objects.create(image=image, class_name=class_name, predict_percent=prediction[1])
        progress_recorder.set_progress(1, 3, description='Prediction Result Status')
    return True
and I call it in the view with:
task = image_progress.delay(image_path=image_path, X=X, Y=Y, metadata=0, image_id=image_id)
Thanks for the help.
[EDIT 1]
Sorry for the late reply. I could make it work locally with django-cloud-tasks,
but on the staging server it's not working, and when I try to connect remotely from my machine to the Google Cloud Tasks queue that I created, I get this error (it retries sending 7 times):
googleapiclient.errors.HttpError: <HttpError 403 when requesting https://cloudtasks.googleapis.com/v2beta3/projects/%7Bdeployement-test%7D/locations/europe-west6/queues/default/tasks?alt=json returned "Permission denied on resource project {deployement-test}.". Details: "[{'#type': 'type.googleapis.com/google.rpc.Help', 'links': [{'description': 'Google developer console API key', 'url': 'https://console.developers.google.com/project/{deployement-test}/apiui/credential'}]}, {'#type': 'type.googleapis.com/google.rpc.ErrorInfo', 'reason': 'CONSUMER_INVALID', 'domain': 'googleapis.com', 'metadata': {'consumer': 'projects/{deployement-test}', 'service': 'cloudtasks.googleapis.com'}}]">
I did everything as mentioned on the GitHub page, and there is no option to add an API key for authentication. Does someone know how to resolve this issue? This is my code:
settings:
DJANGO_CLOUD_TASKS_EXECUTE_LOCALLY = False
# If False, running `.execute()` on a remote task will simply log the
# task data instead of adding it to the queue. Useful for debugging.
# Default: True
DJANGO_CLOUD_TASKS_BLOCK_REMOTE_TASKS = True
PROJECT_NAME = "project"
QUEUE_REGION = "region"
QUEUE_NAME = "queue"
DJANGO_CLOUD_TASKS_HANDLER_SECRET = 'random secret key'
DJANGO_CLOUD_TASKS = {
'project_location_name': 'projects/{project}/locations/region',
'task_handler_root_url': '/_tasks/',
}
task.py:
from celery import shared_task
#from celery_progress.backend import ProgressRecorder
from snakeimage.models import Prediction,UploadedSnake,SnakeClass
from snakeimage.classification_codes.classification_codes.prediction_func import predict_classes
#from AINature.settings import DJANGO_HANDLER_SECRET
import json
import time
from django.conf import settings
from google.cloud import tasks_v2beta3
from google.protobuf import timestamp_pb2
from django_cloud_tasks.decorators import task
@task(queue='default')
def example_task(request, p1, p2):
    print(p1, p2)
    print("lezgooow >>>>>>>>>>>>>>>" + p1)
    print(request.task_id)
@task(queue='default')  # decorator needed so django-cloud-tasks can enqueue this function
def prediction_task(request, image_path, X, Y, metadata, image_id):
    print("what is going ooon")
    #progress_recorder = ProgressRecorder(self, )
    predictions = predict_classes(image_path, X, Y, metadata)
    print(predictions)
    for prediction in predictions:
        print(prediction[0])
        image = UploadedSnake.objects.get(id=image_id)
        class_name = SnakeClass.objects.get(index=(prediction[0] + 1))
        print('>>>>>>>>>>>>>>>>>>>>>', prediction[1])
        Prediction.objects.create(image=image, class_name=class_name,
                                  predict_percent=prediction[1])
        #progress_recorder.set_progress(1, 3, description='Prediction Result Status')
    return True
When I set DJANGO_CLOUD_TASKS_EXECUTE_LOCALLY = True, everything runs correctly, but when I turn it off, it throws the error I mentioned.
This is a link to the django-cloud-tasks GitHub repo: this link
Celery and Cloud Tasks are both task queues, but with different implementations, so there is no direct way to convert your Celery logic to Cloud Tasks. That means it would be easier if you use the Cloud Tasks service only. I suggest studying the Cloud Tasks client libraries before doing the migration; there are samples on GitHub to get you started.
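For example, enqueuing work from a Django view with the google-cloud-tasks client library could look something like this sketch; the project, region, queue, and handler URL here are placeholders you would replace with your own:

import json

from google.cloud import tasks_v2

def enqueue_prediction(image_id):
    client = tasks_v2.CloudTasksClient()
    # Fully qualified queue name; project/region/queue are placeholders.
    parent = client.queue_path('my-project', 'europe-west6', 'default')
    task = {
        'app_engine_http_request': {
            'http_method': tasks_v2.HttpMethod.POST,
            'relative_uri': '/_tasks/prediction/',  # hypothetical handler URL
            'headers': {'Content-Type': 'application/json'},
            'body': json.dumps({'image_id': image_id}).encode(),
        }
    }
    # Cloud Tasks stores the task and later POSTs the body to the handler.
    return client.create_task(parent=parent, task=task)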
If you still want to use Celery, you will need to work out how to trigger the workers via HTTP requests, because App Engine standard only accepts HTTP requests.
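Either way, the receiving side on App Engine standard is just an HTTP handler. As a sketch, a Django view that Cloud Tasks could POST to might look like this (the handler and worker names are hypothetical):

import json

from django.http import HttpResponse
from django.views.decorators.csrf import csrf_exempt

@csrf_exempt
def prediction_handler(request):
    # Cloud Tasks delivers the task payload as the POST body.
    payload = json.loads(request.body)
    run_prediction(payload['image_id'])  # hypothetical worker function
    # Any 2xx response marks the task as done; anything else makes it retry.
    return HttpResponse('ok')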
There are other service options you can use, such as Compute Engine and App Engine flexible, where you can implement whatever setup you want.
A Google Drive sheet has been created (from XLS) using the Drive API by an App Engine application with the default service account. The newly created document has been shared with individuals, and access to the file has been confirmed.
File file = driveService.files().create(fileMetadata, inputStreamContent)
.setFields("id")
.execute();
Logger.info("Created file: %s", file.getId());
BatchRequest batch = driveService.batch();
Permission userPermission = new Permission()
        .setType("user")
        .setRole("writer")
        .setEmailAddress("personal.email@gmail.com");
driveService.permissions().create(file.getId(), userPermission)
.setFields("id")
.execute();
Now I would like to create a BigQuery table from this Google Sheet, so the Drive API is obviously already enabled from the previous step. I have adjusted the BigQuery service so that its credentials are created with the necessary scopes:
private static final List<String> SCOPES = asList(DriveScopes.DRIVE,
DriveScopes.DRIVE_READONLY, SheetsScopes.SPREADSHEETS, AUTH, BIGQUERY);
GoogleCredentials googleCredentials = AppEngineCredentials.getApplicationDefault().createScoped(SCOPES);
BigQueryOptions options = BigQueryOptions.newBuilder().setCredentials(googleCredentials).build();
BigQuery bigQuery = options.getService();
But still no luck when I call the controller to ingest the sheet with this code:
ExternalTableDefinition tableDefinition = ExternalTableDefinition
.of(String.format(GOOGLE_DRIVE_LOCATION_FORMAT, fileId), categoryMappingSchema(),
GoogleSheetsOptions.newBuilder().setSkipLeadingRows(FIRST_ROW).build());
TableInfo tableInfo = TableInfo.newBuilder(tableId, tableDefinition).build();
Table table = bigQuery.create(tableInfo);
The error I'm getting suggests that the scope has not been provided to the credentials.
Access Denied: BigQuery BigQuery: No OAuth token with Google Drive scope was found.
Am I missing something?
I suspect there's a problem with ADC: when I initialize the credentials from the JSON key, it works as expected:
InputStream inputStream = new ChannelInputStream(inputChannel);
bqCredentials = GoogleCredentials
.fromStream(inputStream)
.createScoped(BQ_SCOPES);
This approach did not work:
GoogleCredentials googleCredentials = AppEngineCredentials.getApplicationDefault().createScoped(SCOPES);
I want to get unread mails from yesterday, so I had to combine multiple conditions in the query of the messages.list function, which gives me an 'invalid syntax' error. How do I do it? Can someone help me? And will internalDate help me in any way?
from __future__ import print_function
import httplib2
import os
from email.utils import parsedate_tz,mktime_tz,formatdate
from requests.adapters import HTTPAdapter
import datetime
from datetime import date,timedelta
import time
from apiclient import discovery
import oauth2client
from oauth2client import client
from oauth2client import tools
import json
try:
import argparse
flags = argparse.ArgumentParser(parents=[tools.argparser]).parse_args()
except ImportError:
flags = None
SCOPES = 'https://www.googleapis.com/auth/gmail.readonly'
CLIENT_SECRET_FILE = 'client_server.json'
APPLICATION_NAME = 'Gmail API Python Quickstart'
def get_credentials():
"""Gets valid user credentials from storage.
If nothing has been stored, or if the stored credentials are invalid,
the OAuth2 flow is completed to obtain the new credentials.
Returns:
Credentials, the obtained credential.
"""
home_dir = os.path.expanduser('~')
credential_dir = os.path.join(home_dir, '.credentials')
if not os.path.exists(credential_dir):
os.makedirs(credential_dir)
credential_path = os.path.join(credential_dir,
'gmail-python-quickstart.json')
store = oauth2client.file.Storage(credential_path)
credentials = store.get()
if not credentials or credentials.invalid:
flow = client.flow_from_clientsecrets(CLIENT_SECRET_FILE, SCOPES)
flow.user_agent = APPLICATION_NAME
if flags:
credentials = tools.run_flow(flow, store, flags)
else: # Needed only for compatibility with Python 2.6
credentials = tools.run(flow, store)
print('Storing credentials to ' + credential_path)
return credentials
def main():
da=date.fromordinal(730920)
credentials = get_credentials()
http = credentials.authorize(httplib2.Http())
service = discovery.build('gmail', 'v1', http=http)
today=date.today()
print (today)
yesterday=today-timedelta(1)
print (yesterday)
    # build the query as a separate string to avoid nesting quotes
    query = '{{in:inbox is:unread}} AND {{after:{0}}}'.format(yesterday.strftime('%Y/%m/%d'))
    response = service.users().messages().list(userId='me', q=query).execute()
messages=[]
store=[]
message1=[]
test2=[]
da=[]
if 'messages' in response:
messages.extend(response['messages'])
fo = open("foo.txt", "wb")
for i in range(len(messages)):
store=messages[i]['id']
message = service.users().messages().get(userId='me',id=store,format='metadata',metadataHeaders=['from','date']).execute()
fo.write(store+" ");
#print(message['payload']['headers'][0])
fo.write(message['snippet'].encode('utf-8')+" ")
if message['payload']['headers'][0]['name'].lower()=="from":
From=message['payload']['headers'][0]['value']
fo.write(From+" ");
elif message['payload']['headers'][0]['name'].lower()=="date":
da=message['payload']['headers'][0]['value']
fo.write(da+"\n");
for line in open("foo.txt"):
print(line)
    # close the file we wrote to
    fo.close()
if __name__ == '__main__':
main()
Use:
q='in:inbox is:unread newer_than:3d'
as the query. Gmail queries don't have a concept of timezones, so if you try to get exactly one day's worth of email you'll end up with some overlap; just use local filtering to narrow the results down. See advanced Gmail search for more help. The API and the Gmail UI use the same query syntax and should show the same results, so you can test your queries in the UI.
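Applied to the code in the question (reusing its service object), that could look like the following sketch, with internalDate driving the local narrowing:

import time

query = 'in:inbox is:unread newer_than:3d'
response = service.users().messages().list(userId='me', q=query).execute()
for ref in response.get('messages', []):
    msg = service.users().messages().get(
        userId='me', id=ref['id'], format='metadata').execute()
    # internalDate is epoch milliseconds, so it works for local filtering
    received = int(msg['internalDate']) / 1000.0
    print(ref['id'], time.ctime(received))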
I have a CSV file of this form:
Username, Password_Hash
noam , ************
paz , ************
I want to import this CSV into my datastore so the data can be accessed from Python using this model:
class Company(ndb.Model):
Username = ndb.StringProperty()
Password_Hash= ndb.StringProperty(indexed=False)
Of course, importing manually one record at a time is not an option because the real file is pretty large.
I have no idea which structure the file used by gcloud preview datastore upload is based on.
Google's documentation on this issue is lacking.
How about something like:
from google.appengine.api import urlfetch
from models import Company
def do_it(request):
    csv_url = 'http://mysite-or-localhost/table.csv'
    csv_response = urlfetch.fetch(csv_url, allow_truncated=True)
    if csv_response.status_code == 200:
        for row in csv_response.content.split('\n'):
            # compare against a lowercased header: row.lower() can never
            # start with a capitalized 'Username,'
            if row != '' and not row.lower().startswith('username,'):
                row_values = row.split(',')
                new_record = Company(
                    Username=row_values[0],
                    Password_Hash=row_values[1]
                )
                new_record.put()
    return Response("Did it", mimetype='text/plain')  # Response from your web framework
There is no magic way of migrating; you need to write a program that reads the file and saves the records to the datastore one by one. It's not particularly difficult to write. Give it as long as it takes, it won't be forever...
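As a minimal sketch of that program (assuming the CSV is readable by the script, e.g. via remote_api or a deployed handler, and reusing the Company model from the question), batching the writes with put_multi cuts down on round trips:

import csv

from google.appengine.ext import ndb
from models import Company

def import_csv(path='table.csv', batch_size=500):
    # DictReader keys off the header row; skipinitialspace absorbs the
    # stray spaces after the commas in the sample file.
    entities = []
    with open(path) as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            entities.append(Company(
                Username=row['Username'].strip(),
                Password_Hash=row['Password_Hash'].strip(),
            ))
            if len(entities) >= batch_size:
                ndb.put_multi(entities)  # one RPC per batch
                entities = []
    if entities:
        ndb.put_multi(entities)  # final partial batch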
I'd been searching for a way to do cookie based authentication/sessions in Google App Engine because I don't like the idea of memcache based sessions, and I also don't like the idea of forcing users to create google accounts just to use a website. I stumbled across someone's posting that mentioned some signed cookie functions from the Tornado framework and it looks like what I need. What I have in mind is storing a user's id in a tamper proof cookie, and maybe using a decorator for the request handlers to test the authentication status of the user, and as a side benefit the user id will be available to the request handler for datastore work and such. The concept would be similar to forms authentication in ASP.NET. This code comes from the web.py module of the Tornado framework.
According to the docstrings, it "Signs and timestamps a cookie so it cannot be forged" and
"Returns the given signed cookie if it validates, or None."
I've tried to use it in an App Engine Project, but I don't understand the nuances of trying to get these methods to work in the context of the request handler. Can someone show me the right way to do this without losing the functionality that the FriendFeed developers put into it? The set_secure_cookie, and get_secure_cookie portions are the most important, but it would be nice to be able to use the other methods as well.
#!/usr/bin/env python
import Cookie
import base64
import time
import hashlib
import hmac
import datetime
import re
import calendar
import email.utils
import logging
def _utf8(s):
if isinstance(s, unicode):
return s.encode("utf-8")
assert isinstance(s, str)
return s
def _unicode(s):
if isinstance(s, str):
try:
return s.decode("utf-8")
except UnicodeDecodeError:
raise HTTPError(400, "Non-utf8 argument")
assert isinstance(s, unicode)
return s
def _time_independent_equals(a, b):
if len(a) != len(b):
return False
result = 0
for x, y in zip(a, b):
result |= ord(x) ^ ord(y)
return result == 0
@property
def cookies(self):
"""A dictionary of Cookie.Morsel objects."""
if not hasattr(self,"_cookies"):
self._cookies = Cookie.BaseCookie()
if "Cookie" in self.request.headers:
try:
self._cookies.load(self.request.headers["Cookie"])
except:
self.clear_all_cookies()
return self._cookies
def _cookie_signature(self,*parts):
self.require_setting("cookie_secret","secure cookies")
hash = hmac.new(self.application.settings["cookie_secret"],
digestmod=hashlib.sha1)
for part in parts:hash.update(part)
return hash.hexdigest()
def get_cookie(self,name,default=None):
"""Gets the value of the cookie with the given name,else default."""
if name in self.cookies:
return self.cookies[name].value
return default
def set_cookie(self,name,value,domain=None,expires=None,path="/",
expires_days=None):
"""Sets the given cookie name/value with the given options."""
name = _utf8(name)
value = _utf8(value)
if re.search(r"[\x00-\x20]",name + value):
# Don't let us accidentally inject bad stuff
raise ValueError("Invalid cookie %r:%r" % (name,value))
if not hasattr(self,"_new_cookies"):
self._new_cookies = []
new_cookie = Cookie.BaseCookie()
self._new_cookies.append(new_cookie)
new_cookie[name] = value
if domain:
new_cookie[name]["domain"] = domain
if expires_days is not None and not expires:
expires = datetime.datetime.utcnow() + datetime.timedelta(
days=expires_days)
if expires:
timestamp = calendar.timegm(expires.utctimetuple())
new_cookie[name]["expires"] = email.utils.formatdate(
timestamp,localtime=False,usegmt=True)
if path:
new_cookie[name]["path"] = path
def clear_cookie(self,name,path="/",domain=None):
"""Deletes the cookie with the given name."""
expires = datetime.datetime.utcnow() - datetime.timedelta(days=365)
self.set_cookie(name,value="",path=path,expires=expires,
domain=domain)
def clear_all_cookies(self):
"""Deletes all the cookies the user sent with this request."""
for name in self.cookies.iterkeys():
self.clear_cookie(name)
def set_secure_cookie(self,name,value,expires_days=30,**kwargs):
"""Signs and timestamps a cookie so it cannot be forged"""
timestamp = str(int(time.time()))
value = base64.b64encode(value)
signature = self._cookie_signature(name,value,timestamp)
value = "|".join([value,timestamp,signature])
self.set_cookie(name,value,expires_days=expires_days,**kwargs)
def get_secure_cookie(self,name,include_name=True,value=None):
"""Returns the given signed cookie if it validates,or None"""
if value is None:value = self.get_cookie(name)
if not value:return None
parts = value.split("|")
if len(parts) != 3:return None
if include_name:
signature = self._cookie_signature(name,parts[0],parts[1])
else:
signature = self._cookie_signature(parts[0],parts[1])
if not _time_independent_equals(parts[2],signature):
logging.warning("Invalid cookie signature %r",value)
return None
timestamp = int(parts[1])
if timestamp < time.time() - 31 * 86400:
logging.warning("Expired cookie %r",value)
return None
try:
return base64.b64decode(parts[0])
except:
return None
uid=1234|1234567890|d32b9e9c67274fa062e2599fd659cc14
Parts:
1. uid is the name of the key
2. 1234 is your value in clear
3. 1234567890 is the timestamp
4. d32b9e9c67274fa062e2599fd659cc14 is the signature made from the value and the timestamp
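To make the format concrete, here is a sketch (Python 2, mirroring set_secure_cookie above; the secret is a placeholder) of how such a value is assembled; note that in the real cookie the value part is stored base64-encoded:

import base64
import hashlib
import hmac
import time

secret = "MySecretPhrase"              # placeholder secret
name, raw_value = "uid", "1234"
encoded = base64.b64encode(raw_value)  # part 2, base64-encoded in the cookie
timestamp = str(int(time.time()))      # part 3
sig = hmac.new(secret, digestmod=hashlib.sha1)
for part in (name, encoded, timestamp):
    sig.update(part)
cookie_value = "|".join([encoded, timestamp, sig.hexdigest()])  # part 4 is the signature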
Tornado was never meant to work with App Engine (it's "its own server" through and through). Why not instead pick a framework that was meant for App Engine from the word "go" and is lightweight and dandy, such as tipfy? It gives you authentication using its own user system or any of App Engine's own users, OpenID, OAuth, and Facebook; sessions with secure cookies or the GAE datastore; and much more besides, all in a superbly lightweight "non-framework" approach based on WSGI and Werkzeug. What's not to like?!
For those who are still looking: at ThriveSmart, we've extracted just the Tornado cookie implementation so that it can be used with App Engine. We're using it successfully and will continue to keep it updated.
The cookie library itself is at:
http://github.com/thrivesmart/prayls/blob/master/prayls/lilcookies.py
You can see it in action in our example app that's included. If the structure of our repository ever changes, you can look for lilcookies.py within github.com/thrivesmart/prayls.
I hope that's helpful to someone out there!
This works if anyone is interested:
from google.appengine.ext import webapp
import Cookie
import base64
import time
import hashlib
import hmac
import datetime
import re
import calendar
import email.utils
import logging
def _utf8(s):
if isinstance(s, unicode):
return s.encode("utf-8")
assert isinstance(s, str)
return s
def _unicode(s):
if isinstance(s, str):
try:
return s.decode("utf-8")
except UnicodeDecodeError:
raise HTTPError(400, "Non-utf8 argument")
assert isinstance(s, unicode)
return s
def _time_independent_equals(a, b):
if len(a) != len(b):
return False
result = 0
for x, y in zip(a, b):
result |= ord(x) ^ ord(y)
return result == 0
class ExtendedRequestHandler(webapp.RequestHandler):
"""Extends the Google App Engine webapp.RequestHandler."""
def clear_cookie(self,name,path="/",domain=None):
"""Deletes the cookie with the given name."""
expires = datetime.datetime.utcnow() - datetime.timedelta(days=365)
self.set_cookie(name,value="",path=path,expires=expires,
domain=domain)
def clear_all_cookies(self):
"""Deletes all the cookies the user sent with this request."""
for name in self.cookies.iterkeys():
self.clear_cookie(name)
    @property
    def cookies(self):
"""A dictionary of Cookie.Morsel objects."""
if not hasattr(self,"_cookies"):
self._cookies = Cookie.BaseCookie()
if "Cookie" in self.request.headers:
try:
self._cookies.load(self.request.headers["Cookie"])
except:
self.clear_all_cookies()
return self._cookies
def _cookie_signature(self,*parts):
"""Hashes a string based on a pass-phrase."""
hash = hmac.new("MySecretPhrase",digestmod=hashlib.sha1)
for part in parts:hash.update(part)
return hash.hexdigest()
def get_cookie(self,name,default=None):
"""Gets the value of the cookie with the given name,else default."""
if name in self.request.cookies:
return self.request.cookies[name]
return default
def set_cookie(self,name,value,domain=None,expires=None,path="/",expires_days=None):
"""Sets the given cookie name/value with the given options."""
name = _utf8(name)
value = _utf8(value)
if re.search(r"[\x00-\x20]",name + value): # Don't let us accidentally inject bad stuff
raise ValueError("Invalid cookie %r:%r" % (name,value))
new_cookie = Cookie.BaseCookie()
new_cookie[name] = value
if domain:
new_cookie[name]["domain"] = domain
if expires_days is not None and not expires:
expires = datetime.datetime.utcnow() + datetime.timedelta(days=expires_days)
if expires:
timestamp = calendar.timegm(expires.utctimetuple())
new_cookie[name]["expires"] = email.utils.formatdate(timestamp,localtime=False,usegmt=True)
if path:
new_cookie[name]["path"] = path
for morsel in new_cookie.values():
self.response.headers.add_header('Set-Cookie',morsel.OutputString(None))
def set_secure_cookie(self,name,value,expires_days=30,**kwargs):
"""Signs and timestamps a cookie so it cannot be forged"""
timestamp = str(int(time.time()))
value = base64.b64encode(value)
signature = self._cookie_signature(name,value,timestamp)
value = "|".join([value,timestamp,signature])
self.set_cookie(name,value,expires_days=expires_days,**kwargs)
def get_secure_cookie(self,name,include_name=True,value=None):
"""Returns the given signed cookie if it validates,or None"""
if value is None:value = self.get_cookie(name)
if not value:return None
parts = value.split("|")
if len(parts) != 3:return None
if include_name:
signature = self._cookie_signature(name,parts[0],parts[1])
else:
signature = self._cookie_signature(parts[0],parts[1])
if not _time_independent_equals(parts[2],signature):
logging.warning("Invalid cookie signature %r",value)
return None
timestamp = int(parts[1])
if timestamp < time.time() - 31 * 86400:
logging.warning("Expired cookie %r",value)
return None
try:
return base64.b64decode(parts[0])
except:
return None
It can be used like this:
class MyHandler(ExtendedRequestHandler):
def get(self):
self.set_cookie(name="MyCookie",value="NewValue",expires_days=10)
self.set_secure_cookie(name="MySecureCookie",value="SecureValue",expires_days=10)
value1 = self.get_cookie('MyCookie')
value2 = self.get_secure_cookie('MySecureCookie')
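Since the question also mentions using a decorator to check authentication, here is a sketch of that idea on top of get_secure_cookie; the cookie name and the /login URL are assumptions:

def login_required(method):
    """Run the handler only if a valid signed cookie is present."""
    def wrapper(self, *args, **kwargs):
        user_id = self.get_secure_cookie('uid')  # assumed cookie name
        if user_id is None:
            self.redirect('/login')              # assumed login page
            return
        self.user_id = user_id                   # handy for datastore lookups
        return method(self, *args, **kwargs)
    return wrapper

class ProtectedHandler(ExtendedRequestHandler):
    @login_required
    def get(self):
        self.response.out.write('hello, user %s' % self.user_id)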
If you only want to store the user's user ID in the cookie (presumably so you can look their record up in the datastore), you don't need 'secure' or tamper-proof cookies; you just need an identifier space big enough to make guessing user IDs impractical, e.g. GUIDs or other random data.
One pre-made option for this, which uses the datastore for session storage, is Beaker. Alternatively, you could handle this yourself with Set-Cookie/Cookie headers if you really just need to store their user ID, as sketched below.
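A sketch of that approach (the Session model and cookie name are illustrative): issue an unguessable random token, keep the token-to-user mapping server-side, and look it up on each request:

import uuid

from google.appengine.ext import ndb

class Session(ndb.Model):
    user_id = ndb.IntegerProperty()

def start_session(handler, user_id):
    token = uuid.uuid4().hex                  # 122 random bits: impractical to guess
    Session(id=token, user_id=user_id).put()  # server-side token -> user mapping
    handler.response.headers.add_header(
        'Set-Cookie', 'session=%s; Path=/' % token)

def get_session_user_id(handler):
    token = handler.request.cookies.get('session')
    session = Session.get_by_id(token) if token else None
    return session.user_id if session else None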
Someone recently extracted the authentication and session code from Tornado and created a new library specifically for GAE.
Perhaps this is more than you need, but since they did it specifically for GAE, you shouldn't have to worry about adapting it yourself.
Their library is called gaema. Here is their announcement in the GAE Python group on 4 Mar 2010:
http://groups.google.com/group/google-appengine-python/browse_thread/thread/d2d6c597d66ecad3/06c6dc49cb8eca0c?lnk=gst&q=tornado#06c6dc49cb8eca0c