Using SQLAlchemy in AWS Lambda - sql-server

I am trying to use SQLAlchemy in an AWS Lambda function, but it throws a "module not found" error.
I am also attaching the folder structure that I zip and deploy as a Lambda layer.
I am using the following command to create the folder for the Lambda layer:
pip3 install sqlalchemy --target Alchemy_layer/

According to the documentation, the layer needs a different folder structure. Try this:
pip3 install sqlalchemy --target python/
zip -r sqlalchemy-layer.zip python/
Upload that ZIP as a layer and try again.
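Once the layer is attached to the function, a minimal handler along these lines (the handler name and return shape are just an illustration) can confirm that the import now resolves:
import sqlalchemy  # resolves from the layer's python/ directory when the structure is correct

def lambda_handler(event, context):
    # returning the version is enough to prove the module was found
    return {"sqlalchemy_version": sqlalchemy.__version__}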

Related

Django+React AWS Lightsail Ubuntu deployment not using JS

This is my first deployment. I have gone as far as creating and configuring an Ubuntu instance on Lightsail following the official guide. However, I run into problems when, instead of an empty project, I use my own Django project hosting a React frontend.
My project works on my computer, and also if I start Django's development server through manage.py runserver, but it does not run at all if I use gunicorn:
gunicorn my_app.wsgi --bind 0.0.0.0:8000 --workers 4
It seems that it cannot find the proper file paths of any static files, but I do not understand how to configure this, since Django already has all the right paths.
One of the errors in the browser console:
Refused to execute http://3.127.76.103/static/rest_framework/js/bootstrap.min.js as script because "X-Content-Type-Options: nosniff" was given and its Content-Type is not a script MIME type.
How can I start fixing this? I do not understand why it does not see any of the files.
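One thing to check: gunicorn does not serve static files at all, so the browser is most likely receiving an HTML error page for the .js URL, which is why the nosniff check rejects it. A common fix (a minimal sketch, assuming you add the whitenoise package rather than putting nginx in front) is to let WhiteNoise serve the collected static files:
# settings.py - minimal sketch; BASE_DIR is the Path defined in the default Django settings
MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    "whitenoise.middleware.WhiteNoiseMiddleware",  # directly after SecurityMiddleware
    # ... the rest of your middleware, unchanged
]
STATIC_ROOT = BASE_DIR / "staticfiles"
STATICFILES_STORAGE = "whitenoise.storage.CompressedManifestStaticFilesStorage"
After that, run python manage.py collectstatic on the server before starting gunicorn.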

Snowpark for Python - PyTorch inconsistency?

When I try to install pytorch on the client, I get:
"Exception: You tried to install 'pytorch'. The package named for PyTorch is 'torch'".
So, I install torch.
When trying to use torch in a Snowpark @udf(packages=["torch"]), I get: torch is not available in snowflake; and when trying @udf(packages=["pytorch"]), I get: "package pytorch is not installed in the local environment. Your UDF might not work...."
How do I go about resolving this? Thank you for your help.
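If the mismatch is just the naming, the Anaconda channel package is called pytorch while the importable module is torch, so one hedged sketch (the toy function and types are assumptions, and an active Snowpark session is required) is to declare packages=["pytorch"] and import torch inside the UDF body:
from snowflake.snowpark.functions import udf
from snowflake.snowpark.types import FloatType

# assumes an active Snowpark session has already been created
@udf(packages=["pytorch"], return_type=FloatType(), input_types=[FloatType()])
def relu_udf(x):
    import torch  # the module is still named torch even though the package is pytorch
    return float(torch.relu(torch.tensor(x)))
The remaining local warning about pytorch should then be harmless, since the distribution installed on the client is simply named torch.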

Installing private packages in app engine from a google artifact registry

I am trying to modularize our code, and running into difficulties installing private packages on app engine.
We have a repository on GCP in the same project, so I would think this is not too difficult.
My requirements.txt looks like:
--extra-index-url=https://us-central1-python.pkg.dev/myproject/my-python-repo/simple/
Flask==2.0.1
my-new-package
I can pip install the package locally, and it uses the keyrings.google-artifactregistry-auth package to authenticate.
The deployment fails with:
File "/opt/python3.8/lib/python3.8/site-packages/pip/_internal/utils/misc.py", line 218, in ask_input
return input(message)
EOFError: EOF when reading a line
This is clearly pip asking for a username, which it cannot read because the build is non-interactive.
Is Google's own keyring package not available in its own build environment? This suggests so. Of course, adding it to requirements.txt has no effect, as it is too late by then.
How can I install packages properly?
A possible solution is to use:
--extra-index-url=https://_json_key_base64:KEY@us-central1-python.pkg.dev/myproject/my-python-repo/simple/
where KEY is the output of base64 -w 0 < credentials.json.
Of course, this then requires some rewriting of the requirements.txt file in your deployment pipeline so that the service account credentials stay secured in repository variables; a sketch of such a step follows.
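For example, a small deploy-time step could substitute the key from a CI secret into requirements.txt before gcloud app deploy runs; a minimal sketch, where ARTIFACT_REGISTRY_KEY_B64 and the literal KEY placeholder are assumptions about your pipeline:
# rewrite_requirements.py - run in the deployment pipeline, never committed with a real key
import os
from pathlib import Path

key = os.environ["ARTIFACT_REGISTRY_KEY_B64"]  # CI secret holding the output of: base64 -w 0 < credentials.json
req = Path("requirements.txt")
req.write_text(req.read_text().replace("KEY", key))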

React.js not connecting to (localhost) Ganache on EC2: ERR_CONNECTION_REFUSED 127.0.0.1:8545

Using the following installation on Ubuntu Server 20.04:
sudo apt-get update
sudo apt-get install nodejs
sudo apt install python2
sudo apt install npm
npm install ganache-cli
npm install node-gyp@3.6.2
npm install truffle@5.1.39
sudo npm install create-react-app@3.3.1 --global
npm install
I am getting the error:
Failed to load resource: net::ERR_CONNECTION_REFUSED 127.0.0.1:8545/:1
I am running ganache with the command line interface:
I can test the connection via the node command prompt,
and I am able to verify a connection to the ganache private blockchain.
Then I try App.js in React,
obtaining the following error:
I have tried the following:
1.) Setting up a proxy under package.json: http://127.0.0.1:8545
2.) Trying http://0.0.0.0:8545
3.) Setting up a middleware proxy account as presented in the following solution:
https://medium.com/bb-tutorials-and-thoughts/react-how-to-proxy-to-backend-server-5588a9e0347
"proxy": "http://127.0.0.1:8545"
4.) addressing cache-related issues through rm -r package-lock.json node_modules and npm install, and updating react.js to the latest version
5.) trying different port: 7545
6.) updating react to latest version on ubuntu
I figured it out - thanks to all who reviewed my problem - hopefully this will help someone in the same situation. To connect to an EC2 instance with a React.js / blockchain dapp you need to follow five steps:
1.) set appropriate security rules (you will need to allow access for an instance to reach your server). I set the following rules in a very liberal fashion since this was only a test:
2.) you need to start ganache with an explicit chainId specified and a direct reference to your public IPv4 DNS domain, e.g. (http://ec2-54-186-149-26.us-west-2.compute.amazonaws.com); the command should look as follows:
3.) the same domain and mnemonic need to be specified in MetaMask (along with an imported account from your ganache server). It should look as follows:
4.) React.js also needs to access the EC2 domain directly (do not use localhost; even if you are running on the server, the loopback will not work! Also do not use a shortcut with an or (||) condition, as that also produced errors for me). It should look similar to the below:
5.) your smart contracts also need this domain specified explicitly - so you will also need to edit your truffle-config.js as follows:
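Before touching the React code, a quick way to confirm that steps 1 and 2 took effect is to hit the RPC endpoint from outside the instance; a minimal sketch in Python (the hostname is the example domain from step 2, and web3_clientVersion is a standard JSON-RPC method that ganache answers):
import json, urllib.request

# any standard JSON-RPC call will do; web3_clientVersion just echoes the node version
payload = json.dumps({"jsonrpc": "2.0", "method": "web3_clientVersion", "params": [], "id": 1}).encode()
request = urllib.request.Request(
    "http://ec2-54-186-149-26.us-west-2.compute.amazonaws.com:8545",
    data=payload,
    headers={"Content-Type": "application/json"},
)
print(urllib.request.urlopen(request, timeout=5).read().decode())
If this times out, the security group or the ganache bind address still needs fixing; if it prints a version string, the remaining issues are on the React / truffle side.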
Finally, I can read my blockchain via React.js on a test network (ganache) via EC2.
Below is blockchain data presented to screen.
I hope this helps someone out there - this was not found easily.
regards
John D.

How to access Google Cloud Bucket from development Environment

I am trying to access my Google Cloud bucket from the development environment, but when I write the import statement I get an error:
from google.cloud import storage
The command I use to run the server with the bucket flag:
dev_appserver.py app.yaml --default_gcs_bucket_name ABC-test-bucket
The error I get:
File "C:\Users\ABC\AppData\Local\Google\Cloud SDK\google-cloud-sdk\platform\google_appengine\lib\setuptools-0.6c11\pkg_resources.py", line 565, in resolve
raise DistributionNotFound(req) # XXX put more info here
DistributionNotFound: google-cloud-storage
I think I completed all the steps, like creating the bucket and downloading the client library using pip, i.e.
pip install GoogleAppEngineCloudStorageClient -t <your_app_directory/lib>
I am new to GAE projects (using the webapp2 Python framework for the server), so I would really appreciate any pointers and help.
The library you're using is not the correct one. You need the one below, as per the documentation:
pip install --upgrade google-cloud-storage
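Once that is installed, a minimal sketch of reading from the bucket looks like the following (the object path is a placeholder, and credentials are assumed to come from the environment or the Cloud SDK):
from google.cloud import storage

client = storage.Client()                  # picks up application-default credentials
bucket = client.bucket("ABC-test-bucket")  # bucket name from the question
blob = bucket.blob("some/object.txt")      # hypothetical object path
print(blob.download_as_text())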
