I am using Zappa to deploy my backend to AWS Lambda. It worked well until I decided to use PostgreSQL. I added it to the settings like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': config('DATABASE_NAME'),
        'USER': config('DATABASE_USER'),
        'PASSWORD': config('DATABASE_PASSWORD'),
        'HOST': config('DATABASE_HOST'),
        'PORT': '5432',
    }
}
I am using AWS RDS. I installed psycopg2-binary and also psycopg2 (both version 2.8.6), but the issue remains. The Python version is 3.8.
The full error log:
[1621168086542] [ERROR] ImproperlyConfigured: Error loading psycopg2 module: No module named 'psycopg2._psycopg'
Traceback (most recent call last):
File "/var/task/handler.py", line 609, in lambda_handler
return LambdaHandler.lambda_handler(event, context)
File "/var/task/handler.py", line 240, in lambda_handler
handler = cls()
File "/var/task/handler.py", line 146, in __init__
wsgi_app_function = get_django_wsgi(self.settings.DJANGO_SETTINGS)
File "/var/task/zappa/ext/django_zappa.py", line 20, in get_django_wsgi
return get_wsgi_application()
File "/var/task/django/core/wsgi.py", line 12, in get_wsgi_application
django.setup(set_prefix=False)
File "/var/task/django/__init__.py", line 24, in setup
apps.populate(settings.INSTALLED_APPS)
File "/var/task/django/apps/registry.py", line 114, in populate
app_config.import_models()
File "/var/task/django/apps/config.py", line 211, in import_models
self.models_module = import_module(models_module_name)
File "/var/lang/lib/python3.8/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/var/task/django/contrib/auth/models.py", line 2, in <module>
from django.contrib.auth.base_user import AbstractBaseUser, BaseUserManager
File "/var/task/django/contrib/auth/base_user.py", line 48, in <module>
class AbstractBaseUser(models.Model):
File "/var/task/django/db/models/base.py", line 122, in __new__
new_class.add_to_class('_meta', Options(meta, app_label))
File "/var/task/django/db/models/base.py", line 326, in add_to_class
value.contribute_to_class(cls, name)
File "/var/task/django/db/models/options.py", line 206, in contribute_to_class
self.db_table = truncate_name(self.db_table, connection.ops.max_name_length())
File "/var/task/django/db/__init__.py", line 28, in __getattr__
return getattr(connections[DEFAULT_DB_ALIAS], item)
File "/var/task/django/db/utils.py", line 214, in __getitem__
backend = load_backend(db['ENGINE'])
File "/var/task/django/db/utils.py", line 111, in load_backend
return import_module('%s.base' % backend_name)
File "/var/lang/lib/python3.8/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "/var/task/django/db/backends/postgresql/base.py", line 29, in <module>
raise ImproperlyConfigured("Error loading psycopg2 module: %s" % e)
The (likely) issue is that you are building your Lambda .zip package on macOS. When you deploy your Lambda function, it runs in a Linux environment (specifically, AWS's Amazon Linux 2 environment). The psycopg2-binary wheel is different for macOS and Linux, so if you build your Lambda package (including psycopg2-binary) on a Mac and then deploy it to Lambda, you'll hit the error above.
You'll need to build your Lambda package inside an Amazon Linux container. Here's a Dockerfile you could use to create a container inside of which you install psycopg2-binary and build your Lambda zip package. Then everything should work:
FROM amazonlinux:2.0.20200207.1

# Install build tools, then build and install Python 3.8.2 from source
RUN cd /opt && \
    yum install -y gcc openssl-devel bzip2-devel libffi-devel wget tar gzip make && \
    wget https://www.python.org/ftp/python/3.8.2/Python-3.8.2.tgz && \
    tar xzf Python-3.8.2.tgz && \
    cd Python-3.8.2 && \
    ./configure --enable-optimizations && \
    make altinstall && \
    rm -f /opt/Python-3.8.2.tgz && \
    echo "alias python3=python3.8" > ~/.bashrc
Note the amazonlinux:2.0 base image; I then install Python 3.8.2 into the environment (you could use a different Python version if desired). From there you can copy in your code, build your Lambda .zip package, and deploy to Lambda.
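For illustration only, the build-and-deploy flow might look roughly like this (the image tag, mount paths, and stage name are assumptions, not from the original answer); the key point is that dependency installation and the Zappa deployment both happen inside the Amazon Linux container:

# on your Mac: build the Amazon Linux image and open a shell in it,
# mounting your project and AWS credentials
docker build -t lambda-build .
docker run -it --rm -v "$PWD":/var/app -w /var/app -v ~/.aws:/root/.aws lambda-build bash

# inside the container: install dependencies and deploy with Zappa
python3.8 -m venv /tmp/venv && source /tmp/venv/bin/activate
pip install -r requirements.txt zappa
zappa update dev    # or `zappa deploy dev` on the first deployment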
I installed aws-psycopg2 (1.2.1) to resolve the 500 error that I was getting from AWS every time I executed a POST request.
Note: I kept psycopg2 (2.9.1) so that my local app could still function.
I have a Flask app (Python 3.8) deployed to AWS Lambda via Zappa. The local and hosted versions of my app are both hooked up to an AWS RDS PostgreSQL database. I'm trying to avoid using a Docker container to keep my stack as simple as possible.
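One way to realize this split is to keep two virtual environments and deploy from the one that has aws-psycopg2 installed, since Zappa packages whatever is installed in the active virtualenv. A rough sketch (environment and stage names are assumptions, not from the answer):

# deployment environment: Lambda-compatible driver only
python3.8 -m venv .venv-deploy && source .venv-deploy/bin/activate
pip install flask zappa aws-psycopg2==1.2.1
zappa update dev    # Zappa bundles the packages from this active virtualenv

# local environment: the regular driver so the app still runs on your machine
deactivate
python3.8 -m venv .venv-local && source .venv-local/bin/activate
pip install flask psycopg2==2.9.1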
I recently had the same issue pop up despite having used psycopg2-binary successfully for a long time. I found the following workaround using Lambda layers, which seems like a good, future-proof direction:
In your Zappa config, add the following line:
"layers": ["arn:aws:lambda:<region>:<account_id>:layer:<layer_name>:<layer_version>"]
referencing one of the layers provided in this repo (which one depends on your AWS region and Python version), and remove psycopg2-binary from your environment.
An added benefit is that it should reduce your package size as well.
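For context, the "layers" key sits at the stage level of zappa_settings.json. In this sketch, the stage name, region, bucket, settings module, and layer ARN are placeholders, not values from the answer:

{
    "dev": {
        "django_settings": "myproject.settings",
        "aws_region": "eu-west-1",
        "s3_bucket": "my-zappa-deployments",
        "layers": [
            "arn:aws:lambda:eu-west-1:123456789012:layer:psycopg2-py38:1"
        ]
    }
}

After editing the config, run zappa update <stage> so the function configuration picks up the layer.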