Deploy your FastAPI API to AWS EC2 using Nginx
FastAPI is an excellent tool for putting your machine learning models into production. In this article, I briefly explain how you can easily deploy your FastAPI application to production on an AWS EC2 instance using Nginx.
FastAPI fundamentals
From the FastAPI website:
FastAPI is a modern, fast (high-performance), web framework for building APIs with Python 3.6+ based on standard Python type hints.
Minimal code example
First things first, let’s install FastAPI:
pip install fastapi
To start, we import the FastAPI class from the fastapi module and instantiate it to create the object app. Then, we define a function that returns a simple message in JSON format. This function has a decorator that defines a GET method on the specified path.
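A minimal main.py along these lines could look as follows (the path and message here are illustrative; the version in the tutorial repository may differ slightly):

from fastapi import FastAPI

# Instantiate the FastAPI class
app = FastAPI()

# Define a GET endpoint on the root path that returns a JSON message
@app.get("/")
def read_root():
    return {"message": "Hello World"}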
Running your app
Before running your app, you also need to install uvicorn, a lightweight ASGI server implementation used to run our API.
pip install uvicorn
Now we are ready to run the application “app” located in the main.py file using the following command:
uvicorn main:app
By default, the API will be available at http://127.0.0.1:8000.
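During development, you can also pass the --reload flag so that uvicorn automatically restarts whenever the code changes:

uvicorn main:app --reload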
Automatic robust documentation
One of the defining characteristics of FastAPI is that it relies heavily on type hints. It uses Pydantic (a Python library for data parsing and validation) together with standard type hints to create and validate data models, which allows it to automatically generate robust documentation for the API.
By default, the documentation is located at {your API domain}/docs. If you are running it locally, it will be available at http://127.0.0.1:8000/docs (provided by Swagger UI).
Alternatively, another automatic documentation provided by ReDoc will be available at http://127.0.0.1:8000/redoc.
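As an illustration (this endpoint is not part of the tutorial repository), declaring a Pydantic model in an endpoint signature is enough for FastAPI to validate the request body and describe it in the generated docs:

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# The type hints in this model drive both validation and the generated docs
class Item(BaseModel):
    name: str
    price: float

@app.post("/items/")
def create_item(item: Item):
    # FastAPI has already validated and parsed the request body into `item`
    return {"name": item.name, "price": item.price}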
Some words about Nginx
Nginx is an open-source web server written in C, designed with the goal of being the world’s fastest web server. It was created in 2004 by Igor Sysoev.
A Web Server is a program that uses HTTP (Hypertext Transfer Protocol) to serve the files that form Web pages to users, in response to their requests.
In addition to Nginx, there are other well-known web servers such as Apache and IIS.
Nginx can also be used as a reverse or forward proxy, a load balancer, and even as an API gateway.
Why use Nginx?
Even though uvicorn can serve the API on its own, exposing it through Nginx has several advantages; the one highlighted in this article is the ability to easily add an SSL certificate.
Deployment steps
The deployment process includes the following steps:
- Create and launch the AWS EC2 instance.
- Configure the AWS EC2 instance by installing Nginx and the API requirements.
- Configure Nginx.
- Add an SSL certificate using OpenSSL.
Create and launch the AWS EC2 instance.
After login to your AWS account go to:
Services -> Compute -> EC2 -> Launch Instance
Now you have to follow these steps:
Step 1: Choose an Amazon Machine Image (AMI)
I chose an Ubuntu 18.04 Server (note that it is Free Tier eligible).
Step 2: Choose an Instance Type
I chose an instance type that is also Free Tier eligible.
Then, we leave the default settings for the following steps:
Step 3: Configure Instance Details
Step 4: Add Storage
Step 5: Add Tags
Step 6: Configure Security Group
By clicking on “Add Rule”, we make sure to add rules of type HTTP and HTTPS.
Step 7: Review Instance Launch
Finally, we review and launch the instance. When you click on the “Launch” button you will be prompted to create a key pair. Create and download it. You will need it to access the instance you have just created.
Configure the AWS EC2 instance by installing Nginx and the API requirements
Now that the instance is up and running, we will access it via SSH and configure it. To do so, go to Services -> Compute -> EC2 -> Instances
You should see your instance running:
Select the instance and go to Actions -> Connect -> SSH Client
There you will find a detailed example of how to connect to your instance via SSH. In my particular case (I named the key pair “fastapi-nginx.cer” and downloaded it to my “Downloads” folder), I use the following command to access my instance:
ssh -i Downloads/fastapi-nginx.cer ubuntu@ec2-18-116-199-161.us-east-2.compute.amazonaws.com
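Note that SSH may refuse a key file whose permissions are too open; if that happens, restrict them first:

chmod 400 Downloads/fastapi-nginx.cer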
Congrats, you have just accessed your EC2 instance via SSH:
Now let’s clone the repository I have prepared for this tutorial. You will see that it contains a minimal example of an API made with FastAPI.
git clone https://github.com/lcalcagni/Deploying-FastAPI-using-Nginx.git
Enter the project directory and install the requirements:
cd Deploying-FastAPI-using-Nginx
sudo apt-get update
sudo apt install python3-pip
pip3 install -r requirements.txt
Let’s run the API locally to check that everything is ok:
python3 -m uvicorn main:app
You will get something like this:
So now let’s make this API accessible for the rest of the world using Nginx.
Nginx configuration
First, install Nginx using the following command:
sudo apt install nginx
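Once it is installed, you can check that Nginx is up and running:

sudo service nginx status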
We already have the FastAPI app we wish to serve; now we need to create the server block that tells Nginx how to do it.
By default, Nginx contains one server block called default. You can find it in /etc/nginx/sites-enabled.
But we will create a new one called “fastapi_nginx” (you can choose another name):
cd /etc/nginx/sites-enabled/
sudo nano fastapi_nginx
Inside this file, we have to specify the following:
server {
    listen 80;
    server_name 18.116.199.161;
    location / {
        proxy_pass http://127.0.0.1:8000;
    }
}
Here, server_name contains the public IP of your instance; in my case, it is 18.116.199.161.
With this configuration, you are telling Nginx that whatever is running on http://127.0.0.1:8000 should be served at http://18.116.199.161/ (port 80).
We save the file (Ctrl+X) and then run:
sudo service nginx restart
Then, we run the API
cd path/to/Deploying-FastAPI-using-Nginx
python3 -m uvicorn main:app
Then, try to access http://{your EC2 public IP}/ using the browser on your local computer. You should see something like this:
Congrats! Now your API is accessible to the rest of the world.
Add a self-signed SSL certificate using OpenSSL
Install OpenSSL and create the /etc/nginx/ssl directory:
sudo apt-get install openssl
cd /etc/nginx
sudo mkdir ssl
Then, we create the self-signed SSL certificate using this command:
sudo openssl req -batch -x509 -nodes -days 365 \
    -newkey rsa:2048 \
    -keyout /etc/nginx/ssl/server.key \
    -out /etc/nginx/ssl/server.crt
After that, we add this certificate to our server block configuration:
cd /etc/nginx/sites-enabled/
sudo nano fastapi_nginx
Inside the file we make the following modification:
server {
    listen 80;
    listen 443 ssl;
    ssl_certificate /etc/nginx/ssl/server.crt;
    ssl_certificate_key /etc/nginx/ssl/server.key;
    server_name 18.116.199.161;
    location / {
        proxy_pass http://127.0.0.1:8000;
    }
}
We save the file (Ctrl+X) and then restart Nginx:
sudo service nginx restart
Finally, we run our API:
cd path/to/Deploying-FastAPI-using-Nginx
python3 -m uvicorn main:app
If everything works correctly, you should now be able to access your server over HTTPS (https://{your EC2 public IP}/). Your web browser (in this case I am using Firefox) may display a warning like this:
This is expected because the certificate is self-signed, so you will have to manually confirm that you trust the server in order to access it.
Once you confirm that by clicking on the Advanced button, you will see your API available on https://{your EC2 public IP}/:
Notice that it is possible to redirect HTTP to HTTPS by adding this to the server block configuration (see the Nginx documentation for more information):
return 301 https://$server_name$request_uri;
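For example, one common layout (sketched here, not the exact configuration from the tutorial repository) splits the configuration into two server blocks, so that the block listening on port 80 only issues the redirect:

server {
    listen 80;
    server_name 18.116.199.161;
    # Redirect all plain HTTP requests to HTTPS
    return 301 https://$server_name$request_uri;
}

server {
    listen 443 ssl;
    ssl_certificate /etc/nginx/ssl/server.crt;
    ssl_certificate_key /etc/nginx/ssl/server.key;
    server_name 18.116.199.161;
    location / {
        proxy_pass http://127.0.0.1:8000;
    }
}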
Don’t forget to restart Nginx to apply the changes:
sudo service nginx restart
I hope you find this article useful. If you have any queries you can find me on LinkedIn.
Happy Deploying!
Laura Calcagni
Software Data Engineer
If you like my work and want to say thanks or encourage me to do more, you can buy me a coffee!