Osazuwa J. Agbonze

JWT Authentication in FastAPI ( Comprehensive Guide )

Hi and welcome. In this guide we'll build a JWT authentication system with FastAPI. The technique we follow is production grade, and by the end of this walkthrough you should have a system ready to authenticate users. We'll use SQLAlchemy as the ORM for a Postgres database and Alembic as the migration tool. Both the application and the database will be containerized with Docker.

Prerequisites

You're expected to have Docker installed and to be familiar with its usage. Foundational knowledge of SQLAlchemy would also be an advantage.

You can find the complete codebase on GitHub.

Case Study

πŸ’« Zap
feel free to skip this section if you're familiar with how authentication works with JWT

A simple use case to keep in mind is that of a student who needs to access her profile on an academic portal to submit a project. A new student first creates an account by providing an email and password, which the platform stores so it can recognize the account on future visits. At login, the student provides the email and password to gain access to the portal: valid credentials grant access, invalid credentials are rejected. On successful login the student receives a token which, when presented before its expiration time, grants access to the platform.
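
To make the token part of that flow concrete, here's a minimal sketch using PyJWT (which we install later in this guide). The secret key, claims and expiry below are illustrative values only, not the ones we'll use in the app.

import datetime

import jwt  # provided by the PyJWT package

SECRET = "illustrative-secret-key"  # hypothetical value for this sketch only

# On successful login, the server issues a signed token carrying an expiry claim
token = jwt.encode(
    {
        "email": "student@example.com",
        "exp": datetime.datetime.utcnow() + datetime.timedelta(minutes=30),
    },
    SECRET,
    algorithm="HS256",
)

# On later requests, the server verifies the signature and the expiry;
# a tampered or expired token raises an exception and access is denied
claims = jwt.decode(token, SECRET, algorithms=["HS256"])
print(claims["email"])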

Virtual Environment & Application Dependencies

To get started, open your terminal and navigate to a folder dedicated solely to this guide. As a personal choice, I've named mine jwt-fast-api. Use the script below to create and activate a virtual environment, which will scope the dependencies needed for this guide away from those installed globally.



# create environment [ windows, linux, mac ]
python -m venv env

# activate environment [ windows ]
env/Scripts/activate

# activate environment [ linux & mac ]
source env/bin/activate



Next, we'll install the dependencies needed in this guide. Copy the content below into requirements.txt.



fastapi==0.88.0
bcrypt==4.0.1
pyjwt==2.6.0
alembic>=1.9.1
uvicorn==0.20.0
SQLAlchemy>=1.4,<=2.0
psycopg2-binary==2.9.5
email-validator>=1.0.3



Install the dependencies with the following command



pip install -r requirements.txt



Hello Login

Create a new file at the root of your project folder named main.py which will serve as the application entrypoint. Below is the current folder structure



jwt-fast-api/
β”œβ”€ main.py
β”œβ”€ requirements.txt



Open up main.py and include the following content



import fastapi


app = fastapi.FastAPI()


@app.post('/login')
def login():
    """Processes user's authentication and returns a token
    on successful authentication.

    request body:

    - username: Unique identifier for a user e.g email, 
                phone number, name

    - password:
    """
    return "ThisTokenIsFake"



The code above creates a FastAPI application with a /login route attached, accepting POST requests. The endpoint currently returns a fake token; we'll revisit and refactor it later.

Serve the application using the command below



uvicorn --reload main:app



If the application is served successfully, the command-line output should be similar to the one below

uvicorn serving application

Exploring the Docs

We'll be using the interactive docs auto-generated by fastapi to test the application as we build. Open your browser and visit 127.0.0.1:8000/docs. You should be greeted with a page similar to the one below

fastapi autogenerated documentation

Every endpoint on the docs has a Try it out button which, when clicked, reveals an Execute button that sends a request to the endpoint. Clicking Execute on the login endpoint returns the response ThisTokenIsFake.
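
If you prefer the command line, you can send the same request with curl (assuming the app is still being served on 127.0.0.1:8000); it should print "ThisTokenIsFake".

curl -X POST http://127.0.0.1:8000/login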

Application Docker Image

Now that the application runs successfully, let's create a Docker image for it. This gives us consistent, platform-agnostic behavior.

In the project root, create a new file named Dockerfile and include the following code



FROM        python:3.8-alpine

# Send python output straight to the terminal without buffering
ENV         PYTHONUNBUFFERED=1

WORKDIR     /home

# Copy requirements first so the dependency layer is cached between builds
COPY        ./requirements.txt .

RUN         pip install -r requirements.txt \
            && adduser --disabled-password --no-create-home doe

# Copy the rest of the project
COPY        . .

# Run as an unprivileged user
USER        doe

EXPOSE      8000

CMD         ["uvicorn", "main:app", "--port", "8000", "--host", "0.0.0.0"]



We had to be explicit with the uvicorn command in the Dockerfile, specifying the port and the host IP address we want the app to listen on.

Before using any docker command, ensure Docker is installed and its service is running.

To build the Docker image, your current working directory should be the same location as the Dockerfile. Run the following command



docker build . -t fastapiapp



This tags the application's Docker image as fastapiapp.

Test that the application runs successfully when launched using the docker image.



docker run -it -p 8000:8000 fastapiapp



Open your browser and you should still be able to access the interactive documentation autogenerated for the application by fastapi.

Database Service Setup

The database and the application are separate entities and need a way to interact. Docker Compose will be used to define and connect our two services: the application and the database. The application is already configured and can take on more features; the following section shows how to configure the database.

Create a docker-compose.yml file in the project root. Your project structure should resemble



jwt-fast-api/
β”œβ”€ Dockerfile
β”œβ”€ requirements.txt
β”œβ”€ docker-compose.yml
β”œβ”€ main.py



Paste the following content into docker-compose.yml



version: "3.9"

services:
  db:
    image: postgres:12-alpine
    container_name: fastapiapp_demodb
    restart: always
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    networks:
      - fastapiappnetwork

  app:
    image: fastapiapp
    container_name: fastapiapp_demoapp
    ports:
      - 8000:8000
    volumes:
      - .:/home
    depends_on:
      - db
    networks:
      - fastapiappnetwork

networks:
  fastapiappnetwork:



The docker-compose.yml file above defines two services: app and db. The depends_on statement makes the db service start before the app, and both services are attached to the fastapiappnetwork network, which is what lets the app reach the database by the hostname db.

To bring the application and database to life, run



docker-compose up --build



which runs both services in the foreground of your terminal. You should see output similar to the one below

docker compose log on successful application launch

The application should be accessible from the browser like previously seen.
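
If you'd like to confirm that the app container can actually reach the database over the shared network, an optional check (assuming the services are up and named as in the compose file above) is to resolve the db hostname from inside the app container:

docker-compose exec app python -c "import socket; print(socket.gethostbyname('db'))"

This should print the database container's IP address.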

Hide Sensitive Variables

As a best practice, sensitive variables shouldn't be committed to a public repository, so we'll move them out of docker-compose.yml. First off, create a new file named .env in the project root



jwt-fast-api/
β”œβ”€ .env
β”œβ”€ Dockerfile
β”œβ”€ requirements.txt
β”œβ”€ docker-compose.yml
β”œβ”€ main.py



Add the following to the .env file



POSTGRES_DB=enteryourdbname
POSTGRES_USER=enterdbusername
POSTGRES_PASSWORD=enterdbuserpassword



Update the db service's environment block in docker-compose.yml as follows



.....

   db:
    ......
    ......
    environment:
      - POSTGRES_DB=$POSTGRES_DB
      - POSTGRES_USER=$POSTGRES_USER
      - POSTGRES_PASSWORD=$POSTGRES_PASSWORD
    ......

......



We've successfully cleaned up docker-compose.yml. Just one more step left: create a .gitignore file and add .env as its only content. Your folder structure should now resemble



jwt-fast-api/
β”œβ”€ .env
β”œβ”€ Dockerfile
β”œβ”€ requirements.txt
β”œβ”€ docker-compose.yml
β”œβ”€ main.py
β”œβ”€ .gitignore



Setup Application Database Usage With SQLAlchemy

SQLAlchemy is the Object Relational Mapper (ORM) with which we'll interact with our database.

Create a settings.py file in the project root. This file will house all application configuration. With this new file added, the folder structure should resemble



jwt-fast-api/
β”œβ”€ .env
β”œβ”€ Dockerfile
β”œβ”€ requirements.txt
β”œβ”€ docker-compose.yml
β”œβ”€ main.py
β”œβ”€ settings.py
β”œβ”€ .gitignore



Add the following content to settings.py



import os

# Database url configuration
DATABASE_URL = "postgresql+psycopg2://{username}:{password}@{host}:{port}/{db_name}".format(
    host=os.getenv("POSTGRES_HOST"),
    port=os.getenv("POSTGRES_PORT"),
    db_name=os.getenv("POSTGRES_DB"),
    username=os.getenv("POSTGRES_USER"),
    password=os.getenv("POSTGRES_PASSWORD"),
)



The database URL is composed from the sensitive database variables defined in the .env file. The declaration above accesses two variables that are not yet in .env, namely POSTGRES_HOST and POSTGRES_PORT. Update the .env file to include them



POSTGRES_HOST=db
POSTGRES_PORT=5432


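
With the placeholder values above (replace them with your own), the URL composed in settings.py would resolve to:

postgresql+psycopg2://enterdbusername:enterdbuserpassword@db:5432/enteryourdbname

Note that POSTGRES_HOST is set to db because that is the database service's name in docker-compose.yml; Docker's network resolves that name to the database container.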

Although we've set up the application to read sensitive variables from its environment, these variables are not yet served to the app service in our docker-compose.yml file. Update the app service in docker-compose.yml to include an environment block (just like the one for the db service). Below is the complete docker-compose.yml file



version: "3.9"

services:
  db:
    image: postgres:12-alpine
    container_name: fastapiapp_demodb
    restart: always
    environment:
      - POSTGRES_DB=$POSTGRES_DB
      - POSTGRES_USER=$POSTGRES_USER
      - POSTGRES_PASSWORD=$POSTGRES_PASSWORD
    networks:
      - fastapiappnetwork

  app:
    image: fastapiapp
    container_name: fastapiapp_demoapp
    ports:
      - 8000:8000
    volumes:
      - .:/home
    depends_on:
      - db
    networks:
      - fastapiappnetwork
    environment:
      - POSTGRES_DB=$POSTGRES_DB
      - POSTGRES_USER=$POSTGRES_USER
      - POSTGRES_HOST=$POSTGRES_HOST
      - POSTGRES_PORT=$POSTGRES_PORT
      - POSTGRES_PASSWORD=$POSTGRES_PASSWORD

networks:
  fastapiappnetwork:



At this point, our application has access to the environment variables, and the DATABASE_URL configuration in settings.py is ready to be used.

All SQLAlchemy operations pass through an engine. The engine powers communication with the database and specifies where SQL statements are directed. Create a db_initializer.py file in the project root and include the following content



from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, declarative_base

import settings


# Create database engine
engine = create_engine(settings.DATABASE_URL, echo=True, future=True)

# Create database declarative base
Base = declarative_base()

# Create session
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


def get_db():
    """Database session generator"""
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()



With the above configuration, SQLAlchemy is now set up within our application and ready for use.
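
As a quick, optional sanity check (a sketch, assuming the services have been recreated with docker-compose up so the new environment variables are loaded), you can open a session and run a trivial query to confirm the engine can reach the database:

# run this inside the app container, e.g. via: docker-compose exec app python
from sqlalchemy import text

from db_initializer import SessionLocal

with SessionLocal() as session:
    # Should print 1 if the database connection works
    print(session.execute(text("SELECT 1")).scalar())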

Create User Model

The User model represents an entity capable of being authenticated. The most crucial details needed for authentication (in this case) are email and password. As a best practice, a user's password should never be saved in its raw form; instead, the stored value should be a hashed representation of the raw text.
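
Here is a minimal sketch of what that hashing looks like with bcrypt (already in requirements.txt); the password value is illustrative:

import bcrypt

raw_password = "correct horse battery staple"  # illustrative only

# Hash the raw password together with a freshly generated salt
hashed = bcrypt.hashpw(raw_password.encode(), bcrypt.gensalt())

# Later, verify a login attempt against the stored hash
print(bcrypt.checkpw(raw_password.encode(), hashed))   # True
print(bcrypt.checkpw(b"wrong password", hashed))       # False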

Create a models folder in the project root directory and, within it, add __init__.py and users.py. The project structure should be similar to the one below



jwt-fast-api/
β”œβ”€ models/
   β”œβ”€ __init__.py
   β”œβ”€ users.py
β”œβ”€ .env
β”œβ”€ main.py
β”œβ”€ .gitignore
β”œβ”€ Dockerfile
β”œβ”€ settings.py
β”œβ”€ requirements.txt
β”œβ”€ docker-compose.yml
β”œβ”€ db_initializer.py



Open users.py and add the following content



from sqlalchemy import (
    LargeBinary, 
    Column, 
    String, 
    Integer,
    Boolean, 
    UniqueConstraint, 
    PrimaryKeyConstraint
)

from db_initializer import Base


class User(Base):
    """Models a user table"""
    __tablename__ = "users"
    email = Column(String(225), nullable=False, unique=True)
    id = Column(Integer, nullable=False, primary_key=True)
    hashed_password = Column(LargeBinary, nullable=False)
    full_name = Column(String(225), nullable=False)
    is_active = Column(Boolean, default=False)

    __table_args__ = (
        PrimaryKeyConstraint("id", name="pk_user_id"),
        UniqueConstraint("email", name="uq_user_email"),
    )

    def __repr__(self):
        """Returns string representation of model instance"""
        return "<User {full_name!r}>".format(full_name=self.full_name)



Alembic Setup

Now that we have our user model declared, we'll use Alembic to run its migration. To use Alembic, it needs to be initialized in the project directory. Run



alembic init alembic



Alembic's configuration and versioning will now be contained in a folder named alembic.

πŸŽ† Note
initialization of alembic should be run from the virtual environment created and activated earlier in this guide

On successful initialization of Alembic, the project folder structure should resemble



jwt-fast-api/
β”œβ”€ alembic/              <-- alembic folder & sub files
   β”œβ”€ versions/ 
   β”œβ”€ env.py
   β”œβ”€ README
   β”œβ”€ script.py.mako
β”œβ”€ models/
   β”œβ”€ __init__.py
   β”œβ”€ users.py
β”œβ”€ .env
β”œβ”€ alembic.ini            <-- just added
β”œβ”€ main.py
β”œβ”€ .gitignore
β”œβ”€ Dockerfile
β”œβ”€ settings.py
β”œβ”€ requirements.txt
β”œβ”€ docker-compose.yml
β”œβ”€ db_initializer.py



A couple of setup steps need to be integrated before we can run our first migration. Update alembic/env.py so it has the following content



from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

from db_initializer import Base
from settings import DATABASE_URL

from models.users import User

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = Base.metadata

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
config.set_section_option(config.config_ini_section, "sqlalchemy.url", DATABASE_URL)


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well.  By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        compare_type=True,
        literal_binds=True,
        target_metadata=target_metadata,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
    connectable = engine_from_config(
        config.get_section(config.config_ini_section),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(
            compare_type=True,
            connection=connection, 
            target_metadata=target_metadata,
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()



Running Migrations

Alembic autogenerates migrations by comparing the metadata registered on the Base class against the current state of the database. With Alembic we also get backward compatibility: we can roll back to a previous migration with a single command, as shown below.
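
For reference, these are the standard Alembic commands for inspecting and rolling back migrations (we'll generate our first revision shortly):

# list the migration history
alembic history

# show the revision the database is currently at
alembic current

# roll back one revision (or pass a specific revision id)
alembic downgrade -1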

Alembic is a tool used within our application service to interact with our database. To use it, we need access to the application container. Every container has a unique identifier made up of alphanumeric characters. The command below lists all containers



docker ps -a 



You should have output similar to the one below. The application container's identifier is underlined.

Application container unique identifier

To gain access to the application container, use the command below



docker exec -it 2a6 sh


πŸŽ† Note
your container's unique identifier will be different from mine, and only its first few characters (enough to be unambiguous) are needed to interact with it.
Kindly replace 2a6 with the first characters of your own application container's identifier

Generate the first migration revision using



alembic revision --autogenerate -m "Create user model"



A successful run should produce output resembling the following.

Successful alembic migration

πŸŽ† Note
take note of the sha value autogenerated at the last line of the above output. You can find the sha value within
alembic/versions/some_sha_value_create_user_table.py

To apply the migration to our database, run



# kindly use the appropriate sha value as yours will be different from mine
alembic upgrade 66b63a


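
If you'd rather not type the revision id, alembic upgrade head applies all pending revisions:

alembic upgrade head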

Password hashing on signup

On sign up, we'll only require the user to provide an email, password and full name. For this we'll create pydantic models to capture this payload. Create a new folder schemas and add __init__.py and users.py within it. The folder structure should be similar to:



jwt-fast-api/
β”œβ”€ alembic/              
   β”œβ”€ versions/ 
   β”œβ”€ env.py
   β”œβ”€ README
   β”œβ”€ script.py.mako
β”œβ”€ models/
   β”œβ”€ __init__.py
   β”œβ”€ users.py
β”œβ”€ schemas/            <-- schemas folder & sub files
   β”œβ”€ __init__.py
   β”œβ”€ users.py
β”œβ”€ .env
β”œβ”€ alembic.ini            
β”œβ”€ main.py
β”œβ”€ .gitignore
β”œβ”€ Dockerfile
β”œβ”€ settings.py
β”œβ”€ requirements.txt
β”œβ”€ docker-compose.yml
β”œβ”€ db_initializer.py



Open the schemas/users.py file and include this content



from pydantic import BaseModel, Field, EmailStr


class UserBaseSchema(BaseModel):
    email: EmailStr
    full_name: str


class CreateUserSchema(UserBaseSchema):
    hashed_password: str = Field(alias="password")


class UserSchema(UserBaseSchema):
    id: int
    is_active: bool = Field(default=False)

    class Config:
        orm_mode = True



There are three schema classes above, two of which inherit from UserBaseSchema. This inheritance structure simply avoids duplicating model fields: the most basic user data that can be public facing is the email and full_name, and these are composed in UserBaseSchema.

Since we only need full_name, email and password on sign up, there's no need to redefine those fields in CreateUserSchema; it inherits from UserBaseSchema and adds the only extra required field, hashed_password. We aliased hashed_password so that it is public facing as password: instead of the API asking for hashed_password in the request body, it asks for password, and FastAPI remaps the captured password field to hashed_password automatically.

UserSchema declares the fields returnable by the API. Because hashed_password is sensitive information we don't want users to have access to, it is deliberately excluded from this schema. UserSchema also has a Config subclass which enables ORM mode, meaning the schema can read data coming from the database as if it were the real model class. This is done with orm_mode = True.
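
To see how the alias and ORM mode behave in isolation, here's a minimal sketch you can run in a Python shell inside the virtual environment or app container (the values and the FakeRow class are purely illustrative):

from schemas.users import CreateUserSchema, UserSchema


# The alias means callers send "password", but the value is stored on the
# schema under the hashed_password field
payload = CreateUserSchema(
    email="jane@example.com",
    full_name="Jane Doe",
    password="plain-text-for-now",
)
print(payload.hashed_password)  # plain-text-for-now


# orm_mode lets UserSchema read attributes off an object (such as a
# SQLAlchemy model instance) instead of requiring a dict
class FakeRow:
    id = 1
    email = "jane@example.com"
    full_name = "Jane Doe"
    is_active = False

print(UserSchema.from_orm(FakeRow()))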

We'll include helper methods in the User model class to handle password hashing, password confirmation and access token generation. Open models/users.py and update the class to include the methods below



# other import statements above

import bcrypt
import jwt


class User(Base):
    # previous class attributes and methods are here

    @staticmethod
    def hash_password(password) -> bytes:
        """Transforms password from its raw textual form to
        cryptographic hashes
        """
        return bcrypt.hashpw(password.encode(), bcrypt.gensalt())

    def validate_password(self, password) -> bool:
        """Confirms password validity"""
        return bcrypt.checkpw(password.encode(), self.hashed_password)

    def generate_token(self) -> dict:
        """Generate access token for user"""
        return {
            "access_token": jwt.encode(
                {"full_name": self.full_name, "email": self.email},
                "ApplicationSecretKey"
            )
        }



In the code above, we used bcrypt to handle both password hashing and confirmation, and PyJWT (the jwt import) to encode the access token.
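
If you're curious about what the client actually receives, here's a small sketch of decoding such a token with PyJWT. Decoding requires the same secret used to encode it, and the claims are the ones set in generate_token (values illustrative):

import jwt

token = jwt.encode(
    {"full_name": "Jane Doe", "email": "jane@example.com"},
    "ApplicationSecretKey",
)

# Decoding with the same secret verifies the signature and returns the claims
print(jwt.decode(token, "ApplicationSecretKey", algorithms=["HS256"]))
# {'full_name': 'Jane Doe', 'email': 'jane@example.com'}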

🏁 Checkpoint
Set up an application secret key as an environment variable and replace "ApplicationSecretKey" in models/users.py with the value read from that environment variable.
NOTE: if you leave the code as is without taking on this task, it will still run properly

The final step before creating our signup endpoint is to create a database service which will handle all database interactions (DDL, DML, DQL, etc.) with the user model. Create a new folder services, add __init__.py as its only file and a db folder as its only subfolder. Create __init__.py and users.py within services/db. Your folder structure should now resemble



jwt-fast-api/
β”œβ”€ alembic/              
   β”œβ”€ versions/ 
   β”œβ”€ env.py
   β”œβ”€ README
   β”œβ”€ script.py.mako
β”œβ”€ models/
   β”œβ”€ __init__.py
   β”œβ”€ users.py
β”œβ”€ schemas/            
   β”œβ”€ __init__.py
   β”œβ”€ users.py
β”œβ”€ services/            <-- services folder & sub files
   β”œβ”€ __init__.py
   β”œβ”€ db/                <-- db folder & sub files
       β”œβ”€ __init__.py
       β”œβ”€ users.py
β”œβ”€ .env
β”œβ”€ alembic.ini            
β”œβ”€ main.py
β”œβ”€ .gitignore
β”œβ”€ Dockerfile
β”œβ”€ settings.py
β”œβ”€ requirements.txt
β”œβ”€ docker-compose.yml
β”œβ”€ db_initializer.py



Update services/db/users.py with the following code



from sqlalchemy.orm import Session

from models.users import User
from schemas.users import CreateUserSchema


def create_user(session: Session, user: CreateUserSchema):
    """Persists a new user row built from the signup payload"""
    db_user = User(**user.dict())
    session.add(db_user)
    session.commit()
    session.refresh(db_user)
    return db_user



The file contains only one function, create_user, which uses the session object to persist a user instance built from the CreateUserSchema payload.

πŸŽ† Quick Recap
To get going with password hashing, we added password hashing and validation helper methods to the User model, just to keep related behaviors close. We then created a schema to collect sign-up information, CreateUserSchema, and a schema defining the user data consumable from the API, UserSchema. Lastly, we added a database service to interact with the database (a clean approach for separation of concerns).

Update main.py to include a signup endpoint which will utilize all we've done.



# other import statements are above
from fastapi import Body, Depends
from sqlalchemy.orm import Session

from db_initializer import get_db
from models import users as user_model
from schemas.users import CreateUserSchema, UserSchema
from services.db import users as user_db_services

@app.post('/signup', response_model=UserSchema)
def signup(
    payload: CreateUserSchema = Body(), 
    session:Session=Depends(get_db)
):
    """Processes request to register user account."""
    payload.hashed_password = user_model.User.hash_password(payload.hashed_password)
    return user_db_services.create_user(session, user=payload)


# uncompleted login endpoint handler is below



The signup handler introduces a few new pieces:

  • app.post(), the decorator indicating the request verb, takes a new parameter response_model pointing to UserSchema. This is how we define what the user should have access to: on successful signup, only the fields defined in UserSchema are returned in the response.

  • The signup function signature explicitly defines, via Body, that the payload parameter serves as the expected request body. payload is an instance of CreateUserSchema, which means all fields defined in it are expected on signup.

  • The signup function uses dependency injection to create a database session scoped to the lifecycle of the request it is created for. This is done using Depends(get_db).

As a best practice, before creating the user in the signup handler, we first hash the password using the line below.



payload.hashed_password = user_model.User.hash_password(payload.hashed_password)



Rebuild the application Docker image and restart the composed services with the commands below. (Because the project directory is volume-mounted into the container, a restart is enough to pick up code changes; rebuilding the image mainly matters when dependencies change.)



# rebuilding the docker image
docker build . -t fastapiapp

# restart docker services
docker-compose restart 



Signup Exploration

Visit the docs page at http://localhost:8000/docs and you should see a new endpoint for user signup. The image below contains the values supplied to create a new user, brain.

Sign up inputs

Once the Execute button is clicked, you should get a response containing the details of the newly created user, similar to the image below

Successful user creation

🏁 Checkpoint
Try creating a new user by using the try it out & execute buttons on the docs. Create users John Doe and Jane Doe.
Leave a note in the comment section if you encounter any challenge

Refactoring Login

A successful user login should return a recognized access token with which restricted endpoints can be accessed. To flesh out the login endpoint, we need to capture the payload (email and password) supplied in the request body, confirm whether a user with the given email exists, and verify that the password in the payload is valid for that user. On successful authentication, a JSON Web Token is returned as the response.

Update schemas/users.py to include the code below, which defines the login schema



# previously defined schemas are above

class UserLoginSchema(BaseModel):
    email: EmailStr = Field(alias="username")
    password: str 



Update services/db/users.py to include the code below, a service that retrieves a single user from the database



# previously defined services are above 

def get_user(session:Session, email:str):
    return session.query(User).filter(User.email == email).one()



The code below is the refactored login endpoint in main.py, verifying that the claimed user exists and validating that user's credentials when found.



from typing import Dict

from fastapi import HTTPException, status

from schemas.users import CreateUserSchema, UserSchema, UserLoginSchema


@app.post('/login', response_model=Dict)
def login(
        payload: UserLoginSchema = Body(),
        session: Session = Depends(get_db)
    ):
    """Processes user's authentication and returns a token
    on successful authentication.

    request body:

    - username: Unique identifier for a user e.g email, 
                phone number, name

    - password:
    """
    try:
        user:user_model.User = user_db_services.get_user(
            session=session, email=payload.email
        )
    except Exception:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Invalid user credentials"
        )

    is_validated:bool = user.validate_password(payload.password)
    if not is_validated:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Invalid user credentials"
        )

    return user.generate_token()



The code above uses UserLoginSchema from schemas/users.py and the Dict class from typing. An exception is raised on a failed authentication attempt, and an access token is returned on a successful one.

To confirm the refactored login endpoint, visit the auto-generated docs page at http://localhost:8000/docs; you should find that the login endpoint now requires a username and password. Provide the valid credentials of a previously created user and you should get a successful response with an access token.

Successful Login Access Token

Invalid credentials should return an access-denied response with a message that the credentials are invalid.

Invalid Login

Conclusion

There is more that can be done to strengthen the security of the system, such as:

  • token blacklist system
  • refresh token for renewing expired access tokens
  • rolling tokens for automatic renewal based on access timeframe
  • token expiration

but we've successfully set up a workable, production-grade JSON Web Token solution with FastAPI.

If you've come this far, I appreciate your time and I hope it was well worth it.

If you encountered any errors, kindly drop a note in the comment section.

Don't forget to SHARE, LIKE & FOLLOW for more

πŸ“Œ The Extra Mile
Study my Journey to SQLAlchemy ( Practical Guide) .
Support with a cup of Coffee

Top comments (2)

nairadethya2208

Hey Osazuwa, thank you for creating a clear guide. I just had one question: how does the generate_token function work? Is it an inbuilt function? When I test it out I'm not getting a valid token, only a null value.

Osazuwa J. Agbonze • Edited

Hello @nairadethya2208, thanks for spotting that. generate_token is a helper method in the User class which uses PyJWT to generate an encoded JSON Web Token.

I've updated the post to reflect the change on the user model. Thanks for the feedback.