John Owolabi Idogun

Posted on • Originally published at johnowolabiidogun.dev

Building an AI-powered Financial Behavior Analyzer with NodeJS, Python, SvelteKit, and TailwindCSS - Part 2: GitHub OAuth

Introduction

Part 3 is already out here: https://johnowolabiidogun.dev/blogs/building-an-ai-powered-financial-behavior-analyzer-with-nodejs-python-sveltekit-and-tailwindcss-part-3-transactions-0da982/67a3a71d72d0dd09224068e8

With the AI service completed, we now focus on setting up the base backend service. As outlined in the system's architecture, our main backend will be powered by Node.js with Express.js, using MongoDB as the database.

While we could have chosen other combinations—such as Node.js with PostgreSQL, Bun with PostgreSQL, or Django/Flask/FastAPI/aiohttp with SQL/NoSQL/NewSQL—I opted for this stack as a refresher.

We'll begin developing the backend service here and continue refining it in subsequent articles. Let's dive in! 🚀

Prerequisite

We assume you have already set up a TypeScript-based Express project. If not, follow these simple steps:

  • Create a new folder, say backend, and change directory into it
  • Run npm init -y to initialize a Node.js project:

backend$ npm init -y

This will create a basic package.json file with minimal entries.

  • Install TypeScript and create a tsconfig.json file:

backend$ npm install --save-dev typescript && npx tsc --init

At this point, you have a minimal Node.js app with TypeScript support. However, for our project, we need a more structured setup. Modify your package.json as follows:

{
  "name": "ai-powered-financial-behavior-analyzer",
  "version": "1.0.0",
  "description": "An API for analyzing financial behavior using AI",
  "main": "dist/src/app.js",
  "scripts": {
    "dev": "tsx watch src/app.ts",
    "build": "tsc && tsc-alias",
    "start": "node dist/src/app.js",
    "test": "jest --config jest.config.js",
    "test:coverage": "jest --config jest.config.js --coverage --silent=false"
  },
  "keywords": [],
  "author": "John Owolabi Idogun",
  "license": "ISC",
  "devDependencies": {
    "typescript": "^5.7.3"
  },
  "type": "module",
  "engines": {
    "node": "22.x"
  },
  "jest": {
    "extensionsToTreatAsEsm": [
      ".ts"
    ]
  }
}

We want to use ESM instead of CommonJS, hence the "type": "module". We also want the latest Node.js LTS (v22 at the time of writing). Finally, we want some structure: all source files live in the src directory and tests in the tests directory, with the entry point being src/app.ts during development (in production, it's the compiled dist/src/app.js). Next, make tsconfig.json look like this:

{
  "compilerOptions": {
    "target": "ES2023",
    "module": "NodeNext",
    "moduleResolution": "nodenext",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "outDir": "./dist",
    "rootDir": ".",
    "baseUrl": ".",
    "allowJs": true,
    "resolveJsonModule": true,
    "allowImportingTsExtensions": false,
    "typeRoots": ["./node_modules/@types", "./src/types"],
    "paths": {
      "$config/*": ["src/config/*"],
      "$controllers/*": ["src/controllers/*"],
      "$models/*": ["src/models/*"],
      "$routes/*": ["src/routes/*"],
      "$services/*": ["src/services/*"],
      "$types/*": ["src/types/*"],
      "$middlewares/*": ["src/middlewares/*"],
      "$utils/*": ["src/utils/*"],
      "$websockets/*": ["src/websockets/*"]
    },
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true
  },
  "include": ["src/**/*", "test/**/*.ts", "jest.setup.js"],
  "exclude": ["node_modules", "dist"]
}

You can get a better explanation for each of these entries in the TypeScript tsconfig reference. The idea is to use modern settings while staying pragmatic. We also created aliases (that's what paths does) so that instead of writing ../../../src/config/base.js, we can simply write $config/base.js. Nifty stuff!
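To make the aliasing concrete, here is a tiny, hypothetical sketch (illustrative only, not project code) of the substitution paths describes: the * in a pattern matches the rest of the specifier and is pasted into the target.

```typescript
// Hypothetical illustration of how a "paths" entry maps a specifier:
// "$config/*" -> "src/config/*" means everything after "$config/" is
// substituted into the target pattern.
const paths: Record<string, string[]> = {
  "$config/*": ["src/config/*"],
  "$utils/*": ["src/utils/*"],
};

function resolveAlias(specifier: string): string {
  for (const [pattern, [target]] of Object.entries(paths)) {
    const prefix = pattern.slice(0, -1); // e.g. "$config/"
    if (specifier.startsWith(prefix)) {
      // paste the remainder onto the target, minus its trailing "*"
      return target.slice(0, -1) + specifier.slice(prefix.length);
    }
  }
  return specifier; // not aliased: leave untouched
}

console.log(resolveAlias("$config/base.config.js")); // "src/config/base.config.js"
```

Note that tsc only type-checks these aliases; the tsc-alias step in the build script rewrites them to relative paths in the emitted JavaScript.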

Source code

Sirneij / finance-analyzer

An AI-powered financial behavior analyzer and advisor written in Python (aiohttp) and TypeScript (ExpressJS & SvelteKit with Svelte 5)

Implementation

Now it's time to get our hands dirty. We will be implementing the OAuth-based authentication system here.

Step 1: Install dependencies and setup configurations

We will use Passport.js and passport-github2 to implement the authentication strategy. Let's install the libraries (and types to keep TypeScript happy):

# Packages
backend$ npm i express@^5.0.1 passport passport-github2 mongoose cors redis connect-redis express-session dotenv winston

# Types
backend$ npm i -D @types/express  @types/passport  @types/passport-github2  @types/mongoose  @types/cors  @types/redis  @types/connect-redis  @types/express-session  @types/dotenv @types/winston

Note: Create a GitHub OAuth App

As with most OAuth services, you need to create a new GitHub OAuth app to use its authentication strategy. After creation, you will be provided with GITHUB_CLIENT_ID and GITHUB_CLIENT_SECRET which are required by passport-github2. Ensure you fill in your app details correctly.

Having installed the packages, let's start with some configurations. Create src/types/misc.types.ts and populate it with:

import { AuthConfig } from "$types/auth.types.js";
import { DbConfig } from "$types/db.types.js";
import winston from "winston";

export enum Providers {
  GOOGLE = 1,
  GITHUB = 2,
}

export interface BaseConfig {
  auth: AuthConfig;
  db: DbConfig;
  frontendUrl: string;
  utilityServiceUrl: string;
  redisUrl: string;
  logger: winston.Logger;
}

Though we are only supporting GitHub OAuth for now, we defined an enum that includes Google as well. The BaseConfig interface will be used for all the app's configuration, including auth and db, which have standalone types. First, src/types/auth.types.ts:

export interface OAuthCredentials {
  clientID: string;
  clientSecret: string;
  callbackURL: string;
}

export interface AuthConfig {
  google: OAuthCredentials;
  github: OAuthCredentials;
  session: {
    secret: string;
  };
}

And src/types/db.types.ts:

export interface DbConfig {
  uri: string;
  dbName: string;
}

As previously stated, we need each OAuth service's `client_id` and `client_secret` to authenticate. Another important credential is the `callback_url` (also called the `redirect URI` in some OAuth implementations), which is the URL the OAuth service redirects users to after authentication. You supply this when registering or creating a new OAuth app in GitHub (and with other providers as well).

Next, let's find a way to populate these credentials. We will use the `dotenv` package to retrieve them from a `.env` file or environment variables:



Here is src/config/base.config.ts:
import { BaseConfig } from "$types/misc.types.js";
import { authConfig } from "$config/internal/auth.config.js";
import { dbConfig } from "$config/internal/db.config.js";
import { logger } from "$config/internal/logger.config.js";

export const baseConfig: BaseConfig = {
  get frontendUrl() {
    return process.env.FRONTEND_URL || "http://localhost:3000";
  },
  get utilityServiceUrl() {
    return process.env.UTILITY_SERVICE_URL || "http://localhost:5173";
  },
  get redisUrl() {
    return process.env.REDIS_URL || "redis://localhost:6379";
  },
  auth: authConfig,
  db: dbConfig,
  logger,
};

We use getter methods in our configuration for three key benefits:

  1. Dynamic Values: Getters retrieve values on-demand, ensuring we always get the latest values
  2. Environment Variables: Particularly important for process.env values that may change during runtime
  3. Lazy Evaluation: Values are only computed when accessed, improving performance

I personally encountered a bug in production where my authentication process was failing because a stale frontendUrl value, captured from process.env at module load time, was being used.
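A minimal sketch of that difference, using a plain object in place of process.env for illustration: a regular property captures its value once at object creation, while a getter re-reads it on every access.

```typescript
// Illustrative only: `env` stands in for process.env here.
const env: Record<string, string | undefined> = { FRONTEND_URL: undefined };

const config = {
  // captured once, when the object is created:
  frontendUrlStatic: env.FRONTEND_URL ?? "http://localhost:3000",
  // re-evaluated on every access:
  get frontendUrl() {
    return env.FRONTEND_URL ?? "http://localhost:3000";
  },
};

// Value set later, e.g. after dotenv loads or the platform injects it.
env.FRONTEND_URL = "https://example.com";

console.log(config.frontendUrlStatic); // still "http://localhost:3000"
console.log(config.frontendUrl); // "https://example.com"
```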

Here are the contents of src/config/internal/auth.config.ts:

import { AuthConfig } from "$types/auth.types.js";
import { config } from "dotenv";

config();

export const authConfig: AuthConfig = {
  google: {
    clientID: process.env.GOOGLE_CLIENT_ID!,
    clientSecret: process.env.GOOGLE_CLIENT_SECRET!,
    callbackURL: `${process.env.APP_URL}/api/v1/auth/google/callback`,
  },
  github: {
    clientID: process.env.GITHUB_CLIENT_ID!,
    clientSecret: process.env.GITHUB_CLIENT_SECRET!,
    callbackURL: `${process.env.APP_URL}/api/v1/auth/github/callback`,
  },
  session: {
    secret: process.env.SESSION_SECRET || "your-secret-key",
  },
};
Next, src/config/internal/db.config.ts:

import { DbConfig } from "$types/db.types.js";

import { config } from "dotenv";

config();

export const dbConfig: DbConfig = {
  uri: process.env.DB_URI || "mongodb://localhost:27017",
  dbName: process.env.DB_NAME || "test",
};
And src/config/internal/logger.config.ts:

import winston from "winston";

// Colors for different log levels
const colors = {
  error: "red",
  warn: "yellow",
  info: "green",
  http: "magenta",
  debug: "blue",
};

winston.addColors(colors);

// Development format - pretty console output
const developmentFormat = winston.format.combine(
  winston.format.colorize(),
  winston.format.timestamp({ format: "YYYY-MM-DD HH:mm:ss" }),
  winston.format.printf(
    (info) => `${info.timestamp} ${info.level}: ${info.message}`
  )
);

// Production format - JSON for better parsing
const productionFormat = winston.format.combine(
  winston.format.timestamp(),
  winston.format.json()
);

export const logger = winston.createLogger({
  level: process.env.NODE_ENV === "development" ? "debug" : "info",
  format:
    process.env.NODE_ENV === "development"
      ? developmentFormat
      : productionFormat,
  transports: [new winston.transports.Console()],
});

Tip: Use MongoDB Atlas

You probably have MongoDB installed on your machine and can use it for development. If you need a production-like database, check out the free tier of MongoDB Atlas.

Step 2: Connect to MongoDB and Redis

With the configurations in place, let's create services that connect our application to MongoDB and to Redis (used for session storage).

Note: Why Redis for Session Storage?

While express-session offers in-memory storage, Redis is preferred for production because:

  • Persistence: Sessions survive server restarts
  • Scalability: Handles high traffic and multiple server instances
  • Performance: Fast read/write operations with minimal latency
  • Memory Management: Automatic memory optimization and key expiration

Using in-memory storage in production can lead to:

  • Lost sessions after server restarts
  • Memory leaks as sessions accumulate
  • Scaling issues with multiple server instances

Let's create a database service for them:

    import mongoose from "mongoose";
    import { baseConfig } from "$config/base.config.js";
    import { RedisStore } from "connect-redis";
    import { createClient } from "redis";
    
    const MAX_RETRIES = 3;
    const RETRY_INTERVAL = 5000;
    
    export async function connectToCluster(retryCount = 0) {
      try {
        const options = {
          dbName: baseConfig.db.dbName,
          serverSelectionTimeoutMS: 15000,
          socketTimeoutMS: 45000,
          maxPoolSize: 50,
          minPoolSize: 10,
          retryWrites: true,
          retryReads: true,
        };
        await mongoose.connect(baseConfig.db.uri, options);
    
        mongoose.connection.on("error", (err) => {
          baseConfig.logger.error("❌ MongoDB connection error:", err);
        });
    
        mongoose.connection.once("open", () => {
          baseConfig.logger.info("✅ MongoDB connection successful");
        });
        // Handle graceful shutdown
        process.on("SIGINT", async () => {
          try {
            await mongoose.connection.close();
            baseConfig.logger.info("MongoDB connection closed");
            process.exit(0);
          } catch (err) {
            baseConfig.logger.error("Error closing MongoDB connection:", err);
            process.exit(1);
          }
        });
    
        return mongoose.connection;
      } catch (error) {
        baseConfig.logger.error("❌ MongoDB connection error:", error);
    
        if (retryCount < MAX_RETRIES) {
          baseConfig.logger.info(
            `Retrying connection to MongoDB cluster in ${
              RETRY_INTERVAL / 1000
            } seconds...`
          );
          await new Promise((resolve) => setTimeout(resolve, RETRY_INTERVAL));
          return connectToCluster(retryCount + 1);
        }
    
        throw error;
      }
    }
    
    export const connectToRedis = (): RedisStore => {
      const redisClient = createClient({
        url: baseConfig.redisUrl,
      });
    
      redisClient.connect().catch((error) => {
        baseConfig.logger.error("❌ Redis connection error:", error);
      });
    
      redisClient.on("connect", () => {
        baseConfig.logger.info("✅ Redis connection successful");
      });
    
      return new RedisStore({ client: redisClient, prefix: "session:" });
    };
    

For the MongoDB connection, we implemented retry logic in case a connection attempt fails. Aside from that, it's a standard way to connect to a MongoDB instance. We did something similar for Redis. Now we can hook all of this up in src/app.ts.
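Stripped of the MongoDB specifics, the retry pattern connectToCluster uses can be sketched as a generic helper (illustrative only, not the repo's code): re-invoke a failing async operation up to a fixed number of extra times, waiting between attempts.

```typescript
// Generic sketch of the retry pattern: try `fn`, and on failure wait
// `intervalMs` and try again, up to `maxRetries` extra attempts.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxRetries = 3,
  intervalMs = 0
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (error) {
      if (attempt >= maxRetries) throw error; // out of retries: surface the error
      await new Promise((resolve) => setTimeout(resolve, intervalMs));
    }
  }
}

// Example: an operation that fails twice before succeeding.
let attempts = 0;
const result = await withRetry(async () => {
  attempts++;
  if (attempts < 3) throw new Error("transient failure");
  return "connected";
});
console.log(result, attempts); // "connected" 3
```

Unlike this sketch, the article's version uses recursion and a fixed 5-second interval, but the control flow is the same.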

Step 3: Setting up an Express server in src/app.ts

Let's populate our src/app.ts with the following:

    import express, { Application } from "express";
    import cors from "cors";
    import session from "express-session";
    import passport from "passport";
    import { Strategy as GitHubStrategy } from "passport-github2";
    import { connectToCluster, connectToRedis } from "$services/db.service.js";
    import { baseConfig } from "$config/base.config.js";
    import { AuthService } from "$services/auth.service.js";
    import authRoutes from "$routes/auth.routes.js";
    import { Providers } from "$types/misc.types.js";
    import { GitHubProfile } from "$types/auth.types.js";
    import type { User } from "$types/passports.d.js";
    import { ProviderMismatchError } from "$types/error.types.js";
    import { createServer, Server as HttpServer } from "http";
    
    const app: Application = express();
    
    // 1. Trust proxy setting
    app.set("trust proxy", 1);
    
    // 2. Basic middleware
    app.use(express.json());
    
    // 3. CORS configuration
    app.use(
      cors({
        origin: baseConfig.frontendUrl,
        credentials: true,
        methods: [
          "GET",
          "POST",
          "PUT",
          "PATCH",
          "DELETE",
          "OPTIONS",
          "HEAD",
          "TRACE",
          "CONNECT",
        ],
        allowedHeaders: ["Content-Type", "Authorization"],
        exposedHeaders: ["set-cookie"],
      })
    );
    
    // 4. Session configuration
    app.use(
      session({
        store: baseConfig.redisUrl ? connectToRedis() : new session.MemoryStore(),
        secret: baseConfig.auth.session.secret,
        resave: false,
        saveUninitialized: false,
        proxy: true,
        cookie: {
          secure: process.env.NODE_ENV === "production",
          sameSite: process.env.NODE_ENV === "production" ? "none" : "lax",
          httpOnly: true,
          maxAge: 24 * 60 * 60 * 1000,
          domain:
            process.env.NODE_ENV === "production"
              ? baseConfig.cookieDomain
              : undefined,
        },
      })
    );
    
    // 5. Authentication middleware
    app.use(passport.initialize());
    app.use(passport.session());
    
    passport.serializeUser<User>((user, done) => {
      done(null, user);
    });
    
    passport.deserializeUser((user: User, done) => {
      done(null, user);
    });
    
    passport.use(
      new GitHubStrategy(
        baseConfig.auth.github,
        async (
          accessToken: any,
          refreshToken: any,
          profile: GitHubProfile,
          done: (error: any, user?: any, options?: { message: string }) => void
        ) => {
          try {
            const user = await AuthService.findOrCreateUser({
              id: profile.id,
              email: profile.emails?.[0].value,
              provider: profile.provider,
              providerId: Providers.GITHUB,
              avatar: profile.photos?.[0].value,
              name: profile.displayName,
            });
            return done(null, user);
          } catch (error) {
            if (error instanceof ProviderMismatchError) {
              return done(null, false, { message: error.message });
            }
            return done(error);
          }
        }
      )
    );
    
    // Authentication routes
    app.use("/api/v1/auth", authRoutes);
    
    // Health check
    app.get("/api/v1/health", (req, res) => {
      baseConfig.logger.info("Health check endpoint called");
      res.status(200).json({ message: "Server is running" });
    });
    
    const startServer = async () => {
      try {
        const server: HttpServer = createServer(app);
    
        const db = await connectToCluster();
    
        if (!db.readyState) {
          throw new Error("MongoDB connection not ready");
        }
    
        // 7. Start server
        const PORT = process.env.PORT || 3000;
        server.listen(PORT, () => {
          baseConfig.logger.info(`Server listening on port ${PORT}`);
        });
      } catch (error) {
        baseConfig.logger.error("Error starting server:", error);
        process.exit(1);
      }
    };
    startServer();
    

It is a simple setup that also considers production environments. We started by creating an Express application instance, to which we attach middleware and routes. Next, we enabled "trust proxy", which makes Express trust X-Forwarded-* headers from a reverse proxy in front of the application (needed, among other things, for secure cookies behind one). Then we told Express to parse incoming requests with JSON payloads by "using" the express.json() middleware; this is how middleware is attached in Express. We also used the CORS and session middlewares to configure cross-origin resource sharing and sessions appropriately. Specifically, we give the backend a "go-ahead" to share resources with our frontend, and we secure our sessions with a generated secret key. In development, you can use openssl to generate a 32-byte secret key:

    $ openssl rand -base64 32
    

In production, you can opt for cryptographically secure random bytes.
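One way to do that from code (using Node's built-in crypto module; the article doesn't prescribe a particular method, so treat this as a sketch):

```typescript
// Generate 32 cryptographically secure random bytes and encode them as
// base64 for use as a session secret.
import { randomBytes } from "node:crypto";

const sessionSecret = randomBytes(32).toString("base64");
console.log(sessionSecret.length); // 44 base64 characters encode 32 bytes
```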

After that, we added Passport's authentication middleware and wired up serialization and deserialization of our app's User object. In the spirit of authentication, we then defined our GitHub OAuth strategy, which follows the usual anatomy of OAuth strategies supported by Passport, and specifically passport-github2. Because our auth config already matches the expected options shape, we passed it in directly; otherwise, we would have done something like:

    passport.use(new GitHubStrategy({
        clientID: process.env['GITHUB_CLIENT_ID'],
        clientSecret: process.env['GITHUB_CLIENT_SECRET'],
        callbackURL: 'https://www.example.com/oauth2/redirect/github'
      },
      ...
    

In the callback function, we have access to the profile data returned by GitHub, which is then passed to the AuthService to find or create the user in the database:

    import { User } from "$models/user.model.js";
    import { AuthUser, UserProfile } from "$types/auth.types.js";
    import { ProviderMismatchError } from "$types/error.types.js";
    
    export class AuthService {
      static async findOrCreateUser(profile: UserProfile) {
        try {
          // First try to find user by email only
          let user = await User.findOne({ email: profile.email }).exec();
    
          if (user) {
            // User exists, check provider
            if (user.provider !== profile.provider) {
              throw new ProviderMismatchError(user.provider);
            }
    
            // Check if any details need updating
            const updates: Partial<AuthUser> = {};
            if (user.name !== profile.name) updates.name = profile.name;
            if (user.providerId !== profile.providerId)
              updates.providerId = profile.providerId;
            if (user.avatar !== profile.avatar) updates.avatar = profile.avatar;
            // If updates needed, apply them
            if (Object.keys(updates).length > 0) {
              user = await User.findByIdAndUpdate(
                user._id,
                { $set: updates },
                { new: true }
              ).exec();
            }
          } else {
            // Create new user if none exists
            user = await User.create({
              email: profile.email,
              name: profile.name,
              provider: profile.provider,
              providerId: profile.providerId,
              avatar: profile.avatar,
            });
          }
    
          return user;
        } catch (error) {
          console.error("Error in findOrCreateUser:", error);
          throw error;
        }
      }
    }
    
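The "collect only changed fields" step in findOrCreateUser can be sketched in isolation (the types and names below are illustrative, not from the repo): compare each field and build a partial object containing only the differences, so the database update touches nothing else.

```typescript
// Illustrative types standing in for the user document and incoming profile.
interface ProfileFields {
  name?: string;
  providerId?: number;
  avatar?: string | null;
}

// Return a partial object holding only the fields that differ.
function diffUpdates(current: ProfileFields, incoming: ProfileFields): Partial<ProfileFields> {
  const updates: Partial<ProfileFields> = {};
  if (current.name !== incoming.name) updates.name = incoming.name;
  if (current.providerId !== incoming.providerId) updates.providerId = incoming.providerId;
  if (current.avatar !== incoming.avatar) updates.avatar = incoming.avatar;
  return updates;
}

const updates = diffUpdates(
  { name: "John", providerId: 2, avatar: null },
  { name: "John", providerId: 2, avatar: "https://example.com/a.png" }
);
console.log(updates); // only the changed field: { avatar: "https://example.com/a.png" }
```

This is what makes the `Object.keys(updates).length > 0` guard in the service meaningful: no write happens when nothing changed.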

    We defined a user model (with its schema) already:

    import mongoose, { Schema } from "mongoose";
    import { AuthUser } from "$types/auth.types.js";
    
    const userSchema = new Schema<AuthUser>(
      {
        email: { type: String, required: true, unique: true },
        name: { type: String },
        provider: { type: String, required: true },
        providerId: { type: Number, required: true },
        avatar: { type: String, default: null },
      },
      {
        timestamps: true,
      }
    );
    
    export const User = mongoose.model<AuthUser>("User", userSchema);
    

    The types used so far for the user can be found in src/types/auth.types.ts:

    import mongoose from "mongoose";
    
    export interface UserProfile {
      id: string;
      email: string;
      name?: string;
      provider: "google" | "github";
      providerId: number;
      avatar: string | null;
    }
    
    export interface AuthUser {
      _id?: mongoose.Types.ObjectId;
      email: string;
      name?: string;
      provider: string;
      providerId: number;
      avatar: string | null;
      isJohnOwolabiIdogun: boolean;
      createdAt: Date;
      updatedAt: Date;
    }
    ...
    

    To make TypeScript happy with our custom user type, we needed to modify its user type in src/types/passport.d.ts:

    import { AuthUser } from "$types/auth.types.js";
    
    declare global {
      namespace Express {
        interface User extends AuthUser {}
      }
    }
    
    // Re-export the User type
    export type User = Express.User;
    

The rest of src/app.ts is pretty basic. Before we wrap up this article, let's look at the authentication routes.

    Step 4: Authentication routes

    In src/app.ts, we used:

    ...
    import { AuthService } from "$services/auth.service.js";
    import authRoutes from "$routes/auth.routes.js";
    ...
    
    // Authentication routes
    app.use("/api/v1/auth", authRoutes);
    ...
    

    These routes are in src/routes/auth.routes.ts:

    import { Router } from "express";
    import passport from "passport";
    import { AuthController } from "$controllers/auth.controller.js";
    import { isAuthenticated } from "$middlewares/auth.middleware.js";
    
    const authRouters = Router();
    const authController = new AuthController();
    
    authRouters.get("/github", (req, res, next) => {
      const state = req.query.next
        ? Buffer.from(req.query.next as string).toString("base64")
        : "";
      passport.authenticate("github", {
        scope: ["user:email"],
        state,
      })(req, res, next);
    });
    
    authRouters.get(
      "/github/callback",
      passport.authenticate("github", { failureRedirect: "/api/v1/auth/failure" }),
      (req, res, next) => {
        next();
      },
      authController.handleLoginSuccess
    );
    
    authRouters.get("/session", isAuthenticated, (req, res) => {
      res.json({ user: req.user });
    });
    
    authRouters.get("/failure", authController.handleLoginFailure);
    authRouters.get("/logout", authController.handleLogout);
    
    export default authRouters;
    

The first route is where the authentication flow starts. It lets you log in to your GitHub account and, if successful, redirects you to the callback_url supplied during OAuth app creation; for us, that's the second route.

Tip: Supplying a redirect route from the frontend

Let's say a user wants to access /private/route in your app's frontend but isn't authenticated. Your frontend then redirects the user to log in with GitHub and appends next=/private/route to the URL. After a successful login, the user expects to be sent back to where they were originally headed: /private/route. That is the logic implemented in the /github route above; it simply "remembers" where the user was going.
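The round trip of that state parameter, pulled out of the route and controller for illustration: the next path is base64-encoded before redirecting to GitHub, then validated and decoded in the success handler.

```typescript
// Encode the intended destination for the OAuth `state` parameter.
function encodeState(nextPath: string): string {
  return Buffer.from(nextPath).toString("base64");
}

// Decode it back, with the same base64 sanity check the controller uses.
function decodeState(state: string): string {
  const base64Regex = /^[A-Za-z0-9+/=]+$/;
  if (!base64Regex.test(state)) throw new Error("Invalid Base64 input");
  return Buffer.from(state, "base64").toString();
}

const state = encodeState("/private/route");
console.log(decodeState(state)); // "/private/route"
```

The validation matters because state comes back from the client and could be tampered with; an invalid value falls through to the default redirect path.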

These routes are fairly basic, so we won't dwell on them. However, they use some "controllers" we haven't seen yet:

    import { baseConfig } from "$config/base.config.js";
    import { Request, Response } from "express";
    
    export class AuthController {
      async handleLoginSuccess(req: Request, res: Response) {
        if (req.user) {
          if (req.xhr || req.headers.accept?.includes("application/json")) {
            res.status(200).json({
              success: true,
              message: "Login successful",
              user: req.user,
            });
          } else {
            const state = req.query.state as string | undefined;
            let redirectPath = "/";
    
            if (state) {
              try {
                // Validate if the state is Base64
                const base64Regex = /^[A-Za-z0-9+/=]+$/;
                if (base64Regex.test(state)) {
                  redirectPath = Buffer.from(state, "base64").toString();
                } else {
                  throw new Error("Invalid Base64 input");
                }
              } catch (error) {
                baseConfig.logger.error("Failed to decode state parameter:", error);
              }
            }
            baseConfig.logger.info(
              `Redirecting to ${baseConfig.frontendUrl}${redirectPath}`
            );
            res.redirect(`${baseConfig.frontendUrl}${redirectPath}`);
          }
        }
      }
    
      async handleLoginFailure(req: Request, res: Response) {
        baseConfig.logger.info(
          `Redirecting to ${baseConfig.frontendUrl}/finanalyzer/auth/login?error=true`
        );
        res.redirect(`${baseConfig.frontendUrl}/finanalyzer/auth/login?error=true`);
      }
    
      async handleLogout(req: Request, res: Response) {
        req.logout(() => {
          baseConfig.logger.info(
            `Redirecting to ${baseConfig.frontendUrl}/finanalyzer/auth/login`
          );
          res.redirect(`${baseConfig.frontendUrl}/finanalyzer/auth/login`);
        });
      }
    }
    

We redirected responses back to our frontend app.

With that, I will say see you in the next release! Check out the GitHub repository for the other missing pieces.

Outro

Enjoyed this article? I'm a Software Engineer and Technical Writer actively seeking new opportunities, particularly in areas related to web security, finance, healthcare, and education. If you think my expertise aligns with your team's needs, let's chat! You can find me on LinkedIn and X. I am also an email away.

If you found this article valuable, consider sharing it with your network to help spread the knowledge!
