Hey everyone! This first post shares my thoughts and progress on an internal application I'm developing with the help of a junior designer and a junior developer. The purpose of this tool is to streamline the website analysis process for our dev team, helping us gain deeper insights into client sites with less friction.
Project Objective
The Domain Analysis Tool will serve as an internal application for our development team, enabling us to:
- Analyze client websites and pull relevant metrics from Google Analytics.
- Map these metrics visually to help clients make data-informed decisions.
- Provide an individual page form where clients can add notes or fill in missing information for more context.
Early Designs
The basic idea was for clients to select which pages they wanted migrated to our new branding theme. Furthermore, due to our information technology department's restructure, they could also mark pages to be deleted or redesigned.
Google Analytics plays a big role here: it provides the metrics for clients' pages and drives our dynamic content. We'll get more into that later.
🔑 Key Challenges
Since our team usually works with older technologies, this project allows us to explore more modern tools and practices. Additionally, we’re tasked with building Docker images to facilitate deployment, giving us more flexibility in our stack choices. It is a learning curve for everyone involved, but it's exciting!
💻 Choosing the Tech Stack
Our guiding principle was simplicity—our goal was to create a functional, straightforward minimum viable product (MVP) that could quickly meet client needs, with the potential for additional quality-of-life improvements over time.
1. ⚛️ Why React.js?
The team's skill growth has been capped by years of working in a Content Management System. To keep our development streamlined yet impactful, I wanted a framework that offered:
- Widespread adoption and solid documentation
- Good performance and scalability
However, there's a challenge: React's flexibility can lead to a wide variety of patterns and opinions on structure. We needed something straightforward, with clear best practices for file organization, to keep the app easy to navigate for future devs.
2. ⏭️ Why Next.js?
After exploring Next.js, I felt it was a great fit due to:
- Built on React!
- Dynamic Routing: Essential for managing pages rendered on the fly, perfect for our needs.
- Flexible rendering (server or static)

Next.js seemed like the right choice for building out our frontend efficiently, and I took the lead in setting it up. A quick sketch of a dynamic route is below.
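To make the dynamic routing point concrete, here's a minimal sketch of what one of those pages rendered on the fly could look like, assuming the Pages Router; the route name, backend URL, and types are hypothetical, not our actual code.

```tsx
// pages/domains/[domainId].tsx: hypothetical dynamic route for a department's domain
import type { GetServerSideProps } from "next";

type PageMetric = { path: string; pageViews: number };
type Props = { domainId: string; pages: PageMetric[] };

export const getServerSideProps: GetServerSideProps<Props> = async ({ params }) => {
  const domainId = String(params?.domainId);

  // Hypothetical endpoint served by our Express backend
  const res = await fetch(`http://localhost:4000/api/domains/${domainId}/pages`);
  const pages: PageMetric[] = await res.json();

  return { props: { domainId, pages } };
};

export default function DomainPage({ domainId, pages }: Props) {
  return (
    <main>
      <h1>Domain: {domainId}</h1>
      <ul>
        {pages.map((p) => (
          <li key={p.path}>
            {p.path}: {p.pageViews} views
          </li>
        ))}
      </ul>
    </main>
  );
}
```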
3. ⌨️ TypeScript all the way!
Initially, I was writing plain JSX the regular old way, but linting often did a poor job of telling me what my errors were or where exactly they lived. TypeScript will save your ass: strong typing, strict null checks, and generally less error-prone code than plain JavaScript.
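As a quick illustration of the kind of mistake strict null checks catch, here's a tiny sketch; the Page type is hypothetical.

```ts
// With "strict": true in tsconfig.json, the unsafe access below is caught at compile time.
type Page = { path: string; title?: string };

function titleLength(page: Page): number {
  // Error: 'page.title' is possibly 'undefined'.
  // return page.title.length;

  // Handle the missing case explicitly instead:
  return page.title?.length ?? 0;
}
```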
4. 🚊 Express.js
I will admit that backend is not my forte, but I saw this as an opportunity to learn. It also played to my junior's strengths, so I put him in charge of the backend with a bit of oversight from me. He chose Express.js for simple routing and middleware management as we build out our API.
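For flavor, here's a minimal sketch of the kind of routing and middleware setup this enables; the routes, port, and handler body are placeholders, not our actual API.

```ts
// server.ts: a minimal Express setup (routes and port are hypothetical)
import express from "express";

const app = express();
app.use(express.json()); // parse JSON request bodies

// Simple request-logging middleware
app.use((req, _res, next) => {
  console.log(`${req.method} ${req.path}`);
  next();
});

// Example route: list the pages we track for a domain
app.get("/api/domains/:domainId/pages", async (req, res) => {
  const { domainId } = req.params;
  // ...query MySQL here...
  res.json({ domainId, pages: [] });
});

app.listen(4000, () => console.log("API listening on :4000"));
```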
5. 💾 MySQL
It's reliable and has a proven track record. One obstacle we were still figuring out was image handling: could we upload page screenshots to the database, and if so, was there a library we could use to automate scraping and uploading them? That felt like a lot of heavy lifting, so we decided to hold off on images.
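To show how the Express side could talk to MySQL, here's a rough sketch assuming the mysql2 package; the connection details and the pages table are hypothetical.

```ts
// db.ts: minimal MySQL access via mysql2 (schema and credentials are hypothetical)
import mysql from "mysql2/promise";

const pool = mysql.createPool({
  host: "localhost",
  user: "app",
  password: process.env.DB_PASSWORD,
  database: "domain_analysis",
});

export async function getPagesForDomain(domainId: string) {
  // Parameterized query to avoid SQL injection
  const [rows] = await pool.query(
    "SELECT path, title, notes FROM pages WHERE domain_id = ?",
    [domainId]
  );
  return rows;
}
```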
Minimal Third-Party Dependencies
We focused on selecting well-supported libraries that wouldn’t add technical debt or face deprecation risks within a year. This minimizes the burden on the team if I move to a different role or project.
Other Stack Considerations
We initially considered Python (Django/Flask) due to its simplicity and strong scraping libraries like Selenium and Beautiful Soup, which could have been useful for automated data extraction. Ultimately, we decided to stick with a JavaScript stack for consistency and ease of integration.
Brand theming
Of course, we already had branded components, but living inside a content management system, they weren't configured for Next.js or usable as standalone pieces. In short, I had to rebuild our design system to accommodate single-page applications. (React component library coming soon for our Mercury Theme? 👀)
🛠️ Dev Tools
🚀 Postman
Perhaps one of my favorite tools for API testing. It's perfect for testing Google Analytics API queries and retrieving data based on dynamic input. I've used it on various projects to see exactly what HTTP requests were being sent and to diagnose the errors I got back.
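For context, this is roughly the kind of request we prototyped in Postman before wiring it into code, assuming the GA4 Data API's runReport endpoint; the property ID, metrics, and token handling are placeholders.

```ts
// Hypothetical GA4 Data API runReport call: page views per path over the last 30 days
const GA_PROPERTY_ID = "123456789"; // placeholder property ID

async function fetchPageViews(accessToken: string) {
  const response = await fetch(
    `https://analyticsdata.googleapis.com/v1beta/properties/${GA_PROPERTY_ID}:runReport`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        dateRanges: [{ startDate: "30daysAgo", endDate: "today" }],
        dimensions: [{ name: "pagePath" }],
        metrics: [{ name: "screenPageViews" }],
      }),
    }
  );
  return response.json();
}
```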
🥽 GitLab
How could I forget our repository? GitLab gives us a good project management structure, a Web IDE, and, most important of all, my company is partnered with them 🙂
🐳 Docker
This one is a doozy. To be honest, I didn't understand it the first time around... nor the second time, but third time's the charm! It clicks now. The hardest part was grasping the process and workflow for specific projects, especially since we would not be maintaining the containers (our infrastructure team is responsible for that); we just hand over the Docker images.
Laying the groundwork
As my junior dev began building our routes and API in the backend, I began setting up in the frontend. Likewise, my junior designer began creating high fidelity mockups, offering different iterations of designs on how the app should look and feel.
This rough first version shows the interface you see when you click into a department's domain. We opted for cards over a list so that clients could get a better view of what each page actually looks like. The thinking is that clients aren't keeping track of every single page and will often forget what a page is about, especially orphaned pages. So we provide a visual of each page, which brings us back to our first issue: image handling. Since we wanted to keep technical debt low and the app lightweight, we were trying to use minimal libraries.
The solution? We opted for iframes, for every single page. Since we would be retrieving page paths from Google's API, we could just iframe those links in, make them non-interactive, and zoom them out to give a basic preview. It was an imperfect but serviceable solution that let us meet our deadline by the end of 2024.
Performance-wise, it could be a nightmare, or it could be fine 🤔. I decided to lazy load all the iframes, and with pagination only 10 cards with iframes are displayed at a time. This is an internal tool, so I wasn't too worried about scaling and performance. I figured fewer than ~30 people would be hitting the site at its peak, since it's behind a lot of walls (SAML, VPN, etc.).
What about internal pages that the iframe cannot load?
Well, you're sort of S.O.L. with that, but we provide the original page link within the same card. So as long as you're within the right department, you can view those pages in a new tab.
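Putting those pieces together, here's a rough sketch of what one of those preview cards could look like: a lazily loaded, non-interactive, zoomed-out iframe with the original link as a fallback. The component name, dimensions, and scale factor are hypothetical.

```tsx
// PagePreviewCard.tsx: a zoomed-out, non-interactive, lazily loaded iframe preview (hypothetical)
type Props = { title: string; url: string };

export function PagePreviewCard({ title, url }: Props) {
  return (
    <div className="preview-card">
      <div style={{ width: 320, height: 200, overflow: "hidden" }}>
        <iframe
          src={url}
          title={title}
          loading="lazy" // defer loading until the card nears the viewport
          style={{
            width: 1280,
            height: 800,
            transform: "scale(0.25)", // zoom out so the full page fits the card
            transformOrigin: "top left",
            pointerEvents: "none", // make the preview non-interactive
            border: 0,
          }}
        />
      </div>
      <h3>{title}</h3>
      {/* Fallback for pages the iframe can't load: open the original in a new tab */}
      <a href={url} target="_blank" rel="noreferrer">
        Open page
      </a>
    </div>
  );
}
```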
I didn't want to make this first post too long, but if there's interest in seeing the project progression, I'll make another post continuing the project details and obstacles I've gone through. I'm writing this in hindsight, as I've recently finished the project 😎🎉
I'm open to suggestions and things I may have missed! This was a great learning process for my team and me.