This is the last project of the Full-Stack JavaScript path in The Odin Project course. Buckle up, because this post will discuss the many things I learned from it.
Planning
For me, coming from a creative video production and graphic design background, creating a visual design guides me through both the front end and the back end. Knowing what I want to see visually gives me a good idea of the front-end pages and back-end functionality I need. For this visual planning, I chose Figma.
I wouldn’t consider myself a professional web designer, but I can get by with the graphic design basics I’ve learned over the years. I didn’t implement everything in the design, just the parts most important to the user experience. As I began implementing it, I could definitely see that a solo engineer would need a substantial amount of time to build a full project with a Node back end and a React SPA front end.
That said, I kept it minimal to get the chance to touch every aspect of full-stack web development I could. In the future, I do plan on revisiting the project and updating it to fix and improve the UI and some backend functionality.
Tech Chosen
The project instructions, when I undertook this project, were to use a separate back end and front end. That led to the decision to build an Express back end and a React SPA front end. Since this is a learning project, I stayed away from full-stack frameworks like Next.js, Remix, and React Router in framework mode.
The back end is a simple, fully custom Express app with controllers, middleware, an ORM, and JSON responses. The front end is a Vite React SPA using React Router in declarative mode, meaning the only router is the client-side React Router. This let me render a different React component per “page” in the SPA, and all data fetching was done with fetch, with TanStack Query handling state and mutations.
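To show the shape of this controller-plus-JSON-response pattern, here is a minimal sketch. The `Req` and `Res` types are hand-rolled stand-ins for Express’s `Request` and `Response` so the snippet stays self-contained; in the actual app they come from Express, the handler name is illustrative, and the data would come from the ORM rather than a literal.

```typescript
// Simplified stand-ins for Express's Request/Response types.
type Req = { params: Record<string, string> };
type Res = { status: (code: number) => Res; json: (body: unknown) => void };

// Hypothetical controller: look up a post and reply with JSON.
function getPost(req: Req, res: Res): void {
  // In the real app this would be an ORM query, not a literal.
  const post = { id: req.params.postId, caption: "hello" };
  res.status(200).json({ post });
}
```

In the real app, a route like `app.get("/posts/:postId", getPost)` wires this handler up, with auth middleware running first.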
Front-End Tech
For a more comprehensive view of the frontend tech, you can find the frontend repo here.
- React
- React Dropzone
- React Hook Form
- React Router (declarative mode)
- Luxon
- shadcn/ui
- TanStack Query
- Tailwind CSS
- TypeScript
- Vite
- Zod
Back-End Tech
For a more comprehensive view of the backend tech, you can find the backend repo here.
- Cloudinary
- Express
- Passport.js
- PostgreSQL
- Prisma
- TypeScript
Cloudinary is used as the image server to store and retrieve images. It was easier for me to build the project with Cloudinary than to learn how to use AWS S3 and secure it properly, although that is something I’m looking to learn very soon.
Database Schema
I chose Prisma as the ORM and designed the database with Prisma in mind. From the beginning, I had a general idea of the types of data I wanted, such as:
- Users
- Posts
- Comments
- PostLikes
- CommentLikes
I wanted PostLikes and CommentLikes to be separate models in the schema because I didn’t want just a like count on each post and comment. I wanted users to have a way to see posts they had liked before, and that led to the decision of making “likes” a full model entity in the database, associating users with the posts and comments they like.
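A sketch of how such a likes model might look in Prisma’s schema language (the field names and relation setup here are illustrative assumptions, not the actual schema from the repo):

```prisma
// Likes as their own entity, rather than a counter column on Post.
model PostLike {
  id     Int  @id @default(autoincrement())
  user   User @relation(fields: [userId], references: [id])
  userId Int
  post   Post @relation(fields: [postId], references: [id])
  postId Int

  // One like per user per post, and an easy way
  // to list all posts a given user has liked.
  @@unique([userId, postId])
}
```

Modeling likes this way means “show me every post this user liked” is a simple query on PostLike, which a plain counter could never answer.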
As the project grew and I implemented social auth, I added Accounts to the schema. It was the simplest way to know whether a user logs in through a social provider or through email.
Another thing to note, mostly for myself: this project taught me that it’s important to save the image dimensions when saving a picture in your database, as this ensures that <img /> tags can have appropriate width and height attributes for a better user experience. Those attributes help the browser reduce layout shift, because it knows the aspect ratio of an image even before the image loads.
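As a small sketch of the idea: with the dimensions stored alongside the URL, building the attributes for an `<img>` tag is trivial. The `StoredImage` shape and helper name are assumptions for illustration, not from the actual repo.

```typescript
// What an image record might look like once dimensions are stored.
type StoredImage = { url: string; width: number; height: number };

// Build <img> attributes so the browser can reserve space
// (via the aspect ratio) before the file finishes downloading.
function imgAttributes(image: StoredImage) {
  return { src: image.url, width: image.width, height: image.height };
}

// Usage in JSX: <img {...imgAttributes(post.image)} alt={post.caption} />
```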
Auth
The decision to use Passport.js was mostly because it’s what the course had taught up to that point when I took it. If I could do it again, I would choose a more comprehensive library like Better Auth. The reason I feel Better Auth is a better choice is that auth is a very sensitive aspect of web development. I don’t trust myself to implement auth on my own, and while Passport does a lot of heavy lifting, I still had to adapt my database schema to make Passport and my Express app work with both email and Google social login.
Better Auth already has a lot of helper scripts and functionality to get your database in line with best practices for logging in via different methods. So as a learning project, Passport.js was good and showed me the kinds of database-layer requirements that are necessary for auth to function properly. In future projects, however, I would much rather use Better Auth, or a paid provider like WorkOS if the project could be used by enterprises.
Backend Routes
There are over 30 routes in the back end, so I’ll explain them in groups. The /auth routes use Passport.js to authenticate a user, either via email/password login or Google auth; once authenticated, an access token and a refresh token are sent to the user to be saved in local storage. All other routes use the req.user property in the final route handler to see who is performing the action. If the user is banned, or otherwise can’t perform the requested action, the action is prohibited and an error is sent back.
The /posts routes deal directly with CRUD operations on Posts. In particular, POST /posts/:postId/likes and DELETE /posts/:postId/likes are for liking and unliking posts respectively.
The /profiles routes get a user profile’s info. This does not include the user’s posts; those are handled by the /posts routes. POST /profiles/:username/follow and DELETE /profiles/:username/follow follow and unfollow a user respectively.
The /account routes perform CRUD operations on the currently logged-in user, and the /admin routes perform administrator tasks for users who have the ADMIN role, such as banning or deleting users.
Frontend Pages
The front end is a SPA with multiple “pages” of sorts. Most of the app experience happens in /feed, /explore, and /users. The Feed page shows posts from the users you follow and can be filtered in three ways: popular, latest, and oldest, where popular sorts posts by most likes. The Explore page shows posts from all users, whether you follow them or not. The /users page lets you browse all users and search for them by username, and the /users/:username page shows the info and posts of one specific user.
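The three feed orderings boil down to a simple sort on the client’s post data. A minimal sketch, assuming a hypothetical post shape with a like count and a creation timestamp:

```typescript
// Assumed post shape; the real app's fields likely differ.
type FeedPost = { id: number; likeCount: number; createdAt: number };
type FeedSort = "popular" | "latest" | "oldest";

// Return a newly sorted array, leaving the fetched data untouched.
function sortFeed(posts: FeedPost[], sort: FeedSort): FeedPost[] {
  const copy = [...posts];
  switch (sort) {
    case "popular":
      return copy.sort((a, b) => b.likeCount - a.likeCount); // most likes first
    case "latest":
      return copy.sort((a, b) => b.createdAt - a.createdAt); // newest first
    case "oldest":
      return copy.sort((a, b) => a.createdAt - b.createdAt); // oldest first
  }
}
```

Sorting a copy matters here because the original array is cache data (managed by TanStack Query in this app), and mutating it in place would corrupt the cache.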
The /posts/create page allows the logged-in user to create a post with one image and a simple caption.
Deployment
The front end was deployed on Vercel, because Vercel makes it easy to deploy Vite apps automatically without much configuration or a Docker image. The back end was deployed on Railway, which can detect Node projects and auto-configure them. Thankfully, both platforms give you a free domain to see your deployment, and that is how I got the back end and front end to communicate. This is something I thought about from the start.
Since I didn’t want to buy a domain just for this project, I couldn’t deploy them on different subdomains of the same site and use ‘lax’ SameSite cookies for auth. So I went with token-based auth: the JWTs are stored in local storage on the front end and sent in the Authorization HTTP header.
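A minimal sketch of that approach on the front end, assuming a hypothetical API_URL and helper names (the token would be read from localStorage and the resulting object passed to fetch):

```typescript
const API_URL = "https://api.example.com"; // hypothetical back-end domain

// Attach the JWT using the conventional Bearer scheme, if we have one.
function authHeaders(token: string | null): Record<string, string> {
  return token ? { Authorization: `Bearer ${token}` } : {};
}

// Describe the request; in the app this would be spread into a fetch() call,
// with the token coming from localStorage.getItem("accessToken").
function apiRequest(path: string, token: string | null) {
  return { url: `${API_URL}${path}`, headers: authHeaders(token) };
}
```

The trade-off, discussed below, is that local storage is readable by any script on the page, whereas httpOnly cookies are not.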
Another route I considered, but knew would take a lot of time to study and implement, was deploying the front end on Railway in the same project as the back end. This would have required the following measures:
- Create a Dockerfile with nginx or Caddy to serve the front-end assets
- Set up the nginx config to reroute requests from /api/<rest of route> to the back-end server
- Set up Express to accept requests only from the front-end domain
Implementing the above would have allowed the use of httpOnly cookies to store the auth information, at which point it would have made more sense to use sessions with cookies. However, in the Node.js ecosystem, JWTs are used much more than sessions. So, to showcase a general understanding of JWTs and avoid writing complex nginx configuration files, I chose to use JWTs and save them to local storage.
If I were to implement this in my own opinionated way, I would use Better Auth together with database-backed sessions and a cookie storing the session ID.