Walkthrough

Logging migrated to slog; server gains timeouts and cancellable background media cleanup; media and post domains added (models, DTOs, repositories, services, handlers, routes, and migration); user repository refactored to use a dbExecutor; CORS helper removed; frontend IDs and lint settings adjusted.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    actor Client as User
    participant MH as MediaHandler
    participant MS as MediaService
    participant MR as MediaRepository
    participant FS as Filesystem
    Client->>MH: POST /api/v1/media (multipart/form-data)
    MH->>MS: PrepareFileUpload(origFilename, contentType)
    MS->>FS: ensure upload dir & create temp file
    MS->>MR: CreateMedia(status=pending, expires_at)
    MS-->>MH: writer + media metadata
    MH->>MH: sniff bytes, validate type
    MH->>FS: stream file to temp file
    MH->>MS: FinalizeFileUpload(mediaID, size, actualType)
    MS->>MR: UpdateMediaStatusAndSize(status=uploaded)
    MS-->>MH: success
    MH-->>Client: 201 Created + MediaUploadResponse
    Note over MS,MR: Background: CleanupExpiredUnusedMedia runs periodically -> delete files + DB rows
```
```mermaid
sequenceDiagram
    autonumber
    actor Client as User
    participant PH as PostHandler
    participant PS as PostService
    participant MS as MediaService
    participant PR as PostRepository
    participant DB as PostgreSQL
    Client->>PH: POST /api/v1/posts {description, media_id?}
    PH->>PS: CreatePost(ctx, req)
    PS->>PS: validate + get user claims
    alt media_id present
        PS->>MS: GetMediaForPostCreation(mediaID)
        MS-->>PS: media metadata (ensure uploaded)
        PS->>PS: build image URL
    end
    PS->>DB: BEGIN
    PS->>PR: CreatePost(post)
    PR-->>PS: postID
    alt media_id present
        PS->>MS: LinkMediaToPost(mediaID, postID)
    end
    PS->>DB: COMMIT
    PS-->>PH: PostResponse (201)
    PH-->>Client: 201 Created + JSON
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~75 minutes
Pre-merge checks and finishing touches

✅ Passed checks (3 passed)
✨ Finishing touches
🧪 Generate unit tests

📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📒 Files selected for processing (2)
✅ Files skipped from review due to trivial changes (1)
🚧 Files skipped from review as they are similar to previous changes (1)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.

bump
Actionable comments posted: 17
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
frontend/biome.json (1)
314-345: yo frfr frontend/biome.json is nesting a root config and Biome straight up bails

We ran Biome 2.2.0 against frontend/biome.json and got:

```
× Found a nested root configuration, but there's already a root configuration.
  i The other configuration was found in .
  i Use the migration command from the root of the project to update the configuration.
    $ biome migrate --write
× Biome exited because the configuration resulted in errors. Please fix them.
```

Means none of your suspicious rules are loading—linter’s 💀
To fix this ASAP:
- Decide on a single root config
- In frontend/biome.json, either remove the top-level "root": true (or delete the file if it’s redundant)
- Or consolidate its rules into the project-root config and drop the nested one
- After cleanup, run from repo root: npx biome migrate --write

This will ensure your new rules actually load and lint without errors.
♻️ Duplicate comments (1)
backend/internal/handlers/post_handler.go (1)
42-47: Yo frfr: validation will nuke text-only posts until DTO tag is fixed.

This path will 400 when media_id is absent because the DTO uses validate:"uuid" (no omitempty). See suggested DTO fix.
🧹 Nitpick comments (52)
frontend/src/index.css (2)
29-29: tiny nit: clarify the ignore reason + spacing for vibes

Make the rationale a bit clearer and add a space before the closing */ for consistency.

```diff
-/* biome-ignore lint/suspicious/noUnknownAtRules: tailwindcss*/
+/* biome-ignore lint/suspicious/noUnknownAtRules: allow Tailwind CSS @theme */
```
28-30: optional: toolchain-level allowlist to avoid sprinkling ignores later

If more Tailwind at-rules pop up, consider scoping the rule off for this entry CSS via Biome overrides instead of repeating ignores.
Example .biome.json override (scoped to this file only):
```json
{
  "overrides": [
    {
      "include": ["frontend/src/index.css"],
      "linter": { "rules": { "suspicious": { "noUnknownAtRules": "off" } } }
    }
  ]
}
```

frontend/src/components/UI/CommentsSection.tsx (2)
2-2: Move useId out of JSX for clean hooks usage (small W refactor).

Calling hooks inline in JSX is lowkey sus for rule-of-hooks linters. Define it once per render and reference it. Keeps it crisp and future-lint-proof, frfr.
Apply this diff inside the component for the changed line:
```diff
-  id={useId()}
+  id={commentInputId}
```

And add this line near the top of the component (outside the return):
const commentInputId = useId();Also applies to: 24-24
19-28: A11y micro-buff: label/aria and form semantics would slap.

Right now it’s placeholder-only + a click handler. A screen reader label or aria-label, plus wrapping in a form to enable Enter-to-submit, would be a W UX.
Example (outside this hunk):
```tsx
<form onSubmit={...}>
  <label htmlFor={commentInputId} className="sr-only">Add a comment</label>
  <input id={commentInputId} aria-label="Add a comment" ... />
  <SendButton type="submit" />
</form>
```

frontend/src/assets/icons/MetaIcon.tsx (2)
1-1: Don’t call useId in JSX; hoist to a const.

Same vibe as elsewhere: keep hooks at top level for lint happiness. This keeps the icon component chill and predictable.
Apply this diff for the changed attribute:
```diff
-  id={useId()}
+  id={svgId}
```

And add at the top of the component:
const svgId = useId();Also applies to: 12-12
12-12: If the svg id isn’t referenced, you can drop it (tiny DOM trim).

If nothing is using this id (no clipPath/gradient/URL refs), nuke it for fewer bytes. Low effort, small win, frfr.
Alternative diff:
```diff
-  id={svgId}
+  // id removed (unused)
```

frontend/src/pages/registerPage.tsx (1)
63-65: Use semantic input types + autocomplete for built-in UX boosts.

Swap email to type="email" and add autocomplete hints. Free validation and better autofill, no cap.
```diff
- <input
-   type="text"
+ <input
+   type="email"
    id={`email-${idHash}`}
    placeholder="Enter your email"
+   autoComplete="email"
    className="w-full rounded-2xl border border-accent-secondary py-2 px-3 focus:outline-accent-secondary hover:bg-accent-secondary transition-all duration-300"
    value={email}
    required

- <input
-   type="password"
+ <input
+   type="password"
    id={`password-${idHash}`}
    placeholder="Enter your password"
+   autoComplete="new-password"
    className="w-full rounded-2xl border border-accent-secondary py-2 px-3 focus:outline-accent-secondary hover:bg-accent-secondary transition-all duration-300"
    value={password}
    minLength={8}
    required
```

Optional (outside this hunk): add role="alert" to the error span for screen readers:

```tsx
{error && (
  <span role="alert" className="mt-2 block text-center error text-red-800">
    {error}
  </span>
)}
```

Also applies to: 77-79
frontend/src/pages/loginPage.tsx (3)
49-55: Make the form a lil more cracked: let the browser help you
- Email input should be type="email" to auto-validate and offer platform keyboards.
- Add autocomplete hints for both fields for better UX.
- These swaps are zero-risk and polish the flow.
Apply:
```diff
- <input
-   type="text"
+ <input
+   type="email"
    id={`email-${idHash}`}
    placeholder="Enter your email"
    className="w-full rounded-2xl border border-accent-secondary py-2 px-3 focus:outline-accent-secondary hover:bg-accent-secondary transition-all duration-300"
    value={email}
    required
+   autoComplete="email"
    onChange={(e: React.ChangeEvent<HTMLInputElement>) =>
      setEmail(e.target.value)
    }
  />
@@
- <input
-   type="password"
+ <input
+   type="password"
    id={`password-${idHash}`}
    placeholder="Enter your password"
    className="w-full rounded-2xl border border-accent-secondary py-2 px-3 focus:outline-accent-secondary hover:bg-accent-secondary transition-all duration-300"
    value={password}
    minLength={8}
    required
+   autoComplete="current-password"
    onChange={(e: React.ChangeEvent<HTMLInputElement>) =>
      setPassword(e.target.value)
    }
  />
```

Also applies to: 63-69
13-14: Heads-up: useId strings contain colons; avoid CSS id selectors if you can

React’s useId can yield ids like ":r123:". HTML is fine, but CSS ids need escaping. You’re prefixing with email-/password-, which is already 200 IQ, so you’re chilling. Just don’t style by these ids.
11-12: Token state might be deadweight here

You navigate immediately after success, so holding token in local state + success banner is low-value and can flicker. Consider removing token state and pushing token to your auth layer (HTTP-only cookie or context).
Also applies to: 31-33
frontend/biome.json (2)
9-9: Duped include patterns are lowkey sus

"includes": ["**", "!dist", "**", "!dist"] has duplicates. Not harmful, just messy—drop the duplicates to keep config tidy.
Apply:
```diff
- "includes": ["**", "!dist", "**", "!dist"]
+ "includes": ["**", "!dist"]
```
391-405: yo frfr – noTsIgnore is singular, no shadowing detected

Checked frontend/biome.json: only one "noTsIgnore": "error" at line 404, so there’s no duplicate to brick the build. CI vibe stays 🔥.
- Keep the hard error on noTsIgnore to catch ts-ignores early.
- If CI noise ramps up, consider downgrading to a warning on incremental PRs first.
- Optionally, group all TS rule overrides in one block later to avoid any future rule shadowing.
backend/config/config.go (2)
27-28: Dial the log level + attach err context

Missing .env isn’t necessarily an error in containerized deploys. Warn (or Info) with the actual error for observability.
Apply:
```diff
- slog.Error("No .env file found, using environment variables")
+ slog.Warn("loading .env failed; falling back to OS env", "error", err)
```
35-45: Defaulting sensitive config is risky in prod

Hard defaults for DB creds and JWT secret are fine for local dev but scary in prod. Consider enforcing required envs when APP_ENV=prod and fail fast.
I can drop a small guard that checks JWT_SECRET and DB_PASSWORD when APP_ENV=prod; lmk if you want a patch.
backend/internal/repositories/interfaces.go (1)
10-14: dbExecutor interface is clean; export it if you plan cross-package tests

Unexported is fine inside repositories, but if tests live in package repositories_test or other packages need it, consider exporting (DBExecutor). Also, you might later want Begin/Commit for tx flows.
Apply:
```diff
-type dbExecutor interface {
+type DBExecutor interface {
 	Exec(ctx context.Context, sql string, arguments ...any) (pgconn.CommandTag, error)
 	Query(ctx context.Context, sql string, args ...any) (pgx.Rows, error)
 	QueryRow(ctx context.Context, sql string, args ...any) pgx.Row
 }
```

Follow-up in constructors: accept the interface type so mocks slide right in.
Want me to wire a pgxmock-based test scaffold for repos?
backend/internal/repositories/post_repository.go (2)
25-30: Use UTC timestamps to avoid time drift

Persisting local time can backfire across regions and DST. Use time.Now().UTC() for CreatedAt/UpdatedAt.
Apply:
```diff
- now := time.Now()
+ now := time.Now().UTC()
```
37-51: Set the ID back on the entity for callers (ergonomics boost)

You already return the id; setting post.ID makes downstream code cleaner.
Apply:
```diff
 var postID int
 err := r.db.QueryRow(ctx, query,
 	@@
 ).Scan(&postID)
 if err != nil {
 	return 0, fmt.Errorf("failed to create post: %w", err)
 }
-return postID, nil
+post.ID = postID
+return postID, nil
```

backend/internal/models/media.go (2)
7-21: Low-key upgrade: add strong types/consts for status + UUID type.

ID string works, but uuid.UUID prevents invalid IDs flowing through. Status string is fragile; introduce typed consts (e.g., Pending/Uploaded/Used) to avoid misspells across services/repos.

Example constants to add in this package:
```go
type MediaStatus string

const (
	MediaStatusPending  MediaStatus = "pending"
	MediaStatusUploaded MediaStatus = "uploaded"
	MediaStatusUsed     MediaStatus = "used"
)
```

Then migrate Status string → Status MediaStatus and gate repo/service inputs with these.
18-20: Nullable vibes: consider pointers/omitempty for optionals.

ExpiresAt may not always be set; likewise PostID/ProfilePictureUserID may be absent. If this struct ever hits JSON, omitempty on pointers helps keep payloads clean. If it never leaves the backend, feel free to ignore.

```diff
- ExpiresAt            time.Time `json:"expires_at"`
- PostID               *int      `json:"post_id"`
- ProfilePictureUserID *int      `json:"profile_picture_user_id"`
+ ExpiresAt            time.Time `json:"expires_at"` // or *time.Time if truly optional
+ PostID               *int      `json:"post_id,omitempty"`
+ ProfilePictureUserID *int      `json:"profile_picture_user_id,omitempty"`
```

backend/internal/models/post.go (1)
12-15: Soft take: pick either Deleted or DeletedAt as the source of truth.

Carrying both a boolean flag and a nullable timestamp can drift. Many codebases key off DeletedAt != nil alone.

backend/internal/dto/post_dto.go (1)
5-9: Trim the fluff: prevent “all spaces” descriptions sneaking through.
required,min=1 counts whitespace. If you want non-blank content, trim before validate or add a custom rule.

Example in handler before h.validator.Struct(req):

```go
req.Description = strings.TrimSpace(req.Description)
```

Or register a notblank custom validator and use validate:"required,notblank,min=1,max=1000".

backend/internal/handlers/post_handler.go (4)
53-59: Don’t vibe-check error strings; use typed/sentinel errors.
strings.Contains(err.Error(), "unauthorized") is brittle. Prefer exported sentinel errors or errors.Is checks from the service layer.
```go
// services package
var (
	ErrUnauthorized   = errors.New("unauthorized")
	ErrInvalidMediaID = errors.New("invalid or inaccessible media ID")
)

// handler
switch {
case errors.Is(err, services.ErrUnauthorized):
	...
case errors.Is(err, services.ErrInvalidMediaID), errors.Is(err, services.ErrValidation):
	...
default:
	...
}
```
65-68: Pre-encode response to avoid half-writes after 201.

If Encode fails after headers are out, ErrorResponse can’t change status and you end up with a mixed body. Buffer first, then write headers.

```diff
-w.Header().Set("Content-Type", "application/json")
-w.WriteHeader(http.StatusCreated)
-if err := json.NewEncoder(w).Encode(postResponse); err != nil {
+w.Header().Set("Content-Type", "application/json")
+var buf bytes.Buffer
+if err := json.NewEncoder(&buf).Encode(postResponse); err != nil {
 	slog.Error("Failed to encode post response", "err", err, "post_id", postResponse.ID)
 	common.ErrorResponse(w, http.StatusInternalServerError, "Failed to encode response", nil)
-}
+	return
+}
+w.WriteHeader(http.StatusCreated)
+_, _ = w.Write(buf.Bytes())
```

Note: add bytes to imports.
29-33: Nit: route should gate methods; handler check is redundant.

If your router already mounts this under POST-only, this guard adds noise.
45-46: Tiny privacy tweak: don’t echo raw validation errors to clients.

Passing err into ErrorResponse may leak internal field names/struct layout. Send a generic message and, if needed, a structured list of safe field errors.

```diff
- common.ErrorResponse(w, http.StatusBadRequest, "Validation failed", err)
+ common.ErrorResponse(w, http.StatusBadRequest, "Validation failed", nil)
+ // Optionally map validator errs to a public-friendly []FieldError and include that instead.
```
30-31: Mark helper as a helper.

So failures point to the caller, not CreateTestMedia.

```diff
 func CreateTestMedia(t *testing.T, pool *pgxpool.Pool, userID int, status string) string {
+	t.Helper()
```
64-67: Define testUploadDir in a shared test helper for clarity.

It’s referenced across tests; ensure it’s a single source of truth (same package is fine). If it already exists in another _test.go, ignore me.
168-183: Add a couple quick asserts for extra signal.
- Assert Content-Type is application/json for 201s.
- Assert resp.UserID == testUserID to catch mismatched claims plumbing.
Example snippet inside the success branch:

```go
if tt.expectedStatus == http.StatusCreated {
	if ct := rr.Header().Get("Content-Type"); !strings.Contains(ct, "application/json") {
		t.Errorf("expected application/json, got %q", ct)
	}
}
if resp.UserID != testUserID {
	t.Errorf("expected user_id %d, got %d", testUserID, resp.UserID)
}
```
197-210: Cleanups are valid, but consider deferring file removal.

If assertions fail before reaching removal, test files linger. t.Cleanup with a guard on path existence can keep the FS tidy.

```go
if tt.requestBody.MediaID != "" {
	// ...
	p := mediaInfo.TempLocalPath
	t.Cleanup(func() { _ = os.Remove(p) })
}
```

backend/internal/repositories/user_repository_test.go (1)
56-60: Regex for ExpectQuery is brittle af, low-key might flake

Relying on a multiline regex with escaped parens can get flaky if whitespace changes. Consider regexp.QuoteMeta or a looser pattern that anchors key tokens only.

```diff
- mockPool.ExpectQuery(`
-	INSERT INTO users \(full_name, email, password, bio, profile_picture, created_at, updated_at, deleted, deleted_at\)
-	VALUES \(\$1, \$2, \$3, \$4, \$5, \$6, \$7, \$8, \$9\)
-	RETURNING id`).
+ mockPool.ExpectQuery(regexp.QuoteMeta(`
+	INSERT INTO users (full_name, email, password, bio, profile_picture, created_at, updated_at, deleted, deleted_at)
+	VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9)
+	RETURNING id`)).
```
46-53: Too-large detection via string compare = brittle, use canonical errors

Comparing err.Error() == "http: request body too large" is shaky. That string comes from http.MaxBytesReader, so prefer errors.As against *http.MaxBytesError (the typed error it returns since Go 1.19), plus errors.Is against multipart.ErrMessageTooLarge for multipart parsing; handle read-time errors from io.Copy similarly.

```diff
+ import (
+ 	"errors"
+ 	"net/http"
+ )
 ...
+ var maxBytesErr *http.MaxBytesError
- } else if err.Error() == "http: request body too large" {
+ } else if errors.As(err, &maxBytesErr) {
 ...
```

Also, for reads during streaming, if copyErr matches multipart.ErrMessageTooLarge, return 413.
97-105: Use 415 Unsupported Media Type for blocked types + finalize reason tweak

Wrong status code makes API feel off. 415 hits better here.

```diff
- http.Error(w, "Disallowed file type detected", http.StatusBadRequest)
+ http.Error(w, "Unsupported media type", http.StatusUnsupportedMediaType)
 go func(id string) {
- 	m.mediaService.FinalizeFileUpload(context.Background(), id, 0, "failed_sniff")
+ 	m.mediaService.FinalizeFileUpload(context.Background(), id, 0, "unsupported_media")
 }(mediaResp.ID)
```
70-74: Don’t defer Close() inside loop; close parts eagerly

Deferring part.Close() per iteration stacks up and can leak fds if clients spam form fields. Close each part right after handling it.

```diff
- defer part.Close() // Ensure part is closed after processing
+ // Close each part after processing to avoid piling up defers
```

Then explicitly part.Close() in both branches once you’re done reading.

Also applies to: 134-137
20-23: Validator field unused rn

You’re injecting a validator but not using it here. Either validate form metadata or drop the field to keep it lean.
Also applies to: 25-30
backend/migrations/20250817154124_create_media_table/up.sql (2)
10-16: Consider ON DELETE actions to avoid orphan chaos
user_id, post_id, and profile_picture_user_id have no ON DELETE behavior. Think about ON DELETE CASCADE or SET NULL per relation to match domain semantics.
1-1: uuid-ossp vs gen_random_uuid() glow-up

If infra allows, pgcrypto’s gen_random_uuid() is often preferred over uuid-ossp in managed PG. Not blocking, just a thought.
118-129: scanUser + GetAll scan are consistent, but low-key duplication

You have scanUser for single rows but inline rows.Scan in GetAll. Optional: reuse scanUser by scanning into a temporary pgx.Rows wrapper or refactor a scanUserFields(...) helper. Not blocking.

Also applies to: 137-141
85-95: Error messages could carry IDs/emails for observability

“user not found” is vague. Consider %w with sentinel or include the lookup key to help logs. Not blocking.

Also applies to: 97-107
backend/internal/handlers/media_handler_test.go (1)
123-123: frfr this test data is kinda sus ngl

The file content for testing a JPEG upload is not actually valid JPEG data - it's just a text string. While this might work for basic testing, it could miss issues related to actual binary file handling.
Consider using actual small valid JPEG bytes or at least binary data:
```diff
- fileContent: "a beautiful redhead damsel sitting alone in a forest",
+ // Minimal valid JPEG header
+ fileContent: string([]byte{0xFF, 0xD8, 0xFF, 0xE0, 0x00, 0x10, 0x4A, 0x46, 0x49, 0x46}),
```

backend/internal/dto/media_dto.go (1)
7-9: yo this TODO is lowkey mysterious fr

The MediaUploadRequest struct has a TODO comment in Japanese ("nani wo suru?" = "what to do?") but no fields. This empty struct seems incomplete.
Would you like me to help implement the MediaUploadRequest struct with appropriate validation fields, or should I open an issue to track this TODO?
backend/internal/services/media_service.go (2)
51-52: no cap, using Mkdir instead of MkdirAll is kinda mid

Using os.Mkdir will fail if parent directories don't exist. Consider using os.MkdirAll for better robustness.

```diff
- return os.Mkdir(mediaFolderName, 0755)
+ return os.MkdirAll(mediaFolderName, 0755)
```
161-167: sheesh, fire-and-forget goroutine hits different but could be problematic

Spawning a goroutine to delete files without tracking completion could lead to race conditions or incomplete cleanup if the service shuts down.
Consider using a channel or wait group to track cleanup operations, or just remove the file synchronously since it's already an error path:
```diff
- go func() {
- 	if media.TempLocalPath != "" {
- 		os.Remove(media.TempLocalPath)
- 		slog.Info("Removed disallowed file post-upload", "path", media.TempLocalPath)
- 	}
- }()
+ if media.TempLocalPath != "" {
+ 	if err := os.Remove(media.TempLocalPath); err != nil {
+ 		slog.Error("Failed to remove disallowed file", "path", media.TempLocalPath, "err", err)
+ 	} else {
+ 		slog.Info("Removed disallowed file post-upload", "path", media.TempLocalPath)
+ 	}
+ }
```
12-13: yo these import comments are lowkey unnecessary fr

The "ADD THIS IMPORT" comments don't add value and should be removed.

```diff
- "github.com/escuadron-404/red404/backend/pkg/middleware" // <--- ADD THIS IMPORT
- "github.com/escuadron-404/red404/backend/pkg/utils"      // <--- ADD THIS IMPORT
+ "github.com/escuadron-404/red404/backend/pkg/middleware"
+ "github.com/escuadron-404/red404/backend/pkg/utils"
```
9-18: Separate internal model vs API DTO to avoid accidental exposure.

Right now the core User type is wired with JSON tags (including soft-delete fields) and could be marshaled by handlers. Consider splitting:
- persistence model (DB scan/write)
- API DTO (public view without sensitive fields)
You already have UserWithoutPassword, but it’s missing new profile fields. Either expand it or introduce a UserPublicDTO that mirrors all non-sensitive fields. This keeps the hot path clean and prevents future regressions if someone encodes models.User directly.

If you want, I can draft a UserPublicDTO and mapper funcs.
10-10: Trim null spam: add omitempty to optional fields.

Pointer optional fields currently JSON to null. If you want cleaner payloads, add omitempty so absent values just drop. Vibes > nulls.

Apply this diff:

```diff
- FullName       *string    `json:"full_name"`
+ FullName       *string    `json:"full_name,omitempty"`
- Bio            *string    `json:"bio"`
- ProfilePicture *string    `json:"profile_picture"`
+ Bio            *string    `json:"bio,omitempty"`
+ ProfilePicture *string    `json:"profile_picture,omitempty"`
- DeletedAt      *time.Time `json:"deleted_at"`
+ DeletedAt      *time.Time `json:"deleted_at,omitempty"`
```

Also applies to: 13-14, 18-18
backend/cmd/server/main.go (7)
28-30: Boost observability on logs.

Slog migration is clean, W. Add AddSource and a static component attr so logs hit different when tailing prod.

Apply this diff:

```diff
- logger := slog.New(slog.NewJSONHandler(os.Stdout, nil))
- slog.SetDefault(logger)
+ logger := slog.New(slog.NewJSONHandler(os.Stdout, &slog.HandlerOptions{AddSource: true}))
+ slog.SetDefault(logger.With(slog.String("component", "api")))
```
49-49: Nix the meme comment.

Keep comments crisp for future you.

```diff
- defer db.Close() // closeyoself
+ defer db.Close()
```
52-55: Tidy the multiline if; gofmt vibes.

The newline between != and nil is low-key cursed. Flatten it for readability.

```diff
- if err := runMigrations(db.Pool); err !=
- 	nil {
+ if err := runMigrations(db.Pool); err != nil {
```
89-92: Harden against slowloris: set ReadHeaderTimeout.

Timeouts are chef’s kiss. Add ReadHeaderTimeout to cover header parse time.

```diff
 server := &http.Server{
 	Addr:    ":" + cfg.ServerPort,
 	Handler: mux,
+	ReadHeaderTimeout: 5 * time.Second,
 	ReadTimeout:  15 * time.Second,
 	WriteTimeout: 30 * time.Second,
 	IdleTimeout:  60 * time.Second,
 }
```
98-101: Background job shutdown semantics.

Cleanup goroutine is chill and cancelable. For extra smooth shutdown, register the cancel with the server so Ctrl+C and programmatic shutdown both converge:
Example (add after server creation):
```go
server.RegisterOnShutdown(cleanupCancel)
```

This removes the need to call cleanupCancel() manually at Line 115 and avoids double-cancel if the shutdown path changes later.
103-109: Use a dedicated error channel instead of faking a signal.

Pushing SIGTERM into the signal chan works but is a little scuffed. Prefer errCh := make(chan error, 1) and select on quit vs errCh to disambiguate OS signals from server errors.
140-140: Wrap errors with %w for stackable context.

Tiny but mighty. Use %w so callers can errors.Is/As.

```diff
- return fmt.Errorf("failed to create migrations table: %v", err)
+ return fmt.Errorf("failed to create migrations table: %w", err)
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
⛔ Files ignored due to path filters (2)
- backend/go.sum is excluded by !**/*.sum
- frontend/package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (29)
- backend/cmd/server/main.go (4 hunks)
- backend/config/config.go (2 hunks)
- backend/go.mod (1 hunks)
- backend/internal/dto/media_dto.go (1 hunks)
- backend/internal/dto/post_dto.go (1 hunks)
- backend/internal/handlers/media_handler.go (1 hunks)
- backend/internal/handlers/media_handler_test.go (1 hunks)
- backend/internal/handlers/post_handler.go (1 hunks)
- backend/internal/handlers/post_handler_test.go (1 hunks)
- backend/internal/models/media.go (1 hunks)
- backend/internal/models/post.go (1 hunks)
- backend/internal/models/user.go (1 hunks)
- backend/internal/repositories/interfaces.go (1 hunks)
- backend/internal/repositories/media_repository.go (1 hunks)
- backend/internal/repositories/post_repository.go (1 hunks)
- backend/internal/repositories/user_repository.go (6 hunks)
- backend/internal/repositories/user_repository_test.go (1 hunks)
- backend/internal/routes/routes.go (1 hunks)
- backend/internal/services/media_service.go (1 hunks)
- backend/internal/services/post_service.go (1 hunks)
- backend/migrations/20250817154124_create_media_table/up.sql (1 hunks)
- backend/pkg/middleware/cors.go (0 hunks)
- frontend/biome.json (5 hunks)
- frontend/package.json (1 hunks)
- frontend/src/assets/icons/MetaIcon.tsx (2 hunks)
- frontend/src/components/UI/CommentsSection.tsx (2 hunks)
- frontend/src/index.css (1 hunks)
- frontend/src/pages/loginPage.tsx (4 hunks)
- frontend/src/pages/registerPage.tsx (4 hunks)
💤 Files with no reviewable changes (1)
- backend/pkg/middleware/cors.go
🧰 Additional context used
🧬 Code graph analysis (12)
backend/internal/handlers/post_handler_test.go (9)
- backend/internal/models/media.go (1): Media (7-21)
- backend/internal/handlers/media_handler_test.go (2): SetupTestDB (36-57), CreateTestUser (60-77)
- backend/internal/services/media_service.go (1): NewMediaService (41-46)
- backend/internal/services/post_service.go (1): NewPostService (30-42)
- backend/internal/handlers/post_handler.go (1): NewPostHandler (21-26)
- backend/pkg/utils/jwt.go (1): Claims (12-16)
- backend/internal/dto/post_dto.go (2): PostCreateRequest (6-9), PostResponse (12-19)
- backend/pkg/middleware/auth.go (1): UserContextKey (15-15)
- backend/internal/models/post.go (1): Post (7-16)
backend/internal/repositories/post_repository.go (1)
- backend/internal/models/post.go (1): Post (7-16)
backend/internal/handlers/media_handler_test.go (8)
- backend/internal/repositories/user_repository.go (1): NewUserRepository (26-28)
- backend/internal/models/user.go (1): User (8-19)
- backend/internal/repositories/media_repository.go (1): NewMediaRepository (28-30)
- backend/internal/services/media_service.go (1): NewMediaService (41-46)
- backend/internal/handlers/media_handler.go (1): NewMediaUploadHandler (25-30)
- backend/pkg/utils/jwt.go (1): Claims (12-16)
- backend/pkg/middleware/auth.go (1): UserContextKey (15-15)
- backend/internal/dto/media_dto.go (1): MediaUploadResponse (11-19)
backend/internal/handlers/media_handler.go (2)
- backend/internal/services/media_service.go (1): MediaService (26-33)
- backend/internal/dto/media_dto.go (1): MediaUploadResponse (11-19)
backend/internal/repositories/user_repository_test.go (2)
- backend/internal/repositories/user_repository.go (1): NewUserRepository (26-28)
- backend/internal/models/user.go (1): User (8-19)
backend/internal/services/post_service.go (7)
- backend/internal/dto/post_dto.go (2): PostCreateRequest (6-9), PostResponse (12-19)
- backend/internal/repositories/post_repository.go (1): PostRepository (12-14)
- backend/internal/services/media_service.go (1): MediaService (26-33)
- backend/internal/models/media.go (1): Media (7-21)
- backend/internal/models/post.go (1): Post (7-16)
- backend/pkg/utils/jwt.go (1): Claims (12-16)
- backend/pkg/middleware/auth.go (1): UserContextKey (15-15)
backend/internal/services/media_service.go (4)
- backend/internal/dto/media_dto.go (1): MediaUploadResponse (11-19)
- backend/internal/models/media.go (1): Media (7-21)
- backend/internal/repositories/media_repository.go (1): MediaRepository (12-22)
- backend/internal/services/post_service.go (1): GetUserFromContext (129-134)
backend/internal/handlers/post_handler.go (2)
- backend/internal/services/post_service.go (1): PostService (19-21)
- backend/internal/dto/post_dto.go (1): PostCreateRequest (6-9)
backend/cmd/server/main.go (8)
- backend/internal/repositories/media_repository.go (1): NewMediaRepository (28-30)
- backend/internal/repositories/post_repository.go (1): NewPostRepository (20-22)
- backend/internal/services/media_service.go (1): NewMediaService (41-46)
- backend/internal/services/post_service.go (1): NewPostService (30-42)
- backend/internal/handlers/media_handler.go (1): NewMediaUploadHandler (25-30)
- backend/internal/handlers/post_handler.go (1): NewPostHandler (21-26)
- backend/pkg/middleware/auth.go (1): NewAuthMiddleware (21-23)
- backend/internal/routes/routes.go (1): SetupRoutes (12-30)
backend/internal/repositories/media_repository.go (1)
- backend/internal/models/media.go (1): Media (7-21)
backend/internal/repositories/user_repository.go (1)
- backend/internal/models/user.go (1): User (8-19)
backend/internal/routes/routes.go (7)
- backend/internal/handlers/user_handler.go (1): UserHandler (16-19)
- backend/internal/handlers/auth_handler.go (1): AuthHandler (15-18)
- backend/internal/handlers/media_handler.go (1): MediaHandler (20-23)
- backend/internal/handlers/post_handler.go (1): PostHandler (16-19)
- backend/pkg/middleware/auth.go (1): AuthMiddleware (17-19)
- backend/internal/routes/front_handler_prod.go (1): RegisterFrontendHandlers (58-65)
- backend/internal/routes/front_handler_dev.go (1): RegisterFrontendHandlers (14-37)
🔇 Additional comments (19)
frontend/src/index.css (1)
28-30: yo frfr this PR tweak is clean: scoped Biome ignore right where it matters.

The inline /* biome-ignore ... */ right above @theme is tight—suppresses the false-positive without muting the rule globally. Readability boost with the extra blank line too. Ship it.

backend/go.mod (1)
8-8: yo frfr this PR is 🔥 – CORS imports cleaned and go directive in place, just verify your CI Go toolchain
- No more github.com/rs/cors imports found under backend/ (stray CORS refs ✅).
- backend/go.mod has go 1.23.2 set (toolchain directive ✅).
- 👉 Please double-check that your CI/build images are actually running Go 1.23.x or that you’ve wired in GOTOOLCHAIN for auto-installation to prevent build failures.
- Ensure CORS is still handled upstream (e.g. reverse proxy or your HTTP stack) so you don’t break cross-origin requests in dev/prod.
frontend/src/pages/registerPage.tsx (2)
59-65: Dynamic IDs with useId = accessibility W. Linking label/htmlFor to the inputs with a stable id per render is clean. This pattern is consistent and chef’s kiss.

Also applies to: 73-79
1-1: yo frfr 🚨 critical: switch router imports to react-router-dom

You’re pulling DOM-specific APIs (BrowserRouter, Link, NavLink, useNavigate, Outlet, etc.) from "react-router", which will blow up at runtime—these belong in "react-router-dom".

Files to fix:
- frontend/src/main.tsx (line 4)
- frontend/src/pages/loginPage.tsx (line 2)
- frontend/src/pages/registerPage.tsx (line 2)
- frontend/src/pages/ProfilePage.tsx (line 3)
- frontend/src/layouts/HomeLayout.tsx (line 1)
- frontend/src/components/UI/ProfileComponents/Tabs.tsx (line 5)
- frontend/src/components/NavBarComponents/NavBarLink.tsx (line 2)
Apply this diff everywhere:

-import { BrowserRouter, Route, Routes } from "react-router";
+import { BrowserRouter, Route, Routes } from "react-router-dom";
-import { Link, useNavigate } from "react-router";
+import { Link, useNavigate } from "react-router-dom";
-import { NavLink, Route, Routes } from "react-router";
+import { NavLink, Route, Routes } from "react-router-dom";
-import { Outlet } from "react-router";
+import { Outlet } from "react-router-dom";
-import { NavLink } from "react-router";
+import { NavLink } from "react-router-dom";
-import { Link } from "react-router";
+import { Link } from "react-router-dom";

Also ensure react-router-dom is listed in your package.json deps and installed so these imports resolve correctly.

Likely an incorrect or invalid review comment.
frontend/src/pages/loginPage.tsx (1)
1-14: ✅ yo frfr useId is safe on React 19.1.0 & Link import is legit — PR is 🔥
- frontend/package.json shows "react": "^19.1.0" → useId is supported
- frontend/package.json has "react-router": "^7.8.0" and loginPage.tsx imports Link, useNavigate from "react-router" → imports line up

All checked—no further action needed, merging this W move.
frontend/biome.json (1)
2-2: Schema bump to 2.2.0 is clean. Upgrading the $schema looks valid and aligns with a more recent Biome. Ship it.
backend/config/config.go (1)
5-5: slog import swap is a W. Moving to log/slog keeps logging consistent with the rest of the backend.
backend/internal/models/post.go (1)
7-16: Yo frfr: Post model looks clean and aligned with DTOs. Field set and JSON tags are coherent; pointer for ImageURL is appropriate at the model layer. No blockers.

backend/internal/dto/post_dto.go (1)

13-19: LGTM on response DTO. ImageURL as string with omitempty plays nicely with clients. Timestamps are clear.

backend/internal/handlers/post_handler_test.go (1)
105-112: Yo frfr: this test will fail unless MediaID gains omitempty. “Successful Post without Media” expects 201, but DTO uses validate:"uuid". After you switch to validate:"omitempty,uuid", this case should pass.

backend/internal/repositories/user_repository_test.go (2)
32-41: Clean pgxmock setup, vibes immaculate. Using pgxmock.NewPool() and wiring it into NewUserRepository is crisp. TestMain slog default also hits.

87-104: Solid negative-path coverage, ngl this is clutch. Asserting err contains "database connection lost" + expectations check = chef’s kiss.
backend/internal/routes/routes.go (2)
50-57: UserRoutes adds Update/Delete — align with handler impls. Double-check UpdateUser and DeleteUser exist and return proper codes; if not fully implemented yet, maybe gate routes until ready.

12-18: yo frfr the repo’s already flexin’ Go 1.23.2 in backend/go.mod, so those fancy "POST /api/…" patterns work as of Go 1.22+ 🔥
– SetupRoutes signature’s synced in both routes.go and cmd/server/main.go
Just double-check your CI is rolling with Go 1.22+ too to keep it all matching 🚀

backend/internal/repositories/user_repository.go (1)
3-10: Security Vibe Check: Confirm No Direct Encoding of models.User

yo frfr 🔒 the models.User struct still has json:"password". I ran a grep for any direct json.NewEncoder(...).Encode(models.User) or []models.User calls and didn’t find hits, but that isn’t proof-positive. Please manually audit your HTTP handlers to:

• Ensure no endpoint directly marshals or encodes a models.User (or slice) back to clients
• Introduce DTO/response structs that omit the password field for all user-facing JSON

backend/internal/services/post_service.go (1)
78-90: this defer transaction handling is bussin frfr. The transaction handling with defer, recover, and proper rollback/commit logic is well implemented. Good error handling pattern!
backend/internal/models/user.go (1)
9-18: No action needed on DB tags – manual Scan used

yo frfr, we’re doing explicit row.Scan(&user.ID, …) in scanUser (backend/internal/repositories/user_repository.go), so db:"…" tags on models.User aren’t used anywhere.
– User mapping is manual, no struct-mapper or sqlx found
– UserWithoutPassword’s db tags are harmless (it’s only for JSON responses)
backend/cmd/server/main.go (2)
83-84: Route wiring looks solid. Passing media/post handlers + auth middleware into SetupRoutes tracks with the new endpoints. Ship it.

126-126: Shutdown log is clean. Graceful exit path is on point. Keep.
	Description string `json:"description" validate:"required,min=1,max=1000"`
	MediaID     string `json:"media_id,omitempty" validate:"uuid"`
}
Yo frfr: optional MediaID is actually required — validation tag will 400 text-only posts.
validate:"uuid" fails on empty strings. Add omitempty so posts without media pass validation as intended.
type PostCreateRequest struct {
Description string `json:"description" validate:"required,min=1,max=1000"`
- MediaID string `json:"media_id,omitempty" validate:"uuid"`
+ MediaID string `json:"media_id,omitempty" validate:"omitempty,uuid"`
 }

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
type PostCreateRequest struct {
	Description string `json:"description" validate:"required,min=1,max=1000"`
	MediaID     string `json:"media_id,omitempty" validate:"omitempty,uuid"`
}
🤖 Prompt for AI Agents
In backend/internal/dto/post_dto.go around lines 7 to 9, the MediaID field is
intended to be optional but currently uses validate:"uuid" which fails
validation for empty strings; update the struct tag to allow empty values by
adding the omitempty option to the validator tag (e.g.,
validate:"omitempty,uuid") so posts without media pass validation while still
enforcing UUID format when present.
// SetupTestDB creates a connection pool to a test database.
// Ensure you have a 'red404_test' database created and accessible.
func SetupTestDB(t *testing.T) *pgxpool.Pool {
	connStr := "postgres://user:password@localhost:5432/red404_test?sslmode=disable"
🛠️ Refactor suggestion
yo frfr hardcoded connection string is not bussin
The test database connection string is hardcoded, which could cause issues in different environments. This should be configurable through environment variables.
- connStr := "postgres://user:password@localhost:5432/red404_test?sslmode=disable"
+ connStr := os.Getenv("TEST_DATABASE_URL")
+ if connStr == "" {
+ connStr = "postgres://user:password@localhost:5432/red404_test?sslmode=disable"
+ }

📝 Committable suggestion
connStr := os.Getenv("TEST_DATABASE_URL")
if connStr == "" {
	connStr = "postgres://user:password@localhost:5432/red404_test?sslmode=disable"
}
🤖 Prompt for AI Agents
In backend/internal/handlers/media_handler_test.go around line 37 the Postgres
connection string is hardcoded; change the test to read the connection string
from an environment variable (e.g., TEST_DATABASE_URL or RED404_TEST_DB) and
fall back to a sensible default only if the env var is unset; update the test
setup to fail fast with a clear error if neither a configured env var nor a
valid default is appropriate for CI, and document the expected env var in the
test file comments or test setup so different environments can override the
database URL without editing code.
// onichan yamete
r.Body = http.MaxBytesReader(w, r.Body, maxUploadBytes)

reader, err := r.MultipartReader()
Remove NSFW/irrelevant comments, keep it corporate clean
Comments like "onichan yamete" don’t belong in prod. Let’s yeet them.
- // onichan yamete
 r.Body = http.MaxBytesReader(w, r.Body, maxUploadBytes)

📝 Committable suggestion
r.Body = http.MaxBytesReader(w, r.Body, maxUploadBytes)

reader, err := r.MultipartReader()
🤖 Prompt for AI Agents
In backend/internal/handlers/media_handler.go around lines 38 to 41, remove the
NSFW/irrelevant comment "onichan yamete" and replace it with a short,
professional comment (or no comment) that accurately describes the following
line (e.g., "limit request body to maxUploadBytes") so the codebase remains
corporate-clean and self-explanatory.
// sniff em real good boy
buffer := make([]byte, 512)
n, readErr := part.Read(buffer)
if readErr != nil && readErr != io.EOF {
	slog.Error("Error reading initial bytes for content type sniffing", "err", readErr, "filename", originalFilename)
	http.Error(w, "Error reading file content", http.StatusInternalServerError)
	return
}
actualContentType = http.DetectContentType(buffer[:n])

if !strings.HasPrefix(actualContentType, "image/") && !strings.HasPrefix(actualContentType, "video/") {
Double-counting bytes, result size goes brazy wrong
You prepend the sniffed bytes via io.MultiReader, so io.Copy already includes them. Adding bytesWritten += int64(n) inflates size. Also, close the writer before finalize so we don’t mark success while buffers are still unflushed.
- dstWriter, mediaResp, err := m.mediaService.PrepareFileUpload(r.Context(), originalFilename, initialContentType)
+ dstWriter, mediaResp, err := m.mediaService.PrepareFileUpload(r.Context(), originalFilename, initialContentType)
if err != nil {
...
}
- defer dstWriter.Close()
+ // we'll close explicitly before finalize
buffer := make([]byte, 512)
n, readErr := part.Read(buffer)
...
prefixedReader := io.MultiReader(bytes.NewReader(buffer[:n]), part)
bytesWritten, copyErr := io.Copy(dstWriter, prefixedReader)
if copyErr != nil {
...
return
}
- bytesWritten += int64(n) // Add the bytes read by sniff to total
+ // no-op: sniff bytes already included by MultiReader
+ if cerr := dstWriter.Close(); cerr != nil {
+ slog.Error("Error closing destination writer", "err", cerr, "filename", originalFilename, "media_id", mediaResp.ID)
+ http.Error(w, "Error finalizing file write", http.StatusInternalServerError)
+ return
+ }
slog.Info("File successfully streamed to disk", "media_id", mediaResp.ID, "filename", originalFilename, "bytes_written", bytesWritten)
 err = m.mediaService.FinalizeFileUpload(r.Context(), mediaResp.ID, bytesWritten, actualContentType)

Also applies to: 106-112, 120-126
🤖 Prompt for AI Agents
In backend/internal/handlers/media_handler.go around lines 87-97 (and similarly
at 106-112 and 120-126), the handler is double-counting the initial sniffed
bytes by adding int64(n) after using io.MultiReader + io.Copy, and it also does
not close the writer before finalizing success; remove the explicit bytesWritten
+= int64(n) adjustments so io.Copy's return is the sole source of written byte
count, and ensure the destination writer (or multipart/writer/flushable) is
properly closed/closed/Flushed before marking the upload as successful or
committing any metadata.
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(http.StatusCreated)
if err := json.NewEncoder(w).Encode(mediaUploadResponse); err != nil {
	slog.Error("Failed to encode media upload response to JSON", "err", err, "media_id", mediaUploadResponse.ID)
	http.Error(w, "File uploaded, but failed to return response", http.StatusInternalServerError)
}
🛠️ Refactor suggestion
Don’t write 201 then error; encode first, then write
If JSON encoding fails after WriteHeader(201), client sees 201 with an error body—bad UX. Buffer encode first.
- w.Header().Set("Content-Type", "application/json")
- w.WriteHeader(http.StatusCreated)
- if err := json.NewEncoder(w).Encode(mediaUploadResponse); err != nil {
+ respBytes, encErr := json.Marshal(mediaUploadResponse)
+ if encErr != nil {
+ slog.Error("Failed to encode media upload response to JSON", "err", encErr, "media_id", mediaUploadResponse.ID)
+ http.Error(w, "File uploaded, but failed to return response", http.StatusInternalServerError)
+ return
+ }
+ w.Header().Set("Content-Type", "application/json")
+ w.WriteHeader(http.StatusCreated)
+ if _, err := w.Write(respBytes); err != nil {
slog.Error("Failed to encode media upload response to JSON", "err", err, "media_id", mediaUploadResponse.ID)
http.Error(w, "File uploaded, but failed to return response", http.StatusInternalServerError)
 }

📝 Committable suggestion
// Buffer the response so we can catch encoding errors before sending headers
respBytes, encErr := json.Marshal(mediaUploadResponse)
if encErr != nil {
	slog.Error("Failed to encode media upload response to JSON", "err", encErr, "media_id", mediaUploadResponse.ID)
	http.Error(w, "File uploaded, but failed to return response", http.StatusInternalServerError)
	return
}
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(http.StatusCreated)
if _, err := w.Write(respBytes); err != nil {
	slog.Error("Failed to write media upload response", "err", err, "media_id", mediaUploadResponse.ID)
	http.Error(w, "File uploaded, but failed to return response", http.StatusInternalServerError)
}
🤖 Prompt for AI Agents
In backend/internal/handlers/media_handler.go around lines 152 to 157, the
handler currently calls w.WriteHeader(http.StatusCreated) before encoding the
JSON which can result in sending a 201 even if encoding fails; instead encode
the response into a buffer (or marshal to bytes) first, handle any encode error
(log and send an appropriate http.Error with a non-201 status), and only after
successful encoding set the Content-Type, call WriteHeader(http.StatusCreated)
and write the buffered bytes to the ResponseWriter.
 func (r *userRepository) Update(ctx context.Context, user *models.User) error {
-	query := `UPDATE users SET email = $1, password = $2, updated_at = $3 WHERE id = $4`
-	_, err := r.db.Exec(ctx, query, user.Email, user.Password, user.UpdatedAt, user.ID)
-	return err
+	user.UpdatedAt = time.Now() // Update timestamp on modification
+
+	query := `
+		UPDATE users
+		SET full_name = $1, email = $2, password = $3, bio = $4, profile_picture = $5, updated_at = $6, deleted = $7, deleted_at = $8
+		WHERE id = $9
+	`
+	cmdTag, err := r.db.Exec(ctx, query,
+		user.FullName,
+		user.Email,
+		user.Password,
+		user.Bio,
+		user.ProfilePicture,
+		user.UpdatedAt,
+		user.Deleted,
+		user.DeletedAt,
+		user.ID,
+	)
+	if err != nil {
🛠️ Refactor suggestion
Same encoding issue hits Update, run it back
Update needs the same *string → value/nil mapping.
cmdTag, err := r.db.Exec(ctx, query,
- user.FullName,
- user.Email,
- user.Password,
- user.Bio,
- user.ProfilePicture,
+ strPtrToDBVal(user.FullName),
+ user.Email,
+ user.Password,
+ strPtrToDBVal(user.Bio),
+ strPtrToDBVal(user.ProfilePicture),
user.UpdatedAt,
user.Deleted,
user.DeletedAt,
user.ID,
 )

📝 Committable suggestion
func (r *userRepository) Update(ctx context.Context, user *models.User) error {
	user.UpdatedAt = time.Now() // Update timestamp on modification

	query := `
		UPDATE users
		SET full_name = $1, email = $2, password = $3, bio = $4, profile_picture = $5, updated_at = $6, deleted = $7, deleted_at = $8
		WHERE id = $9
	`
	cmdTag, err := r.db.Exec(ctx, query,
		strPtrToDBVal(user.FullName),
		user.Email,
		user.Password,
		strPtrToDBVal(user.Bio),
		strPtrToDBVal(user.ProfilePicture),
		user.UpdatedAt,
		user.Deleted,
		user.DeletedAt,
		user.ID,
	)
	if err != nil {
		// handle error
	}
	// ...
}
🤖 Prompt for AI Agents
In backend/internal/repositories/user_repository.go around lines 152 to 171, the
Update method is still passing pointer string fields directly to Exec causing
encoding errors; convert any *string fields (e.g., Bio, ProfilePicture, and any
other nullable string pointers) to either their value or nil before passing to
the query. Implement or reuse a small helper like toValueOrNil(s *string)
interface{} that returns s value or nil, and use it for those Exec arguments
(and similarly handle nullable time pointers like DeletedAt if needed) so the DB
driver receives nil for nulls and actual values otherwise.
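A minimal version of the suggested helper. The name `strPtrToDBVal` comes from the review diff; the exact signature is an assumption:

```go
package main

import "fmt"

// strPtrToDBVal maps a *string to a driver-friendly value:
// nil pointer -> SQL NULL, otherwise the dereferenced string.
func strPtrToDBVal(s *string) interface{} {
	if s == nil {
		return nil
	}
	return *s
}

func main() {
	bio := "hello"
	fmt.Println(strPtrToDBVal(&bio)) // hello
	fmt.Println(strPtrToDBVal(nil))  // <nil>
}
```

The same shape works for other nullable pointers (e.g. a `*time.Time` for DeletedAt) if the driver in use doesn't already handle them.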
s.mediaRepository.MarkMediaAsUsed(context.Background(), media.ID, "failed_cleanup_file")
continue
🛠️ Refactor suggestion
bruh the error handling here is giving mixed vibes
When file deletion fails, you're marking the media as failed_cleanup_file but continuing to try to delete from DB. If both operations fail, the status might be overwritten incorrectly.
if err := os.Remove(media.TempLocalPath); err != nil {
slog.Error("Failed to delete expired media file from disk", "err", err, "path", media.TempLocalPath)
s.mediaRepository.MarkMediaAsUsed(context.Background(), media.ID, "failed_cleanup_file")
- continue
+ continue // Skip DB deletion if file deletion failed
 }

🤖 Prompt for AI Agents
In backend/internal/services/media_service.go around lines 249-250, the current
flow marks media as "failed_cleanup_file" when file deletion fails but then
continues to the DB deletion step, which can lead to overwriting statuses if the
DB operation also fails. Change the control flow so that if the file deletion
returns an error you mark the media as "failed_cleanup_file" and skip any DB
removal (continue/return) so the DB state isn't touched; only if file deletion
succeeds proceed to remove the DB record, and if the DB removal fails, log the
error and set a distinct status (e.g., "failed_cleanup_db") or leave the
existing status untouched to avoid incorrect overwrites.
return &dto.PostResponse{
	ID:       postID,
	UserID:   claims.UserID,
	ImageURL: *imageURL,
deadass this nil pointer dereference finna crash your app no cap
Dereferencing imageURL when it could be nil will cause a panic if no media is provided.
- ImageURL: *imageURL,
+ ImageURL: func() string {
+ if imageURL != nil {
+ return *imageURL
+ }
+ return ""
+ }(),

Or more cleanly, change the DTO field to accept a pointer:
// In dto/post_dto.go PostResponse struct
- ImageURL string `json:"image_url,omitempty"`
+ ImageURL *string `json:"image_url,omitempty"`
// Then in this file:
- ImageURL: *imageURL,
+ ImageURL: imageURL,

📝 Committable suggestion
// ...
ImageURL: func() string {
	if imageURL != nil {
		return *imageURL
	}
	return ""
}(),
// ...
🤖 Prompt for AI Agents
In backend/internal/services/post_service.go around line 121, the code currently
dereferences imageURL with ImageURL: *imageURL which will panic if imageURL is
nil; either change the DTO/Post struct field to a *string and assign ImageURL:
imageURL directly (preserving nil when no media) or keep the field as a string
but guard the dereference by checking if imageURL == nil and assigning an empty
string (or other safe default) when nil; update any affected call sites and
tests to use the pointer-typed DTO or the nil-checked assignment accordingly.
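Either fix reduces to a nil-safe dereference; a tiny sketch (`derefOr` is an illustrative helper, not code from the PR):

```go
package main

import "fmt"

// derefOr returns *p, or fallback when p is nil — the guard that
// `ImageURL: *imageURL` is missing for text-only posts.
func derefOr(p *string, fallback string) string {
	if p == nil {
		return fallback
	}
	return *p
}

func main() {
	url := "/media/abc.png"
	fmt.Println(derefOr(&url, "")) // /media/abc.png
	fmt.Println(derefOr(nil, "")) // (empty string)
}
```

Switching the DTO field to `*string` avoids even this helper, since `encoding/json` with `omitempty` simply drops a nil pointer from the output.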
stored_filename VARCHAR(255) NOT NULL UNIQUE,
file_type VARCHAR(50) NOT NULL,
Duplicate uniqueness on stored_filename, index go brrrr for no reason
You’ve got stored_filename UNIQUE on the column and also a separate CREATE UNIQUE INDEX on the same column. That’s redundant and wasteful. Pick one. Easiest: keep the column-level UNIQUE and drop the extra index.
- CREATE UNIQUE INDEX idx_media_stored_filename ON media(stored_filename);

If you prefer explicit indexes, then remove UNIQUE from the column and keep the index.
Also applies to: 22-22
🤖 Prompt for AI Agents
In backend/migrations/20250817154124_create_media_table/up.sql around lines 7-8
(and also line 22), the stored_filename column is declared with UNIQUE and there
is an additional CREATE UNIQUE INDEX on the same column which is redundant;
remove the separate CREATE UNIQUE INDEX statement (line ~22) and keep the
column-level UNIQUE constraint, or alternatively remove UNIQUE from the column
and keep the explicit index—preferably follow the reviewer and drop the extra
CREATE UNIQUE INDEX at line 22 so only the column-level UNIQUE remains.
frontend/package.json (Outdated)
 },
 "devDependencies": {
-	"@biomejs/biome": "2.1.4",
+	"@biomejs/biome": "2.2.0",
💡 Verification agent
🧩 Analysis chain
Biome bump to 2.2.0 looks good — sanity-check engines + use local bin for speed.
Nice upgrade, no cap. Two quick checks:
- Make sure your Node/npm engines in the repo/CI match (Node >= 24.4.0, npm >= 11.4.2).
- Consider using the local biome bin instead of npx to avoid the extra spawn each run.
You can verify and tighten scripts (outside this hunk):
{
"scripts": {
"lint": "biome check .",
"lint:write": "biome check --write ."
}
}

Quick version checks to run locally:
🏁 Script executed:
node -v
npm -v
npx @biomejs/biome --version

Length of output: 174
yo frfr: Node engine upgrade needed before this bump is truly 🔥
Your Biome bump to 2.2.0 itself is solid (Version 2.2.0 confirmed via npx) and npm 11.4.2 ✅, but we spotted a Node mismatch that’ll trip CI:
- engines/node in package.json (and your CI image) require ≥ 24.4.0, but you’re on v24.3.0 locally. Please update to Node ≥ 24.4.0 to satisfy the engines field.
Optional nit for speed:
- Instead of invoking Biome via npx, tighten your scripts to use the local binary directly:

{
  "scripts": {
    "lint": "biome check .",
    "lint:write": "biome check --write ."
  }
}
🤖 Prompt for AI Agents
In frontend/package.json around line 25, the Biome bump to 2.2.0 is fine but
your local Node (v24.3.0) doesn't meet the engines/node requirement (≥24.4.0)
which will fail CI; upgrade your local Node to ≥24.4.0 (or change your
development environment image) so it matches the engines field, and confirm CI
image also uses ≥24.4.0; optionally, replace npx invocation in package.json
scripts with the local biome binary by calling "biome" directly (e.g., update
lint and lint:write scripts to use biome check . and biome check --write .).
Actionable comments posted: 0
🧹 Nitpick comments (3)
frontend/src/pages/loginPage.tsx (3)
61-67: Make email input actually email + better autofill vibes

Type/email + autocomplete boosts UX and PW managers. Tiny diff:

-<input
-	type="text"
-	id={`email-${idHash}`}
+<input
+	type="email"
+	id={`email-${idHash}`}
+	name="email"
+	autoComplete="username"
75-81: Password field: give autocomplete/name some love

Helps password managers; zero downside.

-<input
-	type="password"
-	id={`password-${idHash}`}
+<input
+	type="password"
+	id={`password-${idHash}`}
+	name="password"
+	autoComplete="current-password"
11-11: yo frfr no brittle CSS/id selectors found. Dynamic IDs via useId won’t break tests—no occurrences of #email/#password or getElementById in the codebase. Optional: mirror useId in RegisterPage for consistency, fr.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
⛔ Files ignored due to path filters (1)
frontend/package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (2)
- frontend/biome.json (1 hunks)
- frontend/src/pages/loginPage.tsx (4 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- frontend/biome.json
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-08-26T03:28:41.261Z
Learnt from: ruvasqm
PR: escuadron-404/red404#33
File: frontend/src/main.tsx:3-3
Timestamp: 2025-08-26T03:28:41.261Z
Learning: React Router v7 unified the package structure - you can import BrowserRouter, Routes, Link, and all other router components directly from "react-router" instead of "react-router-dom". The react-router-dom package in v7 simply re-exports everything from react-router for compatibility.
Applied to files:
frontend/src/pages/loginPage.tsx
🧬 Code graph analysis (1)
frontend/src/pages/loginPage.tsx (2)
frontend/src/pages/registerPage.tsx (3)
- RegisterPage (6-110), e (82-83), e (67-68)

frontend/src/components/AuthComponents/Input.tsx (1)
- Input (8-18)
🔇 Additional comments (1)
frontend/src/pages/loginPage.tsx (1)
1-1: useId import is a W for a11y/SSR — ship it. Unique, deterministic ids = no collisions/hydration whiplash. Clean move, frfr.
Use a descriptive and concise title. E.g., feat: Add user authentication module or fix: Correct typo in login form
Description
A brief summary of the changes and the motivation behind them. Include context, especially for non-obvious changes.
For example:
This PR introduces a new user authentication flow using OAuth 2.0. The previous method was insecure and did not support social logins. This change addresses the security concerns and adds new functionality.
Related Issues
Link any relevant issues, tickets, or epics here.
Changes
List the key changes made in this pull request.
How to Test
Provide clear instructions for the reviewer on how to test the changes.
Example:
Check out this branch.
Run npm install and npm start.
Navigate to /login and try logging in with a new account.
Verify that the user is redirected to the dashboard after a successful login.
Checklist
Before you merge, ensure you have completed the following.
[ ] I have read the contribution guidelines and followed them.
[ ] My code follows the project's style guidelines.
[ ] I have performed a self-review of my code.
[ ] I have added comments to my code, particularly in hard-to-understand areas.
[ ] I have added tests that prove my fix is effective or my feature works.
[ ] New and existing unit tests pass locally with my changes.
[ ] I have updated the documentation where necessary.
[ ] I have run a local build and confirmed it runs without errors.
[ ] I have removed any unnecessary console.log() statements.
Summary by CodeRabbit
New Features
Improvements
Tests & Migrations
Chores