This document captures potential enhancements and known edge cases identified during code review.
Currently history.flatMap produces a flat list. Users can't easily see which items came from which upload.
Suggested approach (a rendering sketch follows this list):
- Add a heading row or subtle divider between batches
- Show a tiny thumbnail in the Source column for better context
- Example: "Upload #1", "Upload #2" section headers
- Use alternating background colors per batch
- Add a small preview thumbnail instead of just a "View" link
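A minimal rendering sketch of the batch grouping. The `HistoryBatch`/`HistoryItem` types, the `section` layout, the "Upload #n" label, and the `bg-gray-50` class are illustrative assumptions; only the `items` array is confirmed by the CSV example later in this document.

```tsx
// Sketch: group the history view by upload batch instead of flattening with history.flatMap.
type HistoryItem = { name: string; quantity: string | number };
type HistoryBatch = { items: HistoryItem[] };

function BatchedHistory({ history }: { history: HistoryBatch[] }) {
  return (
    <>
      {history.map((batch, batchIndex) => (
        // Alternating background per batch; the Tailwind class is illustrative.
        <section key={batchIndex} className={batchIndex % 2 ? "bg-gray-50" : undefined}>
          {/* Heading row per batch, e.g. "Upload #1" */}
          <h3>Upload #{batchIndex + 1}</h3>
          <ul>
            {batch.items.map((item, i) => (
              <li key={i}>
                {item.name}: {item.quantity}
              </li>
            ))}
          </ul>
        </section>
      ))}
    </>
  );
}
```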
Store history metadata (without image blobs) in localStorage:
```ts
// Load any persisted history once on mount; ignore corrupt data rather than crashing.
useEffect(() => {
  const raw = localStorage.getItem("food-analyzer-history");
  try {
    if (raw) setHistory(JSON.parse(raw));
  } catch { /* corrupt or legacy data: start with an empty history */ }
}, []);

// Persist history whenever it changes.
useEffect(() => {
  localStorage.setItem("food-analyzer-history", JSON.stringify(history));
}, [history]);
```
Note: Would need a strategy to re-associate images (e.g., cache key or re-upload).
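One option is to keep the save effect above but strip non-serializable fields first. A sketch, where `previewUrl` is a hypothetical field holding the image object URL:

```ts
// Sketch: persist only serializable metadata; drop the hypothetical previewUrl
// (object URL / blob reference) before writing to localStorage.
useEffect(() => {
  const metadataOnly = history.map(({ previewUrl, ...rest }) => rest);
  localStorage.setItem("food-analyzer-history", JSON.stringify(metadataOnly));
}, [history]);
```

Entries restored from storage then have no image, so the history UI would need to hide the preview or re-associate it as described above.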
Allow users to correct the AI's guesses or add annotations:
- Inline editable field for name/quantity (see the sketch after this list)
- Simple modal for editing
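A minimal sketch of the inline-edit option. `EditableName` and `onRename` are hypothetical; `onRename` would write the correction back into history state.

```tsx
import { useState } from "react";

// Sketch: inline-editable name field. `onRename` is a hypothetical callback that
// updates the matching item in history state; quantity could get the same treatment.
function EditableName({ name, onRename }: { name: string; onRename: (next: string) => void }) {
  const [draft, setDraft] = useState(name);
  return (
    <input
      value={draft}
      onChange={(e) => setDraft(e.target.value)}
      onBlur={() => draft.trim() && onRename(draft.trim())}
      aria-label="Edit food item name"
    />
  );
}
```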
Add a "Download CSV" button that generates a downloadable file:
```ts
const handleDownloadCSV = () => {
  const rows = ["Food Item,Quantity"];
  history.forEach((batch) => {
    batch.items.forEach((item) => {
      // Escape embedded double quotes so item names can't break the CSV row.
      rows.push(`"${item.name.replace(/"/g, '""')}",${item.quantity}`);
    });
  });
  const blob = new Blob([rows.join("\n")], { type: "text/csv" });
  const url = URL.createObjectURL(blob);
  // Create a temporary anchor to trigger the download, then release the object URL.
  const a = document.createElement("a");
  a.href = url;
  a.download = "food-items.csv";
  a.click();
  URL.revokeObjectURL(url);
};
```
- Update button text to "Copy All (for Excel/Sheets)"
- Add a tooltip explaining that the TSV format is spreadsheet-friendly (a copy-handler sketch follows)
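A sketch of the handler behind that button, assuming the same history shape as the CSV example above; `handleCopyAll` is a hypothetical name.

```ts
// Sketch: copy all items as tab-separated values, which Excel/Sheets paste into separate columns.
const handleCopyAll = async () => {
  const rows = ["Food Item\tQuantity"];
  history.forEach((batch) => {
    batch.items.forEach((item) => {
      rows.push(`${item.name}\t${item.quantity}`);
    });
  });
  // navigator.clipboard requires a secure (HTTPS) context.
  await navigator.clipboard.writeText(rows.join("\n"));
};
```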
The current canvas.toBlob call hard-codes "image/jpeg", so every upload is re-encoded as JPEG:
- Animated GIFs become a single-frame JPEG
- PNGs with transparency lose transparency (background becomes white)
Potential fix: Choose format based on input type:
```ts
const outputType = file.type === "image/png" ? "image/png" : "image/jpeg";
canvas.toBlob(..., outputType, quality);
```
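A fuller sketch of that call, assuming canvas, file, and quality are already in scope inside the existing compression routine:

```ts
// Sketch: keep PNG output for PNG inputs so transparency survives; re-encode everything else as JPEG.
const outputType = file.type === "image/png" ? "image/png" : "image/jpeg";
const blob = await new Promise<Blob | null>((resolve) =>
  // The quality argument only applies to lossy formats like JPEG; PNG ignores it.
  canvas.toBlob(resolve, outputType, outputType === "image/jpeg" ? quality : undefined)
);
if (!blob) throw new Error("Canvas encoding failed");
```

Animated GIFs would still flatten to a single frame, since the canvas only ever holds one frame.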
The compression heuristic may still produce payloads exceeding the limit for extremely large files.
Suggested fix: Add a post-compression check:
```ts
const { base64 } = await compressImage(file);
if (base64.length > maxBase64Chars) {
  throw new Error("Image is too large even after compression. Try a smaller file.");
}
```
Some older browsers (especially older Safari versions) don't fully support createImageBitmap.
Fallback approach:
```ts
// Feature-detect createImageBitmap; otherwise decode via a plain <img> element.
const imgBitmap = "createImageBitmap" in window
  ? await createImageBitmap(file)
  : await new Promise<HTMLImageElement>((resolve, reject) => {
      const img = new Image();
      img.onload = () => resolve(img);
      img.onerror = () => reject(new Error("Failed to load image"));
      img.src = URL.createObjectURL(file);
    });
```
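Both branches yield a valid CanvasImageSource, so the existing draw code works unchanged; the fallback branch should also release its object URL. A sketch, assuming ctx and canvas from the surrounding compression code (the exact draw call will differ):

```ts
// drawImage accepts both ImageBitmap and HTMLImageElement.
ctx.drawImage(imgBitmap, 0, 0, canvas.width, canvas.height);
// Only the fallback branch created an object URL; revoke it to avoid leaking memory.
if (imgBitmap instanceof HTMLImageElement) URL.revokeObjectURL(imgBitmap.src);
else imgBitmap.close(); // free the decoded bitmap
```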
The client currently assumes a JSON response even on server errors. If the server returns an HTML error page, response.json() will fail with a confusing message.
Safer approach:
```ts
const response = await fetch("/api/analyze", ...);
if (!response.ok) {
  throw new Error(`Server error (${response.status})`);
}
const data = await response.json();
```
If upload is somehow triggered multiple times rapidly:
- The compressing/analyzing flags may flicker
- History appends are safe, but the UI might look odd
Current mitigation: Input is now disabled while busy. A lightweight re-entry guard is sketched below.
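If a stronger guard is wanted, a ref-based check can drop re-entrant calls outright. A sketch using React's useRef; `handleUpload` and `analyzeFile` are hypothetical stand-ins for the existing upload pipeline.

```ts
// Inside the upload component: ignore re-entrant uploads while one is already in flight.
const busyRef = useRef(false);

const handleUpload = async (file: File) => {
  if (busyRef.current) return; // drop rapid duplicate triggers
  busyRef.current = true;
  try {
    await analyzeFile(file); // hypothetical: compress + analyze + append to history
  } finally {
    busyRef.current = false;
  }
};
```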
Some text/background combinations are borderline for accessibility:
- text-orange-600 on white
- bg-orange-100 text-orange-700 for small text
Consider using slightly darker text colors (text-orange-700, text-rose-700) for improved readability.