Peppera: Infinite Meals From One Pantry
Tell it what's in your kitchen. Peppera generates complete meals with nutrition tracking, turning a handful of ingredients into weeks of variety. An AI meal engine that proves constraint-based creativity beats infinite choice.
Overview
Constraint-Based AI Creativity
Peppera is an AI meal engine built around a deceptively simple premise: infinite meals from one pantry. You tell the system what ingredients you have on hand — eggs, rice, two chicken breasts, half an onion, some soy sauce — and it generates complete, varied meals you can actually make right now. Not recipes you need to shop for. Meals from exactly what's in your kitchen.
The system is designed around constraint-based creativity, which is the principle that AI performs better with boundaries than without. Most meal planning apps give you access to millions of recipes and expect you to filter. Peppera inverts the model: it starts with your constraints (what you have, what you don't, how many people you're feeding, dietary requirements) and generates creative solutions within those boundaries. The result is more useful than an infinite recipe database because every suggestion is something you can make immediately.
Each generated meal includes full nutrition tracking — calories, protein, carbohydrates, fats — calculated from the ingredients you specified. The system tracks what you've used across multiple meals, so you can plan a full week of cooking without accidentally allocating the same chicken breast to three different recipes. This is where the AI moves from "interesting idea generator" to "practical kitchen tool."
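The cross-meal tracking described above can be sketched as a small allocation model: each planned meal reserves quantities from a shared pantry, and a reservation fails if stock has already been committed elsewhere. This is an illustrative sketch only; the `Pantry` class and its method names are assumptions, not Peppera's actual API.

```typescript
// Illustrative sketch: reserving pantry quantities per meal so the same
// item is never allocated to two recipes. Names are hypothetical.
type Quantity = { amount: number; unit: string };

class Pantry {
  private stock = new Map<string, Quantity>();

  add(ingredient: string, amount: number, unit: string): void {
    const existing = this.stock.get(ingredient);
    if (existing && existing.unit === unit) {
      existing.amount += amount;
    } else {
      this.stock.set(ingredient, { amount, unit });
    }
  }

  // Try to reserve a quantity for a meal; fail without mutating if short.
  allocate(ingredient: string, amount: number): boolean {
    const item = this.stock.get(ingredient);
    if (!item || item.amount < amount) return false;
    item.amount -= amount;
    return true;
  }

  remaining(ingredient: string): number {
    return this.stock.get(ingredient)?.amount ?? 0;
  }
}
```

With two chicken breasts in stock, two meals can each reserve one, and a third reservation is rejected rather than silently double-counting the same ingredient.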
Built by Temisan Gerrard, a London-based Solutions Architect, Peppera demonstrates how AI can transform a mundane, repetitive task into a creative, personalised experience.
The Challenge
Limited Pantry, Not Infinite Ingredients
The fundamental challenge Peppera addresses is the gap between what recipe apps assume you have and what you actually have. Open any recipe app and search for "chicken dinner" and you'll get hundreds of results requiring ingredients you don't have, techniques you don't know, and equipment you don't own. A typical UK household keeps roughly 15–25 staple ingredients on hand at any given time, yet most recipe databases assume access to a fully stocked professional kitchen.
The technical challenge is making the AI genuinely creative within tight constraints. When you have eggs, bread, cheese, and butter, the answer "make a cheese sandwich" is technically correct but completely unhelpful. The AI needs to understand cooking techniques — grilling, frying, baking, toasting — and apply them creatively to transform the same four ingredients into distinctly different meals: cheese on toast, a fried egg sandwich with melted cheese, a French-style croque madame, cheese straws made from bread, eggs en cocotte.
"Most AI meal planners are search engines with a chatbot interface. Peppera is a constraint satisfaction engine that happens to produce meals. The difference is that every output is something you can make right now, with what you have right now."
Nutrition tracking adds mathematical constraints on top of ingredient constraints. If you're aiming for 2,000 calories per day with 120g of protein, the meal engine needs to generate combinations that hit those targets across multiple meals using the same pool of ingredients. This is a resource allocation problem disguised as a cooking problem, and it's one that LLMs are surprisingly good at solving when properly prompted with structured constraints.
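The resource allocation framing above reduces to a simple feasibility check: do the day's meals, in aggregate, land within tolerance of the calorie and protein targets? A minimal sketch, with hypothetical names and a ±10% tolerance chosen for illustration:

```typescript
// Hypothetical check that a day's generated meals collectively land
// within tolerance of the user's calorie and protein targets.
interface MealMacros {
  calories: number;
  proteinG: number;
}

function meetsDailyTargets(
  meals: MealMacros[],
  targetCalories: number,
  targetProteinG: number,
  tolerance = 0.1, // accept totals within ±10% of each target
): boolean {
  const totalCal = meals.reduce((sum, m) => sum + m.calories, 0);
  const totalPro = meals.reduce((sum, m) => sum + m.proteinG, 0);
  const within = (actual: number, target: number) =>
    Math.abs(actual - target) <= target * tolerance;
  return within(totalCal, targetCalories) && within(totalPro, targetProteinG);
}
```

A day of three meals totalling 1,950 kcal and 123g protein passes a 2,000 kcal / 120g target; a single 400 kcal meal does not, signalling the engine to generate more.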
Knowledge provenance — the ability to trace every meal suggestion back to its source — was a requirement from the start. Users need to know whether a meal combination came from a known recipe tradition (Italian, Thai, Mexican) or was an AI-generated novel combination. This transparency builds trust and helps users develop genuine cooking knowledge rather than blindly following AI suggestions.
The Solution
A Constraint-First AI Engine
The architecture follows the AI Operating Stack approach, with each layer handling a specific part of the meal generation pipeline.
Ingredient Ingestion
Users input their pantry contents through a simple form interface built with shadcn/ui components. Ingredients are normalised against a canonical database — so "chicken breast," "chicken breasts," and "boneless skinless chicken breast" all map to the same ingredient with standardised nutrition data. Quantities are captured in practical units (pieces, cups, grams) rather than requiring precise measurements. The system also captures expiry dates to prioritise ingredients that need to be used soon.
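The normalisation step described above can be sketched as cleaning free-text input and resolving it against an alias table. The alias map and helper below are illustrative assumptions, not Peppera's actual schema or matching logic (which would need fuzzier matching than exact lookup):

```typescript
// Sketch of free-text ingredient normalisation against a canonical table.
// Alias entries and function names are hypothetical examples.
const CANONICAL_ALIASES: Record<string, string> = {
  "chicken breast": "chicken_breast",
  "chicken breasts": "chicken_breast",
  "boneless skinless chicken breast": "chicken_breast",
  "chicken fillet": "chicken_breast",
  "spring onion": "scallion",
  "scallions": "scallion",
};

function normalizeIngredient(input: string): string | null {
  // Lower-case, strip digits and parentheses, collapse whitespace.
  const cleaned = input
    .toLowerCase()
    .replace(/[()\d]/g, "")
    .replace(/\s+/g, " ")
    .trim();
  // Exact lookup; a production system would add fuzzy/stemmed matching.
  return CANONICAL_ALIASES[cleaned] ?? null;
}
```

So "2 Chicken Breasts" and "Chicken (breast)" both resolve to the same canonical entry, and unknown inputs return `null` so the UI can prompt the user to clarify.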
Constraint Modelling
The AI engine receives three categories of constraints: ingredient constraints (what you have, how much, what's expiring), nutritional constraints (calorie targets, macro ratios, dietary restrictions like vegetarian or gluten-free), and preference constraints (cuisine preferences, cooking time limits, skill level). These constraints are structured as a prompt template that the LLM must satisfy simultaneously. The constraint-first approach means the AI never suggests meals requiring ingredients you don't have.
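To make the three constraint categories concrete, here is a minimal sketch of how a structured constraint object might be rendered into a prompt. The field names and wording are assumptions for illustration; the real template is server-side and not public:

```typescript
// Illustrative assembly of a constraint-first prompt from structured input.
// Field names and prompt wording are hypothetical.
interface Constraints {
  ingredients: { name: string; amount: number; unit: string }[];
  calorieTarget: number;
  dietary: string[]; // e.g. ["vegetarian", "gluten-free"]
  maxCookMinutes: number;
}

function buildConstraintPrompt(c: Constraints): string {
  const pantry = c.ingredients
    .map((i) => `- ${i.amount} ${i.unit} ${i.name}`)
    .join("\n");
  return [
    "Generate a meal using ONLY these ingredients:",
    pantry,
    `Calorie target: ~${c.calorieTarget} kcal.`,
    c.dietary.length ? `Dietary rules: ${c.dietary.join(", ")}.` : "",
    `Total cooking time under ${c.maxCookMinutes} minutes.`,
    "Do not assume any ingredient not listed above.",
  ]
    .filter(Boolean)
    .join("\n");
}
```

Because the ingredient list is injected as a hard boundary rather than a search hint, the "never suggests meals requiring ingredients you don't have" guarantee falls out of the prompt structure itself.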
Creative Generation via Qwen
Qwen, accessed through OpenRouter, serves as the creative engine. Given the structured constraints, it generates meal combinations that satisfy all requirements simultaneously. The prompt engineering is designed to encourage variety — if you've already received "egg fried rice" as a suggestion, the next generation biases away from rice-based dishes. The LLM also applies cooking knowledge: it knows that you can make a roux with butter and flour, that egg whites whip to stiff peaks, and that caramelised onions take 40 minutes. This isn't just pattern matching — it's applied culinary knowledge filtered through your constraints.
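The variety bias mentioned above could be implemented in several ways; one simple approach is scoring candidate meals by how little they overlap with recent suggestions. This is a hedged sketch of that idea, not Peppera's actual ranking logic:

```typescript
// Hypothetical variety bias: score candidates by novelty relative to
// recent suggestions, so "egg fried rice" is not followed by another
// rice-heavy dish.
function varietyScore(
  candidateIngredients: string[],
  recentMeals: string[][], // ingredient lists of recent suggestions
): number {
  const recent = new Set(recentMeals.flat());
  const overlap = candidateIngredients.filter((i) => recent.has(i)).length;
  // 1.0 = completely novel, 0.0 = fully overlapping with recent meals.
  return candidateIngredients.length === 0
    ? 0
    : 1 - overlap / candidateIngredients.length;
}
```

A candidate sharing one of two ingredients with a recent meal scores 0.5; a fully novel candidate scores 1.0 and would be preferred when ranking generations.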
Nutrition Calculation
Each generated meal is verified after generation against the nutrition database in Supabase. The system calculates actual macros from the exact ingredient quantities used in the suggested meal, not generic estimates. If the meal doesn't hit the user's nutritional targets, it's either adjusted or regenerated. This verification step prevents the common problem of AI-generated recipes that look plausible but are nutritionally wrong.
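The verify-then-regenerate pattern can be sketched as recomputing calories from a per-100g reference table and comparing against the LLM's claim. The table values below are rough illustrative figures and the function names are assumptions, not the actual Supabase schema:

```typescript
// Sketch of deterministic nutrition verification: recompute macros from
// a per-100g reference table instead of trusting the LLM's estimate.
// Values are rough illustrative figures, not authoritative data.
const NUTRITION_PER_100G: Record<string, { kcal: number; proteinG: number }> = {
  chicken_breast: { kcal: 165, proteinG: 31 },
  rice_cooked: { kcal: 130, proteinG: 2.7 },
  egg: { kcal: 143, proteinG: 12.6 },
};

function computeCalories(
  portions: { ingredient: string; grams: number }[],
): number {
  return portions.reduce((total, p) => {
    const ref = NUTRITION_PER_100G[p.ingredient];
    if (!ref) throw new Error(`Unknown ingredient: ${p.ingredient}`);
    return total + (ref.kcal * p.grams) / 100;
  }, 0);
}

// Accept the LLM's meal only if its claimed calories are close to ours;
// otherwise adjust or regenerate.
function verifyClaim(
  claimedKcal: number,
  actualKcal: number,
  tolerance = 0.15,
): boolean {
  return Math.abs(claimedKcal - actualKcal) <= actualKcal * tolerance;
}
```

A meal of 150g chicken breast and 200g cooked rice computes to about 508 kcal, so an LLM claim of 500 kcal passes while a claim of 450 kcal for a 680 kcal meal is rejected.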
Knowledge Provenance
Every generated meal is tagged with its derivation path. If the meal closely matches a known recipe from a specific cuisine tradition, it's tagged with that origin. If it's a novel AI-generated combination, it's labelled as such. Users can see why the AI suggested combining these particular ingredients — whether it's drawing on a known cooking technique, a flavour pairing principle, or creative extrapolation. This transparency turns each meal into a learning opportunity rather than just a set of instructions.
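The derivation paths described above lend themselves to a tagged-union data shape. The variant and field names below are assumptions for illustration, not Peppera's actual model:

```typescript
// Illustrative tagged union for meal provenance; variant names are
// hypothetical, not Peppera's actual data model.
type Provenance =
  | { kind: "known_recipe"; tradition: string; recipe: string }
  | { kind: "technique_variation"; tradition: string; technique: string }
  | { kind: "ai_novel"; rationale: string };

function provenanceLabel(p: Provenance): string {
  switch (p.kind) {
    case "known_recipe":
      return `Based on ${p.recipe} (${p.tradition} tradition)`;
    case "technique_variation":
      return `Variation on the ${p.tradition} technique of ${p.technique}`;
    case "ai_novel":
      return `AI-generated combination: ${p.rationale}`;
  }
}
```

Making the union exhaustive means the UI can never render a meal without a provenance label, which is what turns the tag from metadata into a trust feature.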
Tech Stack
What It's Built With
Next.js 16
The application framework provides server components for AI generation calls and nutrition calculations, keeping API keys and prompt templates server-side. The app router manages the meal generation flow as a series of server actions, ensuring the AI pipeline runs in a controlled environment rather than exposing logic to the client.
Supabase
Supabase handles user authentication, ingredient storage, pantry management, meal history, and the nutrition reference database. Row-level security ensures each user's pantry data is isolated. The real-time subscriptions feature powers instant UI updates when ingredients are added or meals are generated.
Qwen AI
Qwen via OpenRouter serves as the creative generation engine. It receives structured constraint prompts and produces meal combinations that satisfy ingredient availability, nutritional targets, and variety requirements simultaneously. The model was selected for its strong performance on structured output tasks and its ability to maintain coherence across multi-ingredient, multi-constraint scenarios.
WebAuthn
Passwordless authentication using biometric verification. Users sign in with fingerprint or face recognition — no passwords, no email verification flows. This keeps the onboarding friction minimal, which matters for a consumer-facing app that competes with the zero-friction experience of just opening a recipe book.
shadcn/ui
The component library provides accessible, customisable UI primitives for the ingredient input form, meal cards, nutrition breakdowns, and pantry management interface. shadcn/ui was chosen over a pre-built component framework because it provides full control over styling while maintaining accessibility standards. Every component is owned by the project, not imported from a dependency that might change its API.
Decisions
Key Decisions
Constraint-first over recommendation-first
Most meal planning apps are built as recommendation engines: they have a database of recipes and try to match you to one. Peppera is built as a constraint satisfaction engine: it starts with what you have and generates meals within those boundaries. The difference is fundamental. A recommendation engine suggests a recipe and hopes you have the ingredients. A constraint engine guarantees you have everything you need before it suggests anything. This decision shaped every aspect of the system, from the ingredient input UX to the AI prompt structure.
Server-side AI generation over client-side
All AI generation happens in Next.js server actions, not in the browser. This keeps API keys secure, allows server-side caching of similar ingredient combinations, and ensures consistent prompt templates that can't be modified by the user. It also means the AI generation latency is consistent regardless of the user's device — important for a consumer app that may be used on older phones.
Knowledge provenance as a first-class feature
Every meal suggestion includes its derivation path: is this a known recipe, a variation of a known technique, or an AI novel combination? This transparency was a deliberate design decision to differentiate Peppera from "AI generates random recipes" tools. Users building cooking knowledge want to know when they're learning a real technique versus experimenting with an AI invention. This feature also serves as a safety mechanism — if an AI novel combination suggests something questionable (like an unsafe ingredient pairing), the provenance tag makes it clear this wasn't from a tested recipe tradition.
Netlify deployment over Vercel
Despite being a Next.js application, a framework developed by Vercel, Peppera deploys on Netlify for simplicity. The server action functionality works identically on both platforms, and Netlify's deploy previews and form handling provided a marginally better developer experience for this specific project's needs. The decision reflects a pragmatic approach: use the tool that fits the project, not the one made by the framework vendor.
Results
What Was Shipped
Peppera is live at peppera.co.uk, deployed on Netlify. The MVP delivers on its core promise: infinite meals from one pantry. Users input their ingredients, set nutritional targets, and receive varied, creative meal suggestions that use exactly what they have — nothing more, nothing less. Every meal includes full nutrition calculations and knowledge provenance tags.
The marketing hook — "Infinite meals from one pantry" — captures the product's essence perfectly. It's not a recipe search engine. It's not a meal delivery service. It's a creativity engine that transforms limited resources into unlimited variety, which is exactly what AI does best: finding novel solutions within tight constraints.
Lessons
Lessons Learned
Constraints make AI more creative, not less. The counterintuitive finding from building Peppera is that tighter constraints produce better meal suggestions. When the AI has unlimited ingredients to work with, it defaults to obvious combinations. When it has four ingredients and a calorie target, it's forced to be genuinely creative — applying techniques like emulsion, caramelisation, and fermentation to transform the same ingredients into fundamentally different dishes.
Nutrition verification catches AI errors. LLMs are bad at precise nutrition calculations. They'll confidently state that a meal has 450 calories when the actual count is 680. The server-side nutrition verification layer — calculating actual macros from the Supabase database rather than trusting the LLM's estimate — catches these errors before they reach the user. This is a pattern that applies broadly: use the LLM for creative generation, use deterministic systems for numerical verification.
Knowledge provenance builds user trust. Users who could see the derivation path for each meal suggestion were significantly more likely to try unfamiliar combinations. When the system explained "this combines the French technique of a croque monsieur with the Japanese concept of tamagoyaki" rather than just presenting the meal, users reported higher confidence in the suggestion. Transparency about what the AI knows and what it's inventing is essential for consumer trust.
Ingredient normalisation is harder than it looks. Users describe the same ingredient in dozens of ways. "Chicken breast," "1 chicken breast," "chicken fillet," "boneless chicken," "chicken (breast)" — all the same thing, but a database sees them as different. The normalisation layer that maps free-text ingredient input to canonical database entries required more engineering effort than the AI generation itself. This is a common pattern in AI products: the last mile of input processing takes more work than the AI model.
Building an AI product that solves a real problem?
Peppera proves that constraint-based AI creates better user experiences than unlimited choice. If you're building an AI product that needs to work within real-world constraints, let's discuss your architecture.
Available for Q2 2026 consulting engagements.