SousChefAI
Note: this README was generated with AI assistance and will be updated in the future.

SousChefAI is a production-ready iOS app that uses multimodal AI to scan ingredients, generate personalized recipes, and provide real-time cooking guidance.
Features
🎥 Intelligent Fridge Scanner
- Real-time ingredient detection using Overshoot API
- Camera-based scanning with live preview
- Confidence scoring for each detected ingredient
- Manual ingredient entry and editing
🍳 AI-Powered Recipe Generation
- Personalized recipe suggestions based on available ingredients
- Google Gemini AI for complex reasoning and recipe creation
- Filtering by "Scavenger" (use only what you have) or "Upgrader" (minimal shopping)
- Recipe scaling based on limiting ingredients
- Match scoring to prioritize best recipes
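The match-scoring and limiting-ingredient scaling described above could be sketched as follows. The types and function names here are illustrative assumptions, not the app's actual API:

```swift
// Illustrative sketch: score a recipe by what fraction of its required
// ingredients are already in the inventory, and scale the batch by the
// scarcest ingredient. Names are hypothetical.
struct RequiredIngredient {
    let name: String
    let quantity: Double   // amount needed per batch
}

// Fraction of required ingredients present in the inventory (0...1).
func matchScore(required: [RequiredIngredient], inventory: [String: Double]) -> Double {
    guard !required.isEmpty else { return 0 }
    let haveCount = required.filter { inventory[$0.name] != nil }.count
    return Double(haveCount) / Double(required.count)
}

// Largest batch multiplier the inventory supports: the ingredient with
// the smallest available/needed ratio limits the recipe.
func scaleFactor(required: [RequiredIngredient], inventory: [String: Double]) -> Double {
    required
        .map { (inventory[$0.name] ?? 0) / $0.quantity }
        .min() ?? 0
}
```

For example, a recipe needing 2 eggs and 100 g of flour against an inventory of 6 eggs and 50 g of flour scores a full match but can only be made at half scale.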
👨‍🍳 Live Cooking Mode
- Step-by-step guided cooking
- Real-time visual monitoring of cooking progress
- Text-to-speech announcements for hands-free cooking
- AI feedback when steps are complete
- Progress tracking and navigation
🔐 User Profiles & Persistence
- Firebase Firestore for cloud data sync
- Dietary restrictions (Vegan, Keto, Gluten-Free, etc.)
- Nutrition goals
- Saved recipes and pantry staples
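A profile like the one described above might model dietary restrictions and goals as Codable types so they round-trip through Firestore's JSON-style encoding. This is a sketch with assumed names, not the app's actual UserProfile.swift:

```swift
import Foundation

// Hypothetical sketch of the profile data described above.
enum DietaryRestriction: String, Codable, CaseIterable {
    case vegan, keto, glutenFree, vegetarian, dairyFree
}

struct UserProfileSketch: Codable {
    var dietaryRestrictions: [DietaryRestriction]
    var nutritionGoals: [String: Double]   // e.g. ["calories": 2000]
    var pantryStaples: [String]            // e.g. ["salt", "olive oil"]
}
```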
Architecture
The app follows MVVM (Model-View-ViewModel) with a Repository Pattern for clean separation of concerns:
├── Models/ # Core data models (Codable, Identifiable)
│ ├── Ingredient.swift
│ ├── UserProfile.swift
│ └── Recipe.swift
│
├── Services/ # Business logic & external APIs
│ ├── VisionService.swift # Protocol for vision AI
│ ├── OvershootVisionService.swift # Overshoot implementation
│ ├── RecipeService.swift # Protocol for recipe generation
│ ├── GeminiRecipeService.swift # Gemini implementation
│ ├── FirestoreRepository.swift # Firebase data layer
│ └── CameraManager.swift # AVFoundation camera handling
│
├── ViewModels/ # Business logic for views
│ ├── ScannerViewModel.swift
│ ├── RecipeGeneratorViewModel.swift
│ └── CookingModeViewModel.swift
│
├── Views/ # SwiftUI views
│ ├── ScannerView.swift
│ ├── InventoryView.swift
│ ├── RecipeGeneratorView.swift
│ └── CookingModeView.swift
│
└── Config/ # App configuration
└── AppConfig.swift
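The Repository Pattern above typically hides the storage backend behind a protocol, so ViewModels never touch Firestore directly. A minimal sketch of that idea (the protocol and in-memory stand-in are assumptions, not the actual FirestoreRepository.swift):

```swift
// Hypothetical repository abstraction: ViewModels depend on this
// protocol, so the Firestore implementation can be swapped for an
// in-memory store in unit tests without touching view code.
protocol RecipeRepository {
    func save(_ recipeID: String) async throws
    func savedRecipeIDs() async throws -> [String]
}

// In-memory stand-in, useful for unit tests and SwiftUI previews.
actor InMemoryRecipeRepository: RecipeRepository {
    private var ids: [String] = []
    func save(_ recipeID: String) async throws { ids.append(recipeID) }
    func savedRecipeIDs() async throws -> [String] { ids }
}
```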
Setup Instructions
1. Clone the Repository
git clone https://github.com/yourusername/souschef.git
cd souschef
2. Configure API Keys
Open SousChefAI/Config/AppConfig.swift and replace the placeholder values:
// Overshoot Vision API
static let overshootAPIKey = "YOUR_OVERSHOOT_API_KEY"
// Google Gemini API
static let geminiAPIKey = "YOUR_GEMINI_API_KEY"
Getting API Keys:
- Overshoot API: Visit overshoot.ai (or the actual provider URL) and sign up
- Gemini API: Visit Google AI Studio and create an API key
3. Add Firebase
Add Firebase SDK via Swift Package Manager:
- In Xcode: File > Add Package Dependencies
- Enter URL: https://github.com/firebase/firebase-ios-sdk
- Select version: 10.0.0 or later
- Add the following products: FirebaseAuth, FirebaseFirestore
Add GoogleService-Info.plist:
- Go to the Firebase Console
- Create a new project or select an existing one
- Add an iOS app with bundle ID: com.yourcompany.SousChefAI
- Download GoogleService-Info.plist
- Drag it into your Xcode project (ensure it's added to the SousChefAI target)
Enable Firebase in App:
- Open SousChefAI/SousChefAIApp.swift
- Uncomment the Firebase imports and initialization:
import FirebaseCore

init() {
    FirebaseApp.configure()
}
4. Add Google Generative AI SDK (Optional)
For better Gemini integration, add the official SDK:
// In Xcode: File > Add Package Dependencies
// URL: https://github.com/google/generative-ai-swift
Then update GeminiRecipeService.swift to use the SDK instead of REST API.
5. Configure Camera Permissions
The app requires camera access. Permissions are already handled in code, but ensure your Info.plist includes:
<key>NSCameraUsageDescription</key>
<string>We need camera access to scan your fridge and monitor cooking progress</string>
6. Build and Run
- Open SousChefAI.xcodeproj in Xcode
- Select your target device or simulator
- Press Cmd + R to build and run
Usage Guide
Scanning Your Fridge
- Tap the Scan tab
- Point your camera at your fridge or ingredients
- Tap Scan Fridge to start detection
- Review detected ingredients (yellow = low confidence)
- Tap Continue to Inventory
Managing Inventory
- Edit quantities by tapping an ingredient
- Swipe left to delete items
- Add manual entries with the + button
- Set dietary preferences before generating recipes
- Tap Generate Recipes when ready
Generating Recipes
- Browse suggested recipes sorted by match score
- Filter by:
- All Recipes: Show everything
- The Scavenger: Only use what you have
- The Upgrader: Need 1-2 items max
- High Match: 80%+ ingredient match
- Tap a recipe to view details
- Save favorites with the heart icon
- Start cooking with Start Cooking button
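The filters listed above could be modeled as an enum whose cases carry the selection rule. The names and thresholds below mirror the descriptions in this guide but are otherwise illustrative assumptions:

```swift
// Hypothetical filter model for the recipe list described above.
// `missingCount` is how many ingredients the user would need to buy;
// `match` is the 0...1 ingredient-match score.
enum RecipeFilter {
    case all          // show everything
    case scavenger    // only use what you have
    case upgrader     // need 1-2 items max
    case highMatch    // 80%+ ingredient match

    func allows(missingCount: Int, match: Double) -> Bool {
        switch self {
        case .all:       return true
        case .scavenger: return missingCount == 0
        case .upgrader:  return missingCount <= 2
        case .highMatch: return match >= 0.8
        }
    }
}
```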
Cooking Mode
- Enable AI Monitoring to watch your cooking
- The AI will analyze your progress visually
- Navigate steps with Previous/Next
- Use Read Aloud for hands-free guidance
- The AI will announce when steps are complete
- View all steps with the list icon
Tech Stack
- Language: Swift 6
- UI Framework: SwiftUI
- Architecture: MVVM + Repository Pattern
- Concurrency: Swift Async/Await (no completion handlers)
- Camera: AVFoundation
- Vision AI: Overshoot API (real-time video inference)
- Reasoning AI: Google Gemini 2.0 Flash
- Backend: Firebase (Auth + Firestore)
- Persistence: Firebase Firestore (cloud sync)
Protocol-Oriented Design
The app uses protocols for AI services to enable easy provider swapping:
protocol VisionService {
    func detectIngredients(from: AsyncStream<CVPixelBuffer>) async throws -> [Ingredient]
}

protocol RecipeService {
    func generateRecipes(inventory: [Ingredient], profile: UserProfile) async throws -> [Recipe]
}
To swap providers, simply create a new implementation:
final class OpenAIVisionService: VisionService {
    // Implementation using OpenAI Vision API
}

final class AnthropicRecipeService: RecipeService {
    // Implementation using Claude API
}
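Swapping providers works because ViewModels accept the protocol, not a concrete type. A sketch of that injection, using simplified stand-in types (the ViewModel shape and mock here are assumptions, not the app's actual code):

```swift
// Hypothetical illustration of provider swapping via a protocol.
// Simplified stand-in types; the real app's models are richer.
struct RecipeStub { let title: String }

protocol RecipeProviding {
    func generateRecipes(from ingredients: [String]) async throws -> [RecipeStub]
}

// A mock provider, e.g. for unit tests or trying a new vendor.
struct MockRecipeProvider: RecipeProviding {
    func generateRecipes(from ingredients: [String]) async throws -> [RecipeStub] {
        ingredients.map { RecipeStub(title: "Simple \($0) dish") }
    }
}

// The ViewModel depends only on the protocol, so any conforming
// provider (Gemini, OpenAI, a mock) can be injected unchanged.
final class RecipeGeneratorViewModelSketch {
    private let provider: RecipeProviding
    init(provider: RecipeProviding) { self.provider = provider }

    func load(ingredients: [String]) async throws -> [RecipeStub] {
        try await provider.generateRecipes(from: ingredients)
    }
}
```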
Future Enhancements
- Nutrition tracking and calorie counting
- Shopping list generation
- Recipe sharing and social features
- Meal planning calendar
- Voice commands during cooking
- Multi-language support
- Apple Watch companion app
- Widget for quick recipe access
- Offline mode with local ML models
- Integration with smart kitchen appliances
Contributing
Contributions are welcome! Please follow these guidelines:
- Fork the repository
- Create a feature branch: git checkout -b feature/amazing-feature
- Follow the Swift style guide and existing architecture
- Write unit tests for new features
- Update documentation as needed
- Submit a pull request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Acknowledgments
- Overshoot AI for low-latency video inference
- Google Gemini for powerful reasoning capabilities
- Firebase for robust backend infrastructure
- Apple for SwiftUI and AVFoundation frameworks
Support
For issues, questions, or feature requests, please open an issue on GitHub.