GE4X: Grounded AI Demo
What it does
Users type a question about any website, and the app answers from that site's own content — Google Search grounding scoped to the target domain — with Google Maps integration for location-aware queries. A "publish" preview then shows the agent embedded in a realistic mockup of the target site, built by dynamically fetching the site's metadata.
The point: grounding alone isn’t impressive until you show it working on someone’s actual website with their actual data. The publish preview made that concrete for non-technical stakeholders.
Architecture
Backend (FastAPI): a GenAIService class wrapping the Google GenAI SDK against Vertex AI. Site-restricted search (queries scoped to the target domain), location-aware grounding (user lat/lng passed through RetrievalConfig to Google Maps), and multi-turn conversation with thought-signature tracking. Deployed to Cloud Run.
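Two of the backend's concerns — scoping a grounding query to one domain and passing the user's location through to Maps grounding — can be sketched as small pure helpers. This is an illustrative sketch, not the actual GenAIService: the function names and the dict shape mirroring RetrievalConfig are assumptions, and the real SDK call that consumes them is elided.

```python
from typing import Optional


def scope_query(question: str, domain: str) -> str:
    """Restrict a grounding query to a single site (hypothetical helper:
    one common approach is prefixing a site: operator)."""
    return f"site:{domain} {question.strip()}"


def build_retrieval_config(lat: Optional[float], lng: Optional[float]) -> dict:
    """Assemble the lat/lng payload passed through to RetrievalConfig for
    Google Maps grounding; an empty dict means no location was provided.
    The nested key names here are illustrative, not the SDK's exact schema."""
    if lat is None or lng is None:
        return {}
    return {"retrieval_config": {"lat_lng": {"latitude": lat, "longitude": lng}}}
```

In practice the scoped query and config dict would be attached to a generate-content request alongside the Google Search and Maps tools; keeping them as pure functions makes this part of the service unit-testable without network access.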
Frontend (Vue 3 + TypeScript): chat interface with inline grounding citations. The backend parses grounding_chunks, grounding_supports (text segments mapped to source indices), and Maps widget tokens, then normalizes them into a clean JSON response. A config panel handles agent setup; a debug panel exposes the raw payloads.
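The normalization step described above can be sketched as a single backend function: collapse Gemini-style grounding metadata (grounding_chunks plus grounding_supports) into a flat citations list the chat UI can render inline. The field names follow the typical grounding-metadata shape; treat the exact keys and the output schema as assumptions, not the project's actual contract.

```python
def normalize_citations(grounding_chunks: list, grounding_supports: list) -> dict:
    """Flatten grounding metadata into a UI-friendly JSON structure:
    a list of sources, and citations mapping answer-text spans to them."""
    sources = [
        {
            "uri": chunk.get("web", {}).get("uri", ""),
            "title": chunk.get("web", {}).get("title", ""),
        }
        for chunk in grounding_chunks
    ]
    citations = []
    for support in grounding_supports:
        segment = support.get("segment", {})
        citations.append({
            "text": segment.get("text", ""),
            "start": segment.get("start_index", 0),
            "end": segment.get("end_index", 0),
            # Map each supported span back to its source chunk(s),
            # dropping any out-of-range indices defensively.
            "sources": [
                sources[i]
                for i in support.get("grounding_chunk_indices", [])
                if 0 <= i < len(sources)
            ],
        })
    return {"sources": sources, "citations": citations}
```

Doing this mapping server-side keeps the Vue client simple: it renders `citations` directly instead of re-deriving segment-to-source joins from raw SDK payloads.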
GM3 component library: 80+ Material Design 3 Vue components — buttons, data tables, navigation, date pickers, video player, etc. Built as a standalone git submodule and shared across GE4X, Dubbing Studio, and other projects.
Express Mode (related prototype): "zero-friction" website-to-chatbot ingestion — paste a URL, get a grounded chatbot in seconds. The prototype defined the architectural blueprint the production team adopted; delivered in two weeks.
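The URL-to-chatbot step above boils down to deriving a grounding domain from the pasted URL and pulling the title/Open Graph metadata the publish preview mockup needs. A minimal stdlib-only sketch, with the HTTP fetch elided and `parse_site_metadata` as a hypothetical helper name:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class _MetaExtractor(HTMLParser):
    """Collect the <title> text and og:* meta tags from raw HTML."""

    def __init__(self) -> None:
        super().__init__()
        self.title = ""
        self.og: dict = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("property", "").startswith("og:"):
            self.og[attrs["property"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data


def parse_site_metadata(url: str, html: str) -> dict:
    """From a pasted URL and its fetched HTML, derive what ingestion needs:
    the domain to scope grounding to, plus title/image for the preview."""
    parser = _MetaExtractor()
    parser.feed(html)
    return {
        "domain": urlparse(url).netloc,
        "title": parser.og.get("og:title") or parser.title.strip(),
        "image": parser.og.get("og:image", ""),
    }
```

The domain feeds the site-restricted grounding setup; title and image dress the publish-preview mockup.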