Discovery 360 (D360)

Discovery 360 (D360) turns CSV data scattered across your organization into an analytical environment you can query in plain English. When you upload CSV files, AI inspects their contents, designs the tables, and loads the data into the analytical database (Amazon Redshift Serverless). Users do not need to write SQL — they simply ask questions in natural language from a chat interface, and an AI agent assembles and executes the appropriate queries to return the answers.

Key Features

  • AI-Assisted Schema Generation: Upload CSV files in the Admin UI and the agent proposes Redshift table definitions (column names, types, descriptions) for review and editing
  • One-Stop Data Ingestion: Confirming the schema automatically creates the tables and loads data from CSV into Amazon Redshift Serverless
  • Agentic Text2SQL: Text2SQL agent powered by Amazon Bedrock AgentCore that generates and runs SQL from natural-language questions
  • Domain Knowledge Injection: Add organizational rules (e.g. "sales figures are tax-excluded") as Knowledge entries so the agent responds in your business context
  • Prompt Caching: Faster, cheaper responses through Amazon Bedrock prompt cache
  • Separated Agent / Admin UIs: End-user Agent UI and operator Admin UI backed by independent Amazon Cognito User Pools
  • Edge Protection: CloudFront delivery with WAF-based IP restriction
  • Serverless Foundation: Auto-scaling, pay-per-use Redshift Serverless and managed services throughout

Primary Use Cases

  • Cross-Domain Data Discovery: Query any CSV-based business data — sales, inventory, surveys, operational logs — across the organization from a single interface
  • Data Analysis Democratization: Enable non-engineers in business units to ask questions of the data warehouse directly, without waiting for SQL specialists
  • Rapid Analytical Onboarding: Turn raw CSV exports from multiple systems into a queryable dataset without hand-writing DDL or building a bespoke data pipeline

Deploy to AWS

Sign in to AWS first, then click the button below to deploy.

 Deploy

Region note

D360 automatically deploys an additional WAF stack in us-east-1 for CloudFront. CDK bootstrap is performed in both your chosen deployment region and us-east-1.
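If you deploy from your own environment with the CDK CLI instead of the button, the two-region bootstrap described above looks like the following sketch (the account ID and deployment region are placeholders):

```shell
# Bootstrap the chosen deployment region (example: ap-northeast-1)
npx cdk bootstrap aws://123456789012/ap-northeast-1

# Bootstrap us-east-1 as well: the WAF stack for CloudFront must live there
npx cdk bootstrap aws://123456789012/us-east-1
```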

Parameter Configuration

You can configure the following parameters during deployment:

  • NotificationEmailAddress: Email address for deployment notifications. Also auto-registered as an Agent and Admin Cognito user when RegisterNotificationAddressAsUser is true.
  • RegisterNotificationAddressAsUser: When true (default), register NotificationEmailAddress as a Cognito user in both the Agent and Admin pools with an auto-generated temporary password shown in the deployment-completed email. Set to false to skip this registration.
  • DefaultUserEmails: Comma-separated emails to pre-create as Agent Frontend users in Cognito. Each user receives an invitation email with a temporary password. Leave blank to create no additional users.
  • DefaultAdminEmails: Comma-separated emails to pre-create as Admin Frontend users in Cognito. Each user receives an invitation email with a temporary password. Leave blank to create no additional users.
  • StackPrefix: Environment prefix applied to D360 stack names (dev / staging / prod, default dev)
  • BedrockModelId: Bedrock model ID used by the D360 agent (default global.anthropic.claude-sonnet-4-6)
  • SqlResultThreshold: Maximum number of rows returned from a single SQL query (default 200)
  • EnablePromptCache: Enable Bedrock prompt caching for the agent (default true)
  • CsvInputBucketName: Name of an existing S3 bucket to load CSV data from. Leave blank to create a new bucket.
  • AllowedIpv4Cidrs: Comma-separated IPv4 CIDR ranges allowed by WAF. Default is blank, which inherits the upstream cdk.json value (currently 0.0.0.0/1,128.0.0.0/1 = all IPv4).
  • AllowedIpv6Cidrs: Comma-separated IPv6 CIDR ranges allowed by WAF. Default is blank, which inherits the upstream cdk.json value (currently ::/1,8000::/1 = all IPv6).
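As a rough CLI equivalent of the deploy button, the same parameters can be passed as CloudFormation parameter overrides. This is a hypothetical sketch for illustration: the template file name is a placeholder, and the stack name follows the D360DeploymentStack naming used in Resource Cleanup below; the email, prefix, and CIDR values are example inputs, not defaults.

```shell
# Hypothetical CLI deployment with parameter overrides.
# template.yaml is a placeholder for the D360 deployment template.
aws cloudformation deploy \
  --stack-name D360DeploymentStack \
  --template-file template.yaml \
  --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM \
  --parameter-overrides \
    NotificationEmailAddress=ops@example.com \
    StackPrefix=prod \
    SqlResultThreshold=500 \
    AllowedIpv4Cidrs=203.0.113.0/24,198.51.100.0/24
```

Parameters left out of `--parameter-overrides` keep their defaults, so restricting `AllowedIpv4Cidrs` as above is the main change most deployments need.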

Post-Deployment Setup

After clicking the deploy button, you will shortly receive an email titled "AWS Notification - Subscription Confirmation". Click the Confirm subscription link in it to receive deployment start and completion notifications.

Upon deployment completion, you will receive a notification email with the following information:

  1. Agent Frontend URL: End-user query interface
  2. Admin Frontend URL: Operator interface for schema management and data ingestion
  3. Auto-registered user credentials: When RegisterNotificationAddressAsUser is true, the email contains the Agent and Admin temporary passwords for NotificationEmailAddress. You must change these passwords on first sign-in.
  4. Invited users: Addresses listed in DefaultUserEmails / DefaultAdminEmails receive their own Cognito invitation emails with separate temporary passwords.
  5. CSV data bucket: S3 bucket used for CSV ingestion

Resource Cleanup

To delete deployed resources, remove the following stacks from the CloudFormation console:

  1. <StackPrefix>DwhAgentStack (main application)
  2. <StackPrefix>DwhAgentWafStack (in us-east-1)
  3. D360DeploymentStack (deployment stack)
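The same cleanup can be scripted with the AWS CLI. A sketch assuming StackPrefix=dev; note that the WAF stack must be deleted in us-east-1:

```shell
# 1. Delete the main application stack in the deployment region
aws cloudformation delete-stack --stack-name devDwhAgentStack
aws cloudformation wait stack-delete-complete --stack-name devDwhAgentStack

# 2. Delete the WAF stack, which lives in us-east-1
aws cloudformation delete-stack --stack-name devDwhAgentWafStack --region us-east-1

# 3. Finally, delete the deployment stack itself
aws cloudformation delete-stack --stack-name D360DeploymentStack
```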

Warning

Stack deletion will remove all data ingested into Redshift and all generated schema definitions. Export anything you need beforehand.

Usage

After deployment, D360 becomes usable once you onboard data via the Admin UI:

  1. Enable Bedrock model access for the configured model (default Claude Sonnet 4.6) in the Bedrock console
  2. Enable Bedrock AgentCore Observability in CloudWatch (Transaction Search)
  3. Sign in to the Admin Frontend using the initial password delivered in the deployment notification email
  4. Upload & Build: In the Admin UI, upload your CSV files, run Analyze, then Confirm to materialize the Redshift tables and load the data
  5. End-user querying: have your users sign in to the Agent Frontend and query the data in natural language
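If you prefer to stage CSV exports in S3 rather than uploading through the browser, the ingestion bucket (your CsvInputBucketName, or the auto-created bucket reported in the completion email) can be populated with the AWS CLI. This assumes direct S3 staging is acceptable for your workflow; the bucket and file names below are placeholders:

```shell
# Stage a CSV export in the ingestion bucket before running Analyze in the
# Admin UI. Replace the bucket name with the one from the
# deployment-completion email (or your CsvInputBucketName).
aws s3 cp ./sales_2024.csv s3://my-d360-csv-input/sales_2024.csv
```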

For more details, see the project README.