
feat: create predictions table for engagement scoring#17

Open
sidneyswift wants to merge 1 commit into main from feature/create-predictions-table

Conversation

@sidneyswift
Contributor

@sidneyswift sidneyswift commented Apr 10, 2026

Summary

  • New predictions table for storing TRIBE v2 neural engagement predictions
  • Columns: id, account_id (FK), file_url, modality, engagement_score, engagement_timeline (jsonb), peak_moments (jsonb), weak_spots (jsonb), regional_activation (jsonb), total_duration_seconds, elapsed_seconds, created_at
  • Indexes on account_id and created_at (desc)
  • Row Level Security enabled

Test plan

  • Run migration against Supabase
  • Verify table schema matches OpenAPI spec
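  
For reference, the full migration can be sketched as follows. This is assembled from the column list above and the snippets quoted in review, so any detail not visible in the diff (for example the exact modality check or index names) is an assumption:

```sql
-- Sketch of the migration, assembled from the PR summary and review
-- snippets. Details not visible in the diff are assumptions.
create table if not exists public.predictions (
  id uuid primary key default gen_random_uuid(),
  account_id uuid not null references public.accounts(id) on delete cascade,
  file_url text not null,
  modality text not null check (modality in ('video', 'audio', 'text')),
  engagement_score numeric not null,
  engagement_timeline jsonb not null,
  peak_moments jsonb not null,
  weak_spots jsonb not null,
  regional_activation jsonb not null,
  total_duration_seconds numeric not null,
  elapsed_seconds numeric not null,
  created_at timestamptz not null default now()
);

create index if not exists idx_predictions_account_id
  on public.predictions(account_id);

create index if not exists idx_predictions_created_at
  on public.predictions(created_at desc);

alter table public.predictions enable row level security;
```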

Made with Cursor


Summary by cubic

Add public.predictions table to store TRIBE v2 neural engagement predictions. Enables fast querying and account-scoped access.

  • New Features

    • New public.predictions table with: id, account_id (FK, cascade), file_url, modality ('video'|'audio'|'text'), engagement_score, engagement_timeline (jsonb), peak_moments (jsonb), weak_spots (jsonb), regional_activation (jsonb), total_duration_seconds, elapsed_seconds, created_at.
    • Indexes on account_id and created_at desc.
    • Row Level Security enabled.
  • Migration

    • Run the Supabase migration.
    • Verify the table schema matches the OpenAPI spec.

Written for commit a26b652. Summary will update on new commits.

Summary by CodeRabbit

  • Infrastructure
    • Added database support for prediction data storage, enabling engagement metrics tracking, content analysis, and performance timing data capture across multiple media formats.

Stores TRIBE v2 prediction results: engagement score, timeline,
peak moments, weak spots, and regional brain activation data.
Indexed by account_id and created_at for efficient queries.

Made-with: Cursor
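
As a concrete illustration, the account-scoped, recency-ordered query these indexes target might look like this (a sketch, assuming standard Postgres access; the literal UUID is a placeholder):

```sql
-- Example of the account-scoped, recency-ordered query the account_id
-- and created_at (desc) indexes are intended for. The UUID below is a
-- placeholder.
select id, file_url, modality, engagement_score, created_at
from public.predictions
where account_id = '00000000-0000-0000-0000-000000000000'
order by created_at desc
limit 20;
```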

supabase Bot commented Apr 10, 2026

Updates to Preview Branch (feature/create-predictions-table) ↗︎

Deployments (status updated):
  • Database: Fri, 10 Apr 2026 15:44:57 UTC
  • Services: Fri, 10 Apr 2026 15:44:57 UTC
  • APIs: Fri, 10 Apr 2026 15:44:57 UTC

Tasks are run on every commit but only new migration files are pushed.
Close and reopen this PR if you want to apply changes from existing seed or migration files.

Tasks (status updated):
  • Configurations: Fri, 10 Apr 2026 15:45:03 UTC
  • Migrations: Fri, 10 Apr 2026 15:45:05 UTC
  • Seeding: Fri, 10 Apr 2026 15:45:08 UTC
  • Edge Functions: Fri, 10 Apr 2026 15:45:08 UTC

View logs for this Workflow Run ↗︎.
Learn more about Supabase for Git ↗︎.


coderabbitai Bot commented Apr 10, 2026

📝 Walkthrough

A new SQL migration creates the public.predictions table with a UUID primary key, foreign key reference to accounts, engagement analysis data fields (scores, timelines, peak moments, weak spots, regional activation), timing metrics, and timestamp. Row Level Security is enabled on the table.

Changes

  • Database Migration: supabase/migrations/20260410000000_create_predictions.sql
    Creates public.predictions table with UUID primary key, account foreign key (cascade delete), file URL, modality enum constraint, engagement metrics (score, timeline, peak moments, weak spots, regional activation), duration fields, and created timestamp. Adds indexes on account_id and created_at; enables RLS.

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~5 minutes

Poem

🐰 Hop, hop, hooray! A table springs to life,
Predictions now flow, free from strife,
With engagement scores and timelines bright,
Our schema hops forward—what a sight! 🎬✨

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
  • Description Check: ✅ Passed. Check skipped; CodeRabbit's high-level summary is enabled.
  • Title Check: ✅ Passed. The title 'feat: create predictions table for engagement scoring' directly and clearly summarizes the main change: creating a new predictions table for engagement scoring.
  • Docstring Coverage: ✅ Passed. No functions found in the changed files to evaluate docstring coverage; check skipped.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.




@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 3

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@supabase/migrations/20260410000000_create_predictions.sql`:
- Line 23: The migration enables RLS on public.predictions but does not create
any policies, which will lock out authenticated/anonymous users; update the
migration to either (A) create appropriate row-level policies for
public.predictions (e.g., separate CREATE POLICY statements for
SELECT/INSERT/UPDATE/DELETE granting access to authenticated/anon roles or using
USING/WITH CHECK expressions tied to your auth columns) before running ALTER
TABLE public.predictions ENABLE ROW LEVEL SECURITY, or (B) defer the ALTER TABLE
... ENABLE ROW LEVEL SECURITY to a subsequent migration after you add the
necessary CREATE POLICY statements; reference the table name public.predictions
and the ALTER TABLE ... ENABLE ROW LEVEL SECURITY line when making the change.
- Around line 7-13: Add CHECK constraints to enforce valid domains: require
engagement_score to be within a sane range (e.g., >= 0 AND <= 1 or the
project-specific max — update to the correct max if different) and require
total_duration_seconds >= 0, elapsed_seconds >= 0 and elapsed_seconds <=
total_duration_seconds; add these CHECK constraints referencing the columns
engagement_score, total_duration_seconds and elapsed_seconds in the table
definition so negative or inconsistent durations/scores cannot be inserted.
- Line 4: The migration enables row level security on public.predictions but
defines no policies, which will block all access; update this migration to add
appropriate RLS policies for public.predictions (for example policies that allow
owners or specific roles to SELECT/INSERT/UPDATE/DELETE) or remove/defer the
"alter table public.predictions enable row level security" statement until a
prior migration creates those policies; reference the table name
public.predictions and the existing "alter table public.predictions enable row
level security;" statement when adding the corresponding CREATE POLICY entries
or moving the enable step.
ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 4a601e50-48d5-48ca-95be-6f21e162ce0e

📥 Commits

Reviewing files that changed from the base of the PR and between a952927 and a26b652.

📒 Files selected for processing (1)
  • supabase/migrations/20260410000000_create_predictions.sql

-- Neural engagement predictions from TRIBE v2 model
create table if not exists public.predictions (
id uuid primary key default gen_random_uuid(),
account_id uuid not null references public.accounts(id) on delete cascade,

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "== Searching for accounts table creation =="
rg -n --type-add 'sql:*.sql' --type sql -C2 'create\s+table\s+(if\s+not\s+exists\s+)?public\.accounts\b|create\s+table\s+(if\s+not\s+exists\s+)?accounts\b' supabase/migrations || true

echo
echo "== Searching for predictions policies =="
rg -n --type-add 'sql:*.sql' --type sql -C2 'create\s+policy.*predictions|on\s+public\.predictions' supabase/migrations || true

Repository: recoupable/database

Length of output: 884


🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "== List all SQL migrations in supabase/migrations =="
ls -la supabase/migrations/*.sql 2>/dev/null | head -20 || echo "No migrations found"

echo
echo "== Count and show all migration filenames =="
find supabase/migrations -name "*.sql" -type f | sort

echo
echo "== Check if accounts table exists anywhere in migrations =="
rg -i "create.*table.*accounts" supabase/migrations --no-heading || echo "No matches found"

echo
echo "== Check the full content of the predictions migration file =="
cat supabase/migrations/20260410000000_create_predictions.sql

Repository: recoupable/database

Length of output: 13587


🏁 Script executed:

#!/bin/bash
set -euo pipefail

echo "== Search for any RLS policies on predictions table =="
rg -i "create.*policy.*predictions|policy.*on.*public\.predictions" supabase/migrations --type sql -A5 || echo "No policy definitions found"

echo
echo "== Check specific accounts table creation migration =="
cat supabase/migrations/20241209025457_add_foreign_key.sql

Repository: recoupable/database

Length of output: 889


Add RLS policies for public.predictions table before deployment.

Line 23 enables row level security (alter table public.predictions enable row level security;) but no policies are defined, which will block all access to the table. Either define policies in this migration or ensure they are created in an earlier migration that is guaranteed to run first.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@supabase/migrations/20260410000000_create_predictions.sql` at line 4, The
migration enables row level security on public.predictions but defines no
policies, which will block all access; update this migration to add appropriate
RLS policies for public.predictions (for example policies that allow owners or
specific roles to SELECT/INSERT/UPDATE/DELETE) or remove/defer the "alter table
public.predictions enable row level security" statement until a prior migration
creates those policies; reference the table name public.predictions and the
existing "alter table public.predictions enable row level security;" statement
when adding the corresponding CREATE POLICY entries or moving the enable step.

Comment on lines +7 to +13
engagement_score numeric not null,
engagement_timeline jsonb not null,
peak_moments jsonb not null,
weak_spots jsonb not null,
regional_activation jsonb not null,
total_duration_seconds numeric not null,
elapsed_seconds numeric not null,

⚠️ Potential issue | 🟠 Major

Add domain checks for score/duration fields.

Lines 7-13 allow negative/invalid numeric values (engagement_score, total_duration_seconds, elapsed_seconds), which can corrupt analytics.

Proposed SQL patch
   engagement_score numeric not null,
   engagement_timeline jsonb not null,
   peak_moments jsonb not null,
   weak_spots jsonb not null,
   regional_activation jsonb not null,
   total_duration_seconds numeric not null,
   elapsed_seconds numeric not null,
+  constraint predictions_engagement_score_range_chk
+    check (engagement_score >= 0 and engagement_score <= 1),
+  constraint predictions_duration_nonnegative_chk
+    check (
+      total_duration_seconds >= 0
+      and elapsed_seconds >= 0
+      and elapsed_seconds <= total_duration_seconds
+    ),
   created_at timestamptz not null default now()
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@supabase/migrations/20260410000000_create_predictions.sql` around lines 7 -
13, Add CHECK constraints to enforce valid domains: require engagement_score to
be within a sane range (e.g., >= 0 AND <= 1 or the project-specific max — update
to the correct max if different) and require total_duration_seconds >= 0,
elapsed_seconds >= 0 and elapsed_seconds <= total_duration_seconds; add these
CHECK constraints referencing the columns engagement_score,
total_duration_seconds and elapsed_seconds in the table definition so negative
or inconsistent durations/scores cannot be inserted.

create index if not exists idx_predictions_created_at
on public.predictions(created_at desc);

alter table public.predictions enable row level security;

⚠️ Potential issue | 🟠 Major

RLS is enabled without any policy in this migration.

On Line 23, ENABLE ROW LEVEL SECURITY with no policy in the same migration can make the table unusable for authenticated/anon immediately after deploy.

Proposed SQL patch
 alter table public.predictions enable row level security;
+
+-- Example baseline policies (adjust to your auth/account model)
+create policy "predictions_select_own"
+  on public.predictions
+  for select
+  using (account_id = auth.uid());
+
+create policy "predictions_insert_own"
+  on public.predictions
+  for insert
+  with check (account_id = auth.uid());
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@supabase/migrations/20260410000000_create_predictions.sql` at line 23, The
migration enables RLS on public.predictions but does not create any policies,
which will lock out authenticated/anonymous users; update the migration to
either (A) create appropriate row-level policies for public.predictions (e.g.,
separate CREATE POLICY statements for SELECT/INSERT/UPDATE/DELETE granting
access to authenticated/anon roles or using USING/WITH CHECK expressions tied to
your auth columns) before running ALTER TABLE public.predictions ENABLE ROW
LEVEL SECURITY, or (B) defer the ALTER TABLE ... ENABLE ROW LEVEL SECURITY to a
subsequent migration after you add the necessary CREATE POLICY statements;
reference the table name public.predictions and the ALTER TABLE ... ENABLE ROW
LEVEL SECURITY line when making the change.

Copy link
Copy Markdown

@cubic-dev-ai cubic-dev-ai Bot left a comment


1 issue found across 1 file

Prompt for AI agents (unresolved issues)

Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.


<file name="supabase/migrations/20260410000000_create_predictions.sql">

<violation number="1" location="supabase/migrations/20260410000000_create_predictions.sql:13">
P2: No domain constraints on `engagement_score`, `total_duration_seconds`, or `elapsed_seconds`. Negative scores or durations would silently pass. Add CHECK constraints (e.g., `engagement_score between 0 and 1`, durations `>= 0`, `elapsed_seconds <= total_duration_seconds`).</violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review, or fix all with cubic.

weak_spots jsonb not null,
regional_activation jsonb not null,
total_duration_seconds numeric not null,
elapsed_seconds numeric not null,

@cubic-dev-ai cubic-dev-ai Bot Apr 10, 2026


P2: No domain constraints on engagement_score, total_duration_seconds, or elapsed_seconds. Negative scores or durations would silently pass. Add CHECK constraints (e.g., engagement_score between 0 and 1, durations >= 0, elapsed_seconds <= total_duration_seconds).

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At supabase/migrations/20260410000000_create_predictions.sql, line 13:

<comment>No domain constraints on `engagement_score`, `total_duration_seconds`, or `elapsed_seconds`. Negative scores or durations would silently pass. Add CHECK constraints (e.g., `engagement_score between 0 and 1`, durations `>= 0`, `elapsed_seconds <= total_duration_seconds`).</comment>

<file context>
@@ -0,0 +1,23 @@
+  weak_spots jsonb not null,
+  regional_activation jsonb not null,
+  total_duration_seconds numeric not null,
+  elapsed_seconds numeric not null,
+  created_at timestamptz not null default now()
+);
</file context>
