Tokenization reduction clean#327

Open
elmartinj wants to merge 8 commits into TimeCopilot:main from elmartinj:tokenization_reduction_clean
Conversation

@elmartinj
Contributor

Adds tokenization reduction in a clean version, without noise files.

elmartinj and others added 8 commits March 13, 2026 11:25
This tutorial contains an example with multiple indexes and subsequent tampering with the data, in order to show resiliency and a real-life use case of TC applied to cryptocurrency prices up to 2021.
Fix for the issue raised on an empty dataframe, resulting from an inner merge where the existing dataframe that accumulated results and the newer one had different indices. A follow-up issue should be raised to either:
1. report a single-model failure (on index matching), or
2. fix the Moirai discrepancy (the only model that showed this issue).
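The failure mode described above can be sketched in a few lines of pandas. This is a hypothetical illustration, not the PR's actual code: the frame names (`acc`, `new`) and column names are assumptions. The point is that an inner merge between two frames with non-overlapping indices silently produces an empty result, whereas an outer merge preserves all rows and surfaces the mismatch as NaNs.

```python
import pandas as pd

# Accumulated results from earlier models (hypothetical index values).
acc = pd.DataFrame(
    {"forecast_a": [1.0, 2.0]},
    index=pd.Index(["2021-01", "2021-02"], name="ds"),
)
# Newer model's results whose index does not overlap with acc's.
new = pd.DataFrame(
    {"forecast_b": [3.0, 4.0]},
    index=pd.Index(["2021-03", "2021-04"], name="ds"),
)

# Inner merge: only shared index values survive, so the result is empty.
merged = acc.merge(new, left_index=True, right_index=True, how="inner")
print(merged.empty)  # True

# Outer merge keeps every row from both frames, flagging gaps with NaN,
# which makes the index discrepancy visible instead of silently dropping data.
outer = acc.merge(new, left_index=True, right_index=True, how="outer")
print(len(outer))  # 4
```

An `how="outer"` merge (or an explicit index-alignment check before merging) is one way to detect the single-model index mismatch rather than returning an empty dataframe.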
…okenization run, first trial version. felt cute, might delete later