Full data type treeview items #3321
Conversation
…ser preserve the data type string for the node.
- Use field name as header when present, otherwise generic "Type"
- Remove redundant "Name" field (now in header)
- Remove field/entries count display (full type string shows structure)
- Remove getTypeKindLabel() method (no longer needed)
- Add 9 comprehensive unit tests for tooltip behavior

Tooltip now contains only essential fields: header, Data Type, Nullable, and Comment (if present). This provides a cleaner, more focused user experience, while the full data type string (e.g., "ROW<id INT, name VARCHAR>") already conveys the complete structure.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Added comprehensive documentation explaining the real reason for transient children: VS Code's tree view API requires getParent() to return the exact parent instance that was returned from getChildren(). By setting parentNode: this on each child, we ensure the view provider's getParent() method can navigate the tree correctly. Clarified that IDs are unique even without the parentNode chain (via parentColumnId + field names), but the parentNode reference is essential for tree navigation logic. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
…kTypeNode

Added 3 comprehensive unit tests for getParent() behavior with nested type nodes:

1. should return the parent FlinkTypeNode for a nested FlinkTypeNode
   - Tests that parentNode is correctly returned for deeply nested expansions
2. should return the FlinkRelationColumn parent for a top-level FlinkTypeNode
   - Tests that the parent column is returned for type nodes directly under a column
3. should return undefined when FlinkTypeNode parent column is not found
   - Tests graceful handling when the parent column is not in the relations container

These tests verify that the transient children pattern works correctly with VS Code's getParent() API for tree navigation.

All 22 getParent() tests passing (13 FlinkDatabaseViewProvider + 9 other providers).

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Added 3 new tests to revealResource() covering FlinkTypeNode reveal scenarios:

1. should reveal a top-level FlinkTypeNode directly
   - Tests revealing type nodes that are direct children of a column
2. should reveal a nested FlinkTypeNode directly
   - Tests revealing deeply nested type nodes with parentNode references
3. should reveal a FlinkTypeNode with custom options
   - Tests that custom reveal options (select, focus, expand) work with type nodes

These tests verify the revealResource() implementation correctly handles FlinkTypeNode instances by passing them through directly to treeView.reveal(), since they are transient instances not stored in containers.

All 17 revealResource() tests passing, including 14 existing tests for containers and columns, plus the 3 new tests for FlinkTypeNode. All 147 flinkDatabase tests passing (no regressions).

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Added 4 new tests covering previously uncovered branches:

1. should return type children when expanding a FlinkRelationColumn with compound type
   - Tests FlinkRelationColumn branch: calls getTypeChildren()
2. should return scalar children when expanding a FlinkTypeNode with ROW type
   - Tests FlinkTypeNode branch with expandable type: calls getChildren()
   - Verifies child FlinkTypeNode instances are created
3. should return empty array when expanding a FlinkTypeNode with scalar type
   - Tests FlinkTypeNode branch with non-expandable type
   - Verifies empty array is returned (scalars have no children)
4. should skip intermediate nodes when expanding ARRAY<ROW> FlinkTypeNode
   - Tests FlinkTypeNode with compound array: skips [element] intermediate node
   - Verifies direct ROW field children are returned

These tests provide coverage for the two previously uncovered branches in getChildren():

- Lines 146-148: element instanceof FlinkRelationColumn
- Lines 149-151: element instanceof FlinkTypeNode

All 96 FlinkDatabaseViewProvider tests passing (no regressions).

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
The final else block (lines 178-180) was unreachable dead code because all types in the DatabaseChildrenType union are explicitly handled by preceding conditions:

- FlinkDatabaseResourceContainer: line 160
- FlinkRelation: line 163 (has getTreeItem)
- FlinkUdf: line 168
- FlinkAIConnection: line 170
- FlinkAIModel: line 172
- FlinkAITool: line 174
- FlinkAIAgent: line 176
- FlinkArtifact: line 166
- FlinkRelationColumn: line 163 (has getTreeItem)
- FlinkTypeNode: line 163 (has getTreeItem)

Replaced the dead code with a TypeError that documents the exhaustiveness check: if a new type is added to DatabaseChildrenType, this line will serve as a catch-all to detect unhandled cases at runtime. The implicit exhaustiveness check means TypeScript will flag unhandled types if strict type checking is enabled on the union type.

This change had no test coverage because the code was unreachable.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Replaced bracket notation access to the private getTreeItemDescription() method with direct testing through the public getTreeItem() API:

Before: column["getTreeItemDescription"](column.getParsedType())
After: column.getTreeItem().description

Benefits:
- Tests the actual public API users interact with (getTreeItem)
- Eliminates awkward private method access via bracket notation
- Cleaner test code that's easier to understand
- No longer needs to manually call getParsedType() and pass it in

The getTreeItemDescription() private method only exists to support getTreeItem(), so testing it through the public API is more appropriate.

Renamed describe block from "treeItemDescription" to "getTreeItem() description" to better reflect what's being tested.

All 48 FlinkRelationColumn tests passing (no regressions).

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
The test was incorrectly expecting that searchableText() on a ROW node would include the names of fields inside the ROW. In reality, searchableText() should only include metadata for the node itself: field name (if present), data type, nullability, and comment. The field names inside compound types are part of separate child nodes. Updated the test to use a ROW type with a field name and verify it includes the field name and type string, which is the correct behavior. All 3237 tests passing with no regressions. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
The startPos was being calculated after consuming the keyword, resulting in incorrect substring extraction. For example, ARRAY<INT> would produce "RRAY<INT>" instead of "ARRAY<INT>". Fixed by capturing startPos before consuming keywords in the parse() method, then passing it to the specific type parse methods. This ensures extractSince() gets the correct starting position to include the full keyword and structure. Now fullDataTypeString correctly matches the original input syntax: - ARRAY<INT> - MULTISET<VARCHAR> - ROW<id INT, name VARCHAR> - MAP<INT, VARCHAR> All 3237 tests passing. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
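A toy illustration of the fix, assuming a cursor-style parser (the real parser lives in src/parsers/flinkTypeParser.ts; the `TypeTextCursor` class and method names here are hypothetical stand-ins). The key point is recording the start offset before consuming the leading keyword, so the extracted substring includes it:

```typescript
// Hypothetical cursor over the type text; names are illustrative only.
class TypeTextCursor {
  pos = 0;
  constructor(readonly input: string) {}

  consumeKeyword(keyword: string): void {
    if (this.input.startsWith(keyword, this.pos)) {
      this.pos += keyword.length;
    }
  }

  extractSince(startPos: number): string {
    return this.input.slice(startPos, this.pos);
  }
}

function parseArrayType(cursor: TypeTextCursor): string {
  const startPos = cursor.pos; // captured BEFORE consuming "ARRAY" (the fix)
  cursor.consumeKeyword("ARRAY");
  cursor.consumeKeyword("<INT>"); // stand-in for parsing the element type
  // Because startPos predates the keyword, the full "ARRAY<INT>" is
  // extracted instead of a substring missing part of the keyword.
  return cursor.extractSince(startPos);
}
```

Capturing the offset after consumption is what produced truncated strings; moving the capture one statement earlier restores fidelity to the original input.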
Added 20 unit tests to explicitly verify that fullDataTypeString is generated correctly across all type categories:

- Scalar types (INT, VARCHAR, etc.)
- Parameterized types (VARCHAR(255), DECIMAL(10,2))
- Types with nullability markers (NOT NULL, NULL)
- ARRAY, MULTISET, ROW, MAP types with various nesting levels
- Types with keywords (TIMESTAMP WITH TIME ZONE, INTERVAL YEAR TO MONTH)

These tests ensure that the parser's fullDataTypeString matches the original input, which is critical for displaying complete type definitions in tooltips and other UI elements.

Also fixed a bug where parseNullability() was called after extracting fullDataTypeString in scalar type parsing, causing NOT NULL markers to be excluded. Now nullability is consumed before extraction.

All 3257 tests passing (+20 new tests).

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Renamed FlinkRelationColumn.getTypeChildren() to getChildren() to match the naming convention used by FlinkTypeNode.getChildren(). This improves consistency across the codebase where both classes implement similar methods for tree expansion. Updated: - Method definition in flinkRelation.ts - Call site in flinkDatabase.ts - All test references in test files All 3257 tests passing. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Changed parentColumnId from `string | null` to `string` (required) because in real code paths created from FlinkRelationColumn.getChildren(), it is always populated. Making it non-nullable clarifies the API contract and prevents confusion about when it can actually be null.

Changes:
- Field declaration now requires string instead of string | null
- Constructor now requires the parentColumnId parameter (no longer optional)
- Updated id getter to always include parentColumnId in the path
- Updated getChildren() to pass parentColumnId directly (no ?? logic needed)
- Updated all test instantiations to provide parentColumnId

The field remains non-nullable for test code as well, using "test-col-id" as a default value where tests create nodes outside the normal FlinkRelationColumn flow.

All 3257 tests passing.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Clarified why parentNode is essential: it's required by VS Code's TreeView API (getParent() must return the exact instance created by getChildren()), and it enables proper tree navigation and ID path construction through the parent node chain. The expanded docstring helps future maintainers understand that parentNode is not just an optional convenience - it's a critical part of the TreeView integration contract. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
…eness

Clarified the critical role of parentColumnId in ensuring TreeView item ID uniqueness. Without it anchoring each node to its source column, identical field names across different columns would produce duplicate IDs, breaking the TreeView.

The expanded docstring highlights:
- Its role in ensuring unique IDs across columns
- The hierarchical ID construction combining column ID with field names
- A concrete example showing how two columns with the same nested field name generate distinct IDs

This makes it clear that parentColumnId is not just metadata - it's essential for correct TreeView behavior.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
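A sketch of the hierarchical ID construction described in that docstring, anchored on the parent column ID (the real logic lives in FlinkTypeNode's id getter; the `TypeNodeLike` shape and `nodeId` helper below are hypothetical stand-ins for illustration):

```typescript
// Hypothetical minimal shape of a type node for ID construction.
interface TypeNodeLike {
  fieldName: string;
  parentColumnId: string;
  parentNode?: TypeNodeLike;
}

function nodeId(node: TypeNodeLike): string {
  // Walk up the parentNode chain, collecting field-name segments.
  const segments: string[] = [node.fieldName];
  let current = node.parentNode;
  while (current) {
    segments.unshift(current.fieldName);
    current = current.parentNode;
  }
  // Prefixing with parentColumnId keeps identical nested field names in
  // different columns (colA.address.city vs colB.address.city) distinct.
  return `${node.parentColumnId}.${segments.join(".")}`;
}
```

Two columns that both contain a nested `address.city` field then yield distinct IDs because the column ID prefix differs.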
Pull request overview
Adds first-class TreeView support for fully-expanded Flink SQL data types in the Flink Database view by introducing a FlinkTypeNode model, extending the full datatype parser output with an exact fullDataTypeString segment for tooltips, and wiring expand/collapse + icons through the existing view provider patterns.
Changes:
- Introduce `FlinkTypeNode` as a tree-aware wrapper over parsed `FlinkType`, including tooltips, icons, IDs, and child expansion.
- Extend the Flink datatype parser/model to retain `fullDataTypeString` per parsed node, and add a shared display formatter (`formatFlinkTypeForDisplay`).
- Update the Flink Database view + relation column models to surface nested type children and add extensive unit test coverage.
Reviewed changes
Copilot reviewed 13 out of 13 changed files in this pull request and generated 6 comments.
| File | Description |
|---|---|
| src/viewProviders/flinkDatabase.ts | Integrates FlinkTypeNode into Flink Database tree expansion/parenting and reveal behavior. |
| src/viewProviders/flinkDatabase.test.ts | Adds unit tests for expanding/revealing FlinkTypeNode and column type children. |
| src/utils/flinkTypes.ts | Adds unified Flink type display formatting and updates SQL type cleaning. |
| src/utils/flinkTypes.test.ts | Adds tests for display formatting across scalar/compound/container types. |
| src/parsers/flinkTypeParser.ts | Captures fullDataTypeString for parsed nodes (for tooltip fidelity). |
| src/parsers/flinkTypeParser.test.ts | Updates/extends parser tests to validate fullDataTypeString behavior. |
| src/models/flinkTypes.ts | Extends Flink type model with fullDataTypeString. |
| src/models/flinkTypeNode.ts | New model bridging parsed types into TreeView-friendly nodes (children/tooltips/icons/search). |
| src/models/flinkTypeNode.test.ts | Comprehensive unit tests for FlinkTypeNode behavior and rendering metadata. |
| src/models/flinkRelation.ts | Updates column rendering/expandability via parser + FlinkTypeNode, adds icons + caching/error logging. |
| src/models/flinkRelation.test.ts | Updates tests for new column formatting, tooltip behavior, and type-children expansion. |
| src/icons.ts | Adds icon constants for ROW/ARRAY/MULTISET. |
| FULL_DATATYPE_PARSER_PROJECT.md | Project notes documenting goals/approach for the parser + UI integration. |
Made parentNode able to hold either a FlinkRelationColumn (for top-level type nodes) or a FlinkTypeNode (for nested nodes), while remaining nullable for test compatibility. This enables the simplification of FlinkDatabaseViewProvider.getParent(), which now simply returns element.parentNode without needing to look up the column by ID, since the parent is already directly referenced.

Benefits:
- Clearer semantics: parentNode can be a Column (top-level) or TypeNode (nested)
- Simpler getParent() implementation: no more ID lookups needed in most cases
- Preserved backward compatibility: parentNode is still optional for tests
- Better type clarity: callers know what types of parents are possible

Updated id getter to safely stop traversal when encountering a FlinkRelationColumn.

All 3257 tests passing.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
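A minimal sketch of the simplified getParent() described above; the class bodies are stubs and the exact signatures are assumptions, not the real extension code:

```typescript
// Stub stand-ins for the real model classes.
class FlinkRelationColumn {
  readonly kind = "column" as const;
}
class FlinkTypeNode {
  readonly kind = "typeNode" as const;
  constructor(readonly parentNode: FlinkRelationColumn | FlinkTypeNode) {}
}

function getParent(element: FlinkTypeNode): FlinkRelationColumn | FlinkTypeNode {
  // The parent instance is carried on the node itself, so no ID-based lookup
  // into the relations container is needed. VS Code's TreeView requires this
  // to be the same instance previously returned from getChildren().
  return element.parentNode;
}
```

Returning the stored instance directly is what satisfies the TreeView identity requirement: reveal() walks getParent() upward and compares instances, not IDs.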
Replace all 'as any' casts with the proper 'as CustomMarkdownString' type for tooltip assertions. Improves type safety without changing test logic.

- Import CustomMarkdownString from ./main
- Replace 9 instances of (item.tooltip as any).value with proper typing

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Replace meaningless assertion `assert(x || !x)` with explicit assertion that proves Comment field is not present when no comment is provided. The test case provides no comment, so we should explicitly verify the Comment field does not appear in the tooltip. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
For ARRAY and MULTISET types, recursively call formatFlinkTypeForDisplay() on the element type instead of using formatSqlType() on just the dataType. This ensures nested containers like ARRAY<ROW<...>> or MULTISET<MAP<...>> are properly formatted with their full structure, not just the base type name.

Add comprehensive tests for nested containers:
- ARRAY<ROW> formats as "ROW[]"
- ARRAY<ARRAY<INT>> formats as "INT[][]"
- ARRAY<MULTISET<INT>> formats as "INT MULTISET[]"
- MULTISET<ARRAY<INT>> formats as "INT[] MULTISET"

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
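A rough sketch of the recursive display rule, using a simplified `FlinkType` shape (the real formatter is in src/utils/flinkTypes.ts; field names here are assumptions):

```typescript
// Simplified stand-in for the parsed type model.
interface FlinkType {
  kind: "SCALAR" | "ARRAY" | "MULTISET" | "ROW" | "MAP";
  dataType: string; // e.g. "INT" for scalars
  elementType?: FlinkType; // present for ARRAY/MULTISET
}

function formatFlinkTypeForDisplay(type: FlinkType): string {
  switch (type.kind) {
    case "ARRAY":
      // Recurse into the element so ARRAY<ARRAY<INT>> becomes "INT[][]"
      return `${formatFlinkTypeForDisplay(type.elementType!)}[]`;
    case "MULTISET":
      return `${formatFlinkTypeForDisplay(type.elementType!)} MULTISET`;
    case "ROW":
      return "ROW";
    case "MAP":
      return "MAP";
    default:
      return type.dataType;
  }
}
```

The recursion is what makes the difference for nested containers: formatting only the base dataType would flatten ARRAY<ARRAY<INT>> to a single suffix instead of stacking them.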
Add test case proving that when a ROW field has a comment (single-quoted string in the type definition), the comment appears in the tooltip as a 'Comment' field with the comment text. This validates end-to-end that comments are properly parsed, attached to field nodes, and rendered in the TreeView item tooltip. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
1. Disable duplicate enum value rule for FLINK_TYPE_ROW and FLINK_TYPE_MULTISET
   - Both intentionally use the 'confluent-function' icon for the demo
   - Added eslint-disable-next-line comments to suppress the warnings
2. Change CustomMarkdownString import to a type-only import
   - It's only used in type casts, not runtime code
   - Using 'import type' ensures it's removed at compile time

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Make parentNode a required property on the FlinkTypeNode constructor instead of optional. This eliminates the need for null checks and fallback ID-based parent lookups in the view provider.

Changes:
- Update FlinkTypeNode constructor to require parentNode (not optional)
- Update parentNode type from a union with null to just the union of parent types
- Add TEST_PARENT_COLUMN constant in flinkTypeNode.test.ts for all test nodes
- Pass parentNode: this in FlinkRelationColumn.getChildren()
- Pass parentNode: TEST_VARCHAR_COLUMN in flinkDatabase.test.ts
- Update test assertion to expect direct parentNode return instead of undefined

Benefit: simpler code, stronger type safety, no special-case null handling needed for tests.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Remove 'creates node with parentNode' test that only verified parentNode was stored as-is from the constructor. This is now trivial since parentNode is required and always assigned directly by the constructor. The type system already guarantees this behavior. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
The test was hardcoding the connection ID string, which differs between
stable ('vscode-confluent-cloud-connection') and insiders
('vscode-insiders-confluent-cloud-connection') editions.
Import CCLOUD_CONNECTION_ID constant from constants and use it in the
assertion instead of hardcoding the string.
Fixes CI/CD failures when running tests against vscode-insiders edition.
Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Replace the regex pattern /\(2147483647\)/g with literal string '(2147483647)' in formatSqlType(). String literals are simpler and more efficient for fixed text replacement than regex patterns. Resolves SonarQube suggestion: use literal string instead of regex. All 14 flinkTypes tests passing. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Reorder getChildren() in both FlinkRelationColumn and FlinkTypeNode to
check the positive condition (this.isExpandable) first, then handle the
else case (not expandable) at the end. This avoids the negated condition
pattern that SonarQube flags.
Also added clarifying comment in else clause: 'not expandable; no children'
since the else is now further from the positive condition.
Structure:

    if (this._children === null) {
      if (this.isExpandable) {
        // construct children for expandable types
      } else {
        // not expandable; no children
        this._children = [];
      }
    }
    return this._children;
Resolves SonarQube warning: 'Unexpected negated condition'
All 189 Flink tests passing.
Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Change parameter type from DatabaseChildrenType to Exclude<DatabaseChildrenType, FlinkTypeNode> to prevent type-safety violations where code could pass type checking but fail at runtime. With the type system now enforcing this, the runtime check for FlinkTypeNode is no longer needed, and the test verifying the runtime error has been removed (since it's now a compile-time error instead). Resolves Copilot comment: API type mismatch in revealResource() All 147 flinkDatabase tests passing. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
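A sketch of the Exclude<> restriction described above, with a trimmed-down union (the real DatabaseChildrenType has many more members):

```typescript
// Stub stand-ins for two of the union's members.
class FlinkRelation {
  readonly kind = "relation" as const;
}
class FlinkTypeNode {
  readonly kind = "typeNode" as const;
}
type DatabaseChildrenType = FlinkRelation | FlinkTypeNode;

// Exclude<> turns "don't pass transient FlinkTypeNode instances here" from a
// runtime check into a compile-time rule: revealResource(new FlinkTypeNode())
// no longer type-checks at all.
function revealResource(element: Exclude<DatabaseChildrenType, FlinkTypeNode>): string {
  return element.kind;
}
```

This is why the runtime guard and its test could be deleted: the invalid call site can no longer be written in type-checked code.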
- Fix ARRAY test to use proper ARRAY<INT> syntax instead of scalar "INT ARRAY"
- Update JSDoc examples in flinkTypes.ts to match actual parser output (ARRAY<INT>, MULTISET<...>)
- Fix CHANGELOG.md typo: "Fink Database" → "Flink Database"
- Improve error message in getTreeItem() to include constructor.name and id
- Remove unused CompoundFlinkType import from flinkRelation.ts

All 211 tests passing. Type checking clean.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Replace unsafe `(element as any)` casts with proper type narrowing using typeof checks and structural type casts. This maintains diagnostic value (showing className and id in error messages) while respecting strict typing requirements. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
When type parsing fails, the synthetic fallback SCALAR type now inherits the column's actual nullability (this.isNullable) instead of unconditionally setting isFieldNullable to true. This prevents losing the column's nullability information in downstream formatting/search. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
…odes

Implemented a recursive check for ARRAY/MULTISET expandability that correctly handles nested containers at any depth. Added a hasRowOrMapAtAnyDepth() helper to recursively check if a type eventually contains ROW or MAP.

Key changes:
- isExpandable now checks recursively for nested containers (e.g., ARRAY<ARRAY<ROW>> is expandable, ARRAY<ARRAY<INT>> is not)
- getChildren() creates intermediate nodes with synthetic [element] ID segments for nested ARRAY/MULTISET, preserving ID uniqueness
- Preserves the existing UX optimization: ARRAY<ROW> still skips directly to fields without an intermediate node
- Applied the same fix to both FlinkTypeNode and FlinkRelationColumn

Example tree structures:
- ARRAY<ROW<id INT>> → id field (skip intermediate, existing behavior)
- ARRAY<ARRAY<ROW<id INT>>> → [element] (ARRAY) → id field (new behavior)
- ARRAY<ARRAY<INT>> → not expandable (leaf node, no ROW/MAP at any depth)

Fixes an issue where nested containers caused duplicate TreeView IDs.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Moved the recursive type checking logic from FlinkTypeNode and FlinkRelationColumn into a shared utility function in the flinkTypes module. This eliminates 28 lines of duplication while improving code organization and maintainability.

Changes:
- Added hasRowOrMapAtAnyDepth() as an exported function in flinkTypes.ts
- Removed the private static helper methods from FlinkTypeNode and FlinkRelationColumn
- Both classes now import and use the centralized function
- No behavioral changes; all 175 type-related tests pass

Benefits:
- Single source of truth for the recursive type checking logic
- Easier to test and understand the algorithm
- Better separation of concerns (model logic in models, utilities in utilities)
- Reduced code duplication (-28 net lines)

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
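A minimal sketch of what such a shared helper could look like; the `FlinkType` shape below is a simplified assumption (the real model lives in src/models/flinkTypes.ts):

```typescript
// Simplified stand-in for the parsed type model; field names are assumptions.
interface FlinkType {
  kind: "SCALAR" | "ARRAY" | "MULTISET" | "ROW" | "MAP";
  elementType?: FlinkType; // present for ARRAY/MULTISET
}

/**
 * True if the type is, or eventually wraps, a ROW or MAP at any nesting
 * depth: ARRAY<ARRAY<ROW<...>>> -> true, ARRAY<ARRAY<INT>> -> false.
 */
function hasRowOrMapAtAnyDepth(type: FlinkType): boolean {
  if (type.kind === "ROW" || type.kind === "MAP") {
    return true;
  }
  if ((type.kind === "ARRAY" || type.kind === "MULTISET") && type.elementType) {
    // Containers delegate the question to their element type.
    return hasRowOrMapAtAnyDepth(type.elementType);
  }
  return false;
}
```

Keeping this as a pure function over the parsed type (rather than a method on each node class) is what lets both FlinkTypeNode and FlinkRelationColumn share it.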
…tainer IDs

Changed synthetic ID segments from the generic '[element]' to a descriptive '[array]' or '[multiset]' based on the container type being wrapped. This makes intermediate nodes more identifiable in the tree hierarchy and ID chains.

Updates:
- Synthetic ID segments now reflect the actual container type: [array] for ARRAY, [multiset] for MULTISET
- Fixed a bug where elementType.kind was used instead of kind for determining the label
- Updated test cases to expect the new ID format
- Added a comprehensive test for ARRAY<ARRAY<MULTISET<ROW<id INT>>>> covering:
  - Deep nesting with mixed container types
  - Proper intermediate node creation at each level
  - Correct ID generation and uniqueness across the hierarchy
  - Expansion behavior through all nesting levels

Example ID chains:
- table.column.[array].[multiset].fieldName
- table.column.[array].[array].fieldName

All 176 type-related tests passing (added 1 new test).

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
The isExpandable logic was duplicated identically in both FlinkTypeNode and FlinkRelationColumn. Extract the core type expansion rules into a pure function isFlinkTypeExpandable() in the flinkTypes module, making it the single source of truth for expandability logic. Both classes now delegate to this function, reducing maintenance burden and preventing divergence of the rules. This refactoring improves testability and makes the logic more discoverable as a reusable type utility. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
…tTreeItem()

Extract nested conditionals and error handling logic into focused helper methods:

- buildTreeItem(): routes to the appropriate TreeItem builder based on element type
- createUnhandledTypeError(): creates a detailed error with debug information
- extractElementInfo(): extracts className and id from an unknown element type

Benefits:
- The main getTreeItem() method is now 5 lines (clear intent)
- buildTreeItem() uses flat if-returns instead of nested if-else chains
- Error handling logic is isolated and testable
- Easier to add new element types in the future

This refactoring follows the same patterns used in FlinkTypeNode.getChildren(), maintaining consistency across the codebase.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Replace the error handling helper methods (createUnhandledTypeError and extractElementInfo) with a proper TypeScript exhaustiveness check using the never type. This approach is cleaner and more idiomatic.

Benefits:
- Compile-time safety: TypeScript will flag missing cases when DatabaseChildrenType changes, rather than relying on runtime error handling
- Eliminates ~40 lines of error handling code
- Simpler control flow: no runtime exceptions thrown
- Clearer intent: the never type explicitly documents exhaustiveness checking
- More explicit: upgraded the duck-typing check to explicit instanceof checks for FlinkRelation, FlinkRelationColumn, and FlinkTypeNode

If a new type is added to DatabaseChildrenType in the future, TypeScript will flag the never assignment as unreachable, preventing silent bugs.

Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
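A sketch of the never-based exhaustiveness pattern described above, again with a trimmed-down union (class bodies are stubs, not the real models):

```typescript
// Stub stand-ins; distinct literal fields keep them structurally distinct.
class FlinkRelation {
  readonly kind = "relation" as const;
}
class FlinkRelationColumn {
  readonly kind = "column" as const;
}
class FlinkTypeNode {
  readonly kind = "typeNode" as const;
}
type DatabaseChildrenType = FlinkRelation | FlinkRelationColumn | FlinkTypeNode;

function buildLabel(element: DatabaseChildrenType): string {
  if (element instanceof FlinkRelation) return "relation";
  if (element instanceof FlinkRelationColumn) return "column";
  if (element instanceof FlinkTypeNode) return "type node";
  // Exhaustiveness check: if a new member is added to DatabaseChildrenType
  // without a branch above, this assignment stops compiling.
  const unhandled: never = element;
  throw new TypeError(`Unhandled element type: ${JSON.stringify(unhandled)}`);
}
```

The throw is unreachable in correct code; its only job is to document the check and fail loudly if the compiled output is ever called with an unexpected value.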
…nsContainer Changed the FlinkTypeNode test case from expecting undefined to expecting relationsContainer. This is the correct behavior since FlinkTypeNode instances are children of FlinkRelationColumns, which belong to the relations container. This test now verifies that getParent() correctly handles FlinkTypeNode by returning the relations container. Co-Authored-By: Claude Haiku 4.5 <noreply@anthropic.com>
Dave Shoup (shouples)
left a comment
Love this! 🙌 Only concerns are about the icon enum values and usage, but we'll need to do some icon-related follow-up before the 2.3 release anyway.
    default:
      return IconNames.FLINK_FUNCTION;
hah -- I was thinking 'f' for 'field', forgot that over there it was 'f' for 'function.' Sigh.
That said, a followup item for this parser is to have each UDF param or return type also be a child node of the UDF, then possibly have them be expandable if compound, like here. So:
- MyFunctionB:
  - Parameters:
    - a INT
    - B INT
  - Return type:
    - INT
But if it returned, say, a ROW type:
* MyFunctionB:
  * Parameters:
    * a INT
    * B INT
  * Return type:
    * ROW:
      * name: VARCHAR
      * age: INT
      * ...





Summary of Changes
- New `FlinkTypeNode` class to bridge the Flink full datatype parser output (`FlinkType` interface objects introduced here) to become fully fledged children of the Flink Database view.
- `ARRAY<INT>` will not be expandable, but instead will look like 'INT []'.
- `ARRAY<ROW<id INT, name VARCHAR>>` will be expandable, showing two children `id INT` and `name VARCHAR`, and the description for the ARRAY node itself will read 'ROW []'.
- `ARRAY<ARRAY<ROW<id INT, name VARCHAR>>>` will be expandable -- the determination that there's a ROW or MAP is done recursively.
- `FlinkTypeNode` objects create their own `TreeView` items just like `FlinkRelation` does.

Click-testing instructions

- … `spotify-listening-data` columns show up. The ROW columns are now expandable, all the way down.

Caveats
Pull request checklist
Please check if your PR fulfills the following (if applicable):
Tests
Release notes