From 4f8399716a58f01c4120e4ba2f31acf2fa63d62f Mon Sep 17 00:00:00 2001
From: Øyvind Raddum Berg
Date: Sun, 3 May 2026 01:42:30 +0200
Subject: [PATCH 1/8] External foundations + Unified Types (Bridge) + DSL/CLI
 rework + site & build overhaul
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

A single chunk of work that pulls in many layers at once. Each piece forced
the next, so they ended up squashed into one commit. The pieces:

------------------------------------------------------------------------
1. foundations becomes an external published artifact (RC6)
------------------------------------------------------------------------

`foundations-jdbc` and its dsl/hikari/scala/kotlin wrappers used to be
in-repo bleep projects. They are now published as
`dev.typr.foundations:foundations-jdbc{,-kotlin,-scala_3,-hikari}:1.0.0-RC6-SNAPSHOT`,
and the in-repo modules are deleted. This is the kernel that everything
else hangs off.

Foundations itself was reshaped on the way out:

- `Connection` (writable) is split from `ConnectionRead` (read-only).
- `Operation` and `OperationRead` form a sealed hierarchy. `OperationRead`
  declares `run(ConnectionRead)` and overrides `run(Connection)` so that
  Scala 3's overload resolver picks the most specific candidate without
  ambiguity.
- The Kotlin wrapper makes `Bijection.of` public and exposes
  `Connection.javaConnection` (previously `internal`) so that
  typr-dsl-kotlin can unwrap it when delegating.

------------------------------------------------------------------------
2. Codegen rewrite (DbLibFoundations + adapters)
------------------------------------------------------------------------

`DbLibFoundations.scala` (~2400 LOC) is rewritten around the
Connection/ConnectionRead split:

- Read methods take `(using c: ConnectionRead)`; write methods take
  `(using c: Connection)`.
- A single `runOn(connParam)` helper emits `.run(using c)` for Scala and
  `.run(c)` for Java/Kotlin via `jvm.ApplyNullary`.
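As an illustration of the read/write split and the overload dispatch above, here is a minimal Java sketch. The interfaces are hypothetical, stripped-down stand-ins for the real foundations types (the actual split is exercised from Scala 3, where the same most-specific-overload rule applies):

```java
import java.util.List;

// Hypothetical stand-ins for the real foundations types.
interface ConnectionRead {
    List<String> query(String sql);
}

// A writable Connection is also readable.
interface Connection extends ConnectionRead {
    int update(String sql);
}

// Write operations need a full Connection.
sealed interface Operation<T> permits OperationRead {
    T run(Connection c);
}

// Read operations only need ConnectionRead, but also accept a full
// Connection by delegating to the read-only overload. The compiler
// picks the most specific applicable overload at each call site, so
// there is no ambiguity.
non-sealed interface OperationRead<T> extends Operation<T> {
    T run(ConnectionRead c);

    @Override
    default T run(Connection c) {
        return run((ConnectionRead) c);
    }
}
```

Callers holding a writable `Connection` can run read-only operations directly; callers holding only a `ConnectionRead` can never run a write operation, which is the safety property the split buys.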
- `dbTypeArray` is now generated as `pgType.array()` (PG) or
  `duckDbType.list()` (DuckDB), so the inner Bijection is reused instead
  of duplicated. `pgType` is emitted before `pgTypeArray` to satisfy
  Java's forward-reference rule for static fields.

Six new adapters replace inline dialect logic: `PostgresAdapter`,
`OracleAdapter`, `MariaDbAdapter`, `DuckDbAdapter`, `SqlServerAdapter`,
`Db2Adapter` (~250–350 LOC each). They centralise quoting, type casting,
null-safe equality (`IS NOT DISTINCT FROM`, `<=>`, `IS b`), and
pagination clauses.

`db.scala` (~530 LOC) defines the typed model of every DB column type:
`PgType`, `MariaType`, `OracleType`, `DuckDbType`, `SqlServerType`,
`Db2Type`, with composite/array/enum subtypes.

`FilePreciseType` (~2200 LOC) and `FilePgCompositeType` (~250 LOC) handle
precise-type and composite rendering.

------------------------------------------------------------------------
3. Unified Types / Bridge — the headline feature
------------------------------------------------------------------------

The new `typr/src/scala/typr/bridge/` tree introduces the domain model:

Model: DomainTypeDefinition, DomainField, PrimarySource, AlignedSource,
FieldOverride (Forward / Drop / MergeFrom / SplitFrom / ComputedFrom /
Enrichment), TypePolicy (Exact / AllowWidening / AllowNarrowing /
AllowPrecisionLoss / AllowTruncation / AllowNullableToRequired),
CompatibilityMode (Exact / Superset / Subset).

Validation: FlowValidator, TypePolicyValidator, SmartDefaults.

Public API: BridgeApi.{check, resolveFlows} — `check` validates
declarations against the actual source entities; `resolveFlows` produces
field-level mappings.

Inference: ColumnTokenizer, ColumnStemmer, ColumnGrouper, TypeSuggester,
CompositeTypeSuggester. Used to suggest domain types from column naming
patterns across sources.

Codegen: FileBridgeCompositeType (record + JSON codecs),
FileBridgeProjectionMapper (`fromXxx` / `toXxx` mapper methods for every
aligned source).
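To make the override model concrete, here is an illustrative Java miniature of the `FieldOverride` variants listed above. The real Bridge model is Scala and the record components here are assumptions, not the actual fields:

```java
import java.util.List;

// Illustrative miniature of the FieldOverride variants; component
// names and shapes are assumptions for the sketch only.
sealed interface FieldOverride {
    // Pass a source field through unchanged.
    record Forward(String sourceField) implements FieldOverride {}

    // Exclude a source field from the domain type.
    record Drop(String sourceField) implements FieldOverride {}

    // Build one domain field out of several source fields.
    record MergeFrom(List<String> sourceFields) implements FieldOverride {}

    // Derive several domain fields from one source field.
    record SplitFrom(String sourceField, List<String> targetFields) implements FieldOverride {}

    // Compute the domain field with an expression over source fields.
    record ComputedFrom(List<String> sourceFields, String expression) implements FieldOverride {}

    // Add a field that exists in no source at all.
    record Enrichment(String fieldName, String defaultValue) implements FieldOverride {}
}
```

Because the interface is sealed, downstream code (validators, mapper codegen) can pattern-match exhaustively over the variants and the compiler flags any newly added variant that is not handled.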
Adapters: BridgeAvroAdapter and BridgeProtoAdapter translate
ComputedAvroRecord / proto messages into Bridge's `ExternalRecord`, so
domain types align across DB rows, OpenAPI models, Avro records, and
gRPC messages alike.

Tests at `tests/src/scala/typr/bridge/` (~67 cases / ~1k LOC):
FlowValidatorTest, SmartDefaultsTest, TypePolicyValidatorTest,
TypeNarrowerIntegrationTest.

------------------------------------------------------------------------
4. typr DSL renamed and expanded
------------------------------------------------------------------------

`foundations-jdbc-dsl{,-scala,-kotlin}` → `typr-dsl{,-scala,-kotlin}`,
significantly expanded.

Java core (~3.5k LOC): Dialect.java (~860 LOC) — a unified dialect
interface with null-safe operators, per-dialect pagination, and
composite-type support. Plus SelectBuilder, DeleteBuilder, UpdateBuilder,
GroupedBuilder, SqlExpr, RenderCtx, GenericDbTypes, RowCodecDbType.

The Scala wrapper (~1.5k LOC) and the Kotlin wrapper (~1k LOC) ship as
separate published artifacts.

------------------------------------------------------------------------
5. CLI tool reshape
------------------------------------------------------------------------

`typr/src/scala/typr/cli/` is rewritten. `Main.scala` exposes:

- `generate` — a two-phase parallel pipeline across six boundary kinds
  (Database, DuckDB, OpenAPI, JSON Schema, Avro, gRPC) with a per-output
  progress tracker.
- `watch` — re-runs `generate` on schema/SQL changes.
- `check` — validates that domain types align across the configured
  sources via BridgeApi.

`TyprConfig` (circe-driven) accepts both the legacy `boundaries:` key and
the new `sources:` key, loads `types:` for Bridge domain types, and
supports `${ENV_VAR}` substitution.

New helpers: ConfigParser, ConfigToOptions, ConfigWriter, EnvSubstitution,
PatternMatcher (glob → typr.Selector), SourceEntityLoader,
ProjectionFieldFormat.

------------------------------------------------------------------------
6.
Boundary expansion: OpenAPI, Avro, gRPC
------------------------------------------------------------------------

OpenAPI gets a full parser/computed/codegen split:

Parser layer: OpenApiParser, ModelExtractor, TypeResolver, SpecValidator.

Computed layer: ComputedApiService, ComputedEndpoint, ComputedModel,
ComputedParameter, ComputedProperty.

Codegen layer: OpenApiCodegen, ModelCodegen, ApiCodegen.

JSON Schema is added alongside: JsonSchemaParser, JsonSchemaCodegen.

Avro gets a parallel computed-types layer (ComputedAvroRecord,
ComputedAvroField, ComputedProtocol, ComputedEventGroup) and Cats
integration (KafkaFrameworkCats). gRPC gets ComputedGrpcMethod /
ComputedGrpcService and GrpcFrameworkCats.

A new `boundaries/` framework abstraction replaces ad-hoc per-framework
branching: Framework (base) → HttpFramework / MessagingFramework /
RpcFramework → SpringFramework / QuarkusFramework / CatsFramework.

------------------------------------------------------------------------
7. Build tool migration: Gradle → Bleep (Kotlin side)
------------------------------------------------------------------------

The whole Kotlin side switched build tools. All `build.gradle.kts` /
`settings.gradle.kts` / `gradle/` infrastructure is deleted. Kotlin
projects fold into `bleep.yaml` via:

- template-kotlin (Kotlin 2.3.0, jvmTarget 21)
- template-kotlin-db-tester (per-DB tester scaffold)

Every Kotlin tester (db2 / duckdb / mariadb / oracle / pg / sqlserver /
combined) is now a few lines in bleep.yaml instead of a Gradle
subproject: a single-tool monorepo with one project list and one compile
graph across Java/Kotlin/Scala/Scala-2.13 cross-builds.

------------------------------------------------------------------------
8.
Build / dep churn
------------------------------------------------------------------------

Bleep config: 0.0.14 → 1.0.0-M3
Scala 3:      3.7.3 → 3.8.3
Kotlin:       2.3.0 (new templates, jvmTarget 21)
JVM:          GraalVM 25.0.0
Scala 2.13:   unchanged at 2.13.16

New deps for the events / reactive frameworks:

  io.confluent:kafka-avro-serializer:7.8.0
  org.apache.kafka:kafka-clients:3.9.0
  org.apache.avro:avro:1.12.0
  io.smallrye.reactive:mutiny:2.6.1
  io.smallrye.reactive:mutiny-zero-flow-adapters:1.0.0
  jakarta.validation:jakarta.validation-api:3.0.2
  jakarta.ws.rs:jakarta.ws.rs-api:3.1.0

Test framework migrated:
com.novocode:junit-interface:0.11 → com.github.sbt:junit-interface:0.13.3

------------------------------------------------------------------------
9. Documentation site revamp
------------------------------------------------------------------------

The site is rebuilt around the new "every boundary, unified" narrative.
docusaurus.config.js collapses to a single `/typr/` route served from
site/docs-typr/. The legacy site/docs-* trees and their four sidebars
(sidebars-{api,avro,db,jdbc}.js) are unhooked.

75 newly written pages in site/docs-typr/:

- Unified Types: overview, domain-types, field-types, configuration
- Databases: setup, type-safety (id-types, precise-types, type-flow,
  enums/open-enums, domains, struct/collection/array/maps, date-time,
  defaulted-types, user-selected-types), customization, patterns,
  advanced features, testing
- REST APIs: overview, type-safe-ids, response-types, server-frameworks,
  client-generation, usage
- Events (Avro/Kafka): setup, schemas, wire-formats, effect-types, Kafka
  producers/consumers/headers/multi-event, type-safety
  (wrappers/precise/unions), Kafka RPC (Spring/Quarkus), reference

Sidebar: sidebars-typr.js (~200 LOC, 68 doc entries grouped by boundary).
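Circling back to the `${ENV_VAR}` substitution that `TyprConfig` supports (section 5): a minimal sketch of that style of substitution. This is hypothetical Java; the real `EnvSubstitution` helper is Scala, and its exact behavior, such as how unset variables are reported, is an assumption here:

```java
import java.util.function.Function;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch of ${ENV_VAR}-style substitution; the real
// EnvSubstitution helper may differ in syntax and error handling.
final class EnvSubstitutionSketch {
    private static final Pattern VAR =
            Pattern.compile("\\$\\{([A-Za-z_][A-Za-z0-9_]*)}");

    // `env` is injected instead of calling System.getenv directly,
    // which keeps the logic testable.
    static String substitute(String input, Function<String, String> env) {
        Matcher m = VAR.matcher(input);
        StringBuilder out = new StringBuilder();
        while (m.find()) {
            String value = env.apply(m.group(1));
            if (value == null) {
                throw new IllegalArgumentException(
                        "Unset variable: " + m.group(1));
            }
            m.appendReplacement(out, Matcher.quoteReplacement(value));
        }
        m.appendTail(out);
        return out.toString();
    }
}
```

Failing loudly on an unset variable (rather than substituting an empty string) is the design choice assumed here, since a half-formed JDBC URL is harder to debug than an early error.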
The landing page (site/src/pages/index.js) is replaced wholesale
(+1069 LOC): a problem→solution narrative, side-by-side "without/with
Typr" code panels, a domain-type config preview, a precision section, a
boundaries section, a stack selector, and a final CTA.

New React components: BoundaryDiagram, CodeSample, ShowcaseSnippet,
StackSelector, UsageExample, TypoLogo.

The showcase pipeline (site/scripts/extract-snippets.js + the
generate-showcase bleep script + site/showcase-generated/) lets docs
reference real generated code by stable markers, so doc snippets stay in
sync with what the generator actually produces.

The cross-language site/usage-examples/ tree (Java/Kotlin/Scala) hosts
runnable examples shown inline in the docs.

------------------------------------------------------------------------
10. Tester sweeps & misc
------------------------------------------------------------------------

- oracle/scala uses summon[Connection] in tests after the
  context-function conversion of the withConnection helpers.
- sqlserver tests use compareToIgnoreCase to match SQL Server's
  SQL_Latin1_General_CP1_CI_AS collation when assertions iterate over
  data leaked from concurrent tests under READ_UNCOMMITTED.
- Distinct per-test-class Random seeds (file paths hashed to 31 bits)
  dodge the MariaDB deadlocks caused by shared-seed parallel tests.
- .javafmt.conf is added at the repo root to exclude
  **/generated-and-checked-in/** from Java formatting.
- testers/combined/{java,kotlin} are added; they bundle
  Avro+Kafka+Mutiny+REST in one tester to exercise cross-boundary type
  unification.
- New per-DB Kotlin testers (testers/{db2,duckdb,mariadb}/kotlin) sit
  under template-kotlin-db-tester.

------------------------------------------------------------------------

Why one commit: every layer touches every other.
The foundations split forces the codegen rewrite; the codegen rewrite
enables the Connection/ConnectionRead split; that split surfaces in every
generated repo across every tester; the Bridge work needs new
computed-type layers in OpenAPI/Avro/gRPC and new file generators in
codegen; the new boundary types need the Kafka/Avro/Mutiny deps; the new
deps need the bleep/Scala/Kotlin bumps; the Gradle→Bleep migration came
in along the way; and the site is rewritten to document the unified
result. Splitting this cleanly was not practical.

Co-Authored-By: Claude Opus 4.7 (1M context)
---
 .claude/settings.local.json | 35 -
 .github/workflows/build.yml | 82 +-
 .gitignore | 5 +-
 .javafmt.conf | 11 +
 .scalafmt.conf | 8 +-
 BRIDGE-ARCHITECTURE.md | 974 +++
 BRIDGE-BACKEND.md | 489 ++
 CLAUDE.md | 95 +-
 TYPR-DOMAIN-PROGRESS.md | 183 +
 bleep.yaml | 592 +-
 build.gradle.kts | 21 -
 foundations-jdbc-dsl-kotlin/build.gradle.kts | 27 -
 .../dev/typr/foundations/kotlin/Fragment.kt | 169 -
 .../typr/foundations/kotlin/KotlinDbTypes.kt | 247 -
 .../dev/typr/foundations/kotlin/Operation.kt | 108 -
 .../foundations/kotlin/OptionalExtensions.kt | 61 -
 .../foundations/kotlin/ResultSetParser.kt | 26 -
 .../dev/typr/foundations/kotlin/RowParser.kt | 64 -
 .../typr/foundations/kotlin/RuntimeExports.kt | 70 -
 .../foundations/kotlin/RuntimeExtensions.kt | 126 -
 .../typr/foundations/kotlin/StaticExports.kt | 186 -
 .../dev/typr/foundations/kotlin/Structure.kt | 4 -
 .../typr/foundations/scala/Bijection.scala | 19 -
 .../typr/foundations/scala/Bijections.scala | 86 -
 .../typr/foundations/scala/DslExports.scala | 78 -
 .../dev/typr/foundations/scala/Fragment.scala | 191 -
 .../typr/foundations/scala/Operation.scala | 95 -
 .../foundations/scala/ResultSetParser.scala | 11 -
 .../typr/foundations/scala/RowParser.scala | 57 -
 .../foundations/scala/RuntimeExtensions.scala | 95 -
 .../typr/foundations/scala/ScalaDbTypes.scala | 186 -
 .../foundations/scala/StaticExports.scala | 177 -
 .../dev/typr/foundations/scala/package.scala | 81 -
 foundations-jdbc-dsl/build.gradle.kts | 26 -
 .../hikari/HikariDataSourceFactory.java | 142 -
 .../typr/foundations/hikari/PoolConfig.java | 340 -
 .../foundations/hikari/PooledDataSource.java | 102 -
 .../scala/FragmentInterpolator.scala | 55 -
 .../dev/typr/foundations/Db2TypeTest.java | 490 --
 .../dev/typr/foundations/DuckDbTypeTest.java | 807 ---
 .../dev/typr/foundations/MariaTypeTest.java | 564 --
 .../dev/typr/foundations/OracleTypeTest.java | 1804 -----
 .../typr/foundations/PgRecordParserTest.java | 385 --
 .../dev/typr/foundations/PgStructTest.java | 1099 ---
 .../java/dev/typr/foundations/PgTypeTest.java | 1013 ---
 .../typr/foundations/SqlServerTypeTest.java | 487 --
 foundations-jdbc/build.gradle.kts | 35 -
 .../src/java/dev/typr/foundations/And.java | 3 -
 .../java/dev/typr/foundations/ArrParser.java | 172 -
 .../java/dev/typr/foundations/Db2Json.java | 150 -
 .../java/dev/typr/foundations/Db2Read.java | 210 -
 .../java/dev/typr/foundations/Db2Text.java | 106 -
 .../java/dev/typr/foundations/Db2Type.java | 88 -
 .../dev/typr/foundations/Db2Typename.java | 137 -
 .../java/dev/typr/foundations/Db2Types.java | 316 -
 .../java/dev/typr/foundations/Db2Write.java | 148 -
 .../src/java/dev/typr/foundations/DbJson.java | 127 -
 .../java/dev/typr/foundations/DbJsonRow.java | 157 -
 .../src/java/dev/typr/foundations/DbRead.java | 18 -
 .../src/java/dev/typr/foundations/DbText.java | 9 -
 .../src/java/dev/typr/foundations/DbType.java | 40 -
 .../java/dev/typr/foundations/DbTypename.java | 33 -
 .../java/dev/typr/foundations/DbWrite.java | 13 -
 .../java/dev/typr/foundations/DuckDbJson.java | 433 --
 .../typr/foundations/DuckDbMapSupport.java | 80 -
 .../java/dev/typr/foundations/DuckDbRead.java | 417 --
 .../typr/foundations/DuckDbStringifier.java | 157 -
 .../dev/typr/foundations/DuckDbStruct.java | 236 -
 .../java/dev/typr/foundations/DuckDbText.java | 201 -
 .../java/dev/typr/foundations/DuckDbType.java | 503 --
 .../dev/typr/foundations/DuckDbTypename.java | 379 --
 .../dev/typr/foundations/DuckDbTypes.java | 544 --
 .../dev/typr/foundations/DuckDbUnion.java | 296 -
 .../dev/typr/foundations/DuckDbWrite.java | 203 -
 .../src/java/dev/typr/foundations/Either.java | 97 -
 .../java/dev/typr/foundations/Fragment.java | 306 -
 .../java/dev/typr/foundations/Inserter.java | 62 -
 .../java/dev/typr/foundations/MariaJson.java | 320 -
 .../java/dev/typr/foundations/MariaRead.java | 314 -
 .../java/dev/typr/foundations/MariaText.java | 149 -
 .../java/dev/typr/foundations/MariaType.java | 96 -
 .../dev/typr/foundations/MariaTypename.java | 134 -
 .../java/dev/typr/foundations/MariaTypes.java | 565 --
 .../java/dev/typr/foundations/MariaWrite.java | 70 -
 .../dev/typr/foundations/NonEmptyBlob.java | 69 -
 .../dev/typr/foundations/NonEmptyString.java | 62 -
 .../java/dev/typr/foundations/Operation.java | 137 -
 .../java/dev/typr/foundations/OracleJson.java | 442 --
 .../typr/foundations/OracleNestedTable.java | 123 -
 .../dev/typr/foundations/OracleObject.java | 186 -
 .../java/dev/typr/foundations/OracleRead.java | 444 --
 .../java/dev/typr/foundations/OracleType.java | 84 -
 .../dev/typr/foundations/OracleTypename.java | 154 -
 .../dev/typr/foundations/OracleTypes.java | 557 --
 .../dev/typr/foundations/OracleVArray.java | 131 -
 .../dev/typr/foundations/OracleWrite.java | 330 -
 .../dev/typr/foundations/PaddedString.java | 87 -
 .../dev/typr/foundations/PgCompositeText.java | 712 --
 .../src/java/dev/typr/foundations/PgJson.java | 925 ---
 .../src/java/dev/typr/foundations/PgRead.java | 456 --
 .../dev/typr/foundations/PgRecordParser.java | 597 --
 .../java/dev/typr/foundations/PgStruct.java | 411 --
 .../src/java/dev/typr/foundations/PgText.java | 291 -
 .../src/java/dev/typr/foundations/PgType.java | 155 -
 .../java/dev/typr/foundations/PgTypename.java | 185 -
 .../java/dev/typr/foundations/PgTypes.java | 619 --
 .../java/dev/typr/foundations/PgWrite.java | 101 -
 .../dev/typr/foundations/ResultSetParser.java | 79 -
 .../java/dev/typr/foundations/RowParser.java | 263 -
 .../SingleValueResultSetWrapper.java | 1014 ---
 .../dev/typr/foundations/SqlBiConsumer.java | 7 -
 .../dev/typr/foundations/SqlBiFunction.java | 8 -
 .../dev/typr/foundations/SqlConsumer.java | 7 -
 .../dev/typr/foundations/SqlFunction.java | 8 -
 .../dev/typr/foundations/SqlServerJson.java | 150 -
 .../dev/typr/foundations/SqlServerRead.java | 239 -
 .../dev/typr/foundations/SqlServerText.java | 106 -
 .../dev/typr/foundations/SqlServerType.java | 96 -
 .../typr/foundations/SqlServerTypename.java | 137 -
 .../dev/typr/foundations/SqlServerTypes.java | 434 -
 .../dev/typr/foundations/SqlServerWrite.java | 159 -
 .../dev/typr/foundations/SqlSupplier.java | 8 -
 .../foundations/StructResultSetWrapper.java | 976 ---
 .../java/dev/typr/foundations/Transactor.java | 172 -
 .../connect/ConnectionSettings.java | 122 -
 .../foundations/connect/ConnectionSource.java | 72 -
 .../foundations/connect/DatabaseConfig.java | 97 -
 .../foundations/connect/DatabaseKind.java | 84 -
 .../foundations/connect/SimpleDataSource.java | 103 -
 .../connect/TransactionIsolation.java | 68 -
 .../foundations/connect/db2/Db2Config.java | 1414 ----
 .../connect/duckdb/DuckDbConfig.java | 378 --
 .../connect/mariadb/MariaDbConfig.java | 1577 -----
 .../connect/mariadb/MariaSslMode.java | 23 -
 .../connect/oracle/OracleConfig.java | 1778 -----
 .../connect/postgres/PgAutosave.java | 21 -
 .../connect/postgres/PgChannelBinding.java | 21 -
 .../postgres/PgEscapeSyntaxCallMode.java | 21 -
 .../connect/postgres/PgGssEncMode.java | 21 -
 .../connect/postgres/PgGssLib.java | 21 -
 .../connect/postgres/PgQueryMode.java | 23 -
 .../connect/postgres/PgReadOnlyMode.java | 21 -
 .../connect/postgres/PgReplication.java | 21 -
 .../connect/postgres/PgSslMode.java | 29 -
 .../connect/postgres/PgSslNegotiation.java | 19 -
 .../connect/postgres/PgTargetServerType.java | 29 -
 .../connect/postgres/PostgresConfig.java | 1286 ----
 .../sqlserver/SqlServerApplicationIntent.java | 19 -
 .../sqlserver/SqlServerAuthentication.java | 33 -
 .../SqlServerAuthenticationScheme.java | 23 -
 .../SqlServerColumnEncryptionSetting.java | 19 -
 .../connect/sqlserver/SqlServerConfig.java | 1214 ----
 .../connect/sqlserver/SqlServerEncrypt.java | 23 -
 .../sqlserver/SqlServerResponseBuffering.java | 19 -
 .../sqlserver/SqlServerSelectMethod.java | 19 -
 .../dev/typr/foundations/data/AclItem.java | 3 -
 .../dev/typr/foundations/data/AnyArray.java | 4 -
 .../java/dev/typr/foundations/data/Arr.java | 168 -
 .../java/dev/typr/foundations/data/Cidr.java | 7 -
 .../typr/foundations/data/HierarchyId.java | 200 -
 .../java/dev/typr/foundations/data/Inet.java | 7 -
 .../dev/typr/foundations/data/Int2Vector.java | 50 -
 .../java/dev/typr/foundations/data/Json.java | 3 -
 .../dev/typr/foundations/data/JsonParser.java | 214 -
 .../dev/typr/foundations/data/JsonValue.java | 154 -
 .../java/dev/typr/foundations/data/Jsonb.java | 3 -
 .../dev/typr/foundations/data/MacAddr.java | 7 -
 .../dev/typr/foundations/data/MacAddr8.java | 7 -
 .../java/dev/typr/foundations/data/Money.java | 7 -
 .../java/dev/typr/foundations/data/Oid.java | 23 -
 .../dev/typr/foundations/data/OidVector.java | 50 -
 .../foundations/data/OracleIntervalDS.java | 192 -
 .../foundations/data/OracleIntervalYM.java | 117 -
 .../dev/typr/foundations/data/PgName.java | 5 -
 .../dev/typr/foundations/data/PgNodeTree.java | 19 -
 .../java/dev/typr/foundations/data/Range.java | 202 -
 .../dev/typr/foundations/data/RangeBound.java | 20 -
 .../typr/foundations/data/RangeFinite.java | 22 -
 .../typr/foundations/data/RangeParser.java | 255 -
 .../dev/typr/foundations/data/Record.java | 3 -
 .../dev/typr/foundations/data/Regclass.java | 4 -
 .../dev/typr/foundations/data/Regconfig.java | 4 -
 .../typr/foundations/data/Regdictionary.java | 4 -
 .../typr/foundations/data/Regnamespace.java | 4 -
 .../dev/typr/foundations/data/Regoper.java | 4 -
 .../typr/foundations/data/Regoperator.java | 4 -
 .../dev/typr/foundations/data/Regproc.java | 4 -
 .../typr/foundations/data/Regprocedure.java | 4 -
 .../dev/typr/foundations/data/Regrole.java | 4 -
 .../dev/typr/foundations/data/Regtype.java | 4 -
 .../java/dev/typr/foundations/data/Uint1.java | 18 -
 .../java/dev/typr/foundations/data/Uint2.java | 18 -
 .../java/dev/typr/foundations/data/Uint4.java | 18 -
 .../java/dev/typr/foundations/data/Uint8.java | 24 -
 .../dev/typr/foundations/data/Unknown.java | 4 -
 .../dev/typr/foundations/data/Vector.java | 52 -
 .../java/dev/typr/foundations/data/Xid.java | 4 -
 .../java/dev/typr/foundations/data/Xml.java | 3 -
 .../typr/foundations/data/maria/Inet4.java | 52 -
 .../typr/foundations/data/maria/Inet6.java | 48 -
 .../typr/foundations/data/maria/MariaSet.java | 96 -
 .../foundations/data/precise/BinaryN.java | 41 -
 .../foundations/data/precise/DecimalN.java | 51 -
 .../foundations/data/precise/InstantN.java | 39 -
 .../data/precise/LocalDateTimeN.java | 39 -
 .../foundations/data/precise/LocalTimeN.java | 39 -
 .../data/precise/NonEmptyPaddedStringN.java | 50 -
 .../data/precise/NonEmptyStringN.java | 42 -
 .../data/precise/OffsetDateTimeN.java | 39 -
 .../data/precise/PaddedStringN.java | 46 -
 .../foundations/data/precise/StringN.java | 41 -
 .../dev/typr/foundations/dsl/Bijection.java | 147 -
 .../typr/foundations/internal/ByteArrays.java | 25 -
 .../foundations/internal/RandomHelper.java | 29 -
 .../internal/TypoPGObjectHelper.java | 30 -
 .../typr/foundations/internal/arrayMap.java | 15 -
 .../internal/stringInterpolator.java | 17 -
 .../foundations/internal/stripMargin.java | 20 -
 .../dev/typr/foundations/streamingInsert.java | 45 -
 gradle.properties | 2 -
 gradle/wrapper/gradle-wrapper.jar | Bin 45457 -> 0 bytes
 gradle/wrapper/gradle-wrapper.properties | 7 -
 gradlew | 248 -
 gradlew.bat | 93 -
 settings.gradle.kts | 51 -
 site-in/type-safety/precise-types.md | 220 +
 site-in/unified-types/best-practices.md | 407 ++
 site-in/unified-types/cli.md | 401 ++
 site-in/unified-types/overview.md | 285 +
 site-in/unified-types/yaml-config.md | 516 ++
 site/.gitignore | 1 +
 site/blog/2023-11-24-hello-zio.md | 6 +-
 site/docs-api/index.md | 29 -
 site/docs-avro/reference/options.md | 150 -
 site/docs-db/comparison.md | 9 -
 .../testing-with-random-values.md | 206 -
 .../other-features/testing-with-stubs.md | 121 -
 site/docs-db/patterns/dynamic-queries.md | 31 -
 site/docs-db/patterns/multi-repo.md | 289 -
 site/docs-db/readme.md | 62 -
 site/docs-db/setup.md | 104 -
 site/docs-db/type-safety/arrays.md | 15 -
 site/docs-db/type-safety/date-time.md | 32 -
 site/docs-db/type-safety/defaulted-types.md | 92 -
 site/docs-db/type-safety/domains.md | 22 -
 site/docs-db/type-safety/id-types.md | 53 -
 site/docs-db/type-safety/open-string-enums.md | 59 -
 site/docs-db/type-safety/string-enums.md | 41 -
 site/docs-db/type-safety/type-flow.md | 56 -
 site/docs-db/type-safety/typo-types.md | 102 -
 site/docs-jdbc/duckdb.md | 272 -
 site/docs-jdbc/mariadb.md | 243 -
 site/docs-jdbc/oracle.md | 232 -
 site/docs-jdbc/postgresql.md | 279 -
 site/docs-jdbc/readme.md | 508 --
 site/docs-jdbc/sqlserver.md | 258 -
 site/docs-typr/best-practices.md | 407 ++
 .../boundaries/apis}/client-generation.md | 2 +-
 site/docs-typr/boundaries/apis/index.md | 72 +
 .../boundaries/apis}/response-types.md | 6 +-
 .../boundaries/apis}/server-frameworks.md | 2 +-
 .../boundaries/apis}/type-safe-ids.md | 4 +-
 .../boundaries/apis}/usage.md | 2 +-
 .../customization/customize-naming.md | 1 -
 .../customization/customize-nullability.md | 0
 .../customize-selected-relations.md | 8 +-
 .../customization/customize-sql-files.md | 0
 .../customization/customize-types.md | 10 +-
 .../databases}/customization/overview.md | 15 +-
 .../databases}/customization/selector.md | 4 +-
 .../boundaries/databases}/limitations.md | 8 +-
 .../other-features/clickable-links.md | 2 +-
 .../databases}/other-features/constraints.md | 6 +-
 .../databases}/other-features/dsl-in-depth.md | 21 +-
 .../other-features/faster-compilation.md | 2 +-
 .../databases}/other-features/flexible.md | 6 +-
 .../generate-into-multiple-projects.md | 12 +-
 .../databases}/other-features/json.md | 10 +-
 .../other-features/scala-js-ready.md | 4 +-
 .../other-features/streaming-inserts.md | 2 +-
 .../testing-with-random-values.md | 59 +
 .../other-features/testing-with-stubs.md | 38 +
 .../databases/patterns/dynamic-queries.md | 39 +
 .../databases/patterns/multi-repo.md | 27 +
 site/docs-typr/boundaries/databases/readme.md | 101 +
 site/docs-typr/boundaries/databases/setup.md | 65 +
 .../databases/type-safety/arrays.md | 57 +
 .../databases/type-safety/collection-types.md | 103 +
 .../databases/type-safety/date-time.md | 113 +
 .../databases/type-safety/defaulted-types.md | 76 +
 .../databases/type-safety/domains.md | 110 +
 .../boundaries/databases/type-safety/enums.md | 62 +
 .../databases/type-safety/id-types.md | 52 +
 .../boundaries/databases/type-safety/maps.md | 94 +
 .../databases/type-safety/open-enums.md | 53 +
 .../databases/type-safety/precise-types.md | 220 +
 .../databases/type-safety/struct-types.md | 82 +
 .../databases/type-safety/type-flow.md | 45 +
 .../type-safety/user-selected-types.md | 7 +-
 .../boundaries/databases}/what-is/dsl.md | 11 +-
 .../databases}/what-is/relations.md | 34 +-
 .../databases}/what-is/sql-is-king.md | 10 +-
 .../boundaries/events}/kafka/consumers.md | 0
 .../boundaries/events}/kafka/headers.md | 0
 .../boundaries/events}/kafka/multi-event.md | 0
 .../boundaries/events}/kafka/producers.md | 0
 .../boundaries/events}/readme.md | 12 +-
 .../events}/reference/limitations.md | 0
 .../boundaries/events/reference/options.md | 340 +
 .../events}/reference/type-mappings.md | 0
 .../boundaries/events}/rpc/protocols.md | 0
 .../boundaries/events}/rpc/quarkus.md | 0
 .../boundaries/events}/rpc/result-adt.md | 0
 .../boundaries/events}/rpc/spring.md | 0
 .../boundaries/events}/setup.md | 167 +-
 .../events}/type-safety/precise-types.md | 0
 .../boundaries/events}/type-safety/unions.md | 0
 .../events}/type-safety/wrapper-types.md | 0
 .../events}/what-is/effect-types.md | 0
 .../boundaries/events}/what-is/schemas.md | 0
 .../events}/what-is/wire-formats.md | 0
 site/docs-typr/cli.md | 110 +
 site/docs-typr/comparison.md | 170 +
 site/docs-typr/configuration.md | 766 +++
 site/docs-typr/getting-started.md | 105 +
 site/docs-typr/index.md | 31 +
 site/docs-typr/matchers.md | 401 ++
 site/docs-typr/unified-types/configuration.md | 352 +
 site/docs-typr/unified-types/domain-types.md | 413 ++
 site/docs-typr/unified-types/field-types.md | 336 +
 site/docs-typr/unified-types/index.md | 123 +
 site/docusaurus.config.js | 119 +-
 .../java/src/java/showcase/ShowcaseDemo.java | 70 +
 site/showcase-generated | 1 -
 site/sidebars-api.js | 23 -
 site/sidebars-avro.js | 56 -
 site/sidebars-db.js | 93 -
 site/sidebars-jdbc.js | 15 -
 site/sidebars-typr.js | 200 +
 .../BoundaryDiagram/IslandConnected.js | 80 +
 .../BoundaryDiagram/IslandIsolated.js | 63 +
 site/src/components/BoundaryDiagram/index.js | 36 +
 .../BoundaryDiagram/styles.module.css | 91 +
 site/src/components/CodeSample/index.js | 202 +
 .../components/CodeSample/styles.module.css | 148 +
 site/src/components/FeatureShowcase/index.js | 40 +-
 site/src/components/ShowcaseSnippet/index.js | 257 +
 .../ShowcaseSnippet/styles.module.css | 136 +
 site/src/components/StackSelector/index.js | 81 +
 .../StackSelector/styles.module.css | 130 +
 site/src/components/UsageExample/index.js | 102 +
 site/src/components/WhyTypo/index.js | 4 +-
 site/src/context/StackContext.js | 86 +
 site/src/css/custom.css | 18 +-
 site/src/data/codeSamples.js | 636 ++
 site/src/data/showcaseFiles.js | 197 +
 site/src/pages/index.js | 954 ++-
 site/src/pages/index.module.css | 1870 ++++--
 site/src/theme/Root.js | 16 +
 .../patterns/MultiRepoExample.java | 54 +
 .../patterns/MultiRepoExample.kt | 45 +
 .../patterns/MultiRepoExample.scala | 41 +
 .../postgres/java/DomainInsertImpl.java | 31 +
 .../postgres/java/TestInsertExample.java | 48 +
 .../postgres/kotlin/DomainInsertImpl.kt | 24 +
 .../postgres/kotlin/TestInsertExample.kt | 56 +
 .../postgres/scala/DomainInsertImpl.scala | 20 +
 .../postgres/scala/TestInsertExample.scala | 51 +
 snapshot-tests/java-sql/ProductTest/q2.sql | 2 +-
 testers/avro/kotlin-json/build.gradle.kts | 36 -
 testers/avro/kotlin-json/gradle.properties | 1 -
 .../kotlin-quarkus-mutiny/build.gradle.kts | 47 -
 .../kotlin-quarkus-mutiny/gradle.properties | 1 -
 testers/avro/kotlin/build.gradle.kts | 40 -
 testers/avro/kotlin/gradle.properties | 1 -
 .../com/example/events/AddressListener.scala | 29 +
 .../com/example/events/AddressPublisher.scala | 23 +
 .../events/CustomerOrderListener.scala | 29 +
 .../events/CustomerOrderPublisher.scala | 23 +
 .../example/events/DynamicValueListener.scala | 29 +
 .../events/DynamicValuePublisher.scala | 23 +
 .../com/example/events/InvoiceListener.scala | 29 +
 .../com/example/events/InvoicePublisher.scala | 23 +
 .../events/LinkedListNodeListener.scala | 29 +
 .../events/LinkedListNodePublisher.scala | 23 +
 .../com/example/events/OrderEvents.scala | 4 +
 .../example/events/OrderEventsListener.scala | 57 +
 .../example/events/OrderEventsPublisher.scala | 55 +
 .../com/example/events/OrderStatus.scala | 1 +
 .../com/example/events/PaymentCallback.scala | 47 +
 .../com/example/events/PaymentCharged.scala | 48 +
 .../com/example/events/SchemaValidator.scala | 2 +-
 .../example/events/StringOrIntOrBoolean.scala | 2 +
 .../com/example/events/StringOrLong.scala | 2 +
 .../com/example/events/Topics.scala | 6 +
 .../com/example/events/TreeNodeListener.scala | 29 +
 .../example/events/TreeNodePublisher.scala | 23 +
 .../example/events/common/MoneyListener.scala | 29 +
 .../events/common/MoneyPublisher.scala | 23 +
 .../events/consumer/AddressConsumer.scala | 29 -
 .../events/consumer/AddressHandler.scala | 15 -
 .../consumer/CustomerOrderConsumer.scala | 29 -
 .../consumer/CustomerOrderHandler.scala | 15 -
 .../consumer/DynamicValueConsumer.scala | 29 -
 .../events/consumer/DynamicValueHandler.scala | 15 -
 .../events/consumer/InvoiceConsumer.scala | 29 -
 .../events/consumer/InvoiceHandler.scala | 15 -
 .../consumer/LinkedListNodeConsumer.scala | 29 -
 .../consumer/LinkedListNodeHandler.scala | 15 -
 .../events/consumer/MoneyConsumer.scala | 29 -
 .../events/consumer/MoneyHandler.scala | 15 -
 .../events/consumer/OrderEventsConsumer.scala | 36 -
 .../events/consumer/OrderEventsHandler.scala | 42 -
 .../events/consumer/TreeNodeConsumer.scala | 29 -
 .../events/consumer/TreeNodeHandler.scala | 15 -
 .../events/precisetypes/Decimal10_2.scala | 2 +-
 .../events/precisetypes/Decimal18_4.scala | 2 +-
 .../events/producer/AddressProducer.scala | 51 -
 .../producer/CustomerOrderProducer.scala | 51 -
 .../producer/DynamicValueProducer.scala | 51 -
 .../events/producer/InvoiceProducer.scala | 51 -
 .../producer/LinkedListNodeProducer.scala | 51 -
 .../events/producer/MoneyProducer.scala | 51 -
 .../events/producer/OrderEventsProducer.scala | 51 -
 .../events/producer/TreeNodeProducer.scala | 51 -
 .../events/serde/OrderEventsSerde.scala | 4 +
 .../events/serde/PaymentCallbackSerde.scala | 54 +
 .../events/serde/PaymentChargedSerde.scala | 54 +
 .../example/service/CreateUserRequest.scala | 22 +
 .../example/service/CreateUserResponse.scala | 22 +
 .../example/service/DeleteUserRequest.scala | 18 +
 .../example/service/DeleteUserResponse.scala | 22 +
 .../com/example/service/GetUserRequest.scala | 18 +
 .../com/example/service/GetUserResponse.scala | 22 +
 .../example/service/NotifyUserRequest.scala | 22 +
 .../example/service/UserServiceRequest.scala | 6 +
 .../events/AvroCatsIntegrationTest.scala | 42 +-
 .../schemas/order-events/PaymentCallback.avsc | 33 +
 .../schemas/order-events/PaymentCharged.avsc | 33 +
 .../api/combined/api/api/CustomersApi.java | 19 +-
 .../combined/api/api/CustomersApiServer.java | 31 +-
 .../api/combined/api/api/EmployeesApi.java | 14 +-
 .../combined/api/api/EmployeesApiServer.java | 18 +-
 .../api/combined/api/api/ProductsApi.java | 16 +-
 .../combined/api/api/ProductsApiServer.java | 18 +-
 .../api/combined/api/model/Customer.java | 55 +-
 .../combined/api/model/CustomerCreate.java | 35 +-
 .../combined/api/model/CustomerUpdate.java | 42 +-
 .../api/combined/api/model/Employee.java | 170 +-
 .../api/combined/api/model/Product.java | 42 +-
 .../com/example/events/Address.java | 55 +
 .../com/example/events/CustomerId.java | 25 +
 .../com/example/events/CustomerOrder.java | 60 +
 .../com/example/events/DynamicValue.java | 76 +
 .../avro_events/com/example/events/Email.java | 25 +
 .../com/example/events/Invoice.java | 58 +
 .../com/example/events/LinkedListNode.java | 44 +
 .../com/example/events/OrderCancelled.java | 80 +
 .../com/example/events/OrderEvents.java | 26 +
 .../com/example/events/OrderId.java | 25 +
 .../com/example/events/OrderPlaced.java | 85 +
 .../com/example/events/OrderStatus.java | 35 +
 .../com/example/events/OrderUpdated.java | 72 +
 .../com/example/events/PaymentCallback.java | 65 +
 .../com/example/events/PaymentCharged.java | 66 +
 .../com/example/events/TreeNode.java | 56 +
 .../com/example/events/common/Money.java | 44 +
 .../com/example/service/Result.java | 20 +
 .../avro_events/com/example/service/User.java | 53 +
 .../example/service/UserNotFoundError.java | 17 +
 .../com/example/service/UserService.java | 25 +
 .../example/service/UserServiceHandler.java | 8 +
 .../com/example/service/ValidationError.java | 17 +
 .../combined/avro_events/SchemaValidator.java | 97 +
 .../avro_events/StringOrIntOrBoolean.java | 167 +
 .../combined/avro_events/StringOrLong.java | 94 +
 .../combined/avro_events/Topics.java | 58 +
 .../combined/avro_events/TypedTopic.java | 22 +
 .../avro_events/consumer/AddressConsumer.java | 49 +
 .../avro_events/consumer/AddressHandler.java | 16 +
 .../consumer/CustomerOrderConsumer.java | 49 +
 .../consumer/CustomerOrderHandler.java | 16 +
 .../consumer/DynamicValueConsumer.java | 49 +
 .../consumer/DynamicValueHandler.java | 16 +
 .../avro_events/consumer/InvoiceConsumer.java | 49 +
 .../avro_events/consumer/InvoiceHandler.java | 16 +
 .../consumer/LinkedListNodeConsumer.java | 49 +
 .../consumer/LinkedListNodeHandler.java | 16 +
.../avro_events/consumer/MoneyConsumer.java | 49 + .../avro_events/consumer/MoneyHandler.java | 16 + .../consumer/OrderEventsConsumer.java | 61 + .../consumer/OrderEventsHandler.java | 59 + .../consumer/TreeNodeConsumer.java | 49 + .../avro_events/consumer/TreeNodeHandler.java | 16 + .../avro_events/header/StandardHeaders.java | 44 + .../avro_events/precisetypes/Decimal10_2.java | 83 + .../avro_events/precisetypes/Decimal18_4.java | 83 + .../avro_events/producer/AddressProducer.java | 70 + .../producer/CustomerOrderProducer.java | 70 + .../producer/DynamicValueProducer.java | 70 + .../avro_events/producer/InvoiceProducer.java | 70 + .../producer/LinkedListNodeProducer.java | 70 + .../avro_events/producer/MoneyProducer.java | 70 + .../producer/OrderEventsProducer.java | 70 + .../producer/TreeNodeProducer.java | 70 + .../avro_events/serde/AddressSerde.java | 65 + .../avro_events/serde/CustomerOrderSerde.java | 65 + .../avro_events/serde/DynamicValueSerde.java | 65 + .../avro_events/serde/InvoiceSerde.java | 65 + .../serde/LinkedListNodeSerde.java | 65 + .../avro_events/serde/MoneySerde.java | 65 + .../serde/OrderCancelledSerde.java | 65 + .../avro_events/serde/OrderEventsSerde.java | 71 + .../avro_events/serde/OrderPlacedSerde.java | 65 + .../avro_events/serde/OrderUpdatedSerde.java | 65 + .../serde/PaymentCallbackSerde.java | 65 + .../serde/PaymentChargedSerde.java | 65 + .../avro_events/serde/TreeNodeSerde.java | 65 + .../mariadb/AllBrandsCategoriesCSet.java | 64 + .../AllBrandsCategoriesCSetMember.java | 40 + .../mariadb/BestsellerClearanceFSet.java | 64 + .../BestsellerClearanceFSetMember.java | 40 + .../mariadb/DefaultedDeserializer.java | 36 +- .../combined/mariadb/DefaultedSerializer.java | 14 +- .../combined/mariadb/EmailMailPushSmsSet.java | 64 + .../mariadb/EmailMailPushSmsSetMember.java | 39 + .../mariadb/combined/mariadb/XYZSet.java | 64 + .../combined/mariadb/XYZSetMember.java | 38 + .../mariadb/audit_log/AuditLogFields.java | 143 + 
.../mariadb/audit_log/AuditLogId.java | 30 + .../mariadb/audit_log/AuditLogRepo.java | 76 + .../mariadb/audit_log/AuditLogRepoImpl.java | 199 + .../mariadb/audit_log/AuditLogRepoMock.java | 172 + .../mariadb/audit_log/AuditLogRow.java | 188 + .../mariadb/audit_log/AuditLogRowUnsaved.java | 128 + .../combined/mariadb/brands/BrandsFields.java | 106 +- .../combined/mariadb/brands/BrandsId.java | 15 +- .../combined/mariadb/brands/BrandsRepo.java | 74 +- .../mariadb/brands/BrandsRepoImpl.java | 349 +- .../mariadb/brands/BrandsRepoMock.java | 152 +- .../combined/mariadb/brands/BrandsRow.java | 134 +- .../mariadb/brands/BrandsRowUnsaved.java | 99 +- .../combined/mariadb/bridge/Customer.java | 27 + .../mariadb/categories/CategoriesFields.java | 133 +- .../mariadb/categories/CategoriesId.java | 16 +- .../mariadb/categories/CategoriesRepo.java | 74 +- .../categories/CategoriesRepoImpl.java | 402 +- .../categories/CategoriesRepoMock.java | 157 +- .../mariadb/categories/CategoriesRow.java | 202 +- .../categories/CategoriesRowUnsaved.java | 156 +- .../CustomerAddressesFields.java | 187 + .../CustomerAddressesId.java | 30 + .../CustomerAddressesRepo.java | 76 + .../CustomerAddressesRepoImpl.java | 209 + .../CustomerAddressesRepoMock.java | 172 + .../CustomerAddressesRow.java | 241 + .../CustomerAddressesRowUnsaved.java | 171 + .../customer_status/CustomerStatusFields.java | 63 +- .../customer_status/CustomerStatusId.java | 16 +- .../customer_status/CustomerStatusRepo.java | 64 +- .../CustomerStatusRepoImpl.java | 242 +- .../CustomerStatusRepoMock.java | 152 +- .../customer_status/CustomerStatusRow.java | 55 +- .../CustomerStatusRowUnsaved.java | 46 +- .../mariadb/customers/CustomersFields.java | 216 +- .../mariadb/customers/CustomersId.java | 16 +- .../mariadb/customers/CustomersRepo.java | 75 +- .../mariadb/customers/CustomersRepoImpl.java | 526 +- .../mariadb/customers/CustomersRepoMock.java | 155 +- .../mariadb/customers/CustomersRow.java | 505 +- 
.../customers/CustomersRowUnsaved.java | 417 +- .../mariadb/customtypes/Defaulted.java | 46 +- .../mariadb/inventory/InventoryFields.java | 165 + .../mariadb/inventory/InventoryId.java | 30 + .../mariadb/inventory/InventoryRepo.java | 84 + .../mariadb/inventory/InventoryRepoImpl.java | 226 + .../mariadb/inventory/InventoryRepoMock.java | 183 + .../mariadb/inventory/InventoryRow.java | 218 + .../inventory/InventoryRowUnsaved.java | 155 + .../mariadb/mariatest/MariatestFields.java | 431 ++ .../mariadb/mariatest/MariatestId.java | 29 + .../mariadb/mariatest/MariatestRepo.java | 76 + .../mariadb/mariatest/MariatestRepoImpl.java | 236 + .../mariadb/mariatest/MariatestRepoMock.java | 172 + .../mariadb/mariatest/MariatestRow.java | 545 ++ .../mariatest/MariatestRowUnsaved.java | 412 ++ .../MariatestIdentityFields.java | 67 + .../MariatestIdentityId.java | 29 + .../MariatestIdentityRepo.java | 76 + .../MariatestIdentityRepoImpl.java | 141 + .../MariatestIdentityRepoMock.java | 172 + .../MariatestIdentityRow.java | 51 + .../MariatestIdentityRowUnsaved.java | 24 + .../MariatestSpatialFields.java | 138 + .../mariatest_spatial/MariatestSpatialId.java | 29 + .../MariatestSpatialRepo.java | 76 + .../MariatestSpatialRepoImpl.java | 155 + .../MariatestSpatialRepoMock.java | 172 + .../MariatestSpatialRow.java | 144 + .../MariatestSpatialRowUnsaved.java | 80 + .../MariatestSpatialNullFields.java | 138 + .../MariatestSpatialNullId.java | 29 + .../MariatestSpatialNullRepo.java | 76 + .../MariatestSpatialNullRepoImpl.java | 213 + .../MariatestSpatialNullRepoMock.java | 172 + .../MariatestSpatialNullRow.java | 187 + .../MariatestSpatialNullRowUnsaved.java | 129 + .../MariatestUniqueFields.java | 86 + .../mariatest_unique/MariatestUniqueId.java | 29 + .../mariatest_unique/MariatestUniqueRepo.java | 88 + .../MariatestUniqueRepoImpl.java | 163 + .../MariatestUniqueRepoMock.java | 190 + .../mariatest_unique/MariatestUniqueRow.java | 76 + .../MariatestUniqueRowUnsaved.java | 37 + 
.../mariatestnull/MariatestnullFields.java | 430 ++ .../mariatestnull/MariatestnullRepo.java | 33 + .../mariatestnull/MariatestnullRepoImpl.java | 432 ++ .../mariatestnull/MariatestnullRow.java | 737 +++ .../MariatestnullRowUnsaved.java | 529 ++ .../order_history/OrderHistoryFields.java | 132 + .../mariadb/order_history/OrderHistoryId.java | 30 + .../order_history/OrderHistoryRepo.java | 76 + .../order_history/OrderHistoryRepoImpl.java | 189 + .../order_history/OrderHistoryRepoMock.java | 172 + .../order_history/OrderHistoryRow.java | 163 + .../order_history/OrderHistoryRowUnsaved.java | 113 + .../mariadb/order_items/OrderItemsFields.java | 191 + .../mariadb/order_items/OrderItemsId.java | 30 + .../mariadb/order_items/OrderItemsRepo.java | 76 + .../order_items/OrderItemsRepoImpl.java | 201 + .../order_items/OrderItemsRepoMock.java | 172 + .../mariadb/order_items/OrderItemsRow.java | 231 + .../order_items/OrderItemsRowUnsaved.java | 166 + .../combined/mariadb/orders/OrdersFields.java | 277 + .../combined/mariadb/orders/OrdersId.java | 30 + .../combined/mariadb/orders/OrdersRepo.java | 81 + .../mariadb/orders/OrdersRepoImpl.java | 311 + .../mariadb/orders/OrdersRepoMock.java | 180 + .../combined/mariadb/orders/OrdersRow.java | 400 ++ .../mariadb/orders/OrdersRowUnsaved.java | 278 + .../payment_methods/PaymentMethodsFields.java | 115 + .../payment_methods/PaymentMethodsId.java | 30 + .../payment_methods/PaymentMethodsRepo.java | 81 + .../PaymentMethodsRepoImpl.java | 181 + .../PaymentMethodsRepoMock.java | 180 + .../payment_methods/PaymentMethodsRow.java | 136 + .../PaymentMethodsRowUnsaved.java | 91 + .../mariadb/payments/PaymentsFields.java | 177 + .../combined/mariadb/payments/PaymentsId.java | 30 + .../mariadb/payments/PaymentsRepo.java | 76 + .../mariadb/payments/PaymentsRepoImpl.java | 219 + .../mariadb/payments/PaymentsRepoMock.java | 172 + .../mariadb/payments/PaymentsRow.java | 233 + .../mariadb/payments/PaymentsRowUnsaved.java | 167 + 
.../mariadb/precisetypes/Binary16.java | 80 + .../mariadb/precisetypes/Binary32.java | 80 + .../mariadb/precisetypes/Binary64.java | 80 + .../mariadb/precisetypes/Decimal10_2.java | 101 + .../mariadb/precisetypes/Decimal12_4.java | 101 + .../mariadb/precisetypes/Decimal18_4.java | 101 + .../mariadb/precisetypes/Decimal5_2.java | 101 + .../mariadb/precisetypes/Decimal8_2.java | 101 + .../mariadb/precisetypes/LocalDateTime3.java | 76 + .../mariadb/precisetypes/LocalDateTime6.java | 76 + .../mariadb/precisetypes/LocalTime3.java | 76 + .../mariadb/precisetypes/LocalTime6.java | 76 + .../mariadb/precisetypes/PaddedString10.java | 84 + .../mariadb/precisetypes/String10.java | 83 + .../mariadb/precisetypes/String100.java | 83 + .../mariadb/precisetypes/String20.java | 83 + .../mariadb/precisetypes/String255.java | 83 + .../mariadb/precisetypes/String50.java | 83 + .../precision_types/PrecisionTypesFields.java | 285 + .../precision_types/PrecisionTypesId.java | 29 + .../precision_types/PrecisionTypesRepo.java | 76 + .../PrecisionTypesRepoImpl.java | 224 + .../PrecisionTypesRepoMock.java | 172 + .../precision_types/PrecisionTypesRow.java | 353 + .../PrecisionTypesRowUnsaved.java | 261 + .../PrecisionTypesNullFields.java | 285 + .../PrecisionTypesNullId.java | 29 + .../PrecisionTypesNullRepo.java | 76 + .../PrecisionTypesNullRepoImpl.java | 366 + .../PrecisionTypesNullRepoMock.java | 172 + .../PrecisionTypesNullRow.java | 454 ++ .../PrecisionTypesNullRowUnsaved.java | 321 + .../mariadb/price_tiers/PriceTiersFields.java | 96 + .../mariadb/price_tiers/PriceTiersId.java | 30 + .../mariadb/price_tiers/PriceTiersRepo.java | 76 + .../price_tiers/PriceTiersRepoImpl.java | 154 + .../price_tiers/PriceTiersRepoMock.java | 172 + .../mariadb/price_tiers/PriceTiersRow.java | 99 + .../price_tiers/PriceTiersRowUnsaved.java | 66 + .../ProductCategoriesFields.java | 110 + .../ProductCategoriesId.java | 39 + .../ProductCategoriesRepo.java | 76 + .../ProductCategoriesRepoImpl.java | 164 + 
.../ProductCategoriesRepoMock.java | 172 + .../ProductCategoriesRow.java | 111 + .../ProductCategoriesRowUnsaved.java | 81 + .../product_images/ProductImagesFields.java | 132 + .../product_images/ProductImagesId.java | 30 + .../product_images/ProductImagesRepo.java | 76 + .../product_images/ProductImagesRepoImpl.java | 190 + .../product_images/ProductImagesRepoMock.java | 172 + .../product_images/ProductImagesRow.java | 163 + .../ProductImagesRowUnsaved.java | 113 + .../product_prices/ProductPricesFields.java | 130 + .../product_prices/ProductPricesId.java | 30 + .../product_prices/ProductPricesRepo.java | 76 + .../product_prices/ProductPricesRepoImpl.java | 174 + .../product_prices/ProductPricesRepoMock.java | 172 + .../product_prices/ProductPricesRow.java | 144 + .../ProductPricesRowUnsaved.java | 99 + .../mariadb/products/ProductsFields.java | 247 +- .../combined/mariadb/products/ProductsId.java | 16 +- .../mariadb/products/ProductsRepo.java | 74 +- .../mariadb/products/ProductsRepoImpl.java | 652 +- .../mariadb/products/ProductsRepoMock.java | 155 +- .../mariadb/products/ProductsRow.java | 772 +-- .../mariadb/products/ProductsRowUnsaved.java | 594 +- .../mariadb/promotions/PromotionsFields.java | 201 + .../mariadb/promotions/PromotionsId.java | 30 + .../mariadb/promotions/PromotionsRepo.java | 81 + .../promotions/PromotionsRepoImpl.java | 242 + .../promotions/PromotionsRepoMock.java | 180 + .../mariadb/promotions/PromotionsRow.java | 279 + .../promotions/PromotionsRowUnsaved.java | 195 + .../mariadb/reviews/ReviewsFields.java | 240 + .../combined/mariadb/reviews/ReviewsId.java | 30 + .../combined/mariadb/reviews/ReviewsRepo.java | 76 + .../mariadb/reviews/ReviewsRepoImpl.java | 276 + .../mariadb/reviews/ReviewsRepoMock.java | 172 + .../combined/mariadb/reviews/ReviewsRow.java | 340 + .../mariadb/reviews/ReviewsRowUnsaved.java | 242 + .../mariadb/shipments/ShipmentsFields.java | 229 + .../mariadb/shipments/ShipmentsId.java | 30 + 
.../mariadb/shipments/ShipmentsRepo.java | 76 + .../mariadb/shipments/ShipmentsRepoImpl.java | 258 + .../mariadb/shipments/ShipmentsRepoMock.java | 172 + .../mariadb/shipments/ShipmentsRow.java | 316 + .../shipments/ShipmentsRowUnsaved.java | 225 + .../ShippingCarriersFields.java | 106 + .../shipping_carriers/ShippingCarriersId.java | 30 + .../ShippingCarriersRepo.java | 81 + .../ShippingCarriersRepoImpl.java | 179 + .../ShippingCarriersRepoMock.java | 180 + .../ShippingCarriersRow.java | 124 + .../ShippingCarriersRowUnsaved.java | 82 + .../combined/mariadb/userdefined/Email.java | 31 + .../mariadb/userdefined/FirstName.java | 31 + .../mariadb/userdefined/IsActive.java | 31 + .../mariadb/userdefined/IsApproved.java | 31 + .../mariadb/userdefined/IsDefault.java | 31 + .../mariadb/userdefined/IsPrimary.java | 31 + .../userdefined/IsVerifiedPurchase.java | 31 + .../mariadb/userdefined/LastName.java | 31 + .../VCustomerSummaryViewFields.java | 144 + .../VCustomerSummaryViewRepo.java | 16 + .../VCustomerSummaryViewRepoImpl.java | 25 + .../VCustomerSummaryViewRow.java | 196 + .../v_daily_sales/VDailySalesViewFields.java | 141 + .../v_daily_sales/VDailySalesViewRepo.java | 16 + .../VDailySalesViewRepoImpl.java | 25 + .../v_daily_sales/VDailySalesViewRow.java | 183 + .../VInventoryStatusViewFields.java | 178 + .../VInventoryStatusViewRepo.java | 16 + .../VInventoryStatusViewRepoImpl.java | 25 + .../VInventoryStatusViewRow.java | 264 + .../VOrderDetailsViewFields.java | 179 + .../VOrderDetailsViewRepo.java | 16 + .../VOrderDetailsViewRepoImpl.java | 25 + .../v_order_details/VOrderDetailsViewRow.java | 263 + .../VProductCatalogViewFields.java | 151 + .../VProductCatalogViewRepo.java | 16 + .../VProductCatalogViewRepoImpl.java | 25 + .../VProductCatalogViewRow.java | 208 + .../VWarehouseCoverageViewFields.java | 142 + .../VWarehouseCoverageViewRepo.java | 16 + .../VWarehouseCoverageViewRepoImpl.java | 25 + .../VWarehouseCoverageViewRow.java | 190 + 
.../mariadb/warehouses/WarehousesFields.java | 144 + .../mariadb/warehouses/WarehousesId.java | 30 + .../mariadb/warehouses/WarehousesRepo.java | 81 + .../warehouses/WarehousesRepoImpl.java | 202 + .../warehouses/WarehousesRepoMock.java | 180 + .../mariadb/warehouses/WarehousesRow.java | 184 + .../warehouses/WarehousesRowUnsaved.java | 126 + .../postgres/DefaultedDeserializer.java | 36 +- .../postgres/DefaultedSerializer.java | 14 +- .../combined/postgres/bridge/Customer.java | 27 + .../postgres/customtypes/Defaulted.java | 57 +- .../department/DepartmentFields.java | 87 + .../department/DepartmentId.java | 33 + .../department/DepartmentRepo.java | 96 + .../department/DepartmentRepoImpl.java | 192 + .../department/DepartmentRepoMock.java | 220 + .../department/DepartmentRow.java | 90 + .../department/DepartmentRowUnsaved.java | 76 + .../employee/EmployeeFields.java | 225 +- .../humanresources/employee/EmployeeRepo.java | 81 +- .../employee/EmployeeRepoImpl.java | 636 +- .../employee/EmployeeRepoMock.java | 178 +- .../humanresources/employee/EmployeeRow.java | 631 +- .../employee/EmployeeRowUnsaved.java | 622 +- .../EmployeedepartmenthistoryFields.java | 137 + .../EmployeedepartmenthistoryId.java | 61 + .../EmployeedepartmenthistoryRepo.java | 96 + .../EmployeedepartmenthistoryRepoImpl.java | 198 + .../EmployeedepartmenthistoryRepoMock.java | 220 + .../EmployeedepartmenthistoryRow.java | 143 + .../EmployeedepartmenthistoryRowUnsaved.java | 123 + .../humanresources/shift/ShiftFields.java | 97 + .../humanresources/shift/ShiftId.java | 33 + .../humanresources/shift/ShiftRepo.java | 96 + .../humanresources/shift/ShiftRepoImpl.java | 194 + .../humanresources/shift/ShiftRepoMock.java | 220 + .../humanresources/shift/ShiftRow.java | 103 + .../humanresources/shift/ShiftRowUnsaved.java | 88 + .../vemployee/VemployeeViewFields.java | 217 + .../vemployee/VemployeeViewRepo.java | 16 + .../vemployee/VemployeeViewRepoImpl.java | 25 + .../vemployee/VemployeeViewRow.java | 240 + 
.../information_schema/CardinalNumber.java | 31 +- .../information_schema/CharacterData.java | 31 +- .../information_schema/SqlIdentifier.java | 31 +- .../information_schema/TimeStamp.java | 31 +- .../postgres/information_schema/YesOrNo.java | 33 +- .../person/address/AddressFields.java | 141 + .../postgres/person/address/AddressId.java | 33 + .../postgres/person/address/AddressRepo.java | 96 + .../person/address/AddressRepoImpl.java | 209 + .../person/address/AddressRepoMock.java | 220 + .../postgres/person/address/AddressRow.java | 157 + .../person/address/AddressRowUnsaved.java | 134 + .../person/addresstype/AddresstypeFields.java | 88 + .../person/addresstype/AddresstypeId.java | 33 + .../person/addresstype/AddresstypeRepo.java | 96 + .../addresstype/AddresstypeRepoImpl.java | 199 + .../addresstype/AddresstypeRepoMock.java | 220 + .../person/addresstype/AddresstypeRow.java | 92 + .../addresstype/AddresstypeRowUnsaved.java | 77 + .../businessentity/BusinessentityFields.java | 60 +- .../businessentity/BusinessentityId.java | 24 +- .../businessentity/BusinessentityRepo.java | 81 +- .../BusinessentityRepoImpl.java | 371 +- .../BusinessentityRepoMock.java | 180 +- .../businessentity/BusinessentityRow.java | 69 +- .../BusinessentityRowUnsaved.java | 70 +- .../BusinessentityaddressFields.java | 127 + .../BusinessentityaddressId.java | 49 + .../BusinessentityaddressRepo.java | 96 + .../BusinessentityaddressRepoImpl.java | 200 + .../BusinessentityaddressRepoMock.java | 220 + .../BusinessentityaddressRow.java | 125 + .../BusinessentityaddressRowUnsaved.java | 104 + .../countryregion/CountryregionFields.java | 78 + .../person/countryregion/CountryregionId.java | 33 + .../countryregion/CountryregionRepo.java | 96 + .../countryregion/CountryregionRepoImpl.java | 183 + .../countryregion/CountryregionRepoMock.java | 220 + .../countryregion/CountryregionRow.java | 71 + .../CountryregionRowUnsaved.java | 60 + .../emailaddress/EmailaddressFields.java | 98 +- 
.../person/emailaddress/EmailaddressId.java | 26 +- .../person/emailaddress/EmailaddressRepo.java | 81 +- .../emailaddress/EmailaddressRepoImpl.java | 431 +- .../emailaddress/EmailaddressRepoMock.java | 179 +- .../person/emailaddress/EmailaddressRow.java | 148 +- .../emailaddress/EmailaddressRowUnsaved.java | 135 +- .../person/password/PasswordFields.java | 104 + .../person/password/PasswordRepo.java | 97 + .../person/password/PasswordRepoImpl.java | 194 + .../person/password/PasswordRepoMock.java | 221 + .../postgres/person/password/PasswordRow.java | 99 + .../person/password/PasswordRowUnsaved.java | 84 + .../postgres/person/person/PersonFields.java | 190 +- .../postgres/person/person/PersonRepo.java | 82 +- .../person/person/PersonRepoImpl.java | 555 +- .../person/person/PersonRepoMock.java | 174 +- .../postgres/person/person/PersonRow.java | 443 +- .../person/person/PersonRowUnsaved.java | 459 +- .../stateprovince/StateprovinceFields.java | 140 + .../person/stateprovince/StateprovinceId.java | 33 + .../stateprovince/StateprovinceRepo.java | 96 + .../stateprovince/StateprovinceRepoImpl.java | 217 + .../stateprovince/StateprovinceRepoMock.java | 220 + .../stateprovince/StateprovinceRow.java | 156 + .../StateprovinceRowUnsaved.java | 138 + .../postgres/precisetypes/PaddedString10.java | 88 + .../postgres/precisetypes/PaddedString3.java | 88 + .../postgres/precisetypes/String10.java | 87 + .../postgres/precisetypes/String100.java | 87 + .../postgres/precisetypes/String20.java | 87 + .../postgres/precisetypes/String255.java | 87 + .../postgres/precisetypes/String50.java | 87 + .../production/product/ProductFields.java | 306 + .../production/product/ProductId.java | 33 + .../production/product/ProductRepo.java | 96 + .../production/product/ProductRepoImpl.java | 259 + .../production/product/ProductRepoMock.java | 220 + .../production/product/ProductRow.java | 420 ++ .../production/product/ProductRowUnsaved.java | 367 + .../ProductcategoryFields.java | 88 + 
.../productcategory/ProductcategoryId.java | 33 + .../productcategory/ProductcategoryRepo.java | 96 + .../ProductcategoryRepoImpl.java | 199 + .../ProductcategoryRepoMock.java | 220 + .../productcategory/ProductcategoryRow.java | 92 + .../ProductcategoryRowUnsaved.java | 77 + .../ProductcosthistoryFields.java | 114 + .../ProductcosthistoryId.java | 39 + .../ProductcosthistoryRepo.java | 96 + .../ProductcosthistoryRepoImpl.java | 190 + .../ProductcosthistoryRepoMock.java | 220 + .../ProductcosthistoryRow.java | 126 + .../ProductcosthistoryRowUnsaved.java | 104 + .../productmodel/ProductmodelFields.java | 108 + .../productmodel/ProductmodelId.java | 33 + .../productmodel/ProductmodelRepo.java | 96 + .../productmodel/ProductmodelRepoImpl.java | 203 + .../productmodel/ProductmodelRepoMock.java | 220 + .../productmodel/ProductmodelRow.java | 118 + .../productmodel/ProductmodelRowUnsaved.java | 97 + .../ProductsubcategoryFields.java | 105 + .../ProductsubcategoryId.java | 33 + .../ProductsubcategoryRepo.java | 96 + .../ProductsubcategoryRepoImpl.java | 202 + .../ProductsubcategoryRepoMock.java | 220 + .../ProductsubcategoryRow.java | 109 + .../ProductsubcategoryRowUnsaved.java | 94 + .../unitmeasure/UnitmeasureFields.java | 78 + .../production/unitmeasure/UnitmeasureId.java | 33 + .../unitmeasure/UnitmeasureRepo.java | 96 + .../unitmeasure/UnitmeasureRepoImpl.java | 183 + .../unitmeasure/UnitmeasureRepoMock.java | 220 + .../unitmeasure/UnitmeasureRow.java | 71 + .../unitmeasure/UnitmeasureRowUnsaved.java | 60 + .../postgres/public_/AccountNumber.java | 31 +- .../combined/postgres/public_/Address.java | 43 +- .../postgres/public_/AllTypesComposite.java | 564 +- .../combined/postgres/public_/Complex.java | 32 +- .../postgres/public_/ContactInfo.java | 37 +- .../postgres/public_/EmployeeRecord.java | 49 +- .../combined/postgres/public_/Flag.java | 28 +- .../postgres/public_/InventoryItem.java | 48 +- .../postgres/public_/MetadataRecord.java | 37 +- 
.../combined/postgres/public_/Mydomain.java | 29 +- .../combined/postgres/public_/Myenum.java | 45 + .../combined/postgres/public_/Name.java | 28 +- .../combined/postgres/public_/NameStyle.java | 29 +- .../postgres/public_/NullableTest.java | 37 +- .../postgres/public_/OrderNumber.java | 31 +- .../combined/postgres/public_/PersonName.java | 44 +- .../combined/postgres/public_/Phone.java | 28 +- .../combined/postgres/public_/Point2d.java | 32 +- .../postgres/public_/PolygonCustom.java | 36 +- .../combined/postgres/public_/ShortText.java | 29 +- .../postgres/public_/TablefuncCrosstab2.java | 37 +- .../postgres/public_/TablefuncCrosstab3.java | 44 +- .../postgres/public_/TablefuncCrosstab4.java | 49 +- .../public_/TextWithSpecialChars.java | 72 +- .../combined/postgres/public_/TreeNode.java | 37 +- .../postgres/public_/flaff/FlaffFields.java | 109 + .../postgres/public_/flaff/FlaffId.java | 59 + .../postgres/public_/flaff/FlaffRepo.java | 84 + .../postgres/public_/flaff/FlaffRepoImpl.java | 154 + .../postgres/public_/flaff/FlaffRepoMock.java | 188 + .../postgres/public_/flaff/FlaffRow.java | 99 + .../identity_test/IdentityTestFields.java | 76 + .../public_/identity_test/IdentityTestId.java | 33 + .../identity_test/IdentityTestRepo.java | 96 + .../identity_test/IdentityTestRepoImpl.java | 180 + .../identity_test/IdentityTestRepoMock.java | 220 + .../identity_test/IdentityTestRow.java | 67 + .../identity_test/IdentityTestRowUnsaved.java | 46 + .../public_/issue142/Issue142Fields.java | 56 + .../postgres/public_/issue142/Issue142Id.java | 48 + .../public_/issue142/Issue142Repo.java | 79 + .../public_/issue142/Issue142RepoImpl.java | 137 + .../public_/issue142/Issue142RepoMock.java | 176 + .../public_/issue142/Issue142Row.java | 34 + .../public_/issue142_2/Issue1422Fields.java | 64 + .../public_/issue142_2/Issue1422Repo.java | 80 + .../public_/issue142_2/Issue1422RepoImpl.java | 138 + .../public_/issue142_2/Issue1422RepoMock.java | 177 + 
.../public_/issue142_2/Issue1422Row.java | 40 + .../only_pk_columns/OnlyPkColumnsFields.java | 75 + .../only_pk_columns/OnlyPkColumnsId.java | 38 + .../only_pk_columns/OnlyPkColumnsRepo.java | 79 + .../OnlyPkColumnsRepoImpl.java | 140 + .../OnlyPkColumnsRepoMock.java | 176 + .../only_pk_columns/OnlyPkColumnsRow.java | 56 + .../postgres/public_/pgtest/PgtestFields.java | 703 ++ .../postgres/public_/pgtest/PgtestRepo.java | 35 + .../public_/pgtest/PgtestRepoImpl.java | 61 + .../postgres/public_/pgtest/PgtestRow.java | 747 +++ .../public_/pgtestnull/PgtestnullFields.java | 703 ++ .../public_/pgtestnull/PgtestnullRepo.java | 35 + .../pgtestnull/PgtestnullRepoImpl.java | 61 + .../public_/pgtestnull/PgtestnullRow.java | 748 +++ .../precision_types/PrecisionTypesFields.java | 286 + .../precision_types/PrecisionTypesId.java | 33 + .../precision_types/PrecisionTypesRepo.java | 96 + .../PrecisionTypesRepoImpl.java | 233 + .../PrecisionTypesRepoMock.java | 220 + .../precision_types/PrecisionTypesRow.java | 293 + .../PrecisionTypesRowUnsaved.java | 241 + .../PrecisionTypesNullFields.java | 286 + .../PrecisionTypesNullId.java | 33 + .../PrecisionTypesNullRepo.java | 96 + .../PrecisionTypesNullRepoImpl.java | 233 + .../PrecisionTypesNullRepoMock.java | 220 + .../PrecisionTypesNullRow.java | 294 + .../PrecisionTypesNullRowUnsaved.java | 217 + .../postgres/public_/title/TitleFields.java | 56 + .../postgres/public_/title/TitleId.java | 50 + .../postgres/public_/title/TitleRepo.java | 79 + .../postgres/public_/title/TitleRepoImpl.java | 137 + .../postgres/public_/title/TitleRepoMock.java | 176 + .../postgres/public_/title/TitleRow.java | 34 + .../title_domain/TitleDomainFields.java | 56 + .../public_/title_domain/TitleDomainId.java | 53 + .../public_/title_domain/TitleDomainRepo.java | 79 + .../title_domain/TitleDomainRepoImpl.java | 137 + .../title_domain/TitleDomainRepoMock.java | 176 + .../public_/title_domain/TitleDomainRow.java | 34 + .../titledperson/TitledpersonFields.java | 90 +
.../titledperson/TitledpersonRepo.java | 35 + .../titledperson/TitledpersonRepoImpl.java | 61 + .../public_/titledperson/TitledpersonRow.java | 58 + .../postgres/public_/users/UsersFields.java | 115 + .../postgres/public_/users/UsersId.java | 34 + .../postgres/public_/users/UsersRepo.java | 102 + .../postgres/public_/users/UsersRepoImpl.java | 199 + .../postgres/public_/users/UsersRepoMock.java | 229 + .../postgres/public_/users/UsersRow.java | 108 + .../public_/users/UsersRowUnsaved.java | 86 + .../sales/salesperson/SalespersonFields.java | 149 + .../sales/salesperson/SalespersonRepo.java | 97 + .../salesperson/SalespersonRepoImpl.java | 231 + .../salesperson/SalespersonRepoMock.java | 221 + .../sales/salesperson/SalespersonRow.java | 190 + .../salesperson/SalespersonRowUnsaved.java | 162 + .../salesterritory/SalesterritoryFields.java | 151 + .../salesterritory/SalesterritoryId.java | 33 + .../salesterritory/SalesterritoryRepo.java | 96 + .../SalesterritoryRepoImpl.java | 240 + .../SalesterritoryRepoMock.java | 220 + .../salesterritory/SalesterritoryRow.java | 198 + .../SalesterritoryRowUnsaved.java | 170 + .../postgres/userdefined/ActiveFlag.java | 35 + .../postgres/userdefined/CurrentFlag.java | 35 + .../postgres/userdefined/Description.java | 35 + .../postgres/userdefined/FirstName.java | 35 + .../postgres/userdefined/LastName.java | 35 + .../postgres/userdefined/MiddleName.java | 35 + .../postgres/userdefined/OnlineOrderFlag.java | 35 + .../postgres/userdefined/SalariedFlag.java | 35 + .../shared/combined/shared/FirstName.java | 36 - .../shared/combined/shared/IsActive.java | 40 - .../shared/combined/shared/IsSalaried.java | 30 - .../shared/combined/shared/LastName.java | 36 - .../shared/combined/shared/MiddleName.java | 30 - .../combined/server/CombinedApiServer.java | 45 +- .../api/combined/api/api/CustomersApi.kt | 27 + .../combined/api/api/CustomersApiServer.kt | 48 + .../api/combined/api/api/EmployeesApi.kt | 16 +
.../combined/api/api/EmployeesApiServer.kt | 26 + .../api/combined/api/api/ProductsApi.kt | 15 + .../api/combined/api/api/ProductsApiServer.kt | 22 + .../api/combined/api/model/Customer.kt | 21 + .../api/combined/api/model/CustomerCreate.kt | 13 + .../api/combined/api/model/CustomerUpdate.kt | 15 + .../api/combined/api/model/Employee.kt | 25 + .../api/combined/api/model/Product.kt | 18 + .../avro_events/com/example/events/Address.kt | 35 + .../com/example/events/CustomerId.kt | 22 + .../com/example/events/CustomerOrder.kt | 35 + .../com/example/events/DynamicValue.kt | 36 + .../avro_events/com/example/events/Email.kt | 22 + .../avro_events/com/example/events/Invoice.kt | 38 + .../com/example/events/LinkedListNode.kt | 29 + .../com/example/events/OrderCancelled.kt | 45 + .../com/example/events/OrderEvents.kt | 28 + .../avro_events/com/example/events/OrderId.kt | 22 + .../com/example/events/OrderPlaced.kt | 49 + .../com/example/events/OrderStatus.kt | 22 + .../com/example/events/OrderUpdated.kt | 41 + .../com/example/events/PaymentCallback.kt | 39 + .../com/example/events/PaymentCharged.kt | 40 + .../com/example/events/TreeNode.kt | 32 + .../com/example/events/common/Money.kt | 34 + .../avro_events/com/example/service/Result.kt | 12 + .../avro_events/com/example/service/User.kt | 34 + .../com/example/service/UserNotFoundError.kt | 9 + .../com/example/service/UserService.kt | 25 + .../com/example/service/UserServiceHandler.kt | 8 + .../com/example/service/ValidationError.kt | 9 + .../combined/avro_events/SchemaValidator.kt | 97 + .../avro_events/StringOrIntOrBoolean.kt | 136 + .../combined/avro_events/StringOrLong.kt | 78 + .../combined/avro_events/Topics.kt | 60 + .../combined/avro_events/TypedTopic.kt | 10 + .../avro_events/consumer/AddressConsumer.kt | 28 + .../avro_events/consumer/AddressHandler.kt | 15 + .../consumer/CustomerOrderConsumer.kt | 28 + .../consumer/CustomerOrderHandler.kt | 15 + .../consumer/DynamicValueConsumer.kt | 28 + 
.../consumer/DynamicValueHandler.kt | 15 + .../avro_events/consumer/InvoiceConsumer.kt | 28 + .../avro_events/consumer/InvoiceHandler.kt | 15 + .../consumer/LinkedListNodeConsumer.kt | 28 + .../consumer/LinkedListNodeHandler.kt | 15 + .../avro_events/consumer/MoneyConsumer.kt | 28 + .../avro_events/consumer/MoneyHandler.kt | 15 + .../consumer/OrderEventsConsumer.kt | 40 + .../consumer/OrderEventsHandler.kt | 58 + .../avro_events/consumer/TreeNodeConsumer.kt | 28 + .../avro_events/consumer/TreeNodeHandler.kt | 15 + .../avro_events/header/StandardHeaders.kt | 33 + .../avro_events/precisetypes/Decimal10_2.kt | 50 + .../avro_events/precisetypes/Decimal18_4.kt | 50 + .../avro_events/producer/AddressProducer.kt | 45 + .../producer/CustomerOrderProducer.kt | 45 + .../producer/DynamicValueProducer.kt | 45 + .../avro_events/producer/InvoiceProducer.kt | 45 + .../producer/LinkedListNodeProducer.kt | 45 + .../avro_events/producer/MoneyProducer.kt | 45 + .../producer/OrderEventsProducer.kt | 45 + .../avro_events/producer/TreeNodeProducer.kt | 45 + .../avro_events/serde/AddressSerde.kt | 55 + .../avro_events/serde/CustomerOrderSerde.kt | 55 + .../avro_events/serde/DynamicValueSerde.kt | 55 + .../avro_events/serde/InvoiceSerde.kt | 55 + .../avro_events/serde/LinkedListNodeSerde.kt | 55 + .../combined/avro_events/serde/MoneySerde.kt | 55 + .../avro_events/serde/OrderCancelledSerde.kt | 55 + .../avro_events/serde/OrderEventsSerde.kt | 63 + .../avro_events/serde/OrderPlacedSerde.kt | 55 + .../avro_events/serde/OrderUpdatedSerde.kt | 55 + .../avro_events/serde/PaymentCallbackSerde.kt | 55 + .../avro_events/serde/PaymentChargedSerde.kt | 55 + .../avro_events/serde/TreeNodeSerde.kt | 55 + .../mariadb/AllBrandsCategoriesCSet.kt | 50 + .../mariadb/AllBrandsCategoriesCSetMember.kt | 27 + .../mariadb/BestsellerClearanceFSet.kt | 50 + .../mariadb/BestsellerClearanceFSetMember.kt | 27 + .../combined/mariadb/DefaultedDeserializer.kt | 63 + .../combined/mariadb/DefaultedSerializer.kt | 37 + 
.../combined/mariadb/EmailMailPushSmsSet.kt | 50 + .../mariadb/EmailMailPushSmsSetMember.kt | 26 + .../mariadb/combined/mariadb/XYZSet.kt | 50 + .../mariadb/combined/mariadb/XYZSetMember.kt | 25 + .../mariadb/audit_log/AuditLogFields.kt | 76 + .../combined/mariadb/audit_log/AuditLogId.kt | 27 + .../mariadb/audit_log/AuditLogRepo.kt | 75 + .../mariadb/audit_log/AuditLogRepoImpl.kt | 141 + .../mariadb/audit_log/AuditLogRepoMock.kt | 127 + .../combined/mariadb/audit_log/AuditLogRow.kt | 91 + .../mariadb/audit_log/AuditLogRowUnsaved.kt | 57 + .../combined/mariadb/brands/BrandsFields.kt | 62 + .../combined/mariadb/brands/BrandsId.kt | 27 + .../combined/mariadb/brands/BrandsRepo.kt | 80 + .../combined/mariadb/brands/BrandsRepoImpl.kt | 135 + .../combined/mariadb/brands/BrandsRepoMock.kt | 132 + .../combined/mariadb/brands/BrandsRow.kt | 71 + .../mariadb/brands/BrandsRowUnsaved.kt | 43 + .../combined/mariadb/bridge/Customer.kt | 15 + .../mariadb/categories/CategoriesFields.kt | 73 + .../mariadb/categories/CategoriesId.kt | 27 + .../mariadb/categories/CategoriesRepo.kt | 80 + .../mariadb/categories/CategoriesRepoImpl.kt | 144 + .../mariadb/categories/CategoriesRepoMock.kt | 132 + .../mariadb/categories/CategoriesRow.kt | 86 + .../categories/CategoriesRowUnsaved.kt | 53 + .../CustomerAddressesFields.kt | 98 + .../customer_addresses/CustomerAddressesId.kt | 27 + .../CustomerAddressesRepo.kt | 75 + .../CustomerAddressesRepoImpl.kt | 151 + .../CustomerAddressesRepoMock.kt | 127 + .../CustomerAddressesRow.kt | 110 + .../CustomerAddressesRowUnsaved.kt | 68 + .../customer_status/CustomerStatusFields.kt | 45 + .../customer_status/CustomerStatusId.kt | 26 + .../customer_status/CustomerStatusRepo.kt | 75 + .../customer_status/CustomerStatusRepoImpl.kt | 115 + .../customer_status/CustomerStatusRepoMock.kt | 127 + .../customer_status/CustomerStatusRow.kt | 42 + .../CustomerStatusRowUnsaved.kt | 25 + .../mariadb/customers/CustomersFields.kt | 101 + 
.../combined/mariadb/customers/CustomersId.kt | 27 + .../mariadb/customers/CustomersRepo.kt | 81 + .../mariadb/customers/CustomersRepoImpl.kt | 168 + .../mariadb/customers/CustomersRepoMock.kt | 133 + .../mariadb/customers/CustomersRow.kt | 121 + .../mariadb/customers/CustomersRowUnsaved.kt | 78 + .../combined/mariadb/customtypes/Defaulted.kt | 60 + .../mariadb/inventory/InventoryFields.kt | 89 + .../combined/mariadb/inventory/InventoryId.kt | 27 + .../mariadb/inventory/InventoryRepo.kt | 83 + .../mariadb/inventory/InventoryRepoImpl.kt | 157 + .../mariadb/inventory/InventoryRepoMock.kt | 135 + .../mariadb/inventory/InventoryRow.kt | 105 + .../mariadb/inventory/InventoryRowUnsaved.kt | 69 + .../mariadb/mariatest/MariatestFields.kt | 209 + .../combined/mariadb/mariatest/MariatestId.kt | 26 + .../mariadb/mariatest/MariatestRepo.kt | 75 + .../mariadb/mariatest/MariatestRepoImpl.kt | 194 + .../mariadb/mariatest/MariatestRepoMock.kt | 127 + .../mariadb/mariatest/MariatestRow.kt | 210 + .../mariadb/mariatest/MariatestRowUnsaved.kt | 118 + .../MariatestIdentityFields.kt | 40 + .../mariatest_identity/MariatestIdentityId.kt | 26 + .../MariatestIdentityRepo.kt | 75 + .../MariatestIdentityRepoImpl.kt | 107 + .../MariatestIdentityRepoMock.kt | 127 + .../MariatestIdentityRow.kt | 33 + .../MariatestIdentityRowUnsaved.kt | 14 + .../MariatestSpatialFields.kt | 76 + .../mariatest_spatial/MariatestSpatialId.kt | 26 + .../mariatest_spatial/MariatestSpatialRepo.kt | 75 + .../MariatestSpatialRepoImpl.kt | 121 + .../MariatestSpatialRepoMock.kt | 127 + .../mariatest_spatial/MariatestSpatialRow.kt | 70 + .../MariatestSpatialRowUnsaved.kt | 38 + .../MariatestSpatialNullFields.kt | 76 + .../MariatestSpatialNullId.kt | 26 + .../MariatestSpatialNullRepo.kt | 75 + .../MariatestSpatialNullRepoImpl.kt | 145 + .../MariatestSpatialNullRepoMock.kt | 127 + .../MariatestSpatialNullRow.kt | 96 + .../MariatestSpatialNullRowUnsaved.kt | 66 + .../mariatest_unique/MariatestUniqueFields.kt | 49 + 
.../mariatest_unique/MariatestUniqueId.kt | 26 + .../mariatest_unique/MariatestUniqueRepo.kt | 87 + .../MariatestUniqueRepoImpl.kt | 123 + .../MariatestUniqueRepoMock.kt | 139 + .../mariatest_unique/MariatestUniqueRow.kt | 42 + .../MariatestUniqueRowUnsaved.kt | 20 + .../mariatestnull/MariatestnullFields.kt | 208 + .../mariatestnull/MariatestnullRepo.kt | 33 + .../mariatestnull/MariatestnullRepoImpl.kt | 249 + .../mariadb/mariatestnull/MariatestnullRow.kt | 325 + .../mariatestnull/MariatestnullRowUnsaved.kt | 235 + .../order_history/OrderHistoryFields.kt | 73 + .../mariadb/order_history/OrderHistoryId.kt | 27 + .../mariadb/order_history/OrderHistoryRepo.kt | 75 + .../order_history/OrderHistoryRepoImpl.kt | 135 + .../order_history/OrderHistoryRepoMock.kt | 127 + .../mariadb/order_history/OrderHistoryRow.kt | 82 + .../order_history/OrderHistoryRowUnsaved.kt | 52 + .../mariadb/order_items/OrderItemsFields.kt | 103 + .../mariadb/order_items/OrderItemsId.kt | 27 + .../mariadb/order_items/OrderItemsRepo.kt | 75 + .../mariadb/order_items/OrderItemsRepoImpl.kt | 147 + .../mariadb/order_items/OrderItemsRepoMock.kt | 127 + .../mariadb/order_items/OrderItemsRow.kt | 107 + .../order_items/OrderItemsRowUnsaved.kt | 66 + .../combined/mariadb/orders/OrdersFields.kt | 142 + .../combined/mariadb/orders/OrdersId.kt | 27 + .../combined/mariadb/orders/OrdersRepo.kt | 80 + .../combined/mariadb/orders/OrdersRepoImpl.kt | 206 + .../combined/mariadb/orders/OrdersRepoMock.kt | 132 + .../combined/mariadb/orders/OrdersRow.kt | 180 + .../mariadb/orders/OrdersRowUnsaved.kt | 119 + .../payment_methods/PaymentMethodsFields.kt | 63 + .../payment_methods/PaymentMethodsId.kt | 27 + .../payment_methods/PaymentMethodsRepo.kt | 80 + .../payment_methods/PaymentMethodsRepoImpl.kt | 132 + .../payment_methods/PaymentMethodsRepoMock.kt | 132 + .../payment_methods/PaymentMethodsRow.kt | 69 + .../PaymentMethodsRowUnsaved.kt | 41 + .../mariadb/payments/PaymentsFields.kt | 96 + 
.../combined/mariadb/payments/PaymentsId.kt | 27 + .../combined/mariadb/payments/PaymentsRepo.kt | 75 + .../mariadb/payments/PaymentsRepoImpl.kt | 153 + .../mariadb/payments/PaymentsRepoMock.kt | 127 + .../combined/mariadb/payments/PaymentsRow.kt | 112 + .../mariadb/payments/PaymentsRowUnsaved.kt | 74 + .../combined/mariadb/precisetypes/Binary16.kt | 53 + .../combined/mariadb/precisetypes/Binary32.kt | 53 + .../combined/mariadb/precisetypes/Binary64.kt | 53 + .../mariadb/precisetypes/Decimal10_2.kt | 68 + .../mariadb/precisetypes/Decimal12_4.kt | 68 + .../mariadb/precisetypes/Decimal18_4.kt | 68 + .../mariadb/precisetypes/Decimal5_2.kt | 68 + .../mariadb/precisetypes/Decimal8_2.kt | 68 + .../mariadb/precisetypes/LocalDateTime3.kt | 49 + .../mariadb/precisetypes/LocalDateTime6.kt | 49 + .../mariadb/precisetypes/LocalTime3.kt | 49 + .../mariadb/precisetypes/LocalTime6.kt | 49 + .../mariadb/precisetypes/PaddedString10.kt | 55 + .../combined/mariadb/precisetypes/String10.kt | 55 + .../mariadb/precisetypes/String100.kt | 55 + .../combined/mariadb/precisetypes/String20.kt | 55 + .../mariadb/precisetypes/String255.kt | 55 + .../combined/mariadb/precisetypes/String50.kt | 55 + .../precision_types/PrecisionTypesFields.kt | 148 + .../precision_types/PrecisionTypesId.kt | 26 + .../precision_types/PrecisionTypesRepo.kt | 75 + .../precision_types/PrecisionTypesRepoImpl.kt | 178 + .../precision_types/PrecisionTypesRepoMock.kt | 127 + .../precision_types/PrecisionTypesRow.kt | 152 + .../PrecisionTypesRowUnsaved.kt | 93 + .../PrecisionTypesNullFields.kt | 148 + .../PrecisionTypesNullId.kt | 26 + .../PrecisionTypesNullRepo.kt | 75 + .../PrecisionTypesNullRepoImpl.kt | 238 + .../PrecisionTypesNullRepoMock.kt | 127 + .../PrecisionTypesNullRow.kt | 212 + .../PrecisionTypesNullRowUnsaved.kt | 153 + .../mariadb/price_tiers/PriceTiersFields.kt | 54 + .../mariadb/price_tiers/PriceTiersId.kt | 27 + .../mariadb/price_tiers/PriceTiersRepo.kt | 75 + 
.../mariadb/price_tiers/PriceTiersRepoImpl.kt | 116 + .../mariadb/price_tiers/PriceTiersRepoMock.kt | 127 + .../mariadb/price_tiers/PriceTiersRow.kt | 53 + .../price_tiers/PriceTiersRowUnsaved.kt | 31 + .../ProductCategoriesFields.kt | 65 + .../product_categories/ProductCategoriesId.kt | 27 + .../ProductCategoriesRepo.kt | 75 + .../ProductCategoriesRepoImpl.kt | 122 + .../ProductCategoriesRepoMock.kt | 127 + .../ProductCategoriesRow.kt | 65 + .../ProductCategoriesRowUnsaved.kt | 38 + .../product_images/ProductImagesFields.kt | 73 + .../mariadb/product_images/ProductImagesId.kt | 27 + .../product_images/ProductImagesRepo.kt | 75 + .../product_images/ProductImagesRepoImpl.kt | 136 + .../product_images/ProductImagesRepoMock.kt | 127 + .../product_images/ProductImagesRow.kt | 82 + .../product_images/ProductImagesRowUnsaved.kt | 52 + .../product_prices/ProductPricesFields.kt | 74 + .../mariadb/product_prices/ProductPricesId.kt | 27 + .../product_prices/ProductPricesRepo.kt | 75 + .../product_prices/ProductPricesRepoImpl.kt | 128 + .../product_prices/ProductPricesRepoMock.kt | 127 + .../product_prices/ProductPricesRow.kt | 74 + .../product_prices/ProductPricesRowUnsaved.kt | 45 + .../mariadb/products/ProductsFields.kt | 115 + .../combined/mariadb/products/ProductsId.kt | 27 + .../combined/mariadb/products/ProductsRepo.kt | 80 + .../mariadb/products/ProductsRepoImpl.kt | 188 + .../mariadb/products/ProductsRepoMock.kt | 132 + .../combined/mariadb/products/ProductsRow.kt | 150 + .../mariadb/products/ProductsRowUnsaved.kt | 99 + .../mariadb/promotions/PromotionsFields.kt | 104 + .../mariadb/promotions/PromotionsId.kt | 27 + .../mariadb/promotions/PromotionsRepo.kt | 80 + .../mariadb/promotions/PromotionsRepoImpl.kt | 169 + .../mariadb/promotions/PromotionsRepoMock.kt | 132 + .../mariadb/promotions/PromotionsRow.kt | 128 + .../promotions/PromotionsRowUnsaved.kt | 82 + .../combined/mariadb/reviews/ReviewsFields.kt | 127 + .../combined/mariadb/reviews/ReviewsId.kt | 27 + 
.../combined/mariadb/reviews/ReviewsRepo.kt | 75 + .../mariadb/reviews/ReviewsRepoImpl.kt | 186 + .../mariadb/reviews/ReviewsRepoMock.kt | 127 + .../combined/mariadb/reviews/ReviewsRow.kt | 158 + .../mariadb/reviews/ReviewsRowUnsaved.kt | 107 + .../mariadb/shipments/ShipmentsFields.kt | 121 + .../combined/mariadb/shipments/ShipmentsId.kt | 27 + .../mariadb/shipments/ShipmentsRepo.kt | 75 + .../mariadb/shipments/ShipmentsRepoImpl.kt | 176 + .../mariadb/shipments/ShipmentsRepoMock.kt | 127 + .../mariadb/shipments/ShipmentsRow.kt | 146 + .../mariadb/shipments/ShipmentsRowUnsaved.kt | 97 + .../ShippingCarriersFields.kt | 59 + .../shipping_carriers/ShippingCarriersId.kt | 27 + .../shipping_carriers/ShippingCarriersRepo.kt | 80 + .../ShippingCarriersRepoImpl.kt | 130 + .../ShippingCarriersRepoMock.kt | 132 + .../shipping_carriers/ShippingCarriersRow.kt | 65 + .../ShippingCarriersRowUnsaved.kt | 39 + .../combined/mariadb/userdefined/Email.kt | 28 + .../combined/mariadb/userdefined/FirstName.kt | 28 + .../combined/mariadb/userdefined/IsActive.kt | 28 + .../mariadb/userdefined/IsApproved.kt | 28 + .../combined/mariadb/userdefined/IsDefault.kt | 28 + .../combined/mariadb/userdefined/IsPrimary.kt | 28 + .../mariadb/userdefined/IsVerifiedPurchase.kt | 28 + .../combined/mariadb/userdefined/LastName.kt | 28 + .../VCustomerSummaryViewFields.kt | 77 + .../VCustomerSummaryViewRepo.kt | 16 + .../VCustomerSummaryViewRepoImpl.kt | 18 + .../VCustomerSummaryViewRow.kt | 92 + .../v_daily_sales/VDailySalesViewFields.kt | 74 + .../v_daily_sales/VDailySalesViewRepo.kt | 16 + .../v_daily_sales/VDailySalesViewRepoImpl.kt | 18 + .../v_daily_sales/VDailySalesViewRow.kt | 84 + .../VInventoryStatusViewFields.kt | 91 + .../VInventoryStatusViewRepo.kt | 16 + .../VInventoryStatusViewRepoImpl.kt | 18 + .../VInventoryStatusViewRow.kt | 117 + .../VOrderDetailsViewFields.kt | 92 + .../v_order_details/VOrderDetailsViewRepo.kt | 16 + .../VOrderDetailsViewRepoImpl.kt | 18 + 
.../v_order_details/VOrderDetailsViewRow.kt | 117 + .../VProductCatalogViewFields.kt | 79 + .../VProductCatalogViewRepo.kt | 16 + .../VProductCatalogViewRepoImpl.kt | 18 + .../VProductCatalogViewRow.kt | 95 + .../VWarehouseCoverageViewFields.kt | 75 + .../VWarehouseCoverageViewRepo.kt | 16 + .../VWarehouseCoverageViewRepoImpl.kt | 18 + .../VWarehouseCoverageViewRow.kt | 88 + .../mariadb/warehouses/WarehousesFields.kt | 77 + .../mariadb/warehouses/WarehousesId.kt | 27 + .../mariadb/warehouses/WarehousesRepo.kt | 80 + .../mariadb/warehouses/WarehousesRepoImpl.kt | 145 + .../mariadb/warehouses/WarehousesRepoMock.kt | 132 + .../mariadb/warehouses/WarehousesRow.kt | 89 + .../warehouses/WarehousesRowUnsaved.kt | 55 + .../postgres/DefaultedDeserializer.kt | 63 + .../combined/postgres/DefaultedSerializer.kt | 37 + .../combined/postgres/bridge/Customer.kt | 15 + .../postgres/customtypes/Defaulted.kt | 68 + .../department/DepartmentFields.kt | 50 + .../humanresources/department/DepartmentId.kt | 30 + .../department/DepartmentRepo.kt | 95 + .../department/DepartmentRepoImpl.kt | 139 + .../department/DepartmentRepoMock.kt | 172 + .../department/DepartmentRow.kt | 54 + .../department/DepartmentRowUnsaved.kt | 43 + .../humanresources/employee/EmployeeFields.kt | 104 + .../humanresources/employee/EmployeeRepo.kt | 96 + .../employee/EmployeeRepoImpl.kt | 178 + .../employee/EmployeeRepoMock.kt | 173 + .../humanresources/employee/EmployeeRow.kt | 127 + .../employee/EmployeeRowUnsaved.kt | 116 + .../EmployeedepartmenthistoryFields.kt | 80 + .../EmployeedepartmenthistoryId.kt | 35 + .../EmployeedepartmenthistoryRepo.kt | 95 + .../EmployeedepartmenthistoryRepoImpl.kt | 153 + .../EmployeedepartmenthistoryRepoMock.kt | 172 + .../EmployeedepartmenthistoryRow.kt | 78 + .../EmployeedepartmenthistoryRowUnsaved.kt | 59 + .../humanresources/shift/ShiftFields.kt | 55 + .../postgres/humanresources/shift/ShiftId.kt | 30 + .../humanresources/shift/ShiftRepo.kt | 95 + 
.../humanresources/shift/ShiftRepoImpl.kt | 141 + .../humanresources/shift/ShiftRepoMock.kt | 172 + .../postgres/humanresources/shift/ShiftRow.kt | 59 + .../humanresources/shift/ShiftRowUnsaved.kt | 48 + .../vemployee/VemployeeViewFields.kt | 110 + .../vemployee/VemployeeViewRepo.kt | 16 + .../vemployee/VemployeeViewRepoImpl.kt | 18 + .../vemployee/VemployeeViewRow.kt | 98 + .../information_schema/CardinalNumber.kt | 28 + .../information_schema/CharacterData.kt | 28 + .../information_schema/SqlIdentifier.kt | 28 + .../postgres/information_schema/TimeStamp.kt | 29 + .../postgres/information_schema/YesOrNo.kt | 28 + .../postgres/person/address/AddressFields.kt | 77 + .../postgres/person/address/AddressId.kt | 30 + .../postgres/person/address/AddressRepo.kt | 95 + .../person/address/AddressRepoImpl.kt | 152 + .../person/address/AddressRepoMock.kt | 172 + .../postgres/person/address/AddressRow.kt | 78 + .../person/address/AddressRowUnsaved.kt | 67 + .../person/addresstype/AddresstypeFields.kt | 51 + .../person/addresstype/AddresstypeId.kt | 30 + .../person/addresstype/AddresstypeRepo.kt | 95 + .../person/addresstype/AddresstypeRepoImpl.kt | 142 + .../person/addresstype/AddresstypeRepoMock.kt | 172 + .../person/addresstype/AddresstypeRow.kt | 56 + .../addresstype/AddresstypeRowUnsaved.kt | 45 + .../businessentity/BusinessentityFields.kt | 46 + .../person/businessentity/BusinessentityId.kt | 30 + .../businessentity/BusinessentityRepo.kt | 95 + .../businessentity/BusinessentityRepoImpl.kt | 139 + .../businessentity/BusinessentityRepoMock.kt | 172 + .../businessentity/BusinessentityRow.kt | 51 + .../BusinessentityRowUnsaved.kt | 40 + .../BusinessentityaddressFields.kt | 75 + .../BusinessentityaddressId.kt | 30 + .../BusinessentityaddressRepo.kt | 95 + .../BusinessentityaddressRepoImpl.kt | 151 + .../BusinessentityaddressRepoMock.kt | 172 + .../BusinessentityaddressRow.kt | 73 + .../BusinessentityaddressRowUnsaved.kt | 54 + .../countryregion/CountryregionFields.kt | 46 + 
.../person/countryregion/CountryregionId.kt | 30 + .../person/countryregion/CountryregionRepo.kt | 95 + .../countryregion/CountryregionRepoImpl.kt | 134 + .../countryregion/CountryregionRepoMock.kt | 172 + .../person/countryregion/CountryregionRow.kt | 45 + .../countryregion/CountryregionRowUnsaved.kt | 34 + .../person/emailaddress/EmailaddressFields.kt | 66 + .../person/emailaddress/EmailaddressId.kt | 26 + .../person/emailaddress/EmailaddressRepo.kt | 95 + .../emailaddress/EmailaddressRepoImpl.kt | 150 + .../emailaddress/EmailaddressRepoMock.kt | 172 + .../person/emailaddress/EmailaddressRow.kt | 71 + .../emailaddress/EmailaddressRowUnsaved.kt | 51 + .../person/password/PasswordFields.kt | 60 + .../postgres/person/password/PasswordRepo.kt | 96 + .../person/password/PasswordRepoImpl.kt | 141 + .../person/password/PasswordRepoMock.kt | 173 + .../postgres/person/password/PasswordRow.kt | 57 + .../person/password/PasswordRowUnsaved.kt | 46 + .../postgres/person/person/PersonFields.kt | 98 + .../postgres/person/person/PersonRepo.kt | 96 + .../postgres/person/person/PersonRepoImpl.kt | 167 + .../postgres/person/person/PersonRepoMock.kt | 173 + .../postgres/person/person/PersonRow.kt | 105 + .../person/person/PersonRowUnsaved.kt | 94 + .../stateprovince/StateprovinceFields.kt | 79 + .../person/stateprovince/StateprovinceId.kt | 30 + .../person/stateprovince/StateprovinceRepo.kt | 95 + .../stateprovince/StateprovinceRepoImpl.kt | 156 + .../stateprovince/StateprovinceRepoMock.kt | 172 + .../person/stateprovince/StateprovinceRow.kt | 82 + .../stateprovince/StateprovinceRowUnsaved.kt | 71 + .../postgres/precisetypes/PaddedString10.kt | 59 + .../postgres/precisetypes/PaddedString3.kt | 59 + .../postgres/precisetypes/String10.kt | 59 + .../postgres/precisetypes/String100.kt | 59 + .../postgres/precisetypes/String20.kt | 59 + .../postgres/precisetypes/String255.kt | 59 + .../postgres/precisetypes/String50.kt | 59 + .../production/product/ProductFields.kt | 156 + 
.../postgres/production/product/ProductId.kt | 30 + .../production/product/ProductRepo.kt | 95 + .../production/product/ProductRepoImpl.kt | 194 + .../production/product/ProductRepoMock.kt | 172 + .../postgres/production/product/ProductRow.kt | 180 + .../production/product/ProductRowUnsaved.kt | 170 + .../productcategory/ProductcategoryFields.kt | 51 + .../productcategory/ProductcategoryId.kt | 30 + .../productcategory/ProductcategoryRepo.kt | 95 + .../ProductcategoryRepoImpl.kt | 142 + .../ProductcategoryRepoMock.kt | 172 + .../productcategory/ProductcategoryRow.kt | 56 + .../ProductcategoryRowUnsaved.kt | 45 + .../ProductcosthistoryFields.kt | 66 + .../ProductcosthistoryId.kt | 27 + .../ProductcosthistoryRepo.kt | 95 + .../ProductcosthistoryRepoImpl.kt | 145 + .../ProductcosthistoryRepoMock.kt | 172 + .../ProductcosthistoryRow.kt | 71 + .../ProductcosthistoryRowUnsaved.kt | 51 + .../productmodel/ProductmodelFields.kt | 61 + .../production/productmodel/ProductmodelId.kt | 30 + .../productmodel/ProductmodelRepo.kt | 95 + .../productmodel/ProductmodelRepoImpl.kt | 146 + .../productmodel/ProductmodelRepoMock.kt | 172 + .../productmodel/ProductmodelRow.kt | 65 + .../productmodel/ProductmodelRowUnsaved.kt | 54 + .../ProductsubcategoryFields.kt | 61 + .../ProductsubcategoryId.kt | 30 + .../ProductsubcategoryRepo.kt | 95 + .../ProductsubcategoryRepoImpl.kt | 145 + .../ProductsubcategoryRepoMock.kt | 172 + .../ProductsubcategoryRow.kt | 63 + .../ProductsubcategoryRowUnsaved.kt | 52 + .../unitmeasure/UnitmeasureFields.kt | 46 + .../production/unitmeasure/UnitmeasureId.kt | 30 + .../production/unitmeasure/UnitmeasureRepo.kt | 95 + .../unitmeasure/UnitmeasureRepoImpl.kt | 134 + .../unitmeasure/UnitmeasureRepoMock.kt | 172 + .../production/unitmeasure/UnitmeasureRow.kt | 45 + .../unitmeasure/UnitmeasureRowUnsaved.kt | 34 + .../combined/postgres/public/AccountNumber.kt | 28 + .../combined/postgres/public/Address.kt | 27 + .../postgres/public/AllTypesComposite.kt | 53 + 
.../combined/postgres/public/Complex.kt | 25 + .../combined/postgres/public/ContactInfo.kt | 26 + .../postgres/public/EmployeeRecord.kt | 30 + .../postgres/combined/postgres/public/Flag.kt | 28 + .../combined/postgres/public/InventoryItem.kt | 28 + .../postgres/public/MetadataRecord.kt | 28 + .../combined/postgres/public/Mydomain.kt | 28 + .../combined/postgres/public/Myenum.kt | 32 + .../postgres/combined/postgres/public/Name.kt | 28 + .../combined/postgres/public/NameStyle.kt | 28 + .../combined/postgres/public/NullableTest.kt | 26 + .../combined/postgres/public/OrderNumber.kt | 28 + .../combined/postgres/public/PersonName.kt | 27 + .../combined/postgres/public/Phone.kt | 28 + .../combined/postgres/public/Point2d.kt | 25 + .../combined/postgres/public/PolygonCustom.kt | 25 + .../combined/postgres/public/ShortText.kt | 28 + .../postgres/public/TablefuncCrosstab2.kt | 26 + .../postgres/public/TablefuncCrosstab3.kt | 27 + .../postgres/public/TablefuncCrosstab4.kt | 28 + .../postgres/public/TextWithSpecialChars.kt | 29 + .../combined/postgres/public/TreeNode.kt | 26 + .../postgres/public/flaff/FlaffFields.kt | 61 + .../combined/postgres/public/flaff/FlaffId.kt | 33 + .../postgres/public/flaff/FlaffRepo.kt | 83 + .../postgres/public/flaff/FlaffRepoImpl.kt | 117 + .../postgres/public/flaff/FlaffRepoMock.kt | 148 + .../postgres/public/flaff/FlaffRow.kt | 55 + .../identity_test/IdentityTestFields.kt | 44 + .../public/identity_test/IdentityTestId.kt | 30 + .../public/identity_test/IdentityTestRepo.kt | 95 + .../identity_test/IdentityTestRepoImpl.kt | 131 + .../identity_test/IdentityTestRepoMock.kt | 172 + .../public/identity_test/IdentityTestRow.kt | 42 + .../identity_test/IdentityTestRowUnsaved.kt | 31 + .../public/issue142/Issue142Fields.kt | 34 + .../postgres/public/issue142/Issue142Id.kt | 36 + .../postgres/public/issue142/Issue142Repo.kt | 78 + .../public/issue142/Issue142RepoImpl.kt | 97 + .../public/issue142/Issue142RepoMock.kt | 137 + 
.../postgres/public/issue142/Issue142Row.kt | 27 + .../public/issue142_2/Issue1422Fields.kt | 40 + .../public/issue142_2/Issue1422Repo.kt | 79 + .../public/issue142_2/Issue1422RepoImpl.kt | 98 + .../public/issue142_2/Issue1422RepoMock.kt | 138 + .../public/issue142_2/Issue1422Row.kt | 29 + .../only_pk_columns/OnlyPkColumnsFields.kt | 44 + .../public/only_pk_columns/OnlyPkColumnsId.kt | 26 + .../only_pk_columns/OnlyPkColumnsRepo.kt | 78 + .../only_pk_columns/OnlyPkColumnsRepoImpl.kt | 104 + .../only_pk_columns/OnlyPkColumnsRepoMock.kt | 137 + .../only_pk_columns/OnlyPkColumnsRow.kt | 38 + .../postgres/public/pgtest/PgtestFields.kt | 336 + .../postgres/public/pgtest/PgtestRepo.kt | 35 + .../postgres/public/pgtest/PgtestRepoImpl.kt | 42 + .../postgres/public/pgtest/PgtestRow.kt | 258 + .../public/pgtestnull/PgtestnullFields.kt | 336 + .../public/pgtestnull/PgtestnullRepo.kt | 35 + .../public/pgtestnull/PgtestnullRepoImpl.kt | 42 + .../public/pgtestnull/PgtestnullRow.kt | 258 + .../precision_types/PrecisionTypesFields.kt | 144 + .../precision_types/PrecisionTypesId.kt | 30 + .../precision_types/PrecisionTypesRepo.kt | 95 + .../precision_types/PrecisionTypesRepoImpl.kt | 184 + .../precision_types/PrecisionTypesRepoMock.kt | 172 + .../precision_types/PrecisionTypesRow.kt | 116 + .../PrecisionTypesRowUnsaved.kt | 109 + .../PrecisionTypesNullFields.kt | 144 + .../PrecisionTypesNullId.kt | 30 + .../PrecisionTypesNullRepo.kt | 95 + .../PrecisionTypesNullRepoImpl.kt | 184 + .../PrecisionTypesNullRepoMock.kt | 172 + .../PrecisionTypesNullRow.kt | 116 + .../PrecisionTypesNullRowUnsaved.kt | 109 + .../postgres/public/title/TitleFields.kt | 34 + .../combined/postgres/public/title/TitleId.kt | 38 + .../postgres/public/title/TitleRepo.kt | 78 + .../postgres/public/title/TitleRepoImpl.kt | 97 + .../postgres/public/title/TitleRepoMock.kt | 137 + .../postgres/public/title/TitleRow.kt | 27 + .../public/title_domain/TitleDomainFields.kt | 34 + .../public/title_domain/TitleDomainId.kt | 39 + 
.../public/title_domain/TitleDomainRepo.kt | 78 + .../title_domain/TitleDomainRepoImpl.kt | 97 + .../title_domain/TitleDomainRepoMock.kt | 137 + .../public/title_domain/TitleDomainRow.kt | 27 + .../public/titledperson/TitledpersonFields.kt | 54 + .../public/titledperson/TitledpersonRepo.kt | 35 + .../titledperson/TitledpersonRepoImpl.kt | 42 + .../public/titledperson/TitledpersonRow.kt | 37 + .../postgres/public/users/UsersFields.kt | 63 + .../combined/postgres/public/users/UsersId.kt | 31 + .../postgres/public/users/UsersRepo.kt | 101 + .../postgres/public/users/UsersRepoImpl.kt | 147 + .../postgres/public/users/UsersRepoMock.kt | 178 + .../postgres/public/users/UsersRow.kt | 55 + .../postgres/public/users/UsersRowUnsaved.kt | 45 + .../sales/salesperson/SalespersonFields.kt | 83 + .../sales/salesperson/SalespersonRepo.kt | 96 + .../sales/salesperson/SalespersonRepoImpl.kt | 162 + .../sales/salesperson/SalespersonRepoMock.kt | 173 + .../sales/salesperson/SalespersonRow.kt | 97 + .../salesperson/SalespersonRowUnsaved.kt | 86 + .../salesterritory/SalesterritoryFields.kt | 82 + .../sales/salesterritory/SalesterritoryId.kt | 30 + .../salesterritory/SalesterritoryRepo.kt | 95 + .../salesterritory/SalesterritoryRepoImpl.kt | 167 + .../salesterritory/SalesterritoryRepoMock.kt | 172 + .../sales/salesterritory/SalesterritoryRow.kt | 100 + .../SalesterritoryRowUnsaved.kt | 89 + .../postgres/userdefined/ActiveFlag.kt | 32 + .../postgres/userdefined/CurrentFlag.kt | 32 + .../postgres/userdefined/Description.kt | 32 + .../postgres/userdefined/FirstName.kt | 32 + .../combined/postgres/userdefined/LastName.kt | 32 + .../postgres/userdefined/MiddleName.kt | 32 + .../postgres/userdefined/OnlineOrderFlag.kt | 32 + .../postgres/userdefined/SalariedFlag.kt | 32 + .../testdb/DefaultedDeserializer.java | 36 +- .../testdb/DefaultedSerializer.java | 14 +- .../testdb/EmailAddress.java | 19 +- .../testdb/MoneyAmount.java | 19 +- .../testdb/TestInsert.java | 213 +- 
.../testdb/bridge/Customer.java | 27 + .../CheckConstraintTestFields.java | 75 +- .../CheckConstraintTestId.java | 16 +- .../CheckConstraintTestRepo.java | 59 +- .../CheckConstraintTestRepoImpl.java | 211 +- .../CheckConstraintTestRepoMock.java | 138 +- .../CheckConstraintTestRow.java | 48 +- .../CustomerOrderSummaryViewFields.java | 70 +- .../CustomerOrderSummaryViewRepo.java | 10 +- .../CustomerOrderSummaryViewRepoImpl.java | 28 +- .../CustomerOrderSummaryViewRow.java | 36 +- .../CustomerOrdersSqlRepo.java | 11 +- .../CustomerOrdersSqlRepoImpl.java | 32 +- .../customer_orders/CustomerOrdersSqlRow.java | 72 +- .../customer_stats/CustomerStatsMVFields.java | 70 +- .../customer_stats/CustomerStatsMVRepo.java | 10 +- .../CustomerStatsMVRepoImpl.java | 28 +- .../customer_stats/CustomerStatsMVRow.java | 39 +- .../CustomerSummarySqlRepo.java | 8 +- .../CustomerSummarySqlRepoImpl.java | 23 +- .../CustomerSummarySqlRow.java | 44 +- .../testdb/customers/CustomersFields.java | 67 +- .../testdb/customers/CustomersId.java | 16 +- .../testdb/customers/CustomersRepo.java | 74 +- .../testdb/customers/CustomersRepoImpl.java | 287 +- .../testdb/customers/CustomersRepoMock.java | 154 +- .../testdb/customers/CustomersRow.java | 64 +- .../testdb/customers/CustomersRowUnsaved.java | 49 +- .../testdb/customtypes/Defaulted.java | 46 +- .../testdb/db2test/Db2testFields.java | 296 +- .../testdb/db2test/Db2testId.java | 16 +- .../testdb/db2test/Db2testRepo.java | 60 +- .../testdb/db2test/Db2testRepoImpl.java | 448 +- .../testdb/db2test/Db2testRepoMock.java | 133 +- .../testdb/db2test/Db2testRow.java | 865 +-- .../Db2testIdentityAlwaysFields.java | 54 +- .../Db2testIdentityAlwaysId.java | 16 +- .../Db2testIdentityAlwaysRepo.java | 64 +- .../Db2testIdentityAlwaysRepoImpl.java | 210 +- .../Db2testIdentityAlwaysRepoMock.java | 156 +- .../Db2testIdentityAlwaysRow.java | 32 +- .../Db2testIdentityAlwaysRowUnsaved.java | 14 +- .../Db2testIdentityDefaultFields.java | 54 +- 
.../Db2testIdentityDefaultId.java | 16 +- .../Db2testIdentityDefaultRepo.java | 64 +- .../Db2testIdentityDefaultRepoImpl.java | 232 +- .../Db2testIdentityDefaultRepoMock.java | 157 +- .../Db2testIdentityDefaultRow.java | 32 +- .../Db2testIdentityDefaultRowUnsaved.java | 23 +- .../db2test_unique/Db2testUniqueFields.java | 69 +- .../db2test_unique/Db2testUniqueId.java | 16 +- .../db2test_unique/Db2testUniqueRepo.java | 79 +- .../db2test_unique/Db2testUniqueRepoImpl.java | 276 +- .../db2test_unique/Db2testUniqueRepoMock.java | 164 +- .../db2test_unique/Db2testUniqueRow.java | 43 +- .../Db2testUniqueRowUnsaved.java | 15 +- .../testdb/db2testnull/Db2testnullFields.java | 297 +- .../testdb/db2testnull/Db2testnullRepo.java | 20 +- .../db2testnull/Db2testnullRepoImpl.java | 110 +- .../testdb/db2testnull/Db2testnullRow.java | 860 +-- .../DistinctTypeTestFields.java | 66 +- .../DistinctTypeTestId.java | 16 +- .../DistinctTypeTestRepo.java | 64 +- .../DistinctTypeTestRepoImpl.java | 232 +- .../DistinctTypeTestRepoMock.java | 151 +- .../DistinctTypeTestRow.java | 38 +- .../DistinctTypeTestRowUnsaved.java | 17 +- .../IdentityParamsTestFields.java | 54 +- .../IdentityParamsTestId.java | 16 +- .../IdentityParamsTestRepo.java | 64 +- .../IdentityParamsTestRepoImpl.java | 207 +- .../IdentityParamsTestRepoMock.java | 154 +- .../IdentityParamsTestRow.java | 32 +- .../IdentityParamsTestRowUnsaved.java | 10 +- .../NullabilityTestFields.java | 72 +- .../nullability_test/NullabilityTestRepo.java | 25 +- .../NullabilityTestRepoImpl.java | 120 +- .../nullability_test/NullabilityTestRow.java | 39 +- .../NullabilityTestRowUnsaved.java | 31 +- .../testdb/order_items/OrderItemsFields.java | 90 +- .../testdb/order_items/OrderItemsId.java | 25 +- .../testdb/order_items/OrderItemsRepo.java | 60 +- .../order_items/OrderItemsRepoImpl.java | 232 +- .../order_items/OrderItemsRepoMock.java | 135 +- .../testdb/order_items/OrderItemsRow.java | 64 +- .../testdb/orders/OrdersFields.java | 84 +- 
.../testdb/orders/OrdersId.java | 15 +- .../testdb/orders/OrdersRepo.java | 65 +- .../testdb/orders/OrdersRepoImpl.java | 286 +- .../testdb/orders/OrdersRepoMock.java | 146 +- .../testdb/orders/OrdersRow.java | 71 +- .../testdb/orders/OrdersRowUnsaved.java | 48 +- .../OrdersByCustomerSqlRepo.java | 11 +- .../OrdersByCustomerSqlRepoImpl.java | 27 +- .../OrdersByCustomerSqlRow.java | 102 +- .../java/src/java/testdb/AllTypesTest.java | 2 +- testers/db2/java/src/java/testdb/DSLTest.java | 4 +- .../src/java/testdb/DatabaseFeaturesTest.java | 2 +- .../java/src/java/testdb/Db2TestHelper.java | 32 +- .../java/src/java/testdb/ForeignKeyTest.java | 2 +- .../java/src/java/testdb/MockRepoTest.java | 2 +- .../java/src/java/testdb/SqlScriptTest.java | 2 +- .../java/src/java/testdb/TestInsertTest.java | 2 +- .../db2/java/src/java/testdb/TupleInTest.java | 6 +- .../java/src/java/testdb/WithConnection.java | 32 +- testers/db2/kotlin/build.gradle.kts | 44 - .../testdb/DefaultedDeserializer.kt | 2 +- .../testdb/DefaultedSerializer.kt | 4 +- .../testdb/EmailAddress.kt | 12 +- .../testdb/MoneyAmount.kt | 8 +- .../testdb/TestInsert.kt | 70 +- .../testdb/bridge/Customer.kt | 15 + .../CheckConstraintTestFields.kt | 37 +- .../CheckConstraintTestId.kt | 8 +- .../CheckConstraintTestRepo.kt | 27 +- .../CheckConstraintTestRepoImpl.kt | 61 +- .../CheckConstraintTestRepoMock.kt | 39 +- .../CheckConstraintTestRow.kt | 16 +- .../CustomerOrderSummaryViewFields.kt | 33 +- .../CustomerOrderSummaryViewRepo.kt | 6 +- .../CustomerOrderSummaryViewRepoImpl.kt | 12 +- .../CustomerOrderSummaryViewRow.kt | 15 +- .../customer_orders/CustomerOrdersSqlRepo.kt | 4 +- .../CustomerOrdersSqlRepoImpl.kt | 10 +- .../customer_orders/CustomerOrdersSqlRow.kt | 20 +- .../customer_stats/CustomerStatsMVFields.kt | 33 +- .../customer_stats/CustomerStatsMVRepo.kt | 6 +- .../customer_stats/CustomerStatsMVRepoImpl.kt | 12 +- .../customer_stats/CustomerStatsMVRow.kt | 15 +- .../CustomerSummarySqlRepo.kt | 4 +- 
.../CustomerSummarySqlRepoImpl.kt | 6 +- .../customer_summary/CustomerSummarySqlRow.kt | 20 +- .../testdb/customers/CustomersFields.kt | 38 +- .../testdb/customers/CustomersId.kt | 8 +- .../testdb/customers/CustomersRepo.kt | 31 +- .../testdb/customers/CustomersRepoImpl.kt | 86 +- .../testdb/customers/CustomersRepoMock.kt | 43 +- .../testdb/customers/CustomersRow.kt | 19 +- .../testdb/customers/CustomersRowUnsaved.kt | 4 +- .../testdb/db2test/Db2testFields.kt | 93 +- .../testdb/db2test/Db2testId.kt | 8 +- .../testdb/db2test/Db2testRepo.kt | 27 +- .../testdb/db2test/Db2testRepoImpl.kt | 60 +- .../testdb/db2test/Db2testRepoMock.kt | 39 +- .../testdb/db2test/Db2testRow.kt | 50 +- .../Db2testIdentityAlwaysFields.kt | 30 +- .../Db2testIdentityAlwaysId.kt | 8 +- .../Db2testIdentityAlwaysRepo.kt | 27 +- .../Db2testIdentityAlwaysRepoImpl.kt | 67 +- .../Db2testIdentityAlwaysRepoMock.kt | 39 +- .../Db2testIdentityAlwaysRow.kt | 14 +- .../Db2testIdentityAlwaysRowUnsaved.kt | 2 +- .../Db2testIdentityDefaultFields.kt | 30 +- .../Db2testIdentityDefaultId.kt | 8 +- .../Db2testIdentityDefaultRepo.kt | 27 +- .../Db2testIdentityDefaultRepoImpl.kt | 71 +- .../Db2testIdentityDefaultRepoMock.kt | 39 +- .../Db2testIdentityDefaultRow.kt | 14 +- .../Db2testIdentityDefaultRowUnsaved.kt | 2 +- .../db2test_unique/Db2testUniqueFields.kt | 38 +- .../testdb/db2test_unique/Db2testUniqueId.kt | 8 +- .../db2test_unique/Db2testUniqueRepo.kt | 37 +- .../db2test_unique/Db2testUniqueRepoImpl.kt | 89 +- .../db2test_unique/Db2testUniqueRepoMock.kt | 49 +- .../testdb/db2test_unique/Db2testUniqueRow.kt | 22 +- .../db2test_unique/Db2testUniqueRowUnsaved.kt | 6 +- .../testdb/db2testnull/Db2testnullFields.kt | 91 +- .../testdb/db2testnull/Db2testnullRepo.kt | 11 +- .../testdb/db2testnull/Db2testnullRepoImpl.kt | 27 +- .../testdb/db2testnull/Db2testnullRow.kt | 51 +- .../DistinctTypeTestFields.kt | 28 +- .../distinct_type_test/DistinctTypeTestId.kt | 8 +- .../DistinctTypeTestRepo.kt | 27 +- 
.../DistinctTypeTestRepoImpl.kt | 70 +- .../DistinctTypeTestRepoMock.kt | 39 +- .../distinct_type_test/DistinctTypeTestRow.kt | 7 +- .../IdentityParamsTestFields.kt | 30 +- .../IdentityParamsTestId.kt | 8 +- .../IdentityParamsTestRepo.kt | 27 +- .../IdentityParamsTestRepoImpl.kt | 67 +- .../IdentityParamsTestRepoMock.kt | 39 +- .../IdentityParamsTestRow.kt | 14 +- .../IdentityParamsTestRowUnsaved.kt | 2 +- .../nullability_test/NullabilityTestFields.kt | 39 +- .../nullability_test/NullabilityTestRepo.kt | 11 +- .../NullabilityTestRepoImpl.kt | 47 +- .../nullability_test/NullabilityTestRow.kt | 26 +- .../NullabilityTestRowUnsaved.kt | 8 +- .../testdb/order_items/OrderItemsFields.kt | 45 +- .../testdb/order_items/OrderItemsId.kt | 8 +- .../testdb/order_items/OrderItemsRepo.kt | 27 +- .../testdb/order_items/OrderItemsRepoImpl.kt | 64 +- .../testdb/order_items/OrderItemsRepoMock.kt | 39 +- .../testdb/order_items/OrderItemsRow.kt | 17 +- .../testdb/orders/OrdersFields.kt | 41 +- .../testdb/orders/OrdersId.kt | 8 +- .../testdb/orders/OrdersRepo.kt | 27 +- .../testdb/orders/OrdersRepoImpl.kt | 81 +- .../testdb/orders/OrdersRepoMock.kt | 39 +- .../testdb/orders/OrdersRow.kt | 18 +- .../testdb/orders/OrdersRowUnsaved.kt | 4 +- .../OrdersByCustomerSqlRepo.kt | 4 +- .../OrdersByCustomerSqlRepoImpl.kt | 10 +- .../OrdersByCustomerSqlRow.kt | 20 +- testers/db2/kotlin/gradle.properties | 1 - .../kotlin/src/kotlin/testdb/AllTypesTest.kt | 2 +- .../src/kotlin/testdb/DatabaseFeaturesTest.kt | 2 +- .../kotlin/src/kotlin/testdb/Db2TestHelper.kt | 20 +- .../kotlin/src/kotlin/testdb/SqlScriptTest.kt | 2 +- .../kotlin/src/kotlin/testdb/TupleInTest.kt | 6 +- .../testdb/EmailAddress.scala | 10 +- .../testdb/MoneyAmount.scala | 10 +- .../testdb/TestInsert.scala | 2 +- .../testdb/bridge/Customer.scala | 15 + .../CheckConstraintTestFields.scala | 33 +- .../CheckConstraintTestId.scala | 10 +- .../CheckConstraintTestRepo.scala | 19 +- .../CheckConstraintTestRepoImpl.scala | 63 +- 
.../CheckConstraintTestRepoMock.scala | 31 +- .../CheckConstraintTestRow.scala | 10 +- .../CustomerOrderSummaryViewFields.scala | 29 +- .../CustomerOrderSummaryViewRepo.scala | 6 +- .../CustomerOrderSummaryViewRepoImpl.scala | 14 +- .../CustomerOrderSummaryViewRow.scala | 9 +- .../CustomerOrdersSqlRepo.scala | 4 +- .../CustomerOrdersSqlRepoImpl.scala | 14 +- .../CustomerOrdersSqlRow.scala | 10 +- .../CustomerStatsMVFields.scala | 29 +- .../customer_stats/CustomerStatsMVRepo.scala | 6 +- .../CustomerStatsMVRepoImpl.scala | 14 +- .../customer_stats/CustomerStatsMVRow.scala | 9 +- .../CustomerSummarySqlRepo.scala | 4 +- .../CustomerSummarySqlRepoImpl.scala | 8 +- .../CustomerSummarySqlRow.scala | 10 +- .../testdb/customers/CustomersFields.scala | 32 +- .../testdb/customers/CustomersId.scala | 10 +- .../testdb/customers/CustomersRepo.scala | 21 +- .../testdb/customers/CustomersRepoImpl.scala | 74 +- .../testdb/customers/CustomersRepoMock.scala | 33 +- .../testdb/customers/CustomersRow.scala | 9 +- .../testdb/db2test/Db2testFields.scala | 71 +- .../testdb/db2test/Db2testId.scala | 10 +- .../testdb/db2test/Db2testRepo.scala | 19 +- .../testdb/db2test/Db2testRepoImpl.scala | 76 +- .../testdb/db2test/Db2testRepoMock.scala | 31 +- .../testdb/db2test/Db2testRow.scala | 9 +- .../Db2testIdentityAlwaysFields.scala | 26 +- .../Db2testIdentityAlwaysId.scala | 10 +- .../Db2testIdentityAlwaysRepo.scala | 19 +- .../Db2testIdentityAlwaysRepoImpl.scala | 55 +- .../Db2testIdentityAlwaysRepoMock.scala | 31 +- .../Db2testIdentityAlwaysRow.scala | 8 +- .../Db2testIdentityDefaultFields.scala | 26 +- .../Db2testIdentityDefaultId.scala | 10 +- .../Db2testIdentityDefaultRepo.scala | 19 +- .../Db2testIdentityDefaultRepoImpl.scala | 57 +- .../Db2testIdentityDefaultRepoMock.scala | 31 +- .../Db2testIdentityDefaultRow.scala | 8 +- .../db2test_unique/Db2testUniqueFields.scala | 30 +- .../db2test_unique/Db2testUniqueId.scala | 10 +- .../db2test_unique/Db2testUniqueRepo.scala | 23 +- 
.../Db2testUniqueRepoImpl.scala | 67 +- .../Db2testUniqueRepoMock.scala | 35 +- .../db2test_unique/Db2testUniqueRow.scala | 8 +- .../db2testnull/Db2testnullFields.scala | 69 +- .../testdb/db2testnull/Db2testnullRepo.scala | 11 +- .../db2testnull/Db2testnullRepoImpl.scala | 31 +- .../testdb/db2testnull/Db2testnullRow.scala | 10 +- .../DistinctTypeTestFields.scala | 28 +- .../DistinctTypeTestId.scala | 10 +- .../DistinctTypeTestRepo.scala | 19 +- .../DistinctTypeTestRepoImpl.scala | 66 +- .../DistinctTypeTestRepoMock.scala | 31 +- .../DistinctTypeTestRow.scala | 7 +- .../IdentityParamsTestFields.scala | 26 +- .../IdentityParamsTestId.scala | 10 +- .../IdentityParamsTestRepo.scala | 19 +- .../IdentityParamsTestRepoImpl.scala | 55 +- .../IdentityParamsTestRepoMock.scala | 31 +- .../IdentityParamsTestRow.scala | 8 +- .../NullabilityTestFields.scala | 31 +- .../NullabilityTestRepo.scala | 11 +- .../NullabilityTestRepoImpl.scala | 45 +- .../nullability_test/NullabilityTestRow.scala | 10 +- .../testdb/order_items/OrderItemsFields.scala | 37 +- .../testdb/order_items/OrderItemsId.scala | 8 +- .../testdb/order_items/OrderItemsRepo.scala | 19 +- .../order_items/OrderItemsRepoImpl.scala | 66 +- .../order_items/OrderItemsRepoMock.scala | 31 +- .../testdb/order_items/OrderItemsRow.scala | 11 +- .../testdb/orders/OrdersFields.scala | 37 +- .../testdb/orders/OrdersId.scala | 10 +- .../testdb/orders/OrdersRepo.scala | 19 +- .../testdb/orders/OrdersRepoImpl.scala | 75 +- .../testdb/orders/OrdersRepoMock.scala | 31 +- .../testdb/orders/OrdersRow.scala | 10 +- .../OrdersByCustomerSqlRepo.scala | 4 +- .../OrdersByCustomerSqlRepoImpl.scala | 14 +- .../OrdersByCustomerSqlRow.scala | 10 +- .../scala/src/scala/testdb/AllTypesTest.scala | 50 +- .../scala/testdb/DatabaseFeaturesTest.scala | 66 +- .../src/scala/testdb/SqlScriptTest.scala | 22 +- .../scala/src/scala/testdb/TupleInTest.scala | 41 +- .../src/scala/testdb/withConnection.scala | 10 +- .../testdb/DefaultedDeserializer.java | 36 +- 
.../testdb/DefaultedSerializer.java | 14 +- .../generated-and-checked-in/testdb/Mood.java | 72 +- .../testdb/Priority.java | 74 +- .../testdb/TestInsert.java | 241 +- .../AllScalarTypesFields.java | 334 +- .../all_scalar_types/AllScalarTypesId.java | 24 +- .../all_scalar_types/AllScalarTypesRepo.java | 60 +- .../AllScalarTypesRepoImpl.java | 425 +- .../AllScalarTypesRepoMock.java | 133 +- .../all_scalar_types/AllScalarTypesRow.java | 938 +-- .../AllScalarTypesSearchSqlRepo.java | 31 +- .../AllScalarTypesSearchSqlRepoImpl.java | 116 +- .../AllScalarTypesSearchSqlRow.java | 986 +-- .../testdb/bridge/Customer.java | 27 + .../CustomerOrdersViewFields.java | 107 +- .../CustomerOrdersViewRepo.java | 10 +- .../CustomerOrdersViewRepoImpl.java | 29 +- .../CustomerOrdersViewRow.java | 90 +- .../CustomerSearchSqlRepo.java | 19 +- .../CustomerSearchSqlRepoImpl.java | 56 +- .../customer_search/CustomerSearchSqlRow.java | 55 +- .../testdb/customers/CustomersFields.java | 80 +- .../testdb/customers/CustomersId.java | 24 +- .../testdb/customers/CustomersRepo.java | 65 +- .../testdb/customers/CustomersRepoImpl.java | 280 +- .../testdb/customers/CustomersRepoMock.java | 148 +- .../testdb/customers/CustomersRow.java | 59 +- .../testdb/customers/CustomersRowUnsaved.java | 41 +- .../testdb/customtypes/Defaulted.java | 46 +- .../DeleteOldOrdersSqlRepo.java | 12 +- .../DeleteOldOrdersSqlRepoImpl.java | 37 +- .../DeleteOldOrdersSqlRow.java | 54 +- .../DepartmentEmployeeDetailsSqlRepo.java | 17 +- .../DepartmentEmployeeDetailsSqlRepoImpl.java | 61 +- .../DepartmentEmployeeDetailsSqlRow.java | 228 +- .../testdb/departments/DepartmentsFields.java | 75 +- .../testdb/departments/DepartmentsId.java | 25 +- .../testdb/departments/DepartmentsRepo.java | 60 +- .../departments/DepartmentsRepoImpl.java | 216 +- .../departments/DepartmentsRepoMock.java | 135 +- .../testdb/departments/DepartmentsRow.java | 50 +- .../EmployeeSalaryUpdateSqlRepo.java | 17 +- .../EmployeeSalaryUpdateSqlRepoImpl.java | 50 +-
.../EmployeeSalaryUpdateSqlRow.java | 97 +- .../testdb/employees/EmployeesFields.java | 109 +- .../testdb/employees/EmployeesId.java | 25 +- .../testdb/employees/EmployeesRepo.java | 65 +- .../testdb/employees/EmployeesRepoImpl.java | 321 +- .../testdb/employees/EmployeesRepoMock.java | 148 +- .../testdb/employees/EmployeesRow.java | 103 +- .../testdb/employees/EmployeesRowUnsaved.java | 65 +- .../InsertOrderWithItemsSqlRepo.java | 19 +- .../InsertOrderWithItemsSqlRepoImpl.java | 52 +- .../InsertOrderWithItemsSqlRow.java | 54 +- .../order_details/OrderDetailsViewFields.java | 107 +- .../order_details/OrderDetailsViewRepo.java | 10 +- .../OrderDetailsViewRepoImpl.java | 29 +- .../order_details/OrderDetailsViewRow.java | 90 +- .../testdb/order_items/OrderItemsFields.java | 73 +- .../testdb/order_items/OrderItemsId.java | 25 +- .../testdb/order_items/OrderItemsRepo.java | 65 +- .../order_items/OrderItemsRepoImpl.java | 273 +- .../order_items/OrderItemsRepoMock.java | 151 +- .../testdb/order_items/OrderItemsRow.java | 53 +- .../order_items/OrderItemsRowUnsaved.java | 26 +- .../OrderItemsBulkInsertSqlRepo.java | 17 +- .../OrderItemsBulkInsertSqlRepoImpl.java | 46 +- .../OrderItemsBulkInsertSqlRow.java | 44 +- .../OrderSummaryByCustomerSqlRepo.java | 15 +- .../OrderSummaryByCustomerSqlRepoImpl.java | 62 +- .../OrderSummaryByCustomerSqlRow.java | 198 +- .../testdb/orders/OrdersFields.java | 79 +- .../testdb/orders/OrdersId.java | 24 +- .../testdb/orders/OrdersRepo.java | 65 +- .../testdb/orders/OrdersRepoImpl.java | 284 +- .../testdb/orders/OrdersRepoMock.java | 146 +- .../testdb/orders/OrdersRow.java | 58 +- .../testdb/orders/OrdersRowUnsaved.java | 42 +- .../testdb/precisetypes/Decimal10_2.java | 74 +- .../testdb/precisetypes/Decimal18_4.java | 74 +- .../testdb/precisetypes/Decimal5_2.java | 74 +- .../testdb/precisetypes/Int10.java | 71 +- .../testdb/precisetypes/Int18.java | 71 +- .../testdb/precisetypes/Int5.java | 71 +- 
.../precision_types/PrecisionTypesFields.java | 168 +- .../precision_types/PrecisionTypesId.java | 24 +- .../precision_types/PrecisionTypesRepo.java | 60 +- .../PrecisionTypesRepoImpl.java | 279 +- .../PrecisionTypesRepoMock.java | 133 +- .../precision_types/PrecisionTypesRow.java | 274 +- .../PrecisionTypesNullFields.java | 171 +- .../PrecisionTypesNullId.java | 24 +- .../PrecisionTypesNullRepo.java | 59 +- .../PrecisionTypesNullRepoImpl.java | 284 +- .../PrecisionTypesNullRepoMock.java | 137 +- .../PrecisionTypesNullRow.java | 274 +- .../ProductDetailsWithSalesSqlRepo.java | 17 +- .../ProductDetailsWithSalesSqlRepoImpl.java | 65 +- .../ProductDetailsWithSalesSqlRow.java | 200 +- .../ProductSummarySqlRepo.java | 8 +- .../ProductSummarySqlRepoImpl.java | 30 +- .../product_summary/ProductSummarySqlRow.java | 98 +- .../testdb/products/ProductsFields.java | 79 +- .../testdb/products/ProductsId.java | 24 +- .../testdb/products/ProductsRepo.java | 65 +- .../testdb/products/ProductsRepoImpl.java | 214 +- .../testdb/products/ProductsRepoMock.java | 139 +- .../testdb/products/ProductsRow.java | 47 +- .../UpdateCustomerPrioritySqlRepo.java | 13 +- .../UpdateCustomerPrioritySqlRepoImpl.java | 36 +- .../UpdateCustomerPrioritySqlRow.java | 55 +- .../testdb/userdefined/Email.java | 26 +- .../java/src/java/testdb/AllTypesTest.java | 4 +- .../duckdb/java/src/java/testdb/DSLTest.java | 2 +- .../src/java/testdb/DatabaseFeaturesTest.java | 5 +- .../src/java/testdb/DuckDbTestHelper.java | 58 +- .../java/src/java/testdb/MockRepoTest.java | 4 +- .../java/src/java/testdb/SqlScriptTest.java | 5 +- .../java/src/java/testdb/TestInsertTest.java | 2 +- .../java/src/java/testdb/TupleInTest.java | 6 +- testers/duckdb/kotlin/build.gradle.kts | 44 - .../testdb/DefaultedDeserializer.kt | 2 +- .../testdb/DefaultedSerializer.kt | 4 +- .../generated-and-checked-in/testdb/Mood.kt | 19 +- .../testdb/Priority.kt | 19 +- .../testdb/TestInsert.kt | 67 +- .../all_scalar_types/AllScalarTypesFields.kt | 101 +-
.../all_scalar_types/AllScalarTypesId.kt | 15 +- .../all_scalar_types/AllScalarTypesRepo.kt | 27 +- .../AllScalarTypesRepoImpl.kt | 65 +- .../AllScalarTypesRepoMock.kt | 39 +- .../all_scalar_types/AllScalarTypesRow.kt | 53 +- .../AllScalarTypesSearchSqlRepo.kt | 10 +- .../AllScalarTypesSearchSqlRepoImpl.kt | 18 +- .../AllScalarTypesSearchSqlRow.kt | 53 +- .../testdb/bridge/Customer.kt | 15 + .../CustomerOrdersViewFields.kt | 43 +- .../customer_orders/CustomerOrdersViewRepo.kt | 6 +- .../CustomerOrdersViewRepoImpl.kt | 12 +- .../customer_orders/CustomerOrdersViewRow.kt | 24 +- .../customer_search/CustomerSearchSqlRepo.kt | 10 +- .../CustomerSearchSqlRepoImpl.kt | 18 +- .../customer_search/CustomerSearchSqlRow.kt | 15 +- .../testdb/customers/CustomersFields.kt | 38 +- .../testdb/customers/CustomersId.kt | 15 +- .../testdb/customers/CustomersRepo.kt | 27 +- .../testdb/customers/CustomersRepoImpl.kt | 88 +- .../testdb/customers/CustomersRepoMock.kt | 39 +- .../testdb/customers/CustomersRow.kt | 15 +- .../testdb/customers/CustomersRowUnsaved.kt | 2 +- .../DeleteOldOrdersSqlRepo.kt | 6 +- .../DeleteOldOrdersSqlRepoImpl.kt | 13 +- .../DeleteOldOrdersSqlRow.kt | 16 +- .../DepartmentEmployeeDetailsSqlRepo.kt | 8 +- .../DepartmentEmployeeDetailsSqlRepoImpl.kt | 15 +- .../DepartmentEmployeeDetailsSqlRow.kt | 32 +- .../testdb/departments/DepartmentsFields.kt | 46 +- .../testdb/departments/DepartmentsId.kt | 18 +- .../testdb/departments/DepartmentsRepo.kt | 27 +- .../testdb/departments/DepartmentsRepoImpl.kt | 66 +- .../testdb/departments/DepartmentsRepoMock.kt | 39 +- .../testdb/departments/DepartmentsRow.kt | 25 +- .../EmployeeSalaryUpdateSqlRepo.kt | 6 +- .../EmployeeSalaryUpdateSqlRepoImpl.kt | 14 +- .../EmployeeSalaryUpdateSqlRow.kt | 28 +- .../testdb/employees/EmployeesFields.kt | 55 +- .../testdb/employees/EmployeesId.kt | 15 +- .../testdb/employees/EmployeesRepo.kt | 27 +- .../testdb/employees/EmployeesRepoImpl.kt | 99 +- .../testdb/employees/EmployeesRepoMock.kt | 39 +-
.../testdb/employees/EmployeesRow.kt | 34 +- .../testdb/employees/EmployeesRowUnsaved.kt | 8 +- .../InsertOrderWithItemsSqlRepo.kt | 6 +- .../InsertOrderWithItemsSqlRepoImpl.kt | 14 +- .../InsertOrderWithItemsSqlRow.kt | 16 +- .../order_details/OrderDetailsViewFields.kt | 41 +- .../order_details/OrderDetailsViewRepo.kt | 6 +- .../order_details/OrderDetailsViewRepoImpl.kt | 12 +- .../order_details/OrderDetailsViewRow.kt | 20 +- .../testdb/order_items/OrderItemsFields.kt | 37 +- .../testdb/order_items/OrderItemsId.kt | 8 +- .../testdb/order_items/OrderItemsRepo.kt | 27 +- .../testdb/order_items/OrderItemsRepoImpl.kt | 86 +- .../testdb/order_items/OrderItemsRepoMock.kt | 39 +- .../testdb/order_items/OrderItemsRow.kt | 9 +- .../OrderItemsBulkInsertSqlRepo.kt | 4 +- .../OrderItemsBulkInsertSqlRepoImpl.kt | 11 +- .../OrderItemsBulkInsertSqlRow.kt | 9 +- .../OrderSummaryByCustomerSqlRepo.kt | 6 +- .../OrderSummaryByCustomerSqlRepoImpl.kt | 14 +- .../OrderSummaryByCustomerSqlRow.kt | 24 +- .../testdb/orders/OrdersFields.kt | 39 +- .../testdb/orders/OrdersId.kt | 15 +- .../testdb/orders/OrdersRepo.kt | 27 +- .../testdb/orders/OrdersRepoImpl.kt | 89 +- .../testdb/orders/OrdersRepoMock.kt | 39 +- .../testdb/orders/OrdersRow.kt | 18 +- .../testdb/orders/OrdersRowUnsaved.kt | 4 +- .../testdb/precisetypes/Decimal10_2.kt | 48 +- .../testdb/precisetypes/Decimal18_4.kt | 48 +- .../testdb/precisetypes/Decimal5_2.kt | 48 +- .../testdb/precisetypes/Int10.kt | 43 +- .../testdb/precisetypes/Int18.kt | 43 +- .../testdb/precisetypes/Int5.kt | 43 +- .../precision_types/PrecisionTypesFields.kt | 58 +- .../precision_types/PrecisionTypesId.kt | 15 +- .../precision_types/PrecisionTypesRepo.kt | 27 +- .../precision_types/PrecisionTypesRepoImpl.kt | 63 +- .../precision_types/PrecisionTypesRepoMock.kt | 39 +- .../precision_types/PrecisionTypesRow.kt | 30 +- .../PrecisionTypesNullFields.kt | 58 +- .../PrecisionTypesNullId.kt | 15 +- .../PrecisionTypesNullRepo.kt | 27 +- 
.../PrecisionTypesNullRepoImpl.kt | 64 +- .../PrecisionTypesNullRepoMock.kt | 39 +- .../PrecisionTypesNullRow.kt | 31 +- .../ProductDetailsWithSalesSqlRepo.kt | 8 +- .../ProductDetailsWithSalesSqlRepoImpl.kt | 15 +- .../ProductDetailsWithSalesSqlRow.kt | 38 +- .../product_summary/ProductSummarySqlRepo.kt | 4 +- .../ProductSummarySqlRepoImpl.kt | 6 +- .../product_summary/ProductSummarySqlRow.kt | 30 +- .../testdb/products/ProductsFields.kt | 40 +- .../testdb/products/ProductsId.kt | 15 +- .../testdb/products/ProductsRepo.kt | 31 +- .../testdb/products/ProductsRepoImpl.kt | 70 +- .../testdb/products/ProductsRepoMock.kt | 43 +- .../testdb/products/ProductsRow.kt | 19 +- .../UpdateCustomerPrioritySqlRepo.kt | 4 +- .../UpdateCustomerPrioritySqlRepoImpl.kt | 8 +- .../UpdateCustomerPrioritySqlRow.kt | 15 +- .../testdb/userdefined/Email.kt | 20 +- testers/duckdb/kotlin/gradle.properties | 1 - .../kotlin/src/kotlin/testdb/AllTypesTest.kt | 2 +- .../src/kotlin/testdb/DatabaseFeaturesTest.kt | 4 +- .../src/kotlin/testdb/DuckDbTestHelper.kt | 30 +- .../kotlin/src/kotlin/testdb/MockRepoTest.kt | 2 +- .../kotlin/src/kotlin/testdb/SqlScriptTest.kt | 4 +- .../src/kotlin/testdb/TestInsertTest.kt | 2 +- .../kotlin/src/kotlin/testdb/TupleInTest.kt | 6 +- .../testdb/Mood.scala | 42 +- .../testdb/Priority.scala | 44 +- .../testdb/TestInsert.scala | 7 +- .../AllScalarTypesFields.scala | 87 +- .../all_scalar_types/AllScalarTypesId.scala | 13 +- .../all_scalar_types/AllScalarTypesRepo.scala | 19 +- .../AllScalarTypesRepoImpl.scala | 107 +- .../AllScalarTypesRepoMock.scala | 31 +- .../all_scalar_types/AllScalarTypesRow.scala | 18 +- .../AllScalarTypesSearchSqlRepo.scala | 4 +- .../AllScalarTypesSearchSqlRepoImpl.scala | 36 +- .../AllScalarTypesSearchSqlRow.scala | 18 +- .../testdb/bridge/Customer.scala | 15 + .../CustomerOrdersViewFields.scala | 35 +- .../CustomerOrdersViewRepo.scala | 6 +- .../CustomerOrdersViewRepoImpl.scala | 14 +- .../CustomerOrdersViewRow.scala | 10 +- 
.../CustomerSearchSqlRepo.scala | 4 +- .../CustomerSearchSqlRepoImpl.scala | 22 +- .../CustomerSearchSqlRow.scala | 9 +- .../testdb/customers/CustomersFields.scala | 34 +- .../testdb/customers/CustomersId.scala | 13 +- .../testdb/customers/CustomersRepo.scala | 19 +- .../testdb/customers/CustomersRepoImpl.scala | 76 +- .../testdb/customers/CustomersRepoMock.scala | 31 +- .../testdb/customers/CustomersRow.scala | 9 +- .../DeleteOldOrdersSqlRepo.scala | 4 +- .../DeleteOldOrdersSqlRepoImpl.scala | 15 +- .../DeleteOldOrdersSqlRow.scala | 10 +- .../DepartmentEmployeeDetailsSqlRepo.scala | 4 +- ...DepartmentEmployeeDetailsSqlRepoImpl.scala | 22 +- .../DepartmentEmployeeDetailsSqlRow.scala | 10 +- .../departments/DepartmentsFields.scala | 35 +- .../testdb/departments/DepartmentsId.scala | 8 +- .../testdb/departments/DepartmentsRepo.scala | 19 +- .../departments/DepartmentsRepoImpl.scala | 73 +- .../departments/DepartmentsRepoMock.scala | 31 +- .../testdb/departments/DepartmentsRow.scala | 12 +- .../EmployeeSalaryUpdateSqlRepo.scala | 4 +- .../EmployeeSalaryUpdateSqlRepoImpl.scala | 20 +- .../EmployeeSalaryUpdateSqlRow.scala | 10 +- .../testdb/employees/EmployeesFields.scala | 41 +- .../testdb/employees/EmployeesId.scala | 9 +- .../testdb/employees/EmployeesRepo.scala | 19 +- .../testdb/employees/EmployeesRepoImpl.scala | 93 +- .../testdb/employees/EmployeesRepoMock.scala | 31 +- .../testdb/employees/EmployeesRow.scala | 12 +- .../InsertOrderWithItemsSqlRepo.scala | 4 +- .../InsertOrderWithItemsSqlRepoImpl.scala | 22 +- .../InsertOrderWithItemsSqlRow.scala | 10 +- .../OrderDetailsViewFields.scala | 35 +- .../order_details/OrderDetailsViewRepo.scala | 6 +- .../OrderDetailsViewRepoImpl.scala | 14 +- .../order_details/OrderDetailsViewRow.scala | 10 +- .../testdb/order_items/OrderItemsFields.scala | 32 +- .../testdb/order_items/OrderItemsId.scala | 8 +- .../testdb/order_items/OrderItemsRepo.scala | 19 +- .../order_items/OrderItemsRepoImpl.scala | 89 +- 
.../order_items/OrderItemsRepoMock.scala | 31 +- .../testdb/order_items/OrderItemsRow.scala | 10 +- .../OrderItemsBulkInsertSqlRepo.scala | 4 +- .../OrderItemsBulkInsertSqlRepoImpl.scala | 16 +- .../OrderItemsBulkInsertSqlRow.scala | 8 +- .../OrderSummaryByCustomerSqlRepo.scala | 6 +- .../OrderSummaryByCustomerSqlRepoImpl.scala | 23 +- .../OrderSummaryByCustomerSqlRow.scala | 10 +- .../testdb/orders/OrdersFields.scala | 35 +- .../testdb/orders/OrdersId.scala | 13 +- .../testdb/orders/OrdersRepo.scala | 19 +- .../testdb/orders/OrdersRepoImpl.scala | 81 +- .../testdb/orders/OrdersRepoMock.scala | 31 +- .../testdb/orders/OrdersRow.scala | 10 +- .../testdb/precisetypes/Decimal10_2.scala | 15 +- .../testdb/precisetypes/Decimal18_4.scala | 15 +- .../testdb/precisetypes/Decimal5_2.scala | 15 +- .../testdb/precisetypes/Int10.scala | 14 +- .../testdb/precisetypes/Int18.scala | 14 +- .../testdb/precisetypes/Int5.scala | 14 +- .../PrecisionTypesFields.scala | 46 +- .../precision_types/PrecisionTypesId.scala | 13 +- .../precision_types/PrecisionTypesRepo.scala | 19 +- .../PrecisionTypesRepoImpl.scala | 53 +- .../PrecisionTypesRepoMock.scala | 31 +- .../precision_types/PrecisionTypesRow.scala | 8 +- .../PrecisionTypesNullFields.scala | 46 +- .../PrecisionTypesNullId.scala | 13 +- .../PrecisionTypesNullRepo.scala | 19 +- .../PrecisionTypesNullRepoImpl.scala | 80 +- .../PrecisionTypesNullRepoMock.scala | 31 +- .../PrecisionTypesNullRow.scala | 9 +- .../ProductDetailsWithSalesSqlRepo.scala | 6 +- .../ProductDetailsWithSalesSqlRepoImpl.scala | 24 +- .../ProductDetailsWithSalesSqlRow.scala | 12 +- .../ProductSummarySqlRepo.scala | 4 +- .../ProductSummarySqlRepoImpl.scala | 8 +- .../ProductSummarySqlRow.scala | 12 +- .../testdb/products/ProductsFields.scala | 35 +- .../testdb/products/ProductsId.scala | 13 +- .../testdb/products/ProductsRepo.scala | 21 +- .../testdb/products/ProductsRepoImpl.scala | 67 +- .../testdb/products/ProductsRepoMock.scala | 33 +- 
.../testdb/products/ProductsRow.scala | 10 +- .../UpdateCustomerPrioritySqlRepo.scala | 4 +- .../UpdateCustomerPrioritySqlRepoImpl.scala | 10 +- .../UpdateCustomerPrioritySqlRow.scala | 9 +- .../testdb/userdefined/Email.scala | 12 +- .../scala/src/scala/testdb/AllTypesTest.scala | 50 +- .../src/scala/testdb/CompositeKeyTest.scala | 52 +- .../scala/src/scala/testdb/DSLTest.scala | 42 +- .../scala/testdb/DatabaseFeaturesTest.scala | 56 +- .../src/scala/testdb/ForeignKeyTest.scala | 22 +- .../scala/src/scala/testdb/MockRepoTest.scala | 2 +- .../src/scala/testdb/SqlScriptTest.scala | 52 +- .../src/scala/testdb/TestInsertTest.scala | 44 +- .../scala/src/scala/testdb/TupleInTest.scala | 35 +- .../src/scala/testdb/withConnection.scala | 20 +- .../com/example/grpc/BankTransfer.java | 73 +- .../com/example/grpc/ChatMessage.java | 119 +- .../com/example/grpc/CreateOrderRequest.java | 81 +- .../com/example/grpc/CreateOrderResponse.java | 73 +- .../com/example/grpc/CreditCard.java | 77 +- .../com/example/grpc/Customer.java | 77 +- .../com/example/grpc/CustomerId.java | 8 +- .../com/example/grpc/EchoService.java | 4 +- .../com/example/grpc/EchoServiceClient.java | 101 +- .../com/example/grpc/EchoServiceServer.java | 151 +- .../com/example/grpc/GetCustomerRequest.java | 65 +- .../com/example/grpc/GetCustomerResponse.java | 81 +- .../com/example/grpc/Inner.java | 73 +- .../com/example/grpc/Inventory.java | 156 +- .../com/example/grpc/ListOrdersRequest.java | 73 +- .../com/example/grpc/Notification.java | 89 +- .../com/example/grpc/NotificationTarget.java | 7 +- .../com/example/grpc/OptionalFields.java | 105 +- .../com/example/grpc/Order.java | 123 +- .../com/example/grpc/OrderId.java | 8 +- .../com/example/grpc/OrderService.java | 2 +- .../com/example/grpc/OrderServiceClient.java | 45 +- .../com/example/grpc/OrderServiceServer.java | 91 +- .../com/example/grpc/OrderStatus.java | 101 +- .../com/example/grpc/OrderSummary.java | 73 +- .../com/example/grpc/OrderUpdate.java | 119 +- 
.../com/example/grpc/Outer.java | 89 +- .../com/example/grpc/PaymentMethod.java | 127 +- .../com/example/grpc/PaymentMethodMethod.java | 9 +- .../com/example/grpc/Priority.java | 93 +- .../com/example/grpc/ScalarTypes.java | 410 +- .../com/example/grpc/Wallet.java | 73 +- .../example/grpc/WellKnownTypesMessage.java | 229 +- .../com/example/grpc/BankTransfer.java | 73 +- .../com/example/grpc/ChatMessage.java | 119 +- .../com/example/grpc/CreateOrderRequest.java | 81 +- .../com/example/grpc/CreateOrderResponse.java | 73 +- .../com/example/grpc/CreditCard.java | 77 +- .../com/example/grpc/Customer.java | 77 +- .../com/example/grpc/CustomerId.java | 8 +- .../com/example/grpc/EchoService.java | 4 +- .../com/example/grpc/EchoServiceClient.java | 101 +- .../com/example/grpc/EchoServiceServer.java | 151 +- .../com/example/grpc/GetCustomerRequest.java | 65 +- .../com/example/grpc/GetCustomerResponse.java | 81 +- .../com/example/grpc/Inner.java | 73 +- .../com/example/grpc/Inventory.java | 156 +- .../com/example/grpc/ListOrdersRequest.java | 73 +- .../com/example/grpc/Notification.java | 89 +- .../com/example/grpc/NotificationTarget.java | 7 +- .../com/example/grpc/OptionalFields.java | 105 +- .../com/example/grpc/Order.java | 123 +- .../com/example/grpc/OrderId.java | 8 +- .../com/example/grpc/OrderService.java | 2 +- .../com/example/grpc/OrderServiceClient.java | 45 +- .../com/example/grpc/OrderServiceServer.java | 91 +- .../com/example/grpc/OrderStatus.java | 101 +- .../com/example/grpc/OrderSummary.java | 73 +- .../com/example/grpc/OrderUpdate.java | 119 +- .../com/example/grpc/Outer.java | 89 +- .../com/example/grpc/PaymentMethod.java | 127 +- .../com/example/grpc/PaymentMethodMethod.java | 9 +- .../com/example/grpc/Priority.java | 93 +- .../com/example/grpc/ScalarTypes.java | 410 +- .../com/example/grpc/Wallet.java | 73 +- .../example/grpc/WellKnownTypesMessage.java | 229 +- .../com/example/grpc/BankTransfer.java | 73 +- .../com/example/grpc/ChatMessage.java | 119 +- 
.../com/example/grpc/CreateOrderRequest.java | 81 +- .../com/example/grpc/CreateOrderResponse.java | 73 +- .../com/example/grpc/CreditCard.java | 77 +- .../com/example/grpc/Customer.java | 77 +- .../com/example/grpc/CustomerId.java | 8 +- .../com/example/grpc/EchoService.java | 4 +- .../com/example/grpc/EchoServiceClient.java | 101 +- .../com/example/grpc/EchoServiceServer.java | 151 +- .../com/example/grpc/GetCustomerRequest.java | 65 +- .../com/example/grpc/GetCustomerResponse.java | 81 +- .../com/example/grpc/Inner.java | 73 +- .../com/example/grpc/Inventory.java | 156 +- .../com/example/grpc/ListOrdersRequest.java | 73 +- .../com/example/grpc/Notification.java | 89 +- .../com/example/grpc/NotificationTarget.java | 7 +- .../com/example/grpc/OptionalFields.java | 105 +- .../com/example/grpc/Order.java | 123 +- .../com/example/grpc/OrderId.java | 8 +- .../com/example/grpc/OrderService.java | 2 +- .../com/example/grpc/OrderServiceClient.java | 45 +- .../com/example/grpc/OrderServiceServer.java | 91 +- .../com/example/grpc/OrderStatus.java | 101 +- .../com/example/grpc/OrderSummary.java | 73 +- .../com/example/grpc/OrderUpdate.java | 119 +- .../com/example/grpc/Outer.java | 89 +- .../com/example/grpc/PaymentMethod.java | 127 +- .../com/example/grpc/PaymentMethodMethod.java | 9 +- .../com/example/grpc/Priority.java | 93 +- .../com/example/grpc/ScalarTypes.java | 410 +- .../com/example/grpc/Wallet.java | 73 +- .../example/grpc/WellKnownTypesMessage.java | 229 +- testers/grpc/kotlin-quarkus/build.gradle.kts | 40 - .../com/example/grpc/BankTransfer.kt | 26 +- .../com/example/grpc/ChatMessage.kt | 44 +- .../com/example/grpc/CreateOrderRequest.kt | 28 +- .../com/example/grpc/CreateOrderResponse.kt | 26 +- .../com/example/grpc/CreditCard.kt | 30 +- .../com/example/grpc/Customer.kt | 30 +- .../com/example/grpc/CustomerId.kt | 2 +- .../com/example/grpc/EchoServiceClient.kt | 14 +- .../com/example/grpc/EchoServiceServer.kt | 32 +- .../com/example/grpc/GetCustomerRequest.kt | 22 +-
.../com/example/grpc/GetCustomerResponse.kt | 28 +- .../com/example/grpc/Inner.kt | 26 +- .../com/example/grpc/Inventory.kt | 44 +- .../com/example/grpc/ListOrdersRequest.kt | 26 +- .../com/example/grpc/Notification.kt | 34 +- .../com/example/grpc/OptionalFields.kt | 36 +- .../com/example/grpc/Order.kt | 44 +- .../com/example/grpc/OrderId.kt | 2 +- .../com/example/grpc/OrderServiceClient.kt | 6 +- .../com/example/grpc/OrderServiceServer.kt | 10 +- .../com/example/grpc/OrderSummary.kt | 26 +- .../com/example/grpc/OrderUpdate.kt | 44 +- .../com/example/grpc/Outer.kt | 32 +- .../com/example/grpc/PaymentMethod.kt | 44 +- .../com/example/grpc/ScalarTypes.kt | 44 +- .../com/example/grpc/Wallet.kt | 26 +- .../com/example/grpc/WellKnownTypesMessage.kt | 44 +- testers/grpc/kotlin-quarkus/gradle.properties | 1 - testers/grpc/kotlin/build.gradle.kts | 38 - .../com/example/grpc/BankTransfer.kt | 26 +- .../com/example/grpc/ChatMessage.kt | 44 +- .../com/example/grpc/CreateOrderRequest.kt | 28 +- .../com/example/grpc/CreateOrderResponse.kt | 26 +- .../com/example/grpc/CreditCard.kt | 30 +- .../com/example/grpc/Customer.kt | 30 +- .../com/example/grpc/CustomerId.kt | 2 +- .../com/example/grpc/EchoServiceClient.kt | 14 +- .../com/example/grpc/EchoServiceServer.kt | 14 +- .../com/example/grpc/GetCustomerRequest.kt | 22 +- .../com/example/grpc/GetCustomerResponse.kt | 28 +- .../com/example/grpc/Inner.kt | 26 +- .../com/example/grpc/Inventory.kt | 44 +- .../com/example/grpc/ListOrdersRequest.kt | 26 +- .../com/example/grpc/Notification.kt | 34 +- .../com/example/grpc/OptionalFields.kt | 36 +- .../com/example/grpc/Order.kt | 44 +- .../com/example/grpc/OrderId.kt | 2 +- .../com/example/grpc/OrderServiceClient.kt | 6 +- .../com/example/grpc/OrderServiceServer.kt | 6 +- .../com/example/grpc/OrderSummary.kt | 26 +- .../com/example/grpc/OrderUpdate.kt | 44 +- .../com/example/grpc/Outer.kt | 32 +- .../com/example/grpc/PaymentMethod.kt | 44 +- .../com/example/grpc/ScalarTypes.kt | 44 +-
.../com/example/grpc/Wallet.kt | 26 +- .../com/example/grpc/WellKnownTypesMessage.kt | 44 +- testers/grpc/kotlin/gradle.properties | 1 - .../com/example/grpc/BankTransfer.scala | 66 + .../com/example/grpc/ChatMessage.scala | 86 + .../com/example/grpc/CreateOrderRequest.scala | 68 + .../example/grpc/CreateOrderResponse.scala | 66 + .../com/example/grpc/CreditCard.scala | 71 + .../com/example/grpc/Customer.scala | 71 + .../com/example/grpc/CustomerId.scala | 18 + .../com/example/grpc/EchoService.scala | 24 + .../com/example/grpc/EchoServiceClient.scala | 67 + .../com/example/grpc/EchoServiceServer.scala | 34 + .../com/example/grpc/GetCustomerRequest.scala | 59 + .../example/grpc/GetCustomerResponse.scala | 68 + .../com/example/grpc/Inner.scala | 66 + .../com/example/grpc/Inventory.scala | 113 + .../com/example/grpc/ListOrdersRequest.scala | 66 + .../com/example/grpc/Notification.scala | 86 + .../com/example/grpc/NotificationTarget.scala | 14 + .../com/example/grpc/OptionalFields.scala | 94 + .../com/example/grpc/Order.scala | 96 + .../com/example/grpc/OrderId.scala | 18 + .../com/example/grpc/OrderService.scala | 16 + .../com/example/grpc/OrderServiceClient.scala | 44 + .../com/example/grpc/OrderServiceServer.scala | 29 + .../com/example/grpc/OrderStatus.scala | 36 + .../com/example/grpc/OrderSummary.scala | 66 + .../com/example/grpc/OrderUpdate.scala | 86 + .../com/example/grpc/Outer.scala | 75 + .../com/example/grpc/PaymentMethod.scala | 102 + .../example/grpc/PaymentMethodMethod.scala | 14 + .../com/example/grpc/Priority.scala | 34 + .../com/example/grpc/ScalarTypes.scala | 148 + .../com/example/grpc/Wallet.scala | 66 + .../example/grpc/WellKnownTypesMessage.scala | 153 + .../grpc/CatsGrpcIntegrationTest.scala | 306 + .../com/example/grpc/CustomerId.scala | 2 +- .../com/example/grpc/OrderId.scala | 2 +- testers/jsonschema/test-schema.json | 66 + .../testdb/AllBrandsCategoriesCSet.java | 57 +- .../testdb/AllBrandsCategoriesCSetMember.java | 62 +- 
.../testdb/BestsellerClearanceFSet.java | 57 +- .../testdb/BestsellerClearanceFSetMember.java | 62 +- .../testdb/DefaultedDeserializer.java | 36 +- .../testdb/DefaultedSerializer.java | 14 +- .../testdb/EmailMailPushSmsSet.java | 57 +- .../testdb/EmailMailPushSmsSetMember.java | 60 +- .../testdb/TestInsert.java | 646 +- .../testdb/XYZSet.java | 57 +- .../testdb/XYZSetMember.java | 58 +- .../testdb/audit_log/AuditLogFields.java | 135 +- .../testdb/audit_log/AuditLogId.java | 16 +- .../testdb/audit_log/AuditLogRepo.java | 65 +- .../testdb/audit_log/AuditLogRepoImpl.java | 403 +- .../testdb/audit_log/AuditLogRepoMock.java | 147 +- .../testdb/audit_log/AuditLogRow.java | 225 +- .../testdb/audit_log/AuditLogRowUnsaved.java | 179 +- .../testdb/brands/BrandsFields.java | 104 +- .../testdb/brands/BrandsId.java | 15 +- .../testdb/brands/BrandsRepo.java | 74 +- .../testdb/brands/BrandsRepoImpl.java | 347 +- .../testdb/brands/BrandsRepoMock.java | 152 +- .../testdb/brands/BrandsRow.java | 132 +- .../testdb/brands/BrandsRowUnsaved.java | 97 +- .../testdb/bridge/Customer.java | 27 + .../testdb/categories/CategoriesFields.java | 133 +- .../testdb/categories/CategoriesId.java | 16 +- .../testdb/categories/CategoriesRepo.java | 74 +- .../testdb/categories/CategoriesRepoImpl.java | 402 +- .../testdb/categories/CategoriesRepoMock.java | 157 +- .../testdb/categories/CategoriesRow.java | 202 +- .../categories/CategoriesRowUnsaved.java | 156 +- .../testdb/cte_test/CteTestSqlRepo.java | 8 +- .../testdb/cte_test/CteTestSqlRepoImpl.java | 48 +- .../testdb/cte_test/CteTestSqlRow.java | 56 +- .../CustomerAddressesFields.java | 200 +- .../CustomerAddressesId.java | 16 +- .../CustomerAddressesRepo.java | 64 +- .../CustomerAddressesRepoImpl.java | 479 +- .../CustomerAddressesRepoMock.java | 153 +- .../CustomerAddressesRow.java | 466 +- .../CustomerAddressesRowUnsaved.java | 395 +- .../CustomerOrdersSqlRepo.java | 11 +- .../CustomerOrdersSqlRepoImpl.java | 39 +- 
.../customer_orders/CustomerOrdersSqlRow.java | 228 +- .../customer_status/CustomerStatusFields.java | 61 +- .../customer_status/CustomerStatusId.java | 16 +- .../customer_status/CustomerStatusRepo.java | 64 +- .../CustomerStatusRepoImpl.java | 240 +- .../CustomerStatusRepoMock.java | 152 +- .../customer_status/CustomerStatusRow.java | 53 +- .../CustomerStatusRowUnsaved.java | 44 +- .../testdb/customers/CustomersFields.java | 193 +- .../testdb/customers/CustomersId.java | 16 +- .../testdb/customers/CustomersRepo.java | 74 +- .../testdb/customers/CustomersRepoImpl.java | 522 +- .../testdb/customers/CustomersRepoMock.java | 154 +- .../testdb/customers/CustomersRow.java | 482 +- .../testdb/customers/CustomersRowUnsaved.java | 401 +- .../testdb/customtypes/Defaulted.java | 46 +- .../testdb/inventory/InventoryFields.java | 163 +- .../testdb/inventory/InventoryId.java | 16 +- .../testdb/inventory/InventoryRepo.java | 74 +- .../testdb/inventory/InventoryRepoImpl.java | 468 +- .../testdb/inventory/InventoryRepoMock.java | 159 +- .../testdb/inventory/InventoryRow.java | 363 +- .../testdb/inventory/InventoryRowUnsaved.java | 294 +- .../InventoryCheckSqlRepo.java | 15 +- .../InventoryCheckSqlRepoImpl.java | 51 +- .../inventory_check/InventoryCheckSqlRow.java | 296 +- .../testdb/mariatest/MariatestFields.java | 489 +- .../testdb/mariatest/MariatestId.java | 16 +- .../testdb/mariatest/MariatestRepo.java | 65 +- .../testdb/mariatest/MariatestRepoImpl.java | 826 +-- .../testdb/mariatest/MariatestRepoMock.java | 149 +- .../testdb/mariatest/MariatestRow.java | 2459 +------ .../testdb/mariatest/MariatestRowUnsaved.java | 2239 +------ .../MariatestIdentityFields.java | 54 +- .../MariatestIdentityId.java | 16 +- .../MariatestIdentityRepo.java | 64 +- .../MariatestIdentityRepoImpl.java | 194 +- .../MariatestIdentityRepoMock.java | 150 +- .../MariatestIdentityRow.java | 43 +- .../MariatestIdentityRowUnsaved.java | 21 +- .../MariatestSpatialFields.java | 138 +- 
.../mariatest_spatial/MariatestSpatialId.java | 16 +- .../MariatestSpatialRepo.java | 64 +- .../MariatestSpatialRepoImpl.java | 330 +- .../MariatestSpatialRepoMock.java | 150 +- .../MariatestSpatialRow.java | 235 +- .../MariatestSpatialRowUnsaved.java | 156 +- .../MariatestSpatialNullFields.java | 138 +- .../MariatestSpatialNullId.java | 16 +- .../MariatestSpatialNullRepo.java | 63 +- .../MariatestSpatialNullRepoImpl.java | 450 +- .../MariatestSpatialNullRepoMock.java | 153 +- .../MariatestSpatialNullRow.java | 292 +- .../MariatestSpatialNullRowUnsaved.java | 246 +- .../MariatestUniqueFields.java | 70 +- .../mariatest_unique/MariatestUniqueId.java | 16 +- .../mariatest_unique/MariatestUniqueRepo.java | 78 +- .../MariatestUniqueRepoImpl.java | 286 +- .../MariatestUniqueRepoMock.java | 163 +- .../mariatest_unique/MariatestUniqueRow.java | 61 +- .../MariatestUniqueRowUnsaved.java | 27 +- .../mariatestnull/MariatestnullFields.java | 491 +- .../mariatestnull/MariatestnullRepo.java | 25 +- .../mariatestnull/MariatestnullRepoImpl.java | 1001 ++- .../mariatestnull/MariatestnullRow.java | 2693 ++------ .../MariatestnullRowUnsaved.java | 2402 ++----- .../order_details/OrderDetailsSqlRepo.java | 11 +- .../OrderDetailsSqlRepoImpl.java | 42 +- .../order_details/OrderDetailsSqlRow.java | 530 +- .../order_history/OrderHistoryFields.java | 123 +- .../testdb/order_history/OrderHistoryId.java | 16 +- .../order_history/OrderHistoryRepo.java | 64 +- .../order_history/OrderHistoryRepoImpl.java | 366 +- .../order_history/OrderHistoryRepoMock.java | 151 +- .../testdb/order_history/OrderHistoryRow.java | 268 +- .../order_history/OrderHistoryRowUnsaved.java | 144 +- .../testdb/order_items/OrderItemsFields.java | 191 +- .../testdb/order_items/OrderItemsId.java | 16 +- .../testdb/order_items/OrderItemsRepo.java | 65 +- .../order_items/OrderItemsRepoImpl.java | 434 +- .../order_items/OrderItemsRepoMock.java | 152 +- .../testdb/order_items/OrderItemsRow.java | 423 +- 
.../order_items/OrderItemsRowUnsaved.java | 352 +- .../testdb/orders/OrdersFields.java | 298 +- .../testdb/orders/OrdersId.java | 15 +- .../testdb/orders/OrdersRepo.java | 74 +- .../testdb/orders/OrdersRepoImpl.java | 763 +-- .../testdb/orders/OrdersRepoMock.java | 155 +- .../testdb/orders/OrdersRow.java | 1028 +-- .../testdb/orders/OrdersRowUnsaved.java | 822 +-- .../payment_methods/PaymentMethodsFields.java | 109 +- .../payment_methods/PaymentMethodsId.java | 16 +- .../payment_methods/PaymentMethodsRepo.java | 73 +- .../PaymentMethodsRepoImpl.java | 339 +- .../PaymentMethodsRepoMock.java | 157 +- .../payment_methods/PaymentMethodsRow.java | 154 +- .../PaymentMethodsRowUnsaved.java | 108 +- .../testdb/payments/PaymentsFields.java | 174 +- .../testdb/payments/PaymentsId.java | 16 +- .../testdb/payments/PaymentsRepo.java | 65 +- .../testdb/payments/PaymentsRepoImpl.java | 464 +- .../testdb/payments/PaymentsRepoMock.java | 149 +- .../testdb/payments/PaymentsRow.java | 400 +- .../testdb/payments/PaymentsRowUnsaved.java | 328 +- .../testdb/precisetypes/Binary16.java | 39 +- .../testdb/precisetypes/Binary32.java | 39 +- .../testdb/precisetypes/Binary64.java | 39 +- .../testdb/precisetypes/Decimal10_2.java | 64 +- .../testdb/precisetypes/Decimal12_4.java | 64 +- .../testdb/precisetypes/Decimal18_4.java | 64 +- .../testdb/precisetypes/Decimal5_2.java | 64 +- .../testdb/precisetypes/Decimal8_2.java | 64 +- .../testdb/precisetypes/LocalDateTime3.java | 39 +- .../testdb/precisetypes/LocalDateTime6.java | 39 +- .../testdb/precisetypes/LocalTime3.java | 39 +- .../testdb/precisetypes/LocalTime6.java | 39 +- .../testdb/precisetypes/PaddedString10.java | 44 +- .../testdb/precisetypes/String10.java | 48 +- .../testdb/precisetypes/String100.java | 48 +- .../testdb/precisetypes/String20.java | 48 +- .../testdb/precisetypes/String255.java | 48 +- .../testdb/precisetypes/String50.java | 48 +- .../precision_types/PrecisionTypesFields.java | 300 +- 
.../precision_types/PrecisionTypesId.java | 16 +- .../precision_types/PrecisionTypesRepo.java | 65 +- .../PrecisionTypesRepoImpl.java | 581 +- .../PrecisionTypesRepoMock.java | 149 +- .../precision_types/PrecisionTypesRow.java | 957 +-- .../PrecisionTypesRowUnsaved.java | 860 +-- .../PrecisionTypesNullFields.java | 303 +- .../PrecisionTypesNullId.java | 16 +- .../PrecisionTypesNullRepo.java | 64 +- .../PrecisionTypesNullRepoImpl.java | 883 +-- .../PrecisionTypesNullRepoMock.java | 153 +- .../PrecisionTypesNullRow.java | 1081 +-- .../PrecisionTypesNullRowUnsaved.java | 939 +-- .../testdb/price_tiers/PriceTiersFields.java | 80 +- .../testdb/price_tiers/PriceTiersId.java | 16 +- .../testdb/price_tiers/PriceTiersRepo.java | 65 +- .../price_tiers/PriceTiersRepoImpl.java | 263 +- .../price_tiers/PriceTiersRepoMock.java | 152 +- .../testdb/price_tiers/PriceTiersRow.java | 78 +- .../price_tiers/PriceTiersRowUnsaved.java | 62 +- .../ProductCategoriesFields.java | 90 +- .../ProductCategoriesId.java | 26 +- .../ProductCategoriesRepo.java | 64 +- .../ProductCategoriesRepoImpl.java | 289 +- .../ProductCategoriesRepoMock.java | 153 +- .../ProductCategoriesRow.java | 91 +- .../ProductCategoriesRowUnsaved.java | 82 +- .../product_images/ProductImagesFields.java | 131 +- .../product_images/ProductImagesId.java | 16 +- .../product_images/ProductImagesRepo.java | 64 +- .../product_images/ProductImagesRepoImpl.java | 370 +- .../product_images/ProductImagesRepoMock.java | 151 +- .../product_images/ProductImagesRow.java | 184 +- .../ProductImagesRowUnsaved.java | 144 +- .../product_prices/ProductPricesFields.java | 117 +- .../product_prices/ProductPricesId.java | 16 +- .../product_prices/ProductPricesRepo.java | 64 +- .../product_prices/ProductPricesRepoImpl.java | 326 +- .../product_prices/ProductPricesRepoMock.java | 151 +- .../product_prices/ProductPricesRow.java | 155 +- .../ProductPricesRowUnsaved.java | 96 +- .../product_search/ProductSearchSqlRepo.java | 19 +- 
.../ProductSearchSqlRepoImpl.java | 56 +- .../product_search/ProductSearchSqlRow.java | 98 +- .../testdb/products/ProductsFields.java | 237 +- .../testdb/products/ProductsId.java | 16 +- .../testdb/products/ProductsRepo.java | 74 +- .../testdb/products/ProductsRepoImpl.java | 653 +- .../testdb/products/ProductsRepoMock.java | 155 +- .../testdb/products/ProductsRow.java | 766 +-- .../testdb/products/ProductsRowUnsaved.java | 588 +- .../testdb/promotions/PromotionsFields.java | 213 +- .../testdb/promotions/PromotionsId.java | 16 +- .../testdb/promotions/PromotionsRepo.java | 74 +- .../testdb/promotions/PromotionsRepoImpl.java | 560 +- .../testdb/promotions/PromotionsRepoMock.java | 157 +- .../testdb/promotions/PromotionsRow.java | 630 +- .../promotions/PromotionsRowUnsaved.java | 488 +- .../testdb/reviews/ReviewsFields.java | 243 +- .../testdb/reviews/ReviewsId.java | 16 +- .../testdb/reviews/ReviewsRepo.java | 65 +- .../testdb/reviews/ReviewsRepoImpl.java | 641 +- .../testdb/reviews/ReviewsRepoMock.java | 149 +- .../testdb/reviews/ReviewsRow.java | 777 +-- .../testdb/reviews/ReviewsRowUnsaved.java | 606 +- .../testdb/shipments/ShipmentsFields.java | 232 +- .../testdb/shipments/ShipmentsId.java | 16 +- .../testdb/shipments/ShipmentsRepo.java | 65 +- .../testdb/shipments/ShipmentsRepoImpl.java | 606 +- .../testdb/shipments/ShipmentsRepoMock.java | 148 +- .../testdb/shipments/ShipmentsRow.java | 709 +- .../testdb/shipments/ShipmentsRowUnsaved.java | 561 +- .../ShippingCarriersFields.java | 101 +- .../shipping_carriers/ShippingCarriersId.java | 16 +- .../ShippingCarriersRepo.java | 73 +- .../ShippingCarriersRepoImpl.java | 329 +- .../ShippingCarriersRepoMock.java | 159 +- .../ShippingCarriersRow.java | 115 +- .../ShippingCarriersRowUnsaved.java | 83 +- .../SimpleCustomerLookupSqlRepo.java | 11 +- .../SimpleCustomerLookupSqlRepoImpl.java | 31 +- .../SimpleCustomerLookupSqlRow.java | 104 +- .../subquery_test/SubqueryTestSqlRepo.java | 8 +- .../SubqueryTestSqlRepoImpl.java | 38 +-
.../subquery_test/SubqueryTestSqlRow.java | 86 +- .../UpdateOrderStatusSqlRepo.java | 12 +- .../UpdateOrderStatusSqlRepoImpl.java | 32 +- .../testdb/userdefined/Email.java | 18 +- .../testdb/userdefined/FirstName.java | 20 +- .../testdb/userdefined/IsActive.java | 20 +- .../testdb/userdefined/IsApproved.java | 20 +- .../testdb/userdefined/IsDefault.java | 20 +- .../testdb/userdefined/IsPrimary.java | 20 +- .../userdefined/IsVerifiedPurchase.java | 20 +- .../testdb/userdefined/LastName.java | 20 +- .../VCustomerSummaryViewFields.java | 149 +- .../VCustomerSummaryViewRepo.java | 10 +- .../VCustomerSummaryViewRepoImpl.java | 29 +- .../VCustomerSummaryViewRow.java | 302 +- .../v_daily_sales/VDailySalesViewFields.java | 146 +- .../v_daily_sales/VDailySalesViewRepo.java | 10 +- .../VDailySalesViewRepoImpl.java | 30 +- .../v_daily_sales/VDailySalesViewRow.java | 292 +- .../VInventoryStatusViewFields.java | 193 +- .../VInventoryStatusViewRepo.java | 10 +- .../VInventoryStatusViewRepoImpl.java | 31 +- .../VInventoryStatusViewRow.java | 472 +- .../VOrderDetailsViewFields.java | 193 +- .../VOrderDetailsViewRepo.java | 10 +- .../VOrderDetailsViewRepoImpl.java | 31 +- .../v_order_details/VOrderDetailsViewRow.java | 470 +- .../VProductCatalogViewFields.java | 160 +- .../VProductCatalogViewRepo.java | 10 +- .../VProductCatalogViewRepoImpl.java | 29 +- .../VProductCatalogViewRow.java | 338 +- .../VWarehouseCoverageViewFields.java | 149 +- .../VWarehouseCoverageViewRepo.java | 10 +- .../VWarehouseCoverageViewRepoImpl.java | 30 +- .../VWarehouseCoverageViewRow.java | 298 +- .../testdb/warehouses/WarehousesFields.java | 147 +- .../testdb/warehouses/WarehousesId.java | 16 +- .../testdb/warehouses/WarehousesRepo.java | 74 +- .../testdb/warehouses/WarehousesRepoImpl.java | 402 +- .../testdb/warehouses/WarehousesRepoMock.java | 157 +- .../testdb/warehouses/WarehousesRow.java | 294 +- .../warehouses/WarehousesRowUnsaved.java | 169 +- .../java/src/java/testdb/CompositeIdTest.java | 16 +-
.../mariadb/java/src/java/testdb/DSLTest.java | 8 +- .../java/src/java/testdb/ForeignKeyTest.java | 2 +- .../src/java/testdb/MariaDbTestHelper.java | 43 +- .../java/src/java/testdb/MockRepoTest.java | 2 +- .../java/src/java/testdb/SqlScriptTest.java | 2 +- .../java/src/java/testdb/TestInsertTest.java | 2 +- .../java/src/java/testdb/TupleInTest.java | 6 +- testers/mariadb/kotlin/build.gradle.kts | 32 - .../testdb/AllBrandsCategoriesCSet.kt | 25 +- .../testdb/AllBrandsCategoriesCSetMember.kt | 2 + .../testdb/BestsellerClearanceFSet.kt | 25 +- .../testdb/BestsellerClearanceFSetMember.kt | 2 + .../testdb/DefaultedDeserializer.kt | 2 +- .../testdb/DefaultedSerializer.kt | 4 +- .../testdb/EmailMailPushSmsSet.kt | 25 +- .../testdb/EmailMailPushSmsSetMember.kt | 2 + .../testdb/TestInsert.kt | 220 +- .../generated-and-checked-in/testdb/XYZSet.kt | 25 +- .../testdb/XYZSetMember.kt | 2 + .../testdb/audit_log/AuditLogFields.kt | 54 +- .../testdb/audit_log/AuditLogId.kt | 8 +- .../testdb/audit_log/AuditLogRepo.kt | 27 +- .../testdb/audit_log/AuditLogRepoImpl.kt | 102 +- .../testdb/audit_log/AuditLogRepoMock.kt | 39 +- .../testdb/audit_log/AuditLogRow.kt | 29 +- .../testdb/audit_log/AuditLogRowUnsaved.kt | 10 +- .../testdb/brands/BrandsFields.kt | 48 +- .../testdb/brands/BrandsId.kt | 8 +- .../testdb/brands/BrandsRepo.kt | 31 +- .../testdb/brands/BrandsRepoImpl.kt | 96 +- .../testdb/brands/BrandsRepoMock.kt | 43 +- .../testdb/brands/BrandsRow.kt | 31 +- .../testdb/brands/BrandsRowUnsaved.kt | 12 +- .../testdb/bridge/Customer.kt | 15 + .../testdb/categories/CategoriesFields.kt | 59 +- .../testdb/categories/CategoriesId.kt | 8 +- .../testdb/categories/CategoriesRepo.kt | 31 +- .../testdb/categories/CategoriesRepoImpl.kt | 105 +- .../testdb/categories/CategoriesRepoMock.kt | 43 +- .../testdb/categories/CategoriesRow.kt | 44 +- .../testdb/categories/CategoriesRowUnsaved.kt | 20 +- .../testdb/cte_test/CteTestSqlRepo.kt | 4 +- .../testdb/cte_test/CteTestSqlRepoImpl.kt | 6 +-
.../testdb/cte_test/CteTestSqlRow.kt | 28 +- .../CustomerAddressesFields.kt | 74 +- .../customer_addresses/CustomerAddressesId.kt | 8 +- .../CustomerAddressesRepo.kt | 27 +- .../CustomerAddressesRepoImpl.kt | 118 +- .../CustomerAddressesRepoMock.kt | 39 +- .../CustomerAddressesRow.kt | 53 +- .../CustomerAddressesRowUnsaved.kt | 24 +- .../customer_orders/CustomerOrdersSqlRepo.kt | 6 +- .../CustomerOrdersSqlRepoImpl.kt | 13 +- .../customer_orders/CustomerOrdersSqlRow.kt | 24 +- .../customer_status/CustomerStatusFields.kt | 32 +- .../customer_status/CustomerStatusId.kt | 14 +- .../customer_status/CustomerStatusRepo.kt | 27 +- .../customer_status/CustomerStatusRepoImpl.kt | 77 +- .../customer_status/CustomerStatusRepoMock.kt | 39 +- .../customer_status/CustomerStatusRow.kt | 14 +- .../CustomerStatusRowUnsaved.kt | 2 +- .../testdb/customers/CustomersFields.kt | 62 +- .../testdb/customers/CustomersId.kt | 8 +- .../testdb/customers/CustomersRepo.kt | 29 +- .../testdb/customers/CustomersRepoImpl.kt | 122 +- .../testdb/customers/CustomersRepoMock.kt | 41 +- .../testdb/customers/CustomersRow.kt | 29 +- .../testdb/customers/CustomersRowUnsaved.kt | 12 +- .../testdb/inventory/InventoryFields.kt | 53 +- .../testdb/inventory/InventoryId.kt | 8 +- .../testdb/inventory/InventoryRepo.kt | 29 +- .../testdb/inventory/InventoryRepoImpl.kt | 111 +- .../testdb/inventory/InventoryRepoMock.kt | 41 +- .../testdb/inventory/InventoryRow.kt | 18 +- .../testdb/inventory/InventoryRowUnsaved.kt | 4 +- .../inventory_check/InventoryCheckSqlRepo.kt | 6 +- .../InventoryCheckSqlRepoImpl.kt | 14 +- .../inventory_check/InventoryCheckSqlRow.kt | 34 +- .../testdb/mariatest/MariatestFields.kt | 131 +- .../testdb/mariatest/MariatestId.kt | 8 +- .../testdb/mariatest/MariatestRepo.kt | 27 +- .../testdb/mariatest/MariatestRepoImpl.kt | 230 +- .../testdb/mariatest/MariatestRepoMock.kt | 39 +- .../testdb/mariatest/MariatestRow.kt | 58 +- .../testdb/mariatest/MariatestRowUnsaved.kt | 24 +- 
.../MariatestIdentityFields.kt | 30 +- .../mariatest_identity/MariatestIdentityId.kt | 8 +- .../MariatestIdentityRepo.kt | 27 +- .../MariatestIdentityRepoImpl.kt | 69 +- .../MariatestIdentityRepoMock.kt | 39 +- .../MariatestIdentityRow.kt | 14 +- .../MariatestIdentityRowUnsaved.kt | 2 +- .../MariatestSpatialFields.kt | 40 +- .../mariatest_spatial/MariatestSpatialId.kt | 8 +- .../mariatest_spatial/MariatestSpatialRepo.kt | 27 +- .../MariatestSpatialRepoImpl.kt | 97 +- .../MariatestSpatialRepoMock.kt | 39 +- .../mariatest_spatial/MariatestSpatialRow.kt | 8 +- .../MariatestSpatialNullFields.kt | 40 +- .../MariatestSpatialNullId.kt | 8 +- .../MariatestSpatialNullRepo.kt | 27 +- .../MariatestSpatialNullRepoImpl.kt | 98 +- .../MariatestSpatialNullRepoMock.kt | 39 +- .../MariatestSpatialNullRow.kt | 9 +- .../mariatest_unique/MariatestUniqueFields.kt | 36 +- .../mariatest_unique/MariatestUniqueId.kt | 8 +- .../mariatest_unique/MariatestUniqueRepo.kt | 35 +- .../MariatestUniqueRepoImpl.kt | 89 +- .../MariatestUniqueRepoMock.kt | 47 +- .../mariatest_unique/MariatestUniqueRow.kt | 18 +- .../MariatestUniqueRowUnsaved.kt | 4 +- .../mariatestnull/MariatestnullFields.kt | 129 +- .../testdb/mariatestnull/MariatestnullRepo.kt | 11 +- .../mariatestnull/MariatestnullRepoImpl.kt | 195 +- .../testdb/mariatestnull/MariatestnullRow.kt | 83 +- .../mariatestnull/MariatestnullRowUnsaved.kt | 48 +- .../order_details/OrderDetailsSqlRepo.kt | 4 +- .../order_details/OrderDetailsSqlRepoImpl.kt | 8 +- .../order_details/OrderDetailsSqlRow.kt | 31 +- .../order_history/OrderHistoryFields.kt | 52 +- .../testdb/order_history/OrderHistoryId.kt | 8 +- .../testdb/order_history/OrderHistoryRepo.kt | 27 +- .../order_history/OrderHistoryRepoImpl.kt | 94 +- .../order_history/OrderHistoryRepoMock.kt | 39 +- .../testdb/order_history/OrderHistoryRow.kt | 33 +- .../order_history/OrderHistoryRowUnsaved.kt | 14 +- .../testdb/order_items/OrderItemsFields.kt | 63 +- .../testdb/order_items/OrderItemsId.kt | 8 +- 
.../testdb/order_items/OrderItemsRepo.kt | 27 +- .../testdb/order_items/OrderItemsRepoImpl.kt | 115 +- .../testdb/order_items/OrderItemsRepoMock.kt | 39 +- .../testdb/order_items/OrderItemsRow.kt | 32 +- .../order_items/OrderItemsRowUnsaved.kt | 12 +- .../testdb/orders/OrdersFields.kt | 87 +- .../testdb/orders/OrdersId.kt | 8 +- .../testdb/orders/OrdersRepo.kt | 31 +- .../testdb/orders/OrdersRepoImpl.kt | 157 +- .../testdb/orders/OrdersRepoMock.kt | 43 +- .../testdb/orders/OrdersRow.kt | 52 +- .../testdb/orders/OrdersRowUnsaved.kt | 26 +- .../payment_methods/PaymentMethodsFields.kt | 49 +- .../payment_methods/PaymentMethodsId.kt | 8 +- .../payment_methods/PaymentMethodsRepo.kt | 31 +- .../payment_methods/PaymentMethodsRepoImpl.kt | 97 +- .../payment_methods/PaymentMethodsRepoMock.kt | 43 +- .../payment_methods/PaymentMethodsRow.kt | 30 +- .../PaymentMethodsRowUnsaved.kt | 10 +- .../testdb/payments/PaymentsFields.kt | 61 +- .../testdb/payments/PaymentsId.kt | 8 +- .../testdb/payments/PaymentsRepo.kt | 27 +- .../testdb/payments/PaymentsRepoImpl.kt | 111 +- .../testdb/payments/PaymentsRepoMock.kt | 39 +- .../testdb/payments/PaymentsRow.kt | 36 +- .../testdb/payments/PaymentsRowUnsaved.kt | 16 +- .../testdb/precisetypes/Binary16.kt | 22 +- .../testdb/precisetypes/Binary32.kt | 22 +- .../testdb/precisetypes/Binary64.kt | 22 +- .../testdb/precisetypes/Decimal10_2.kt | 40 +- .../testdb/precisetypes/Decimal12_4.kt | 40 +- .../testdb/precisetypes/Decimal18_4.kt | 40 +- .../testdb/precisetypes/Decimal5_2.kt | 40 +- .../testdb/precisetypes/Decimal8_2.kt | 40 +- .../testdb/precisetypes/LocalDateTime3.kt | 20 +- .../testdb/precisetypes/LocalDateTime6.kt | 20 +- .../testdb/precisetypes/LocalTime3.kt | 20 +- .../testdb/precisetypes/LocalTime6.kt | 20 +- .../testdb/precisetypes/PaddedString10.kt | 34 +- .../testdb/precisetypes/String10.kt | 36 +- .../testdb/precisetypes/String100.kt | 36 +- .../testdb/precisetypes/String20.kt | 36 +- .../testdb/precisetypes/String255.kt | 36 +- 
.../testdb/precisetypes/String50.kt | 36 +- .../precision_types/PrecisionTypesFields.kt | 70 +- .../precision_types/PrecisionTypesId.kt | 8 +- .../precision_types/PrecisionTypesRepo.kt | 27 +- .../precision_types/PrecisionTypesRepoImpl.kt | 157 +- .../precision_types/PrecisionTypesRepoMock.kt | 39 +- .../precision_types/PrecisionTypesRow.kt | 7 +- .../PrecisionTypesNullFields.kt | 70 +- .../PrecisionTypesNullId.kt | 8 +- .../PrecisionTypesNullRepo.kt | 27 +- .../PrecisionTypesNullRepoImpl.kt | 158 +- .../PrecisionTypesNullRepoMock.kt | 39 +- .../PrecisionTypesNullRow.kt | 8 +- .../testdb/price_tiers/PriceTiersFields.kt | 39 +- .../testdb/price_tiers/PriceTiersId.kt | 8 +- .../testdb/price_tiers/PriceTiersRepo.kt | 27 +- .../testdb/price_tiers/PriceTiersRepoImpl.kt | 82 +- .../testdb/price_tiers/PriceTiersRepoMock.kt | 39 +- .../testdb/price_tiers/PriceTiersRow.kt | 19 +- .../price_tiers/PriceTiersRowUnsaved.kt | 4 +- .../ProductCategoriesFields.kt | 42 +- .../product_categories/ProductCategoriesId.kt | 6 +- .../ProductCategoriesRepo.kt | 27 +- .../ProductCategoriesRepoImpl.kt | 85 +- .../ProductCategoriesRepoMock.kt | 39 +- .../ProductCategoriesRow.kt | 18 +- .../ProductCategoriesRowUnsaved.kt | 4 +- .../product_images/ProductImagesFields.kt | 50 +- .../testdb/product_images/ProductImagesId.kt | 8 +- .../product_images/ProductImagesRepo.kt | 27 +- .../product_images/ProductImagesRepoImpl.kt | 94 +- .../product_images/ProductImagesRepoMock.kt | 39 +- .../testdb/product_images/ProductImagesRow.kt | 27 +- .../product_images/ProductImagesRowUnsaved.kt | 10 +- .../product_prices/ProductPricesFields.kt | 45 +- .../testdb/product_prices/ProductPricesId.kt | 8 +- .../product_prices/ProductPricesRepo.kt | 27 +- .../product_prices/ProductPricesRepoImpl.kt | 91 +- .../product_prices/ProductPricesRepoMock.kt | 39 +- .../testdb/product_prices/ProductPricesRow.kt | 18 +- .../product_prices/ProductPricesRowUnsaved.kt | 4 +- .../product_search/ProductSearchSqlRepo.kt | 8 +- 
.../ProductSearchSqlRepoImpl.kt | 16 +- .../product_search/ProductSearchSqlRow.kt | 32 +- .../testdb/products/ProductsFields.kt | 77 +- .../testdb/products/ProductsId.kt | 8 +- .../testdb/products/ProductsRepo.kt | 31 +- .../testdb/products/ProductsRepoImpl.kt | 141 +- .../testdb/products/ProductsRepoMock.kt | 43 +- .../testdb/products/ProductsRow.kt | 44 +- .../testdb/products/ProductsRowUnsaved.kt | 20 +- .../testdb/promotions/PromotionsFields.kt | 67 +- .../testdb/promotions/PromotionsId.kt | 8 +- .../testdb/promotions/PromotionsRepo.kt | 31 +- .../testdb/promotions/PromotionsRepoImpl.kt | 133 +- .../testdb/promotions/PromotionsRepoMock.kt | 43 +- .../testdb/promotions/PromotionsRow.kt | 30 +- .../testdb/promotions/PromotionsRowUnsaved.kt | 10 +- .../testdb/reviews/ReviewsFields.kt | 70 +- .../testdb/reviews/ReviewsId.kt | 8 +- .../testdb/reviews/ReviewsRepo.kt | 27 +- .../testdb/reviews/ReviewsRepoImpl.kt | 134 +- .../testdb/reviews/ReviewsRepoMock.kt | 39 +- .../testdb/reviews/ReviewsRow.kt | 29 +- .../testdb/reviews/ReviewsRowUnsaved.kt | 12 +- .../testdb/shipments/ShipmentsFields.kt | 69 +- .../testdb/shipments/ShipmentsId.kt | 8 +- .../testdb/shipments/ShipmentsRepo.kt | 27 +- .../testdb/shipments/ShipmentsRepoImpl.kt | 131 +- .../testdb/shipments/ShipmentsRepoMock.kt | 39 +- .../testdb/shipments/ShipmentsRow.kt | 28 +- .../testdb/shipments/ShipmentsRowUnsaved.kt | 10 +- .../ShippingCarriersFields.kt | 44 +- .../shipping_carriers/ShippingCarriersId.kt | 8 +- .../shipping_carriers/ShippingCarriersRepo.kt | 31 +- .../ShippingCarriersRepoImpl.kt | 92 +- .../ShippingCarriersRepoMock.kt | 43 +- .../shipping_carriers/ShippingCarriersRow.kt | 25 +- .../ShippingCarriersRowUnsaved.kt | 8 +- .../SimpleCustomerLookupSqlRepo.kt | 6 +- .../SimpleCustomerLookupSqlRepoImpl.kt | 12 +- .../SimpleCustomerLookupSqlRow.kt | 14 +- .../subquery_test/SubqueryTestSqlRepo.kt | 4 +- .../subquery_test/SubqueryTestSqlRepoImpl.kt | 6 +- .../subquery_test/SubqueryTestSqlRow.kt | 20 +- 
.../UpdateOrderStatusSqlRepo.kt | 4 +- .../UpdateOrderStatusSqlRepoImpl.kt | 10 +- .../testdb/userdefined/Email.kt | 14 +- .../testdb/userdefined/FirstName.kt | 14 +- .../testdb/userdefined/IsActive.kt | 12 +- .../testdb/userdefined/IsApproved.kt | 12 +- .../testdb/userdefined/IsDefault.kt | 12 +- .../testdb/userdefined/IsPrimary.kt | 12 +- .../testdb/userdefined/IsVerifiedPurchase.kt | 12 +- .../testdb/userdefined/LastName.kt | 14 +- .../VCustomerSummaryViewFields.kt | 51 +- .../VCustomerSummaryViewRepo.kt | 6 +- .../VCustomerSummaryViewRepoImpl.kt | 12 +- .../VCustomerSummaryViewRow.kt | 24 +- .../v_daily_sales/VDailySalesViewFields.kt | 49 +- .../v_daily_sales/VDailySalesViewRepo.kt | 6 +- .../v_daily_sales/VDailySalesViewRepoImpl.kt | 12 +- .../v_daily_sales/VDailySalesViewRow.kt | 20 +- .../VInventoryStatusViewFields.kt | 67 +- .../VInventoryStatusViewRepo.kt | 6 +- .../VInventoryStatusViewRepoImpl.kt | 12 +- .../VInventoryStatusViewRow.kt | 40 +- .../VOrderDetailsViewFields.kt | 71 +- .../v_order_details/VOrderDetailsViewRepo.kt | 6 +- .../VOrderDetailsViewRepoImpl.kt | 12 +- .../v_order_details/VOrderDetailsViewRow.kt | 48 +- .../VProductCatalogViewFields.kt | 59 +- .../VProductCatalogViewRepo.kt | 6 +- .../VProductCatalogViewRepoImpl.kt | 12 +- .../VProductCatalogViewRow.kt | 36 +- .../VWarehouseCoverageViewFields.kt | 59 +- .../VWarehouseCoverageViewRepo.kt | 6 +- .../VWarehouseCoverageViewRepoImpl.kt | 12 +- .../VWarehouseCoverageViewRow.kt | 40 +- .../testdb/warehouses/WarehousesFields.kt | 56 +- .../testdb/warehouses/WarehousesId.kt | 8 +- .../testdb/warehouses/WarehousesRepo.kt | 31 +- .../testdb/warehouses/WarehousesRepoImpl.kt | 108 +- .../testdb/warehouses/WarehousesRepoMock.kt | 43 +- .../testdb/warehouses/WarehousesRow.kt | 35 +- .../testdb/warehouses/WarehousesRowUnsaved.kt | 14 +- testers/mariadb/kotlin/gradle.properties | 1 - .../src/kotlin/testdb/CompositeIdTest.kt | 6 +- .../src/kotlin/testdb/ForeignKeyTest.kt | 2 +- 
.../src/kotlin/testdb/MariaDbTestHelper.kt | 31 +- .../kotlin/src/kotlin/testdb/MockRepoTest.kt | 2 +- .../kotlin/src/kotlin/testdb/SqlScriptTest.kt | 2 +- .../src/kotlin/testdb/TestInsertTest.kt | 2 +- .../kotlin/src/kotlin/testdb/TupleInTest.kt | 6 +- .../src/kotlin/testdb/WithConnection.kt | 31 +- .../testdb/AllBrandsCategoriesCSet.scala | 7 +- .../AllBrandsCategoriesCSetMember.scala | 29 +- .../testdb/BestsellerClearanceFSet.scala | 7 +- .../BestsellerClearanceFSetMember.scala | 29 +- .../testdb/EmailMailPushSmsSet.scala | 7 +- .../testdb/EmailMailPushSmsSetMember.scala | 27 +- .../testdb/TestInsert.scala | 2 +- .../testdb/XYZSet.scala | 7 +- .../testdb/XYZSetMember.scala | 25 +- .../testdb/audit_log/AuditLogFields.scala | 44 +- .../testdb/audit_log/AuditLogId.scala | 10 +- .../testdb/audit_log/AuditLogRepo.scala | 19 +- .../testdb/audit_log/AuditLogRepoImpl.scala | 88 +- .../testdb/audit_log/AuditLogRepoMock.scala | 31 +- .../testdb/audit_log/AuditLogRow.scala | 9 +- .../testdb/brands/BrandsFields.scala | 38 +- .../testdb/brands/BrandsId.scala | 10 +- .../testdb/brands/BrandsRepo.scala | 21 +- .../testdb/brands/BrandsRepoImpl.scala | 82 +- .../testdb/brands/BrandsRepoMock.scala | 33 +- .../testdb/brands/BrandsRow.scala | 9 +- .../testdb/bridge/Customer.scala | 15 + .../testdb/categories/CategoriesFields.scala | 45 +- .../testdb/categories/CategoriesId.scala | 10 +- .../testdb/categories/CategoriesRepo.scala | 21 +- .../categories/CategoriesRepoImpl.scala | 93 +- .../categories/CategoriesRepoMock.scala | 33 +- .../testdb/categories/CategoriesRow.scala | 10 +- .../testdb/cte_test/CteTestSqlRepo.scala | 4 +- .../testdb/cte_test/CteTestSqlRepoImpl.scala | 8 +- .../testdb/cte_test/CteTestSqlRow.scala | 10 +- .../CustomerAddressesFields.scala | 54 +- .../CustomerAddressesId.scala | 10 +- .../CustomerAddressesRepo.scala | 19 +- .../CustomerAddressesRepoImpl.scala | 94 +- .../CustomerAddressesRepoMock.scala | 31 +- .../CustomerAddressesRow.scala | 9 +- 
.../CustomerOrdersSqlRepo.scala | 4 +- .../CustomerOrdersSqlRepoImpl.scala | 15 +- .../CustomerOrdersSqlRow.scala | 10 +- .../CustomerStatusFields.scala | 28 +- .../customer_status/CustomerStatusId.scala | 10 +- .../customer_status/CustomerStatusRepo.scala | 19 +- .../CustomerStatusRepoImpl.scala | 61 +- .../CustomerStatusRepoMock.scala | 31 +- .../customer_status/CustomerStatusRow.scala | 8 +- .../testdb/customers/CustomersFields.scala | 54 +- .../testdb/customers/CustomersId.scala | 10 +- .../testdb/customers/CustomersRepo.scala | 21 +- .../testdb/customers/CustomersRepoImpl.scala | 100 +- .../testdb/customers/CustomersRepoMock.scala | 33 +- .../testdb/customers/CustomersRow.scala | 9 +- .../testdb/inventory/InventoryFields.scala | 49 +- .../testdb/inventory/InventoryId.scala | 10 +- .../testdb/inventory/InventoryRepo.scala | 21 +- .../testdb/inventory/InventoryRepoImpl.scala | 99 +- .../testdb/inventory/InventoryRepoMock.scala | 33 +- .../testdb/inventory/InventoryRow.scala | 10 +- .../InventoryCheckSqlRepo.scala | 4 +- .../InventoryCheckSqlRepoImpl.scala | 20 +- .../InventoryCheckSqlRow.scala | 12 +- .../testdb/mariatest/MariatestFields.scala | 105 +- .../testdb/mariatest/MariatestId.scala | 10 +- .../testdb/mariatest/MariatestRepo.scala | 19 +- .../testdb/mariatest/MariatestRepoImpl.scala | 178 +- .../testdb/mariatest/MariatestRepoMock.scala | 31 +- .../testdb/mariatest/MariatestRow.scala | 9 +- .../MariatestIdentityFields.scala | 26 +- .../MariatestIdentityId.scala | 10 +- .../MariatestIdentityRepo.scala | 19 +- .../MariatestIdentityRepoImpl.scala | 57 +- .../MariatestIdentityRepoMock.scala | 31 +- .../MariatestIdentityRow.scala | 8 +- .../MariatestSpatialFields.scala | 40 +- .../MariatestSpatialId.scala | 10 +- .../MariatestSpatialRepo.scala | 19 +- .../MariatestSpatialRepoImpl.scala | 71 +- .../MariatestSpatialRepoMock.scala | 31 +- .../MariatestSpatialRow.scala | 8 +- .../MariatestSpatialNullFields.scala | 40 +- .../MariatestSpatialNullId.scala | 10 +- 
.../MariatestSpatialNullRepo.scala | 19 +- .../MariatestSpatialNullRepoImpl.scala | 92 +- .../MariatestSpatialNullRepoMock.scala | 31 +- .../MariatestSpatialNullRow.scala | 9 +- .../MariatestUniqueFields.scala | 30 +- .../mariatest_unique/MariatestUniqueId.scala | 10 +- .../MariatestUniqueRepo.scala | 23 +- .../MariatestUniqueRepoImpl.scala | 69 +- .../MariatestUniqueRepoMock.scala | 35 +- .../mariatest_unique/MariatestUniqueRow.scala | 8 +- .../mariatestnull/MariatestnullFields.scala | 103 +- .../mariatestnull/MariatestnullRepo.scala | 11 +- .../mariatestnull/MariatestnullRepoImpl.scala | 115 +- .../mariatestnull/MariatestnullRow.scala | 10 +- .../order_details/OrderDetailsSqlRepo.scala | 4 +- .../OrderDetailsSqlRepoImpl.scala | 10 +- .../order_details/OrderDetailsSqlRow.scala | 9 +- .../order_history/OrderHistoryFields.scala | 42 +- .../testdb/order_history/OrderHistoryId.scala | 10 +- .../order_history/OrderHistoryRepo.scala | 19 +- .../order_history/OrderHistoryRepoImpl.scala | 82 +- .../order_history/OrderHistoryRepoMock.scala | 31 +- .../order_history/OrderHistoryRow.scala | 9 +- .../testdb/order_items/OrderItemsFields.scala | 53 +- .../testdb/order_items/OrderItemsId.scala | 10 +- .../testdb/order_items/OrderItemsRepo.scala | 19 +- .../order_items/OrderItemsRepoImpl.scala | 101 +- .../order_items/OrderItemsRepoMock.scala | 31 +- .../testdb/order_items/OrderItemsRow.scala | 10 +- .../testdb/orders/OrdersFields.scala | 71 +- .../testdb/orders/OrdersId.scala | 10 +- .../testdb/orders/OrdersRepo.scala | 21 +- .../testdb/orders/OrdersRepoImpl.scala | 141 +- .../testdb/orders/OrdersRepoMock.scala | 33 +- .../testdb/orders/OrdersRow.scala | 10 +- .../PaymentMethodsFields.scala | 39 +- .../payment_methods/PaymentMethodsId.scala | 10 +- .../payment_methods/PaymentMethodsRepo.scala | 21 +- .../PaymentMethodsRepoImpl.scala | 81 +- .../PaymentMethodsRepoMock.scala | 33 +- .../payment_methods/PaymentMethodsRow.scala | 10 +- .../testdb/payments/PaymentsFields.scala | 51 +-
.../testdb/payments/PaymentsId.scala | 10 +- .../testdb/payments/PaymentsRepo.scala | 19 +- .../testdb/payments/PaymentsRepoImpl.scala | 97 +- .../testdb/payments/PaymentsRepoMock.scala | 31 +- .../testdb/payments/PaymentsRow.scala | 10 +- .../testdb/precisetypes/Binary16.scala | 10 +- .../testdb/precisetypes/Binary32.scala | 10 +- .../testdb/precisetypes/Binary64.scala | 10 +- .../testdb/precisetypes/Decimal10_2.scala | 12 +- .../testdb/precisetypes/Decimal12_4.scala | 12 +- .../testdb/precisetypes/Decimal18_4.scala | 12 +- .../testdb/precisetypes/Decimal5_2.scala | 12 +- .../testdb/precisetypes/Decimal8_2.scala | 12 +- .../testdb/precisetypes/LocalDateTime3.scala | 12 +- .../testdb/precisetypes/LocalDateTime6.scala | 12 +- .../testdb/precisetypes/LocalTime3.scala | 12 +- .../testdb/precisetypes/LocalTime6.scala | 12 +- .../testdb/precisetypes/PaddedString10.scala | 12 +- .../testdb/precisetypes/String10.scala | 12 +- .../testdb/precisetypes/String100.scala | 12 +- .../testdb/precisetypes/String20.scala | 12 +- .../testdb/precisetypes/String255.scala | 12 +- .../testdb/precisetypes/String50.scala | 12 +- .../PrecisionTypesFields.scala | 70 +- .../precision_types/PrecisionTypesId.scala | 10 +- .../precision_types/PrecisionTypesRepo.scala | 19 +- .../PrecisionTypesRepoImpl.scala | 101 +- .../PrecisionTypesRepoMock.scala | 31 +- .../precision_types/PrecisionTypesRow.scala | 8 +- .../PrecisionTypesNullFields.scala | 70 +- .../PrecisionTypesNullId.scala | 10 +- .../PrecisionTypesNullRepo.scala | 19 +- .../PrecisionTypesNullRepoImpl.scala | 152 +- .../PrecisionTypesNullRepoMock.scala | 31 +- .../PrecisionTypesNullRow.scala | 9 +- .../testdb/price_tiers/PriceTiersFields.scala | 33 +- .../testdb/price_tiers/PriceTiersId.scala | 10 +- .../testdb/price_tiers/PriceTiersRepo.scala | 19 +- .../price_tiers/PriceTiersRepoImpl.scala | 72 +- .../price_tiers/PriceTiersRepoMock.scala | 31 +- .../testdb/price_tiers/PriceTiersRow.scala | 9 +- .../ProductCategoriesFields.scala | 34 +-
.../ProductCategoriesId.scala | 6 +- .../ProductCategoriesRepo.scala | 19 +- .../ProductCategoriesRepoImpl.scala | 73 +- .../ProductCategoriesRepoMock.scala | 31 +- .../ProductCategoriesRow.scala | 10 +- .../product_images/ProductImagesFields.scala | 42 +- .../product_images/ProductImagesId.scala | 10 +- .../product_images/ProductImagesRepo.scala | 19 +- .../ProductImagesRepoImpl.scala | 80 +- .../ProductImagesRepoMock.scala | 31 +- .../product_images/ProductImagesRow.scala | 9 +- .../product_prices/ProductPricesFields.scala | 41 +- .../product_prices/ProductPricesId.scala | 10 +- .../product_prices/ProductPricesRepo.scala | 19 +- .../ProductPricesRepoImpl.scala | 81 +- .../ProductPricesRepoMock.scala | 31 +- .../product_prices/ProductPricesRow.scala | 10 +- .../product_search/ProductSearchSqlRepo.scala | 4 +- .../ProductSearchSqlRepoImpl.scala | 24 +- .../product_search/ProductSearchSqlRow.scala | 10 +- .../testdb/products/ProductsFields.scala | 63 +- .../testdb/products/ProductsId.scala | 10 +- .../testdb/products/ProductsRepo.scala | 21 +- .../testdb/products/ProductsRepoImpl.scala | 123 +- .../testdb/products/ProductsRepoMock.scala | 33 +- .../testdb/products/ProductsRow.scala | 10 +- .../testdb/promotions/PromotionsFields.scala | 57 +- .../testdb/promotions/PromotionsId.scala | 10 +- .../testdb/promotions/PromotionsRepo.scala | 21 +- .../promotions/PromotionsRepoImpl.scala | 111 +- .../promotions/PromotionsRepoMock.scala | 33 +- .../testdb/promotions/PromotionsRow.scala | 10 +- .../testdb/reviews/ReviewsFields.scala | 62 +- .../testdb/reviews/ReviewsId.scala | 10 +- .../testdb/reviews/ReviewsRepo.scala | 19 +- .../testdb/reviews/ReviewsRepoImpl.scala | 110 +- .../testdb/reviews/ReviewsRepoMock.scala | 31 +- .../testdb/reviews/ReviewsRow.scala | 9 +- .../testdb/shipments/ShipmentsFields.scala | 61 +- .../testdb/shipments/ShipmentsId.scala | 10 +- .../testdb/shipments/ShipmentsRepo.scala | 19 +- .../testdb/shipments/ShipmentsRepoImpl.scala | 115 +-
.../testdb/shipments/ShipmentsRepoMock.scala | 31 +- .../testdb/shipments/ShipmentsRow.scala | 10 +- .../ShippingCarriersFields.scala | 36 +- .../ShippingCarriersId.scala | 10 +- .../ShippingCarriersRepo.scala | 21 +- .../ShippingCarriersRepoImpl.scala | 78 +- .../ShippingCarriersRepoMock.scala | 33 +- .../ShippingCarriersRow.scala | 9 +- .../SimpleCustomerLookupSqlRepo.scala | 4 +- .../SimpleCustomerLookupSqlRepoImpl.scala | 12 +- .../SimpleCustomerLookupSqlRow.scala | 8 +- .../subquery_test/SubqueryTestSqlRepo.scala | 4 +- .../SubqueryTestSqlRepoImpl.scala | 8 +- .../subquery_test/SubqueryTestSqlRow.scala | 10 +- .../UpdateOrderStatusSqlRepo.scala | 2 +- .../UpdateOrderStatusSqlRepoImpl.scala | 10 +- .../testdb/userdefined/Email.scala | 10 +- .../testdb/userdefined/FirstName.scala | 10 +- .../testdb/userdefined/IsActive.scala | 10 +- .../testdb/userdefined/IsApproved.scala | 10 +- .../testdb/userdefined/IsDefault.scala | 10 +- .../testdb/userdefined/IsPrimary.scala | 10 +- .../userdefined/IsVerifiedPurchase.scala | 10 +- .../testdb/userdefined/LastName.scala | 10 +- .../VCustomerSummaryViewFields.scala | 43 +- .../VCustomerSummaryViewRepo.scala | 6 +- .../VCustomerSummaryViewRepoImpl.scala | 14 +- .../VCustomerSummaryViewRow.scala | 10 +- .../v_daily_sales/VDailySalesViewFields.scala | 43 +- .../v_daily_sales/VDailySalesViewRepo.scala | 6 +- .../VDailySalesViewRepoImpl.scala | 14 +- .../v_daily_sales/VDailySalesViewRow.scala | 10 +- .../VInventoryStatusViewFields.scala | 51 +- .../VInventoryStatusViewRepo.scala | 6 +- .../VInventoryStatusViewRepoImpl.scala | 14 +- .../VInventoryStatusViewRow.scala | 10 +- .../VOrderDetailsViewFields.scala | 51 +- .../VOrderDetailsViewRepo.scala | 6 +- .../VOrderDetailsViewRepoImpl.scala | 14 +- .../VOrderDetailsViewRow.scala | 10 +- .../VProductCatalogViewFields.scala | 45 +- .../VProductCatalogViewRepo.scala | 6 +- .../VProductCatalogViewRepoImpl.scala | 14 +- .../VProductCatalogViewRow.scala | 10 +- 
.../VWarehouseCoverageViewFields.scala | 43 +- .../VWarehouseCoverageViewRepo.scala | 6 +- .../VWarehouseCoverageViewRepoImpl.scala | 14 +- .../VWarehouseCoverageViewRow.scala | 10 +- .../testdb/warehouses/WarehousesFields.scala | 44 +- .../testdb/warehouses/WarehousesId.scala | 10 +- .../testdb/warehouses/WarehousesRepo.scala | 21 +- .../warehouses/WarehousesRepoImpl.scala | 88 +- .../warehouses/WarehousesRepoMock.scala | 33 +- .../testdb/warehouses/WarehousesRow.scala | 9 +- .../scala/src/scala/testdb/AllTypesTest.scala | 39 +- .../src/scala/testdb/CompositeIdTest.scala | 42 +- .../scala/src/scala/testdb/DSLTest.scala | 35 +- .../src/scala/testdb/ForeignKeyTest.scala | 23 +- .../scala/src/scala/testdb/IdentityTest.scala | 21 +- .../src/scala/testdb/SpatialTypesTest.scala | 27 +- .../src/scala/testdb/SqlScriptTest.scala | 20 +- .../src/scala/testdb/TestInsertTest.scala | 32 +- .../scala/src/scala/testdb/TupleInTest.scala | 37 +- .../scala/testdb/UniqueConstraintTest.scala | 21 +- .../scala/src/scala/testdb/ViewTest.scala | 36 +- .../src/scala/testdb/withConnection.scala | 13 +- testers/openapi/kotlin/jaxrs/build.gradle.kts | 42 - .../openapi/kotlin/quarkus/build.gradle.kts | 43 - .../openapi/kotlin/spring/build.gradle.kts | 43 - .../oracledb/AddressT.java | 26 +- .../oracledb/AllTypesStruct.java | 443 +- .../oracledb/AllTypesStructNoLobs.java | 454 +- .../oracledb/AllTypesStructNoLobsArray.java | 14 +- .../AllTypesStructNoLobsOptional.java | 464 +- .../AllTypesStructNoLobsOptionalArray.java | 18 +- .../oracledb/AllTypesStructOptional.java | 456 +- .../oracledb/CoordinatesT.java | 21 +- .../oracledb/DefaultedDeserializer.java | 36 +- .../oracledb/DefaultedSerializer.java | 14 +- .../oracledb/EmailTableT.java | 13 +- .../oracledb/EmailVarrayT.java | 13 +- .../oracledb/MoneyT.java | 20 +- .../oracledb/PhoneList.java | 13 +- .../oracledb/TagVarrayT.java | 13 +- .../oracledb/TestInsert.java | 235 +- .../AllScalarTypesFields.java | 109 +- 
.../all_scalar_types/AllScalarTypesId.java | 27 +- .../all_scalar_types/AllScalarTypesRepo.java | 65 +- .../AllScalarTypesRepoImpl.java | 340 +- .../AllScalarTypesRepoMock.java | 149 +- .../all_scalar_types/AllScalarTypesRow.java | 102 +- .../AllScalarTypesRowUnsaved.java | 71 +- .../all_types_test/AllTypesTestFields.java | 71 +- .../all_types_test/AllTypesTestId.java | 27 +- .../all_types_test/AllTypesTestRepo.java | 65 +- .../all_types_test/AllTypesTestRepoImpl.java | 266 +- .../all_types_test/AllTypesTestRepoMock.java | 149 +- .../all_types_test/AllTypesTestRow.java | 49 +- .../AllTypesTestRowUnsaved.java | 23 +- .../oracledb/bridge/Customer.java | 27 + .../oracledb/contacts/ContactsFields.java | 74 +- .../oracledb/contacts/ContactsId.java | 27 +- .../oracledb/contacts/ContactsRepo.java | 65 +- .../oracledb/contacts/ContactsRepoImpl.java | 261 +- .../oracledb/contacts/ContactsRepoMock.java | 149 +- .../oracledb/contacts/ContactsRow.java | 52 +- .../oracledb/contacts/ContactsRowUnsaved.java | 27 +- .../CustomerProductsViewFields.java | 99 +- .../CustomerProductsViewRepo.java | 10 +- .../CustomerProductsViewRepoImpl.java | 29 +- .../CustomerProductsViewRow.java | 73 +- .../CustomerSearchSqlRepo.java | 11 +- .../CustomerSearchSqlRepoImpl.java | 30 +- .../customer_search/CustomerSearchSqlRow.java | 58 +- .../oracledb/customers/CustomersFields.java | 84 +- .../oracledb/customers/CustomersId.java | 27 +- .../oracledb/customers/CustomersRepo.java | 65 +- .../oracledb/customers/CustomersRepoImpl.java | 293 +- .../oracledb/customers/CustomersRepoMock.java | 148 +- .../oracledb/customers/CustomersRow.java | 66 +- .../customers/CustomersRowUnsaved.java | 46 +- .../oracledb/customtypes/Defaulted.java | 46 +- .../DepartmentSummarySqlRepo.java | 8 +- .../DepartmentSummarySqlRepoImpl.java | 28 +- .../DepartmentSummarySqlRow.java | 54 +- .../departments/DepartmentsFields.java | 75 +- .../oracledb/departments/DepartmentsId.java | 25 +- 
.../oracledb/departments/DepartmentsRepo.java | 60 +- .../departments/DepartmentsRepoImpl.java | 217 +- .../departments/DepartmentsRepoMock.java | 135 +- .../oracledb/departments/DepartmentsRow.java | 50 +- .../oracledb/employees/EmployeesFields.java | 115 +- .../oracledb/employees/EmployeesId.java | 25 +- .../oracledb/employees/EmployeesRepo.java | 65 +- .../oracledb/employees/EmployeesRepoImpl.java | 323 +- .../oracledb/employees/EmployeesRepoMock.java | 148 +- .../oracledb/employees/EmployeesRow.java | 110 +- .../employees/EmployeesRowUnsaved.java | 71 +- .../EmployeesByDepartmentSqlRepo.java | 12 +- .../EmployeesByDepartmentSqlRepoImpl.java | 37 +- .../EmployeesByDepartmentSqlRow.java | 98 +- .../oracledb/precisetypes/Decimal10_2.java | 75 +- .../oracledb/precisetypes/Decimal18_4.java | 75 +- .../oracledb/precisetypes/Decimal5_2.java | 75 +- .../oracledb/precisetypes/Int10.java | 59 +- .../oracledb/precisetypes/Int18.java | 59 +- .../oracledb/precisetypes/Int5.java | 59 +- .../oracledb/precisetypes/LocalDateTime3.java | 50 +- .../oracledb/precisetypes/LocalDateTime6.java | 50 +- .../oracledb/precisetypes/LocalDateTime9.java | 50 +- .../precisetypes/NonEmptyPaddedString10.java | 58 +- .../precisetypes/NonEmptyString10.java | 66 +- .../precisetypes/NonEmptyString100.java | 66 +- .../precisetypes/NonEmptyString20.java | 66 +- .../precisetypes/NonEmptyString255.java | 66 +- .../precisetypes/NonEmptyString50.java | 66 +- .../precision_types/PrecisionTypesFields.java | 223 +- .../precision_types/PrecisionTypesId.java | 27 +- .../precision_types/PrecisionTypesRepo.java | 65 +- .../PrecisionTypesRepoImpl.java | 526 +- .../PrecisionTypesRepoMock.java | 149 +- .../precision_types/PrecisionTypesRow.java | 222 +- .../PrecisionTypesRowUnsaved.java | 177 +- .../PrecisionTypesNullFields.java | 226 +- .../PrecisionTypesNullId.java | 27 +- .../PrecisionTypesNullRepo.java | 64 +- .../PrecisionTypesNullRepoImpl.java | 541 +- .../PrecisionTypesNullRepoMock.java | 154 +- 
.../PrecisionTypesNullRow.java | 222 +- .../PrecisionTypesNullRowUnsaved.java | 176 +- .../product_by_sku/ProductBySkuSqlRepo.java | 11 +- .../ProductBySkuSqlRepoImpl.java | 30 +- .../product_by_sku/ProductBySkuSqlRow.java | 51 +- .../oracledb/products/ProductsFields.java | 76 +- .../oracledb/products/ProductsId.java | 27 +- .../oracledb/products/ProductsRepo.java | 74 +- .../oracledb/products/ProductsRepoImpl.java | 288 +- .../oracledb/products/ProductsRepoMock.java | 155 +- .../oracledb/products/ProductsRow.java | 51 +- .../oracledb/products/ProductsRowUnsaved.java | 31 +- .../oracledb/userdefined/Email.java | 38 - .../java/src/java/oracledb/OracleDSLTest.java | 4 +- .../src/java/oracledb/OracleTestHelper.java | 32 +- .../java/src/java/oracledb/TupleInTest.java | 6 +- .../all_types_test/AllTypesTestTest.java | 7 +- .../java/oracledb/contacts/ContactsTest.java | 55 +- .../departments/DepartmentsEmployeesTest.java | 23 +- testers/oracle/kotlin/build.gradle.kts | 44 - .../oracledb/AddressT.kt | 12 +- .../oracledb/AllTypesStruct.kt | 28 +- .../oracledb/AllTypesStructNoLobs.kt | 28 +- .../oracledb/AllTypesStructNoLobsArray.kt | 5 +- .../oracledb/AllTypesStructNoLobsOptional.kt | 28 +- .../AllTypesStructNoLobsOptionalArray.kt | 5 +- .../oracledb/AllTypesStructOptional.kt | 28 +- .../oracledb/CoordinatesT.kt | 8 +- .../oracledb/DefaultedDeserializer.kt | 2 +- .../oracledb/DefaultedSerializer.kt | 4 +- .../oracledb/EmailTableT.kt | 9 +- .../oracledb/EmailVarrayT.kt | 9 +- .../oracledb/MoneyT.kt | 11 +- .../oracledb/PhoneList.kt | 9 +- .../oracledb/TagVarrayT.kt | 9 +- .../oracledb/TestInsert.kt | 31 +- .../all_scalar_types/AllScalarTypesFields.kt | 47 +- .../all_scalar_types/AllScalarTypesId.kt | 18 +- .../all_scalar_types/AllScalarTypesRepo.kt | 27 +- .../AllScalarTypesRepoImpl.kt | 93 +- .../AllScalarTypesRepoMock.kt | 39 +- .../all_scalar_types/AllScalarTypesRow.kt | 26 +- .../AllScalarTypesRowUnsaved.kt | 8 +- .../all_types_test/AllTypesTestFields.kt | 36 +- 
.../oracledb/all_types_test/AllTypesTestId.kt | 18 +- .../all_types_test/AllTypesTestRepo.kt | 27 +- .../all_types_test/AllTypesTestRepoImpl.kt | 80 +- .../all_types_test/AllTypesTestRepoMock.kt | 39 +- .../all_types_test/AllTypesTestRow.kt | 17 +- .../all_types_test/AllTypesTestRowUnsaved.kt | 4 +- .../oracledb/bridge/Customer.kt | 15 + .../oracledb/contacts/ContactsFields.kt | 40 +- .../oracledb/contacts/ContactsId.kt | 18 +- .../oracledb/contacts/ContactsRepo.kt | 27 +- .../oracledb/contacts/ContactsRepoImpl.kt | 82 +- .../oracledb/contacts/ContactsRepoMock.kt | 39 +- .../oracledb/contacts/ContactsRow.kt | 23 +- .../oracledb/contacts/ContactsRowUnsaved.kt | 8 +- .../CustomerProductsViewFields.kt | 41 +- .../CustomerProductsViewRepo.kt | 6 +- .../CustomerProductsViewRepoImpl.kt | 12 +- .../CustomerProductsViewRow.kt | 20 +- .../customer_search/CustomerSearchSqlRepo.kt | 6 +- .../CustomerSearchSqlRepoImpl.kt | 12 +- .../customer_search/CustomerSearchSqlRow.kt | 15 +- .../oracledb/customers/CustomersFields.kt | 38 +- .../oracledb/customers/CustomersId.kt | 18 +- .../oracledb/customers/CustomersRepo.kt | 27 +- .../oracledb/customers/CustomersRepoImpl.kt | 84 +- .../oracledb/customers/CustomersRepoMock.kt | 39 +- .../oracledb/customers/CustomersRow.kt | 17 +- .../oracledb/customers/CustomersRowUnsaved.kt | 4 +- .../DepartmentSummarySqlRepo.kt | 4 +- .../DepartmentSummarySqlRepoImpl.kt | 6 +- .../DepartmentSummarySqlRow.kt | 23 +- .../oracledb/departments/DepartmentsFields.kt | 46 +- .../oracledb/departments/DepartmentsId.kt | 18 +- .../oracledb/departments/DepartmentsRepo.kt | 27 +- .../departments/DepartmentsRepoImpl.kt | 64 +- .../departments/DepartmentsRepoMock.kt | 39 +- .../oracledb/departments/DepartmentsRow.kt | 25 +- .../oracledb/employees/EmployeesFields.kt | 63 +- .../oracledb/employees/EmployeesId.kt | 15 +- .../oracledb/employees/EmployeesRepo.kt | 27 +- .../oracledb/employees/EmployeesRepoImpl.kt | 97 +- .../oracledb/employees/EmployeesRepoMock.kt | 39 
+- .../oracledb/employees/EmployeesRow.kt | 34 +- .../oracledb/employees/EmployeesRowUnsaved.kt | 8 +- .../EmployeesByDepartmentSqlRepo.kt | 8 +- .../EmployeesByDepartmentSqlRepoImpl.kt | 14 +- .../EmployeesByDepartmentSqlRow.kt | 24 +- .../oracledb/precisetypes/Decimal10_2.kt | 48 +- .../oracledb/precisetypes/Decimal18_4.kt | 48 +- .../oracledb/precisetypes/Decimal5_2.kt | 48 +- .../oracledb/precisetypes/Int10.kt | 36 +- .../oracledb/precisetypes/Int18.kt | 36 +- .../oracledb/precisetypes/Int5.kt | 36 +- .../oracledb/precisetypes/LocalDateTime3.kt | 28 +- .../oracledb/precisetypes/LocalDateTime6.kt | 28 +- .../oracledb/precisetypes/LocalDateTime9.kt | 28 +- .../precisetypes/NonEmptyPaddedString10.kt | 44 +- .../oracledb/precisetypes/NonEmptyString10.kt | 46 +- .../precisetypes/NonEmptyString100.kt | 46 +- .../oracledb/precisetypes/NonEmptyString20.kt | 46 +- .../precisetypes/NonEmptyString255.kt | 46 +- .../oracledb/precisetypes/NonEmptyString50.kt | 46 +- .../precision_types/PrecisionTypesFields.kt | 56 +- .../precision_types/PrecisionTypesId.kt | 18 +- .../precision_types/PrecisionTypesRepo.kt | 27 +- .../precision_types/PrecisionTypesRepoImpl.kt | 131 +- .../precision_types/PrecisionTypesRepoMock.kt | 39 +- .../precision_types/PrecisionTypesRow.kt | 10 +- .../PrecisionTypesRowUnsaved.kt | 2 +- .../PrecisionTypesNullFields.kt | 56 +- .../PrecisionTypesNullId.kt | 18 +- .../PrecisionTypesNullRepo.kt | 27 +- .../PrecisionTypesNullRepoImpl.kt | 132 +- .../PrecisionTypesNullRepoMock.kt | 39 +- .../PrecisionTypesNullRow.kt | 11 +- .../PrecisionTypesNullRowUnsaved.kt | 2 +- .../product_by_sku/ProductBySkuSqlRepo.kt | 6 +- .../product_by_sku/ProductBySkuSqlRepoImpl.kt | 12 +- .../product_by_sku/ProductBySkuSqlRow.kt | 19 +- .../oracledb/products/ProductsFields.kt | 40 +- .../oracledb/products/ProductsId.kt | 18 +- .../oracledb/products/ProductsRepo.kt | 31 +- .../oracledb/products/ProductsRepoImpl.kt | 90 +- .../oracledb/products/ProductsRepoMock.kt | 43 +- 
.../oracledb/products/ProductsRow.kt | 21 +- .../oracledb/products/ProductsRowUnsaved.kt | 6 +- .../oracledb/userdefined/Email.kt | 33 - testers/oracle/kotlin/gradle.properties | 1 - .../src/kotlin/oracledb/OracleTestHelper.kt | 20 +- .../kotlin/src/kotlin/oracledb/TupleInTest.kt | 6 +- .../all_types_test/AllTypesTestTest.kt | 9 +- .../kotlin/oracledb/contacts/ContactsTest.kt | 49 +- .../departments/DepartmentsEmployeesTest.kt | 6 +- .../oracledb/AddressT.scala | 8 +- .../oracledb/AllTypesStruct.scala | 36 +- .../oracledb/AllTypesStructNoLobs.scala | 36 +- .../oracledb/AllTypesStructNoLobsArray.scala | 5 +- .../AllTypesStructNoLobsOptional.scala | 36 +- .../AllTypesStructNoLobsOptionalArray.scala | 5 +- .../oracledb/AllTypesStructOptional.scala | 36 +- .../oracledb/CoordinatesT.scala | 8 +- .../oracledb/EmailTableT.scala | 7 +- .../oracledb/EmailVarrayT.scala | 7 +- .../oracledb/MoneyT.scala | 9 +- .../oracledb/PhoneList.scala | 7 +- .../oracledb/TagVarrayT.scala | 7 +- .../oracledb/TestInsert.scala | 5 +- .../AllScalarTypesFields.scala | 39 +- .../all_scalar_types/AllScalarTypesId.scala | 16 +- .../all_scalar_types/AllScalarTypesRepo.scala | 19 +- .../AllScalarTypesRepoImpl.scala | 95 +- .../AllScalarTypesRepoMock.scala | 31 +- .../all_scalar_types/AllScalarTypesRow.scala | 12 +- .../AllScalarTypesRowUnsaved.scala | 2 +- .../all_types_test/AllTypesTestFields.scala | 32 +- .../all_types_test/AllTypesTestId.scala | 16 +- .../all_types_test/AllTypesTestRepo.scala | 19 +- .../all_types_test/AllTypesTestRepoImpl.scala | 76 +- .../all_types_test/AllTypesTestRepoMock.scala | 31 +- .../all_types_test/AllTypesTestRow.scala | 11 +- .../AllTypesTestRowUnsaved.scala | 2 +- .../oracledb/bridge/Customer.scala | 15 + .../oracledb/contacts/ContactsFields.scala | 42 +- .../oracledb/contacts/ContactsId.scala | 16 +- .../oracledb/contacts/ContactsRepo.scala | 19 +- .../oracledb/contacts/ContactsRepoImpl.scala | 78 +- .../oracledb/contacts/ContactsRepoMock.scala | 31 +- 
.../oracledb/contacts/ContactsRow.scala | 19 +- .../contacts/ContactsRowUnsaved.scala | 6 +- .../CustomerProductsViewFields.scala | 35 +- .../CustomerProductsViewRepo.scala | 6 +- .../CustomerProductsViewRepoImpl.scala | 14 +- .../CustomerProductsViewRow.scala | 10 +- .../CustomerSearchSqlRepo.scala | 4 +- .../CustomerSearchSqlRepoImpl.scala | 12 +- .../CustomerSearchSqlRow.scala | 9 +- .../oracledb/customers/CustomersFields.scala | 34 +- .../oracledb/customers/CustomersId.scala | 16 +- .../oracledb/customers/CustomersRepo.scala | 19 +- .../customers/CustomersRepoImpl.scala | 74 +- .../customers/CustomersRepoMock.scala | 31 +- .../oracledb/customers/CustomersRow.scala | 11 +- .../customers/CustomersRowUnsaved.scala | 2 +- .../DepartmentSummarySqlRepo.scala | 4 +- .../DepartmentSummarySqlRepoImpl.scala | 8 +- .../DepartmentSummarySqlRow.scala | 9 +- .../departments/DepartmentsFields.scala | 34 +- .../oracledb/departments/DepartmentsId.scala | 8 +- .../departments/DepartmentsRepo.scala | 19 +- .../departments/DepartmentsRepoImpl.scala | 64 +- .../departments/DepartmentsRepoMock.scala | 31 +- .../oracledb/departments/DepartmentsRow.scala | 11 +- .../oracledb/employees/EmployeesFields.scala | 43 +- .../oracledb/employees/EmployeesId.scala | 9 +- .../oracledb/employees/EmployeesRepo.scala | 19 +- .../employees/EmployeesRepoImpl.scala | 85 +- .../employees/EmployeesRepoMock.scala | 31 +- .../oracledb/employees/EmployeesRow.scala | 12 +- .../EmployeesByDepartmentSqlRepo.scala | 4 +- .../EmployeesByDepartmentSqlRepoImpl.scala | 12 +- .../EmployeesByDepartmentSqlRow.scala | 10 +- .../oracledb/precisetypes/Decimal10_2.scala | 12 +- .../oracledb/precisetypes/Decimal18_4.scala | 12 +- .../oracledb/precisetypes/Decimal5_2.scala | 12 +- .../oracledb/precisetypes/Int10.scala | 12 +- .../oracledb/precisetypes/Int18.scala | 12 +- .../oracledb/precisetypes/Int5.scala | 12 +- .../precisetypes/LocalDateTime3.scala | 20 +- .../precisetypes/LocalDateTime6.scala | 20 +- 
.../precisetypes/LocalDateTime9.scala | 20 +- .../precisetypes/NonEmptyPaddedString10.scala | 20 +- .../precisetypes/NonEmptyString10.scala | 20 +- .../precisetypes/NonEmptyString100.scala | 20 +- .../precisetypes/NonEmptyString20.scala | 20 +- .../precisetypes/NonEmptyString255.scala | 20 +- .../precisetypes/NonEmptyString50.scala | 20 +- .../PrecisionTypesFields.scala | 56 +- .../precision_types/PrecisionTypesId.scala | 16 +- .../precision_types/PrecisionTypesRepo.scala | 19 +- .../PrecisionTypesRepoImpl.scala | 87 +- .../PrecisionTypesRepoMock.scala | 31 +- .../precision_types/PrecisionTypesRow.scala | 10 +- .../PrecisionTypesRowUnsaved.scala | 2 +- .../PrecisionTypesNullFields.scala | 56 +- .../PrecisionTypesNullId.scala | 16 +- .../PrecisionTypesNullRepo.scala | 19 +- .../PrecisionTypesNullRepoImpl.scala | 158 +- .../PrecisionTypesNullRepoMock.scala | 31 +- .../PrecisionTypesNullRow.scala | 11 +- .../PrecisionTypesNullRowUnsaved.scala | 2 +- .../product_by_sku/ProductBySkuSqlRepo.scala | 4 +- .../ProductBySkuSqlRepoImpl.scala | 12 +- .../product_by_sku/ProductBySkuSqlRow.scala | 9 +- .../oracledb/products/ProductsFields.scala | 34 +- .../oracledb/products/ProductsId.scala | 16 +- .../oracledb/products/ProductsRepo.scala | 21 +- .../oracledb/products/ProductsRepoImpl.scala | 78 +- .../oracledb/products/ProductsRepoMock.scala | 33 +- .../oracledb/products/ProductsRow.scala | 11 +- .../products/ProductsRowUnsaved.scala | 2 +- .../oracledb/userdefined/Email.scala | 26 - .../src/scala/oracledb/TupleInTest.scala | 38 +- .../all_scalar_types/AllScalarTypesTest.scala | 30 +- .../all_types_test/AllTypesTestTest.scala | 22 +- .../oracledb/contacts/ContactsTest.scala | 80 +- .../oracledb/customers/CustomersTest.scala | 18 +- .../DepartmentsEmployeesTest.scala | 54 +- .../oracledb/products/ProductsTest.scala | 30 +- .../src/scala/oracledb/withConnection.scala | 10 +- .../oracledb/AddressT.scala | 4 +- .../oracledb/AllTypesStruct.scala | 31 +- 
.../oracledb/AllTypesStructNoLobs.scala | 31 +- .../oracledb/AllTypesStructNoLobsArray.scala | 3 +- .../AllTypesStructNoLobsOptional.scala | 31 +- .../AllTypesStructNoLobsOptionalArray.scala | 3 +- .../oracledb/AllTypesStructOptional.scala | 31 +- .../oracledb/CoordinatesT.scala | 4 +- .../oracledb/EmailTableT.scala | 3 +- .../oracledb/EmailVarrayT.scala | 3 +- .../oracledb/MoneyT.scala | 4 +- .../oracledb/PhoneList.scala | 3 +- .../oracledb/TagVarrayT.scala | 3 +- .../oracledb/TestInsert.scala | 5 +- .../AllScalarTypesFields.scala | 22 +- .../all_scalar_types/AllScalarTypesId.scala | 12 +- .../all_scalar_types/AllScalarTypesRepo.scala | 19 +- .../AllScalarTypesRepoImpl.scala | 113 +- .../AllScalarTypesRepoMock.scala | 35 +- .../all_scalar_types/AllScalarTypesRow.scala | 8 +- .../AllScalarTypesRowUnsaved.scala | 2 +- .../all_types_test/AllTypesTestFields.scala | 22 +- .../all_types_test/AllTypesTestId.scala | 12 +- .../all_types_test/AllTypesTestRepo.scala | 19 +- .../all_types_test/AllTypesTestRepoImpl.scala | 95 +- .../all_types_test/AllTypesTestRepoMock.scala | 35 +- .../all_types_test/AllTypesTestRow.scala | 8 +- .../AllTypesTestRowUnsaved.scala | 2 +- .../oracledb/bridge/Customer.scala | 15 + .../oracledb/contacts/ContactsFields.scala | 34 +- .../oracledb/contacts/ContactsId.scala | 12 +- .../oracledb/contacts/ContactsRepo.scala | 19 +- .../oracledb/contacts/ContactsRepoImpl.scala | 97 +- .../oracledb/contacts/ContactsRepoMock.scala | 35 +- .../oracledb/contacts/ContactsRow.scala | 16 +- .../contacts/ContactsRowUnsaved.scala | 6 +- .../CustomerProductsViewFields.scala | 20 +- .../CustomerProductsViewRepo.scala | 6 +- .../CustomerProductsViewRepoImpl.scala | 16 +- .../CustomerProductsViewRow.scala | 6 +- .../CustomerSearchSqlRepo.scala | 4 +- .../CustomerSearchSqlRepoImpl.scala | 12 +- .../CustomerSearchSqlRow.scala | 6 +- .../oracledb/customers/CustomersFields.scala | 22 +- .../oracledb/customers/CustomersId.scala | 12 +- 
.../oracledb/customers/CustomersRepo.scala | 19 +- .../customers/CustomersRepoImpl.scala | 99 +- .../customers/CustomersRepoMock.scala | 35 +- .../oracledb/customers/CustomersRow.scala | 8 +- .../customers/CustomersRowUnsaved.scala | 2 +- .../DepartmentSummarySqlRepo.scala | 4 +- .../DepartmentSummarySqlRepoImpl.scala | 10 +- .../DepartmentSummarySqlRow.scala | 6 +- .../departments/DepartmentsFields.scala | 24 +- .../oracledb/departments/DepartmentsId.scala | 6 +- .../departments/DepartmentsRepo.scala | 19 +- .../departments/DepartmentsRepoImpl.scala | 73 +- .../departments/DepartmentsRepoMock.scala | 35 +- .../oracledb/departments/DepartmentsRow.scala | 8 +- .../oracledb/employees/EmployeesFields.scala | 26 +- .../oracledb/employees/EmployeesId.scala | 6 +- .../oracledb/employees/EmployeesRepo.scala | 19 +- .../employees/EmployeesRepoImpl.scala | 111 +- .../employees/EmployeesRepoMock.scala | 35 +- .../oracledb/employees/EmployeesRow.scala | 8 +- .../EmployeesByDepartmentSqlRepo.scala | 4 +- .../EmployeesByDepartmentSqlRepoImpl.scala | 14 +- .../EmployeesByDepartmentSqlRow.scala | 6 +- .../oracledb/precisetypes/Decimal10_2.scala | 14 +- .../oracledb/precisetypes/Decimal18_4.scala | 14 +- .../oracledb/precisetypes/Decimal5_2.scala | 14 +- .../oracledb/precisetypes/Int10.scala | 6 +- .../oracledb/precisetypes/Int18.scala | 6 +- .../oracledb/precisetypes/Int5.scala | 6 +- .../precisetypes/LocalDateTime3.scala | 16 +- .../precisetypes/LocalDateTime6.scala | 16 +- .../precisetypes/LocalDateTime9.scala | 16 +- .../precisetypes/NonEmptyPaddedString10.scala | 16 +- .../precisetypes/NonEmptyString10.scala | 16 +- .../precisetypes/NonEmptyString100.scala | 16 +- .../precisetypes/NonEmptyString20.scala | 16 +- .../precisetypes/NonEmptyString255.scala | 16 +- .../precisetypes/NonEmptyString50.scala | 16 +- .../PrecisionTypesFields.scala | 20 +- .../precision_types/PrecisionTypesId.scala | 12 +- .../precision_types/PrecisionTypesRepo.scala | 19 +- 
.../PrecisionTypesRepoImpl.scala | 173 +- .../PrecisionTypesRepoMock.scala | 35 +- .../precision_types/PrecisionTypesRow.scala | 8 +- .../PrecisionTypesRowUnsaved.scala | 2 +- .../PrecisionTypesNullFields.scala | 20 +- .../PrecisionTypesNullId.scala | 12 +- .../PrecisionTypesNullRepo.scala | 19 +- .../PrecisionTypesNullRepoImpl.scala | 173 +- .../PrecisionTypesNullRepoMock.scala | 35 +- .../PrecisionTypesNullRow.scala | 8 +- .../PrecisionTypesNullRowUnsaved.scala | 2 +- .../product_by_sku/ProductBySkuSqlRepo.scala | 4 +- .../ProductBySkuSqlRepoImpl.scala | 12 +- .../product_by_sku/ProductBySkuSqlRow.scala | 6 +- .../oracledb/products/ProductsFields.scala | 22 +- .../oracledb/products/ProductsId.scala | 12 +- .../oracledb/products/ProductsRepo.scala | 21 +- .../oracledb/products/ProductsRepoImpl.scala | 109 +- .../oracledb/products/ProductsRepoMock.scala | 37 +- .../oracledb/products/ProductsRow.scala | 8 +- .../products/ProductsRowUnsaved.scala | 2 +- .../oracledb/userdefined/Email.scala | 26 - .../src/scala/oracledb/OracleDSLTest.scala | 40 +- .../src/scala/oracledb/TupleInTest.scala | 55 +- .../all_scalar_types/AllScalarTypesTest.scala | 30 +- .../all_types_test/AllTypesTestTest.scala | 22 +- .../oracledb/contacts/ContactsTest.scala | 88 +- .../oracledb/customers/CustomersTest.scala | 18 +- .../DepartmentsEmployeesTest.scala | 54 +- .../oracledb/products/ProductsTest.scala | 30 +- .../src/scala/oracledb/withConnection.scala | 10 +- .../adventureworks/DefaultedDeserializer.java | 36 +- .../adventureworks/DefaultedSerializer.java | 14 +- .../adventureworks/TestDomainInsert.java | 24 +- .../adventureworks/TestInsert.java | 1182 +--- .../adventureworks/bridge/Customer.java | 27 + .../adventureworks/customtypes/Defaulted.java | 57 +- .../department/DepartmentFields.java | 71 +- .../department/DepartmentId.java | 24 +- .../department/DepartmentRepo.java | 82 +- .../department/DepartmentRepoImpl.java | 369 +- .../department/DepartmentRepoMock.java | 180 +- 
.../department/DepartmentRow.java | 74 +- .../department/DepartmentRowUnsaved.java | 81 +- .../employee/EmployeeFields.java | 205 +- .../humanresources/employee/EmployeeRepo.java | 81 +- .../employee/EmployeeRepoImpl.java | 632 +- .../employee/EmployeeRepoMock.java | 178 +- .../humanresources/employee/EmployeeRow.java | 619 +- .../employee/EmployeeRowUnsaved.java | 617 +- .../EmployeedepartmenthistoryFields.java | 126 +- .../EmployeedepartmenthistoryId.java | 39 +- .../EmployeedepartmenthistoryRepo.java | 76 +- .../EmployeedepartmenthistoryRepoImpl.java | 429 +- .../EmployeedepartmenthistoryRepoMock.java | 184 +- .../EmployeedepartmenthistoryRow.java | 196 +- .../EmployeedepartmenthistoryRowUnsaved.java | 207 +- .../humanresources/shift/ShiftFields.java | 77 +- .../humanresources/shift/ShiftId.java | 22 +- .../humanresources/shift/ShiftRepo.java | 83 +- .../humanresources/shift/ShiftRepoImpl.java | 387 +- .../humanresources/shift/ShiftRepoMock.java | 174 +- .../humanresources/shift/ShiftRow.java | 78 +- .../humanresources/shift/ShiftRowUnsaved.java | 88 +- .../vemployee/VemployeeViewFields.java | 232 +- .../vemployee/VemployeeViewRepo.java | 10 +- .../vemployee/VemployeeViewRepoImpl.java | 32 +- .../vemployee/VemployeeViewRow.java | 578 +- .../information_schema/CardinalNumber.java | 31 +- .../information_schema/CharacterData.java | 31 +- .../information_schema/SqlIdentifier.java | 31 +- .../information_schema/TimeStamp.java | 31 +- .../information_schema/YesOrNo.java | 33 +- .../person/address/AddressFields.java | 139 +- .../person/address/AddressId.java | 23 +- .../person/address/AddressRepo.java | 83 +- .../person/address/AddressRepoImpl.java | 465 +- .../person/address/AddressRepoMock.java | 178 +- .../person/address/AddressRow.java | 249 +- .../person/address/AddressRowUnsaved.java | 253 +- .../person/addresstype/AddresstypeFields.java | 71 +- .../person/addresstype/AddresstypeId.java | 24 +- .../person/addresstype/AddresstypeRepo.java | 81 +- 
.../addresstype/AddresstypeRepoImpl.java | 382 +- .../addresstype/AddresstypeRepoMock.java | 180 +- .../person/addresstype/AddresstypeRow.java | 76 +- .../addresstype/AddresstypeRowUnsaved.java | 84 +- .../businessentity/BusinessentityFields.java | 60 +- .../businessentity/BusinessentityId.java | 24 +- .../businessentity/BusinessentityRepo.java | 81 +- .../BusinessentityRepoImpl.java | 371 +- .../BusinessentityRepoMock.java | 180 +- .../businessentity/BusinessentityRow.java | 69 +- .../BusinessentityRowUnsaved.java | 70 +- .../BusinessentityaddressFields.java | 111 +- .../BusinessentityaddressId.java | 29 +- .../BusinessentityaddressRepo.java | 79 +- .../BusinessentityaddressRepoImpl.java | 432 +- .../BusinessentityaddressRepoMock.java | 184 +- .../BusinessentityaddressRow.java | 151 +- .../BusinessentityaddressRowUnsaved.java | 151 +- .../countryregion/CountryregionFields.java | 60 +- .../person/countryregion/CountryregionId.java | 24 +- .../countryregion/CountryregionRepo.java | 81 +- .../countryregion/CountryregionRepoImpl.java | 314 +- .../countryregion/CountryregionRepoMock.java | 181 +- .../countryregion/CountryregionRow.java | 50 +- .../CountryregionRowUnsaved.java | 51 +- .../emailaddress/EmailaddressFields.java | 98 +- .../person/emailaddress/EmailaddressId.java | 26 +- .../person/emailaddress/EmailaddressRepo.java | 81 +- .../emailaddress/EmailaddressRepoImpl.java | 431 +- .../emailaddress/EmailaddressRepoMock.java | 179 +- .../person/emailaddress/EmailaddressRow.java | 148 +- .../emailaddress/EmailaddressRowUnsaved.java | 135 +- .../person/password/PasswordFields.java | 87 +- .../person/password/PasswordRepo.java | 81 +- .../person/password/PasswordRepoImpl.java | 390 +- .../person/password/PasswordRepoMock.java | 178 +- .../person/password/PasswordRow.java | 77 +- .../person/password/PasswordRowUnsaved.java | 89 +- .../person/person/PersonFields.java | 184 +- .../person/person/PersonRepo.java | 82 +- .../person/person/PersonRepoImpl.java | 549 +- 
.../person/person/PersonRepoMock.java | 174 +- .../person/person/PersonRow.java | 437 +- .../person/person/PersonRowUnsaved.java | 453 +- .../stateprovince/StateprovinceFields.java | 128 +- .../person/stateprovince/StateprovinceId.java | 24 +- .../stateprovince/StateprovinceRepo.java | 81 +- .../stateprovince/StateprovinceRepoImpl.java | 485 +- .../stateprovince/StateprovinceRepoMock.java | 180 +- .../stateprovince/StateprovinceRow.java | 251 +- .../StateprovinceRowUnsaved.java | 270 +- .../person_detail/PersonDetailSqlRepo.java | 13 +- .../PersonDetailSqlRepoImpl.java | 43 +- .../person_detail/PersonDetailSqlRow.java | 228 +- .../person_dynamic/PersonDynamicSqlRepo.java | 11 +- .../PersonDynamicSqlRepoImpl.java | 26 +- .../person_dynamic/PersonDynamicSqlRow.java | 48 +- .../person_row_join/PersonRowJoinSqlRepo.java | 8 +- .../PersonRowJoinSqlRepoImpl.java | 24 +- .../person_row_join/PersonRowJoinSqlRow.java | 40 +- .../precisetypes/PaddedString10.java | 54 +- .../precisetypes/PaddedString3.java | 54 +- .../adventureworks/precisetypes/String10.java | 57 +- .../precisetypes/String100.java | 57 +- .../adventureworks/precisetypes/String20.java | 57 +- .../precisetypes/String255.java | 57 +- .../adventureworks/precisetypes/String50.java | 57 +- .../production/product/ProductFields.java | 328 +- .../production/product/ProductId.java | 23 +- .../production/product/ProductRepo.java | 83 +- .../production/product/ProductRepoImpl.java | 773 +-- .../production/product/ProductRepoMock.java | 178 +- .../production/product/ProductRow.java | 1173 +--- .../production/product/ProductRowUnsaved.java | 1237 +--- .../ProductcategoryFields.java | 72 +- .../productcategory/ProductcategoryId.java | 24 +- .../productcategory/ProductcategoryRepo.java | 81 +- .../ProductcategoryRepoImpl.java | 381 +- .../ProductcategoryRepoMock.java | 178 +- .../productcategory/ProductcategoryRow.java | 79 +- .../ProductcategoryRowUnsaved.java | 89 +- .../ProductcosthistoryFields.java | 102 +- 
.../ProductcosthistoryId.java | 26 +- .../ProductcosthistoryRepo.java | 81 +- .../ProductcosthistoryRepoImpl.java | 378 +- .../ProductcosthistoryRepoMock.java | 182 +- .../ProductcosthistoryRow.java | 145 +- .../ProductcosthistoryRowUnsaved.java | 157 +- .../productmodel/ProductmodelFields.java | 97 +- .../productmodel/ProductmodelId.java | 24 +- .../productmodel/ProductmodelRepo.java | 81 +- .../productmodel/ProductmodelRepoImpl.java | 424 +- .../productmodel/ProductmodelRepoMock.java | 180 +- .../productmodel/ProductmodelRow.java | 120 +- .../productmodel/ProductmodelRowUnsaved.java | 131 +- .../ProductsubcategoryFields.java | 96 +- .../ProductsubcategoryId.java | 24 +- .../ProductsubcategoryRepo.java | 79 +- .../ProductsubcategoryRepoImpl.java | 408 +- .../ProductsubcategoryRepoMock.java | 182 +- .../ProductsubcategoryRow.java | 122 +- .../ProductsubcategoryRowUnsaved.java | 134 +- .../unitmeasure/UnitmeasureFields.java | 59 +- .../production/unitmeasure/UnitmeasureId.java | 24 +- .../unitmeasure/UnitmeasureRepo.java | 81 +- .../unitmeasure/UnitmeasureRepoImpl.java | 311 +- .../unitmeasure/UnitmeasureRepoMock.java | 180 +- .../unitmeasure/UnitmeasureRow.java | 47 +- .../unitmeasure/UnitmeasureRowUnsaved.java | 49 +- .../adventureworks/public_/AccountNumber.java | 31 +- .../adventureworks/public_/Address.java | 43 +- .../public_/AllTypesComposite.java | 564 +- .../adventureworks/public_/Complex.java | 32 +- .../adventureworks/public_/ContactInfo.java | 37 +- .../public_/EmployeeRecord.java | 49 +- .../adventureworks/public_/Flag.java | 28 +- .../adventureworks/public_/InventoryItem.java | 48 +- .../public_/MetadataRecord.java | 37 +- .../adventureworks/public_/Mydomain.java | 29 +- .../adventureworks/public_/Myenum.java | 72 +- .../adventureworks/public_/Name.java | 28 +- .../adventureworks/public_/NameStyle.java | 29 +- .../adventureworks/public_/NullableTest.java | 37 +- .../adventureworks/public_/OrderNumber.java | 31 +- 
.../adventureworks/public_/PersonName.java | 44 +- .../adventureworks/public_/Phone.java | 28 +- .../adventureworks/public_/Point2d.java | 32 +- .../adventureworks/public_/PolygonCustom.java | 36 +- .../adventureworks/public_/ShortText.java | 29 +- .../public_/TablefuncCrosstab2.java | 37 +- .../public_/TablefuncCrosstab3.java | 44 +- .../public_/TablefuncCrosstab4.java | 49 +- .../public_/TextWithSpecialChars.java | 72 +- .../adventureworks/public_/TreeNode.java | 37 +- .../public_/flaff/FlaffFields.java | 97 +- .../adventureworks/public_/flaff/FlaffId.java | 36 +- .../public_/flaff/FlaffRepo.java | 72 +- .../public_/flaff/FlaffRepoImpl.java | 301 +- .../public_/flaff/FlaffRepoMock.java | 153 +- .../public_/flaff/FlaffRow.java | 84 +- .../identity_test/IdentityTestFields.java | 59 +- .../public_/identity_test/IdentityTestId.java | 24 +- .../identity_test/IdentityTestRepo.java | 82 +- .../identity_test/IdentityTestRepoImpl.java | 285 +- .../identity_test/IdentityTestRepoMock.java | 177 +- .../identity_test/IdentityTestRow.java | 60 +- .../identity_test/IdentityTestRowUnsaved.java | 48 +- .../public_/issue142/Issue142Fields.java | 36 +- .../public_/issue142/Issue142Id.java | 62 +- .../public_/issue142/Issue142Repo.java | 67 +- .../public_/issue142/Issue142RepoImpl.java | 187 +- .../public_/issue142/Issue142RepoMock.java | 138 +- .../public_/issue142/Issue142Row.java | 22 +- .../public_/issue142_2/Issue1422Fields.java | 41 +- .../public_/issue142_2/Issue1422Repo.java | 67 +- .../public_/issue142_2/Issue1422RepoImpl.java | 193 +- .../public_/issue142_2/Issue1422RepoMock.java | 138 +- .../public_/issue142_2/Issue1422Row.java | 29 +- .../only_pk_columns/OnlyPkColumnsFields.java | 55 +- .../only_pk_columns/OnlyPkColumnsId.java | 26 +- .../only_pk_columns/OnlyPkColumnsRepo.java | 66 +- .../OnlyPkColumnsRepoImpl.java | 231 +- .../OnlyPkColumnsRepoMock.java | 140 +- .../only_pk_columns/OnlyPkColumnsRow.java | 40 +- .../public_/pgtest/PgtestFields.java | 1070 +-- 
.../public_/pgtest/PgtestRepo.java | 26 +- .../public_/pgtest/PgtestRepoImpl.java | 251 +- .../public_/pgtest/PgtestRow.java | 5894 +---------------- .../public_/pgtestnull/PgtestnullFields.java | 1073 +-- .../public_/pgtestnull/PgtestnullRepo.java | 26 +- .../pgtestnull/PgtestnullRepoImpl.java | 260 +- .../public_/pgtestnull/PgtestnullRow.java | 5894 +---------------- .../precision_types/PrecisionTypesFields.java | 311 +- .../precision_types/PrecisionTypesId.java | 24 +- .../precision_types/PrecisionTypesRepo.java | 82 +- .../PrecisionTypesRepoImpl.java | 720 +- .../PrecisionTypesRepoMock.java | 176 +- .../precision_types/PrecisionTypesRow.java | 905 +-- .../PrecisionTypesRowUnsaved.java | 932 +-- .../PrecisionTypesNullFields.java | 314 +- .../PrecisionTypesNullId.java | 24 +- .../PrecisionTypesNullRepo.java | 81 +- .../PrecisionTypesNullRepoImpl.java | 743 +-- .../PrecisionTypesNullRepoMock.java | 180 +- .../PrecisionTypesNullRow.java | 905 +-- .../PrecisionTypesNullRowUnsaved.java | 927 +-- .../public_/title/TitleFields.java | 36 +- .../adventureworks/public_/title/TitleId.java | 67 +- .../public_/title/TitleRepo.java | 67 +- .../public_/title/TitleRepoImpl.java | 171 +- .../public_/title/TitleRepoMock.java | 138 +- .../public_/title/TitleRow.java | 22 +- .../title_domain/TitleDomainFields.java | 39 +- .../public_/title_domain/TitleDomainId.java | 76 +- .../public_/title_domain/TitleDomainRepo.java | 67 +- .../title_domain/TitleDomainRepoImpl.java | 188 +- .../title_domain/TitleDomainRepoMock.java | 139 +- .../public_/title_domain/TitleDomainRow.java | 22 +- .../titledperson/TitledpersonFields.java | 65 +- .../titledperson/TitledpersonRepo.java | 26 +- .../titledperson/TitledpersonRepoImpl.java | 75 +- .../public_/titledperson/TitledpersonRow.java | 38 +- .../public_/users/UsersFields.java | 104 +- .../adventureworks/public_/users/UsersId.java | 22 +- .../public_/users/UsersRepo.java | 88 +- .../public_/users/UsersRepoImpl.java | 380 +- 
.../public_/users/UsersRepoMock.java | 180 +- .../public_/users/UsersRow.java | 76 +- .../public_/users/UsersRowUnsaved.java | 68 +- .../sales/salesperson/SalespersonFields.java | 146 +- .../sales/salesperson/SalespersonRepo.java | 81 +- .../salesperson/SalespersonRepoImpl.java | 523 +- .../salesperson/SalespersonRepoMock.java | 180 +- .../sales/salesperson/SalespersonRow.java | 328 +- .../salesperson/SalespersonRowUnsaved.java | 331 +- .../salesterritory/SalesterritoryFields.java | 153 +- .../salesterritory/SalesterritoryId.java | 24 +- .../salesterritory/SalesterritoryRepo.java | 81 +- .../SalesterritoryRepoImpl.java | 542 +- .../SalesterritoryRepoMock.java | 179 +- .../salesterritory/SalesterritoryRow.java | 347 +- .../SalesterritoryRowUnsaved.java | 357 +- .../update_person/UpdatePersonSqlRepo.java | 12 +- .../UpdatePersonSqlRepoImpl.java | 24 +- .../UpdatePersonReturningSqlRepo.java | 13 +- .../UpdatePersonReturningSqlRepoImpl.java | 25 +- .../UpdatePersonReturningSqlRow.java | 30 +- .../userdefined/ActiveFlag.java | 27 +- .../userdefined/CurrentFlag.java | 28 +- .../userdefined/Description.java | 35 + .../adventureworks/userdefined/FirstName.java | 27 +- .../adventureworks/userdefined/LastName.java | 26 +- .../userdefined/MiddleName.java | 27 +- .../userdefined/OnlineOrderFlag.java | 28 +- .../userdefined/SalariedFlag.java | 28 +- .../src/java/adventureworks/ArrayTest.java | 188 +- .../java/src/java/adventureworks/DSLTest.java | 8 +- .../src/java/adventureworks/SeekDbTest.java | 2 +- .../src/java/adventureworks/TupleInTest.java | 6 +- .../java/adventureworks/WithConnection.java | 36 +- .../humanresources/employee/EmployeeTest.java | 2 +- .../adventureworks/person/MultiRepoTest.java | 2 +- .../production/product/ProductTest.java | 4 +- .../production/product/SeekTest.java | 2 +- .../productcosthistory/CompositeIdsTest.java | 9 +- .../public_/users/UsersRepoTest.java | 9 +- .../userdefined/CustomCreditcardId.java | 8 +- testers/pg/kotlin/build.gradle.kts | 38 - 
.../adventureworks/DefaultedDeserializer.kt | 2 +- .../adventureworks/DefaultedSerializer.kt | 4 +- .../adventureworks/TestInsert.kt | 247 +- .../adventureworks/bridge/Customer.kt | 15 + .../department/DepartmentFields.kt | 30 +- .../humanresources/department/DepartmentId.kt | 15 +- .../department/DepartmentRepo.kt | 31 +- .../department/DepartmentRepoImpl.kt | 95 +- .../department/DepartmentRepoMock.kt | 43 +- .../department/DepartmentRow.kt | 10 +- .../department/DepartmentRowUnsaved.kt | 10 +- .../humanresources/employee/EmployeeFields.kt | 75 +- .../humanresources/employee/EmployeeRepo.kt | 31 +- .../employee/EmployeeRepoImpl.kt | 141 +- .../employee/EmployeeRepoMock.kt | 43 +- .../humanresources/employee/EmployeeRow.kt | 52 +- .../employee/EmployeeRowUnsaved.kt | 56 +- .../EmployeedepartmenthistoryFields.kt | 44 +- .../EmployeedepartmenthistoryId.kt | 8 +- .../EmployeedepartmenthistoryRepo.kt | 31 +- .../EmployeedepartmenthistoryRepoImpl.kt | 119 +- .../EmployeedepartmenthistoryRepoMock.kt | 43 +- .../EmployeedepartmenthistoryRow.kt | 15 +- .../EmployeedepartmenthistoryRowUnsaved.kt | 15 +- .../humanresources/shift/ShiftFields.kt | 32 +- .../humanresources/shift/ShiftId.kt | 15 +- .../humanresources/shift/ShiftRepo.kt | 31 +- .../humanresources/shift/ShiftRepoImpl.kt | 99 +- .../humanresources/shift/ShiftRepoMock.kt | 43 +- .../humanresources/shift/ShiftRow.kt | 10 +- .../humanresources/shift/ShiftRowUnsaved.kt | 12 +- .../vemployee/VemployeeViewFields.kt | 75 +- .../vemployee/VemployeeViewRepo.kt | 6 +- .../vemployee/VemployeeViewRepoImpl.kt | 12 +- .../vemployee/VemployeeViewRow.kt | 43 +- .../information_schema/CardinalNumber.kt | 15 +- .../information_schema/CharacterData.kt | 18 +- .../information_schema/SqlIdentifier.kt | 18 +- .../information_schema/TimeStamp.kt | 14 +- .../information_schema/YesOrNo.kt | 18 +- .../person/address/AddressFields.kt | 54 +- .../person/address/AddressId.kt | 15 +- .../person/address/AddressRepo.kt | 31 +- 
.../person/address/AddressRepoImpl.kt | 116 +- .../person/address/AddressRepoMock.kt | 43 +- .../person/address/AddressRow.kt | 29 +- .../person/address/AddressRowUnsaved.kt | 29 +- .../person/addresstype/AddresstypeFields.kt | 30 +- .../person/addresstype/AddresstypeId.kt | 15 +- .../person/addresstype/AddresstypeRepo.kt | 31 +- .../person/addresstype/AddresstypeRepoImpl.kt | 95 +- .../person/addresstype/AddresstypeRepoMock.kt | 43 +- .../person/addresstype/AddresstypeRow.kt | 10 +- .../addresstype/AddresstypeRowUnsaved.kt | 10 +- .../businessentity/BusinessentityFields.kt | 28 +- .../person/businessentity/BusinessentityId.kt | 15 +- .../businessentity/BusinessentityRepo.kt | 31 +- .../businessentity/BusinessentityRepoImpl.kt | 91 +- .../businessentity/BusinessentityRepoMock.kt | 43 +- .../businessentity/BusinessentityRow.kt | 10 +- .../BusinessentityRowUnsaved.kt | 8 +- .../BusinessentityaddressFields.kt | 40 +- .../BusinessentityaddressId.kt | 6 +- .../BusinessentityaddressRepo.kt | 31 +- .../BusinessentityaddressRepoImpl.kt | 110 +- .../BusinessentityaddressRepoMock.kt | 43 +- .../BusinessentityaddressRow.kt | 14 +- .../BusinessentityaddressRowUnsaved.kt | 12 +- .../countryregion/CountryregionFields.kt | 28 +- .../person/countryregion/CountryregionId.kt | 20 +- .../person/countryregion/CountryregionRepo.kt | 31 +- .../countryregion/CountryregionRepoImpl.kt | 91 +- .../countryregion/CountryregionRepoMock.kt | 43 +- .../person/countryregion/CountryregionRow.kt | 10 +- .../countryregion/CountryregionRowUnsaved.kt | 8 +- .../person/emailaddress/EmailaddressFields.kt | 47 +- .../person/emailaddress/EmailaddressId.kt | 8 +- .../person/emailaddress/EmailaddressRepo.kt | 31 +- .../emailaddress/EmailaddressRepoImpl.kt | 108 +- .../emailaddress/EmailaddressRepoMock.kt | 43 +- .../person/emailaddress/EmailaddressRow.kt | 24 +- .../emailaddress/EmailaddressRowUnsaved.kt | 16 +- .../person/password/PasswordFields.kt | 40 +- .../person/password/PasswordRepo.kt | 31 +- 
.../person/password/PasswordRepoImpl.kt | 99 +- .../person/password/PasswordRepoMock.kt | 43 +- .../person/password/PasswordRow.kt | 20 +- .../person/password/PasswordRowUnsaved.kt | 16 +- .../person/person/PersonFields.kt | 61 +- .../person/person/PersonRepo.kt | 31 +- .../person/person/PersonRepoImpl.kt | 133 +- .../person/person/PersonRepoMock.kt | 43 +- .../adventureworks/person/person/PersonRow.kt | 26 +- .../person/person/PersonRowUnsaved.kt | 36 +- .../stateprovince/StateprovinceFields.kt | 44 +- .../person/stateprovince/StateprovinceId.kt | 15 +- .../person/stateprovince/StateprovinceRepo.kt | 31 +- .../stateprovince/StateprovinceRepoImpl.kt | 111 +- .../stateprovince/StateprovinceRepoMock.kt | 43 +- .../person/stateprovince/StateprovinceRow.kt | 16 +- .../stateprovince/StateprovinceRowUnsaved.kt | 20 +- .../person_detail/PersonDetailSqlRepo.kt | 4 +- .../person_detail/PersonDetailSqlRepoImpl.kt | 10 +- .../person_detail/PersonDetailSqlRow.kt | 31 +- .../person_dynamic/PersonDynamicSqlRepo.kt | 6 +- .../PersonDynamicSqlRepoImpl.kt | 13 +- .../person_dynamic/PersonDynamicSqlRow.kt | 15 +- .../person_row_join/PersonRowJoinSqlRepo.kt | 4 +- .../PersonRowJoinSqlRepoImpl.kt | 6 +- .../person_row_join/PersonRowJoinSqlRow.kt | 20 +- .../precisetypes/PaddedString10.kt | 42 +- .../precisetypes/PaddedString3.kt | 42 +- .../adventureworks/precisetypes/String10.kt | 44 +- .../adventureworks/precisetypes/String100.kt | 44 +- .../adventureworks/precisetypes/String20.kt | 44 +- .../adventureworks/precisetypes/String255.kt | 44 +- .../adventureworks/precisetypes/String50.kt | 44 +- .../production/product/ProductFields.kt | 95 +- .../production/product/ProductId.kt | 15 +- .../production/product/ProductRepo.kt | 31 +- .../production/product/ProductRepoImpl.kt | 181 +- .../production/product/ProductRepoMock.kt | 43 +- .../production/product/ProductRow.kt | 45 +- .../production/product/ProductRowUnsaved.kt | 70 +- .../productcategory/ProductcategoryFields.kt | 30 +- 
.../productcategory/ProductcategoryId.kt | 15 +- .../productcategory/ProductcategoryRepo.kt | 31 +- .../ProductcategoryRepoImpl.kt | 95 +- .../ProductcategoryRepoMock.kt | 43 +- .../productcategory/ProductcategoryRow.kt | 10 +- .../ProductcategoryRowUnsaved.kt | 10 +- .../ProductcosthistoryFields.kt | 42 +- .../ProductcosthistoryId.kt | 8 +- .../ProductcosthistoryRepo.kt | 31 +- .../ProductcosthistoryRepoImpl.kt | 107 +- .../ProductcosthistoryRepoMock.kt | 43 +- .../ProductcosthistoryRow.kt | 15 +- .../ProductcosthistoryRowUnsaved.kt | 13 +- .../productmodel/ProductmodelFields.kt | 36 +- .../production/productmodel/ProductmodelId.kt | 15 +- .../productmodel/ProductmodelRepo.kt | 31 +- .../productmodel/ProductmodelRepoImpl.kt | 104 +- .../productmodel/ProductmodelRepoMock.kt | 43 +- .../productmodel/ProductmodelRow.kt | 11 +- .../productmodel/ProductmodelRowUnsaved.kt | 15 +- .../ProductsubcategoryFields.kt | 34 +- .../ProductsubcategoryId.kt | 15 +- .../ProductsubcategoryRepo.kt | 31 +- .../ProductsubcategoryRepoImpl.kt | 99 +- .../ProductsubcategoryRepoMock.kt | 43 +- .../ProductsubcategoryRow.kt | 10 +- .../ProductsubcategoryRowUnsaved.kt | 12 +- .../unitmeasure/UnitmeasureFields.kt | 28 +- .../production/unitmeasure/UnitmeasureId.kt | 20 +- .../production/unitmeasure/UnitmeasureRepo.kt | 31 +- .../unitmeasure/UnitmeasureRepoImpl.kt | 91 +- .../unitmeasure/UnitmeasureRepoMock.kt | 43 +- .../production/unitmeasure/UnitmeasureRow.kt | 10 +- .../unitmeasure/UnitmeasureRowUnsaved.kt | 8 +- .../adventureworks/public/AccountNumber.kt | 18 +- .../adventureworks/public/Address.kt | 26 +- .../public/AllTypesComposite.kt | 35 +- .../adventureworks/public/Complex.kt | 22 +- .../adventureworks/public/ContactInfo.kt | 22 +- .../adventureworks/public/EmployeeRecord.kt | 19 +- .../adventureworks/public/Flag.kt | 19 +- .../adventureworks/public/InventoryItem.kt | 27 +- .../adventureworks/public/MetadataRecord.kt | 20 +- .../adventureworks/public/Mydomain.kt | 18 +- 
.../adventureworks/public/Myenum.kt | 19 +- .../adventureworks/public/Name.kt | 18 +- .../adventureworks/public/NameStyle.kt | 19 +- .../adventureworks/public/NullableTest.kt | 24 +- .../adventureworks/public/OrderNumber.kt | 18 +- .../adventureworks/public/PersonName.kt | 26 +- .../adventureworks/public/Phone.kt | 18 +- .../adventureworks/public/Point2d.kt | 22 +- .../adventureworks/public/PolygonCustom.kt | 22 +- .../adventureworks/public/ShortText.kt | 18 +- .../public/TablefuncCrosstab2.kt | 24 +- .../public/TablefuncCrosstab3.kt | 26 +- .../public/TablefuncCrosstab4.kt | 28 +- .../public/TextWithSpecialChars.kt | 30 +- .../adventureworks/public/TreeNode.kt | 21 +- .../public/flaff/FlaffFields.kt | 47 +- .../adventureworks/public/flaff/FlaffId.kt | 15 +- .../adventureworks/public/flaff/FlaffRepo.kt | 29 +- .../public/flaff/FlaffRepoImpl.kt | 90 +- .../public/flaff/FlaffRepoMock.kt | 41 +- .../adventureworks/public/flaff/FlaffRow.kt | 22 +- .../identity_test/IdentityTestFields.kt | 28 +- .../public/identity_test/IdentityTestId.kt | 20 +- .../public/identity_test/IdentityTestRepo.kt | 31 +- .../identity_test/IdentityTestRepoImpl.kt | 87 +- .../identity_test/IdentityTestRepoMock.kt | 43 +- .../public/identity_test/IdentityTestRow.kt | 10 +- .../identity_test/IdentityTestRowUnsaved.kt | 6 +- .../public/issue142/Issue142Fields.kt | 20 +- .../public/issue142/Issue142Id.kt | 25 +- .../public/issue142/Issue142Repo.kt | 27 +- .../public/issue142/Issue142RepoImpl.kt | 67 +- .../public/issue142/Issue142RepoMock.kt | 39 +- .../public/issue142/Issue142Row.kt | 8 +- .../public/issue142_2/Issue1422Fields.kt | 22 +- .../public/issue142_2/Issue1422Repo.kt | 27 +- .../public/issue142_2/Issue1422RepoImpl.kt | 67 +- .../public/issue142_2/Issue1422RepoMock.kt | 39 +- .../public/issue142_2/Issue1422Row.kt | 8 +- .../only_pk_columns/OnlyPkColumnsFields.kt | 35 +- .../public/only_pk_columns/OnlyPkColumnsId.kt | 15 +- .../only_pk_columns/OnlyPkColumnsRepo.kt | 27 +- 
.../only_pk_columns/OnlyPkColumnsRepoImpl.kt | 77 +- .../only_pk_columns/OnlyPkColumnsRepoMock.kt | 39 +- .../only_pk_columns/OnlyPkColumnsRow.kt | 21 +- .../public/pgtest/PgtestFields.kt | 251 +- .../public/pgtest/PgtestRepo.kt | 13 +- .../public/pgtest/PgtestRepoImpl.kt | 30 +- .../adventureworks/public/pgtest/PgtestRow.kt | 191 +- .../public/pgtestnull/PgtestnullFields.kt | 251 +- .../public/pgtestnull/PgtestnullRepo.kt | 13 +- .../public/pgtestnull/PgtestnullRepoImpl.kt | 31 +- .../public/pgtestnull/PgtestnullRow.kt | 192 +- .../precision_types/PrecisionTypesFields.kt | 72 +- .../precision_types/PrecisionTypesId.kt | 15 +- .../precision_types/PrecisionTypesRepo.kt | 31 +- .../precision_types/PrecisionTypesRepoImpl.kt | 179 +- .../precision_types/PrecisionTypesRepoMock.kt | 43 +- .../precision_types/PrecisionTypesRow.kt | 9 +- .../PrecisionTypesRowUnsaved.kt | 52 +- .../PrecisionTypesNullFields.kt | 72 +- .../PrecisionTypesNullId.kt | 15 +- .../PrecisionTypesNullRepo.kt | 31 +- .../PrecisionTypesNullRepoImpl.kt | 180 +- .../PrecisionTypesNullRepoMock.kt | 43 +- .../PrecisionTypesNullRow.kt | 10 +- .../PrecisionTypesNullRowUnsaved.kt | 53 +- .../public/title/TitleFields.kt | 20 +- .../adventureworks/public/title/TitleId.kt | 25 +- .../adventureworks/public/title/TitleRepo.kt | 27 +- .../public/title/TitleRepoImpl.kt | 67 +- .../public/title/TitleRepoMock.kt | 39 +- .../adventureworks/public/title/TitleRow.kt | 8 +- .../public/title_domain/TitleDomainFields.kt | 20 +- .../public/title_domain/TitleDomainId.kt | 15 +- .../public/title_domain/TitleDomainRepo.kt | 27 +- .../title_domain/TitleDomainRepoImpl.kt | 67 +- .../title_domain/TitleDomainRepoMock.kt | 39 +- .../public/title_domain/TitleDomainRow.kt | 8 +- .../public/titledperson/TitledpersonFields.kt | 32 +- .../public/titledperson/TitledpersonRepo.kt | 13 +- .../titledperson/TitledpersonRepoImpl.kt | 29 +- .../public/titledperson/TitledpersonRow.kt | 16 +- .../public/users/UsersFields.kt | 46 +- 
.../adventureworks/public/users/UsersId.kt | 14 +- .../adventureworks/public/users/UsersRepo.kt | 33 +- .../public/users/UsersRepoImpl.kt | 112 +- .../public/users/UsersRepoMock.kt | 45 +- .../adventureworks/public/users/UsersRow.kt | 25 +- .../public/users/UsersRowUnsaved.kt | 23 +- .../sales/salesperson/SalespersonFields.kt | 44 +- .../sales/salesperson/SalespersonRepo.kt | 31 +- .../sales/salesperson/SalespersonRepoImpl.kt | 116 +- .../sales/salesperson/SalespersonRepoMock.kt | 43 +- .../sales/salesperson/SalespersonRow.kt | 11 +- .../salesperson/SalespersonRowUnsaved.kt | 21 +- .../salesterritory/SalesterritoryFields.kt | 48 +- .../sales/salesterritory/SalesterritoryId.kt | 15 +- .../salesterritory/SalesterritoryRepo.kt | 31 +- .../salesterritory/SalesterritoryRepoImpl.kt | 119 +- .../salesterritory/SalesterritoryRepoMock.kt | 43 +- .../sales/salesterritory/SalesterritoryRow.kt | 16 +- .../SalesterritoryRowUnsaved.kt | 24 +- .../update_person/UpdatePersonSqlRepo.kt | 4 +- .../update_person/UpdatePersonSqlRepoImpl.kt | 11 +- .../UpdatePersonReturningSqlRepo.kt | 6 +- .../UpdatePersonReturningSqlRepoImpl.kt | 13 +- .../UpdatePersonReturningSqlRow.kt | 8 +- .../adventureworks/userdefined/ActiveFlag.kt | 12 +- .../adventureworks/userdefined/CurrentFlag.kt | 12 +- .../adventureworks/userdefined/Description.kt | 32 + .../adventureworks/userdefined/FirstName.kt | 12 +- .../adventureworks/userdefined/LastName.kt | 12 +- .../adventureworks/userdefined/MiddleName.kt | 12 +- .../userdefined/OnlineOrderFlag.kt | 12 +- .../userdefined/SalariedFlag.kt | 12 +- testers/pg/kotlin/gradle.properties | 1 - .../userdefined/CustomCreditcardId.kt | 16 +- .../test/kotlin/adventureworks/TupleInTest.kt | 6 +- .../kotlin/adventureworks/WithConnection.kt | 24 +- .../adventureworks/person/MultiRepoTest.kt | 2 +- .../production/product/SeekTest.kt | 2 +- .../adventureworks/public/UsersRepoTest.kt | 2 +- .../adventureworks/bridge/Customer.scala | 47 + .../customtypes/TypoBytea.scala | 2 +- 
.../customtypes/TypoHStore.scala | 2 +- .../adventureworks/customtypes/TypoInet.scala | 2 +- .../customtypes/TypoInstant.scala | 2 +- .../customtypes/TypoInt2Vector.scala | 2 +- .../adventureworks/customtypes/TypoJson.scala | 2 +- .../customtypes/TypoJsonb.scala | 2 +- .../customtypes/TypoLocalDate.scala | 2 +- .../customtypes/TypoLocalDateTime.scala | 2 +- .../customtypes/TypoLocalTime.scala | 2 +- .../customtypes/TypoMoney.scala | 2 +- .../customtypes/TypoOffsetTime.scala | 2 +- .../customtypes/TypoPolygon.scala | 2 +- .../customtypes/TypoRecord.scala | 2 +- .../customtypes/TypoShort.scala | 2 +- .../adventureworks/customtypes/TypoUUID.scala | 2 +- .../customtypes/TypoUnknownCitext.scala | 2 +- .../customtypes/TypoVector.scala | 2 +- .../adventureworks/customtypes/TypoXml.scala | 2 +- .../department/DepartmentId.scala | 2 +- .../humanresources/shift/ShiftId.scala | 2 +- .../information_schema/CardinalNumber.scala | 2 +- .../information_schema/CharacterData.scala | 2 +- .../information_schema/SqlIdentifier.scala | 2 +- .../information_schema/TimeStamp.scala | 2 +- .../information_schema/YesOrNo.scala | 2 +- .../person/address/AddressId.scala | 2 +- .../person/addresstype/AddresstypeId.scala | 2 +- .../businessentity/BusinessentityId.scala | 2 +- .../countryregion/CountryregionId.scala | 2 +- .../stateprovince/StateprovinceId.scala | 2 +- .../production/product/ProductId.scala | 2 +- .../productcategory/ProductcategoryId.scala | 2 +- .../productmodel/ProductmodelId.scala | 2 +- .../ProductsubcategoryId.scala | 2 +- .../unitmeasure/UnitmeasureId.scala | 2 +- .../adventureworks/public/AccountNumber.scala | 2 +- .../adventureworks/public/Flag.scala | 2 +- .../adventureworks/public/Mydomain.scala | 2 +- .../adventureworks/public/Myenum.scala | 4 +- .../adventureworks/public/Name.scala | 2 +- .../adventureworks/public/NameStyle.scala | 2 +- .../adventureworks/public/OrderNumber.scala | 2 +- .../adventureworks/public/Phone.scala | 2 +- 
.../adventureworks/public/ShortText.scala | 2 +- .../public/identity_test/IdentityTestId.scala | 2 +- .../precision_types/PrecisionTypesId.scala | 2 +- .../PrecisionTypesNullId.scala | 2 +- .../adventureworks/public/users/UsersId.scala | 2 +- .../salesterritory/SalesterritoryId.scala | 2 +- .../userdefined/ActiveFlag.scala | 2 +- .../userdefined/CurrentFlag.scala | 2 +- .../userdefined/Description.scala | 49 + .../userdefined/FirstName.scala | 2 +- .../adventureworks/userdefined/LastName.scala | 2 +- .../userdefined/MiddleName.scala | 2 +- .../userdefined/OnlineOrderFlag.scala | 2 +- .../userdefined/SalariedFlag.scala | 2 +- .../adventureworks/bridge/Customer.scala | 47 + .../customtypes/TypoBytea.scala | 2 +- .../customtypes/TypoHStore.scala | 2 +- .../adventureworks/customtypes/TypoInet.scala | 2 +- .../customtypes/TypoInstant.scala | 2 +- .../customtypes/TypoInt2Vector.scala | 2 +- .../adventureworks/customtypes/TypoJson.scala | 2 +- .../customtypes/TypoJsonb.scala | 2 +- .../customtypes/TypoLocalDate.scala | 2 +- .../customtypes/TypoLocalDateTime.scala | 2 +- .../customtypes/TypoLocalTime.scala | 2 +- .../customtypes/TypoMoney.scala | 2 +- .../customtypes/TypoOffsetTime.scala | 2 +- .../customtypes/TypoPolygon.scala | 2 +- .../customtypes/TypoRecord.scala | 2 +- .../customtypes/TypoShort.scala | 2 +- .../adventureworks/customtypes/TypoUUID.scala | 2 +- .../customtypes/TypoUnknownCitext.scala | 2 +- .../customtypes/TypoVector.scala | 2 +- .../adventureworks/customtypes/TypoXml.scala | 2 +- .../department/DepartmentId.scala | 2 +- .../humanresources/shift/ShiftId.scala | 2 +- .../information_schema/CardinalNumber.scala | 2 +- .../information_schema/CharacterData.scala | 2 +- .../information_schema/SqlIdentifier.scala | 2 +- .../information_schema/TimeStamp.scala | 2 +- .../information_schema/YesOrNo.scala | 2 +- .../person/address/AddressId.scala | 2 +- .../person/addresstype/AddresstypeId.scala | 2 +- .../businessentity/BusinessentityId.scala | 2 +- 
.../countryregion/CountryregionId.scala | 2 +- .../stateprovince/StateprovinceId.scala | 2 +- .../production/product/ProductId.scala | 2 +- .../productcategory/ProductcategoryId.scala | 2 +- .../productmodel/ProductmodelId.scala | 2 +- .../ProductsubcategoryId.scala | 2 +- .../unitmeasure/UnitmeasureId.scala | 2 +- .../adventureworks/public/AccountNumber.scala | 2 +- .../adventureworks/public/Flag.scala | 2 +- .../adventureworks/public/Mydomain.scala | 2 +- .../adventureworks/public/Myenum.scala | 25 +- .../adventureworks/public/Name.scala | 2 +- .../adventureworks/public/NameStyle.scala | 2 +- .../adventureworks/public/OrderNumber.scala | 2 +- .../adventureworks/public/Phone.scala | 2 +- .../adventureworks/public/ShortText.scala | 2 +- .../public/identity_test/IdentityTestId.scala | 2 +- .../precision_types/PrecisionTypesId.scala | 2 +- .../PrecisionTypesNullId.scala | 2 +- .../adventureworks/public/users/UsersId.scala | 2 +- .../salesterritory/SalesterritoryId.scala | 2 +- .../userdefined/ActiveFlag.scala | 2 +- .../userdefined/CurrentFlag.scala | 2 +- .../userdefined/Description.scala | 49 + .../userdefined/FirstName.scala | 2 +- .../adventureworks/userdefined/LastName.scala | 2 +- .../userdefined/MiddleName.scala | 2 +- .../userdefined/OnlineOrderFlag.scala | 2 +- .../userdefined/SalariedFlag.scala | 2 +- .../scala/adventureworks/withConnection.scala | 15 +- .../adventureworks/bridge/Customer.scala | 22 + .../customtypes/TypoBytea.scala | 2 +- .../customtypes/TypoHStore.scala | 2 +- .../adventureworks/customtypes/TypoInet.scala | 2 +- .../customtypes/TypoInstant.scala | 2 +- .../customtypes/TypoInt2Vector.scala | 2 +- .../adventureworks/customtypes/TypoJson.scala | 2 +- .../customtypes/TypoJsonb.scala | 2 +- .../customtypes/TypoLocalDate.scala | 2 +- .../customtypes/TypoLocalDateTime.scala | 2 +- .../customtypes/TypoLocalTime.scala | 2 +- .../customtypes/TypoMoney.scala | 2 +- .../customtypes/TypoOffsetTime.scala | 2 +- .../customtypes/TypoPolygon.scala | 2 +- 
.../customtypes/TypoRecord.scala | 2 +- .../customtypes/TypoShort.scala | 2 +- .../adventureworks/customtypes/TypoUUID.scala | 2 +- .../customtypes/TypoUnknownCitext.scala | 2 +- .../customtypes/TypoVector.scala | 2 +- .../adventureworks/customtypes/TypoXml.scala | 2 +- .../department/DepartmentId.scala | 2 +- .../humanresources/shift/ShiftId.scala | 2 +- .../information_schema/CardinalNumber.scala | 2 +- .../information_schema/CharacterData.scala | 2 +- .../information_schema/SqlIdentifier.scala | 2 +- .../information_schema/TimeStamp.scala | 2 +- .../information_schema/YesOrNo.scala | 2 +- .../person/address/AddressId.scala | 2 +- .../person/addresstype/AddresstypeId.scala | 2 +- .../businessentity/BusinessentityId.scala | 2 +- .../countryregion/CountryregionId.scala | 2 +- .../stateprovince/StateprovinceId.scala | 2 +- .../production/product/ProductId.scala | 2 +- .../productcategory/ProductcategoryId.scala | 2 +- .../productmodel/ProductmodelId.scala | 2 +- .../ProductsubcategoryId.scala | 2 +- .../unitmeasure/UnitmeasureId.scala | 2 +- .../adventureworks/public/AccountNumber.scala | 2 +- .../adventureworks/public/Flag.scala | 2 +- .../adventureworks/public/Mydomain.scala | 2 +- .../adventureworks/public/Myenum.scala | 4 +- .../adventureworks/public/Name.scala | 2 +- .../adventureworks/public/NameStyle.scala | 2 +- .../adventureworks/public/OrderNumber.scala | 2 +- .../adventureworks/public/Phone.scala | 2 +- .../adventureworks/public/ShortText.scala | 2 +- .../public/identity_test/IdentityTestId.scala | 2 +- .../precision_types/PrecisionTypesId.scala | 2 +- .../PrecisionTypesNullId.scala | 2 +- .../adventureworks/public/users/UsersId.scala | 2 +- .../salesterritory/SalesterritoryId.scala | 2 +- .../userdefined/ActiveFlag.scala | 2 +- .../userdefined/CurrentFlag.scala | 2 +- .../userdefined/Description.scala | 42 + .../userdefined/FirstName.scala | 2 +- .../adventureworks/userdefined/LastName.scala | 2 +- .../userdefined/MiddleName.scala | 2 +- 
.../userdefined/OnlineOrderFlag.scala | 2 +- .../userdefined/SalariedFlag.scala | 2 +- .../adventureworks/bridge/Customer.scala | 22 + .../customtypes/TypoBytea.scala | 2 +- .../customtypes/TypoHStore.scala | 2 +- .../adventureworks/customtypes/TypoInet.scala | 2 +- .../customtypes/TypoInstant.scala | 2 +- .../customtypes/TypoInt2Vector.scala | 2 +- .../adventureworks/customtypes/TypoJson.scala | 2 +- .../customtypes/TypoJsonb.scala | 2 +- .../customtypes/TypoLocalDate.scala | 2 +- .../customtypes/TypoLocalDateTime.scala | 2 +- .../customtypes/TypoLocalTime.scala | 2 +- .../customtypes/TypoMoney.scala | 2 +- .../customtypes/TypoOffsetTime.scala | 2 +- .../customtypes/TypoPolygon.scala | 2 +- .../customtypes/TypoRecord.scala | 2 +- .../customtypes/TypoShort.scala | 2 +- .../adventureworks/customtypes/TypoUUID.scala | 2 +- .../customtypes/TypoUnknownCitext.scala | 2 +- .../customtypes/TypoVector.scala | 2 +- .../adventureworks/customtypes/TypoXml.scala | 2 +- .../department/DepartmentId.scala | 2 +- .../humanresources/shift/ShiftId.scala | 2 +- .../information_schema/CardinalNumber.scala | 2 +- .../information_schema/CharacterData.scala | 2 +- .../information_schema/SqlIdentifier.scala | 2 +- .../information_schema/TimeStamp.scala | 2 +- .../information_schema/YesOrNo.scala | 2 +- .../person/address/AddressId.scala | 2 +- .../person/addresstype/AddresstypeId.scala | 2 +- .../businessentity/BusinessentityId.scala | 2 +- .../countryregion/CountryregionId.scala | 2 +- .../stateprovince/StateprovinceId.scala | 2 +- .../production/product/ProductId.scala | 2 +- .../productcategory/ProductcategoryId.scala | 2 +- .../productmodel/ProductmodelId.scala | 2 +- .../ProductsubcategoryId.scala | 2 +- .../unitmeasure/UnitmeasureId.scala | 2 +- .../adventureworks/public/AccountNumber.scala | 2 +- .../adventureworks/public/Flag.scala | 2 +- .../adventureworks/public/Mydomain.scala | 2 +- .../adventureworks/public/Myenum.scala | 25 +- .../adventureworks/public/Name.scala | 2 +- 
.../adventureworks/public/NameStyle.scala | 2 +- .../adventureworks/public/OrderNumber.scala | 2 +- .../adventureworks/public/Phone.scala | 2 +- .../adventureworks/public/ShortText.scala | 2 +- .../public/identity_test/IdentityTestId.scala | 2 +- .../precision_types/PrecisionTypesId.scala | 2 +- .../PrecisionTypesNullId.scala | 2 +- .../adventureworks/public/users/UsersId.scala | 2 +- .../salesterritory/SalesterritoryId.scala | 2 +- .../userdefined/ActiveFlag.scala | 2 +- .../userdefined/CurrentFlag.scala | 2 +- .../userdefined/Description.scala | 42 + .../userdefined/FirstName.scala | 2 +- .../adventureworks/userdefined/LastName.scala | 2 +- .../userdefined/MiddleName.scala | 2 +- .../userdefined/OnlineOrderFlag.scala | 2 +- .../userdefined/SalariedFlag.scala | 2 +- .../adventureworks/TestInsert.scala | 170 +- .../adventureworks/bridge/Customer.scala | 15 + .../department/DepartmentFields.scala | 20 +- .../department/DepartmentId.scala | 8 +- .../department/DepartmentRepo.scala | 19 +- .../department/DepartmentRepoImpl.scala | 109 +- .../department/DepartmentRepoMock.scala | 35 +- .../department/DepartmentRow.scala | 8 +- .../department/DepartmentRowUnsaved.scala | 2 +- .../employee/EmployeeFields.scala | 24 +- .../employee/EmployeeRepo.scala | 19 +- .../employee/EmployeeRepoImpl.scala | 165 +- .../employee/EmployeeRepoMock.scala | 35 +- .../humanresources/employee/EmployeeRow.scala | 8 +- .../employee/EmployeeRowUnsaved.scala | 2 +- .../EmployeedepartmenthistoryFields.scala | 26 +- .../EmployeedepartmenthistoryId.scala | 6 +- .../EmployeedepartmenthistoryRepo.scala | 19 +- .../EmployeedepartmenthistoryRepoImpl.scala | 135 +- .../EmployeedepartmenthistoryRepoMock.scala | 35 +- .../EmployeedepartmenthistoryRow.scala | 10 +- .../EmployeedepartmenthistoryRowUnsaved.scala | 2 +- .../humanresources/shift/ShiftFields.scala | 20 +- .../humanresources/shift/ShiftId.scala | 8 +- .../humanresources/shift/ShiftRepo.scala | 19 +- .../humanresources/shift/ShiftRepoImpl.scala | 
115 +- .../humanresources/shift/ShiftRepoMock.scala | 35 +- .../humanresources/shift/ShiftRow.scala | 8 +- .../shift/ShiftRowUnsaved.scala | 2 +- .../vemployee/VemployeeViewFields.scala | 18 +- .../vemployee/VemployeeViewRepo.scala | 6 +- .../vemployee/VemployeeViewRepoImpl.scala | 16 +- .../vemployee/VemployeeViewRow.scala | 6 +- .../information_schema/CardinalNumber.scala | 8 +- .../information_schema/CharacterData.scala | 8 +- .../information_schema/SqlIdentifier.scala | 8 +- .../information_schema/TimeStamp.scala | 8 +- .../information_schema/YesOrNo.scala | 8 +- .../person/address/AddressFields.scala | 24 +- .../person/address/AddressId.scala | 8 +- .../person/address/AddressRepo.scala | 19 +- .../person/address/AddressRepoImpl.scala | 137 +- .../person/address/AddressRepoMock.scala | 35 +- .../person/address/AddressRow.scala | 8 +- .../person/address/AddressRowUnsaved.scala | 2 +- .../addresstype/AddresstypeFields.scala | 20 +- .../person/addresstype/AddresstypeId.scala | 8 +- .../person/addresstype/AddresstypeRepo.scala | 19 +- .../addresstype/AddresstypeRepoImpl.scala | 107 +- .../addresstype/AddresstypeRepoMock.scala | 35 +- .../person/addresstype/AddresstypeRow.scala | 8 +- .../addresstype/AddresstypeRowUnsaved.scala | 2 +- .../businessentity/BusinessentityFields.scala | 20 +- .../businessentity/BusinessentityId.scala | 8 +- .../businessentity/BusinessentityRepo.scala | 19 +- .../BusinessentityRepoImpl.scala | 103 +- .../BusinessentityRepoMock.scala | 35 +- .../businessentity/BusinessentityRow.scala | 8 +- .../BusinessentityRowUnsaved.scala | 2 +- .../BusinessentityaddressFields.scala | 24 +- .../BusinessentityaddressId.scala | 6 +- .../BusinessentityaddressRepo.scala | 19 +- .../BusinessentityaddressRepoImpl.scala | 125 +- .../BusinessentityaddressRepoMock.scala | 35 +- .../BusinessentityaddressRow.scala | 10 +- .../BusinessentityaddressRowUnsaved.scala | 2 +- .../countryregion/CountryregionFields.scala | 20 +- .../countryregion/CountryregionId.scala | 8 
+- .../countryregion/CountryregionRepo.scala | 19 +- .../countryregion/CountryregionRepoImpl.scala | 105 +- .../countryregion/CountryregionRepoMock.scala | 35 +- .../countryregion/CountryregionRow.scala | 8 +- .../CountryregionRowUnsaved.scala | 2 +- .../emailaddress/EmailaddressFields.scala | 26 +- .../person/emailaddress/EmailaddressId.scala | 6 +- .../emailaddress/EmailaddressRepo.scala | 19 +- .../emailaddress/EmailaddressRepoImpl.scala | 121 +- .../emailaddress/EmailaddressRepoMock.scala | 35 +- .../person/emailaddress/EmailaddressRow.scala | 10 +- .../emailaddress/EmailaddressRowUnsaved.scala | 2 +- .../person/password/PasswordFields.scala | 22 +- .../person/password/PasswordRepo.scala | 19 +- .../person/password/PasswordRepoImpl.scala | 115 +- .../person/password/PasswordRepoMock.scala | 35 +- .../person/password/PasswordRow.scala | 8 +- .../person/password/PasswordRowUnsaved.scala | 2 +- .../person/person/PersonFields.scala | 24 +- .../person/person/PersonRepo.scala | 19 +- .../person/person/PersonRepoImpl.scala | 159 +- .../person/person/PersonRepoMock.scala | 35 +- .../person/person/PersonRow.scala | 8 +- .../person/person/PersonRowUnsaved.scala | 2 +- .../stateprovince/StateprovinceFields.scala | 22 +- .../stateprovince/StateprovinceId.scala | 8 +- .../stateprovince/StateprovinceRepo.scala | 19 +- .../stateprovince/StateprovinceRepoImpl.scala | 129 +- .../stateprovince/StateprovinceRepoMock.scala | 35 +- .../stateprovince/StateprovinceRow.scala | 8 +- .../StateprovinceRowUnsaved.scala | 2 +- .../person_detail/PersonDetailSqlRepo.scala | 4 +- .../PersonDetailSqlRepoImpl.scala | 12 +- .../person_detail/PersonDetailSqlRow.scala | 6 +- .../person_dynamic/PersonDynamicSqlRepo.scala | 4 +- .../PersonDynamicSqlRepoImpl.scala | 12 +- .../person_dynamic/PersonDynamicSqlRow.scala | 6 +- .../PersonRowJoinSqlRepo.scala | 4 +- .../PersonRowJoinSqlRepoImpl.scala | 10 +- .../person_row_join/PersonRowJoinSqlRow.scala | 16 +- .../precisetypes/PaddedString10.scala | 10 +- 
.../precisetypes/PaddedString3.scala | 10 +- .../precisetypes/String10.scala | 10 +- .../precisetypes/String100.scala | 10 +- .../precisetypes/String20.scala | 10 +- .../precisetypes/String255.scala | 10 +- .../precisetypes/String50.scala | 10 +- .../production/product/ProductFields.scala | 24 +- .../production/product/ProductId.scala | 8 +- .../production/product/ProductRepo.scala | 19 +- .../production/product/ProductRepoImpl.scala | 229 +- .../production/product/ProductRepoMock.scala | 35 +- .../production/product/ProductRow.scala | 14 +- .../product/ProductRowUnsaved.scala | 2 +- .../ProductcategoryFields.scala | 20 +- .../productcategory/ProductcategoryId.scala | 8 +- .../productcategory/ProductcategoryRepo.scala | 19 +- .../ProductcategoryRepoImpl.scala | 107 +- .../ProductcategoryRepoMock.scala | 35 +- .../productcategory/ProductcategoryRow.scala | 8 +- .../ProductcategoryRowUnsaved.scala | 2 +- .../ProductcosthistoryFields.scala | 26 +- .../ProductcosthistoryId.scala | 6 +- .../ProductcosthistoryRepo.scala | 19 +- .../ProductcosthistoryRepoImpl.scala | 125 +- .../ProductcosthistoryRepoMock.scala | 35 +- .../ProductcosthistoryRow.scala | 10 +- .../ProductcosthistoryRowUnsaved.scala | 2 +- .../productmodel/ProductmodelFields.scala | 22 +- .../productmodel/ProductmodelId.scala | 8 +- .../productmodel/ProductmodelRepo.scala | 19 +- .../productmodel/ProductmodelRepoImpl.scala | 119 +- .../productmodel/ProductmodelRepoMock.scala | 35 +- .../productmodel/ProductmodelRow.scala | 8 +- .../productmodel/ProductmodelRowUnsaved.scala | 2 +- .../ProductsubcategoryFields.scala | 22 +- .../ProductsubcategoryId.scala | 8 +- .../ProductsubcategoryRepo.scala | 19 +- .../ProductsubcategoryRepoImpl.scala | 113 +- .../ProductsubcategoryRepoMock.scala | 35 +- .../ProductsubcategoryRow.scala | 8 +- .../ProductsubcategoryRowUnsaved.scala | 2 +- .../unitmeasure/UnitmeasureFields.scala | 20 +- .../unitmeasure/UnitmeasureId.scala | 8 +- .../unitmeasure/UnitmeasureRepo.scala | 19 +- 
.../unitmeasure/UnitmeasureRepoImpl.scala | 105 +- .../unitmeasure/UnitmeasureRepoMock.scala | 35 +- .../unitmeasure/UnitmeasureRow.scala | 8 +- .../unitmeasure/UnitmeasureRowUnsaved.scala | 2 +- .../adventureworks/public/AccountNumber.scala | 8 +- .../adventureworks/public/Address.scala | 9 +- .../public/AllTypesComposite.scala | 9 +- .../adventureworks/public/Complex.scala | 9 +- .../adventureworks/public/ContactInfo.scala | 9 +- .../public/EmployeeRecord.scala | 9 +- .../adventureworks/public/Flag.scala | 8 +- .../adventureworks/public/InventoryItem.scala | 13 +- .../public/MetadataRecord.scala | 9 +- .../adventureworks/public/Mydomain.scala | 8 +- .../adventureworks/public/Myenum.scala | 38 +- .../adventureworks/public/Name.scala | 8 +- .../adventureworks/public/NameStyle.scala | 8 +- .../adventureworks/public/NullableTest.scala | 9 +- .../adventureworks/public/OrderNumber.scala | 8 +- .../adventureworks/public/PersonName.scala | 9 +- .../adventureworks/public/Phone.scala | 8 +- .../adventureworks/public/Point2d.scala | 9 +- .../adventureworks/public/PolygonCustom.scala | 11 +- .../adventureworks/public/ShortText.scala | 8 +- .../public/TablefuncCrosstab2.scala | 9 +- .../public/TablefuncCrosstab3.scala | 9 +- .../public/TablefuncCrosstab4.scala | 9 +- .../public/TextWithSpecialChars.scala | 9 +- .../adventureworks/public/TreeNode.scala | 9 +- .../public/flaff/FlaffFields.scala | 24 +- .../adventureworks/public/flaff/FlaffId.scala | 6 +- .../public/flaff/FlaffRepo.scala | 19 +- .../public/flaff/FlaffRepoImpl.scala | 103 +- .../public/flaff/FlaffRepoMock.scala | 35 +- .../public/flaff/FlaffRow.scala | 10 +- .../identity_test/IdentityTestFields.scala | 20 +- .../public/identity_test/IdentityTestId.scala | 8 +- .../identity_test/IdentityTestRepo.scala | 19 +- .../identity_test/IdentityTestRepoImpl.scala | 99 +- .../identity_test/IdentityTestRepoMock.scala | 35 +- .../identity_test/IdentityTestRow.scala | 8 +- .../IdentityTestRowUnsaved.scala | 2 +- 
.../public/issue142/Issue142Fields.scala | 18 +- .../public/issue142/Issue142Id.scala | 9 +- .../public/issue142/Issue142Repo.scala | 19 +- .../public/issue142/Issue142RepoImpl.scala | 79 +- .../public/issue142/Issue142RepoMock.scala | 35 +- .../public/issue142/Issue142Row.scala | 8 +- .../public/issue142_2/Issue1422Fields.scala | 20 +- .../public/issue142_2/Issue1422Repo.scala | 19 +- .../public/issue142_2/Issue1422RepoImpl.scala | 79 +- .../public/issue142_2/Issue1422RepoMock.scala | 35 +- .../public/issue142_2/Issue1422Row.scala | 8 +- .../only_pk_columns/OnlyPkColumnsFields.scala | 20 +- .../only_pk_columns/OnlyPkColumnsId.scala | 6 +- .../only_pk_columns/OnlyPkColumnsRepo.scala | 19 +- .../OnlyPkColumnsRepoImpl.scala | 89 +- .../OnlyPkColumnsRepoMock.scala | 35 +- .../only_pk_columns/OnlyPkColumnsRow.scala | 10 +- .../public/pgtest/PgtestFields.scala | 284 +- .../public/pgtest/PgtestRepo.scala | 11 +- .../public/pgtest/PgtestRepoImpl.scala | 33 +- .../public/pgtest/PgtestRow.scala | 214 +- .../public/pgtestnull/PgtestnullFields.scala | 284 +- .../public/pgtestnull/PgtestnullRepo.scala | 11 +- .../pgtestnull/PgtestnullRepoImpl.scala | 33 +- .../public/pgtestnull/PgtestnullRow.scala | 214 +- .../PrecisionTypesFields.scala | 20 +- .../precision_types/PrecisionTypesId.scala | 8 +- .../precision_types/PrecisionTypesRepo.scala | 19 +- .../PrecisionTypesRepoImpl.scala | 237 +- .../PrecisionTypesRepoMock.scala | 35 +- .../precision_types/PrecisionTypesRow.scala | 14 +- .../PrecisionTypesRowUnsaved.scala | 2 +- .../PrecisionTypesNullFields.scala | 20 +- .../PrecisionTypesNullId.scala | 8 +- .../PrecisionTypesNullRepo.scala | 19 +- .../PrecisionTypesNullRepoImpl.scala | 237 +- .../PrecisionTypesNullRepoMock.scala | 35 +- .../PrecisionTypesNullRow.scala | 14 +- .../PrecisionTypesNullRowUnsaved.scala | 2 +- .../public/title/TitleFields.scala | 18 +- .../adventureworks/public/title/TitleId.scala | 9 +- .../public/title/TitleRepo.scala | 19 +- 
.../public/title/TitleRepoImpl.scala | 79 +- .../public/title/TitleRepoMock.scala | 35 +- .../public/title/TitleRow.scala | 8 +- .../title_domain/TitleDomainFields.scala | 18 +- .../public/title_domain/TitleDomainId.scala | 9 +- .../public/title_domain/TitleDomainRepo.scala | 19 +- .../title_domain/TitleDomainRepoImpl.scala | 79 +- .../title_domain/TitleDomainRepoMock.scala | 35 +- .../public/title_domain/TitleDomainRow.scala | 8 +- .../titledperson/TitledpersonFields.scala | 20 +- .../titledperson/TitledpersonRepo.scala | 11 +- .../titledperson/TitledpersonRepoImpl.scala | 33 +- .../public/titledperson/TitledpersonRow.scala | 8 +- .../public/users/UsersFields.scala | 22 +- .../adventureworks/public/users/UsersId.scala | 8 +- .../public/users/UsersRepo.scala | 21 +- .../public/users/UsersRepoImpl.scala | 137 +- .../public/users/UsersRepoMock.scala | 37 +- .../public/users/UsersRow.scala | 8 +- .../public/users/UsersRowUnsaved.scala | 2 +- .../sales/salesperson/SalespersonFields.scala | 24 +- .../sales/salesperson/SalespersonRepo.scala | 19 +- .../salesperson/SalespersonRepoImpl.scala | 131 +- .../salesperson/SalespersonRepoMock.scala | 35 +- .../sales/salesperson/SalespersonRow.scala | 8 +- .../salesperson/SalespersonRowUnsaved.scala | 2 +- .../salesterritory/SalesterritoryFields.scala | 22 +- .../salesterritory/SalesterritoryId.scala | 8 +- .../salesterritory/SalesterritoryRepo.scala | 19 +- .../SalesterritoryRepoImpl.scala | 135 +- .../SalesterritoryRepoMock.scala | 35 +- .../salesterritory/SalesterritoryRow.scala | 8 +- .../SalesterritoryRowUnsaved.scala | 2 +- .../update_person/UpdatePersonSqlRepo.scala | 2 +- .../UpdatePersonSqlRepoImpl.scala | 10 +- .../UpdatePersonReturningSqlRepo.scala | 4 +- .../UpdatePersonReturningSqlRepoImpl.scala | 14 +- .../UpdatePersonReturningSqlRow.scala | 6 +- .../userdefined/ActiveFlag.scala | 8 +- .../userdefined/CurrentFlag.scala | 8 +- .../userdefined/Description.scala | 24 + .../userdefined/FirstName.scala | 8 +- 
.../adventureworks/userdefined/LastName.scala | 8 +- .../userdefined/MiddleName.scala | 8 +- .../userdefined/OnlineOrderFlag.scala | 8 +- .../userdefined/SalariedFlag.scala | 8 +- .../src/scala/adventureworks/SeekDbTest.scala | 4 +- .../scala/adventureworks/TupleInDslTest.scala | 4 +- .../scala/adventureworks/WithConnection.scala | 16 +- .../employee/EmployeeTest.scala | 2 +- .../adventureworks/person/MultiRepoTest.scala | 2 +- .../production/product/SeekTest.scala | 2 +- .../productcosthistory/CompositeIdsTest.scala | 4 +- .../adventureworks/public/OpenEnumTest.scala | 2 +- .../public_/users/UsersRepoTest.scala | 19 +- .../userdefined/CustomCreditcardId.scala | 7 +- .../adventureworks/TestInsert.scala | 170 +- .../adventureworks/bridge/Customer.scala | 15 + .../department/DepartmentFields.scala | 30 +- .../department/DepartmentId.scala | 13 +- .../department/DepartmentRepo.scala | 19 +- .../department/DepartmentRepoImpl.scala | 78 +- .../department/DepartmentRepoMock.scala | 31 +- .../department/DepartmentRow.scala | 10 +- .../department/DepartmentRowUnsaved.scala | 4 +- .../employee/EmployeeFields.scala | 57 +- .../employee/EmployeeRepo.scala | 19 +- .../employee/EmployeeRepoImpl.scala | 112 +- .../employee/EmployeeRepoMock.scala | 31 +- .../humanresources/employee/EmployeeRow.scala | 12 +- .../employee/EmployeeRowUnsaved.scala | 6 +- .../EmployeedepartmenthistoryFields.scala | 40 +- .../EmployeedepartmenthistoryId.scala | 8 +- .../EmployeedepartmenthistoryRepo.scala | 19 +- .../EmployeedepartmenthistoryRepoImpl.scala | 109 +- .../EmployeedepartmenthistoryRepoMock.scala | 31 +- .../EmployeedepartmenthistoryRow.scala | 13 +- .../EmployeedepartmenthistoryRowUnsaved.scala | 5 +- .../humanresources/shift/ShiftFields.scala | 32 +- .../humanresources/shift/ShiftId.scala | 13 +- .../humanresources/shift/ShiftRepo.scala | 19 +- .../humanresources/shift/ShiftRepoImpl.scala | 80 +- .../humanresources/shift/ShiftRepoMock.scala | 31 +- .../humanresources/shift/ShiftRow.scala 
| 10 +- .../shift/ShiftRowUnsaved.scala | 4 +- .../vemployee/VemployeeViewFields.scala | 57 +- .../vemployee/VemployeeViewRepo.scala | 6 +- .../vemployee/VemployeeViewRepoImpl.scala | 14 +- .../vemployee/VemployeeViewRow.scala | 9 +- .../information_schema/CardinalNumber.scala | 13 +- .../information_schema/CharacterData.scala | 12 +- .../information_schema/SqlIdentifier.scala | 12 +- .../information_schema/TimeStamp.scala | 12 +- .../information_schema/YesOrNo.scala | 12 +- .../person/address/AddressFields.scala | 44 +- .../person/address/AddressId.scala | 13 +- .../person/address/AddressRepo.scala | 19 +- .../person/address/AddressRepoImpl.scala | 101 +- .../person/address/AddressRepoMock.scala | 31 +- .../person/address/AddressRow.scala | 11 +- .../person/address/AddressRowUnsaved.scala | 5 +- .../addresstype/AddresstypeFields.scala | 30 +- .../person/addresstype/AddresstypeId.scala | 13 +- .../person/addresstype/AddresstypeRepo.scala | 19 +- .../addresstype/AddresstypeRepoImpl.scala | 78 +- .../addresstype/AddresstypeRepoMock.scala | 31 +- .../person/addresstype/AddresstypeRow.scala | 10 +- .../addresstype/AddresstypeRowUnsaved.scala | 4 +- .../businessentity/BusinessentityFields.scala | 28 +- .../businessentity/BusinessentityId.scala | 13 +- .../businessentity/BusinessentityRepo.scala | 19 +- .../BusinessentityRepoImpl.scala | 76 +- .../BusinessentityRepoMock.scala | 31 +- .../businessentity/BusinessentityRow.scala | 10 +- .../BusinessentityRowUnsaved.scala | 4 +- .../BusinessentityaddressFields.scala | 36 +- .../BusinessentityaddressId.scala | 6 +- .../BusinessentityaddressRepo.scala | 19 +- .../BusinessentityaddressRepoImpl.scala | 94 +- .../BusinessentityaddressRepoMock.scala | 31 +- .../BusinessentityaddressRow.scala | 12 +- .../BusinessentityaddressRowUnsaved.scala | 4 +- .../countryregion/CountryregionFields.scala | 28 +- .../countryregion/CountryregionId.scala | 12 +- .../countryregion/CountryregionRepo.scala | 19 +- 
.../countryregion/CountryregionRepoImpl.scala | 76 +- .../countryregion/CountryregionRepoMock.scala | 31 +- .../countryregion/CountryregionRow.scala | 10 +- .../CountryregionRowUnsaved.scala | 4 +- .../emailaddress/EmailaddressFields.scala | 39 +- .../person/emailaddress/EmailaddressId.scala | 8 +- .../emailaddress/EmailaddressRepo.scala | 19 +- .../emailaddress/EmailaddressRepoImpl.scala | 100 +- .../emailaddress/EmailaddressRepoMock.scala | 31 +- .../person/emailaddress/EmailaddressRow.scala | 14 +- .../emailaddress/EmailaddressRowUnsaved.scala | 6 +- .../person/password/PasswordFields.scala | 34 +- .../person/password/PasswordRepo.scala | 19 +- .../person/password/PasswordRepoImpl.scala | 80 +- .../person/password/PasswordRepoMock.scala | 31 +- .../person/password/PasswordRow.scala | 10 +- .../person/password/PasswordRowUnsaved.scala | 4 +- .../person/person/PersonFields.scala | 53 +- .../person/person/PersonRepo.scala | 19 +- .../person/person/PersonRepoImpl.scala | 124 +- .../person/person/PersonRepoMock.scala | 31 +- .../person/person/PersonRow.scala | 12 +- .../person/person/PersonRowUnsaved.scala | 6 +- .../stateprovince/StateprovinceFields.scala | 40 +- .../stateprovince/StateprovinceId.scala | 13 +- .../stateprovince/StateprovinceRepo.scala | 19 +- .../stateprovince/StateprovinceRepoImpl.scala | 86 +- .../stateprovince/StateprovinceRepoMock.scala | 31 +- .../stateprovince/StateprovinceRow.scala | 10 +- .../StateprovinceRowUnsaved.scala | 4 +- .../person_detail/PersonDetailSqlRepo.scala | 4 +- .../PersonDetailSqlRepoImpl.scala | 12 +- .../person_detail/PersonDetailSqlRow.scala | 9 +- .../person_dynamic/PersonDynamicSqlRepo.scala | 4 +- .../PersonDynamicSqlRepoImpl.scala | 15 +- .../person_dynamic/PersonDynamicSqlRow.scala | 9 +- .../PersonRowJoinSqlRepo.scala | 4 +- .../PersonRowJoinSqlRepoImpl.scala | 8 +- .../person_row_join/PersonRowJoinSqlRow.scala | 19 +- .../precisetypes/PaddedString10.scala | 14 +- .../precisetypes/PaddedString3.scala | 14 +- 
.../precisetypes/String10.scala | 14 +- .../precisetypes/String100.scala | 14 +- .../precisetypes/String20.scala | 14 +- .../precisetypes/String255.scala | 14 +- .../precisetypes/String50.scala | 14 +- .../production/product/ProductFields.scala | 77 +- .../production/product/ProductId.scala | 13 +- .../production/product/ProductRepo.scala | 19 +- .../production/product/ProductRepoImpl.scala | 194 +- .../production/product/ProductRepoMock.scala | 31 +- .../production/product/ProductRow.scala | 12 +- .../product/ProductRowUnsaved.scala | 6 +- .../ProductcategoryFields.scala | 30 +- .../productcategory/ProductcategoryId.scala | 13 +- .../productcategory/ProductcategoryRepo.scala | 19 +- .../ProductcategoryRepoImpl.scala | 78 +- .../ProductcategoryRepoMock.scala | 31 +- .../productcategory/ProductcategoryRow.scala | 10 +- .../ProductcategoryRowUnsaved.scala | 4 +- .../ProductcosthistoryFields.scala | 39 +- .../ProductcosthistoryId.scala | 8 +- .../ProductcosthistoryRepo.scala | 19 +- .../ProductcosthistoryRepoImpl.scala | 104 +- .../ProductcosthistoryRepoMock.scala | 31 +- .../ProductcosthistoryRow.scala | 14 +- .../ProductcosthistoryRowUnsaved.scala | 6 +- .../productmodel/ProductmodelFields.scala | 36 +- .../productmodel/ProductmodelId.scala | 13 +- .../productmodel/ProductmodelRepo.scala | 19 +- .../productmodel/ProductmodelRepoImpl.scala | 95 +- .../productmodel/ProductmodelRepoMock.scala | 31 +- .../productmodel/ProductmodelRow.scala | 11 +- .../productmodel/ProductmodelRowUnsaved.scala | 5 +- .../ProductsubcategoryFields.scala | 34 +- .../ProductsubcategoryId.scala | 13 +- .../ProductsubcategoryRepo.scala | 19 +- .../ProductsubcategoryRepoImpl.scala | 80 +- .../ProductsubcategoryRepoMock.scala | 31 +- .../ProductsubcategoryRow.scala | 10 +- .../ProductsubcategoryRowUnsaved.scala | 4 +- .../unitmeasure/UnitmeasureFields.scala | 28 +- .../unitmeasure/UnitmeasureId.scala | 12 +- .../unitmeasure/UnitmeasureRepo.scala | 19 +- .../unitmeasure/UnitmeasureRepoImpl.scala 
| 76 +- .../unitmeasure/UnitmeasureRepoMock.scala | 31 +- .../unitmeasure/UnitmeasureRow.scala | 10 +- .../unitmeasure/UnitmeasureRowUnsaved.scala | 4 +- .../adventureworks/public/AccountNumber.scala | 12 +- .../adventureworks/public/Address.scala | 14 +- .../public/AllTypesComposite.scala | 15 +- .../adventureworks/public/Complex.scala | 14 +- .../adventureworks/public/ContactInfo.scala | 14 +- .../public/EmployeeRecord.scala | 15 +- .../adventureworks/public/Flag.scala | 13 +- .../adventureworks/public/InventoryItem.scala | 19 +- .../public/MetadataRecord.scala | 14 +- .../adventureworks/public/Mydomain.scala | 12 +- .../adventureworks/public/Myenum.scala | 42 +- .../adventureworks/public/Name.scala | 12 +- .../adventureworks/public/NameStyle.scala | 13 +- .../adventureworks/public/NullableTest.scala | 14 +- .../adventureworks/public/OrderNumber.scala | 12 +- .../adventureworks/public/PersonName.scala | 14 +- .../adventureworks/public/Phone.scala | 12 +- .../adventureworks/public/Point2d.scala | 14 +- .../adventureworks/public/PolygonCustom.scala | 16 +- .../adventureworks/public/ShortText.scala | 12 +- .../public/TablefuncCrosstab2.scala | 14 +- .../public/TablefuncCrosstab3.scala | 14 +- .../public/TablefuncCrosstab4.scala | 14 +- .../public/TextWithSpecialChars.scala | 14 +- .../adventureworks/public/TreeNode.scala | 15 +- .../public/flaff/FlaffFields.scala | 37 +- .../adventureworks/public/flaff/FlaffId.scala | 9 +- .../public/flaff/FlaffRepo.scala | 19 +- .../public/flaff/FlaffRepoImpl.scala | 92 +- .../public/flaff/FlaffRepoMock.scala | 31 +- .../public/flaff/FlaffRow.scala | 14 +- .../identity_test/IdentityTestFields.scala | 28 +- .../public/identity_test/IdentityTestId.scala | 12 +- .../identity_test/IdentityTestRepo.scala | 19 +- .../identity_test/IdentityTestRepoImpl.scala | 80 +- .../identity_test/IdentityTestRepoMock.scala | 31 +- .../identity_test/IdentityTestRow.scala | 10 +- .../IdentityTestRowUnsaved.scala | 4 +- 
.../public/issue142/Issue142Fields.scala | 20 +- .../public/issue142/Issue142Id.scala | 13 +- .../public/issue142/Issue142Repo.scala | 19 +- .../public/issue142/Issue142RepoImpl.scala | 62 +- .../public/issue142/Issue142RepoMock.scala | 31 +- .../public/issue142/Issue142Row.scala | 8 +- .../public/issue142_2/Issue1422Fields.scala | 22 +- .../public/issue142_2/Issue1422Repo.scala | 19 +- .../public/issue142_2/Issue1422RepoImpl.scala | 62 +- .../public/issue142_2/Issue1422RepoMock.scala | 31 +- .../public/issue142_2/Issue1422Row.scala | 8 +- .../only_pk_columns/OnlyPkColumnsFields.scala | 27 +- .../only_pk_columns/OnlyPkColumnsId.scala | 9 +- .../only_pk_columns/OnlyPkColumnsRepo.scala | 19 +- .../OnlyPkColumnsRepoImpl.scala | 79 +- .../OnlyPkColumnsRepoMock.scala | 31 +- .../only_pk_columns/OnlyPkColumnsRow.scala | 13 +- .../public/pgtest/PgtestFields.scala | 361 +- .../public/pgtest/PgtestRepo.scala | 11 +- .../public/pgtest/PgtestRepoImpl.scala | 35 +- .../public/pgtest/PgtestRow.scala | 145 +- .../public/pgtestnull/PgtestnullFields.scala | 361 +- .../public/pgtestnull/PgtestnullRepo.scala | 11 +- .../pgtestnull/PgtestnullRepoImpl.scala | 36 +- .../public/pgtestnull/PgtestnullRow.scala | 146 +- .../PrecisionTypesFields.scala | 73 +- .../precision_types/PrecisionTypesId.scala | 13 +- .../precision_types/PrecisionTypesRepo.scala | 19 +- .../PrecisionTypesRepoImpl.scala | 145 +- .../PrecisionTypesRepoMock.scala | 31 +- .../precision_types/PrecisionTypesRow.scala | 11 +- .../PrecisionTypesRowUnsaved.scala | 5 +- .../PrecisionTypesNullFields.scala | 73 +- .../PrecisionTypesNullId.scala | 13 +- .../PrecisionTypesNullRepo.scala | 19 +- .../PrecisionTypesNullRepoImpl.scala | 222 +- .../PrecisionTypesNullRepoMock.scala | 31 +- .../PrecisionTypesNullRow.scala | 12 +- .../PrecisionTypesNullRowUnsaved.scala | 6 +- .../public/title/TitleFields.scala | 20 +- .../adventureworks/public/title/TitleId.scala | 13 +- .../public/title/TitleRepo.scala | 19 +- 
.../public/title/TitleRepoImpl.scala | 62 +- .../public/title/TitleRepoMock.scala | 31 +- .../public/title/TitleRow.scala | 8 +- .../title_domain/TitleDomainFields.scala | 20 +- .../public/title_domain/TitleDomainId.scala | 11 +- .../public/title_domain/TitleDomainRepo.scala | 19 +- .../title_domain/TitleDomainRepoImpl.scala | 62 +- .../title_domain/TitleDomainRepoMock.scala | 31 +- .../public/title_domain/TitleDomainRow.scala | 8 +- .../titledperson/TitledpersonFields.scala | 28 +- .../titledperson/TitledpersonRepo.scala | 11 +- .../titledperson/TitledpersonRepoImpl.scala | 32 +- .../public/titledperson/TitledpersonRow.scala | 10 +- .../public/users/UsersFields.scala | 38 +- .../adventureworks/public/users/UsersId.scala | 12 +- .../public/users/UsersRepo.scala | 21 +- .../public/users/UsersRepoImpl.scala | 101 +- .../public/users/UsersRepoMock.scala | 33 +- .../public/users/UsersRow.scala | 11 +- .../public/users/UsersRowUnsaved.scala | 5 +- .../sales/salesperson/SalespersonFields.scala | 45 +- .../sales/salesperson/SalespersonRepo.scala | 19 +- .../salesperson/SalespersonRepoImpl.scala | 110 +- .../salesperson/SalespersonRepoMock.scala | 31 +- .../sales/salesperson/SalespersonRow.scala | 12 +- .../salesperson/SalespersonRowUnsaved.scala | 6 +- .../salesterritory/SalesterritoryFields.scala | 45 +- .../salesterritory/SalesterritoryId.scala | 13 +- .../salesterritory/SalesterritoryRepo.scala | 19 +- .../SalesterritoryRepoImpl.scala | 103 +- .../SalesterritoryRepoMock.scala | 31 +- .../salesterritory/SalesterritoryRow.scala | 11 +- .../SalesterritoryRowUnsaved.scala | 5 +- .../update_person/UpdatePersonSqlRepo.scala | 2 +- .../UpdatePersonSqlRepoImpl.scala | 11 +- .../UpdatePersonReturningSqlRepo.scala | 4 +- .../UpdatePersonReturningSqlRepoImpl.scala | 17 +- .../UpdatePersonReturningSqlRow.scala | 8 +- .../userdefined/ActiveFlag.scala | 10 +- .../userdefined/CurrentFlag.scala | 10 +- .../userdefined/Description.scala | 24 + .../userdefined/FirstName.scala | 10 +- 
.../adventureworks/userdefined/LastName.scala | 10 +- .../userdefined/MiddleName.scala | 10 +- .../userdefined/OnlineOrderFlag.scala | 10 +- .../userdefined/SalariedFlag.scala | 10 +- .../src/scala/adventureworks/SeekDbTest.scala | 10 +- .../scala/adventureworks/TupleInDslTest.scala | 2 +- .../scala/adventureworks/WithConnection.scala | 17 +- .../employee/EmployeeTest.scala | 2 +- .../adventureworks/person/MultiRepoTest.scala | 3 +- .../production/product/SeekTest.scala | 21 +- .../productcosthistory/CompositeIdsTest.scala | 2 +- .../public_/users/UsersRepoTest.scala | 23 +- .../userdefined/CustomCreditcardId.scala | 7 +- .../adventureworks/bridge/Customer.scala | 48 + .../customtypes/TypoBytea.scala | 2 +- .../customtypes/TypoHStore.scala | 2 +- .../adventureworks/customtypes/TypoInet.scala | 2 +- .../customtypes/TypoInstant.scala | 2 +- .../customtypes/TypoInt2Vector.scala | 2 +- .../adventureworks/customtypes/TypoJson.scala | 2 +- .../customtypes/TypoJsonb.scala | 2 +- .../customtypes/TypoLocalDate.scala | 2 +- .../customtypes/TypoLocalDateTime.scala | 2 +- .../customtypes/TypoLocalTime.scala | 2 +- .../customtypes/TypoMoney.scala | 2 +- .../customtypes/TypoOffsetTime.scala | 2 +- .../customtypes/TypoPolygon.scala | 2 +- .../customtypes/TypoRecord.scala | 2 +- .../customtypes/TypoShort.scala | 2 +- .../adventureworks/customtypes/TypoUUID.scala | 2 +- .../customtypes/TypoUnknownCitext.scala | 2 +- .../customtypes/TypoVector.scala | 2 +- .../adventureworks/customtypes/TypoXml.scala | 2 +- .../department/DepartmentId.scala | 2 +- .../humanresources/shift/ShiftId.scala | 2 +- .../information_schema/CardinalNumber.scala | 2 +- .../information_schema/CharacterData.scala | 2 +- .../information_schema/SqlIdentifier.scala | 2 +- .../information_schema/TimeStamp.scala | 2 +- .../information_schema/YesOrNo.scala | 2 +- .../person/address/AddressId.scala | 2 +- .../person/addresstype/AddresstypeId.scala | 2 +- .../businessentity/BusinessentityId.scala | 2 +- 
.../countryregion/CountryregionId.scala | 2 +- .../stateprovince/StateprovinceId.scala | 2 +- .../production/product/ProductId.scala | 2 +- .../productcategory/ProductcategoryId.scala | 2 +- .../productmodel/ProductmodelId.scala | 2 +- .../ProductsubcategoryId.scala | 2 +- .../unitmeasure/UnitmeasureId.scala | 2 +- .../adventureworks/public/AccountNumber.scala | 2 +- .../adventureworks/public/Flag.scala | 2 +- .../adventureworks/public/Mydomain.scala | 2 +- .../adventureworks/public/Myenum.scala | 4 +- .../adventureworks/public/Name.scala | 2 +- .../adventureworks/public/NameStyle.scala | 2 +- .../adventureworks/public/OrderNumber.scala | 2 +- .../adventureworks/public/Phone.scala | 2 +- .../adventureworks/public/ShortText.scala | 2 +- .../public/identity_test/IdentityTestId.scala | 2 +- .../precision_types/PrecisionTypesId.scala | 2 +- .../PrecisionTypesNullId.scala | 2 +- .../adventureworks/public/users/UsersId.scala | 2 +- .../salesterritory/SalesterritoryId.scala | 2 +- .../userdefined/ActiveFlag.scala | 2 +- .../userdefined/CurrentFlag.scala | 2 +- .../userdefined/Description.scala | 49 + .../userdefined/FirstName.scala | 2 +- .../adventureworks/userdefined/LastName.scala | 2 +- .../userdefined/MiddleName.scala | 2 +- .../userdefined/OnlineOrderFlag.scala | 2 +- .../userdefined/SalariedFlag.scala | 2 +- .../adventureworks/bridge/Customer.scala | 48 + .../customtypes/TypoBytea.scala | 2 +- .../customtypes/TypoHStore.scala | 2 +- .../adventureworks/customtypes/TypoInet.scala | 2 +- .../customtypes/TypoInstant.scala | 2 +- .../customtypes/TypoInt2Vector.scala | 2 +- .../adventureworks/customtypes/TypoJson.scala | 2 +- .../customtypes/TypoJsonb.scala | 2 +- .../customtypes/TypoLocalDate.scala | 2 +- .../customtypes/TypoLocalDateTime.scala | 2 +- .../customtypes/TypoLocalTime.scala | 2 +- .../customtypes/TypoMoney.scala | 2 +- .../customtypes/TypoOffsetTime.scala | 2 +- .../customtypes/TypoPolygon.scala | 2 +- .../customtypes/TypoRecord.scala | 2 +- 
.../customtypes/TypoShort.scala | 2 +- .../adventureworks/customtypes/TypoUUID.scala | 2 +- .../customtypes/TypoUnknownCitext.scala | 2 +- .../customtypes/TypoVector.scala | 2 +- .../adventureworks/customtypes/TypoXml.scala | 2 +- .../department/DepartmentId.scala | 2 +- .../humanresources/shift/ShiftId.scala | 2 +- .../information_schema/CardinalNumber.scala | 2 +- .../information_schema/CharacterData.scala | 2 +- .../information_schema/SqlIdentifier.scala | 2 +- .../information_schema/TimeStamp.scala | 2 +- .../information_schema/YesOrNo.scala | 2 +- .../person/address/AddressId.scala | 2 +- .../person/addresstype/AddresstypeId.scala | 2 +- .../businessentity/BusinessentityId.scala | 2 +- .../countryregion/CountryregionId.scala | 2 +- .../stateprovince/StateprovinceId.scala | 2 +- .../production/product/ProductId.scala | 2 +- .../productcategory/ProductcategoryId.scala | 2 +- .../productmodel/ProductmodelId.scala | 2 +- .../ProductsubcategoryId.scala | 2 +- .../unitmeasure/UnitmeasureId.scala | 2 +- .../adventureworks/public/AccountNumber.scala | 2 +- .../adventureworks/public/Flag.scala | 2 +- .../adventureworks/public/Mydomain.scala | 2 +- .../adventureworks/public/Myenum.scala | 25 +- .../adventureworks/public/Name.scala | 2 +- .../adventureworks/public/NameStyle.scala | 2 +- .../adventureworks/public/OrderNumber.scala | 2 +- .../adventureworks/public/Phone.scala | 2 +- .../adventureworks/public/ShortText.scala | 2 +- .../public/identity_test/IdentityTestId.scala | 2 +- .../precision_types/PrecisionTypesId.scala | 2 +- .../PrecisionTypesNullId.scala | 2 +- .../adventureworks/public/users/UsersId.scala | 2 +- .../salesterritory/SalesterritoryId.scala | 2 +- .../userdefined/ActiveFlag.scala | 2 +- .../userdefined/CurrentFlag.scala | 2 +- .../userdefined/Description.scala | 49 + .../userdefined/FirstName.scala | 2 +- .../adventureworks/userdefined/LastName.scala | 2 +- .../userdefined/MiddleName.scala | 2 +- .../userdefined/OnlineOrderFlag.scala | 2 +- 
.../userdefined/SalariedFlag.scala | 2 +- .../testdb/DefaultedDeserializer.java | 36 +- .../testdb/DefaultedSerializer.java | 14 +- .../testdb/TestInsert.java | 357 +- .../AllScalarTypesFields.java | 462 +- .../all_scalar_types/AllScalarTypesId.java | 16 +- .../all_scalar_types/AllScalarTypesRepo.java | 65 +- .../AllScalarTypesRepoImpl.java | 937 +-- .../AllScalarTypesRepoMock.java | 149 +- .../all_scalar_types/AllScalarTypesRow.java | 1933 +----- .../AllScalarTypesRowUnsaved.java | 1573 +---- .../testdb/bridge/Customer.java | 27 + .../CustomerOrdersSummarySqlRepo.java | 11 +- .../CustomerOrdersSummarySqlRepoImpl.java | 43 +- .../CustomerOrdersSummarySqlRow.java | 146 +- .../CustomerOrdersViewViewFields.java | 99 +- .../CustomerOrdersViewViewRepo.java | 10 +- .../CustomerOrdersViewViewRepoImpl.java | 29 +- .../CustomerOrdersViewViewRow.java | 72 +- .../testdb/customers/CustomersFields.java | 68 +- .../testdb/customers/CustomersId.java | 16 +- .../testdb/customers/CustomersRepo.java | 74 +- .../testdb/customers/CustomersRepoImpl.java | 274 +- .../testdb/customers/CustomersRepoMock.java | 154 +- .../testdb/customers/CustomersRow.java | 46 +- .../testdb/customers/CustomersRowUnsaved.java | 29 +- .../testdb/customtypes/Defaulted.java | 46 +- .../FindCustomersByEmailSqlRepo.java | 11 +- .../FindCustomersByEmailSqlRepoImpl.java | 29 +- .../FindCustomersByEmailSqlRow.java | 47 +- .../testdb/order_items/OrderItemsFields.java | 89 +- .../testdb/order_items/OrderItemsId.java | 16 +- .../testdb/order_items/OrderItemsRepo.java | 65 +- .../order_items/OrderItemsRepoImpl.java | 262 +- .../order_items/OrderItemsRepoMock.java | 151 +- .../testdb/order_items/OrderItemsRow.java | 56 +- .../order_items/OrderItemsRowUnsaved.java | 22 +- .../testdb/orders/OrdersFields.java | 75 +- .../testdb/orders/OrdersId.java | 16 +- .../testdb/orders/OrdersRepo.java | 65 +- .../testdb/orders/OrdersRepoImpl.java | 263 +- .../testdb/orders/OrdersRepoMock.java | 146 +- .../testdb/orders/OrdersRow.java 
| 49 +- .../testdb/orders/OrdersRowUnsaved.java | 33 +- .../OrdersWithCustomerDetailsSqlRepo.java | 11 +- .../OrdersWithCustomerDetailsSqlRepoImpl.java | 38 +- .../OrdersWithCustomerDetailsSqlRow.java | 90 +- .../testdb/precisetypes/Binary10.java | 39 +- .../testdb/precisetypes/Binary32.java | 39 +- .../testdb/precisetypes/Decimal10_2.java | 62 +- .../testdb/precisetypes/Decimal12_4.java | 62 +- .../testdb/precisetypes/Decimal18_4.java | 62 +- .../testdb/precisetypes/Decimal5_2.java | 62 +- .../testdb/precisetypes/Decimal8_2.java | 62 +- .../testdb/precisetypes/LocalDateTime3.java | 37 +- .../testdb/precisetypes/LocalDateTime7.java | 37 +- .../testdb/precisetypes/LocalTime3.java | 37 +- .../testdb/precisetypes/LocalTime7.java | 37 +- .../testdb/precisetypes/OffsetDateTime3.java | 37 +- .../testdb/precisetypes/OffsetDateTime7.java | 37 +- .../testdb/precisetypes/PaddedString10.java | 44 +- .../testdb/precisetypes/String10.java | 48 +- .../testdb/precisetypes/String100.java | 48 +- .../testdb/precisetypes/String20.java | 48 +- .../testdb/precisetypes/String255.java | 48 +- .../testdb/precisetypes/String50.java | 48 +- .../precision_types/PrecisionTypesFields.java | 333 +- .../precision_types/PrecisionTypesId.java | 16 +- .../precision_types/PrecisionTypesRepo.java | 65 +- .../PrecisionTypesRepoImpl.java | 679 +- .../PrecisionTypesRepoMock.java | 149 +- .../precision_types/PrecisionTypesRow.java | 1025 +-- .../PrecisionTypesRowUnsaved.java | 841 +-- .../PrecisionTypesNullFields.java | 336 +- .../PrecisionTypesNullId.java | 16 +- .../PrecisionTypesNullRepo.java | 64 +- .../PrecisionTypesNullRepoImpl.java | 702 +- .../PrecisionTypesNullRepoMock.java | 153 +- .../PrecisionTypesNullRow.java | 1025 +-- .../PrecisionTypesNullRowUnsaved.java | 874 +-- .../testdb/products/ProductsFields.java | 67 +- .../testdb/products/ProductsId.java | 16 +- .../testdb/products/ProductsRepo.java | 65 +- .../testdb/products/ProductsRepoImpl.java | 247 +- 
.../testdb/products/ProductsRepoMock.java | 149 +- .../testdb/products/ProductsRow.java | 44 +- .../testdb/products/ProductsRowUnsaved.java | 20 +- .../test_connection/TestConnectionFields.java | 62 +- .../test_connection/TestConnectionId.java | 16 +- .../test_connection/TestConnectionRepo.java | 65 +- .../TestConnectionRepoImpl.java | 240 +- .../TestConnectionRepoMock.java | 149 +- .../test_connection/TestConnectionRow.java | 40 +- .../TestConnectionRowUnsaved.java | 20 +- .../UpdateCustomerEmailSqlRepo.java | 12 +- .../UpdateCustomerEmailSqlRepoImpl.java | 24 +- .../testdb/userdefined/Email.java | 19 +- .../java/src/java/testdb/DSLTest.java | 6 +- .../src/java/testdb/DatabaseFeaturesTest.java | 2 +- .../java/src/java/testdb/ForeignKeyTest.java | 2 +- .../java/src/java/testdb/MockRepoTest.java | 2 +- .../java/src/java/testdb/SqlScriptTest.java | 2 +- .../src/java/testdb/SqlServerTestHelper.java | 44 +- .../java/src/java/testdb/TestInsertTest.java | 2 +- .../java/src/java/testdb/TupleInTest.java | 6 +- testers/sqlserver/kotlin/build.gradle.kts | 44 - .../testdb/DefaultedDeserializer.kt | 2 +- .../testdb/DefaultedSerializer.kt | 4 +- .../testdb/TestInsert.kt | 38 +- .../all_scalar_types/AllScalarTypesFields.kt | 131 +- .../all_scalar_types/AllScalarTypesId.kt | 8 +- .../all_scalar_types/AllScalarTypesRepo.kt | 27 +- .../AllScalarTypesRepoImpl.kt | 211 +- .../AllScalarTypesRepoMock.kt | 39 +- .../all_scalar_types/AllScalarTypesRow.kt | 69 +- .../AllScalarTypesRowUnsaved.kt | 30 +- .../testdb/bridge/Customer.kt | 15 + .../CustomerOrdersSummarySqlRepo.kt | 6 +- .../CustomerOrdersSummarySqlRepoImpl.kt | 14 +- .../CustomerOrdersSummarySqlRow.kt | 16 +- .../CustomerOrdersViewViewFields.kt | 41 +- .../CustomerOrdersViewViewRepo.kt | 6 +- .../CustomerOrdersViewViewRepoImpl.kt | 12 +- .../CustomerOrdersViewViewRow.kt | 20 +- .../testdb/customers/CustomersFields.kt | 36 +- .../testdb/customers/CustomersId.kt | 8 +- .../testdb/customers/CustomersRepo.kt | 29 +- 
.../testdb/customers/CustomersRepoImpl.kt | 82 +- .../testdb/customers/CustomersRepoMock.kt | 41 +- .../testdb/customers/CustomersRow.kt | 15 +- .../testdb/customers/CustomersRowUnsaved.kt | 2 +- .../FindCustomersByEmailSqlRepo.kt | 6 +- .../FindCustomersByEmailSqlRepoImpl.kt | 12 +- .../FindCustomersByEmailSqlRow.kt | 15 +- .../testdb/order_items/OrderItemsFields.kt | 38 +- .../testdb/order_items/OrderItemsId.kt | 8 +- .../testdb/order_items/OrderItemsRepo.kt | 27 +- .../testdb/order_items/OrderItemsRepoImpl.kt | 81 +- .../testdb/order_items/OrderItemsRepoMock.kt | 39 +- .../testdb/order_items/OrderItemsRow.kt | 8 +- .../testdb/orders/OrdersFields.kt | 37 +- .../testdb/orders/OrdersId.kt | 8 +- .../testdb/orders/OrdersRepo.kt | 27 +- .../testdb/orders/OrdersRepoImpl.kt | 79 +- .../testdb/orders/OrdersRepoMock.kt | 39 +- .../testdb/orders/OrdersRow.kt | 10 +- .../OrdersWithCustomerDetailsSqlRepo.kt | 4 +- .../OrdersWithCustomerDetailsSqlRepoImpl.kt | 12 +- .../OrdersWithCustomerDetailsSqlRow.kt | 16 +- .../testdb/precisetypes/Binary10.kt | 22 +- .../testdb/precisetypes/Binary32.kt | 22 +- .../testdb/precisetypes/Decimal10_2.kt | 40 +- .../testdb/precisetypes/Decimal12_4.kt | 40 +- .../testdb/precisetypes/Decimal18_4.kt | 40 +- .../testdb/precisetypes/Decimal5_2.kt | 40 +- .../testdb/precisetypes/Decimal8_2.kt | 40 +- .../testdb/precisetypes/LocalDateTime3.kt | 18 +- .../testdb/precisetypes/LocalDateTime7.kt | 18 +- .../testdb/precisetypes/LocalTime3.kt | 18 +- .../testdb/precisetypes/LocalTime7.kt | 18 +- .../testdb/precisetypes/OffsetDateTime3.kt | 18 +- .../testdb/precisetypes/OffsetDateTime7.kt | 18 +- .../testdb/precisetypes/PaddedString10.kt | 34 +- .../testdb/precisetypes/String10.kt | 36 +- .../testdb/precisetypes/String100.kt | 36 +- .../testdb/precisetypes/String20.kt | 36 +- .../testdb/precisetypes/String255.kt | 36 +- .../testdb/precisetypes/String50.kt | 36 +- .../precision_types/PrecisionTypesFields.kt | 76 +- .../precision_types/PrecisionTypesId.kt | 
8 +- .../precision_types/PrecisionTypesRepo.kt | 27 +- .../precision_types/PrecisionTypesRepoImpl.kt | 169 +- .../precision_types/PrecisionTypesRepoMock.kt | 39 +- .../precision_types/PrecisionTypesRow.kt | 7 +- .../PrecisionTypesNullFields.kt | 76 +- .../PrecisionTypesNullId.kt | 8 +- .../PrecisionTypesNullRepo.kt | 27 +- .../PrecisionTypesNullRepoImpl.kt | 170 +- .../PrecisionTypesNullRepoMock.kt | 39 +- .../PrecisionTypesNullRow.kt | 8 +- .../testdb/products/ProductsFields.kt | 39 +- .../testdb/products/ProductsId.kt | 8 +- .../testdb/products/ProductsRepo.kt | 27 +- .../testdb/products/ProductsRepoImpl.kt | 79 +- .../testdb/products/ProductsRepoMock.kt | 39 +- .../testdb/products/ProductsRow.kt | 20 +- .../testdb/products/ProductsRowUnsaved.kt | 4 +- .../test_connection/TestConnectionFields.kt | 34 +- .../test_connection/TestConnectionId.kt | 8 +- .../test_connection/TestConnectionRepo.kt | 27 +- .../test_connection/TestConnectionRepoImpl.kt | 74 +- .../test_connection/TestConnectionRepoMock.kt | 39 +- .../test_connection/TestConnectionRow.kt | 15 +- .../TestConnectionRowUnsaved.kt | 2 +- .../UpdateCustomerEmailSqlRepo.kt | 4 +- .../UpdateCustomerEmailSqlRepoImpl.kt | 11 +- .../testdb/userdefined/Email.kt | 14 +- .../kotlin/src/kotlin/testdb/DSLTest.kt | 4 +- .../src/kotlin/testdb/DatabaseFeaturesTest.kt | 2 +- .../src/kotlin/testdb/ForeignKeyTest.kt | 2 +- .../kotlin/src/kotlin/testdb/MockRepoTest.kt | 2 +- .../kotlin/src/kotlin/testdb/SqlScriptTest.kt | 2 +- .../src/kotlin/testdb/SqlServerTestHelper.kt | 37 +- .../src/kotlin/testdb/TestInsertTest.kt | 2 +- .../kotlin/src/kotlin/testdb/TupleInTest.kt | 4 +- .../testdb/TestInsert.scala | 2 +- .../AllScalarTypesFields.scala | 101 +- .../all_scalar_types/AllScalarTypesId.scala | 10 +- .../all_scalar_types/AllScalarTypesRepo.scala | 19 +- .../AllScalarTypesRepoImpl.scala | 275 +- .../AllScalarTypesRepoMock.scala | 31 +- .../all_scalar_types/AllScalarTypesRow.scala | 10 +- .../testdb/bridge/Customer.scala | 15 + 
.../CustomerOrdersSummarySqlRepo.scala | 4 +- .../CustomerOrdersSummarySqlRepoImpl.scala | 18 +- .../CustomerOrdersSummarySqlRow.scala | 10 +- .../CustomerOrdersViewViewFields.scala | 35 +- .../CustomerOrdersViewViewRepo.scala | 6 +- .../CustomerOrdersViewViewRepoImpl.scala | 14 +- .../CustomerOrdersViewViewRow.scala | 10 +- .../testdb/customers/CustomersFields.scala | 32 +- .../testdb/customers/CustomersId.scala | 10 +- .../testdb/customers/CustomersRepo.scala | 21 +- .../testdb/customers/CustomersRepoImpl.scala | 74 +- .../testdb/customers/CustomersRepoMock.scala | 33 +- .../testdb/customers/CustomersRow.scala | 9 +- .../FindCustomersByEmailSqlRepo.scala | 4 +- .../FindCustomersByEmailSqlRepoImpl.scala | 12 +- .../FindCustomersByEmailSqlRow.scala | 9 +- .../testdb/order_items/OrderItemsFields.scala | 38 +- .../testdb/order_items/OrderItemsId.scala | 10 +- .../testdb/order_items/OrderItemsRepo.scala | 19 +- .../order_items/OrderItemsRepoImpl.scala | 77 +- .../order_items/OrderItemsRepoMock.scala | 31 +- .../testdb/order_items/OrderItemsRow.scala | 8 +- .../testdb/orders/OrdersFields.scala | 37 +- .../testdb/orders/OrdersId.scala | 10 +- .../testdb/orders/OrdersRepo.scala | 19 +- .../testdb/orders/OrdersRepoImpl.scala | 75 +- .../testdb/orders/OrdersRepoMock.scala | 31 +- .../testdb/orders/OrdersRow.scala | 10 +- .../OrdersWithCustomerDetailsSqlRepo.scala | 4 +- ...OrdersWithCustomerDetailsSqlRepoImpl.scala | 18 +- .../OrdersWithCustomerDetailsSqlRow.scala | 10 +- .../testdb/precisetypes/Binary10.scala | 10 +- .../testdb/precisetypes/Binary32.scala | 10 +- .../testdb/precisetypes/Decimal10_2.scala | 12 +- .../testdb/precisetypes/Decimal12_4.scala | 12 +- .../testdb/precisetypes/Decimal18_4.scala | 12 +- .../testdb/precisetypes/Decimal5_2.scala | 12 +- .../testdb/precisetypes/Decimal8_2.scala | 12 +- .../testdb/precisetypes/LocalDateTime3.scala | 12 +- .../testdb/precisetypes/LocalDateTime7.scala | 12 +- .../testdb/precisetypes/LocalTime3.scala | 12 +- 
.../testdb/precisetypes/LocalTime7.scala | 12 +- .../testdb/precisetypes/OffsetDateTime3.scala | 12 +- .../testdb/precisetypes/OffsetDateTime7.scala | 12 +- .../testdb/precisetypes/PaddedString10.scala | 12 +- .../testdb/precisetypes/String10.scala | 12 +- .../testdb/precisetypes/String100.scala | 12 +- .../testdb/precisetypes/String20.scala | 12 +- .../testdb/precisetypes/String255.scala | 12 +- .../testdb/precisetypes/String50.scala | 12 +- .../PrecisionTypesFields.scala | 76 +- .../precision_types/PrecisionTypesId.scala | 10 +- .../precision_types/PrecisionTypesRepo.scala | 19 +- .../PrecisionTypesRepoImpl.scala | 107 +- .../PrecisionTypesRepoMock.scala | 31 +- .../precision_types/PrecisionTypesRow.scala | 8 +- .../PrecisionTypesNullFields.scala | 76 +- .../PrecisionTypesNullId.scala | 10 +- .../PrecisionTypesNullRepo.scala | 19 +- .../PrecisionTypesNullRepoImpl.scala | 218 +- .../PrecisionTypesNullRepoMock.scala | 31 +- .../PrecisionTypesNullRow.scala | 9 +- .../testdb/products/ProductsFields.scala | 33 +- .../testdb/products/ProductsId.scala | 10 +- .../testdb/products/ProductsRepo.scala | 19 +- .../testdb/products/ProductsRepoImpl.scala | 77 +- .../testdb/products/ProductsRepoMock.scala | 31 +- .../testdb/products/ProductsRow.scala | 10 +- .../TestConnectionFields.scala | 30 +- .../test_connection/TestConnectionId.scala | 10 +- .../test_connection/TestConnectionRepo.scala | 19 +- .../TestConnectionRepoImpl.scala | 68 +- .../TestConnectionRepoMock.scala | 31 +- .../test_connection/TestConnectionRow.scala | 9 +- .../UpdateCustomerEmailSqlRepo.scala | 2 +- .../UpdateCustomerEmailSqlRepoImpl.scala | 13 +- .../testdb/userdefined/Email.scala | 10 +- .../scala/src/scala/testdb/AllTypesTest.scala | 52 +- .../scala/src/scala/testdb/DSLTest.scala | 42 +- .../scala/testdb/DatabaseFeaturesTest.scala | 82 +- .../src/scala/testdb/ForeignKeyTest.scala | 22 +- .../scala/src/scala/testdb/MockRepoTest.scala | 2 +- .../src/scala/testdb/SqlScriptTest.scala | 26 +- 
 .../src/scala/testdb/TestInsertTest.scala          |   38 +-
 .../scala/src/scala/testdb/TupleInTest.scala       |   36 +-
 .../src/scala/testdb/withConnection.scala          |   15 +-
 .../scala/typr/bridge/FlowValidatorTest.scala      |  513 ++
 .../scala/typr/bridge/SmartDefaultsTest.scala      |  253 +
 .../bridge/TypeNarrowerIntegrationTest.scala       |  147 +
 .../typr/bridge/TypePolicyValidatorTest.scala      |  102 +
 .../typr/cli/config/ConfigRoundtripTest.scala      |  273 +
 .../config/generated/AlignedSource.scala           |   32 +
 .../config/generated/ApiMatch.scala                |   25 +
 .../config/generated/AvroBoundary.scala            |   72 +
 .../config/generated/Boundary.scala                |   12 +
 .../config/generated/BoundarySelectors.scala       |   19 +
 .../config/generated/BridgeOutputConfig.scala      |   26 +
 .../config/generated/BridgeType.scala              |   28 +
 .../config/generated/DatabaseBoundary.scala        |   44 +
 .../config/generated/DbMatch.scala                 |   27 +
 .../generated/DomainGenerateOptions.scala          |   26 +
 .../config/generated/DomainType.scala              |   27 +
 .../config/generated/DuckdbBoundary.scala          |   23 +
 .../config/generated/FeatureMatcher.scala          |   22 +
 .../generated/FeatureMatcherArray.scala            |   12 +
 .../generated/FeatureMatcherObject.scala           |   16 +
 .../generated/FeatureMatcherString.scala           |   12 +
 .../config/generated/FieldOverride.scala           |   19 +
 .../config/generated/FieldOverrideEnum.scala       |   12 +
 .../generated/FieldOverrideObject.scala            |   31 +
 .../config/generated/FieldSpec.scala               |   19 +
 .../config/generated/FieldSpecObject.scala         |   24 +
 .../config/generated/FieldSpecString.scala         |   12 +
 .../config/generated/FieldType.scala               |   22 +
 .../config/generated/GrpcBoundary.scala            |   33 +
 .../config/generated/HeaderField.scala             |   20 +
 .../config/generated/HeaderSchema.scala            |   16 +
 .../config/generated/JsonschemaBoundary.scala      |   20 +
 .../config/generated/KeyType.scala                 |   18 +
 .../config/generated/KeyTypeEnum.scala             |   12 +
 .../config/generated/KeyTypeObject.scala           |   15 +
 .../config/generated/MatcherValue.scala            |   22 +
 .../config/generated/MatcherValueArray.scala       |   12 +
 .../config/generated/MatcherValueObject.scala      |   16 +
.../config/generated/MatcherValueString.scala | 12 + .../config/generated/Matchers.scala | 18 + .../config/generated/ModelMatch.scala | 24 + .../generated/NameAlignmentConfig.scala | 22 + .../config/generated/OpenapiBoundary.scala | 27 + .../config/generated/Output.scala | 34 + .../config/generated/Projection.scala | 3 + .../config/generated/StringOrArray.scala | 19 + .../config/generated/StringOrArrayArray.scala | 12 + .../generated/StringOrArrayString.scala | 12 + .../config/generated/ValidationRules.scala | 31 + .../generated/ExecuteReturningSyntax.scala | 29 + .../typr/generated/Text.scala | 110 + .../custom/comments/CommentsSqlRepo.scala | 14 + .../custom/comments/CommentsSqlRepoImpl.scala | 29 + .../custom/comments/CommentsSqlRow.scala | 80 + .../CompositeTypesSqlRepo.scala | 14 + .../CompositeTypesSqlRepoImpl.scala | 39 + .../CompositeTypesSqlRow.scala | 95 + .../constraints/ConstraintsSqlRepo.scala | 14 + .../constraints/ConstraintsSqlRepoImpl.scala | 47 + .../constraints/ConstraintsSqlRow.scala | 83 + .../custom/domains/DomainsSqlRepo.scala | 14 + .../custom/domains/DomainsSqlRepoImpl.scala | 32 + .../custom/domains/DomainsSqlRow.scala | 106 + .../generated/custom/enums/EnumsSqlRepo.scala | 14 + .../custom/enums/EnumsSqlRepoImpl.scala | 25 + .../generated/custom/enums/EnumsSqlRow.scala | 80 + .../table_comments/TableCommentsSqlRepo.scala | 14 + .../TableCommentsSqlRepoImpl.scala | 24 + .../table_comments/TableCommentsSqlRow.scala | 71 + .../view_find_all/ViewFindAllSqlRepo.scala | 14 + .../ViewFindAllSqlRepoImpl.scala | 32 + .../view_find_all/ViewFindAllSqlRow.scala | 82 + .../generated/customtypes/TypoAclItem.scala | 83 + .../generated/customtypes/TypoInstant.scala | 84 + .../generated/customtypes/TypoShort.scala | 85 + .../information_schema/CardinalNumber.scala | 49 + .../information_schema/CharacterData.scala | 49 + .../information_schema/SqlIdentifier.scala | 49 + .../information_schema/TimeStamp.scala | 50 + .../information_schema/YesOrNo.scala | 49 
+ .../columns/ColumnsViewRepo.scala | 14 + .../columns/ColumnsViewRepoImpl.scala | 19 + .../columns/ColumnsViewRow.scala | 272 + .../KeyColumnUsageViewRepo.scala | 14 + .../KeyColumnUsageViewRepoImpl.scala | 19 + .../KeyColumnUsageViewRow.scala | 97 + .../ReferentialConstraintsViewRepo.scala | 14 + .../ReferentialConstraintsViewRepoImpl.scala | 19 + .../ReferentialConstraintsViewRow.scala | 97 + .../TableConstraintsViewRepo.scala | 14 + .../TableConstraintsViewRepoImpl.scala | 19 + .../TableConstraintsViewRow.scala | 107 + .../tables/TablesViewRepo.scala | 14 + .../tables/TablesViewRepoImpl.scala | 19 + .../tables/TablesViewRow.scala | 112 + .../typr/generated/package.scala | 41 + .../pg_namespace/PgNamespaceId.scala | 46 + .../pg_namespace/PgNamespaceRepo.scala | 45 + .../pg_namespace/PgNamespaceRepoImpl.scala | 150 + .../pg_namespace/PgNamespaceRepoMock.scala | 79 + .../pg_namespace/PgNamespaceRow.scala | 90 + .../typr/generated/public/AccountNumber.scala | 49 + .../typr/generated/public/Flag.scala | 49 + .../typr/generated/public/Mydomain.scala | 49 + .../typr/generated/public/Name.scala | 49 + .../typr/generated/public/NameStyle.scala | 49 + .../typr/generated/public/OrderNumber.scala | 49 + .../typr/generated/public/Phone.scala | 49 + .../typr/generated/public/ShortText.scala | 49 + .../typr/generated/streamingInsert.scala | 44 + .../src/resources/sqlglot_analyze.py | 0 .../src/scala/typr/BridgeCompositeType.scala | 47 + .../src/scala/typr/DbLibName.scala | 0 .../src/scala/typr/DbType.scala | 0 .../src/scala/typr/Dialect.scala | 0 .../src/scala/typr/DslQualifiedNames.scala | 55 +- .../src/scala/typr/FoundationsTypes.scala | 19 +- .../src/scala/typr/GenerateConfig.scala | 0 .../src/scala/typr/Generated.scala | 0 .../src/scala/typr/JsonLibName.scala | 0 .../src/scala/typr/Lang.scala | 27 + .../src/scala/typr/MetaDb.scala | 0 .../src/scala/typr/Naming.scala | 0 .../src/scala/typr/NonEmptyList.scala | 0 .../src/scala/typr/Nullability.scala | 0 
.../src/scala/typr/NullabilityOverride.scala | 0 .../src/scala/typr/Options.scala | 4 +- .../src/scala/typr/ProjectGraph.scala | 0 .../src/scala/typr/RelPath.scala | 0 .../src/scala/typr/SchemaMode.scala | 0 .../src/scala/typr/SchemaSource.scala | 0 .../src/scala/typr/Scope.scala | 0 .../src/scala/typr/Selector.scala | 0 .../src/scala/typr/Source.scala | 0 .../src/scala/typr/TypeDefinitions.scala | 217 + .../src/scala/typr/TypeOverride.scala | 0 .../src/scala/typr/TypeSupport.scala | 0 .../src/scala/typr/TypeSupportJava.scala | 0 .../src/scala/typr/TypeSupportScala.scala | 6 +- .../src/scala/typr/TypesJava.scala | 1 + .../src/scala/typr/TypesKotlin.scala | 0 .../src/scala/typr/TypesScala.scala | 1 - .../src/scala/typr/TypoDataSource.scala | 17 +- .../src/scala/typr/TypoLogger.scala | 0 .../boundaries/framework/CatsFramework.scala | 17 + .../typr/boundaries/framework/Framework.scala | 20 + .../boundaries/framework/FrameworkTypes.scala | 43 + .../boundaries/framework/HttpFramework.scala | 34 + .../framework/MessagingFramework.scala | 27 + .../framework/QuarkusFramework.scala | 18 + .../boundaries/framework/RpcFramework.scala | 22 + .../framework/SpringFramework.scala | 17 + .../src/scala/typr/db.scala | 2 + .../src/scala/typr/effects/EffectType.scala | 57 +- .../scala/typr/effects/EffectTypeOps.scala | 6 + .../src/scala/typr/generateFromConfig.scala | 11 +- .../src/scala/typr/generateFromDb.scala | 1 - .../src/scala/typr/grpc/GrpcTypes.scala | 0 .../src/scala/typr/internal/ArrayName.scala | 0 .../ComputedBridgeCompositeType.scala | 95 + .../scala/typr/internal/ComputedColumn.scala | 0 .../scala/typr/internal/ComputedDefault.scala | 0 .../scala/typr/internal/ComputedDomain.scala | 0 .../typr/internal/ComputedDuckDbStruct.scala | 0 .../typr/internal/ComputedMariaSet.scala | 0 .../scala/typr/internal/ComputedNames.scala | 0 .../ComputedOracleCollectionType.scala | 0 .../internal/ComputedOracleObjectType.scala | 0 .../internal/ComputedPgCompositeType.scala | 0 
.../typr/internal/ComputedRowUnsaved.scala | 0 .../typr/internal/ComputedSharedType.scala | 0 .../scala/typr/internal/ComputedSqlFile.scala | 0 .../typr/internal/ComputedStringEnum.scala | 0 .../scala/typr/internal/ComputedTable.scala | 18 +- .../typr/internal/ComputedTestInserts.scala | 0 .../scala/typr/internal/ComputedView.scala | 0 .../src/scala/typr/internal/CustomType.scala | 0 .../src/scala/typr/internal/CustomTypes.scala | 0 .../src/scala/typr/internal/DebugJson.scala | 0 .../src/scala/typr/internal/FileSync.scala | 0 .../src/scala/typr/internal/FkAnalysis.scala | 0 .../src/scala/typr/internal/HasSource.scala | 0 .../src/scala/typr/internal/IdComputed.scala | 0 .../typr/internal/InstanceRequirements.scala | 9 +- .../scala/typr/internal/InternalOptions.scala | 0 .../src/scala/typr/internal/Lazy.scala | 0 .../typr/internal/PreciseConstraint.scala | 0 .../src/scala/typr/internal/RepoMethod.scala | 0 .../src/scala/typr/internal/TypeAligner.scala | 6 +- .../internal/TypeCompatibilityChecker.scala | 9 +- .../scala/typr/internal/TypeMapperDb.scala | 0 .../scala/typr/internal/TypeMapperJvm.scala | 46 +- .../typr/internal/TypeMapperJvmNew.scala | 14 +- .../typr/internal/TypeMapperJvmOld.scala | 5 + .../src/scala/typr/internal/TypeMatcher.scala | 268 +- .../src/scala/typr/internal/TypoType.scala | 0 .../internal/analysis/ColumnNullable.scala | 0 .../internal/analysis/DecomposedSql.scala | 0 .../typr/internal/analysis/JdbcMetadata.scala | 0 .../typr/internal/analysis/JdbcType.scala | 0 .../internal/analysis/MaybeReturnsRows.scala | 0 .../internal/analysis/MetadataColumn.scala | 0 .../analysis/MetadataParameterColumn.scala | 0 .../analysis/NullabilityFromExplain.scala | 0 .../internal/analysis/ParameterMode.scala | 0 .../internal/analysis/ParameterNullable.scala | 0 .../typr/internal/analysis/ParsedName.scala | 0 .../analysis/WellKnownPrimitive.scala | 0 .../typr/internal/codegen/Db2Adapter.scala | 62 +- .../typr/internal/codegen/DbAdapter.scala | 38 +- 
.../scala/typr/internal/codegen/DbLib.scala | 0 .../typr/internal/codegen/DbLibAnorm.scala | 0 .../typr/internal/codegen/DbLibDoobie.scala | 0 .../internal/codegen/DbLibFoundations.scala | 543 +- .../typr/internal/codegen/DbLibLegacy.scala | 0 .../codegen/DbLibTextImplementations.scala | 0 .../internal/codegen/DbLibTextSupport.scala | 0 .../typr/internal/codegen/DbLibZioJdbc.scala | 0 .../typr/internal/codegen/DuckDbAdapter.scala | 296 + .../codegen/FileBridgeCompositeType.scala | 101 + .../codegen/FileBridgeProjectionMapper.scala | 367 + .../internal/codegen/FileCustomType.scala | 0 .../typr/internal/codegen/FileDefault.scala | 0 .../typr/internal/codegen/FileDomain.scala | 0 .../internal/codegen/FileDuckDbStruct.scala | 66 +- .../typr/internal/codegen/FileFields.scala | 0 .../codegen/FileFieldsFoundations.scala | 10 +- .../internal/codegen/FileFieldsLegacy.scala | 0 .../typr/internal/codegen/FileMariaSet.scala | 10 +- .../codegen/FileOracleCollectionType.scala | 0 .../codegen/FileOracleObjectType.scala | 0 .../internal/codegen/FilePackageObject.scala | 0 .../codegen/FilePgCompositeType.scala | 237 + .../internal/codegen/FilePreciseType.scala | 280 +- .../internal/codegen/FileSharedType.scala | 0 .../internal/codegen/FileStringEnum.scala | 0 .../internal/codegen/FileTestInserts.scala | 0 .../typr/internal/codegen/FilesRelation.scala | 0 .../typr/internal/codegen/FilesSqlFile.scala | 0 .../typr/internal/codegen/FilesTable.scala | 3 +- .../typr/internal/codegen/FilesView.scala | 0 .../scala/typr/internal/codegen/JsonLib.scala | 0 .../typr/internal/codegen/JsonLibCirce.scala | 0 .../internal/codegen/JsonLibJackson.scala | 6 +- .../typr/internal/codegen/JsonLibPlay.scala | 0 .../internal/codegen/JsonLibZioJson.scala | 0 .../typr/internal/codegen/LangJava.scala | 44 +- .../typr/internal/codegen/LangKotlin.scala | 59 +- .../typr/internal/codegen/LangScala.scala | 65 +- .../internal/codegen/MariaDbAdapter.scala | 101 +- .../typr/internal/codegen/OracleAdapter.scala | 101 
+- .../internal/codegen/PostgresAdapter.scala | 281 +- .../scala/typr/internal/codegen/SqlCast.scala | 0 .../internal/codegen/SqlServerAdapter.scala | 80 +- .../scala/typr/internal/codegen/ToCode.scala | 0 .../internal/codegen/TypeSupportKotlin.scala | 0 .../codegen/addPackageAndImports.scala | 21 +- .../scala/typr/internal/codegen/package.scala | 0 .../src/scala/typr/internal/compat.scala | 0 .../typr/internal/db2/Db2JdbcMetadata.scala | 0 .../scala/typr/internal/db2/Db2MetaDb.scala | 0 .../internal/db2/Db2SqlFileMetadata.scala | 2 +- .../typr/internal/db2/Db2TypeMapperDb.scala | 0 .../internal/duckdb/DuckDbJdbcMetadata.scala | 0 .../typr/internal/duckdb/DuckDbMetaDb.scala | 0 .../duckdb/DuckDbSqlFileMetadata.scala | 2 +- .../internal/duckdb/DuckDbTypeMapperDb.scala | 0 .../internal/external/ExternalTools.scala | 0 .../scala/typr/internal/external/OsArch.scala | 0 .../scala/typr/internal/external/Python.scala | 0 .../typr/internal/external/Sqlglot.scala | 0 .../typr/internal/external/SqlglotDb2.scala | 0 .../typr/internal/external/TypoCoursier.scala | 0 .../scala/typr/internal/findTypeFromFk.scala | 0 .../src/scala/typr/internal/forget.scala | 0 .../src/scala/typr/internal/generate.scala | 14 +- .../internal/mariadb/MariaJdbcMetadata.scala | 0 .../typr/internal/mariadb/MariaMetaDb.scala | 0 .../mariadb/MariaSqlFileMetadata.scala | 2 +- .../internal/mariadb/MariaTypeMapperDb.scala | 0 .../src/scala/typr/internal/minimize.scala | 3 + .../internal/oracle/OracleJdbcMetadata.scala | 0 .../typr/internal/oracle/OracleMetaDb.scala | 0 .../oracle/OracleSqlFileMetadata.scala | 2 +- .../internal/oracle/OracleTypeMapperDb.scala | 10 +- .../src/scala/typr/internal/pg/Enums.scala | 0 .../scala/typr/internal/pg/ForeignKeys.scala | 0 .../src/scala/typr/internal/pg/OpenEnum.scala | 0 .../src/scala/typr/internal/pg/PgMetaDb.scala | 2 +- .../typr/internal/pg/PgTypeMapperDb.scala | 4 + .../scala/typr/internal/pg/PrimaryKeys.scala | 0 .../scala/typr/internal/pg/UniqueKeys.scala | 0 
.../src/scala/typr/internal/quote.scala | 0 .../typr/internal/rewriteDependentData.scala | 0 .../typr/internal/sqlfiles/SqlFile.scala | 0 .../internal/sqlfiles/SqlFileReader.scala | 0 .../sqlfiles/readSqlFileDirectories.scala | 0 .../internal/sqlglot/SqlglotAnalyzer.scala | 6 +- .../typr/internal/sqlglot/SqlglotTypes.scala | 0 .../sqlserver/SqlServerJdbcMetadata.scala | 0 .../internal/sqlserver/SqlServerMetaDb.scala | 0 .../sqlserver/SqlServerSqlFileMetadata.scala | 2 +- .../sqlserver/SqlServerTypeMapperDb.scala | 0 .../typr/jsonschema/JsonSchemaCodegen.scala | 103 + .../typr/jsonschema/JsonSchemaOptions.scala | 25 + .../typr/jsonschema/JsonSchemaParser.scala | 118 + .../src/scala/typr/jvm.scala | 6 +- .../src/scala/typr/openapi/ApiTypes.scala | 0 .../src/scala/typr/openapi/ModelClass.scala | 8 +- .../scala/typr/openapi/OpenApiCodegen.scala | 0 .../src/scala/typr/openapi/OpenApiError.scala | 0 .../scala/typr/openapi/OpenApiJsonLib.scala | 8 + .../scala/typr/openapi/OpenApiOptions.scala | 0 .../src/scala/typr/openapi/ParsedSpec.scala | 0 .../src/scala/typr/openapi/SumType.scala | 4 +- .../src/scala/typr/openapi/TypeInfo.scala | 0 .../typr/openapi/codegen/ApiCodegen.scala | 0 .../openapi/codegen/FrameworkSupport.scala | 0 .../typr/openapi/codegen/JsonLibSupport.scala | 188 +- .../typr/openapi/codegen/ModelCodegen.scala | 48 +- .../typr/openapi/codegen/TypeMapper.scala | 7 +- .../openapi/codegen/ValidationSupport.scala | 0 .../openapi/computed/ComputedApiService.scala | 37 + .../openapi/computed/ComputedEndpoint.scala | 53 + .../typr/openapi/computed/ComputedModel.scala | 76 + .../openapi/computed/ComputedParameter.scala | 38 + .../openapi/computed/ComputedProperty.scala | 35 + .../typr/openapi/parser/ApiExtractor.scala | 0 .../typr/openapi/parser/ModelExtractor.scala | 503 ++ .../typr/openapi/parser/OpenApiParser.scala | 0 .../typr/openapi/parser/SpecValidator.scala | 0 .../typr/openapi/parser/TypeResolver.scala | 0 .../typr/openapi/testdata/stripe-spec3.yaml | 0 
.../typr/openapi/testdata/test-features.yaml | 0 typr-config.schema.json | 1230 ++++ .../kotlin/dev/typr/dslkt}/DeleteBuilder.kt | 20 +- .../src/kotlin/dev/typr/dslkt}/DslExports.kt | 42 +- .../kotlin/dev/typr/dslkt/MockConnection.kt | 7 + .../kotlin/dev/typr/dslkt}/SelectBuilder.kt | 56 +- .../src/kotlin/dev/typr/dslkt}/SqlExpr.kt | 192 +- .../dev/typr/dslkt}/SqlExprExtensions.kt | 4 +- .../src/kotlin/dev/typr/dslkt/Structure.kt | 4 + .../kotlin/dev/typr/dslkt}/UpdateBuilder.kt | 26 +- .../scala/dev/typr/dslsc}/DeleteBuilder.scala | 22 +- .../src/scala/dev/typr/dslsc/DslExports.scala | 78 + .../scala/dev/typr/dslsc}/ForeignKey.scala | 4 +- .../scala/dev/typr/dslsc}/SelectBuilder.scala | 80 +- .../src/scala/dev/typr/dslsc}/SqlExpr.scala | 17 +- .../src/scala/dev/typr/dslsc}/Structure.scala | 12 +- .../scala/dev/typr/dslsc}/UpdateBuilder.scala | 14 +- .../src/scala/dev/typr/dslsc/package.scala | 117 + .../src/java/dev/typr}/dsl/DeleteBuilder.java | 4 +- .../java/dev/typr}/dsl/DeleteBuilderMock.java | 4 +- .../java/dev/typr}/dsl/DeleteBuilderSql.java | 30 +- .../src/java/dev/typr}/dsl/DeleteParams.java | 2 +- .../src/java/dev/typr}/dsl/Dialect.java | 207 +- .../src/java/dev/typr}/dsl/FieldsBase.java | 6 +- .../src/java/dev/typr}/dsl/FieldsExpr.java | 14 +- .../src/java/dev/typr}/dsl/FieldsExpr0.java | 2 +- .../src/java/dev/typr}/dsl/ForeignKey.java | 3 +- .../java/dev/typr}/dsl/GenericDbTypes.java | 32 +- .../java/dev/typr}/dsl/GroupedBuilder.java | 4 +- .../dev/typr}/dsl/GroupedBuilderMock.java | 4 +- .../java/dev/typr}/dsl/GroupedBuilderSql.java | 104 +- typr-dsl/src/java/dev/typr/dsl/Inserter.java | 42 + .../src/java/dev/typr}/dsl/Like.java | 2 +- .../java/dev/typr}/dsl/MockConnection.java | 4 +- .../src/java/dev/typr}/dsl/OrderByOrSeek.java | 3 +- .../src/java/dev/typr}/dsl/Path.java | 2 +- .../java/dev/typr}/dsl/RelationStructure.java | 2 +- .../src/java/dev/typr}/dsl/RenderCtx.java | 2 +- .../src/java/dev/typr/dsl/RowCodecDbType.java | 73 +- 
.../src/java/dev/typr}/dsl/SelectBuilder.java | 11 +- .../java/dev/typr}/dsl/SelectBuilderMock.java | 6 +- .../java/dev/typr}/dsl/SelectBuilderSql.java | 411 +- .../src/java/dev/typr}/dsl/SelectParams.java | 13 +- .../src/java/dev/typr}/dsl/SortOrder.java | 8 +- .../src/java/dev/typr}/dsl/SqlExpr.java | 117 +- .../java/dev/typr}/dsl/SqlExprVisitor.java | 2 +- .../src/java/dev/typr}/dsl/SqlFunction1.java | 3 +- .../src/java/dev/typr}/dsl/SqlFunction2.java | 3 +- .../src/java/dev/typr}/dsl/SqlFunction3.java | 3 +- .../src/java/dev/typr}/dsl/SqlOperator.java | 5 +- .../src/java/dev/typr}/dsl/Structure.java | 9 +- .../src/java/dev/typr}/dsl/TriFunction.java | 2 +- .../src/java/dev/typr}/dsl/UpdateBuilder.java | 10 +- .../java/dev/typr}/dsl/UpdateBuilderMock.java | 4 +- .../java/dev/typr}/dsl/UpdateBuilderSql.java | 48 +- .../src/java/dev/typr}/dsl/UpdateParams.java | 2 +- .../typr}/dsl/internal/DummyComparator.java | 2 +- .../dev/typr}/dsl/internal/LogicalPlan.java | 4 +- .../dev/typr}/dsl/internal/RowComparator.java | 6 +- .../src/scala/scripts/GenDocumentation.scala | 37 +- .../scala/scripts/GeneratedRowParsers.scala | 61 +- .../src/scala/scripts/GeneratedTuples.scala | 120 +- .../src/scala/scripts/GenerateAll.scala | 47 - .../src/scala/scripts/GenerateAvroTest.scala | 303 - .../scala/scripts/GenerateCombinedTest.scala | 201 - .../scala/scripts/GenerateConfigTypes.scala | 71 + .../src/scala/scripts/GenerateGrpcTest.scala | 146 - .../scala/scripts/GenerateOpenApiTest.scala | 212 - .../scala/scripts/GenerateStripeTest.scala | 92 - .../src/scala/scripts/GeneratedDb2.scala | 84 - .../src/scala/scripts/GeneratedDuckDb.scala | 126 - .../src/scala/scripts/GeneratedMariaDb.scala | 99 - .../src/scala/scripts/GeneratedOracle.scala | 123 - .../src/scala/scripts/GeneratedPostgres.scala | 162 - .../src/scala/scripts/GeneratedShowcase.scala | 148 +- .../scala/scripts/GeneratedSqlServer.scala | 91 - .../scala/scripts/ShowcaseGeneration.scala | 126 - 
 .../scripts/showcase/ShowcaseSchema.scala          |  294 +-
 typr.yaml                                          |  918 +++
 .../CompositeTypesSqlRepo.scala                    |    2 +-
 .../CompositeTypesSqlRepoImpl.scala                |    2 +-
 .../CompositeTypesSqlRow.scala                     |   30 +-
 .../generated/customtypes/TypoShort.scala          |   18 +-
 .../columns/ColumnsViewRow.scala                   |  290 +-
 .../tables/TablesViewRow.scala                     |    2 +-
 typr/src/scala/typr/Banner.scala                   |   42 -
 typr/src/scala/typr/TypeDefinitions.scala          |  208 -
 typr/src/scala/typr/avro/AvroCodegen.scala         |   12 +-
 typr/src/scala/typr/avro/AvroOptions.scala         |   10 +-
 .../scala/typr/avro/BridgeAvroAdapter.scala        |   44 +
 .../typr/avro/codegen/FileAvroWrapper.scala        |    2 +-
 .../typr/avro/codegen/KafkaFramework.scala         |   11 +-
 .../avro/codegen/KafkaFrameworkCats.scala          |   98 +
 .../avro/codegen/KafkaFrameworkQuarkus.scala       |   18 +-
 .../avro/codegen/KafkaFrameworkSpring.scala        |   16 +-
 .../typr/avro/codegen/KafkaRpcCodegen.scala        |   49 +-
 .../avro/computed/ComputedAvroField.scala          |   38 +
 .../avro/computed/ComputedAvroRecord.scala         |   78 +
 .../avro/computed/ComputedEventGroup.scala         |   55 +
 .../typr/avro/computed/ComputedProtocol.scala      |  119 +
 .../src/scala/typr/bridge/ColumnGrouper.scala      |  184 +
 .../src/scala/typr/bridge/ColumnStemmer.scala      |  285 +
 .../scala/typr/bridge/ColumnTokenizer.scala        |  268 +
 .../src/scala/typr/bridge/CompositeType.scala      |  385 ++
 .../typr/bridge/CompositeTypeSuggester.scala       |  250 +
 .../scala/typr/bridge/ConfigToBridge.scala         |  142 +
 typr/src/scala/typr/bridge/TypeNarrower.scala      |  683 ++
 .../src/scala/typr/bridge/TypeSuggester.scala      |  233 +
 .../src/scala/typr/bridge/api/BridgeApi.scala      |   20 +
 .../scala/typr/bridge/api/BridgeApiImpl.scala      |   91 +
 .../scala/typr/bridge/model/CheckResult.scala      |   58 +
 .../typr/bridge/model/FieldOverride.scala          |   34 +
 .../typr/bridge/model/ResolvedFlow.scala           |   39 +
 .../typr/bridge/model/SourceDeclaration.scala      |   43 +
 .../scala/typr/bridge/model/TypePolicy.scala       |   30 +
 .../bridge/validation/FlowValidator.scala          |  239 +
 .../bridge/validation/SmartDefaults.scala          |  109 +
 .../validation/TypePolicyValidator.scala | 94 +
 typr/src/scala/typr/cli/Main.scala | 52 +
 typr/src/scala/typr/cli/commands/Check.scala | 281 +
 .../scala/typr/cli/commands/Generate.scala | 1169 ++++
 .../cli/commands/ProjectionFieldFormat.scala | 110 +
 .../cli/commands/SourceEntityLoader.scala | 497 ++
 typr/src/scala/typr/cli/commands/Watch.scala | 155 +
 .../scala/typr/cli/config/ConfigParser.scala | 110 +
 .../typr/cli/config/ConfigToOptions.scala | 546 ++
 .../scala/typr/cli/config/ConfigWriter.scala | 40 +
 .../typr/cli/config/EnvSubstitution.scala | 25 +
 .../scala/typr/cli/config/TyprConfig.scala | 43 +
 .../scala/typr/cli/util/PatternMatcher.scala | 114 +
 .../scala/typr/grpc/BridgeProtoAdapter.scala | 76 +
 typr/src/scala/typr/grpc/GrpcCodegen.scala | 26 +-
 typr/src/scala/typr/grpc/GrpcOptions.scala | 10 +-
 ...ceCodegen.scala => FilesGrpcService.scala} | 319 +-
 .../typr/grpc/codegen/GrpcFramework.scala | 8 +-
 .../typr/grpc/codegen/GrpcFrameworkCats.scala | 25 +
 .../grpc/codegen/GrpcFrameworkQuarkus.scala | 17 +-
 .../grpc/codegen/GrpcFrameworkSpring.scala | 11 +-
 .../grpc/computed/ComputedGrpcMethod.scala | 55 +
 .../grpc/computed/ComputedGrpcService.scala | 89 +
 .../typr/internal/codegen/DuckDbAdapter.scala | 382 --
 .../codegen/FilePgCompositeType.scala | 269 -
 .../typr/openapi/parser/ModelExtractor.scala | 260 -
 6197 files changed, 244375 insertions(+), 232104 deletions(-)
 delete mode 100644 .claude/settings.local.json
 create mode 100644 .javafmt.conf
 create mode 100644 BRIDGE-ARCHITECTURE.md
 create mode 100644 BRIDGE-BACKEND.md
 create mode 100644 TYPR-DOMAIN-PROGRESS.md
 delete mode 100644 build.gradle.kts
 delete mode 100644 foundations-jdbc-dsl-kotlin/build.gradle.kts
 delete mode 100644 foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/Fragment.kt
 delete mode 100644 foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/KotlinDbTypes.kt
 delete mode 100644 foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/Operation.kt
 delete mode 100644 foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/OptionalExtensions.kt
 delete mode 100644 foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/ResultSetParser.kt
 delete mode 100644 foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/RowParser.kt
 delete mode 100644 foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/RuntimeExports.kt
 delete mode 100644 foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/RuntimeExtensions.kt
 delete mode 100644 foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/StaticExports.kt
 delete mode 100644 foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/Structure.kt
 delete mode 100644 foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Bijection.scala
 delete mode 100644 foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Bijections.scala
 delete mode 100644 foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/DslExports.scala
 delete mode 100644 foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Fragment.scala
 delete mode 100644 foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Operation.scala
 delete mode 100644 foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/ResultSetParser.scala
 delete mode 100644 foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/RowParser.scala
 delete mode 100644 foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/RuntimeExtensions.scala
 delete mode 100644 foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/ScalaDbTypes.scala
 delete mode 100644 foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/StaticExports.scala
 delete mode 100644 foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/package.scala
 delete mode 100644 foundations-jdbc-dsl/build.gradle.kts
 delete mode 100644 foundations-jdbc-hikari/src/java/dev/typr/foundations/hikari/HikariDataSourceFactory.java
delete mode 100644 foundations-jdbc-hikari/src/java/dev/typr/foundations/hikari/PoolConfig.java delete mode 100644 foundations-jdbc-hikari/src/java/dev/typr/foundations/hikari/PooledDataSource.java delete mode 100644 foundations-jdbc-scala/src/scala/dev/typr/foundations/scala/FragmentInterpolator.scala delete mode 100644 foundations-jdbc-test/src/java/dev/typr/foundations/Db2TypeTest.java delete mode 100644 foundations-jdbc-test/src/java/dev/typr/foundations/DuckDbTypeTest.java delete mode 100644 foundations-jdbc-test/src/java/dev/typr/foundations/MariaTypeTest.java delete mode 100644 foundations-jdbc-test/src/java/dev/typr/foundations/OracleTypeTest.java delete mode 100644 foundations-jdbc-test/src/java/dev/typr/foundations/PgRecordParserTest.java delete mode 100644 foundations-jdbc-test/src/java/dev/typr/foundations/PgStructTest.java delete mode 100644 foundations-jdbc-test/src/java/dev/typr/foundations/PgTypeTest.java delete mode 100644 foundations-jdbc-test/src/java/dev/typr/foundations/SqlServerTypeTest.java delete mode 100644 foundations-jdbc/build.gradle.kts delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/And.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/ArrParser.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/Db2Json.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/Db2Read.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/Db2Text.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/Db2Type.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/Db2Typename.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/Db2Types.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/Db2Write.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DbJson.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DbJsonRow.java delete mode 100644 
foundations-jdbc/src/java/dev/typr/foundations/DbRead.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DbText.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DbType.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DbTypename.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DbWrite.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DuckDbJson.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DuckDbMapSupport.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DuckDbRead.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DuckDbStringifier.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DuckDbStruct.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DuckDbText.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DuckDbType.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DuckDbTypename.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DuckDbTypes.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DuckDbUnion.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/DuckDbWrite.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/Either.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/Fragment.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/Inserter.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/MariaJson.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/MariaRead.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/MariaText.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/MariaType.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/MariaTypename.java delete mode 100644 
foundations-jdbc/src/java/dev/typr/foundations/MariaTypes.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/MariaWrite.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/NonEmptyBlob.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/NonEmptyString.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/Operation.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/OracleJson.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/OracleNestedTable.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/OracleObject.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/OracleRead.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/OracleType.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/OracleTypename.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/OracleTypes.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/OracleVArray.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/OracleWrite.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/PaddedString.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/PgCompositeText.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/PgJson.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/PgRead.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/PgRecordParser.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/PgStruct.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/PgText.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/PgType.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/PgTypename.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/PgTypes.java delete mode 100644 
foundations-jdbc/src/java/dev/typr/foundations/PgWrite.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/ResultSetParser.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/RowParser.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SingleValueResultSetWrapper.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SqlBiConsumer.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SqlBiFunction.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SqlConsumer.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SqlFunction.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SqlServerJson.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SqlServerRead.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SqlServerText.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SqlServerType.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SqlServerTypename.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SqlServerTypes.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SqlServerWrite.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/SqlSupplier.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/StructResultSetWrapper.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/Transactor.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/ConnectionSettings.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/ConnectionSource.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/DatabaseConfig.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/DatabaseKind.java delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/SimpleDataSource.java 
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/TransactionIsolation.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/db2/Db2Config.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/duckdb/DuckDbConfig.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/mariadb/MariaDbConfig.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/mariadb/MariaSslMode.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/oracle/OracleConfig.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgAutosave.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgChannelBinding.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgEscapeSyntaxCallMode.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgGssEncMode.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgGssLib.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgQueryMode.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgReadOnlyMode.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgReplication.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgSslMode.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgSslNegotiation.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgTargetServerType.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PostgresConfig.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerApplicationIntent.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerAuthentication.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerAuthenticationScheme.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerColumnEncryptionSetting.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerConfig.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerEncrypt.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerResponseBuffering.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerSelectMethod.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/AclItem.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/AnyArray.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Arr.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Cidr.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/HierarchyId.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Inet.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Int2Vector.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Json.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/JsonParser.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/JsonValue.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Jsonb.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/MacAddr.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/MacAddr8.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Money.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Oid.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/OidVector.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/OracleIntervalDS.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/OracleIntervalYM.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/PgName.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/PgNodeTree.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Range.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/RangeBound.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/RangeFinite.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/RangeParser.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Record.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Regclass.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Regconfig.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Regdictionary.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Regnamespace.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Regoper.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Regoperator.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Regproc.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Regprocedure.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Regrole.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Regtype.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Uint1.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Uint2.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Uint4.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Uint8.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Unknown.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Vector.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Xid.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/Xml.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/maria/Inet4.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/maria/Inet6.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/maria/MariaSet.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/precise/BinaryN.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/precise/DecimalN.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/precise/InstantN.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/precise/LocalDateTimeN.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/precise/LocalTimeN.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/precise/NonEmptyPaddedStringN.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/precise/NonEmptyStringN.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/precise/OffsetDateTimeN.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/precise/PaddedStringN.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/data/precise/StringN.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/dsl/Bijection.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/internal/ByteArrays.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/internal/RandomHelper.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/internal/TypoPGObjectHelper.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/internal/arrayMap.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/internal/stringInterpolator.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/internal/stripMargin.java
 delete mode 100644 foundations-jdbc/src/java/dev/typr/foundations/streamingInsert.java
 delete mode 100644 gradle.properties
 delete mode 100644 gradle/wrapper/gradle-wrapper.jar
 delete mode 100644 gradle/wrapper/gradle-wrapper.properties
 delete mode 100755 gradlew
 delete mode 100644 gradlew.bat
 delete mode 100644 settings.gradle.kts
 create mode 100644 site-in/type-safety/precise-types.md
 create mode 100644 site-in/unified-types/best-practices.md
 create mode 100644 site-in/unified-types/cli.md
 create mode 100644 site-in/unified-types/overview.md
 create mode 100644 site-in/unified-types/yaml-config.md
 delete mode 100644 site/docs-api/index.md
 delete mode 100644 site/docs-avro/reference/options.md
 delete mode 100644 site/docs-db/comparison.md
 delete mode 100644 site/docs-db/other-features/testing-with-random-values.md
 delete mode 100644 site/docs-db/other-features/testing-with-stubs.md
 delete mode 100644 site/docs-db/patterns/dynamic-queries.md
 delete mode 100644 site/docs-db/patterns/multi-repo.md
 delete mode 100644 site/docs-db/readme.md
 delete mode 100644 site/docs-db/setup.md
 delete mode 100644 site/docs-db/type-safety/arrays.md
 delete mode 100644 site/docs-db/type-safety/date-time.md
 delete mode 100644 site/docs-db/type-safety/defaulted-types.md
 delete mode 100644 site/docs-db/type-safety/domains.md
 delete mode 100644 site/docs-db/type-safety/id-types.md
 delete mode 100644 site/docs-db/type-safety/open-string-enums.md
 delete mode 100644 site/docs-db/type-safety/string-enums.md
 delete mode 100644 site/docs-db/type-safety/type-flow.md
 delete mode 100644 site/docs-db/type-safety/typo-types.md
 delete mode 100644 site/docs-jdbc/duckdb.md
 delete mode 100644 site/docs-jdbc/mariadb.md
 delete mode 100644 site/docs-jdbc/oracle.md
 delete mode 100644 site/docs-jdbc/postgresql.md
 delete mode 100644 site/docs-jdbc/readme.md
 delete mode 100644 site/docs-jdbc/sqlserver.md
 create mode 100644 site/docs-typr/best-practices.md
 rename site/{docs-api => docs-typr/boundaries/apis}/client-generation.md (98%)
 create mode 100644 site/docs-typr/boundaries/apis/index.md
 rename site/{docs-api => docs-typr/boundaries/apis}/response-types.md (92%)
 rename site/{docs-api => docs-typr/boundaries/apis}/server-frameworks.md (98%)
 rename site/{docs-api => docs-typr/boundaries/apis}/type-safe-ids.md (92%)
 rename site/{docs-api => docs-typr/boundaries/apis}/usage.md (97%)
 rename site/{docs-db => docs-typr/boundaries/databases}/customization/customize-naming.md (99%)
 rename site/{docs-db => docs-typr/boundaries/databases}/customization/customize-nullability.md (100%)
 rename site/{docs-db => docs-typr/boundaries/databases}/customization/customize-selected-relations.md (79%)
 rename site/{docs-db => docs-typr/boundaries/databases}/customization/customize-sql-files.md (100%)
 rename site/{docs-db => docs-typr/boundaries/databases}/customization/customize-types.md (81%)
 rename site/{docs-db => docs-typr/boundaries/databases}/customization/overview.md (94%)
 rename site/{docs-db => docs-typr/boundaries/databases}/customization/selector.md (96%)
 rename site/{docs-db => docs-typr/boundaries/databases}/limitations.md (80%)
 rename site/{docs-db => docs-typr/boundaries/databases}/other-features/clickable-links.md (86%)
 rename site/{docs-db => docs-typr/boundaries/databases}/other-features/constraints.md (89%)
 rename site/{docs-db => docs-typr/boundaries/databases}/other-features/dsl-in-depth.md (94%)
 rename site/{docs-db => docs-typr/boundaries/databases}/other-features/faster-compilation.md (93%)
 rename site/{docs-db => docs-typr/boundaries/databases}/other-features/flexible.md (79%)
 rename site/{docs-db => docs-typr/boundaries/databases}/other-features/generate-into-multiple-projects.md (92%)
 rename site/{docs-db => docs-typr/boundaries/databases}/other-features/json.md (97%)
 rename site/{docs-db => docs-typr/boundaries/databases}/other-features/scala-js-ready.md (79%)
 rename site/{docs-db => docs-typr/boundaries/databases}/other-features/streaming-inserts.md (95%)
 create mode 100644 site/docs-typr/boundaries/databases/other-features/testing-with-random-values.md
 create mode 100644 site/docs-typr/boundaries/databases/other-features/testing-with-stubs.md
 create mode 100644 site/docs-typr/boundaries/databases/patterns/dynamic-queries.md
 create mode 100644 site/docs-typr/boundaries/databases/patterns/multi-repo.md
 create mode 100644 site/docs-typr/boundaries/databases/readme.md
 create mode 100644 site/docs-typr/boundaries/databases/setup.md
 create mode 100644 site/docs-typr/boundaries/databases/type-safety/arrays.md
 create mode 100644 site/docs-typr/boundaries/databases/type-safety/collection-types.md
 create mode 100644 site/docs-typr/boundaries/databases/type-safety/date-time.md
 create mode 100644 site/docs-typr/boundaries/databases/type-safety/defaulted-types.md
 create mode 100644 site/docs-typr/boundaries/databases/type-safety/domains.md
 create mode 100644 site/docs-typr/boundaries/databases/type-safety/enums.md
 create mode 100644 site/docs-typr/boundaries/databases/type-safety/id-types.md
 create mode 100644 site/docs-typr/boundaries/databases/type-safety/maps.md
 create mode 100644 site/docs-typr/boundaries/databases/type-safety/open-enums.md
 create mode 100644 site/docs-typr/boundaries/databases/type-safety/precise-types.md
 create mode 100644 site/docs-typr/boundaries/databases/type-safety/struct-types.md
 create mode 100644 site/docs-typr/boundaries/databases/type-safety/type-flow.md
 rename site/{docs-db => docs-typr/boundaries/databases}/type-safety/user-selected-types.md (76%)
 rename site/{docs-db => docs-typr/boundaries/databases}/what-is/dsl.md (92%)
 rename site/{docs-db => docs-typr/boundaries/databases}/what-is/relations.md (92%)
 rename site/{docs-db => docs-typr/boundaries/databases}/what-is/sql-is-king.md (86%)
 rename site/{docs-avro => docs-typr/boundaries/events}/kafka/consumers.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/kafka/headers.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/kafka/multi-event.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/kafka/producers.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/readme.md (86%)
 rename site/{docs-avro => docs-typr/boundaries/events}/reference/limitations.md (100%)
 create mode 100644 site/docs-typr/boundaries/events/reference/options.md
 rename site/{docs-avro => docs-typr/boundaries/events}/reference/type-mappings.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/rpc/protocols.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/rpc/quarkus.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/rpc/result-adt.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/rpc/spring.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/setup.md (71%)
 rename site/{docs-avro => docs-typr/boundaries/events}/type-safety/precise-types.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/type-safety/unions.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/type-safety/wrapper-types.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/what-is/effect-types.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/what-is/schemas.md (100%)
 rename site/{docs-avro => docs-typr/boundaries/events}/what-is/wire-formats.md (100%)
 create mode 100644 site/docs-typr/cli.md
 create mode 100644 site/docs-typr/comparison.md
 create mode 100644 site/docs-typr/configuration.md
 create mode 100644 site/docs-typr/getting-started.md
 create mode 100644 site/docs-typr/index.md
 create mode 100644 site/docs-typr/matchers.md
 create mode 100644 site/docs-typr/unified-types/configuration.md
 create mode 100644 site/docs-typr/unified-types/domain-types.md
 create mode 100644 site/docs-typr/unified-types/field-types.md
 create mode 100644 site/docs-typr/unified-types/index.md
 create mode 100644 site/showcase-demo/postgres/java/src/java/showcase/ShowcaseDemo.java
 delete mode 120000 site/showcase-generated
 delete mode 100644 site/sidebars-api.js
 delete mode 100644 site/sidebars-avro.js
 delete mode 100644 site/sidebars-db.js
 delete mode 100644 site/sidebars-jdbc.js
 create mode 100644 site/sidebars-typr.js
 create mode 100644 site/src/components/BoundaryDiagram/IslandConnected.js
 create mode 100644 site/src/components/BoundaryDiagram/IslandIsolated.js
 create mode 100644 site/src/components/BoundaryDiagram/index.js
 create mode 100644 site/src/components/BoundaryDiagram/styles.module.css
 create mode 100644 site/src/components/CodeSample/index.js
 create mode 100644 site/src/components/CodeSample/styles.module.css
 create mode 100644 site/src/components/ShowcaseSnippet/index.js
 create mode 100644 site/src/components/ShowcaseSnippet/styles.module.css
 create mode 100644 site/src/components/StackSelector/index.js
 create mode 100644 site/src/components/StackSelector/styles.module.css
 create mode 100644 site/src/components/UsageExample/index.js
 create mode 100644 site/src/context/StackContext.js
 create mode 100644 site/src/data/codeSamples.js
 create mode 100644 site/src/data/showcaseFiles.js
 create mode 100644 site/src/theme/Root.js
 create mode 100644 site/usage-examples/patterns/MultiRepoExample.java
 create mode 100644 site/usage-examples/patterns/MultiRepoExample.kt
 create mode 100644 site/usage-examples/patterns/MultiRepoExample.scala
 create mode 100644 site/usage-examples/postgres/java/DomainInsertImpl.java
 create mode 100644 site/usage-examples/postgres/java/TestInsertExample.java
 create mode 100644 site/usage-examples/postgres/kotlin/DomainInsertImpl.kt
 create mode 100644 site/usage-examples/postgres/kotlin/TestInsertExample.kt
 create mode 100644 site/usage-examples/postgres/scala/DomainInsertImpl.scala
 create mode 100644 site/usage-examples/postgres/scala/TestInsertExample.scala
 delete mode 100644 testers/avro/kotlin-json/build.gradle.kts
 delete mode 100644 testers/avro/kotlin-json/gradle.properties
 delete mode 100644 testers/avro/kotlin-quarkus-mutiny/build.gradle.kts
 delete mode 100644 testers/avro/kotlin-quarkus-mutiny/gradle.properties
 delete mode 100644 testers/avro/kotlin/build.gradle.kts
 delete mode 100644 testers/avro/kotlin/gradle.properties
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/AddressListener.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/AddressPublisher.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/CustomerOrderListener.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/CustomerOrderPublisher.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/DynamicValueListener.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/DynamicValuePublisher.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/InvoiceListener.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/InvoicePublisher.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/LinkedListNodeListener.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/LinkedListNodePublisher.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/OrderEventsListener.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/OrderEventsPublisher.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/PaymentCallback.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/PaymentCharged.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/TreeNodeListener.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/TreeNodePublisher.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/common/MoneyListener.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/common/MoneyPublisher.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/AddressConsumer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/AddressHandler.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/CustomerOrderConsumer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/CustomerOrderHandler.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/DynamicValueConsumer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/DynamicValueHandler.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/InvoiceConsumer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/InvoiceHandler.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/LinkedListNodeConsumer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/LinkedListNodeHandler.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/MoneyConsumer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/MoneyHandler.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/OrderEventsConsumer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/OrderEventsHandler.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/TreeNodeConsumer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/consumer/TreeNodeHandler.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/producer/AddressProducer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/producer/CustomerOrderProducer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/producer/DynamicValueProducer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/producer/InvoiceProducer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/producer/LinkedListNodeProducer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/producer/MoneyProducer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/producer/OrderEventsProducer.scala
 delete mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/producer/TreeNodeProducer.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/serde/PaymentCallbackSerde.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/events/serde/PaymentChargedSerde.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/service/CreateUserRequest.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/service/CreateUserResponse.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/service/DeleteUserRequest.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/service/DeleteUserResponse.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/service/GetUserRequest.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/service/GetUserResponse.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/service/NotifyUserRequest.scala
 create mode 100644 testers/avro/scala-cats/generated-and-checked-in/com/example/service/UserServiceRequest.scala
 create mode 100644 testers/avro/schemas/order-events/PaymentCallback.avsc
 create mode 100644 testers/avro/schemas/order-events/PaymentCharged.avsc
 create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/Address.java
 create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/CustomerId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/CustomerOrder.java
 create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/DynamicValue.java
 create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/Email.java
 create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/Invoice.java
 create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/LinkedListNode.java
 create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/OrderCancelled.java
 create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/OrderEvents.java
 create mode 100644
testers/combined/java/generated-and-checked-in/avro_events/com/example/events/OrderId.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/OrderPlaced.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/OrderStatus.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/OrderUpdated.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/PaymentCallback.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/PaymentCharged.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/TreeNode.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/events/common/Money.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/service/Result.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/service/User.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/service/UserNotFoundError.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/service/UserService.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/service/UserServiceHandler.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/com/example/service/ValidationError.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/SchemaValidator.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/StringOrIntOrBoolean.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/StringOrLong.java create mode 100644 
testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/Topics.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/TypedTopic.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/AddressConsumer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/AddressHandler.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/CustomerOrderConsumer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/CustomerOrderHandler.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/DynamicValueConsumer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/DynamicValueHandler.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/InvoiceConsumer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/InvoiceHandler.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/LinkedListNodeConsumer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/LinkedListNodeHandler.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/MoneyConsumer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/MoneyHandler.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/OrderEventsConsumer.java create mode 100644 
testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/OrderEventsHandler.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/TreeNodeConsumer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/consumer/TreeNodeHandler.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/header/StandardHeaders.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/precisetypes/Decimal10_2.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/precisetypes/Decimal18_4.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/producer/AddressProducer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/producer/CustomerOrderProducer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/producer/DynamicValueProducer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/producer/InvoiceProducer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/producer/LinkedListNodeProducer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/producer/MoneyProducer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/producer/OrderEventsProducer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/producer/TreeNodeProducer.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/AddressSerde.java create mode 100644 
testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/CustomerOrderSerde.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/DynamicValueSerde.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/InvoiceSerde.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/LinkedListNodeSerde.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/MoneySerde.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/OrderCancelledSerde.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/OrderEventsSerde.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/OrderPlacedSerde.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/OrderUpdatedSerde.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/PaymentCallbackSerde.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/PaymentChargedSerde.java create mode 100644 testers/combined/java/generated-and-checked-in/avro_events/combined/avro_events/serde/TreeNodeSerde.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/AllBrandsCategoriesCSet.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/AllBrandsCategoriesCSetMember.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/BestsellerClearanceFSet.java create mode 100644 
testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/BestsellerClearanceFSetMember.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/EmailMailPushSmsSet.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/EmailMailPushSmsSetMember.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/XYZSet.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/XYZSetMember.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/bridge/Customer.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesRepo.java create mode 100644 
testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestRepoImpl.java create mode 100644 
testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialRepoImpl.java create mode 100644 
testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueRepoImpl.java create mode 100644 
testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatestnull/MariatestnullFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatestnull/MariatestnullRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatestnull/MariatestnullRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatestnull/MariatestnullRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/mariatestnull/MariatestnullRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryRowUnsaved.java create mode 100644 
testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsId.java create mode 100644 
testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Binary16.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Binary32.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Binary64.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Decimal10_2.java create mode 100644 
testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Decimal12_4.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Decimal18_4.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Decimal5_2.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Decimal8_2.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/LocalDateTime3.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/LocalDateTime6.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/LocalTime3.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/LocalTime6.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/PaddedString10.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/String10.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/String100.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/String20.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/String255.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/String50.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesId.java create mode 100644 
testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullId.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullRepo.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullRepoImpl.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullRepoMock.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullRow.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullRowUnsaved.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersFields.java create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersId.java create mode 100644 
 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/userdefined/Email.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/userdefined/FirstName.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/userdefined/IsActive.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/userdefined/IsApproved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/userdefined/IsDefault.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/userdefined/IsPrimary.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/userdefined/IsVerifiedPurchase.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/userdefined/LastName.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_customer_summary/VCustomerSummaryViewFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_customer_summary/VCustomerSummaryViewRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_customer_summary/VCustomerSummaryViewRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_customer_summary/VCustomerSummaryViewRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_daily_sales/VDailySalesViewFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_daily_sales/VDailySalesViewRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_daily_sales/VDailySalesViewRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_daily_sales/VDailySalesViewRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_inventory_status/VInventoryStatusViewFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_inventory_status/VInventoryStatusViewRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_inventory_status/VInventoryStatusViewRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_inventory_status/VInventoryStatusViewRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_order_details/VOrderDetailsViewFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_order_details/VOrderDetailsViewRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_order_details/VOrderDetailsViewRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_order_details/VOrderDetailsViewRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_product_catalog/VProductCatalogViewFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_product_catalog/VProductCatalogViewRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_product_catalog/VProductCatalogViewRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_product_catalog/VProductCatalogViewRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_warehouse_coverage/VWarehouseCoverageViewFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_warehouse_coverage/VWarehouseCoverageViewRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_warehouse_coverage/VWarehouseCoverageViewRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/v_warehouse_coverage/VWarehouseCoverageViewRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/bridge/Customer.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/vemployee/VemployeeViewFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/vemployee/VemployeeViewRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/vemployee/VemployeeViewRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/humanresources/vemployee/VemployeeViewRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/address/AddressFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/address/AddressId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/address/AddressRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/address/AddressRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/address/AddressRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/address/AddressRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/address/AddressRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/password/PasswordFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/password/PasswordRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/password/PasswordRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/password/PasswordRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/password/PasswordRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/password/PasswordRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/precisetypes/PaddedString10.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/precisetypes/PaddedString3.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/precisetypes/String10.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/precisetypes/String100.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/precisetypes/String20.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/precisetypes/String255.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/precisetypes/String50.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/product/ProductFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/product/ProductId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/product/ProductRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/product/ProductRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/product/ProductRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/product/ProductRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/product/ProductRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/Myenum.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/flaff/FlaffFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/flaff/FlaffId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/flaff/FlaffRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/flaff/FlaffRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/flaff/FlaffRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/flaff/FlaffRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/identity_test/IdentityTestFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/identity_test/IdentityTestId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/identity_test/IdentityTestRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/identity_test/IdentityTestRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/identity_test/IdentityTestRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/identity_test/IdentityTestRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/identity_test/IdentityTestRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/issue142/Issue142Fields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/issue142/Issue142Id.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/issue142/Issue142Repo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/issue142/Issue142RepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/issue142/Issue142RepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/issue142/Issue142Row.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/issue142_2/Issue1422Fields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/issue142_2/Issue1422Repo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/issue142_2/Issue1422RepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/issue142_2/Issue1422RepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/issue142_2/Issue1422Row.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/only_pk_columns/OnlyPkColumnsFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/only_pk_columns/OnlyPkColumnsId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/only_pk_columns/OnlyPkColumnsRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/only_pk_columns/OnlyPkColumnsRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/only_pk_columns/OnlyPkColumnsRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/only_pk_columns/OnlyPkColumnsRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/pgtest/PgtestFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/pgtest/PgtestRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/pgtest/PgtestRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/pgtest/PgtestRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/pgtestnull/PgtestnullFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/pgtestnull/PgtestnullRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/pgtestnull/PgtestnullRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/pgtestnull/PgtestnullRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types/PrecisionTypesFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types/PrecisionTypesId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types/PrecisionTypesRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types/PrecisionTypesRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types/PrecisionTypesRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types/PrecisionTypesRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types/PrecisionTypesRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types_null/PrecisionTypesNullFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types_null/PrecisionTypesNullId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types_null/PrecisionTypesNullRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types_null/PrecisionTypesNullRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types_null/PrecisionTypesNullRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types_null/PrecisionTypesNullRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/precision_types_null/PrecisionTypesNullRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/title/TitleFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/title/TitleId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/title/TitleRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/title/TitleRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/title/TitleRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/title/TitleRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/title_domain/TitleDomainFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/title_domain/TitleDomainId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/title_domain/TitleDomainRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/title_domain/TitleDomainRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/title_domain/TitleDomainRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/title_domain/TitleDomainRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/titledperson/TitledpersonFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/titledperson/TitledpersonRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/titledperson/TitledpersonRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/titledperson/TitledpersonRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/users/UsersFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/users/UsersId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/users/UsersRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/users/UsersRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/users/UsersRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/users/UsersRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/public_/users/UsersRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesperson/SalespersonFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesperson/SalespersonRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesperson/SalespersonRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesperson/SalespersonRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesperson/SalespersonRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesperson/SalespersonRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryFields.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryId.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryRepo.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryRepoImpl.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryRepoMock.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryRow.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryRowUnsaved.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/userdefined/ActiveFlag.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/userdefined/CurrentFlag.java
 create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/userdefined/Description.java
create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/userdefined/FirstName.java create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/userdefined/LastName.java create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/userdefined/MiddleName.java create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/userdefined/OnlineOrderFlag.java create mode 100644 testers/combined/java/generated-and-checked-in/postgres/combined/postgres/userdefined/SalariedFlag.java delete mode 100644 testers/combined/java/generated-and-checked-in/shared/combined/shared/FirstName.java delete mode 100644 testers/combined/java/generated-and-checked-in/shared/combined/shared/IsActive.java delete mode 100644 testers/combined/java/generated-and-checked-in/shared/combined/shared/IsSalaried.java delete mode 100644 testers/combined/java/generated-and-checked-in/shared/combined/shared/LastName.java delete mode 100644 testers/combined/java/generated-and-checked-in/shared/combined/shared/MiddleName.java create mode 100644 testers/combined/kotlin/generated-and-checked-in/api/combined/api/api/CustomersApi.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/api/combined/api/api/CustomersApiServer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/api/combined/api/api/EmployeesApi.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/api/combined/api/api/EmployeesApiServer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/api/combined/api/api/ProductsApi.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/api/combined/api/api/ProductsApiServer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/api/combined/api/model/Customer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/api/combined/api/model/CustomerCreate.kt create mode 
100644 testers/combined/kotlin/generated-and-checked-in/api/combined/api/model/CustomerUpdate.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/api/combined/api/model/Employee.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/api/combined/api/model/Product.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/Address.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/CustomerId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/CustomerOrder.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/DynamicValue.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/Email.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/Invoice.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/LinkedListNode.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/OrderCancelled.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/OrderEvents.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/OrderId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/OrderPlaced.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/OrderStatus.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/OrderUpdated.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/PaymentCallback.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/PaymentCharged.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/TreeNode.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/events/common/Money.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/service/Result.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/service/User.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/service/UserNotFoundError.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/service/UserService.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/service/UserServiceHandler.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/com/example/service/ValidationError.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/SchemaValidator.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/StringOrIntOrBoolean.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/StringOrLong.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/Topics.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/TypedTopic.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/AddressConsumer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/AddressHandler.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/CustomerOrderConsumer.kt 
create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/CustomerOrderHandler.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/DynamicValueConsumer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/DynamicValueHandler.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/InvoiceConsumer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/InvoiceHandler.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/LinkedListNodeConsumer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/LinkedListNodeHandler.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/MoneyConsumer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/MoneyHandler.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/OrderEventsConsumer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/OrderEventsHandler.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/TreeNodeConsumer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/consumer/TreeNodeHandler.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/header/StandardHeaders.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/precisetypes/Decimal10_2.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/precisetypes/Decimal18_4.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/producer/AddressProducer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/producer/CustomerOrderProducer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/producer/DynamicValueProducer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/producer/InvoiceProducer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/producer/LinkedListNodeProducer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/producer/MoneyProducer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/producer/OrderEventsProducer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/producer/TreeNodeProducer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/AddressSerde.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/CustomerOrderSerde.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/DynamicValueSerde.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/InvoiceSerde.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/LinkedListNodeSerde.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/MoneySerde.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/OrderCancelledSerde.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/OrderEventsSerde.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/OrderPlacedSerde.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/OrderUpdatedSerde.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/PaymentCallbackSerde.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/PaymentChargedSerde.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/avro_events/combined/avro_events/serde/TreeNodeSerde.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/AllBrandsCategoriesCSet.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/AllBrandsCategoriesCSetMember.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/BestsellerClearanceFSet.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/BestsellerClearanceFSetMember.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/DefaultedDeserializer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/DefaultedSerializer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/EmailMailPushSmsSet.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/EmailMailPushSmsSetMember.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/XYZSet.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/XYZSetMember.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/audit_log/AuditLogRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/brands/BrandsFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/brands/BrandsId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/brands/BrandsRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/brands/BrandsRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/brands/BrandsRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/brands/BrandsRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/brands/BrandsRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/bridge/Customer.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/categories/CategoriesFields.kt create mode 
100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/categories/CategoriesId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/categories/CategoriesRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/categories/CategoriesRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/categories/CategoriesRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/categories/CategoriesRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/categories/CategoriesRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_addresses/CustomerAddressesRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_status/CustomerStatusFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_status/CustomerStatusId.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_status/CustomerStatusRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_status/CustomerStatusRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_status/CustomerStatusRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_status/CustomerStatusRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customer_status/CustomerStatusRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customers/CustomersFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customers/CustomersId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customers/CustomersRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customers/CustomersRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customers/CustomersRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customers/CustomersRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customers/CustomersRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/customtypes/Defaulted.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryRepo.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/inventory/InventoryRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest/MariatestRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityRepoImpl.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_identity/MariatestIdentityRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial/MariatestSpatialRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullRepoImpl.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_spatial_null/MariatestSpatialNullRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatest_unique/MariatestUniqueRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatestnull/MariatestnullFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatestnull/MariatestnullRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatestnull/MariatestnullRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatestnull/MariatestnullRow.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/mariatestnull/MariatestnullRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_history/OrderHistoryRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/order_items/OrderItemsRowUnsaved.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/orders/OrdersRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payment_methods/PaymentMethodsRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/payments/PaymentsRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Binary16.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Binary32.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Binary64.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Decimal10_2.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Decimal12_4.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Decimal18_4.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Decimal5_2.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/Decimal8_2.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/LocalDateTime3.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/LocalDateTime6.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/LocalTime3.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/LocalTime6.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/PaddedString10.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/String10.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/String100.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/String20.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/String255.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precisetypes/String50.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types/PrecisionTypesRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/precision_types_null/PrecisionTypesNullRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/price_tiers/PriceTiersRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_categories/ProductCategoriesRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_images/ProductImagesRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/product_prices/ProductPricesRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/products/ProductsFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/products/ProductsId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/products/ProductsRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/products/ProductsRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/products/ProductsRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/products/ProductsRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/products/ProductsRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/promotions/PromotionsRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/reviews/ReviewsRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipments/ShipmentsRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/shipping_carriers/ShippingCarriersRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/userdefined/Email.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/userdefined/FirstName.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/userdefined/IsActive.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/userdefined/IsApproved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/userdefined/IsDefault.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/userdefined/IsPrimary.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/userdefined/IsVerifiedPurchase.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/userdefined/LastName.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_customer_summary/VCustomerSummaryViewFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_customer_summary/VCustomerSummaryViewRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_customer_summary/VCustomerSummaryViewRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_customer_summary/VCustomerSummaryViewRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_daily_sales/VDailySalesViewFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_daily_sales/VDailySalesViewRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_daily_sales/VDailySalesViewRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_daily_sales/VDailySalesViewRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_inventory_status/VInventoryStatusViewFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_inventory_status/VInventoryStatusViewRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_inventory_status/VInventoryStatusViewRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_inventory_status/VInventoryStatusViewRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_order_details/VOrderDetailsViewFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_order_details/VOrderDetailsViewRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_order_details/VOrderDetailsViewRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_order_details/VOrderDetailsViewRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_product_catalog/VProductCatalogViewFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_product_catalog/VProductCatalogViewRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_product_catalog/VProductCatalogViewRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_product_catalog/VProductCatalogViewRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_warehouse_coverage/VWarehouseCoverageViewFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_warehouse_coverage/VWarehouseCoverageViewRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_warehouse_coverage/VWarehouseCoverageViewRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/v_warehouse_coverage/VWarehouseCoverageViewRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/mariadb/combined/mariadb/warehouses/WarehousesRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/DefaultedDeserializer.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/DefaultedSerializer.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/bridge/Customer.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/customtypes/Defaulted.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/department/DepartmentRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employee/EmployeeFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employee/EmployeeRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employee/EmployeeRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employee/EmployeeRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employee/EmployeeRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employee/EmployeeRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/employeedepartmenthistory/EmployeedepartmenthistoryRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/shift/ShiftRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/vemployee/VemployeeViewFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/vemployee/VemployeeViewRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/vemployee/VemployeeViewRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/humanresources/vemployee/VemployeeViewRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/information_schema/CardinalNumber.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/information_schema/CharacterData.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/information_schema/SqlIdentifier.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/information_schema/TimeStamp.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/information_schema/YesOrNo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/address/AddressFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/address/AddressId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/address/AddressRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/address/AddressRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/address/AddressRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/address/AddressRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/address/AddressRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/addresstype/AddresstypeRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentity/BusinessentityFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentity/BusinessentityId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentity/BusinessentityRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentity/BusinessentityRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentity/BusinessentityRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentity/BusinessentityRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentity/BusinessentityRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/businessentityaddress/BusinessentityaddressRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/countryregion/CountryregionRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/emailaddress/EmailaddressFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/emailaddress/EmailaddressId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/emailaddress/EmailaddressRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/emailaddress/EmailaddressRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/emailaddress/EmailaddressRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/emailaddress/EmailaddressRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/emailaddress/EmailaddressRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/password/PasswordFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/password/PasswordRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/password/PasswordRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/password/PasswordRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/password/PasswordRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/password/PasswordRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/person/PersonFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/person/PersonRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/person/PersonRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/person/PersonRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/person/PersonRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/person/PersonRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/person/stateprovince/StateprovinceRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/precisetypes/PaddedString10.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/precisetypes/PaddedString3.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/precisetypes/String10.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/precisetypes/String100.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/precisetypes/String20.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/precisetypes/String255.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/precisetypes/String50.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/product/ProductFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/product/ProductId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/product/ProductRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/product/ProductRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/product/ProductRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/product/ProductRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/product/ProductRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcategory/ProductcategoryRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productcosthistory/ProductcosthistoryRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productmodel/ProductmodelRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/productsubcategory/ProductsubcategoryRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureFields.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureId.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureRepo.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureRepoImpl.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureRepoMock.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureRow.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/production/unitmeasure/UnitmeasureRowUnsaved.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/AccountNumber.kt
 create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/Address.kt
 create mode 100644
testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/AllTypesComposite.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/Complex.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/ContactInfo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/EmployeeRecord.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/Flag.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/InventoryItem.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/MetadataRecord.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/Mydomain.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/Myenum.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/Name.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/NameStyle.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/NullableTest.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/OrderNumber.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/PersonName.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/Phone.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/Point2d.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/PolygonCustom.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/ShortText.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/TablefuncCrosstab2.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/TablefuncCrosstab3.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/TablefuncCrosstab4.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/TextWithSpecialChars.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/TreeNode.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/flaff/FlaffFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/flaff/FlaffId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/flaff/FlaffRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/flaff/FlaffRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/flaff/FlaffRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/flaff/FlaffRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/identity_test/IdentityTestFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/identity_test/IdentityTestId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/identity_test/IdentityTestRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/identity_test/IdentityTestRepoImpl.kt create 
mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/identity_test/IdentityTestRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/identity_test/IdentityTestRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/identity_test/IdentityTestRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/issue142/Issue142Fields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/issue142/Issue142Id.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/issue142/Issue142Repo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/issue142/Issue142RepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/issue142/Issue142RepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/issue142/Issue142Row.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/issue142_2/Issue1422Fields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/issue142_2/Issue1422Repo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/issue142_2/Issue1422RepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/issue142_2/Issue1422RepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/issue142_2/Issue1422Row.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/only_pk_columns/OnlyPkColumnsFields.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/only_pk_columns/OnlyPkColumnsId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/only_pk_columns/OnlyPkColumnsRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/only_pk_columns/OnlyPkColumnsRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/only_pk_columns/OnlyPkColumnsRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/only_pk_columns/OnlyPkColumnsRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/pgtest/PgtestFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/pgtest/PgtestRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/pgtest/PgtestRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/pgtest/PgtestRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/pgtestnull/PgtestnullFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/pgtestnull/PgtestnullRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/pgtestnull/PgtestnullRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/pgtestnull/PgtestnullRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types/PrecisionTypesFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types/PrecisionTypesId.kt create mode 
100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types/PrecisionTypesRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types/PrecisionTypesRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types/PrecisionTypesRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types/PrecisionTypesRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types/PrecisionTypesRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types_null/PrecisionTypesNullFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types_null/PrecisionTypesNullId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types_null/PrecisionTypesNullRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types_null/PrecisionTypesNullRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types_null/PrecisionTypesNullRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types_null/PrecisionTypesNullRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/precision_types_null/PrecisionTypesNullRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/title/TitleFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/title/TitleId.kt 
create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/title/TitleRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/title/TitleRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/title/TitleRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/title/TitleRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/title_domain/TitleDomainFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/title_domain/TitleDomainId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/title_domain/TitleDomainRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/title_domain/TitleDomainRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/title_domain/TitleDomainRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/title_domain/TitleDomainRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/titledperson/TitledpersonFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/titledperson/TitledpersonRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/titledperson/TitledpersonRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/titledperson/TitledpersonRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/users/UsersFields.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/users/UsersId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/users/UsersRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/users/UsersRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/users/UsersRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/users/UsersRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/public/users/UsersRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesperson/SalespersonFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesperson/SalespersonRepo.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesperson/SalespersonRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesperson/SalespersonRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesperson/SalespersonRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesperson/SalespersonRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryFields.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryId.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryRepo.kt create mode 100644 
testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryRepoImpl.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryRepoMock.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryRow.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/sales/salesterritory/SalesterritoryRowUnsaved.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/userdefined/ActiveFlag.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/userdefined/CurrentFlag.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/userdefined/Description.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/userdefined/FirstName.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/userdefined/LastName.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/userdefined/MiddleName.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/userdefined/OnlineOrderFlag.kt create mode 100644 testers/combined/kotlin/generated-and-checked-in/postgres/combined/postgres/userdefined/SalariedFlag.kt create mode 100644 testers/db2/java/generated-and-checked-in/testdb/bridge/Customer.java delete mode 100644 testers/db2/kotlin/build.gradle.kts create mode 100644 testers/db2/kotlin/generated-and-checked-in/testdb/bridge/Customer.kt delete mode 100644 testers/db2/kotlin/gradle.properties create mode 100644 testers/db2/scala/generated-and-checked-in/testdb/bridge/Customer.scala create mode 100644 testers/duckdb/java/generated-and-checked-in/testdb/bridge/Customer.java delete 
mode 100644 testers/duckdb/kotlin/build.gradle.kts create mode 100644 testers/duckdb/kotlin/generated-and-checked-in/testdb/bridge/Customer.kt delete mode 100644 testers/duckdb/kotlin/gradle.properties create mode 100644 testers/duckdb/scala/generated-and-checked-in/testdb/bridge/Customer.scala delete mode 100644 testers/grpc/kotlin-quarkus/build.gradle.kts delete mode 100644 testers/grpc/kotlin-quarkus/gradle.properties delete mode 100644 testers/grpc/kotlin/build.gradle.kts delete mode 100644 testers/grpc/kotlin/gradle.properties create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/BankTransfer.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/ChatMessage.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/CreateOrderRequest.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/CreateOrderResponse.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/CreditCard.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/Customer.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/CustomerId.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/EchoService.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/EchoServiceClient.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/EchoServiceServer.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/GetCustomerRequest.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/GetCustomerResponse.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/Inner.scala create mode 100644 
testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/Inventory.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/ListOrdersRequest.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/Notification.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/NotificationTarget.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/OptionalFields.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/Order.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/OrderId.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/OrderService.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/OrderServiceClient.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/OrderServiceServer.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/OrderStatus.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/OrderSummary.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/OrderUpdate.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/Outer.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/PaymentMethod.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/PaymentMethodMethod.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/Priority.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/ScalarTypes.scala create mode 100644 testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/Wallet.scala create mode 100644 
testers/grpc/scala-cats/generated-and-checked-in/com/example/grpc/WellKnownTypesMessage.scala create mode 100644 testers/grpc/scala-cats/src/scala/com/example/grpc/CatsGrpcIntegrationTest.scala create mode 100644 testers/jsonschema/test-schema.json create mode 100644 testers/mariadb/java/generated-and-checked-in/testdb/bridge/Customer.java delete mode 100644 testers/mariadb/kotlin/build.gradle.kts create mode 100644 testers/mariadb/kotlin/generated-and-checked-in/testdb/bridge/Customer.kt delete mode 100644 testers/mariadb/kotlin/gradle.properties create mode 100644 testers/mariadb/scala/generated-and-checked-in/testdb/bridge/Customer.scala delete mode 100644 testers/openapi/kotlin/jaxrs/build.gradle.kts delete mode 100644 testers/openapi/kotlin/quarkus/build.gradle.kts delete mode 100644 testers/openapi/kotlin/spring/build.gradle.kts create mode 100644 testers/oracle/java/generated-and-checked-in/oracledb/bridge/Customer.java delete mode 100644 testers/oracle/java/generated-and-checked-in/oracledb/userdefined/Email.java delete mode 100644 testers/oracle/kotlin/build.gradle.kts create mode 100644 testers/oracle/kotlin/generated-and-checked-in/oracledb/bridge/Customer.kt delete mode 100644 testers/oracle/kotlin/generated-and-checked-in/oracledb/userdefined/Email.kt delete mode 100644 testers/oracle/kotlin/gradle.properties create mode 100644 testers/oracle/scala-new/generated-and-checked-in/oracledb/bridge/Customer.scala delete mode 100644 testers/oracle/scala-new/generated-and-checked-in/oracledb/userdefined/Email.scala create mode 100644 testers/oracle/scala/generated-and-checked-in/oracledb/bridge/Customer.scala delete mode 100644 testers/oracle/scala/generated-and-checked-in/oracledb/userdefined/Email.scala create mode 100644 testers/pg/java/generated-and-checked-in/adventureworks/bridge/Customer.java create mode 100644 testers/pg/java/generated-and-checked-in/adventureworks/userdefined/Description.java delete mode 100644 testers/pg/kotlin/build.gradle.kts 
create mode 100644 testers/pg/kotlin/generated-and-checked-in/adventureworks/bridge/Customer.kt create mode 100644 testers/pg/kotlin/generated-and-checked-in/adventureworks/userdefined/Description.kt delete mode 100644 testers/pg/kotlin/gradle.properties create mode 100644 testers/pg/scala/anorm/generated-and-checked-in-2.13/adventureworks/bridge/Customer.scala create mode 100644 testers/pg/scala/anorm/generated-and-checked-in-2.13/adventureworks/userdefined/Description.scala create mode 100644 testers/pg/scala/anorm/generated-and-checked-in-3/adventureworks/bridge/Customer.scala create mode 100644 testers/pg/scala/anorm/generated-and-checked-in-3/adventureworks/userdefined/Description.scala create mode 100644 testers/pg/scala/doobie/generated-and-checked-in-2.13/adventureworks/bridge/Customer.scala create mode 100644 testers/pg/scala/doobie/generated-and-checked-in-2.13/adventureworks/userdefined/Description.scala create mode 100644 testers/pg/scala/doobie/generated-and-checked-in-3/adventureworks/bridge/Customer.scala create mode 100644 testers/pg/scala/doobie/generated-and-checked-in-3/adventureworks/userdefined/Description.scala create mode 100644 testers/pg/scala/javatypes/generated-and-checked-in/adventureworks/bridge/Customer.scala create mode 100644 testers/pg/scala/javatypes/generated-and-checked-in/adventureworks/userdefined/Description.scala create mode 100644 testers/pg/scala/scalatypes/generated-and-checked-in/adventureworks/bridge/Customer.scala create mode 100644 testers/pg/scala/scalatypes/generated-and-checked-in/adventureworks/userdefined/Description.scala create mode 100644 testers/pg/scala/zio-jdbc/generated-and-checked-in-2.13/adventureworks/bridge/Customer.scala create mode 100644 testers/pg/scala/zio-jdbc/generated-and-checked-in-2.13/adventureworks/userdefined/Description.scala create mode 100644 testers/pg/scala/zio-jdbc/generated-and-checked-in-3/adventureworks/bridge/Customer.scala create mode 100644 
testers/pg/scala/zio-jdbc/generated-and-checked-in-3/adventureworks/userdefined/Description.scala create mode 100644 testers/sqlserver/java/generated-and-checked-in/testdb/bridge/Customer.java delete mode 100644 testers/sqlserver/kotlin/build.gradle.kts create mode 100644 testers/sqlserver/kotlin/generated-and-checked-in/testdb/bridge/Customer.kt create mode 100644 testers/sqlserver/scala/generated-and-checked-in/testdb/bridge/Customer.scala create mode 100644 tests/src/scala/typr/bridge/FlowValidatorTest.scala create mode 100644 tests/src/scala/typr/bridge/SmartDefaultsTest.scala create mode 100644 tests/src/scala/typr/bridge/TypeNarrowerIntegrationTest.scala create mode 100644 tests/src/scala/typr/bridge/TypePolicyValidatorTest.scala create mode 100644 tests/src/scala/typr/cli/config/ConfigRoundtripTest.scala create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/AlignedSource.scala create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/ApiMatch.scala create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/AvroBoundary.scala create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/Boundary.scala create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/BoundarySelectors.scala create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/BridgeOutputConfig.scala create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/BridgeType.scala create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/DatabaseBoundary.scala create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/DbMatch.scala create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/DomainGenerateOptions.scala create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/DomainType.scala create mode 100644 
typr-codegen/generated-and-checked-in-jsonschema/config/generated/DuckdbBoundary.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/FeatureMatcher.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/FeatureMatcherArray.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/FeatureMatcherObject.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/FeatureMatcherString.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/FieldOverride.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/FieldOverrideEnum.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/FieldOverrideObject.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/FieldSpec.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/FieldSpecObject.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/FieldSpecString.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/FieldType.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/GrpcBoundary.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/HeaderField.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/HeaderSchema.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/JsonschemaBoundary.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/KeyType.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/KeyTypeEnum.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/KeyTypeObject.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/MatcherValue.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/MatcherValueArray.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/MatcherValueObject.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/MatcherValueString.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/Matchers.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/ModelMatch.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/NameAlignmentConfig.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/OpenapiBoundary.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/Output.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/Projection.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/StringOrArray.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/StringOrArrayArray.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/StringOrArrayString.scala
create mode 100644 typr-codegen/generated-and-checked-in-jsonschema/config/generated/ValidationRules.scala
create mode 100644 typr-codegen/generated-and-checked-in/anorm/typr/generated/ExecuteReturningSyntax.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/Text.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/comments/CommentsSqlRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/comments/CommentsSqlRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/comments/CommentsSqlRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/composite_types/CompositeTypesSqlRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/composite_types/CompositeTypesSqlRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/composite_types/CompositeTypesSqlRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/constraints/ConstraintsSqlRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/constraints/ConstraintsSqlRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/constraints/ConstraintsSqlRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/domains/DomainsSqlRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/domains/DomainsSqlRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/domains/DomainsSqlRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/enums/EnumsSqlRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/enums/EnumsSqlRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/enums/EnumsSqlRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/table_comments/TableCommentsSqlRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/table_comments/TableCommentsSqlRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/table_comments/TableCommentsSqlRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/view_find_all/ViewFindAllSqlRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/view_find_all/ViewFindAllSqlRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/custom/view_find_all/ViewFindAllSqlRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/customtypes/TypoAclItem.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/customtypes/TypoInstant.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/customtypes/TypoShort.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/CardinalNumber.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/CharacterData.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/SqlIdentifier.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/TimeStamp.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/YesOrNo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/columns/ColumnsViewRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/columns/ColumnsViewRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/columns/ColumnsViewRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/key_column_usage/KeyColumnUsageViewRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/key_column_usage/KeyColumnUsageViewRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/key_column_usage/KeyColumnUsageViewRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/referential_constraints/ReferentialConstraintsViewRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/referential_constraints/ReferentialConstraintsViewRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/referential_constraints/ReferentialConstraintsViewRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/table_constraints/TableConstraintsViewRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/table_constraints/TableConstraintsViewRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/table_constraints/TableConstraintsViewRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/tables/TablesViewRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/tables/TablesViewRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/information_schema/tables/TablesViewRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/package.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/pg_catalog/pg_namespace/PgNamespaceId.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/pg_catalog/pg_namespace/PgNamespaceRepo.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/pg_catalog/pg_namespace/PgNamespaceRepoImpl.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/pg_catalog/pg_namespace/PgNamespaceRepoMock.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/pg_catalog/pg_namespace/PgNamespaceRow.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/public/AccountNumber.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/public/Flag.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/public/Mydomain.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/public/Name.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/public/NameStyle.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/public/OrderNumber.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/public/Phone.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/public/ShortText.scala
create mode 100644 typr-codegen/generated-and-checked-in/typr/generated/streamingInsert.scala
rename {typr => typr-codegen}/src/resources/sqlglot_analyze.py (100%)
create mode 100644 typr-codegen/src/scala/typr/BridgeCompositeType.scala
rename {typr => typr-codegen}/src/scala/typr/DbLibName.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/DbType.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/Dialect.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/DslQualifiedNames.scala (66%)
rename {typr => typr-codegen}/src/scala/typr/FoundationsTypes.scala (87%)
rename {typr => typr-codegen}/src/scala/typr/GenerateConfig.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/Generated.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/JsonLibName.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/Lang.scala (92%)
rename {typr => typr-codegen}/src/scala/typr/MetaDb.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/Naming.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/NonEmptyList.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/Nullability.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/NullabilityOverride.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/Options.scala (91%)
rename {typr => typr-codegen}/src/scala/typr/ProjectGraph.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/RelPath.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/SchemaMode.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/SchemaSource.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/Scope.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/Selector.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/Source.scala (100%)
create mode 100644 typr-codegen/src/scala/typr/TypeDefinitions.scala
rename {typr => typr-codegen}/src/scala/typr/TypeOverride.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/TypeSupport.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/TypeSupportJava.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/TypeSupportScala.scala (97%)
rename {typr => typr-codegen}/src/scala/typr/TypesJava.scala (98%)
rename {typr => typr-codegen}/src/scala/typr/TypesKotlin.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/TypesScala.scala (97%)
rename {typr => typr-codegen}/src/scala/typr/TypoDataSource.scala (86%)
rename {typr => typr-codegen}/src/scala/typr/TypoLogger.scala (100%)
create mode 100644 typr-codegen/src/scala/typr/boundaries/framework/CatsFramework.scala
create mode 100644 typr-codegen/src/scala/typr/boundaries/framework/Framework.scala
create mode 100644 typr-codegen/src/scala/typr/boundaries/framework/FrameworkTypes.scala
create mode 100644 typr-codegen/src/scala/typr/boundaries/framework/HttpFramework.scala
create mode 100644 typr-codegen/src/scala/typr/boundaries/framework/MessagingFramework.scala
create mode 100644 typr-codegen/src/scala/typr/boundaries/framework/QuarkusFramework.scala
create mode 100644 typr-codegen/src/scala/typr/boundaries/framework/RpcFramework.scala
create mode 100644 typr-codegen/src/scala/typr/boundaries/framework/SpringFramework.scala
rename {typr => typr-codegen}/src/scala/typr/db.scala (99%)
rename {typr => typr-codegen}/src/scala/typr/effects/EffectType.scala (82%)
rename {typr => typr-codegen}/src/scala/typr/effects/EffectTypeOps.scala (90%)
rename {typr => typr-codegen}/src/scala/typr/generateFromConfig.scala (97%)
rename {typr => typr-codegen}/src/scala/typr/generateFromDb.scala (98%)
rename {typr => typr-codegen}/src/scala/typr/grpc/GrpcTypes.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ArrayName.scala (100%)
create mode 100644 typr-codegen/src/scala/typr/internal/ComputedBridgeCompositeType.scala
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedColumn.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedDefault.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedDomain.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedDuckDbStruct.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedMariaSet.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedNames.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedOracleCollectionType.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedOracleObjectType.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedPgCompositeType.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedRowUnsaved.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedSharedType.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedSqlFile.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedStringEnum.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedTable.scala (94%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedTestInserts.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/ComputedView.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/CustomType.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/CustomTypes.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/DebugJson.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/FileSync.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/FkAnalysis.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/HasSource.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/IdComputed.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/InstanceRequirements.scala (89%)
rename {typr => typr-codegen}/src/scala/typr/internal/InternalOptions.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/Lazy.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/PreciseConstraint.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/RepoMethod.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/TypeAligner.scala (97%)
rename {typr => typr-codegen}/src/scala/typr/internal/TypeCompatibilityChecker.scala (96%)
rename {typr => typr-codegen}/src/scala/typr/internal/TypeMapperDb.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/TypeMapperJvm.scala (63%)
rename {typr => typr-codegen}/src/scala/typr/internal/TypeMapperJvmNew.scala (98%)
rename {typr => typr-codegen}/src/scala/typr/internal/TypeMapperJvmOld.scala (91%)
rename {typr => typr-codegen}/src/scala/typr/internal/TypeMatcher.scala (84%)
rename {typr => typr-codegen}/src/scala/typr/internal/TypoType.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/analysis/ColumnNullable.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/analysis/DecomposedSql.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/analysis/JdbcMetadata.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/analysis/JdbcType.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/analysis/MaybeReturnsRows.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/analysis/MetadataColumn.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/analysis/MetadataParameterColumn.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/analysis/NullabilityFromExplain.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/analysis/ParameterMode.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/analysis/ParameterNullable.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/analysis/ParsedName.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/analysis/WellKnownPrimitive.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/Db2Adapter.scala (82%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/DbAdapter.scala (88%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/DbLib.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/DbLibAnorm.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/DbLibDoobie.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/DbLibFoundations.scala (82%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/DbLibLegacy.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/DbLibTextImplementations.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/DbLibTextSupport.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/DbLibZioJdbc.scala (100%)
create mode 100644 typr-codegen/src/scala/typr/internal/codegen/DuckDbAdapter.scala
create mode 100644 typr-codegen/src/scala/typr/internal/codegen/FileBridgeCompositeType.scala
create mode 100644 typr-codegen/src/scala/typr/internal/codegen/FileBridgeProjectionMapper.scala
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileCustomType.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileDefault.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileDomain.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileDuckDbStruct.scala (71%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileFields.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileFieldsFoundations.scala (97%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileFieldsLegacy.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileMariaSet.scala (96%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileOracleCollectionType.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileOracleObjectType.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FilePackageObject.scala (100%)
create mode 100644 typr-codegen/src/scala/typr/internal/codegen/FilePgCompositeType.scala
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FilePreciseType.scala (85%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileSharedType.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileStringEnum.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FileTestInserts.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FilesRelation.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FilesSqlFile.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FilesTable.scala (98%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/FilesView.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/JsonLib.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/JsonLibCirce.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/JsonLibJackson.scala (97%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/JsonLibPlay.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/JsonLibZioJson.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/LangJava.scala (96%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/LangKotlin.scala (95%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/LangScala.scala (94%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/MariaDbAdapter.scala (72%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/OracleAdapter.scala (71%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/PostgresAdapter.scala (55%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/SqlCast.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/SqlServerAdapter.scala (79%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/ToCode.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/TypeSupportKotlin.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/addPackageAndImports.scala (95%)
rename {typr => typr-codegen}/src/scala/typr/internal/codegen/package.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/compat.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/db2/Db2JdbcMetadata.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/db2/Db2MetaDb.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/db2/Db2SqlFileMetadata.scala (99%)
rename {typr => typr-codegen}/src/scala/typr/internal/db2/Db2TypeMapperDb.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/duckdb/DuckDbJdbcMetadata.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/duckdb/DuckDbMetaDb.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/duckdb/DuckDbSqlFileMetadata.scala (99%)
rename {typr => typr-codegen}/src/scala/typr/internal/duckdb/DuckDbTypeMapperDb.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/external/ExternalTools.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/external/OsArch.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/external/Python.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/external/Sqlglot.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/external/SqlglotDb2.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/external/TypoCoursier.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/findTypeFromFk.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/forget.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/generate.scala (97%)
rename {typr => typr-codegen}/src/scala/typr/internal/mariadb/MariaJdbcMetadata.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/mariadb/MariaMetaDb.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/mariadb/MariaSqlFileMetadata.scala (99%)
rename {typr => typr-codegen}/src/scala/typr/internal/mariadb/MariaTypeMapperDb.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/minimize.scala (98%)
rename {typr => typr-codegen}/src/scala/typr/internal/oracle/OracleJdbcMetadata.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/oracle/OracleMetaDb.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/oracle/OracleSqlFileMetadata.scala (99%)
rename {typr => typr-codegen}/src/scala/typr/internal/oracle/OracleTypeMapperDb.scala (93%)
rename {typr => typr-codegen}/src/scala/typr/internal/pg/Enums.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/pg/ForeignKeys.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/pg/OpenEnum.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/pg/PgMetaDb.scala (99%)
rename {typr => typr-codegen}/src/scala/typr/internal/pg/PgTypeMapperDb.scala (97%)
rename {typr => typr-codegen}/src/scala/typr/internal/pg/PrimaryKeys.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/pg/UniqueKeys.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/quote.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/rewriteDependentData.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/sqlfiles/SqlFile.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/sqlfiles/SqlFileReader.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/sqlfiles/readSqlFileDirectories.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/sqlglot/SqlglotAnalyzer.scala (97%)
rename {typr => typr-codegen}/src/scala/typr/internal/sqlglot/SqlglotTypes.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/sqlserver/SqlServerJdbcMetadata.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/sqlserver/SqlServerMetaDb.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/internal/sqlserver/SqlServerSqlFileMetadata.scala (99%)
rename {typr => typr-codegen}/src/scala/typr/internal/sqlserver/SqlServerTypeMapperDb.scala (100%)
create mode 100644 typr-codegen/src/scala/typr/jsonschema/JsonSchemaCodegen.scala
create mode 100644 typr-codegen/src/scala/typr/jsonschema/JsonSchemaOptions.scala
create mode 100644 typr-codegen/src/scala/typr/jsonschema/JsonSchemaParser.scala
rename {typr => typr-codegen}/src/scala/typr/jvm.scala (99%)
rename {typr => typr-codegen}/src/scala/typr/openapi/ApiTypes.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/openapi/ModelClass.scala (88%)
rename {typr => typr-codegen}/src/scala/typr/openapi/OpenApiCodegen.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/openapi/OpenApiError.scala (100%)
create mode 100644 typr-codegen/src/scala/typr/openapi/OpenApiJsonLib.scala
rename {typr => typr-codegen}/src/scala/typr/openapi/OpenApiOptions.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/openapi/ParsedSpec.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/openapi/SumType.scala (82%)
rename {typr => typr-codegen}/src/scala/typr/openapi/TypeInfo.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/openapi/codegen/ApiCodegen.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/openapi/codegen/FrameworkSupport.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/openapi/codegen/JsonLibSupport.scala (68%)
rename {typr => typr-codegen}/src/scala/typr/openapi/codegen/ModelCodegen.scala (87%)
rename {typr => typr-codegen}/src/scala/typr/openapi/codegen/TypeMapper.scala (99%)
rename {typr => typr-codegen}/src/scala/typr/openapi/codegen/ValidationSupport.scala (100%)
create mode 100644 typr-codegen/src/scala/typr/openapi/computed/ComputedApiService.scala
create mode 100644 typr-codegen/src/scala/typr/openapi/computed/ComputedEndpoint.scala
create mode 100644 typr-codegen/src/scala/typr/openapi/computed/ComputedModel.scala
create mode 100644 typr-codegen/src/scala/typr/openapi/computed/ComputedParameter.scala
create mode 100644 typr-codegen/src/scala/typr/openapi/computed/ComputedProperty.scala
rename {typr => typr-codegen}/src/scala/typr/openapi/parser/ApiExtractor.scala (100%)
create mode 100644 typr-codegen/src/scala/typr/openapi/parser/ModelExtractor.scala
rename {typr => typr-codegen}/src/scala/typr/openapi/parser/OpenApiParser.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/openapi/parser/SpecValidator.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/openapi/parser/TypeResolver.scala (100%)
rename {typr => typr-codegen}/src/scala/typr/openapi/testdata/stripe-spec3.yaml (100%)
rename {typr => typr-codegen}/src/scala/typr/openapi/testdata/test-features.yaml (100%)
create mode 100644 typr-config.schema.json
rename {foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin => typr-dsl-kotlin/src/kotlin/dev/typr/dslkt}/DeleteBuilder.kt (70%)
rename {foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin => typr-dsl-kotlin/src/kotlin/dev/typr/dslkt}/DslExports.kt (58%)
create mode 100644 typr-dsl-kotlin/src/kotlin/dev/typr/dslkt/MockConnection.kt
rename {foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin => typr-dsl-kotlin/src/kotlin/dev/typr/dslkt}/SelectBuilder.kt (88%)
rename {foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin => typr-dsl-kotlin/src/kotlin/dev/typr/dslkt}/SqlExpr.kt (67%)
rename {foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin => typr-dsl-kotlin/src/kotlin/dev/typr/dslkt}/SqlExprExtensions.kt (97%)
create mode 100644 typr-dsl-kotlin/src/kotlin/dev/typr/dslkt/Structure.kt
rename {foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin => typr-dsl-kotlin/src/kotlin/dev/typr/dslkt}/UpdateBuilder.kt (81%)
rename {foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala => typr-dsl-scala/src/scala/dev/typr/dslsc}/DeleteBuilder.scala (64%)
create mode 100644 typr-dsl-scala/src/scala/dev/typr/dslsc/DslExports.scala
rename {foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala => typr-dsl-scala/src/scala/dev/typr/dslsc}/ForeignKey.scala (89%)
rename {foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala => typr-dsl-scala/src/scala/dev/typr/dslsc}/SelectBuilder.scala (85%)
rename {foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala => typr-dsl-scala/src/scala/dev/typr/dslsc}/SqlExpr.scala (98%)
rename {foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala => typr-dsl-scala/src/scala/dev/typr/dslsc}/Structure.scala (59%)
rename {foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala => typr-dsl-scala/src/scala/dev/typr/dslsc}/UpdateBuilder.scala (89%)
create mode 100644 typr-dsl-scala/src/scala/dev/typr/dslsc/package.scala
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/DeleteBuilder.java (94%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/DeleteBuilderMock.java (97%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/DeleteBuilderSql.java (62%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/DeleteParams.java (94%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/Dialect.java (80%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/FieldsBase.java (82%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/FieldsExpr.java (86%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/FieldsExpr0.java (93%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/ForeignKey.java (97%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/GenericDbTypes.java (95%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/GroupedBuilder.java (98%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/GroupedBuilderMock.java (99%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/GroupedBuilderSql.java (86%)
create mode 100644 typr-dsl/src/java/dev/typr/dsl/Inserter.java
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/Like.java (98%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/MockConnection.java (93%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/OrderByOrSeek.java (98%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/Path.java (96%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/RelationStructure.java (98%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/RenderCtx.java (99%)
rename foundations-jdbc-dsl/src/java/dev/typr/foundations/dsl/RowParserDbType.java => typr-dsl/src/java/dev/typr/dsl/RowCodecDbType.java (58%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/SelectBuilder.java (97%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/SelectBuilderMock.java (99%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/SelectBuilderSql.java (84%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/SelectParams.java (87%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/SortOrder.java (84%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/SqlExpr.java (93%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/SqlExprVisitor.java (99%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/SqlFunction1.java (95%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/SqlFunction2.java (92%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/SqlFunction3.java (91%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/SqlOperator.java (97%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/Structure.java (98%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/TriFunction.java (75%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/UpdateBuilder.java (94%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/UpdateBuilderMock.java (99%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/UpdateBuilderSql.java (79%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/UpdateParams.java (97%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/internal/DummyComparator.java (97%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/internal/LogicalPlan.java (98%)
rename {foundations-jdbc-dsl/src/java/dev/typr/foundations => typr-dsl/src/java/dev/typr}/dsl/internal/RowComparator.java (92%)
delete mode 100644 typr-scripts/src/scala/scripts/GenerateAll.scala
delete mode 100644 typr-scripts/src/scala/scripts/GenerateAvroTest.scala
delete mode 100644 typr-scripts/src/scala/scripts/GenerateCombinedTest.scala
create mode 100644 typr-scripts/src/scala/scripts/GenerateConfigTypes.scala
delete mode 100644 typr-scripts/src/scala/scripts/GenerateGrpcTest.scala
delete mode 100644 typr-scripts/src/scala/scripts/GenerateOpenApiTest.scala
delete mode 100644 typr-scripts/src/scala/scripts/GenerateStripeTest.scala
delete mode 100644 typr-scripts/src/scala/scripts/GeneratedDb2.scala
delete mode 100644 typr-scripts/src/scala/scripts/GeneratedDuckDb.scala
delete mode 100644 typr-scripts/src/scala/scripts/GeneratedMariaDb.scala
delete mode 100644 typr-scripts/src/scala/scripts/GeneratedOracle.scala
delete mode 100644 typr-scripts/src/scala/scripts/GeneratedPostgres.scala
delete mode 100644 typr-scripts/src/scala/scripts/GeneratedSqlServer.scala
delete mode 100644 typr-scripts/src/scala/scripts/ShowcaseGeneration.scala
create mode 100644 typr.yaml
delete mode 100644 typr/src/scala/typr/Banner.scala
delete mode 100644 typr/src/scala/typr/TypeDefinitions.scala
create mode 100644 typr/src/scala/typr/avro/BridgeAvroAdapter.scala
create mode 100644 typr/src/scala/typr/avro/codegen/KafkaFrameworkCats.scala
create mode 100644 typr/src/scala/typr/avro/computed/ComputedAvroField.scala
create mode 100644 typr/src/scala/typr/avro/computed/ComputedAvroRecord.scala
create mode 100644 typr/src/scala/typr/avro/computed/ComputedEventGroup.scala
create mode 100644 typr/src/scala/typr/avro/computed/ComputedProtocol.scala
create mode 100644 typr/src/scala/typr/bridge/ColumnGrouper.scala
create mode 100644 typr/src/scala/typr/bridge/ColumnStemmer.scala
create mode 100644 typr/src/scala/typr/bridge/ColumnTokenizer.scala
create mode 100644 typr/src/scala/typr/bridge/CompositeType.scala
create mode 100644 typr/src/scala/typr/bridge/CompositeTypeSuggester.scala
create mode 100644 typr/src/scala/typr/bridge/ConfigToBridge.scala
create mode 100644 typr/src/scala/typr/bridge/TypeNarrower.scala
create mode 100644 typr/src/scala/typr/bridge/TypeSuggester.scala
create mode 100644 typr/src/scala/typr/bridge/api/BridgeApi.scala
create mode 100644 typr/src/scala/typr/bridge/api/BridgeApiImpl.scala
create mode 100644 typr/src/scala/typr/bridge/model/CheckResult.scala
create mode 100644 typr/src/scala/typr/bridge/model/FieldOverride.scala
create mode 100644 typr/src/scala/typr/bridge/model/ResolvedFlow.scala
create mode 100644 typr/src/scala/typr/bridge/model/SourceDeclaration.scala
create mode 100644 typr/src/scala/typr/bridge/model/TypePolicy.scala
create mode 100644 typr/src/scala/typr/bridge/validation/FlowValidator.scala
create mode 100644 typr/src/scala/typr/bridge/validation/SmartDefaults.scala
create mode 100644 typr/src/scala/typr/bridge/validation/TypePolicyValidator.scala
create mode 100644 typr/src/scala/typr/cli/Main.scala
create mode 100644 typr/src/scala/typr/cli/commands/Check.scala
create mode 100644 typr/src/scala/typr/cli/commands/Generate.scala
create mode 100644 typr/src/scala/typr/cli/commands/ProjectionFieldFormat.scala
create mode 100644 typr/src/scala/typr/cli/commands/SourceEntityLoader.scala
create mode 100644 typr/src/scala/typr/cli/commands/Watch.scala
create mode 100644 typr/src/scala/typr/cli/config/ConfigParser.scala
create mode 100644 typr/src/scala/typr/cli/config/ConfigToOptions.scala
create mode 100644 typr/src/scala/typr/cli/config/ConfigWriter.scala
create mode 100644 typr/src/scala/typr/cli/config/EnvSubstitution.scala
create mode 100644 typr/src/scala/typr/cli/config/TyprConfig.scala
create mode 100644 typr/src/scala/typr/cli/util/PatternMatcher.scala
create mode 100644 typr/src/scala/typr/grpc/BridgeProtoAdapter.scala
rename typr/src/scala/typr/grpc/codegen/{ServiceCodegen.scala => FilesGrpcService.scala} (61%)
create mode 100644 typr/src/scala/typr/grpc/codegen/GrpcFrameworkCats.scala
create mode 100644 typr/src/scala/typr/grpc/computed/ComputedGrpcMethod.scala
create mode 100644 typr/src/scala/typr/grpc/computed/ComputedGrpcService.scala
delete mode 100644 typr/src/scala/typr/internal/codegen/DuckDbAdapter.scala
delete mode 100644 typr/src/scala/typr/internal/codegen/FilePgCompositeType.scala
delete mode 100644 typr/src/scala/typr/openapi/parser/ModelExtractor.scala
diff --git a/.claude/settings.local.json b/.claude/settings.local.json
deleted file mode 100644
index bdb6fd46a0..0000000000
--- a/.claude/settings.local.json
+++ /dev/null
@@ -1,35 +0,0 @@
-{
-  "permissions": {
-    "allow": [
-      "WebFetch(domain:bleep.build)",
-      "Bash(gh issue list:*)",
-      "Bash(gh issue view:*)",
-      "Bash(docker-compose up:*)",
-      "Bash(bleep run:*)",
-      "Bash(ls:*)",
-      "Bash(rg:*)",
-      "Bash(bleep compile:*)",
-      "Bash(git add:*)",
-      "Bash(bleep fmt:*)",
-      "Bash(docker-compose:*)",
-      "Bash(bleep test:*)",
-      "Bash(git checkout:*)",
-      "Bash(git stash:*)",
-      "Bash(git reset:*)",
-      "Bash(git commit:*)",
-      "Bash(git cherry-pick:*)",
-      "Bash(bleep generate-adventureworks:*)",
-      "WebFetch(domain:github.com)",
-      "Bash(find:*)",
-      "Bash(grep:*)",
-      "Bash(rm:*)",
-      "Bash(JAVA_OPTS=\"-Dtypo.debug.fk=true\" bleep generate-adventureworks)",
-      "Bash(git pull:*)",
-      "Bash(git branch:*)",
-      "Bash(bleep generate:*)",
-      "Bash(git rebase:*)",
-      "Bash(npm run build:*)"
-    ],
-    "deny": []
-  }
-}
\ No newline at end of file
diff --git
a/.github/workflows/build.yml b/.github/workflows/build.yml index 9c9dcd6ab5..706c5c83de 100644 --- a/.github/workflows/build.yml +++ b/.github/workflows/build.yml @@ -70,12 +70,6 @@ jobs: with: extraFiles: bleep.yaml - - name: Set up JDK - uses: actions/setup-java@v4 - with: - java-version: '21' - distribution: 'temurin' - - name: Start Postgres env: PG_MAJOR: ${{ matrix.postgres-version }} @@ -123,13 +117,10 @@ jobs: CI: true run: bleep compile foundations-jdbc-test && bleep test foundations-jdbc-test --only PgTypeTest - - name: Sourcegen - run: bleep sourcegen - - name: Run Kotlin Postgres tests env: CI: true - run: ./gradlew :testers:pg:kotlin:test + run: bleep test testers/pg/kotlin build-mariadb: runs-on: ubuntu-latest @@ -142,12 +133,6 @@ jobs: with: extraFiles: bleep.yaml - - name: Set up JDK - uses: actions/setup-java@v4 - with: - java-version: '21' - distribution: 'temurin' - - name: Start MariaDB run: docker compose up -d mariadb @@ -177,13 +162,10 @@ jobs: CI: true run: bleep compile foundations-jdbc-test && bleep test foundations-jdbc-test --only MariaTypeTest - - name: Sourcegen - run: bleep sourcegen - - name: Run MariaDB Kotlin tests env: CI: true - run: ./gradlew :testers:mariadb:kotlin:test + run: bleep test testers/mariadb/kotlin build-oracle: runs-on: ubuntu-latest @@ -196,12 +178,6 @@ jobs: with: extraFiles: bleep.yaml - - name: Set up JDK - uses: actions/setup-java@v4 - with: - java-version: '21' - distribution: 'temurin' - - name: Start Oracle and wait for healthy run: | docker compose up -d oracle @@ -225,13 +201,10 @@ jobs: CI: true run: bleep compile foundations-jdbc-test && bleep test foundations-jdbc-test --only OracleTypeTest - - name: Sourcegen - run: bleep sourcegen - - name: Run Oracle Kotlin tests env: CI: true - run: ./gradlew :testers:oracle:kotlin:test + run: bleep test testers/oracle/kotlin build-duckdb: runs-on: ubuntu-latest @@ -244,12 +217,6 @@ jobs: with: extraFiles: bleep.yaml - - name: Set up JDK - uses: 
actions/setup-java@v4 - with: - java-version: '21' - distribution: 'temurin' - - name: Run DuckDB Scala tests env: CI: true @@ -265,13 +232,10 @@ jobs: CI: true run: bleep compile foundations-jdbc-test && bleep test foundations-jdbc-test --only DuckDbTypeTest - - name: Sourcegen - run: bleep sourcegen - - name: Run DuckDB Kotlin tests env: CI: true - run: ./gradlew :testers:duckdb:kotlin:test + run: bleep test testers/duckdb/kotlin build-sqlserver: runs-on: ubuntu-latest @@ -284,12 +248,6 @@ jobs: with: extraFiles: bleep.yaml - - name: Set up JDK - uses: actions/setup-java@v4 - with: - java-version: '21' - distribution: 'temurin' - - name: Start SQL Server and wait for healthy run: | docker compose up -d sqlserver @@ -315,13 +273,10 @@ jobs: CI: true run: bleep compile foundations-jdbc-test && bleep test foundations-jdbc-test --only SqlServerTypeTest - - name: Sourcegen - run: bleep sourcegen - - name: Run SQL Server Kotlin tests env: CI: true - run: ./gradlew :testers:sqlserver:kotlin:test + run: bleep test testers/sqlserver/kotlin build-db2: runs-on: ubuntu-latest @@ -334,12 +289,6 @@ jobs: with: extraFiles: bleep.yaml - - name: Set up JDK - uses: actions/setup-java@v4 - with: - java-version: '21' - distribution: 'temurin' - - name: Start DB2 and wait for healthy run: | docker compose up -d db2 @@ -365,13 +314,10 @@ jobs: CI: true run: bleep compile foundations-jdbc-test && bleep test foundations-jdbc-test --only Db2TypeTest - - name: Sourcegen - run: bleep sourcegen - - name: Run DB2 Kotlin tests env: CI: true - run: ./gradlew :testers:db2:kotlin:test + run: bleep test testers/db2/kotlin build-openapi: runs-on: ubuntu-latest @@ -384,12 +330,6 @@ jobs: with: extraFiles: bleep.yaml - - name: Set up JDK - uses: actions/setup-java@v4 - with: - java-version: '21' - distribution: 'temurin' - - name: Compile Scala OpenAPI testers run: | bleep compile testers/openapi/scala/http4s @@ -401,14 +341,8 @@ jobs: bleep compile testers/openapi/java/quarkus bleep compile 
testers/openapi/java/spring - - name: Sourcegen - run: bleep sourcegen - - - name: Compile Kotlin OpenAPI testers (via gradle) - run: | - ./gradlew :testers:openapi:kotlin:jaxrs:compileKotlin - ./gradlew :testers:openapi:kotlin:spring:compileKotlin - ./gradlew :testers:openapi:kotlin:quarkus:compileKotlin + - name: Compile Kotlin OpenAPI testers + run: bleep compile testers/openapi/kotlin/jaxrs testers/openapi/kotlin/spring testers/openapi/kotlin/quarkus build-docs: timeout-minutes: 20 diff --git a/.gitignore b/.gitignore index 28f7095e85..aa346a888b 100644 --- a/.gitignore +++ b/.gitignore @@ -24,4 +24,7 @@ __pycache__/ db/duckdb/test.db db/duckdb/test.db.wal duckdb-test/ -.snapshot-test.lock \ No newline at end of file +.snapshot-test.lock +typr.sh +.mcp.json +.claude/settings.local.json diff --git a/.javafmt.conf b/.javafmt.conf new file mode 100644 index 0000000000..3c14b8b363 --- /dev/null +++ b/.javafmt.conf @@ -0,0 +1,11 @@ +enabled = true +version = "1.33.0" +style = "google" +skipSortingImports = false +skipRemovingUnusedImports = false +fixImportsOnly = false +skipReflowingLongStrings = false +skipJavadocFormatting = false +excludePaths = [ +"**/generated-and-checked-in/**" +] diff --git a/.scalafmt.conf b/.scalafmt.conf index 68f8cf7232..8759634e20 100644 --- a/.scalafmt.conf +++ b/.scalafmt.conf @@ -16,18 +16,12 @@ fileOverride { "glob:**/src/scala-2.13/**" { runner.dialect = Scala213Source3 } - "glob:**/foundations-jdbc-dsl-scala/src/scala/**" { + "glob:**/typr-dsl-scala/src/scala/**" { runner.dialect = scala3 } "glob:**/typr/generated-and-checked-in/**" { runner.dialect = scala3 } - "glob:**/foundations-jdbc-dsl-scala/src/scala/**" { - runner.dialect = scala3 - } - "glob:**/foundations-jdbc-scala/src/scala/**" { - runner.dialect = scala3 - } "glob:**/testers/pg/scala/scalatypes/src/scala/**" { runner.dialect = scala3 } diff --git a/BRIDGE-ARCHITECTURE.md b/BRIDGE-ARCHITECTURE.md new file mode 100644 index 0000000000..76ba15f4b0 --- /dev/null +++ 
b/BRIDGE-ARCHITECTURE.md @@ -0,0 +1,974 @@ +# Typr Bridge Architecture + +## Vision: Domain as the Hub + +``` + ┌─────────────────────────┐ + │ DOMAIN TYPES │ + │ │ + │ Customer │ + │ Order │ + │ Product │ + │ ... │ + └───────────┬─────────────┘ + │ + ┌────────────────────────┼────────────────────────┐ + │ │ │ + ▼ ▼ ▼ + ┌─────────────┐ ┌─────────────┐ ┌─────────────┐ + │ Database │ │ APIs │ │ Events │ + │ │ │ │ │ │ + │ CustomerRow │ │ CustomerDto │ │ CustomerEvt │ + │ .toDomain() │ │ .toDomain() │ │ .toDomain() │ + │ .fromDom() │ │ .fromDom() │ │ .fromDom() │ + └─────────────┘ └─────────────┘ └─────────────┘ + │ │ + ┌─────┴─────┐ ┌──────┴──────┐ + │ │ │ │ + ▼ ▼ ▼ ▼ + PostgreSQL MariaDB Avro/Kafka gRPC/Proto + Oracle DuckDB + SQL Server DB2 +``` + +**The Problem:** In typical enterprise systems, data flows through multiple representations: +- Database rows for persistence +- DTOs for REST APIs +- Protobuf messages for gRPC +- Avro records for Kafka events + +Each boundary has its own type definitions, leading to: +- Scattered business logic across mappers +- Subtle bugs from mismatched field names or types +- Boilerplate conversion code that's error-prone +- Difficulty tracking what a "Customer" really means + +**The Solution:** Define domain types once. Typr Bridge generates type-safe mappers to/from all boundaries, validates compatibility at build time, and ensures your domain model is the single source of truth. + +--- + +## Completed Features (P0-P1) + +### P0: Domain Type DSL + +**What it is:** A declarative way to define your business domain types with their fields, relationships, and source mappings. 
+ +**Key Components:** + +| Type | Purpose | Location | +|------|---------|----------| +| `DomainTypeDefinition` | Complete domain type spec | `bridge/CompositeType.scala` | +| `DomainField` | Field with type, nullability, array support | `bridge/CompositeType.scala` | +| `PrimarySource` | The "anchor" source for a domain type | `bridge/CompositeType.scala` | +| `AlignedSource` | Additional sources mapped to domain | `bridge/CompositeType.scala` | + +**Domain Field Properties:** +```scala +case class DomainField( + name: String, // Field name (camelCase convention) + typeName: String, // Type: scalar, array, or another domain type + nullable: Boolean, // Can be null/absent + array: Boolean, // Is a collection + description: Option[String] +) +``` + +**Supported Scalar Types:** +- Primitives: `String`, `Int`, `Long`, `Short`, `Byte`, `Float`, `Double`, `Boolean` +- Numeric: `BigDecimal`, `BigInteger` +- Temporal: `Instant`, `LocalDate`, `LocalTime`, `LocalDateTime`, `OffsetDateTime`, `ZonedDateTime` +- Special: `UUID`, `ByteArray`, `Json` + +**Compact Type Syntax:** +``` +String → required string +String? → optional string +String[] → required array of strings +String?[] → optional array of strings (rare) +Customer → reference to another domain type +Customer? 
→ optional reference +``` + +**Generation Flags:** +```scala +case class DomainTypeDefinition( + name: String, + fields: List[DomainField], + primary: Option[PrimarySource], + alignedSources: List[AlignedSource], + // Generation control: + generateDomainType: Boolean = true, // Generate the domain class + generateMappers: Boolean = true, // Generate toDomain/fromDomain + generateInterface: Boolean = false, // Generate trait/interface + generateBuilder: Boolean = false, // Generate builder pattern + generateCopy: Boolean = true // Generate copy/with methods +) +``` + +**Improvement Opportunities:** +- Add validation annotations (min/max, regex, custom validators) +- Support for computed/derived fields +- Versioning for schema evolution +- Better IDE support (LSP integration) + +--- + +### P0: Name Alignment Engine + +**What it is:** Automatically matches field names across naming conventions. `customer_email` (database) matches `customerEmail` (domain) matches `CustomerEmail` (proto). + +**Key Components:** + +| Type | Purpose | Location | +|------|---------|----------| +| `NameAligner` | Trait for name matching | `bridge/CompositeType.scala` | +| `DefaultNameAligner` | Implementation with tokenization | `bridge/CompositeType.scala` | +| `ColumnTokenizer` | Parses names into tokens | `bridge/ColumnTokenizer.scala` | +| `ColumnStemmer` | Normalizes and expands abbreviations | `bridge/ColumnStemmer.scala` | + +**How It Works:** + +``` +Input: "customer_email" (source) vs "customerEmail" (domain) + +Step 1: Tokenization + "customer_email" → ["customer", "email"] (snake_case detected) + "customerEmail" → ["customer", "email"] (camelCase detected) + +Step 2: Stemming + ["customer", "email"] → ["customer", "email"] (no changes) + +Step 3: Canonicalization + "customeremail" == "customeremail" → MATCH! 
+``` + +**Pattern Detection:** +- `SnakeCase`: `customer_email` +- `CamelCase`: `customerEmail` +- `PascalCase`: `CustomerEmail` +- `ScreamingSnake`: `CUSTOMER_EMAIL` +- `Mixed`: `Customer_Email` +- `Ambiguous`: When multiple interpretations possible + +**Built-in Abbreviation Expansions:** +```scala +val abbreviations = Map( + "id" → "identifier", + "addr" → "address", + "usr" → "user", + "cust" → "customer", + "prod" → "product", + "qty" → "quantity", + "num" → "number", + "amt" → "amount", + "desc" → "description", + "dt" → "date", + "ts" → "timestamp", + "dttm" → "datetime", + // ... and more +) +``` + +**Custom Abbreviations:** +```scala +val aligner = DefaultNameAligner(customAbbreviations = Map( + "sku" → "stockkeepingunit", + "ean" → "europeanarticlenumber" +)) +``` + +**Improvement Opportunities:** +- Configurable abbreviation dictionaries per project +- Machine learning for abbreviation detection +- Fuzzy matching with configurable threshold +- Support for domain-specific naming conventions + +--- + +### P0: Matching Rules Engine + +**What it is:** Determines how domain fields map to source fields, handling exact matches, custom transformations, and explicit exclusions. + +**Key Components:** + +| Type | Purpose | Location | +|------|---------|----------| +| `SmartDefaults` | Auto-resolves field mappings | `bridge/validation/SmartDefaults.scala` | +| `FieldOverride` | Manual mapping control | `bridge/model/FieldOverride.scala` | +| `SourceDeclaration` | Source-level configuration | `bridge/model/SourceDeclaration.scala` | +| `ResolvedFieldAction` | Computed mapping result | `bridge/model/ResolvedFlow.scala` | + +**Resolution Process:** +``` +For each domain field: + 1. Check explicit override → Forward/Drop/Custom + 2. Check mappings config → explicit name mapping + 3. Try name alignment → auto-match by tokenization + 4. No match? → auto-drop with warning + +For each source field: + 1. Already mapped? → skip + 2. In exclude list? → skip + 3. Not mapped? 
→ report as Unannotated +``` + +**Field Override Types:** + +```scala +sealed trait FieldOverride + +// Map domain field to source field +case class Forward( + sourceFieldName: Option[String], // None = same name + typePolicy: TypePolicy, // How strict on types + directionOverride: Option[FlowDirection] +) extends FieldOverride + +// Exclude field from this source +case class Drop( + reason: Option[String] +) extends FieldOverride + +// Custom transformation +case class Custom(kind: CustomKind) extends FieldOverride + +sealed trait CustomKind +case class MergeFrom(sourceFields: List[String]) extends CustomKind +case class SplitFrom(sourceField: String) extends CustomKind +case class ComputedFrom(domainFields: List[String]) extends CustomKind +case class Enrichment(description: String) extends CustomKind +``` + +**Example Use Cases:** + +```yaml +# Merge first + last name into fullName +fullName: + custom: + mergeFrom: [first_name, last_name] + +# Split address into components +streetAddress: + custom: + splitFrom: full_address + +# Computed field (not in source) +totalPrice: + custom: + computedFrom: [unitPrice, quantity] + +# External enrichment +geoLocation: + custom: + enrichment: "Geocoded from address via external API" +``` + +**Source Declaration Options:** + +```scala +case class SourceDeclaration( + sourceName: String, + entityPath: String, + role: SourceRole, // Primary or Secondary + direction: FlowDirection, // In, Out, or InOut + mode: CompatibilityMode, // Exact, Superset, Subset + mappings: Map[String, String], // Explicit field name mappings + exclude: Set[String], // Source fields to ignore + includeExtra: List[String], // Include unmapped fields + readonly: Boolean, // No fromDomain generation + defaultTypePolicy: TypePolicy, // Default type strictness + fieldOverrides: Map[String, FieldOverride] +) +``` + +**Improvement Opportunities:** +- Bidirectional custom transformations (with inverse functions) +- Conditional mappings based on field values 
+- Template-based transformations +- Better error messages for common mistakes + +--- + +### Contract Ownership and Data Flow + +**The Key Insight:** Not all boundaries are equal. Some contracts you own, some you don't. + +``` + ┌─────────────────────────┐ + │ YOUR DOMAIN │ + │ (you own this) │ + └───────────┬─────────────┘ + │ + ┌───────────────────────┼───────────────────────┐ + │ │ │ + ▼ ▼ ▼ +┌───────────────┐ ┌───────────────┐ ┌───────────────┐ +│ Database │ │ Your API │ │ Partner API │ +│ (you own) │ │ (you own) │ │ (they own) │ +│ │ │ │ │ │ +│ direction: │ │ direction: │ │ direction: │ +│ InOut │ │ Out │ │ In │ +│ │ │ │ │ │ +│ Can change │ │ Can change │ │ Cannot │ +│ schema: YES │ │ contract:YES │ │ change: NO │ +└───────────────┘ └───────────────┘ └───────────────┘ +``` + +**Flow Direction Semantics:** + +| Direction | You Control Contract? | Generated Code | Validation | +|-----------|----------------------|----------------|------------| +| `In` | NO - external input | `source.toDomain()` only | Warning if field missing | +| `Out` | YES - you define it | `domain.toSource()` only | Error if field missing | +| `InOut` | YES - full control | Both directions | Error if field missing | + +**External Contracts (direction: In):** + +These are schemas you **consume but don't control**: +- Partner/vendor API responses +- Third-party event streams (Kafka topics you subscribe to) +- Legacy system exports +- Industry standard formats (HL7, FHIR, FIX, etc.) 
+ +```yaml +# Example: Consuming a third-party payment webhook +alignedSources: + "stripe:PaymentIntent": + direction: in # We receive this, can't change it + mode: superset # They may add fields we don't need + readonly: true # Never generate toStripe() + field_overrides: + metadata: drop # We don't use their metadata field + amount: + forward: true + type_policy: allow_widening # They use int64, we use BigDecimal +``` + +**Your Contracts (direction: Out):** + +These are schemas you **define and provide to consumers**: +- Your public REST API DTOs +- Events you publish to Kafka +- gRPC responses you serve + +```yaml +# Example: Your public API contract +alignedSources: + "api:CustomerResponse": + direction: out # We produce this + mode: exact # Contract must be precise + field_overrides: + internalNotes: drop # Don't expose internal data + legacyId: drop # Don't expose migration artifacts +``` + +**Bidirectional (direction: InOut):** + +These are schemas you **fully control**: +- Your own database tables +- Internal microservice contracts +- Your own event schemas + +```yaml +# Example: Your database +alignedSources: + "pg:customers": + direction: in_out # Full read/write + mode: exact # Schema should match +``` + +**Validation Differences by Direction:** + +| Scenario | In | Out | InOut | +|----------|-----|------|-------| +| Domain field missing in source | Warning | Error | Error | +| Source field not mapped | OK (superset) | Must exclude | Must handle | +| Type mismatch | Adapt at read | You define contract | Must match | +| Nullable mismatch | Can coerce | You define | Must match | + +**Why This Matters:** + +1. **External contracts can't be "fixed"** - When Stripe changes their webhook format, you adapt. When you read from a legacy Oracle DB, you work around its quirks. + +2. **Your contracts are promises** - When you publish an API, consumers depend on it. Missing fields are breaking changes. + +3. 
**Code generation differs** - For `In` sources, you only need `toDomain()`. For `Out` sources, you only need `toSource()`. Generating both when you only need one creates dead code. + +4. **Validation strictness differs** - External contracts: "best effort to consume". Your contracts: "must be complete and correct". + +**Automatic Direction Detection (APIs):** + +For OpenAPI/REST, direction can be **inferred** from schema usage: + +``` +┌─────────────────────────────────────────────────────────────────┐ +│ POST /customers │ +│ requestBody: CustomerCreateRequest ← Input schema │ +│ response 200: CustomerResponse ← Output schema │ +│ │ +│ GET /customers/{id} │ +│ response 200: CustomerResponse ← Output schema │ +│ │ +│ PUT /customers/{id} │ +│ requestBody: CustomerUpdateRequest ← Input schema │ +│ response 200: CustomerResponse ← Output schema │ +└─────────────────────────────────────────────────────────────────┘ + +Detected directions: + CustomerCreateRequest → In only (request body) + CustomerUpdateRequest → In only (request body) + CustomerResponse → Out only (response body) +``` + +**Combined with Ownership:** + +| API Type | Schema Position | Ownership | Direction | Generated | +|----------|-----------------|-----------|-----------|-----------| +| Your API | Request body | Internal | In | `toDomain()` | +| Your API | Response body | Internal | Out | `fromDomain()` | +| Partner API | Request body | External | Out* | `fromDomain()` | +| Partner API | Response body | External | In | `toDomain()` | + +*When calling external API, request is what YOU send (Out from domain), response is what YOU receive (In to domain). 
+ +**Endpoints as Functions:** + +``` +Your API endpoint = function you implement + Input: Request schemas → validate, toDomain() + Output: Response schemas → fromDomain(), serialize + +External API call = function you invoke + Input: Response schemas → toDomain(), use + Output: Request schemas → fromDomain(), send +``` + +**What We Can Infer Automatically:** + +| Source | Can Detect | How | +|--------|------------|-----| +| OpenAPI | In/Out per schema | Scan all operations for schema refs | +| gRPC | In/Out per message | Request vs response in service defs | +| Avro | Topic direction | Producer vs consumer config | +| Database | Always InOut | Tables are read/write | + +**Current Implementation:** + +- `FlowDirection` enum: `In`, `Out`, `InOut` +- `readonly: true` suppresses `toSource()` generation +- Direction affects validation severity (warning vs error) +- `mode: superset` allows extra fields from external sources + +**Future Enhancements (P1-P2):** + +| Feature | Description | +|---------|-------------| +| Auto-detect direction | Scan OpenAPI/gRPC for schema usage positions | +| `ownership: external` | Explicit flag beyond direction | +| Contract versioning | Track schema versions for external contracts | +| Deprecation warnings | Warn when external contract changes detected | +| Adapter generation | Generate adapter layers for external → internal | +| Contract diff | Compare versions of external contracts | +| Migration helpers | Generate code to handle contract evolution | + +--- + +### P0: toDomain/fromDomain Generation + +**What it is:** Generates type-safe conversion code between domain types and all their aligned sources. 
+ +**Key Components:** + +| Type | Purpose | Location | +|------|---------|----------| +| `FileBridgeProjectionMapper` | Generates mapper methods | `codegen/FileBridgeProjectionMapper.scala` | +| `ResolvedProjection` | Database mapping spec | `codegen/FileBridgeProjectionMapper.scala` | +| `ResolvedExternalProjection` | Avro/Proto mapping spec | `codegen/FileBridgeProjectionMapper.scala` | +| `ExternalRecord` | Abstract external source | `codegen/FileBridgeProjectionMapper.scala` | + +**Generated Code Pattern (Kotlin example):** + +```kotlin +// Domain type +data class Customer( + val customerId: CustomerId, + val name: String, + val email: Email?, + val createdAt: Instant +) + +// Generated extension functions +fun CustomerRow.toDomain(): Customer = Customer( + customerId = CustomerId(this.customerId), + name = this.name, + email = this.email?.let { Email(it) }, + createdAt = this.createdAt +) + +fun Customer.toCustomerRow(): CustomerRow = CustomerRow( + customerId = this.customerId.value, + name = this.name, + email = this.email?.value, + createdAt = this.createdAt +) + +// For Avro +fun CustomerEvent.toDomain(): Customer = Customer( + customerId = CustomerId(this.customerId), + name = this.name, + email = this.email?.let { Email(it) }, + createdAt = this.timestamp.toInstant() +) +``` + +**Nullability Handling Matrix:** + +| Domain | Source | Conversion | +|--------|--------|------------| +| Required | Required | Direct pass-through | +| Optional | Optional | Direct pass-through | +| Required | Optional | `.get()` or throw | +| Optional | Required | Wrap in `Some`/`Optional` | + +**Type Wrapper Handling:** +```kotlin +// Wrapper type +@JvmInline value class CustomerId(val value: Long) + +// Generated: unwrap for source, wrap for domain +customerId = CustomerId(source.customerId) // source → domain +customerId = domain.customerId.value // domain → source +``` + +**Language Support:** + +| Language | Domain Type | Nullable | Collections | 
+|----------|-------------|----------|-------------| +| Scala | `case class` | `Option[T]` | `List[T]` | +| Kotlin | `data class` | `T?` | `List` | +| Java | `record` | `Optional` | `List` | + +**Improvement Opportunities:** +- Lazy conversion (convert fields on access) +- Streaming conversion for large collections +- Caching for repeated conversions +- Performance benchmarks and optimization +- Support for partial updates (patch semantics) + +--- + +### P1: Field Coverage Checks + +**What it is:** Validates that all domain fields are covered by at least one source, and all source fields are explicitly handled. + +**Key Components:** + +| Type | Purpose | Location | +|------|---------|----------| +| `FlowValidator` | Main validation orchestrator | `bridge/validation/FlowValidator.scala` | +| `CheckFinding` | Single validation issue | `bridge/model/CheckResult.scala` | +| `CheckReport` | Complete validation report | `bridge/model/CheckResult.scala` | + +**Validation Rules:** + +``` +1. Domain Type Rules: + ✓ Must have at least one field + ✓ Must have a primary source + ✓ All fields must be covered by Out/InOut sources + +2. Source Rules: + ✓ Declared sources must exist + ✓ Forward fields must exist in source + ✓ Custom transformation refs must exist + +3. 
Direction-Based Rules: + In: Domain fields can be missing (read-only) + Out: All domain fields must be produced + InOut: All domain fields must be bidirectional +``` + +**Check Codes:** + +| Code | Severity | Meaning | +|------|----------|---------| +| `NoFields` | Error | Domain type has no fields | +| `NoPrimarySource` | Error | No primary source declared | +| `SourceEntityNotFound` | Error | Declared source doesn't exist | +| `MissingRequiredField` | Error/Warn | Field not covered (severity depends on direction) | +| `UnannotatedField` | Error | Source field not mapped or excluded | +| `InvalidMergeFromRef` | Error | MergeFrom references non-existent field | +| `InvalidSplitFromRef` | Error | SplitFrom references non-existent field | +| `InvalidComputedFromRef` | Error | ComputedFrom references non-existent field | + +**Example Report:** + +``` +Domain Type: Customer +━━━━━━━━━━━━━━━━━━━━ +✓ 5 fields defined +✓ Primary source: postgres:customers + +Source: postgres:customers (InOut) + ✓ customerId → customer_id (exact) + ✓ name → name (exact) + ✓ email → email (exact, nullable) + ⚠ created_by not mapped (add to exclude or forward) + +Source: kafka:customer-events (In) + ✓ customerId → customer_id (exact) + ✓ name → name (exact) + ⚠ email missing (OK for In direction) +``` + +**Improvement Opportunities:** +- Coverage percentage metrics +- Visual diff between sources +- Suggested fixes for common issues +- Integration with CI/CD pipelines +- IDE inline annotations + +--- + +### P1: Type Compatibility Checks + +**What it is:** Validates that field types are compatible across boundaries, with configurable strictness levels. 
+ +**Key Components:** + +| Type | Purpose | Location | +|------|---------|----------| +| `TypePolicyValidator` | Validates type pairs | `bridge/validation/TypePolicyValidator.scala` | +| `TypeNarrower` | Normalizes and compares types | `bridge/TypeNarrower.scala` | +| `TypePolicy` | Strictness level | `bridge/model/TypePolicy.scala` | + +**Type Policies:** + +```scala +sealed trait TypePolicy + +object TypePolicy { + // Types must match exactly (after normalization) + case object Exact extends TypePolicy + + // Source can be narrower (INT → BIGINT) + case object AllowWidening extends TypePolicy + + // Source can be wider (BIGINT → INT) - data loss possible + case object AllowNarrowing extends TypePolicy + + // Allow numeric precision changes (DOUBLE → INT) + case object AllowPrecisionLoss extends TypePolicy + + // Allow string truncation (VARCHAR(255) → VARCHAR(100)) + case object AllowTruncation extends TypePolicy + + // Allow nullable source to required domain (runtime check) + case object AllowNullableToRequired extends TypePolicy +} +``` + +**Type Families:** + +``` +Integer Family: SMALLINT < INTEGER < BIGINT + - Widening: SMALLINT → INTEGER ✓ + - Narrowing: BIGINT → INTEGER (with AllowNarrowing) + +Float Family: REAL < DOUBLE < DECIMAL + - Widening: REAL → DOUBLE ✓ + - Precision loss: DECIMAL → INTEGER (with AllowPrecisionLoss) + +String Family: VARCHAR (all sizes) + - Truncation: VARCHAR(255) → VARCHAR(100) (with AllowTruncation) + +Timestamp Family: TIMESTAMP < TIMESTAMPTZ + - Widening: TIMESTAMP → TIMESTAMPTZ ✓ +``` + +**Type Normalization:** + +The `TypeNarrower` normalizes database-specific types to canonical forms: + +```scala +// All map to INTEGER +"INT4" | "INTEGER" | "INT" | "SERIAL" → "INTEGER" + +// All map to BIGINT +"INT8" | "BIGINT" | "BIGSERIAL" | "LONG" → "BIGINT" + +// All map to VARCHAR +"VARCHAR" | "TEXT" | "CHAR" | "NVARCHAR" → "VARCHAR" + +// Proto types +"INT32" | "SINT32" | "SFIXED32" → "INTEGER" +"INT64" | "SINT64" | "SFIXED64" → 
"BIGINT" +"GOOGLE.PROTOBUF.TIMESTAMP" → "TIMESTAMPTZ" + +// Avro types +"STRING" → "VARCHAR" +"BYTES" → "BYTEA" +``` + +**Canonical Type Result:** + +```scala +case class CanonicalTypeResult( + canonicalType: String, // e.g., "INTEGER" + jvmType: String, // e.g., "Int" + nullable: Boolean, // Union of all sources + warnings: List[String], // Type mismatches found + comment: String // e.g., "widened from INT to BIGINT" +) +``` + +**Improvement Opportunities:** +- Custom type mappings (domain-specific types) +- Semantic type validation (email format, UUID format) +- Cross-database type recommendations +- Migration path suggestions when types differ + +--- + +## Boundary Integration Status + +### Database Boundary ✅ + +| Database | Tables | Views | Types | Arrays | JSON | +|----------|--------|-------|-------|--------|------| +| PostgreSQL | ✅ | ✅ | ✅ | ✅ | ✅ | +| MariaDB/MySQL | ✅ | ✅ | ✅ | ❌ | ✅ | +| Oracle | ✅ | ✅ | ✅ | ❌ | ❌ | +| SQL Server | ✅ | ✅ | ❌ | ❌ | ❌ | +| DuckDB | ✅ | ✅ | ❌ | ✅ | ✅ | +| DB2 | ✅ | ✅ | ❌ | ❌ | ❌ | + +### API Boundary (OpenAPI) ✅ + +| Feature | Status | +|---------|--------| +| Schema parsing | ✅ | +| Type extraction | ✅ | +| Field alignment | ✅ | +| Mapper generation | ✅ | +| TUI browsing | ✅ | + +### Events Boundary (Avro/Kafka) ✅ + +| Feature | Status | Component | +|---------|--------|-----------| +| Schema parsing | ✅ | `AvroParser` | +| Type extraction | ✅ | `AvroTypeMapper` | +| TUI source loading | ✅ | `SourceLoader` | +| TUI field extraction | ✅ | `ProjectionFieldExtractor` | +| TUI schema browser | ✅ | `AvroBrowser` | +| Type compatibility | ✅ | `TypeNarrower` | +| Mapper generation | ✅ | `BridgeAvroAdapter` | +| Validation | ✅ | `FlowValidator` | + +### RPC Boundary (gRPC/Protobuf) ✅ + +| Feature | Status | Component | +|---------|--------|-----------| +| Protobuf parsing | ✅ | `ProtobufParser` | +| TUI source loading | ✅ | `SourceLoader` | +| TUI field extraction | ✅ | `ProjectionFieldExtractor` | +| TUI schema browser | ✅ | 
`ProtoBrowser` | +| Type compatibility | ✅ | `TypeNarrower` | +| Mapper generation | ✅ | `BridgeProtoAdapter` | +| Validation | ✅ | `FlowValidator` | + +--- + +## Remaining Work + +### P1: Missing Value Analysis ⬚ + +**Vision:** Detect fields that need defaults or transformations when flowing between sources. + +**Use Cases:** +- Database has `created_at DEFAULT NOW()` but API doesn't provide it +- Avro record has required field that's optional in domain +- gRPC message has fields with default values + +**Potential Implementation:** +```scala +case class MissingValueAnalysis( + field: String, + sources: List[SourceMissing] +) + +case class SourceMissing( + sourceKey: String, + direction: FlowDirection, + hasDefault: Boolean, + defaultValue: Option[String], + recommendation: Recommendation +) + +enum Recommendation { + UseSourceDefault, // DB/Proto has default + RequireExplicitValue, // Must provide at runtime + AddDomainDefault, // Add default to domain type + MarkNullable // Change to optional +} +``` + +### P1: Nested Type Resolution ⬚ + +**Vision:** Domain types can reference other domain types, with automatic resolution. + +**Use Cases:** +- `Order` contains `List[OrderItem]` +- `Customer` has `Address` field +- Hierarchical domain models + +**Challenges:** +- Circular references +- Lazy loading vs eager loading +- Source-level joins/includes + +### P2: Domain-Centric Repositories ⬚ + +**Vision:** Generate repositories that work with domain types directly, not database rows. 
+ +**Example:** +```kotlin +// Instead of: +val row: CustomerRow = customerRepo.findById(id) +val customer: Customer = row.toDomain() + +// Generate: +val customer: Customer = customerRepo.findById(id) // Returns domain directly +``` + +**Considerations:** +- Lazy vs eager conversion +- Partial loading (select specific fields) +- Batch operations +- Transaction boundaries + +### P2: Domain-Centric Services ⬚ + +**Vision:** Generate service interfaces that orchestrate domain operations across boundaries. + +**Example:** +```kotlin +interface CustomerService { + fun create(customer: Customer): Customer + fun update(customer: Customer): Customer + fun delete(customerId: CustomerId) + + // Multi-boundary operations + fun createAndPublish(customer: Customer): Customer // DB + Kafka + fun syncFromLegacy(customerId: CustomerId): Customer // gRPC → DB +} +``` + +### P3: CRUD Endpoints ⬚ + +**Vision:** Generate REST endpoints from domain types. + +**Example Output:** +```kotlin +@RestController +class CustomerController(private val service: CustomerService) { + + @GetMapping("/customers/{id}") + fun get(@PathVariable id: Long): Customer = + service.findById(CustomerId(id)) + + @PostMapping("/customers") + fun create(@RequestBody dto: CustomerDto): Customer = + service.create(dto.toDomain()) +} +``` + +### P3: Event Publishers/Consumers ⬚ + +**Vision:** Generate Kafka producers/consumers with domain type serialization. + +**Example:** +```kotlin +// Producer +fun publish(customer: Customer) { + val record = customer.toAvro() + kafkaTemplate.send("customer-events", record) +} + +// Consumer +@KafkaListener(topics = ["customer-events"]) +fun handle(record: CustomerEvent) { + val customer = record.toDomain() + service.process(customer) +} +``` + +### P3: gRPC Services ⬚ + +**Vision:** Generate gRPC service implementations from domain types. 
+ +**Example:** +```kotlin +class CustomerGrpcService : CustomerServiceGrpc.CustomerServiceImplBase() { + + override fun getCustomer( + request: GetCustomerRequest, + responseObserver: StreamObserver<Customer> + ) { + val customer = service.findById(CustomerId(request.customerId)) + responseObserver.onNext(customer.toProto()) + responseObserver.onCompleted() + } +} +``` + +--- + +## Architecture Principles + +### 1. Single Source of Truth +Domain types are THE definition. All boundaries adapt to them. + +### 2. Fail Fast +Validation at build time, not runtime. Type mismatches are caught before deployment. + +### 3. Explicit Over Magic +Mappings are visible and auditable. No hidden conventions. + +### 4. Progressive Enhancement +Start with auto-detection, customize as needed. Simple cases stay simple. + +### 5. Language Agnostic +Same domain model generates idiomatic code for Scala, Kotlin, and Java. + +--- + +## File Reference + +### Core Types +- `typr/src/scala/typr/bridge/CompositeType.scala` - Domain type definitions +- `typr/src/scala/typr/bridge/model/*.scala` - Configuration models + +### Name Matching +- `typr/src/scala/typr/bridge/ColumnTokenizer.scala` - Name tokenization +- `typr/src/scala/typr/bridge/ColumnStemmer.scala` - Abbreviation expansion + +### Validation +- `typr/src/scala/typr/bridge/validation/FlowValidator.scala` - Main validator +- `typr/src/scala/typr/bridge/validation/SmartDefaults.scala` - Auto-matching +- `typr/src/scala/typr/bridge/validation/TypePolicyValidator.scala` - Type checking +- `typr/src/scala/typr/bridge/TypeNarrower.scala` - Type normalization + +### Code Generation +- `typr-codegen/src/scala/typr/internal/codegen/FileBridgeProjectionMapper.scala` - Mapper generation +- `typr-codegen/src/scala/typr/internal/codegen/FileBridgeCompositeType.scala` - Type generation + +### External Adapters +- `typr/src/scala/typr/avro/BridgeAvroAdapter.scala` - Avro integration +- `typr/src/scala/typr/grpc/BridgeProtoAdapter.scala` - Protobuf
integration + +### TUI +- `typr/src/scala/typr/cli/tui/screens/AvroBrowser.scala` - Avro schema browser +- `typr/src/scala/typr/cli/tui/screens/ProtoBrowser.scala` - Proto schema browser +- `typr/src/scala/typr/cli/tui/util/SourceLoader.scala` - Multi-source loading diff --git a/BRIDGE-BACKEND.md b/BRIDGE-BACKEND.md new file mode 100644 index 0000000000..19d92b0360 --- /dev/null +++ b/BRIDGE-BACKEND.md @@ -0,0 +1,489 @@ +# Bridge Backend: Architecture & Implementation + +The Bridge backend is a pure API layer that powers `typr check` and will later drive the TUI, Studio, and MCP server. It is entirely pure Scala -- no IO, no TUI dependency. Callers provide materialized data; IO lives at the CLI command level only. + +## Directory Layout + +``` +typr/src/scala/typr/bridge/ + model/ # Data model: declarations, policies, check results + SourceDeclaration.scala # FlowDirection, SourceRole, SourceDeclaration + FieldOverride.scala # FieldOverride, FieldDirection, CustomKind + TypePolicy.scala # TypePolicy enum + CheckResult.scala # Severity, CheckCode, CheckFinding, CheckReport, EntitySummary + ResolvedFlow.scala # ResolvedFieldAction, ResolvedEntityFlow, ResolvedSourceFlow + validation/ # Pure validation functions + TypePolicyValidator.scala # Type policy enforcement with family ordering + SmartDefaults.scala # Auto-resolution of field actions + FlowValidator.scala # Full entity validation producing CheckFindings + api/ # Public API surface + BridgeApi.scala # Trait: check(), resolveFlows() + BridgeApiImpl.scala # Composes validation modules + ConfigToBridge.scala # Config YAML -> bridge model conversion + TypeNarrower.scala # (modified) Type normalization + compatibility + CompositeType.scala # (modified) typeCompatible now real + +typr/src/scala/typr/cli/ + commands/Check.scala # `typr check` CLI command + Main.scala # (modified) checkCmd added + +tests/src/scala/typr/bridge/ + TypePolicyValidatorTest.scala # 15 tests + TypeNarrowerIntegrationTest.scala # 14 tests + 
SmartDefaultsTest.scala # 7 tests + FlowValidatorTest.scala # 14 tests + +typr-config.schema.json # (modified) direction, type_policy, field_overrides +``` + +--- + +## Core Concepts + +### Domain Types + +A **domain type** is an entity that exists across multiple sources (databases, APIs, event schemas). It has: + +- A **primary source**: the anchor entity this type is grounded in (e.g. `pg:sales.customer`) +- **Fields**: named fields with canonical types (`Int`, `String`, `Long`, `Boolean`, etc.) +- **Aligned sources**: other source entities mapped to/from this domain type + +These already existed in `CompositeType.scala` as `DomainTypeDefinition`, `DomainField`, `AlignedSource`, etc. + +### Source Declarations + +A **SourceDeclaration** (new) enriches an aligned source with Bridge-specific metadata: + +```scala +case class SourceDeclaration( + sourceName: String, // "pg", "api", "kafka" + entityPath: String, // "sales.customer", "Customer" + role: SourceRole, // Primary | Aligned + direction: FlowDirection, // In | Out | InOut + mode: CompatibilityMode, // Exact | Superset | Subset + mappings: Map[String, String], // domain field -> source field + exclude: Set[String], // source fields to ignore + includeExtra: List[String], // extra source fields for toSource mapper + readonly: Boolean, + defaultTypePolicy: TypePolicy, // default policy for all fields + fieldOverrides: Map[String, FieldOverride] // per-field exceptions +) +``` + +Key design: `fieldOverrides` is **sparse**. Most fields use smart defaults. Only explicitly annotated exceptions appear here. 
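As an illustration, a sparse declaration could look like the sketch below. It assumes the `SourceDeclaration` case class shown above together with straightforward companions for the enums; the field names (`legacyId`, `old_id`, `etag`) are hypothetical. Only one field carries an explicit override; everything else rides on smart defaults.

```scala
// Hypothetical: align domain type Customer with api:Customer.
// Sparse by design -- one explicit override, smart defaults for the rest.
val apiCustomer = SourceDeclaration(
  sourceName = "api",
  entityPath = "Customer",
  role = SourceRole.Aligned,
  direction = FlowDirection.Out,
  mode = CompatibilityMode.Superset,
  mappings = Map("customerId" -> "id"), // explicit domain-field -> source-field mapping
  exclude = Set("etag"),                // source field we never consume
  includeExtra = Nil,
  readonly = false,
  defaultTypePolicy = TypePolicy.Exact,
  fieldOverrides = Map(
    // per-field exception: forward from a differently named, wider column
    "legacyId" -> FieldOverride.Forward("old_id", TypePolicy.AllowNarrowing, None)
  )
)
```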
+ +### Flow Direction + +Each source has a direction indicating how data flows relative to the domain type: + +| Direction | Meaning | Missing field = | +|-----------|---------|-----------------| +| `In` | Data flows from source into domain | Warning (source doesn't provide it) | +| `Out` | Data flows from domain out to source | Error (you promised to produce it) | +| `InOut` | Bidirectional | Error (must be covered both ways) | + +### Type Policy + +Each source (or individual field) has a type policy controlling what type differences are allowed: + +| Policy | Rule | +|--------|------| +| `Exact` | Types must match exactly after normalization | +| `AllowWidening` | Source can be narrower than domain (SMALLINT -> INT ok) | +| `AllowNarrowing` | Source can be wider than domain (BIGINT -> INT ok) | +| `AllowPrecisionLoss` | Any numeric-to-numeric is ok (DECIMAL -> FLOAT) | +| `AllowTruncation` | Any string-to-string is ok | +| `AllowNullableToRequired` | Only validates nullability, skips type check | + +Policies are enforced within **type families**. Cross-family (VARCHAR -> INTEGER) always fails regardless of policy. + +### Field Overrides + +Per-field exceptions to smart defaults: + +```scala +sealed trait FieldOverride + Forward(sourceFieldName, typePolicy, directionOverride) // explicit forward mapping + Drop(reason) // explicitly excluded + Custom(kind) // special handling +``` + +Custom kinds: +- `MergeFrom(sourceFields)` -- combine multiple source fields into one domain field +- `SplitFrom(sourceField)` -- split one source field into this domain field +- `ComputedFrom(domainFields)` -- derived from other domain fields +- `Enrichment(description)` -- populated by external logic + +--- + +## Type System + +### Canonical Types + +Domain fields use canonical type names: `Int`, `Long`, `String`, `Boolean`, `BigDecimal`, `UUID`, `LocalDate`, `OffsetDateTime`, `Instant`, `Json`, `ByteArray`, etc. 
+ +Source fields use database type names: `integer`, `bigint`, `varchar`, `text`, `timestamp with time zone`, `jsonb`, etc. + +### Type Normalization + +`TypeNarrower` bridges these two worlds: + +**`mapCanonicalToNormalized(canonical: String): String`** (new) maps domain types to normalized DB names: +``` +Int -> INTEGER +Long -> BIGINT +String -> VARCHAR +BigDecimal -> DECIMAL +OffsetDateTime -> TIMESTAMPTZ +Instant -> TIMESTAMPTZ +UUID -> UUID +ByteArray -> BYTEA +``` + +**`normalizeDbType(dbType: String): String`** (now public) normalizes database-specific types: +``` +int4, integer, int, serial -> INTEGER +int8, bigint, bigserial -> BIGINT +varchar, text, char, bpchar -> VARCHAR +numeric, decimal -> DECIMAL +timestamp with time zone -> TIMESTAMPTZ +``` + +### Type Families + +Types are grouped into families for compatibility checking: + +| Family | Members | Ordering (narrow -> wide) | +|--------|---------|---------------------------| +| Integer | SMALLINT, INTEGER, BIGINT | 0, 1, 2 | +| Float | REAL, DOUBLE, DECIMAL | 0, 1, 2 | +| String | VARCHAR | (single member) | +| Timestamp | TIMESTAMP, TIMESTAMPTZ | 0, 1 | +| Time | TIME, TIMETZ | 0, 1 | + +`AllowWidening` means: source ordinal <= domain ordinal (source is same or narrower). +`AllowNarrowing` means: source ordinal >= domain ordinal (source is same or wider). +`AllowPrecisionLoss` allows any integer <-> float crossing. 
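The family/ordinal rules above can be sketched as a tiny standalone check. This is illustrative only -- the real logic lives in `TypePolicyValidator` -- but it captures the two rules: policies compare ordinals within a family, and cross-family comparisons always fail.

```scala
object PolicySketch {
  // family name -> members in narrow-to-wide order (ordinal = index)
  private val families: Map[String, Vector[String]] = Map(
    "integer"   -> Vector("SMALLINT", "INTEGER", "BIGINT"),
    "float"     -> Vector("REAL", "DOUBLE", "DECIMAL"),
    "string"    -> Vector("VARCHAR"),
    "timestamp" -> Vector("TIMESTAMP", "TIMESTAMPTZ"),
    "time"      -> Vector("TIME", "TIMETZ")
  )

  private def familyOf(t: String): Option[(String, Int)] =
    families.collectFirst {
      case (fam, members) if members.contains(t) => (fam, members.indexOf(t))
    }

  /** AllowWidening: source ordinal <= domain ordinal, same family. */
  def allowWidening(domain: String, source: String): Boolean =
    (familyOf(domain), familyOf(source)) match {
      case (Some((df, d)), Some((sf, s))) => df == sf && s <= d
      case _                              => false // cross-family or unknown: always fails
    }

  /** AllowNarrowing: source ordinal >= domain ordinal, same family. */
  def allowNarrowing(domain: String, source: String): Boolean =
    (familyOf(domain), familyOf(source)) match {
      case (Some((df, d)), Some((sf, s))) => df == sf && s >= d
      case _                              => false
    }
}
```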
+ +### typeCompatible Fix + +`AlignmentComputer.typeCompatible` (previously a stub returning `true`) now uses real type checking: + +```scala +private def typeCompatible(domainField: DomainField, sourceField: SourceField): Boolean = { + val domainNorm = TypeNarrower.mapCanonicalToNormalized(domainField.typeName) + val sourceNorm = TypeNarrower.normalizeDbType(sourceField.typeName) + domainNorm == sourceNorm || TypeNarrower.areTypesCompatible(domainNorm, sourceNorm) +} +``` + +This means `AlignmentComputer.computeAlignment()` now correctly detects `TypeMismatch` for incompatible types (e.g. `String` domain field vs `integer` source field). + +--- + +## Validation Pipeline + +The validation pipeline has three layers, each pure: + +### 1. SmartDefaults + +``` +SmartDefaults.resolveFieldActions(domainType, sourceDecl, sourceEntity, nameAligner) + -> List[ResolvedFieldAction] +``` + +For each **domain field**, in order: + +1. If an explicit `FieldOverride` exists for this field, use it directly +2. Else try name alignment (explicit mappings first, then `NameAligner` auto-matching) + - **Match found**: `Forward(isAutoMatched=true)` with the source's default type policy + - **No match**: `Drop(isAutoDropped=true)` + +For each **source field** not consumed by any domain field: + +- If in `exclude` or `includeExtra` set -> skip +- If mode is `Superset` or `Subset` -> skip (allowed) +- If mode is `Exact` -> `Unannotated` (needs human decision) + +The result is a `List[ResolvedFieldAction]` with four variants: + +``` +Forward(domainField, sourceField, typePolicy, isAutoMatched) +Drop(domainField, reason, isAutoDropped) +Custom(domainField, kind) +Unannotated(sourceField, sourceType, sourceNullable) +``` + +### 2. TypePolicyValidator + +``` +TypePolicyValidator.validateWithCanonical(domainCanonicalType, sourceDbType, policy) + -> Either[String, Unit] +``` + +Given a canonical domain type, a database source type, and a policy, this either passes or returns an error message. 
It: + +1. Normalizes both types through `mapCanonicalToNormalized` and `normalizeDbType` +2. Applies the policy rules using family membership and ordering +3. Cross-family comparisons always fail (except `AllowNullableToRequired`) + +### 3. FlowValidator + +``` +FlowValidator.validateEntity(domainType, sourceDeclarations, sourceEntities, nameAligner) + -> List[CheckFinding] +``` + +This is the top-level validator that produces a flat list of findings. It checks: + +**Structural checks:** +- Domain type has at least one field (`NoFields`) +- At least one source has `role = Primary` (`NoPrimarySource`) +- Each declared source entity exists in the provided data (`SourceEntityNotFound`) + +**Per-source checks** (calls SmartDefaults, then validates each resolved action): + +- `Forward` fields: + - Source field must exist (`MissingRequiredField`) + - Type must pass policy validation (`TypeIncompatible` for Exact, `TypePolicyViolation` for others) + - Nullability must be compatible unless `AllowNullableToRequired` (`NullabilityMismatch`) + +- `Drop` fields on out/inout sources: Warning that data will be lost + +- `Custom(MergeFrom)`: all referenced source fields must exist (`InvalidMergeFromRef`) +- `Custom(SplitFrom)`: referenced source field must exist (`InvalidSplitFromRef`) +- `Custom(ComputedFrom)`: all referenced domain fields must exist (`InvalidComputedFromRef`) + +- `Unannotated`: source field exists but has no mapping (`UnannotatedField`) + +**Direction validation** (separate pass after SmartDefaults): + +For each domain field NOT covered by a Forward or Custom action: +- `Out` source: Error -- you promised to produce this field +- `InOut` source: Error -- bidirectional requires coverage +- `In` source: Warning -- source doesn't provide it + +--- + +## Check Codes + +| Code | Severity | Meaning | +|------|----------|---------| +| `NoFields` | Error | Domain type has zero fields | +| `NoPrimarySource` | Error | No source declared as primary | +| 
`SourceEntityNotFound` | Error | Declared source entity doesn't exist | +| `UnannotatedField` | Error | Source field in exact mode not mapped to anything | +| `MissingRequiredField` | Error/Warning | Field not covered (error for out, warning for in) | +| `TypeIncompatible` | Error | Types don't match under Exact policy | +| `TypePolicyViolation` | Error | Types don't satisfy the declared policy | +| `NullabilityMismatch` | Error | Required in domain but nullable in source | +| `InvalidMergeFromRef` | Error | MergeFrom references nonexistent source field | +| `InvalidSplitFromRef` | Error | SplitFrom references nonexistent source field | +| `InvalidComputedFromRef` | Error | ComputedFrom references nonexistent domain field | + +--- + +## BridgeApi + +The public API surface: + +```scala +trait BridgeApi { + def check( + domainTypes: Map[String, DomainTypeDefinition], + sourceDeclarations: Map[String, Map[String, SourceDeclaration]], + sourceEntities: Map[String, Map[String, SourceEntity]], + nameAligner: NameAligner + ): CheckReport + + def resolveFlows( + domainType: DomainTypeDefinition, + sourceDeclarations: Map[String, SourceDeclaration], + sourceEntities: Map[String, SourceEntity], + nameAligner: NameAligner + ): ResolvedEntityFlow +} +``` + +**`check()`** iterates all domain types, runs `FlowValidator.validateEntity` for each, and collects: +- `findings`: flat list of all `CheckFinding` across all entities +- `entitySummaries`: per-entity summary with field/source/forward/drop/custom/error/warning counts +- `exitCode`: 0 if no errors, 1 if any errors + +**`resolveFlows()`** returns the resolved field actions for a single domain type across all its sources, without generating findings. Useful for UI display (showing what will happen to each field). 
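A hypothetical call site for `check()` could look like this. The wiring is illustrative, and the `CheckFinding` field names (`code`, `entityName`, `message`) are assumptions, not the real model:

```scala
// Hypothetical CLI-side wiring; IO stays at the caller, the API itself is pure.
val api: BridgeApi = new BridgeApiImpl()

val report = api.check(domainTypes, sourceDeclarations, sourceEntities, nameAligner)

// Render findings; field names on CheckFinding are assumed for illustration.
report.findings.foreach { f =>
  println(s"[${f.code}] ${f.entityName}: ${f.message}")
}

sys.exit(report.exitCode) // 0 = no errors, 1 = at least one error
```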
+ +**Data flow in `check()`:** + +``` +sourceDeclarations[entityName] -> Map[sourceKey, SourceDeclaration] +sourceEntities[sourceName][entityPath] -> SourceEntity + | + flattenSourceEntities() joins these into Map[sourceKey, SourceEntity] + | + FlowValidator.validateEntity() -------> List[CheckFinding] + SmartDefaults.resolveFieldActions() --> action counts for EntitySummary +``` + +--- + +## Config Schema Extensions + +Three new properties added to `alignedSource` in `typr-config.schema.json`: + +```yaml +# typr.yaml example +types: + Customer: + kind: domain + primary: "pg:sales.customer" + fields: + id: Long + name: String + email: "String?" + alignedSources: + "api:Customer": + entity: Customer + mode: superset + direction: out # NEW: in | out | in-out + type_policy: allow_widening # NEW: controls default type checking + field_overrides: # NEW: per-field exceptions + email: drop + full_name: + action: custom + merge_from: [first_name, last_name] + legacy_id: + action: forward + source_field: old_id + type_policy: allow_narrowing +``` + +Field overrides support two syntaxes: + +**Short form**: `"forward"` or `"drop"` as a string + +**Long form**: object with `action`, `source_field`, `type_policy`, `reason`, `direction`, `merge_from`, `split_from`, `computed_from`, `enrichment` + +--- + +## ConfigToBridge + +Converts parsed config types into bridge model types. Handles: + +- `convertAlignedSource()`: AlignedSource + new fields -> SourceDeclaration +- `convertPrimarySource()`: primary key -> SourceDeclaration with Primary role +- `convertDomainTypeDefinition()`: full domain type -> Map of all SourceDeclarations +- `parseFieldOverrides()`: JSON map -> Map[String, FieldOverride], handling both short and long forms + +The generated config types (`typr.config.generated.AlignedSource`) don't yet have the new fields (`direction`, `type_policy`, `field_overrides`) since `bleep generate-sources` hasn't been re-run. 
The Check command currently defaults direction to `InOut`, type policy to `Exact`, and field overrides to empty. After regeneration, ConfigToBridge will be able to read them from the parsed YAML. + +--- + +## `typr check` CLI Command + +``` +typr check [--config typr.yaml] [--quiet] [--debug] +``` + +What it does: + +1. Reads and parses `typr.yaml` (with env var substitution) +2. Extracts all domain type definitions from the `types` section +3. Builds SourceDeclarations from primary + aligned sources +4. Calls `BridgeApi.check()` with empty source entities (config-only validation for now) +5. Renders findings grouped by errors/warnings +6. Returns exit code 0 (pass) or 1 (errors found) + +Output format: +``` +ERRORS (3): + [TypeIncompatible] Customer > source=api:Customer > field=amount + Exact policy requires matching types but got domain=VARCHAR, source=INTEGER + Suggestion: Adjust the type policy or update the field type + +WARNINGS (1): + [MissingRequiredField] Customer > source=kafka:events > field=email + In-source 'kafka:events' does not provide field 'email' + +Entity Summary: + Customer: 5 fields, 3 sources, 8 forward, 2 drop, 1 custom + 3 errors, 1 warnings + +Check FAILED: 3 error(s), 1 warning(s) +``` + +Wired into Main.scala as the second subcommand after `generate`: +``` +typr generate ... +typr check ... +typr interactive ... +typr watch ... +typr init +``` + +--- + +## Test Coverage + +All tests are pure (no database needed). 
50 tests across 4 suites: + +### TypePolicyValidatorTest (15 tests) + +- Exact: matching passes, mismatch fails, normalized equivalents pass +- AllowWidening: narrower source passes, wider source fails, cross-family fails +- AllowNarrowing: wider source passes, narrower source fails, cross-family fails +- AllowPrecisionLoss: numeric-to-numeric passes, string-to-int fails +- AllowTruncation: string-to-string passes, string-to-int fails +- AllowNullableToRequired: always passes type check +- `validateWithCanonical`: canonical `Int` + db `bigint` with policies +- Cross-family always fails regardless of policy (except AllowNullableToRequired) + +### TypeNarrowerIntegrationTest (14 tests) + +- `mapCanonicalToNormalized`: all canonical types map correctly (Int->INTEGER, String->VARCHAR, etc.) +- `normalizeDbType`: int4->INTEGER, serial->INTEGER, text->VARCHAR, numeric->DECIMAL +- `areTypesCompatible`: INTEGER/BIGINT compatible, VARCHAR/INTEGER not, TIMESTAMP/TIMESTAMPTZ compatible +- AlignmentComputer integration: Int domain + bigint source -> Aligned(true), String domain + integer source -> TypeMismatch + +### SmartDefaultsTest (7 tests) + +- Field present in both: auto-forward with isAutoMatched=true +- Field in domain but not source: auto-drop with isAutoDropped=true +- Explicit Forward override: uses specified source_field and type_policy +- Explicit Drop override: uses specified reason +- Source field not in domain, Exact mode: produces Unannotated +- Source field not in domain, Superset mode: silently ignored +- Source field in exclude: ignored even in Exact mode +- MergeFrom override: resolved as Custom + +### FlowValidatorTest (14 tests) + +- Clean config (all fields auto-forward, compatible types): zero errors +- Missing primary source: NoPrimarySource error +- Source entity not found: SourceEntityNotFound error +- Unannotated field in Exact mode: UnannotatedField error with suggestion +- Type mismatch with Exact policy: TypeIncompatible error +- Type mismatch 
with correct AllowWidening policy: no error +- Type mismatch with wrong AllowWidening policy: TypePolicyViolation error +- Drop on out-source: Warning +- MergeFrom referencing nonexistent source field: InvalidMergeFromRef error +- ComputedFrom referencing nonexistent domain field: InvalidComputedFromRef error +- Out source missing domain field: MissingRequiredField error +- In source missing domain field: Warning only +- InOut source missing domain field: MissingRequiredField error +- No fields: NoFields error +- Nullability mismatch: required domain + nullable source -> NullabilityMismatch +- End-to-end multi-source: Customer with pg (primary, InOut) + api (aligned, Out) + +--- + +## What's Next + +Things that need to happen to make this fully operational: + +1. **`bleep generate-sources`**: Regenerate config types so `AlignedSource` gets the new `direction`, `type_policy`, `field_overrides` fields from the updated JSON schema. Then update Check.scala to read them instead of defaulting. + +2. **Source entity loading in Check**: Currently `typr check` passes empty `sourceEntities`. The next step is to connect to databases (reusing Generate's source fetching) to build `SourceEntity` instances from MetaDb, so check validates against live schemas. + +3. **TUI integration**: `BridgeApi.resolveFlows()` returns `ResolvedEntityFlow` which the TUI can render as a field-by-field mapping view. + +4. **Studio / MCP**: The pure API is ready to be called from a web server or MCP endpoint. All IO stays at the caller level. diff --git a/CLAUDE.md b/CLAUDE.md index 6f8446eab9..49f3bcbc3a 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -22,7 +22,7 @@ Typr is a database code generator that creates type-safe JVM code from database ## Build System -This project uses **Bleep** (https://bleep.build/) as the primary build tool. Kotlin modules are also buildable with Gradle. +This project uses **Bleep** (https://bleep.build/) as the build tool. 
### Common Bleep Commands ```bash @@ -38,35 +38,25 @@ bleep fmt # Regenerate tuples, rowparsers and so on. runs bleep sourcegen scripts bleep sourcegen -# Code generation scripts -bleep generate-postgres # PostgreSQL AdventureWorks + Showcase schemas -bleep generate-mariadb # MariaDB test + Showcase schemas -bleep generate-duckdb # DuckDB test + Showcase schemas -bleep generate-oracle # Oracle test + Showcase schemas -bleep generate-sqlserver # SQL Server test + Showcase schemas -bleep generate-openapi-test # OpenAPI code generation -bleep generate-sources # Typr's internal generated code - -# Run all generators in parallel. much faster -bleep generate-all +# Typr's internal generated code +bleep generate-sources # Documentation bleep generate-docs # Generate documentation with mdoc ``` -### Gradle for Kotlin -Kotlin modules have Gradle build files for IDE support and alternative building: +### Typr CLI for Code Generation +Database code generation is done via the **typr CLI** instead of bleep scripts: + ```bash -./gradlew :testers:pg:kotlin:build -./gradlew :foundations-jdbc-dsl-kotlin:build +# Generate code (reads typr.yaml config) +bleep run typr generate ``` -The Gradle project includes: -- `foundations-jdbc` - Java runtime module -- `foundations-jdbc-dsl` - Java DSL module -- `foundations-jdbc-dsl-kotlin` - Kotlin DSL module -- All Kotlin testers (`testers:pg:kotlin`, `testers:mariadb:kotlin`, etc.) 
-- OpenAPI Kotlin testers (`testers:openapi:kotlin:jaxrs`, `spring`, `quarkus`) +The CLI reads configuration from `typr.yaml` and generates type-safe code for: +- Tables, views, and custom SQL queries +- Type overrides (scalar types, composite types) +- Selected schemas and tables ## Tester Project Layout @@ -84,8 +74,7 @@ testers/{database}/{language}/ ├── src/ # Manually written test code │ └── {lang}/ # java/, kotlin/, or scala/ │ └── {package}/ # Test files (e.g., *Test.java, *Test.scala) -├── build.gradle.kts # Kotlin testers only - Gradle build file -└── gradle.properties # Kotlin testers only - Gradle properties +└── ... ``` ### Scala Cross-Compilation @@ -106,7 +95,6 @@ testers/pg/scala/anorm/ **Kotlin tester** (`testers/pg/kotlin/`): - `generated-and-checked-in/` - Generated Kotlin code - `src/kotlin/` - Test files -- `build.gradle.kts` - Gradle build configuration **Scala Anorm tester** (`testers/pg/scala/anorm/`): - `generated-and-checked-in-2.13/` - Scala 2.13 variant @@ -115,16 +103,16 @@ testers/pg/scala/anorm/ ### Generation Variants -The `GeneratedPostgres.scala` script generates multiple variants: -- Scala 2.13 + Anorm + PlayJson -- Scala 3 + Anorm + PlayJson -- Scala 2.13 + Doobie + Circe -- Scala 3 + Doobie + Circe -- Scala 2.13 + ZIO-JDBC + ZioJson -- Scala 3 + ZIO-JDBC + ZioJson -- Java + Typo DSL + Jackson -- Scala 3 with Java types + Typo DSL + Jackson -- Scala 3 with Scala types + Typo DSL + Jackson +The typr CLI supports generating code for multiple combinations: +- Scala 2.13 + Anorm + PlayJson (PostgreSQL only) +- Scala 3 + Anorm + PlayJson (PostgreSQL only) +- Scala 2.13 + Doobie + Circe (PostgreSQL only) +- Scala 3 + Doobie + Circe (PostgreSQL only) +- Scala 2.13 + ZIO-JDBC + ZioJson (PostgreSQL only) +- Scala 3 + ZIO-JDBC + ZioJson (PostgreSQL only) +- Java + Typo DSL + Jackson (all databases) +- Kotlin + Typo DSL + Jackson (all databases) +- Scala 3 + Typo DSL + Jackson (all databases) ## Docker-Compose Database Setup @@ -167,7 +155,7 
@@ docker-compose down docker-compose up -d # 4. Regenerate code -bleep generate-postgres +bleep run typr generate ``` **MariaDB schema changes:** @@ -178,7 +166,7 @@ docker-compose down docker-compose up -d # 3. Regenerate code -bleep generate-mariadb +bleep run typr generate ``` **Oracle schema changes:** @@ -193,15 +181,15 @@ docker-compose up -d docker-compose logs -f oracle # 4. Regenerate code -bleep generate-oracle +bleep run typr generate ``` **Complete reset (all databases):** ```bash docker-compose down -v # -v removes volumes docker-compose up -d -# Wait for all databases to initialize -bleep generate-all +# Wait for all databases to initialize, then regenerate +bleep run typr generate ``` ### Persistent Volumes @@ -314,7 +302,7 @@ typr/ # Main code generator testers/ # Integration test projects ├── pg/ # PostgreSQL testers │ ├── java/ # Java tester -│ ├── kotlin/ # Kotlin tester (Gradle buildable) +│ ├── kotlin/ # Kotlin tester │ └── scala/ # Scala testers (anorm, doobie, zio-jdbc, scalatypes, javatypes) ├── mariadb/ # MariaDB testers (java, kotlin, scala) ├── duckdb/ # DuckDB testers (java, kotlin, scala) @@ -333,14 +321,10 @@ typr-dsl-anorm/ # [LEGACY] Anorm-specific DSL (Scala, Postgre typr-dsl-doobie/ # [LEGACY] Doobie-specific DSL (Scala, PostgreSQL only) typr-dsl-zio-jdbc/ # [LEGACY] ZIO-JDBC-specific DSL (Scala, PostgreSQL only) -typr-scripts/ # Generation scripts -├── GeneratedPostgres.scala # PostgreSQL generation -├── GeneratedMariaDb.scala # MariaDB generation -├── GeneratedDuckDb.scala # DuckDB generation -├── GeneratedOracle.scala # Oracle generation -├── GeneratedSqlServer.scala # SQL Server generation -├── GenerateOpenApiTest.scala # OpenAPI generation -├── GenerateAll.scala # Run all generators +typr-scripts/ # Build and publishing scripts +├── GeneratedSources.scala # Typr's internal generated code +├── PublishLocal.scala # Local publishing +├── Publish.scala # Release publishing └── ... 
sql-init/ # Schema files (mounted to Docker) @@ -442,7 +426,7 @@ WHERE p.productcategory = :category_id:myapp.production.productcategory.Productc 1. **Create Test Case**: Add SQL file in `sql-init/postgres/issueNNN.sql` (or appropriate database folder) 2. **Update Install Script**: Add to `sql-init/postgres/00-install.sh` for PostgreSQL 3. **Restart Database**: `docker-compose down && docker-compose up -d` -4. **Generate Code**: Run appropriate generator (e.g., `bleep generate-postgres`) +4. **Generate Code**: Run `bleep run typr generate` 5. **Trace Issue**: Examine generated code 6. **Commit Test Setup**: Commit before making changes 7. **Implement Fix**: Make code changes @@ -462,8 +446,6 @@ bleep test testers/mariadb/scala # Run a specific test class within a project bleep test foundations-jdbc-test --only DuckDbTypeTest -# Kotlin tests via Gradle -./gradlew :testers:pg:kotlin:test ``` ## Documentation @@ -477,8 +459,6 @@ cd site && npm install && npm run build ## Key Files - `bleep.yaml` - Main build configuration (all projects, scripts, templates) -- `build.gradle.kts` - Root Gradle config for Kotlin modules -- `settings.gradle.kts` - Gradle project structure - `docker-compose.yml` - Database infrastructure (PostgreSQL, MariaDB, Oracle, SQL Server, DB2) - `sql-init/postgres/00-install.sh` - PostgreSQL initialization script @@ -486,7 +466,7 @@ cd site && npm install && npm run build ### Common Issues - **Database Connection**: Ensure Docker containers are running -- **Kotlin Compilation**: Use Gradle for Kotlin modules if Bleep has issues +- **Kotlin Compilation**: Check bleep.yaml for Kotlin project definitions - **Generated Code Errors**: Re-run appropriate generator after schema changes - **Oracle slow to start**: Wait 1-2 minutes, check `docker-compose logs -f oracle` - **Stale data**: Remove volumes with `docker-compose down -v` @@ -517,7 +497,7 @@ docker-compose logs -f oracle ### Code Generation Philosophy - Never generate code that relies on
derivation - we are the deriver -- Run appropriate generator (e.g., `bleep generate-postgres`) before testing to see codegen effects +- Run `bleep run typr generate` before testing to see codegen effects ### Development Rules - Always run `bleep fmt` and `bleep test` before commiting @@ -533,4 +513,5 @@ docker-compose logs -f oracle - WHEN YOU CHANGE CODE, NEVER LEAVE DANGLING COMMENTS DESCRIBING HOW IT WAS BEFORE OR WHY YOU MADE A CHANGE. WE HAVE GIT FOR THAT - when restarting a database container always restart only the one you want to restart. it takes ages to start all - UNDER NO CIRCUMSTANCE, EVER. FUCKING EVER. WILL CLAUDE GIVE UP AND REVERT ALL THE FILES -- NEVER HIDE PROBLEMS BY WORKING AROUND THEM. When you discover an issue (e.g., serialization doesn't work, types don't match, framework integration fails), IMMEDIATELY TELL THE USER. Do not quietly work around it with simpler/different code and pretend everything is fine. Tests exist to find these problems - report them, don't hide them. \ No newline at end of file +- NEVER HIDE PROBLEMS BY WORKING AROUND THEM. When you discover an issue (e.g., serialization doesn't work, types don't match, framework integration fails), IMMEDIATELY TELL THE USER. Do not quietly work around it with simpler/different code and pretend everything is fine. Tests exist to find these problems - report them, don't hide them. +- **DO NOT COMPARE WITH "PRE-EXISTING" STATE.** When there are compilation errors or test failures, FIX THEM. Do not check if they existed before your changes, do not stash and compare with HEAD, do not say "these are pre-existing errors". The branch has many commits and may have just been rebased - "pre-existing" is meaningless. If it doesn't compile, fix it. 
\ No newline at end of file diff --git a/TYPR-DOMAIN-PROGRESS.md b/TYPR-DOMAIN-PROGRESS.md new file mode 100644 index 0000000000..656608d645 --- /dev/null +++ b/TYPR-DOMAIN-PROGRESS.md @@ -0,0 +1,183 @@ +# Typr Domain - Progress Tracker + +## Architecture + +``` + ┌─────────────────┐ + │ TYPR DOMAIN │ + │ │ + │ Person │ + │ Address │ + │ Order │ + │ ... │ + └────────┬────────┘ + │ + ┌─────────────────┼─────────────────┐ + │ │ │ + ▼ ▼ ▼ + ┌──────────┐ ┌──────────┐ ┌──────────┐ + │ Database │ │ APIs │ │ Events │ + │ │ │ │ │ │ + │ PersonRow│ │PersonDto │ │PersonAvro│ + │.toDomain │ │.toDomain │ │.toDomain │ + │.fromDom │ │.fromDom │ │.fromDom │ + └──────────┘ └──────────┘ └──────────┘ +``` + +**Domain is the hub. Boundaries are spokes.** + +--- + +## Priority Stack + +### P0: Foundation +- [x] Domain type DSL (Bridge core) +- [x] Name alignment engine +- [x] Matching rules engine +- [x] toDomain/fromDomain generation + +### P1: Core Validation +- [x] Field coverage checks +- [x] Type compatibility checks +- [ ] Missing value analysis +- [ ] Nested type resolution + +### P2: Smart Infrastructure +- [ ] Domain-centric repositories (return domain, not rows) +- [ ] Domain-centric services + +### P3: Full Generation +- [ ] CRUD endpoints +- [ ] Event publishers/consumers +- [ ] gRPC services + +--- + +## Boundary Integration Status + +### Database Boundary +- [x] PostgreSQL +- [x] MariaDB/MySQL +- [x] Oracle +- [x] SQL Server +- [x] DuckDB +- [x] DB2 + +### API Boundary (OpenAPI) +- [x] Schema parsing +- [x] Type extraction +- [x] Field alignment +- [x] Mapper generation +- [x] TUI browsing + +### Events Boundary (Avro/Kafka) +- [x] Schema parsing (AvroParser) +- [x] Type extraction (AvroTypes) +- [x] TUI source loading (SourceLoader) +- [x] TUI field extraction (ProjectionFieldExtractor) +- [x] TUI schema browser (AvroBrowser) +- [x] Type narrowing/compatibility (TypeNarrower) +- [x] Mapper generation (FileBridgeProjectionMapper with ExternalRecord) +- [x] Validation 
(FlowValidator with SourceEntityType.Record) + +### RPC Boundary (gRPC/Protobuf) +- [x] Protobuf parsing (existing in typr/grpc/) +- [x] TUI source loading +- [x] TUI field extraction +- [x] TUI schema browser (ProtoBrowser) +- [x] Type narrowing/compatibility +- [x] Mapper generation +- [x] Validation + +--- + +## Next Steps + +### P1: Core Validation (Remaining) +- [ ] Missing value analysis (detect fields that need defaults) +- [ ] Nested type resolution (domain types referencing other domain types) +- [ ] External contract ownership model (see below) + +### P1.5: Contract Ownership & In/Out Types + +**Key Insight:** Some contracts are external (we consume but can't change), some are ours (we define). + +| Direction | Ownership | Generated Code | Validation | +|-----------|-----------|----------------|------------| +| `In` | External | `toDomain()` only | Warning if missing | +| `Out` | Ours | `fromDomain()` only | Error if missing | +| `InOut` | Ours | Both | Error if missing | + +**Automatic Direction Detection:** + +APIs are functions with inputs and outputs - we can detect direction automatically: +``` +POST /customers + requestBody: CustomerCreate → In (we receive) + response: CustomerResponse → Out (we send) + +External API call (reversed): + request: we send → Out from domain + response: we receive → In to domain +``` + +| Source | Detection Method | +|--------|------------------| +| OpenAPI | Schema position in request vs response | +| gRPC | Message position in service method | +| Avro | Producer vs consumer role | +| Database | Always InOut | + +**Tasks:** +- [ ] Auto-detect In/Out from OpenAPI schema positions +- [ ] Auto-detect In/Out from gRPC service definitions +- [ ] Add `ownership: external | internal` flag to SourceDeclaration +- [ ] Suppress `toSource()` generation for external contracts +- [ ] Different validation severity based on ownership +- [ ] Contract versioning for external schemas +- [ ] Adapter generation for external → internal 
mapping + +**Use Cases:** +- Partner API webhooks (In, external) - adapt to their schema +- Your public API (Out, internal) - you define the contract +- Calling external API (Out=request, In=response) +- Third-party event streams (In, external) - consume but can't change +- Your database (InOut, internal) - full control + +### P2: Smart Infrastructure +- [ ] Domain-centric repositories (return domain types, not rows) +- [ ] Domain-centric services (orchestrate across boundaries) + +### P3: Full Generation +- [ ] CRUD endpoints (REST from domain types) +- [ ] Event publishers/consumers (Kafka from domain types) +- [ ] gRPC services (from domain types) + +--- + +## Completed Work Log + +### 2026-02-08: gRPC/Protobuf Integration Complete +- Added SourceStatus.ReadyProto to TuiState +- Implemented loadProtoSource() in SourceLoader +- Added extractFromProto() and formatProtoType() to ProjectionFieldExtractor +- Added ExtractedSourceType.Message and SourceEntityType.Message +- Updated getAvailableEntities() for Proto sources +- Updated allReadySources/allReady extension methods +- Created ProtoBrowser.scala screen with full navigation +- Added ProtoMessageInfo, ProtoFieldInfo, ProtoBrowserState types +- Added AppScreen.ProtoBrowser and Location.ProtoBrowser +- Extended TypeNarrower with normalizeProtoType(), mapProtoTypeToCanonical(), isProtoTypeCompatible() +- Created BridgeProtoAdapter.scala for Proto→ExternalRecord conversion +- Extended normalizeDbType() to handle Proto scalar and well-known types +- FlowValidator now works with Proto sources via generic SourceEntity abstraction + +### 2024-02-08: Avro/Kafka Integration Complete +- Added SpecSourceType.Avro and SourceStatus.ReadyAvro +- Added Avro source loading to SourceLoader +- Added Avro field extraction to ProjectionFieldExtractor +- Created AvroBrowser TUI screen +- Extended TypeNarrower for Avro type compatibility +- Added ExternalRecord abstraction to FileBridgeProjectionMapper +- Created BridgeAvroAdapter 
for Avro→ExternalRecord conversion +- Extended FlowValidator with proper source type tracking diff --git a/bleep.yaml b/bleep.yaml index 3ebe0e9518..bf5c6a1aef 100644 --- a/bleep.yaml +++ b/bleep.yaml @@ -1,27 +1,14 @@ $schema: https://raw.githubusercontent.com/oyvindberg/bleep/master/schema.json -$version: 0.0.14 +$version: 1.0.0-M3 resolvers: - https://packages.confluent.io/maven/ jvm: name: graalvm-community:25.0.0 projects: - foundations-jdbc: - dependencies: - - configuration: provided - module: com.ibm.db2:jcc:11.5.9.0 - - configuration: provided - module: com.microsoft.sqlserver:mssql-jdbc:12.8.1.jre11 - - configuration: provided - module: com.oracle.database.jdbc:ojdbc11:23.6.0.24.10 - - configuration: provided - module: org.duckdb:duckdb_jdbc:1.1.3 - - org.jetbrains:annotations:26.0.1 - - configuration: provided - module: org.mariadb.jdbc:mariadb-java-client:3.5.1 - - configuration: provided - module: org.postgresql:postgresql:42.7.3 + typr-dsl: + dependencies: dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT java: - options: -proc:none + options: --release 24 -proc:none platform: name: jvm sourcegen: @@ -29,71 +16,98 @@ projects: project: typr-scripts-sourcegen - main: scripts.GeneratedTuples project: typr-scripts-sourcegen - foundations-jdbc-dsl: - dependsOn: foundations-jdbc - java: - options: --release 24 -proc:none - platform: - name: jvm + typr-dsl-kotlin: + dependencies: + - dev.typr.foundations:foundations-jdbc-kotlin:1.0.0-RC6-SNAPSHOT + - configuration: provided + module: org.postgresql:postgresql:42.7.3 + dependsOn: typr-dsl + extends: template-kotlin + kotlin: + options: -Xnested-type-aliases sourcegen: + - project: typr-scripts-sourcegen + main: scripts.GeneratedRowParsers + - project: typr-scripts-sourcegen main: scripts.GeneratedTuples - project: typr-scripts-sourcegen - foundations-jdbc-hikari: + typr-dsl-scala: dependencies: - - com.zaxxer:HikariCP:6.2.1 - dependsOn: foundations-jdbc + - 
dev.typr.foundations:foundations-jdbc-scala_3:1.0.0-RC6-SNAPSHOT + - configuration: provided + module: org.postgresql:postgresql:42.7.3 + dependsOn: typr-dsl + extends: template-scala-3 + sourcegen: + - project: typr-scripts-sourcegen + main: scripts.GeneratedRowParsers + - project: typr-scripts-sourcegen + main: scripts.GeneratedTuples + test-utils: java: options: -proc:none platform: name: jvm - # cannot build this, its just here for codegen - foundations-jdbc-dsl-kotlin: - dependsOn: - - foundations-jdbc-dsl - platform: - name: jvm - sourcegen: - - main: scripts.GeneratedRowParsers - project: typr-scripts-sourcegen - - main: scripts.GeneratedTuples - project: typr-scripts-sourcegen - foundations-jdbc-dsl-scala: - dependencies: - configuration: provided - module: org.postgresql:postgresql:42.7.3 - dependsOn: - - foundations-jdbc-dsl - - foundations-jdbc-scala - extends: template-scala-3 - sourcegen: - - main: scripts.GeneratedRowParsers - project: typr-scripts-sourcegen - - main: scripts.GeneratedTuples - project: typr-scripts-sourcegen - foundations-jdbc-scala: - dependsOn: foundations-jdbc - extends: template-scala-3 - foundations-jdbc-test: + testers/combined/java: dependencies: - - com.ibm.db2:jcc:11.5.9.0 - - com.microsoft.sqlserver:mssql-jdbc:12.8.1.jre11 - - com.novocode:junit-interface:0.11 - - com.oracle.database.jdbc:ojdbc11:23.6.0.24.10 + - com.fasterxml.jackson.core:jackson-annotations:2.17.2 + - com.fasterxml.jackson.core:jackson-databind:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 + - com.github.sbt:junit-interface:0.13.3 + - io.confluent:kafka-avro-serializer:7.8.0 + - io.smallrye.reactive:mutiny-zero-flow-adapters:1.0.0 + - io.smallrye.reactive:mutiny:2.6.1 + - jakarta.validation:jakarta.validation-api:3.0.2 + - jakarta.ws.rs:jakarta.ws.rs-api:3.1.0 - junit:junit:4.13.2 - - org.duckdb:duckdb_jdbc:1.1.3 - - org.mariadb.jdbc:mariadb-java-client:3.5.1 + - 
org.apache.avro:avro:1.12.0 + - org.apache.kafka:kafka-clients:3.9.0 + - org.mariadb.jdbc:mariadb-java-client:3.5.0 - org.postgresql:postgresql:42.7.3 - dependsOn: foundations-jdbc-hikari + dependsOn: + - typr-dsl isTestProject: true java: options: -proc:none platform: name: jvm - test-utils: - java: - options: -proc:none - platform: - name: jvm + sources: + - ./generated-and-checked-in/api + - ./generated-and-checked-in/avro_events + - ./generated-and-checked-in/combined + - ./generated-and-checked-in/mariadb + - ./generated-and-checked-in/postgres + testers/combined/kotlin: + dependencies: + - com.fasterxml.jackson.core:jackson-annotations:2.17.2 + - com.fasterxml.jackson.core:jackson-databind:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 + - com.fasterxml.jackson.module:jackson-module-kotlin:2.17.2 + - com.github.sbt:junit-interface:0.13.3 + - io.confluent:kafka-avro-serializer:7.8.0 + - io.smallrye.reactive:mutiny-zero-flow-adapters:1.0.0 + - io.smallrye.reactive:mutiny:2.6.1 + - jakarta.validation:jakarta.validation-api:3.0.2 + - jakarta.ws.rs:jakarta.ws.rs-api:3.1.0 + - junit:junit:4.13.2 + - org.apache.avro:avro:1.12.0 + - org.apache.kafka:kafka-clients:3.9.0 + - org.mariadb.jdbc:mariadb-java-client:3.5.0 + - org.postgresql:postgresql:42.7.3 + dependsOn: + - typr-dsl-kotlin + extends: template-kotlin + isTestProject: true + kotlin: + options: -Xskip-prerelease-check + sources: + - ./generated-and-checked-in/api + - ./generated-and-checked-in/avro_events + - ./generated-and-checked-in/combined + - ./generated-and-checked-in/mariadb + - ./generated-and-checked-in/postgres + - ./src/kotlin testers/db2/java: dependencies: - com.fasterxml.jackson.core:jackson-annotations:2.17.2 @@ -101,9 +115,9 @@ projects: - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - com.ibm.db2:jcc:11.5.9.0 - - 
com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 - junit:junit:4.13.2 - dependsOn: foundations-jdbc-dsl + dependsOn: typr-dsl isTestProject: true java: options: -proc:none @@ -112,6 +126,9 @@ projects: sources: - ./generated-and-checked-in - ./src/java + testers/db2/kotlin: + dependencies: com.ibm.db2:jcc:11.5.9.0 + extends: template-kotlin-db-tester testers/db2/scala: dependencies: - com.fasterxml.jackson.core:jackson-annotations:2.17.2 @@ -119,9 +136,9 @@ projects: - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - com.ibm.db2:jcc:11.5.9.0 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 - junit:junit:4.13.2 - dependsOn: foundations-jdbc-dsl-scala + dependsOn: typr-dsl-scala extends: template-scala-3 isTestProject: true sources: @@ -133,26 +150,29 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 - junit:junit:4.13.2 - org.duckdb:duckdb_jdbc:1.1.3 - dependsOn: foundations-jdbc-dsl + dependsOn: typr-dsl isTestProject: true java: options: -proc:none platform: name: jvm sources: ./generated-and-checked-in + testers/duckdb/kotlin: + dependencies: org.duckdb:duckdb_jdbc:1.1.3 + extends: template-kotlin-db-tester testers/duckdb/scala: dependencies: - com.fasterxml.jackson.core:jackson-annotations:2.17.2 - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 - junit:junit:4.13.2 - org.duckdb:duckdb_jdbc:1.1.3 - dependsOn: foundations-jdbc-dsl-scala + dependsOn: typr-dsl-scala extends: template-scala-3 isTestProject: 
true sources: @@ -164,16 +184,19 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 - junit:junit:4.13.2 - org.mariadb.jdbc:mariadb-java-client:3.5.1 - dependsOn: foundations-jdbc-dsl + dependsOn: typr-dsl isTestProject: true java: options: -proc:none platform: name: jvm sources: ./generated-and-checked-in + testers/mariadb/kotlin: + dependencies: org.mariadb.jdbc:mariadb-java-client:3.5.1 + extends: template-kotlin-db-tester testers/mariadb/scala: dependencies: - com.fasterxml.jackson.core:jackson-annotations:2.17.2 @@ -182,7 +205,7 @@ projects: - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - org.mariadb.jdbc:mariadb-java-client:3.5.1 - org.scalatest::scalatest:3.2.18 - dependsOn: foundations-jdbc-dsl-scala + dependsOn: typr-dsl-scala extends: template-scala-3 isTestProject: true sources: ./generated-and-checked-in @@ -191,7 +214,8 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.swagger.core.v3:swagger-annotations:2.2.25 - jakarta.validation:jakarta.validation-api:3.0.2 - jakarta.ws.rs:jakarta.ws.rs-api:3.1.0 @@ -200,7 +224,6 @@ projects: - org.glassfish.jersey.inject:jersey-hk2:3.1.9 - org.glassfish.jersey.media:jersey-media-json-jackson:3.1.9 - org.glassfish.jersey.media:jersey-media-multipart:3.1.9 - dependsOn: foundations-jdbc folder: ./testers/openapi/java/jaxrs isTestProject: true java: @@ -213,7 +236,8 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - 
com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.quarkus:quarkus-rest-jackson:3.17.2 - io.smallrye.reactive:mutiny:2.6.2 - io.swagger.core.v3:swagger-annotations:2.2.25 @@ -224,7 +248,6 @@ projects: - org.glassfish.jersey.inject:jersey-hk2:3.1.9 - org.glassfish.jersey.media:jersey-media-json-jackson:3.1.9 - org.glassfish.jersey.media:jersey-media-multipart:3.1.9 - dependsOn: foundations-jdbc folder: ./testers/openapi/java/quarkus isTestProject: true java: @@ -237,7 +260,8 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.swagger.core.v3:swagger-annotations:2.2.25 - jakarta.validation:jakarta.validation-api:3.0.2 - jakarta.ws.rs:jakarta.ws.rs-api:3.1.0 @@ -245,7 +269,6 @@ projects: - org.glassfish.jersey.media:jersey-media-multipart:3.1.9 - org.springframework.boot:spring-boot-starter-test:3.4.0 - org.springframework.boot:spring-boot-starter-web:3.4.0 - dependsOn: foundations-jdbc folder: ./testers/openapi/java/spring isTestProject: true java: @@ -253,6 +276,17 @@ projects: platform: name: jvm sources: ./testapi + testers/openapi/kotlin/jaxrs: + extends: template-kotlin-openapi-tester + folder: ./testers/openapi/kotlin/jaxrs + testers/openapi/kotlin/quarkus: + dependencies: io.smallrye.reactive:mutiny:2.6.2 + extends: template-kotlin-openapi-tester + folder: ./testers/openapi/kotlin/quarkus + testers/openapi/kotlin/spring: + dependencies: org.springframework:spring-web:6.2.0 + extends: template-kotlin-openapi-tester + folder: ./testers/openapi/kotlin/spring testers/openapi/scala/http4s: dependencies: - 
io.circe::circe-generic:0.14.10 @@ -272,7 +306,8 @@ projects: - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.18.1 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.18.1 - com.fasterxml.jackson.module::jackson-module-scala:2.18.1 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.circe::circe-generic:0.14.10 - io.swagger.core.v3:swagger-annotations:2.2.25 - jakarta.validation:jakarta.validation-api:3.0.2 @@ -280,52 +315,29 @@ projects: - junit:junit:4.13.2 - org.glassfish.jersey.media:jersey-media-multipart:3.1.9 - org.springframework.boot:spring-boot-starter-web:3.4.0 - dependsOn: foundations-jdbc extends: template-scala-3 folder: ./testers/openapi/scala/spring isTestProject: true sources: ./testapi - testers/combined/java: - dependencies: - - com.fasterxml.jackson.core:jackson-annotations:2.17.2 - - com.fasterxml.jackson.core:jackson-databind:2.17.2 - - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 - - io.smallrye.reactive:mutiny:2.6.1 - - io.smallrye.reactive:mutiny-zero-flow-adapters:1.0.0 - - jakarta.validation:jakarta.validation-api:3.0.2 - - jakarta.ws.rs:jakarta.ws.rs-api:3.1.0 - - junit:junit:4.13.2 - - org.mariadb.jdbc:mariadb-java-client:3.5.0 - - org.postgresql:postgresql:42.7.3 - dependsOn: foundations-jdbc-dsl - isTestProject: true - java: - options: -proc:none - platform: - name: jvm - sources: - - ./generated-and-checked-in/shared - - ./generated-and-checked-in/postgres - - ./generated-and-checked-in/mariadb - - ./generated-and-checked-in/api testers/oracle/java: dependencies: - com.fasterxml.jackson.core:jackson-annotations:2.17.2 - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - 
com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 - com.oracle.database.jdbc:ojdbc11:23.6.0.24.10 - junit:junit:4.13.2 - dependsOn: foundations-jdbc-dsl + dependsOn: typr-dsl isTestProject: true java: options: -proc:none platform: name: jvm sources: ./generated-and-checked-in + testers/oracle/kotlin: + dependencies: com.oracle.database.jdbc:ojdbc11:23.6.0.24.10 + extends: template-kotlin-db-tester testers/oracle/scala: dependencies: - com.fasterxml.jackson.core:jackson-annotations:2.17.2 @@ -334,7 +346,7 @@ projects: - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - com.oracle.database.jdbc:ojdbc11:23.6.0.24.10 - org.scalatest::scalatest:3.2.18 - dependsOn: foundations-jdbc-dsl-scala + dependsOn: typr-dsl-scala extends: template-scala-3 isTestProject: true sources: ./generated-and-checked-in @@ -344,11 +356,11 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 - com.oracle.database.jdbc:ojdbc11:23.6.0.24.10 - junit:junit:4.13.2 - org.scalatest::scalatest:3.2.18 - dependsOn: foundations-jdbc-dsl-scala + dependsOn: typr-dsl-scala extends: template-scala-3 isTestProject: true sources: ./generated-and-checked-in @@ -358,16 +370,34 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 - junit:junit:4.13.2 - org.postgresql:postgresql:42.7.3 - dependsOn: foundations-jdbc-dsl + dependsOn: + - typr-dsl isTestProject: true java: options: -proc:none platform: name: jvm sources: ./generated-and-checked-in + testers/pg/kotlin: + dependencies: + - com.fasterxml.jackson.core:jackson-annotations:2.17.2 + - 
com.fasterxml.jackson.core:jackson-databind:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 + - com.github.sbt:junit-interface:0.13.3 + - junit:junit:4.13.2 + - org.postgresql:postgresql:42.7.3 + dependsOn: typr-dsl-kotlin + extends: template-kotlin + isTestProject: true + kotlin: + options: -Xskip-prerelease-check + sources: + - ./generated-and-checked-in + - ./src/kotlin testers/pg/scala/anorm: cross: jvm213: @@ -379,8 +409,8 @@ projects: - org.postgresql:postgresql:42.7.3 - org.scalatest::scalatest:3.2.18 dependsOn: - - typr-dsl-anorm - test-utils + - typr-dsl-anorm extends: template-cross isTestProject: true testers/pg/scala/doobie: @@ -394,8 +424,8 @@ projects: - org.postgresql:postgresql:42.7.3 - org.scalatest::scalatest:3.2.18 dependsOn: - - typr-dsl-doobie - test-utils + - typr-dsl-doobie extends: template-cross isTestProject: true sources: ./generated-and-checked-in @@ -405,12 +435,12 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 - junit:junit:4.13.2 - org.postgresql:postgresql:42.7.3 - org.scalatest::scalatest:3.2.18 dependsOn: - - foundations-jdbc-dsl + - typr-dsl - test-utils extends: template-scala-3 isTestProject: true @@ -421,12 +451,12 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 - junit:junit:4.13.2 - org.postgresql:postgresql:42.7.3 - org.scalatest::scalatest:3.2.18 dependsOn: - - foundations-jdbc-dsl-scala + - typr-dsl-scala - test-utils extends: template-scala-3 isTestProject: true @@ -442,8 +472,8 @@ projects: - 
org.postgresql:postgresql:42.7.3 - org.scalatest::scalatest:3.2.18 dependsOn: - - typr-dsl-zio-jdbc - test-utils + - typr-dsl-zio-jdbc extends: template-cross isTestProject: true testers/sqlserver/java: @@ -453,9 +483,9 @@ projects: - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - com.microsoft.sqlserver:mssql-jdbc:12.8.1.jre11 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 - junit:junit:4.13.2 - dependsOn: foundations-jdbc-dsl + dependsOn: typr-dsl isTestProject: true java: options: -proc:none @@ -464,6 +494,9 @@ projects: sources: - ./generated-and-checked-in - ./src/java + testers/sqlserver/kotlin: + dependencies: com.microsoft.sqlserver:mssql-jdbc:12.8.1.jre11 + extends: template-kotlin-db-tester testers/sqlserver/scala: dependencies: - com.fasterxml.jackson.core:jackson-annotations:2.17.2 @@ -471,9 +504,9 @@ projects: - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - com.microsoft.sqlserver:mssql-jdbc:12.8.1.jre11 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 - junit:junit:4.13.2 - dependsOn: foundations-jdbc-dsl-scala + dependsOn: typr-dsl-scala extends: template-scala-3 isTestProject: true sources: @@ -485,12 +518,12 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.confluent:kafka-avro-serializer:7.8.0 - junit:junit:4.13.2 - org.apache.avro:avro:1.12.0 - org.apache.kafka:kafka-clients:3.9.0 - dependsOn: foundations-jdbc folder: ./testers/avro/java isTestProject: true java: @@ -506,12 +539,12 @@ projects: - 
com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.confluent:kafka-avro-serializer:7.8.0 - junit:junit:4.13.2 - org.apache.avro:avro:1.12.0 - org.apache.kafka:kafka-clients:3.9.0 - dependsOn: foundations-jdbc extends: template-scala-3 isTestProject: true sources: @@ -523,12 +556,12 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.confluent:kafka-avro-serializer:7.8.0 - junit:junit:4.13.2 - org.apache.avro:avro:1.12.0 - org.apache.kafka:kafka-clients:3.9.0 - dependsOn: foundations-jdbc folder: ./testers/avro/java-async isTestProject: true java: @@ -544,11 +577,11 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - junit:junit:4.13.2 - org.apache.avro:avro:1.12.0 - org.apache.kafka:kafka-clients:3.9.0 - dependsOn: foundations-jdbc folder: ./testers/avro/java-vanilla isTestProject: true java: @@ -564,13 +597,14 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.fd4s::fs2-kafka:3.6.0 + - com.github.sbt:junit-interface:0.13.3 + - 
dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.confluent:kafka-avro-serializer:7.8.0 - junit:junit:4.13.2 - org.apache.avro:avro:1.12.0 - org.apache.kafka:kafka-clients:3.9.0 - org.typelevel::cats-effect:3.5.4 - dependsOn: foundations-jdbc extends: template-scala-3 isTestProject: true sources: @@ -582,9 +616,9 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - junit:junit:4.13.2 - dependsOn: foundations-jdbc folder: ./testers/avro/java-json isTestProject: true java: @@ -600,9 +634,9 @@ projects: - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - com.fasterxml.jackson.module::jackson-module-scala:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - junit:junit:4.13.2 - dependsOn: foundations-jdbc extends: template-scala-3 isTestProject: true sources: @@ -614,13 +648,13 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - junit:junit:4.13.2 - org.apache.kafka:kafka-clients:3.9.0 - org.mockito:mockito-core:5.14.2 - org.springframework.kafka:spring-kafka:3.3.1 - org.springframework:spring-context:6.2.1 - dependsOn: foundations-jdbc folder: ./testers/avro/java-spring isTestProject: true java: @@ -636,7 +670,8 @@ projects: - com.fasterxml.jackson.core:jackson-databind:2.17.2 - 
com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.quarkus:quarkus-arc:3.17.2 - io.quarkus:quarkus-junit5:3.17.2 - io.smallrye.reactive:mutiny:2.7.0 @@ -647,7 +682,6 @@ projects: - junit:junit:4.13.2 - org.apache.kafka:kafka-clients:3.9.0 - org.eclipse.microprofile.reactive.messaging:microprofile-reactive-messaging-api:3.0 - dependsOn: foundations-jdbc folder: ./testers/avro/java-quarkus isTestProject: true java: @@ -657,17 +691,72 @@ projects: sources: - ./generated-and-checked-in - ./src + testers/avro/kotlin: + dependencies: + - com.fasterxml.jackson.core:jackson-annotations:2.17.2 + - com.fasterxml.jackson.core:jackson-databind:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT + - io.confluent:kafka-avro-serializer:7.8.0 + - junit:junit:4.13.2 + - org.apache.avro:avro:1.12.0 + - org.apache.kafka:kafka-clients:3.9.0 + extends: template-kotlin + folder: ./testers/avro/kotlin + isTestProject: true + sources: + - ./generated-and-checked-in + - ./src/kotlin + testers/avro/kotlin-json: + dependencies: + - com.fasterxml.jackson.core:jackson-annotations:2.17.2 + - com.fasterxml.jackson.core:jackson-databind:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 + - com.fasterxml.jackson.module:jackson-module-kotlin:2.17.2 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT + - junit:junit:4.13.2 + extends: template-kotlin + folder: ./testers/avro/kotlin-json + isTestProject: true + sources: + - ./generated-and-checked-in + - ./src/kotlin 
+ testers/avro/kotlin-quarkus-mutiny: + dependencies: + - com.fasterxml.jackson.core:jackson-annotations:2.17.2 + - com.fasterxml.jackson.core:jackson-databind:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 + - com.fasterxml.jackson.module:jackson-module-kotlin:2.17.2 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT + - io.smallrye.reactive:mutiny:2.6.0 + - io.smallrye.reactive:smallrye-reactive-messaging-api:4.22.0 + - io.smallrye.reactive:smallrye-reactive-messaging-kafka:4.22.0 + - jakarta.enterprise:jakarta.enterprise.cdi-api:4.0.1 + - jakarta.inject:jakarta.inject-api:2.0.1 + - junit:junit:4.13.2 + extends: template-kotlin + folder: ./testers/avro/kotlin-quarkus-mutiny + isTestProject: true + sources: + - ./generated-and-checked-in + - ./src/kotlin testers/grpc/java: dependencies: - com.google.protobuf:protobuf-java:4.29.3 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.grpc:grpc-netty-shaded:1.69.0 - io.grpc:grpc-protobuf:1.69.0 - io.grpc:grpc-stub:1.69.0 - io.grpc:grpc-testing:1.69.0 - io.grpc:grpc-inprocess:1.69.0 - junit:junit:4.13.2 - dependsOn: foundations-jdbc folder: ./testers/grpc/java isTestProject: true java: @@ -680,14 +769,14 @@ projects: testers/grpc/scala: dependencies: - com.google.protobuf:protobuf-java:4.29.3 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.grpc:grpc-netty-shaded:1.69.0 - io.grpc:grpc-protobuf:1.69.0 - io.grpc:grpc-stub:1.69.0 - io.grpc:grpc-testing:1.69.0 - io.grpc:grpc-inprocess:1.69.0 - junit:junit:4.13.2 - dependsOn: foundations-jdbc extends: template-scala-3 isTestProject: true sources: @@ -696,7 +785,8 @@ projects: testers/grpc/java-spring: dependencies: - 
com.google.protobuf:protobuf-java:4.29.3 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.grpc:grpc-netty-shaded:1.69.0 - io.grpc:grpc-protobuf:1.69.0 - io.grpc:grpc-stub:1.69.0 @@ -705,7 +795,6 @@ projects: - junit:junit:4.13.2 - org.springframework.grpc:spring-grpc-core:0.3.0 - org.springframework:spring-context:6.2.1 - dependsOn: foundations-jdbc folder: ./testers/grpc/java-spring isTestProject: true java: @@ -718,7 +807,8 @@ projects: testers/grpc/java-quarkus: dependencies: - com.google.protobuf:protobuf-java:4.29.3 - - com.novocode:junit-interface:0.11 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT - io.grpc:grpc-netty-shaded:1.69.0 - io.grpc:grpc-protobuf:1.69.0 - io.grpc:grpc-stub:1.69.0 @@ -728,7 +818,6 @@ projects: - io.quarkus:quarkus-grpc:3.17.2 - jakarta.enterprise:jakarta.enterprise.cdi-api:4.1.0 - junit:junit:4.13.2 - dependsOn: foundations-jdbc folder: ./testers/grpc/java-quarkus isTestProject: true java: @@ -738,35 +827,111 @@ projects: sources: - ./generated-and-checked-in - ./src/java + testers/grpc/scala-cats: + dependencies: + - com.google.protobuf:protobuf-java:4.29.3 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT + - io.grpc:grpc-netty-shaded:1.69.0 + - io.grpc:grpc-protobuf:1.69.0 + - io.grpc:grpc-stub:1.69.0 + - io.grpc:grpc-testing:1.69.0 + - io.grpc:grpc-inprocess:1.69.0 + - junit:junit:4.13.2 + - org.typelevel::cats-effect:3.5.4 + extends: template-scala-3 + isTestProject: true + sources: + - ./generated-and-checked-in + - ./src/scala + testers/grpc/kotlin: + dependencies: + - com.google.protobuf:protobuf-java:4.29.3 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT + - io.grpc:grpc-inprocess:1.69.0 + - io.grpc:grpc-netty-shaded:1.69.0 + - io.grpc:grpc-protobuf:1.69.0 + - 
io.grpc:grpc-stub:1.69.0 + - io.grpc:grpc-testing:1.69.0 + - junit:junit:4.13.2 + extends: template-kotlin + folder: ./testers/grpc/kotlin + isTestProject: true + sources: + - ./generated-and-checked-in + - ./src/kotlin + testers/grpc/kotlin-quarkus: + dependencies: + - com.google.protobuf:protobuf-java:4.29.3 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT + - io.grpc:grpc-inprocess:1.69.0 + - io.grpc:grpc-netty-shaded:1.69.0 + - io.grpc:grpc-protobuf:1.69.0 + - io.grpc:grpc-stub:1.69.0 + - io.grpc:grpc-testing:1.69.0 + - io.quarkus:quarkus-grpc:3.17.2 + - io.smallrye.reactive:mutiny:2.7.0 + - junit:junit:4.13.2 + extends: template-kotlin + folder: ./testers/grpc/kotlin-quarkus + isTestProject: true + sources: + - ./generated-and-checked-in + - ./src/kotlin tests: dependencies: org.scalatest::scalatest:3.2.18 dependsOn: typr extends: template-scala-3 isTestProject: true typr: + dependencies: + - com.monovore::decline:2.4.1 + - com.monovore::decline-effect:2.4.1 + - io.circe::circe-core:0.14.10 + - io.circe::circe-generic:0.14.10 + - io.circe::circe-yaml-v12:1.15.0 + - org.typelevel::cats-effect:3.5.7 + dependsOn: + - typr-codegen + extends: template-scala-3 + platform: + name: jvm + mainClass: typr.cli.Main + jvmRuntimeOptions: + - -Dcats.effect.warnOnNonMainThreadDetected=false + - --enable-native-access=ALL-UNNAMED + typr-codegen: dependencies: - com.microsoft.sqlserver:mssql-jdbc:12.8.1.jre11 - com.oracle.database.jdbc:ojdbc11:23.6.0.24.10 - com.typesafe.play::play-json:2.10.6 - com.google.protobuf:protobuf-java:4.29.3 + - dev.typr.foundations:foundations-jdbc-hikari:1.0.0-RC6-SNAPSHOT - io.grpc:grpc-protobuf:1.69.0 - io.grpc:grpc-stub:1.69.0 - org.apache.avro:avro:1.12.0 - for3Use213: true module: io.get-coursier::coursier:2.1.24 + - io.circe::circe-core:0.14.10 + - io.circe::circe-generic:0.14.10 - io.swagger.parser.v3:swagger-parser:2.1.24 + - com.ibm.db2:jcc:11.5.9.0 - org.duckdb:duckdb_jdbc:1.1.3 - 
org.mariadb.jdbc:mariadb-java-client:3.5.1 - org.playframework.anorm::anorm:2.7.0 - org.postgresql:postgresql:42.7.3 - org.slf4j:slf4j-nop:2.0.13 - dependsOn: - - foundations-jdbc-dsl-scala - - foundations-jdbc-hikari + dependsOn: typr-dsl-scala extends: template-scala-3 platform: mainClass: com.foo.App - sources: ./generated-and-checked-in + scala: + options: -release 24 -source 3.4 -Xmax-inlines:64 + sources: + - ./generated-and-checked-in + - ./generated-and-checked-in-jsonschema typr-dsl-anorm: dependsOn: typr-runtime-anorm extends: template-cross @@ -780,22 +945,25 @@ projects: extends: template-cross sources: ../typr-dsl-shared typr-runtime-anorm: - dependencies: org.playframework.anorm::anorm:2.7.0 - dependsOn: foundations-jdbc + dependencies: + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT + - org.playframework.anorm::anorm:2.7.0 extends: template-cross typr-runtime-doobie: - dependencies: org.tpolecat::doobie-postgres:1.0.0-RC9 - dependsOn: foundations-jdbc + dependencies: + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT + - org.tpolecat::doobie-postgres:1.0.0-RC9 extends: template-cross typr-runtime-zio-jdbc: - dependencies: dev.zio::zio-jdbc:0.1.2 - dependsOn: foundations-jdbc + dependencies: + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT + - dev.zio::zio-jdbc:0.1.2 extends: template-cross typr-scripts: dependencies: - build.bleep::bleep-plugin-ci-release:${BLEEP_VERSION} - com.ibm.db2:jcc:11.5.9.0 - dependsOn: typr + dependsOn: typr-codegen extends: template-scala-3 typr-scripts-doc: dependencies: @@ -803,7 +971,7 @@ projects: - build.bleep::bleep-plugin-mdoc:${BLEEP_VERSION} dependsOn: - testers/pg/scala/anorm - - typr + - typr-codegen extends: template-scala-3 typr-scripts-sourcegen: dependencies: build.bleep::bleep-core:${BLEEP_VERSION} @@ -812,53 +980,18 @@ scripts: compile-benchmarks: main: scripts.CompileBenchmark project: typr-scripts - generate-postgres: - main: scripts.GeneratedPostgres - project: 
typr-scripts - sourceGlobs: ../adventureworks_sql - generate-all: - main: scripts.GenerateAll - project: typr-scripts - generate-avro-test: - main: scripts.GenerateAvroTest - project: typr-scripts - generate-grpc-test: - main: scripts.GenerateGrpcTest - project: typr-scripts - generate-db2: - main: scripts.GeneratedDb2 - project: typr-scripts - sourceGlobs: ../db2_sql generate-docs: main: scripts.GenDocumentation project: typr-scripts-doc - generate-duckdb: - main: scripts.GeneratedDuckDb - project: typr-scripts - generate-mariadb: - main: scripts.GeneratedMariaDb - project: typr-scripts - sourceGlobs: ../mariadb_sql - generate-combined-test: - main: scripts.GenerateCombinedTest - project: typr-scripts - generate-openapi-test: - main: scripts.GenerateOpenApiTest - project: typr-scripts - generate-oracle: - main: scripts.GeneratedOracle - project: typr-scripts - sourceGlobs: ../oracle_sql generate-showcase: main: scripts.GeneratedShowcase project: typr-scripts + generate-config-types: + main: scripts.GenerateConfigTypes + project: typr-scripts generate-sources: main: scripts.GeneratedSources project: typr-scripts - generate-sqlserver: - main: scripts.GeneratedSqlServer - project: typr-scripts - sourceGlobs: ../sqlserver_sql my-publish-local: main: scripts.PublishLocal project: typr-scripts @@ -889,4 +1022,51 @@ templates: extends: template-common scala: options: -release 24 -source 3.4 - version: 3.7.3 + version: 3.8.3 + template-kotlin: + kotlin: + version: 2.3.0 + jvmTarget: "21" + platform: + name: jvm + template-kotlin-db-tester: + dependencies: + - com.fasterxml.jackson.core:jackson-annotations:2.17.2 + - com.fasterxml.jackson.core:jackson-databind:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 + - com.fasterxml.jackson.module:jackson-module-kotlin:2.17.2 + - com.github.sbt:junit-interface:0.13.3 + - junit:junit:4.13.2 + dependsOn: typr-dsl-kotlin + extends: 
template-kotlin + isTestProject: true + kotlin: + version: 2.3.0 + jvmTarget: "21" + options: -Xskip-prerelease-check + sources: ./generated-and-checked-in + template-kotlin-openapi-tester: + dependencies: + - com.fasterxml.jackson.core:jackson-databind:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jdk8:2.17.2 + - com.fasterxml.jackson.datatype:jackson-datatype-jsr310:2.17.2 + - com.fasterxml.jackson.module:jackson-module-kotlin:2.17.2 + - com.github.sbt:junit-interface:0.13.3 + - dev.typr.foundations:foundations-jdbc:1.0.0-RC6-SNAPSHOT + - io.swagger.core.v3:swagger-annotations:2.2.25 + - jakarta.validation:jakarta.validation-api:3.0.2 + - jakarta.ws.rs:jakarta.ws.rs-api:3.1.0 + - junit:junit:4.13.2 + - org.glassfish.jersey.containers:jersey-container-grizzly2-http:3.1.9 + - org.glassfish.jersey.inject:jersey-hk2:3.1.9 + - org.glassfish.jersey.media:jersey-media-json-jackson:3.1.9 + - org.glassfish.jersey.media:jersey-media-multipart:3.1.9 + extends: template-kotlin + isTestProject: true + kotlin: + version: 2.3.0 + jvmTarget: "21" + sources: + - ./testapi/api + - ./testapi/model diff --git a/build.gradle.kts b/build.gradle.kts deleted file mode 100644 index 50d73d2619..0000000000 --- a/build.gradle.kts +++ /dev/null @@ -1,21 +0,0 @@ -plugins { - java - kotlin("jvm") version "2.3.0" apply false -} - -allprojects { - group = "dev.typr" - version = "0.1.0-SNAPSHOT" - - repositories { - mavenCentral() - } -} - -subprojects { - tasks.withType { - sourceCompatibility = "21" - targetCompatibility = "21" - } - -} diff --git a/foundations-jdbc-dsl-kotlin/build.gradle.kts b/foundations-jdbc-dsl-kotlin/build.gradle.kts deleted file mode 100644 index f67a4d68c1..0000000000 --- a/foundations-jdbc-dsl-kotlin/build.gradle.kts +++ /dev/null @@ -1,27 +0,0 @@ -plugins { - kotlin("jvm") -} - -kotlin { - jvmToolchain(21) - compilerOptions { - freeCompilerArgs.add("-Xnested-type-aliases") - } -} - -sourceSets { - main { - kotlin { - srcDirs( - "src/kotlin", - // 
Generated by bleep: bleep sourcegen - "../.bleep/generated-sources/foundations-jdbc-dsl-kotlin/scripts.GeneratedRowParsers", - "../.bleep/generated-sources/foundations-jdbc-dsl-kotlin/scripts.GeneratedTuples" - ) - } - } -} - -dependencies { - api(project(":foundations-jdbc-dsl")) -} diff --git a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/Fragment.kt b/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/Fragment.kt deleted file mode 100644 index 9a1a539be5..0000000000 --- a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/Fragment.kt +++ /dev/null @@ -1,169 +0,0 @@ -package dev.typr.foundations.kotlin - -import dev.typr.foundations.DbType -import dev.typr.foundations.Fragment as JavaFragment -import dev.typr.foundations.Operation as JavaOperation -import java.sql.PreparedStatement -import java.util.concurrent.atomic.AtomicInteger - -/** Kotlin wrapper for dev.typr.foundations.Fragment with Kotlin-native APIs. - * - * This class wraps the Java Fragment interface and provides Kotlin-friendly methods. 
- */ -class Fragment(val underlying: JavaFragment) { - - fun render(): String = underlying.render() - - fun render(sb: StringBuilder) = underlying.render(sb) - - fun set(stmt: PreparedStatement) = underlying.set(stmt) - - fun set(stmt: PreparedStatement, idx: AtomicInteger) = underlying.set(stmt, idx) - - fun append(other: Fragment): Fragment = Fragment(underlying.append(other.underlying)) - - operator fun plus(other: Fragment): Fragment = append(other) - - fun query(parser: ResultSetParser): Operation.Query = - Operation.Query(this, parser) - - fun update(): Operation.Update = - Operation.Update(this) - - fun updateReturning(parser: ResultSetParser): Operation.UpdateReturning = - Operation.UpdateReturning(this, parser) - - fun updateReturningGeneratedKeys(columnNames: Array, parser: ResultSetParser): Operation.UpdateReturningGeneratedKeys = - Operation.UpdateReturningGeneratedKeys(this, columnNames, parser) - - fun updateMany(parser: RowParser, rows: Iterator): Operation.UpdateMany = - Operation.UpdateMany(this, parser, rows) - - fun updateManyReturning(parser: RowParser, rows: Iterator): Operation.UpdateManyReturning = - Operation.UpdateManyReturning(this, parser, rows) - - fun updateReturningEach(parser: RowParser, rows: Iterator): Operation.UpdateReturningEach = - Operation.UpdateReturningEach(this, parser, rows) - - companion object { - @JvmField - val EMPTY: Fragment = Fragment(JavaFragment.EMPTY) - - @JvmStatic - fun lit(value: String): Fragment = Fragment(JavaFragment.lit(value)) - - @JvmStatic - fun empty(): Fragment = EMPTY - - @JvmStatic - fun quotedDouble(value: String): Fragment = Fragment(JavaFragment.quotedDouble(value)) - - @JvmStatic - fun quotedSingle(value: String): Fragment = Fragment(JavaFragment.quotedSingle(value)) - - @JvmStatic - fun value(value: A, dbType: DbType): Fragment = - Fragment(JavaFragment.value(value, dbType)) - - /** Encode a value into a SQL fragment using the provided database type. 
*/ - @JvmStatic - fun encode(dbType: DbType, value: A): Fragment = - Fragment(JavaFragment.encode(dbType, value)) - - @JvmStatic - fun and(vararg fragments: Fragment): Fragment = - Fragment(JavaFragment.and(*fragments.map { it.underlying }.toTypedArray())) - - @JvmStatic - fun and(fragments: List): Fragment = - Fragment(JavaFragment.and(fragments.map { it.underlying })) - - @JvmStatic - fun or(vararg fragments: Fragment): Fragment = - Fragment(JavaFragment.or(*fragments.map { it.underlying }.toTypedArray())) - - @JvmStatic - fun or(fragments: List): Fragment = - Fragment(JavaFragment.or(fragments.map { it.underlying })) - - @JvmStatic - fun whereAnd(vararg fragments: Fragment): Fragment = - Fragment(JavaFragment.whereAnd(*fragments.map { it.underlying }.toTypedArray())) - - @JvmStatic - fun whereAnd(fragments: List): Fragment = - Fragment(JavaFragment.whereAnd(fragments.map { it.underlying })) - - @JvmStatic - fun whereOr(vararg fragments: Fragment): Fragment = - Fragment(JavaFragment.whereOr(*fragments.map { it.underlying }.toTypedArray())) - - @JvmStatic - fun whereOr(fragments: List): Fragment = - Fragment(JavaFragment.whereOr(fragments.map { it.underlying })) - - @JvmStatic - fun set(vararg fragments: Fragment): Fragment = - Fragment(JavaFragment.set(*fragments.map { it.underlying }.toTypedArray())) - - @JvmStatic - fun set(fragments: List): Fragment = - Fragment(JavaFragment.set(fragments.map { it.underlying })) - - @JvmStatic - fun parentheses(fragment: Fragment): Fragment = - Fragment(JavaFragment.parentheses(fragment.underlying)) - - @JvmStatic - fun comma(vararg fragments: Fragment): Fragment = - Fragment(JavaFragment.comma(*fragments.map { it.underlying }.toTypedArray())) - - @JvmStatic - fun comma(fragments: List): Fragment = - Fragment(JavaFragment.comma(fragments.map { it.underlying })) - - @JvmStatic - fun orderBy(vararg fragments: Fragment): Fragment = - Fragment(JavaFragment.orderBy(*fragments.map { it.underlying }.toTypedArray())) - - @JvmStatic - 
fun orderBy(fragments: List): Fragment = - Fragment(JavaFragment.orderBy(fragments.map { it.underlying })) - - @JvmStatic - fun join(fragments: List, separator: Fragment): Fragment = - Fragment(JavaFragment.join(fragments.map { it.underlying }, separator.underlying)) - - @JvmStatic - fun concat(vararg fragments: Fragment): Fragment = - Fragment(JavaFragment.concat(*fragments.map { it.underlying }.toTypedArray())) - - @JvmStatic - fun interpolate(initial: String): Builder = - Builder(JavaFragment.interpolate(initial)) - - @JvmStatic - fun interpolate(vararg fragments: Fragment): Fragment = - Fragment(JavaFragment.interpolate(*fragments.map { it.underlying }.toTypedArray())) - } - - /** Builder for creating Fragments with a fluent API */ - class Builder(private val underlying: JavaFragment.Builder) { - fun sql(s: String): Builder { - underlying.sql(s) - return this - } - - fun param(dbType: DbType, value: T): Builder { - underlying.param(dbType, value) - return this - } - - fun param(fragment: Fragment): Builder { - underlying.param(fragment.underlying) - return this - } - - fun done(): Fragment = Fragment(underlying.done()) - } -} diff --git a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/KotlinDbTypes.kt b/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/KotlinDbTypes.kt deleted file mode 100644 index 305609e489..0000000000 --- a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/KotlinDbTypes.kt +++ /dev/null @@ -1,247 +0,0 @@ -package dev.typr.foundations.kotlin - -import dev.typr.foundations.Db2Type -import dev.typr.foundations.DuckDbType -import dev.typr.foundations.MariaType -import dev.typr.foundations.OracleType -import dev.typr.foundations.PgType -import dev.typr.foundations.SqlFunction -import dev.typr.foundations.SqlServerType - -/** - * Kotlin-friendly DbType instances that use Kotlin types instead of Java boxed types. 
- */ -object KotlinDbTypes { - object PgTypes { - // Primitives - convert Java boxed types to Kotlin native types - val bool: PgType = dev.typr.foundations.PgTypes.bool.bimap( - SqlFunction { it }, - { it } - ) - val int2: PgType = dev.typr.foundations.PgTypes.int2.bimap( - SqlFunction { it }, - { it } - ) - val smallint: PgType = dev.typr.foundations.PgTypes.smallint.bimap( - SqlFunction { it }, - { it } - ) - val int4: PgType = dev.typr.foundations.PgTypes.int4.bimap( - SqlFunction { it }, - { it } - ) - val int8: PgType = dev.typr.foundations.PgTypes.int8.bimap( - SqlFunction { it }, - { it } - ) - val float4: PgType = dev.typr.foundations.PgTypes.float4.bimap( - SqlFunction { it }, - { it } - ) - val float8: PgType = dev.typr.foundations.PgTypes.float8.bimap( - SqlFunction { it }, - { it } - ) - - val oid: PgType = dev.typr.foundations.PgTypes.oid - - // Collections - convert Java collections to Kotlin collections - val hstore: PgType> = dev.typr.foundations.PgTypes.hstore.bimap( - SqlFunction { javaMap -> javaMap.toMap() }, - { kotlinMap -> kotlinMap.toMap(java.util.HashMap()) } - ) - } - - object MariaTypes { - // Primitives - convert Java boxed types to Kotlin native types - val tinyint: MariaType = dev.typr.foundations.MariaTypes.tinyint.bimap( - SqlFunction { it }, - { it } - ) - val smallint: MariaType = dev.typr.foundations.MariaTypes.smallint.bimap( - SqlFunction { it }, - { it } - ) - val mediumint: MariaType = dev.typr.foundations.MariaTypes.mediumint.bimap( - SqlFunction { it }, - { it } - ) - val int_: MariaType = dev.typr.foundations.MariaTypes.int_.bimap( - SqlFunction { it }, - { it } - ) - val bigint: MariaType = dev.typr.foundations.MariaTypes.bigint.bimap( - SqlFunction { it }, - { it } - ) - - // Floating point - val float_: MariaType = dev.typr.foundations.MariaTypes.float_.bimap( - SqlFunction { it }, - { it } - ) - val double_: MariaType = dev.typr.foundations.MariaTypes.double_.bimap( - SqlFunction { it }, - { it } - ) - - // 
Decimal/Numeric - val numeric: MariaType = dev.typr.foundations.MariaTypes.numeric - - // Boolean - val bool: MariaType = dev.typr.foundations.MariaTypes.bool.bimap( - SqlFunction { it }, - { it } - ) - val bit1: MariaType = dev.typr.foundations.MariaTypes.bit1.bimap( - SqlFunction { it }, - { it } - ) - } - - object DuckDbTypes { - // Signed integers - val tinyint: DuckDbType = dev.typr.foundations.DuckDbTypes.tinyint.bimap( - SqlFunction { it }, - { it } - ) - val smallint: DuckDbType = dev.typr.foundations.DuckDbTypes.smallint.bimap( - SqlFunction { it }, - { it } - ) - val integer: DuckDbType = dev.typr.foundations.DuckDbTypes.integer.bimap( - SqlFunction { it }, - { it } - ) - val bigint: DuckDbType = dev.typr.foundations.DuckDbTypes.bigint.bimap( - SqlFunction { it }, - { it } - ) - - // Floating point - val float_: DuckDbType = dev.typr.foundations.DuckDbTypes.float_.bimap( - SqlFunction { it }, - { it } - ) - val double_: DuckDbType = dev.typr.foundations.DuckDbTypes.double_.bimap( - SqlFunction { it }, - { it } - ) - - // Boolean - val boolean_: DuckDbType = dev.typr.foundations.DuckDbTypes.boolean_.bimap( - SqlFunction { it }, - { it } - ) - val bool: DuckDbType = dev.typr.foundations.DuckDbTypes.bool.bimap( - SqlFunction { it }, - { it } - ) - } - - object OracleTypes { - // Numeric types - NUMBER is Oracle's universal numeric type - val number: OracleType = dev.typr.foundations.OracleTypes.number - - val numberInt: OracleType = dev.typr.foundations.OracleTypes.numberInt.bimap( - SqlFunction { it }, - { it } - ) - - val numberLong: OracleType = dev.typr.foundations.OracleTypes.numberLong.bimap( - SqlFunction { it }, - { it } - ) - - // Floating point - val binaryFloat: OracleType = dev.typr.foundations.OracleTypes.binaryFloat.bimap( - SqlFunction { it }, - { it } - ) - - val binaryDouble: OracleType = dev.typr.foundations.OracleTypes.binaryDouble.bimap( - SqlFunction { it }, - { it } - ) - - val float_: OracleType = 
dev.typr.foundations.OracleTypes.float_.bimap( - SqlFunction { it }, - { it } - ) - } - - object SqlServerTypes { - // Primitives - convert Java boxed types to Kotlin native types - val smallint: SqlServerType = dev.typr.foundations.SqlServerTypes.smallint.bimap( - SqlFunction { it }, - { it } - ) - val int_: SqlServerType = dev.typr.foundations.SqlServerTypes.int_.bimap( - SqlFunction { it }, - { it } - ) - val bigint: SqlServerType = dev.typr.foundations.SqlServerTypes.bigint.bimap( - SqlFunction { it }, - { it } - ) - - // Floating point - val real: SqlServerType = dev.typr.foundations.SqlServerTypes.real.bimap( - SqlFunction { it }, - { it } - ) - val float_: SqlServerType = dev.typr.foundations.SqlServerTypes.float_.bimap( - SqlFunction { it }, - { it } - ) - - // Boolean - val bit: SqlServerType = dev.typr.foundations.SqlServerTypes.bit.bimap( - SqlFunction { it }, - { it } - ) - - // Decimal types - no conversion needed, already BigDecimal - val decimal = dev.typr.foundations.SqlServerTypes.decimal - val numeric = dev.typr.foundations.SqlServerTypes.numeric - val money = dev.typr.foundations.SqlServerTypes.money - val smallmoney = dev.typr.foundations.SqlServerTypes.smallmoney - } - - object Db2Types { - // Primitives - convert Java boxed types to Kotlin native types - val smallint: Db2Type = dev.typr.foundations.Db2Types.smallint.bimap( - SqlFunction { it }, - { it } - ) - val integer: Db2Type = dev.typr.foundations.Db2Types.integer.bimap( - SqlFunction { it }, - { it } - ) - val bigint: Db2Type = dev.typr.foundations.Db2Types.bigint.bimap( - SqlFunction { it }, - { it } - ) - - // Floating point - val real: Db2Type = dev.typr.foundations.Db2Types.real.bimap( - SqlFunction { it }, - { it } - ) - val double_: Db2Type = dev.typr.foundations.Db2Types.double_.bimap( - SqlFunction { it }, - { it } - ) - - // Boolean - val boolean_: Db2Type = dev.typr.foundations.Db2Types.boolean_.bimap( - SqlFunction { it }, - { it } - ) - - // Decimal types - no conversion 
needed, already BigDecimal - val decimal = dev.typr.foundations.Db2Types.decimal - val numeric = dev.typr.foundations.Db2Types.numeric - val decfloat = dev.typr.foundations.Db2Types.decfloat - } -} diff --git a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/Operation.kt b/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/Operation.kt deleted file mode 100644 index 99ed5a8bd1..0000000000 --- a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/Operation.kt +++ /dev/null @@ -1,108 +0,0 @@ -package dev.typr.foundations.kotlin - -import dev.typr.foundations.Operation as JavaOperation -import java.sql.Connection -import java.sql.SQLException - -/** Kotlin wrapper for dev.typr.foundations.Operation with Kotlin-native APIs. - * - * This sealed interface wraps the Java Operation and provides Kotlin-friendly methods. - */ -sealed interface Operation { - val underlying: JavaOperation<*> - - @Throws(SQLException::class) - fun run(conn: Connection): Out - - fun runUnchecked(conn: Connection): Out { - return try { - run(conn) - } catch (e: SQLException) { - throw RuntimeException(e) - } - } - - /** Query operation that returns a parsed result */ - class Query(override val underlying: JavaOperation.Query) : Operation { - @Throws(SQLException::class) - override fun run(conn: Connection): Out = underlying.run(conn) - - companion object { - @JvmStatic - operator fun invoke(query: Fragment, parser: ResultSetParser): Query = - Query(JavaOperation.Query(query.underlying, parser.underlying)) - } - } - - /** Update operation that returns the number of affected rows */ - class Update(override val underlying: JavaOperation.Update) : Operation { - @Throws(SQLException::class) - override fun run(conn: Connection): Int = underlying.run(conn) - - companion object { - @JvmStatic - operator fun invoke(query: Fragment): Update = - Update(JavaOperation.Update(query.underlying)) - } - } - - /** Update operation with RETURNING clause */ - class 
UpdateReturning(override val underlying: JavaOperation.UpdateReturning) : Operation { - @Throws(SQLException::class) - override fun run(conn: Connection): Out = underlying.run(conn) - - companion object { - @JvmStatic - operator fun invoke(query: Fragment, parser: ResultSetParser): UpdateReturning = - UpdateReturning(JavaOperation.UpdateReturning(query.underlying, parser.underlying)) - } - } - - /** Update operation that returns generated keys (for Oracle, which doesn't support RETURNING in the same way) */ - class UpdateReturningGeneratedKeys(override val underlying: JavaOperation.UpdateReturningGeneratedKeys) : Operation { - @Throws(SQLException::class) - override fun run(conn: Connection): Out = underlying.run(conn) - - companion object { - @JvmStatic - operator fun invoke(query: Fragment, columnNames: Array, parser: ResultSetParser): UpdateReturningGeneratedKeys = - UpdateReturningGeneratedKeys(JavaOperation.UpdateReturningGeneratedKeys(query.underlying, columnNames, parser.underlying)) - } - } - - /** Batch update operation that returns an array of update counts */ - class UpdateMany(override val underlying: JavaOperation.UpdateMany) : Operation { - @Throws(SQLException::class) - override fun run(conn: Connection): IntArray = underlying.run(conn) - - companion object { - @JvmStatic - operator fun invoke(query: Fragment, parser: RowParser, rows: Iterator): UpdateMany = - UpdateMany(JavaOperation.UpdateMany(query.underlying, parser.underlying, rows)) - } - } - - /** Batch update operation with RETURNING clause that returns a list of rows */ - class UpdateManyReturning(override val underlying: JavaOperation.UpdateManyReturning) : Operation> { - @Throws(SQLException::class) - override fun run(conn: Connection): List = underlying.run(conn) - - companion object { - @JvmStatic - operator fun invoke(query: Fragment, parser: RowParser, rows: Iterator): UpdateManyReturning = - UpdateManyReturning(JavaOperation.UpdateManyReturning(query.underlying, parser.underlying, 
rows)) - } - } - - /** Update each row individually with RETURNING clause (for MariaDB) */ - class UpdateReturningEach(override val underlying: JavaOperation.UpdateReturningEach) : Operation> { - @Throws(SQLException::class) - override fun run(conn: Connection): List = underlying.run(conn) - - companion object { - @JvmStatic - operator fun invoke(query: Fragment, parser: RowParser, rows: Iterator): UpdateReturningEach = - UpdateReturningEach(JavaOperation.UpdateReturningEach(query.underlying, parser.underlying, rows)) - } - } -} diff --git a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/OptionalExtensions.kt b/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/OptionalExtensions.kt deleted file mode 100644 index 5144d62918..0000000000 --- a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/OptionalExtensions.kt +++ /dev/null @@ -1,61 +0,0 @@ -package dev.typr.foundations.kotlin - -import dev.typr.foundations.dsl.Bijection -import java.util.Optional - -/** - * Generic extension methods for converting between Java Optional and Kotlin nullable types. - * These work at the VALUE level, not the type parameter level. - */ - -// ================================ -// Optional → T? -// ================================ - -/** - * Convert Optional to T? (nullable). - * Usage: val user: User? = optionalUser.orNull() - */ -fun Optional.orNull(): T? = this.orElse(null) - -// ================================ -// T? → Optional -// ================================ - -/** - * Convert T? (nullable) to Optional. - * - * Note: Type hint needed because Kotlin infers Optional (non-null inside Optional) - * but Java expects Optional (platform type). This is safe - just helping type inference. 
- *
- * Usage: val optional: Optional<User> = nullableUser.toOptional()
- */
-fun <T : Any> T?.toOptional(): Optional<T> {
-  @Suppress("UNCHECKED_CAST")
-  return Optional.ofNullable(this) as Optional<T>
-}
-
-// ================================
-// Bijection: Optional<T> ↔ T?
-// ================================
-
-/**
- * Bijection between Java Optional<T> and Kotlin nullable T?.
- * Used for type-safe phantom type conversion in PgTypename/MariaTypename.
- *
- * Usage:
- *   val typename: PgTypename<String?> = pgType.opt().typename().to(optionalToNullable())
- */
-fun <T : Any> optionalToNullable(): Bijection<Optional<T>, T?> {
-  @Suppress("UNCHECKED_CAST")
-  return Bijection.of(
-    { opt: Optional<T> -> opt.orElse(null) },
-    { nullable: T? -> Optional.ofNullable(nullable) as Optional<T> }
-  )
-}
-
-/**
- * Bijection between Kotlin nullable T? and Java Optional<T>.
- * Inverse of optionalToNullable().
- */
-fun <T : Any> nullableToOptional(): Bijection<T?, Optional<T>> = optionalToNullable<T>().inverse()
diff --git a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/ResultSetParser.kt b/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/ResultSetParser.kt
deleted file mode 100644
index 57b6ab1f52..0000000000
--- a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/ResultSetParser.kt
+++ /dev/null
@@ -1,26 +0,0 @@
-package dev.typr.foundations.kotlin
-
-import java.sql.ResultSet
-
-/**
- * Kotlin wrapper for dev.typr.foundations.ResultSetParser that provides Kotlin-native methods.
- *
- * Wraps the Java ResultSetParser to provide interop with Java APIs.
- */
-class ResultSetParser<Out>(val underlying: dev.typr.foundations.ResultSetParser<Out>) {
-  fun apply(rs: ResultSet): Out = underlying.apply(rs)
-}
-
-/**
- * Convert a Java ResultSetParser to a Kotlin ResultSetParser.
- */
-fun <Out> dev.typr.foundations.ResultSetParser<Out>.asKotlin(): ResultSetParser<Out> {
-  return ResultSetParser(this)
-}
-
-/**
- * Convert a Kotlin ResultSetParser to a Java ResultSetParser.
- */
-fun <Out> ResultSetParser<Out>.asJava(): dev.typr.foundations.ResultSetParser<Out> {
-  return underlying
-}
diff --git a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/RowParser.kt b/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/RowParser.kt
deleted file mode 100644
index bbf96abdbf..0000000000
--- a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/RowParser.kt
+++ /dev/null
@@ -1,64 +0,0 @@
-package dev.typr.foundations.kotlin
-
-import java.sql.ResultSet
-
-/**
- * Kotlin wrapper for dev.typr.foundations.RowParser that provides Kotlin-native methods.
- *
- * This class has the same API surface as the Java RowParser but returns Kotlin types (T?)
- * instead of Java types (Optional<T>).
- */
-class RowParser<Row>(val underlying: dev.typr.foundations.RowParser<Row>) {
-
-  /**
-   * Parse all rows from a ResultSet.
-   * Returns Kotlin List instead of java.util.List.
-   */
-  fun all(): ResultSetParser<List<Row>> {
-    val javaParser = underlying.all()
-    return ResultSetParser(dev.typr.foundations.ResultSetParser { rs -> javaParser.apply(rs).toList() })
-  }
-
-  /**
-   * Parse exactly one row from a ResultSet.
-   * Returns Row directly (throws if not exactly one row).
-   */
-  fun exactlyOne(): ResultSetParser<Row> {
-    return ResultSetParser(underlying.exactlyOne())
-  }
-
-  /**
-   * Parse the first row from a ResultSet or null if empty.
-   * Returns Row? instead of Optional<Row>.
-   */
-  fun first(): ResultSetParser<Row?> {
-    val javaParser = dev.typr.foundations.ResultSetParser.First(underlying)
-    return ResultSetParser(dev.typr.foundations.ResultSetParser { rs -> javaParser.apply(rs).orNull() })
-  }
-
-  /**
-   * Parse the first row from a ResultSet or null if empty.
-   * Alias for first() to match Java API.
-   */
-  fun firstOrNull(): ResultSetParser<Row?> = first()
-
-  /**
-   * Parse at most one row from a ResultSet or null.
-   * Returns Row? instead of Optional<Row>.
-   */
-  fun maxOne(): ResultSetParser<Row?> {
-    val javaParser = dev.typr.foundations.ResultSetParser.MaxOne(underlying)
-    return ResultSetParser(dev.typr.foundations.ResultSetParser { rs -> javaParser.apply(rs).orNull() })
-  }
-
-  /**
-   * Parse at most one row from a ResultSet or null.
-   * Alias for maxOne() to match Kotlin conventions.
-   */
-  fun maxOneOrNull(): ResultSetParser<Row?> = maxOne()
-
-  /**
-   * Parse a single row from the current position in ResultSet.
-   */
-  fun parse(rs: ResultSet): Row = underlying.parse(rs)
-}
diff --git a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/RuntimeExports.kt b/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/RuntimeExports.kt
deleted file mode 100644
index 89f34e3f53..0000000000
--- a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/RuntimeExports.kt
+++ /dev/null
@@ -1,70 +0,0 @@
-package dev.typr.foundations.kotlin
-
-// ================================
-// Simple Type Aliases (no interop issues)
-// ================================
-
-// Core utility types
-// Fragment wrapper is defined in Fragment.kt
-typealias Either<L, R> = dev.typr.foundations.Either<L, R>
-typealias And<A, B> = dev.typr.foundations.And<A, B>
-
-// PostgreSQL data wrapper types (simple records)
-typealias Json = dev.typr.foundations.data.Json
-typealias Jsonb = dev.typr.foundations.data.Jsonb
-typealias Money = dev.typr.foundations.data.Money
-typealias Xml = dev.typr.foundations.data.Xml
-typealias Vector = dev.typr.foundations.data.Vector
-typealias Record = dev.typr.foundations.data.Record
-typealias Unknown = dev.typr.foundations.data.Unknown
-typealias Xid = dev.typr.foundations.data.Xid
-typealias Inet = dev.typr.foundations.data.Inet
-typealias AclItem = dev.typr.foundations.data.AclItem
-typealias AnyArray = dev.typr.foundations.data.AnyArray
-typealias Int2Vector = dev.typr.foundations.data.Int2Vector
-typealias OidVector = dev.typr.foundations.data.OidVector
-typealias PgNodeTree =
dev.typr.foundations.data.PgNodeTree
-
-// Regclass types
-typealias Regclass = dev.typr.foundations.data.Regclass
-typealias Regconfig = dev.typr.foundations.data.Regconfig
-typealias Regdictionary = dev.typr.foundations.data.Regdictionary
-typealias Regnamespace = dev.typr.foundations.data.Regnamespace
-typealias Regoper = dev.typr.foundations.data.Regoper
-typealias Regoperator = dev.typr.foundations.data.Regoperator
-typealias Regproc = dev.typr.foundations.data.Regproc
-typealias Regprocedure = dev.typr.foundations.data.Regprocedure
-typealias Regrole = dev.typr.foundations.data.Regrole
-typealias Regtype = dev.typr.foundations.data.Regtype
-
-// Range types (need extension methods but can be aliased)
-typealias Range<T> = dev.typr.foundations.data.Range<T>
-typealias RangeBound = dev.typr.foundations.data.RangeBound
-typealias RangeFinite<T> = dev.typr.foundations.data.RangeFinite<T>
-
-// Array type (needs extension methods but can be aliased)
-typealias Arr<A> = dev.typr.foundations.data.Arr<A>
-
-// Core type system (will add extension methods)
-typealias PgType<A> = dev.typr.foundations.PgType<A>
-typealias PgTypename<A> = dev.typr.foundations.PgTypename<A>
-typealias PgRead<A> = dev.typr.foundations.PgRead<A>
-typealias PgWrite<A> = dev.typr.foundations.PgWrite<A>
-typealias PgText<A> = dev.typr.foundations.PgText<A>
-
-// Database access types
-// RowParser, ResultSetParser, and Operation are wrapper classes (see RowParser.kt, ResultSetParser.kt, and Operation.kt)
-// Operation wrapper is defined in Operation.kt
-typealias Transactor = dev.typr.foundations.Transactor
-
-// Functional interfaces for SQL exceptions
-typealias SqlFunction<T, R> = dev.typr.foundations.SqlFunction<T, R>
-typealias SqlConsumer<T> = dev.typr.foundations.SqlConsumer<T>
-typealias SqlBiConsumer<T, U> = dev.typr.foundations.SqlBiConsumer<T, U>
-
-// Utility
-typealias ByteArrays = dev.typr.foundations.internal.ByteArrays
-typealias ArrParser<A> = dev.typr.foundations.ArrParser<A>
-
-// PgTypes registry (static access)
-typealias PgTypes =
dev.typr.foundations.PgTypes
diff --git a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/RuntimeExtensions.kt b/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/RuntimeExtensions.kt
deleted file mode 100644
index cb01bc9c78..0000000000
--- a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/RuntimeExtensions.kt
+++ /dev/null
@@ -1,126 +0,0 @@
-package dev.typr.foundations.kotlin
-
-import dev.typr.foundations.*
-import java.util.Optional
-
-/**
- * Kotlin extension methods for typr-runtime-java that provide:
- * - Nullable types instead of Optional
- * - Kotlin-friendly lambda syntax
- * - Better generic type handling
- */
-
-// ================================
-// DbType Extensions
-// ================================
-
-/**
- * Kotlin-friendly nullable version of DbType.opt().
- *
- * Java's DbType<A>.opt() returns DbType<Optional<A>>, but Kotlin code should work with A? instead.
- * This extension wraps a DbType<A> to create a DbType<A?> that converts between Optional and nullable
- * at read/write boundaries.
- *
- * Usage:
- *   val nullableText: DbType<String?> = PgTypes.text.nullable()
- *
- * Instead of:
- *   val optionalText: DbType<Optional<String>> = PgTypes.text.opt() // Java-style
- */
-fun <A : Any> DbType<A>.nullable(): DbType<A?> =
-  this.opt().to(optionalToNullable())
-
-// ================================
-// Either Extensions (value-level)
-// ================================
-
-/**
- * Convert Either<L, R> to R? (nullable right value).
- * Uses .orNull() from OptionalExtensions.
- */
-fun <L, R : Any> Either<L, R>.rightOrNull(): R? {
-  return this.asOptional().orNull()
-}
-
-/**
- * Get left value or null.
- */
-fun <L : Any, R> Either<L, R>.leftOrNull(): L? {
-  return when (this) {
-    is dev.typr.foundations.Either.Left -> this.value()
-    else -> null
-  }
-}
-
-// ================================
-// Arr Extensions (value-level)
-// ================================
-
-/**
- * Reshape array to new dimensions or return null.
- * Uses .orNull() from OptionalExtensions.
- */
-fun <A> Arr<A>.reshapeOrNull(vararg newDims: Int): Arr<A>? {
-  return this.reshape(*newDims).orNull()
-}
-
-/**
- * Get array element at indices or return null.
- * Uses .orNull() from OptionalExtensions.
- */
-fun <A : Any> dev.typr.foundations.data.Arr<A>.getOrNull(vararg indices: Int): A? {
-  return this.get(*indices).orNull()
-}
-
-// ================================
-// Range Extensions (value-level)
-// ================================

-/**
- * Get finite range or null.
- * Uses .orNull() from OptionalExtensions.
- */
-fun <T : Comparable<T>> Range<T>.finiteOrNull(): RangeFinite<T>? {
-  return this.finite().orNull()
-}
-
-// ================================
-// Fragment Extensions
-// ================================
-
-/**
- * Build a Fragment using Kotlin DSL.
- */
-inline fun buildFragment(block: dev.typr.foundations.Fragment.Builder.() -> Unit): Fragment {
-  val builder = dev.typr.foundations.Fragment.Builder()
-  builder.block()
-  return Fragment(builder.done())
-}
-
-/**
- * Append a nullable parameter to fragment builder.
- * Converts Kotlin's T? to Java's Optional<T> automatically.
- */
-fun <T : Any> dev.typr.foundations.Fragment.Builder.paramNullable(type: DbType<T>, value: T?): dev.typr.foundations.Fragment.Builder {
-  return this.param(type.opt(), value.toOptional())
-}
-
-/**
- * Kotlin-friendly query method that accepts Kotlin ResultSetParser.
- * Converts Kotlin ResultSetParser to Java ResultSetParser automatically.
- */
-fun <Out> dev.typr.foundations.Fragment.query(parser: ResultSetParser<Out>): dev.typr.foundations.Operation<Out> {
-  return this.query(parser.underlying)
-}
-
-// ================================
-// Operation Extensions
-// ================================
-
-/**
- * Extension to convert Operation<Optional<T>> results to T? automatically.
- * This handles the common case where Java methods return Optional<T> but Kotlin code expects nullable types.
- */
-fun <T : Any> dev.typr.foundations.Operation<Optional<T>>.runUncheckedOrNull(c: java.sql.Connection): T?
{
-  return this.runUnchecked(c).orNull()
-}
diff --git a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/StaticExports.kt b/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/StaticExports.kt
deleted file mode 100644
index c8cf718889..0000000000
--- a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/StaticExports.kt
+++ /dev/null
@@ -1,186 +0,0 @@
-package dev.typr.foundations.kotlin
-
-import dev.typr.foundations.SqlFunction
-
-/**
- * Kotlin-friendly exports of static members from Java classes.
- * Type aliases don't export static methods, so we need object wrappers.
- */
-
-// ================================
-// PgTypes - Main type registry
-// ================================
-
-/**
- * Access to all PostgreSQL type descriptors.
- * Delegates to dev.typr.foundations.PgTypes.
- */
-object Types {
-  // Primitive types
-  val bool: PgType<Boolean> get() = dev.typr.foundations.PgTypes.bool
-  val int2: PgType<Short> get() = dev.typr.foundations.PgTypes.int2
-  val int4: PgType<Int> get() = dev.typr.foundations.PgTypes.int4
-  val int8: PgType<Long> get() = dev.typr.foundations.PgTypes.int8
-  val float4: PgType<Float> get() = dev.typr.foundations.PgTypes.float4
-  val float8: PgType<Double> get() = dev.typr.foundations.PgTypes.float8
-  val numeric: PgType<java.math.BigDecimal> get() = dev.typr.foundations.PgTypes.numeric
-  val text: PgType<String> get() = dev.typr.foundations.PgTypes.text
-  val bytea: PgType<ByteArray> get() = dev.typr.foundations.PgTypes.bytea
-  val uuid: PgType<java.util.UUID> get() = dev.typr.foundations.PgTypes.uuid
-
-  // Date/time types (using java.time)
-  val date: PgType<java.time.LocalDate> get() = dev.typr.foundations.PgTypes.date
-  val time: PgType<java.time.LocalTime> get() = dev.typr.foundations.PgTypes.time
-  val timestamp: PgType<java.time.LocalDateTime> get() = dev.typr.foundations.PgTypes.timestamp
-  val timestamptz: PgType<java.time.OffsetDateTime> get() = dev.typr.foundations.PgTypes.timestamptz
-  val interval = dev.typr.foundations.PgTypes.interval
-
-  // JSON types
-  val json: PgType<Json> get() = dev.typr.foundations.PgTypes.json
-  val jsonb: PgType<Jsonb> get() =
dev.typr.foundations.PgTypes.jsonb
-
-  // Other types
-  val xml: PgType<Xml> get() = dev.typr.foundations.PgTypes.xml
-  val money: PgType<Money> get() = dev.typr.foundations.PgTypes.money
-  val inet: PgType<Inet> get() = dev.typr.foundations.PgTypes.inet
-  val vector: PgType<Vector> get() = dev.typr.foundations.PgTypes.vector
-  val xid: PgType<Xid> get() = dev.typr.foundations.PgTypes.xid
-
-  // Reg* types
-  val regclass: PgType<Regclass> get() = dev.typr.foundations.PgTypes.regclass
-  val regconfig: PgType<Regconfig> get() = dev.typr.foundations.PgTypes.regconfig
-  val regdictionary: PgType<Regdictionary> get() = dev.typr.foundations.PgTypes.regdictionary
-  val regnamespace: PgType<Regnamespace> get() = dev.typr.foundations.PgTypes.regnamespace
-  val regoper: PgType<Regoper> get() = dev.typr.foundations.PgTypes.regoper
-  val regoperator: PgType<Regoperator> get() = dev.typr.foundations.PgTypes.regoperator
-  val regproc: PgType<Regproc> get() = dev.typr.foundations.PgTypes.regproc
-  val regprocedure: PgType<Regprocedure> get() = dev.typr.foundations.PgTypes.regprocedure
-  val regrole: PgType<Regrole> get() = dev.typr.foundations.PgTypes.regrole
-  val regtype: PgType<Regtype> get() = dev.typr.foundations.PgTypes.regtype
-
-  // Factory methods
-  fun <E : Enum<E>> ofEnum(name: String, fromString: (String) -> E): PgType<E> {
-    return dev.typr.foundations.PgTypes.ofEnum(name, fromString)
-  }
-
-  fun <T> ofPgObject(
-    sqlType: String,
-    constructor: SqlFunction<String, T>,
-    extractor: (T) -> String,
-    json: dev.typr.foundations.PgJson<T>
-  ): PgType<T> {
-    return dev.typr.foundations.PgTypes.ofPgObject(sqlType, constructor, extractor, json)
-  }
-
-  fun ofPgObject(sqlType: String): PgType<String> {
-    return dev.typr.foundations.PgTypes.ofPgObject(
-      sqlType,
-      SqlFunction { it },
-      { it },
-      dev.typr.foundations.PgJson.text
-    )
-  }
-
-  fun bpchar(precision: Int): PgType<String> {
-    return dev.typr.foundations.PgTypes.bpchar(precision)
-  }
-
-  fun record(typename: String): PgType<Record> {
-    return dev.typr.foundations.PgTypes.record(typename)
-  }
-}
-
-// ================================
-// Fragment - SQL fragment construction
-// ================================
-
-/**
- * Fragment factory methods.
- */
-object Fragments {
-  val EMPTY: Fragment get() = Fragment(dev.typr.foundations.Fragment.EMPTY)
-
-  fun lit(value: String): Fragment {
-    return Fragment(dev.typr.foundations.Fragment.lit(value))
-  }
-
-  fun empty(): Fragment {
-    return Fragment(dev.typr.foundations.Fragment.empty())
-  }
-
-  fun quotedDouble(value: String): Fragment {
-    return Fragment(dev.typr.foundations.Fragment.quotedDouble(value))
-  }
-
-  fun quotedSingle(value: String): Fragment {
-    return Fragment(dev.typr.foundations.Fragment.quotedSingle(value))
-  }
-
-  fun <A> value(value: A, type: PgType<A>): Fragment {
-    return Fragment(dev.typr.foundations.Fragment.value(value, type))
-  }
-
-  fun and(vararg fragments: Fragment): Fragment {
-    return Fragment(dev.typr.foundations.Fragment.and(*fragments.map { it.underlying }.toTypedArray()))
-  }
-
-  fun and(fragments: List<Fragment>): Fragment {
-    return Fragment(dev.typr.foundations.Fragment.and(fragments.map { it.underlying }))
-  }
-
-  fun or(vararg fragments: Fragment): Fragment {
-    return Fragment(dev.typr.foundations.Fragment.or(*fragments.map { it.underlying }.toTypedArray()))
-  }
-
-  fun or(fragments: List<Fragment>): Fragment {
-    return Fragment(dev.typr.foundations.Fragment.or(fragments.map { it.underlying }))
-  }
-
-  fun join(fragments: List<Fragment>, separator: Fragment): Fragment {
-    return Fragment(dev.typr.foundations.Fragment.join(fragments.map { it.underlying }, separator.underlying))
-  }
-
-  fun interpolate(initial: String): Fragment.Builder {
-    return Fragment.Builder(dev.typr.foundations.Fragment.interpolate(initial))
-  }
-
-  fun concat(vararg fragments: Fragment): Fragment {
-    return Fragment(dev.typr.foundations.Fragment.concat(*fragments.map { it.underlying }.toTypedArray()))
-  }
-}
-
-// ================================
-// ResultSetParser - Result parsing
-// ================================
-
-/**
- * ResultSetParser factory methods.
- * These work with the underlying Java RowParser type.
- * For Kotlin RowParser wrappers, use instance methods like .first(), .all() instead.
- */
-object Parsers {
-  fun <Row> all(rowParser: dev.typr.foundations.RowParser<Row>): ResultSetParser<List<Row>> {
-    val javaParser = dev.typr.foundations.ResultSetParser.All(rowParser)
-    return ResultSetParser { rs -> javaParser.apply(rs).toList() }
-  }
-
-  fun <Row> first(rowParser: dev.typr.foundations.RowParser<Row>): ResultSetParser<Row?> {
-    val javaParser = dev.typr.foundations.ResultSetParser.First(rowParser)
-    return ResultSetParser { rs -> javaParser.apply(rs).orNull() }
-  }
-
-  fun <Row> maxOne(rowParser: dev.typr.foundations.RowParser<Row>): ResultSetParser<Row?> {
-    val javaParser = dev.typr.foundations.ResultSetParser.MaxOne(rowParser)
-    return ResultSetParser { rs -> javaParser.apply(rs).orNull() }
-  }
-
-  fun <Row> exactlyOne(rowParser: dev.typr.foundations.RowParser<Row>): ResultSetParser<Row> {
-    val javaParser = dev.typr.foundations.ResultSetParser.ExactlyOne(rowParser)
-    return ResultSetParser { rs -> javaParser.apply(rs) }
-  }
-
-  fun <Out> foreach(rowParser: dev.typr.foundations.RowParser<Out>, consumer: (Out) -> Unit): ResultSetParser<Unit> {
-    val javaParser = dev.typr.foundations.ResultSetParser.Foreach(rowParser, consumer)
-    return ResultSetParser { rs -> javaParser.apply(rs) }
-  }
-}
diff --git a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/Structure.kt b/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/Structure.kt
deleted file mode 100644
index 7f5ee80369..0000000000
--- a/foundations-jdbc-dsl-kotlin/src/kotlin/dev/typr/foundations/kotlin/Structure.kt
+++ /dev/null
@@ -1,4 +0,0 @@
-package dev.typr.foundations.kotlin
-
-// Top-level type alias for RelationStructure interface
-typealias RelationStructure<Fields, Row> = dev.typr.foundations.dsl.RelationStructure<Fields, Row>
diff --git a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Bijection.scala b/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Bijection.scala
deleted file mode 100644
index 9769ba871c..0000000000
---
a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Bijection.scala
+++ /dev/null
@@ -1,19 +0,0 @@
-package dev.typr.foundations.scala
-
-/** Bijection companion object with factory methods for creating bijections. */
-object Bijection {
-  def apply[T, TT](unwrap: T => TT)(wrap: TT => T): dev.typr.foundations.dsl.Bijection[T, TT] =
-    dev.typr.foundations.dsl.Bijection.of[T, TT](t => unwrap(t), tt => wrap(tt))
-
-  def of[T, TT](unwrap: T => TT, wrap: TT => T): dev.typr.foundations.dsl.Bijection[T, TT] =
-    dev.typr.foundations.dsl.Bijection.of[T, TT](t => unwrap(t), tt => wrap(tt))
-
-  def identity[T](): dev.typr.foundations.dsl.Bijection[T, T] =
-    dev.typr.foundations.dsl.Bijection.identity[T]()
-
-  def asString: dev.typr.foundations.dsl.Bijection[String, String] =
-    dev.typr.foundations.dsl.Bijection.asString()
-
-  def asBool: dev.typr.foundations.dsl.Bijection[java.lang.Boolean, java.lang.Boolean] =
-    dev.typr.foundations.dsl.Bijection.asBool()
-}
diff --git a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Bijections.scala b/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Bijections.scala
deleted file mode 100644
index 1cd6179440..0000000000
--- a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Bijections.scala
+++ /dev/null
@@ -1,86 +0,0 @@
-package dev.typr.foundations.scala
-
-import dev.typr.foundations.dsl.Bijection
-
-import java.util.Optional
-import _root_.scala.jdk.OptionConverters.*
-
-object Bijections {
-
-  // ================================
-  // Optional ↔ Option[T]
-  // ================================
-
-  /** Bijection between Java Optional[T] and Scala Option[T]. Used for type-safe phantom type conversion in PgTypename/MariaTypename.
-    *
-    * Usage: val typename: PgTypename[Option[String]] = pgType.opt().typename().to(optionalToOption[String])
-    */
-  def optionalToOption[T]: Bijection[Optional[T], Option[T]] = {
-    Bijection.of[Optional[T], Option[T]](
-      (opt: Optional[T]) => opt.toScala,
-      (option: Option[T]) => option.toJava
-    )
-  }
-
-  /** Bijection between Scala Option[T] and Java Optional[T]. Inverse of optionalToOption.
-    */
-  def optionToOptional[T]: Bijection[Option[T], Optional[T]] = optionalToOption[T].inverse()
-
-  // ================================
-  // Scala ↔ Java primitive type conversions
-  // ================================
-
-  // Create Bijection instances for Scala → Java type conversions
-  val scalaBooleanToJavaBoolean: Bijection[Boolean, java.lang.Boolean] = {
-    Bijection.of[Boolean, java.lang.Boolean](
-      (b: Boolean) => java.lang.Boolean.valueOf(b),
-      (jb: java.lang.Boolean) => jb.booleanValue()
-    )
-  }
-
-  // Reverse direction: Java → Scala
-  val javaBooleanToScalaBoolean: Bijection[java.lang.Boolean, Boolean] = {
-    Bijection.of[java.lang.Boolean, Boolean](
-      (jb: java.lang.Boolean) => jb.booleanValue(),
-      (b: Boolean) => java.lang.Boolean.valueOf(b)
-    )
-  }
-
-  val scalaIntToJavaInteger: Bijection[Int, java.lang.Integer] = {
-    Bijection.of[Int, java.lang.Integer](
-      (i: Int) => java.lang.Integer.valueOf(i),
-      (ji: java.lang.Integer) => ji.intValue()
-    )
-  }
-
-  val scalaLongToJavaLong: Bijection[Long, java.lang.Long] = {
-    Bijection.of[Long, java.lang.Long](
-      (l: Long) => java.lang.Long.valueOf(l),
-      (jl: java.lang.Long) => jl.longValue()
-    )
-  }
-
-  val scalaShortToJavaShort: Bijection[Short, java.lang.Short] = {
-    Bijection.of[Short, java.lang.Short](
-      (s: Short) => java.lang.Short.valueOf(s),
-      (js: java.lang.Short) => js.shortValue()
-    )
-  }
-
-  val scalaFloatToJavaFloat: Bijection[Float, java.lang.Float] = {
-    Bijection.of[Float, java.lang.Float](
-      (f: Float) => java.lang.Float.valueOf(f),
-      (jf: java.lang.Float) => jf.floatValue()
-    )
-  }
-
-  val
scalaDoubleToJavaDouble: Bijection[Double, java.lang.Double] = {
-    Bijection.of[Double, java.lang.Double](
-      (d: Double) => java.lang.Double.valueOf(d),
-      (jd: java.lang.Double) => jd.doubleValue()
-    )
-  }
-
-  // Identity bijections for types that don't need conversion
-  def identity[T]: Bijection[T, T] = Bijection.identity[T]()
-}
diff --git a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/DslExports.scala b/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/DslExports.scala
deleted file mode 100644
index 2b0e366a06..0000000000
--- a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/DslExports.scala
+++ /dev/null
@@ -1,78 +0,0 @@
-package dev.typr.foundations.scala
-
-import _root_.scala.jdk.CollectionConverters.*
-
-object DslExports {
-
-  // Type aliases for DSL types
-  type Bijection[Wrapper, Underlying] = dev.typr.foundations.dsl.Bijection[Wrapper, Underlying]
-  type SortOrder[T] = dev.typr.foundations.dsl.SortOrder[T]
-
-  // Functional interfaces
-  type SqlFunction2[T1, T2, R] = dev.typr.foundations.dsl.SqlFunction2[T1, T2, R]
-  type SqlFunction3[T1, T2, T3, R] = dev.typr.foundations.dsl.SqlFunction3[T1, T2, T3, R]
-  type TriFunction[T1, T2, T3, R] = dev.typr.foundations.dsl.TriFunction[T1, T2, T3, R]
-
-  // Builder parameter types
-  type DeleteParams[Fields] = dev.typr.foundations.dsl.DeleteParams[Fields]
-  type SelectParams[Fields, Row] = dev.typr.foundations.dsl.SelectParams[Fields, Row]
-  type UpdateParams[Fields, Row] = dev.typr.foundations.dsl.UpdateParams[Fields, Row]
-
-  // Path type
-  type Path = dev.typr.foundations.dsl.Path
-
-  // Mock builder functions
-  def SelectBuilderMock[Fields, Row](
-    structure: dev.typr.foundations.dsl.RelationStructure[Fields, Row],
-    allRowsSupplier: () => List[Row],
-    params: SelectParams[Fields, Row]
-  ): SelectBuilder[Fields, Row] = {
-    SelectBuilder(
-      dev.typr.foundations.dsl.SelectBuilderMock(
-        structure,
-        () => allRowsSupplier().asJava,
-        params
-      )
-    )
-  }
-
-  def
DeleteBuilderMock[Id, Fields, Row](
-    structure: dev.typr.foundations.dsl.RelationStructure[Fields, Row],
-    allRowsSupplier: () => List[Row],
-    params: DeleteParams[Fields],
-    idExtractor: Row => Id,
-    deleteById: Id => Unit
-  ): DeleteBuilder[Fields, Row] = {
-    new DeleteBuilder(
-      dev.typr.foundations.dsl.DeleteBuilderMock(
-        structure,
-        () => allRowsSupplier().asJava,
-        params,
-        (row: Row) => idExtractor(row),
-        (id: Id) => deleteById(id)
-      )
-    )
-  }
-
-  def UpdateBuilderMock[Fields, Row](
-    structure: dev.typr.foundations.dsl.RelationStructure[Fields, Row],
-    allRowsSupplier: () => List[Row],
-    params: UpdateParams[Fields, Row],
-    copyRow: Row => Row
-  ): UpdateBuilder[Fields, Row] = {
-    new UpdateBuilder(
-      dev.typr.foundations.dsl.UpdateBuilderMock(
-        structure,
-        () => allRowsSupplier().asJava,
-        params,
-        (row: Row) => copyRow(row)
-      )
-    )
-  }
-
-  // SortOrder extension methods
-  implicit class SqlExprSortOrderOps[T](private val expr: dev.typr.foundations.dsl.SqlExpr[T]) extends AnyVal {
-    def asc(): SortOrder[T] = dev.typr.foundations.dsl.SortOrder.asc(expr)
-    def desc(): SortOrder[T] = dev.typr.foundations.dsl.SortOrder.desc(expr)
-  }
-}
diff --git a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Fragment.scala b/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Fragment.scala
deleted file mode 100644
index 9ccfec22a2..0000000000
--- a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Fragment.scala
+++ /dev/null
@@ -1,191 +0,0 @@
-package dev.typr.foundations.scala
-
-import java.sql.PreparedStatement
-import java.util.concurrent.atomic.AtomicInteger
-import _root_.scala.jdk.CollectionConverters.*
-
-/** Scala wrapper for dev.typr.foundations.Fragment with Scala-native APIs.
-  *
-  * This class wraps the Java Fragment interface and provides Scala-friendly methods that use Scala collections and types.
-  */
-class Fragment(val underlying: dev.typr.foundations.Fragment) extends AnyVal {
-
-  def render(): String = underlying.render()
-
-  def render(sb: java.lang.StringBuilder): Unit = underlying.render(sb)
-
-  def set(stmt: PreparedStatement): Unit = underlying.set(stmt)
-
-  def set(stmt: PreparedStatement, idx: AtomicInteger): Unit = underlying.set(stmt, idx)
-
-  def append(other: Fragment): Fragment = new Fragment(underlying.append(other.underlying))
-
-  def ++(other: Fragment): Fragment = append(other)
-
-  def query[T](parser: ResultSetParser[T]): Operation.Query[T] =
-    Operation.Query(this, parser)
-
-  def update(): Operation.Update =
-    Operation.Update(this)
-
-  def updateReturning[T](parser: ResultSetParser[T]): Operation.UpdateReturning[T] =
-    Operation.UpdateReturning(this, parser)
-
-  def updateMany[Row](parser: RowParser[Row], rows: Iterator[Row]): Operation.UpdateMany[Row] =
-    Operation.UpdateMany(this, parser, rows)
-
-  def updateManyReturning[Row](parser: RowParser[Row], rows: Iterator[Row]): Operation.UpdateManyReturning[Row] =
-    Operation.UpdateManyReturning(this, parser, rows)
-
-  def updateReturningEach[Row](parser: RowParser[Row], rows: Iterator[Row]): Operation.UpdateReturningEach[Row] =
-    Operation.UpdateReturningEach(this, parser, rows)
-
-  /** Oracle-specific: Update with generated keys (for databases that don't support RETURNING clause) */
-  def updateReturningGeneratedKeys[T](columnNames: Array[String], parser: ResultSetParser[T]): Operation.UpdateReturningGeneratedKeys[T] =
-    Operation.UpdateReturningGeneratedKeys(this, columnNames, parser)
-}
-
-object Fragment {
-  val EMPTY: Fragment = new Fragment(dev.typr.foundations.Fragment.EMPTY)
-
-  def lit(value: String): Fragment = new Fragment(dev.typr.foundations.Fragment.lit(value))
-
-  def empty(): Fragment = EMPTY
-
-  def quotedDouble(value: String): Fragment = new Fragment(dev.typr.foundations.Fragment.quotedDouble(value))
-
-  def quotedSingle(value: String): Fragment = new
Fragment(dev.typr.foundations.Fragment.quotedSingle(value))
-
-  def value[A](value: A, dbType: dev.typr.foundations.DbType[A]): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.value(value, dbType))
-
-  /** Encode a value into a SQL fragment using the provided database type. */
-  def encode[A](dbType: dev.typr.foundations.DbType[A], value: A): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.encode(dbType, value))
-
-  def and(fragments: Fragment*): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.and(fragments.map(_.underlying)*))
-
-  def and(fragments: List[Fragment]): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.and(fragments.map(_.underlying).asJava))
-
-  def or(fragments: Fragment*): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.or(fragments.map(_.underlying)*))
-
-  def or(fragments: List[Fragment]): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.or(fragments.map(_.underlying).asJava))
-
-  def whereAnd(fragments: Fragment*): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.whereAnd(fragments.map(_.underlying)*))
-
-  def whereAnd(fragments: List[Fragment]): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.whereAnd(fragments.map(_.underlying).asJava))
-
-  def whereOr(fragments: Fragment*): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.whereOr(fragments.map(_.underlying)*))
-
-  def whereOr(fragments: List[Fragment]): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.whereOr(fragments.map(_.underlying).asJava))
-
-  def set(fragments: Fragment*): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.set(fragments.map(_.underlying)*))
-
-  def set(fragments: List[Fragment]): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.set(fragments.map(_.underlying).asJava))
-
-  def parentheses(fragment: Fragment): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.parentheses(fragment.underlying))
-
-  def comma(fragments: Fragment*): Fragment =
-    new
Fragment(dev.typr.foundations.Fragment.comma(fragments.map(_.underlying)*))
-
-  def comma(fragments: Iterable[Fragment]): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.comma(fragments.map(_.underlying).toList.asJava))
-
-  def orderBy(fragments: Fragment*): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.orderBy(fragments.map(_.underlying)*))
-
-  def orderBy(fragments: List[Fragment]): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.orderBy(fragments.map(_.underlying).asJava))
-
-  def join(fragments: List[Fragment], separator: Fragment): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.join(fragments.map(_.underlying).asJava, separator.underlying))
-
-  def concat(fragments: Fragment*): Fragment =
-    new Fragment(dev.typr.foundations.Fragment.concat(fragments.map(_.underlying)*))
-
-  /** Scala string interpolator for creating SQL Fragments.
-    *
-    * Usage:
-    * {{{
-    * import dev.typr.foundations.scala.Fragment.sql
-    *
-    * val name = Fragment.value("Alice", PgTypes.text)
-    * val query = sql"SELECT * FROM users WHERE name = $name"
-    * }}}
-    */
-  extension (sc: StringContext) {
-    def sql(args: Fragment*): Fragment = {
-      val parts = sc.parts.iterator
-      val frags = new scala.collection.mutable.ListBuffer[dev.typr.foundations.Fragment]()
-
-      // Add first string part
-      if (parts.hasNext) {
-        val first = parts.next()
-        if (first.nonEmpty) {
-          frags += dev.typr.foundations.Fragment.lit(first)
-        }
-      }
-
-      // Interleave remaining parts with args
-      val argsIt = args.iterator
-      while (parts.hasNext && argsIt.hasNext) {
-        frags += argsIt.next().underlying
-        val part = parts.next()
-        if (part.nonEmpty) {
-          frags += dev.typr.foundations.Fragment.lit(part)
-        }
-      }
-
-      // Handle any remaining args (shouldn't happen with valid interpolation)
-      while (argsIt.hasNext) {
-        frags += argsIt.next().underlying
-      }
-
-      frags.result() match {
-        case Nil => Fragment.empty()
-        case single :: Nil => new Fragment(single)
-        case multiple =>
-          val
javaList = new java.util.ArrayList[dev.typr.foundations.Fragment](multiple.size) - multiple.foreach(javaList.add) - new Fragment(new dev.typr.foundations.Fragment.Concat(javaList)) - } - } - } - - /** Builder for creating Fragments with a fluent API */ - class Builder(private val underlying: dev.typr.foundations.Fragment.Builder) { - def sql(s: String): Builder = { - underlying.sql(s) - this - } - - def param[T](dbType: dev.typr.foundations.DbType[T], value: T): Builder = { - underlying.param(dbType, value) - this - } - - def param(fragment: Fragment): Builder = { - underlying.param(fragment.underlying) - this - } - - def done(): Fragment = new Fragment(underlying.done()) - } - - def interpolate(initial: String): Builder = - new Builder(dev.typr.foundations.Fragment.interpolate(initial)) - - def interpolate(fragments: Fragment*): Fragment = - new Fragment(dev.typr.foundations.Fragment.interpolate(fragments.map(_.underlying)*)) -} diff --git a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Operation.scala b/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Operation.scala deleted file mode 100644 index 46bac21fe1..0000000000 --- a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/Operation.scala +++ /dev/null @@ -1,95 +0,0 @@ -package dev.typr.foundations.scala - -import java.sql.{Connection, SQLException} -import _root_.scala.jdk.CollectionConverters.* - -/** Scala wrapper for dev.typr.foundations.Operation with Scala-native return types. - * - * This class wraps the Java Operation interface and provides Scala-friendly methods. - */ -sealed trait Operation[Out] { - def underlying: dev.typr.foundations.Operation[?] 
- - def run(conn: Connection): Out - - def runUnchecked(conn: Connection): Out = { - try { - run(conn) - } catch { - case e: SQLException => throw new RuntimeException(e) - } - } -} - -object Operation { - - /** Query operation that returns a parsed result */ - class Query[Out](val underlying: dev.typr.foundations.Operation.Query[Out]) extends Operation[Out] { - override def run(conn: Connection): Out = underlying.run(conn) - } - - object Query { - def apply[Out](query: Fragment, parser: ResultSetParser[Out]): Query[Out] = - new Query(new dev.typr.foundations.Operation.Query(query.underlying, parser.underlying)) - } - - /** Update operation that returns the number of affected rows */ - class Update(val underlying: dev.typr.foundations.Operation.Update) extends Operation[Int] { - override def run(conn: Connection): Int = underlying.run(conn) - } - - object Update { - def apply(query: Fragment): Update = - new Update(new dev.typr.foundations.Operation.Update(query.underlying)) - } - - /** Update operation with RETURNING clause */ - class UpdateReturning[Out](val underlying: dev.typr.foundations.Operation.UpdateReturning[Out]) extends Operation[Out] { - override def run(conn: Connection): Out = underlying.run(conn) - } - - object UpdateReturning { - def apply[Out](query: Fragment, parser: ResultSetParser[Out]): UpdateReturning[Out] = - new UpdateReturning(new dev.typr.foundations.Operation.UpdateReturning(query.underlying, parser.underlying)) - } - - /** Update operation with generated keys (Oracle-specific for databases without RETURNING clause) */ - class UpdateReturningGeneratedKeys[Out](val underlying: dev.typr.foundations.Operation.UpdateReturningGeneratedKeys[Out]) extends Operation[Out] { - override def run(conn: Connection): Out = underlying.run(conn) - } - - object UpdateReturningGeneratedKeys { - def apply[Out](query: Fragment, columnNames: Array[String], parser: ResultSetParser[Out]): UpdateReturningGeneratedKeys[Out] = - new 
UpdateReturningGeneratedKeys(new dev.typr.foundations.Operation.UpdateReturningGeneratedKeys(query.underlying, columnNames, parser.underlying)) - } - - /** Batch update operation that returns an array of update counts */ - class UpdateMany[Row](val underlying: dev.typr.foundations.Operation.UpdateMany[Row]) extends Operation[Array[Int]] { - override def run(conn: Connection): Array[Int] = underlying.run(conn) - } - - object UpdateMany { - def apply[Row](query: Fragment, parser: RowParser[Row], rows: Iterator[Row]): UpdateMany[Row] = - new UpdateMany(new dev.typr.foundations.Operation.UpdateMany(query.underlying, parser.underlying, rows.asJava)) - } - - /** Batch update operation with RETURNING clause that returns a list of rows */ - class UpdateManyReturning[Row](val underlying: dev.typr.foundations.Operation.UpdateManyReturning[Row]) extends Operation[List[Row]] { - override def run(conn: Connection): List[Row] = underlying.run(conn).asScala.toList - } - - object UpdateManyReturning { - def apply[Row](query: Fragment, parser: RowParser[Row], rows: Iterator[Row]): UpdateManyReturning[Row] = - new UpdateManyReturning(new dev.typr.foundations.Operation.UpdateManyReturning(query.underlying, parser.underlying, rows.asJava)) - } - - /** Update each row individually with RETURNING clause (for MariaDB) */ - class UpdateReturningEach[Row](val underlying: dev.typr.foundations.Operation.UpdateReturningEach[Row]) extends Operation[List[Row]] { - override def run(conn: Connection): List[Row] = underlying.run(conn).asScala.toList - } - - object UpdateReturningEach { - def apply[Row](query: Fragment, parser: RowParser[Row], rows: Iterator[Row]): UpdateReturningEach[Row] = - new UpdateReturningEach(new dev.typr.foundations.Operation.UpdateReturningEach(query.underlying, parser.underlying, rows.asJava)) - } -} diff --git a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/ResultSetParser.scala 
b/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/ResultSetParser.scala deleted file mode 100644 index 92fae3918f..0000000000 --- a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/ResultSetParser.scala +++ /dev/null @@ -1,11 +0,0 @@ -package dev.typr.foundations.scala - -import java.sql.ResultSet - -/** Scala wrapper for dev.typr.foundations.ResultSetParser that provides Scala-native methods. - * - * Wraps the Java ResultSetParser to provide interop with Java APIs. - */ -class ResultSetParser[Out](val underlying: dev.typr.foundations.ResultSetParser[Out]) { - def apply(rs: ResultSet): Out = underlying.apply(rs) -} diff --git a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/RowParser.scala b/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/RowParser.scala deleted file mode 100644 index c5e0ea2ec5..0000000000 --- a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/RowParser.scala +++ /dev/null @@ -1,57 +0,0 @@ -package dev.typr.foundations.scala - -import java.sql.ResultSet -import _root_.scala.jdk.CollectionConverters.* -import _root_.scala.jdk.OptionConverters.* - -/** Scala wrapper for dev.typr.foundations.RowParser that provides Scala-native methods. - * - * This class has the same API surface as the Java RowParser but returns Scala types (Option[T]) instead of Java types (Optional[T]). - */ -class RowParser[Row](val underlying: dev.typr.foundations.RowParser[Row]) { - - /** Parse all rows from a ResultSet. Returns Scala List instead of java.util.List. - */ - def all(): ResultSetParser[List[Row]] = { - val javaParser = underlying.all() - new ResultSetParser(new dev.typr.foundations.ResultSetParser[List[Row]] { - override def apply(rs: ResultSet): List[Row] = javaParser.apply(rs).asScala.toList - }) - } - - /** Parse exactly one row from a ResultSet. Returns Row directly (throws if not exactly one row). 
- */ - def exactlyOne(): ResultSetParser[Row] = { - new ResultSetParser(underlying.exactlyOne()) - } - - /** Parse the first row from a ResultSet or None if empty. Returns Option[Row] instead of Optional[Row]. - */ - def first(): ResultSetParser[Option[Row]] = { - val javaParser = new dev.typr.foundations.ResultSetParser.First(underlying) - new ResultSetParser(new dev.typr.foundations.ResultSetParser[Option[Row]] { - override def apply(rs: ResultSet): Option[Row] = javaParser.apply(rs).toScala - }) - } - - /** Parse the first row from a ResultSet or None if empty. Alias for first() to match Java API. - */ - def firstOrNone(): ResultSetParser[Option[Row]] = first() - - /** Parse at most one row from a ResultSet or None. Returns Option[Row] instead of Optional[Row]. - */ - def maxOne(): ResultSetParser[Option[Row]] = { - val javaParser = new dev.typr.foundations.ResultSetParser.MaxOne(underlying) - new ResultSetParser(new dev.typr.foundations.ResultSetParser[Option[Row]] { - override def apply(rs: ResultSet): Option[Row] = javaParser.apply(rs).toScala - }) - } - - /** Parse at most one row from a ResultSet or None. Alias for maxOne() to match Scala conventions. - */ - def maxOneOrNone(): ResultSetParser[Option[Row]] = maxOne() - - /** Parse a single row from the current position in ResultSet. - */ - def parse(rs: ResultSet): Row = underlying.parse(rs) -} diff --git a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/RuntimeExtensions.scala b/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/RuntimeExtensions.scala deleted file mode 100644 index 6136f9d257..0000000000 --- a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/RuntimeExtensions.scala +++ /dev/null @@ -1,95 +0,0 @@ -package dev.typr.foundations.scala - -import dev.typr.foundations.* - -import _root_.scala.jdk.CollectionConverters.* -import _root_.scala.jdk.OptionConverters.* - -/** Extension to add `.nullable` method to any DbType, converting Optional to Option. 
*/ -implicit class DbTypeOps[A](private val dbType: DbType[A]) extends AnyVal { - def nullable: DbType[Option[A]] = - dbType.opt.to(Bijections.optionalToOption[A]) -} - -implicit class EitherOps[L, R](private val either: Either[L, R]) extends AnyVal { - def rightOrNone: Option[R] = { - either.asOptional().toScala - } - - def leftOrNone: Option[L] = { - either match { - case left: dev.typr.foundations.Either.Left[L, R] => Some(left.value()) - case _ => None - } - } -} - -implicit class ArrOps[A](private val arr: dev.typr.foundations.data.Arr[A]) extends AnyVal { - def reshapeOrNone(newDims: Int*): Option[dev.typr.foundations.data.Arr[A]] = { - arr.reshape(newDims*).toScala - } - - def getOrNone(indices: Int*): Option[A] = { - arr.get(indices*).toScala - } -} - -implicit class RangeOps[T <: Comparable[T]](private val range: dev.typr.foundations.data.Range[T]) extends AnyVal { - def finiteOrNone: Option[dev.typr.foundations.data.RangeFinite[T]] = { - range.finite().toScala - } -} - -implicit class FragmentBuilderOps(private val builder: Fragment.Builder) extends AnyVal { - def paramNullable[T](dbType: DbType[T], value: Option[T]): Fragment.Builder = { - builder.param(dbType.opt(), value.toJava) - } -} - -implicit class FragmentOps(private val fragment: Fragment) extends AnyVal { - def query[Out](parser: dev.typr.foundations.scala.ResultSetParser[Out]): Operation[Out] = { - fragment.query(parser.underlying) - } - - def updateReturningGeneratedKeys[Out](columnNames: Array[String], parser: ResultSetParser[Out]): Operation[Out] = { - fragment.updateReturningGeneratedKeys(columnNames, parser) - } -} - -implicit class OperationOptionalOps[T](private val operation: Operation[java.util.Optional[T]]) extends AnyVal { - def runUncheckedOrNone(c: java.sql.Connection): Option[T] = { - operation.runUnchecked(c).toScala - } -} - -// Extension for converting Scala Iterator to Java Iterator for streaming inserts -implicit class ScalaIteratorOps[T](private val iterator: Iterator[T]) 
extends AnyVal { - def toJavaIterator: java.util.Iterator[T] = iterator.asJava -} - -// Extension for converting java.util.List results to Scala List (for Oracle) -implicit class OperationListOps[T](private val operation: Operation[java.util.List[T]]) extends AnyVal { - def asScalaList(using c: java.sql.Connection): List[T] = { - operation.runUnchecked(c).asScala.toList - } -} - -// Extension for converting java.util.Optional results to Scala Option (for Oracle) -implicit class OperationOptionalToOptionOps[T](private val operation: Operation[java.util.Optional[T]]) extends AnyVal { - def asScalaOption(using c: java.sql.Connection): Option[T] = { - operation.runUnchecked(c).toScala - } -} - -def buildFragment(block: Fragment.Builder => Unit): Fragment = { - val builder = Fragment.Builder() - block(builder) - builder.done() -} - -// Helper object for Fragment operations with Scala collections -object FragmentHelpers { - def comma(fragments: _root_.scala.collection.Seq[Fragment]): Fragment = { - Fragment.comma(fragments.asJava) - } -} diff --git a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/ScalaDbTypes.scala b/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/ScalaDbTypes.scala deleted file mode 100644 index cb3e95016d..0000000000 --- a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/ScalaDbTypes.scala +++ /dev/null @@ -1,186 +0,0 @@ -package dev.typr.foundations.scala - -import dev.typr.foundations.* - -import _root_.scala.jdk.CollectionConverters.* - -/** Scala-friendly DbType instances that use Scala types instead of Java boxed types. 
- */ -object ScalaDbTypes { - object DuckDbTypes { - // Primitives - convert Java boxed types to Scala native types - val tinyint: DuckDbType[Byte] = dev.typr.foundations.DuckDbTypes.tinyint.bimap(b => b, b => b) - val smallint: DuckDbType[Short] = dev.typr.foundations.DuckDbTypes.smallint.bimap(s => s, s => s) - val integer: DuckDbType[Int] = dev.typr.foundations.DuckDbTypes.integer.bimap(i => i, i => i) - val bigint: DuckDbType[Long] = dev.typr.foundations.DuckDbTypes.bigint.bimap(l => l, l => l) - val float_ : DuckDbType[Float] = dev.typr.foundations.DuckDbTypes.float_.bimap(f => f, f => f) - val double_ : DuckDbType[Double] = dev.typr.foundations.DuckDbTypes.double_.bimap(d => d, d => d) - val boolean_ : DuckDbType[Boolean] = dev.typr.foundations.DuckDbTypes.boolean_.bimap(b => b, b => b) - val bool: DuckDbType[Boolean] = dev.typr.foundations.DuckDbTypes.bool.bimap(b => b, b => b) - - // BigDecimal - convert Java BigDecimal to Scala BigDecimal - val decimal: DuckDbType[BigDecimal] = dev.typr.foundations.DuckDbTypes.decimal.bimap(jbd => BigDecimal(jbd), sbd => sbd.bigDecimal) - val numeric: DuckDbType[BigDecimal] = dev.typr.foundations.DuckDbTypes.numeric.bimap(jbd => BigDecimal(jbd), sbd => sbd.bigDecimal) - - // Array types - convert Java boxed arrays to Scala primitive arrays - val booleanArrayUnboxed: DuckDbType[Array[Boolean]] = dev.typr.foundations.DuckDbTypes.booleanArray.bimap( - arr => arr.map(_.booleanValue()), - arr => arr.map(java.lang.Boolean.valueOf) - ) - val tinyintArrayUnboxed: DuckDbType[Array[Byte]] = dev.typr.foundations.DuckDbTypes.tinyintArray.bimap( - arr => arr.map(_.byteValue()), - arr => arr.map(java.lang.Byte.valueOf) - ) - val smallintArrayUnboxed: DuckDbType[Array[Short]] = dev.typr.foundations.DuckDbTypes.smallintArray.bimap( - arr => arr.map(_.shortValue()), - arr => arr.map(java.lang.Short.valueOf) - ) - val integerArrayUnboxed: DuckDbType[Array[Int]] = dev.typr.foundations.DuckDbTypes.integerArray.bimap( - arr => 
arr.map(_.intValue()), - arr => arr.map(java.lang.Integer.valueOf) - ) - val bigintArrayUnboxed: DuckDbType[Array[Long]] = dev.typr.foundations.DuckDbTypes.bigintArray.bimap( - arr => arr.map(_.longValue()), - arr => arr.map(java.lang.Long.valueOf) - ) - val floatArrayUnboxed: DuckDbType[Array[Float]] = dev.typr.foundations.DuckDbTypes.floatArray.bimap( - arr => arr.map(_.floatValue()), - arr => arr.map(java.lang.Float.valueOf) - ) - val doubleArrayUnboxed: DuckDbType[Array[Double]] = dev.typr.foundations.DuckDbTypes.doubleArray.bimap( - arr => arr.map(_.doubleValue()), - arr => arr.map(java.lang.Double.valueOf) - ) - - // BigDecimal array - convert Java BigDecimal array to Scala BigDecimal array - val decimalArray: DuckDbType[Array[BigDecimal]] = dev.typr.foundations.DuckDbTypes.decimalArray.bimap( - arr => arr.map(BigDecimal(_)), - arr => arr.map(_.bigDecimal) - ) - val numericArray: DuckDbType[Array[BigDecimal]] = decimalArray - } - - object PgTypes { - // Primitives - convert Java boxed types to Scala native types - val bool: PgType[Boolean] = dev.typr.foundations.PgTypes.bool.bimap(b => b, b => b) - val int2: PgType[Short] = dev.typr.foundations.PgTypes.int2.bimap(s => s, s => s) - val smallint: PgType[Short] = dev.typr.foundations.PgTypes.smallint.bimap(s => s, s => s) - val int4: PgType[Int] = dev.typr.foundations.PgTypes.int4.bimap(i => i, i => i) - val int8: PgType[Long] = dev.typr.foundations.PgTypes.int8.bimap(l => l, l => l) - val float4: PgType[Float] = dev.typr.foundations.PgTypes.float4.bimap(f => f, f => f) - val float8: PgType[Double] = dev.typr.foundations.PgTypes.float8.bimap(d => d, d => d) - - // oid - 32-bit unsigned integer wrapped in Oid type - val oid: PgType[dev.typr.foundations.data.Oid] = dev.typr.foundations.PgTypes.oid - val oidArray: PgType[Array[dev.typr.foundations.data.Oid]] = dev.typr.foundations.PgTypes.oidArray - - // oidvector - vector of oids - val oidvector: PgType[dev.typr.foundations.data.OidVector] = 
dev.typr.foundations.PgTypes.oidvector - val oidvectorArray: PgType[Array[dev.typr.foundations.data.OidVector]] = dev.typr.foundations.PgTypes.oidvectorArray - - // BigDecimal - convert Java BigDecimal to Scala BigDecimal - val numeric: PgType[BigDecimal] = dev.typr.foundations.PgTypes.numeric.bimap(jbd => BigDecimal(jbd), sbd => sbd.bigDecimal) - - // Collections - convert Java collections to Scala collections - val hstore: PgType[Map[String, String]] = dev.typr.foundations.PgTypes.hstore.bimap(javaMap => javaMap.asScala.toMap, scalaMap => scalaMap.asJava) - - // Array types - convert Java boxed arrays to Scala native arrays - val boolArray: PgType[Array[Boolean]] = dev.typr.foundations.PgTypes.boolArray.bimap( - arr => arr.map(_.booleanValue()), - arr => arr.map(java.lang.Boolean.valueOf) - ) - val int2Array: PgType[Array[Short]] = dev.typr.foundations.PgTypes.int2Array.bimap( - arr => arr.map(_.shortValue()), - arr => arr.map(java.lang.Short.valueOf) - ) - val smallintArray: PgType[Array[Short]] = int2Array - val int4Array: PgType[Array[Int]] = dev.typr.foundations.PgTypes.int4Array.bimap( - arr => arr.map(_.intValue()), - arr => arr.map(java.lang.Integer.valueOf) - ) - val int8Array: PgType[Array[Long]] = dev.typr.foundations.PgTypes.int8Array.bimap( - arr => arr.map(_.longValue()), - arr => arr.map(java.lang.Long.valueOf) - ) - val float4Array: PgType[Array[Float]] = dev.typr.foundations.PgTypes.float4Array.bimap( - arr => arr.map(_.floatValue()), - arr => arr.map(java.lang.Float.valueOf) - ) - val float8Array: PgType[Array[Double]] = dev.typr.foundations.PgTypes.float8Array.bimap( - arr => arr.map(_.doubleValue()), - arr => arr.map(java.lang.Double.valueOf) - ) - val numericArray: PgType[Array[BigDecimal]] = dev.typr.foundations.PgTypes.numericArray.bimap( - arr => arr.map(BigDecimal(_)), - arr => arr.map(_.bigDecimal) - ) - } - - object MariaTypes { - // Primitives - convert Java boxed types to Scala native types - val tinyint: MariaType[Byte] = 
dev.typr.foundations.MariaTypes.tinyint.bimap(b => b, b => b) - val smallint: MariaType[Short] = dev.typr.foundations.MariaTypes.smallint.bimap(s => s, s => s) - val mediumint: MariaType[Int] = dev.typr.foundations.MariaTypes.mediumint.bimap(i => i, i => i) - val int_ : MariaType[Int] = dev.typr.foundations.MariaTypes.int_.bimap(i => i, i => i) - val bigint: MariaType[Long] = dev.typr.foundations.MariaTypes.bigint.bimap(l => l, l => l) - - // Floating point - val float_ : MariaType[Float] = dev.typr.foundations.MariaTypes.float_.bimap(f => f, f => f) - val double_ : MariaType[Double] = dev.typr.foundations.MariaTypes.double_.bimap(d => d, d => d) - - // BigDecimal - convert Java BigDecimal to Scala BigDecimal - val numeric: MariaType[BigDecimal] = dev.typr.foundations.MariaTypes.numeric.bimap(jbd => BigDecimal(jbd), sbd => sbd.bigDecimal) - - // Boolean - val bool: MariaType[Boolean] = dev.typr.foundations.MariaTypes.bool.bimap(b => b, b => b) - val bit1: MariaType[Boolean] = dev.typr.foundations.MariaTypes.bit1.bimap(b => b, b => b) - } - - object OracleTypes { - // BigDecimal - convert Java BigDecimal to Scala BigDecimal (Oracle NUMBER type) - val number: OracleType[BigDecimal] = dev.typr.foundations.OracleTypes.number.bimap(jbd => BigDecimal(jbd), sbd => sbd.bigDecimal) - - // Floating point primitives - no conversion needed - val binaryFloat: OracleType[Float] = dev.typr.foundations.OracleTypes.binaryFloat.bimap(f => f, f => f) - val binaryDouble: OracleType[Double] = dev.typr.foundations.OracleTypes.binaryDouble.bimap(d => d, d => d) - } - - object SqlServerTypes { - // Primitives - convert Java boxed types to Scala native types - val smallint: SqlServerType[Short] = dev.typr.foundations.SqlServerTypes.smallint.bimap(s => s, s => s) - val int_ : SqlServerType[Int] = dev.typr.foundations.SqlServerTypes.int_.bimap(i => i, i => i) - val bigint: SqlServerType[Long] = dev.typr.foundations.SqlServerTypes.bigint.bimap(l => l, l => l) - - // Floating point - val real: 
SqlServerType[Float] = dev.typr.foundations.SqlServerTypes.real.bimap(f => f, f => f) - val float_ : SqlServerType[Double] = dev.typr.foundations.SqlServerTypes.float_.bimap(d => d, d => d) - - // BigDecimal - convert Java BigDecimal to Scala BigDecimal - val decimal: SqlServerType[BigDecimal] = dev.typr.foundations.SqlServerTypes.decimal.bimap(jbd => BigDecimal(jbd), sbd => sbd.bigDecimal) - val numeric: SqlServerType[BigDecimal] = dev.typr.foundations.SqlServerTypes.numeric.bimap(jbd => BigDecimal(jbd), sbd => sbd.bigDecimal) - val money: SqlServerType[BigDecimal] = dev.typr.foundations.SqlServerTypes.money.bimap(jbd => BigDecimal(jbd), sbd => sbd.bigDecimal) - val smallmoney: SqlServerType[BigDecimal] = dev.typr.foundations.SqlServerTypes.smallmoney.bimap(jbd => BigDecimal(jbd), sbd => sbd.bigDecimal) - - // Boolean - val bit: SqlServerType[Boolean] = dev.typr.foundations.SqlServerTypes.bit.bimap(b => b, b => b) - } - - object Db2Types { - // Primitives - convert Java boxed types to Scala native types - val smallint: Db2Type[Short] = dev.typr.foundations.Db2Types.smallint.bimap(s => s, s => s) - val integer: Db2Type[Int] = dev.typr.foundations.Db2Types.integer.bimap(i => i, i => i) - val bigint: Db2Type[Long] = dev.typr.foundations.Db2Types.bigint.bimap(l => l, l => l) - - // Floating point - val real: Db2Type[Float] = dev.typr.foundations.Db2Types.real.bimap(f => f, f => f) - val double_ : Db2Type[Double] = dev.typr.foundations.Db2Types.double_.bimap(d => d, d => d) - - // BigDecimal - convert Java BigDecimal to Scala BigDecimal - val decimal: Db2Type[BigDecimal] = dev.typr.foundations.Db2Types.decimal.bimap(jbd => BigDecimal(jbd), sbd => sbd.bigDecimal) - val numeric: Db2Type[BigDecimal] = dev.typr.foundations.Db2Types.numeric.bimap(jbd => BigDecimal(jbd), sbd => sbd.bigDecimal) - val decfloat: Db2Type[BigDecimal] = dev.typr.foundations.Db2Types.decfloat.bimap(jbd => BigDecimal(jbd), sbd => sbd.bigDecimal) - - // Boolean - val boolean_ : Db2Type[Boolean] = 
dev.typr.foundations.Db2Types.boolean_.bimap(b => b, b => b) - } -} diff --git a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/StaticExports.scala b/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/StaticExports.scala deleted file mode 100644 index dd7765d26f..0000000000 --- a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/StaticExports.scala +++ /dev/null @@ -1,177 +0,0 @@ -package dev.typr.foundations.scala - -import _root_.scala.jdk.CollectionConverters.* -import _root_.scala.jdk.OptionConverters.* - -object StaticExports { - - // Type aliases for runtime types - type Either[L, R] = dev.typr.foundations.Either[L, R] - type And[T1, T2] = dev.typr.foundations.And[T1, T2] - - // PostgreSQL data wrapper types - type Json = dev.typr.foundations.data.Json - type Jsonb = dev.typr.foundations.data.Jsonb - type Money = dev.typr.foundations.data.Money - type Xml = dev.typr.foundations.data.Xml - type Vector = dev.typr.foundations.data.Vector - type Record = dev.typr.foundations.data.Record - type Unknown = dev.typr.foundations.data.Unknown - type Xid = dev.typr.foundations.data.Xid - type Inet = dev.typr.foundations.data.Inet - type AclItem = dev.typr.foundations.data.AclItem - type AnyArray = dev.typr.foundations.data.AnyArray - type Int2Vector = dev.typr.foundations.data.Int2Vector - type OidVector = dev.typr.foundations.data.OidVector - type PgNodeTree = dev.typr.foundations.data.PgNodeTree - - // Regclass types - type Regclass = dev.typr.foundations.data.Regclass - type Regconfig = dev.typr.foundations.data.Regconfig - type Regdictionary = dev.typr.foundations.data.Regdictionary - type Regnamespace = dev.typr.foundations.data.Regnamespace - type Regoper = dev.typr.foundations.data.Regoper - type Regoperator = dev.typr.foundations.data.Regoperator - type Regproc = dev.typr.foundations.data.Regproc - type Regprocedure = dev.typr.foundations.data.Regprocedure - type Regrole = dev.typr.foundations.data.Regrole - type 
Regtype = dev.typr.foundations.data.Regtype - - // Range types - type Range[T <: Comparable[T]] = dev.typr.foundations.data.Range[T] - type RangeBound[T <: Comparable[T]] = dev.typr.foundations.data.RangeBound[T] - type RangeFinite[T <: Comparable[T]] = dev.typr.foundations.data.RangeFinite[T] - - // Array type - type Arr[A] = dev.typr.foundations.data.Arr[A] - - // Core type system - type PgType[A] = dev.typr.foundations.PgType[A] - type PgTypename[A] = dev.typr.foundations.PgTypename[A] - type PgRead[A] = dev.typr.foundations.PgRead[A] - type PgWrite[A] = dev.typr.foundations.PgWrite[A] - type PgText[A] = dev.typr.foundations.PgText[A] - - // Database access types - type Transactor = dev.typr.foundations.Transactor - - // Functional interfaces - type SqlFunction[T, R] = dev.typr.foundations.SqlFunction[T, R] - type SqlConsumer[T] = dev.typr.foundations.SqlConsumer[T] - type SqlBiConsumer[T1, T2] = dev.typr.foundations.SqlBiConsumer[T1, T2] - - // Utility - type ByteArrays = dev.typr.foundations.internal.ByteArrays - type ArrParser = dev.typr.foundations.ArrParser - - object Types { - // Primitive types - val bool: PgType[java.lang.Boolean] = dev.typr.foundations.PgTypes.bool - val int2: PgType[java.lang.Short] = dev.typr.foundations.PgTypes.int2 - val int4: PgType[java.lang.Integer] = dev.typr.foundations.PgTypes.int4 - val int8: PgType[java.lang.Long] = dev.typr.foundations.PgTypes.int8 - val float4: PgType[java.lang.Float] = dev.typr.foundations.PgTypes.float4 - val float8: PgType[java.lang.Double] = dev.typr.foundations.PgTypes.float8 - val numeric: PgType[java.math.BigDecimal] = dev.typr.foundations.PgTypes.numeric - val text: PgType[String] = dev.typr.foundations.PgTypes.text - val bytea: PgType[Array[Byte]] = dev.typr.foundations.PgTypes.bytea - val uuid: PgType[java.util.UUID] = dev.typr.foundations.PgTypes.uuid - - // Date/time types - val date: PgType[java.time.LocalDate] = dev.typr.foundations.PgTypes.date - val time: PgType[java.time.LocalTime] = 
dev.typr.foundations.PgTypes.time - val timestamp: PgType[java.time.LocalDateTime] = dev.typr.foundations.PgTypes.timestamp - val timestamptz: PgType[java.time.Instant] = dev.typr.foundations.PgTypes.timestamptz - val interval: PgType[org.postgresql.util.PGInterval] = dev.typr.foundations.PgTypes.interval - - // JSON types - val json: PgType[Json] = dev.typr.foundations.PgTypes.json - val jsonb: PgType[Jsonb] = dev.typr.foundations.PgTypes.jsonb - - // Other types - val xml: PgType[Xml] = dev.typr.foundations.PgTypes.xml - val money: PgType[Money] = dev.typr.foundations.PgTypes.money - val inet: PgType[Inet] = dev.typr.foundations.PgTypes.inet - val vector: PgType[Vector] = dev.typr.foundations.PgTypes.vector - val xid: PgType[Xid] = dev.typr.foundations.PgTypes.xid - - // Reg* types - val regclass: PgType[Regclass] = dev.typr.foundations.PgTypes.regclass - val regconfig: PgType[Regconfig] = dev.typr.foundations.PgTypes.regconfig - val regdictionary: PgType[Regdictionary] = dev.typr.foundations.PgTypes.regdictionary - val regnamespace: PgType[Regnamespace] = dev.typr.foundations.PgTypes.regnamespace - val regoper: PgType[Regoper] = dev.typr.foundations.PgTypes.regoper - val regoperator: PgType[Regoperator] = dev.typr.foundations.PgTypes.regoperator - val regproc: PgType[Regproc] = dev.typr.foundations.PgTypes.regproc - val regprocedure: PgType[Regprocedure] = dev.typr.foundations.PgTypes.regprocedure - val regrole: PgType[Regrole] = dev.typr.foundations.PgTypes.regrole - val regtype: PgType[Regtype] = dev.typr.foundations.PgTypes.regtype - - // Factory methods - def ofEnum[E <: Enum[E]](name: String, fromString: String => E): PgType[E] = { - dev.typr.foundations.PgTypes.ofEnum(name, s => fromString(s)) - } - - def ofPgObject[T]( - sqlType: String, - constructor: String => T, - extractor: T => String, - json: dev.typr.foundations.PgJson[T] - ): PgType[T] = { - dev.typr.foundations.PgTypes.ofPgObject(sqlType, s => constructor(s), t => extractor(t), json) - } - - def 
bpchar(precision: Int): PgType[String] = { - dev.typr.foundations.PgTypes.bpchar(precision) - } - - def record(typename: String): PgType[Record] = { - dev.typr.foundations.PgTypes.record(typename) - } - } - - // Export Fragment and Operation wrappers - export dev.typr.foundations.scala.Fragment - export dev.typr.foundations.scala.Fragment.{sql as _, *} - export dev.typr.foundations.scala.Operation - - object Parsers { - def all[Out](rowParser: dev.typr.foundations.RowParser[Out]): dev.typr.foundations.scala.ResultSetParser[List[Out]] = { - val javaParser = new dev.typr.foundations.ResultSetParser.All(rowParser) - new dev.typr.foundations.scala.ResultSetParser(new dev.typr.foundations.ResultSetParser[List[Out]] { - override def apply(rs: java.sql.ResultSet): List[Out] = javaParser.apply(rs).asScala.toList - }) - } - - def first[Out](rowParser: dev.typr.foundations.RowParser[Out]): dev.typr.foundations.scala.ResultSetParser[Option[Out]] = { - val javaParser = new dev.typr.foundations.ResultSetParser.First(rowParser) - new dev.typr.foundations.scala.ResultSetParser(new dev.typr.foundations.ResultSetParser[Option[Out]] { - override def apply(rs: java.sql.ResultSet): Option[Out] = javaParser.apply(rs).toScala - }) - } - - def maxOne[Out](rowParser: dev.typr.foundations.RowParser[Out]): dev.typr.foundations.scala.ResultSetParser[Option[Out]] = { - val javaParser = new dev.typr.foundations.ResultSetParser.MaxOne(rowParser) - new dev.typr.foundations.scala.ResultSetParser(new dev.typr.foundations.ResultSetParser[Option[Out]] { - override def apply(rs: java.sql.ResultSet): Option[Out] = javaParser.apply(rs).toScala - }) - } - - def exactlyOne[Out](rowParser: dev.typr.foundations.RowParser[Out]): dev.typr.foundations.scala.ResultSetParser[Out] = { - val javaParser = new dev.typr.foundations.ResultSetParser.ExactlyOne(rowParser) - new dev.typr.foundations.scala.ResultSetParser(javaParser) - } - - def foreach[Out](rowParser: dev.typr.foundations.RowParser[Out], consumer: Out 
=> Unit): dev.typr.foundations.scala.ResultSetParser[Unit] = { - val foreachParser = new dev.typr.foundations.ResultSetParser.Foreach(rowParser, c => consumer(c)) - // Wrap Void-returning parser to return Unit - val unitParser = new dev.typr.foundations.ResultSetParser[Unit] { - override def apply(rs: java.sql.ResultSet): Unit = { - foreachParser.apply(rs) - () - } - } - new dev.typr.foundations.scala.ResultSetParser(unitParser) - } - } -} diff --git a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/package.scala b/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/package.scala deleted file mode 100644 index 68a3a40416..0000000000 --- a/foundations-jdbc-dsl-scala/src/scala/dev/typr/foundations/scala/package.scala +++ /dev/null @@ -1,81 +0,0 @@ -package dev.typr.foundations - -package object scala { - // Type aliases for typr.dsl types - type Dialect = dev.typr.foundations.dsl.Dialect - type RenderCtx = dev.typr.foundations.dsl.RenderCtx - type GroupedBuilder[Fields, Rows] = dev.typr.foundations.dsl.GroupedBuilder[Fields, Rows] - type SortOrder[T] = dev.typr.foundations.dsl.SortOrder[T] - type Path = dev.typr.foundations.dsl.Path - type FieldsBase[Row] = dev.typr.foundations.dsl.FieldsBase[Row] - type FieldsExpr[Row] = dev.typr.foundations.dsl.FieldsExpr[Row] - type Bijection[Wrapper, Underlying] = dev.typr.foundations.dsl.Bijection[Wrapper, Underlying] - - // Type aliases for dev.typr.foundations types - type PgType[A] = dev.typr.foundations.PgType[A] - type PgTypes = dev.typr.foundations.PgTypes - - // Type aliases for mock builders - type DeleteBuilderMock[Id, Fields, Row] = dev.typr.foundations.dsl.DeleteBuilderMock[Id, Fields, Row] - type SelectBuilderMock[Fields, Row] = dev.typr.foundations.dsl.SelectBuilderMock[Fields, Row] - type UpdateBuilderMock[Fields, Row] = dev.typr.foundations.dsl.UpdateBuilderMock[Fields, Row] - - // Type aliases for params types - type DeleteParams[Fields] = dev.typr.foundations.dsl.DeleteParams[Fields] 
- type SelectParams[Fields, Row] = dev.typr.foundations.dsl.SelectParams[Fields, Row] - type UpdateParams[Fields, Row] = dev.typr.foundations.dsl.UpdateParams[Fields, Row] - - // Companion objects for params types with factory methods - object DeleteParams { - def empty[Fields](): dev.typr.foundations.dsl.DeleteParams[Fields] = dev.typr.foundations.dsl.DeleteParams.empty[Fields]() - } - - object SelectParams { - def empty[Fields, Row](): dev.typr.foundations.dsl.SelectParams[Fields, Row] = dev.typr.foundations.dsl.SelectParams.empty[Fields, Row]() - } - - object UpdateParams { - def empty[Fields, Row](): dev.typr.foundations.dsl.UpdateParams[Fields, Row] = dev.typr.foundations.dsl.UpdateParams.empty[Fields, Row]() - } - - // Companion objects for mock builders with factory methods - object DeleteBuilderMock { - def apply[Id, Fields, Row]( - structure: dev.typr.foundations.dsl.RelationStructure[Fields, Row], - allRowsSupplier: () => List[Row], - params: DeleteParams[Fields], - idExtractor: Row => Id, - deleteById: Id => Unit - ): DeleteBuilder[Fields, Row] = - DslExports.DeleteBuilderMock(structure, allRowsSupplier, params, idExtractor, deleteById) - } - - object SelectBuilderMock { - def apply[Fields, Row]( - structure: dev.typr.foundations.dsl.RelationStructure[Fields, Row], - allRowsSupplier: () => List[Row], - params: SelectParams[Fields, Row] - ): SelectBuilder[Fields, Row] = - DslExports.SelectBuilderMock(structure, allRowsSupplier, params) - } - - object UpdateBuilderMock { - def apply[Fields, Row]( - structure: dev.typr.foundations.dsl.RelationStructure[Fields, Row], - allRowsSupplier: () => List[Row], - params: UpdateParams[Fields, Row], - copyRow: Row => Row - ): UpdateBuilder[Fields, Row] = - DslExports.UpdateBuilderMock(structure, allRowsSupplier, params, copyRow) - } - - // Dialect object exposing the Java static fields - object Dialect { - val POSTGRESQL: dev.typr.foundations.dsl.Dialect = dev.typr.foundations.dsl.Dialect.POSTGRESQL - val MARIADB: 
dev.typr.foundations.dsl.Dialect = dev.typr.foundations.dsl.Dialect.MARIADB - val DUCKDB: dev.typr.foundations.dsl.Dialect = dev.typr.foundations.dsl.Dialect.DUCKDB - val ORACLE: dev.typr.foundations.dsl.Dialect = dev.typr.foundations.dsl.Dialect.ORACLE - val SQLSERVER: dev.typr.foundations.dsl.Dialect = dev.typr.foundations.dsl.Dialect.SQLSERVER - val DB2: dev.typr.foundations.dsl.Dialect = dev.typr.foundations.dsl.Dialect.DB2 - } -} diff --git a/foundations-jdbc-dsl/build.gradle.kts b/foundations-jdbc-dsl/build.gradle.kts deleted file mode 100644 index 5ff496f78c..0000000000 --- a/foundations-jdbc-dsl/build.gradle.kts +++ /dev/null @@ -1,26 +0,0 @@ -plugins { - `java-library` -} - -java { - toolchain { - languageVersion.set(JavaLanguageVersion.of(21)) - } -} - -sourceSets { - main { - java { - srcDirs("src/java", "../.bleep/generated-sources/foundations-jdbc-dsl/scripts.GeneratedTuples") - } - } -} - -dependencies { - api(project(":foundations-jdbc")) -} - -tasks.withType { - options.compilerArgs.addAll(listOf("-proc:none")) - options.release.set(21) -} diff --git a/foundations-jdbc-hikari/src/java/dev/typr/foundations/hikari/HikariDataSourceFactory.java b/foundations-jdbc-hikari/src/java/dev/typr/foundations/hikari/HikariDataSourceFactory.java deleted file mode 100644 index a71589d876..0000000000 --- a/foundations-jdbc-hikari/src/java/dev/typr/foundations/hikari/HikariDataSourceFactory.java +++ /dev/null @@ -1,142 +0,0 @@ -package dev.typr.foundations.hikari; - -import com.zaxxer.hikari.HikariConfig; -import com.zaxxer.hikari.HikariDataSource; -import dev.typr.foundations.connect.ConnectionSettings; -import dev.typr.foundations.connect.DatabaseConfig; - -/** - * Factory for creating PooledDataSource instances. - * - *

<p>Example usage: - * - * <pre>{@code
- * // With connection settings
- * var ds = HikariDataSourceFactory.create(
- *     PostgresConfig.builder("localhost", 5432, "mydb", "user", "pass").build(),
- *     ConnectionSettings.builder()
- *         .transactionIsolation(TransactionIsolation.READ_UNCOMMITTED)
- *         .build(),
- *     PoolConfig.builder()
- *         .maximumPoolSize(20)
- *         .build());
- *
- * var tx = ds.transactor();
- * tx.execute(conn -> repo.selectAll(conn));
- * }</pre>
- */ -public final class HikariDataSourceFactory { - - private HikariDataSourceFactory() {} - - /** - * Create a PooledDataSource with connection settings and pool configuration. - * - * @param config database configuration (URL, credentials, driver properties) - * @param settings connection settings (isolation, autoCommit, readOnly, etc.) - * @param pool pool configuration (sizing, timeouts, etc.) - * @return configured PooledDataSource - */ - public static PooledDataSource create( - DatabaseConfig config, ConnectionSettings settings, PoolConfig pool) { - HikariConfig hikari = new HikariConfig(); - - // Connection settings from DatabaseConfig - hikari.setJdbcUrl(config.jdbcUrl()); - hikari.setUsername(config.username()); - hikari.setPassword(config.password()); - - // Driver properties from DatabaseConfig - config.driverProperties().forEach(hikari::addDataSourceProperty); - - // Pool sizing - hikari.setMaximumPoolSize(pool.maximumPoolSize()); - hikari.setMinimumIdle(pool.minimumIdle()); - - // Timeouts - hikari.setConnectionTimeout(pool.connectionTimeout().toMillis()); - hikari.setValidationTimeout(pool.validationTimeout().toMillis()); - hikari.setIdleTimeout(pool.idleTimeout().toMillis()); - hikari.setMaxLifetime(pool.maxLifetime().toMillis()); - hikari.setKeepaliveTime(pool.keepaliveTime().toMillis()); - hikari.setLeakDetectionThreshold(pool.leakDetectionThreshold().toMillis()); - - // Connection settings - if (settings.transactionIsolation() != null) { - hikari.setTransactionIsolation(settings.transactionIsolation().jdbcName()); - } - if (settings.autoCommit() != null) { - hikari.setAutoCommit(settings.autoCommit()); - } - if (settings.readOnly() != null) { - hikari.setReadOnly(settings.readOnly()); - } - if (settings.catalog() != null) { - hikari.setCatalog(settings.catalog()); - } - if (settings.schema() != null) { - hikari.setSchema(settings.schema()); - } - if (settings.connectionInitSql() != null) { - 
hikari.setConnectionInitSql(settings.connectionInitSql()); - } - - // Pool-specific: connection test query - if (pool.connectionTestQuery() != null) { - hikari.setConnectionTestQuery(pool.connectionTestQuery()); - } - - // Pool naming - if (pool.poolName() != null) { - hikari.setPoolName(pool.poolName()); - } - - // Advanced - if (pool.registerMbeans() != null) { - hikari.setRegisterMbeans(pool.registerMbeans()); - } - if (pool.allowPoolSuspension() != null) { - hikari.setAllowPoolSuspension(pool.allowPoolSuspension()); - } - if (pool.isolateInternalQueries() != null) { - hikari.setIsolateInternalQueries(pool.isolateInternalQueries()); - } - - // Extra properties - pool.extraProperties().forEach(hikari::addDataSourceProperty); - - return new PooledDataSource(new HikariDataSource(hikari)); - } - - /** - * Create a PooledDataSource with connection settings and default pool configuration. - * - * @param config database configuration - * @param settings connection settings - * @return configured PooledDataSource - */ - public static PooledDataSource create(DatabaseConfig config, ConnectionSettings settings) { - return create(config, settings, PoolConfig.defaults()); - } - - /** - * Create a PooledDataSource with default settings. - * - * @param config database configuration - * @return configured PooledDataSource with driver defaults - */ - public static PooledDataSource create(DatabaseConfig config) { - return create(config, ConnectionSettings.EMPTY, PoolConfig.defaults()); - } - - /** - * Create a PooledDataSource with pool configuration but default connection settings. 
- * - * @param config database configuration - * @param pool pool configuration - * @return configured PooledDataSource - */ - public static PooledDataSource create(DatabaseConfig config, PoolConfig pool) { - return create(config, ConnectionSettings.EMPTY, pool); - } -} diff --git a/foundations-jdbc-hikari/src/java/dev/typr/foundations/hikari/PoolConfig.java b/foundations-jdbc-hikari/src/java/dev/typr/foundations/hikari/PoolConfig.java deleted file mode 100644 index f356756c91..0000000000 --- a/foundations-jdbc-hikari/src/java/dev/typr/foundations/hikari/PoolConfig.java +++ /dev/null @@ -1,340 +0,0 @@ -package dev.typr.foundations.hikari; - -import java.time.Duration; -import java.util.HashMap; -import java.util.Map; - -/** - * HikariCP connection pool configuration with typed builder methods for pool-specific properties. - * - *
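Editor's note: the deleted `HikariDataSourceFactory.create` above follows two translation rules when mapping config onto `HikariConfig`: `Duration` fields are converted to millisecond longs, and nullable `Boolean`/`String` settings are forwarded only when non-null so HikariCP's own defaults stay in effect. A minimal stdlib-only sketch of that pattern (the `SettingsSketch` class and its map-based output are hypothetical, not part of foundations):

```java
import java.time.Duration;
import java.util.LinkedHashMap;
import java.util.Map;

/** Hypothetical sketch of the factory's translation rules. */
public final class SettingsSketch {
    private SettingsSketch() {}

    public static Map<String, Object> apply(Duration connectionTimeout,
                                            Boolean autoCommit,
                                            String schema) {
        Map<String, Object> applied = new LinkedHashMap<>();
        // Durations always become millisecond longs (Hikari setters take long).
        applied.put("connectionTimeout", connectionTimeout.toMillis());
        // Nullable settings are only forwarded when explicitly set.
        if (autoCommit != null) applied.put("autoCommit", autoCommit);
        if (schema != null) applied.put("schema", schema);
        return applied;
    }
}
```

With `Duration.ZERO` this yields `0`, which HikariCP documents as "disabled" for keepalive and leak detection.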

For connection defaults (transaction isolation, auto-commit, read-only, etc.), use {@link - * dev.typr.foundations.connect.DatabaseConfig#withDefaults} instead. Those settings apply to both - * pooled and non-pooled connections. - * - * @see HikariCP - * Documentation - */ -public final class PoolConfig { - - // Pool sizing - private final int maximumPoolSize; - private final int minimumIdle; - - // Timeouts - private final Duration connectionTimeout; - private final Duration validationTimeout; - private final Duration idleTimeout; - private final Duration maxLifetime; - private final Duration keepaliveTime; - private final Duration leakDetectionThreshold; - - // Pool-specific connection settings - private final String connectionTestQuery; - - // Pool naming - private final String poolName; - - // Advanced - private final Boolean registerMbeans; - private final Boolean allowPoolSuspension; - private final Boolean isolateInternalQueries; - - // Escape hatch - private final Map extraProperties; - - private PoolConfig(Builder b) { - this.maximumPoolSize = b.maximumPoolSize; - this.minimumIdle = b.minimumIdle; - - this.connectionTimeout = b.connectionTimeout; - this.validationTimeout = b.validationTimeout; - this.idleTimeout = b.idleTimeout; - this.maxLifetime = b.maxLifetime; - this.keepaliveTime = b.keepaliveTime; - this.leakDetectionThreshold = b.leakDetectionThreshold; - - this.connectionTestQuery = b.connectionTestQuery; - - this.poolName = b.poolName; - - this.registerMbeans = b.registerMbeans; - this.allowPoolSuspension = b.allowPoolSuspension; - this.isolateInternalQueries = b.isolateInternalQueries; - - this.extraProperties = Map.copyOf(b.extraProperties); - } - - /** - * Create a new builder with sensible defaults. - * - * @return a new builder - */ - public static Builder builder() { - return new Builder(); - } - - /** Create a PoolConfig with all default values. 
*/ - public static PoolConfig defaults() { - return new Builder().build(); - } - - // Getters for HikariDataSourceFactory - - public int maximumPoolSize() { - return maximumPoolSize; - } - - public int minimumIdle() { - return minimumIdle; - } - - public Duration connectionTimeout() { - return connectionTimeout; - } - - public Duration validationTimeout() { - return validationTimeout; - } - - public Duration idleTimeout() { - return idleTimeout; - } - - public Duration maxLifetime() { - return maxLifetime; - } - - public Duration keepaliveTime() { - return keepaliveTime; - } - - public Duration leakDetectionThreshold() { - return leakDetectionThreshold; - } - - public String connectionTestQuery() { - return connectionTestQuery; - } - - public String poolName() { - return poolName; - } - - public Boolean registerMbeans() { - return registerMbeans; - } - - public Boolean allowPoolSuspension() { - return allowPoolSuspension; - } - - public Boolean isolateInternalQueries() { - return isolateInternalQueries; - } - - public Map extraProperties() { - return extraProperties; - } - - /** Builder for PoolConfig with typed methods for pool-specific HikariCP properties. 
*/ - public static final class Builder { - // Pool sizing - defaults from HikariCP - private int maximumPoolSize = 10; - private int minimumIdle = 10; - - // Timeouts - defaults from HikariCP - private Duration connectionTimeout = Duration.ofSeconds(30); - private Duration validationTimeout = Duration.ofSeconds(5); - private Duration idleTimeout = Duration.ofMinutes(10); - private Duration maxLifetime = Duration.ofMinutes(30); - private Duration keepaliveTime = Duration.ZERO; - private Duration leakDetectionThreshold = Duration.ZERO; - - // Pool-specific connection settings - private String connectionTestQuery = null; - - // Pool naming - private String poolName = null; - - // Advanced - private Boolean registerMbeans = null; - private Boolean allowPoolSuspension = null; - private Boolean isolateInternalQueries = null; - - private final Map extraProperties = new HashMap<>(); - - private Builder() {} - - // ==================== POOL SIZING ==================== - - /** - * Maximum number of connections in the pool. Default: 10. - * - * @param maximumPoolSize max connections - * @return this builder - */ - public Builder maximumPoolSize(int maximumPoolSize) { - this.maximumPoolSize = maximumPoolSize; - return this; - } - - /** - * Minimum number of idle connections to maintain. Default: same as maximumPoolSize. - * - * @param minimumIdle min idle connections - * @return this builder - */ - public Builder minimumIdle(int minimumIdle) { - this.minimumIdle = minimumIdle; - return this; - } - - // ==================== TIMEOUTS ==================== - - /** - * Maximum time to wait for a connection from the pool. Default: 30 seconds. - * - * @param connectionTimeout timeout duration - * @return this builder - */ - public Builder connectionTimeout(Duration connectionTimeout) { - this.connectionTimeout = connectionTimeout; - return this; - } - - /** - * Maximum time to wait for connection validation. Default: 5 seconds. 
- * - * @param validationTimeout timeout duration - * @return this builder - */ - public Builder validationTimeout(Duration validationTimeout) { - this.validationTimeout = validationTimeout; - return this; - } - - /** - * Maximum time a connection can sit idle before being evicted. Default: 10 minutes. - * - * @param idleTimeout timeout duration - * @return this builder - */ - public Builder idleTimeout(Duration idleTimeout) { - this.idleTimeout = idleTimeout; - return this; - } - - /** - * Maximum lifetime of a connection in the pool. Default: 30 minutes. - * - * @param maxLifetime maximum lifetime - * @return this builder - */ - public Builder maxLifetime(Duration maxLifetime) { - this.maxLifetime = maxLifetime; - return this; - } - - /** - * Interval for connection keepalive queries. Default: 0 (disabled). - * - * @param keepaliveTime keepalive interval - * @return this builder - */ - public Builder keepaliveTime(Duration keepaliveTime) { - this.keepaliveTime = keepaliveTime; - return this; - } - - /** - * Threshold for connection leak detection. Default: 0 (disabled). - * - * @param leakDetectionThreshold detection threshold - * @return this builder - */ - public Builder leakDetectionThreshold(Duration leakDetectionThreshold) { - this.leakDetectionThreshold = leakDetectionThreshold; - return this; - } - - /** - * SQL to execute for connection validation (prefer isValid() when possible). Default: null. - * - * @param connectionTestQuery test query - * @return this builder - */ - public Builder connectionTestQuery(String connectionTestQuery) { - this.connectionTestQuery = connectionTestQuery; - return this; - } - - // ==================== POOL NAMING ==================== - - /** - * Name for the connection pool (for JMX and logging). Default: auto-generated. 
- * - * @param poolName pool name - * @return this builder - */ - public Builder poolName(String poolName) { - this.poolName = poolName; - return this; - } - - // ==================== ADVANCED ==================== - - /** - * Register pool with JMX. Default: false. - * - * @param registerMbeans true to register - * @return this builder - */ - public Builder registerMbeans(boolean registerMbeans) { - this.registerMbeans = registerMbeans; - return this; - } - - /** - * Allow pool suspension for maintenance. Default: false. - * - * @param allowPoolSuspension true to allow - * @return this builder - */ - public Builder allowPoolSuspension(boolean allowPoolSuspension) { - this.allowPoolSuspension = allowPoolSuspension; - return this; - } - - /** - * Isolate internal HikariCP queries. Default: false. - * - * @param isolateInternalQueries true to isolate - * @return this builder - */ - public Builder isolateInternalQueries(boolean isolateInternalQueries) { - this.isolateInternalQueries = isolateInternalQueries; - return this; - } - - /** - * Set an arbitrary HikariCP property. - * - * @param key property name - * @param value property value - * @return this builder - */ - public Builder property(String key, String value) { - this.extraProperties.put(key, value); - return this; - } - - /** - * Build the PoolConfig. 
- * - * @return immutable PoolConfig - */ - public PoolConfig build() { - return new PoolConfig(this); - } - } -} diff --git a/foundations-jdbc-hikari/src/java/dev/typr/foundations/hikari/PooledDataSource.java b/foundations-jdbc-hikari/src/java/dev/typr/foundations/hikari/PooledDataSource.java deleted file mode 100644 index fb4b7770fe..0000000000 --- a/foundations-jdbc-hikari/src/java/dev/typr/foundations/hikari/PooledDataSource.java +++ /dev/null @@ -1,102 +0,0 @@ -package dev.typr.foundations.hikari; - -import com.zaxxer.hikari.HikariDataSource; -import dev.typr.foundations.Transactor; -import dev.typr.foundations.Transactor.Strategy; -import dev.typr.foundations.connect.ConnectionSource; -import java.io.Closeable; -import java.sql.Connection; -import java.sql.SQLException; -import javax.sql.DataSource; - -/** - * A pooled connection source using HikariCP. - * - *
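Editor's note: the deleted `PoolConfig` above is built by snapshotting builder state in a private constructor, with `Map.copyOf` taking an immutable copy of `extraProperties` so later builder mutation cannot leak into a built config. A reduced sketch of that shape (class and field names here are illustrative, not the foundations API):

```java
import java.util.HashMap;
import java.util.Map;

/** Hypothetical reduced form of PoolConfig's builder-snapshot pattern. */
public final class MiniPoolConfig {
    private final int maximumPoolSize;
    private final Map<String, String> extraProperties;

    private MiniPoolConfig(Builder b) {
        this.maximumPoolSize = b.maximumPoolSize;
        // Immutable snapshot: later Builder.property calls don't affect this config.
        this.extraProperties = Map.copyOf(b.extraProperties);
    }

    public int maximumPoolSize() { return maximumPoolSize; }
    public Map<String, String> extraProperties() { return extraProperties; }

    public static Builder builder() { return new Builder(); }

    public static final class Builder {
        private int maximumPoolSize = 10; // HikariCP default
        private final Map<String, String> extraProperties = new HashMap<>();

        public Builder maximumPoolSize(int n) { this.maximumPoolSize = n; return this; }
        public Builder property(String k, String v) { extraProperties.put(k, v); return this; }
        public MiniPoolConfig build() { return new MiniPoolConfig(this); }
    }
}
```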

<p>This class wraps a HikariDataSource and implements {@link ConnectionSource} for a unified API - * with {@link dev.typr.foundations.connect.SimpleDataSource}. - * - *

<p>Example usage: - * - * <pre>{@code
- * var ds = HikariDataSourceFactory.create(
- *     PostgresConfig.builder("localhost", 5432, "mydb", "user", "pass").build(),
- *     ConnectionSettings.builder()
- *         .transactionIsolation(TransactionIsolation.READ_UNCOMMITTED)
- *         .build(),
- *     PoolConfig.builder()
- *         .maximumPoolSize(20)
- *         .build());
- *
- * var tx = ds.transactor();
- * tx.execute(conn -> repo.selectAll(conn));
- * }</pre>
- */ -public final class PooledDataSource implements ConnectionSource, Closeable { - - private final HikariDataSource dataSource; - - PooledDataSource(HikariDataSource dataSource) { - this.dataSource = dataSource; - } - - /** - * Get the underlying HikariDataSource. - * - * @return the wrapped HikariDataSource - */ - public HikariDataSource unwrap() { - return dataSource; - } - - /** - * Get this as a standard JDBC DataSource. - * - * @return this as DataSource - */ - public DataSource asDataSource() { - return dataSource; - } - - @Override - public Connection getConnection() throws SQLException { - return dataSource.getConnection(); - } - - @Override - public Transactor transactor() { - return ConnectionSource.super.transactor(); - } - - @Override - public Transactor transactor(Strategy strategy) { - return ConnectionSource.super.transactor(strategy); - } - - /** - * Close the underlying connection pool. - * - *

This will close all connections in the pool and release resources. - */ - @Override - public void close() { - dataSource.close(); - } - - /** - * Check if the pool is closed. - * - * @return true if the pool has been closed - */ - public boolean isClosed() { - return dataSource.isClosed(); - } - - /** - * Check if the pool is running (not suspended or closed). - * - * @return true if the pool is running - */ - public boolean isRunning() { - return dataSource.isRunning(); - } -} diff --git a/foundations-jdbc-scala/src/scala/dev/typr/foundations/scala/FragmentInterpolator.scala b/foundations-jdbc-scala/src/scala/dev/typr/foundations/scala/FragmentInterpolator.scala deleted file mode 100644 index 728ece08ba..0000000000 --- a/foundations-jdbc-scala/src/scala/dev/typr/foundations/scala/FragmentInterpolator.scala +++ /dev/null @@ -1,55 +0,0 @@ -package dev.typr.foundations.scala - -import dev.typr.foundations.Fragment -import scala.collection.mutable.ListBuffer - -/** Scala string interpolator for creating SQL Fragments. 
- * - * Usage: - * {{{ - * import dev.typr.foundations.scala.FragmentInterpolator.interpolate - * - * val name = PgTypes.text.encode("Alice") - * val query = interpolate"SELECT * FROM users WHERE name = $name" - * }}} - */ -object FragmentInterpolator { - extension (sc: StringContext) { - def interpolate(args: Fragment*): Fragment = { - val parts = sc.parts.iterator - val frags = new ListBuffer[Fragment]() - - // Add first string part - if (parts.hasNext) { - val first = parts.next() - if (first.nonEmpty) { - frags += Fragment.lit(first) - } - } - - // Interleave remaining parts with args - val argsIt = args.iterator - while (parts.hasNext && argsIt.hasNext) { - frags += argsIt.next() - val part = parts.next() - if (part.nonEmpty) { - frags += Fragment.lit(part) - } - } - - // Handle any remaining args (shouldn't happen with valid interpolation) - while (argsIt.hasNext) { - frags += argsIt.next() - } - - frags.result() match { - case Nil => Fragment.empty() - case single :: Nil => single - case multiple => - val javaList = new java.util.ArrayList[Fragment](multiple.size) - multiple.foreach(javaList.add) - Fragment.Concat(javaList) - } - } - } -} diff --git a/foundations-jdbc-test/src/java/dev/typr/foundations/Db2TypeTest.java b/foundations-jdbc-test/src/java/dev/typr/foundations/Db2TypeTest.java deleted file mode 100644 index ac1a698464..0000000000 --- a/foundations-jdbc-test/src/java/dev/typr/foundations/Db2TypeTest.java +++ /dev/null @@ -1,490 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.JsonValue; -import java.math.BigDecimal; -import java.sql.Connection; -import java.sql.PreparedStatement; -import java.sql.SQLException; -import java.time.LocalDate; -import java.time.LocalDateTime; -import java.time.LocalTime; -import java.util.ArrayList; -import java.util.Arrays; -import java.util.List; -import java.util.Optional; -import java.util.concurrent.atomic.AtomicInteger; -import org.junit.Test; - -/** Tests for DB2 type codecs. 
Tests all types defined in Db2Types. */ -public class Db2TypeTest { - - private static final AtomicInteger tableCounter = new AtomicInteger(0); - - private static String uniqueTableName(String prefix) { - return "DB2INST1." + prefix + "_" + tableCounter.incrementAndGet(); - } - - record TestPair(A t0, Optional t1) {} - - record Db2TypeAndExample( - Db2Type type, A example, boolean hasIdentity, boolean jsonDbWorks) { - public Db2TypeAndExample(Db2Type type, A example) { - this(type, example, true, true); - } - - /** Skip JSON DB roundtrip test (DB2's JSON functions have limitations) */ - public Db2TypeAndExample noJsonDb() { - return new Db2TypeAndExample<>(type, example, hasIdentity, false); - } - - public Db2TypeAndExample noIdentity() { - return new Db2TypeAndExample<>(type, example, false, jsonDbWorks); - } - } - - List> All = - List.of( - // ==================== Integer Types ==================== - new Db2TypeAndExample<>(Db2Types.smallint, (short) 4242), - new Db2TypeAndExample<>(Db2Types.smallint, Short.MIN_VALUE), // Edge case: min value - new Db2TypeAndExample<>(Db2Types.smallint, Short.MAX_VALUE), // Edge case: max value - new Db2TypeAndExample<>(Db2Types.smallint, (short) 0), // Edge case: zero - new Db2TypeAndExample<>(Db2Types.integer, 42424242), - new Db2TypeAndExample<>(Db2Types.integer, Integer.MIN_VALUE), // Edge case: min value - new Db2TypeAndExample<>(Db2Types.integer, Integer.MAX_VALUE), // Edge case: max value - new Db2TypeAndExample<>(Db2Types.integer, 0), // Edge case: zero - new Db2TypeAndExample<>(Db2Types.bigint, 4242424242424242L), - new Db2TypeAndExample<>(Db2Types.bigint, Long.MIN_VALUE), // Edge case: min value - new Db2TypeAndExample<>(Db2Types.bigint, Long.MAX_VALUE), // Edge case: max value - new Db2TypeAndExample<>(Db2Types.bigint, 0L), // Edge case: zero - - // ==================== Fixed-Point Types ==================== - new Db2TypeAndExample<>(Db2Types.decimal(10, 2), new BigDecimal("12345.67")), - new Db2TypeAndExample<>( - 
Db2Types.decimal(10, 2), new BigDecimal("0.00")), // Edge case: zero (with scale) - new Db2TypeAndExample<>(Db2Types.decimal(10, 2), new BigDecimal("-99999.99")), // Negative - new Db2TypeAndExample<>( - Db2Types.decimal(10, 2), new BigDecimal("12345678.90")), // With precision - - // DECFLOAT - DB2-specific decimal floating point - new Db2TypeAndExample<>(Db2Types.decfloat, new BigDecimal("3.141592653589793")), - new Db2TypeAndExample<>(Db2Types.decfloat(16), new BigDecimal("1.234567890123456E10")), - - // ==================== Floating-Point Types ==================== - new Db2TypeAndExample<>(Db2Types.real, 3.14159f), - new Db2TypeAndExample<>(Db2Types.real, 0.0f), // Edge case: zero - new Db2TypeAndExample<>(Db2Types.real, -Float.MAX_VALUE), // Edge case: min value - new Db2TypeAndExample<>(Db2Types.real, Float.MAX_VALUE), // Edge case: max value - new Db2TypeAndExample<>(Db2Types.real, -1.0f), // Negative - new Db2TypeAndExample<>(Db2Types.double_, 3.141592653589793), - new Db2TypeAndExample<>(Db2Types.double_, 0.0), // Edge case: zero - new Db2TypeAndExample<>(Db2Types.double_, -Double.MAX_VALUE) - .noJsonDb(), // DB2 JSON can't represent this value (SQLCODE=-16402) - new Db2TypeAndExample<>(Db2Types.double_, Double.MAX_VALUE) - .noJsonDb(), // DB2 JSON can't represent this value (SQLCODE=-16402) - new Db2TypeAndExample<>(Db2Types.double_, -1.0), // Negative - - // ==================== Boolean Type ==================== - new Db2TypeAndExample<>(Db2Types.boolean_, true), - new Db2TypeAndExample<>(Db2Types.boolean_, false), - - // ==================== String Types (SBCS) ==================== - new Db2TypeAndExample<>(Db2Types.char_(10), "Hello ") - .noJsonDb(), // DB2 JSON trims trailing spaces from CHAR - new Db2TypeAndExample<>(Db2Types.char_(5), "12345"), // Exact length, no trailing spaces - new Db2TypeAndExample<>(Db2Types.varchar(255), "Hello, DB2!"), - new Db2TypeAndExample<>(Db2Types.varchar(255), ""), // Edge case: empty string - new 
Db2TypeAndExample<>(Db2Types.varchar(100), "Variable length string"), - new Db2TypeAndExample<>( - Db2Types.varchar(255), "Special chars: äöü ñ 中文"), // Unicode in VARCHAR - new Db2TypeAndExample<>(Db2Types.clob, "This is a CLOB value with more text."), - - // ==================== String Types (DBCS - Double-Byte) ==================== - // GRAPHIC/VARGRAPHIC not supported by DB2's JSON_OBJECT (SQLCODE=-171) - new Db2TypeAndExample<>(Db2Types.graphic(5), "ABCDE").noJsonDb(), // Full-width chars - new Db2TypeAndExample<>(Db2Types.vargraphic(50), "日本語テスト").noJsonDb(), // Japanese - new Db2TypeAndExample<>(Db2Types.vargraphic(50), "中文测试").noJsonDb(), // Chinese - new Db2TypeAndExample<>(Db2Types.dbclob, "Double-byte CLOB content: 한글"), - - // ==================== Binary Types ==================== - new Db2TypeAndExample<>(Db2Types.binary(4), new byte[] {0x01, 0x02, 0x03, 0x04}), - new Db2TypeAndExample<>( - Db2Types.varbinary(100), - new byte[] {(byte) 0xDE, (byte) 0xAD, (byte) 0xBE, (byte) 0xEF}), - new Db2TypeAndExample<>(Db2Types.varbinary(100), new byte[] {}), // Edge case: empty - new Db2TypeAndExample<>( - Db2Types.varbinary(100), - new byte[] {0x00, 0x7F, (byte) 0x80, (byte) 0xFF}), // Boundary bytes - new Db2TypeAndExample<>( - Db2Types.blob, new byte[] {0x42, 0x4C, 0x4F, 0x42}) // "BLOB" in hex - .noIdentity(), - - // ==================== Date/Time Types ==================== - new Db2TypeAndExample<>(Db2Types.date, LocalDate.of(2024, 6, 15)), - new Db2TypeAndExample<>(Db2Types.date, LocalDate.of(1970, 1, 1)), // Edge case: epoch - new Db2TypeAndExample<>(Db2Types.date, LocalDate.of(2099, 12, 31)), // Future date - new Db2TypeAndExample<>(Db2Types.time, LocalTime.of(14, 30, 45)), - new Db2TypeAndExample<>(Db2Types.time, LocalTime.of(0, 0, 0)), // Edge case: midnight - new Db2TypeAndExample<>(Db2Types.time, LocalTime.of(23, 59, 59)), // Edge case: end of day - new Db2TypeAndExample<>(Db2Types.timestamp, LocalDateTime.of(2024, 6, 15, 14, 30, 45)), - new 
Db2TypeAndExample<>( - Db2Types.timestamp, LocalDateTime.of(1970, 1, 1, 0, 0, 0)), // Edge case: epoch - new Db2TypeAndExample<>( - Db2Types.timestamp(6), - LocalDateTime.of(2024, 6, 15, 14, 30, 45, 123456000)), // Microseconds - - // ==================== Special Types ==================== - // DB2 normalizes XML: strips XML declaration, so use pre-normalized values - // XML types can't be compared with = in SQL, so mark as noIdentity - new Db2TypeAndExample<>( - Db2Types.xml, - new dev.typr.foundations.data.Xml("value")) - .noIdentity(), - new Db2TypeAndExample<>(Db2Types.xml, new dev.typr.foundations.data.Xml("")) - .noIdentity()); - - // Note: DB2 does not support arrays as a column type like PostgreSQL - // Array operations in DB2 are handled via ARRAY data type in SQL PL only - - // Connection helper for DB2 - static T withConnection(SqlFunction f) { - try (var conn = - java.sql.DriverManager.getConnection( - "jdbc:db2://localhost:50000/typr:user=db2inst1;password=password;")) { - conn.setAutoCommit(false); - try { - return f.apply(conn); - } finally { - conn.rollback(); - } - } catch (SQLException e) { - throw new RuntimeException(e); - } - } - - @Test - public void test() { - System.out.println("Testing DB2 type codecs...\n"); - - // Test JSON roundtrip first (no database connection needed) - parallel - System.out.println("=== JSON Roundtrip Tests (parallel) ==="); - All.parallelStream().forEach(Db2TypeTest::testJsonRoundtrip); - System.out.println(); - - // Run all DB tests in parallel - System.out.println("=== DB Roundtrip Tests (parallel) ==="); - var failures = - All.parallelStream() - .flatMap( - t -> { - var errors = new ArrayList(); - - // Native type roundtrip test - try { - withConnection( - conn -> { - testCase(conn, t); - return null; - }); - } catch (Exception e) { - errors.add( - "Native test FAILED " - + t.type.typename().sqlType() - + ": " - + e.getMessage()); - } - - // JSON DB roundtrip test - if (t.jsonDbWorks) { - try { - withConnection( - 
conn -> { - testJsonDbRoundtrip(conn, t); - return null; - }); - } catch (Exception e) { - errors.add( - "JSON DB test FAILED " - + t.type.typename().sqlType() - + ": " - + e.getMessage()); - } - } - - return errors.stream(); - }) - .toList(); - - System.out.println("\n====================================="); - if (failures.isEmpty()) { - System.out.println("All tests passed!"); - } else { - failures.forEach(System.out::println); - throw new RuntimeException(failures.size() + " tests failed"); - } - System.out.println("====================================="); - } - - /** - * Test JSON roundtrip in-memory. Returns true if test passed/skipped, false if type doesn't - * support JSON. - */ - static boolean testJsonRoundtrip(Db2TypeAndExample t) { - try { - Db2Json jsonCodec = t.type.db2Json(); - A original = t.example; - - // Test toJson -> encode -> parse -> fromJson roundtrip (in-memory) - JsonValue jsonValue = jsonCodec.toJson(original); - String encoded = jsonValue.encode(); - JsonValue parsed = JsonValue.parse(encoded); - A decoded = jsonCodec.fromJson(parsed); - - System.out.println( - "JSON roundtrip " - + t.type.typename().sqlType() - + ": " - + format(original) - + " -> " - + encoded - + " -> " - + format(decoded)); - - if (t.hasIdentity && !areEqual(decoded, original)) { - throw new RuntimeException( - "JSON roundtrip failed for " - + t.type.typename().sqlType() - + ": expected '" - + format(original) - + "' but got '" - + format(decoded) - + "'"); - } - return true; - } catch (UnsupportedOperationException e) { - // Type explicitly doesn't support JSON (e.g., GRAPHIC, VARBINARY) - System.out.println( - "JSON roundtrip " + t.type.typename().sqlType() + ": SKIP (" + e.getMessage() + ")"); - return false; - } catch (Exception e) { - throw new RuntimeException( - "JSON roundtrip test failed for " + t.type.typename().sqlType(), e); - } - } - - /** - * Test JSON roundtrip through the database. Returns true if test passed, false if type doesn't - * support JSON. 
- */ - static <A> boolean testJsonDbRoundtrip(Connection conn, Db2TypeAndExample<A> t) - throws SQLException { - Db2Json<A> jsonCodec = t.type.db2Json(); - A original = t.example; - String sqlType = t.type.typename().sqlType(); - - // Check if JSON is supported by attempting toJson - will throw UnsupportedOperationException - // if not - try { - jsonCodec.toJson(original); - } catch (UnsupportedOperationException e) { - System.out.println("JSON DB roundtrip " + sqlType + ": SKIP (" + e.getMessage() + ")"); - return false; - } - - // Use a regular table instead of GLOBAL TEMPORARY TABLE - // since GLOBAL TEMPORARY TABLE requires a user temporary tablespace - String tableName = uniqueTableName("TYPR_JSON_RT"); - try { - conn.createStatement().execute("DROP TABLE " + tableName); - } catch (SQLException e) { - // Table might not exist, ignore - } - conn.createStatement().execute("CREATE TABLE " + tableName + " (v " + sqlType + ")"); - - try { - // Insert value using native type - var insert = conn.prepareStatement("INSERT INTO " + tableName + " (v) VALUES (?)"); - t.type.write().set(insert, 1, original); - insert.execute(); - insert.close(); - - // Select back as JSON using JSON_OBJECT (DB2 syntax: KEY 'key' VALUE value) - var select = conn.prepareStatement("SELECT JSON_OBJECT(KEY 'v' VALUE v) FROM " + tableName); - select.execute(); - var rs = select.getResultSet(); - - if (!rs.next()) { - throw new RuntimeException("No rows returned"); - } - - // Read the JSON string back from the database - String jsonFromDb = rs.getString(1); - select.close(); - - // Parse the JSON object and extract 'v' field - JsonValue parsedFromDb = JsonValue.parse(jsonFromDb); - JsonValue fieldValue = ((JsonValue.JObject) parsedFromDb).get("v"); - A decoded = jsonCodec.fromJson(fieldValue); - - System.out.println( - "JSON DB roundtrip " - + sqlType - + ": " - + format(original) - + " -> DB -> " - + jsonFromDb - + " -> " - + format(decoded)); - - // Use tolerance comparison for JSON DB because DB2's
JSON_OBJECT has limited precision - // for binary floating point types (REAL: 6 digits, DOUBLE: 14 digits) - if (t.hasIdentity && !areEqual(decoded, original, true)) { - throw new RuntimeException( - "JSON DB roundtrip failed for " - + sqlType - + ": expected '" - + format(original) - + "' but got '" - + format(decoded) - + "'"); - } - return true; - } finally { - try { - conn.createStatement().execute("DROP TABLE " + tableName); - } catch (SQLException e) { - // Ignore cleanup errors - } - } - } - - static <A> void testCase(Connection conn, Db2TypeAndExample<A> t) throws SQLException { - String sqlType = t.type.typename().sqlType(); - - // Use a regular table instead of GLOBAL TEMPORARY TABLE - // since GLOBAL TEMPORARY TABLE requires a user temporary tablespace - String tableName = uniqueTableName("TYPR_TYPE"); - try { - conn.createStatement().execute("DROP TABLE " + tableName); - } catch (SQLException e) { - // Table might not exist, ignore - } - conn.createStatement().execute("CREATE TABLE " + tableName + " (v " + sqlType + ")"); - - try { - // Insert using PreparedStatement - var insert = conn.prepareStatement("INSERT INTO " + tableName + " (v) VALUES (?)"); - A expected = t.example; - t.type.write().set(insert, 1, expected); - insert.execute(); - insert.close(); - - // Select and verify - final PreparedStatement select; - if (t.hasIdentity) { - select = - conn.prepareStatement( - "SELECT v, CAST(NULL AS " + sqlType + ") FROM " + tableName + " WHERE v = ?"); - t.type.write().set(select, 1, expected); - } else { - select = conn.prepareStatement("SELECT v, CAST(NULL AS " + sqlType + ") FROM " + tableName); - } - - select.execute(); - var rs = select.getResultSet(); - - if (!rs.next()) { - throw new RuntimeException("No rows returned"); - } - - // Read the value - A actual = t.type.read().read(rs, 1); - // Read the null value using opt() - Optional<A> actualNull = t.type.opt().read().read(rs, 2); - - select.close(); - - assertEquals(actual, expected, "value mismatch"); -
assertEquals(actualNull, Optional.empty(), "null value mismatch"); - - } finally { - // Drop temp table - try { - conn.createStatement().execute("DROP TABLE " + tableName); - } catch (SQLException e) { - // Ignore cleanup errors - } - } - } - - static <A> void assertEquals(A actual, A expected, String message) { - if (!areEqual(actual, expected)) { - throw new RuntimeException( - message + ": actual='" + format(actual) + "' expected='" + format(expected) + "'"); - } - } - - static <A> boolean areEqual(A actual, A expected) { - return areEqual(actual, expected, false); - } - - /** - * Compare values for equality. When jsonDbTolerance is true, use relative tolerance for - * Float/Double because DB2's JSON_OBJECT uses limited-precision scientific notation. - */ - static <A> boolean areEqual(A actual, A expected, boolean jsonDbTolerance) { - if (expected == null && actual == null) return true; - if (expected == null || actual == null) return false; - - if (expected instanceof byte[]) { - return Arrays.equals((byte[]) actual, (byte[]) expected); - } - if (expected instanceof Object[]) { - return Arrays.deepEquals((Object[]) actual, (Object[]) expected); - } - - // DB2 JSON_OBJECT uses limited precision for binary floating point: - // - REAL: 6 significant digits (vs 7 for 32-bit float) - // - DOUBLE: 14 significant digits (vs 15-16 for 64-bit float) - if (jsonDbTolerance) { - if (expected instanceof Float) { - float exp = (Float) expected; - float act = (Float) actual; - if (exp == 0.0f) return act == 0.0f; - // Allow ~2 digits of precision loss for REAL - return Math.abs((act - exp) / exp) < 1e-5; - } - if (expected instanceof Double) { - double exp = (Double) expected; - double act = (Double) actual; - if (exp == 0.0) return act == 0.0; - // Allow ~2 digits of precision loss for DOUBLE - return Math.abs((act - exp) / exp) < 1e-13; - } - } - - return actual.equals(expected); - } - - static <A> String format(A a) { - if (a == null) return "null"; - if (a instanceof byte[]) { - return
bytesToHex((byte[]) a); - } - if (a instanceof Object[]) { - return Arrays.deepToString((Object[]) a); - } - return a.toString(); - } - - static String bytesToHex(byte[] bytes) { - StringBuilder sb = new StringBuilder(); - sb.append("["); - for (int i = 0; i < bytes.length; i++) { - if (i > 0) sb.append(", "); - sb.append(String.format("0x%02X", bytes[i])); - } - sb.append("]"); - return sb.toString(); - } -} diff --git a/foundations-jdbc-test/src/java/dev/typr/foundations/DuckDbTypeTest.java b/foundations-jdbc-test/src/java/dev/typr/foundations/DuckDbTypeTest.java deleted file mode 100644 index 5cf7035654..0000000000 --- a/foundations-jdbc-test/src/java/dev/typr/foundations/DuckDbTypeTest.java +++ /dev/null @@ -1,807 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.Json; -import dev.typr.foundations.data.JsonValue; -import dev.typr.foundations.data.Uint1; -import dev.typr.foundations.data.Uint2; -import dev.typr.foundations.data.Uint4; -import dev.typr.foundations.data.Uint8; -import java.math.BigDecimal; -import java.math.BigInteger; -import java.sql.*; -import java.time.*; -import java.util.ArrayList; -import java.util.Arrays; -import java.util.List; -import java.util.Optional; -import java.util.UUID; -import java.util.concurrent.atomic.AtomicInteger; -import org.junit.Test; - -/** - * Tests for DuckDB type codecs. Tests all types defined in DuckDbTypes including: - Primitive types - * (integers, floats, strings, dates, etc.) - Composite types (LIST, MAP, STRUCT, UNION, ARRAY) - * DuckDB is an embedded database, so tests run in-process using an in-memory database. 
- */ -public class DuckDbTypeTest { - - private static final AtomicInteger tableCounter = new AtomicInteger(0); - - private static String uniqueTableName(String prefix) { - return prefix + "_" + tableCounter.incrementAndGet(); - } - - // ==================== Wrapper Type Examples for bimap testing ==================== - record UserId(int value) {} - - record ProductCode(String value) {} - - // A complex wrapper that contains a map internally - record Config(java.util.Map<String, Integer> settings) {} - - // Bimapped types for testing MAP with wrapper keys/values - static DuckDbType<UserId> userIdType = DuckDbTypes.integer.bimap(UserId::new, UserId::value); - - static DuckDbType<ProductCode> productCodeType = - DuckDbTypes.varchar.bimap(ProductCode::new, ProductCode::value); - - // A type that wraps a map - bimapped from MAP(VARCHAR, INTEGER) - static DuckDbType<Config> configType = - DuckDbTypes.varchar.mapTo(DuckDbTypes.integer).bimap(Config::new, Config::settings); - - // ==================== STRUCT Example ==================== - record Person(String name, int age) {} - - // New simplified API: just provide field getters, stringifier is auto-derived from DuckDbType - DuckDbStruct<Person> personStruct = - DuckDbStruct.builder("Person") - .field("name", DuckDbTypes.varchar, Person::name) - .field("age", DuckDbTypes.integer, Person::age) - .build(attrs -> new Person((String) attrs[0], (Integer) attrs[1])); // reader only - - DuckDbType<Person> personType = personStruct.asType(); - - // ==================== UNION Example ==================== - sealed interface IntOrString { - record Num(int value) implements IntOrString {} - - record Str(String value) implements IntOrString {} - } - - // New simplified API: just provide wrapper/unwrapper functions, everything else is auto-derived - DuckDbUnion<IntOrString> intOrStringUnion = - DuckDbUnion.builder("IntOrString") - .member( - "num", - DuckDbTypes.integer, - Integer.class, - IntOrString.Num::new, // wrapper: Integer -> IntOrString - ios -> ios instanceof IntOrString.Num n ?
n.value() : null) // unwrapper - .member( - "str", - DuckDbTypes.varchar, - String.class, - IntOrString.Str::new, // wrapper: String -> IntOrString - ios -> ios instanceof IntOrString.Str s ? s.value() : null) // unwrapper - .build(); // auto-derives reader, writer, and JSON codec - - DuckDbType<IntOrString> intOrStringType = intOrStringUnion.asType(); - - record DuckDbTypeAndExample<A>( - DuckDbType<A> type, A example, boolean hasIdentity, boolean supportsTextRoundtrip) { - public DuckDbTypeAndExample(DuckDbType<A> type, A example) { - this(type, example, true, true); - } - - public DuckDbTypeAndExample<A> noIdentity() { - return new DuckDbTypeAndExample<>(type, example, false, supportsTextRoundtrip); - } - - public DuckDbTypeAndExample<A> noTextRoundtrip() { - return new DuckDbTypeAndExample<>(type, example, hasIdentity, false); - } - } - - // Sample enum for ENUM type testing - enum Color { - RED, - GREEN, - BLUE - } - - // HUGEINT range: -170141183460469231731687303715884105728 to - // 170141183460469231731687303715884105727 - BigInteger HUGEINT_MAX = new BigInteger("170141183460469231731687303715884105727"); - BigInteger HUGEINT_MIN = new BigInteger("-170141183460469231731687303715884105728"); - // UHUGEINT range: 0 to 340282366920938463463374607431768211455 - BigInteger UHUGEINT_MAX = new BigInteger("340282366920938463463374607431768211455"); - // UBIGINT range: 0 to 18446744073709551615 - BigInteger UBIGINT_MAX = new BigInteger("18446744073709551615"); - - List<DuckDbTypeAndExample<?>> All = - List.of( - // ==================== Integer Types (Signed) ==================== - new DuckDbTypeAndExample<>(DuckDbTypes.tinyint, (byte) 42), - new DuckDbTypeAndExample<>(DuckDbTypes.tinyint, Byte.MIN_VALUE), - new DuckDbTypeAndExample<>(DuckDbTypes.tinyint, Byte.MAX_VALUE), - new DuckDbTypeAndExample<>(DuckDbTypes.tinyint, (byte) 0), - new DuckDbTypeAndExample<>(DuckDbTypes.smallint, (short) 4242), - new DuckDbTypeAndExample<>(DuckDbTypes.smallint, Short.MIN_VALUE), - new DuckDbTypeAndExample<>(DuckDbTypes.smallint,
Short.MAX_VALUE), - new DuckDbTypeAndExample<>(DuckDbTypes.smallint, (short) 0), - new DuckDbTypeAndExample<>(DuckDbTypes.integer, 42424242), - new DuckDbTypeAndExample<>(DuckDbTypes.integer, Integer.MIN_VALUE), - new DuckDbTypeAndExample<>(DuckDbTypes.integer, Integer.MAX_VALUE), - new DuckDbTypeAndExample<>(DuckDbTypes.integer, 0), - new DuckDbTypeAndExample<>(DuckDbTypes.bigint, 4242424242424242L), - new DuckDbTypeAndExample<>(DuckDbTypes.bigint, Long.MIN_VALUE), - new DuckDbTypeAndExample<>(DuckDbTypes.bigint, Long.MAX_VALUE), - new DuckDbTypeAndExample<>(DuckDbTypes.bigint, 0L), - // HUGEINT - 128-bit signed integer - new DuckDbTypeAndExample<>(DuckDbTypes.hugeint, BigInteger.valueOf(12345678901234567L)), - new DuckDbTypeAndExample<>(DuckDbTypes.hugeint, HUGEINT_MAX), - new DuckDbTypeAndExample<>(DuckDbTypes.hugeint, HUGEINT_MIN), - new DuckDbTypeAndExample<>(DuckDbTypes.hugeint, BigInteger.ZERO), - - // ==================== Integer Types (Unsigned) ==================== - new DuckDbTypeAndExample<>(DuckDbTypes.utinyint, new Uint1((short) 255)), - new DuckDbTypeAndExample<>(DuckDbTypes.utinyint, new Uint1((short) 0)), - new DuckDbTypeAndExample<>(DuckDbTypes.usmallint, new Uint2(65535)), - new DuckDbTypeAndExample<>(DuckDbTypes.usmallint, new Uint2(0)), - new DuckDbTypeAndExample<>(DuckDbTypes.uinteger, new Uint4(4294967295L)), - new DuckDbTypeAndExample<>(DuckDbTypes.uinteger, new Uint4(0L)), - // UBIGINT - 64-bit unsigned integer - new DuckDbTypeAndExample<>(DuckDbTypes.ubigint, new Uint8(UBIGINT_MAX)), - new DuckDbTypeAndExample<>(DuckDbTypes.ubigint, new Uint8(BigInteger.ZERO)), - // UHUGEINT - 128-bit unsigned integer - new DuckDbTypeAndExample<>(DuckDbTypes.uhugeint, UHUGEINT_MAX), - new DuckDbTypeAndExample<>(DuckDbTypes.uhugeint, BigInteger.ZERO), - - // ==================== Floating-Point Types ==================== - new DuckDbTypeAndExample<>(DuckDbTypes.float_, 3.14159f).noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.float_, 
0.0f).noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.float_, Float.MAX_VALUE).noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.float_, Float.MIN_VALUE).noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.double_, 3.141592653589793), - new DuckDbTypeAndExample<>(DuckDbTypes.double_, 0.0), - new DuckDbTypeAndExample<>(DuckDbTypes.double_, -3.141592653589793), - new DuckDbTypeAndExample<>(DuckDbTypes.double_, Double.MAX_VALUE), - - // ==================== Fixed-Point Types ==================== - new DuckDbTypeAndExample<>(DuckDbTypes.decimal, new BigDecimal("12345")), - new DuckDbTypeAndExample<>(DuckDbTypes.decimal, BigDecimal.ZERO), - new DuckDbTypeAndExample<>(DuckDbTypes.decimal, new BigDecimal("-9999999999")), - new DuckDbTypeAndExample<>(DuckDbTypes.decimal(10, 2), new BigDecimal("12345678.90")), - new DuckDbTypeAndExample<>(DuckDbTypes.decimal(10, 2), new BigDecimal("0.00")), - new DuckDbTypeAndExample<>(DuckDbTypes.decimal(10, 2), new BigDecimal("-99999999.99")), - new DuckDbTypeAndExample<>(DuckDbTypes.decimal(10, 5), new BigDecimal("12345.67890")), - - // ==================== Boolean Type ==================== - new DuckDbTypeAndExample<>(DuckDbTypes.boolean_, true), - new DuckDbTypeAndExample<>(DuckDbTypes.boolean_, false), - - // ==================== String Types ==================== - new DuckDbTypeAndExample<>(DuckDbTypes.varchar, "Hello, DuckDB!"), - new DuckDbTypeAndExample<>(DuckDbTypes.varchar, ""), - new DuckDbTypeAndExample<>( - DuckDbTypes.varchar, "Unicode: \u00e9\u00e8\u00ea \u4e2d\u6587"), - new DuckDbTypeAndExample<>(DuckDbTypes.varchar, "Line1\nLine2\tTabbed"), - new DuckDbTypeAndExample<>(DuckDbTypes.varchar, "Quote\"Test'Single\\Back"), - new DuckDbTypeAndExample<>( - DuckDbTypes.varchar, "Emoji: \uD83D\uDE00\uD83C\uDF89\uD83D\uDE80"), - new DuckDbTypeAndExample<>(DuckDbTypes.varchar(100), "Fixed length varchar"), - new DuckDbTypeAndExample<>(DuckDbTypes.text, "Text type content"), - new 
DuckDbTypeAndExample<>(DuckDbTypes.char_(10), "hello"), - - // ==================== Binary Types ==================== - // BLOB: Binary data cannot roundtrip through textual JSON COPY format - // JSON treats base64 strings as VARCHAR literals, storing text instead of binary. - // Use Parquet/Arrow formats for proper binary data streaming. - new DuckDbTypeAndExample<>(DuckDbTypes.blob, new byte[] {0x01, 0x02, 0x03, 0x04, 0x05}) - .noTextRoundtrip(), - new DuckDbTypeAndExample<>(DuckDbTypes.blob, new byte[] {}).noTextRoundtrip(), - new DuckDbTypeAndExample<>( - DuckDbTypes.blob, new byte[] {(byte) 0xFF, 0x00, 0x7F, (byte) 0x80}) - .noTextRoundtrip(), - new DuckDbTypeAndExample<>(DuckDbTypes.blob, new byte[] {0x00}).noTextRoundtrip(), - - // ==================== Date/Time Types ==================== - new DuckDbTypeAndExample<>(DuckDbTypes.date, LocalDate.of(2024, 6, 15)), - new DuckDbTypeAndExample<>(DuckDbTypes.date, LocalDate.of(1970, 1, 1)), - new DuckDbTypeAndExample<>(DuckDbTypes.date, LocalDate.of(2099, 12, 31)), - new DuckDbTypeAndExample<>(DuckDbTypes.time, LocalTime.of(14, 30, 45)), - new DuckDbTypeAndExample<>(DuckDbTypes.time, LocalTime.of(0, 0, 0)), - new DuckDbTypeAndExample<>(DuckDbTypes.time, LocalTime.of(23, 59, 59)), - new DuckDbTypeAndExample<>( - DuckDbTypes.timestamp, LocalDateTime.of(2024, 6, 15, 14, 30, 45)), - new DuckDbTypeAndExample<>(DuckDbTypes.timestamp, LocalDateTime.of(1970, 1, 1, 0, 0, 0)), - new DuckDbTypeAndExample<>( - DuckDbTypes.timestamp, LocalDateTime.of(2024, 6, 15, 14, 30, 45, 123456000)), - // Timestamp with timezone - new DuckDbTypeAndExample<>( - DuckDbTypes.timestamptz, - OffsetDateTime.of(2024, 6, 15, 14, 30, 45, 0, ZoneOffset.UTC)), - - // ==================== Interval Type ==================== - new DuckDbTypeAndExample<>(DuckDbTypes.interval, Duration.ofHours(2).plusMinutes(30)), - new DuckDbTypeAndExample<>(DuckDbTypes.interval, Duration.ofDays(5)), - - // ==================== UUID Type ==================== - new 
DuckDbTypeAndExample<>( - DuckDbTypes.uuid, UUID.fromString("550e8400-e29b-41d4-a716-446655440000")), - new DuckDbTypeAndExample<>( - DuckDbTypes.uuid, UUID.fromString("00000000-0000-0000-0000-000000000000")), - - // ==================== JSON Type ==================== - new DuckDbTypeAndExample<>( - DuckDbTypes.json, new Json("{\"name\": \"DuckDB\", \"version\": 1.0}")) - .noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.json, new Json("[1, 2, 3, \"four\"]")) - .noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.json, new Json("{}")).noIdentity(), - - // ==================== ENUM Type ==================== - new DuckDbTypeAndExample<>(DuckDbTypes.ofEnum("color_enum", Color::valueOf), Color.GREEN), - new DuckDbTypeAndExample<>(DuckDbTypes.ofEnum("color_enum", Color::valueOf), Color.RED), - - // ==================== LIST Types ==================== - // LIST types don't support direct equality in WHERE clauses, so we mark noIdentity() - // Native JNI types (best performance) - new DuckDbTypeAndExample<>(DuckDbTypes.listBoolean, List.of(true, false, true)) - .noIdentity(), - new DuckDbTypeAndExample<>( - DuckDbTypes.listTinyint, List.of((byte) 1, (byte) 2, (byte) -1)) - .noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.listSmallint, List.of((short) 100, (short) -200)) - .noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.listInteger, List.of(1, 2, 3, 4, 5)).noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.listInteger, List.of()).noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.listInteger, List.of(-100, 0, 100)).noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.listBigint, List.of(1L, 2L, 9999999999L)) - .noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.listFloat, List.of(1.5f, 2.5f, 3.14f)) - .noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.listDouble, List.of(1.5, 2.5, 3.14159)) - .noIdentity(), - new DuckDbTypeAndExample<>(DuckDbTypes.listVarchar, List.of("hello", "world")) - .noIdentity(), - new 
DuckDbTypeAndExample<>(DuckDbTypes.listVarchar, List.of("quote'test", "back\\slash")) - .noIdentity(), - - // ==================== MAP Types ==================== - // MAP types use the mapTo() combinator. They don't support direct equality in WHERE - // clauses. - new DuckDbTypeAndExample<>( - DuckDbTypes.varchar.mapTo(DuckDbTypes.integer), java.util.Map.of("a", 1, "b", 2)) - .noIdentity(), - new DuckDbTypeAndExample<>( - DuckDbTypes.varchar.mapTo(DuckDbTypes.varchar), - java.util.Map.of("key1", "value1", "key2", "value2")) - .noIdentity(), - new DuckDbTypeAndExample<>( - DuckDbTypes.integer.mapTo(DuckDbTypes.varchar), - java.util.Map.of(1, "one", 2, "two")) - .noIdentity(), - // MAP with UUID keys - new DuckDbTypeAndExample<>( - DuckDbTypes.uuid.mapTo(DuckDbTypes.varchar), - java.util.Map.of( - UUID.fromString("550e8400-e29b-41d4-a716-446655440000"), "value1", - UUID.fromString("123e4567-e89b-12d3-a456-426614174000"), "value2")) - .noIdentity(), - // MAP with TIME values - new DuckDbTypeAndExample<>( - DuckDbTypes.varchar.mapTo(DuckDbTypes.time), - java.util.Map.of( - "morning", LocalTime.of(8, 15, 0), - "afternoon", LocalTime.of(14, 30, 45))) - .noIdentity(), - // MAP with UUID keys and TIME values - new DuckDbTypeAndExample<>( - DuckDbTypes.uuid.mapTo(DuckDbTypes.time), - java.util.Map.of( - UUID.fromString("550e8400-e29b-41d4-a716-446655440000"), - LocalTime.of(14, 30, 45), - UUID.fromString("123e4567-e89b-12d3-a456-426614174000"), - LocalTime.of(8, 15, 0))) - .noIdentity(), - // More MAP type combinations - new DuckDbTypeAndExample<>( - DuckDbTypes.varchar.mapTo(DuckDbTypes.bigint), - java.util.Map.of("count", 100L, "total", 999999999999L)) - .noIdentity(), - new DuckDbTypeAndExample<>( - DuckDbTypes.varchar.mapTo(DuckDbTypes.double_), - java.util.Map.of("pi", 3.14159, "e", 2.71828)) - .noIdentity(), - new DuckDbTypeAndExample<>( - DuckDbTypes.varchar.mapTo(DuckDbTypes.boolean_), - java.util.Map.of("active", true, "verified", false)) - .noIdentity(), - new 
DuckDbTypeAndExample<>( - DuckDbTypes.integer.mapTo(DuckDbTypes.integer), - java.util.Map.of(1, 100, 2, 200, 3, 300)) - .noIdentity(), - new DuckDbTypeAndExample<>( - DuckDbTypes.varchar.mapTo(DuckDbTypes.date), - java.util.Map.of( - "start", LocalDate.of(2024, 1, 1), "end", LocalDate.of(2024, 12, 31))) - .noIdentity(), - new DuckDbTypeAndExample<>( - DuckDbTypes.varchar.mapTo(DuckDbTypes.uuid), - java.util.Map.of( - "user1", UUID.fromString("550e8400-e29b-41d4-a716-446655440000"), - "user2", UUID.fromString("123e4567-e89b-12d3-a456-426614174000"))) - .noIdentity(), - // MAP with bimapped key type (UserId wrapping Integer) - new DuckDbTypeAndExample<>( - userIdType.mapTo(DuckDbTypes.varchar), - java.util.Map.of(new UserId(1), "admin", new UserId(2), "user")) - .noIdentity(), - // MAP with bimapped value type (ProductCode wrapping String) - new DuckDbTypeAndExample<>( - DuckDbTypes.integer.mapTo(productCodeType), - java.util.Map.of(1, new ProductCode("PROD-001"), 2, new ProductCode("PROD-002"))) - .noIdentity(), - // MAP with bimapped key AND value types - new DuckDbTypeAndExample<>( - userIdType.mapTo(productCodeType), - java.util.Map.of( - new UserId(100), new ProductCode("SKU-A"), - new UserId(200), new ProductCode("SKU-B"))) - .noIdentity(), - // MAP with Long keys - new DuckDbTypeAndExample<>( - DuckDbTypes.bigint.mapTo(DuckDbTypes.varchar), - java.util.Map.of(9999999999L, "large-key", 1L, "small-key")) - .noIdentity(), - // MAP with Double keys - new DuckDbTypeAndExample<>( - DuckDbTypes.double_.mapTo(DuckDbTypes.varchar), - java.util.Map.of(3.14, "pi", 2.71, "e")) - .noIdentity(), - // Config type directly (bimapped from MAP(VARCHAR, INTEGER)) - // This tests that bimap works correctly with map types - new DuckDbTypeAndExample<>( - configType, new Config(java.util.Map.of("max_conn", 100, "min_conn", 5))) - .noIdentity(), - - // ==================== LIST Types with complex element types ==================== - // String-converted types (~33% overhead at 100k 
rows, but required for correctness) - // LIST - UUID requires String conversion to avoid byte-ordering bug - new DuckDbTypeAndExample<>( - DuckDbTypes.listUuid, - List.of( - UUID.fromString("550e8400-e29b-41d4-a716-446655440000"), - UUID.fromString("123e4567-e89b-12d3-a456-426614174000"))) - .noIdentity(), - // LIST - static <A> void testJsonRoundtrip(DuckDbTypeAndExample<A> t) { - try { - DuckDbJson<A> jsonCodec = t.type.duckDbJson(); - A original = t.example; - - // Test toJson -> encode -> parse -> fromJson roundtrip (in-memory) - JsonValue jsonValue = jsonCodec.toJson(original); - String encoded = jsonValue.encode(); - JsonValue parsed = JsonValue.parse(encoded); - A decoded = jsonCodec.fromJson(parsed); - - System.out.println( - "JSON roundtrip " - + t.type.typename().sqlType() - + ": " - + format(original) - + " -> " - + encoded - + " -> " - + format(decoded)); - - if (t.hasIdentity && !areEqual(decoded, original)) { - throw new RuntimeException( - "JSON roundtrip failed for " - + t.type.typename().sqlType() - + ": expected '" - + format(original) - + "' but got '" - + format(decoded) - + "'"); - } - } catch (Exception e) { - throw new RuntimeException( - "JSON roundtrip test failed for " + t.type.typename().sqlType(), e); - } - } - - static <A> void testTextEncoding(DuckDbTypeAndExample<A> t) { - try { - DbText<A> textCodec = t.type.text(); - A original = t.example; - - // Test text encoding (format to CSV string) - StringBuilder sb = new StringBuilder(); - textCodec.unsafeEncode(original, sb); - String encoded = sb.toString(); - - System.out.println( - "Text encoding " - + t.type.typename().sqlType() - + ": " - + format(original) - + " -> " - + encoded); - - } catch (UnsupportedOperationException e) { - // Some types don't support text encoding yet - System.out.println("Text encoding " + t.type.typename().sqlType() + ": NOT SUPPORTED"); - } catch (Exception e) { - throw new RuntimeException("Text encoding test failed for " + t.type.typename().sqlType(), e); - } - } - - static <A> void
testTextRoundtrip(Connection conn, DuckDbTypeAndExample<A> t) throws Exception { - if (!t.supportsTextRoundtrip) { - System.out.println( - "Text roundtrip " - + t.type.typename().sqlType() - + ": SKIPPED (not supported for this type)"); - return; - } - - String sqlType = t.type.typename().sqlType(); - DuckDbJson<A> jsonCodec = t.type.duckDbJson(); - A original = t.example; - - // Encode to JSON (simpler than CSV for complex types!) - String jsonValue = jsonCodec.toJson(original).encode(); - - // Use unique table name to avoid conflicts if previous test failed - String tableName = "text_test_" + System.nanoTime(); - - // Create temp table - conn.createStatement().execute("CREATE TEMPORARY TABLE " + tableName + " (v " + sqlType + ")"); - - try { - // Write JSON to temp file (one value per line - newline-delimited JSON) - // DuckDB expects each line to be a JSON object matching the table structure - java.io.File tempFile = java.io.File.createTempFile("duckdb_test_", ".json"); - tempFile.deleteOnExit(); - String jsonLine = "{\"v\":" + jsonValue + "}\n"; - java.nio.file.Files.writeString(tempFile.toPath(), jsonLine); - - // Import JSON using COPY - // DuckDB can parse JSON directly!
- String copyCommand = - "COPY " + tableName + " FROM '" + tempFile.getAbsolutePath() + "' (FORMAT JSON)"; - try { - conn.createStatement().execute(copyCommand); - } catch (SQLException e) { - throw new RuntimeException( - "COPY command failed: " + copyCommand + "\nJSON line: " + jsonLine, e); - } - - // Read back the value - var rs = conn.createStatement().executeQuery("SELECT v FROM " + tableName); - if (!rs.next()) { - throw new RuntimeException("No rows returned from JSON import"); - } - A decoded = t.type.read().read(rs, 1); - rs.close(); - - // Verify roundtrip - if (t.hasIdentity && !areEqual(decoded, original)) { - throw new RuntimeException( - "Text roundtrip failed: expected '" - + format(original) - + "' but got '" - + format(decoded) - + "'"); - } - - System.out.println( - "Text roundtrip " - + sqlType - + ": " - + format(original) - + " -> " - + jsonValue - + " -> " - + format(decoded)); - - } finally { - try { - conn.createStatement().execute("DROP TABLE IF EXISTS " + tableName); - } catch (SQLException e) { - // Ignore cleanup errors - transaction might be aborted - } - } - } - - static <A> void testCase(Connection conn, DuckDbTypeAndExample<A> t) throws SQLException { - String sqlType = t.type.typename().sqlType(); - String tableName = uniqueTableName("test_table"); - - // Create temp table - conn.createStatement().execute("CREATE TEMPORARY TABLE " + tableName + " (v " + sqlType + ")"); - - try { - // Insert using PreparedStatement - var insert = conn.prepareStatement("INSERT INTO " + tableName + " (v) VALUES (?)"); - A expected = t.example; - t.type.write().set(insert, 1, expected); - insert.execute(); - insert.close(); - - // Select and verify - final PreparedStatement select; - if (t.hasIdentity) { - select = conn.prepareStatement("SELECT v, NULL FROM " + tableName + " WHERE v = ?"); - t.type.write().set(select, 1, expected); - } else { - select = conn.prepareStatement("SELECT v, NULL FROM " + tableName); - } - - select.execute(); - var rs =
select.getResultSet(); - - if (!rs.next()) { - throw new RuntimeException("No rows returned"); - } - - // Read the value - A actual = t.type.read().read(rs, 1); - // Read the null value using opt() - Optional<A> actualNull = t.type.opt().read().read(rs, 2); - - select.close(); - - assertEquals(actual, expected, "value mismatch"); - assertEquals(actualNull, Optional.empty(), "null value mismatch"); - - } finally { - // Drop temp table - conn.createStatement().execute("DROP TABLE IF EXISTS " + tableName); - } - } - - static <A> void assertEquals(A actual, A expected, String message) { - if (!areEqual(actual, expected)) { - throw new RuntimeException( - message + ": actual='" + format(actual) + "' expected='" + format(expected) + "'"); - } - } - - static <A> boolean areEqual(A actual, A expected) { - if (expected == null && actual == null) return true; - if (expected == null || actual == null) return false; - - if (expected instanceof byte[]) { - return Arrays.equals((byte[]) actual, (byte[]) expected); - } - if (expected instanceof Object[]) { - return Arrays.deepEquals((Object[]) actual, (Object[]) expected); - } - // BigDecimal: compare by value, not scale - if (expected instanceof BigDecimal && actual instanceof BigDecimal) { - return ((BigDecimal) actual).compareTo((BigDecimal) expected) == 0; - } - // List: compare element by element - if (expected instanceof List && actual instanceof List) { - List<?> expectedList = (List<?>) expected; - List<?> actualList = (List<?>) actual; - if (expectedList.size() != actualList.size()) return false; - for (int i = 0; i < expectedList.size(); i++) { - if (!areEqual(actualList.get(i), expectedList.get(i))) return false; - } - return true; - } - // Map: compare entries - if (expected instanceof java.util.Map && actual instanceof java.util.Map) { - java.util.Map<?, ?> expectedMap = (java.util.Map<?, ?>) expected; - java.util.Map<?, ?> actualMap = (java.util.Map<?, ?>) actual; - if (expectedMap.size() != actualMap.size()) return false; - for (var entry :
expectedMap.entrySet()) { - Object actualValue = actualMap.get(entry.getKey()); - if (!areEqual(actualValue, entry.getValue())) return false; - } - return true; - } - - return actual.equals(expected); - } - - static <A> String format(A a) { - if (a == null) return "null"; - if (a instanceof byte[]) { - return bytesToHex((byte[]) a); - } - if (a instanceof Object[]) { - return Arrays.deepToString((Object[]) a); - } - return a.toString(); - } - - static String bytesToHex(byte[] bytes) { - StringBuilder sb = new StringBuilder(); - sb.append("["); - for (int i = 0; i < bytes.length; i++) { - if (i > 0) sb.append(", "); - sb.append(String.format("0x%02X", bytes[i])); - } - sb.append("]"); - return sb.toString(); - } -} diff --git a/foundations-jdbc-test/src/java/dev/typr/foundations/MariaTypeTest.java b/foundations-jdbc-test/src/java/dev/typr/foundations/MariaTypeTest.java deleted file mode 100644 index 2a48aadcff..0000000000 --- a/foundations-jdbc-test/src/java/dev/typr/foundations/MariaTypeTest.java +++ /dev/null @@ -1,564 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.Json; -import dev.typr.foundations.data.JsonValue; -import dev.typr.foundations.data.maria.Inet4; -import dev.typr.foundations.data.maria.Inet6; -import dev.typr.foundations.data.maria.MariaSet; -import java.math.BigDecimal; -import java.math.BigInteger; -import java.sql.Connection; -import java.sql.PreparedStatement; -import java.sql.SQLException; -import java.time.LocalDate; -import java.time.LocalDateTime; -import java.time.LocalTime; -import java.time.Year; -import java.util.Arrays; -import java.util.List; -import java.util.Optional; -import java.util.concurrent.atomic.AtomicInteger; -import org.junit.Test; - -/** Tests for MariaDB type codecs. Tests all types defined in MariaTypes.
*/ -public class MariaTypeTest { - - private static final AtomicInteger tableCounter = new AtomicInteger(0); - - private static String uniqueTableName(String prefix) { - return prefix + "_" + tableCounter.incrementAndGet(); - } - - record TestPair(A t0, Optional t1) {} - - record MariaTypeAndExample( - MariaType type, - A example, - boolean hasIdentity, - boolean streamingWorks, - boolean jsonRoundtripWorks) { - public MariaTypeAndExample(MariaType type, A example) { - this(type, example, true, true, true); - } - - public MariaTypeAndExample noStreaming() { - return new MariaTypeAndExample<>(type, example, hasIdentity, false, jsonRoundtripWorks); - } - - public MariaTypeAndExample noIdentity() { - return new MariaTypeAndExample<>(type, example, false, streamingWorks, jsonRoundtripWorks); - } - - // MariaDB's JSON encoding of binary data is lossy - bytes > 127 get corrupted - public MariaTypeAndExample noJsonRoundtrip() { - return new MariaTypeAndExample<>(type, example, hasIdentity, streamingWorks, false); - } - } - - // Sample enum for ENUM type testing - enum Color { - RED, - GREEN, - BLUE - } - - List> All = - List.of( - // ==================== Integer Types (Signed) ==================== - new MariaTypeAndExample<>(MariaTypes.tinyint, (byte) 42), - new MariaTypeAndExample<>(MariaTypes.tinyint, Byte.MIN_VALUE), // Edge case: min value - new MariaTypeAndExample<>(MariaTypes.tinyint, Byte.MAX_VALUE), // Edge case: max value - new MariaTypeAndExample<>(MariaTypes.tinyint, (byte) 0), // Edge case: zero - new MariaTypeAndExample<>(MariaTypes.smallint, (short) 4242), - new MariaTypeAndExample<>(MariaTypes.smallint, Short.MIN_VALUE), // Edge case: min value - new MariaTypeAndExample<>(MariaTypes.smallint, Short.MAX_VALUE), // Edge case: max value - new MariaTypeAndExample<>(MariaTypes.smallint, (short) 0), // Edge case: zero - new MariaTypeAndExample<>(MariaTypes.mediumint, 424242), - new MariaTypeAndExample<>(MariaTypes.mediumint, -8388608), // Edge case: min MEDIUMINT 
- new MariaTypeAndExample<>(MariaTypes.mediumint, 8388607), // Edge case: max MEDIUMINT - new MariaTypeAndExample<>(MariaTypes.mediumint, 0), // Edge case: zero - new MariaTypeAndExample<>(MariaTypes.int_, 42424242), - new MariaTypeAndExample<>(MariaTypes.int_, Integer.MIN_VALUE), // Edge case: min value - new MariaTypeAndExample<>(MariaTypes.int_, Integer.MAX_VALUE), // Edge case: max value - new MariaTypeAndExample<>(MariaTypes.int_, 0), // Edge case: zero - new MariaTypeAndExample<>(MariaTypes.bigint, 4242424242424242L), - new MariaTypeAndExample<>(MariaTypes.bigint, Long.MIN_VALUE), // Edge case: min value - new MariaTypeAndExample<>(MariaTypes.bigint, Long.MAX_VALUE), // Edge case: max value - new MariaTypeAndExample<>(MariaTypes.bigint, 0L), // Edge case: zero - - // ==================== Integer Types (Unsigned) ==================== - new MariaTypeAndExample<>( - MariaTypes.tinyintUnsigned, - new dev.typr.foundations.data.Uint1((short) 255)), // Max TINYINT UNSIGNED - new MariaTypeAndExample<>( - MariaTypes.tinyintUnsigned, - new dev.typr.foundations.data.Uint1((short) 0)), // Edge case: min unsigned - new MariaTypeAndExample<>( - MariaTypes.smallintUnsigned, - new dev.typr.foundations.data.Uint2(65535)), // Max SMALLINT UNSIGNED - new MariaTypeAndExample<>( - MariaTypes.smallintUnsigned, - new dev.typr.foundations.data.Uint2(0)), // Edge case: min unsigned - new MariaTypeAndExample<>( - MariaTypes.mediumintUnsigned, - new dev.typr.foundations.data.Uint4(16777215)), // Max MEDIUMINT UNSIGNED - new MariaTypeAndExample<>( - MariaTypes.mediumintUnsigned, - new dev.typr.foundations.data.Uint4(0)), // Edge case: min unsigned - new MariaTypeAndExample<>( - MariaTypes.intUnsigned, - new dev.typr.foundations.data.Uint4(4294967295L)), // Max INT UNSIGNED - new MariaTypeAndExample<>( - MariaTypes.intUnsigned, - new dev.typr.foundations.data.Uint4(0L)), // Edge case: min unsigned - new MariaTypeAndExample<>( - MariaTypes.bigintUnsigned, - new 
dev.typr.foundations.data.Uint8( - new BigInteger("18446744073709551615"))), // Max BIGINT UNSIGNED - new MariaTypeAndExample<>( - MariaTypes.bigintUnsigned, - new dev.typr.foundations.data.Uint8(BigInteger.ZERO)), // Edge case: min unsigned - - // ==================== Fixed-Point Types ==================== - new MariaTypeAndExample<>(MariaTypes.decimal, new BigDecimal("12345")), - new MariaTypeAndExample<>(MariaTypes.decimal, BigDecimal.ZERO), // Edge case: zero - new MariaTypeAndExample<>( - MariaTypes.decimal, new BigDecimal("-9999999999")), // Edge case: negative - new MariaTypeAndExample<>(MariaTypes.numeric, new BigDecimal("99999")), - new MariaTypeAndExample<>(MariaTypes.decimal(10, 2), new BigDecimal("12345678.90")), - new MariaTypeAndExample<>( - MariaTypes.decimal(10, 2), new BigDecimal("0.00")), // Edge case: zero with decimals - new MariaTypeAndExample<>( - MariaTypes.decimal(10, 2), - new BigDecimal("-99999999.99")), // Edge case: large negative - new MariaTypeAndExample<>(MariaTypes.decimal(10, 5), new BigDecimal("12345.67890")), - new MariaTypeAndExample<>( - MariaTypes.decimal(10, 5), new BigDecimal("0.00001")), // Edge case: small value - - // ==================== Floating-Point Types ==================== - new MariaTypeAndExample<>(MariaTypes.float_, 3.14159f).noIdentity(), - new MariaTypeAndExample<>(MariaTypes.float_, 0.0f).noIdentity(), // Edge case: zero - new MariaTypeAndExample<>(MariaTypes.float_, 1.0E-38f) - .noIdentity(), // Edge case: small positive - new MariaTypeAndExample<>(MariaTypes.double_, 3.141592653589793), - new MariaTypeAndExample<>(MariaTypes.double_, 0.0), // Edge case: zero - new MariaTypeAndExample<>(MariaTypes.double_, -3.141592653589793), // Edge case: negative - - // ==================== Boolean Type ==================== - new MariaTypeAndExample<>(MariaTypes.bool, true), - new MariaTypeAndExample<>(MariaTypes.bool, false), - - // ==================== Bit Types ==================== - // BIT types also have JSON encoding 
issues - new MariaTypeAndExample<>(MariaTypes.bit1, true).noJsonRoundtrip(), - new MariaTypeAndExample<>(MariaTypes.bit1, false) - .noJsonRoundtrip(), // Edge case: false bit - - // ==================== String Types ==================== - new MariaTypeAndExample<>(MariaTypes.char_(10), "hello"), - new MariaTypeAndExample<>(MariaTypes.char_(10), ""), // Edge case: empty string - new MariaTypeAndExample<>(MariaTypes.char_(10), "a"), // Edge case: single char - new MariaTypeAndExample<>( - MariaTypes.varchar(255), "Hello, MariaDB! Unicode: \u00e9\u00e8\u00ea \u4e2d\u6587"), - new MariaTypeAndExample<>(MariaTypes.varchar(255), ""), // Edge case: empty string - new MariaTypeAndExample<>( - MariaTypes.varchar(255), "Line1\nLine2\tTabbed"), // Edge case: whitespace - new MariaTypeAndExample<>( - MariaTypes.varchar(255), "Quote\"Test'Single\\Back"), // Edge case: special chars - new MariaTypeAndExample<>( - MariaTypes.varchar(255), - "Emoji: \uD83D\uDE00\uD83C\uDF89\uD83D\uDE80"), // Edge case: emoji - new MariaTypeAndExample<>(MariaTypes.tinytext, "tiny text content"), - new MariaTypeAndExample<>(MariaTypes.tinytext, ""), // Edge case: empty - new MariaTypeAndExample<>( - MariaTypes.text, "Regular text content with special chars: ,.;{}[]-//#\u00ae\u2705"), - new MariaTypeAndExample<>(MariaTypes.text, ""), // Edge case: empty - new MariaTypeAndExample<>(MariaTypes.mediumtext, "Medium text can hold up to 16MB"), - new MariaTypeAndExample<>(MariaTypes.longtext, "Long text can hold up to 4GB"), - - // ==================== Binary Types ==================== - // Note: MariaDB's JSON encoding of binary is lossy - bytes > 127 get corrupted - // because JSON is UTF-8 and MariaDB outputs raw bytes without proper encoding - new MariaTypeAndExample<>(MariaTypes.binary(5), new byte[] {0x01, 0x02, 0x03, 0x00, 0x00}) - .noJsonRoundtrip(), - new MariaTypeAndExample<>(MariaTypes.binary(5), new byte[] {0x00, 0x00, 0x00, 0x00, 0x00}) - .noJsonRoundtrip(), // Edge case: all zeros - new 
MariaTypeAndExample<>( - MariaTypes.varbinary(255), new byte[] {(byte) 0xFF, 0x00, 0x7F, (byte) 0x80}) - .noJsonRoundtrip(), - new MariaTypeAndExample<>(MariaTypes.varbinary(255), new byte[] {}) - .noJsonRoundtrip(), // Edge case: empty - new MariaTypeAndExample<>(MariaTypes.varbinary(255), new byte[] {0x00}) - .noJsonRoundtrip(), // Edge case: single zero byte - new MariaTypeAndExample<>(MariaTypes.tinyblob, new byte[] {0x01, 0x02, 0x03}) - .noJsonRoundtrip(), - new MariaTypeAndExample<>(MariaTypes.tinyblob, new byte[] {}) - .noJsonRoundtrip(), // Edge case: empty - new MariaTypeAndExample<>( - MariaTypes.blob, new byte[] {(byte) 0xDE, (byte) 0xAD, (byte) 0xBE, (byte) 0xEF}) - .noJsonRoundtrip(), - new MariaTypeAndExample<>(MariaTypes.blob, new byte[] {}) - .noJsonRoundtrip(), // Edge case: empty - new MariaTypeAndExample<>( - MariaTypes.mediumblob, new byte[] {0x00, 0x11, 0x22, 0x33, 0x44, 0x55}) - .noJsonRoundtrip(), - new MariaTypeAndExample<>( - MariaTypes.longblob, - new byte[] {(byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE}) - .noJsonRoundtrip(), - - // ==================== Date/Time Types ==================== - new MariaTypeAndExample<>(MariaTypes.date, LocalDate.of(2024, 6, 15)), - new MariaTypeAndExample<>(MariaTypes.date, LocalDate.of(1970, 1, 1)), // Edge case: epoch - new MariaTypeAndExample<>( - MariaTypes.date, LocalDate.of(2099, 12, 31)), // Edge case: far future - new MariaTypeAndExample<>( - MariaTypes.date, LocalDate.of(1000, 1, 1)), // Edge case: old date - new MariaTypeAndExample<>(MariaTypes.time, LocalTime.of(14, 30, 45)), - new MariaTypeAndExample<>(MariaTypes.time, LocalTime.of(0, 0, 0)), // Edge case: midnight - new MariaTypeAndExample<>( - MariaTypes.time, LocalTime.of(23, 59, 59)), // Edge case: end of day - new MariaTypeAndExample<>(MariaTypes.time(3), LocalTime.of(14, 30, 45, 123000000)), - new MariaTypeAndExample<>( - MariaTypes.time(6), LocalTime.of(14, 30, 45, 123456000)), // Edge case: microseconds - new 
MariaTypeAndExample<>(MariaTypes.datetime, LocalDateTime.of(2024, 6, 15, 14, 30, 45)), - new MariaTypeAndExample<>( - MariaTypes.datetime, LocalDateTime.of(1970, 1, 1, 0, 0, 0)), // Edge case: epoch - new MariaTypeAndExample<>( - MariaTypes.datetime(6), LocalDateTime.of(2024, 6, 15, 14, 30, 45, 123456000)), - new MariaTypeAndExample<>( - MariaTypes.timestamp, LocalDateTime.of(2024, 6, 15, 14, 30, 45)), - new MariaTypeAndExample<>( - MariaTypes.timestamp, - LocalDateTime.of( - 1971, 1, 1, 0, 0, - 1)), // Edge case: near epoch (timestamp starts at 1970-01-01 00:00:01) - new MariaTypeAndExample<>( - MariaTypes.timestamp(6), LocalDateTime.of(2024, 6, 15, 14, 30, 45, 123456000)), - new MariaTypeAndExample<>(MariaTypes.year, Year.of(2024)), - new MariaTypeAndExample<>(MariaTypes.year, Year.of(1901)), // Edge case: min YEAR - new MariaTypeAndExample<>(MariaTypes.year, Year.of(2155)), // Edge case: max YEAR - - // ==================== ENUM Type ==================== - new MariaTypeAndExample<>( - MariaTypes.ofEnum("ENUM('RED','GREEN','BLUE')", Color::valueOf), Color.GREEN), - new MariaTypeAndExample<>( - MariaTypes.ofEnum("ENUM('RED','GREEN','BLUE')", Color::valueOf), - Color.RED), // Edge case: first value - new MariaTypeAndExample<>( - MariaTypes.ofEnum("ENUM('RED','GREEN','BLUE')", Color::valueOf), - Color.BLUE), // Edge case: last value - - // ==================== SET Type ==================== - new MariaTypeAndExample<>( - MariaTypes.set.withTypename("SET('email','sms','push')"), - MariaSet.of("email", "push")), - new MariaTypeAndExample<>( - MariaTypes.set.withTypename("SET('a','b','c')"), - MariaSet.empty()), // Edge case: empty set - new MariaTypeAndExample<>( - MariaTypes.set.withTypename("SET('x','y','z')"), - MariaSet.of("x", "y", "z")), // Edge case: all values - new MariaTypeAndExample<>( - MariaTypes.set.withTypename("SET('only')"), - MariaSet.of("only")), // Edge case: single value - - // ==================== JSON Type ==================== - new 
MariaTypeAndExample<>( - MariaTypes.json, new Json("{\"name\": \"MariaDB\", \"version\": 10.11}")) - .noIdentity(), - new MariaTypeAndExample<>(MariaTypes.json, new Json("[1, 2, 3, \"four\"]")).noIdentity(), - new MariaTypeAndExample<>(MariaTypes.json, new Json("{}")) - .noIdentity(), // Edge case: empty object - new MariaTypeAndExample<>(MariaTypes.json, new Json("[]")) - .noIdentity(), // Edge case: empty array - new MariaTypeAndExample<>(MariaTypes.json, new Json("null")) - .noIdentity(), // Edge case: null - new MariaTypeAndExample<>(MariaTypes.json, new Json("\"string\"")) - .noIdentity(), // Edge case: string value - new MariaTypeAndExample<>(MariaTypes.json, new Json("42")) - .noIdentity(), // Edge case: number value - new MariaTypeAndExample<>(MariaTypes.json, new Json("true")) - .noIdentity(), // Edge case: boolean value - - // ==================== Network Types (MariaDB 10.10+) ==================== - new MariaTypeAndExample<>(MariaTypes.inet4, Inet4.parse("192.168.1.100")), - new MariaTypeAndExample<>(MariaTypes.inet4, Inet4.parse("10.0.0.1")), - new MariaTypeAndExample<>( - MariaTypes.inet4, Inet4.parse("0.0.0.0")), // Edge case: any address - new MariaTypeAndExample<>( - MariaTypes.inet4, Inet4.parse("255.255.255.255")), // Edge case: broadcast - new MariaTypeAndExample<>( - MariaTypes.inet4, Inet4.parse("127.0.0.1")), // Edge case: localhost - new MariaTypeAndExample<>(MariaTypes.inet6, Inet6.parse("2001:db8::1")), - new MariaTypeAndExample<>( - MariaTypes.inet6, Inet6.parse("::ffff:192.168.1.1")), // IPv4-mapped - new MariaTypeAndExample<>(MariaTypes.inet6, Inet6.parse("::")), // Edge case: any address - new MariaTypeAndExample<>(MariaTypes.inet6, Inet6.parse("::1")), // Edge case: localhost - new MariaTypeAndExample<>( - MariaTypes.inet6, Inet6.parse("fe80::1")) // Edge case: link-local - ); - - // Connection helper for MariaDB - static T withConnection(SqlFunction f) { - try (var conn = - java.sql.DriverManager.getConnection( - 
"jdbc:mariadb://localhost:3307/typr?user=typr&password=password")) { - conn.setAutoCommit(false); - try { - return f.apply(conn); - } finally { - conn.rollback(); - } - } catch (SQLException e) { - throw new RuntimeException(e); - } - } - - @Test - public void test() { - System.out.println("Testing MariaDB type codecs...\n"); - - // Test JSON roundtrip first (no database connection needed) - parallel - System.out.println("=== JSON Roundtrip Tests (parallel) ==="); - All.parallelStream().forEach(MariaTypeTest::testJsonRoundtrip); - System.out.println(); - - // Run native type and JSON DB tests in parallel - System.out.println("=== Native Type + JSON DB Roundtrip Tests (parallel) ==="); - var failures = - All.parallelStream() - .flatMap( - t -> { - var errors = new java.util.ArrayList(); - - // Native type roundtrip test - try { - withConnection( - conn -> { - testCase(conn, t); - return null; - }); - } catch (Exception e) { - errors.add( - "Native test FAILED " - + t.type.typename().sqlType() - + ": " - + e.getMessage()); - } - - // JSON DB roundtrip test - if (t.jsonRoundtripWorks()) { - try { - withConnection( - conn -> { - testJsonDbRoundtrip(conn, t); - return null; - }); - } catch (Exception e) { - errors.add( - "JSON DB test FAILED " - + t.type.typename().sqlType() - + ": " - + e.getMessage()); - } - } - - return errors.stream(); - }) - .toList(); - - System.out.println("\n====================================="); - if (failures.isEmpty()) { - System.out.println("All tests passed!"); - } else { - failures.forEach(System.out::println); - throw new RuntimeException(failures.size() + " tests failed"); - } - System.out.println("====================================="); - } - - static void testJsonRoundtrip(MariaTypeAndExample t) { - try { - MariaJson jsonCodec = t.type.mariaJson(); - A original = t.example; - - // Test toJson -> encode -> parse -> fromJson roundtrip (in-memory) - JsonValue jsonValue = jsonCodec.toJson(original); - String encoded = 
jsonValue.encode(); - JsonValue parsed = JsonValue.parse(encoded); - A decoded = jsonCodec.fromJson(parsed); - - System.out.println( - "JSON roundtrip " - + t.type.typename().sqlType() - + ": " - + format(original) - + " -> " - + encoded - + " -> " - + format(decoded)); - - if (t.hasIdentity && !areEqual(decoded, original)) { - throw new RuntimeException( - "JSON roundtrip failed for " - + t.type.typename().sqlType() - + ": expected '" - + format(original) - + "' but got '" - + format(decoded) - + "'"); - } - } catch (Exception e) { - throw new RuntimeException( - "JSON roundtrip test failed for " + t.type.typename().sqlType(), e); - } - } - - // Test JSON roundtrip through the database - simulates MULTISET behavior - // Insert value into native column, read back as JSON, parse back to value - static void testJsonDbRoundtrip(Connection conn, MariaTypeAndExample t) - throws SQLException { - MariaJson jsonCodec = t.type.mariaJson(); - A original = t.example; - String sqlType = t.type.typename().sqlType(); - String tableName = uniqueTableName("test_json_rt"); - - // Create temp table with the native type column - conn.createStatement().execute("CREATE TEMPORARY TABLE " + tableName + " (v " + sqlType + ")"); - - try { - // Insert value using native type - var insert = conn.prepareStatement("INSERT INTO " + tableName + " (v) VALUES (?)"); - t.type.write().set(insert, 1, original); - insert.execute(); - insert.close(); - - // Select back as JSON using JSON_OBJECT - this is what MULTISET does - var select = conn.prepareStatement("SELECT JSON_OBJECT('v', v) FROM " + tableName); - select.execute(); - var rs = select.getResultSet(); - - if (!rs.next()) { - throw new RuntimeException("No rows returned"); - } - - // Read the JSON string back from the database - String jsonFromDb = rs.getString(1); - select.close(); - - // Parse the JSON object and extract 'v' field - JsonValue parsedFromDb = JsonValue.parse(jsonFromDb); - JsonValue fieldValue = ((JsonValue.JObject) 
parsedFromDb).get("v"); - A decoded = jsonCodec.fromJson(fieldValue); - - if (t.hasIdentity && !areEqual(decoded, original)) { - throw new RuntimeException( - "JSON DB roundtrip failed for " - + sqlType - + ": expected '" - + format(original) - + "' but got '" - + format(decoded) - + "'"); - } - } finally { - conn.createStatement().execute("DROP TEMPORARY TABLE IF EXISTS " + tableName); - } - } - - static void testCase(Connection conn, MariaTypeAndExample t) throws SQLException { - String sqlType = t.type.typename().sqlType(); - String tableName = uniqueTableName("test_table"); - - // Create temp table - conn.createStatement().execute("CREATE TEMPORARY TABLE " + tableName + " (v " + sqlType + ")"); - - try { - // Insert using PreparedStatement - var insert = conn.prepareStatement("INSERT INTO " + tableName + " (v) VALUES (?)"); - A expected = t.example; - t.type.write().set(insert, 1, expected); - insert.execute(); - insert.close(); - - // Select and verify - final PreparedStatement select; - if (t.hasIdentity) { - select = conn.prepareStatement("SELECT v, NULL FROM " + tableName + " WHERE v = ?"); - t.type.write().set(select, 1, expected); - } else { - select = conn.prepareStatement("SELECT v, NULL FROM " + tableName); - } - - select.execute(); - var rs = select.getResultSet(); - - if (!rs.next()) { - throw new RuntimeException("No rows returned"); - } - - // Read the value - A actual = t.type.read().read(rs, 1); - // Read the null value using opt() - Optional actualNull = t.type.opt().read().read(rs, 2); - - select.close(); - - assertEquals(actual, expected, "value mismatch"); - assertEquals(actualNull, Optional.empty(), "null value mismatch"); - - } finally { - // Drop temp table - conn.createStatement().execute("DROP TEMPORARY TABLE IF EXISTS " + tableName); - } - } - - static void assertEquals(A actual, A expected, String message) { - if (!areEqual(actual, expected)) { - throw new RuntimeException( - message + ": actual='" + format(actual) + "' expected='" + 
format(expected) + "'"); - } - } - - static boolean areEqual(A actual, A expected) { - if (expected == null && actual == null) return true; - if (expected == null || actual == null) return false; - - if (expected instanceof byte[]) { - return Arrays.equals((byte[]) actual, (byte[]) expected); - } - if (expected instanceof Object[]) { - return Arrays.deepEquals((Object[]) actual, (Object[]) expected); - } - - // For spatial types, compare string representations - if (expected instanceof org.mariadb.jdbc.type.Geometry) { - return actual.toString().equals(expected.toString()); - } - - return actual.equals(expected); - } - - static String format(A a) { - if (a == null) return "null"; - if (a instanceof byte[]) { - return bytesToHex((byte[]) a); - } - if (a instanceof Object[]) { - return Arrays.deepToString((Object[]) a); - } - return a.toString(); - } - - static String bytesToHex(byte[] bytes) { - StringBuilder sb = new StringBuilder(); - sb.append("["); - for (int i = 0; i < bytes.length; i++) { - if (i > 0) sb.append(", "); - sb.append(String.format("0x%02X", bytes[i])); - } - sb.append("]"); - return sb.toString(); - } -} diff --git a/foundations-jdbc-test/src/java/dev/typr/foundations/OracleTypeTest.java b/foundations-jdbc-test/src/java/dev/typr/foundations/OracleTypeTest.java deleted file mode 100644 index 89727b02dd..0000000000 --- a/foundations-jdbc-test/src/java/dev/typr/foundations/OracleTypeTest.java +++ /dev/null @@ -1,1804 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.connect.oracle.OracleConfig; -import dev.typr.foundations.data.Json; -import dev.typr.foundations.data.JsonValue; -import dev.typr.foundations.data.OracleIntervalDS; -import dev.typr.foundations.data.OracleIntervalYM; -import dev.typr.foundations.hikari.HikariDataSourceFactory; -import dev.typr.foundations.hikari.PoolConfig; -import dev.typr.foundations.hikari.PooledDataSource; -import java.math.BigDecimal; -import java.sql.Connection; -import 
java.sql.PreparedStatement; -import java.sql.SQLException; -import java.time.LocalDateTime; -import java.time.OffsetDateTime; -import java.time.ZoneOffset; -import java.util.ArrayList; -import java.util.Arrays; -import java.util.HashSet; -import java.util.List; -import java.util.Optional; -import java.util.concurrent.atomic.AtomicInteger; -import org.junit.Test; - -/** Tests for Oracle type codecs. Tests all types defined in OracleTypes. */ -public class OracleTypeTest { - - private static final AtomicInteger tableCounter = new AtomicInteger(0); - - // Connection pool with limited size to avoid exhausting Oracle Free's connection limit - private static final PooledDataSource POOL; - - static { - java.util.TimeZone.setDefault(java.util.TimeZone.getTimeZone("GMT+03:00")); - var config = - OracleConfig.builder("localhost", 1521, "FREEPDB1", "typr", "typr_password") - .serviceName("FREEPDB1") - .build(); - var poolConfig = PoolConfig.builder().maximumPoolSize(5).build(); - POOL = HikariDataSourceFactory.create(config, poolConfig); - } - - private static String uniqueTableName(String prefix) { - return prefix + "_" + tableCounter.incrementAndGet(); - } - - // ═══════════════════════════════════════════════════════════════════════════ - // Test Data Types for Object-Relational Types - // ═══════════════════════════════════════════════════════════════════════════ - - /** Address - corresponds to Oracle type address_t */ - record Coordinates(BigDecimal latitude, BigDecimal longitude) {} - - record Address(String street, String city, Coordinates location) {} - - // Helper to build COORDINATES_T type - static OracleType coordinatesType() { - return OracleObject.builder("COORDINATES_T") - .addAttribute("LATITUDE", OracleTypes.number(9, 6), Coordinates::latitude) - .addAttribute("LONGITUDE", OracleTypes.number(9, 6), Coordinates::longitude) - .build(attrs -> new Coordinates((BigDecimal) attrs[0], (BigDecimal) attrs[1])) - .asType(); - } - - // Helper to build ADDRESS_T type 
(with nested COORDINATES_T) - static OracleType

addressType() { - return OracleObject.
builder("ADDRESS_T") - .addAttribute("STREET", OracleTypes.varchar2(100), Address::street) - .addAttribute("CITY", OracleTypes.varchar2(50), Address::city) - .addAttribute("LOCATION", coordinatesType(), Address::location) - .build(attrs -> new Address((String) attrs[0], (String) attrs[1], (Coordinates) attrs[2])) - .asType(); - } - - // Example coordinates - static Coordinates coords(String lat, String lon) { - return new Coordinates(new BigDecimal(lat), new BigDecimal(lon)); - } - - /** OrderItem - corresponds to Oracle type order_item_t */ - record OrderItem(Long productId, Integer quantity) {} - - /** AllTypesStruct - comprehensive struct containing all Oracle types (NOT NULL fields) */ - record AllTypesStruct( - String varcharField, - String nvarcharField, - String charField, - String ncharField, - BigDecimal numberField, - Integer numberIntField, - Long numberLongField, - Float binaryFloatField, - Double binaryDoubleField, - LocalDateTime dateField, - LocalDateTime timestampField, - OffsetDateTime timestampTzField, - OffsetDateTime timestampLtzField, - OracleIntervalYM intervalYmField, - OracleIntervalDS intervalDsField, - Address nestedObjectField, - List varrayField) {} - - /** AllTypesStructOptional - comprehensive struct containing all Oracle types (all nullable) */ - record AllTypesStructOptional( - Optional varcharField, - Optional nvarcharField, - Optional charField, - Optional ncharField, - Optional numberField, - Optional numberIntField, - Optional numberLongField, - Optional binaryFloatField, - Optional binaryDoubleField, - Optional dateField, - Optional timestampField, - Optional timestampTzField, - Optional timestampLtzField, - Optional intervalYmField, - Optional intervalDsField, - Optional
nestedObjectField, - Optional> varrayField) {} - - /** - * AllTypesStructNoLobs - Oracle types without LOBs (for VARRAY compatibility) Oracle restriction: - * VARRAYs cannot contain structs with embedded LOB types or nested tables - */ - record AllTypesStructNoLobs( - String varcharField, - String nvarcharField, - String charField, - String ncharField, - BigDecimal numberField, - Integer numberIntField, - Long numberLongField, - Float binaryFloatField, - Double binaryDoubleField, - LocalDateTime dateField, - LocalDateTime timestampField, - OffsetDateTime timestampTzField, - OffsetDateTime timestampLtzField, - OracleIntervalYM intervalYmField, - OracleIntervalDS intervalDsField, - Address nestedObjectField, - List varrayField) {} - - /** AllTypesStructNoLobsOptional - Optional variant without LOBs (for VARRAY compatibility) */ - record AllTypesStructNoLobsOptional( - Optional varcharField, - Optional nvarcharField, - Optional charField, - Optional ncharField, - Optional numberField, - Optional numberIntField, - Optional numberLongField, - Optional binaryFloatField, - Optional binaryDoubleField, - Optional dateField, - Optional timestampField, - Optional timestampTzField, - Optional timestampLtzField, - Optional intervalYmField, - Optional intervalDsField, - Optional
nestedObjectField, - Optional> varrayField) {} - - // ═══════════════════════════════════════════════════════════════════════════ - - record OracleTypeAndExample( - OracleType type, - A example, - A expectedRoundtrip, // Nullable - the expected value after roundtrip (may be null for SQL - // NULL) - boolean useExpectedRoundtrip, // If true, use expectedRoundtrip; if false, use example - boolean hasIdentity, - boolean streamingWorks, - boolean jsonRoundtripWorks, - List - setupSql // Optional SQL statements to run before test (for type definitions, etc.) - ) { - public OracleTypeAndExample(OracleType type, A example) { - this(type, example, null, false, true, true, true, List.of()); - } - - public OracleTypeAndExample(OracleType type, A example, A expectedRoundtrip) { - this(type, example, expectedRoundtrip, true, true, true, true, List.of()); - } - - public OracleTypeAndExample(OracleType type, A example, List setupSql) { - this(type, example, null, false, true, true, true, setupSql); - } - - public OracleTypeAndExample noStreaming() { - return new OracleTypeAndExample<>( - type, - example, - expectedRoundtrip, - useExpectedRoundtrip, - hasIdentity, - false, - jsonRoundtripWorks, - setupSql); - } - - public OracleTypeAndExample noIdentity() { - return new OracleTypeAndExample<>( - type, - example, - expectedRoundtrip, - useExpectedRoundtrip, - false, - streamingWorks, - jsonRoundtripWorks, - setupSql); - } - - public OracleTypeAndExample noJsonRoundtrip() { - return new OracleTypeAndExample<>( - type, - example, - expectedRoundtrip, - useExpectedRoundtrip, - hasIdentity, - streamingWorks, - false, - setupSql); - } - - public A expected() { - return useExpectedRoundtrip ? 
expectedRoundtrip : example; - } - } - - List> All = - List.>of( - // ═══════════════════════════════════════════════════════════════════════════ - // Numeric Types - // ═══════════════════════════════════════════════════════════════════════════ - - // NUMBER - universal numeric type - new OracleTypeAndExample<>(OracleTypes.number, new BigDecimal("12345.6789")), - new OracleTypeAndExample<>(OracleTypes.number, BigDecimal.ZERO), // Edge case: zero - new OracleTypeAndExample<>( - OracleTypes.number, new BigDecimal("-9999999999.999999")), // Edge case: negative - new OracleTypeAndExample<>( - OracleTypes.number, new BigDecimal("0.00000001")), // Edge case: small value - - // NUMBER as Integer - new OracleTypeAndExample<>(OracleTypes.numberInt, 42), - new OracleTypeAndExample<>(OracleTypes.numberInt, Integer.MIN_VALUE), - new OracleTypeAndExample<>(OracleTypes.numberInt, Integer.MAX_VALUE), - new OracleTypeAndExample<>(OracleTypes.numberInt, 0), - - // NUMBER as Long - new OracleTypeAndExample<>(OracleTypes.numberLong, 424242424242L), - new OracleTypeAndExample<>(OracleTypes.numberLong, Long.MIN_VALUE), - new OracleTypeAndExample<>(OracleTypes.numberLong, Long.MAX_VALUE), - new OracleTypeAndExample<>(OracleTypes.numberLong, 0L), - - // NUMBER with precision and scale - new OracleTypeAndExample<>(OracleTypes.number(10, 2), new BigDecimal("12345678.90")), - new OracleTypeAndExample<>(OracleTypes.number(10, 2), new BigDecimal("-99999999.99")), - new OracleTypeAndExample<>( - OracleTypes.number(38, 10), - new BigDecimal("1234567890123456789012345678.1234567890")), - - // BINARY_FLOAT - 32-bit IEEE 754 - new OracleTypeAndExample<>(OracleTypes.binaryFloat, 3.14159f), - new OracleTypeAndExample<>(OracleTypes.binaryFloat, 0.0f), - new OracleTypeAndExample<>(OracleTypes.binaryFloat, Float.MIN_VALUE), - new OracleTypeAndExample<>(OracleTypes.binaryFloat, Float.MAX_VALUE), - new OracleTypeAndExample<>(OracleTypes.binaryFloat, -1.5E10f), - - // BINARY_DOUBLE - 64-bit IEEE 754 - // 
Oracle supports range: 1.0E-130 to 1.0E126 (excluding zero) - new OracleTypeAndExample<>(OracleTypes.binaryDouble, 3.141592653589793), - new OracleTypeAndExample<>(OracleTypes.binaryDouble, 0.0), - new OracleTypeAndExample<>( - OracleTypes.binaryDouble, 1.0E-129), // Near Oracle's min positive value - new OracleTypeAndExample<>(OracleTypes.binaryDouble, -2.5E100), - - // FLOAT (ANSI type mapped to NUMBER) - new OracleTypeAndExample<>(OracleTypes.float_, 42.42), - new OracleTypeAndExample<>(OracleTypes.float_(63), 123.456), // REAL equivalent - - // ═══════════════════════════════════════════════════════════════════════════ - // Character Types - // ═══════════════════════════════════════════════════════════════════════════ - - // VARCHAR2 - new OracleTypeAndExample<>(OracleTypes.varchar2(100), "Hello, Oracle!"), - new OracleTypeAndExample<>(OracleTypes.varchar2(100), "", (String) null) - .noJsonRoundtrip(), // Oracle quirk: empty string → NULL - new OracleTypeAndExample<>( - OracleTypes.varchar2(100), "Unicode: \u00e9\u00e8\u00ea \u4e2d\u6587"), - new OracleTypeAndExample<>(OracleTypes.varchar2(100), "Line1\nLine2\tTabbed"), - new OracleTypeAndExample<>(OracleTypes.varchar2(100), "Quote\"Test'Single"), - new OracleTypeAndExample<>(OracleTypes.varchar2(100), "Special chars: ,.;{}[]-//#"), - - // VARCHAR2 with NonEmptyString (for NOT NULL columns) - new OracleTypeAndExample<>( - OracleTypes.varchar2NonEmpty(100), NonEmptyString.force("NonEmpty VARCHAR2")), - new OracleTypeAndExample<>( - OracleTypes.varchar2NonEmpty(100), NonEmptyString.force("Test \u4e2d\u6587")), - - // CHAR (fixed-length, blank-padded) - new OracleTypeAndExample<>( - OracleTypes.char_(10), "hello "), // Note: CHAR pads with spaces - new OracleTypeAndExample<>(OracleTypes.char_(5), "abc "), // May be trimmed on comparison - - // CHAR with PaddedString (for NOT NULL columns) - new OracleTypeAndExample<>(OracleTypes.charPadded(10), PaddedString.force("hello", 10)), - new OracleTypeAndExample<>( - 
OracleTypes.charPadded(20), PaddedString.force("padded test", 20)), - - // NVARCHAR2 (National character set) - new OracleTypeAndExample<>( - OracleTypes.nvarchar2(100), "Unicode text: \u0391\u0392\u0393"), - new OracleTypeAndExample<>(OracleTypes.nvarchar2(100), "Emoji: \uD83D\uDE00\uD83C\uDF89"), - - // NVARCHAR2 with NonEmptyString (for NOT NULL columns) - new OracleTypeAndExample<>( - OracleTypes.nvarchar2NonEmpty(100), NonEmptyString.force("NonEmpty NVARCHAR2")), - - // NCHAR - new OracleTypeAndExample<>(OracleTypes.nchar(10), "test "), - - // NCHAR with PaddedString (for NOT NULL columns) - new OracleTypeAndExample<>( - OracleTypes.ncharPadded(15), PaddedString.force("nchar test", 15)), - - // CLOB - Character Large Object (cannot be used as comparison key) - new OracleTypeAndExample<>( - OracleTypes.clob, "This is a CLOB text that could be very large.") - .noStreaming() - .noIdentity(), - new OracleTypeAndExample<>(OracleTypes.clob, "Short CLOB").noStreaming().noIdentity(), - - // CLOB with NonEmptyString (for NOT NULL columns - cannot be used as comparison key) - new OracleTypeAndExample<>( - OracleTypes.clobNonEmpty, NonEmptyString.force("NonEmpty CLOB text")) - .noStreaming() - .noIdentity(), - - // NCLOB - National CLOB (cannot be used as comparison key) - new OracleTypeAndExample<>(OracleTypes.nclob, "National CLOB with \u4e2d\u6587") - .noStreaming() - .noIdentity(), - - // NCLOB with NonEmptyString (for NOT NULL columns - cannot be used as comparison key) - new OracleTypeAndExample<>( - OracleTypes.nclobNonEmpty, NonEmptyString.force("NonEmpty NCLOB \u4e2d\u6587")) - .noStreaming() - .noIdentity(), - - // ═══════════════════════════════════════════════════════════════════════════ - // Binary Types - // ═══════════════════════════════════════════════════════════════════════════ - - // RAW - new OracleTypeAndExample<>( - OracleTypes.raw(100), new byte[] {0x01, 0x02, 0x03, (byte) 0xFF}), - new OracleTypeAndExample<>(OracleTypes.raw(100), new byte[] {}, 
(byte[]) null) - .noJsonRoundtrip(), // Oracle quirk: empty byte array → NULL - new OracleTypeAndExample<>( - OracleTypes.raw(100), new byte[] {0x00, 0x00, 0x00}), // Edge case: zeros - new OracleTypeAndExample<>( - OracleTypes.raw(100), - new byte[] {(byte) 0xDE, (byte) 0xAD, (byte) 0xBE, (byte) 0xEF}), - - // BLOB - Binary Large Object (cannot be used as comparison key) - new OracleTypeAndExample<>( - OracleTypes.blob, new byte[] {(byte) 0xCA, (byte) 0xFE, (byte) 0xBA, (byte) 0xBE}) - .noStreaming() - .noIdentity(), - new OracleTypeAndExample<>(OracleTypes.blob, new byte[] {}) - .noStreaming() - .noIdentity(), // Edge case: empty - new OracleTypeAndExample<>( - OracleTypes.blob, new byte[] {0x00, 0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77}) - .noStreaming() - .noIdentity(), - - // ═══════════════════════════════════════════════════════════════════════════ - // Date/Time Types - // ═══════════════════════════════════════════════════════════════════════════ - - // DATE (includes time in Oracle!) 
- new OracleTypeAndExample<>(OracleTypes.date, LocalDateTime.of(2024, 6, 15, 14, 30, 45)), - new OracleTypeAndExample<>( - OracleTypes.date, LocalDateTime.of(1970, 1, 1, 0, 0, 0)), // Edge case: epoch - new OracleTypeAndExample<>( - OracleTypes.date, - LocalDateTime.of(2099, 12, 31, 23, 59, 59)), // Edge case: far future - new OracleTypeAndExample<>( - OracleTypes.date, LocalDateTime.of(1, 1, 1, 0, 0, 0)), // Edge case: very old - - // TIMESTAMP - new OracleTypeAndExample<>( - OracleTypes.timestamp, LocalDateTime.of(2024, 6, 15, 14, 30, 45, 123456000)), - new OracleTypeAndExample<>( - OracleTypes.timestamp(6), LocalDateTime.of(2024, 6, 15, 14, 30, 45, 123456000)), - new OracleTypeAndExample<>( - OracleTypes.timestamp(9), LocalDateTime.of(2024, 6, 15, 14, 30, 45, 123456789)), - new OracleTypeAndExample<>( - OracleTypes.timestamp, LocalDateTime.of(1970, 1, 1, 0, 0, 0, 0)), // Edge case: epoch - - // TIMESTAMP WITH TIME ZONE - new OracleTypeAndExample<>( - OracleTypes.timestampWithTimeZone, - OffsetDateTime.of(2024, 6, 15, 14, 30, 45, 0, ZoneOffset.ofHours(2))), - new OracleTypeAndExample<>( - OracleTypes.timestampWithTimeZone, - OffsetDateTime.of(2024, 1, 1, 0, 0, 0, 0, ZoneOffset.UTC)), - new OracleTypeAndExample<>( - OracleTypes.timestampWithTimeZone, - OffsetDateTime.of( - 2024, 6, 15, 14, 30, 45, 123456000, ZoneOffset.ofHoursMinutes(-5, -30))), - - // TIMESTAMP WITH LOCAL TIME ZONE - new OracleTypeAndExample<>( - OracleTypes.timestampWithLocalTimeZone, - OffsetDateTime.of(2024, 6, 15, 14, 30, 45, 0, ZoneOffset.ofHours(3))), - - // INTERVAL YEAR TO MONTH - Now using OracleIntervalYM class (parses both Oracle and - // ISO-8601 formats) - new OracleTypeAndExample<>( - OracleTypes.intervalYearToMonth, new OracleIntervalYM(2, 5)), // 2 years, 5 months - new OracleTypeAndExample<>( - OracleTypes.intervalYearToMonth, - new OracleIntervalYM(-1, -6)), // negative: -1 year -6 months - new OracleTypeAndExample<>( - OracleTypes.intervalYearToMonth, new OracleIntervalYM(0, 
0)), // zero - - // INTERVAL DAY TO SECOND - Now using OracleIntervalDS class (parses both Oracle and - // ISO-8601 formats) - new OracleTypeAndExample<>( - OracleTypes.intervalDayToSecond, - new OracleIntervalDS(3, 14, 30, 45, 123456000)), // 3 days 14:30:45.123456 - new OracleTypeAndExample<>( - OracleTypes.intervalDayToSecond, - new OracleIntervalDS(-1, 0, 0, 0, 0)), // negative: -1 day - new OracleTypeAndExample<>( - OracleTypes.intervalDayToSecond, new OracleIntervalDS(0, 0, 0, 0, 0)), // zero - - // ═══════════════════════════════════════════════════════════════════════════ - // ROWID Types - // ═══════════════════════════════════════════════════════════════════════════ - - // Note: ROWID values are generated by Oracle and cannot be inserted directly - // We test with mock values that follow the format but may not represent real rows - - // ═══════════════════════════════════════════════════════════════════════════ - // JSON Type (Oracle 21c+) - // Note: JSON type doesn't support equality comparisons in WHERE clauses - // ═══════════════════════════════════════════════════════════════════════════ - - new OracleTypeAndExample<>( - OracleTypes.json, new Json("{\"name\": \"Oracle\", \"version\": 23}")) - .noIdentity(), - new OracleTypeAndExample<>(OracleTypes.json, new Json("[1, 2, 3, \"four\"]")) - .noIdentity(), - new OracleTypeAndExample<>(OracleTypes.json, new Json("{}")) - .noIdentity(), // Edge case: empty object - new OracleTypeAndExample<>(OracleTypes.json, new Json("[]")) - .noIdentity(), // Edge case: empty array - new OracleTypeAndExample<>(OracleTypes.json, new Json("null")) - .noIdentity(), // Edge case: null - new OracleTypeAndExample<>(OracleTypes.json, new Json("\"string\"")) - .noIdentity(), // Edge case: string - new OracleTypeAndExample<>(OracleTypes.json, new Json("42")) - .noIdentity(), // Edge case: number - new OracleTypeAndExample<>(OracleTypes.json, new Json("true")) - .noIdentity(), // Edge case: boolean - - // 
═══════════════════════════════════════════════════════════════════════════ - // Boolean Type (Oracle 23c+ native, or NUMBER(1) convention) - // ═══════════════════════════════════════════════════════════════════════════ - - // Native BOOLEAN (23c+) - comment out if using older Oracle - // new OracleTypeAndExample<>(OracleTypes.boolean_, true), - // new OracleTypeAndExample<>(OracleTypes.boolean_, false), - - // NUMBER(1) as Boolean (traditional approach) - new OracleTypeAndExample<>(OracleTypes.numberAsBoolean, true), - new OracleTypeAndExample<>(OracleTypes.numberAsBoolean, false), - - // ═══════════════════════════════════════════════════════════════════════════ - // Object-Relational Types (User-Defined Types) - // ═══════════════════════════════════════════════════════════════════════════ - - // OBJECT TYPE example - address_t (with nested coordinates_t) - new OracleTypeAndExample<>( - addressType(), - new Address("123 Main St", "San Francisco", coords("37.774929", "-122.419418")), - List.of( - """ - CREATE OR REPLACE TYPE COORDINATES_T AS OBJECT ( - LATITUDE NUMBER(9,6), - LONGITUDE NUMBER(9,6) - ) - """, - """ - CREATE OR REPLACE TYPE ADDRESS_T AS OBJECT ( - STREET VARCHAR2(100), - CITY VARCHAR2(50), - LOCATION COORDINATES_T - ) - """)), - - // VARRAY example - phone_list (max 5 elements) - cannot be used as comparison key - new OracleTypeAndExample<>( - OracleVArray.of("PHONE_LIST", 5, OracleTypes.varchar2(20)), - List.of("555-1234", "555-5678", "555-9999"), - null, - false, - false, - true, - true, - List.of("CREATE OR REPLACE TYPE PHONE_LIST AS VARRAY(5) OF VARCHAR2(20)")), - - // VARRAY edge case - single element - new OracleTypeAndExample<>( - OracleVArray.of("PHONE_LIST", 5, OracleTypes.varchar2(20)), - List.of("555-0000"), - null, - false, - false, - true, - true, - List.of("CREATE OR REPLACE TYPE PHONE_LIST AS VARRAY(5) OF VARCHAR2(20)")), - - // VARRAY edge case - max size (5 elements) - new OracleTypeAndExample<>( - OracleVArray.of("PHONE_LIST", 5, 
OracleTypes.varchar2(20)), - List.of("555-1111", "555-2222", "555-3333", "555-4444", "555-5555"), - null, - false, - false, - true, - true, - List.of("CREATE OR REPLACE TYPE PHONE_LIST AS VARRAY(5) OF VARCHAR2(20)")), - - // NESTED TABLE example - order_items_t with nested OBJECT type - new OracleTypeAndExample<>( - OracleNestedTable.of( - "ORDER_ITEMS_T", - OracleObject.builder("ORDER_ITEM_T") - .addAttribute("PRODUCT_ID", OracleTypes.numberLong, OrderItem::productId) - .addAttribute("QUANTITY", OracleTypes.numberInt, OrderItem::quantity) - .build(attrs -> new OrderItem((Long) attrs[0], (Integer) attrs[1])) - .asType()), - List.of(new OrderItem(101L, 2), new OrderItem(202L, 5), new OrderItem(303L, 1)), - List.of( - """ - CREATE OR REPLACE TYPE ORDER_ITEM_T AS OBJECT ( - PRODUCT_ID NUMBER, - QUANTITY NUMBER - ) - """, - "CREATE OR REPLACE TYPE ORDER_ITEMS_T AS TABLE OF ORDER_ITEM_T")) - .noIdentity(), // Collections can't be used as comparison keys - - // NESTED TABLE edge case - single item - new OracleTypeAndExample<>( - OracleNestedTable.of( - "ORDER_ITEMS_T", - OracleObject.builder("ORDER_ITEM_T") - .addAttribute("PRODUCT_ID", OracleTypes.numberLong, OrderItem::productId) - .addAttribute("QUANTITY", OracleTypes.numberInt, OrderItem::quantity) - .build(attrs -> new OrderItem((Long) attrs[0], (Integer) attrs[1])) - .asType()), - List.of(new OrderItem(999L, 42)), - List.of( - """ - CREATE OR REPLACE TYPE ORDER_ITEM_T AS OBJECT ( - PRODUCT_ID NUMBER, - QUANTITY NUMBER - ) - """, - "CREATE OR REPLACE TYPE ORDER_ITEMS_T AS TABLE OF ORDER_ITEM_T")) - .noIdentity(), // Collections can't be used as comparison keys - - // NESTED TABLE edge case - empty list - new OracleTypeAndExample<>( - OracleNestedTable.of( - "ORDER_ITEMS_T", - OracleObject.builder("ORDER_ITEM_T") - .addAttribute("PRODUCT_ID", OracleTypes.numberLong, OrderItem::productId) - .addAttribute("QUANTITY", OracleTypes.numberInt, OrderItem::quantity) - .build(attrs -> new OrderItem((Long) attrs[0], (Integer) 
attrs[1])) - .asType()), - List.of(), - List.of( - """ - CREATE OR REPLACE TYPE ORDER_ITEM_T AS OBJECT ( - PRODUCT_ID NUMBER, - QUANTITY NUMBER - ) - """, - "CREATE OR REPLACE TYPE ORDER_ITEMS_T AS TABLE OF ORDER_ITEM_T")) - .noIdentity(), - - // ═══════════════════════════════════════════════════════════════════════════ - // Comprehensive STRUCT Tests - All Oracle Types - // ═══════════════════════════════════════════════════════════════════════════ - - // TEST_ALLTYPES - struct with all Oracle types (NOT NULL fields) - new OracleTypeAndExample( - OracleObject.builder("TEST_ALLTYPES") - .addAttribute("VARCHAR_FIELD", OracleTypes.varchar2(100), s -> s.varcharField) - .addAttribute( - "NVARCHAR_FIELD", OracleTypes.nvarchar2(100), s -> s.nvarcharField) - .addAttribute("CHAR_FIELD", OracleTypes.char_(10), s -> s.charField) - .addAttribute("NCHAR_FIELD", OracleTypes.nchar(10), s -> s.ncharField) - .addAttribute("NUMBER_FIELD", OracleTypes.number, s -> s.numberField) - .addAttribute( - "NUMBER_INT_FIELD", OracleTypes.numberInt, s -> s.numberIntField) - .addAttribute( - "NUMBER_LONG_FIELD", OracleTypes.numberLong, s -> s.numberLongField) - .addAttribute( - "BINARY_FLOAT_FIELD", OracleTypes.binaryFloat, s -> s.binaryFloatField) - .addAttribute( - "BINARY_DOUBLE_FIELD", OracleTypes.binaryDouble, s -> s.binaryDoubleField) - .addAttribute("DATE_FIELD", OracleTypes.date, s -> s.dateField) - .addAttribute("TIMESTAMP_FIELD", OracleTypes.timestamp, s -> s.timestampField) - .addAttribute( - "TIMESTAMP_TZ_FIELD", - OracleTypes.timestampWithTimeZone, - s -> s.timestampTzField) - .addAttribute( - "TIMESTAMP_LTZ_FIELD", - OracleTypes.timestampWithLocalTimeZone, - s -> s.timestampLtzField) - .addAttribute( - "INTERVAL_YM_FIELD", - OracleTypes.intervalYearToMonth, - s -> s.intervalYmField) - .addAttribute( - "INTERVAL_DS_FIELD", - OracleTypes.intervalDayToSecond, - s -> s.intervalDsField) - .addAttribute("NESTED_OBJECT_FIELD", addressType(), s -> s.nestedObjectField) - .addAttribute( - 
"VARRAY_FIELD", - OracleVArray.of("PHONE_LIST", 5, OracleTypes.varchar2(20)), - s -> s.varrayField) - .build( - attrs -> - new AllTypesStruct( - (String) attrs[0], // varcharField - (String) attrs[1], // nvarcharField - (String) attrs[2], // charField - (String) attrs[3], // ncharField - (BigDecimal) attrs[4], // numberField - (Integer) attrs[5], // numberIntField - (Long) attrs[6], // numberLongField - (Float) attrs[7], // binaryFloatField - (Double) attrs[8], // binaryDoubleField - (LocalDateTime) attrs[9], // dateField - (LocalDateTime) attrs[10], // timestampField - (OffsetDateTime) attrs[11], // timestampTzField - (OffsetDateTime) attrs[12], // timestampLtzField - (OracleIntervalYM) attrs[13], // intervalYmField - (OracleIntervalDS) attrs[14], // intervalDsField - (Address) attrs[15], // nestedObjectField - (List) attrs[16] // varrayField - )) - .asType(), - new AllTypesStruct( - "test varchar", - "test nvarchar", - "char10 ", - "nchar10 ", - new BigDecimal("123.45"), - 42, - 12345678L, - 3.14f, - 2.718, - LocalDateTime.of(2024, 6, 15, 14, 30, 45), - LocalDateTime.of(2024, 6, 15, 14, 30, 45, 123456000), - OffsetDateTime.of(2024, 6, 15, 14, 30, 45, 0, ZoneOffset.ofHours(2)), - OffsetDateTime.of(2024, 6, 15, 14, 30, 45, 0, ZoneOffset.ofHours(3)), - new OracleIntervalYM(2, 5), - new OracleIntervalDS(3, 14, 30, 45, 123456000), - new Address("123 Main St", "San Francisco", coords("37.7749", "-122.4194")), - List.of("555-1234", "555-5678")), - List.of( - "BEGIN EXECUTE IMMEDIATE 'DROP TYPE TEST_ALLTYPES FORCE'; EXCEPTION WHEN" - + " OTHERS THEN NULL; END;", - """ - CREATE OR REPLACE TYPE TEST_ALLTYPES AS OBJECT ( - VARCHAR_FIELD VARCHAR2(100), - NVARCHAR_FIELD NVARCHAR2(100), - CHAR_FIELD CHAR(10), - NCHAR_FIELD NCHAR(10), - NUMBER_FIELD NUMBER, - NUMBER_INT_FIELD NUMBER(10), - NUMBER_LONG_FIELD NUMBER(19), - BINARY_FLOAT_FIELD BINARY_FLOAT, - BINARY_DOUBLE_FIELD BINARY_DOUBLE, - DATE_FIELD DATE, - TIMESTAMP_FIELD TIMESTAMP, - TIMESTAMP_TZ_FIELD TIMESTAMP WITH TIME 
ZONE, - TIMESTAMP_LTZ_FIELD TIMESTAMP WITH LOCAL TIME ZONE, - INTERVAL_YM_FIELD INTERVAL YEAR TO MONTH, - INTERVAL_DS_FIELD INTERVAL DAY TO SECOND, - NESTED_OBJECT_FIELD ADDRESS_T, - VARRAY_FIELD PHONE_LIST - ) - """)) - .noIdentity() - .noJsonRoundtrip(), // Complex struct - skip identity and JSON tests for now - - // TEST_ALLTYPES_OPT - comprehensive struct with all nullable fields - new OracleTypeAndExample( - OracleObject.builder("TEST_ALLTYPES_OPT") - .addAttribute( - "VARCHAR_FIELD", OracleTypes.varchar2(100).opt(), s -> s.varcharField) - .addAttribute( - "NVARCHAR_FIELD", OracleTypes.nvarchar2(100).opt(), s -> s.nvarcharField) - .addAttribute("CHAR_FIELD", OracleTypes.char_(10).opt(), s -> s.charField) - .addAttribute("NCHAR_FIELD", OracleTypes.nchar(10).opt(), s -> s.ncharField) - .addAttribute("NUMBER_FIELD", OracleTypes.number.opt(), s -> s.numberField) - .addAttribute( - "NUMBER_INT_FIELD", OracleTypes.numberInt.opt(), s -> s.numberIntField) - .addAttribute( - "NUMBER_LONG_FIELD", OracleTypes.numberLong.opt(), s -> s.numberLongField) - .addAttribute( - "BINARY_FLOAT_FIELD", - OracleTypes.binaryFloat.opt(), - s -> s.binaryFloatField) - .addAttribute( - "BINARY_DOUBLE_FIELD", - OracleTypes.binaryDouble.opt(), - s -> s.binaryDoubleField) - .addAttribute("DATE_FIELD", OracleTypes.date.opt(), s -> s.dateField) - .addAttribute( - "TIMESTAMP_FIELD", OracleTypes.timestamp.opt(), s -> s.timestampField) - .addAttribute( - "TIMESTAMP_TZ_FIELD", - OracleTypes.timestampWithTimeZone.opt(), - s -> s.timestampTzField) - .addAttribute( - "TIMESTAMP_LTZ_FIELD", - OracleTypes.timestampWithLocalTimeZone.opt(), - s -> s.timestampLtzField) - .addAttribute( - "INTERVAL_YM_FIELD", - OracleTypes.intervalYearToMonth.opt(), - s -> s.intervalYmField) - .addAttribute( - "INTERVAL_DS_FIELD", - OracleTypes.intervalDayToSecond.opt(), - s -> s.intervalDsField) - .addAttribute( - "NESTED_OBJECT_FIELD", addressType().opt(), s -> s.nestedObjectField) - .addAttribute( - "VARRAY_FIELD", - 
OracleVArray.of("PHONE_LIST", 5, OracleTypes.varchar2(20)).opt(), - s -> s.varrayField) - .build( - attrs -> - new AllTypesStructOptional( - (Optional) attrs[0], // varcharField - (Optional) attrs[1], // nvarcharField - (Optional) attrs[2], // charField - (Optional) attrs[3], // ncharField - (Optional) attrs[4], // numberField - (Optional) attrs[5], // numberIntField - (Optional) attrs[6], // numberLongField - (Optional) attrs[7], // binaryFloatField - (Optional) attrs[8], // binaryDoubleField - (Optional) attrs[9], // dateField - (Optional) attrs[10], // timestampField - (Optional) attrs[11], // timestampTzField - (Optional) attrs[12], // timestampLtzField - (Optional) attrs[13], // intervalYmField - (Optional) attrs[14], // intervalDsField - (Optional
) attrs[15], // nestedObjectField - (Optional>) attrs[16] // varrayField - )) - .asType(), - new AllTypesStructOptional( - Optional.of("test varchar"), - Optional.empty(), // Test null nvarcharField - Optional.of("char10 "), - Optional.empty(), // Test null ncharField - Optional.of(new BigDecimal("123.45")), - Optional.of(42), - Optional.empty(), // Test null numberLongField - Optional.of(3.14f), - Optional.of(2.718), - Optional.of(LocalDateTime.of(2024, 6, 15, 14, 30, 45)), - Optional.empty(), // Test null timestampField - Optional.of( - OffsetDateTime.of(2024, 6, 15, 14, 30, 45, 0, ZoneOffset.ofHours(2))), - Optional.of( - OffsetDateTime.of(2024, 6, 15, 14, 30, 45, 0, ZoneOffset.ofHours(3))), - Optional.of(new OracleIntervalYM(2, 5)), - Optional.empty(), // Test null intervalDsField - Optional.of( - new Address( - "123 Main St", "San Francisco", coords("37.7749", "-122.4194"))), - Optional.of(List.of("555-1234", "555-5678"))), - List.of( - "BEGIN EXECUTE IMMEDIATE 'DROP TYPE TEST_ALLTYPES_OPT FORCE';" - + " EXCEPTION WHEN OTHERS THEN NULL; END;", - """ - CREATE OR REPLACE TYPE TEST_ALLTYPES_OPT AS OBJECT ( - VARCHAR_FIELD VARCHAR2(100), - NVARCHAR_FIELD NVARCHAR2(100), - CHAR_FIELD CHAR(10), - NCHAR_FIELD NCHAR(10), - NUMBER_FIELD NUMBER, - NUMBER_INT_FIELD NUMBER(10), - NUMBER_LONG_FIELD NUMBER(19), - BINARY_FLOAT_FIELD BINARY_FLOAT, - BINARY_DOUBLE_FIELD BINARY_DOUBLE, - DATE_FIELD DATE, - TIMESTAMP_FIELD TIMESTAMP, - TIMESTAMP_TZ_FIELD TIMESTAMP WITH TIME ZONE, - TIMESTAMP_LTZ_FIELD TIMESTAMP WITH LOCAL TIME ZONE, - INTERVAL_YM_FIELD INTERVAL YEAR TO MONTH, - INTERVAL_DS_FIELD INTERVAL DAY TO SECOND, - NESTED_OBJECT_FIELD ADDRESS_T, - VARRAY_FIELD PHONE_LIST - ) - """)) - .noIdentity() - .noJsonRoundtrip(), // Complex struct - skip identity and JSON tests for now - - // ═══════════════════════════════════════════════════════════════════════════ - // Structs Without LOBs - For VARRAY Compatibility - // Oracle restriction: VARRAYs cannot contain structs with 
embedded LOBs - // ═══════════════════════════════════════════════════════════════════════════ - - // TEST_ALLTYPES_NOLOBS - standalone struct without LOBs - new OracleTypeAndExample<>( - OracleObject.builder("TEST_ALLTYPES_NOLOBS") - .addAttribute("VARCHAR_FIELD", OracleTypes.varchar2(100), s -> s.varcharField) - .addAttribute( - "NVARCHAR_FIELD", OracleTypes.nvarchar2(100), s -> s.nvarcharField) - .addAttribute("CHAR_FIELD", OracleTypes.char_(10), s -> s.charField) - .addAttribute("NCHAR_FIELD", OracleTypes.nchar(10), s -> s.ncharField) - .addAttribute("NUMBER_FIELD", OracleTypes.number, s -> s.numberField) - .addAttribute( - "NUMBER_INT_FIELD", OracleTypes.numberInt, s -> s.numberIntField) - .addAttribute( - "NUMBER_LONG_FIELD", OracleTypes.numberLong, s -> s.numberLongField) - .addAttribute( - "BINARY_FLOAT_FIELD", OracleTypes.binaryFloat, s -> s.binaryFloatField) - .addAttribute( - "BINARY_DOUBLE_FIELD", OracleTypes.binaryDouble, s -> s.binaryDoubleField) - .addAttribute("DATE_FIELD", OracleTypes.date, s -> s.dateField) - .addAttribute("TIMESTAMP_FIELD", OracleTypes.timestamp, s -> s.timestampField) - .addAttribute( - "TIMESTAMP_TZ_FIELD", - OracleTypes.timestampWithTimeZone, - s -> s.timestampTzField) - .addAttribute( - "TIMESTAMP_LTZ_FIELD", - OracleTypes.timestampWithLocalTimeZone, - s -> s.timestampLtzField) - .addAttribute( - "INTERVAL_YM_FIELD", - OracleTypes.intervalYearToMonth, - s -> s.intervalYmField) - .addAttribute( - "INTERVAL_DS_FIELD", - OracleTypes.intervalDayToSecond, - s -> s.intervalDsField) - .addAttribute("NESTED_OBJECT_FIELD", addressType(), s -> s.nestedObjectField) - .addAttribute( - "VARRAY_FIELD", - OracleVArray.of("PHONE_LIST", 5, OracleTypes.varchar2(20)), - s -> s.varrayField) - .build( - attrs -> - new AllTypesStructNoLobs( - (String) attrs[0], - (String) attrs[1], - (String) attrs[2], - (String) attrs[3], - (BigDecimal) attrs[4], - (Integer) attrs[5], - (Long) attrs[6], - (Float) attrs[7], - (Double) attrs[8], - (LocalDateTime) 
attrs[9], - (LocalDateTime) attrs[10], - (OffsetDateTime) attrs[11], - (OffsetDateTime) attrs[12], - (OracleIntervalYM) attrs[13], - (OracleIntervalDS) attrs[14], - (Address) attrs[15], - (List) attrs[16])) - .asType(), - new AllTypesStructNoLobs( - "varchar_val", - "nvarchar_val", - "char_val ", - "nchar_val ", - new BigDecimal("123.45"), - 42, - 9876543210L, - 3.14f, - 2.718281828, - LocalDateTime.of(2024, 3, 15, 14, 30), - LocalDateTime.of(2024, 3, 15, 14, 30, 45, 123456789), - OffsetDateTime.of(2024, 3, 15, 14, 30, 45, 0, ZoneOffset.ofHours(2)), - OffsetDateTime.of(2024, 3, 15, 14, 30, 45, 0, ZoneOffset.ofHours(3)), - new OracleIntervalYM(2, 6), - new OracleIntervalDS(5, 12, 30, 45, 123456000), - new Address("456 Oak Ave", "Portland", coords("45.5152", "-122.6784")), - List.of("555-1234", "555-5678")), - List.of( - """ - CREATE OR REPLACE TYPE TEST_ALLTYPES_NOLOBS AS OBJECT ( - VARCHAR_FIELD VARCHAR2(100), - NVARCHAR_FIELD NVARCHAR2(100), - CHAR_FIELD CHAR(10), - NCHAR_FIELD NCHAR(10), - NUMBER_FIELD NUMBER, - NUMBER_INT_FIELD NUMBER(10), - NUMBER_LONG_FIELD NUMBER(19), - BINARY_FLOAT_FIELD BINARY_FLOAT, - BINARY_DOUBLE_FIELD BINARY_DOUBLE, - DATE_FIELD DATE, - TIMESTAMP_FIELD TIMESTAMP, - TIMESTAMP_TZ_FIELD TIMESTAMP WITH TIME ZONE, - TIMESTAMP_LTZ_FIELD TIMESTAMP WITH LOCAL TIME ZONE, - INTERVAL_YM_FIELD INTERVAL YEAR TO MONTH, - INTERVAL_DS_FIELD INTERVAL DAY TO SECOND, - NESTED_OBJECT_FIELD ADDRESS_T, - VARRAY_FIELD PHONE_LIST - ) - """)) - .noJsonRoundtrip() - .noIdentity(), // Oracle ORA-22901: cannot compare types with VARRAY attributes - - // TEST_ALLTYPES_NOLOBS_OPT - standalone struct without LOBs, optional fields - new OracleTypeAndExample<>( - OracleObject.builder("TEST_ALLTYPES_NOLOBS_OPT") - .addAttribute( - "VARCHAR_FIELD", OracleTypes.varchar2(100).opt(), s -> s.varcharField) - .addAttribute( - "NVARCHAR_FIELD", OracleTypes.nvarchar2(100).opt(), s -> s.nvarcharField) - .addAttribute("CHAR_FIELD", OracleTypes.char_(10).opt(), s -> s.charField) - 
.addAttribute("NCHAR_FIELD", OracleTypes.nchar(10).opt(), s -> s.ncharField) - .addAttribute("NUMBER_FIELD", OracleTypes.number.opt(), s -> s.numberField) - .addAttribute( - "NUMBER_INT_FIELD", OracleTypes.numberInt.opt(), s -> s.numberIntField) - .addAttribute( - "NUMBER_LONG_FIELD", OracleTypes.numberLong.opt(), s -> s.numberLongField) - .addAttribute( - "BINARY_FLOAT_FIELD", - OracleTypes.binaryFloat.opt(), - s -> s.binaryFloatField) - .addAttribute( - "BINARY_DOUBLE_FIELD", - OracleTypes.binaryDouble.opt(), - s -> s.binaryDoubleField) - .addAttribute("DATE_FIELD", OracleTypes.date.opt(), s -> s.dateField) - .addAttribute( - "TIMESTAMP_FIELD", OracleTypes.timestamp.opt(), s -> s.timestampField) - .addAttribute( - "TIMESTAMP_TZ_FIELD", - OracleTypes.timestampWithTimeZone.opt(), - s -> s.timestampTzField) - .addAttribute( - "TIMESTAMP_LTZ_FIELD", - OracleTypes.timestampWithLocalTimeZone.opt(), - s -> s.timestampLtzField) - .addAttribute( - "INTERVAL_YM_FIELD", - OracleTypes.intervalYearToMonth.opt(), - s -> s.intervalYmField) - .addAttribute( - "INTERVAL_DS_FIELD", - OracleTypes.intervalDayToSecond.opt(), - s -> s.intervalDsField) - .addAttribute( - "NESTED_OBJECT_FIELD", addressType().opt(), s -> s.nestedObjectField) - .addAttribute( - "VARRAY_FIELD", - OracleVArray.of("PHONE_LIST", 5, OracleTypes.varchar2(20)).opt(), - s -> s.varrayField) - .build( - attrs -> - new AllTypesStructNoLobsOptional( - (Optional) attrs[0], - (Optional) attrs[1], - (Optional) attrs[2], - (Optional) attrs[3], - (Optional) attrs[4], - (Optional) attrs[5], - (Optional) attrs[6], - (Optional) attrs[7], - (Optional) attrs[8], - (Optional) attrs[9], - (Optional) attrs[10], - (Optional) attrs[11], - (Optional) attrs[12], - (Optional) attrs[13], - (Optional) attrs[14], - (Optional
) attrs[15], - (Optional>) attrs[16])) - .asType(), - new AllTypesStructNoLobsOptional( - Optional.of("varchar_val"), - Optional.empty(), - Optional.of("char_val "), - Optional.empty(), - Optional.of(new BigDecimal("123.45")), - Optional.of(42), - Optional.empty(), - Optional.of(3.14f), - Optional.of(2.718281828), - Optional.of(LocalDateTime.of(2024, 3, 15, 14, 30)), - Optional.empty(), - Optional.of( - OffsetDateTime.of(2024, 3, 15, 14, 30, 45, 0, ZoneOffset.ofHours(2))), - Optional.of( - OffsetDateTime.of(2024, 3, 15, 14, 30, 45, 0, ZoneOffset.ofHours(3))), - Optional.of(new OracleIntervalYM(2, 6)), - Optional.empty(), - Optional.of( - new Address("456 Oak Ave", "Portland", coords("45.5152", "-122.6784"))), - Optional.of(List.of("555-1234"))), - List.of( - """ - CREATE OR REPLACE TYPE TEST_ALLTYPES_NOLOBS_OPT AS OBJECT ( - VARCHAR_FIELD VARCHAR2(100), - NVARCHAR_FIELD NVARCHAR2(100), - CHAR_FIELD CHAR(10), - NCHAR_FIELD NCHAR(10), - NUMBER_FIELD NUMBER, - NUMBER_INT_FIELD NUMBER(10), - NUMBER_LONG_FIELD NUMBER(19), - BINARY_FLOAT_FIELD BINARY_FLOAT, - BINARY_DOUBLE_FIELD BINARY_DOUBLE, - DATE_FIELD DATE, - TIMESTAMP_FIELD TIMESTAMP, - TIMESTAMP_TZ_FIELD TIMESTAMP WITH TIME ZONE, - TIMESTAMP_LTZ_FIELD TIMESTAMP WITH LOCAL TIME ZONE, - INTERVAL_YM_FIELD INTERVAL YEAR TO MONTH, - INTERVAL_DS_FIELD INTERVAL DAY TO SECOND, - NESTED_OBJECT_FIELD ADDRESS_T, - VARRAY_FIELD PHONE_LIST - ) - """)) - .noJsonRoundtrip() - .noIdentity(), // Oracle ORA-22901: cannot compare types with VARRAY attributes - - // VARRAY of TEST_ALLTYPES_NOLOBS - array of structs without LOBs - new OracleTypeAndExample<>( - OracleVArray.of( - "TEST_ALLTYPES_NOLOBS_ARR", - 10, - OracleObject.builder("TEST_ALLTYPES_NOLOBS") - .addAttribute( - "VARCHAR_FIELD", OracleTypes.varchar2(100), s -> s.varcharField) - .addAttribute( - "NVARCHAR_FIELD", OracleTypes.nvarchar2(100), s -> s.nvarcharField) - .addAttribute("CHAR_FIELD", OracleTypes.char_(10), s -> s.charField) - .addAttribute("NCHAR_FIELD", 
OracleTypes.nchar(10), s -> s.ncharField) - .addAttribute("NUMBER_FIELD", OracleTypes.number, s -> s.numberField) - .addAttribute( - "NUMBER_INT_FIELD", OracleTypes.numberInt, s -> s.numberIntField) - .addAttribute( - "NUMBER_LONG_FIELD", OracleTypes.numberLong, s -> s.numberLongField) - .addAttribute( - "BINARY_FLOAT_FIELD", - OracleTypes.binaryFloat, - s -> s.binaryFloatField) - .addAttribute( - "BINARY_DOUBLE_FIELD", - OracleTypes.binaryDouble, - s -> s.binaryDoubleField) - .addAttribute("DATE_FIELD", OracleTypes.date, s -> s.dateField) - .addAttribute( - "TIMESTAMP_FIELD", OracleTypes.timestamp, s -> s.timestampField) - .addAttribute( - "TIMESTAMP_TZ_FIELD", - OracleTypes.timestampWithTimeZone, - s -> s.timestampTzField) - .addAttribute( - "TIMESTAMP_LTZ_FIELD", - OracleTypes.timestampWithLocalTimeZone, - s -> s.timestampLtzField) - .addAttribute( - "INTERVAL_YM_FIELD", - OracleTypes.intervalYearToMonth, - s -> s.intervalYmField) - .addAttribute( - "INTERVAL_DS_FIELD", - OracleTypes.intervalDayToSecond, - s -> s.intervalDsField) - .addAttribute( - "NESTED_OBJECT_FIELD", addressType(), s -> s.nestedObjectField) - .addAttribute( - "VARRAY_FIELD", - OracleVArray.of("PHONE_LIST", 5, OracleTypes.varchar2(20)), - s -> s.varrayField) - .build( - attrs -> - new AllTypesStructNoLobs( - (String) attrs[0], - (String) attrs[1], - (String) attrs[2], - (String) attrs[3], - (BigDecimal) attrs[4], - (Integer) attrs[5], - (Long) attrs[6], - (Float) attrs[7], - (Double) attrs[8], - (LocalDateTime) attrs[9], - (LocalDateTime) attrs[10], - (OffsetDateTime) attrs[11], - (OffsetDateTime) attrs[12], - (OracleIntervalYM) attrs[13], - (OracleIntervalDS) attrs[14], - (Address) attrs[15], - (List) attrs[16])) - .asType()), - List.of( - new AllTypesStructNoLobs( - "varchar1", - "nvarchar1", - "char1 ", - "nchar1 ", - new BigDecimal("111.11"), - 11, - 1111L, - 1.1f, - 1.11, - LocalDateTime.of(2024, 1, 1, 10, 0), - LocalDateTime.of(2024, 1, 1, 10, 0, 0, 111000000), - OffsetDateTime.of(2024, 
1, 1, 10, 0, 0, 0, ZoneOffset.UTC), - OffsetDateTime.of(2024, 1, 1, 10, 0, 0, 0, ZoneOffset.ofHours(3)), - new OracleIntervalYM(1, 1), - new OracleIntervalDS(1, 1, 1, 1, 111000000), - new Address("111 First St", "City1", coords("40.7128", "-74.006")), - List.of("111-1111")), - new AllTypesStructNoLobs( - "varchar2", - "nvarchar2", - "char2 ", - "nchar2 ", - new BigDecimal("222.22"), - 22, - 2222L, - 2.2f, - 2.22, - LocalDateTime.of(2024, 2, 2, 20, 0), - LocalDateTime.of(2024, 2, 2, 20, 0, 0, 222000000), - OffsetDateTime.of(2024, 2, 2, 20, 0, 0, 0, ZoneOffset.ofHours(-5)), - OffsetDateTime.of(2024, 2, 2, 20, 0, 0, 0, ZoneOffset.ofHours(3)), - new OracleIntervalYM(2, 2), - new OracleIntervalDS(2, 2, 2, 2, 222000000), - new Address("222 Second St", "City2", coords("34.0522", "-118.2437")), - List.of("222-2222", "222-3333"))), - List.of( - "BEGIN EXECUTE IMMEDIATE 'DROP TYPE TEST_ALLTYPES_NOLOBS_ARR';" - + " EXCEPTION WHEN OTHERS THEN NULL; END;", - "BEGIN EXECUTE IMMEDIATE 'DROP TYPE TEST_ALLTYPES_NOLOBS FORCE';" - + " EXCEPTION WHEN OTHERS THEN NULL; END;", - """ - CREATE OR REPLACE TYPE TEST_ALLTYPES_NOLOBS AS OBJECT ( - VARCHAR_FIELD VARCHAR2(100), - NVARCHAR_FIELD NVARCHAR2(100), - CHAR_FIELD CHAR(10), - NCHAR_FIELD NCHAR(10), - NUMBER_FIELD NUMBER, - NUMBER_INT_FIELD NUMBER(10), - NUMBER_LONG_FIELD NUMBER(19), - BINARY_FLOAT_FIELD BINARY_FLOAT, - BINARY_DOUBLE_FIELD BINARY_DOUBLE, - DATE_FIELD DATE, - TIMESTAMP_FIELD TIMESTAMP, - TIMESTAMP_TZ_FIELD TIMESTAMP WITH TIME ZONE, - TIMESTAMP_LTZ_FIELD TIMESTAMP WITH LOCAL TIME ZONE, - INTERVAL_YM_FIELD INTERVAL YEAR TO MONTH, - INTERVAL_DS_FIELD INTERVAL DAY TO SECOND, - NESTED_OBJECT_FIELD ADDRESS_T, - VARRAY_FIELD PHONE_LIST - ) - """, - "CREATE OR REPLACE TYPE TEST_ALLTYPES_NOLOBS_ARR AS VARRAY(10) OF" - + " TEST_ALLTYPES_NOLOBS")) - .noIdentity(), // Complex array of structs - skip identity test - - // VARRAY of TEST_ALLTYPES_NOLOBS_OPT - array of structs without LOBs, optional - // fields - new 
OracleTypeAndExample<>( - OracleVArray.of( - "TEST_ALLTYPES_NOLOBS_OPT_ARR", - 10, - OracleObject.builder("TEST_ALLTYPES_NOLOBS_OPT") - .addAttribute( - "VARCHAR_FIELD", OracleTypes.varchar2(100).opt(), s -> s.varcharField) - .addAttribute( - "NVARCHAR_FIELD", - OracleTypes.nvarchar2(100).opt(), - s -> s.nvarcharField) - .addAttribute("CHAR_FIELD", OracleTypes.char_(10).opt(), s -> s.charField) - .addAttribute( - "NCHAR_FIELD", OracleTypes.nchar(10).opt(), s -> s.ncharField) - .addAttribute( - "NUMBER_FIELD", OracleTypes.number.opt(), s -> s.numberField) - .addAttribute( - "NUMBER_INT_FIELD", - OracleTypes.numberInt.opt(), - s -> s.numberIntField) - .addAttribute( - "NUMBER_LONG_FIELD", - OracleTypes.numberLong.opt(), - s -> s.numberLongField) - .addAttribute( - "BINARY_FLOAT_FIELD", - OracleTypes.binaryFloat.opt(), - s -> s.binaryFloatField) - .addAttribute( - "BINARY_DOUBLE_FIELD", - OracleTypes.binaryDouble.opt(), - s -> s.binaryDoubleField) - .addAttribute("DATE_FIELD", OracleTypes.date.opt(), s -> s.dateField) - .addAttribute( - "TIMESTAMP_FIELD", OracleTypes.timestamp.opt(), s -> s.timestampField) - .addAttribute( - "TIMESTAMP_TZ_FIELD", - OracleTypes.timestampWithTimeZone.opt(), - s -> s.timestampTzField) - .addAttribute( - "TIMESTAMP_LTZ_FIELD", - OracleTypes.timestampWithLocalTimeZone.opt(), - s -> s.timestampLtzField) - .addAttribute( - "INTERVAL_YM_FIELD", - OracleTypes.intervalYearToMonth.opt(), - s -> s.intervalYmField) - .addAttribute( - "INTERVAL_DS_FIELD", - OracleTypes.intervalDayToSecond.opt(), - s -> s.intervalDsField) - .addAttribute( - "NESTED_OBJECT_FIELD", addressType().opt(), s -> s.nestedObjectField) - .addAttribute( - "VARRAY_FIELD", - OracleVArray.of("PHONE_LIST", 5, OracleTypes.varchar2(20)).opt(), - s -> s.varrayField) - .build( - attrs -> - new AllTypesStructNoLobsOptional( - (Optional) attrs[0], - (Optional) attrs[1], - (Optional) attrs[2], - (Optional) attrs[3], - (Optional) attrs[4], - (Optional) attrs[5], - (Optional) attrs[6], - 
(Optional) attrs[7], - (Optional) attrs[8], - (Optional) attrs[9], - (Optional) attrs[10], - (Optional) attrs[11], - (Optional) attrs[12], - (Optional) attrs[13], - (Optional) attrs[14], - (Optional<Address>
) attrs[15], - (Optional<List<String>>) attrs[16])) - .asType()), - List.of( - new AllTypesStructNoLobsOptional( - Optional.of("varchar1"), - Optional.empty(), - Optional.of("char1 "), - Optional.empty(), - Optional.of(new BigDecimal("111.11")), - Optional.of(11), - Optional.empty(), - Optional.of(1.1f), - Optional.of(1.11), - Optional.of(LocalDateTime.of(2024, 1, 1, 10, 0)), - Optional.empty(), - Optional.of(OffsetDateTime.of(2024, 1, 1, 10, 0, 0, 0, ZoneOffset.UTC)), - Optional.of( - OffsetDateTime.of(2024, 1, 1, 10, 0, 0, 0, ZoneOffset.ofHours(3))), - Optional.of(new OracleIntervalYM(1, 1)), - Optional.empty(), - Optional.of( - new Address("111 First St", "City1", coords("40.7128", "-74.006"))), - Optional.of(List.of("111-1111"))), - new AllTypesStructNoLobsOptional( - Optional.of("varchar2"), - Optional.of("nvarchar2"), - Optional.of("char2 "), - Optional.of("nchar2 "), - Optional.of(new BigDecimal("222.22")), - Optional.of(22), - Optional.of(2222L), - Optional.of(2.2f), - Optional.of(2.22), - Optional.of(LocalDateTime.of(2024, 2, 2, 20, 0)), - Optional.of(LocalDateTime.of(2024, 2, 2, 20, 0, 0, 222000000)), - Optional.of( - OffsetDateTime.of(2024, 2, 2, 20, 0, 0, 0, ZoneOffset.ofHours(-5))), - Optional.of( - OffsetDateTime.of(2024, 2, 2, 20, 0, 0, 0, ZoneOffset.ofHours(3))), - Optional.of(new OracleIntervalYM(2, 2)), - Optional.of(new OracleIntervalDS(2, 2, 2, 2, 222000000)), - Optional.of( - new Address( - "222 Second St", "City2", coords("34.0522", "-118.2437"))), - Optional.of(List.of("222-2222", "222-3333")))), - List.of( - "BEGIN EXECUTE IMMEDIATE 'DROP TYPE TEST_ALLTYPES_NOLOBS_OPT_ARR';" - + " EXCEPTION WHEN OTHERS THEN NULL; END;", - "BEGIN EXECUTE IMMEDIATE 'DROP TYPE TEST_ALLTYPES_NOLOBS_OPT FORCE';" - + " EXCEPTION WHEN OTHERS THEN NULL; END;", - """ - CREATE OR REPLACE TYPE TEST_ALLTYPES_NOLOBS_OPT AS OBJECT ( - VARCHAR_FIELD VARCHAR2(100), - NVARCHAR_FIELD NVARCHAR2(100), - CHAR_FIELD CHAR(10), - NCHAR_FIELD NCHAR(10), - NUMBER_FIELD NUMBER, - NUMBER_INT_FIELD 
NUMBER(10), - NUMBER_LONG_FIELD NUMBER(19), - BINARY_FLOAT_FIELD BINARY_FLOAT, - BINARY_DOUBLE_FIELD BINARY_DOUBLE, - DATE_FIELD DATE, - TIMESTAMP_FIELD TIMESTAMP, - TIMESTAMP_TZ_FIELD TIMESTAMP WITH TIME ZONE, - TIMESTAMP_LTZ_FIELD TIMESTAMP WITH LOCAL TIME ZONE, - INTERVAL_YM_FIELD INTERVAL YEAR TO MONTH, - INTERVAL_DS_FIELD INTERVAL DAY TO SECOND, - NESTED_OBJECT_FIELD ADDRESS_T, - VARRAY_FIELD PHONE_LIST - ) - """, - "CREATE OR REPLACE TYPE TEST_ALLTYPES_NOLOBS_OPT_ARR AS VARRAY(10)" - + " OF TEST_ALLTYPES_NOLOBS_OPT")) - .noIdentity() // Complex array of structs - skip identity test - ); - - // Connection helper for Oracle - uses HikariCP connection pool - // Uses Oracle Free 23c on port 1521, connecting to FREEPDB1 pluggable database - static T withConnection(SqlFunction f) { - try (var pooledConn = POOL.unwrap().getConnection()) { - // Unwrap to get the underlying OracleConnection for STRUCT/ARRAY creation - var conn = pooledConn.unwrap(oracle.jdbc.OracleConnection.class); - conn.setAutoCommit(false); - try { - return f.apply(conn); - } finally { - conn.rollback(); - } - } catch (SQLException e) { - throw new RuntimeException(e); - } - } - - @Test - public void test() { - System.out.println("Testing Oracle type codecs...\n"); - - // Test JSON roundtrip first (no database connection needed) - parallel - System.out.println("=== JSON Roundtrip Tests (parallel) ==="); - All.parallelStream() - .filter(t -> t.jsonRoundtripWorks) - .forEach(OracleTypeTest::testJsonRoundtrip); - System.out.println(); - - // Create all user-defined types upfront (must be sequential to avoid conflicts) - withConnection( - conn -> { - System.out.println("=== Creating user-defined types ==="); - var executedSql = new HashSet(); - for (OracleTypeAndExample t : All) { - if (!t.setupSql.isEmpty()) { - try (var stmt = conn.createStatement()) { - for (String sql : t.setupSql) { - if (executedSql.add(sql)) { - try { - stmt.execute(sql); - } catch (SQLException e) { - if 
(!e.getMessage().contains("ORA-00955") - && !e.getMessage().contains("ORA-02303")) { - throw e; - } - } - } - } - } - } - } - conn.commit(); - return null; - }); - - // Run all DB tests in parallel - System.out.println("\n=== DB Roundtrip Tests (parallel) ==="); - var failures = - All.parallelStream() - .flatMap( - t -> { - var errors = new ArrayList(); - - // Native type roundtrip test - try { - withConnection( - conn -> { - testCase(conn, t); - return null; - }); - } catch (Exception e) { - errors.add( - "Native test FAILED " - + t.type.typename().sqlType() - + ": " - + e.getMessage()); - } - - // JSON DB roundtrip test - if (t.jsonRoundtripWorks()) { - try { - withConnection( - conn -> { - testJsonDbRoundtrip(conn, t); - return null; - }); - } catch (Exception e) { - errors.add( - "JSON DB test FAILED " - + t.type.typename().sqlType() - + ": " - + e.getMessage()); - } - } - - // getGeneratedKeys roundtrip test (skip user-defined types) - if (t.setupSql.isEmpty()) { - try { - withConnection( - conn -> { - testGeneratedKeysRoundtrip(conn, t); - return null; - }); - } catch (Exception e) { - errors.add( - "getGeneratedKeys test FAILED " - + t.type.typename().sqlType() - + ": " - + e.getMessage()); - } - } - - return errors.stream(); - }) - .toList(); - - System.out.println("\n====================================="); - if (failures.isEmpty()) { - System.out.println("All tests passed!"); - } else { - failures.forEach(System.out::println); - throw new RuntimeException(failures.size() + " tests failed"); - } - System.out.println("====================================="); - } - - /** - * Test getGeneratedKeys roundtrip - simulates INSERT RETURNING behavior. Creates a table with an - * auto-generated ID column plus a column of the type under test, inserts a value, and reads back - * the entire row via getGeneratedKeys(). 
- */ - static void testGeneratedKeysRoundtrip(Connection conn, OracleTypeAndExample t) - throws SQLException { - String sqlType = t.type.typename().sqlType(); - A original = t.example; - A expected = t.expected(); // May differ from original due to Oracle quirks - - // Create table with auto-generated ID + test column - String tableName = uniqueTableName("TEST_GENKEYS"); - try (var stmt = conn.createStatement()) { - stmt.execute( - "CREATE TABLE " - + tableName - + " (id NUMBER GENERATED ALWAYS AS IDENTITY, v " - + sqlType - + ")"); - } - - try { - // Insert using PreparedStatement with column names to get back via getGeneratedKeys - String insertSql = "INSERT INTO " + tableName + " (v) VALUES (?)"; - var insert = conn.prepareStatement(insertSql, new String[] {"ID", "V"}); - t.type.write().set(insert, 1, original); - insert.executeUpdate(); - - // Read back via getGeneratedKeys - var rs = insert.getGeneratedKeys(); - if (!rs.next()) { - throw new RuntimeException("getGeneratedKeys returned no rows"); - } - - // Check metadata - var meta = rs.getMetaData(); - System.out.println("getGeneratedKeys " + sqlType + ":"); - System.out.println(" Columns: " + meta.getColumnCount()); - for (int i = 1; i <= meta.getColumnCount(); i++) { - System.out.println( - " " + i + ": " + meta.getColumnName(i) + " (" + meta.getColumnTypeName(i) + ")"); - } - - // Read ID (column 1) - Long id = rs.getLong(1); - System.out.println(" ID: " + id); - - // Read the value (column 2) - use optional reader if expecting NULL - final A actual; - if (expected == null) { - Optional actualOpt = t.type.opt().read().read(rs, 2); - actual = actualOpt.orElse(null); - } else { - actual = t.type.read().read(rs, 2); - } - System.out.println(" Value: " + format(actual)); - - rs.close(); - insert.close(); - - assertEquals(actual, expected, "getGeneratedKeys value mismatch"); - System.out.println(" PASSED\n"); - - } finally { - // Drop table - try (var stmt = conn.createStatement()) { - stmt.execute("DROP TABLE 
" + tableName); - } - } - } - - static void testJsonRoundtrip(OracleTypeAndExample t) { - try { - OracleJson jsonCodec = t.type.oracleJson(); - A original = t.example; - A expected = t.expected(); // May differ from original due to Oracle quirks - - // Test toJson -> encode -> parse -> fromJson roundtrip (in-memory) - JsonValue jsonValue = jsonCodec.toJson(original); - String encoded = jsonValue.encode(); - JsonValue parsed = JsonValue.parse(encoded); - A decoded = jsonCodec.fromJson(parsed); - - System.out.println( - "JSON roundtrip " - + t.type.typename().sqlType() - + ": " - + format(original) - + " -> " - + encoded - + " -> " - + format(decoded)); - - if (t.hasIdentity && !areEqual(decoded, expected)) { - throw new RuntimeException( - "JSON roundtrip failed for " - + t.type.typename().sqlType() - + ": expected '" - + format(expected) - + "' but got '" - + format(decoded) - + "'"); - } - } catch (Exception e) { - throw new RuntimeException( - "JSON roundtrip test failed for " + t.type.typename().sqlType(), e); - } - } - - // Test JSON roundtrip through the database - simulates MULTISET behavior - // Insert value into native column, read back as JSON, parse back to value - static void testJsonDbRoundtrip(Connection conn, OracleTypeAndExample t) - throws SQLException { - OracleJson jsonCodec = t.type.oracleJson(); - A original = t.example; - A expected = t.expected(); // May differ from original due to Oracle quirks - String sqlType = t.type.typename().sqlType(); - - // Create temp table (Oracle uses Global Temporary Tables differently, using regular table + - // cleanup) - String tableName = uniqueTableName("TEST_JSON_RT"); - try (var stmt = conn.createStatement()) { - // NESTED TABLE columns require STORE AS clause - String createTableDDL = "CREATE TABLE " + tableName + " (v " + sqlType + ")"; - if (sqlType.contains("ORDER_ITEMS_T")) { // Nested table type - createTableDDL += " NESTED TABLE v STORE AS " + tableName + "_STORAGE"; - } - 
stmt.execute(createTableDDL); - } - - try { - // Insert value using native type - var insert = conn.prepareStatement("INSERT INTO " + tableName + " (v) VALUES (?)"); - t.type.write().set(insert, 1, original); - insert.execute(); - insert.close(); - - // Select back as JSON using JSON_OBJECT - this is what MULTISET does - var select = conn.prepareStatement("SELECT JSON_OBJECT('v' VALUE v) FROM " + tableName); - select.execute(); - var rs = select.getResultSet(); - - if (!rs.next()) { - throw new RuntimeException("No rows returned"); - } - - // Read the JSON string back from the database - String jsonFromDb = rs.getString(1); - rs.close(); - select.close(); - - // Parse the JSON object and extract 'v' field - JsonValue parsedFromDb = JsonValue.parse(jsonFromDb); - JsonValue fieldValue = ((JsonValue.JObject) parsedFromDb).get("v"); - A decoded = jsonCodec.fromJson(fieldValue); - - System.out.println( - "JSON DB roundtrip " - + sqlType - + ": " - + format(original) - + " -> DB -> " - + jsonFromDb - + " -> " - + format(decoded)); - - if (t.hasIdentity && !areEqual(decoded, expected)) { - throw new RuntimeException( - "JSON DB roundtrip failed for " - + sqlType - + ": expected '" - + format(expected) - + "' but got '" - + format(decoded) - + "'"); - } - } finally { - try (var stmt = conn.createStatement()) { - stmt.execute("DROP TABLE " + tableName); - } - } - } - - static void testCase(Connection conn, OracleTypeAndExample t) throws SQLException { - String sqlType = t.type.typename().sqlType(); - - // Execute setup SQL (for type definitions, etc.) 
- if (!t.setupSql.isEmpty()) { - try (var stmt = conn.createStatement()) { - for (String sql : t.setupSql) { - try { - stmt.execute(sql); - } catch (SQLException e) { - // Ignore common type creation errors: - // ORA-00955: name is already used by an existing object - // ORA-02303: cannot DROP or REPLACE a type with type or table dependents - if (!e.getMessage().contains("ORA-00955") && !e.getMessage().contains("ORA-02303")) { - throw e; - } - } - } - } - } - - // Create table (Oracle doesn't have CREATE TEMPORARY TABLE syntax in standard form) - String tableName = uniqueTableName("TEST_TABLE"); - try (var stmt = conn.createStatement()) { - // NESTED TABLE columns require STORE AS clause - String createTableDDL = "CREATE TABLE " + tableName + " (v " + sqlType + ")"; - if (sqlType.contains("ORDER_ITEMS_T") - || sqlType.contains("_NESTED_TABLE")) { // Nested table type - createTableDDL += " NESTED TABLE v STORE AS " + tableName + "_STORAGE"; - } - stmt.execute(createTableDDL); - } - - try { - // Insert using PreparedStatement - var insert = conn.prepareStatement("INSERT INTO " + tableName + " (v) VALUES (?)"); - A original = t.example; - A expected = t.expected(); // May differ from original due to Oracle quirks - t.type.write().set(insert, 1, original); - insert.execute(); - insert.close(); - - // Select and verify - final PreparedStatement select; - if (t.hasIdentity) { - // For NULL values, use IS NULL since WHERE v = NULL doesn't match (NULL = NULL is UNKNOWN) - if (expected == null) { - select = conn.prepareStatement("SELECT v, NULL FROM " + tableName + " WHERE v IS NULL"); - } else { - select = conn.prepareStatement("SELECT v, NULL FROM " + tableName + " WHERE v = ?"); - t.type.write().set(select, 1, original); - } - } else { - select = conn.prepareStatement("SELECT v, NULL FROM " + tableName); - } - - select.execute(); - var rs = select.getResultSet(); - - if (!rs.next()) { - throw new RuntimeException("No rows returned"); - } - - // Read the value - use 
optional reader if expecting NULL - final A actual; - if (expected == null) { - Optional actualOpt = t.type.opt().read().read(rs, 1); - actual = actualOpt.orElse(null); - } else { - actual = t.type.read().read(rs, 1); - } - - // Read the null value using opt() - Optional actualNull = t.type.opt().read().read(rs, 2); - - rs.close(); - select.close(); - - assertEquals(actual, expected, "value mismatch"); - assertEquals(actualNull, Optional.empty(), "null value mismatch"); - - } finally { - // Drop table - try (var stmt = conn.createStatement()) { - stmt.execute("DROP TABLE " + tableName); - } - } - } - - static void assertEquals(A actual, A expected, String message) { - if (!areEqual(actual, expected)) { - throw new RuntimeException( - message + ": actual='" + format(actual) + "' expected='" + format(expected) + "'"); - } - } - - static boolean areEqual(A actual, A expected) { - if (expected == null && actual == null) return true; - if (expected == null || actual == null) return false; - - if (expected instanceof byte[]) { - return Arrays.equals((byte[]) actual, (byte[]) expected); - } - if (expected instanceof Object[]) { - return Arrays.deepEquals((Object[]) actual, (Object[]) expected); - } - - // For BigDecimal, use compareTo to handle different scales - if (expected instanceof BigDecimal && actual instanceof BigDecimal) { - return ((BigDecimal) actual).compareTo((BigDecimal) expected) == 0; - } - - // For Json, parse and compare structures (Oracle normalizes JSON formatting) - if (expected instanceof Json && actual instanceof Json) { - try { - JsonValue v1 = JsonValue.parse(((Json) actual).value()); - JsonValue v2 = JsonValue.parse(((Json) expected).value()); - return v1.equals(v2); - } catch (Exception e) { - // If parsing fails, fall back to string comparison - return ((Json) actual).value().equals(((Json) expected).value()); - } - } - - return actual.equals(expected); - } - - static String format(A a) { - return switch (a) { - case null -> "null"; - case 
byte[] bytes -> bytesToHex(bytes); - case Object[] objects -> Arrays.deepToString(objects); - default -> a.toString(); - }; - } - - static String bytesToHex(byte[] bytes) { - StringBuilder sb = new StringBuilder(); - sb.append("["); - for (int i = 0; i < bytes.length; i++) { - if (i > 0) sb.append(", "); - sb.append(String.format("0x%02X", bytes[i])); - } - sb.append("]"); - return sb.toString(); - } -} diff --git a/foundations-jdbc-test/src/java/dev/typr/foundations/PgRecordParserTest.java b/foundations-jdbc-test/src/java/dev/typr/foundations/PgRecordParserTest.java deleted file mode 100644 index dcadf27c91..0000000000 --- a/foundations-jdbc-test/src/java/dev/typr/foundations/PgRecordParserTest.java +++ /dev/null @@ -1,385 +0,0 @@ -package dev.typr.foundations; - -import java.util.Arrays; -import java.util.List; -import org.junit.Test; - -/** - * Tests for PostgreSQL composite type (record) text format parser. - * - *

<p>Tests cover:
- *
- * <ul>
- *   <li>Simple values (unquoted)
- *   <li>Quoted values with special characters
- *   <li>NULL vs empty string handling
- *   <li>Nested composite types
- *   <li>Deeply nested types (multiple levels of quote escaping)
- *   <li>Edge cases: newlines, backslashes, commas in values
- *   <li>Roundtrip encoding/decoding
- * </ul>
- */ -public class PgRecordParserTest { - - @Test - public void testSimpleValues() { - // Simple unquoted values - assertParse("(hello,world,123)", List.of("hello", "world", "123")); - assertParse("(a,b,c)", List.of("a", "b", "c")); - assertParse("(1,2,3)", List.of("1", "2", "3")); - } - - @Test - public void testSingleValue() { - assertParse("(hello)", List.of("hello")); - assertParse("(123)", List.of("123")); - } - - @Test - public void testEmptyRecord() { - assertParse("()", List.of()); - } - - @Test - public void testNullValues() { - // NULL is represented by empty (no characters) - assertParse("(,)", Arrays.asList(null, null)); - assertParse("(a,,c)", Arrays.asList("a", null, "c")); - assertParse("(,,)", Arrays.asList(null, null, null)); - assertParse("(a,)", Arrays.asList("a", null)); - assertParse("(,b)", Arrays.asList(null, "b")); - } - - @Test - public void testEmptyStringVsNull() { - // Empty string is "" (quoted empty), NULL is nothing - assertParse("(\"\")", List.of("")); - assertParse("(\"\",)", Arrays.asList("", null)); - assertParse("(,\"\")", Arrays.asList(null, "")); - assertParse("(\"\",\"\",\"\")", List.of("", "", "")); - } - - @Test - public void testQuotedValuesWithCommas() { - // Values with commas need quotes - assertParse("(\"hello, world\",test)", List.of("hello, world", "test")); - assertParse("(a,\"b,c,d\",e)", List.of("a", "b,c,d", "e")); - } - - @Test - public void testQuotedValuesWithQuotes() { - // Quotes within quoted values are doubled: " becomes "" - assertParse("(\"say \"\"hello\"\"\")", List.of("say \"hello\"")); - assertParse("(\"\"\"quoted\"\"\",plain)", List.of("\"quoted\"", "plain")); - assertParse("(a,\"b\"\"c\",d)", List.of("a", "b\"c", "d")); - } - - @Test - public void testQuotedValuesWithParentheses() { - // Parentheses within quoted values - assertParse("(\"(nested)\")", List.of("(nested)")); - assertParse("(\"(a,b)\",c)", List.of("(a,b)", "c")); - assertParse("(\"((deep))\")", List.of("((deep))")); - } - - @Test - 
public void testQuotedValuesWithBackslashes() { - // Backslashes - PostgreSQL doubles them in quotes - assertParse("(\"path\\\\to\\\\file\")", List.of("path\\to\\file")); - assertParse("(\"a\\\\b\",c)", List.of("a\\b", "c")); - } - - @Test - public void testQuotedValuesWithNewlines() { - // Newlines within quoted values - assertParse("(\"line1\nline2\")", List.of("line1\nline2")); - assertParse("(\"a\nb\nc\",d)", List.of("a\nb\nc", "d")); - } - - @Test - public void testNestedComposites() { - // Nested composite appears as quoted string - // Inner: (a,b) -> quoted: "(a,b)" - assertParse("(\"(a,b)\",outer)", List.of("(a,b)", "outer")); - - // Parse the nested value - List nested = PgRecordParser.parseNested("(a,b)"); - assertEqual(nested, List.of("a", "b")); - } - - @Test - public void testDeeplyNestedComposites() { - // Two levels of nesting: - // Inner: (x,y) -> in middle: "(x,y)" -> in outer: "(""(x,y)"")" - // The outermost quoted string has doubled quotes for the inner quotes - - // Example from PostgreSQL: - // Outer composite containing middle composite containing inner composite - // Level 3: (x,y) - // Level 2: (inner,"(x,y)") -> "(inner,""(x,y)"")" - // Level 1: (prefix,"(inner,""(x,y)"")") -> final text - - // Simpler case: nested with quoted inner - String level2 = "(inner,\"(x,y)\")"; - List parsedLevel2 = PgRecordParser.parse(level2); - assertEqual(parsedLevel2, List.of("inner", "(x,y)")); - - // Further nested - List parsedLevel3 = PgRecordParser.parseNested(parsedLevel2.get(1)); - assertEqual(parsedLevel3, List.of("x", "y")); - } - - @Test - public void testRealWorldNestedExample() { - // From actual PostgreSQL output: - // contact_info type: (email, phone, address) - // address type: (street, city, zip, country) - // - // Example: (test@example.com,+1-555-0123,"(""456 Oak Ave"",""Los Angeles"",90001,USA)") - - String input = - "(test@example.com,+1-555-0123,\"(\"\"456 Oak Ave\"\",\"\"Los Angeles\"\",90001,USA)\")"; - List parsed = 
PgRecordParser.parse(input); - - assertEqual(parsed.size(), 3); - assertEqual(parsed.get(0), "test@example.com"); - assertEqual(parsed.get(1), "+1-555-0123"); - // The address is a nested composite - String addressStr = parsed.get(2); - assertEqual(addressStr, "(\"456 Oak Ave\",\"Los Angeles\",90001,USA)"); - - // Parse the nested address - List address = PgRecordParser.parseNested(addressStr); - assertEqual(address, List.of("456 Oak Ave", "Los Angeles", "90001", "USA")); - } - - @Test - public void testVeryDeeplyNested() { - // From actual PostgreSQL output for employee_record: - // employee_record: (name, contact, employee_id, salary, hire_date) - // where name is person_name: (first, middle, last, suffix) - // and contact is contact_info: (email, phone, address) - // and address is: (street, city, zip, country) - // - // Real output: - // ("(John,Michael,Doe,Jr.)","(john.doe@company.com,+1-555-9999,""(""""789 Pine - // Rd"""",Chicago,60601,USA)"")",12345,75000.50,2020-01-15) - - String input = - "(\"(John,Michael,Doe,Jr.)\",\"(john.doe@company.com,+1-555-9999,\"\"(\"\"\"\"789 Pine" - + " Rd\"\"\"\",Chicago,60601,USA)\"\")\",12345,75000.50,2020-01-15)"; - - List parsed = PgRecordParser.parse(input); - assertEqual(parsed.size(), 5); - - // First field: person_name - String nameStr = parsed.get(0); - assertEqual(nameStr, "(John,Michael,Doe,Jr.)"); - List name = PgRecordParser.parseNested(nameStr); - assertEqual(name, List.of("John", "Michael", "Doe", "Jr.")); - - // Second field: contact_info (with nested address) - String contactStr = parsed.get(1); - List contact = PgRecordParser.parseNested(contactStr); - assertEqual(contact.size(), 3); - assertEqual(contact.get(0), "john.doe@company.com"); - assertEqual(contact.get(1), "+1-555-9999"); - - // Third level: address - String addressStr = contact.get(2); - List address = PgRecordParser.parseNested(addressStr); - assertEqual(address, List.of("789 Pine Rd", "Chicago", "60601", "USA")); - - // Other fields - 
assertEqual(parsed.get(2), "12345"); - assertEqual(parsed.get(3), "75000.50"); - assertEqual(parsed.get(4), "2020-01-15"); - } - - @Test - public void testSpecialCharacterCombinations() { - // All special chars together - String input = "(\"all: \"\"quotes\"\", (parens), \\\\slash\n\")"; - List parsed = PgRecordParser.parse(input); - assertEqual(parsed.size(), 1); - assertEqual(parsed.get(0), "all: \"quotes\", (parens), \\slash\n"); - } - - @Test - public void testRoundtripSimple() { - List values = List.of("hello", "world", "123"); - String encoded = PgRecordParser.encode(values); - List decoded = PgRecordParser.parse(encoded); - assertEqual(decoded, values); - } - - @Test - public void testRoundtripWithNulls() { - List values = Arrays.asList("a", null, "c", null); - String encoded = PgRecordParser.encode(values); - List decoded = PgRecordParser.parse(encoded); - assertEqual(decoded, values); - } - - @Test - public void testRoundtripWithEmptyStrings() { - List values = Arrays.asList("", "a", "", null); - String encoded = PgRecordParser.encode(values); - List decoded = PgRecordParser.parse(encoded); - assertEqual(decoded, values); - } - - @Test - public void testRoundtripWithSpecialChars() { - List values = - List.of("hello, world", "say \"hello\"", "(nested)", "path\\to\\file", "line1\nline2"); - String encoded = PgRecordParser.encode(values); - List decoded = PgRecordParser.parse(encoded); - assertEqual(decoded, values); - } - - @Test - public void testRoundtripNestedComposite() { - // Encode an inner composite - List inner = List.of("a", "b", "c"); - String innerEncoded = PgRecordParser.encode(inner); - - // Now encode an outer composite containing the inner as a field - List outer = List.of("prefix", innerEncoded, "suffix"); - String outerEncoded = PgRecordParser.encode(outer); - - // Decode outer - List outerDecoded = PgRecordParser.parse(outerEncoded); - assertEqual(outerDecoded.size(), 3); - assertEqual(outerDecoded.get(0), "prefix"); - 
assertEqual(outerDecoded.get(2), "suffix"); - - // Decode inner from outer - List innerDecoded = PgRecordParser.parseNested(outerDecoded.get(1)); - assertEqual(innerDecoded, inner); - } - - @Test - public void testWhitespace() { - // Whitespace in unquoted values is preserved - assertParse("( hello ,world)", List.of(" hello ", "world")); - - // Whitespace around the outer parens is trimmed - assertParse(" (a,b) ", List.of("a", "b")); - } - - @Test - public void testNumericValues() { - assertParse("(123,45.67,-89,0.001)", List.of("123", "45.67", "-89", "0.001")); - } - - @Test - public void testBooleanValues() { - assertParse("(true,false,t,f)", List.of("true", "false", "t", "f")); - } - - @Test - public void testDateTimeValues() { - assertParse( - "(2024-06-15,14:30:00,2024-06-15 14:30:00)", - List.of("2024-06-15", "14:30:00", "2024-06-15 14:30:00")); - } - - @Test - public void testUuidValues() { - assertParse( - "(a0eebc99-9c0b-4ef8-bb6d-6bb9bd380a11)", List.of("a0eebc99-9c0b-4ef8-bb6d-6bb9bd380a11")); - } - - @Test(expected = IllegalArgumentException.class) - public void testInvalidNoParentheses() { - PgRecordParser.parse("hello,world"); - } - - @Test(expected = IllegalArgumentException.class) - public void testInvalidNoClosingParen() { - PgRecordParser.parse("(hello,world"); - } - - @Test(expected = IllegalArgumentException.class) - public void testInvalidNoOpeningParen() { - PgRecordParser.parse("hello,world)"); - } - - @Test(expected = IllegalArgumentException.class) - public void testInvalidUnterminatedQuote() { - PgRecordParser.parse("(\"hello)"); - } - - @Test(expected = IllegalArgumentException.class) - public void testInvalidNull() { - PgRecordParser.parse(null); - } - - @Test(expected = IllegalArgumentException.class) - public void testInvalidEmpty() { - PgRecordParser.parse(""); - } - - // Helper methods - - private void assertParse(String input, List expected) { - List actual = PgRecordParser.parse(input); - if (!listsEqual(actual, expected)) { - 
throw new AssertionError( - "Parse mismatch for input: " - + input - + "\nExpected: " - + formatList(expected) - + "\nActual: " - + formatList(actual)); - } - } - - private void assertEqual(Object actual, Object expected) { - if (expected == null && actual == null) return; - if (expected == null || actual == null) { - throw new AssertionError("Expected: " + expected + ", Actual: " + actual); - } - if (expected instanceof List && actual instanceof List) { - if (!listsEqual((List) actual, (List) expected)) { - throw new AssertionError("Expected: " + expected + ", Actual: " + actual); - } - } else if (!expected.equals(actual)) { - throw new AssertionError("Expected: " + expected + ", Actual: " + actual); - } - } - - private void assertEqual(int actual, int expected) { - if (actual != expected) { - throw new AssertionError("Expected: " + expected + ", Actual: " + actual); - } - } - - private boolean listsEqual(List a, List b) { - if (a.size() != b.size()) return false; - for (int i = 0; i < a.size(); i++) { - Object ai = a.get(i); - Object bi = b.get(i); - if (ai == null && bi == null) continue; - if (ai == null || bi == null) return false; - if (!ai.equals(bi)) return false; - } - return true; - } - - private String formatList(List list) { - StringBuilder sb = new StringBuilder("["); - for (int i = 0; i < list.size(); i++) { - if (i > 0) sb.append(", "); - String v = list.get(i); - if (v == null) { - sb.append("null"); - } else { - sb.append("\"").append(v.replace("\"", "\\\"").replace("\n", "\\n")).append("\""); - } - } - sb.append("]"); - return sb.toString(); - } -} diff --git a/foundations-jdbc-test/src/java/dev/typr/foundations/PgStructTest.java b/foundations-jdbc-test/src/java/dev/typr/foundations/PgStructTest.java deleted file mode 100644 index 39a3f88f29..0000000000 --- a/foundations-jdbc-test/src/java/dev/typr/foundations/PgStructTest.java +++ /dev/null @@ -1,1099 +0,0 @@ -package dev.typr.foundations; - -import java.sql.*; -import java.util.List; -import 
org.junit.Test; - -/** - * Tests for PostgreSQL composite type (record) support. - * - *
<p>

Tests the PgStruct class which provides JDBC support for composite types. - */ -public class PgStructTest { - - // Simple address composite type matching the one in composite-types.sql - record Address(String street, String city, String zip, String country) {} - - // Define the PgStruct for Address - static final PgStruct
<Address>

addressStruct = - PgStruct.
builder("address") - .stringField("street", PgTypes.text, Address::street) - .stringField("city", PgTypes.text, Address::city) - .stringField("zip", PgTypes.text, Address::zip) - .stringField("country", PgTypes.text, Address::country) - .build( - arr -> - new Address((String) arr[0], (String) arr[1], (String) arr[2], (String) arr[3])); - - static final PgType
<Address>
addressType = addressStruct.asType();
-
-  // person_name composite type
-  record PersonName(String firstName, String middleName, String lastName, String suffix) {}
-
-  static final PgStruct<PersonName> personNameStruct =
-      PgStruct.builder("person_name")
-          .stringField("first_name", PgTypes.text, PersonName::firstName)
-          .stringField("middle_name", PgTypes.text, PersonName::middleName)
-          .stringField("last_name", PgTypes.text, PersonName::lastName)
-          .stringField("suffix", PgTypes.text, PersonName::suffix)
-          .build(
-              arr ->
-                  new PersonName(
-                      (String) arr[0], (String) arr[1], (String) arr[2], (String) arr[3]));
-
-  // contact_info composite type (with nested address)
-  record ContactInfo(String email, String phone, Address address) {}
-
-  static final PgStruct<ContactInfo> contactInfoStruct =
-      PgStruct.builder("contact_info")
-          .stringField("email", PgTypes.text, ContactInfo::email)
-          .stringField("phone", PgTypes.text, ContactInfo::phone)
-          .nestedField("address", addressStruct, ContactInfo::address)
-          .build(arr -> new ContactInfo((String) arr[0], (String) arr[1], (Address) arr[2]));
-
-  static final PgType<ContactInfo> contactInfoType = contactInfoStruct.asType();
-
-  // point_2d composite type
-  record Point2D(Double x, Double y) {}
-
-  static final PgStruct<Point2D> point2dStruct =
-      PgStruct.builder("point_2d")
-          .doubleField("x", PgTypes.float8, Point2D::x)
-          .doubleField("y", PgTypes.float8, Point2D::y)
-          .build(arr -> new Point2D((Double) arr[0], (Double) arr[1]));
-
-  static final PgType<Point2D> point2dType = point2dStruct.asType();
-
-  // Connection helper for PostgreSQL
-  static <T> T withConnection(SqlFunction<T> f) {
-    String url = "jdbc:postgresql://localhost:6432/Adventureworks";
-    try (Connection conn = DriverManager.getConnection(url, "postgres", "password")) {
-      conn.setAutoCommit(false);
-      try {
-        return f.apply(conn);
-      } finally {
-        conn.rollback();
-      }
-    } catch (SQLException e) {
-      throw new RuntimeException(e);
-    }
-  }
-
-  @Test
-  public void testParseSimpleAddress() throws SQLException {
-    // Test parsing directly with PgRecordParser
-    String input = "(\"123 Main St\",\"New York\",10001,USA)";
-    List<String> fields = PgRecordParser.parse(input);
-
-    assertEqual(fields.size(), 4);
-    assertEqual(fields.get(0), "123 Main St");
-    assertEqual(fields.get(1), "New York");
-    assertEqual(fields.get(2), "10001");
-    assertEqual(fields.get(3), "USA");
-  }
-
-  @Test
-  public void testAddressRoundtrip() {
-    // Test encoding and decoding without database
-    Address original = new Address("123 Main St", "New York", "10001", "USA");
-
-    // Encode to text
-    StringBuilder sb = new StringBuilder();
-    addressType.pgText().unsafeEncode(original, sb);
-    String encoded = sb.toString();
-
-    System.out.println("Encoded address: " + encoded);
-
-    // The encoded should be parseable
-    List<String> parsed = PgRecordParser.parse(encoded);
-    assertEqual(parsed.size(), 4);
-    assertEqual(parsed.get(0), original.street());
-    assertEqual(parsed.get(1), original.city());
-    assertEqual(parsed.get(2), original.zip());
-    assertEqual(parsed.get(3), original.country());
-  }
-
-  @Test
-  public void testNestedContactInfoRoundtrip() {
-    // Test encoding and decoding nested composite without database
-    Address addr = new Address("456 Oak Ave", "Los Angeles", "90001", "USA");
-    ContactInfo original = new ContactInfo("test@example.com", "+1-555-0123", addr);
-
-    // Encode to text
-    StringBuilder sb = new StringBuilder();
-    contactInfoType.pgText().unsafeEncode(original, sb);
-    String encoded = sb.toString();
-
-    System.out.println("Encoded contact_info: " + encoded);
-
-    // The encoded should be parseable
-    List<String> parsed = PgRecordParser.parse(encoded);
-    assertEqual(parsed.size(), 3);
-    assertEqual(parsed.get(0), original.email());
-    assertEqual(parsed.get(1), original.phone());
-
-    // Third field is nested address
-    List<String> addressParsed = PgRecordParser.parseNested(parsed.get(2));
-    assertEqual(addressParsed.get(0), addr.street());
-    assertEqual(addressParsed.get(1), addr.city());
-    assertEqual(addressParsed.get(2), addr.zip());
-    assertEqual(addressParsed.get(3), addr.country());
-  }
-
-  @Test
-  public void testReadFromDatabase() {
-    withConnection(
-        conn -> {
-          System.out.println("Testing reading composite types from database...\n");
-
-          // Read a simple address
-          try (PreparedStatement ps =
-                  conn.prepareStatement("SELECT simple_address FROM composite_test WHERE id = 1");
-              ResultSet rs = ps.executeQuery()) {
-
-            if (rs.next()) {
-              Address addr = addressType.read().read(rs, 1);
-              System.out.println("Read address: " + addr);
-
-              assertEqual(addr.street(), "123 Main St");
-              assertEqual(addr.city(), "New York");
-              assertEqual(addr.zip(), "10001");
-              assertEqual(addr.country(), "USA");
-            } else {
-              throw new RuntimeException("No row found");
-            }
-          }
-
-          // Read a nested contact_info
-          try (PreparedStatement ps =
-                  conn.prepareStatement("SELECT full_contact FROM composite_test WHERE id = 1");
-              ResultSet rs = ps.executeQuery()) {
-
-            if (rs.next()) {
-              ContactInfo contact = contactInfoType.read().read(rs, 1);
-              System.out.println("Read contact_info: " + contact);
-
-              assertEqual(contact.email(), "test@example.com");
-              assertEqual(contact.phone(), "+1-555-0123");
-              assertNotNull(contact.address());
-              assertEqual(contact.address().street(), "456 Oak Ave");
-              assertEqual(contact.address().city(), "Los Angeles");
-            } else {
-              throw new RuntimeException("No row found");
-            }
-          }
-
-          System.out.println("\nAll database read tests passed!");
-          return null;
-        });
-  }
-
-  @Test
-  public void testWriteToDatabase() {
-    withConnection(
-        conn -> {
-          System.out.println("Testing writing composite types to database...\n");
-
-          // Insert a new row with composite values
-          Address newAddr = new Address("789 Pine Rd", "Chicago", "60601", "USA");
-
-          try (PreparedStatement ps =
-              conn.prepareStatement(
-                  "INSERT INTO composite_test (simple_address) VALUES (?) RETURNING id,"
-                      + " simple_address")) {
-            addressType.write().set(ps, 1, newAddr);
-            try (ResultSet rs = ps.executeQuery()) {
-              if (rs.next()) {
-                int id = rs.getInt(1);
-                Address readBack = addressType.read().read(rs, 2);
-
-                System.out.println("Inserted row with id: " + id);
-                System.out.println("Read back address: " + readBack);
-
-                assertEqual(readBack.street(), newAddr.street());
-                assertEqual(readBack.city(), newAddr.city());
-                assertEqual(readBack.zip(), newAddr.zip());
-                assertEqual(readBack.country(), newAddr.country());
-              } else {
-                throw new RuntimeException("No row returned from INSERT");
-              }
-            }
-          }
-
-          System.out.println("\nAll database write tests passed!");
-          return null;
-        });
-  }
-
-  @Test
-  public void testNullHandling() {
-    withConnection(
-        conn -> {
-          System.out.println("Testing NULL handling in composite types...\n");
-
-          // Read row with NULL address
-          try (PreparedStatement ps =
-                  conn.prepareStatement("SELECT simple_address FROM composite_test WHERE id = 2");
-              ResultSet rs = ps.executeQuery()) {
-
-            if (rs.next()) {
-              // Use opt() for nullable composite
-              var optAddr = addressType.opt().read().read(rs, 1);
-              System.out.println("Read optional address from row 2: " + optAddr);
-              assertEqual(optAddr.isEmpty(), true);
-            } else {
-              throw new RuntimeException("No row found");
-            }
-          }
-
-          System.out.println("\nNULL handling tests passed!");
-          return null;
-        });
-  }
-
-  @Test
-  public void testSpecialCharacters() {
-    withConnection(
-        conn -> {
-          System.out.println("Testing special characters in composite types...\n");
-
-          // Insert an address with special characters
-          Address specialAddr =
-              new Address("123 \"Quoted\" St", "City, With, Commas", "12345", "USA (Special)");
-
-          try (PreparedStatement ps =
-              conn.prepareStatement(
-                  "INSERT INTO composite_test (simple_address) VALUES (?) RETURNING"
-                      + " simple_address")) {
-            addressType.write().set(ps, 1, specialAddr);
-            try (ResultSet rs = ps.executeQuery()) {
-              if (rs.next()) {
-                Address readBack = addressType.read().read(rs, 1);
-
-                System.out.println("Original: " + specialAddr);
-                System.out.println("Read back: " + readBack);
-
-                assertEqual(readBack.street(), specialAddr.street());
-                assertEqual(readBack.city(), specialAddr.city());
-                assertEqual(readBack.zip(), specialAddr.zip());
-                assertEqual(readBack.country(), specialAddr.country());
-              } else {
-                throw new RuntimeException("No row returned from INSERT");
-              }
-            }
-          }
-
-          System.out.println("\nSpecial characters tests passed!");
-          return null;
-        });
-  }
-
-  @Test
-  public void testJsonRoundtrip() {
-    // Test JSON encoding/decoding
-    Address original = new Address("123 Main St", "New York", "10001", "USA");
-
-    var jsonValue = addressType.pgJson().toJson(original);
-    String jsonStr = jsonValue.encode();
-    System.out.println("JSON encoded: " + jsonStr);
-
-    var parsed = dev.typr.foundations.data.JsonValue.parse(jsonStr);
-    Address decoded = addressType.pgJson().fromJson(parsed);
-
-    assertEqual(decoded.street(), original.street());
-    assertEqual(decoded.city(), original.city());
-    assertEqual(decoded.zip(), original.zip());
-    assertEqual(decoded.country(), original.country());
-  }
-
-  // ============================================================================
-  // Deep Nesting Test - record → array → record → array with special chars
-  // ============================================================================
-
-  /**
-   * Inner item with special characters - the deepest level. Contains strings with quotes, commas,
-   * newlines, backslashes, and parentheses.
-   */
-  record InnerItem(String name, String description) {}
-
-  static final PgStruct<InnerItem> innerItemStruct =
-      PgStruct.builder("inner_item")
-          .stringField("name", PgTypes.text, InnerItem::name)
-          .stringField("description", PgTypes.text, InnerItem::description)
-          .build(arr -> new InnerItem((String) arr[0], (String) arr[1]));
-
-  static final PgType<InnerItem> innerItemType = innerItemStruct.asType();
-
-  /** Middle container - has an array of InnerItem plus its own fields with special chars. */
-  record MiddleContainer(String label, InnerItem[] items) {}
-
-  static final PgStruct<MiddleContainer> middleContainerStruct =
-      PgStruct.builder("middle_container")
-          .stringField("label", PgTypes.text, MiddleContainer::label)
-          .nestedArrayField("items", innerItemStruct, MiddleContainer::items, InnerItem[]::new)
-          .build(arr -> new MiddleContainer((String) arr[0], (InnerItem[]) arr[1]));
-
-  static final PgType<MiddleContainer> middleContainerType = middleContainerStruct.asType();
-
-  /** Outer wrapper - has an array of MiddleContainer plus its own fields with special chars. */
-  record OuterWrapper(String title, String metadata, MiddleContainer[] containers) {}
-
-  static final PgStruct<OuterWrapper> outerWrapperStruct =
-      PgStruct.builder("outer_wrapper")
-          .stringField("title", PgTypes.text, OuterWrapper::title)
-          .stringField("metadata", PgTypes.text, OuterWrapper::metadata)
-          .nestedArrayField(
-              "containers", middleContainerStruct, OuterWrapper::containers, MiddleContainer[]::new)
-          .build(
-              arr ->
-                  new OuterWrapper((String) arr[0], (String) arr[1], (MiddleContainer[]) arr[2]));
-
-  static final PgType<OuterWrapper> outerWrapperType = outerWrapperStruct.asType();
-
-  @Test
-  public void testDeepNestingWithSpecialCharsRoundtrip() {
-    System.out.println("Testing deep nesting with special characters roundtrip...\n");
-
-    // Create deeply nested data with special characters at ALL levels
-
-    // Level 4 (deepest): InnerItems with special chars
-    InnerItem inner1 =
-        new InnerItem(
-            "Item with \"quotes\"", // quotes
-            "Description, with, commas"); // commas
-
-    InnerItem inner2 =
-        new InnerItem(
-            "Item with (parens)", // parentheses
-            "Line1\nLine2\nLine3"); // newlines
-
-    InnerItem inner3 =
-        new InnerItem(
-            "Item with \\backslash\\", // backslashes
-            "All together: \"quotes\", commas, (parens), \\slash\\ and\nnewlines");
-
-    InnerItem inner4 =
-        new InnerItem(
-            "", // empty string
-            " "); // whitespace only
-
-    // Level 3: MiddleContainers with arrays of InnerItems
-    MiddleContainer middle1 =
-        new MiddleContainer(
-            "Container \"A\" with, special (chars)", new InnerItem[] {inner1, inner2});
-
-    MiddleContainer middle2 =
-        new MiddleContainer("Container\nwith\nnewlines", new InnerItem[] {inner3, inner4});
-
-    MiddleContainer middle3 =
-        new MiddleContainer("Empty items container", new InnerItem[] {}); // empty array
-
-    // Level 2: OuterWrapper with arrays of MiddleContainers
-    OuterWrapper original =
-        new OuterWrapper(
-            "The \"Ultimate\" Test (with all chars)",
-            "Metadata: \"quotes\", commas, (parens),\nnewlines, and \\backslashes\\",
-            new MiddleContainer[] {middle1, middle2, middle3});
-
-    // Encode to text (PostgreSQL composite format)
-    StringBuilder sb = new StringBuilder();
-    outerWrapperType.pgText().unsafeEncode(original, sb);
-    String encoded = sb.toString();
-
-    System.out.println("Encoded outer_wrapper (length=" + encoded.length() + "):");
-    System.out.println(encoded);
-    System.out.println();
-
-    // Decode back using JSON roundtrip (text format decoding happens through JDBC)
-    // For in-memory testing we verify that the parsed record matches
-    List<String> parsed = PgRecordParser.parse(encoded);
-    assertEqual(parsed.size(), 3);
-
-    // Verify title and metadata decoded correctly
-    assertEqual(parsed.get(0), original.title());
-    assertEqual(parsed.get(1), original.metadata());
-
-    // The containers array is the complex nested part - parse it
-    String containersStr = parsed.get(2);
-    List<String> containerElements = PgRecordParser.parseArray(containersStr);
-    assertEqual(containerElements.size(), 3); // 3 middle containers
-
-    // Parse first container
-    List<String> container0 = PgRecordParser.parse(containerElements.get(0));
-    assertEqual(container0.get(0), middle1.label());
-
-    // Parse items array of first container
-    String itemsStr = container0.get(1);
-    List<String> itemElements = PgRecordParser.parseArray(itemsStr);
-    assertEqual(itemElements.size(), 2); // 2 inner items
-
-    // Parse first inner item
-    List<String> item0 = PgRecordParser.parse(itemElements.get(0));
-    assertEqual(item0.get(0), inner1.name()); // quotes
-    assertEqual(item0.get(1), inner1.description()); // commas
-
-    // Parse second inner item
-    List<String> item1 = PgRecordParser.parse(itemElements.get(1));
-    assertEqual(item1.get(0), inner2.name()); // parens
-    assertEqual(item1.get(1), inner2.description()); // newlines
-
-    // Parse second container
-    List<String> container1 = PgRecordParser.parse(containerElements.get(1));
-    assertEqual(container1.get(0), middle2.label()); // newlines in label
-
-    String items1Str = container1.get(1);
-    List<String> itemElements1 = PgRecordParser.parseArray(items1Str);
-    assertEqual(itemElements1.size(), 2);
-
-    // Parse third and fourth inner items from second container
-    List<String> item2 = PgRecordParser.parse(itemElements1.get(0));
-    assertEqual(item2.get(0), inner3.name()); // backslash
-    assertEqual(item2.get(1), inner3.description()); // all together
-
-    List<String> item3 = PgRecordParser.parse(itemElements1.get(1));
-    assertEqual(item3.get(0), inner4.name()); // empty
-    assertEqual(item3.get(1), inner4.description()); // whitespace
-
-    // Parse third container (empty items)
-    List<String> container2 = PgRecordParser.parse(containerElements.get(2));
-    assertEqual(container2.get(0), middle3.label());
-    String items2Str = container2.get(1);
-    List<String> itemElements2 = PgRecordParser.parseArray(items2Str);
-    assertEqual(itemElements2.size(), 0); // empty array
-
-    // We've verified the complete deep parse of the encoded structure
-    OuterWrapper decoded = original; // Placeholder since we verified above
-
-    // Verify Level 1: OuterWrapper fields
-    assertEqual(decoded.title(), original.title());
-    assertEqual(decoded.metadata(), original.metadata());
-    assertNotNull(decoded.containers());
-    assertEqual(decoded.containers().length, 3);
-
-    // Verify Level 2: MiddleContainers
-    assertEqual(decoded.containers()[0].label(), middle1.label());
-    assertEqual(decoded.containers()[0].items().length, 2);
-
-    assertEqual(decoded.containers()[1].label(), middle2.label());
-    assertEqual(decoded.containers()[1].items().length, 2);
-
-    assertEqual(decoded.containers()[2].label(), middle3.label());
-    assertEqual(decoded.containers()[2].items().length, 0); // empty array
-
-    // Verify Level 3: InnerItems with special chars
-    assertEqual(decoded.containers()[0].items()[0].name(), inner1.name()); // quotes
-    assertEqual(decoded.containers()[0].items()[0].description(), inner1.description()); // commas
-
-    assertEqual(decoded.containers()[0].items()[1].name(), inner2.name()); // parens
-    assertEqual(decoded.containers()[0].items()[1].description(), inner2.description()); // newlines
-
-    assertEqual(decoded.containers()[1].items()[0].name(), inner3.name()); // backslash
-    assertEqual(
-        decoded.containers()[1].items()[0].description(), inner3.description()); // all together
-
-    assertEqual(decoded.containers()[1].items()[1].name(), inner4.name()); // empty
-    assertEqual(
-        decoded.containers()[1].items()[1].description(), inner4.description()); // whitespace
-
-    System.out.println("All deep nesting roundtrip assertions passed!");
-  }
-
-  @Test
-  public void testDeepNestingDatabaseRoundtrip() {
-    withConnection(
-        conn -> {
-          System.out.println("Testing deep nesting with database roundtrip...\n");
-
-          // Create the types in the database if they don't exist
-          try (Statement stmt = conn.createStatement()) {
-            // Drop existing types (in reverse dependency order)
-            stmt.execute("DROP TYPE IF EXISTS test_outer_wrapper CASCADE");
-            stmt.execute("DROP TYPE IF EXISTS test_middle_container CASCADE");
-            stmt.execute("DROP TYPE IF EXISTS test_inner_item CASCADE");
-
-            // Create inner_item type
-            stmt.execute(
-                """
-                CREATE TYPE test_inner_item AS (
-                  name TEXT,
-                  description TEXT
-                )
-                """);
-
-            // Create middle_container type with array of inner_item
-            stmt.execute(
-                """
-                CREATE TYPE test_middle_container AS (
-                  label TEXT,
-                  items test_inner_item[]
-                )
-                """);
-
-            // Create outer_wrapper type with array of middle_container
-            stmt.execute(
-                """
-                CREATE TYPE test_outer_wrapper AS (
-                  title TEXT,
-                  metadata TEXT,
-                  containers test_middle_container[]
-                )
-                """);
-
-            // Create a temporary table for testing
-            stmt.execute("DROP TABLE IF EXISTS test_deep_nesting");
-            stmt.execute(
-                """
-                CREATE TABLE test_deep_nesting (
-                  id SERIAL PRIMARY KEY,
-                  data test_outer_wrapper
-                )
-                """);
-          }
-
-          // Define PgTypes that match the actual database types
-          PgStruct<InnerItem> dbInnerItemStruct =
-              PgStruct.builder("test_inner_item")
-                  .stringField("name", PgTypes.text, InnerItem::name)
-                  .stringField("description", PgTypes.text, InnerItem::description)
-                  .build(arr -> new InnerItem((String) arr[0], (String) arr[1]));
-
-          PgStruct<MiddleContainer> dbMiddleContainerStruct =
-              PgStruct.builder("test_middle_container")
-                  .stringField("label", PgTypes.text, MiddleContainer::label)
-                  .nestedArrayField(
-                      "items", dbInnerItemStruct, MiddleContainer::items, InnerItem[]::new)
-                  .build(arr -> new MiddleContainer((String) arr[0], (InnerItem[]) arr[1]));
-
-          PgStruct<OuterWrapper> dbOuterWrapperStruct =
-              PgStruct.builder("test_outer_wrapper")
-                  .stringField("title", PgTypes.text, OuterWrapper::title)
-                  .stringField("metadata", PgTypes.text, OuterWrapper::metadata)
-                  .nestedArrayField(
-                      "containers",
-                      dbMiddleContainerStruct,
-                      OuterWrapper::containers,
-                      MiddleContainer[]::new)
-                  .build(
-                      arr ->
-                          new OuterWrapper(
-                              (String) arr[0], (String) arr[1], (MiddleContainer[]) arr[2]));
-
-          PgType<OuterWrapper> dbOuterWrapperType = dbOuterWrapperStruct.asType();
-
-          // Create test data with special chars at every level
-          InnerItem inner1 = new InnerItem("Item \"quoted\"", "Has, commas");
-          InnerItem inner2 = new InnerItem("(parentheses)", "Line1\nLine2");
-          InnerItem inner3 = new InnerItem("back\\slash", "\"quo,ted\" and (paren)\nand\\slash");
-
-          MiddleContainer middle1 =
-              new MiddleContainer(
-                  "Middle \"special\" (label)", new InnerItem[] {inner1, inner2, inner3});
-
-          MiddleContainer middle2 = new MiddleContainer("Multi\nLine\nLabel", new InnerItem[] {});
-
-          OuterWrapper original =
-              new OuterWrapper(
-                  "Test \"Title\" (Special)",
-                  "Meta, with \"quotes\"\nand newlines\\and slashes",
-                  new MiddleContainer[] {middle1, middle2});
-
-          // Insert into database
-          try (PreparedStatement ps =
-              conn.prepareStatement(
-                  "INSERT INTO test_deep_nesting (data) VALUES (?) RETURNING id, data")) {
-            dbOuterWrapperType.write().set(ps, 1, original);
-
-            try (ResultSet rs = ps.executeQuery()) {
-              if (rs.next()) {
-                int id = rs.getInt(1);
-                OuterWrapper readBack = dbOuterWrapperType.read().read(rs, 2);
-
-                System.out.println("Inserted row with id: " + id);
-                System.out.println("Original title: " + original.title());
-                System.out.println("Read back title: " + readBack.title());
-
-                // Verify ALL fields at ALL levels
-                assertEqual(readBack.title(), original.title());
-                assertEqual(readBack.metadata(), original.metadata());
-                assertEqual(readBack.containers().length, 2);
-
-                // First middle container
-                assertEqual(readBack.containers()[0].label(), middle1.label());
-                assertEqual(readBack.containers()[0].items().length, 3);
-                assertEqual(readBack.containers()[0].items()[0].name(), inner1.name());
-                assertEqual(
-                    readBack.containers()[0].items()[0].description(), inner1.description());
-                assertEqual(readBack.containers()[0].items()[1].name(), inner2.name());
-                assertEqual(
-                    readBack.containers()[0].items()[1].description(), inner2.description());
-                assertEqual(readBack.containers()[0].items()[2].name(), inner3.name());
-                assertEqual(
-                    readBack.containers()[0].items()[2].description(), inner3.description());
-
-                // Second middle container (empty items array)
-                assertEqual(readBack.containers()[1].label(), middle2.label());
-                assertEqual(readBack.containers()[1].items().length, 0);
-
-                System.out.println("\nAll database roundtrip assertions passed!");
-              } else {
-                throw new RuntimeException("No row returned");
-              }
-            }
-          }
-
-          return null;
-        });
-  }
-
-  @Test
-  public void testExtremeSpecialCharacters() {
-    withConnection(
-        conn -> {
-          System.out.println("Testing extreme special character combinations...\n");
-
-          // Test the most challenging edge cases
-          String[] extremeStrings = {
-            // Simple special chars
-            "\"", // just a quote
-            ",", // just a comma
-            "(", // just open paren
-            ")", // just close paren
-            "\\", // just backslash
-            "\n", // just newline
-            // Combinations
-            "\"\"", // two quotes
-            ",,", // two commas
-            "()", // empty parens
-            "\\\\", // two backslashes
-            "\n\n", // two newlines
-            // Complex
-            "\"hello\"", // quoted word
-            "(a,b,c)", // parens with commas
-            "a\\\"b", // backslash quote
-            "line1\nline2\nline3", // multi-line
-            // Nightmare combinations
-            "\"(a,b)\"", // quoted parens with comma
-            "\\\"quoted\\\"", // escaped quotes
-            "(\"nested\",\"array\")", // looks like composite
-            "\"a\nb\nc\"", // quoted newlines
-            "\\(\\)\\,\\\"", // all escaped
-            // Real-world-ish
-            "O'Brien, \"Jim\"", // name with quotes and comma
-            "C:\\Users\\Documents", // Windows path
-            "SELECT * FROM \"table\" WHERE x = 'y'", // SQL-like
-            "{\"key\": \"value\"}", // JSON-like
-            "<tag>content</tag>", // XML-like
-          };
-
-          // Test each string in a simple roundtrip through database
-          try (Statement stmt = conn.createStatement()) {
-            stmt.execute("DROP TABLE IF EXISTS test_extreme_chars");
-            stmt.execute(
-                """
-                CREATE TABLE test_extreme_chars (
-                  id SERIAL PRIMARY KEY,
-                  addr address
-                )
-                """);
-          }
-
-          for (String extreme : extremeStrings) {
-            Address testAddr = new Address(extreme, extreme, "12345", extreme);
-
-            try (PreparedStatement ps =
-                conn.prepareStatement(
-                    "INSERT INTO test_extreme_chars (addr) VALUES (?) RETURNING addr")) {
-              addressType.write().set(ps, 1, testAddr);
-
-              try (ResultSet rs = ps.executeQuery()) {
-                if (rs.next()) {
-                  Address readBack = addressType.read().read(rs, 1);
-
-                  // Verify each field
-                  if (!extreme.equals(readBack.street())) {
-                    System.err.println("FAILED for string: " + escape(extreme));
-                    System.err.println("  Expected: " + escape(extreme));
-                    System.err.println("  Got: " + escape(readBack.street()));
-                    throw new AssertionError("Street mismatch for: " + escape(extreme));
-                  }
-                  if (!extreme.equals(readBack.city())) {
-                    throw new AssertionError("City mismatch for: " + escape(extreme));
-                  }
-                  if (!extreme.equals(readBack.country())) {
-                    throw new AssertionError("Country mismatch for: " + escape(extreme));
-                  }
-
-                  System.out.println("✓ Passed: " + escape(extreme));
-                }
-              }
-            }
-          }
-
-          System.out.println("\nAll extreme special character tests passed!");
-          return null;
-        });
-  }
-
-  private String escape(String s) {
-    if (s == null) return "null";
-    return s.replace("\\", "\\\\")
-        .replace("\"", "\\\"")
-        .replace("\n", "\\n")
-        .replace("\r", "\\r")
-        .replace("\t", "\\t");
-  }
-
-  // ============================================================================
-  // Mixed Types Deep Nesting Test
-  // ============================================================================
-
-  /** Measurement with mixed primitive types */
-  record Measurement(Integer id, Double value, Boolean valid, String note) {}
-
-  static final PgStruct<Measurement> measurementStruct =
-      PgStruct.builder("measurement")
-          .intField("id", PgTypes.int4, Measurement::id)
-          .doubleField("value", PgTypes.float8, Measurement::value)
-          .booleanField("valid", PgTypes.bool, Measurement::valid)
-          .stringField("note", PgTypes.text, Measurement::note)
-          .build(
-              arr ->
-                  new Measurement(
-                      (Integer) arr[0], (Double) arr[1], (Boolean) arr[2], (String) arr[3]));
-
-  /** Sensor with nested composite (Point2D) and array of mixed-type composite (Measurement[]) */
-  record Sensor(String name, Point2D location, Measurement[] readings) {}
-
-  static final PgStruct<Sensor> sensorStruct =
-      PgStruct.builder("sensor")
-          .stringField("name", PgTypes.text, Sensor::name)
-          .nestedField("location", point2dStruct, Sensor::location)
-          .nestedArrayField("readings", measurementStruct, Sensor::readings, Measurement[]::new)
-          .build(arr -> new Sensor((String) arr[0], (Point2D) arr[1], (Measurement[]) arr[2]));
-
-  /** Observatory with Long id, array of Sensors, and nested Address */
-  record Observatory(Long id, String name, Sensor[] sensors, Address headquarters) {}
-
-  static final PgStruct<Observatory> observatoryStruct =
-      PgStruct.builder("observatory")
-          .longField("id", PgTypes.int8, Observatory::id)
-          .stringField("name", PgTypes.text, Observatory::name)
-          .nestedArrayField("sensors", sensorStruct, Observatory::sensors, Sensor[]::new)
-          .nestedField("headquarters", addressStruct, Observatory::headquarters)
-          .build(
-              arr ->
-                  new Observatory(
-                      (Long) arr[0], (String) arr[1], (Sensor[]) arr[2], (Address) arr[3]));
-
-  static final PgType<Observatory> observatoryType = observatoryStruct.asType();
-
-  @Test
-  public void testMixedTypesDeepNestingRoundtrip() {
-    System.out.println("Testing mixed types deep nesting roundtrip...\n");
-
-    // Create measurements with various values including edge cases
-    Measurement m1 = new Measurement(1, 3.14159, true, "Normal reading");
-    Measurement m2 = new Measurement(2, -273.15, false, "Below absolute zero, \"invalid\"");
-    Measurement m3 = new Measurement(3, 0.0, true, "Zero point");
-    Measurement m4 = new Measurement(Integer.MAX_VALUE, Double.MAX_VALUE, true, "Max values");
-    Measurement m5 = new Measurement(Integer.MIN_VALUE, Double.MIN_VALUE, false, "Min values");
-    Measurement m6 = new Measurement(0, 1.23e-10, true, "Scientific notation");
-
-    // Create sensors with nested Point2D and array of Measurements
-    Point2D loc1 = new Point2D(40.7128, -74.0060); // NYC
-    Point2D loc2 = new Point2D(-33.8688, 151.2093); // Sydney
-    Point2D loc3 = new Point2D(0.0, 0.0); // Origin
-
-    Sensor sensor1 = new Sensor("NYC Weather Station", loc1, new Measurement[] {m1, m2, m3});
-    Sensor sensor2 = new Sensor("Sydney \"Observatory\"", loc2, new Measurement[] {m4, m5});
-    Sensor sensor3 = new Sensor("Empty Sensor", loc3, new Measurement[] {}); // empty array
-    Sensor sensor4 = new Sensor("Single, Reading (Sensor)", loc1, new Measurement[] {m6});
-
-    // Create headquarters address with special chars
-    Address hq = new Address("123 Science Blvd", "Research City", "12345", "USA");
-
-    // Create observatory with everything
-    Observatory original =
-        new Observatory(
-            9999999999L,
-            "Global \"Weather\" Observatory (Main)",
-            new Sensor[] {sensor1, sensor2, sensor3, sensor4},
-            hq);
-
-    // Encode to text
-    StringBuilder sb = new StringBuilder();
-    observatoryType.pgText().unsafeEncode(original, sb);
-    String encoded = sb.toString();
-
-    System.out.println("Encoded observatory (length=" + encoded.length() + "):");
-    System.out.println(encoded.substring(0, Math.min(500, encoded.length())) + "...");
-    System.out.println();
-
-    // Parse and verify structure
-    List<String> parsed = PgRecordParser.parse(encoded);
-    assertEqual(parsed.size(), 4);
-
-    // Verify Long id
-    assertEqual(Long.parseLong(parsed.get(0)), original.id());
-
-    // Verify name with special chars
-    assertEqual(parsed.get(1), original.name());
-
-    // Parse sensors array
-    List<String> sensorElements = PgRecordParser.parseArray(parsed.get(2));
-    assertEqual(sensorElements.size(), 4);
-
-    // Parse first sensor
-    List<String> sensor1Parsed = PgRecordParser.parse(sensorElements.get(0));
-    assertEqual(sensor1Parsed.get(0), sensor1.name());
-
-    // Parse location (Point2D)
-    List<String> loc1Parsed = PgRecordParser.parse(sensor1Parsed.get(1));
-    assertEqual(Double.parseDouble(loc1Parsed.get(0)), loc1.x());
-    assertEqual(Double.parseDouble(loc1Parsed.get(1)), loc1.y());
-
-    // Parse readings array
-    List<String> readingsElements = PgRecordParser.parseArray(sensor1Parsed.get(2));
-    assertEqual(readingsElements.size(), 3);
-
-    // Parse first measurement
-    List<String> m1Parsed = PgRecordParser.parse(readingsElements.get(0));
-    assertEqual((Integer) Integer.parseInt(m1Parsed.get(0)), m1.id());
-    assertEqual((Double) Double.parseDouble(m1Parsed.get(1)), m1.value());
-    boolean m1Valid = m1Parsed.get(2).equals("t") || m1Parsed.get(2).equals("true");
-    assertEqual((Boolean) m1Valid, m1.valid());
-    assertEqual(m1Parsed.get(3), m1.note());
-
-    // Parse second measurement (has special chars in note)
-    List<String> m2Parsed = PgRecordParser.parse(readingsElements.get(1));
-    assertEqual((Integer) Integer.parseInt(m2Parsed.get(0)), m2.id());
-    assertEqual((Double) Double.parseDouble(m2Parsed.get(1)), m2.value());
-    boolean m2Valid = m2Parsed.get(2).equals("t") || m2Parsed.get(2).equals("true");
-    assertEqual((Boolean) m2Valid, m2.valid());
-    assertEqual(m2Parsed.get(3), m2.note()); // Contains quotes
-
-    // Verify headquarters address
-    List<String> hqParsed = PgRecordParser.parse(parsed.get(3));
-    assertEqual(hqParsed.get(0), hq.street());
-    assertEqual(hqParsed.get(1), hq.city());
-    assertEqual(hqParsed.get(2), hq.zip());
-    assertEqual(hqParsed.get(3), hq.country());
-
-    System.out.println("All mixed types deep nesting roundtrip assertions passed!");
-  }
-
-  @Test
-  public void testMixedTypesDeepNestingDatabaseRoundtrip() {
-    withConnection(
-        conn -> {
-          System.out.println("Testing mixed types deep nesting with database roundtrip...\n");
-
-          // Create the types in the database
-          try (Statement stmt = conn.createStatement()) {
-            // Drop existing types (in reverse dependency order)
-            stmt.execute("DROP TYPE IF EXISTS test_observatory CASCADE");
-            stmt.execute("DROP TYPE IF EXISTS test_sensor CASCADE");
-            stmt.execute("DROP TYPE IF EXISTS test_measurement CASCADE");
-            stmt.execute("DROP TYPE IF EXISTS test_point_2d CASCADE");
-
-            // Create point_2d type
-            stmt.execute(
-                """
-                CREATE TYPE test_point_2d AS (
-                  x DOUBLE PRECISION,
-                  y DOUBLE PRECISION
-                )
-                """);
-
-            // Create measurement type with mixed primitives
-            stmt.execute(
-                """
-                CREATE TYPE test_measurement AS (
-                  id INTEGER,
-                  value DOUBLE PRECISION,
-                  valid BOOLEAN,
-                  note TEXT
-                )
-                """);
-
-            // Create sensor type with nested composite and array
-            stmt.execute(
-                """
-                CREATE TYPE test_sensor AS (
-                  name TEXT,
-                  location test_point_2d,
-                  readings test_measurement[]
-                )
-                """);
-
-            // Create observatory type
-            stmt.execute(
-                """
-                CREATE TYPE test_observatory AS (
-                  id BIGINT,
-                  name TEXT,
-                  sensors test_sensor[],
-                  headquarters address
-                )
-                """);
-
-            // Create test table
-            stmt.execute("DROP TABLE IF EXISTS test_mixed_types");
-            stmt.execute(
-                """
-                CREATE TABLE test_mixed_types (
-                  id SERIAL PRIMARY KEY,
-                  data test_observatory
-                )
-                """);
-          }
-
-          // Define PgTypes matching database types
-          PgStruct<Point2D> dbPoint2dStruct =
-              PgStruct.builder("test_point_2d")
-                  .doubleField("x", PgTypes.float8, Point2D::x)
-                  .doubleField("y", PgTypes.float8, Point2D::y)
-                  .build(arr -> new Point2D((Double) arr[0], (Double) arr[1]));
-
-          PgStruct<Measurement> dbMeasurementStruct =
-              PgStruct.builder("test_measurement")
-                  .intField("id", PgTypes.int4, Measurement::id)
-                  .doubleField("value", PgTypes.float8, Measurement::value)
-                  .booleanField("valid", PgTypes.bool, Measurement::valid)
-                  .stringField("note", PgTypes.text, Measurement::note)
-                  .build(
-                      arr ->
-                          new Measurement(
-                              (Integer) arr[0],
-                              (Double) arr[1],
-                              (Boolean) arr[2],
-                              (String) arr[3]));
-
-          PgStruct<Sensor> dbSensorStruct =
-              PgStruct.builder("test_sensor")
-                  .stringField("name", PgTypes.text, Sensor::name)
-                  .nestedField("location", dbPoint2dStruct, Sensor::location)
-                  .nestedArrayField(
-                      "readings", dbMeasurementStruct, Sensor::readings, Measurement[]::new)
-                  .build(
-                      arr -> new Sensor((String) arr[0], (Point2D) arr[1], (Measurement[]) arr[2]));
-
-          PgStruct<Observatory> dbObservatoryStruct =
-              PgStruct.builder("test_observatory")
-                  .longField("id", PgTypes.int8, Observatory::id)
-                  .stringField("name", PgTypes.text, Observatory::name)
-                  .nestedArrayField("sensors", dbSensorStruct, Observatory::sensors, Sensor[]::new)
-                  .nestedField("headquarters", addressStruct, Observatory::headquarters)
-                  .build(
-                      arr ->
-                          new Observatory(
-                              (Long) arr[0], (String) arr[1], (Sensor[]) arr[2], (Address) arr[3]));
-
-          PgType<Observatory> dbObservatoryType = dbObservatoryStruct.asType();
-
-          // Create test data with mixed types
-          Measurement m1 = new Measurement(1, 98.6, true, "Normal temp");
-          Measurement m2 = new Measurement(2, -40.0, false, "Cold \"reading\"");
-          Measurement m3 = new Measurement(3, 1000.5, true, "High, with comma");
-
-          Point2D loc1 = new Point2D(51.5074, -0.1278); // London
-          Point2D loc2 = new Point2D(35.6762, 139.6503); // Tokyo
-
-          Sensor sensor1 = new Sensor("London Station", loc1, new Measurement[] {m1, m2});
-          Sensor sensor2 = new Sensor("Tokyo \"Main\" (Station)", loc2, new Measurement[] {m3});
-
-          Address hq = new Address("1 Observatory Way", "Science Town", "SC1 2AB", "UK");
-
-          Observatory original =
-              new Observatory(
-                  123456789L, "International Weather Network", new Sensor[] {sensor1, sensor2}, hq);
-
-          // Insert and read back
-          try (PreparedStatement ps =
-              conn.prepareStatement(
-                  "INSERT INTO test_mixed_types (data) VALUES (?) RETURNING id, data")) {
-            dbObservatoryType.write().set(ps, 1, original);
-
-            try (ResultSet rs = ps.executeQuery()) {
-              if (rs.next()) {
-                int id = rs.getInt(1);
-                Observatory readBack = dbObservatoryType.read().read(rs, 2);
-
-                System.out.println("Inserted row with id: " + id);
-
-                // Verify all fields at all levels
-                assertEqual(readBack.id(), original.id());
-                assertEqual(readBack.name(), original.name());
-
-                // Verify sensors array
-                assertEqual(readBack.sensors().length, 2);
-
-                // First sensor
-                assertEqual(readBack.sensors()[0].name(), sensor1.name());
-                assertEqual(readBack.sensors()[0].location().x(), loc1.x());
-                assertEqual(readBack.sensors()[0].location().y(), loc1.y());
-                assertEqual(readBack.sensors()[0].readings().length, 2);
-
-                // First measurement
-                assertEqual(readBack.sensors()[0].readings()[0].id(), m1.id());
-                assertEqual(readBack.sensors()[0].readings()[0].value(), m1.value());
-                assertEqual(readBack.sensors()[0].readings()[0].valid(), m1.valid());
-                assertEqual(readBack.sensors()[0].readings()[0].note(), m1.note());
-
-                // Second measurement (has special chars)
-                assertEqual(readBack.sensors()[0].readings()[1].id(), m2.id());
-                assertEqual(readBack.sensors()[0].readings()[1].value(), m2.value());
-                assertEqual(readBack.sensors()[0].readings()[1].valid(), m2.valid());
-                assertEqual(readBack.sensors()[0].readings()[1].note(), m2.note());
-
-                // Second sensor (has special chars in name)
-                assertEqual(readBack.sensors()[1].name(), sensor2.name());
-                assertEqual(readBack.sensors()[1].location().x(), loc2.x());
-                assertEqual(readBack.sensors()[1].location().y(), loc2.y());
-                assertEqual(readBack.sensors()[1].readings().length, 1);
-                assertEqual(readBack.sensors()[1].readings()[0].note(), m3.note());
-
-                // Headquarters
-                assertEqual(readBack.headquarters().street(), hq.street());
-                assertEqual(readBack.headquarters().city(), hq.city());
-                assertEqual(readBack.headquarters().zip(), hq.zip());
-                assertEqual(readBack.headquarters().country(), hq.country());
-
-                System.out.println("\nAll mixed types database roundtrip assertions passed!");
-              } else {
-                throw new RuntimeException("No row returned");
-              }
-            }
-          }
-
-          return null;
-        });
-  }
-
-  // Helper methods
-
-  private void assertEqual(Object actual, Object expected) {
-    if (expected == null && actual == null) return;
-    if (expected == null || actual == null || !expected.equals(actual)) {
-      throw new AssertionError("Expected: " + expected + ", Actual: " + actual);
-    }
-  }
-
-  private void assertEqual(int actual, int expected) {
-    if (actual != expected) {
-      throw new AssertionError("Expected: " + expected + ", Actual: " + actual);
-    }
-  }
-
-  private void assertEqual(boolean actual, boolean expected) {
-    if (actual != expected) {
-      throw new AssertionError("Expected: " + expected + ", Actual: " + actual);
-    }
-  }
-
-  private void assertNotNull(Object obj) {
-    if (obj == null) {
-      throw new AssertionError("Expected non-null value");
-    }
-  }
-}
diff --git a/foundations-jdbc-test/src/java/dev/typr/foundations/PgTypeTest.java b/foundations-jdbc-test/src/java/dev/typr/foundations/PgTypeTest.java
deleted file mode 100644
index 9298e68b3c..0000000000
--- a/foundations-jdbc-test/src/java/dev/typr/foundations/PgTypeTest.java
+++ /dev/null
@@ -1,1013 +0,0 @@
-package dev.typr.foundations;
-
-import dev.typr.foundations.data.*;
-import dev.typr.foundations.data.JsonValue;
-import dev.typr.foundations.data.Vector;
-import java.math.BigDecimal;
-import java.sql.Connection;
-import java.sql.PreparedStatement;
-import java.sql.SQLException;
-import java.time.*;
-import java.time.temporal.ChronoUnit;
-import java.util.*;
-import java.util.concurrent.atomic.AtomicInteger;
-import org.junit.Test;
-import org.postgresql.geometric.*;
-import org.postgresql.jdbc.PgConnection;
-import org.postgresql.util.PGInterval;
-
-public class PgTypeTest {
-
-  private static final AtomicInteger tableCounter = new AtomicInteger(0);
-
-  private static String uniqueTableName(String prefix)
{ - return prefix + "_" + tableCounter.incrementAndGet(); - } - - // PostgreSQL only supports microsecond precision (6 digits), but Java's now() methods - // return nanosecond precision (9 digits). Truncate to ensure roundtrip equality. - private static LocalTime nowTime() { - return LocalTime.now().truncatedTo(ChronoUnit.MICROS); - } - - private static LocalDateTime nowDateTime() { - return LocalDateTime.now().truncatedTo(ChronoUnit.MICROS); - } - - private static Instant nowInstant() { - return Instant.now().truncatedTo(ChronoUnit.MICROS); - } - - private static OffsetTime nowOffsetTime() { - return OffsetTime.now().truncatedTo(ChronoUnit.MICROS); - } - - record TestPair<A>(A t0, Optional<A> t1) {} - - record PgTypeAndExample<A>( - PgType<A> type, - A example, - boolean hasIdentity, - boolean streamingWorks, - boolean compositeTextWorks) { - public PgTypeAndExample(PgType<A> type, A example) { - this(type, example, true, true, true); - } - - public PgTypeAndExample<A> noStreaming() { - return new PgTypeAndExample<>(type, example, hasIdentity, false, compositeTextWorks); - } - - public PgTypeAndExample<A> noIdentity() { - return new PgTypeAndExample<>(type, example, false, streamingWorks, compositeTextWorks); - } - - public PgTypeAndExample<A> noCompositeText() { - return new PgTypeAndExample<>(type, example, hasIdentity, streamingWorks, false); - } - } - - List<PgTypeAndExample<?>> All = - List.<PgTypeAndExample<?>>of( - // ==================== ACL Item Types ==================== - new PgTypeAndExample<>(PgTypes.aclitem, new AclItem("postgres=r*w/postgres")), - new PgTypeAndExample<>( - PgTypes.aclitemArray, new AclItem[] {new AclItem("postgres=r*w/postgres")}), - - // ==================== Boolean Types ==================== - new PgTypeAndExample<>(PgTypes.bool, true), - new PgTypeAndExample<>(PgTypes.bool, false), // Edge case: false value - new PgTypeAndExample<>(PgTypes.boolArray, new Boolean[] {true, false}), - new PgTypeAndExample<>(PgTypes.boolArray, new Boolean[] {}), // Edge case: empty array - new 
PgTypeAndExample<>(PgTypes.boolArrayUnboxed, new boolean[] {true, false}), - new PgTypeAndExample<>( - PgTypes.boolArrayUnboxed, new boolean[] {}), // Edge case: empty array - - // ==================== Geometric Types ==================== - new PgTypeAndExample<>(PgTypes.box, new PGbox(42, 42, 42, 42)).noIdentity(), - new PgTypeAndExample<>(PgTypes.box, new PGbox(-100, -50, 100, 50)) - .noIdentity(), // Edge case: negative coords - new PgTypeAndExample<>(PgTypes.boxArray, new PGbox[] {new PGbox(42, 42, 42, 42)}) - .noIdentity(), - new PgTypeAndExample<>(PgTypes.circle, new PGcircle(new PGpoint(0.01, 42.34), 101.2)), - new PgTypeAndExample<>(PgTypes.circle, new PGcircle(new PGpoint(0, 0), 0)) - .noIdentity(), // Edge case: zero radius - new PgTypeAndExample<>( - PgTypes.circleArray, - new PGcircle[] {new PGcircle(new PGpoint(0.01, 42.34), 101.2)}) - .noIdentity(), - new PgTypeAndExample<>(PgTypes.line, new PGline(1.1, 2.2, 3.3)).noIdentity(), - new PgTypeAndExample<>(PgTypes.lineArray, new PGline[] {new PGline(1.1, 2.2, 3.3)}) - .noIdentity(), - new PgTypeAndExample<>(PgTypes.lseg, new PGlseg(1.1, 2.2, 3.3, 4.4)).noIdentity(), - new PgTypeAndExample<>(PgTypes.lsegArray, new PGlseg[] {new PGlseg(1.1, 2.2, 3.3, 4.4)}) - .noIdentity(), - new PgTypeAndExample<>( - PgTypes.path, - new PGpath(new PGpoint[] {new PGpoint(1.1, 2.2), new PGpoint(3.3, 4.4)}, true)) - .noIdentity(), - new PgTypeAndExample<>( - PgTypes.pathArray, - new PGpath[] { - new PGpath(new PGpoint[] {new PGpoint(1.1, 2.2), new PGpoint(3.3, 4.4)}, true) - }) - .noIdentity(), - new PgTypeAndExample<>(PgTypes.point, new PGpoint(1.1, 2.2)).noIdentity(), - new PgTypeAndExample<>(PgTypes.point, new PGpoint(0, 0)) - .noIdentity(), // Edge case: origin - new PgTypeAndExample<>(PgTypes.pointArray, new PGpoint[] {new PGpoint(1.1, 2.2)}) - .noIdentity(), - new PgTypeAndExample<>( - PgTypes.polygon, - new PGpolygon(new PGpoint[] {new PGpoint(1.1, 2.2), new PGpoint(3.3, 4.4)})) - .noIdentity(), - new 
PgTypeAndExample<>( - PgTypes.polygonArray, - new PGpolygon[] { - new PGpolygon(new PGpoint[] {new PGpoint(1.1, 2.2), new PGpoint(3.3, 4.4)}) - }) - .noIdentity(), - - // ==================== Character Types ==================== - new PgTypeAndExample<>(PgTypes.bpchar(5), "377 "), - new PgTypeAndExample<>(PgTypes.bpchar, "377"), - new PgTypeAndExample<>(PgTypes.bpchar, ""), // Edge case: empty string - new PgTypeAndExample<>(PgTypes.bpcharArray(5), new String[] {"377 "}), - new PgTypeAndExample<>(PgTypes.bpcharArray, new String[] {"10101"}), - new PgTypeAndExample<>(PgTypes.text, ",.;{}[]-//#®✅"), - new PgTypeAndExample<>(PgTypes.text, ""), // Edge case: empty string - new PgTypeAndExample<>( - PgTypes.text, "Line1\nLine2\tTabbed"), // Edge case: whitespace chars - new PgTypeAndExample<>(PgTypes.text, "Quote\"Test'Single"), // Edge case: quotes - new PgTypeAndExample<>(PgTypes.text, "Emoji: 😀🎉🚀"), // Edge case: emoji - new PgTypeAndExample<>(PgTypes.textArray, new String[] {",.;{}[]-//#®✅"}), - new PgTypeAndExample<>( - PgTypes.textArray, new String[] {"a", "b", "c"}), // Edge case: multiple elements - new PgTypeAndExample<>(PgTypes.textArray, new String[] {}), // Edge case: empty array - - // ==================== Binary Types ==================== - new PgTypeAndExample<>(PgTypes.bytea, new byte[] {-1, 1, 127}), - new PgTypeAndExample<>(PgTypes.bytea, new byte[] {}), // Edge case: empty byte array - new PgTypeAndExample<>(PgTypes.bytea, new byte[] {0, 0, 0}), // Edge case: all zeros - new PgTypeAndExample<>( - PgTypes.bytea, - new byte[] {(byte) 0xFF, (byte) 0xFE, (byte) 0xFD}), // Edge case: high bytes - - // ==================== Date/Time Types ==================== - new PgTypeAndExample<>(PgTypes.date, LocalDate.now()), - new PgTypeAndExample<>(PgTypes.date, LocalDate.of(1970, 1, 1)), // Edge case: epoch - new PgTypeAndExample<>(PgTypes.date, LocalDate.of(2099, 12, 31)), // Edge case: far future - new PgTypeAndExample<>(PgTypes.dateArray, new LocalDate[] 
{LocalDate.now()}), - new PgTypeAndExample<>(PgTypes.time, nowTime()), - new PgTypeAndExample<>(PgTypes.time, LocalTime.of(0, 0, 0)), // Edge case: midnight - new PgTypeAndExample<>( - PgTypes.time, LocalTime.of(23, 59, 59, 999999000)), // Edge case: end of day - new PgTypeAndExample<>(PgTypes.timeArray, new LocalTime[] {nowTime()}), - new PgTypeAndExample<>(PgTypes.timestamp, nowDateTime()), - new PgTypeAndExample<>( - PgTypes.timestamp, LocalDateTime.of(1970, 1, 1, 0, 0, 0)), // Edge case: epoch - new PgTypeAndExample<>(PgTypes.timestampArray, new LocalDateTime[] {nowDateTime()}), - new PgTypeAndExample<>(PgTypes.timestamptz, nowInstant()), - new PgTypeAndExample<>(PgTypes.timestamptz, Instant.EPOCH), // Edge case: epoch - new PgTypeAndExample<>(PgTypes.timestamptzArray, new Instant[] {nowInstant()}), - new PgTypeAndExample<>(PgTypes.timetz, nowOffsetTime()), - new PgTypeAndExample<>(PgTypes.timetzArray, new OffsetTime[] {nowOffsetTime()}), - new PgTypeAndExample<>(PgTypes.interval, new PGInterval(1, 2, 3, 4, 5, 6.666)), - new PgTypeAndExample<>(PgTypes.interval, new PGInterval(0, 0, 0, 0, 0, 0)) - .noIdentity(), // Edge case: zero interval - new PgTypeAndExample<>( - PgTypes.intervalArray, new PGInterval[] {new PGInterval(1, 2, 3, 4, 5, 6.666)}) - .noIdentity(), - - // ==================== Numeric Types ==================== - new PgTypeAndExample<>(PgTypes.int2, (short) 42), - new PgTypeAndExample<>(PgTypes.int2, Short.MIN_VALUE), // Edge case: min value - new PgTypeAndExample<>(PgTypes.int2, Short.MAX_VALUE), // Edge case: max value - new PgTypeAndExample<>(PgTypes.int2, (short) 0), // Edge case: zero - new PgTypeAndExample<>(PgTypes.int2Array, new Short[] {42}), - new PgTypeAndExample<>(PgTypes.int2ArrayUnboxed, new short[] {42}), - new PgTypeAndExample<>( - PgTypes.int2ArrayUnboxed, new short[] {}), // Edge case: empty array - new PgTypeAndExample<>(PgTypes.int4, 42), - new PgTypeAndExample<>(PgTypes.int4, Integer.MIN_VALUE), // Edge case: min value - new 
PgTypeAndExample<>(PgTypes.int4, Integer.MAX_VALUE), // Edge case: max value - new PgTypeAndExample<>(PgTypes.int4, 0), // Edge case: zero - new PgTypeAndExample<>(PgTypes.int4Array, new Integer[] {42}), - new PgTypeAndExample<>(PgTypes.int4ArrayUnboxed, new int[] {42}), - new PgTypeAndExample<>(PgTypes.int4ArrayUnboxed, new int[] {}), // Edge case: empty array - new PgTypeAndExample<>(PgTypes.int8, 42L), - new PgTypeAndExample<>(PgTypes.int8, Long.MIN_VALUE), // Edge case: min value - new PgTypeAndExample<>(PgTypes.int8, Long.MAX_VALUE), // Edge case: max value - new PgTypeAndExample<>(PgTypes.int8, 0L), // Edge case: zero - new PgTypeAndExample<>(PgTypes.int8Array, new Long[] {42L}), - new PgTypeAndExample<>(PgTypes.int8ArrayUnboxed, new long[] {42L}), - new PgTypeAndExample<>(PgTypes.int8ArrayUnboxed, new long[] {}), // Edge case: empty array - new PgTypeAndExample<>(PgTypes.float4, 42.42f), - new PgTypeAndExample<>(PgTypes.float4, 0.0f), // Edge case: zero - new PgTypeAndExample<>(PgTypes.float4, 1.0E-38f), // Edge case: small positive - new PgTypeAndExample<>(PgTypes.float4Array, new Float[] {42.42f}), - new PgTypeAndExample<>(PgTypes.float4ArrayUnboxed, new float[] {42.42f}), - new PgTypeAndExample<>( - PgTypes.float4ArrayUnboxed, new float[] {}), // Edge case: empty array - new PgTypeAndExample<>(PgTypes.float8, 42.42), - new PgTypeAndExample<>(PgTypes.float8, 0.0), // Edge case: zero - new PgTypeAndExample<>(PgTypes.float8, Double.MAX_VALUE), // Edge case: max value - new PgTypeAndExample<>(PgTypes.float8Array, new Double[] {42.42}), - new PgTypeAndExample<>(PgTypes.float8ArrayUnboxed, new double[] {42.42}), - new PgTypeAndExample<>( - PgTypes.float8ArrayUnboxed, new double[] {}), // Edge case: empty array - new PgTypeAndExample<>(PgTypes.numeric, new BigDecimal("0.002")), - new PgTypeAndExample<>(PgTypes.numeric, BigDecimal.ZERO), // Edge case: zero - new PgTypeAndExample<>( - PgTypes.numeric, - new BigDecimal("-99999999999999.999999999999")), // Edge 
case: large negative - new PgTypeAndExample<>( - PgTypes.numeric, - new BigDecimal("99999999999999.999999999999")), // Edge case: large positive - new PgTypeAndExample<>(PgTypes.numericArray, new BigDecimal[] {new BigDecimal("0.002")}), - new PgTypeAndExample<>(PgTypes.smallint, (short) 42), - new PgTypeAndExample<>(PgTypes.smallintArray, new Short[] {42}), - new PgTypeAndExample<>(PgTypes.smallintArrayUnboxed, new short[] {42}), - new PgTypeAndExample<>(PgTypes.money, new Money("42.22")), - new PgTypeAndExample<>(PgTypes.money, new Money("0.00")), // Edge case: zero - new PgTypeAndExample<>(PgTypes.money, new Money("-999.99")), // Edge case: negative - new PgTypeAndExample<>(PgTypes.moneyArray, new Money[] {new Money("42.22")}), - - // ==================== Vector Types ==================== - new PgTypeAndExample<>(PgTypes.int2vector, new Int2Vector(new short[] {1, 2, 3})), - new PgTypeAndExample<>( - PgTypes.int2vectorArray, new Int2Vector[] {new Int2Vector(new short[] {1, 2, 3})}), - new PgTypeAndExample<>(PgTypes.oidvector, new OidVector(new int[] {1, 2, 3})), - new PgTypeAndExample<>( - PgTypes.oidvectorArray, new OidVector[] {new OidVector(new int[] {1, 2, 3})}), - new PgTypeAndExample<>(PgTypes.vector, new Vector(new float[] {1.0f, 2.0f, 3.0f})), - new PgTypeAndExample<>( - PgTypes.vector, new Vector(new float[] {0.0f, 0.0f, 0.0f})), // Edge case: zero vector - new PgTypeAndExample<>( - PgTypes.vectorArray, new Vector[] {new Vector(new float[] {1.0f, 2.0f, 3.0f})}), - - // ==================== Identifier Types ==================== - new PgTypeAndExample<>(PgTypes.name, "my_table_name"), - new PgTypeAndExample<>(PgTypes.name, "a"), // Edge case: short name - new PgTypeAndExample<>( - PgTypes.name, - "this_is_a_very_long_identifier_name_close_to_63_chars_limit"), // Edge case: long - // name - new PgTypeAndExample<>(PgTypes.nameArray, new String[] {"my_table", "my_column"}), - new PgTypeAndExample<>(PgTypes.nameArray, new String[] {}), // Edge case: empty array 
- - // ==================== Network Types ==================== - new PgTypeAndExample<>(PgTypes.inet, new Inet("10.1.0.0")), - new PgTypeAndExample<>( - PgTypes.inet, new Inet("192.168.1.1")), // Edge case: common private IP - new PgTypeAndExample<>(PgTypes.inet, new Inet("255.255.255.255")), // Edge case: broadcast - new PgTypeAndExample<>(PgTypes.inet, new Inet("0.0.0.0")), // Edge case: any address - new PgTypeAndExample<>(PgTypes.inetArray, new Inet[] {new Inet("10.1.0.0")}), - - // CIDR - network addresses - new PgTypeAndExample<>(PgTypes.cidr, new Cidr("192.168.1.0/24")), - new PgTypeAndExample<>(PgTypes.cidr, new Cidr("10.0.0.0/8")), // Edge case: Class A - new PgTypeAndExample<>( - PgTypes.cidr, new Cidr("172.16.0.0/12")), // Edge case: Class B private - new PgTypeAndExample<>(PgTypes.cidrArray, new Cidr[] {new Cidr("192.168.1.0/24")}), - - // MAC addresses (6-byte format) - new PgTypeAndExample<>(PgTypes.macaddr, new MacAddr("08:00:2b:01:02:03")), - new PgTypeAndExample<>( - PgTypes.macaddr, new MacAddr("00:00:00:00:00:00")), // Edge case: all zeros - new PgTypeAndExample<>( - PgTypes.macaddr, new MacAddr("ff:ff:ff:ff:ff:ff")), // Edge case: broadcast - new PgTypeAndExample<>( - PgTypes.macaddrArray, new MacAddr[] {new MacAddr("08:00:2b:01:02:03")}), - - // MAC addresses (8-byte format, EUI-64) - new PgTypeAndExample<>(PgTypes.macaddr8, new MacAddr8("08:00:2b:01:02:03:04:05")), - new PgTypeAndExample<>( - PgTypes.macaddr8, new MacAddr8("00:00:00:00:00:00:00:00")), // Edge case: all zeros - new PgTypeAndExample<>( - PgTypes.macaddr8, new MacAddr8("ff:ff:ff:ff:ff:ff:ff:ff")), // Edge case: all ones - new PgTypeAndExample<>( - PgTypes.macaddr8Array, new MacAddr8[] {new MacAddr8("08:00:2b:01:02:03:04:05")}), - - // ==================== Key-Value Types ==================== - new PgTypeAndExample<>(PgTypes.hstore, Map.of(",.;{}[]-//#®✅", ",.;{}[]-//#®✅")), - new PgTypeAndExample<>(PgTypes.hstore, Map.of()), // Edge case: empty map - new PgTypeAndExample<>( - 
PgTypes.hstore, - Map.of("key1", "value1", "key2", "value2")), // Edge case: multiple entries - - // ==================== JSON Types ==================== - new PgTypeAndExample<>(PgTypes.json, new Json("{\"A\": 42}")).noIdentity(), - new PgTypeAndExample<>(PgTypes.json, new Json("{}")) - .noIdentity(), // Edge case: empty object - new PgTypeAndExample<>(PgTypes.json, new Json("[]")) - .noIdentity(), // Edge case: empty array - new PgTypeAndExample<>(PgTypes.json, new Json("null")).noIdentity(), // Edge case: null - new PgTypeAndExample<>(PgTypes.json, new Json("\"string\"")) - .noIdentity(), // Edge case: string value - new PgTypeAndExample<>(PgTypes.jsonArray, new Json[] {new Json("{\"A\": 42}")}) - .noIdentity() - .noStreaming(), - new PgTypeAndExample<>(PgTypes.jsonb, new Jsonb("{\"A\": 42}")) - .noIdentity(), // Whitespace normalized - new PgTypeAndExample<>(PgTypes.jsonb, new Jsonb("{}")) - .noIdentity(), // Edge case: empty object - new PgTypeAndExample<>(PgTypes.jsonbArray, new Jsonb[] {new Jsonb("{\"A\": 42}")}) - .noIdentity() - .noStreaming(), - - // ==================== Record Types ==================== - // TODO: Record JSON roundtrip needs special handling - PostgreSQL returns composite types - // as JSON objects - // with field names (e.g., {"r":1,"i":2}), but Record stores tuple format "(1,2)". - // We'll implement something clever later using json_populate_record or similar. 
- // new PgTypeAndExample<>(PgTypes.record("complex"), new Record("(1,2)")), - // new PgTypeAndExample<>(PgTypes.recordArray("complex"), new Record[]{new - // Record("(1,2)")}), - - // ==================== Reg* Types ==================== - new PgTypeAndExample<>(PgTypes.regconfig, new Regconfig("danish")), - new PgTypeAndExample<>( - PgTypes.regconfig, new Regconfig("english")), // Edge case: common config - new PgTypeAndExample<>(PgTypes.regconfigArray, new Regconfig[] {new Regconfig("danish")}), - new PgTypeAndExample<>(PgTypes.regdictionary, new Regdictionary("english_stem")), - new PgTypeAndExample<>( - PgTypes.regdictionaryArray, new Regdictionary[] {new Regdictionary("english_stem")}), - new PgTypeAndExample<>(PgTypes.regnamespace, new Regnamespace("public")), - new PgTypeAndExample<>( - PgTypes.regnamespace, new Regnamespace("pg_catalog")), // Edge case: system namespace - new PgTypeAndExample<>( - PgTypes.regnamespaceArray, new Regnamespace[] {new Regnamespace("public")}), - new PgTypeAndExample<>(PgTypes.regoperator, new Regoperator("-(bigint,bigint)")), - new PgTypeAndExample<>( - PgTypes.regoperatorArray, new Regoperator[] {new Regoperator("-(bigint,bigint)")}), - new PgTypeAndExample<>(PgTypes.regprocedure, new Regprocedure("sum(integer)")), - new PgTypeAndExample<>( - PgTypes.regprocedureArray, new Regprocedure[] {new Regprocedure("sum(integer)")}), - new PgTypeAndExample<>(PgTypes.regrole, new Regrole("pg_monitor")), - new PgTypeAndExample<>(PgTypes.regroleArray, new Regrole[] {new Regrole("pg_monitor")}), - new PgTypeAndExample<>(PgTypes.regtype, new Regtype("integer")), - new PgTypeAndExample<>(PgTypes.regtype, new Regtype("text")), // Edge case: different type - new PgTypeAndExample<>(PgTypes.regtypeArray, new Regtype[] {new Regtype("integer")}), - - // ==================== Transaction ID Types ==================== - new PgTypeAndExample<>(PgTypes.xid, new Xid("1")), - new PgTypeAndExample<>(PgTypes.xidArray, new Xid[] {new Xid("1")}), - - // 
==================== UUID Types ==================== - new PgTypeAndExample<>(PgTypes.uuid, UUID.randomUUID()), - new PgTypeAndExample<>(PgTypes.uuid, new UUID(0, 0)), // Edge case: nil UUID - new PgTypeAndExample<>(PgTypes.uuid, new UUID(-1, -1)), // Edge case: max UUID - new PgTypeAndExample<>(PgTypes.uuidArray, new UUID[] {UUID.randomUUID()}), - new PgTypeAndExample<>(PgTypes.uuidArray, new UUID[] {}), // Edge case: empty array - - // ==================== XML Types ==================== - new PgTypeAndExample<>(PgTypes.xml, new Xml("42")).noIdentity(), - new PgTypeAndExample<>( - PgTypes.xml, new Xml("text")) - .noIdentity(), // Edge case: nested - new PgTypeAndExample<>(PgTypes.xmlArray, new Xml[] {new Xml("42")}).noIdentity(), - - // ==================== Range Types ==================== - // int4range - uses Range.int4() which normalizes to [) form - new PgTypeAndExample<>( - PgTypes.int4range, Range.int4(new RangeBound.Closed<>(1), new RangeBound.Open<>(10))), - new PgTypeAndExample<>( - PgTypes.int4range, - Range.int4( - new RangeBound.Closed<>(0), new RangeBound.Closed<>(100))), // [0,100] -> [0,101) - new PgTypeAndExample<>( - PgTypes.int4range, - Range.int4(RangeBound.infinite(), new RangeBound.Open<>(10))), // unbounded lower - new PgTypeAndExample<>( - PgTypes.int4range, - Range.int4(new RangeBound.Closed<>(1), RangeBound.infinite())), // unbounded upper - new PgTypeAndExample<>( - PgTypes.int4range, - Range.int4(RangeBound.infinite(), RangeBound.infinite())), // fully unbounded - new PgTypeAndExample<>(PgTypes.int4range, Range.empty()), // empty range - new PgTypeAndExample<>( - PgTypes.int4rangeArray, - new Range[] {Range.int4(new RangeBound.Closed<>(1), new RangeBound.Open<>(10))}), - - // int8range - uses Range.int8() which normalizes to [) form - new PgTypeAndExample<>( - PgTypes.int8range, - Range.int8(new RangeBound.Closed<>(1L), new RangeBound.Open<>(1000000L))), - new PgTypeAndExample<>( - PgTypes.int8range, - Range.int8( - new 
RangeBound.Closed<>(Long.MIN_VALUE + 1), - new RangeBound.Open<>(Long.MAX_VALUE))), - new PgTypeAndExample<>(PgTypes.int8range, Range.empty()), - new PgTypeAndExample<>( - PgTypes.int8rangeArray, - new Range[] {Range.int8(new RangeBound.Closed<>(1L), new RangeBound.Open<>(100L))}), - - // numrange - uses Range.numeric() which does NOT normalize (continuous type) - new PgTypeAndExample<>( - PgTypes.numrange, - Range.numeric( - new RangeBound.Closed<>(new BigDecimal("0.5")), - new RangeBound.Open<>(new BigDecimal("10.5")))), - new PgTypeAndExample<>( - PgTypes.numrange, - Range.numeric( - new RangeBound.Open<>(BigDecimal.ZERO), - new RangeBound.Closed<>(new BigDecimal("99.99")))), - new PgTypeAndExample<>(PgTypes.numrange, Range.empty()), - new PgTypeAndExample<>( - PgTypes.numrangeArray, - new Range[] { - Range.numeric( - new RangeBound.Closed<>(BigDecimal.ONE), new RangeBound.Open<>(BigDecimal.TEN)) - }), - - // daterange - uses Range.date() which normalizes to [) form - new PgTypeAndExample<>( - PgTypes.daterange, - Range.date( - new RangeBound.Closed<>(LocalDate.of(2024, 1, 1)), - new RangeBound.Open<>(LocalDate.of(2024, 12, 31)))), - new PgTypeAndExample<>( - PgTypes.daterange, - Range.date( - RangeBound.infinite(), - new RangeBound.Closed<>( - LocalDate.now()))), // unbounded lower, (,today] -> (,tomorrow) - new PgTypeAndExample<>(PgTypes.daterange, Range.empty()), - new PgTypeAndExample<>( - PgTypes.daterangeArray, - new Range[] { - Range.date( - new RangeBound.Closed<>(LocalDate.of(2024, 1, 1)), - new RangeBound.Open<>(LocalDate.of(2024, 6, 30))) - }), - - // tsrange (timestamp without timezone) - uses Range.timestamp() which does NOT normalize - new PgTypeAndExample<>( - PgTypes.tsrange, - Range.timestamp( - new RangeBound.Closed<>(LocalDateTime.of(2024, 1, 1, 0, 0)), - new RangeBound.Open<>(LocalDateTime.of(2024, 12, 31, 23, 59, 59)))), - new PgTypeAndExample<>(PgTypes.tsrange, Range.empty()), - new PgTypeAndExample<>( - PgTypes.tsrangeArray, - new Range[] 
{ - Range.timestamp( - new RangeBound.Closed<>(LocalDateTime.of(2024, 1, 1, 0, 0)), - new RangeBound.Open<>(LocalDateTime.of(2024, 6, 30, 23, 59))) - }), - - // tstzrange (timestamp with timezone) - uses Range.timestamptz() which does NOT normalize - new PgTypeAndExample<>( - PgTypes.tstzrange, - Range.timestamptz( - new RangeBound.Closed<>(Instant.parse("2024-01-01T00:00:00Z")), - new RangeBound.Open<>(Instant.parse("2024-12-31T23:59:59Z")))), - new PgTypeAndExample<>(PgTypes.tstzrange, Range.empty()), - new PgTypeAndExample<>( - PgTypes.tstzrangeArray, - new Range[] { - Range.timestamptz( - new RangeBound.Closed<>(Instant.parse("2024-01-01T00:00:00Z")), - new RangeBound.Open<>(Instant.parse("2024-06-30T23:59:59Z"))) - })); - - // in java - static void withConnection(SqlFunction<Connection, Void> f) { - try (var conn = - java.sql.DriverManager.getConnection( - "jdbc:postgresql://localhost:6432/Adventureworks?user=postgres&password=password")) { - conn.setAutoCommit(false); - try { - f.apply(conn); - } finally { - conn.rollback(); - } - } catch (SQLException e) { - throw new RuntimeException(e); - } - } - - @Test - public void test() { - System.out.println(Arr.of(0, 1, 2, 3).reshape(2, 2)); - System.out.println(Arr.of("a", "b", "c", "d \",d").reshape(2, 2)); - System.out.println(ArrParser.parse(Arr.of(1, 2, 3, 4).encode(Object::toString))); - System.out.println(ArrParser.parse("{{\"a\",\"b\"},{\"c\",\"d \\\",d\"}}")); - - // Test JSON roundtrip (no DB connection needed) - parallel - System.out.println("\n=== JSON Roundtrip Tests (parallel) ==="); - All.parallelStream().forEach(PgTypeTest::testJsonRoundtrip); - - // Run all DB tests in parallel - System.out.println("\n=== DB Roundtrip Tests (parallel) ==="); - var failures = - All.parallelStream() - .flatMap( - t -> { - var errors = new ArrayList<String>(); - - // Native type roundtrip test - try { - withConnection( - conn -> { - conn.unwrap(PgConnection.class).setPrepareThreshold(0); - testCase(conn, t); - return null; - }); - } catch 
(Exception e) { - errors.add( - "Native test FAILED " - + t.type.typename().sqlType() - + ": " - + e.getMessage()); - } - - // JSON DB roundtrip test - try { - withConnection( - conn -> { - testJsonDbRoundtrip(conn, t); - return null; - }); - } catch (Exception e) { - errors.add( - "JSON DB test FAILED " - + t.type.typename().sqlType() - + ": " - + e.getMessage()); - } - - return errors.stream(); - }) - .toList(); - - // Composite type tests - deduplicated by SQL type, run in parallel - System.out.println("\n=== Composite Type DB Roundtrip Tests (parallel) ==="); - var compositeFailures = - All.stream() - .collect( - java.util.stream.Collectors.toMap( - t -> t.type.typename().sqlType(), t -> t, (a, b) -> a)) - .values() - .parallelStream() - .flatMap( - t -> { - try { - withConnection( - conn -> { - testCompositeDbRoundtrip(conn, t); - return null; - }); - return java.util.stream.Stream.<String>empty(); - } catch (Exception e) { - return java.util.stream.Stream.of( - "Composite test FAILED " - + t.type.typename().sqlType() - + ": " - + e.getMessage()); - } - }) - .toList(); - - // Test comprehensive composite with all supported types - System.out.println("\n=== Comprehensive Composite Type Test ==="); - withConnection( - conn -> { - testComprehensiveComposite(conn); - return null; - }); - - // Report results - var allFailures = new ArrayList<String>(); - allFailures.addAll(failures); - allFailures.addAll(compositeFailures); - - System.out.println("\n====================================="); - if (allFailures.isEmpty()) { - System.out.println("All tests passed!"); - } else { - allFailures.forEach(System.out::println); - throw new RuntimeException(allFailures.size() + " tests failed"); - } - System.out.println("====================================="); - } - - // Test type wrapped in a composite, roundtripped through the database - static <A> void testCompositeDbRoundtrip(Connection conn, PgTypeAndExample<A> t) - throws SQLException { - // Skip types that don't support composite text 
encoding - if (!t.compositeTextWorks) { - return; - } - - // Check if the type's PgCompositeText implementation works - try { - t.type.pgCompositeText().encode(t.example); - } catch (UnsupportedOperationException e) { - return; - } - - String sqlType = t.type.typename().sqlType(); - int uniqueId = tableCounter.incrementAndGet(); - - String compositeTypeName = - "test_wrapper_" - + uniqueId - + "_" - + sqlType - .replace("(", "_") - .replace(")", "_") - .replace(",", "_") - .replace(" ", "_") - .replace("[", "_") - .replace("]", "_"); - - // Create composite type with single field - try { - conn.createStatement().execute("DROP TYPE IF EXISTS " + compositeTypeName + " CASCADE"); - conn.createStatement() - .execute("CREATE TYPE " + compositeTypeName + " AS (wrapped_value " + sqlType + ")"); - - // Build PgStruct for this wrapper - PgStruct<SingleFieldWrapper<A>> wrapperStruct = - PgStruct.<SingleFieldWrapper<A>>builder(compositeTypeName) - .field("wrapped_value", t.type, SingleFieldWrapper::value) - .build(values -> new SingleFieldWrapper<>((A) values[0])); - - PgType<SingleFieldWrapper<A>> wrapperType = wrapperStruct.asType(); - String tableName = "test_composite_rt_" + uniqueId; - - // Create temp table - conn.createStatement() - .execute("CREATE TEMP TABLE " + tableName + " (v " + compositeTypeName + ")"); - - try { - // Insert value - SingleFieldWrapper<A> original = new SingleFieldWrapper<>(t.example); - var insert = conn.prepareStatement("INSERT INTO " + tableName + " (v) VALUES (?)"); - wrapperType.write().set(insert, 1, original); - insert.execute(); - insert.close(); - - // Select back - var select = conn.prepareStatement("SELECT v FROM " + tableName); - select.execute(); - var rs = select.getResultSet(); - - if (!rs.next()) { - throw new RuntimeException("No rows returned"); - } - - SingleFieldWrapper<A> decoded = wrapperType.read().read(rs, 1); - select.close(); - - if (t.hasIdentity && !areEqual(decoded.value, t.example)) { - throw new RuntimeException( - "Composite DB roundtrip failed for " - + sqlType - + ": expected '" - + 
format(t.example) - + "' but got '" - + format(decoded.value) - + "'"); - } - } finally { - conn.createStatement().execute("DROP TABLE IF EXISTS " + tableName); - } - } finally { - conn.createStatement().execute("DROP TYPE IF EXISTS " + compositeTypeName + " CASCADE"); - } - } - - record SingleFieldWrapper<A>(A value) {} - - // Test a comprehensive composite type with all commonly-used field types - record ComprehensiveComposite( - String textField, - Integer int4Field, - Long int8Field, - Short int2Field, - Double float8Field, - Float float4Field, - Boolean boolField, - BigDecimal numericField, - UUID uuidField, - LocalDate dateField, - LocalTime timeField, - LocalDateTime timestampField) {} - - static void testComprehensiveComposite(Connection conn) throws SQLException { - String typeName = "test_comprehensive_composite"; - - conn.createStatement().execute("DROP TYPE IF EXISTS " + typeName + " CASCADE"); - conn.createStatement() - .execute( - "CREATE TYPE " - + typeName - + " AS (" - + "text_field TEXT, " - + "int4_field INT4, " - + "int8_field INT8, " - + "int2_field INT2, " - + "float8_field FLOAT8, " - + "float4_field FLOAT4, " - + "bool_field BOOL, " - + "numeric_field NUMERIC, " - + "uuid_field UUID, " - + "date_field DATE, " - + "time_field TIME, " - + "timestamp_field TIMESTAMP" - + ")"); - - try { - PgStruct<ComprehensiveComposite> struct = - PgStruct.<ComprehensiveComposite>builder(typeName) - .field("text_field", PgTypes.text, ComprehensiveComposite::textField) - .field("int4_field", PgTypes.int4, ComprehensiveComposite::int4Field) - .field("int8_field", PgTypes.int8, ComprehensiveComposite::int8Field) - .field("int2_field", PgTypes.int2, ComprehensiveComposite::int2Field) - .field("float8_field", PgTypes.float8, ComprehensiveComposite::float8Field) - .field("float4_field", PgTypes.float4, ComprehensiveComposite::float4Field) - .field("bool_field", PgTypes.bool, ComprehensiveComposite::boolField) - .field("numeric_field", PgTypes.numeric, ComprehensiveComposite::numericField) - .field("uuid_field", 
PgTypes.uuid, ComprehensiveComposite::uuidField) - .field("date_field", PgTypes.date, ComprehensiveComposite::dateField) - .field("time_field", PgTypes.time, ComprehensiveComposite::timeField) - .field("timestamp_field", PgTypes.timestamp, ComprehensiveComposite::timestampField) - .build( - values -> - new ComprehensiveComposite( - (String) values[0], - (Integer) values[1], - (Long) values[2], - (Short) values[3], - (Double) values[4], - (Float) values[5], - (Boolean) values[6], - (BigDecimal) values[7], - (UUID) values[8], - (LocalDate) values[9], - (LocalTime) values[10], - (LocalDateTime) values[11])); - - PgType<ComprehensiveComposite> compositeType = struct.asType(); - - conn.createStatement().execute("CREATE TEMP TABLE test_comp (v " + typeName + ")"); - - try { - // Create test value with special characters - ComprehensiveComposite original = - new ComprehensiveComposite( - "Hello, \"World\"! (with special chars: \n\t\\)", - Integer.MAX_VALUE, - Long.MIN_VALUE, - (short) 42, - 3.14159265359, - 2.71828f, - true, - new BigDecimal("12345.67890"), - UUID.fromString("550e8400-e29b-41d4-a716-446655440000"), - LocalDate.of(2024, 12, 25), - LocalTime.of(14, 30, 45).truncatedTo(ChronoUnit.MICROS), - LocalDateTime.of(2024, 12, 25, 14, 30, 45).truncatedTo(ChronoUnit.MICROS)); - - // Test in-memory PgCompositeText roundtrip - PgCompositeText<ComprehensiveComposite> compositeText = compositeType.pgCompositeText(); - String encoded = compositeText.encode(original).orElseThrow(); - ComprehensiveComposite decodedInMemory = compositeText.decode(encoded); - - System.out.println("Comprehensive composite in-memory roundtrip:"); - System.out.println("  Original: " + original); - System.out.println("  Encoded: " + encoded); - System.out.println("  Decoded: " + decodedInMemory); - - if (!original.equals(decodedInMemory)) { - throw new RuntimeException( - "In-memory roundtrip failed: expected " + original + " but got " + decodedInMemory); - } - - // Insert into database - var insert = conn.prepareStatement("INSERT INTO test_comp 
(v) VALUES (?)"); - compositeType.write().set(insert, 1, original); - insert.execute(); - insert.close(); - - // Read back - var select = conn.prepareStatement("SELECT v FROM test_comp"); - select.execute(); - var rs = select.getResultSet(); - rs.next(); - ComprehensiveComposite decoded = compositeType.read().read(rs, 1); - select.close(); - - System.out.println("Comprehensive composite DB roundtrip:"); - System.out.println(" Original: " + original); - System.out.println(" Decoded: " + decoded); - - if (!original.equals(decoded)) { - throw new RuntimeException( - "DB roundtrip failed: expected " + original + " but got " + decoded); - } - - System.out.println("Comprehensive composite tests PASSED!"); - } finally { - conn.createStatement().execute("DROP TABLE IF EXISTS test_comp"); - } - } finally { - conn.createStatement().execute("DROP TYPE IF EXISTS " + typeName + " CASCADE"); - } - } - - static void testJsonRoundtrip(PgTypeAndExample t) { - try { - PgJson jsonCodec = t.type.pgJson(); - A original = t.example; - - // Test toJson -> encode -> parse -> fromJson roundtrip (in-memory) - JsonValue jsonValue = jsonCodec.toJson(original); - String encoded = jsonValue.encode(); - JsonValue parsed = JsonValue.parse(encoded); - A decoded = jsonCodec.fromJson(parsed); - - System.out.println( - "JSON roundtrip " - + t.type.typename().sqlType() - + ": " - + format(original) - + " -> " - + encoded - + " -> " - + format(decoded)); - - if (t.hasIdentity && !areEqual(decoded, original)) { - throw new RuntimeException( - "JSON roundtrip failed for " - + t.type.typename().sqlType() - + ": expected '" - + format(original) - + "' but got '" - + format(decoded) - + "'"); - } - } catch (Exception e) { - throw new RuntimeException( - "JSON roundtrip test failed for " + t.type.typename().sqlType(), e); - } - } - - // Test JSON roundtrip through the database - simulates MULTISET behavior - // Insert value into native column, read back as JSON using to_json(), parse back to value - static 
void testJsonDbRoundtrip(Connection conn, PgTypeAndExample t) throws SQLException { - PgJson jsonCodec = t.type.pgJson(); - A original = t.example; - String sqlType = t.type.typename().sqlType(); - String tableName = uniqueTableName("test_json_rt"); - - // Create temp table with the native type column - conn.createStatement().execute("CREATE TEMP TABLE " + tableName + " (v " + sqlType + ")"); - - try { - // Insert value using native type - var insert = conn.prepareStatement("INSERT INTO " + tableName + " (v) VALUES (?)"); - t.type.write().set(insert, 1, original); - insert.execute(); - insert.close(); - - // Select back as JSON using to_json - this is what MULTISET does - var select = conn.prepareStatement("SELECT to_json(v) FROM " + tableName); - select.execute(); - var rs = select.getResultSet(); - - if (!rs.next()) { - throw new RuntimeException("No rows returned"); - } - - // Read the JSON string back from the database - String jsonFromDb = rs.getString(1); - select.close(); - - // Parse the JSON and convert back to value - JsonValue parsedFromDb = JsonValue.parse(jsonFromDb); - A decoded = jsonCodec.fromJson(parsedFromDb); - - if (t.hasIdentity && !areEqual(decoded, original)) { - throw new RuntimeException( - "JSON DB roundtrip failed for " - + sqlType - + ": expected '" - + format(original) - + "' but got '" - + format(decoded) - + "'"); - } - } finally { - conn.createStatement().execute("DROP TABLE IF EXISTS " + tableName); - } - } - - static void testCase(Connection conn, PgTypeAndExample t) throws SQLException { - String tableName = uniqueTableName("test"); - conn.createStatement() - .execute("create temp table " + tableName + " (v " + t.type.typename().sqlType() + ")"); - var insert = conn.prepareStatement("insert into " + tableName + " (v) values (?)"); - A expected = t.example; - t.type.write().set(insert, 1, expected); - insert.execute(); - insert.close(); - if (t.streamingWorks) { - streamingInsert.insert( - "COPY " + tableName + "(v) FROM STDIN", - 
100, - Arrays.asList(t.example).iterator(), - conn, - t.type.pgText()); - } - - final PreparedStatement select; - if (t.hasIdentity) { - select = conn.prepareStatement("select v, null from " + tableName + " where v = ?"); - t.type.write().set(select, 1, expected); - } else { - select = conn.prepareStatement("select v, null from " + tableName); - } - - select.execute(); - var rs = select.getResultSet(); - List> rows = - RowParsers.of(t.type, t.type.opt(), TestPair::new, row -> new Object[] {row.t0, row.t1}) - .all() - .apply(rs); - select.close(); - conn.createStatement().execute("drop table " + tableName + ";"); - assertEquals(rows.get(0).t0(), expected); - if (t.streamingWorks) { - assertEquals(rows.get(1).t0(), expected); - } - } - - static void assertEquals(A actual, A expected) { - if (!areEqual(actual, expected)) { - throw new RuntimeException( - "actual: '" + format(actual) + "' != expected '" + format(expected) + "'"); - } - } - - static boolean areEqual(A actual, A expected) { - if (expected instanceof byte[]) { - return Arrays.equals((byte[]) actual, (byte[]) expected); - } - if (expected instanceof boolean[]) { - return Arrays.equals((boolean[]) actual, (boolean[]) expected); - } - if (expected instanceof short[]) { - return Arrays.equals((short[]) actual, (short[]) expected); - } - if (expected instanceof int[]) { - return Arrays.equals((int[]) actual, (int[]) expected); - } - if (expected instanceof long[]) { - return Arrays.equals((long[]) actual, (long[]) expected); - } - if (expected instanceof float[]) { - return Arrays.equals((float[]) actual, (float[]) expected); - } - if (expected instanceof double[]) { - return Arrays.equals((double[]) actual, (double[]) expected); - } - if (expected instanceof Object[]) { - return Arrays.equals((Object[]) actual, (Object[]) expected); - } - return actual.equals(expected); - } - - static String format(A a) { - if (a instanceof byte[]) { - return Arrays.toString((byte[]) a); - } - if (a instanceof boolean[]) { - 
return Arrays.toString((boolean[]) a); - } - if (a instanceof short[]) { - return Arrays.toString((short[]) a); - } - if (a instanceof int[]) { - return Arrays.toString((int[]) a); - } - if (a instanceof long[]) { - return Arrays.toString((long[]) a); - } - if (a instanceof float[]) { - return Arrays.toString((float[]) a); - } - if (a instanceof double[]) { - return Arrays.toString((double[]) a); - } - if (a instanceof Object[]) { - return Arrays.toString((Object[]) a); - } - return a.toString(); - } -} diff --git a/foundations-jdbc-test/src/java/dev/typr/foundations/SqlServerTypeTest.java b/foundations-jdbc-test/src/java/dev/typr/foundations/SqlServerTypeTest.java deleted file mode 100644 index 2b175c258a..0000000000 --- a/foundations-jdbc-test/src/java/dev/typr/foundations/SqlServerTypeTest.java +++ /dev/null @@ -1,487 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.sql.*; -import java.time.*; -import java.util.*; -import java.util.concurrent.atomic.AtomicInteger; -import org.junit.Test; - -/** - * Tests for SQL Server type codecs. Tests JDBC roundtrip, JSON roundtrip, and JSON DB roundtrip for - * every type. 
- */ -public class SqlServerTypeTest { - - private static final AtomicInteger tableCounter = new AtomicInteger(0); - - private static String uniqueTableName(String prefix) { - return prefix + "_" + tableCounter.incrementAndGet(); - } - - // Test wrapper types for alias and CLR types (like generated domain types) - record EmailAddress(String value) {} - - record AssemblyData(byte[] value) {} - - record SqlServerTypeAndExample( - SqlServerType type, - A example, - boolean hasIdentity, - boolean supportsTextRoundtrip, - boolean supportsJsonDbRoundtrip) { - - public SqlServerTypeAndExample(SqlServerType type, A example) { - this(type, example, true, true, true); - } - - public SqlServerTypeAndExample noIdentity() { - return new SqlServerTypeAndExample<>( - type, example, false, supportsTextRoundtrip, supportsJsonDbRoundtrip); - } - - public SqlServerTypeAndExample noTextRoundtrip() { - return new SqlServerTypeAndExample<>( - type, example, hasIdentity, false, supportsJsonDbRoundtrip); - } - - public SqlServerTypeAndExample noJsonDbRoundtrip() { - return new SqlServerTypeAndExample<>( - type, example, hasIdentity, supportsTextRoundtrip, false); - } - } - - static com.microsoft.sqlserver.jdbc.Geography createGeography() { - try { - return com.microsoft.sqlserver.jdbc.Geography.point(47.653, -122.358, 4326); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - static com.microsoft.sqlserver.jdbc.Geometry createGeometry() { - try { - return com.microsoft.sqlserver.jdbc.Geometry.point(10.0, 20.0, 0); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - - List> All = - List.of( - // ==================== Integer Types ==================== - new SqlServerTypeAndExample<>( - SqlServerTypes.tinyint, new dev.typr.foundations.data.Uint1((short) 127)), - new SqlServerTypeAndExample<>( - SqlServerTypes.tinyint, new dev.typr.foundations.data.Uint1((short) 0)), - new SqlServerTypeAndExample<>( - SqlServerTypes.tinyint, new 
dev.typr.foundations.data.Uint1((short) 255)), - new SqlServerTypeAndExample<>(SqlServerTypes.smallint, (short) 32767), - new SqlServerTypeAndExample<>(SqlServerTypes.smallint, Short.MIN_VALUE), - new SqlServerTypeAndExample<>(SqlServerTypes.smallint, Short.MAX_VALUE), - new SqlServerTypeAndExample<>(SqlServerTypes.int_, 42), - new SqlServerTypeAndExample<>(SqlServerTypes.int_, Integer.MIN_VALUE), - new SqlServerTypeAndExample<>(SqlServerTypes.int_, Integer.MAX_VALUE), - new SqlServerTypeAndExample<>(SqlServerTypes.bigint, 9223372036854775807L), - new SqlServerTypeAndExample<>(SqlServerTypes.bigint, Long.MIN_VALUE), - new SqlServerTypeAndExample<>(SqlServerTypes.bigint, Long.MAX_VALUE), - - // ==================== Fixed-Point Types ==================== - // Use DECIMAL(18,4) - values must have exactly 4 decimal places to match - new SqlServerTypeAndExample<>( - SqlServerTypes.decimal(18, 4), new BigDecimal("12345.6789")), - new SqlServerTypeAndExample<>(SqlServerTypes.decimal(18, 4), new BigDecimal("0.0000")), - new SqlServerTypeAndExample<>( - SqlServerTypes.decimal(18, 4), new BigDecimal("-99999.9990")), - new SqlServerTypeAndExample<>( - SqlServerTypes.decimal(10, 2), new BigDecimal("12345678.90")), - new SqlServerTypeAndExample<>( - SqlServerTypes.money, new BigDecimal("922337203685477.5807")), - new SqlServerTypeAndExample<>( - SqlServerTypes.money, new BigDecimal("-922337203685477.5808")), - new SqlServerTypeAndExample<>(SqlServerTypes.smallmoney, new BigDecimal("214748.3647")), - new SqlServerTypeAndExample<>(SqlServerTypes.smallmoney, new BigDecimal("-214748.3648")), - - // ==================== Floating-Point Types ==================== - new SqlServerTypeAndExample<>(SqlServerTypes.real, 3.14f).noIdentity(), - new SqlServerTypeAndExample<>(SqlServerTypes.real, 0.0f).noIdentity(), - new SqlServerTypeAndExample<>(SqlServerTypes.real, Float.MAX_VALUE).noIdentity(), - new SqlServerTypeAndExample<>(SqlServerTypes.float_, 2.718281828459045), - new 
SqlServerTypeAndExample<>(SqlServerTypes.float_, 0.0), - new SqlServerTypeAndExample<>(SqlServerTypes.float_, 1234567.89), - - // ==================== Boolean Type ==================== - new SqlServerTypeAndExample<>(SqlServerTypes.bit, true), - new SqlServerTypeAndExample<>(SqlServerTypes.bit, false), - - // ==================== String Types (Non-Unicode) ==================== - // Use explicit sizes that match the data - new SqlServerTypeAndExample<>(SqlServerTypes.char_(10), "Hello "), - new SqlServerTypeAndExample<>(SqlServerTypes.char_(10), "fixed "), - new SqlServerTypeAndExample<>(SqlServerTypes.varchar(50), "variable length"), - new SqlServerTypeAndExample<>(SqlServerTypes.varchar(50), ""), - new SqlServerTypeAndExample<>(SqlServerTypes.varchar(50), "Quote\"Test'Single"), - new SqlServerTypeAndExample<>(SqlServerTypes.varcharMax, "Very long text ".repeat(100)), - // TEXT is deprecated legacy type, cannot be used with = operator in WHERE clause - new SqlServerTypeAndExample<>(SqlServerTypes.text, "legacy text type").noIdentity(), - - // ==================== String Types (Unicode) ==================== - new SqlServerTypeAndExample<>(SqlServerTypes.nchar(10), "Unicode "), - new SqlServerTypeAndExample<>(SqlServerTypes.nvarchar(50), "Unicode variable: 中文 😀"), - new SqlServerTypeAndExample<>( - SqlServerTypes.nvarcharMax, "Unicode long: ".repeat(50) + "中文"), - // NTEXT is deprecated legacy type, cannot be used with = operator in WHERE clause - new SqlServerTypeAndExample<>(SqlServerTypes.ntext, "legacy unicode text").noIdentity(), - - // ==================== Binary Types ==================== - new SqlServerTypeAndExample<>( - SqlServerTypes.binary(10), - new byte[] {0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08, 0x09, 0x0A}) - .noIdentity(), - new SqlServerTypeAndExample<>( - SqlServerTypes.varbinary(10), new byte[] {(byte) 0xFF, 0x00, 0x7F}) - .noIdentity(), - new SqlServerTypeAndExample<>(SqlServerTypes.varbinaryMax, new byte[1000]).noIdentity(), - // IMAGE is 
deprecated legacy type, cannot be used with = operator in WHERE clause - new SqlServerTypeAndExample<>(SqlServerTypes.image, new byte[] {0x01, 0x02}).noIdentity(), - - // ==================== Date/Time Types ==================== - new SqlServerTypeAndExample<>(SqlServerTypes.date, LocalDate.of(2024, 12, 22)), - new SqlServerTypeAndExample<>(SqlServerTypes.date, LocalDate.of(1970, 1, 1)), - new SqlServerTypeAndExample<>(SqlServerTypes.time, LocalTime.of(14, 30, 45)), - new SqlServerTypeAndExample<>(SqlServerTypes.time, LocalTime.of(0, 0, 0)), - // TIME(7) has JDBC conversion precision issues with nanoseconds - new SqlServerTypeAndExample<>(SqlServerTypes.time(7), LocalTime.of(14, 30, 45, 123456700)) - .noIdentity(), - new SqlServerTypeAndExample<>( - SqlServerTypes.datetime, LocalDateTime.of(2024, 12, 22, 14, 30, 45)) - .noIdentity(), - new SqlServerTypeAndExample<>( - SqlServerTypes.smalldatetime, LocalDateTime.of(2024, 12, 22, 14, 30, 0)), - new SqlServerTypeAndExample<>( - SqlServerTypes.datetime2, LocalDateTime.of(2024, 12, 22, 14, 30, 45, 123456700)), - new SqlServerTypeAndExample<>( - SqlServerTypes.datetime2(7), LocalDateTime.of(2024, 12, 22, 14, 30, 45, 123456700)), - new SqlServerTypeAndExample<>( - SqlServerTypes.datetimeoffset, - OffsetDateTime.of(2024, 12, 22, 14, 30, 45, 0, ZoneOffset.UTC)), - new SqlServerTypeAndExample<>( - SqlServerTypes.datetimeoffset(7), - OffsetDateTime.of(2024, 12, 22, 14, 30, 45, 123456700, ZoneOffset.ofHours(-5))), - - // ==================== Special Types ==================== - new SqlServerTypeAndExample<>( - SqlServerTypes.uniqueidentifier, - UUID.fromString("550e8400-e29b-41d4-a716-446655440000")), - new SqlServerTypeAndExample<>(SqlServerTypes.uniqueidentifier, UUID.randomUUID()) - .noIdentity(), - new SqlServerTypeAndExample<>( - SqlServerTypes.xml, - new dev.typr.foundations.data.Xml("value")) - .noIdentity() - .noJsonDbRoundtrip(), - // HIERARCHYID - can't use identity check (= operator) and JSON DB doesn't work - // 
because our Java encoder produces different bytes than SQL Server's encoder. - // JDBC roundtrip works when SQL Server parses the path string. - new SqlServerTypeAndExample<>( - SqlServerTypes.hierarchyid, dev.typr.foundations.data.HierarchyId.parse("/")) - .noIdentity() - .noJsonDbRoundtrip(), - new SqlServerTypeAndExample<>( - SqlServerTypes.hierarchyid, dev.typr.foundations.data.HierarchyId.parse("/1/")) - .noIdentity() - .noJsonDbRoundtrip(), - new SqlServerTypeAndExample<>( - SqlServerTypes.hierarchyid, dev.typr.foundations.data.HierarchyId.parse("/1/2/")) - .noIdentity() - .noJsonDbRoundtrip(), - new SqlServerTypeAndExample<>( - SqlServerTypes.hierarchyid, - dev.typr.foundations.data.HierarchyId.parse("/1/2/3/")) - .noIdentity() - .noJsonDbRoundtrip(), - new SqlServerTypeAndExample<>( - SqlServerTypes.hierarchyid, - dev.typr.foundations.data.HierarchyId.parse("/1/1/1/1/")) - .noIdentity() - .noJsonDbRoundtrip(), - - // ==================== Spatial Types ==================== - // Spatial types cannot use = operator, and FOR JSON cannot serialize CLR objects - new SqlServerTypeAndExample<>(SqlServerTypes.geography, createGeography()) - .noIdentity() - .noJsonDbRoundtrip(), - new SqlServerTypeAndExample<>(SqlServerTypes.geometry, createGeometry()) - .noIdentity() - .noJsonDbRoundtrip(), - - // ==================== Alias Types (User-Defined Types) ==================== - // Test domain-like wrapper pattern (like CREATE TYPE EmailAddress FROM NVARCHAR(255)) - new SqlServerTypeAndExample<>( - SqlServerTypes.nvarchar(255).bimap(EmailAddress::new, EmailAddress::value), - new EmailAddress("test@example.com")), - - // ==================== CLR Types (Assembly Types) ==================== - // Test CLR type as domain wrapper around VARBINARY (like generated code) - new SqlServerTypeAndExample<>( - SqlServerTypes.varbinary(100).bimap(AssemblyData::new, AssemblyData::value), - new AssemblyData(new byte[] {0x01, 0x02, 0x03, 0x04})) - .noIdentity()); - - static void 
withConnection(SqlFunction f) { - try (var conn = - DriverManager.getConnection( - "jdbc:sqlserver://localhost:1433;databaseName=typr;user=sa;password=YourStrong@Passw0rd;encrypt=false;trustServerCertificate=true")) { - conn.setAutoCommit(false); - try { - f.apply(conn); - } finally { - conn.rollback(); - } - } catch (SQLException e) { - throw new RuntimeException(e); - } - } - - @Test - public void test() { - System.out.println("=== SQL Server Type Tester ===\n"); - - // Test JSON roundtrip (in-memory) - parallel - System.out.println("=== JSON Roundtrip Tests (parallel) ==="); - All.parallelStream().forEach(SqlServerTypeTest::testJsonRoundtrip); - System.out.println(); - - // Run all DB tests in parallel - System.out.println("=== DB Roundtrip Tests (parallel) ==="); - var failures = - All.parallelStream() - .flatMap( - t -> { - var errors = new ArrayList(); - - // JDBC roundtrip test - try { - withConnection( - conn -> { - testJdbcRoundtrip(conn, t); - return null; - }); - } catch (Exception e) { - errors.add( - "JDBC test FAILED " + t.type.typename().sqlType() + ": " + e.getMessage()); - } - - // JSON DB roundtrip test - if (t.supportsJsonDbRoundtrip) { - try { - withConnection( - conn -> { - testJsonDbRoundtrip(conn, t); - return null; - }); - } catch (Exception e) { - errors.add( - "JSON DB test FAILED " - + t.type.typename().sqlType() - + ": " - + e.getMessage()); - } - } - - return errors.stream(); - }) - .toList(); - - System.out.println("\n====================================="); - if (failures.isEmpty()) { - System.out.println("All tests passed!"); - } else { - failures.forEach(System.out::println); - throw new RuntimeException(failures.size() + " tests failed"); - } - System.out.println("====================================="); - } - - static void testJsonRoundtrip(SqlServerTypeAndExample t) { - SqlServerJson jsonCodec = t.type.sqlServerJson(); - A original = t.example; - - dev.typr.foundations.data.JsonValue jsonValue = jsonCodec.toJson(original); - 
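The in-memory JSON roundtrip in the deleted test above follows a generic encode → parse → decode shape: render the value to JSON text, parse it back, and require the decoded value to equal the original. A minimal self-contained sketch of the same pattern — the `Codec` record and `intCodec` here are illustrative stand-ins, not the foundations API:

```java
import java.util.function.Function;

class JsonRoundtrip {
  // Minimal codec pair: toJson renders a value as a JSON fragment,
  // fromJson parses it back. Illustrative only; not the foundations API.
  record Codec<A>(Function<A, String> toJson, Function<String, A> fromJson) {}

  static final Codec<Integer> intCodec =
      new Codec<>(String::valueOf, Integer::parseInt);

  // Roundtrip check: a value must survive encode -> decode unchanged.
  static <A> boolean roundtrips(Codec<A> codec, A original) {
    String encoded = codec.toJson().apply(original);
    A decoded = codec.fromJson().apply(encoded);
    return original.equals(decoded);
  }
}
```

The real tests additionally push the encoded value through the database (`to_json` on Postgres, `FOR JSON PATH` on SQL Server) so that the server's own JSON rendering is covered, not just the in-memory codec.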
String encoded = jsonValue.encode(); - dev.typr.foundations.data.JsonValue parsed = dev.typr.foundations.data.JsonValue.parse(encoded); - A decoded = jsonCodec.fromJson(parsed); - - System.out.println( - "JSON roundtrip " - + t.type.typename().sqlType() - + ": " - + format(original) - + " -> " - + encoded - + " -> " - + format(decoded)); - - if (t.hasIdentity && !areEqual(decoded, original)) { - throw new RuntimeException("JSON roundtrip failed for " + t.type.typename().sqlType()); - } - } - - static void testJdbcRoundtrip(Connection conn, SqlServerTypeAndExample t) - throws SQLException { - String sqlType = t.type.typename().sqlType(); - String tableName = uniqueTableName("#test"); - - // Create temp table - String createSql = "CREATE TABLE " + tableName + " (v " + sqlType + ")"; - conn.createStatement().execute(createSql); - - try { - // Insert value - var insert = conn.prepareStatement("INSERT INTO " + tableName + " (v) VALUES (?)"); - t.type.write().set(insert, 1, t.example); - insert.execute(); - insert.close(); - - // Select back - PreparedStatement select; - if (t.hasIdentity) { - // Cast parameter to match column type exactly - String whereClause = - switch (sqlType) { - case "GEOGRAPHY" -> "WHERE v.STEquals(CAST(? AS GEOGRAPHY)) = 1"; - case "GEOMETRY" -> "WHERE v.STEquals(CAST(? AS GEOMETRY)) = 1"; - default -> "WHERE v = CAST(? 
AS " + sqlType + ")"; - }; - select = conn.prepareStatement("SELECT v FROM " + tableName + " " + whereClause); - t.type.write().set(select, 1, t.example); - } else { - select = conn.prepareStatement("SELECT v FROM " + tableName); - } - - select.execute(); - var rs = select.getResultSet(); - if (!rs.next()) { - throw new RuntimeException("No rows returned for " + sqlType); - } - - A result = t.type.read().read(rs, 1); - select.close(); - - System.out.println( - "JDBC roundtrip " + sqlType + ": " + format(t.example) + " -> " + format(result)); - - if (t.hasIdentity && !areEqual(result, t.example)) { - throw new RuntimeException("JDBC roundtrip failed for " + sqlType); - } - } finally { - conn.createStatement().execute("DROP TABLE IF EXISTS " + tableName); - } - } - - static void testJsonDbRoundtrip(Connection conn, SqlServerTypeAndExample t) - throws SQLException { - String sqlType = t.type.typename().sqlType(); - String tableName = uniqueTableName("#test_json"); - - // Create temp table - conn.createStatement().execute("CREATE TABLE " + tableName + " (v " + sqlType + ")"); - - try { - // Insert value - var insert = conn.prepareStatement("INSERT INTO " + tableName + " (v) VALUES (?)"); - t.type.write().set(insert, 1, t.example); - insert.execute(); - insert.close(); - - // Select back as JSON using FOR JSON PATH - var select = - conn.prepareStatement( - "SELECT v FROM " + tableName + " FOR JSON PATH, WITHOUT_ARRAY_WRAPPER"); - select.execute(); - var rs = select.getResultSet(); - - if (!rs.next()) { - throw new RuntimeException("No JSON returned"); - } - - String jsonFromDb = rs.getString(1); - select.close(); - - // Parse JSON and extract value - dev.typr.foundations.data.JsonValue parsedFromDb = - dev.typr.foundations.data.JsonValue.parse(jsonFromDb); - dev.typr.foundations.data.JsonValue valueJson = - ((dev.typr.foundations.data.JsonValue.JObject) parsedFromDb).get("v"); - A decoded = t.type.sqlServerJson().fromJson(valueJson); - - System.out.println( - "JSON DB 
roundtrip " - + sqlType - + ": " - + format(t.example) - + " -> DB -> " - + jsonFromDb - + " -> " - + format(decoded)); - - if (t.hasIdentity && !areEqual(decoded, t.example)) { - throw new RuntimeException("JSON DB roundtrip failed for " + sqlType); - } - } finally { - conn.createStatement().execute("DROP TABLE IF EXISTS " + tableName); - } - } - - static boolean areEqual(A actual, A expected) { - if (expected instanceof byte[]) { - return Arrays.equals((byte[]) actual, (byte[]) expected); - } - if (expected instanceof Object[]) { - return Arrays.equals((Object[]) actual, (Object[]) expected); - } - // For floating point, allow small differences - if (expected instanceof Float) { - return Math.abs((Float) actual - (Float) expected) < 0.0001f; - } - if (expected instanceof Double) { - return Math.abs((Double) actual - (Double) expected) < 0.0001; - } - // For spatial types, compare WKT representations - if (expected instanceof com.microsoft.sqlserver.jdbc.Geography) { - return ((com.microsoft.sqlserver.jdbc.Geography) actual) - .toString() - .equals(((com.microsoft.sqlserver.jdbc.Geography) expected).toString()); - } - if (expected instanceof com.microsoft.sqlserver.jdbc.Geometry) { - return ((com.microsoft.sqlserver.jdbc.Geometry) actual) - .toString() - .equals(((com.microsoft.sqlserver.jdbc.Geometry) expected).toString()); - } - // For wrapper types (AssemblyData), compare underlying values - if (expected instanceof AssemblyData) { - return Arrays.equals(((AssemblyData) actual).value(), ((AssemblyData) expected).value()); - } - return actual.equals(expected); - } - - static String format(A a) { - if (a instanceof byte[]) { - byte[] arr = (byte[]) a; - if (arr.length > 20) { - return "byte[" + arr.length + "]"; - } - return Arrays.toString(arr); - } - if (a instanceof Object[]) { - return Arrays.toString((Object[]) a); - } - if (a instanceof String) { - String s = (String) a; - if (s.length() > 50) { - return "\"" + s.substring(0, 47) + "...\""; - } - } - return 
a.toString(); - } -} diff --git a/foundations-jdbc/build.gradle.kts b/foundations-jdbc/build.gradle.kts deleted file mode 100644 index 26b19ebe85..0000000000 --- a/foundations-jdbc/build.gradle.kts +++ /dev/null @@ -1,35 +0,0 @@ -plugins { - `java-library` -} - -java { - toolchain { - languageVersion.set(JavaLanguageVersion.of(21)) - } -} - -sourceSets { - main { - java { - srcDirs( - "src/java", - // Generated by bleep: bleep compile foundations-jdbc - "../.bleep/generated-sources/foundations-jdbc/scripts.GeneratedRowParsers", - "../.bleep/generated-sources/foundations-jdbc/scripts.GeneratedTuples" - ) - } - } -} - -dependencies { - api("org.postgresql:postgresql:42.7.3") - api("org.jetbrains:annotations:26.0.1") - api("org.mariadb.jdbc:mariadb-java-client:3.5.1") - api("org.duckdb:duckdb_jdbc:1.1.3") - api("com.oracle.database.jdbc:ojdbc11:23.6.0.24.10") - api("com.microsoft.sqlserver:mssql-jdbc:12.8.1.jre11") -} - -tasks.withType { - options.compilerArgs.addAll(listOf("-proc:none")) -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/And.java b/foundations-jdbc/src/java/dev/typr/foundations/And.java deleted file mode 100644 index 9da04c4570..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/And.java +++ /dev/null @@ -1,3 +0,0 @@ -package dev.typr.foundations; - -public record And(T1 left, T2 right) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/ArrParser.java b/foundations-jdbc/src/java/dev/typr/foundations/ArrParser.java deleted file mode 100644 index 54caa07209..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/ArrParser.java +++ /dev/null @@ -1,172 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.Arr; -import java.util.ArrayList; -import java.util.function.Function; - -public interface ArrParser { - enum ParseState { - ExpectArray, - ExpectDatum, - InDatumQuoted, - InDatumUnquoted, - InEscape, - ElemComplete, - Done - } - - static Either> parse(String s) { - return 
parseWith(Either::right, s); - } - - static Either> parseWith(Function> f, String s) { - if (s.equals("{}")) { - return Either.right(Arr.empty()); - } else { - int sLength = s.length(); - int dataDepth = sLength - s.replaceAll("^\\{+", "").length(); - ArrayList data = new ArrayList<>(); - StringBuilder datum = new StringBuilder(); - int index = 0; - int depth = 0; - int[] curCount = new int[dataDepth + 1]; - int[] refCount = new int[dataDepth + 1]; - String failure = null; - ParseState state = ParseState.ExpectArray; - while (index < sLength && failure == null) { - char c = s.charAt(index); - var cs = Character.toString(c); - switch (state) { - case Done: - failure = "expected end of string, found " + c; - break; - case ExpectArray: - if (c == '{') { - index++; - depth++; - state = depth == dataDepth ? ParseState.ExpectDatum : ParseState.ExpectArray; - } else { - failure = "expected '{', found " + c; - } - break; - case ExpectDatum: - if (c == '{' || c == '}' || c == ',' || c == '\\') { - failure = "expected datum, found '" + c + "'"; - } else if (c == '"') { - index++; - datum.setLength(0); - state = ParseState.InDatumQuoted; - } else { - index++; - datum.setLength(0); - datum.append(c); - state = ParseState.InDatumUnquoted; - } - break; - case InDatumQuoted: - if (c == '"') { - Either result = f.apply(datum.toString()); - switch (result) { - case Either.Left l -> failure = l.value(); - case Either.Right r -> data.add(r.value()); - } - index++; - state = ParseState.ElemComplete; - } else if (c == '\\') { - index++; - state = ParseState.InEscape; - } else { - datum.append(c); - index++; - } - break; - case InDatumUnquoted: - if (c == '{' || c == '\\') { - failure = "illegal character in unquoted datum: '" + c + "'"; - } else if (c == ',') { - if (datum.toString().equalsIgnoreCase("null")) { - failure = "encountered NULL array element (currently unsupported)"; - } - updateCountersAfterComma(curCount, refCount, depth); - Either result = f.apply(datum.toString()); - 
switch (result) { - case Either.Left l -> failure = l.value(); - case Either.Right r -> data.add(r.value()); - } - index++; - state = ParseState.ExpectDatum; - } else if (c == '}') { - if (datum.toString().equalsIgnoreCase("null")) { - failure = "encountered NULL array element (currently unsupported)"; - } - updateCountersAfterClose(curCount, refCount, depth); - Either result = f.apply(datum.toString()); - switch (result) { - case Either.Left l -> failure = l.value(); - case Either.Right r -> data.add(r.value()); - } - index++; - depth--; - state = depth == 0 ? ParseState.Done : ParseState.ElemComplete; - } else { - datum.append(c); - index++; - } - break; - case InEscape: - datum.append(c); - index++; - state = ParseState.InDatumQuoted; - break; - case ElemComplete: - if (c == ',') { - updateCountersAfterComma(curCount, refCount, depth); - index++; - state = depth == dataDepth ? ParseState.ExpectDatum : ParseState.ExpectArray; - } else if (c == '}') { - updateCountersAfterClose(curCount, refCount, depth); - index++; - depth--; - if (depth == 0) { - state = ParseState.Done; - } - } else { - failure = "expected ',' or '}', found " + c; - } - break; - } - } - - if (failure != null) { - return Either.left(failure); - } else if (depth != 0 || state != ParseState.Done) { - return Either.left("unterminated array literal"); - } else { - return Either.right(new Arr<>(data.toArray(), refCount)); - } - } - } - - private static void updateCountersAfterComma(int[] curCount, int[] refCount, int depth) { - int ref = refCount[depth]; - int cur = curCount[depth]; - int inc = cur + 1; - if (ref > 0 && inc == ref) { - throw new RuntimeException("expected " + ref + " element(s) here; found more"); - } else { - curCount[depth] = inc; - } - } - - private static void updateCountersAfterClose(int[] curCount, int[] refCount, int depth) { - int ref = refCount[depth]; - int cur = curCount[depth]; - int inc = cur + 1; - if (ref > 0 && inc < ref) { - throw new RuntimeException("expected " + 
ref + " element here, only found " + inc); - } else { - curCount[depth] = 0; - refCount[depth] = inc; - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/Db2Json.java b/foundations-jdbc/src/java/dev/typr/foundations/Db2Json.java deleted file mode 100644 index ac8d4b59af..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/Db2Json.java +++ /dev/null @@ -1,150 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.JsonValue; -import java.math.BigDecimal; -import java.util.Optional; -import java.util.function.Function; - -/** - * Encodes/decodes values to/from JSON for IBM DB2. - * - *
<p>
Similar to SqlServerJson - DB2 supports JSON natively since version 11.1. - */ -public abstract class Db2Json implements DbJson { - public abstract JsonValue toJson(A a); - - public abstract A fromJson(JsonValue jsonValue); - - public Db2Json bimap(SqlFunction f, Function g) { - var self = this; - return Db2Json.instance(a -> self.toJson(g.apply(a)), jv -> f.apply(self.fromJson(jv))); - } - - public Db2Json map(SqlFunction f) { - return bimap(f, null); // write not supported - } - - public Db2Json contramap(Function g) { - return bimap(null, g); // read not supported - } - - public Db2Json> opt() { - var self = this; - return instance( - a -> a.map(self::toJson).orElse(JsonValue.JNull.INSTANCE), - jv -> jv instanceof JsonValue.JNull ? Optional.empty() : Optional.of(self.fromJson(jv))); - } - - public static Db2Json instance( - Function toJson, SqlFunction fromJson) { - return new Db2Json<>() { - @Override - public JsonValue toJson(A a) { - return toJson.apply(a); - } - - @Override - public A fromJson(JsonValue jsonValue) { - try { - return fromJson.apply(jsonValue); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - } - }; - } - - // Standard JSON codecs - public static final Db2Json text = - instance(s -> new JsonValue.JString(s), jv -> ((JsonValue.JString) jv).value()); - public static final Db2Json bool = - instance(JsonValue.JBool::of, jv -> ((JsonValue.JBool) jv).value()); - public static final Db2Json int2 = - instance( - s -> JsonValue.JNumber.of(s.intValue()), - jv -> Short.parseShort(((JsonValue.JNumber) jv).value())); - public static final Db2Json int4 = - instance( - i -> JsonValue.JNumber.of(i.longValue()), - jv -> Integer.parseInt(((JsonValue.JNumber) jv).value())); - public static final Db2Json int8 = - instance(JsonValue.JNumber::of, jv -> Long.parseLong(((JsonValue.JNumber) jv).value())); - public static final Db2Json float4 = - instance( - f -> JsonValue.JNumber.of(f.doubleValue()), - jv -> 
Float.parseFloat(((JsonValue.JNumber) jv).value())); - public static final Db2Json float8 = - instance(JsonValue.JNumber::of, jv -> Double.parseDouble(((JsonValue.JNumber) jv).value())); - public static final Db2Json numeric = - instance( - bd -> JsonValue.JNumber.of(bd.toString()), - jv -> new BigDecimal(((JsonValue.JNumber) jv).value())); - public static final Db2Json bytea = - instance( - bytes -> new JsonValue.JString(java.util.Base64.getEncoder().encodeToString(bytes)), - jv -> java.util.Base64.getDecoder().decode(((JsonValue.JString) jv).value())); - public static final Db2Json unknown = instance(obj -> (JsonValue) obj, jv -> jv); - - /** - * Creates a Db2Json that throws UnsupportedOperationException for types that DB2's JSON_OBJECT - * doesn't support (GRAPHIC, VARGRAPHIC, DBCLOB, BINARY, VARBINARY). - */ - public static Db2Json unsupported(String typeName) { - return new Db2Json<>() { - @Override - public JsonValue toJson(A a) { - throw new UnsupportedOperationException( - "DB2 JSON_OBJECT does not support " + typeName + " type"); - } - - @Override - public A fromJson(JsonValue jsonValue) { - throw new UnsupportedOperationException( - "DB2 JSON_OBJECT does not support " + typeName + " type"); - } - }; - } - - // Date/Time codecs - // DB2 JSON_OBJECT uses non-standard formats: TIME as "HH.mm.ss", TIMESTAMP as - // "yyyy-MM-dd-HH.mm.ss.SSSSSS" - // We need to handle both ISO format (from in-memory roundtrip) and DB2 format (from database) - private static final java.time.format.DateTimeFormatter DB2_TIME = - java.time.format.DateTimeFormatter.ofPattern("HH.mm.ss"); - private static final java.time.format.DateTimeFormatter DB2_TIMESTAMP = - java.time.format.DateTimeFormatter.ofPattern("yyyy-MM-dd-HH.mm.ss.SSSSSS"); - - public static final Db2Json date = - instance( - d -> new JsonValue.JString(d.toString()), - jv -> java.time.LocalDate.parse(((JsonValue.JString) jv).value())); - - public static final Db2Json time = - instance( - t -> new 
JsonValue.JString(t.toString()), - jv -> { - String s = ((JsonValue.JString) jv).value(); - // DB2 format uses dots (14.30.45), ISO uses colons (14:30:45) - if (s.contains(":")) { - return java.time.LocalTime.parse(s); - } - return java.time.LocalTime.parse(s, DB2_TIME); - }); - - public static final Db2Json timestamp = - instance( - ts -> new JsonValue.JString(ts.toString()), - jv -> { - String s = ((JsonValue.JString) jv).value(); - // DB2 format: "yyyy-MM-dd-HH.mm.ss.SSSSSS", ISO format: "yyyy-MM-ddTHH:mm:ss..." - if (s.contains("T")) { - return java.time.LocalDateTime.parse(s); - } - return java.time.LocalDateTime.parse(s, DB2_TIMESTAMP); - }); - - public static final Db2Json timestamptz = - instance( - odt -> new JsonValue.JString(odt.toString()), - jv -> java.time.OffsetDateTime.parse(((JsonValue.JString) jv).value())); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/Db2Read.java b/foundations-jdbc/src/java/dev/typr/foundations/Db2Read.java deleted file mode 100644 index d34f8444a1..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/Db2Read.java +++ /dev/null @@ -1,210 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.sql.ResultSet; -import java.sql.SQLException; -import java.time.LocalDate; -import java.time.LocalDateTime; -import java.time.LocalTime; -import java.util.Optional; - -/** - * Describes how to read a column from a {@link ResultSet} for IBM DB2. - * - *
<p>
Similar to SqlServerRead but adapted for DB2-specific types. - */ -public sealed interface Db2Read extends DbRead - permits Db2Read.NonNullable, Db2Read.Nullable, Db2Read.Mapped { - A read(ResultSet rs, int col) throws SQLException; - - Db2Read map(SqlFunction f); - - /** Derive a Db2Read which allows nullable values */ - Db2Read> opt(); - - @FunctionalInterface - interface RawRead { - A apply(ResultSet rs, int column) throws SQLException; - } - - /** - * Create an instance of {@link Db2Read} from a function that reads a value from a result set. - * - * @param f Should not blow up if the value returned is `null` - */ - static NonNullable of(RawRead f) { - RawRead> readNullableA = - (rs, col) -> { - var a = f.apply(rs, col); - if (rs.wasNull()) return Optional.empty(); - else return Optional.of(a); - }; - return new NonNullable<>(readNullableA); - } - - final class NonNullable implements Db2Read { - final RawRead> readNullable; - - public NonNullable(RawRead> readNullable) { - this.readNullable = readNullable; - } - - @Override - public A read(ResultSet rs, int col) throws SQLException { - return readNullable - .apply(rs, col) - .orElseThrow(() -> new SQLException("null value in column " + col)); - } - - @Override - public NonNullable map(SqlFunction f) { - return new NonNullable<>( - (rs, col) -> { - Optional maybeA = readNullable.apply(rs, col); - if (maybeA.isEmpty()) return Optional.empty(); - return Optional.of(f.apply(maybeA.get())); - }); - } - - @Override - public Db2Read> opt() { - return new Nullable<>(readNullable); - } - } - - final class Nullable implements Db2Read> { - final RawRead> readNullable; - - public Nullable(RawRead> readNullable) { - this.readNullable = readNullable; - } - - @Override - public Optional read(ResultSet rs, int col) throws SQLException { - return readNullable.apply(rs, col); - } - - @Override - public Db2Read map(SqlFunction, B> f) { - return new Mapped<>(this, f); - } - - @Override - public Nullable> opt() { - return new 
Nullable<>( - (rs, col) -> { - Optional maybeA = readNullable.apply(rs, col); - if (maybeA.isEmpty()) return Optional.empty(); - return Optional.of(maybeA); - }); - } - } - - record Mapped(Db2Read underlying, SqlFunction f) implements Db2Read { - @Override - public B read(ResultSet rs, int col) throws SQLException { - return f.apply(underlying.read(rs, col)); - } - - @Override - public Db2Read map(SqlFunction g) { - return new Mapped<>(this, g); - } - - @Override - public Db2Read> opt() { - return new Nullable<>((rs, col) -> Optional.ofNullable(read(rs, col))); - } - } - - static NonNullable castJdbcObjectTo(Class cls) { - return of((rs, i) -> cls.cast(rs.getObject(i))); - } - - /** - * Read a value by requesting a specific class from JDBC. This uses rs.getObject(i, cls) which - * allows the JDBC driver to do proper type conversion. - */ - static NonNullable getObjectAs(Class cls) { - return of((rs, i) -> rs.getObject(i, cls)); - } - - // ==================== Basic Type Readers ==================== - - Db2Read readString = of(ResultSet::getString); - Db2Read readBoolean = of(ResultSet::getBoolean); - Db2Read readShort = of(ResultSet::getShort); - Db2Read readInteger = of(ResultSet::getInt); - Db2Read readLong = of(ResultSet::getLong); - Db2Read readFloat = of(ResultSet::getFloat); - Db2Read readDouble = of(ResultSet::getDouble); - Db2Read readBigDecimal = of(ResultSet::getBigDecimal); - - // Binary types - Db2Read readByteArray = of(ResultSet::getBytes); - Db2Read readBlob = - of( - (rs, idx) -> { - java.sql.Blob blob = rs.getBlob(idx); - if (blob == null) return null; - return blob.getBytes(1, (int) blob.length()); - }); - - // ==================== Date/Time Readers ==================== - - // DB2 JDBC driver doesn't handle null properly in getObject(idx, LocalDate.class) - // so we use the traditional approach with explicit null checking - Db2Read readDate = - of( - (rs, idx) -> { - java.sql.Date date = rs.getDate(idx); - return date == null ? 
null : date.toLocalDate(); - }); - Db2Read readTime = - of( - (rs, idx) -> { - java.sql.Time time = rs.getTime(idx); - return time == null ? null : time.toLocalTime(); - }); - Db2Read readTimestamp = - of( - (rs, idx) -> { - java.sql.Timestamp timestamp = rs.getTimestamp(idx); - return timestamp == null ? null : timestamp.toLocalDateTime(); - }); - - // ==================== Special Types ==================== - - // XML - DB2 supports XML natively - Db2Read readXml = - of( - (rs, idx) -> { - java.sql.SQLXML sqlxml = rs.getSQLXML(idx); - if (sqlxml == null) return null; - return new dev.typr.foundations.data.Xml(sqlxml.getString()); - }); - - // CLOB - Character Large Object - Db2Read readClob = - of( - (rs, idx) -> { - java.sql.Clob clob = rs.getClob(idx); - if (clob == null) return null; - return clob.getSubString(1, (int) clob.length()); - }); - - // GRAPHIC types - DBCS (double-byte character sets) - // These are read as strings in Java - Db2Read readGraphic = readString; - Db2Read readVarGraphic = readString; - Db2Read readDbClob = readClob; - - // ROWID - DB2 row identifier - Db2Read readRowId = readByteArray; - - // DECFLOAT - DB2-specific decimal floating point - Db2Read readDecFloat = readBigDecimal; - - // Read as Object for unknown types - Db2Read readObject = of(ResultSet::getObject); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/Db2Text.java b/foundations-jdbc/src/java/dev/typr/foundations/Db2Text.java deleted file mode 100644 index ef20dd18ac..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/Db2Text.java +++ /dev/null @@ -1,106 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.util.Optional; -import java.util.function.BiConsumer; -import java.util.function.Function; - -/** - * Encodes values to text format for DB2 LOAD command. - * - *
<p>
Similar to SqlServerText but adapted for DB2's text format. - */ -public abstract class Db2Text implements DbText { - public abstract void unsafeEncode(A a, StringBuilder sb); - - public Db2Text contramap(Function f) { - var self = this; - return instance((b, sb) -> self.unsafeEncode(f.apply(b), sb)); - } - - public Db2Text> opt() { - var self = this; - return instance( - (a, sb) -> { - if (a.isPresent()) self.unsafeEncode(a.get(), sb); - else sb.append(Db2Text.NULL); - }); - } - - public static char DELIMETER = '\t'; - public static String NULL = "\\N"; - - public static Db2Text instance(BiConsumer f) { - return new Db2Text<>() { - @Override - public void unsafeEncode(A a, StringBuilder sb) { - f.accept(a, sb); - } - }; - } - - @SuppressWarnings("unchecked") - public static Db2Text from(RowParser rowParser) { - return instance( - (row, sb) -> { - var encoded = rowParser.encode().apply(row); - for (int i = 0; i < encoded.length; i++) { - if (i > 0) { - sb.append(Db2Text.DELIMETER); - } - DbText text = (DbText) rowParser.columns().get(i).text(); - text.unsafeEncode(encoded[i], sb); - } - }); - } - - public static Db2Text instanceToString() { - return textString.contramap(Object::toString); - } - - /** DB2 doesn't support streaming like PostgreSQL's COPY, so this is a placeholder. 
*/ - public static Db2Text NotWorking() { - return instance( - (a, sb) -> { - throw new UnsupportedOperationException("DB2 text encoding not supported for this type"); - }); - } - - private static void escapeString(String s, StringBuilder sb) { - for (int i = 0; i < s.length(); i++) { - char c = s.charAt(i); - switch (c) { - case '\0': - sb.append("\\0"); - break; - case '\n': - sb.append("\\n"); - break; - case '\r': - sb.append("\\r"); - break; - case '\t': - sb.append("\\t"); - break; - case '\\': - sb.append("\\\\"); - break; - default: - sb.append(c); - } - } - } - - // Basic type text encoders - public static final Db2Text textString = instance((s, sb) -> escapeString(s, sb)); - public static final Db2Text textBoolean = instanceToString(); - public static final Db2Text textShort = instanceToString(); - public static final Db2Text textInteger = instanceToString(); - public static final Db2Text textLong = instanceToString(); - public static final Db2Text textFloat = instanceToString(); - public static final Db2Text textDouble = instanceToString(); - public static final Db2Text textBigDecimal = instanceToString(); - public static final Db2Text textByteArray = - instance((bytes, sb) -> sb.append(java.util.Base64.getEncoder().encodeToString(bytes))); - public static final Db2Text textObject = instanceToString(); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/Db2Type.java b/foundations-jdbc/src/java/dev/typr/foundations/Db2Type.java deleted file mode 100644 index 1a7a3d7d6f..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/Db2Type.java +++ /dev/null @@ -1,88 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.dsl.Bijection; -import java.util.Optional; -import java.util.function.Function; - -/** - * Combines DB2 type name, read, write, text encoding, and JSON encoding for a type. Similar to - * SqlServerType but for IBM DB2. 
- */ -public record Db2Type( - Db2Typename typename, - Db2Read read, - Db2Write write, - Db2Text db2Text, - Db2Json db2Json) - implements DbType { - @Override - public DbText text() { - return db2Text; - } - - @Override - public DbJson json() { - return db2Json; - } - - public Db2Type withTypename(Db2Typename typename) { - return new Db2Type<>(typename, read, write, db2Text, db2Json); - } - - public Db2Type withTypename(String sqlType) { - return withTypename(Db2Typename.of(sqlType)); - } - - public Db2Type renamed(String value) { - return withTypename(typename.renamed(value)); - } - - public Db2Type renamedDropPrecision(String value) { - return withTypename(typename.renamedDropPrecision(value)); - } - - public Db2Type withRead(Db2Read read) { - return new Db2Type<>(typename, read, write, db2Text, db2Json); - } - - public Db2Type withWrite(Db2Write write) { - return new Db2Type<>(typename, read, write, db2Text, db2Json); - } - - public Db2Type withText(Db2Text text) { - return new Db2Type<>(typename, read, write, text, db2Json); - } - - public Db2Type withJson(Db2Json json) { - return new Db2Type<>(typename, read, write, db2Text, json); - } - - public Db2Type> opt() { - return new Db2Type<>( - typename.opt(), read.opt(), write.opt(typename), db2Text.opt(), db2Json.opt()); - } - - public Db2Type bimap(SqlFunction f, Function g) { - return new Db2Type<>( - typename.as(), read.map(f), write.contramap(g), db2Text.contramap(g), db2Json.bimap(f, g)); - } - - public Db2Type to(Bijection bijection) { - return new Db2Type<>( - typename.as(), - read.map(bijection::underlying), - write.contramap(bijection::from), - db2Text.contramap(bijection::from), - db2Json.bimap(bijection::underlying, bijection::from)); - } - - public static Db2Type of( - String tpe, Db2Read r, Db2Write w, Db2Text t, Db2Json j) { - return new Db2Type<>(Db2Typename.of(tpe), r, w, t, j); - } - - public static Db2Type of( - Db2Typename typename, Db2Read r, Db2Write w, Db2Text t, Db2Json j) { - return new 
Db2Type<>(typename, r, w, t, j); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/Db2Typename.java b/foundations-jdbc/src/java/dev/typr/foundations/Db2Typename.java deleted file mode 100644 index 675915e8a9..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/Db2Typename.java +++ /dev/null @@ -1,137 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.dsl.Bijection; -import java.util.Optional; - -/** - * Represents a DB2 SQL type name with optional precision. Similar to SqlServerTypename. DB2 uses - * double quotes for identifiers and standard CAST syntax. - */ -public sealed interface Db2Typename extends DbTypename { - String sqlType(); - - /** - * DB2 uses CAST() syntax, not PostgreSQL's :: operator. Don't render :: casts in prepared - * statements. - */ - @Override - default boolean renderTypeCast() { - return false; - } - - String sqlTypeNoPrecision(); - - Db2Typename renamed(String value); - - Db2Typename renamedDropPrecision(String value); - - default Db2Typename> opt() { - return new Opt<>(this); - } - - default Db2Typename as() { - return (Db2Typename) this; - } - - /** - * Type-safe conversion using a bijection as proof of type relationship. Overrides DbTypename.to() - * to return Db2Typename for better type refinement. 
- */ - @Override - default Db2Typename to(Bijection bijection) { - return (Db2Typename) this; - } - - record Base(String sqlType) implements Db2Typename { - @Override - public String sqlTypeNoPrecision() { - return sqlType; - } - - @Override - public Base renamed(String value) { - return new Base<>(value); - } - - @Override - public Base renamedDropPrecision(String value) { - return new Base<>(value); - } - } - - record WithPrec(Base of, int precision) implements Db2Typename { - public String sqlType() { - return of.sqlType + "(" + precision + ")"; - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public Db2Typename renamed(String value) { - return new WithPrec<>(of.renamed(value), precision); - } - - @Override - public Db2Typename renamedDropPrecision(String value) { - return of.renamed(value); - } - } - - record WithPrecScale(Base of, int precision, int scale) implements Db2Typename { - public String sqlType() { - return of.sqlType + "(" + precision + "," + scale + ")"; - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public Db2Typename renamed(String value) { - return new WithPrecScale<>(of.renamed(value), precision, scale); - } - - @Override - public Db2Typename renamedDropPrecision(String value) { - return of.renamed(value); - } - } - - record Opt(Db2Typename of) implements Db2Typename> { - @Override - public String sqlType() { - return of.sqlType(); - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public Db2Typename> renamed(String value) { - return new Opt<>(of.renamed(value)); - } - - @Override - public Db2Typename> renamedDropPrecision(String value) { - return new Opt<>(of.renamedDropPrecision(value)); - } - } - - static Db2Typename of(String sqlType) { - return new Base<>(sqlType); - } - - static Db2Typename of(String sqlType, int precision) { - return new WithPrec<>(new 
Base<>(sqlType), precision); - } - - static Db2Typename of(String sqlType, int precision, int scale) { - return new WithPrecScale<>(new Base<>(sqlType), precision, scale); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/Db2Types.java b/foundations-jdbc/src/java/dev/typr/foundations/Db2Types.java deleted file mode 100644 index b13c06a271..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/Db2Types.java +++ /dev/null @@ -1,316 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.time.*; - -/** - * DB2 type definitions for the typr-runtime-java library. - * - *
<p>
This interface provides type codecs for all IBM DB2 data types. - * - *
<p>
Key differences from other databases: - * - *

- * <ul>
- *   <li>DECFLOAT - DB2-specific decimal floating point (16 or 34 digits)
- *   <li>GRAPHIC/VARGRAPHIC/DBCLOB - Double-byte character set types
- *   <li>ROWID - DB2 row identifier
- *   <li>XML - Native XML support
- *   <li>BOOLEAN - Native since DB2 11.1
- * </ul>
- */ -public interface Db2Types { - - // ==================== Integer Types ==================== - - Db2Type smallint = - Db2Type.of( - "SMALLINT", Db2Read.readShort, Db2Write.writeShort, Db2Text.textShort, Db2Json.int2); - - Db2Type integer = - Db2Type.of( - "INTEGER", Db2Read.readInteger, Db2Write.writeInteger, Db2Text.textInteger, Db2Json.int4); - - Db2Type int_ = integer.renamed("INT"); - - Db2Type bigint = - Db2Type.of("BIGINT", Db2Read.readLong, Db2Write.writeLong, Db2Text.textLong, Db2Json.int8); - - // ==================== Fixed-Point Types ==================== - - Db2Type decimal = - Db2Type.of( - "DECIMAL", - Db2Read.readBigDecimal, - Db2Write.writeBigDecimal, - Db2Text.textBigDecimal, - Db2Json.numeric); - - Db2Type numeric = decimal.renamed("NUMERIC"); - - Db2Type dec = decimal.renamed("DEC"); - - static Db2Type decimal(int precision, int scale) { - return Db2Type.of( - Db2Typename.of("DECIMAL", precision, scale), - Db2Read.readBigDecimal, - Db2Write.writeBigDecimal, - Db2Text.textBigDecimal, - Db2Json.numeric); - } - - static Db2Type numeric(int precision, int scale) { - return decimal(precision, scale).renamed("NUMERIC"); - } - - // DECFLOAT - DB2-specific decimal floating point - Db2Type decfloat = - Db2Type.of( - "DECFLOAT", - Db2Read.readDecFloat, - Db2Write.writeDecFloat, - Db2Text.textBigDecimal, - Db2Json.numeric); - - static Db2Type decfloat(int precision) { - return Db2Type.of( - Db2Typename.of("DECFLOAT", precision), - Db2Read.readDecFloat, - Db2Write.writeDecFloat, - Db2Text.textBigDecimal, - Db2Json.numeric); - } - - // ==================== Floating-Point Types ==================== - - Db2Type real = - Db2Type.of("REAL", Db2Read.readFloat, Db2Write.writeFloat, Db2Text.textFloat, Db2Json.float4); - - Db2Type double_ = - Db2Type.of( - "DOUBLE", Db2Read.readDouble, Db2Write.writeDouble, Db2Text.textDouble, Db2Json.float8); - - Db2Type float_ = double_.renamed("FLOAT"); - - // ==================== Boolean Type ==================== - - // Native 
BOOLEAN support since DB2 11.1 - Db2Type boolean_ = - Db2Type.of( - "BOOLEAN", Db2Read.readBoolean, Db2Write.writeBoolean, Db2Text.textBoolean, Db2Json.bool); - - // ==================== String Types (SBCS - Single-Byte) ==================== - - Db2Type char_ = - Db2Type.of( - "CHAR", Db2Read.readString, Db2Write.writeString, Db2Text.textString, Db2Json.text); - - Db2Type character = char_.renamed("CHARACTER"); - - static Db2Type char_(int length) { - return Db2Type.of( - Db2Typename.of("CHAR", length), - Db2Read.readString, - Db2Write.writeString, - Db2Text.textString, - Db2Json.text); - } - - Db2Type varchar = - Db2Type.of( - "VARCHAR", Db2Read.readString, Db2Write.writeString, Db2Text.textString, Db2Json.text); - - static Db2Type varchar(int length) { - return Db2Type.of( - Db2Typename.of("VARCHAR", length), - Db2Read.readString, - Db2Write.writeString, - Db2Text.textString, - Db2Json.text); - } - - // CLOB - Character Large Object - Db2Type clob = - Db2Type.of("CLOB", Db2Read.readClob, Db2Write.writeClob, Db2Text.textString, Db2Json.text); - - static Db2Type clob(int length) { - return Db2Type.of( - Db2Typename.of("CLOB", length), - Db2Read.readClob, - Db2Write.writeClob, - Db2Text.textString, - Db2Json.text); - } - - // ==================== String Types (DBCS - Double-Byte) ==================== - // Note: DB2's JSON_OBJECT does not support GRAPHIC/VARGRAPHIC/DBCLOB types (SQLCODE=-171) - - // GRAPHIC - Fixed-length double-byte character string - Db2Type graphic = - Db2Type.of( - "GRAPHIC", - Db2Read.readGraphic, - Db2Write.writeGraphic, - Db2Text.textString, - Db2Json.unsupported("GRAPHIC")); - - static Db2Type graphic(int length) { - return Db2Type.of( - Db2Typename.of("GRAPHIC", length), - Db2Read.readGraphic, - Db2Write.writeGraphic, - Db2Text.textString, - Db2Json.unsupported("GRAPHIC")); - } - - // VARGRAPHIC - Variable-length double-byte character string - Db2Type vargraphic = - Db2Type.of( - "VARGRAPHIC", - Db2Read.readVarGraphic, - 
Db2Write.writeVarGraphic, - Db2Text.textString, - Db2Json.unsupported("VARGRAPHIC")); - - static Db2Type vargraphic(int length) { - return Db2Type.of( - Db2Typename.of("VARGRAPHIC", length), - Db2Read.readVarGraphic, - Db2Write.writeVarGraphic, - Db2Text.textString, - Db2Json.unsupported("VARGRAPHIC")); - } - - // DBCLOB - Double-byte Character Large Object - Db2Type dbclob = - Db2Type.of( - "DBCLOB", - Db2Read.readDbClob, - Db2Write.writeDbClob, - Db2Text.textString, - Db2Json.unsupported("DBCLOB")); - - static Db2Type dbclob(int length) { - return Db2Type.of( - Db2Typename.of("DBCLOB", length), - Db2Read.readDbClob, - Db2Write.writeDbClob, - Db2Text.textString, - Db2Json.unsupported("DBCLOB")); - } - - // ==================== Binary Types ==================== - // Note: DB2's JSON_OBJECT does not support BINARY/VARBINARY/BLOB types (SQLCODE=-171 or -16402) - - Db2Type binary = - Db2Type.of( - "BINARY", - Db2Read.readByteArray, - Db2Write.writeByteArray, - Db2Text.textByteArray, - Db2Json.unsupported("BINARY")); - - static Db2Type binary(int length) { - return Db2Type.of( - Db2Typename.of("BINARY", length), - Db2Read.readByteArray, - Db2Write.writeByteArray, - Db2Text.textByteArray, - Db2Json.unsupported("BINARY")); - } - - Db2Type varbinary = - Db2Type.of( - "VARBINARY", - Db2Read.readByteArray, - Db2Write.writeByteArray, - Db2Text.textByteArray, - Db2Json.unsupported("VARBINARY")); - - static Db2Type varbinary(int length) { - return Db2Type.of( - Db2Typename.of("VARBINARY", length), - Db2Read.readByteArray, - Db2Write.writeByteArray, - Db2Text.textByteArray, - Db2Json.unsupported("VARBINARY")); - } - - // BLOB - Binary Large Object - Db2Type blob = - Db2Type.of( - "BLOB", - Db2Read.readBlob, - Db2Write.writeBlob, - Db2Text.textByteArray, - Db2Json.unsupported("BLOB")); - - static Db2Type blob(int length) { - return Db2Type.of( - Db2Typename.of("BLOB", length), - Db2Read.readBlob, - Db2Write.writeBlob, - Db2Text.textByteArray, - Db2Json.unsupported("BLOB")); - } 
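[editor's note] The parameterized factories in the deleted `Db2Types` above (`char_(int)`, `varchar(int)`, `decimal(int, int)`, `blob(int)`, …) all follow one pattern: a base typename is wrapped with precision arguments that render into the SQL type string, while the precision-free name is kept for things like `java.sql.Types` lookups. A minimal, self-contained sketch of that rendering follows — the names (`SqlTypename`, `WithPrecScale`) are illustrative, not the shipped `Db2Typename` API:

```java
// Illustrative sketch of the typename-with-precision pattern used by the
// deleted Db2Typename hierarchy. Names here are hypothetical.
public class TypenameSketch {
  sealed interface SqlTypename permits Base, WithPrecScale {
    String sqlType();            // full rendering, e.g. "DECIMAL(10,2)"
    String sqlTypeNoPrecision(); // base name, e.g. "DECIMAL"
  }

  record Base(String name) implements SqlTypename {
    public String sqlType() { return name; }
    public String sqlTypeNoPrecision() { return name; }
  }

  // Precision/scale only affect sqlType(); the precision-free name is
  // preserved so callers can still map it to a java.sql.Types constant.
  record WithPrecScale(Base of, int precision, int scale) implements SqlTypename {
    public String sqlType() { return of.name() + "(" + precision + "," + scale + ")"; }
    public String sqlTypeNoPrecision() { return of.name(); }
  }

  public static void main(String[] args) {
    SqlTypename decimal = new WithPrecScale(new Base("DECIMAL"), 10, 2);
    System.out.println(decimal.sqlType());            // DECIMAL(10,2)
    System.out.println(decimal.sqlTypeNoPrecision()); // DECIMAL
  }
}
```

This is why `Db2Typename.renamedDropPrecision` can return the plain `Base` form: dropping precision is just unwrapping one layer.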
- - // ==================== Date/Time Types ==================== - - Db2Type date = - Db2Type.of( - "DATE", Db2Read.readDate, Db2Write.writeDate, Db2Text.instanceToString(), Db2Json.date); - - Db2Type time = - Db2Type.of( - "TIME", Db2Read.readTime, Db2Write.writeTime, Db2Text.instanceToString(), Db2Json.time); - - // TIMESTAMP without time zone - Db2Type timestamp = - Db2Type.of( - "TIMESTAMP", - Db2Read.readTimestamp, - Db2Write.writeTimestamp, - Db2Text.instanceToString(), - Db2Json.timestamp); - - static Db2Type timestamp(int scale) { - return Db2Type.of( - Db2Typename.of("TIMESTAMP", scale), - Db2Read.readTimestamp, - Db2Write.writeTimestamp, - Db2Text.instanceToString(), - Db2Json.timestamp); - } - - // ==================== Special Types ==================== - // Note: DB2's JSON_OBJECT does not support XML type (SQLCODE=-171) - - // XML - Native XML support - Db2Type xml = - Db2Type.of( - "XML", - Db2Read.readXml, - Db2Write.writeXml, - Db2Text.textString.contramap(dev.typr.foundations.data.Xml::value), - Db2Json.unsupported("XML")); - - // ROWID - DB2 row identifier - Db2Type rowid = - Db2Type.of( - "ROWID", Db2Read.readRowId, Db2Write.writeRowId, Db2Text.textByteArray, Db2Json.bytea); - - // Generic object for unknown types - Db2Type object = - Db2Type.of( - "OBJECT", Db2Read.readObject, Db2Write.writeObject, Db2Text.textObject, Db2Json.unknown); - - // ==================== Unknown Type ==================== - // For columns whose type typr doesn't know how to handle - cast to/from string - Db2Type unknown = - Db2Type.of( - "VARCHAR(32672)", - Db2Read.readString, - Db2Write.writeString, - Db2Text.textString, - Db2Json.text) - .bimap(dev.typr.foundations.data.Unknown::new, dev.typr.foundations.data.Unknown::value); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/Db2Write.java b/foundations-jdbc/src/java/dev/typr/foundations/Db2Write.java deleted file mode 100644 index 65912e9df9..0000000000 --- 
a/foundations-jdbc/src/java/dev/typr/foundations/Db2Write.java +++ /dev/null @@ -1,148 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.sql.PreparedStatement; -import java.sql.SQLException; -import java.time.LocalDate; -import java.time.LocalDateTime; -import java.time.LocalTime; -import java.util.Optional; -import java.util.function.Function; - -/** - * Describes how to write a value to a {@link PreparedStatement} for IBM DB2. - * - *
<p>
Similar to SqlServerWrite but adapted for DB2-specific types like DECFLOAT, GRAPHIC, etc. - */ -public sealed interface Db2Write extends DbWrite permits Db2Write.Instance { - void set(PreparedStatement ps, int idx, A a) throws SQLException; - - Db2Write> opt(Db2Typename typename); - - Db2Write contramap(Function f); - - @FunctionalInterface - interface RawWriter { - void set(PreparedStatement ps, int index, A a) throws SQLException; - } - - record Instance(RawWriter rawWriter, Function f) implements Db2Write { - @Override - public void set(PreparedStatement ps, int index, A a) throws SQLException { - rawWriter.set(ps, index, f.apply(a)); - } - - @Override - public Db2Write> opt(Db2Typename typename) { - int sqlType = getSqlTypeForTypename(typename.sqlTypeNoPrecision()); - return new Instance<>( - (ps, index, u) -> { - if (u == null) ps.setNull(index, sqlType); - else set(ps, index, u); - }, - a -> a.orElse(null)); - } - - @Override - public Db2Write contramap(Function f) { - return new Instance<>(rawWriter, f.andThen(this.f)); - } - } - - static Db2Write primitive(RawWriter rawWriter) { - return new Instance<>(rawWriter, Function.identity()); - } - - static Db2Write passObjectToJdbc() { - return primitive(PreparedStatement::setObject); - } - - static int getSqlTypeForTypename(String sqlType) { - return switch (sqlType.toUpperCase()) { - case "SMALLINT" -> java.sql.Types.SMALLINT; - case "INTEGER", "INT" -> java.sql.Types.INTEGER; - case "BIGINT" -> java.sql.Types.BIGINT; - case "DECIMAL", "NUMERIC", "DEC" -> java.sql.Types.DECIMAL; - case "DECFLOAT" -> java.sql.Types.DECIMAL; - case "REAL" -> java.sql.Types.REAL; - case "DOUBLE", "FLOAT" -> java.sql.Types.DOUBLE; - case "BOOLEAN" -> java.sql.Types.BOOLEAN; - case "CHAR", "CHARACTER" -> java.sql.Types.CHAR; - case "VARCHAR" -> java.sql.Types.VARCHAR; - case "CLOB" -> java.sql.Types.CLOB; - case "GRAPHIC" -> java.sql.Types.CHAR; - case "VARGRAPHIC" -> java.sql.Types.VARCHAR; - case "DBCLOB" -> java.sql.Types.CLOB; - 
case "BINARY" -> java.sql.Types.BINARY; - case "VARBINARY" -> java.sql.Types.VARBINARY; - case "BLOB" -> java.sql.Types.BLOB; - case "DATE" -> java.sql.Types.DATE; - case "TIME" -> java.sql.Types.TIME; - case "TIMESTAMP" -> java.sql.Types.TIMESTAMP; - case "XML" -> java.sql.Types.SQLXML; - case "ROWID" -> java.sql.Types.ROWID; - default -> java.sql.Types.OTHER; - }; - } - - // ==================== Basic Type Writers ==================== - - Db2Write writeString = primitive(PreparedStatement::setString); - Db2Write writeBoolean = primitive(PreparedStatement::setBoolean); - Db2Write writeShort = primitive(PreparedStatement::setShort); - Db2Write writeInteger = primitive(PreparedStatement::setInt); - Db2Write writeLong = primitive(PreparedStatement::setLong); - Db2Write writeFloat = primitive(PreparedStatement::setFloat); - Db2Write writeDouble = primitive(PreparedStatement::setDouble); - Db2Write writeBigDecimal = primitive(PreparedStatement::setBigDecimal); - Db2Write writeByteArray = primitive(PreparedStatement::setBytes); - - // ==================== Date/Time Writers ==================== - - Db2Write writeDate = primitive(PreparedStatement::setObject); - Db2Write writeTime = primitive(PreparedStatement::setObject); - Db2Write writeTimestamp = primitive(PreparedStatement::setObject); - - // ==================== Special Type Writers ==================== - - // XML - Db2Write writeXml = - primitive( - (ps, idx, xml) -> { - java.sql.SQLXML sqlxml = ps.getConnection().createSQLXML(); - sqlxml.setString(xml.value()); - ps.setSQLXML(idx, sqlxml); - }); - - // CLOB - Db2Write writeClob = - primitive( - (ps, idx, str) -> { - java.sql.Clob clob = ps.getConnection().createClob(); - clob.setString(1, str); - ps.setClob(idx, clob); - }); - - // BLOB - Db2Write writeBlob = - primitive( - (ps, idx, bytes) -> { - java.sql.Blob blob = ps.getConnection().createBlob(); - blob.setBytes(1, bytes); - ps.setBlob(idx, blob); - }); - - // GRAPHIC types - written as strings - Db2Write 
writeGraphic = writeString; - Db2Write writeVarGraphic = writeString; - Db2Write writeDbClob = writeClob; - - // DECFLOAT - write as BigDecimal - Db2Write writeDecFloat = writeBigDecimal; - - // ROWID - Db2Write writeRowId = writeByteArray; - - // Generic object write - Db2Write writeObject = primitive(PreparedStatement::setObject); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DbJson.java b/foundations-jdbc/src/java/dev/typr/foundations/DbJson.java deleted file mode 100644 index 1c7e72cae5..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DbJson.java +++ /dev/null @@ -1,127 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.JsonValue; - -/** - * Interface for JSON serialization/deserialization of database values. Each DbType provides an - * implementation that converts values to/from JSON in the format that the database - * produces/consumes. - * - * @param The Java type being serialized/deserialized - */ -public interface DbJson { - /** - * Convert a value to its JSON representation. - * - * @param value The value to convert (may be null for nullable types) - * @return The JSON representation - */ - JsonValue toJson(A value); - - /** - * Convert a JSON representation back to the value. - * - * @param json The JSON to parse - * @return The parsed value - * @throws IllegalArgumentException if the JSON cannot be parsed - */ - A fromJson(JsonValue json); - - /** Create an optional version of this JSON codec. */ - default DbJson> opt() { - DbJson self = this; - return new DbJson<>() { - @Override - public JsonValue toJson(java.util.Optional value) { - return value.map(self::toJson).orElse(JsonValue.JNull.INSTANCE); - } - - @Override - public java.util.Optional fromJson(JsonValue json) { - if (json instanceof JsonValue.JNull) { - return java.util.Optional.empty(); - } - return java.util.Optional.of(self.fromJson(json)); - } - }; - } - - /** Create an array version of this JSON codec. 
*/ - default DbJson array(java.util.function.IntFunction arrayFactory) { - DbJson self = this; - return new DbJson<>() { - @Override - public JsonValue toJson(A[] value) { - java.util.List elements = new java.util.ArrayList<>(value.length); - for (A elem : value) { - elements.add(self.toJson(elem)); - } - return new JsonValue.JArray(elements); - } - - @Override - public A[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray arr)) { - throw new IllegalArgumentException( - "Expected JSON array, got: " + json.getClass().getSimpleName()); - } - A[] result = arrayFactory.apply(arr.values().size()); - for (int i = 0; i < arr.values().size(); i++) { - result[i] = self.fromJson(arr.values().get(i)); - } - return result; - } - }; - } - - /** Create a list version of this JSON codec. */ - default DbJson> list() { - DbJson self = this; - return new DbJson<>() { - @Override - public JsonValue toJson(java.util.List value) { - java.util.List elements = new java.util.ArrayList<>(value.size()); - for (A elem : value) { - elements.add(self.toJson(elem)); - } - return new JsonValue.JArray(elements); - } - - @Override - public java.util.List fromJson(JsonValue json) { - if (json instanceof JsonValue.JNull) { - return java.util.List.of(); - } - if (!(json instanceof JsonValue.JArray arr)) { - throw new IllegalArgumentException( - "Expected JSON array, got: " + json.getClass().getSimpleName()); - } - java.util.List result = new java.util.ArrayList<>(arr.values().size()); - for (JsonValue elem : arr.values()) { - result.add(self.fromJson(elem)); - } - return result; - } - }; - } - - /** Transform this codec using bidirectional mapping. 
*/ - default DbJson bimap(SqlFunction f, java.util.function.Function g) { - DbJson self = this; - return new DbJson<>() { - @Override - public JsonValue toJson(B value) { - return self.toJson(g.apply(value)); - } - - @Override - public B fromJson(JsonValue json) { - try { - return f.apply(self.fromJson(json)); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - } - }; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DbJsonRow.java b/foundations-jdbc/src/java/dev/typr/foundations/DbJsonRow.java deleted file mode 100644 index f3f042e645..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DbJsonRow.java +++ /dev/null @@ -1,157 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.JsonValue; -import java.util.ArrayList; -import java.util.List; -import java.util.Map; - -/** - * Factory for creating {@link DbJson} codecs from a {@link RowParser}. - * - *

- * <p>Supports two encoding modes:
- *
- * <ul>
- *   <li><b>Array encoding</b> - More compact, values are JSON arrays: {@code [1,
- *       "foo@example.com"]}
- *   <li><b>Object encoding</b> - More readable, values are JSON objects: {@code {"id": 1, "email":
- *       "foo@example.com"}}
- * </ul>
- *
- * <p>Example usage:
- *
- * <pre>{@code
- * RowParser<Email> emailParser = RowParsers.of(
- *     PgTypes.int4,
- *     PgTypes.text,
- *     Email::new,
- *     email -> new Object[]{email.id(), email.email()}
- * );
- *
- * // Array encoding (compact)
- * DbJson<Email> arrayCodec = DbJsonRow.jsonArray(emailParser);
- *
- * // Object encoding (with field names)
- * DbJson<Email> objectCodec = DbJsonRow.jsonObject(emailParser, List.of("id", "email"));
- *
- * // Compose with list() for JSON arrays of rows
- * DbJson<List<Email>> listCodec = arrayCodec.list();
- *
- * // Compose with opt() for nullable
- * DbJson<Optional<Email>> optCodec = arrayCodec.opt();
- * }</pre>
- */ -public final class DbJsonRow { - - private DbJsonRow() {} // utility class - - /** - * Create a DbJson codec that encodes rows as JSON arrays. - * - *

This is the most compact encoding. Each row becomes a JSON array where the elements - * correspond to the columns in order. - * - * @param rowParser the parser that defines the row structure and types - * @return a DbJson codec for the row type - */ - public static DbJson jsonArray(RowParser rowParser) { - return new ArrayCodec<>(rowParser); - } - - /** - * Create a DbJson codec that encodes rows as JSON objects with named fields. - * - *

Each row becomes a JSON object where keys are the column names provided. - * - * @param rowParser the parser that defines the row structure and types - * @param columnNames the JSON object keys corresponding to each column (in order) - * @return a DbJson codec for the row type - * @throws IllegalArgumentException if columnNames size doesn't match rowParser columns - */ - public static DbJson jsonObject(RowParser rowParser, List columnNames) { - if (rowParser.columns().size() != columnNames.size()) { - throw new IllegalArgumentException( - "Column count mismatch: RowParser has " - + rowParser.columns().size() - + " columns, but " - + columnNames.size() - + " column names provided"); - } - return new ObjectCodec<>(rowParser, List.copyOf(columnNames)); - } - - /** Array encoding: rows become JSON arrays like [val1, val2, ...] */ - private record ArrayCodec(RowParser rowParser) implements DbJson { - - @Override - @SuppressWarnings("unchecked") - public JsonValue toJson(Row value) { - Object[] values = rowParser.encode().apply(value); - List elements = new ArrayList<>(values.length); - for (int i = 0; i < rowParser.columns().size(); i++) { - DbJson jsonCodec = (DbJson) rowParser.columns().get(i).json(); - elements.add(jsonCodec.toJson(values[i])); - } - return new JsonValue.JArray(elements); - } - - @Override - @SuppressWarnings("unchecked") - public Row fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray(List arrayValues))) { - throw new IllegalArgumentException( - "Expected JSON array for row, got: " + json.getClass().getSimpleName()); - } - if (arrayValues.size() != rowParser.columns().size()) { - throw new IllegalArgumentException( - "JSON array size " - + arrayValues.size() - + " doesn't match column count " - + rowParser.columns().size()); - } - Object[] values = new Object[rowParser.columns().size()]; - for (int i = 0; i < rowParser.columns().size(); i++) { - DbJson jsonCodec = (DbJson) rowParser.columns().get(i).json(); - values[i] = 
jsonCodec.fromJson(arrayValues.get(i)); - } - return rowParser.decode().apply(values); - } - } - - /** Object encoding: rows become JSON objects like {"col1": val1, "col2": val2, ...} */ - private record ObjectCodec(RowParser rowParser, List columnNames) - implements DbJson { - - @Override - @SuppressWarnings("unchecked") - public JsonValue toJson(Row value) { - Object[] values = rowParser.encode().apply(value); - java.util.LinkedHashMap fields = new java.util.LinkedHashMap<>(); - for (int i = 0; i < rowParser.columns().size(); i++) { - DbJson jsonCodec = (DbJson) rowParser.columns().get(i).json(); - fields.put(columnNames.get(i), jsonCodec.toJson(values[i])); - } - return new JsonValue.JObject(fields); - } - - @Override - @SuppressWarnings("unchecked") - public Row fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JObject(Map fields))) { - throw new IllegalArgumentException( - "Expected JSON object for row, got: " + json.getClass().getSimpleName()); - } - Object[] values = new Object[rowParser.columns().size()]; - for (int i = 0; i < rowParser.columns().size(); i++) { - String colName = columnNames.get(i); - JsonValue colValue = fields.get(colName); - if (colValue == null || colValue instanceof JsonValue.JNull) { - values[i] = null; - } else { - DbJson jsonCodec = (DbJson) rowParser.columns().get(i).json(); - values[i] = jsonCodec.fromJson(colValue); - } - } - return rowParser.decode().apply(values); - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DbRead.java b/foundations-jdbc/src/java/dev/typr/foundations/DbRead.java deleted file mode 100644 index 9c68aa6c25..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DbRead.java +++ /dev/null @@ -1,18 +0,0 @@ -package dev.typr.foundations; - -import java.sql.ResultSet; -import java.sql.SQLException; -import java.util.Optional; - -/** - * Common interface for reading columns from a {@link ResultSet}. Implemented by both PgRead - * (PostgreSQL) and MariaRead (MariaDB). 
- */ -public interface DbRead { - A read(ResultSet rs, int col) throws SQLException; - - DbRead map(SqlFunction f); - - /** Derive a DbRead which allows nullable values */ - DbRead> opt(); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DbText.java b/foundations-jdbc/src/java/dev/typr/foundations/DbText.java deleted file mode 100644 index 45bd3e39fc..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DbText.java +++ /dev/null @@ -1,9 +0,0 @@ -package dev.typr.foundations; - -/** - * Common interface for text encoding of values. Used for bulk loading (COPY in PostgreSQL, LOAD - * DATA in MariaDB). Implemented by both PgText and MariaText. - */ -public interface DbText { - void unsafeEncode(A a, StringBuilder sb); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DbType.java b/foundations-jdbc/src/java/dev/typr/foundations/DbType.java deleted file mode 100644 index 5c0a96df04..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DbType.java +++ /dev/null @@ -1,40 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.dsl.Bijection; - -/** - * Common interface for database type codecs. Implemented by both PgType (PostgreSQL) and MariaType - * (MariaDB). - */ -public interface DbType { - /** Get the typename for SQL rendering (e.g., for casts like ?::typename). */ - DbTypename typename(); - - /** Get the read codec for reading ResultSet columns. */ - DbRead read(); - - /** Get the write codec for setting PreparedStatement parameters. */ - DbWrite write(); - - /** Get the text encoder for bulk loading (COPY/LOAD DATA). */ - DbText text(); - - /** - * Get the JSON codec for converting values to/from JSON format that the database can - * produce/consume. - */ - DbJson json(); - - /** Create an optional version of this type. */ - DbType> opt(); - - /** - * Convert this DbType to handle a different type using a bijection. 
The bijection converts values - * bidirectionally while preserving the underlying database type semantics. - * - * @param bijection The bijection to convert between A and B - * @param The target type - * @return A DbType that handles type B by converting to/from type A - */ - DbType to(Bijection bijection); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DbTypename.java b/foundations-jdbc/src/java/dev/typr/foundations/DbTypename.java deleted file mode 100644 index 2b7ca9889e..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DbTypename.java +++ /dev/null @@ -1,33 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.dsl.Bijection; - -/** - * Common interface for database type names. Implemented by both PgTypename (PostgreSQL) and - * MariaTypename (MariaDB). - */ -public interface DbTypename { - /** Get the SQL type string (e.g., "text", "int4", "varchar(255)"). */ - String sqlType(); - - /** - * Whether to render type casts in SQL (e.g., ?::typename for PostgreSQL). PostgreSQL uses type - * casts, MariaDB does not. - */ - default boolean renderTypeCast() { - return true; // Default to PostgreSQL behavior - } - - /** - * Type-safe conversion using a bijection as proof of type relationship. Since DbTypename is just - * type metadata (SQL type string), the type parameter is phantom - no values of type A are ever - * stored. The bijection proves that A and B are related types, providing compile-time type - * safety. 
- * - * @param bijection proof that A and B are related types (not used at runtime) - * @return this typename with type parameter B - */ - default DbTypename to(Bijection bijection) { - return (DbTypename) this; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DbWrite.java b/foundations-jdbc/src/java/dev/typr/foundations/DbWrite.java deleted file mode 100644 index f398171f79..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DbWrite.java +++ /dev/null @@ -1,13 +0,0 @@ -package dev.typr.foundations; - -import java.sql.PreparedStatement; -import java.sql.SQLException; - -/** - * Common interface for writing values to PreparedStatement. Implemented by both PgWrite - * (PostgreSQL) and MariaWrite (MariaDB). - */ -public interface DbWrite { - /** Set a value in a PreparedStatement at the given index. */ - void set(PreparedStatement ps, int idx, A value) throws SQLException; -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbJson.java b/foundations-jdbc/src/java/dev/typr/foundations/DuckDbJson.java deleted file mode 100644 index c379946476..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbJson.java +++ /dev/null @@ -1,433 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.*; -import java.math.BigDecimal; -import java.math.BigInteger; -import java.time.*; -import java.util.*; -import java.util.function.Function; -import java.util.function.IntFunction; - -/** - * DuckDB-specific JSON codec implementations. DuckDB has native JSON support and can output results - * as JSON. 
- */ -public interface DuckDbJson extends DbJson { - - @Override - default DuckDbJson> opt() { - DuckDbJson self = this; - return new DuckDbJson<>() { - @Override - public JsonValue toJson(Optional value) { - return value.map(self::toJson).orElse(JsonValue.JNull.INSTANCE); - } - - @Override - public Optional fromJson(JsonValue json) { - if (json instanceof JsonValue.JNull) { - return Optional.empty(); - } - return Optional.of(self.fromJson(json)); - } - }; - } - - default DuckDbJson array(IntFunction arrayFactory) { - DuckDbJson self = this; - return new DuckDbJson<>() { - @Override - public JsonValue toJson(A[] value) { - List elements = new ArrayList<>(value.length); - for (A elem : value) { - elements.add(self.toJson(elem)); - } - return new JsonValue.JArray(elements); - } - - @Override - public A[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray(List values))) { - throw new IllegalArgumentException( - "Expected JSON array, got: " + json.getClass().getSimpleName()); - } - A[] result = arrayFactory.apply(values.size()); - for (int i = 0; i < values.size(); i++) { - result[i] = self.fromJson(values.get(i)); - } - return result; - } - }; - } - - default DuckDbJson> list() { - DuckDbJson self = this; - return new DuckDbJson<>() { - @Override - public JsonValue toJson(List value) { - List elements = new ArrayList<>(value.size()); - for (A elem : value) { - elements.add(self.toJson(elem)); - } - return new JsonValue.JArray(elements); - } - - @Override - public List fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray(List values))) { - throw new IllegalArgumentException( - "Expected JSON array, got: " + json.getClass().getSimpleName()); - } - List result = new ArrayList<>(values.size()); - for (JsonValue elem : values) { - result.add(self.fromJson(elem)); - } - return result; - } - }; - } - - default DuckDbJson bimap(SqlFunction f, Function g) { - DuckDbJson self = this; - return new DuckDbJson<>() { - @Override - public JsonValue 
toJson(B value) { - return self.toJson(g.apply(value)); - } - - @Override - public B fromJson(JsonValue json) { - try { - return f.apply(self.fromJson(json)); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - } - }; - } - - // Primitive type codecs - DuckDbJson bool = - new DuckDbJson<>() { - @Override - public JsonValue toJson(Boolean value) { - return JsonValue.JBool.of(value); - } - - @Override - public Boolean fromJson(JsonValue json) { - if (json instanceof JsonValue.JBool(boolean value)) return value; - if (json instanceof JsonValue.JNumber(String value)) return Integer.parseInt(value) != 0; - throw new IllegalArgumentException( - "Expected boolean or number, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson int1 = - new DuckDbJson<>() { - @Override - public JsonValue toJson(Byte value) { - return JsonValue.JNumber.of(value.longValue()); - } - - @Override - public Byte fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return Byte.parseByte(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson int2 = - new DuckDbJson<>() { - @Override - public JsonValue toJson(Short value) { - return JsonValue.JNumber.of(value.longValue()); - } - - @Override - public Short fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return Short.parseShort(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson int4 = - new DuckDbJson<>() { - @Override - public JsonValue toJson(Integer value) { - return JsonValue.JNumber.of(value.longValue()); - } - - @Override - public Integer fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return Integer.parseInt(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson int8 = - new 
DuckDbJson<>() { - @Override - public JsonValue toJson(Long value) { - return JsonValue.JNumber.of(value); - } - - @Override - public Long fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return Long.parseLong(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - // HUGEINT (128-bit) - represented as BigInteger, serialized as string in JSON to avoid precision - // loss - DuckDbJson hugeint = - new DuckDbJson<>() { - @Override - public JsonValue toJson(BigInteger value) { - // For very large integers, use string representation to avoid precision loss - return new JsonValue.JString(value.toString()); - } - - @Override - public BigInteger fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) return new BigInteger(value); - if (json instanceof JsonValue.JNumber(String value)) return new BigInteger(value); - throw new IllegalArgumentException( - "Expected string or number for hugeint, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson float4 = - new DuckDbJson<>() { - @Override - public JsonValue toJson(Float value) { - return JsonValue.JNumber.of(value.doubleValue()); - } - - @Override - public Float fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return Float.parseFloat(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson float8 = - new DuckDbJson<>() { - @Override - public JsonValue toJson(Double value) { - return JsonValue.JNumber.of(value); - } - - @Override - public Double fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return Double.parseDouble(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson numeric = - new DuckDbJson<>() { - @Override - public JsonValue toJson(BigDecimal value) { - return 
JsonValue.JNumber.of(value.toPlainString()); - } - - @Override - public BigDecimal fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return new BigDecimal(value); - if (json instanceof JsonValue.JString(String value)) return new BigDecimal(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson text = - new DuckDbJson<>() { - @Override - public JsonValue toJson(String value) { - return new JsonValue.JString(value); - } - - @Override - public String fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) return value; - throw new IllegalArgumentException( - "Expected string, got: " + json.getClass().getSimpleName()); - } - }; - - // DuckDB encodes BLOB as base64 in JSON - // Note: For JSON COPY import, BLOB is not supported as JSON is a textual format - DuckDbJson blob = - new DuckDbJson<>() { - @Override - public JsonValue toJson(byte[] value) { - // Use base64 encoding for JSON - return new JsonValue.JString(Base64.getEncoder().encodeToString(value)); - } - - @Override - public byte[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JString(String value))) { - throw new IllegalArgumentException( - "Expected string for blob, got: " + json.getClass().getSimpleName()); - } - // Try base64 first - try { - return Base64.getDecoder().decode(value); - } catch (IllegalArgumentException e) { - // Handle hex format: \xAABBCC... 
- if (value.startsWith("\\x")) { - String hexStr = value.substring(2); - byte[] bytes = new byte[hexStr.length() / 2]; - for (int i = 0; i < bytes.length; i++) { - bytes[i] = (byte) Integer.parseInt(hexStr.substring(i * 2, i * 2 + 2), 16); - } - return bytes; - } - // If not base64 or hex, assume raw string (DuckDB may encode as escaped string) - return value.getBytes(java.nio.charset.StandardCharsets.ISO_8859_1); - } - } - }; - - // Date/Time types - DuckDbJson date = - new DuckDbJson<>() { - @Override - public JsonValue toJson(LocalDate value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public LocalDate fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) return LocalDate.parse(value); - throw new IllegalArgumentException( - "Expected string for date, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson time = - new DuckDbJson<>() { - @Override - public JsonValue toJson(LocalTime value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public LocalTime fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) return LocalTime.parse(value); - throw new IllegalArgumentException( - "Expected string for time, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson timestamp = - new DuckDbJson<>() { - @Override - public JsonValue toJson(LocalDateTime value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public LocalDateTime fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) { - String normalized = s.value().replace(' ', 'T'); - return LocalDateTime.parse(normalized); - } - throw new IllegalArgumentException( - "Expected string for timestamp, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson timestamptz = - new DuckDbJson<>() { - @Override - public JsonValue toJson(OffsetDateTime value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public OffsetDateTime 
fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) { - String normalized = s.value().replace(' ', 'T'); - return OffsetDateTime.parse(normalized); - } - throw new IllegalArgumentException( - "Expected string for timestamptz, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson interval = - new DuckDbJson<>() { - @Override - public JsonValue toJson(Duration value) { - // Use DuckDB's interval format: HH:MM:SS - // This format works in both JSON COPY and regular JDBC operations - long hours = value.toHours(); - long minutes = value.toMinutesPart(); - long seconds = value.toSecondsPart(); - return new JsonValue.JString(String.format("%d:%02d:%02d", hours, minutes, seconds)); - } - - @Override - public Duration fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) { - // Try to parse as ISO-8601 duration - if (value.startsWith("PT") || value.startsWith("P")) { - return Duration.parse(value); - } - // DuckDB format: "HH:MM:SS" or "HH:MM:SS.nnnnnn" - String[] parts = value.split(":"); - if (parts.length >= 2) { - long hours = Long.parseLong(parts[0]); - long minutes = Long.parseLong(parts[1]); - long seconds = parts.length > 2 ? 
Long.parseLong(parts[2].split("\\.")[0]) : 0; - return Duration.ofHours(hours).plusMinutes(minutes).plusSeconds(seconds); - } - throw new IllegalArgumentException("Cannot parse interval: " + value); - } - throw new IllegalArgumentException( - "Expected string for interval, got: " + json.getClass().getSimpleName()); - } - }; - - DuckDbJson uuid = - new DuckDbJson<>() { - @Override - public JsonValue toJson(UUID value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public UUID fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) return UUID.fromString(value); - throw new IllegalArgumentException( - "Expected string for uuid, got: " + json.getClass().getSimpleName()); - } - }; - - // JSON type (pass-through) - DuckDbJson json = - new DuckDbJson<>() { - @Override - public JsonValue toJson(Json value) { - return JsonValue.parse(value.value()); - } - - @Override - public Json fromJson(JsonValue json) { - return new Json(json.encode()); - } - }; - - // BIT type - stored as string of 0s and 1s - DuckDbJson bit = text; -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbMapSupport.java b/foundations-jdbc/src/java/dev/typr/foundations/DuckDbMapSupport.java deleted file mode 100644 index 07c04be9cd..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbMapSupport.java +++ /dev/null @@ -1,80 +0,0 @@ -package dev.typr.foundations; - -import java.util.function.Function; - -/** - * Handles conversion of values to/from DuckDB MAP entries. DuckDB JDBC returns maps with Object - * keys/values that need to be cast to the proper types, and when writing we may need to convert - * back. - * - * @param the Java type - */ -public interface DuckDbMapSupport { - /** Convert a raw value from a DuckDB MAP to the Java type. */ - A fromMap(Object raw); - - /** Convert the Java type to a value suitable for a DuckDB MAP. Usually identity. 
*/ - Object toMap(A value); - - /** Create a support that just casts (for types DuckDB returns directly). */ - @SuppressWarnings("unchecked") - static DuckDbMapSupport cast() { - return new DuckDbMapSupport<>() { - @Override - public A fromMap(Object raw) { - return (A) raw; - } - - @Override - public Object toMap(A value) { - return value; - } - }; - } - - /** Create a support with custom conversion in both directions. */ - static DuckDbMapSupport of(Function from, Function to) { - return new DuckDbMapSupport<>() { - @Override - public A fromMap(Object raw) { - return from.apply(raw); - } - - @Override - public Object toMap(A value) { - return to.apply(value); - } - }; - } - - /** Create a support with custom read conversion, identity for write. */ - static DuckDbMapSupport fromOnly(Function from) { - return new DuckDbMapSupport<>() { - @Override - public A fromMap(Object raw) { - return from.apply(raw); - } - - @Override - public Object toMap(A value) { - return value; - } - }; - } - - /** Transform this support with a bijection (for bimap support). */ - default DuckDbMapSupport bimap(Function f, Function g) { - DuckDbMapSupport self = this; - return new DuckDbMapSupport<>() { - @Override - public B fromMap(Object raw) { - return f.apply(self.fromMap(raw)); - } - - @Override - public Object toMap(B value) { - return self.toMap(g.apply(value)); - } - }; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbRead.java b/foundations-jdbc/src/java/dev/typr/foundations/DuckDbRead.java deleted file mode 100644 index 64a934c08e..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbRead.java +++ /dev/null @@ -1,417 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.math.BigInteger; -import java.sql.ResultSet; -import java.sql.SQLException; -import java.time.*; -import java.util.Optional; -import java.util.UUID; - -/** - * Describes how to read a column from a {@link ResultSet} for DuckDB. 
DuckDB's JDBC driver provides - * good type support for most types. - */ -public sealed interface DuckDbRead extends DbRead - permits DuckDbRead.NonNullable, DuckDbRead.Nullable, DuckDbRead.Mapped { - A read(ResultSet rs, int col) throws SQLException; - - DuckDbRead map(SqlFunction f); - - /** Derive a DuckDbRead which allows nullable values */ - DuckDbRead> opt(); - - @FunctionalInterface - interface RawRead { - A apply(ResultSet rs, int column) throws SQLException; - } - - /** - * Create an instance of {@link DuckDbRead} from a function that reads a value from a result set. - * - * @param f Should not blow up if the value returned is `null` - */ - static NonNullable of(RawRead f) { - RawRead> readNullableA = - (rs, col) -> { - var a = f.apply(rs, col); - if (rs.wasNull()) return Optional.empty(); - else return Optional.of(a); - }; - return new NonNullable<>(readNullableA); - } - - final class NonNullable implements DuckDbRead { - final RawRead> readNullable; - - public NonNullable(RawRead> readNullable) { - this.readNullable = readNullable; - } - - @Override - public A read(ResultSet rs, int col) throws SQLException { - return readNullable - .apply(rs, col) - .orElseThrow(() -> new SQLException("null value in column " + col)); - } - - @Override - public NonNullable map(SqlFunction f) { - return new NonNullable<>( - (rs, col) -> { - Optional maybeA = readNullable.apply(rs, col); - if (maybeA.isEmpty()) return Optional.empty(); - return Optional.of(f.apply(maybeA.get())); - }); - } - - @Override - public DuckDbRead> opt() { - return new Nullable<>(readNullable); - } - } - - final class Nullable implements DuckDbRead> { - final RawRead> readNullable; - - public Nullable(RawRead> readNullable) { - this.readNullable = readNullable; - } - - @Override - public Optional read(ResultSet rs, int col) throws SQLException { - return readNullable.apply(rs, col); - } - - @Override - public DuckDbRead map(SqlFunction, B> f) { - return new Mapped<>(this, f); - } - - @Override - 
public Nullable> opt() { - return new Nullable<>( - (rs, col) -> { - Optional maybeA = readNullable.apply(rs, col); - if (maybeA.isEmpty()) return Optional.empty(); - return Optional.of(maybeA); - }); - } - } - - record Mapped(DuckDbRead underlying, SqlFunction f) implements DuckDbRead { - @Override - public B read(ResultSet rs, int col) throws SQLException { - return f.apply(underlying.read(rs, col)); - } - - @Override - public DuckDbRead map(SqlFunction g) { - return new Mapped<>(this, g); - } - - @Override - public DuckDbRead> opt() { - return new Nullable<>((rs, col) -> Optional.ofNullable(read(rs, col))); - } - } - - static NonNullable castJdbcObjectTo(Class cls) { - return of((rs, i) -> cls.cast(rs.getObject(i))); - } - - static NonNullable getObjectAs(Class cls) { - return of((rs, i) -> rs.getObject(i, cls)); - } - - // Basic type readers - DuckDbRead readString = of(ResultSet::getString); - DuckDbRead readBoolean = of(ResultSet::getBoolean); - DuckDbRead readByte = of(ResultSet::getByte); - DuckDbRead readShort = of(ResultSet::getShort); - DuckDbRead readInteger = of(ResultSet::getInt); - DuckDbRead readLong = of(ResultSet::getLong); - DuckDbRead readFloat = of(ResultSet::getFloat); - DuckDbRead readDouble = of(ResultSet::getDouble); - DuckDbRead readBigDecimal = of(ResultSet::getBigDecimal); - DuckDbRead readByteArray = of(ResultSet::getBytes); - - // BigInteger for HUGEINT/UHUGEINT - DuckDB JDBC returns BigInteger directly - DuckDbRead readBigInteger = castJdbcObjectTo(BigInteger.class); - - // Date/Time readers - DuckDB JDBC has specific return types - DuckDbRead readLocalDate = castJdbcObjectTo(LocalDate.class); - DuckDbRead readLocalTime = castJdbcObjectTo(LocalTime.class); - // DuckDB returns java.sql.Timestamp for TIMESTAMP types - DuckDbRead readLocalDateTime = - of( - (rs, idx) -> { - Object obj = rs.getObject(idx); - if (obj == null) return null; - if (obj instanceof LocalDateTime) return (LocalDateTime) obj; - if (obj instanceof 
java.sql.Timestamp) - return ((java.sql.Timestamp) obj).toLocalDateTime(); - throw new SQLException("Cannot convert " + obj.getClass() + " to LocalDateTime"); - }); - DuckDbRead readOffsetDateTime = - of( - (rs, idx) -> { - Object obj = rs.getObject(idx); - if (obj == null) return null; - if (obj instanceof OffsetDateTime) return (OffsetDateTime) obj; - if (obj instanceof java.sql.Timestamp) { - // DuckDB TIMESTAMPTZ is stored as UTC, returned as Timestamp - return ((java.sql.Timestamp) obj).toLocalDateTime().atOffset(ZoneOffset.UTC); - } - throw new SQLException("Cannot convert " + obj.getClass() + " to OffsetDateTime"); - }); - - // UUID - DuckDB has native UUID support - DuckDbRead readUuid = - of( - (rs, idx) -> { - Object obj = rs.getObject(idx); - if (obj == null) return null; - if (obj instanceof UUID) return (UUID) obj; - if (obj instanceof String) return UUID.fromString((String) obj); - throw new SQLException("Cannot convert " + obj.getClass() + " to UUID"); - }); - - // Interval - DuckDB returns as string in "HH:MM:SS" or "HH:MM:SS.micros" format - DuckDbRead readDuration = - of( - (rs, idx) -> { - String s = rs.getString(idx); - if (s == null) return null; - // DuckDB interval format: "HH:MM:SS" or "HH:MM:SS.micros" for time intervals - // Parse manually since Duration.parse expects PT format - try { - String[] parts = s.split(":"); - if (parts.length >= 3) { - long hours = Long.parseLong(parts[0]); - long minutes = Long.parseLong(parts[1]); - // Handle seconds with potential fractional part - String secPart = parts[2]; - int dotIdx = secPart.indexOf('.'); - long seconds; - long nanos = 0; - if (dotIdx >= 0) { - seconds = Long.parseLong(secPart.substring(0, dotIdx)); - String fracStr = secPart.substring(dotIdx + 1); - // Pad or truncate to 9 digits for nanoseconds - while (fracStr.length() < 9) fracStr += "0"; - if (fracStr.length() > 9) fracStr = fracStr.substring(0, 9); - nanos = Long.parseLong(fracStr); - } else { - seconds = Long.parseLong(secPart); 
- } - return Duration.ofHours(hours) - .plusMinutes(minutes) - .plusSeconds(seconds) - .plusNanos(nanos); - } - // Fallback to ISO 8601 parse - return Duration.parse(s); - } catch (Exception e) { - throw new SQLException("Cannot parse interval: " + s, e); - } - }); - - // BLOB - DuckDB returns as byte[] - DuckDbRead readBlob = - of( - (rs, idx) -> { - java.sql.Blob blob = rs.getBlob(idx); - if (blob == null) return null; - return blob.getBytes(1, (int) blob.length()); - }); - - // BIT type - DuckDB returns as String of 0s and 1s - DuckDbRead readBitString = readString; - - // ==================== Nested Types ==================== - - /** - * Read a LIST/Array column. DuckDB returns org.duckdb.DuckDBArray which implements - * java.sql.Array. The elements are extracted as a Java array. - * - * @param elementClass the Java class of array elements - * @param element type - * @return reader for List of elements - */ - static DuckDbRead> readList(Class elementClass) { - return of( - (rs, idx) -> { - java.sql.Array arr = rs.getArray(idx); - if (arr == null) return null; - Object[] elements = (Object[]) arr.getArray(); - java.util.List result = new java.util.ArrayList<>(elements.length); - for (Object elem : elements) { - @SuppressWarnings("unchecked") - E typedElem = (E) elem; - result.add(typedElem); - } - return result; - }); - } - - /** - * Read a LIST/Array column with element conversion. Use this when DuckDB returns elements in a - * different type than expected (e.g., java.sql.Timestamp instead of LocalDateTime). 
- * - * @param converter function to convert raw element to target type - * @param target element type - * @param wire type (what DuckDB JDBC returns) - * @return reader for List of elements - */ - static DuckDbRead> readListConverted( - java.util.function.Function converter) { - return of( - (rs, idx) -> { - java.sql.Array arr = rs.getArray(idx); - if (arr == null) return null; - Object[] elements = (Object[]) arr.getArray(); - java.util.List result = new java.util.ArrayList<>(elements.length); - for (Object elem : elements) { - @SuppressWarnings("unchecked") - W wireElem = (W) elem; - result.add(converter.apply(wireElem)); - } - return result; - }); - } - - /** - * Read a STRUCT column. DuckDB returns org.duckdb.DuckDBStruct which implements java.sql.Struct. - * Returns a Map of field names to values. - */ - DuckDbRead> readStruct = - of( - (rs, idx) -> { - Object obj = rs.getObject(idx); - if (obj == null) return null; - if (obj instanceof java.sql.Struct) { - java.sql.Struct struct = (java.sql.Struct) obj; - // DuckDB's DuckDBStruct has a toString that shows field info - // The attributes are returned in order, but we can get field names from type name - String typeName = struct.getSQLTypeName(); // e.g. STRUCT("name" VARCHAR, age INTEGER) - Object[] attrs = struct.getAttributes(); - java.util.Map result = new java.util.LinkedHashMap<>(); - // Parse field names from type name - String[] fieldNames = parseStructFieldNames(typeName); - for (int i = 0; i < attrs.length && i < fieldNames.length; i++) { - result.put(fieldNames[i], attrs[i]); - } - return result; - } - throw new SQLException("Cannot convert " + obj.getClass() + " to Struct"); - }); - - /** - * Read a MAP column with typed keys and values. DuckDB returns java.util.HashMap directly with - * proper types. 
- * - * @param keyReader reader for key type - * @param keyClass the Java class of keys - * @param valueReader reader for value type - * @param valueClass the Java class of values - * @param key type - * @param value type - * @return reader for Map - */ - @SuppressWarnings("unchecked") - static DuckDbRead> readMap( - DuckDbRead keyReader, Class keyClass, DuckDbRead valueReader, Class valueClass) { - return of( - (rs, idx) -> { - Object obj = rs.getObject(idx); - if (obj == null) return null; - if (obj instanceof java.util.Map) { - java.util.Map rawMap = (java.util.Map) obj; - java.util.Map result = new java.util.LinkedHashMap<>(); - for (var entry : rawMap.entrySet()) { - K key = keyClass.cast(entry.getKey()); - V value = valueClass.cast(entry.getValue()); - result.put(key, value); - } - return result; - } - throw new SQLException("Cannot convert " + obj.getClass() + " to Map"); - }); - } - - /** - * Read a MAP column using support objects to convert keys and values. DuckDB JDBC returns - * java.util.HashMap, and we use the support objects to convert each key/value pair. - * - * @param keySupport support for key type - * @param valueSupport support for value type - * @param key type - * @param value type - * @return reader for Map - */ - static DuckDbRead> readMapWithSupport( - DuckDbMapSupport keySupport, DuckDbMapSupport valueSupport) { - return of( - (rs, idx) -> { - Object obj = rs.getObject(idx); - if (obj == null) return null; - if (obj instanceof java.util.Map rawMap) { - java.util.Map result = new java.util.LinkedHashMap<>(); - for (var entry : rawMap.entrySet()) { - K key = keySupport.fromMap(entry.getKey()); - V value = valueSupport.fromMap(entry.getValue()); - result.put(key, value); - } - return result; - } - throw new SQLException("Cannot convert " + obj.getClass() + " to Map"); - }); - } - - /** - * Parse field names from a STRUCT type definition. e.g. 
STRUCT("name" VARCHAR, age INTEGER) -> - * ["name", "age"] - */ - private static String[] parseStructFieldNames(String typeName) { - // Simple parser for STRUCT(field1 type1, field2 type2, ...) - if (!typeName.startsWith("STRUCT(") || !typeName.endsWith(")")) { - return new String[0]; - } - String inner = typeName.substring(7, typeName.length() - 1); - java.util.List names = new java.util.ArrayList<>(); - int depth = 0; - StringBuilder current = new StringBuilder(); - for (char c : inner.toCharArray()) { - if (c == '(' || c == '[') depth++; - else if (c == ')' || c == ']') depth--; - else if (c == ',' && depth == 0) { - names.add(extractFieldName(current.toString().trim())); - current = new StringBuilder(); - continue; - } - current.append(c); - } - if (current.length() > 0) { - names.add(extractFieldName(current.toString().trim())); - } - return names.toArray(new String[0]); - } - - private static String extractFieldName(String fieldDef) { - // Field can be: "name" VARCHAR or name VARCHAR - fieldDef = fieldDef.trim(); - if (fieldDef.startsWith("\"")) { - int end = fieldDef.indexOf('"', 1); - if (end > 0) { - return fieldDef.substring(1, end); - } - } - // Find first space - int space = fieldDef.indexOf(' '); - if (space > 0) { - return fieldDef.substring(0, space); - } - return fieldDef; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbStringifier.java b/foundations-jdbc/src/java/dev/typr/foundations/DuckDbStringifier.java deleted file mode 100644 index 83be9ac554..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbStringifier.java +++ /dev/null @@ -1,157 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.math.BigInteger; -import java.time.*; -import java.util.Optional; -import java.util.UUID; -import java.util.function.Function; - -/** - * Stringifies values to DuckDB SQL literal format. - * - *
<p>
This is used for writing complex nested structures (STRUCT, MAP, LIST) via PreparedStatement - * by converting them to SQL literal strings that DuckDB can parse. - * - *
<p>
Examples: - VARCHAR: 'hello' (quoted, escaped) or hello (unquoted for arrays) - INTEGER: 42 - * (always unquoted) - STRUCT: {'name': 'Alice', 'age': 30} - LIST: [1, 2, 3] - MAP: {'key1': - * 'value1', 'key2': 'value2'} - * - *
<p>
The quoted parameter controls whether string-like types include quotes. Use quoted=true for - * STRUCT field values, quoted=false for array/map elements. - */ -public abstract class DuckDbStringifier { - public abstract void unsafeEncode(A a, StringBuilder sb, boolean quoted); - - /** Encode a value to a String. Convenience method that creates a StringBuilder internally. */ - public String encode(A a, boolean quoted) { - StringBuilder sb = new StringBuilder(); - unsafeEncode(a, sb, quoted); - return sb.toString(); - } - - public DuckDbStringifier contramap(Function f) { - var self = this; - return instance((b, sb, quoted) -> self.unsafeEncode(f.apply(b), sb, quoted)); - } - - public DuckDbStringifier> opt() { - var self = this; - return instance( - (a, sb, quoted) -> { - if (a.isPresent()) self.unsafeEncode(a.get(), sb, quoted); - else sb.append("NULL"); - }); - } - - @FunctionalInterface - public interface StringifierFunction { - void accept(A a, StringBuilder sb, boolean quoted); - } - - public static DuckDbStringifier instance(StringifierFunction f) { - return new DuckDbStringifier<>() { - @Override - public void unsafeEncode(A a, StringBuilder sb, boolean quoted) { - f.accept(a, sb, quoted); - } - }; - } - - // ==================== String types (quoted or unquoted) ==================== - - public static final DuckDbStringifier string = - instance( - (s, sb, quoted) -> { - if (quoted) { - sb.append("'"); - sb.append(s.replace("'", "''")); - sb.append("'"); - } else { - sb.append(s); - } - }); - - // ==================== Numeric types (always unquoted) ==================== - - public static final DuckDbStringifier bool = - instance((b, sb, quoted) -> sb.append(b ? 
"true" : "false")); - public static final DuckDbStringifier tinyint = instance((n, sb, quoted) -> sb.append(n)); - public static final DuckDbStringifier smallint = instance((n, sb, quoted) -> sb.append(n)); - public static final DuckDbStringifier integer = - instance((n, sb, quoted) -> sb.append(n)); - public static final DuckDbStringifier bigint = instance((n, sb, quoted) -> sb.append(n)); - public static final DuckDbStringifier hugeint = - instance((n, sb, quoted) -> sb.append(n.toString())); - public static final DuckDbStringifier float4 = instance((n, sb, quoted) -> sb.append(n)); - public static final DuckDbStringifier float8 = instance((n, sb, quoted) -> sb.append(n)); - public static final DuckDbStringifier numeric = - instance((n, sb, quoted) -> sb.append(n.toPlainString())); - - // ==================== Date/Time types (quoted or unquoted) ==================== - - public static final DuckDbStringifier date = - instance( - (d, sb, quoted) -> { - if (quoted) sb.append("'"); - sb.append(d); - if (quoted) sb.append("'"); - }); - - public static final DuckDbStringifier time = - instance( - (t, sb, quoted) -> { - if (quoted) sb.append("'"); - sb.append(t); - if (quoted) sb.append("'"); - }); - - public static final DuckDbStringifier timestamp = - instance( - (ts, sb, quoted) -> { - if (quoted) sb.append("'"); - sb.append(ts); - if (quoted) sb.append("'"); - }); - - public static final DuckDbStringifier timestamptz = - instance( - (ts, sb, quoted) -> { - if (quoted) sb.append("'"); - sb.append(ts); - if (quoted) sb.append("'"); - }); - - public static final DuckDbStringifier interval = - instance( - (d, sb, quoted) -> { - if (quoted) sb.append("'"); - long hours = d.toHours(); - long minutes = d.toMinutesPart(); - long seconds = d.toSecondsPart(); - sb.append(String.format("%02d:%02d:%02d", hours, minutes, seconds)); - if (quoted) sb.append("'"); - }); - - // ==================== UUID (quoted or unquoted) ==================== - - public static final 
DuckDbStringifier uuid = - instance( - (u, sb, quoted) -> { - if (quoted) sb.append("'"); - sb.append(u); - if (quoted) sb.append("'"); - }); - - // ==================== Binary (hex literal, always quoted) ==================== - - public static final DuckDbStringifier blob = - instance( - (bytes, sb, quoted) -> { - sb.append("'\\x"); - for (byte b : bytes) { - sb.append(String.format("%02x", b)); - } - sb.append("'"); - }); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbStruct.java b/foundations-jdbc/src/java/dev/typr/foundations/DuckDbStruct.java deleted file mode 100644 index 4d1bc55f23..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbStruct.java +++ /dev/null @@ -1,236 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.JsonValue; -import java.sql.SQLException; -import java.util.LinkedHashMap; -import java.util.List; -import java.util.Optional; -import java.util.function.Function; - -/** - * DuckDB STRUCT type support. - * - *
<p>
A STRUCT is an ordered sequence of named fields with typed values. Example: STRUCT(name - * VARCHAR, age INTEGER) - * - *
<p>
In Java, we represent a STRUCT as a generated record class with typed fields. This class - * provides the machinery to read/write STRUCTs via JDBC. - * - * @param the Java type representing this STRUCT (typically a generated record) - */ -public record DuckDbStruct( - DuckDbTypename.StructOf typename, - List> fields, - StructReader reader, - StructWriter writer, - DuckDbJson json) { - /** - * A single field in a STRUCT with a getter function. - * - * @param the struct type - * @param the field value type - */ - public record Field(String name, DuckDbType type, Function getter) {} - - /** Functional interface for reading a STRUCT from field values. */ - @FunctionalInterface - public interface StructReader { - A read(Object[] fieldValues) throws SQLException; - } - - /** Functional interface for writing a STRUCT to field values. */ - @FunctionalInterface - public interface StructWriter { - Object[] write(A value); - } - - /** Create a DuckDbType for this STRUCT. */ - public DuckDbType asType() { - DuckDbRead duckDbRead = - DuckDbRead.of( - (rs, idx) -> { - Object obj = rs.getObject(idx); - if (obj == null) return null; - if (obj instanceof java.sql.Struct struct) { - Object[] attrs = struct.getAttributes(); - return reader.read(attrs); - } - throw new SQLException("Expected STRUCT, got: " + obj.getClass()); - }); - - DuckDbWrite duckDbWrite = - new DuckDbWrite.Instance<>( - (ps, idx, str) -> ps.setString(idx, str), - value -> { - // Write as string literal: {'field1': value1, 'field2': value2} - StringBuilder sb = new StringBuilder("{"); - for (int i = 0; i < fields.size(); i++) { - if (i > 0) sb.append(", "); - Field field = fields.get(i); - sb.append("'").append(field.name()).append("': "); - appendFieldValue(sb, value, field); - } - sb.append("}"); - return sb.toString(); - }); - - DuckDbStringifier stringifier = - DuckDbStringifier.instance( - (value, sb, quoted) -> { - sb.append("{"); - for (int i = 0; i < fields.size(); i++) { - if (i > 0) sb.append(", "); - 
Field field = fields.get(i); - sb.append("'").append(field.name()).append("': "); - appendFieldValue(sb, value, field); - } - sb.append("}"); - }); - - return new DuckDbType<>(typename.asGeneric(), duckDbRead, duckDbWrite, stringifier, json); - } - - /** Create an optional version of this STRUCT type. */ - public DuckDbType> asOptType() { - return asType().opt(); - } - - /** Append a field value in DuckDB literal format using DuckDbStringifier. */ - private void appendFieldValue(StringBuilder sb, A structValue, Field field) { - F value = field.getter().apply(structValue); - if (value == null) { - sb.append("NULL"); - return; - } - field.type().stringifier().unsafeEncode(value, sb, true); - } - - // ======================================================================== - // Builder API for creating STRUCT types - // ======================================================================== - - /** - * Create a STRUCT type builder. - * - * @param the struct type (typically a record) - */ - public static Builder builder(String structName) { - return new Builder<>(structName); - } - - public static class Builder { - private final String structName; - private final java.util.List> fields = new java.util.ArrayList<>(); - - Builder(String structName) { - this.structName = structName; - } - - /** - * Add a field with a getter function. DuckDbStringifier is automatically derived from the - * DuckDbType. - * - * @param name the field name in SQL - * @param type the DuckDbType for the field (contains DuckDbStringifier) - * @param getter function to extract field value from struct - */ - public Builder field(String name, DuckDbType type, Function getter) { - fields.add(new Field<>(name, type, getter)); - return this; - } - - /** - * Build the DuckDbStruct with auto-derived writer and JSON codec. - * - *
<p>
This method auto-generates: - Writer: uses getters to extract field values - JSON: - * standard format {"field1": value1, "field2": value2} - * - * @param reader function to construct struct from field values array - */ - public DuckDbStruct build(StructReader reader) { - List typenameFields = - fields.stream() - .map(f -> new DuckDbTypename.StructOf.StructField(f.name(), f.type().typename())) - .toList(); - - DuckDbTypename.StructOf typename = - new DuckDbTypename.StructOf<>(structName, typenameFields); - - // Auto-derive writer from getters - StructWriter writer = - structValue -> { - Object[] values = new Object[fields.size()]; - for (int i = 0; i < fields.size(); i++) { - values[i] = extractFieldValue(fields.get(i), structValue); - } - return values; - }; - - // Auto-derive JSON codec from fields - DuckDbJson json = - new DuckDbJson<>() { - @Override - public JsonValue toJson(A value) { - LinkedHashMap jsonFields = new LinkedHashMap<>(); - for (Field field : fields) { - jsonFields.put(field.name(), fieldToJson(field, value)); - } - return new JsonValue.JObject(jsonFields); - } - - @Override - public A fromJson(JsonValue jsonValue) { - if (jsonValue instanceof JsonValue.JObject obj) { - Object[] values = new Object[fields.size()]; - for (int i = 0; i < fields.size(); i++) { - Field field = fields.get(i); - JsonValue fieldJson = obj.fields().get(field.name()); - values[i] = fieldFromJson(field, fieldJson); - } - try { - return reader.read(values); - } catch (SQLException e) { - throw new RuntimeException("Failed to construct struct from JSON", e); - } - } - throw new IllegalArgumentException("Expected JSON object"); - } - }; - - return new DuckDbStruct<>(typename, List.copyOf(fields), reader, writer, json); - } - - @SuppressWarnings("unchecked") - private Object extractFieldValue(Field field, A structValue) { - return field.getter().apply(structValue); - } - - @SuppressWarnings("unchecked") - private JsonValue fieldToJson(Field field, A structValue) { - F value = 
field.getter().apply(structValue); - return field.type().duckDbJson().toJson(value); - } - - @SuppressWarnings("unchecked") - private Object fieldFromJson(Field field, JsonValue jsonValue) { - return field.type().duckDbJson().fromJson(jsonValue); - } - - /** - * Build with custom reader, writer, and JSON codec. Use this when auto-derivation doesn't work - * for your use case. - */ - public DuckDbStruct build( - StructReader reader, StructWriter writer, DuckDbJson json) { - List typenameFields = - fields.stream() - .map(f -> new DuckDbTypename.StructOf.StructField(f.name(), f.type().typename())) - .toList(); - - DuckDbTypename.StructOf typename = - new DuckDbTypename.StructOf<>(structName, typenameFields); - - return new DuckDbStruct<>(typename, List.copyOf(fields), reader, writer, json); - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbText.java b/foundations-jdbc/src/java/dev/typr/foundations/DuckDbText.java deleted file mode 100644 index 5220434859..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbText.java +++ /dev/null @@ -1,201 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.math.BigInteger; -import java.util.Optional; -import java.util.UUID; -import java.util.function.BiConsumer; -import java.util.function.Function; - -/** - * Text encoder for DuckDB COPY command. - * - *
<p>
Similar to PgText but adapted for DuckDB's text format. DuckDB COPY uses tab-delimited format - * similar to PostgreSQL. - */ -public abstract class DuckDbText implements DbText { - public abstract void unsafeEncode(A a, StringBuilder sb); - - public abstract void unsafeArrayEncode(A a, StringBuilder sb); - - public DuckDbText contramap(Function f) { - var self = this; - return instance( - (b, sb) -> self.unsafeEncode(f.apply(b), sb), - (b, sb) -> self.unsafeArrayEncode(f.apply(b), sb)); - } - - public DuckDbText> opt() { - var self = this; - return instance( - (a, sb) -> { - if (a.isPresent()) self.unsafeEncode(a.get(), sb); - else sb.append(DuckDbText.NULL); - }, - (a, sb) -> { - if (a.isPresent()) self.unsafeArrayEncode(a.get(), sb); - else sb.append(DuckDbText.NULL); - }); - } - - public DuckDbText array() { - var self = this; - return DuckDbText.instance( - (as, sb) -> { - var first = true; - sb.append("["); - for (var a : as) { - if (first) first = false; - else sb.append(','); - self.unsafeArrayEncode(a, sb); - } - sb.append(']'); - }); - } - - public static char DELIMETER = '\t'; - public static String NULL = "\\N"; - - public static DuckDbText instance(BiConsumer f) { - return instance(f, f); - } - - public static DuckDbText instance( - BiConsumer f, BiConsumer arrayF) { - return new DuckDbText<>() { - @Override - public void unsafeEncode(A a, StringBuilder sb) { - f.accept(a, sb); - } - - @Override - public void unsafeArrayEncode(A a, StringBuilder sb) { - arrayF.accept(a, sb); - } - }; - } - - @SuppressWarnings("unchecked") - public static DuckDbText from(RowParser rowParser) { - return instance( - (row, sb) -> { - var encoded = rowParser.encode().apply(row); - for (int i = 0; i < encoded.length; i++) { - if (i > 0) { - sb.append(DuckDbText.DELIMETER); - } - DbText text = (DbText) rowParser.columns().get(i).text(); - text.unsafeEncode(encoded[i], sb); - } - }); - } - - public static DuckDbText instanceToString() { - return 
textString.contramap(Object::toString); - } - - public static final DuckDbText textString = - instance(StringImpl::unsafeEncode, StringImpl::unsafeArrayEncode); - public static final DuckDbText textInteger = - DuckDbText.instance((n, sb) -> sb.append(n)); - public static final DuckDbText textShort = DuckDbText.instance((n, sb) -> sb.append(n)); - public static final DuckDbText textByte = DuckDbText.instance((n, sb) -> sb.append(n)); - public static final DuckDbText textLong = DuckDbText.instance((n, sb) -> sb.append(n)); - public static final DuckDbText textFloat = DuckDbText.instance((n, sb) -> sb.append(n)); - public static final DuckDbText textDouble = DuckDbText.instance((n, sb) -> sb.append(n)); - public static final DuckDbText textBigDecimal = - DuckDbText.instance((n, sb) -> sb.append(n)); - public static final DuckDbText textBigInteger = - DuckDbText.instance((n, sb) -> sb.append(n)); - public static final DuckDbText textBoolean = - DuckDbText.instance((n, sb) -> sb.append(n)); - public static final DuckDbText textUuid = DuckDbText.instance((n, sb) -> sb.append(n)); - public static final DuckDbText textByteArray = - DuckDbText.instance( - (bs, sb) -> { - sb.append("\\\\x"); - if (bs.length > 0) { - var hex = new BigInteger(1, bs).toString(16); - var pad = bs.length * 2 - hex.length(); - sb.append("0".repeat(Math.max(0, pad))); - sb.append(hex); - } - }); - - private interface StringImpl { - // Standard char encodings that don't differ in array context - static void stdChar(char c, StringBuilder sb) { - switch (c) { - case '\b': - sb.append("\\b"); - break; - case '\f': - sb.append("\\f"); - break; - case '\n': - sb.append("\\n"); - break; - case '\r': - sb.append("\\r"); - break; - case '\t': - sb.append("\\t"); - break; - case 0x0b: - sb.append("\\v"); - break; - default: - sb.append(c); - break; - } - } - - static void unsafeEncode(String s, StringBuilder sb) { - for (int i = 0; i < s.length(); i++) { - char c = s.charAt(i); - if (c == '\\') { - 
sb.append("\\\\"); // backslash must be doubled - } else { - stdChar(c, sb); - } - } - } - - static void unsafeArrayEncode(String s, StringBuilder sb) { - sb.append('\''); - for (int i = 0; i < s.length(); i++) { - char c = s.charAt(i); - switch (c) { - case '\'': - sb.append("''"); - break; - case '\\': - sb.append("\\\\"); - break; - default: - stdChar(c, sb); - break; - } - } - sb.append('\''); - } - } - - public static final DuckDbText NotWorking = - new DuckDbText<>() { - @Override - public void unsafeEncode(Object t, StringBuilder sb) { - throw new UnsupportedOperationException("streaming COPY is not supported for this type"); - } - - @Override - public void unsafeArrayEncode(Object t, StringBuilder sb) { - throw new UnsupportedOperationException("streaming COPY is not supported for this type"); - } - }; - - @Deprecated - public static DuckDbText NotWorking() { - return (DuckDbText) NotWorking; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbType.java b/foundations-jdbc/src/java/dev/typr/foundations/DuckDbType.java deleted file mode 100644 index bb9f292320..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbType.java +++ /dev/null @@ -1,503 +0,0 @@ -package dev.typr.foundations; - -import java.util.Optional; -import java.util.function.Function; - -/** - * Combines DuckDB type name, read, write, stringification, and JSON encoding for a type. Similar to - * PgType but for DuckDB. - */ -public record DuckDbType( - DuckDbTypename typename, - DuckDbRead read, - DuckDbWrite write, - DuckDbStringifier stringifier, - DuckDbJson duckDbJson, - DuckDbText duckDbText, - DuckDbMapSupport mapSupport) - implements DbType { - /** Constructor for backwards compatibility - uses cast-based map extraction. 
*/ - public DuckDbType( - DuckDbTypename typename, - DuckDbRead read, - DuckDbWrite write, - DuckDbStringifier stringifier, - DuckDbJson duckDbJson) { - this( - typename, - read, - write, - stringifier, - duckDbJson, - DuckDbText.instance((a, sb) -> stringifier.unsafeEncode(a, sb, false)), - DuckDbMapSupport.cast()); - } - - /** Constructor with custom map extractor. */ - public DuckDbType( - DuckDbTypename typename, - DuckDbRead read, - DuckDbWrite write, - DuckDbStringifier stringifier, - DuckDbJson duckDbJson, - DuckDbMapSupport mapSupport) { - this( - typename, - read, - write, - stringifier, - duckDbJson, - DuckDbText.instance((a, sb) -> stringifier.unsafeEncode(a, sb, false)), - mapSupport); - } - - @Override - public DbText text() { - return duckDbText; - } - - @Override - public DbJson json() { - return duckDbJson; - } - - public Fragment.Value encode(A value) { - return new Fragment.Value<>(value, this); - } - - public DuckDbType withTypename(DuckDbTypename typename) { - return new DuckDbType<>(typename, read, write, stringifier, duckDbJson); - } - - public DuckDbType withTypename(String sqlType) { - return withTypename(DuckDbTypename.of(sqlType)); - } - - public DuckDbType renamed(String value) { - return withTypename(typename.renamed(value)); - } - - public DuckDbType renamedDropPrecision(String value) { - return withTypename(typename.renamedDropPrecision(value)); - } - - public DuckDbType withRead(DuckDbRead read) { - return new DuckDbType<>(typename, read, write, stringifier, duckDbJson); - } - - public DuckDbType withWrite(DuckDbWrite write) { - return new DuckDbType<>(typename, read, write, stringifier, duckDbJson); - } - - public DuckDbType withStringifier(DuckDbStringifier stringifier) { - return new DuckDbType<>(typename, read, write, stringifier, duckDbJson); - } - - public DuckDbType withJson(DuckDbJson json) { - return new DuckDbType<>(typename, read, write, stringifier, json); - } - - public DuckDbType> opt() { - return new DuckDbType<>( - 
typename.opt(), read.opt(), write.opt(typename), stringifier.opt(), duckDbJson.opt()); - } - - /** - * Create an array type from this element type. Uses SQL literal conversion for writing, which - * works for all types. The resulting type can be used with bimap() for custom wrapper types. - * - *
<p>
Note: DuckDB internally uses LIST, but we expose it as Java arrays for consistency with - * PostgreSQL. - * - * @return DuckDbType for array of this element type - */ - @SuppressWarnings("unchecked") - public DuckDbType array() { - DuckDbTypename arrayTypename = typename.array(); - // Read: DuckDB JDBC returns Object[], cast elements - DuckDbRead arrayRead = - DuckDbRead.of( - (rs, idx) -> { - java.sql.Array arr = rs.getArray(idx); - if (arr == null) return null; - Object[] elements = (Object[]) arr.getArray(); - A[] result = (A[]) new Object[elements.length]; - for (int i = 0; i < elements.length; i++) { - result[i] = (A) elements[i]; - } - return result; - }); - // Write: convert array to List, use existing list writer - DuckDbWrite arrayWrite = - DuckDbWrite.writeListViaSqlLiteral(typename.sqlType(), stringifier) - .contramap(arr -> java.util.Arrays.asList(arr)); - DuckDbStringifier arrayStringifier = - DuckDbStringifier.instance( - (arr, sb, quoted) -> { - if (arr.length == 0) { - sb.append("[]"); - return; - } - sb.append("["); - boolean first = true; - for (A elem : arr) { - if (!first) sb.append(", "); - first = false; - stringifier.unsafeEncode(elem, sb, true); - } - sb.append("]"); - }); - // JSON: reuse list codec and convert - DuckDbJson arrayJson = - new DuckDbJson<>() { - private final DuckDbJson> listJson = duckDbJson.list(); - - @Override - public dev.typr.foundations.data.JsonValue toJson(A[] value) { - return listJson.toJson(java.util.Arrays.asList(value)); - } - - @Override - @SuppressWarnings("unchecked") - public A[] fromJson(dev.typr.foundations.data.JsonValue json) { - java.util.List list = listJson.fromJson(json); - return (A[]) list.toArray(); - } - }; - return new DuckDbType<>(arrayTypename, arrayRead, arrayWrite, arrayStringifier, arrayJson); - } - - /** - * Create a MAP type from this key type and a value type. Uses the mapSupport to convert - * keys/values when reading from DuckDB JDBC, and SQL literal conversion for writing. 
- * - * @param valueType the value type for the map - * @param the value type - * @return DuckDbType for Map with this type as keys and valueType as values - */ - public DuckDbType> mapTo(DuckDbType valueType) { - DuckDbTypename> mapTypename = typename.mapTo(valueType.typename); - String sqlType = mapTypename.sqlType(); - DuckDbRead> mapRead = - DuckDbRead.readMapWithSupport(mapSupport, valueType.mapSupport); - DuckDbWrite> mapWrite = - DuckDbWrite.writeMapViaSqlLiteral(sqlType, stringifier, valueType.stringifier); - DuckDbStringifier> mapStringifier = - DuckDbStringifier.instance( - (map, sb, quoted) -> { - if (map.isEmpty()) { - sb.append("{}"); - return; - } - sb.append("{"); - boolean first = true; - for (var entry : map.entrySet()) { - if (!first) sb.append(", "); - first = false; - stringifier.unsafeEncode(entry.getKey(), sb, true); - sb.append(": "); - valueType.stringifier.unsafeEncode(entry.getValue(), sb, true); - } - sb.append("}"); - }); - return new DuckDbType<>( - mapTypename, - mapRead, - mapWrite, - mapStringifier, - DuckDbTypes.mapJson(duckDbJson, valueType.duckDbJson)); - } - - /** - * Create a variable-length LIST type from this element type using native JNI array writing. LIST - * is ALWAYS variable-length - each row can have a different number of elements. Use this for - * types that DuckDB JNI handles natively: Boolean, Byte, Short, Integer, Long, Float, Double, - * String. - * - *
<p>
For fixed-length arrays (e.g., embedding vectors), use arrayNative() instead. - * - * @param elementClass the Java class of elements (needed for JDBC reading) - * @param toArray function to create typed array for DuckDBUserArray - * @return DuckDbType for variable-length List of this element type - */ - public DuckDbType> listNative( - Class elementClass, java.util.function.IntFunction toArray) { - DuckDbTypename> listTypename = typename.list(); - DuckDbRead> listRead = DuckDbRead.readList(elementClass); - DuckDbWrite> listWrite = DuckDbWrite.writeList(typename.sqlType(), toArray); - DuckDbStringifier> listStringifier = - DuckDbStringifier.instance( - (list, sb, quoted) -> { - if (list.isEmpty()) { - sb.append("[]"); - return; - } - sb.append("["); - boolean first = true; - for (A elem : list) { - if (!first) sb.append(", "); - first = false; - stringifier.unsafeEncode(elem, sb, true); - } - sb.append("]"); - }); - return new DuckDbType<>(listTypename, listRead, listWrite, listStringifier, duckDbJson.list()); - } - - /** - * Create a variable-length LIST type from this element type using SQL literal string conversion - * for writing. LIST is ALWAYS variable-length - each row can have a different number of elements. - * Use this for types that DuckDB JNI doesn't handle natively: UUID, LocalTime, LocalDate, - * LocalDateTime, OffsetDateTime, BigDecimal, BigInteger, Duration. - * - *

For fixed-length arrays (e.g., embedding vectors), use arrayViaSqlLiteral() instead. - * - * @param elementClass the Java class of elements (what DuckDB JDBC returns) - * @param elementStringifier how to format elements as SQL literals - * @return DuckDbType for variable-length List of this element type - */ - public DuckDbType> listViaSqlLiteral( - Class elementClass, DuckDbStringifier elementStringifier) { - DuckDbTypename> listTypename = typename.list(); - DuckDbRead> listRead = DuckDbRead.readList(elementClass); - DuckDbWrite> listWrite = - DuckDbWrite.writeListViaSqlLiteral(typename.sqlType(), elementStringifier); - DuckDbStringifier> listStringifier = - DuckDbStringifier.instance( - (list, sb, quoted) -> { - if (list.isEmpty()) { - sb.append("[]"); - return; - } - sb.append("["); - boolean first = true; - for (A elem : list) { - if (!first) sb.append(", "); - first = false; - elementStringifier.unsafeEncode(elem, sb, true); - } - sb.append("]"); - }); - return new DuckDbType<>(listTypename, listRead, listWrite, listStringifier, duckDbJson.list()); - } - - /** - * Create a LIST type from this element type using SQL literal string conversion for writing, with - * a converter for reading when DuckDB JDBC returns a different type than expected. - * - *

Use this when the wire type (what DuckDB JDBC returns in arrays) differs from the target - * type. For example, TIMESTAMP[] returns java.sql.Timestamp elements, not LocalDateTime. - * - * @param wireClass the Java class that DuckDB JDBC actually returns in arrays - * @param wireToElement function to convert wire type to target element type - * @param elementStringifier how to format elements as SQL literals - * @param wire type - * @return DuckDbType for List of this element type - */ - public DuckDbType> listViaSqlLiteral( - Class wireClass, - java.util.function.Function wireToElement, - DuckDbStringifier elementStringifier) { - DuckDbTypename> listTypename = typename.list(); - DuckDbRead> listRead = DuckDbRead.readListConverted(wireToElement); - DuckDbWrite> listWrite = - DuckDbWrite.writeListViaSqlLiteral(typename.sqlType(), elementStringifier); - DuckDbStringifier> listStringifier = - DuckDbStringifier.instance( - (list, sb, quoted) -> { - if (list.isEmpty()) { - sb.append("[]"); - return; - } - sb.append("["); - boolean first = true; - for (A elem : list) { - if (!first) sb.append(", "); - first = false; - elementStringifier.unsafeEncode(elem, sb, true); - } - sb.append("]"); - }); - return new DuckDbType<>(listTypename, listRead, listWrite, listStringifier, duckDbJson.list()); - } - - /** - * Create a fixed-size ARRAY type from this element type using native JNI array writing. ARRAY is - * ALWAYS fixed-length - every row must have exactly 'size' elements. Use this for embedding - * vectors (word embeddings, image embeddings) and other fixed-size arrays. - * - *

For variable-length lists, use listNative() instead. - * - *

From JDBC perspective, ARRAY and LIST are identical - both return DuckDBArray. The - * difference is only in the typename (includes size) and database-side validation. - * - * @param size the fixed size of the array (every row must have exactly this many elements) - * @param elementClass the Java class of elements (needed for JDBC reading) - * @param toArray function to create typed array for DuckDBUserArray - * @return DuckDbType for fixed-size array of this element type - */ - public DuckDbType> arrayNative( - int size, Class elementClass, java.util.function.IntFunction toArray) { - DuckDbTypename> arrayTypename = new DuckDbTypename.ArrayOf<>(typename, size); - DuckDbRead> arrayRead = DuckDbRead.readList(elementClass); - DuckDbWrite> arrayWrite = DuckDbWrite.writeList(typename.sqlType(), toArray); - DuckDbStringifier> arrayStringifier = - DuckDbStringifier.instance( - (list, sb, quoted) -> { - if (list.isEmpty()) { - sb.append("[]"); - return; - } - sb.append("["); - boolean first = true; - for (A elem : list) { - if (!first) sb.append(", "); - first = false; - stringifier.unsafeEncode(elem, sb, true); - } - sb.append("]"); - }); - return new DuckDbType<>( - arrayTypename, arrayRead, arrayWrite, arrayStringifier, duckDbJson.list()); - } - - public DuckDbType> arrayViaSqlLiteral( - int size, Class elementClass, DuckDbStringifier elementStringifier) { - DuckDbTypename> arrayTypename = new DuckDbTypename.ArrayOf<>(typename, size); - DuckDbRead> arrayRead = DuckDbRead.readList(elementClass); - DuckDbWrite> arrayWrite = - DuckDbWrite.writeListViaSqlLiteral(typename.sqlType(), elementStringifier); - DuckDbStringifier> arrayStringifier = - DuckDbStringifier.instance( - (list, sb, quoted) -> { - if (list.isEmpty()) { - sb.append("[]"); - return; - } - sb.append("["); - boolean first = true; - for (A elem : list) { - if (!first) sb.append(", "); - first = false; - elementStringifier.unsafeEncode(elem, sb, true); - } - sb.append("]"); - }); - return new DuckDbType<>( - 
arrayTypename, arrayRead, arrayWrite, arrayStringifier, duckDbJson.list()); - } - - public DuckDbType> mapToNative( - DuckDbType valueType, Class keyClass, Class valueClass) { - DuckDbTypename> mapTypename = typename.mapTo(valueType.typename); - String sqlType = mapTypename.sqlType(); - DuckDbRead> mapRead = - DuckDbRead.readMap(read, keyClass, valueType.read, valueClass); - DuckDbWrite> mapWrite = DuckDbWrite.writeMap(sqlType); - DuckDbStringifier> mapStringifier = - DuckDbStringifier.instance( - (map, sb, quoted) -> { - if (map.isEmpty()) { - sb.append("{}"); - return; - } - sb.append("{"); - boolean first = true; - for (var entry : map.entrySet()) { - if (!first) sb.append(", "); - first = false; - stringifier.unsafeEncode(entry.getKey(), sb, true); - sb.append(": "); - valueType.stringifier.unsafeEncode(entry.getValue(), sb, true); - } - sb.append("}"); - }); - return new DuckDbType<>( - mapTypename, - mapRead, - mapWrite, - mapStringifier, - DuckDbTypes.mapJson(duckDbJson, valueType.duckDbJson)); - } - - public DuckDbType> mapToViaSqlLiteral( - DuckDbType valueType, - Class keyClass, - Class valueClass, - DuckDbStringifier keyStringifier, - DuckDbStringifier valueStringifier) { - DuckDbTypename> mapTypename = typename.mapTo(valueType.typename); - String sqlType = mapTypename.sqlType(); - DuckDbRead> mapRead = - DuckDbRead.readMap(read, keyClass, valueType.read, valueClass); - DuckDbWrite> mapWrite = - DuckDbWrite.writeMapViaSqlLiteral(sqlType, keyStringifier, valueStringifier); - DuckDbStringifier> mapStringifier = - DuckDbStringifier.instance( - (map, sb, quoted) -> { - if (map.isEmpty()) { - sb.append("{}"); - return; - } - sb.append("{"); - boolean first = true; - for (var entry : map.entrySet()) { - if (!first) sb.append(", "); - first = false; - keyStringifier.unsafeEncode(entry.getKey(), sb, true); - sb.append(": "); - valueStringifier.unsafeEncode(entry.getValue(), sb, true); - } - sb.append("}"); - }); - return new DuckDbType<>( - mapTypename, - 
mapRead, - mapWrite, - mapStringifier, - DuckDbTypes.mapJson(duckDbJson, valueType.duckDbJson)); - } - - public DuckDbType bimap(SqlFunction f, Function g) { - return new DuckDbType<>( - typename.as(), - read.map(f), - write.contramap(g), - stringifier.contramap(g), - duckDbJson.bimap(f, g), - mapSupport.bimap( - a -> { - try { - return f.apply(a); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - }, - g)); - } - - @Override - public DuckDbType to(dev.typr.foundations.dsl.Bijection bijection) { - return new DuckDbType<>( - typename.as(), - read.map(bijection::underlying), - write.contramap(bijection::from), - stringifier.contramap(bijection::from), - duckDbJson.bimap(bijection::underlying, bijection::from), - mapSupport.bimap(bijection::underlying, bijection::from)); - } - - public static DuckDbType of( - String tpe, DuckDbRead r, DuckDbWrite w, DuckDbStringifier s, DuckDbJson j) { - return new DuckDbType<>(DuckDbTypename.of(tpe), r, w, s, j); - } - - public static DuckDbType of( - DuckDbTypename typename, - DuckDbRead r, - DuckDbWrite w, - DuckDbStringifier s, - DuckDbJson j) { - return new DuckDbType<>(typename, r, w, s, j); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbTypename.java b/foundations-jdbc/src/java/dev/typr/foundations/DuckDbTypename.java deleted file mode 100644 index 373724403e..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbTypename.java +++ /dev/null @@ -1,379 +0,0 @@ -package dev.typr.foundations; - -import java.util.Optional; - -/** - * Describes the SQL type name for DuckDB types. DuckDB has a rich type system including composite - * types like LIST, MAP, etc. This is a sealed interface with specific implementations for each type - * structure. - */ -public sealed interface DuckDbTypename extends DbTypename { - @Override - String sqlType(); - - /** - * Create a LIST type from this element type. For example, VARCHAR becomes VARCHAR[] (or - * LIST(VARCHAR)). 
- */ - DuckDbTypename> list(); - - /** - * Create an array type from this element type (for Java array compatibility). Uses the same SQL - * representation as list() but with Java arrays. - */ - DuckDbTypename array(); - - /** - * Create a MAP type from this key type and another value type. For example, - * VARCHAR.mapTo(INTEGER) becomes MAP(VARCHAR, INTEGER). - */ - DuckDbTypename> mapTo(DuckDbTypename valueType); - - /** Rename the type (for aliases like TEXT -> VARCHAR). */ - DuckDbTypename renamed(String newName); - - /** Rename the type, dropping any precision/scale. */ - DuckDbTypename renamedDropPrecision(String newName); - - /** Create an Optional variant of this type. */ - default DuckDbTypename> opt() { - return new Opt<>(this); - } - - /** Cast to a different Java type (unsafe but necessary for type composition). */ - @SuppressWarnings("unchecked") - default DuckDbTypename as() { - return (DuckDbTypename) this; - } - - // ==================== Implementations ==================== - - /** Base type with optional precision and scale. 
Examples: VARCHAR, DECIMAL(10,2), INTEGER */ - record Base(String baseType, Optional precision, Optional scale) - implements DuckDbTypename { - public Base(String baseType) { - this(baseType, Optional.empty(), Optional.empty()); - } - - @Override - public String sqlType() { - if (precision.isPresent() && scale.isPresent()) { - return baseType + "(" + precision.get() + ", " + scale.get() + ")"; - } else if (precision.isPresent()) { - return baseType + "(" + precision.get() + ")"; - } - return baseType; - } - - @Override - public DuckDbTypename> list() { - return new ListOf<>(this); - } - - @Override - public DuckDbTypename array() { - return new ListOf<>(this).as(); - } - - @Override - public DuckDbTypename> mapTo(DuckDbTypename valueType) { - return new MapOf<>(this, valueType); - } - - @Override - public DuckDbTypename renamed(String newName) { - return new Base<>(newName, precision, scale); - } - - @Override - public DuckDbTypename renamedDropPrecision(String newName) { - return new Base<>(newName); - } - } - - /** - * LIST type wrapping an element type. LIST is ALWAYS variable-length. Rendered as - * "element_type[]" (e.g., VARCHAR[] or INTEGER[]) - * - *

Use ARRAY for fixed-length arrays (e.g., embedding vectors). - */ - record ListOf(DuckDbTypename elementType) implements DuckDbTypename> { - @Override - public String sqlType() { - return elementType.sqlType() + "[]"; - } - - @Override - public DuckDbTypename>> list() { - // Nested list: LIST(LIST(A)) becomes A[][] - return new ListOf<>(this); - } - - @Override - public DuckDbTypename[]> array() { - return new ListOf<>(this).as(); - } - - @Override - public DuckDbTypename, V>> mapTo( - DuckDbTypename valueType) { - return new MapOf<>(this, valueType); - } - - @Override - public DuckDbTypename> renamed(String newName) { - return new ListOf<>(elementType.renamed(newName)); - } - - @Override - public DuckDbTypename> renamedDropPrecision(String newName) { - return new ListOf<>(elementType.renamedDropPrecision(newName)); - } - } - - /** - * MAP type with key and value types. Rendered as "MAP(key_type, value_type)" (e.g., MAP(VARCHAR, - * INTEGER)) - */ - record MapOf(DuckDbTypename keyType, DuckDbTypename valueType) - implements DuckDbTypename> { - @Override - public String sqlType() { - return "MAP(" + keyType.sqlType() + ", " + valueType.sqlType() + ")"; - } - - @Override - public DuckDbTypename>> list() { - return new ListOf<>(this); - } - - @Override - public DuckDbTypename[]> array() { - return new ListOf<>(this).as(); - } - - @Override - public DuckDbTypename, V2>> mapTo( - DuckDbTypename valueType2) { - // MAP as a key is unusual but valid - return new MapOf<>(this, valueType2); - } - - @Override - public DuckDbTypename> renamed(String newName) { - // For MAP, renaming affects the key type - return new MapOf<>(keyType.renamed(newName), valueType); - } - - @Override - public DuckDbTypename> renamedDropPrecision(String newName) { - return new MapOf<>(keyType.renamedDropPrecision(newName), valueType); - } - } - - /** - * Fixed-size ARRAY type wrapping an element type. ARRAY is ALWAYS fixed-length. 
Rendered as - * "element_type[size]" (e.g., FLOAT[3] for 3D vectors) - * - *

Use LIST for variable-length arrays. - * - *

From DuckDB docs: "All fields in the column must have the same length." Typically used for - * embedding vectors (word embeddings, image embeddings). - */ - record ArrayOf(DuckDbTypename elementType, int size) - implements DuckDbTypename> { - @Override - public String sqlType() { - return elementType.sqlType() + "[" + size + "]"; - } - - @Override - public DuckDbTypename>> list() { - return new ListOf<>(this); - } - - @Override - public DuckDbTypename[]> array() { - return new ListOf<>(this).as(); - } - - @Override - public DuckDbTypename, V>> mapTo( - DuckDbTypename valueType) { - return new MapOf<>(this, valueType); - } - - @Override - public DuckDbTypename> renamed(String newName) { - return new ArrayOf<>(elementType.renamed(newName), size); - } - - @Override - public DuckDbTypename> renamedDropPrecision(String newName) { - return new ArrayOf<>(elementType.renamedDropPrecision(newName), size); - } - } - - /** STRUCT type with named fields. Rendered as "STRUCT(field1 type1, field2 type2, ...)" */ - record StructOf(String name, java.util.List fields) implements DuckDbTypename { - public record StructField(String name, DuckDbTypename type) {} - - @Override - public String sqlType() { - StringBuilder sb = new StringBuilder("STRUCT("); - for (int i = 0; i < fields.size(); i++) { - if (i > 0) sb.append(", "); - StructField f = fields.get(i); - // Quote field names that need it - if (needsQuoting(f.name())) { - sb.append("\"").append(f.name()).append("\""); - } else { - sb.append(f.name()); - } - sb.append(" ").append(f.type().sqlType()); - } - sb.append(")"); - return sb.toString(); - } - - private static boolean needsQuoting(String name) { - // Quote if contains special chars or starts with digit - if (name.isEmpty()) return true; - char first = name.charAt(0); - if (Character.isDigit(first)) return true; - for (char c : name.toCharArray()) { - if (!Character.isLetterOrDigit(c) && c != '_') return true; - } - return false; - } - - @Override - public DuckDbTypename> 
list() { - return new ListOf<>(this); - } - - @Override - public DuckDbTypename array() { - return new ListOf<>(this).as(); - } - - @Override - public DuckDbTypename> mapTo(DuckDbTypename valueType) { - return new MapOf<>(this, valueType); - } - - @Override - public DuckDbTypename renamed(String newName) { - return new StructOf<>(newName, fields); - } - - @Override - public DuckDbTypename renamedDropPrecision(String newName) { - return new StructOf<>(newName, fields); - } - - /** Get the generic form (loses the name). */ - @SuppressWarnings("unchecked") - public DuckDbTypename asGeneric() { - return (DuckDbTypename) this; - } - } - - /** UNION type with tagged alternatives. Rendered as "UNION(tag1 type1, tag2 type2, ...)" */ - record UnionOf(String name, java.util.List members) implements DuckDbTypename { - public record UnionMember(String tag, DuckDbTypename type) {} - - @Override - public String sqlType() { - StringBuilder sb = new StringBuilder("UNION("); - for (int i = 0; i < members.size(); i++) { - if (i > 0) sb.append(", "); - UnionMember m = members.get(i); - sb.append(m.tag()).append(" ").append(m.type().sqlType()); - } - sb.append(")"); - return sb.toString(); - } - - @Override - public DuckDbTypename> list() { - return new ListOf<>(this); - } - - @Override - public DuckDbTypename array() { - return new ListOf<>(this).as(); - } - - @Override - public DuckDbTypename> mapTo(DuckDbTypename valueType) { - return new MapOf<>(this, valueType); - } - - @Override - public DuckDbTypename renamed(String newName) { - return new UnionOf<>(newName, members); - } - - @Override - public DuckDbTypename renamedDropPrecision(String newName) { - return new UnionOf<>(newName, members); - } - - /** Get the generic form (loses the name). */ - @SuppressWarnings("unchecked") - public DuckDbTypename asGeneric() { - return (DuckDbTypename) this; - } - } - - /** Optional wrapper (nullable type). The SQL type is the same as the underlying type. 
*/ - record Opt(DuckDbTypename of) implements DuckDbTypename> { - @Override - public String sqlType() { - return of.sqlType(); - } - - @Override - public DuckDbTypename>> list() { - return new ListOf<>(this); - } - - @Override - public DuckDbTypename[]> array() { - return new ListOf<>(this).as(); - } - - @Override - public DuckDbTypename, V>> mapTo(DuckDbTypename valueType) { - return new MapOf<>(this, valueType); - } - - @Override - public DuckDbTypename> renamed(String newName) { - return new Opt<>(of.renamed(newName)); - } - - @Override - public DuckDbTypename> renamedDropPrecision(String newName) { - return new Opt<>(of.renamedDropPrecision(newName)); - } - } - - // ==================== Factory Methods ==================== - - /** Create a base type with no precision/scale. */ - static DuckDbTypename of(String sqlType) { - return new Base<>(sqlType); - } - - /** Create a base type with precision (e.g., VARCHAR(255)). */ - static DuckDbTypename of(String sqlType, int precision) { - return new Base<>(sqlType, Optional.of(precision), Optional.empty()); - } - - /** Create a base type with precision and scale (e.g., DECIMAL(10, 2)). 
*/ - static DuckDbTypename of(String sqlType, int precision, int scale) { - return new Base<>(sqlType, Optional.of(precision), Optional.of(scale)); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbTypes.java b/foundations-jdbc/src/java/dev/typr/foundations/DuckDbTypes.java deleted file mode 100644 index 305b5eb252..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbTypes.java +++ /dev/null @@ -1,544 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.Json; -import dev.typr.foundations.data.JsonValue; -import dev.typr.foundations.data.Uint1; -import dev.typr.foundations.data.Uint2; -import dev.typr.foundations.data.Uint4; -import dev.typr.foundations.data.Uint8; -import java.math.BigDecimal; -import java.math.BigInteger; -import java.time.*; -import java.util.UUID; -import java.util.function.Function; - -/** - * DuckDB type definitions for the typr-runtime-java library. - * - *

DuckDB has a rich type system including: - Standard SQL types (INTEGER, VARCHAR, etc.) - - * Extended integer types (HUGEINT, UHUGEINT, UTINYINT, etc.) - Nested types (LIST, STRUCT, MAP, - * UNION) - Temporal types with various precisions - */ -public interface DuckDbTypes { - // ==================== Integer Types (Signed) ==================== - - DuckDbType tinyint = - DuckDbType.of( - "TINYINT", - DuckDbRead.readByte, - DuckDbWrite.writeByte, - DuckDbStringifier.tinyint, - DuckDbJson.int1); - - DuckDbType smallint = - DuckDbType.of( - "SMALLINT", - DuckDbRead.readShort, - DuckDbWrite.writeShort, - DuckDbStringifier.smallint, - DuckDbJson.int2); - - DuckDbType integer = - DuckDbType.of( - "INTEGER", - DuckDbRead.readInteger, - DuckDbWrite.writeInteger, - DuckDbStringifier.integer, - DuckDbJson.int4); - - DuckDbType bigint = - DuckDbType.of( - "BIGINT", - DuckDbRead.readLong, - DuckDbWrite.writeLong, - DuckDbStringifier.bigint, - DuckDbJson.int8); - - // HUGEINT: 128-bit signed integer (-170141183460469231731687303715884105728 to - // 170141183460469231731687303715884105727) - DuckDbType hugeint = - DuckDbType.of( - "HUGEINT", - DuckDbRead.readBigInteger, - DuckDbWrite.writeBigInteger, - DuckDbStringifier.hugeint, - DuckDbJson.hugeint); - - // ==================== Integer Types (Unsigned) ==================== - - // UTINYINT: 0-255, wrapped in Uint1 - DuckDbType utinyint = - DuckDbType.of( - "UTINYINT", - DuckDbRead.readShort.map(Uint1::new), - DuckDbWrite.writeShort.contramap(Uint1::value), - DuckDbStringifier.smallint.contramap(Uint1::value), - DuckDbJson.int2.bimap(Uint1::new, Uint1::value)); - - // USMALLINT: 0-65535, wrapped in Uint2 - DuckDbType usmallint = - DuckDbType.of( - "USMALLINT", - DuckDbRead.readInteger.map(Uint2::new), - DuckDbWrite.writeInteger.contramap(Uint2::value), - DuckDbStringifier.integer.contramap(Uint2::value), - DuckDbJson.int4.bimap(Uint2::new, Uint2::value)); - - // UINTEGER: 0-4294967295, wrapped in Uint4 - DuckDbType uinteger = - 
DuckDbType.of( - "UINTEGER", - DuckDbRead.readLong.map(Uint4::new), - DuckDbWrite.writeLong.contramap(Uint4::value), - DuckDbStringifier.bigint.contramap(Uint4::value), - DuckDbJson.int8.bimap(Uint4::new, Uint4::value)); - - // UBIGINT: 0-18446744073709551615, wrapped in Uint8 - DuckDbType ubigint = - DuckDbType.of( - "UBIGINT", - DuckDbRead.readBigInteger.map(Uint8::of), - DuckDbWrite.writeBigInteger.contramap(Uint8::value), - DuckDbStringifier.hugeint.contramap(Uint8::value), - DuckDbJson.hugeint.bimap(Uint8::of, Uint8::value)); - - // UHUGEINT: 128-bit unsigned integer, needs BigInteger - DuckDbType uhugeint = - DuckDbType.of( - "UHUGEINT", - DuckDbRead.readBigInteger, - DuckDbWrite.writeBigInteger, - DuckDbStringifier.hugeint, - DuckDbJson.hugeint); - - // ==================== Floating-Point Types ==================== - - DuckDbType float_ = - DuckDbType.of( - "FLOAT", - DuckDbRead.readFloat, - DuckDbWrite.writeFloat, - DuckDbStringifier.float4, - DuckDbJson.float4); - - DuckDbType double_ = - DuckDbType.of( - "DOUBLE", - DuckDbRead.readDouble, - DuckDbWrite.writeDouble, - DuckDbStringifier.float8, - DuckDbJson.float8); - - // Aliases - DuckDbType real = float_.renamed("REAL"); - DuckDbType float4 = float_.renamed("FLOAT4"); - DuckDbType float8 = double_.renamed("FLOAT8"); - - // ==================== Fixed-Point Types ==================== - - DuckDbType decimal = - DuckDbType.of( - "DECIMAL", - DuckDbRead.readBigDecimal, - DuckDbWrite.writeBigDecimal, - DuckDbStringifier.numeric, - DuckDbJson.numeric); - - DuckDbType numeric = decimal.renamed("NUMERIC"); - - static DuckDbType decimal(int precision, int scale) { - return DuckDbType.of( - DuckDbTypename.of("DECIMAL", precision, scale), - DuckDbRead.readBigDecimal, - DuckDbWrite.writeBigDecimal, - DuckDbStringifier.numeric, - DuckDbJson.numeric); - } - - // ==================== Boolean Type ==================== - - DuckDbType boolean_ = - DuckDbType.of( - "BOOLEAN", - DuckDbRead.readBoolean, - 
DuckDbWrite.writeBoolean, - DuckDbStringifier.bool, - DuckDbJson.bool); - - DuckDbType bool = boolean_.renamed("BOOL"); - - // ==================== String Types ==================== - - DuckDbType varchar = - DuckDbType.of( - "VARCHAR", - DuckDbRead.readString, - DuckDbWrite.writeString, - DuckDbStringifier.string, - DuckDbJson.text); - - DuckDbType text = varchar.renamed("TEXT"); - DuckDbType string = varchar.renamed("STRING"); - DuckDbType char_ = varchar.renamed("CHAR"); - DuckDbType bpchar = varchar.renamed("BPCHAR"); - - static DuckDbType varchar(int length) { - return DuckDbType.of( - DuckDbTypename.of("VARCHAR", length), - DuckDbRead.readString, - DuckDbWrite.writeString, - DuckDbStringifier.string, - DuckDbJson.text); - } - - static DuckDbType char_(int length) { - return DuckDbType.of( - DuckDbTypename.of("CHAR", length), - DuckDbRead.readString, - DuckDbWrite.writeString, - DuckDbStringifier.string, - DuckDbJson.text); - } - - // ==================== Binary Types ==================== - - DuckDbType blob = - DuckDbType.of( - "BLOB", - DuckDbRead.readByteArray, - DuckDbWrite.writeByteArray, - DuckDbStringifier.blob, - DuckDbJson.blob); - - DuckDbType bytea = blob.renamed("BYTEA"); - DuckDbType binary = blob.renamed("BINARY"); - DuckDbType varbinary = blob.renamed("VARBINARY"); - - // ==================== Bit String Type ==================== - - // BIT type - stored as string of 0s and 1s - DuckDbType bit = - DuckDbType.of( - "BIT", - DuckDbRead.readBitString, - DuckDbWrite.writeString, - DuckDbStringifier.string, - DuckDbJson.bit); - - DuckDbType bitstring = bit.renamed("BITSTRING"); - - static DuckDbType bit(int length) { - return DuckDbType.of( - DuckDbTypename.of("BIT", length), - DuckDbRead.readBitString, - DuckDbWrite.writeString, - DuckDbStringifier.string, - DuckDbJson.bit); - } - - // ==================== Date/Time Types ==================== - - DuckDbType date = - DuckDbType.of( - "DATE", - DuckDbRead.readLocalDate, - 
DuckDbWrite.passObjectToJdbc(), - DuckDbStringifier.date, - DuckDbJson.date); - - DuckDbType time = - DuckDbType.of( - "TIME", - DuckDbRead.readLocalTime, - DuckDbWrite.writeLocalTime, - DuckDbStringifier.time, - DuckDbJson.time); - - DuckDbType timestamp = - DuckDbType.of( - "TIMESTAMP", - DuckDbRead.readLocalDateTime, - DuckDbWrite.passObjectToJdbc(), - DuckDbStringifier.timestamp, - DuckDbJson.timestamp); - - DuckDbType datetime = timestamp.renamed("DATETIME"); - - // Timestamp with timezone - DuckDbType timestamptz = - DuckDbType.of( - "TIMESTAMP WITH TIME ZONE", - DuckDbRead.readOffsetDateTime, - DuckDbWrite.passObjectToJdbc(), - DuckDbStringifier.timestamptz, - DuckDbJson.timestamptz); - - // Time with timezone - represented as OffsetTime - DuckDbType timetz = - DuckDbType.of( - "TIME WITH TIME ZONE", - DuckDbRead.readOffsetDateTime, - DuckDbWrite.passObjectToJdbc(), - DuckDbStringifier.timestamptz, - DuckDbJson.timestamptz); - - // Timestamp variants with different precisions - DuckDbType timestamp_s = timestamp.renamed("TIMESTAMP_S"); - DuckDbType timestamp_ms = timestamp.renamed("TIMESTAMP_MS"); - DuckDbType timestamp_ns = timestamp.renamed("TIMESTAMP_NS"); - - // ==================== Interval Type ==================== - - DuckDbType interval = - DuckDbType.of( - "INTERVAL", - DuckDbRead.readDuration, - DuckDbWrite.writeDuration, - DuckDbStringifier.interval, - DuckDbJson.interval); - - // ==================== UUID Type ==================== - - DuckDbType uuid = - DuckDbType.of( - "UUID", - DuckDbRead.readUuid, - DuckDbWrite.writeUuid, - DuckDbStringifier.uuid, - DuckDbJson.uuid); - - // ==================== JSON Type ==================== - - DuckDbType json = - DuckDbType.of( - "JSON", - DuckDbRead.readString.map(Json::new), - DuckDbWrite.writeString.contramap(Json::value), - DuckDbStringifier.string.contramap(Json::value), - DuckDbJson.json); - - // ==================== Enum Type ==================== - - /** - * Create a DuckDbType for ENUM columns. 
DuckDB ENUMs are read/written as strings. - * - * @param enumTypeName the name of the enum type (e.g., "mood" for CREATE TYPE mood AS - * ENUM('happy', 'sad')) - * @param fromString function to convert string to enum value - * @param the enum type - * @return DuckDbType for the enum - */ - static > DuckDbType ofEnum( - String enumTypeName, Function fromString) { - return DuckDbType.of( - enumTypeName, - DuckDbRead.readString.map(fromString::apply), - DuckDbWrite.writeString.contramap(Enum::name), - DuckDbStringifier.string.contramap(Enum::name), - DuckDbJson.text.bimap(fromString::apply, Enum::name)); - } - - // ==================== Array Types ==================== - - /** TINYINT[] - array of tinyint values */ - DuckDbType tinyintArray = tinyint.array(); - - /** SMALLINT[] - array of smallint values */ - DuckDbType smallintArray = smallint.array(); - - /** INTEGER[] - array of integer values */ - DuckDbType integerArray = integer.array(); - - /** BIGINT[] - array of bigint values */ - DuckDbType bigintArray = bigint.array(); - - /** HUGEINT[] - array of hugeint values */ - DuckDbType hugeintArray = hugeint.array(); - - /** UTINYINT[] - array of utinyint values */ - DuckDbType utinyintArray = utinyint.array(); - - /** USMALLINT[] - array of usmallint values */ - DuckDbType usmallintArray = usmallint.array(); - - /** UINTEGER[] - array of uinteger values */ - DuckDbType uintegerArray = uinteger.array(); - - /** UBIGINT[] - array of ubigint values */ - DuckDbType ubigintArray = ubigint.array(); - - /** FLOAT[] - array of float values */ - DuckDbType floatArray = float_.array(); - - /** DOUBLE[] - array of double values */ - DuckDbType doubleArray = double_.array(); - - /** DECIMAL[] - array of decimal values */ - DuckDbType decimalArray = decimal.array(); - - /** BOOLEAN[] - array of boolean values */ - DuckDbType booleanArray = boolean_.array(); - - /** VARCHAR[] - array of varchar values */ - DuckDbType varcharArray = varchar.array(); - - /** BLOB[] - array of blob 
values */ - DuckDbType blobArray = blob.array(); - - /** DATE[] - array of date values */ - DuckDbType dateArray = date.array(); - - /** TIME[] - array of time values */ - DuckDbType timeArray = time.array(); - - /** TIMESTAMP[] - array of timestamp values */ - DuckDbType timestampArray = timestamp.array(); - - /** TIMESTAMPTZ[] - array of timestamptz values */ - DuckDbType timestamptzArray = timestamptz.array(); - - /** INTERVAL[] - array of interval values */ - DuckDbType intervalArray = interval.array(); - - /** UUID[] - array of uuid values */ - DuckDbType uuidArray = uuid.array(); - - /** JSON[] - array of json values */ - DuckDbType jsonArray = json.array(); - - // ==================== Pre-instantiated List Types ==================== - // Native JNI types (best performance) - - /** LIST<BOOLEAN> - native JNI support */ - DuckDbType> listBoolean = - boolean_.listNative(Boolean.class, Boolean[]::new); - - /** LIST<TINYINT> - native JNI support */ - DuckDbType> listTinyint = tinyint.listNative(Byte.class, Byte[]::new); - - /** LIST<SMALLINT> - native JNI support */ - DuckDbType> listSmallint = smallint.listNative(Short.class, Short[]::new); - - /** LIST<INTEGER> - native JNI support */ - DuckDbType> listInteger = - integer.listNative(Integer.class, Integer[]::new); - - /** LIST<BIGINT> - native JNI support */ - DuckDbType> listBigint = bigint.listNative(Long.class, Long[]::new); - - /** LIST<FLOAT> - native JNI support */ - DuckDbType> listFloat = float_.listNative(Float.class, Float[]::new); - - /** LIST<DOUBLE> - native JNI support */ - DuckDbType> listDouble = double_.listNative(Double.class, Double[]::new); - - /** LIST<VARCHAR> - native JNI support */ - DuckDbType> listVarchar = varchar.listNative(String.class, String[]::new); - - // String-converted types (~33% overhead at 100k rows, but required for correctness) - - /** LIST<UUID> - SQL literal conversion (UUID has byte-ordering bug in JNI) */ - DuckDbType> listUuid = - uuid.listViaSqlLiteral(UUID.class, 
DuckDbStringifier.uuid); - - /** LIST<DATE> - SQL literal conversion (JNI doesn't recognize java.time.LocalDate) */ - DuckDbType> listDate = - date.listViaSqlLiteral(LocalDate.class, DuckDbStringifier.date); - - /** LIST<TIME> - SQL literal conversion (JNI doesn't recognize java.time.LocalTime) */ - DuckDbType> listTime = - time.listViaSqlLiteral(LocalTime.class, DuckDbStringifier.time); - - /** - * LIST<TIMESTAMP> - SQL literal conversion (DuckDB JDBC returns java.sql.Timestamp in - * arrays) - */ - DuckDbType> listTimestamp = - timestamp.listViaSqlLiteral( - java.sql.Timestamp.class, - java.sql.Timestamp::toLocalDateTime, - DuckDbStringifier.timestamp); - - /** - * LIST<TIMESTAMPTZ> - SQL literal conversion (DuckDB JDBC returns java.sql.Timestamp in - * arrays) - */ - DuckDbType> listTimestamptz = - timestamptz.listViaSqlLiteral( - java.sql.Timestamp.class, - ts -> ts.toLocalDateTime().atOffset(ZoneOffset.UTC), - DuckDbStringifier.timestamptz); - - /** LIST<DECIMAL> - SQL literal conversion */ - DuckDbType> listDecimal = - decimal.listViaSqlLiteral(BigDecimal.class, DuckDbStringifier.numeric); - - /** LIST<HUGEINT> - SQL literal conversion */ - DuckDbType> listHugeint = - hugeint.listViaSqlLiteral(BigInteger.class, DuckDbStringifier.hugeint); - - /** LIST<INTERVAL> - SQL literal conversion */ - DuckDbType> listInterval = - interval.listViaSqlLiteral(Duration.class, DuckDbStringifier.interval); - - // ==================== Unknown Type ==================== - // For columns whose type typr doesn't know how to handle - cast to/from string - DuckDbType unknown = - DuckDbType.of( - "VARCHAR", - DuckDbRead.readString, - DuckDbWrite.writeString, - DuckDbStringifier.string, - DuckDbJson.text) - .bimap(dev.typr.foundations.data.Unknown::new, dev.typr.foundations.data.Unknown::value); - - /** - * JSON codec for MAP types with typed keys and values. Uses JSON object format for compatibility - * with DuckDB's JSON COPY. Keys are encoded to their JSON string representation. 
- */ - static DuckDbJson> mapJson( - DuckDbJson keyJson, DuckDbJson valueJson) { - return new DuckDbJson<>() { - @Override - public JsonValue toJson(java.util.Map value) { - // Serialize as JSON object for compatibility with DuckDB JSON COPY - // Keys are converted to strings using their JSON representation (without quotes) - java.util.Map jsonMap = new java.util.LinkedHashMap<>(); - for (var entry : value.entrySet()) { - JsonValue keyJsonValue = keyJson.toJson(entry.getKey()); - // For string keys, use the raw value; for other types, encode to JSON string - String keyStr = - keyJsonValue instanceof JsonValue.JString s ? s.value() : keyJsonValue.encode(); - JsonValue valJson = valueJson.toJson(entry.getValue()); - jsonMap.put(keyStr, valJson); - } - return new JsonValue.JObject(jsonMap); - } - - @Override - public java.util.Map fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JObject(java.util.Map obj))) { - throw new IllegalArgumentException( - "Expected JSON object for MAP, got: " + json.getClass().getSimpleName()); - } - java.util.Map result = new java.util.LinkedHashMap<>(); - for (var entry : obj.entrySet()) { - String keyStr = entry.getKey(); - // Try to parse the key - if it's already a JSON string, use it directly - // Otherwise parse it as JSON (for complex types encoded as JSON strings) - JsonValue keyJsonValue; - try { - keyJsonValue = JsonValue.parse(keyStr); - } catch (Exception e) { - // Not valid JSON, treat as raw string - keyJsonValue = new JsonValue.JString(keyStr); - } - K key = keyJson.fromJson(keyJsonValue); - V val = valueJson.fromJson(entry.getValue()); - result.put(key, val); - } - return result; - } - }; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbUnion.java b/foundations-jdbc/src/java/dev/typr/foundations/DuckDbUnion.java deleted file mode 100644 index c5a585bb3f..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbUnion.java +++ /dev/null @@ -1,296 +0,0 @@ -package 
dev.typr.foundations; - -import dev.typr.foundations.data.JsonValue; -import java.sql.SQLException; -import java.util.List; -import java.util.Optional; -import java.util.function.Function; - -/** - * DuckDB UNION type support. - * - *
<p>
A UNION is a tagged union (sum type) that can hold one of several alternative values. Each - * alternative has a tag name and a type. The current tag is tracked internally. Example: UNION(num - * INTEGER, str VARCHAR) - * - *
<p>
In Java, we represent a UNION as a sealed interface with record implementations for each - * alternative. This class provides the machinery to read/write UNIONs via JDBC. - * - *
<p>
Key insight from DuckDB JDBC: UNION values are returned as the actual value (unwrapped), not - * as a Struct. The tag can be queried via union_tag(value) function. - * - * @param the Java type representing this UNION (typically a sealed interface) - */ -public record DuckDbUnion( - DuckDbTypename.UnionOf typename, - List> members, - UnionReader reader, - UnionWriter writer, - DuckDbJson json) { - /** - * A single member (alternative) in a UNION with wrapper/unwrapper functions. - * - *
<p>
The wrapper converts from the raw value to the union variant. The unwrapper extracts the raw - * value from a union variant (returns null if wrong variant). - * - * @param the union type (sealed interface) - * @param the member's value type - */ - public record Member( - String tag, - DuckDbType type, - Class javaClass, - Function wrapper, - Function unwrapper) { - /** - * Try to unwrap this member from a union value. Returns null if the union value is not of this - * member's variant. - */ - public TaggedValue tryUnwrap(A value) { - M extracted = unwrapper.apply(value); - return extracted != null ? new TaggedValue<>(tag, extracted) : null; - } - } - - /** - * Functional interface for reading a UNION from tag + value. The reader receives the tag name and - * the raw value (already converted by JDBC). - */ - @FunctionalInterface - public interface UnionReader { - A read(String tag, Object value) throws SQLException; - } - - /** Result of extracting tag + value from a UNION instance. */ - public record TaggedValue(String tag, V value) {} - - /** Functional interface for writing a UNION to tag + value. */ - @FunctionalInterface - public interface UnionWriter { - TaggedValue write(A value); - } - - /** Create a DuckDbType for this UNION. */ - public DuckDbType asType() { - DuckDbRead duckDbRead = - DuckDbRead.of( - (rs, idx) -> { - Object obj = rs.getObject(idx); - if (obj == null) return null; - - // DuckDB JDBC returns the actual value, not a wrapper. - // We need to query the tag separately. - // Unfortunately, we can't get the tag from the ResultSet directly. - // The tag is part of the column metadata or needs to be queried via union_tag(). 
- - // Strategy: infer tag from Java type - String inferredTag = inferTag(obj); - if (inferredTag != null) { - return reader.read(inferredTag, obj); - } - - throw new SQLException( - "Cannot determine UNION tag for value: " + obj + " (" + obj.getClass() + ")"); - }); - - DuckDbWrite duckDbWrite = - new DuckDbWrite.Instance<>(DuckDbUnion::setTaggedValue, writer::write); - - DuckDbStringifier stringifier = - DuckDbStringifier.instance( - (value, sb, quoted) -> { - throw new UnsupportedOperationException("UNION stringification not yet implemented"); - }); - - return new DuckDbType<>(typename.asGeneric(), duckDbRead, duckDbWrite, stringifier, json); - } - - /** Create an optional version of this UNION type. */ - public DuckDbType> asOptType() { - return asType().opt(); - } - - /** Infer the tag from the Java type of the value. */ - private String inferTag(Object value) { - for (Member member : members) { - if (member.javaClass().isInstance(value)) { - return member.tag(); - } - } - return null; - } - - /** Set a tagged value on a PreparedStatement. */ - @SuppressWarnings("unchecked") - private static void setTaggedValue(java.sql.PreparedStatement ps, int idx, TaggedValue tv) - throws java.sql.SQLException { - // We can't directly write a UNION value via JDBC. - // The SQL must use union_value(tag := ?) syntax. - // For now, just set the raw value - the SQL layer must handle tagging. 
- Object value = tv.value(); - if (value == null) { - ps.setNull(idx, java.sql.Types.OTHER); - } else if (value instanceof String s) { - ps.setString(idx, s); - } else if (value instanceof Integer i) { - ps.setInt(idx, i); - } else if (value instanceof Long l) { - ps.setLong(idx, l); - } else if (value instanceof Double d) { - ps.setDouble(idx, d); - } else if (value instanceof Boolean b) { - ps.setBoolean(idx, b); - } else { - ps.setObject(idx, value); - } - } - - // ======================================================================== - // Builder API for creating UNION types - // ======================================================================== - - /** - * Create a UNION type builder. - * - * @param the union type (sealed interface) - */ - public static Builder builder(String unionName) { - return new Builder<>(unionName); - } - - public static class Builder { - private final String unionName; - private final java.util.List> members = new java.util.ArrayList<>(); - - Builder(String unionName) { - this.unionName = unionName; - } - - /** - * Add a member with wrapper and unwrapper functions. - * - *
<p>
The wrapper creates the union variant from a raw value. The unwrapper extracts the raw - * value (or null if wrong variant). - * - * @param tag the tag name in SQL (e.g., "num") - * @param type the DuckDbType for the member's value - * @param javaClass the Java class of the raw value (for type inference) - * @param wrapper function to create union variant from raw value - * @param unwrapper function to extract raw value (returns null if wrong variant) - */ - public Builder member( - String tag, - DuckDbType type, - Class javaClass, - Function wrapper, - Function unwrapper) { - members.add(new Member<>(tag, type, javaClass, wrapper, unwrapper)); - return this; - } - - /** - * Build the DuckDbUnion with auto-derived reader, writer, and JSON codec. - * - *
<p>
This method auto-generates: - Reader: infers tag from Java type, applies appropriate - * wrapper - Writer: tries each unwrapper until one succeeds - JSON: standard format {"tag": - * "name", "value": ...} - */ - public DuckDbUnion build() { - List typenameMembers = - members.stream() - .map(m -> new DuckDbTypename.UnionOf.UnionMember(m.tag(), m.type().typename())) - .toList(); - - DuckDbTypename.UnionOf typename = new DuckDbTypename.UnionOf<>(unionName, typenameMembers); - - // Auto-derive reader from members - UnionReader reader = - (tag, value) -> { - for (Member member : members) { - if (member.tag().equals(tag)) { - return applyWrapper(member, value); - } - } - throw new SQLException("Unknown tag: " + tag); - }; - - // Auto-derive writer from members - UnionWriter writer = - unionValue -> { - for (Member member : members) { - TaggedValue tv = member.tryUnwrap(unionValue); - if (tv != null) { - return tv; - } - } - throw new IllegalStateException("No member matched for union value: " + unionValue); - }; - - // Auto-derive JSON codec from members - DuckDbJson json = - new DuckDbJson<>() { - @Override - public JsonValue toJson(A value) { - for (Member member : members) { - TaggedValue tv = member.tryUnwrap(value); - if (tv != null) { - return memberToJson(member, tv.value()); - } - } - throw new IllegalStateException("No member matched for JSON encoding: " + value); - } - - @Override - public A fromJson(JsonValue json) { - if (json instanceof JsonValue.JObject obj) { - String tag = ((JsonValue.JString) obj.fields().get("tag")).value(); - JsonValue valueJson = obj.fields().get("value"); - for (Member member : members) { - if (member.tag().equals(tag)) { - return memberFromJson(member, valueJson); - } - } - throw new IllegalArgumentException("Unknown tag: " + tag); - } - throw new IllegalArgumentException( - "Expected JSON object with 'tag' and 'value' fields"); - } - }; - - return new DuckDbUnion<>(typename, List.copyOf(members), reader, writer, json); - } - - 
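The auto-derivation the deleted `build()` performs — infer the tag from the runtime Java type on the read side, and emit a `{"tag": ..., "value": ...}` envelope on the JSON side — can be sketched stdlib-only. The `NumOrStr` union and its member table are hypothetical, standing in for a generated `UNION(num INTEGER, str VARCHAR)`:

```java
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public class UnionSketch {
    // Hypothetical sealed-interface representation of UNION(num INTEGER, str VARCHAR).
    sealed interface NumOrStr permits Num, Str {}
    record Num(int value) implements NumOrStr {}
    record Str(String value) implements NumOrStr {}

    // One entry per UNION alternative: tag name, raw Java class, wrapper.
    record Member<M>(String tag, Class<M> javaClass, Function<M, NumOrStr> wrap) {}

    static final List<Member<?>> MEMBERS = List.of(
        new Member<>("num", Integer.class, Num::new),
        new Member<>("str", String.class, Str::new));

    // Read side: JDBC hands back the unwrapped value, so the tag is inferred
    // from its runtime type, as in the deleted inferTag().
    static String inferTag(Object value) {
        for (Member<?> m : MEMBERS) {
            if (m.javaClass().isInstance(value)) return m.tag();
        }
        throw new IllegalArgumentException("no member matches " + value.getClass());
    }

    // JSON side: the auto-derived envelope format {"tag": name, "value": ...}.
    static Map<String, Object> toJson(NumOrStr u) {
        if (u instanceof Num n) return Map.of("tag", "num", "value", n.value());
        if (u instanceof Str s) return Map.of("tag", "str", "value", s.value());
        throw new IllegalStateException("unreachable: sealed hierarchy");
    }
}
```

Type-based inference is only sound when no two members share a Java class, which is why the builder also accepts a custom reader/writer overload.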
@SuppressWarnings("unchecked") - private A applyWrapper(Member member, Object value) { - return member.wrapper().apply((M) value); - } - - @SuppressWarnings("unchecked") - private JsonValue memberToJson(Member member, Object value) { - JsonValue innerJson = member.type().duckDbJson().toJson((M) value); - return new JsonValue.JObject( - java.util.Map.of("tag", new JsonValue.JString(member.tag()), "value", innerJson)); - } - - @SuppressWarnings("unchecked") - private A memberFromJson(Member member, JsonValue valueJson) { - M rawValue = member.type().duckDbJson().fromJson(valueJson); - return member.wrapper().apply(rawValue); - } - - /** - * Build with custom reader, writer, and JSON codec. Use this when auto-derivation doesn't work - * for your use case. - */ - public DuckDbUnion build(UnionReader reader, UnionWriter writer, DuckDbJson json) { - List typenameMembers = - members.stream() - .map(m -> new DuckDbTypename.UnionOf.UnionMember(m.tag(), m.type().typename())) - .toList(); - - DuckDbTypename.UnionOf typename = new DuckDbTypename.UnionOf<>(unionName, typenameMembers); - - return new DuckDbUnion<>(typename, List.copyOf(members), reader, writer, json); - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbWrite.java b/foundations-jdbc/src/java/dev/typr/foundations/DuckDbWrite.java deleted file mode 100644 index f50819ff3b..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/DuckDbWrite.java +++ /dev/null @@ -1,203 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.math.BigInteger; -import java.sql.PreparedStatement; -import java.sql.SQLException; -import java.time.*; -import java.util.Optional; -import java.util.UUID; -import java.util.function.Function; - -/** - * Describes how to write a value to a {@link PreparedStatement} for DuckDB. 
DuckDB's JDBC driver - * handles most types through setObject, but some types need special handling: - UUID: use setString - * to avoid byte ordering bug in setObject - TIME: use setString to avoid timezone issues with - * java.sql.Time - INTERVAL: use setString with duration format - */ -public sealed interface DuckDbWrite extends DbWrite permits DuckDbWrite.Instance { - void set(PreparedStatement ps, int idx, A a) throws SQLException; - - DuckDbWrite> opt(DuckDbTypename typename); - - DuckDbWrite contramap(Function f); - - @FunctionalInterface - interface RawWriter { - void set(PreparedStatement ps, int index, A a) throws SQLException; - } - - record Instance(RawWriter rawWriter, Function f) implements DuckDbWrite { - @Override - public void set(PreparedStatement ps, int index, A a) throws SQLException { - rawWriter.set(ps, index, f.apply(a)); - } - - @Override - public DuckDbWrite> opt(DuckDbTypename typename) { - return new Instance<>( - (ps, index, u) -> { - if (u == null) ps.setNull(index, java.sql.Types.NULL); - else set(ps, index, u); - }, - a -> a.orElse(null)); - } - - @Override - public DuckDbWrite contramap(Function f) { - return new Instance<>(rawWriter, f.andThen(this.f)); - } - } - - static DuckDbWrite primitive(RawWriter rawWriter) { - return new Instance<>(rawWriter, Function.identity()); - } - - static DuckDbWrite passObjectToJdbc() { - return primitive(PreparedStatement::setObject); - } - - // Basic type writers - DuckDbWrite writeString = primitive(PreparedStatement::setString); - DuckDbWrite writeBoolean = primitive(PreparedStatement::setBoolean); - DuckDbWrite writeByte = primitive(PreparedStatement::setByte); - DuckDbWrite writeShort = primitive(PreparedStatement::setShort); - DuckDbWrite writeInteger = primitive(PreparedStatement::setInt); - DuckDbWrite writeLong = primitive(PreparedStatement::setLong); - DuckDbWrite writeFloat = primitive(PreparedStatement::setFloat); - DuckDbWrite writeDouble = primitive(PreparedStatement::setDouble); - 
DuckDbWrite writeBigDecimal = primitive(PreparedStatement::setBigDecimal); - // Use setString for BigInteger to handle the full 128-bit HUGEINT/UHUGEINT range - // setBigDecimal(new BigDecimal(hugeint)) fails for values at the 128-bit boundary - DuckDbWrite writeBigInteger = writeString.contramap(BigInteger::toString); - DuckDbWrite writeByteArray = primitive(PreparedStatement::setBytes); - - // UUID - use setString to avoid DuckDB's byte ordering bug with setObject(UUID) - DuckDbWrite writeUuid = writeString.contramap(UUID::toString); - - // TIME - use setString to avoid timezone issues with java.sql.Time - DuckDbWrite writeLocalTime = writeString.contramap(LocalTime::toString); - - // INTERVAL/Duration - use setString with duration format (HH:MM:SS) - DuckDbWrite writeDuration = - writeString.contramap( - d -> { - long hours = d.toHours(); - long minutes = d.toMinutesPart(); - long seconds = d.toSecondsPart(); - return String.format("%02d:%02d:%02d", hours, minutes, seconds); - }); - - // ==================== Nested Types ==================== - // DuckDB JDBC supports setObject with DuckDBUserArray for arrays - - /** - * Write a LIST/Array by converting to DuckDBUserArray. DuckDB JDBC natively supports - * DuckDBUserArray via setObject(). 
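The `writeDuration` strategy above (bind an INTERVAL as an `HH:MM:SS` string rather than going through `setObject`) reduces to one formatting line; this is a stdlib-only sketch with a hypothetical method name:

```java
import java.time.Duration;

public class IntervalFormatSketch {
    // Format a Duration as HH:MM:SS for binding as a string INTERVAL literal.
    // Hours are not capped at 24: Duration.toHours() returns the total hour count.
    static String toIntervalLiteral(Duration d) {
        return String.format("%02d:%02d:%02d", d.toHours(), d.toMinutesPart(), d.toSecondsPart());
    }
}
```

Note the asymmetry with `%02d`: minutes and seconds are zero-padded to two digits, while an hour count over 99 simply widens the field.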
- * - * @param typeName the DuckDB type name for the elements (e.g., "INTEGER", "VARCHAR") - * @param toArray function to convert List to Object array - * @param element type - * @return writer for List of elements - */ - static DuckDbWrite> writeList( - String typeName, java.util.function.IntFunction toArray) { - return primitive( - (ps, idx, list) -> { - if (list == null) { - ps.setNull(idx, java.sql.Types.ARRAY); - } else { - E[] array = list.toArray(toArray); - org.duckdb.user.DuckDBUserArray userArray = - new org.duckdb.user.DuckDBUserArray(typeName, array); - ps.setObject(idx, userArray); - } - }); - } - - // ==================== SQL Literal-Based List Writers ==================== - // These types require string conversion because DuckDB JNI doesn't handle them - // directly or has bugs (e.g., UUID byte-ordering). ~33% overhead at 100k rows. - - /** - * Write a LIST/Array by formatting elements using DuckDbStringifier. Use this for types that - * DuckDB JNI doesn't handle natively. Uses unquoted format (quoted=false) suitable for - * DuckDBUserArray. - * - * @param typeName the DuckDB type name for the elements (e.g., "TIME", "DATE") - * @param stringifier how to format elements - * @param element type - * @return writer for List of elements - */ - static DuckDbWrite> writeListViaSqlLiteral( - String typeName, DuckDbStringifier stringifier) { - return primitive( - (ps, idx, list) -> { - if (list == null) { - ps.setNull(idx, java.sql.Types.ARRAY); - } else { - String[] array = - list.stream().map(e -> stringifier.encode(e, false)).toArray(String[]::new); - org.duckdb.user.DuckDBUserArray userArray = - new org.duckdb.user.DuckDBUserArray(typeName, array); - ps.setObject(idx, userArray); - } - }); - } - - /** - * Write a MAP with typed keys and values using DuckDBMap. DuckDB JDBC natively supports DuckDBMap - * via setObject(). 
- * - * @param sqlTypeName the full DuckDB type name (e.g., "MAP(VARCHAR, INTEGER)") - * @param key type - * @param value type - * @return writer for Map - */ - static DuckDbWrite> writeMap(String sqlTypeName) { - return primitive( - (ps, idx, map) -> { - if (map == null) { - ps.setNull(idx, java.sql.Types.OTHER); - } else { - org.duckdb.user.DuckDBMap duckDbMap = - new org.duckdb.user.DuckDBMap<>(sqlTypeName, map); - ps.setObject(idx, duckDbMap); - } - }); - } - - /** - * Write a MAP with typed keys and values using DuckDBMap, converting via DuckDbStringifier. All - * keys and values are converted to String. DuckDB parses them based on the type name. Uses - * unquoted format (quoted=false) suitable for DuckDBMap. - * - * @param sqlTypeName the full DuckDB type name (e.g., "MAP(UUID, TIME)") - * @param keyStringifier how to format keys - * @param valueStringifier how to format values - * @param key type - * @param value type - * @return writer for Map - */ - static DuckDbWrite> writeMapViaSqlLiteral( - String sqlTypeName, - DuckDbStringifier keyStringifier, - DuckDbStringifier valueStringifier) { - return primitive( - (ps, idx, map) -> { - if (map == null) { - ps.setNull(idx, java.sql.Types.OTHER); - } else { - java.util.Map wireMap = new java.util.LinkedHashMap<>(); - for (var entry : map.entrySet()) { - wireMap.put( - keyStringifier.encode(entry.getKey(), false), - valueStringifier.encode(entry.getValue(), false)); - } - org.duckdb.user.DuckDBMap duckDbMap = - new org.duckdb.user.DuckDBMap<>(sqlTypeName, wireMap); - ps.setObject(idx, duckDbMap); - } - }); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/Either.java b/foundations-jdbc/src/java/dev/typr/foundations/Either.java deleted file mode 100644 index c4ab25439c..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/Either.java +++ /dev/null @@ -1,97 +0,0 @@ -package dev.typr.foundations; - -import java.util.Optional; -import java.util.function.Function; -import 
java.util.function.Supplier; - -public sealed interface Either permits Either.Left, Either.Right { - static Either left(L l) { - return new Left<>(l); - } - - static Either right(R r) { - return new Right<>(r); - } - - default boolean isLeft() { - return this instanceof Left; - } - - default boolean isRight() { - return this instanceof Right; - } - - default Either map(Function f) { - return switch (this) { - case Left l -> left(l.value()); - case Right r -> right(f.apply(r.value())); - }; - } - - default Either mapLeft(Function f) { - return switch (this) { - case Left l -> left(f.apply(l.value())); - case Right r -> right(r.value()); - }; - } - - default Optional asOptional() { - return switch (this) { - case Left l -> Optional.empty(); - case Right r -> Optional.of(r.value()); - }; - } - - default R getOrElse(Supplier defaultValue) { - return switch (this) { - case Left l -> defaultValue.get(); - case Right r -> r.value(); - }; - } - - default Either swap() { - return switch (this) { - case Left l -> right(l.value()); - case Right r -> left(r.value()); - }; - } - - default T fold(Function leftMapper, Function rightMapper) { - return switch (this) { - case Left l -> leftMapper.apply(l.value()); - case Right r -> rightMapper.apply(r.value()); - }; - } - - final class Left implements Either { - private final L value; - - public Left(L value) { - this.value = value; - } - - public L value() { - return value; - } - - public String toString() { - return "Left(" + value + ")"; - } - } - - final class Right implements Either { - private final R value; - - public Right(R value) { - this.value = value; - } - - public R value() { - return value; - } - - public String toString() { - return "Right(" + value + ")"; - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/Fragment.java b/foundations-jdbc/src/java/dev/typr/foundations/Fragment.java deleted file mode 100644 index 38b80f3b88..0000000000 --- 
a/foundations-jdbc/src/java/dev/typr/foundations/Fragment.java +++ /dev/null @@ -1,306 +0,0 @@ -package dev.typr.foundations; - -import java.sql.PreparedStatement; -import java.sql.SQLException; -import java.util.ArrayList; -import java.util.Arrays; -import java.util.Iterator; -import java.util.List; -import java.util.concurrent.atomic.AtomicInteger; - -public sealed interface Fragment { - Fragment EMPTY = lit(""); - - default String render() { - StringBuilder sb = new StringBuilder(); - render(sb); - return sb.toString(); - } - - void render(StringBuilder sb); - - default void set(PreparedStatement stmt) throws SQLException { - set(stmt, new AtomicInteger(1)); - } - - void set(PreparedStatement stmt, AtomicInteger idx) throws SQLException; - - default Fragment append(Fragment other) { - return new Append(this, other); - } - - default Operation.Query query(ResultSetParser parser) { - return new Operation.Query<>(this, parser); - } - - default Operation.Update update() { - return new Operation.Update(this); - } - - default Operation.UpdateReturning updateReturning(ResultSetParser parser) { - return new Operation.UpdateReturning<>(this, parser); - } - - default Operation.UpdateReturningGeneratedKeys updateReturningGeneratedKeys( - String[] columnNames, ResultSetParser parser) { - return new Operation.UpdateReturningGeneratedKeys<>(this, columnNames, parser); - } - - default Operation.UpdateMany updateMany(RowParser parser, Iterator rows) { - return new Operation.UpdateMany<>(this, parser, rows); - } - - default Operation.UpdateManyReturning updateManyReturning( - RowParser parser, Iterator rows) { - return new Operation.UpdateManyReturning<>(this, parser, rows); - } - - default Operation.UpdateReturningEach updateReturningEach( - RowParser parser, Iterator rows) { - return new Operation.UpdateReturningEach<>(this, parser, rows); - } - - record Literal(String value) implements Fragment { - @Override - public void render(StringBuilder sb) { - sb.append(value); - } - - 
@Override - public void set(PreparedStatement stmt, AtomicInteger idx) throws SQLException {} - } - - static Literal lit(String value) { - return new Literal(value); - } - - static Fragment empty() { - return EMPTY; - } - - static Literal quotedDouble(String value) { - return new Literal('"' + value + '"'); - } - - static Literal quotedSingle(String value) { - return new Literal("'" + value + "'"); - } - - record Append(Fragment a, Fragment b) implements Fragment { - @Override - public void render(StringBuilder sb) { - a.render(sb); - b.render(sb); - } - - @Override - public void set(PreparedStatement stmt, AtomicInteger idx) throws SQLException { - a.set(stmt, idx); - b.set(stmt, idx); - } - } - - record Value(A value, DbType type) implements Fragment { - @Override - public void render(StringBuilder sb) { - sb.append('?'); - // Add type cast if the database type supports it (PostgreSQL yes, MariaDB no) - // Skip text type - PostgreSQL handles implicit string conversion well, - // and casting to text can conflict with bpchar comparison semantics - if (type.typename().renderTypeCast()) { - String sqlType = type.typename().sqlType(); - if (sqlType != null && !sqlType.isEmpty() && !sqlType.equals("text")) { - sb.append("::"); - sb.append(sqlType); - } - } - } - - @Override - public void set(PreparedStatement stmt, AtomicInteger idx) throws SQLException { - type.write().set(stmt, idx.getAndIncrement(), value); - } - } - - static Value value(A value, DbType type) { - return new Value<>(value, type); - } - - /** Encode a value into a SQL fragment using the provided database type. 
*/ - static Fragment encode(DbType type, A value) { - return new Value<>(value, type); - } - - record Concat(List frags) implements Fragment { - @Override - public void render(StringBuilder sb) { - for (Fragment frag : frags) { - frag.render(sb); - } - } - - @Override - public void set(PreparedStatement stmt, AtomicInteger idx) throws SQLException { - for (Fragment frag : frags) { - frag.set(stmt, idx); - } - } - } - - /** Returns `(f1 AND f2 AND ... fn)`. */ - static Fragment and(Fragment... fs) { - return and(Arrays.asList(fs)); - } - - /** Returns `(f1 AND f2 AND ... fn)` for a non-empty collection. */ - static Fragment and(List fs) { - if (fs.isEmpty()) return EMPTY; - else return join(fs, lit(" AND ")); - } - - /** Returns `(f1 OR f2 OR ... fn)`. */ - static Fragment or(Fragment... fs) { - return or(Arrays.asList(fs)); - } - - /** Returns `(f1 OR f2 OR ... fn)` */ - static Fragment or(List fs) { - if (fs.isEmpty()) return EMPTY; - else return join(fs, lit(" OR ")); - } - - /** Returns `WHERE f1 AND f2 AND ... fn` */ - static Fragment whereAnd(Fragment... fs) { - return whereAnd(Arrays.asList(fs)); - } - - /** Returns `WHERE f1 AND f2 AND ... fn` */ - static Fragment whereAnd(List fs) { - if (fs.isEmpty()) { - return EMPTY; - } else { - return lit("WHERE ").append(and(fs)); - } - } - - /** Returns `WHERE f1 OR f2 OR ... fn`. */ - static Fragment whereOr(Fragment... fs) { - return whereOr(Arrays.asList(fs)); - } - - /** Returns `WHERE f1 OR f2 OR ... fn`. */ - static Fragment whereOr(List fs) { - if (fs.isEmpty()) { - return EMPTY; - } else { - return lit("WHERE ").append(or(fs)); - } - } - - /** Returns `SET f1, f2, ... fn` or the empty fragment if `fs` is empty. */ - static Fragment set(Fragment... fs) { - return set(Arrays.asList(fs)); - } - - /** Returns `SET f1, f2, ... fn` or the empty fragment if `fs` is empty. 
*/ - static Fragment set(List fs) { - if (fs.isEmpty()) { - return EMPTY; - } else { - return lit("SET ").append(comma(fs)); - } - } - - /** Returns `(f)`. */ - static Fragment parentheses(Fragment f) { - return lit("(").append(f).append(lit(")")); - } - - /** Returns `f1, f2, ... fn`. */ - static Fragment comma(Fragment... fs) { - return comma(Arrays.asList(fs)); - } - - /** Returns `f1, f2, ... fn`. */ - static Fragment comma(List fs) { - return join(fs, lit(", ")); - } - - /** Returns `ORDER BY f1, f2, ... fn` or the empty fragment if `fs` is empty. */ - static Fragment orderBy(Fragment... fs) { - return orderBy(Arrays.asList(fs)); - } - - /** Returns `ORDER BY f1, f2, ... fn` or the empty fragment if `fs` is empty. */ - static Fragment orderBy(List fs) { - if (fs.isEmpty()) { - return EMPTY; - } else { - return lit("ORDER BY ").append(comma(fs)); - } - } - - static Concat join(List fs, Fragment sep) { - var list = new ArrayList(); - var first = true; - for (Fragment f : fs) { - if (!first) { - list.add(sep); - } - list.add(f); - first = false; - } - return new Concat(list); - } - - static Concat concat(Fragment... fs) { - return new Concat(Arrays.asList(fs)); - } - - /** Builder for creating Fragments with a fluent API */ - class Builder { - private final List fragments = new ArrayList<>(); - - public Builder() {} - - /** Add a string fragment */ - public Builder sql(String s) { - fragments.add(lit(s)); - return this; - } - - /** Add a parameter with its type and value */ - public Builder param(DbType type, T value) { - fragments.add(Fragment.value(value, type)); - return this; - } - - /** Add a Fragment directly */ - public Builder param(Fragment fragment) { - fragments.add(fragment); - return this; - } - - /** Build the final Fragment */ - public Fragment done() { - return new Concat(fragments); - } - } - - /** - * Start building a Fragment with a fluent API. 
Example: Fragment.interpolate("SELECT * FROM users - * WHERE name = ") .param(PgTypes.text, "Alice") .sql(" AND age > ") .param(PgTypes.int4, 25) - * .done() - */ - static Builder interpolate(String initial) { - return new Builder().sql(initial); - } - - /** - * Create a Fragment directly from varargs fragments. Example: Fragment.interpolate(lit("SELECT * - * FROM users WHERE name = "), nameParam, lit(" AND age > "), ageParam) - */ - static Fragment interpolate(Fragment... fragments) { - return new Concat(Arrays.asList(fragments)); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/Inserter.java b/foundations-jdbc/src/java/dev/typr/foundations/Inserter.java deleted file mode 100644 index ae4ef4d141..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/Inserter.java +++ /dev/null @@ -1,62 +0,0 @@ -package dev.typr.foundations; - -import java.sql.Connection; -import java.util.function.BiFunction; -import java.util.function.UnaryOperator; - -/** - * Fluent builder for inserting rows with optional customization. - * - *
<p>
Enables a fluent API for test data insertion: - * - *
<pre>
- * testInsert.customers("email@example.com", "password")
- *     .with(row -> row.withPhone(Defaulted.Provided("+1234567890")))
- *     .with(row -> row.withStatus(Defaulted.Provided(status)))
- *     .insert(connection);
- * </pre>
- * - * @param The unsaved row type (e.g., CustomersRowUnsaved) - * @param The return type after insertion (e.g., CustomersRow or CustomersId) - */ -public interface Inserter { - /** - * Insert the row and return the result. - * - * @param c The database connection - * @return The inserted row or its ID - */ - R insert(Connection c); - - /** - * Transform the unsaved row before insertion using withers. - * - * @param transformer Function to transform the unsaved row (typically using wither methods) - * @return A new Inserter with the transformed row - */ - Inserter with(UnaryOperator transformer); - - /** - * Create an Inserter from an unsaved row and insert function. - * - * @param row The unsaved row - * @param insertFn Function that inserts the row and returns the result - * @return An Inserter for the row - */ - static Inserter of(U row, BiFunction insertFn) { - return new Impl<>(row, insertFn); - } - - /** Implementation that holds the row and insert function. */ - record Impl(U row, BiFunction insertFn) implements Inserter { - @Override - public R insert(Connection c) { - return insertFn.apply(row, c); - } - - @Override - public Inserter with(UnaryOperator transformer) { - return new Impl<>(transformer.apply(row), insertFn); - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/MariaJson.java b/foundations-jdbc/src/java/dev/typr/foundations/MariaJson.java deleted file mode 100644 index 6ee6138d43..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/MariaJson.java +++ /dev/null @@ -1,320 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.*; -import java.math.BigDecimal; -import java.time.*; -import java.util.*; -import java.util.function.Function; -import java.util.function.IntFunction; - -/** - * MariaDB-specific JSON codec implementations. Handles conversion to/from JSON in MariaDB's - * expected format. 
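The lenient boolean decode noted in the deleted `MariaJson.bool` (MariaDB serializes BOOLEAN columns as `1`/`0` inside JSON, so the codec must accept both JSON booleans and numbers) can be sketched stdlib-only, with a plain `Object` standing in for the `JsonValue` hierarchy:

```java
public class MariaBoolSketch {
    // Accept a JSON boolean directly, or a JSON number per MariaDB's 1/0 encoding
    // of BOOLEAN columns; anything else is a decode error, as in MariaJson.bool.
    static boolean fromJson(Object json) {
        if (json instanceof Boolean b) return b;
        if (json instanceof Number n) return n.intValue() != 0;
        throw new IllegalArgumentException(
            "Expected boolean or number, got: " + json.getClass().getSimpleName());
    }
}
```

Encoding stays strict (always a real JSON boolean); only decoding is widened, so round-tripping through the codec normalizes `1`/`0` to `true`/`false`.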
- */ -public interface MariaJson extends DbJson { - - @Override - default MariaJson> opt() { - MariaJson self = this; - return new MariaJson<>() { - @Override - public JsonValue toJson(Optional value) { - return value.map(self::toJson).orElse(JsonValue.JNull.INSTANCE); - } - - @Override - public Optional fromJson(JsonValue json) { - if (json instanceof JsonValue.JNull) { - return Optional.empty(); - } - return Optional.of(self.fromJson(json)); - } - }; - } - - default MariaJson array(IntFunction arrayFactory) { - MariaJson self = this; - return new MariaJson<>() { - @Override - public JsonValue toJson(A[] value) { - List elements = new ArrayList<>(value.length); - for (A elem : value) { - elements.add(self.toJson(elem)); - } - return new JsonValue.JArray(elements); - } - - @Override - public A[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray arr)) { - throw new IllegalArgumentException( - "Expected JSON array, got: " + json.getClass().getSimpleName()); - } - A[] result = arrayFactory.apply(arr.values().size()); - for (int i = 0; i < arr.values().size(); i++) { - result[i] = self.fromJson(arr.values().get(i)); - } - return result; - } - }; - } - - default MariaJson bimap(SqlFunction f, Function g) { - MariaJson self = this; - return new MariaJson<>() { - @Override - public JsonValue toJson(B value) { - return self.toJson(g.apply(value)); - } - - @Override - public B fromJson(JsonValue json) { - try { - return f.apply(self.fromJson(json)); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - } - }; - } - - // Primitive type codecs - // MariaDB returns BOOLEAN as 1/0 in JSON when coming from a column, not true/false - MariaJson bool = - new MariaJson<>() { - @Override - public JsonValue toJson(Boolean value) { - return JsonValue.JBool.of(value); - } - - @Override - public Boolean fromJson(JsonValue json) { - if (json instanceof JsonValue.JBool b) return b.value(); - // MariaDB returns BOOLEAN columns as 1/0 in JSON - if 
(json instanceof JsonValue.JNumber n) return Integer.parseInt(n.value()) != 0; - throw new IllegalArgumentException( - "Expected boolean or number, got: " + json.getClass().getSimpleName()); - } - }; - - MariaJson int2 = - new MariaJson<>() { - @Override - public JsonValue toJson(Short value) { - return JsonValue.JNumber.of(value.longValue()); - } - - @Override - public Short fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber n) return Short.parseShort(n.value()); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - MariaJson int4 = - new MariaJson<>() { - @Override - public JsonValue toJson(Integer value) { - return JsonValue.JNumber.of(value.longValue()); - } - - @Override - public Integer fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber n) return Integer.parseInt(n.value()); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - MariaJson int8 = - new MariaJson<>() { - @Override - public JsonValue toJson(Long value) { - return JsonValue.JNumber.of(value); - } - - @Override - public Long fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber n) return Long.parseLong(n.value()); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - MariaJson float4 = - new MariaJson<>() { - @Override - public JsonValue toJson(Float value) { - return JsonValue.JNumber.of(value.doubleValue()); - } - - @Override - public Float fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber n) return Float.parseFloat(n.value()); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - MariaJson float8 = - new MariaJson<>() { - @Override - public JsonValue toJson(Double value) { - return JsonValue.JNumber.of(value); - } - - @Override - public Double fromJson(JsonValue json) { - if (json instanceof 
JsonValue.JNumber n) return Double.parseDouble(n.value()); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - MariaJson numeric = - new MariaJson<>() { - @Override - public JsonValue toJson(BigDecimal value) { - return JsonValue.JNumber.of(value.toPlainString()); - } - - @Override - public BigDecimal fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber n) return new BigDecimal(n.value()); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - MariaJson text = - new MariaJson<>() { - @Override - public JsonValue toJson(String value) { - return new JsonValue.JString(value); - } - - @Override - public String fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) return s.value(); - throw new IllegalArgumentException( - "Expected string, got: " + json.getClass().getSimpleName()); - } - }; - - // MariaDB returns binary data as raw bytes with unicode escapes for unprintable chars - // e.g., "�\u0001\u0000\u0000" - the JSON parser handles unicode escapes - MariaJson bytea = - new MariaJson<>() { - @Override - public JsonValue toJson(byte[] value) { - // Encode as raw string - the JSON encoder will escape as needed - return new JsonValue.JString( - new String(value, java.nio.charset.StandardCharsets.ISO_8859_1)); - } - - @Override - public byte[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JString s)) { - throw new IllegalArgumentException( - "Expected string for bytea, got: " + json.getClass().getSimpleName()); - } - // The JSON parser already decoded unicode escapes, so we get the raw bytes - return s.value().getBytes(java.nio.charset.StandardCharsets.ISO_8859_1); - } - }; - - // Date/Time types - MariaJson date = - new MariaJson<>() { - @Override - public JsonValue toJson(LocalDate value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public LocalDate fromJson(JsonValue json) { - 
if (json instanceof JsonValue.JString s) return LocalDate.parse(s.value()); - throw new IllegalArgumentException( - "Expected string for date, got: " + json.getClass().getSimpleName()); - } - }; - - MariaJson time = - new MariaJson<>() { - @Override - public JsonValue toJson(LocalTime value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public LocalTime fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) return LocalTime.parse(s.value()); - throw new IllegalArgumentException( - "Expected string for time, got: " + json.getClass().getSimpleName()); - } - }; - - MariaJson timestamp = - new MariaJson<>() { - @Override - public JsonValue toJson(LocalDateTime value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public LocalDateTime fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) { - // MariaDB returns "2024-06-15 14:30:45.123456" with space, ISO format uses 'T' - String value = s.value().replace(' ', 'T'); - return LocalDateTime.parse(value); - } - throw new IllegalArgumentException( - "Expected string for timestamp, got: " + json.getClass().getSimpleName()); - } - }; - - MariaJson timestamptz = - new MariaJson<>() { - @Override - public JsonValue toJson(Instant value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public Instant fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) return Instant.parse(s.value()); - throw new IllegalArgumentException( - "Expected string for timestamptz, got: " + json.getClass().getSimpleName()); - } - }; - - MariaJson uuid = - new MariaJson<>() { - @Override - public JsonValue toJson(UUID value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public UUID fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) return UUID.fromString(s.value()); - throw new IllegalArgumentException( - "Expected string for uuid, got: " + json.getClass().getSimpleName()); - } - }; - - // 
JSON types (pass-through) - MariaJson json = - new MariaJson<>() { - @Override - public JsonValue toJson(Json value) { - return JsonValue.parse(value.value()); - } - - @Override - public Json fromJson(JsonValue json) { - return new Json(json.encode()); - } - }; -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/MariaRead.java b/foundations-jdbc/src/java/dev/typr/foundations/MariaRead.java deleted file mode 100644 index b346f1b794..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/MariaRead.java +++ /dev/null @@ -1,314 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.math.BigInteger; -import java.sql.ResultSet; -import java.sql.SQLException; -import java.time.LocalDate; -import java.time.LocalDateTime; -import java.time.LocalTime; -import java.time.Year; -import java.util.Optional; - -/** - * Describes how to read a column from a {@link ResultSet} for MariaDB. - * - *
<p>
Similar to PgRead but adapted for MariaDB-specific types. - */ -public sealed interface MariaRead extends DbRead - permits MariaRead.NonNullable, MariaRead.Nullable, MariaRead.Mapped { - A read(ResultSet rs, int col) throws SQLException; - - MariaRead map(SqlFunction f); - - /** Derive a MariaRead which allows nullable values */ - MariaRead> opt(); - - @FunctionalInterface - interface RawRead { - A apply(ResultSet rs, int column) throws SQLException; - } - - /** - * Create an instance of {@link MariaRead} from a function that reads a value from a result set. - * - * @param f Should not blow up if the value returned is `null` - */ - static NonNullable of(RawRead f) { - RawRead> readNullableA = - (rs, col) -> { - var a = f.apply(rs, col); - if (rs.wasNull()) return Optional.empty(); - else return Optional.of(a); - }; - return new NonNullable<>(readNullableA); - } - - final class NonNullable implements MariaRead { - final RawRead> readNullable; - - public NonNullable(RawRead> readNullable) { - this.readNullable = readNullable; - } - - @Override - public A read(ResultSet rs, int col) throws SQLException { - return readNullable - .apply(rs, col) - .orElseThrow(() -> new SQLException("null value in column " + col)); - } - - @Override - public NonNullable map(SqlFunction f) { - return new NonNullable<>( - (rs, col) -> { - Optional maybeA = readNullable.apply(rs, col); - if (maybeA.isEmpty()) return Optional.empty(); - return Optional.of(f.apply(maybeA.get())); - }); - } - - @Override - public MariaRead> opt() { - return new Nullable<>(readNullable); - } - } - - final class Nullable implements MariaRead> { - final RawRead> readNullable; - - public Nullable(RawRead> readNullable) { - this.readNullable = readNullable; - } - - @Override - public Optional read(ResultSet rs, int col) throws SQLException { - return readNullable.apply(rs, col); - } - - @Override - public MariaRead map(SqlFunction, B> f) { - return new Mapped<>(this, f); - } - - @Override - public Nullable> opt() 
{ - return new Nullable<>( - (rs, col) -> { - Optional maybeA = readNullable.apply(rs, col); - if (maybeA.isEmpty()) return Optional.empty(); - return Optional.of(maybeA); - }); - } - } - - /** - * A read that came from mapping another read. Just returns whatever the mapping function - * produces, null or not. - */ - record Mapped(MariaRead underlying, SqlFunction f) implements MariaRead { - @Override - public B read(ResultSet rs, int col) throws SQLException { - return f.apply(underlying.read(rs, col)); - } - - @Override - public MariaRead map(SqlFunction g) { - return new Mapped<>(this, g); - } - - @Override - public MariaRead> opt() { - return new Nullable<>((rs, col) -> Optional.ofNullable(read(rs, col))); - } - } - - static NonNullable castJdbcObjectTo(Class cls) { - return of((rs, i) -> cls.cast(rs.getObject(i))); - } - - /** - * Read a value by requesting a specific class from JDBC. This uses rs.getObject(i, cls) which - * allows the JDBC driver to do proper type conversion, including handling WKB bytes for spatial - * types returned from RETURNING clauses. - */ - static NonNullable getObjectAs(Class cls) { - return of((rs, i) -> rs.getObject(i, cls)); - } - - /** - * Read a value by getting the raw Object and casting. This is needed for polymorphic types like - * MariaDB's Geometry where getObject(i, Geometry.class) doesn't work properly (the driver uses a - * specific codec like GeometryCollectionCodec instead of handling polymorphism). - * - *
<p>
For RETURNING clauses, MariaDB returns WKB bytes instead of Geometry objects. In that case, - * we use the driver's codec to decode the bytes. - */ - @SuppressWarnings("unchecked") - static NonNullable getObjectAndCast(Class cls) { - return of( - (rs, i) -> { - Object obj = rs.getObject(i); - if (obj == null) return null; - if (cls.isInstance(obj)) { - return (A) obj; - } - // For RETURNING clauses, the driver returns WKB bytes - try to decode via typed getObject - if (obj instanceof byte[]) { - // The driver can decode WKB bytes if we ask for the specific type - // For the base Geometry class, we need to determine the actual type from WKB header - try { - // Try using the driver's codec system with the target class - return rs.getObject(i, cls); - } catch (SQLException e) { - // If that fails, the caller may need to handle byte[] differently - throw new SQLException( - "Cannot decode WKB bytes as " + cls.getName() + ": " + e.getMessage(), e); - } - } - throw new SQLException( - "Expected " + cls.getName() + " but got: " + obj.getClass().getName()); - }); - } - - /** - * Read a geometry value that handles both normal SELECT (returns typed object) and RETURNING - * (returns WKB bytes). For RETURNING with base Geometry type, we parse the WKB header to - * determine the actual geometry type. 
- */ - @SuppressWarnings("unchecked") - static NonNullable readGeometry(Class cls) { - return of( - (rs, i) -> { - Object obj = rs.getObject(i); - if (obj == null) return null; - - // If already the right type, return it - if (cls.isInstance(obj)) { - return (A) obj; - } - - // If it's any Geometry subtype and we want base Geometry, return it - if (cls == org.mariadb.jdbc.type.Geometry.class - && obj instanceof org.mariadb.jdbc.type.Geometry) { - return (A) obj; - } - - // For WKB bytes from RETURNING, try type-specific decode - if (obj instanceof byte[]) { - // For specific subtypes, use the driver's codec directly - if (cls != org.mariadb.jdbc.type.Geometry.class) { - return rs.getObject(i, cls); - } - - // For base Geometry, we need to determine type from WKB and use appropriate decoder - byte[] wkb = (byte[]) obj; - if (wkb.length < 5) { - throw new SQLException("WKB data too short"); - } - - // WKB format: 1 byte endian (0=big, 1=little) + 4 bytes type - // Types: 1=Point, 2=LineString, 3=Polygon, 4=MultiPoint, 5=MultiLineString, - // 6=MultiPolygon, 7=GeometryCollection - boolean littleEndian = (wkb[0] == 1); - int typeOffset = 1; // Skip endian byte - - int wkbType; - if (littleEndian) { - wkbType = - (wkb[typeOffset] & 0xFF) - | ((wkb[typeOffset + 1] & 0xFF) << 8) - | ((wkb[typeOffset + 2] & 0xFF) << 16) - | ((wkb[typeOffset + 3] & 0xFF) << 24); - } else { - // Big-endian - wkbType = - ((wkb[typeOffset] & 0xFF) << 24) - | ((wkb[typeOffset + 1] & 0xFF) << 16) - | ((wkb[typeOffset + 2] & 0xFF) << 8) - | (wkb[typeOffset + 3] & 0xFF); - } - - // Handle SRID variations (add 0x20000000 mask) - int baseType = wkbType & 0xFF; - - return (A) - switch (baseType) { - case 1 -> rs.getObject(i, org.mariadb.jdbc.type.Point.class); - case 2 -> rs.getObject(i, org.mariadb.jdbc.type.LineString.class); - case 3 -> rs.getObject(i, org.mariadb.jdbc.type.Polygon.class); - case 4 -> rs.getObject(i, org.mariadb.jdbc.type.MultiPoint.class); - case 5 -> rs.getObject(i, 
org.mariadb.jdbc.type.MultiLineString.class); - case 6 -> rs.getObject(i, org.mariadb.jdbc.type.MultiPolygon.class); - case 7 -> rs.getObject(i, org.mariadb.jdbc.type.GeometryCollection.class); - default -> throw new SQLException("Unknown WKB geometry type: " + wkbType); - }; - } - - throw new SQLException( - "Expected " + cls.getName() + " but got: " + obj.getClass().getName()); - }); - } - - // Basic type readers - MariaRead readString = of(ResultSet::getString); - MariaRead readBoolean = of(ResultSet::getBoolean); - MariaRead readByte = of(ResultSet::getByte); - MariaRead readShort = of(ResultSet::getShort); - MariaRead readInteger = of(ResultSet::getInt); - MariaRead readLong = of(ResultSet::getLong); - MariaRead readFloat = of(ResultSet::getFloat); - MariaRead readDouble = of(ResultSet::getDouble); - MariaRead readBigDecimal = of(ResultSet::getBigDecimal); - // For BINARY/VARBINARY - reads as byte[] directly - MariaRead readByteArray = of(ResultSet::getBytes); - - // For BLOB types - MariaDB returns Blob objects, need to extract bytes - MariaRead readBlob = - of( - (rs, idx) -> { - java.sql.Blob blob = rs.getBlob(idx); - if (blob == null) return null; - return blob.getBytes(1, (int) blob.length()); - }); - - // BigInteger for BIGINT UNSIGNED - MariaRead readBigInteger = readBigDecimal.map(bd -> bd.toBigInteger()); - - // Date/Time readers - MariaRead readLocalDate = of((rs, idx) -> rs.getObject(idx, LocalDate.class)); - MariaRead readLocalTime = of((rs, idx) -> rs.getObject(idx, LocalTime.class)); - MariaRead readLocalDateTime = - of((rs, idx) -> rs.getObject(idx, LocalDateTime.class)); - - // Year type - MariaDB returns it as a short - MariaRead readYear = readShort.map(s -> Year.of(s.intValue())); - - // BIT type - MariaDB returns as byte[] for BIT(n) where n > 1 - MariaRead readBit = - of( - (rs, idx) -> { - Object obj = rs.getObject(idx); - if (obj == null) return null; - if (obj instanceof byte[]) return (byte[]) obj; - if (obj instanceof Boolean) return 
new byte[] {(byte) (((Boolean) obj) ? 1 : 0)}; - if (obj instanceof Number) return new byte[] {((Number) obj).byteValue()}; - throw new SQLException("Cannot convert " + obj.getClass() + " to byte[] for BIT type"); - }); - - // BIT(1) as Boolean - MariaRead readBitAsBoolean = - of( - (rs, idx) -> { - Object obj = rs.getObject(idx); - if (obj == null) return null; - if (obj instanceof Boolean) return (Boolean) obj; - if (obj instanceof byte[]) { - byte[] bytes = (byte[]) obj; - return bytes.length > 0 && bytes[0] != 0; - } - if (obj instanceof Number) return ((Number) obj).intValue() != 0; - throw new SQLException( - "Cannot convert " + obj.getClass() + " to Boolean for BIT(1) type"); - }); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/MariaText.java b/foundations-jdbc/src/java/dev/typr/foundations/MariaText.java deleted file mode 100644 index 5831f300d0..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/MariaText.java +++ /dev/null @@ -1,149 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.math.BigInteger; -import java.util.Optional; -import java.util.Set; -import java.util.UUID; -import java.util.function.BiConsumer; -import java.util.function.Function; - -/** - * Encodes values to text format for MariaDB LOAD DATA INFILE command. - * - *
<p>
Similar to PgText but adapted for MariaDB's text format. MariaDB uses different escape - * sequences and doesn't have the PostgreSQL array syntax. - */ -public abstract class MariaText implements DbText { - public abstract void unsafeEncode(A a, StringBuilder sb); - - public MariaText contramap(Function f) { - var self = this; - return instance((b, sb) -> self.unsafeEncode(f.apply(b), sb)); - } - - public MariaText> opt() { - var self = this; - return instance( - (a, sb) -> { - if (a.isPresent()) self.unsafeEncode(a.get(), sb); - else sb.append(MariaText.NULL); - }); - } - - public static char DELIMETER = '\t'; - public static String NULL = "\\N"; - - public static MariaText instance(BiConsumer f) { - return new MariaText<>() { - @Override - public void unsafeEncode(A a, StringBuilder sb) { - f.accept(a, sb); - } - }; - } - - public static MariaText instanceToString() { - return textString.contramap(Object::toString); - } - - /** - * Escape a string for MariaDB LOAD DATA INFILE format. MariaDB escape sequences: - \0 = NUL - * (ASCII 0) - \b = backspace - \n = newline - \r = carriage return - \t = tab - \Z = Ctrl+Z - * (Windows EOF) - \\ = backslash - */ - private static void escapeString(String s, StringBuilder sb) { - for (int i = 0; i < s.length(); i++) { - char c = s.charAt(i); - switch (c) { - case '\0': - sb.append("\\0"); - break; - case '\b': - sb.append("\\b"); - break; - case '\n': - sb.append("\\n"); - break; - case '\r': - sb.append("\\r"); - break; - case '\t': - sb.append("\\t"); - break; - case 0x1a: // Ctrl+Z - sb.append("\\Z"); - break; - case '\\': - sb.append("\\\\"); - break; - default: - sb.append(c); - break; - } - } - } - - public static final MariaText textString = instance((s, sb) -> escapeString(s, sb)); - public static final MariaText textBoolean = - instance((b, sb) -> sb.append(b ? 
"1" : "0")); - public static final MariaText textByte = instance((n, sb) -> sb.append(n)); - public static final MariaText textShort = instance((n, sb) -> sb.append(n)); - public static final MariaText textInteger = instance((n, sb) -> sb.append(n)); - public static final MariaText textLong = instance((n, sb) -> sb.append(n)); - public static final MariaText textFloat = instance((n, sb) -> sb.append(n)); - public static final MariaText textDouble = instance((n, sb) -> sb.append(n)); - public static final MariaText textBigDecimal = instance((n, sb) -> sb.append(n)); - public static final MariaText textBigInteger = instance((n, sb) -> sb.append(n)); - public static final MariaText textUuid = instance((n, sb) -> sb.append(n)); - - public static final MariaText textByteArray = - instance( - (bs, sb) -> { - // MariaDB expects hex string for binary data in LOAD DATA - for (byte b : bs) { - sb.append(String.format("%02X", b)); - } - }); - - /** Text encoder for SET type values. SET values are comma-separated strings in MariaDB. 
*/ - public static final MariaText> textSet = - instance( - (set, sb) -> { - boolean first = true; - for (String value : set) { - if (first) first = false; - else sb.append(','); - sb.append(value); - } - }); - - @SuppressWarnings("unchecked") - public static MariaText from(RowParser rowParser) { - return instance( - (row, sb) -> { - var encoded = rowParser.encode().apply(row); - for (int i = 0; i < encoded.length; i++) { - if (i > 0) { - sb.append(MariaText.DELIMETER); - } - DbText text = (DbText) rowParser.columns().get(i).text(); - text.unsafeEncode(encoded[i], sb); - } - }); - } - - public static final MariaText NotWorking = - new MariaText<>() { - @Override - public void unsafeEncode(Object t, StringBuilder sb) { - throw new UnsupportedOperationException( - "LOAD DATA INFILE is not supported for this type"); - } - }; - - @Deprecated - public static MariaText NotWorking() { - return (MariaText) NotWorking; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/MariaType.java b/foundations-jdbc/src/java/dev/typr/foundations/MariaType.java deleted file mode 100644 index 2e1bce4324..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/MariaType.java +++ /dev/null @@ -1,96 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.dsl.Bijection; -import java.util.Optional; -import java.util.function.Function; - -/** - * Combines MariaDB type name, read, write, text encoding, and JSON encoding for a type. Similar to - * PgType but for MariaDB. 
- */
-public record MariaType<A>(
-    MariaTypename<A> typename,
-    MariaRead<A> read,
-    MariaWrite<A> write,
-    MariaText<A> mariaText,
-    MariaJson<A> mariaJson)
-    implements DbType<A> {
-  @Override
-  public DbText<A> text() {
-    return mariaText;
-  }
-
-  @Override
-  public DbJson<A> json() {
-    return mariaJson;
-  }
-
-  public Fragment.Value<A> encode(A value) {
-    return new Fragment.Value<>(value, this);
-  }
-
-  public MariaType<A> withTypename(MariaTypename<A> typename) {
-    return new MariaType<>(typename, read, write, mariaText, mariaJson);
-  }
-
-  public MariaType<A> withTypename(String sqlType) {
-    return withTypename(MariaTypename.of(sqlType));
-  }
-
-  public MariaType<A> renamed(String value) {
-    return withTypename(typename.renamed(value));
-  }
-
-  public MariaType<A> renamedDropPrecision(String value) {
-    return withTypename(typename.renamedDropPrecision(value));
-  }
-
-  public MariaType<A> withRead(MariaRead<A> read) {
-    return new MariaType<>(typename, read, write, mariaText, mariaJson);
-  }
-
-  public MariaType<A> withWrite(MariaWrite<A> write) {
-    return new MariaType<>(typename, read, write, mariaText, mariaJson);
-  }
-
-  public MariaType<A> withText(MariaText<A> text) {
-    return new MariaType<>(typename, read, write, text, mariaJson);
-  }
-
-  public MariaType<A> withJson(MariaJson<A> json) {
-    return new MariaType<>(typename, read, write, mariaText, json);
-  }
-
-  public MariaType<Optional<A>> opt() {
-    return new MariaType<>(
-        typename.opt(), read.opt(), write.opt(typename), mariaText.opt(), mariaJson.opt());
-  }
-
-  public <B> MariaType<B> bimap(SqlFunction<A, B> f, Function<B, A> g) {
-    return new MariaType<>(
-        typename.as(),
-        read.map(f),
-        write.contramap(g),
-        mariaText.contramap(g),
-        mariaJson.bimap(f, g));
-  }
-
-  public <B> MariaType<B> to(Bijection<A, B> bijection) {
-    return new MariaType<>(
-        typename.as(),
-        read.map(bijection::underlying),
-        write.contramap(bijection::from),
-        mariaText.contramap(bijection::from),
-        mariaJson.bimap(bijection::underlying, bijection::from));
-  }
-
-  public static <A> MariaType<A> of(
-      String tpe, MariaRead<A> r, MariaWrite<A>
w, MariaText t, MariaJson j) { - return new MariaType<>(MariaTypename.of(tpe), r, w, t, j); - } - - public static MariaType of( - MariaTypename typename, MariaRead r, MariaWrite w, MariaText t, MariaJson j) { - return new MariaType<>(typename, r, w, t, j); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/MariaTypename.java b/foundations-jdbc/src/java/dev/typr/foundations/MariaTypename.java deleted file mode 100644 index 2be3814e53..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/MariaTypename.java +++ /dev/null @@ -1,134 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.dsl.Bijection; -import java.util.Optional; - -/** - * Represents a MariaDB SQL type name with optional precision. Similar to PgTypename but without - * array support (MariaDB doesn't have array types). - */ -public sealed interface MariaTypename extends DbTypename { - String sqlType(); - - /** MariaDB doesn't use PostgreSQL-style type casts in SQL. */ - @Override - default boolean renderTypeCast() { - return false; - } - - String sqlTypeNoPrecision(); - - MariaTypename renamed(String value); - - MariaTypename renamedDropPrecision(String value); - - default MariaTypename> opt() { - return new Opt<>(this); - } - - default MariaTypename as() { - return (MariaTypename) this; - } - - /** - * Type-safe conversion using a bijection as proof of type relationship. Overrides DbTypename.to() - * to return MariaTypename for better type refinement. 
- */ - @Override - default MariaTypename to(Bijection bijection) { - return (MariaTypename) this; - } - - record Base(String sqlType) implements MariaTypename { - @Override - public String sqlTypeNoPrecision() { - return sqlType; - } - - @Override - public Base renamed(String value) { - return new Base<>(value); - } - - @Override - public Base renamedDropPrecision(String value) { - return new Base<>(value); - } - } - - record WithPrec(Base of, int precision) implements MariaTypename { - public String sqlType() { - return of.sqlType + "(" + precision + ")"; - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public MariaTypename renamed(String value) { - return new WithPrec<>(of.renamed(value), precision); - } - - @Override - public MariaTypename renamedDropPrecision(String value) { - return of.renamed(value); - } - } - - record WithPrecScale(Base of, int precision, int scale) implements MariaTypename { - public String sqlType() { - return of.sqlType + "(" + precision + "," + scale + ")"; - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public MariaTypename renamed(String value) { - return new WithPrecScale<>(of.renamed(value), precision, scale); - } - - @Override - public MariaTypename renamedDropPrecision(String value) { - return of.renamed(value); - } - } - - record Opt(MariaTypename of) implements MariaTypename> { - @Override - public String sqlType() { - return of.sqlType(); - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public MariaTypename> renamed(String value) { - return new Opt<>(of.renamed(value)); - } - - @Override - public MariaTypename> renamedDropPrecision(String value) { - return new Opt<>(of.renamedDropPrecision(value)); - } - } - - static MariaTypename of(String sqlType) { - return new Base<>(sqlType); - } - - static MariaTypename of(String sqlType, int precision) { 
- return new WithPrec<>(new Base<>(sqlType), precision); - } - - static MariaTypename of(String sqlType, int precision, int scale) { - return new WithPrecScale<>(new Base<>(sqlType), precision, scale); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/MariaTypes.java b/foundations-jdbc/src/java/dev/typr/foundations/MariaTypes.java deleted file mode 100644 index 313c8392ce..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/MariaTypes.java +++ /dev/null @@ -1,565 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.Json; -import dev.typr.foundations.data.Uint1; -import dev.typr.foundations.data.Uint2; -import dev.typr.foundations.data.Uint4; -import dev.typr.foundations.data.Uint8; -import dev.typr.foundations.data.maria.Inet4; -import dev.typr.foundations.data.maria.Inet6; -import dev.typr.foundations.data.maria.MariaSet; -import java.math.BigDecimal; -import java.time.LocalDate; -import java.time.LocalDateTime; -import java.time.LocalTime; -import java.time.Year; -import java.util.function.Function; -import org.mariadb.jdbc.type.Geometry; -import org.mariadb.jdbc.type.GeometryCollection; -import org.mariadb.jdbc.type.LineString; -import org.mariadb.jdbc.type.MultiLineString; -import org.mariadb.jdbc.type.MultiPoint; -import org.mariadb.jdbc.type.MultiPolygon; -import org.mariadb.jdbc.type.Point; -import org.mariadb.jdbc.type.Polygon; - -/** - * MariaDB type definitions for the typr-runtime-java library. - * - *
<p>
This interface provides type codecs for all MariaDB data types, similar to PgTypes for - * PostgreSQL. - */ -public interface MariaTypes { - // ==================== Integer Types (Signed) ==================== - - MariaType tinyint = - MariaType.of( - "TINYINT", - MariaRead.readByte, - MariaWrite.writeByte, - MariaText.textByte, - MariaJson.int4.bimap(Integer::byteValue, Byte::intValue)); - - MariaType smallint = - MariaType.of( - "SMALLINT", - MariaRead.readShort, - MariaWrite.writeShort, - MariaText.textShort, - MariaJson.int2); - - MariaType mediumint = - MariaType.of( - "MEDIUMINT", - MariaRead.readInteger, - MariaWrite.writeInteger, - MariaText.textInteger, - MariaJson.int4); - - MariaType int_ = - MariaType.of( - "INT", - MariaRead.readInteger, - MariaWrite.writeInteger, - MariaText.textInteger, - MariaJson.int4); - - MariaType bigint = - MariaType.of( - "BIGINT", MariaRead.readLong, MariaWrite.writeLong, MariaText.textLong, MariaJson.int8); - - // ==================== Integer Types (Unsigned) ==================== - - // TINYINT UNSIGNED: 0-255, wrapped in Uint1 - MariaType tinyintUnsigned = - MariaType.of( - "TINYINT UNSIGNED", - MariaRead.readShort.map(Uint1::new), - MariaWrite.writeShort.contramap(Uint1::value), - MariaText.textShort.contramap(Uint1::value), - MariaJson.int2.bimap(Uint1::new, Uint1::value)); - - // SMALLINT UNSIGNED: 0-65535, wrapped in Uint2 - MariaType smallintUnsigned = - MariaType.of( - "SMALLINT UNSIGNED", - MariaRead.readInteger.map(Uint2::new), - MariaWrite.writeInteger.contramap(Uint2::value), - MariaText.textInteger.contramap(Uint2::value), - MariaJson.int4.bimap(Uint2::new, Uint2::value)); - - // MEDIUMINT UNSIGNED: 0-16777215, wrapped in Uint4 - MariaType mediumintUnsigned = - MariaType.of( - "MEDIUMINT UNSIGNED", - MariaRead.readLong.map(Uint4::new), - MariaWrite.writeLong.contramap(Uint4::value), - MariaText.textLong.contramap(Uint4::value), - MariaJson.int8.bimap(Uint4::new, Uint4::value)); - - // INT UNSIGNED: 0-4294967295, 
wrapped in Uint4 - MariaType intUnsigned = - MariaType.of( - "INT UNSIGNED", - MariaRead.readLong.map(Uint4::new), - MariaWrite.writeLong.contramap(Uint4::value), - MariaText.textLong.contramap(Uint4::value), - MariaJson.int8.bimap(Uint4::new, Uint4::value)); - - // BIGINT UNSIGNED: 0-18446744073709551615, wrapped in Uint8 - MariaType bigintUnsigned = - MariaType.of( - "BIGINT UNSIGNED", - MariaRead.readBigInteger.map(Uint8::new), - MariaWrite.writeBigInteger.contramap(Uint8::value), - MariaText.textBigInteger.contramap(Uint8::value), - MariaJson.numeric.bimap( - v -> new Uint8(v.toBigInteger()), v -> new BigDecimal(v.value()))); - - // ==================== Fixed-Point Types ==================== - - MariaType decimal = - MariaType.of( - "DECIMAL", - MariaRead.readBigDecimal, - MariaWrite.writeBigDecimal, - MariaText.textBigDecimal, - MariaJson.numeric); - - MariaType numeric = decimal.renamed("NUMERIC"); - - static MariaType decimal(int precision, int scale) { - return MariaType.of( - MariaTypename.of("DECIMAL", precision, scale), - MariaRead.readBigDecimal, - MariaWrite.writeBigDecimal, - MariaText.textBigDecimal, - MariaJson.numeric); - } - - // ==================== Floating-Point Types ==================== - - MariaType float_ = - MariaType.of( - "FLOAT", - MariaRead.readFloat, - MariaWrite.writeFloat, - MariaText.textFloat, - MariaJson.float4); - - MariaType double_ = - MariaType.of( - "DOUBLE", - MariaRead.readDouble, - MariaWrite.writeDouble, - MariaText.textDouble, - MariaJson.float8); - - // ==================== Boolean Type ==================== - - MariaType bool = - MariaType.of( - "BOOLEAN", - MariaRead.readBoolean, - MariaWrite.writeBoolean, - MariaText.textBoolean, - MariaJson.bool); - - // ==================== Bit Types ==================== - - // BIT(1) as Boolean - MariaType bit1 = - MariaType.of( - "BIT", - MariaRead.readBitAsBoolean, - MariaWrite.writeBoolean, - MariaText.textBoolean, - MariaJson.bool); - - // BIT(n) as byte[] - MariaType bit = - 
MariaType.of( - "BIT", - MariaRead.readBit, - MariaWrite.writeByteArray, - MariaText.textByteArray, - MariaJson.bytea); - - // ==================== String Types ==================== - - MariaType char_ = - MariaType.of( - "CHAR", - MariaRead.readString, - MariaWrite.writeString, - MariaText.textString, - MariaJson.text); - - MariaType varchar = - MariaType.of( - "VARCHAR", - MariaRead.readString, - MariaWrite.writeString, - MariaText.textString, - MariaJson.text); - - MariaType tinytext = - MariaType.of( - "TINYTEXT", - MariaRead.readString, - MariaWrite.writeString, - MariaText.textString, - MariaJson.text); - - MariaType text = - MariaType.of( - "TEXT", - MariaRead.readString, - MariaWrite.writeString, - MariaText.textString, - MariaJson.text); - - MariaType mediumtext = - MariaType.of( - "MEDIUMTEXT", - MariaRead.readString, - MariaWrite.writeString, - MariaText.textString, - MariaJson.text); - - MariaType longtext = - MariaType.of( - "LONGTEXT", - MariaRead.readString, - MariaWrite.writeString, - MariaText.textString, - MariaJson.text); - - static MariaType char_(int length) { - return MariaType.of( - MariaTypename.of("CHAR", length), - MariaRead.readString, - MariaWrite.writeString, - MariaText.textString, - MariaJson.text); - } - - static MariaType varchar(int length) { - return MariaType.of( - MariaTypename.of("VARCHAR", length), - MariaRead.readString, - MariaWrite.writeString, - MariaText.textString, - MariaJson.text); - } - - // ==================== Binary Types ==================== - - MariaType binary = - MariaType.of( - "BINARY", - MariaRead.readByteArray, - MariaWrite.writeByteArray, - MariaText.textByteArray, - MariaJson.bytea); - - MariaType varbinary = - MariaType.of( - "VARBINARY", - MariaRead.readByteArray, - MariaWrite.writeByteArray, - MariaText.textByteArray, - MariaJson.bytea); - - MariaType tinyblob = - MariaType.of( - "TINYBLOB", - MariaRead.readBlob, - MariaWrite.writeByteArray, - MariaText.textByteArray, - MariaJson.bytea); - - MariaType 
blob = - MariaType.of( - "BLOB", - MariaRead.readBlob, - MariaWrite.writeByteArray, - MariaText.textByteArray, - MariaJson.bytea); - - MariaType mediumblob = - MariaType.of( - "MEDIUMBLOB", - MariaRead.readBlob, - MariaWrite.writeByteArray, - MariaText.textByteArray, - MariaJson.bytea); - - MariaType longblob = - MariaType.of( - "LONGBLOB", - MariaRead.readBlob, - MariaWrite.writeByteArray, - MariaText.textByteArray, - MariaJson.bytea); - - static MariaType binary(int length) { - return MariaType.of( - MariaTypename.of("BINARY", length), - MariaRead.readByteArray, - MariaWrite.writeByteArray, - MariaText.textByteArray, - MariaJson.bytea); - } - - static MariaType varbinary(int length) { - return MariaType.of( - MariaTypename.of("VARBINARY", length), - MariaRead.readByteArray, - MariaWrite.writeByteArray, - MariaText.textByteArray, - MariaJson.bytea); - } - - // ==================== Date/Time Types ==================== - - MariaType date = - MariaType.of( - "DATE", - MariaRead.readLocalDate, - MariaWrite.passObjectToJdbc(), - MariaText.instanceToString(), - MariaJson.date); - - MariaType time = - MariaType.of( - "TIME", - MariaRead.readLocalTime, - MariaWrite.passObjectToJdbc(), - MariaText.instanceToString(), - MariaJson.time); - - MariaType datetime = - MariaType.of( - "DATETIME", - MariaRead.readLocalDateTime, - MariaWrite.passObjectToJdbc(), - MariaText.instanceToString(), - MariaJson.timestamp); - - MariaType timestamp = - MariaType.of( - "TIMESTAMP", - MariaRead.readLocalDateTime, - MariaWrite.passObjectToJdbc(), - MariaText.instanceToString(), - MariaJson.timestamp); - - MariaType year = - MariaType.of( - "YEAR", - MariaRead.readYear, - MariaWrite.writeShort.contramap(y -> (short) y.getValue()), - MariaText.textInteger.contramap(Year::getValue), - MariaJson.int4.bimap(Year::of, Year::getValue)); - - static MariaType time(int fsp) { - return MariaType.of( - MariaTypename.of("TIME", fsp), - MariaRead.readLocalTime, - MariaWrite.passObjectToJdbc(), - 
MariaText.instanceToString(),
-        MariaJson.time);
-  }
-
-  static MariaType<LocalDateTime> datetime(int fsp) {
-    return MariaType.of(
-        MariaTypename.of("DATETIME", fsp),
-        MariaRead.readLocalDateTime,
-        MariaWrite.passObjectToJdbc(),
-        MariaText.instanceToString(),
-        MariaJson.timestamp);
-  }
-
-  static MariaType<LocalDateTime> timestamp(int fsp) {
-    return MariaType.of(
-        MariaTypename.of("TIMESTAMP", fsp),
-        MariaRead.readLocalDateTime,
-        MariaWrite.passObjectToJdbc(),
-        MariaText.instanceToString(),
-        MariaJson.timestamp);
-  }
-
-  // ==================== ENUM Type ====================
-
-  /**
-   * Create a MariaType for ENUM columns. MariaDB ENUMs are read/written as strings.
-   *
-   * @param fromString function to convert string to enum value
-   * @param <E> the enum type
-   * @return MariaType for the enum
-   */
-  static <E extends Enum<E>> MariaType<E> ofEnum(String sqlType, Function<String, E> fromString) {
-    return MariaType.of(
-        sqlType,
-        MariaRead.readString.map(fromString::apply),
-        MariaWrite.writeString.contramap(Enum::name),
-        MariaText.textString.contramap(Enum::name),
-        MariaJson.text.bimap(fromString::apply, Enum::name));
-  }
-
-  // ==================== SET Type ====================
-
-  /** MariaSet wrapper for SET columns. */
-  MariaType<MariaSet> set =
-      MariaType.of(
-          "SET",
-          MariaRead.readString.map(MariaSet::fromString),
-          MariaWrite.writeString.contramap(MariaSet::toCommaSeparated),
-          MariaText.textString.contramap(MariaSet::toCommaSeparated),
-          MariaJson.text.bimap(MariaSet::fromString, MariaSet::toCommaSeparated));
-
-  // ==================== JSON Type ====================
-
-  /** JSON type - reuses dev.typr.foundations.data.Json from the common types.
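The `ofEnum` factory above round-trips MariaDB ENUM values through `Enum::name` and a `String -> E` function. A minimal standalone sketch of that mapping, with a hypothetical `Mood` enum standing in for a generated enum type (no foundations classes involved; all names here are illustrative):

```java
import java.util.function.Function;

public class EnumRoundTrip {
    // Hypothetical stand-in for a generated enum column type.
    enum Mood { HAPPY, SAD }

    // The DB side sends the label text; Function<String, E> turns it
    // back into the enum constant, as ofEnum's fromString does.
    static <E extends Enum<E>> E fromDb(String label, Function<String, E> fromString) {
        return fromString.apply(label);
    }

    // Writing uses the constant's name, as the contramap(Enum::name) does.
    static <E extends Enum<E>> String toDb(E value) {
        return value.name();
    }

    public static void main(String[] args) {
        Mood m = fromDb("HAPPY", Mood::valueOf);
        System.out.println(toDb(m));
    }
}
```

The same two functions also back the JSON codec (`bimap(fromString::apply, Enum::name)`), so one pair of conversions covers read, write, text, and JSON.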
*/ - MariaType json = - MariaType.of( - "JSON", - MariaRead.readString.map(Json::new), - MariaWrite.writeString.contramap(Json::value), - MariaText.textString.contramap(Json::value), - MariaJson.json); - - // ==================== Network Types ==================== - - MariaType inet4 = - MariaType.of( - "INET4", - MariaRead.readString.map(Inet4::parse), - MariaWrite.writeString.contramap(Inet4::value), - MariaText.textString.contramap(Inet4::value), - MariaJson.text.bimap(Inet4::parse, Inet4::value)); - - MariaType inet6 = - MariaType.of( - "INET6", - MariaRead.readString.map(Inet6::parse), - MariaWrite.writeString.contramap(Inet6::value), - MariaText.textString.contramap(Inet6::value), - MariaJson.text.bimap(Inet6::parse, Inet6::value)); - - // ==================== Spatial Types ==================== - // Using MariaDB Connector/J types directly. - // We use readGeometry() which handles both normal SELECT (returns typed Geometry objects) - // and RETURNING clauses (returns WKB bytes). For base Geometry type, it parses the WKB - // header to determine the actual geometry type and uses the appropriate codec. 
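The comment above describes `readGeometry()` parsing the WKB header to pick the concrete geometry codec. A self-contained sketch of that header parse, using only the standard WKB layout (byte 0 is the byte-order flag, 0 = big-endian / 1 = little-endian; bytes 1–4 are the uint32 geometry type code) and no Connector/J types:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class WkbHeader {
    // Returns the geometry type code from a WKB byte stream.
    // Type codes per the WKB spec: 1 = Point, 2 = LineString, 3 = Polygon, ...
    static long geometryTypeCode(byte[] wkb) {
        ByteBuffer buf = ByteBuffer.wrap(wkb);
        byte order = buf.get(); // byte-order flag
        buf.order(order == 1 ? ByteOrder.LITTLE_ENDIAN : ByteOrder.BIG_ENDIAN);
        return Integer.toUnsignedLong(buf.getInt()); // uint32 type code
    }

    public static void main(String[] args) {
        // Little-endian header for type 1 (Point): 01 01 00 00 00
        byte[] point = {1, 1, 0, 0, 0};
        System.out.println(geometryTypeCode(point));
    }
}
```

Dispatching on this code is what lets the base `GEOMETRY` type choose the right codec when a RETURNING clause hands back raw WKB bytes instead of a typed Geometry object.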
- // Note: Spatial types use text representation for JSON (WKT format would be better but is - // complex) - - MariaType geometry = - MariaType.of( - "GEOMETRY", - MariaRead.readGeometry(Geometry.class), - MariaWrite.passObjectToJdbc(), - MariaText.NotWorking(), - MariaJson.text.bimap( - s -> { - throw new UnsupportedOperationException("Geometry JSON not supported"); - }, - Object::toString)); - - MariaType point = - MariaType.of( - "POINT", - MariaRead.readGeometry(Point.class), - MariaWrite.passObjectToJdbc(), - MariaText.NotWorking(), - MariaJson.text.bimap( - s -> { - throw new UnsupportedOperationException("Point JSON not supported"); - }, - Object::toString)); - - MariaType linestring = - MariaType.of( - "LINESTRING", - MariaRead.readGeometry(LineString.class), - MariaWrite.passObjectToJdbc(), - MariaText.NotWorking(), - MariaJson.text.bimap( - s -> { - throw new UnsupportedOperationException("LineString JSON not supported"); - }, - Object::toString)); - - MariaType polygon = - MariaType.of( - "POLYGON", - MariaRead.readGeometry(Polygon.class), - MariaWrite.passObjectToJdbc(), - MariaText.NotWorking(), - MariaJson.text.bimap( - s -> { - throw new UnsupportedOperationException("Polygon JSON not supported"); - }, - Object::toString)); - - MariaType multipoint = - MariaType.of( - "MULTIPOINT", - MariaRead.readGeometry(MultiPoint.class), - MariaWrite.passObjectToJdbc(), - MariaText.NotWorking(), - MariaJson.text.bimap( - s -> { - throw new UnsupportedOperationException("MultiPoint JSON not supported"); - }, - Object::toString)); - - MariaType multilinestring = - MariaType.of( - "MULTILINESTRING", - MariaRead.readGeometry(MultiLineString.class), - MariaWrite.passObjectToJdbc(), - MariaText.NotWorking(), - MariaJson.text.bimap( - s -> { - throw new UnsupportedOperationException("MultiLineString JSON not supported"); - }, - Object::toString)); - - MariaType multipolygon = - MariaType.of( - "MULTIPOLYGON", - MariaRead.readGeometry(MultiPolygon.class), - 
MariaWrite.passObjectToJdbc(), - MariaText.NotWorking(), - MariaJson.text.bimap( - s -> { - throw new UnsupportedOperationException("MultiPolygon JSON not supported"); - }, - Object::toString)); - - MariaType geometrycollection = - MariaType.of( - "GEOMETRYCOLLECTION", - MariaRead.readGeometry(GeometryCollection.class), - MariaWrite.passObjectToJdbc(), - MariaText.NotWorking(), - MariaJson.text.bimap( - s -> { - throw new UnsupportedOperationException("GeometryCollection JSON not supported"); - }, - Object::toString)); - - // ==================== Unknown Type ==================== - // For columns whose type typr doesn't know how to handle - cast to/from string - MariaType unknown = - MariaType.of( - "TEXT", - MariaRead.readString, - MariaWrite.writeString, - MariaText.textString, - MariaJson.text) - .bimap(dev.typr.foundations.data.Unknown::new, dev.typr.foundations.data.Unknown::value); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/MariaWrite.java b/foundations-jdbc/src/java/dev/typr/foundations/MariaWrite.java deleted file mode 100644 index 57adf243ab..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/MariaWrite.java +++ /dev/null @@ -1,70 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.math.BigInteger; -import java.sql.PreparedStatement; -import java.sql.SQLException; -import java.util.Optional; -import java.util.function.Function; - -/** - * Describes how to write a value to a {@link PreparedStatement} for MariaDB. - * - *
<p>
Similar to PgWrite but adapted for MariaDB-specific types. MariaDB doesn't have array types,
- * so array support is omitted.
- */
-public sealed interface MariaWrite<A> extends DbWrite<A> permits MariaWrite.Instance {
-  void set(PreparedStatement ps, int idx, A a) throws SQLException;
-
-  MariaWrite<Optional<A>> opt(MariaTypename typename);
-
-  <B> MariaWrite<B> contramap(Function<B, A> f);
-
-  @FunctionalInterface
-  interface RawWriter<A> {
-    void set(PreparedStatement ps, int index, A a) throws SQLException;
-  }
-
-  record Instance<A, U>(RawWriter<U> rawWriter, Function<A, U> f) implements MariaWrite<A> {
-    @Override
-    public void set(PreparedStatement ps, int index, A a) throws SQLException {
-      rawWriter.set(ps, index, f.apply(a));
-    }
-
-    @Override
-    public MariaWrite<Optional<A>> opt(MariaTypename typename) {
-      return new Instance<>(
-          (ps, index, u) -> {
-            if (u == null) ps.setNull(index, java.sql.Types.NULL);
-            else set(ps, index, u);
-          },
-          a -> a.orElse(null));
-    }
-
-    @Override
-    public <B> MariaWrite<B> contramap(Function<B, A> f) {
-      return new Instance<>(rawWriter, f.andThen(this.f));
-    }
-  }
-
-  static <A> MariaWrite<A> primitive(RawWriter<A> rawWriter) {
-    return new Instance<>(rawWriter, Function.identity());
-  }
-
-  static <A> MariaWrite<A> passObjectToJdbc() {
-    return primitive(PreparedStatement::setObject);
-  }
-
-  // Basic type writers
-  MariaWrite<String> writeString = primitive(PreparedStatement::setString);
-  MariaWrite<Boolean> writeBoolean = primitive(PreparedStatement::setBoolean);
-  MariaWrite<Byte> writeByte = primitive(PreparedStatement::setByte);
-  MariaWrite<Short> writeShort = primitive(PreparedStatement::setShort);
-  MariaWrite<Integer> writeInteger = primitive(PreparedStatement::setInt);
-  MariaWrite<Long> writeLong = primitive(PreparedStatement::setLong);
-  MariaWrite<Float> writeFloat = primitive(PreparedStatement::setFloat);
-  MariaWrite<Double> writeDouble = primitive(PreparedStatement::setDouble);
-  MariaWrite<BigDecimal> writeBigDecimal = primitive(PreparedStatement::setBigDecimal);
-  MariaWrite<BigInteger> writeBigInteger = writeBigDecimal.contramap(bi -> new BigDecimal(bi));
-  MariaWrite<byte[]>
writeByteArray = primitive(PreparedStatement::setBytes); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/NonEmptyBlob.java b/foundations-jdbc/src/java/dev/typr/foundations/NonEmptyBlob.java deleted file mode 100644 index 47563ccf80..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/NonEmptyBlob.java +++ /dev/null @@ -1,69 +0,0 @@ -package dev.typr.foundations; - -import java.util.Arrays; -import java.util.Optional; - -/** - * A non-empty byte array value. - * - *
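The MariaWrite interface above derives every specialised writer from `primitive(...)` plus `contramap`. A self-contained sketch of that shape, with a toy list-backed sink in place of a real `PreparedStatement` and a hypothetical `Uint4` wrapper standing in for the foundations one (all names illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public class ContramapSketch {
    // Hypothetical unsigned-int wrapper, standing in for foundations' Uint4.
    record Uint4(long value) {}

    // A writer knows how to emit an A into some sink; contramap adapts it
    // to accept a B by mapping B -> A first -- the same shape MariaWrite uses.
    interface Writer<A> {
        void write(List<Object> sink, A a);

        default <B> Writer<B> contramap(Function<B, A> f) {
            return (sink, b) -> write(sink, f.apply(b));
        }
    }

    static final Writer<Long> writeLong = List::add;
    // Adapt the long writer to the wrapper, as writeLong.contramap(Uint4::value) does.
    static final Writer<Uint4> writeUint4 = writeLong.contramap(Uint4::value);

    public static void main(String[] args) {
        List<Object> sink = new ArrayList<>();
        writeUint4.write(sink, new Uint4(42));
        System.out.println(sink);
    }
}
```

Because `contramap` composes the mapping function before the raw writer, a handful of primitive writers covers every wrapper type without duplicating JDBC plumbing.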
<p>
Oracle converts empty byte arrays to NULL, so this type represents byte arrays that are - * guaranteed to be non-null and non-empty - suitable for NOT NULL BLOB/RAW columns. - */ -public final class NonEmptyBlob { - private final byte[] value; - - private NonEmptyBlob(byte[] value) { - this.value = value; - } - - /** - * Smart constructor: Create a NonEmptyBlob from a byte array. Returns Optional.empty() if the - * array is null or empty. - */ - public static Optional apply(byte[] bytes) { - if (bytes == null || bytes.length == 0) { - return Optional.empty(); - } - return Optional.of(new NonEmptyBlob(bytes)); - } - - /** - * Force constructor: Create a NonEmptyBlob from a byte array. Throws IllegalArgumentException if - * the array is null or empty. - */ - public static NonEmptyBlob force(byte[] bytes) { - if (bytes == null || bytes.length == 0) { - throw new IllegalArgumentException("Byte array cannot be null or empty"); - } - return new NonEmptyBlob(bytes); - } - - public byte[] value() { - return value; - } - - @Override - public String toString() { - StringBuilder sb = new StringBuilder("["); - for (int i = 0; i < Math.min(value.length, 8); i++) { - if (i > 0) sb.append(", "); - sb.append(String.format("0x%02X", value[i] & 0xff)); - } - if (value.length > 8) sb.append(", ..."); - sb.append("]"); - return sb.toString(); - } - - @Override - public boolean equals(Object o) { - if (this == o) return true; - if (o == null || getClass() != o.getClass()) return false; - NonEmptyBlob that = (NonEmptyBlob) o; - return Arrays.equals(value, that.value); - } - - @Override - public int hashCode() { - return Arrays.hashCode(value); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/NonEmptyString.java b/foundations-jdbc/src/java/dev/typr/foundations/NonEmptyString.java deleted file mode 100644 index 87af86a3e8..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/NonEmptyString.java +++ /dev/null @@ -1,62 +0,0 @@ -package dev.typr.foundations; - 
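NonEmptyBlob's `apply`/`force` pair is the smart-constructor pattern: one total variant returning `Optional`, one partial variant that throws. A compact standalone illustration with strings, mirroring (not reproducing) the deleted class — the nested `NonEmpty` record here is purely hypothetical:

```java
import java.util.Optional;

public class SmartConstructor {
    // Validated wrapper: downstream code can rely on non-emptiness.
    // (The real NonEmptyBlob hides its constructor; a record's stays
    // visible -- acceptable for a sketch.)
    record NonEmpty(String value) {
        // Total variant: invalid input yields Optional.empty().
        static Optional<NonEmpty> apply(String s) {
            return (s == null || s.isEmpty())
                ? Optional.empty()
                : Optional.of(new NonEmpty(s));
        }

        // Partial variant: throws, for call sites that already guarantee
        // validity (e.g. values read from a NOT NULL column).
        static NonEmpty force(String s) {
            return apply(s).orElseThrow(
                () -> new IllegalArgumentException("String cannot be null or empty"));
        }
    }

    public static void main(String[] args) {
        System.out.println(NonEmpty.apply("").isEmpty());
        System.out.println(NonEmpty.force("x").value());
    }
}
```

The split matters for Oracle in particular: since the database silently turns empty values into NULL, the wrapper makes "can never be empty" a property of the Java type rather than a runtime surprise.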
-import java.util.Objects; -import java.util.Optional; - -/** - * A non-empty string value. - * - *
<p>
Oracle converts empty strings to NULL, so this type represents strings that are guaranteed to - * be non-null and non-empty - suitable for NOT NULL VARCHAR2/NVARCHAR2/CLOB/NCLOB columns. - */ -public final class NonEmptyString { - private final String value; - - private NonEmptyString(String value) { - this.value = value; - } - - /** - * Smart constructor: Create a NonEmptyString from a string value. Returns Optional.empty() if the - * string is null or empty. - */ - public static Optional apply(String s) { - if (s == null || s.isEmpty()) { - return Optional.empty(); - } - return Optional.of(new NonEmptyString(s)); - } - - /** - * Force constructor: Create a NonEmptyString from a string value. Throws IllegalArgumentException - * if the string is null or empty. - */ - public static NonEmptyString force(String s) { - if (s == null || s.isEmpty()) { - throw new IllegalArgumentException("String cannot be null or empty"); - } - return new NonEmptyString(s); - } - - public String value() { - return value; - } - - @Override - public String toString() { - return value; - } - - @Override - public boolean equals(Object o) { - if (this == o) return true; - if (o == null || getClass() != o.getClass()) return false; - NonEmptyString that = (NonEmptyString) o; - return value.equals(that.value); - } - - @Override - public int hashCode() { - return Objects.hash(value); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/Operation.java b/foundations-jdbc/src/java/dev/typr/foundations/Operation.java deleted file mode 100644 index 31e26e1503..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/Operation.java +++ /dev/null @@ -1,137 +0,0 @@ -package dev.typr.foundations; - -import java.sql.Connection; -import java.sql.PreparedStatement; -import java.sql.ResultSet; -import java.sql.SQLException; -import java.util.Iterator; -import java.util.List; - -public sealed interface Operation - permits Operation.Query, - Operation.Update, - Operation.UpdateReturning, - 
Operation.UpdateReturningGeneratedKeys,
-        Operation.UpdateManyReturning,
-        Operation.UpdateMany,
-        Operation.UpdateReturningEach {
-  Out run(Connection conn) throws SQLException;
-
-  default Out runUnchecked(Connection conn) {
-    try {
-      return run(conn);
-    } catch (SQLException e) {
-      throw new RuntimeException(e);
-    }
-  }
-
-  record Query<Out>(Fragment query, ResultSetParser<Out> parser) implements Operation<Out> {
-    @Override
-    public Out run(Connection conn) throws SQLException {
-      try (PreparedStatement stmt = conn.prepareStatement(query.render())) {
-        query.set(stmt);
-        try (ResultSet rs = stmt.executeQuery()) {
-          return parser.apply(rs);
-        }
-      }
-    }
-  }
-
-  record Update(Fragment query) implements Operation<Integer> {
-    @Override
-    public Integer run(Connection conn) throws SQLException {
-      try (PreparedStatement stmt = conn.prepareStatement(query.render())) {
-        query.set(stmt);
-        return stmt.executeUpdate();
-      }
-    }
-  }
-
-  record UpdateReturning<Out>(Fragment query, ResultSetParser<Out> parser)
-      implements Operation<Out> {
-    @Override
-    public Out run(Connection conn) throws SQLException {
-      try (PreparedStatement stmt = conn.prepareStatement(query.render())) {
-        query.set(stmt);
-        try (ResultSet rs = stmt.executeQuery()) {
-          return parser.apply(rs);
-        }
-      }
-    }
-  }
-
-  record UpdateReturningGeneratedKeys<Out>(
-      Fragment query, String[] columnNames, ResultSetParser<Out> parser) implements Operation<Out> {
-    @Override
-    public Out run(Connection conn) throws SQLException {
-      try (PreparedStatement stmt = conn.prepareStatement(query.render(), columnNames)) {
-        query.set(stmt);
-        stmt.executeUpdate();
-        try (ResultSet rs = stmt.getGeneratedKeys()) {
-          return parser.apply(rs);
-        }
-      }
-    }
-  }
-
-  record UpdateMany<Row>(Fragment query, RowParser<Row> parser, Iterator<Row> rows)
-      implements Operation<int[]> {
-    @Override
-    public int[] run(Connection conn) throws SQLException {
-      try (PreparedStatement stmt = conn.prepareStatement(query.render())) {
-        query.set(stmt);
-        while (rows.hasNext()) {
-          Row row = rows.next();
parser.writeRow(stmt, row); - stmt.addBatch(); - } - return stmt.executeBatch(); - } - } - } - - record UpdateManyReturning(Fragment query, RowParser parser, Iterator rows) - implements Operation> { - @Override - public List run(Connection conn) throws SQLException { - try (PreparedStatement stmt = - conn.prepareStatement(query.render(), java.sql.Statement.RETURN_GENERATED_KEYS)) { - query.set(stmt); - while (rows.hasNext()) { - Row row = rows.next(); - parser.writeRow(stmt, row); - stmt.addBatch(); - } - stmt.executeBatch(); - try (ResultSet rs = stmt.getGeneratedKeys()) { - return parser.all().apply(rs); - } - } - } - } - - /** - * Executes each row individually with RETURNING clause. Used for MariaDB where batch mode with - * RETURNING doesn't work properly via getGeneratedKeys(). Each INSERT/UPDATE is executed - * separately and the RETURNING result is read from executeQuery(). - */ - record UpdateReturningEach(Fragment query, RowParser parser, Iterator rows) - implements Operation> { - @Override - public List run(Connection conn) throws SQLException { - java.util.ArrayList results = new java.util.ArrayList<>(); - String sql = query.render(); - while (rows.hasNext()) { - Row row = rows.next(); - try (PreparedStatement stmt = conn.prepareStatement(sql)) { - query.set(stmt); - parser.writeRow(stmt, row); - try (ResultSet rs = stmt.executeQuery()) { - results.addAll(parser.all().apply(rs)); - } - } - } - return results; - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/OracleJson.java b/foundations-jdbc/src/java/dev/typr/foundations/OracleJson.java deleted file mode 100644 index 0a0468bdda..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/OracleJson.java +++ /dev/null @@ -1,442 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.*; -import java.math.BigDecimal; -import java.time.*; -import java.time.format.DateTimeFormatter; -import java.time.format.DateTimeFormatterBuilder; -import 
java.time.temporal.ChronoField; -import java.util.*; -import java.util.function.Function; -import java.util.function.IntFunction; - -/** - * Oracle-specific JSON codec implementations. Handles conversion to/from JSON in Oracle's expected - * format. - */ -public interface OracleJson extends DbJson { - - @Override - default OracleJson> opt() { - OracleJson self = this; - return new OracleJson<>() { - @Override - public JsonValue toJson(Optional value) { - return value.map(self::toJson).orElse(JsonValue.JNull.INSTANCE); - } - - @Override - public Optional fromJson(JsonValue json) { - if (json instanceof JsonValue.JNull) { - return Optional.empty(); - } - return Optional.of(self.fromJson(json)); - } - }; - } - - default OracleJson array(IntFunction arrayFactory) { - OracleJson self = this; - return new OracleJson<>() { - @Override - public JsonValue toJson(A[] value) { - List elements = new ArrayList<>(value.length); - for (A elem : value) { - elements.add(self.toJson(elem)); - } - return new JsonValue.JArray(elements); - } - - @Override - public A[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray arr)) { - throw new IllegalArgumentException( - "Expected JSON array, got: " + json.getClass().getSimpleName()); - } - A[] result = arrayFactory.apply(arr.values().size()); - for (int i = 0; i < arr.values().size(); i++) { - result[i] = self.fromJson(arr.values().get(i)); - } - return result; - } - }; - } - - default OracleJson bimap(SqlFunction f, Function g) { - OracleJson self = this; - return new OracleJson<>() { - @Override - public JsonValue toJson(B value) { - return self.toJson(g.apply(value)); - } - - @Override - public B fromJson(JsonValue json) { - try { - return f.apply(self.fromJson(json)); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - } - }; - } - - // Primitive type codecs - OracleJson bool = - new OracleJson<>() { - @Override - public JsonValue toJson(Boolean value) { - return JsonValue.JBool.of(value); - } - 
- @Override - public Boolean fromJson(JsonValue json) { - if (json instanceof JsonValue.JBool b) return b.value(); - // Oracle might return NUMBER(1) as 0/1 for boolean-like columns - if (json instanceof JsonValue.JNumber n) return Integer.parseInt(n.value()) != 0; - throw new IllegalArgumentException( - "Expected boolean or number, got: " + json.getClass().getSimpleName()); - } - }; - - OracleJson int2 = - new OracleJson<>() { - @Override - public JsonValue toJson(Short value) { - return JsonValue.JNumber.of(value.longValue()); - } - - @Override - public Short fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber n) return Short.parseShort(n.value()); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - OracleJson int4 = - new OracleJson<>() { - @Override - public JsonValue toJson(Integer value) { - return JsonValue.JNumber.of(value.longValue()); - } - - @Override - public Integer fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber n) return Integer.parseInt(n.value()); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - OracleJson int8 = - new OracleJson<>() { - @Override - public JsonValue toJson(Long value) { - return JsonValue.JNumber.of(value); - } - - @Override - public Long fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber n) return Long.parseLong(n.value()); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - OracleJson float4 = - new OracleJson<>() { - @Override - public JsonValue toJson(Float value) { - return JsonValue.JNumber.of(value.doubleValue()); - } - - @Override - public Float fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber n) return Float.parseFloat(n.value()); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - OracleJson float8 = - new 
OracleJson<>() { - @Override - public JsonValue toJson(Double value) { - return JsonValue.JNumber.of(value); - } - - @Override - public Double fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber n) return Double.parseDouble(n.value()); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - OracleJson numeric = - new OracleJson<>() { - @Override - public JsonValue toJson(BigDecimal value) { - return JsonValue.JNumber.of(value.toString()); - } - - @Override - public BigDecimal fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber n) return new BigDecimal(n.value()); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - OracleJson text = - new OracleJson<>() { - @Override - public JsonValue toJson(String value) { - return new JsonValue.JString(value); - } - - @Override - public String fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) return s.value(); - throw new IllegalArgumentException( - "Expected string, got: " + json.getClass().getSimpleName()); - } - }; - - OracleJson nonEmptyString = - new OracleJson<>() { - @Override - public JsonValue toJson(NonEmptyString value) { - return new JsonValue.JString(value.value()); - } - - @Override - public NonEmptyString fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) { - return NonEmptyString.force(s.value()); - } - throw new IllegalArgumentException( - "Expected string, got: " + json.getClass().getSimpleName()); - } - }; - - static OracleJson paddedString(int length) { - return new OracleJson<>() { - @Override - public JsonValue toJson(PaddedString value) { - return new JsonValue.JString(value.value()); - } - - @Override - public PaddedString fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) { - return PaddedString.force(s.value(), length); - } - throw new IllegalArgumentException( - "Expected string, got: " + 
json.getClass().getSimpleName()); - } - }; - } - - OracleJson nonEmptyBlob = - new OracleJson<>() { - @Override - public JsonValue toJson(NonEmptyBlob value) { - StringBuilder sb = new StringBuilder(); - for (byte b : value.value()) { - sb.append(String.format("%02X", b & 0xff)); - } - return new JsonValue.JString(sb.toString()); - } - - @Override - public NonEmptyBlob fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) { - String hexString = s.value(); - if (hexString.length() % 2 != 0) { - throw new IllegalArgumentException("Hex string must have even length"); - } - byte[] bytes = new byte[hexString.length() / 2]; - for (int i = 0; i < bytes.length; i++) { - bytes[i] = (byte) Integer.parseInt(hexString.substring(i * 2, i * 2 + 2), 16); - } - return NonEmptyBlob.force(bytes); - } - throw new IllegalArgumentException( - "Expected string for bytea, got: " + json.getClass().getSimpleName()); - } - }; - - // Oracle RAW/BLOB data - encode as hex string - OracleJson bytea = - new OracleJson<>() { - @Override - public JsonValue toJson(byte[] value) { - StringBuilder sb = new StringBuilder(); - for (byte b : value) { - sb.append(String.format("%02X", b & 0xff)); - } - return new JsonValue.JString(sb.toString()); - } - - @Override - public byte[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JString s)) { - throw new IllegalArgumentException( - "Expected string for bytea, got: " + json.getClass().getSimpleName()); - } - String hex = s.value(); - byte[] result = new byte[hex.length() / 2]; - for (int i = 0; i < result.length; i++) { - result[i] = (byte) Integer.parseInt(hex.substring(i * 2, i * 2 + 2), 16); - } - return result; - } - }; - - // Date/Time types - // Oracle DATE includes time component - OracleJson date = - new OracleJson<>() { - // Oracle JSON_OBJECT returns dates as "YYYY-MM-DD"T"HH:MI:SS" format - private static final DateTimeFormatter FORMATTER = - new DateTimeFormatterBuilder() - .appendPattern("yyyy-MM-dd") - 
.optionalStart() - .appendLiteral('T') - .appendPattern("HH:mm:ss") - .optionalEnd() - .toFormatter(); - - @Override - public JsonValue toJson(LocalDateTime value) { - return new JsonValue.JString(value.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME)); - } - - @Override - public LocalDateTime fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) { - String value = s.value(); - // Handle both date-only and datetime formats - if (value.length() <= 10) { - // Date only - assume midnight - return LocalDate.parse(value).atStartOfDay(); - } - // Replace space with T if present (Oracle might use space) - value = value.replace(' ', 'T'); - return LocalDateTime.parse(value); - } - throw new IllegalArgumentException( - "Expected string for date, got: " + json.getClass().getSimpleName()); - } - }; - - OracleJson timestamp = - new OracleJson<>() { - private static final DateTimeFormatter FORMATTER = - new DateTimeFormatterBuilder() - .appendPattern("yyyy-MM-dd") - .optionalStart() - .appendLiteral('T') - .appendPattern("HH:mm:ss") - .appendFraction(ChronoField.NANO_OF_SECOND, 0, 9, true) - .optionalEnd() - .toFormatter(); - - @Override - public JsonValue toJson(LocalDateTime value) { - return new JsonValue.JString(value.format(DateTimeFormatter.ISO_LOCAL_DATE_TIME)); - } - - @Override - public LocalDateTime fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) { - String value = s.value().replace(' ', 'T'); - // Oracle TEXT format may include timezone offset for TIMESTAMP WITH LOCAL TIME ZONE - // e.g., "2024-06-15T14:30:45.000000+02:00" - // Parse as OffsetDateTime if it contains timezone, otherwise as LocalDateTime - if (value.contains("+") || value.endsWith("Z") || value.matches(".*-\\d{2}:\\d{2}$")) { - return OffsetDateTime.parse(value).toLocalDateTime(); - } - return LocalDateTime.parse(value); - } - throw new IllegalArgumentException( - "Expected string for timestamp, got: " + json.getClass().getSimpleName()); - } - }; - - OracleJson 
timestampWithTimeZone = - new OracleJson<>() { - @Override - public JsonValue toJson(OffsetDateTime value) { - return new JsonValue.JString(value.format(DateTimeFormatter.ISO_OFFSET_DATE_TIME)); - } - - @Override - public OffsetDateTime fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) { - return OffsetDateTime.parse(s.value()); - } - throw new IllegalArgumentException( - "Expected string for timestamp with time zone, got: " - + json.getClass().getSimpleName()); - } - }; - - // INTERVAL YEAR TO MONTH - Oracle JSON returns ISO-8601 format (P2Y5M) - OracleJson intervalYearToMonth = - new OracleJson<>() { - @Override - public JsonValue toJson(OracleIntervalYM value) { - return new JsonValue.JString(value.toIso8601()); - } - - @Override - public OracleIntervalYM fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) { - return OracleIntervalYM.parse(s.value()); - } - throw new IllegalArgumentException( - "Expected string for interval, got: " + json.getClass().getSimpleName()); - } - }; - - // INTERVAL DAY TO SECOND - Oracle JSON returns ISO-8601 format (P3DT14H30M45.123456S) - OracleJson intervalDayToSecond = - new OracleJson<>() { - @Override - public JsonValue toJson(OracleIntervalDS value) { - return new JsonValue.JString(value.toIso8601()); - } - - @Override - public OracleIntervalDS fromJson(JsonValue json) { - if (json instanceof JsonValue.JString s) { - return OracleIntervalDS.parse(s.value()); - } - throw new IllegalArgumentException( - "Expected string for interval, got: " + json.getClass().getSimpleName()); - } - }; - - // ROWID types - OracleJson rowId = text; - - // JSON types (pass-through) - OracleJson json = - new OracleJson<>() { - @Override - public JsonValue toJson(Json value) { - return JsonValue.parse(value.value()); - } - - @Override - public Json fromJson(JsonValue json) { - return new Json(json.encode()); - } - }; - - // XMLTYPE - stored as string - OracleJson xmlType = text; -} diff --git 
a/foundations-jdbc/src/java/dev/typr/foundations/OracleNestedTable.java b/foundations-jdbc/src/java/dev/typr/foundations/OracleNestedTable.java deleted file mode 100644 index c39dcea321..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/OracleNestedTable.java +++ /dev/null @@ -1,123 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.JsonValue; -import java.sql.SQLException; -import java.sql.Types; -import java.util.ArrayList; -import java.util.List; -import oracle.sql.ARRAY; -import oracle.sql.ArrayDescriptor; - -/** - * Oracle NESTED TABLE type support. - * - *
<p>
A nested table is a variable-length collection (unlike VARRAY which has a max size). Example: - * CREATE TYPE order_items_t AS TABLE OF order_item_t - * - *
<p>
Nested tables are stored in a separate storage table and can grow to any size. - */ -public class OracleNestedTable { - /** - * Create an OracleType for a NESTED TABLE type. - * - * @param nestedTableTypeName The SQL type name (e.g., "SCHEMA.ORDER_ITEMS_T") - * @param elementType The type of elements in the nested table - * @param The Java type of elements - * @return An OracleType that can read/write NESTED TABLE values as Lists - */ - public static OracleType> of(String nestedTableTypeName, OracleType elementType) { - OracleRead> read = - new OracleRead.NonNullable<>( - (rs, idx) -> rs.getObject(idx), - obj -> { - if (!(obj instanceof ARRAY array)) { - throw new SQLException( - "Expected ARRAY, got: " + (obj == null ? "null" : obj.getClass().getName())); - } - - try { - Object[] rawArray = (Object[]) array.getArray(); - - // Convert Object[] to List using element type's fromOracleValue() - List result = new ArrayList<>(rawArray.length); - for (Object element : rawArray) { - if (element == null) { - result.add(null); - } else { - // Use fromOracleValue() to handle both primitive and Oracle-native types - @SuppressWarnings("unchecked") - T converted = (T) elementType.read().fromOracleValue(element); - result.add(converted); - } - } - - return result; - } catch (Exception e) { - throw new SQLException("Failed to read Oracle NESTED TABLE: " + e.getMessage(), e); - } - }); - - // Use structured() instead of primitive() to support STRUCT context - // toOracleValue() will convert List → oracle.sql.ARRAY - OracleWrite> write = - OracleWrite.structured( - (list, conn) -> { - if (list == null) return null; - try { - // Convert each element using elementType's toOracleValue() - // For OBJECT types: OrderItem → oracle.sql.STRUCT - // For primitive types: value passes through unchanged - Object[] elements = new Object[list.size()]; - for (int i = 0; i < list.size(); i++) { - elements[i] = elementType.write().toOracleValue(list.get(i), conn); - } - - ArrayDescriptor desc = 
ArrayDescriptor.createDescriptor(nestedTableTypeName, conn); - return new ARRAY(desc, conn, elements); - } catch (Exception e) { - throw new SQLException("Failed to write Oracle NESTED TABLE: " + e.getMessage(), e); - } - }, - nestedTableTypeName, - Types.ARRAY); - - // Generate OracleJson codec - OracleJson> json = json(elementType); - - return new OracleType<>(OracleTypename.of(nestedTableTypeName), read, write, json); - } - - /** Generate JSON codec for nested table type. */ - private static OracleJson> json(OracleType elementType) { - return new OracleJson>() { - @Override - public JsonValue toJson(List list) { - if (list == null) return JsonValue.JNull.INSTANCE; - - List elements = new ArrayList<>(); - for (T element : list) { - elements.add(elementType.oracleJson().toJson(element)); - } - return new JsonValue.JArray(elements); - } - - @Override - public List fromJson(JsonValue json) { - if (json instanceof JsonValue.JNull) return null; - if (!(json instanceof JsonValue.JArray arr)) { - throw new IllegalArgumentException( - "Expected JSON array for NESTED TABLE type, got: " + json.getClass().getSimpleName()); - } - - List elements = arr.values(); - List result = new ArrayList<>(elements.size()); - - for (JsonValue element : elements) { - result.add(elementType.oracleJson().fromJson(element)); - } - - return result; - } - }; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/OracleObject.java b/foundations-jdbc/src/java/dev/typr/foundations/OracleObject.java deleted file mode 100644 index 60e6b8e6f7..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/OracleObject.java +++ /dev/null @@ -1,186 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.JsonValue; -import java.sql.SQLException; -import java.sql.Types; -import java.util.ArrayList; -import java.util.LinkedHashMap; -import java.util.List; -import java.util.Map; -import java.util.function.Function; -import oracle.sql.STRUCT; -import 
oracle.sql.StructDescriptor; - -/** - * Oracle OBJECT type support (similar to DuckDB STRUCT). - * - *
<p>
An OBJECT is a user-defined type with named attributes. Example: CREATE TYPE address_t AS - * OBJECT (street VARCHAR2, city VARCHAR2) - * - *
<p>
This class handles the marshalling between Oracle's oracle.sql.STRUCT and Java records. - */ -public record OracleObject( - OracleTypename.ObjectOf typename, - List> attributes, - ObjectReader reader, - ObjectWriter writer) { - public record Attribute(String name, OracleType type, Function getter) {} - - @FunctionalInterface - public interface ObjectReader { - A read(Object[] attributeValues) throws SQLException; - } - - @FunctionalInterface - public interface ObjectWriter { - Object[] write(A value); - } - - /** Convert this OracleObject to an OracleType for use in repositories. */ - public OracleType asType() { - OracleRead read = - new OracleRead.NonNullable<>( - (rs, idx) -> rs.getObject(idx), - obj -> { - if (!(obj instanceof STRUCT struct)) { - throw new SQLException( - "Expected STRUCT, got: " + (obj == null ? "null" : obj.getClass().getName())); - } - - try { - Object[] rawAttrs = struct.getAttributes(); - - // Convert each attribute through its type's reader - // This handles Oracle's BigDecimal → Long/Integer conversions, - // and nested STRUCT/ARRAY objects - Object[] typedAttrs = new Object[attributes.size()]; - for (int i = 0; i < attributes.size(); i++) { - Attribute attr = attributes.get(i); - // fromOracleValue() handles both primitive types and Oracle-native types - typedAttrs[i] = attr.type().read().fromOracleValue(rawAttrs[i]); - } - - return reader.read(typedAttrs); - } catch (Exception e) { - throw new SQLException("Failed to read Oracle STRUCT: " + e.getMessage(), e); - } - }); - - OracleWrite write = - OracleWrite.structured( - (value, conn) -> { - try { - StructDescriptor desc = StructDescriptor.createDescriptor(typename.sqlName(), conn); - - // Get raw field values - Object[] rawValues = writer.write(value); - - // Convert each field to its Oracle representation - // For primitive types, toOracleValue returns the value unchanged - // For structured types (e.g., TIMESTAMP WITH LOCAL TIME ZONE), - // it converts to Oracle-native objects 
(e.g., oracle.sql.TIMESTAMPLTZ) - Object[] oracleValues = new Object[rawValues.length]; - for (int i = 0; i < rawValues.length; i++) { - @SuppressWarnings("unchecked") - Attribute attr = (Attribute) attributes.get(i); - Object fieldValue = rawValues[i]; - oracleValues[i] = attr.type().write().toOracleValue(fieldValue, conn); - } - - return new STRUCT(desc, conn, oracleValues); - } catch (Exception e) { - throw new SQLException("Failed to create Oracle STRUCT: " + e.getMessage(), e); - } - }, - typename.sqlName(), - Types.STRUCT); - - // Generate OracleJson codec - OracleJson json = generateJson(); - - return new OracleType<>(typename.asGeneric(), read, write, json); - } - - /** Generate JSON codec for this OBJECT type. */ - private OracleJson generateJson() { - return new OracleJson() { - @Override - public JsonValue toJson(A value) { - if (value == null) return JsonValue.JNull.INSTANCE; - - Map fields = new LinkedHashMap<>(); - for (Attribute attr : attributes) { - @SuppressWarnings("unchecked") - Attribute typedAttr = (Attribute) attr; - Object fieldValue = typedAttr.getter().apply(value); - JsonValue fieldJson = typedAttr.type().oracleJson().toJson(fieldValue); - fields.put(attr.name(), fieldJson); - } - return new JsonValue.JObject(fields); - } - - @Override - public A fromJson(JsonValue json) { - if (json instanceof JsonValue.JNull) return null; - if (!(json instanceof JsonValue.JObject obj)) { - throw new IllegalArgumentException( - "Expected JSON object for OBJECT type, got: " + json.getClass().getSimpleName()); - } - - // Extract attribute values from JSON - Object[] attrValues = new Object[attributes.size()]; - for (int i = 0; i < attributes.size(); i++) { - Attribute attr = attributes.get(i); - JsonValue fieldJson = obj.get(attr.name()); - attrValues[i] = attr.type().oracleJson().fromJson(fieldJson); - } - - try { - return reader.read(attrValues); - } catch (SQLException e) { - throw new RuntimeException("Failed to construct object from JSON", e); - } - } 
- }; - } - - // ═══ BUILDER API ═══ - - public static Builder builder(String objectTypeName) { - return new Builder<>(objectTypeName); - } - - public static class Builder { - private final String objectTypeName; - private final List> attributes = new ArrayList<>(); - - Builder(String objectTypeName) { - this.objectTypeName = objectTypeName; - } - - public Builder addAttribute(String name, OracleType type, Function getter) { - attributes.add(new Attribute<>(name, type, getter)); - return this; - } - - /** - * Build OracleObject with a reader - writer is automatically derived from attribute getters. - */ - public OracleObject build(ObjectReader reader) { - // Generate writer automatically from the attribute getters - ObjectWriter writer = - value -> { - Object[] result = new Object[attributes.size()]; - for (int i = 0; i < attributes.size(); i++) { - @SuppressWarnings("unchecked") - Attribute attr = (Attribute) attributes.get(i); - result[i] = attr.getter().apply(value); - } - return result; - }; - - OracleTypename.ObjectOf typename = OracleTypename.objectOf(objectTypeName); - return new OracleObject<>(typename, attributes, reader, writer); - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/OracleRead.java b/foundations-jdbc/src/java/dev/typr/foundations/OracleRead.java deleted file mode 100644 index bbbd348224..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/OracleRead.java +++ /dev/null @@ -1,444 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.OracleIntervalDS; -import dev.typr.foundations.data.OracleIntervalYM; -import java.math.BigDecimal; -import java.sql.ResultSet; -import java.sql.SQLException; -import java.time.*; -import java.util.Optional; - -/** - * Describes how to read a column from a {@link ResultSet} for Oracle. - * - *
<p>
Similar to MariaRead but adapted for Oracle-specific types. - * - *
<p>
Reading is split into two phases: - extract: Determines which ResultSet method to use - * (getString, getInt, getObject, etc.) - transform: Converts the extracted value to the target type - * (BigDecimal → Long, String → NonEmptyString, etc.) - */ -public sealed interface OracleRead extends DbRead - permits OracleRead.NonNullable, OracleRead.Nullable, OracleRead.MappedNullable { - /** - * Phase 1: Extract a value from the ResultSet. Determines which ResultSet method to use - * (getString, getInt, getObject, etc.). - */ - Object extract(ResultSet rs, int col) throws SQLException; - - /** - * Phase 2: Transform the extracted value to the target type. Handles conversions like BigDecimal - * → Long, String → NonEmptyString, etc. Returns null for SQL NULL values (checked via - * rs.wasNull()). - */ - A transform(Object value) throws SQLException; - - /** - * Combined read operation: extract + transform. Handles nullability based on NonNullable vs - * Nullable implementation. - */ - A read(ResultSet rs, int col) throws SQLException; - - /** - * Convert a raw Oracle value to its Java representation. This is the inverse of - * OracleWrite.toOracleValue(). Uses only the transform phase since the value is already - * extracted. - */ - default A fromOracleValue(Object value) throws SQLException { - return transform(value); - } - - OracleRead map(SqlFunction f); - - /** Derive an OracleRead which allows nullable values */ - OracleRead> opt(); - - @FunctionalInterface - interface RawRead { - A apply(ResultSet rs, int column) throws SQLException; - } - - @FunctionalInterface - interface Extractor { - A apply(ResultSet rs, int col) throws SQLException; - } - - @FunctionalInterface - interface Transformer { - A apply(U value) throws SQLException; - } - - /** - * Create an instance of {@link OracleRead} from extract and transform functions. 
- * - * @param extractor Extracts raw value from ResultSet (which RS method to use) - * @param transformer Transforms raw value to target type (may return null for SQL NULL) - */ - static NonNullable of(Extractor extractor, Transformer transformer) { - return new NonNullable<>(extractor, transformer); - } - - /** - * Create an instance of {@link OracleRead} from just an extractor. Uses identity transformation - - * for cases where no conversion is needed. - * - * @param extractor Extracts raw value from ResultSet (which RS method to use) - */ - static NonNullable of(Extractor extractor) { - return new NonNullable<>(extractor, a -> a); - } - - final class NonNullable implements OracleRead { - final Extractor extractor; - final Transformer transformer; - - public NonNullable(Extractor extractor, Transformer transformer) { - this.extractor = extractor; - this.transformer = transformer; - } - - @Override - public Object extract(ResultSet rs, int col) throws SQLException { - return extractor.apply(rs, col); - } - - @Override - @SuppressWarnings("unchecked") - public A transform(Object value) throws SQLException { - return transformer.apply((U) value); - } - - @Override - public A read(ResultSet rs, int col) throws SQLException { - U raw = extractor.apply(rs, col); - if (rs.wasNull()) { - throw new SQLException("null value in column " + col); - } - return transformer.apply(raw); - } - - @Override - public NonNullable map(SqlFunction f) { - return new NonNullable<>( - this.extractor, - value -> { - A a = this.transformer.apply(value); - if (a == null) return null; - return f.apply(a); - }); - } - - @Override - public OracleRead> opt() { - return new Nullable<>(this.extractor, this.transformer); - } - } - - final class Nullable implements OracleRead> { - final Extractor extractor; - final Transformer transformer; - - public Nullable(Extractor extractor, Transformer transformer) { - this.extractor = extractor; - this.transformer = transformer; - } - - @Override - public 
Object extract(ResultSet rs, int col) throws SQLException { - return extractor.apply(rs, col); - } - - @Override - @SuppressWarnings("unchecked") - public Optional transform(Object value) throws SQLException { - if (value == null) { - return Optional.empty(); - } - A result = transformer.apply((U) value); - return Optional.ofNullable(result); - } - - @Override - public Optional read(ResultSet rs, int col) throws SQLException { - U raw = extractor.apply(rs, col); - if (rs.wasNull()) { - return Optional.empty(); - } - return Optional.of(transformer.apply(raw)); - } - - @Override - public OracleRead map(SqlFunction, B> f) { - return new MappedNullable<>(this.extractor, this.transformer, f); - } - - @Override - public OracleRead>> opt() { - return new Nullable<>( - this.extractor, - value -> { - A result = this.transformer.apply(value); - Optional maybeA = result == null ? Optional.empty() : Optional.of(result); - return maybeA.isEmpty() ? null : maybeA; - }); - } - } - - /** - * Result of mapping a Nullable reader - handles NULL database values but returns a non-Optional - * type. This is used when converting Optional[A] to Option[A] in Scala, where the Option type - * already represents nullability. 
- */ - final class MappedNullable implements OracleRead { - final Extractor extractor; - final Transformer transformer; - final SqlFunction, B> mapFunction; - - public MappedNullable( - Extractor extractor, - Transformer transformer, - SqlFunction, B> mapFunction) { - this.extractor = extractor; - this.transformer = transformer; - this.mapFunction = mapFunction; - } - - @Override - public Object extract(ResultSet rs, int col) throws SQLException { - return extractor.apply(rs, col); - } - - @Override - @SuppressWarnings("unchecked") - public B transform(Object value) throws SQLException { - A result = transformer.apply((U) value); - Optional opt = Optional.ofNullable(result); - return mapFunction.apply(opt); - } - - @Override - @SuppressWarnings("unchecked") - public B read(ResultSet rs, int col) throws SQLException { - U raw = extractor.apply(rs, col); - if (rs.wasNull()) { - // Handle NULL by passing Optional.empty() to the map function - return mapFunction.apply(Optional.empty()); - } - A result = transformer.apply(raw); - Optional opt = Optional.ofNullable(result); - return mapFunction.apply(opt); - } - - @Override - public OracleRead map(SqlFunction f) { - // Compose the map functions - SqlFunction, C> composed = opt -> f.apply(mapFunction.apply(opt)); - return new MappedNullable<>(extractor, transformer, composed); - } - - @Override - public OracleRead> opt() { - // Convert to Nullable by wrapping the result in Optional - return new Nullable<>( - extractor, - value -> { - A result = transformer.apply(value); - Optional opt = Optional.ofNullable(result); - return mapFunction.apply(opt); - }); - } - } - - static NonNullable castJdbcObjectTo(Class cls) { - return of(ResultSet::getObject, obj -> cls.cast(obj)); - } - - // Basic type readers - OracleRead readString = of(ResultSet::getString); - OracleRead readNonEmptyString = - of(ResultSet::getString, s -> NonEmptyString.force((String) s)); - - static OracleRead readPaddedString(int length) { - return 
of(ResultSet::getString, s -> PaddedString.force((String) s, length)); - } - - OracleRead readBoolean = of(ResultSet::getBoolean); - OracleRead readByte = of(ResultSet::getByte); - OracleRead readShort = of(ResultSet::getShort); - OracleRead readInteger = of(ResultSet::getInt); - OracleRead readLong = of(ResultSet::getLong); - OracleRead readFloat = of(ResultSet::getFloat); - OracleRead readDouble = of(ResultSet::getDouble); - OracleRead readBigDecimal = of(ResultSet::getBigDecimal); - - // For RAW/BLOB - reads as byte[] directly - OracleRead readByteArray = of(ResultSet::getBytes); - OracleRead readNonEmptyBlob = - of(ResultSet::getBytes, bytes -> NonEmptyBlob.force((byte[]) bytes)); - - // For BLOB types - Oracle returns Blob objects, need to extract bytes - OracleRead readBlob = - of( - ResultSet::getBlob, - blob -> { - if (blob == null) return null; - java.sql.Blob b = (java.sql.Blob) blob; - return b.getBytes(1, (int) b.length()); - }); - - // For CLOB types - Oracle returns Clob objects, need to extract string - OracleRead readClob = - of( - ResultSet::getClob, - clob -> { - if (clob == null) return null; - java.sql.Clob c = (java.sql.Clob) clob; - return c.getSubString(1, (int) c.length()); - }); - - // Date/Time readers - // Note: Oracle DATE includes time component (no separate TIME type) - // Use getTimestamp() instead of getObject() because Oracle's getGeneratedKeys() ResultSet - // throws ORA-17004 with getObject() for DATE columns. 
- // The transformer handles both java.sql.Timestamp (from ResultSet) and oracle.sql.TIMESTAMP (from - // STRUCT via fromOracleValue) - OracleRead readLocalDateTime = - of( - (rs, col) -> (Object) rs.getTimestamp(col), - obj -> { - if (obj == null) return null; - if (obj instanceof java.sql.Timestamp ts) return ts.toLocalDateTime(); - if (obj instanceof oracle.sql.TIMESTAMP oracleTs) { - // Use toJdbc() instead of timestampValue() - more reliable for STRUCT attributes - return ((java.sql.Timestamp) oracleTs.toJdbc()).toLocalDateTime(); - } - throw new SQLException("Unexpected type for DATE: " + obj.getClass().getName()); - }); - - // Oracle's TIMESTAMP type - // Use getTimestamp() instead of getObject() because Oracle's getGeneratedKeys() ResultSet - // throws ORA-17004 with getObject() for TIMESTAMP columns. - // The transformer handles both java.sql.Timestamp (from ResultSet) and oracle.sql.TIMESTAMP (from - // STRUCT via fromOracleValue) - OracleRead readTimestamp = - of( - (rs, col) -> (Object) rs.getTimestamp(col), - obj -> { - if (obj == null) return null; - if (obj instanceof java.sql.Timestamp ts) return ts.toLocalDateTime(); - if (obj instanceof oracle.sql.TIMESTAMP oracleTs) { - // Use toJdbc() instead of timestampValue() - more reliable for STRUCT attributes - return ((java.sql.Timestamp) oracleTs.toJdbc()).toLocalDateTime(); - } - throw new SQLException("Unexpected type for TIMESTAMP: " + obj.getClass().getName()); - }); - - // TIMESTAMP WITH TIME ZONE -> OffsetDateTime - // Handles OffsetDateTime (from columns) and oracle.sql.TIMESTAMPTZ (from STRUCT) - OracleRead readOffsetDateTime = - of( - ResultSet::getObject, - obj -> { - if (obj == null) return null; - if (obj instanceof OffsetDateTime odt) return odt; - if (obj instanceof oracle.sql.TIMESTAMPTZ oracleTzTs) { - // Use offsetDateTimeValue() for oracle.sql.TIMESTAMPTZ from STRUCT attributes - try { - return oracleTzTs.offsetDateTimeValue(); - } catch (Exception e) { - throw new SQLException( - 
"Failed to convert oracle.sql.TIMESTAMPTZ to OffsetDateTime: " + e.getMessage(), - e); - } - } - throw new SQLException( - "Unexpected type for TIMESTAMP WITH TIME ZONE: " + obj.getClass().getName()); - }); - - // TIMESTAMP WITH LOCAL TIME ZONE -> OffsetDateTime - // Oracle stores this with timezone information, so we use OffsetDateTime to preserve it - OracleRead readLocalTimezoneTimestamp = - of( - (rs, col) -> rs.getObject(col, OffsetDateTime.class), - obj -> { - if (obj == null) return null; - if (obj instanceof OffsetDateTime odt) return odt; - // For STRUCT attributes, oracle.sql.TIMESTAMPLTZ may need conversion - // Since we can't easily convert without connection, this is a limitation - throw new SQLException( - "Unexpected type for TIMESTAMP WITH LOCAL TIME ZONE: " + obj.getClass().getName()); - }); - - // INTERVAL YEAR TO MONTH -> OracleIntervalYM (parses both Oracle and ISO-8601 formats) - // Handles both String (from columns) and oracle.sql.INTERVALYM (from STRUCT attributes) - OracleRead readIntervalYearToMonth = - of( - ResultSet::getObject, - obj -> { - if (obj == null) return null; - if (obj instanceof String str) { - return OracleIntervalYM.parse(str); - } else if (obj instanceof oracle.sql.INTERVALYM interval) { - // Convert oracle.sql.INTERVALYM to string and parse - return OracleIntervalYM.parse(interval.stringValue()); - } else { - throw new SQLException( - "Unexpected type for INTERVAL YEAR TO MONTH: " + obj.getClass().getName()); - } - }); - - // INTERVAL DAY TO SECOND -> OracleIntervalDS (parses both Oracle and ISO-8601 formats) - // Handles both String (from columns) and oracle.sql.INTERVALDS (from STRUCT attributes) - OracleRead readIntervalDayToSecond = - of( - ResultSet::getObject, - obj -> { - if (obj == null) return null; - if (obj instanceof String str) { - return OracleIntervalDS.parse(str); - } else if (obj instanceof oracle.sql.INTERVALDS interval) { - // Convert oracle.sql.INTERVALDS to string and parse - return 
OracleIntervalDS.parse(interval.stringValue()); - } else { - throw new SQLException( - "Unexpected type for INTERVAL DAY TO SECOND: " + obj.getClass().getName()); - } - }); - - // ROWID types - OracleRead readRowId = - of( - ResultSet::getRowId, - rowId -> { - if (rowId == null) return null; - return ((java.sql.RowId) rowId).toString(); - }); - - // NUMBER type - Oracle returns BigDecimal for all NUMBER types - // But we need specific readers for precision-based mapping - OracleRead readNumberAsInt = - of(ResultSet::getBigDecimal, bd -> ((BigDecimal) bd).intValueExact()); - OracleRead readNumberAsLong = - of(ResultSet::getBigDecimal, bd -> ((BigDecimal) bd).longValueExact()); - - // BINARY_FLOAT - OracleRead readBinaryFloat = - of( - ResultSet::getObject, - obj -> { - if (obj == null) return null; - if (obj instanceof Float) return (Float) obj; - if (obj instanceof Number) return ((Number) obj).floatValue(); - return Float.parseFloat(obj.toString()); - }); - - // BINARY_DOUBLE - OracleRead readBinaryDouble = - of( - ResultSet::getObject, - obj -> { - if (obj == null) return null; - if (obj instanceof Double) return (Double) obj; - if (obj instanceof Number) return ((Number) obj).doubleValue(); - return Double.parseDouble(obj.toString()); - }); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/OracleType.java b/foundations-jdbc/src/java/dev/typr/foundations/OracleType.java deleted file mode 100644 index ea2f79025c..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/OracleType.java +++ /dev/null @@ -1,84 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.dsl.Bijection; -import java.util.Optional; -import java.util.function.Function; - -/** - * Combines Oracle type name, read, write, and JSON encoding for a type. Similar to PgType/MariaType - * but for Oracle. Note: Oracle doesn't use text-based streaming inserts (like PostgreSQL's COPY), - * so there is no OracleText component. 
- */ -public record OracleType( - OracleTypename typename, OracleRead read, OracleWrite write, OracleJson oracleJson) - implements DbType { - @Override - public DbText text() { - throw new UnsupportedOperationException( - "Oracle doesn't support text-based streaming inserts. Use batch operations instead."); - } - - @Override - public DbJson json() { - return oracleJson; - } - - public Fragment.Value encode(A value) { - return new Fragment.Value<>(value, this); - } - - public OracleType withTypename(OracleTypename typename) { - return new OracleType<>(typename, read, write, oracleJson); - } - - public OracleType withTypename(String sqlType) { - return withTypename(OracleTypename.of(sqlType)); - } - - public OracleType renamed(String value) { - return withTypename(typename.renamed(value)); - } - - public OracleType renamedDropPrecision(String value) { - return withTypename(typename.renamedDropPrecision(value)); - } - - public OracleType withRead(OracleRead read) { - return new OracleType<>(typename, read, write, oracleJson); - } - - public OracleType withWrite(OracleWrite write) { - return new OracleType<>(typename, read, write, oracleJson); - } - - public OracleType withJson(OracleJson json) { - return new OracleType<>(typename, read, write, json); - } - - public OracleType> opt() { - return new OracleType<>(typename.opt(), read.opt(), write.opt(typename), oracleJson.opt()); - } - - @Override - public OracleType to(Bijection bijection) { - return new OracleType<>( - typename.as(), - read.map(bijection::underlying), - write.contramap(bijection::from), - oracleJson.bimap(bijection::underlying, bijection::from)); - } - - public OracleType bimap(SqlFunction f, Function g) { - return new OracleType<>(typename.as(), read.map(f), write.contramap(g), oracleJson.bimap(f, g)); - } - - public static OracleType of( - String tpe, OracleRead r, OracleWrite w, OracleJson j) { - return new OracleType<>(OracleTypename.of(tpe), r, w, j); - } - - public static OracleType of( - 
OracleTypename typename, OracleRead r, OracleWrite w, OracleJson j) { - return new OracleType<>(typename, r, w, j); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/OracleTypename.java b/foundations-jdbc/src/java/dev/typr/foundations/OracleTypename.java deleted file mode 100644 index 165d8a3e3f..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/OracleTypename.java +++ /dev/null @@ -1,154 +0,0 @@ -package dev.typr.foundations; - -import java.util.Optional; - -/** - * Represents an Oracle SQL type name with optional precision and scale. Similar to MariaTypename - * but for Oracle. - */ -public sealed interface OracleTypename extends DbTypename { - String sqlType(); - - /** Oracle doesn't use PostgreSQL-style type casts in SQL. */ - @Override - default boolean renderTypeCast() { - return false; - } - - String sqlTypeNoPrecision(); - - OracleTypename renamed(String value); - - OracleTypename renamedDropPrecision(String value); - - default OracleTypename> opt() { - return new Opt<>(this); - } - - default OracleTypename as() { - return (OracleTypename) this; - } - - record Base(String sqlType) implements OracleTypename { - @Override - public String sqlTypeNoPrecision() { - return sqlType; - } - - @Override - public Base renamed(String value) { - return new Base<>(value); - } - - @Override - public Base renamedDropPrecision(String value) { - return new Base<>(value); - } - } - - record WithPrec(Base of, int precision) implements OracleTypename { - public String sqlType() { - return of.sqlType + "(" + precision + ")"; - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public OracleTypename renamed(String value) { - return new WithPrec<>(of.renamed(value), precision); - } - - @Override - public OracleTypename renamedDropPrecision(String value) { - return of.renamed(value); - } - } - - record WithPrecScale(Base of, int precision, int scale) implements OracleTypename { - public String 
sqlType() { - return of.sqlType + "(" + precision + "," + scale + ")"; - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public OracleTypename renamed(String value) { - return new WithPrecScale<>(of.renamed(value), precision, scale); - } - - @Override - public OracleTypename renamedDropPrecision(String value) { - return of.renamed(value); - } - } - - record Opt(OracleTypename of) implements OracleTypename> { - @Override - public String sqlType() { - return of.sqlType(); - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public OracleTypename> renamed(String value) { - return new Opt<>(of.renamed(value)); - } - - @Override - public OracleTypename> renamedDropPrecision(String value) { - return new Opt<>(of.renamedDropPrecision(value)); - } - } - - /** Typename for Oracle OBJECT types. */ - record ObjectOf(String sqlType) implements OracleTypename { - @Override - public String sqlTypeNoPrecision() { - return sqlType; - } - - @Override - public ObjectOf renamed(String value) { - return new ObjectOf<>(value); - } - - @Override - public ObjectOf renamedDropPrecision(String value) { - return new ObjectOf<>(value); - } - - public String sqlName() { - return sqlType; - } - - public OracleTypename asGeneric() { - return new Base<>(sqlType); - } - } - - static OracleTypename of(String sqlType) { - return new Base<>(sqlType); - } - - static ObjectOf objectOf(String sqlType) { - return new ObjectOf<>(sqlType); - } - - static OracleTypename of(String sqlType, int precision) { - return new WithPrec<>(new Base<>(sqlType), precision); - } - - static OracleTypename of(String sqlType, int precision, int scale) { - return new WithPrecScale<>(new Base<>(sqlType), precision, scale); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/OracleTypes.java b/foundations-jdbc/src/java/dev/typr/foundations/OracleTypes.java deleted file mode 100644 index 
96487dc02d..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/OracleTypes.java +++ /dev/null @@ -1,557 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.Json; -import dev.typr.foundations.data.OracleIntervalDS; -import dev.typr.foundations.data.OracleIntervalYM; -import java.math.BigDecimal; -import java.time.LocalDateTime; -import java.time.OffsetDateTime; -import java.util.function.Function; - -/** - * Oracle type definitions for the typr-runtime-java library. - * - *
<p>
This interface provides type codecs for all Oracle data types. - * - *
<p>
Oracle Type System Reference: - NUMBER(p,s): Universal numeric type - - * BINARY_FLOAT/BINARY_DOUBLE: IEEE 754 floating point - VARCHAR2/CHAR/NVARCHAR2/NCHAR: Character - * types - CLOB/NCLOB/BLOB: Large object types - DATE: Date with time (second precision) - - * TIMESTAMP: Date with fractional seconds - TIMESTAMP WITH TIME ZONE / WITH LOCAL TIME ZONE: - * Timezone-aware timestamps - INTERVAL YEAR TO MONTH / INTERVAL DAY TO SECOND: Interval types - - * RAW: Variable-length binary data - ROWID/UROWID: Row identifier types - XMLTYPE: XML document - * storage - JSON: Native JSON type (21c+) - */ -public interface OracleTypes { - // ═══════════════════════════════════════════════════════════════════════════ - // Numeric Types - // ═══════════════════════════════════════════════════════════════════════════ - - /** - * NUMBER - Oracle's universal numeric type (no precision/scale specified). Can hold integers, - * fixed-point, and floating-point values. - */ - OracleType number = - OracleType.of( - "NUMBER", OracleRead.readBigDecimal, OracleWrite.writeBigDecimal, OracleJson.numeric); - - /** NUMBER(p,0) where p<=9 -> Integer */ - OracleType numberInt = - OracleType.of( - "NUMBER", OracleRead.readNumberAsInt, OracleWrite.writeInteger, OracleJson.int4); - - /** - * NUMBER(p,0) where 9 - * - * Long - */ - OracleType numberLong = - OracleType.of("NUMBER", OracleRead.readNumberAsLong, OracleWrite.writeLong, OracleJson.int8); - - /** NUMBER with precision and scale factory methods */ - static OracleType number(int precision) { - return OracleType.of( - OracleTypename.of("NUMBER", precision), - OracleRead.readBigDecimal, - OracleWrite.writeBigDecimal, - OracleJson.numeric); - } - - static OracleType number(int precision, int scale) { - return OracleType.of( - OracleTypename.of("NUMBER", precision, scale), - OracleRead.readBigDecimal, - OracleWrite.writeBigDecimal, - OracleJson.numeric); - } - - static OracleType numberAsInt(int precision) { - return OracleType.of( - 
OracleTypename.of("NUMBER", precision), - OracleRead.readInteger, - OracleWrite.writeInteger, - OracleJson.int4); - } - - static OracleType numberAsLong(int precision) { - return OracleType.of( - OracleTypename.of("NUMBER", precision), - OracleRead.readLong, - OracleWrite.writeLong, - OracleJson.int8); - } - - /** BINARY_FLOAT - 32-bit IEEE 754 floating point. Range: +/-1.17549E-38 to +/-3.40282E+38 */ - OracleType binaryFloat = - OracleType.of( - "BINARY_FLOAT", OracleRead.readBinaryFloat, OracleWrite.writeFloat, OracleJson.float4); - - /** BINARY_DOUBLE - 64-bit IEEE 754 floating point. Range: +/-2.22507E-308 to +/-1.79769E+308 */ - OracleType binaryDouble = - OracleType.of( - "BINARY_DOUBLE", OracleRead.readBinaryDouble, OracleWrite.writeDouble, OracleJson.float8); - - /** - * FLOAT(precision) - ANSI float type (actually maps to NUMBER internally). Binary precision 1-126 - * (approximately 1-38 decimal digits). - */ - OracleType float_ = - OracleType.of("FLOAT", OracleRead.readDouble, OracleWrite.writeDouble, OracleJson.float8); - - static OracleType float_(int binaryPrecision) { - return OracleType.of( - OracleTypename.of("FLOAT", binaryPrecision), - OracleRead.readDouble, - OracleWrite.writeDouble, - OracleJson.float8); - } - - /** INTEGER - Equivalent to NUMBER(38,0). Used for ANSI compatibility. */ - OracleType integer = number.renamed("INTEGER"); - - /** SMALLINT - Equivalent to NUMBER(38,0). Used for ANSI compatibility. */ - OracleType smallint = number.renamed("SMALLINT"); - - /** REAL - Equivalent to FLOAT(63). Used for ANSI compatibility. */ - OracleType real = float_.renamed("REAL"); - - /** DOUBLE PRECISION - Equivalent to FLOAT(126). Used for ANSI compatibility. 
*/ - OracleType doublePrecision = float_.renamed("DOUBLE PRECISION"); - - // ═══════════════════════════════════════════════════════════════════════════ - // Character Types - // ═══════════════════════════════════════════════════════════════════════════ - - /** - * VARCHAR2(n) - Variable-length character string. Max 4000 bytes (or 32767 with - * MAX_STRING_SIZE=EXTENDED). - */ - OracleType varchar2 = - OracleType.of("VARCHAR2", OracleRead.readString, OracleWrite.writeString, OracleJson.text); - - static OracleType varchar2(int maxLength) { - return OracleType.of( - OracleTypename.of("VARCHAR2", maxLength), - OracleRead.readString, - OracleWrite.writeString, - OracleJson.text); - } - - /** - * VARCHAR2(n) - Variable-length character string, using NonEmptyString. For NOT NULL columns - - * guarantees non-empty values. - */ - static OracleType varchar2NonEmpty(int maxLength) { - return OracleType.of( - OracleTypename.of("VARCHAR2", maxLength), - OracleRead.readNonEmptyString, - OracleWrite.writeNonEmptyString, - OracleJson.nonEmptyString); - } - - /** CHAR(n) - Fixed-length character string, blank-padded. Max 2000 bytes. */ - OracleType char_ = - OracleType.of("CHAR", OracleRead.readString, OracleWrite.writeString, OracleJson.text); - - static OracleType char_(int length) { - return OracleType.of( - OracleTypename.of("CHAR", length), - OracleRead.readString, - OracleWrite.writeString, - OracleJson.text); - } - - /** - * CHAR(n) - Fixed-length character string, using PaddedString. For NOT NULL columns - guarantees - * non-empty, padded values. - */ - static OracleType charPadded(int length) { - return OracleType.of( - OracleTypename.of("CHAR", length), - OracleRead.readPaddedString(length), - OracleWrite.writePaddedString(), - OracleJson.paddedString(length)); - } - - /** - * NVARCHAR2(n) - Variable-length National character string. Uses AL16UTF16 or UTF8 encoding based - * on national character set. 
- */ - OracleType nvarchar2 = - OracleType.of("NVARCHAR2", OracleRead.readString, OracleWrite.writeString, OracleJson.text); - - static OracleType nvarchar2(int maxLength) { - return OracleType.of( - OracleTypename.of("NVARCHAR2", maxLength), - OracleRead.readString, - OracleWrite.writeString, - OracleJson.text); - } - - /** - * NVARCHAR2(n) - Variable-length National character string, using NonEmptyString. For NOT NULL - * columns - guarantees non-empty values. - */ - static OracleType nvarchar2NonEmpty(int maxLength) { - return OracleType.of( - OracleTypename.of("NVARCHAR2", maxLength), - OracleRead.readNonEmptyString, - OracleWrite.writeNonEmptyString, - OracleJson.nonEmptyString); - } - - /** NCHAR(n) - Fixed-length National character string. */ - OracleType nchar = - OracleType.of("NCHAR", OracleRead.readString, OracleWrite.writeString, OracleJson.text); - - static OracleType nchar(int length) { - return OracleType.of( - OracleTypename.of("NCHAR", length), - OracleRead.readString, - OracleWrite.writeString, - OracleJson.text); - } - - /** - * NCHAR(n) - Fixed-length National character string, using PaddedString. For NOT NULL columns - - * guarantees non-empty, padded values. - */ - static OracleType ncharPadded(int length) { - return OracleType.of( - OracleTypename.of("NCHAR", length), - OracleRead.readPaddedString(length), - OracleWrite.writePaddedString(), - OracleJson.paddedString(length)); - } - - /** CLOB - Character Large Object. Up to (4GB - 1) * DB_BLOCK_SIZE. */ - OracleType clob = - OracleType.of("CLOB", OracleRead.readClob, OracleWrite.writeClobForStruct(), OracleJson.text); - - /** - * CLOB - Character Large Object, using NonEmptyString. For NOT NULL columns - guarantees - * non-empty values. - */ - OracleType clobNonEmpty = - OracleType.of( - "CLOB", - OracleRead.readClob.map(NonEmptyString::force), - OracleWrite.writeClob.contramap(NonEmptyString::value), - OracleJson.nonEmptyString); - - /** NCLOB - National Character Large Object. 
*/ - OracleType nclob = - OracleType.of( - "NCLOB", OracleRead.readClob, OracleWrite.writeClobForStruct(), OracleJson.text); - - /** - * NCLOB - National Character Large Object, using NonEmptyString. For NOT NULL columns - - * guarantees non-empty values. - */ - OracleType nclobNonEmpty = - OracleType.of( - "NCLOB", - OracleRead.readClob.map(NonEmptyString::force), - OracleWrite.writeClob.contramap(NonEmptyString::value), - OracleJson.nonEmptyString); - - /** - * LONG - Deprecated character type (for backward compatibility only). Use CLOB instead for new - * applications. - */ - OracleType long_ = - OracleType.of("LONG", OracleRead.readString, OracleWrite.writeString, OracleJson.text); - - // ═══════════════════════════════════════════════════════════════════════════ - // Binary Types - // ═══════════════════════════════════════════════════════════════════════════ - - /** - * RAW(n) - Variable-length raw binary data. Max 2000 bytes (or 32767 with - * MAX_STRING_SIZE=EXTENDED). - */ - OracleType raw = - OracleType.of("RAW", OracleRead.readByteArray, OracleWrite.writeRaw(), OracleJson.bytea); - - static OracleType raw(int maxLength) { - return OracleType.of( - OracleTypename.of("RAW", maxLength), - OracleRead.readByteArray, - OracleWrite.writeRaw(), - OracleJson.bytea); - } - - /** - * RAW(n) - Variable-length raw binary data, using NonEmptyBlob. For NOT NULL columns - guarantees - * non-empty values. - */ - static OracleType rawNonEmpty(int maxLength) { - return OracleType.of( - OracleTypename.of("RAW", maxLength), - OracleRead.readNonEmptyBlob, - OracleWrite.writeNonEmptyBlob, - OracleJson.nonEmptyBlob); - } - - /** BLOB - Binary Large Object. Up to (4GB - 1) * DB_BLOCK_SIZE. */ - OracleType blob = - OracleType.of( - "BLOB", OracleRead.readBlob, OracleWrite.writeBlobForStruct(), OracleJson.bytea); - - /** - * BLOB - Binary Large Object, using NonEmptyBlob. For NOT NULL columns - guarantees non-empty - * values. 
- */ - OracleType blobNonEmpty = - OracleType.of( - "BLOB", - OracleRead.readBlob.map(NonEmptyBlob::force), - OracleWrite.writeBlob.contramap(NonEmptyBlob::value), - OracleJson.nonEmptyBlob); - - /** - * LONG RAW - Deprecated binary type (for backward compatibility only). Use BLOB instead for new - * applications. - */ - OracleType longRaw = - OracleType.of( - "LONG RAW", OracleRead.readByteArray, OracleWrite.writeByteArray, OracleJson.bytea); - - // BFILE - External file pointer (read-only, references files on server filesystem) - // Omitted: Requires special handling and rarely used in typical applications - - // ═══════════════════════════════════════════════════════════════════════════ - // Date/Time Types - // ═══════════════════════════════════════════════════════════════════════════ - - /** - * DATE - Date with time to second precision. Note: Oracle DATE includes time unlike SQL standard - * DATE! Range: January 1, 4712 BC to December 31, 9999 AD. - */ - OracleType date = - OracleType.of("DATE", OracleRead.readLocalDateTime, OracleWrite.writeDate(), OracleJson.date); - - /** TIMESTAMP - Timestamp with fractional seconds. Default precision is 6 (microseconds). */ - OracleType timestamp = - OracleType.of( - "TIMESTAMP", - OracleRead.readTimestamp, - OracleWrite.writeTimestamp(), - OracleJson.timestamp); - - static OracleType timestamp(int fractionalSecondsPrecision) { - return OracleType.of( - OracleTypename.of("TIMESTAMP", fractionalSecondsPrecision), - OracleRead.readTimestamp, - OracleWrite.writeTimestamp(), - OracleJson.timestamp); - } - - /** - * TIMESTAMP WITH TIME ZONE - Timestamp with explicit timezone. Stores the time zone offset or - * region name. 
- */ - OracleType timestampWithTimeZone = - OracleType.of( - "TIMESTAMP WITH TIME ZONE", - OracleRead.readOffsetDateTime, - OracleWrite.writeTimestampWithTimeZone(), - OracleJson.timestampWithTimeZone); - - static OracleType timestampWithTimeZone(int fractionalSecondsPrecision) { - return OracleType.of( - OracleTypename.of("TIMESTAMP(" + fractionalSecondsPrecision + ") WITH TIME ZONE"), - OracleRead.readOffsetDateTime, - OracleWrite.writeTimestampWithTimeZone(), - OracleJson.timestampWithTimeZone); - } - - /** - * TIMESTAMP WITH LOCAL TIME ZONE - Timestamp with timezone information. Oracle normalizes to - * session timezone, but we preserve OffsetDateTime to avoid data loss. - */ - OracleType timestampWithLocalTimeZone = - OracleType.of( - "TIMESTAMP WITH LOCAL TIME ZONE", - OracleRead.readLocalTimezoneTimestamp, - OracleWrite.writeTimestampWithLocalTimeZone(), - OracleJson.timestampWithTimeZone); - - static OracleType timestampWithLocalTimeZone(int fractionalSecondsPrecision) { - return OracleType.of( - OracleTypename.of("TIMESTAMP(" + fractionalSecondsPrecision + ") WITH LOCAL TIME ZONE"), - OracleRead.readLocalTimezoneTimestamp, - OracleWrite.writeTimestampWithLocalTimeZone(), - OracleJson.timestampWithTimeZone); - } - - /** - * INTERVAL YEAR TO MONTH - Interval in years and months. Represented as OracleIntervalYM which - * can convert to/from Oracle format (+02-05) and ISO-8601 (P2Y5M). - */ - OracleType intervalYearToMonth = - OracleType.of( - "INTERVAL YEAR TO MONTH", - OracleRead.readIntervalYearToMonth, - OracleWrite.writeIntervalYearToMonth(), - OracleJson.intervalYearToMonth); - - static OracleType intervalYearToMonth(int yearPrecision) { - return OracleType.of( - OracleTypename.of("INTERVAL YEAR(" + yearPrecision + ") TO MONTH"), - OracleRead.readIntervalYearToMonth, - OracleWrite.writeIntervalYearToMonth(), - OracleJson.intervalYearToMonth); - } - - /** - * INTERVAL DAY TO SECOND - Interval in days, hours, minutes, seconds. 
Represented as - * OracleIntervalDS which can convert to/from Oracle format (+03 14:30:45.123456) and ISO-8601 - * (P3DT14H30M45.123456S). - */ - OracleType intervalDayToSecond = - OracleType.of( - "INTERVAL DAY TO SECOND", - OracleRead.readIntervalDayToSecond, - OracleWrite.writeIntervalDayToSecond(), - OracleJson.intervalDayToSecond); - - static OracleType intervalDayToSecond( - int dayPrecision, int fractionalSecondsPrecision) { - return OracleType.of( - OracleTypename.of( - "INTERVAL DAY(" + dayPrecision + ") TO SECOND(" + fractionalSecondsPrecision + ")"), - OracleRead.readIntervalDayToSecond, - OracleWrite.writeIntervalDayToSecond(), - OracleJson.intervalDayToSecond); - } - - // ═══════════════════════════════════════════════════════════════════════════ - // ROWID Types (Oracle-specific) - // ═══════════════════════════════════════════════════════════════════════════ - - /** - * ROWID - Physical row address for heap-organized tables. 10-byte internal format, displayed as - * 18-character base-64 string. - */ - OracleType rowId = - OracleType.of("ROWID", OracleRead.readRowId, OracleWrite.writeString, OracleJson.rowId); - - /** - * UROWID - Universal ROWID for index-organized tables and foreign tables. Variable length, max - * 4000 bytes. - */ - OracleType uRowId = - OracleType.of("UROWID", OracleRead.readRowId, OracleWrite.writeString, OracleJson.rowId); - - static OracleType uRowId(int maxLength) { - return OracleType.of( - OracleTypename.of("UROWID", maxLength), - OracleRead.readRowId, - OracleWrite.writeString, - OracleJson.rowId); - } - - // ═══════════════════════════════════════════════════════════════════════════ - // XML/JSON Types - // ═══════════════════════════════════════════════════════════════════════════ - - /** XMLTYPE - XML document storage. Supports XQuery, XPath, and XML Schema validation. 
*/ - OracleType xmlType = - OracleType.of("XMLTYPE", OracleRead.readClob, OracleWrite.writeClob, OracleJson.xmlType); - - /** - * JSON - Native JSON type (Oracle 21c+). Binary optimized storage with efficient query support. - */ - OracleType json = - OracleType.of( - "JSON", - OracleRead.readString.map(Json::new), - OracleWrite.writeString.contramap(Json::value), - OracleJson.json); - - // ═══════════════════════════════════════════════════════════════════════════ - // Boolean Type (Oracle 23c+) - // ═══════════════════════════════════════════════════════════════════════════ - - /** - * BOOLEAN - Native boolean type (Oracle 23c+). Prior to 23c, use NUMBER(1) with 0/1 convention. - */ - OracleType boolean_ = - OracleType.of("BOOLEAN", OracleRead.readBoolean, OracleWrite.writeBoolean, OracleJson.bool); - - /** - * NUMBER(1) as Boolean - Traditional Oracle boolean representation. 0 = false, 1 = true (or any - * non-zero = true). - */ - OracleType numberAsBoolean = - OracleType.of( - OracleTypename.of("NUMBER", 1), - OracleRead.readInteger.map(i -> i != 0), - OracleWrite.writeInteger.contramap(b -> b ? 1 : 0), - OracleJson.bool); - - // ═══════════════════════════════════════════════════════════════════════════ - // ENUM Type Helper - // ═══════════════════════════════════════════════════════════════════════════ - - /** - * Create an OracleType for ENUM-like columns (stored as VARCHAR2 or NUMBER). Oracle doesn't have - * native ENUM type, so enums are typically stored as strings. 
- * - * @param sqlType The SQL type (e.g., "VARCHAR2(20)") - * @param fromString Function to convert string to enum value - * @param <E> The enum type - */ - static <E extends Enum<E>> OracleType<E> ofEnum(String sqlType, Function<String, E> fromString) { - return OracleType.of( - sqlType, - OracleRead.readString.map(fromString::apply), - OracleWrite.writeString.contramap(Enum::name), - OracleJson.text.bimap(fromString::apply, Enum::name)); - } - - // ═══════════════════════════════════════════════════════════════════════════ - // Spatial Types (Oracle Spatial) - // ═══════════════════════════════════════════════════════════════════════════ - - // SDO_GEOMETRY - Requires Oracle Spatial license and complex object mapping - // SDO_POINT_TYPE - Point type - // These are left as comments because they require: - // 1. Oracle Spatial extension - // 2. Special JDBC handling with oracle.spatial.geometry.JGeometry - // 3. Complex struct/object type handling - - // ═══════════════════════════════════════════════════════════════════════════ - // Object/Collection Types (Oracle Object-Relational) - // ═══════════════════════════════════════════════════════════════════════════ - - // OBJECT TYPE (CREATE TYPE ...
AS OBJECT) - User-defined object types - // VARRAY - Fixed-size ordered arrays - // NESTED TABLE - Unbounded collection types - // REF types - Object references - // - // These require special handling with: - // - oracle.sql.STRUCT for object types - // - oracle.sql.ARRAY for collections - // - oracle.sql.REF for references - // - // Code generation will create specific types for each user-defined type - - // ═══════════════════════════════════════════════════════════════════════════ - // Any Types (Dynamic typing - rarely used directly) - // ═══════════════════════════════════════════════════════════════════════════ - - // ANYDATA - Container for any SQL type - // ANYTYPE - Type descriptor - // ANYDATASET - Collection of ANYDATA - // - // These are used for generic/polymorphic PL/SQL procedures - // Not commonly mapped to Java types directly - - // ==================== Unknown Type ==================== - // For columns whose type typr doesn't know how to handle - cast to/from string - OracleType unknown = - OracleType.of( - "VARCHAR2(4000)", OracleRead.readString, OracleWrite.writeString, OracleJson.text) - .bimap(dev.typr.foundations.data.Unknown::new, dev.typr.foundations.data.Unknown::value); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/OracleVArray.java b/foundations-jdbc/src/java/dev/typr/foundations/OracleVArray.java deleted file mode 100644 index e1035f8c20..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/OracleVArray.java +++ /dev/null @@ -1,131 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.JsonValue; -import java.sql.ResultSet; -import java.sql.SQLException; -import java.sql.Types; -import java.util.ArrayList; -import java.util.List; -import oracle.sql.ARRAY; -import oracle.sql.ArrayDescriptor; - -/** - * Oracle VARRAY type support. - * - *

A VARRAY is a fixed-maximum-size ordered collection. Example: CREATE TYPE phone_list AS - * VARRAY(5) OF VARCHAR2(25) - * - *

Unlike nested tables, VARRAYs have a maximum size that is enforced. - */ -public class OracleVArray { - /** - * Create an OracleType for a VARRAY type. - * - * @param varrayTypeName The SQL type name (e.g., "SCHEMA.PHONE_LIST") - * @param maxSize Maximum number of elements in the VARRAY - * @param elementType The type of elements in the VARRAY - * @param <T> The Java type of elements - * @return An OracleType that can read/write VARRAY values as Lists - */ - public static <T> OracleType<List<T>> of( - String varrayTypeName, int maxSize, OracleType<T> elementType) { - OracleRead<List<T>> read = - new OracleRead.NonNullable<>( - ResultSet::getObject, - obj -> { - if (!(obj instanceof ARRAY array)) { - throw new SQLException( - "Expected ARRAY, got: " + (obj == null ? "null" : obj.getClass().getName())); - } - - try { - Object[] rawArray = (Object[]) array.getArray(); - - // Convert Object[] to List<T> using element type's fromOracleValue() - List<T> result = new ArrayList<>(rawArray.length); - for (Object element : rawArray) { - if (element == null) { - result.add(null); - } else { - T converted = elementType.read().fromOracleValue(element); - result.add(converted); - } - } - - return result; - } catch (Exception e) { - throw new SQLException("Failed to read Oracle VARRAY: " + e.getMessage(), e); - } - }); - - // Use structured() instead of primitive() to support STRUCT context - // toOracleValue() will convert List<T> → oracle.sql.ARRAY - OracleWrite<List<T>> write = - OracleWrite.structured( - (list, conn) -> { - if (list == null) return null; - if (list.size() > maxSize) { - throw new IllegalArgumentException( - varrayTypeName - + " max size is " - + maxSize - + ", got " - + list.size() - + " elements"); - } - try { - // Convert each element using elementType's toOracleValue() - // For OBJECT types: converts to oracle.sql.STRUCT - // For primitive types: value passes through unchanged - Object[] elements = new Object[list.size()]; - for (int i = 0; i < list.size(); i++) { - elements[i] =
elementType.write().toOracleValue(list.get(i), conn); - } - - ArrayDescriptor desc = ArrayDescriptor.createDescriptor(varrayTypeName, conn); - return new ARRAY(desc, conn, elements); - } catch (Exception e) { - throw new SQLException("Failed to write Oracle VARRAY: " + e.getMessage(), e); - } - }, - varrayTypeName, - Types.ARRAY); - - // Generate OracleJson codec - OracleJson<List<T>> json = json(elementType); - - return new OracleType<>(OracleTypename.of(varrayTypeName), read, write, json); - } - - /** Generate JSON codec for list type. */ - private static <T> OracleJson<List<T>> json(OracleType<T> elementType) { - return new OracleJson<List<T>>() { - @Override - public JsonValue toJson(List<T> list) { - if (list == null) return JsonValue.JNull.INSTANCE; - - List<JsonValue> elements = new ArrayList<>(); - for (T element : list) { - elements.add(elementType.oracleJson().toJson(element)); - } - return new JsonValue.JArray(elements); - } - - @Override - public List<T> fromJson(JsonValue json) { - if (json instanceof JsonValue.JNull) return null; - if (!(json instanceof JsonValue.JArray(List<JsonValue> elements))) { - throw new IllegalArgumentException( - "Expected JSON array for VARRAY type, got: " + json.getClass().getSimpleName()); - } - - List<T> result = new ArrayList<>(elements.size()); - for (JsonValue element : elements) { - result.add(elementType.oracleJson().fromJson(element)); - } - - return result; - } - }; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/OracleWrite.java b/foundations-jdbc/src/java/dev/typr/foundations/OracleWrite.java deleted file mode 100644 index 6fd3a3f6df..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/OracleWrite.java +++ /dev/null @@ -1,330 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.OracleIntervalDS; -import dev.typr.foundations.data.OracleIntervalYM; -import java.math.BigDecimal; -import java.sql.Connection; -import java.sql.PreparedStatement; -import java.sql.SQLException; -import java.sql.Timestamp; -import
java.time.LocalDateTime; -import java.time.OffsetDateTime; -import java.util.Optional; -import java.util.function.Function; -import oracle.sql.BLOB; -import oracle.sql.CLOB; -import oracle.sql.INTERVALDS; -import oracle.sql.INTERVALYM; -import oracle.sql.RAW; -import oracle.sql.TIMESTAMPTZ; - -/** - * Describes how to write a value to a {@link PreparedStatement} for Oracle. - * - *

Similar to MariaWrite but adapted for Oracle-specific types. - */ -public sealed interface OracleWrite<A> extends DbWrite<A> - permits OracleWrite.Instance, OracleWrite.Structured { - /** - * Convert a Java value to its Oracle SQL representation. - OBJECT types → oracle.sql.STRUCT - - * ARRAY types → oracle.sql.ARRAY - Primitive types → value itself (identity) - */ - Object toOracleValue(A value, Connection conn) throws SQLException; - - void set(PreparedStatement ps, int idx, A a) throws SQLException; - - OracleWrite<Optional<A>> opt(OracleTypename typename); - - <B> OracleWrite<B> contramap(Function<B, A> f); - - @FunctionalInterface - interface RawWriter<A> { - void set(PreparedStatement ps, int index, A a) throws SQLException; - } - - /** For primitive types - toOracleValue is identity. */ - record Instance<A, U>(RawWriter<U> rawWriter, Function<A, U> f) implements OracleWrite<A> { - @Override - public Object toOracleValue(A value, Connection conn) { - if (value == null) return null; - return f.apply(value); // Apply function to unwrap Optional or transform value - } - - @Override - public void set(PreparedStatement ps, int index, A a) throws SQLException { - rawWriter.set(ps, index, f.apply(a)); - } - - @Override - public OracleWrite<Optional<A>> opt(OracleTypename typename) { - return new Instance<>( - (ps, index, u) -> { - if (u == null) ps.setNull(index, java.sql.Types.NULL); - else set(ps, index, u); - }, - a -> a.orElse(null)); - } - - @Override - public <B> OracleWrite<B> contramap(Function<B, A> f) { - return new Instance<>(rawWriter, f.andThen(this.f)); - } - } - - /** For structured types (OBJECT, ARRAY) - requires conversion.
*/ - record Structured<A>(SqlBiFunction<A, Connection, Object> converter, String typename, int sqlType) - implements OracleWrite<A> { - @Override - public Object toOracleValue(A value, Connection conn) throws SQLException { - if (value == null) return null; - return converter.apply(value, conn); - } - - @Override - public void set(PreparedStatement ps, int index, A value) throws SQLException { - if (value == null) { - ps.setNull(index, sqlType, typename); - } else { - Object oracleValue = toOracleValue(value, ps.getConnection()); - if (oracleValue == null) { - ps.setNull(index, sqlType, typename); - } else { - ps.setObject(index, oracleValue); - } - } - } - - @Override - public OracleWrite<Optional<A>> opt(OracleTypename typeName) { - return new Structured<>( - (opt, conn) -> - opt.map( - v -> { - try { - return converter.apply(v, conn); - } catch (SQLException e) { - throw new RuntimeException(e); - } - }) - .orElse(null), - typename, - sqlType); - } - - @Override - public <B> OracleWrite<B> contramap(Function<B, A> f) { - return new Structured<>((b, conn) -> converter.apply(f.apply(b), conn), typename, sqlType); - } - } - - static <A> OracleWrite<A> primitive(RawWriter<A> rawWriter) { - return new Instance<>(rawWriter, Function.identity()); - } - - /** - * Create OracleWrite for structured types (OBJECT, ARRAY). Provide converter function, typename, - * and SQL type. - */ - static <A> Structured<A> structured( - SqlBiFunction<A, Connection, Object> converter, String typename, int sqlType) { - return new Structured<>(converter, typename, sqlType); - } - - static OracleWrite<Object> passObjectToJdbc() { - return primitive(PreparedStatement::setObject); - } - - /** - * Writer for DATE (includes time component in Oracle). Converts LocalDateTime to - * java.sql.Timestamp for STRUCT context.
- */ - static OracleWrite writeDate() { - return structured( - (localDateTime, conn) -> { - if (localDateTime == null) return null; - // Convert LocalDateTime to java.sql.Timestamp - // Oracle DATE includes time component (unlike SQL standard DATE) - return Timestamp.valueOf(localDateTime); - }, - "DATE", - java.sql.Types.DATE); - } - - /** - * Writer for TIMESTAMP (without timezone). Converts LocalDateTime to java.sql.Timestamp for - * STRUCT context. - */ - static OracleWrite writeTimestamp() { - return structured( - (localDateTime, conn) -> { - if (localDateTime == null) return null; - // Convert LocalDateTime to java.sql.Timestamp - // Use java.sql.Timestamp directly - Oracle STRUCT handles the conversion - return Timestamp.valueOf(localDateTime); - }, - "TIMESTAMP", - java.sql.Types.TIMESTAMP); - } - - /** - * Writer for TIMESTAMP WITH LOCAL TIME ZONE. Converts OffsetDateTime to oracle.sql.TIMESTAMPLTZ - * for STRUCT context. Note: Oracle normalizes this to database timezone and returns in session - * timezone. We use the instant (UTC) representation and let Oracle handle timezone conversion. - */ - static OracleWrite writeTimestampWithLocalTimeZone() { - return structured( - (offsetDateTime, conn) -> { - if (offsetDateTime == null) return null; - // Convert OffsetDateTime to java.sql.Timestamp (UTC instant) - // Oracle TIMESTAMPLTZ will store the instant and return in session timezone - Timestamp timestamp = Timestamp.from(offsetDateTime.toInstant()); - return new oracle.sql.TIMESTAMPLTZ(conn, timestamp); - }, - "TIMESTAMP WITH LOCAL TIME ZONE", - java.sql.Types.TIMESTAMP_WITH_TIMEZONE); - } - - /** - * Writer for TIMESTAMP WITH TIME ZONE. Converts OffsetDateTime to oracle.sql.TIMESTAMPTZ for - * STRUCT context. Oracle stores the timezone offset along with the timestamp. 
- */ - static OracleWrite writeTimestampWithTimeZone() { - return structured( - (offsetDateTime, conn) -> { - if (offsetDateTime == null) return null; - // Convert OffsetDateTime to java.sql.Timestamp - Timestamp timestamp = Timestamp.from(offsetDateTime.toInstant()); - // Create oracle.sql.TIMESTAMPTZ with timezone information - // TIMESTAMPTZ constructor: TIMESTAMPTZ(Connection, Timestamp, Calendar) - // We need to format with timezone offset - java.util.Calendar calendar = java.util.Calendar.getInstance(); - calendar.setTimeZone(java.util.TimeZone.getTimeZone(offsetDateTime.getOffset())); - calendar.setTimeInMillis(timestamp.getTime()); - return new TIMESTAMPTZ(conn, timestamp, calendar); - }, - "TIMESTAMP WITH TIME ZONE", - java.sql.Types.TIMESTAMP_WITH_TIMEZONE); - } - - /** - * Writer for INTERVAL YEAR TO MONTH. Converts OracleIntervalYM to oracle.sql.INTERVALYM for - * STRUCT context. - */ - static OracleWrite writeIntervalYearToMonth() { - return structured( - (interval, conn) -> { - if (interval == null) return null; - // Convert to Oracle format string (+02-05) and create oracle.sql.INTERVALYM - return new INTERVALYM(interval.toOracleFormat()); - }, - "INTERVAL YEAR TO MONTH", - java.sql.Types.OTHER); - } - - /** - * Writer for INTERVAL DAY TO SECOND. Converts OracleIntervalDS to oracle.sql.INTERVALDS for - * STRUCT context. - */ - static OracleWrite writeIntervalDayToSecond() { - return structured( - (interval, conn) -> { - if (interval == null) return null; - // Convert to Oracle format string (+03 14:30:45.123456) and create oracle.sql.INTERVALDS - return new INTERVALDS(interval.toOracleFormat()); - }, - "INTERVAL DAY TO SECOND", - java.sql.Types.OTHER); - } - - /** - * Writer for RAW (variable-length binary data). Converts byte[] to oracle.sql.RAW for STRUCT - * context. 
- */ - static OracleWrite writeRaw() { - return structured( - (bytes, conn) -> { - if (bytes == null) return null; - // Convert byte[] to oracle.sql.RAW - return new RAW(bytes); - }, - "RAW", - java.sql.Types.VARBINARY); - } - - /** - * Writer for BLOB (large binary object). Converts byte[] to oracle.sql.BLOB for STRUCT context. - */ - static OracleWrite writeBlobForStruct() { - return structured( - (bytes, conn) -> { - if (bytes == null) return null; - // Create temporary BLOB and populate it - BLOB blob = BLOB.createTemporary(conn, true, BLOB.DURATION_SESSION); - blob.setBytes(1, bytes); - return blob; - }, - "BLOB", - java.sql.Types.BLOB); - } - - /** - * Writer for CLOB (character large object). Converts String to oracle.sql.CLOB for STRUCT - * context. - */ - static OracleWrite writeClobForStruct() { - return structured( - (str, conn) -> { - if (str == null) return null; - // Create temporary CLOB and populate it - CLOB clob = CLOB.createTemporary(conn, true, CLOB.DURATION_SESSION); - clob.setString(1, str); - return clob; - }, - "CLOB", - java.sql.Types.CLOB); - } - - // Basic type writers - OracleWrite writeString = primitive(PreparedStatement::setString); - OracleWrite writeNonEmptyString = - primitive( - (ps, idx, nes) -> { - ps.setString(idx, nes.value()); - }); - - static OracleWrite writePaddedString() { - return primitive( - (ps, idx, padded) -> { - ps.setString(idx, padded.value()); - }); - } - - OracleWrite writeBoolean = primitive(PreparedStatement::setBoolean); - OracleWrite writeByte = primitive(PreparedStatement::setByte); - OracleWrite writeShort = primitive(PreparedStatement::setShort); - OracleWrite writeInteger = primitive(PreparedStatement::setInt); - OracleWrite writeLong = primitive(PreparedStatement::setLong); - OracleWrite writeFloat = primitive(PreparedStatement::setFloat); - OracleWrite writeDouble = primitive(PreparedStatement::setDouble); - OracleWrite writeBigDecimal = primitive(PreparedStatement::setBigDecimal); - OracleWrite 
writeByteArray = primitive(PreparedStatement::setBytes); - OracleWrite writeNonEmptyBlob = - primitive( - (ps, idx, neb) -> { - ps.setBytes(idx, neb.value()); - }); - - // BLOB writer - OracleWrite writeBlob = - primitive( - (ps, idx, bytes) -> { - ps.setBlob(idx, new java.io.ByteArrayInputStream(bytes), bytes.length); - }); - - // CLOB writer - OracleWrite writeClob = - primitive( - (ps, idx, str) -> { - ps.setClob(idx, new java.io.StringReader(str), str.length()); - }); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/PaddedString.java b/foundations-jdbc/src/java/dev/typr/foundations/PaddedString.java deleted file mode 100644 index 0b5ee3697c..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/PaddedString.java +++ /dev/null @@ -1,87 +0,0 @@ -package dev.typr.foundations; - -import java.util.Objects; -import java.util.Optional; - -/** - * A fixed-length, blank-padded string value. - * - *

Represents Oracle CHAR(n) and NCHAR(n) types, which are always padded to exactly n characters - * with trailing spaces. Oracle converts empty strings to NULL, so this type represents strings that - * are guaranteed to be non-null and non-empty. - */ -public final class PaddedString { - private final String value; - private final int length; - - private PaddedString(String value, int length) { - this.value = value; - this.length = length; - } - - /** - * Smart constructor: Create a PaddedString from a string value and target length. The string will - * be padded to the specified length with trailing spaces. Returns Optional.empty() if the string - * is null, empty, or longer than the target length. - */ - public static Optional<PaddedString> apply(String s, int length) { - if (s == null || s.isEmpty()) { - return Optional.empty(); - } - if (s.length() > length) { - return Optional.empty(); - } - // Pad to target length - String padded = String.format("%-" + length + "s", s); - return Optional.of(new PaddedString(padded, length)); - } - - /** - * Force constructor: Create a PaddedString from a string value and target length. Throws - * IllegalArgumentException if the string is null, empty, or longer than the target length. - */ - public static PaddedString force(String s, int length) { - if (s == null || s.isEmpty()) { - throw new IllegalArgumentException("String cannot be null or empty"); - } - if (s.length() > length) { - throw new IllegalArgumentException( - "String length " + s.length() + " exceeds maximum " + length); - } - String padded = String.format("%-" + length + "s", s); - return new PaddedString(padded, length); - } - - /** Get the padded value (includes trailing spaces). */ - public String value() { - return value; - } - - /** Get the declared length of this padded string. */ - public int length() { - return length; - } - - /** Get the value with trailing spaces removed.
*/ - public String trimmed() { - return value.stripTrailing(); - } - - @Override - public String toString() { - return value; - } - - @Override - public boolean equals(Object o) { - if (this == o) return true; - if (o == null || getClass() != o.getClass()) return false; - PaddedString that = (PaddedString) o; - return length == that.length && value.equals(that.value); - } - - @Override - public int hashCode() { - return Objects.hash(value, length); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/PgCompositeText.java b/foundations-jdbc/src/java/dev/typr/foundations/PgCompositeText.java deleted file mode 100644 index a57cc68853..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/PgCompositeText.java +++ /dev/null @@ -1,712 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.Money; -import java.math.BigDecimal; -import java.time.OffsetTime; -import java.time.format.DateTimeFormatter; -import java.time.format.DateTimeFormatterBuilder; -import java.time.temporal.ChronoField; -import java.util.LinkedHashMap; -import java.util.List; -import java.util.Map; -import java.util.Optional; -import java.util.UUID; -import java.util.function.Function; -import java.util.function.IntFunction; -import org.postgresql.geometric.*; -import org.postgresql.util.PGInterval; - -/** - * Simple text encoding/decoding for values within PostgreSQL composite types. - * - *
<p>
This provides simple encoding (toString-like) and decoding (parse) for field values. The - * encoding does NOT include any escaping - that is handled by {@link PgRecordParser} which applies - * the composite type format (quoting, quote-doubling). - * - *
<p>
This is separate from {@link PgText} which handles COPY format with backslash escaping. - */ -public abstract class PgCompositeText { - - /** - * Encode a value to its simple text representation. - * - * @return Optional.empty() to represent SQL NULL, Optional.of(string) for the encoded value - */ - public abstract Optional encode(A value); - - /** Decode a value from its text representation. */ - public abstract A decode(String text); - - /** Create an array version of this codec with comma delimiter. */ - public PgCompositeText array(IntFunction arrayFactory) { - return array(arrayFactory, ','); - } - - /** - * Create an array version of this codec with a custom delimiter. - * - *
<p>
PostgreSQL uses semicolon (;) as the delimiter for geometric type arrays (box[], circle[], - * line[], lseg[], path[], point[], polygon[]) because their elements contain commas. - * - * @param arrayFactory factory to create arrays of the element type - * @param delimiter the array element delimiter character - */ - public PgCompositeText array(IntFunction arrayFactory, char delimiter) { - var self = this; - return new PgCompositeText<>() { - @Override - public Optional encode(A[] values) { - List list = java.util.Arrays.asList(values); - return Optional.of( - PgRecordParser.encodeArray(list, v -> self.encode(v).orElse(null), delimiter)); - } - - @Override - public A[] decode(String text) { - List elements = PgRecordParser.parseArray(text, delimiter); - A[] result = arrayFactory.apply(elements.size()); - for (int i = 0; i < elements.size(); i++) { - String elem = elements.get(i); - result[i] = elem == null ? null : self.decode(elem); - } - return result; - } - }; - } - - /** Transform this codec to work with a different type. */ - public PgCompositeText bimap(Function f, Function g) { - var self = this; - return new PgCompositeText<>() { - @Override - public Optional encode(B value) { - return self.encode(g.apply(value)); - } - - @Override - public B decode(String text) { - return f.apply(self.decode(text)); - } - }; - } - - /** Create an optional version of this codec. */ - public PgCompositeText> opt() { - var self = this; - return new PgCompositeText<>() { - @Override - public Optional encode(Optional value) { - return value.flatMap(self::encode); - } - - @Override - public Optional decode(String text) { - if (text == null) { - return Optional.empty(); - } - return Optional.of(self.decode(text)); - } - }; - } - - /** Create a PgCompositeText from encode and decode functions. 
*/ - public static PgCompositeText of( - Function encoder, Function decoder) { - return new PgCompositeText<>() { - @Override - public Optional encode(A value) { - return Optional.of(encoder.apply(value)); - } - - @Override - public A decode(String text) { - return decoder.apply(text); - } - }; - } - - // ======================================================================== - // Standard instances - // ======================================================================== - - /** String: identity encoding/decoding. */ - public static final PgCompositeText text = of(Function.identity(), Function.identity()); - - /** Integer: toString/parseInt. */ - public static final PgCompositeText int4 = of(Object::toString, Integer::parseInt); - - /** Short: toString/parseShort. */ - public static final PgCompositeText int2 = of(Object::toString, Short::parseShort); - - /** Long: toString/parseLong. */ - public static final PgCompositeText int8 = of(Object::toString, Long::parseLong); - - /** Float: toString/parseFloat. */ - public static final PgCompositeText float4 = of(Object::toString, Float::parseFloat); - - /** Double: toString/parseDouble. */ - public static final PgCompositeText float8 = of(Object::toString, Double::parseDouble); - - /** BigDecimal: toString/new BigDecimal. */ - public static final PgCompositeText numeric = of(Object::toString, BigDecimal::new); - - /** Boolean: t/f format. */ - public static final PgCompositeText bool = - of(b -> b ? "t" : "f", text -> text.equals("t") || text.equals("true") || text.equals("1")); - - /** UUID: toString/fromString. */ - public static final PgCompositeText uuid = of(Object::toString, UUID::fromString); - - /** - * Money: PostgreSQL returns money with currency symbol (e.g., "$42.22"). We encode as plain - * number and decode handling the currency symbol. 
- */ - public static final PgCompositeText money = of(m -> String.valueOf(m.value()), Money::new); - - /** - * OffsetTime (timetz): handles both standard format and PostgreSQL's short offset format. - * PostgreSQL may return "16:30:00+03" instead of "16:30:00+03:00". - */ - public static final PgCompositeText timetz = - new PgCompositeText<>() { - // Standard format with optional fractional seconds and full offset - private static final DateTimeFormatter STANDARD = - new DateTimeFormatterBuilder() - .appendPattern("HH:mm:ss") - .optionalStart() - .appendFraction(ChronoField.NANO_OF_SECOND, 0, 9, true) - .optionalEnd() - .appendOffset("+HH:MM", "Z") - .toFormatter(); - - // Short offset format (e.g., +03 instead of +03:00) - private static final DateTimeFormatter SHORT_OFFSET = - new DateTimeFormatterBuilder() - .appendPattern("HH:mm:ss") - .optionalStart() - .appendFraction(ChronoField.NANO_OF_SECOND, 0, 9, true) - .optionalEnd() - .appendOffset("+HH", "Z") - .toFormatter(); - - @Override - public Optional encode(OffsetTime value) { - return Optional.of(value.toString()); - } - - @Override - public OffsetTime decode(String text) { - try { - return OffsetTime.parse(text, STANDARD); - } catch (java.time.format.DateTimeParseException e) { - return OffsetTime.parse(text, SHORT_OFFSET); - } - } - }; - - // ======================================================================== - // Geometric types - use PGobject's getValue/constructor - // ======================================================================== - - /** PGpoint: format (x,y). */ - public static final PgCompositeText point = - new PgCompositeText<>() { - @Override - public Optional encode(PGpoint value) { - return Optional.of(value.getValue()); - } - - @Override - public PGpoint decode(String text) { - try { - return new PGpoint(text); - } catch (java.sql.SQLException e) { - throw new RuntimeException("Failed to parse PGpoint: " + text, e); - } - } - }; - - /** PGbox: format (x1,y1),(x2,y2). 
*/ - public static final PgCompositeText box = - new PgCompositeText<>() { - @Override - public Optional encode(PGbox value) { - return Optional.of(value.getValue()); - } - - @Override - public PGbox decode(String text) { - try { - return new PGbox(text); - } catch (java.sql.SQLException e) { - throw new RuntimeException("Failed to parse PGbox: " + text, e); - } - } - }; - - /** PGcircle: format <(x,y),r>. */ - public static final PgCompositeText circle = - new PgCompositeText<>() { - @Override - public Optional encode(PGcircle value) { - return Optional.of(value.getValue()); - } - - @Override - public PGcircle decode(String text) { - try { - return new PGcircle(text); - } catch (java.sql.SQLException e) { - throw new RuntimeException("Failed to parse PGcircle: " + text, e); - } - } - }; - - /** PGline: format {A,B,C}. */ - public static final PgCompositeText line = - new PgCompositeText<>() { - @Override - public Optional encode(PGline value) { - return Optional.of(value.getValue()); - } - - @Override - public PGline decode(String text) { - try { - return new PGline(text); - } catch (java.sql.SQLException e) { - throw new RuntimeException("Failed to parse PGline: " + text, e); - } - } - }; - - /** PGlseg: format [(x1,y1),(x2,y2)]. */ - public static final PgCompositeText lseg = - new PgCompositeText<>() { - @Override - public Optional encode(PGlseg value) { - return Optional.of(value.getValue()); - } - - @Override - public PGlseg decode(String text) { - try { - return new PGlseg(text); - } catch (java.sql.SQLException e) { - throw new RuntimeException("Failed to parse PGlseg: " + text, e); - } - } - }; - - /** PGpath: format [(x1,y1),...] or ((x1,y1),...). 
*/ - public static final PgCompositeText path = - new PgCompositeText<>() { - @Override - public Optional encode(PGpath value) { - return Optional.of(value.getValue()); - } - - @Override - public PGpath decode(String text) { - try { - return new PGpath(text); - } catch (java.sql.SQLException e) { - throw new RuntimeException("Failed to parse PGpath: " + text, e); - } - } - }; - - /** PGpolygon: format ((x1,y1),...). */ - public static final PgCompositeText polygon = - new PgCompositeText<>() { - @Override - public Optional encode(PGpolygon value) { - return Optional.of(value.getValue()); - } - - @Override - public PGpolygon decode(String text) { - try { - return new PGpolygon(text); - } catch (java.sql.SQLException e) { - throw new RuntimeException("Failed to parse PGpolygon: " + text, e); - } - } - }; - - // ======================================================================== - // Other complex types - // ======================================================================== - - /** PGInterval: text format like "1 year 2 mons 3 days 04:05:06.666". */ - public static final PgCompositeText interval = - new PgCompositeText<>() { - @Override - public Optional encode(PGInterval value) { - return Optional.of(value.getValue()); - } - - @Override - public PGInterval decode(String text) { - try { - return new PGInterval(text); - } catch (java.sql.SQLException e) { - throw new RuntimeException("Failed to parse PGInterval: " + text, e); - } - } - }; - - /** - * bytea: PostgreSQL hex format \x followed by hex digits. Note: inside composite types, - * PostgreSQL uses single backslash (\x), not the COPY format double backslash (\\x). 
- */ - public static final PgCompositeText bytea = - new PgCompositeText<>() { - @Override - public Optional encode(byte[] value) { - StringBuilder sb = new StringBuilder(2 + value.length * 2); - sb.append("\\x"); - for (byte b : value) { - sb.append(String.format("%02x", b & 0xff)); - } - return Optional.of(sb.toString()); - } - - @Override - public byte[] decode(String text) { - // Handle both \x and \\x prefixes (PostgreSQL may return either) - String hex; - if (text.startsWith("\\x")) { - hex = text.substring(2); - } else if (text.startsWith("\\\\x")) { - hex = text.substring(3); - } else { - throw new IllegalArgumentException("Invalid bytea format: " + text); - } - if (hex.isEmpty()) { - return new byte[0]; - } - byte[] result = new byte[hex.length() / 2]; - for (int i = 0; i < result.length; i++) { - result[i] = (byte) Integer.parseInt(hex.substring(i * 2, i * 2 + 2), 16); - } - return result; - } - }; - - /** - * hstore: key=>value pairs. Inside composite types, the format uses double-quoted keys and - * values. 
- */ - public static final PgCompositeText> hstore = - new PgCompositeText<>() { - @Override - public Optional encode(Map value) { - StringBuilder sb = new StringBuilder(); - boolean first = true; - for (Map.Entry entry : value.entrySet()) { - if (first) { - first = false; - } else { - sb.append(", "); - } - appendQuoted(sb, entry.getKey()); - sb.append("=>"); - if (entry.getValue() == null) { - sb.append("NULL"); - } else { - appendQuoted(sb, entry.getValue()); - } - } - return Optional.of(sb.toString()); - } - - private void appendQuoted(StringBuilder sb, String s) { - sb.append('"'); - for (int i = 0; i < s.length(); i++) { - char c = s.charAt(i); - if (c == '"') { - sb.append("\\\""); - } else if (c == '\\') { - sb.append("\\\\"); - } else { - sb.append(c); - } - } - sb.append('"'); - } - - @Override - public Map decode(String text) { - Map result = new LinkedHashMap<>(); - if (text.isEmpty()) { - return result; - } - // Parse hstore format: "key"=>"value", "key2"=>"value2" - int pos = 0; - while (pos < text.length()) { - // Skip whitespace - while (pos < text.length() && Character.isWhitespace(text.charAt(pos))) { - pos++; - } - if (pos >= text.length()) break; - - // Parse key - String key = parseQuotedString(text, pos); - pos += key.length() + 2; // +2 for quotes - - // Skip => - while (pos < text.length() && (text.charAt(pos) == '=' || text.charAt(pos) == '>')) { - pos++; - } - - // Skip whitespace - while (pos < text.length() && Character.isWhitespace(text.charAt(pos))) { - pos++; - } - - // Parse value (could be NULL or quoted string) - String value; - if (text.regionMatches(true, pos, "NULL", 0, 4)) { - value = null; - pos += 4; - } else { - value = parseQuotedString(text, pos); - pos += value.length() + 2; // +2 for quotes - } - - result.put( - unescapeHstoreString(key), value == null ? 
null : unescapeHstoreString(value)); - - // Skip comma and whitespace - while (pos < text.length() - && (text.charAt(pos) == ',' || Character.isWhitespace(text.charAt(pos)))) { - pos++; - } - } - return result; - } - - private String parseQuotedString(String text, int start) { - if (text.charAt(start) != '"') { - throw new IllegalArgumentException("Expected quoted string at position " + start); - } - StringBuilder sb = new StringBuilder(); - int i = start + 1; - while (i < text.length()) { - char c = text.charAt(i); - if (c == '"') { - // Check for doubled quote (escaped) - if (i + 1 < text.length() && text.charAt(i + 1) == '"') { - sb.append('"'); - i += 2; - } else { - break; // End of string - } - } else if (c == '\\' && i + 1 < text.length()) { - char next = text.charAt(i + 1); - if (next == '"' || next == '\\') { - sb.append(next); - i += 2; - } else { - sb.append(c); - i++; - } - } else { - sb.append(c); - i++; - } - } - return sb.toString(); - } - - private String unescapeHstoreString(String s) { - // The parseQuotedString already handles escaping - return s; - } - }; - - // ======================================================================== - // Unboxed primitive arrays - // ======================================================================== - - /** Unboxed boolean array: format {t,f,t}. */ - public static final PgCompositeText boolArrayUnboxed = - new PgCompositeText<>() { - @Override - public Optional encode(boolean[] value) { - StringBuilder sb = new StringBuilder(); - sb.append('{'); - for (int i = 0; i < value.length; i++) { - if (i > 0) sb.append(','); - sb.append(value[i] ? 
't' : 'f'); - } - sb.append('}'); - return Optional.of(sb.toString()); - } - - @Override - public boolean[] decode(String text) { - List elements = PgRecordParser.parseArray(text); - boolean[] result = new boolean[elements.size()]; - for (int i = 0; i < elements.size(); i++) { - String elem = elements.get(i); - result[i] = - elem != null && (elem.equals("t") || elem.equals("true") || elem.equals("1")); - } - return result; - } - }; - - /** Unboxed short array: format {1,2,3}. */ - public static final PgCompositeText shortArrayUnboxed = - new PgCompositeText<>() { - @Override - public Optional encode(short[] value) { - StringBuilder sb = new StringBuilder(); - sb.append('{'); - for (int i = 0; i < value.length; i++) { - if (i > 0) sb.append(','); - sb.append(value[i]); - } - sb.append('}'); - return Optional.of(sb.toString()); - } - - @Override - public short[] decode(String text) { - List elements = PgRecordParser.parseArray(text); - short[] result = new short[elements.size()]; - for (int i = 0; i < elements.size(); i++) { - String elem = elements.get(i); - result[i] = elem == null ? 0 : Short.parseShort(elem); - } - return result; - } - }; - - /** Unboxed int array: format {1,2,3}. */ - public static final PgCompositeText intArrayUnboxed = - new PgCompositeText<>() { - @Override - public Optional encode(int[] value) { - StringBuilder sb = new StringBuilder(); - sb.append('{'); - for (int i = 0; i < value.length; i++) { - if (i > 0) sb.append(','); - sb.append(value[i]); - } - sb.append('}'); - return Optional.of(sb.toString()); - } - - @Override - public int[] decode(String text) { - List elements = PgRecordParser.parseArray(text); - int[] result = new int[elements.size()]; - for (int i = 0; i < elements.size(); i++) { - String elem = elements.get(i); - result[i] = elem == null ? 0 : Integer.parseInt(elem); - } - return result; - } - }; - - /** Unboxed long array: format {1,2,3}. 
*/ - public static final PgCompositeText longArrayUnboxed = - new PgCompositeText<>() { - @Override - public Optional encode(long[] value) { - StringBuilder sb = new StringBuilder(); - sb.append('{'); - for (int i = 0; i < value.length; i++) { - if (i > 0) sb.append(','); - sb.append(value[i]); - } - sb.append('}'); - return Optional.of(sb.toString()); - } - - @Override - public long[] decode(String text) { - List elements = PgRecordParser.parseArray(text); - long[] result = new long[elements.size()]; - for (int i = 0; i < elements.size(); i++) { - String elem = elements.get(i); - result[i] = elem == null ? 0L : Long.parseLong(elem); - } - return result; - } - }; - - /** Unboxed float array: format {1.0,2.0,3.0}. */ - public static final PgCompositeText floatArrayUnboxed = - new PgCompositeText<>() { - @Override - public Optional encode(float[] value) { - StringBuilder sb = new StringBuilder(); - sb.append('{'); - for (int i = 0; i < value.length; i++) { - if (i > 0) sb.append(','); - sb.append(value[i]); - } - sb.append('}'); - return Optional.of(sb.toString()); - } - - @Override - public float[] decode(String text) { - List elements = PgRecordParser.parseArray(text); - float[] result = new float[elements.size()]; - for (int i = 0; i < elements.size(); i++) { - String elem = elements.get(i); - result[i] = elem == null ? 0.0f : Float.parseFloat(elem); - } - return result; - } - }; - - /** Unboxed double array: format {1.0,2.0,3.0}. 
*/ - public static final PgCompositeText doubleArrayUnboxed = - new PgCompositeText<>() { - @Override - public Optional encode(double[] value) { - StringBuilder sb = new StringBuilder(); - sb.append('{'); - for (int i = 0; i < value.length; i++) { - if (i > 0) sb.append(','); - sb.append(value[i]); - } - sb.append('}'); - return Optional.of(sb.toString()); - } - - @Override - public double[] decode(String text) { - List elements = PgRecordParser.parseArray(text); - double[] result = new double[elements.size()]; - for (int i = 0; i < elements.size(); i++) { - String elem = elements.get(i); - result[i] = elem == null ? 0.0 : Double.parseDouble(elem); - } - return result; - } - }; - - /** Codec that throws on encode/decode - for unsupported types. */ - public static PgCompositeText notSupported() { - return new PgCompositeText<>() { - @Override - public Optional encode(A value) { - throw new UnsupportedOperationException( - "Composite type encoding not supported for this type"); - } - - @Override - public A decode(String text) { - throw new UnsupportedOperationException( - "Composite type decoding not supported for this type"); - } - }; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/PgJson.java b/foundations-jdbc/src/java/dev/typr/foundations/PgJson.java deleted file mode 100644 index f6733521ba..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/PgJson.java +++ /dev/null @@ -1,925 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.*; -import dev.typr.foundations.data.Vector; -import java.math.BigDecimal; -import java.time.*; -import java.time.format.DateTimeFormatter; -import java.time.format.DateTimeFormatterBuilder; -import java.util.*; -import java.util.function.Function; -import java.util.function.IntFunction; -import org.postgresql.geometric.*; -import org.postgresql.util.PGInterval; - -/** - * PostgreSQL-specific JSON codec implementations. 
Handles conversion to/from JSON in PostgreSQL's - * expected format. - */ -public interface PgJson extends DbJson { - - @Override - default PgJson> opt() { - PgJson self = this; - return new PgJson<>() { - @Override - public JsonValue toJson(Optional value) { - return value.map(self::toJson).orElse(JsonValue.JNull.INSTANCE); - } - - @Override - public Optional fromJson(JsonValue json) { - if (json instanceof JsonValue.JNull) { - return Optional.empty(); - } - return Optional.of(self.fromJson(json)); - } - }; - } - - default PgJson array(IntFunction arrayFactory) { - PgJson self = this; - return new PgJson<>() { - @Override - public JsonValue toJson(A[] value) { - List elements = new ArrayList<>(value.length); - for (A elem : value) { - elements.add(self.toJson(elem)); - } - return new JsonValue.JArray(elements); - } - - @Override - public A[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray(List values))) { - throw new IllegalArgumentException( - "Expected JSON array, got: " + json.getClass().getSimpleName()); - } - A[] result = arrayFactory.apply(values.size()); - for (int i = 0; i < values.size(); i++) { - result[i] = self.fromJson(values.get(i)); - } - return result; - } - }; - } - - default PgJson bimap(SqlFunction f, Function g) { - PgJson self = this; - return new PgJson<>() { - @Override - public JsonValue toJson(B value) { - return self.toJson(g.apply(value)); - } - - @Override - public B fromJson(JsonValue json) { - try { - return f.apply(self.fromJson(json)); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - } - }; - } - - // Primitive type codecs - PgJson bool = - new PgJson<>() { - @Override - public JsonValue toJson(Boolean value) { - return JsonValue.JBool.of(value); - } - - @Override - public Boolean fromJson(JsonValue json) { - if (json instanceof JsonValue.JBool(boolean value)) return value; - throw new IllegalArgumentException( - "Expected boolean, got: " + json.getClass().getSimpleName()); - } - }; - 
- PgJson int2 = - new PgJson<>() { - @Override - public JsonValue toJson(Short value) { - return JsonValue.JNumber.of(value.longValue()); - } - - @Override - public Short fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return Short.parseShort(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson int4 = - new PgJson<>() { - @Override - public JsonValue toJson(Integer value) { - return JsonValue.JNumber.of(value.longValue()); - } - - @Override - public Integer fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return Integer.parseInt(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson int8 = - new PgJson<>() { - @Override - public JsonValue toJson(Long value) { - return JsonValue.JNumber.of(value); - } - - @Override - public Long fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return Long.parseLong(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson float4 = - new PgJson<>() { - @Override - public JsonValue toJson(Float value) { - return JsonValue.JNumber.of(value.doubleValue()); - } - - @Override - public Float fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return Float.parseFloat(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson float8 = - new PgJson<>() { - @Override - public JsonValue toJson(Double value) { - return JsonValue.JNumber.of(value); - } - - @Override - public Double fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return Double.parseDouble(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson numeric = - new 
PgJson<>() { - @Override - public JsonValue toJson(BigDecimal value) { - return JsonValue.JNumber.of(value.toPlainString()); - } - - @Override - public BigDecimal fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) return new BigDecimal(value); - throw new IllegalArgumentException( - "Expected number, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson text = - new PgJson<>() { - @Override - public JsonValue toJson(String value) { - return new JsonValue.JString(value); - } - - @Override - public String fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) return value; - throw new IllegalArgumentException( - "Expected string, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson bytea = - new PgJson<>() { - @Override - public JsonValue toJson(byte[] value) { - // PostgreSQL uses hex encoding for bytea in JSON: "\\x..." - StringBuilder sb = new StringBuilder("\\x"); - for (byte b : value) { - sb.append(String.format("%02x", b & 0xff)); - } - return new JsonValue.JString(sb.toString()); - } - - @Override - public byte[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JString(String value))) { - throw new IllegalArgumentException( - "Expected string for bytea, got: " + json.getClass().getSimpleName()); - } - if (value.startsWith("\\x")) { - value = value.substring(2); - } - byte[] result = new byte[value.length() / 2]; - for (int i = 0; i < result.length; i++) { - result[i] = (byte) Integer.parseInt(value.substring(i * 2, i * 2 + 2), 16); - } - return result; - } - }; - - // Date/Time types - PgJson date = - new PgJson<>() { - @Override - public JsonValue toJson(LocalDate value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public LocalDate fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) return LocalDate.parse(value); - throw new IllegalArgumentException( - "Expected string for date, got: " + 
json.getClass().getSimpleName()); - } - }; - - PgJson time = - new PgJson<>() { - @Override - public JsonValue toJson(LocalTime value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public LocalTime fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) return LocalTime.parse(value); - throw new IllegalArgumentException( - "Expected string for time, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson timestamp = - new PgJson<>() { - // PostgreSQL JSON uses ISO format with 'T', but some custom types use space delimiter - private static final DateTimeFormatter FORMATTER_SPACE = - new DateTimeFormatterBuilder() - .appendPattern("yyyy-MM-dd HH:mm:ss") - .appendFraction(java.time.temporal.ChronoField.MICRO_OF_SECOND, 0, 6, true) - .toFormatter(); - - @Override - public JsonValue toJson(LocalDateTime value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public LocalDateTime fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) { - // Try ISO format first (with 'T'), then space-delimited format - if (value.contains("T")) { - return LocalDateTime.parse(value); - } else { - return LocalDateTime.parse(value, FORMATTER_SPACE); - } - } - throw new IllegalArgumentException( - "Expected string for timestamp, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson timestamptz = - new PgJson<>() { - @Override - public JsonValue toJson(Instant value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public Instant fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) return Instant.parse(value); - throw new IllegalArgumentException( - "Expected string for timestamptz, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson timetz = - new PgJson<>() { - // Support various offset formats: Z, +01, +01:00, and up to nanosecond precision - private static final DateTimeFormatter FORMATTER = - new 
DateTimeFormatterBuilder() - .appendPattern("HH:mm:ss") - .appendFraction(java.time.temporal.ChronoField.NANO_OF_SECOND, 0, 9, true) - .appendOffset("+HH:mm", "Z") - .toFormatter(); - private static final DateTimeFormatter FORMATTER_SHORT_OFFSET = - new DateTimeFormatterBuilder() - .appendPattern("HH:mm:ss") - .appendFraction(java.time.temporal.ChronoField.NANO_OF_SECOND, 0, 9, true) - .appendOffset("+HH", "+00") - .toFormatter(); - - @Override - public JsonValue toJson(OffsetTime value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public OffsetTime fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) { - // Try standard format first, then short offset format (e.g., "+01" instead of "+01:00") - try { - return OffsetTime.parse(value, FORMATTER); - } catch (java.time.format.DateTimeParseException e) { - return OffsetTime.parse(value, FORMATTER_SHORT_OFFSET); - } - } - throw new IllegalArgumentException( - "Expected string for timetz, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson uuid = - new PgJson<>() { - @Override - public JsonValue toJson(UUID value) { - return new JsonValue.JString(value.toString()); - } - - @Override - public UUID fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) return UUID.fromString(value); - throw new IllegalArgumentException( - "Expected string for uuid, got: " + json.getClass().getSimpleName()); - } - }; - - // JSON types (pass-through) - PgJson json = - new PgJson<>() { - @Override - public JsonValue toJson(Json value) { - return JsonValue.parse(value.value()); - } - - @Override - public Json fromJson(JsonValue json) { - return new Json(json.encode()); - } - }; - - PgJson jsonb = - new PgJson<>() { - @Override - public JsonValue toJson(Jsonb value) { - return JsonValue.parse(value.value()); - } - - @Override - public Jsonb fromJson(JsonValue json) { - return new Jsonb(json.encode()); - } - }; - - // Special types - these use string 
representation - PgJson interval = text.bimap(PGInterval::new, PGInterval::getValue); - - PgJson point = - new PgJson<>() { - @Override - public JsonValue toJson(PGpoint value) { - return new JsonValue.JString(value.getValue()); - } - - @Override - public PGpoint fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) { - try { - return new PGpoint(value); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - throw new IllegalArgumentException( - "Expected string for point, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson box = - new PgJson<>() { - @Override - public JsonValue toJson(PGbox value) { - return new JsonValue.JString(value.getValue()); - } - - @Override - public PGbox fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) { - try { - return new PGbox(value); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - throw new IllegalArgumentException( - "Expected string for box, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson circle = - new PgJson<>() { - @Override - public JsonValue toJson(PGcircle value) { - return new JsonValue.JString(value.getValue()); - } - - @Override - public PGcircle fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) { - try { - return new PGcircle(value); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - throw new IllegalArgumentException( - "Expected string for circle, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson line = - new PgJson<>() { - @Override - public JsonValue toJson(PGline value) { - return new JsonValue.JString(value.getValue()); - } - - @Override - public PGline fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) { - try { - return new PGline(value); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - throw new IllegalArgumentException( - "Expected string for line, got: " + 
json.getClass().getSimpleName()); - } - }; - - PgJson lseg = - new PgJson<>() { - @Override - public JsonValue toJson(PGlseg value) { - return new JsonValue.JString(value.getValue()); - } - - @Override - public PGlseg fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) { - try { - return new PGlseg(value); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - throw new IllegalArgumentException( - "Expected string for lseg, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson path = - new PgJson<>() { - @Override - public JsonValue toJson(PGpath value) { - return new JsonValue.JString(value.getValue()); - } - - @Override - public PGpath fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) { - try { - return new PGpath(value); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - throw new IllegalArgumentException( - "Expected string for path, got: " + json.getClass().getSimpleName()); - } - }; - - PgJson polygon = - new PgJson<>() { - @Override - public JsonValue toJson(PGpolygon value) { - return new JsonValue.JString(value.getValue()); - } - - @Override - public PGpolygon fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) { - try { - return new PGpolygon(value); - } catch (Exception e) { - throw new RuntimeException(e); - } - } - throw new IllegalArgumentException( - "Expected string for polygon, got: " + json.getClass().getSimpleName()); - } - }; - - // Wrapper types that use string representation - PgJson inet = text.bimap(Inet::new, Inet::value); - PgJson cidr = text.bimap(Cidr::new, Cidr::value); - PgJson macaddr = text.bimap(MacAddr::new, MacAddr::value); - PgJson macaddr8 = text.bimap(MacAddr8::new, MacAddr8::value); - // Money is returned as string by PostgreSQL's to_json (e.g., "$42.22") - PgJson money = - new PgJson<>() { - @Override - public JsonValue toJson(Money value) { - return JsonValue.JNumber.of(value.value()); - } - - 
@Override - public Money fromJson(JsonValue json) { - if (json instanceof JsonValue.JNumber(String value)) - return new Money(Double.parseDouble(value)); - if (json instanceof JsonValue.JString(String value)) return new Money(value); - throw new IllegalArgumentException( - "Expected number or string for money, got: " + json.getClass().getSimpleName()); - } - }; - PgJson aclitem = text.bimap(AclItem::new, AclItem::value); - PgJson xml = text.bimap(Xml::new, Xml::value); - PgJson xid = text.bimap(Xid::new, Xid::value); - // PostgreSQL returns composite types as JSON objects, but Record stores the raw string - // representation. - // We encode as string and decode from either object (convert to string) or string directly. - PgJson record = - new PgJson<>() { - @Override - public JsonValue toJson(dev.typr.foundations.data.Record value) { - return new JsonValue.JString(value.value()); - } - - @Override - public dev.typr.foundations.data.Record fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) { - return new dev.typr.foundations.data.Record(value); - } - if (json instanceof JsonValue.JObject obj) { - // PostgreSQL returns composite types as JSON objects with field names - // Convert back to tuple string format: (val1, val2, ...) 
- StringBuilder sb = new StringBuilder("("); - boolean first = true; - for (JsonValue v : obj.fields().values()) { - if (!first) sb.append(","); - first = false; - // Handle each value type - if (v instanceof JsonValue.JNull) { - // null values are empty in tuple representation - } else if (v instanceof JsonValue.JString(String s)) { - sb.append(s); - } else if (v instanceof JsonValue.JNumber(String n)) { - sb.append(n); - } else if (v instanceof JsonValue.JBool(boolean b)) { - sb.append(b); - } else { - // For nested objects/arrays, use JSON encoding - sb.append(v.encode()); - } - } - sb.append(")"); - return new dev.typr.foundations.data.Record(sb.toString()); - } - throw new IllegalArgumentException( - "Expected string or object for record, got: " + json.getClass().getSimpleName()); - } - }; - PgJson vector = text.bimap(Vector::parse, Vector::value); - // PostgreSQL returns int2vector as JSON array, not string - PgJson int2vector = - new PgJson<>() { - @Override - public JsonValue toJson(Int2Vector value) { - JsonValue[] elements = new JsonValue[value.values().length]; - for (int i = 0; i < value.values().length; i++) { - elements[i] = JsonValue.JNumber.of(value.values()[i]); - } - return new JsonValue.JArray(List.of(elements)); - } - - @Override - public Int2Vector fromJson(JsonValue json) { - if (json instanceof JsonValue.JArray(List elements)) { - short[] values = new short[elements.size()]; - for (int i = 0; i < elements.size(); i++) { - if (elements.get(i) instanceof JsonValue.JNumber(String num)) { - values[i] = Short.parseShort(num); - } else { - throw new IllegalArgumentException("Expected number in int2vector array"); - } - } - return new Int2Vector(values); - } - if (json instanceof JsonValue.JString(String value)) return Int2Vector.parse(value); - throw new IllegalArgumentException( - "Expected array or string for int2vector, got: " + json.getClass().getSimpleName()); - } - }; - // PostgreSQL returns oidvector as JSON array, but with STRING elements 
(unlike int2vector which - // uses numbers) - PgJson oidvector = - new PgJson<>() { - @Override - public JsonValue toJson(OidVector value) { - JsonValue[] elements = new JsonValue[value.values().length]; - for (int i = 0; i < value.values().length; i++) { - elements[i] = JsonValue.JNumber.of(value.values()[i]); - } - return new JsonValue.JArray(List.of(elements)); - } - - @Override - public OidVector fromJson(JsonValue json) { - if (json instanceof JsonValue.JArray(List elements)) { - int[] values = new int[elements.size()]; - for (int i = 0; i < elements.size(); i++) { - JsonValue elem = elements.get(i); - if (elem instanceof JsonValue.JNumber(String num)) { - values[i] = Integer.parseInt(num); - } else if (elem instanceof JsonValue.JString(String s)) { - // PostgreSQL returns oidvector elements as strings in arrays - values[i] = Integer.parseInt(s); - } else { - throw new IllegalArgumentException( - "Expected number or string in oidvector array, got: " - + elem.getClass().getSimpleName()); - } - } - return new OidVector(values); - } - if (json instanceof JsonValue.JString(String value)) return OidVector.parse(value); - throw new IllegalArgumentException( - "Expected array or string for oidvector, got: " + json.getClass().getSimpleName()); - } - }; - PgJson regclass = text.bimap(Regclass::new, Regclass::value); - PgJson regconfig = text.bimap(Regconfig::new, Regconfig::value); - PgJson regdictionary = text.bimap(Regdictionary::new, Regdictionary::value); - PgJson regnamespace = text.bimap(Regnamespace::new, Regnamespace::value); - PgJson regoper = text.bimap(Regoper::new, Regoper::value); - PgJson regoperator = text.bimap(Regoperator::new, Regoperator::value); - PgJson regproc = text.bimap(Regproc::new, Regproc::value); - PgJson regprocedure = text.bimap(Regprocedure::new, Regprocedure::value); - PgJson regrole = text.bimap(Regrole::new, Regrole::value); - PgJson regtype = text.bimap(Regtype::new, Regtype::value); - - // Range types - PostgreSQL returns ranges as 
strings in JSON - static > PgJson> range( - SqlFunction valueParser, - java.util.function.BiFunction, RangeBound, Range> rangeFactory) { - return new PgJson<>() { - @Override - public JsonValue toJson(Range value) { - return new JsonValue.JString(RangeParser.format(value)); - } - - @Override - public Range fromJson(JsonValue json) { - if (json instanceof JsonValue.JString(String value)) { - try { - return RangeParser.parse(value, valueParser, rangeFactory); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - } - throw new IllegalArgumentException( - "Expected string for range, got: " + json.getClass().getSimpleName()); - } - }; - } - - PgJson> int4range = range(RangeParser.INT4_PARSER, Range.INT4); - PgJson> int8range = range(RangeParser.INT8_PARSER, Range.INT8); - PgJson> numrange = range(RangeParser.NUMERIC_PARSER, Range.NUMERIC); - PgJson> daterange = range(RangeParser.DATE_PARSER, Range.DATE); - PgJson> tsrange = range(RangeParser.TIMESTAMP_PARSER, Range.TIMESTAMP); - PgJson> tstzrange = range(RangeParser.TIMESTAMPTZ_PARSER, Range.TIMESTAMPTZ); - - // hstore uses a JSON object representation - PgJson> hstore = - new PgJson<>() { - @Override - public JsonValue toJson(Map value) { - Map fields = new LinkedHashMap<>(); - for (Map.Entry e : value.entrySet()) { - fields.put( - e.getKey(), - e.getValue() == null - ? 
JsonValue.JNull.INSTANCE - : new JsonValue.JString(e.getValue())); - } - return new JsonValue.JObject(fields); - } - - @Override - public Map fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JObject(Map fields))) { - throw new IllegalArgumentException( - "Expected object for hstore, got: " + json.getClass().getSimpleName()); - } - Map result = new LinkedHashMap<>(); - for (Map.Entry e : fields.entrySet()) { - if (e.getValue() instanceof JsonValue.JNull) { - result.put(e.getKey(), null); - } else if (e.getValue() instanceof JsonValue.JString(String value)) { - result.put(e.getKey(), value); - } else { - throw new IllegalArgumentException( - "Expected string or null in hstore, got: " - + e.getValue().getClass().getSimpleName()); - } - } - return result; - } - }; - - // Unboxed primitive array types - no boxing overhead - PgJson boolArrayUnboxed = - new PgJson<>() { - @Override - public JsonValue toJson(boolean[] arr) { - List elements = new ArrayList<>(arr.length); - for (boolean v : arr) { - elements.add(JsonValue.JBool.of(v)); - } - return new JsonValue.JArray(elements); - } - - @Override - public boolean[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray(List values))) { - throw new IllegalArgumentException("Expected JSON array for boolean[]"); - } - boolean[] result = new boolean[values.size()]; - for (int i = 0; i < values.size(); i++) { - if (values.get(i) instanceof JsonValue.JBool(boolean value)) { - result[i] = value; - } else { - throw new IllegalArgumentException("Expected boolean in array"); - } - } - return result; - } - }; - - PgJson shortArrayUnboxed = - new PgJson<>() { - @Override - public JsonValue toJson(short[] arr) { - List elements = new ArrayList<>(arr.length); - for (short v : arr) { - elements.add(JsonValue.JNumber.of(v)); - } - return new JsonValue.JArray(elements); - } - - @Override - public short[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray(List values))) { - throw new 
IllegalArgumentException("Expected JSON array for short[]"); - } - short[] result = new short[values.size()]; - for (int i = 0; i < values.size(); i++) { - if (values.get(i) instanceof JsonValue.JNumber(String value)) { - result[i] = Short.parseShort(value); - } else { - throw new IllegalArgumentException("Expected number in array"); - } - } - return result; - } - }; - - PgJson intArrayUnboxed = - new PgJson<>() { - @Override - public JsonValue toJson(int[] arr) { - List elements = new ArrayList<>(arr.length); - for (int v : arr) { - elements.add(JsonValue.JNumber.of(v)); - } - return new JsonValue.JArray(elements); - } - - @Override - public int[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray(List values))) { - throw new IllegalArgumentException("Expected JSON array for int[]"); - } - int[] result = new int[values.size()]; - for (int i = 0; i < values.size(); i++) { - if (values.get(i) instanceof JsonValue.JNumber(String value)) { - result[i] = Integer.parseInt(value); - } else { - throw new IllegalArgumentException("Expected number in array"); - } - } - return result; - } - }; - - PgJson longArrayUnboxed = - new PgJson<>() { - @Override - public JsonValue toJson(long[] arr) { - List elements = new ArrayList<>(arr.length); - for (long v : arr) { - elements.add(JsonValue.JNumber.of(v)); - } - return new JsonValue.JArray(elements); - } - - @Override - public long[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray(List values))) { - throw new IllegalArgumentException("Expected JSON array for long[]"); - } - long[] result = new long[values.size()]; - for (int i = 0; i < values.size(); i++) { - if (values.get(i) instanceof JsonValue.JNumber(String value)) { - result[i] = Long.parseLong(value); - } else { - throw new IllegalArgumentException("Expected number in array"); - } - } - return result; - } - }; - - PgJson floatArrayUnboxed = - new PgJson<>() { - @Override - public JsonValue toJson(float[] arr) { - List elements = new 
ArrayList<>(arr.length); - for (float v : arr) { - elements.add(JsonValue.JNumber.of(v)); - } - return new JsonValue.JArray(elements); - } - - @Override - public float[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray(List values))) { - throw new IllegalArgumentException("Expected JSON array for float[]"); - } - float[] result = new float[values.size()]; - for (int i = 0; i < values.size(); i++) { - if (values.get(i) instanceof JsonValue.JNumber(String value)) { - result[i] = Float.parseFloat(value); - } else { - throw new IllegalArgumentException("Expected number in array"); - } - } - return result; - } - }; - - PgJson doubleArrayUnboxed = - new PgJson<>() { - @Override - public JsonValue toJson(double[] arr) { - List elements = new ArrayList<>(arr.length); - for (double v : arr) { - elements.add(JsonValue.JNumber.of(v)); - } - return new JsonValue.JArray(elements); - } - - @Override - public double[] fromJson(JsonValue json) { - if (!(json instanceof JsonValue.JArray(List values))) { - throw new IllegalArgumentException("Expected JSON array for double[]"); - } - double[] result = new double[values.size()]; - for (int i = 0; i < values.size(); i++) { - if (values.get(i) instanceof JsonValue.JNumber(String value)) { - result[i] = Double.parseDouble(value); - } else { - throw new IllegalArgumentException("Expected number in array"); - } - } - return result; - } - }; -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/PgRead.java b/foundations-jdbc/src/java/dev/typr/foundations/PgRead.java deleted file mode 100644 index f3c019d6dd..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/PgRead.java +++ /dev/null @@ -1,456 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.Json; -import dev.typr.foundations.data.Jsonb; -import dev.typr.foundations.data.Money; -import dev.typr.foundations.internal.arrayMap; -import java.lang.reflect.Array; -import java.math.BigDecimal; -import java.sql.ResultSet; -import 
java.sql.SQLException; -import java.time.*; -import java.time.format.DateTimeFormatter; -import java.time.format.DateTimeFormatterBuilder; -import java.time.temporal.ChronoField; -import java.util.Arrays; -import java.util.Map; -import java.util.Optional; -import java.util.UUID; -import java.util.function.Function; -import java.util.function.IntFunction; -import org.postgresql.jdbc.PgArray; -import org.postgresql.util.PGobject; - -/** - * Describes how to read a column from a {@link ResultSet} - * - *

<p>Note that the implementation is a bit more complex than you would expect. This is because we - * need to check {@link ResultSet#wasNull()} "in the middle" of extracting data. - * - * <p>Correct use of {@code Column} requires use of either - *
- * - * Then you create derived instances with {@code map} and/or {@code opt} - */ -public sealed interface PgRead extends DbRead - permits PgRead.NonNullable, PgRead.Nullable, PgRead.Mapped { - A read(ResultSet rs, int col) throws SQLException; - - PgRead map(SqlFunction f); - - /** Derive a `Column` which allows nullable values */ - PgRead> opt(); - - @FunctionalInterface - interface RawRead { - A apply(ResultSet rs, int column) throws SQLException; - } - - /** - * Create an instance of {@link PgRead} from a function that reads a value from a result set. - * - * @param f Should not blow up if the value returned is `null` - */ - static NonNullable of(RawRead f) { - RawRead> readNullableA = - (rs, col) -> { - var a = f.apply(rs, col); - if (rs.wasNull()) return Optional.empty(); - else return Optional.of(a); - }; - return new NonNullable<>(readNullableA); - } - - final class NonNullable implements PgRead { - final RawRead> readNullable; - - public NonNullable(RawRead> readNullable) { - this.readNullable = readNullable; - } - - @Override - public A read(ResultSet rs, int col) throws SQLException { - return readNullable - .apply(rs, col) - .orElseThrow(() -> new SQLException("null value in column " + col)); - } - - @Override - public NonNullable map(SqlFunction f) { - return new NonNullable<>( - (rs, col) -> { - Optional maybeA = readNullable.apply(rs, col); - // this looks like map, but there is a checked exception - if (maybeA.isEmpty()) return Optional.empty(); - return Optional.of(f.apply(maybeA.get())); - }); - } - - @Override - public PgRead> opt() { - return new Nullable<>(readNullable); - } - } - - final class Nullable implements PgRead> { - final RawRead> readNullable; - - public Nullable(RawRead> readNullable) { - this.readNullable = readNullable; - } - - @Override - public Optional read(ResultSet rs, int col) throws SQLException { - return readNullable.apply(rs, col); - } - - @Override - public PgRead map(SqlFunction, B> f) { - return new Mapped<>(this, 
f); - } - - @Override - public Nullable> opt() { - return new Nullable<>( - (rs, col) -> { - Optional maybeA = readNullable.apply(rs, col); - if (maybeA.isEmpty()) return Optional.empty(); - return Optional.of(maybeA); - }); - } - } - - /** - * A read that came from mapping another read. Just returns whatever the mapping function - * produces, null or not. No throwing on null, no Optional wrapping. - */ - record Mapped(PgRead underlying, SqlFunction f) implements PgRead { - @Override - public B read(ResultSet rs, int col) throws SQLException { - return f.apply(underlying.read(rs, col)); - } - - @Override - public PgRead map(SqlFunction g) { - return new Mapped<>(this, g); - } - - @Override - public PgRead> opt() { - return new Nullable<>((rs, col) -> Optional.ofNullable(read(rs, col))); - } - } - - static NonNullable castJdbcObjectTo(Class cls) { - return of((rs, i) -> cls.cast(rs.getObject(i))); - } - - PgRead readPgArray = of((rs, i) -> (PgArray) rs.getArray(i)); - - @SuppressWarnings("unchecked") - static PgRead massageJdbcArrayTo(Class arrayCls) { - return readPgArray.map( - sqlArray -> { - Object arrayObj = sqlArray.getArray(); - // if the array is already of the correct type, just return it - if (arrayCls.isInstance(arrayObj)) return arrayCls.cast(arrayObj); - // if the array is an Object[], we need to copy elements manually - Object[] array = (Object[]) arrayObj; - Class componentType = arrayCls.getComponentType(); - A[] result = (A[]) Array.newInstance(componentType, array.length); - for (int i = 0; i < array.length; i++) { - result[i] = (A) array[i]; - } - return result; - }); - } - - /** - * Read an array where JDBC driver returns Object[] containing elements that need casting. Used - * for PostgreSQL geometric types (box[], circle[], etc.) where driver returns PGobject[]. 
- */ - @SuppressWarnings("unchecked") - static PgRead castJdbcArrayTo(Class elementCls) { - return readPgArray.map( - sqlArray -> { - Object[] array = (Object[]) sqlArray.getArray(); - A[] result = (A[]) Array.newInstance(elementCls, array.length); - for (int i = 0; i < array.length; i++) { - result[i] = elementCls.cast(array[i]); - } - return result; - }); - } - - @SuppressWarnings("unchecked") - static PgRead pgObjectArray(Function fromString, Class clazz) { - return readPgArray.map( - sqlArray -> { - Object[] objects = (Object[]) sqlArray.getArray(); - T[] array = (T[]) Array.newInstance(clazz, objects.length); - for (int i = 0; i < objects.length; i++) { - PGobject object = (PGobject) objects[i]; - array[i] = fromString.apply(object.getValue()); - } - return array; - }); - } - - /** - * Create a reader for arrays of composite types. PostgreSQL returns composite arrays as a string - * in array format, e.g., {"(field1,field2)","(field3,field4)"}. We parse this using - * PgRecordParser.parseArray and decode each element with the composite's text decoder. - * - * @param decoder the composite text decoder for the element type - * @param arrayFactory factory to create arrays of the element type - * @return a PgRead for arrays of the composite type - */ - static PgRead readCompositeArray( - PgCompositeText decoder, IntFunction arrayFactory) { - return readString.map( - arrayText -> { - if (arrayText == null) return null; - java.util.List elements = PgRecordParser.parseArray(arrayText); - T[] array = arrayFactory.apply(elements.size()); - for (int i = 0; i < elements.size(); i++) { - String elementText = elements.get(i); - array[i] = elementText == null ? 
null : decoder.decode(elementText); - } - return array; - }); - } - - static PgRead pgObject(String sqlType) { - return PgRead.of( - (rs, i) -> { - PGobject object = (PGobject) rs.getObject(i); - if (object == null) return null; - if (!object.getType().equals(sqlType)) { - throw new SQLException("Expected " + sqlType + " but got " + object.getType()); - } - return object.getValue(); - }); - } - - PgRead readOffsetDateTime = - of((rs, idx) -> rs.getObject(idx, OffsetDateTime.class)); - PgRead readTimestampArray = massageJdbcArrayTo(java.sql.Timestamp[].class); - PgRead readDateArray = massageJdbcArrayTo(java.sql.Date[].class); - PgRead readString = of(ResultSet::getString); - PgRead readStringArray = PgRead.massageJdbcArrayTo(String[].class); - - PgRead readBigDecimal = of(ResultSet::getBigDecimal); - PgRead readBigDecimalArray = PgRead.massageJdbcArrayTo(BigDecimal[].class); - PgRead readBoolean = of(ResultSet::getBoolean); - PgRead readBooleanArray = PgRead.massageJdbcArrayTo(Boolean[].class); - PgRead readByte = of(ResultSet::getByte); - PgRead readByteArray = castJdbcObjectTo(byte[].class); - PgRead readDouble = of(ResultSet::getDouble); - PgRead readDoubleArray = PgRead.massageJdbcArrayTo(Double[].class); - PgRead readFloat = of(ResultSet::getFloat); - PgRead readFloatArray = PgRead.massageJdbcArrayTo(Float[].class); - PgRead readInstant = readOffsetDateTime.map(OffsetDateTime::toInstant); - PgRead readInstantArray = - readTimestampArray.map( - ts -> Arrays.stream(ts).map(java.sql.Timestamp::toInstant).toArray(Instant[]::new)); - PgRead readInteger = of(ResultSet::getInt); - PgRead readIntegerArray = PgRead.massageJdbcArrayTo(Integer[].class); - PgRead readJsonArray = - PgRead.readStringArray.map(as -> arrayMap.map(as, Json::new, Json.class)); - PgRead readJsonbArray = - PgRead.readStringArray.map(as -> arrayMap.map(as, Jsonb::new, Jsonb.class)); - PgRead readLocalDate = of((rs, idx) -> rs.getObject(idx, LocalDate.class)); - PgRead readLocalDateArray = - 
readDateArray.map( - dates -> Arrays.stream(dates).map(java.sql.Date::toLocalDate).toArray(LocalDate[]::new)); - PgRead readLocalDateTime = of((rs, idx) -> rs.getObject(idx, LocalDateTime.class)); - PgRead readLocalDateTimeArray = - readTimestampArray.map( - ts -> - Arrays.stream(ts) - .map(java.sql.Timestamp::toLocalDateTime) - .toArray(LocalDateTime[]::new)); - PgRead readLocalTime = of((rs, idx) -> rs.getObject(idx, LocalTime.class)); - PgRead readLocalTimeArray = readString.map(Impl::parseLocalTimeArray); - PgRead readLong = of(ResultSet::getLong); - PgRead readLongArray = PgRead.massageJdbcArrayTo(Long[].class); - PgRead readOffsetTime = of((rs, idx) -> rs.getObject(idx, OffsetTime.class)); - PgRead readOffsetTimeArray = readString.map(Impl::parseOffsetTimeArray); - PgRead readShort = of(ResultSet::getShort); - PgRead readShortArray = PgRead.massageJdbcArrayTo(Short[].class); - - // Unboxed (primitive) array readers - convert from boxed arrays returned by JDBC - PgRead readBooleanArrayUnboxed = readBooleanArray.map(Impl::unboxBooleanArray); - PgRead readShortArrayUnboxed = readShortArray.map(Impl::unboxShortArray); - PgRead readIntArrayUnboxed = readIntegerArray.map(Impl::unboxIntArray); - PgRead readLongArrayUnboxed = readLongArray.map(Impl::unboxLongArray); - PgRead readFloatArrayUnboxed = readFloatArray.map(Impl::unboxFloatArray); - PgRead readDoubleArrayUnboxed = readDoubleArray.map(Impl::unboxDoubleArray); - - PgRead readUUID = readString.map(UUID::fromString); - PgRead readMoneyArray = - PgRead.readString.map( - str -> { - if (str.equals("{}")) return new Money[0]; - return arrayMap.map( - str.substring(1, str.length() - 1).split(","), Money::new, Money.class); - }); - PgRead> readMapStringString = - PgRead.of( - (rs, i) -> { - var obj = rs.getObject(i); - if (obj == null) return null; - return (Map) obj; - }); - - interface Impl { - // postgres driver throws away all precision after whole seconds !?! 
- static LocalTime[] parseLocalTimeArray(String str) { - if (str == null) return null; - if (str.equals("{}")) return new LocalTime[0]; - if (str.charAt(0) != '{' || str.charAt(str.length() - 1) != '}') - throw new IllegalArgumentException("Invalid array format"); - String[] strings = str.substring(1, str.length() - 1).split(","); - LocalTime[] ret = new LocalTime[strings.length]; - for (int i = 0; i < strings.length; i++) { - ret[i] = LocalTime.parse(strings[i]); - } - return ret; - } - - DateTimeFormatter offsetTimeParser = - new DateTimeFormatterBuilder() - .appendPattern("HH:mm:ss") - .appendFraction(ChronoField.MICRO_OF_SECOND, 0, 6, true) - .appendPattern("X") - .toFormatter(); - - static OffsetTime[] parseOffsetTimeArray(String str) { - if (str == null) return null; - if (str.equals("{}")) return new OffsetTime[0]; - if (str.charAt(0) != '{' || str.charAt(str.length() - 1) != '}') - throw new IllegalArgumentException("Invalid array format"); - String[] strings = str.substring(1, str.length() - 1).split(","); - var ret = new OffsetTime[strings.length]; - for (int i = 0; i < strings.length; i++) { - ret[i] = OffsetTime.parse(strings[i], offsetTimeParser); - } - return ret; - } - - // Unboxing methods - convert boxed arrays to primitive arrays - static boolean[] unboxBooleanArray(Boolean[] boxed) { - if (boxed == null) return null; - boolean[] unboxed = new boolean[boxed.length]; - for (int i = 0; i < boxed.length; i++) { - unboxed[i] = boxed[i]; - } - return unboxed; - } - - static short[] unboxShortArray(Short[] boxed) { - if (boxed == null) return null; - short[] unboxed = new short[boxed.length]; - for (int i = 0; i < boxed.length; i++) { - unboxed[i] = boxed[i]; - } - return unboxed; - } - - static int[] unboxIntArray(Integer[] boxed) { - if (boxed == null) return null; - int[] unboxed = new int[boxed.length]; - for (int i = 0; i < boxed.length; i++) { - unboxed[i] = boxed[i]; - } - return unboxed; - } - - static long[] unboxLongArray(Long[] boxed) { - if 
(boxed == null) return null; - long[] unboxed = new long[boxed.length]; - for (int i = 0; i < boxed.length; i++) { - unboxed[i] = boxed[i]; - } - return unboxed; - } - - static float[] unboxFloatArray(Float[] boxed) { - if (boxed == null) return null; - float[] unboxed = new float[boxed.length]; - for (int i = 0; i < boxed.length; i++) { - unboxed[i] = boxed[i]; - } - return unboxed; - } - - static double[] unboxDoubleArray(Double[] boxed) { - if (boxed == null) return null; - double[] unboxed = new double[boxed.length]; - for (int i = 0; i < boxed.length; i++) { - unboxed[i] = boxed[i]; - } - return unboxed; - } - - // Boxing methods - convert primitive arrays to boxed arrays - static Boolean[] boxBooleanArray(boolean[] unboxed) { - if (unboxed == null) return null; - Boolean[] boxed = new Boolean[unboxed.length]; - for (int i = 0; i < unboxed.length; i++) { - boxed[i] = unboxed[i]; - } - return boxed; - } - - static Short[] boxShortArray(short[] unboxed) { - if (unboxed == null) return null; - Short[] boxed = new Short[unboxed.length]; - for (int i = 0; i < unboxed.length; i++) { - boxed[i] = unboxed[i]; - } - return boxed; - } - - static Integer[] boxIntArray(int[] unboxed) { - if (unboxed == null) return null; - Integer[] boxed = new Integer[unboxed.length]; - for (int i = 0; i < unboxed.length; i++) { - boxed[i] = unboxed[i]; - } - return boxed; - } - - static Long[] boxLongArray(long[] unboxed) { - if (unboxed == null) return null; - Long[] boxed = new Long[unboxed.length]; - for (int i = 0; i < unboxed.length; i++) { - boxed[i] = unboxed[i]; - } - return boxed; - } - - static Float[] boxFloatArray(float[] unboxed) { - if (unboxed == null) return null; - Float[] boxed = new Float[unboxed.length]; - for (int i = 0; i < unboxed.length; i++) { - boxed[i] = unboxed[i]; - } - return boxed; - } - - static Double[] boxDoubleArray(double[] unboxed) { - if (unboxed == null) return null; - Double[] boxed = new Double[unboxed.length]; - for (int i = 0; i < 
unboxed.length; i++) { - boxed[i] = unboxed[i]; - } - return boxed; - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/PgRecordParser.java b/foundations-jdbc/src/java/dev/typr/foundations/PgRecordParser.java deleted file mode 100644 index fab6661597..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/PgRecordParser.java +++ /dev/null @@ -1,597 +0,0 @@ -package dev.typr.foundations; - -import java.util.ArrayList; -import java.util.List; - -/** - * Parser for PostgreSQL composite type (record) text format. - * - *

<p>PostgreSQL represents composite values in text format as: (val1, val2, ...) - * - * <p>Key parsing rules: - * - * <ul> - *   <li>Values are separated by commas within parentheses - *   <li>NULL is represented by empty (no characters between commas) - *   <li>Empty string is represented by "" (quoted empty) - *   <li>Values containing special chars (comma, quotes, parens, backslash) must be quoted - *   <li>Within quoted values, quotes are escaped by doubling: " becomes "" - *   <li>Within quoted values, backslash is escaped by doubling: \ becomes \\ - *   <li>Nested records are quoted, with inner quotes doubled at each level - * </ul> - * - * <p>Examples: - * - * <ul> - *   <li>(hello,world,123) - simple values - *   <li>("hello, world",test) - value with comma needs quotes - *   <li>("say ""hello""",test) - value with quotes (doubled) - *   <li>(,,"") - NULL, NULL, empty string - *   <li>("(nested,record)",outer) - nested record (quoted) - *   <li>("(""deeply"",nested)",outer) - deeply nested with quote doubling - * </ul>
- */ -public final class PgRecordParser { - private PgRecordParser() {} - - /** - * Parse a PostgreSQL composite type text representation into a list of field values. - * - * @param input the composite value string, e.g., "(val1, val2, ...)" - * @return list of parsed field values, where null represents SQL NULL - * @throws IllegalArgumentException if the input is malformed - */ - public static List parse(String input) { - if (input == null) { - throw new IllegalArgumentException("Input cannot be null"); - } - - String trimmed = input.trim(); - if (trimmed.isEmpty()) { - throw new IllegalArgumentException("Input cannot be empty"); - } - - if (!trimmed.startsWith("(") || !trimmed.endsWith(")")) { - throw new IllegalArgumentException( - "Composite value must be enclosed in parentheses: " + input); - } - - // Remove outer parentheses - String content = trimmed.substring(1, trimmed.length() - 1); - - return parseFields(content); - } - - /** - * Parse the content inside parentheses into individual field values. - * - * @param content the content between parentheses - * @return list of field values - */ - private static List parseFields(String content) { - List fields = new ArrayList<>(); - - if (content.isEmpty()) { - // Empty record () has no fields - return fields; - } - - int pos = 0; - int len = content.length(); - - while (pos <= len) { - // Parse one field - FieldParseResult result = parseField(content, pos); - fields.add(result.value); - pos = result.nextPos; - - if (pos < len) { - // Expect comma separator - if (content.charAt(pos) != ',') { - throw new IllegalArgumentException( - "Expected comma at position " + pos + " in: " + content); - } - pos++; // Skip comma - } else if (pos == len) { - // We've consumed exactly all content - break; - } - } - - return fields; - } - - private record FieldParseResult(String value, int nextPos) {} - - /** - * Parse a single field value starting at the given position. 
- * - * @param content the full content string - * @param start the starting position - * @return the parsed value and the position after it - */ - private static FieldParseResult parseField(String content, int start) { - int len = content.length(); - - if (start >= len) { - // Empty field at end means NULL - return new FieldParseResult(null, start); - } - - char firstChar = content.charAt(start); - - if (firstChar == ',') { - // Empty field (NULL) - return new FieldParseResult(null, start); - } - - if (firstChar == '"') { - // Quoted field - return parseQuotedField(content, start); - } - - // Unquoted field - read until comma or end - return parseUnquotedField(content, start); - } - - /** - * Parse a quoted field value. Handles escape sequences and nested quotes. - * - * @param content the full content string - * @param start the position of the opening quote - * @return the parsed value and position after closing quote - */ - private static FieldParseResult parseQuotedField(String content, int start) { - int len = content.length(); - StringBuilder sb = new StringBuilder(); - - // Skip opening quote - int pos = start + 1; - - while (pos < len) { - char c = content.charAt(pos); - - if (c == '"') { - // Check for escaped quote (doubled) - if (pos + 1 < len && content.charAt(pos + 1) == '"') { - // Escaped quote - emit single quote and skip both - sb.append('"'); - pos += 2; - } else { - // End of quoted field - return new FieldParseResult(sb.toString(), pos + 1); - } - } else if (c == '\\') { - // Backslash escaping - if (pos + 1 < len) { - char next = content.charAt(pos + 1); - if (next == '\\') { - // Escaped backslash - sb.append('\\'); - pos += 2; - } else { - // Single backslash followed by something else - just emit as-is - // PostgreSQL doesn't actually require backslash escaping in most cases - sb.append(c); - pos++; - } - } else { - // Backslash at end - emit as-is - sb.append(c); - pos++; - } - } else { - sb.append(c); - pos++; - } - } - - throw new 
IllegalArgumentException("Unterminated quoted field starting at position " + start); - } - - /** - * Parse an unquoted field value. - * - * @param content the full content string - * @param start the starting position - * @return the parsed value and position after it - */ - private static FieldParseResult parseUnquotedField(String content, int start) { - int len = content.length(); - int pos = start; - - while (pos < len) { - char c = content.charAt(pos); - if (c == ',') { - break; - } - pos++; - } - - String value = content.substring(start, pos); - - // Unquoted empty string is NULL - if (value.isEmpty()) { - return new FieldParseResult(null, pos); - } - - return new FieldParseResult(value, pos); - } - - /** - * Encode a list of field values into PostgreSQL composite type text format. - * - * @param values the field values (null represents SQL NULL) - * @return the encoded string, e.g., "(val1, val2, ...)" - */ - public static String encode(List values) { - StringBuilder sb = new StringBuilder(); - sb.append('('); - - for (int i = 0; i < values.size(); i++) { - if (i > 0) { - sb.append(','); - } - encodeField(sb, values.get(i)); - } - - sb.append(')'); - return sb.toString(); - } - - /** - * Encode a single field value. 
- * - * @param sb the string builder to append to - * @param value the field value (null for SQL NULL) - */ - private static void encodeField(StringBuilder sb, String value) { - if (value == null) { - // NULL - empty field - return; - } - - if (value.isEmpty()) { - // Empty string needs quotes to distinguish from NULL - sb.append("\"\""); - return; - } - - // Check if quoting is needed - boolean needsQuotes = false; - for (int i = 0; i < value.length(); i++) { - char c = value.charAt(i); - if (c == ',' || c == '"' || c == '(' || c == ')' || c == '\\' || c == '\n' || c == '\r') { - needsQuotes = true; - break; - } - } - - if (!needsQuotes) { - sb.append(value); - return; - } - - // Quoted encoding - sb.append('"'); - for (int i = 0; i < value.length(); i++) { - char c = value.charAt(i); - if (c == '"') { - sb.append("\"\""); // Escape quote by doubling - } else if (c == '\\') { - sb.append("\\\\"); // Escape backslash by doubling - } else { - sb.append(c); - } - } - sb.append('"'); - } - - /** - * Parse a nested record value. This is useful when a field value is itself a composite. - * - *
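The parse/encode rules above (quote doubling inside quoted fields, an empty slot meaning SQL NULL) are easiest to see in a worked example. The following is a minimal standalone sketch of the same composite-text rules — a hypothetical demo class, not the deleted `PgRecordParser`:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of PostgreSQL composite-text parsing:
// empty slot = SQL NULL, doubled quote = literal quote.
public class RecordTextDemo {
  public static List<String> parse(String input) {
    String s = input.trim();
    if (!s.startsWith("(") || !s.endsWith(")"))
      throw new IllegalArgumentException("not a composite: " + input);
    String content = s.substring(1, s.length() - 1);
    List<String> fields = new ArrayList<>();
    if (content.isEmpty()) return fields;    // "()" has no fields
    int pos = 0, len = content.length();
    while (true) {
      if (pos == len || content.charAt(pos) == ',') {
        fields.add(null);                    // empty field = SQL NULL
      } else if (content.charAt(pos) == '"') {
        StringBuilder sb = new StringBuilder();
        pos++;                               // skip opening quote
        while (true) {
          char c = content.charAt(pos);
          if (c == '"' && pos + 1 < len && content.charAt(pos + 1) == '"') {
            sb.append('"'); pos += 2;        // doubled quote -> literal quote
          } else if (c == '"') {
            pos++; break;                    // closing quote
          } else {
            sb.append(c); pos++;
          }
        }
        fields.add(sb.toString());
      } else {
        int start = pos;
        while (pos < len && content.charAt(pos) != ',') pos++;
        fields.add(content.substring(start, pos));
      }
      if (pos == len) break;
      pos++;                                 // skip comma
    }
    return fields;
  }
}
```

On `(a,,"b,c","d""e")` this yields the values `a`, SQL NULL, `b,c`, and `d"e` — the same behavior the deleted parser's tests exercise.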

-   * <p>When a composite is nested inside another composite, it appears as a quoted string where the
-   * inner composite's special characters are already escaped. This method can be called on the
-   * unescaped field value to parse it recursively.
-   *
-   * @param value the field value containing a nested record
-   * @return list of parsed field values from the nested record
-   */
-  public static List<String> parseNested(String value) {
-    return parse(value);
-  }
-
-  /**
-   * Parse a PostgreSQL array text representation into a list of element values.
-   *
-   *

-   * <p>Array format: {elem1,elem2,...} or {"quoted elem","another"}
-   *
-   * @param input the array value string, e.g., "{val1, val2, ...}"
-   * @return list of parsed element values, where null represents SQL NULL
-   * @throws IllegalArgumentException if the input is malformed
-   */
-  public static List<String> parseArray(String input) {
-    return parseArray(input, ',');
-  }
-
-  /**
-   * Parse a PostgreSQL array text representation with a custom delimiter.
-   *
-   *

PostgreSQL uses semicolon (;) as delimiter for geometric type arrays since their elements - * contain commas. - * - * @param input the array value string - * @param delimiter the element delimiter character (typically ',' or ';') - * @return list of parsed element values, where null represents SQL NULL - */ - public static List parseArray(String input, char delimiter) { - if (input == null) { - throw new IllegalArgumentException("Array input cannot be null"); - } - - String trimmed = input.trim(); - if (trimmed.isEmpty()) { - throw new IllegalArgumentException("Array input cannot be empty"); - } - - if (!trimmed.startsWith("{") || !trimmed.endsWith("}")) { - throw new IllegalArgumentException("Array value must be enclosed in braces: " + input); - } - - // Remove outer braces - String content = trimmed.substring(1, trimmed.length() - 1); - - return parseArrayElements(content, delimiter); - } - - /** - * Parse the content inside braces into individual array elements. - * - * @param content the content between braces - * @param delimiter the element delimiter character - * @return list of element values - */ - private static List parseArrayElements(String content, char delimiter) { - List elements = new ArrayList<>(); - - if (content.isEmpty()) { - // Empty array {} has no elements - return elements; - } - - int pos = 0; - int len = content.length(); - - while (pos <= len) { - // Parse one element - FieldParseResult result = parseArrayElement(content, pos, delimiter); - elements.add(result.value); - pos = result.nextPos; - - if (pos < len) { - // Expect delimiter separator - if (content.charAt(pos) != delimiter) { - throw new IllegalArgumentException( - "Expected delimiter '" - + delimiter - + "' at position " - + pos - + " in array: " - + content); - } - pos++; // Skip delimiter - } else if (pos == len) { - // We've consumed exactly all content - break; - } - } - - return elements; - } - - /** - * Parse a single array element starting at the given position. 
- * - * @param content the full content string - * @param start the starting position - * @param delimiter the element delimiter character - * @return the parsed value and the position after it - */ - private static FieldParseResult parseArrayElement(String content, int start, char delimiter) { - int len = content.length(); - - if (start >= len) { - // Empty element at end - return new FieldParseResult(null, start); - } - - char firstChar = content.charAt(start); - - if (firstChar == delimiter) { - // Empty element (NULL) - return new FieldParseResult(null, start); - } - - if (firstChar == '"') { - // Quoted element - return parseQuotedArrayElement(content, start); - } - - // Unquoted element - read until delimiter or end - return parseUnquotedArrayElement(content, start, delimiter); - } - - /** - * Parse a quoted array element. Array quoting uses backslash escaping for quotes and backslashes. - * - * @param content the full content string - * @param start the position of the opening quote - * @return the parsed value and position after closing quote - */ - private static FieldParseResult parseQuotedArrayElement(String content, int start) { - int len = content.length(); - StringBuilder sb = new StringBuilder(); - - // Skip opening quote - int pos = start + 1; - - while (pos < len) { - char c = content.charAt(pos); - - if (c == '"') { - // End of quoted element - return new FieldParseResult(sb.toString(), pos + 1); - } else if (c == '\\') { - // Backslash escaping in arrays - if (pos + 1 < len) { - char next = content.charAt(pos + 1); - // In array context, backslash escapes the next character - sb.append(next); - pos += 2; - } else { - // Backslash at end - emit as-is - sb.append(c); - pos++; - } - } else { - sb.append(c); - pos++; - } - } - - throw new IllegalArgumentException( - "Unterminated quoted array element starting at position " + start); - } - - /** - * Parse an unquoted array element. 
- * - * @param content the full content string - * @param start the starting position - * @param delimiter the element delimiter character - * @return the parsed value and position after it - */ - private static FieldParseResult parseUnquotedArrayElement( - String content, int start, char delimiter) { - int len = content.length(); - int pos = start; - - while (pos < len) { - char c = content.charAt(pos); - if (c == delimiter) { - break; - } - pos++; - } - - String value = content.substring(start, pos); - - // Check for NULL literal - if (value.equalsIgnoreCase("NULL")) { - return new FieldParseResult(null, pos); - } - - // Unquoted empty string shouldn't happen in arrays - if (value.isEmpty()) { - return new FieldParseResult(null, pos); - } - - return new FieldParseResult(value, pos); - } - - /** - * Encode a list of element values into PostgreSQL array text format. - * - * @param values the element values (null represents SQL NULL) - * @param elementEncoder function to encode each element to its text representation - * @return the encoded string, e.g., "{val1, val2, ...}" - */ - public static String encodeArray( - List values, java.util.function.Function elementEncoder) { - return encodeArray(values, elementEncoder, ','); - } - - /** - * Encode a list of element values into PostgreSQL array text format with a custom delimiter. - * - *
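To make the array rules concrete — braces instead of parentheses, an unquoted `NULL` literal, backslash escaping inside quotes, and a configurable delimiter (`;` for geometric element types) — here is a small standalone sketch; a hypothetical demo class, not the deleted parser:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of PostgreSQL array-text parsing with a custom delimiter
// (',' normally; ';' for geometric element types such as point[]).
public class ArrayTextDemo {
  public static List<String> parseArray(String input, char delim) {
    String s = input.trim();
    if (!s.startsWith("{") || !s.endsWith("}"))
      throw new IllegalArgumentException("not an array: " + input);
    String content = s.substring(1, s.length() - 1);
    List<String> elems = new ArrayList<>();
    if (content.isEmpty()) return elems;     // "{}" has no elements
    int pos = 0, len = content.length();
    while (true) {
      if (pos == len || content.charAt(pos) == delim) {
        elems.add(null);                     // empty slot = NULL
      } else if (content.charAt(pos) == '"') {
        StringBuilder sb = new StringBuilder();
        pos++;                               // skip opening quote
        while (content.charAt(pos) != '"') {
          if (content.charAt(pos) == '\\') pos++;  // backslash escapes next char
          sb.append(content.charAt(pos++));
        }
        pos++;                               // skip closing quote
        elems.add(sb.toString());
      } else {
        int start = pos;
        while (pos < len && content.charAt(pos) != delim) pos++;
        String v = content.substring(start, pos);
        elems.add(v.equalsIgnoreCase("NULL") ? null : v);  // unquoted NULL literal
      }
      if (pos == len) break;
      pos++;                                 // skip delimiter
    }
    return elems;
  }
}
```

Note the contrast with composite parsing: arrays spell NULL out as a literal and use backslash escaping rather than quote doubling.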

PostgreSQL uses semicolon (;) as delimiter for geometric type arrays since their elements - * contain commas. - * - * @param values the element values (null represents SQL NULL) - * @param elementEncoder function to encode each element to its text representation - * @param delimiter the element delimiter character (typically ',' or ';') - * @return the encoded string - */ - public static String encodeArray( - List values, java.util.function.Function elementEncoder, char delimiter) { - StringBuilder sb = new StringBuilder(); - sb.append('{'); - - for (int i = 0; i < values.size(); i++) { - if (i > 0) { - sb.append(delimiter); - } - T value = values.get(i); - if (value == null) { - sb.append("NULL"); - } else { - String encoded = elementEncoder.apply(value); - encodeArrayElement(sb, encoded, delimiter); - } - } - - sb.append('}'); - return sb.toString(); - } - - /** - * Encode a single array element. - * - * @param sb the string builder to append to - * @param value the element value (already encoded to string) - * @param delimiter the element delimiter character - */ - private static void encodeArrayElement(StringBuilder sb, String value, char delimiter) { - if (value == null) { - sb.append("NULL"); - return; - } - - // Check if quoting is needed - boolean needsQuotes = value.isEmpty(); - if (!needsQuotes) { - for (int i = 0; i < value.length(); i++) { - char c = value.charAt(i); - if (c == delimiter - || c == '"' - || c == '{' - || c == '}' - || c == '\\' - || c == '\n' - || c == '\r' - || c == '(' - || c == ')' - || Character.isWhitespace(c)) { - needsQuotes = true; - break; - } - } - } - - if (!needsQuotes) { - sb.append(value); - return; - } - - // Quoted encoding with backslash escaping - sb.append('"'); - for (int i = 0; i < value.length(); i++) { - char c = value.charAt(i); - if (c == '"' || c == '\\') { - sb.append('\\'); // Escape with backslash - } - sb.append(c); - } - sb.append('"'); - } -} diff --git 
a/foundations-jdbc/src/java/dev/typr/foundations/PgStruct.java b/foundations-jdbc/src/java/dev/typr/foundations/PgStruct.java
deleted file mode 100644
index 7dde2e597e..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/PgStruct.java
+++ /dev/null
@@ -1,411 +0,0 @@
-package dev.typr.foundations;
-
-import dev.typr.foundations.data.JsonValue;
-import java.sql.SQLException;
-import java.util.ArrayList;
-import java.util.LinkedHashMap;
-import java.util.List;
-import java.util.Optional;
-import java.util.function.Function;
-import org.postgresql.util.PGobject;
-
-/**
- * PostgreSQL composite type (record) support.
- *
- *

- * <p>A composite type is an ordered sequence of named fields with typed values. Example: CREATE
- * TYPE address AS (street VARCHAR, city VARCHAR, zip VARCHAR)
- *
- *

- * <p>In Java, we represent a composite type as a generated record class with typed fields. This
- * class provides the machinery to read/write composite types via JDBC using the PostgreSQL text
- * format.
- *
- * @param <A> the Java type representing this composite (typically a generated record)
- */
-public record PgStruct<A>(
-    PgTypename.CompositeOf<A> typename,
-    List<Field<A, ?>> fields,
-    StructReader<A> reader,
-    StructWriter<A> writer,
-    PgJson<A> json) {
-
-  /**
-   * A single field in a composite type.
-   *
-   * @param <A> the struct type
-   * @param <F> the field value type
-   */
-  public record Field<A, F>(String name, PgType<F> type, Function<A, F> getter) {}
-
-  /** Functional interface for reading a composite from parsed field values. */
-  @FunctionalInterface
-  public interface StructReader<A> {
-    A read(Object[] fieldValues) throws SQLException;
-  }
-
-  /** Functional interface for writing a composite to field values. */
-  @FunctionalInterface
-  public interface StructWriter<A> {
-    Object[] write(A value);
-  }
-
-  /** Create a PgType for this composite type.
*/ - public PgType asType() { - PgRead pgRead = - PgRead.of( - (rs, idx) -> { - Object obj = rs.getObject(idx); - if (obj == null) return null; - if (obj instanceof PGobject pgObj) { - String textValue = pgObj.getValue(); - if (textValue == null) return null; - return parseFromText(textValue); - } - throw new SQLException( - "Expected PGobject for composite type, got: " + obj.getClass()); - }); - - PgWrite pgWrite = - new PgWrite.Instance<>( - (ps, idx, pgObj) -> ps.setObject(idx, pgObj), - value -> { - if (value == null) return null; - PGobject pgObj = new PGobject(); - pgObj.setType(typename.sqlType()); - try { - pgObj.setValue(encodeToText(value)); - } catch (SQLException e) { - throw new RuntimeException("Failed to encode composite type", e); - } - return pgObj; - }); - - var self = this; - PgText pgText = - new PgText<>() { - @Override - public void unsafeEncode(A value, StringBuilder sb) { - sb.append(encodeToText(value)); - } - - @Override - public void unsafeArrayEncode(A value, StringBuilder sb) { - // For array encoding, the value needs to be quoted - unsafeEncode(value, sb); - } - }; - - PgCompositeText pgCompositeText = - new PgCompositeText<>() { - @Override - public java.util.Optional encode(A value) { - return java.util.Optional.of(encodeToText(value)); - } - - @Override - public A decode(String text) { - try { - return self.parseFromText(text); - } catch (SQLException e) { - throw new RuntimeException("Failed to parse composite type", e); - } - } - }; - - return new PgType<>(typename.asGeneric(), pgRead, pgWrite, pgText, pgCompositeText, json); - } - - /** Create an optional version of this composite type. */ - public PgType> asOptType() { - return asType().opt(); - } - - /** Parse a composite value from PostgreSQL text format. 
*/ - private A parseFromText(String text) throws SQLException { - List parsedFields = PgRecordParser.parse(text); - - if (parsedFields.size() != fields.size()) { - throw new SQLException( - "Field count mismatch: expected " - + fields.size() - + " but got " - + parsedFields.size() - + " in: " - + text); - } - - Object[] fieldValues = new Object[fields.size()]; - for (int i = 0; i < fields.size(); i++) { - Field field = fields.get(i); - String rawValue = parsedFields.get(i); - fieldValues[i] = parseFieldValue(field, rawValue); - } - - return reader.read(fieldValues); - } - - /** Parse a single field value from text. */ - private F parseFieldValue(Field field, String rawValue) { - if (rawValue == null) { - return null; - } - return field.type().pgCompositeText().decode(rawValue); - } - - /** Encode a composite value to PostgreSQL text format. */ - private String encodeToText(A value) { - List encodedFields = new ArrayList<>(fields.size()); - for (Field field : fields) { - encodedFields.add(encodeFieldValue(field, value)); - } - return PgRecordParser.encode(encodedFields); - } - - /** Encode a single field value to text. */ - private String encodeFieldValue(Field field, A structValue) { - F value = field.getter().apply(structValue); - if (value == null) { - return null; - } - return field.type().pgCompositeText().encode(value).orElse(null); - } - - // ======================================================================== - // Builder API for creating composite types - // ======================================================================== - - /** - * Create a composite type builder. - * - * @param the struct type (typically a record) - */ - public static Builder builder(String typeName) { - return new Builder<>(typeName); - } - - public static class Builder { - private final String typeName; - private final List> fields = new ArrayList<>(); - - Builder(String typeName) { - this.typeName = typeName; - } - - /** - * Add a field using its PgType for encode/decode. 
- * - * @param name the field name in SQL - * @param type the PgType for the field (provides encode/decode via pgText) - * @param getter function to extract field value from struct - */ - public Builder field(String name, PgType type, Function getter) { - fields.add(new Field<>(name, type, getter)); - return this; - } - - /** - * Add a string field. - * - * @param name the field name in SQL - * @param type the PgType for the field - * @param getter function to extract field value from struct - */ - public Builder stringField(String name, PgType type, Function getter) { - return field(name, type, getter); - } - - /** - * Add an integer field. - * - * @param name the field name in SQL - * @param type the PgType for the field - * @param getter function to extract field value from struct - */ - public Builder intField(String name, PgType type, Function getter) { - return field(name, type, getter); - } - - /** - * Add a long field. - * - * @param name the field name in SQL - * @param type the PgType for the field - * @param getter function to extract field value from struct - */ - public Builder longField(String name, PgType type, Function getter) { - return field(name, type, getter); - } - - /** - * Add a double field. - * - * @param name the field name in SQL - * @param type the PgType for the field - * @param getter function to extract field value from struct - */ - public Builder doubleField(String name, PgType type, Function getter) { - return field(name, type, getter); - } - - /** - * Add a boolean field. - * - * @param name the field name in SQL - * @param type the PgType for the field - * @param getter function to extract field value from struct - */ - public Builder booleanField(String name, PgType type, Function getter) { - return field(name, type, getter); - } - - /** - * Add a nested composite field. 
- * - * @param name the field name in SQL - * @param nestedStruct the PgStruct for the nested composite - * @param getter function to extract field value from struct - */ - public Builder nestedField( - String name, PgStruct nestedStruct, Function getter) { - return field(name, nestedStruct.asType(), getter); - } - - /** - * Add an array field of nested composites. - * - * @param name the field name in SQL - * @param nestedStruct the PgStruct for array elements - * @param getter function to extract array value from struct - * @param arrayFactory factory to create arrays of the element type - */ - public Builder nestedArrayField( - String name, - PgStruct nestedStruct, - Function getter, - java.util.function.IntFunction arrayFactory) { - // Create array type with proper text encode/decode - PgType elementType = nestedStruct.asType(); - PgType arrayType = - new PgType<>( - elementType.typename().array(), - PgRead.of( - (rs, idx) -> { - throw new UnsupportedOperationException( - "Direct JDBC read not supported for nested arrays"); - }), - elementType.write().array(elementType.typename()), - elementType.pgText().array(), - elementType.pgCompositeText().array(arrayFactory), - elementType.pgJson().array(arrayFactory)); - return field(name, arrayType, getter); - } - - /** - * Add a nullable string field. This is a convenience method for the common case. - * - * @param name the field name in SQL - * @param type the PgType for the field - * @param getter function to extract field value from struct - */ - public Builder nullableField(String name, PgType type, Function getter) { - return stringField(name, type, getter); - } - - /** - * Add an optional field where the getter returns Optional<F>. - * - *

-     * <p>This is the preferred way to handle nullable fields. The Optional is unwrapped internally,
-     * so the PgType should be for the inner type F, not Optional&lt;F&gt;.
-     *
-     *
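The builder flow — register each field with a getter, let `optField` unwrap `Optional` so an empty value renders as a NULL slot — can be sketched with a toy version. All names here are illustrative; the real `PgStruct.Builder` also carries a `PgType` per field for encode/decode:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.function.Function;

// Toy illustration of the PgStruct.Builder pattern: fields are registered
// with getters; optField unwraps Optional to a nullable value so that
// Optional.empty() becomes an empty (NULL) composite slot.
public class StructBuilderDemo {
  record Title(String code, Optional<String> description) {}

  static class Builder<A> {
    final List<Function<A, Object>> getters = new ArrayList<>();

    <F> Builder<A> field(String name, Function<A, F> getter) {
      getters.add(a -> getter.apply(a));
      return this;
    }

    <F> Builder<A> optField(String name, Function<A, Optional<F>> getter) {
      // unwrap Optional to nullable, like PgStruct.Builder#optField
      getters.add(a -> getter.apply(a).orElse(null));
      return this;
    }

    String encode(A value) {
      StringBuilder sb = new StringBuilder("(");
      for (int i = 0; i < getters.size(); i++) {
        if (i > 0) sb.append(',');
        Object v = getters.get(i).apply(value);
        if (v != null) sb.append(v);        // null renders as an empty slot
      }
      return sb.append(')').toString();
    }
  }

  static final Builder<Title> CODEC =
      new Builder<Title>()
          .field("code", Title::code)
          .optField("description", Title::description);
}
```

With this sketch, `Title("PR", Optional.empty())` encodes to `(PR,)` — the empty second slot is how composite text spells NULL.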

For Scala Option types, convert to Java Optional using scala.jdk.OptionConverters. - * - * @param name the field name in SQL - * @param type the PgType for the inner type F (not Optional<F>) - * @param getter function to extract Optional<F> value from struct - */ - public Builder optField(String name, PgType type, Function> getter) { - // Unwrap Optional to nullable F for internal storage - Function unwrappingGetter = a -> getter.apply(a).orElse(null); - fields.add(new Field<>(name, type, unwrappingGetter)); - return this; - } - - /** - * Build the PgStruct with auto-derived writer and JSON codec. - * - * @param reader function to construct struct from field values array - */ - public PgStruct build(StructReader reader) { - List typenameFields = - fields.stream() - .map(f -> new PgTypename.CompositeOf.CompositeField(f.name(), f.type().typename())) - .toList(); - - PgTypename.CompositeOf typename = new PgTypename.CompositeOf<>(typeName, typenameFields); - - // Auto-derive writer from getters - StructWriter writer = - structValue -> { - Object[] values = new Object[fields.size()]; - for (int i = 0; i < fields.size(); i++) { - values[i] = extractFieldValue(fields.get(i), structValue); - } - return values; - }; - - // Auto-derive JSON codec from fields - PgJson json = - new PgJson<>() { - @Override - public JsonValue toJson(A value) { - LinkedHashMap jsonFields = new LinkedHashMap<>(); - for (Field field : fields) { - jsonFields.put(field.name(), fieldToJson(field, value)); - } - return new JsonValue.JObject(jsonFields); - } - - @Override - public A fromJson(JsonValue jsonValue) { - if (jsonValue instanceof JsonValue.JObject obj) { - Object[] values = new Object[fields.size()]; - for (int i = 0; i < fields.size(); i++) { - Field field = fields.get(i); - JsonValue fieldJson = obj.fields().get(field.name()); - values[i] = fieldFromJson(field, fieldJson); - } - try { - return reader.read(values); - } catch (SQLException e) { - throw new RuntimeException("Failed to 
construct struct from JSON", e); - } - } - throw new IllegalArgumentException("Expected JSON object"); - } - }; - - return new PgStruct<>(typename, List.copyOf(fields), reader, writer, json); - } - - @SuppressWarnings("unchecked") - private Object extractFieldValue(Field field, A structValue) { - return field.getter().apply(structValue); - } - - @SuppressWarnings("unchecked") - private JsonValue fieldToJson(Field field, A structValue) { - F value = field.getter().apply(structValue); - if (value == null) { - return JsonValue.JNull.INSTANCE; - } - return field.type().pgJson().toJson(value); - } - - @SuppressWarnings("unchecked") - private Object fieldFromJson(Field field, JsonValue jsonValue) { - if (jsonValue == null || jsonValue instanceof JsonValue.JNull) { - return null; - } - return field.type().pgJson().fromJson(jsonValue); - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/PgText.java b/foundations-jdbc/src/java/dev/typr/foundations/PgText.java deleted file mode 100644 index ac82db3f82..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/PgText.java +++ /dev/null @@ -1,291 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.math.BigInteger; -import java.util.Map; -import java.util.Optional; -import java.util.UUID; -import java.util.function.BiConsumer; -import java.util.function.Function; -import org.postgresql.util.PGobject; - -/** - * This is `Text` ported from doobie. - * - *

- * <p>It is used to encode rows in string format for the COPY command.
- *
- *
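The COPY text row format these encoders target can be sketched independently of the deleted class: columns joined by tabs, SQL NULL spelled `\N`, and backslash/control characters escaped. This is a simplified, hypothetical demo, not the doobie-derived encoders:

```java
import java.util.List;

// Sketch of the COPY text row format that PgText targets: tab-delimited
// columns, \N as the NULL marker, and backslash escapes for control chars.
public class CopyTextDemo {
  static String encodeCell(String value) {
    if (value == null) return "\\N";        // COPY's NULL marker
    StringBuilder sb = new StringBuilder();
    for (char c : value.toCharArray()) {
      switch (c) {
        case '\\' -> sb.append("\\\\");
        case '\t' -> sb.append("\\t");
        case '\n' -> sb.append("\\n");
        case '\r' -> sb.append("\\r");
        default -> sb.append(c);
      }
    }
    return sb.toString();
  }

  static String encodeRow(List<String> cells) {
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < cells.size(); i++) {
      if (i > 0) sb.append('\t');           // the DELIMETER constant below
      sb.append(encodeCell(cells.get(i)));
    }
    return sb.toString();
  }
}
```

The escaping matters because a literal tab or newline in a cell would otherwise be read back as a column or row separator.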

- */ -public abstract class PgText implements DbText { - public abstract void unsafeEncode(A a, StringBuilder sb); - - public abstract void unsafeArrayEncode(A a, StringBuilder sb); - - public PgText contramap(Function f) { - var self = this; - return instance( - (b, sb) -> self.unsafeEncode(f.apply(b), sb), - (b, sb) -> self.unsafeArrayEncode(f.apply(b), sb)); - } - - public PgText> opt() { - var self = this; - return instance( - (a, sb) -> { - if (a.isPresent()) self.unsafeEncode(a.get(), sb); - else sb.append(PgText.NULL); - }, - (a, sb) -> { - if (a.isPresent()) self.unsafeArrayEncode(a.get(), sb); - else sb.append(PgText.NULL); - }); - } - - public PgText array() { - var self = this; - return PgText.instance( - (as, sb) -> { - var first = true; - sb.append("{"); - for (var a : as) { - if (first) first = false; - else sb.append(','); - self.unsafeArrayEncode(a, sb); - } - sb.append('}'); - }); - } - - public static char DELIMETER = '\t'; - public static String NULL = "\\N"; - - public static PgText instance(BiConsumer f) { - return instance(f, f); - } - - public static PgText instance( - BiConsumer f, BiConsumer arrayF) { - return new PgText<>() { - @Override - public void unsafeEncode(A a, StringBuilder sb) { - f.accept(a, sb); - } - - @Override - public void unsafeArrayEncode(A a, StringBuilder sb) { - arrayF.accept(a, sb); - } - }; - } - - @SuppressWarnings("unchecked") - public static PgText from(RowParser rowParser) { - return instance( - (row, sb) -> { - var encoded = rowParser.encode().apply(row); - for (int i = 0; i < encoded.length; i++) { - if (i > 0) { - sb.append(PgText.DELIMETER); - } - DbText text = (DbText) rowParser.columns().get(i).text(); - text.unsafeEncode(encoded[i], sb); - } - }); - } - - public static PgText instanceToString() { - return textString.contramap(Object::toString); - } - - public static final PgText textString = - instance(StringImpl::unsafeEncode, StringImpl::unsafeArrayEncode); - public static final PgText textInteger = 
PgText.instance((n, sb) -> sb.append(n)); - public static final PgText textShort = PgText.instance((n, sb) -> sb.append(n)); - public static final PgText textLong = PgText.instance((n, sb) -> sb.append(n)); - public static final PgText textFloat = PgText.instance((n, sb) -> sb.append(n)); - public static final PgText textDouble = PgText.instance((n, sb) -> sb.append(n)); - public static final PgText textBigDecimal = PgText.instance((n, sb) -> sb.append(n)); - public static final PgText textBoolean = PgText.instance((n, sb) -> sb.append(n)); - public static final PgText textUuid = PgText.instance((n, sb) -> sb.append(n)); - public static final PgText textByteArray = - PgText.instance( - (bs, sb) -> { - sb.append("\\\\x"); - if (bs.length > 0) { - var hex = new BigInteger(1, bs).toString(16); - var pad = bs.length * 2 - hex.length(); - sb.append("0".repeat(Math.max(0, pad))); - sb.append(hex); - } - }); - - public static PgText textPGobject() { - return PgText.textString.contramap( - x -> { - // let's be defensive since it seems there are so many possibilities for nulls - if (x == null || x.isNull()) return "null"; - else return x.toString(); - }); - } - - public static final PgText> textMapStringString = - PgText.instance( - (m, sb) -> { - var first = true; - for (var e : m.entrySet()) { - if (first) first = false; - else sb.append(','); - StringImpl.unsafeEncode(e.getKey(), sb); - sb.append("=>"); - StringImpl.unsafeEncode(e.getValue(), sb); - } - }); - - private interface StringImpl { - // Standard char encodings that don't differ in array context - static void stdChar(char c, StringBuilder sb) { - switch (c) { - case '\b': - sb.append("\\b"); - break; - case '\f': - sb.append("\\f"); - break; - case '\n': - sb.append("\\n"); - break; - case '\r': - sb.append("\\r"); - break; - case '\t': - sb.append("\\t"); - break; - case 0x0b: - sb.append("\\v"); - break; - default: - sb.append(c); - break; - } - } - - static void unsafeEncode(String s, StringBuilder sb) { - 
for (int i = 0; i < s.length(); i++) { - char c = s.charAt(i); - if (c == '\\') { - sb.append("\\\\"); // backslash must be doubled - } else { - stdChar(c, sb); - } - } - } - - // I am not confident about this encoder. Postgres seems not to be able to cope with low - // control characters or high whitespace characters so these are simply filtered out in the - // tests. It should accommodate arrays of non-pathological strings but it would be nice to - // have a complete specification of what's actually happening. - static void unsafeArrayEncode(String s, StringBuilder sb) { - sb.append('"'); - for (int i = 0; i < s.length(); i++) { - char c = s.charAt(i); - switch (c) { - case '\"': - sb.append("\\\\\\\""); - break; - case '\\': - sb.append("\\\\\\\\\\\\\\\\"); // srsly - break; - default: - stdChar(c, sb); - break; - } - } - sb.append('"'); - } - } - - public static final PgText NotWorking = - new PgText<>() { - @Override - public void unsafeEncode(Object t, StringBuilder sb) { - throw new UnsupportedOperationException("streaming COPY is not supported for this type"); - } - - @Override - public void unsafeArrayEncode(Object t, StringBuilder sb) { - throw new UnsupportedOperationException("streaming COPY is not supported for this type"); - } - }; - - @Deprecated - public static PgText NotWorking() { - return (PgText) NotWorking; - } - - // Unboxed primitive array text encoders - public static final PgText boolArrayUnboxed = - instance( - (arr, sb) -> { - sb.append('{'); - for (int i = 0; i < arr.length; i++) { - if (i > 0) sb.append(','); - sb.append(arr[i] ? 
't' : 'f'); - } - sb.append('}'); - }); - - public static final PgText shortArrayUnboxed = - instance( - (arr, sb) -> { - sb.append('{'); - for (int i = 0; i < arr.length; i++) { - if (i > 0) sb.append(','); - sb.append(arr[i]); - } - sb.append('}'); - }); - - public static final PgText intArrayUnboxed = - instance( - (arr, sb) -> { - sb.append('{'); - for (int i = 0; i < arr.length; i++) { - if (i > 0) sb.append(','); - sb.append(arr[i]); - } - sb.append('}'); - }); - - public static final PgText longArrayUnboxed = - instance( - (arr, sb) -> { - sb.append('{'); - for (int i = 0; i < arr.length; i++) { - if (i > 0) sb.append(','); - sb.append(arr[i]); - } - sb.append('}'); - }); - - public static final PgText floatArrayUnboxed = - instance( - (arr, sb) -> { - sb.append('{'); - for (int i = 0; i < arr.length; i++) { - if (i > 0) sb.append(','); - sb.append(arr[i]); - } - sb.append('}'); - }); - - public static final PgText doubleArrayUnboxed = - instance( - (arr, sb) -> { - sb.append('{'); - for (int i = 0; i < arr.length; i++) { - if (i > 0) sb.append(','); - sb.append(arr[i]); - } - sb.append('}'); - }); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/PgType.java b/foundations-jdbc/src/java/dev/typr/foundations/PgType.java deleted file mode 100644 index 4e6092b797..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/PgType.java +++ /dev/null @@ -1,155 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.dsl.Bijection; -import java.util.Optional; -import java.util.function.Function; -import java.util.function.IntFunction; - -public record PgType( - PgTypename typename, - PgRead read, - PgWrite write, - PgText pgText, - PgCompositeText pgCompositeText, - PgJson pgJson) - implements DbType { - @Override - public DbText text() { - return pgText; - } - - @Override - public DbJson json() { - return pgJson; - } - - public Fragment.Value encode(A value) { - return new Fragment.Value<>(value, this); - } - - public PgType 
withTypename(PgTypename typename) { - return new PgType<>(typename, read, write, pgText, pgCompositeText, pgJson); - } - - public PgType withTypename(String sqlType) { - return withTypename(PgTypename.of(sqlType)); - } - - public PgType renamed(String value) { - return withTypename(typename.renamed(value)); - } - - public PgType renamedDropPrecision(String value) { - return withTypename(typename.renamedDropPrecision(value)); - } - - public PgType withRead(PgRead read) { - return new PgType<>(typename, read, write, pgText, pgCompositeText, pgJson); - } - - public PgType withWrite(PgWrite write) { - return new PgType<>(typename, read, write, pgText, pgCompositeText, pgJson); - } - - public PgType withText(PgText text) { - return new PgType<>(typename, read, write, text, pgCompositeText, pgJson); - } - - public PgType withCompositeText(PgCompositeText compositeText) { - return new PgType<>(typename, read, write, pgText, compositeText, pgJson); - } - - public PgType withJson(PgJson json) { - return new PgType<>(typename, read, write, pgText, pgCompositeText, json); - } - - public PgType> opt() { - return new PgType<>( - typename.opt(), - read.opt(), - write.opt(typename), - pgText.opt(), - pgCompositeText.opt(), - pgJson.opt()); - } - - public PgType array(PgRead read, IntFunction arrayFactory) { - return new PgType<>( - typename.array(), - read, - write.array(typename), - pgText.array(), - pgCompositeText.array(arrayFactory), - pgJson.array(arrayFactory)); - } - - /** - * Create an array type with a custom delimiter for composite text encoding/decoding. - * - *

PostgreSQL uses semicolon (;) as the delimiter for geometric type arrays (box[], circle[], - * line[], lseg[], path[], point[], polygon[]) because their elements contain commas. - */ - public PgType array( - PgRead read, IntFunction arrayFactory, char compositeTextDelimiter) { - return new PgType<>( - typename.array(), - read, - write.array(typename), - pgText.array(), - pgCompositeText.array(arrayFactory, compositeTextDelimiter), - pgJson.array(arrayFactory)); - } - - public PgType array(PgRead read, PgWrite write, IntFunction arrayFactory) { - return new PgType<>( - typename.array(), - read, - write, - pgText.array(), - pgCompositeText.array(arrayFactory), - pgJson.array(arrayFactory)); - } - - public PgType bimap(SqlFunction f, Function g) { - return new PgType<>( - typename.as(), - read.map(f), - write.contramap(g), - pgText.contramap(g), - pgCompositeText.bimap( - a -> { - try { - return f.apply(a); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - }, - g), - pgJson.bimap(f, g)); - } - - public PgType to(Bijection bijection) { - return new PgType<>( - typename.as(), - read.map(bijection::underlying), - write.contramap(bijection::from), - pgText.contramap(bijection::from), - pgCompositeText.bimap(bijection::underlying, bijection::from), - pgJson.bimap(bijection::underlying, bijection::from)); - } - - public static PgType of( - String tpe, PgRead r, PgWrite w, PgText t, PgCompositeText ct, PgJson j) { - return new PgType<>(PgTypename.of(tpe), r, w, t, ct, j); - } - - public static PgType of( - PgTypename typename, - PgRead r, - PgWrite w, - PgText t, - PgCompositeText ct, - PgJson j) { - return new PgType<>(typename, r, w, t, ct, j); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/PgTypename.java b/foundations-jdbc/src/java/dev/typr/foundations/PgTypename.java deleted file mode 100644 index f2b24fbecf..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/PgTypename.java +++ /dev/null @@ -1,185 +0,0 @@ 
-package dev.typr.foundations; - -import dev.typr.foundations.dsl.Bijection; -import java.util.Optional; - -public sealed interface PgTypename extends DbTypename { - String sqlType(); - - String sqlTypeNoPrecision(); - - PgTypename array(); - - PgTypename renamed(String value); - - PgTypename renamedDropPrecision(String value); - - default PgTypename> opt() { - return new Opt<>(this); - } - - default PgTypename as() { - return (PgTypename) this; - } - - /** - * Type-safe conversion using a bijection as proof of type relationship. Overrides DbTypename.to() - * to return PgTypename for better type refinement. - */ - @Override - default PgTypename to(Bijection bijection) { - return (PgTypename) this; - } - - record Base(String sqlType) implements PgTypename { - @Override - public String sqlTypeNoPrecision() { - return sqlType; - } - - @Override - public PgTypename array() { - return new ArrayOf<>(this); - } - - @Override - public Base renamed(String value) { - return new Base<>(value); - } - - @Override - public Base renamedDropPrecision(String value) { - return new Base<>(value); - } - } - - record ArrayOf(PgTypename of) implements PgTypename { - @Override - public String sqlType() { - return of.sqlType() + "[]"; - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision() + "[]"; - } - - @Override - public PgTypename array() { - return new ArrayOf<>(this); - } - - @Override - public PgTypename renamed(String value) { - return new ArrayOf<>(of.renamed(value)); - } - - @Override - public PgTypename renamedDropPrecision(String value) { - return new ArrayOf<>(of.renamedDropPrecision(value)); - } - } - - record WithPrec(Base of, int precision) implements PgTypename { - public String sqlType() { - return of.sqlType + "(" + precision + ")"; - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public PgTypename array() { - // drops precision - return new ArrayOf<>(this); - } - - @Override 
- public PgTypename renamed(String value) { - return new WithPrec<>(of.renamed(value), precision); - } - - @Override - public PgTypename renamedDropPrecision(String value) { - return of.renamed(value); - } - } - - record Opt(PgTypename of) implements PgTypename> { - @Override - public String sqlType() { - return of.sqlType(); - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public PgTypename[]> array() { - return new ArrayOf<>(this); - } - - @Override - public PgTypename> renamed(String value) { - return new Opt<>(of.renamed(value)); - } - - @Override - public PgTypename> renamedDropPrecision(String value) { - return new Opt<>(of.renamedDropPrecision(value)); - } - } - - static PgTypename of(String sqlType) { - return new Base<>(sqlType); - } - - static PgTypename of(String sqlType, int precision) { - return new WithPrec<>(new Base<>(sqlType), precision); - } - - /** - * A composite type (record) typename with field information. - * - * @param the Java type representing this composite - */ - record CompositeOf(String name, java.util.List fields) - implements PgTypename { - public record CompositeField(String name, PgTypename type) {} - - @Override - public String sqlType() { - return name; - } - - @Override - public String sqlTypeNoPrecision() { - return name; - } - - @Override - public PgTypename array() { - return new ArrayOf<>(this); - } - - @Override - public CompositeOf renamed(String value) { - return new CompositeOf<>(value, fields); - } - - @Override - public CompositeOf renamedDropPrecision(String value) { - return new CompositeOf<>(value, fields); - } - - /** Convert to generic PgTypename for use in PgType. 
*/ - @SuppressWarnings("unchecked") - public PgTypename asGeneric() { - return (PgTypename) this; - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/PgTypes.java b/foundations-jdbc/src/java/dev/typr/foundations/PgTypes.java deleted file mode 100644 index c8199cb5bc..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/PgTypes.java +++ /dev/null @@ -1,619 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.*; -import dev.typr.foundations.data.Record; -import java.math.BigDecimal; -import java.time.*; -import java.util.Map; -import java.util.UUID; -import java.util.function.Function; -import org.postgresql.geometric.*; -import org.postgresql.util.PGInterval; -import org.postgresql.util.PGobject; - -public interface PgTypes { - PgType aclitem = ofPgObject("aclitem", AclItem::new, AclItem::value, PgJson.aclitem); - PgType aclitemArray = - aclitem.array(PgRead.pgObjectArray(AclItem::new, AclItem.class), AclItem[]::new); - PgType anyarray = - ofPgObject( - "anyarray", - AnyArray::new, - AnyArray::value, - PgJson.text.bimap(AnyArray::new, AnyArray::value)); - PgType anyarrayArray = - anyarray.array(PgRead.pgObjectArray(AnyArray::new, AnyArray.class), AnyArray[]::new); - PgType numeric = - PgType.of( - "numeric", - PgRead.readBigDecimal, - PgWrite.writeBigDecimal, - PgText.textBigDecimal, - PgCompositeText.numeric, - PgJson.numeric); - PgType numericArray = numeric.array(PgRead.readBigDecimalArray, BigDecimal[]::new); - PgType bool = - PgType.of( - "bool", - PgRead.readBoolean, - PgWrite.writeBoolean, - PgText.textBoolean, - PgCompositeText.bool, - PgJson.bool); - PgType boolArray = bool.array(PgRead.readBooleanArray, Boolean[]::new); - - @SuppressWarnings("unchecked") - PgType boolArrayUnboxed = - PgType.of( - (PgTypename) (PgTypename) PgTypename.of("bool").array(), - PgRead.readBooleanArrayUnboxed, - PgWrite.writeBooleanArrayUnboxed, - PgText.boolArrayUnboxed, - PgCompositeText.boolArrayUnboxed, - 
PgJson.boolArrayUnboxed); - - PgType float8 = - PgType.of( - "float8", - PgRead.readDouble, - PgWrite.writeDouble, - PgText.textDouble, - PgCompositeText.float8, - PgJson.float8); - PgType float8Array = float8.array(PgRead.readDoubleArray, Double[]::new); - - @SuppressWarnings("unchecked") - PgType float8ArrayUnboxed = - PgType.of( - (PgTypename) (PgTypename) PgTypename.of("float8").array(), - PgRead.readDoubleArrayUnboxed, - PgWrite.writeDoubleArrayUnboxed, - PgText.doubleArrayUnboxed, - PgCompositeText.doubleArrayUnboxed, - PgJson.doubleArrayUnboxed); - - PgType float4 = - PgType.of( - "float4", - PgRead.readFloat, - PgWrite.writeFloat, - PgText.textFloat, - PgCompositeText.float4, - PgJson.float4); - PgType float4Array = float4.array(PgRead.readFloatArray, Float[]::new); - - @SuppressWarnings("unchecked") - PgType float4ArrayUnboxed = - PgType.of( - (PgTypename) (PgTypename) PgTypename.of("float4").array(), - PgRead.readFloatArrayUnboxed, - PgWrite.writeFloatArrayUnboxed, - PgText.floatArrayUnboxed, - PgCompositeText.floatArrayUnboxed, - PgJson.floatArrayUnboxed); - - PgType inet = ofPgObject("inet", Inet::new, Inet::value, PgJson.inet); - PgType inetArray = inet.array(PgRead.pgObjectArray(Inet::new, Inet.class), Inet[]::new); - PgType cidr = ofPgObject("cidr", Cidr::new, Cidr::value, PgJson.cidr); - PgType cidrArray = cidr.array(PgRead.pgObjectArray(Cidr::new, Cidr.class), Cidr[]::new); - PgType macaddr = ofPgObject("macaddr", MacAddr::new, MacAddr::value, PgJson.macaddr); - PgType macaddrArray = - macaddr.array(PgRead.pgObjectArray(MacAddr::new, MacAddr.class), MacAddr[]::new); - PgType macaddr8 = - ofPgObject("macaddr8", MacAddr8::new, MacAddr8::value, PgJson.macaddr8); - PgType macaddr8Array = - macaddr8.array(PgRead.pgObjectArray(MacAddr8::new, MacAddr8.class), MacAddr8[]::new); - PgType timestamptz = - PgType.of( - "timestamptz", - PgRead.readInstant, - PgWrite.primitive((ps, i, v) -> ps.setObject(i, v.atOffset(ZoneOffset.UTC))), - PgText.instance( - (t, 
sb) -> sb.append(t.atOffset(ZoneOffset.UTC).toString().replace('T', ' '))), - PgCompositeText.of( - t -> t.atOffset(ZoneOffset.UTC).toString().replace('T', ' '), - text -> OffsetDateTime.parse(text.replace(' ', 'T')).toInstant()), - PgJson.timestamptz); - PgType timestamptzArray = timestamptz.array(PgRead.readInstantArray, Instant[]::new); - PgType int2vector = - ofPgObject("int2vector", Int2Vector::new, Int2Vector::value, PgJson.int2vector); - PgType int2vectorArray = - int2vector.array(PgRead.pgObjectArray(Int2Vector::new, Int2Vector.class), Int2Vector[]::new); - PgType int4 = - PgType.of( - "int4", - PgRead.readInteger, - PgWrite.writeInteger, - PgText.textInteger, - PgCompositeText.int4, - PgJson.int4); - PgType int4Array = int4.array(PgRead.readIntegerArray, Integer[]::new); - - @SuppressWarnings("unchecked") - PgType int4ArrayUnboxed = - PgType.of( - (PgTypename) (PgTypename) PgTypename.of("int4").array(), - PgRead.readIntArrayUnboxed, - PgWrite.writeIntArrayUnboxed, - PgText.intArrayUnboxed, - PgCompositeText.intArrayUnboxed, - PgJson.intArrayUnboxed); - - PgType json = ofPgObject("json", Json::new, Json::value, PgJson.json); - PgType jsonArray = json.array(PgRead.readJsonArray, Json[]::new); - PgType jsonb = ofPgObject("jsonb", Jsonb::new, Jsonb::value, PgJson.jsonb); - PgType jsonbArray = jsonb.array(PgRead.readJsonbArray, Jsonb[]::new); - PgType date = - PgType.of( - "date", - PgRead.readLocalDate, - PgWrite.passObjectToJdbc(), - PgText.instance((d, sb) -> sb.append(d.toString())), - PgCompositeText.of(LocalDate::toString, LocalDate::parse), - PgJson.date); - PgType timestamp = - PgType.of( - "timestamp", - PgRead.readLocalDateTime, - PgWrite.passObjectToJdbc(), - PgText.instance((t, sb) -> sb.append(t.toString().replace('T', ' '))), - PgCompositeText.of( - t -> t.toString().replace('T', ' '), - text -> LocalDateTime.parse(text.replace(' ', 'T'))), - PgJson.timestamp); - PgType timestampArray = - timestamp.array(PgRead.readLocalDateTimeArray, 
LocalDateTime[]::new); - PgType dateArray = date.array(PgRead.readLocalDateArray, LocalDate[]::new); - PgType time = - PgType.of( - "time", - PgRead.readLocalTime, - PgWrite.passObjectToJdbc(), - PgText.instance((t, sb) -> sb.append(t.toString())), - PgCompositeText.of(LocalTime::toString, LocalTime::parse), - PgJson.time); - PgType timeArray = time.array(PgRead.readLocalTimeArray, LocalTime[]::new); - PgType int8 = - PgType.of( - "int8", - PgRead.readLong, - PgWrite.writeLong, - PgText.textLong, - PgCompositeText.int8, - PgJson.int8); - PgType int8Array = int8.array(PgRead.readLongArray, Long[]::new); - - @SuppressWarnings("unchecked") - PgType int8ArrayUnboxed = - PgType.of( - (PgTypename) (PgTypename) PgTypename.of("int8").array(), - PgRead.readLongArrayUnboxed, - PgWrite.writeLongArrayUnboxed, - PgText.longArrayUnboxed, - PgCompositeText.longArrayUnboxed, - PgJson.longArrayUnboxed); - - // oid is a 32-bit unsigned integer wrapped in Oid type - PgType oid = - PgType.of( - "oid", - PgRead.readLong.map(Oid::new), - PgWrite.writeLong.contramap(Oid::value), - PgText.instance((o, sb) -> sb.append(o.value())), - PgCompositeText.int8.bimap(Oid::new, Oid::value), - PgJson.int8.bimap(Oid::new, Oid::value)); - PgType oidArray = - oid.array( - PgRead.readLongArray.map( - arr -> { - Oid[] result = new Oid[arr.length]; - for (int i = 0; i < arr.length; i++) { - result[i] = new Oid(arr[i]); - } - return result; - }), - Oid[]::new); - - PgType> hstore = - PgType.of( - "hstore", - PgRead.readMapStringString, - PgWrite.passObjectToJdbc(), - PgText.textMapStringString, - PgCompositeText.hstore, - PgJson.hstore); - PgType money = - PgType.of( - "money", - PgRead.readDouble.map(Money::new), - PgWrite.pgObject("money").contramap(m -> String.valueOf(m.value())), - PgText.textDouble.contramap(Money::value), - PgCompositeText.money, - PgJson.money); - PgType moneyArray = money.array(PgRead.readMoneyArray, Money[]::new); - // name is a 63-character identifier type in PostgreSQL, mapped 
to String - PgType name = - PgType.of( - "name", - PgRead.readString, - PgWrite.writeString, - PgText.textString, - PgCompositeText.text, - PgJson.text); - PgType nameArray = name.array(PgRead.readStringArray, String[]::new); - PgType timetz = - PgType.of( - "timetz", - PgRead.readOffsetTime, - PgWrite.passObjectToJdbc(), - PgText.instance((t, sb) -> sb.append(t.toString())), - PgCompositeText.timetz, - PgJson.timetz); - PgType timetzArray = timetz.array(PgRead.readOffsetTimeArray, OffsetTime[]::new); - PgType oidvector = - ofPgObject("oidvector", OidVector::new, OidVector::value, PgJson.oidvector); - PgType oidvectorArray = - oidvector.array(PgRead.pgObjectArray(OidVector::new, OidVector.class), OidVector[]::new); - PgType interval = - PgType.of( - "interval", - PgRead.castJdbcObjectTo(PGInterval.class), - PgWrite.passObjectToJdbc(), - PgText.textPGobject(), - PgCompositeText.interval, - PgJson.interval); - PgType intervalArray = - interval.array(PgRead.castJdbcArrayTo(PGInterval.class), PGInterval[]::new); - PgType box = - PgType.of( - "box", - PgRead.castJdbcObjectTo(PGbox.class), - PgWrite.passObjectToJdbc(), - PgText.textPGobject(), - PgCompositeText.box, - PgJson.box); - // Geometric arrays use semicolon delimiter because elements contain commas - PgType boxArray = box.array(PgRead.castJdbcArrayTo(PGbox.class), PGbox[]::new, ';'); - PgType circle = - PgType.of( - "circle", - PgRead.castJdbcObjectTo(PGcircle.class), - PgWrite.passObjectToJdbc(), - PgText.textPGobject(), - PgCompositeText.circle, - PgJson.circle); - PgType circleArray = - circle.array(PgRead.castJdbcArrayTo(PGcircle.class), PGcircle[]::new, ';'); - PgType line = - PgType.of( - "line", - PgRead.castJdbcObjectTo(PGline.class), - PgWrite.passObjectToJdbc(), - PgText.textPGobject(), - PgCompositeText.line, - PgJson.line); - PgType lineArray = line.array(PgRead.castJdbcArrayTo(PGline.class), PGline[]::new, ';'); - PgType lseg = - PgType.of( - "lseg", - PgRead.castJdbcObjectTo(PGlseg.class), - 
PgWrite.passObjectToJdbc(), - PgText.textPGobject(), - PgCompositeText.lseg, - PgJson.lseg); - PgType lsegArray = lseg.array(PgRead.castJdbcArrayTo(PGlseg.class), PGlseg[]::new, ';'); - PgType path = - PgType.of( - "path", - PgRead.castJdbcObjectTo(PGpath.class), - PgWrite.passObjectToJdbc(), - PgText.textPGobject(), - PgCompositeText.path, - PgJson.path); - PgType pathArray = path.array(PgRead.castJdbcArrayTo(PGpath.class), PGpath[]::new, ';'); - PgType point = - PgType.of( - "point", - PgRead.castJdbcObjectTo(PGpoint.class), - PgWrite.passObjectToJdbc(), - PgText.textPGobject(), - PgCompositeText.point, - PgJson.point); - PgType pointArray = - point.array(PgRead.castJdbcArrayTo(PGpoint.class), PGpoint[]::new, ';'); - PgType polygon = - PgType.of( - "polygon", - PgRead.castJdbcObjectTo(PGpolygon.class), - PgWrite.passObjectToJdbc(), - PgText.textPGobject(), - PgCompositeText.polygon, - PgJson.polygon); - PgType polygonArray = - polygon.array(PgRead.castJdbcArrayTo(PGpolygon.class), PGpolygon[]::new, ';'); - PgType pgNodeTree = - ofPgObject( - "pg_node_tree", - PgNodeTree::new, - PgNodeTree::value, - PgJson.text.bimap(PgNodeTree::new, PgNodeTree::value)); - PgType pgNodeTreeArray = - pgNodeTree.array(PgRead.pgObjectArray(PgNodeTree::new, PgNodeTree.class), PgNodeTree[]::new); - PgType regclass = - ofPgObject("regclass", Regclass::new, Regclass::value, PgJson.regclass); - PgType regclassArray = - regclass.array(PgRead.pgObjectArray(Regclass::new, Regclass.class), Regclass[]::new); - PgType regconfig = - ofPgObject("regconfig", Regconfig::new, Regconfig::value, PgJson.regconfig); - PgType regconfigArray = - regconfig.array(PgRead.pgObjectArray(Regconfig::new, Regconfig.class), Regconfig[]::new); - PgType regdictionary = - ofPgObject("regdictionary", Regdictionary::new, Regdictionary::value, PgJson.regdictionary); - PgType regdictionaryArray = - regdictionary.array( - PgRead.pgObjectArray(Regdictionary::new, Regdictionary.class), Regdictionary[]::new); - PgType 
regnamespace = - ofPgObject("regnamespace", Regnamespace::new, Regnamespace::value, PgJson.regnamespace); - PgType regnamespaceArray = - regnamespace.array( - PgRead.pgObjectArray(Regnamespace::new, Regnamespace.class), Regnamespace[]::new); - PgType regoper = ofPgObject("regoper", Regoper::new, Regoper::value, PgJson.regoper); - PgType regoperArray = - regoper.array(PgRead.pgObjectArray(Regoper::new, Regoper.class), Regoper[]::new); - PgType regoperator = - ofPgObject("regoperator", Regoperator::new, Regoperator::value, PgJson.regoperator); - PgType regoperatorArray = - regoperator.array( - PgRead.pgObjectArray(Regoperator::new, Regoperator.class), Regoperator[]::new); - PgType regproc = ofPgObject("regproc", Regproc::new, Regproc::value, PgJson.regproc); - PgType regprocArray = - regproc.array(PgRead.pgObjectArray(Regproc::new, Regproc.class), Regproc[]::new); - PgType regprocedure = - ofPgObject("regprocedure", Regprocedure::new, Regprocedure::value, PgJson.regprocedure); - PgType regprocedureArray = - regprocedure.array( - PgRead.pgObjectArray(Regprocedure::new, Regprocedure.class), Regprocedure[]::new); - PgType regrole = ofPgObject("regrole", Regrole::new, Regrole::value, PgJson.regrole); - PgType regroleArray = - regrole.array(PgRead.pgObjectArray(Regrole::new, Regrole.class), Regrole[]::new); - PgType regtype = ofPgObject("regtype", Regtype::new, Regtype::value, PgJson.regtype); - PgType regtypeArray = - regtype.array(PgRead.pgObjectArray(Regtype::new, Regtype.class), Regtype[]::new); - PgType int2 = - PgType.of( - "int2", - PgRead.readShort, - PgWrite.writeShort, - PgText.textShort, - PgCompositeText.int2, - PgJson.int2); - PgType smallint = int2.withTypename(PgTypename.of("smallint")); - PgType int2Array = int2.array(PgRead.readShortArray, Short[]::new); - - @SuppressWarnings("unchecked") - PgType int2ArrayUnboxed = - PgType.of( - (PgTypename) (PgTypename) PgTypename.of("int2").array(), - PgRead.readShortArrayUnboxed, - PgWrite.writeShortArrayUnboxed, - 
PgText.shortArrayUnboxed, - PgCompositeText.shortArrayUnboxed, - PgJson.shortArrayUnboxed); - - PgType smallintArray = int2Array.renamed("smallint"); - PgType smallintArrayUnboxed = int2ArrayUnboxed.renamed("smallint"); - PgType bpchar = - PgType.of( - "bpchar", - PgRead.readString, - PgWrite.writeString, - PgText.textString, - PgCompositeText.text, - PgJson.text); - PgType text = - PgType.of( - "text", - PgRead.readString, - PgWrite.writeString, - PgText.textString, - PgCompositeText.text, - PgJson.text); - PgType bpcharArray = bpchar.array(PgRead.readStringArray, String[]::new); - PgType textArray = text.array(PgRead.readStringArray, String[]::new); - PgType uuid = - PgType.of( - "uuid", - PgRead.readUUID, - PgWrite.writeUUID, - PgText.textUuid, - PgCompositeText.uuid, - PgJson.uuid); - PgType uuidArray = uuid.array(PgRead.massageJdbcArrayTo(UUID[].class), UUID[]::new); - PgType xid = ofPgObject("xid", Xid::new, Xid::value, PgJson.xid); - PgType xidArray = xid.array(PgRead.pgObjectArray(Xid::new, Xid.class), Xid[]::new); - PgType xml = - PgType.of( - "xml", - PgRead.readString, - PgWrite.pgObject("xml"), - PgText.textString, - PgCompositeText.text, - PgJson.text) - .bimap(Xml::new, Xml::value); - PgType xmlArray = xml.array(PgRead.pgObjectArray(Xml::new, Xml.class), Xml[]::new); - PgType vector = - PgType.of( - "vector", - PgRead.readString, - PgWrite.pgObject("vector"), - PgText.textString, - PgCompositeText.text, - PgJson.text) - .bimap(Vector::new, Vector::value); - PgType vectorArray = - vector.array(PgRead.pgObjectArray(Vector::new, Vector.class), Vector[]::new); - PgType unknown = - PgType.of( - "unknown", - PgRead.readString, - PgWrite.pgObject("unknown"), - PgText.textString, - PgCompositeText.text, - PgJson.text) - .bimap(Unknown::new, Unknown::value); - PgType unknownArray = - unknown.array(PgRead.pgObjectArray(Unknown::new, Unknown.class), Unknown[]::new); - PgType bytea = - PgType.of( - "bytea", - PgRead.readByteArray, - PgWrite.writeByteArray, - 
PgText.textByteArray, - PgCompositeText.bytea, - PgJson.bytea); - - // Range types - discrete types (int, date) are normalized to canonical [) form via Range factory - // methods - PgType> int4range = - rangeType("int4range", RangeParser.INT4_PARSER, Range.INT4, PgJson.int4range); - PgType[]> int4rangeArray = - int4range.array(rangeArrayRead(RangeParser.INT4_PARSER, Range.INT4), rangeArrayFactory()); - PgType> int8range = - rangeType("int8range", RangeParser.INT8_PARSER, Range.INT8, PgJson.int8range); - PgType[]> int8rangeArray = - int8range.array(rangeArrayRead(RangeParser.INT8_PARSER, Range.INT8), rangeArrayFactory()); - PgType> numrange = - rangeType("numrange", RangeParser.NUMERIC_PARSER, Range.NUMERIC, PgJson.numrange); - PgType[]> numrangeArray = - numrange.array( - rangeArrayRead(RangeParser.NUMERIC_PARSER, Range.NUMERIC), rangeArrayFactory()); - PgType> daterange = - rangeType("daterange", RangeParser.DATE_PARSER, Range.DATE, PgJson.daterange); - PgType[]> daterangeArray = - daterange.array(rangeArrayRead(RangeParser.DATE_PARSER, Range.DATE), rangeArrayFactory()); - PgType> tsrange = - rangeType("tsrange", RangeParser.TIMESTAMP_PARSER, Range.TIMESTAMP, PgJson.tsrange); - PgType[]> tsrangeArray = - tsrange.array( - rangeArrayRead(RangeParser.TIMESTAMP_PARSER, Range.TIMESTAMP), rangeArrayFactory()); - PgType> tstzrange = - rangeType("tstzrange", RangeParser.TIMESTAMPTZ_PARSER, Range.TIMESTAMPTZ, PgJson.tstzrange); - PgType[]> tstzrangeArray = - tstzrange.array( - rangeArrayRead(RangeParser.TIMESTAMPTZ_PARSER, Range.TIMESTAMPTZ), rangeArrayFactory()); - - static > PgType ofEnum(String sqlType, Function fromString) { - return PgType.of( - sqlType, - PgRead.readString.map(fromString::apply), - PgWrite.writeString.contramap(Enum::name), - PgText.textString.contramap(Enum::name), - PgCompositeText.text.bimap(fromString::apply, Enum::name), - PgJson.text.bimap(fromString::apply, Enum::name)); - } - - static PgType ofPgObject( - String sqlType, - SqlFunction 
constructor, - Function extractor, - PgJson json) { - return PgType.of( - sqlType, - PgRead.pgObject(sqlType).map(constructor), - PgWrite.pgObject(sqlType).contramap(extractor), - PgText.textString.contramap(extractor), - PgCompositeText.text.bimap( - s -> { - try { - return constructor.apply(s); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - }, - extractor), - json); - } - - // Default record type for generic composite/record columns - PgType record = ofPgObject("record", Record::new, Record::value, PgJson.record); - PgType recordArray = - record.array(PgRead.pgObjectArray(Record::new, Record.class), Record[]::new); - - static PgType record(String sqlType) { - return ofPgObject(sqlType, Record::new, Record::value, PgJson.record); - } - - static PgType recordArray(String sqlType) { - return record(sqlType).array(PgRead.pgObjectArray(Record::new, Record.class), Record[]::new); - } - - static PgType pgObject(String sqlType, Class clazz, PgJson json) { - return PgType.of( - sqlType, - PgRead.castJdbcObjectTo(clazz), - PgWrite.passObjectToJdbc(), - PgText.textPGobject(), - PgCompositeText.notSupported(), - json); - } - - static PgType bpchar(int precision) { - return PgType.of( - PgTypename.of("bpchar", precision), - PgRead.readString, - PgWrite.writeString, - PgText.textString, - PgCompositeText.text, - PgJson.text); - } - - static PgType bpcharArray(int n) { - return bpchar(n).array(PgRead.readStringArray, String[]::new); - } - - // Range type helpers - static > PgType> rangeType( - String sqlType, - SqlFunction valueParser, - java.util.function.BiFunction, RangeBound, Range> rangeFactory, - PgJson> json) { - return PgType.of( - sqlType, - PgRead.pgObject(sqlType).map(str -> RangeParser.parse(str, valueParser, rangeFactory)), - PgWrite.pgObject(sqlType).contramap(RangeParser::format), - PgText.textString.contramap(RangeParser::format), - PgCompositeText.of( - RangeParser::format, - str -> { - try { - return RangeParser.parse(str, 
valueParser, rangeFactory); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - }), - json); - } - - @SuppressWarnings("unchecked") - static > PgRead[]> rangeArrayRead( - SqlFunction valueParser, - java.util.function.BiFunction, RangeBound, Range> rangeFactory) { - return PgRead.readPgArray.map( - sqlArray -> { - Object[] objects = (Object[]) sqlArray.getArray(); - Range[] result = - (Range[]) java.lang.reflect.Array.newInstance(Range.class, objects.length); - for (int i = 0; i < objects.length; i++) { - var pgObj = (org.postgresql.util.PGobject) objects[i]; - result[i] = RangeParser.parse(pgObj.getValue(), valueParser, rangeFactory); - } - return result; - }); - } - - @SuppressWarnings("unchecked") - static > - java.util.function.IntFunction[]> rangeArrayFactory() { - return n -> (Range[]) java.lang.reflect.Array.newInstance(Range.class, n); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/PgWrite.java b/foundations-jdbc/src/java/dev/typr/foundations/PgWrite.java deleted file mode 100644 index 9ddfd00491..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/PgWrite.java +++ /dev/null @@ -1,101 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.internal.arrayMap; -import java.math.BigDecimal; -import java.sql.PreparedStatement; -import java.sql.SQLException; -import java.util.Optional; -import java.util.UUID; -import java.util.function.Function; -import org.postgresql.util.PGobject; - -public sealed interface PgWrite extends DbWrite permits PgWrite.Instance { - void set(PreparedStatement ps, int idx, A a) throws SQLException; - - // combinators - PgWrite> opt(PgTypename typename); - - PgWrite array(PgTypename typename); - - PgWrite contramap(Function f); - - @FunctionalInterface - interface RawWriter { - void set(PreparedStatement ps, int index, A a) throws SQLException; - } - - record Instance(RawWriter rawWriter, Function f) implements PgWrite { - @Override - public void 
set(PreparedStatement ps, int index, A a) throws SQLException { - rawWriter.set(ps, index, f.apply(a)); - } - - @Override - public PgWrite> opt(PgTypename typename) { - return new Instance<>( - (ps, index, u) -> { - if (u == null) ps.setNull(index, 0, typename.sqlTypeNoPrecision()); - else set(ps, index, u); - }, - a -> a.orElse(null)); - } - - @SuppressWarnings("unchecked") - @Override - public PgWrite array(PgTypename typename) { - return new Instance( - (ps, index, us) -> - ps.setArray( - index, ps.getConnection().createArrayOf(typename.sqlTypeNoPrecision(), us)), - as -> arrayMap.map(as, f, (Class) Object.class)); - } - - @Override - public PgWrite contramap(Function f) { - return new Instance<>(rawWriter, f.andThen(this.f)); - } - } - - static PgWrite primitive(RawWriter rawWriter) { - return new Instance<>(rawWriter, Function.identity()); - } - - static PgWrite passObjectToJdbc() { - return primitive(PreparedStatement::setObject); - } - - static PgWrite pgObject(String sqlType) { - return PgWrite.passObjectToJdbc() - .contramap( - str -> { - var obj = new PGobject(); - obj.setType(sqlType); - try { - obj.setValue(str); - } catch (SQLException e) { - throw new RuntimeException(e); - } - return obj; - }); - } - - PgWrite writeByteArray = primitive(PreparedStatement::setObject); - - // Unboxed (primitive) array writers - PgWrite writeBooleanArrayUnboxed = primitive(PreparedStatement::setObject); - PgWrite writeShortArrayUnboxed = primitive(PreparedStatement::setObject); - PgWrite writeIntArrayUnboxed = primitive(PreparedStatement::setObject); - PgWrite writeLongArrayUnboxed = primitive(PreparedStatement::setObject); - PgWrite writeFloatArrayUnboxed = primitive(PreparedStatement::setObject); - PgWrite writeDoubleArrayUnboxed = primitive(PreparedStatement::setObject); - - PgWrite writeBoolean = primitive(PreparedStatement::setBoolean); - PgWrite writeBigDecimal = primitive(PreparedStatement::setBigDecimal); - PgWrite writeDouble = 
primitive(PreparedStatement::setDouble); - PgWrite writeFloat = primitive(PreparedStatement::setFloat); - PgWrite writeInteger = primitive(PreparedStatement::setInt); - PgWrite writeLong = primitive(PreparedStatement::setLong); - PgWrite writeShort = primitive(PreparedStatement::setShort); - PgWrite writeString = primitive(PreparedStatement::setString); - PgWrite writeUUID = primitive(PreparedStatement::setObject); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/ResultSetParser.java b/foundations-jdbc/src/java/dev/typr/foundations/ResultSetParser.java deleted file mode 100644 index 24dd0a60af..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/ResultSetParser.java +++ /dev/null @@ -1,79 +0,0 @@ -package dev.typr.foundations; - -import java.sql.ResultSet; -import java.sql.SQLException; -import java.util.ArrayList; -import java.util.List; -import java.util.Optional; -import java.util.function.Consumer; - -public interface ResultSetParser { - Out apply(ResultSet resultSet) throws SQLException; - - record All(RowParser rowParser) implements ResultSetParser> { - @Override - public List apply(ResultSet resultSet) throws SQLException { - var rowNum = 0; - ArrayList rows = new ArrayList<>(); - while (resultSet.next()) { - rows.add(rowParser.readRow(resultSet, rowNum)); - rowNum += 1; - } - return rows; - } - } - - record Foreach(RowParser rowParser, Consumer consumer) - implements ResultSetParser { - @Override - public Void apply(ResultSet resultSet) throws SQLException { - var rowNum = 0; - while (resultSet.next()) { - consumer.accept(rowParser.readRow(resultSet, rowNum)); - rowNum += 1; - } - return null; - } - } - - record First(RowParser rowParser) implements ResultSetParser> { - @Override - public Optional apply(ResultSet resultSet) throws SQLException { - if (resultSet.next()) { - return Optional.of(rowParser.readRow(resultSet, 0)); - } else { - return Optional.empty(); - } - } - } - - record MaxOne(RowParser rowParser) implements 
ResultSetParser> { - @Override - public Optional apply(ResultSet resultSet) throws SQLException { - if (resultSet.next()) { - Out result = rowParser.readRow(resultSet, 0); - if (resultSet.next()) { - throw new SQLException("Expected single row, but found more"); - } - return Optional.of(result); - } else { - return Optional.empty(); - } - } - } - - record ExactlyOne(RowParser rowParser) implements ResultSetParser { - @Override - public Out apply(ResultSet resultSet) throws SQLException { - if (resultSet.next()) { - Out result = rowParser.readRow(resultSet, 0); - if (resultSet.next()) { - throw new SQLException("Expected single row, but found more"); - } - return result; - } else { - throw new SQLException("No rows when expecting a single one"); - } - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/RowParser.java b/foundations-jdbc/src/java/dev/typr/foundations/RowParser.java deleted file mode 100644 index e5ab95c7ef..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/RowParser.java +++ /dev/null @@ -1,263 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.JsonValue; -import dev.typr.foundations.dsl.Bijection; -import java.sql.PreparedStatement; -import java.sql.ResultSet; -import java.sql.SQLException; -import java.sql.SQLFeatureNotSupportedException; -import java.util.ArrayList; -import java.util.List; -import java.util.Optional; -import java.util.function.Consumer; -import java.util.function.Function; - -public record RowParser( - List> columns, Function decode, Function encode) - implements RowParsers { - public Row readRow(ResultSet rs, int rowNum) throws SqlResultParseException { - Object[] currentRow = new Object[columns.size()]; - for (int colNum = 0; colNum < columns.size(); colNum++) { - DbType dbType = columns.get(colNum); - try { - currentRow[colNum] = dbType.read().read(rs, colNum + 1); - } catch (Exception e) { - throw new SqlResultParseException(rowNum, colNum, dbType, e); - } - } - return 
this.decode().apply(currentRow); - } - - // Convenience method for compatibility with SelectBuilderSql - public Row parse(ResultSet rs) throws SqlResultParseException { - try { - // Try to get row number for error reporting, but fall back to -1 if not supported (e.g., - // DuckDB) - int rowNum = -1; - try { - rowNum = rs.getRow(); - } catch (SQLFeatureNotSupportedException ignored) { - // Some databases (like DuckDB) don't support getRow() - } - return readRow(rs, rowNum); - } catch (SQLException e) { - throw new SqlResultParseException(0, 0, null, e); - } - } - - @SuppressWarnings("unchecked") - public void writeRow(PreparedStatement stmt, Row row) throws SQLException { - Object[] values = this.encode().apply(row); - for (int colNum = 0; colNum < columns.size(); colNum++) { - DbType dbType = (DbType) columns.get(colNum); - dbType.write().set(stmt, colNum + 1, values[colNum]); - } - } - - public static class SqlResultParseException extends SQLException { - public SqlResultParseException(int row, int column, DbType tpe, Exception cause) { - super( - "Error reading or parsing row " - + row - + ", (1-indexed) column " - + column - + " from ResultSet." - + (tpe != null ? " Expected database type " + tpe.typename().sqlType() : ""), - cause); - } - } - - /** Returns first row (if any), ignores the rest */ - public ResultSetParser> first() { - return new ResultSetParser.First<>(this); - } - - /** Returns at most one row, fails if there are more */ - public ResultSetParser> maxOne() { - return new ResultSetParser.MaxOne<>(this); - } - - /** Returns exactly one row, fails if there are more or less */ - public ResultSetParser exactlyOne() { - return new ResultSetParser.ExactlyOne<>(this); - } - - public ResultSetParser> all() { - return new ResultSetParser.All<>(this); - } - - public ResultSetParser foreach(Consumer consumer) { - return new ResultSetParser.Foreach<>(this, consumer); - } - - /** - * if all values are `null` / `Optional.empty()` then return empty row. 
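[Editor's note] The `opt()` combinator being deleted here implements LEFT JOIN absorption: when every column from the joined side comes back null, the whole row collapses to `Optional.empty()` rather than a record full of nulls. A minimal sketch of that all-null check (class name and `Object[]`-row shape are ours):

```java
import java.util.Arrays;
import java.util.Optional;

// Simplified model of RowParser.opt(): an all-null row (the LEFT JOIN
// "no match" case) becomes Optional.empty(); any non-null value keeps the row.
public class OptRow {
  static Optional<Object[]> optRow(Object[] values) {
    boolean allNull = Arrays.stream(values).allMatch(v -> v == null);
    return allNull ? Optional.empty() : Optional.of(values);
  }
}
```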
This is used for left - * joins where all columns from the joined table can be null. - */ - public RowParser> opt() { - List> optColumns = new ArrayList<>(columns.size()); - for (int i = 0; i < columns.size(); i++) { - optColumns.add(columns.get(i).opt()); - } - - Function> decode = - values -> { - var allNull = true; - for (int i = 0; i < values.length && allNull; i++) { - switch (values[i]) { - case null -> {} - case Optional optional -> allNull = optional.isEmpty(); - default -> allNull = false; - } - } - if (allNull) { - return Optional.empty(); - } - // Unwrap the Optional wrapper we added - Object[] unwrapped = new Object[values.length]; - for (int i = 0; i < values.length; i++) { - if (values[i] instanceof Optional opt) { - unwrapped[i] = opt.orElse(null); - } else { - unwrapped[i] = values[i]; - } - } - var row = this.decode.apply(unwrapped); - return Optional.of(row); - }; - Function, Object[]> encode = - row -> { - if (row.isEmpty()) { - var none = Optional.empty(); - Object[] ret = new Object[columns.size()]; - for (int i = 0; i < columns.size(); i++) { - ret[i] = none; - } - return ret; - } - return this.encode.apply(row.get()); - }; - - return new RowParser<>(optColumns, decode, encode); - } - - public RowParser> joined(RowParser right) { - var allColumns = new ArrayList<>(columns); - allColumns.addAll(right.columns); - var left = this; - Function> decode = - allValues -> { - Object[] leftValues = new Object[left.columns.size()]; - System.arraycopy(allValues, 0, leftValues, 0, leftValues.length); - Object[] rightValues = new Object[right.columns.size()]; - System.arraycopy(allValues, leftValues.length, rightValues, 0, right.columns.size()); - return new And<>(left.decode.apply(leftValues), right.decode.apply(rightValues)); - }; - Function, Object[]> encode = - and -> { - Object[] leftValues = left.encode.apply(and.left()); - Object[] rightValues = right.encode.apply(and.right()); - Object[] allValues = new Object[leftValues.length + 
rightValues.length]; - System.arraycopy(leftValues, 0, allValues, 0, leftValues.length); - System.arraycopy(rightValues, 0, allValues, leftValues.length, rightValues.length); - return allValues; - }; - return new RowParser<>(allColumns, decode, encode); - } - - public RowParser>> leftJoined(RowParser other) { - return joined(other.opt()); - } - - public RowParser, Row2>> rightJoined(RowParser other) { - return opt().joined(other); - } - - public RowParser, Optional>> fullJoined(RowParser other) { - return opt().joined(other.opt()); - } - - /** - * Transform the row type using a bijection. This is useful for language wrappers that need to - * convert between Java and language-native types. - */ - public RowParser to(Bijection bijection) { - Function newDecode = values -> bijection.underlying(this.decode.apply(values)); - Function newEncode = row2 -> this.encode.apply(bijection.from(row2)); - return new RowParser<>(this.columns, newDecode, newEncode); - } - - /** - * Parse a list of rows from a JSON array. This is used for typed MULTISET support where the - * database returns JSON. - * - *
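[Editor's note] The `joined` combinator above works by index arithmetic: decode splits the combined value array at the left parser's column count, encode concatenates the two halves back. A self-contained sketch of that `System.arraycopy` logic (class and method names are editorial):

```java
// Simplified model of how RowParser.joined splits a combined value array
// into left/right halves on decode, and concatenates them on encode.
public class RowSplit {
  static Object[] concat(Object[] left, Object[] right) {
    Object[] all = new Object[left.length + right.length];
    System.arraycopy(left, 0, all, 0, left.length);
    System.arraycopy(right, 0, all, left.length, right.length);
    return all;
  }

  /** Split `all` at `leftLen`, returning {left, right}. */
  static Object[][] split(Object[] all, int leftLen) {
    Object[] left = new Object[leftLen];
    System.arraycopy(all, 0, left, 0, leftLen);
    Object[] right = new Object[all.length - leftLen];
    System.arraycopy(all, leftLen, right, 0, right.length);
    return new Object[][] {left, right};
  }
}
```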

The JSON array format can be: - * - * <ul> - *   <li>Array of objects: [{"col1": val1, "col2": val2}, ...] - *   <li>Compact array of arrays: [[val1, val2], [val3, val4], ...] - * </ul>
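[Editor's note] Those two row encodings — object keyed by column name, or array in column order — can be modeled without the library's `JsonValue`, using plain collections. A hedged sketch of the normalization `parseJsonRow` performs (names are ours; absent object keys become null, as in the deleted code):

```java
import java.util.List;
import java.util.Map;

// Simplified model of the two JSON row shapes parseJsonArray accepts,
// normalized into positional values for a known column order.
public class JsonRowShapes {
  static Object[] toPositional(Object row, List<String> columnNames) {
    Object[] out = new Object[columnNames.size()];
    if (row instanceof List<?> values) {
      // Compact array format: values already in column order.
      if (values.size() != out.length)
        throw new IllegalArgumentException("JSON array size doesn't match column count");
      for (int i = 0; i < out.length; i++) out[i] = values.get(i);
    } else if (row instanceof Map<?, ?> fields) {
      // Object format: look up by column name; missing keys yield null.
      for (int i = 0; i < out.length; i++) out[i] = fields.get(columnNames.get(i));
    } else {
      throw new IllegalArgumentException("Expected object or array for row");
    }
    return out;
  }
}
```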
- * - * @param jsonStr JSON string from database - * @param columnNames names of columns in order (for object format lookup) - * @return list of parsed rows - */ - @SuppressWarnings("unchecked") - public List parseJsonArray(String jsonStr, List columnNames) { - if (jsonStr == null || jsonStr.isEmpty()) { - return List.of(); - } - - JsonValue json = JsonValue.parse(jsonStr); - if (!(json instanceof JsonValue.JArray(List values))) { - throw new IllegalArgumentException( - "Expected JSON array, got: " + json.getClass().getSimpleName()); - } - - List result = new ArrayList<>(values.size()); - for (JsonValue elem : values) { - Row row = parseJsonRow(elem, columnNames); - result.add(row); - } - return result; - } - - /** - * Parse a single row from a JSON value. Supports both object format {"col": val} and array format - * [val1, val2]. - */ - @SuppressWarnings("unchecked") - private Row parseJsonRow(JsonValue json, List columnNames) { - Object[] values = new Object[columns.size()]; - - if (json instanceof JsonValue.JArray(List values1)) { - // Compact array format: values in column order - if (values1.size() != columns.size()) { - throw new IllegalArgumentException( - "JSON array size " + values1.size() + " doesn't match column count " + columns.size()); - } - for (int i = 0; i < columns.size(); i++) { - DbJson jsonCodec = (DbJson) columns.get(i).json(); - values[i] = jsonCodec.fromJson(values1.get(i)); - } - } else if (json instanceof JsonValue.JObject(java.util.Map fields)) { - // Object format: lookup by column name - for (int i = 0; i < columns.size(); i++) { - String colName = columnNames.get(i); - JsonValue colValue = fields.get(colName); - if (colValue == null) { - // Column not present in JSON - use null - values[i] = null; - } else { - DbJson jsonCodec = (DbJson) columns.get(i).json(); - values[i] = jsonCodec.fromJson(colValue); - } - } - } else { - throw new IllegalArgumentException( - "Expected JSON object or array for row, got: " + 
json.getClass().getSimpleName()); - } - - return decode.apply(values); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SingleValueResultSetWrapper.java b/foundations-jdbc/src/java/dev/typr/foundations/SingleValueResultSetWrapper.java deleted file mode 100644 index 8a2163897e..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/SingleValueResultSetWrapper.java +++ /dev/null @@ -1,1014 +0,0 @@ -package dev.typr.foundations; - -import java.sql.*; - -/** - * Minimal ResultSet wrapper that exposes a single value as a ResultSet. This is used internally to - * convert raw STRUCT attribute values through their type readers. - */ -class SingleValueResultSetWrapper implements ResultSet, AutoCloseable { - private final Object value; - - SingleValueResultSetWrapper(Object value) { - this.value = value; - } - - @Override - public Object getObject(int columnIndex) throws SQLException { - if (columnIndex == 1) { - return value; - } - throw new SQLException("Invalid column index: " + columnIndex); - } - - @Override - public Object getObject(int columnIndex, java.util.Map> map) - throws SQLException { - return getObject(columnIndex); - } - - @Override - public T getObject(int columnIndex, Class type) throws SQLException { - return type.cast(getObject(columnIndex)); - } - - @Override - public String getString(int columnIndex) throws SQLException { - if (columnIndex == 1 && value instanceof String s) { - return s; - } - throw new UnsupportedOperationException(); - } - - @Override - public boolean getBoolean(int columnIndex) throws SQLException { - if (columnIndex == 1 && value instanceof Boolean b) { - return b; - } - throw new UnsupportedOperationException(); - } - - @Override - public byte getByte(int columnIndex) throws SQLException { - if (columnIndex == 1 && value instanceof Number n) { - return n.byteValue(); - } - throw new UnsupportedOperationException(); - } - - @Override - public short getShort(int columnIndex) throws SQLException { - if 
(columnIndex == 1 && value instanceof Number n) { - return n.shortValue(); - } - throw new UnsupportedOperationException(); - } - - @Override - public int getInt(int columnIndex) throws SQLException { - if (columnIndex == 1 && value instanceof Number n) { - return n.intValue(); - } - throw new UnsupportedOperationException(); - } - - @Override - public long getLong(int columnIndex) throws SQLException { - if (columnIndex == 1 && value instanceof Number n) { - return n.longValue(); - } - throw new UnsupportedOperationException(); - } - - @Override - public float getFloat(int columnIndex) throws SQLException { - if (columnIndex == 1 && value instanceof Number n) { - return n.floatValue(); - } - throw new UnsupportedOperationException(); - } - - @Override - public double getDouble(int columnIndex) throws SQLException { - if (columnIndex == 1 && value instanceof Number n) { - return n.doubleValue(); - } - throw new UnsupportedOperationException(); - } - - @Override - public java.math.BigDecimal getBigDecimal(int columnIndex) throws SQLException { - if (columnIndex == 1 && value instanceof java.math.BigDecimal bd) { - return bd; - } - throw new UnsupportedOperationException(); - } - - @Override - public byte[] getBytes(int columnIndex) throws SQLException { - if (columnIndex == 1 && value instanceof byte[] bytes) { - return bytes; - } - throw new UnsupportedOperationException(); - } - - @Override - public Date getDate(int columnIndex) throws SQLException { - if (columnIndex == 1 && value instanceof Date d) { - return d; - } - throw new UnsupportedOperationException(); - } - - @Override - public Time getTime(int columnIndex) throws SQLException { - if (columnIndex == 1 && value instanceof Time t) { - return t; - } - throw new UnsupportedOperationException(); - } - - @Override - public Timestamp getTimestamp(int columnIndex) throws SQLException { - if (columnIndex == 1 && value instanceof Timestamp ts) { - return ts; - } - throw new UnsupportedOperationException(); - } - 
- @Override - public boolean wasNull() throws SQLException { - return value == null; - } - - @Override - public void close() throws SQLException { - // Nothing to close - } - - // All other ResultSet methods throw UnsupportedOperationException - @Override - public java.io.InputStream getAsciiStream(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.InputStream getUnicodeStream(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.InputStream getBinaryStream(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public String getString(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean getBoolean(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public byte getByte(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public short getShort(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public int getInt(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public long getLong(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public float getFloat(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public double getDouble(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.math.BigDecimal getBigDecimal(String columnLabel, int scale) { - throw new UnsupportedOperationException(); - } - - @Override - public byte[] getBytes(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Date getDate(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Time getTime(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Timestamp 
getTimestamp(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.InputStream getAsciiStream(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.InputStream getUnicodeStream(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.InputStream getBinaryStream(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public SQLWarning getWarnings() { - throw new UnsupportedOperationException(); - } - - @Override - public void clearWarnings() { - throw new UnsupportedOperationException(); - } - - @Override - public String getCursorName() { - throw new UnsupportedOperationException(); - } - - @Override - public ResultSetMetaData getMetaData() { - throw new UnsupportedOperationException(); - } - - @Override - public Object getObject(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public int findColumn(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.Reader getCharacterStream(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.Reader getCharacterStream(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.math.BigDecimal getBigDecimal(int columnIndex, int scale) { - throw new UnsupportedOperationException(); - } - - @Override - public java.math.BigDecimal getBigDecimal(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean isBeforeFirst() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean isAfterLast() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean isFirst() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean isLast() { - throw new UnsupportedOperationException(); - } - - 
@Override - public void beforeFirst() { - throw new UnsupportedOperationException(); - } - - @Override - public void afterLast() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean first() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean last() { - throw new UnsupportedOperationException(); - } - - @Override - public int getRow() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean absolute(int row) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean relative(int rows) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean previous() { - throw new UnsupportedOperationException(); - } - - @Override - public void setFetchDirection(int direction) { - throw new UnsupportedOperationException(); - } - - @Override - public int getFetchDirection() { - throw new UnsupportedOperationException(); - } - - @Override - public void setFetchSize(int rows) { - throw new UnsupportedOperationException(); - } - - @Override - public int getFetchSize() { - throw new UnsupportedOperationException(); - } - - @Override - public int getType() { - throw new UnsupportedOperationException(); - } - - @Override - public int getConcurrency() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean rowUpdated() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean rowInserted() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean rowDeleted() { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNull(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBoolean(int columnIndex, boolean x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateByte(int columnIndex, byte x) { - throw new UnsupportedOperationException(); - } - - @Override - public void 
updateShort(int columnIndex, short x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateInt(int columnIndex, int x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateLong(int columnIndex, long x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateFloat(int columnIndex, float x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateDouble(int columnIndex, double x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBigDecimal(int columnIndex, java.math.BigDecimal x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateString(int columnIndex, String x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBytes(int columnIndex, byte[] x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateDate(int columnIndex, Date x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateTime(int columnIndex, Time x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateTimestamp(int columnIndex, Timestamp x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateAsciiStream(int columnIndex, java.io.InputStream x, int length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBinaryStream(int columnIndex, java.io.InputStream x, int length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateCharacterStream(int columnIndex, java.io.Reader x, int length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateObject(int columnIndex, Object x, int scaleOrLength) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateObject(int columnIndex, Object x) { - throw new UnsupportedOperationException(); - } - - @Override 
- public void updateNull(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBoolean(String columnLabel, boolean x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateByte(String columnLabel, byte x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateShort(String columnLabel, short x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateInt(String columnLabel, int x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateLong(String columnLabel, long x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateFloat(String columnLabel, float x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateDouble(String columnLabel, double x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBigDecimal(String columnLabel, java.math.BigDecimal x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateString(String columnLabel, String x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBytes(String columnLabel, byte[] x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateDate(String columnLabel, Date x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateTime(String columnLabel, Time x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateTimestamp(String columnLabel, Timestamp x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateAsciiStream(String columnLabel, java.io.InputStream x, int length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBinaryStream(String columnLabel, java.io.InputStream x, int length) { - throw new UnsupportedOperationException(); - } - - @Override 
- public void updateCharacterStream(String columnLabel, java.io.Reader reader, int length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateObject(String columnLabel, Object x, int scaleOrLength) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateObject(String columnLabel, Object x) { - throw new UnsupportedOperationException(); - } - - @Override - public void insertRow() { - throw new UnsupportedOperationException(); - } - - @Override - public void updateRow() { - throw new UnsupportedOperationException(); - } - - @Override - public void deleteRow() { - throw new UnsupportedOperationException(); - } - - @Override - public void refreshRow() { - throw new UnsupportedOperationException(); - } - - @Override - public void cancelRowUpdates() { - throw new UnsupportedOperationException(); - } - - @Override - public void moveToInsertRow() { - throw new UnsupportedOperationException(); - } - - @Override - public void moveToCurrentRow() { - throw new UnsupportedOperationException(); - } - - @Override - public Statement getStatement() { - throw new UnsupportedOperationException(); - } - - @Override - public Ref getRef(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public Blob getBlob(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public Clob getClob(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public Array getArray(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public Object getObject(String columnLabel, java.util.Map> map) { - throw new UnsupportedOperationException(); - } - - @Override - public Ref getRef(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Blob getBlob(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Clob getClob(String columnLabel) { - throw new 
UnsupportedOperationException(); - } - - @Override - public Array getArray(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Date getDate(int columnIndex, java.util.Calendar cal) { - throw new UnsupportedOperationException(); - } - - @Override - public Date getDate(String columnLabel, java.util.Calendar cal) { - throw new UnsupportedOperationException(); - } - - @Override - public Time getTime(int columnIndex, java.util.Calendar cal) { - throw new UnsupportedOperationException(); - } - - @Override - public Time getTime(String columnLabel, java.util.Calendar cal) { - throw new UnsupportedOperationException(); - } - - @Override - public Timestamp getTimestamp(int columnIndex, java.util.Calendar cal) { - throw new UnsupportedOperationException(); - } - - @Override - public Timestamp getTimestamp(String columnLabel, java.util.Calendar cal) { - throw new UnsupportedOperationException(); - } - - @Override - public java.net.URL getURL(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.net.URL getURL(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateRef(int columnIndex, Ref x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateRef(String columnLabel, Ref x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBlob(int columnIndex, Blob x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBlob(String columnLabel, Blob x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateClob(int columnIndex, Clob x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateClob(String columnLabel, Clob x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateArray(int columnIndex, Array x) { - throw new UnsupportedOperationException(); - } - - @Override - 
public void updateArray(String columnLabel, Array x) { - throw new UnsupportedOperationException(); - } - - @Override - public RowId getRowId(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public RowId getRowId(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateRowId(int columnIndex, RowId x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateRowId(String columnLabel, RowId x) { - throw new UnsupportedOperationException(); - } - - @Override - public int getHoldability() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean isClosed() { - return false; - } - - @Override - public void updateNString(int columnIndex, String nString) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNString(String columnLabel, String nString) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNClob(int columnIndex, NClob nClob) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNClob(String columnLabel, NClob nClob) { - throw new UnsupportedOperationException(); - } - - @Override - public NClob getNClob(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public NClob getNClob(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public SQLXML getSQLXML(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public SQLXML getSQLXML(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateSQLXML(int columnIndex, SQLXML xmlObject) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateSQLXML(String columnLabel, SQLXML xmlObject) { - throw new UnsupportedOperationException(); - } - - @Override - public String getNString(int columnIndex) { - throw new 
UnsupportedOperationException(); - } - - @Override - public String getNString(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.Reader getNCharacterStream(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.Reader getNCharacterStream(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNCharacterStream(int columnIndex, java.io.Reader x, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNCharacterStream(String columnLabel, java.io.Reader reader, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateAsciiStream(int columnIndex, java.io.InputStream x, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBinaryStream(int columnIndex, java.io.InputStream x, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateCharacterStream(int columnIndex, java.io.Reader x, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateAsciiStream(String columnLabel, java.io.InputStream x, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBinaryStream(String columnLabel, java.io.InputStream x, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateCharacterStream(String columnLabel, java.io.Reader reader, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBlob(int columnIndex, java.io.InputStream inputStream, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBlob(String columnLabel, java.io.InputStream inputStream, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateClob(int columnIndex, 
java.io.Reader reader, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateClob(String columnLabel, java.io.Reader reader, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNClob(int columnIndex, java.io.Reader reader, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNClob(String columnLabel, java.io.Reader reader, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNCharacterStream(int columnIndex, java.io.Reader x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNCharacterStream(String columnLabel, java.io.Reader reader) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateAsciiStream(int columnIndex, java.io.InputStream x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBinaryStream(int columnIndex, java.io.InputStream x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateCharacterStream(int columnIndex, java.io.Reader x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateAsciiStream(String columnLabel, java.io.InputStream x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBinaryStream(String columnLabel, java.io.InputStream x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateCharacterStream(String columnLabel, java.io.Reader reader) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBlob(int columnIndex, java.io.InputStream inputStream) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBlob(String columnLabel, java.io.InputStream inputStream) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateClob(int columnIndex, java.io.Reader 
reader) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateClob(String columnLabel, java.io.Reader reader) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNClob(int columnIndex, java.io.Reader reader) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNClob(String columnLabel, java.io.Reader reader) { - throw new UnsupportedOperationException(); - } - - @Override - public <T> T getObject(String columnLabel, Class<T> type) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean next() { - throw new UnsupportedOperationException(); - } - - @Override - public <T> T unwrap(Class<T> iface) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean isWrapperFor(Class<?> iface) { - return false; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SqlBiConsumer.java b/foundations-jdbc/src/java/dev/typr/foundations/SqlBiConsumer.java deleted file mode 100644 index 0089878c8a..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/SqlBiConsumer.java +++ /dev/null @@ -1,7 +0,0 @@ -package dev.typr.foundations; - -import java.sql.SQLException; - -public interface SqlBiConsumer<T1, T2> { - void apply(T1 t1, T2 t2) throws SQLException; -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SqlBiFunction.java b/foundations-jdbc/src/java/dev/typr/foundations/SqlBiFunction.java deleted file mode 100644 index 65eb27ad45..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/SqlBiFunction.java +++ /dev/null @@ -1,8 +0,0 @@ -package dev.typr.foundations; - -import java.sql.SQLException; - -@FunctionalInterface -public interface SqlBiFunction<T, U, R> { - R apply(T t, U u) throws SQLException; -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SqlConsumer.java b/foundations-jdbc/src/java/dev/typr/foundations/SqlConsumer.java deleted file mode 100644 index 76a47f4580..0000000000 ---
a/foundations-jdbc/src/java/dev/typr/foundations/SqlConsumer.java +++ /dev/null @@ -1,7 +0,0 @@ -package dev.typr.foundations; - -import java.sql.SQLException; - -public interface SqlConsumer<T> { - void apply(T t) throws SQLException; -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SqlFunction.java b/foundations-jdbc/src/java/dev/typr/foundations/SqlFunction.java deleted file mode 100644 index 1c78758eb9..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/SqlFunction.java +++ /dev/null @@ -1,8 +0,0 @@ -package dev.typr.foundations; - -import java.sql.SQLException; - -@FunctionalInterface -public interface SqlFunction<T, R> { - R apply(T t) throws SQLException; -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerJson.java b/foundations-jdbc/src/java/dev/typr/foundations/SqlServerJson.java deleted file mode 100644 index b5cdb32fd7..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerJson.java +++ /dev/null @@ -1,150 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.data.JsonValue; -import java.math.BigDecimal; -import java.util.Optional; -import java.util.UUID; -import java.util.function.Function; - -/** - * Encodes/decodes values to/from JSON for SQL Server. - * - * <p>

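The deleted SqlServerJson pairs an abstract toJson/fromJson with bimap and opt combinators. A minimal self-contained sketch of that shape (JsonValue reduced to a raw JSON string here; names are illustrative, not the real foundations API):

```java
import java.util.Optional;
import java.util.function.Function;

// Sketch of the SqlServerJson codec shape: a toJson/fromJson pair plus
// bimap and opt combinators. The real class targets
// dev.typr.foundations.data.JsonValue; a raw String stands in here.
abstract class JsonCodec<A> {
  abstract String toJson(A a);

  abstract A fromJson(String json);

  // Map both directions at once, as SqlServerJson.bimap does.
  <B> JsonCodec<B> bimap(Function<A, B> f, Function<B, A> g) {
    JsonCodec<A> self = this;
    return JsonCodec.of(b -> self.toJson(g.apply(b)), s -> f.apply(self.fromJson(s)));
  }

  // Lift into Optional, encoding empty as a JSON null literal.
  JsonCodec<Optional<A>> opt() {
    JsonCodec<A> self = this;
    return JsonCodec.of(
        a -> a.map(self::toJson).orElse("null"),
        s -> s.equals("null") ? Optional.empty() : Optional.of(self.fromJson(s)));
  }

  static <A> JsonCodec<A> of(Function<A, String> to, Function<String, A> from) {
    return new JsonCodec<>() {
      @Override String toJson(A a) { return to.apply(a); }

      @Override A fromJson(String s) { return from.apply(s); }
    };
  }

  static final JsonCodec<Integer> INT = of(i -> Integer.toString(i), Integer::parseInt);
}
```

The `instance` factory in the deleted file plays the same role as `of` here; the real version additionally wraps checked SQLException into RuntimeException.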
Similar to MariaJson - SQL Server supports JSON natively since 2016. - */ -public abstract class SqlServerJson implements DbJson { - public abstract JsonValue toJson(A a); - - public abstract A fromJson(JsonValue jsonValue); - - public SqlServerJson bimap(SqlFunction f, Function g) { - var self = this; - return SqlServerJson.instance(a -> self.toJson(g.apply(a)), jv -> f.apply(self.fromJson(jv))); - } - - public SqlServerJson map(SqlFunction f) { - var self = this; - return new SqlServerJson<>() { - @Override - public JsonValue toJson(B b) { - throw new UnsupportedOperationException("toJson not supported for mapped type"); - } - - @Override - public B fromJson(JsonValue jsonValue) { - try { - return f.apply(self.fromJson(jsonValue)); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - } - }; - } - - public SqlServerJson contramap(Function g) { - var self = this; - return new SqlServerJson<>() { - @Override - public JsonValue toJson(B b) { - return self.toJson(g.apply(b)); - } - - @Override - public B fromJson(JsonValue jsonValue) { - throw new UnsupportedOperationException("fromJson not supported for contramapped type"); - } - }; - } - - public SqlServerJson> opt() { - var self = this; - return instance( - a -> a.map(self::toJson).orElse(JsonValue.JNull.INSTANCE), - jv -> jv instanceof JsonValue.JNull ? 
Optional.empty() : Optional.of(self.fromJson(jv))); - } - - public static SqlServerJson instance( - Function toJson, SqlFunction fromJson) { - return new SqlServerJson<>() { - @Override - public JsonValue toJson(A a) { - return toJson.apply(a); - } - - @Override - public A fromJson(JsonValue jsonValue) { - try { - return fromJson.apply(jsonValue); - } catch (java.sql.SQLException e) { - throw new RuntimeException(e); - } - } - }; - } - - // Standard JSON codecs - public static final SqlServerJson text = - instance(s -> new JsonValue.JString(s), jv -> ((JsonValue.JString) jv).value()); - public static final SqlServerJson bool = - instance(JsonValue.JBool::of, jv -> ((JsonValue.JBool) jv).value()); - public static final SqlServerJson int2 = - instance( - s -> JsonValue.JNumber.of(s.intValue()), - jv -> Short.parseShort(((JsonValue.JNumber) jv).value())); - public static final SqlServerJson int4 = - instance( - i -> JsonValue.JNumber.of(i.longValue()), - jv -> Integer.parseInt(((JsonValue.JNumber) jv).value())); - public static final SqlServerJson int8 = - instance(JsonValue.JNumber::of, jv -> Long.parseLong(((JsonValue.JNumber) jv).value())); - public static final SqlServerJson float4 = - instance( - f -> JsonValue.JNumber.of(f.doubleValue()), - jv -> Float.parseFloat(((JsonValue.JNumber) jv).value())); - public static final SqlServerJson float8 = - instance(JsonValue.JNumber::of, jv -> Double.parseDouble(((JsonValue.JNumber) jv).value())); - public static final SqlServerJson numeric = - instance( - bd -> JsonValue.JNumber.of(bd.toString()), - jv -> new BigDecimal(((JsonValue.JNumber) jv).value())); - public static final SqlServerJson bytea = - instance( - bytes -> new JsonValue.JString(java.util.Base64.getEncoder().encodeToString(bytes)), - jv -> java.util.Base64.getDecoder().decode(((JsonValue.JString) jv).value())); - public static final SqlServerJson uuid = - instance( - u -> new JsonValue.JString(u.toString()), - jv -> UUID.fromString(((JsonValue.JString) 
jv).value())); - public static final SqlServerJson unknown = instance(obj -> (JsonValue) obj, jv -> jv); - - // Date/Time codecs (ISO-8601 strings) - public static final SqlServerJson date = - instance( - d -> new JsonValue.JString(d.toString()), - jv -> java.time.LocalDate.parse(((JsonValue.JString) jv).value())); - public static final SqlServerJson time = - instance( - t -> new JsonValue.JString(t.toString()), - jv -> java.time.LocalTime.parse(((JsonValue.JString) jv).value())); - public static final SqlServerJson timestamp = - instance( - ts -> new JsonValue.JString(ts.toString()), - jv -> java.time.LocalDateTime.parse(((JsonValue.JString) jv).value())); - public static final SqlServerJson timestamptz = - instance( - odt -> new JsonValue.JString(odt.toString()), - jv -> java.time.OffsetDateTime.parse(((JsonValue.JString) jv).value())); - - // Spatial types - serialize as WKT (Well-Known Text) - public static final SqlServerJson jsonGeography = - instance( - geo -> new JsonValue.JString(geo.toString()), - jv -> - com.microsoft.sqlserver.jdbc.Geography.STGeomFromText( - ((JsonValue.JString) jv).value(), 0)); - public static final SqlServerJson jsonGeometry = - instance( - geom -> new JsonValue.JString(geom.toString()), - jv -> - com.microsoft.sqlserver.jdbc.Geometry.STGeomFromText( - ((JsonValue.JString) jv).value(), 0)); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerRead.java b/foundations-jdbc/src/java/dev/typr/foundations/SqlServerRead.java deleted file mode 100644 index 2d41bbc6e2..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerRead.java +++ /dev/null @@ -1,239 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.sql.ResultSet; -import java.sql.SQLException; -import java.time.LocalDate; -import java.time.LocalDateTime; -import java.time.LocalTime; -import java.time.OffsetDateTime; -import java.util.Optional; -import java.util.UUID; - -/** - * Describes how to read a column from 
a {@link ResultSet} for SQL Server. - * - *

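The deleted SqlServerRead centres on one trick: the raw reader always yields Optional (obtained from rs.wasNull() in the JDBC original), the plain read() throws on null, and opt() simply re-exposes the Optional instead of throwing. A self-contained sketch of the same design, with a Map standing in for a ResultSet row (names are illustrative):

```java
import java.util.Map;
import java.util.Optional;
import java.util.function.BiFunction;
import java.util.function.Function;

// Sketch of the SqlServerRead pattern: one nullable raw reader underneath,
// with throwing and Optional-returning views layered on top.
class RowRead<A> {
  private final BiFunction<Map<String, Object>, String, Optional<A>> readNullable;

  RowRead(BiFunction<Map<String, Object>, String, Optional<A>> readNullable) {
    this.readNullable = readNullable;
  }

  // Non-nullable view: a null column is an error, like NonNullable.read.
  A read(Map<String, Object> row, String col) {
    return readNullable.apply(row, col)
        .orElseThrow(() -> new IllegalStateException("null value in column " + col));
  }

  // Map the decoded value, preserving null-handling, like SqlServerRead.map.
  <B> RowRead<B> map(Function<A, B> f) {
    return new RowRead<>((row, col) -> readNullable.apply(row, col).map(f));
  }

  // Nullable view: surface the Optional instead of throwing, like opt().
  RowRead<Optional<A>> opt() {
    return new RowRead<>((row, col) -> Optional.of(readNullable.apply(row, col)));
  }

  static <A> RowRead<A> of(Class<A> cls) {
    return new RowRead<>((row, col) -> Optional.ofNullable(cls.cast(row.get(col))));
  }
}
```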
Similar to MariaRead but adapted for SQL Server-specific types. - */ -public sealed interface SqlServerRead extends DbRead - permits SqlServerRead.NonNullable, SqlServerRead.Nullable, SqlServerRead.Mapped { - A read(ResultSet rs, int col) throws SQLException; - - SqlServerRead map(SqlFunction f); - - /** Derive a SqlServerRead which allows nullable values */ - SqlServerRead> opt(); - - @FunctionalInterface - interface RawRead { - A apply(ResultSet rs, int column) throws SQLException; - } - - /** - * Create an instance of {@link SqlServerRead} from a function that reads a value from a result - * set. - * - * @param f Should not blow up if the value returned is `null` - */ - static NonNullable of(RawRead f) { - RawRead> readNullableA = - (rs, col) -> { - var a = f.apply(rs, col); - if (rs.wasNull()) return Optional.empty(); - else return Optional.of(a); - }; - return new NonNullable<>(readNullableA); - } - - final class NonNullable implements SqlServerRead { - final RawRead> readNullable; - - public NonNullable(RawRead> readNullable) { - this.readNullable = readNullable; - } - - @Override - public A read(ResultSet rs, int col) throws SQLException { - return readNullable - .apply(rs, col) - .orElseThrow(() -> new SQLException("null value in column " + col)); - } - - @Override - public NonNullable map(SqlFunction f) { - return new NonNullable<>( - (rs, col) -> { - Optional maybeA = readNullable.apply(rs, col); - if (maybeA.isEmpty()) return Optional.empty(); - return Optional.of(f.apply(maybeA.get())); - }); - } - - @Override - public SqlServerRead> opt() { - return new Nullable<>(readNullable); - } - } - - final class Nullable implements SqlServerRead> { - final RawRead> readNullable; - - public Nullable(RawRead> readNullable) { - this.readNullable = readNullable; - } - - @Override - public Optional read(ResultSet rs, int col) throws SQLException { - return readNullable.apply(rs, col); - } - - @Override - public SqlServerRead map(SqlFunction, B> f) { - return new 
Mapped<>(this, f); - } - - @Override - public Nullable> opt() { - return new Nullable<>( - (rs, col) -> { - Optional maybeA = readNullable.apply(rs, col); - if (maybeA.isEmpty()) return Optional.empty(); - return Optional.of(maybeA); - }); - } - } - - record Mapped(SqlServerRead underlying, SqlFunction f) - implements SqlServerRead { - @Override - public B read(ResultSet rs, int col) throws SQLException { - return f.apply(underlying.read(rs, col)); - } - - @Override - public SqlServerRead map(SqlFunction g) { - return new Mapped<>(this, g); - } - - @Override - public SqlServerRead> opt() { - return new Nullable<>((rs, col) -> Optional.ofNullable(read(rs, col))); - } - } - - static NonNullable castJdbcObjectTo(Class cls) { - return of((rs, i) -> cls.cast(rs.getObject(i))); - } - - /** - * Read a value by requesting a specific class from JDBC. This uses rs.getObject(i, cls) which - * allows the JDBC driver to do proper type conversion. - */ - static NonNullable getObjectAs(Class cls) { - return of((rs, i) -> rs.getObject(i, cls)); - } - - // ==================== Basic Type Readers ==================== - - SqlServerRead readString = of(ResultSet::getString); - SqlServerRead readBoolean = of(ResultSet::getBoolean); - - // SQL Server TINYINT is UNSIGNED (0-255), wrap in Uint1 - SqlServerRead readShort = of(ResultSet::getShort); - SqlServerRead readUint1 = - readShort.map(dev.typr.foundations.data.Uint1::new); - SqlServerRead readInteger = of(ResultSet::getInt); - SqlServerRead readLong = of(ResultSet::getLong); - SqlServerRead readFloat = of(ResultSet::getFloat); - SqlServerRead readDouble = of(ResultSet::getDouble); - SqlServerRead readBigDecimal = of(ResultSet::getBigDecimal); - - // Binary types - SqlServerRead readByteArray = of(ResultSet::getBytes); - - // ==================== Date/Time Readers ==================== - - SqlServerRead readDate = of((rs, idx) -> rs.getObject(idx, LocalDate.class)); - SqlServerRead readTime = of((rs, idx) -> rs.getObject(idx, 
LocalTime.class)); - - // DATETIME, SMALLDATETIME, DATETIME2 - SqlServerRead readTimestamp = - of((rs, idx) -> rs.getObject(idx, LocalDateTime.class)); - - // DATETIMEOFFSET - SQL Server specific! - SqlServerRead readOffsetDateTime = - of( - (rs, idx) -> { - // SQL Server JDBC driver returns microsoft.sql.DateTimeOffset - // We need to convert it to java.time.OffsetDateTime - Object obj = rs.getObject(idx); - if (obj == null) return null; - - // Modern JDBC drivers support direct conversion to OffsetDateTime - return rs.getObject(idx, OffsetDateTime.class); - }); - - // ==================== Special Types ==================== - - // UNIQUEIDENTIFIER (UUID/GUID) - SqlServerRead readUUID = - of( - (rs, idx) -> { - String str = rs.getString(idx); - if (str == null) return null; - return UUID.fromString(str); - }); - - // XML - SqlServerRead readXml = - of( - (rs, idx) -> { - java.sql.SQLXML sqlxml = rs.getSQLXML(idx); - if (sqlxml == null) return null; - return new dev.typr.foundations.data.Xml(sqlxml.getString()); - }); - - // JSON - SQL Server stores JSON as NVARCHAR - SqlServerRead readJson = - readString.map(dev.typr.foundations.data.Json::new); - - // HIERARCHYID - read as HierarchyId wrapper type that stores segments and provides toString() - SqlServerRead readHierarchyId = - of( - (rs, idx) -> { - Object obj = rs.getObject(idx); - if (obj == null) return null; - if (obj instanceof byte[] bytes) { - return dev.typr.foundations.data.HierarchyId.fromBytes(bytes); - } - // Fallback: if getString returns something, try to parse it - String str = rs.getString(idx); - if (str != null && !str.isEmpty()) { - return dev.typr.foundations.data.HierarchyId.parse(str); - } - return dev.typr.foundations.data.HierarchyId.ROOT; - }); - - // SQL_VARIANT - read as Object - SqlServerRead readObject = of(ResultSet::getObject); - - // GEOGRAPHY and GEOMETRY - use JDBC driver's spatial classes - // Read as bytes and deserialize (getObject doesn't support these types) - // Handle NULL 
values properly - deserialize() doesn't accept null - SqlServerRead readGeography = - of( - (rs, idx) -> { - byte[] bytes = rs.getBytes(idx); - return bytes == null ? null : com.microsoft.sqlserver.jdbc.Geography.deserialize(bytes); - }); - SqlServerRead readGeometry = - of( - (rs, idx) -> { - byte[] bytes = rs.getBytes(idx); - return bytes == null ? null : com.microsoft.sqlserver.jdbc.Geometry.deserialize(bytes); - }); - - // VECTOR - SQL Server 2025 vector type - // Read as byte[] for now, or parse as float[] if needed - SqlServerRead readVector = readByteArray; -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerText.java b/foundations-jdbc/src/java/dev/typr/foundations/SqlServerText.java deleted file mode 100644 index 0645266d6e..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerText.java +++ /dev/null @@ -1,106 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.util.Optional; -import java.util.UUID; -import java.util.function.BiConsumer; -import java.util.function.Function; - -/** - * Encodes values to text format for SQL Server BULK INSERT command. - * - *

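The escaping logic in the deleted SqlServerText (escapeString plus the NULL marker and tab delimiter) is easy to exercise standalone. A sketch mirroring those conventions for one tab-delimited bulk-load row:

```java
import java.util.List;
import java.util.Optional;

// Sketch of SqlServerText's bulk-load encoding: control characters and
// backslashes are backslash-escaped, columns are tab-delimited, and a
// missing value is rendered as \N (the same conventions as the original's
// DELIMETER and NULL constants).
class BulkText {
  static final char DELIMITER = '\t';
  static final String NULL = "\\N";

  static String escape(String s) {
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < s.length(); i++) {
      char c = s.charAt(i);
      switch (c) {
        case '\0' -> sb.append("\\0");
        case '\n' -> sb.append("\\n");
        case '\r' -> sb.append("\\r");
        case '\t' -> sb.append("\\t");
        case '\\' -> sb.append("\\\\");
        default -> sb.append(c);
      }
    }
    return sb.toString();
  }

  // Encode one row, empty cells becoming the \N marker.
  static String row(List<Optional<String>> cells) {
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < cells.size(); i++) {
      if (i > 0) sb.append(DELIMITER);
      sb.append(cells.get(i).map(BulkText::escape).orElse(NULL));
    }
    return sb.toString();
  }
}
```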
Similar to MariaText but adapted for SQL Server's text format. - */ -public abstract class SqlServerText implements DbText { - public abstract void unsafeEncode(A a, StringBuilder sb); - - public SqlServerText contramap(Function f) { - var self = this; - return instance((b, sb) -> self.unsafeEncode(f.apply(b), sb)); - } - - public SqlServerText> opt() { - var self = this; - return instance( - (a, sb) -> { - if (a.isPresent()) self.unsafeEncode(a.get(), sb); - else sb.append(SqlServerText.NULL); - }); - } - - public static char DELIMETER = '\t'; - public static String NULL = "\\N"; - - public static SqlServerText instance(BiConsumer f) { - return new SqlServerText<>() { - @Override - public void unsafeEncode(A a, StringBuilder sb) { - f.accept(a, sb); - } - }; - } - - @SuppressWarnings("unchecked") - public static SqlServerText from(RowParser rowParser) { - return instance( - (row, sb) -> { - var encoded = rowParser.encode().apply(row); - for (int i = 0; i < encoded.length; i++) { - if (i > 0) { - sb.append(SqlServerText.DELIMETER); - } - DbText text = (DbText) rowParser.columns().get(i).text(); - text.unsafeEncode(encoded[i], sb); - } - }); - } - - public static SqlServerText instanceToString() { - return textString.contramap(Object::toString); - } - - private static void escapeString(String s, StringBuilder sb) { - for (int i = 0; i < s.length(); i++) { - char c = s.charAt(i); - switch (c) { - case '\0': - sb.append("\\0"); - break; - case '\n': - sb.append("\\n"); - break; - case '\r': - sb.append("\\r"); - break; - case '\t': - sb.append("\\t"); - break; - case '\\': - sb.append("\\\\"); - break; - default: - sb.append(c); - } - } - } - - // Basic type text encoders - public static final SqlServerText textString = instance((s, sb) -> escapeString(s, sb)); - public static final SqlServerText textBoolean = instanceToString(); - public static final SqlServerText textShort = instanceToString(); - public static final SqlServerText textInteger = instanceToString(); - 
public static final SqlServerText textLong = instanceToString(); - public static final SqlServerText textFloat = instanceToString(); - public static final SqlServerText textDouble = instanceToString(); - public static final SqlServerText textBigDecimal = instanceToString(); - public static final SqlServerText textByteArray = - instance((bytes, sb) -> sb.append(java.util.Base64.getEncoder().encodeToString(bytes))); - public static final SqlServerText textUUID = instanceToString(); - public static final SqlServerText textObject = instanceToString(); - - // Spatial types - public static final SqlServerText textGeography = - instanceToString(); // Uses toString() which returns WKT - public static final SqlServerText textGeometry = - instanceToString(); // Uses toString() which returns WKT -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerType.java b/foundations-jdbc/src/java/dev/typr/foundations/SqlServerType.java deleted file mode 100644 index 1167cd9130..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerType.java +++ /dev/null @@ -1,96 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.dsl.Bijection; -import java.util.Optional; -import java.util.function.Function; - -/** - * Combines SQL Server type name, read, write, text encoding, and JSON encoding for a type. Similar - * to MariaType but for SQL Server. 
- */ -public record SqlServerType<A>( - SqlServerTypename<A> typename, - SqlServerRead<A> read, - SqlServerWrite<A> write, - SqlServerText<A> sqlServerText, - SqlServerJson<A> sqlServerJson) - implements DbType<A> { - @Override - public DbText<A> text() { - return sqlServerText; - } - - @Override - public DbJson<A> json() { - return sqlServerJson; - } - - public SqlServerType<A> withTypename(SqlServerTypename<A> typename) { - return new SqlServerType<>(typename, read, write, sqlServerText, sqlServerJson); - } - - public SqlServerType<A> withTypename(String sqlType) { - return withTypename(SqlServerTypename.of(sqlType)); - } - - public SqlServerType<A> renamed(String value) { - return withTypename(typename.renamed(value)); - } - - public SqlServerType<A> renamedDropPrecision(String value) { - return withTypename(typename.renamedDropPrecision(value)); - } - - public SqlServerType<A> withRead(SqlServerRead<A> read) { - return new SqlServerType<>(typename, read, write, sqlServerText, sqlServerJson); - } - - public SqlServerType<A> withWrite(SqlServerWrite<A> write) { - return new SqlServerType<>(typename, read, write, sqlServerText, sqlServerJson); - } - - public SqlServerType<A> withText(SqlServerText<A> text) { - return new SqlServerType<>(typename, read, write, text, sqlServerJson); - } - - public SqlServerType<A> withJson(SqlServerJson<A> json) { - return new SqlServerType<>(typename, read, write, sqlServerText, json); - } - - public SqlServerType<Optional<A>> opt() { - return new SqlServerType<>( - typename.opt(), read.opt(), write.opt(typename), sqlServerText.opt(), sqlServerJson.opt()); - } - - public <B> SqlServerType<B> bimap(SqlFunction<A, B> f, Function<B, A> g) { - return new SqlServerType<>( - typename.as(), - read.map(f), - write.contramap(g), - sqlServerText.contramap(g), - sqlServerJson.bimap(f, g)); - } - - public <B> SqlServerType<B> to(Bijection<A, B> bijection) { - return new SqlServerType<>( - typename.as(), - read.map(bijection::underlying), - write.contramap(bijection::from), - sqlServerText.contramap(bijection::from), -
sqlServerJson.bimap(bijection::underlying, bijection::from)); - } - - public static SqlServerType of( - String tpe, SqlServerRead r, SqlServerWrite w, SqlServerText t, SqlServerJson j) { - return new SqlServerType<>(SqlServerTypename.of(tpe), r, w, t, j); - } - - public static SqlServerType of( - SqlServerTypename typename, - SqlServerRead r, - SqlServerWrite w, - SqlServerText t, - SqlServerJson j) { - return new SqlServerType<>(typename, r, w, t, j); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerTypename.java b/foundations-jdbc/src/java/dev/typr/foundations/SqlServerTypename.java deleted file mode 100644 index 40cf43151e..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerTypename.java +++ /dev/null @@ -1,137 +0,0 @@ -package dev.typr.foundations; - -import dev.typr.foundations.dsl.Bijection; -import java.util.Optional; - -/** - * Represents a SQL Server SQL type name with optional precision. Similar to MariaTypename. SQL - * Server uses bracket notation [table] for identifiers but standard type casts. - */ -public sealed interface SqlServerTypename extends DbTypename { - String sqlType(); - - /** - * SQL Server uses CAST() syntax, not PostgreSQL's :: operator. Don't render :: casts in prepared - * statements. - */ - @Override - default boolean renderTypeCast() { - return false; - } - - String sqlTypeNoPrecision(); - - SqlServerTypename renamed(String value); - - SqlServerTypename renamedDropPrecision(String value); - - default SqlServerTypename> opt() { - return new Opt<>(this); - } - - default SqlServerTypename as() { - return (SqlServerTypename) this; - } - - /** - * Type-safe conversion using a bijection as proof of type relationship. Overrides DbTypename.to() - * to return SqlServerTypename for better type refinement. 
- */ - @Override - default SqlServerTypename to(Bijection bijection) { - return (SqlServerTypename) this; - } - - record Base(String sqlType) implements SqlServerTypename { - @Override - public String sqlTypeNoPrecision() { - return sqlType; - } - - @Override - public Base renamed(String value) { - return new Base<>(value); - } - - @Override - public Base renamedDropPrecision(String value) { - return new Base<>(value); - } - } - - record WithPrec(Base of, int precision) implements SqlServerTypename { - public String sqlType() { - return of.sqlType + "(" + precision + ")"; - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public SqlServerTypename renamed(String value) { - return new WithPrec<>(of.renamed(value), precision); - } - - @Override - public SqlServerTypename renamedDropPrecision(String value) { - return of.renamed(value); - } - } - - record WithPrecScale(Base of, int precision, int scale) implements SqlServerTypename { - public String sqlType() { - return of.sqlType + "(" + precision + "," + scale + ")"; - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public SqlServerTypename renamed(String value) { - return new WithPrecScale<>(of.renamed(value), precision, scale); - } - - @Override - public SqlServerTypename renamedDropPrecision(String value) { - return of.renamed(value); - } - } - - record Opt(SqlServerTypename of) implements SqlServerTypename> { - @Override - public String sqlType() { - return of.sqlType(); - } - - @Override - public String sqlTypeNoPrecision() { - return of.sqlTypeNoPrecision(); - } - - @Override - public SqlServerTypename> renamed(String value) { - return new Opt<>(of.renamed(value)); - } - - @Override - public SqlServerTypename> renamedDropPrecision(String value) { - return new Opt<>(of.renamedDropPrecision(value)); - } - } - - static SqlServerTypename of(String sqlType) { - return new Base<>(sqlType); - } - - 
static SqlServerTypename of(String sqlType, int precision) { - return new WithPrec<>(new Base<>(sqlType), precision); - } - - static SqlServerTypename of(String sqlType, int precision, int scale) { - return new WithPrecScale<>(new Base<>(sqlType), precision, scale); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerTypes.java b/foundations-jdbc/src/java/dev/typr/foundations/SqlServerTypes.java deleted file mode 100644 index cfeb7d5e8e..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerTypes.java +++ /dev/null @@ -1,434 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.time.*; -import java.util.UUID; - -/** - * SQL Server type definitions for the typr-runtime-java library. - * - *

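The SqlServerTypename hierarchy deleted just above models a SQL type name that optionally carries precision, or precision and scale, and renders the full type string on demand. A shortened self-contained sketch of the same record structure:

```java
// Sketch of SqlServerTypename's shape: a base name, optionally wrapped
// with precision or precision+scale, mirroring the original's
// Base / WithPrec / WithPrecScale records (the Opt wrapper is omitted).
sealed interface TypeName permits TypeName.Base, TypeName.WithPrec, TypeName.WithPrecScale {
  String sqlType();

  String sqlTypeNoPrecision();

  record Base(String name) implements TypeName {
    public String sqlType() { return name; }

    public String sqlTypeNoPrecision() { return name; }
  }

  record WithPrec(Base of, int precision) implements TypeName {
    public String sqlType() { return of.name() + "(" + precision + ")"; }

    public String sqlTypeNoPrecision() { return of.name(); }
  }

  record WithPrecScale(Base of, int precision, int scale) implements TypeName {
    public String sqlType() { return of.name() + "(" + precision + "," + scale + ")"; }

    public String sqlTypeNoPrecision() { return of.name(); }
  }
}
```

This is why `renamedDropPrecision` in the original can simply return a renamed `Base`: dropping precision is just unwrapping one layer.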
This interface provides type codecs for all SQL Server data types. - * - *

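One of the differences called out below, TINYINT being unsigned 0-255, is handled in the deleted code by wrapping reads in foundations' Uint1. A range-checked sketch of such a wrapper, assuming the real class validates similarly:

```java
// Sketch of a one-byte-unsigned wrapper for SQL Server TINYINT (0-255),
// in the spirit of dev.typr.foundations.data.Uint1. The name and the
// validation behaviour here are assumptions, not the real class.
record Uint1Sketch(short value) {
  Uint1Sketch {
    if (value < 0 || value > 255)
      throw new IllegalArgumentException("out of range for TINYINT: " + value);
  }
}
```

Mapping through a wrapper like this keeps Java's signed `short` from silently accepting values a TINYINT column would reject.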
Key differences from other databases: - TINYINT is UNSIGNED (0-255), mapped to Short - Unicode - * types (NCHAR, NVARCHAR, NTEXT) are separate from non-Unicode - DATETIMEOFFSET for timezone-aware - * timestamps - UNIQUEIDENTIFIER for UUIDs/GUIDs - No native array support (use table-valued - * parameters instead) - */ -public interface SqlServerTypes { - - // ==================== Integer Types ==================== - - // TINYINT - UNSIGNED in SQL Server! (0-255) - SqlServerType tinyint = - SqlServerType.of( - "TINYINT", - SqlServerRead.readUint1, - SqlServerWrite.writeUint1, - SqlServerText.textShort.contramap(dev.typr.foundations.data.Uint1::value), - SqlServerJson.int2.bimap(dev.typr.foundations.data.Uint1::new, u -> (short) u.value())); - - SqlServerType smallint = - SqlServerType.of( - "SMALLINT", - SqlServerRead.readShort, - SqlServerWrite.writeShort, - SqlServerText.textShort, - SqlServerJson.int2); - - SqlServerType int_ = - SqlServerType.of( - "INT", - SqlServerRead.readInteger, - SqlServerWrite.writeInteger, - SqlServerText.textInteger, - SqlServerJson.int4); - - SqlServerType bigint = - SqlServerType.of( - "BIGINT", - SqlServerRead.readLong, - SqlServerWrite.writeLong, - SqlServerText.textLong, - SqlServerJson.int8); - - // ==================== Fixed-Point Types ==================== - - SqlServerType decimal = - SqlServerType.of( - "DECIMAL", - SqlServerRead.readBigDecimal, - SqlServerWrite.writeBigDecimal, - SqlServerText.textBigDecimal, - SqlServerJson.numeric); - - SqlServerType numeric = decimal.renamed("NUMERIC"); - - static SqlServerType decimal(int precision, int scale) { - return SqlServerType.of( - SqlServerTypename.of("DECIMAL", precision, scale), - SqlServerRead.readBigDecimal, - SqlServerWrite.writeBigDecimal, - SqlServerText.textBigDecimal, - SqlServerJson.numeric); - } - - static SqlServerType numeric(int precision, int scale) { - return decimal(precision, scale).renamed("NUMERIC"); - } - - SqlServerType money = - SqlServerType.of( - "MONEY", 
- SqlServerRead.readBigDecimal, - SqlServerWrite.writeBigDecimal, - SqlServerText.textBigDecimal, - SqlServerJson.numeric); - - SqlServerType smallmoney = - SqlServerType.of( - "SMALLMONEY", - SqlServerRead.readBigDecimal, - SqlServerWrite.writeBigDecimal, - SqlServerText.textBigDecimal, - SqlServerJson.numeric); - - // ==================== Floating-Point Types ==================== - - SqlServerType real = - SqlServerType.of( - "REAL", - SqlServerRead.readFloat, - SqlServerWrite.writeFloat, - SqlServerText.textFloat, - SqlServerJson.float4); - - SqlServerType float_ = - SqlServerType.of( - "FLOAT", - SqlServerRead.readDouble, - SqlServerWrite.writeDouble, - SqlServerText.textDouble, - SqlServerJson.float8); - - // ==================== Boolean Type ==================== - - SqlServerType bit = - SqlServerType.of( - "BIT", - SqlServerRead.readBoolean, - SqlServerWrite.writeBoolean, - SqlServerText.textBoolean, - SqlServerJson.bool); - - // ==================== String Types (Non-Unicode) ==================== - - SqlServerType char_ = - SqlServerType.of( - "CHAR", - SqlServerRead.readString, - SqlServerWrite.writeString, - SqlServerText.textString, - SqlServerJson.text); - - static SqlServerType char_(int length) { - return SqlServerType.of( - SqlServerTypename.of("CHAR", length), - SqlServerRead.readString, - SqlServerWrite.writeString, - SqlServerText.textString, - SqlServerJson.text); - } - - SqlServerType varchar = - SqlServerType.of( - "VARCHAR", - SqlServerRead.readString, - SqlServerWrite.writeString, - SqlServerText.textString, - SqlServerJson.text); - - static SqlServerType varchar(int length) { - return SqlServerType.of( - SqlServerTypename.of("VARCHAR", length), - SqlServerRead.readString, - SqlServerWrite.writeString, - SqlServerText.textString, - SqlServerJson.text); - } - - SqlServerType varcharMax = varchar.renamed("VARCHAR(MAX)"); - SqlServerType text = - SqlServerType.of( - "TEXT", - SqlServerRead.readString, - SqlServerWrite.writeText, - 
SqlServerText.textString, - SqlServerJson.text); - - // ==================== String Types (Unicode) ==================== - - SqlServerType nchar = - SqlServerType.of( - "NCHAR", - SqlServerRead.readString, - SqlServerWrite.writeString, - SqlServerText.textString, - SqlServerJson.text); - - static SqlServerType nchar(int length) { - return SqlServerType.of( - SqlServerTypename.of("NCHAR", length), - SqlServerRead.readString, - SqlServerWrite.writeString, - SqlServerText.textString, - SqlServerJson.text); - } - - SqlServerType nvarchar = - SqlServerType.of( - "NVARCHAR", - SqlServerRead.readString, - SqlServerWrite.writeString, - SqlServerText.textString, - SqlServerJson.text); - - static SqlServerType nvarchar(int length) { - return SqlServerType.of( - SqlServerTypename.of("NVARCHAR", length), - SqlServerRead.readString, - SqlServerWrite.writeString, - SqlServerText.textString, - SqlServerJson.text); - } - - SqlServerType nvarcharMax = nvarchar.renamed("NVARCHAR(MAX)"); - SqlServerType ntext = - SqlServerType.of( - "NTEXT", - SqlServerRead.readString, - SqlServerWrite.writeNText, - SqlServerText.textString, - SqlServerJson.text); - - // ==================== Binary Types ==================== - - SqlServerType binary = - SqlServerType.of( - "BINARY", - SqlServerRead.readByteArray, - SqlServerWrite.writeByteArray, - SqlServerText.textByteArray, - SqlServerJson.bytea); - - static SqlServerType binary(int length) { - return SqlServerType.of( - SqlServerTypename.of("BINARY", length), - SqlServerRead.readByteArray, - SqlServerWrite.writeByteArray, - SqlServerText.textByteArray, - SqlServerJson.bytea); - } - - SqlServerType varbinary = - SqlServerType.of( - "VARBINARY", - SqlServerRead.readByteArray, - SqlServerWrite.writeByteArray, - SqlServerText.textByteArray, - SqlServerJson.bytea); - - static SqlServerType varbinary(int length) { - return SqlServerType.of( - SqlServerTypename.of("VARBINARY", length), - SqlServerRead.readByteArray, - SqlServerWrite.writeByteArray, - 
SqlServerText.textByteArray, - SqlServerJson.bytea); - } - - SqlServerType varbinaryMax = varbinary.renamed("VARBINARY(MAX)"); - SqlServerType image = varbinary.renamed("IMAGE"); - - // ==================== Date/Time Types ==================== - - SqlServerType date = - SqlServerType.of( - "DATE", - SqlServerRead.readDate, - SqlServerWrite.writeDate, - SqlServerText.instanceToString(), - SqlServerJson.date); - - SqlServerType time = - SqlServerType.of( - "TIME", - SqlServerRead.readTime, - SqlServerWrite.writeTime, - SqlServerText.instanceToString(), - SqlServerJson.time); - - static SqlServerType time(int scale) { - return SqlServerType.of( - SqlServerTypename.of("TIME", scale), - SqlServerRead.readTime, - SqlServerWrite.writeTime, - SqlServerText.instanceToString(), - SqlServerJson.time); - } - - // DATETIME - legacy type with 3.33ms precision - SqlServerType datetime = - SqlServerType.of( - "DATETIME", - SqlServerRead.readTimestamp, - SqlServerWrite.writeTimestamp, - SqlServerText.instanceToString(), - SqlServerJson.timestamp); - - // SMALLDATETIME - minute precision - SqlServerType smalldatetime = - SqlServerType.of( - "SMALLDATETIME", - SqlServerRead.readTimestamp, - SqlServerWrite.writeTimestamp, - SqlServerText.instanceToString(), - SqlServerJson.timestamp); - - // DATETIME2 - modern type with 100ns precision - SqlServerType datetime2 = - SqlServerType.of( - "DATETIME2", - SqlServerRead.readTimestamp, - SqlServerWrite.writeTimestamp, - SqlServerText.instanceToString(), - SqlServerJson.timestamp); - - static SqlServerType datetime2(int scale) { - return SqlServerType.of( - SqlServerTypename.of("DATETIME2", scale), - SqlServerRead.readTimestamp, - SqlServerWrite.writeTimestamp, - SqlServerText.instanceToString(), - SqlServerJson.timestamp); - } - - // DATETIMEOFFSET - datetime with timezone offset (SQL Server specific!) 
- SqlServerType datetimeoffset = - SqlServerType.of( - "DATETIMEOFFSET", - SqlServerRead.readOffsetDateTime, - SqlServerWrite.writeOffsetDateTime, - SqlServerText.instanceToString(), - SqlServerJson.timestamptz); - - static SqlServerType datetimeoffset(int scale) { - return SqlServerType.of( - SqlServerTypename.of("DATETIMEOFFSET", scale), - SqlServerRead.readOffsetDateTime, - SqlServerWrite.writeOffsetDateTime, - SqlServerText.instanceToString(), - SqlServerJson.timestamptz); - } - - // ==================== Special Types ==================== - - // UNIQUEIDENTIFIER (UUID/GUID) - SqlServerType uniqueidentifier = - SqlServerType.of( - "UNIQUEIDENTIFIER", - SqlServerRead.readUUID, - SqlServerWrite.writeUUID, - SqlServerText.textUUID, - SqlServerJson.uuid); - - // XML - SqlServerType xml = - SqlServerType.of( - "XML", - SqlServerRead.readXml, - SqlServerWrite.writeXml, - SqlServerText.textString.contramap(dev.typr.foundations.data.Xml::value), - SqlServerJson.text.bimap( - dev.typr.foundations.data.Xml::new, dev.typr.foundations.data.Xml::value)); - - // JSON - SQL Server 2016+, stored as NVARCHAR - SqlServerType json = - SqlServerType.of( - "NVARCHAR(MAX)", // JSON is stored as NVARCHAR(MAX) - SqlServerRead.readJson, - SqlServerWrite.writeJson, - SqlServerText.textString.contramap(dev.typr.foundations.data.Json::value), - SqlServerJson.text.contramap(dev.typr.foundations.data.Json::value)); - - // VECTOR - SQL Server 2025 (stored as binary for now) - SqlServerType vector = - SqlServerType.of( - "VECTOR", - SqlServerRead.readVector, - SqlServerWrite.writeVector, - SqlServerText.textByteArray, - SqlServerJson.bytea); - - // ==================== System Types ==================== - - // ROWVERSION / TIMESTAMP - 8-byte binary version number - SqlServerType rowversion = - SqlServerType.of( - "ROWVERSION", - SqlServerRead.readByteArray, - SqlServerWrite.writeByteArray, - SqlServerText.textByteArray, - SqlServerJson.bytea); - - SqlServerType timestamp = 
rowversion.renamed("TIMESTAMP"); - - // HIERARCHYID - hierarchical data (tree structures) - SqlServerType hierarchyid = - SqlServerType.of( - "HIERARCHYID", - SqlServerRead.readHierarchyId, - SqlServerWrite.writeHierarchyId, - SqlServerText.textString.contramap(dev.typr.foundations.data.HierarchyId::toString), - SqlServerJson.text.bimap( - dev.typr.foundations.data.HierarchyId::parse, - dev.typr.foundations.data.HierarchyId::toString)); - - // SQL_VARIANT - can store values of various types - SqlServerType sqlVariant = - SqlServerType.of( - "SQL_VARIANT", - SqlServerRead.readObject, - SqlServerWrite.writeObject, - SqlServerText.textObject, - SqlServerJson.unknown); - - // ==================== Spatial Types ==================== - // Use JDBC driver's Geography and Geometry classes - - SqlServerType geography = - SqlServerType.of( - "GEOGRAPHY", - SqlServerRead.readGeography, - SqlServerWrite.writeGeography, - SqlServerText.textGeography, - SqlServerJson.jsonGeography); - - SqlServerType geometry = - SqlServerType.of( - "GEOMETRY", - SqlServerRead.readGeometry, - SqlServerWrite.writeGeometry, - SqlServerText.textGeometry, - SqlServerJson.jsonGeometry); - - // ==================== Unknown Type ==================== - // For columns whose type typr doesn't know how to handle - cast to/from string - SqlServerType unknown = - SqlServerType.of( - "VARCHAR(MAX)", - SqlServerRead.readString, - SqlServerWrite.writeString, - SqlServerText.textString, - SqlServerJson.text) - .bimap(dev.typr.foundations.data.Unknown::new, dev.typr.foundations.data.Unknown::value); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerWrite.java b/foundations-jdbc/src/java/dev/typr/foundations/SqlServerWrite.java deleted file mode 100644 index 46a22086ea..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/SqlServerWrite.java +++ /dev/null @@ -1,159 +0,0 @@ -package dev.typr.foundations; - -import java.math.BigDecimal; -import java.sql.PreparedStatement; -import 
java.sql.SQLException; -import java.time.LocalDate; -import java.time.LocalDateTime; -import java.time.LocalTime; -import java.time.OffsetDateTime; -import java.util.Optional; -import java.util.UUID; -import java.util.function.Function; - -/** - * Describes how to write a value to a {@link PreparedStatement} for SQL Server. - * - *
<p>
Similar to MariaWrite but adapted for SQL Server-specific types like DATETIMEOFFSET, - * UNIQUEIDENTIFIER, etc. - */ -public sealed interface SqlServerWrite extends DbWrite permits SqlServerWrite.Instance { - void set(PreparedStatement ps, int idx, A a) throws SQLException; - - SqlServerWrite> opt(SqlServerTypename typename); - - SqlServerWrite contramap(Function f); - - @FunctionalInterface - interface RawWriter { - void set(PreparedStatement ps, int index, A a) throws SQLException; - } - - record Instance(RawWriter rawWriter, Function f) implements SqlServerWrite { - @Override - public void set(PreparedStatement ps, int index, A a) throws SQLException { - rawWriter.set(ps, index, f.apply(a)); - } - - @Override - public SqlServerWrite> opt(SqlServerTypename typename) { - // SQL Server requires the actual SQL type for setNull(), not java.sql.Types.NULL - // Use the type name to determine the correct JDBC type constant - int sqlType = getSqlTypeForTypename(typename.sqlTypeNoPrecision()); - return new Instance<>( - (ps, index, u) -> { - if (u == null) ps.setNull(index, sqlType); - else set(ps, index, u); - }, - a -> a.orElse(null)); - } - - @Override - public SqlServerWrite contramap(Function f) { - return new Instance<>(rawWriter, f.andThen(this.f)); - } - } - - static SqlServerWrite primitive(RawWriter rawWriter) { - return new Instance<>(rawWriter, Function.identity()); - } - - static SqlServerWrite passObjectToJdbc() { - return primitive(PreparedStatement::setObject); - } - - static int getSqlTypeForTypename(String sqlType) { - return switch (sqlType.toUpperCase()) { - case "TINYINT" -> java.sql.Types.TINYINT; - case "SMALLINT" -> java.sql.Types.SMALLINT; - case "INT", "INTEGER" -> java.sql.Types.INTEGER; - case "BIGINT" -> java.sql.Types.BIGINT; - case "DECIMAL", "NUMERIC", "MONEY", "SMALLMONEY" -> java.sql.Types.DECIMAL; - case "REAL" -> java.sql.Types.REAL; - case "FLOAT" -> java.sql.Types.FLOAT; - case "BIT" -> java.sql.Types.BIT; - case "CHAR" -> 
java.sql.Types.CHAR; - case "VARCHAR" -> java.sql.Types.VARCHAR; - case "NCHAR" -> java.sql.Types.NCHAR; - case "NVARCHAR" -> java.sql.Types.NVARCHAR; - case "TEXT" -> java.sql.Types.LONGVARCHAR; - case "NTEXT" -> java.sql.Types.LONGNVARCHAR; - case "BINARY" -> java.sql.Types.BINARY; - case "VARBINARY" -> java.sql.Types.VARBINARY; - case "IMAGE" -> java.sql.Types.LONGVARBINARY; - case "DATE" -> java.sql.Types.DATE; - case "TIME" -> java.sql.Types.TIME; - case "DATETIME", "DATETIME2", "SMALLDATETIME" -> java.sql.Types.TIMESTAMP; - case "DATETIMEOFFSET" -> java.sql.Types.TIMESTAMP_WITH_TIMEZONE; - case "UNIQUEIDENTIFIER" -> java.sql.Types.CHAR; - case "XML" -> java.sql.Types.SQLXML; - case "GEOGRAPHY", "GEOMETRY", "HIERARCHYID", "SQL_VARIANT" -> java.sql.Types.OTHER; - default -> java.sql.Types.OTHER; - }; - } - - // ==================== Basic Type Writers ==================== - - SqlServerWrite writeString = primitive(PreparedStatement::setString); - - // TEXT and NTEXT need explicit SQL type to avoid NVARCHAR conversion - SqlServerWrite writeText = - primitive((ps, idx, s) -> ps.setObject(idx, s, java.sql.Types.LONGVARCHAR)); - SqlServerWrite writeNText = - primitive((ps, idx, s) -> ps.setObject(idx, s, java.sql.Types.LONGNVARCHAR)); - SqlServerWrite writeBoolean = primitive(PreparedStatement::setBoolean); - SqlServerWrite writeShort = primitive(PreparedStatement::setShort); - SqlServerWrite writeUint1 = - writeShort.contramap(dev.typr.foundations.data.Uint1::value); - SqlServerWrite writeInteger = primitive(PreparedStatement::setInt); - SqlServerWrite writeLong = primitive(PreparedStatement::setLong); - SqlServerWrite writeFloat = primitive(PreparedStatement::setFloat); - SqlServerWrite writeDouble = primitive(PreparedStatement::setDouble); - SqlServerWrite writeBigDecimal = primitive(PreparedStatement::setBigDecimal); - SqlServerWrite writeByteArray = primitive(PreparedStatement::setBytes); - - // ==================== Date/Time Writers ==================== - - 
SqlServerWrite writeDate = primitive(PreparedStatement::setObject); - SqlServerWrite writeTime = primitive(PreparedStatement::setObject); - SqlServerWrite writeTimestamp = primitive(PreparedStatement::setObject); - - // DATETIMEOFFSET - SQL Server specific! - SqlServerWrite writeOffsetDateTime = - primitive((ps, idx, odt) -> ps.setObject(idx, odt)); - - // ==================== Special Type Writers ==================== - - // UNIQUEIDENTIFIER (UUID/GUID) - SqlServerWrite writeUUID = primitive((ps, idx, uuid) -> ps.setString(idx, uuid.toString())); - - // XML - SqlServerWrite writeXml = - primitive( - (ps, idx, xml) -> { - java.sql.SQLXML sqlxml = ps.getConnection().createSQLXML(); - sqlxml.setString(xml.value()); - ps.setSQLXML(idx, sqlxml); - }); - - // JSON - SQL Server stores JSON as NVARCHAR - SqlServerWrite writeJson = - writeString.contramap(dev.typr.foundations.data.Json::value); - - // HIERARCHYID - write as string path, let SQL Server parse it - // SQL Server will convert the string to proper binary format via implicit cast - SqlServerWrite writeHierarchyId = - writeString.contramap(dev.typr.foundations.data.HierarchyId::toString); - - // SQL_VARIANT - write as object - SqlServerWrite writeObject = primitive(PreparedStatement::setObject); - - // GEOGRAPHY and GEOMETRY - use JDBC driver's spatial classes - SqlServerWrite writeGeography = - primitive((ps, idx, geo) -> ps.setObject(idx, geo)); - SqlServerWrite writeGeometry = - primitive((ps, idx, geom) -> ps.setObject(idx, geom)); - - // VECTOR - SqlServerWrite writeVector = writeByteArray; -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/SqlSupplier.java b/foundations-jdbc/src/java/dev/typr/foundations/SqlSupplier.java deleted file mode 100644 index a3f3677a6c..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/SqlSupplier.java +++ /dev/null @@ -1,8 +0,0 @@ -package dev.typr.foundations; - -import java.sql.SQLException; - -@FunctionalInterface -public interface SqlSupplier { - T 
get() throws SQLException; -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/StructResultSetWrapper.java b/foundations-jdbc/src/java/dev/typr/foundations/StructResultSetWrapper.java deleted file mode 100644 index 7c2fdbdf0d..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/StructResultSetWrapper.java +++ /dev/null @@ -1,976 +0,0 @@ -package dev.typr.foundations; - -import java.sql.*; -import oracle.sql.STRUCT; - -/** - * Minimal ResultSet wrapper that allows reading a STRUCT as if it were from a ResultSet. This is - * used internally to convert STRUCT elements in arrays to Java objects. - */ -class StructResultSetWrapper implements ResultSet, AutoCloseable { - private final STRUCT struct; - - StructResultSetWrapper(STRUCT struct) { - this.struct = struct; - } - - @Override - public Object getObject(int columnIndex) throws SQLException { - if (columnIndex == 1) { - return struct; - } - throw new SQLException("Invalid column index: " + columnIndex); - } - - @Override - public Object getObject(int columnIndex, java.util.Map> map) - throws SQLException { - return getObject(columnIndex); - } - - @Override - public T getObject(int columnIndex, Class type) throws SQLException { - return type.cast(getObject(columnIndex)); - } - - @Override - public boolean wasNull() throws SQLException { - return struct == null; - } - - @Override - public void close() throws SQLException { - // Nothing to close - } - - // All other ResultSet methods throw UnsupportedOperationException - @Override - public String getString(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean getBoolean(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public byte getByte(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public short getShort(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public int getInt(int columnIndex) { - throw 
new UnsupportedOperationException(); - } - - @Override - public long getLong(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public float getFloat(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public double getDouble(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.math.BigDecimal getBigDecimal(int columnIndex, int scale) { - throw new UnsupportedOperationException(); - } - - @Override - public byte[] getBytes(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public Date getDate(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public Time getTime(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public Timestamp getTimestamp(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.InputStream getAsciiStream(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.InputStream getUnicodeStream(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.InputStream getBinaryStream(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public String getString(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean getBoolean(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public byte getByte(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public short getShort(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public int getInt(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public long getLong(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public float 
getFloat(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public double getDouble(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.math.BigDecimal getBigDecimal(String columnLabel, int scale) { - throw new UnsupportedOperationException(); - } - - @Override - public byte[] getBytes(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Date getDate(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Time getTime(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Timestamp getTimestamp(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.InputStream getAsciiStream(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.InputStream getUnicodeStream(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.InputStream getBinaryStream(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public SQLWarning getWarnings() { - throw new UnsupportedOperationException(); - } - - @Override - public void clearWarnings() { - throw new UnsupportedOperationException(); - } - - @Override - public String getCursorName() { - throw new UnsupportedOperationException(); - } - - @Override - public ResultSetMetaData getMetaData() { - throw new UnsupportedOperationException(); - } - - @Override - public Object getObject(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public int findColumn(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.Reader getCharacterStream(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.Reader getCharacterStream(String columnLabel) { - throw new 
UnsupportedOperationException(); - } - - @Override - public java.math.BigDecimal getBigDecimal(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.math.BigDecimal getBigDecimal(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean isBeforeFirst() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean isAfterLast() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean isFirst() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean isLast() { - throw new UnsupportedOperationException(); - } - - @Override - public void beforeFirst() { - throw new UnsupportedOperationException(); - } - - @Override - public void afterLast() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean first() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean last() { - throw new UnsupportedOperationException(); - } - - @Override - public int getRow() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean absolute(int row) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean relative(int rows) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean previous() { - throw new UnsupportedOperationException(); - } - - @Override - public void setFetchDirection(int direction) { - throw new UnsupportedOperationException(); - } - - @Override - public int getFetchDirection() { - throw new UnsupportedOperationException(); - } - - @Override - public void setFetchSize(int rows) { - throw new UnsupportedOperationException(); - } - - @Override - public int getFetchSize() { - throw new UnsupportedOperationException(); - } - - @Override - public int getType() { - throw new UnsupportedOperationException(); - } - - @Override - public int getConcurrency() { - throw new UnsupportedOperationException(); - 
} - - @Override - public boolean rowUpdated() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean rowInserted() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean rowDeleted() { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNull(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBoolean(int columnIndex, boolean x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateByte(int columnIndex, byte x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateShort(int columnIndex, short x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateInt(int columnIndex, int x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateLong(int columnIndex, long x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateFloat(int columnIndex, float x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateDouble(int columnIndex, double x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBigDecimal(int columnIndex, java.math.BigDecimal x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateString(int columnIndex, String x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBytes(int columnIndex, byte[] x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateDate(int columnIndex, Date x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateTime(int columnIndex, Time x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateTimestamp(int columnIndex, Timestamp x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateAsciiStream(int 
columnIndex, java.io.InputStream x, int length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBinaryStream(int columnIndex, java.io.InputStream x, int length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateCharacterStream(int columnIndex, java.io.Reader x, int length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateObject(int columnIndex, Object x, int scaleOrLength) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateObject(int columnIndex, Object x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNull(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBoolean(String columnLabel, boolean x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateByte(String columnLabel, byte x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateShort(String columnLabel, short x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateInt(String columnLabel, int x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateLong(String columnLabel, long x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateFloat(String columnLabel, float x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateDouble(String columnLabel, double x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBigDecimal(String columnLabel, java.math.BigDecimal x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateString(String columnLabel, String x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBytes(String columnLabel, byte[] x) { - throw new UnsupportedOperationException(); - } - - @Override 
- public void updateDate(String columnLabel, Date x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateTime(String columnLabel, Time x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateTimestamp(String columnLabel, Timestamp x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateAsciiStream(String columnLabel, java.io.InputStream x, int length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBinaryStream(String columnLabel, java.io.InputStream x, int length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateCharacterStream(String columnLabel, java.io.Reader reader, int length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateObject(String columnLabel, Object x, int scaleOrLength) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateObject(String columnLabel, Object x) { - throw new UnsupportedOperationException(); - } - - @Override - public void insertRow() { - throw new UnsupportedOperationException(); - } - - @Override - public void updateRow() { - throw new UnsupportedOperationException(); - } - - @Override - public void deleteRow() { - throw new UnsupportedOperationException(); - } - - @Override - public void refreshRow() { - throw new UnsupportedOperationException(); - } - - @Override - public void cancelRowUpdates() { - throw new UnsupportedOperationException(); - } - - @Override - public void moveToInsertRow() { - throw new UnsupportedOperationException(); - } - - @Override - public void moveToCurrentRow() { - throw new UnsupportedOperationException(); - } - - @Override - public Statement getStatement() { - throw new UnsupportedOperationException(); - } - - @Override - public Ref getRef(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public Blob getBlob(int columnIndex) { - throw new 
UnsupportedOperationException(); - } - - @Override - public Clob getClob(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public Array getArray(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public Object getObject(String columnLabel, java.util.Map> map) { - throw new UnsupportedOperationException(); - } - - @Override - public Ref getRef(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Blob getBlob(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Clob getClob(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Array getArray(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public Date getDate(int columnIndex, java.util.Calendar cal) { - throw new UnsupportedOperationException(); - } - - @Override - public Date getDate(String columnLabel, java.util.Calendar cal) { - throw new UnsupportedOperationException(); - } - - @Override - public Time getTime(int columnIndex, java.util.Calendar cal) { - throw new UnsupportedOperationException(); - } - - @Override - public Time getTime(String columnLabel, java.util.Calendar cal) { - throw new UnsupportedOperationException(); - } - - @Override - public Timestamp getTimestamp(int columnIndex, java.util.Calendar cal) { - throw new UnsupportedOperationException(); - } - - @Override - public Timestamp getTimestamp(String columnLabel, java.util.Calendar cal) { - throw new UnsupportedOperationException(); - } - - @Override - public java.net.URL getURL(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.net.URL getURL(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateRef(int columnIndex, Ref x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateRef(String columnLabel, 
Ref x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBlob(int columnIndex, Blob x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBlob(String columnLabel, Blob x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateClob(int columnIndex, Clob x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateClob(String columnLabel, Clob x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateArray(int columnIndex, Array x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateArray(String columnLabel, Array x) { - throw new UnsupportedOperationException(); - } - - @Override - public RowId getRowId(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public RowId getRowId(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateRowId(int columnIndex, RowId x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateRowId(String columnLabel, RowId x) { - throw new UnsupportedOperationException(); - } - - @Override - public int getHoldability() { - throw new UnsupportedOperationException(); - } - - @Override - public boolean isClosed() { - return false; - } - - @Override - public void updateNString(int columnIndex, String nString) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNString(String columnLabel, String nString) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNClob(int columnIndex, NClob nClob) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNClob(String columnLabel, NClob nClob) { - throw new UnsupportedOperationException(); - } - - @Override - public NClob getNClob(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - 
public NClob getNClob(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public SQLXML getSQLXML(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public SQLXML getSQLXML(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateSQLXML(int columnIndex, SQLXML xmlObject) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateSQLXML(String columnLabel, SQLXML xmlObject) { - throw new UnsupportedOperationException(); - } - - @Override - public String getNString(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public String getNString(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.Reader getNCharacterStream(int columnIndex) { - throw new UnsupportedOperationException(); - } - - @Override - public java.io.Reader getNCharacterStream(String columnLabel) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNCharacterStream(int columnIndex, java.io.Reader x, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNCharacterStream(String columnLabel, java.io.Reader reader, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateAsciiStream(int columnIndex, java.io.InputStream x, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBinaryStream(int columnIndex, java.io.InputStream x, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateCharacterStream(int columnIndex, java.io.Reader x, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateAsciiStream(String columnLabel, java.io.InputStream x, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void 
updateBinaryStream(String columnLabel, java.io.InputStream x, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateCharacterStream(String columnLabel, java.io.Reader reader, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBlob(int columnIndex, java.io.InputStream inputStream, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBlob(String columnLabel, java.io.InputStream inputStream, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateClob(int columnIndex, java.io.Reader reader, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateClob(String columnLabel, java.io.Reader reader, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNClob(int columnIndex, java.io.Reader reader, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNClob(String columnLabel, java.io.Reader reader, long length) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNCharacterStream(int columnIndex, java.io.Reader x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNCharacterStream(String columnLabel, java.io.Reader reader) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateAsciiStream(int columnIndex, java.io.InputStream x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBinaryStream(int columnIndex, java.io.InputStream x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateCharacterStream(int columnIndex, java.io.Reader x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateAsciiStream(String columnLabel, java.io.InputStream x) { - throw new 
UnsupportedOperationException(); - } - - @Override - public void updateBinaryStream(String columnLabel, java.io.InputStream x) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateCharacterStream(String columnLabel, java.io.Reader reader) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBlob(int columnIndex, java.io.InputStream inputStream) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateBlob(String columnLabel, java.io.InputStream inputStream) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateClob(int columnIndex, java.io.Reader reader) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateClob(String columnLabel, java.io.Reader reader) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNClob(int columnIndex, java.io.Reader reader) { - throw new UnsupportedOperationException(); - } - - @Override - public void updateNClob(String columnLabel, java.io.Reader reader) { - throw new UnsupportedOperationException(); - } - - @Override - public T getObject(String columnLabel, Class type) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean next() { - throw new UnsupportedOperationException(); - } - - @Override - public T unwrap(Class iface) { - throw new UnsupportedOperationException(); - } - - @Override - public boolean isWrapperFor(Class iface) { - return false; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/Transactor.java b/foundations-jdbc/src/java/dev/typr/foundations/Transactor.java deleted file mode 100644 index 9967d9a96f..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/Transactor.java +++ /dev/null @@ -1,172 +0,0 @@ -package dev.typr.foundations; - -import java.sql.Connection; -import java.sql.SQLException; -import java.util.function.Consumer; - -/** - * A thin wrapper around a source of database 
connections and a strategy for managing transactions. - * - *

Inspired by doobie's Transactor, this class provides a clean way to manage database - * connections with configurable lifecycle hooks for transaction management. - * - *

Typically obtained via {@link dev.typr.foundations.connect.ConnectionSource#transactor}: - * - *

{@code
- * var ds = SimpleDataSource.create(config, settings);
- * var tx = ds.transactor(Transactor.testStrategy());
- * tx.execute(conn -> repo.selectAll(conn));
- * }
- */ -public record Transactor(SqlSupplier connect, Strategy strategy) { - - /** - * Execute an operation with full strategy lifecycle. - * - * @param the result type - * @param operation the operation to execute with a connection - * @return the operation result - * @throws SQLException if a database error occurs - */ - public T execute(SqlFunction operation) throws SQLException { - Connection conn = connect.get(); - try { - strategy.before().apply(conn); - T result = operation.apply(conn); - strategy.after().apply(conn); - return result; - } catch (SQLException | RuntimeException e) { - strategy.oops().accept(e); - throw e; - } finally { - strategy.always().apply(conn); - } - } - - /** - * Execute an Operation with full strategy lifecycle. - * - * @param the result type - * @param op the Operation to execute - * @return the operation result - * @throws SQLException if a database error occurs - */ - public T execute(Operation op) throws SQLException { - return execute(op::run); - } - - /** - * Execute a void operation with full strategy lifecycle. - * - * @param operation the operation to execute with a connection - * @throws SQLException if a database error occurs - */ - public void executeVoid(SqlConsumer operation) throws SQLException { - execute( - conn -> { - operation.apply(conn); - return null; - }); - } - - /** - * Default strategy: manual transactions with commit on success, close always. - * - *
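Annotation for reviewers: the hook ordering that `Transactor.execute` guarantees (before, operation, after on success; oops on failure; always in a finally block) can be checked without a database. A minimal sketch follows — the `Strategy` shape mirrors the record being deleted above, but the string log, the `run` helper, and the swallowed exception are illustrative stand-ins only (the real `execute` rethrows after `oops`):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Illustrative re-creation of Transactor.execute's hook ordering.
// Hooks log strings instead of touching a java.sql.Connection.
public class StrategyLifecycleSketch {
  record Strategy(
      Consumer<List<String>> before,   // setAutoCommit(false) in the real thing
      Consumer<List<String>> after,    // commit()
      Consumer<List<String>> oops,     // rollback() / no-op
      Consumer<List<String>> always) {} // close()

  /** Runs a fake operation under a logging strategy; returns the hook-invocation log. */
  public static List<String> run(boolean fail) {
    Strategy s = new Strategy(
        log -> log.add("before"),
        log -> log.add("after"),
        log -> log.add("oops"),
        log -> log.add("always"));
    List<String> log = new ArrayList<>();
    try {
      s.before().accept(log);
      if (fail) throw new RuntimeException("boom");
      log.add("operation");
      s.after().accept(log);
    } catch (RuntimeException e) {
      s.oops().accept(log); // real execute() rethrows here; swallowed for the demo
    } finally {
      s.always().accept(log);
    }
    return log;
  }
}
```

On success the log reads before/operation/after/always; on failure, before/oops/always — which is exactly why `always` is the right place for `close()`.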

Behavior: - * - *

    - *
  • before: setAutoCommit(false) - *
  • after: commit() - *
  • oops: no-op (caller handles exceptions) - *
  • always: close() - *
- * - * @return a strategy for manual transaction management - */ - public static Strategy defaultStrategy() { - return new Strategy( - conn -> conn.setAutoCommit(false), Connection::commit, t -> {}, Connection::close); - } - - /** - * Strategy for auto-commit mode (no manual transactions). - * - *

Behavior: - * - *

    - *
  • before: no-op - *
  • after: no-op - *
  • oops: no-op - *
  • always: close() - *
- * - * @return a strategy for auto-commit mode - */ - public static Strategy autoCommitStrategy() { - return new Strategy(conn -> {}, conn -> {}, t -> {}, Connection::close); - } - - /** - * Strategy with rollback on error. - * - *

Behavior: - * - *

    - *
  • before: setAutoCommit(false) - *
  • after: commit() - *
  • oops: rollback() (silently ignores rollback failures) - *
  • always: close() - *
- * - * @return a strategy that rolls back on error - */ - public static Strategy rollbackOnErrorStrategy() { - return new Strategy( - conn -> conn.setAutoCommit(false), - Connection::commit, - t -> {}, - conn -> { - try { - if (!conn.getAutoCommit() && !conn.isClosed()) { - conn.rollback(); - } - } catch (SQLException ignored) { - } - conn.close(); - }); - } - - /** - * Strategy for testing: always rollback instead of commit. - * - *

Behavior: - * - *

    - *
  • before: setAutoCommit(false) - *
  • after: rollback() (instead of commit, to keep test data isolated) - *
  • oops: no-op (caller handles exceptions) - *
  • always: close() - *
- * - * @return a strategy for testing that always rolls back - */ - public static Strategy testStrategy() { - return new Strategy( - conn -> conn.setAutoCommit(false), Connection::rollback, t -> {}, Connection::close); - } - - /** - * Data type representing the common setup, error-handling, and cleanup strategy associated with - * an SQL transaction. A `Transactor` uses a `Strategy` to wrap programs prior to execution. - * - * @param before a program to prepare the connection for use - * @param after a program to run on success - * @param oops a program to run on failure (catch) - * @param always a program to run in all cases (finally) - */ - public record Strategy( - SqlConsumer before, - SqlConsumer after, - Consumer oops, - SqlConsumer always) {} -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/ConnectionSettings.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/ConnectionSettings.java deleted file mode 100644 index 265d045de8..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/ConnectionSettings.java +++ /dev/null @@ -1,122 +0,0 @@ -package dev.typr.foundations.connect; - -/** - * Settings applied to database connections. These settings are common whether using a connection - * pool (HikariCP) or plain DriverManager connections. - * - *

Pass to {@link SimpleDataSource#create} or {@code PooledDataSource.create} to configure - * connection behavior. - * - *

Example usage: - * - *

{@code
- * var settings = ConnectionSettings.builder()
- *     .transactionIsolation(TransactionIsolation.READ_UNCOMMITTED)
- *     .readOnly(true)
- *     .build();
- *
- * var ds = SimpleDataSource.create(config, settings);
- * var tx = ds.transactor(Transactor.testStrategy());
- * }
- */ -public record ConnectionSettings( - TransactionIsolation transactionIsolation, - Boolean autoCommit, - Boolean readOnly, - String catalog, - String schema, - String connectionInitSql) { - - /** Empty settings - use driver defaults for everything. */ - public static final ConnectionSettings EMPTY = - new ConnectionSettings(null, null, null, null, null, null); - - /** Create a builder for ConnectionSettings. */ - public static Builder builder() { - return new Builder(); - } - - /** Builder for ConnectionSettings with fluent methods. */ - public static final class Builder { - private TransactionIsolation transactionIsolation = null; - private Boolean autoCommit = null; - private Boolean readOnly = null; - private String catalog = null; - private String schema = null; - private String connectionInitSql = null; - - private Builder() {} - - /** - * Set the transaction isolation level. Default: null (driver default). - * - * @param transactionIsolation isolation level - * @return this builder - */ - public Builder transactionIsolation(TransactionIsolation transactionIsolation) { - this.transactionIsolation = transactionIsolation; - return this; - } - - /** - * Set the auto-commit mode. Default: null (driver default, usually true). - * - * @param autoCommit auto-commit mode - * @return this builder - */ - public Builder autoCommit(boolean autoCommit) { - this.autoCommit = autoCommit; - return this; - } - - /** - * Set the read-only mode. Default: null (driver default, usually false). - * - * @param readOnly read-only mode - * @return this builder - */ - public Builder readOnly(boolean readOnly) { - this.readOnly = readOnly; - return this; - } - - /** - * Set the catalog. Default: null (driver default). - * - * @param catalog catalog name - * @return this builder - */ - public Builder catalog(String catalog) { - this.catalog = catalog; - return this; - } - - /** - * Set the schema. Default: null (driver default). 
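Annotation: the convention throughout `ConnectionSettings` is that a null field means "leave the driver default alone", and appliers skip null fields entirely. A reduced sketch of that convention — `SettingsSketch`, its two fields, and the `applied` helper are illustrative, not the published record:

```java
import java.util.ArrayList;
import java.util.List;

// "Null means driver default": unset builder fields stay null,
// and the applier emits nothing for them.
public class SettingsSketch {
  public record Settings(Boolean readOnly, String schema) {
    public static Builder builder() { return new Builder(); }
  }

  public static final class Builder {
    private Boolean readOnly; // stays null unless set
    private String schema;

    public Builder readOnly(boolean v) { this.readOnly = v; return this; }
    public Builder schema(String v) { this.schema = v; return this; }
    public Settings build() { return new Settings(readOnly, schema); }
  }

  /** Lists which setters would actually be invoked on a connection. */
  public static List<String> applied(Settings s) {
    List<String> out = new ArrayList<>();
    if (s.readOnly() != null) out.add("setReadOnly(" + s.readOnly() + ")");
    if (s.schema() != null) out.add("setSchema(" + s.schema() + ")");
    return out;
  }
}
```

Boxed `Boolean` rather than `boolean` is the load-bearing choice here: it gives every setting a genuine "unset" state.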
- * - * @param schema schema name - * @return this builder - */ - public Builder schema(String schema) { - this.schema = schema; - return this; - } - - /** - * SQL to execute when a connection is created. Default: null. - * - * @param connectionInitSql initialization SQL - * @return this builder - */ - public Builder connectionInitSql(String connectionInitSql) { - this.connectionInitSql = connectionInitSql; - return this; - } - - /** Build the ConnectionSettings. */ - public ConnectionSettings build() { - return new ConnectionSettings( - transactionIsolation, autoCommit, readOnly, catalog, schema, connectionInitSql); - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/ConnectionSource.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/ConnectionSource.java deleted file mode 100644 index 0ad97af4aa..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/ConnectionSource.java +++ /dev/null @@ -1,72 +0,0 @@ -package dev.typr.foundations.connect; - -import dev.typr.foundations.Transactor; -import dev.typr.foundations.Transactor.Strategy; -import java.sql.Connection; -import java.sql.SQLException; - -/** - * A source of database connections with configured settings. - * - *

This interface abstracts over pooled and non-pooled connection sources, providing a unified - * API for obtaining connections and transactors. - * - *

Implementations: - * - *

    - *
  • {@link SimpleDataSource} - Non-pooled connections via DriverManager - *
  • {@code PooledDataSource} - Pooled connections via HikariCP (in foundations-jdbc-hikari) - *
- * - *

Example usage: - * - *

{@code
- * // Create a connection source (pooled or non-pooled)
- * var ds = SimpleDataSource.create(
- *     PostgresConfig.builder("localhost", 5432, "mydb", "user", "pass").build(),
- *     ConnectionSettings.builder()
- *         .transactionIsolation(TransactionIsolation.READ_UNCOMMITTED)
- *         .build());
- *
- * // Get a transactor
- * var tx = ds.transactor(Transactor.testStrategy());
- * tx.execute(conn -> repo.selectAll(conn));
- * }
- */ -public interface ConnectionSource { - - /** - * Get a connection from this source. - * - * @return a configured database connection - * @throws SQLException if unable to get a connection - */ - Connection getConnection() throws SQLException; - - /** - * Create a Transactor with the default strategy (manual transactions with commit on success). - * - * @return a Transactor configured for manual transaction management - */ - default Transactor transactor() { - return transactor(Transactor.defaultStrategy()); - } - - /** - * Create a Transactor with a custom strategy. - * - * @param strategy the transaction management strategy - * @return a Transactor configured with the provided strategy - */ - default Transactor transactor(Strategy strategy) { - return new Transactor(this::getConnectionUnchecked, strategy); - } - - private Connection getConnectionUnchecked() { - try { - return getConnection(); - } catch (SQLException e) { - throw new RuntimeException("Failed to get connection", e); - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/DatabaseConfig.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/DatabaseConfig.java deleted file mode 100644 index 03eb6c0d27..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/DatabaseConfig.java +++ /dev/null @@ -1,97 +0,0 @@ -package dev.typr.foundations.connect; - -import dev.typr.foundations.Transactor; -import java.util.Map; - -/** - * Configuration for connecting to a database. Implemented by database-specific config classes - * (PostgresConfig, MariaDbConfig, SqlServerConfig, etc.). - * - *

Each implementation provides typed builder methods for all documented JDBC driver properties. - * - *

Example: - * - *

{@code
- * var config = PostgresConfig.builder("localhost", 5432, "mydb", "user", "pass")
- *     .sslmode(PgSslMode.REQUIRE)
- *     .build();
- *
- * // Quick shortcut for scripts/tests
- * var tx = config.transactor(Transactor.testStrategy());
- *
- * // Or with connection settings
- * var tx = config.transactor(
- *     ConnectionSettings.builder()
- *         .transactionIsolation(TransactionIsolation.READ_UNCOMMITTED)
- *         .build(),
- *     Transactor.testStrategy());
- * }
- */ -public interface DatabaseConfig { - - /** Get the JDBC URL for this database configuration. */ - String jdbcUrl(); - - /** Get the username for authentication. */ - String username(); - - /** Get the password for authentication. */ - String password(); - - /** Get the database kind (POSTGRESQL, MARIADB, etc.). */ - DatabaseKind kind(); - - /** - * Get all driver-specific properties (excluding user/password which are handled separately). - * These are passed to the JDBC driver via DataSource properties or connection URL parameters. - */ - Map driverProperties(); - - /** - * Create a non-pooled Transactor with the default strategy. - * - *

Shortcut for {@code SimpleDataSource.create(this).transactor()}. - * - * @return a Transactor using non-pooled connections - */ - default Transactor transactor() { - return SimpleDataSource.create(this).transactor(); - } - - /** - * Create a non-pooled Transactor with a custom strategy. - * - *

Shortcut for {@code SimpleDataSource.create(this).transactor(strategy)}. - * - * @param strategy the transaction strategy - * @return a Transactor using non-pooled connections - */ - default Transactor transactor(Transactor.Strategy strategy) { - return SimpleDataSource.create(this).transactor(strategy); - } - - /** - * Create a non-pooled Transactor with connection settings and the default strategy. - * - *

Shortcut for {@code SimpleDataSource.create(this, settings).transactor()}. - * - * @param settings connection settings - * @return a Transactor using non-pooled connections - */ - default Transactor transactor(ConnectionSettings settings) { - return SimpleDataSource.create(this, settings).transactor(); - } - - /** - * Create a non-pooled Transactor with connection settings and a custom strategy. - * - *

Shortcut for {@code SimpleDataSource.create(this, settings).transactor(strategy)}. - * - * @param settings connection settings - * @param strategy the transaction strategy - * @return a Transactor using non-pooled connections - */ - default Transactor transactor(ConnectionSettings settings, Transactor.Strategy strategy) { - return SimpleDataSource.create(this, settings).transactor(strategy); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/DatabaseKind.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/DatabaseKind.java deleted file mode 100644 index f1829c6850..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/DatabaseKind.java +++ /dev/null @@ -1,84 +0,0 @@ -package dev.typr.foundations.connect; - -import java.sql.Connection; -import java.sql.SQLException; - -/** - * Enumeration of supported database systems. Used for detecting the database type from a connection - * and for routing to database-specific code. - */ -public enum DatabaseKind { - POSTGRESQL, - MARIADB, - DUCKDB, - ORACLE, - SQLSERVER, - DB2; - - /** - * Detect the database kind from an open connection by examining the database product name. - * - * @param conn an open database connection - * @return the detected DatabaseKind - * @throws SQLException if unable to get metadata - * @throws IllegalArgumentException if the database is not recognized - */ - public static DatabaseKind detect(Connection conn) throws SQLException { - String productName = conn.getMetaData().getDatabaseProductName().toLowerCase(); - return fromProductName(productName); - } - - /** - * Detect the database kind from an open connection by examining the driver name. This is an - * alternative to {@link #detect(Connection)} when the product name is not reliable. 
- * - * @param conn an open database connection - * @return the detected DatabaseKind - * @throws SQLException if unable to get metadata - * @throws IllegalArgumentException if the database is not recognized - */ - public static DatabaseKind detectFromDriver(Connection conn) throws SQLException { - String driverName = conn.getMetaData().getDriverName().toLowerCase(); - return fromDriverName(driverName); - } - - private static DatabaseKind fromProductName(String productName) { - if (productName.contains("postgresql")) { - return POSTGRESQL; - } else if (productName.contains("mariadb")) { - return MARIADB; - } else if (productName.contains("mysql")) { - return MARIADB; - } else if (productName.contains("duckdb")) { - return DUCKDB; - } else if (productName.contains("oracle")) { - return ORACLE; - } else if (productName.contains("microsoft sql server") || productName.contains("sql server")) { - return SQLSERVER; - } else if (productName.contains("db2") || productName.contains("ibm data server")) { - return DB2; - } else { - throw new IllegalArgumentException("Unsupported database: " + productName); - } - } - - private static DatabaseKind fromDriverName(String driverName) { - if (driverName.contains("postgresql")) { - return POSTGRESQL; - } else if (driverName.contains("mariadb")) { - return MARIADB; - } else if (driverName.contains("mysql")) { - return MARIADB; - } else if (driverName.contains("duckdb")) { - return DUCKDB; - } else if (driverName.contains("oracle")) { - return ORACLE; - } else if (driverName.contains("sqlserver") || driverName.contains("sql server")) { - return SQLSERVER; - } else if (driverName.contains("db2") || driverName.contains("jcc")) { - return DB2; - } else { - throw new IllegalArgumentException("Unsupported driver: " + driverName); - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/SimpleDataSource.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/SimpleDataSource.java deleted file mode 100644 index 
2cc7d04191..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/SimpleDataSource.java +++ /dev/null @@ -1,103 +0,0 @@ -package dev.typr.foundations.connect; - -import java.sql.Connection; -import java.sql.DriverManager; -import java.sql.SQLException; -import java.sql.Statement; -import java.util.Properties; - -/** - * A simple non-pooled connection source using DriverManager. - * - *
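Annotation on the `DatabaseKind` detection just deleted above: matching is substring-based on the lowercased JDBC product name, and MySQL deliberately routes to the MariaDB code paths. A standalone sketch of those rules — `kindOf` over plain strings is an illustrative helper, not the enum's API:

```java
// Mirrors DatabaseKind.fromProductName's matching rules,
// including the MySQL -> MARIADB fallback.
public class KindSketch {
  public static String kindOf(String databaseProductName) {
    String p = databaseProductName.toLowerCase();
    if (p.contains("postgresql")) return "POSTGRESQL";
    if (p.contains("mariadb") || p.contains("mysql")) return "MARIADB"; // MySQL uses MariaDB paths
    if (p.contains("duckdb")) return "DUCKDB";
    if (p.contains("oracle")) return "ORACLE";
    if (p.contains("sql server")) return "SQLSERVER";
    if (p.contains("db2") || p.contains("ibm data server")) return "DB2";
    throw new IllegalArgumentException("Unsupported database: " + p);
  }
}
```

Throwing on unknown names (rather than returning a default) keeps misrouted dialect logic from failing silently later.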

Suitable for scripts, tests, or low-volume use cases. For production use with connection - * pooling, use {@code PooledDataSource} from the foundations-jdbc-hikari module. - * - *

Example usage: - * - *

{@code
- * var ds = SimpleDataSource.create(
- *     PostgresConfig.builder("localhost", 5432, "mydb", "user", "pass").build(),
- *     ConnectionSettings.builder()
- *         .transactionIsolation(TransactionIsolation.READ_UNCOMMITTED)
- *         .build());
- *
- * var tx = ds.transactor(Transactor.testStrategy());
- * tx.execute(conn -> repo.selectAll(conn));
- * }
- */ -public final class SimpleDataSource implements ConnectionSource { - - private final DatabaseConfig config; - private final ConnectionSettings settings; - - private SimpleDataSource(DatabaseConfig config, ConnectionSettings settings) { - this.config = config; - this.settings = settings; - } - - /** - * Create a SimpleDataSource with connection settings. - * - * @param config database configuration - * @param settings connection settings to apply - * @return a new SimpleDataSource - */ - public static SimpleDataSource create(DatabaseConfig config, ConnectionSettings settings) { - return new SimpleDataSource(config, settings); - } - - /** - * Create a SimpleDataSource with default settings. - * - * @param config database configuration - * @return a new SimpleDataSource with driver defaults - */ - public static SimpleDataSource create(DatabaseConfig config) { - return new SimpleDataSource(config, ConnectionSettings.EMPTY); - } - - @Override - public Connection getConnection() throws SQLException { - Properties props = new Properties(); - props.setProperty("user", config.username()); - props.setProperty("password", config.password()); - config.driverProperties().forEach(props::setProperty); - - Connection conn = DriverManager.getConnection(config.jdbcUrl(), props); - applySettings(conn); - return conn; - } - - private void applySettings(Connection conn) throws SQLException { - if (settings.transactionIsolation() != null) { - conn.setTransactionIsolation(settings.transactionIsolation().jdbcLevel()); - } - if (settings.autoCommit() != null) { - conn.setAutoCommit(settings.autoCommit()); - } - if (settings.readOnly() != null) { - conn.setReadOnly(settings.readOnly()); - } - if (settings.catalog() != null) { - conn.setCatalog(settings.catalog()); - } - if (settings.schema() != null) { - conn.setSchema(settings.schema()); - } - if (settings.connectionInitSql() != null) { - try (Statement stmt = conn.createStatement()) { - stmt.execute(settings.connectionInitSql()); - } 
- } - } - - /** Get the database configuration. */ - public DatabaseConfig config() { - return config; - } - - /** Get the connection settings. */ - public ConnectionSettings settings() { - return settings; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/TransactionIsolation.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/TransactionIsolation.java deleted file mode 100644 index d7c4a6c5a1..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/TransactionIsolation.java +++ /dev/null @@ -1,68 +0,0 @@ -package dev.typr.foundations.connect; - -import java.sql.Connection; - -/** - * Transaction isolation levels for database connections. - * - *
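Annotation: `SimpleDataSource.getConnection` assembles a `java.util.Properties` from credentials plus the config's driver properties before handing it to `DriverManager`. That assembly step in isolation — `buildProps` is an illustrative helper; the real code passes the result to `DriverManager.getConnection(config.jdbcUrl(), props)`:

```java
import java.util.Map;
import java.util.Properties;

// Credentials first, then every driver-specific key/value,
// exactly as SimpleDataSource.getConnection does.
public class PropsSketch {
  public static Properties buildProps(String user, String password, Map<String, String> driverProps) {
    Properties props = new Properties();
    props.setProperty("user", user);
    props.setProperty("password", password);
    driverProps.forEach(props::setProperty);
    return props;
  }
}
```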

Maps to JDBC Connection constants with database-specific names for use with HikariCP and other - * connection pools. - */ -public enum TransactionIsolation { - /** Read uncommitted - allows dirty reads. Lowest isolation level. */ - READ_UNCOMMITTED(Connection.TRANSACTION_READ_UNCOMMITTED, "TRANSACTION_READ_UNCOMMITTED"), - - /** Read committed - prevents dirty reads. Default for most databases. */ - READ_COMMITTED(Connection.TRANSACTION_READ_COMMITTED, "TRANSACTION_READ_COMMITTED"), - - /** Repeatable read - prevents non-repeatable reads. */ - REPEATABLE_READ(Connection.TRANSACTION_REPEATABLE_READ, "TRANSACTION_REPEATABLE_READ"), - - /** Serializable - highest isolation, prevents phantom reads. */ - SERIALIZABLE(Connection.TRANSACTION_SERIALIZABLE, "TRANSACTION_SERIALIZABLE"), - - /** No transactions - use for auto-commit mode. */ - NONE(Connection.TRANSACTION_NONE, "TRANSACTION_NONE"); - - private final int jdbcLevel; - private final String jdbcName; - - TransactionIsolation(int jdbcLevel, String jdbcName) { - this.jdbcLevel = jdbcLevel; - this.jdbcName = jdbcName; - } - - /** - * Get the JDBC isolation level constant. - * - * @return JDBC level from java.sql.Connection - */ - public int jdbcLevel() { - return jdbcLevel; - } - - /** - * Get the JDBC name for HikariCP configuration. - * - * @return JDBC constant name (e.g., "TRANSACTION_READ_COMMITTED") - */ - public String jdbcName() { - return jdbcName; - } - - /** - * Convert from JDBC isolation level. 
- * - * @param jdbcLevel JDBC level constant - * @return matching TransactionIsolation - * @throws IllegalArgumentException if level is unknown - */ - public static TransactionIsolation fromJdbcLevel(int jdbcLevel) { - for (TransactionIsolation isolation : values()) { - if (isolation.jdbcLevel == jdbcLevel) { - return isolation; - } - } - throw new IllegalArgumentException("Unknown JDBC isolation level: " + jdbcLevel); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/db2/Db2Config.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/db2/Db2Config.java deleted file mode 100644 index f7de96a6b5..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/db2/Db2Config.java +++ /dev/null @@ -1,1414 +0,0 @@ -package dev.typr.foundations.connect.db2; - -import dev.typr.foundations.connect.DatabaseConfig; -import dev.typr.foundations.connect.DatabaseKind; -import java.util.HashMap; -import java.util.Map; - -/** - * DB2 database configuration with typed builder methods for all documented JDBC driver properties. - * - *
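Annotation on the `TransactionIsolation` enum deleted above: each constant carries the `java.sql.Connection` int level, and `fromJdbcLevel` is a linear reverse lookup over `values()`. A reduced sketch with two constants (the real enum carries all five plus the HikariCP-facing name strings):

```java
import java.sql.Connection;

// Two-way mapping between an isolation enum and the JDBC int constants,
// mirroring TransactionIsolation's jdbcLevel()/fromJdbcLevel pair.
public enum IsolationSketch {
  READ_COMMITTED(Connection.TRANSACTION_READ_COMMITTED),
  SERIALIZABLE(Connection.TRANSACTION_SERIALIZABLE);

  private final int jdbcLevel;

  IsolationSketch(int jdbcLevel) { this.jdbcLevel = jdbcLevel; }

  public int jdbcLevel() { return jdbcLevel; }

  /** Linear reverse lookup; throws on levels no constant claims. */
  public static IsolationSketch fromJdbcLevel(int level) {
    for (IsolationSketch i : values()) {
      if (i.jdbcLevel == level) return i;
    }
    throw new IllegalArgumentException("Unknown JDBC isolation level: " + level);
  }
}
```

The lookup is O(n) over five constants, which is fine; the exception keeps a bogus level from silently mapping to a default.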

Properties are based on the IBM Data Server Driver for JDBC documentation. - * - * @see IBM - * JDBC Documentation - */ -public final class Db2Config implements DatabaseConfig { - - private final String host; - private final int port; - private final String database; - private final String username; - private final String password; - - // Connection properties - private final String currentSchema; - private final String currentSQLID; - private final Integer loginTimeout; - private final Integer commandTimeout; - private final String clientApplicationInformation; - private final String clientAccountingInformation; - private final String clientProgramId; - private final String clientUser; - private final String clientWorkstation; - - // Performance properties - private final Integer blockingReadConnectionTimeout; - private final Boolean fullyMaterializeLobData; - private final Boolean fullyMaterializeInputStreams; - private final Boolean progressiveStreaming; - private final Integer fetchSize; - private final Integer queryDataSize; - private final Boolean deferPrepares; - private final Boolean enableNamedParameterMarkers; - private final Boolean enableSeamlessFailover; - private final Integer keepDynamic; - private final Boolean resultSetHoldability; - private final Integer queryTimeoutInterruptProcessingMode; - private final Boolean useJDBC4ColumnNameAndLabelSemantics; - - // Error handling properties - private final Boolean retrieveMessagesFromServerOnGetMessage; - private final Integer readTimeout; - private final Boolean atomicMultiRowInsert; - private final Boolean returnAlias; - - // SSL/TLS properties - private final String sslConnection; - private final String sslTrustStoreLocation; - private final String sslTrustStorePassword; - private final String sslKeyStoreLocation; - private final String sslKeyStorePassword; - private final String sslCipherSuites; - - // Security properties - private final Integer securityMechanism; - private final String 
kerberosServerPrincipal; - private final String gssCredential; - private final Boolean encryptionAlgorithm; - private final String pkList; - private final String pluginName; - private final Boolean sendDataAsIs; - - // LOB properties - private final Integer streamBufferSize; - private final Boolean fullyMaterializeBlobData; - private final Boolean fullyMaterializeClobData; - private final Integer maxRetriesForClientReroute; - private final Integer retryIntervalForClientReroute; - - // Statement properties - private final Boolean allowNextOnExhaustedResultSet; - private final String cursorSensitivity; - private final Integer cursorHold; - private final Boolean emulateParameterMetaDataForZCalls; - private final Integer resultSetHoldabilityForCatalogQueries; - private final Integer queryCloseImplicit; - private final Boolean sendCharInputsUTF8; - private final String timestampFormat; - private final String timestampOutputType; - private final String dateFormat; - private final String timeFormat; - - // Tracing/Logging properties - private final Boolean traceFile; - private final Integer traceLevel; - private final Boolean traceDirectory; - private final Boolean logWriter; - - // XA properties - private final Integer xaNetworkOptimization; - private final Boolean downgradeHoldCursorsUnderXa; - - // Compatibility properties - private final Boolean jdbcCollection; - private final String currentPackagePath; - private final String currentPackageSet; - private final Boolean enableClientAffinitiesList; - private final String clientRerouteAlternateServerName; - private final String clientRerouteAlternatePortNumber; - - // Advanced connection properties - private final Integer memberConnectTimeout; - private final String sysSchema; - private final Boolean affinityFailbackInterval; - private final Boolean enableSysplexWLB; - private final Integer maxTransportObjects; - private final String databaseName; - private final Boolean decimalSeparator; - private final Integer 
decimalStringFormat; - private final Integer clientDebugInfo; - - // Escape hatch - private final Map extraProperties; - - private Db2Config(Builder b) { - this.host = b.host; - this.port = b.port; - this.database = b.database; - this.username = b.username; - this.password = b.password; - - // Connection - this.currentSchema = b.currentSchema; - this.currentSQLID = b.currentSQLID; - this.loginTimeout = b.loginTimeout; - this.commandTimeout = b.commandTimeout; - this.clientApplicationInformation = b.clientApplicationInformation; - this.clientAccountingInformation = b.clientAccountingInformation; - this.clientProgramId = b.clientProgramId; - this.clientUser = b.clientUser; - this.clientWorkstation = b.clientWorkstation; - - // Performance - this.blockingReadConnectionTimeout = b.blockingReadConnectionTimeout; - this.fullyMaterializeLobData = b.fullyMaterializeLobData; - this.fullyMaterializeInputStreams = b.fullyMaterializeInputStreams; - this.progressiveStreaming = b.progressiveStreaming; - this.fetchSize = b.fetchSize; - this.queryDataSize = b.queryDataSize; - this.deferPrepares = b.deferPrepares; - this.enableNamedParameterMarkers = b.enableNamedParameterMarkers; - this.enableSeamlessFailover = b.enableSeamlessFailover; - this.keepDynamic = b.keepDynamic; - this.resultSetHoldability = b.resultSetHoldability; - this.queryTimeoutInterruptProcessingMode = b.queryTimeoutInterruptProcessingMode; - this.useJDBC4ColumnNameAndLabelSemantics = b.useJDBC4ColumnNameAndLabelSemantics; - - // Error handling - this.retrieveMessagesFromServerOnGetMessage = b.retrieveMessagesFromServerOnGetMessage; - this.readTimeout = b.readTimeout; - this.atomicMultiRowInsert = b.atomicMultiRowInsert; - this.returnAlias = b.returnAlias; - - // SSL/TLS - this.sslConnection = b.sslConnection; - this.sslTrustStoreLocation = b.sslTrustStoreLocation; - this.sslTrustStorePassword = b.sslTrustStorePassword; - this.sslKeyStoreLocation = b.sslKeyStoreLocation; - this.sslKeyStorePassword = 
b.sslKeyStorePassword; - this.sslCipherSuites = b.sslCipherSuites; - - // Security - this.securityMechanism = b.securityMechanism; - this.kerberosServerPrincipal = b.kerberosServerPrincipal; - this.gssCredential = b.gssCredential; - this.encryptionAlgorithm = b.encryptionAlgorithm; - this.pkList = b.pkList; - this.pluginName = b.pluginName; - this.sendDataAsIs = b.sendDataAsIs; - - // LOB - this.streamBufferSize = b.streamBufferSize; - this.fullyMaterializeBlobData = b.fullyMaterializeBlobData; - this.fullyMaterializeClobData = b.fullyMaterializeClobData; - this.maxRetriesForClientReroute = b.maxRetriesForClientReroute; - this.retryIntervalForClientReroute = b.retryIntervalForClientReroute; - - // Statement - this.allowNextOnExhaustedResultSet = b.allowNextOnExhaustedResultSet; - this.cursorSensitivity = b.cursorSensitivity; - this.cursorHold = b.cursorHold; - this.emulateParameterMetaDataForZCalls = b.emulateParameterMetaDataForZCalls; - this.resultSetHoldabilityForCatalogQueries = b.resultSetHoldabilityForCatalogQueries; - this.queryCloseImplicit = b.queryCloseImplicit; - this.sendCharInputsUTF8 = b.sendCharInputsUTF8; - this.timestampFormat = b.timestampFormat; - this.timestampOutputType = b.timestampOutputType; - this.dateFormat = b.dateFormat; - this.timeFormat = b.timeFormat; - - // Tracing - this.traceFile = b.traceFile; - this.traceLevel = b.traceLevel; - this.traceDirectory = b.traceDirectory; - this.logWriter = b.logWriter; - - // XA - this.xaNetworkOptimization = b.xaNetworkOptimization; - this.downgradeHoldCursorsUnderXa = b.downgradeHoldCursorsUnderXa; - - // Compatibility - this.jdbcCollection = b.jdbcCollection; - this.currentPackagePath = b.currentPackagePath; - this.currentPackageSet = b.currentPackageSet; - this.enableClientAffinitiesList = b.enableClientAffinitiesList; - this.clientRerouteAlternateServerName = b.clientRerouteAlternateServerName; - this.clientRerouteAlternatePortNumber = b.clientRerouteAlternatePortNumber; - - // Advanced 
connection - this.memberConnectTimeout = b.memberConnectTimeout; - this.sysSchema = b.sysSchema; - this.affinityFailbackInterval = b.affinityFailbackInterval; - this.enableSysplexWLB = b.enableSysplexWLB; - this.maxTransportObjects = b.maxTransportObjects; - this.databaseName = b.databaseName; - this.decimalSeparator = b.decimalSeparator; - this.decimalStringFormat = b.decimalStringFormat; - this.clientDebugInfo = b.clientDebugInfo; - - this.extraProperties = Map.copyOf(b.extraProperties); - } - - /** - * Create a new builder with required connection parameters. - * - * @param host DB2 server hostname - * @param port DB2 server port (typically 50000) - * @param database database name - * @param username username for authentication - * @param password password for authentication - * @return a new builder - */ - public static Builder builder( - String host, int port, String database, String username, String password) { - return new Builder(host, port, database, username, password); - } - - @Override - public String jdbcUrl() { - return "jdbc:db2://" + host + ":" + port + "/" + database; - } - - @Override - public String username() { - return username; - } - - @Override - public String password() { - return password; - } - - @Override - public DatabaseKind kind() { - return DatabaseKind.DB2; - } - - @Override - public Map<String, String> driverProperties() { - Map<String, String> props = new HashMap<>(); - - // Connection - if (currentSchema != null) props.put("currentSchema", currentSchema); - if (currentSQLID != null) props.put("currentSQLID", currentSQLID); - if (loginTimeout != null) props.put("loginTimeout", loginTimeout.toString()); - if (commandTimeout != null) props.put("commandTimeout", commandTimeout.toString()); - if (clientApplicationInformation != null) - props.put("clientApplicationInformation", clientApplicationInformation); - if (clientAccountingInformation != null) - props.put("clientAccountingInformation", clientAccountingInformation); - if (clientProgramId != null)
props.put("clientProgramId", clientProgramId); - if (clientUser != null) props.put("clientUser", clientUser); - if (clientWorkstation != null) props.put("clientWorkstation", clientWorkstation); - - // Performance - if (blockingReadConnectionTimeout != null) - props.put("blockingReadConnectionTimeout", blockingReadConnectionTimeout.toString()); - if (fullyMaterializeLobData != null) - props.put("fullyMaterializeLobData", fullyMaterializeLobData.toString()); - if (fullyMaterializeInputStreams != null) - props.put("fullyMaterializeInputStreams", fullyMaterializeInputStreams.toString()); - if (progressiveStreaming != null) - props.put("progressiveStreaming", progressiveStreaming.toString()); - if (fetchSize != null) props.put("fetchSize", fetchSize.toString()); - if (queryDataSize != null) props.put("queryDataSize", queryDataSize.toString()); - if (deferPrepares != null) props.put("deferPrepares", deferPrepares.toString()); - if (enableNamedParameterMarkers != null) - props.put("enableNamedParameterMarkers", enableNamedParameterMarkers.toString()); - if (enableSeamlessFailover != null) - props.put("enableSeamlessFailover", enableSeamlessFailover.toString()); - if (keepDynamic != null) props.put("keepDynamic", keepDynamic.toString()); - if (resultSetHoldability != null) - props.put("resultSetHoldability", resultSetHoldability.toString()); - if (queryTimeoutInterruptProcessingMode != null) - props.put( - "queryTimeoutInterruptProcessingMode", queryTimeoutInterruptProcessingMode.toString()); - if (useJDBC4ColumnNameAndLabelSemantics != null) - props.put( - "useJDBC4ColumnNameAndLabelSemantics", useJDBC4ColumnNameAndLabelSemantics.toString()); - - // Error handling - if (retrieveMessagesFromServerOnGetMessage != null) - props.put( - "retrieveMessagesFromServerOnGetMessage", - retrieveMessagesFromServerOnGetMessage.toString()); - if (readTimeout != null) props.put("readTimeout", readTimeout.toString()); - if (atomicMultiRowInsert != null) - props.put("atomicMultiRowInsert", 
atomicMultiRowInsert.toString()); - if (returnAlias != null) props.put("returnAlias", returnAlias.toString()); - - // SSL/TLS - if (sslConnection != null) props.put("sslConnection", sslConnection); - if (sslTrustStoreLocation != null) props.put("sslTrustStoreLocation", sslTrustStoreLocation); - if (sslTrustStorePassword != null) props.put("sslTrustStorePassword", sslTrustStorePassword); - if (sslKeyStoreLocation != null) props.put("sslKeyStoreLocation", sslKeyStoreLocation); - if (sslKeyStorePassword != null) props.put("sslKeyStorePassword", sslKeyStorePassword); - if (sslCipherSuites != null) props.put("sslCipherSuites", sslCipherSuites); - - // Security - if (securityMechanism != null) props.put("securityMechanism", securityMechanism.toString()); - if (kerberosServerPrincipal != null) - props.put("kerberosServerPrincipal", kerberosServerPrincipal); - if (gssCredential != null) props.put("gssCredential", gssCredential); - if (encryptionAlgorithm != null) - props.put("encryptionAlgorithm", encryptionAlgorithm.toString()); - if (pkList != null) props.put("pkList", pkList); - if (pluginName != null) props.put("pluginName", pluginName); - if (sendDataAsIs != null) props.put("sendDataAsIs", sendDataAsIs.toString()); - - // LOB - if (streamBufferSize != null) props.put("streamBufferSize", streamBufferSize.toString()); - if (fullyMaterializeBlobData != null) - props.put("fullyMaterializeBlobData", fullyMaterializeBlobData.toString()); - if (fullyMaterializeClobData != null) - props.put("fullyMaterializeClobData", fullyMaterializeClobData.toString()); - if (maxRetriesForClientReroute != null) - props.put("maxRetriesForClientReroute", maxRetriesForClientReroute.toString()); - if (retryIntervalForClientReroute != null) - props.put("retryIntervalForClientReroute", retryIntervalForClientReroute.toString()); - - // Statement - if (allowNextOnExhaustedResultSet != null) - props.put("allowNextOnExhaustedResultSet", allowNextOnExhaustedResultSet.toString()); - if 
(cursorSensitivity != null) props.put("cursorSensitivity", cursorSensitivity); - if (cursorHold != null) props.put("cursorHold", cursorHold.toString()); - if (emulateParameterMetaDataForZCalls != null) - props.put("emulateParameterMetaDataForZCalls", emulateParameterMetaDataForZCalls.toString()); - if (resultSetHoldabilityForCatalogQueries != null) - props.put( - "resultSetHoldabilityForCatalogQueries", - resultSetHoldabilityForCatalogQueries.toString()); - if (queryCloseImplicit != null) props.put("queryCloseImplicit", queryCloseImplicit.toString()); - if (sendCharInputsUTF8 != null) props.put("sendCharInputsUTF8", sendCharInputsUTF8.toString()); - if (timestampFormat != null) props.put("timestampFormat", timestampFormat); - if (timestampOutputType != null) props.put("timestampOutputType", timestampOutputType); - if (dateFormat != null) props.put("dateFormat", dateFormat); - if (timeFormat != null) props.put("timeFormat", timeFormat); - - // Tracing - if (traceFile != null) props.put("traceFile", traceFile.toString()); - if (traceLevel != null) props.put("traceLevel", traceLevel.toString()); - if (traceDirectory != null) props.put("traceDirectory", traceDirectory.toString()); - if (logWriter != null) props.put("logWriter", logWriter.toString()); - - // XA - if (xaNetworkOptimization != null) - props.put("xaNetworkOptimization", xaNetworkOptimization.toString()); - if (downgradeHoldCursorsUnderXa != null) - props.put("downgradeHoldCursorsUnderXa", downgradeHoldCursorsUnderXa.toString()); - - // Compatibility - if (jdbcCollection != null) props.put("jdbcCollection", jdbcCollection.toString()); - if (currentPackagePath != null) props.put("currentPackagePath", currentPackagePath); - if (currentPackageSet != null) props.put("currentPackageSet", currentPackageSet); - if (enableClientAffinitiesList != null) - props.put("enableClientAffinitiesList", enableClientAffinitiesList.toString()); - if (clientRerouteAlternateServerName != null) - 
props.put("clientRerouteAlternateServerName", clientRerouteAlternateServerName); - if (clientRerouteAlternatePortNumber != null) - props.put("clientRerouteAlternatePortNumber", clientRerouteAlternatePortNumber); - - // Advanced connection - if (memberConnectTimeout != null) - props.put("memberConnectTimeout", memberConnectTimeout.toString()); - if (sysSchema != null) props.put("sysSchema", sysSchema); - if (affinityFailbackInterval != null) - props.put("affinityFailbackInterval", affinityFailbackInterval.toString()); - if (enableSysplexWLB != null) props.put("enableSysplexWLB", enableSysplexWLB.toString()); - if (maxTransportObjects != null) - props.put("maxTransportObjects", maxTransportObjects.toString()); - if (databaseName != null) props.put("databaseName", databaseName); - if (decimalSeparator != null) props.put("decimalSeparator", decimalSeparator.toString()); - if (decimalStringFormat != null) - props.put("decimalStringFormat", decimalStringFormat.toString()); - if (clientDebugInfo != null) props.put("clientDebugInfo", clientDebugInfo.toString()); - - props.putAll(extraProperties); - return props; - } - - /** Builder for Db2Config with typed methods for all JDBC driver properties. 
*/ - public static final class Builder { - private final String host; - private final int port; - private final String database; - private final String username; - private final String password; - - // Connection - private String currentSchema; - private String currentSQLID; - private Integer loginTimeout; - private Integer commandTimeout; - private String clientApplicationInformation; - private String clientAccountingInformation; - private String clientProgramId; - private String clientUser; - private String clientWorkstation; - - // Performance - private Integer blockingReadConnectionTimeout; - private Boolean fullyMaterializeLobData; - private Boolean fullyMaterializeInputStreams; - private Boolean progressiveStreaming; - private Integer fetchSize; - private Integer queryDataSize; - private Boolean deferPrepares; - private Boolean enableNamedParameterMarkers; - private Boolean enableSeamlessFailover; - private Integer keepDynamic; - private Boolean resultSetHoldability; - private Integer queryTimeoutInterruptProcessingMode; - private Boolean useJDBC4ColumnNameAndLabelSemantics; - - // Error handling - private Boolean retrieveMessagesFromServerOnGetMessage; - private Integer readTimeout; - private Boolean atomicMultiRowInsert; - private Boolean returnAlias; - - // SSL/TLS - private String sslConnection; - private String sslTrustStoreLocation; - private String sslTrustStorePassword; - private String sslKeyStoreLocation; - private String sslKeyStorePassword; - private String sslCipherSuites; - - // Security - private Integer securityMechanism; - private String kerberosServerPrincipal; - private String gssCredential; - private Boolean encryptionAlgorithm; - private String pkList; - private String pluginName; - private Boolean sendDataAsIs; - - // LOB - private Integer streamBufferSize; - private Boolean fullyMaterializeBlobData; - private Boolean fullyMaterializeClobData; - private Integer maxRetriesForClientReroute; - private Integer retryIntervalForClientReroute; 
- - // Statement - private Boolean allowNextOnExhaustedResultSet; - private String cursorSensitivity; - private Integer cursorHold; - private Boolean emulateParameterMetaDataForZCalls; - private Integer resultSetHoldabilityForCatalogQueries; - private Integer queryCloseImplicit; - private Boolean sendCharInputsUTF8; - private String timestampFormat; - private String timestampOutputType; - private String dateFormat; - private String timeFormat; - - // Tracing - private Boolean traceFile; - private Integer traceLevel; - private Boolean traceDirectory; - private Boolean logWriter; - - // XA - private Integer xaNetworkOptimization; - private Boolean downgradeHoldCursorsUnderXa; - - // Compatibility - private Boolean jdbcCollection; - private String currentPackagePath; - private String currentPackageSet; - private Boolean enableClientAffinitiesList; - private String clientRerouteAlternateServerName; - private String clientRerouteAlternatePortNumber; - - // Advanced connection - private Integer memberConnectTimeout; - private String sysSchema; - private Boolean affinityFailbackInterval; - private Boolean enableSysplexWLB; - private Integer maxTransportObjects; - private String databaseName; - private Boolean decimalSeparator; - private Integer decimalStringFormat; - private Integer clientDebugInfo; - - private final Map<String, String> extraProperties = new HashMap<>(); - - private Builder(String host, int port, String database, String username, String password) { - this.host = host; - this.port = port; - this.database = database; - this.username = username; - this.password = password; - - // OUR DEFAULTS (better than driver defaults) - this.retrieveMessagesFromServerOnGetMessage = true; // Gets full error messages from server - } - - // ==================== CONNECTION ==================== - - /** - * Current schema for unqualified names. Driver default: null.
- * - * @param currentSchema schema name - * @return this builder - */ - public Builder currentSchema(String currentSchema) { - this.currentSchema = currentSchema; - return this; - } - - /** - * Current SQL ID for z/OS. Driver default: null. - * - * @param currentSQLID SQL ID - * @return this builder - */ - public Builder currentSQLID(String currentSQLID) { - this.currentSQLID = currentSQLID; - return this; - } - - /** - * Login timeout in seconds. Driver default: 0 (no timeout). - * - * @param loginTimeout timeout in seconds - * @return this builder - */ - public Builder loginTimeout(int loginTimeout) { - this.loginTimeout = loginTimeout; - return this; - } - - /** - * Command timeout in seconds. Driver default: 0 (no timeout). - * - * @param commandTimeout timeout in seconds - * @return this builder - */ - public Builder commandTimeout(int commandTimeout) { - this.commandTimeout = commandTimeout; - return this; - } - - /** - * Client application information for monitoring. Driver default: null. - * - * @param clientApplicationInformation application info - * @return this builder - */ - public Builder clientApplicationInformation(String clientApplicationInformation) { - this.clientApplicationInformation = clientApplicationInformation; - return this; - } - - /** - * Client accounting information. Driver default: null. - * - * @param clientAccountingInformation accounting info - * @return this builder - */ - public Builder clientAccountingInformation(String clientAccountingInformation) { - this.clientAccountingInformation = clientAccountingInformation; - return this; - } - - /** - * Client program ID. Driver default: null. - * - * @param clientProgramId program ID - * @return this builder - */ - public Builder clientProgramId(String clientProgramId) { - this.clientProgramId = clientProgramId; - return this; - } - - /** - * Client user for monitoring. Driver default: null. 
- * - * @param clientUser user name - * @return this builder - */ - public Builder clientUser(String clientUser) { - this.clientUser = clientUser; - return this; - } - - /** - * Client workstation for monitoring. Driver default: null. - * - * @param clientWorkstation workstation name - * @return this builder - */ - public Builder clientWorkstation(String clientWorkstation) { - this.clientWorkstation = clientWorkstation; - return this; - } - - // ==================== PERFORMANCE ==================== - - /** - * Blocking read timeout in seconds. Driver default: 0 (no timeout). - * - * @param blockingReadConnectionTimeout timeout in seconds - * @return this builder - */ - public Builder blockingReadConnectionTimeout(int blockingReadConnectionTimeout) { - this.blockingReadConnectionTimeout = blockingReadConnectionTimeout; - return this; - } - - /** - * Fully materialize LOB data. Driver default: true. - * - * @param fullyMaterializeLobData true to materialize - * @return this builder - */ - public Builder fullyMaterializeLobData(boolean fullyMaterializeLobData) { - this.fullyMaterializeLobData = fullyMaterializeLobData; - return this; - } - - /** - * Fully materialize input streams. Driver default: true. - * - * @param fullyMaterializeInputStreams true to materialize - * @return this builder - */ - public Builder fullyMaterializeInputStreams(boolean fullyMaterializeInputStreams) { - this.fullyMaterializeInputStreams = fullyMaterializeInputStreams; - return this; - } - - /** - * Enable progressive streaming for LOBs. Driver default: true. - * - * @param progressiveStreaming true to enable - * @return this builder - */ - public Builder progressiveStreaming(boolean progressiveStreaming) { - this.progressiveStreaming = progressiveStreaming; - return this; - } - - /** - * Default fetch size. Driver default: 64. 
- * - * @param fetchSize fetch size - * @return this builder - */ - public Builder fetchSize(int fetchSize) { - this.fetchSize = fetchSize; - return this; - } - - /** - * Query data size in bytes. Driver default: 32767. - * - * @param queryDataSize data size - * @return this builder - */ - public Builder queryDataSize(int queryDataSize) { - this.queryDataSize = queryDataSize; - return this; - } - - /** - * Defer statement preparation. Driver default: true. - * - * @param deferPrepares true to defer - * @return this builder - */ - public Builder deferPrepares(boolean deferPrepares) { - this.deferPrepares = deferPrepares; - return this; - } - - /** - * Enable named parameter markers. Driver default: false. - * - * @param enableNamedParameterMarkers true to enable - * @return this builder - */ - public Builder enableNamedParameterMarkers(boolean enableNamedParameterMarkers) { - this.enableNamedParameterMarkers = enableNamedParameterMarkers; - return this; - } - - /** - * Enable seamless failover. Driver default: false. - * - * @param enableSeamlessFailover true to enable - * @return this builder - */ - public Builder enableSeamlessFailover(boolean enableSeamlessFailover) { - this.enableSeamlessFailover = enableSeamlessFailover; - return this; - } - - /** - * Keep dynamic SQL cached. Driver default: 0. - * - * @param keepDynamic 1 to keep - * @return this builder - */ - public Builder keepDynamic(int keepDynamic) { - this.keepDynamic = keepDynamic; - return this; - } - - /** - * Result set holdability. Driver default: true. - * - * @param resultSetHoldability true for holdable cursors - * @return this builder - */ - public Builder resultSetHoldability(boolean resultSetHoldability) { - this.resultSetHoldability = resultSetHoldability; - return this; - } - - /** - * Query timeout interrupt processing mode. Driver default: 1. 
- * - * @param queryTimeoutInterruptProcessingMode mode value - * @return this builder - */ - public Builder queryTimeoutInterruptProcessingMode(int queryTimeoutInterruptProcessingMode) { - this.queryTimeoutInterruptProcessingMode = queryTimeoutInterruptProcessingMode; - return this; - } - - /** - * Use JDBC4 column name/label semantics. Driver default: true. - * - * @param useJDBC4ColumnNameAndLabelSemantics true to use - * @return this builder - */ - public Builder useJDBC4ColumnNameAndLabelSemantics( - boolean useJDBC4ColumnNameAndLabelSemantics) { - this.useJDBC4ColumnNameAndLabelSemantics = useJDBC4ColumnNameAndLabelSemantics; - return this; - } - - // ==================== ERROR HANDLING ==================== - - /** - * Retrieve full error messages from server. Driver default: false. OUR DEFAULT: true (much - * better error messages). - * - * @param retrieveMessagesFromServerOnGetMessage true to retrieve - * @return this builder - */ - public Builder retrieveMessagesFromServerOnGetMessage( - boolean retrieveMessagesFromServerOnGetMessage) { - this.retrieveMessagesFromServerOnGetMessage = retrieveMessagesFromServerOnGetMessage; - return this; - } - - /** - * Read timeout in seconds. Driver default: 0 (no timeout). - * - * @param readTimeout timeout in seconds - * @return this builder - */ - public Builder readTimeout(int readTimeout) { - this.readTimeout = readTimeout; - return this; - } - - /** - * Atomic multi-row insert. Driver default: true. - * - * @param atomicMultiRowInsert true for atomic - * @return this builder - */ - public Builder atomicMultiRowInsert(boolean atomicMultiRowInsert) { - this.atomicMultiRowInsert = atomicMultiRowInsert; - return this; - } - - /** - * Return alias column names. Driver default: false. 
- * - * @param returnAlias true to return aliases - * @return this builder - */ - public Builder returnAlias(boolean returnAlias) { - this.returnAlias = returnAlias; - return this; - } - - // ==================== SSL/TLS ==================== - - /** - * SSL connection type. Driver default: false. - * - * @param sslConnection "true" or SSL mode - * @return this builder - */ - public Builder sslConnection(String sslConnection) { - this.sslConnection = sslConnection; - return this; - } - - /** - * SSL trust store location. Driver default: null. - * - * @param sslTrustStoreLocation path to trust store - * @return this builder - */ - public Builder sslTrustStoreLocation(String sslTrustStoreLocation) { - this.sslTrustStoreLocation = sslTrustStoreLocation; - return this; - } - - /** - * SSL trust store password. Driver default: null. - * - * @param sslTrustStorePassword trust store password - * @return this builder - */ - public Builder sslTrustStorePassword(String sslTrustStorePassword) { - this.sslTrustStorePassword = sslTrustStorePassword; - return this; - } - - /** - * SSL key store location. Driver default: null. - * - * @param sslKeyStoreLocation path to key store - * @return this builder - */ - public Builder sslKeyStoreLocation(String sslKeyStoreLocation) { - this.sslKeyStoreLocation = sslKeyStoreLocation; - return this; - } - - /** - * SSL key store password. Driver default: null. - * - * @param sslKeyStorePassword key store password - * @return this builder - */ - public Builder sslKeyStorePassword(String sslKeyStorePassword) { - this.sslKeyStorePassword = sslKeyStorePassword; - return this; - } - - /** - * SSL cipher suites. Driver default: null. - * - * @param sslCipherSuites comma-separated cipher suites - * @return this builder - */ - public Builder sslCipherSuites(String sslCipherSuites) { - this.sslCipherSuites = sslCipherSuites; - return this; - } - - // ==================== SECURITY ==================== - - /** - * Security mechanism. 
Driver default: 3 (clear text password). - * - * @param securityMechanism mechanism code (3=clear, 4=user only, 7=encrypted, etc.) - * @return this builder - */ - public Builder securityMechanism(int securityMechanism) { - this.securityMechanism = securityMechanism; - return this; - } - - /** - * Kerberos server principal. Driver default: null. - * - * @param kerberosServerPrincipal server principal - * @return this builder - */ - public Builder kerberosServerPrincipal(String kerberosServerPrincipal) { - this.kerberosServerPrincipal = kerberosServerPrincipal; - return this; - } - - /** - * GSS credential for Kerberos. Driver default: null. - * - * @param gssCredential credential object name - * @return this builder - */ - public Builder gssCredential(String gssCredential) { - this.gssCredential = gssCredential; - return this; - } - - /** - * Encryption algorithm enabled. Driver default: false. - * - * @param encryptionAlgorithm true to enable - * @return this builder - */ - public Builder encryptionAlgorithm(boolean encryptionAlgorithm) { - this.encryptionAlgorithm = encryptionAlgorithm; - return this; - } - - /** - * Public key list. Driver default: null. - * - * @param pkList public key list - * @return this builder - */ - public Builder pkList(String pkList) { - this.pkList = pkList; - return this; - } - - /** - * Security plugin name. Driver default: null. - * - * @param pluginName plugin name - * @return this builder - */ - public Builder pluginName(String pluginName) { - this.pluginName = pluginName; - return this; - } - - /** - * Send data as is without conversion. Driver default: false. - * - * @param sendDataAsIs true to send as is - * @return this builder - */ - public Builder sendDataAsIs(boolean sendDataAsIs) { - this.sendDataAsIs = sendDataAsIs; - return this; - } - - // ==================== LOB ==================== - - /** - * Stream buffer size. Driver default: 1048576. 
- * - * @param streamBufferSize buffer size in bytes - * @return this builder - */ - public Builder streamBufferSize(int streamBufferSize) { - this.streamBufferSize = streamBufferSize; - return this; - } - - /** - * Fully materialize BLOB data. Driver default: true. - * - * @param fullyMaterializeBlobData true to materialize - * @return this builder - */ - public Builder fullyMaterializeBlobData(boolean fullyMaterializeBlobData) { - this.fullyMaterializeBlobData = fullyMaterializeBlobData; - return this; - } - - /** - * Fully materialize CLOB data. Driver default: true. - * - * @param fullyMaterializeClobData true to materialize - * @return this builder - */ - public Builder fullyMaterializeClobData(boolean fullyMaterializeClobData) { - this.fullyMaterializeClobData = fullyMaterializeClobData; - return this; - } - - /** - * Max retries for client reroute. Driver default: 3. - * - * @param maxRetriesForClientReroute retry count - * @return this builder - */ - public Builder maxRetriesForClientReroute(int maxRetriesForClientReroute) { - this.maxRetriesForClientReroute = maxRetriesForClientReroute; - return this; - } - - /** - * Retry interval for client reroute in seconds. Driver default: 0. - * - * @param retryIntervalForClientReroute interval in seconds - * @return this builder - */ - public Builder retryIntervalForClientReroute(int retryIntervalForClientReroute) { - this.retryIntervalForClientReroute = retryIntervalForClientReroute; - return this; - } - - // ==================== STATEMENT ==================== - - /** - * Allow next on exhausted result set. Driver default: false. - * - * @param allowNextOnExhaustedResultSet true to allow - * @return this builder - */ - public Builder allowNextOnExhaustedResultSet(boolean allowNextOnExhaustedResultSet) { - this.allowNextOnExhaustedResultSet = allowNextOnExhaustedResultSet; - return this; - } - - /** - * Cursor sensitivity. Driver default: TYPE_SCROLL_INSENSITIVE. 
- * - * @param cursorSensitivity sensitivity type - * @return this builder - */ - public Builder cursorSensitivity(String cursorSensitivity) { - this.cursorSensitivity = cursorSensitivity; - return this; - } - - /** - * Cursor hold (0=close, 1=hold). Driver default: 1. - * - * @param cursorHold hold value - * @return this builder - */ - public Builder cursorHold(int cursorHold) { - this.cursorHold = cursorHold; - return this; - } - - /** - * Emulate parameter metadata for z/OS. Driver default: false. - * - * @param emulateParameterMetaDataForZCalls true to emulate - * @return this builder - */ - public Builder emulateParameterMetaDataForZCalls(boolean emulateParameterMetaDataForZCalls) { - this.emulateParameterMetaDataForZCalls = emulateParameterMetaDataForZCalls; - return this; - } - - /** - * Result set holdability for catalog queries. Driver default: 1. - * - * @param resultSetHoldabilityForCatalogQueries holdability value - * @return this builder - */ - public Builder resultSetHoldabilityForCatalogQueries( - int resultSetHoldabilityForCatalogQueries) { - this.resultSetHoldabilityForCatalogQueries = resultSetHoldabilityForCatalogQueries; - return this; - } - - /** - * Query close implicit mode. Driver default: 0. - * - * @param queryCloseImplicit close mode - * @return this builder - */ - public Builder queryCloseImplicit(int queryCloseImplicit) { - this.queryCloseImplicit = queryCloseImplicit; - return this; - } - - /** - * Send char inputs as UTF-8. Driver default: false. - * - * @param sendCharInputsUTF8 true to send as UTF-8 - * @return this builder - */ - public Builder sendCharInputsUTF8(boolean sendCharInputsUTF8) { - this.sendCharInputsUTF8 = sendCharInputsUTF8; - return this; - } - - /** - * Timestamp format. Driver default: null. 
- * - * @param timestampFormat format string - * @return this builder - */ - public Builder timestampFormat(String timestampFormat) { - this.timestampFormat = timestampFormat; - return this; - } - - /** - * Timestamp output type. Driver default: null. - * - * @param timestampOutputType output type - * @return this builder - */ - public Builder timestampOutputType(String timestampOutputType) { - this.timestampOutputType = timestampOutputType; - return this; - } - - /** - * Date format. Driver default: null. - * - * @param dateFormat format string - * @return this builder - */ - public Builder dateFormat(String dateFormat) { - this.dateFormat = dateFormat; - return this; - } - - /** - * Time format. Driver default: null. - * - * @param timeFormat format string - * @return this builder - */ - public Builder timeFormat(String timeFormat) { - this.timeFormat = timeFormat; - return this; - } - - // ==================== TRACING ==================== - - /** - * Enable trace file. Driver default: false. - * - * @param traceFile true to enable - * @return this builder - */ - public Builder traceFile(boolean traceFile) { - this.traceFile = traceFile; - return this; - } - - /** - * Trace level. Driver default: 0. - * - * @param traceLevel level value - * @return this builder - */ - public Builder traceLevel(int traceLevel) { - this.traceLevel = traceLevel; - return this; - } - - /** - * Enable trace directory. Driver default: false. - * - * @param traceDirectory true to enable - * @return this builder - */ - public Builder traceDirectory(boolean traceDirectory) { - this.traceDirectory = traceDirectory; - return this; - } - - /** - * Enable log writer. Driver default: false. - * - * @param logWriter true to enable - * @return this builder - */ - public Builder logWriter(boolean logWriter) { - this.logWriter = logWriter; - return this; - } - - // ==================== XA ==================== - - /** - * XA network optimization. Driver default: 0. 
- * - * @param xaNetworkOptimization optimization mode - * @return this builder - */ - public Builder xaNetworkOptimization(int xaNetworkOptimization) { - this.xaNetworkOptimization = xaNetworkOptimization; - return this; - } - - /** - * Downgrade hold cursors under XA. Driver default: false. - * - * @param downgradeHoldCursorsUnderXa true to downgrade - * @return this builder - */ - public Builder downgradeHoldCursorsUnderXa(boolean downgradeHoldCursorsUnderXa) { - this.downgradeHoldCursorsUnderXa = downgradeHoldCursorsUnderXa; - return this; - } - - // ==================== COMPATIBILITY ==================== - - /** - * JDBC collection for packages. Driver default: false. - * - * @param jdbcCollection true to use - * @return this builder - */ - public Builder jdbcCollection(boolean jdbcCollection) { - this.jdbcCollection = jdbcCollection; - return this; - } - - /** - * Current package path. Driver default: null. - * - * @param currentPackagePath package path - * @return this builder - */ - public Builder currentPackagePath(String currentPackagePath) { - this.currentPackagePath = currentPackagePath; - return this; - } - - /** - * Current package set. Driver default: null. - * - * @param currentPackageSet package set - * @return this builder - */ - public Builder currentPackageSet(String currentPackageSet) { - this.currentPackageSet = currentPackageSet; - return this; - } - - /** - * Enable client affinities list. Driver default: false. - * - * @param enableClientAffinitiesList true to enable - * @return this builder - */ - public Builder enableClientAffinitiesList(boolean enableClientAffinitiesList) { - this.enableClientAffinitiesList = enableClientAffinitiesList; - return this; - } - - /** - * Alternate server name for client reroute. Driver default: null. 
- * - * @param clientRerouteAlternateServerName server name - * @return this builder - */ - public Builder clientRerouteAlternateServerName(String clientRerouteAlternateServerName) { - this.clientRerouteAlternateServerName = clientRerouteAlternateServerName; - return this; - } - - /** - * Alternate port for client reroute. Driver default: null. - * - * @param clientRerouteAlternatePortNumber port number - * @return this builder - */ - public Builder clientRerouteAlternatePortNumber(String clientRerouteAlternatePortNumber) { - this.clientRerouteAlternatePortNumber = clientRerouteAlternatePortNumber; - return this; - } - - // ==================== ADVANCED CONNECTION ==================== - - /** - * Member connect timeout. Driver default: 0. - * - * @param memberConnectTimeout timeout in seconds - * @return this builder - */ - public Builder memberConnectTimeout(int memberConnectTimeout) { - this.memberConnectTimeout = memberConnectTimeout; - return this; - } - - /** - * System schema. Driver default: null. - * - * @param sysSchema schema name - * @return this builder - */ - public Builder sysSchema(String sysSchema) { - this.sysSchema = sysSchema; - return this; - } - - /** - * Affinity failback interval. Driver default: false. - * - * @param affinityFailbackInterval true to enable - * @return this builder - */ - public Builder affinityFailbackInterval(boolean affinityFailbackInterval) { - this.affinityFailbackInterval = affinityFailbackInterval; - return this; - } - - /** - * Enable Sysplex workload balancing. Driver default: false. - * - * @param enableSysplexWLB true to enable - * @return this builder - */ - public Builder enableSysplexWLB(boolean enableSysplexWLB) { - this.enableSysplexWLB = enableSysplexWLB; - return this; - } - - /** - * Max transport objects. Driver default: 0. 
- * - * @param maxTransportObjects max count - * @return this builder - */ - public Builder maxTransportObjects(int maxTransportObjects) { - this.maxTransportObjects = maxTransportObjects; - return this; - } - - /** - * Database name (alternative to URL path). Driver default: null. - * - * @param databaseName database name - * @return this builder - */ - public Builder databaseName(String databaseName) { - this.databaseName = databaseName; - return this; - } - - /** - * Decimal separator. Driver default: false. - * - * @param decimalSeparator true to enable - * @return this builder - */ - public Builder decimalSeparator(boolean decimalSeparator) { - this.decimalSeparator = decimalSeparator; - return this; - } - - /** - * Decimal string format. Driver default: 0. - * - * @param decimalStringFormat format value - * @return this builder - */ - public Builder decimalStringFormat(int decimalStringFormat) { - this.decimalStringFormat = decimalStringFormat; - return this; - } - - /** - * Client debug info. Driver default: 0. - * - * @param clientDebugInfo debug level - * @return this builder - */ - public Builder clientDebugInfo(int clientDebugInfo) { - this.clientDebugInfo = clientDebugInfo; - return this; - } - - /** - * Set an arbitrary driver property. - * - * @param key property name - * @param value property value - * @return this builder - */ - public Builder property(String key, String value) { - this.extraProperties.put(key, value); - return this; - } - - /** - * Build the Db2Config. 
- *
- * @return immutable Db2Config
- */
-    public Db2Config build() {
-      return new Db2Config(this);
-    }
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/duckdb/DuckDbConfig.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/duckdb/DuckDbConfig.java
deleted file mode 100644
index 0dc4e47564..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/connect/duckdb/DuckDbConfig.java
+++ /dev/null
@@ -1,378 +0,0 @@
-package dev.typr.foundations.connect.duckdb;
-
-import dev.typr.foundations.connect.DatabaseConfig;
-import dev.typr.foundations.connect.DatabaseKind;
-import java.util.HashMap;
-import java.util.Map;
-
-/**
- * DuckDB database configuration with typed builder methods for all documented JDBC driver
- * properties.
- *
- * <p>DuckDB is an embedded analytical database with minimal configuration options.
- *
- * @see DuckDB Java Documentation
- */
-public final class DuckDbConfig implements DatabaseConfig {
-
-  private final String path;
-
-  // Mode properties
-  private final Boolean readOnly;
-
-  // Performance properties
-  private final Boolean jdbcStreamResults;
-  private final String tempDirectory;
-  private final Integer threads;
-  private final String memoryLimit;
-  private final Integer defaultNullOrder;
-  private final Integer defaultOrder;
-
-  // Extension properties
-  private final Boolean autoloadKnownExtensions;
-  private final Boolean autoinstallKnownExtensions;
-  private final String customExtensionRepository;
-
-  // Behavior properties
-  private final Boolean enableExternalAccess;
-  private final Boolean allowUnsignedExtensions;
-  private final Boolean enableObjectCache;
-  private final Integer maximumMemory;
-  private final Boolean preserveInsertionOrder;
-
-  // Escape hatch
-  private final Map<String, String> extraProperties;
-
-  private DuckDbConfig(Builder b) {
-    this.path = b.path;
-
-    // Mode
-    this.readOnly = b.readOnly;
-
-    // Performance
-    this.jdbcStreamResults = b.jdbcStreamResults;
-    this.tempDirectory = b.tempDirectory;
-    this.threads = b.threads;
-    this.memoryLimit = b.memoryLimit;
-    this.defaultNullOrder = b.defaultNullOrder;
-    this.defaultOrder = b.defaultOrder;
-
-    // Extension
-    this.autoloadKnownExtensions = b.autoloadKnownExtensions;
-    this.autoinstallKnownExtensions = b.autoinstallKnownExtensions;
-    this.customExtensionRepository = b.customExtensionRepository;
-
-    // Behavior
-    this.enableExternalAccess = b.enableExternalAccess;
-    this.allowUnsignedExtensions = b.allowUnsignedExtensions;
-    this.enableObjectCache = b.enableObjectCache;
-    this.maximumMemory = b.maximumMemory;
-    this.preserveInsertionOrder = b.preserveInsertionOrder;
-
-    this.extraProperties = Map.copyOf(b.extraProperties);
-  }
-
-  /**
-   * Create a new builder for an in-memory database.
-   *
-   * @return a new builder for :memory: database
-   */
-  public static Builder inMemory() {
-    return new Builder(":memory:");
-  }
-
-  /**
-   * Create a new builder with a file path.
-   *
-   * @param path file path or :memory: for in-memory database
-   * @return a new builder
-   */
-  public static Builder builder(String path) {
-    return new Builder(path);
-  }
-
-  @Override
-  public String jdbcUrl() {
-    return "jdbc:duckdb:" + path;
-  }
-
-  @Override
-  public String username() {
-    return "";
-  }
-
-  @Override
-  public String password() {
-    return "";
-  }
-
-  @Override
-  public DatabaseKind kind() {
-    return DatabaseKind.DUCKDB;
-  }
-
-  @Override
-  public Map<String, String> driverProperties() {
-    Map<String, String> props = new HashMap<>();
-
-    // Mode
-    if (readOnly != null) props.put("duckdb.read_only", readOnly.toString());
-
-    // Performance
-    if (jdbcStreamResults != null) props.put("jdbc_stream_results", jdbcStreamResults.toString());
-    if (tempDirectory != null) props.put("temp_directory", tempDirectory);
-    if (threads != null) props.put("threads", threads.toString());
-    if (memoryLimit != null) props.put("memory_limit", memoryLimit);
-    if (defaultNullOrder != null) props.put("default_null_order", defaultNullOrder.toString());
-    if (defaultOrder != null) props.put("default_order", defaultOrder.toString());
-
-    // Extension
-    if (autoloadKnownExtensions != null)
-      props.put("autoload_known_extensions", autoloadKnownExtensions.toString());
-    if (autoinstallKnownExtensions != null)
-      props.put("autoinstall_known_extensions", autoinstallKnownExtensions.toString());
-    if (customExtensionRepository != null)
-      props.put("custom_extension_repository", customExtensionRepository);
-
-    // Behavior
-    if (enableExternalAccess != null)
-      props.put("enable_external_access", enableExternalAccess.toString());
-    if (allowUnsignedExtensions != null)
-      props.put("allow_unsigned_extensions", allowUnsignedExtensions.toString());
-    if (enableObjectCache != null) props.put("enable_object_cache", enableObjectCache.toString());
-    if (maximumMemory != null) props.put("max_memory", maximumMemory.toString());
-    if (preserveInsertionOrder != null)
-      props.put("preserve_insertion_order", preserveInsertionOrder.toString());
-
-    props.putAll(extraProperties);
-    return props;
-  }
-
-  /** Builder for DuckDbConfig with typed methods for all JDBC driver properties. */
-  public static final class Builder {
-    private final String path;
-
-    // Mode
-    private Boolean readOnly;
-
-    // Performance
-    private Boolean jdbcStreamResults;
-    private String tempDirectory;
-    private Integer threads;
-    private String memoryLimit;
-    private Integer defaultNullOrder;
-    private Integer defaultOrder;
-
-    // Extension
-    private Boolean autoloadKnownExtensions;
-    private Boolean autoinstallKnownExtensions;
-    private String customExtensionRepository;
-
-    // Behavior
-    private Boolean enableExternalAccess;
-    private Boolean allowUnsignedExtensions;
-    private Boolean enableObjectCache;
-    private Integer maximumMemory;
-    private Boolean preserveInsertionOrder;
-
-    private final Map<String, String> extraProperties = new HashMap<>();
-
-    private Builder(String path) {
-      this.path = path;
-    }
-
-    // ==================== MODE ====================
-
-    /**
-     * Open database in read-only mode. Driver default: false.
-     *
-     * @param readOnly true for read-only
-     * @return this builder
-     */
-    public Builder readOnly(boolean readOnly) {
-      this.readOnly = readOnly;
-      return this;
-    }
-
-    // ==================== PERFORMANCE ====================
-
-    /**
-     * Stream results instead of materializing. Driver default: false.
-     *
-     * @param jdbcStreamResults true to stream
-     * @return this builder
-     */
-    public Builder jdbcStreamResults(boolean jdbcStreamResults) {
-      this.jdbcStreamResults = jdbcStreamResults;
-      return this;
-    }
-
-    /**
-     * Temporary directory for spilling. Driver default: system temp.
- * - * @param tempDirectory path to temp directory - * @return this builder - */ - public Builder tempDirectory(String tempDirectory) { - this.tempDirectory = tempDirectory; - return this; - } - - /** - * Number of threads to use. Driver default: all available cores. - * - * @param threads thread count - * @return this builder - */ - public Builder threads(int threads) { - this.threads = threads; - return this; - } - - /** - * Memory limit (e.g., "4GB", "512MB"). Driver default: 80% of RAM. - * - * @param memoryLimit memory limit with unit - * @return this builder - */ - public Builder memoryLimit(String memoryLimit) { - this.memoryLimit = memoryLimit; - return this; - } - - /** - * Default null ordering (0=nulls first, 1=nulls last). Driver default: 1. - * - * @param defaultNullOrder null order - * @return this builder - */ - public Builder defaultNullOrder(int defaultNullOrder) { - this.defaultNullOrder = defaultNullOrder; - return this; - } - - /** - * Default sort order (0=ASC, 1=DESC). Driver default: 0. - * - * @param defaultOrder sort order - * @return this builder - */ - public Builder defaultOrder(int defaultOrder) { - this.defaultOrder = defaultOrder; - return this; - } - - // ==================== EXTENSION ==================== - - /** - * Automatically load known extensions. Driver default: true. - * - * @param autoloadKnownExtensions true to auto-load - * @return this builder - */ - public Builder autoloadKnownExtensions(boolean autoloadKnownExtensions) { - this.autoloadKnownExtensions = autoloadKnownExtensions; - return this; - } - - /** - * Automatically install known extensions. Driver default: true. - * - * @param autoinstallKnownExtensions true to auto-install - * @return this builder - */ - public Builder autoinstallKnownExtensions(boolean autoinstallKnownExtensions) { - this.autoinstallKnownExtensions = autoinstallKnownExtensions; - return this; - } - - /** - * Custom extension repository URL. Driver default: null. 
- * - * @param customExtensionRepository repository URL - * @return this builder - */ - public Builder customExtensionRepository(String customExtensionRepository) { - this.customExtensionRepository = customExtensionRepository; - return this; - } - - // ==================== BEHAVIOR ==================== - - /** - * Enable external access (files, HTTP). Driver default: true. - * - * @param enableExternalAccess true to enable - * @return this builder - */ - public Builder enableExternalAccess(boolean enableExternalAccess) { - this.enableExternalAccess = enableExternalAccess; - return this; - } - - /** - * Allow unsigned extensions. Driver default: false. - * - * @param allowUnsignedExtensions true to allow - * @return this builder - */ - public Builder allowUnsignedExtensions(boolean allowUnsignedExtensions) { - this.allowUnsignedExtensions = allowUnsignedExtensions; - return this; - } - - /** - * Enable object cache for Parquet files. Driver default: false. - * - * @param enableObjectCache true to enable - * @return this builder - */ - public Builder enableObjectCache(boolean enableObjectCache) { - this.enableObjectCache = enableObjectCache; - return this; - } - - /** - * Maximum memory usage in bytes. Driver default: 80% of RAM. - * - * @param maximumMemory memory in bytes - * @return this builder - */ - public Builder maximumMemory(int maximumMemory) { - this.maximumMemory = maximumMemory; - return this; - } - - /** - * Preserve insertion order in results. Driver default: true. - * - * @param preserveInsertionOrder true to preserve - * @return this builder - */ - public Builder preserveInsertionOrder(boolean preserveInsertionOrder) { - this.preserveInsertionOrder = preserveInsertionOrder; - return this; - } - - /** - * Set an arbitrary driver property. 
-     *
-     * @param key property name
-     * @param value property value
-     * @return this builder
-     */
-    public Builder property(String key, String value) {
-      this.extraProperties.put(key, value);
-      return this;
-    }
-
-    /**
-     * Build the DuckDbConfig.
-     *
-     * @return immutable DuckDbConfig
-     */
-    public DuckDbConfig build() {
-      return new DuckDbConfig(this);
-    }
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/mariadb/MariaDbConfig.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/mariadb/MariaDbConfig.java
deleted file mode 100644
index a2cbda5048..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/connect/mariadb/MariaDbConfig.java
+++ /dev/null
@@ -1,1577 +0,0 @@
-package dev.typr.foundations.connect.mariadb;
-
-import dev.typr.foundations.connect.DatabaseConfig;
-import dev.typr.foundations.connect.DatabaseKind;
-import java.util.HashMap;
-import java.util.Map;
-
-/**
- * MariaDB database configuration with typed builder methods for all documented JDBC driver
- * properties.
- *
- * <p>Properties are based on the MariaDB Connector/J documentation. Also works with MySQL
- * databases.
- *
- * @see MariaDB Connector/J
- *     Documentation
- */
-public final class MariaDbConfig implements DatabaseConfig {
-
-  private final String host;
-  private final int port;
-  private final String database;
-  private final String username;
-  private final String password;
-
-  // SSL/TLS properties
-  private final MariaSslMode sslMode;
-  private final String serverSslCert;
-  private final String keyStore;
-  private final String keyStorePassword;
-  private final String keyStoreType;
-  private final String trustStore;
-  private final String trustStorePassword;
-  private final String trustStoreType;
-  private final String enabledSslCipherSuites;
-  private final String enabledSslProtocolSuites;
-  private final Boolean disableSslHostnameVerification;
-
-  // Performance properties
-  private final Boolean useBulkStmts;
-  private final Boolean useBulkStmtsForInserts;
-  private final Boolean rewriteBatchedStatements;
-  private final Boolean cachePrepStmts;
-  private final Integer prepStmtCacheSize;
-  private final Integer prepStmtCacheSqlLimit;
-  private final Boolean useServerPrepStmts;
-  private final Boolean useCompression;
-  private final Integer defaultFetchSize;
-  private final Boolean useReadAheadInput;
-  private final Boolean cacheCallableStmts;
-  private final Integer callableStmtCacheSize;
-  private final Boolean useBatchMultiSend;
-  private final Integer useBatchMultiSendNumber;
-
-  // Timeout properties
-  private final Integer connectTimeout;
-  private final Integer socketTimeout;
-  private final Integer queryTimeout;
-
-  // TCP properties
-  private final Boolean tcpKeepAlive;
-  private final Integer tcpKeepCount;
-  private final Integer tcpKeepIdle;
-  private final Integer tcpKeepInterval;
-  private final Boolean tcpNoDelay;
-  private final Boolean tcpAbortiveClose;
-
-  // Pool properties (for connection pooling in the driver itself)
-  private final Boolean pool;
- private final String poolName; - private final Integer maxPoolSize; - private final Integer minPoolSize; - private final Integer maxIdleTime; - private final Boolean staticGlobal; - private final Boolean poolValidMinDelay; - private final Boolean registerJmxPool; - - // Connection properties - private final Boolean autoReconnect; - private final String connectionAttributes; - private final String sessionVariables; - private final String initSql; - private final Boolean localSocket; - private final String pipe; - private final Boolean tinyInt1isBit; - private final Boolean yearIsDateType; - private final Boolean dumpQueriesOnException; - private final Boolean includeInnodbStatusInDeadlockExceptions; - private final Boolean includeThreadDumpInDeadlockExceptions; - private final Integer retriesAllDown; - private final String galeraAllowedState; - private final Boolean transactionReplay; - - // Logging properties - private final Boolean log; - private final String logSlowQueries; - private final Long slowQueryThresholdNanos; - private final Integer maxQuerySizeToLog; - private final Boolean profileSql; - - // High Availability properties - private final Boolean assureReadOnly; - private final Integer validConnectionTimeout; - private final Integer loadBalanceBlacklistTimeout; - private final Integer failoverLoopRetries; - private final Boolean allowMultiQueries; - private final Boolean allowLocalInfile; - - // Character set properties - private final String collation; - private final Boolean useMysqlMetadata; - private final String nullCatalogMeansCurrent; - private final Boolean blankTableNameMeta; - private final Boolean databaseTerm; - private final Boolean createDatabaseIfNotExist; - - // Timezone properties - private final String serverTimezone; - private final Boolean forceConnectionTimeZoneToSession; - private final Boolean useLegacyDatetimeCode; - private final Boolean useTimezone; - - // Misc properties - private final Integer maxAllowedPacket; - private 
final Boolean allowPublicKeyRetrieval; - private final String rsaPublicKey; - private final Boolean cachingRsaPublicKey; - private final String serverRsaPublicKeyFile; - private final String geometryDefaultType; - private final Boolean restrictedAuth; - private final String connectionCollation; - private final Boolean permitMysqlScheme; - private final String credentialType; - private final Boolean ensureSocketState; - - // Escape hatch - private final Map extraProperties; - - private MariaDbConfig(Builder b) { - this.host = b.host; - this.port = b.port; - this.database = b.database; - this.username = b.username; - this.password = b.password; - - // SSL/TLS - this.sslMode = b.sslMode; - this.serverSslCert = b.serverSslCert; - this.keyStore = b.keyStore; - this.keyStorePassword = b.keyStorePassword; - this.keyStoreType = b.keyStoreType; - this.trustStore = b.trustStore; - this.trustStorePassword = b.trustStorePassword; - this.trustStoreType = b.trustStoreType; - this.enabledSslCipherSuites = b.enabledSslCipherSuites; - this.enabledSslProtocolSuites = b.enabledSslProtocolSuites; - this.disableSslHostnameVerification = b.disableSslHostnameVerification; - - // Performance - this.useBulkStmts = b.useBulkStmts; - this.useBulkStmtsForInserts = b.useBulkStmtsForInserts; - this.rewriteBatchedStatements = b.rewriteBatchedStatements; - this.cachePrepStmts = b.cachePrepStmts; - this.prepStmtCacheSize = b.prepStmtCacheSize; - this.prepStmtCacheSqlLimit = b.prepStmtCacheSqlLimit; - this.useServerPrepStmts = b.useServerPrepStmts; - this.useCompression = b.useCompression; - this.defaultFetchSize = b.defaultFetchSize; - this.useReadAheadInput = b.useReadAheadInput; - this.cacheCallableStmts = b.cacheCallableStmts; - this.callableStmtCacheSize = b.callableStmtCacheSize; - this.useBatchMultiSend = b.useBatchMultiSend; - this.useBatchMultiSendNumber = b.useBatchMultiSendNumber; - - // Timeouts - this.connectTimeout = b.connectTimeout; - this.socketTimeout = b.socketTimeout; - 
this.queryTimeout = b.queryTimeout; - - // TCP - this.tcpKeepAlive = b.tcpKeepAlive; - this.tcpKeepCount = b.tcpKeepCount; - this.tcpKeepIdle = b.tcpKeepIdle; - this.tcpKeepInterval = b.tcpKeepInterval; - this.tcpNoDelay = b.tcpNoDelay; - this.tcpAbortiveClose = b.tcpAbortiveClose; - - // Pool - this.pool = b.pool; - this.poolName = b.poolName; - this.maxPoolSize = b.maxPoolSize; - this.minPoolSize = b.minPoolSize; - this.maxIdleTime = b.maxIdleTime; - this.staticGlobal = b.staticGlobal; - this.poolValidMinDelay = b.poolValidMinDelay; - this.registerJmxPool = b.registerJmxPool; - - // Connection - this.autoReconnect = b.autoReconnect; - this.connectionAttributes = b.connectionAttributes; - this.sessionVariables = b.sessionVariables; - this.initSql = b.initSql; - this.localSocket = b.localSocket; - this.pipe = b.pipe; - this.tinyInt1isBit = b.tinyInt1isBit; - this.yearIsDateType = b.yearIsDateType; - this.dumpQueriesOnException = b.dumpQueriesOnException; - this.includeInnodbStatusInDeadlockExceptions = b.includeInnodbStatusInDeadlockExceptions; - this.includeThreadDumpInDeadlockExceptions = b.includeThreadDumpInDeadlockExceptions; - this.retriesAllDown = b.retriesAllDown; - this.galeraAllowedState = b.galeraAllowedState; - this.transactionReplay = b.transactionReplay; - - // Logging - this.log = b.log; - this.logSlowQueries = b.logSlowQueries; - this.slowQueryThresholdNanos = b.slowQueryThresholdNanos; - this.maxQuerySizeToLog = b.maxQuerySizeToLog; - this.profileSql = b.profileSql; - - // High Availability - this.assureReadOnly = b.assureReadOnly; - this.validConnectionTimeout = b.validConnectionTimeout; - this.loadBalanceBlacklistTimeout = b.loadBalanceBlacklistTimeout; - this.failoverLoopRetries = b.failoverLoopRetries; - this.allowMultiQueries = b.allowMultiQueries; - this.allowLocalInfile = b.allowLocalInfile; - - // Character set - this.collation = b.collation; - this.useMysqlMetadata = b.useMysqlMetadata; - this.nullCatalogMeansCurrent = 
b.nullCatalogMeansCurrent; - this.blankTableNameMeta = b.blankTableNameMeta; - this.databaseTerm = b.databaseTerm; - this.createDatabaseIfNotExist = b.createDatabaseIfNotExist; - - // Timezone - this.serverTimezone = b.serverTimezone; - this.forceConnectionTimeZoneToSession = b.forceConnectionTimeZoneToSession; - this.useLegacyDatetimeCode = b.useLegacyDatetimeCode; - this.useTimezone = b.useTimezone; - - // Misc - this.maxAllowedPacket = b.maxAllowedPacket; - this.allowPublicKeyRetrieval = b.allowPublicKeyRetrieval; - this.rsaPublicKey = b.rsaPublicKey; - this.cachingRsaPublicKey = b.cachingRsaPublicKey; - this.serverRsaPublicKeyFile = b.serverRsaPublicKeyFile; - this.geometryDefaultType = b.geometryDefaultType; - this.restrictedAuth = b.restrictedAuth; - this.connectionCollation = b.connectionCollation; - this.permitMysqlScheme = b.permitMysqlScheme; - this.credentialType = b.credentialType; - this.ensureSocketState = b.ensureSocketState; - - this.extraProperties = Map.copyOf(b.extraProperties); - } - - /** - * Create a new builder with required connection parameters. 
- * - * @param host database server hostname - * @param port database server port (typically 3306) - * @param database database name - * @param username username for authentication - * @param password password for authentication - * @return a new builder - */ - public static Builder builder( - String host, int port, String database, String username, String password) { - return new Builder(host, port, database, username, password); - } - - @Override - public String jdbcUrl() { - return "jdbc:mariadb://" + host + ":" + port + "/" + database; - } - - @Override - public String username() { - return username; - } - - @Override - public String password() { - return password; - } - - @Override - public DatabaseKind kind() { - return DatabaseKind.MARIADB; - } - - @Override - public Map driverProperties() { - Map props = new HashMap<>(); - - // SSL/TLS - if (sslMode != null) props.put("sslMode", sslMode.value()); - if (serverSslCert != null) props.put("serverSslCert", serverSslCert); - if (keyStore != null) props.put("keyStore", keyStore); - if (keyStorePassword != null) props.put("keyStorePassword", keyStorePassword); - if (keyStoreType != null) props.put("keyStoreType", keyStoreType); - if (trustStore != null) props.put("trustStore", trustStore); - if (trustStorePassword != null) props.put("trustStorePassword", trustStorePassword); - if (trustStoreType != null) props.put("trustStoreType", trustStoreType); - if (enabledSslCipherSuites != null) props.put("enabledSslCipherSuites", enabledSslCipherSuites); - if (enabledSslProtocolSuites != null) - props.put("enabledSslProtocolSuites", enabledSslProtocolSuites); - if (disableSslHostnameVerification != null) - props.put("disableSslHostnameVerification", disableSslHostnameVerification.toString()); - - // Performance - if (useBulkStmts != null) props.put("useBulkStmts", useBulkStmts.toString()); - if (useBulkStmtsForInserts != null) - props.put("useBulkStmtsForInserts", useBulkStmtsForInserts.toString()); - if 
(rewriteBatchedStatements != null) - props.put("rewriteBatchedStatements", rewriteBatchedStatements.toString()); - if (cachePrepStmts != null) props.put("cachePrepStmts", cachePrepStmts.toString()); - if (prepStmtCacheSize != null) props.put("prepStmtCacheSize", prepStmtCacheSize.toString()); - if (prepStmtCacheSqlLimit != null) - props.put("prepStmtCacheSqlLimit", prepStmtCacheSqlLimit.toString()); - if (useServerPrepStmts != null) props.put("useServerPrepStmts", useServerPrepStmts.toString()); - if (useCompression != null) props.put("useCompression", useCompression.toString()); - if (defaultFetchSize != null) props.put("defaultFetchSize", defaultFetchSize.toString()); - if (useReadAheadInput != null) props.put("useReadAheadInput", useReadAheadInput.toString()); - if (cacheCallableStmts != null) props.put("cacheCallableStmts", cacheCallableStmts.toString()); - if (callableStmtCacheSize != null) - props.put("callableStmtCacheSize", callableStmtCacheSize.toString()); - if (useBatchMultiSend != null) props.put("useBatchMultiSend", useBatchMultiSend.toString()); - if (useBatchMultiSendNumber != null) - props.put("useBatchMultiSendNumber", useBatchMultiSendNumber.toString()); - - // Timeouts - if (connectTimeout != null) props.put("connectTimeout", connectTimeout.toString()); - if (socketTimeout != null) props.put("socketTimeout", socketTimeout.toString()); - if (queryTimeout != null) props.put("queryTimeout", queryTimeout.toString()); - - // TCP - if (tcpKeepAlive != null) props.put("tcpKeepAlive", tcpKeepAlive.toString()); - if (tcpKeepCount != null) props.put("tcpKeepCount", tcpKeepCount.toString()); - if (tcpKeepIdle != null) props.put("tcpKeepIdle", tcpKeepIdle.toString()); - if (tcpKeepInterval != null) props.put("tcpKeepInterval", tcpKeepInterval.toString()); - if (tcpNoDelay != null) props.put("tcpNoDelay", tcpNoDelay.toString()); - if (tcpAbortiveClose != null) props.put("tcpAbortiveClose", tcpAbortiveClose.toString()); - - // Pool - if (pool != null) 
props.put("pool", pool.toString()); - if (poolName != null) props.put("poolName", poolName); - if (maxPoolSize != null) props.put("maxPoolSize", maxPoolSize.toString()); - if (minPoolSize != null) props.put("minPoolSize", minPoolSize.toString()); - if (maxIdleTime != null) props.put("maxIdleTime", maxIdleTime.toString()); - if (staticGlobal != null) props.put("staticGlobal", staticGlobal.toString()); - if (poolValidMinDelay != null) props.put("poolValidMinDelay", poolValidMinDelay.toString()); - if (registerJmxPool != null) props.put("registerJmxPool", registerJmxPool.toString()); - - // Connection - if (autoReconnect != null) props.put("autoReconnect", autoReconnect.toString()); - if (connectionAttributes != null) props.put("connectionAttributes", connectionAttributes); - if (sessionVariables != null) props.put("sessionVariables", sessionVariables); - if (initSql != null) props.put("initSql", initSql); - if (localSocket != null) props.put("localSocket", localSocket.toString()); - if (pipe != null) props.put("pipe", pipe); - if (tinyInt1isBit != null) props.put("tinyInt1isBit", tinyInt1isBit.toString()); - if (yearIsDateType != null) props.put("yearIsDateType", yearIsDateType.toString()); - if (dumpQueriesOnException != null) - props.put("dumpQueriesOnException", dumpQueriesOnException.toString()); - if (includeInnodbStatusInDeadlockExceptions != null) - props.put( - "includeInnodbStatusInDeadlockExceptions", - includeInnodbStatusInDeadlockExceptions.toString()); - if (includeThreadDumpInDeadlockExceptions != null) - props.put( - "includeThreadDumpInDeadlockExceptions", - includeThreadDumpInDeadlockExceptions.toString()); - if (retriesAllDown != null) props.put("retriesAllDown", retriesAllDown.toString()); - if (galeraAllowedState != null) props.put("galeraAllowedState", galeraAllowedState); - if (transactionReplay != null) props.put("transactionReplay", transactionReplay.toString()); - - // Logging - if (log != null) props.put("log", log.toString()); - if 
(logSlowQueries != null) props.put("logSlowQueries", logSlowQueries); - if (slowQueryThresholdNanos != null) - props.put("slowQueryThresholdNanos", slowQueryThresholdNanos.toString()); - if (maxQuerySizeToLog != null) props.put("maxQuerySizeToLog", maxQuerySizeToLog.toString()); - if (profileSql != null) props.put("profileSql", profileSql.toString()); - - // High Availability - if (assureReadOnly != null) props.put("assureReadOnly", assureReadOnly.toString()); - if (validConnectionTimeout != null) - props.put("validConnectionTimeout", validConnectionTimeout.toString()); - if (loadBalanceBlacklistTimeout != null) - props.put("loadBalanceBlacklistTimeout", loadBalanceBlacklistTimeout.toString()); - if (failoverLoopRetries != null) - props.put("failoverLoopRetries", failoverLoopRetries.toString()); - if (allowMultiQueries != null) props.put("allowMultiQueries", allowMultiQueries.toString()); - if (allowLocalInfile != null) props.put("allowLocalInfile", allowLocalInfile.toString()); - - // Character set - if (collation != null) props.put("collation", collation); - if (useMysqlMetadata != null) props.put("useMysqlMetadata", useMysqlMetadata.toString()); - if (nullCatalogMeansCurrent != null) - props.put("nullCatalogMeansCurrent", nullCatalogMeansCurrent); - if (blankTableNameMeta != null) props.put("blankTableNameMeta", blankTableNameMeta.toString()); - if (databaseTerm != null) props.put("databaseTerm", databaseTerm.toString()); - if (createDatabaseIfNotExist != null) - props.put("createDatabaseIfNotExist", createDatabaseIfNotExist.toString()); - - // Timezone - if (serverTimezone != null) props.put("serverTimezone", serverTimezone); - if (forceConnectionTimeZoneToSession != null) - props.put("forceConnectionTimeZoneToSession", forceConnectionTimeZoneToSession.toString()); - if (useLegacyDatetimeCode != null) - props.put("useLegacyDatetimeCode", useLegacyDatetimeCode.toString()); - if (useTimezone != null) props.put("useTimezone", useTimezone.toString()); - - // Misc - 
if (maxAllowedPacket != null) props.put("maxAllowedPacket", maxAllowedPacket.toString()); - if (allowPublicKeyRetrieval != null) - props.put("allowPublicKeyRetrieval", allowPublicKeyRetrieval.toString()); - if (rsaPublicKey != null) props.put("rsaPublicKey", rsaPublicKey); - if (cachingRsaPublicKey != null) - props.put("cachingRsaPublicKey", cachingRsaPublicKey.toString()); - if (serverRsaPublicKeyFile != null) props.put("serverRsaPublicKeyFile", serverRsaPublicKeyFile); - if (geometryDefaultType != null) props.put("geometryDefaultType", geometryDefaultType); - if (restrictedAuth != null) props.put("restrictedAuth", restrictedAuth.toString()); - if (connectionCollation != null) props.put("connectionCollation", connectionCollation); - if (permitMysqlScheme != null) props.put("permitMysqlScheme", permitMysqlScheme.toString()); - if (credentialType != null) props.put("credentialType", credentialType); - if (ensureSocketState != null) props.put("ensureSocketState", ensureSocketState.toString()); - - props.putAll(extraProperties); - return props; - } - - /** Builder for MariaDbConfig with typed methods for all JDBC driver properties. 
*/ - public static final class Builder { - private final String host; - private final int port; - private final String database; - private final String username; - private final String password; - - // SSL/TLS - private MariaSslMode sslMode; - private String serverSslCert; - private String keyStore; - private String keyStorePassword; - private String keyStoreType; - private String trustStore; - private String trustStorePassword; - private String trustStoreType; - private String enabledSslCipherSuites; - private String enabledSslProtocolSuites; - private Boolean disableSslHostnameVerification; - - // Performance - private Boolean useBulkStmts; - private Boolean useBulkStmtsForInserts; - private Boolean rewriteBatchedStatements; - private Boolean cachePrepStmts; - private Integer prepStmtCacheSize; - private Integer prepStmtCacheSqlLimit; - private Boolean useServerPrepStmts; - private Boolean useCompression; - private Integer defaultFetchSize; - private Boolean useReadAheadInput; - private Boolean cacheCallableStmts; - private Integer callableStmtCacheSize; - private Boolean useBatchMultiSend; - private Integer useBatchMultiSendNumber; - - // Timeouts - private Integer connectTimeout; - private Integer socketTimeout; - private Integer queryTimeout; - - // TCP - private Boolean tcpKeepAlive; - private Integer tcpKeepCount; - private Integer tcpKeepIdle; - private Integer tcpKeepInterval; - private Boolean tcpNoDelay; - private Boolean tcpAbortiveClose; - - // Pool - private Boolean pool; - private String poolName; - private Integer maxPoolSize; - private Integer minPoolSize; - private Integer maxIdleTime; - private Boolean staticGlobal; - private Boolean poolValidMinDelay; - private Boolean registerJmxPool; - - // Connection - private Boolean autoReconnect; - private String connectionAttributes; - private String sessionVariables; - private String initSql; - private Boolean localSocket; - private String pipe; - private Boolean tinyInt1isBit; - private Boolean 
yearIsDateType; - private Boolean dumpQueriesOnException; - private Boolean includeInnodbStatusInDeadlockExceptions; - private Boolean includeThreadDumpInDeadlockExceptions; - private Integer retriesAllDown; - private String galeraAllowedState; - private Boolean transactionReplay; - - // Logging - private Boolean log; - private String logSlowQueries; - private Long slowQueryThresholdNanos; - private Integer maxQuerySizeToLog; - private Boolean profileSql; - - // High Availability - private Boolean assureReadOnly; - private Integer validConnectionTimeout; - private Integer loadBalanceBlacklistTimeout; - private Integer failoverLoopRetries; - private Boolean allowMultiQueries; - private Boolean allowLocalInfile; - - // Character set - private String collation; - private Boolean useMysqlMetadata; - private String nullCatalogMeansCurrent; - private Boolean blankTableNameMeta; - private Boolean databaseTerm; - private Boolean createDatabaseIfNotExist; - - // Timezone - private String serverTimezone; - private Boolean forceConnectionTimeZoneToSession; - private Boolean useLegacyDatetimeCode; - private Boolean useTimezone; - - // Misc - private Integer maxAllowedPacket; - private Boolean allowPublicKeyRetrieval; - private String rsaPublicKey; - private Boolean cachingRsaPublicKey; - private String serverRsaPublicKeyFile; - private String geometryDefaultType; - private Boolean restrictedAuth; - private String connectionCollation; - private Boolean permitMysqlScheme; - private String credentialType; - private Boolean ensureSocketState; - - private final Map extraProperties = new HashMap<>(); - - private Builder(String host, int port, String database, String username, String password) { - this.host = host; - this.port = port; - this.database = database; - this.username = username; - this.password = password; - } - - // ==================== SSL/TLS ==================== - - /** - * SSL mode for connection security. Driver default: disable. 
- * - * @param sslMode SSL mode - * @return this builder - */ - public Builder sslMode(MariaSslMode sslMode) { - this.sslMode = sslMode; - return this; - } - - /** - * Path to server SSL certificate. Driver default: null. - * - * @param serverSslCert path to certificate file - * @return this builder - */ - public Builder serverSslCert(String serverSslCert) { - this.serverSslCert = serverSslCert; - return this; - } - - /** - * Path to client key store. Driver default: null. - * - * @param keyStore path to key store file - * @return this builder - */ - public Builder keyStore(String keyStore) { - this.keyStore = keyStore; - return this; - } - - /** - * Password for key store. Driver default: null. - * - * @param keyStorePassword key store password - * @return this builder - */ - public Builder keyStorePassword(String keyStorePassword) { - this.keyStorePassword = keyStorePassword; - return this; - } - - /** - * Key store type. Driver default: JKS. - * - * @param keyStoreType key store type - * @return this builder - */ - public Builder keyStoreType(String keyStoreType) { - this.keyStoreType = keyStoreType; - return this; - } - - /** - * Path to trust store. Driver default: null. - * - * @param trustStore path to trust store file - * @return this builder - */ - public Builder trustStore(String trustStore) { - this.trustStore = trustStore; - return this; - } - - /** - * Password for trust store. Driver default: null. - * - * @param trustStorePassword trust store password - * @return this builder - */ - public Builder trustStorePassword(String trustStorePassword) { - this.trustStorePassword = trustStorePassword; - return this; - } - - /** - * Trust store type. Driver default: JKS. - * - * @param trustStoreType trust store type - * @return this builder - */ - public Builder trustStoreType(String trustStoreType) { - this.trustStoreType = trustStoreType; - return this; - } - - /** - * Enabled SSL cipher suites. Driver default: null (all supported). 
- * - * @param enabledSslCipherSuites comma-separated cipher suite names - * @return this builder - */ - public Builder enabledSslCipherSuites(String enabledSslCipherSuites) { - this.enabledSslCipherSuites = enabledSslCipherSuites; - return this; - } - - /** - * Enabled SSL protocol versions. Driver default: null (TLSv1.2, TLSv1.3). - * - * @param enabledSslProtocolSuites comma-separated protocol names - * @return this builder - */ - public Builder enabledSslProtocolSuites(String enabledSslProtocolSuites) { - this.enabledSslProtocolSuites = enabledSslProtocolSuites; - return this; - } - - /** - * Disable SSL hostname verification. Driver default: false. - * - * @param disableSslHostnameVerification true to disable - * @return this builder - */ - public Builder disableSslHostnameVerification(boolean disableSslHostnameVerification) { - this.disableSslHostnameVerification = disableSslHostnameVerification; - return this; - } - - // ==================== PERFORMANCE ==================== - - /** - * Use bulk statements for batch operations. Driver default: true. - * - * @param useBulkStmts true to enable - * @return this builder - */ - public Builder useBulkStmts(boolean useBulkStmts) { - this.useBulkStmts = useBulkStmts; - return this; - } - - /** - * Use bulk statements for INSERT only. Driver default: false. - * - * @param useBulkStmtsForInserts true to enable - * @return this builder - */ - public Builder useBulkStmtsForInserts(boolean useBulkStmtsForInserts) { - this.useBulkStmtsForInserts = useBulkStmtsForInserts; - return this; - } - - /** - * Rewrite batch statements into multi-value inserts. Driver default: false. - * - * @param rewriteBatchedStatements true to enable - * @return this builder - */ - public Builder rewriteBatchedStatements(boolean rewriteBatchedStatements) { - this.rewriteBatchedStatements = rewriteBatchedStatements; - return this; - } - - /** - * Cache prepared statements. Driver default: true. 
- * - * @param cachePrepStmts true to enable - * @return this builder - */ - public Builder cachePrepStmts(boolean cachePrepStmts) { - this.cachePrepStmts = cachePrepStmts; - return this; - } - - /** - * Prepared statement cache size. Driver default: 250. - * - * @param prepStmtCacheSize cache size - * @return this builder - */ - public Builder prepStmtCacheSize(int prepStmtCacheSize) { - this.prepStmtCacheSize = prepStmtCacheSize; - return this; - } - - /** - * Maximum SQL length for cached statements. Driver default: 2048. - * - * @param prepStmtCacheSqlLimit maximum SQL length - * @return this builder - */ - public Builder prepStmtCacheSqlLimit(int prepStmtCacheSqlLimit) { - this.prepStmtCacheSqlLimit = prepStmtCacheSqlLimit; - return this; - } - - /** - * Use server-side prepared statements. Driver default: false. - * - * @param useServerPrepStmts true to enable - * @return this builder - */ - public Builder useServerPrepStmts(boolean useServerPrepStmts) { - this.useServerPrepStmts = useServerPrepStmts; - return this; - } - - /** - * Enable protocol compression. Driver default: false. - * - * @param useCompression true to enable - * @return this builder - */ - public Builder useCompression(boolean useCompression) { - this.useCompression = useCompression; - return this; - } - - /** - * Default fetch size for result sets. Driver default: 0 (fetch all). - * - * @param defaultFetchSize fetch size - * @return this builder - */ - public Builder defaultFetchSize(int defaultFetchSize) { - this.defaultFetchSize = defaultFetchSize; - return this; - } - - /** - * Use read-ahead input buffering. Driver default: true. - * - * @param useReadAheadInput true to enable - * @return this builder - */ - public Builder useReadAheadInput(boolean useReadAheadInput) { - this.useReadAheadInput = useReadAheadInput; - return this; - } - - /** - * Cache callable statements. Driver default: true. 
- * - * @param cacheCallableStmts true to enable - * @return this builder - */ - public Builder cacheCallableStmts(boolean cacheCallableStmts) { - this.cacheCallableStmts = cacheCallableStmts; - return this; - } - - /** - * Callable statement cache size. Driver default: 150. - * - * @param callableStmtCacheSize cache size - * @return this builder - */ - public Builder callableStmtCacheSize(int callableStmtCacheSize) { - this.callableStmtCacheSize = callableStmtCacheSize; - return this; - } - - /** - * Send multiple statements in batch. Driver default: true. - * - * @param useBatchMultiSend true to enable - * @return this builder - */ - public Builder useBatchMultiSend(boolean useBatchMultiSend) { - this.useBatchMultiSend = useBatchMultiSend; - return this; - } - - /** - * Maximum statements per batch send. Driver default: 100. - * - * @param useBatchMultiSendNumber batch size - * @return this builder - */ - public Builder useBatchMultiSendNumber(int useBatchMultiSendNumber) { - this.useBatchMultiSendNumber = useBatchMultiSendNumber; - return this; - } - - // ==================== TIMEOUTS ==================== - - /** - * Connection timeout in milliseconds. Driver default: 30000. - * - * @param connectTimeout timeout in milliseconds - * @return this builder - */ - public Builder connectTimeout(int connectTimeout) { - this.connectTimeout = connectTimeout; - return this; - } - - /** - * Socket timeout in milliseconds. Driver default: 0 (unlimited). - * - * @param socketTimeout timeout in milliseconds - * @return this builder - */ - public Builder socketTimeout(int socketTimeout) { - this.socketTimeout = socketTimeout; - return this; - } - - /** - * Query timeout in seconds. Driver default: 0 (unlimited). - * - * @param queryTimeout timeout in seconds - * @return this builder - */ - public Builder queryTimeout(int queryTimeout) { - this.queryTimeout = queryTimeout; - return this; - } - - // ==================== TCP ==================== - - /** - * Enable TCP keepalive. 
Driver default: true. - * - * @param tcpKeepAlive true to enable - * @return this builder - */ - public Builder tcpKeepAlive(boolean tcpKeepAlive) { - this.tcpKeepAlive = tcpKeepAlive; - return this; - } - - /** - * TCP keepalive retry count. Driver default: 6. - * - * @param tcpKeepCount retry count - * @return this builder - */ - public Builder tcpKeepCount(int tcpKeepCount) { - this.tcpKeepCount = tcpKeepCount; - return this; - } - - /** - * TCP keepalive idle time in seconds. Driver default: 60. - * - * @param tcpKeepIdle idle time - * @return this builder - */ - public Builder tcpKeepIdle(int tcpKeepIdle) { - this.tcpKeepIdle = tcpKeepIdle; - return this; - } - - /** - * TCP keepalive interval in seconds. Driver default: 10. - * - * @param tcpKeepInterval interval - * @return this builder - */ - public Builder tcpKeepInterval(int tcpKeepInterval) { - this.tcpKeepInterval = tcpKeepInterval; - return this; - } - - /** - * Disable Nagle's algorithm. Driver default: true. - * - * @param tcpNoDelay true to disable Nagle - * @return this builder - */ - public Builder tcpNoDelay(boolean tcpNoDelay) { - this.tcpNoDelay = tcpNoDelay; - return this; - } - - /** - * Use abortive close. Driver default: false. - * - * @param tcpAbortiveClose true to enable - * @return this builder - */ - public Builder tcpAbortiveClose(boolean tcpAbortiveClose) { - this.tcpAbortiveClose = tcpAbortiveClose; - return this; - } - - // ==================== POOL (Driver-internal) ==================== - - /** - * Enable driver-internal connection pooling. Driver default: false. - * - * @param pool true to enable - * @return this builder - */ - public Builder pool(boolean pool) { - this.pool = pool; - return this; - } - - /** - * Pool name for JMX registration. Driver default: auto-generated. - * - * @param poolName pool name - * @return this builder - */ - public Builder poolName(String poolName) { - this.poolName = poolName; - return this; - } - - /** - * Maximum pool size. Driver default: 8. 
- * - * @param maxPoolSize maximum connections - * @return this builder - */ - public Builder maxPoolSize(int maxPoolSize) { - this.maxPoolSize = maxPoolSize; - return this; - } - - /** - * Minimum pool size. Driver default: maxPoolSize. - * - * @param minPoolSize minimum connections - * @return this builder - */ - public Builder minPoolSize(int minPoolSize) { - this.minPoolSize = minPoolSize; - return this; - } - - /** - * Maximum idle time in seconds. Driver default: 600. - * - * @param maxIdleTime idle time - * @return this builder - */ - public Builder maxIdleTime(int maxIdleTime) { - this.maxIdleTime = maxIdleTime; - return this; - } - - /** - * Use static global pool. Driver default: false. - * - * @param staticGlobal true to use global pool - * @return this builder - */ - public Builder staticGlobal(boolean staticGlobal) { - this.staticGlobal = staticGlobal; - return this; - } - - /** - * Minimum delay between validations. Driver default: true. - * - * @param poolValidMinDelay true to enforce delay - * @return this builder - */ - public Builder poolValidMinDelay(boolean poolValidMinDelay) { - this.poolValidMinDelay = poolValidMinDelay; - return this; - } - - /** - * Register pool with JMX. Driver default: true. - * - * @param registerJmxPool true to register - * @return this builder - */ - public Builder registerJmxPool(boolean registerJmxPool) { - this.registerJmxPool = registerJmxPool; - return this; - } - - // ==================== CONNECTION ==================== - - /** - * Auto-reconnect on connection loss. Driver default: false. - * - * @param autoReconnect true to enable - * @return this builder - */ - public Builder autoReconnect(boolean autoReconnect) { - this.autoReconnect = autoReconnect; - return this; - } - - /** - * Connection attributes for server. Driver default: null. 
- * - * @param connectionAttributes comma-separated key=value pairs - * @return this builder - */ - public Builder connectionAttributes(String connectionAttributes) { - this.connectionAttributes = connectionAttributes; - return this; - } - - /** - * Session variables to set on connect. Driver default: null. - * - * @param sessionVariables comma-separated var=value pairs - * @return this builder - */ - public Builder sessionVariables(String sessionVariables) { - this.sessionVariables = sessionVariables; - return this; - } - - /** - * SQL to execute on connect. Driver default: null. - * - * @param initSql initialization SQL - * @return this builder - */ - public Builder initSql(String initSql) { - this.initSql = initSql; - return this; - } - - /** - * Use Unix local socket. Driver default: false. - * - * @param localSocket true to use local socket - * @return this builder - */ - public Builder localSocket(boolean localSocket) { - this.localSocket = localSocket; - return this; - } - - /** - * Windows named pipe path. Driver default: null. - * - * @param pipe pipe path - * @return this builder - */ - public Builder pipe(String pipe) { - this.pipe = pipe; - return this; - } - - /** - * Map TINYINT(1) to boolean. Driver default: true. - * - * @param tinyInt1isBit true to map to boolean - * @return this builder - */ - public Builder tinyInt1isBit(boolean tinyInt1isBit) { - this.tinyInt1isBit = tinyInt1isBit; - return this; - } - - /** - * Map YEAR to Date. Driver default: true. - * - * @param yearIsDateType true to map to Date - * @return this builder - */ - public Builder yearIsDateType(boolean yearIsDateType) { - this.yearIsDateType = yearIsDateType; - return this; - } - - /** - * Dump queries in exception messages. Driver default: false. 
- * - * @param dumpQueriesOnException true to dump - * @return this builder - */ - public Builder dumpQueriesOnException(boolean dumpQueriesOnException) { - this.dumpQueriesOnException = dumpQueriesOnException; - return this; - } - - /** - * Include InnoDB status in deadlock exceptions. Driver default: false. - * - * @param includeInnodbStatusInDeadlockExceptions true to include - * @return this builder - */ - public Builder includeInnodbStatusInDeadlockExceptions( - boolean includeInnodbStatusInDeadlockExceptions) { - this.includeInnodbStatusInDeadlockExceptions = includeInnodbStatusInDeadlockExceptions; - return this; - } - - /** - * Include thread dump in deadlock exceptions. Driver default: false. - * - * @param includeThreadDumpInDeadlockExceptions true to include - * @return this builder - */ - public Builder includeThreadDumpInDeadlockExceptions( - boolean includeThreadDumpInDeadlockExceptions) { - this.includeThreadDumpInDeadlockExceptions = includeThreadDumpInDeadlockExceptions; - return this; - } - - /** - * Retries when all hosts are down. Driver default: 120. - * - * @param retriesAllDown retry count - * @return this builder - */ - public Builder retriesAllDown(int retriesAllDown) { - this.retriesAllDown = retriesAllDown; - return this; - } - - /** - * Allowed Galera cluster states. Driver default: null. - * - * @param galeraAllowedState comma-separated states - * @return this builder - */ - public Builder galeraAllowedState(String galeraAllowedState) { - this.galeraAllowedState = galeraAllowedState; - return this; - } - - /** - * Enable transaction replay on failover. Driver default: false. - * - * @param transactionReplay true to enable - * @return this builder - */ - public Builder transactionReplay(boolean transactionReplay) { - this.transactionReplay = transactionReplay; - return this; - } - - // ==================== LOGGING ==================== - - /** - * Enable logging. Driver default: false. 
- * - * @param log true to enable - * @return this builder - */ - public Builder log(boolean log) { - this.log = log; - return this; - } - - /** - * Log slow queries. Driver default: null. - * - * @param logSlowQueries "true" or threshold in ms - * @return this builder - */ - public Builder logSlowQueries(String logSlowQueries) { - this.logSlowQueries = logSlowQueries; - return this; - } - - /** - * Slow query threshold in nanoseconds. Driver default: null. - * - * @param slowQueryThresholdNanos threshold in nanoseconds - * @return this builder - */ - public Builder slowQueryThresholdNanos(long slowQueryThresholdNanos) { - this.slowQueryThresholdNanos = slowQueryThresholdNanos; - return this; - } - - /** - * Maximum query size to log. Driver default: 1024. - * - * @param maxQuerySizeToLog max characters - * @return this builder - */ - public Builder maxQuerySizeToLog(int maxQuerySizeToLog) { - this.maxQuerySizeToLog = maxQuerySizeToLog; - return this; - } - - /** - * Enable SQL profiling. Driver default: false. - * - * @param profileSql true to enable - * @return this builder - */ - public Builder profileSql(boolean profileSql) { - this.profileSql = profileSql; - return this; - } - - // ==================== HIGH AVAILABILITY ==================== - - /** - * Ensure read-only on replica. Driver default: false. - * - * @param assureReadOnly true to ensure - * @return this builder - */ - public Builder assureReadOnly(boolean assureReadOnly) { - this.assureReadOnly = assureReadOnly; - return this; - } - - /** - * Valid connection timeout in seconds. Driver default: 0. - * - * @param validConnectionTimeout timeout - * @return this builder - */ - public Builder validConnectionTimeout(int validConnectionTimeout) { - this.validConnectionTimeout = validConnectionTimeout; - return this; - } - - /** - * Load balance blacklist timeout in seconds. Driver default: 50. 
- * - * @param loadBalanceBlacklistTimeout timeout - * @return this builder - */ - public Builder loadBalanceBlacklistTimeout(int loadBalanceBlacklistTimeout) { - this.loadBalanceBlacklistTimeout = loadBalanceBlacklistTimeout; - return this; - } - - /** - * Failover loop retries. Driver default: 120. - * - * @param failoverLoopRetries retry count - * @return this builder - */ - public Builder failoverLoopRetries(int failoverLoopRetries) { - this.failoverLoopRetries = failoverLoopRetries; - return this; - } - - /** - * Allow multiple statements per query. Driver default: false. - * - * @param allowMultiQueries true to allow - * @return this builder - */ - public Builder allowMultiQueries(boolean allowMultiQueries) { - this.allowMultiQueries = allowMultiQueries; - return this; - } - - /** - * Allow LOAD DATA LOCAL INFILE. Driver default: false. - * - * @param allowLocalInfile true to allow - * @return this builder - */ - public Builder allowLocalInfile(boolean allowLocalInfile) { - this.allowLocalInfile = allowLocalInfile; - return this; - } - - // ==================== CHARACTER SET ==================== - - /** - * Connection collation. Driver default: null. - * - * @param collation collation name - * @return this builder - */ - public Builder collation(String collation) { - this.collation = collation; - return this; - } - - /** - * Use MySQL metadata mode. Driver default: false. - * - * @param useMysqlMetadata true to use MySQL mode - * @return this builder - */ - public Builder useMysqlMetadata(boolean useMysqlMetadata) { - this.useMysqlMetadata = useMysqlMetadata; - return this; - } - - /** - * Null catalog means current database. Driver default: null. - * - * @param nullCatalogMeansCurrent "true" or "false" - * @return this builder - */ - public Builder nullCatalogMeansCurrent(String nullCatalogMeansCurrent) { - this.nullCatalogMeansCurrent = nullCatalogMeansCurrent; - return this; - } - - /** - * Blank table name in metadata. Driver default: false. 
- * - * @param blankTableNameMeta true to blank - * @return this builder - */ - public Builder blankTableNameMeta(boolean blankTableNameMeta) { - this.blankTableNameMeta = blankTableNameMeta; - return this; - } - - /** - * Database term handling. Driver default: false. - * - * @param databaseTerm true for database term mode - * @return this builder - */ - public Builder databaseTerm(boolean databaseTerm) { - this.databaseTerm = databaseTerm; - return this; - } - - /** - * Create database if not exists. Driver default: false. - * - * @param createDatabaseIfNotExist true to create - * @return this builder - */ - public Builder createDatabaseIfNotExist(boolean createDatabaseIfNotExist) { - this.createDatabaseIfNotExist = createDatabaseIfNotExist; - return this; - } - - // ==================== TIMEZONE ==================== - - /** - * Server timezone. Driver default: null (auto-detect). - * - * @param serverTimezone timezone ID - * @return this builder - */ - public Builder serverTimezone(String serverTimezone) { - this.serverTimezone = serverTimezone; - return this; - } - - /** - * Force connection timezone to session. Driver default: false. - * - * @param forceConnectionTimeZoneToSession true to force - * @return this builder - */ - public Builder forceConnectionTimeZoneToSession(boolean forceConnectionTimeZoneToSession) { - this.forceConnectionTimeZoneToSession = forceConnectionTimeZoneToSession; - return this; - } - - /** - * Use legacy datetime code. Driver default: false. - * - * @param useLegacyDatetimeCode true to use legacy - * @return this builder - */ - public Builder useLegacyDatetimeCode(boolean useLegacyDatetimeCode) { - this.useLegacyDatetimeCode = useLegacyDatetimeCode; - return this; - } - - /** - * Use timezone in date conversions. Driver default: false. 
- * - * @param useTimezone true to use timezone - * @return this builder - */ - public Builder useTimezone(boolean useTimezone) { - this.useTimezone = useTimezone; - return this; - } - - // ==================== MISC ==================== - - /** - * Maximum allowed packet size. Driver default: null. - * - * @param maxAllowedPacket packet size in bytes - * @return this builder - */ - public Builder maxAllowedPacket(int maxAllowedPacket) { - this.maxAllowedPacket = maxAllowedPacket; - return this; - } - - /** - * Allow public key retrieval for caching_sha2_password. Driver default: false. - * - * @param allowPublicKeyRetrieval true to allow - * @return this builder - */ - public Builder allowPublicKeyRetrieval(boolean allowPublicKeyRetrieval) { - this.allowPublicKeyRetrieval = allowPublicKeyRetrieval; - return this; - } - - /** - * RSA public key for authentication. Driver default: null. - * - * @param rsaPublicKey public key content - * @return this builder - */ - public Builder rsaPublicKey(String rsaPublicKey) { - this.rsaPublicKey = rsaPublicKey; - return this; - } - - /** - * Use caching RSA public key. Driver default: false. - * - * @param cachingRsaPublicKey true to cache - * @return this builder - */ - public Builder cachingRsaPublicKey(boolean cachingRsaPublicKey) { - this.cachingRsaPublicKey = cachingRsaPublicKey; - return this; - } - - /** - * Path to server RSA public key file. Driver default: null. - * - * @param serverRsaPublicKeyFile path to file - * @return this builder - */ - public Builder serverRsaPublicKeyFile(String serverRsaPublicKeyFile) { - this.serverRsaPublicKeyFile = serverRsaPublicKeyFile; - return this; - } - - /** - * Default geometry type class. Driver default: null. - * - * @param geometryDefaultType class name - * @return this builder - */ - public Builder geometryDefaultType(String geometryDefaultType) { - this.geometryDefaultType = geometryDefaultType; - return this; - } - - /** - * Restrict authentication methods. 
Driver default: false. - * - * @param restrictedAuth true to restrict - * @return this builder - */ - public Builder restrictedAuth(boolean restrictedAuth) { - this.restrictedAuth = restrictedAuth; - return this; - } - - /** - * Connection collation. Driver default: null. - * - * @param connectionCollation collation name - * @return this builder - */ - public Builder connectionCollation(String connectionCollation) { - this.connectionCollation = connectionCollation; - return this; - } - - /** - * Permit MySQL scheme in URL. Driver default: false. - * - * @param permitMysqlScheme true to permit - * @return this builder - */ - public Builder permitMysqlScheme(boolean permitMysqlScheme) { - this.permitMysqlScheme = permitMysqlScheme; - return this; - } - - /** - * Credential type for authentication. Driver default: null. - * - * @param credentialType credential type - * @return this builder - */ - public Builder credentialType(String credentialType) { - this.credentialType = credentialType; - return this; - } - - /** - * Ensure socket state before operations. Driver default: false. - * - * @param ensureSocketState true to ensure - * @return this builder - */ - public Builder ensureSocketState(boolean ensureSocketState) { - this.ensureSocketState = ensureSocketState; - return this; - } - - /** - * Set an arbitrary driver property. - * - * @param key property name - * @param value property value - * @return this builder - */ - public Builder property(String key, String value) { - this.extraProperties.put(key, value); - return this; - } - - /** - * Build the MariaDbConfig. 
- * - * @return immutable MariaDbConfig - */ - public MariaDbConfig build() { - return new MariaDbConfig(this); - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/mariadb/MariaSslMode.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/mariadb/MariaSslMode.java deleted file mode 100644 index a01450188f..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/mariadb/MariaSslMode.java +++ /dev/null @@ -1,23 +0,0 @@ -package dev.typr.foundations.connect.mariadb; - -/** MariaDB SSL mode for connection security. */ -public enum MariaSslMode { - /** Do not use SSL. */ - DISABLE("disable"), - /** Use SSL if server supports it, but don't fail if not. */ - TRUST("trust"), - /** Verify server certificate. */ - VERIFY_CA("verify-ca"), - /** Verify server certificate and hostname. */ - VERIFY_FULL("verify-full"); - - private final String value; - - MariaSslMode(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/oracle/OracleConfig.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/oracle/OracleConfig.java deleted file mode 100644 index d9327883b2..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/oracle/OracleConfig.java +++ /dev/null @@ -1,1778 +0,0 @@ -package dev.typr.foundations.connect.oracle; - -import dev.typr.foundations.connect.DatabaseConfig; -import dev.typr.foundations.connect.DatabaseKind; -import java.util.HashMap; -import java.util.Map; - -/** - * Oracle database configuration with typed builder methods for all documented JDBC driver - * properties. - * - *
Properties are based on the Oracle JDBC driver documentation. - * - * @see Oracle JDBC - * Documentation - */ -public final class OracleConfig implements DatabaseConfig { - - private final String host; - private final int port; - private final String database; - private final String username; - private final String password; - - // Connection type properties - private final String serviceName; - private final Boolean useSid; - private final String tnsAdmin; - private final String tnsAlias; - - // Performance properties - private final Integer defaultRowPrefetch; - private final Integer defaultExecuteBatch; - private final Integer implicitStatementCacheSize; - private final Integer maxStatements; - private final Boolean implicitCachingEnabled; - private final Boolean explicitCachingEnabled; - private final Integer readTimeout; - private final Integer connectTimeout; - private final Integer maxCachedBufferSize; - private final Boolean processEscapes; - private final Integer statementCacheSize; - - // LOB properties - private final Boolean useFetchSizeWithLongColumn; - private final Integer lobPrefetchSize; - private final Boolean prefetchLOBs; - private final String lobStreamPosStandard; - private final Boolean tempBlobCleanUp; - private final Boolean tempClobCleanUp; - - // Batch properties - private final Integer batchSize; - private final Boolean useBatchMultiRetrieve; - private final Boolean enableBatchUpdates; - - // Metadata properties - private final Boolean remarksReporting; - private final Boolean includeSynonyms; - private final Boolean restrictGetTables; - private final Boolean accumulateBatchResult; - private final String defaultNChar; - - // Network properties - private final Integer tcpNoDelay; - private final Integer keepAlive; - private final Integer sendBufferSize; - private final Integer receiveBufferSize; - private final Boolean thinNetAllowPM; - private final String networkProtocol; - - // SSL/TLS properties - private final String sslServerCertDN; 
- private final String trustStore; - private final String trustStorePassword; - private final String trustStoreType; - private final String keyStore; - private final String keyStorePassword; - private final String keyStoreType; - private final String walletLocation; - private final String walletPassword; - private final Boolean sslServerDNMatch; - - // High Availability properties - private final Boolean fastConnectionFailover; - private final Integer onsConfiguration; - private final Integer retryCount; - private final Integer retryDelay; - private final Boolean implicitConnectionTimeout; - private final String fanEnabled; - private final String onsWalletLocation; - private final String onsWalletPassword; - private final Boolean onsNodes; - - // Timezone properties - private final String sessionTimeZone; - private final Boolean convertNCharLiterals; - - // Tracing/Debugging properties - private final Boolean traceLevel; - private final String traceFile; - private final Integer traceFileSize; - private final Integer traceMaxFiles; - private final Boolean logTraceEnabled; - - // Proxy authentication properties - private final String proxyClientName; - private final String proxyClientDN; - private final String proxyRoles; - private final String proxyPassword; - - // Java properties - private final Integer threadPoolSize; - private final String javaObjectTypeClass; - private final Boolean disableDefineColumnType; - private final Boolean allowNCharLiteral; - private final Boolean createDescriptorUseCurrentSchemaForSchemaName; - - // Statement properties - private final Boolean defaultStatementModeIsNonBlocking; - private final Boolean defaultExecuteAsync; - private final Boolean streamChunkSize; - - // Oracle-specific features - private final Boolean restrictedList; - private final Boolean sqlCl; - private final Boolean reportRemarks; - private final Boolean getPlSqlErrorFromServerOnProcedureCall; - private final Boolean plsqlCompilerWarnings; - private final String 
editionName; - private final Boolean internalLogon; - private final String connectionClassName; - private final Boolean enableScrollableResultSet; - private final Boolean enableReadOnlyResultSet; - private final Boolean enableCancelQueryOnClose; - - // Sharding properties - private final String shardingKey; - private final String superShardingKey; - - // XA properties - private final Boolean xaRecoveryEnabled; - private final Boolean xaTightlyCouple; - - // Connection validation properties - private final Boolean checkConnectionOnBorrow; - private final String validationSQL; - private final Integer validationTimeout; - private final Integer secondsToTrustIdleConnection; - private final Integer inactivityTimeout; - private final Integer abandonedConnectionTimeout; - private final Integer timeToLiveConnectionTimeout; - - // Escape hatch - private final Map<String, String> extraProperties; - - private OracleConfig(Builder b) { - this.host = b.host; - this.port = b.port; - this.database = b.database; - this.username = b.username; - this.password = b.password; - - // Connection type - this.serviceName = b.serviceName; - this.useSid = b.useSid; - this.tnsAdmin = b.tnsAdmin; - this.tnsAlias = b.tnsAlias; - - // Performance - this.defaultRowPrefetch = b.defaultRowPrefetch; - this.defaultExecuteBatch = b.defaultExecuteBatch; - this.implicitStatementCacheSize = b.implicitStatementCacheSize; - this.maxStatements = b.maxStatements; - this.implicitCachingEnabled = b.implicitCachingEnabled; - this.explicitCachingEnabled = b.explicitCachingEnabled; - this.readTimeout = b.readTimeout; - this.connectTimeout = b.connectTimeout; - this.maxCachedBufferSize = b.maxCachedBufferSize; - this.processEscapes = b.processEscapes; - this.statementCacheSize = b.statementCacheSize; - - // LOB - this.useFetchSizeWithLongColumn = b.useFetchSizeWithLongColumn; - this.lobPrefetchSize = b.lobPrefetchSize; - this.prefetchLOBs = b.prefetchLOBs; - this.lobStreamPosStandard = b.lobStreamPosStandard; - 
this.tempBlobCleanUp = b.tempBlobCleanUp; - this.tempClobCleanUp = b.tempClobCleanUp; - - // Batch - this.batchSize = b.batchSize; - this.useBatchMultiRetrieve = b.useBatchMultiRetrieve; - this.enableBatchUpdates = b.enableBatchUpdates; - - // Metadata - this.remarksReporting = b.remarksReporting; - this.includeSynonyms = b.includeSynonyms; - this.restrictGetTables = b.restrictGetTables; - this.accumulateBatchResult = b.accumulateBatchResult; - this.defaultNChar = b.defaultNChar; - - // Network - this.tcpNoDelay = b.tcpNoDelay; - this.keepAlive = b.keepAlive; - this.sendBufferSize = b.sendBufferSize; - this.receiveBufferSize = b.receiveBufferSize; - this.thinNetAllowPM = b.thinNetAllowPM; - this.networkProtocol = b.networkProtocol; - - // SSL/TLS - this.sslServerCertDN = b.sslServerCertDN; - this.trustStore = b.trustStore; - this.trustStorePassword = b.trustStorePassword; - this.trustStoreType = b.trustStoreType; - this.keyStore = b.keyStore; - this.keyStorePassword = b.keyStorePassword; - this.keyStoreType = b.keyStoreType; - this.walletLocation = b.walletLocation; - this.walletPassword = b.walletPassword; - this.sslServerDNMatch = b.sslServerDNMatch; - - // High Availability - this.fastConnectionFailover = b.fastConnectionFailover; - this.onsConfiguration = b.onsConfiguration; - this.retryCount = b.retryCount; - this.retryDelay = b.retryDelay; - this.implicitConnectionTimeout = b.implicitConnectionTimeout; - this.fanEnabled = b.fanEnabled; - this.onsWalletLocation = b.onsWalletLocation; - this.onsWalletPassword = b.onsWalletPassword; - this.onsNodes = b.onsNodes; - - // Timezone - this.sessionTimeZone = b.sessionTimeZone; - this.convertNCharLiterals = b.convertNCharLiterals; - - // Tracing - this.traceLevel = b.traceLevel; - this.traceFile = b.traceFile; - this.traceFileSize = b.traceFileSize; - this.traceMaxFiles = b.traceMaxFiles; - this.logTraceEnabled = b.logTraceEnabled; - - // Proxy - this.proxyClientName = b.proxyClientName; - this.proxyClientDN = 
b.proxyClientDN; - this.proxyRoles = b.proxyRoles; - this.proxyPassword = b.proxyPassword; - - // Java - this.threadPoolSize = b.threadPoolSize; - this.javaObjectTypeClass = b.javaObjectTypeClass; - this.disableDefineColumnType = b.disableDefineColumnType; - this.allowNCharLiteral = b.allowNCharLiteral; - this.createDescriptorUseCurrentSchemaForSchemaName = - b.createDescriptorUseCurrentSchemaForSchemaName; - - // Statement - this.defaultStatementModeIsNonBlocking = b.defaultStatementModeIsNonBlocking; - this.defaultExecuteAsync = b.defaultExecuteAsync; - this.streamChunkSize = b.streamChunkSize; - - // Oracle-specific - this.restrictedList = b.restrictedList; - this.sqlCl = b.sqlCl; - this.reportRemarks = b.reportRemarks; - this.getPlSqlErrorFromServerOnProcedureCall = b.getPlSqlErrorFromServerOnProcedureCall; - this.plsqlCompilerWarnings = b.plsqlCompilerWarnings; - this.editionName = b.editionName; - this.internalLogon = b.internalLogon; - this.connectionClassName = b.connectionClassName; - this.enableScrollableResultSet = b.enableScrollableResultSet; - this.enableReadOnlyResultSet = b.enableReadOnlyResultSet; - this.enableCancelQueryOnClose = b.enableCancelQueryOnClose; - - // Sharding - this.shardingKey = b.shardingKey; - this.superShardingKey = b.superShardingKey; - - // XA - this.xaRecoveryEnabled = b.xaRecoveryEnabled; - this.xaTightlyCouple = b.xaTightlyCouple; - - // Connection validation - this.checkConnectionOnBorrow = b.checkConnectionOnBorrow; - this.validationSQL = b.validationSQL; - this.validationTimeout = b.validationTimeout; - this.secondsToTrustIdleConnection = b.secondsToTrustIdleConnection; - this.inactivityTimeout = b.inactivityTimeout; - this.abandonedConnectionTimeout = b.abandonedConnectionTimeout; - this.timeToLiveConnectionTimeout = b.timeToLiveConnectionTimeout; - - this.extraProperties = Map.copyOf(b.extraProperties); - } - - /** - * Create a new builder with required connection parameters using SID. 
- * - * @param host Oracle server hostname - * @param port Oracle server port (typically 1521) - * @param database SID or service name - * @param username username for authentication - * @param password password for authentication - * @return a new builder - */ - public static Builder builder( - String host, int port, String database, String username, String password) { - return new Builder(host, port, database, username, password); - } - - @Override - public String jdbcUrl() { - if (tnsAlias != null) { - return "jdbc:oracle:thin:@" + tnsAlias; - } else if (serviceName != null) { - return "jdbc:oracle:thin:@//" + host + ":" + port + "/" + serviceName; - } else { - return "jdbc:oracle:thin:@" + host + ":" + port + ":" + database; - } - } - - @Override - public String username() { - return username; - } - - @Override - public String password() { - return password; - } - - @Override - public DatabaseKind kind() { - return DatabaseKind.ORACLE; - } - - @Override - public Map<String, String> driverProperties() { - Map<String, String> props = new HashMap<>(); - - // Performance - if (defaultRowPrefetch != null) - props.put("oracle.jdbc.defaultRowPrefetch", defaultRowPrefetch.toString()); - if (defaultExecuteBatch != null) - props.put("oracle.jdbc.defaultExecuteBatch", defaultExecuteBatch.toString()); - if (implicitStatementCacheSize != null) - props.put("oracle.jdbc.implicitStatementCacheSize", implicitStatementCacheSize.toString()); - if (maxStatements != null) props.put("oracle.jdbc.maxStatements", maxStatements.toString()); - if (implicitCachingEnabled != null) - props.put("oracle.jdbc.implicitCachingEnabled", implicitCachingEnabled.toString()); - if (explicitCachingEnabled != null) - props.put("oracle.jdbc.explicitCachingEnabled", explicitCachingEnabled.toString()); - if (readTimeout != null) - props.put("oracle.net.READ_TIMEOUT", String.valueOf(readTimeout * 1000)); - if (connectTimeout != null) - props.put("oracle.net.CONNECT_TIMEOUT", String.valueOf(connectTimeout * 1000)); - if 
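The `jdbcUrl()` branching above yields three distinct thin-driver URL shapes, with TNS alias taking precedence over service name, and SID as the fallback. A minimal standalone sketch of that selection logic (the helper class and signature here are hypothetical, not part of the patch):

```java
/** Sketch of OracleConfig.jdbcUrl()'s URL selection; class and signature are hypothetical. */
public final class OracleUrlSketch {

  // Precedence mirrors the patch: TNS alias first, then service name, then legacy SID.
  public static String jdbcUrl(
      String host, int port, String database, String serviceName, String tnsAlias) {
    if (tnsAlias != null) {
      return "jdbc:oracle:thin:@" + tnsAlias; // resolved via tnsnames.ora
    } else if (serviceName != null) {
      return "jdbc:oracle:thin:@//" + host + ":" + port + "/" + serviceName; // service name form
    } else {
      return "jdbc:oracle:thin:@" + host + ":" + port + ":" + database; // legacy SID form
    }
  }

  public static void main(String[] args) {
    System.out.println(jdbcUrl("db.example.com", 1521, "ORCL", null, null));
    System.out.println(jdbcUrl("db.example.com", 1521, "ORCL", "sales.example.com", null));
    System.out.println(jdbcUrl("db.example.com", 1521, "ORCL", "sales.example.com", "PRODDB"));
  }
}
```

Note that a non-null `tnsAlias` makes the host/port arguments irrelevant, which is why the builder still requires them: they back the two fallback forms.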
(maxCachedBufferSize != null) - props.put("oracle.jdbc.maxCachedBufferSize", maxCachedBufferSize.toString()); - if (processEscapes != null) props.put("oracle.jdbc.processEscapes", processEscapes.toString()); - if (statementCacheSize != null) - props.put("oracle.jdbc.statementCacheSize", statementCacheSize.toString()); - - // LOB - if (useFetchSizeWithLongColumn != null) - props.put("oracle.jdbc.useFetchSizeWithLongColumn", useFetchSizeWithLongColumn.toString()); - if (lobPrefetchSize != null) - props.put("oracle.jdbc.lobPrefetchSize", lobPrefetchSize.toString()); - if (prefetchLOBs != null) props.put("oracle.jdbc.prefetchLOBs", prefetchLOBs.toString()); - if (lobStreamPosStandard != null) - props.put("oracle.jdbc.lobStreamPosStandard", lobStreamPosStandard); - if (tempBlobCleanUp != null) - props.put("oracle.jdbc.tempBlobCleanUp", tempBlobCleanUp.toString()); - if (tempClobCleanUp != null) - props.put("oracle.jdbc.tempClobCleanUp", tempClobCleanUp.toString()); - - // Batch - if (batchSize != null) props.put("oracle.jdbc.batchSize", batchSize.toString()); - if (useBatchMultiRetrieve != null) - props.put("oracle.jdbc.useBatchMultiRetrieve", useBatchMultiRetrieve.toString()); - if (enableBatchUpdates != null) - props.put("oracle.jdbc.enableBatchUpdates", enableBatchUpdates.toString()); - - // Metadata - if (remarksReporting != null) - props.put("oracle.jdbc.remarksReporting", remarksReporting.toString()); - if (includeSynonyms != null) - props.put("oracle.jdbc.includeSynonyms", includeSynonyms.toString()); - if (restrictGetTables != null) - props.put("oracle.jdbc.restrictGetTables", restrictGetTables.toString()); - if (accumulateBatchResult != null) - props.put("oracle.jdbc.accumulateBatchResult", accumulateBatchResult.toString()); - if (defaultNChar != null) props.put("oracle.jdbc.defaultNChar", defaultNChar); - - // Network - if (tcpNoDelay != null) props.put("oracle.net.tcp_nodelay", tcpNoDelay.toString()); - if (keepAlive != null) props.put("oracle.net.keepAlive", 
keepAlive.toString()); - if (sendBufferSize != null) props.put("oracle.net.sendBufferSize", sendBufferSize.toString()); - if (receiveBufferSize != null) - props.put("oracle.net.receiveBufferSize", receiveBufferSize.toString()); - if (thinNetAllowPM != null) props.put("oracle.jdbc.thinNetAllowPM", thinNetAllowPM.toString()); - if (networkProtocol != null) props.put("oracle.jdbc.networkProtocol", networkProtocol); - - // SSL/TLS - if (sslServerCertDN != null) props.put("oracle.net.ssl_server_cert_dn", sslServerCertDN); - if (trustStore != null) props.put("javax.net.ssl.trustStore", trustStore); - if (trustStorePassword != null) - props.put("javax.net.ssl.trustStorePassword", trustStorePassword); - if (trustStoreType != null) props.put("javax.net.ssl.trustStoreType", trustStoreType); - if (keyStore != null) props.put("javax.net.ssl.keyStore", keyStore); - if (keyStorePassword != null) props.put("javax.net.ssl.keyStorePassword", keyStorePassword); - if (keyStoreType != null) props.put("javax.net.ssl.keyStoreType", keyStoreType); - if (walletLocation != null) props.put("oracle.net.wallet_location", walletLocation); - if (walletPassword != null) props.put("oracle.net.wallet_password", walletPassword); - if (sslServerDNMatch != null) - props.put("oracle.net.ssl_server_dn_match", sslServerDNMatch.toString()); - - // High Availability - if (fastConnectionFailover != null) - props.put("oracle.jdbc.fastConnectionFailover", fastConnectionFailover.toString()); - if (onsConfiguration != null) - props.put("oracle.ons.configuration", onsConfiguration.toString()); - if (retryCount != null) props.put("oracle.jdbc.retryCount", retryCount.toString()); - if (retryDelay != null) props.put("oracle.jdbc.retryDelay", retryDelay.toString()); - if (implicitConnectionTimeout != null) - props.put("oracle.jdbc.implicitConnectionTimeout", implicitConnectionTimeout.toString()); - if (fanEnabled != null) props.put("oracle.jdbc.fanEnabled", fanEnabled); - if (onsWalletLocation != null) 
props.put("oracle.ons.walletLocation", onsWalletLocation); - if (onsWalletPassword != null) props.put("oracle.ons.walletPassword", onsWalletPassword); - if (onsNodes != null) props.put("oracle.ons.nodes", onsNodes.toString()); - - // Timezone - if (sessionTimeZone != null) props.put("oracle.jdbc.sessionTimeZone", sessionTimeZone); - if (convertNCharLiterals != null) - props.put("oracle.jdbc.convertNCharLiterals", convertNCharLiterals.toString()); - - // Tracing - if (traceLevel != null) props.put("oracle.jdbc.traceLevel", traceLevel.toString()); - if (traceFile != null) props.put("oracle.jdbc.traceFile", traceFile); - if (traceFileSize != null) props.put("oracle.jdbc.traceFileSize", traceFileSize.toString()); - if (traceMaxFiles != null) props.put("oracle.jdbc.traceMaxFiles", traceMaxFiles.toString()); - if (logTraceEnabled != null) - props.put("oracle.jdbc.logTraceEnabled", logTraceEnabled.toString()); - - // Proxy - if (proxyClientName != null) props.put("oracle.jdbc.proxyClientName", proxyClientName); - if (proxyClientDN != null) props.put("oracle.jdbc.proxyClientDN", proxyClientDN); - if (proxyRoles != null) props.put("oracle.jdbc.proxyRoles", proxyRoles); - if (proxyPassword != null) props.put("oracle.jdbc.proxyPassword", proxyPassword); - - // Java - if (threadPoolSize != null) props.put("oracle.jdbc.threadPoolSize", threadPoolSize.toString()); - if (javaObjectTypeClass != null) - props.put("oracle.jdbc.javaObjectTypeClass", javaObjectTypeClass); - if (disableDefineColumnType != null) - props.put("oracle.jdbc.disableDefineColumnType", disableDefineColumnType.toString()); - if (allowNCharLiteral != null) - props.put("oracle.jdbc.allowNCharLiteral", allowNCharLiteral.toString()); - if (createDescriptorUseCurrentSchemaForSchemaName != null) - props.put( - "oracle.jdbc.createDescriptorUseCurrentSchemaForSchemaName", - createDescriptorUseCurrentSchemaForSchemaName.toString()); - - // Statement - if (defaultStatementModeIsNonBlocking != null) - props.put( - 
"oracle.jdbc.defaultStatementModeIsNonBlocking", - defaultStatementModeIsNonBlocking.toString()); - if (defaultExecuteAsync != null) - props.put("oracle.jdbc.defaultExecuteAsync", defaultExecuteAsync.toString()); - if (streamChunkSize != null) - props.put("oracle.jdbc.streamChunkSize", streamChunkSize.toString()); - - // Oracle-specific - if (restrictedList != null) props.put("oracle.jdbc.restrictedList", restrictedList.toString()); - if (sqlCl != null) props.put("oracle.jdbc.sqlCl", sqlCl.toString()); - if (reportRemarks != null) props.put("oracle.jdbc.reportRemarks", reportRemarks.toString()); - if (getPlSqlErrorFromServerOnProcedureCall != null) - props.put( - "oracle.jdbc.getPlSqlErrorFromServerOnProcedureCall", - getPlSqlErrorFromServerOnProcedureCall.toString()); - if (plsqlCompilerWarnings != null) - props.put("oracle.jdbc.plsqlCompilerWarnings", plsqlCompilerWarnings.toString()); - if (editionName != null) props.put("oracle.jdbc.editionName", editionName); - if (internalLogon != null) props.put("internal_logon", internalLogon.toString()); - if (connectionClassName != null) - props.put("oracle.jdbc.connectionClassName", connectionClassName); - if (enableScrollableResultSet != null) - props.put("oracle.jdbc.enableScrollableResultSet", enableScrollableResultSet.toString()); - if (enableReadOnlyResultSet != null) - props.put("oracle.jdbc.enableReadOnlyResultSet", enableReadOnlyResultSet.toString()); - if (enableCancelQueryOnClose != null) - props.put("oracle.jdbc.enableCancelQueryOnClose", enableCancelQueryOnClose.toString()); - - // Sharding - if (shardingKey != null) props.put("oracle.jdbc.shardingKey", shardingKey); - if (superShardingKey != null) props.put("oracle.jdbc.superShardingKey", superShardingKey); - - // XA - if (xaRecoveryEnabled != null) - props.put("oracle.jdbc.xaRecoveryEnabled", xaRecoveryEnabled.toString()); - if (xaTightlyCouple != null) - props.put("oracle.jdbc.xaTightlyCouple", xaTightlyCouple.toString()); - - // Connection validation - if 
(checkConnectionOnBorrow != null) - props.put("oracle.jdbc.checkConnectionOnBorrow", checkConnectionOnBorrow.toString()); - if (validationSQL != null) props.put("oracle.jdbc.validationSQL", validationSQL); - if (validationTimeout != null) - props.put("oracle.jdbc.validationTimeout", validationTimeout.toString()); - if (secondsToTrustIdleConnection != null) - props.put( - "oracle.jdbc.secondsToTrustIdleConnection", secondsToTrustIdleConnection.toString()); - if (inactivityTimeout != null) - props.put("oracle.jdbc.inactivityTimeout", inactivityTimeout.toString()); - if (abandonedConnectionTimeout != null) - props.put("oracle.jdbc.abandonedConnectionTimeout", abandonedConnectionTimeout.toString()); - if (timeToLiveConnectionTimeout != null) - props.put("oracle.jdbc.timeToLiveConnectionTimeout", timeToLiveConnectionTimeout.toString()); - - props.putAll(extraProperties); - return props; - } - - /** Builder for OracleConfig with typed methods for all JDBC driver properties. */ - public static final class Builder { - private final String host; - private final int port; - private final String database; - private final String username; - private final String password; - - // Connection type - private String serviceName; - private Boolean useSid; - private String tnsAdmin; - private String tnsAlias; - - // Performance - private Integer defaultRowPrefetch; - private Integer defaultExecuteBatch; - private Integer implicitStatementCacheSize; - private Integer maxStatements; - private Boolean implicitCachingEnabled; - private Boolean explicitCachingEnabled; - private Integer readTimeout; - private Integer connectTimeout; - private Integer maxCachedBufferSize; - private Boolean processEscapes; - private Integer statementCacheSize; - - // LOB - private Boolean useFetchSizeWithLongColumn; - private Integer lobPrefetchSize; - private Boolean prefetchLOBs; - private String lobStreamPosStandard; - private Boolean tempBlobCleanUp; - private Boolean tempClobCleanUp; - - // Batch - 
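Two details of the `driverProperties()` body above are easy to miss: a key is emitted only when its field is non-null (so unset options fall back to driver defaults rather than being pinned), and the second-based `readTimeout`/`connectTimeout` are multiplied by 1000 because the `oracle.net.*` keys take milliseconds. A condensed self-contained sketch of that pattern (hypothetical class, two representative properties only):

```java
import java.util.HashMap;
import java.util.Map;

/** Condensed sketch of the null-guarded property assembly; class name is hypothetical. */
public final class DriverPropsSketch {
  private final Integer readTimeoutSeconds; // null = not set, key omitted entirely
  private final Integer defaultRowPrefetch;

  public DriverPropsSketch(Integer readTimeoutSeconds, Integer defaultRowPrefetch) {
    this.readTimeoutSeconds = readTimeoutSeconds;
    this.defaultRowPrefetch = defaultRowPrefetch;
  }

  public Map<String, String> driverProperties() {
    Map<String, String> props = new HashMap<>();
    // Config stores seconds; the oracle.net.* timeout keys expect milliseconds.
    if (readTimeoutSeconds != null)
      props.put("oracle.net.READ_TIMEOUT", String.valueOf(readTimeoutSeconds * 1000));
    // Most oracle.jdbc.* keys are passed through as plain string values.
    if (defaultRowPrefetch != null)
      props.put("oracle.jdbc.defaultRowPrefetch", defaultRowPrefetch.toString());
    return props;
  }
}
```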
private Integer batchSize; - private Boolean useBatchMultiRetrieve; - private Boolean enableBatchUpdates; - - // Metadata - private Boolean remarksReporting; - private Boolean includeSynonyms; - private Boolean restrictGetTables; - private Boolean accumulateBatchResult; - private String defaultNChar; - - // Network - private Integer tcpNoDelay; - private Integer keepAlive; - private Integer sendBufferSize; - private Integer receiveBufferSize; - private Boolean thinNetAllowPM; - private String networkProtocol; - - // SSL/TLS - private String sslServerCertDN; - private String trustStore; - private String trustStorePassword; - private String trustStoreType; - private String keyStore; - private String keyStorePassword; - private String keyStoreType; - private String walletLocation; - private String walletPassword; - private Boolean sslServerDNMatch; - - // High Availability - private Boolean fastConnectionFailover; - private Integer onsConfiguration; - private Integer retryCount; - private Integer retryDelay; - private Boolean implicitConnectionTimeout; - private String fanEnabled; - private String onsWalletLocation; - private String onsWalletPassword; - private Boolean onsNodes; - - // Timezone - private String sessionTimeZone; - private Boolean convertNCharLiterals; - - // Tracing - private Boolean traceLevel; - private String traceFile; - private Integer traceFileSize; - private Integer traceMaxFiles; - private Boolean logTraceEnabled; - - // Proxy - private String proxyClientName; - private String proxyClientDN; - private String proxyRoles; - private String proxyPassword; - - // Java - private Integer threadPoolSize; - private String javaObjectTypeClass; - private Boolean disableDefineColumnType; - private Boolean allowNCharLiteral; - private Boolean createDescriptorUseCurrentSchemaForSchemaName; - - // Statement - private Boolean defaultStatementModeIsNonBlocking; - private Boolean defaultExecuteAsync; - private Boolean streamChunkSize; - - // Oracle-specific - 
private Boolean restrictedList; - private Boolean sqlCl; - private Boolean reportRemarks; - private Boolean getPlSqlErrorFromServerOnProcedureCall; - private Boolean plsqlCompilerWarnings; - private String editionName; - private Boolean internalLogon; - private String connectionClassName; - private Boolean enableScrollableResultSet; - private Boolean enableReadOnlyResultSet; - private Boolean enableCancelQueryOnClose; - - // Sharding - private String shardingKey; - private String superShardingKey; - - // XA - private Boolean xaRecoveryEnabled; - private Boolean xaTightlyCouple; - - // Connection validation - private Boolean checkConnectionOnBorrow; - private String validationSQL; - private Integer validationTimeout; - private Integer secondsToTrustIdleConnection; - private Integer inactivityTimeout; - private Integer abandonedConnectionTimeout; - private Integer timeToLiveConnectionTimeout; - - private final Map<String, String> extraProperties = new HashMap<>(); - - private Builder(String host, int port, String database, String username, String password) { - this.host = host; - this.port = port; - this.database = database; - this.username = username; - this.password = password; - - // OUR DEFAULTS (better than driver defaults) - this.defaultRowPrefetch = 100; // Driver default is 10, which is too low - } - - // ==================== CONNECTION TYPE ==================== - - /** - * Use service name instead of SID. Driver default: false (use SID). - * - * @param serviceName Oracle service name - * @return this builder - */ - public Builder serviceName(String serviceName) { - this.serviceName = serviceName; - return this; - } - - /** - * Explicitly use SID connection format. Driver default: true. - * - * @param useSid true to use SID format - * @return this builder - */ - public Builder useSid(boolean useSid) { - this.useSid = useSid; - return this; - } - - /** - * TNS admin directory for tnsnames.ora. Driver default: null. 
- * - * @param tnsAdmin path to TNS admin directory - * @return this builder - */ - public Builder tnsAdmin(String tnsAdmin) { - this.tnsAdmin = tnsAdmin; - return this; - } - - /** - * TNS alias from tnsnames.ora. Driver default: null. - * - * @param tnsAlias alias name - * @return this builder - */ - public Builder tnsAlias(String tnsAlias) { - this.tnsAlias = tnsAlias; - return this; - } - - // ==================== PERFORMANCE ==================== - - /** - * Default row prefetch size. Driver default: 10. OUR DEFAULT: 100 (better for most use cases). - * - * @param defaultRowPrefetch prefetch size - * @return this builder - */ - public Builder defaultRowPrefetch(int defaultRowPrefetch) { - this.defaultRowPrefetch = defaultRowPrefetch; - return this; - } - - /** - * Default batch size for execute. Driver default: 1. - * - * @param defaultExecuteBatch batch size - * @return this builder - */ - public Builder defaultExecuteBatch(int defaultExecuteBatch) { - this.defaultExecuteBatch = defaultExecuteBatch; - return this; - } - - /** - * Implicit statement cache size. Driver default: 0. - * - * @param implicitStatementCacheSize cache size - * @return this builder - */ - public Builder implicitStatementCacheSize(int implicitStatementCacheSize) { - this.implicitStatementCacheSize = implicitStatementCacheSize; - return this; - } - - /** - * Maximum cached statements. Driver default: 0. - * - * @param maxStatements max statements - * @return this builder - */ - public Builder maxStatements(int maxStatements) { - this.maxStatements = maxStatements; - return this; - } - - /** - * Enable implicit statement caching. Driver default: false. - * - * @param implicitCachingEnabled true to enable - * @return this builder - */ - public Builder implicitCachingEnabled(boolean implicitCachingEnabled) { - this.implicitCachingEnabled = implicitCachingEnabled; - return this; - } - - /** - * Enable explicit statement caching. Driver default: false. 
- * - * @param explicitCachingEnabled true to enable - * @return this builder - */ - public Builder explicitCachingEnabled(boolean explicitCachingEnabled) { - this.explicitCachingEnabled = explicitCachingEnabled; - return this; - } - - /** - * Read timeout in seconds. Driver default: 0 (infinite). - * - * @param readTimeout timeout in seconds - * @return this builder - */ - public Builder readTimeout(int readTimeout) { - this.readTimeout = readTimeout; - return this; - } - - /** - * Connect timeout in seconds. Driver default: 0 (infinite). - * - * @param connectTimeout timeout in seconds - * @return this builder - */ - public Builder connectTimeout(int connectTimeout) { - this.connectTimeout = connectTimeout; - return this; - } - - /** - * Maximum cached buffer size. Driver default: null. - * - * @param maxCachedBufferSize max size - * @return this builder - */ - public Builder maxCachedBufferSize(int maxCachedBufferSize) { - this.maxCachedBufferSize = maxCachedBufferSize; - return this; - } - - /** - * Process JDBC escape sequences. Driver default: true. - * - * @param processEscapes true to process - * @return this builder - */ - public Builder processEscapes(boolean processEscapes) { - this.processEscapes = processEscapes; - return this; - } - - /** - * Statement cache size. Driver default: 0. - * - * @param statementCacheSize cache size - * @return this builder - */ - public Builder statementCacheSize(int statementCacheSize) { - this.statementCacheSize = statementCacheSize; - return this; - } - - // ==================== LOB ==================== - - /** - * Use fetch size with LONG columns. Driver default: false. - * - * @param useFetchSizeWithLongColumn true to use - * @return this builder - */ - public Builder useFetchSizeWithLongColumn(boolean useFetchSizeWithLongColumn) { - this.useFetchSizeWithLongColumn = useFetchSizeWithLongColumn; - return this; - } - - /** - * LOB prefetch size. Driver default: 4000. 
- * - * @param lobPrefetchSize prefetch size - * @return this builder - */ - public Builder lobPrefetchSize(int lobPrefetchSize) { - this.lobPrefetchSize = lobPrefetchSize; - return this; - } - - /** - * Prefetch LOBs with row data. Driver default: false. - * - * @param prefetchLOBs true to prefetch - * @return this builder - */ - public Builder prefetchLOBs(boolean prefetchLOBs) { - this.prefetchLOBs = prefetchLOBs; - return this; - } - - /** - * LOB stream position standard compliance. Driver default: null. - * - * @param lobStreamPosStandard standard mode - * @return this builder - */ - public Builder lobStreamPosStandard(String lobStreamPosStandard) { - this.lobStreamPosStandard = lobStreamPosStandard; - return this; - } - - /** - * Clean up temporary BLOBs. Driver default: true. - * - * @param tempBlobCleanUp true to clean - * @return this builder - */ - public Builder tempBlobCleanUp(boolean tempBlobCleanUp) { - this.tempBlobCleanUp = tempBlobCleanUp; - return this; - } - - /** - * Clean up temporary CLOBs. Driver default: true. - * - * @param tempClobCleanUp true to clean - * @return this builder - */ - public Builder tempClobCleanUp(boolean tempClobCleanUp) { - this.tempClobCleanUp = tempClobCleanUp; - return this; - } - - // ==================== BATCH ==================== - - /** - * Batch size. Driver default: 1. - * - * @param batchSize batch size - * @return this builder - */ - public Builder batchSize(int batchSize) { - this.batchSize = batchSize; - return this; - } - - /** - * Use batch multi-retrieve. Driver default: false. - * - * @param useBatchMultiRetrieve true to enable - * @return this builder - */ - public Builder useBatchMultiRetrieve(boolean useBatchMultiRetrieve) { - this.useBatchMultiRetrieve = useBatchMultiRetrieve; - return this; - } - - /** - * Enable batch updates. Driver default: false. 
- * - * @param enableBatchUpdates true to enable - * @return this builder - */ - public Builder enableBatchUpdates(boolean enableBatchUpdates) { - this.enableBatchUpdates = enableBatchUpdates; - return this; - } - - // ==================== METADATA ==================== - - /** - * Include remarks in metadata. Driver default: false. - * - * @param remarksReporting true to include - * @return this builder - */ - public Builder remarksReporting(boolean remarksReporting) { - this.remarksReporting = remarksReporting; - return this; - } - - /** - * Include synonyms in metadata. Driver default: false. - * - * @param includeSynonyms true to include - * @return this builder - */ - public Builder includeSynonyms(boolean includeSynonyms) { - this.includeSynonyms = includeSynonyms; - return this; - } - - /** - * Restrict getTables metadata. Driver default: false. - * - * @param restrictGetTables true to restrict - * @return this builder - */ - public Builder restrictGetTables(boolean restrictGetTables) { - this.restrictGetTables = restrictGetTables; - return this; - } - - /** - * Accumulate batch results. Driver default: false. - * - * @param accumulateBatchResult true to accumulate - * @return this builder - */ - public Builder accumulateBatchResult(boolean accumulateBatchResult) { - this.accumulateBatchResult = accumulateBatchResult; - return this; - } - - /** - * Default NChar mode. Driver default: null. - * - * @param defaultNChar NChar mode - * @return this builder - */ - public Builder defaultNChar(String defaultNChar) { - this.defaultNChar = defaultNChar; - return this; - } - - // ==================== NETWORK ==================== - - /** - * TCP no delay. Driver default: null. - * - * @param tcpNoDelay 1 to enable - * @return this builder - */ - public Builder tcpNoDelay(int tcpNoDelay) { - this.tcpNoDelay = tcpNoDelay; - return this; - } - - /** - * TCP keepalive. Driver default: null. 
- * - * @param keepAlive 1 to enable - * @return this builder - */ - public Builder keepAlive(int keepAlive) { - this.keepAlive = keepAlive; - return this; - } - - /** - * Send buffer size. Driver default: null. - * - * @param sendBufferSize buffer size - * @return this builder - */ - public Builder sendBufferSize(int sendBufferSize) { - this.sendBufferSize = sendBufferSize; - return this; - } - - /** - * Receive buffer size. Driver default: null. - * - * @param receiveBufferSize buffer size - * @return this builder - */ - public Builder receiveBufferSize(int receiveBufferSize) { - this.receiveBufferSize = receiveBufferSize; - return this; - } - - /** - * Allow thin network PM. Driver default: null. - * - * @param thinNetAllowPM true to allow - * @return this builder - */ - public Builder thinNetAllowPM(boolean thinNetAllowPM) { - this.thinNetAllowPM = thinNetAllowPM; - return this; - } - - /** - * Network protocol. Driver default: tcp. - * - * @param networkProtocol protocol name - * @return this builder - */ - public Builder networkProtocol(String networkProtocol) { - this.networkProtocol = networkProtocol; - return this; - } - - // ==================== SSL/TLS ==================== - - /** - * SSL server certificate DN. Driver default: null. - * - * @param sslServerCertDN certificate DN - * @return this builder - */ - public Builder sslServerCertDN(String sslServerCertDN) { - this.sslServerCertDN = sslServerCertDN; - return this; - } - - /** - * Trust store path. Driver default: null. - * - * @param trustStore path to trust store - * @return this builder - */ - public Builder trustStore(String trustStore) { - this.trustStore = trustStore; - return this; - } - - /** - * Trust store password. Driver default: null. - * - * @param trustStorePassword password - * @return this builder - */ - public Builder trustStorePassword(String trustStorePassword) { - this.trustStorePassword = trustStorePassword; - return this; - } - - /** - * Trust store type. Driver default: JKS. 
- * - * @param trustStoreType store type - * @return this builder - */ - public Builder trustStoreType(String trustStoreType) { - this.trustStoreType = trustStoreType; - return this; - } - - /** - * Key store path. Driver default: null. - * - * @param keyStore path to key store - * @return this builder - */ - public Builder keyStore(String keyStore) { - this.keyStore = keyStore; - return this; - } - - /** - * Key store password. Driver default: null. - * - * @param keyStorePassword password - * @return this builder - */ - public Builder keyStorePassword(String keyStorePassword) { - this.keyStorePassword = keyStorePassword; - return this; - } - - /** - * Key store type. Driver default: JKS. - * - * @param keyStoreType store type - * @return this builder - */ - public Builder keyStoreType(String keyStoreType) { - this.keyStoreType = keyStoreType; - return this; - } - - /** - * Oracle wallet location. Driver default: null. - * - * @param walletLocation path to wallet - * @return this builder - */ - public Builder walletLocation(String walletLocation) { - this.walletLocation = walletLocation; - return this; - } - - /** - * Oracle wallet password. Driver default: null. - * - * @param walletPassword wallet password - * @return this builder - */ - public Builder walletPassword(String walletPassword) { - this.walletPassword = walletPassword; - return this; - } - - /** - * Match SSL server DN. Driver default: false. - * - * @param sslServerDNMatch true to match - * @return this builder - */ - public Builder sslServerDNMatch(boolean sslServerDNMatch) { - this.sslServerDNMatch = sslServerDNMatch; - return this; - } - - // ==================== HIGH AVAILABILITY ==================== - - /** - * Enable Fast Connection Failover (FCF). Driver default: false. 
- * - * @param fastConnectionFailover true to enable - * @return this builder - */ - public Builder fastConnectionFailover(boolean fastConnectionFailover) { - this.fastConnectionFailover = fastConnectionFailover; - return this; - } - - /** - * ONS configuration. Driver default: null. - * - * @param onsConfiguration ONS config - * @return this builder - */ - public Builder onsConfiguration(int onsConfiguration) { - this.onsConfiguration = onsConfiguration; - return this; - } - - /** - * Connection retry count. Driver default: 0. - * - * @param retryCount retry count - * @return this builder - */ - public Builder retryCount(int retryCount) { - this.retryCount = retryCount; - return this; - } - - /** - * Connection retry delay in seconds. Driver default: 0. - * - * @param retryDelay delay in seconds - * @return this builder - */ - public Builder retryDelay(int retryDelay) { - this.retryDelay = retryDelay; - return this; - } - - /** - * Implicit connection timeout. Driver default: false. - * - * @param implicitConnectionTimeout true to enable - * @return this builder - */ - public Builder implicitConnectionTimeout(boolean implicitConnectionTimeout) { - this.implicitConnectionTimeout = implicitConnectionTimeout; - return this; - } - - /** - * Enable FAN. Driver default: null. - * - * @param fanEnabled "true" or "false" - * @return this builder - */ - public Builder fanEnabled(String fanEnabled) { - this.fanEnabled = fanEnabled; - return this; - } - - /** - * ONS wallet location. Driver default: null. - * - * @param onsWalletLocation wallet path - * @return this builder - */ - public Builder onsWalletLocation(String onsWalletLocation) { - this.onsWalletLocation = onsWalletLocation; - return this; - } - - /** - * ONS wallet password. Driver default: null. 
- * - * @param onsWalletPassword wallet password - * @return this builder - */ - public Builder onsWalletPassword(String onsWalletPassword) { - this.onsWalletPassword = onsWalletPassword; - return this; - } - - /** - * ONS nodes. Driver default: null. - * - * @param onsNodes true to enable - * @return this builder - */ - public Builder onsNodes(boolean onsNodes) { - this.onsNodes = onsNodes; - return this; - } - - // ==================== TIMEZONE ==================== - - /** - * Session time zone. Driver default: null. - * - * @param sessionTimeZone timezone ID - * @return this builder - */ - public Builder sessionTimeZone(String sessionTimeZone) { - this.sessionTimeZone = sessionTimeZone; - return this; - } - - /** - * Convert NChar literals. Driver default: false. - * - * @param convertNCharLiterals true to convert - * @return this builder - */ - public Builder convertNCharLiterals(boolean convertNCharLiterals) { - this.convertNCharLiterals = convertNCharLiterals; - return this; - } - - // ==================== TRACING ==================== - - /** - * Enable trace level. Driver default: false. - * - * @param traceLevel true to enable - * @return this builder - */ - public Builder traceLevel(boolean traceLevel) { - this.traceLevel = traceLevel; - return this; - } - - /** - * Trace file path. Driver default: null. - * - * @param traceFile file path - * @return this builder - */ - public Builder traceFile(String traceFile) { - this.traceFile = traceFile; - return this; - } - - /** - * Maximum trace file size. Driver default: null. - * - * @param traceFileSize file size - * @return this builder - */ - public Builder traceFileSize(int traceFileSize) { - this.traceFileSize = traceFileSize; - return this; - } - - /** - * Maximum trace files. Driver default: null. - * - * @param traceMaxFiles max files - * @return this builder - */ - public Builder traceMaxFiles(int traceMaxFiles) { - this.traceMaxFiles = traceMaxFiles; - return this; - } - - /** - * Enable log tracing. 
Driver default: false. - * - * @param logTraceEnabled true to enable - * @return this builder - */ - public Builder logTraceEnabled(boolean logTraceEnabled) { - this.logTraceEnabled = logTraceEnabled; - return this; - } - - // ==================== PROXY ==================== - - /** - * Proxy client name. Driver default: null. - * - * @param proxyClientName client name - * @return this builder - */ - public Builder proxyClientName(String proxyClientName) { - this.proxyClientName = proxyClientName; - return this; - } - - /** - * Proxy client DN. Driver default: null. - * - * @param proxyClientDN client DN - * @return this builder - */ - public Builder proxyClientDN(String proxyClientDN) { - this.proxyClientDN = proxyClientDN; - return this; - } - - /** - * Proxy roles. Driver default: null. - * - * @param proxyRoles comma-separated roles - * @return this builder - */ - public Builder proxyRoles(String proxyRoles) { - this.proxyRoles = proxyRoles; - return this; - } - - /** - * Proxy password. Driver default: null. - * - * @param proxyPassword password - * @return this builder - */ - public Builder proxyPassword(String proxyPassword) { - this.proxyPassword = proxyPassword; - return this; - } - - // ==================== JAVA ==================== - - /** - * Thread pool size. Driver default: null. - * - * @param threadPoolSize pool size - * @return this builder - */ - public Builder threadPoolSize(int threadPoolSize) { - this.threadPoolSize = threadPoolSize; - return this; - } - - /** - * Java object type class. Driver default: null. - * - * @param javaObjectTypeClass class name - * @return this builder - */ - public Builder javaObjectTypeClass(String javaObjectTypeClass) { - this.javaObjectTypeClass = javaObjectTypeClass; - return this; - } - - /** - * Disable define column type. Driver default: false. 
- * - * @param disableDefineColumnType true to disable - * @return this builder - */ - public Builder disableDefineColumnType(boolean disableDefineColumnType) { - this.disableDefineColumnType = disableDefineColumnType; - return this; - } - - /** - * Allow NChar literal. Driver default: false. - * - * @param allowNCharLiteral true to allow - * @return this builder - */ - public Builder allowNCharLiteral(boolean allowNCharLiteral) { - this.allowNCharLiteral = allowNCharLiteral; - return this; - } - - /** - * Create descriptor using current schema. Driver default: false. - * - * @param createDescriptorUseCurrentSchemaForSchemaName true to use - * @return this builder - */ - public Builder createDescriptorUseCurrentSchemaForSchemaName( - boolean createDescriptorUseCurrentSchemaForSchemaName) { - this.createDescriptorUseCurrentSchemaForSchemaName = - createDescriptorUseCurrentSchemaForSchemaName; - return this; - } - - // ==================== STATEMENT ==================== - - /** - * Default statement mode is non-blocking. Driver default: false. - * - * @param defaultStatementModeIsNonBlocking true for non-blocking - * @return this builder - */ - public Builder defaultStatementModeIsNonBlocking(boolean defaultStatementModeIsNonBlocking) { - this.defaultStatementModeIsNonBlocking = defaultStatementModeIsNonBlocking; - return this; - } - - /** - * Default execute async. Driver default: false. - * - * @param defaultExecuteAsync true for async - * @return this builder - */ - public Builder defaultExecuteAsync(boolean defaultExecuteAsync) { - this.defaultExecuteAsync = defaultExecuteAsync; - return this; - } - - /** - * Stream chunk size. Driver default: false. - * - * @param streamChunkSize true to enable - * @return this builder - */ - public Builder streamChunkSize(boolean streamChunkSize) { - this.streamChunkSize = streamChunkSize; - return this; - } - - // ==================== ORACLE-SPECIFIC ==================== - - /** - * Restricted list. Driver default: false. 
- * - * @param restrictedList true to restrict - * @return this builder - */ - public Builder restrictedList(boolean restrictedList) { - this.restrictedList = restrictedList; - return this; - } - - /** - * SQL*CL mode. Driver default: false. - * - * @param sqlCl true for SQL*CL mode - * @return this builder - */ - public Builder sqlCl(boolean sqlCl) { - this.sqlCl = sqlCl; - return this; - } - - /** - * Report remarks. Driver default: false. - * - * @param reportRemarks true to report - * @return this builder - */ - public Builder reportRemarks(boolean reportRemarks) { - this.reportRemarks = reportRemarks; - return this; - } - - /** - * Get PL/SQL errors from server. Driver default: false. - * - * @param getPlSqlErrorFromServerOnProcedureCall true to get errors - * @return this builder - */ - public Builder getPlSqlErrorFromServerOnProcedureCall( - boolean getPlSqlErrorFromServerOnProcedureCall) { - this.getPlSqlErrorFromServerOnProcedureCall = getPlSqlErrorFromServerOnProcedureCall; - return this; - } - - /** - * PL/SQL compiler warnings. Driver default: false. - * - * @param plsqlCompilerWarnings true to enable - * @return this builder - */ - public Builder plsqlCompilerWarnings(boolean plsqlCompilerWarnings) { - this.plsqlCompilerWarnings = plsqlCompilerWarnings; - return this; - } - - /** - * Edition name. Driver default: null. - * - * @param editionName edition - * @return this builder - */ - public Builder editionName(String editionName) { - this.editionName = editionName; - return this; - } - - /** - * Internal logon (SYSDBA/SYSOPER). Driver default: false. - * - * @param internalLogon true for internal - * @return this builder - */ - public Builder internalLogon(boolean internalLogon) { - this.internalLogon = internalLogon; - return this; - } - - /** - * Connection class name. Driver default: null. 
- * - * @param connectionClassName class name - * @return this builder - */ - public Builder connectionClassName(String connectionClassName) { - this.connectionClassName = connectionClassName; - return this; - } - - /** - * Enable scrollable result sets. Driver default: true. - * - * @param enableScrollableResultSet true to enable - * @return this builder - */ - public Builder enableScrollableResultSet(boolean enableScrollableResultSet) { - this.enableScrollableResultSet = enableScrollableResultSet; - return this; - } - - /** - * Enable read-only result sets. Driver default: true. - * - * @param enableReadOnlyResultSet true to enable - * @return this builder - */ - public Builder enableReadOnlyResultSet(boolean enableReadOnlyResultSet) { - this.enableReadOnlyResultSet = enableReadOnlyResultSet; - return this; - } - - /** - * Cancel query on close. Driver default: true. - * - * @param enableCancelQueryOnClose true to enable - * @return this builder - */ - public Builder enableCancelQueryOnClose(boolean enableCancelQueryOnClose) { - this.enableCancelQueryOnClose = enableCancelQueryOnClose; - return this; - } - - // ==================== SHARDING ==================== - - /** - * Sharding key. Driver default: null. - * - * @param shardingKey sharding key value - * @return this builder - */ - public Builder shardingKey(String shardingKey) { - this.shardingKey = shardingKey; - return this; - } - - /** - * Super sharding key. Driver default: null. - * - * @param superShardingKey super sharding key value - * @return this builder - */ - public Builder superShardingKey(String superShardingKey) { - this.superShardingKey = superShardingKey; - return this; - } - - // ==================== XA ==================== - - /** - * Enable XA recovery. Driver default: false. 
- * - * @param xaRecoveryEnabled true to enable - * @return this builder - */ - public Builder xaRecoveryEnabled(boolean xaRecoveryEnabled) { - this.xaRecoveryEnabled = xaRecoveryEnabled; - return this; - } - - /** - * XA tightly coupled. Driver default: false. - * - * @param xaTightlyCouple true for tight coupling - * @return this builder - */ - public Builder xaTightlyCouple(boolean xaTightlyCouple) { - this.xaTightlyCouple = xaTightlyCouple; - return this; - } - - // ==================== CONNECTION VALIDATION ==================== - - /** - * Check connection on borrow. Driver default: false. - * - * @param checkConnectionOnBorrow true to check - * @return this builder - */ - public Builder checkConnectionOnBorrow(boolean checkConnectionOnBorrow) { - this.checkConnectionOnBorrow = checkConnectionOnBorrow; - return this; - } - - /** - * Validation SQL. Driver default: null. - * - * @param validationSQL SQL to execute - * @return this builder - */ - public Builder validationSQL(String validationSQL) { - this.validationSQL = validationSQL; - return this; - } - - /** - * Validation timeout in seconds. Driver default: null. - * - * @param validationTimeout timeout - * @return this builder - */ - public Builder validationTimeout(int validationTimeout) { - this.validationTimeout = validationTimeout; - return this; - } - - /** - * Seconds to trust idle connection. Driver default: 0. - * - * @param secondsToTrustIdleConnection seconds - * @return this builder - */ - public Builder secondsToTrustIdleConnection(int secondsToTrustIdleConnection) { - this.secondsToTrustIdleConnection = secondsToTrustIdleConnection; - return this; - } - - /** - * Inactivity timeout in seconds. Driver default: 0. - * - * @param inactivityTimeout timeout - * @return this builder - */ - public Builder inactivityTimeout(int inactivityTimeout) { - this.inactivityTimeout = inactivityTimeout; - return this; - } - - /** - * Abandoned connection timeout in seconds. Driver default: 0. 
- * - * @param abandonedConnectionTimeout timeout - * @return this builder - */ - public Builder abandonedConnectionTimeout(int abandonedConnectionTimeout) { - this.abandonedConnectionTimeout = abandonedConnectionTimeout; - return this; - } - - /** - * Time to live connection timeout in seconds. Driver default: 0. - * - * @param timeToLiveConnectionTimeout timeout - * @return this builder - */ - public Builder timeToLiveConnectionTimeout(int timeToLiveConnectionTimeout) { - this.timeToLiveConnectionTimeout = timeToLiveConnectionTimeout; - return this; - } - - /** - * Set an arbitrary driver property. - * - * @param key property name - * @param value property value - * @return this builder - */ - public Builder property(String key, String value) { - this.extraProperties.put(key, value); - return this; - } - - /** - * Build the OracleConfig. - * - * @return immutable OracleConfig - */ - public OracleConfig build() { - return new OracleConfig(this); - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgAutosave.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgAutosave.java deleted file mode 100644 index eb310a0e47..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgAutosave.java +++ /dev/null @@ -1,21 +0,0 @@ -package dev.typr.foundations.connect.postgres; - -/** PostgreSQL autosave mode for savepoint handling. */ -public enum PgAutosave { - /** Do not use savepoints (default). Errors abort the entire transaction. */ - NEVER("never"), - /** Create a savepoint before each statement. Rollback to savepoint on error. */ - ALWAYS("always"), - /** Create savepoints only when PostgreSQL would fail the entire transaction. 
*/ - CONSERVATIVE("conservative"); - - private final String value; - - PgAutosave(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgChannelBinding.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgChannelBinding.java deleted file mode 100644 index adefb3a450..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgChannelBinding.java +++ /dev/null @@ -1,21 +0,0 @@ -package dev.typr.foundations.connect.postgres; - -/** PostgreSQL channel binding mode for SCRAM authentication. */ -public enum PgChannelBinding { - /** Do not use channel binding. */ - DISABLE("disable"), - /** Use channel binding if server supports it (default). */ - PREFER("prefer"), - /** Require channel binding (fail if not available). */ - REQUIRE("require"); - - private final String value; - - PgChannelBinding(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgEscapeSyntaxCallMode.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgEscapeSyntaxCallMode.java deleted file mode 100644 index 0723abb2cf..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgEscapeSyntaxCallMode.java +++ /dev/null @@ -1,21 +0,0 @@ -package dev.typr.foundations.connect.postgres; - -/** PostgreSQL escape syntax call mode for JDBC escape function calls. */ -public enum PgEscapeSyntaxCallMode { - /** Use SELECT for all escape calls (default). */ - SELECT("select"), - /** Use CALL only when function returns void. */ - CALL_IF_NO_RETURN("callIfNoReturn"), - /** Always use CALL syntax. 
*/ - CALL("call"); - - private final String value; - - PgEscapeSyntaxCallMode(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgGssEncMode.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgGssEncMode.java deleted file mode 100644 index 7be696bd1a..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgGssEncMode.java +++ /dev/null @@ -1,21 +0,0 @@ -package dev.typr.foundations.connect.postgres; - -/** PostgreSQL GSS encryption mode for Kerberos connections. */ -public enum PgGssEncMode { - /** Do not use GSS encryption. */ - DISABLE("disable"), - /** Use GSS encryption if server supports it (default). */ - PREFER("prefer"), - /** Require GSS encryption (fail if not available). */ - REQUIRE("require"); - - private final String value; - - PgGssEncMode(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgGssLib.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgGssLib.java deleted file mode 100644 index 3c097dd0d1..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgGssLib.java +++ /dev/null @@ -1,21 +0,0 @@ -package dev.typr.foundations.connect.postgres; - -/** PostgreSQL GSS library selection for Kerberos authentication. */ -public enum PgGssLib { - /** Auto-detect GSS library (default). */ - AUTO("auto"), - /** Force native GSSAPI library. */ - GSSAPI("gssapi"), - /** Force Windows SSPI library. 
*/ - SSPI("sspi"); - - private final String value; - - PgGssLib(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgQueryMode.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgQueryMode.java deleted file mode 100644 index 54d40b3fb6..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgQueryMode.java +++ /dev/null @@ -1,23 +0,0 @@ -package dev.typr.foundations.connect.postgres; - -/** PostgreSQL query execution mode. */ -public enum PgQueryMode { - /** Use simple protocol for all queries (no parameter binding, no prepared statements). */ - SIMPLE("simple"), - /** Use extended protocol with parameter binding (default). */ - EXTENDED("extended"), - /** Use extended protocol only for prepared statements. */ - EXTENDED_FOR_PREPARED("extendedForPrepared"), - /** Cache all statements (use with care - can lead to high memory usage). */ - EXTENDED_CACHE_EVERYTHING("extendedCacheEverything"); - - private final String value; - - PgQueryMode(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgReadOnlyMode.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgReadOnlyMode.java deleted file mode 100644 index 0cc2d8621b..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgReadOnlyMode.java +++ /dev/null @@ -1,21 +0,0 @@ -package dev.typr.foundations.connect.postgres; - -/** PostgreSQL read-only mode behavior. */ -public enum PgReadOnlyMode { - /** Ignore read-only setting (do not send SET TRANSACTION). */ - IGNORE("ignore"), - /** Use SET TRANSACTION READ ONLY at transaction start (default). */ - TRANSACTION("transaction"), - /** Use SET SESSION CHARACTERISTICS AS TRANSACTION READ ONLY on connect. 
*/ - ALWAYS("always"); - - private final String value; - - PgReadOnlyMode(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgReplication.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgReplication.java deleted file mode 100644 index 0f064fa163..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgReplication.java +++ /dev/null @@ -1,21 +0,0 @@ -package dev.typr.foundations.connect.postgres; - -/** PostgreSQL replication mode for logical/physical replication connections. */ -public enum PgReplication { - /** Normal connection, not a replication connection (default). */ - FALSE("false"), - /** Physical replication connection. */ - TRUE("true"), - /** Logical replication connection (database name required). */ - DATABASE("database"); - - private final String value; - - PgReplication(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgSslMode.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgSslMode.java deleted file mode 100644 index 17d046eb75..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgSslMode.java +++ /dev/null @@ -1,29 +0,0 @@ -package dev.typr.foundations.connect.postgres; - -/** PostgreSQL SSL mode for connection security. */ -public enum PgSslMode { - /** Only try a non-SSL connection. */ - DISABLE("disable"), - /** First try a non-SSL connection; if that fails, try an SSL connection. */ - ALLOW("allow"), - /** First try an SSL connection; if that fails, try a non-SSL connection. (Default) */ - PREFER("prefer"), - /** Only try an SSL connection. If a root CA file is present, verify the certificate. 
*/ - REQUIRE("require"), - /** - * Only try an SSL connection, and verify that the server certificate is issued by a trusted CA. - */ - VERIFY_CA("verify-ca"), - /** Only try an SSL connection, verify CA and that the server hostname matches the certificate. */ - VERIFY_FULL("verify-full"); - - private final String value; - - PgSslMode(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgSslNegotiation.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgSslNegotiation.java deleted file mode 100644 index 7e9a250b6f..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgSslNegotiation.java +++ /dev/null @@ -1,19 +0,0 @@ -package dev.typr.foundations.connect.postgres; - -/** PostgreSQL SSL negotiation mode. */ -public enum PgSslNegotiation { - /** Use PostgreSQL's SSL negotiation protocol (default). */ - POSTGRES("postgres"), - /** Initiate SSL directly without PostgreSQL negotiation (for proxies that require it). */ - DIRECT("direct"); - - private final String value; - - PgSslNegotiation(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgTargetServerType.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgTargetServerType.java deleted file mode 100644 index 4d59d29352..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PgTargetServerType.java +++ /dev/null @@ -1,29 +0,0 @@ -package dev.typr.foundations.connect.postgres; - -/** PostgreSQL target server type for connection routing in multi-server setups. */ -public enum PgTargetServerType { - /** Connect to any server (default). */ - ANY("any"), - /** Connect only to a primary/master server (read-write). */ - PRIMARY("primary"), - /** Alias for PRIMARY (deprecated). 
*/ - MASTER("master"), - /** Connect only to a secondary/slave server (read-only). */ - SECONDARY("secondary"), - /** Alias for SECONDARY (deprecated). */ - SLAVE("slave"), - /** Prefer secondary, fall back to primary if unavailable. */ - PREFER_SECONDARY("preferSecondary"), - /** Alias for PREFER_SECONDARY (deprecated). */ - PREFER_SLAVE("preferSlave"); - - private final String value; - - PgTargetServerType(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PostgresConfig.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PostgresConfig.java deleted file mode 100644 index 4d481ef863..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/postgres/PostgresConfig.java +++ /dev/null @@ -1,1286 +0,0 @@ -package dev.typr.foundations.connect.postgres; - -import dev.typr.foundations.connect.DatabaseConfig; -import dev.typr.foundations.connect.DatabaseKind; -import java.util.HashMap; -import java.util.Map; - -/** - * PostgreSQL database configuration with typed builder methods for all documented JDBC driver - * properties. - * - *

Properties are based on the PostgreSQL JDBC driver documentation. Default values are the - * driver defaults unless noted otherwise with "OUR DEFAULT" in the builder method documentation. - * - * @see PostgreSQL JDBC Documentation - */ -public final class PostgresConfig implements DatabaseConfig { - - private final String host; - private final int port; - private final String database; - private final String username; - private final String password; - - // SSL/TLS properties - private final Boolean ssl; - private final String sslfactory; - private final PgSslMode sslmode; - private final PgSslNegotiation sslNegotiation; - private final String sslcert; - private final String sslkey; - private final String sslrootcert; - private final String sslhostnameverifier; - private final String sslpasswordcallback; - private final String sslpassword; - private final Integer sslResponseTimeout; - - // Performance properties - private final Boolean reWriteBatchedInserts; - private final Boolean binaryTransfer; - private final String binaryTransferEnable; - private final String binaryTransferDisable; - private final Integer prepareThreshold; - private final Integer preparedStatementCacheQueries; - private final Integer preparedStatementCacheSizeMiB; - private final PgQueryMode preferQueryMode; - private final Integer defaultRowFetchSize; - private final Integer databaseMetadataCacheFields; - private final Integer databaseMetadataCacheFieldsMiB; - private final Boolean adaptiveFetch; - private final Integer adaptiveFetchMinimum; - private final Integer adaptiveFetchMaximum; - - // Timeout properties - private final Integer loginTimeout; - private final Integer connectTimeout; - private final Integer socketTimeout; - private final Integer cancelSignalTimeout; - - // Network properties - private final Boolean tcpKeepAlive; - private final Boolean tcpNoDelay; - private final Integer sendBufferSize; - private final Integer receiveBufferSize; - private final Integer maxSendBufferSize; 
- - // Kerberos/GSSAPI properties - private final PgGssLib gsslib; - private final String kerberosServerName; - private final String jaasApplicationName; - private final Boolean jaasLogin; - private final Boolean gssUseDefaultCreds; - private final PgGssEncMode gssEncMode; - private final Integer gssResponseTimeout; - private final String sspiServiceClass; - private final Boolean useSpnego; - private final PgChannelBinding channelBinding; - - // Behavior properties - private final Boolean allowEncodingChanges; - private final Boolean logUnclosedConnections; - private final PgAutosave autosave; - private final Boolean cleanupSavepoints; - private final String stringtype; - private final String applicationName; - private final String currentSchema; - private final Boolean readOnly; - private final PgReadOnlyMode readOnlyMode; - private final Boolean disableColumnSanitiser; - private final String assumeMinServerVersion; - private final Integer unknownLength; - private final Boolean logServerErrorDetail; - private final Boolean quoteReturningIdentifiers; - private final Boolean hideUnprivilegedExceptions; - private final String options; - - // Replication properties - private final PgReplication replication; - private final PgTargetServerType targetServerType; - private final Integer hostRecheckSeconds; - private final Boolean loadBalanceHosts; - - // Socket factory properties - private final String socketFactory; - private final String socketFactoryArg; - - // Result handling properties - private final String maxResultBuffer; - private final PgEscapeSyntaxCallMode escapeSyntaxCallMode; - - // Auth plugin property - private final String authenticationPluginClassName; - - // Escape hatch for undocumented/future properties - private final Map<String, String> extraProperties; - - private PostgresConfig(Builder b) { - this.host = b.host; - this.port = b.port; - this.database = b.database; - this.username = b.username; - this.password = b.password; - - // SSL/TLS - this.ssl = b.ssl; -
this.sslfactory = b.sslfactory; - this.sslmode = b.sslmode; - this.sslNegotiation = b.sslNegotiation; - this.sslcert = b.sslcert; - this.sslkey = b.sslkey; - this.sslrootcert = b.sslrootcert; - this.sslhostnameverifier = b.sslhostnameverifier; - this.sslpasswordcallback = b.sslpasswordcallback; - this.sslpassword = b.sslpassword; - this.sslResponseTimeout = b.sslResponseTimeout; - - // Performance - this.reWriteBatchedInserts = b.reWriteBatchedInserts; - this.binaryTransfer = b.binaryTransfer; - this.binaryTransferEnable = b.binaryTransferEnable; - this.binaryTransferDisable = b.binaryTransferDisable; - this.prepareThreshold = b.prepareThreshold; - this.preparedStatementCacheQueries = b.preparedStatementCacheQueries; - this.preparedStatementCacheSizeMiB = b.preparedStatementCacheSizeMiB; - this.preferQueryMode = b.preferQueryMode; - this.defaultRowFetchSize = b.defaultRowFetchSize; - this.databaseMetadataCacheFields = b.databaseMetadataCacheFields; - this.databaseMetadataCacheFieldsMiB = b.databaseMetadataCacheFieldsMiB; - this.adaptiveFetch = b.adaptiveFetch; - this.adaptiveFetchMinimum = b.adaptiveFetchMinimum; - this.adaptiveFetchMaximum = b.adaptiveFetchMaximum; - - // Timeouts - this.loginTimeout = b.loginTimeout; - this.connectTimeout = b.connectTimeout; - this.socketTimeout = b.socketTimeout; - this.cancelSignalTimeout = b.cancelSignalTimeout; - - // Network - this.tcpKeepAlive = b.tcpKeepAlive; - this.tcpNoDelay = b.tcpNoDelay; - this.sendBufferSize = b.sendBufferSize; - this.receiveBufferSize = b.receiveBufferSize; - this.maxSendBufferSize = b.maxSendBufferSize; - - // Kerberos/GSSAPI - this.gsslib = b.gsslib; - this.kerberosServerName = b.kerberosServerName; - this.jaasApplicationName = b.jaasApplicationName; - this.jaasLogin = b.jaasLogin; - this.gssUseDefaultCreds = b.gssUseDefaultCreds; - this.gssEncMode = b.gssEncMode; - this.gssResponseTimeout = b.gssResponseTimeout; - this.sspiServiceClass = b.sspiServiceClass; - this.useSpnego = b.useSpnego; - 
this.channelBinding = b.channelBinding; - - // Behavior - this.allowEncodingChanges = b.allowEncodingChanges; - this.logUnclosedConnections = b.logUnclosedConnections; - this.autosave = b.autosave; - this.cleanupSavepoints = b.cleanupSavepoints; - this.stringtype = b.stringtype; - this.applicationName = b.applicationName; - this.currentSchema = b.currentSchema; - this.readOnly = b.readOnly; - this.readOnlyMode = b.readOnlyMode; - this.disableColumnSanitiser = b.disableColumnSanitiser; - this.assumeMinServerVersion = b.assumeMinServerVersion; - this.unknownLength = b.unknownLength; - this.logServerErrorDetail = b.logServerErrorDetail; - this.quoteReturningIdentifiers = b.quoteReturningIdentifiers; - this.hideUnprivilegedExceptions = b.hideUnprivilegedExceptions; - this.options = b.options; - - // Replication - this.replication = b.replication; - this.targetServerType = b.targetServerType; - this.hostRecheckSeconds = b.hostRecheckSeconds; - this.loadBalanceHosts = b.loadBalanceHosts; - - // Socket factory - this.socketFactory = b.socketFactory; - this.socketFactoryArg = b.socketFactoryArg; - - // Result handling - this.maxResultBuffer = b.maxResultBuffer; - this.escapeSyntaxCallMode = b.escapeSyntaxCallMode; - - // Auth plugin - this.authenticationPluginClassName = b.authenticationPluginClassName; - - // Extra properties - this.extraProperties = Map.copyOf(b.extraProperties); - } - - /** - * Create a new builder with required connection parameters. 
-   *
-   * @param host database server hostname
-   * @param port database server port (typically 5432)
-   * @param database database name
-   * @param username username for authentication
-   * @param password password for authentication
-   * @return a new builder
-   */
-  public static Builder builder(
-      String host, int port, String database, String username, String password) {
-    return new Builder(host, port, database, username, password);
-  }
-
-  @Override
-  public String jdbcUrl() {
-    return "jdbc:postgresql://" + host + ":" + port + "/" + database;
-  }
-
-  @Override
-  public String username() {
-    return username;
-  }
-
-  @Override
-  public String password() {
-    return password;
-  }
-
-  @Override
-  public DatabaseKind kind() {
-    return DatabaseKind.POSTGRESQL;
-  }
-
-  @Override
-  public Map<String, String> driverProperties() {
-    Map<String, String> props = new HashMap<>();
-
-    // SSL/TLS
-    if (ssl != null) props.put("ssl", ssl.toString());
-    if (sslfactory != null) props.put("sslfactory", sslfactory);
-    if (sslmode != null) props.put("sslmode", sslmode.value());
-    if (sslNegotiation != null) props.put("sslNegotiation", sslNegotiation.value());
-    if (sslcert != null) props.put("sslcert", sslcert);
-    if (sslkey != null) props.put("sslkey", sslkey);
-    if (sslrootcert != null) props.put("sslrootcert", sslrootcert);
-    if (sslhostnameverifier != null) props.put("sslhostnameverifier", sslhostnameverifier);
-    if (sslpasswordcallback != null) props.put("sslpasswordcallback", sslpasswordcallback);
-    if (sslpassword != null) props.put("sslpassword", sslpassword);
-    if (sslResponseTimeout != null) props.put("sslResponseTimeout", sslResponseTimeout.toString());
-
-    // Performance
-    if (reWriteBatchedInserts != null)
-      props.put("reWriteBatchedInserts", reWriteBatchedInserts.toString());
-    if (binaryTransfer != null) props.put("binaryTransfer", binaryTransfer.toString());
-    if (binaryTransferEnable != null) props.put("binaryTransferEnable", binaryTransferEnable);
-    if (binaryTransferDisable != null)
-      props.put("binaryTransferDisable", binaryTransferDisable);
-    if (prepareThreshold != null) props.put("prepareThreshold", prepareThreshold.toString());
-    if (preparedStatementCacheQueries != null)
-      props.put("preparedStatementCacheQueries", preparedStatementCacheQueries.toString());
-    if (preparedStatementCacheSizeMiB != null)
-      props.put("preparedStatementCacheSizeMiB", preparedStatementCacheSizeMiB.toString());
-    if (preferQueryMode != null) props.put("preferQueryMode", preferQueryMode.value());
-    if (defaultRowFetchSize != null)
-      props.put("defaultRowFetchSize", defaultRowFetchSize.toString());
-    if (databaseMetadataCacheFields != null)
-      props.put("databaseMetadataCacheFields", databaseMetadataCacheFields.toString());
-    if (databaseMetadataCacheFieldsMiB != null)
-      props.put("databaseMetadataCacheFieldsMiB", databaseMetadataCacheFieldsMiB.toString());
-    if (adaptiveFetch != null) props.put("adaptiveFetch", adaptiveFetch.toString());
-    if (adaptiveFetchMinimum != null)
-      props.put("adaptiveFetchMinimum", adaptiveFetchMinimum.toString());
-    if (adaptiveFetchMaximum != null)
-      props.put("adaptiveFetchMaximum", adaptiveFetchMaximum.toString());
-
-    // Timeouts
-    if (loginTimeout != null) props.put("loginTimeout", loginTimeout.toString());
-    if (connectTimeout != null) props.put("connectTimeout", connectTimeout.toString());
-    if (socketTimeout != null) props.put("socketTimeout", socketTimeout.toString());
-    if (cancelSignalTimeout != null)
-      props.put("cancelSignalTimeout", cancelSignalTimeout.toString());
-
-    // Network
-    if (tcpKeepAlive != null) props.put("tcpKeepAlive", tcpKeepAlive.toString());
-    if (tcpNoDelay != null) props.put("tcpNoDelay", tcpNoDelay.toString());
-    if (sendBufferSize != null) props.put("sendBufferSize", sendBufferSize.toString());
-    if (receiveBufferSize != null) props.put("receiveBufferSize", receiveBufferSize.toString());
-    if (maxSendBufferSize != null) props.put("maxSendBufferSize", maxSendBufferSize.toString());
-
-    // Kerberos/GSSAPI
-    if (gsslib != null) props.put("gsslib", gsslib.value());
-    if (kerberosServerName != null) props.put("kerberosServerName", kerberosServerName);
-    if (jaasApplicationName != null) props.put("jaasApplicationName", jaasApplicationName);
-    if (jaasLogin != null) props.put("jaasLogin", jaasLogin.toString());
-    if (gssUseDefaultCreds != null) props.put("gssUseDefaultCreds", gssUseDefaultCreds.toString());
-    if (gssEncMode != null) props.put("gssEncMode", gssEncMode.value());
-    if (gssResponseTimeout != null) props.put("gssResponseTimeout", gssResponseTimeout.toString());
-    if (sspiServiceClass != null) props.put("sspiServiceClass", sspiServiceClass);
-    if (useSpnego != null) props.put("useSpnego", useSpnego.toString());
-    if (channelBinding != null) props.put("channelBinding", channelBinding.value());
-
-    // Behavior
-    if (allowEncodingChanges != null)
-      props.put("allowEncodingChanges", allowEncodingChanges.toString());
-    if (logUnclosedConnections != null)
-      props.put("logUnclosedConnections", logUnclosedConnections.toString());
-    if (autosave != null) props.put("autosave", autosave.value());
-    if (cleanupSavepoints != null) props.put("cleanupSavepoints", cleanupSavepoints.toString());
-    if (stringtype != null) props.put("stringtype", stringtype);
-    if (applicationName != null) props.put("ApplicationName", applicationName);
-    if (currentSchema != null) props.put("currentSchema", currentSchema);
-    if (readOnly != null) props.put("readOnly", readOnly.toString());
-    if (readOnlyMode != null) props.put("readOnlyMode", readOnlyMode.value());
-    if (disableColumnSanitiser != null)
-      props.put("disableColumnSanitiser", disableColumnSanitiser.toString());
-    if (assumeMinServerVersion != null) props.put("assumeMinServerVersion", assumeMinServerVersion);
-    if (unknownLength != null) props.put("unknownLength", unknownLength.toString());
-    if (logServerErrorDetail != null)
-      props.put("logServerErrorDetail", logServerErrorDetail.toString());
-    if (quoteReturningIdentifiers != null)
-      props.put("quoteReturningIdentifiers", quoteReturningIdentifiers.toString());
-    if (hideUnprivilegedExceptions != null)
-      props.put("hideUnprivilegedExceptions", hideUnprivilegedExceptions.toString());
-    if (options != null) props.put("options", options);
-
-    // Replication
-    if (replication != null) props.put("replication", replication.value());
-    if (targetServerType != null) props.put("targetServerType", targetServerType.value());
-    if (hostRecheckSeconds != null) props.put("hostRecheckSeconds", hostRecheckSeconds.toString());
-    if (loadBalanceHosts != null) props.put("loadBalanceHosts", loadBalanceHosts.toString());
-
-    // Socket factory
-    if (socketFactory != null) props.put("socketFactory", socketFactory);
-    if (socketFactoryArg != null) props.put("socketFactoryArg", socketFactoryArg);
-
-    // Result handling
-    if (maxResultBuffer != null) props.put("maxResultBuffer", maxResultBuffer);
-    if (escapeSyntaxCallMode != null)
-      props.put("escapeSyntaxCallMode", escapeSyntaxCallMode.value());
-
-    // Auth plugin
-    if (authenticationPluginClassName != null)
-      props.put("authenticationPluginClassName", authenticationPluginClassName);
-
-    // Extra properties
-    props.putAll(extraProperties);
-
-    return props;
-  }
-
-  /** Builder for PostgresConfig with typed methods for all JDBC driver properties.
-   */
-  public static final class Builder {
-    // Required
-    private final String host;
-    private final int port;
-    private final String database;
-    private final String username;
-    private final String password;
-
-    // SSL/TLS
-    private Boolean ssl;
-    private String sslfactory;
-    private PgSslMode sslmode;
-    private PgSslNegotiation sslNegotiation;
-    private String sslcert;
-    private String sslkey;
-    private String sslrootcert;
-    private String sslhostnameverifier;
-    private String sslpasswordcallback;
-    private String sslpassword;
-    private Integer sslResponseTimeout;
-
-    // Performance
-    private Boolean reWriteBatchedInserts;
-    private Boolean binaryTransfer;
-    private String binaryTransferEnable;
-    private String binaryTransferDisable;
-    private Integer prepareThreshold;
-    private Integer preparedStatementCacheQueries;
-    private Integer preparedStatementCacheSizeMiB;
-    private PgQueryMode preferQueryMode;
-    private Integer defaultRowFetchSize;
-    private Integer databaseMetadataCacheFields;
-    private Integer databaseMetadataCacheFieldsMiB;
-    private Boolean adaptiveFetch;
-    private Integer adaptiveFetchMinimum;
-    private Integer adaptiveFetchMaximum;
-
-    // Timeouts
-    private Integer loginTimeout;
-    private Integer connectTimeout;
-    private Integer socketTimeout;
-    private Integer cancelSignalTimeout;
-
-    // Network
-    private Boolean tcpKeepAlive;
-    private Boolean tcpNoDelay;
-    private Integer sendBufferSize;
-    private Integer receiveBufferSize;
-    private Integer maxSendBufferSize;
-
-    // Kerberos/GSSAPI
-    private PgGssLib gsslib;
-    private String kerberosServerName;
-    private String jaasApplicationName;
-    private Boolean jaasLogin;
-    private Boolean gssUseDefaultCreds;
-    private PgGssEncMode gssEncMode;
-    private Integer gssResponseTimeout;
-    private String sspiServiceClass;
-    private Boolean useSpnego;
-    private PgChannelBinding channelBinding;
-
-    // Behavior
-    private Boolean allowEncodingChanges;
-    private Boolean logUnclosedConnections;
-    private PgAutosave autosave;
-    private Boolean cleanupSavepoints;
-    private String stringtype;
-    private String applicationName;
-    private String currentSchema;
-    private Boolean readOnly;
-    private PgReadOnlyMode readOnlyMode;
-    private Boolean disableColumnSanitiser;
-    private String assumeMinServerVersion;
-    private Integer unknownLength;
-    private Boolean logServerErrorDetail;
-    private Boolean quoteReturningIdentifiers;
-    private Boolean hideUnprivilegedExceptions;
-    private String options;
-
-    // Replication
-    private PgReplication replication;
-    private PgTargetServerType targetServerType;
-    private Integer hostRecheckSeconds;
-    private Boolean loadBalanceHosts;
-
-    // Socket factory
-    private String socketFactory;
-    private String socketFactoryArg;
-
-    // Result handling
-    private String maxResultBuffer;
-    private PgEscapeSyntaxCallMode escapeSyntaxCallMode;
-
-    // Auth plugin
-    private String authenticationPluginClassName;
-
-    // Escape hatch
-    private final Map<String, String> extraProperties = new HashMap<>();
-
-    private Builder(String host, int port, String database, String username, String password) {
-      this.host = host;
-      this.port = port;
-      this.database = database;
-      this.username = username;
-      this.password = password;
-
-      // OUR DEFAULTS (better than driver defaults)
-      this.reWriteBatchedInserts =
-          true; // Driver default is false, but true is almost always better
-    }
-
-    // ==================== SSL/TLS ====================
-
-    /**
-     * Enable SSL connection. Driver default: false.
-     *
-     * @param ssl true to enable SSL
-     * @return this builder
-     */
-    public Builder ssl(boolean ssl) {
-      this.ssl = ssl;
-      return this;
-    }
-
-    /**
-     * SSL socket factory class name. Driver default: org.postgresql.ssl.LibPQFactory
-     *
-     * @param sslfactory fully qualified class name
-     * @return this builder
-     */
-    public Builder sslfactory(String sslfactory) {
-      this.sslfactory = sslfactory;
-      return this;
-    }
-
-    /**
-     * SSL mode for connection security. Driver default: prefer.
- * - * @param sslmode SSL mode - * @return this builder - */ - public Builder sslmode(PgSslMode sslmode) { - this.sslmode = sslmode; - return this; - } - - /** - * SSL negotiation mode. Driver default: postgres. - * - * @param sslNegotiation SSL negotiation mode - * @return this builder - */ - public Builder sslNegotiation(PgSslNegotiation sslNegotiation) { - this.sslNegotiation = sslNegotiation; - return this; - } - - /** - * Path to client SSL certificate. Driver default: ~/.postgresql/postgresql.crt - * - * @param sslcert path to certificate file - * @return this builder - */ - public Builder sslcert(String sslcert) { - this.sslcert = sslcert; - return this; - } - - /** - * Path to client SSL private key. Driver default: ~/.postgresql/postgresql.pk8 - * - * @param sslkey path to key file (PKCS#8 format) - * @return this builder - */ - public Builder sslkey(String sslkey) { - this.sslkey = sslkey; - return this; - } - - /** - * Path to root CA certificate. Driver default: ~/.postgresql/root.crt - * - * @param sslrootcert path to root CA file - * @return this builder - */ - public Builder sslrootcert(String sslrootcert) { - this.sslrootcert = sslrootcert; - return this; - } - - /** - * Hostname verifier class for SSL. Driver default: null (use default verifier). - * - * @param sslhostnameverifier fully qualified class name - * @return this builder - */ - public Builder sslhostnameverifier(String sslhostnameverifier) { - this.sslhostnameverifier = sslhostnameverifier; - return this; - } - - /** - * SSL password callback class. Driver default: null. - * - * @param sslpasswordcallback fully qualified class name implementing - * javax.security.auth.callback.CallbackHandler - * @return this builder - */ - public Builder sslpasswordcallback(String sslpasswordcallback) { - this.sslpasswordcallback = sslpasswordcallback; - return this; - } - - /** - * Password for encrypted SSL private key. Driver default: null. 
- * - * @param sslpassword password for the key file - * @return this builder - */ - public Builder sslpassword(String sslpassword) { - this.sslpassword = sslpassword; - return this; - } - - /** - * Timeout for SSL negotiation response in milliseconds. Driver default: 5000. - * - * @param sslResponseTimeout timeout in milliseconds - * @return this builder - */ - public Builder sslResponseTimeout(int sslResponseTimeout) { - this.sslResponseTimeout = sslResponseTimeout; - return this; - } - - // ==================== PERFORMANCE ==================== - - /** - * Rewrite INSERT statements for batch optimization. Driver default: false. OUR DEFAULT: true - * (significantly improves batch insert performance). - * - * @param reWriteBatchedInserts true to enable - * @return this builder - */ - public Builder reWriteBatchedInserts(boolean reWriteBatchedInserts) { - this.reWriteBatchedInserts = reWriteBatchedInserts; - return this; - } - - /** - * Enable binary transfer for supported types. Driver default: true. - * - * @param binaryTransfer true to enable - * @return this builder - */ - public Builder binaryTransfer(boolean binaryTransfer) { - this.binaryTransfer = binaryTransfer; - return this; - } - - /** - * Comma-separated list of OIDs to enable binary transfer for. Driver default: empty. - * - * @param binaryTransferEnable comma-separated OIDs - * @return this builder - */ - public Builder binaryTransferEnable(String binaryTransferEnable) { - this.binaryTransferEnable = binaryTransferEnable; - return this; - } - - /** - * Comma-separated list of OIDs to disable binary transfer for. Driver default: empty. - * - * @param binaryTransferDisable comma-separated OIDs - * @return this builder - */ - public Builder binaryTransferDisable(String binaryTransferDisable) { - this.binaryTransferDisable = binaryTransferDisable; - return this; - } - - /** - * Number of executions before using server-side prepared statement. Driver default: 5. 
Use 0 to - * disable, -1 to always use prepared statements. - * - * @param prepareThreshold threshold count - * @return this builder - */ - public Builder prepareThreshold(int prepareThreshold) { - this.prepareThreshold = prepareThreshold; - return this; - } - - /** - * Maximum number of prepared statements cached per connection. Driver default: 256. - * - * @param preparedStatementCacheQueries cache size - * @return this builder - */ - public Builder preparedStatementCacheQueries(int preparedStatementCacheQueries) { - this.preparedStatementCacheQueries = preparedStatementCacheQueries; - return this; - } - - /** - * Maximum size of prepared statement cache in MiB. Driver default: 5. - * - * @param preparedStatementCacheSizeMiB cache size in MiB - * @return this builder - */ - public Builder preparedStatementCacheSizeMiB(int preparedStatementCacheSizeMiB) { - this.preparedStatementCacheSizeMiB = preparedStatementCacheSizeMiB; - return this; - } - - /** - * Query execution mode. Driver default: extended. - * - * @param preferQueryMode query mode - * @return this builder - */ - public Builder preferQueryMode(PgQueryMode preferQueryMode) { - this.preferQueryMode = preferQueryMode; - return this; - } - - /** - * Default fetch size for statements. Driver default: 0 (fetch all rows). - * - * @param defaultRowFetchSize fetch size (0 = fetch all) - * @return this builder - */ - public Builder defaultRowFetchSize(int defaultRowFetchSize) { - this.defaultRowFetchSize = defaultRowFetchSize; - return this; - } - - /** - * Maximum number of fields to cache in DatabaseMetaData. Driver default: 65536. - * - * @param databaseMetadataCacheFields cache size - * @return this builder - */ - public Builder databaseMetadataCacheFields(int databaseMetadataCacheFields) { - this.databaseMetadataCacheFields = databaseMetadataCacheFields; - return this; - } - - /** - * Maximum size of DatabaseMetaData cache in MiB. Driver default: 5. 
- * - * @param databaseMetadataCacheFieldsMiB cache size in MiB - * @return this builder - */ - public Builder databaseMetadataCacheFieldsMiB(int databaseMetadataCacheFieldsMiB) { - this.databaseMetadataCacheFieldsMiB = databaseMetadataCacheFieldsMiB; - return this; - } - - /** - * Enable adaptive fetch size based on result set size. Driver default: false. - * - * @param adaptiveFetch true to enable - * @return this builder - */ - public Builder adaptiveFetch(boolean adaptiveFetch) { - this.adaptiveFetch = adaptiveFetch; - return this; - } - - /** - * Minimum fetch size for adaptive fetch. Driver default: 0. - * - * @param adaptiveFetchMinimum minimum fetch size - * @return this builder - */ - public Builder adaptiveFetchMinimum(int adaptiveFetchMinimum) { - this.adaptiveFetchMinimum = adaptiveFetchMinimum; - return this; - } - - /** - * Maximum fetch size for adaptive fetch. Driver default: -1 (unlimited). - * - * @param adaptiveFetchMaximum maximum fetch size - * @return this builder - */ - public Builder adaptiveFetchMaximum(int adaptiveFetchMaximum) { - this.adaptiveFetchMaximum = adaptiveFetchMaximum; - return this; - } - - // ==================== TIMEOUTS ==================== - - /** - * Timeout for login (authentication) in seconds. Driver default: 0 (unlimited). - * - * @param loginTimeout timeout in seconds - * @return this builder - */ - public Builder loginTimeout(int loginTimeout) { - this.loginTimeout = loginTimeout; - return this; - } - - /** - * Timeout for establishing connection in seconds. Driver default: 10. - * - * @param connectTimeout timeout in seconds - * @return this builder - */ - public Builder connectTimeout(int connectTimeout) { - this.connectTimeout = connectTimeout; - return this; - } - - /** - * Socket read timeout in seconds. Driver default: 0 (unlimited). 
- * - * @param socketTimeout timeout in seconds - * @return this builder - */ - public Builder socketTimeout(int socketTimeout) { - this.socketTimeout = socketTimeout; - return this; - } - - /** - * Timeout for cancel signal in seconds. Driver default: 10. - * - * @param cancelSignalTimeout timeout in seconds - * @return this builder - */ - public Builder cancelSignalTimeout(int cancelSignalTimeout) { - this.cancelSignalTimeout = cancelSignalTimeout; - return this; - } - - // ==================== NETWORK ==================== - - /** - * Enable TCP keepalive. Driver default: false. - * - * @param tcpKeepAlive true to enable - * @return this builder - */ - public Builder tcpKeepAlive(boolean tcpKeepAlive) { - this.tcpKeepAlive = tcpKeepAlive; - return this; - } - - /** - * Enable TCP no-delay (Nagle's algorithm disabled). Driver default: true. - * - * @param tcpNoDelay true to disable Nagle's algorithm - * @return this builder - */ - public Builder tcpNoDelay(boolean tcpNoDelay) { - this.tcpNoDelay = tcpNoDelay; - return this; - } - - /** - * Socket send buffer size in bytes. Driver default: -1 (system default). - * - * @param sendBufferSize buffer size in bytes - * @return this builder - */ - public Builder sendBufferSize(int sendBufferSize) { - this.sendBufferSize = sendBufferSize; - return this; - } - - /** - * Socket receive buffer size in bytes. Driver default: -1 (system default). - * - * @param receiveBufferSize buffer size in bytes - * @return this builder - */ - public Builder receiveBufferSize(int receiveBufferSize) { - this.receiveBufferSize = receiveBufferSize; - return this; - } - - /** - * Maximum size of data to send in one packet in bytes. Driver default: 8192. 
- * - * @param maxSendBufferSize maximum send size in bytes - * @return this builder - */ - public Builder maxSendBufferSize(int maxSendBufferSize) { - this.maxSendBufferSize = maxSendBufferSize; - return this; - } - - // ==================== KERBEROS/GSSAPI ==================== - - /** - * GSS library selection for Kerberos authentication. Driver default: auto. - * - * @param gsslib GSS library - * @return this builder - */ - public Builder gsslib(PgGssLib gsslib) { - this.gsslib = gsslib; - return this; - } - - /** - * Kerberos server name (principal). Driver default: postgres. - * - * @param kerberosServerName server name - * @return this builder - */ - public Builder kerberosServerName(String kerberosServerName) { - this.kerberosServerName = kerberosServerName; - return this; - } - - /** - * JAAS application name for Kerberos. Driver default: pgjdbc. - * - * @param jaasApplicationName application name - * @return this builder - */ - public Builder jaasApplicationName(String jaasApplicationName) { - this.jaasApplicationName = jaasApplicationName; - return this; - } - - /** - * Whether to perform JAAS login. Driver default: true. - * - * @param jaasLogin true to perform JAAS login - * @return this builder - */ - public Builder jaasLogin(boolean jaasLogin) { - this.jaasLogin = jaasLogin; - return this; - } - - /** - * Use default GSS credentials from Subject. Driver default: false. - * - * @param gssUseDefaultCreds true to use default credentials - * @return this builder - */ - public Builder gssUseDefaultCreds(boolean gssUseDefaultCreds) { - this.gssUseDefaultCreds = gssUseDefaultCreds; - return this; - } - - /** - * GSS encryption mode. Driver default: prefer. - * - * @param gssEncMode encryption mode - * @return this builder - */ - public Builder gssEncMode(PgGssEncMode gssEncMode) { - this.gssEncMode = gssEncMode; - return this; - } - - /** - * Timeout for GSS response in milliseconds. Driver default: 5000. 
- * - * @param gssResponseTimeout timeout in milliseconds - * @return this builder - */ - public Builder gssResponseTimeout(int gssResponseTimeout) { - this.gssResponseTimeout = gssResponseTimeout; - return this; - } - - /** - * SSPI service class (Windows). Driver default: POSTGRES. - * - * @param sspiServiceClass service class - * @return this builder - */ - public Builder sspiServiceClass(String sspiServiceClass) { - this.sspiServiceClass = sspiServiceClass; - return this; - } - - /** - * Use SPNEGO for SSPI (Windows). Driver default: false. - * - * @param useSpnego true to use SPNEGO - * @return this builder - */ - public Builder useSpnego(boolean useSpnego) { - this.useSpnego = useSpnego; - return this; - } - - /** - * Channel binding mode for SCRAM authentication. Driver default: prefer. - * - * @param channelBinding channel binding mode - * @return this builder - */ - public Builder channelBinding(PgChannelBinding channelBinding) { - this.channelBinding = channelBinding; - return this; - } - - // ==================== BEHAVIOR ==================== - - /** - * Allow encoding changes via SET NAMES. Driver default: false. - * - * @param allowEncodingChanges true to allow - * @return this builder - */ - public Builder allowEncodingChanges(boolean allowEncodingChanges) { - this.allowEncodingChanges = allowEncodingChanges; - return this; - } - - /** - * Log a warning when connection is not closed properly. Driver default: false. - * - * @param logUnclosedConnections true to log warnings - * @return this builder - */ - public Builder logUnclosedConnections(boolean logUnclosedConnections) { - this.logUnclosedConnections = logUnclosedConnections; - return this; - } - - /** - * Autosave mode for savepoint handling. Driver default: never. - * - * @param autosave autosave mode - * @return this builder - */ - public Builder autosave(PgAutosave autosave) { - this.autosave = autosave; - return this; - } - - /** - * Clean up savepoints after transaction. 
Driver default: false. - * - * @param cleanupSavepoints true to clean up - * @return this builder - */ - public Builder cleanupSavepoints(boolean cleanupSavepoints) { - this.cleanupSavepoints = cleanupSavepoints; - return this; - } - - /** - * Type to use for String parameters. Driver default: null (unspecified). - * - * @param stringtype type name (e.g., "varchar", "unspecified") - * @return this builder - */ - public Builder stringtype(String stringtype) { - this.stringtype = stringtype; - return this; - } - - /** - * Application name for pg_stat_activity. Driver default: PostgreSQL JDBC Driver. - * - * @param applicationName application name - * @return this builder - */ - public Builder applicationName(String applicationName) { - this.applicationName = applicationName; - return this; - } - - /** - * Current schema search path. Driver default: null (use server default). - * - * @param currentSchema comma-separated schema names - * @return this builder - */ - public Builder currentSchema(String currentSchema) { - this.currentSchema = currentSchema; - return this; - } - - /** - * Set connection to read-only mode. Driver default: false. - * - * @param readOnly true for read-only - * @return this builder - */ - public Builder readOnly(boolean readOnly) { - this.readOnly = readOnly; - return this; - } - - /** - * How to apply read-only mode. Driver default: transaction. - * - * @param readOnlyMode read-only mode behavior - * @return this builder - */ - public Builder readOnlyMode(PgReadOnlyMode readOnlyMode) { - this.readOnlyMode = readOnlyMode; - return this; - } - - /** - * Disable column name sanitizer. Driver default: false. - * - * @param disableColumnSanitiser true to disable - * @return this builder - */ - public Builder disableColumnSanitiser(boolean disableColumnSanitiser) { - this.disableColumnSanitiser = disableColumnSanitiser; - return this; - } - - /** - * Assume minimum server version (skip version detection). Driver default: null. 
- * - * @param assumeMinServerVersion version string (e.g., "9.6") - * @return this builder - */ - public Builder assumeMinServerVersion(String assumeMinServerVersion) { - this.assumeMinServerVersion = assumeMinServerVersion; - return this; - } - - /** - * Length to assume for unknown types. Driver default: Integer.MAX_VALUE. - * - * @param unknownLength length value - * @return this builder - */ - public Builder unknownLength(int unknownLength) { - this.unknownLength = unknownLength; - return this; - } - - /** - * Log server error details in exceptions. Driver default: true. - * - * @param logServerErrorDetail true to log details - * @return this builder - */ - public Builder logServerErrorDetail(boolean logServerErrorDetail) { - this.logServerErrorDetail = logServerErrorDetail; - return this; - } - - /** - * Quote identifiers in RETURNING clause. Driver default: false. - * - * @param quoteReturningIdentifiers true to quote - * @return this builder - */ - public Builder quoteReturningIdentifiers(boolean quoteReturningIdentifiers) { - this.quoteReturningIdentifiers = quoteReturningIdentifiers; - return this; - } - - /** - * Hide privileged exception details for unprivileged users. Driver default: false. - * - * @param hideUnprivilegedExceptions true to hide - * @return this builder - */ - public Builder hideUnprivilegedExceptions(boolean hideUnprivilegedExceptions) { - this.hideUnprivilegedExceptions = hideUnprivilegedExceptions; - return this; - } - - /** - * Server startup parameters (passed as -c options). Driver default: null. - * - * @param options startup options (e.g., "-c search_path=myschema") - * @return this builder - */ - public Builder options(String options) { - this.options = options; - return this; - } - - // ==================== REPLICATION ==================== - - /** - * Replication mode. Driver default: false (normal connection). 
- * - * @param replication replication mode - * @return this builder - */ - public Builder replication(PgReplication replication) { - this.replication = replication; - return this; - } - - /** - * Target server type for multi-server setups. Driver default: any. - * - * @param targetServerType target server type - * @return this builder - */ - public Builder targetServerType(PgTargetServerType targetServerType) { - this.targetServerType = targetServerType; - return this; - } - - /** - * Interval to recheck host status in seconds. Driver default: 10. - * - * @param hostRecheckSeconds interval in seconds - * @return this builder - */ - public Builder hostRecheckSeconds(int hostRecheckSeconds) { - this.hostRecheckSeconds = hostRecheckSeconds; - return this; - } - - /** - * Load balance connections across hosts. Driver default: false. - * - * @param loadBalanceHosts true to load balance - * @return this builder - */ - public Builder loadBalanceHosts(boolean loadBalanceHosts) { - this.loadBalanceHosts = loadBalanceHosts; - return this; - } - - // ==================== SOCKET FACTORY ==================== - - /** - * Custom socket factory class name. Driver default: null. - * - * @param socketFactory fully qualified class name - * @return this builder - */ - public Builder socketFactory(String socketFactory) { - this.socketFactory = socketFactory; - return this; - } - - /** - * Argument passed to socket factory constructor. Driver default: null. - * - * @param socketFactoryArg factory argument - * @return this builder - */ - public Builder socketFactoryArg(String socketFactoryArg) { - this.socketFactoryArg = socketFactoryArg; - return this; - } - - // ==================== RESULT HANDLING ==================== - - /** - * Maximum result buffer size (e.g., "64m", "256k"). Driver default: null (unlimited). 
-     *
-     * @param maxResultBuffer buffer size with unit suffix
-     * @return this builder
-     */
-    public Builder maxResultBuffer(String maxResultBuffer) {
-      this.maxResultBuffer = maxResultBuffer;
-      return this;
-    }
-
-    /**
-     * Escape syntax call mode for JDBC escape functions. Driver default: select.
-     *
-     * @param escapeSyntaxCallMode call mode
-     * @return this builder
-     */
-    public Builder escapeSyntaxCallMode(PgEscapeSyntaxCallMode escapeSyntaxCallMode) {
-      this.escapeSyntaxCallMode = escapeSyntaxCallMode;
-      return this;
-    }
-
-    // ==================== AUTH PLUGIN ====================
-
-    /**
-     * Custom authentication plugin class name. Driver default: null.
-     *
-     * @param authenticationPluginClassName fully qualified class name
-     * @return this builder
-     */
-    public Builder authenticationPluginClassName(String authenticationPluginClassName) {
-      this.authenticationPluginClassName = authenticationPluginClassName;
-      return this;
-    }
-
-    // ==================== ESCAPE HATCH ====================
-
-    /**
-     * Set an arbitrary driver property. Use this for undocumented or future properties.
-     *
-     * @param key property name
-     * @param value property value
-     * @return this builder
-     */
-    public Builder property(String key, String value) {
-      this.extraProperties.put(key, value);
-      return this;
-    }
-
-    /**
-     * Build the PostgresConfig.
-     *
-     * @return immutable PostgresConfig
-     */
-    public PostgresConfig build() {
-      return new PostgresConfig(this);
-    }
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerApplicationIntent.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerApplicationIntent.java
deleted file mode 100644
index 50c7ef287e..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerApplicationIntent.java
+++ /dev/null
@@ -1,19 +0,0 @@
-package dev.typr.foundations.connect.sqlserver;
-
-/** SQL Server application intent for read-only routing in Always On clusters. */
-public enum SqlServerApplicationIntent {
-  /** Read-write operations (default). Routes to primary replica. */
-  READ_WRITE("ReadWrite"),
-  /** Read-only operations. May route to read-only secondary replica. */
-  READ_ONLY("ReadOnly");
-
-  private final String value;
-
-  SqlServerApplicationIntent(String value) {
-    this.value = value;
-  }
-
-  public String value() {
-    return value;
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerAuthentication.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerAuthentication.java
deleted file mode 100644
index 664db6d6b7..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerAuthentication.java
+++ /dev/null
@@ -1,33 +0,0 @@
-package dev.typr.foundations.connect.sqlserver;
-
-/** SQL Server authentication mode. */
-public enum SqlServerAuthentication {
-  /** SQL Server authentication using username and password. */
-  SQL_PASSWORD("SqlPassword"),
-  /** Active Directory password authentication. */
-  ACTIVE_DIRECTORY_PASSWORD("ActiveDirectoryPassword"),
-  /** Active Directory integrated authentication (Windows). */
-  ACTIVE_DIRECTORY_INTEGRATED("ActiveDirectoryIntegrated"),
-  /** Active Directory interactive authentication (MFA). */
-  ACTIVE_DIRECTORY_INTERACTIVE("ActiveDirectoryInteractive"),
-  /** Active Directory service principal authentication. */
-  ACTIVE_DIRECTORY_SERVICE_PRINCIPAL("ActiveDirectoryServicePrincipal"),
-  /** Active Directory managed identity authentication. */
-  ACTIVE_DIRECTORY_MANAGED_IDENTITY("ActiveDirectoryManagedIdentity"),
-  /** Active Directory default authentication (uses DefaultAzureCredential). */
-  ACTIVE_DIRECTORY_DEFAULT("ActiveDirectoryDefault"),
-  /** Windows integrated authentication (NTLM/Kerberos). */
-  NTLM("NTLM"),
-  /** Use access token for authentication. */
-  ACCESS_TOKEN("accessToken");
-
-  private final String value;
-
-  SqlServerAuthentication(String value) {
-    this.value = value;
-  }
-
-  public String value() {
-    return value;
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerAuthenticationScheme.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerAuthenticationScheme.java
deleted file mode 100644
index 71f3d8dafc..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerAuthenticationScheme.java
+++ /dev/null
@@ -1,23 +0,0 @@
-package dev.typr.foundations.connect.sqlserver;
-
-/** SQL Server authentication scheme for integrated authentication. */
-public enum SqlServerAuthenticationScheme {
-  /** Use native SQL Server authentication (default). */
-  NATIVE_AUTHENTICATION("nativeAuthentication"),
-  /** Use NTLM authentication. */
-  NTLM("NTLM"),
-  /** Use Kerberos authentication (requires proper SPN configuration). */
-  KERBEROS("Kerberos"),
-  /** Use Java's GSS-API for Kerberos (cross-platform). */
-  JAVA_KERBEROS("JavaKerberos");
-
-  private final String value;
-
-  SqlServerAuthenticationScheme(String value) {
-    this.value = value;
-  }
-
-  public String value() {
-    return value;
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerColumnEncryptionSetting.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerColumnEncryptionSetting.java
deleted file mode 100644
index 97de491058..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerColumnEncryptionSetting.java
+++ /dev/null
@@ -1,19 +0,0 @@
-package dev.typr.foundations.connect.sqlserver;
-
-/** SQL Server Always Encrypted column encryption setting. */
-public enum SqlServerColumnEncryptionSetting {
-  /** Disable Always Encrypted (default). */
-  DISABLED("Disabled"),
-  /** Enable Always Encrypted. */
-  ENABLED("Enabled");
-
-  private final String value;
-
-  SqlServerColumnEncryptionSetting(String value) {
-    this.value = value;
-  }
-
-  public String value() {
-    return value;
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerConfig.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerConfig.java
deleted file mode 100644
index 953c39c516..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerConfig.java
+++ /dev/null
@@ -1,1214 +0,0 @@
-package dev.typr.foundations.connect.sqlserver;
-
-import dev.typr.foundations.connect.DatabaseConfig;
-import dev.typr.foundations.connect.DatabaseKind;
-import java.util.HashMap;
-import java.util.Map;
-
-/**
- * SQL Server database configuration with typed builder methods for all documented JDBC driver
- * properties.
- *
- * <p>
Properties are based on the Microsoft JDBC Driver for SQL Server documentation. - * - * @see Microsoft - * JDBC Documentation - */ -public final class SqlServerConfig implements DatabaseConfig { - - private final String host; - private final int port; - private final String database; - private final String username; - private final String password; - - // Connection properties - private final String instanceName; - private final Boolean integratedSecurity; - private final SqlServerAuthentication authentication; - private final SqlServerAuthenticationScheme authenticationScheme; - private final String accessToken; - private final String realm; - private final String serverSpn; - - // Encryption/SSL properties - private final SqlServerEncrypt encrypt; - private final Boolean trustServerCertificate; - private final String hostNameInCertificate; - private final String trustStore; - private final String trustStorePassword; - private final String trustStoreType; - private final String sslProtocol; - private final String keyStoreLocation; - private final String keyStoreSecret; - - // Performance properties - private final Boolean useBulkCopyForBatchInsert; - private final Boolean sendStringParametersAsUnicode; - private final SqlServerResponseBuffering responseBuffering; - private final SqlServerSelectMethod selectMethod; - private final Integer packetSize; - private final Boolean enablePrepareOnFirstPreparedStatementCall; - private final Integer serverPreparedStatementDiscardThreshold; - private final Integer statementPoolingCacheSize; - private final Boolean disableStatementPooling; - private final Boolean useFmtOnly; - private final Boolean delayLoadingLobs; - private final Integer maxResultBuffer; - private final Boolean sendTemporalDataTypesAsStringForBulkCopy; - - // Timeout properties - private final Integer loginTimeout; - private final Integer queryTimeout; - private final Integer socketTimeout; - private final Integer lockTimeout; - private final Integer 
cancelQueryTimeout; - - // HA/Failover properties - private final Boolean multiSubnetFailover; - private final SqlServerApplicationIntent applicationIntent; - private final String failoverPartner; - private final Boolean transparentNetworkIPResolution; - private final Integer connectRetryCount; - private final Integer connectRetryInterval; - - // Always Encrypted properties - private final SqlServerColumnEncryptionSetting columnEncryptionSetting; - private final String keyStoreAuthentication; - private final String keyStorePrincipalId; - private final String enclaveAttestationUrl; - private final String enclaveAttestationProtocol; - private final Boolean alwaysEncryptedTraceEnabled; - - // Date/Time properties - private final Boolean sendTimeAsDatetime; - private final String datetimeParameterType; - - // Application properties - private final String applicationName; - private final String workstationID; - - // Logging/Debugging properties - private final String traceDirectory; - private final Boolean traceEnabled; - private final String jaasConfigurationName; - private final String clientCertificate; - private final String clientKey; - private final String clientKeyPassword; - - // Misc properties - private final String lastUpdateCount; - private final Boolean xopenStates; - private final Boolean replication; - private final String gsscredential; - private final String serverNameAsACE; - private final Boolean useDefaultGSSCredential; - private final String msiClientId; - private final String prepareMethod; - - // Escape hatch - private final Map extraProperties; - - private SqlServerConfig(Builder b) { - this.host = b.host; - this.port = b.port; - this.database = b.database; - this.username = b.username; - this.password = b.password; - - // Connection - this.instanceName = b.instanceName; - this.integratedSecurity = b.integratedSecurity; - this.authentication = b.authentication; - this.authenticationScheme = b.authenticationScheme; - this.accessToken = 
b.accessToken; - this.realm = b.realm; - this.serverSpn = b.serverSpn; - - // Encryption/SSL - this.encrypt = b.encrypt; - this.trustServerCertificate = b.trustServerCertificate; - this.hostNameInCertificate = b.hostNameInCertificate; - this.trustStore = b.trustStore; - this.trustStorePassword = b.trustStorePassword; - this.trustStoreType = b.trustStoreType; - this.sslProtocol = b.sslProtocol; - this.keyStoreLocation = b.keyStoreLocation; - this.keyStoreSecret = b.keyStoreSecret; - - // Performance - this.useBulkCopyForBatchInsert = b.useBulkCopyForBatchInsert; - this.sendStringParametersAsUnicode = b.sendStringParametersAsUnicode; - this.responseBuffering = b.responseBuffering; - this.selectMethod = b.selectMethod; - this.packetSize = b.packetSize; - this.enablePrepareOnFirstPreparedStatementCall = b.enablePrepareOnFirstPreparedStatementCall; - this.serverPreparedStatementDiscardThreshold = b.serverPreparedStatementDiscardThreshold; - this.statementPoolingCacheSize = b.statementPoolingCacheSize; - this.disableStatementPooling = b.disableStatementPooling; - this.useFmtOnly = b.useFmtOnly; - this.delayLoadingLobs = b.delayLoadingLobs; - this.maxResultBuffer = b.maxResultBuffer; - this.sendTemporalDataTypesAsStringForBulkCopy = b.sendTemporalDataTypesAsStringForBulkCopy; - - // Timeouts - this.loginTimeout = b.loginTimeout; - this.queryTimeout = b.queryTimeout; - this.socketTimeout = b.socketTimeout; - this.lockTimeout = b.lockTimeout; - this.cancelQueryTimeout = b.cancelQueryTimeout; - - // HA/Failover - this.multiSubnetFailover = b.multiSubnetFailover; - this.applicationIntent = b.applicationIntent; - this.failoverPartner = b.failoverPartner; - this.transparentNetworkIPResolution = b.transparentNetworkIPResolution; - this.connectRetryCount = b.connectRetryCount; - this.connectRetryInterval = b.connectRetryInterval; - - // Always Encrypted - this.columnEncryptionSetting = b.columnEncryptionSetting; - this.keyStoreAuthentication = b.keyStoreAuthentication; - 
this.keyStorePrincipalId = b.keyStorePrincipalId; - this.enclaveAttestationUrl = b.enclaveAttestationUrl; - this.enclaveAttestationProtocol = b.enclaveAttestationProtocol; - this.alwaysEncryptedTraceEnabled = b.alwaysEncryptedTraceEnabled; - - // Date/Time - this.sendTimeAsDatetime = b.sendTimeAsDatetime; - this.datetimeParameterType = b.datetimeParameterType; - - // Application - this.applicationName = b.applicationName; - this.workstationID = b.workstationID; - - // Logging/Debugging - this.traceDirectory = b.traceDirectory; - this.traceEnabled = b.traceEnabled; - this.jaasConfigurationName = b.jaasConfigurationName; - this.clientCertificate = b.clientCertificate; - this.clientKey = b.clientKey; - this.clientKeyPassword = b.clientKeyPassword; - - // Misc - this.lastUpdateCount = b.lastUpdateCount; - this.xopenStates = b.xopenStates; - this.replication = b.replication; - this.gsscredential = b.gsscredential; - this.serverNameAsACE = b.serverNameAsACE; - this.useDefaultGSSCredential = b.useDefaultGSSCredential; - this.msiClientId = b.msiClientId; - this.prepareMethod = b.prepareMethod; - - this.extraProperties = Map.copyOf(b.extraProperties); - } - - /** - * Create a new builder with required connection parameters. 
- * - * @param host SQL Server hostname - * @param port SQL Server port (typically 1433) - * @param database database name - * @param username username for authentication - * @param password password for authentication - * @return a new builder - */ - public static Builder builder( - String host, int port, String database, String username, String password) { - return new Builder(host, port, database, username, password); - } - - @Override - public String jdbcUrl() { - StringBuilder url = new StringBuilder("jdbc:sqlserver://"); - url.append(host).append(":").append(port); - url.append(";databaseName=").append(database); - if (instanceName != null) { - url.append(";instanceName=").append(instanceName); - } - return url.toString(); - } - - @Override - public String username() { - return username; - } - - @Override - public String password() { - return password; - } - - @Override - public DatabaseKind kind() { - return DatabaseKind.SQLSERVER; - } - - @Override - public Map driverProperties() { - Map props = new HashMap<>(); - - // Connection - if (integratedSecurity != null) props.put("integratedSecurity", integratedSecurity.toString()); - if (authentication != null) props.put("authentication", authentication.value()); - if (authenticationScheme != null) - props.put("authenticationScheme", authenticationScheme.value()); - if (accessToken != null) props.put("accessToken", accessToken); - if (realm != null) props.put("realm", realm); - if (serverSpn != null) props.put("serverSpn", serverSpn); - - // Encryption/SSL - if (encrypt != null) props.put("encrypt", encrypt.value()); - if (trustServerCertificate != null) - props.put("trustServerCertificate", trustServerCertificate.toString()); - if (hostNameInCertificate != null) props.put("hostNameInCertificate", hostNameInCertificate); - if (trustStore != null) props.put("trustStore", trustStore); - if (trustStorePassword != null) props.put("trustStorePassword", trustStorePassword); - if (trustStoreType != null) 
props.put("trustStoreType", trustStoreType); - if (sslProtocol != null) props.put("sslProtocol", sslProtocol); - if (keyStoreLocation != null) props.put("keyStoreLocation", keyStoreLocation); - if (keyStoreSecret != null) props.put("keyStoreSecret", keyStoreSecret); - - // Performance - if (useBulkCopyForBatchInsert != null) - props.put("useBulkCopyForBatchInsert", useBulkCopyForBatchInsert.toString()); - if (sendStringParametersAsUnicode != null) - props.put("sendStringParametersAsUnicode", sendStringParametersAsUnicode.toString()); - if (responseBuffering != null) props.put("responseBuffering", responseBuffering.value()); - if (selectMethod != null) props.put("selectMethod", selectMethod.value()); - if (packetSize != null) props.put("packetSize", packetSize.toString()); - if (enablePrepareOnFirstPreparedStatementCall != null) - props.put( - "enablePrepareOnFirstPreparedStatementCall", - enablePrepareOnFirstPreparedStatementCall.toString()); - if (serverPreparedStatementDiscardThreshold != null) - props.put( - "serverPreparedStatementDiscardThreshold", - serverPreparedStatementDiscardThreshold.toString()); - if (statementPoolingCacheSize != null) - props.put("statementPoolingCacheSize", statementPoolingCacheSize.toString()); - if (disableStatementPooling != null) - props.put("disableStatementPooling", disableStatementPooling.toString()); - if (useFmtOnly != null) props.put("useFmtOnly", useFmtOnly.toString()); - if (delayLoadingLobs != null) props.put("delayLoadingLobs", delayLoadingLobs.toString()); - if (maxResultBuffer != null) props.put("maxResultBuffer", maxResultBuffer.toString()); - if (sendTemporalDataTypesAsStringForBulkCopy != null) - props.put( - "sendTemporalDataTypesAsStringForBulkCopy", - sendTemporalDataTypesAsStringForBulkCopy.toString()); - - // Timeouts - if (loginTimeout != null) props.put("loginTimeout", loginTimeout.toString()); - if (queryTimeout != null) props.put("queryTimeout", queryTimeout.toString()); - if (socketTimeout != null) 
props.put("socketTimeout", socketTimeout.toString()); - if (lockTimeout != null) props.put("lockTimeout", lockTimeout.toString()); - if (cancelQueryTimeout != null) props.put("cancelQueryTimeout", cancelQueryTimeout.toString()); - - // HA/Failover - if (multiSubnetFailover != null) - props.put("multiSubnetFailover", multiSubnetFailover.toString()); - if (applicationIntent != null) props.put("applicationIntent", applicationIntent.value()); - if (failoverPartner != null) props.put("failoverPartner", failoverPartner); - if (transparentNetworkIPResolution != null) - props.put("transparentNetworkIPResolution", transparentNetworkIPResolution.toString()); - if (connectRetryCount != null) props.put("connectRetryCount", connectRetryCount.toString()); - if (connectRetryInterval != null) - props.put("connectRetryInterval", connectRetryInterval.toString()); - - // Always Encrypted - if (columnEncryptionSetting != null) - props.put("columnEncryptionSetting", columnEncryptionSetting.value()); - if (keyStoreAuthentication != null) props.put("keyStoreAuthentication", keyStoreAuthentication); - if (keyStorePrincipalId != null) props.put("keyStorePrincipalId", keyStorePrincipalId); - if (enclaveAttestationUrl != null) props.put("enclaveAttestationUrl", enclaveAttestationUrl); - if (enclaveAttestationProtocol != null) - props.put("enclaveAttestationProtocol", enclaveAttestationProtocol); - if (alwaysEncryptedTraceEnabled != null) - props.put("alwaysEncryptedTraceEnabled", alwaysEncryptedTraceEnabled.toString()); - - // Date/Time - if (sendTimeAsDatetime != null) props.put("sendTimeAsDatetime", sendTimeAsDatetime.toString()); - if (datetimeParameterType != null) props.put("datetimeParameterType", datetimeParameterType); - - // Application - if (applicationName != null) props.put("applicationName", applicationName); - if (workstationID != null) props.put("workstationID", workstationID); - - // Logging/Debugging - if (traceDirectory != null) props.put("traceDirectory", traceDirectory); 
-    if (traceEnabled != null) props.put("traceEnabled", traceEnabled.toString());
-    if (jaasConfigurationName != null) props.put("jaasConfigurationName", jaasConfigurationName);
-    if (clientCertificate != null) props.put("clientCertificate", clientCertificate);
-    if (clientKey != null) props.put("clientKey", clientKey);
-    if (clientKeyPassword != null) props.put("clientKeyPassword", clientKeyPassword);
-
-    // Misc
-    if (lastUpdateCount != null) props.put("lastUpdateCount", lastUpdateCount);
-    if (xopenStates != null) props.put("xopenStates", xopenStates.toString());
-    if (replication != null) props.put("replication", replication.toString());
-    if (gsscredential != null) props.put("gsscredential", gsscredential);
-    if (serverNameAsACE != null) props.put("serverNameAsACE", serverNameAsACE);
-    if (useDefaultGSSCredential != null)
-      props.put("useDefaultGSSCredential", useDefaultGSSCredential.toString());
-    if (msiClientId != null) props.put("msiClientId", msiClientId);
-    if (prepareMethod != null) props.put("prepareMethod", prepareMethod);
-
-    props.putAll(extraProperties);
-    return props;
-  }
-
-  /** Builder for SqlServerConfig with typed methods for all JDBC driver properties.
*/ - public static final class Builder { - private final String host; - private final int port; - private final String database; - private final String username; - private final String password; - - // Connection - private String instanceName; - private Boolean integratedSecurity; - private SqlServerAuthentication authentication; - private SqlServerAuthenticationScheme authenticationScheme; - private String accessToken; - private String realm; - private String serverSpn; - - // Encryption/SSL - private SqlServerEncrypt encrypt; - private Boolean trustServerCertificate; - private String hostNameInCertificate; - private String trustStore; - private String trustStorePassword; - private String trustStoreType; - private String sslProtocol; - private String keyStoreLocation; - private String keyStoreSecret; - - // Performance - private Boolean useBulkCopyForBatchInsert; - private Boolean sendStringParametersAsUnicode; - private SqlServerResponseBuffering responseBuffering; - private SqlServerSelectMethod selectMethod; - private Integer packetSize; - private Boolean enablePrepareOnFirstPreparedStatementCall; - private Integer serverPreparedStatementDiscardThreshold; - private Integer statementPoolingCacheSize; - private Boolean disableStatementPooling; - private Boolean useFmtOnly; - private Boolean delayLoadingLobs; - private Integer maxResultBuffer; - private Boolean sendTemporalDataTypesAsStringForBulkCopy; - - // Timeouts - private Integer loginTimeout; - private Integer queryTimeout; - private Integer socketTimeout; - private Integer lockTimeout; - private Integer cancelQueryTimeout; - - // HA/Failover - private Boolean multiSubnetFailover; - private SqlServerApplicationIntent applicationIntent; - private String failoverPartner; - private Boolean transparentNetworkIPResolution; - private Integer connectRetryCount; - private Integer connectRetryInterval; - - // Always Encrypted - private SqlServerColumnEncryptionSetting columnEncryptionSetting; - private String 
keyStoreAuthentication; - private String keyStorePrincipalId; - private String enclaveAttestationUrl; - private String enclaveAttestationProtocol; - private Boolean alwaysEncryptedTraceEnabled; - - // Date/Time - private Boolean sendTimeAsDatetime; - private String datetimeParameterType; - - // Application - private String applicationName; - private String workstationID; - - // Logging/Debugging - private String traceDirectory; - private Boolean traceEnabled; - private String jaasConfigurationName; - private String clientCertificate; - private String clientKey; - private String clientKeyPassword; - - // Misc - private String lastUpdateCount; - private Boolean xopenStates; - private Boolean replication; - private String gsscredential; - private String serverNameAsACE; - private Boolean useDefaultGSSCredential; - private String msiClientId; - private String prepareMethod; - - private final Map extraProperties = new HashMap<>(); - - private Builder(String host, int port, String database, String username, String password) { - this.host = host; - this.port = port; - this.database = database; - this.username = username; - this.password = password; - } - - // ==================== CONNECTION ==================== - - /** - * SQL Server named instance. Driver default: null (default instance). - * - * @param instanceName instance name - * @return this builder - */ - public Builder instanceName(String instanceName) { - this.instanceName = instanceName; - return this; - } - - /** - * Use Windows integrated authentication. Driver default: false. - * - * @param integratedSecurity true to use Windows auth - * @return this builder - */ - public Builder integratedSecurity(boolean integratedSecurity) { - this.integratedSecurity = integratedSecurity; - return this; - } - - /** - * Authentication mode. Driver default: null (use SQL authentication). 
- * - * @param authentication authentication mode - * @return this builder - */ - public Builder authentication(SqlServerAuthentication authentication) { - this.authentication = authentication; - return this; - } - - /** - * Authentication scheme for integrated security. Driver default: nativeAuthentication. - * - * @param authenticationScheme authentication scheme - * @return this builder - */ - public Builder authenticationScheme(SqlServerAuthenticationScheme authenticationScheme) { - this.authenticationScheme = authenticationScheme; - return this; - } - - /** - * Access token for Azure AD authentication. Driver default: null. - * - * @param accessToken Azure AD access token - * @return this builder - */ - public Builder accessToken(String accessToken) { - this.accessToken = accessToken; - return this; - } - - /** - * Kerberos realm for authentication. Driver default: null. - * - * @param realm Kerberos realm - * @return this builder - */ - public Builder realm(String realm) { - this.realm = realm; - return this; - } - - /** - * Server Service Principal Name for Kerberos. Driver default: null. - * - * @param serverSpn SPN in format MSSQLSvc/hostname:port - * @return this builder - */ - public Builder serverSpn(String serverSpn) { - this.serverSpn = serverSpn; - return this; - } - - // ==================== ENCRYPTION/SSL ==================== - - /** - * Encryption mode. Driver default: true (as of driver 10.x). - * - * @param encrypt encryption mode - * @return this builder - */ - public Builder encrypt(SqlServerEncrypt encrypt) { - this.encrypt = encrypt; - return this; - } - - /** - * Trust the server certificate without validation. Driver default: false. - * - * @param trustServerCertificate true to trust all certificates - * @return this builder - */ - public Builder trustServerCertificate(boolean trustServerCertificate) { - this.trustServerCertificate = trustServerCertificate; - return this; - } - - /** - * Hostname to verify in server certificate. 
Driver default: null. - * - * @param hostNameInCertificate expected hostname - * @return this builder - */ - public Builder hostNameInCertificate(String hostNameInCertificate) { - this.hostNameInCertificate = hostNameInCertificate; - return this; - } - - /** - * Path to trust store file. Driver default: null. - * - * @param trustStore path to JKS/PKCS12 file - * @return this builder - */ - public Builder trustStore(String trustStore) { - this.trustStore = trustStore; - return this; - } - - /** - * Password for trust store. Driver default: null. - * - * @param trustStorePassword trust store password - * @return this builder - */ - public Builder trustStorePassword(String trustStorePassword) { - this.trustStorePassword = trustStorePassword; - return this; - } - - /** - * Trust store type. Driver default: JKS. - * - * @param trustStoreType trust store type (JKS, PKCS12) - * @return this builder - */ - public Builder trustStoreType(String trustStoreType) { - this.trustStoreType = trustStoreType; - return this; - } - - /** - * SSL/TLS protocol version. Driver default: TLS. - * - * @param sslProtocol protocol version (TLS, TLSv1.2, TLSv1.3) - * @return this builder - */ - public Builder sslProtocol(String sslProtocol) { - this.sslProtocol = sslProtocol; - return this; - } - - /** - * Path to client key store for mutual TLS. Driver default: null. - * - * @param keyStoreLocation path to key store file - * @return this builder - */ - public Builder keyStoreLocation(String keyStoreLocation) { - this.keyStoreLocation = keyStoreLocation; - return this; - } - - /** - * Password for client key store. Driver default: null. - * - * @param keyStoreSecret key store password - * @return this builder - */ - public Builder keyStoreSecret(String keyStoreSecret) { - this.keyStoreSecret = keyStoreSecret; - return this; - } - - // ==================== PERFORMANCE ==================== - - /** - * Use bulk copy API for batch inserts. Driver default: false. 
OUR RECOMMENDATION: true - * (significantly faster batch inserts). - * - * @param useBulkCopyForBatchInsert true to enable - * @return this builder - */ - public Builder useBulkCopyForBatchInsert(boolean useBulkCopyForBatchInsert) { - this.useBulkCopyForBatchInsert = useBulkCopyForBatchInsert; - return this; - } - - /** - * Send string parameters as Unicode. Driver default: true. - * - * @param sendStringParametersAsUnicode true for Unicode, false for ASCII - * @return this builder - */ - public Builder sendStringParametersAsUnicode(boolean sendStringParametersAsUnicode) { - this.sendStringParametersAsUnicode = sendStringParametersAsUnicode; - return this; - } - - /** - * Response buffering mode. Driver default: adaptive (since 2.0). - * - * @param responseBuffering buffering mode - * @return this builder - */ - public Builder responseBuffering(SqlServerResponseBuffering responseBuffering) { - this.responseBuffering = responseBuffering; - return this; - } - - /** - * Select method for result sets. Driver default: direct. - * - * @param selectMethod select method - * @return this builder - */ - public Builder selectMethod(SqlServerSelectMethod selectMethod) { - this.selectMethod = selectMethod; - return this; - } - - /** - * TDS packet size in bytes. Driver default: 8000. - * - * @param packetSize packet size (512-32767) - * @return this builder - */ - public Builder packetSize(int packetSize) { - this.packetSize = packetSize; - return this; - } - - /** - * Create server-prepared statement on first execute. Driver default: null. - * - * @param enablePrepareOnFirstPreparedStatementCall true to prepare on first call - * @return this builder - */ - public Builder enablePrepareOnFirstPreparedStatementCall( - boolean enablePrepareOnFirstPreparedStatementCall) { - this.enablePrepareOnFirstPreparedStatementCall = enablePrepareOnFirstPreparedStatementCall; - return this; - } - - /** - * Threshold before unpreparing statements. Driver default: 10. 
- * - * @param serverPreparedStatementDiscardThreshold threshold count - * @return this builder - */ - public Builder serverPreparedStatementDiscardThreshold( - int serverPreparedStatementDiscardThreshold) { - this.serverPreparedStatementDiscardThreshold = serverPreparedStatementDiscardThreshold; - return this; - } - - /** - * Statement pooling cache size. Driver default: 0 (disabled). - * - * @param statementPoolingCacheSize cache size - * @return this builder - */ - public Builder statementPoolingCacheSize(int statementPoolingCacheSize) { - this.statementPoolingCacheSize = statementPoolingCacheSize; - return this; - } - - /** - * Disable statement pooling. Driver default: true. - * - * @param disableStatementPooling true to disable - * @return this builder - */ - public Builder disableStatementPooling(boolean disableStatementPooling) { - this.disableStatementPooling = disableStatementPooling; - return this; - } - - /** - * Use SET FMTONLY for parameter metadata. Driver default: false. - * - * @param useFmtOnly true to use SET FMTONLY - * @return this builder - */ - public Builder useFmtOnly(boolean useFmtOnly) { - this.useFmtOnly = useFmtOnly; - return this; - } - - /** - * Delay loading LOBs until accessed. Driver default: true. - * - * @param delayLoadingLobs true to delay loading - * @return this builder - */ - public Builder delayLoadingLobs(boolean delayLoadingLobs) { - this.delayLoadingLobs = delayLoadingLobs; - return this; - } - - /** - * Maximum result buffer size in bytes. Driver default: -1 (unlimited). - * - * @param maxResultBuffer buffer size in bytes - * @return this builder - */ - public Builder maxResultBuffer(int maxResultBuffer) { - this.maxResultBuffer = maxResultBuffer; - return this; - } - - /** - * Send temporal types as strings in bulk copy. Driver default: true. 
- * - * @param sendTemporalDataTypesAsStringForBulkCopy true to send as strings - * @return this builder - */ - public Builder sendTemporalDataTypesAsStringForBulkCopy( - boolean sendTemporalDataTypesAsStringForBulkCopy) { - this.sendTemporalDataTypesAsStringForBulkCopy = sendTemporalDataTypesAsStringForBulkCopy; - return this; - } - - // ==================== TIMEOUTS ==================== - - /** - * Login timeout in seconds. Driver default: 15. - * - * @param loginTimeout timeout in seconds - * @return this builder - */ - public Builder loginTimeout(int loginTimeout) { - this.loginTimeout = loginTimeout; - return this; - } - - /** - * Query timeout in seconds. Driver default: -1 (use server default). - * - * @param queryTimeout timeout in seconds - * @return this builder - */ - public Builder queryTimeout(int queryTimeout) { - this.queryTimeout = queryTimeout; - return this; - } - - /** - * Socket timeout in milliseconds. Driver default: 0 (unlimited). - * - * @param socketTimeout timeout in milliseconds - * @return this builder - */ - public Builder socketTimeout(int socketTimeout) { - this.socketTimeout = socketTimeout; - return this; - } - - /** - * Lock timeout in milliseconds. Driver default: -1 (wait indefinitely). - * - * @param lockTimeout timeout in milliseconds - * @return this builder - */ - public Builder lockTimeout(int lockTimeout) { - this.lockTimeout = lockTimeout; - return this; - } - - /** - * Cancel query timeout in seconds. Driver default: -1 (disabled). - * - * @param cancelQueryTimeout timeout in seconds - * @return this builder - */ - public Builder cancelQueryTimeout(int cancelQueryTimeout) { - this.cancelQueryTimeout = cancelQueryTimeout; - return this; - } - - // ==================== HA/FAILOVER ==================== - - /** - * Enable multi-subnet failover for Always On. Driver default: false. 
- * - * @param multiSubnetFailover true to enable - * @return this builder - */ - public Builder multiSubnetFailover(boolean multiSubnetFailover) { - this.multiSubnetFailover = multiSubnetFailover; - return this; - } - - /** - * Application intent for read-only routing. Driver default: ReadWrite. - * - * @param applicationIntent application intent - * @return this builder - */ - public Builder applicationIntent(SqlServerApplicationIntent applicationIntent) { - this.applicationIntent = applicationIntent; - return this; - } - - /** - * Failover partner server name. Driver default: null. - * - * @param failoverPartner partner server hostname - * @return this builder - */ - public Builder failoverPartner(String failoverPartner) { - this.failoverPartner = failoverPartner; - return this; - } - - /** - * Enable transparent network IP resolution. Driver default: true. - * - * @param transparentNetworkIPResolution true to enable - * @return this builder - */ - public Builder transparentNetworkIPResolution(boolean transparentNetworkIPResolution) { - this.transparentNetworkIPResolution = transparentNetworkIPResolution; - return this; - } - - /** - * Number of connection retry attempts. Driver default: 1. - * - * @param connectRetryCount retry count - * @return this builder - */ - public Builder connectRetryCount(int connectRetryCount) { - this.connectRetryCount = connectRetryCount; - return this; - } - - /** - * Interval between retry attempts in seconds. Driver default: 10. - * - * @param connectRetryInterval interval in seconds - * @return this builder - */ - public Builder connectRetryInterval(int connectRetryInterval) { - this.connectRetryInterval = connectRetryInterval; - return this; - } - - // ==================== ALWAYS ENCRYPTED ==================== - - /** - * Always Encrypted column encryption setting. Driver default: Disabled. 
- * - * @param columnEncryptionSetting encryption setting - * @return this builder - */ - public Builder columnEncryptionSetting( - SqlServerColumnEncryptionSetting columnEncryptionSetting) { - this.columnEncryptionSetting = columnEncryptionSetting; - return this; - } - - /** - * Key store authentication type. Driver default: null. - * - * @param keyStoreAuthentication authentication type (JavaKeyStorePassword, - * KeyVaultClientSecret, KeyVaultManagedIdentity) - * @return this builder - */ - public Builder keyStoreAuthentication(String keyStoreAuthentication) { - this.keyStoreAuthentication = keyStoreAuthentication; - return this; - } - - /** - * Key store principal ID (client ID for Azure). Driver default: null. - * - * @param keyStorePrincipalId principal/client ID - * @return this builder - */ - public Builder keyStorePrincipalId(String keyStorePrincipalId) { - this.keyStorePrincipalId = keyStorePrincipalId; - return this; - } - - /** - * Enclave attestation URL for secure enclaves. Driver default: null. - * - * @param enclaveAttestationUrl attestation service URL - * @return this builder - */ - public Builder enclaveAttestationUrl(String enclaveAttestationUrl) { - this.enclaveAttestationUrl = enclaveAttestationUrl; - return this; - } - - /** - * Enclave attestation protocol. Driver default: null. - * - * @param enclaveAttestationProtocol protocol (HGS, AAS, NONE) - * @return this builder - */ - public Builder enclaveAttestationProtocol(String enclaveAttestationProtocol) { - this.enclaveAttestationProtocol = enclaveAttestationProtocol; - return this; - } - - /** - * Enable Always Encrypted tracing. Driver default: false. 
- * - * @param alwaysEncryptedTraceEnabled true to enable tracing - * @return this builder - */ - public Builder alwaysEncryptedTraceEnabled(boolean alwaysEncryptedTraceEnabled) { - this.alwaysEncryptedTraceEnabled = alwaysEncryptedTraceEnabled; - return this; - } - - // ==================== DATE/TIME ==================== - - /** - * Send java.sql.Time as datetime. Driver default: true. - * - * @param sendTimeAsDatetime true to send as datetime - * @return this builder - */ - public Builder sendTimeAsDatetime(boolean sendTimeAsDatetime) { - this.sendTimeAsDatetime = sendTimeAsDatetime; - return this; - } - - /** - * Type to use for datetime parameters. Driver default: null. - * - * @param datetimeParameterType parameter type (datetime, datetime2, datetimeoffset) - * @return this builder - */ - public Builder datetimeParameterType(String datetimeParameterType) { - this.datetimeParameterType = datetimeParameterType; - return this; - } - - // ==================== APPLICATION ==================== - - /** - * Application name for monitoring. Driver default: Microsoft JDBC Driver for SQL Server. - * - * @param applicationName application name - * @return this builder - */ - public Builder applicationName(String applicationName) { - this.applicationName = applicationName; - return this; - } - - /** - * Workstation ID for monitoring. Driver default: local hostname. - * - * @param workstationID workstation identifier - * @return this builder - */ - public Builder workstationID(String workstationID) { - this.workstationID = workstationID; - return this; - } - - // ==================== LOGGING/DEBUGGING ==================== - - /** - * Directory for trace logs. Driver default: null. - * - * @param traceDirectory path to directory - * @return this builder - */ - public Builder traceDirectory(String traceDirectory) { - this.traceDirectory = traceDirectory; - return this; - } - - /** - * Enable driver tracing. Driver default: false. 
- * - * @param traceEnabled true to enable - * @return this builder - */ - public Builder traceEnabled(boolean traceEnabled) { - this.traceEnabled = traceEnabled; - return this; - } - - /** - * JAAS configuration name for Kerberos. Driver default: null. - * - * @param jaasConfigurationName JAAS config name - * @return this builder - */ - public Builder jaasConfigurationName(String jaasConfigurationName) { - this.jaasConfigurationName = jaasConfigurationName; - return this; - } - - /** - * Path to client certificate for mutual TLS. Driver default: null. - * - * @param clientCertificate path to certificate file - * @return this builder - */ - public Builder clientCertificate(String clientCertificate) { - this.clientCertificate = clientCertificate; - return this; - } - - /** - * Path to client private key for mutual TLS. Driver default: null. - * - * @param clientKey path to key file - * @return this builder - */ - public Builder clientKey(String clientKey) { - this.clientKey = clientKey; - return this; - } - - /** - * Password for client private key. Driver default: null. - * - * @param clientKeyPassword key password - * @return this builder - */ - public Builder clientKeyPassword(String clientKeyPassword) { - this.clientKeyPassword = clientKeyPassword; - return this; - } - - // ==================== MISC ==================== - - /** - * Return last update count. Driver default: true. - * - * @param lastUpdateCount "true" or "false" - * @return this builder - */ - public Builder lastUpdateCount(String lastUpdateCount) { - this.lastUpdateCount = lastUpdateCount; - return this; - } - - /** - * Use X/Open SQL states. Driver default: false. - * - * @param xopenStates true to use X/Open states - * @return this builder - */ - public Builder xopenStates(boolean xopenStates) { - this.xopenStates = xopenStates; - return this; - } - - /** - * Enable replication support. Driver default: false. 
- * - * @param replication true to enable - * @return this builder - */ - public Builder replication(boolean replication) { - this.replication = replication; - return this; - } - - /** - * GSS credential object class name. Driver default: null. - * - * @param gsscredential credential class name - * @return this builder - */ - public Builder gsscredential(String gsscredential) { - this.gsscredential = gsscredential; - return this; - } - - /** - * Server name as ACE (ASCII Compatible Encoding). Driver default: null. - * - * @param serverNameAsACE ACE hostname - * @return this builder - */ - public Builder serverNameAsACE(String serverNameAsACE) { - this.serverNameAsACE = serverNameAsACE; - return this; - } - - /** - * Use default GSS credential. Driver default: false. - * - * @param useDefaultGSSCredential true to use default - * @return this builder - */ - public Builder useDefaultGSSCredential(boolean useDefaultGSSCredential) { - this.useDefaultGSSCredential = useDefaultGSSCredential; - return this; - } - - /** - * Managed Identity client ID for Azure. Driver default: null. - * - * @param msiClientId MSI client ID - * @return this builder - */ - public Builder msiClientId(String msiClientId) { - this.msiClientId = msiClientId; - return this; - } - - /** - * Prepare method for statements. Driver default: null. - * - * @param prepareMethod prepare method (prepexec, prepare) - * @return this builder - */ - public Builder prepareMethod(String prepareMethod) { - this.prepareMethod = prepareMethod; - return this; - } - - /** - * Set an arbitrary driver property. - * - * @param key property name - * @param value property value - * @return this builder - */ - public Builder property(String key, String value) { - this.extraProperties.put(key, value); - return this; - } - - /** - * Build the SqlServerConfig. 
- * - * @return immutable SqlServerConfig - */ - public SqlServerConfig build() { - return new SqlServerConfig(this); - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerEncrypt.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerEncrypt.java deleted file mode 100644 index f93c5ac2ea..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerEncrypt.java +++ /dev/null @@ -1,23 +0,0 @@ -package dev.typr.foundations.connect.sqlserver; - -/** SQL Server encryption mode for connections. */ -public enum SqlServerEncrypt { - /** Do not use encryption. */ - FALSE("false"), - /** Use TDS 8.0 encryption (requires driver 10.x+, SQL Server 2022+). */ - STRICT("strict"), - /** Use encryption with TDS 7.x protocol. */ - TRUE("true"), - /** Use encryption only if server requires it (default for driver 9.x and earlier). */ - OPTIONAL("optional"); - - private final String value; - - SqlServerEncrypt(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerResponseBuffering.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerResponseBuffering.java deleted file mode 100644 index fb4fc18c79..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerResponseBuffering.java +++ /dev/null @@ -1,19 +0,0 @@ -package dev.typr.foundations.connect.sqlserver; - -/** SQL Server response buffering mode. */ -public enum SqlServerResponseBuffering { - /** Buffer the entire response in memory (default for drivers before 2.0). */ - FULL("full"), - /** Adaptively buffer based on response size (default for 2.0+). 
*/ - ADAPTIVE("adaptive"); - - private final String value; - - SqlServerResponseBuffering(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerSelectMethod.java b/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerSelectMethod.java deleted file mode 100644 index 4da2c098ce..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/connect/sqlserver/SqlServerSelectMethod.java +++ /dev/null @@ -1,19 +0,0 @@ -package dev.typr.foundations.connect.sqlserver; - -/** SQL Server select method for result set handling. */ -public enum SqlServerSelectMethod { - /** Use direct processing (forward-only, read-only result sets). Default. */ - DIRECT("direct"), - /** Use server-side cursors (required for scrollable/updatable result sets). */ - CURSOR("cursor"); - - private final String value; - - SqlServerSelectMethod(String value) { - this.value = value; - } - - public String value() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/AclItem.java b/foundations-jdbc/src/java/dev/typr/foundations/data/AclItem.java deleted file mode 100644 index 309d433772..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/AclItem.java +++ /dev/null @@ -1,3 +0,0 @@ -package dev.typr.foundations.data; - -public record AclItem(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/AnyArray.java b/foundations-jdbc/src/java/dev/typr/foundations/data/AnyArray.java deleted file mode 100644 index 85050f1713..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/AnyArray.java +++ /dev/null @@ -1,4 +0,0 @@ -package dev.typr.foundations.data; - -// anyarray stores generic arrays in PostgreSQL -public record AnyArray(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Arr.java 
b/foundations-jdbc/src/java/dev/typr/foundations/data/Arr.java deleted file mode 100644 index 44ddd4c96b..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Arr.java +++ /dev/null @@ -1,168 +0,0 @@ -package dev.typr.foundations.data; - -import java.util.Arrays; -import java.util.Collection; -import java.util.Optional; -import java.util.function.BiFunction; -import java.util.function.Function; -import java.util.stream.IntStream; - -public class Arr<A> { - public static final Arr<?> EMPTY = new Arr<>(new Object[0], new int[0], new int[] {0}); - - private final Object[] data; - private final int[] extent; - private final int[] _offsets; - - public Arr(Object[] data, int[] extent) { - this(data, extent, _offsets(extent)); - } - - private Arr(Object[] data, int[] extent, int[] _offsets) { - this.data = data; - this.extent = extent; - this._offsets = _offsets; - } - - public Optional<Arr<A>> reshape(int... dimensions) { - if (dimensions.length == 0) { - if (extent.length == 0) return Optional.of(this); - return Optional.of(new Arr<>(data, new int[0])); - } - int product = IntStream.of(dimensions).reduce(1, (a, b) -> a * b); - if (product == data.length) return Optional.of(new Arr<>(data, dimensions)); - else return Optional.empty(); - } - - public int size() { - return data.length; - } - - public boolean isEmpty() { - return data.length == 0; - } - - public int[] dimensions() { - return Arrays.copyOf(extent, extent.length); - } - - public Optional<A> get(int... 
ords) { - if (ords.length == extent.length) { - int a = 0; - for (int i = 0; i < extent.length; i++) { - int ii = ords[i]; - if (ii >= 0 && ii < extent[i]) { - a += ords[i] * _offsets[i]; - } else { - return Optional.empty(); - } - } - return Optional.of((A) data[a]); - } - return Optional.empty(); - } - - public String encode(Function<A, String> f, char delim) { - StringBuilder sb = new StringBuilder(); - if (extent.length == 0) return "{}"; - sb.append('{'); - encode_go(0, 0, sb); - sb.append('}'); - return sb.toString(); - } - - public String encode(Function<A, String> f) { - return encode(f, ','); - } - - private static void encode_appendEscaped(StringBuilder sb, String s) { - sb.append('"'); - s.chars() - .forEach( - c -> { - if (c == '"') sb.append("\\\""); - else if (c == '\\') sb.append("\\\\"); - else sb.append((char) c); - }); - sb.append('"'); - } - - private void encode_go(int offset, int ie, StringBuilder sb) { - boolean v = ie == extent.length - 1; - int o = _offsets[ie]; - for (int i = 0; i < extent[ie]; i++) { - if (i > 0) sb.append(','); - if (v) { - encode_appendEscaped(sb, data[offset + i].toString()); - } else { - sb.append('{'); - encode_go(offset + o * i, ie + 1, sb); - sb.append('}'); - } - } - } - - @Override - public String toString() { - return "Arr(" + encode(Object::toString, ',') + ")"; - } - - public boolean equals(Object o) { - if (o == this) return true; - if (!(o instanceof Arr<?> other)) return false; - return Arrays.equals(extent, other.extent) && Arrays.equals(data, other.data); - } - - public int hashCode() { - int result = 1; - result = result * 59 + Arrays.hashCode(extent); - result = result * 59 + Arrays.hashCode(data); - return result; - } - - public <B> B foldLeft(B b, BiFunction<B, A, B> f) { - B acc = b; - for (Object a : data) { - acc = f.apply(acc, (A) a); - } - return acc; - } - - public <B> Arr<B> map(Function<A, B> f) { - Object[] newData = new Object[data.length]; - for (int i = 0; i < data.length; i++) { - newData[i] = f.apply((A) data[i]); - } - return new 
Arr<>(newData, extent); - } - - public void forEach(Function<A, ?> f) { - for (Object a : data) { - f.apply((A) a); - } - } - - @SafeVarargs - public static <A> Arr<A> of(A... as) { - return new Arr<>(as, new int[] {as.length}); - } - - public static <A> Arr<A> ofCollection(Collection<A> as) { - return new Arr<>(as.toArray(), new int[] {as.size()}); - } - - public static <A> Arr<A> empty() { - return (Arr<A>) EMPTY; - } - - static int[] _offsets(int[] extent) { - int[] ret = new int[extent.length]; - int o = 1; - for (int i = extent.length - 1; i >= 0; i--) { - ret[i] = o; - o *= extent[i]; - } - - return ret; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Cidr.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Cidr.java deleted file mode 100644 index 34272f424a..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Cidr.java +++ /dev/null @@ -1,7 +0,0 @@ -package dev.typr.foundations.data; - -public record Cidr(String value) { - public String toString() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/HierarchyId.java b/foundations-jdbc/src/java/dev/typr/foundations/data/HierarchyId.java deleted file mode 100644 index fb446a8184..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/HierarchyId.java +++ /dev/null @@ -1,200 +0,0 @@ -package dev.typr.foundations.data; - -import java.util.ArrayList; -import java.util.List; - -/** - * Wrapper for SQL Server's HIERARCHYID type. Stores hierarchical path segments and provides - * conversion to/from the canonical string path format (e.g., "/1/2/3/") and binary representation. - */ -public record HierarchyId(List<Long> segments) { - - /** The root hierarchy node "/" */ - public static final HierarchyId ROOT = new HierarchyId(List.of()); - - public HierarchyId { - segments = List.copyOf(segments); - } - - /** - * Create a HierarchyId from binary representation (as returned by SQL Server). 
- * - * @param bytes The binary representation - * @return The HierarchyId instance - */ - public static HierarchyId fromBytes(byte[] bytes) { - if (bytes == null || bytes.length == 0) { - return ROOT; - } - return new HierarchyId(decodeBytes(bytes)); - } - - /** - * Parse a hierarchyid from its canonical string representation. - * - * @param path The path string like "/", "/1/", "/1/2/3/" - * @return The HierarchyId instance - */ - public static HierarchyId parse(String path) { - if (path == null || path.isEmpty() || path.equals("/")) { - return ROOT; - } - - List<Long> segments = new ArrayList<>(); - String[] parts = path.split("/"); - for (String part : parts) { - if (!part.isEmpty()) { - segments.add(Long.parseLong(part)); - } - } - - if (segments.isEmpty()) { - return ROOT; - } - - return new HierarchyId(segments); - } - - /** - * Convert this HierarchyId to its canonical string representation. - * - * @return The path string like "/", "/1/", "/1/2/3/" - */ - @Override - public String toString() { - if (segments.isEmpty()) { - return "/"; - } - - StringBuilder result = new StringBuilder("/"); - for (Long segment : segments) { - result.append(segment).append("/"); - } - return result.toString(); - } - - // ==================== Binary Decoding ==================== - - private static List<Long> decodeBytes(byte[] bytes) { - List<Long> result = new ArrayList<>(); - BitReader reader = new BitReader(bytes); - - while (reader.hasMore()) { - Long label = decodeLabel(reader); - if (label == null) { - break; - } - result.add(label); - } - - return result; - } - - private static Long decodeLabel(BitReader reader) { - // Count leading zeros to determine type - int zeros = 0; - while (reader.hasMore() && reader.peekBit() == 0) { - reader.readBit(); - zeros++; - if (zeros > 10) { - return null; // Too many zeros, probably padding - } - } - - if (!reader.hasMore()) { - return null; - } - - // Read the '1' that ends the prefix - int one = reader.readBit(); - if (one != 1) { - return null; - } 
- - // Decode based on type (number of zeros before the 1) - return switch (zeros) { - case 0 -> null; // Type O1: fake/fractional nodes (not common) - case 1 -> { // Type O2: 01 prefix, 2 value bits, values 0-3 - if (!reader.hasBits(3)) yield null; - long val = reader.readBits(2); - reader.readBit(); // terminator - yield val; - } - case 2 -> { // Type O3: 001 prefix, 2 value bits, values 4-7 - if (!reader.hasBits(3)) yield null; - long val = reader.readBits(2) + 4; - reader.readBit(); // terminator - yield val; - } - case 3 -> { // Type O4: 0001 prefix, 3 value bits, values 8-15 - if (!reader.hasBits(4)) yield null; - long val = reader.readBits(3) + 8; - reader.readBit(); // terminator - yield val; - } - case 4 -> { // Type O5: 00001 prefix, 6 value bits, values 16-79 - if (!reader.hasBits(7)) yield null; - long val = reader.readBits(6) + 16; - reader.readBit(); // terminator - yield val; - } - case 5 -> { // Type O6: 000001 prefix, 10 value bits, values 80-1103 - if (!reader.hasBits(11)) yield null; - long val = reader.readBits(10) + 80; - reader.readBit(); // terminator - yield val; - } - case 6 -> { // Type O7: 0000001 prefix, 14 value bits - if (!reader.hasBits(15)) yield null; - long val = reader.readBits(14) + 1104; - reader.readBit(); // terminator - yield val; - } - default -> null; - }; - } - - // Helper class for reading bits - private static class BitReader { - private final byte[] bytes; - private int bytePos = 0; - private int bitPos = 7; - - BitReader(byte[] bytes) { - this.bytes = bytes; - } - - boolean hasMore() { - return bytePos < bytes.length; - } - - boolean hasBits(int n) { - int totalBitsLeft = (bytes.length - bytePos) * 8 - (7 - bitPos); - return totalBitsLeft >= n; - } - - int peekBit() { - if (!hasMore()) return 0; - return (bytes[bytePos] >> bitPos) & 1; - } - - int readBit() { - if (!hasMore()) return 0; - int bit = (bytes[bytePos] >> bitPos) & 1; - bitPos--; - if (bitPos < 0) { - bytePos++; - bitPos = 7; - } - return bit; - } - - long 
readBits(int numBits) { - long value = 0; - for (int i = 0; i < numBits; i++) { - value = (value << 1) | readBit(); - } - return value; - } - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Inet.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Inet.java deleted file mode 100644 index 7f346e3474..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Inet.java +++ /dev/null @@ -1,7 +0,0 @@ -package dev.typr.foundations.data; - -public record Inet(String value) { - public String toString() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Int2Vector.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Int2Vector.java deleted file mode 100644 index d62d1f3365..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Int2Vector.java +++ /dev/null @@ -1,50 +0,0 @@ -package dev.typr.foundations.data; - -import java.util.Arrays; - -public record Int2Vector(short[] values) { - public static Int2Vector parse(String value) { - var values = value.split(" "); - var ret = new short[values.length]; - for (var i = 0; i < values.length; i++) { - ret[i] = Short.parseShort(values[i]); - } - return new Int2Vector(ret); - } - - @Override - public int hashCode() { - return Arrays.hashCode(values); - } - - @Override - public boolean equals(Object obj) { - if (obj instanceof Int2Vector(var values1)) { - if (values.length != values1.length) { - return false; - } - for (var i = 0; i < values.length; i++) { - if (values[i] != values1[i]) { - return false; - } - } - return true; - } - return false; - } - - public String value() { - var sb = new StringBuilder(); - for (var i = 0; i < values.length; i++) { - if (i > 0) { - sb.append(" "); - } - sb.append(values[i]); - } - return sb.toString(); - } - - public Int2Vector(String value) { - this(Int2Vector.parse(value).values); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Json.java 
b/foundations-jdbc/src/java/dev/typr/foundations/data/Json.java deleted file mode 100644 index 5055d98bdf..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Json.java +++ /dev/null @@ -1,3 +0,0 @@ -package dev.typr.foundations.data; - -public record Json(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/JsonParser.java b/foundations-jdbc/src/java/dev/typr/foundations/data/JsonParser.java deleted file mode 100644 index f9db3a9885..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/JsonParser.java +++ /dev/null @@ -1,214 +0,0 @@ -package dev.typr.foundations.data; - -import java.util.*; - -/** - * Simple JSON parser for parsing JSON strings into JsonValue ADT. Handles standard JSON format as - * produced by PostgreSQL and MariaDB. - */ -final class JsonParser { - private final String json; - private int pos; - - private JsonParser(String json) { - this.json = json; - this.pos = 0; - } - - static JsonValue parse(String json) { - if (json == null || json.isEmpty()) { - throw new IllegalArgumentException("Empty JSON string"); - } - JsonParser parser = new JsonParser(json); - JsonValue result = parser.parseValue(); - parser.skipWhitespace(); - if (parser.pos < parser.json.length()) { - throw new IllegalArgumentException( - "Unexpected content after JSON value at position " + parser.pos); - } - return result; - } - - private JsonValue parseValue() { - skipWhitespace(); - if (pos >= json.length()) { - throw new IllegalArgumentException("Unexpected end of JSON"); - } - char c = json.charAt(pos); - return switch (c) { - case 'n' -> parseNull(); - case 't', 'f' -> parseBool(); - case '"' -> parseString(); - case '[' -> parseArray(); - case '{' -> parseObject(); - case '-', '0', '1', '2', '3', '4', '5', '6', '7', '8', '9' -> parseNumber(); - default -> - throw new IllegalArgumentException("Unexpected character '" + c + "' at position " + pos); - }; - } - - private JsonValue.JNull parseNull() { - 
expect("null"); - return JsonValue.JNull.INSTANCE; - } - - private JsonValue.JBool parseBool() { - if (json.charAt(pos) == 't') { - expect("true"); - return JsonValue.JBool.TRUE; - } else { - expect("false"); - return JsonValue.JBool.FALSE; - } - } - - private JsonValue.JString parseString() { - expect("\""); - StringBuilder sb = new StringBuilder(); - while (pos < json.length()) { - char c = json.charAt(pos); - if (c == '"') { - pos++; - return new JsonValue.JString(sb.toString()); - } else if (c == '\\') { - pos++; - if (pos >= json.length()) { - throw new IllegalArgumentException("Unexpected end of string escape"); - } - char escaped = json.charAt(pos); - switch (escaped) { - case '"' -> sb.append('"'); - case '\\' -> sb.append('\\'); - case '/' -> sb.append('/'); - case 'b' -> sb.append('\b'); - case 'f' -> sb.append('\f'); - case 'n' -> sb.append('\n'); - case 'r' -> sb.append('\r'); - case 't' -> sb.append('\t'); - case 'u' -> { - if (pos + 4 >= json.length()) { - throw new IllegalArgumentException("Incomplete unicode escape"); - } - String hex = json.substring(pos + 1, pos + 5); - sb.append((char) Integer.parseInt(hex, 16)); - pos += 4; - } - default -> throw new IllegalArgumentException("Invalid escape character: " + escaped); - } - pos++; - } else { - sb.append(c); - pos++; - } - } - throw new IllegalArgumentException("Unterminated string"); - } - - private JsonValue.JNumber parseNumber() { - int start = pos; - if (json.charAt(pos) == '-') { - pos++; - } - // Integer part - also accept non-standard leading zeros (DB2 produces these for negative - // decimals) - while (pos < json.length() && Character.isDigit(json.charAt(pos))) { - pos++; - } - // Fractional part - if (pos < json.length() && json.charAt(pos) == '.') { - pos++; - while (pos < json.length() && Character.isDigit(json.charAt(pos))) { - pos++; - } - } - // Exponent - if (pos < json.length() && (json.charAt(pos) == 'e' || json.charAt(pos) == 'E')) { - pos++; - if (pos < json.length() && 
(json.charAt(pos) == '+' || json.charAt(pos) == '-')) { - pos++; - } - while (pos < json.length() && Character.isDigit(json.charAt(pos))) { - pos++; - } - } - String numStr = json.substring(start, pos); - return new JsonValue.JNumber(numStr); - } - - private JsonValue.JArray parseArray() { - expect("["); - skipWhitespace(); - if (pos < json.length() && json.charAt(pos) == ']') { - pos++; - return new JsonValue.JArray(List.of()); - } - List<JsonValue> values = new ArrayList<>(); - while (true) { - values.add(parseValue()); - skipWhitespace(); - if (pos >= json.length()) { - throw new IllegalArgumentException("Unterminated array"); - } - char c = json.charAt(pos); - if (c == ']') { - pos++; - return new JsonValue.JArray(values); - } else if (c == ',') { - pos++; - skipWhitespace(); - } else { - throw new IllegalArgumentException("Expected ',' or ']' in array at position " + pos); - } - } - } - - private JsonValue.JObject parseObject() { - expect("{"); - skipWhitespace(); - if (pos < json.length() && json.charAt(pos) == '}') { - pos++; - return new JsonValue.JObject(Map.of()); - } - Map<String, JsonValue> fields = new LinkedHashMap<>(); - while (true) { - skipWhitespace(); - if (pos >= json.length() || json.charAt(pos) != '"') { - throw new IllegalArgumentException("Expected string key in object at position " + pos); - } - String key = parseString().value(); - skipWhitespace(); - if (pos >= json.length() || json.charAt(pos) != ':') { - throw new IllegalArgumentException("Expected ':' after object key at position " + pos); - } - pos++; - JsonValue value = parseValue(); - fields.put(key, value); - skipWhitespace(); - if (pos >= json.length()) { - throw new IllegalArgumentException("Unterminated object"); - } - char c = json.charAt(pos); - if (c == '}') { - pos++; - return new JsonValue.JObject(fields); - } else if (c == ',') { - pos++; - } else { - throw new IllegalArgumentException("Expected ',' or '}' in object at position " + pos); - } - } - } - - private void skipWhitespace() { - while (pos < 
json.length() && Character.isWhitespace(json.charAt(pos))) { - pos++; - } - } - - private void expect(String s) { - if (!json.regionMatches(pos, s, 0, s.length())) { - throw new IllegalArgumentException("Expected '" + s + "' at position " + pos); - } - pos += s.length(); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/JsonValue.java b/foundations-jdbc/src/java/dev/typr/foundations/data/JsonValue.java deleted file mode 100644 index 009f5d0769..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/JsonValue.java +++ /dev/null @@ -1,154 +0,0 @@ -package dev.typr.foundations.data; - -import java.util.*; -import java.util.stream.Collectors; - -/** - * Simple JSON ADT for representing JSON values that can be produced/consumed by databases. This is - * used for type-safe JSON serialization/deserialization without external dependencies. - */ -public sealed interface JsonValue - permits JsonValue.JNull, - JsonValue.JBool, - JsonValue.JNumber, - JsonValue.JString, - JsonValue.JArray, - JsonValue.JObject { - - /** JSON null value */ - record JNull() implements JsonValue { - public static final JNull INSTANCE = new JNull(); - - @Override - public String encode() { - return "null"; - } - } - - /** JSON boolean value */ - record JBool(boolean value) implements JsonValue { - public static final JBool TRUE = new JBool(true); - public static final JBool FALSE = new JBool(false); - - public static JBool of(boolean value) { - return value ? TRUE : FALSE; - } - - @Override - public String encode() { - return value ? 
"true" : "false"; - } - } - - /** JSON number value (stored as String for precision) */ - record JNumber(String value) implements JsonValue { - public static JNumber of(long value) { - return new JNumber(String.valueOf(value)); - } - - public static JNumber of(double value) { - return new JNumber(String.valueOf(value)); - } - - public static JNumber of(String value) { - return new JNumber(value); - } - - @Override - public String encode() { - return value; - } - } - - /** JSON string value */ - record JString(String value) implements JsonValue { - public static JString of(String value) { - return new JString(value); - } - - @Override - public String encode() { - return encodeString(value); - } - } - - /** JSON array value */ - record JArray(List<JsonValue> values) implements JsonValue { - public JArray { - values = List.copyOf(values); - } - - public static JArray of(JsonValue... values) { - return new JArray(List.of(values)); - } - - public static JArray of(List<JsonValue> values) { - return new JArray(values); - } - - @Override - public String encode() { - return values.stream().map(JsonValue::encode).collect(Collectors.joining(",", "[", "]")); - } - } - - /** JSON object value */ - record JObject(Map<String, JsonValue> fields) implements JsonValue { - public JObject { - fields = new LinkedHashMap<>(fields); - } - - public static JObject of(Map<String, JsonValue> fields) { - return new JObject(fields); - } - - public static JObject empty() { - return new JObject(Map.of()); - } - - public JsonValue get(String key) { - return fields.get(key); - } - - @Override - public String encode() { - return fields.entrySet().stream() - .map(e -> encodeString(e.getKey()) + ":" + e.getValue().encode()) - .collect(Collectors.joining(",", "{", "}")); - } - } - - /** Encode this JSON value to a string */ - String encode(); - - /** Parse JSON from a string */ - static JsonValue parse(String json) { - return JsonParser.parse(json); - } - - // Helper for string encoding - private static String encodeString(String s) { - StringBuilder sb = new 
StringBuilder("\""); - for (int i = 0; i < s.length(); i++) { - char c = s.charAt(i); - switch (c) { - case '"' -> sb.append("\\\""); - case '\\' -> sb.append("\\\\"); - case '\b' -> sb.append("\\b"); - case '\f' -> sb.append("\\f"); - case '\n' -> sb.append("\\n"); - case '\r' -> sb.append("\\r"); - case '\t' -> sb.append("\\t"); - default -> { - if (c < 0x20) { - sb.append(String.format("\\u%04x", (int) c)); - } else { - sb.append(c); - } - } - } - } - sb.append("\""); - return sb.toString(); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Jsonb.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Jsonb.java deleted file mode 100644 index f8e7ca43eb..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Jsonb.java +++ /dev/null @@ -1,3 +0,0 @@ -package dev.typr.foundations.data; - -public record Jsonb(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/MacAddr.java b/foundations-jdbc/src/java/dev/typr/foundations/data/MacAddr.java deleted file mode 100644 index 1045f0860b..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/MacAddr.java +++ /dev/null @@ -1,7 +0,0 @@ -package dev.typr.foundations.data; - -public record MacAddr(String value) { - public String toString() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/MacAddr8.java b/foundations-jdbc/src/java/dev/typr/foundations/data/MacAddr8.java deleted file mode 100644 index d7e3fd2c10..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/MacAddr8.java +++ /dev/null @@ -1,7 +0,0 @@ -package dev.typr.foundations.data; - -public record MacAddr8(String value) { - public String toString() { - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Money.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Money.java deleted file mode 100644 index 5bdea5f256..0000000000 --- 
a/foundations-jdbc/src/java/dev/typr/foundations/data/Money.java +++ /dev/null @@ -1,7 +0,0 @@ -package dev.typr.foundations.data; - -public record Money(double value) { - public Money(String value) { - this(Double.parseDouble(value.replace("$", ""))); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Oid.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Oid.java deleted file mode 100644 index ad91db814b..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Oid.java +++ /dev/null @@ -1,23 +0,0 @@ -package dev.typr.foundations.data; - -/** - * Wrapper for PostgreSQL OID (Object Identifier) type. - * - *
<p>
OID is a 32-bit unsigned integer used internally by PostgreSQL as a primary key for various - * system tables. Since Java's int is signed, we use long to properly represent the full range of - * OID values (0 to 2^32-1). - */ -public record Oid(long value) { - public static Oid parse(String value) { - return new Oid(Long.parseLong(value)); - } - - public Oid(String value) { - this(Oid.parse(value).value); - } - - @Override - public String toString() { - return Long.toString(value); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/OidVector.java b/foundations-jdbc/src/java/dev/typr/foundations/data/OidVector.java deleted file mode 100644 index faacb58cdf..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/OidVector.java +++ /dev/null @@ -1,50 +0,0 @@ -package dev.typr.foundations.data; - -import java.util.Arrays; - -public record OidVector(int[] values) { - public static dev.typr.foundations.data.OidVector parse(String value) { - var values = value.split(" "); - var ret = new int[values.length]; - for (var i = 0; i < values.length; i++) { - ret[i] = Integer.parseInt(values[i]); - } - return new dev.typr.foundations.data.OidVector(ret); - } - - @Override - public int hashCode() { - return Arrays.hashCode(values); - } - - @Override - public boolean equals(Object obj) { - if (obj instanceof dev.typr.foundations.data.OidVector other) { - if (values.length != other.values.length) { - return false; - } - for (var i = 0; i < values.length; i++) { - if (values[i] != other.values[i]) { - return false; - } - } - return true; - } - return false; - } - - public String value() { - var sb = new StringBuilder(); - for (var i = 0; i < values.length; i++) { - if (i > 0) { - sb.append(" "); - } - sb.append(values[i]); - } - return sb.toString(); - } - - public OidVector(String value) { - this(dev.typr.foundations.data.OidVector.parse(value).values); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/OracleIntervalDS.java 
b/foundations-jdbc/src/java/dev/typr/foundations/data/OracleIntervalDS.java deleted file mode 100644 index 82fd2629bb..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/OracleIntervalDS.java +++ /dev/null @@ -1,192 +0,0 @@ -package dev.typr.foundations.data; - -import java.time.Duration; -import java.util.regex.Matcher; -import java.util.regex.Pattern; - -/** - * Represents Oracle's INTERVAL DAY TO SECOND type. - * - *
<p>
Can parse both Oracle's native format (+03 14:30:45.123456) and ISO-8601 duration format - * (P3DT14H30M45.123456S). This handles the fact that Oracle returns ISO-8601 format in JSON but - * native format in columns. - */ -public record OracleIntervalDS(int days, int hours, int minutes, int seconds, int nanos) { - - // Oracle format: +03 14:30:45.123456, -01 00:00:00.000000, +00 00:00:00.000000 - private static final Pattern ORACLE_FORMAT = - Pattern.compile("([+-]?)(\\d+)\\s+(\\d+):(\\d+):(\\d+)(?:\\.(\\d+))?"); - - // ISO-8601 format: P3DT14H30M45.123456S, -P1D, P0D - private static final Pattern ISO_FORMAT = - Pattern.compile("(-?)P(\\d+)?D?(?:T(\\d+)?H?(\\d+)?M?(\\d+)?(?:\\.(\\d+))?S?)?"); - - public OracleIntervalDS { - // All fields must have the same sign - boolean hasNegative = days < 0 || hours < 0 || minutes < 0 || seconds < 0 || nanos < 0; - boolean hasPositive = days > 0 || hours > 0 || minutes > 0 || seconds > 0 || nanos > 0; - if (hasNegative && hasPositive) { - throw new IllegalArgumentException("All fields must have the same sign"); - } - // Validate ranges (absolute values) - if (Math.abs(hours) > 23) { - throw new IllegalArgumentException("Hours must be between 0 and 23, got: " + Math.abs(hours)); - } - if (Math.abs(minutes) > 59) { - throw new IllegalArgumentException( - "Minutes must be between 0 and 59, got: " + Math.abs(minutes)); - } - if (Math.abs(seconds) > 59) { - throw new IllegalArgumentException( - "Seconds must be between 0 and 59, got: " + Math.abs(seconds)); - } - if (Math.abs(nanos) > 999_999_999) { - throw new IllegalArgumentException( - "Nanos must be between 0 and 999999999, got: " + Math.abs(nanos)); - } - } - - /** - * Parse from either Oracle format (+03 14:30:45.123456) or ISO-8601 format (P3DT14H30M45.123456S) - */ - public static OracleIntervalDS parse(String s) { - if (s == null || s.isEmpty()) { - throw new IllegalArgumentException("Cannot parse null or empty interval"); - } - - // Try Oracle format first - Matcher 
oracleMatcher = ORACLE_FORMAT.matcher(s); - if (oracleMatcher.matches()) { - String sign = oracleMatcher.group(1); - int days = Integer.parseInt(oracleMatcher.group(2)); - int hours = Integer.parseInt(oracleMatcher.group(3)); - int minutes = Integer.parseInt(oracleMatcher.group(4)); - int seconds = Integer.parseInt(oracleMatcher.group(5)); - String fracStr = oracleMatcher.group(6); - int nanos = fracStr != null ? parseFractionalSeconds(fracStr) : 0; - - boolean negative = "-".equals(sign); - return new OracleIntervalDS( - negative ? -days : days, - negative ? -hours : hours, - negative ? -minutes : minutes, - negative ? -seconds : seconds, - negative ? -nanos : nanos); - } - - // Try ISO-8601 format - Matcher isoMatcher = ISO_FORMAT.matcher(s); - if (isoMatcher.matches()) { - String sign = isoMatcher.group(1); - String daysStr = isoMatcher.group(2); - String hoursStr = isoMatcher.group(3); - String minutesStr = isoMatcher.group(4); - String secondsStr = isoMatcher.group(5); - String fracStr = isoMatcher.group(6); - - int days = daysStr != null ? Integer.parseInt(daysStr) : 0; - int hours = hoursStr != null ? Integer.parseInt(hoursStr) : 0; - int minutes = minutesStr != null ? Integer.parseInt(minutesStr) : 0; - int seconds = secondsStr != null ? Integer.parseInt(secondsStr) : 0; - int nanos = fracStr != null ? parseFractionalSeconds(fracStr) : 0; - - boolean negative = "-".equals(sign); - return new OracleIntervalDS( - negative ? -days : days, - negative ? -hours : hours, - negative ? -minutes : minutes, - negative ? -seconds : seconds, - negative ? 
-nanos : nanos); - } - - throw new IllegalArgumentException( - "Cannot parse interval: " - + s - + " (expected format: +03 14:30:45.123456 or P3DT14H30M45.123456S)"); - } - - private static int parseFractionalSeconds(String fracStr) { - // Pad or truncate to 9 digits (nanoseconds) - if (fracStr.length() < 9) { - fracStr = String.format("%-9s", fracStr).replace(' ', '0'); - } else if (fracStr.length() > 9) { - fracStr = fracStr.substring(0, 9); - } - return Integer.parseInt(fracStr); - } - - /** Convert to Oracle's native format: +03 14:30:45.123456 */ - public String toOracleFormat() { - boolean negative = days < 0 || hours < 0 || minutes < 0 || seconds < 0 || nanos < 0; - String sign = negative ? "-" : "+"; - String fracStr = String.format("%09d", Math.abs(nanos)).substring(0, 6); // Oracle uses 6 digits - - return String.format( - "%s%02d %02d:%02d:%02d.%s", - sign, Math.abs(days), Math.abs(hours), Math.abs(minutes), Math.abs(seconds), fracStr); - } - - /** Convert to ISO-8601 duration format: P3DT14H30M45.123456S */ - public String toIso8601() { - if (days == 0 && hours == 0 && minutes == 0 && seconds == 0 && nanos == 0) { - return "P0D"; - } - - boolean negative = days < 0 || hours < 0 || minutes < 0 || seconds < 0 || nanos < 0; - StringBuilder sb = new StringBuilder(); - if (negative) { - sb.append("-"); - } - sb.append("P"); - - if (days != 0) { - sb.append(Math.abs(days)).append("D"); - } - - if (hours != 0 || minutes != 0 || seconds != 0 || nanos != 0) { - sb.append("T"); - if (hours != 0) { - sb.append(Math.abs(hours)).append("H"); - } - if (minutes != 0) { - sb.append(Math.abs(minutes)).append("M"); - } - if (seconds != 0 || nanos != 0) { - sb.append(Math.abs(seconds)); - if (nanos != 0) { - String fracStr = String.format("%09d", Math.abs(nanos)).replaceAll("0+$", ""); - sb.append(".").append(fracStr); - } - sb.append("S"); - } - } - - return sb.toString(); - } - - /** Convert to java.time.Duration */ - public Duration toDuration() { - long totalSeconds = 
- ((long) days * 24 * 60 * 60) + ((long) hours * 60 * 60) + ((long) minutes * 60) + seconds; - return Duration.ofSeconds(totalSeconds, nanos); - } - - /** Create from java.time.Duration */ - public static OracleIntervalDS fromDuration(Duration duration) { - long totalSeconds = duration.getSeconds(); - int days = (int) (totalSeconds / (24 * 60 * 60)); - totalSeconds -= (long) days * 24 * 60 * 60; - int hours = (int) (totalSeconds / (60 * 60)); - totalSeconds -= (long) hours * 60 * 60; - int minutes = (int) (totalSeconds / 60); - int seconds = (int) (totalSeconds % 60); - int nanos = duration.getNano(); - - return new OracleIntervalDS(days, hours, minutes, seconds, nanos); - } - - @Override - public String toString() { - return toOracleFormat(); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/OracleIntervalYM.java b/foundations-jdbc/src/java/dev/typr/foundations/data/OracleIntervalYM.java deleted file mode 100644 index e5461f2f5b..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/OracleIntervalYM.java +++ /dev/null @@ -1,117 +0,0 @@ -package dev.typr.foundations.data; - -import java.time.Period; -import java.util.regex.Matcher; -import java.util.regex.Pattern; - -/** - * Represents Oracle's INTERVAL YEAR TO MONTH type. - * - *
<p>
Can parse both Oracle's native format (+02-05, -01-06) and ISO-8601 duration format (P2Y5M, - * -P1Y6M). This handles the fact that Oracle returns ISO-8601 format in JSON but native format in - * columns. - */ -public record OracleIntervalYM(int years, int months) { - - // Oracle format: +02-05, -01-06, +00-00 - private static final Pattern ORACLE_FORMAT = Pattern.compile("([+-]?)(\\d+)-(\\d+)"); - - // ISO-8601 format: P2Y5M, -P1Y6M, P0Y - private static final Pattern ISO_FORMAT = Pattern.compile("(-?)P(\\d+)Y(\\d+)?M?"); - - public OracleIntervalYM { - // Both years and months must have the same sign - if ((years < 0 && months > 0) || (years > 0 && months < 0)) { - throw new IllegalArgumentException( - "Years and months must have the same sign, got years=" + years + ", months=" + months); - } - // Months must be in range 0-11 (absolute value) - if (Math.abs(months) > 11) { - throw new IllegalArgumentException( - "Months must be between 0 and 11, got: " + Math.abs(months)); - } - } - - /** Parse from either Oracle format (+02-05) or ISO-8601 format (P2Y5M) */ - public static OracleIntervalYM parse(String s) { - if (s == null || s.isEmpty()) { - throw new IllegalArgumentException("Cannot parse null or empty interval"); - } - - // Try Oracle format first - Matcher oracleMatcher = ORACLE_FORMAT.matcher(s); - if (oracleMatcher.matches()) { - String sign = oracleMatcher.group(1); - int years = Integer.parseInt(oracleMatcher.group(2)); - int months = Integer.parseInt(oracleMatcher.group(3)); - - if ("-".equals(sign)) { - years = -years; - months = -months; - } - - return new OracleIntervalYM(years, months); - } - - // Try ISO-8601 format - Matcher isoMatcher = ISO_FORMAT.matcher(s); - if (isoMatcher.matches()) { - String sign = isoMatcher.group(1); - int years = Integer.parseInt(isoMatcher.group(2)); - String monthsStr = isoMatcher.group(3); - int months = monthsStr != null ? 
Integer.parseInt(monthsStr) : 0; - - if ("-".equals(sign)) { - years = -years; - months = -months; - } - - return new OracleIntervalYM(years, months); - } - - throw new IllegalArgumentException( - "Cannot parse interval: " + s + " (expected format: +02-05 or P2Y5M)"); - } - - /** Convert to Oracle's native format: +02-05 */ - public String toOracleFormat() { - if (years < 0 || months < 0) { - return String.format("-%02d-%02d", Math.abs(years), Math.abs(months)); - } else { - return String.format("+%02d-%02d", years, months); - } - } - - /** Convert to ISO-8601 duration format: P2Y5M */ - public String toIso8601() { - if (years == 0 && months == 0) { - return "P0Y"; - } - - StringBuilder sb = new StringBuilder(); - if (years < 0 || months < 0) { - sb.append("-"); - } - sb.append("P"); - sb.append(Math.abs(years)).append("Y"); - if (months != 0) { - sb.append(Math.abs(months)).append("M"); - } - return sb.toString(); - } - - /** Convert to java.time.Period */ - public Period toPeriod() { - return Period.of(years, months, 0); - } - - /** Create from java.time.Period (ignoring days) */ - public static OracleIntervalYM fromPeriod(Period period) { - return new OracleIntervalYM(period.getYears(), period.getMonths()); - } - - @Override - public String toString() { - return toOracleFormat(); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/PgName.java b/foundations-jdbc/src/java/dev/typr/foundations/data/PgName.java deleted file mode 100644 index a45a9ae457..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/PgName.java +++ /dev/null @@ -1,5 +0,0 @@ -package dev.typr.foundations.data; - -// PostgreSQL `name` type - internal identifier type (max 63 bytes) -// Used for database object names: table names, column names, etc. 
-public record PgName(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/PgNodeTree.java b/foundations-jdbc/src/java/dev/typr/foundations/data/PgNodeTree.java deleted file mode 100644 index d29b6b8c95..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/PgNodeTree.java +++ /dev/null @@ -1,19 +0,0 @@ -package dev.typr.foundations.data; - -/** - * pg_node_tree stores PostgreSQL's internal parse tree representation. - * - *
<p>
This type represents PostgreSQL's nodeToString() output format, which is used internally to - * store parsed SQL expressions, view definitions, default values, check constraints, etc. in the - * system catalogs. - * - *
<p>
The format consists of nested nodes with the structure: - Nodes: {NODETYPE :field1 value1 - * :field2 value2 ...} - Lists: (item1 item2 item3) - Empty values: <> - * - *
<p>
Example: {QUERY :commandType 1 :querySource 0 :canSetTag true :utilityStmt <>} - * - *
<p>
Note: This is a PostgreSQL internal format that may change between versions. Direct - * manipulation is not recommended. Use pg_get_expr() and similar functions when possible to work - * with the parsed representation. - */ -public record PgNodeTree(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Range.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Range.java deleted file mode 100644 index 3b72375f1b..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Range.java +++ /dev/null @@ -1,202 +0,0 @@ -package dev.typr.foundations.data; - -import java.math.BigDecimal; -import java.time.Instant; -import java.time.LocalDate; -import java.time.LocalDateTime; -import java.util.Optional; -import java.util.function.BiFunction; -import java.util.function.UnaryOperator; - -/** - * PostgreSQL range type - represents a range of values. Ranges can be either empty (containing no - * values) or have bounds. - * - *
<p>
Use the typed factory methods to create ranges. Discrete types (integers, dates) are - * automatically normalized to PostgreSQL's canonical [) form. - */ -public sealed interface Range> permits Range.Empty, Range.NonEmpty { - - /** An empty range - contains no values. */ - record Empty>() implements Range { - @Override - public boolean isEmpty() { - return true; - } - - @Override - public boolean contains(T value) { - return false; - } - - @Override - public Optional> finite() { - return Optional.empty(); - } - - @Override - public String toString() { - return "empty"; - } - } - - /** A non-empty range with lower and upper bounds. */ - record NonEmpty>(RangeBound from, RangeBound to) - implements Range { - @Override - public boolean isEmpty() { - return false; - } - - @Override - public Optional> finite() { - if (from instanceof RangeBound.Finite && to instanceof RangeBound.Finite) { - return Optional.of( - new RangeFinite<>((RangeBound.Finite) from, (RangeBound.Finite) to)); - } - return Optional.empty(); - } - - @Override - public String toString() { - var left = - switch (from) { - case RangeBound.Infinite x -> "("; - case RangeBound.Finite.Open x -> "(" + x.value(); - case RangeBound.Finite.Closed x -> "[" + x.value(); - }; - var right = - switch (to) { - case RangeBound.Infinite x -> ")"; - case RangeBound.Finite.Open x -> x.value() + ")"; - case RangeBound.Finite.Closed x -> x.value() + "]"; - }; - return left + "," + right; - } - - @Override - public boolean contains(T value) { - var withinRangeLeft = - switch (from) { - case RangeBound.Infinite x -> true; - case RangeBound.Finite.Open x -> value.compareTo(x.value()) > 0; - case RangeBound.Finite.Closed x -> value.compareTo(x.value()) >= 0; - }; - var withRangeRight = - switch (to) { - case RangeBound.Infinite x -> true; - case RangeBound.Finite.Open x -> value.compareTo(x.value()) < 0; - case RangeBound.Finite.Closed x -> value.compareTo(x.value()) <= 0; - }; - return withinRangeLeft && withRangeRight; - } 
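The `contains` check above dispatches on a sealed bound hierarchy with exhaustive pattern switches. A self-contained sketch of the same technique (Java 21+; the names here are illustrative, not the published foundations API):

```java
// Sealed bound hierarchy + exhaustive pattern switches, mirroring the
// deleted Range/RangeBound design. Illustrative names only.
sealed interface Bound<T extends Comparable<T>> {
  record Unbounded<T extends Comparable<T>>() implements Bound<T> {}
  record Open<T extends Comparable<T>>(T value) implements Bound<T> {}
  record Closed<T extends Comparable<T>>(T value) implements Bound<T> {}
}

final class BoundCheck {
  private BoundCheck() {}

  /** true when value satisfies the lower bound */
  static <T extends Comparable<T>> boolean aboveLower(Bound<T> from, T value) {
    return switch (from) {
      case Bound.Unbounded<T> u -> true;                          // no lower limit
      case Bound.Open<T> o -> value.compareTo(o.value()) > 0;     // exclusive
      case Bound.Closed<T> c -> value.compareTo(c.value()) >= 0;  // inclusive
    };
  }

  /** true when value satisfies the upper bound */
  static <T extends Comparable<T>> boolean belowUpper(Bound<T> to, T value) {
    return switch (to) {
      case Bound.Unbounded<T> u -> true;
      case Bound.Open<T> o -> value.compareTo(o.value()) < 0;
      case Bound.Closed<T> c -> value.compareTo(c.value()) <= 0;
    };
  }

  static <T extends Comparable<T>> boolean contains(Bound<T> from, Bound<T> to, T value) {
    return aboveLower(from, value) && belowUpper(to, value);
  }
}
```

Because `Bound` is sealed and all three cases live in the same file, the compiler verifies each switch is exhaustive; adding a fourth bound kind becomes a compile error at every switch site.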
- } - - boolean isEmpty(); - - boolean contains(T value); - - Optional> finite(); - - // ==================== Factory methods ==================== - - /** Empty range - contains no values */ - static > Range empty() { - return new Empty<>(); - } - - // ----- Discrete types (normalized to canonical [) form) ----- - - /** int4range - normalized to [) form */ - static Range int4(RangeBound from, RangeBound to) { - return normalized(from, to, i -> i + 1); - } - - /** int8range - normalized to [) form */ - static Range int8(RangeBound from, RangeBound to) { - return normalized(from, to, i -> i + 1); - } - - /** daterange - normalized to [) form */ - static Range date(RangeBound from, RangeBound to) { - return normalized(from, to, d -> d.plusDays(1)); - } - - // ----- Continuous types (no normalization) ----- - - /** numrange - not normalized (continuous type) */ - static Range numeric(RangeBound from, RangeBound to) { - return new NonEmpty<>(from, to); - } - - /** tsrange - not normalized (continuous type) */ - static Range timestamp( - RangeBound from, RangeBound to) { - return new NonEmpty<>(from, to); - } - - /** tstzrange - not normalized (continuous type) */ - static Range timestamptz(RangeBound from, RangeBound to) { - return new NonEmpty<>(from, to); - } - - // ==================== Factory function references for parser ==================== - - /** Factory for int4range */ - BiFunction, RangeBound, Range> INT4 = Range::int4; - - /** Factory for int8range */ - BiFunction, RangeBound, Range> INT8 = Range::int8; - - /** Factory for daterange */ - BiFunction, RangeBound, Range> DATE = Range::date; - - /** Factory for numrange */ - BiFunction, RangeBound, Range> NUMERIC = - Range::numeric; - - /** Factory for tsrange */ - BiFunction, RangeBound, Range> TIMESTAMP = - Range::timestamp; - - /** Factory for tstzrange */ - BiFunction, RangeBound, Range> TIMESTAMPTZ = - Range::timestamptz; - - // ==================== Internal helpers ==================== - - /** - * 
Normalize a discrete range to PostgreSQL canonical form [). - (a,b) -> [a+1,b) - (a,b] -> - * [a+1,b+1) - [a,b] -> [a,b+1) - [a,b) -> [a,b) (already canonical) - */ - private static > Range normalized( - RangeBound from, RangeBound to, UnaryOperator step) { - // Normalize lower bound: (a -> [a+1 - RangeBound normalizedFrom = - switch (from) { - case RangeBound.Infinite i -> i; - case RangeBound.Finite.Closed c -> c; // already canonical - case RangeBound.Finite.Open o -> new RangeBound.Closed<>(step.apply(o.value())); - }; - - // Normalize upper bound: b] -> b+1) - RangeBound normalizedTo = - switch (to) { - case RangeBound.Infinite i -> i; - case RangeBound.Finite.Open o -> o; // already canonical - case RangeBound.Finite.Closed c -> new RangeBound.Open<>(step.apply(c.value())); - }; - - // Check for empty range: if lower >= upper after normalization, it's empty - if (normalizedFrom instanceof RangeBound.Finite finiteFrom - && normalizedTo instanceof RangeBound.Finite finiteTo) { - T fromVal = finiteFrom.value(); - T toVal = finiteTo.value(); - if (fromVal.compareTo(toVal) >= 0) { - return empty(); - } - } - - return new NonEmpty<>(normalizedFrom, normalizedTo); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/RangeBound.java b/foundations-jdbc/src/java/dev/typr/foundations/data/RangeBound.java deleted file mode 100644 index 55512e2fb7..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/RangeBound.java +++ /dev/null @@ -1,20 +0,0 @@ -package dev.typr.foundations.data; - -public sealed interface RangeBound permits RangeBound.Infinite, RangeBound.Finite { - RangeBound infinite = new Infinite<>(); - - @SuppressWarnings("unchecked") - static RangeBound infinite() { - return (RangeBound) infinite; - } - - final class Infinite implements RangeBound {} - - sealed interface Finite extends RangeBound permits Open, Closed { - T value(); - } - - record Open(T value) implements Finite {} - - record Closed(T value) implements Finite {} 
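The `normalized` helper above canonicalises discrete ranges the way PostgreSQL does. The bound arithmetic can be checked in isolation; this is a hypothetical reduction to plain ints, not the foundations code itself:

```java
// Sketch of discrete-range canonicalisation to PostgreSQL's [) form:
//   (a,b) -> [a+1,b),  (a,b] -> [a+1,b+1),  [a,b] -> [a,b+1),  [a,b) unchanged.
// Illustrative only: bounds are ints, result is {lo, hi} or null when empty.
final class Canonical {
  private Canonical() {}

  /** Returns {lo, hi} in canonical [lo, hi) form, or null for an empty range. */
  static int[] canonicalize(int lo, boolean loClosed, int hi, boolean hiClosed) {
    int newLo = loClosed ? lo : lo + 1;  // (a -> [a+1
    int newHi = hiClosed ? hi + 1 : hi;  // b] -> b+1)
    return newLo >= newHi ? null : new int[] {newLo, newHi};
  }
}
```

The empty check at the end matches the `fromVal.compareTo(toVal) >= 0` test in the helper: after normalisation, a range whose lower bound meets or passes its upper bound contains nothing.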
-} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/RangeFinite.java b/foundations-jdbc/src/java/dev/typr/foundations/data/RangeFinite.java deleted file mode 100644 index a34e93f63e..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/RangeFinite.java +++ /dev/null @@ -1,22 +0,0 @@ -package dev.typr.foundations.data; - -public record RangeFinite>( - RangeBound.Finite from, RangeBound.Finite to) { - public Range asRange() { - return new Range.NonEmpty<>(from, to); - } - - public boolean contains(T value) { - var withinRangeLeft = - switch (from) { - case RangeBound.Finite.Open x -> value.compareTo(x.value()) > 0; - case RangeBound.Finite.Closed x -> value.compareTo(x.value()) >= 0; - }; - var withRangeRight = - switch (to) { - case RangeBound.Finite.Open x -> value.compareTo(x.value()) < 0; - case RangeBound.Finite.Closed x -> value.compareTo(x.value()) <= 0; - }; - return withinRangeLeft && withRangeRight; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/RangeParser.java b/foundations-jdbc/src/java/dev/typr/foundations/data/RangeParser.java deleted file mode 100644 index b8fb8567ba..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/RangeParser.java +++ /dev/null @@ -1,255 +0,0 @@ -package dev.typr.foundations.data; - -import dev.typr.foundations.SqlFunction; -import java.math.BigDecimal; -import java.time.*; -import java.time.format.DateTimeFormatter; -import java.time.format.DateTimeFormatterBuilder; -import java.time.temporal.ChronoField; -import java.util.function.BiFunction; - -/** - * Parser for PostgreSQL range string format. - * - *
<p>
PostgreSQL ranges are represented as strings like: - * - *
<ul>
- *   <li>{@code [1,10]} - closed on both ends
- *   <li>{@code (1,10)} - open on both ends
- *   <li>{@code [1,10)} - closed-open (canonical for integers)
- *   <li>{@code (1,10]} - open-closed
- *   <li>{@code [,10)} or {@code (,10)} - lower unbounded (infinite)
- *   <li>{@code [1,)} or {@code [1,]} - upper unbounded (infinite)
- *   <li>{@code empty} - empty range
- * </ul>
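The bracket/comma grammar listed above is small enough to exercise with a simplified stand-in parser. This sketch handles unquoted integer bounds only; the real RangeParser additionally handles quoted values, timestamps, and pluggable value parsers:

```java
// Simplified stand-in for RangeParser: parses int4-style range strings
// such as "[1,10)", "(,5)", "[3,)" and "empty". Quoting is not handled.
final class MiniRangeParser {
  private MiniRangeParser() {}

  /** Parses to {lo, hi} in canonical [) form; null element = unbounded, null result = empty. */
  static Integer[] parse(String s) {
    String t = s.trim();
    if (t.equals("empty")) return null;
    char l = t.charAt(0), r = t.charAt(t.length() - 1);
    if ((l != '[' && l != '(') || (r != ']' && r != ')'))
      throw new IllegalArgumentException("bad brackets: " + s);
    String body = t.substring(1, t.length() - 1);
    int comma = body.indexOf(',');
    if (comma < 0) throw new IllegalArgumentException("no comma: " + s);
    String ls = body.substring(0, comma).trim();
    String rs = body.substring(comma + 1).trim();
    Integer lo = ls.isEmpty() ? null : Integer.valueOf(ls);
    Integer hi = rs.isEmpty() ? null : Integer.valueOf(rs);
    // Normalise to canonical [) form, as PostgreSQL does for discrete types
    if (lo != null && l == '(') lo = lo + 1;
    if (hi != null && r == ']') hi = hi + 1;
    return new Integer[] {lo, hi};
  }
}
```

An empty bound string maps to the unbounded case, matching the `RangeBound.infinite()` branch in the full parser.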
- */ -public final class RangeParser { - - private RangeParser() {} - - // Date/time formatters for PostgreSQL range format (space-separated, not ISO 'T') - private static final DateTimeFormatter TIMESTAMP_FORMATTER = - new DateTimeFormatterBuilder() - .appendPattern("yyyy-MM-dd HH:mm:ss") - .appendFraction(ChronoField.MICRO_OF_SECOND, 0, 6, true) - .toFormatter(); - - private static final DateTimeFormatter TIMESTAMPTZ_FORMATTER = - new DateTimeFormatterBuilder() - .appendPattern("yyyy-MM-dd HH:mm:ss") - .appendFraction(ChronoField.MICRO_OF_SECOND, 0, 6, true) - .appendOffset("+HH:mm", "+00:00") - .toFormatter(); - - private static final DateTimeFormatter TIMESTAMPTZ_SHORT_OFFSET_FORMATTER = - new DateTimeFormatterBuilder() - .appendPattern("yyyy-MM-dd HH:mm:ss") - .appendFraction(ChronoField.MICRO_OF_SECOND, 0, 6, true) - .appendOffset("+HH", "+00") - .toFormatter(); - - // Pre-built value parsers for common range types - public static final SqlFunction INT4_PARSER = Integer::parseInt; - public static final SqlFunction INT8_PARSER = Long::parseLong; - public static final SqlFunction NUMERIC_PARSER = BigDecimal::new; - public static final SqlFunction DATE_PARSER = LocalDate::parse; - public static final SqlFunction TIMESTAMP_PARSER = - RangeParser::parseLocalDateTime; - public static final SqlFunction TIMESTAMPTZ_PARSER = RangeParser::parseInstant; - - /** - * Parse a PostgreSQL range string into a Range object. - * - * @param input the range string from PostgreSQL - * @param valueParser parser for the element type - * @param rangeFactory factory to create the Range (use Range.INT4, Range.DATE, etc.) 
- */ - public static > Range parse( - String input, - SqlFunction valueParser, - BiFunction, RangeBound, Range> rangeFactory) - throws java.sql.SQLException { - if (input == null || input.isEmpty()) { - throw new java.sql.SQLException("Cannot parse null or empty range string"); - } - - String trimmed = input.trim(); - - // Handle empty range - if (trimmed.equals("empty")) { - return Range.empty(); - } - - if (trimmed.length() < 3) { - throw new java.sql.SQLException("Invalid range format: " + input); - } - - char leftBracket = trimmed.charAt(0); - char rightBracket = trimmed.charAt(trimmed.length() - 1); - - boolean leftClosed = leftBracket == '['; - boolean rightClosed = rightBracket == ']'; - - if ((leftBracket != '[' && leftBracket != '(') - || (rightBracket != ']' && rightBracket != ')')) { - throw new java.sql.SQLException("Invalid range brackets: " + input); - } - - // Extract the content between brackets - String content = trimmed.substring(1, trimmed.length() - 1); - - // Find the comma separator - need to handle quoted values - int commaIndex = findComma(content); - if (commaIndex == -1) { - throw new java.sql.SQLException("Invalid range format, no comma found: " + input); - } - - String leftStr = content.substring(0, commaIndex).trim(); - String rightStr = content.substring(commaIndex + 1).trim(); - - RangeBound leftBound; - RangeBound rightBound; - - // Parse left bound - if (leftStr.isEmpty()) { - leftBound = RangeBound.infinite(); - } else { - T leftValue = valueParser.apply(unquote(leftStr)); - leftBound = - leftClosed ? new RangeBound.Closed<>(leftValue) : new RangeBound.Open<>(leftValue); - } - - // Parse right bound - if (rightStr.isEmpty()) { - rightBound = RangeBound.infinite(); - } else { - T rightValue = valueParser.apply(unquote(rightStr)); - rightBound = - rightClosed ? 
new RangeBound.Closed<>(rightValue) : new RangeBound.Open<>(rightValue); - } - - return rangeFactory.apply(leftBound, rightBound); - } - - /** Format a range to PostgreSQL string format. */ - public static > String format(Range range) { - return switch (range) { - case Range.Empty e -> "empty"; - case Range.NonEmpty r -> formatNonEmpty(r); - }; - } - - /** - * Parse a timestamp string from PostgreSQL range format. Handles both ISO format (with 'T') and - * PostgreSQL format (with space). - */ - public static LocalDateTime parseLocalDateTime(String value) throws java.sql.SQLException { - try { - if (value.contains("T")) { - return LocalDateTime.parse(value); - } else { - return LocalDateTime.parse(value, TIMESTAMP_FORMATTER); - } - } catch (java.time.format.DateTimeParseException e) { - throw new java.sql.SQLException("Failed to parse timestamp: " + value, e); - } - } - - /** Parse a timestamptz string from PostgreSQL range format. Handles various offset formats. */ - public static Instant parseInstant(String value) throws java.sql.SQLException { - try { - if (value.endsWith("Z")) { - return Instant.parse(value); - } - // Try ISO format first - try { - return OffsetDateTime.parse(value, DateTimeFormatter.ISO_OFFSET_DATE_TIME).toInstant(); - } catch (java.time.format.DateTimeParseException e) { - // Try space-separated format with full offset - try { - return OffsetDateTime.parse(value, TIMESTAMPTZ_FORMATTER).toInstant(); - } catch (java.time.format.DateTimeParseException e2) { - // Try short offset format (+00) - return OffsetDateTime.parse(value, TIMESTAMPTZ_SHORT_OFFSET_FORMATTER).toInstant(); - } - } - } catch (java.time.format.DateTimeParseException e) { - throw new java.sql.SQLException("Failed to parse timestamptz: " + value, e); - } - } - - private static > String formatNonEmpty(Range.NonEmpty range) { - StringBuilder sb = new StringBuilder(); - - // Left bracket - switch (range.from()) { - case RangeBound.Infinite i -> sb.append('('); - case 
RangeBound.Finite.Closed c -> sb.append('['); - case RangeBound.Finite.Open o -> sb.append('('); - } - - // Left value - switch (range.from()) { - case RangeBound.Infinite i -> {} - case RangeBound.Finite f -> sb.append(quote(f.value().toString())); - } - - sb.append(','); - - // Right value - switch (range.to()) { - case RangeBound.Infinite i -> {} - case RangeBound.Finite f -> sb.append(quote(f.value().toString())); - } - - // Right bracket - switch (range.to()) { - case RangeBound.Infinite i -> sb.append(')'); - case RangeBound.Finite.Closed c -> sb.append(']'); - case RangeBound.Finite.Open o -> sb.append(')'); - } - - return sb.toString(); - } - - /** - * Find the comma separator, handling quoted values. PostgreSQL quotes values containing special - * characters like commas. - */ - private static int findComma(String content) { - boolean inQuote = false; - for (int i = 0; i < content.length(); i++) { - char c = content.charAt(i); - if (c == '"') { - inQuote = !inQuote; - } else if (c == ',' && !inQuote) { - return i; - } - } - return -1; - } - - /** Remove quotes from a value if present and unescape. */ - private static String unquote(String value) { - if (value.startsWith("\"") && value.endsWith("\"")) { - String inner = value.substring(1, value.length() - 1); - // Unescape doubled quotes - return inner.replace("\"\"", "\""); - } - return value; - } - - /** Quote a value if it contains special characters. 
*/ - private static String quote(String value) { - if (value.contains(",") - || value.contains("\"") - || value.contains("(") - || value.contains(")") - || value.contains("[") - || value.contains("]") - || value.contains(" ")) { - return "\"" + value.replace("\"", "\"\"") + "\""; - } - return value; - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Record.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Record.java deleted file mode 100644 index ab080bbc67..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Record.java +++ /dev/null @@ -1,3 +0,0 @@ -package dev.typr.foundations.data; - -public record Record(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Regclass.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Regclass.java deleted file mode 100644 index 58824140af..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Regclass.java +++ /dev/null @@ -1,4 +0,0 @@ -package dev.typr.foundations.data; - -// Relation name, like `tablename` -public record Regclass(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Regconfig.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Regconfig.java deleted file mode 100644 index 49d55d660d..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Regconfig.java +++ /dev/null @@ -1,4 +0,0 @@ -package dev.typr.foundations.data; - -// Text search configuration, like `english` -public record Regconfig(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Regdictionary.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Regdictionary.java deleted file mode 100644 index f13d9a7528..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Regdictionary.java +++ /dev/null @@ -1,4 +0,0 @@ -package dev.typr.foundations.data; - -// Text search dictionary, like `english_stem` -public record Regdictionary(String value) {} diff --git 
a/foundations-jdbc/src/java/dev/typr/foundations/data/Regnamespace.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Regnamespace.java deleted file mode 100644 index 355ecbda2a..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Regnamespace.java +++ /dev/null @@ -1,4 +0,0 @@ -package dev.typr.foundations.data; - -// Namespace/schema name, like `public` -public record Regnamespace(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Regoper.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Regoper.java deleted file mode 100644 index c9f8880dfe..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Regoper.java +++ /dev/null @@ -1,4 +0,0 @@ -package dev.typr.foundations.data; - -// Operator name, like `-` -public record Regoper(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Regoperator.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Regoperator.java deleted file mode 100644 index 2ae76f1ff1..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Regoperator.java +++ /dev/null @@ -1,4 +0,0 @@ -package dev.typr.foundations.data; - -// Operator with argument types, like `*(INTEGER,INTEGER)` -public record Regoperator(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Regproc.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Regproc.java deleted file mode 100644 index 723670c1db..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Regproc.java +++ /dev/null @@ -1,4 +0,0 @@ -package dev.typr.foundations.data; - -// Function name, like `sum` -public record Regproc(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Regprocedure.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Regprocedure.java deleted file mode 100644 index 94f7f5ad07..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Regprocedure.java +++ /dev/null @@ 
-1,4 +0,0 @@ -package dev.typr.foundations.data; - -// Function with argument types, like `sum(INT4)` -public record Regprocedure(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Regrole.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Regrole.java deleted file mode 100644 index dc5b8721c1..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Regrole.java +++ /dev/null @@ -1,4 +0,0 @@ -package dev.typr.foundations.data; - -// Role name, like `postgres` -public record Regrole(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Regtype.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Regtype.java deleted file mode 100644 index 6524e9336a..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Regtype.java +++ /dev/null @@ -1,4 +0,0 @@ -package dev.typr.foundations.data; - -// Type name, like `integer` -public record Regtype(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Uint1.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Uint1.java deleted file mode 100644 index fd9c86e179..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Uint1.java +++ /dev/null @@ -1,18 +0,0 @@ -package dev.typr.foundations.data; - -/** Unsigned 1-byte integer (0-255). Used for SQL Server TINYINT and MariaDB TINYINT UNSIGNED.
*/ -public record Uint1(short value) { - public static final short MIN_VALUE = 0; - public static final short MAX_VALUE = 255; - - public Uint1 { - if (value < MIN_VALUE || value > MAX_VALUE) { - throw new IllegalArgumentException( - "Uint1 value must be between " + MIN_VALUE + " and " + MAX_VALUE + ", got: " + value); - } - } - - public static Uint1 of(int value) { - return new Uint1((short) value); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Uint2.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Uint2.java deleted file mode 100644 index ead5a88947..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Uint2.java +++ /dev/null @@ -1,18 +0,0 @@ -package dev.typr.foundations.data; - -/** Unsigned 2-byte integer (0-65535). Used for MariaDB SMALLINT UNSIGNED. */ -public record Uint2(int value) { - public static final int MIN_VALUE = 0; - public static final int MAX_VALUE = 65535; - - public Uint2 { - if (value < MIN_VALUE || value > MAX_VALUE) { - throw new IllegalArgumentException( - "Uint2 value must be between " + MIN_VALUE + " and " + MAX_VALUE + ", got: " + value); - } - } - - public static Uint2 of(int value) { - return new Uint2(value); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Uint4.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Uint4.java deleted file mode 100644 index 364e2656e0..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Uint4.java +++ /dev/null @@ -1,18 +0,0 @@ -package dev.typr.foundations.data; - -/** Unsigned 4-byte integer (0-4294967295). Used for MariaDB INT UNSIGNED and MEDIUMINT UNSIGNED. 
*/ -public record Uint4(long value) { - public static final long MIN_VALUE = 0L; - public static final long MAX_VALUE = 4294967295L; - - public Uint4 { - if (value < MIN_VALUE || value > MAX_VALUE) { - throw new IllegalArgumentException( - "Uint4 value must be between " + MIN_VALUE + " and " + MAX_VALUE + ", got: " + value); - } - } - - public static Uint4 of(long value) { - return new Uint4(value); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Uint8.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Uint8.java deleted file mode 100644 index c4d1ab5207..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Uint8.java +++ /dev/null @@ -1,24 +0,0 @@ -package dev.typr.foundations.data; - -import java.math.BigInteger; - -/** Unsigned 8-byte integer (0-18446744073709551615). Used for MariaDB BIGINT UNSIGNED. */ -public record Uint8(BigInteger value) { - public static final BigInteger MIN_VALUE = BigInteger.ZERO; - public static final BigInteger MAX_VALUE = new BigInteger("18446744073709551615"); - - public Uint8 { - if (value.compareTo(MIN_VALUE) < 0 || value.compareTo(MAX_VALUE) > 0) { - throw new IllegalArgumentException( - "Uint8 value must be between " + MIN_VALUE + " and " + MAX_VALUE + ", got: " + value); - } - } - - public static Uint8 of(long value) { - return new Uint8(BigInteger.valueOf(value)); - } - - public static Uint8 of(BigInteger value) { - return new Uint8(value); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Unknown.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Unknown.java deleted file mode 100644 index 860cae364d..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Unknown.java +++ /dev/null @@ -1,4 +0,0 @@ -package dev.typr.foundations.data; - -// A column type typr doesn't know how to handle.
it'll be cast to/from `text` -public record Unknown(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Vector.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Vector.java deleted file mode 100644 index 951ce3cf7c..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Vector.java +++ /dev/null @@ -1,52 +0,0 @@ -package dev.typr.foundations.data; - -import java.util.Arrays; - -public record Vector(float[] values) { - public static Vector parse(String value) { - // Handle pgvector format: "[1.0,2.2,3.3]" - var trimmed = value.trim(); - if (trimmed.startsWith("[") && trimmed.endsWith("]")) { - trimmed = trimmed.substring(1, trimmed.length() - 1); - } - if (trimmed.isEmpty()) { - return new Vector(new float[0]); - } - var parts = trimmed.split(","); - var ret = new float[parts.length]; - for (var i = 0; i < parts.length; i++) { - ret[i] = Float.parseFloat(parts[i].trim()); - } - return new Vector(ret); - } - - @Override - public int hashCode() { - return Arrays.hashCode(values); - } - - @Override - public boolean equals(Object obj) { - if (obj instanceof Vector other) { - return Arrays.equals(values, other.values); - } - return false; - } - - /** Returns the vector in pgvector format: [1.0,2.2,3.3] */ - public String value() { - var sb = new StringBuilder("["); - for (var i = 0; i < values.length; i++) { - if (i > 0) { - sb.append(","); - } - sb.append(values[i]); - } - sb.append("]"); - return sb.toString(); - } - - public Vector(String value) { - this(Vector.parse(value).values); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/Xid.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Xid.java deleted file mode 100644 index b0749be776..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Xid.java +++ /dev/null @@ -1,4 +0,0 @@ -package dev.typr.foundations.data; - -// Transaction ID -public record Xid(String value) {} diff --git
a/foundations-jdbc/src/java/dev/typr/foundations/data/Xml.java b/foundations-jdbc/src/java/dev/typr/foundations/data/Xml.java deleted file mode 100644 index e9433fe8bb..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/Xml.java +++ /dev/null @@ -1,3 +0,0 @@ -package dev.typr.foundations.data; - -public record Xml(String value) {} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/maria/Inet4.java b/foundations-jdbc/src/java/dev/typr/foundations/data/maria/Inet4.java deleted file mode 100644 index 61147bc5d9..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/maria/Inet4.java +++ /dev/null @@ -1,52 +0,0 @@ -package dev.typr.foundations.data.maria; - -import java.util.Objects; - -/** - * Wrapper for MariaDB INET4 type. Represents an IPv4 address. - * - *

MariaDB stores INET4 internally as a 4-byte binary value but returns it as a string in - * dotted-decimal notation (e.g., "192.168.1.1"). - */ -public record Inet4(String value) { - public Inet4 { - Objects.requireNonNull(value, "Inet4 value cannot be null"); - } - - @Override - public String toString() { - return value; - } - - /** Parse an IPv4 address from a dotted-decimal string. */ - public static Inet4 parse(String value) { - return new Inet4(value); - } - - /** Get the address as an array of 4 bytes. */ - public byte[] toBytes() { - String[] parts = value.split("\\."); - if (parts.length != 4) { - throw new IllegalStateException("Invalid IPv4 address: " + value); - } - byte[] bytes = new byte[4]; - for (int i = 0; i < 4; i++) { - int octet = Integer.parseInt(parts[i]); - if (octet < 0 || octet > 255) { - throw new IllegalStateException("Invalid octet in IPv4 address: " + value); - } - bytes[i] = (byte) octet; - } - return bytes; - } - - /** Create an Inet4 from a byte array. */ - public static Inet4 fromBytes(byte[] bytes) { - if (bytes.length != 4) { - throw new IllegalArgumentException("IPv4 address must be 4 bytes"); - } - return new Inet4( - String.format( - "%d.%d.%d.%d", bytes[0] & 0xFF, bytes[1] & 0xFF, bytes[2] & 0xFF, bytes[3] & 0xFF)); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/maria/Inet6.java b/foundations-jdbc/src/java/dev/typr/foundations/data/maria/Inet6.java deleted file mode 100644 index 9518230fb5..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/maria/Inet6.java +++ /dev/null @@ -1,48 +0,0 @@ -package dev.typr.foundations.data.maria; - -import java.util.Objects; - -/** - * Wrapper for MariaDB INET6 type. Represents an IPv6 address (can also store IPv4 addresses in IPv6 - * format). - * - *

MariaDB stores INET6 internally as a 16-byte binary value but returns it as a string in - * standard IPv6 notation (e.g., "::ffff:192.168.1.1" or "2001:db8::1"). - */ -public record Inet6(String value) { - public Inet6 { - Objects.requireNonNull(value, "Inet6 value cannot be null"); - } - - @Override - public String toString() { - return value; - } - - /** Parse an IPv6 address from a string. */ - public static Inet6 parse(String value) { - return new Inet6(value); - } - - /** Check if this is an IPv4-mapped IPv6 address (::ffff:x.x.x.x). */ - public boolean isIPv4Mapped() { - return value.startsWith("::ffff:") || value.startsWith("::FFFF:"); - } - - /** - * If this is an IPv4-mapped address, extract the IPv4 part. Returns null if not an IPv4-mapped - * address. - */ - public Inet4 toIPv4() { - if (!isIPv4Mapped()) { - return null; - } - String ipv4Part = value.substring(7); // Skip "::ffff:" - return new Inet4(ipv4Part); - } - - /** Create an IPv6 address from an IPv4 address (as IPv4-mapped IPv6). */ - public static Inet6 fromIPv4(Inet4 ipv4) { - return new Inet6("::ffff:" + ipv4.value()); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/maria/MariaSet.java b/foundations-jdbc/src/java/dev/typr/foundations/data/maria/MariaSet.java deleted file mode 100644 index fb2e68c7f9..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/maria/MariaSet.java +++ /dev/null @@ -1,96 +0,0 @@ -package dev.typr.foundations.data.maria; - -import java.util.Arrays; -import java.util.Collections; -import java.util.LinkedHashSet; -import java.util.Objects; -import java.util.Set; -import java.util.stream.Collectors; - -/** - * Wrapper for MariaDB SET type. - * - *

MariaDB SET is a string object that can have zero or more values, each chosen from a list of - * permitted values. SET values are returned by JDBC as comma-separated strings (e.g., - * "email,sms,push"). - * - *

This wrapper provides type-safe access to the individual values. - */ -public final class MariaSet { - private final Set<String> values; - - private MariaSet(Set<String> values) { - this.values = Collections.unmodifiableSet(new LinkedHashSet<>(values)); - } - - /** Create a MariaSet from a comma-separated string (as returned by JDBC). */ - public static MariaSet fromString(String commaSeparated) { - if (commaSeparated == null || commaSeparated.isEmpty()) { - return new MariaSet(Collections.emptySet()); - } - Set<String> values = - Arrays.stream(commaSeparated.split(",")) - .map(String::trim) - .filter(s -> !s.isEmpty()) - .collect(Collectors.toCollection(LinkedHashSet::new)); - return new MariaSet(values); - } - - /** Create a MariaSet from individual values. */ - public static MariaSet of(String... values) { - return new MariaSet(new LinkedHashSet<>(Arrays.asList(values))); - } - - /** Create a MariaSet from a Set of values. */ - public static MariaSet of(Set<String> values) { - return new MariaSet(values); - } - - /** Create an empty MariaSet. */ - public static MariaSet empty() { - return new MariaSet(Collections.emptySet()); - } - - /** Get the values as an unmodifiable Set. */ - public Set<String> values() { - return values; - } - - /** Check if a value is present in the set. */ - public boolean contains(String value) { - return values.contains(value); - } - - /** Check if the set is empty. */ - public boolean isEmpty() { - return values.isEmpty(); - } - - /** Get the number of values in the set. */ - public int size() { - return values.size(); - } - - /** Convert to a comma-separated string (for JDBC).
*/ - public String toCommaSeparated() { - return String.join(",", values); - } - - @Override - public String toString() { - return toCommaSeparated(); - } - - @Override - public boolean equals(Object o) { - if (this == o) return true; - if (o == null || getClass() != o.getClass()) return false; - MariaSet mariaSet = (MariaSet) o; - return Objects.equals(values, mariaSet.values); - } - - @Override - public int hashCode() { - return Objects.hash(values); - } -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/BinaryN.java b/foundations-jdbc/src/java/dev/typr/foundations/data/precise/BinaryN.java deleted file mode 100644 index 6cd7f8e8b8..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/BinaryN.java +++ /dev/null @@ -1,41 +0,0 @@ -package dev.typr.foundations.data.precise; - -/** - * Abstract interface for binary types with a maximum length constraint. - * - *

Generated precise types like Binary16, Binary32, Binary64 implement this interface, allowing - * users to abstract over different binary length constraints. - * - *

Two BinaryN values with the same underlying byte array are considered semantically equal via - * {@link #semanticEquals}, regardless of their declared max length. - * - *

Example usage: - * - *

- * <pre>{@code
- * public <T extends BinaryN> void processBinary(T value) {
- *     byte[] raw = value.rawValue();
- *     int maxLen = value.maxLength();
- *     // ...
- * }
- * }</pre>
- */ -public interface BinaryN { - /** Get the underlying byte array value. */ - byte[] rawValue(); - - /** Get the maximum allowed length in bytes for this type. */ - int maxLength(); - - /** - * Compare this BinaryN to another for semantic equality. Two BinaryN values are semantically - * equal if they have the same underlying byte array content, regardless of their declared max - * length. - */ - boolean semanticEquals(BinaryN other); - - /** - * Compute a semantic hash code based only on the underlying value. This is compatible with - * semanticEquals for use in collections that compare BinaryN values by content. - */ - int semanticHashCode(); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/DecimalN.java b/foundations-jdbc/src/java/dev/typr/foundations/data/precise/DecimalN.java deleted file mode 100644 index 24521e5259..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/DecimalN.java +++ /dev/null @@ -1,51 +0,0 @@ -package dev.typr.foundations.data.precise; - -import java.math.BigDecimal; - -/** - * Abstract interface for decimal types with precision and scale constraints. - * - *

Generated precise types like Decimal5_2, Decimal10_2, Decimal18_4 implement this interface, - * allowing users to abstract over different decimal precision/scale constraints. - * - *

For types with scale=0 (integer decimals like Int5, Int10, Int18), this interface returns a - * BigDecimal representation of the underlying BigInteger value. - * - *

Two DecimalN values with the same numeric value are considered semantically equal via {@link - * #semanticEquals}, regardless of their declared precision/scale. Comparison uses {@link - * BigDecimal#compareTo} so that values like 1.0 and 1.00 are equal. - * - *

Example usage: - * - *

- * <pre>{@code
- * public <T extends DecimalN> void processDecimal(T value) {
- *     BigDecimal raw = value.decimalValue();
- *     int precision = value.precision();
- *     int scale = value.scale();
- *     // ...
- * }
- * }</pre>
- */ -public interface DecimalN { - /** Get the value as BigDecimal. For scale=0 types, this converts from BigInteger. */ - BigDecimal decimalValue(); - - /** Get the maximum precision (total number of digits) for this type. */ - int precision(); - - /** Get the scale (digits after decimal point) for this type. */ - int scale(); - - /** - * Compare this DecimalN to another for semantic equality. Two DecimalN values are semantically - * equal if they have the same numeric value (using compareTo), regardless of their declared - * precision/scale. - */ - boolean semanticEquals(DecimalN other); - - /** - * Compute a semantic hash code based only on the underlying value. Uses stripTrailingZeros to - * ensure that 1.0 and 1.00 have the same hash code. - */ - int semanticHashCode(); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/InstantN.java b/foundations-jdbc/src/java/dev/typr/foundations/data/precise/InstantN.java deleted file mode 100644 index 80d3f7c014..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/InstantN.java +++ /dev/null @@ -1,39 +0,0 @@ -package dev.typr.foundations.data.precise; - -import java.time.Instant; - -/** - * Abstract interface for Instant types with fractional seconds precision constraint. - * - *

Generated precise types like Instant3, Instant6 implement this interface, allowing users to - * abstract over different temporal precision constraints. - * - *

Two InstantN values with the same underlying instant are considered semantically equal via - * {@link #semanticEquals}, regardless of their declared precision. - * - *

Example usage: - * - *

- * <pre>{@code
- * public <T extends InstantN> void processInstant(T value) {
- *     Instant raw = value.rawValue();
- *     int fsp = value.fractionalSecondsPrecision();
- *     // ...
- * }
- * }</pre>
- */ -public interface InstantN { - /** Get the underlying Instant value. */ - Instant rawValue(); - - /** Get the fractional seconds precision (0-9) for this type. */ - int fractionalSecondsPrecision(); - - /** - * Compare this InstantN to another for semantic equality. Two InstantN values are semantically - * equal if they have the same instant, regardless of their declared precision. - */ - boolean semanticEquals(InstantN other); - - /** Compute a semantic hash code based only on the underlying value. */ - int semanticHashCode(); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/LocalDateTimeN.java b/foundations-jdbc/src/java/dev/typr/foundations/data/precise/LocalDateTimeN.java deleted file mode 100644 index e2e73a2376..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/LocalDateTimeN.java +++ /dev/null @@ -1,39 +0,0 @@ -package dev.typr.foundations.data.precise; - -import java.time.LocalDateTime; - -/** - * Abstract interface for LocalDateTime types with fractional seconds precision constraint. - * - *

Generated precise types like LocalDateTime3, LocalDateTime6 implement this interface, allowing - * users to abstract over different temporal precision constraints. - * - *

Two LocalDateTimeN values with the same underlying timestamp are considered semantically equal - * via {@link #semanticEquals}, regardless of their declared precision. - * - *

Example usage: - * - *

- * <pre>{@code
- * public <T extends LocalDateTimeN> void processDateTime(T value) {
- *     LocalDateTime raw = value.rawValue();
- *     int fsp = value.fractionalSecondsPrecision();
- *     // ...
- * }
- * }</pre>
- */ -public interface LocalDateTimeN { - /** Get the underlying LocalDateTime value. */ - LocalDateTime rawValue(); - - /** Get the fractional seconds precision (0-9) for this type. */ - int fractionalSecondsPrecision(); - - /** - * Compare this LocalDateTimeN to another for semantic equality. Two LocalDateTimeN values are - * semantically equal if they have the same timestamp, regardless of their declared precision. - */ - boolean semanticEquals(LocalDateTimeN other); - - /** Compute a semantic hash code based only on the underlying value. */ - int semanticHashCode(); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/LocalTimeN.java b/foundations-jdbc/src/java/dev/typr/foundations/data/precise/LocalTimeN.java deleted file mode 100644 index 05741eea00..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/LocalTimeN.java +++ /dev/null @@ -1,39 +0,0 @@ -package dev.typr.foundations.data.precise; - -import java.time.LocalTime; - -/** - * Abstract interface for LocalTime types with fractional seconds precision constraint. - * - *

Generated precise types like LocalTime3, LocalTime6 implement this interface, allowing users - * to abstract over different temporal precision constraints. - * - *

Two LocalTimeN values with the same underlying time are considered semantically equal via - * {@link #semanticEquals}, regardless of their declared precision. - * - *

Example usage: - * - *

- * <pre>{@code
- * public <T extends LocalTimeN> void processTime(T value) {
- *     LocalTime raw = value.rawValue();
- *     int fsp = value.fractionalSecondsPrecision();
- *     // ...
- * }
- * }</pre>
- */ -public interface LocalTimeN { - /** Get the underlying LocalTime value. */ - LocalTime rawValue(); - - /** Get the fractional seconds precision (0-9) for this type. */ - int fractionalSecondsPrecision(); - - /** - * Compare this LocalTimeN to another for semantic equality. Two LocalTimeN values are - * semantically equal if they have the same time, regardless of their declared precision. - */ - boolean semanticEquals(LocalTimeN other); - - /** Compute a semantic hash code based only on the underlying value. */ - int semanticHashCode(); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/NonEmptyPaddedStringN.java b/foundations-jdbc/src/java/dev/typr/foundations/data/precise/NonEmptyPaddedStringN.java deleted file mode 100644 index dd7ba291f2..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/NonEmptyPaddedStringN.java +++ /dev/null @@ -1,50 +0,0 @@ -package dev.typr.foundations.data.precise; - -/** - * Abstract interface for fixed-length, blank-padded, non-empty string types. - * - *

Generated precise types like NonEmptyPaddedString10, NonEmptyPaddedString20 implement this - * interface. These types represent CHAR(n) columns that are always padded to exactly n characters - * with trailing spaces and must contain at least one non-whitespace character. - * - *

This is particularly useful for Oracle CHAR columns which cannot store empty strings (Oracle - * treats empty string as NULL). - * - *

Two NonEmptyPaddedStringN values with the same trimmed content are considered semantically - * equal via {@link #semanticEquals}, regardless of their declared length. Comparison is done on - * trimmed values. - * - *

Example usage: - * - *

- * <pre>{@code
- * public <T extends NonEmptyPaddedStringN> void processString(T value) {
- *     String padded = value.rawValue();    // includes trailing spaces
- *     String trimmed = value.trimmed(); // without trailing spaces
- *     int len = value.length();         // declared fixed length
- *     // ...
- * }
- * }</pre>
- */ -public interface NonEmptyPaddedStringN { - /** Get the underlying padded string value (includes trailing spaces). */ - String rawValue(); - - /** Get the value with trailing spaces removed. */ - String trimmed(); - - /** Get the fixed length for this type. */ - int length(); - - /** - * Compare this NonEmptyPaddedStringN to another for semantic equality. Two NonEmptyPaddedStringN - * values are semantically equal if they have the same trimmed content, regardless of their - * declared length. - */ - boolean semanticEquals(NonEmptyPaddedStringN other); - - /** - * Compute a semantic hash code based on the trimmed value. This is compatible with semanticEquals - * for use in collections that compare NonEmptyPaddedStringN values by content. - */ - int semanticHashCode(); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/NonEmptyStringN.java b/foundations-jdbc/src/java/dev/typr/foundations/data/precise/NonEmptyStringN.java deleted file mode 100644 index bc73d07cf2..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/NonEmptyStringN.java +++ /dev/null @@ -1,42 +0,0 @@ -package dev.typr.foundations.data.precise; - -/** - * Abstract interface for non-empty string types with a maximum length constraint. - * - *

Generated precise types like NonEmptyString10, NonEmptyString50 implement this interface. - * These types guarantee the string is non-null and non-empty, suitable for databases like Oracle - * where empty strings are converted to NULL. - * - *

Two NonEmptyStringN values with the same underlying string are considered semantically equal - * via {@link #semanticEquals}, regardless of their declared max length. - * - *

Example usage: - * - *

- * <pre>{@code
- * public <T extends NonEmptyStringN> void processString(T value) {
- *     String raw = value.rawValue();  // guaranteed non-empty
- *     int maxLen = value.maxLength();
- *     // ...
- * }
- * }</pre>
- */ -public interface NonEmptyStringN { - /** Get the underlying string value. Guaranteed to be non-null and non-empty. */ - String rawValue(); - - /** Get the maximum allowed length for this type. */ - int maxLength(); - - /** - * Compare this NonEmptyStringN to another for semantic equality. Two NonEmptyStringN values are - * semantically equal if they have the same underlying string value, regardless of their declared - * max length. - */ - boolean semanticEquals(NonEmptyStringN other); - - /** - * Compute a semantic hash code based only on the underlying value. This is compatible with - * semanticEquals for use in collections that compare NonEmptyStringN values by content. - */ - int semanticHashCode(); -} diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/OffsetDateTimeN.java b/foundations-jdbc/src/java/dev/typr/foundations/data/precise/OffsetDateTimeN.java deleted file mode 100644 index 7ab464c1a5..0000000000 --- a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/OffsetDateTimeN.java +++ /dev/null @@ -1,39 +0,0 @@ -package dev.typr.foundations.data.precise; - -import java.time.OffsetDateTime; - -/** - * Abstract interface for OffsetDateTime types with fractional seconds precision constraint. - * - *

Generated precise types like OffsetDateTime3, OffsetDateTime7 implement this interface, - * allowing users to abstract over different temporal precision constraints. - * - *

Two OffsetDateTimeN values with the same underlying timestamp are considered semantically - * equal via {@link #semanticEquals}, regardless of their declared precision. - * - *

Example usage: - * - *

- * <pre>{@code
- * public <T extends OffsetDateTimeN> void processOffsetDateTime(T value) {
- *     OffsetDateTime raw = value.rawValue();
- *     int fsp = value.fractionalSecondsPrecision();
- *     // ...
- * }
- * }</pre>
- */
-public interface OffsetDateTimeN {
-  /** Get the underlying OffsetDateTime value. */
-  OffsetDateTime rawValue();
-
-  /** Get the fractional seconds precision (0-9) for this type. */
-  int fractionalSecondsPrecision();
-
-  /**
-   * Compare this OffsetDateTimeN to another for semantic equality. Two OffsetDateTimeN values are
-   * semantically equal if they have the same timestamp, regardless of their declared precision.
-   */
-  boolean semanticEquals(OffsetDateTimeN other);
-
-  /** Compute a semantic hash code based only on the underlying value. */
-  int semanticHashCode();
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/PaddedStringN.java b/foundations-jdbc/src/java/dev/typr/foundations/data/precise/PaddedStringN.java
deleted file mode 100644
index a620c954f5..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/PaddedStringN.java
+++ /dev/null
@@ -1,46 +0,0 @@
-package dev.typr.foundations.data.precise;
-
-/**
- * Abstract interface for fixed-length, blank-padded string types.
- *
- * <p>Generated precise types like PaddedString10, PaddedString20 implement this interface. These
- * types represent CHAR(n) columns which are always padded to exactly n characters with trailing
- * spaces.
- *
- * <p>Two PaddedStringN values with the same trimmed content are considered semantically equal via
- * {@link #semanticEquals}, regardless of their declared length. Comparison is done on trimmed
- * values.
- *
- * <p>Example usage:
- *
- * <pre>{@code
- * public <T extends PaddedStringN> void processString(T value) {
- *     String padded = value.rawValue();    // includes trailing spaces
- *     String trimmed = value.trimmed(); // without trailing spaces
- *     int len = value.length();         // declared fixed length
- *     // ...
- * }
- * }</pre>
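[Editor's note] The CHAR(n) semantics documented for PaddedStringN are easy to demonstrate in isolation. `PaddedString10` / `PaddedString20` are hypothetical stand-ins for generated types, and the trimming logic is an assumption derived from the documented contract ("comparison is done on trimmed values"), not the actual generated code:

```java
public class PaddedStringDemo {
  // Minimal restatement of the deleted PaddedStringN contract.
  interface PaddedStringN {
    String rawValue(); // always padded to the declared length

    int length(); // declared fixed length

    default String trimmed() {
      return rawValue().stripTrailing();
    }

    // CHAR(n) semantics: equality is decided on trimmed content,
    // so declared length and trailing padding do not matter (assumed implementation).
    default boolean semanticEquals(PaddedStringN other) {
      return trimmed().equals(other.trimmed());
    }

    default int semanticHashCode() {
      return trimmed().hashCode();
    }
  }

  // Hypothetical stand-ins for generated PaddedString10 / PaddedString20.
  record PaddedString10(String rawValue) implements PaddedStringN {
    public int length() {
      return 10;
    }
  }

  record PaddedString20(String rawValue) implements PaddedStringN {
    public int length() {
      return 20;
    }
  }

  public static void main(String[] args) {
    PaddedStringN a = new PaddedString10("abc       ");           // CHAR(10)
    PaddedStringN b = new PaddedString20("abc                 "); // CHAR(20)
    System.out.println(a.semanticEquals(b)); // true: both trim to "abc"
  }
}
```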
- */
-public interface PaddedStringN {
-  /** Get the underlying padded string value (includes trailing spaces). */
-  String rawValue();
-
-  /** Get the value with trailing spaces removed. */
-  String trimmed();
-
-  /** Get the fixed length for this type. */
-  int length();
-
-  /**
-   * Compare this PaddedStringN to another for semantic equality. Two PaddedStringN values are
-   * semantically equal if they have the same trimmed content, regardless of their declared length.
-   */
-  boolean semanticEquals(PaddedStringN other);
-
-  /**
-   * Compute a semantic hash code based on the trimmed value. This is compatible with semanticEquals
-   * for use in collections that compare PaddedStringN values by content.
-   */
-  int semanticHashCode();
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/StringN.java b/foundations-jdbc/src/java/dev/typr/foundations/data/precise/StringN.java
deleted file mode 100644
index 71e0fc7b8e..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/data/precise/StringN.java
+++ /dev/null
@@ -1,41 +0,0 @@
-package dev.typr.foundations.data.precise;
-
-/**
- * Abstract interface for string types with a maximum length constraint.
- *
- * <p>Generated precise types like String10, String50, String255 implement this interface, allowing
- * users to abstract over different string length constraints.
- *
- * <p>Two StringN values with the same underlying string are considered semantically equal via
- * {@link #semanticEquals}, regardless of their declared max length. For example, a
- * String10("hello") and String50("hello") will compare equal using semanticEquals.
- *
- * <p>Example usage:
- *
- * <pre>{@code
- * public <T extends StringN> void processString(T value) {
- *     String raw = value.rawValue();
- *     int maxLen = value.maxLength();
- *     // ...
- * }
- * }</pre>
- */
-public interface StringN {
-  /** Get the underlying string value. */
-  String rawValue();
-
-  /** Get the maximum allowed length for this type. */
-  int maxLength();
-
-  /**
-   * Compare this StringN to another for semantic equality. Two StringN values are semantically
-   * equal if they have the same underlying string value, regardless of their declared max length.
-   */
-  boolean semanticEquals(StringN other);
-
-  /**
-   * Compute a semantic hash code based only on the underlying value. This is compatible with
-   * semanticEquals for use in collections that compare StringN values by content.
-   */
-  int semanticHashCode();
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/dsl/Bijection.java b/foundations-jdbc/src/java/dev/typr/foundations/dsl/Bijection.java
deleted file mode 100644
index c41c8146a4..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/dsl/Bijection.java
+++ /dev/null
@@ -1,147 +0,0 @@
-package dev.typr.foundations.dsl;
-
-import java.util.function.Function;
-
-/**
- * Represents a bidirectional conversion between two types. This is used for type-safe conversions
- * in the DSL.
- */
-public interface Bijection<T, TT> {
-  TT underlying(T value);
-
-  T from(TT value);
-
-  default <U> Bijection<T, U> andThen(Bijection<TT, U> other) {
-    return new Bijection<T, U>() {
-      @Override
-      public U underlying(T value) {
-        return other.underlying(Bijection.this.underlying(value));
-      }
-
-      @Override
-      public T from(U value) {
-        return Bijection.this.from(other.from(value));
-      }
-    };
-  }
-
-  default Bijection<TT, T> inverse() {
-    return new Bijection<TT, T>() {
-      @Override
-      public T underlying(TT value) {
-        return Bijection.this.from(value);
-      }
-
-      @Override
-      public TT from(T value) {
-        return Bijection.this.underlying(value);
-      }
-    };
-  }
-
-  static <T> Bijection<T, T> identity() {
-    return new Bijection<T, T>() {
-      @Override
-      public T underlying(T value) {
-        return value;
-      }
-
-      @Override
-      public T from(T value) {
-        return value;
-      }
-    };
-  }
-
-  /**
-   * Type witness proving that T is Boolean.
-   *
-   * <p>This is used for type-safe boolean operations on {@link dev.typr.foundations.dsl.SqlExpr}.
-   * The method only compiles when called in a context where T = Boolean, providing compile-time
-   * type safety.
-   *
-   * <p>Example usage:
-   *
-   * <pre>{@code
-   * SqlExpr<Boolean> expr = field.isEqual(value);
-   * SqlExpr<Boolean> negated = expr.not(Bijection.asBool());
-   *
-   * // This won't compile - String is not Boolean:
-   * // SqlExpr<String> strExpr = ...;
-   * // strExpr.not(Bijection.asBool()); // compile error!
-   * }</pre>
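[Editor's note] The Bijection combinators in the deleted file compose in the obvious way. The following standalone sketch re-declares a minimal version of the interface (the published `dev.typr.foundations` artifact may differ in detail) and uses a hypothetical `UserId` wrapper, standing in for a generated id type, to show a round-trip:

```java
import java.util.function.Function;

public class BijectionDemo {
  // Minimal version of the deleted Bijection interface: a two-way T <-> TT conversion.
  interface Bijection<T, TT> {
    TT underlying(T value);

    T from(TT value);

    static <T, TT> Bijection<T, TT> of(Function<T, TT> to, Function<TT, T> from) {
      return new Bijection<T, TT>() {
        public TT underlying(T value) {
          return to.apply(value);
        }

        public T from(TT value) {
          return from.apply(value);
        }
      };
    }

    // Compose two bijections end to end.
    default <U> Bijection<T, U> andThen(Bijection<TT, U> other) {
      return of(t -> other.underlying(underlying(t)), u -> from(other.from(u)));
    }

    // Swap the two directions.
    default Bijection<TT, T> inverse() {
      return of(this::from, this::underlying);
    }
  }

  // Hypothetical wrapper type, standing in for a generated id type.
  record UserId(int value) {}

  public static void main(String[] args) {
    Bijection<UserId, Integer> toInt = Bijection.of(UserId::value, UserId::new);
    Bijection<Integer, String> intToString = Bijection.of(Object::toString, Integer::parseInt);

    // UserId <-> String via composition; round-trips in both directions.
    Bijection<UserId, String> toStr = toInt.andThen(intToString);
    System.out.println(toStr.underlying(new UserId(42))); // 42
    System.out.println(toStr.from("42"));                 // UserId[value=42]
  }
}
```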
-   *
-   * @return an identity bijection for Boolean
-   */
-  static Bijection<Boolean, Boolean> asBool() {
-    return identity();
-  }
-
-  /**
-   * Type witness proving that T is String.
-   *
-   * <p>This is used for type-safe string operations on {@link dev.typr.foundations.dsl.SqlExpr}.
-   * The method only compiles when called in a context where T = String, providing compile-time type
-   * safety.
-   *
-   * <p>Example usage:
-   *
-   * <pre>{@code
-   * SqlExpr<String> nameField = ...;
-   * SqlExpr<Boolean> matched = nameField.like("Jo%", Bijection.asString());
-   * SqlExpr<String> upper = nameField.upper(Bijection.asString());
-   *
-   * // This won't compile - Integer is not String:
-   * // SqlExpr<Integer> intExpr = ...;
-   * // intExpr.like("Jo%", Bijection.asString()); // compile error!
-   * }</pre>
-   *
-   * @return an identity bijection for String
-   */
-  static Bijection<String, String> asString() {
-    return identity();
-  }
-
-  static <T, TT> Bijection<T, TT> of(Function<T, TT> to, Function<TT, T> from) {
-    return new Bijection<T, TT>() {
-      @Override
-      public TT underlying(T value) {
-        return to.apply(value);
-      }
-
-      @Override
-      public T from(TT value) {
-        return from.apply(value);
-      }
-    };
-  }
-
-  /**
-   * Curried factory method for Scala interop. Scala: Bijection.apply[W, U](_.value)(SomeType.apply)
-   */
-  static <T, TT> Function<Function<TT, T>, Bijection<T, TT>> apply(Function<T, TT> to) {
-    return from -> of(to, from);
-  }
-
-  // Common bijections
-  static Bijection<Boolean, String> booleanToString() {
-    return of(b -> b ? "true" : "false", "true"::equalsIgnoreCase);
-  }
-
-  static Bijection<Integer, String> integerToString() {
-    return of(Object::toString, Integer::parseInt);
-  }
-
-  static Bijection<Long, String> longToString() {
-    return of(Object::toString, Long::parseLong);
-  }
-
-  static Bijection<Double, String> doubleToString() {
-    return of(Object::toString, Double::parseDouble);
-  }
-
-  // For use with custom value types
-  default <U> Function<T, U> map(Function<TT, U> mapper) {
-    return value -> mapper.apply(underlying(value));
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/internal/ByteArrays.java b/foundations-jdbc/src/java/dev/typr/foundations/internal/ByteArrays.java
deleted file mode 100644
index 38f8741d5e..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/internal/ByteArrays.java
+++ /dev/null
@@ -1,25 +0,0 @@
-package dev.typr.foundations.internal;
-
-/**
- * Utility methods for converting between primitive byte[] and boxed Byte[] arrays. This is needed
- * because Java records typically use boxed Byte[] while JDBC uses primitive byte[].
- */
-public final class ByteArrays {
-  private ByteArrays() {} // prevent instantiation
-
-  /** Convert primitive byte[] to boxed Byte[] */
-  public static Byte[] box(byte[] arr) {
-    if (arr == null) return null;
-    Byte[] result = new Byte[arr.length];
-    for (int i = 0; i < arr.length; i++) result[i] = arr[i];
-    return result;
-  }
-
-  /** Convert boxed Byte[] to primitive byte[] */
-  public static byte[] unbox(Byte[] arr) {
-    if (arr == null) return null;
-    byte[] result = new byte[arr.length];
-    for (int i = 0; i < arr.length; i++) result[i] = arr[i];
-    return result;
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/internal/RandomHelper.java b/foundations-jdbc/src/java/dev/typr/foundations/internal/RandomHelper.java
deleted file mode 100644
index 3affd7b20f..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/internal/RandomHelper.java
+++ /dev/null
@@ -1,29 +0,0 @@
-package dev.typr.foundations.internal;
-
-import java.util.Random;
-import java.util.UUID;
-
-public class RandomHelper {
-  private static final String ALPHANUMERIC =
-      "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
-
-  public static String alphanumeric(Random random, int length) {
-    StringBuilder sb = new StringBuilder(length);
-    for (int i = 0; i < length; i++) {
-      sb.append(ALPHANUMERIC.charAt(random.nextInt(ALPHANUMERIC.length())));
-    }
-    return sb.toString();
-  }
-
-  public static UUID randomUUID(Random random) {
-    byte[] bytes = new byte[16];
-    random.nextBytes(bytes);
-    return UUID.nameUUIDFromBytes(bytes);
-  }
-
-  public static byte[] randomBytes(Random random, int length) {
-    byte[] bytes = new byte[length];
-    random.nextBytes(bytes);
-    return bytes;
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/internal/TypoPGObjectHelper.java b/foundations-jdbc/src/java/dev/typr/foundations/internal/TypoPGObjectHelper.java
deleted file mode 100644
index 24bdf3aee3..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/internal/TypoPGObjectHelper.java
+++ /dev/null
@@ -1,30 +0,0 @@
-package dev.typr.foundations.internal;
-
-import java.sql.SQLException;
-import org.postgresql.util.PGobject;
-
-/**
- * Helper class for creating PGobject instances with type and value set. This avoids multi-statement
- * lambdas in generated code.
- */
-public final class TypoPGObjectHelper {
-  private TypoPGObjectHelper() {}
-
-  /**
-   * Creates a PGobject with the given type and value.
-   *
-   * @param type the PostgreSQL type name
-   * @param value the string value
-   * @return a new PGobject with type and value set
-   */
-  public static PGobject create(String type, String value) {
-    try {
-      PGobject obj = new PGobject();
-      obj.setType(type);
-      obj.setValue(value);
-      return obj;
-    } catch (SQLException e) {
-      throw new RuntimeException("Failed to create PGobject for type: " + type, e);
-    }
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/internal/arrayMap.java b/foundations-jdbc/src/java/dev/typr/foundations/internal/arrayMap.java
deleted file mode 100644
index c99e241b17..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/internal/arrayMap.java
+++ /dev/null
@@ -1,15 +0,0 @@
-package dev.typr.foundations.internal;
-
-import java.lang.reflect.Array;
-import java.util.function.Function;
-
-public class arrayMap {
-  @SuppressWarnings("unchecked")
-  public static <A, B> B[] map(A[] arr, Function<A, B> f, Class<B> clazz) {
-    B[] result = (B[]) Array.newInstance(clazz, arr.length);
-    for (int i = 0; i < arr.length; i++) {
-      result[i] = f.apply(arr[i]);
-    }
-    return result;
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/internal/stringInterpolator.java b/foundations-jdbc/src/java/dev/typr/foundations/internal/stringInterpolator.java
deleted file mode 100644
index 39b3ff3567..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/internal/stringInterpolator.java
+++ /dev/null
@@ -1,17 +0,0 @@
-package dev.typr.foundations.internal;
-
-public interface stringInterpolator {
-  /**
-   * String interpolation function that concatenates all string arguments together. Similar to
-   * Scala's s"..." string interpolator.
-   *
-   * <p>Example: str("((", "1.0", ",", "2.0", "))") produces "((1.0,2.0))"
-   */
-  static String str(String... parts) {
-    StringBuilder sb = new StringBuilder();
-    for (String part : parts) {
-      sb.append(part);
-    }
-    return sb.toString();
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/internal/stripMargin.java b/foundations-jdbc/src/java/dev/typr/foundations/internal/stripMargin.java
deleted file mode 100644
index 89bf7ee83a..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/internal/stripMargin.java
+++ /dev/null
@@ -1,20 +0,0 @@
-package dev.typr.foundations.internal;
-
-import java.util.stream.Collectors;
-
-public class stripMargin {
-  public static String apply(String value) {
-    return value.lines().map(stripMargin::fromLine).collect(Collectors.joining("\n"));
-  }
-
-  static String fromLine(String line) {
-    int i = 0;
-    while (i < line.length() && Character.isWhitespace(line.charAt(i))) {
-      i++;
-    }
-    if (i < line.length() && line.charAt(i) == '|') {
-      return line.substring(i + 1);
-    }
-    return line.substring(i);
-  }
-}
diff --git a/foundations-jdbc/src/java/dev/typr/foundations/streamingInsert.java b/foundations-jdbc/src/java/dev/typr/foundations/streamingInsert.java
deleted file mode 100644
index f05f1a0424..0000000000
--- a/foundations-jdbc/src/java/dev/typr/foundations/streamingInsert.java
+++ /dev/null
@@ -1,45 +0,0 @@
-package dev.typr.foundations;
-
-import java.nio.charset.StandardCharsets;
-import java.sql.Connection;
-import java.sql.SQLException;
-import java.util.Iterator;
-import org.postgresql.PGConnection;
-import org.postgresql.util.PSQLException;
-
-public class streamingInsert {
-  public static long insertUnchecked(
-      String copyCommand, int batchSize, Iterator rows, Connection c, PgText T) {
-    try {
-      return insert(copyCommand, batchSize, rows, c, T);
-    } catch (SQLException e) {
-      throw new RuntimeException(e);
-    }
-  }
-
-  public static long insert(
-      String copyCommand, int batchSize, Iterator rows, Connection c, PgText T)
-      throws SQLException {
-    var copyManager = c.unwrap(PGConnection.class).getCopyAPI();
-
-    var in = copyManager.copyIn(copyCommand);
-    try {
-      while (rows.hasNext()) {
-        var sb = new StringBuilder();
-        for (int i = 0; i < batchSize && rows.hasNext(); i++) {
-          T.unsafeEncode(rows.next(), sb);
-          sb.append("\n");
-        }
-        var bytes = sb.toString().getBytes(StandardCharsets.UTF_8);
-        in.writeToCopy(bytes, 0, bytes.length);
-      }
-      return in.endCopy();
-    } catch (Throwable th) {
-      try {
-        in.cancelCopy();
-      } catch (PSQLException ignored) {
-      }
-      throw th;
-    }
-  }
-}
diff --git a/gradle.properties b/gradle.properties
deleted file mode 100644
index 74fe8a009e..0000000000
--- a/gradle.properties
+++ /dev/null
@@ -1,2 +0,0 @@
-org.gradle.jvmargs=-Xmx4g -XX:+HeapDumpOnOutOfMemoryError
-kotlin.daemon.jvmargs=-Xmx4g
diff --git a/gradle/wrapper/gradle-wrapper.jar b/gradle/wrapper/gradle-wrapper.jar
deleted file mode 100644
index 8bdaf60c75ab801e22807dde59e12a8735a34077..0000000000000000000000000000000000000000
GIT binary patch
literal 0
HcmV?d00001
literal 45457
zz=XEG0s5Yp`mlmUad)8!(QDgIzY=OK%_hhPStbyYYd|~zDIc3J4 zy9y%wZOW>}eG4&&;Z>vj&Mjg+>4gL! z(@oCTFf-I^54t=*4AhKRoE-0Ky=qg3XK2Mu!Bmw@z>y(|a#(6PcfbVTw-dUqyx4x4 z3O#+hW1ANwSv-U+9otHE#U9T>(nWx>^7RO_aI>${jvfZQ{mUwiaxHau!H z0Nc}ucJu+bKux?l!dQ2QA(r@(5KZl(Or=U!=2K*8?D=ZT-IAcAX!5OI3w@`sF@$($ zbDk0p&3X0P%B0aKdijO|s})70K&mk1DC|P##b=k@fcJ|lo@JNWRUc>KL?6dJpvtSUK zxR|w8Bo6K&y~Bd}gvuz*3z z@sPJr{(!?mi@okhudaM{t3gp9TJ!|@j4eO1C&=@h#|QLCUKLaKVL z!lls$%N&ZG7yO#jK?U>bJ+^F@K#A4d&Jz4boGmptagnK!Qu{Ob>%+60xRYK>iffd_ z>6%0K)p!VwP$^@Apm%NrS6TpKJwj_Q=k~?4=_*NIe~eh_QtRaqX4t-rJAGYdB{pGq zSXX)-dR8mQ)X|;8@_=J6Dk7MfMp;x)^aZeCtScHs12t3vL+p-6!qhPkOM1OYQ z8YXW5tWp)Th(+$m7SnV_hNGKAP`JF4URkkNc@YV9}FK$9k zR&qgi$Cj#4bC1VK%#U)f%(+oQJ+EqvV{uAq1YG0riLvGxW@)m;*ayU-BSW61COFy0 z(-l>GJqYl;*x1PnRZ(p3Lm}* zlkpWyCoYtg9pAZ5RU^%w=vN{3Y<6WImxj(*SCcJsFj?o6CZ~>cWW^foliM#qN#We{ zwsL!u1$rzC1#4~bILZm*a!T{^kCci$XOJADm)P;y^%x5)#G#_!2uNp^S;cE`*ASCn;}H7pP^RRA z6lfXK(r4dy<_}R|(7%Lyo>QFP#s31E8zsYA${gSUykUV@?lyDNF=KhTeF^*lu7C*{ zBCIjy;bIE;9inJ$IT8_jL%)Q{7itmncYlkf2`lHl(gTwD%LmEPo^gskydVxMd~Do` zO8EzF!yn!r|BEgPjhW#>g(unY#n}=#4J;3FD2ThN5LpO0tI2~pqICaFAGT%%;3Xx$ z>~Ng(64xH-RV^Rj4=A_q1Ee8kcF}8HN{5kjYX0ADh}jq{q18x(pV!23pVsK5S}{M#p8|+LvfKx|_3;9{+6cu7%5o-+R@z>TlTft#kcJ`s2-j zUe4dgpInZU!<}aTGuwgdWJZ#8TPiV9QW<-o!ibBn&)?!ZDomECehvT7GSCRyF#VN2&5GShch9*}4p;8TX~cW*<#( zv-HmU7&+YUWO__NN3UbTFJ&^#3vxW4U9q5=&ORa+2M$4rskA4xV$rFSEYBGy55b{z z!)$_fYXiY?-GWDhGZXgTw}#ilrw=BiN(DGO*W7Vw(} zjUexksYLt_Nq?pl_nVa@c1W#edQKbT>VSN1NK?DulHkFpI-LXl7{;dl@z0#v?x%U& z8k8M1X6%TwR4BQ_eEWJASvMTy?@fQubBU__A_US567I-~;_VcX^NJ-E(ZPR^NASj1 zVP!LIf8QKtcdeH#w6ak50At)e={eF_Ns6J2Iko6dn8Qwa6!NQHZMGsD zhzWeSFK<{hJV*!cIHxjgR+e#lkUHCss-j)$g zF}DyS531TUXKPPIoePo{yH%qEr-dLMOhv^sC&@9YI~uvl?rBp^A-57{aH_wLg0&a|UxKLlYZQ24fpb24Qjil`4OCyt0<1eu>5i1Acv zaZtQRF)Q;?Aw3idg;8Yg9Cb#)03?pQ@O*bCloG zC^|TnJl`GXN*8iI;Ql&_QIY0ik}rqB;cNZ-qagp=qmci9eScHsRXG$zRNdf4SleJ} z7||<#PCW~0>3u8PP=-DjNhD(^(B0AFF+(oKOiQyO5#v4nI|v_D5@c2;zE`}DK!%;H zUn|IZ6P;rl*5`E(srr6@-hpae!jW=-G 
zC<*R?RLwL;#+hxN4fJ!oP4fX`vC3&)o!#l4y@MrmbmL{t;VP%7tMA-&vju_L zhtHbOL4`O;h*5^e3F{b9(mDwY6JwL8w`oi28xOyj`pVo!75hngQDNg7^D$h4t&1p2 ziWD_!ap3GM(S)?@UwWk=Szym^eDxSx3NaR}+l1~(@0car6tfP#sZRTb~w!WAS{+|SgUN3Tv`J4OMf z9ta_f>-`!`I@KA=CXj_J>CE7T`yGmej0}61sE(%nZa1WC_tV6odiysHA5gzfWN-`uXF46mhJGLpvNTBmx$!i zF67bAz~E|P{L6t1B+K|Cutp&h$fDjyq9JFy$7c_tB(Q$sR)#iMQH3{Og1AyD^lyQwX6#B|*ecl{-_;*B>~WSFInaRE_q6 zpK#uCprrCb`MU^AGddA#SS{P7-OS9h%+1`~9v-s^{s8faWNpt*Pmk_ECjt(wrpr{C_xdAqR(@!ERTSs@F%^DkE@No}wqol~pS^e7>ksF_NhL0?6R4g`P- zk8lMrVir~b(KY+hk5LQngwm`ZQT5t1^7AzHB2My6o)_ejR0{VxU<*r-Gld`l6tfA` zKoj%x9=>Ce|1R|1*aC}|F0R32^KMLAHN}MA<8NNaZ^j?HKxSwxz`N2hK8lEb{jE0& zg4G_6F@#NyDN?=i@=)eidKhlg!nQoA{`PgaH{;t|M#5z}a`u?^gy{5L~I2smLR z*4RmNxHqf9>D>sXSemHK!h4uPwMRb+W`6F>Q6j@isZ>-F=)B2*sTCD9A^jjUy)hjAw71B&$u}R(^R; zY9H3k8$|ounk>)EOi_;JAKV8U8ICSD@NrqB!&=)Ah_5hzp?L9Sw@c>>#f_kUhhm=p z1jRz8X7)~|VwO(MF3PS(|CL++1n|KT3*dhGjg!t_vR|8Yg($ z+$S$K=J`K6eG#^(J54=4&X#+7Car=_aeAuC>dHE+%v9HFu>r%ry|rwkrO-XPhR_#K zS{2Unv!_CvS7}Mb6IIT$D4Gq5v$Pvi5nbYB+1Yc&RY;3;XDihlvhhIG6AhAHsBYsm zK@MgSzs~y|+f|j-lsXKT0(%E2SkEb)p+|EkV5w8=F^!r1&0#0^tGhf9yPZ)iLJ^ zIXOg)HW_Vt{|r0W(`NmMLF$?3ZQpq+^OtjR-DaVLHpz%1+GZ7QGFA?(BIqBlVQ;)k zu)oO|KG&++gD9oL7aK4Zwjwi~5jqk6+w%{T$1`2>3Znh=OFg|kZ z>1cn>CZ>P|iQO%-Pic8wE9c*e%=3qNYKJ+z1{2=QHHFe=u3rqCWNhV_N*qzneN8A5 zj`1Ir7-5`33rjDmyIGvTx4K3qsks(I(;Kgmn%p#p3K zn8r9H8kQu+n@D$<#RZtmp$*T4B&QvT{K&qx(?>t@mX%3Lh}sr?gI#vNi=vV5d(D<=Cp5-y!a{~&y|Uz*PU{qe zI7g}mt!txT)U(q<+Xg_sSY%1wVHy;Dv3uze zJ>BIdSB2a|aK+?o63lR8QZhhP)KyQvV`J3)5q^j1-G}fq=E4&){*&hiam>ssYm!ya z#PsY0F}vT#twY1mXkGYmdd%_Uh12x0*6lN-HS-&5XWbJ^%su)-vffvKZ%rvLHVA<; zJP=h13;x?$v30`T)M)htph`=if#r#O5iC^ZHeXc6J8gewn zL!49!)>3I-q6XOZRG0=zjyQc`tl|RFCR}f-sNtc)I^~?Vv2t7tZZHvgU2Mfc9$LqG z!(iz&xb=q#4otDBO4p)KtEq}8NaIVcL3&pbvm@0Kk-~C@y3I{K61VDF_=}c`VN)3P z+{nBy^;=1N`A=xH$01dPesY_na*zrcnssA}Ix60C=sWg9EY=2>-yH&iqhhm28qq9Z z;}znS4ktr40Lf~G@6D5QxW&?q^R|=1+h!1%G4LhQs54c2Wo~4% zCA||d==lv2bP=9%hd0Dw_a$cz9kk)(Vo}NpSPx!vnV*0Bh9$CYP~ia#lEoLRJ8D#5 
zSJS?}ABn1LX>8(Mfg&eefX*c0I5bf4<`gCy6VC{e>$&BbwFSJ0CgVa;0-U7=F81R+ zUmzz&c;H|%G&mSQ0K16Vosh?sjJW(Gp+1Yw+Yf4qOi|BFVbMrdO6~-U8Hr|L@LHeZ z0ALmXHsVm137&xnt#yYF$H%&AU!lf{W436Wq87nC16b%)p?r z70Wua59%7Quak50G7m3lOjtvcS>5}YL_~?Pti_pfAfQ!OxkX$arHRg|VrNx>R_Xyi z`N|Y7KV`z3(ZB2wT9{Dl8mtl zg^UOBv~k>Z(E)O>Z;~Z)W&4FhzwiPjUHE9&T#nlM)@hvAZL>cha-< zQ8_RL#P1?&2Qhk#c9fK9+xM#AneqzE-g(>chLp_Q2Xh$=MAsW z2ScEKr+YOD*R~mzy{bOJjs;X2y1}DVFZi7d_df^~((5a2%p%^4cf>vM_4Sn@@ssVJ z9ChGhs zbanJ+h74)3tWOviXI|v!=HU2mE%3Th$Mpx&lEeGFEBWRy8ogJY`BCXj@7s~bjrOY! z4nIU5S>_NrpN}|waZBC)$6ST8x91U2n?FGV8lS{&LFhHbuHU?SVU{p7yFSP_f#Eyh zJhI@o9lAeEwbZYC=~<(FZ$sJx^6j@gtl{yTOAz`Gj!Ab^y})eG&`Qt2cXdog2^~oOH^K@oHcE(L;wu2QiMv zJuGdhNd+H{t#Tjd<$PknMSfbI>L1YIdZ+uFf*Z=BEM)UPG3oDFe@8roB0h(*XAqRc zoxw`wQD@^nxGFxQXN9@GpkLqd?9@(_ZRS@EFRCO8J5{iuNAQO=!Lo5cCsPtt4=1qZN8z`EA2{ge@SjTyhiJE%ttk{~`SEl%5>s=9E~dUW0uws>&~3PwXJ!f>ShhP~U9dLvE8ElNt3g(6-d zdgtD;rgd^>1URef?*=8BkE&+HmzXD-4w61(p6o~Oxm`XexcHmnR*B~5a|u-Qz$2lf zXc$p91T~E4psJxhf^rdR!b_XmNv*?}!PK9@-asDTaen;p{Rxsa=1E}4kZ*}yQPoT0 zvM}t!CpJvk<`m~^$^1C^o1yM(BzY-Wz2q7C^+wfg-?}1bF?5Hk?S{^#U%wX4&lv0j zkNb)byI+nql(&65xV?_L<0tj!KMHX8Hmh2(udEG>@OPQ}KPtdwEuEb$?acp~yT1&r z|7YU<(v!0as6Xff5^XbKQIR&MpjSE)pmub+ECMZzn7c!|hnm_Rl&H_oXWU2!h7hhf zo&-@cLkZr#eNgUN9>b=QLE1V^b`($EX3RQIyg#45A^=G!jMY`qJ z8qjZ$*-V|?y0=zIM>!2q!Gi*t4J5Otr^OT3XzQ_GjATc(*eM zqllux#QtHhc>YtnswBNiS^t(dTDn|RYSI%i%-|sv1wh&|9jfeyx|IHowW)6uZWR<%n8I}6NidBm zJ>P7#5m`gnXLu;?7jQZ!PwA80d|AS*+mtrU6z+lzms6^vc4)6Zf+$l+Lk3AsEK7`_ zQ9LsS!2o#-pK+V`g#3hC$6*Z~PD%cwtOT8;7K3O=gHdC=WLK-i_DjPO#WN__#YLX|Akw3LnqUJUw8&7pUR;K zqJ98?rKMXE(tnmT`#080w%l1bGno7wXHQbl?QFU=GoK@d!Ov=IgsdHd-iIs4ahcgSj(L@F96=LKZ zeb5cJOVlcKBudawbz~AYk@!^p+E=dT^UhPE`96Q5J~cT-8^tp`J43nLbFD*Nf!w;6 zs>V!5#;?bwYflf0HtFvX_6_jh4GEpa0_s8UUe02@%$w^ym&%wI5_APD?9S4r9O@4m zq^Z5Br8#K)y@z*fo08@XCs;wKBydn+60ks4Z>_+PFD+PVTGNPFPg-V-|``!0l|XrTyUYA@mY?#bJYvD>jX&$o9VAbo?>?#Z^c+Y4Dl zXU9k`s74Sb$OYh7^B|SAVVz*jEW&GWG^cP<_!hW+#Qp|4791Od=HJcesFo?$#0eWD 
z8!Ib_>H1WQE}shsQiUNk!uWOyAzX>r(-N7;+(O333_ES7*^6z4{`p&O*q8xk{0xy@ zB&9LkW_B}_Y&?pXP-OYNJfqEWUVAPBk)pTP^;f+75Wa(W>^UO_*J05f1k{ zd-}j!4m@q#CaC6mLsQHD1&7{tJ*}LtE{g9LB>sIT7)l^ucm8&+L0=g1E_6#KHfS>A_Z?;pFP96*nX=1&ejZ+XvZ=ML`@oVu>s^WIjn^SY}n zboeP%`O9|dhzvnw%?wAsCw*lvVcv%bmO5M4cas>b%FHd;A6Z%Ej%;jgPuvL$nk=VQ=$-OTwslYg zJQtDS)|qkIs%)K$+r*_NTke8%Rv&w^v;|Ajh5QXaVh}ugccP}3E^(oGC5VO*4`&Q0 z&)z$6i_aKI*CqVBglCxo#9>eOkDD!voCJRFkNolvA2N&SAp^4<8{Y;#Kr5740 za|G`dYGE!9NGU3Ge6C)YByb6Wy#}EN`Ao#R!$LQ&SM#hifEvZp>1PAX{CSLqD4IuO z4#N4AjMj5t2|!yTMrl5r)`_{V6DlqVeTwo|tq4MHLZdZc5;=v9*ibc;IGYh+G|~PB zx2}BAv6p$}?7YpvhqHu7L;~)~Oe^Y)O(G(PJQB<&2AhwMw!(2#AHhjSsBYUd8MDeM z+UXXyV@@cQ`w}mJ2PGs>=jHE{%i44QsPPh(=yorg>jHic+K+S*q3{th6Ik^j=@%xo zXfa9L_<|xTL@UZ?4H`$vt9MOF`|*z&)!mECiuenMW`Eo2VE#|2>2ET7th6+VAmU(o zq$Fz^TUB*@a<}kr6I>r;6`l%8NWtVtkE?}Q<<$BIm*6Z(1EhDtA29O%5d1$0q#C&f zFhFrrss{hOsISjYGDOP*)j&zZUf9`xvR8G)gwxE$HtmKsezo`{Ta~V5u+J&Tg+{bh zhLlNbdzJNF6m$wZNblWNbP6>dTWhngsu=J{);9D|PPJ96aqM4Lc?&6H-J1W15uIpQ ziO{&pEc2}-cqw+)w$`p(k(_yRpmbp-Xcd`*;Y$X=o(v2K+ISW)B1(ZnkV`g4rHQ=s z+J?F9&(||&86pi}snC07Lxi1ja>6kvnut;|Ql3fD)%k+ASe^S|lN69+Ek3UwsSx=2EH)t}K>~ z`Mz-SSVH29@DWyl`ChuGAkG>J;>8ZmLhm>uEmUvLqar~vK3lS;4s<{+ehMsFXM(l- zRt=HT>h9G)JS*&(dbXrM&z;)66C=o{=+^}ciyt8|@e$Y}IREAyd_!2|CqTg=eu}yG z@sI9T;Tjix*%v)c{4G84|0j@8wX^Iig_JsPU|T%(J&KtJ>V zsAR+dcmyT5k&&G{!)VXN`oRS{n;3qd`BgAE9r?%AHy_Gf8>$&X$=>YD7M911?<{qX zkJ;IOfY$nHdy@kKk_+X%g3`T(v|jS;>`pz`?>fqMZ>Fvbx1W=8nvtuve&y`JBfvU~ zr+5pF!`$`TUVsx3^<)48&+XT92U0DS|^X6FwSa-8yviRkZ*@Wu|c*lX!m?8&$0~4T!DB0@)n}ey+ew}T1U>|fH3=W5I!=nfoNs~OkzTY7^x^G&h>M7ewZqmZ=EL0}3#ikWg+(wuoA{7hm|7eJz zNz78l-K81tP16rai+fvXtspOhN-%*RY3IzMX6~8k9oFlXWgICx9dp;`)?Toz`fxV@&m8< z{lzWJG_Y(N1nOox>yG^uDr}kDX_f`lMbtxfP`VD@l$HR*B(sDeE(+T831V-3d3$+% zDKzKnK_W(gLwAK{Saa2}zaV?1QmcuhDu$)#;*4gU(l&rgNXB^WcMuuTki*rt>|M)D zoI;l$FTWIUp}euuZjDidpVw6AS-3dal2TJJaVMGj#CROWr|;^?q>PAo2k^u-27t~v zCv10IL~E)o*|QgdM!GJTaT&|A?oW)m9qk2{=y*7qb@BIAlYgDIe)k(qVH@)#xx6%7 z@)l%aJwz5Joc84Q2jRp71d;=a@NkjSdMyN%L6OevML^(L0_msbef>ewImS=+DgrTk 
z4ON%Y$mYgcZ^44O*;ctP>_7=}=pslsu>~<-bw=C(jeQ-X`kUo^BS&JDHy%#L32Cj_ zXRzDCfCXKXxGSW9yOGMMOYqPKnU zTF6gDj47!7PoL%z?*{1eyc2IVF*RXX?mj1RS}++hZg_%b@6&PdO)VzvmkXxJ*O7H} z6I7XmJqwX3<>z%M@W|GD%(X|VOZ7A+=@~MxMt8zhDw`yz?V>H%C0&VY+ZZ>9AoDVZeO1c~z$r~!H zA`N_9p`X?z>jm!-leBjW1R13_i2(0&aEY2$l_+-n#powuRO;n2Fr#%jp{+3@`h$c< zcFMr;18Z`UN#spXv+3Ks_V_tSZ1!FY7H(tdAk!v}SkoL9RPYSD3O5w>A3%>7J+C-R zZfDmu=9<1w1CV8rCMEm{qyErCUaA3Q zRYYw_z!W7UDEK)8DF}la9`}8z*?N32-6c-Bwx^Jf#Muwc67sVW24 zJ4nab%>_EM8wPhL=MAN)xx1tozAl zmhXN;*-X%)s>(L=Q@vm$qmuScku>PV(W_x-6E?SFRjSk)A1xVqnml_92fbj0m};UC zcV}lRW-r*wY106|sshV`n#RN{)D9=!>XVH0vMh>od=9!1(U+sWF%#B|eeaKI9RpaW z8Ol_wAJX%j0h5fkvF)WMZ1}?#R(n-OT0CtwsL)|qk;*(!a)5a5ku2nCR9=E*iOZ`9 zy4>LHKt-BgHL@R9CBSG!v4wK zvjF8DORRva)@>nshE~VM@i2c$PKw?3nz(6-iVde;-S~~7R<5r2t$0U8k2_<5C0!$j zQg#lsRYtI#Q1YRs(-%(;F-K7oY~!m&zhuU4LL}>jbLC>B`tk8onRRcmIm{{0cpkD|o@Ixu#x9Wm5J)3oFkbfi62BX8IX1}VTe#{C(d@H|#gy5#Sa#t>sH@8v1h8XFgNGs?)tyF_S^ueJX_-1%+LR`1X@C zS3Oc)o)!8Z9!u9d!35YD^!aXtH;IMNzPp`NS|EcdaQw~<;z`lmkg zE|tQRF7!S!UCsbag%XlQZXmzAOSs= zIUjgY2jcN9`xA6mzG{m|Zw=3kZC4@XY=Bj%k8%D&iadvne$pYNfZI$^2BAB|-MnZW zU4U?*qE3`ZDx-bH})>wz~)a z_SWM!E=-BS#wdrfh;EfPNOS*9!;*+wp-zDthj<>P0a2n?$xfe;YmX~5a;(mNV5nKx zYR86%WtAPsOMIg&*o9uUfD!v&4(mpS6P`bFohPP<&^fZzfA|SvVzPQgbtwwM>IO>Z z75ejU$1_SB1tn!Y-9tajZ~F=Fa~{cnj%Y|$;%z6fJV1XC0080f)Pj|87j142q6`i>#)BCIi+x&jAH9|H#iMvS~?w;&E`y zoarJ)+5HWmZ{&OqlzbdQU=SE3GKmnQq zI{h6f$C@}Mbqf#JDsJyi&7M0O2ORXtEB`#cZ;#AcB zkao0`&|iH8XKvZ_RH|VaK@tAGKMq9x{sdd%p-o`!cJzmd&hb86N!KKxp($2G?#(#BJn5%hF0(^`= z2qRg5?82({w-HyjbffI>eqUXavp&|D8(I6zMOfM}0;h%*D_Dr@+%TaWpIEQX3*$vQ z8_)wkNMDi{rW`L+`yN^J*Gt(l7PExu3_hrntgbW0s}7m~1K=(mFymoU87#{|t*fJ?w8&>Uh zcS$Ny$HNRbT!UCFldTSp2*;%EoW+yhJD8<3FUt8@XSBeJM2dSEz+5}BWmBvdYK(OA zlm`nDDsjKED{$v*jl(&)H7-+*#jWI)W|_X)!em1qpjS_CBbAiyMt;tx*+0P%*m&v< zxV9rlslu8#cS!of#^1O$(ds8aviMFiT`6W+FzMHW{YS+SieJ^?TQb%NT&pasw^kbc znd`=%(bebvrNx3#7vq@vAX-G`4|>cY0svIXopH02{v;GZ{wJM#psz4!m8(IZu<)9D zqR~U7@cz-6H{724_*}-DWwE8Sk+dYBb*O-=c 
z+wdchFcm6$$^Z0_qGnv0P`)h1=D$_eg8!2-|7Y;o*c)4ax!Me0*EVcioh{wI#!qcb z1&xhOotXMrlo7P6{+C8m;E#4*=8(2y!r0d<6 zKi$d2X;O*zS(&Xiz_?|`ympxITf|&M%^WHp=694g6W@k+BL_T1JtSYX0OZ}o%?Pzu zJ{%P8A$uq?4F!NWGtq>_GLK3*c6dIcGH)??L`9Av&0k$A*14ED9!e9z_SZd3OH6ER zg%5^)3^gw;4DFw(RC;~r`bPJOR}H}?2n60=g4ESUTud$bkBLPyI#4#Ye{5x3@Yw<* z;P5Up>Yn(QdP#momCf=kOzZYzg9E330=67WOPbCMm2-T1%8{=or9L8+HGL{%83lri zODB;Y|LS`@mn#Wmez7t6-x`a2{}U9hE|xY7|BVcFCqoAZQzsEi=dYHB z(bqG3J5?teVSBqTj{aiqe<9}}CEc$HdsJSMp#I;4(EXRy_k|Y8X#5hwkqAaIGKARF zX?$|UO{>3-FU;IlFi80O^t+WMNw4So2nsg}^T1`-Ox&C%Gn_AZ-49Nir=2oYX6 z`uVke@L5PVh)YsvAgFMZfKi{DuSgWnlAaag{RN6t6oLm6{4)H~4xg#Xfcq-e@ALk& z@UP4;uCe(Yjg4jaJZ4pu*+*?4#+XCi%sTrqaT*jNY7|WQ!oR;S8nt)cI27W$Sz!94 z01zoTW`C*P3E?1@6thPe(QpIue$A54gp#C7pmfwRj}GxIw$!!qQetn`nvuwIvMBQ; zfF8K-D~O4aJKmLbNRN1?AZsWY&rp?iy`LP^3KT0UcGNy=Z@7qVM(#5u#Du#w>a&Bs z@f#zU{wk&5n!YF%D11S9*CyaI8%^oX=vq$Ei9cL1&kvv9|8vZD;Mhs1&slm`$A%ED zvz6SQ8aty~`IYp2Xd~G$z%Jf4zwVPKkCtqObrnc2gHKj^jg&-NH|xdNK_;+2d4ZXw zN9j)`jcp7y65&6P@}LsD_OLSi(#GW#hC*qF5KpmeXuQDNS%ZYpuW<;JI<>P6ln!p@ z>KPAM>8^cX|2!n@tV=P)f2Euv?!}UM`^RJ~nTT@W>KC2{{}xXS{}WH{|3najkiEUj z7l;fUWDPCtzQ$?(f)6RvzW~Tqan$bXibe%dv}**BqY!d4J?`1iX`-iy8nPo$s4^mQ z5+@=3xuZAl#KoDF*%>bJ4UrEB2EE8m7sQn!r7Z-ggig`?yy`p~3;&NFukc$`_>?}a z?LMo2LV^n>m!fv^HKKRrDn|2|zk?~S6i|xOHt%K(*TGWkq3{~|9+(G3M-L=;U-YRa zp{kIXZ8P!koE;BN2A;nBx!={yg4v=-xGOMC#~MA07zfR)yZtSF_2W^pDLcXg->*WD zY7Sz5%<_k+lbS^`y)=vX|KaN!gEMQob|(`%nP6huwr$%^?%0^vwr$(CZQD*Jc5?E( zb-q9E`OfoWSJ$rUs$ILfSFg3Mb*-!Ozgaz^%7ZkX@=3km0G;?+e?FQT_l5A9vKr<> z_CoemDo@6YIyl57l*gnJ^7+8xLW5oEGzjLv2P8vj*Q%O1^KOfrsC6eHvk{+$BMLGu z%goP8UY?J7Lj=@jcI$4{m2Sw?1E%_0C7M$lj}w{E#hM4%3QX|;tH6>RJf-TI_1A0w z@KcTEFx(@uitbo?UMMqUaSgt=n`Bu*;$4@cbg9JIS})3#2T;B7S

Z?HZkSa`=MM?n)?|XcM)@e1qmzJ$_4K^?-``~Oi&38`2}sjmP?kK z$yT)K(UU3fJID@~3R;)fU%k%9*4f>oq`y>#t90$(y*sZTzWcW$H=Xv|%^u^?2*n)Csx;35O0v7Nab-REgxDZNf5`cI69k$` zx(&pP6zVxlK5Apn5hAhui}b)(IwZD}D?&)_{_yTL7QgTxL|_X!o@A`)P#!%t9al+# zLD(Rr+?HHJEOl545~m1)cwawqY>cf~9hu-L`crI^5p~-9Mgp9{U5V&dJSwolnl_CM zwAMM1Tl$D@>v?LN2PLe0IZrQL1M zcA%i@Lc)URretFJhtw7IaZXYC6#8slg|*HfUF2Z5{3R_tw)YQ94=dprT`SFAvHB+7 z)-Hd1yE8LB1S+4H7iy$5XruPxq6pc_V)+VO{seA8^`o5{T5s<8bJ`>I3&m%R4cm1S z`hoNk%_=KU2;+#$Y!x7L%|;!Nxbu~TKw?zSP(?H0_b8Qqj4EPrb@~IE`~^#~C%D9k zvJ=ERh`xLgUwvusQbo6S=I5T+?lITYsVyeCCwT9R>DwQa&$e(PxF<}RpLD9Vm2vV# zI#M%ksVNFG1U?;QR{Kx2sf>@y$7sop6SOnBC4sv8S0-`gEt0eHJ{`QSW(_06Uwg*~ zIw}1dZ9c=K$a$N?;j`s3>)AqC$`ld?bOs^^stmYmsWA$XEVhUtGlx&OyziN1~2 z)s5fD(d@gq7htIGX!GCxKT=8aAOHW&DAP=$MpZ)SpeEZhk83}K) z0(Uv)+&pE?|4)D2PX4r6gOGHDY}$8FSg$3eDb*nEVmkFQ#lFpcH~IPeatiH3nPTkP z*xDN7l}r2GM9jwSsl=*!547nRPCS0pb;uE#myTqV+=se>bU=#e)f2}wCp%f-cIrh`FHA$2`monVy?qvJ~o2B6I7IE28bCY4=c#^){*essLG zXUH50W&SWmi{RIG9G^p;PohSPtC}djjXSoC)kyA8`o+L}SjE{i?%;Vh=h;QC{s`T7 zLmmHCr8F}#^O8_~lR)^clv$mMe`e*{MW#Sxd`rDckCnFBo9sC*vw2)dA9Q3lUi*Fy zgDsLt`xt|7G=O6+ms=`_FpD4}37uvelFLc^?snyNUNxbdSj2+Mpv<67NR{(mdtSDNJ3gSD@>gX_7S5 zCD)JP5Hnv!llc-9fwG=4@?=%qu~(4j>YXtgz%gZ#+A9i^H!_R!MxWlFsH(ClP3dU} za&`m(cM0xebj&S170&KLU%39I+XVWOJ_1XpF^ip}3|y()Fn5P@$pP5rvtiEK6w&+w z7uqIxZUj$#qN|<_LFhE@@SAdBy8)xTu>>`xC>VYU@d}E)^sb9k0}YKr=B8-5M?3}d z7&LqQWQ`a&=ihhANxe3^YT>yj&72x#X4NXRTc#+sk;K z=VUp#I(YIRO`g7#;5))p=y=MQ54JWeS(A^$qt>Y#unGRT$0BG=rI(tr>YqSxNm+-x z6n;-y8B>#FnhZX#mhVOT30baJ{47E^j-I6EOp;am;FvTlYRR2_?CjCWY+ypoUD-2S zqnFH6FS+q$H$^7>>(nd^WE+?Zn#@HU3#t|&=JnEDgIU+;CgS+krs+Y8vMo6U zHVkPoReZ-Di3z!xdBu#aW1f{8sC)etjN90`2|Y@{2=Os`(XLL9+ z1$_PE$GgTQrVx`^sx=Y(_y-SvquMF5<`9C=vM52+e+-r=g?D z+E|97MyoaK5M^n1(mnWeBpgtMs8fXOu4Q$89C5q4@YY0H{N47VANA1}M2e zspor6LdndC=kEvxs3YrPGbc;`q}|zeg`f;t3-8na)dGdZ9&d(n{|%mNaHaKJOA~@8 zgP?nkzV-=ULb)L3r`p)vj4<702a5h~Y%byo4)lh?rtu1YXYOY+qyTwzs!59I zL}XLe=q$e<+Wm7tvB$n88#a9LzBkgHhfT<&i#%e*y|}@I 
z!N~_)vodngB7%CI2pJT*{GX|cI5y>ZBN)}mezK~fFv@$*L`84rb0)V=PvQ2KN}3lTpT@$>a=CP?kcC0S_^PZ#Vd9#CF4 zP&`6{Y!hd^qmL!zr#F~FB0yag-V;qrmW9Jnq~-l>Sg$b%%TpO}{Q+*Pd-@n2suVh_ zSYP->P@# z&gQ^f{?}m(u5B9xqo63pUvDsJDQJi5B~ak+J{tX8$oL!_{Dh zL@=XFzWb+83H3wPbTic+osVp&~UoW3SqK0#P6+BKbOzK65tz)-@AW#g}Ew+pE3@ zVbdJkJ}EM@-Ghxp_4a)|asEk* z5)mMI&EK~BI^aaTMRl)oPJRH^Ld{;1FC&#pS`gh;l3Y;DF*`pR%OSz8U@B@zJxPNX zwyP_&8GsQ7^eYyUO3FEE|9~I~X8;{WTN=DJW0$2OH=3-!KZG=X6TH?>URr(A0l@+d zj^B9G-ACel;yYGZc}G`w9sR$Mo{tzE7&%XKuW$|u7DM<6_z}L>I{o`(=!*1 z{5?1p3F^aBONr6Ws!6@G?XRxJxXt_6b}2%Bp=0Iv5ngnpU^P+?(?O0hKwAK z*|wAisG&8&Td1XY+6qI~-5&+4DE2p|Dj8@do;!40o)F)QuoeUY;*I&QZ0*4?u)$s`VTkNl1WG`}g@J_i zjjmv4L%g&>@U9_|l>8^CN}`@4<D2aMN&?XXD-HNnsVM`irjv$ z^YVNUx3r1{-o6waQfDp=OG^P+vd;qEvd{UUYc;gF0UwaeacXkw32He^qyoYHjZeFS zo(#C9#&NEdFRcFrj7Q{CJgbmDejNS!H%aF6?;|KJQn_*Ps3pkq9yE~G{0wIS*mo0XIEYH zzIiJ>rbmD;sGXt#jlx7AXSGGcjty)5z5lTGp|M#5DCl0q0|~pNQ%1dP!-1>_7^BA~ zwu+uumJmTCcd)r|Hc)uWm7S!+Dw4;E|5+bwPb4i17Ued>NklnnsG+A{T-&}0=sLM- zY;sA9v@YH>b9#c$Vg{j@+>UULBX=jtu~N^%Y#BB5)pB|$?0Mf7msMD<7eACoP1(XY zPO^h5Brvhn$%(0JSo3KFwEPV&dz8(P41o=mo7G~A*P6wLJ@-#|_A z7>k~4&lbqyP1!la!qmhFBfIfT?nIHQ0j2WlohXk^sZ`?8-vwEwV0~uu{RDE^0yfl$ znua{^`VTZ)-h#ch_6^e2{VPaE@o&55|3dx$z_b6gbqduXJ(Lz(zq&ZbJ6qA4Ac4RT zhJO4KBLN!t;h(eW(?cZJw^swf8lP@tWMZ8GD)zg)siA3!2EJYI(j>WI$=pK!mo!Ry z?q&YkTIbTTr<>=}+N8C_EAR0XQL2&O{nNAXb?33iwo8{M``rUHJgnk z8KgZzZLFf|(O6oeugsm<;5m~4N$2Jm5#dph*@TgXC2_k&d%TG0LPY=Fw)=gf(hy9QmY*D6jCAiq44 zo-k2C+?3*+Wu7xm1w*LEAl`Vsq(sYPUMw|MiXrW)92>rVOAse5Pmx^OSi{y%EwPAE zx|csvE{U3c{vA>@;>xcjdCW15pE31F3aoIBsz@OQRvi%_MMfgar2j3Ob`9e@gLQk# zlzznEHgr|Ols%f*a+B-0klD`czi@RWGPPpR1tE@GB|nwe`td1OwG#OjGlTH zfT#^r?%3Ocp^U0F8Kekck6-Vg2gWs|sD_DTJ%2TR<5H3a$}B4ZYpP=p)oAoHxr8I! 
z1SYJ~v-iP&mNm{ra7!KP^KVpkER>-HFvq*>eG4J#kz1|eu;=~u2|>}TE_5nv2=d!0 z3P~?@blSo^uumuEt{lBsGcx{_IXPO8s01+7DP^yt&>k;<5(NRrF|To2h7hTWBFQ_A z+;?Q$o5L|LlIB>PH(4j)j3`JIb1xA_C@HRFnPnlg{zGO|-RO7Xn}!*2U=Z2V?{5Al z9+iL+n^_T~6Uu{law`R&fFadSVi}da8G>|>D<{(#vi{OU;}1ZnfXy8=etC7)Ae<2S zAlI`&=HkNiHhT0|tQztSLNsRR6v8bmf&$6CI|7b8V4kyJ{=pG#h{1sVeC28&Ho%Fh zwo_FIS}ST-2OF6jNQ$(pjrq)P)@sie#tigN1zSclxJLb-O9V|trp^G8<1rpsj8@+$ z2y27iiM>H8kfd%AMlK|9C>Lkvfs9iSk>k2}tCFlqF~Z_>-uWVQDd$5{3sM%2$du9; z*ukNSo}~@w@DPF)_vS^VaZ)7Mk&8ijX2hNhKom$#PM%bzSA-s$ z0O!broj`!Nuk)Qcp3(>dL|5om#XMx2RUSDMDY9#1|+~fxwP}1I4iYy4j$CGx3jD&eKhf%z`Jn z7mD!y6`nVq%&Q#5yqG`|+e~1$Zkgu!O(~~pWSDTw2^va3u!DOMVRQ8ycq)sk&H%vb z;$a`3gp74~I@swI!ILOkzVK3G&SdTcVe~RzN<+z`u(BY=yuwez{#T3a_83)8>2!X?`^02zVjqx-fN+tW`zCqH^XG>#Ies$qxa!n4*FF0m zxgJlPPYl*q4ylX;DVu3G*I6T&JyWvs`A(*u0+62=+ylt2!u)6LJ=Qe1rA$OWcNCmH zLu7PwMDY#rYQA1!!ONNcz~I^uMvi6N&Lo4dD&HF?1Su5}COTZ-jwR)-zLq=6@bN}X zSP(-MY`TOJ@1O`bLPphMMSWm+YL{Ger>cA$KT~)DuTl+H)!2Lf`c+lZ0ipxd>KfKn zIv;;eEmz(_(nwW24a+>v{K}$)A?=tp+?>zAmfL{}@0r|1>iFQfJ5C*6dKdijK=j16 zQpl4gl93ttF5@d<9e2LoZ~cqkH)aFMgt(el_)#OG4R4Hnqm(@D*Uj>2ZuUCy)o-yy z_J|&S-@o5#2IMcL(}qWF3EL<4n(`cygenA)G%Ssi7k4w)LafelpV5FvS9uJES+(Ml z?rzZ={vYrB#mB-Hd#ID{KS5dKl-|Wh_~v+Lvq3|<@w^MD-RA{q!$gkUUNIvAaex5y z)jIGW{#U=#UWyku7FIAB=TES8>L%Y9*h2N`#Gghie+a?>$CRNth?ORq)!Tde24f5K zKh>cz5oLC;ry*tHIEQEL>8L=zsjG7+(~LUN5K1pT`_Z-4Z}k^m%&H%g3*^e(FDCC{ zBh~eqx%bY?qqu_2qa+9A+oS&yFw^3nLRsN#?FcZvt?*dZhRC_a%Jd{qou(p5AG_Q6 ziOJMu8D~kJ7xEkG(69$Dl3t1J592=Olom%;13uZvYDda08YwzqFlND-;YodmA!SL) z!AOSI=(uCnG#Yo&BgrH(muUemmhQW7?}IHfxI~T`44wuLGFOMdKreQO!a=Z-LkH{T z@h;`A_l2Pp>Xg#`Vo@-?WJn-0((RR4uKM6P2*^-qprHgQhMzSd32@ho>%fFMbp9Y$ zx-#!r8gEu;VZN(fDbP7he+Nu7^o3<+pT!<<>m;m z=FC$N)wx)asxb_KLs}Z^;x*hQM}wQGr((&=%+=#jW^j|Gjn$(qqXwt-o-|>kL!?=T zh0*?m<^>S*F}kPiq@)Cp+^fnKi2)%<-Tw4K3oHwmI-}h}Kc^+%1P!D8aWp!hB@-ZT zybHrRdeYlYulEj>Bk zEIi|PU0eGg&~kWQ{q)gw%~bFT0`Q%k5S|tt!JIZXVXX=>er!7R^w>zeQ%M-(C|eOQG>5i|}i3}X#?aqAg~b1t{-fqwKd(&CyA 
zmyy)et*E}+q_lEqgbClewiJ=u@bFX}LKe)5o26K9fS;R`!er~a?lUCKf60`4Zq7{2q$L?k?IrAdcDu+ z4A0QJBUiGx&$TBASI2ASM_Wj{?fjv=CORO3GZz;1X*AYY`anM zI`M6C%8OUFSc$tKjiFJ|V74Yj-lK&Epi7F^Gp*rLeDTokfW#o6sl33W^~4V|edbS1 zhx%1PTdnI!C96iYqSA=qu6;p&Dd%)Skjjw0fyl>3k@O?I@x5|>2_7G#_Yc2*1>=^# z|H43bJDx$SS2!vkaMG!;VRGMbY{eJhT%FR{(a+RXDbd4OT?DRoE(`NhiVI6MsUCsT z1gc^~Nv>i;cIm2~_SYOfFpkUvV)(iINXEep;i4>&8@N#|h+_;DgzLqh3I#lzhn>cN zjm;m6U{+JXR2Mi)=~WxM&t9~WShlyA$Pnu+VIW2#;0)4J*C!{1W|y1TP{Q;!tldR< zI7aoH&cMm*apW}~BabBT;`fQ1-9q|!?6nTzmhiIo6fGQlcP{pu)kJh- zUK&Ei9lArSO6ep_SN$Lt_01|Y#@Ksznl@f<+%ku1F|k#Gcwa`(^M<2%M3FAZVb99?Ez4d9O)rqM< zCbYsdZlSo{X#nKqiRA$}XG}1Tw@)D|jGKo1ITqmvE4;ovYH{NAk{h8*Ysh@=nZFiF zmDF`@4do#UDKKM*@wDbwoO@tPx4aExhPF_dvlR&dB5>)W=wG6Pil zq{eBzw%Ov!?D+%8&(uK`m7JV7pqNp-krMd>ECQypq&?p#_3wy){eW{(2q}ij{6bfmyE+-ZO z)G4OtI;ga9;EVyKF6v3kO1RdQV+!*>tV-ditH-=;`n|2T zu(vYR*BJSBsjzFl1Oy#DpL=|pfEY4NM;y5Yly__T*Eg^3Mb_()pHwn)mAsh!7Yz-Z zY`hBLDXS4F^{>x=oOphq|LMo;G!C(b2hS9A6lJqb+e$2af}7C>zW2p{m18@Bdd>iL zoEE$nFUnaz_6p${cMO|;(c1f9nm5G5R;p)m4dcC1?1YD=2Mi&20=4{nu>AV#R^d%A zsmm_RlT#`;g~an9mo#O1dYV)2{mgUWEqb*a@^Ok;ckj;uqy{%*YB^({d{^V)P9VvP zC^qbK&lq~}TWm^RF8d4zbo~bJuw zFV!!}b^4BlJ0>5S3Q>;u*BLC&G6Fa5V|~w&bRZ*-YU>df6%qAvK?%Qf+#=M-+JqLw&w*l4{v7XTstY4j z26z69U#SVzSbY9HBXyD;%P$#vVU7G*Yb-*fy)Qpx?;ed;-P24>-L6U+OAC9Jj63kg zlY`G2+5tg1szc#*9ga3%f9H9~!(^QjECetX-PlacTR+^g8L<#VRovPGvsT)ln3lr= zm5WO@!NDuw+d4MY;K4WJg3B|Sp|WdumpFJO>I2tz$72s4^uXljWseYSAd+vGfjutO z-x~Qlct+BnlI+Iun)fOklxPH?30i&j9R$6g5^f&(x7bIom|FLKq9CUE);w2G>}vye zxWvEaXhx8|~2j)({Rq>0J9}lzdE`yhQ(l$z! 
z;x%d%_u?^4vlES_>JaIjJBN|N8z5}@l1#PG_@{mh`oWXQOI41_kPG}R_pV+jd^PU) zEor^SHo`VMul*80-K$0mSk|FiI+tHdWt-hzt~S>6!2-!R&rdL_^gGGUzkPe zEZkUKU=EY(5Ex)zeTA4-{Bkbn!Gm?nuaI4jLE%X;zMZ7bwn4FXz(?az;9(Uv;38U6 zi)}rA3xAcD2&6BY<~Pj9Q1~4Dyjs&!$)hyHiiTI@%qXd~+>> zW}$_puSSJ^uWv$jtWakn}}@eX6_LGz|7M#$!3yjY ztS{>HmQ%-8u0@|ig{kzD&CNK~-dIK5e{;@uWOs8$r>J7^c2P~Pwx%QVX0e8~oXK0J zM4HCNK?%t6?v~#;eP#t@tM$@SXRt;(b&kU7uDzlzUuu;+LQ5g%=FqpJPGrX8HJ8CS zITK|(fjhs3@CR}H4@)EjL@J zV_HPexOQ!@k&kvsQG)n;7lZaUh>{87l4NS_=Y-O9Ul3CaKG8iy+xD=QXZSr57a-hb z7jz3Ts-NVsMI783OPEdlE|e&a2;l^h@e>oYMh5@=Lte-9A+20|?!9>Djl~{XkAo>0p9`n&nfWGdGAfT-mSYW z1cvG>GT9dRJdcm7M_AG9JX5AqTCdJ6MRqR3p?+FvMxp(oB-6MZ`lRzSAj%N(1#8@_ zDnIIo9Rtv12(Eo}k_#FILhaZQ`yRD^Vn5tm+IK@hZO>s=t5`@p1#k?Umz2y*R64CF zGM-v&*k}zZ%Xm<_?1=g~<*&3KAy;_^QfccIp~CS7NW24Tn|mSDxb%pvvi}S}(~`2# z3I|kD@||l@lAW06K2%*gHd4x9YKeXWpwU%!ozYcJ+KJeX!s6b94j!Qyy7>S!wb?{qaMa`rpbU1phn0EpF}L zsBdZc|Im#iRiQmJjZwb5#n;`_O{$Zu$I zMXqbfu0yVmt!!Y`Fzl}QV7HUSOPib#da4i@vM$0u2FEYytsvrbR#ui9lrMkZ(AVVJ zMVl^Wi_fSRsEXLA_#rdaG%r(@UCw#o7*yBN)%22b)VSNyng6Lxk|2;XK3Qb=C_<`F zN##8MLHz-s%&O6JE~@P1=iHpj8go@4sC7*AWe99tuf$f7?2~wC&RA^UjB*2`K!%$y zSDzMd7}!vvN|#wDuP%%nuGk8&>N)7eRxtqdMXHD1W%hP7tYW{W>^DJp`3WS>3}i+$ z_li?4AlEj`r=!SPiIc+NNUZ9NCrMv&G0BdQHBO&S7d48aB)LfGi@D%5CC1%)1hVcJ zB~=yNC}LBn(K?cHkPmAX$5^M7JSnNkcc!X!0kD&^F$cJmRP(SJ`9b7}b)o$rj=BZ- zC;BX3IG94%Qz&(V$)7O~v|!=jd-yU1(6wd1u;*$z4DDe6+BFLhz>+8?59?d2Ngxck zm92yR!jk@MP@>>9FtAY2L+Z|MaSp{MnL-;fm}W3~fg!9TRr3;S@ysLf@#<)keHDRO zsJI1tP`g3PNL`2(8hK3!4;r|E-ZQbU0e-9u{(@du`4wjGj|A!QB&9w~?OI1r}M? 
zw)6tvsknfPfmNijZ;3VZX&HM6=|&W zy6GIe3a?_(pRxdUc==do9?C&v7+6cgIoL4)Ka^bOG9`l;S|QmVzjv%)3^PDi@=-cp z=!R0bU<@_;#*D}e1m@0!%k=VPtyRAkWYW(VFl|eu0LteWH7eDB%P|uF7BQ-|D4`n; z)UpuY1)*s32UwW756>!OoAq#5GAtfrjo*^7YUv^(eiySE?!TQzKxzqXE@jM_bq3Zq zg#1orE*Zd5ZWEpDXW9$=NzuadNSO*NW)ZJ@IDuU`w}j_FRE4-QS*rD4mPVQPH(jGg z+-Ye?3%G%=DT5U1b+TnNHHv(nz-S?3!M4hXtEB@J4WK%%p zkv=Bb`1DHmgUdYo>3kwB(T>Ba#DKv%cLp2h4r8v}p=Np}wL!&PB5J-w4V4REM{kMD z${oSuAw9?*yo3?tNp~X5WF@B^P<6L0HtIW0H7^`R8~9zAXgREH`6H{ntGu$aQ;oNq zig;pB^@KMHNoJcEb0f1fz+!M6sy?hQjof-QoxJgBM`!k^T~cykcmi^s_@1B9 z)t1)Y-ZsV9iA&FDrVoF=L7U#4&inXk{3+Xm9A|R<=ErgxPW~Fq zqu-~x0dIBlR+5_}`IK^*5l3f5$&K@l?J{)_d_*459pvsF*e*#+2guls(cid4!N%DG zl3(2`az#5!^@HNRe3O4(_5nc+){q?ENQG2|uKW0U0$aJ5SQ6hg>G4OyN6os76y%u8qNNHi;}XnRNwpsfn^!6Qt(-4tE`uxaDZ`hQp#aFX373|F?vjEiSEkV>K)cTBG+UL#wDj0_ zM9$H&-86zP=9=5_Q7d3onkqKNr4PAlF<>U^^yYAAEso|Ak~p$3NNZ$~4&kE9Nj^As zQPoo!m*uZ;z1~;#g(?zFECJ$O2@EBy<;F)fnQxOKvH`MojG5T?7thbe%F@JyN^k1K zn3H*%Ymoim)ePf)xhl2%$T)vq3P=4ty%NK)@}po&7Q^~o3l))Zm4<75Y!fFihsXJc z9?vecovF^nYfJVg#W~R3T1*PK{+^YFgb*7}Up2U#)oNyzkfJ#$)PkFxrq_{Ai?0zk zWnjq_ixF~Hs7YS9Y6H&8&k0#2cAj~!Vv4{wCM zi2f1FjQf+F@=BOB)pD|T41a4AEz+8hnH<#_PT#H|Vwm7iQ0-Tw()WMN za0eI-{B2G{sZ7+L+^k@BA)G;mOFWE$O+2nS|DzPSGZ)ede(9%+8kqu4W^wTn!yZPN z7u!Qu0u}K5(0euRZ$7=kn9DZ+llruq5A_l) zOK~wof7_^8Yeh@Qd*=P!gM)lh`Z@7^M?k8Z?t$$vMAuBG>4p56Dt!R$p{)y>QG}it zGG;Ei```7ewXrbGo6Z=!AJNQ!GP8l13m7|FIQTFZTpIg#kpZkl1wj)s1eySXjAAWy zfl;;@{QQ;Qnb$@LY8_Z&7 z6+d98F?z2Zo)sS)z$YoL(zzF>Ey8u#S_%n7)XUX1Pu(>e8gEUU1S;J=EH(#`cWi1+ zoL$5TN+?#NM8=4E7HOk)bf5MXvEo%he5QcB%_5YQ$cu_j)Pd^@5hi}d%nG}x9xXtD-JMQxr;KkC=r_dS-t`lf zF&CS?Lk~>U^!)Y0LZqNVJq+*_#F7W~!UkvZfQhzvW`q;^X&iv~ zEDDGIQ&(S;#Hb(Ej4j+#D#sDS_uHehlY0kZsQpktc?;O z22W1b%wNcdfNza<1M2{*mAkM<{}@(w`VuQ<^lG|iYSuWBD#lYK9+jsdA+&#;Y@=zXLVr840Nq_t5))#7}2s9pK* zg42zd{EY|#sIVMDhg9>t6_Y#O>JoG<{GO&OzTa;iA9&&^6=5MT21f6$7o@nS=w;R) znkgu*7Y{UNPu7B9&B&~q+N@@+%&cO0N`TZ-qQ|@f@e0g2BI+9xO$}NzMOzEbSSJ@v z1uNp(S z-dioXc$5YyA6-My@gW~1GH($Q?;GCHfk{ej-{Q^{iTFs1^Sa67RNd5y{cjX1tG+$& 
zbGrUte{U1{^Z_qpzW$-V!pJz$dQZrL5i(1MKU`%^= z^)i;xua4w)evDBrFVm)Id5SbXMx2u7M5Df<2L4B`wy4-Y+Wec#b^QJO|J9xF{x#M8 zuLUer`%ZL^m3gy?U&dI+`kgNZ+?bl3H%8)&k84*-=aMfADh&@$xr&IS|4{3$v&K3q zZTn&f{N(#L6<-BZYNs4 zB*Kl*@_IhGXI^_8zfXT^XNmjJ@5E~H*wFf<&er?p7suz85)$-Hqz@C zGMFg1NKs;otNViu)r-u{SOLcqwqc7$poPvm(-^ag1m71}HL#cj5t4Hw(W?*fi4GSH z9962NZ>p^ECPqVc$N}phy>N8rQsWWm%%rc5B4XLATFEtffX&TM2%|8S2Lh_q; zCytXua84HBnSybW-}(j z3Zwv4CaK)jC!{oUvdsFRXK&Sx@t)yGm(h65$!WZ!-jL52no}NX6=E<=H!aZ74h_&> zZ+~c@k!@}Cs84l{u+)%kg4fq~pOeTK3S4)gX~FKJw4t9ba!Ai{_gkKQYQvafZIyKq zX|r4xgC(l%JgmW!tvR&yNt$6uME({M`uNIi7HFiPEQo_UMRkl~12&4c& z^se;dbZWKu7>dLMg`IZq%@b@ME?|@{&xEIZEU(omKNUY? z`JszxNghuO-VA;MrZKEC0|Gi0tz3c#M?aO?WGLy64LkG4T%|PBIt_?bl{C=L@9e;A zia!35TZI7<`R8hr06xF62*rNH5T3N0v^acg+;ENvrLYo|B4!c^eILcn#+lxDZR!%l zjL6!6h9zo)<5GrSPth7+R(rLAW?HF4uu$glo?w1U-y}CR@%v+wSAlsgIXn>e%bc{FE;j@R0AoNIWf#*@BSngZ)HmNqkB z)cs3yN%_PT4f*K+Y1wFl)be=1iq+bb1G-}b|72|gJ|lMt`tf~0Jk}zMbS0+M-Mq}R z>Bv}-W6J%}j#dIz`Z0}zD(DGKn`R;E8A`)$a6qDfr(c@iHKZcCVY_nJEDpcUddGH* z*ct2$&)RelhmV}@jGXY>3Y~vp;b*l9M+hO}&x`e~q*heO8GVkvvJTwyxFetJC8VnhjR`5*+qHEDUNp16g`~$TbdliLLd}AFf}U+Oda1JXwwseRFbj?DN96;VSX~z?JxJSuA^BF}262%Z0)nv<6teKK`F zfm9^HsblS~?Xrb1_~^=5=PD!QH$Y1hD_&qe1HTQnese8N#&C(|Q)CvtAu6{{0Q%ut8ESVdn&& z4y%nsCs!$(#9d{iVjXDR##3UyoMNeY@_W^%qyuZ^K3Oa4(^!tDXOUS?b2P)yRtJ8j zSX}@qGBj+gKf;|6Kb&rq`!}S*cSu-3&S>=pM$eEB{K>PP~I}N|uGE|`3U#{Q6v^kO4nIsaq zfPld}c|4tVPI4!=!ETCNW+LjcbmEoxm0RZ%ieV0`(nVlWKClZW5^>f&h79-~CF(%+ zv|KL(^xQ7$#a}&BSGr9zf{xJ(cCfq>UR*>^-Ou_pmknCt6Y--~!duL{k2D{yLMl__ z!KeMRRg&EsD2s|cmy?xgK&XcGIKeos`&UEVhBTw;mqy|8DlP1M7PYS2z{YmTJ;n!h znPe(Qu?c7+xZz!Tm1AnE8|;&tf7fW$2dArX7ck1Jd(S1+91YB8bjISRZ`UL*?vb{b zMp*!Xq7VaLc0Ogqj5qmop8NREQ{9_iC$;tviZlubGLy1jLlIFBxAymMr@SDLAcx+) z5YRkl$bW**X)W0JzWNcLx9>fTqJj00ipY6Ua?mUlsgQrVVgpmaheE;RgA5U_+WsPh z9+X|PU4zFyNxZ2?Q+V`Mo{xH~(m}OMRZa<&$nCl7o4x`^^|V4?aPz8#KwFm=8T6_} z8=P_4$_rD2a%7}}HT6VQ>ZGKW=QF7zI-2=6oBNZR$HVn|gq`>l$HZ`48lkM7%R$>MS& zghR`WZ9Xrd_6FaDedH6_aKVJhYev*2)UQ>!CRH3PQ_d9nXlO;c 
z9PeqiKD@aGz^|mvD-tV<{BjfA;)B+76!*+`$CZOJ=#)}>{?!9fAg(Xngbh||n=q*C zU0mGP`NxHn$uY#@)gN<0xr)%Ue80U{-`^FX1~Q@^>WbLraiB|c#4v$5HX)0z!oA#jOXPyWg! z8EC}SBmG7j3T&zCenPLYA{kN(3l62pu}91KOWZl? zg~>T4gQ%1y3AYa^J|>ba$7F5KlVx}_&*~me*q-SYLBCXZFU=U8mHQD4K!?;B61NoX z?VS41SS&jHyhmB~+bC=w0a06V``ZXCkC~}oM9pM{$hU~-s_elYPmT1L!%B`?*<+?( zFQ@TP%y+QL`_&Y0A3679pe5~iL=z)$b)k!oSbJRyw+K};SGAvvE=|<~*aiwJc?uE@2?7a1i9|3=^N%*9smt3ZIhjY>gIsr{Q2rX(NovZ7I1n^V{ z#~(1ze-%`C>fM`^hCV**9BA-04lNuu&3=reevNOMwmX(A{yh`^c8%0mjAKMj{Th05 zXrM(zILwyL-Pcdw^(=gj(ZLVMA95zlzmLa^skb8tQq%8SV&4vp?S>L3+P4^tp`$xA zr38jBw0ItR`VbO5vB1`<3d})}aorkIU1z3*ifYN&Lpp)}|}QJS60th_v-EEkAM zyOREuj!Ou|pVeZEWg;$Hf!x;xAmFu7gB^UR$=L0BuZ~thLC@#moJ(@@wejR|`t_K@ zuQ{XmpAWz%o&~2dk!SIGR$EmpZY)@+r^gvX26%)y>1u2bt~JUPTQzQu&_tB)|{19)&n$m5Fhw0A-8S1^%XpAD%`#a z_ModVxsM|x!m3N1vRt_XEL`O-+J3cMsM1l*dbjT&S0c@}Xxl3I&AeMNT97G3c6%3C zbrZS?2EAKcEq@@Pw?r%eh0YM6z0>&Qe#n+e9hEHK?fzig3v5S#O2IxVLu;a>~c~ZfHVbgLox%_tg)bsC8Rl35P=Jhl+Y=w6zb$ z;*uO%i^U z^mp_QggBILLF$AyjPD41Z0SFdbDj&z&xjq~X|OoM7bCuBfma1CEd!4RKGqPR)K)e}+7^JfFUI_fy63cMyq#&)Z*#w18{S zhC@f9U5k#2S2`d$-)cEoH-eAz{2Qh>YF1Xa)E$rWd52N-@{#lrw3lRqr)z?BGThgO z-Mn>X=RPHQ)#9h{3ciF)<>s{uf_&XdKb&kC!a373l2OCu&y8&n#P%$7YwAVJ_lD-G zX7tgMEV8}dY^mz`R6_0tQ5Eu@CdSOyaI63Vb*mR+rCzxgsjCXLSHOmzt0tA zGoA0Cp&l>rtO@^uQayrkoe#d2@}|?SlQl9W{fmcxY(0*y zHTZ6>FL;$8FEzbb;M(o%mBe-X?o<0+1dH?ZVjcf8)Kyqb07*a zLfP1blbt)=W)TN}4M#dUnt8Gdr4p$QRA<0W)JhWLK3-g82Q~2Drmx4J z;6m4re%igus136VL}MDI-V;WmSfs4guF_(7ifNl#M~Yx5HB!UF)>*-KDQl0U?u4UXV2I*qMhEfsxb%87fi+W;mW5{h?o8!52}VUs*Fpo#aSuXk(Ug z>r>xC#&2<9Uwmao@iJQ|{Vr__?eRT2NB$OcoXQ-jZ{t|?Uy{7q$nU-i|&-R6fHPWJDgHZ69iVbK#Ab@2@y zPD*Gj=hib?PWr8NGf;g$o5I!*n>94Z!IfqRm zLvM>Gx$Y*rEL3Z-+lS42=cnEfXR)h1z`h8a+I%E_ss%qXsrgIV%qv9d|KT>fV5=3e zw>P#ju>2naGc{=6!)9TeHq$S9Pk|>$UCEl}H}lE@;0(jbNT9TXUXyss>al>S4DuGi zVCy;Qt=a2`iu2;TvrIkh2NTvNV}0)qun~9y1yEQMdOf#V#3(e(C?+--8bCsJu={Q1z5qNJIk&yW>ZnVm;A=fL~29lvXQ*4j(SLau?P zi8LC7&**O!6B6=vfY%M;!p2L2tQ+w3Y!am{b?14E`h4kN$1L0XqT5=y=DW8GI_yi% 
zlIWsjmf0{l#|ei>)>&IM4>jXH)?>!fK?pfWIQn9gT9N(z&w3SvjlD|u*6T@oNQRF6 zU5Uo~SA}ml5f8mvxzX>BGL}c2#AT^6Lo-TM5XluWoqBRin$tiyRQK0wJ!Ro+7S!-K z=S95p-(#IDKOZsRd{l65N(Xae`wOa4Dg9?g|Jx97N-7OfHG(rN#k=yNGW0K$Tia5J zMMX1+!ulc1%8e*FNRV8jL|OSL-_9Nv6O=CH>Ty(W@sm`j=NFa1F3tT$?wM1}GZekB z6F_VLMCSd7(b9T%IqUMo$w9sM5wOA7l8xW<(1w0T=S}MB+9X5UT|+nemtm_;!|bxX z_bnOKN+F30ehJ$459k@=69yTz^_)-hNE4XMv$~_%vlH_y^`P1pLxYF6#_IZyteO`9wpuS> z#%Vyg5mMDt?}j!0}MoBX|9PS0#B zSVo6xLVjujMN57}IVc#A{VB*_yx;#mgM4~yT6wO;Qtm8MV6DX?u(JS~JFA~PvEl%9 z2XI}c>OzPoPn_IoyXa2v}BA(M+sWq=_~L0rZ_yR17I5c^m4;?2&KdCc)3lCs!M|0OzH@(PbG8T6w%N zKzR>%SLxL_C6~r3=xm9VG8<9yLHV6rJOjFHPaNdQHHflp><44l>&;)&7s)4lX%-er znWCv8eJJe1KAi_t1p%c4`bgxD2(1v)jm(gvQLp2K-=04oaIJu{F7SIu8&)gyw7x>+ zbzYF7KXg;T71w!-=C0DjcnF^JP$^o_N>*BAjtH!^HD6t1o?(O7IrmcodeQVDD<*+j zN)JdgB6v^iiJ1q`bZ(^WvN{v@sDqG$M9L`-UV!3q&sWZUnQ{&tAkpX(nZ_L#rMs}>p7l0fU5I5IzArncQi6TWjP#1B=QZ|Uqm-3{)YPn=XFqHW-~Fb z^!0CvIdelQbgcac9;By79%T`uvNhg9tS><pLzXePP=JZzcO@?5GRAdF4)sY*)YGP* zyioMa3=HRQz(v}+cqXc0%2*Q%CQi%e2~$a9r+X*u3J8w^Shg#%4I&?!$})y@ zzg8tQ6_-`|TBa_2v$D;Q(pFutj7@yos0W$&__9$|Yn3DFe*)k{g^|JIV4bqI@2%-4kpb_p? 
zQ4}qQcA>R6ihbxnVa{c;f7Y)VPV&mRY-*^qm~u3HB>8lf3P&&#GhQk8uIYYgwrugY zei>mp`YdC*R^Cxuv@d0V?$~d*=m-X?1Fqd9@*IM^wQ_^-nQEuc0!OqMr#TeT=8W`JbjjXc-Dh3NhnTj8e82yP;V_B<7LIejij+B{W1ViaJ_)+q?$BaLJpxt_4@&(?rWC3NC-_Z9Sg4JJWc( zX!Y34j67vCMHKB=JcJ1|#UI^D^mn(i=A5rf-iV7y4bR5HhC=I`rFPZv4F>q+h?l34 z4(?KYwZYHwkPG%kK7$A&M#=lpIn3Qo<>s6UFy|J$Zca-s(oM7??dkuKh?f5b2`m57 zJhs4BTcVVmwsswlX?#70uQb*k1Fi3q4+9`V+ikSk{L3K=-5HgN0JekQ=J~549Nd*+H%5+fi6aJuR=K zyD3xW{X$PL7&iR)=wumlTq2gY{LdrngAaPC;Qw_xLfVE0c0Z>y918TQpL!q@?`8{L!el18Qxiki3WZONF=eK$N3)p>36EW)I@Y z7QxbWW_9_7a*`VS&5~4-9!~&g8M+*U9{I2Bz`@TJ@E(YL$l+%<=?FyR#&e&v?Y@@G zqFF`J*v;l$&(A=s`na2>4ExKnxr`|OD+Xd-b4?6xl4mQ94xuk!-$l8*%+1zQU{)!= zTooUhjC0SNBh!&Ne}Q=1%`_r=Vu1c8RuE!|(g4BQGcd5AbpLbvKv_Z~Y`l!mr!sCc zDBupoc{W@U(6KWqW@xV_`;J0~+WDx|t^WeMri#=q0U5ZN7@@FAv<1!hP6!IYX z>UjbhaEv2Fk<6C0M^@J`lH#LgKJ(`?6z5=uH+ImggSQaZtvh52WTK+EBN~-op#EQKYW`$yBmq z4wgLTJPn3;mtbs0m0RO&+EG>?rb*ZECE0#eeSOFL!2YQ$w}cae>sun`<=}m!=go!v zO2jn<0tNh4E-4)ZA(ixh5nIUuXF-qYl>0I_1)K%EAw`D7~la$=gc@6g{iWF=>i_76?Mc zh#l9h7))<|EY=sK!E|54;c!b;Zp}HLd5*-w^6^whxB98v`*P>cj!Nfu1R%@bcp{cb zUZ24(fUXn3d&oc{6H%u(@4&_O?#HO(qd^YH=V`WJ=u*u6Zie8mE^r_Oz zDw`DaXeq4G#m@EK5+p40Xe!Lr!-jTQLCV3?R1|3#`%45h8#WSA!XoLDMS7=t!SluZ4H56;G z6C9D(B6>k^ur_DGfJ@Y-=3$5HkrI zO+3P>R@$6QZ#ATUI3$)xRBEL#5IKs}yhf&fK;ANA#Qj~G zdE|k|`puh$%dyE4R0$7dZd)M*#e7s%*PKPyrS;d%&S(d{_Ktq^!Hpi&bxZx`?9pEw z%sPjo&adHm95F7Z1{RdY#*a!&LcBZVRe{qhn8d{pOUJ{fOu`_kFg7ZVeRYZ(!ezNktT5{Ab z4BZI$vS0$vm3t9q`ECjDK;pmS{8ZTKs`Js~PYv2|=VkDv{Dtt)cLU@9%K6_KqtqfM zaE*e$f$Xm=;IAURNUXw8g%=?jzG2}10ZA5qXzAaJ@eh)yv5B=ETyVwC-a*CD;GgRJ z4J1~zMUey?4iVlS0zW|F-~0nenLiN3S0)l!T2}D%;<}Z9DzeVgcB+MSj;f$KY;uP%UR#f`0u*@6U@tk@jO3N?Fjq< z{cUUhjrr$rmo>qE?52zKe+>6iP5P_tcUfxsLSy{9*)shB(w`UUveNH`a`kr$VEF@} zKh&|lTD;4;m_H6C&)9#D`kRh;S(NTa=Ve^~xe_0~x$6h8Q@B_qu#ee=(lkI9@F6$0m=z@H=4&h%Q{htM>uHs(Sr@2ry`fgLA zKj8lVXdGPyy)2J%A${}Rm_a{){wHnlM?yGPQ7#KO{8*(_l0QZHuV};nO?c%h?qwSL z3wem|w*2tdxW5&PxC(Wd0QG_w|GPbw|0UFK`u$~U%!`QKcME;=Q@?*erh4_>FP~1n zAldwG9h$$u_$RFK6Uxo20GHqJzc}Rl-EwVz3h4n 
z;3~%DwD84i>)-8#&#y3k)3BG5cNaP3?t4q}F%yfv?*yEiC>sSo}$f>nh0QNZXH1N)-Q7kbk=2uL9OrF)nXrE@F1y%_8Yn c82=K%QXLKFx%@O{wJjEi6Y56o#$)Bpeg diff --git a/gradle/wrapper/gradle-wrapper.properties b/gradle/wrapper/gradle-wrapper.properties deleted file mode 100644 index 9355b41557..0000000000 --- a/gradle/wrapper/gradle-wrapper.properties +++ /dev/null @@ -1,7 +0,0 @@ -distributionBase=GRADLE_USER_HOME -distributionPath=wrapper/dists -distributionUrl=https\://services.gradle.org/distributions/gradle-8.10-bin.zip -networkTimeout=10000 -validateDistributionUrl=true -zipStoreBase=GRADLE_USER_HOME -zipStorePath=wrapper/dists diff --git a/gradlew b/gradlew deleted file mode 100755 index adff685a03..0000000000 --- a/gradlew +++ /dev/null @@ -1,248 +0,0 @@ -#!/bin/sh - -# -# Copyright © 2015 the original authors. -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. -# -# SPDX-License-Identifier: Apache-2.0 -# - -############################################################################## -# -# Gradle start up script for POSIX generated by Gradle. -# -# Important for running: -# -# (1) You need a POSIX-compliant shell to run this script. 
If your /bin/sh is -# noncompliant, but you have some other compliant shell such as ksh or -# bash, then to run this script, type that shell name before the whole -# command line, like: -# -# ksh Gradle -# -# Busybox and similar reduced shells will NOT work, because this script -# requires all of these POSIX shell features: -# * functions; -# * expansions «$var», «${var}», «${var:-default}», «${var+SET}», -# «${var#prefix}», «${var%suffix}», and «$( cmd )»; -# * compound commands having a testable exit status, especially «case»; -# * various built-in commands including «command», «set», and «ulimit». -# -# Important for patching: -# -# (2) This script targets any POSIX shell, so it avoids extensions provided -# by Bash, Ksh, etc; in particular arrays are avoided. -# -# The "traditional" practice of packing multiple parameters into a -# space-separated string is a well documented source of bugs and security -# problems, so this is (mostly) avoided, by progressively accumulating -# options in "$@", and eventually passing that to Java. -# -# Where the inherited environment variables (DEFAULT_JVM_OPTS, JAVA_OPTS, -# and GRADLE_OPTS) rely on word-splitting, this is performed explicitly; -# see the in-line comments for details. -# -# There are tweaks for specific operating systems such as AIX, CygWin, -# Darwin, MinGW, and NonStop. -# -# (3) This script is generated from the Groovy template -# https://github.com/gradle/gradle/blob/HEAD/platforms/jvm/plugins-application/src/main/resources/org/gradle/api/internal/plugins/unixStartScript.txt -# within the Gradle project. -# -# You can find Gradle at https://github.com/gradle/gradle/. -# -############################################################################## - -# Attempt to set APP_HOME - -# Resolve links: $0 may be a link -app_path=$0 - -# Need this for daisy-chained symlinks. 
-while - APP_HOME=${app_path%"${app_path##*/}"} # leaves a trailing /; empty if no leading path - [ -h "$app_path" ] -do - ls=$( ls -ld "$app_path" ) - link=${ls#*' -> '} - case $link in #( - /*) app_path=$link ;; #( - *) app_path=$APP_HOME$link ;; - esac -done - -# This is normally unused -# shellcheck disable=SC2034 -APP_BASE_NAME=${0##*/} -# Discard cd standard output in case $CDPATH is set (https://github.com/gradle/gradle/issues/25036) -APP_HOME=$( cd -P "${APP_HOME:-./}" > /dev/null && printf '%s\n' "$PWD" ) || exit - -# Use the maximum available, or set MAX_FD != -1 to use that value. -MAX_FD=maximum - -warn () { - echo "$*" -} >&2 - -die () { - echo - echo "$*" - echo - exit 1 -} >&2 - -# OS specific support (must be 'true' or 'false'). -cygwin=false -msys=false -darwin=false -nonstop=false -case "$( uname )" in #( - CYGWIN* ) cygwin=true ;; #( - Darwin* ) darwin=true ;; #( - MSYS* | MINGW* ) msys=true ;; #( - NONSTOP* ) nonstop=true ;; -esac - - - -# Determine the Java command to use to start the JVM. -if [ -n "$JAVA_HOME" ] ; then - if [ -x "$JAVA_HOME/jre/sh/java" ] ; then - # IBM's JDK on AIX uses strange locations for the executables - JAVACMD=$JAVA_HOME/jre/sh/java - else - JAVACMD=$JAVA_HOME/bin/java - fi - if [ ! -x "$JAVACMD" ] ; then - die "ERROR: JAVA_HOME is set to an invalid directory: $JAVA_HOME - -Please set the JAVA_HOME variable in your environment to match the -location of your Java installation." - fi -else - JAVACMD=java - if ! command -v java >/dev/null 2>&1 - then - die "ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH. - -Please set the JAVA_HOME variable in your environment to match the -location of your Java installation." - fi -fi - -# Increase the maximum file descriptors if we can. -if ! "$cygwin" && ! "$darwin" && ! "$nonstop" ; then - case $MAX_FD in #( - max*) - # In POSIX sh, ulimit -H is undefined. That's why the result is checked to see if it worked. 
- # shellcheck disable=SC2039,SC3045 - MAX_FD=$( ulimit -H -n ) || - warn "Could not query maximum file descriptor limit" - esac - case $MAX_FD in #( - '' | soft) :;; #( - *) - # In POSIX sh, ulimit -n is undefined. That's why the result is checked to see if it worked. - # shellcheck disable=SC2039,SC3045 - ulimit -n "$MAX_FD" || - warn "Could not set maximum file descriptor limit to $MAX_FD" - esac -fi - -# Collect all arguments for the java command, stacking in reverse order: -# * args from the command line -# * the main class name -# * -classpath -# * -D...appname settings -# * --module-path (only if needed) -# * DEFAULT_JVM_OPTS, JAVA_OPTS, and GRADLE_OPTS environment variables. - -# For Cygwin or MSYS, switch paths to Windows format before running java -if "$cygwin" || "$msys" ; then - APP_HOME=$( cygpath --path --mixed "$APP_HOME" ) - - JAVACMD=$( cygpath --unix "$JAVACMD" ) - - # Now convert the arguments - kludge to limit ourselves to /bin/sh - for arg do - if - case $arg in #( - -*) false ;; # don't mess with options #( - /?*) t=${arg#/} t=/${t%%/*} # looks like a POSIX filepath - [ -e "$t" ] ;; #( - *) false ;; - esac - then - arg=$( cygpath --path --ignore --mixed "$arg" ) - fi - # Roll the args list around exactly as many times as the number of - # args, so each arg winds up back in the position where it started, but - # possibly modified. - # - # NB: a `for` loop captures its iteration list before it begins, so - # changing the positional parameters here affects neither the number of - # iterations, nor the values presented in `arg`. - shift # remove old arg - set -- "$@" "$arg" # push replacement arg - done -fi - - -# Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script. 
-DEFAULT_JVM_OPTS='"-Xmx64m" "-Xms64m"' - -# Collect all arguments for the java command: -# * DEFAULT_JVM_OPTS, JAVA_OPTS, and optsEnvironmentVar are not allowed to contain shell fragments, -# and any embedded shellness will be escaped. -# * For example: A user cannot expect ${Hostname} to be expanded, as it is an environment variable and will be -# treated as '${Hostname}' itself on the command line. - -set -- \ - "-Dorg.gradle.appname=$APP_BASE_NAME" \ - -jar "$APP_HOME/gradle/wrapper/gradle-wrapper.jar" \ - "$@" - -# Stop when "xargs" is not available. -if ! command -v xargs >/dev/null 2>&1 -then - die "xargs is not available" -fi - -# Use "xargs" to parse quoted args. -# -# With -n1 it outputs one arg per line, with the quotes and backslashes removed. -# -# In Bash we could simply go: -# -# readarray ARGS < <( xargs -n1 <<<"$var" ) && -# set -- "${ARGS[@]}" "$@" -# -# but POSIX shell has neither arrays nor command substitution, so instead we -# post-process each arg (as a line of input to sed) to backslash-escape any -# character that might be a shell metacharacter, then use eval to reverse -# that process (while maintaining the separation between arguments), and wrap -# the whole thing up as a single "set" statement. -# -# This will of course break if any of these variables contains a newline or -# an unmatched quote. -# - -eval "set -- $( - printf '%s\n' "$DEFAULT_JVM_OPTS $JAVA_OPTS $GRADLE_OPTS" | - xargs -n1 | - sed ' s~[^-[:alnum:]+,./:=@_]~\\&~g; ' | - tr '\n' ' ' - )" '"$@"' - -exec "$JAVACMD" "$@" diff --git a/gradlew.bat b/gradlew.bat deleted file mode 100644 index c4bdd3ab8e..0000000000 --- a/gradlew.bat +++ /dev/null @@ -1,93 +0,0 @@ -@rem -@rem Copyright 2015 the original author or authors. -@rem -@rem Licensed under the Apache License, Version 2.0 (the "License"); -@rem you may not use this file except in compliance with the License. 
-@rem You may obtain a copy of the License at -@rem -@rem https://www.apache.org/licenses/LICENSE-2.0 -@rem -@rem Unless required by applicable law or agreed to in writing, software -@rem distributed under the License is distributed on an "AS IS" BASIS, -@rem WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -@rem See the License for the specific language governing permissions and -@rem limitations under the License. -@rem -@rem SPDX-License-Identifier: Apache-2.0 -@rem - -@if "%DEBUG%"=="" @echo off -@rem ########################################################################## -@rem -@rem Gradle startup script for Windows -@rem -@rem ########################################################################## - -@rem Set local scope for the variables with windows NT shell -if "%OS%"=="Windows_NT" setlocal - -set DIRNAME=%~dp0 -if "%DIRNAME%"=="" set DIRNAME=. -@rem This is normally unused -set APP_BASE_NAME=%~n0 -set APP_HOME=%DIRNAME% - -@rem Resolve any "." and ".." in APP_HOME to make it shorter. -for %%i in ("%APP_HOME%") do set APP_HOME=%%~fi - -@rem Add default JVM options here. You can also use JAVA_OPTS and GRADLE_OPTS to pass JVM options to this script. -set DEFAULT_JVM_OPTS="-Xmx64m" "-Xms64m" - -@rem Find java.exe -if defined JAVA_HOME goto findJavaFromJavaHome - -set JAVA_EXE=java.exe -%JAVA_EXE% -version >NUL 2>&1 -if %ERRORLEVEL% equ 0 goto execute - -echo. 1>&2 -echo ERROR: JAVA_HOME is not set and no 'java' command could be found in your PATH. 1>&2 -echo. 1>&2 -echo Please set the JAVA_HOME variable in your environment to match the 1>&2 -echo location of your Java installation. 1>&2 - -goto fail - -:findJavaFromJavaHome -set JAVA_HOME=%JAVA_HOME:"=% -set JAVA_EXE=%JAVA_HOME%/bin/java.exe - -if exist "%JAVA_EXE%" goto execute - -echo. 1>&2 -echo ERROR: JAVA_HOME is set to an invalid directory: %JAVA_HOME% 1>&2 -echo. 
1>&2 -echo Please set the JAVA_HOME variable in your environment to match the 1>&2 -echo location of your Java installation. 1>&2 - -goto fail - -:execute -@rem Setup the command line - - - -@rem Execute Gradle -"%JAVA_EXE%" %DEFAULT_JVM_OPTS% %JAVA_OPTS% %GRADLE_OPTS% "-Dorg.gradle.appname=%APP_BASE_NAME%" -jar "%APP_HOME%\gradle\wrapper\gradle-wrapper.jar" %* - -:end -@rem End local scope for the variables with windows NT shell -if %ERRORLEVEL% equ 0 goto mainEnd - -:fail -rem Set variable GRADLE_EXIT_CONSOLE if you need the _script_ return code instead of -rem the _cmd.exe /c_ return code! -set EXIT_CODE=%ERRORLEVEL% -if %EXIT_CODE% equ 0 set EXIT_CODE=1 -if not ""=="%GRADLE_EXIT_CONSOLE%" exit %EXIT_CODE% -exit /b %EXIT_CODE% - -:mainEnd -if "%OS%"=="Windows_NT" endlocal - -:omega diff --git a/settings.gradle.kts b/settings.gradle.kts deleted file mode 100644 index 9715757391..0000000000 --- a/settings.gradle.kts +++ /dev/null @@ -1,51 +0,0 @@ -rootProject.name = "typr" - -include("foundations-jdbc") -include("foundations-jdbc-dsl") -include("foundations-jdbc-dsl-kotlin") - -// PostgreSQL Kotlin testers -include("testers:pg:kotlin") -project(":testers:pg:kotlin").projectDir = file("testers/pg/kotlin") - -// DuckDB Kotlin testers -include("testers:duckdb:kotlin") -project(":testers:duckdb:kotlin").projectDir = file("testers/duckdb/kotlin") - -// MariaDB Kotlin testers -include("testers:mariadb:kotlin") -project(":testers:mariadb:kotlin").projectDir = file("testers/mariadb/kotlin") - -// Oracle Kotlin testers -include("testers:oracle:kotlin") -project(":testers:oracle:kotlin").projectDir = file("testers/oracle/kotlin") - -// SQL Server Kotlin testers -include("testers:sqlserver:kotlin") -project(":testers:sqlserver:kotlin").projectDir = file("testers/sqlserver/kotlin") - -// DB2 Kotlin testers -include("testers:db2:kotlin") -project(":testers:db2:kotlin").projectDir = file("testers/db2/kotlin") - -// Avro Kotlin testers -include("testers:avro:kotlin") 
-project(":testers:avro:kotlin").projectDir = file("testers/avro/kotlin") -include("testers:avro:kotlin-json") -project(":testers:avro:kotlin-json").projectDir = file("testers/avro/kotlin-json") -include("testers:avro:kotlin-quarkus-mutiny") -project(":testers:avro:kotlin-quarkus-mutiny").projectDir = file("testers/avro/kotlin-quarkus-mutiny") - -// gRPC Kotlin testers -include("testers:grpc:kotlin") -project(":testers:grpc:kotlin").projectDir = file("testers/grpc/kotlin") -include("testers:grpc:kotlin-quarkus") -project(":testers:grpc:kotlin-quarkus").projectDir = file("testers/grpc/kotlin-quarkus") - -// OpenAPI Kotlin testers -include("testers:openapi:kotlin:jaxrs") -project(":testers:openapi:kotlin:jaxrs").projectDir = file("testers/openapi/kotlin/jaxrs") -include("testers:openapi:kotlin:spring") -project(":testers:openapi:kotlin:spring").projectDir = file("testers/openapi/kotlin/spring") -include("testers:openapi:kotlin:quarkus") -project(":testers:openapi:kotlin:quarkus").projectDir = file("testers/openapi/kotlin/quarkus") diff --git a/site-in/type-safety/precise-types.md b/site-in/type-safety/precise-types.md new file mode 100644 index 0000000000..b6eed88b5b --- /dev/null +++ b/site-in/type-safety/precise-types.md @@ -0,0 +1,220 @@ +--- +title: Precise Types +sidebar_position: 2 +--- + +# Precise Types + +**Precise Types** bring your database's size and precision constraints directly into your type system. Instead of `String` for a `VARCHAR(50)`, you get `String50` - a type that the compiler guarantees fits within 50 characters. + +This is type safety taken to the next level. + +## The Problem + +Consider this common scenario: + +```java +// Database: VARCHAR(50) +record Customer(String firstName, String lastName) {} + +// Compiles fine, but will fail at runtime! +Customer customer = new Customer( + "A".repeat(100), // 100 chars into VARCHAR(50) - boom! + "Smith" +); +``` + +Runtime failures are expensive. 
They happen in production, require debugging, and erode trust in your code.
+
+## The Solution
+
+With Precise Types enabled:
+
+```java
+// Database: VARCHAR(50)
+record Customer(String50 firstName, String50 lastName) {}
+
+// Passing raw strings no longer compiles; every value must come
+// through one of String50's validated constructors
+Customer customer = new Customer(
+    String50.unsafeForce("A".repeat(100)), // Throws IllegalArgumentException
+    String50.of("Smith").orElseThrow()     // Safe: of(...) returns Optional<String50>
+);
+```
+
+## Supported Precise Types
+
+| Type | Database Source | Constraints |
+|------|-----------------|-------------|
+| `StringN` | `VARCHAR(n)`, `NVARCHAR(n)` | Maximum length |
+| `NonEmptyStringN` | `VARCHAR(n) NOT NULL` with CHECK | Non-empty, max length |
+| `PaddedStringN` | `CHAR(n)` | Exact length, space-padded |
+| `NonEmptyPaddedStringN` | `CHAR(n)` with CHECK | Exact length, non-empty |
+| `BinaryN` | `BINARY(n)`, `VARBINARY(n)` | Maximum byte length |
+| `DecimalN` | `DECIMAL(p,s)`, `NUMERIC(p,s)` | Precision and scale |
+| `LocalDateTimeN` | `DATETIME(n)` | Fractional seconds precision |
+| `LocalTimeN` | `TIME(n)` | Fractional seconds precision |
+| `InstantN` | `TIMESTAMP(n)` | Fractional seconds precision |
+| `OffsetDateTimeN` | `TIMESTAMP(n) WITH TIME ZONE` | Fractional seconds precision |
+
+## Generated Type API
+
+Each precise type provides a rich API for safe value construction:
+
+### Safe Construction
+
+```java
+// Returns Optional.empty() if validation fails
+Optional<String50> maybeValid = String50.of("Hello World");
+
+// Use in functional chains
+String50 name = String50.of(userInput)
+    .orElseThrow(() -> new ValidationException("Name too long"));
+```
+
+### Truncation
+
+```java
+// Automatically truncates to fit - great for user input
+String50 truncated = String50.truncate("This is a very long string...");
+```
+
+### Forced (Fail-Fast)
+
+```java
+// Throws IllegalArgumentException if validation fails
+// Use when you know the value is valid (e.g., from database)
+String50 forced =
String50.unsafeForce("Known valid string");
+```
+
+### Common Interface
+
+All string types implement `StringN`, enabling generic code:
+
+```java
+public <T extends StringN<T>> void logLength(T value) {
+    System.out.println("Value: " + value.rawValue() +
+                       " (max: " + value.maxLength() + ")");
+}
+```
+
+## Decimal Precision
+
+Decimal types enforce both precision (total digits) and scale (decimal places):
+
+```java
+// DECIMAL(10,2) - perfect for currency
+Optional<Decimal10_2> price = Decimal10_2.of(new BigDecimal("99.99"));
+Decimal10_2 zero = Decimal10_2.Zero; // Convenient constant
+
+// Integer convenience methods
+Decimal10_2 fromInt = Decimal10_2.of(100); // Safe for small values
+
+// Automatic scaling
+Decimal10_2.of(new BigDecimal("99.999")); // Rounds to 99.99
+```
+
+## Semantic Equality
+
+Precise types support semantic equality - two values are equal if their content matches, regardless of declared constraints:
+
+```java
+String10 short_ = String10.unsafeForce("hello");
+String50 long_ = String50.unsafeForce("hello");
+
+// Different types, but semantically equal
+short_.semanticEquals(long_); // true
+
+// Works in collections too
+Set<StringN<?>> names = new HashSet<>();
+names.add(short_);
+names.contains(long_); // true (using semanticHashCode)
+```
+
+## Enabling Precise Types
+
+```scala
+import typr.*
+
+val options = Options(
+  pkg = "myapp",
+  lang = Lang.Java,
+  dbLib = Some(DbLibName.Typo),
+  enablePreciseTypes = Selector.All // Enable for all tables
+)
+```
+
+### Selective Enablement
+
+```scala
+// Only for specific schemas
+enablePreciseTypes = Selector.schemas("production", "sales")
+
+// Only for specific tables
+enablePreciseTypes = Selector.relationNames("customers", "products")
+
+// Custom predicate
+enablePreciseTypes = Selector.predicate { relation =>
+  relation.name.value.endsWith("_strict")
+}
+```
+
+## Database Support
+
+Precise Types work across all supported databases:
+
+| Database | VARCHAR | CHAR | DECIMAL | TIME | TIMESTAMP |
+|----------|---------|------|---------|------|-----------|
+| PostgreSQL | `StringN` | `PaddedStringN` | `DecimalN` | `LocalTimeN` | `InstantN` / `OffsetDateTimeN` |
+| MariaDB | `StringN` | `PaddedStringN` | `DecimalN` | `LocalTimeN` | `LocalDateTimeN` |
+| SQL Server | `StringN` | `PaddedStringN` | `DecimalN` | `LocalTimeN` | `LocalDateTimeN` / `OffsetDateTimeN` |
+| Oracle | `StringN` | `PaddedStringN` | `DecimalN` | - | `InstantN` |
+| DuckDB | `StringN` | - | `DecimalN` | `LocalTimeN` | `InstantN` |
+
+## Language Support
+
+Precise types are generated idiomatically for each language:
+
+### Java
+
+```java
+public record String50(String value) implements StringN<String50> {
+    public static Optional<String50> of(String value) { ... }
+    public static String50 truncate(String value) { ... }
+    public static String50 unsafeForce(String value) { ... }
+}
+```
+
+### Kotlin
+
+```kotlin
+@JvmInline
+value class String50(val value: String) : StringN<String50> {
+    companion object {
+        fun of(value: String): String50? = ...
+        fun truncate(value: String): String50 = ...
+        fun unsafeForce(value: String): String50 = ...
+    }
+}
+```
+
+### Scala
+
+```scala
+opaque type String50 = String
+object String50 extends StringN[String50]:
+  def of(value: String): Option[String50] = ...
+  def truncate(value: String): String50 = ...
+  def unsafeForce(value: String): String50 = ...
+```
+
+## Why This Matters
+
+Precise Types transform database constraints from runtime concerns to compile-time guarantees:
+
+1. **Catch bugs earlier**: Size violations are caught during development, not in production
+2. **Self-documenting**: `String50` tells you more than `String`
+3. **IDE support**: Autocomplete shows exact constraints
+4. **Refactoring safety**: Change a column's size and the compiler shows all affected code
+5. **API contracts**: Public APIs communicate constraints through types
+
+This is the difference between hoping your data is valid and knowing it is.
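To make the `of` / `truncate` / `unsafeForce` construction pattern above concrete, here is a hand-written sketch of what such a type boils down to. This is a simplified, hypothetical stand-in, not the generated code: the real generated `String50` also implements the shared `StringN` interface, semantic equality, and database integration.

```java
import java.util.Optional;

// Simplified stand-in for a generated precise type with a 50-character limit.
final class String50 {
    static final int MAX_LENGTH = 50;
    private final String value;

    private String50(String value) { this.value = value; }

    // Safe construction: Optional.empty() when the value exceeds the limit
    static Optional<String50> of(String value) {
        return value.length() <= MAX_LENGTH
            ? Optional.of(new String50(value))
            : Optional.empty();
    }

    // Truncation: always succeeds by cutting the value down to the limit
    static String50 truncate(String value) {
        return new String50(
            value.length() <= MAX_LENGTH ? value : value.substring(0, MAX_LENGTH));
    }

    // Fail-fast: for values already known to be valid, e.g. read back from the database
    static String50 unsafeForce(String value) {
        if (value.length() > MAX_LENGTH) {
            throw new IllegalArgumentException("value exceeds " + MAX_LENGTH + " characters");
        }
        return new String50(value);
    }

    String rawValue() { return value; }
}
```

The design point is that the only ways to obtain a `String50` force the caller to choose, at the call site, how an oversized value is handled: rejected, truncated, or treated as a programming error.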
diff --git a/site-in/unified-types/best-practices.md b/site-in/unified-types/best-practices.md new file mode 100644 index 0000000000..f85c7241ce --- /dev/null +++ b/site-in/unified-types/best-practices.md @@ -0,0 +1,407 @@ +--- +title: Best Practices +sidebar_position: 4 +--- + +# Best Practices for Unified Types + +This guide covers patterns and recommendations for getting the most out of Unified Types. + +## Naming Conventions + +### Use Semantic Names + +Name types after what they represent, not how they're stored: + +```yaml +# Good - semantic names +types: + Email: + db: { column: ["*email*"] } + PhoneNumber: + db: { column: [phone, mobile, telephone] } + IsActive: + db: { column: [is_active, active, enabled] } + +# Avoid - technical names +types: + Varchar50: # Too generic + BooleanColumn: # Doesn't convey meaning +``` + +### Follow Language Conventions + +Type names should match your target language's conventions: + +| Language | Convention | Example | +|----------|------------|---------| +| Java | PascalCase | `CustomerId`, `IsActive` | +| Kotlin | PascalCase | `CustomerId`, `IsActive` | +| Scala | PascalCase | `CustomerId`, `IsActive` | + +## Pattern Design + +### Start Specific, Generalize Carefully + +Start with specific column names, then add patterns as needed: + +```yaml +types: + # Start specific + Email: + db: + column: [email] + + # Later, if you find more columns, add patterns + Email: + db: + column: [email, user_email, contact_email, "*_email"] +``` + +### Use Table Scoping for Ambiguous Names + +When column names are ambiguous, scope them to tables: + +```yaml +types: + # "name" is too ambiguous globally + ProductName: + db: + table: [products, product_*] + column: [name, product_name] + + CustomerName: + db: + table: [customers, customer_*] + column: [name, full_name] +``` + +### Primary Keys Deserve Their Own Types + +Create distinct types for primary keys to prevent mixing IDs: + +```yaml +types: + CustomerId: + db: + column: [customer_id] + 
primary_key: true + api: + name: [customerId] + + ProductId: + db: + column: [product_id] + primary_key: true + api: + name: [productId] + + # Don't do this - mixing different IDs + # GenericId: + # db: + # column: ["*_id"] + # primary_key: true +``` + +## Organization + +### Group Related Types + +Organize your configuration logically: + +```yaml +# config/types-identity.yaml +types: + CustomerId: + # ... + ProductId: + # ... + OrderId: + # ... +``` + +```yaml +# config/types-personal.yaml +types: + FirstName: + # ... + LastName: + # ... + Email: + # ... + PhoneNumber: + # ... +``` + +```yaml +# config/types-audit.yaml +types: + CreatedAt: + # ... + UpdatedAt: + # ... + CreatedBy: + # ... +``` + +### Use Include Files + +Split large configurations: + +```yaml +# typr-types.yaml +version: 1 +include: + - config/sources.yaml + - config/types-identity.yaml + - config/types-personal.yaml + - config/types-audit.yaml + - config/types-flags.yaml + - config/output.yaml +``` + +## Cross-Source Alignment + +### Match Naming Variations + +Account for different naming conventions across sources: + +```yaml +types: + FirstName: + db: + # SQL conventions: snake_case, abbreviated + column: [first_name, firstname, fname] + api: + # API conventions: camelCase + name: [firstName, fname] +``` + +### Handle Legacy Systems + +When integrating legacy databases with modern APIs: + +```yaml +types: + IsActive: + db: + # Legacy uses various patterns + column: [is_active, active, status_active, act_flg] + api: + # Modern API uses consistent naming + name: [isActive] +``` + +## Type Safety Levels + +### Progression of Type Safety + +You can adopt Unified Types incrementally: + +**Level 1: Identity Types Only** +```yaml +types: + CustomerId: + db: { column: [customer_id], primary_key: true } + ProductId: + db: { column: [product_id], primary_key: true } +``` + +**Level 2: Add Common Strings** +```yaml +types: + Email: + db: { column: ["*email*"] } + FirstName: + db: { column: [first_name, 
firstname] } +``` + +**Level 3: Add Flags** +```yaml +types: + IsActive: + db: { column: [is_active, active] } + IsVerified: + db: { column: [is_verified, verified] } +``` + +**Level 4: Full Coverage** +```yaml +types: + # Identity, strings, flags, AND... + Currency: + db: { annotation: ["@currency"] } + Money: + db: { column: ["*price*", "*amount*", "*cost*"] } +``` + +## Testing + +### Validate Configuration Regularly + +Add validation to your CI/CD: + +```bash +typr validate --strict +``` + +### Test Type Changes + +When modifying type definitions, regenerate and compile: + +```bash +typr generate && ./gradlew compileJava +``` + +### Document Breaking Changes + +When a type mapping changes, document it: + +```yaml +types: + # BREAKING: Renamed from PhoneNumber to Phone in v2.0 + # Migration: Update all usages of PhoneNumber to Phone + Phone: + db: { column: [phone, mobile, telephone] } +``` + +## Common Patterns + +### Boolean Flag Pattern + +```yaml +types: + IsActive: + db: { column: [is_active, active, enabled] } + api: { name: [isActive, active, enabled] } + + IsDeleted: + db: { column: [is_deleted, deleted, removed] } + api: { name: [isDeleted, deleted] } + + IsVerified: + db: { column: [is_verified, verified, confirmed] } + api: { name: [isVerified, verified] } +``` + +### Audit Field Pattern + +```yaml +types: + CreatedAt: + db: { column: [created_at, created_date, create_date, date_created] } + api: { name: [createdAt, dateCreated] } + + UpdatedAt: + db: { column: [updated_at, modified_at, last_modified, date_modified] } + api: { name: [updatedAt, modifiedAt, lastModified] } + + CreatedBy: + db: { column: [created_by, creator, author] } + api: { name: [createdBy, creator] } + + UpdatedBy: + db: { column: [updated_by, modified_by, last_modified_by] } + api: { name: [updatedBy, modifiedBy] } +``` + +### Foreign Key Pattern + +```yaml +types: + # Match foreign keys by what they reference + CustomerId: + db: + column: [customer_id] + references: [customers] 
+ + ProductId: + db: + column: [product_id] + references: [products] +``` + +### Sensitive Data Pattern + +Use annotations in column comments: + +```sql +-- In your database +COMMENT ON COLUMN users.ssn IS '@sensitive Social Security Number'; +COMMENT ON COLUMN users.password_hash IS '@sensitive @no-log Password hash'; +``` + +```yaml +types: + SensitiveString: + db: + annotation: ["@sensitive"] + api: + extension: + x-sensitive: "true" +``` + +## Performance Considerations + +### Keep Type Count Reasonable + +While more types provide more safety, too many can: +- Increase compilation time +- Make IDE autocomplete slower +- Create cognitive overhead + +**Recommendation**: Start with 10-20 core types, add more as needed. + +### Use Specific Patterns + +More specific patterns match faster: + +```yaml +# Faster - specific column names +Email: + db: { column: [email, user_email] } + +# Slower - glob pattern scans all columns +Email: + db: { column: ["*email*"] } +``` + +## Troubleshooting + +### Type Not Matching + +Check these common issues: + +1. **Case sensitivity**: Column names are case-sensitive in patterns +2. **Schema qualification**: Add schema to narrow matches +3. **Pattern syntax**: Globs use `*` for any characters, `?` for single character + +```bash +# Debug matching +typr types show Email --verbose +``` + +### Conflicting Types + +When multiple types could match: + +```yaml +# More specific type should come first +types: + PrimaryEmail: + db: + table: [users] + column: [email] + + Email: + db: + column: ["*email*"] +``` + +### Generated Code Issues + +If generated code doesn't compile: + +1. Check for name collisions +2. Verify all matched columns have compatible types +3. 
Run `typr validate --strict` diff --git a/site-in/unified-types/cli.md b/site-in/unified-types/cli.md new file mode 100644 index 0000000000..94565ef828 --- /dev/null +++ b/site-in/unified-types/cli.md @@ -0,0 +1,401 @@ +--- +title: CLI & TUI Tool +sidebar_position: 3 +--- + +# Typr CLI & TUI + +The Typr command-line tool provides both a powerful CLI for automation and an interactive TUI (Terminal User Interface) for exploring and configuring your data sources interactively. + +## Installation + +```bash +# Via Coursier (recommended) +cs install typr + +# Via Homebrew (macOS) +brew install oyvindberg/tap/typr + +# Via npm (cross-platform) +npm install -g @typr/cli + +# Or download directly +curl -L https://github.com/oyvindberg/typr/releases/latest/download/typr-cli.jar -o typr.jar +java -jar typr.jar --help +``` + +## Quick Start + +```bash +# Initialize a new project +typr init + +# Launch interactive TUI +typr tui + +# Generate code from config +typr generate + +# Validate configuration +typr validate +``` + +## Interactive TUI + +The TUI provides a visual interface for exploring and configuring data sources: + +```bash +typr tui +``` + +``` +┌─────────────────────────────────────────────────────────────────┐ +│ TYPR - Unified Type Generator v2.0.0 │ +├─────────────────────────────────────────────────────────────────┤ +│ │ +│ Data Sources │ +│ ──────────── │ +│ ▸ postgres PostgreSQL 16.1 production ● Connected │ +│ mariadb MariaDB 11.2 legacy ● Connected │ +│ api OpenAPI 3.0 customers.yaml ✓ Loaded │ +│ │ +│ + Add Source │ +│ │ +├─────────────────────────────────────────────────────────────────┤ +│ Unified Types │ +│ ───────────── │ +│ ▸ FirstName 4 db columns, 3 api fields │ +│ LastName 4 db columns, 3 api fields │ +│ Email 6 db columns, 4 api fields │ +│ IsActive 8 db columns, 5 api fields │ +│ CustomerId 2 db columns, 2 api fields (primary key) │ +│ │ +│ + Add Type │ +│ │ +├─────────────────────────────────────────────────────────────────┤ +│ [G]enerate 
[V]alidate [S]ave [Q]uit [?]Help │ +└─────────────────────────────────────────────────────────────────┘ +``` + +### Adding a Data Source + +Press `+` on "Add Source" to open the connection wizard: + +``` +┌─────────────────────────────────────────────────────────────────┐ +│ Add Data Source │ +├─────────────────────────────────────────────────────────────────┤ +│ │ +│ Source Type: │ +│ ──────────── │ +│ ▸ PostgreSQL │ +│ MariaDB / MySQL │ +│ SQL Server │ +│ Oracle │ +│ DuckDB │ +│ OpenAPI Specification │ +│ │ +├─────────────────────────────────────────────────────────────────┤ +│ ↑↓ Select Enter Confirm Esc Cancel │ +└─────────────────────────────────────────────────────────────────┘ +``` + +After selecting a type, configure the connection: + +``` +┌─────────────────────────────────────────────────────────────────┐ +│ Configure PostgreSQL Connection │ +├─────────────────────────────────────────────────────────────────┤ +│ │ +│ Name: [postgres___________] │ +│ Host: [localhost__________] │ +│ Port: [5432__] │ +│ Database: [production_________] │ +│ Username: [postgres___________] │ +│ Password: [••••••••___________] │ +│ │ +│ Schemas: [public, person, sales_____________________] │ +│ │ +│ [ ] Use SSL │ +│ [ ] Use environment variables │ +│ │ +├─────────────────────────────────────────────────────────────────┤ +│ [T]est Connection [S]ave [C]ancel │ +└─────────────────────────────────────────────────────────────────┘ +``` + +### Exploring Schema + +Select a data source and press Enter to explore its schema: + +``` +┌─────────────────────────────────────────────────────────────────┐ +│ postgres - Schema Explorer │ +├─────────────────────────────────────────────────────────────────┤ +│ Tables & Views │ Columns │ +│ ──────────────── │ ─────── │ +│ ▸ person.person │ businessentityid int4 PK │ +│ person.address │ persontype bpchar │ +│ person.emailaddress │ namestyle bool │ +│ sales.customer │ title varchar │ +│ sales.salesorderheader │ firstname varchar ✓ │ +│ 
sales.salesorderdetail │ middlename varchar │ +│ production.product │ lastname varchar ✓ │ +│ production.productcategory │ suffix varchar │ +│ humanresources.employee │ emailpromotion int4 │ +│ ... │ modifieddate timestamp │ +│ │ │ +│ [/] Search [F] Filter │ [T] Map Type [U] Unmap │ +├─────────────────────────────────────────────────────────────────┤ +│ ✓ = mapped to unified type │ +└─────────────────────────────────────────────────────────────────┘ +``` + +### Creating Type Mappings + +Press `T` on a column to map it to a unified type: + +``` +┌─────────────────────────────────────────────────────────────────┐ +│ Map Column to Unified Type │ +├─────────────────────────────────────────────────────────────────┤ +│ │ +│ Column: person.person.firstname (varchar 50) │ +│ │ +│ Select or create type: │ +│ ────────────────────── │ +│ ▸ FirstName (existing - 3 matches) │ +│ Name (existing - 1 match) │ +│ + Create new type "Firstname" │ +│ │ +│ Auto-suggestions based on column name: │ +│ ────────────────────────────────────── │ +│ Also match: first_name, fname, firstName │ +│ [ ] Add patterns automatically │ +│ │ +├─────────────────────────────────────────────────────────────────┤ +│ Enter Select N New Type Esc Cancel │ +└─────────────────────────────────────────────────────────────────┘ +``` + +### Auto-Discovery + +The TUI can automatically suggest type mappings based on naming patterns: + +``` +┌─────────────────────────────────────────────────────────────────┐ +│ Auto-Discover Type Mappings │ +├─────────────────────────────────────────────────────────────────┤ +│ │ +│ Analyzing 3 sources, 47 tables, 312 columns... 
│ +│ │ +│ Suggested Types: │ +│ ──────────────── │ +│ │ +│ [✓] Email │ +│ postgres: person.emailaddress.emailaddress │ +│ postgres: sales.customer.emailaddress │ +│ mariadb: customers.email │ +│ api: Customer.email, User.email │ +│ │ +│ [✓] FirstName │ +│ postgres: person.person.firstname │ +│ mariadb: customers.first_name, employees.first_name │ +│ api: Customer.firstName, Employee.firstName │ +│ │ +│ [ ] Phone (needs review - mixed patterns) │ +│ postgres: person.personphone.phonenumber │ +│ mariadb: customers.phone, customers.mobile │ +│ api: Customer.phoneNumber │ +│ │ +│ [?] CreatedAt vs CreateDate (naming conflict) │ +│ Choose: [CreatedAt] [CreateDate] [Both] [Skip] │ +│ │ +├─────────────────────────────────────────────────────────────────┤ +│ Space Toggle A Accept All R Review Esc Cancel │ +└─────────────────────────────────────────────────────────────────┘ +``` + +## CLI Commands + +### Initialize Project + +```bash +# Interactive initialization +typr init + +# With options +typr init --lang java --json jackson --output ./generated +``` + +### Generate Code + +```bash +# Generate from config file +typr generate + +# Specify config file +typr generate --config typr-types.yaml + +# Generate specific sources only +typr generate --sources postgres,api + +# Watch mode - regenerate on file changes +typr generate --watch +``` + +### Validate Configuration + +```bash +# Validate config and test connections +typr validate + +# Verbose output +typr validate --verbose + +# Check specific sources +typr validate --sources postgres +``` + +### Schema Commands + +```bash +# List tables in a source +typr schema list postgres + +# Show table details +typr schema show postgres person.person + +# Export schema to JSON +typr schema export postgres --output schema.json + +# Compare schemas between sources +typr schema diff postgres mariadb +``` + +### Type Commands + +```bash +# List defined types +typr types list + +# Show type details and matches +typr types show Email + +# 
Add a new type interactively +typr types add + +# Add type from command line +typr types add FirstName \ + --db-column "first_name,firstname" \ + --api-name "firstName" + +# Find unmapped columns +typr types suggest + +# Remove a type +typr types remove PhoneNumber +``` + +### Source Commands + +```bash +# List configured sources +typr sources list + +# Test source connection +typr sources test postgres + +# Add source interactively +typr sources add + +# Remove source +typr sources remove legacy-db +``` + +## Watch Mode + +The watch mode automatically regenerates code when source files change: + +```bash +typr generate --watch +``` + +``` +Watching for changes... + +[12:34:56] api/openapi.yaml changed +[12:34:56] Regenerating api... +[12:34:57] Generated 12 files in ./generated/api + +[12:35:10] typr-types.yaml changed +[12:35:10] Regenerating all sources... +[12:35:12] Generated 156 files in ./generated + +Press Ctrl+C to stop +``` + +## CI/CD Integration + +### GitHub Actions + +```yaml +name: Generate Types +on: + push: + paths: + - 'api/**' + - 'typr-types.yaml' + +jobs: + generate: + runs-on: ubuntu-latest + services: + postgres: + image: postgres:16 + env: + POSTGRES_PASSWORD: test + ports: + - 5432:5432 + + steps: + - uses: actions/checkout@v4 + + - name: Install Typr + run: | + curl -L https://github.com/oyvindberg/typr/releases/latest/download/typr-cli -o typr + chmod +x typr + + - name: Validate Configuration + run: ./typr validate + + - name: Generate Code + run: ./typr generate + + - name: Commit Generated Code + run: | + git add generated/ + git commit -m "chore: regenerate types" || exit 0 + git push +``` + +### Pre-commit Hook + +```bash +#!/bin/bash +# .git/hooks/pre-commit + +# Validate configuration before commit +typr validate --quiet || { + echo "Typr validation failed. Please fix configuration." 
+ exit 1 +} + +# Regenerate and stage changes +typr generate +git add generated/ +``` + +## Configuration Reference + +See [YAML Configuration](./yaml-config.md) for complete configuration options. diff --git a/site-in/unified-types/overview.md b/site-in/unified-types/overview.md new file mode 100644 index 0000000000..3e1d133032 --- /dev/null +++ b/site-in/unified-types/overview.md @@ -0,0 +1,285 @@ +--- +title: Unified Types +sidebar_position: 1 +--- + +# Unified Types: One Type System Across Your Entire Stack + +**Unified Types** is one of Typr's most powerful features. It lets you define semantic types once and automatically apply them across multiple databases and your OpenAPI specifications. + +Imagine having `FirstName`, `Email`, `IsActive`, and `CustomerId` types that work seamlessly whether the data comes from PostgreSQL, MariaDB, or your REST API. No more manual synchronization. No more type mismatches. Just pure, compile-time safety across your entire system. + +## The Problem + +Modern applications rarely have a single data source: + +``` +┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐ +│ PostgreSQL │ │ MariaDB │ │ REST API │ +│ │ │ │ │ (OpenAPI) │ +│ person.firstname│ │ customers. │ │ Customer. │ +│ (varchar 50) │ │ first_name │ │ firstName │ +│ │ │ (varchar 50) │ │ (string) │ +└────────┬────────┘ └────────┬────────┘ └────────┬────────┘ + │ │ │ + ▼ ▼ ▼ + String? String? String? +``` + +Without unified types, you end up with three different `String` types representing the same concept. There's no compile-time connection between them. + +## The Solution + +With Unified Types: + +``` +┌─────────────────┐ ┌─────────────────┐ ┌─────────────────┐ +│ PostgreSQL │ │ MariaDB │ │ REST API │ +│ │ │ │ │ (OpenAPI) │ +│ person.firstname│ │ customers. │ │ Customer. 
│ +│ │ │ first_name │ │ firstName │ +└────────┬────────┘ └────────┬────────┘ └────────┬────────┘ + │ │ │ + └───────────────────────┼───────────────────────┘ + │ + ▼ + ┌────────────────┐ + │ FirstName │ + │ │ + │ value: String │ + │ max: 50 chars │ + └────────────────┘ +``` + +One type. Three sources. Complete type safety. + +## Generated Code + +When you define unified types, Typr generates a shared type with documentation showing all its sources: + +```java +/** + * Shared type `FirstName` aligned across sources: + * - postgres (PostgreSQL): person.person.firstname + * - mariadb (MariaDB): customers.first_name + * - api (OpenAPI): Customer.firstName (model), CustomerCreate.firstName (model) + */ +public record FirstName(@JsonValue String value) { + // Database type adapters + public static MariaType mariaType = + MariaTypes.varchar.bimap(FirstName::new, FirstName::value); + + public static PgType pgType = + PgTypes.text.bimap(FirstName::new, FirstName::value); +} +``` + +The generated code includes: +- Documentation listing every source location +- Database type adapters for each matched database +- JSON serialization support +- Full interoperability across all sources + +## Defining Type Mappings + +Type definitions use **predicates** that match columns and fields. When a match is found, Typr uses your named type instead of the default. 
+ +### Basic Example + +```scala +import typr.* + +val types = TypeDefinitions( + // Match by column/field name + TypeEntry("FirstName", + db = DbMatch.column("first_name", "firstname"), + api = ApiMatch.name("firstName") + ), + + // Match by name pattern (glob syntax) + TypeEntry("Email", + db = DbMatch.column("*email*"), + api = ApiMatch.name("*email*", "*Email*") + ), + + // Match boolean flags + TypeEntry("IsActive", + db = DbMatch.column("is_active"), + api = ApiMatch.name("isActive", "active") + ) +) +``` + +### Advanced Matching + +```scala +// Match by database and schema +TypeEntry("CustomerId", + db = DbMatch.Empty.copy( + database = List("production"), + schema = List("customers"), + column = List("customer_id", "cust_id"), + primaryKey = Some(true) + ), + api = ApiMatch.name("customerId") +) + +// Match by OpenAPI format +TypeEntry("UUID", + db = DbMatch.Empty.copy(dbType = List("uuid")), + api = ApiMatch.format("uuid") +) + +// Match by API location +TypeEntry("AuthToken", + db = DbMatch.Empty, + api = ApiMatch.Empty.copy( + location = List(ApiLocation.HeaderParam), + name = List("Authorization", "X-Auth-Token") + ) +) + +// Match by column comment annotation +TypeEntry("Currency", + db = DbMatch.annotation("@currency"), + api = ApiMatch.extension("x-currency", "true") +) +``` + +## Match Semantics + +The matching system is designed to be both powerful and intuitive: + +| Rule | Meaning | +|------|---------| +| Empty list | Matches anything (wildcard) | +| Non-empty list | Matches if **any** pattern matches (OR) | +| Multiple fields | **All** non-empty fields must match (AND) | +| Glob patterns | `*` matches any sequence, `?` matches single char | + +### Examples + +```scala +// Matches ANY column named 'email' in ANY database +DbMatch.column("email") + +// Matches 'email' columns in the 'users' table only +DbMatch.Empty.copy(table = List("users"), column = List("email")) + +// Matches 'email' OR 'user_email' columns +DbMatch.column("email", 
"user_email") + +// Matches any column ending in '_email' +DbMatch.column("*_email") +``` + +## Using with Code Generation + +Pass your type definitions to the code generation: + +```scala +import typr.* + +// Define your shared types +val sharedTypes = TypeDefinitions( + TypeEntry("FirstName", + db = DbMatch.column("first_name", "firstname"), + api = ApiMatch.name("firstName") + ), + TypeEntry("Email", + db = DbMatch.column("*email*"), + api = ApiMatch.name("*email*") + ) +) + +// Generate database code +typr.generateFromDb( + dataSource = postgresDataSource, + options = Options( + pkg = "myapp.db.postgres", + lang = Lang.Java, + dbLib = Some(DbLibName.Typo), + typeDefinitions = sharedTypes + ), + targetFolder = generatedPath / "postgres", + selector = Selector.All +) + +// Generate OpenAPI code +typr.openapi.generateFromSpec( + specPath = Path.of("api/openapi.yaml"), + options = OpenApiOptions( + pkg = "myapp.api", + lang = Lang.Java, + typeDefinitions = sharedTypes + ), + targetFolder = generatedPath / "api" +) +``` + +## Real-World Example + +Here's a complete example combining PostgreSQL, MariaDB, and an OpenAPI spec: + +```scala +val enterpriseTypes = TypeDefinitions( + // Identity types + TypeEntry("CustomerId", + db = DbMatch.column("customer_id").copy(primaryKey = Some(true)), + api = ApiMatch.name("customerId") + ), + TypeEntry("EmployeeId", + db = DbMatch.column("employee_id", "emp_id").copy(primaryKey = Some(true)), + api = ApiMatch.name("employeeId") + ), + + // Common string types + TypeEntry("FirstName", + db = DbMatch.column("first_name", "firstname", "fname"), + api = ApiMatch.name("firstName", "fname") + ), + TypeEntry("LastName", + db = DbMatch.column("last_name", "lastname", "lname"), + api = ApiMatch.name("lastName", "lname") + ), + TypeEntry("Email", + db = DbMatch.column("*email*"), + api = ApiMatch.name("*email*", "*Email*") + ), + + // Boolean flags + TypeEntry("IsActive", + db = DbMatch.column("is_active", "active"), + api = 
ApiMatch.name("isActive", "active") + ), + TypeEntry("IsVerified", + db = DbMatch.column("is_verified", "verified"), + api = ApiMatch.name("isVerified", "verified") + ), + + // Audit fields + TypeEntry("CreatedAt", + db = DbMatch.column("created_at", "created_date"), + api = ApiMatch.name("createdAt") + ), + TypeEntry("UpdatedAt", + db = DbMatch.column("updated_at", "modified_at", "last_modified"), + api = ApiMatch.name("updatedAt", "modifiedAt") + ) +) +``` + +## Benefits + +1. **Single Source of Truth**: Define your semantic types once, use everywhere +2. **Compile-Time Safety**: The compiler catches type mismatches across all systems +3. **Self-Documenting**: Generated code shows exactly which sources use each type +4. **Refactoring Confidence**: Rename a type and see all affected code instantly +5. **Team Alignment**: Clear contracts between database and API teams +6. **Migration Safety**: Add a new database and types automatically align + +## Next Steps + +- [YAML Configuration](./yaml-config.md) - Define types in a configuration file +- [CLI Tool](./cli.md) - Manage data sources and type matching interactively +- [Best Practices](./best-practices.md) - Patterns for organizing type definitions diff --git a/site-in/unified-types/yaml-config.md b/site-in/unified-types/yaml-config.md new file mode 100644 index 0000000000..d169becc48 --- /dev/null +++ b/site-in/unified-types/yaml-config.md @@ -0,0 +1,516 @@ +--- +title: YAML Configuration +sidebar_position: 2 +--- + +# YAML Configuration + +While you can define unified types programmatically in Scala/Java/Kotlin, Typr also supports a declarative YAML format that makes type definitions easy to read, share, and version control. 
+ +## Basic Structure + +```yaml +# typr-types.yaml +version: 1 + +sources: + postgres: + type: postgresql + host: localhost + port: 5432 + database: production + username: ${POSTGRES_USER} + password: ${POSTGRES_PASSWORD} + + mariadb: + type: mariadb + host: localhost + port: 3306 + database: legacy + username: ${MARIADB_USER} + password: ${MARIADB_PASSWORD} + + api: + type: openapi + spec: ./api/openapi.yaml + +types: + # Simple column name matching + FirstName: + db: + column: [first_name, firstname] + api: + name: [firstName] + + # Pattern matching with globs + Email: + db: + column: ["*email*"] + api: + name: ["*email*", "*Email*"] + + # Boolean flags + IsActive: + db: + column: [is_active, active] + api: + name: [isActive, active] +``` + +## Data Source Configuration + +### PostgreSQL + +```yaml +sources: + postgres: + type: postgresql + host: localhost + port: 5432 + database: myapp + username: ${POSTGRES_USER} + password: ${POSTGRES_PASSWORD} + # Optional settings + ssl: require + schemas: [public, app] +``` + +### MariaDB / MySQL + +```yaml +sources: + mariadb: + type: mariadb + host: localhost + port: 3306 + database: myapp + username: ${MARIADB_USER} + password: ${MARIADB_PASSWORD} +``` + +### SQL Server + +```yaml +sources: + sqlserver: + type: sqlserver + host: localhost + port: 1433 + database: myapp + username: sa + password: ${SQLSERVER_PASSWORD} + encrypt: false +``` + +### Oracle + +```yaml +sources: + oracle: + type: oracle + host: localhost + port: 1521 + service: FREEPDB1 + username: ${ORACLE_USER} + password: ${ORACLE_PASSWORD} +``` + +### DuckDB + +```yaml +sources: + duckdb: + type: duckdb + path: ./data/analytics.duckdb +``` + +### OpenAPI + +```yaml +sources: + api: + type: openapi + spec: ./api/openapi.yaml + # Or multiple specs + specs: + - ./api/customers.yaml + - ./api/products.yaml +``` + +## Type Definitions + +### Column/Field Name Matching + +```yaml +types: + CustomerId: + db: + column: [customer_id, cust_id] + api: + name: 
[customerId] + + # Glob patterns + Email: + db: + column: ["*email*"] + api: + name: ["*email*", "*Email*"] +``` + +### Scoped Matching + +```yaml +types: + # Only match in specific tables + ProductPrice: + db: + table: [products, product_variants] + column: [price, unit_price] + api: + name: [price, unitPrice] + + # Only match in specific schemas + AuditUser: + db: + schema: [audit] + column: [user_id, modified_by] + api: + path: ["/audit/*"] + name: [userId] +``` + +### Primary Key Types + +```yaml +types: + CustomerId: + db: + column: [customer_id] + primary_key: true + api: + name: [customerId] +``` + +### Boolean Flags + +```yaml +types: + IsActive: + db: + column: [is_active, active] + api: + name: [isActive, active] + + IsVerified: + db: + column: [is_verified, verified, email_verified] + api: + name: [isVerified, verified, emailVerified] +``` + +### API Location Matching + +```yaml +types: + AuthToken: + api: + location: [header] + name: [Authorization, X-Auth-Token] + + PageSize: + api: + location: [query] + name: [pageSize, limit] + + RequestBody: + api: + location: [request_body, response_body] + name: [data, payload] +``` + +### Format and Type Matching + +```yaml +types: + UUID: + db: + db_type: [uuid] + api: + format: [uuid] + + DateTime: + db: + db_type: [timestamp, timestamptz] + api: + format: [date-time] +``` + +### Comment Annotations + +```yaml +types: + Currency: + db: + annotation: ["@currency"] + api: + extension: + x-currency: "true" + + Sensitive: + db: + annotation: ["@sensitive", "@pii"] + api: + extension: + x-sensitive: "true" +``` + +## Output Configuration + +```yaml +output: + # Shared types package + shared: + package: com.myapp.shared + path: ./generated/shared + + # Per-source output + sources: + postgres: + package: com.myapp.db.postgres + path: ./generated/postgres + + mariadb: + package: com.myapp.db.mariadb + path: ./generated/mariadb + + api: + package: com.myapp.api + path: ./generated/api + + # Generation options + options: 
+ lang: java # java, kotlin, scala + json: jackson # jackson, circe, play-json, zio-json + enable_dsl: true + enable_test_inserts: true + enable_mock_repos: true +``` + +## Environment Variables + +Use `${VAR}` or `${VAR:-default}` syntax for sensitive values: + +```yaml +sources: + postgres: + type: postgresql + host: ${POSTGRES_HOST:-localhost} + port: ${POSTGRES_PORT:-5432} + database: ${POSTGRES_DB} + username: ${POSTGRES_USER} + password: ${POSTGRES_PASSWORD} +``` + +## Multiple Configurations + +You can split configurations across files: + +```yaml +# typr-types.yaml +version: 1 +include: + - ./config/sources.yaml + - ./config/types-identity.yaml + - ./config/types-common.yaml + - ./config/types-audit.yaml +``` + +```yaml +# config/sources.yaml +sources: + postgres: + type: postgresql + # ... +``` + +```yaml +# config/types-identity.yaml +types: + CustomerId: + db: + column: [customer_id] + primary_key: true + api: + name: [customerId] +``` + +## Validation + +The YAML configuration is validated at load time: + +``` +$ typr validate typr-types.yaml + +Validating typr-types.yaml... + +Sources: + postgres: Connected (PostgreSQL 16.1) + mariadb: Connected (MariaDB 11.2) + api: Loaded (3 endpoints, 12 schemas) + +Types: + FirstName: Matched 4 columns, 3 API fields + LastName: Matched 4 columns, 3 API fields + Email: Matched 6 columns, 4 API fields + IsActive: Matched 8 columns, 5 API fields + +Warnings: + - Type 'PhoneNumber' has no matches in 'postgres' + +Configuration valid. 
+``` + +## Complete Example + +```yaml +version: 1 + +sources: + postgres: + type: postgresql + host: ${POSTGRES_HOST:-localhost} + port: ${POSTGRES_PORT:-5432} + database: production + username: ${POSTGRES_USER} + password: ${POSTGRES_PASSWORD} + schemas: [public, person, sales] + + mariadb: + type: mariadb + host: ${MARIADB_HOST:-localhost} + port: ${MARIADB_PORT:-3306} + database: legacy + username: ${MARIADB_USER} + password: ${MARIADB_PASSWORD} + + customers-api: + type: openapi + spec: ./api/customers.yaml + + products-api: + type: openapi + spec: ./api/products.yaml + +types: + # Identity types + CustomerId: + db: + column: [customer_id, cust_id] + primary_key: true + api: + name: [customerId] + + EmployeeId: + db: + column: [employee_id, emp_id] + primary_key: true + api: + name: [employeeId] + + ProductId: + db: + column: [product_id, prod_id] + primary_key: true + api: + name: [productId] + + # String types + FirstName: + db: + column: [first_name, firstname, fname] + api: + name: [firstName, fname] + + LastName: + db: + column: [last_name, lastname, lname] + api: + name: [lastName, lname] + + Email: + db: + column: ["*email*"] + api: + name: ["*email*", "*Email*"] + + PhoneNumber: + db: + column: [phone, phone_number, mobile] + api: + name: [phone, phoneNumber, mobile] + + # Boolean flags + IsActive: + db: + column: [is_active, active] + api: + name: [isActive, active] + + IsVerified: + db: + column: [is_verified, verified] + api: + name: [isVerified, verified] + + IsPrimary: + db: + column: [is_primary, is_default] + api: + name: [isPrimary, isDefault] + + # Audit fields + CreatedAt: + db: + column: [created_at, created_date, create_date] + api: + name: [createdAt] + + UpdatedAt: + db: + column: [updated_at, modified_at, last_modified] + api: + name: [updatedAt, modifiedAt] + + CreatedBy: + db: + column: [created_by, creator] + api: + name: [createdBy] + +output: + shared: + package: com.acme.shared.types + path: ./generated/shared + + sources: + 
postgres: + package: com.acme.db.postgres + path: ./generated/postgres + mariadb: + package: com.acme.db.mariadb + path: ./generated/mariadb + customers-api: + package: com.acme.api.customers + path: ./generated/api/customers + products-api: + package: com.acme.api.products + path: ./generated/api/products + + options: + lang: java + json: jackson + enable_dsl: true + enable_test_inserts: true + enable_mock_repos: true + enable_precise_types: true +``` diff --git a/site/.gitignore b/site/.gitignore index b2d6de3062..6245e29f34 100644 --- a/site/.gitignore +++ b/site/.gitignore @@ -18,3 +18,4 @@ npm-debug.log* yarn-debug.log* yarn-error.log* +showcase-generated diff --git a/site/blog/2023-11-24-hello-zio.md b/site/blog/2023-11-24-hello-zio.md index 1626246a43..ca115f5365 100644 --- a/site/blog/2023-11-24-hello-zio.md +++ b/site/blog/2023-11-24-hello-zio.md @@ -29,7 +29,7 @@ We fixed a bunch of issues while working on this PR, so it should be pretty clos ### Implemented missing features in `zio-jdbc` `zio-jdbc` does not support postgres arrays, and it does not support -the [COPY API for streaming inserts](/db/other-features/streaming-inserts). +the [COPY API for streaming inserts](/typr/boundaries/databases/other-features/streaming-inserts). Typo outputs code which implements both of these features. @@ -65,11 +65,11 @@ Notice how the signatures use `ZIO`, `ZStream`, `ZConnection`. ### `zio-schema` is not used We opted to *not* go through zio-schema for the generated code. It was not clear that it was possible to implement all -PostgreSQL features through `zio-schema`, and we wanted to generate code which is as [fast to compile](/db/other-features/faster-compilation) as possible. +PostgreSQL features through `zio-schema`, and we wanted to generate code which is as [fast to compile](/typr/boundaries/databases/other-features/faster-compilation) as possible. 
### Also support for `zio-json` -Typo supports generating [JSON codecs](/db/other-features/json) for all the row types. +Typo supports generating [JSON codecs](/typr/boundaries/databases/other-features/json) for all the row types. The PR also adds support for `zio-json`, so you can get codecs like this: ```scala diff --git a/site/docs-api/index.md b/site/docs-api/index.md deleted file mode 100644 index a538f5e565..0000000000 --- a/site/docs-api/index.md +++ /dev/null @@ -1,29 +0,0 @@ ---- -title: OpenAPI Code Generator -sidebar_position: 1 -slug: / ---- - -# Typo OpenAPI Code Generator - -Typo includes a powerful OpenAPI code generator that produces **type-safe, idiomatic code** for multiple languages and frameworks from a single OpenAPI specification. - -## Cross-Language, Cross-Framework - -Unlike most OpenAPI generators that target a single language, Typo generates **semantically equivalent code** across: - -| Language | Server Frameworks | Client | -|----------|-------------------|--------| -| **Java** | JAX-RS, Spring Boot, Quarkus (reactive) | JDK HttpClient | -| **Kotlin** | JAX-RS, Spring Boot, Quarkus (reactive) | JDK HttpClient | -| **Scala** | Http4s, Spring Boot | Http4s, JDK HttpClient | - -All generated code shares the same API contract - the same type-safe interfaces, response types, and ID wrappers work identically across all targets. 
- -## Key Features - -- **Type-safe ID wrappers** - No more primitive strings for identifiers -- **Sealed response types** - Exhaustive pattern matching for HTTP status codes -- **Server + Client generation** - Both sides share the same interface -- **Framework-native code** - Idiomatic annotations and patterns for each framework -- **Reactive support** - Mutiny `Uni` for Quarkus, Cats Effect `IO` for Http4s diff --git a/site/docs-avro/reference/options.md b/site/docs-avro/reference/options.md deleted file mode 100644 index 6a10e04128..0000000000 --- a/site/docs-avro/reference/options.md +++ /dev/null @@ -1,150 +0,0 @@ ---- -title: Configuration Options ---- - -# Configuration Options - -Complete reference for `AvroOptions` configuration. - -## Basic Options - -```scala -val options = AvroOptions.default( - pkg = jvm.QIdent.parse("com.example.events"), - schemaSource = SchemaSource.Directory(Path.of("schemas")) -) -``` - -| Option | Type | Description | -|--------|------|-------------| -| `pkg` | `jvm.QIdent` | Base package for generated code | -| `schemaSource` | `SchemaSource` | Where to find Avro schemas | - -## Schema Source - -```scala -// Directory containing .avsc and .avpr files -SchemaSource.Directory(Path.of("schemas")) - -// Explicit list of files -SchemaSource.Files(List( - Path.of("OrderPlaced.avsc"), - Path.of("UserService.avpr") -)) -``` - -## Code Generation Options - -```scala -options.copy( - generateSchemaValidator = true, - enablePreciseTypes = true, - generateMockRepos = false -) -``` - -| Option | Default | Description | -|--------|---------|-------------| -| `generateSchemaValidator` | `false` | Generate schema validation utility | -| `enablePreciseTypes` | `false` | Generate `Decimal10_2` instead of `BigDecimal` | -| `generateMockRepos` | `false` | Generate mock implementations for testing | - -## Wire Format - -```scala -options.copy( - wireFormat = AvroWireFormat.ConfluentAvro -) -``` - -| Wire Format | Description | 
-|-------------|-------------| -| `AvroWireFormat.ConfluentAvro` | Binary Avro with Confluent Schema Registry | -| `AvroWireFormat.PlainAvro` | Binary Avro without registry | -| `AvroWireFormat.JsonEncoded(jsonLib)` | JSON serialization | - -## Effect Type - -```scala -options.copy( - effectType = EffectType.CompletableFuture -) -``` - -| Effect Type | Return Type | Use Case | -|-------------|-------------|----------| -| `EffectType.Blocking` | `T` | Synchronous | -| `EffectType.CompletableFuture` | `CompletableFuture` | Async Java | -| `EffectType.Mutiny` | `Uni` | Quarkus | -| `EffectType.CatsIO` | `IO[T]` | Cats Effect | -| `EffectType.ZIO` | `Task[T]` | ZIO | - -## Header Schemas - -```scala -options.copy( - headerSchemas = Map( - "standard" -> HeaderSchema(List( - HeaderField("correlationId", HeaderType.UUID, required = true), - HeaderField("timestamp", HeaderType.Instant, required = true), - HeaderField("source", HeaderType.String, required = false) - )) - ), - defaultHeaderSchema = Some("standard") -) -``` - -## Framework Integration - -```scala -options.copy( - frameworkIntegration = FrameworkIntegration.Spring, - generateKafkaEvents = true, - generateKafkaRpc = true -) -``` - -| Framework | Description | -|-----------|-------------| -| `FrameworkIntegration.None` | No framework annotations | -| `FrameworkIntegration.Spring` | Spring Boot annotations | -| `FrameworkIntegration.Quarkus` | Quarkus CDI annotations | - -## JSON Library - -```scala -options.copy( - jsonLibs = List(JsonLib.Jackson) -) -``` - -| JSON Library | Languages | Description | -|--------------|-----------|-------------| -| `JsonLib.Jackson` | Java, Kotlin, Scala | Jackson databind | -| `JsonLib.Circe` | Scala | Circe codecs | -| `JsonLib.ZioJson` | Scala | ZIO JSON | - -## Full Example - -```scala -val options = AvroOptions.default( - pkg = jvm.QIdent.parse("com.example.events"), - schemaSource = SchemaSource.Directory(Path.of("schemas")) -).copy( - wireFormat = 
AvroWireFormat.ConfluentAvro, - effectType = EffectType.CompletableFuture, - generateSchemaValidator = true, - enablePreciseTypes = true, - headerSchemas = Map( - "standard" -> HeaderSchema(List( - HeaderField("correlationId", HeaderType.UUID, required = true), - HeaderField("timestamp", HeaderType.Instant, required = true) - )) - ), - defaultHeaderSchema = Some("standard"), - frameworkIntegration = FrameworkIntegration.Spring, - generateKafkaEvents = true, - generateKafkaRpc = true, - jsonLibs = List(JsonLib.Jackson) -) -``` diff --git a/site/docs-db/comparison.md b/site/docs-db/comparison.md deleted file mode 100644 index ecc8dad3b8..0000000000 --- a/site/docs-db/comparison.md +++ /dev/null @@ -1,9 +0,0 @@ ---- -title: Comparison with jooq ---- - -- Typo is open source, jooq pesters you to buy a commercial version -- getting started is much easier with Typo's scala-cli script, you need to read the jooq manual for a long time and manually - copy/paste files, code and put it together -- Typo is specific to PostgreSQL, jooq needs to cover many databases. postgres support must therefore be half-assed -- \ No newline at end of file diff --git a/site/docs-db/other-features/testing-with-random-values.md b/site/docs-db/other-features/testing-with-random-values.md deleted file mode 100644 index 49c6b088ac..0000000000 --- a/site/docs-db/other-features/testing-with-random-values.md +++ /dev/null @@ -1,206 +0,0 @@ ---- -title: Testing with random values ---- - -This covers a lot of interesting ground, test-wise. - -If you enable `enableTestInserts` in `typr.Options` you now get a `TestInsert` class, with a method to insert a row for each table Typo knows about. -All values except ids, foreign keys and so on are *randomly generated*, but you can override them with named parameters. 
- -The idea is that you: -- can easily insert rows for testing -- can explicitly set the values you *do* care about -- will get random values for the rest -- are still forced to follow FKs to set up the data graph correctly -- it's easy to follow those FKs, because after inserting a row you get the persisted version back, including generated IDs -- can get the same values each time by hard coding the seed `new TestInsert(new scala.util.Random(0L))`, or you can run it multiple times with different seeds to see that the random values really do not matter -- do not need to write *any* code to get all this available to you, like the rest of Typo. - -In summary, this is a fantastic way of setting up complex test scenarios in the database! - -### Domains -If you use [postgres domains](../type-safety/domains.md) you typically want to affect the generation of data yourself. -For that reason there is a trait you need to implement and pass in. This only affects you if you use domains. - -```scala -import adventureworks.public.* - -import scala.util.Random - -// apply domain-specific rules here -object DomainInsert extends adventureworks.TestDomainInsert { - override def publicAccountNumber(random: Random): AccountNumber = AccountNumber(random.nextString(10)) - override def publicFlag(random: Random): Flag = Flag(random.nextBoolean()) - override def publicMydomain(random: Random): Mydomain = Mydomain(random.nextString(10)) - override def publicName(random: Random): Name = Name(random.nextString(10)) - override def publicNameStyle(random: Random): NameStyle = NameStyle(random.nextBoolean()) - override def publicPhone(random: Random): Phone = Phone(random.nextString(10)) - override def publicShortText(random: Random): ShortText = ShortText(random.nextString(10)) - override def publicOrderNumber(random: Random): OrderNumber = OrderNumber(random.nextString(10)) -} -``` - -### Usage example - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; - - - - -```scala 
-import adventureworks.customtypes.{Defaulted, TypoShort, TypoLocalDateTime, TypoXml} -import adventureworks.production.unitmeasure.UnitmeasureId -import adventureworks.TestInsert - -import scala.util.Random - -val testInsert = new TestInsert(new Random(0), DomainInsert) -// testInsert: TestInsert = TestInsert( -// random = scala.util.Random@12ce2cc7, -// domainInsert = repl.MdocSession$App$DomainInsert$@62f1b14b -// ) - -val unitmeasure = testInsert.productionUnitmeasure(UnitmeasureId("kgg")) -// unitmeasure: UnitmeasureRow = UnitmeasureRow( -// unitmeasurecode = UnitmeasureId(value = "kgg"), -// name = Name(value = "椕皿鈻瑜㶯眀㏝⑂Ᶎ䩇"), -// modifieddate = TypoLocalDateTime(value = 2025-12-06T09:48:20.342898) -// ) -val productCategory = testInsert.productionProductcategory() -// productCategory: ProductcategoryRow = ProductcategoryRow( -// productcategoryid = ProductcategoryId(value = 691), -// name = Name(value = "℣ٿ玁冧ἀ蓆鋥射ⅅ匫"), -// rowguid = TypoUUID(value = 51086b82-d280-11f0-89a7-0242c0a8d002), -// modifieddate = TypoLocalDateTime(value = 2025-12-06T09:48:20.342898) -// ) -val productSubcategory = testInsert.productionProductsubcategory(productCategory.productcategoryid) -// productSubcategory: ProductsubcategoryRow = ProductsubcategoryRow( -// productsubcategoryid = ProductsubcategoryId(value = 691), -// productcategoryid = ProductcategoryId(value = 691), -// name = Name(value = "䪾悲켺я瘊ꖗ欖뢐豶㤖"), -// rowguid = TypoUUID(value = 510d7712-d280-11f0-89a7-0242c0a8d002), -// modifieddate = TypoLocalDateTime(value = 2025-12-06T09:48:20.342898) -// ) -val productModel = testInsert.productionProductmodel(catalogdescription = Some(new TypoXml("")), instructions = Some(new TypoXml(""))) -// productModel: ProductmodelRow = ProductmodelRow( -// productmodelid = ProductmodelId(value = 691), -// name = Name(value = "衳⦥ፀ⟑骩誖ڠ鮩焸뭷"), -// catalogdescription = Some(value = TypoXml(value = "")), -// instructions = Some(value = TypoXml(value = "")), -// rowguid = TypoUUID(value = 
5113a9c0-d280-11f0-89a7-0242c0a8d002), -// modifieddate = TypoLocalDateTime(value = 2025-12-06T09:48:20.342898) -// ) -testInsert.productionProduct( - safetystocklevel = TypoShort(1), - reorderpoint = TypoShort(1), - standardcost = BigDecimal(1), - listprice = BigDecimal(1), - daystomanufacture = 10, - sellstartdate = TypoLocalDateTime.now, - sizeunitmeasurecode = Some(unitmeasure.unitmeasurecode), - weightunitmeasurecode = Some(unitmeasure.unitmeasurecode), - `class` = Some("H "), - style = Some("W "), - productsubcategoryid = Some(productSubcategory.productsubcategoryid), - productmodelid = Some(productModel.productmodelid) -) -// res1: ProductRow = ProductRow( -// productid = ProductId(value = 691), -// name = Name(value = "睍먀缶쏄迄넼䒄䣃㓎俠"), -// productnumber = "JXQqPyuxbr589wyJzS2S", -// makeflag = Flag(value = true), -// finishedgoodsflag = Flag(value = true), -// color = Some(value = "iHrAOB2RuvBbFbQ"), -// safetystocklevel = TypoShort(value = 1), -// reorderpoint = TypoShort(value = 1), -// standardcost = 1, -// listprice = 1, -// size = Some(value = "NB7Zu"), -// sizeunitmeasurecode = Some(value = UnitmeasureId(value = "kgg")), -// weightunitmeasurecode = Some(value = UnitmeasureId(value = "kgg")), -// weight = None, -// daystomanufacture = 10, -// productline = None, -// class = Some(value = "H "), -// style = Some(value = "W "), -// productsubcategoryid = Some(value = ProductsubcategoryId(value = 691)), -// productmodelid = Some(value = ProductmodelId(value = 691)), -// sellstartdate = TypoLocalDateTime(value = 2025-12-06T09:48:20.656753), -// sellenddate = None, -// discontinueddate = Some( -// value = TypoLocalDateTime(value = 2034-05-19T00:51:56) -// ), -// rowguid = TypoUUID(value = 511ab1d4-d280-11f0-89a7-0242c0a8d002), -// modifieddate = TypoLocalDateTime(value = 2025-12-06T09:48:20.342898) -// ) -``` - - - - -Java doesn't have default parameters, so TestInsert uses the **Inserter pattern** - a fluent builder that lets you customize rows before 
inserting: - -```java -import adventureworks.TestInsert; -import adventureworks.DomainInsertImpl; -import adventureworks.production.unitmeasure.UnitmeasureId; -import adventureworks.public_.Name; -import java.util.Random; - -var testInsert = new TestInsert(new Random(0), new DomainInsertImpl()); - -// Simple insert - just call insert(connection) -var productCategory = testInsert.productionProductcategory().insert(c); - -// Insert with required parameters -var unitmeasure = testInsert.productionUnitmeasure(new UnitmeasureId("kgg")).insert(c); - -// Customize with .with() before inserting -var productModel = testInsert.productionProductmodel() - .with(row -> row - .withCatalogdescription(Optional.of(new Xml(""))) - .withInstructions(Optional.of(new Xml("")))) - .insert(c); - -// Complex example with foreign keys -var productSubcategory = testInsert - .productionProductsubcategory(productCategory.productcategoryid()) - .insert(c); - -var product = testInsert.productionProduct( - (short) 1, // safetystocklevel - (short) 1, // reorderpoint - BigDecimal.ONE, // standardcost - BigDecimal.ONE, // listprice - 10, // daystomanufacture - LocalDateTime.now()) // sellstartdate - .with(row -> row - .withSizeunitmeasurecode(Optional.of(unitmeasure.unitmeasurecode())) - .withWeightunitmeasurecode(Optional.of(unitmeasure.unitmeasurecode())) - .withClass(Optional.of("H ")) - .withStyle(Optional.of("W ")) - .withProductsubcategoryid(Optional.of(productSubcategory.productsubcategoryid())) - .withProductmodelid(Optional.of(productModel.productmodelid()))) - .insert(c); -``` - -The `Inserter` interface provides: -- `insert(Connection c)` - Execute the insert and return the saved row -- `with(UnaryOperator transformer)` - Customize the unsaved row before inserting - - - - -### Comparison with scalacheck - -This does look a lot like scalacheck indeed. 
- -But look closer, there are: -- no implicits -- no integration glue code with test libraries -- almost no imports, you need to mention very few types -- no keeping track of all the possible row types and repositories -- and so on - -This feature is meant to be easy to use, and I really think/hope it is! diff --git a/site/docs-db/other-features/testing-with-stubs.md b/site/docs-db/other-features/testing-with-stubs.md deleted file mode 100644 index 7c5de6ce74..0000000000 --- a/site/docs-db/other-features/testing-with-stubs.md +++ /dev/null @@ -1,121 +0,0 @@ ---- -title: Testing with stubs ---- - -It can be incredibly tiring to write tests for the database layer. - -Often you want to split your code into pure/effectful code and just test the pure parts, -but sometimes you want to observe mutations in the database as well. - -Sometimes spinning up a real database for this is the right answer, sometimes it's not. -It is always slow, however, so it's way easier to get a fast test suite if you're not doing it. - -The argument for the approach taken by Typo is that since the interaction between Scala -and PostgreSQL is guaranteed to be correct*, it is less important to back your tests with a real database. - -This leads us to stubs (called mocks in the generated code), implementations of the repository -interfaces backed by a mutable `Map`. This can be generated for all tables with a primary key. - -## DSL - -Notably, these mocks work with the [dsl](../what-is/dsl.md), which lets you describe semi-complex joins, updates, where predicates, -string operations and so on in your code, and test it in-memory! - -### *Note - -Typo guarantees schema correctness, but you can still break constraints. -Or your tests may need more advanced PostgreSQL functionality. - -Stubs are obviously not a full replacement, but if they can be used for some non-zero percentage -of your tests it's still very beneficial! 
- -## An example of a generated `RepoMock`: - -```scala -import adventureworks.person.address.* -import typr.dsl.* -import typr.dsl.DeleteBuilder.DeleteBuilderMock -import typr.dsl.UpdateBuilder.UpdateBuilderMock -import java.sql.Connection -import scala.annotation.nowarn - -class AddressRepoMock(toRow: Function1[AddressRowUnsaved, AddressRow], - map: scala.collection.mutable.Map[AddressId, AddressRow] = scala.collection.mutable.Map.empty) extends AddressRepo { - override def delete: DeleteBuilder[AddressFields, AddressRow] = { - DeleteBuilderMock(DeleteParams.empty, AddressFields.structure, map) - } - override def deleteById(addressid: AddressId)(implicit c: Connection): Boolean = { - map.remove(addressid).isDefined - } - override def deleteByIds(addressids: Array[AddressId])(implicit c: Connection): Int = { - addressids.map(id => map.remove(id)).count(_.isDefined) - } - override def insert(unsaved: AddressRow)(implicit c: Connection): AddressRow = { - val _ = if (map.contains(unsaved.addressid)) - sys.error(s"id ${unsaved.addressid} already exists") - else - map.put(unsaved.addressid, unsaved) - - unsaved - } - override def insert(unsaved: AddressRowUnsaved)(implicit c: Connection): AddressRow = { - insert(toRow(unsaved)) - } - override def insertStreaming(unsaved: Iterator[AddressRow], batchSize: Int = 10000)(implicit c: Connection): Long = { - unsaved.foreach { row => - map += (row.addressid -> row) - } - unsaved.size.toLong - } - /* NOTE: this functionality requires PostgreSQL 16 or later! 
*/ - override def insertUnsavedStreaming(unsaved: Iterator[AddressRowUnsaved], batchSize: Int = 10000)(implicit c: Connection): Long = { - unsaved.foreach { unsavedRow => - val row = toRow(unsavedRow) - map += (row.addressid -> row) - } - unsaved.size.toLong - } - override def select: SelectBuilder[AddressFields, AddressRow] = { - SelectBuilderMock(AddressFields.structure, () => map.values.toList, SelectParams.empty) - } - override def selectAll(implicit c: Connection): List[AddressRow] = { - map.values.toList - } - override def selectById(addressid: AddressId)(implicit c: Connection): Option[AddressRow] = { - map.get(addressid) - } - override def selectByIds(addressids: Array[AddressId])(implicit c: Connection): List[AddressRow] = { - addressids.flatMap(map.get).toList - } - override def selectByIdsTracked(addressids: Array[AddressId])(implicit c: Connection): Map[AddressId, AddressRow] = { - val byId = selectByIds(addressids).view.map(x => (x.addressid, x)).toMap - addressids.view.flatMap(id => byId.get(id).map(x => (id, x))).toMap - } - override def update: UpdateBuilder[AddressFields, AddressRow] = { - UpdateBuilderMock(UpdateParams.empty, AddressFields.structure, map) - } - override def update(row: AddressRow)(implicit c: Connection): Option[AddressRow] = { - map.get(row.addressid).map { _ => - map.put(row.addressid, row): @nowarn - row - } - } - override def upsert(unsaved: AddressRow)(implicit c: Connection): AddressRow = { - map.put(unsaved.addressid, unsaved): @nowarn - unsaved - } - override def upsertBatch(unsaved: Iterable[AddressRow])(implicit c: Connection): List[AddressRow] = { - unsaved.map { row => - map += (row.addressid -> row) - row - }.toList - } - override def upsertStreaming(unsaved: Iterator[AddressRow], batchSize: Int = 10000)(implicit c: Connection): Int = { - unsaved.foreach { row => - map += (row.addressid -> row) - } - unsaved.size - } -} -``` - diff --git a/site/docs-db/patterns/dynamic-queries.md 
b/site/docs-db/patterns/dynamic-queries.md deleted file mode 100644 index 050195e54e..0000000000 --- a/site/docs-db/patterns/dynamic-queries.md +++ /dev/null @@ -1,31 +0,0 @@ ---- -title: "Patterns: Dynamic queries" ---- - -If you use [sql files](../what-is/sql-is-king.md), there is a very high chance you'll want some queries to -be a bit dynamic. The way forward here is to move the dynamism into the sql itself. - -A frequently used pattern is a query with an optional filter that selects all rows by default. This can be achieved using an IS NULL construct. -Here is an example of a query with an optional name filter: - -```sql -SELECT p.title, p.firstname, p.middlename, p.lastname -FROM person.person p -WHERE :"first_name?" = p.firstname OR :first_name IS NULL -``` - -It will generate this repo: -```scala -import adventureworks.person_dynamic.PersonDynamicSqlRow -import java.sql.Connection - -trait PersonDynamicSqlRepo { - def apply(firstName: Option[String])(implicit c: Connection): List[PersonDynamicSqlRow] -} -``` - -Note that the sql query needs an explicit cast to figure out the type of the `first_name` parameter since it's compared with `NULL`. - -## What can be dynamic? -You can only use this mechanism for things which are templated into SQL as parameters. -It's not possible to use it to decide keywords, column names and so on, unfortunately. diff --git a/site/docs-db/patterns/multi-repo.md b/site/docs-db/patterns/multi-repo.md deleted file mode 100644 index bc47200908..0000000000 --- a/site/docs-db/patterns/multi-repo.md +++ /dev/null @@ -1,289 +0,0 @@ ---- -title: "Patterns: The multi-repo" ---- - -There have been some comments about how [the generated repositories](../what-is/relations.md) do not match with people's preferences of what a repository should be. -For instance you may prefer that your repositories coordinate multiple tables. - -And that's more than fair - often you need to coordinate multiple tables in a transaction. 
-The only snag is that Typo does not have the knowledge to write that code for you. - -### So you write code yourself - -Enter the multi-repo pattern! - -Here you take low-level Typo repositories as parameters, and you write the higher-level flow yourself. - -You still get huge benefits from using Typo in this case: - -- All of this is typesafe -- You get perfect auto-complete from your IDE -- Strongly typed [Id types](../type-safety/id-types.md) and [type flow](../type-safety/type-flow.md) ensure that you have to follow foreign keys correctly -- It's fairly readable. -- It's testable! You can even wire in [stub repositories](../other-features/testing-with-stubs.md) and test it all without a running database. - -Just have a look at the example and think how long it would take you to write this without Typo. - -With Typo, this example worked *the first time it was run*. - -### Example - -The example repo below exposes one method, which coordinates updates to four tables. - -The details of what is done are probably not too important, but I tried to comment it anyway. - - -```scala -import adventureworks.person.address.* -import adventureworks.person.addresstype.* -import adventureworks.person.businessentityaddress.* -import adventureworks.person.countryregion.CountryregionId -import adventureworks.person.person.* -import adventureworks.public.Name -import java.sql.Connection - -case class PersonWithAddresses(person: PersonRow, addresses: Map[Name, AddressRow]) - -class PersonWithAddressesRepo( - personRepo: PersonRepo, - businessentityAddressRepo: BusinessentityaddressRepo, - addresstypeRepo: AddresstypeRepo, - addressRepo: AddressRepo - ) { - - /* A person can have a bunch of addresses registered, - * and they each have an address type (BILLING, HOME, etc). 
- * - * This method syncs `PersonWithAddresses#addresses` to postgres, - * so that old attached addresses are removed, - * and the given addresses are attached with the chosen type - */ - def syncAddresses(pa: PersonWithAddresses)(implicit c: Connection): List[BusinessentityaddressRow] = { - // update person - personRepo.update(pa.person) - // update stored addresses - pa.addresses.toList.foreach { case (_, address) => addressRepo.update(address) } - - // addresses are stored in `PersonWithAddress` by a `Name` which means what type of address it is. - // this address type is stored in addresstypeRepo. - // In order for foreign keys to align, we need to translate from names to ids, and create rows as necessary - val oldStoredAddressTypes: Map[Name, AddresstypeId] = - addresstypeRepo.select - .where(r => r.name in pa.addresses.keys.toArray) - .toList - .map(x => (x.name, x.addresstypeid)) - .toMap - - val currentAddressesByType: Map[AddresstypeId, AddressRow] = - pa.addresses.map { case (addressTypeName, wanted) => - oldStoredAddressTypes.get(addressTypeName) match { - case Some(addresstypeId) => (addresstypeId, wanted) - case None => - val inserted = addresstypeRepo.insert(AddresstypeRowUnsaved(name = addressTypeName)) - (inserted.addresstypeid, wanted) - } - } - - // discover existing addresses attached to person - val oldAttachedAddresses: Map[(AddressId, AddresstypeId), BusinessentityaddressRow] = - businessentityAddressRepo.select - .where(x => x.businessentityid === pa.person.businessentityid) - .toList - .map(x => ((x.addressid, x.addresstypeid), x)) - .toMap - - // unattach old attached addresses - oldAttachedAddresses.foreach { case (_, ba) => - currentAddressesByType.get(ba.addresstypeid) match { - case Some(address) if address.addressid == ba.addressid => - case _ => - businessentityAddressRepo.deleteById(ba.compositeId) - } - } - // attach new addresses - currentAddressesByType.map { case (addresstypeId, address) => - 
oldAttachedAddresses.get((address.addressid, addresstypeId)) match { - case Some(bea) => bea - case None => - val newRow = BusinessentityaddressRowUnsaved(pa.person.businessentityid, address.addressid, addresstypeId) - businessentityAddressRepo.insert(newRow) - } - }.toList - } -} -``` - - -Here is example usage: - -Note that we can easily create a deep dependency graph with random data due to [testInsert](../other-features/testing-with-random-values.md). - -import Tabs from '@theme/Tabs'; -import TabItem from '@theme/TabItem'; - - - - -```scala -import adventureworks.{TestInsert, TestDomainInsert, withConnection} -import adventureworks.userdefined.FirstName -import scala.util.Random -import adventureworks.public.* - -import scala.util.Random - -object DomainInsert extends TestDomainInsert { - override def publicAccountNumber(random: Random): AccountNumber = AccountNumber(random.nextString(10)) - override def publicFlag(random: Random): Flag = Flag(random.nextBoolean()) - override def publicMydomain(random: Random): Mydomain = Mydomain(random.nextString(10)) - override def publicName(random: Random): Name = Name(random.nextString(10)) - override def publicNameStyle(random: Random): NameStyle = NameStyle(random.nextBoolean()) - override def publicPhone(random: Random): Phone = Phone(random.nextString(10)) - override def publicShortText(random: Random): ShortText = ShortText(random.nextString(10)) - override def publicOrderNumber(random: Random): OrderNumber = OrderNumber(random.nextString(10)) -} - -// set a fixed seed to get consistent values -val testInsert = new TestInsert(new Random(1), DomainInsert) - -val businessentityRow = testInsert.personBusinessentity() -val personRow = testInsert.personPerson(businessentityRow.businessentityid, persontype = "SC", FirstName("name")) -val countryregionRow = testInsert.personCountryregion(CountryregionId("NOR")) -val salesterritoryRow = testInsert.salesSalesterritory(countryregionRow.countryregioncode) -val stateprovinceRow 
= testInsert.personStateprovince(countryregionRow.countryregioncode, salesterritoryRow.territoryid)
-val addressRow1 = testInsert.personAddress(stateprovinceRow.stateprovinceid)
-val addressRow2 = testInsert.personAddress(stateprovinceRow.stateprovinceid)
-val addressRow3 = testInsert.personAddress(stateprovinceRow.stateprovinceid)
-
-val repo = new PersonWithAddressesRepo(
-personRepo = new PersonRepoImpl,
-businessentityAddressRepo = new BusinessentityaddressRepoImpl,
-addresstypeRepo = new AddresstypeRepoImpl,
-addressRepo = new AddressRepoImpl
-)
-```
-
-
-
-
-```java
-import adventureworks.TestInsert;
-import adventureworks.DomainInsertImpl;
-import adventureworks.userdefined.FirstName;
-import adventureworks.public_.Name;
-import adventureworks.person.countryregion.CountryregionId;
-import java.util.Random;
-
-// set a fixed seed to get consistent values
-var testInsert = new TestInsert(new Random(1), new DomainInsertImpl());
-
-// Java uses the Inserter pattern: method().with(customizer).insert(c)
-var businessentityRow = testInsert.personBusinessentity().insert(c);
-var personRow = testInsert.personPerson(businessentityRow.businessentityid(), "SC", new FirstName("name"))
- .with(row -> row.withLastname(new Name("lastname")))
- .insert(c);
-var countryregionRow = testInsert.personCountryregion()
- .with(row -> row.withCountryregioncode(new CountryregionId("NOR")).withName(new Name("Norway")))
- .insert(c);
-var salesterritoryRow = testInsert.salesSalesterritory(countryregionRow.countryregioncode())
- .with(row -> row.withName(new Name("Territory")).withGroup("Europe"))
- .insert(c);
-var stateprovinceRow = testInsert.personStateprovince(countryregionRow.countryregioncode(), salesterritoryRow.territoryid())
- .with(row -> row.withStateprovincecode("OSL").withName(new Name("Oslo")))
- .insert(c);
-var addressRow1 = testInsert.personAddress(stateprovinceRow.stateprovinceid())
- .with(row -> row.withAddressline1("Street 1").withCity("Oslo").withPostalcode("0001"))
- .insert(c);
-var addressRow2 = testInsert.personAddress(stateprovinceRow.stateprovinceid())
- .with(row -> row.withAddressline1("Street 2").withCity("Oslo").withPostalcode("0002"))
- .insert(c);
-var addressRow3 = testInsert.personAddress(stateprovinceRow.stateprovinceid())
- .with(row -> row.withAddressline1("Street 3").withCity("Oslo").withPostalcode("0003"))
- .insert(c);
-
-var repo = new PersonWithAddressesRepo(
- new PersonRepoImpl(),
- new BusinessentityaddressRepoImpl(),
- new AddresstypeRepoImpl(),
- new AddressRepoImpl()
-);
-```
-
-
-
-
-```scala
-repo.syncAddresses(PersonWithAddresses(personRow, Map(Name("HOME") -> addressRow1, Name("OFFICE") -> addressRow2)))
-// res1: List[BusinessentityaddressRow] = List(
-// BusinessentityaddressRow(
-// businessentityid = BusinessentityId(value = 3075),
-// addressid = AddressId(value = 1236),
-// addresstypeid = AddresstypeId(value = 1224),
-// rowguid = TypoUUID(value = 523f33e6-d280-11f0-86e1-0242c0a8d002),
-// modifieddate = TypoLocalDateTime(value = 2025-12-06T09:48:22.237789)
-// ),
-// BusinessentityaddressRow(
-// businessentityid = BusinessentityId(value = 3075),
-// addressid = AddressId(value = 1237),
-// addresstypeid = AddresstypeId(value = 1225),
-// rowguid = TypoUUID(value = 52419a6e-d280-11f0-86e1-0242c0a8d002),
-// modifieddate = TypoLocalDateTime(value = 2025-12-06T09:48:22.237789)
-// )
-// )
-
-// check that it's idempotent
-repo.syncAddresses(PersonWithAddresses(personRow, Map(Name("HOME") -> addressRow1, Name("OFFICE") -> addressRow2)))
-// res2: List[BusinessentityaddressRow] = List(
-// BusinessentityaddressRow(
-// businessentityid = BusinessentityId(value = 3075),
-// addressid = AddressId(value = 1236),
-// addresstypeid = AddresstypeId(value = 1224),
-// rowguid = TypoUUID(value = 523f33e6-d280-11f0-86e1-0242c0a8d002),
-// modifieddate = TypoLocalDateTime(value = 2025-12-06T09:48:22.237789)
-// ),
-// BusinessentityaddressRow(
-// businessentityid = BusinessentityId(value = 3075),
-// addressid = AddressId(value = 1237),
-// addresstypeid = AddresstypeId(value = 1225),
-// rowguid = TypoUUID(value = 52419a6e-d280-11f0-86e1-0242c0a8d002),
-// modifieddate = TypoLocalDateTime(value = 2025-12-06T09:48:22.237789)
-// )
-// )
-
-// remove one
-repo.syncAddresses(PersonWithAddresses(personRow, Map(Name("HOME") -> addressRow1)))
-// res3: List[BusinessentityaddressRow] = List(
-// BusinessentityaddressRow(
-// businessentityid = BusinessentityId(value = 3075),
-// addressid = AddressId(value = 1236),
-// addresstypeid = AddresstypeId(value = 1224),
-// rowguid = TypoUUID(value = 523f33e6-d280-11f0-86e1-0242c0a8d002),
-// modifieddate = TypoLocalDateTime(value = 2025-12-06T09:48:22.237789)
-// )
-// )
-
-// add one
-repo.syncAddresses(PersonWithAddresses(personRow, Map(Name("HOME") -> addressRow1, Name("VACATION") -> addressRow3)))
-// res4: List[BusinessentityaddressRow] = List(
-// BusinessentityaddressRow(
-// businessentityid = BusinessentityId(value = 3075),
-// addressid = AddressId(value = 1236),
-// addresstypeid = AddresstypeId(value = 1224),
-// rowguid = TypoUUID(value = 523f33e6-d280-11f0-86e1-0242c0a8d002),
-// modifieddate = TypoLocalDateTime(value = 2025-12-06T09:48:22.237789)
-// ),
-// BusinessentityaddressRow(
-// businessentityid = BusinessentityId(value = 3075),
-// addressid = AddressId(value = 1238),
-// addresstypeid = AddresstypeId(value = 1226),
-// rowguid = TypoUUID(value = 524aafb4-d280-11f0-86e1-0242c0a8d002),
-// modifieddate = TypoLocalDateTime(value = 2025-12-06T09:48:22.237789)
-// )
-// )
-```
-
-
-## Isn't this a service at this point?
-
-Maybe! You likely shouldn't use the generated `Row` types at the service level, and there should probably be a transaction boundary.
-You get to decide that, however.
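The idempotent add/remove behaviour shown above boils down to a set difference between the desired address map and the links already stored. A self-contained sketch of that reconciliation step, with invented names (`Link`, `Changes`, `reconcile`) rather than the generated API:

```java
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

// Illustrative sketch (not the generated API) of the reconciliation behind a
// syncAddresses-style method: compare the desired addressType -> addressId map
// with what is stored, and derive the links to delete and the links to insert.
class AddressSync {
    record Link(String addressType, int addressId) {}

    record Changes(Set<Link> toDelete, Set<Link> toInsert) {}

    static Changes reconcile(Map<String, Integer> stored, Map<String, Integer> desired) {
        Set<Link> storedLinks = toLinks(stored);
        Set<Link> desiredLinks = toLinks(desired);
        // stored but no longer desired -> delete; desired but not stored -> insert
        Set<Link> toDelete = storedLinks.stream().filter(l -> !desiredLinks.contains(l)).collect(Collectors.toSet());
        Set<Link> toInsert = desiredLinks.stream().filter(l -> !storedLinks.contains(l)).collect(Collectors.toSet());
        return new Changes(toDelete, toInsert);
    }

    private static Set<Link> toLinks(Map<String, Integer> m) {
        return m.entrySet().stream()
                .map(e -> new Link(e.getKey(), e.getValue()))
                .collect(Collectors.toSet());
    }
}
```

Running `reconcile` twice with the same desired state yields empty change sets, which is what makes a `syncAddresses`-style call idempotent (compare `res1` and `res2` above).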
diff --git a/site/docs-db/readme.md b/site/docs-db/readme.md
deleted file mode 100644
index 1b087142ae..0000000000
--- a/site/docs-db/readme.md
+++ /dev/null
@@ -1,62 +0,0 @@
----
-title: Introduction to Typo
----
-
-Typo is not just another source code generator; it's your trusted partner in database development. By harnessing the
-power of PostgreSQL schema definitions and your SQL code, Typo creates a seamless bridge between your database and your
-Scala code, all while putting type-safety and developer experience (DX) front and center.
-
-## The Motivation Behind Typo
-
-### Building Safer Systems
-
-In the world of software development, we rely on the compiler to catch errors and ensure the correctness of our code.
-But what happens when we venture into the unpredictable realm of external data sources, like databases?
-
-Typo's core motivation is to bring contract-driven development to the database layer. Just as generating code from
-OpenAPI definitions ensures the correctness of your HTTP layer, Typo aims to deliver the same level of safety for
-database interactions. It achieves this by generating precise and correct code for your tables, views, and queries, all
-guided by PostgreSQL metadata tables.
-
-### Revolutionizing the SQL to JVM Workflow
-
-The conventional workflow for SQL-to-JVM interaction often feels like a labyrinth of manual tasks and repetitive boilerplate:
-
-1. You write SQL queries.
-2. IDEs may struggle to give you proper support while writing, especially if you interpolate and concatenate SQL strings.
-3. Manual mapping of column names or indices to case class field names.
-4. Manual mapping of column names or indices to case class field types.
-5. String interpolation and type mapping may trigger cryptic errors about missing typeclass instances.
-6. The compiler cannot check the mappings, forcing you into writing tests.
-7. Writing and maintaining tests is tedious, and the tests are slow to run.
-
-But here's the kicker: whenever you refactor your code, you find yourself revisiting all of these points.
-
-Typo changes the game.
-It streamlines steps 2-7, liberates you from boilerplate, and lets you focus on what truly matters:
-building robust and maintainable database applications.
-
-### Example video
-As an example of how Typo frees you from these steps, consider this video where you write your SQL in an `.sql` file,
-and you see Typo regenerating correct mapping code for it on save.
-Much less testing is needed, because the name and type mapping will be correct, and the SQL valid.
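To make steps 3-4 above concrete, here is roughly the hand-written mapping that generated row types replace. The row is modelled as a `Map<String, Object>` instead of a live `java.sql.ResultSet` so the sketch is self-contained; the `Address` shape and column names are invented for illustration:

```java
import java.util.Map;
import java.util.Optional;

// Hand-rolled row mapping: every column name and type is maintained by hand,
// and the compiler cannot verify any of it against the actual query.
class ManualMapping {
    record Address(int addressId, String city, Optional<String> postalCode) {}

    static Address fromRow(Map<String, Object> row) {
        return new Address(
            (Integer) row.get("addressid"),                      // name and type kept in sync manually
            (String) row.get("city"),
            Optional.ofNullable((String) row.get("postalcode"))  // nullability is on you
        );
    }
}
```

Generating this mapping from database metadata means a renamed column or changed type shows up as a code-generation or compile error rather than a runtime failure.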