I'm having a tough time figuring this out, and there is no documentation on it. I am saving my own custom logs to BigQuery through Cloudflare. For some reason, Logflare is changing the schema of the `attributes` field from RECORD NULLABLE to RECORD REPEATED (i.e. from an object to an array of objects). Since I have already set the schema correctly, BigQuery rejects the request with an error saying the schema is not compatible:
Provided Schema does not match Table lamatic:lamatic_logflare_US.d8e395a9_e423_4b6c_86d1_1ed5b92f7052. Field attributes has changed mode from NULLABLE to REPEATED
curl --location 'https://api.logflare.app/logs?source=xxx' \
--header 'Content-Type: application/json; charset=utf-8' \
--header 'X-API-KEY: xxx' \
--data '{
  "event_message": "demo123",
  "message": "This is the main event message",
  "metadata": {"some": "log event"},
  "timestamp": "2025-07-29T05:23:58.582Z",
  "attributes": {
    "name": "demo",
    "number": "hellssd"
  }
}'
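To make the mismatch concrete, here is a minimal sketch of the two shapes involved. The payload above sends `attributes` as a single JSON object, which maps to a NULLABLE RECORD in BigQuery; judging from the error, Logflare appears to rewrite it as an array of objects, which maps to a REPEATED RECORD (the rewrite shown here is an assumption inferred from the error message, not confirmed Logflare behavior):

```python
# Payload as sent: "attributes" is a single object -> RECORD NULLABLE
sent = {
    "event_message": "demo123",
    "attributes": {"name": "demo", "number": "hellssd"},
}

# What BigQuery seems to receive after Logflare's rewrite (assumed):
# "attributes" wrapped in an array -> RECORD REPEATED
received = {
    "event_message": "demo123",
    "attributes": [{"name": "demo", "number": "hellssd"}],
}

print(type(sent["attributes"]).__name__)      # dict -> NULLABLE
print(type(received["attributes"]).__name__)  # list -> REPEATED
```

A NULLABLE RECORD column cannot accept array values, which is why BigQuery refuses the insert rather than coercing one shape into the other.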
How can I enforce the object structure? I have tried locking the schema, but that just stops the request from being sent to BigQuery altogether.