A Ruby gem for incremental parsing of partial and incomplete JSON streams. It is built for streaming output from LLM providers such as OpenAI and Anthropic, and processes each new chunk in O(n) time by maintaining parser state between calls. Use `.parse` for parsed Ruby values and `.complete` when you specifically need completed JSON text.
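For context on why incremental completion is needed: Ruby's standard library parser rejects partial documents outright, so a streaming consumer cannot inspect in-flight state. A minimal illustration using only the `json` stdlib:

```ruby
require 'json'

partial = '{"name": "John", "age":'

begin
  JSON.parse(partial)
rescue JSON::ParserError
  # The stdlib parser raises on any incomplete document,
  # leaving nothing usable until the stream finishes.
  puts 'partial input rejected'
end
```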
Add this line to your application's Gemfile:
```ruby
gem 'json_completer'
```

And then execute:

```shell
bundle install
```

Or install it yourself as:

```shell
gem install json_completer
```

Use `.parse` when you want the current parsed Ruby value directly from a partial stream:
```ruby
require 'json_completer'

# Parse partial JSON into Ruby objects
JsonCompleter.parse('{"name": "John", "age":')
# => {"name" => "John", "age" => nil}

# Handle incomplete strings
JsonCompleter.parse('{"message": "Hello wo')
# => {"message" => "Hello wo"}

# Close unclosed structures
JsonCompleter.parse('[1, 2, {"key": "value"')
# => [1, 2, {"key" => "value"}]
```

For streaming scenarios where JSON arrives in chunks, use a stateful instance. Each call processes only the new data (O(n) in the size of the new chunk) by maintaining parser state:
```ruby
completer = JsonCompleter.new

# Process first chunk
result1 = completer.parse('{"users": [{"name": "')
# => {"users" => [{"name" => ""}]}

# Process additional data
result2 = completer.parse('{"users": [{"name": "Alice"}')
# => {"users" => [{"name" => "Alice"}]}

# Final parsed value
result3 = completer.parse('{"users": [{"name": "Alice"}, {"name": "Bob"}]}')
# => {"users" => [{"name" => "Alice"}, {"name" => "Bob"}]}
```

Use `.complete` when you specifically need completed JSON text instead of parsed Ruby objects:
```ruby
JsonCompleter.complete('{"name": "John", "age":')
# => '{"name": "John", "age": null}'

JsonCompleter.complete('[1, 2, {"key": "value"')
# => '[1, 2, {"key": "value"}]'
```

Reach for `.complete` when another layer expects JSON text and you want json_completer to materialize the current partial state as valid JSON.
- Zero reprocessing: Maintains parsing state to avoid reparsing previously processed data
- Linear complexity: Each chunk processed in O(n) time where n = new data size, not total size
- Memory efficient: Uses token-based accumulation with minimal state overhead
- Context preservation: Tracks nested structures without full document analysis
- LLM streaming output: Parse partial JSON emitted token-by-token from providers such as OpenAI and Anthropic
- Incremental structured output parsing: Keep a live Ruby object while more JSON arrives
- JSON text completion: Produce valid JSON text snapshots for downstream consumers that require a string
- Fork the repository
- Create your feature branch (`git checkout -b my-new-feature`)
- Make your changes and add tests
- Run the test suite (`bundle exec rspec`)
- Commit your changes (`git commit -am 'Add some feature'`)
- Push to the branch (`git push origin my-new-feature`)
- Create a new Pull Request
This gem is available as open source under the terms of the MIT License.