1️⃣ Description of the idea
Introduce new methods that support streaming JSON parsing instead of reading the entire JSON document into memory first. A streaming parser would process JSON objects or arrays incrementally, converting them to DAO (Data Access Object) instances one at a time. The current methods, %FromJSON(jsonStr), %FromJSON(stream), and %FromJSONFile(filename), load the entire JSON content into memory before converting it to a DAO object. Adding streaming counterparts would make handling large JSON datasets far more efficient.
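For context, here is a minimal sketch of the pattern the existing API requires today, using the documented %FromJSON and %GetIterator calls (the file path and processing step are illustrative):

    Set stream = ##class(%Stream.FileCharacter).%New()
    Do stream.LinkToFile("/data/patients.json")
    // The entire file is buffered and parsed before the first
    // element becomes available.
    Set array = ##class(%DynamicAbstractObject).%FromJSON(stream)
    Set iter = array.%GetIterator()
    While iter.%GetNext(.key, .record) {
        // each record can only be processed after the full parse completes
    }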
2️⃣ Who is the target audience?
Developers who work with large JSON datasets and need to read, parse, and store JSON in the database.
3️⃣ What problem does it solve?
The current methods (%FromJSON, %FromJSONFile) load the entire JSON content into memory before parsing. For large JSON files this is time consuming and increases latency, because no data can be processed until the whole document has been buffered.
Streaming parsing solves these issues by allowing data to be processed as soon as it arrives, supporting both real-time and bulk JSON processing.
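As a purely hypothetical sketch of what such an API could look like (neither %FromJSONStreaming nor %GetNextElement exists today; both names are placeholders for the proposed behavior):

    Set stream = ##class(%Stream.FileCharacter).%New()
    Do stream.LinkToFile("/data/patients.json")
    // Hypothetical: returns an iterator that parses one top-level
    // array element at a time instead of the whole document.
    Set iter = ##class(%DynamicAbstractObject).%FromJSONStreaming(stream)
    While iter.%GetNextElement(.record) {
        // record is available as soon as its bytes have been read,
        // so memory use stays bounded regardless of file size.
    }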
4️⃣ How does this impact the efficiency, stability, reliability, etc., of the product?
Enables efficient processing of arbitrarily large JSON files or payloads: because elements are parsed one at a time, memory use stays bounded by the size of a single element rather than the whole document, improving throughput and stability when handling large JSON.
5️⃣ Provide a specific use case or scenario that illustrates how this idea could be used in practice
Importing millions of patient records from a large JSON file (e.g., 10GB) exported from an external system typically results in long delays before processing begins. With streaming JSON parsing, processing can start almost immediately, allowing the system to handle massive datasets with minimal resource overhead, as sketched below.
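A hypothetical end-to-end import built on the streaming sketch above; MyApp.Patient and its SaveFromDynamic() helper are illustrative application code, not an existing API:

    Set stream = ##class(%Stream.FileCharacter).%New()
    Do stream.LinkToFile("/data/patients-export.json")  // e.g. a 10GB export
    Set iter = ##class(%DynamicAbstractObject).%FromJSONStreaming(stream)  // hypothetical
    Set count = 0
    While iter.%GetNextElement(.record) {
        // Persist each patient record as soon as it is parsed.
        Do ##class(MyApp.Patient).SaveFromDynamic(record)
        Set count = count + 1
        If count # 100000 = 0 Write "Imported ", count, " records", !
    }
    Write "Done: ", count, " records imported", !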
Thank you for submitting the idea. The status has been changed to "Future consideration".
Stay tuned!