Efficient JSON processing is important in modern applications. When the JSON payload is large, we need to handle it carefully to avoid heap out-of-memory errors and heavy resource consumption.
Gson is one of the best libraries available in Java for processing JSON data.
A Gson instance is thread safe, so the same object can be reused to process multiple requests.
It does not require any annotations for serialization and deserialization unless there is a special need.
It provides toJson() and fromJson() methods for serialization and deserialization, and it supports reading content through stream readers.
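As a quick illustration of the two methods, here is a minimal sketch (the Person class and its field names are assumptions for this example, not part of the article's data model):

```java
import com.google.gson.Gson;

public class GsonBasics {
    // Simple POJO for demonstration; Gson maps JSON keys to field names
    static class Person {
        int id;
        String name;
        Person(int id, String name) { this.id = id; this.name = name; }
    }

    public static void main(String[] args) {
        Gson gson = new Gson(); // thread safe; reuse the same instance

        // toJson(): serialize a Java object to a JSON string
        String json = gson.toJson(new Person(1, "Emily"));
        System.out.println(json); // {"id":1,"name":"Emily"}

        // fromJson(): deserialize a JSON string back into a Java object
        Person p = gson.fromJson(json, Person.class);
        System.out.println(p.name); // Emily
    }
}
```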
Below is a code snippet that reads a file using a Java stream reader and converts it to Java objects, applying business logic for every 100 records.
Assumption: the JSON file contains an array of data like the one below.
[
  {
    "id": 1,
    "firstName": "Emily",
    "lastName": "Johnson"
  }
]
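The records above can be mapped to a plain User class like the following (a minimal sketch; the field names are assumed to match the JSON keys, and Gson populates them by name via reflection):

```java
// Minimal POJO matching the sample JSON above.
// Gson fills the fields by name, so no annotations are needed.
public class User {
    private int id;
    private String firstName;
    private String lastName;

    public int getId() { return id; }
    public String getFirstName() { return firstName; }
    public String getLastName() { return lastName; }
}
```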
Java Code
private Optional<Boolean> readFile(String path) {
    try (InputStream inputStream = Files.newInputStream(Path.of(path));
         JsonReader reader = new JsonReader(new InputStreamReader(inputStream, StandardCharsets.UTF_8))) {
        Instant start = Instant.now();
        reader.beginArray();
        List<User> users = new ArrayList<>();
        while (reader.hasNext()) {
            users.add(gson.fromJson(reader, User.class));
            if (users.size() >= 100) {
                // Initiate your records processing here.
                // You can consider processing the batch asynchronously,
                // with failure scenarios handled properly.
                users.clear();
            }
        }
        reader.endArray();
        if (!users.isEmpty()) {
            // Process the leftover records (fewer than 100) after the loop.
            users.clear();
        }
        LOGGER.info("Total time taken to process file: {}, duration: {} ms", path, Duration.between(start, Instant.now()).toMillis());
    } catch (Exception e) {
        LOGGER.error("Failed to process file: {}", path, e);
        return Optional.of(false);
    }
    return Optional.of(true);
}
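For the asynchronous batch processing mentioned in the comments above, a minimal sketch using an ExecutorService might look like this (the BatchProcessor class, its names, and the batch-handling logic are assumptions for illustration, not part of the snippet above):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.atomic.AtomicInteger;

public class BatchProcessor {
    private final ExecutorService executor = Executors.newFixedThreadPool(4);
    private final List<Future<?>> pending = new ArrayList<>();
    private final AtomicInteger processed = new AtomicInteger();

    // Submit an immutable copy of the batch so the caller can safely clear its list
    public void submitBatch(List<String> batch) {
        List<String> copy = List.copyOf(batch);
        pending.add(executor.submit(() -> {
            // Replace with your business logic; wrap it with retries or
            // dead-letter handling to cover failure scenarios.
            copy.forEach(record -> processed.incrementAndGet());
        }));
    }

    // Wait for all submitted batches and surface any failures
    public void awaitCompletion() throws Exception {
        for (Future<?> f : pending) {
            f.get(); // throws ExecutionException if a batch failed
        }
        executor.shutdown();
    }

    public int processedCount() {
        return processed.get();
    }
}
```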
If you have any questions, please leave them in the comments section.