JSON Parser Speeds and Correctness

This article follows up on our recent release of a JSON parser and compares the speed and correctness of a few alternative Pascal-based JSON parser libraries. To examine these parsers we'll be running each of them through a series of tests. Below is a list of the parsers we'll be testing.
The JSON parser included with the Free Pascal Compiler and part of the FCL.
A new JSON parser described here and implemented in a single self-contained class available on GitHub.
A long standing JSON parser written by Leonid Koninin and hosted on SourceForge.
A class within a business ORM claiming to support JSON parsing and hosted on GitHub.
Test Sections

These four JSON parsers have been tested in a few key categories, broken down into two main sections.
The first section tests the speed of these JSON parsers over the course of many iterative tasks. The tasks include loading a large JSON document, querying every node within a document, editing JSON values, and building a JSON document dynamically from scratch.
The second section tests the correctness of the JSON parsers by putting them through the paces of several validation tests. These tests check whether the parsers reject malformed JSON, handle escape characters and Unicode within JSON strings, and work correctly with numbers in JSON.
Though these tests are in no way exhaustive, we feel they are fair and give a good estimate of the current state of each parser library.
Test Preparation

Before running these tests we will first briefly list a few statistics and notes related to each parser.
Library       Number of Files    Size on Disk (including comments)    Warnings or Hints
FPJson        4                  127 KB                               Both
JsonTools     1                  33 KB                                None
LkJSON        1                  67 KB                                Both
JsonStorage   2                  32 KB                                Warnings
The number of files and size on disk are relatively unimportant, but they might be useful in understanding the complexity of the underlying code of each library, and perhaps its oversimplification, over-engineering, or elegance. You may want to refer to the speed and correctness sections while considering these attributes.
And here are a few notes and thoughts about these libraries. Both FPJson and JsonTools do an excellent job of handling incorrect JSON. FPJson and LkJSON go the route of creating specialized classes for the differing node kinds, such as string, number, and bool. This can make programming more difficult if you want the flexibility to alter node kinds at runtime. LkJSON and JsonStorage lack enumerator support, and both struggle with handling malformed JSON and Unicode in general. JsonStorage was particularly bad in this regard.
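To illustrate the node-kind point with a small sketch: in fpjson each kind is a distinct class, so changing a value's kind at runtime means constructing a node of a different class and swapping it into the parent object. The "size" key below is invented for illustration, and this is an approximation of the pattern rather than a verbatim excerpt from our test code:

```pascal
program KindChange;

{$mode objfpc}{$H+}

uses
  fpjson, jsonparser;

var
  Obj: TJSONObject;
begin
  Obj := TJSONObject(GetJSON('{ "size": "large" }'));
  { "size" is parsed as a TJSONString; to turn it into a number we
    cannot simply assign a new value, we must replace the node with
    an instance of a different class }
  Obj.Elements['size'] := TJSONIntegerNumber.Create(42);
  WriteLn(Obj.AsJSON);
  Obj.Free;
end.
```

In a library where every node shares one class, the same change is a single property assignment.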
Speed Tests

The first speed test loads a 6.9 KB JSON document and verifies the value of at least one node nested within the document. Our methodology first loads the JSON into a string, then asks each library to parse the string exactly 100,000 times. We then examine the total time in seconds each library took to complete this task.

The next speed test loads the same document and requests an enumerator. We recursively enumerate each node in the document and repeat this 100,000 times. LkJSON and JsonStorage lack enumerator support and score a zero on this test. Although enumerators are convenient, they are by no means a critical feature of a JSON library; they are more of a "nice to have" feature. Also please note this test requires parsing the same document again, so keep that in mind while examining the speed in this test.

Our third test repeats the same test as above, but with a simple recursive "for...do" loop and a counter variable. All the libraries were able to handle this test.

Our fourth test parses a small 928 byte JSON document and makes a few edits by navigating the document and using properties or methods to change values. We examine the edits to ensure they were done correctly. We repeat this 100,000 times for each library and report back the speed.

Our final speed test creates a small JSON document on the fly using the methods of each library. We add several nodes of various kinds, nesting some of them, and read the final document back out as JSON text. Because there is no parsing in this test, we had to repeat it 1,000,000 times to get results in the range of at least a few seconds.
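As a rough sketch of the methodology for the first test, the timing loop looks something like the following, shown here with the FCL's GetJSON. The file name "test.json" is a placeholder, and our actual harness may differ in detail:

```pascal
program ParseBench;

{$mode objfpc}{$H+}

uses
  SysUtils, DateUtils, Classes, fpjson, jsonparser;

const
  Iterations = 100000;

var
  Strings: TStringList;
  Json: string;
  Data: TJSONData;
  Start: TDateTime;
  Seconds: Double;
  I: Integer;
begin
  { Load the test document into a string once, outside the timed loop }
  Strings := TStringList.Create;
  try
    Strings.LoadFromFile('test.json');
    Json := Strings.Text;
  finally
    Strings.Free;
  end;
  Start := Now;
  for I := 1 to Iterations do
  begin
    { GetJSON parses the string and returns a TJSONData tree }
    Data := GetJSON(Json);
    Data.Free;
  end;
  Seconds := MilliSecondsBetween(Now, Start) / 1000.0;
  WriteLn('fpjson parse time: ', Seconds:0:3, ' seconds');
end.
```

The same loop shape was used for each library, with only the parse and free calls swapped out.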
Correctness Tests

This section examines the correctness of each JSON library to see how well it handles JSON features, and also how reliably it detects invalid JSON. The tests are listed below.
A. Detect invalid JSON by substituting a colon where a comma should have been placed.
B. Handle blank key names correctly.
C. Handle duplicate key names correctly.
D. Handle simple text escape sequences correctly.
E. Handle Unicode text correctly.
F. Detect and handle incorrect four-digit hex codes for escaped characters.
G. Handle parsed number values correctly.
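Test A, for example, can be reproduced with a document such as the one below. A correct parser should report an error rather than accept the input; this sketch uses fpjson, where GetJSON raises an exception on malformed JSON:

```pascal
program InvalidTest;

{$mode objfpc}{$H+}

uses
  SysUtils, fpjson, jsonparser;

var
  Data: TJSONData;
begin
  try
    { a colon appears where a comma should separate the two pairs }
    Data := GetJSON('{ "a": 1: "b": 2 }');
    Data.Free;
    WriteLn('FAIL: invalid JSON was accepted');
  except
    on E: Exception do
      WriteLn('PASS: parser rejected invalid JSON');
  end;
end.
```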
Below is a simple chart that plots the failure or success of these tests.
Library A B C D E F G
FPJson PASS PASS FAIL PASS FAIL PASS PASS
JsonTools PASS PASS PASS PASS PASS PASS PASS
LkJSON PASS PASS FAIL FAIL FAIL FAIL FAIL
JsonStorage FAIL FAIL FAIL FAIL FAIL FAIL FAIL
Note that while JsonTools passes every test, and JsonStorage fails every test, these tests by no means cover all possible conditions. They represent a few conditions we picked out from common mistakes a JSON parser library might make.
Conclusions

Of all the libraries tested, JsonTools stands out far above the rest. Its speed-test results were far ahead of all the other parsers, and it was the only library to pass all the correctness tests. It was also the only library to compile with no warnings or hints, and it was implemented in a single file. It and FPJson both support enumerators, but JsonTools also allows node kinds to be altered more easily.
FPJson was good with correctness, but it was also the slowest and most complicated of all the parsers we tested.
LkJSON had some problems building JSON strings, and we had to remove references to some units to get it to compile correctly on Linux. Also, its newer implementation of JSON string building had to be disabled, as it would raise access violations in some of our tests.
JsonStorage was a small library, but it handled so many parts of JSON parsing incorrectly that it would be hard to recommend.