This aims to replace the dirty #2367 and might solve the underlying problem in a better way, by using a `Buffer` instead of an `Array[String]` in `StringWriter`.
With the following pseudo-oneliner:
~~~nit
import json::string_parser
import json
print args.first.to_path.read_all.parse_json.as(not null).serialize_to_json(pretty=true, plain=true).length
~~~
and the file `nit/benchmarks/json/inputs/magic.json`, a 54 MB JSON file, the results are:
Before:
* User time (seconds): 18.02
* Elapsed (wall clock) time: 13.43
* Maximum resident set size (GB): 6.09
After:
* User time (seconds): 4.26 (-76%)
* Elapsed (wall clock) time: 3.98 (-70%)
* Maximum resident set size (GB): 1.16 (-80%)
Nevertheless, 1 GB of RAM to process a 54 MB file is still huge.
Pull-Request: #2371
Reviewed-by: Lucas Bajolet <r4pass@hotmail.com>
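The core idea of the change below, accumulating output into one growable buffer instead of keeping every fragment as a separate string object, can be sketched in Python (hypothetical class names, not the Nit code; `io.StringIO` plays the role of Nit's `Buffer`):

```python
import io

class ListWriter:
    """Analogue of the old StringWriter: one object per fragment,
    all joined at the end (many small allocations stay live)."""
    def __init__(self):
        self._parts = []
    def write(self, s):
        self._parts.append(s)
    def to_s(self):
        return "".join(self._parts)

class BufferWriter:
    """Analogue of the new StringWriter: fragments are copied into
    a single growable buffer as they arrive."""
    def __init__(self):
        self._buf = io.StringIO()
    def write(self, s):
        self._buf.write(s)
    def to_s(self):
        return self._buf.getvalue()

w = BufferWriter()
for piece in ("{", '"key"', ":", "1", "}"):
    w.write(piece)
print(w.to_s())  # → {"key":1}
```

Both writers produce the same string; the buffer variant avoids holding millions of tiny string objects (each with its own header and length field) during a large serialization, which is where the memory savings above come from.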
class StringWriter
super Writer
- private var content = new Array[String]
- redef fun to_s do return content.plain_to_s
+ private var content = new Buffer
+ redef fun to_s do return content.to_s
redef fun is_writable do return not closed
redef fun write_bytes(b) do
- content.add(b.to_s)
+ content.append(b.to_s)
end
redef fun write(str)
do
assert not closed
- content.add(str.to_s)
+ content.append(str)
+ end
+
+ redef fun write_char(c)
+ do
+ assert not closed
+ content.add(c)
end
# Is the stream closed?
v.stream.write char.escape_to_utf16
end
else
- v.stream.write char.to_s
+ v.stream.write_char char
end
end
v.stream.write "\""
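For context, the serializer loop touched by the last hunk writes most characters straight to the stream and only escapes the few that JSON requires; the `write_char` change lets the common branch skip building a one-character string first. A rough Python sketch of that loop's shape (simplified escape table as an assumption, not the Nit `escape_to_utf16` logic):

```python
def serialize_json_string(s: str) -> str:
    # Minimal escape table; the real serializer handles more cases.
    escapes = {'"': '\\"', '\\': '\\\\', '\n': '\\n', '\t': '\\t'}
    out = ['"']
    for char in s:
        if char in escapes:
            out.append(escapes[char])  # escaped form for special characters
        else:
            out.append(char)           # common case: emit the character as-is
    out.append('"')
    return "".join(out)

print(serialize_json_string('a"b\n'))  # → "a\"b\n"
```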