Friday 8th May, 2015
2:55pm to 3:30pm
Automatic tools can spit out super large scripts these days. Look at Emscripten for examples: a script of several megabytes is the rule rather than the exception. How do you process a script that large? Most parsers want to consume the entire input in one sitting and keep the whole result in memory, so at some point they simply run out of memory. How can you analyze or modify arbitrary scripts if you can't even parse them? Enter the streaming parser. It parses a script left to right in bounded memory, can yield at any point where it needs more input, and streams out parsed tokens as soon as they are available.
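The core idea can be sketched in a few lines. This is a minimal, hypothetical illustration (not the speaker's implementation): a tokenizer that accepts input in arbitrary chunks, emits each token as soon as it is complete, and buffers only the possibly unfinished trailing token between chunks, so memory use is bounded by the longest token rather than by the script size. All names and the toy token grammar are assumptions for the example.

```javascript
// Sketch of a streaming tokenizer: feed chunks in, tokens stream out.
// Only the unfinished tail of the input survives between chunks.
function createStreamingTokenizer(onToken) {
  let pending = '';
  // Toy grammar: whitespace, identifiers, numbers, a few punctuators.
  const tokenRe = /\s+|[A-Za-z_$][\w$]*|\d+|[(){};=+,]/y;

  function scan(text, isLast) {
    tokenRe.lastIndex = 0;
    let consumed = 0;
    let m;
    while ((m = tokenRe.exec(text)) !== null) {
      // A match that runs to the end of the buffer may continue in the
      // next chunk, so keep it pending unless this is the final chunk.
      if (tokenRe.lastIndex === text.length && !isLast) break;
      if (!/^\s+$/.test(m[0])) onToken(m[0]); // skip whitespace tokens
      consumed = tokenRe.lastIndex;
    }
    return text.slice(consumed); // unconsumed tail becomes pending
  }

  return {
    write(chunk) { pending = scan(pending + chunk, false); },
    end() { pending = scan(pending, true); pending = ''; },
  };
}

// Chunk boundaries can fall anywhere, even mid-token:
const tokens = [];
const t = createStreamingTokenizer(tok => tokens.push(tok));
t.write('var ans');
t.write('wer = 4');
t.write('2;');
t.end();
console.log(tokens); // ['var', 'answer', '=', '42', ';']
```

A real streaming JS tokenizer needs far more lookahead logic (strings, comments, regex literals), but the shape is the same: consume what is certain, hold back what might continue, and never keep the whole script around.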
I do the JS dance! (@JS1K, zeonjs.com). See @JSGoodies for my linkdump. Firehose proximity warning.