IndexedDB Performance Comparisons
Over the last month, I have been playing with various IndexedDB operations, trying to figure out performance best practices and surprises. I introduced the test harness at the HTML5 Dev Conference in San Francisco. This post covers how the test cases were written and some interesting observations I made while writing them. I hope to discuss the actual results of the tests in a followup post.
I started the test cases with JSPerf so that I could concentrate only on writing code, without having to worry about measuring and displaying the results, since JSPerf would take care of that for me. JSPerf internally uses Benchmark.js, which runs each test case a statistically significant number of times to give accurate results.
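For reference, this is roughly what a deferred (asynchronous) Benchmark.js case for an IndexedDB read looks like; the database name 'perfTestDB' and store name 'items' here are hypothetical, not taken from my actual suite.

```js
// A minimal sketch of a deferred Benchmark.js case for an IndexedDB read.
// 'perfTestDB' and 'items' are hypothetical names used for illustration.
var suite = new Benchmark.Suite();

suite.add('objectStore.get', {
  defer: true, // Benchmark.js waits until deferred.resolve() is called
  fn: function (deferred) {
    var req = indexedDB.open('perfTestDB');
    req.onsuccess = function (e) {
      var db = e.target.result;
      var get = db.transaction('items').objectStore('items').get(1);
      get.onsuccess = function () {
        db.close();
        deferred.resolve(); // end timing for this iteration
      };
    };
  }
})
.on('cycle', function (e) {
  console.log(String(e.target)); // e.g. "objectStore.get x 1,234 ops/sec"
})
.run({ async: true });
```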
However, there were some problems with continuing to use JSPerf.
- The test setup for JSPerf is not asynchronous. For IndexedDB, I wanted to delete the database between runs, or at least before the entire suite started. In Benchmark.js, the setup needs to be synchronous, as it is added inline with the test case itself. Hence, I had to add code to the 'Preparation' HTML section that hid the 'Run Tests' button until the database was deleted and any seed data was added (a sketch of this workaround follows the list). Not the best way to run tests.
- I was having problems with the versioning system. I could not figure out a way to update the code and ensure that the latest version of my test cases showed up directly at the URL.
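A minimal sketch of the 'Preparation' workaround mentioned above: keep the run button hidden until the database has been deleted and reseeded. The button id, database name, and record shape are assumptions for illustration, not the actual harness code.

```js
// Hypothetical sketch: gate the 'Run Tests' button on async setup.
var runButton = document.getElementById('run'); // assumed button id
runButton.style.display = 'none';

var deleteReq = indexedDB.deleteDatabase('perfTestDB'); // assumed name
deleteReq.onsuccess = function () {
  var openReq = indexedDB.open('perfTestDB', 1);
  openReq.onupgradeneeded = function (e) {
    var db = e.target.result;
    var store = db.createObjectStore('items', { keyPath: 'id' });
    // Seed a few records so the read benchmarks have data to fetch.
    for (var i = 0; i < 100; i++) {
      store.put({ id: i, value: 'record ' + i });
    }
  };
  openReq.onsuccess = function (e) {
    e.target.result.close();
    runButton.style.display = ''; // now safe to start the suite
  };
};
```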
The source code for the test cases is also checked into GitHub. Watch this space for a discussion of the test results, soon to follow.