You have a file containing JSON data and wish to load it into a SQL database. Assume the JSON data is an array of strings (e.g. ["test", "foo", …]) and that the database already has a table with the columns 'id' and 'text'. The goal is to load each string from the array into the database as 'text', with the string's array index as 'id'.
Design a system for parsing a small amount of JSON data into RAM and then storing it in the database (pseudocode is fine; assume there is an 'insert(id, text)' function you can use to get the data into the database).
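For illustration, a minimal Python sketch (the file path and function name are placeholders; `insert(id, text)` is the helper the prompt assumes exists):

```python
import json

def load_strings(path, insert):
    """Parse the whole JSON array into RAM, then insert each element."""
    with open(path, "r", encoding="utf-8") as f:
        strings = json.load(f)          # entire array is parsed into memory
    for i, text in enumerate(strings):  # the array index becomes the id
        insert(i, text)
```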
Facebook needs to use its hardware as efficiently as possible. The amount of data we want to parse might be much larger than the system's available RAM (though the database can still store everything). How could you redesign your parser to still accomplish the task?
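One possible sketch uses an incremental (streaming) parser, here the third-party `ijson` library as an assumption, so only one array element is held in RAM at a time:

```python
import ijson  # third-party incremental JSON parser; any streaming parser works

def load_strings_streaming(path, insert):
    """Stream array elements one at a time so memory use stays bounded."""
    with open(path, "rb") as f:
        # "item" is ijson's prefix for each element of a top-level array
        for i, text in enumerate(ijson.items(f, "item")):
            insert(i, text)
```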
When you profile your code, you find that the 'insert(id, text)' function is now your performance bottleneck. Why might that be? How could you rewrite it to fix the issue? (SQL pseudocode is fine.)
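Calling `insert` once per string typically pays for a network round trip, statement parsing, and a commit for every row. A common fix is to batch rows; here is a sketch assuming a hypothetical `insert_many(rows)` helper that issues one multi-row INSERT (or one transaction) per call:

```python
import ijson

def load_strings_batched(path, insert_many, batch_size=1000):
    """Buffer rows and flush them in batches to amortise per-statement cost.

    `insert_many(rows)` is a hypothetical helper that performs a single
    multi-row INSERT for all buffered (id, text) pairs.
    """
    batch = []
    with open(path, "rb") as f:
        for i, text in enumerate(ijson.items(f, "item")):
            batch.append((i, text))
            if len(batch) >= batch_size:
                insert_many(batch)
                batch.clear()
    if batch:               # flush the final partial batch
        insert_many(batch)
```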
Submissions will be graded on the following criteria:
- Meets Deliverables
- Creativity
- Clarity
Winning submissions will receive $150 each
Runner-up submissions will receive $50 each
| Prize | Winner | School |
| --- | --- | --- |
| $150.00 | Xiang Mao | University of Florida |
| $150.00 | Renar Narubin | Illinois Institute of Technology |
| $150.00 | Jeff Treleaven | The Ohio State University |
| $50.00 | Diego Calderon | Stanford University |
| $50.00 | Christopher Kuech | Boston University |
| $50.00 | Jin Pan | Massachusetts Institute of Technology |