Where can I follow the development work related to this problem? I am working on a set of forms (which I am not sure I can share; I will need to ask), some of which are pretty long. The longest has ~7,000 questions, many of which are skipped depending on choices made early in the form, and it cannot be helped by the CSV approach. Now, the user requirements propose UI changes that I can only achieve by merging the forms together, which will lead to utter chaos in form load times. Previously, the longer forms were to be used very sparingly, so longer load times were not much of an issue.
The enumerators will be given fairly basic devices with only 1 GB of RAM, which compounds the problem.
If I can see the approach being taken to solve the problem, maybe my team and I can contribute to it. The way I see it, the form XML is read in its entirety at form load time, and that accounts for all of the delay. Since the XML is created after due verification of the form XLS, could it be made optional (for long forms) to load the XML piecemeal, perhaps at the cost of not being able to jump around between form sections?
Any pointers will be much appreciated.
WIP Pull request here
We would like to learn more about you: what you do, your interests, the projects you are working on... please introduce yourself. Whilst you are at it, please add a profile picture.
Thank you @Ronald_Munjoma!
While I can see that a lot of bug fixes and UI changes are taking place, I did not find what I came to this thread for: insight into what makes the loading of a long form slow. Ultimately, even my longest form is just an XML file about 2 MB in size. Does it really take that long to parse 2 MB of text, especially with tasks running in multiple threads?
And even if it does, what can be done to make it faster? Could we, as I suggested in my previous post, find a way to read the form definition sequentially as we go instead of all at once (it sounds like the compiler vs. interpreter debate all over again), even if that comes at a cost?
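To make the "sequentially as-we-go" idea concrete, here is a minimal sketch using the JDK's StAX pull parser. This is not how ODK Collect actually parses forms (it builds a full FormDef), and the element names are illustrative rather than real XForms; the point is only that a pull parser can stop after the first N questions and never touch the rest of the file:

```java
import javax.xml.stream.*;
import java.io.StringReader;
import java.util.*;

// Illustrative sketch only: lazily pull the first N question elements
// from a form definition instead of materialising the whole tree.
// "input"/"ref" are stand-ins, not the real XForms vocabulary.
public class LazyFormReader {
    static List<String> readFirstN(String xml, int n) throws XMLStreamException {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        XMLStreamReader reader = factory.createXMLStreamReader(new StringReader(xml));
        List<String> questions = new ArrayList<>();
        while (reader.hasNext() && questions.size() < n) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && reader.getLocalName().equals("input")) {
                questions.add(reader.getAttributeValue(null, "ref"));
            }
        }
        reader.close(); // stop early: the remainder of the file is never parsed
        return questions;
    }
}
```

The catch, as the discussion below suggests, is that relevance and calculation expressions can reference questions that have not been read yet, so a purely sequential reader would have to give up some features (such as jumping ahead).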
I've moved this thread to Development because well, it's a development issue.
Parsing 2 MB of text is easy. Storing 2 MB of text in a data structure that can return the next prompt, figure out relevancy/constraints, and perform calculations across thousands of questions is not so easy. What we currently do is process the XML on first load and store it in a FormDef (slow); we then serialize the FormDef and store it on the SD card (that's what's in /odk/.cache); on second load we use that serialized data structure (fast).
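The two-phase load described above can be sketched in a few lines. This is a toy stand-in, not the real JavaRosa code: `FormDef` here is just a serializable placeholder, and the "parsing" step is faked, but the shape (slow parse + cache write on first load, fast deserialize on second load) matches the description:

```java
import java.io.*;
import java.nio.file.*;

// Toy sketch of the first-load / second-load pipeline described above.
// The real FormDef lives in JavaRosa; this placeholder only shows the caching shape.
public class FormCache {
    static class FormDef implements Serializable {
        String title;
        FormDef(String title) { this.title = title; }
    }

    // First load: parse the XML (slow, faked here), then cache the result.
    static FormDef firstLoad(Path xml, Path cache) throws IOException {
        FormDef def = new FormDef(xml.getFileName().toString()); // stand-in for real parsing
        try (ObjectOutputStream out = new ObjectOutputStream(Files.newOutputStream(cache))) {
            out.writeObject(def);
        }
        return def;
    }

    // Second load: skip parsing entirely and deserialize the cached FormDef (fast).
    static FormDef secondLoad(Path cache) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(Files.newInputStream(cache))) {
            return (FormDef) in.readObject();
        }
    }
}
```

This is why second load is fast: all the expensive XML-to-object work happens once, and subsequent loads pay only deserialization cost.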
Can you tell me what devices you've tested on? What are your first load and second load times on those devices? Are you processor bound, memory bound, or disk bound?
Also, any ideas on how we can speed this up? Some ideas we've discussed in the past...
- When a form is downloaded, do the processing in the background and only show it to the user when the FormDef is ready
- Instead of using a FormDef, parse and store the form in a database and use that instead (how this will actually work, I don't know)
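The first idea above, doing the slow processing in the background right after download, could look something like this. Everything here is hypothetical (names, the executor choice, the fake pipeline); it only illustrates handing the work to a background thread and exposing a `Future` the UI can check before opening the form:

```java
import java.util.concurrent.*;

// Hypothetical sketch of idea 1: precompile the form on a background
// thread at download time, so the cached result is ready before the
// enumerator opens the form. All names are illustrative.
public class BackgroundPrecompile {
    static ExecutorService pool = Executors.newSingleThreadExecutor();

    static Future<String> precompile(String formXml) {
        return pool.submit(() -> {
            // stand-in for the slow XML -> FormDef -> serialized-cache pipeline
            return "cached:" + formXml.hashCode();
        });
    }
}
```

On Android this would more likely be a `WorkManager` job or service than a bare executor, but the principle is the same: by the time the user taps the form, `Future.isDone()` is true and first load costs the same as second load.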