Friday 11 March 2016

It's all in the application: Recon Instruments Snow 2 vs Paragliding: Part 9

Continued from Part 8

Now that we have a result, it is time to display it to the user.

The Displays are implemented as disjoint Fragments. Each Fragment has no knowledge of any other, nor does the 'master' Activity have any knowledge (at a code level) of the Display Fragments it contains.


Each Display has a BroadcastReceiver that listens for Flight Data messages from the Flight Data Service. When a message arrives, an AsyncTask is created to invoke each of the Display's Processors. The result of each Processor is cached and, ultimately, the Display is updated.
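As a rough illustration of that pipeline, here is a minimal sketch of the Processor-and-cache pattern in plain Java. The names (`Processor`, `Display`, the glide-ratio example in the usage note) are hypothetical stand-ins, not the app's actual classes, and on the device this loop would run inside the AsyncTask spawned by the Display's BroadcastReceiver rather than synchronously:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical sketch: each Display owns a set of Processors; when a
// Flight Data message arrives, every Processor runs and its result is
// cached before the Display redraws. On-device this would happen in an
// AsyncTask started from the Display's BroadcastReceiver.
interface Processor {
    String id();                                 // key used to cache the result
    double process(Map<String, Double> flightData);
}

class Display {
    private final List<Processor> processors = new ArrayList<>();
    private final Map<String, Double> cache = new HashMap<>();

    void addProcessor(Processor p) { processors.add(p); }

    // Called when a Flight Data message arrives for this Display.
    void onFlightData(Map<String, Double> flightData) {
        for (Processor p : processors) {
            cache.put(p.id(), p.process(flightData));
        }
        render();
    }

    Double cached(String id) { return cache.get(id); }

    void render() { /* on-device: push cached values to widgets on the UI thread */ }
}
```

For example, a Processor whose `process` divides ground speed by sink rate would cache a glide ratio each time a message arrives, without the Display knowing anything about how it was computed.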

The single deviation from this system is the Compass. The Compass updates in real time as the user turns their head. Moving the Heading position by way of a Flight Data Message via the Flight Data Service would introduce too much latency at this point (this could be overcome with predictive interpolation, but that would add complexity for little benefit*). So the Heading data used by the Compass is fed directly into the Compass Display by way of a callback. Processing in the callback is throttled so that updates occur no more often than once every 30 ms. When an update occurs, each of the Compass's children (CompassSubDisplays) is walked, its Processors invoked, and the results displayed.
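The throttling step can be sketched as a small guard that drops any heading callback arriving within 30 ms of the last one processed. This is an assumed implementation, not the app's actual code; the class name and method are hypothetical:

```java
// Hypothetical throttle for the Compass heading callback: callbacks that
// arrive within MIN_PERIOD_MS of the last processed one are skipped, so
// the CompassSubDisplays are walked at most once every 30 ms.
class HeadingThrottle {
    static final long MIN_PERIOD_MS = 30;

    // Initialised so the very first callback always passes the check.
    private long lastProcessedMs = -MIN_PERIOD_MS;

    // Returns true if this callback should be processed now (and records
    // the time); false if it arrived too soon and should be dropped.
    boolean shouldProcess(long nowMs) {
        if (nowMs - lastProcessedMs < MIN_PERIOD_MS) {
            return false; // too soon after the last update: skip
        }
        lastProcessedMs = nowMs;
        return true;
    }
}
```

The caller would invoke `shouldProcess(SystemClock.uptimeMillis())` at the top of the sensor callback and return early on `false`, so the walk over the CompassSubDisplays never runs faster than the 30 ms period.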


* That said, thought has been given to implementing a complete Flight Data Recorder that would include Head Orientation. This would allow for future playback and enable the creation of PiP video with an overlay of what the pilot would have seen on the HUD.
