overhead of using multiple processes

Nobody III hungryninja101 at ...9...
Wed Dec 20 03:16:16 CET 2017

In Genode, it is common practice to separate everything into small modular
components, especially when dealing with untrusted data. How far can we go
with this before running into performance or memory issues?

For example, a file manager would benefit from content-based file type
identification. However, identifying files within the file manager itself
(e.g. using libmagic) could pose a security risk, while delegating the
task to a separate component could have a noticeable effect on
performance. If we use a single component instance for all of the files,
the only significant added overhead would be from IPC, AFAIK. That seems
acceptable in terms of performance, but a malicious file might be able to
cause the component to misidentify other files in the directory, which
could be a security risk. A more secure approach would be to run an
instance of the component for each individual file in the directory.
However, this may substantially reduce performance for large directories,
depending on the overhead of component creation. Would performance be an
issue here? (And am I overestimating the risk of file misidentification?)
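For concreteness, content-based identification mostly amounts to matching
the first bytes of a file against a table of known signatures. Here is a
minimal self-contained sketch of that idea (the identify() helper and its
signature table are hypothetical, not libmagic's actual API or a Genode
interface); the untrusted input is just a byte buffer, which is what would
cross the IPC boundary to a sandboxed identifier component:

```cpp
#include <cstddef>
#include <cstring>
#include <string>
#include <vector>

// Hypothetical helper: match the leading bytes of untrusted data against
// well-known magic numbers and return a MIME-type string.
std::string identify(std::vector<unsigned char> const &data)
{
    struct Signature { unsigned char const *bytes; std::size_t len; char const *type; };

    static unsigned char const png[]  = { 0x89, 'P', 'N', 'G' };
    static unsigned char const jpeg[] = { 0xff, 0xd8, 0xff };
    static unsigned char const pdf[]  = { '%', 'P', 'D', 'F' };
    static unsigned char const elf[]  = { 0x7f, 'E', 'L', 'F' };

    static Signature const table[] = {
        { png,  sizeof(png),  "image/png" },
        { jpeg, sizeof(jpeg), "image/jpeg" },
        { pdf,  sizeof(pdf),  "application/pdf" },
        { elf,  sizeof(elf),  "application/x-executable" },
    };

    // Compare only against the prefix; never interpret the payload itself.
    for (Signature const &s : table)
        if (data.size() >= s.len && !std::memcmp(data.data(), s.bytes, s.len))
            return s.type;

    return "application/octet-stream";
}
```

Note that a pure prefix matcher like this keeps no state between files, so
one malicious file cannot influence the classification of another; the
misidentification concern arises mainly with stateful or parsing-heavy
identifiers, which is where per-file instances would pay off.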

Similar cases include icon/thumbnail rendering, general-purpose image
loading, and text rendering from many different sources. Would there be any
notable differences for these?
