Hello Genodians
Our continuous integration pipeline is split into several stages that correspond to the Genode depot tools. The artifacts created by each depot tool are passed on to the subsequent stages:
- "Extract"-stage: extract all API/source archives.
- "Build"-stage: build binary archives from the source archives extracted during the previous stage. For each architecture (arm_v7a, arm_v8a, x86_64, etc.) a dedicated build job is triggered. These jobs run in parallel and possibly on different hardware.
- "Run"-stage: execute the run scripts. Everything is "imported from depot" so that nothing needs to be compiled during this stage.
- "Publish"-stage: publish the API/source/binary/package archives.
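For illustration, the stage sequence can be sketched as the depot-tool invocations the pipeline issues. This is a dry run that only prints the commands; the depot user "rite" and the meta-package names are the ones used in our setup, and the exact argument syntax of the depot tools may differ between Genode versions:

```shell
#!/bin/sh
# Dry-run sketch of the pipeline stages as depot-tool calls.
# Depot user 'rite' and package names follow the setup described in
# this post; exact depot-tool argument syntax may vary per version.
DEPOT_USER=rite
ARCHS="x86_64 arm_v7a arm_v8a"

# "Extract"-stage: resolve each meta package into its archives
for arch in $ARCHS; do
    echo "tool/depot/extract $DEPOT_USER/pkg/gapfruit_ci-$arch"
done

# "Build"-stage: one job per architecture, run in parallel by the CI
for arch in $ARCHS; do
    echo "tool/depot/build $DEPOT_USER/pkg/gapfruit_ci-$arch"
done

# "Run"-stage executes the run scripts against depot content only;
# the "Publish"-stage then publishes the resulting archives, e.g.:
echo "tool/depot/publish $DEPOT_USER/pkg/gapfruit_ci-x86_64"
```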
For the "Build"-stage, we introduced meta-packages like "pkg/gapfruit_ci-x86_64", "pkg/gapfruit_ci-arm_v7a", etc. that include every archive we want to build for the respective architecture.
These meta-packages are also used by the "Extract"-stage, so that for example `tool/depot/extract rite/pkg/gapfruit_ci-x86_64 rite/pkg/gapfruit_ci-arm_v7a` can be called.
Now the number of archives has become so large that `depot/extract` complains: `make: execvp: bash: Argument list too long`.
Apparently, the aggregated argument list passed to `make` contains every archive including all of its dependencies. Duplicates (packages and dependencies that are built for x86_64 as well as for arm_v7a, etc.) are not removed.
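As an interim thought (untested against the depot tools; the archive names below are invented for the example), deduplicating the aggregated list before it is handed to `make` would already shrink the argument list considerably, e.g. with `sort -u`:

```shell
#!/bin/sh
# Illustration: the same dependency shows up once per architecture
# meta package, so the aggregated list contains duplicates.
# Archive names are made up for this example.
LIST="rite/src/libc/1 rite/api/base/1 rite/src/libc/1 rite/api/base/1"

# One entry per line, sorted and deduplicated
DEDUPED=$(printf '%s\n' $LIST | sort -u)

echo "$DEDUPED" | wc -l   # 2 unique archives instead of 4
```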
Is this currently a limitation of `tool/depot/extract`? Before I try my luck tackling it myself, I'd appreciate any hints on how to do so.
Or am I using `depot/extract` incorrectly? Do you see an alternative approach for the use case described above?
Thanks,
Roman
PS: I disabled the dependency check in `tool/depot/mk/dependencies.inc` completely; see the discussions [1,2] on the Genode mailing list.
[1] https://lists.genode.org/pipermail/users/2018-May/006079.html
[2] https://lists.genode.org/pipermail/users/2018-June/006085.html