Let's say I have a large C/C++ codebase, made of several modules (usually static libraries), organised in layers of increasing complexity.
The different products use some modules in different combinations and sometimes with different compilation options.
I think I can first create a Git repository for each module, then create a Conan package for it, define the relevant dependencies for each layer, and finally assemble them in each product. Each time a Git module is released, a new version of its Conan package could be produced.
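For instance, a recipe per module along these lines is what I have in mind (this is just a sketch using the Conan 2 API; the module name, version, and option are placeholders):

```python
# conanfile.py for base module Z -- name/version/options are placeholders
from conan import ConanFile
from conan.tools.cmake import CMake, cmake_layout


class ModuleZConan(ConanFile):
    name = "module_z"
    version = "1.2.0"
    settings = "os", "compiler", "build_type", "arch"
    # Different products can build the same module with different options
    options = {"shared": [True, False], "some_feature": [True, False]}
    default_options = {"shared": False, "some_feature": False}
    exports_sources = "CMakeLists.txt", "src/*", "include/*"

    def layout(self):
        cmake_layout(self)

    def build(self):
        cmake = CMake(self)
        cmake.configure()
        cmake.build()

    def package(self):
        cmake = CMake(self)
        cmake.install()

# A higher-layer module (or a product) would then declare, e.g.:
#     requires = "module_z/1.2.0"
```

So each layer only names the versions of the layers below it, and a product is just the top of such a graph.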
Now my question:
Let's suppose I'm working on a defect in Product A (in one folder, I have all the files/Conan packages needed to build it) and I find that it needs a fix in base module Z, so I make the fix in the Product A folder.
Unless I'm mistaken, the package Z in the Product A folder doesn't have any link back to its Git repo, so I would have to make the correction again in the original Git module Z, produce a new Conan package Z, and finally update the dependency in Product A (or in the intermediate dependent module).
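From what I read, Conan's "editable mode" seems meant for exactly this situation, so that the local clone of module Z temporarily replaces the cached package while I debug. If I understand correctly, the flow would be roughly this (repo URL, names, and versions are placeholders):

```shell
# Work on module Z in its own clone, which keeps the link to its Git repo
git clone git@mycompany.com:modules/module_z.git   # placeholder URL
cd module_z

# Tell Conan to resolve module_z from this local folder instead of the cache
conan editable add . --name=module_z --version=1.2.0

# Build Product A in its own folder; it now picks up the local module Z
cd ../product_a
conan install . --build=editable

# Once the fix works: commit/push module Z, create the real package,
# then remove the editable link so Product A goes back to released packages
cd ../module_z
git commit -am "Fix defect in module Z" && git push
conan create .
conan editable remove --refs=module_z/1.2.0
```

After that, bumping the requirement in Product A (or in the intermediate module) to the new version of Z would be the last step.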
Is this a suitable workflow?
What would you advise for managing projects with multiple layers of dependencies, while still keeping some kind of changelog and an identification of the modules/versions used in a specific version of a product (for maintenance)?
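For that last point, I was thinking a Conan lockfile per product release could record the exact module versions that went into it, something like (the lockfile name is a placeholder):

```shell
# Capture the exact dependency graph used for a given Product A release
conan lock create . --lockfile-out=product_a-2.3.lock

# Later, for maintenance, rebuild that exact combination of module versions
conan install . --lockfile=product_a-2.3.lock
```

Storing that lockfile next to the product's release tag would then identify which version of each module shipped in which product version, but I'm not sure this is the intended usage.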
What I have tried:
I've browsed through some documentation for Conan and Buckaroo, and even watched some online training, but I still don't understand how to design a workflow.