03-17-2022 02:49 AM
Who is responsible for a "dependency" being available and up to date?
Should downstream projects care?
An example is described below, with the following assumptions:
I'd be interested in thoughts on the case where I want to build a library that depends on another library (building B.lvlibp, which depends on A.lvlibp):
Currently I have a system that effectively rebuilds "A" when it is out of date and "B" is requested, but this takes quite a chunk of time confirming whether "A" is up to date or not.
There are ways I could speed that up (probably by looking at "Releases" on GitHub for "A" and checking whether there have been commits since, potentially filtered by some sort of branching strategy if desired), but I wonder if this is "B"'s problem at all...
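One cheap way to approximate that check, without cloning "A" at all, might be to compare the commit behind the latest release tag with the tip of the default branch via `git ls-remote`. A minimal sketch, using a throwaway local repository as a stand-in for "A" (the tag name and the reuse/rebuild decision strings are hypothetical):

```shell
set -e
# Stand-in for dependency A's repository; a real check would use its remote URL
tmp=$(mktemp -d)
git init -q "$tmp/A"
git -C "$tmp/A" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "released work"
git -C "$tmp/A" tag v1.0.0                      # hypothetical latest release
git -C "$tmp/A" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "work since the release"

# Ask the "remote" (here: the local path) for the tag and branch-tip commits
tag_sha=$(git ls-remote "$tmp/A" refs/tags/v1.0.0 | cut -f1)
head_sha=$(git ls-remote "$tmp/A" HEAD | cut -f1)

if [ "$tag_sha" = "$head_sha" ]; then
  decision="reuse"    # no commits since the release: reuse the built A.lvlibp
else
  decision="rebuild"  # commits landed after the release: rebuild A
fi
echo "$decision"      # prints "rebuild" here, since a commit landed after v1.0.0
```

This avoids checking out "A" just to decide nothing changed; the trade-off is that it only sees commits, not whether those commits actually affect the built artifact.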
In particular, if I specify a version for "A" in "B"'s dependencies, it shouldn't really matter, provided a suitable version of "A" can be found and installed without conflicting with C, D, E, or whatever other dependencies are listed. If it can't, then there is a configuration problem, and I'm fine with that being a failure.
If I give "B" the information it needs to produce a list of dependencies (e.g. for an NIPKG), then I can presumably just install those packages before loading and building "B". Nested dependencies would be resolved by NIPM. Versions can also be specified, although this imposes some limits on how they can be specified.
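As a concrete illustration, an NIPKG package declares its dependencies in a Debian-style `control` file, and NIPM resolves nested dependencies at install time. A hypothetical fragment (package names, versions, and the constraint are made up for the example):

```text
Package: b-lvlibp
Version: 2.1.0
Architecture: windows_x64
Description: Packed library B
Depends: a-lvlibp (>= 1.4.0)
```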
03-17-2022 03:48 AM
What works for us:
- Distribute dependencies as packages (NIPM, VIPM or choco packages); the package is rebuilt on commit.
- If the application/library needs some dependencies, then the CI will install the package (latest or a specific version).
It's quite fast (installing packages is usually very quick) and flexible: you can have multiple versions of a package and install the appropriate one for a particular project (not necessarily the latest).
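The "install an appropriate version, not necessarily the latest" step can be sketched in shell: given the versions a feed offers and a minimum the project requires, pick the highest version satisfying the constraint. All version numbers here are made up, and a real script would hand the result to the package manager's install command:

```shell
minimum="1.3.0"                       # version constraint from the project
best=""
for v in 1.2.0 1.4.1 2.0.0 1.3.5; do  # versions offered by the feed (hypothetical)
  # v satisfies ">= minimum" if minimum sorts first under version ordering
  lo=$(printf '%s\n%s\n' "$v" "$minimum" | sort -V | head -n1)
  if [ "$lo" = "$minimum" ]; then
    # keep the highest satisfying version seen so far
    best=$(printf '%s\n%s\n' "$v" "$best" | sort -V | tail -n1)
  fi
done
echo "$best"   # prints 2.0.0, the highest version >= 1.3.0
```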
So we basically follow the last paragraph from your post 😉
03-17-2022 03:52 AM - edited 03-17-2022 03:52 AM
Hey!
In our company we use Git submodules for most of our dependencies (source code), as long as they are not provided by some kind of package manager. We had this kind of discussion for our projects too, and right now we are quite happy with the submodule plus VI/NIPM package strategy.
A question you might ask yourself is: is it really necessary to always have the newest version of a dependency? How do you make sure an update to a dependent package didn't break code in your project? Blindly using the most recent version of a dependency is something we almost never do. We always use a specific version that has been tested with the project; if there is an update, we update the submodule/dependency by hand and test until we are sure nothing broke.
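That "pin a specific, tested version" workflow with submodules might look roughly like this (the paths, tag, and commit messages are hypothetical, and a throwaway local repository stands in for the dependency's remote):

```shell
set -e
tmp=$(mktemp -d)

# Stand-in for the dependency repository, with a tagged, tested release
git init -q "$tmp/dep"
git -C "$tmp/dep" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "tested release"
git -C "$tmp/dep" tag v1.0.0

# The project pins the dependency as a submodule at that tag
git init -q "$tmp/proj"
cd "$tmp/proj"
git -c protocol.file.allow=always submodule --quiet add "$tmp/dep" deps/dep
git -C deps/dep checkout -q v1.0.0      # explicit pin to the tested release
git add .gitmodules deps/dep
git -c user.email=ci@example.com -c user.name=ci \
    commit -q -m "Pin dep at v1.0.0"
```

Updating later is then a deliberate step: fetch and check out the new tag inside `deps/dep`, test the project against it, and only then commit the new pin.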
It might be worth mentioning that we use GitLab and its CI pipelines to build almost all of our dependencies into some kind of VI or NIPM package. Right now we don't use .lvlibp files, so they might need a different approach.
1. In our case, no; if the necessary files cannot be found, the build fails.
2. The build fails.
3. Yes, and in our projects it is always a strict (exact) version of "A".
Be aware that our way of doing things might not work for you. It is just one way to do it, and it depends heavily on how your projects work and what kind of dependencies they use. In the end you need to find a solution that meets your needs and, most importantly, is easy to maintain.
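Roughly, a GitLab pipeline that builds a dependency into a package on every commit, as described above, might be shaped like this. Every stage name, script path, and runner tag here is a hypothetical placeholder, not our actual configuration:

```yaml
# .gitlab-ci.yml (sketch): build the dependency into a package on each commit
stages:
  - build
  - package

build-library:
  stage: build
  tags: [labview-runner]             # hypothetical runner with LabVIEW installed
  script:
    - ./ci/build-library.ps1         # hypothetical build script
  artifacts:
    paths: [builds/]

package-library:
  stage: package
  tags: [labview-runner]
  script:
    - ./ci/build-package.ps1 builds/ # hypothetical NIPM/VI packaging script
  artifacts:
    paths: [packages/]
```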
03-17-2022 08:53 AM
I have not found it to be fast or easy dealing with this.
VIPM has a tendency to crash at the drop of a hat; I have to kill it to ensure I do not get a "VIPM timed out" error. I have to include this in my build script:
# Kill any running VIPM instance (ignoring errors if none is running), then clear the error list
Get-Process -Name "VI Package Manager" -ErrorAction SilentlyContinue | Stop-Process -ErrorAction SilentlyContinue ; $Error.Clear()
My hope is that if I can move to a container solution, things will work better, but I haven't gotten there yet.
03-19-2022 06:01 AM
We make a point of keeping as many dependencies as possible directly in any project's repository. That way, for those "local" dependencies, this is not a problem. For very generic reuse code, we use .vip's more and more. Some of our repos also make use of git submodules (although we try to limit that to specific use cases).
When building through CI, our Release Automation Tools can
- apply .vipc file(s) if configured to do so
- clone any number of repositories to any given location
This works nicely for us and for our customers. So, all in all, the answer is: No, we don't build dependencies "on the fly".
DSH Pragmatic Software Development Workshops (Fab, Steve, Brian and me)
Release Automation Tools for LabVIEW (CI/CD integration with LabVIEW)
HSE Discord Server (Discuss our free and commercial tools and services)
DQMH® (Developer Experience that makes you smile)