Discussion:
[Scons-users] integrating scons with other (non-scons) projects
Mats Wichmann
2018-06-08 14:37:41 UTC
In common with, I presume, many others, our project builds using scons,
but also builds on the work of other projects rather than reinventing
the wheel where not necessary - so we use external bits for, to name a
few, json, cbor, coap, tls etc. as well as using googletest for unit
testing support and so on.

Because of the multi-platform nature (including cross-compilation
capability in some combinations), we can't depend on native packaging as
we could for an all-Linux solution (for example). In other words,
installing libcoap and libcoap-dev on the build host does not solve
anything.

Of course, these external projects aren't built using scons, which means
we have dependency issues - you have to fetch, unpack, and sometimes
call out via a subprocess to build (in other cases, our build points to the
unpacked files). So when scons does the initial scan, some of the
things it's scanning for may not exist yet. Libraries are not found by
library search methods, headers may not exist, etc. In effect, these
parts are not part of scons's view of the world.

A lot of work has gone into bolting bits and pieces of Python code
together to work with this, and while it generally works, it's the
source of most of the ongoing hiccups in our build system. As
discussed on irc yesterday, for example, one of the downloaders turned
out to be broken when the host is Windows and the Python is 3.x. We
hadn't caught that earlier because nobody's testbed happened to be a Clean
Environment - those testing Py3 usage already had their downloaded
external projects in place, and the CI system uses a different mechanism
to pre-populate, from a cache, the things that would otherwise have to be
downloaded, so the download cost isn't borne x15 for each and every change
submitted... and then the actual failure was covered up by a try block in
our code which was there to catch a different kind of failure.

So, before launching into an even longer essay, what are the best
practices for dealing with this situation of integrating other projects
which are not under scons control?

Someone in the past obviously read this page:

https://github.com/SCons/scons/wiki/DownloadUnpack

because both of the scripts listed on that page in fact appear in our
repository; but only one external lib uses that downloader. Several use
the unpacker, though it has been modified to instead expose the unpack step
as a construction environment method (via AddMethod), for reasons the
commit history does nothing to make clear, and the devs in question are
long gone. Is
that approach overkill, or is it "the best way" - trying very hard to
make scons aware of everything?
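
For concreteness, the AddMethod wrapping amounts to something along
these lines (a simplified sketch, not our actual code; the method name
and paths are invented):

    # SConstruct fragment - the method name UnpackExternal and the paths
    # are illustrative only.
    import tarfile

    def UnpackExternal(env, target_dir, tarball):
        """Unpack tarball into target_dir via a Command, so the unpacked
        tree becomes a node scons knows how to depend on."""
        def unpack(target, source, env):
            with tarfile.open(str(source[0])) as tf:
                tf.extractall(path=str(target[0]))
        return env.Command(env.Dir(target_dir), tarball, unpack)

    env = Environment()
    env.AddMethod(UnpackExternal)   # now callable as env.UnpackExternal(...)
    ext = env.UnpackExternal('extlibs/libcoap', 'downloads/libcoap.tar.gz')

Whether wrapping it as an environment method buys anything over a plain
helper function is, I suppose, part of the same question.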

Sorry, this still got longer than I planned :)

-- mats
Gary Granger
2018-06-08 16:10:58 UTC
Fwiw, I have tried two approaches for this.  The first approach sounds
like what you describe.  I created tools for external dependencies, and
each tool had the capability to download the source package and build
it, and the tool created scons builders whose targets were the libraries
and headers in the scons filesystem that the package build would
create.  Then our scons build system was configured (i.e., with CPPPATH
and LIBS) to find the dependencies where the package build would put
them.  This approach worked to a point, but we don't use it anymore
because, as you noted, it is difficult to maintain.  We've kept only a few
simple cases, like where scons calls out to kmake to build a kernel
module whose source is in our source tree.
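
A tool along those lines might look roughly like this (heavily
simplified, and not our actual code; the package name, URL, and install
prefix are placeholders):

    # Sketch of a per-package scons tool; 'foo' and the URL are invented.
    import os

    EXT_PREFIX = os.path.abspath('build/external')

    def generate(env):
        lib = os.path.join(EXT_PREFIX, 'lib', 'libfoo.a')
        hdr = os.path.join(EXT_PREFIX, 'include', 'foo.h')
        # One Command produces the library and header, so they become
        # ordinary nodes the rest of the build can depend on.
        env.Command([lib, hdr], [],
                    ['wget -q https://example.org/foo-1.0.tar.gz',
                     'tar xzf foo-1.0.tar.gz',
                     'cd foo-1.0 && ./configure --prefix=%s && make install'
                     % EXT_PREFIX])
        env.AppendUnique(CPPPATH=[os.path.join(EXT_PREFIX, 'include')],
                         LIBPATH=[os.path.join(EXT_PREFIX, 'lib')],
                         LIBS=['foo'])

    def exists(env):
        return True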

The more recent approach has been to create a script which just
downloads, builds, and installs each source package.  It is much easier
to encode the custom build settings (like cross-compilation in your
case) in a simple script.  In our case, we might need to bootstrap the
build environment by running the script once, and after that all the
development and building happens in our scons source tree.  We don't
need scons to trigger rebuilds of the external packages, because we're
not changing them.  If an external package needs to be upgraded, then I
just manually rerun the script to download, build, and install the new
version.  Typically, the script has a parameter which is the directory
to which all the packages should be installed.  I suppose I could add an
option to our scons tree to call the external script and make sure
everything is downloaded and built, so that things like clean bootstraps
and upgrades could still be run via scons, but I just don't think it's
worth going further than that and enumerating every external
package's build products and dependencies in scons.
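
Stripped to its essentials, that kind of bootstrap script is on the
order of the following (the package list, URL, and build steps are
placeholders; real packages need their own configure options, of course):

    #!/usr/bin/env python3
    # Download, build, and install each package into the prefix given on
    # the command line.  Package names and URLs here are placeholders.
    import subprocess
    import sys
    import urllib.request

    PACKAGES = [
        ('libfoo-1.0', 'https://example.org/libfoo-1.0.tar.gz'),
        # one entry per external package
    ]

    def bootstrap(prefix):
        for name, url in PACKAGES:
            tarball = name + '.tar.gz'
            urllib.request.urlretrieve(url, tarball)
            subprocess.check_call(['tar', 'xzf', tarball])
            subprocess.check_call(['./configure', '--prefix=' + prefix],
                                  cwd=name)
            subprocess.check_call(['make', 'install'], cwd=name)

    if __name__ == '__main__':
        bootstrap(sys.argv[1])   # e.g. bootstrap.py /opt/project/external

After that, the scons tree only needs its CPPPATH and LIBPATH pointed at
that prefix.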

Now that I've thought about it, this sounds similar to source package
tools like Homebrew or MacPorts, right?  Or Portage or Sorcery?  There
are tools which will download source packages and build them, and manage
dependencies too, but I haven't tried them.  Perhaps I should...

Gary
Mats Wichmann
2018-06-08 16:26:23 UTC
Rob Managan wrote:
I just wanted to point out this tool which can be used to maintain scripts to install libraries for multiple compilers and multiple sets of compile options (debug, different MPI,…)
I presume you can use a python script to fire this off for the required libraries…
https://computation.llnl.gov/projects/spack-hpc-package-manager
*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-
Rob Managan
WCI/DP Division LLNL
Sounds nice, until I get to here:

"Spack uses RPATH linking so that each package knows where to find its
dependencies."

rpath usage runs into trouble with the difference between develop and
deploy - I may have the built libraries in a build area while I'm testing
code, and that's going to be the wrong place for a deploy scenario where
the libraries should be in standard places (y'know, /lib, /opt/lib,
etc.). And vice versa: if I'm testing new versions of libs, I don't want
the rpaths pointing to standard locations because then I'm not picking
up the experimental new versions.

And then there's no rpath at all on Windows, so this part of the concept
is pretty much dead in the water for us.
Bill Deegan
2018-06-08 16:34:37 UTC
Seems like a library of download and build logic for various third party
components would be useful.

Anyone want to flesh out on the wiki the following:
1 - Method of obtaining the package (wget, git, hg, etc...)
1.2 - Method of detecting an update from the source (you'd get the metadata,
and the source would be some type of Value() node; the decider would have to
know how to compare such - a rough sketch follows below)
2 - Method of building (configure/make, cmake,...)
3 - Method of identifying the (important) output file(s). In most cases
this would be the library and one or more header files installed via make
install or equivalent.

It seems very doable, and also needed.
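
For 1.2, I'm imagining something along these lines (just a sketch;
'fetch-and-build-libfoo' and the metadata string are made up):

    # Package metadata carried as a Value() node driving a download/build.
    env = Environment()

    meta = env.Value('libfoo-1.0')   # package identity/version metadata

    libfoo = env.Command('external/lib/libfoo.a', meta,
                         'fetch-and-build-libfoo ${TARGET.dir}')

    # With the default content-based decider, changing the Value string is
    # enough to re-run the Command; a custom Decider could compare version
    # metadata more selectively (e.g. rebuild only on a version bump).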

-Bill