ejolson wrote: ↑Mon Apr 12, 2021 3:23 am
the next post will describe the commands used to download and build Julia inside the resulting Alpine Singularity container.
Although Singularity containers normally run with user-level permissions, it was expedient to run the container as root while building Julia. This is because, as mentioned in the previous post, I needed to configure an entire Alpine Linux distribution inside the sandbox before building and installing Julia.
Assuming alpine is the name of the directory that holds the sandbox, type
Code: Select all
$ su -
# singularity shell -w alpine
to enter the container. Since the sandbox was entered as root, the root user's home directory on the host will be conveniently mounted at /root in the sandbox.
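As a quick optional check (my addition, not part of the original steps), you can confirm that you are inside the Alpine sandbox and that the host files are visible:
Code: Select all
Singularity> cat /etc/os-release    # should report Alpine Linux
Singularity> ls /root               # should list the files from the host root user's home directory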
Now, download Julia and unpack it with
Code: Select all
Singularity> wget https://github.com/JuliaLang/julia/releases/download/v1.6.0/julia-1.6.0.tar.gz
Singularity> tar xf julia-1.6.0.tar.gz
Singularity> cd julia-1.6.0
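Optionally (an extra step I have added, not in the original recipe), the downloaded tarball can be checked against the checksums published for the v1.6.0 release on julialang.org before building:
Code: Select all
Singularity> sha256sum ../julia-1.6.0.tar.gz    # compare against the published checksum for v1.6.0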
Configure Julia by creating a Make.user file in the julia-1.6.0 directory containing
Code: Select all
JULIA_CPU_TARGET := arm1176jzf-s
MCPU := arm1176jzf-s
MARCH := armv6zk
OPENBLAS_TARGET_ARCH := ARMV6
USE_BINARYBUILDER := 0
OPENBLAS_USE_THREAD := 0
prefix = /usr/local/julia-1.6.0
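If you would rather not open an editor inside the sandbox, the same file can be written from the shell with a here-document; this is simply an alternative way of entering the contents listed above:
Code: Select all
Singularity> cat > Make.user <<'EOF'
JULIA_CPU_TARGET := arm1176jzf-s
MCPU := arm1176jzf-s
MARCH := armv6zk
OPENBLAS_TARGET_ARCH := ARMV6
USE_BINARYBUILDER := 0
OPENBLAS_USE_THREAD := 0
prefix = /usr/local/julia-1.6.0
EOF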
Note that I have disabled threading in OpenBLAS since the ARMv6-based Pi B, B+ and Zero computers have only one core.
Next, patch some of the scripts used in the build system with
Code: Select all
Singularity> patch -p1 <julia160.patch
where the contents of julia160.patch are given by
Code: Select all
*** julia-1.6.0/base/binaryplatforms.jl 2021-03-24 20:02:52.000000000 +0000
--- julia-1.6.0-good/base/binaryplatforms.jl 2021-04-11 04:21:58.706732181 +0100
***************
*** 575,582 ****
"x86_64" => "(x86_|amd)64",
"i686" => "i\\d86",
"aarch64" => "(aarch64|arm64)",
"armv7l" => "arm(v7l)?", # if we just see `arm-linux-gnueabihf`, we assume it's `armv7l`
- "armv6l" => "armv6l",
"powerpc64le" => "p(ower)?pc64le",
)
# Keep this in sync with `CPUID.ISAs_by_family`
--- 575,582 ----
"x86_64" => "(x86_|amd)64",
"i686" => "i\\d86",
"aarch64" => "(aarch64|arm64)",
+ "armv6l" => "armv6(l)?", # ejo--fix armv6-alpine-linux-musleabihf
"armv7l" => "arm(v7l)?", # if we just see `arm-linux-gnueabihf`, we assume it's `armv7l`
"powerpc64le" => "p(ower)?pc64le",
)
# Keep this in sync with `CPUID.ISAs_by_family`
*** julia-1.6.0/contrib/normalize_triplet.py 2021-03-24 20:02:52.000000000 +0000
--- julia-1.6.0-good/contrib/normalize_triplet.py 2021-04-11 04:26:04.669514370 +0100
***************
*** 13,18 ****
--- 13,19 ----
'x86_64': '(x86_|amd)64',
'i686': "i\\d86",
'aarch64': "(arm|aarch)64",
+ 'armv6l': "armv6(l)?", # ejo--armv6-alpine-linux-musleabihf
'armv7l': "arm(v7l)?",
'powerpc64le': "p(ower)?pc64le",
}
*** julia-1.6.0/Make.inc 2021-03-24 20:02:52.000000000 +0000
--- julia-1.6.0-good/Make.inc 2021-04-11 02:34:36.805354500 +0100
***************
*** 920,926 ****
JCFLAGS += -fsigned-char
USE_BLAS64:=0
OPENBLAS_DYNAMIC_ARCH:=0
! OPENBLAS_TARGET_ARCH:=ARMV7
endif
# If we are running on aarch64 (e.g. ARMv8 or ARM64), set certain options automatically
--- 920,926 ----
JCFLAGS += -fsigned-char
USE_BLAS64:=0
OPENBLAS_DYNAMIC_ARCH:=0
! OPENBLAS_TARGET_ARCH:=ARMV6
endif
# If we are running on aarch64 (e.g. ARMv8 or ARM64), set certain options automatically
This patch fixes three files needed for the ARMv6 build on Alpine Linux.
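As an optional sanity check (my addition, not from the original recipe), you can confirm that the armv6 entries are now present in the patched files:
Code: Select all
Singularity> grep -n armv6 base/binaryplatforms.jl contrib/normalize_triplet.py
Singularity> grep -n 'OPENBLAS_TARGET_ARCH:=ARMV6' Make.inc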
Upon patching, build Julia with
Code: Select all
Singularity> export VERBOSE=1
Singularity> export LDFLAGS=-Wl,-allow-multiple-definition
Singularity> nohup time make -j4 &
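Because the build is launched in the background with nohup, its output is collected in nohup.out. An optional way to keep an eye on progress (not part of the original steps):
Code: Select all
Singularity> tail -f nohup.out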
After about 5 hours the build should finish. At this point test the executable by running it from the source directory as
Code: Select all
Singularity> ./julia
               _
   _       _ _(_)_     |  Documentation: https://docs.julialang.org
  (_)     | (_) (_)    |
   _ _   _| |_  __ _   |  Type "?" for help, "]?" for Pkg help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 1.6.0 (2021-03-24)
 _/ |\__'_|_|_|\__'_|  |
|__/                   |
julia> exit()
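Since OpenBLAS was built for the ARMV6 target, a slightly stronger smoke test (my own addition, not from the original recipe) is to exercise the linear algebra routines, which go through OpenBLAS and LAPACK; it should print true:
Code: Select all
Singularity> ./julia -e 'using LinearAlgebra; A = rand(100,100); println(norm(A*inv(A) - I) < 1e-8)'
true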
Assuming that went fine, install Julia into the sandbox with
Code: Select all
Singularity> make install
Singularity> cd /usr/local/bin
Singularity> ln -s ../julia-1.6.0/bin/* .
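Before leaving the sandbox, it may be worth a quick check (an optional step I have added) that the installed copy is on the path and reports the expected version:
Code: Select all
Singularity> julia --version
julia version 1.6.0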
Set Julia to be the default run action by editing /singularity so it reads
Code: Select all
#!/bin/sh
exec /usr/local/bin/julia "$@"
Finally, exit the sandbox and build the container with
Code: Select all
Singularity> exit
# singularity build julia160.sif alpine
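Since /singularity now launches Julia, the finished image can be started directly from a regular (non-root) shell on the host. A brief usage sketch, where myscript.jl is just a placeholder name:
Code: Select all
$ singularity run julia160.sif     # starts the Julia REPL via the run action
$ ./julia160.sif myscript.jl       # arguments are passed through "$@" to julia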
Note that no attempt was made to reduce the size of the sandbox before converting it into a container. Thus, julia160.sif includes a full version of gcc, g++, gfortran and a slew of other things hidden inside. Mine turned out to be 240 MB.
The main reason I used a container for this build is that I wanted to learn about Singularity on the super-cheap cluster.
viewtopic.php?p=1247739#p1247739
Moving forward it would be nice if someone could figure out why native builds of Julia fail under the current version of 32-bit Raspberry Pi OS.