21 Commits

Author SHA1 Message Date
1d6af45cac build: Build make build and make pack sequentially
Building multiple targets in parallel may cause docker race conditions.
2025-01-15 23:43:41 +02:00
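A minimal sketch of the approach (the real rule appears in the Makefile diff below): GNU Make's `.NOTPARALLEL` target serializes the aggregate `build` and `pack` goals so their Docker runs never overlap, while each per-architecture sub-target keeps its own `-j` parallelism. Sub-target names here are placeholders; note that listing prerequisites on `.NOTPARALLEL` is a newer GNU Make feature.

```makefile
# Sketch: run the aggregate goals one after another so docker runs don't race.
# (Hypothetical sub-target list; the real one is generated per architecture.)
.NOTPARALLEL: build pack

build: build-x86_64 build-arm
pack: pack-x86_64 pack-arm
```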
a7efcb729e build: delete package dir before redownloading it 2025-01-15 23:43:41 +02:00
8baaffdcbf build: always download and extract tars
Skipping these steps caused failures in our ci-cd. Always downloading & extracting the
tars makes sure we re-download & re-extract them if the previous
download / extraction was faulty.
2025-01-15 23:43:41 +02:00
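A rough sketch of the resulting behaviour (the actual changes are in the `download_package` / `extract_package` diff further down; this condenses them and is not the repo's exact code):

```bash
# Sketch: never trust the leftovers of a previous run.
download_package() {
    local url="$1" output="$2"
    # No "already exists" short-circuit any more: always re-download.
    wget "$url" -O "$output" || return 1
}

extract_package() {
    local tarball="$1" output_dir="$2"
    # Replace any stale extraction; "|| true" keeps `set -e` scripts happy
    # when the directory does not exist yet.
    rm -rf "$output_dir" || true
    mkdir -p "$output_dir"
    tar -xf "$tarball" -C "$output_dir" --strip-components=1
}
```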
eef9ea9215 Merge pull request #34 from guyush1/add-submodules-note
docs: Add a note about submodules initialization and sync
2025-01-14 22:33:00 +02:00
1d7b4ff428 docs: Add a note about submodules initialization and sync 2025-01-14 21:48:07 +02:00
386d09efd7 Merge pull request #29 from RoiKlevansky/redesign-readme
Redesign readme
2025-01-14 21:43:48 +02:00
a5d4b7838e feat: add minimal package.json file for contributor-faces 2025-01-13 20:17:41 +02:00
657600689c ref: redone README.md 2025-01-13 20:17:41 +02:00
9e717db750 docs: add project logo 2025-01-13 17:13:20 +02:00
e97b65c6b9 Merge pull request #31 from guyush1/add-libexpat-support
build: added libexpat build support
2025-01-10 15:51:06 +02:00
d978ca9aaf build: added libexpat build support
This allows commands such as "info os files".
Previously, we had expat support on x86_64 only.
2025-01-10 15:34:56 +02:00
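For reference, the relevant configure flags from the build-script diff below (shown here in simplified, excerpt form); statically linking libexpat is what gives gdb its XML parsing, and hence commands like `info os files`, on every architecture:

```bash
# Simplified excerpt of the gdb configure invocation.
../configure --enable-static \
    --with-expat --with-libexpat-type="static" \
    "CC=$CC" "CXX=$CXX" "--host=$HOST"
```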
9e7d1ed118 Merge pull request #30 from guyush1/external-python-gdb-lib
Compiling Pygments & dependencies in GDB
2025-01-10 15:31:25 +02:00
89f092efb7 Compiling Pygments & dependencies in GDB
Added Pygments to the build.

This is in order to enable GDB syntax highlighting
2025-01-09 21:20:36 +02:00
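A sketch of how this works, based on the build-script diff below: the standard frozen-module list is extended with gdb's own Python package and with Pygments, then CPython's freeze machinery is re-run so the static interpreter can import them without any files on disk. Paths here are placeholders.

```bash
# Sketch; real paths come from the build script's package directories.
export EXTRA_FROZEN_MODULES="$(tr '\n' ';' < frozen_python_modules.txt)"
export EXTRA_FROZEN_MODULES="${EXTRA_FROZEN_MODULES};<gdb.**.*>: gdb = /path/to/binutils-gdb/gdb/python/lib"
export EXTRA_FROZEN_MODULES="${EXTRA_FROZEN_MODULES};<pygments.**.*>: pygments = /path/to/pygments"

python3.12 ../Tools/build/freeze_modules.py   # regenerate the frozen-module tables
make regen-frozen                             # rebuild the frozen module sources
```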
f7e97cac7f Merge pull request #25 from guyush1/allow-build-with-and-without-python
build: Allow building gdb with and without python
2024-12-30 23:55:08 +02:00
6738cedefc automation: build python targets in the ci-cd pipeline
Done using a 2D matrix involving the build type (regular or with Python).
2024-12-30 23:21:17 +02:00
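The core of that matrix, as it appears in the new `pr-pipeline.yaml` further down (condensed here):

```yaml
strategy:
  matrix:
    build_type: ["build", "build-with-python"]
    architecture: ["x86_64", "arm", "aarch64", "powerpc", "mips", "mipsel"]
steps:
  - name: Build
    run: make ${{ matrix.build_type }}-${{ matrix.architecture }}
```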
5359ff1116 build: Allow building gdb with and without python 2024-12-30 23:21:17 +02:00
17346caf10 Merge pull request #23 from guyush1/reduce-static-python-size
reduce static gdb python size
2024-12-25 23:46:36 +02:00
aa49ade8d4 Strip the executables in order to reduce their size 2024-12-25 21:35:03 +02:00
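The mechanism, per the build-script diff below, is simply stripping at link time:

```bash
# Strip symbols from the final executables to shrink them.
export LDFLAGS="-s"
```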
1dfe3fa6ca Reduce static-gdb size by reducing python size
Updated the python submodule.
The newer submodule will create smaller static python libraries.
2024-12-25 21:35:03 +02:00
c44e67540a Added X64 build prefix
There's no real reason to assume the host machine is X64.
2024-12-21 13:50:39 +02:00
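A sketch of the change as it appears in `set_compliation_variables` in the diff below; the explicit prefix keeps the build from silently falling back to the container's host compiler:

```bash
elif [[ "$target_arch" == "x86_64" ]]; then
    CROSS=x86_64-linux-gnu-   # was CROSS="" (i.e. the host toolchain)
    export HOST=x86_64-linux-gnu
fi
# Hypothetical illustration of how the prefix is typically consumed:
export CC="${CROSS}gcc"
export CXX="${CROSS}g++"
```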
a0ceeff014 Added parallel build to PR workflow
Using a matrix and job separation we can make the architectures compile
in parallel with each other, hopefully reducing the time required for builds
and also simplifying the process of building a single architecture.

A problem that we encountered is that with Python the resulting packed
tars are very large. Each release is in the order of tens of megabytes.
Using artifacts in our pipeline can easily make us surpass the maximum
size limit for free GitHub accounts (500 MB).
Because of this, we use the regular non-parallel pipeline for release
build. Releasing the version from the same job the build was performed
in allows us to directly access the build files instead of using
artifacts.

Separated release and MR pipelines.
2024-12-21 13:50:39 +02:00
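The release step therefore lives in the same job that produced the tars and only fires on push (tag) events, as in the workflow diff below:

```yaml
- name: Publish release
  if: github.event_name == 'push'      # guard: release only on push (tag) events
  uses: softprops/action-gh-release@v2
  with:
    files: build/artifacts/gdb-static*.tar.gz
```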
18 changed files with 1565 additions and 91 deletions

193
.github/assets/gdb-static_logo_dark.svg vendored Normal file

File diff suppressed because one or more lines are too long

193
.github/assets/gdb-static_logo_light.svg vendored Normal file

File diff suppressed because one or more lines are too long

25
.github/workflows/pr-pipeline.yaml vendored Normal file
View File

@ -0,0 +1,25 @@
name: gdb-static-pr-pipeline
on:
pull_request:
branches:
- '*'
jobs:
build:
strategy:
matrix:
build_type: ["build", "build-with-python"]
architecture: ["x86_64", "arm", "aarch64", "powerpc", "mips", "mipsel"]
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
submodules: recursive
- name: Install dependencies
run: sudo apt-get install -y wget
- name: Build
run: make ${{ matrix.build_type }}-${{ matrix.architecture }} -j$((`nproc`+1))

View File

@ -1,17 +1,15 @@
name: gdb-static-pipeline
name: gdb-static-release-pipeline
on:
pull_request:
branches:
- '*'
push:
tags:
- 'v*'
# Use a non-parallel single job pipeline because artifacts weigh too much. Instead,
# simply build the files in the same job they are released.
jobs:
build:
build_and_publish:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
@ -26,14 +24,7 @@ jobs:
- name: Pack
run: make pack
- name: Upload artifact
uses: actions/upload-artifact@v4
with:
name: gdb-static
path: build/artifacts/gdb-static*.tar.gz
- name: Publish release
if: github.event_name == 'push'
uses: softprops/action-gh-release@v2
with:
files: build/artifacts/gdb-static*.tar.gz
files: build/artifacts/gdb-static*.tar.gz

6
.gitmodules vendored
View File

@ -6,3 +6,9 @@
path = src/submodule_packages/binutils-gdb
url = git@github.com:guyush1/binutils-gdb.git
branch = gdb-static
[submodule "src/submodule_packages/pygments"]
path = src/submodule_packages/pygments
url = git@github.com:pygments/pygments.git
[submodule "src/submodule_packages/libexpat"]
path = src/submodule_packages/libexpat
url = git@github.com:guyush1/libexpat.git

View File

@ -19,6 +19,7 @@ RUN apt update && apt install -y \
gcc-powerpc-linux-gnu \
git \
libncurses-dev \
libtool \
m4 \
make \
patch \

View File

@ -1,17 +1,26 @@
ARCHS := x86_64 arm aarch64 powerpc mips mipsel
TARGETS := $(addprefix build-, $(ARCHS))
PYTHON_TARGETS := $(addprefix build-with-python-, $(ARCHS))
ALL_TARGETS := $(TARGETS) $(PYTHON_TARGETS)
PACK_TARGETS := $(addprefix pack-, $(ARCHS))
PYTHON_PACK_TARGETS := $(addprefix pack-with-python-, $(ARCHS))
ALL_PACK_TARGETS := $(PACK_TARGETS) $(PYTHON_PACK_TARGETS)
SUBMODULE_PACKAGES := $(wildcard src/submodule_packages/*)
BUILD_PACKAGES_DIR := "build/packages"
.PHONY: clean help download_packages build build-docker-image $(TARGETS) $(PACK_TARGETS)
.PHONY: clean help download_packages build build-docker-image $(ALL_TARGETS) $(ALL_PACK_TARGETS)
.NOTPARALLEL: build pack
help:
@echo "Usage:"
@echo " make build"
@echo ""
@for target in $(TARGETS); do \
@for target in $(ALL_TARGETS); do \
echo " $$target"; \
done
@ -20,7 +29,7 @@ help:
build/build-docker-image.stamp: Dockerfile
mkdir -p build
docker build -t gdb-static .
docker buildx build --tag gdb-static .
touch build/build-docker-image.stamp
build-docker-image: build/build-docker-image.stamp
@ -40,19 +49,31 @@ symlink-git-packages: build/symlink-git-packages.stamp
download-packages: build/download-packages.stamp
build: $(TARGETS)
build: $(ALL_TARGETS)
$(TARGETS): build-%: symlink-git-packages download-packages build-docker-image
$(TARGETS): build-%:
@$(MAKE) _build-$*
$(PYTHON_TARGETS): build-with-python-%:
@WITH_PYTHON="--with-python" $(MAKE) _build-$*
_build-%: symlink-git-packages download-packages build-docker-image
mkdir -p build
docker run --user $(shell id -u):$(shell id -g) \
--rm --volume .:/app/gdb gdb-static env TERM=xterm-256color \
/app/gdb/src/compilation/build.sh $* /app/gdb/build/ /app/gdb/src
/app/gdb/src/compilation/build.sh $* /app/gdb/build/ /app/gdb/src $(WITH_PYTHON)
pack: $(PACK_TARGETS)
pack: $(ALL_PACK_TARGETS)
$(PACK_TARGETS): pack-%: build-%
if [ ! -f "build/artifacts/gdb-static-$*.tar.gz" ]; then \
tar -czf "build/artifacts/gdb-static-$*.tar.gz" -C "build/artifacts/$*" .; \
$(PACK_TARGETS): pack-%:
@$(MAKE) _pack-$*
$(PYTHON_PACK_TARGETS): pack-with-python-%:
@TAR_EXT="with-python-" ARTIFACT_EXT="_with_python" $(MAKE) _pack-$*
_pack-%: build-%
if [ ! -f "build/artifacts/gdb-static-$(TAR_EXT)$*.tar.gz" ]; then \
tar -czf "build/artifacts/gdb-static-$(TAR_EXT)$*.tar.gz" -C "build/artifacts/$*$(ARTIFACT_EXT)" .; \
fi
clean-git-packages:

162
README.md
View File

@ -1,54 +1,150 @@
# Repository of static gdb and gdbserver
<h1 align="center">
<picture>
<source media="(prefers-color-scheme: dark)" srcset="./.github/assets/gdb-static_logo_dark.svg">
<source media="(prefers-color-scheme: light)" srcset="./.github/assets/gdb-static_logo_light.svg">
<img src="./.github/assets/gdb-static_logo_light.svg" alt="gdb-static" width="210px">
</picture>
</h1>
## **The statically compiled gdb / gdbserver binaries are available to download under github releases!**
<p align="center">
<i align="center">Frozen static builds of everyone's favorite debugger!🧊</i>
</p>
link: [gdb-static github releases](https://github.com/guyush1/gdb-static/releases)
<h4 align="center">
<a href="https://github.com/guyush1/gdb-static/releases/latest">
<img src="https://img.shields.io/github/v/release/guyush1/gdb-static?style=flat-square" alt="release" style="height: 20px;">
<a href="https://github.com/guyush1/gdb-static/actions/workflows/pr-pipeline.yaml">
<img src="https://img.shields.io/github/actions/workflow/status/guyush1/gdb-static/pr-pipeline.yaml?style=flat-square&label=pipeline" alt="continuous integration" style="height: 20px;">
</a>
<a href="https://github.com/guyush1/gdb-static/graphs/contributors">
<img src="https://img.shields.io/github/contributors-anon/guyush1/gdb-static?color=yellow&style=flat-square" alt="contributors" style="height: 20px;">
</a>
<br>
<img src="https://img.shields.io/badge/GDB-v15.2-orange?logo=gnu&logoColor=white&style=flat-square" alt="gdb" style="height: 20px;">
<img src="https://img.shields.io/badge/Python-built--in-blue?logo=python&logoColor=white&style=flat-square" alt="python" style="height: 20px;">
</h4>
## For manual gdb/gdbserver compilation instructions, have a look at the compilation.md file
## TL;DR
## Compiling gdb using docker
- **Download**: Get the latest release from the [releases page](https://github.com/guyush1/gdb-static/releases/latest).
This repository contains a dockerfile and build scripts to compile gdb and gdbserver statically for multiple architectures.
Currently, the supported architectures are:
- x86_64
- arm
- aarch64
- powerpc (32bit)
You can easily expand it to support more architectures by adding the appropriate cross compilers to the dockerfile, and other build scripts.
## Introduction
NOTE: You don't need to interact with the dockerfile directly, as the Makefile will take care of everything for you.
Who doesn't love GDB? It's such a powerful tool, with such a great package.
But sometimes, you run into one of these problems:
- You can't install GDB on your machine
- You can't install an updated version of GDB on your machine
- Some other strange embedded reasons...
### Building for a specific architecture
This is where `gdb-static` comes in! We provide static builds of `gdb` (and `gdbserver` of course), so you can run them on any machine, without any dependencies!
<details open>
<summary>
Features
</summary> <br />
- **Static Builds**: No dependencies, no installation, just download and run!
- **Latest Versions**: We keep our builds up-to-date with the latest versions of GDB.
- **Builtin Python (Optional)**: We provide builds with Python support built-in.
- **XML Support**: Our builds come with XML support built-in, which is useful for some GDB commands.
- **Wide Architecture Support**: We support a wide range of architectures:
- aarch64
- arm
- mips
- mipsel
- powerpc
- x86_64
</details>
## Usage
To get started with `gdb-static`, simply download the build for your architecture from the [releases page](https://github.com/guyush1/gdb-static/releases/latest), extract the archive, and copy the binary to your desired platform.
> [!NOTE]
> We provide two types of builds:
> 1. Builds with Python support, which are approximately ~30 MB in size.
> 2. Slimmer builds without Python support, which are approximately ~7 MB in size.
You may choose to copy the `gdb` binary to the platform, or use `gdbserver` to debug remotely.
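For example, for the x86_64 build (the exact archive name on the releases page is indicative):

```bash
# Download the x86_64 archive from the releases page, then:
tar -xzf gdb-static-x86_64.tar.gz
./gdb --version        # or copy gdb / gdbserver to the target machine
```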
## Development
> [!NOTE]
> Before building, make sure to initialize & sync the git submodules.
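For example, to initialize and sync them recursively (standard git commands, not quoted from the repo's docs):

```bash
git submodule sync --recursive
git submodule update --init --recursive
```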
Alternatively, you can build `gdb-static` from source. To do so, follow the instructions below:
<details open>
<summary>
Pre-requisites
</summary> <br />
To be able to build `gdb-static`, you will need the following tools installed on your machine:
###
- Docker
- Docker buildx
- Git
</details>
<details open>
<summary>
Building for a specific architecture
</summary> <br />
To build `gdb-static` for a specific architecture, run the following command:
To build for a specific architecture, you can use the following command:
```bash
make build-<ARCH>
make build[-with-python]-<ARCH>
```
For example, to build for arm:
Where `<ARCH>` is the architecture you want to build for, and `-with-python` may be added in order to compile gdb with Python support.
The resulting binary will be placed in the `build/artifacts/` directory:
```bash
make build-arm
```
The resulting binaries will be placed under the `build/artifacts/` directory.
Each architecture will have its own directory under `build/artifacts/`. For example, the arm architecture will have the following directory structure:
```
build/
artifacts/
arm/
...
└── artifacts/
└── <ARCH>/
└── ...
```
### Building for all architectures
</details>
<details open>
<summary>
Building for all architectures
</summary> <br />
To build `gdb-static` for all supported architectures, run the following command:
To build for all architectures, you can use the following command:
```bash
make build
```
### Cleaning the build
The resulting binary will be placed in the `build/artifacts/` directory.
To clean the build, you can use the following command:
```bash
make clean
```
</details>
<a name="contributing_anchor"></a>
## Contributing
- Bug Report: If you see an error message or encounter an issue while using gdb-static, please create a [bug report](https://github.com/guyush1/gdb-static/issues/new?assignees=&labels=bug&title=%F0%9F%90%9B+Bug+Report%3A+).
- Feature Request: If you have an idea or if there is a capability that is missing and would make `gdb-static` more robust, please submit a [feature request](https://github.com/guyush1/gdb-static/issues/new?assignees=&labels=enhancement&title=%F0%9F%9A%80+Feature+Request%3A+).
## Contributors
<!---
npx contributor-faces --exclude "*bot*" --limit 70 --repo "https://github.com/guyush1/gdb-static"
change the height and width for each of the contributors from 80 to 50.
--->
[//]: contributor-faces
<a href="https://github.com/guyush1"><img src="https://avatars.githubusercontent.com/u/82650790?v=4" title="guyush1" width="80" height="80"></a>
<a href="https://github.com/RoiKlevansky"><img src="https://avatars.githubusercontent.com/u/78471889?v=4" title="RoiKlevansky" width="80" height="80"></a>
<a href="https://github.com/roddyrap"><img src="https://avatars.githubusercontent.com/u/37045659?v=4" title="roddyrap" width="80" height="80"></a>
[//]: contributor-faces

7
package.json Normal file
View File

@ -0,0 +1,7 @@
{
"name": "gdb-static",
"repository": {
"type": "git",
"url": "https://github.com/guyush1/gdb-static"
}
}

View File

@ -43,7 +43,7 @@ function set_compliation_variables() {
CROSS=mipsel-linux-gnu-
export HOST=mipsel-linux-gnu
elif [[ "$target_arch" == "x86_64" ]]; then
CROSS=""
CROSS=x86_64-linux-gnu-
export HOST=x86_64-linux-gnu
fi
@ -52,17 +52,23 @@ function set_compliation_variables() {
export CFLAGS="-O2"
export CXXFLAGS="-O2"
# Strip the binary to reduce its size.
export LDFLAGS="-s"
}
function set_ncurses_link_variables() {
# Set up ncurses library link variables
function set_up_lib_search_paths() {
# Set up library-related linker search paths.
#
# Parameters:
# $1: ncursesw build dir
# $2: libexpat build dir
local ncursesw_build_dir="$1"
local libexpat_build_dir="$2"
# Allow tui mode by adding our custom built static ncursesw library to the linker search path.
export LDFLAGS="-L$ncursesw_build_dir/lib $LDFLAGS"
# I) Allow tui mode by adding our custom built static ncursesw library to the linker search path.
# II) Allow parsing xml files by adding libexpat library to the linker search path.
export LDFLAGS="-L$ncursesw_build_dir/lib -L$libexpat_build_dir/lib/.libs $LDFLAGS"
}
function build_iconv() {
@ -211,12 +217,64 @@ function build_ncurses() {
popd > /dev/null
}
function build_libexpat() {
# Build libexpat.
#
# Parameters:
# $1: libexpat package directory
# $2: target architecture
#
# Echoes:
# The libexpat build directory
#
# Returns:
# 0: success
# 1: failure
local libexpat_dir="$1"
local target_arch="$2"
local libexpat_build_dir="$(realpath "$libexpat_dir/build-$target_arch")"
echo "$libexpat_build_dir"
mkdir -p "$libexpat_build_dir"
if [[ -f "$libexpat_build_dir/lib/.libs/libexpat.a" ]]; then
>&2 echo "Skipping build: libexpat already built for $target_arch"
return 0
fi
pushd "$libexpat_build_dir" > /dev/null
>&2 fancy_title "Building libexpat for $target_arch"
# Generate configure if it doesn't exist.
if [[ ! -f "$libexpat_build_dir/../expat/configure" ]]; then
>&2 ../expat/buildconf.sh ../expat/
fi
../expat/configure --enable-static "CC=$CC" "CXX=$CXX" "--host=$HOST" \
"CFLAGS=$CFLAGS" "CXXFLAGS=$CXXFLAGS" 1>&2
if [[ $? -ne 0 ]]; then
return 1
fi
make -j$(nproc) 1>&2
if [[ $? -ne 0 ]]; then
return 1
fi
>&2 fancy_title "Finished building libexpat for $target_arch"
popd > /dev/null
}
function build_python() {
# Build python.
#
# Parameters:
# $1: python package directory
# $2: target architecture
# $3: gdb's python module directory parent
# $4: pygment's toplevel source dir.
#
# Echoes:
# The python build directory
@ -226,6 +284,8 @@ function build_python() {
# 1: failure
local python_dir="$1"
local target_arch="$2"
local gdb_python_parent="$3"
local pygments_source_dir="$4"
local python_lib_dir="$(realpath "$python_dir/build-$target_arch")"
echo "$python_lib_dir"
@ -254,6 +314,17 @@ function build_python() {
--disable-ipv6 \
--disable-shared
# Extract the regular standard library modules that are to be frozen and include the gdb and pygments custom libraries.
export EXTRA_FROZEN_MODULES="$(printf "%s" "$(< ${script_dir}/frozen_python_modules.txt)" | tr $'\n' ";")"
export EXTRA_FROZEN_MODULES="${EXTRA_FROZEN_MODULES};<gdb.**.*>: gdb = ${gdb_python_parent};<pygments.**.*>: pygments = ${pygments_source_dir}"
>&2 echo "Frozen Modules: ${EXTRA_FROZEN_MODULES}"
# Regenerate frozen modules with the gdb env variable. Do it after the configure because we need
# the `regen-frozen` makefile.
>&2 python3.12 ../Tools/build/freeze_modules.py
>&2 make regen-frozen
# Build python after configuring the project and regenerating frozen files.
>&2 make -j $(nproc)
if [[ $? -ne 0 ]]; then
return 1
@ -332,6 +403,7 @@ function build_gdb() {
# $3: libiconv prefix
# $4: libgmp prefix
# $5: libmpfr prefix
# $6: whether to build with python or not
#
# Echoes:
# The gdb build directory
@ -345,7 +417,15 @@ function build_gdb() {
local libiconv_prefix="$3"
local libgmp_prefix="$4"
local libmpfr_prefix="$5"
local gdb_build_dir="$(realpath "$gdb_dir/build-$target_arch")"
local with_python="$6"
if [[ "$with_python" == "yes" ]]; then
local python_flag="--with-python=/app/gdb/build/packages/cpython-static/build-$target_arch/bin/python3-config"
local gdb_build_dir="$(realpath "$gdb_dir/build-${target_arch}_with_python")"
else
local python_flag="--without-python"
local gdb_build_dir="$(realpath "$gdb_dir/build-${target_arch}")"
fi
echo "$gdb_build_dir"
mkdir -p "$gdb_build_dir"
@ -360,11 +440,12 @@ function build_gdb() {
>&2 fancy_title "Building gdb for $target_arch"
../configure -C --enable-static --with-static-standard-libraries --disable-inprocess-agent \
--enable-tui --with-python=/app/gdb/build/packages/cpython-static/build-$target_arch/bin/python3-config \
--enable-tui "$python_flag" \
--with-expat --with-libexpat-type="static" \
"--with-libiconv-prefix=$libiconv_prefix" --with-libiconv-type=static \
"--with-gmp=$libgmp_prefix" \
"--with-mpfr=$libmpfr_prefix" \
"CC=$CC" "CXX=$CXX" "--host=$HOST" \
"CC=$CC" "CXX=$CXX" "LDFLAGS=$LDFLAGS" "--host=$HOST" \
"CFLAGS=$CFLAGS" "CXXFLAGS=$CXXFLAGS" 1>&2
if [[ $? -ne 0 ]]; then
return 1
@ -387,6 +468,7 @@ function install_gdb() {
# $1: gdb build directory
# $2: artifacts directory
# $3: target architecture
# $4: whether gdb was built with or without python
#
# Returns:
# 0: success
@ -395,15 +477,22 @@ function install_gdb() {
local gdb_build_dir="$1"
local artifacts_dir="$2"
local target_arch="$3"
local with_python="$4"
if [[ -d "$artifacts_dir/$target_arch" && -n "$(ls -A "$artifacts_dir/$target_arch")" ]]; then
if [[ "$with_python" == "yes" ]]; then
local artifacts_location="$artifacts_dir/${target_arch}_with_python"
else
local artifacts_location="$artifacts_dir/${target_arch}"
fi
if [[ -d "$artifacts_location" && -n "$(ls -A "$artifacts_location")" ]]; then
>&2 echo "Skipping install: gdb already installed for $target_arch"
return 0
fi
temp_artifacts_dir="$(mktemp -d)"
mkdir -p "$artifacts_dir/$target_arch"
mkdir -p "$artifacts_location"
make -C "$gdb_build_dir" install "DESTDIR=$temp_artifacts_dir" 1>&2
if [[ $? -ne 0 ]]; then
@ -412,7 +501,7 @@ function install_gdb() {
fi
while read file; do
cp "$file" "$artifacts_dir/$target_arch/"
cp "$file" "$artifacts_location/"
done < <(find "$temp_artifacts_dir/usr/local/bin" -type f -executable)
rm -rf "$temp_artifacts_dir"
@ -426,8 +515,9 @@ function build_and_install_gdb() {
# $2: libiconv prefix
# $3: libgmp prefix
# $4: libmpfr prefix
# $5: install directory
# $6: target architecture
# $5: whether to build with python or not
# $6: install directory
# $7: target architecture
#
# Returns:
# 0: success
@ -437,15 +527,16 @@ function build_and_install_gdb() {
local libiconv_prefix="$2"
local libgmp_prefix="$3"
local libmpfr_prefix="$4"
local artifacts_dir="$5"
local target_arch="$6"
local with_python="$5"
local artifacts_dir="$6"
local target_arch="$7"
gdb_build_dir="$(build_gdb "$gdb_dir" "$target_arch" "$libiconv_prefix" "$libgmp_prefix" "$libmpfr_prefix")"
gdb_build_dir="$(build_gdb "$gdb_dir" "$target_arch" "$libiconv_prefix" "$libgmp_prefix" "$libmpfr_prefix" "$with_python")"
if [[ $? -ne 0 ]]; then
return 1
fi
install_gdb "$gdb_build_dir" "$artifacts_dir" "$target_arch"
install_gdb "$gdb_build_dir" "$artifacts_dir" "$target_arch" "$with_python"
if [[ $? -ne 0 ]]; then
return 1
fi
@ -458,10 +549,12 @@ function build_gdb_with_dependencies() {
# $1: target architecture
# $2: build directory
# $3: src directory
# $4: whether to build gdb with python or not
local target_arch="$1"
local build_dir="$2"
local source_dir="$3"
local with_python="$4"
local packages_dir="$build_dir/packages"
local artifacts_dir="$build_dir/artifacts"
@ -491,17 +584,28 @@ function build_gdb_with_dependencies() {
if [[ $? -ne 0 ]]; then
return 1
fi
set_ncurses_link_variables "$ncursesw_build_dir"
python_build_dir="$(build_python "$packages_dir/cpython-static" "$target_arch")"
libexpat_build_dir="$(build_libexpat "$packages_dir/libexpat" "$target_arch")"
if [[ $? -ne 0 ]]; then
return 1
fi
set_up_lib_search_paths "$ncursesw_build_dir" "$libexpat_build_dir"
if [[ "$with_python" == "yes" ]]; then
local gdb_python_dir="$packages_dir/binutils-gdb/gdb/python/lib/"
local pygments_source_dir="$packages_dir/pygments/"
local python_build_dir="$(build_python "$packages_dir/cpython-static" "$target_arch" "$gdb_python_dir" "$pygments_source_dir")"
if [[ $? -ne 0 ]]; then
return 1
fi
fi
build_and_install_gdb "$packages_dir/binutils-gdb" \
"$iconv_build_dir/lib/.libs/" \
"$gmp_build_dir/.libs/" \
"$mpfr_build_dir/src/.libs/" \
"$with_python" \
"$artifacts_dir" \
"$target_arch"
if [[ $? -ne 0 ]]; then
@ -510,12 +614,17 @@ function build_gdb_with_dependencies() {
}
function main() {
if [[ $# -ne 3 ]]; then
>&2 echo "Usage: $0 <target_arch> <build_dir> <src_dir>"
if [[ $# -lt 3 ]]; then
>&2 echo "Usage: $0 <target_arch> <build_dir> <src_dir> [--with-python]"
exit 1
fi
build_gdb_with_dependencies "$1" "$2" "$3"
local with_python="no"
if [[ "$4" == "--with-python" ]]; then
with_python="yes"
fi
build_gdb_with_dependencies "$1" "$2" "$3" "$with_python"
if [[ $? -ne 0 ]]; then
>&2 echo "Error: failed to build gdb with dependencies"
exit 1

View File

@ -64,11 +64,6 @@ function download_package() {
local url="$1"
local output="$2"
if [[ -f "$output" ]]; then
>&2 echo "Skipping download: $output already exists"
return 0
fi
wget "$url" -O "$output"
if [[ $? -ne 0 ]]; then
>&2 echo "Error: failed to download $url"
@ -98,11 +93,6 @@ function extract_package() {
return 1
fi
if [[ -d "$output_dir" ]]; then
>&2 echo "Skipping extraction: $output_dir already exists"
return 0
fi
pushd "$temp_dir" > /dev/null
unpack_tarball "$tarball_realpath"
@ -113,6 +103,10 @@ function extract_package() {
popd > /dev/null
# Make sure output dir is empty, so we could move content into it.
# The directory might not exist, so we need to pass || true so that set -e won't fail us.
rm -rf "$output_dir" || true
mv "$temp_dir/$package_dir" "$output_dir"
if [[ $? -ne 0 ]]; then
return 1

View File

@ -0,0 +1,150 @@
abc
_aix_support
antigravity
argparse
ast
base64
bdb
bisect
calendar
cmd
codecs
codeop
code
<collections.**.*>
_collections_abc
colorsys
_compat_pickle
compileall
_compression
<concurrent.**.*>
configparser
contextlib
contextvars
copy
copyreg
cProfile
csv
dataclasses
datetime
<dbm.**.*>
decimal
difflib
dis
<encodings.**.*>
<ensurepip.**.*>
enum
filecmp
fileinput
fnmatch
fractions
ftplib
functools
__future__
genericpath
getopt
getpass
gettext
glob
graphlib
gzip
hashlib
heapq
hmac
imaplib
<importlib.**.*>
inspect
io
ipaddress
<json.**.*>
keyword
linecache
locale
<logging.**.*>
lzma
_markupbase
mimetypes
modulefinder
<multiprocessing.**.*>
netrc
ntpath
nturl2path
numbers
opcode
operator
optparse
os
_osx_support
pathlib
pdb
<__phello__.**.*>
pickle
pickletools
pkgutil
platform
plistlib
poplib
posixpath
pprint
profile
pstats
pty
_py_abc
pyclbr
py_compile
_pydatetime
_pydecimal
_pyio
_pylong
queue
quopri
random
<re.**.*>
reprlib
rlcompleter
sched
selectors
shelve
shlex
shutil
signal
smtplib
socket
socketserver
statistics
stat
stringprep
string
_strptime
struct
subprocess
symtable
sysconfig
tabnanny
tempfile
textwrap
this
_threading_local
threading
timeit
tokenize
token
<tomllib.**.*>
traceback
tracemalloc
trace
tty
types
typing
uuid
warnings
wave
weakref
_weakrefset
webbrowser
<wsgiref.**.*>
zipapp
<zipfile.**.*>
<zoneinfo.**.*>
<email.**.*>
<urllib.**.*>