Compare commits

..

38 Commits

Author SHA1 Message Date
Guy Shimko
56eef2a483
Merge pull request #44 from guyush1/fix-gdb-auto-load
Fix gdb auto load
2025-02-10 23:56:42 +02:00
Guy Shimko
fb1c8c064b Updated README and compilation to gdb-16 2025-02-10 00:25:35 +02:00
Guy Shimko
0aef35466f Fix gdb debug script auto-load via explicit data&debug configuration
The problem was that our default gdb datadir and debugdir had
non-standard paths, due to some unknown configuration detection.

In order to fix it, we now pass the standard paths via the --with-gdb-datadir & --with-separate-debug-dir variables.
I also set the --with-jit-reader-dir, --with-system-gdbinit and --with-system-gdbinit-dir paths to the standard locations as well.
2025-02-10 00:24:32 +02:00
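For reference, these are the path-related flags as they now appear in gdb's configure invocation (a subset excerpted from the build.sh change further down in this compare; the full invocation passes many more flags):

```bash
# Path-related configure flags added by this commit (excerpt from build.sh, see the diff below)
../configure \
    --with-gdb-datadir="/usr/share/gdb" \
    --with-separate-debug-dir="/usr/lib/debug" \
    --with-system-gdbinit="/etc/gdb/gdbinit" \
    --with-system-gdbinit-dir="/etc/gdb/gdbinit.d" \
    --with-jit-reader-dir="/usr/lib/gdb"
```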
Guy Shimko
a67de84e98
Merge pull request #43 from guyush1/fix-python-modules-on-init
Initializing gdb modules on Python init #41
2025-02-07 15:41:52 +02:00
Guy Shimko
6332ea0b03
Merge pull request #42 from guyush1/update-to-gdb-16.2
Update binutils to our static branch for gdb-16.2
2025-02-07 15:39:16 +02:00
Roddy Rappaport
cd70d531cf Initializing gdb modules on Python init #41
- Added the frozen_utils library to Python in order to expose the
  modules and submodules that were frozen with Python.
- Using frozen_utils in order to find the modules that GDB usually
  imports on init and importing them.
2025-02-07 12:15:01 +02:00
Guy Shimko
f23c33bdd3 Update binutils to our static branch for gdb-16.2
Almost no conflicts were encountered :)
2025-02-05 23:29:17 +02:00
Guy Shimko
6a31199eff
Merge pull request #40 from guyush1/improve-compilation-mdfile
docs: improve compilation manual documentation
2025-02-04 23:47:25 +02:00
Guy Shimko
646f3e2b3c docs: improve compilation manual documentation 2025-01-23 17:27:49 +02:00
Roddy Rappaport
75bc26180a Moved binutils-gdb configure changes to configure.ac #32 2025-01-22 12:28:22 +02:00
Guy Shimko
068ae4eb24
Merge pull request #37 from guyush1/hotfix/fix-python-module-generation
build: fix python frozen modules generation
2025-01-20 21:26:23 +02:00
Guy Shimko
9b43b0cf47 build: fix python frozen modules generation
The problem was due to duplicate modules that were present both in the base
frozen modules list and in the extra modules list file.

This commit should allow us to import gdb and pygments again.
2025-01-20 21:03:32 +02:00
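For context, build.sh feeds the extra modules into CPython's freeze step roughly like this (excerpted from the script change further down in this compare); the fix keeps this extra list disjoint from CPython's base frozen-module set:

```bash
# Excerpt from build_python() in build.sh (see the full diff below)
export EXTRA_FROZEN_MODULES="$(printf "%s" "$(< ${script_dir}/frozen_python_modules.txt)" | tr $'\n' ";")"
export EXTRA_FROZEN_MODULES="${EXTRA_FROZEN_MODULES};<gdb.**.*>: gdb = ${gdb_python_parent};<pygments.**.*>: pygments = ${pygments_source_dir}"
```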
Guy Shimko
1d6af45cac build: Run make build and make pack sequentially
Building multiple targets in parallel may cause docker race conditions.
2025-01-15 23:43:41 +02:00
Guy Shimko
a7efcb729e build: delete package dir before redownloading it 2025-01-15 23:43:41 +02:00
Guy Shimko
8baaffdcbf build: always download and extract tars
Skipping existing files caused failures in our CI/CD. Always downloading and
extracting the tars ensures we redownload and re-extract them if a previous
download or extraction was faulty.
2025-01-15 23:43:41 +02:00
Guy Shimko
eef9ea9215
Merge pull request #34 from guyush1/add-submodules-note
docs: Add a note about submodules initialization and sync
2025-01-14 22:33:00 +02:00
Guy Shimko
1d7b4ff428 docs: Add a note about submodules initialization and sync 2025-01-14 21:48:07 +02:00
Guy Shimko
386d09efd7
Merge pull request #29 from RoiKlevansky/redesign-readme
Redesign readme
2025-01-14 21:43:48 +02:00
Roi Klevansky
a5d4b7838e feat: add minimal package.json file for contributor-faces 2025-01-13 20:17:41 +02:00
Roi Klevansky
657600689c ref: redone README.md 2025-01-13 20:17:41 +02:00
Roi Klevansky
9e717db750 docs: add project logo 2025-01-13 17:13:20 +02:00
Guy Shimko
e97b65c6b9
Merge pull request #31 from guyush1/add-libexpat-support
build: added libexpat build support
2025-01-10 15:51:06 +02:00
Guy Shimko
d978ca9aaf build: added libexpat build support
This allows commands such as "info os files".
Previously, we had expat support on x86_64 only.
2025-01-10 15:34:56 +02:00
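As a usage illustration (these are standard GDB commands, not specific to this repository), XML/expat support is what lets GDB parse the target's OS data tables:

```
(gdb) info os           # lists the OS information tables the target can provide
(gdb) info os files     # open file descriptors per process; needs expat to parse the XML osdata
```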
Guy Shimko
9e7d1ed118
Merge pull request #30 from guyush1/external-python-gdb-lib
Compiling Pygments & dependencies in GDB
2025-01-10 15:31:25 +02:00
Roddy Rappaport
89f092efb7 Compiling Pygments & dependencies in GDB
Added Pygments to the build.

This is in order to enable GDB syntax highlighting.
2025-01-09 21:20:36 +02:00
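With Python and Pygments built in, GDB's source styling has a highlighter available. The commands below are standard GDB commands, shown only to illustrate what the Pygments integration enables in these builds:

```
(gdb) show style sources    # check whether source styling is enabled
(gdb) set style sources on  # enable it; in this build the highlighting comes from the bundled Pygments
(gdb) list main             # source listings are now colorized
```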
Guy Shimko
f7e97cac7f
Merge pull request #25 from guyush1/allow-build-with-and-without-python
build: Allow building gdb with and without python
2024-12-30 23:55:08 +02:00
Guy Shimko
6738cedefc automation: build python targets in pipeline ci-cd
done using a 2D matrix over the build type (regular or with Python) and the target architecture
2024-12-30 23:21:17 +02:00
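The matrix in question, as defined in the new PR workflow added in this compare:

```yaml
strategy:
  matrix:
    build_type: ["build", "build-with-python"]
    architecture: ["x86_64", "arm", "aarch64", "powerpc", "mips", "mipsel"]
```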
Guy Shimko
5359ff1116 build: Allow building gdb with and without python 2024-12-30 23:21:17 +02:00
Guy Shimko
17346caf10
Merge pull request #23 from guyush1/reduce-static-python-size
reduce static gdb python size
2024-12-25 23:46:36 +02:00
Guy Shimko
aa49ade8d4 Strip the executables in order to reduce their size 2024-12-25 21:35:03 +02:00
Guy Shimko
1dfe3fa6ca Reduce static-gdb size by reducing python size
Updated the python submodule.
The newer submodule will create smaller static python libraries.
2024-12-25 21:35:03 +02:00
Roddy Rappaport
c44e67540a Added X64 build prefix
There's no real reason to assume the host machine is X64.
2024-12-21 13:50:39 +02:00
Roddy Rappaport
a0ceeff014 Added parallel build to PR workflow
Using a matrix and job separation, we can make the architectures compile
in parallel with each other, hopefully reducing the time required for builds
and also simplifying the process of building a single architecture.

A problem that we encountered is that with Python the resulting packed
tars are very large. Each release is in the order of tens of megabytes.
Using artifacts in our pipeline can easily make us surpass the maximum
size limit for free GitHub accounts (500 MB).
Because of this, we use the regular non-parallel pipeline for release
builds. Releasing the version from the same job the build was performed
in allows us to directly access the build files instead of using
artifacts.

Separated release and MR pipelines.
2024-12-21 13:50:39 +02:00
Roddy Rappaport
0a60aedf76 Added submodule checkout to automation 2024-12-21 13:26:37 +02:00
Roddy Rappaport
ff0d3ad28f Added -e flag to bash build.sh
We want audible fails
2024-12-21 13:26:37 +02:00
Roddy Rappaport
c86f506e90 Add submodules to build/packages
Using symlinks, the submodules are added so that they appear just like any
downloaded, unpacked tar.

Also added a Makefile clean rule to clean the submodules, which
includes resetting the submodules to the origin branch state,
including ignored files.
2024-12-21 13:26:37 +02:00
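The corresponding Makefile rules, excerpted verbatim from the Makefile change further down in this compare:

```make
build/symlink-git-packages.stamp: $(SUBMODULE_PACKAGES)
	mkdir -p $(BUILD_PACKAGES_DIR)
	ln -sf $(addprefix /app/gdb/, $(SUBMODULE_PACKAGES)) $(BUILD_PACKAGES_DIR)/

clean-git-packages:
	git submodule foreach '[[ ! "$$sm_path" == src/submodule_packages/* ]] || git clean -xffd'
```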
Roddy Rappaport
46e8eb22a8 Added static-python and static gdb submodules 2024-12-21 13:26:37 +02:00
Guy Shimko
fa04d3a7a2 gdb with python support integration
This commit enables gdb's python support. In order to make it work, we
had to create a python fork with some patches to the build system, and
to patch gdb as well.
2024-12-21 13:26:37 +02:00
20 changed files with 1643 additions and 141 deletions

193
.github/assets/gdb-static_logo_dark.svg vendored Normal file
(new SVG logo; file diff suppressed because one or more lines are too long)

193
.github/assets/gdb-static_logo_light.svg vendored Normal file
(new SVG logo; file diff suppressed because one or more lines are too long)

25
.github/workflows/pr-pipeline.yaml vendored Normal file

@@ -0,0 +1,25 @@
name: gdb-static-pr-pipeline

on:
  pull_request:
    branches:
      - '*'

jobs:
  build:
    strategy:
      matrix:
        build_type: ["build", "build-with-python"]
        architecture: ["x86_64", "arm", "aarch64", "powerpc", "mips", "mipsel"]
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          submodules: recursive
      - name: Install dependencies
        run: sudo apt-get install -y wget
      - name: Build
        run: make ${{ matrix.build_type }}-${{ matrix.architecture }} -j$((`nproc`+1))


@@ -1,19 +1,19 @@
-name: gdb-static-pipeline
+name: gdb-static-release-pipeline
 on:
-  pull_request:
-    branches:
-      - '*'
   push:
     tags:
      - 'v*'
+# Use a non-parallel single job pipeline because artifacts weigh too much. Instead,
+# simply build the files in the same job they are released.
 jobs:
-  build:
+  build_and_publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
+        with:
+          submodules: recursive
      - name: Install dependencies
        run: sudo apt-get install -y wget
@@ -24,14 +24,7 @@ jobs:
      - name: Pack
        run: make pack
-      - name: Upload artifact
-        uses: actions/upload-artifact@v4
-        with:
-          name: gdb-static
-          path: build/artifacts/gdb-static*.tar.gz
      - name: Publish release
-        if: github.event_name == 'push'
        uses: softprops/action-gh-release@v2
        with:
          files: build/artifacts/gdb-static*.tar.gz

14
.gitmodules vendored Normal file

@@ -0,0 +1,14 @@
[submodule "cpython-static"]
	path = src/submodule_packages/cpython-static
	url = git@github.com:guyush1/cpython-static.git
	branch = python3.12-static
[submodule "binutils-gdb-static"]
	path = src/submodule_packages/binutils-gdb
	url = git@github.com:guyush1/binutils-gdb.git
	branch = gdb-static
[submodule "src/submodule_packages/pygments"]
	path = src/submodule_packages/pygments
	url = git@github.com:pygments/pygments.git
[submodule "src/submodule_packages/libexpat"]
	path = src/submodule_packages/libexpat
	url = git@github.com:guyush1/libexpat.git


@@ -19,6 +19,7 @@ RUN apt update && apt install -y \
     gcc-powerpc-linux-gnu \
     git \
     libncurses-dev \
+    libtool \
     m4 \
     make \
     patch \

@@ -1,15 +1,26 @@
 ARCHS := x86_64 arm aarch64 powerpc mips mipsel
+
 TARGETS := $(addprefix build-, $(ARCHS))
+PYTHON_TARGETS := $(addprefix build-with-python-, $(ARCHS))
+ALL_TARGETS := $(TARGETS) $(PYTHON_TARGETS)
+
 PACK_TARGETS := $(addprefix pack-, $(ARCHS))
-.PHONY: clean help download_packages build build-docker-image $(TARGETS) $(PACK_TARGETS)
+PYTHON_PACK_TARGETS := $(addprefix pack-with-python-, $(ARCHS))
+ALL_PACK_TARGETS := $(PACK_TARGETS) $(PYTHON_PACK_TARGETS)
+
+SUBMODULE_PACKAGES := $(wildcard src/submodule_packages/*)
+BUILD_PACKAGES_DIR := "build/packages"
+
+.PHONY: clean help download_packages build build-docker-image $(ALL_TARGETS) $(ALL_PACK_TARGETS)
+.NOTPARALLEL: build pack

 help:
 	@echo "Usage:"
 	@echo "  make build"
 	@echo ""
-	@for target in $(TARGETS); do \
+	@for target in $(ALL_TARGETS); do \
 		echo "  $$target"; \
 	done
@@ -18,36 +29,57 @@ help:
 build/build-docker-image.stamp: Dockerfile
 	mkdir -p build
-	docker build -t gdb-static .
+	docker buildx build --tag gdb-static .
 	touch build/build-docker-image.stamp

 build-docker-image: build/build-docker-image.stamp

-build/download-packages.stamp: build/build-docker-image.stamp src/download_packages.sh
-	mkdir -p build/packages
+build/download-packages.stamp: build/build-docker-image.stamp src/compilation/download_packages.sh
+	mkdir -p $(BUILD_PACKAGES_DIR)
 	docker run --user $(shell id -u):$(shell id -g) \
 		--rm --volume .:/app/gdb gdb-static env TERM=xterm-256color \
-		/app/gdb/src/download_packages.sh /app/gdb/build/packages
+		/app/gdb/src/compilation/download_packages.sh /app/gdb/$(BUILD_PACKAGES_DIR)/
 	touch build/download-packages.stamp

+build/symlink-git-packages.stamp: $(SUBMODULE_PACKAGES)
+	mkdir -p $(BUILD_PACKAGES_DIR)
+	ln -sf $(addprefix /app/gdb/, $(SUBMODULE_PACKAGES)) $(BUILD_PACKAGES_DIR)/
+
+symlink-git-packages: build/symlink-git-packages.stamp
+
 download-packages: build/download-packages.stamp

-build: $(TARGETS)
+build: $(ALL_TARGETS)

-$(TARGETS): build-%: download-packages build-docker-image
+$(TARGETS): build-%:
+	@$(MAKE) _build-$*
+
+$(PYTHON_TARGETS): build-with-python-%:
+	@WITH_PYTHON="--with-python" $(MAKE) _build-$*
+
+_build-%: symlink-git-packages download-packages build-docker-image
 	mkdir -p build
 	docker run --user $(shell id -u):$(shell id -g) \
 		--rm --volume .:/app/gdb gdb-static env TERM=xterm-256color \
-		/app/gdb/src/build.sh $* /app/gdb/build/ /app/gdb/src
+		/app/gdb/src/compilation/build.sh $* /app/gdb/build/ /app/gdb/src $(WITH_PYTHON)

-pack: $(PACK_TARGETS)
+pack: $(ALL_PACK_TARGETS)

-$(PACK_TARGETS): pack-%: build-%
-	if [ ! -f "build/artifacts/gdb-static-$*.tar.gz" ]; then \
-		tar -czf "build/artifacts/gdb-static-$*.tar.gz" -C "build/artifacts/$*" .; \
+$(PACK_TARGETS): pack-%:
+	@$(MAKE) _pack-$*
+
+$(PYTHON_PACK_TARGETS): pack-with-python-%:
+	@TAR_EXT="with-python-" ARTIFACT_EXT="_with_python" $(MAKE) _pack-$*
+
+_pack-%: build-%
+	if [ ! -f "build/artifacts/gdb-static-$(TAR_EXT)$*.tar.gz" ]; then \
+		tar -czf "build/artifacts/gdb-static-$(TAR_EXT)$*.tar.gz" -C "build/artifacts/$*$(ARTIFACT_EXT)" .; \
 	fi

-clean:
+clean-git-packages:
+	git submodule foreach '[[ ! "$$sm_path" == src/submodule_packages/* ]] || git clean -xffd'
+
+clean: clean-git-packages
 	rm -rf build
 	# Kill and remove all containers of image gdb-static
 	docker ps -a | grep -P "^[a-f0-9]+\s+gdb-static\s+" | awk '{print $$1}' | xargs docker rm -f 2>/dev/null || true

162
README.md

@@ -1,54 +1,150 @@
-# Repository of static gdb and gdbserver
-
-## **The statically compiled gdb / gdbserver binaries are avaliable to download under github releases!**
-
-link: [gdb-static github releases](https://github.com/guyush1/gdb-static/releases)
-
-## For manual gdb/gdbserver compilation instructions, have a look at the compilation.md file
-
-## Compiling gdb using docker
-
-This repository contains a dockerfile and build scripts to compile gdb and gdbserver statically for multiple architectures.
-Currently, the supported architectures are:
-- x86_64
-- arm
-- aarch64
-- powerpc (32bit)
-
-You can easily expand it to support more architectures by adding the appropriate cross compilers to the dockerfile, and other build scripts.
-
-NOTE: You don't need to interact with the dockerfile directly, as the Makefile will take care of everything for you.
-
-### Building for a specific architecture
-
-To build for a specific architecture, you can use the following command:
-```bash
-make build-<ARCH>
-```
-
-For example, to build for arm:
-
-```bash
-make build-arm
-```
-
-The resulting binaries will be placed under the `build/artifacts/` directory.
-Each architecture will have its own directory under `build/artifacts/`. For example, the arm architecture will have the following directory structure:
-```
-build/
-  artifacts/
-    arm/
-      ...
-```
-
-### Building for all architectures
-
-To build for all architectures, you can use the following command:
-```bash
-make build
-```
-
-### Cleaning the build
-
-To clean the build, you can use the following command:
-```bash
-make clean
-```
+<h1 align="center">
+  <picture>
+    <source media="(prefers-color-scheme: dark)" srcset="./.github/assets/gdb-static_logo_dark.svg">
+    <source media="(prefers-color-scheme: light)" srcset="./.github/assets/gdb-static_logo_light.svg">
+    <img src="./.github/assets/gdb-static_logo_light.svg" alt="gdb-static" width="210px">
+  </picture>
+</h1>
+
+<p align="center">
+  <i align="center">Frozen static builds of everyone's favorite debugger!🧊</i>
+</p>
+
+<h4 align="center">
+  <a href="https://github.com/guyush1/gdb-static/releases/latest">
+    <img src="https://img.shields.io/github/v/release/guyush1/gdb-static?style=flat-square" alt="release" style="height: 20px;">
+  <a href="https://github.com/guyush1/gdb-static/actions/workflows/pr-pipeline.yaml">
+    <img src="https://img.shields.io/github/actions/workflow/status/guyush1/gdb-static/pr-pipeline.yaml?style=flat-square&label=pipeline" alt="continuous integration" style="height: 20px;">
+  </a>
+  <a href="https://github.com/guyush1/gdb-static/graphs/contributors">
+    <img src="https://img.shields.io/github/contributors-anon/guyush1/gdb-static?color=yellow&style=flat-square" alt="contributors" style="height: 20px;">
+  </a>
+  <br>
+  <img src="https://img.shields.io/badge/GDB-v16.2-orange?logo=gnu&logoColor=white&style=flat-square" alt="gdb" style="height: 20px;">
+  <img src="https://img.shields.io/badge/Python-built--in-blue?logo=python&logoColor=white&style=flat-square" alt="python" style="height: 20px;">
+</h4>
+
+## TL;DR
+
+- **Download**: Get the latest release from the [releases page](https://github.com/guyush1/gdb-static/releases/latest).
+
+## Introduction
+
+Who doesn't love GDB? It's such a powerful tool, with such a great package.
+But sometimes, you run into one of these problems:
+- You can't install GDB on your machine
+- You can't install an updated version of GDB on your machine
+- Some other strange embedded reasons...
+
+This is where `gdb-static` comes in! We provide static builds of `gdb` (and `gdbserver` of course), so you can run them on any machine, without any dependencies!
+
+<details open>
+<summary>
+Features
+</summary> <br />
+
+- **Static Builds**: No dependencies, no installation, just download and run!
+- **Latest Versions**: We keep our builds up-to-date with the latest versions of GDB.
+- **Builtin Python (Optional)**: We provide builds with Python support built-in.
+- **XML Support**: Our builds come with XML support built-in, which is useful for some GDB commands.
+- **Wide Architecture Support**: We support a wide range of architectures:
+  - aarch64
+  - arm
+  - mips
+  - mipsel
+  - powerpc
+  - x86_64
+
+</details>
+
+## Usage
+
+To get started with `gdb-static`, simply download the build for your architecture from the [releases page](https://github.com/guyush1/gdb-static/releases/latest), extract the archive, and copy the binary to your desired platform.
+
+> [!NOTE]
+> We provide two types of builds:
+> 1. Builds with Python support, which are approximately ~30 MB in size.
+> 2. Slimmer builds without Python support, which are approximately ~7 MB in size.
+
+You may choose to copy the `gdb` binary to the platform, or use `gdbserver` to debug remotely.
+
+## Development
+
+> [!NOTE]
+> Before building, make sure to initialize & sync the git submodules.
+
+Alternatively, you can build `gdb-static` from source. To do so, follow the instructions below:
+
+<details open>
+<summary>
+Pre-requisites
+</summary> <br />
+
+To be able to build `gdb-static`, you will need the following tools installed on your machine:
+
+- Docker
+- Docker buildx
+- Git
+
+</details>
+
+<details open>
+<summary>
+Building for a specific architecture
+</summary> <br />
+
+To build `gdb-static` for a specific architecture, run the following command:
+
+```bash
+make build[-with-python]-<ARCH>
+```
+
+Where `<ARCH>` is the architecture you want to build for, and `-with-python` may be added in order to compile gdb with Python support.
+
+The resulting binary will be placed in the `build/artifacts/` directory:
+
+```bash
+build/
+└── artifacts/
+    └── <ARCH>/
+        └── ...
+```
+
+</details>
+
+<details open>
+<summary>
+Building for all architectures
+</summary> <br />
+
+To build `gdb-static` for all supported architectures, run the following command:
+
+```bash
+make build
+```
+
+The resulting binary will be placed in the `build/artifacts/` directory.
+
+</details>
+
+<a name="contributing_anchor"></a>
+
+## Contributing
+
+- Bug Report: If you see an error message or encounter an issue while using gdb-static, please create a [bug report](https://github.com/guyush1/gdb-static/issues/new?assignees=&labels=bug&title=%F0%9F%90%9B+Bug+Report%3A+).
+- Feature Request: If you have an idea or if there is a capability that is missing and would make `gdb-static` more robust, please submit a [feature request](https://github.com/guyush1/gdb-static/issues/new?assignees=&labels=enhancement&title=%F0%9F%9A%80+Feature+Request%3A+).
+
+## Contributors
+
+<!---
+npx contributor-faces --exclude "*bot*" --limit 70 --repo "https://github.com/guyush1/gdb-static"
+
+change the height and width for each of the contributors from 80 to 50.
+--->
+
+[//]: contributor-faces
+
+<a href="https://github.com/guyush1"><img src="https://avatars.githubusercontent.com/u/82650790?v=4" title="guyush1" width="80" height="80"></a>
+<a href="https://github.com/RoiKlevansky"><img src="https://avatars.githubusercontent.com/u/78471889?v=4" title="RoiKlevansky" width="80" height="80"></a>
+<a href="https://github.com/roddyrap"><img src="https://avatars.githubusercontent.com/u/37045659?v=4" title="roddyrap" width="80" height="80"></a>
+
+[//]: contributor-faces


@@ -1,49 +1,58 @@
 # Notes about this file - read before proceeding!
-While i have already provided the gdb/gdbserver-15 statically compiled binaries for you, some people might want to compile it to a different architecture (without our build scripts), or compile a newer version of gdb in the future :). The rest of the file contains a documentation of the compilation process, in order to help you out.
+While we have already provided the gdb/gdbserver statically compiled binaries for you, some people might want to compile it without our build scripts, or compile a newer version of gdb in the future :).
+The rest of the file contains a documentation of the compilation process, in order to help you out.
 
-## <VARAIBLES> in the script
+NOTE: The compilation guide describes the compilation process in order to create a minimal-working version of gdb. Our build-scripts also provides further capabilites to gdb, such as python and xml support, which are not documented in this file.
 
-When specifying the compilation dir throughout the compilation process (specified as <COMPILATION_DIR_PATH> in this file), DO NOT use relative pathing, or bash characters such as `~`. They will not get parsed correctly! Instead, use absolute paths only.
+## <VARAIBLES> In this file
+Environment variables are denoted by <...> throughout this file.
+Please note that when specifying a compilation dir throughout the compilation process (via the <COMPILATION_DIR_PATH> environment variable), DO NOT use relative pathing, or special bash characters such as `~`. Relative pathing / special bash characters will not get parsed correctly!
+Instead, always use absolute paths.
 
 Examples to the <VARIABLES> throughout the script:
-<CROSS_COMPILER_C> - arm-linux-gnueabi-gcc
-<CROSS_COMPILER_CPP> - arm-linux-gnueabi-g++
-<HOST_NAME> - arm-linux-gnueabi
-<COMPILATION_DIR_PATH> - /home/username/projects/libgmp-x.y.z/build-arm/
+- <CROSS_COMPILER_C> - arm-linux-gnueabi-gcc
+- <CROSS_COMPILER_CPP> - arm-linux-gnueabi-g++
+- <HOST_NAME> - arm-linux-gnueabi
+- <COMPILATION_DIR_PATH> - /home/username/projects/libgmp-x.y.z/build-arm/
 
 Environment info:
-- glibc version: 2.39-0ubuntu8.3 (NOTE: When i compiled gdb-15 using an older glibc, such as the one i had in my ubuntu-20.04 machine, i received a segfault in gdb...).
+- glibc version: 2.39-0ubuntu8.3 (NOTE: When i compiled gdb using an older glibc, such as the one i had in my ubuntu-20.04 machine, i received a segfault in gdb, so the libc version is important!).
 
 # Compiling gdb statically to the host platform
 
 ## 1) Compiling iconv
 While compiling iconv is not a must, the libc-provided iconv (a utility to convert between encodings) may fail on different architectures,
-at least in my experiance. Thus, I recommended using a custom libiconv and compiling it into gdb.
+at least in my experience.
+Thus, I recommended using a custom libiconv and compiling it into gdb.
 
 Download the source from https://github.com/roboticslibrary/libiconv.git
 Make sure to check out to a stable tag (in my case - v1.17).
 
 Work according to the following steps:
-I) run `./gitsub.sh pull`
-II) run `./autogen.sh` to create the configure script from configure.sh.
-III) create a build dir (e.g build), and then cd into it.
-IV) run `../configure --enable-static`
-V) run `cp -r ./include ./lib/.libs/`
-VI) run `mkdir ./lib/.libs/lib/`
-VII) run `cp ./lib/.libs/libiconv.a ./lib/.libs/lib/`
+1. run `./gitsub.sh pull`
+2. run `./autogen.sh` to create the configure script from configure.sh.
+3. create a build dir (e.g build), and then cd into it.
+4. run `../configure --enable-static`
+5. run `cp -r ./include ./lib/.libs/`
+6. run `mkdir ./lib/.libs/lib/`
+7. run `cp ./lib/.libs/libiconv.a ./lib/.libs/lib/`
 
 ## 2) Compiling gdb
-Clone gdb from sourceware - https://sourceware.org/git/binutils-gdb.git.
-I checked out to the 15.2 tag.
+Clone gdb from from my forked respository - https://github.com/guyush1/binutils-gdb/tree/gdb-static.
+Make sure to check out to the **gdb-static** branch - this branch contains all of the changes i had to do to the build system in order for it to compile gdb statically.
 
 Work according to the following steps:
-I) Apply my patches (gdb_static.patch). If you are not on the exact tag i used (15.2) - you might need to apply them manually, and change some stuff.
-II) create a build dir.
-III) run `../configure --enable-static --with-static-standard-libraries --disable-tui --disable-inprocess-agent --with-libiconv-prefix=<COMPILATION_DIR_PATH>/lib/.libs/ --with-libiconv-type=static`
-IV) run `make all-gdb -j$(nproc)` - for gdbserver, run `make all-gdbserver -j$(nproc)`.
+1. create a build dir.
+2. run `../configure --enable-static --with-static-standard-libraries --disable-tui --disable-inprocess-agent --with-libiconv-prefix=<COMPILATION_DIR_PATH>/lib/.libs/ --with-libiconv-type=static`
+3. run `make all-gdb -j$(nproc)` - for gdbserver, run `make all-gdbserver -j$(nproc)`.
 
 gdb will sit under gdb/gdb.
 gdbserver will sit under gdbserver/gdbserver.
@@ -63,13 +72,13 @@ Download and extract the latest edition from https://gmplib.org/.
 I used the 6.3.0 edition.
 
 Work according to the following steps:
-I) Create a build dir and cd into it.
-II) run `../configure CC=<CROSS_COMPILER_C> CXX=<CROSS_COMPILER_CPP> --enable-static --host=<HOST_NAME>`
-III) run `make -j$(nproc)`
-IV) run `mkdir ./.libs/include/`
-V) run `cp gmp.h ./.libs/include/`
-VI) run `mkdir ./.libs/lib`
-VII) run `cp ./.libs/libgmp.a ./.libs/lib`
+1. Create a build dir and cd into it.
+2. run `../configure CC=<CROSS_COMPILER_C> CXX=<CROSS_COMPILER_CPP> --enable-static --host=<HOST_NAME>`
+3. run `make -j$(nproc)`
+4. run `mkdir ./.libs/include/`
+5. run `cp gmp.h ./.libs/include/`
+6. run `mkdir ./.libs/lib`
+7. run `cp ./.libs/libgmp.a ./.libs/lib`
 
 ## 3) Compiling libmpfr
@@ -77,15 +86,16 @@ Download and extract the latest edition from https://www.mpfr.org/.
 I used the 4.2.1 edition.
 
 Work according to the following steps:
-I) Create a build dir and cd into it.
-II) run `../configure CC=<CROSS_COMPILER_C> CXX=<CROSS_COMPILER_CPP> --enable-static --with-gmp-build=<COMPILATION_DIR_PATH> --host=<HOST_NAME>`
-III) run `make -j$(nproc)`
-IV) run `mkdir ./src/.libs/lib`
-V) run `cp ./src/.libs/libmpfr.a ./src/.libs/lib`
-VI) run `mkdir ./src/.libs/include`
-VII) run `cp ../src/mpfr.h ./src/.libs/include/`
+1. Create a build dir and cd into it.
+2. run `../configure CC=<CROSS_COMPILER_C> CXX=<CROSS_COMPILER_CPP> --enable-static --with-gmp-build=<COMPILATION_DIR_PATH> --host=<HOST_NAME>`
+3. run `make -j$(nproc)`
+4. run `mkdir ./src/.libs/lib`
+5. run `cp ./src/.libs/libmpfr.a ./src/.libs/lib`
+6. run `mkdir ./src/.libs/include`
+7. run `cp ../src/mpfr.h ./src/.libs/include/`
 
 ## 4) Compiling gdb
 Work according to the same process as described under the compilation to the host platform, aside from the configure script:
-III) run `../configure --enable-static --with-static-standard-libraries --disable-tui --disable-inprocess-agent --with-libiconv-prefix=<COMPILATION_DIR_PATH>/lib/.libs/ --with-libiconv-type=static --with-gmp=<COMPILATION_DIR_PATH>/.libs/ --with-mpfr=<COMPILATION_DIR_PATH>/src/.libs/ CC=<CROSS_COMPILER_C> CXX=<CROSS_COMPILER_CPP> --host=<HOST_NAME>`
+2. run `../configure --enable-static --with-static-standard-libraries --disable-tui --disable-inprocess-agent --with-libiconv-prefix=<COMPILATION_DIR_PATH>/lib/.libs/ --with-libiconv-type=static --with-gmp=<COMPILATION_DIR_PATH>/.libs/ --with-mpfr=<COMPILATION_DIR_PATH>/src/.libs/ CC=<CROSS_COMPILER_C> CXX=<CROSS_COMPILER_CPP> --host=<HOST_NAME>`

7
package.json Normal file

@@ -0,0 +1,7 @@
{
  "name": "gdb-static",
  "repository": {
    "type": "git",
    "url": "https://github.com/guyush1/gdb-static"
  }
}


@@ -4,6 +4,9 @@
 script_dir=$(dirname "$0")
 source "$script_dir/utils.sh"
 
+# Don't want random unknown things to fail in the build procecss!
+set -e
+
 function set_compliation_variables() {
     # Set compilation variables such as which compiler to use.
     #
@@ -40,7 +43,7 @@ function set_compliation_variables() {
         CROSS=mipsel-linux-gnu-
         export HOST=mipsel-linux-gnu
     elif [[ "$target_arch" == "x86_64" ]]; then
-        CROSS=""
+        CROSS=x86_64-linux-gnu-
         export HOST=x86_64-linux-gnu
     fi
@@ -49,17 +52,23 @@ function set_compliation_variables() {
     export CFLAGS="-O2"
     export CXXFLAGS="-O2"
+
+    # Strip the binary to reduce it's size.
+    export LDFLAGS="-s"
 }
 
-function set_ncurses_link_variables() {
-    # Set up ncurses library link variables
+function set_up_lib_search_paths() {
+    # Set up library-related linker search paths.
     #
     # Parameters:
     #   $1: ncursesw build dir
+    #   $2: libexpat build dir
     local ncursesw_build_dir="$1"
+    local libexpat_build_dir="$2"
 
-    # Allow tui mode by adding our custom built static ncursesw library to the linker search path.
-    export LDFLAGS="-L$ncursesw_build_dir/lib $LDFLAGS"
+    # I) Allow tui mode by adding our custom built static ncursesw library to the linker search path.
+    # II) Allow parsing xml files by adding libexpat library to the linker search path.
+    export LDFLAGS="-L$ncursesw_build_dir/lib -L$libexpat_build_dir/lib/.libs $LDFLAGS"
 }
 
 function build_iconv() {
@@ -208,12 +217,64 @@ function build_ncurses() {
     popd > /dev/null
 }
 
+function build_libexpat() {
+    # Build libexpat.
+    #
+    # Parameters:
+    #   $1: libexpat package directory
+    #   $2: target architecture
+    #
+    # Echoes:
+    #   The libexpat build directory
+    #
+    # Returns:
+    #   0: success
+    #   1: failure
+    local libexpat_dir="$1"
+    local target_arch="$2"
+
+    local libexpat_build_dir="$(realpath "$libexpat_dir/build-$target_arch")"
+
+    echo "$libexpat_build_dir"
+
+    mkdir -p "$libexpat_build_dir"
+
+    if [[ -f "$libexpat_build_dir/lib/.libs/libexpat.a" ]]; then
+        >&2 echo "Skipping build: libexpat already built for $target_arch"
+        return 0
+    fi
+
+    pushd "$libexpat_build_dir" > /dev/null
+
+    >&2 fancy_title "Building libexpat for $target_arch"
+
+    # Generate configure if it doesnt exist.
+    if [[ ! -f "$libexpat_build_dir/../expat/configure" ]]; then
+        >&2 ../expat/buildconf.sh ../expat/
+    fi
+
+    ../expat/configure --enable-static "CC=$CC" "CXX=$CXX" "--host=$HOST" \
+        "CFLAGS=$CFLAGS" "CXXFLAGS=$CXXFLAGS" 1>&2
+    if [[ $? -ne 0 ]]; then
+        return 1
+    fi
+
+    make -j$(nproc) 1>&2
+    if [[ $? -ne 0 ]]; then
+        return 1
+    fi
+
+    >&2 fancy_title "Finished building libexpat for $target_arch"
+
+    popd > /dev/null
+}
+
 function build_python() {
     # Build python.
     #
     # Parameters:
     #   $1: python package directory
     #   $2: target architecture
+    #   $3: gdb's python module directory parent
+    #   $4: pygment's toplevel source dir.
     #
     # Echoes:
     #   The python build directory
@@ -223,6 +284,8 @@ function build_python() {
     #   1: failure
     local python_dir="$1"
     local target_arch="$2"
+    local gdb_python_parent="$3"
+    local pygments_source_dir="$4"
 
     local python_lib_dir="$(realpath "$python_dir/build-$target_arch")"
     echo "$python_lib_dir"
@@ -251,6 +314,17 @@ function build_python() {
         --disable-ipv6 \
         --disable-shared
 
+    # Extract the regular standard library modules that are to be frozen and include the gdb and pygments custom libraries.
+    export EXTRA_FROZEN_MODULES="$(printf "%s" "$(< ${script_dir}/frozen_python_modules.txt)" | tr $'\n' ";")"
+    export EXTRA_FROZEN_MODULES="${EXTRA_FROZEN_MODULES};<gdb.**.*>: gdb = ${gdb_python_parent};<pygments.**.*>: pygments = ${pygments_source_dir}"
+    >&2 echo "Frozen Modules: ${EXTRA_FROZEN_MODULES}"
+
+    # Regenerate frozen modules with gdb env varaible. Do it after the configure because we need
+    # the `regen-frozen` makefile.
+    >&2 python3.12 ../Tools/build/freeze_modules.py
+    >&2 make regen-frozen
+
+    # Build python after configuring the project and regnerating frozen files.
     >&2 make -j $(nproc)
     if [[ $? -ne 0 ]]; then
         return 1
@@ -329,6 +403,7 @@ function build_gdb() {
     #   $3: libiconv prefix
     #   $4: libgmp prefix
     #   $5: libmpfr prefix
+    #   $6: whether to build with python or not
     #
     # Echoes:
     #   The gdb build directory
@@ -342,7 +417,15 @@ function build_gdb() {
     local libiconv_prefix="$3"
     local libgmp_prefix="$4"
     local libmpfr_prefix="$5"
-    local gdb_build_dir="$(realpath "$gdb_dir/build-$target_arch")"
+    local with_python="$6"
+
+    if [[ "$with_python" == "yes" ]]; then
+        local python_flag="--with-python=/app/gdb/build/packages/cpython-static/build-$target_arch/bin/python3-config"
+        local gdb_build_dir="$(realpath "$gdb_dir/build-${target_arch}_with_python")"
+    else
+        local python_flag="--without-python"
+        local gdb_build_dir="$(realpath "$gdb_dir/build-${target_arch}")"
+    fi
 
     echo "$gdb_build_dir"
     mkdir -p "$gdb_build_dir"
@@ -357,11 +440,15 @@ function build_gdb() {
     >&2 fancy_title "Building gdb for $target_arch"
 
     ../configure -C --enable-static --with-static-standard-libraries --disable-inprocess-agent \
-        --enable-tui --with-python=/app/gdb/build/packages/cpython-static/build-$target_arch/bin/python3-config \
+        --enable-tui "$python_flag" \
+        --with-expat --with-libexpat-type="static" \
+        --with-gdb-datadir="/usr/share/gdb" --with-separate-debug-dir="/usr/lib/debug" \
+        --with-system-gdbinit="/etc/gdb/gdbinit" --with-system-gdbinit-dir="/etc/gdb/gdbinit.d" \
+        --with-jit-reader-dir="/usr/lib/gdb" \
         "--with-libiconv-prefix=$libiconv_prefix" --with-libiconv-type=static \
         "--with-gmp=$libgmp_prefix" \
         "--with-mpfr=$libmpfr_prefix" \
-        "CC=$CC" "CXX=$CXX" "--host=$HOST" \
+        "CC=$CC" "CXX=$CXX" "LDFLAGS=$LDFLAGS" "--host=$HOST" \
         "CFLAGS=$CFLAGS" "CXXFLAGS=$CXXFLAGS" 1>&2
     if [[ $? -ne 0 ]]; then
         return 1
@@ -384,6 +471,7 @@ function install_gdb() {
     #   $1: gdb build directory
     #   $2: artifacts directory
     #   $3: target architecture
+    #   $4: whether gdb was built with or without python
     #
     # Returns:
     #   0: success
@@ -392,15 +480,22 @@ function install_gdb() {
     local gdb_build_dir="$1"
     local artifacts_dir="$2"
     local target_arch="$3"
+    local with_python="$4"
 
-    if [[ -d "$artifacts_dir/$target_arch" && -n "$(ls -A "$artifacts_dir/$target_arch")" ]]; then
+    if [[ "$with_python" == "yes" ]]; then
+        local artifacts_location="$artifacts_dir/${target_arch}_with_python"
+    else
+        local artifacts_location="$artifacts_dir/${target_arch}"
+    fi
+
+    if [[ -d "$artifacts_location" && -n "$(ls -A "$artifacts_location")" ]]; then
         >&2 echo "Skipping install: gdb already installed for $target_arch"
         return 0
     fi
 
     temp_artifacts_dir="$(mktemp -d)"
-    mkdir -p "$artifacts_dir/$target_arch"
+    mkdir -p "$artifacts_location"
 
     make -C "$gdb_build_dir" install "DESTDIR=$temp_artifacts_dir" 1>&2
     if [[ $? -ne 0 ]]; then
@@ -409,7 +504,7 @@ function install_gdb() {
     fi
 
     while read file; do
-        cp "$file" "$artifacts_dir/$target_arch/"
+        cp "$file" "$artifacts_location/"
     done < <(find "$temp_artifacts_dir/usr/local/bin" -type f -executable)
 
     rm -rf "$temp_artifacts_dir"
@@ -423,8 +518,9 @@ function build_and_install_gdb() {
     #   $2: libiconv prefix
     #   $3: libgmp prefix
     #   $4: libmpfr prefix
-    #   $5: install directory
-    #   $6: target architecture
+    #   $5: whether to build with python or not
+    #   $6: install directory
+    #   $7: target architecture
     #
     # Returns:
     #   0: success
@@ -434,15 +530,16 @@ function build_and_install_gdb() {
     local libiconv_prefix="$2"
     local libgmp_prefix="$3"
     local libmpfr_prefix="$4"
-    local artifacts_dir="$5"
-    local target_arch="$6"
+    local with_python="$5"
+    local artifacts_dir="$6"
+    local target_arch="$7"
 
-    gdb_build_dir="$(build_gdb "$gdb_dir" "$target_arch" "$libiconv_prefix" "$libgmp_prefix" "$libmpfr_prefix")"
+    gdb_build_dir="$(build_gdb "$gdb_dir" "$target_arch" "$libiconv_prefix" "$libgmp_prefix" "$libmpfr_prefix" "$with_python")"
     if [[ $? -ne 0 ]]; then
         return 1
     fi
 
-    install_gdb "$gdb_build_dir" "$artifacts_dir" "$target_arch"
+    install_gdb "$gdb_build_dir" "$artifacts_dir" "$target_arch" "$with_python"
     if [[ $? -ne 0 ]]; then
         return 1
     fi
@@ -455,10 +552,12 @@ function build_gdb_with_dependencies() {
     #   $1: target architecture
     #   $2: build directory
     #   $3: src directory
+    #   $4: whether to build gdb with python or not
     local target_arch="$1"
     local build_dir="$2"
    local source_dir="$3"
+    local with_python="$4"
 
     local packages_dir="$build_dir/packages"
     local artifacts_dir="$build_dir/artifacts"
@@ -488,17 +587,28 @@ function build_gdb_with_dependencies() {
     if [[ $? -ne 0 ]]; then
         return 1
     fi
-    set_ncurses_link_variables "$ncursesw_build_dir"
 
-    python_build_dir="$(build_python "$packages_dir/cpython-static" "$target_arch")"
+    libexpat_build_dir="$(build_libexpat "$packages_dir/libexpat" "$target_arch")"
     if [[ $? -ne 0 ]]; then
         return 1
     fi
 
+    set_up_lib_search_paths "$ncursesw_build_dir" "$libexpat_build_dir"
+
+    if [[ "$with_python" == "yes" ]]; then
+        local gdb_python_dir="$packages_dir/binutils-gdb/gdb/python/lib/"
+        local pygments_source_dir="$packages_dir/pygments/"
+        local python_build_dir="$(build_python "$packages_dir/cpython-static" "$target_arch" "$gdb_python_dir" "$pygments_source_dir")"
+        if [[ $? -ne 0 ]]; then
+            return 1
+        fi
+    fi
+
     build_and_install_gdb "$packages_dir/binutils-gdb" \
         "$iconv_build_dir/lib/.libs/" \
         "$gmp_build_dir/.libs/" \
         "$mpfr_build_dir/src/.libs/" \
+        "$with_python" \
         "$artifacts_dir" \
         "$target_arch"
     if [[ $? -ne 0 ]]; then
@@ -507,12 +617,17 @@ function build_gdb_with_dependencies() {
 }
 
 function main() {
-    if [[ $# -ne 3 ]]; then
-        >&2 echo "Usage: $0 <target_arch> <build_dir> <src_dir>"
+    if [[ $# -lt 3 ]]; then
+        >&2 echo "Usage: $0 <target_arch> <build_dir> <src_dir> [--with-python]"
         exit 1
     fi
 
-    build_gdb_with_dependencies "$1" "$2" "$3"
+    local with_python="no"
+    if [[ "$4" == "--with-python" ]]; then
+        with_python="yes"
+    fi
+
+    build_gdb_with_dependencies "$1" "$2" "$3" "$with_python"
     if [[ $? -ne 0 ]]; then
         >&2 echo "Error: failed to build gdb with dependencies"
         exit 1


@@ -2,7 +2,7 @@
 # Include utils library
 script_dir=$(dirname "$0")
-. "$script_dir/utils.sh"
+source "$script_dir/utils.sh"
 
 # List of package URLs to download
 SOURCE_URLS=(
@@ -64,11 +64,6 @@ function download_package() {
     local url="$1"
     local output="$2"
 
-    if [[ -f "$output" ]]; then
-        >&2 echo "Skipping download: $output already exists"
-        return 0
-    fi
-
     wget "$url" -O "$output"
     if [[ $? -ne 0 ]]; then
         >&2 echo "Error: failed to download $url"
@@ -98,11 +93,6 @@ function extract_package() {
         return 1
     fi
 
-    if [[ -d "$output_dir" ]]; then
-        >&2 echo "Skipping extraction: $output_dir already exists"
-        return 0
-    fi
-
     pushd "$temp_dir" > /dev/null
 
     unpack_tarball "$tarball_realpath"
@@ -113,6 +103,10 @@ function extract_package() {
 
     popd > /dev/null
 
+    # Make sure output dir is empty, so we could move content into it.
+    # The directory might not exist, so we need to pass || true so that set -e won't fail us.
+    rm -rf "$output_dir" || true
+
     mv "$temp_dir/$package_dir" "$output_dir"
     if [[ $? -ne 0 ]]; then
         return 1
@@ -201,14 +195,6 @@ function download_gdb_packages() {
         fi
     done
 
-    if [[ ! -d gdb-static ]]; then
-        git clone https://github.com/guyush1/binutils-gdb.git --single-branch --branch gdb-static
-    fi
-
-    if [[ ! -d python3.12-static ]]; then
-        git clone https://github.com/guyush1/cpython-static.git --single-branch --branch python3.12-static
-    fi
-
     fancy_title "Finished downloading GDB packages"
 
     popd
 }


@@ -0,0 +1,147 @@
_aix_support
antigravity
argparse
ast
base64
bdb
bisect
calendar
cmd
codeop
code
<collections.**.*>
_collections_abc
colorsys
_compat_pickle
compileall
_compression
<concurrent.**.*>
configparser
contextlib
contextvars
copy
copyreg
cProfile
csv
dataclasses
datetime
<dbm.**.*>
decimal
difflib
dis
<encodings.**.*>
<ensurepip.**.*>
enum
filecmp
fileinput
fnmatch
fractions
ftplib
functools
__future__
genericpath
getopt
getpass
gettext
glob
graphlib
gzip
hashlib
heapq
hmac
imaplib
<importlib.**.*>
inspect
ipaddress
<json.**.*>
keyword
linecache
locale
<logging.**.*>
lzma
_markupbase
mimetypes
modulefinder
<multiprocessing.**.*>
netrc
ntpath
nturl2path
numbers
opcode
operator
optparse
os
_osx_support
pathlib
pdb
<__phello__.**.*>
pickle
pickletools
pkgutil
platform
plistlib
poplib
posixpath
pprint
profile
pstats
pty
_py_abc
pyclbr
py_compile
_pydatetime
_pydecimal
_pyio
_pylong
queue
quopri
random
<re.**.*>
reprlib
rlcompleter
sched
selectors
shelve
shlex
shutil
signal
smtplib
socket
socketserver
statistics
stat
stringprep
string
_strptime
struct
subprocess
symtable
sysconfig
tabnanny
tempfile
textwrap
this
_threading_local
threading
timeit
tokenize
token
<tomllib.**.*>
traceback
tracemalloc
trace
tty
types
typing
uuid
warnings
wave
weakref
_weakrefset
webbrowser
<wsgiref.**.*>
zipapp
<zipfile.**.*>
<zoneinfo.**.*>
<email.**.*>
<urllib.**.*>

@@ -0,0 +1 @@
Subproject commit 25f97ccb7ff1a9d7bd0e25535d749acf6e1a87be

@@ -0,0 +1 @@
Subproject commit f38630702ca74b512256880f5e80130a8733a0b5

@@ -0,0 +1 @@
Subproject commit 2691aff4304a6d7e053199c205620136481b9dd1

@@ -0,0 +1 @@
Subproject commit b583de4794e94b4dc4c2da03a7c29f462482293e