Compare commits

181 Commits

Author SHA1 Message Date
Michael B. Gale e46ed2cbd0 Merge pull request #3867 from github/update-v4.35.3-8c6e48dbe
Merge main into releases/v4
2026-05-01 15:05:28 +01:00
Michael B. Gale b73d1d1634 Add changelog entry for #3853 2026-05-01 14:09:58 +01:00
Michael B. Gale 24e0bb00a9 Reorder changelog entries 2026-05-01 14:07:12 +01:00
github-actions[bot] ec298daba7 Update changelog for v4.35.3 2026-05-01 12:57:50 +00:00
Henry Mercer 8c6e48dbe0 Merge pull request #3865 from github/update-bundle/codeql-bundle-v2.25.3
Update default bundle to 2.25.3
2026-04-30 16:07:18 +00:00
github-actions[bot] 719098349e Add changelog note 2026-04-30 15:31:49 +00:00
github-actions[bot] 2bb209555a Update default bundle to codeql-bundle-v2.25.3 2026-04-30 15:31:40 +00:00
Michael B. Gale 7851e55dc3 Merge pull request #3850 from github/mbg/private-registry/cloudsmith-gcp
Private registries: Add support for Cloudsmith and GCP OIDC configurations
2026-04-30 13:33:44 +00:00
Michael B. Gale 262a15f6cf Add generic non-printable chars test for OIDC configs 2026-04-30 14:10:36 +01:00
Michael B. Gale a6109b1c07 Merge pull request #3853 from github/mbg/start-proxy/improved-checks
Improve connection tests
2026-04-30 12:48:34 +00:00
Michael B. Gale 022ff3c73f Merge remote-tracking branch 'origin/main' into mbg/private-registry/cloudsmith-gcp 2026-04-30 13:43:29 +01:00
Michael B. Gale 0a4d574ac4 Add changelog entry 2026-04-30 13:42:29 +01:00
Michael B. Gale d1edf2e4de Improve replaces-base validation and add tests 2026-04-30 13:41:13 +01:00
Henry Mercer facd53f789 Merge pull request #3859 from github/dependabot/npm_and_yarn/ava/typescript-7.0.0
Bump @ava/typescript from 6.0.0 to 7.0.0
2026-04-30 12:30:35 +00:00
Michael B. Gale b77983290b Fix permutations comment 2026-04-30 13:28:42 +01:00
Henry Mercer fcf29e3d86 Merge pull request #3862 from github/dependabot/github_actions/dot-github/workflows/actions-minor-933f87fbf1
Bump ruby/setup-ruby from 1.301.0 to 1.305.0 in /.github/workflows in the actions-minor group across 1 directory
2026-04-30 12:17:13 +00:00
Henry Mercer 1fed3e9ba8 Merge branch 'main' into dependabot/npm_and_yarn/ava/typescript-7.0.0 2026-04-30 13:10:19 +01:00
Michael B. Gale 549683cee5 Make it clearer what the expectations for isUsernamePassword are 2026-04-30 12:49:49 +01:00
Michael B. Gale 7a6ed56219 Modify FromSchema so that optional properties are actually optional 2026-04-30 11:54:21 +01:00
Michael B. Gale 91fbc51606 Improve validateSchema comment 2026-04-30 11:46:01 +01:00
Michael B. Gale 35715ef8fe Improve typing of cloneCredential 2026-04-30 11:43:54 +01:00
Michael B. Gale bac7fdaf42 Fix linter error 2026-04-30 11:26:12 +01:00
Henry Mercer 1517969c90 Merge pull request #3837 from github/update-supported-enterprise-server-versions
Update supported GitHub Enterprise Server versions
2026-04-30 10:16:37 +00:00
github-actions[bot] f073360456 Rebuild 2026-04-29 18:02:23 +00:00
dependabot[bot] 5145c112e7 Bump ruby/setup-ruby
Bumps the actions-minor group with 1 update in the /.github/workflows directory: [ruby/setup-ruby](https://github.com/ruby/setup-ruby).


Updates `ruby/setup-ruby` from 1.301.0 to 1.305.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](https://github.com/ruby/setup-ruby/compare/4c56a21280b36d862b5fc31348f463d60bdc55d5...0cb964fd540e0a24c900370abf38a33466142735)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-version: 1.305.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-29 18:00:14 +00:00
dependabot[bot] 7108503ac6 Bump @ava/typescript from 6.0.0 to 7.0.0
Bumps [@ava/typescript](https://github.com/avajs/typescript) from 6.0.0 to 7.0.0.
- [Release notes](https://github.com/avajs/typescript/releases)
- [Commits](https://github.com/avajs/typescript/compare/v6.0.0...v7.0.0)

---
updated-dependencies:
- dependency-name: "@ava/typescript"
  dependency-version: 7.0.0
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-29 17:59:33 +00:00
Henry Mercer 4fe9b1e243 Merge pull request #3856 from github/henrymercer/overlay-add-log-group
Add log group for downloading overlay-base DB
2026-04-29 10:51:09 +00:00
Henry Mercer 56733fb5ae Add log group for downloading overlay-base DB 2026-04-28 19:00:28 +01:00
Henry Mercer 0a636086c9 Add GHES 3.21 to supported versions table 2026-04-28 15:32:55 +01:00
Henry Mercer 97be3af35a Deprecate CodeQL versions 2.19.3 and earlier 2026-04-28 15:32:55 +01:00
github-actions[bot] de303a9db5 Update supported GitHub Enterprise Server versions 2026-04-28 15:24:46 +01:00
Michael B. Gale 7a818e6977 Log disclaimer about connection tests, with link to docs 2026-04-28 13:45:53 +01:00
Michael B. Gale 30e0f4391d Use /v3/index.json for NuGet feed check 2026-04-28 13:45:52 +01:00
Henry Mercer 7c5585e5cf Merge pull request #3852 from github/henrymercer/avoid-diagnostic-collisions
Add random suffix when writing diagnostics to avoid filename collisions
2026-04-28 12:04:59 +00:00
Henry Mercer 245f6828c4 Use a counter instead of Math.random for diagnostic filename suffix 2026-04-28 12:42:42 +01:00
Henry Mercer c109008fac Add changelog note 2026-04-28 11:40:03 +01:00
Henry Mercer e73c940c9b Defensively sanitize timestamp 2026-04-28 11:40:02 +01:00
Henry Mercer cdb655d6d4 Add random suffix when writing diagnostics to avoid filename collisions 2026-04-28 11:39:40 +01:00
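The diagnostics commits above (#3852) combine a sanitized timestamp with a monotonically increasing counter so that two diagnostics written in the same instant still get distinct filenames. A minimal sketch of that scheme, with hypothetical names not taken from the action's code:

```typescript
// Counter-based suffix: unlike Math.random, this guarantees uniqueness
// within a single process even for identical timestamps.
let diagnosticCounter = 0;

function diagnosticFilename(date: Date): string {
  // Defensively sanitize the timestamp: replace anything that is not
  // filename-safe (e.g. ":" and "." in ISO timestamps) with "-".
  const timestamp = date.toISOString().replace(/[^0-9a-zA-Z]/g, "-");
  diagnosticCounter += 1;
  return `${timestamp}-${diagnosticCounter}.json`;
}
```

Two calls with the same `Date` still produce different names, which is exactly the collision the commits set out to avoid.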
Michael B. Gale 6153577cab Switch from HEAD to GET requests
Not all registry implementations support `HEAD` correctly.
2026-04-28 10:42:27 +01:00
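The commit above switches the registry connection tests from `HEAD` to `GET` because some registry implementations mishandle `HEAD`. A rough sketch of the idea, assuming a hypothetical probe helper (these names are illustrative, not the action's API):

```typescript
interface ProbeRequest {
  method: "GET";
  url: string;
  headers: Record<string, string>;
}

// Build the probe request; a real implementation would send it (e.g. with
// fetch) and treat any well-formed HTTP response, even 401, as "reachable".
function buildProbeRequest(registryUrl: string): ProbeRequest {
  return {
    method: "GET", // GET rather than HEAD: broader registry compatibility
    url: new URL(registryUrl).toString(),
    headers: { "User-Agent": "connection-test" },
  };
}
```

The `/v3/index.json` path used for the NuGet feed check above would be part of `registryUrl` here.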
Michael B. Gale 0ed734b61b Ignore test files 2026-04-25 18:36:22 +01:00
Michael B. Gale efdcb31f11 Accept replaces-base option 2026-04-25 18:36:22 +01:00
Michael B. Gale 4d2c7c6e10 Validate GCP OIDC configurations 2026-04-25 18:36:22 +01:00
Michael B. Gale 70b2658d23 Validate Cloudsmith OIDC configurations 2026-04-25 18:36:21 +01:00
Michael B. Gale 530fcb3bbf Group OIDC schemas into an array 2026-04-25 18:36:19 +01:00
Michael B. Gale 2acf81942b Add tests for getAuthConfig 2026-04-25 18:34:00 +01:00
Michael B. Gale d2a54a4507 Add schemas for basic credential types 2026-04-25 18:33:01 +01:00
Michael B. Gale bc4097bbe1 Simplify credential cloning in getAuthConfig 2026-04-25 18:23:11 +01:00
Michael B. Gale c8e26e209a Move getAuthConfig out of start-proxy.ts 2026-04-25 16:49:05 +01:00
Michael B. Gale 0752451507 Use schema/validation for existing OIDC config types 2026-04-25 16:49:05 +01:00
Michael B. Gale 243c274daf Add simple JSON schema / validation helpers 2026-04-25 15:35:50 +01:00
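The schema/validation series above (#3850) validates OIDC credential configurations, including rejecting non-printable characters. A simplified illustration of what such a helper might look like; the shape is hypothetical, not the action's real API:

```typescript
interface Schema {
  required: string[];
}

// Validate untrusted JSON against a list of required string properties.
function validateSchema(schema: Schema, value: unknown): boolean {
  if (typeof value !== "object" || value === null) return false;
  const obj = value as Record<string, unknown>;
  return schema.required.every((key) => {
    const v = obj[key];
    // Each required property must be a non-empty string with no
    // non-printable characters (cf. the generic non-printable chars test).
    return typeof v === "string" && v.length > 0 && !/[\x00-\x1f\x7f]/.test(v);
  });
}
```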
Henry Mercer 19b3a84f58 Merge pull request #3849 from github/henrymercer/simplify-diff-range-interface
Simplify `writeDiffRangeDataExtensionPack` interface
2026-04-23 20:29:05 +00:00
Henry Mercer 858a6149c1 Simplify writeDiffRangeDataExtensionPack interface 2026-04-23 16:47:15 +01:00
Henry Mercer c60c75576d Merge pull request #3848 from github/dependabot/npm_and_yarn/fast-xml-parser-5.7.1
Bump fast-xml-parser from 5.5.7 to 5.7.1
2026-04-22 23:03:27 +00:00
Henry Mercer 59aede2113 Merge pull request #3847 from github/dependabot/npm_and_yarn/uuid-14.0.0
Bump uuid from 13.0.0 to 14.0.0
2026-04-22 23:02:16 +00:00
github-actions[bot] 6c35f8607b Rebuild 2026-04-22 21:54:06 +00:00
github-actions[bot] c486cacf49 Rebuild 2026-04-22 21:53:49 +00:00
dependabot[bot] 365478cc5b Bump fast-xml-parser from 5.5.7 to 5.7.1
Bumps [fast-xml-parser](https://github.com/NaturalIntelligence/fast-xml-parser) from 5.5.7 to 5.7.1.
- [Release notes](https://github.com/NaturalIntelligence/fast-xml-parser/releases)
- [Changelog](https://github.com/NaturalIntelligence/fast-xml-parser/blob/master/CHANGELOG.md)
- [Commits](https://github.com/NaturalIntelligence/fast-xml-parser/compare/v5.5.7...v5.7.1)

---
updated-dependencies:
- dependency-name: fast-xml-parser
  dependency-version: 5.7.1
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-22 21:52:05 +00:00
dependabot[bot] f0e6490756 Bump uuid from 13.0.0 to 14.0.0
Bumps [uuid](https://github.com/uuidjs/uuid) from 13.0.0 to 14.0.0.
- [Release notes](https://github.com/uuidjs/uuid/releases)
- [Changelog](https://github.com/uuidjs/uuid/blob/main/CHANGELOG.md)
- [Commits](https://github.com/uuidjs/uuid/compare/v13.0.0...v14.0.0)

---
updated-dependencies:
- dependency-name: uuid
  dependency-version: 14.0.0
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-22 21:51:48 +00:00
Henry Mercer 860353f245 Merge pull request #3840 from github/dependabot/npm_and_yarn/npm-minor-580efa6e3b
Bump the npm-minor group across 1 directory with 3 updates
2026-04-22 20:59:20 +00:00
Henry Mercer 4fb8483ef0 Merge pull request #3835 from github/dependabot/npm_and_yarn/eslint-import-resolver-typescript-4.4.4
Bump eslint-import-resolver-typescript from 3.8.7 to 4.4.4
2026-04-22 20:33:35 +00:00
dependabot[bot] c2574efbee Bump the npm-minor group across 1 directory with 3 updates
Bumps the npm-minor group with 3 updates in the / directory: [globals](https://github.com/sindresorhus/globals), [sinon](https://github.com/sinonjs/sinon) and [typescript-eslint](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/typescript-eslint).


Updates `globals` from 17.4.0 to 17.5.0
- [Release notes](https://github.com/sindresorhus/globals/releases)
- [Commits](https://github.com/sindresorhus/globals/compare/v17.4.0...v17.5.0)

Updates `sinon` from 21.0.3 to 21.1.2
- [Release notes](https://github.com/sinonjs/sinon/releases)
- [Changelog](https://github.com/sinonjs/sinon/blob/main/docs/changelog.md)
- [Commits](https://github.com/sinonjs/sinon/compare/v21.0.3...v21.1.2)

Updates `typescript-eslint` from 8.58.1 to 8.58.2
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/typescript-eslint/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.58.2/packages/typescript-eslint)

---
updated-dependencies:
- dependency-name: globals
  dependency-version: 17.5.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm-minor
- dependency-name: sinon
  dependency-version: 21.1.2
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm-minor
- dependency-name: typescript-eslint
  dependency-version: 8.58.2
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-22 17:58:53 +00:00
Henry Mercer 4cbe7bef85 Merge pull request #3839 from github/henrymercer/workflow-run-triggers
Escape "+"s in `on.workflow_run.workflows`
2026-04-22 10:44:53 +00:00
Henry Mercer f6a5638305 Escape "+"s in on.workflow_run.workflows 2026-04-22 11:14:07 +01:00
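The commit above (#3839) escapes "+" characters in workflow names listed under `on.workflow_run.workflows`, presumably because that field is pattern-matched and "+" is a metacharacter, so names like "C++" need escaping to match literally. A sketch with an illustrative function name:

```typescript
// Escape literal plus signs so a workflow name such as "C++ analysis"
// matches itself rather than being treated as a pattern.
function escapeWorkflowName(name: string): string {
  return name.replace(/\+/g, "\\+");
}
```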
Henry Mercer 1dcdb940d5 Merge pull request #3830 from github/henrymercer/deflake
Add workflow to rerun potentially transient failures
2026-04-21 10:57:19 +00:00
Henry Mercer 0b7b740d4c Merge pull request #3831 from github/dependabot/npm_and_yarn/npm-minor-f46f1f14d7
Bump the npm-minor group across 1 directory with 2 updates
2026-04-16 11:08:29 +00:00
Henry Mercer 0ac85966ba Merge branch 'main' into dependabot/npm_and_yarn/npm-minor-f46f1f14d7 2026-04-16 11:49:39 +01:00
dependabot[bot] 5019ed041c Bump eslint-import-resolver-typescript from 3.8.7 to 4.4.4
Bumps [eslint-import-resolver-typescript](https://github.com/import-js/eslint-import-resolver-typescript) from 3.8.7 to 4.4.4.
- [Release notes](https://github.com/import-js/eslint-import-resolver-typescript/releases)
- [Changelog](https://github.com/import-js/eslint-import-resolver-typescript/blob/master/CHANGELOG.md)
- [Commits](https://github.com/import-js/eslint-import-resolver-typescript/compare/v3.8.7...v4.4.4)

---
updated-dependencies:
- dependency-name: eslint-import-resolver-typescript
  dependency-version: 4.4.4
  dependency-type: direct:development
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-15 17:58:58 +00:00
dependabot[bot] d64d81d41f Bump the npm-minor group across 1 directory with 2 updates
Bumps the npm-minor group with 2 updates in the / directory: [@eslint/compat](https://github.com/eslint/rewrite/tree/HEAD/packages/compat) and [typescript-eslint](https://github.com/typescript-eslint/typescript-eslint/tree/HEAD/packages/typescript-eslint).


Updates `@eslint/compat` from 2.0.4 to 2.0.5
- [Release notes](https://github.com/eslint/rewrite/releases)
- [Changelog](https://github.com/eslint/rewrite/blob/main/packages/compat/CHANGELOG.md)
- [Commits](https://github.com/eslint/rewrite/commits/compat-v2.0.5/packages/compat)

Updates `typescript-eslint` from 8.58.0 to 8.58.1
- [Release notes](https://github.com/typescript-eslint/typescript-eslint/releases)
- [Changelog](https://github.com/typescript-eslint/typescript-eslint/blob/main/packages/typescript-eslint/CHANGELOG.md)
- [Commits](https://github.com/typescript-eslint/typescript-eslint/commits/v8.58.1/packages/typescript-eslint)

---
updated-dependencies:
- dependency-name: "@eslint/compat"
  dependency-version: 2.0.5
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm-minor
- dependency-name: typescript-eslint
  dependency-version: 8.58.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-15 17:58:27 +00:00
Henry Mercer 6777c894e9 Merge pull request #3811 from github/henrymercer/record-all-builtin-languages
Store all built-in languages
2026-04-15 17:57:19 +00:00
Henry Mercer 79f9c0517c Merge remote-tracking branch 'origin/main' into henrymercer/record-all-builtin-languages
# Conflicts:
#	lib/start-proxy-action.js
#	src/known-language-aliases.json
2026-04-15 18:36:47 +01:00
Henry Mercer 3b3a77544b Rename job
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2026-04-15 18:34:13 +01:00
Henry Mercer 9f95de42d6 Add workflow to rerun potentially transient failures 2026-04-15 18:28:17 +01:00
Henry Mercer e2d518d895 Merge pull request #3827 from github/dependabot/npm_and_yarn/follow-redirects-1.16.0
Bump follow-redirects from 1.15.11 to 1.16.0
2026-04-15 12:47:52 +00:00
github-actions[bot] 9df9e9176e Rebuild 2026-04-15 12:20:46 +00:00
dependabot[bot] 6847a42aa8 Bump follow-redirects from 1.15.11 to 1.16.0
Bumps [follow-redirects](https://github.com/follow-redirects/follow-redirects) from 1.15.11 to 1.16.0.
- [Release notes](https://github.com/follow-redirects/follow-redirects/releases)
- [Commits](https://github.com/follow-redirects/follow-redirects/compare/v1.15.11...v1.16.0)

---
updated-dependencies:
- dependency-name: follow-redirects
  dependency-version: 1.16.0
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-15 12:18:36 +00:00
Henry Mercer f820c80d4d Merge pull request #3825 from github/mergeback/v4.35.2-to-main-95e58e9a
Mergeback v4.35.2 refs/heads/releases/v4 into main
2026-04-15 11:56:45 +00:00
github-actions[bot] ca7d6d3b79 Rebuild 2026-04-15 11:27:36 +00:00
github-actions[bot] 8d9c36a0ce Update changelog and version after v4.35.2 2026-04-15 11:24:19 +00:00
Henry Mercer 95e58e9a2c Merge pull request #3824 from github/update-v4.35.2-d2e135a73
Merge main into releases/v4
2026-04-15 12:22:51 +01:00
github-actions[bot] 6f31bfe060 Update changelog for v4.35.2 2026-04-15 10:56:23 +00:00
Henry Mercer d2e135a73a Merge pull request #3823 from github/update-bundle/codeql-bundle-v2.25.2
Update default bundle to 2.25.2
2026-04-15 10:06:23 +00:00
github-actions[bot] 60abb65df0 Add changelog note 2026-04-15 09:39:31 +00:00
github-actions[bot] 5a0a562209 Update default bundle to codeql-bundle-v2.25.2 2026-04-15 09:39:24 +00:00
Henry Mercer f8b62132ab Include experimental languages 2026-04-14 17:38:26 +01:00
Henry Mercer 65216971a1 Merge pull request #3820 from github/dependabot/github_actions/dot-github/workflows/actions-minor-cc17fecf2b
Bump the actions-minor group across 1 directory with 2 updates
2026-04-13 18:04:26 +00:00
Henry Mercer 3c45af2dd2 Merge pull request #3821 from github/dependabot/npm_and_yarn/npm-minor-345b938e93
Bump the npm-minor group across 1 directory with 6 updates
2026-04-13 17:59:04 +00:00
github-actions[bot] f1c339364c Rebuild 2026-04-13 17:31:19 +00:00
github-actions[bot] 1024fc496c Rebuild 2026-04-13 17:30:13 +00:00
dependabot[bot] 9dd4cfed96 Bump the npm-minor group across 1 directory with 6 updates
Bumps the npm-minor group with 6 updates in the / directory:

| Package | From | To |
| --- | --- | --- |
| [@octokit/plugin-retry](https://github.com/octokit/plugin-retry.js) | `8.0.3` | `8.1.0` |
| [jsonschema](https://github.com/tdegrunt/jsonschema) | `1.4.1` | `1.5.0` |
| [@eslint/compat](https://github.com/eslint/rewrite/tree/HEAD/packages/compat) | `2.0.3` | `2.0.4` |
| [@types/sinon](https://github.com/DefinitelyTyped/DefinitelyTyped/tree/HEAD/types/sinon) | `21.0.0` | `21.0.1` |
| [esbuild](https://github.com/evanw/esbuild) | `0.27.4` | `0.28.0` |
| [nock](https://github.com/nock/nock) | `14.0.11` | `14.0.12` |



Updates `@octokit/plugin-retry` from 8.0.3 to 8.1.0
- [Release notes](https://github.com/octokit/plugin-retry.js/releases)
- [Commits](https://github.com/octokit/plugin-retry.js/compare/v8.0.3...v8.1.0)

Updates `jsonschema` from 1.4.1 to 1.5.0
- [Commits](https://github.com/tdegrunt/jsonschema/commits)

Updates `@eslint/compat` from 2.0.3 to 2.0.4
- [Release notes](https://github.com/eslint/rewrite/releases)
- [Changelog](https://github.com/eslint/rewrite/blob/main/packages/compat/CHANGELOG.md)
- [Commits](https://github.com/eslint/rewrite/commits/compat-v2.0.4/packages/compat)

Updates `@types/sinon` from 21.0.0 to 21.0.1
- [Release notes](https://github.com/DefinitelyTyped/DefinitelyTyped/releases)
- [Commits](https://github.com/DefinitelyTyped/DefinitelyTyped/commits/HEAD/types/sinon)

Updates `esbuild` from 0.27.4 to 0.28.0
- [Release notes](https://github.com/evanw/esbuild/releases)
- [Changelog](https://github.com/evanw/esbuild/blob/main/CHANGELOG.md)
- [Commits](https://github.com/evanw/esbuild/compare/v0.27.4...v0.28.0)

Updates `nock` from 14.0.11 to 14.0.12
- [Release notes](https://github.com/nock/nock/releases)
- [Changelog](https://github.com/nock/nock/blob/main/CHANGELOG.md)
- [Commits](https://github.com/nock/nock/compare/v14.0.11...v14.0.12)

---
updated-dependencies:
- dependency-name: "@octokit/plugin-retry"
  dependency-version: 8.1.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: npm-minor
- dependency-name: jsonschema
  dependency-version: 1.5.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: npm-minor
- dependency-name: "@eslint/compat"
  dependency-version: 2.0.4
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm-minor
- dependency-name: "@types/sinon"
  dependency-version: 21.0.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm-minor
- dependency-name: esbuild
  dependency-version: 0.28.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm-minor
- dependency-name: nock
  dependency-version: 14.0.12
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-13 17:29:04 +00:00
dependabot[bot] c1403f094c Bump the actions-minor group across 1 directory with 2 updates
Bumps the actions-minor group with 2 updates in the /.github/workflows directory: [ruby/setup-ruby](https://github.com/ruby/setup-ruby) and [actions/create-github-app-token](https://github.com/actions/create-github-app-token).


Updates `ruby/setup-ruby` from 1.295.0 to 1.300.0
- [Release notes](https://github.com/ruby/setup-ruby/releases)
- [Changelog](https://github.com/ruby/setup-ruby/blob/master/release.rb)
- [Commits](https://github.com/ruby/setup-ruby/compare/319994f95fa847cf3fb3cd3dbe89f6dcde9f178f...4c56a21280b36d862b5fc31348f463d60bdc55d5)

Updates `actions/create-github-app-token` from 3.0.0 to 3.1.1
- [Release notes](https://github.com/actions/create-github-app-token/releases)
- [Commits](https://github.com/actions/create-github-app-token/compare/v3.0.0...v3.1.1)

---
updated-dependencies:
- dependency-name: ruby/setup-ruby
  dependency-version: 1.300.0
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions-minor
- dependency-name: actions/create-github-app-token
  dependency-version: 3.1.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
  dependency-group: actions-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-13 17:27:57 +00:00
Henry Mercer 90d7616015 Merge branch 'main' into henrymercer/record-all-builtin-languages 2026-04-13 18:00:09 +01:00
Henry Mercer 1aef4ed505 Exclude new TypeScript code from package tests
Avoid new source code changing expected output
2026-04-13 17:37:29 +01:00
Henry Mercer cb52ba6486 Refactoring: Split up script 2026-04-13 17:03:20 +01:00
Henry Mercer 7c9e131894 Add constant for builtin languages file path 2026-04-13 16:57:47 +01:00
Henry Mercer 130ab2d721 Improve JSDoc 2026-04-13 16:54:06 +01:00
Henry Mercer 8cf2dc52f9 Fix casing mismatch 2026-04-13 16:49:31 +01:00
Henry Mercer 8339b9254e Merge pull request #3819 from github/henrymercer/refactor-overlay-caching
Refactoring: Introduce `overlay/caching.ts`
2026-04-13 15:49:12 +00:00
Henry Mercer 97bcdd8c1e Move script to pr-checks directory 2026-04-13 16:49:10 +01:00
Henry Mercer e6c21da23c Refactoring: Rename KnownLanguage to BuiltInLanguage 2026-04-10 19:09:47 +01:00
Henry Mercer bad0a744dd Store all built-in languages
While we want the CodeQL Action to work with third-party language support, having a list of all built-in languages can help us create better type-level checks to ensure that we don't miss things that we want to customize for each of our built-in languages.
2026-04-10 19:09:46 +01:00
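The rationale in the commit message above, that an exhaustive list of built-in languages enables better type-level checks, can be sketched as follows. The union below is illustrative; a `Record` keyed by it makes the compiler flag any per-language table that misses an entry when a new language is added:

```typescript
// Illustrative union of built-in languages (not the action's actual list).
type BuiltInLanguage =
  | "cpp" | "csharp" | "go" | "java" | "javascript" | "python" | "ruby" | "swift";

// Compile-time exhaustiveness: omitting any key here is a type error,
// so per-language customizations cannot silently be forgotten.
const displayName: Record<BuiltInLanguage, string> = {
  cpp: "C/C++",
  csharp: "C#",
  go: "Go",
  java: "Java",
  javascript: "JavaScript",
  python: "Python",
  ruby: "Ruby",
  swift: "Swift",
};
```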
Michael B. Gale ee09113642 Merge pull request #3810 from github/mbg/ts6/fix-pr-checks
Fix `pr-checks/tsconfig.json` for TS6
2026-04-10 18:02:01 +00:00
Michael B. Gale b669eab7e3 Explicitly add pr-checks to Dependabot config 2026-04-10 16:58:30 +01:00
Henry Mercer 4e8c9ce33c Refactoring: Introduce overlay/caching.ts 2026-04-10 14:55:12 +01:00
Michael B. Gale 1cf0431149 Set module option for pr-checks/tsconfig.json 2026-04-10 13:22:36 +01:00
Michael B. Gale a26cb68cc7 Merge pull request #3807 from github/mbg/start-proxy/fix-field-names
Fix OIDC credential property names
2026-04-10 09:18:24 +00:00
Henry Mercer 60991e61ac Merge pull request #3806 from github/henrymercer/store-language-aliases
Store language aliases from linked CLI
2026-04-10 09:16:45 +00:00
Michael B. Gale 7197c2b792 Add changelog entry 2026-04-09 19:01:45 +01:00
Henry Mercer 597e12aa85 Merge pull request #3801 from github/henrymercer/swift-incompatible-os
Mark Swift incompatible OS as configuration error
2026-04-09 17:30:06 +00:00
Michael B. Gale d277a56348 Fix OIDC credential property names 2026-04-09 17:48:52 +01:00
Henry Mercer 111a537cd9 Update start-proxy Action to use known language aliases 2026-04-09 17:10:15 +01:00
Henry Mercer 51d833290e Store language aliases from linked CLI 2026-04-09 17:10:15 +01:00
Henry Mercer 5a17511bf0 Throw error on Windows too 2026-04-09 16:52:50 +01:00
Henry Mercer 43d8420a42 Do not run Swift in debug artifacts after failure check 2026-04-09 15:18:51 +01:00
Henry Mercer 76a687e1d8 Merge pull request #3804 from github/dependabot/npm_and_yarn/npm-minor-e84c604a08
Bump eslint-plugin-jsdoc from 62.8.1 to 62.9.0 in the npm-minor group
2026-04-09 13:04:00 +00:00
dependabot[bot] 751f3e2f7c Bump eslint-plugin-jsdoc from 62.8.1 to 62.9.0 in the npm-minor group
Bumps the npm-minor group with 1 update: [eslint-plugin-jsdoc](https://github.com/gajus/eslint-plugin-jsdoc).


Updates `eslint-plugin-jsdoc` from 62.8.1 to 62.9.0
- [Release notes](https://github.com/gajus/eslint-plugin-jsdoc/releases)
- [Commits](https://github.com/gajus/eslint-plugin-jsdoc/compare/v62.8.1...v62.9.0)

---
updated-dependencies:
- dependency-name: eslint-plugin-jsdoc
  dependency-version: 62.9.0
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: npm-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-08 17:53:21 +00:00
Henry Mercer 808513f048 Update language aliases test 2026-04-08 16:38:23 +01:00
Henry Mercer e452857e57 Throw error early rather than warning 2026-04-08 16:33:19 +01:00
Mario Campos b623f5fd57 Merge pull request #3799 from github/mario-campos/test-multiple-registries
Add tests for getCredentials with multiple goproxy_servers and maven_…
2026-04-07 14:52:14 +00:00
Mario Campos 35a38985d3 Specify "Java" for a test case
Co-authored-by: Michael B. Gale <mbg@github.com>
2026-04-07 09:01:00 -05:00
Mario Campos 14ed573199 Specify "Go" for a test case
Co-authored-by: Michael B. Gale <mbg@github.com>
2026-04-07 09:01:00 -05:00
Mario Campos 43d8864b35 Run npm run lint-fix to format the code 2026-04-07 09:01:00 -05:00
Mario Campos f8aff3ad8b Add tests for getCredentials with multiple goproxy_servers and maven_repositories 2026-04-07 09:01:00 -05:00
Henry Mercer e6c83948f5 Merge pull request #3802 from github/dependabot/npm_and_yarn/lodash-4.18.1
Bump lodash from 4.17.23 to 4.18.1
2026-04-07 10:12:08 +00:00
Henry Mercer 347f0c676d Merge pull request #3803 from github/dependabot/npm_and_yarn/npm-minor-113ae615b7
Bump eslint-plugin-jsdoc from 62.8.0 to 62.8.1 in the npm-minor group across 1 directory
2026-04-07 10:08:35 +00:00
dependabot[bot] 6eed62b035 Bump eslint-plugin-jsdoc in the npm-minor group across 1 directory
Bumps the npm-minor group with 1 update in the / directory: [eslint-plugin-jsdoc](https://github.com/gajus/eslint-plugin-jsdoc).


Updates `eslint-plugin-jsdoc` from 62.8.0 to 62.8.1
- [Release notes](https://github.com/gajus/eslint-plugin-jsdoc/releases)
- [Commits](https://github.com/gajus/eslint-plugin-jsdoc/compare/v62.8.0...v62.8.1)

---
updated-dependencies:
- dependency-name: eslint-plugin-jsdoc
  dependency-version: 62.8.1
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: npm-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-07 09:36:59 +00:00
dependabot[bot] de1752b85d Bump lodash from 4.17.23 to 4.18.1
Bumps [lodash](https://github.com/lodash/lodash) from 4.17.23 to 4.18.1.
- [Release notes](https://github.com/lodash/lodash/releases)
- [Commits](https://github.com/lodash/lodash/compare/4.17.23...4.18.1)

---
updated-dependencies:
- dependency-name: lodash
  dependency-version: 4.18.1
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-04-07 09:36:15 +00:00
Henry Mercer 1065967b50 Merge pull request #3800 from github/henrymercer/typescript-6
Upgrade to TypeScript 6
2026-04-07 09:14:42 +00:00
Henry Mercer e25c0a535a Merge pull request #3795 from github/henrymercer/deprecate-trap-caching-cleanup
Deprecate TRAP cache cleanup
2026-04-07 09:14:31 +00:00
Henry Mercer 5f323cad05 Mark Swift incompatible OS as configuration error 2026-04-02 18:46:26 +01:00
Henry Mercer 212e28374b Upgrade to TypeScript 6
tsconfig changes:

- Specify `moduleResolution: bundler` since we use a bundler
- Specify `types: ["node"]` to speed up build
- Remove `alwaysStrict` as this is now deprecated
- Specify `skipLibCheck: true` to speed up build
- Specify Octokit types.d.ts path manually to address compiler not being able to find types with `moduleResolution: bundler`
2026-04-02 18:32:58 +01:00
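The tsconfig changes enumerated in the commit message above could look roughly like this (a sketch of the listed options, not the repository's actual file):

```jsonc
{
  "compilerOptions": {
    "moduleResolution": "bundler", // a bundler (esbuild) resolves modules
    "types": ["node"],             // only load Node types, speeding up builds
    "skipLibCheck": true           // skip .d.ts checking, speeding up builds
    // `alwaysStrict` removed: deprecated in TypeScript 6
  }
}
```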
Henry Mercer 36075a4980 Deprecate TRAP cache cleanup 2026-04-01 15:31:15 +01:00
Michael B. Gale 34950e1b11 Merge pull request #3792 from github/mario-campos/issue-1664
Extend start-proxy.yml to test multiple registry support
2026-04-01 13:59:59 +00:00
Henry Mercer 57ec7e1000 Merge pull request #3794 from github/henrymercer/cleanup
Python: Disable standard library extraction on GHES
2026-04-01 11:37:34 +00:00
Henry Mercer 311573e58e Add changelog note 2026-04-01 12:19:11 +01:00
Henry Mercer 1f4c852aeb Clean up Python extract stdlib feature flag 2026-04-01 12:08:06 +01:00
Michael B. Gale 2e3aaaefca Merge pull request #3787 from github/mbg/bundle/metadata
Generate and analyse esbuild bundle metadata
2026-04-01 10:29:27 +00:00
Mario Campos e2203c62cf Delete fromJSON() calls in test validation step 2026-03-31 13:19:33 -05:00
Mario Campos 7b0c5b1669 Keep validation steps named consistently 2026-03-31 12:49:07 -05:00
Mario Campos faf45e07f9 Use different maven URL for start-proxy.yml test 2026-03-31 12:44:43 -05:00
Mario Campos 8b5e60477c Use maven_repository, not maven-repository
The registry/language mapping table does not map the one with hyphens.
2026-03-31 11:36:17 -05:00
Mario Campos 99b8dd4d57 Run pr-checks/sync.sh to generate __start-proxy.yml. 2026-03-31 09:32:42 -05:00
Henry Mercer c618c9bddb Merge pull request #3789 from github/henrymercer/lower-minimum-git-if-no-submodules
Overlay: Only require Git 2.36.0 for repos that contain submodules
2026-03-31 10:10:05 +00:00
Mario Campos 9fd9b64766 Replace jq with Actions expression for proxy_urls validation
For the sake of consistency with the other pre-existing validation code.
2026-03-30 22:47:06 -05:00
Mario Campos 0c7c298b2a Extend start-proxy.yml to test multiple registry support 2026-03-30 18:35:04 -05:00
Henry Mercer a507a542a4 Test fallback when repo has no submodules 2026-03-30 15:58:58 +01:00
Henry Mercer be0a156326 Save a computation of the git root 2026-03-30 13:37:14 +01:00
Michael B. Gale f98bf5e347 Output relative to __dirname 2026-03-27 19:21:14 +00:00
Michael B. Gale 3db32b5d27 Fix outputs type 2026-03-27 19:13:22 +00:00
Michael B. Gale 4e0952a3c0 Output largest inputs 2026-03-27 19:13:02 +00:00
Henry Mercer 0592832ed8 Add changelog note 2026-03-27 18:58:05 +00:00
Henry Mercer 88a7e5118e Don't disable if we don't need the git version 2026-03-27 18:54:26 +00:00
Henry Mercer 6643a7d207 Only require Git 2.36.0 when repo contains submodules 2026-03-27 18:54:24 +00:00
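The overlay commits above (#3789) relax the Git requirement: the 2.36.0 minimum is only enforced when the checkout actually contains submodules. The gating logic might be sketched like this, with hypothetical function names:

```typescript
// Compare dotted version strings component by component.
function meetsGitVersion(version: string, minimum = "2.36.0"): boolean {
  const parse = (v: string) => v.split(".").map(Number);
  const [actual, min] = [parse(version), parse(minimum)];
  for (let i = 0; i < 3; i++) {
    const a = actual[i] ?? 0;
    const b = min[i] ?? 0;
    if (a !== b) return a > b;
  }
  return true;
}

// Repos without submodules fall back gracefully on older Git versions.
function overlayGitCheckPasses(gitVersion: string, hasSubmodules: boolean): boolean {
  return !hasSubmodules || meetsGitVersion(gitVersion);
}
```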
Michael B. Gale 47f1709a3c Add basic metadata analysis script 2026-03-27 18:19:57 +00:00
Michael B. Gale b1981a5480 Move getApiClient out of sync-checks.ts 2026-03-27 18:13:48 +00:00
Henry Mercer a899987af2 Merge pull request #3786 from github/henrymercer/faster-interactive-jobs
Move time-sensitive Actions workflows to `ubuntu-latest`
2026-03-27 18:08:16 +00:00
Michael B. Gale 4ed3c0efe6 Generate esbuild metadata file 2026-03-27 17:54:29 +00:00
Henry Mercer 191d7c6f13 Merge pull request #3783 from github/mergeback/v4.35.1-to-main-c10b8064
Mergeback v4.35.1 refs/heads/releases/v4 into main
2026-03-27 17:11:42 +00:00
Henry Mercer aa69c483cd Merge pull request #3779 from github/henrymercer/remove-unused-dependency
Remove unused `@schemastore/package` dependency
2026-03-27 17:11:32 +00:00
Henry Mercer fe775da508 Merge pull request #3780 from github/dependabot/npm_and_yarn/brace-expansion-1.1.13
Bump brace-expansion from 1.1.12 to 1.1.13
2026-03-27 17:11:18 +00:00
Henry Mercer 353802f9f2 Move time-sensitive Actions workflows to ubuntu-latest
We originally moved these to `ubuntu-slim`, but there is a significant performance difference.  Since we often find ourselves waiting on these jobs, let's use the faster runners.
2026-03-27 16:22:19 +00:00
github-actions[bot] cc7db4a1f9 Rebuild 2026-03-27 16:20:01 +00:00
github-actions[bot] 6010f9d8e2 Update changelog and version after v4.35.1 2026-03-27 16:10:47 +00:00
Henry Mercer c10b8064de Merge pull request #3782 from github/update-v4.35.1-d6d1743b8
Merge main into releases/v4
2026-03-27 16:07:37 +00:00
github-actions[bot] c5ffd06837 Update changelog for v4.35.1 2026-03-27 15:39:16 +00:00
Henry Mercer d6d1743b8e Merge pull request #3781 from github/henrymercer/update-git-minimum-version
Update minimum Git version for overlay to 2.36.0
2026-03-27 14:59:36 +00:00
github-actions[bot] 999119ba45 Rebuild 2026-03-27 14:00:54 +00:00
Henry Mercer 65d2efa733 Add changelog note 2026-03-27 14:00:27 +00:00
Henry Mercer 2437b20ab3 Update minimum git version for overlay to 2.36.0 2026-03-27 14:00:17 +00:00
dependabot[bot] f13c600724 Bump brace-expansion from 1.1.12 to 1.1.13
Bumps [brace-expansion](https://github.com/juliangruber/brace-expansion) from 1.1.12 to 1.1.13.
- [Release notes](https://github.com/juliangruber/brace-expansion/releases)
- [Commits](https://github.com/juliangruber/brace-expansion/compare/v1.1.12...v1.1.13)

---
updated-dependencies:
- dependency-name: brace-expansion
  dependency-version: 1.1.13
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-03-27 13:58:43 +00:00
Henry Mercer 7dcea06663 Remove unused @schemastore/package dependency 2026-03-27 13:57:52 +00:00
Michael B. Gale ea5f71947c Merge pull request #3775 from github/dependabot/npm_and_yarn/node-forge-1.4.0
Bump node-forge from 1.3.3 to 1.4.0
2026-03-27 13:47:55 +00:00
Henry Mercer 45ceeea896 Merge pull request #3777 from github/mergeback/v4.35.0-to-main-b8bb9f28
Mergeback v4.35.0 refs/heads/releases/v4 into main
2026-03-27 13:36:14 +00:00
github-actions[bot] 24448c9843 Rebuild 2026-03-27 12:23:25 +00:00
github-actions[bot] 7c51060631 Update changelog and version after v4.35.0 2026-03-27 12:14:07 +00:00
Óscar San José b8bb9f28b8 Merge pull request #3776 from github/update-v4.35.0-0078ad667
Merge main into releases/v4
2026-03-27 13:11:18 +01:00
github-actions[bot] e9cf68bb33 Update changelog for v4.35.0 2026-03-27 11:44:34 +00:00
github-actions[bot] 36791d8d66 Rebuild 2026-03-27 10:27:12 +00:00
dependabot[bot] 22eba96a28 Bump node-forge from 1.3.3 to 1.4.0
Bumps [node-forge](https://github.com/digitalbazaar/forge) from 1.3.3 to 1.4.0.
- [Changelog](https://github.com/digitalbazaar/forge/blob/main/CHANGELOG.md)
- [Commits](https://github.com/digitalbazaar/forge/compare/v1.3.3...v1.4.0)

---
updated-dependencies:
- dependency-name: node-forge
  dependency-version: 1.4.0
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2026-03-27 10:25:06 +00:00
Óscar San José 0078ad667e Merge pull request #3773 from github/update-bundle/codeql-bundle-v2.25.1
Update default bundle to 2.25.1
2026-03-27 10:02:52 +00:00
github-actions[bot] fa7a15b909 Add changelog note 2026-03-27 09:43:23 +00:00
github-actions[bot] 8c29faa7ab Update default bundle to codeql-bundle-v2.25.1 2026-03-27 09:43:12 +00:00
102 changed files with 25478 additions and 20123 deletions
+3 -1
@@ -1,5 +1,5 @@
name: "CodeQL config"
queries:
queries:
- name: Run custom queries
uses: ./queries
# Run all extra query suites, both because we want to
@@ -13,3 +13,5 @@ queries:
paths-ignore:
- lib
- tests
- "**/*.test.ts"
- "**/testing-util.ts"
+3 -1
@@ -1,7 +1,9 @@
version: 2
updates:
- package-ecosystem: npm
- directory: "/"
+ directories:
+ - "/"
+ - "/pr-checks"
schedule:
interval: weekly
cooldown:
+2 -2
@@ -60,12 +60,12 @@ jobs:
setup-kotlin: 'true'
- uses: ./../action/init
with:
- languages: C#,java-kotlin,swift,typescript
+ languages: C#,java-kotlin,typescript
tools: ${{ steps.prepare-test.outputs.tools-url }}
- name: 'Check languages'
run: |
- expected_languages="csharp,java,swift,javascript"
+ expected_languages="csharp,java,javascript"
actual_languages=$(jq -r '.languages | join(",")' "$RUNNER_TEMP"/config)
if [ "$expected_languages" != "$actual_languages" ]; then
+1 -1
@@ -59,7 +59,7 @@ jobs:
use-all-platform-bundle: 'false'
setup-kotlin: 'true'
- name: Set up Ruby
- uses: ruby/setup-ruby@319994f95fa847cf3fb3cd3dbe89f6dcde9f178f # v1.295.0
+ uses: ruby/setup-ruby@0cb964fd540e0a24c900370abf38a33466142735 # v1.305.0
with:
ruby-version: 2.6
- name: Install Code Scanning integration
+18 -1
@@ -71,7 +71,17 @@ jobs:
id: proxy
uses: ./../action/start-proxy
with:
- registry_secrets: '[{ "type": "nuget_feed", "url": "https://api.nuget.org/v3/index.json" }]'
+ registry_secrets: |
+   [
+     {
+       "type": "maven_repository",
+       "url": "https://repo.maven.apache.org/maven2/"
+     },
+     {
+       "type": "maven_repository",
+       "url": "https://repo1.maven.org/maven2"
+     }
+   ]
- name: Print proxy outputs
run: |
@@ -82,5 +92,12 @@ jobs:
- name: Fail if proxy outputs are not set
if: (!steps.proxy.outputs.proxy_host) || (!steps.proxy.outputs.proxy_port) || (!steps.proxy.outputs.proxy_ca_certificate) || (!steps.proxy.outputs.proxy_urls)
run: exit 1
+ - name: Fail if proxy_urls does not contain all registries
+   if: |
+     join(fromJSON(steps.proxy.outputs.proxy_urls)[*].type, ',') != 'maven_repository,maven_repository'
+     || !contains(steps.proxy.outputs.proxy_urls, 'https://repo.maven.apache.org/maven2/')
+     || !contains(steps.proxy.outputs.proxy_urls, 'https://repo1.maven.org/maven2')
+   run: exit 1
env:
CODEQL_ACTION_TEST_MODE: true
@@ -66,6 +66,7 @@ jobs:
uses: ./../action/.github/actions/verify-debug-artifact-scan-completed
- uses: ./../action/init
with:
languages: cpp,csharp,go,java,javascript,python
tools: ${{ steps.prepare-test.outputs.tools-url }}
debug: true
debug-artifact-name: my-debug-artifacts
+106
@@ -0,0 +1,106 @@
# Workflow runs on main, on a release branch, or triggered as part of a merge group have already
# passed CI before being merged. Therefore, if they fail, we should make sure the failure wasn't
# transient by rerunning the failed jobs once before investigating further.
name: Deflake
on:
workflow_run:
types: [completed]
# Exclude workflows that have significant side effects, like publishing releases. It's OK to
# retry CodeQL analysis.
workflows:
- Check Expected Release Files
- Code-Scanning config CLI tests
- CodeQL action
- Manual Check - go
- "PR Check - All-platform bundle"
- "PR Check - Analysis kinds"
- "PR Check - Analyze: 'ref' and 'sha' from inputs"
- "PR Check - autobuild-action"
- "PR Check - Autobuild direct tracing (custom working directory)"
- "PR Check - Autobuild working directory"
- "PR Check - Build mode autobuild"
- "PR Check - Build mode manual"
- "PR Check - Build mode none"
- "PR Check - Build mode rollback"
- "PR Check - Bundle: Caching checks"
- "PR Check - Bundle: From nightly"
- "PR Check - Bundle: From toolcache"
- "PR Check - Bundle: Zstandard checks"
- "PR Check - C/C\\+\\+: autoinstalling dependencies (Linux)"
- "PR Check - C/C\\+\\+: autoinstalling dependencies is skipped (macOS)"
- "PR Check - C/C\\+\\+: disabling autoinstalling dependencies (Linux)"
- "PR Check - Clean up database cluster directory"
- "PR Check - CodeQL Bundle All"
- "PR Check - Config export"
- "PR Check - Config input"
- "PR Check - Custom source root"
- "PR Check - Debug artifact upload"
- "PR Check - Debug artifacts after failure"
- "PR Check - Diagnostic export"
- "PR Check - Export file baseline information"
- "PR Check - Extractor ram and threads options test"
- "PR Check - Go: Custom queries"
- "PR Check - Go: diagnostic when Go is changed after init step"
- "PR Check - Go: diagnostic when `file` is not installed"
- "PR Check - Go: tracing with autobuilder step"
- "PR Check - Go: tracing with custom build steps"
- "PR Check - Go: tracing with legacy workflow"
- "PR Check - Go: workaround for indirect tracing"
- "PR Check - Job run UUID added to SARIF"
- "PR Check - Language aliases"
- "PR Check - Local CodeQL bundle"
- "PR Check - Multi-language repository"
- "PR Check - Overlay database init fallback"
- "PR Check - Packaging: Action input"
- "PR Check - Packaging: Config and input"
- "PR Check - Packaging: Config and input passed to the CLI"
- "PR Check - Packaging: Config file"
- "PR Check - Packaging: Download using registries"
- "PR Check - Proxy test"
- "PR Check - Remote config file"
- "PR Check - Resolve environment"
- "PR Check - RuboCop multi-language"
- "PR Check - Ruby analysis"
- "PR Check - Rust analysis"
- "PR Check - Split workflow"
- "PR Check - Start proxy"
- "PR Check - Submit SARIF after failure"
- "PR Check - Swift analysis using a custom build command"
- "PR Check - Swift analysis using autobuild"
- "PR Check - Test different uses of `upload-sarif`"
- "PR Check - Test unsetting environment variables"
- "PR Check - Upload-sarif: ref and sha from inputs"
- "PR Check - Use a custom `checkout_path`"
- PR Checks
- Query filters tests
- Test that the workaround for python 3.12 on windows works
jobs:
rerun-on-failure:
name: Rerun failed jobs
if: >-
github.event.workflow_run.conclusion == 'failure' &&
github.event.workflow_run.run_attempt == 1 &&
(
github.event.workflow_run.head_branch == 'main' ||
startsWith(github.event.workflow_run.head_branch, 'releases/') ||
github.event.workflow_run.event == 'merge_group'
)
runs-on: ubuntu-slim
permissions:
actions: write
steps:
- name: Rerun failed jobs in ${{ github.event.workflow_run.name }}
env:
GH_TOKEN: ${{ github.token }}
GH_REPO: ${{ github.repository }}
RUN_ID: ${{ github.event.workflow_run.id }}
RUN_NAME: ${{ github.event.workflow_run.name }}
RUN_URL: ${{ github.event.workflow_run.html_url }}
run: |
echo "Rerunning failed jobs for workflow run ${RUN_ID}"
gh run rerun "${RUN_ID}" --failed
echo "### Reran failed jobs :recycle:" >> "$GITHUB_STEP_SUMMARY"
echo "" >> "$GITHUB_STEP_SUMMARY"
echo "Workflow: [${RUN_NAME}](${RUN_URL})" >> "$GITHUB_STEP_SUMMARY"
+2 -2
@@ -24,7 +24,7 @@ defaults:
jobs:
merge-back:
- runs-on: ubuntu-slim
+ runs-on: ubuntu-latest
environment: Automation
if: github.repository == 'github/codeql-action'
env:
@@ -131,7 +131,7 @@ jobs:
echo "::endgroup::"
- name: Generate token
- uses: actions/create-github-app-token@v3.0.0
+ uses: actions/create-github-app-token@v3.1.1
id: app-token
with:
app-id: ${{ vars.AUTOMATION_APP_ID }}
+1 -1
@@ -29,7 +29,7 @@ defaults:
jobs:
prepare:
name: "Prepare release"
- runs-on: ubuntu-slim
+ runs-on: ubuntu-latest
if: github.repository == 'github/codeql-action'
permissions:
+1 -1
@@ -136,7 +136,7 @@ jobs:
- name: Generate token
if: github.event_name == 'workflow_dispatch'
- uses: actions/create-github-app-token@v3.0.0
+ uses: actions/create-github-app-token@v3.1.1
id: app-token
with:
app-id: ${{ vars.AUTOMATION_APP_ID }}
+12 -1
@@ -20,7 +20,7 @@ defaults:
jobs:
update-bundle:
if: github.event.release.prerelease && startsWith(github.event.release.tag_name, 'codeql-bundle-')
- runs-on: ubuntu-slim
+ runs-on: ubuntu-latest
permissions:
contents: write # needed to push commits
pull-requests: write # needed to create pull requests
@@ -57,6 +57,17 @@ jobs:
- name: Update bundle
uses: ./.github/actions/update-bundle
+ - name: Set up CodeQL CLI from new bundle
+   id: setup-codeql
+   uses: ./setup-codeql
+   with:
+     tools: https://github.com/github/codeql-action/releases/download/${{ github.event.release.tag_name }}/codeql-bundle-linux64.tar.gz
+ - name: Update built-in languages
+   run: npx tsx pr-checks/update-builtin-languages.ts "$CODEQL_PATH"
+   env:
+     CODEQL_PATH: ${{ steps.setup-codeql.outputs.codeql-path }}
- name: Bump Action minor version if new CodeQL minor version series
id: bump-action-version
run: |
+3 -3
@@ -26,7 +26,7 @@ jobs:
update:
timeout-minutes: 45
- runs-on: ubuntu-slim
+ runs-on: ubuntu-latest
if: github.event_name == 'workflow_dispatch'
needs: [prepare]
env:
@@ -77,7 +77,7 @@ jobs:
backport:
timeout-minutes: 45
- runs-on: ubuntu-slim
+ runs-on: ubuntu-latest
environment: Automation
needs: [prepare]
if: ${{ (github.event_name == 'push') && needs.prepare.outputs.backport_target_branches != '[]' }}
@@ -93,7 +93,7 @@ jobs:
pull-requests: write # needed to create pull request
steps:
- name: Generate token
- uses: actions/create-github-app-token@v3.0.0
+ uses: actions/create-github-app-token@v3.1.1
id: app-token
with:
app-id: ${{ vars.AUTOMATION_APP_ID }}
+2
@@ -11,3 +11,5 @@ build/
eslint.sarif
# for local incremental compilation
tsconfig.tsbuildinfo
+ # esbuild metadata file
+ meta.json
+22 -1
@@ -2,9 +2,30 @@
See the [releases page](https://github.com/github/codeql-action/releases) for the relevant changes to the CodeQL CLI and language packs.
## [UNRELEASED]
## 4.35.3 - 01 May 2026
- _Upcoming breaking change_: Add a deprecation warning for customers using CodeQL version 2.19.3 and earlier. These versions of CodeQL were discontinued on 9 April 2026 alongside GitHub Enterprise Server 3.15, and will be unsupported by the next minor release of the CodeQL Action. [#3837](https://github.com/github/codeql-action/pull/3837)
- Configurations for private registries that use Cloudsmith or GCP OIDC are now accepted. [#3850](https://github.com/github/codeql-action/pull/3850)
- Best-effort connection tests for private registries now use `GET` requests instead of `HEAD` for better compatibility with various registry implementations. For NuGet feeds, the test is now always performed against the service index. [#3853](https://github.com/github/codeql-action/pull/3853)
- Fixed a bug where two diagnostics produced within the same millisecond could overwrite each other on disk, causing one of them to be lost. [#3852](https://github.com/github/codeql-action/pull/3852)
- Update default CodeQL bundle version to [2.25.3](https://github.com/github/codeql-action/releases/tag/codeql-bundle-v2.25.3). [#3865](https://github.com/github/codeql-action/pull/3865)
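The diagnostics fix referenced in #3852 above isn't shown in this compare view. As an illustrative sketch only (the function and the file-name scheme below are assumptions, not the Action's actual code), appending a per-process counter to the timestamp is one way to keep two diagnostics written in the same millisecond from clobbering each other:

```typescript
// Hypothetical sketch: make diagnostic file names unique even when two
// diagnostics are created within the same millisecond. The per-process
// counter disambiguates identical timestamps; the naming scheme is an
// assumption, not the Action's real implementation.
let diagnosticCounter = 0;

function diagnosticFileName(now: Date = new Date()): string {
  // Strip characters that are awkward in file names from the ISO timestamp.
  const stamp = now.toISOString().replace(/[-:.]/g, "");
  return `diag-${stamp}-${diagnosticCounter++}.json`;
}
```

Two calls in the same millisecond now yield distinct names, so neither file overwrites the other on disk.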
## 4.35.2 - 15 Apr 2026
- The undocumented TRAP cache cleanup feature that could be enabled using the `CODEQL_ACTION_CLEANUP_TRAP_CACHES` environment variable is deprecated and will be removed in May 2026. If you are affected by this, we recommend disabling TRAP caching by passing the `trap-caching: false` input to the `init` Action. [#3795](https://github.com/github/codeql-action/pull/3795)
- The Git version 2.36.0 requirement for improved incremental analysis now only applies to repositories that contain submodules. [#3789](https://github.com/github/codeql-action/pull/3789)
- Python analysis on GHES no longer extracts the standard library, relying instead on models of the standard library. This should result in significantly faster extraction and analysis times, while the effect on alerts should be minimal. [#3794](https://github.com/github/codeql-action/pull/3794)
- Fixed a bug in the validation of OIDC configurations for private registries that was added in CodeQL Action 4.33.0 / 3.33.0. [#3807](https://github.com/github/codeql-action/pull/3807)
- Update default CodeQL bundle version to [2.25.2](https://github.com/github/codeql-action/releases/tag/codeql-bundle-v2.25.2). [#3823](https://github.com/github/codeql-action/pull/3823)
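The 4.35.2 entry above scopes the Git 2.36.0 requirement to repositories that contain submodules. A minimal sketch of that gate, under the assumption that we only have the `git --version` output and a submodule flag (the helper names are hypothetical; the Action's real checks differ):

```typescript
// Parse the version triple out of `git version 2.36.0` (possibly with a
// platform suffix such as "(Apple Git-143)").
function parseGitVersion(output: string): number[] | undefined {
  const match = output.match(/git version (\d+)\.(\d+)\.(\d+)/);
  return match ? match.slice(1, 4).map(Number) : undefined;
}

// Lexicographic comparison of version triples.
function meetsMinimum(version: number[], minimum: number[]): boolean {
  for (let i = 0; i < minimum.length; i++) {
    if ((version[i] ?? 0) !== minimum[i]) return (version[i] ?? 0) > minimum[i];
  }
  return true;
}

// Only enforce the minimum when the repository actually has submodules.
function gitVersionOk(output: string, hasSubmodules: boolean): boolean {
  if (!hasSubmodules) return true;
  const version = parseGitVersion(output);
  return version !== undefined && meetsMinimum(version, [2, 36, 0]);
}
```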
## 4.35.1 - 27 Mar 2026
- Fix incorrect minimum required Git version for [improved incremental analysis](https://github.com/github/roadmap/issues/1158): it should have been 2.36.0, not 2.11.0. [#3781](https://github.com/github/codeql-action/pull/3781)
## 4.35.0 - 27 Mar 2026
- Reduced the minimum Git version required for [improved incremental analysis](https://github.com/github/roadmap/issues/1158) from 2.38.0 to 2.11.0. [#3767](https://github.com/github/codeql-action/pull/3767)
- Update default CodeQL bundle version to [2.25.1](https://github.com/github/codeql-action/releases/tag/codeql-bundle-v2.25.1). [#3773](https://github.com/github/codeql-action/pull/3773)
## 4.34.1 - 20 Mar 2026
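The connection-test change from 4.35.3 above (#3853) can be sketched as a pure helper that always probes with `GET` and points NuGet feeds at their service index. The types and the `<feed>/index.json` convention here are illustrative assumptions, not the Action's actual implementation:

```typescript
// Hypothetical registry configuration shape for illustration.
interface RegistryConfig {
  type: string; // e.g. "nuget_feed", "maven_repository"
  url: string;
}

// Choose the connectivity probe for a registry: always GET (some
// registries reject or mishandle HEAD), and for NuGet feeds target the
// service index rather than the configured URL.
function probeRequest(registry: RegistryConfig): { method: "GET"; url: string } {
  let url = registry.url;
  if (registry.type === "nuget_feed" && !url.endsWith("/index.json")) {
    // NuGet V3 service indexes conventionally live at <feed>/index.json.
    url = `${url.replace(/\/$/, "")}/index.json`;
  }
  return { method: "GET", url };
}
```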
+1
@@ -72,6 +72,7 @@ We typically release new minor versions of the CodeQL Action and Bundle when a n
| Minimum CodeQL Action | Minimum CodeQL Bundle Version | GitHub Environment | Notes |
|-----------------------|-------------------------------|--------------------|-------|
| `v4.33.0` | `2.24.3` | Enterprise Server 3.21 | |
| `v4.31.10` | `2.23.9` | Enterprise Server 3.20 | |
| `v3.29.11` | `2.22.4` | Enterprise Server 3.19 | |
| `v3.28.21` | `2.21.3` | Enterprise Server 3.18 | |
+10 -3
@@ -1,4 +1,4 @@
- import { copyFile, rm } from "node:fs/promises";
+ import { copyFile, rm, writeFile } from "node:fs/promises";
import { dirname, join } from "node:path";
import { fileURLToPath } from "node:url";
@@ -64,7 +64,11 @@ const onEndPlugin = {
const context = await esbuild.context({
// Include upload-lib.ts as an entry point for use in testing environments.
- entryPoints: globSync([`${SRC_DIR}/*-action.ts`, `${SRC_DIR}/*-action-post.ts`, "src/upload-lib.ts"]),
+ entryPoints: globSync([
+   `${SRC_DIR}/*-action.ts`,
+   `${SRC_DIR}/*-action-post.ts`,
+   "src/upload-lib.ts",
+ ]),
bundle: true,
format: "cjs",
outdir: OUT_DIR,
@@ -74,7 +78,10 @@ const context = await esbuild.context({
define: {
__CODEQL_ACTION_VERSION__: JSON.stringify(pkg.version),
},
+ metafile: true,
});
- await context.rebuild();
+ const result = await context.rebuild();
+ await writeFile(join(__dirname, "meta.json"), JSON.stringify(result.metafile));
await context.dispose();
+1788 -1482
File diff suppressed because it is too large
+1574 -1261
File diff suppressed because it is too large
+1337 -1027
File diff suppressed because it is too large
+4 -4
@@ -1,6 +1,6 @@
{
- "bundleVersion": "codeql-bundle-v2.24.3",
- "cliVersion": "2.24.3",
- "priorBundleVersion": "codeql-bundle-v2.24.2",
- "priorCliVersion": "2.24.2"
+ "bundleVersion": "codeql-bundle-v2.25.3",
+ "cliVersion": "2.25.3",
+ "priorBundleVersion": "codeql-bundle-v2.25.2",
+ "priorCliVersion": "2.25.2"
}
+1880 -1572
File diff suppressed because it is too large
+1632 -1324
File diff suppressed because it is too large
+1320 -1014
File diff suppressed because it is too large
+2672 -2364
File diff suppressed because it is too large
+1010 -719
File diff suppressed because it is too large
+3572 -3176
File diff suppressed because it is too large
+1387 -1085
File diff suppressed because it is too large
+1010 -719
File diff suppressed because it is too large
+2704 -2398
File diff suppressed because it is too large
+860 -377
File diff suppressed because it is too large
+19 -20
@@ -1,18 +1,18 @@
{
"name": "codeql",
- "version": "4.34.2",
+ "version": "4.35.3",
"private": true,
"description": "CodeQL action",
"scripts": {
"_build_comment": "echo 'Run the full build so we typecheck the project and can reuse the transpiled files in npm test'",
- "build": "./scripts/check-node-modules.sh && npm run transpile && node build.mjs",
+ "build": "./scripts/check-node-modules.sh && npm run transpile && node build.mjs && npx tsx ./pr-checks/bundle-metadata.ts",
"lint": "eslint --report-unused-disable-directives --max-warnings=0 .",
"lint-ci": "SARIF_ESLINT_IGNORE_SUPPRESSED=true eslint --report-unused-disable-directives --max-warnings=0 . --format @microsoft/eslint-formatter-sarif --output-file=eslint.sarif",
"lint-fix": "eslint --report-unused-disable-directives --max-warnings=0 . --fix",
"ava": "npm run transpile && ava --verbose",
"test": "npm run ava -- src/",
"test-debug": "npm run test -- --timeout=20m",
- "transpile": "tsc --build --verbose"
+ "transpile": "tsc --build --verbose tsconfig.json"
},
"license": "MIT",
"workspaces": [
@@ -29,23 +29,22 @@
"@actions/http-client": "^3.0.0",
"@actions/io": "^2.0.0",
"@actions/tool-cache": "^3.0.1",
- "@octokit/plugin-retry": "^8.0.0",
- "@schemastore/package": "0.0.10",
+ "@octokit/plugin-retry": "^8.1.0",
"archiver": "^7.0.1",
"fast-deep-equal": "^3.1.3",
- "follow-redirects": "^1.15.11",
+ "follow-redirects": "^1.16.0",
"get-folder-size": "^5.0.0",
"https-proxy-agent": "^7.0.6",
"js-yaml": "^4.1.1",
- "jsonschema": "1.4.1",
+ "jsonschema": "1.5.0",
"long": "^5.3.2",
- "node-forge": "^1.3.3",
+ "node-forge": "^1.4.0",
"semver": "^7.7.4",
- "uuid": "^13.0.0"
+ "uuid": "^14.0.0"
},
"devDependencies": {
- "@ava/typescript": "6.0.0",
- "@eslint/compat": "^2.0.3",
+ "@ava/typescript": "7.0.0",
+ "@eslint/compat": "^2.0.5",
"@microsoft/eslint-formatter-sarif": "^3.1.0",
"@octokit/types": "^16.0.0",
"@types/archiver": "^7.0.0",
@@ -55,21 +54,21 @@
"@types/node-forge": "^1.3.14",
"@types/sarif": "^2.1.7",
"@types/semver": "^7.7.1",
- "@types/sinon": "^21.0.0",
+ "@types/sinon": "^21.0.1",
"ava": "^7.0.0",
- "esbuild": "^0.27.4",
+ "esbuild": "^0.28.0",
"eslint": "^9.39.2",
- "eslint-import-resolver-typescript": "^3.8.7",
+ "eslint-import-resolver-typescript": "^4.4.4",
"eslint-plugin-github": "^6.0.0",
"eslint-plugin-import-x": "^4.16.2",
- "eslint-plugin-jsdoc": "^62.8.0",
+ "eslint-plugin-jsdoc": "^62.9.0",
"eslint-plugin-no-async-foreach": "^0.1.1",
"glob": "^11.1.0",
- "globals": "^17.4.0",
- "nock": "^14.0.11",
- "sinon": "^21.0.3",
- "typescript": "^5.9.3",
- "typescript-eslint": "^8.57.1"
+ "globals": "^17.5.0",
+ "nock": "^14.0.12",
+ "sinon": "^21.1.2",
+ "typescript": "^6.0.2",
+ "typescript-eslint": "^8.58.2"
},
"overrides": {
"@actions/tool-cache": {
+13
@@ -0,0 +1,13 @@
import * as githubUtils from "@actions/github/lib/utils";
import { type Octokit } from "@octokit/core";
import { type PaginateInterface } from "@octokit/plugin-paginate-rest";
import { type Api } from "@octokit/plugin-rest-endpoint-methods";
/** The type of the Octokit client. */
export type ApiClient = Octokit & Api & { paginate: PaginateInterface };
/** Constructs an `ApiClient` using `token` for authentication. */
export function getApiClient(token: string): ApiClient {
const opts = githubUtils.getOctokitOptions(token);
return new githubUtils.GitHub(opts);
}
+48
View File
@@ -0,0 +1,48 @@
#!/usr/bin/env npx tsx
import * as fs from "node:fs/promises";
import { BUNDLE_METADATA_FILE } from "./config";
interface InputInfo {
bytesInOutput: number;
}
type Inputs = Record<string, InputInfo>;
interface Output {
bytes: number;
inputs: Inputs;
}
interface Metadata {
outputs: Record<string, Output>;
}
function toMB(bytes: number): string {
return `${(bytes / (1024 * 1024)).toFixed(2)}MB`;
}
async function main() {
const fileContents = await fs.readFile(BUNDLE_METADATA_FILE);
const metadata = JSON.parse(String(fileContents)) as Metadata;
for (const [outputFile, outputData] of Object.entries(
metadata.outputs,
).reverse()) {
console.info(`${outputFile}: ${toMB(outputData.bytes)}`);
for (const [inputName, inputData] of Object.entries(outputData.inputs)) {
// Ignore any inputs that make up less than 5% of the output.
const percentage = (inputData.bytesInOutput / outputData.bytes) * 100.0;
if (percentage < 5.0) continue;
console.info(` ${inputName}: ${toMB(inputData.bytesInOutput)}`);
}
}
}
// Only call `main` if this script was run directly.
if (require.main === module) {
void main();
}
+2 -2
@@ -5,12 +5,12 @@ versions:
steps:
- uses: ./../action/init
with:
- languages: C#,java-kotlin,swift,typescript
+ languages: C#,java-kotlin,typescript
tools: ${{ steps.prepare-test.outputs.tools-url }}
- name: "Check languages"
run: |
- expected_languages="csharp,java,swift,javascript"
+ expected_languages="csharp,java,javascript"
actual_languages=$(jq -r '.languages | join(",")' "$RUNNER_TEMP"/config)
if [ "$expected_languages" != "$actual_languages" ]; then
+1 -1
@@ -5,7 +5,7 @@ versions:
- default
steps:
- name: Set up Ruby
- uses: ruby/setup-ruby@319994f95fa847cf3fb3cd3dbe89f6dcde9f178f # v1.295.0
+ uses: ruby/setup-ruby@0cb964fd540e0a24c900370abf38a33466142735 # v1.305.0
with:
ruby-version: 2.6
- name: Install Code Scanning integration
+18 -1
@@ -16,7 +16,17 @@ steps:
id: proxy
uses: ./../action/start-proxy
with:
- registry_secrets: '[{ "type": "nuget_feed", "url": "https://api.nuget.org/v3/index.json" }]'
+ registry_secrets: |
+   [
+     {
+       "type": "maven_repository",
+       "url": "https://repo.maven.apache.org/maven2/"
+     },
+     {
+       "type": "maven_repository",
+       "url": "https://repo1.maven.org/maven2"
+     }
+   ]
- name: Print proxy outputs
run: |
@@ -27,3 +37,10 @@ steps:
- name: Fail if proxy outputs are not set
if: (!steps.proxy.outputs.proxy_host) || (!steps.proxy.outputs.proxy_port) || (!steps.proxy.outputs.proxy_ca_certificate) || (!steps.proxy.outputs.proxy_urls)
run: exit 1
+ - name: Fail if proxy_urls does not contain all registries
+   if: |
+     join(fromJSON(steps.proxy.outputs.proxy_urls)[*].type, ',') != 'maven_repository,maven_repository'
+     || !contains(steps.proxy.outputs.proxy_urls, 'https://repo.maven.apache.org/maven2/')
+     || !contains(steps.proxy.outputs.proxy_urls, 'https://repo1.maven.org/maven2')
+   run: exit 1
+13
@@ -8,3 +8,16 @@ export const PR_CHECKS_DIR = __dirname;
/** The path of the file configuring which checks shouldn't be required. */
export const PR_CHECK_EXCLUDED_FILE = path.join(PR_CHECKS_DIR, "excluded.yml");
/** The path to the esbuild metadata file. */
export const BUNDLE_METADATA_FILE = path.join(PR_CHECKS_DIR, "..", "meta.json");
/** The `src` directory. */
const SOURCE_ROOT = path.join(PR_CHECKS_DIR, "..", "src");
/** The path to the built-in languages file. */
export const BUILTIN_LANGUAGES_FILE = path.join(
SOURCE_ROOT,
"languages",
"builtin.json",
);
+1 -13
@@ -5,12 +5,9 @@
import * as fs from "fs";
import { parseArgs } from "node:util";
- import * as githubUtils from "@actions/github/lib/utils";
- import { type Octokit } from "@octokit/core";
- import { type PaginateInterface } from "@octokit/plugin-paginate-rest";
- import { type Api } from "@octokit/plugin-rest-endpoint-methods";
import * as yaml from "yaml";
+ import { type ApiClient, getApiClient } from "./api-client";
import {
OLDEST_SUPPORTED_MAJOR_VERSION,
PR_CHECK_EXCLUDED_FILE,
@@ -49,15 +46,6 @@ function loadExclusions(): Exclusions {
) as Exclusions;
}
- /** The type of the Octokit client. */
- type ApiClient = Octokit & Api & { paginate: PaginateInterface };
- /** Constructs an `ApiClient` using `token` for authentication. */
- function getApiClient(token: string): ApiClient {
-   const opts = githubUtils.getOctokitOptions(token);
-   return new githubUtils.GitHub(opts);
- }
/**
* Represents information about a check run. We track the `app_id` that generated the check,
* because the API will require it in addition to the name in the future.
+5 -5
@@ -5,7 +5,7 @@ import * as path from "path";
import * as yaml from "yaml";
- import { KnownLanguage } from "../src/languages";
+ import { BuiltInLanguage } from "../src/languages";
/** Known workflow input names. */
enum KnownInputName {
@@ -91,8 +91,8 @@ interface LanguageSetup {
steps: Step[];
}
- /** Describes partial mappings from known languages to their specific setup information. */
- type LanguageSetups = Partial<Record<KnownLanguage, LanguageSetup>>;
+ /** Describes partial mappings from built-in languages to their specific setup information. */
+ type LanguageSetups = Partial<Record<BuiltInLanguage, LanguageSetup>>;
// The default set of CodeQL Bundle versions to use for the PR checks.
const defaultTestVersions = [
@@ -125,7 +125,7 @@ const defaultLanguageVersions = {
java: "17",
python: "3.13",
csharp: "9.x",
- } as const satisfies Partial<Record<KnownLanguage, string>>;
+ } as const satisfies Partial<Record<BuiltInLanguage, string>>;
/** A mapping from known input names to their specifications. */
const inputSpecs: WorkflowInputs = {
@@ -364,7 +364,7 @@ function getSetupSteps(checkSpecification: JobSpecification): {
const inputs: Array<Set<KnownInputName>> = [];
const steps: Step[] = [];
- for (const language of Object.values(KnownLanguage).sort()) {
+ for (const language of Object.values(BuiltInLanguage).sort()) {
const setupSpec = languageSetups[language];
if (
+1
@@ -3,6 +3,7 @@
"compilerOptions": {
/* Basic Options */
"lib": ["esnext"],
+ "module": "preserve",
"rootDir": "..",
"sourceMap": false,
"noEmit": true,
+131
@@ -0,0 +1,131 @@
#!/usr/bin/env npx tsx
/*
* Updates src/languages/builtin.json by querying the CodeQL CLI for:
* - Languages that have default queries (via codeql-extractor.yml)
* - Language aliases (via `codeql resolve languages --format=betterjson --extractor-include-aliases`)
*
* Usage:
* npx tsx pr-checks/update-builtin-languages.ts [path-to-codeql]
*
* If no path is given, falls back to "codeql".
*/
import { execFileSync } from "node:child_process";
import * as fs from "node:fs";
import * as path from "node:path";
import * as yaml from "yaml";
import { EnvVar } from "../src/environment";
import { BUILTIN_LANGUAGES_FILE } from "./config";
/** Resolve all known language extractor directories. */
function resolveLanguages(codeqlPath: string): Record<string, string[]> {
return JSON.parse(
execFileSync(codeqlPath, ["resolve", "languages", "--format=json"], {
encoding: "utf8",
env: {
...process.env,
[EnvVar.EXPERIMENTAL_FEATURES]: "true", // include experimental languages
},
}),
) as Record<string, string[]>;
}
/**
* Return the sorted list of languages whose extractors ship default queries.
*
* @param extractorDirs - Map from language to list of extractor directories
*/
function findLanguagesWithDefaultQueries(
extractorDirs: Record<string, string[]>,
): string[] {
const languages: string[] = [];
for (const [language, dirs] of Object.entries(extractorDirs)) {
if (dirs.length !== 1) {
throw new Error(
`Expected exactly one extractor directory for language '${language}', but found ${dirs.length}: ${dirs.join(
", ",
)}`,
);
}
const extractorYmlPath = path.join(dirs[0], "codeql-extractor.yml");
if (!fs.existsSync(extractorYmlPath)) {
throw new Error(
`Extractor YAML not found for language '${language}' at expected path: ${extractorYmlPath}`,
);
}
const extractorYml = yaml.parse(fs.readFileSync(extractorYmlPath, "utf8"));
const defaultQueries: unknown[] | undefined = extractorYml.default_queries;
if (Array.isArray(defaultQueries) && defaultQueries.length > 0) {
console.log(
`${language}: included (default queries: ${JSON.stringify(defaultQueries)})`,
);
languages.push(language);
} else {
console.log(`${language}: excluded (no default queries)`);
}
}
return languages.sort();
}
/**
* Resolve language aliases from the CodeQL CLI, keeping only those whose
* target is in the given set of included languages.
*/
function resolveAliases(
codeqlPath: string,
includedLanguages: Set<string>,
): Record<string, string> {
const betterjsonOutput = JSON.parse(
execFileSync(
codeqlPath,
[
"resolve",
"languages",
"--format=betterjson",
"--extractor-include-aliases",
],
{ encoding: "utf8" },
),
);
return Object.fromEntries(
Object.entries((betterjsonOutput.aliases ?? {}) as Record<string, string>)
.filter(([, target]) => includedLanguages.has(target))
.sort(([a], [b]) => a.localeCompare(b)),
);
}
/** Write the built-in languages data to disk. */
function writeBuiltinLanguages(
languages: string[],
aliases: Record<string, string>,
): void {
const content = `${JSON.stringify({ languages, aliases }, null, 2)}\n`;
fs.mkdirSync(path.dirname(BUILTIN_LANGUAGES_FILE), { recursive: true });
fs.writeFileSync(BUILTIN_LANGUAGES_FILE, content);
console.log(`\nWrote ${BUILTIN_LANGUAGES_FILE}`);
console.log(` Languages: ${languages.join(", ")}`);
console.log(` Aliases: ${Object.keys(aliases).join(", ")}`);
}
function main(): void {
const codeqlPath = process.argv[2] || "codeql";
const extractorDirs = resolveLanguages(codeqlPath);
const languages = findLanguagesWithDefaultQueries(extractorDirs);
const aliases = resolveAliases(codeqlPath, new Set(languages));
writeBuiltinLanguages(languages, aliases);
}
main();
+11 -7
@@ -30,9 +30,9 @@ import {
} from "./dependency-caching";
import { EnvVar } from "./environment";
import { initFeatures } from "./feature-flags";
- import { KnownLanguage } from "./languages";
+ import { BuiltInLanguage } from "./languages";
import { getActionsLogger, Logger } from "./logging";
- import { cleanupAndUploadOverlayBaseDatabaseToCache } from "./overlay";
+ import { cleanupAndUploadOverlayBaseDatabaseToCache } from "./overlay/caching";
import { getRepositoryNwo } from "./repository";
import * as statusReport from "./status-report";
import {
@@ -135,9 +135,13 @@ function hasBadExpectErrorInput(): boolean {
function doesGoExtractionOutputExist(config: Config): boolean {
const golangDbDirectory = util.getCodeQLDatabasePath(
config,
KnownLanguage.go,
BuiltInLanguage.go,
);
const trapDirectory = path.join(golangDbDirectory, "trap", KnownLanguage.go);
const trapDirectory = path.join(
golangDbDirectory,
"trap",
BuiltInLanguage.go,
);
return (
fs.existsSync(trapDirectory) &&
fs
@@ -169,7 +173,7 @@ function doesGoExtractionOutputExist(config: Config): boolean {
* whether any extraction output already exists for Go.
*/
async function runAutobuildIfLegacyGoWorkflow(config: Config, logger: Logger) {
if (!config.languages.includes(KnownLanguage.go)) {
if (!config.languages.includes(BuiltInLanguage.go)) {
return;
}
if (config.buildMode) {
@@ -182,7 +186,7 @@ async function runAutobuildIfLegacyGoWorkflow(config: Config, logger: Logger) {
logger.debug("Won't run Go autobuild since it has already been run.");
return;
}
if (dbIsFinalized(config, KnownLanguage.go, logger)) {
if (dbIsFinalized(config, BuiltInLanguage.go, logger)) {
logger.debug(
"Won't run Go autobuild since there is already a finalized database for Go.",
);
@@ -205,7 +209,7 @@ async function runAutobuildIfLegacyGoWorkflow(config: Config, logger: Logger) {
logger.debug(
"Running Go autobuild because extraction output (TRAP files) for Go code has not been found.",
);
await runAutobuild(config, KnownLanguage.go, logger);
await runAutobuild(config, BuiltInLanguage.go, logger);
}
async function run(startedAt: Date) {
+6 -6
@@ -14,7 +14,7 @@ import {
} from "./analyze";
import { createStubCodeQL } from "./codeql";
import { Feature } from "./feature-flags";
import { KnownLanguage } from "./languages";
import { BuiltInLanguage } from "./languages";
import { getRunnerLogger } from "./logging";
import {
setupTests,
@@ -41,7 +41,7 @@ test.serial("status report fields", async (t) => {
const threadsFlag = "";
sinon.stub(uploadLib, "validateSarifFileSchema");
for (const language of Object.values(KnownLanguage)) {
for (const language of Object.values(BuiltInLanguage)) {
const codeql = createStubCodeQL({
databaseRunQueries: async () => {},
databaseInterpretResults: async (
@@ -130,13 +130,13 @@ test.serial("status report fields", async (t) => {
test("resolveQuerySuiteAlias", (t) => {
// default query suite names should resolve to something language-specific ending in `.qls`.
for (const suite of defaultSuites) {
const resolved = resolveQuerySuiteAlias(KnownLanguage.go, suite);
const resolved = resolveQuerySuiteAlias(BuiltInLanguage.go, suite);
t.assert(
path.extname(resolved) === ".qls",
"Resolved default suite doesn't end in .qls",
);
t.assert(
resolved.indexOf(KnownLanguage.go) >= 0,
resolved.indexOf(BuiltInLanguage.go) >= 0,
"Resolved default suite doesn't contain language name",
);
}
@@ -145,12 +145,12 @@ test("resolveQuerySuiteAlias", (t) => {
const names = ["foo", "bar", "codeql/go-queries@1.0"];
for (const name of names) {
t.deepEqual(resolveQuerySuiteAlias(KnownLanguage.go, name), name);
t.deepEqual(resolveQuerySuiteAlias(BuiltInLanguage.go, name), name);
}
});
test("addSarifExtension", (t) => {
for (const language of Object.values(KnownLanguage)) {
for (const language of Object.values(BuiltInLanguage)) {
t.deepEqual(addSarifExtension(CodeScanning, language), `${language}.sarif`);
t.deepEqual(
addSarifExtension(CodeQuality, language),
+16 -28
@@ -21,9 +21,9 @@ import {
} from "./diff-informed-analysis-utils";
import { EnvVar } from "./environment";
import { FeatureEnablement, Feature } from "./feature-flags";
import { KnownLanguage, Language } from "./languages";
import { BuiltInLanguage, Language } from "./languages";
import { Logger, withGroupAsync } from "./logging";
import { OverlayDatabaseMode } from "./overlay";
import { OverlayDatabaseMode } from "./overlay/overlay-database-mode";
import type * as sarif from "./sarif";
import { DatabaseCreationTimings, EventReport } from "./status-report";
import { endTracingForCluster } from "./tracer-config";
@@ -41,7 +41,7 @@ export class CodeQLAnalysisError extends Error {
}
}
type KnownLanguageKey = keyof typeof KnownLanguage;
type BuiltInLanguageKey = keyof typeof BuiltInLanguage;
type RunQueriesDurationStatusReport = {
/**
@@ -50,12 +50,12 @@ type RunQueriesDurationStatusReport = {
* The "builtin" designation is now outdated with the move to CLI config parsing: this is the time
* taken to run _all_ the queries.
*/
[L in KnownLanguageKey as `analyze_builtin_queries_${L}_duration_ms`]?: number;
[L in BuiltInLanguageKey as `analyze_builtin_queries_${L}_duration_ms`]?: number;
};
type InterpretResultsDurationStatusReport = {
/** Time taken in ms to interpret results for the language (or undefined if this language was not analyzed). */
[L in KnownLanguageKey as `interpret_results_${L}_duration_ms`]?: number;
[L in BuiltInLanguageKey as `interpret_results_${L}_duration_ms`]?: number;
};
export interface QueriesStatusReport
@@ -115,12 +115,12 @@ export async function runExtraction(
if (await shouldExtractLanguage(codeql, config, language)) {
logger.startGroup(`Extracting ${language}`);
if (language === KnownLanguage.python) {
if (language === BuiltInLanguage.python) {
await setupPythonExtractor(logger);
}
if (config.buildMode) {
if (
language === KnownLanguage.cpp &&
language === BuiltInLanguage.cpp &&
config.buildMode === BuildMode.Autobuild
) {
await setupCppAutobuild(codeql, logger);
@@ -131,14 +131,14 @@ export async function runExtraction(
// a stable path that caches can be restored into and that we can cache at the
// end of the workflow (i.e. that does not get removed when the scratch directory is).
if (
language === KnownLanguage.java &&
language === BuiltInLanguage.java &&
config.buildMode === BuildMode.None
) {
process.env["CODEQL_EXTRACTOR_JAVA_OPTION_BUILDLESS_DEPENDENCY_DIR"] =
getJavaTempDependencyDir();
}
if (
language === KnownLanguage.csharp &&
language === BuiltInLanguage.csharp &&
config.buildMode === BuildMode.None &&
(await features.getValue(Feature.CsharpCacheBuildModeNone))
) {
@@ -251,16 +251,9 @@ export async function setupDiffInformedQueryRun(
diffRanges,
checkoutPath,
);
if (packDir === undefined) {
logger.warning(
"Cannot create diff range extension pack for diff-informed queries; " +
"reverting to performing full analysis.",
);
} else {
logger.info(
`Successfully created diff range extension pack at ${packDir}.`,
);
}
logger.info(
`Successfully created diff range extension pack at ${packDir}.`,
);
return packDir;
},
);
@@ -314,18 +307,13 @@ extensions:
* @param ranges The file line ranges, as returned by
* `getPullRequestEditedDiffRanges`.
* @param checkoutPath The path at which the repository was checked out.
* @returns The absolute path of the directory containing the extension pack, or
* `undefined` if no extension pack was created.
* @returns The absolute path of the directory containing the extension pack.
*/
function writeDiffRangeDataExtensionPack(
logger: Logger,
ranges: DiffThunkRange[] | undefined,
ranges: DiffThunkRange[],
checkoutPath: string,
): string | undefined {
if (ranges === undefined) {
return undefined;
}
): string {
if (ranges.length === 0) {
// An empty diff range means that there are no added or modified lines in
// the pull request. But the `restrictAlertsTo` extensible predicate
@@ -698,7 +686,7 @@ export async function warnIfGoInstalledAfterInit(
addDiagnostic(
config,
KnownLanguage.go,
BuiltInLanguage.go,
makeDiagnostic(
"go/workflow/go-installed-after-codeql-init",
"Go was installed after the `codeql-action/init` Action was run",
+1 -1
@@ -1 +1 @@
{"maximumVersion": "3.21", "minimumVersion": "3.14"}
{"maximumVersion": "3.21", "minimumVersion": "3.16"}
+5 -5
@@ -7,7 +7,7 @@ import * as configUtils from "./config-utils";
import { DocUrl } from "./doc-url";
import { EnvVar } from "./environment";
import { Feature, featureConfig, initFeatures } from "./feature-flags";
import { KnownLanguage, Language } from "./languages";
import { BuiltInLanguage, Language } from "./languages";
import { Logger } from "./logging";
import { getRepositoryNwo } from "./repository";
import { asyncFilter, BuildMode } from "./util";
@@ -72,7 +72,7 @@ export async function determineAutobuildLanguages(
* version of the CodeQL Action.
*/
const autobuildLanguagesWithoutGo = autobuildLanguages.filter(
(l) => l !== KnownLanguage.go,
(l) => l !== BuiltInLanguage.go,
);
const languages: Language[] = [];
@@ -84,7 +84,7 @@ export async function determineAutobuildLanguages(
// If Go is requested, run the Go autobuilder last to ensure it doesn't
// interfere with the other autobuilder.
if (autobuildLanguages.length !== autobuildLanguagesWithoutGo.length) {
languages.push(KnownLanguage.go);
languages.push(BuiltInLanguage.go);
}
logger.debug(`Will autobuild ${languages.join(" and ")}.`);
@@ -156,7 +156,7 @@ export async function runAutobuild(
) {
logger.startGroup(`Attempting to automatically build ${language} code`);
const codeQL = await getCodeQL(config.codeQLCmd);
if (language === KnownLanguage.cpp) {
if (language === BuiltInLanguage.cpp) {
await setupCppAutobuild(codeQL, logger);
}
if (config.buildMode) {
@@ -164,7 +164,7 @@ export async function runAutobuild(
} else {
await codeQL.runAutobuild(config, language);
}
if (language === KnownLanguage.go) {
if (language === BuiltInLanguage.go) {
core.exportVariable(EnvVar.DID_AUTOBUILD_GOLANG, "true");
}
logger.endGroup();
+14
@@ -299,6 +299,20 @@ test("wrapCliConfigurationError - swift build failed", (t) => {
t.true(wrappedError instanceof ConfigurationError);
});
test("wrapCliConfigurationError - swift incompatible os", (t) => {
const commandError = new CommandInvocationError(
"codeql",
["swift/tools/autobuild.sh"],
1,
"2026-04-01 18:35:00 EST ERRO [extractor/main] [incompatible-os] Currently, Swift analysis is only supported on macOS. (IncompatibleOs.cpp:26)",
);
const cliError = new CliError(commandError);
const wrappedError = wrapCliConfigurationError(cliError);
t.true(wrappedError instanceof ConfigurationError);
});
test("wrapCliConfigurationError - pack cannot be found", (t) => {
const commandError = new CommandInvocationError(
"codeql",
+7
@@ -144,6 +144,7 @@ export enum CliConfigErrorCategory {
OutOfMemoryOrDisk = "OutOfMemoryOrDisk",
PackCannotBeFound = "PackCannotBeFound",
PackMissingAuth = "PackMissingAuth",
SwiftIncompatibleOs = "SwiftIncompatibleOs",
SwiftBuildFailed = "SwiftBuildFailed",
UnsupportedBuildMode = "UnsupportedBuildMode",
}
@@ -281,6 +282,12 @@ const cliErrorsConfig: Record<CliConfigErrorCategory, CliErrorConfiguration> = {
),
],
},
[CliConfigErrorCategory.SwiftIncompatibleOs]: {
cliErrorMessageCandidates: [
new RegExp("\\[incompatible-os\\]"),
new RegExp("Swift analysis is only supported on macOS"),
],
},
[CliConfigErrorCategory.UnsupportedBuildMode]: {
cliErrorMessageCandidates: [
new RegExp(
+7 -5
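The two `SwiftIncompatibleOs` message candidates added above can be checked against the log line used in the new test; either pattern alone is enough to match it. A quick standalone check:

```typescript
// Candidate patterns from the SwiftIncompatibleOs error category.
const candidates = [
  new RegExp("\\[incompatible-os\\]"),
  new RegExp("Swift analysis is only supported on macOS"),
];

// Log line from the "swift incompatible os" test case.
const logLine =
  "2026-04-01 18:35:00 EST ERRO [extractor/main] [incompatible-os] " +
  "Currently, Swift analysis is only supported on macOS. (IncompatibleOs.cpp:26)";

const matched = candidates.some((re) => re.test(logLine));
console.log(matched); // true
```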
@@ -21,7 +21,7 @@ import {
import type { Config } from "./config-utils";
import * as defaults from "./defaults.json";
import { DocUrl } from "./doc-url";
import { KnownLanguage } from "./languages";
import { BuiltInLanguage } from "./languages";
import { getRunnerLogger } from "./logging";
import { ToolsSource } from "./setup-codeql";
import {
@@ -46,7 +46,7 @@ test.beforeEach(() => {
initializeEnvironment("1.2.3");
stubConfig = createTestConfig({
languages: [KnownLanguage.cpp],
languages: [BuiltInLanguage.cpp],
});
});
@@ -115,7 +115,7 @@ async function stubCodeql(): Promise<codeql.CodeQL> {
sinon.stub(codeqlObject, "getVersion").resolves(makeVersionInfo("2.17.6"));
sinon
.stub(codeqlObject, "isTracedLanguage")
.withArgs(KnownLanguage.cpp)
.withArgs(BuiltInLanguage.cpp)
.resolves(true);
return codeqlObject;
}
@@ -956,7 +956,8 @@ test.serial("runTool summarizes autobuilder errors", async (t) => {
sinon.stub(io, "which").resolves("");
await t.throwsAsync(
async () => await codeqlObject.runAutobuild(stubConfig, KnownLanguage.java),
async () =>
await codeqlObject.runAutobuild(stubConfig, BuiltInLanguage.java),
{
instanceOf: util.ConfigurationError,
message:
@@ -982,7 +983,8 @@ test.serial("runTool truncates long autobuilder errors", async (t) => {
sinon.stub(io, "which").resolves("");
await t.throwsAsync(
async () => await codeqlObject.runAutobuild(stubConfig, KnownLanguage.java),
async () =>
await codeqlObject.runAutobuild(stubConfig, BuiltInLanguage.java),
{
instanceOf: util.ConfigurationError,
message:
+5 -8
@@ -24,11 +24,8 @@ import {
import { isAnalyzingDefaultBranch } from "./git-utils";
import { Language } from "./languages";
import { Logger } from "./logging";
import {
OverlayDatabaseMode,
writeBaseDatabaseOidsFile,
writeOverlayChangesFile,
} from "./overlay";
import { writeBaseDatabaseOidsFile, writeOverlayChangesFile } from "./overlay";
import { OverlayDatabaseMode } from "./overlay/overlay-database-mode";
import * as setupCodeql from "./setup-codeql";
import { ZstdAvailability } from "./tar";
import { ToolsDownloadStatusReport } from "./tools-download";
@@ -285,17 +282,17 @@ const CODEQL_MINIMUM_VERSION = "2.17.6";
/**
* This version will shortly become the oldest version of CodeQL that the Action will run with.
*/
const CODEQL_NEXT_MINIMUM_VERSION = "2.17.6";
const CODEQL_NEXT_MINIMUM_VERSION = "2.19.4";
/**
* This is the version of GHES that was most recently deprecated.
*/
const GHES_VERSION_MOST_RECENTLY_DEPRECATED = "3.13";
const GHES_VERSION_MOST_RECENTLY_DEPRECATED = "3.15";
/**
* This is the deprecation date for the version of GHES that was most recently deprecated.
*/
const GHES_MOST_RECENT_DEPRECATION_DATE = "2025-06-19";
const GHES_MOST_RECENT_DEPRECATION_DATE = "2026-04-09";
/** The CLI verbosity level to use for extraction in debug mode. */
const EXTRACTION_DEBUG_MODE_VERBOSITY = "progress++";
+85 -66
@@ -18,10 +18,11 @@ import { Feature } from "./feature-flags";
import { RepositoryProperties } from "./feature-flags/properties";
import * as gitUtils from "./git-utils";
import { GitVersionInfo } from "./git-utils";
import { KnownLanguage, Language } from "./languages";
import { BuiltInLanguage, Language } from "./languages";
import { getRunnerLogger } from "./logging";
import { CODEQL_OVERLAY_MINIMUM_VERSION, OverlayDatabaseMode } from "./overlay";
import { CODEQL_OVERLAY_MINIMUM_VERSION } from "./overlay";
import { OverlayDisabledReason } from "./overlay/diagnostics";
import { OverlayDatabaseMode } from "./overlay/overlay-database-mode";
import * as overlayStatus from "./overlay/status";
import { parseRepositoryNwo } from "./repository";
import {
@@ -214,7 +215,7 @@ test.serial("load code quality config", async (t) => {
// And the config we expect it to result in
const expectedConfig = createTestConfig({
analysisKinds: [AnalysisKind.CodeQuality],
languages: [KnownLanguage.actions],
languages: [BuiltInLanguage.actions],
// This gets set because we only have `AnalysisKind.CodeQuality`
computedConfig: {
"disable-default-queries": true,
@@ -267,7 +268,7 @@ test.serial(
const expectedConfig = createTestConfig({
analysisKinds: [AnalysisKind.CodeQuality],
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
codeQLCmd: codeql.getPath(),
computedConfig,
dbLocation: path.resolve(tempDir, "codeql_databases"),
@@ -517,7 +518,7 @@ test.serial("load non-empty input", async (t) => {
// And the config we expect it to parse to
const expectedConfig = createTestConfig({
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
buildMode: BuildMode.None,
originalUserInput: userConfig,
computedConfig: userConfig,
@@ -891,10 +892,10 @@ const mockRepositoryNwo = parseRepositoryNwo("owner/repo");
betterResolveLanguages: (options) =>
Promise.resolve({
aliases: {
"c#": KnownLanguage.csharp,
c: KnownLanguage.cpp,
kotlin: KnownLanguage.java,
typescript: KnownLanguage.javascript,
"c#": BuiltInLanguage.csharp,
c: BuiltInLanguage.cpp,
kotlin: BuiltInLanguage.java,
typescript: BuiltInLanguage.javascript,
},
extractors: {
cpp: [stubExtractorEntry],
@@ -943,12 +944,12 @@ const mockRepositoryNwo = parseRepositoryNwo("owner/repo");
for (const { displayName, language, feature } of [
{
displayName: "Java",
language: KnownLanguage.java,
language: BuiltInLanguage.java,
feature: Feature.DisableJavaBuildlessEnabled,
},
{
displayName: "C#",
language: KnownLanguage.csharp,
language: BuiltInLanguage.csharp,
feature: Feature.DisableCsharpBuildless,
},
]) {
@@ -968,7 +969,7 @@ for (const { displayName, language, feature } of [
const messages: LoggedMessage[] = [];
const buildMode = await configUtils.parseBuildModeInput(
"none",
[KnownLanguage.python],
[BuiltInLanguage.python],
createFeatures([feature]),
getRecordingLogger(messages),
);
@@ -1004,6 +1005,7 @@ interface OverlayDatabaseModeTestSetup {
codeqlVersion: string;
gitRoot: string | undefined;
gitVersion: GitVersionInfo | undefined;
hasSubmodules: boolean;
codeScanningConfig: UserConfig;
diskUsage: DiskUsage | undefined;
memoryFlagValue: number;
@@ -1017,13 +1019,11 @@ const defaultOverlayDatabaseModeTestSetup: OverlayDatabaseModeTestSetup = {
isPullRequest: false,
isDefaultBranch: false,
buildMode: BuildMode.None,
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
codeqlVersion: CODEQL_OVERLAY_MINIMUM_VERSION,
gitRoot: "/some/git/root",
gitVersion: new GitVersionInfo(
gitUtils.GIT_MINIMUM_VERSION_FOR_OVERLAY,
gitUtils.GIT_MINIMUM_VERSION_FOR_OVERLAY,
),
gitVersion: new GitVersionInfo("2.39.0", "2.39.0"),
hasSubmodules: false,
codeScanningConfig: {},
diskUsage: {
numAvailableBytes: 50_000_000_000,
@@ -1091,7 +1091,7 @@ const checkOverlayEnablementMacro = test.macro({
sinon
.stub(codeql, "isTracedLanguage")
.callsFake(async (lang: Language) => {
return [KnownLanguage.java].includes(lang as KnownLanguage);
return lang === BuiltInLanguage.java;
});
// Mock git root detection
@@ -1099,6 +1099,9 @@ const checkOverlayEnablementMacro = test.macro({
sinon.stub(gitUtils, "getGitRoot").resolves(setup.gitRoot);
}
// Mock submodule detection
sinon.stub(gitUtils, "hasSubmodules").returns(setup.hasSubmodules);
// Mock default branch detection
sinon
.stub(gitUtils, "isAnalyzingDefaultBranch")
@@ -1181,7 +1184,7 @@ test.serial(
checkOverlayEnablementMacro,
"Ignore feature flag when analyzing non-default branch",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysis, Feature.OverlayAnalysisJavascript],
},
{
@@ -1193,7 +1196,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay-base database on default branch when feature enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysis, Feature.OverlayAnalysisJavascript],
isDefaultBranch: true,
},
@@ -1207,7 +1210,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay-base database on default branch when feature enabled with custom analysis",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysis, Feature.OverlayAnalysisJavascript],
codeScanningConfig: {
packs: ["some-custom-pack@1.0.0"],
@@ -1224,7 +1227,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay-base database on default branch when code-scanning feature enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1241,7 +1244,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch if runner disk space is too low",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1261,7 +1264,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch if we can't determine runner disk space",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1278,7 +1281,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay-base database on default branch if runner disk space is too low and skip resource checks flag is enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1300,7 +1303,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch if runner disk space is below v2 limit and v2 resource checks enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1321,7 +1324,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay-base database on default branch if runner disk space is between v2 and v1 limits and v2 resource checks enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1343,7 +1346,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch if runner disk space is between v2 and v1 limits and v2 resource checks not enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1363,7 +1366,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch if memory flag is too low",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1380,7 +1383,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay-base database on default branch if memory flag is too low but CodeQL >= 2.24.3",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1399,7 +1402,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay-base database on default branch if memory flag is too low and skip resource checks flag is enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1418,7 +1421,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch when cached status indicates previous failure",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisJavascript,
@@ -1436,7 +1439,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay analysis on PR when cached status indicates previous failure",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisJavascript,
@@ -1454,7 +1457,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch when code-scanning feature enabled with disable-default-queries",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1473,7 +1476,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch when code-scanning feature enabled with packs",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1492,7 +1495,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch when code-scanning feature enabled with queries",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1511,7 +1514,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch when code-scanning feature enabled with query-filters",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1530,7 +1533,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch when only language-specific feature enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysisJavascript],
isDefaultBranch: true,
},
@@ -1543,7 +1546,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch when only code-scanning feature enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysisCodeScanningJavascript],
isDefaultBranch: true,
},
@@ -1556,7 +1559,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay-base database on default branch when language-specific feature disabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysis],
isDefaultBranch: true,
},
@@ -1569,7 +1572,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay analysis on PR when feature enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysis, Feature.OverlayAnalysisJavascript],
isPullRequest: true,
},
@@ -1583,7 +1586,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay analysis on PR when feature enabled with custom analysis",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysis, Feature.OverlayAnalysisJavascript],
codeScanningConfig: {
packs: ["some-custom-pack@1.0.0"],
@@ -1600,7 +1603,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay analysis on PR when code-scanning feature enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1617,7 +1620,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay analysis on PR if runner disk space is too low",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1637,7 +1640,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay analysis on PR if runner disk space is too low and skip resource checks flag is enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1659,7 +1662,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay analysis on PR if we can't determine runner disk space",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1676,7 +1679,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay analysis on PR if memory flag is too low",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1693,7 +1696,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay analysis on PR if memory flag is too low but CodeQL >= 2.24.3",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1712,7 +1715,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay analysis on PR if memory flag is too low and skip resource checks flag is enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1731,7 +1734,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay analysis on PR when code-scanning feature enabled with disable-default-queries",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1750,7 +1753,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay analysis on PR when code-scanning feature enabled with packs",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1769,7 +1772,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay analysis on PR when code-scanning feature enabled with queries",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1788,7 +1791,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay analysis on PR when code-scanning feature enabled with query-filters",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [
Feature.OverlayAnalysis,
Feature.OverlayAnalysisCodeScanningJavascript,
@@ -1807,7 +1810,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay analysis on PR when only language-specific feature enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysisJavascript],
isPullRequest: true,
},
@@ -1820,7 +1823,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay analysis on PR when only code-scanning feature enabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysisCodeScanningJavascript],
isPullRequest: true,
},
@@ -1833,7 +1836,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay analysis on PR when language-specific feature disabled",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysis],
isPullRequest: true,
},
@@ -1871,7 +1874,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay PR analysis by feature flag",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysis, Feature.OverlayAnalysisJavascript],
isPullRequest: true,
},
@@ -1887,7 +1890,7 @@ test.serial(
{
overlayDatabaseEnvVar: "overlay",
buildMode: BuildMode.Autobuild,
languages: [KnownLanguage.java],
languages: [BuiltInLanguage.java],
},
{
disabledReason: OverlayDisabledReason.IncompatibleBuildMode,
@@ -1900,7 +1903,7 @@ test.serial(
{
overlayDatabaseEnvVar: "overlay",
buildMode: undefined,
languages: [KnownLanguage.java],
languages: [BuiltInLanguage.java],
},
{
disabledReason: OverlayDisabledReason.IncompatibleBuildMode,
@@ -1933,10 +1936,11 @@ test.serial(
test.serial(
checkOverlayEnablementMacro,
"Fallback due to old git version",
"Fallback due to old git version with submodules",
{
overlayDatabaseEnvVar: "overlay",
gitVersion: new GitVersionInfo("2.10.0", "2.10.0"), // Version below required 2.11.0
gitVersion: new GitVersionInfo("2.34.1", "2.34.1"), // Above 2.11.0 but below 2.36.0
hasSubmodules: true,
},
{
disabledReason: OverlayDisabledReason.IncompatibleGit,
@@ -1945,21 +1949,36 @@ test.serial(
test.serial(
checkOverlayEnablementMacro,
"Fallback when git version cannot be determined",
"Fallback when git version cannot be determined and repo has submodules",
{
overlayDatabaseEnvVar: "overlay",
gitVersion: undefined,
hasSubmodules: true,
},
{
disabledReason: OverlayDisabledReason.IncompatibleGit,
},
);
test.serial(
checkOverlayEnablementMacro,
"Overlay enabled when git version cannot be determined and repo has no submodules",
{
overlayDatabaseEnvVar: "overlay",
gitVersion: undefined,
hasSubmodules: false,
},
{
overlayDatabaseMode: OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: false,
},
);
test.serial(
checkOverlayEnablementMacro,
"No overlay when disabled via repository property",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysis, Feature.OverlayAnalysisJavascript],
isPullRequest: true,
repositoryProperties: {
@@ -1975,7 +1994,7 @@ test.serial(
checkOverlayEnablementMacro,
"Overlay not disabled when repository property is false",
{
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
features: [Feature.OverlayAnalysis, Feature.OverlayAnalysisJavascript],
isPullRequest: true,
repositoryProperties: {
@@ -2004,7 +2023,7 @@ test.serial(
);
// Exercise language-specific overlay analysis features code paths
for (const language in KnownLanguage) {
for (const language in BuiltInLanguage) {
test.serial(
checkOverlayEnablementMacro,
`Check default overlay analysis feature for ${language}`,
@@ -2027,7 +2046,7 @@ test.serial(
checkOverlayEnablementMacro,
"No overlay analysis for language without per-language overlay feature flag",
{
languages: [KnownLanguage.swift],
languages: [BuiltInLanguage.swift],
features: [Feature.OverlayAnalysis],
isPullRequest: true,
},
+34 -26
@@ -43,17 +43,19 @@ import {
getGeneratedFiles,
getGitRoot,
getGitVersionOrThrow,
GIT_MINIMUM_VERSION_FOR_OVERLAY,
GIT_MINIMUM_VERSION_FOR_OVERLAY_WITH_SUBMODULES,
GitVersionInfo,
hasSubmodules,
isAnalyzingDefaultBranch,
} from "./git-utils";
import { KnownLanguage, Language } from "./languages";
import { BuiltInLanguage, Language } from "./languages";
import { Logger } from "./logging";
import { CODEQL_OVERLAY_MINIMUM_VERSION, OverlayDatabaseMode } from "./overlay";
import { CODEQL_OVERLAY_MINIMUM_VERSION } from "./overlay";
import {
addOverlayDisablementDiagnostics,
OverlayDisabledReason,
} from "./overlay/diagnostics";
import { OverlayDatabaseMode } from "./overlay/overlay-database-mode";
import { shouldSkipOverlayAnalysis } from "./overlay/status";
import { RepositoryNwo } from "./repository";
import { ToolsFeature } from "./tools-features";
@@ -272,10 +274,10 @@ async function getSupportedLanguageMap(
for (const extractor of Object.keys(resolveResult.extractors)) {
// If the CLI supports resolving languages with default queries, use these
// as the set of supported languages. Otherwise, require the language to be
// a known language.
// a built-in language.
if (
resolveSupportedLanguagesUsingCli ||
KnownLanguage[extractor] !== undefined
BuiltInLanguage[extractor] !== undefined
) {
supportedLanguages[extractor] = extractor;
}
@@ -945,7 +947,7 @@ async function validateOverlayDatabaseMode(
await Promise.all(
languages.map(
async (l) =>
l !== KnownLanguage.go && // Workaround to allow overlay analysis for Go with any build
l !== BuiltInLanguage.go && // Workaround to allow overlay analysis for Go with any build
// mode, since it does not yet support BMN. The Go autobuilder and/or extractor will
// ensure that overlay-base databases are only created for supported Go build setups,
// and that we'll fall back to full databases in other cases.
@@ -969,7 +971,8 @@ async function validateOverlayDatabaseMode(
);
return new Failure(OverlayDisabledReason.IncompatibleCodeQl);
}
if ((await getGitRoot(sourceRoot)) === undefined) {
const gitRoot = await getGitRoot(sourceRoot);
if (gitRoot === undefined) {
logger.warning(
`Cannot build an ${overlayDatabaseMode} database because ` +
`the source root "${sourceRoot}" is not inside a git repository. ` +
@@ -977,21 +980,26 @@ async function validateOverlayDatabaseMode(
);
return new Failure(OverlayDisabledReason.NoGitRoot);
}
if (gitVersion === undefined) {
logger.warning(
`Cannot build an ${overlayDatabaseMode} database because ` +
"the Git version could not be determined. " +
"Falling back to creating a normal full database instead.",
);
return new Failure(OverlayDisabledReason.IncompatibleGit);
}
if (!gitVersion.isAtLeast(GIT_MINIMUM_VERSION_FOR_OVERLAY)) {
logger.warning(
`Cannot build an ${overlayDatabaseMode} database because ` +
`the installed Git version is older than ${GIT_MINIMUM_VERSION_FOR_OVERLAY}. ` +
"Falling back to creating a normal full database instead.",
);
return new Failure(OverlayDisabledReason.IncompatibleGit);
if (hasSubmodules(gitRoot)) {
if (gitVersion === undefined) {
logger.warning(
`Cannot build an ${overlayDatabaseMode} database because ` +
"the repository has submodules and the Git version could not be determined. " +
"Falling back to creating a normal full database instead.",
);
return new Failure(OverlayDisabledReason.IncompatibleGit);
}
if (
!gitVersion.isAtLeast(GIT_MINIMUM_VERSION_FOR_OVERLAY_WITH_SUBMODULES)
) {
logger.warning(
`Cannot build an ${overlayDatabaseMode} database because ` +
"the repository has submodules and the installed Git version is older " +
`than ${GIT_MINIMUM_VERSION_FOR_OVERLAY_WITH_SUBMODULES}. ` +
"Falling back to creating a normal full database instead.",
);
return new Failure(OverlayDisabledReason.IncompatibleGit);
}
}
return new Success({
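The validation flow in this hunk can be condensed as: the Git version only matters when the repository has submodules, since `git ls-files --recurse-submodules --stage` requires Git 2.36.0 or newer. The following is an illustrative sketch with hypothetical names (`gateOverlay`, `GateResult`), not the action's actual `validateOverlayDatabaseMode`.

```typescript
// Illustrative condensation of the overlay gating logic above.
const MIN_GIT_WITH_SUBMODULES = "2.36.0";

type GateResult =
  | { overlay: true }
  | { overlay: false; reason: "NoGitRoot" | "IncompatibleGit" };

function gateOverlay(
  gitRoot: string | undefined,
  gitVersion: string | undefined, // undefined if `git --version` failed
  repoHasSubmodules: (root: string) => boolean,
  isAtLeast: (version: string, minimum: string) => boolean,
): GateResult {
  // No git repository at all: overlay databases cannot be built.
  if (gitRoot === undefined) {
    return { overlay: false, reason: "NoGitRoot" };
  }
  // Only repositories with submodules need a known, recent Git, because
  // `git ls-files --recurse-submodules --stage` needs Git 2.36.0+.
  if (repoHasSubmodules(gitRoot)) {
    if (
      gitVersion === undefined ||
      !isAtLeast(gitVersion, MIN_GIT_WITH_SUBMODULES)
    ) {
      return { overlay: false, reason: "IncompatibleGit" };
    }
  }
  return { overlay: true };
}
```

This mirrors the test cases earlier in the diff: an unknown Git version now only forces a fallback when the repository has submodules.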
@@ -1028,13 +1036,13 @@ async function setCppTrapCachingEnvironmentVariables(
config: Config,
logger: Logger,
): Promise<void> {
if (config.languages.includes(KnownLanguage.cpp)) {
if (config.languages.includes(BuiltInLanguage.cpp)) {
const envVar = "CODEQL_EXTRACTOR_CPP_TRAP_CACHING";
if (process.env[envVar]) {
logger.info(
`Environment variable ${envVar} already set, leaving it unchanged.`,
);
} else if (config.trapCaches[KnownLanguage.cpp]) {
} else if (config.trapCaches[BuiltInLanguage.cpp]) {
logger.info("Enabling TRAP caching for C/C++.");
core.exportVariable(envVar, "true");
} else {
@@ -1531,7 +1539,7 @@ export async function parseBuildModeInput(
}
if (
languages.includes(KnownLanguage.csharp) &&
languages.includes(BuiltInLanguage.csharp) &&
(await features.getValue(Feature.DisableCsharpBuildless))
) {
logger.warning(
@@ -1541,7 +1549,7 @@ export async function parseBuildModeInput(
}
if (
languages.includes(KnownLanguage.java) &&
languages.includes(BuiltInLanguage.java) &&
(await features.getValue(Feature.DisableJavaBuildlessEnabled))
) {
logger.warning(
+21 -21
@@ -1,7 +1,7 @@
import test, { ExecutionContext } from "ava";
import { RepositoryProperties } from "../feature-flags/properties";
import { KnownLanguage, Language } from "../languages";
import { BuiltInLanguage, Language } from "../languages";
import { getRunnerLogger } from "../logging";
import {
checkExpectedLogMessages,
@@ -54,7 +54,7 @@ const invalidPackNameMacro = test.macro({
parsePacksErrorMacro.exec(
t,
name,
[KnownLanguage.cpp],
[BuiltInLanguage.cpp],
new RegExp(`^"${name}" is not a valid pack$`),
),
title: (_providedTitle: string | undefined, arg: string | undefined) =>
@@ -62,23 +62,23 @@ const invalidPackNameMacro = test.macro({
});
test("no packs", parsePacksMacro, "", [], undefined);
test("two packs", parsePacksMacro, "a/b,c/d@1.2.3", [KnownLanguage.cpp], {
[KnownLanguage.cpp]: ["a/b", "c/d@1.2.3"],
test("two packs", parsePacksMacro, "a/b,c/d@1.2.3", [BuiltInLanguage.cpp], {
[BuiltInLanguage.cpp]: ["a/b", "c/d@1.2.3"],
});
test(
"two packs with spaces",
parsePacksMacro,
" a/b , c/d@1.2.3 ",
[KnownLanguage.cpp],
[BuiltInLanguage.cpp],
{
[KnownLanguage.cpp]: ["a/b", "c/d@1.2.3"],
[BuiltInLanguage.cpp]: ["a/b", "c/d@1.2.3"],
},
);
test(
"two packs with language",
parsePacksErrorMacro,
"a/b,c/d@1.2.3",
[KnownLanguage.cpp, KnownLanguage.java],
[BuiltInLanguage.cpp, BuiltInLanguage.java],
new RegExp(
"Cannot specify a 'packs' input in a multi-language analysis. " +
"Use a codeql-config.yml file instead and specify packs by language.",
@@ -106,9 +106,9 @@ test(
// (globbing is not done)
"c/d@1.2.3:+*)_(",
].join(","),
[KnownLanguage.cpp],
[BuiltInLanguage.cpp],
{
[KnownLanguage.cpp]: [
[BuiltInLanguage.cpp]: [
"c/d@1.0",
"c/d@~1.0.0",
"c/d@~1.0.0:a/b",
@@ -215,7 +215,7 @@ test(
"All empty",
undefined,
undefined,
[KnownLanguage.javascript],
[BuiltInLanguage.javascript],
{},
{
...dbConfig.defaultAugmentationProperties,
@@ -227,7 +227,7 @@ test(
"With queries",
undefined,
" a, b , c, d",
[KnownLanguage.javascript],
[BuiltInLanguage.javascript],
{},
{
...dbConfig.defaultAugmentationProperties,
@@ -240,7 +240,7 @@ test(
"With queries combining",
undefined,
" + a, b , c, d ",
[KnownLanguage.javascript],
[BuiltInLanguage.javascript],
{},
{
...dbConfig.defaultAugmentationProperties,
@@ -254,7 +254,7 @@ test(
"With packs",
" codeql/a , codeql/b , codeql/c , codeql/d ",
undefined,
[KnownLanguage.javascript],
[BuiltInLanguage.javascript],
{},
{
...dbConfig.defaultAugmentationProperties,
@@ -267,7 +267,7 @@ test(
"With packs combining",
" + codeql/a, codeql/b, codeql/c, codeql/d",
undefined,
[KnownLanguage.javascript],
[BuiltInLanguage.javascript],
{},
{
...dbConfig.defaultAugmentationProperties,
@@ -281,7 +281,7 @@ test(
"With repo property queries",
undefined,
undefined,
[KnownLanguage.javascript],
[BuiltInLanguage.javascript],
{
"github-codeql-extra-queries": "a, b, c, d",
},
@@ -299,7 +299,7 @@ test(
"With repo property queries combining",
undefined,
undefined,
[KnownLanguage.javascript],
[BuiltInLanguage.javascript],
{
"github-codeql-extra-queries": "+ a, b, c, d",
},
@@ -341,7 +341,7 @@ test(
"Plus (+) with nothing else (queries)",
undefined,
" + ",
[KnownLanguage.javascript],
[BuiltInLanguage.javascript],
{},
/The workflow property "queries" is invalid/,
);
@@ -351,7 +351,7 @@ test(
"Plus (+) with nothing else (packs)",
" + ",
undefined,
[KnownLanguage.javascript],
[BuiltInLanguage.javascript],
{},
/The workflow property "packs" is invalid/,
);
@@ -361,7 +361,7 @@ test(
"Plus (+) with nothing else (repo property queries)",
undefined,
undefined,
[KnownLanguage.javascript],
[BuiltInLanguage.javascript],
{
"github-codeql-extra-queries": " + ",
},
@@ -373,7 +373,7 @@ test(
"Packs input with multiple languages",
" + a/b, c/d ",
undefined,
[KnownLanguage.javascript, KnownLanguage.java],
[BuiltInLanguage.javascript, BuiltInLanguage.java],
{},
/Cannot specify a 'packs' input in a multi-language analysis/,
);
@@ -393,7 +393,7 @@ test(
"Invalid packs",
" a-pack-without-a-scope ",
undefined,
[KnownLanguage.javascript],
[BuiltInLanguage.javascript],
{},
/"a-pack-without-a-scope" is not a valid pack/,
);
+2 -2
@@ -12,7 +12,7 @@ import { createStubCodeQL } from "./codeql";
import { Config } from "./config-utils";
import { cleanupAndUploadDatabases } from "./database-upload";
import * as gitUtils from "./git-utils";
import { KnownLanguage } from "./languages";
import { BuiltInLanguage } from "./languages";
import { RepositoryNwo } from "./repository";
import {
checkExpectedLogMessages,
@@ -45,7 +45,7 @@ const testApiDetails: GitHubApiDetails = {
function getTestConfig(tmpDir: string): Config {
return createTestConfig({
languages: [KnownLanguage.javascript],
languages: [BuiltInLanguage.javascript],
dbLocation: tmpDir,
});
}
+1 -1
@@ -12,7 +12,7 @@ import { Config } from "./config-utils";
import { Feature, FeatureEnablement } from "./feature-flags";
import * as gitUtils from "./git-utils";
import { Logger, withGroupAsync } from "./logging";
import { OverlayDatabaseMode } from "./overlay";
import { OverlayDatabaseMode } from "./overlay/overlay-database-mode";
import { RepositoryNwo } from "./repository";
import * as util from "./util";
import { asHTTPError, bundleDb, CleanupLevel, parseGitHubUrl } from "./util";
+4 -4
@@ -1,6 +1,6 @@
{
"bundleVersion": "codeql-bundle-v2.24.3",
"cliVersion": "2.24.3",
"priorBundleVersion": "codeql-bundle-v2.24.2",
"priorCliVersion": "2.24.2"
"bundleVersion": "codeql-bundle-v2.25.3",
"cliVersion": "2.25.3",
"priorBundleVersion": "codeql-bundle-v2.25.2",
"priorCliVersion": "2.25.2"
}
+40 -32
@@ -27,7 +27,7 @@ import {
CacheStoreResult,
} from "./dependency-caching";
import { Feature } from "./feature-flags";
import { KnownLanguage } from "./languages";
import { BuiltInLanguage } from "./languages";
import {
setupTests,
createFeatures,
@@ -179,7 +179,7 @@ test("checkHashPatterns - logs when no patterns match", async (t) => {
const result = await checkHashPatterns(
codeql,
features,
KnownLanguage.csharp,
BuiltInLanguage.csharp,
config,
"download",
getRecordingLogger(messages),
@@ -208,7 +208,7 @@ test("checkHashPatterns - returns patterns when patterns match", async (t) => {
const result = await checkHashPatterns(
codeql,
features,
KnownLanguage.csharp,
BuiltInLanguage.csharp,
config,
"upload",
getRecordingLogger(messages),
@@ -270,7 +270,7 @@ test.serial(
const keyWithFeature = await cacheKey(
codeql,
createFeatures([Feature.CsharpNewCacheKey]),
KnownLanguage.csharp,
BuiltInLanguage.csharp,
// Patterns don't matter here because we have stubbed `hashFiles` to always return a specific hash above.
[],
);
@@ -288,12 +288,12 @@ test.serial(
const result = await downloadDependencyCaches(
codeql,
createFeatures([]),
[KnownLanguage.csharp],
[BuiltInLanguage.csharp],
logger,
);
const statusReport = result.statusReport;
t.is(statusReport.length, 1);
t.is(statusReport[0].language, KnownLanguage.csharp);
t.is(statusReport[0].language, BuiltInLanguage.csharp);
t.is(statusReport[0].hit_kind, CacheHitKind.Miss);
t.deepEqual(result.restoredKeys, []);
t.assert(restoreCacheStub.calledOnce);
@@ -316,7 +316,7 @@ test.serial(
const keyWithFeature = await cacheKey(
codeql,
features,
KnownLanguage.csharp,
BuiltInLanguage.csharp,
// Patterns don't matter here because we have stubbed `hashFiles` to always return a specific hash above.
[],
);
@@ -334,14 +334,14 @@ test.serial(
const result = await downloadDependencyCaches(
codeql,
features,
[KnownLanguage.csharp],
[BuiltInLanguage.csharp],
logger,
);
// Check that the status report for telemetry indicates that one cache was restored with an exact match.
const statusReport = result.statusReport;
t.is(statusReport.length, 1);
t.is(statusReport[0].language, KnownLanguage.csharp);
t.is(statusReport[0].language, BuiltInLanguage.csharp);
t.is(statusReport[0].hit_kind, CacheHitKind.Exact);
// Check that the restored key has been returned.
@@ -380,7 +380,7 @@ test.serial(
const keyWithFeature = await cacheKey(
codeql,
features,
KnownLanguage.csharp,
BuiltInLanguage.csharp,
// Patterns don't matter here because we have stubbed `hashFiles` to always return a specific hash above.
[],
);
@@ -398,14 +398,14 @@ test.serial(
const result = await downloadDependencyCaches(
codeql,
features,
[KnownLanguage.csharp],
[BuiltInLanguage.csharp],
logger,
);
// Check that the status report for telemetry indicates that one cache was restored with a partial match.
const statusReport = result.statusReport;
t.is(statusReport.length, 1);
t.is(statusReport[0].language, KnownLanguage.csharp);
t.is(statusReport[0].language, BuiltInLanguage.csharp);
t.is(statusReport[0].hit_kind, CacheHitKind.Partial);
// Check that the restored key has been returned.
@@ -426,7 +426,7 @@ test("uploadDependencyCaches - skips upload for a language with no cache config"
const logger = getRecordingLogger(messages);
const features = createFeatures([]);
const config = createTestConfig({
languages: [KnownLanguage.actions],
languages: [BuiltInLanguage.actions],
});
const result = await uploadDependencyCaches(codeql, features, config, logger);
@@ -444,7 +444,7 @@ test.serial(
const logger = getRecordingLogger(messages);
const features = createFeatures([]);
const config = createTestConfig({
languages: [KnownLanguage.go],
languages: [BuiltInLanguage.go],
});
const makePatternCheckStub = sinon.stub(internal, "makePatternCheck");
@@ -457,7 +457,7 @@ test.serial(
logger,
);
t.is(result.length, 1);
t.is(result[0].language, KnownLanguage.go);
t.is(result[0].language, BuiltInLanguage.go);
t.is(result[0].result, CacheStoreResult.NoHash);
},
);
@@ -483,12 +483,12 @@ test.serial(
const primaryCacheKey = await cacheKey(
codeql,
features,
KnownLanguage.csharp,
BuiltInLanguage.csharp,
CSHARP_BASE_PATTERNS,
);
const config = createTestConfig({
languages: [KnownLanguage.csharp],
languages: [BuiltInLanguage.csharp],
dependencyCachingRestoredKeys: [primaryCacheKey],
});
@@ -499,7 +499,7 @@ test.serial(
logger,
);
t.is(result.length, 1);
t.is(result[0].language, KnownLanguage.csharp);
t.is(result[0].language, BuiltInLanguage.csharp);
t.is(result[0].result, CacheStoreResult.Duplicate);
},
);
@@ -525,7 +525,7 @@ test.serial(
sinon.stub(cachingUtils, "getTotalCacheSize").resolves(0);
const config = createTestConfig({
languages: [KnownLanguage.csharp],
languages: [BuiltInLanguage.csharp],
});
const result = await uploadDependencyCaches(
@@ -535,7 +535,7 @@ test.serial(
logger,
);
t.is(result.length, 1);
t.is(result[0].language, KnownLanguage.csharp);
t.is(result[0].language, BuiltInLanguage.csharp);
t.is(result[0].result, CacheStoreResult.Empty);
checkExpectedLogMessages(t, messages, [
@@ -566,7 +566,7 @@ test.serial(
sinon.stub(actionsCache, "saveCache").resolves();
const config = createTestConfig({
languages: [KnownLanguage.csharp],
languages: [BuiltInLanguage.csharp],
});
const result = await uploadDependencyCaches(
@@ -576,7 +576,7 @@ test.serial(
logger,
);
t.is(result.length, 1);
t.is(result[0].language, KnownLanguage.csharp);
t.is(result[0].language, BuiltInLanguage.csharp);
t.is(result[0].result, CacheStoreResult.Stored);
t.is(result[0].upload_size_bytes, 1024);
@@ -608,7 +608,7 @@ test.serial(
.throws(new actionsCache.ReserveCacheError("Already in use"));
const config = createTestConfig({
languages: [KnownLanguage.csharp],
languages: [BuiltInLanguage.csharp],
});
await t.notThrowsAsync(async () => {
@@ -619,7 +619,7 @@ test.serial(
logger,
);
t.is(result.length, 1);
t.is(result[0].language, KnownLanguage.csharp);
t.is(result[0].language, BuiltInLanguage.csharp);
t.is(result[0].result, CacheStoreResult.Duplicate);
checkExpectedLogMessages(t, messages, ["Not uploading cache for"]);
@@ -647,7 +647,7 @@ test.serial("uploadDependencyCaches - throws other exceptions", async (t) => {
sinon.stub(actionsCache, "saveCache").throws();
const config = createTestConfig({
languages: [KnownLanguage.csharp],
languages: [BuiltInLanguage.csharp],
});
await t.throwsAsync(async () => {
@@ -659,7 +659,7 @@ test("getFeaturePrefix - returns empty string if no features are enabled", async
const codeql = createStubCodeQL({});
const features = createFeatures([]);
for (const knownLanguage of Object.values(KnownLanguage)) {
for (const knownLanguage of Object.values(BuiltInLanguage)) {
const result = await getFeaturePrefix(codeql, features, knownLanguage);
t.deepEqual(result, "", `Expected no feature prefix for ${knownLanguage}`);
}
@@ -669,7 +669,11 @@ test("getFeaturePrefix - C# - returns prefix if CsharpNewCacheKey is enabled", a
const codeql = createStubCodeQL({});
const features = createFeatures([Feature.CsharpNewCacheKey]);
const result = await getFeaturePrefix(codeql, features, KnownLanguage.csharp);
const result = await getFeaturePrefix(
codeql,
features,
BuiltInLanguage.csharp,
);
t.notDeepEqual(result, "");
t.assert(result.endsWith("-"));
// Check the length of the prefix, which should correspond to `cacheKeyHashLength` + 1 for the trailing `-`.
@@ -680,9 +684,9 @@ test("getFeaturePrefix - non-C# - returns '' if CsharpNewCacheKey is enabled", a
const codeql = createStubCodeQL({});
const features = createFeatures([Feature.CsharpNewCacheKey]);
for (const knownLanguage of Object.values(KnownLanguage)) {
for (const knownLanguage of Object.values(BuiltInLanguage)) {
// Skip C# since we expect a result for it, which is tested in the previous test.
if (knownLanguage === KnownLanguage.csharp) {
if (knownLanguage === BuiltInLanguage.csharp) {
continue;
}
const result = await getFeaturePrefix(codeql, features, knownLanguage);
@@ -694,7 +698,11 @@ test("getFeaturePrefix - C# - returns prefix if CsharpCacheBuildModeNone is enab
const codeql = createStubCodeQL({});
const features = createFeatures([Feature.CsharpCacheBuildModeNone]);
const result = await getFeaturePrefix(codeql, features, KnownLanguage.csharp);
const result = await getFeaturePrefix(
codeql,
features,
BuiltInLanguage.csharp,
);
t.notDeepEqual(result, "");
t.assert(result.endsWith("-"));
// Check the length of the prefix, which should correspond to `cacheKeyHashLength` + 1 for the trailing `-`.
@@ -705,9 +713,9 @@ test("getFeaturePrefix - non-C# - returns '' if CsharpCacheBuildModeNone is enab
const codeql = createStubCodeQL({});
const features = createFeatures([Feature.CsharpCacheBuildModeNone]);
for (const knownLanguage of Object.values(KnownLanguage)) {
for (const knownLanguage of Object.values(BuiltInLanguage)) {
// Skip C# since we expect a result for it, which is tested in the previous test.
if (knownLanguage === KnownLanguage.csharp) {
if (knownLanguage === BuiltInLanguage.csharp) {
continue;
}
const result = await getFeaturePrefix(codeql, features, knownLanguage);
+2 -2
@@ -11,7 +11,7 @@ import { CodeQL } from "./codeql";
import { Config } from "./config-utils";
import { EnvVar } from "./environment";
import { Feature, FeatureEnablement } from "./feature-flags";
import { KnownLanguage, Language } from "./languages";
import { BuiltInLanguage, Language } from "./languages";
import { Logger } from "./logging";
import { getErrorMessage, getRequiredEnvParam } from "./util";
@@ -541,7 +541,7 @@ export async function getFeaturePrefix(
}
};
if (language === KnownLanguage.csharp) {
if (language === BuiltInLanguage.csharp) {
await addFeatureIfEnabled(Feature.CsharpNewCacheKey);
await addFeatureIfEnabled(Feature.CsharpCacheBuildModeNone);
}
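The hunk above gates C#-specific cache-key features on the language. A sketch of the overall shape — folding enabled features into a cache-key prefix so that toggling a flag invalidates old caches — is below. The synchronous `featurePrefix` helper and the joined-name format are assumptions for illustration; the real `getFeaturePrefix` is asynchronous and hashes the feature names to a fixed-length prefix.

```typescript
// Illustrative sketch: derive a cache-key prefix from enabled
// C#-specific features. Hypothetical helper, not the action's API.
function featurePrefix(language: string, enabledFeatures: Set<string>): string {
  const relevant: string[] = [];
  const addFeatureIfEnabled = (feature: string) => {
    if (enabledFeatures.has(feature)) relevant.push(feature);
  };
  // Only C# currently has features that alter the cache key.
  if (language === "csharp") {
    addFeatureIfEnabled("csharp_new_cache_key");
    addFeatureIfEnabled("csharp_cache_build_mode_none");
  }
  // Empty prefix when no relevant feature is on; otherwise end with "-".
  return relevant.length === 0 ? "" : `${relevant.join("+")}-`;
}
```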
+17 -2
@@ -72,6 +72,13 @@ let unwrittenDiagnostics: UnwrittenDiagnostic[] = [];
*/
let unwrittenDefaultLanguageDiagnostics: DiagnosticMessage[] = [];
/**
* Counter used to generate a unique suffix for each diagnostic filename, so that
* two diagnostics produced within the same millisecond do not overwrite each
* other on disk.
*/
let diagnosticCounter = 0;
/**
* Constructs a new diagnostic message with the specified id and name, as well as optional additional data.
*
@@ -167,10 +174,18 @@ function writeDiagnostic(
// Create the directory if it doesn't exist yet.
mkdirSync(diagnosticsPath, { recursive: true });
// Include a monotonically increasing suffix to avoid filename collisions
// between diagnostics produced within the same millisecond.
const uniqueSuffix = (diagnosticCounter++).toString();
// We should only need to remove colons, but to be defensive, only allow a restricted set of
// characters.
const sanitizedTimestamp = diagnostic.timestamp.replace(
/[^a-zA-Z0-9.-]/g,
"",
);
const jsonPath = path.resolve(
diagnosticsPath,
// Remove colons from the timestamp as these are not allowed in Windows filenames.
`codeql-action-${diagnostic.timestamp.replaceAll(":", "")}.json`,
`codeql-action-${sanitizedTimestamp}-${uniqueSuffix}.json`,
);
writeFileSync(jsonPath, JSON.stringify(diagnostic));
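The filename scheme in this hunk combines a sanitized timestamp with a monotonically increasing counter. A self-contained sketch of just that naming step (illustrative, with the file write omitted):

```typescript
// Counter disambiguating diagnostics written in the same millisecond.
let diagnosticCounter = 0;

function diagnosticFileName(timestamp: string): string {
  // We should only need to remove colons (not allowed in Windows
  // filenames), but defensively allow only a restricted character set.
  const sanitizedTimestamp = timestamp.replace(/[^a-zA-Z0-9.-]/g, "");
  const uniqueSuffix = (diagnosticCounter++).toString();
  return `codeql-action-${sanitizedTimestamp}-${uniqueSuffix}.json`;
}

console.log(diagnosticFileName("2026-04-30T15:31:40.123Z"));
// → codeql-action-2026-04-30T153140.123Z-0.json
```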
+1
@@ -8,6 +8,7 @@ export enum DocUrl {
CODEQL_BUILD_MODES = "https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages#codeql-build-modes",
DEFINE_ENV_VARIABLES = "https://docs.github.com/en/actions/learn-github-actions/variables#defining-environment-variables-for-a-single-workflow",
DELETE_ACTIONS_CACHE_ENTRIES = "https://docs.github.com/en/actions/how-tos/manage-workflow-runs/manage-caches#deleting-cache-entries",
PRIVATE_REGISTRY_LOGS = "https://docs.github.com/en/code-security/reference/code-scanning/code-scanning-logs#diagnostic-information-for-private-package-registries",
SCANNING_ON_PUSH = "https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/customizing-your-advanced-setup-for-code-scanning#scanning-on-push",
SPECIFY_BUILD_STEPS_MANUALLY = "https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages#about-specifying-build-steps-manually",
SYSTEM_REQUIREMENTS = "https://codeql.github.com/docs/codeql-overview/system-requirements/",
-7
@@ -85,7 +85,6 @@ export enum Feature {
OverlayAnalysisStatusCheck = "overlay_analysis_status_check",
/** Controls whether overlay build failures on the default branch are stored in the Actions cache. */
OverlayAnalysisStatusSave = "overlay_analysis_status_save",
PythonDefaultIsToNotExtractStdlib = "python_default_is_to_not_extract_stdlib",
QaTelemetryEnabled = "qa_telemetry_enabled",
/** Note that this currently only disables baseline file coverage information. */
SkipFileCoverageOnPrs = "skip_file_coverage_on_prs",
@@ -298,12 +297,6 @@ export const featureConfig = {
envVar: "CODEQL_ACTION_OVERLAY_ANALYSIS_SKIP_RESOURCE_CHECKS",
minimumVersion: undefined,
},
[Feature.PythonDefaultIsToNotExtractStdlib]: {
defaultValue: false,
envVar: "CODEQL_ACTION_DISABLE_PYTHON_STANDARD_LIBRARY_EXTRACTION",
minimumVersion: undefined,
toolsFeature: ToolsFeature.PythonDefaultIsToNotExtractStdlib,
},
[Feature.QaTelemetryEnabled]: {
defaultValue: false,
envVar: "CODEQL_ACTION_QA_TELEMETRY",
+118 -51
@@ -343,75 +343,142 @@ test.serial("decodeGitFilePath quoted strings", async (t) => {
);
});
test.serial("getFileOidsUnderPath returns correct file mapping", async (t) => {
const runGitCommandStub = sinon
.stub(gitUtils as any, "runGitCommand")
.resolves(
"100644 30d998ded095371488be3a729eb61d86ed721a18 0\tlib/git-utils.js\n" +
"100644 d89514599a9a99f22b4085766d40af7b99974827 0\tlib/git-utils.js.map\n" +
"100644 a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96 0\tsrc/git-utils.ts",
);
test.serial(
"getFileOidsUnderPath uses --recurse-submodules when submodules exist",
async (t) => {
await withTmpDir(async (tmpDir) => {
fs.writeFileSync(path.join(tmpDir, ".gitmodules"), "");
const runGitCommandStub = sinon
.stub(gitUtils as any, "runGitCommand")
.callsFake(async (_cwd: any, args: any) => {
if (args[0] === "rev-parse") {
return `${tmpDir}\n`;
}
return (
"100644 30d998ded095371488be3a729eb61d86ed721a18 0\tlib/git-utils.js\n" +
"100644 d89514599a9a99f22b4085766d40af7b99974827 0\tlib/git-utils.js.map\n" +
"100644 a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96 0\tsrc/git-utils.ts"
);
});
const result = await gitUtils.getFileOidsUnderPath("/fake/path");
const result = await gitUtils.getFileOidsUnderPath("/fake/path");
t.deepEqual(result, {
"lib/git-utils.js": "30d998ded095371488be3a729eb61d86ed721a18",
"lib/git-utils.js.map": "d89514599a9a99f22b4085766d40af7b99974827",
"src/git-utils.ts": "a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96",
});
t.deepEqual(result, {
"lib/git-utils.js": "30d998ded095371488be3a729eb61d86ed721a18",
"lib/git-utils.js.map": "d89514599a9a99f22b4085766d40af7b99974827",
"src/git-utils.ts": "a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96",
});
t.deepEqual(runGitCommandStub.firstCall.args, [
"/fake/path",
["ls-files", "--recurse-submodules", "--stage"],
"Cannot list Git OIDs of tracked files.",
]);
});
// Second call (after getGitRoot) should include --recurse-submodules
t.deepEqual(runGitCommandStub.secondCall.args[1], [
"ls-files",
"--recurse-submodules",
"--stage",
]);
});
},
);
test.serial(
"getFileOidsUnderPath omits --recurse-submodules when no submodules exist",
async (t) => {
await withTmpDir(async (tmpDir) => {
const runGitCommandStub = sinon
.stub(gitUtils as any, "runGitCommand")
.callsFake(async (_cwd: any, args: any) => {
if (args[0] === "rev-parse") {
return `${tmpDir}\n`;
}
return (
"100644 30d998ded095371488be3a729eb61d86ed721a18 0\tlib/git-utils.js\n" +
"100644 a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96 0\tsrc/git-utils.ts"
);
});
const result = await gitUtils.getFileOidsUnderPath("/fake/path");
t.deepEqual(result, {
"lib/git-utils.js": "30d998ded095371488be3a729eb61d86ed721a18",
"src/git-utils.ts": "a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96",
});
// Second call (after getGitRoot) should NOT include --recurse-submodules
t.deepEqual(runGitCommandStub.secondCall.args[1], [
"ls-files",
"--stage",
]);
});
},
);
test.serial("getFileOidsUnderPath handles quoted paths", async (t) => {
sinon
.stub(gitUtils as any, "runGitCommand")
.resolves(
"100644 30d998ded095371488be3a729eb61d86ed721a18 0\tlib/normal-file.js\n" +
'100644 d89514599a9a99f22b4085766d40af7b99974827 0\t"lib/file with spaces.js"\n' +
'100644 a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96 0\t"lib/file\\twith\\ttabs.js"',
);
await withTmpDir(async (tmpDir) => {
sinon
.stub(gitUtils as any, "runGitCommand")
.callsFake(async (_cwd: any, args: any) => {
if (args[0] === "rev-parse") {
return `${tmpDir}\n`;
}
return (
"100644 30d998ded095371488be3a729eb61d86ed721a18 0\tlib/normal-file.js\n" +
'100644 d89514599a9a99f22b4085766d40af7b99974827 0\t"lib/file with spaces.js"\n' +
'100644 a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96 0\t"lib/file\\twith\\ttabs.js"'
);
});
const result = await gitUtils.getFileOidsUnderPath("/fake/path");
const result = await gitUtils.getFileOidsUnderPath("/fake/path");
t.deepEqual(result, {
"lib/normal-file.js": "30d998ded095371488be3a729eb61d86ed721a18",
"lib/file with spaces.js": "d89514599a9a99f22b4085766d40af7b99974827",
"lib/file\twith\ttabs.js": "a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96",
t.deepEqual(result, {
"lib/normal-file.js": "30d998ded095371488be3a729eb61d86ed721a18",
"lib/file with spaces.js": "d89514599a9a99f22b4085766d40af7b99974827",
"lib/file\twith\ttabs.js": "a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96",
});
});
});
test.serial("getFileOidsUnderPath handles empty output", async (t) => {
sinon.stub(gitUtils as any, "runGitCommand").resolves("");
await withTmpDir(async (tmpDir) => {
sinon
.stub(gitUtils as any, "runGitCommand")
.callsFake(async (_cwd: any, args: any) => {
if (args[0] === "rev-parse") {
return `${tmpDir}\n`;
}
return "";
});
const result = await gitUtils.getFileOidsUnderPath("/fake/path");
t.deepEqual(result, {});
const result = await gitUtils.getFileOidsUnderPath("/fake/path");
t.deepEqual(result, {});
});
});
test.serial(
"getFileOidsUnderPath throws on unexpected output format",
async (t) => {
sinon
.stub(gitUtils as any, "runGitCommand")
.resolves(
"100644 30d998ded095371488be3a729eb61d86ed721a18 0\tlib/git-utils.js\n" +
"invalid-line-format\n" +
"100644 a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96 0\tsrc/git-utils.ts",
);
await withTmpDir(async (tmpDir) => {
sinon
.stub(gitUtils as any, "runGitCommand")
.callsFake(async (_cwd: any, args: any) => {
if (args[0] === "rev-parse") {
return `${tmpDir}\n`;
}
return (
"100644 30d998ded095371488be3a729eb61d86ed721a18 0\tlib/git-utils.js\n" +
"invalid-line-format\n" +
"100644 a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96 0\tsrc/git-utils.ts"
);
});
await t.throwsAsync(
async () => {
await gitUtils.getFileOidsUnderPath("/fake/path");
},
{
instanceOf: Error,
message: 'Unexpected "git ls-files" output: invalid-line-format',
},
);
await t.throwsAsync(
async () => {
await gitUtils.getFileOidsUnderPath("/fake/path");
},
{
instanceOf: Error,
message: 'Unexpected "git ls-files" output: invalid-line-format',
},
);
});
},
);
+31 -9
@@ -1,4 +1,6 @@
import * as fs from "fs";
import * as os from "os";
import * as path from "path";
import * as core from "@actions/core";
import { ExecOptions } from "@actions/exec";
@@ -14,11 +16,11 @@ import {
import { ConfigurationError, getRequiredEnvParam } from "./util";
/**
* Minimum Git version required for overlay analysis. The
* `git ls-files --recurse-submodules` option, which is used by
* `getFileOidsUnderPath`, was introduced in Git 2.11.0.
* Minimum Git version required for overlay analysis in repositories that
* contain submodules. Support for using the `git ls-files
* --recurse-submodules` option with `--stage` was added in Git 2.36.0.
*/
export const GIT_MINIMUM_VERSION_FOR_OVERLAY = "2.11.0";
export const GIT_MINIMUM_VERSION_FOR_OVERLAY_WITH_SUBMODULES = "2.36.0";
/**
* Git version information
@@ -245,6 +247,16 @@ export const getGitRoot = async function (
}
};
/**
* Returns true if the Git repository has submodules registered (i.e. a
* `.gitmodules` file exists at the repository root).
*
* @param gitRoot The root of the Git repository.
*/
export function hasSubmodules(gitRoot: string): boolean {
return fs.existsSync(path.join(gitRoot, ".gitmodules"));
}
/**
* Returns the Git OIDs of all tracked files (in the index and in the working
* tree) that are under the given base path, including files in active
@@ -261,11 +273,21 @@ export const getFileOidsUnderPath = async function (
// Without the --full-name flag, the path is relative to the current working
// directory of the git command, which is basePath.
//
// We use --stage rather than --format here because --stage has been available since Git 2.11.0,
// while --format was only introduced in Git 2.38.0, which would limit overlay rollout.
// We use --stage rather than --format here because --format was only
// introduced in Git 2.38.0, which would limit overlay rollout.
//
// We only pass --recurse-submodules when the repository actually has
// submodules, because the combination of --recurse-submodules and --stage is
// only supported since Git 2.36.0.
const gitRoot = await getGitRoot(basePath);
const mayHaveSubmodules =
gitRoot === undefined ? true : hasSubmodules(gitRoot);
const args = mayHaveSubmodules
? ["ls-files", "--recurse-submodules", "--stage"]
: ["ls-files", "--stage"];
const stdout = await runGitCommand(
basePath,
["ls-files", "--recurse-submodules", "--stage"],
args,
"Cannot list Git OIDs of tracked files.",
);
@@ -280,8 +302,8 @@ export const getFileOidsUnderPath = async function (
const match = line.match(regex);
if (match) {
const oid = match[1];
const path = decodeGitFilePath(match[2]);
fileOidMap[path] = oid;
const filePath = decodeGitFilePath(match[2]);
fileOidMap[filePath] = oid;
} else {
throw new Error(`Unexpected "git ls-files" output: ${line}`);
}
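The `--stage` output consumed by the parsing loop above has the form `<mode> <oid> <stage>\t<path>`. As a minimal, self-contained sketch of the same parsing (the function name is hypothetical, and the real helper additionally decodes quoted paths via `decodeGitFilePath`):

```typescript
// Illustrative parser for `git ls-files --stage` output; the function name is
// hypothetical and the real helper also decodes quoted file paths.
function parseLsFilesStage(stdout: string): Record<string, string> {
  const fileOidMap: Record<string, string> = {};
  // "<mode> <oid> <stage>\t<path>"; OIDs are 40 hex chars (64 in SHA-256 repos).
  const regex = /^\d+ ([0-9a-f]{40,64}) \d\t(.+)$/;
  for (const line of stdout.split("\n").filter((l) => l.length > 0)) {
    const match = line.match(regex);
    if (match === null) {
      throw new Error(`Unexpected "git ls-files" output: ${line}`);
    }
    fileOidMap[match[2]] = match[1];
  }
  return fileOidMap;
}

const sample =
  "100644 30d998ded095371488be3a729eb61d86ed721a18 0\tlib/git-utils.js\n" +
  "100644 a47c11f5bfdca7661942d2c8f1b7209fb0dfdf96 0\tsrc/git-utils.ts";
const oids = parseLsFilesStage(sample);
```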
+1 -1
@@ -12,7 +12,7 @@ import { EnvVar } from "./environment";
import { Feature } from "./feature-flags";
import * as initActionPostHelper from "./init-action-post-helper";
import { getRunnerLogger } from "./logging";
import { OverlayDatabaseMode } from "./overlay";
import { OverlayDatabaseMode } from "./overlay/overlay-database-mode";
import * as overlayStatus from "./overlay/status";
import { parseRepositoryNwo } from "./repository";
import {
+1 -1
@@ -21,7 +21,7 @@ import * as dependencyCaching from "./dependency-caching";
import { EnvVar } from "./environment";
import { Feature, FeatureEnablement } from "./feature-flags";
import { Logger } from "./logging";
import { OverlayDatabaseMode } from "./overlay";
import { OverlayDatabaseMode } from "./overlay/overlay-database-mode";
import {
createOverlayStatus,
OverlayStatus,
+32 -48
@@ -58,13 +58,13 @@ import {
initConfig,
runDatabaseInitCluster,
} from "./init";
import { JavaEnvVars, KnownLanguage } from "./languages";
import { JavaEnvVars, BuiltInLanguage } from "./languages";
import { getActionsLogger, Logger, withGroupAsync } from "./logging";
import {
downloadOverlayBaseDatabaseFromCache,
OverlayBaseDatabaseDownloadStats,
OverlayDatabaseMode,
} from "./overlay";
} from "./overlay/caching";
import { OverlayDatabaseMode } from "./overlay/overlay-database-mode";
import { getRepositoryNwo, RepositoryNwo } from "./repository";
import { ToolsSource } from "./setup-codeql";
import {
@@ -330,7 +330,7 @@ async function run(startedAt: Date) {
// requested rust - don't enable it via language autodetection.
configUtils
.getRawLanguagesNoAutodetect(getOptionalInput("languages"))
.includes(KnownLanguage.rust)
.includes(BuiltInLanguage.rust)
) {
const experimental = "2.19.3";
const publicPreview = "2.22.1";
@@ -389,6 +389,15 @@ async function run(startedAt: Date) {
logger,
});
if (
config.languages.includes(BuiltInLanguage.swift) &&
process.platform !== "darwin"
) {
throw new ConfigurationError(
`Swift analysis is only supported on macOS runner images. Please migrate to a macOS runner.`,
);
}
if (repositoryPropertiesResult.isFailure()) {
addNoLanguageDiagnostic(
config,
@@ -456,18 +465,23 @@ async function run(startedAt: Date) {
// necessary preparations. So, in that mode, we would assume that
// everything is in order and let the analysis fail if that turns out not
// to be the case.
overlayBaseDatabaseStats = await downloadOverlayBaseDatabaseFromCache(
codeql,
config,
logger,
await withGroupAsync(
"Checking cache for overlay-base database",
async () => {
overlayBaseDatabaseStats = await downloadOverlayBaseDatabaseFromCache(
codeql,
config,
logger,
);
if (!overlayBaseDatabaseStats) {
config.overlayDatabaseMode = OverlayDatabaseMode.None;
logger.info(
"No overlay-base database found in cache, " +
`reverting overlay database mode to ${OverlayDatabaseMode.None}.`,
);
}
},
);
if (!overlayBaseDatabaseStats) {
config.overlayDatabaseMode = OverlayDatabaseMode.None;
logger.info(
"No overlay-base database found in cache, " +
`reverting overlay database mode to ${OverlayDatabaseMode.None}.`,
);
}
}
if (config.overlayDatabaseMode !== OverlayDatabaseMode.Overlay) {
@@ -500,16 +514,7 @@ async function run(startedAt: Date) {
}
if (
config.languages.includes(KnownLanguage.swift) &&
process.platform === "linux"
) {
logger.warning(
`Swift analysis on Ubuntu runner images is no longer supported. Please migrate to a macOS runner if this affects you.`,
);
}
if (
config.languages.includes(KnownLanguage.go) &&
config.languages.includes(BuiltInLanguage.go) &&
process.platform === "linux"
) {
try {
@@ -567,7 +572,7 @@ async function run(startedAt: Date) {
if (e instanceof FileCmdNotFoundError) {
addDiagnostic(
config,
KnownLanguage.go,
BuiltInLanguage.go,
makeDiagnostic(
"go/workflow/file-program-unavailable",
"The `file` program is required on Linux, but does not appear to be installed",
@@ -645,27 +650,6 @@ async function run(startedAt: Date) {
);
}
if (
await codeql.supportsFeature(
ToolsFeature.PythonDefaultIsToNotExtractStdlib,
)
) {
if (process.env["CODEQL_EXTRACTOR_PYTHON_EXTRACT_STDLIB"]) {
logger.debug(
"CODEQL_EXTRACTOR_PYTHON_EXTRACT_STDLIB is already set, so the Action will not override it.",
);
} else if (
!(await features.getValue(
Feature.PythonDefaultIsToNotExtractStdlib,
codeql,
))
) {
// We are in a situation where the feature flag is not rolled out,
// so we need to suppress the new default CLI behavior.
core.exportVariable("CODEQL_EXTRACTOR_PYTHON_EXTRACT_STDLIB", "true");
}
}
// If we are doing a Java `build-mode: none` analysis, then set the environment variable that
// enables the option in the Java extractor to minimize dependency jars. We also only do this if
// dependency caching is enabled, since the option is intended to reduce the size of dependency
@@ -682,7 +666,7 @@ async function run(startedAt: Date) {
(await codeQlVersionAtLeast(codeql, CODEQL_VERSION_JAR_MINIMIZATION)) &&
config.dependencyCachingEnabled &&
config.buildMode === BuildMode.None &&
config.languages.includes(KnownLanguage.java)
config.languages.includes(BuiltInLanguage.java)
) {
core.exportVariable(
EnvVar.JAVA_EXTRACTOR_MINIMIZE_DEPENDENCY_JARS,
+34 -34
@@ -15,7 +15,7 @@ import {
getFileCoverageInformationEnabled,
logFileCoverageOnPrsDeprecationWarning,
} from "./init";
import { KnownLanguage } from "./languages";
import { BuiltInLanguage } from "./languages";
import {
createFeatures,
LoggedMessage,
@@ -152,7 +152,7 @@ test("cleanupDatabaseClusterDirectory can disable warning with options", async (
});
type PackInfo = {
language: KnownLanguage;
language: BuiltInLanguage;
packinfoContents: string | undefined;
sourceOnlyPack?: boolean;
qlpackFileName?: string;
@@ -169,13 +169,13 @@ const testCheckPacksForOverlayCompatibility = test.macro({
expectedResult,
}: {
cliOverlayVersion: number | undefined;
languages: KnownLanguage[];
languages: BuiltInLanguage[];
packs: Record<string, PackInfo>;
expectedResult: boolean;
},
) => {
await withTmpDir(async (tmpDir) => {
const packDirsByLanguage = new Map<KnownLanguage, string[]>();
const packDirsByLanguage = new Map<BuiltInLanguage, string[]>();
for (const [packName, packInfo] of Object.entries(packs)) {
const packPath = path.join(tmpDir, packName);
@@ -242,10 +242,10 @@ test(
"returns false when CLI does not support overlay",
{
cliOverlayVersion: undefined,
languages: [KnownLanguage.java],
languages: [BuiltInLanguage.java],
packs: {
"codeql/java-queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: '{"overlayVersion":2}',
},
},
@@ -258,7 +258,7 @@ test(
"returns true when there are no query packs",
{
cliOverlayVersion: 2,
languages: [KnownLanguage.java],
languages: [BuiltInLanguage.java],
packs: {},
expectedResult: true,
},
@@ -269,10 +269,10 @@ test(
"returns true when query pack has not been compiled",
{
cliOverlayVersion: 2,
languages: [KnownLanguage.java],
languages: [BuiltInLanguage.java],
packs: {
"codeql/java-queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: undefined,
sourceOnlyPack: true,
},
@@ -286,10 +286,10 @@ test(
"returns true when query pack has expected overlay version",
{
cliOverlayVersion: 2,
languages: [KnownLanguage.java],
languages: [BuiltInLanguage.java],
packs: {
"codeql/java-queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: '{"overlayVersion":2}',
},
},
@@ -302,14 +302,14 @@ test(
"returns true when query packs for all languages to analyze are compatible",
{
cliOverlayVersion: 2,
languages: [KnownLanguage.cpp, KnownLanguage.java],
languages: [BuiltInLanguage.cpp, BuiltInLanguage.java],
packs: {
"codeql/cpp-queries": {
language: KnownLanguage.cpp,
language: BuiltInLanguage.cpp,
packinfoContents: '{"overlayVersion":2}',
},
"codeql/java-queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: '{"overlayVersion":2}',
},
},
@@ -322,14 +322,14 @@ test(
"returns true when query pack for a language not analyzed is incompatible",
{
cliOverlayVersion: 2,
languages: [KnownLanguage.java],
languages: [BuiltInLanguage.java],
packs: {
"codeql/cpp-queries": {
language: KnownLanguage.cpp,
language: BuiltInLanguage.cpp,
packinfoContents: undefined,
},
"codeql/java-queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: '{"overlayVersion":2}',
},
},
@@ -342,14 +342,14 @@ test(
"returns false when query pack for a language to analyze is incompatible",
{
cliOverlayVersion: 2,
languages: [KnownLanguage.cpp, KnownLanguage.java],
languages: [BuiltInLanguage.cpp, BuiltInLanguage.java],
packs: {
"codeql/cpp-queries": {
language: KnownLanguage.cpp,
language: BuiltInLanguage.cpp,
packinfoContents: '{"overlayVersion":1}',
},
"codeql/java-queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: '{"overlayVersion":2}',
},
},
@@ -362,14 +362,14 @@ test(
"returns false when query pack is missing .packinfo",
{
cliOverlayVersion: 2,
languages: [KnownLanguage.java],
languages: [BuiltInLanguage.java],
packs: {
"codeql/java-queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: '{"overlayVersion":2}',
},
"custom/queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: undefined,
},
},
@@ -382,14 +382,14 @@ test(
"returns false when query pack has different overlay version",
{
cliOverlayVersion: 2,
languages: [KnownLanguage.java],
languages: [BuiltInLanguage.java],
packs: {
"codeql/java-queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: '{"overlayVersion":2}',
},
"custom/queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: '{"overlayVersion":1}',
},
},
@@ -402,14 +402,14 @@ test(
"returns false when query pack is missing overlayVersion in .packinfo",
{
cliOverlayVersion: 2,
languages: [KnownLanguage.java],
languages: [BuiltInLanguage.java],
packs: {
"codeql/java-queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: '{"overlayVersion":2}',
},
"custom/queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: "{}",
},
},
@@ -422,14 +422,14 @@ test(
"returns false when .packinfo is not valid JSON",
{
cliOverlayVersion: 2,
languages: [KnownLanguage.java],
languages: [BuiltInLanguage.java],
packs: {
"codeql/java-queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: '{"overlayVersion":2}',
},
"custom/queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: "this_is_not_valid_json",
},
},
@@ -442,10 +442,10 @@ test(
"returns true when query pack uses codeql-pack.yml filename",
{
cliOverlayVersion: 2,
languages: [KnownLanguage.java],
languages: [BuiltInLanguage.java],
packs: {
"codeql/java-queries": {
language: KnownLanguage.java,
language: BuiltInLanguage.java,
packinfoContents: '{"overlayVersion":2}',
qlpackFileName: "codeql-pack.yml",
},
+2 -2
@@ -26,7 +26,7 @@ import {
RepositoryProperties,
RepositoryPropertyName,
} from "./feature-flags/properties";
import { KnownLanguage, Language } from "./languages";
import { BuiltInLanguage, Language } from "./languages";
import { Logger, withGroupAsync } from "./logging";
import { ToolsSource } from "./setup-codeql";
import { ZstdAvailability } from "./tar";
@@ -235,7 +235,7 @@ export async function checkInstallPython311(
codeql: CodeQL,
) {
if (
languages.includes(KnownLanguage.python) &&
languages.includes(BuiltInLanguage.python) &&
process.platform === "win32" &&
!(await codeql.getVersion()).features?.supportsPython312
) {
+46
@@ -0,0 +1,46 @@
import test from "ava";
import { setupTests } from "../testing-utils";
import * as json from ".";
setupTests(test);
const testSchema = {
requiredKey: json.string,
};
const optionalSchema = {
optionalKey: json.optional(json.string),
};
test("validateSchema - required properties are required", async (t) => {
t.false(json.validateSchema(testSchema, {}));
t.false(json.validateSchema(testSchema, { requiredKey: undefined }));
t.false(json.validateSchema(testSchema, { requiredKey: null }));
t.false(json.validateSchema(testSchema, { requiredKey: 0 }));
t.false(json.validateSchema(testSchema, { requiredKey: 123 }));
t.false(json.validateSchema(testSchema, { requiredKey: false }));
t.false(json.validateSchema(testSchema, { requiredKey: true }));
t.false(json.validateSchema(testSchema, { requiredKey: [] }));
t.false(json.validateSchema(testSchema, { requiredKey: {} }));
t.true(json.validateSchema(testSchema, { requiredKey: "" }));
t.true(json.validateSchema(testSchema, { requiredKey: "foo" }));
});
test("validateSchema - optional properties are optional", async (t) => {
// Optional fields may be absent
t.true(json.validateSchema(optionalSchema, {}));
t.true(json.validateSchema(optionalSchema, { optionalKey: undefined }));
t.true(json.validateSchema(optionalSchema, { optionalKey: null }));
// But, if present, should have the expected type
t.false(json.validateSchema(optionalSchema, { optionalKey: 0 }));
t.false(json.validateSchema(optionalSchema, { optionalKey: 123 }));
t.false(json.validateSchema(optionalSchema, { optionalKey: false }));
t.false(json.validateSchema(optionalSchema, { optionalKey: true }));
t.false(json.validateSchema(optionalSchema, { optionalKey: [] }));
t.false(json.validateSchema(optionalSchema, { optionalKey: {} }));
t.true(json.validateSchema(optionalSchema, { optionalKey: "" }));
t.true(json.validateSchema(optionalSchema, { optionalKey: "foo" }));
});
+79
@@ -36,3 +36,82 @@ export function isStringOrUndefined(
): value is string | undefined {
return value === undefined || isString(value);
}
/**
* Represents a field of type `T` in a schema.
 * Carries a validation function and a flag indicating whether the field is required.
*/
export type Validator<T> = {
validate: (val: unknown) => val is T;
required: boolean;
};
/** Extracts `T` from `Validator<T>`. */
export type UnwrapValidator<V> = V extends Validator<infer A> ? A : never;
/** A validator for string fields in schemas. */
export const string = {
validate: isString,
required: true,
} as const satisfies Validator<string>;
/** Transforms a validator to be optional. */
export function optional<T>(validator: Validator<T>) {
return {
validate: (val: unknown) => {
return val === undefined || val === null || validator.validate(val);
},
required: false,
} as const satisfies Validator<T | undefined | null>;
}
/** Represents an arbitrary object schema. */
export type Schema = Record<string, Validator<any>>;
/** Extracts the required keys from `S`. */
export type RequiredKeys<S extends Schema> = {
[K in keyof S]: S[K]["required"] extends true ? K : never;
}[keyof S];
/** Extracts optional keys from `S`. */
export type OptionalKeys<S extends Schema> = {
[K in keyof S]: S[K]["required"] extends true ? never : K;
}[keyof S];
/** Constructs an object type corresponding to a schema. */
export type FromSchema<S extends Schema> = {
[K in RequiredKeys<S>]: UnwrapValidator<S[K]>;
} & { [K in OptionalKeys<S>]?: UnwrapValidator<S[K]> };
/**
* Validates that `obj` satisfies at least `schema`. Additional keys are accepted.
*
* @param schema The schema to validate against.
* @param obj The object to validate.
 * @returns True if `obj` matches the schema; on success, narrows the type of `obj` to the schema's object type.
*/
export function validateSchema<S extends Schema>(
schema: S,
obj: UnvalidatedObject<any>,
): obj is FromSchema<S> {
for (const [key, validator] of Object.entries(schema)) {
const hasKey = key in obj;
// If the property is required, but absent, fail.
if (validator.required && !hasKey) {
return false;
}
// If the property is required, but undefined or null, fail.
if (validator.required && (obj[key] === undefined || obj[key] === null)) {
return false;
}
// If the property is present, validate it.
if (hasKey && !validator.validate(obj[key])) {
return false;
}
}
return true;
}
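The validator and schema helpers above compose as follows. This is a simplified, self-contained re-implementation for illustration only (not the module's actual exports), using a hypothetical schema:

```typescript
// Simplified re-implementation of the schema helpers, for illustration only.
type Validator<T> = { validate: (val: unknown) => val is T; required: boolean };

const string: Validator<string> = {
  validate: (val: unknown): val is string => typeof val === "string",
  required: true,
};

function optional<T>(inner: Validator<T>): Validator<T | undefined | null> {
  return {
    validate: (val: unknown): val is T | undefined | null =>
      val === undefined || val === null || inner.validate(val),
    required: false,
  };
}

function validateSchema(
  schema: Record<string, Validator<any>>,
  obj: Record<string, unknown>,
): boolean {
  for (const [key, validator] of Object.entries(schema)) {
    const present = key in obj && obj[key] !== undefined && obj[key] !== null;
    if (validator.required && !present) return false; // missing required field
    if (key in obj && !validator.validate(obj[key])) return false; // wrong type
  }
  return true;
}

// Hypothetical schema with one required and one optional string field.
const registrySchema = { url: string, token: optional(string) };
```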
+106
@@ -0,0 +1,106 @@
import { ExecutionContext } from "ava";
import * as json from ".";
/**
* Constructs an object based on `schema` for unit tests.
 * Assumes that all fields in `schema` are string-typed.
*
* @param includeOptional Whether to include optional properties.
* @param schema The schema to base the object on.
* @returns An object that satisfies `schema`.
*/
export function makeFromSchema<S extends json.Schema>(
includeOptional: boolean,
schema: S,
): json.FromSchema<S> {
const result = {};
for (const [key, validator] of Object.entries(schema)) {
if (!validator.required && !includeOptional) {
continue;
}
result[key] = `value-for-${key}`;
}
return result as json.FromSchema<S>;
}
/** Options for `withSchemaMatrix`. */
export interface SchemaMatrixOptions {
/** Whether to exclude cases where an optional property is entirely absent. */
excludeAbsent?: boolean;
}
/**
* Constructs a test matrix of possible objects for `schema`: all required properties
* plus all permutations of possible states for the optional properties.
 *
 * @param t The test execution context, used to log the failing case on error.
 * @param schema The schema to construct a test matrix for.
 * @param opts Options controlling which cases are included in the matrix.
 * @param body The test body to call with each value from the test matrix.
*/
export function withSchemaMatrix<S extends json.Schema>(
t: ExecutionContext<any>,
schema: S,
opts: SchemaMatrixOptions,
body: (value: json.FromSchema<S>) => void,
): void {
// Construct a base object that includes all required properties.
const required = makeFromSchema(false, schema);
// Identify optional properties.
const optionalKeys: Array<keyof S> = [];
for (const [key, validator] of Object.entries(schema)) {
if (!validator.required) {
optionalKeys.push(key);
}
}
const optionalValues = (key: keyof S) => [
null,
undefined,
`value-for-${String(key)}`,
];
// Constructs an array of test objects, starting with `required` and combining it with all
// possible states of each optional property. For example, with default settings:
//
// For { requiredKey: string }, we get: `[{ requiredKey: "some-string-value" }]`
//
// For { requiredKey: string, optionalKey?: string }, we get:
// [ { requiredKey: "some-string-value" },
// { requiredKey: "some-string-value", optionalKey: undefined },
// { requiredKey: "some-string-value", optionalKey: null },
// { requiredKey: "some-string-value", optionalKey: "some-value" },
// ]
const permutations = (keys: Array<keyof S>) => {
if (keys.length === 0) return [required];
const bases = permutations(keys.slice(1));
const result: Array<json.FromSchema<S>> = [];
const optionalKey = keys[0];
for (const base of bases) {
if (!opts.excludeAbsent) {
// Optional keys can be absent entirely.
result.push(base);
}
// Or be present and have one of the `optionalValues`.
for (const optionalValue of optionalValues(optionalKey)) {
result.push({ ...base, [optionalKey]: optionalValue });
}
}
return result;
};
// Call `body` for all test cases.
const testCases = permutations(optionalKeys);
for (const testCase of testCases) {
try {
body(testCase);
} catch (err) {
t.log(testCase);
throw err;
}
}
}
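The recursive construction above can be sketched in isolation. This is an illustrative stand-alone version (names are made up) showing how the number of cases grows multiplicatively with the number of optional keys:

```typescript
// Stand-alone sketch of the permutation construction; names are made up.
function buildMatrix(
  base: Record<string, unknown>,
  optionalKeys: string[],
  states: unknown[],
): Array<Record<string, unknown>> {
  if (optionalKeys.length === 0) return [base];
  const rest = buildMatrix(base, optionalKeys.slice(1), states);
  const result: Array<Record<string, unknown>> = [];
  for (const partial of rest) {
    result.push(partial); // the optional key is absent entirely
    for (const state of states) {
      result.push({ ...partial, [optionalKeys[0]]: state }); // or present
    }
  }
  return result;
}

// One required key, two optional keys, each optional key in 3 states
// (absent, null, or a value): (1 + 2)^2 = 9 cases.
const cases = buildMatrix({ requiredKey: "v" }, ["a", "b"], [null, "x"]);
```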
-29
@@ -1,29 +0,0 @@
/** A language to analyze with CodeQL. */
export type Language = string;
/**
* A language supported by CodeQL that is treated specially by the Action.
*
* This is not an exhaustive list of languages supported by CodeQL and new
* languages do not need to be added here.
*/
export enum KnownLanguage {
actions = "actions",
cpp = "cpp",
csharp = "csharp",
go = "go",
java = "java",
javascript = "javascript",
python = "python",
ruby = "ruby",
rust = "rust",
swift = "swift",
}
/** Java-specific environment variable names that we may care about. */
export enum JavaEnvVars {
JAVA_HOME = "JAVA_HOME",
JAVA_TOOL_OPTIONS = "JAVA_TOOL_OPTIONS",
JDK_JAVA_OPTIONS = "JDK_JAVA_OPTIONS",
_JAVA_OPTIONS = "_JAVA_OPTIONS",
}
+25
@@ -0,0 +1,25 @@
{
"languages": [
"actions",
"cpp",
"csharp",
"go",
"java",
"javascript",
"python",
"ruby",
"rust",
"swift"
],
"aliases": {
"c": "cpp",
"c-c++": "cpp",
"c-cpp": "cpp",
"c#": "csharp",
"c++": "cpp",
"java-kotlin": "java",
"javascript-typescript": "javascript",
"kotlin": "java",
"typescript": "javascript"
}
}
+46
@@ -0,0 +1,46 @@
import test from "ava";
import { setupTests } from "../testing-utils";
import knownLanguagesData from "./builtin.json";
import { isBuiltInLanguage, BuiltInLanguage, parseBuiltInLanguage } from ".";
setupTests(test);
test("parseBuiltInLanguage", (t) => {
// Exact matches
t.is(parseBuiltInLanguage("csharp"), BuiltInLanguage.csharp);
t.is(parseBuiltInLanguage("cpp"), BuiltInLanguage.cpp);
t.is(parseBuiltInLanguage("go"), BuiltInLanguage.go);
t.is(parseBuiltInLanguage("java"), BuiltInLanguage.java);
t.is(parseBuiltInLanguage("javascript"), BuiltInLanguage.javascript);
t.is(parseBuiltInLanguage("python"), BuiltInLanguage.python);
t.is(parseBuiltInLanguage("rust"), BuiltInLanguage.rust);
// Aliases
t.is(parseBuiltInLanguage(" \t\nCsHaRp\t\t"), BuiltInLanguage.csharp);
t.is(parseBuiltInLanguage("c"), BuiltInLanguage.cpp);
t.is(parseBuiltInLanguage("c++"), BuiltInLanguage.cpp);
t.is(parseBuiltInLanguage("kotlin"), BuiltInLanguage.java);
t.is(parseBuiltInLanguage("typescript"), BuiltInLanguage.javascript);
  // Whitespace and case-insensitivity
t.is(parseBuiltInLanguage(" \t\nkOtLin\t\t"), BuiltInLanguage.java);
  // Identity and non-matches
t.is(parseBuiltInLanguage(BuiltInLanguage.python), BuiltInLanguage.python);
t.is(parseBuiltInLanguage("foo"), undefined);
t.is(parseBuiltInLanguage(" "), undefined);
t.is(parseBuiltInLanguage(""), undefined);
});
test("isBuiltInLanguage matches the curated built-in language set", (t) => {
t.true(isBuiltInLanguage(BuiltInLanguage.actions));
t.true(isBuiltInLanguage(BuiltInLanguage.swift));
t.false(isBuiltInLanguage("typescript"));
});
test("BuiltInLanguage enum matches builtin.json", (t) => {
t.deepEqual(Object.values(BuiltInLanguage), knownLanguagesData.languages);
});
+57
@@ -0,0 +1,57 @@
import knownLanguagesData from "./builtin.json";
/** A language to analyze with CodeQL. */
export type Language = string;
/** A language built into the `defaults.json` CodeQL distribution. */
export enum BuiltInLanguage {
actions = "actions",
cpp = "cpp",
csharp = "csharp",
go = "go",
java = "java",
javascript = "javascript",
python = "python",
ruby = "ruby",
rust = "rust",
swift = "swift",
}
/** Java-specific environment variable names that we may care about. */
export enum JavaEnvVars {
JAVA_HOME = "JAVA_HOME",
JAVA_TOOL_OPTIONS = "JAVA_TOOL_OPTIONS",
JDK_JAVA_OPTIONS = "JDK_JAVA_OPTIONS",
_JAVA_OPTIONS = "_JAVA_OPTIONS",
}
const builtInLanguageSet = new Set<string>(knownLanguagesData.languages);
export function isBuiltInLanguage(
language: string,
): language is BuiltInLanguage {
return builtInLanguageSet.has(language);
}
/**
* Parse a language input corresponding to a built-in language into its canonical CodeQL language
* name.
*
* This uses the language aliases shipped with the Action and will not be able to resolve aliases
* added by third-party CodeQL language support or versions of the CodeQL CLI newer than the one
* mentioned in `defaults.json`. Therefore, this function should only be used when the CodeQL CLI is
* not available.
*/
export function parseBuiltInLanguage(
language: string,
): BuiltInLanguage | undefined {
language = language.trim().toLowerCase();
language =
knownLanguagesData.aliases[
language as keyof typeof knownLanguagesData.aliases
] ?? language;
if (isBuiltInLanguage(language)) {
return language;
}
return undefined;
}
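The normalization performed here (trim, lowercase, alias lookup, membership check) can be sketched with the data inlined. Hypothetical stand-alone version; the real function reads the language list and alias table from `builtin.json`:

```typescript
// Stand-alone sketch with a subset of the alias data inlined; the real
// function reads the language list and alias table from builtin.json.
const builtInLanguages = new Set(["cpp", "java", "javascript", "python"]);
const languageAliases: Record<string, string> = {
  c: "cpp",
  "c++": "cpp",
  kotlin: "java",
  typescript: "javascript",
};

function parseLanguage(input: string): string | undefined {
  const normalized = input.trim().toLowerCase();
  // Resolve aliases to the canonical CodeQL language name.
  const canonical = languageAliases[normalized] ?? normalized;
  return builtInLanguages.has(canonical) ? canonical : undefined;
}
```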
+287
@@ -0,0 +1,287 @@
import * as fs from "fs";
import * as path from "path";
import * as actionsCache from "@actions/cache";
import test from "ava";
import * as sinon from "sinon";
import * as actionsUtil from "../actions-util";
import * as apiClient from "../api-client";
import { ResolveDatabaseOutput } from "../codeql";
import * as gitUtils from "../git-utils";
import { BuiltInLanguage } from "../languages";
import { getRunnerLogger } from "../logging";
import {
createTestConfig,
mockCodeQLVersion,
setupTests,
} from "../testing-utils";
import * as utils from "../util";
import { withTmpDir } from "../util";
import {
downloadOverlayBaseDatabaseFromCache,
getCacheRestoreKeyPrefix,
getCacheSaveKey,
} from "./caching";
import { OverlayDatabaseMode } from "./overlay-database-mode";
setupTests(test);
interface DownloadOverlayBaseDatabaseTestCase {
overlayDatabaseMode: OverlayDatabaseMode;
useOverlayDatabaseCaching: boolean;
isInTestMode: boolean;
restoreCacheResult: string | undefined | Error;
hasBaseDatabaseOidsFile: boolean;
tryGetFolderBytesSucceeds: boolean;
codeQLVersion: string;
resolveDatabaseOutput: ResolveDatabaseOutput | Error;
}
const defaultDownloadTestCase: DownloadOverlayBaseDatabaseTestCase = {
overlayDatabaseMode: OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: true,
isInTestMode: false,
restoreCacheResult: "cache-key",
hasBaseDatabaseOidsFile: true,
tryGetFolderBytesSucceeds: true,
codeQLVersion: "2.20.5",
resolveDatabaseOutput: { overlayBaseSpecifier: "20250626:XXX" },
};
const testDownloadOverlayBaseDatabaseFromCache = test.macro({
exec: async (
t,
_title: string,
partialTestCase: Partial<DownloadOverlayBaseDatabaseTestCase>,
expectDownloadSuccess: boolean,
) => {
await withTmpDir(async (tmpDir) => {
const dbLocation = path.join(tmpDir, "db");
await fs.promises.mkdir(dbLocation, { recursive: true });
const logger = getRunnerLogger(true);
const testCase = { ...defaultDownloadTestCase, ...partialTestCase };
const config = createTestConfig({
dbLocation,
languages: [BuiltInLanguage.java],
});
config.overlayDatabaseMode = testCase.overlayDatabaseMode;
config.useOverlayDatabaseCaching = testCase.useOverlayDatabaseCaching;
if (testCase.hasBaseDatabaseOidsFile) {
const baseDatabaseOidsFile = path.join(
dbLocation,
"base-database-oids.json",
);
await fs.promises.writeFile(baseDatabaseOidsFile, JSON.stringify({}));
}
const stubs: sinon.SinonStub[] = [];
const getAutomationIDStub = sinon
.stub(apiClient, "getAutomationID")
.resolves("test-automation-id/");
stubs.push(getAutomationIDStub);
const isInTestModeStub = sinon
.stub(utils, "isInTestMode")
.returns(testCase.isInTestMode);
stubs.push(isInTestModeStub);
if (testCase.restoreCacheResult instanceof Error) {
const restoreCacheStub = sinon
.stub(actionsCache, "restoreCache")
.rejects(testCase.restoreCacheResult);
stubs.push(restoreCacheStub);
} else {
const restoreCacheStub = sinon
.stub(actionsCache, "restoreCache")
.resolves(testCase.restoreCacheResult);
stubs.push(restoreCacheStub);
}
const tryGetFolderBytesStub = sinon
.stub(utils, "tryGetFolderBytes")
.resolves(testCase.tryGetFolderBytesSucceeds ? 1024 * 1024 : undefined);
stubs.push(tryGetFolderBytesStub);
const codeql = mockCodeQLVersion(testCase.codeQLVersion);
if (testCase.resolveDatabaseOutput instanceof Error) {
const resolveDatabaseStub = sinon
.stub(codeql, "resolveDatabase")
.rejects(testCase.resolveDatabaseOutput);
stubs.push(resolveDatabaseStub);
} else {
const resolveDatabaseStub = sinon
.stub(codeql, "resolveDatabase")
.resolves(testCase.resolveDatabaseOutput);
stubs.push(resolveDatabaseStub);
}
try {
const result = await downloadOverlayBaseDatabaseFromCache(
codeql,
config,
logger,
);
if (expectDownloadSuccess) {
t.truthy(result);
} else {
t.is(result, undefined);
}
} finally {
for (const stub of stubs) {
stub.restore();
}
}
});
},
title: (_, title) => `downloadOverlayBaseDatabaseFromCache: ${title}`,
});
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns stats when successful",
{},
true,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when mode is OverlayDatabaseMode.OverlayBase",
{
overlayDatabaseMode: OverlayDatabaseMode.OverlayBase,
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when mode is OverlayDatabaseMode.None",
{
overlayDatabaseMode: OverlayDatabaseMode.None,
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when caching is disabled",
{
useOverlayDatabaseCaching: false,
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined in test mode",
{
isInTestMode: true,
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when cache miss",
{
restoreCacheResult: undefined,
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when download fails",
{
restoreCacheResult: new Error("Download failed"),
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when downloaded database is invalid",
{
hasBaseDatabaseOidsFile: false,
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when downloaded database doesn't have an overlayBaseSpecifier",
{
resolveDatabaseOutput: {},
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when resolving database metadata fails",
{
resolveDatabaseOutput: new Error("Failed to resolve database metadata"),
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when filesystem error occurs",
{
tryGetFolderBytesSucceeds: false,
},
false,
);
test.serial("overlay-base database cache keys remain stable", async (t) => {
const logger = getRunnerLogger(true);
const config = createTestConfig({ languages: ["python", "javascript"] });
const codeQlVersion = "2.23.0";
const commitOid = "abc123def456";
sinon.stub(apiClient, "getAutomationID").resolves("test-automation-id/");
sinon.stub(gitUtils, "getCommitOid").resolves(commitOid);
sinon.stub(actionsUtil, "getWorkflowRunID").returns(12345);
sinon.stub(actionsUtil, "getWorkflowRunAttempt").returns(1);
const saveKey = await getCacheSaveKey(
config,
codeQlVersion,
"checkout-path",
logger,
);
const expectedSaveKey =
"codeql-overlay-base-database-1-c5666c509a2d9895-javascript_python-2.23.0-abc123def456-12345-1";
t.is(
saveKey,
expectedSaveKey,
"Cache save key changed unexpectedly. " +
"This may indicate breaking changes in the cache key generation logic.",
);
const restoreKeyPrefix = await getCacheRestoreKeyPrefix(
config,
codeQlVersion,
);
const expectedRestoreKeyPrefix =
"codeql-overlay-base-database-1-c5666c509a2d9895-javascript_python-2.23.0-";
t.is(
restoreKeyPrefix,
expectedRestoreKeyPrefix,
"Cache restore key prefix changed unexpectedly. " +
"This may indicate breaking changes in the cache key generation logic.",
);
t.true(
saveKey.startsWith(restoreKeyPrefix),
`Expected save key "${saveKey}" to start with restore key prefix "${restoreKeyPrefix}"`,
);
});
+428
@@ -0,0 +1,428 @@
import * as fs from "fs";
import * as actionsCache from "@actions/cache";
import {
getRequiredInput,
getWorkflowRunAttempt,
getWorkflowRunID,
} from "../actions-util";
import { getAutomationID } from "../api-client";
import { createCacheKeyHash } from "../caching-utils";
import { type CodeQL } from "../codeql";
import { type Config } from "../config-utils";
import { getCommitOid } from "../git-utils";
import { Logger, withGroupAsync } from "../logging";
import {
CleanupLevel,
getBaseDatabaseOidsFilePath,
getCodeQLDatabasePath,
getErrorMessage,
isInTestMode,
tryGetFolderBytes,
waitForResultWithTimeLimit,
} from "../util";
import { OverlayDatabaseMode } from "./overlay-database-mode";
/**
* The maximum (uncompressed) size of the overlay base database that we will
* upload. By default, the Actions Cache has an overall capacity of 10 GB, and
* the Actions Cache client library uses zstd compression.
*
* Ideally we would apply a size limit to the compressed overlay-base database,
* but we cannot do so because compression is handled transparently by the
* Actions Cache client library. Instead we place a limit on the uncompressed
* size of the overlay-base database.
*
* Assuming 2.5:1 compression ratio, the 7.5 GB limit on uncompressed data would
* translate to a limit of around 3 GB after compression.
*/
const OVERLAY_BASE_DATABASE_MAX_UPLOAD_SIZE_MB = 7500;
const OVERLAY_BASE_DATABASE_MAX_UPLOAD_SIZE_BYTES =
OVERLAY_BASE_DATABASE_MAX_UPLOAD_SIZE_MB * 1_000_000;
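As an aside on the sizing comment above, the arithmetic is easy to make explicit. The 2.5:1 zstd ratio is the comment's stated assumption, not a measured guarantee:

```typescript
// Estimate the compressed size implied by the uncompressed limit, under the
// assumed 2.5:1 compression ratio from the comment above.
const maxUncompressedMB = 7500;
const assumedCompressionRatio = 2.5;
const estimatedCompressedMB = maxUncompressedMB / assumedCompressionRatio;
console.log(estimatedCompressedMB); // 3000
```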
// Constants for database caching
const CACHE_VERSION = 1;
const CACHE_PREFIX = "codeql-overlay-base-database";
// The purpose of this ten-minute limit is to guard against the possibility
// that the cache service is unresponsive, which would otherwise cause the
// entire action to hang. Normally we expect cache operations to complete
// within two minutes.
const MAX_CACHE_OPERATION_MS = 600_000;
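The `waitForResultWithTimeLimit` helper is imported from `../util` and its body is not part of this diff. Assuming it races the operation against a timer and returns `undefined` on timeout (the names below are illustrative, not the real implementation), a minimal sketch could be:

```typescript
// Hypothetical sketch of a time-limit wrapper: race the operation against a
// timer, invoking onTimeout and returning undefined if the timer wins.
async function waitWithTimeLimit<T>(
  maxMs: number,
  op: Promise<T>,
  onTimeout: () => void,
): Promise<T | undefined> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<undefined>((resolve) => {
    timer = setTimeout(() => {
      onTimeout();
      resolve(undefined);
    }, maxMs);
  });
  const result = await Promise.race([op, timeout]);
  if (timer !== undefined) {
    clearTimeout(timer);
  }
  return result;
}
```

Note that the losing promise is not cancelled and keeps running in the background, which is exactly the hazard the long comment in `downloadOverlayBaseDatabaseFromCache` below discusses: the timeout must be sized for a hung cache service, not merely a slow download.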
/**
 * Checks that the overlay-base database is valid by verifying that the base
 * database OIDs file exists and that each per-language database defines an
 * overlay base specifier.
 *
 * @param codeql The CodeQL instance
 * @param config The configuration object
* @param logger The logger instance
* @param warningPrefix Prefix for the check failure warning message
* @returns True if the verification succeeded, false otherwise
*/
async function checkOverlayBaseDatabase(
codeql: CodeQL,
config: Config,
logger: Logger,
warningPrefix: string,
): Promise<boolean> {
// An overlay-base database should contain the base database OIDs file.
const baseDatabaseOidsFilePath = getBaseDatabaseOidsFilePath(config);
if (!fs.existsSync(baseDatabaseOidsFilePath)) {
logger.warning(
`${warningPrefix}: ${baseDatabaseOidsFilePath} does not exist`,
);
return false;
}
for (const language of config.languages) {
const dbPath = getCodeQLDatabasePath(config, language);
try {
const resolveDatabaseOutput = await codeql.resolveDatabase(dbPath);
if (
resolveDatabaseOutput === undefined ||
!("overlayBaseSpecifier" in resolveDatabaseOutput)
) {
logger.info(`${warningPrefix}: no overlayBaseSpecifier defined`);
return false;
} else {
logger.debug(
`Found overlay base specifier for ${language} overlay-base database: ` +
`${resolveDatabaseOutput.overlayBaseSpecifier}`,
);
}
} catch (e) {
logger.warning(`${warningPrefix}: failed to resolve database: ${e}`);
return false;
}
}
return true;
}
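The per-language check above boils down to a structural test on the object returned by `codeql resolve database`. A self-contained sketch, with the object shape simplified to mirror the test fixtures that appear later in this diff (the real `ResolveDatabaseOutput` type lives in `../codeql`):

```typescript
// Simplified shape of the `codeql resolve database` output.
type ResolveDatabaseOutputShape = { overlayBaseSpecifier?: string };

// Mirrors the validity condition used in checkOverlayBaseDatabase above.
function hasOverlayBaseSpecifier(
  out: ResolveDatabaseOutputShape | undefined,
): boolean {
  return out !== undefined && "overlayBaseSpecifier" in out;
}

console.log(hasOverlayBaseSpecifier({ overlayBaseSpecifier: "20250626:XXX" })); // true
console.log(hasOverlayBaseSpecifier({})); // false
```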
/**
* Uploads the overlay-base database to the GitHub Actions cache. If conditions
* for uploading are not met, the function does nothing and returns false.
*
* This function uses the `checkout_path` input to determine the repository path
* and works only when called from `analyze` or `upload-sarif`.
*
* @param codeql The CodeQL instance
* @param config The configuration object
* @param logger The logger instance
* @returns A promise that resolves to true if the upload was performed and
* successfully completed, or false otherwise
*/
export async function cleanupAndUploadOverlayBaseDatabaseToCache(
codeql: CodeQL,
config: Config,
logger: Logger,
): Promise<boolean> {
const overlayDatabaseMode = config.overlayDatabaseMode;
if (overlayDatabaseMode !== OverlayDatabaseMode.OverlayBase) {
logger.debug(
`Overlay database mode is ${overlayDatabaseMode}. ` +
"Skip uploading overlay-base database to cache.",
);
return false;
}
if (!config.useOverlayDatabaseCaching) {
logger.debug(
"Overlay database caching is disabled. " +
"Skip uploading overlay-base database to cache.",
);
return false;
}
if (isInTestMode()) {
logger.debug(
"In test mode. Skip uploading overlay-base database to cache.",
);
return false;
}
const databaseIsValid = await checkOverlayBaseDatabase(
codeql,
config,
logger,
"Abort uploading overlay-base database to cache",
);
if (!databaseIsValid) {
return false;
}
// Clean up the database using the overlay cleanup level.
await withGroupAsync("Cleaning up databases", async () => {
await codeql.databaseCleanupCluster(config, CleanupLevel.Overlay);
});
const dbLocation = config.dbLocation;
const databaseSizeBytes = await tryGetFolderBytes(dbLocation, logger);
if (databaseSizeBytes === undefined) {
logger.warning(
"Failed to determine database size. " +
"Skip uploading overlay-base database to cache.",
);
return false;
}
if (databaseSizeBytes > OVERLAY_BASE_DATABASE_MAX_UPLOAD_SIZE_BYTES) {
const databaseSizeMB = Math.round(databaseSizeBytes / 1_000_000);
logger.warning(
`Database size (${databaseSizeMB} MB) ` +
`exceeds maximum upload size (${OVERLAY_BASE_DATABASE_MAX_UPLOAD_SIZE_MB} MB). ` +
"Skip uploading overlay-base database to cache.",
);
return false;
}
const codeQlVersion = (await codeql.getVersion()).version;
const checkoutPath = getRequiredInput("checkout_path");
const cacheSaveKey = await getCacheSaveKey(
config,
codeQlVersion,
checkoutPath,
logger,
);
logger.info(
`Uploading overlay-base database to Actions cache with key ${cacheSaveKey}`,
);
try {
const cacheId = await waitForResultWithTimeLimit(
MAX_CACHE_OPERATION_MS,
actionsCache.saveCache([dbLocation], cacheSaveKey),
() => {},
);
if (cacheId === undefined) {
logger.warning("Timed out while uploading overlay-base database");
return false;
}
} catch (error) {
logger.warning(
"Failed to upload overlay-base database to cache: " +
`${error instanceof Error ? error.message : String(error)}`,
);
return false;
}
logger.info(`Successfully uploaded overlay-base database from ${dbLocation}`);
return true;
}
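The `tryGetFolderBytes` helper that gates the size check above is likewise imported from `../util` without its body appearing in this diff. Assuming it walks the directory tree, sums file sizes, and reports `undefined` on any filesystem error (the behavior the upload and download paths rely on), a rough sketch might look like this (symlinks ignored for brevity; `folderBytes` is an illustrative name):

```typescript
import * as fs from "fs";
import * as path from "path";

// Recursively sum the sizes of regular files under dir; return undefined on
// any filesystem error, signalling to callers that the folder is unusable.
async function folderBytes(dir: string): Promise<number | undefined> {
  try {
    let total = 0;
    const entries = await fs.promises.readdir(dir, { withFileTypes: true });
    for (const entry of entries) {
      const p = path.join(dir, entry.name);
      if (entry.isDirectory()) {
        const sub = await folderBytes(p);
        if (sub === undefined) {
          return undefined;
        }
        total += sub;
      } else if (entry.isFile()) {
        total += (await fs.promises.stat(p)).size;
      }
    }
    return total;
  } catch {
    return undefined;
  }
}
```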
export interface OverlayBaseDatabaseDownloadStats {
databaseSizeBytes: number;
databaseDownloadDurationMs: number;
}
/**
 * Downloads the overlay-base database from the GitHub Actions cache. If
 * conditions for downloading are not met, the function does nothing and
 * returns undefined.
*
* @param codeql The CodeQL instance
* @param config The configuration object
* @param logger The logger instance
* @returns A promise that resolves to download statistics if an overlay-base
* database was successfully downloaded, or undefined if the download was
* either not performed or failed.
*/
export async function downloadOverlayBaseDatabaseFromCache(
codeql: CodeQL,
config: Config,
logger: Logger,
): Promise<OverlayBaseDatabaseDownloadStats | undefined> {
const overlayDatabaseMode = config.overlayDatabaseMode;
if (overlayDatabaseMode !== OverlayDatabaseMode.Overlay) {
logger.debug(
`Overlay database mode is ${overlayDatabaseMode}. ` +
"Skip downloading overlay-base database from cache.",
);
return undefined;
}
if (!config.useOverlayDatabaseCaching) {
logger.debug(
"Overlay database caching is disabled. " +
"Skip downloading overlay-base database from cache.",
);
return undefined;
}
if (isInTestMode()) {
logger.debug(
"In test mode. Skip downloading overlay-base database from cache.",
);
return undefined;
}
const dbLocation = config.dbLocation;
const codeQlVersion = (await codeql.getVersion()).version;
const cacheRestoreKeyPrefix = await getCacheRestoreKeyPrefix(
config,
codeQlVersion,
);
logger.info(
"Looking in Actions cache for overlay-base database with " +
`restore key ${cacheRestoreKeyPrefix}`,
);
let databaseDownloadDurationMs = 0;
try {
const databaseDownloadStart = performance.now();
const foundKey = await waitForResultWithTimeLimit(
// This ten-minute limit for the cache restore operation is mainly to
// guard against the possibility that the cache service is unresponsive
// and hangs outside the data download.
//
// Data download (which is normally the most time-consuming part of the
// restore operation) should not run long enough to hit this limit. Even
// for an extremely large 10GB database, at a download speed of 40MB/s
// (see below), the download should complete within five minutes. If we
// do hit this limit, there are likely more serious problems other than
// mere slow download speed.
//
// This is important because we don't want any ongoing file operations
// on the database directory when we do hit this limit. Hitting this
// time limit takes us to a fallback path where we re-initialize the
// database from scratch at dbLocation, and having the cache restore
// operation continue to write into dbLocation in the background would
// really mess things up. We want to hit this limit only in the case
// of a hung cache service, not just slow download speed.
MAX_CACHE_OPERATION_MS,
actionsCache.restoreCache(
[dbLocation],
cacheRestoreKeyPrefix,
undefined,
{
// Azure SDK download (which is the default) uses 128MB segments; see
// https://github.com/actions/toolkit/blob/main/packages/cache/README.md.
// Setting segmentTimeoutInMs to 3000 translates to segment download
// speed of about 40 MB/s, which should be achievable unless the
// download is unreliable (in which case we do want to abort).
segmentTimeoutInMs: 3000,
},
),
() => {
logger.info("Timed out downloading overlay-base database from cache");
},
);
databaseDownloadDurationMs = Math.round(
performance.now() - databaseDownloadStart,
);
if (foundKey === undefined) {
logger.info("No overlay-base database found in Actions cache");
return undefined;
}
logger.info(
`Downloaded overlay-base database from cache with key ${foundKey}`,
);
} catch (error) {
logger.warning(
"Failed to download overlay-base database from cache: " +
`${error instanceof Error ? error.message : String(error)}`,
);
return undefined;
}
const databaseIsValid = await checkOverlayBaseDatabase(
codeql,
config,
logger,
"Downloaded overlay-base database is invalid",
);
if (!databaseIsValid) {
logger.warning("Downloaded overlay-base database failed validation");
return undefined;
}
const databaseSizeBytes = await tryGetFolderBytes(dbLocation, logger);
if (databaseSizeBytes === undefined) {
logger.info(
"Filesystem error while accessing downloaded overlay-base database",
);
// The problem that warrants reporting download failure is not that we are
// unable to determine the size of the database. Rather, it is that we
// encountered a filesystem error while accessing the database, which
// indicates that an overlay analysis will likely fail.
return undefined;
}
logger.info(`Successfully downloaded overlay-base database to ${dbLocation}`);
return {
databaseSizeBytes: Math.round(databaseSizeBytes),
databaseDownloadDurationMs,
};
}
/**
* Computes the cache key for saving the overlay-base database to the GitHub
* Actions cache.
*
* The key consists of the restore key prefix (which does not include the
* commit SHA) and the commit SHA of the current checkout.
*/
export async function getCacheSaveKey(
config: Config,
codeQlVersion: string,
checkoutPath: string,
logger: Logger,
): Promise<string> {
let runId = 1;
let attemptId = 1;
try {
runId = getWorkflowRunID();
attemptId = getWorkflowRunAttempt();
} catch (e) {
logger.warning(
`Failed to get workflow run ID or attempt ID. Reason: ${getErrorMessage(e)}`,
);
}
const sha = await getCommitOid(checkoutPath);
const restoreKeyPrefix = await getCacheRestoreKeyPrefix(
config,
codeQlVersion,
);
return `${restoreKeyPrefix}${sha}-${runId}-${attemptId}`;
}
/**
* Computes the cache key prefix for restoring the overlay-base database from
* the GitHub Actions cache.
*
* Actions cache supports using multiple restore keys to indicate preference,
* and this function could in principle take advantage of that feature by
* returning a list of restore key prefixes. However, since overlay-base
* databases are built from the default branch and used in PR analysis, it is
* exceedingly unlikely that the commit SHA will ever be the same.
*
* Therefore, this function returns only a single restore key prefix, which does
* not include the commit SHA. This allows us to restore the most recent
* compatible overlay-base database.
*/
export async function getCacheRestoreKeyPrefix(
config: Config,
codeQlVersion: string,
): Promise<string> {
const languages = [...config.languages].sort().join("_");
const cacheKeyComponents = {
automationID: await getAutomationID(),
// Add more components here as needed in the future
};
const componentsHash = createCacheKeyHash(cacheKeyComponents);
// For a cached overlay-base database to be considered compatible for overlay
// analysis, all components in the cache restore key must match:
//
// CACHE_PREFIX: distinguishes overlay-base databases from other cache objects
// CACHE_VERSION: cache format version
// componentsHash: hash of additional components (see above for details)
// languages: the languages included in the overlay-base database
// codeQlVersion: CodeQL bundle version
//
// Technically we can also include languages and codeQlVersion in the
// componentsHash, but including them explicitly in the cache key makes it
// easier to debug and understand the cache key structure.
return `${CACHE_PREFIX}-${CACHE_VERSION}-${componentsHash}-${languages}-${codeQlVersion}-`;
}
+2 -276
@@ -1,32 +1,16 @@
import * as fs from "fs";
import * as path from "path";
import * as actionsCache from "@actions/cache";
import test from "ava";
import * as sinon from "sinon";
import * as actionsUtil from "../actions-util";
import * as apiClient from "../api-client";
import { ResolveDatabaseOutput } from "../codeql";
import * as gitUtils from "../git-utils";
import { KnownLanguage } from "../languages";
import { getRunnerLogger } from "../logging";
import {
createTestConfig,
mockCodeQLVersion,
setupTests,
} from "../testing-utils";
import * as utils from "../util";
import { createTestConfig, setupTests } from "../testing-utils";
import { withTmpDir } from "../util";
import {
downloadOverlayBaseDatabaseFromCache,
getCacheRestoreKeyPrefix,
getCacheSaveKey,
OverlayDatabaseMode,
writeBaseDatabaseOidsFile,
writeOverlayChangesFile,
} from ".";
import { writeBaseDatabaseOidsFile, writeOverlayChangesFile } from ".";
setupTests(test);
@@ -344,261 +328,3 @@ test.serial(
});
},
);
interface DownloadOverlayBaseDatabaseTestCase {
overlayDatabaseMode: OverlayDatabaseMode;
useOverlayDatabaseCaching: boolean;
isInTestMode: boolean;
restoreCacheResult: string | undefined | Error;
hasBaseDatabaseOidsFile: boolean;
tryGetFolderBytesSucceeds: boolean;
codeQLVersion: string;
resolveDatabaseOutput: ResolveDatabaseOutput | Error;
}
const defaultDownloadTestCase: DownloadOverlayBaseDatabaseTestCase = {
overlayDatabaseMode: OverlayDatabaseMode.Overlay,
useOverlayDatabaseCaching: true,
isInTestMode: false,
restoreCacheResult: "cache-key",
hasBaseDatabaseOidsFile: true,
tryGetFolderBytesSucceeds: true,
codeQLVersion: "2.20.5",
resolveDatabaseOutput: { overlayBaseSpecifier: "20250626:XXX" },
};
const testDownloadOverlayBaseDatabaseFromCache = test.macro({
exec: async (
t,
_title: string,
partialTestCase: Partial<DownloadOverlayBaseDatabaseTestCase>,
expectDownloadSuccess: boolean,
) => {
await withTmpDir(async (tmpDir) => {
const dbLocation = path.join(tmpDir, "db");
await fs.promises.mkdir(dbLocation, { recursive: true });
const logger = getRunnerLogger(true);
const testCase = { ...defaultDownloadTestCase, ...partialTestCase };
const config = createTestConfig({
dbLocation,
languages: [KnownLanguage.java],
});
config.overlayDatabaseMode = testCase.overlayDatabaseMode;
config.useOverlayDatabaseCaching = testCase.useOverlayDatabaseCaching;
if (testCase.hasBaseDatabaseOidsFile) {
const baseDatabaseOidsFile = path.join(
dbLocation,
"base-database-oids.json",
);
await fs.promises.writeFile(baseDatabaseOidsFile, JSON.stringify({}));
}
const stubs: sinon.SinonStub[] = [];
const getAutomationIDStub = sinon
.stub(apiClient, "getAutomationID")
.resolves("test-automation-id/");
stubs.push(getAutomationIDStub);
const isInTestModeStub = sinon
.stub(utils, "isInTestMode")
.returns(testCase.isInTestMode);
stubs.push(isInTestModeStub);
if (testCase.restoreCacheResult instanceof Error) {
const restoreCacheStub = sinon
.stub(actionsCache, "restoreCache")
.rejects(testCase.restoreCacheResult);
stubs.push(restoreCacheStub);
} else {
const restoreCacheStub = sinon
.stub(actionsCache, "restoreCache")
.resolves(testCase.restoreCacheResult);
stubs.push(restoreCacheStub);
}
const tryGetFolderBytesStub = sinon
.stub(utils, "tryGetFolderBytes")
.resolves(testCase.tryGetFolderBytesSucceeds ? 1024 * 1024 : undefined);
stubs.push(tryGetFolderBytesStub);
const codeql = mockCodeQLVersion(testCase.codeQLVersion);
if (testCase.resolveDatabaseOutput instanceof Error) {
const resolveDatabaseStub = sinon
.stub(codeql, "resolveDatabase")
.rejects(testCase.resolveDatabaseOutput);
stubs.push(resolveDatabaseStub);
} else {
const resolveDatabaseStub = sinon
.stub(codeql, "resolveDatabase")
.resolves(testCase.resolveDatabaseOutput);
stubs.push(resolveDatabaseStub);
}
try {
const result = await downloadOverlayBaseDatabaseFromCache(
codeql,
config,
logger,
);
if (expectDownloadSuccess) {
t.truthy(result);
} else {
t.is(result, undefined);
}
} finally {
for (const stub of stubs) {
stub.restore();
}
}
});
},
title: (_, title) => `downloadOverlayBaseDatabaseFromCache: ${title}`,
});
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns stats when successful",
{},
true,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when mode is OverlayDatabaseMode.OverlayBase",
{
overlayDatabaseMode: OverlayDatabaseMode.OverlayBase,
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when mode is OverlayDatabaseMode.None",
{
overlayDatabaseMode: OverlayDatabaseMode.None,
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when caching is disabled",
{
useOverlayDatabaseCaching: false,
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined in test mode",
{
isInTestMode: true,
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when cache miss",
{
restoreCacheResult: undefined,
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when download fails",
{
restoreCacheResult: new Error("Download failed"),
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when downloaded database is invalid",
{
hasBaseDatabaseOidsFile: false,
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when downloaded database doesn't have an overlayBaseSpecifier",
{
resolveDatabaseOutput: {},
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when resolving database metadata fails",
{
resolveDatabaseOutput: new Error("Failed to resolve database metadata"),
},
false,
);
test.serial(
testDownloadOverlayBaseDatabaseFromCache,
"returns undefined when filesystem error occurs",
{
tryGetFolderBytesSucceeds: false,
},
false,
);
test.serial("overlay-base database cache keys remain stable", async (t) => {
const logger = getRunnerLogger(true);
const config = createTestConfig({ languages: ["python", "javascript"] });
const codeQlVersion = "2.23.0";
const commitOid = "abc123def456";
sinon.stub(apiClient, "getAutomationID").resolves("test-automation-id/");
sinon.stub(gitUtils, "getCommitOid").resolves(commitOid);
sinon.stub(actionsUtil, "getWorkflowRunID").returns(12345);
sinon.stub(actionsUtil, "getWorkflowRunAttempt").returns(1);
const saveKey = await getCacheSaveKey(
config,
codeQlVersion,
"checkout-path",
logger,
);
const expectedSaveKey =
"codeql-overlay-base-database-1-c5666c509a2d9895-javascript_python-2.23.0-abc123def456-12345-1";
t.is(
saveKey,
expectedSaveKey,
"Cache save key changed unexpectedly. " +
"This may indicate breaking changes in the cache key generation logic.",
);
const restoreKeyPrefix = await getCacheRestoreKeyPrefix(
config,
codeQlVersion,
);
const expectedRestoreKeyPrefix =
"codeql-overlay-base-database-1-c5666c509a2d9895-javascript_python-2.23.0-";
t.is(
restoreKeyPrefix,
expectedRestoreKeyPrefix,
"Cache restore key prefix changed unexpectedly. " +
"This may indicate breaking changes in the cache key generation logic.",
);
t.true(
saveKey.startsWith(restoreKeyPrefix),
`Expected save key "${saveKey}" to start with restore key prefix "${restoreKeyPrefix}"`,
);
});
+4 -431
@@ -1,37 +1,12 @@
import * as fs from "fs";
import * as path from "path";
import * as actionsCache from "@actions/cache";
import * as actionsUtil from "../actions-util";
import {
getOptionalInput,
getRequiredInput,
getTemporaryDirectory,
getWorkflowRunAttempt,
getWorkflowRunID,
} from "../actions-util";
import { getAutomationID } from "../api-client";
import { createCacheKeyHash } from "../caching-utils";
import { type CodeQL } from "../codeql";
import { getOptionalInput, getTemporaryDirectory } from "../actions-util";
import { type Config } from "../config-utils";
import { getCommitOid, getFileOidsUnderPath, getGitRoot } from "../git-utils";
import { Logger, withGroupAsync } from "../logging";
import {
CleanupLevel,
getBaseDatabaseOidsFilePath,
getCodeQLDatabasePath,
getErrorMessage,
isInTestMode,
tryGetFolderBytes,
waitForResultWithTimeLimit,
} from "../util";
export enum OverlayDatabaseMode {
Overlay = "overlay",
OverlayBase = "overlay-base",
None = "none",
}
import { getFileOidsUnderPath, getGitRoot } from "../git-utils";
import { Logger } from "../logging";
import { getBaseDatabaseOidsFilePath } from "../util";
export const CODEQL_OVERLAY_MINIMUM_VERSION = "2.23.8";
@@ -45,23 +20,6 @@ export const CODEQL_OVERLAY_MINIMUM_VERSION_JAVASCRIPT = "2.23.9";
export const CODEQL_OVERLAY_MINIMUM_VERSION_PYTHON = "2.23.9";
export const CODEQL_OVERLAY_MINIMUM_VERSION_RUBY = "2.23.9";
/**
* The maximum (uncompressed) size of the overlay base database that we will
* upload. By default, the Actions Cache has an overall capacity of 10 GB, and
* the Actions Cache client library uses zstd compression.
*
* Ideally we would apply a size limit to the compressed overlay-base database,
* but we cannot do so because compression is handled transparently by the
* Actions Cache client library. Instead we place a limit on the uncompressed
* size of the overlay-base database.
*
* Assuming 2.5:1 compression ratio, the 7.5 GB limit on uncompressed data would
* translate to a limit of around 3 GB after compression.
*/
const OVERLAY_BASE_DATABASE_MAX_UPLOAD_SIZE_MB = 7500;
const OVERLAY_BASE_DATABASE_MAX_UPLOAD_SIZE_BYTES =
OVERLAY_BASE_DATABASE_MAX_UPLOAD_SIZE_MB * 1_000_000;
/**
* Writes a JSON file containing Git OIDs for all tracked files (represented
* by path relative to the source root) under the source root. The file is
@@ -235,388 +193,3 @@ async function getDiffRangeFilePaths(
.filter((rel) => !rel.startsWith(".."));
return [...new Set(relativePaths)];
}
// Constants for database caching
const CACHE_VERSION = 1;
const CACHE_PREFIX = "codeql-overlay-base-database";
// The purpose of this ten-minute limit is to guard against the possibility
// that the cache service is unresponsive, which would otherwise cause the
// entire action to hang. Normally we expect cache operations to complete
// within two minutes.
const MAX_CACHE_OPERATION_MS = 600_000;
/**
 * Checks that the overlay-base database is valid by verifying that the base
 * database OIDs file exists and that each per-language database defines an
 * overlay base specifier.
 *
 * @param codeql The CodeQL instance
 * @param config The configuration object
* @param logger The logger instance
* @param warningPrefix Prefix for the check failure warning message
* @returns True if the verification succeeded, false otherwise
*/
async function checkOverlayBaseDatabase(
codeql: CodeQL,
config: Config,
logger: Logger,
warningPrefix: string,
): Promise<boolean> {
// An overlay-base database should contain the base database OIDs file.
const baseDatabaseOidsFilePath = getBaseDatabaseOidsFilePath(config);
if (!fs.existsSync(baseDatabaseOidsFilePath)) {
logger.warning(
`${warningPrefix}: ${baseDatabaseOidsFilePath} does not exist`,
);
return false;
}
for (const language of config.languages) {
const dbPath = getCodeQLDatabasePath(config, language);
try {
const resolveDatabaseOutput = await codeql.resolveDatabase(dbPath);
if (
resolveDatabaseOutput === undefined ||
!("overlayBaseSpecifier" in resolveDatabaseOutput)
) {
logger.info(`${warningPrefix}: no overlayBaseSpecifier defined`);
return false;
} else {
logger.debug(
`Overlay base specifier for ${language} overlay-base database found: ` +
`${resolveDatabaseOutput.overlayBaseSpecifier}`,
);
}
} catch (e) {
logger.warning(`${warningPrefix}: failed to resolve database: ${e}`);
return false;
}
}
return true;
}
/**
* Uploads the overlay-base database to the GitHub Actions cache. If conditions
* for uploading are not met, the function does nothing and returns false.
*
* This function uses the `checkout_path` input to determine the repository path
* and works only when called from `analyze` or `upload-sarif`.
*
* @param codeql The CodeQL instance
* @param config The configuration object
* @param logger The logger instance
* @returns A promise that resolves to true if the upload was performed and
* successfully completed, or false otherwise
*/
export async function cleanupAndUploadOverlayBaseDatabaseToCache(
codeql: CodeQL,
config: Config,
logger: Logger,
): Promise<boolean> {
const overlayDatabaseMode = config.overlayDatabaseMode;
if (overlayDatabaseMode !== OverlayDatabaseMode.OverlayBase) {
logger.debug(
`Overlay database mode is ${overlayDatabaseMode}. ` +
"Skip uploading overlay-base database to cache.",
);
return false;
}
if (!config.useOverlayDatabaseCaching) {
logger.debug(
"Overlay database caching is disabled. " +
"Skip uploading overlay-base database to cache.",
);
return false;
}
if (isInTestMode()) {
logger.debug(
"In test mode. Skip uploading overlay-base database to cache.",
);
return false;
}
const databaseIsValid = await checkOverlayBaseDatabase(
codeql,
config,
logger,
"Abort uploading overlay-base database to cache",
);
if (!databaseIsValid) {
return false;
}
// Clean up the database using the overlay cleanup level.
await withGroupAsync("Cleaning up databases", async () => {
await codeql.databaseCleanupCluster(config, CleanupLevel.Overlay);
});
const dbLocation = config.dbLocation;
const databaseSizeBytes = await tryGetFolderBytes(dbLocation, logger);
if (databaseSizeBytes === undefined) {
logger.warning(
"Failed to determine database size. " +
"Skip uploading overlay-base database to cache.",
);
return false;
}
if (databaseSizeBytes > OVERLAY_BASE_DATABASE_MAX_UPLOAD_SIZE_BYTES) {
const databaseSizeMB = Math.round(databaseSizeBytes / 1_000_000);
logger.warning(
`Database size (${databaseSizeMB} MB) ` +
`exceeds maximum upload size (${OVERLAY_BASE_DATABASE_MAX_UPLOAD_SIZE_MB} MB). ` +
"Skip uploading overlay-base database to cache.",
);
return false;
}
const codeQlVersion = (await codeql.getVersion()).version;
const checkoutPath = getRequiredInput("checkout_path");
const cacheSaveKey = await getCacheSaveKey(
config,
codeQlVersion,
checkoutPath,
logger,
);
logger.info(
`Uploading overlay-base database to Actions cache with key ${cacheSaveKey}`,
);
try {
const cacheId = await waitForResultWithTimeLimit(
MAX_CACHE_OPERATION_MS,
actionsCache.saveCache([dbLocation], cacheSaveKey),
() => {},
);
if (cacheId === undefined) {
logger.warning("Timed out while uploading overlay-base database");
return false;
}
} catch (error) {
logger.warning(
"Failed to upload overlay-base database to cache: " +
`${error instanceof Error ? error.message : String(error)}`,
);
return false;
}
logger.info(`Successfully uploaded overlay-base database from ${dbLocation}`);
return true;
}
export interface OverlayBaseDatabaseDownloadStats {
databaseSizeBytes: number;
databaseDownloadDurationMs: number;
}
/**
 * Downloads the overlay-base database from the GitHub Actions cache. If
 * conditions for downloading are not met, the function does nothing and
 * returns undefined.
*
* @param codeql The CodeQL instance
* @param config The configuration object
* @param logger The logger instance
* @returns A promise that resolves to download statistics if an overlay-base
* database was successfully downloaded, or undefined if the download was
* either not performed or failed.
*/
export async function downloadOverlayBaseDatabaseFromCache(
codeql: CodeQL,
config: Config,
logger: Logger,
): Promise<OverlayBaseDatabaseDownloadStats | undefined> {
const overlayDatabaseMode = config.overlayDatabaseMode;
if (overlayDatabaseMode !== OverlayDatabaseMode.Overlay) {
logger.debug(
`Overlay database mode is ${overlayDatabaseMode}. ` +
"Skip downloading overlay-base database from cache.",
);
return undefined;
}
if (!config.useOverlayDatabaseCaching) {
logger.debug(
"Overlay database caching is disabled. " +
"Skip downloading overlay-base database from cache.",
);
return undefined;
}
if (isInTestMode()) {
logger.debug(
"In test mode. Skip downloading overlay-base database from cache.",
);
return undefined;
}
const dbLocation = config.dbLocation;
const codeQlVersion = (await codeql.getVersion()).version;
const cacheRestoreKeyPrefix = await getCacheRestoreKeyPrefix(
config,
codeQlVersion,
);
logger.info(
"Looking in Actions cache for overlay-base database with " +
`restore key ${cacheRestoreKeyPrefix}`,
);
let databaseDownloadDurationMs = 0;
try {
const databaseDownloadStart = performance.now();
const foundKey = await waitForResultWithTimeLimit(
// This ten-minute limit for the cache restore operation is mainly to
// guard against the possibility that the cache service is unresponsive
// and hangs outside the data download.
//
// Data download (which is normally the most time-consuming part of the
// restore operation) should not run long enough to hit this limit. Even
// for an extremely large 10GB database, at a download speed of 40MB/s
// (see below), the download should complete within five minutes. If we
// do hit this limit, there are likely more serious problems other than
// mere slow download speed.
//
// This is important because we don't want any ongoing file operations
// on the database directory when we do hit this limit. Hitting this
// time limit takes us to a fallback path where we re-initialize the
// database from scratch at dbLocation, and having the cache restore
// operation continue to write into dbLocation in the background would
// really mess things up. We want to hit this limit only in the case
// of a hung cache service, not just slow download speed.
MAX_CACHE_OPERATION_MS,
actionsCache.restoreCache(
[dbLocation],
cacheRestoreKeyPrefix,
undefined,
{
// Azure SDK download (which is the default) uses 128MB segments; see
// https://github.com/actions/toolkit/blob/main/packages/cache/README.md.
// Setting segmentTimeoutInMs to 3000 translates to segment download
// speed of about 40 MB/s, which should be achievable unless the
// download is unreliable (in which case we do want to abort).
segmentTimeoutInMs: 3000,
},
),
() => {
logger.info("Timed out downloading overlay-base database from cache");
},
);
databaseDownloadDurationMs = Math.round(
performance.now() - databaseDownloadStart,
);
if (foundKey === undefined) {
logger.info("No overlay-base database found in Actions cache");
return undefined;
}
logger.info(
`Downloaded overlay-base database from cache with key ${foundKey}`,
);
} catch (error) {
logger.warning(
"Failed to download overlay-base database from cache: " +
`${error instanceof Error ? error.message : String(error)}`,
);
return undefined;
}
const databaseIsValid = await checkOverlayBaseDatabase(
codeql,
config,
logger,
"Downloaded overlay-base database is invalid",
);
if (!databaseIsValid) {
logger.warning("Downloaded overlay-base database failed validation");
return undefined;
}
const databaseSizeBytes = await tryGetFolderBytes(dbLocation, logger);
if (databaseSizeBytes === undefined) {
logger.info(
"Filesystem error while accessing downloaded overlay-base database",
);
// The problem that warrants reporting download failure is not that we are
// unable to determine the size of the database. Rather, it is that we
// encountered a filesystem error while accessing the database, which
// indicates that an overlay analysis will likely fail.
return undefined;
}
logger.info(`Successfully downloaded overlay-base database to ${dbLocation}`);
return {
databaseSizeBytes: Math.round(databaseSizeBytes),
databaseDownloadDurationMs,
};
}
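The long comment above describes wrapping the cache restore in a time limit with a fallback callback. A minimal sketch of such a wrapper is shown below; the name `withTimeout` and its signature are illustrative, inferred from the call shape (`MAX_CACHE_OPERATION_MS`, the wrapped promise, an on-timeout callback), and not necessarily the Action's actual helper.

```typescript
// Hypothetical sketch of a timeout wrapper: race the wrapped promise against
// a timer. If the timer wins, invoke the onTimeout callback and resolve to
// undefined so the caller can fall back to re-initializing from scratch.
async function withTimeout<T>(
  timeoutMs: number,
  promise: Promise<T>,
  onTimeout: () => void,
): Promise<T | undefined> {
  let finished = false;
  const mainTask = async (): Promise<T | undefined> => {
    const result = await promise;
    finished = true;
    return result;
  };
  const timeoutTask = async (): Promise<T | undefined> => {
    await new Promise((resolve) => setTimeout(resolve, timeoutMs));
    if (!finished) {
      // Only signal a timeout if the wrapped operation is still running.
      onTimeout();
    }
    return undefined;
  };
  return await Promise.race([mainTask(), timeoutTask()]);
}
```

Note that racing does not cancel the underlying operation, which is exactly why the comment stresses that the limit should only ever trigger for a hung cache service, not for a merely slow download.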
/**
* Computes the cache key for saving the overlay-base database to the GitHub
* Actions cache.
*
* The key consists of the restore key prefix (which does not include the
* commit SHA) and the commit SHA of the current checkout.
*/
export async function getCacheSaveKey(
config: Config,
codeQlVersion: string,
checkoutPath: string,
logger: Logger,
): Promise<string> {
let runId = 1;
let attemptId = 1;
try {
runId = getWorkflowRunID();
attemptId = getWorkflowRunAttempt();
} catch (e) {
logger.warning(
`Failed to get workflow run ID or attempt ID. Reason: ${getErrorMessage(e)}`,
);
}
const sha = await getCommitOid(checkoutPath);
const restoreKeyPrefix = await getCacheRestoreKeyPrefix(
config,
codeQlVersion,
);
return `${restoreKeyPrefix}${sha}-${runId}-${attemptId}`;
}
/**
* Computes the cache key prefix for restoring the overlay-base database from
* the GitHub Actions cache.
*
* Actions cache supports using multiple restore keys to indicate preference,
* and this function could in principle take advantage of that feature by
* returning a list of restore key prefixes. However, since overlay-base
* databases are built from the default branch and used in PR analysis, it is
* exceedingly unlikely that the commit SHA will ever be the same.
*
* Therefore, this function returns only a single restore key prefix, which does
* not include the commit SHA. This allows us to restore the most recent
* compatible overlay-base database.
*/
export async function getCacheRestoreKeyPrefix(
config: Config,
codeQlVersion: string,
): Promise<string> {
const languages = [...config.languages].sort().join("_");
const cacheKeyComponents = {
automationID: await getAutomationID(),
// Add more components here as needed in the future
};
const componentsHash = createCacheKeyHash(cacheKeyComponents);
// For a cached overlay-base database to be considered compatible for overlay
// analysis, all components in the cache restore key must match:
//
// CACHE_PREFIX: distinguishes overlay-base databases from other cache objects
// CACHE_VERSION: cache format version
// componentsHash: hash of additional components (see above for details)
// languages: the languages included in the overlay-base database
// codeQlVersion: CodeQL bundle version
//
  // Technically, we could also include languages and codeQlVersion in the
// componentsHash, but including them explicitly in the cache key makes it
// easier to debug and understand the cache key structure.
return `${CACHE_PREFIX}-${CACHE_VERSION}-${componentsHash}-${languages}-${codeQlVersion}-`;
}
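The `createCacheKeyHash` helper is not shown in this diff. One plausible implementation, shown here as an illustrative sketch rather than the Action's actual code, hashes the components object deterministically so that property insertion order cannot change the resulting cache key:

```typescript
import { createHash } from "crypto";

// Illustrative sketch: serialize the components with sorted keys so that
// insertion order cannot affect the digest, then take a short sha256 prefix
// that is safe to embed in an Actions cache key.
function createCacheKeyHash(components: Record<string, string>): string {
  const stableJson = JSON.stringify(
    components,
    Object.keys(components).sort(),
  );
  return createHash("sha256").update(stableJson).digest("hex").slice(0, 16);
}
```

Passing a sorted array as the `JSON.stringify` replacer both filters and orders the serialized keys, which is what makes the digest stable across equivalent objects.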
+5
@@ -0,0 +1,5 @@
export enum OverlayDatabaseMode {
Overlay = "overlay",
OverlayBase = "overlay-base",
None = "none",
}
+6 -6
@@ -6,7 +6,7 @@ import * as core from "@actions/core";
import * as actionsUtil from "./actions-util";
import { getGitHubVersion } from "./api-client";
import { Feature, FeatureEnablement, initFeatures } from "./feature-flags";
import { KnownLanguage } from "./languages";
import { BuiltInLanguage, parseBuiltInLanguage } from "./languages";
import { getActionsLogger, Logger } from "./logging";
import { getRepositoryNwo } from "./repository";
import {
@@ -14,7 +14,6 @@ import {
getCredentials,
getProxyBinaryPath,
getSafeErrorMessage,
parseLanguage,
ProxyInfo,
sendFailedStatusReport,
sendSuccessStatusReport,
@@ -33,7 +32,7 @@ async function run(startedAt: Date) {
const logger = getActionsLogger();
let features: FeatureEnablement | undefined;
let language: KnownLanguage | undefined;
let language: BuiltInLanguage | undefined;
try {
// Make inputs accessible in the `post` step.
@@ -56,7 +55,7 @@ async function run(startedAt: Date) {
// Get the language input.
const languageInput = actionsUtil.getOptionalInput("language");
language = languageInput ? parseLanguage(languageInput) : undefined;
language = languageInput ? parseBuiltInLanguage(languageInput) : undefined;
// Query the FF for whether we should use the reduced registry mapping.
const skipUnusedRegistries = await features.getValue(
@@ -112,14 +111,14 @@ async function run(startedAt: Date) {
logger,
);
// Check that the private registries are reachable.
// Perform best-effort checks that the private registries are reachable.
await checkConnections(logger, proxyInfo);
// Report success if we have reached this point.
await sendSuccessStatusReport(
startedAt,
{
languages: language && [language],
languages: language === undefined ? undefined : [language],
},
proxyConfig.all_credentials.map((c) => c.type),
logger,
@@ -199,6 +198,7 @@ async function startProxy(
.map((credential) => ({
type: credential.type,
url: credential.url,
"replaces-base": credential["replaces-base"],
}));
core.setOutput("proxy_urls", JSON.stringify(registry_urls));
+179 -161
@@ -8,10 +8,11 @@ import sinon from "sinon";
import * as apiClient from "./api-client";
import * as defaults from "./defaults.json";
import { setUpFeatureFlagTests } from "./feature-flags/testing-util";
import { KnownLanguage } from "./languages";
import { UnvalidatedObject, validateSchema } from "./json";
import { makeFromSchema } from "./json/testing-util";
import { BuiltInLanguage } from "./languages";
import { getRunnerLogger, Logger } from "./logging";
import * as startProxyExports from "./start-proxy";
import { parseLanguage } from "./start-proxy";
import * as statusReport from "./status-report";
import {
assertNotLogged,
@@ -232,7 +233,7 @@ test("getCredentials filters by language when specified", async (t) => {
getRunnerLogger(true),
undefined,
toEncodedJSON(mixedCredentials),
KnownLanguage.java,
BuiltInLanguage.java,
);
t.is(credentials.length, 1);
t.is(credentials[0].type, "maven_repository");
@@ -243,7 +244,7 @@ test("getCredentials returns all for a language when specified", async (t) => {
getRunnerLogger(true),
undefined,
toEncodedJSON(mixedCredentials),
KnownLanguage.go,
BuiltInLanguage.go,
);
t.is(credentials.length, 2);
@@ -252,6 +253,57 @@ test("getCredentials returns all for a language when specified", async (t) => {
t.assert(credentialsTypes.includes("git_source"));
});
test("getCredentials returns all goproxy_servers for Go when specified", async (t) => {
const multipleGoproxyServers = [
{ type: "goproxy_server", host: "goproxy1.example.com", token: "token1" },
{ type: "goproxy_server", host: "goproxy2.example.com", token: "token2" },
{ type: "git_source", host: "github.com/github", token: "mno" },
];
const credentials = startProxyExports.getCredentials(
getRunnerLogger(true),
undefined,
toEncodedJSON(multipleGoproxyServers),
BuiltInLanguage.go,
);
t.is(credentials.length, 3);
const goproxyServers = credentials.filter((c) => c.type === "goproxy_server");
t.is(goproxyServers.length, 2);
t.assert(goproxyServers.some((c) => c.host === "goproxy1.example.com"));
t.assert(goproxyServers.some((c) => c.host === "goproxy2.example.com"));
});
test("getCredentials returns all maven_repositories for Java when specified", async (t) => {
const multipleMavenRepositories = [
{
type: "maven_repository",
host: "maven1.pkg.github.com",
token: "token1",
},
{
type: "maven_repository",
host: "maven2.pkg.github.com",
token: "token2",
},
{ type: "git_source", host: "github.com/github", token: "mno" },
];
const credentials = startProxyExports.getCredentials(
getRunnerLogger(true),
undefined,
toEncodedJSON(multipleMavenRepositories),
BuiltInLanguage.java,
);
t.is(credentials.length, 2);
const mavenRepositories = credentials.filter(
(c) => c.type === "maven_repository",
);
t.assert(mavenRepositories.some((c) => c.host === "maven1.pkg.github.com"));
t.assert(mavenRepositories.some((c) => c.host === "maven2.pkg.github.com"));
});
test("getCredentials returns all credentials when no language specified", async (t) => {
const credentialsInput = toEncodedJSON(mixedCredentials);
@@ -299,144 +351,67 @@ test("getCredentials throws an error when non-printable characters are used", as
}
});
const validAzureCredential: startProxyExports.AzureConfig = {
tenant_id: "12345678-1234-1234-1234-123456789012",
client_id: "abcdef01-2345-6789-abcd-ef0123456789",
};
for (const oidcSchemaInfo of startProxyExports.oidcSchemas) {
test(`getCredentials throws when non-printable characters are used (${oidcSchemaInfo.name} OIDC)`, (t) => {
const validCredential = makeFromSchema(true, oidcSchemaInfo.schema);
for (const key of Object.keys(validCredential)) {
const invalidAuthConfig = {
...validCredential,
[key]: "123\x00",
};
const invalidCredential: startProxyExports.RawCredential = {
type: "nuget_feed",
host: `${key}.nuget.pkg.github.com`,
...invalidAuthConfig,
};
const credentialsInput = toEncodedJSON([invalidCredential]);
const validAwsCredential: startProxyExports.AWSConfig = {
aws_region: "us-east-1",
account_id: "123456789012",
role_name: "MY_ROLE",
domain: "MY_DOMAIN",
domain_owner: "987654321098",
audience: "custom-audience",
};
const validJFrogCredential: startProxyExports.JFrogConfig = {
jfrog_oidc_provider_name: "MY_PROVIDER",
audience: "jfrog-audience",
identity_mapping_name: "my-mapping",
};
test("getCredentials throws an error when non-printable characters are used for Azure OIDC", (t) => {
for (const key of Object.keys(validAzureCredential)) {
const invalidAzureCredential = {
...validAzureCredential,
[key]: "123\x00",
};
const invalidCredential: startProxyExports.RawCredential = {
type: "nuget_feed",
host: `${key}.nuget.pkg.github.com`,
...invalidAzureCredential,
};
const credentialsInput = toEncodedJSON([invalidCredential]);
t.throws(
() =>
startProxyExports.getCredentials(
getRunnerLogger(true),
undefined,
credentialsInput,
undefined,
),
{
message:
"Invalid credentials - fields must contain only printable characters",
},
);
}
});
test("getCredentials throws an error when non-printable characters are used for AWS OIDC", (t) => {
for (const key of Object.keys(validAwsCredential)) {
const invalidAwsCredential = {
...validAwsCredential,
[key]: "123\x00",
};
const invalidCredential: startProxyExports.RawCredential = {
type: "nuget_feed",
host: `${key}.nuget.pkg.github.com`,
...invalidAwsCredential,
};
const credentialsInput = toEncodedJSON([invalidCredential]);
t.throws(
() =>
startProxyExports.getCredentials(
getRunnerLogger(true),
undefined,
credentialsInput,
undefined,
),
{
message:
"Invalid credentials - fields must contain only printable characters",
},
);
}
});
test("getCredentials throws an error when non-printable characters are used for JFrog OIDC", (t) => {
for (const key of Object.keys(validJFrogCredential)) {
const invalidJFrogCredential = {
...validJFrogCredential,
[key]: "123\x00",
};
const invalidCredential: startProxyExports.RawCredential = {
type: "nuget_feed",
host: `${key}.nuget.pkg.github.com`,
...invalidJFrogCredential,
};
const credentialsInput = toEncodedJSON([invalidCredential]);
t.throws(
() =>
startProxyExports.getCredentials(
getRunnerLogger(true),
undefined,
credentialsInput,
undefined,
),
{
message:
"Invalid credentials - fields must contain only printable characters",
},
);
}
});
t.throws(
() =>
startProxyExports.getCredentials(
getRunnerLogger(true),
undefined,
credentialsInput,
undefined,
),
{
message:
"Invalid credentials - fields must contain only printable characters",
},
);
}
});
}
test("getCredentials accepts OIDC configurations", (t) => {
const oidcConfigurations = [
{
const oidcConfigurations = startProxyExports.oidcSchemas.map(
(schemaInfo) => ({
type: "nuget_feed",
host: "azure.pkg.github.com",
...validAzureCredential,
},
{
type: "nuget_feed",
host: "aws.pkg.github.com",
...validAwsCredential,
},
{
type: "nuget_feed",
host: "jfrog.pkg.github.com",
...validJFrogCredential,
},
];
host: `${schemaInfo.name.toLowerCase()}.pkg.github.com`,
...makeFromSchema(true, schemaInfo.schema),
}),
);
const credentials = startProxyExports.getCredentials(
getRunnerLogger(true),
undefined,
toEncodedJSON(oidcConfigurations),
KnownLanguage.csharp,
BuiltInLanguage.csharp,
);
t.is(credentials.length, 3);
t.is(credentials.length, startProxyExports.oidcSchemas.length);
t.assert(credentials.every((c) => c.type === "nuget_feed"));
t.assert(credentials.some((c) => startProxyExports.isAzureConfig(c)));
t.assert(credentials.some((c) => startProxyExports.isAWSConfig(c)));
t.assert(credentials.some((c) => startProxyExports.isJFrogConfig(c)));
for (const oidcSchemaInfo of startProxyExports.oidcSchemas) {
t.assert(
credentials.some((c) =>
validateSchema(
oidcSchemaInfo.schema,
c as unknown as UnvalidatedObject<any>,
),
),
);
}
});
const getCredentialsMacro = test.macro({
@@ -482,7 +457,7 @@ test(
t.is(results[0].type, "git_server");
t.is(results[0].host, "https://github.com/");
if (startProxyExports.isUsernamePassword(results[0])) {
if (startProxyExports.hasUsernameAndPassword(results[0])) {
t.assert(results[0].password?.startsWith("ghp_"));
} else {
t.fail("Expected a `UsernamePassword`-based credential.");
@@ -513,7 +488,7 @@ test(
t.is(results[0].type, "git_server");
t.is(results[0].host, "https://github.com/");
if (startProxyExports.isUsernamePassword(results[0])) {
if (startProxyExports.hasUsernameAndPassword(results[0])) {
t.assert(results[0].password?.startsWith("ghp_"));
} else {
t.fail("Expected a `UsernamePassword`-based credential.");
@@ -589,6 +564,76 @@ test(
},
);
test("getCredentials validates 'replaces-base' correctly", async (t) => {
// Valid cases.
const credentialsInput = toEncodedJSON([
{
type: "maven_repository",
host: "maven1.pkg.github.com",
token: "abc",
"replaces-base": false,
},
{
type: "maven_repository",
host: "maven2.pkg.github.com",
token: "def",
"replaces-base": true,
},
{
type: "maven_repository",
host: "maven3.pkg.github.com",
token: "ghi",
},
]);
const credentials = startProxyExports.getCredentials(
getRunnerLogger(true),
undefined,
credentialsInput,
BuiltInLanguage.java,
false,
);
t.is(credentials.length, 3);
t.true(credentials.some((c) => c["replaces-base"] === true));
t.true(credentials.some((c) => c["replaces-base"] === false));
t.true(credentials.some((c) => c["replaces-base"] === undefined));
// Invalid cases.
const baseInvalid = {
type: "maven_repository",
host: "maven4.pkg.github.com",
token: "jkl",
};
t.throws(() =>
startProxyExports.getCredentials(
getRunnerLogger(true),
undefined,
toEncodedJSON([{ ...baseInvalid, "replaces-base": null }]),
BuiltInLanguage.actions,
false,
),
);
t.throws(() =>
startProxyExports.getCredentials(
getRunnerLogger(true),
undefined,
toEncodedJSON([{ ...baseInvalid, "replaces-base": 123 }]),
BuiltInLanguage.actions,
false,
),
);
t.throws(() =>
startProxyExports.getCredentials(
getRunnerLogger(true),
undefined,
toEncodedJSON([{ ...baseInvalid, "replaces-base": "true" }]),
BuiltInLanguage.actions,
false,
),
);
});
test("getCredentials returns all credentials for Actions when using LANGUAGE_TO_REGISTRY_TYPE", async (t) => {
const credentialsInput = toEncodedJSON(mixedCredentials);
@@ -596,7 +641,7 @@ test("getCredentials returns all credentials for Actions when using LANGUAGE_TO_
getRunnerLogger(true),
undefined,
credentialsInput,
KnownLanguage.actions,
BuiltInLanguage.actions,
false,
);
t.is(credentials.length, mixedCredentials.length);
@@ -609,39 +654,12 @@ test("getCredentials returns no credentials for Actions when using NEW_LANGUAGE_
getRunnerLogger(true),
undefined,
credentialsInput,
KnownLanguage.actions,
BuiltInLanguage.actions,
true,
);
t.deepEqual(credentials, []);
});
test("parseLanguage", async (t) => {
// Exact matches
t.deepEqual(parseLanguage("csharp"), KnownLanguage.csharp);
t.deepEqual(parseLanguage("cpp"), KnownLanguage.cpp);
t.deepEqual(parseLanguage("go"), KnownLanguage.go);
t.deepEqual(parseLanguage("java"), KnownLanguage.java);
t.deepEqual(parseLanguage("javascript"), KnownLanguage.javascript);
t.deepEqual(parseLanguage("python"), KnownLanguage.python);
t.deepEqual(parseLanguage("rust"), KnownLanguage.rust);
// Aliases
t.deepEqual(parseLanguage("c"), KnownLanguage.cpp);
t.deepEqual(parseLanguage("c++"), KnownLanguage.cpp);
t.deepEqual(parseLanguage("c#"), KnownLanguage.csharp);
t.deepEqual(parseLanguage("kotlin"), KnownLanguage.java);
t.deepEqual(parseLanguage("typescript"), KnownLanguage.javascript);
// spaces and case-insensitivity
t.deepEqual(parseLanguage(" \t\nCsHaRp\t\t"), KnownLanguage.csharp);
t.deepEqual(parseLanguage(" \t\nkOtLin\t\t"), KnownLanguage.java);
// Not matches
t.deepEqual(parseLanguage("foo"), undefined);
t.deepEqual(parseLanguage(" "), undefined);
t.deepEqual(parseLanguage(""), undefined);
});
function mockGetApiClient(endpoints: any) {
return (
sinon
+28 -130
@@ -18,26 +18,18 @@ import {
FeatureEnablement,
} from "./feature-flags";
import * as json from "./json";
import { KnownLanguage } from "./languages";
import { BuiltInLanguage } from "./languages";
import { Logger } from "./logging";
import {
Address,
Registry,
Credential,
AuthConfig,
isToken,
isAzureConfig,
Token,
UsernamePassword,
AzureConfig,
isAWSConfig,
AWSConfig,
isJFrogConfig,
JFrogConfig,
isUsernamePassword,
hasToken,
hasUsernameAndPassword,
hasUsername,
RawCredential,
} from "./start-proxy/types";
import { getAuthConfig } from "./start-proxy/validation";
import {
ActionName,
createStatusReportBase,
@@ -156,7 +148,7 @@ export function getSafeErrorMessage(error: Error): string {
export async function sendFailedStatusReport(
logger: Logger,
startedAt: Date,
language: KnownLanguage | undefined,
language: BuiltInLanguage | undefined,
unwrappedError: unknown,
) {
const error = util.wrapError(unwrappedError);
@@ -172,7 +164,7 @@ export async function sendFailedStatusReport(
getActionsStatus(error),
startedAt,
{
languages: language && [language],
languages: language === undefined ? undefined : [language],
},
await util.checkDiskUsage(logger),
logger,
@@ -188,48 +180,6 @@ export const UPDATEJOB_PROXY_VERSION = "v2.0.20250624110901";
const UPDATEJOB_PROXY_URL_PREFIX =
"https://github.com/github/codeql-action/releases/download/codeql-bundle-v2.22.0/";
/*
* Language aliases supported by the start-proxy Action.
*
* In general, the CodeQL CLI is the source of truth for language aliases, and to
* allow us to more easily support new languages, we want to avoid hardcoding these
 * aliases in the Action itself. However, this is difficult for the start-proxy
 * Action, which does not use CodeQL, so we accept some hardcoding here.
*/
const LANGUAGE_ALIASES: { [lang: string]: KnownLanguage } = {
c: KnownLanguage.cpp,
"c++": KnownLanguage.cpp,
"c#": KnownLanguage.csharp,
kotlin: KnownLanguage.java,
typescript: KnownLanguage.javascript,
"javascript-typescript": KnownLanguage.javascript,
"java-kotlin": KnownLanguage.java,
};
/**
* Parse the start-proxy language input into its canonical CodeQL language name.
*
* Exported for testing. Do not use this outside of the start-proxy Action
* to avoid complicating the process of adding new CodeQL languages.
*/
export function parseLanguage(language: string): KnownLanguage | undefined {
// Normalize to lower case
language = language.trim().toLowerCase();
// See if it's an exact match
if (language in KnownLanguage) {
return language as KnownLanguage;
}
// Check language aliases
if (language in LANGUAGE_ALIASES) {
return LANGUAGE_ALIASES[language];
}
return undefined;
}
function isPAT(value: string) {
return artifactScanner.isAuthToken(value, [
artifactScanner.GITHUB_PAT_CLASSIC_PATTERN,
@@ -237,7 +187,7 @@ function isPAT(value: string) {
]);
}
type RegistryMapping = Partial<Record<KnownLanguage, string[]>>;
type RegistryMapping = Partial<Record<BuiltInLanguage, string[]>>;
const LANGUAGE_TO_REGISTRY_TYPE: RegistryMapping = {
java: ["maven_repository"],
@@ -293,75 +243,6 @@ function getRegistryAddress(
}
}
/** Extracts an `AuthConfig` value from `config`. */
export function getAuthConfig(
config: json.UnvalidatedObject<AuthConfig>,
): AuthConfig {
// Start by checking for the OIDC configurations, since they have required properties
// which we can use to identify them.
if (isAzureConfig(config)) {
return {
tenant_id: config.tenant_id,
client_id: config.client_id,
} satisfies AzureConfig;
} else if (isAWSConfig(config)) {
return {
aws_region: config.aws_region,
account_id: config.account_id,
role_name: config.role_name,
domain: config.domain,
domain_owner: config.domain_owner,
audience: config.audience,
} satisfies AWSConfig;
} else if (isJFrogConfig(config)) {
return {
jfrog_oidc_provider_name: config.jfrog_oidc_provider_name,
identity_mapping_name: config.identity_mapping_name,
audience: config.audience,
} satisfies JFrogConfig;
} else if (isToken(config)) {
// There are three scenarios for non-OIDC authentication based on the registry type:
//
// 1. `username`+`token`
// 2. A `token` that combines the username and actual token, separated by ':'.
// 3. `username`+`password`
//
// In all three cases, all fields are optional. If the `token` field is present,
// we accept the configuration as a `Token` typed configuration, with the `token`
// value and an optional `username`. Otherwise, we accept the configuration
// typed as `UsernamePassword` (in the `else` clause below) with optional
// username and password. I.e. a private registry type that uses 1. or 2.,
// but has no `token` configured, will get accepted as `UsernamePassword` here.
if (isDefined(config.token)) {
// Mask token to reduce chance of accidental leakage in logs, if we have one.
core.setSecret(config.token);
}
return { username: config.username, token: config.token } satisfies Token;
} else {
let username: string | undefined = undefined;
let password: string | undefined = undefined;
// Both "username" and "password" are optional. If we have reached this point, we need
// to validate which of them are present and that they have the correct type if so.
if ("password" in config && json.isString(config.password)) {
// Mask password to reduce chance of accidental leakage in logs, if we have one.
core.setSecret(config.password);
password = config.password;
}
if ("username" in config && json.isString(config.username)) {
username = config.username;
}
// Return the `UsernamePassword` object. Both username and password may be undefined.
return {
username,
password,
} satisfies UsernamePassword;
}
}
// getCredentials returns registry credentials from action inputs.
// It prefers `registries_credentials` over `registry_secrets`.
// If neither is set, it returns an empty array.
@@ -369,7 +250,7 @@ export function getCredentials(
logger: Logger,
registrySecrets: string | undefined,
registriesCredentials: string | undefined,
language: KnownLanguage | undefined,
language: BuiltInLanguage | undefined,
skipUnusedRegistries: boolean = false,
): Credential[] {
const registryMapping = skipUnusedRegistries
@@ -450,11 +331,11 @@ export function getCredentials(
const noUsername =
!hasUsername(authConfig) || !isDefined(authConfig.username);
const passwordIsPAT =
isUsernamePassword(authConfig) &&
hasUsernameAndPassword(authConfig) &&
isDefined(authConfig.password) &&
isPAT(authConfig.password);
const tokenIsPAT =
isToken(authConfig) &&
hasToken(authConfig) &&
isDefined(authConfig.token) &&
isPAT(authConfig.token);
@@ -466,8 +347,25 @@ export function getCredentials(
);
}
// Construct the base credential object.
const baseCredential: Omit<Registry, keyof Address> = { type: e.type };
// If "replaces-base" is present, it must be a boolean.
if ("replaces-base" in e) {
if (
isDefined(e["replaces-base"]) &&
typeof e["replaces-base"] === "boolean"
) {
baseCredential["replaces-base"] = e["replaces-base"];
} else {
throw new ConfigurationError(
"Invalid credentials - 'replaces-base' must be a boolean",
);
}
}
out.push({
type: e.type,
...baseCredential,
...authConfig,
...address,
});
+4 -4
@@ -7,7 +7,7 @@ import * as io from "@actions/io";
import test, { ExecutionContext } from "ava";
import sinon from "sinon";
import { JavaEnvVars, KnownLanguage } from "../languages";
import { JavaEnvVars, BuiltInLanguage } from "../languages";
import {
checkExpectedLogMessages,
getRecordingLogger,
@@ -182,11 +182,11 @@ test.serial("checkProxyEnvVars - credentials are removed from URLs", (t) => {
});
test.serial(
"checkProxyEnvironment - includes base checks for all known languages",
"checkProxyEnvironment - includes base checks for all built-in languages",
async (t) => {
stubToolrunner();
for (const language of Object.values(KnownLanguage)) {
for (const language of Object.values(BuiltInLanguage)) {
const messages: LoggedMessage[] = [];
const logger = getRecordingLogger(messages);
@@ -204,7 +204,7 @@ test.serial(
stubToolrunner();
await checkProxyEnvironment(logger, KnownLanguage.java);
await checkProxyEnvironment(logger, BuiltInLanguage.java);
assertEnvVarLogMessages(t, Object.keys(ProxyEnvVars), messages, false);
assertEnvVarLogMessages(t, JAVA_PROXY_ENV_VARS, messages, false);
},
+2 -2
@@ -4,7 +4,7 @@ import * as path from "path";
import * as toolrunner from "@actions/exec/lib/toolrunner";
import * as io from "@actions/io";
import { JavaEnvVars, KnownLanguage, Language } from "../languages";
import { JavaEnvVars, BuiltInLanguage, Language } from "../languages";
import { Logger } from "../logging";
import { getErrorMessage, isDefined } from "../util";
@@ -196,7 +196,7 @@ export async function checkProxyEnvironment(
// Check language-specific configurations. If we don't know the language,
// then we perform all checks.
if (language === undefined || language === KnownLanguage.java) {
if (language === undefined || language === BuiltInLanguage.java) {
checkJavaEnvVars(logger);
await showJavaSettings(logger);
+32
@@ -8,6 +8,7 @@ import {
} from "./../testing-utils";
import {
checkConnections,
connectionTestConfig,
ReachabilityBackend,
ReachabilityError,
} from "./reachability";
@@ -118,3 +119,34 @@ test("checkConnections - handles invalid URLs", async (t) => {
`Finished testing connections`,
]);
});
test("checkConnections - appends extra paths", async (t) => {
const backend = new MockReachabilityBackend();
const checkConnection = sinon.stub(backend, "checkConnection").resolves(200);
const messages = await withRecordingLoggerAsync(async (logger) => {
await checkConnections(
logger,
{
...proxyInfo,
registries: [{ ...nugetFeed, url: "https://api.nuget.org/" }],
},
backend,
);
});
checkExpectedLogMessages(t, messages, [
`Testing connection to https://api.nuget.org/`,
`Successfully tested connection to https://api.nuget.org/`,
`Finished testing connections`,
]);
t.true(
checkConnection.calledWith(
sinon.match(
new URL(
`https://api.nuget.org/${connectionTestConfig["nuget_feed"]?.path}`,
),
),
),
);
});
+43 -2
@@ -2,11 +2,41 @@ import * as https from "https";
import { HttpsProxyAgent } from "https-proxy-agent";
import { DocUrl } from "../doc-url";
import { Logger } from "../logging";
import { getErrorMessage } from "../util";
import { getAddressString, ProxyInfo, Registry } from "./types";
/** Represents registry-specific connection test configurations. */
export interface ConnectionTestConfig {
  /** An optional path to append to the end of the base URL. */
path?: string;
}
/** A partial mapping of registry types to extra connection test configurations. */
export const connectionTestConfig: Partial<
Record<string, ConnectionTestConfig>
> = {
nuget_feed: { path: "v3/index.json" },
};
/**
 * Applies the registry-specific check configuration to the base URL, if one exists and is applicable.
*/
export function makeTestUrl(
config: ConnectionTestConfig | undefined,
base: URL,
): URL {
if (config?.path === undefined) {
return base;
}
if (base.pathname.endsWith(config.path)) {
return base;
}
return new URL(config.path, base);
}
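`makeTestUrl` relies on WHATWG relative-URL resolution via `new URL(path, base)`, where the configured path replaces everything after the last `/` of the base path. A small illustration (the non-NuGet hosts are examples only):

```typescript
// A trailing slash on the registry URL matters: without one, the last path
// segment of the base is replaced rather than extended.
const rootBase = new URL("v3/index.json", "https://api.nuget.org/").toString();
// "https://api.nuget.org/v3/index.json"
const noSlash = new URL("v3/index.json", "https://example.com/feed").toString();
// "https://example.com/v3/index.json" ("feed" is replaced)
const withSlash = new URL(
  "v3/index.json",
  "https://example.com/feed/",
).toString();
// "https://example.com/feed/v3/index.json"
console.log(rootBase, noSlash, withSlash);
```

This is why the `endsWith` guard above only short-circuits when the path is already present; it does not otherwise normalize trailing slashes.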
export class ReachabilityError extends Error {
constructor(public readonly statusCode?: number | undefined) {
super();
@@ -41,7 +71,7 @@ class NetworkReachabilityBackend implements ReachabilityBackend {
url,
{
agent: this.agent,
method: "HEAD",
method: "GET",
ca: this.proxy.cert,
timeout: 5 * 1000, // 5 seconds
},
@@ -85,6 +115,13 @@ export async function checkConnections(
// Don't do anything if there are no registries.
if (proxy.registries.length === 0) return result;
  // Start a log group and print a disclaimer, with a link to the relevant
  // documentation, explaining that these checks are best-effort only.
logger.startGroup("Testing connections via the proxy");
logger.info(
`The connection tests performed here are best-effort only and failures here may not affect the subsequent analysis. See ${DocUrl.PRIVATE_REGISTRY_LOGS} for more information.`,
);
try {
// Initialise a networking backend if no backend was provided.
if (backend === undefined) {
@@ -92,6 +129,7 @@ export async function checkConnections(
}
for (const registry of proxy.registries) {
const config = connectionTestConfig[registry.type];
const address = getAddressString(registry);
const url = URL.parse(address);
@@ -102,9 +140,11 @@ export async function checkConnections(
continue;
}
const testUrl = makeTestUrl(config, url);
try {
logger.debug(`Testing connection to ${url}...`);
const statusCode = await backend.checkConnection(url);
const statusCode = await backend.checkConnection(testUrl);
logger.info(`Successfully tested connection to ${url} (${statusCode})`);
result.add(registry);
@@ -126,5 +166,6 @@ export async function checkConnections(
);
}
logger.endGroup();
return result;
}
+76 -10
@@ -1,5 +1,6 @@
import test from "ava";
import { makeFromSchema, withSchemaMatrix } from "../json/testing-util";
import { setupTests } from "../testing-utils";
import * as types from "./types";
@@ -7,25 +8,57 @@ import * as types from "./types";
setupTests(test);
const validAzureCredential: types.AzureConfig = {
tenant_id: "12345678-1234-1234-1234-123456789012",
client_id: "abcdef01-2345-6789-abcd-ef0123456789",
"tenant-id": "12345678-1234-1234-1234-123456789012",
"client-id": "abcdef01-2345-6789-abcd-ef0123456789",
};
const validAwsCredential: types.AWSConfig = {
aws_region: "us-east-1",
account_id: "123456789012",
role_name: "MY_ROLE",
"aws-region": "us-east-1",
"account-id": "123456789012",
"role-name": "MY_ROLE",
domain: "MY_DOMAIN",
domain_owner: "987654321098",
"domain-owner": "987654321098",
audience: "custom-audience",
};
const validJFrogCredential: types.JFrogConfig = {
jfrog_oidc_provider_name: "MY_PROVIDER",
"jfrog-oidc-provider-name": "MY_PROVIDER",
audience: "jfrog-audience",
identity_mapping_name: "my-mapping",
"identity-mapping-name": "my-mapping",
};
test("hasUsername", (t) => {
// Reject the case where `username` is missing.
t.false(types.hasUsername({}));
// Test all cases where `username` is present.
withSchemaMatrix(
t,
types.usernameSchema,
{ excludeAbsent: true },
(value) => {
t.true(types.hasUsername(value));
},
);
});
test("hasUsernameAndPassword", (t) => {
// Reject cases where `username` or `password` are missing.
t.false(types.hasUsernameAndPassword({}));
t.false(types.hasUsernameAndPassword({ username: "foo" }));
t.false(types.hasUsernameAndPassword({ password: "foo" }));
// Test all cases where both `username` and `password` are present.
withSchemaMatrix(
t,
types.usernamePasswordSchema,
{ excludeAbsent: true },
(value) => {
t.true(types.hasUsernameAndPassword(value));
},
);
});
test("credentialToStr - pretty-prints valid username+password configurations", (t) => {
const secret = "password123";
const credential: types.Credential = {
@@ -107,13 +140,46 @@ test("credentialToStr - pretty-prints valid JFrog OIDC configurations", (t) => {
);
});
test("credentialToStr - pretty-prints valid Cloudsmith OIDC configurations", (t) => {
const credential: types.Credential = {
type: "maven_credential",
url: "https://localhost",
...(makeFromSchema(
true,
types.cloudsmithConfigSchema,
) as types.CloudsmithConfig),
};
const str = types.credentialToStr(credential);
t.is(
"Type: maven_credential; Url: https://localhost; Cloudsmith Namespace: value-for-namespace; Cloudsmith Service Slug: value-for-service-slug; Cloudsmith API Host: value-for-api-host;",
str,
);
});
test("credentialToStr - pretty-prints valid GCP OIDC configurations", (t) => {
const credential: types.Credential = {
type: "maven_credential",
url: "https://localhost",
...(makeFromSchema(true, types.gcpConfigSchema) as types.GCPConfig),
};
const str = types.credentialToStr(credential);
t.is(
"Type: maven_credential; Url: https://localhost; GCP Workload Identity Provider: value-for-workload-identity-provider; GCP Service Account: value-for-service-account; GCP Audience: value-for-audience;",
str,
);
});
test("credentialToStr - hides passwords", (t) => {
const secret = "password123";
const credential = {
type: "maven_credential",
password: secret,
url: "https://localhost",
} satisfies types.Credential;
const str = types.credentialToStr(credential);
@@ -127,7 +193,7 @@ test("credentialToStr - hides tokens", (t) => {
type: "maven_credential",
token: secret,
url: "https://localhost",
} satisfies types.Credential;
const str = types.credentialToStr(credential);
@@ -9,144 +9,177 @@ import { isDefined } from "../util";
*/
export type RawCredential = UnvalidatedObject<Credential>;
/** A schema for credential objects with a username. */
export const usernameSchema = {
  /** The username needed to authenticate to the package registry, if any. */
  username: json.optional(json.string),
} as const satisfies json.Schema;
/** Usernames may be present for both authentication with tokens or passwords. */
export type Username = json.FromSchema<typeof usernameSchema>;
/**
* Narrows `config` to `Username` if `config` has a `username` property.
* Not used for validation. Assumes that `config` is already a validated `AuthConfig`.
*/
export function hasUsername(config: AuthConfig): config is Username {
return "username" in config;
}
/** A schema for credential objects with a username and password. */
export const usernamePasswordSchema = {
/** The password needed to authenticate to the package registry, if any. */
password: json.optional(json.string),
...usernameSchema,
} as const satisfies json.Schema;
/**
* Fields expected for authentication based on a username and password.
* Both username and password are optional.
*/
export type UsernamePassword = json.FromSchema<typeof usernamePasswordSchema>;
/**
* Narrows `config` to `UsernamePassword` if it has a `username` and `password` property.
* Not used for validation. Assumes that `config` is already a validated `AuthConfig`.
*/
export function hasUsernameAndPassword(
config: AuthConfig,
): config is UsernamePassword {
return hasUsername(config) && "password" in config;
}
/** A schema for credential objects for token-based authentication. */
export const tokenSchema = {
/** The token needed to authenticate to the package registry, if any. */
token: json.optional(json.string),
...usernameSchema,
} as const satisfies json.Schema;
/**
* Fields expected for token-based authentication.
* Both username and token are optional.
*/
export type Token = json.FromSchema<typeof tokenSchema>;
/**
* Narrows `config` to `Token` if it has a `token` property.
* Not used for validation. Assumes that `config` is already a validated `AuthConfig`.
*/
export function hasToken(config: AuthConfig): config is Token {
return "token" in config;
}
/** Decides whether `config` is token-based. */
export function isToken(
config: UnvalidatedObject<AuthConfig>,
): config is Token {
return "token" in config && json.validateSchema(tokenSchema, config);
}
/** A schema for Azure OIDC configurations. */
export const azureConfigSchema = {
"tenant-id": json.string,
"client-id": json.string,
} as const satisfies json.Schema;
/** Configuration for Azure OIDC. */
export type AzureConfig = json.FromSchema<typeof azureConfigSchema>;
/** Decides whether `config` is an Azure OIDC configuration. */
export function isAzureConfig(
config: UnvalidatedObject<AuthConfig>,
): config is AzureConfig {
return json.validateSchema(azureConfigSchema, config);
}
/** A schema for AWS OIDC configurations. */
export const awsConfigSchema = {
"aws-region": json.string,
"account-id": json.string,
"role-name": json.string,
domain: json.string,
"domain-owner": json.string,
audience: json.optional(json.string),
} as const satisfies json.Schema;
/** Configuration for AWS OIDC. */
export type AWSConfig = json.FromSchema<typeof awsConfigSchema>;
/** Decides whether `config` is an AWS OIDC configuration. */
export function isAWSConfig(
config: UnvalidatedObject<AuthConfig>,
): config is AWSConfig {
return json.validateSchema(awsConfigSchema, config);
}
/** A schema for JFrog OIDC configurations. */
export const jfrogConfigSchema = {
"jfrog-oidc-provider-name": json.string,
audience: json.optional(json.string),
"identity-mapping-name": json.optional(json.string),
} as const satisfies json.Schema;
/** Configuration for JFrog OIDC. */
export type JFrogConfig = json.FromSchema<typeof jfrogConfigSchema>;
/** Decides whether `config` is a JFrog OIDC configuration. */
export function isJFrogConfig(
config: UnvalidatedObject<AuthConfig>,
): config is JFrogConfig {
return json.validateSchema(jfrogConfigSchema, config);
}
/** A schema for Cloudsmith OIDC configurations. */
export const cloudsmithConfigSchema = {
namespace: json.string,
"service-slug": json.string,
"api-host": json.string,
} as const satisfies json.Schema;
/** Configuration for Cloudsmith OIDC. */
export type CloudsmithConfig = json.FromSchema<typeof cloudsmithConfigSchema>;
/** Decides whether `config` is a Cloudsmith OIDC configuration. */
export function isCloudsmithConfig(
config: UnvalidatedObject<AuthConfig>,
): config is CloudsmithConfig {
return json.validateSchema(cloudsmithConfigSchema, config);
}
/** A schema for GCP OIDC configurations. */
export const gcpConfigSchema = {
"workload-identity-provider": json.string,
"service-account": json.optional(json.string),
audience: json.optional(json.string),
} as const satisfies json.Schema;
/** Configuration for GCP OIDC. */
export type GCPConfig = json.FromSchema<typeof gcpConfigSchema>;
/** Decides whether `config` is a GCP OIDC configuration. */
export function isGCPConfig(
config: UnvalidatedObject<AuthConfig>,
): config is GCPConfig {
return json.validateSchema(gcpConfigSchema, config);
}
/** An array of all OIDC configuration schemas along with output-friendly names. */
export const oidcSchemas = [
{ schema: azureConfigSchema, name: "Azure" },
{ schema: awsConfigSchema, name: "AWS" },
{ schema: jfrogConfigSchema, name: "JFrog" },
{ schema: cloudsmithConfigSchema, name: "Cloudsmith" },
{ schema: gcpConfigSchema, name: "GCP" },
];
/** Represents all supported OIDC configurations. */
export type OIDC =
| AzureConfig
| AWSConfig
| JFrogConfig
| CloudsmithConfig
| GCPConfig;
/** All authentication-related fields. */
export type AuthConfig = UsernamePassword | Token | OIDC;
@@ -165,7 +198,7 @@ export type Credential = AuthConfig & Registry;
export function credentialToStr(credential: Credential): string {
let result: string = `Type: ${credential.type};`;
const appendIfDefined = (name: string, val: string | undefined | null) => {
if (isDefined(val)) {
result += ` ${name}: ${val};`;
}
@@ -184,24 +217,38 @@ export function credentialToStr(credential: Credential): string {
isDefined(credential.password) ? "***" : undefined,
);
}
if (hasToken(credential)) {
appendIfDefined("Token", isDefined(credential.token) ? "***" : undefined);
}
if (isAzureConfig(credential)) {
appendIfDefined("Tenant", credential["tenant-id"]);
appendIfDefined("Client", credential["client-id"]);
} else if (isAWSConfig(credential)) {
appendIfDefined("AWS Region", credential["aws-region"]);
appendIfDefined("AWS Account", credential["account-id"]);
appendIfDefined("AWS Role", credential["role-name"]);
appendIfDefined("AWS Domain", credential.domain);
appendIfDefined("AWS Domain Owner", credential["domain-owner"]);
appendIfDefined("AWS Audience", credential.audience);
} else if (isJFrogConfig(credential)) {
appendIfDefined("JFrog Provider", credential["jfrog-oidc-provider-name"]);
appendIfDefined(
"JFrog Identity Mapping",
credential["identity-mapping-name"],
);
appendIfDefined("JFrog Audience", credential.audience);
} else if (isCloudsmithConfig(credential)) {
appendIfDefined("Cloudsmith Namespace", credential.namespace);
appendIfDefined("Cloudsmith Service Slug", credential["service-slug"]);
appendIfDefined("Cloudsmith API Host", credential["api-host"]);
} else if (isGCPConfig(credential)) {
appendIfDefined(
"GCP Workload Identity Provider",
credential["workload-identity-provider"],
);
appendIfDefined("GCP Service Account", credential["service-account"]);
appendIfDefined("GCP Audience", credential.audience);
}
return result;
@@ -211,6 +258,8 @@ export function credentialToStr(credential: Credential): string {
export type Registry = {
/** The type of the package registry. */
type: string;
/** Whether the registry replaces the base registry for the ecosystem. */
"replaces-base"?: boolean;
} & Address;
// If a registry has an `url`, then that takes precedence over the `host` which may or may
@@ -0,0 +1,69 @@
import test from "ava";
import * as json from "../json";
import { makeFromSchema } from "../json/testing-util";
import { setupTests } from "../testing-utils";
import * as types from "./types";
import { getAuthConfig } from "./validation";
setupTests(test);
for (const schemaTest of types.oidcSchemas) {
for (const includeOptional of [true, false]) {
const minimalName = includeOptional ? "full" : "minimal";
test(`getAuthConfig - ${schemaTest.name} - ${minimalName}`, async (t) => {
const config = makeFromSchema(includeOptional, schemaTest.schema);
t.deepEqual(
getAuthConfig({
...config,
unexpected: "unexpected-value",
} as unknown as json.UnvalidatedObject<types.AuthConfig>),
config,
);
});
}
}
test("getAuthConfig - token", async (t) => {
const config = makeFromSchema(true, types.tokenSchema);
t.deepEqual(
getAuthConfig({
...config,
unexpected: "unexpected-value",
} as json.UnvalidatedObject<types.AuthConfig>),
config,
);
});
test("getAuthConfig - username and password", async (t) => {
const config = makeFromSchema(true, types.usernamePasswordSchema);
t.deepEqual(
getAuthConfig({
...config,
unexpected: "unexpected-value",
} as json.UnvalidatedObject<types.AuthConfig>),
config,
);
});
test("getAuthConfig - empty", async (t) => {
const config = makeFromSchema(false, types.usernamePasswordSchema);
// Since the purpose of constructing the `AuthConfig` values is for
// serialisation to JSON so that they can be passed to the proxy as configuration,
// we only care that the stringified JSON representations are the same.
t.deepEqual(
JSON.stringify(
getAuthConfig({
...config,
unexpected: "unexpected-value",
} as json.UnvalidatedObject<types.AuthConfig>),
),
JSON.stringify({}),
);
});
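The stringify comparison in the empty-config test works because `JSON.stringify` drops object keys whose value is `undefined`. A minimal standalone illustration, independent of the helpers above:

```typescript
// JSON.stringify omits keys whose value is undefined, so a config whose
// optional fields are all absent serialises identically to an empty object.
const emptyConfig: { username?: string; password?: string } = {
  username: undefined,
  password: undefined,
};

console.log(JSON.stringify(emptyConfig)); // "{}"
console.log(JSON.stringify(emptyConfig) === JSON.stringify({})); // true
```

This is why `t.deepEqual` on the objects themselves would be too strict here: `{ username: undefined }` and `{}` differ structurally but serialise the same.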
@@ -0,0 +1,81 @@
import * as core from "@actions/core";
import * as json from "../json";
import { isDefined } from "../util";
import type { AuthConfig, UsernamePassword } from "./types";
import * as types from "./types";
/** Constructs a new object from `obj` with only keys that exist in `schema`. */
export function cloneCredential<S extends json.Schema>(
schema: S,
obj: json.FromSchema<S>,
): json.FromSchema<S> {
const result = {};
for (const key of Object.keys(schema)) {
// Skip keys that don't exist or don't have a value.
if (!isDefined(obj[key])) {
continue;
}
result[key] = obj[key];
}
return result as json.FromSchema<S>;
}
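The key-filtering idea in `cloneCredential` can be sketched without the `json` schema helpers. In this hypothetical standalone version, a plain array of key names stands in for the real `json.Schema` type:

```typescript
// Standalone sketch: copy only schema-listed keys that have a defined value,
// so unexpected keys (such as `extra` below) are dropped.
function pickSchemaKeys(
  schemaKeys: readonly string[],
  obj: Record<string, unknown>,
): Record<string, unknown> {
  const result: Record<string, unknown> = {};
  for (const key of schemaKeys) {
    // Mirrors the isDefined check: absent, undefined, or null keys are skipped.
    if (obj[key] === undefined || obj[key] === null) {
      continue;
    }
    result[key] = obj[key];
  }
  return result;
}

console.log(pickSchemaKeys(["username", "token"], { token: "t", extra: 1 }));
// → { token: 't' }
```

Filtering against the schema rather than copying the object wholesale is what keeps unvalidated extra fields out of the configuration passed to the proxy.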
/** Extracts an `AuthConfig` value from `config`. */
export function getAuthConfig(
config: json.UnvalidatedObject<AuthConfig>,
): AuthConfig {
// Start by checking for the OIDC configurations, since they have required properties
// which we can use to identify them.
for (const oidcSchema of types.oidcSchemas) {
if (json.validateSchema(oidcSchema.schema, config)) {
return cloneCredential(oidcSchema.schema, config);
}
}
// Otherwise, try the basic configuration types.
if (types.isToken(config)) {
// There are three scenarios for non-OIDC authentication based on the registry type:
//
// 1. `username`+`token`
// 2. A `token` that combines the username and actual token, separated by ':'.
// 3. `username`+`password`
//
// In all three cases, all fields are optional. If the `token` field is present,
// we accept the configuration as a `Token` typed configuration, with the `token`
// value and an optional `username`. Otherwise, we accept the configuration
// typed as `UsernamePassword` (in the `else` clause below) with optional
// username and password. I.e. a private registry type that uses 1. or 2.,
// but has no `token` configured, will get accepted as `UsernamePassword` here.
if (isDefined(config.token)) {
// Mask token to reduce chance of accidental leakage in logs, if we have one.
core.setSecret(config.token);
}
return cloneCredential(types.tokenSchema, config);
} else {
let username: string | undefined = undefined;
let password: string | undefined = undefined;
// Both "username" and "password" are optional. If we have reached this point, we need
// to validate which of them are present and that they have the correct type if so.
if ("password" in config && json.isString(config.password)) {
// Mask password to reduce chance of accidental leakage in logs, if we have one.
core.setSecret(config.password);
password = config.password;
}
if ("username" in config && json.isString(config.username)) {
username = config.username;
}
// Return the `UsernamePassword` object. Both username and password may be undefined.
return {
username,
password,
} satisfies UsernamePassword;
}
}
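The scenario handling described in the comments above reduces to a precedence rule: OIDC schemas are tried first (their required properties identify them), then token-based configurations, then username/password as the fallback. A simplified standalone sketch of that ordering; the types and names here are illustrative, not the real API:

```typescript
type SketchConfig = {
  username?: string;
  password?: string;
  token?: string;
  "tenant-id"?: string;
  "client-id"?: string;
};

// Illustrative precedence: an OIDC shape (here just the Azure one) beats
// token-based auth, and a present token beats username/password, matching
// the order of the checks in getAuthConfig.
function classify(config: SketchConfig): string {
  if (config["tenant-id"] !== undefined && config["client-id"] !== undefined) {
    return "azure-oidc";
  }
  if (config.token !== undefined) {
    return "token";
  }
  return "username-password";
}

console.log(classify({ "tenant-id": "t", "client-id": "c" })); // azure-oidc
console.log(classify({ username: "u", token: "u:secret" })); // token
console.log(classify({ username: "u", password: "p" })); // username-password
console.log(classify({})); // username-password
```

Note the last case: a registry type that normally uses tokens but has none configured still falls through to the username/password branch, exactly as the comment in `getAuthConfig` describes.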
@@ -4,7 +4,7 @@ import * as sinon from "sinon";
import * as actionsUtil from "./actions-util";
import { Config } from "./config-utils";
import { EnvVar } from "./environment";
import { BuiltInLanguage } from "./languages";
import { getRunnerLogger } from "./logging";
import { ToolsSource } from "./setup-codeql";
import {
@@ -48,7 +48,7 @@ test.serial("createStatusReportBase", async (t) => {
new Date("May 19, 2023 05:19:00"),
createTestConfig({
buildMode: BuildMode.None,
languages: [BuiltInLanguage.java, BuiltInLanguage.swift],
}),
{ numAvailableBytes: 100, numTotalBytes: 500 },
getRunnerLogger(false),
@@ -345,7 +345,7 @@ test.serial(
"returns a value",
createTestConfig({
buildMode: BuildMode.None,
languages: [BuiltInLanguage.java, BuiltInLanguage.swift],
}),
{
trap_cache_download_size_bytes: 1024,
@@ -360,7 +360,7 @@ test.serial(
"includes packs for a single language",
createTestConfig({
buildMode: BuildMode.None,
languages: [BuiltInLanguage.java],
computedConfig: {
packs: ["foo", "bar"],
},
@@ -377,7 +377,7 @@ test.serial(
"includes packs for multiple languages",
createTestConfig({
buildMode: BuildMode.None,
languages: [BuiltInLanguage.java, BuiltInLanguage.swift],
computedConfig: {
packs: { java: ["java-foo", "java-bar"], swift: ["swift-bar"] },
},
@@ -18,7 +18,7 @@ import { DocUrl } from "./doc-url";
import { EnvVar } from "./environment";
import { getRef } from "./git-utils";
import { Logger } from "./logging";
import { OverlayBaseDatabaseDownloadStats } from "./overlay/caching";
import { getRepositoryNwo } from "./repository";
import { ToolsSource } from "./setup-codeql";
import {
@@ -21,7 +21,7 @@ import {
FeatureEnablement,
} from "./feature-flags";
import { Logger } from "./logging";
import { OverlayDatabaseMode } from "./overlay/overlay-database-mode";
import {
DEFAULT_DEBUG_ARTIFACT_NAME,
DEFAULT_DEBUG_DATABASE_NAME,
@@ -9,7 +9,6 @@ export enum ToolsFeature {
DatabaseInterpretResultsSupportsSarifRunProperty = "databaseInterpretResultsSupportsSarifRunProperty",
ForceOverwrite = "forceOverwrite",
IndirectTracingSupportsStaticBinaries = "indirectTracingSupportsStaticBinaries",
SuppressesMissingFileBaselineWarning = "suppressesMissingFileBaselineWarning",
}
@@ -6,7 +6,7 @@ import * as sinon from "sinon";
import { CodeQL, getCodeQLForTesting } from "./codeql";
import * as configUtils from "./config-utils";
import { BuiltInLanguage } from "./languages";
import { createTestConfig, makeVersionInfo, setupTests } from "./testing-utils";
import { ToolsFeature } from "./tools-features";
import { getCombinedTracerConfig } from "./tracer-config";
@@ -16,7 +16,7 @@ setupTests(test);
function getTestConfig(tempDir: string): configUtils.Config {
return createTestConfig({
languages: [BuiltInLanguage.java],
tempDir,
dbLocation: path.resolve(tempDir, "codeql_databases"),
});
@@ -36,7 +36,7 @@ async function stubCodeql(
);
sinon
.stub(codeqlObject, "isTracedLanguage")
.withArgs(BuiltInLanguage.java)
.resolves(true);
return codeqlObject;
}
@@ -45,7 +45,7 @@ test("getCombinedTracerConfig - return undefined when no languages are traced la
await util.withTmpDir(async (tmpDir) => {
const config = getTestConfig(tmpDir);
// No traced languages
config.languages = [BuiltInLanguage.javascript, BuiltInLanguage.python];
t.deepEqual(
await getCombinedTracerConfig(await stubCodeql(), config),
undefined,
@@ -15,7 +15,7 @@ import {
import * as configUtils from "./config-utils";
import { Feature } from "./feature-flags";
import * as gitUtils from "./git-utils";
import { BuiltInLanguage } from "./languages";
import { getRunnerLogger } from "./logging";
import {
createFeatures,
@@ -41,7 +41,7 @@ const stubCodeql = createStubCodeQL({
async betterResolveLanguages() {
return {
extractors: {
[BuiltInLanguage.javascript]: [
{
extractor_root: "some_root",
extractor_options: {
@@ -65,7 +65,7 @@ const stubCodeql = createStubCodeQL({
},
},
],
[BuiltInLanguage.cpp]: [
{
extractor_root: "other_root",
},
@@ -76,7 +76,7 @@ const stubCodeql = createStubCodeQL({
});
const testConfigWithoutTmpDir = createTestConfig({
languages: [BuiltInLanguage.javascript, BuiltInLanguage.cpp],
trapCaches: {
javascript: "/some/cache/dir",
},
@@ -84,7 +84,7 @@ const testConfigWithoutTmpDir = createTestConfig({
function getTestConfigWithTempDir(tempDir: string): configUtils.Config {
return createTestConfig({
languages: [BuiltInLanguage.javascript, BuiltInLanguage.ruby],
tempDir,
dbLocation: path.resolve(tempDir, "codeql_databases"),
trapCaches: {
@@ -100,7 +100,7 @@ test.serial("check flags for JS, analyzing default branch", async (t) => {
sinon.stub(gitUtils, "isAnalyzingDefaultBranch").resolves(true);
const result = await getTrapCachingExtractorConfigArgsForLang(
config,
BuiltInLanguage.javascript,
);
t.deepEqual(result, [
`-O=javascript.trap.cache.dir=${path.resolve(tmpDir, "jsCache")}`,
@@ -131,10 +131,10 @@ test("get languages that support TRAP caching", async (t) => {
const logger = getRecordingLogger(loggedMessages);
const languagesSupportingCaching = await getLanguagesSupportingCaching(
stubCodeql,
[BuiltInLanguage.javascript, BuiltInLanguage.cpp],
logger,
);
t.deepEqual(languagesSupportingCaching, [BuiltInLanguage.javascript]);
});
test.serial("upload cache key contains right fields", async (t) => {
@@ -180,7 +180,7 @@ test.serial(
);
await downloadTrapCaches(
stubCodeql,
[BuiltInLanguage.javascript, BuiltInLanguage.cpp],
logger,
);
t.assert(
@@ -185,6 +185,10 @@ export async function cleanupTrapCaches(
trap_cache_cleanup_skipped_because: "feature disabled",
};
}
logger.warning(
"TRAP cache cleanup is deprecated and will be removed in May 2026. " +
"We recommend instead disabling TRAP caching by passing the `trap-caching: false` input to the `init` Action.",
);
if (!(await gitUtils.isAnalyzingDefaultBranch())) {
return {
trap_cache_cleanup_skipped_because: "not analyzing default branch",
@@ -8,5 +8,6 @@ packs:
- codeql-testing/codeql-pack3:other-query.ql
paths-ignore:
- tests
- lib
- pr-checks
- tests
@@ -2,5 +2,6 @@ name: Pack testing in the CodeQL Action
disable-default-queries: true
paths-ignore:
- tests
- lib
- pr-checks
- tests
